Master Makers Competition 2016-2017
During the 2016-2017 academic year, a team of fellow Carnegie Mellon University students and I participated in a mobile robotics competition. The goal of this competition was to design and build a mobile robot capable of autonomously navigating a competition course in the quickest time possible. Our team won first place in the 2016-2017 competition.
All components of the mechanical design were carefully chosen or custom designed to fit the electrical components snugly and to dissipate stress through tight contacts in a suspensionless system. The BeetleBot is propelled by a rear-wheel drive system, with each wheel driven by a brushed DC motor. Omni-directional wheels were chosen to improve turning capability while eliminating the wheel slip that would be present in a four-wheel differential-drive setup. The 6-inch diameter of the wheels allows the BeetleBot to function both right-side up and upside down, making its performance robust. All components of the BeetleBot are bolted to a laser-cut acrylic base plate and a top plate, which ensures tight contacts throughout the system and allows easy viewing and debugging of internal components.
The BeetleBot is powered by two 5:1 HD Premium Planetary Gear Motors, each with a stall torque of 97.2 oz-in (0.69 N-m) and speeds of up to 1621 rpm. With the BeetleBot's mass of 6.2 kg, the robot is capable of a maximum speed of 28.9 mph (12.9 m/s) with all four wheels on the ground. Given the wheel configuration of the BeetleBot, with two driven rear wheels and two forward omni-directional wheels, the BeetleBot can turn by pivoting around a single axis, allowing for extremely tight turns and improved directional control. Assuming the brushed motors are the primary source of power consumption, the 11.1 V, 5000 mAh lithium-polymer battery can reasonably power the system continuously for over an hour. More practically, limitations on run time will arise from motor overheating rather than battery capacity.
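These figures can be sanity-checked with a quick back-of-envelope calculation. The top speed follows directly from the motor speed and wheel diameter; the run-time estimate below assumes an average total current draw of 5 A, which is an illustrative figure, not a measured one:

```python
import math

# Wheel and motor parameters from the text
wheel_diameter_m = 6 * 0.0254   # 6 in wheels
motor_speed_rpm = 1621          # max output speed after the 5:1 gearbox

# Top speed = wheel circumference * revolutions per second
top_speed_ms = motor_speed_rpm / 60 * math.pi * wheel_diameter_m
top_speed_mph = top_speed_ms / 0.44704
print(f"top speed: {top_speed_ms:.1f} m/s ({top_speed_mph:.1f} mph)")
# -> roughly 12.9 m/s (28.9 mph), matching the quoted figure

# Battery energy: 11.1 V * 5 Ah = 55.5 Wh
battery_wh = 11.1 * 5.0
avg_current_a = 5.0             # assumed average draw for both motors
runtime_h = battery_wh / (11.1 * avg_current_a)
print(f"estimated run time: {runtime_h:.1f} h")
```

At that assumed draw the battery alone supports roughly an hour of continuous operation, consistent with motor heating, not capacity, being the practical limit.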
The ultrasonic sensors have an effective range of up to 1.5 m when running at 50 Hz, resulting in reliable vehicle localization for circuit navigation. The inertial measurement unit (IMU) is based on the BNO055 absolute orientation sensor from Bosch Sensortec GmbH. The BNO055 is a System in Package (SiP) integrating a triaxial 14-bit accelerometer, a triaxial 16-bit gyroscope with a range of ±2000 degrees per second, a triaxial geomagnetic sensor, and a 32-bit microcontroller running the BSX3.0 FusionLib software. The sensor provides acceleration, yaw rate, and magnetic field strength data, each along three perpendicular axes. Quaternion data, Euler angles, rotation vectors, linear acceleration, and the gravity vector are also easily obtained from the IMU. The IMU board also features an interrupt engine, which allows interrupts to be triggered on motion recognition, slope detection, and high-g detection.
The encoders attached to the brushed motors provide 240 ticks per revolution, giving an angular resolution of 1.5 degrees. This system allows robust orientation control to be applied reliably to the BeetleBot. Finally, the control algorithms all run on the Arduino Mega 2560 microcontroller, an open-source platform that greatly improves ease of use.
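The quoted angular resolution follows directly from the tick count, and combined with the 6-inch wheels it also fixes the linear odometry resolution per tick (a quick check):

```python
import math

ticks_per_rev = 240
wheel_diameter_m = 6 * 0.0254   # 6 in wheels

# One tick corresponds to 360/240 degrees of wheel rotation
angular_res_deg = 360 / ticks_per_rev
# ...and to 1/240 of the wheel circumference of rolling distance
linear_res_mm = math.pi * wheel_diameter_m / ticks_per_rev * 1000

print(f"angular resolution: {angular_res_deg} deg/tick")   # 1.5 deg/tick
print(f"linear resolution:  {linear_res_mm:.2f} mm/tick")  # about 2 mm/tick
```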
In order to validate initial design ideas and to quickly debug and tune controllers, a detailed simulation environment was developed for the BeetleBot. This simulation was developed using the Simulink SimMechanics package within MATLAB.
First, a detailed description of the robot was generated using a SolidWorks-to-XML plugin. This plugin lets the user export, with one click, a full XML file detailing the robot's geometry, masses, inertias, spatial relationships, and moving joints, along with the necessary CAD files. This description of the physical characteristics of the system was then imported into SimMechanics using a built-in MATLAB tool that automatically builds a Simulink model from the XML description provided to it. This provided the base upon which the simulation environment was designed.
After extensive cleanup of the automatically generated Simulink model to make it intuitive and more user-friendly, the next step was to introduce ground reaction forces to the simulation. This can be a very complicated problem, especially for cylindrical contacts (like wheels), which have an infinitesimally small contact point that is continuously changing. Fortunately, there is an openly available Simulink library that solves this exact problem, which we took full advantage of. Integration of this Simscape Multibody Contact Library with our simulation was relatively straightforward, and soon we had full ground reaction forces with realistic nonlinear friction models in our simulation.
During the development of this vehicle, we noticed something unexpected. Initially, we were controlling the vehicle with the driven wheels in the front. This worked perfectly fine until we started trying to perform fast, dynamic motions, during which the car kept spinning out of control without warning. Strangely enough, we noticed that when we drove the car with the omni-wheels in front, it drove completely straight without problems. This behavior can be seen in the following videos:
This behavior initially did not seem to have a quick and intuitive explanation, so a complete derivation of the dynamics ensued.
Let the robot, with its omni-wheels in the front, be modeled by a vehicle with 2 fixed back wheels and two caster wheels in front, as shown in the following picture.
For simplification of our analysis, we assume that the forward velocity is constant. Furthermore, we assume that there exists a small perturbation in the y direction. This perturbation, and thus phi, is assumed to be small compared to U, meaning we can use small-angle approximations when considering the linear dynamics. The side force at the rear fixed wheels is F and, since these wheels are assumed to roll without friction, there is no force in the direction these wheels are rolling. At the front axle there is no force at all in the x-y plane because of the "friction-less" contact between the omni-wheels and the ground. There are, of course, vertical reaction forces at both axles, but they play no role in the lateral dynamics of the car. Using these assumptions, the equations of motion are easily written:
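The original equation images are not reproduced here; under the stated assumptions they can be reconstructed as follows, with $m$ the robot's mass, $I_z$ its yaw inertia about the center of mass, $b$ the distance from the center of mass back to the rear axle, $v$ the small lateral velocity of the center of mass, and $\varphi$ the heading angle:

```latex
\begin{align}
  m(\dot{v} + U\dot{\varphi}) &= F  && \text{(lateral force balance)} \\
  I_z \ddot{\varphi} &= -bF         && \text{(yaw moment about the center of mass)}
\end{align}
```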
Considering the velocity of the center of the rear axle, one can derive the condition of zero sideways velocity for the rear wheels. We assume that the rear wheels roll straight ahead with no sideways motion at all, otherwise known as the no-slip assumption. The side velocity of the center of the rear axle with respect to the center-line of the car is then:
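Writing $v$ for the lateral velocity of the center of mass, $b$ for the distance from the center of mass back to the rear axle, and $\varphi$ for the heading angle (a reconstruction of the missing equation), the side velocity of the rear-axle center is the body lateral velocity minus the yaw-rate contribution, and the no-slip assumption sets it to zero:

```latex
v_{\text{rear}} = v - b\dot{\varphi} = 0
\quad\Longrightarrow\quad
v = b\dot{\varphi}
```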
Solving the equations of motion for F and combining leads to a single dynamic equation, which can then be differentiated. The result is:
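Substituting the no-slip condition $v = b\dot{\varphi}$ (and hence $\dot{v} = b\ddot{\varphi}$) into the lateral force balance, solving for $F$, and eliminating it from the yaw equation gives the reconstructed result ($m$ mass, $I_z$ yaw inertia, $b$ CG-to-rear-axle distance, $U$ forward speed):

```latex
(I_z + m b^2)\,\ddot{\varphi} + m b U\,\dot{\varphi} = 0
```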
This equation appears to be second-order, but realizing that the variable of importance is the angular velocity, rather than the angle phi itself, we can perform a change of variables:
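With the substitution $r = \dot{\varphi}$, the second-order equation collapses to a first-order linear system (a reconstruction; $m$ is the mass, $I_z$ the yaw inertia, $b$ the CG-to-rear-axle distance, $U$ the forward speed):

```latex
\dot{r} = -\frac{m b U}{I_z + m b^2}\, r \;\equiv\; A\, r,
\qquad
A = -\frac{m b U}{I_z + m b^2}
```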
Stability analysis of such a system is straightforward, and results in the following behavior:
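For a scalar linear system $\dot{r} = A r$, the behavior is determined entirely by the sign of $A$:

```latex
r(t) = r(0)\, e^{A t},
\qquad
\begin{cases}
  A < 0: & \text{perturbations decay exponentially (stable)} \\
  A > 0: & \text{perturbations grow exponentially (unstable)}
\end{cases}
```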
In our case, A will be negative as long as U (the forward velocity) is positive, since all the parameters are positive and there is a negative sign in front of the expression. This means that when the forward velocity is positive, the system is inherently stable. When U becomes negative, however, A becomes positive, meaning that the system becomes unstable. This proves what we found empirically: our robot's passive dynamics are stable when moving forward only when the omni-wheels are in front.
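The sign argument can also be checked numerically. The sketch below integrates the first-order yaw-rate dynamics in closed form using made-up, purely illustrative parameter values (these are not the BeetleBot's measured inertia or geometry):

```python
import math

# Illustrative parameters (not the BeetleBot's actual values)
m = 6.2     # mass [kg], from the text
I_z = 0.05  # yaw inertia [kg m^2], assumed
b = 0.1     # center of mass to rear axle [m], assumed

def yaw_rate_after(U, r0=1.0, t=1.0):
    """Closed-form solution of r' = A*r after t seconds, A = -m*b*U/(I_z + m*b^2)."""
    A = -m * b * U / (I_z + m * b**2)
    return r0 * math.exp(A * t)

print(yaw_rate_after(U=+1.0))  # decays toward zero: stable (omni-wheels leading)
print(yaw_rate_after(U=-1.0))  # grows rapidly: unstable (driven wheels leading)
```

Flipping the sign of U flips the sign of A, reproducing the spin-out we observed when the driven wheels led.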
When we use this passive stability information and don’t try to fight the natural dynamics of the robot, we can execute dynamic behaviors smoothly, as seen in the following video.
Extended Kalman Filter
Once the electro-mechanical design of the robot platform was finished, the first challenge to overcome was fusing the sensor data from the wheel encoders, the inertial measurement unit, and the ultrasonic distance sensors to give the optimal estimate of the state of the robot. The state of the robot was taken to be its heading (theta) and location (x, y) in a known map. Unfortunately, the motion and measurement models for this system are nonlinear because of sine and cosine terms, meaning the implementation of an extended Kalman filter was necessary for optimal state estimation.
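For reference, the standard EKF recursion this describes, with $f$ the motion model, $h$ the measurement model, $F_k$ and $H_k$ their Jacobians, and $Q$ and $R$ the process and measurement noise covariances:

```latex
\begin{align}
  \hat{x}_k^- &= f(\hat{x}_{k-1}, u_k), &
  P_k^- &= F_k P_{k-1} F_k^\top + Q \\
  K_k &= P_k^- H_k^\top \left( H_k P_k^- H_k^\top + R \right)^{-1} \\
  \hat{x}_k &= \hat{x}_k^- + K_k \left( z_k - h(\hat{x}_k^-) \right), &
  P_k &= (I - K_k H_k)\, P_k^-
\end{align}
```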
Measurement and motion model covariance matrices R and Q were calculated using information from sensor datasheets found online, and then tuned based on the actual results observed on the BeetleBot system. The initial covariances on the state were small because, in reality, we carefully placed the robot in its assumed start position for competition.
The motion model, or prediction step, was a function of our predicted robot motion based on encoder readings. From the encoder readings, the number of wheel revolutions for both the left and right wheels was calculated since the last Kalman update. Then the heading and position change of the robot were approximated with the following model:
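The model images did not survive, but the description matches the standard differential-drive odometry update. A sketch under that assumption (the track width is a placeholder value, not the BeetleBot's measured wheelbase):

```python
import math

WHEEL_RADIUS = 0.0762   # 3 in, half the 6 in wheel diameter [m]
TRACK_WIDTH = 0.3       # distance between the drive wheels [m], placeholder

def predict(x, y, theta, rev_left, rev_right):
    """Propagate the pose (x, y, theta) from the wheel revolutions
    accumulated since the last Kalman update."""
    ds_l = 2 * math.pi * WHEEL_RADIUS * rev_left    # left wheel arc length
    ds_r = 2 * math.pi * WHEEL_RADIUS * rev_right   # right wheel arc length
    ds = (ds_l + ds_r) / 2                          # displacement of the body center
    dtheta = (ds_r - ds_l) / TRACK_WIDTH            # heading change
    # Advance along the average heading over the step
    x += ds * math.cos(theta + dtheta / 2)
    y += ds * math.sin(theta + dtheta / 2)
    return x, y, theta + dtheta
```

Equal revolutions on both wheels advance the robot straight ahead; a difference between them rotates it about its pivot axis, matching the turning behavior described earlier.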
The measurement model, or update step, was a function of our sensor readings and the expected location of obstacles based on our estimated current position in a known map. The functions used for the left and right distance sensors were as follows:
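The original functions depend on the competition map and are not reproduced here. As an illustration only, for a straight corridor segment with walls parallel to the x-axis at y = 0 and y = W (an assumed geometry, not the actual track), the expected side-sensor readings would take this form:

```python
import math

CORRIDOR_WIDTH = 1.0  # wall-to-wall distance W [m], placeholder

def expected_side_ranges(y, theta):
    """Expected left/right ultrasonic readings for a robot at lateral
    position y with heading theta (radians) measured from the corridor
    axis. The sensors point perpendicular to the heading, so the ray
    length to each wall grows by a factor of 1/cos(theta)."""
    h_left = (CORRIDOR_WIDTH - y) / math.cos(theta)
    h_right = y / math.cos(theta)
    return h_left, h_right
```

Differentiating such functions with respect to the state (x, y, theta) yields the rows of the measurement Jacobian H used in the update step.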
Incorporating the front and back distance sensors used similar equations, with one major change. Because the ultrasonic distance sensors used in the BeetleBot only reliably measure distances up to 1.5 meters, the majority of the time the front and back distance sensors are not giving any useful information. In order for the extended Kalman filter to take this occlusion issue into account, multiple versions of the Kalman filter were derived for all potential occlusion situations. Then, depending on which distance sensors are active (and giving useful information), a switch statement ensures the Kalman filter only uses the measurement model for those active sensors.
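One lightweight way to express that switching (a sketch; the actual implementation used explicit cases per occlusion combination) is to drop the rows of the measurement vector, Jacobian, and noise covariance that belong to out-of-range sensors before each update:

```python
MAX_RANGE = 1.5  # ultrasonic readings are only trusted below 1.5 m

def select_active(z, h, H, R_diag):
    """Keep only the rows of the measurement model whose sensors returned a
    usable (in-range) reading; occluded/out-of-range sensors are dropped.
    z: raw readings, h: expected readings, H: Jacobian rows, R_diag: noise variances."""
    active = [i for i, zi in enumerate(z) if zi < MAX_RANGE]
    pick = lambda rows: [rows[i] for i in active]
    return pick(z), pick(h), pick(H), pick(R_diag)

# Example: the front sensor (2.5 m) sees nothing useful and is dropped
z = [0.8, 2.5, 0.6, 1.2]                            # left, front, right, back [m]
h = [0.75, 2.4, 0.65, 1.1]                          # expected readings from the map
H = [[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 0]]    # placeholder Jacobian rows
R_diag = [0.01, 0.01, 0.01, 0.01]
z_a, h_a, H_a, R_a = select_active(z, h, H, R_diag)
print(z_a)  # [0.8, 0.6, 1.2]
```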
After developing and debugging this extended Kalman filter in MATLAB, it was rewritten in C++ and successfully implemented on the BeetleBot's Arduino hardware at a rate of 50 Hz. In the following video, the robot can be seen autonomously navigating a track using the state estimates from the Kalman filter.