Tuesday, August 6, 2013

Amazing Sensors!

A new project!

Lately, I've been working with different sensors as part of my day job at TI.  TI happens to have recently released a Booster Pack for the Stellaris and Tiva LaunchPads that contains (among other things) an accelerometer, a gyroscope, and a magnetometer.  I figured I'd use my recent experience to do something fun with sensors in my free time :)


Using only the highest-quality cell phone cameras and engineering-studio production values, I made a quick video describing the system and showing it in action.

Project Overview


  • Accelerometer, gyroscope, and magnetometer fused via a direction cosine matrix to observe the roll, pitch, and yaw of the handheld transmitter
  • RemoTI software stack used to pair and transmit data over the ZigBee RF4CE protocol between transmitter and receiver
  • Software-generated PWM signal used to control servos attached to the X and Y axes of the marble maze based on the roll and pitch data received
  • Source code and bill of materials can be found on my GitHub

Hardware

The key piece of hardware that made this project possible was the SensorHub Booster Pack.  The Booster Pack comes with an MPU-9150, which provides 3-axis data from a gyroscope, an accelerometer, and a magnetometer.  Communication between the MCU and these sensors is handled over an interrupt-driven I2C interface.

An additional bonus of using the SensorHub Booster Pack is that it was designed with a set of EM headers.  For this project, I connected a CC2533 radio transceiver to those headers, which enables short-range (~5 meter) radio communication between two modules.  It was a bit tricky to set up the CC2533, as there wasn't much sample code for using it in conjunction with a Stellaris, but fortunately I was able to find a spare development kit that made learning and debugging the radio interface much easier.

The final hardware component was the marble maze itself, which I was actually able to find already fabricated.  A TI field engineer had done a project very similar to this one, but using the accelerometer on an MSP430 Chronos watch and an LM3S9B96 for the maze control.  Unfortunately, that demo was starting to show its age: it used the SimpliciTI stack for wireless communication, which is no longer supported; the 9B96 dated back to when the Stellaris line was still run by Luminary Micro, so it had no Texas Instruments branding; and the 9B96 used for the PWM control is part of the now, sadly, NRND'd Stellaris M3 line.  As a result, the demo was languishing in a marketing office and hadn't been used for a trade show in years :(

Software

Once I procured the hardware, I set about looking for example code.  I came across a project a coworker of mine created, the wireless air mouse demo, which was a great help.

The sensor communication was the easiest part to cover (due in no small part to the fact that I've spent the past year and change working with these exact sensors on a different project within TI).  TI has a really great sensor library, available in the current TivaWare C Series release, that handles a large part of the sensor functionality.  The air mouse demo already handled feeding the sensor data into a direction cosine matrix, from which the roll, pitch, and yaw of the device can be extracted at any given time.  Once I had this data being generated reliably, I moved on to the radio transmission portion of the code.

The radio portion of the project was by far the most difficult.  I opted to use the RemoTI RF4CE software stack because a) that's what the air mouse demo used, and b) it sounded like an interesting topic to learn more about.  The RF4CE protocol basically classifies each node on the RF network as either a target or a controller, and sets up a very robust algorithm for pairing and sending data between two devices.  Unfortunately, the air mouse demo only covered the controller side of the code; for that demo a CC2531 USB dongle was used to handle the target side of the communication, the source code for which I was unable to find.  It took a few weeks of reading up on the RemoTI documentation and learning how the on-board MCU of the CC2533 works, but I was eventually able to get a hello world program running, which quickly gave way to transferring roll, pitch, and yaw data from the handheld controller down to the maze-controlling target.

The servo communication was fairly straightforward: each servo is controlled by a pulse-width-modulated signal with a 20 ms period and an active time between 1 and 2 ms to rotate it to either extreme.  I was hoping to use a Tiva LaunchPad for this purpose, as it contains a hardware PWM module, but I wasn't able to find one lying around the lab, so I opted to use a general-purpose timer to create a software-based PWM signal for the servos.  Once I had roll and pitch data coming over the radio interface, all I had to do was normalize the rotation to between -1 and 1, use that to compute an active time for the PWM, and generate the signal.  I ended up adding a multiplier to each axis to make the demo feel more natural: the LaunchPad is longer along the Y axis than the X axis, so it takes greater effort to change the pitch of the board than the roll.  Without a different multiplier on each axis, the controller gave the impression that more effort was required to get the board to move along the Y axis than to get the same motion along the X axis.

As always, the source code for this project can be found on my GitHub.

Future Steps

I have a few ideas for where to go from here, but nothing concrete on the horizon.  My first thought was that it would be really easy to modify this code to control a POV camera gimbal instead: something like using the target-side MCU to drive a gimbal with a GoPro attached, syncing the camera movement to the movement of a LaunchPad attached to a hat.  Then maybe finding a way to stream a low-res version of that video to allow POV control of an RC car or quadcopter without requiring line of sight.  I'm still thinking about what hardware would be ideal for this; maybe it'll give me an excuse to buy a BeagleBone Black.  For now, though, I'm quite happy with my little marble maze demo :)

1 comment:

  1. Hi Jordan. I am using the SensorHub and tried to program a Processing sketch for drawing the board in 3D in "real time".

    http://e2e.ti.com/support/microcontrollers/tiva_arm/f/908/t/285768.aspx


    However, the info on the yaw axis doesn't match reality...rotation is not progressive.

    Did you experience similar results?
    Thank you
