Laser Turret Progress

Over the past month, I’ve been working on a laser turret for the DefconBots autonomous robot competition this summer. I entered the same competition back in 2006 and enjoyed the experience, so I figured I’d give it another try. Since I’m fairly comfortable writing firmware, I plan on focusing on the mechanical aspects of this project, as well as the image processing (both of which I know little about). It should be a fun project and a great learning experience. Here is some of the progress I’ve made since I started:

The first thing I did was start from the reference design on the website: I bought a laser, servos, and a pan/tilt mount for them. I built a very simple controller with an mbed and used a push button and joystick to control it. This gave me an idea of the speed/resolution of the servos. (The speed measurement wasn’t completely valid, since there was no heavy webcam to carry.)


The servos were quick, but if a target is at the maximum 10m distance, the resolution is not fine enough to track it. Also, all of the weight of the camera is being held up by a tiny servo, which isn’t a great idea.

I thought of a few different designs to alleviate this problem, so I went to the hobby shop and bought some materials. The second prototype relieved one of the servos from weight-bearing duties and allowed for slightly finer vertical resolution. This one still didn’t have the camera, but I learned a few things about working with balsa wood and cutting metal rods with a hacksaw…


Now that I had a ‘stable’ platform, I decided to attach a webcam to see how it would behave. I ended up choosing a rather heavy one, but it has excellent optics and can be mounted on a tripod. I put on a slightly larger/faster servo, bought a tripod-size screw, and mounted the webcam for testing. The new servo was fast but more often than not, the balsa wood base moved just as much as the camera.


While waiting for some parts I ordered, I decided to try to make some progress in the computer vision department. I installed Ubuntu on an old laptop and spent a few hours getting OpenCV compiled from source. (Turns out there’s a much easier way. I’m not sure how I missed it the first time…)

I was able to get a quick program working that detects circles, picks the largest one, and moves the camera toward it. It worked, but the limitations of the mount were apparent: if the camera moves too far forward, it just falls off, never to return. That being said, the most important lesson from that experiment was: get a dark, solid background while testing/debugging your CV algorithm. T-shirts with round shapes are discouraged, as the robot will think you are the target and ignore the real one…
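For anyone curious, the targeting step looks something like this. This is a Python sketch, not my actual code: the circle detection itself came from OpenCV (HoughCircles returns (x, y, radius) tuples), and the frame size, deadband, and function names here are made up for illustration.

```python
# Sketch of the target-selection step, assuming circle detection
# (e.g. OpenCV's HoughCircles) already produced (x, y, radius) tuples.
# FRAME_W/FRAME_H and DEADBAND are illustrative values, not mine.

FRAME_W, FRAME_H = 640, 480   # assumed webcam resolution
DEADBAND = 10                 # pixels of "close enough" around center

def pick_largest(circles):
    """Return the circle with the largest radius, or None."""
    return max(circles, key=lambda c: c[2]) if circles else None

def servo_step(circles):
    """Return (pan, tilt) step directions (-1, 0, +1) toward the target."""
    target = pick_largest(circles)
    if target is None:
        return (0, 0)
    x, y, _ = target
    dx = x - FRAME_W // 2
    dy = y - FRAME_H // 2
    pan = 0 if abs(dx) < DEADBAND else (1 if dx > 0 else -1)
    tilt = 0 if abs(dy) < DEADBAND else (1 if dy > 0 else -1)
    return (pan, tilt)
```

Each frame, the servos get nudged one step in whichever direction reduces the error; the deadband keeps them from jittering around the center.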


After receiving a large black tablecloth, I set it up and continued testing. Since the final device needs to actually shoot at the targets, I figured I might as well attach a laser to it again. I couldn’t come up with an elegant mounting solution in a short time, so I just used electrical tape and some cut cardboard for support.

The solid background worked extremely well. The robot was no longer focusing on non-targets. After adding some code to only enable the laser when the target was centered, I tested it again. The movement was still choppy, but the detection and shooting worked just as expected.
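The “centered” check itself is trivial. Here’s a sketch of the idea (the tolerance and names are guesses for illustration, not what I actually used):

```python
# Only enable the laser when the detected target sits inside a small
# window around the frame center. CENTER and TOLERANCE are assumed
# values, not from the original code.

CENTER = (320, 240)     # assumed frame center
TOLERANCE = 15          # pixels

def laser_enabled(target_xy):
    """True only when the target is within TOLERANCE of the center."""
    if target_xy is None:
        return False
    dx = abs(target_xy[0] - CENTER[0])
    dy = abs(target_xy[1] - CENTER[1])
    return dx <= TOLERANCE and dy <= TOLERANCE
```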


Since I was still waiting for parts for the next iteration of the mechanical platform, I switched to working on test targets. The organizers were nice enough to provide a target reference design as well. It consists of blue LEDs, red light sensors, and a microcontroller to detect the hits. I only had some of the required parts at home, so I decided to make a slightly simpler version of it.

Target Front – Light sensor and blue LED

After testing the light sensor, I decided to mount it on a perf board, along with the blue LED and some resistors. I thought about using old phone cable to take the signals back to the microcontroller doing the measuring, but I found some headphone cables and used them instead. I cut a hole in a ping pong ball, mounted the board inside, hot glued a bunch, and started testing.



I spent some time getting the microcontroller that does the hit detection working. The code should be fully modular and easily support new targets (once I put them together…). I added a calibration routine so it knows what is ambient light (including that blue LED) and what is a hit. The competition requires the laser to be on-target for 1.5 seconds; I set mine to 0.5 for this quick test.
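The calibration/dwell logic is simple enough to sketch. This is Python for readability rather than the actual microcontroller C, and the margin, sample period, and names are illustrative:

```python
# Hit detection sketch: calibrate a threshold above the ambient light
# level (measured with the blue LED on), then require the sensor
# reading to stay above it for a full dwell period before counting a
# hit. All constants are illustrative, not from the real firmware.

SAMPLE_PERIOD_S = 0.05
DWELL_REQUIRED_S = 1.5   # competition rule (0.5 was used for quick tests)

def calibrate(ambient_samples, margin=1.5):
    """Return a hit threshold: ambient average times a safety margin."""
    baseline = sum(ambient_samples) / len(ambient_samples)
    return baseline * margin

class HitDetector:
    def __init__(self, threshold, dwell_s=DWELL_REQUIRED_S):
        self.threshold = threshold
        self.dwell_s = dwell_s
        self.on_target_s = 0.0

    def sample(self, reading):
        """Feed one light-sensor reading; return True once dwell is met."""
        if reading > self.threshold:
            self.on_target_s += SAMPLE_PERIOD_S
        else:
            self.on_target_s = 0.0   # laser left the target: start over
        return self.on_target_s >= self.dwell_s
```

Resetting the timer whenever the reading drops is what enforces the “continuously on-target” requirement.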


That’s most of my progress so far. I’ve also been working on other mechanical designs, but I haven’t documented any of them yet. Working on this has been educational and quite fun! Incremental updates are usually on YouTube way before I write about them.

Target setup for testing.

Using BusBlaster + openOCD on OSX Mavericks

I’ve been using the Dangerous Prototypes Bus Blaster along with openOCD to program/debug my LPC1769 microcontroller. It’s been a while since I did anything, so when I tried again last week and couldn’t get it to work, I was slightly confused. The only change (that I was aware of) was an OSX upgrade from Mountain Lion to Mavericks.

Whenever I tried to run openOCD, I would get the following error:
Error: unable to open ftdi device: unable to claim usb device. Make sure the default FTDI driver is not in use

I tried reinstalling the FTDI drivers (both VCP and D2XX), unloading them, reloading them, and several other things without any luck. Whenever I plugged the BusBlaster in, two devices would show up under /dev: /dev/cu.usbserial-1d11A and /dev/cu.usbserial-1d11B. This meant that the FTDI driver was working, but at the same time stopping openOCD from using the device. What did not make sense was that this was happening even when I unloaded the FTDI kext.

After some research, I found out that since OSX Mavericks, the AppleUSBFTDI kernel driver is included by default. This was the one setting up the usbserial devices whenever I plugged the device in. The main solution everyone seemed to have was to unload AppleUSBFTDI using:
sudo kextunload -b com.apple.driver.AppleUSBFTDI

Finally! After unloading the kext, openOCD started working again. Unfortunately, this also means that you can’t use any other USB-serial devices while the kext is unloaded…

Luckily, I had an idea. What if I unload the kext, run openOCD, then load it again? It worked! OpenOCD was still working and the rest of the FTDI usb-serial devices were loaded. Finally a working solution that didn’t cripple the system. This still requires me to unload/load the kext every time I want to use openOCD, which is not a terribly elegant solution.

During my research into the problem, I ran into a forum post with a similar problem, except it was in 2007 and instead of the Apple FTDI driver, it was the regular FTDI driver causing problems. Their solution had them edit the kext’s Info.plist to make sure the driver didn’t do anything to a particular usb device. (Using product and vendor id’s) That got me thinking that maybe I could do the same with the Apple kext.
To do this, open /System/Library/Extensions/AppleUSBFTDI.kext/Contents/Info.plist

Find the device you want excluded, in this case AppleUSBEFTDI-6010-0, and comment out its key and dictionary. (I looked up the product ID in OSX’s System Information window.)

FTDI Device Id

Now you just have to reload the kext and everything should work without having to load/unload kexts every time!
sudo kextunload -b com.apple.driver.AppleUSBFTDI
sudo kextload -b com.apple.driver.AppleUSBFTDI

AppleUSBFTDI.kext Info.plist

Skype Video Message Exporter

A few days ago, I found out that you can send video messages to people via Skype. Someone asked me if I knew of a way to export them from Skype. (That way you can watch them while offline, back them up for later, or whatever else you might want to do.)

After searching around the web for a while, I realized that there is no such feature at the moment. I also saw many people suggesting an SQLite browser add-on for Firefox.

It turns out that Skype uses an SQLite database to save messages and other information. On OSX, the main database file is located under ~/Library/Application Support/Skype/skype_username/main.db

Inside the file is a table called VideoMessage, which contains the URLs for all received video messages. What people were doing was finding the URLs using the SQLite browser, then downloading each file individually. Another problem is that those links expire after some time, so if you haven’t viewed the video in Skype in the recent past, they won’t work. Since this method is a bit complicated for most users, I decided to automate it.

My first version consisted of a bash script that used find to locate the database file, then called sqlite3 to get the URLs, and generated an HTML file with links to the videos. It worked, but it still wasn’t as easy as it could be.

The next/last step consisted of writing an Objective-C app to do it all.

VideoMessageExporter Screenshot

The application automatically finds the database file (or files, if there are multiple Skype accounts). After that, it opens each one and gets the video message info to display. The users can then select the messages they want and download them all at once to the desktop.
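The lookup itself is just one query. Here’s a rough Python sketch of it (the actual app is Objective-C; the VideoMessage table name comes from above, but the URL column name used here is an assumption and may differ in a real main.db):

```python
# Query a Skype main.db file for stored video-message URLs.
# 'vod_path' is a hypothetical column name for illustration.
import sqlite3

def video_message_urls(db_path, url_column="vod_path"):
    """Return the non-empty video-message URLs from a Skype main.db."""
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            f"SELECT {url_column} FROM VideoMessage").fetchall()
        return [r[0] for r in rows if r[0]]
    finally:
        conn.close()
```

From there it’s just a matter of downloading each URL before the link expires.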

You can download the latest OSX build here: VideoMessageExporter Releases
For more info and all the sources check out the github page.

Kinesis Freestyle 2 Keyboard ‘Mod’ (to Fix Media Keys)

Soon after I bought my Kinesis Freestyle 2 keyboard, I noticed that the media keys (next/previous song) did not work with Spotify on OSX. The strange part is that they worked just fine in iTunes. I decided to email Kinesis tech support to see if they had a solution.

Kinesis support was extremely quick to respond and informed me that this was a known problem that had been fixed in the latest batch of keyboards. They offered to update the firmware on mine for free (which is awesome!), but that required sending them my keyboard. I didn’t want to do that, so I figured: if they can update the firmware on this thing, so can I!

Kinesis Freestyle 2 for Mac

I had no knowledge of USB keyboards before I started, so the first thing I did was take the thing apart. (Well, one side of it, where all the controls are.) As usual, there was one extra hidden screw under a “Do Not Remove” tag.

Do Not Remove “O.K.”

The keyboard itself has a few ICs. The larger one is a Genesys Logic GL850G, which is a USB hub. This makes sense, since there are two extra USB ports on the keyboard. The second one is an Alcor Micro AU9410 USB keyboard controller. Great, now I know what to look for!

Kinesis PCB Back Side

After finding the data sheet for the controller, I started reading. The Alcor Micro guys were nice enough to include an example schematic with the pinout for the part. After reading some more, I realized there is an optional external EEPROM that can be used to change the controller’s configuration. This seemed like the perfect place to start probing.

AU9410 Reference Schematic

Unfortunately, the EEPROM is not labeled. On the other hand, the controller schematic states that it must be ‘a 24C08 or compatible’ part. Microchip makes one, so I found that data sheet. Time to start probing!

The keyboard designers left a nice 5 pin header unpopulated on the board. After some looking around, here’s the pinout I came up with:

1 – VCC
2 – EEPROM Write Protect
3 – I2C SCL
4 – I2C SDA
5 – GND

I used my Saleae Logic to take a look at what the keyboard does when it’s connected. The first thing it does after starting is read a whole lot of data from the EEPROM. After looking at the addresses, I noticed it was all the USB descriptor information, including the device name, manufacturer, etc.

Initial Capture (USB Descriptors, etc.)

The next thing I tried was pressing a key to see what happened. Turns out that whenever a key is pressed, the controller reads from the EEPROM every ~9ms or so. The address it reads from corresponds to the row and column on the keyboard. The value it reads is a four-byte number. For example, when I press the letter ‘A’, it reads 00 00 04 04. The letter ‘G’ results in 00 00 0A 0A. The ‘next song’ and ‘previous song’ keys had slightly different codes: 03 B5 00 40 and 03 B6 00 3E, respectively.

‘A’ Press Capture (Single)

Key Press Capture (Multiple)
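Based only on the samples above, the four-byte entries seem to follow a simple pattern: plain keys look like 00 00 kk kk (the keyboard usage ID twice), while media keys look like 03 uu xx xx, where uu is a Consumer Page usage ID. Here’s that guess as code; this is purely my interpretation of my captures, not anything from Alcor Micro’s documentation:

```python
# Guess at the layout of the 4-byte EEPROM key entries, based on the
# captured samples: 'A' = 00 00 04 04, 'G' = 00 00 0A 0A,
# next song = 03 B5 00 40, previous song = 03 B6 00 3E.
# Field names are my interpretation, not documented anywhere.

def decode_key_entry(entry):
    """Classify a 4-byte key entry as a keyboard or consumer usage."""
    b0, b1, b2, b3 = entry
    if b0 == 0x03:
        return ("consumer", b1)   # e.g. 0xB5 = Scan Next Track
    return ("keyboard", b2)       # e.g. 0x04 = 'A', 0x0A = 'G'
```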

I wasn’t sure exactly what these numbers meant, so I started googling around. I ended up looking at the USB HID Usage Tables. In chapter 10, they have the Keyboard/Keypad codes. The letter ‘A’ is mapped to 0x04 and ‘G’ is mapped to 0x0A. Great, now I know what the numbers mean! Or not. Turns out the keyboard codes only go up to 0xE7. The rest (up to 0xFFFF) are reserved…

Initial Setup with Saleae Logic

After some struggling, I ended up in the same document, this time in chapter 15 (Consumer Page). Looks like Usage ID 0xB5 is assigned to “Scan Next Track”. Perfect! I still don’t know what the rest of the bytes mean, but the 0xB5 means scan next track…

Now that I knew what I needed to change, I had to figure out how to do it. I took out my BusPirate and connected it to the I2C bus. Once it’s in I2C mode, you can dump the entire EEPROM using “[0xa0 0x00][0xa1 r:1025]”. I had to read an extra byte because the last one kept getting NACKed. Always make sure to back up your EEPROM data before modifying! For the curious, here are my EEPROM’s contents:

48 31 B1 C3 1C 35 C3 CC CC DE 35 70 47 50 70 B1 60 69 DE 13 4D 51 51 65 65 88 88 A0 09 02 19 00 01 01 00 A0 19 09 04 00 00 01 09 00 00 00 07 05 81 03 01 00 FF 09 02 3B 00 02 01 00 A0 19 09 04 00 00 01 03 01 01 00 09 21 10 01 00 01 22 41 00 07 05 81 03 08 00 0A 09 04 01 00 01 03 00 00 00 09 21 10 01 00 01 22 35 00 07 05 82 03 04 00 FF 05 01 09 06 A1 01 05 07 19 E0 29 E7 15 00 25 01 75 01 95 08 81 02 95 01 75 08 81 01 95 05 75 01 05 08 19 01 29 05 91 02 95 01 75 03 91 01 95 06 75 08 15 00 26 FF 00 05 07 19 00 2A FF 00 81 00 C0 00 00 10 01 09 00 00 08 8F 05 10 94 00 01 01 02 00 01 09 29 03 00 00 16 32 00 FF 12 01 10 01 00 00 00 08 8F 05 10 94 22 01 01 02 00 01 05 01 09 80 A1 01 85 02 75 01 95 01 15 00 25 01 09 81 81 06 09 82 81 06 09 83 81 06 75 05 81 01 C0 05 0C 09 01 A1 01 85 03 95 01 75 10 19 00 2A 3C 02 81 00 C0 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 04 03 09 04 28 4B 49 4E 45 53 49 53 20 43 4F 52 50 4F 52 41 54 49 4F 4E 46 4B 42 38 30 30 48 4D 20 4B 69 6E 65 73 69 73 20 46 72 65 65 73 74 79 6C 65 32 20 66 6F 72 20 4D 61 63 30 55 53 42 20 4D 75 6C 74 69 6D 65 64 69 61 20 4B 65 79 62 6F 61 72 64 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 E7 E7 00 00 E7 E7 00 00 00 00 00 00 00 00 43 B8 00 00 00 00 00 E5 00 08 50 E1 00 08 4F E5 00 08 2F E1 00 08 30 E5 00 00 E2 E2 00 00 22 22 00 00 3A 3A 00 00 1C 1C 00 08 06 E1 00 08 04 E5 00 08 1D E5 00 00 E1 E1 00 00 E5 E5 00 00 21 21 00 00 4D 4D 00 00 4A 4A 00 00 50 50 00 00 00 00 00 00 52 52 00 00 58 58 00 00 67 67 00 00 57 57 00 00 4E 4E 00 00 4B 4B 00 00 56 00 00 00 55 00 00 00 63 00 00 00 5B 00 00 00 5E 00 00 00 61 00 00 00 00 00 02 81 00 E5 00 00 4F 4F 00 00 54 00 00 00 62 00 00 00 5A 00 00 00 5D 00 00 00 00 E5 03 B8 00 E1 00 00 4C 4C 00 00 51 51 00 00 53 00 00 00 2C 2C 00 00 59 00 00 00 
5C 00 00 00 00 E4 03 E9 00 43 03 EA 00 42 00 0C 07 45 00 00 28 28 00 01 51 44 00 00 31 31 00 00 2A 2A 00 00 00 E6 00 00 00 00 00 00 19 19 00 00 E7 E7 00 00 0A 0A 00 00 E3 E3 00 00 00 E4 00 08 1B E1 00 00 23 23 00 00 27 27 00 00 2D 2D 00 00 38 38 00 00 32 32 00 00 34 34 00 00 33 33 00 00 2F 2F 00 00 13 13 00 00 26 26 03 E2 00 41 00 00 00 00 00 00 37 37 00 00 00 00 00 00 0F 0F 00 00 00 00 00 00 12 12 00 00 25 25 00 00 2E 2E 00 00 87 87 00 00 36 36 00 00 00 00 00 00 0E 0E 00 00 30 30 00 00 0C 0C 00 00 24 24 00 00 00 00 00 00 11 11 00 00 10 10 00 00 0B 0B 00 00 0D 0D 00 00 1C 1C 00 00 18 18 00 00 21 21 00 00 22 22 00 00 05 05 00 00 19 19 00 00 0A 0A 00 00 09 09 00 00 17 17 00 00 15 15 00 00 20 20 00 00 6A 3B 00 00 2C 2C 00 00 06 06 00 00 45 3D 00 00 07 07 00 01 52 3C 00 00 08 08 00 00 1F 1F 00 00 69 3A 03 B3 00 40 00 00 1B 1B 00 00 64 64 00 00 16 16 00 00 39 39 00 00 1A 1A 00 00 1E 1E 00 00 35 35 03 CD 00 3F 00 00 1D 1D 00 00 29 29 00 00 04 04 00 00 2B 2B 00 00 14 14 03 B4 00 3E 00 00 E0 E0 00 08 19 E1 00 00 E6 E6 00 00 00 00 00 00 4C 4C 00 00 23 23 00 00 23 23
Saleae Logic + Bus Pirate Setup

Now that I had a backup of the data, it was time to start modifying the keys! But no, it didn’t work. I forgot to pull the write protect line low before attempting to write to the EEPROM. With that fixed, it works! As a first experiment, I reprogrammed the next song key with “[0xa6 0xA8 0x00 0x00 0x04 0x04]”. Immediately after, pressing the next song key produced ‘a’ characters. Perfect, now I can modify what any key does. (This could be the source of some nasty pranks… but I won’t go there today.)

Full Setup

So the current media key codes are broken… what should I change them to? I had no idea. I spent a good hour or two searching around, trying to figure out what the ‘correct’ codes would be, with no luck. I started thinking about how I could brute force all codes until one worked. Fortunately, I didn’t have to. I went back to the USB HID Usage Tables and noticed the ‘Fast Forward’ and ‘Rewind’ keys, which have codes 0xB3 and 0xB4. Could it be that easy? Nah, there are other bytes which are different in the current configuration. Let’s try it anyway… So I programmed these values into the EEPROM: “[0xa6 0xA8 0x03 0xB3 0x00 0x40][0xa6 0xE0 0x03 0xB4 0x00 0x3E]”

HID Usage Tables Screenshot

Turns out it was that easy! The media keys work now. :) I have no idea what the last byte does, but changing it didn’t seem to affect the key function. So there you have it. Instead of sending my keyboard back for a firmware update, I spent the morning probing around and fixed it. Much more fun, not to mention educational! If there’s anyone else with this problem, I hope this helps!

Debugging ARM Cortex-M3 Devices with GDB and openOCD

After getting the gcc-arm compiler working with the mbed, I decided to take a look at my LPCXpresso LPC1769 development board. The mbed is really easy to program: it mounts as a flash drive and you just drag and drop the binary file onto it. Unfortunately, that’s it. There is no way to get any debug information out of it. The LPCXpresso, on the other hand, comes with a nice LPC-Link board attached just for this purpose. Unfortunately (again), it only works with certain IDEs, like code_red. I cut the LPC-Link board off and instead used a BusBlaster from Dangerous Prototypes along with OpenOCD. It took me a while to actually program the device, so I’ll leave that for later. This post is about debugging!

BusBlaster and LPCXpresso LPC1769

So why, you might ask, do I go to all this trouble to get a debugger working? Because debuggers are awesome! Without them, one has to resort to printf statements (if you’re lucky enough to have that working) and LEDs. Sure, those are useful sometimes, but having access to memory, registers, stepping through code, etc. makes debugging much easier!


* All of the following takes place in Ubuntu 12.04.
* I’m using a simple blinking LED program as an example.


  1. Compile your code with arm-gcc and make sure to pass the -g flag to generate debugging information.
  2. Run openOCD to connect to the device.
    $ openocd -f openocd.cfg
  3. Run GDB and connect to openOCD
     $ arm-none-eabi-gdbtui -ex "target remote localhost:3333" blink.elf

    If you see [No source available], it’s probably because the core is running. Do the following to halt it the first time:

    (gdb) continue
    (gdb) Ctrl+c

    The C source should now be visible

  4. Set up split view to see disassembly (if you want)

    (gdb) layout split
  5. Debug away!
Split view in GDB

It’s been a while since I’ve used GDB, but here are some examples of useful commands:
(gdb) Ctrl+C – halt execution
(gdb) step/next – step through a line (of C code)
(gdb) stepi/nexti – step through an individual instruction
(gdb) continue – continue execution
(gdb) break 22 – set a breakpoint at line 22 (you can also name a function instead)
(gdb) delete – get rid of all the breakpoints
(gdb) where/backtrace – print a backtrace of all stack frames
(gdb) info locals – see local variables
(gdb) info registers – see all registers and values
(gdb) x/2xw 0x2009C018 – show 2 words (in hex!) starting from memory address 0x2009C018
(gdb) set {uint32_t}0x2009C018 = 0x400000 – set the value at memory address 0x2009C018 to 0x400000
There are many more things you can do with GDB. For more info, check out Debugging with GDB.

GCC-ARM for Cortex-M3 on Ubuntu

Don’t want to read through this post and just want the code? Go get it from my github repo!

I’ve been meaning to start working with Cortex-M3 processors for a few of my projects. Sometimes the MSP430s I usually use just aren’t fast enough. :-)

Previously, I’ve used the mbed, which is a nice Cortex-M3 (LPC1768) development board. The best part about it is that you program it by just copying the bin file to it as if it were a flash drive. It takes care of the actual device programming for you. The downsides are that you don’t have any debugging capabilities and that you have to use their online compiler.

The online compiler isn’t so bad, but you have to use their libraries and it’s all in C++. When working with microcontrollers, I prefer plain old C. To get around this, I decided to figure out how to use the GNU GCC-ARM compiler so I could have full control of the processor. I’m not going to cover every step I took, because that’s boring, but here’s a quick summary of how it went. (You can also look at my commit history for more details).

While trying to figure out what flags to pass to the compiler to build the most basic program, I found out the online mbed compiler has a ‘recently’ added export feature! You write your program with their online C++ compiler, export it for gcc-arm, and then you can build it all offline. They include all the pre-compiled libraries and header files required, along with a nice Makefile. This made my life much easier. Now all I had to do was start cutting stuff out until I was happy.

I started by switching the main file from C++ to C. I then started removing all of their included libraries. Eventually, I was able to remove all of their files and start using the standard CMSIS headers. I had to find a plain C version of startup_LPC17xx.c in some other code examples on NXP’s site. Finally, I had a very basic Makefile that built an LED blinky program that actually worked on the mbed.

One thing that was slightly disturbing was that my blinky program was ~20kB! That is way too big! Turns out they were including things like the C math library and floating-point versions of scanf and printf! I don’t need that… Now the code is ~2.5kB, which is still a bit high, but not 20kB!

So that was my experience getting started with the gcc-arm compiler. You can get the code from my github repo. I will keep updating this as I learn more. My goal is for every new ARM project I work on to start from here.

Automation Progress – Light Alarm and Remote

If you haven’t heard about my ‘home automation’ project, you should probably read this first.

This weekend was rather productive. I managed to get my light alarm working! I wrote escheduler, which is something like crond, but calls functions instead of running programs. It’s also more limited, since it only schedules events on a weekly basis.
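The core idea behind escheduler can be sketched in a few lines. This is illustrative Python only; the real escheduler runs on the beaglebone and its actual API isn’t shown here:

```python
# Minimal sketch of a weekly event scheduler that calls functions
# instead of running programs (crond-style, but weekly only).
# All names are illustrative, not escheduler's real API.

class WeeklyScheduler:
    def __init__(self):
        self.events = []   # (weekday 0=Mon, hour, minute, callback)

    def add(self, weekday, hour, minute, callback):
        """Register a callback for a weekly (weekday, time) slot."""
        self.events.append((weekday, hour, minute, callback))

    def due(self, weekday, hour, minute):
        """Return the callbacks scheduled for this exact weekly moment."""
        return [cb for (d, h, m, cb) in self.events
                if (d, h, m) == (weekday, hour, minute)]
```

A loop would poll `due()` with the current time once a minute and fire whatever matches, e.g. a callback that sends the radio command to turn the bedroom lights on.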

Beaglebone with a more ‘permanent’ radio setup

Escheduler runs on the beaglebone, which is currently my ‘home server’. The current set-up turns on the light behind my bed a few minutes before I’m supposed to wake up. Eventually there will be a whole fade-in period, but I just wanted to get something working. I left it running overnight, and sure enough, the lights turned on right on time this morning!

Previously, the only way to control the lights was to ssh into the beaglebone and run swrite with the radio commands. You might not think so, but ssh-ing into the server and typing commands from a smartphone in bed at 6am is not as easy as it sounds! To make things easier, I decided to make a web interface I can use from my phone.

Simple RGB LED controller in smartphone browser

It took longer than expected, but I ended up using Twitter Bootstrap. I haven’t done any web development in several years, so I had to re-learn a lot of things. In the end, I just set up a few buttons to turn the lights on or off. When one is clicked, an AJAX request goes out in the background to a PHP file that calls swrite to send the commands via the beaglebone radio. Next up will be controlling the actual colors from the web interface, as well as viewing and editing the alarms.

Here’s a quick video of the remote in action:

Beginnings of Home (ok, apartment) Automation

I’ve been neglecting my projects for a while, but I finally decided to stop being lazy and started working on them again. Several of my old posts deal with msp430s and cc2500s, but they’re mostly hardware updates and examples of lights changing to music. Now that I have a semi-stable platform and decent libraries, I figured I might as well do something useful with them.

My ‘end goal’ is to be able to control things like lighting and temperature automatically (and/or remotely), as well as gather data about the environment (temperature, motion, air quality, etc.). As usual, the first thing I did was start designing the whole system in my head and over-complicating things. I wanted to build a fancy server with flexible communications protocols, which got overwhelming pretty fast. After some time without getting anything done, I decided to start fresh and do small, manageable tasks that will eventually add up to the full working system.

The first task was to get my beaglebone talking to my cc2500 radios. I didn’t want to waste time trying to figure out how to get SPI and interrupts working on the beaglebone, so I went for the simpler solution and used an msp430 to control the cc2500. I say simpler because I already had a uart-to-msp430-to-cc2500 bridge working, so the only new thing was figuring out the uart on the beaglebone. Luckily, there are several good posts about it online.

The “Server” – Beaglebone with CC2500

Once I had my ‘server’ talking to the radio, I had to write a program to control it. Again, I could write an entire, complicated server application, or I could do something much simpler to start. Since I’m currently only testing with wireless lighting, I don’t need the server to receive data. Instead of an always-running application that takes over the serial port, my swrite program is called from the command line each time a new packet is sent. While limited, this solution is enough for now.

Wireless RGB LED controller mounted on bike rack

To start testing the lighting control, I mounted two RGB LED strips with controllers in my apartment. One is behind my bike rack in the living room, and the other behind my bed. The one behind my bed will eventually be tied to my alarm clock, so I can start turning up the lights before I wake up. Since my computer is not in my room, I use the one in the bike rack for testing.

Lights on Bike Rack

So that’s what I have so far. It’s not terribly exciting, but maybe if I write about it, I’ll be more motivated to keep working on it. Right now I’m writing a program on the server that will allow me to schedule events. A bit like crond, but instead of running programs, it calls my own functions. The first test will be to set up the light-alarm. I hope it works!

RGB LED strip behind bed

Biking to SFO

I recently found out that the San Francisco International Airport (SFO) offers bike parking for up to 14 days. I decided to try it out during the Thanksgiving break just for fun (with the added bonus of saving on driving/parking or taxi costs).

The San Jose airport is much closer to where I live (San Jose/Cupertino area), but SFO has more direct flights to where I need to go. SFO is over 40 miles from my apartment, so the trip did require some planning. First of all, I had to reduce the amount of luggage I packed, since I was going to be carrying it on my back. Since I couldn’t predict whether it was going to rain, I decided to use a waterproof backpack. Finally, I didn’t want to bike all 40 miles, so I only biked to/from CalTrain stations and took the train the rest of the way.

Getting There

My flight was at 12:40 on Saturday. I decided to take the 8:14am train from Sunnyvale, so I would get to the airport around 9:30am. (This way, I could miss the train and still make it in time. Flats happen!) The weather was great, cold and rainy, so I got to use all my waterproof gear.

Ready for some rain!

It took me about half an hour to get to the CalTrain station. I didn’t run into any problems on the way there.

I probably shouldn’t stand there…

The train was on time and the bike car was fairly empty.

That’s the waterproof bag! (On the train)

I got off the train at the Millbrae station. There are two options to get to SFO from there: one is to take BART, the other is to bike the rest of the way. Since I already had my bike, I decided to use it. It was just over 2 miles, so it took around 10 minutes (which I assume is faster than taking BART).

Yup, it was still raining a bit. (Millbrae Station)

To get to the airport from the station, I took Millbrae Ave to S McDonnell Rd. Millbrae Ave was the only segment of the trip without a dedicated bike lane. Once I reached the airport, I registered my bike with the guy in the security booth (who had never done it before) and locked it away.

Yes, it folds!

So that was the trip there… Oh wait, I should warn anyone trying to do the same. If it happens to be raining, and your clothes are soaking wet, change before going through security! I always opt out of the TSA’s full body scanners and get the enhanced pat down (fun!). After they pat you down, they scan the gloves they used on you for whatever it is they’re looking for. Turns out, if the gloves are soaked, the machine doesn’t like it, and you get an extra-special pat down and search. Luckily, I had plenty of time and the TSA agents were nice.

Be sure to bring a change of clothes. Nobody likes traveling with wet socks/shoes!

Getting Back

I was slightly worried when the airport guy had never had anyone register a bike with him. Turns out, we did it right and my bike was still there when I returned. Phew!

Bike was still there when I returned!

I got back after dark, but I brought my bike lights so that wasn’t a problem. The airport road was pretty well lit, so if you forget, you should be ok.

Millbrae station. Busy!

The train was surprisingly full (after seeing the empty-ish station), but the bike car didn’t have too many bikes.

NOT my bike on the train

Since I flew across the country and didn’t have much to eat, I did have to make an unscheduled stop for food.

Turns out they have bike parking too!

That’s it, my experiment was successful! It is possible to get to SFO on your bike.

PIR Sensor Comparison

I recently started working on a project that might require some Passive Infra-red (PIR) sensors. Sparkfun had two different ones, so I figured I’d try them both.

Zilog ePIR and SE-10

The comparison is between a Zilog ePIR and an SE-10. One very important thing to note is that I did not use the ePIR’s serial interface. I only used the hardware interface, set to the highest sensitivity and shortest delay. I felt that this would be a fairer comparison. I made a video that shows my test setup, as well as some results, so most of the relevant information is there. I’ll use this blog post to add some setup details, as well as my conclusion.

First of all, I must warn everyone (as Sparkfun did) about the SE-10. The wire colors mean absolutely NOTHING. As you can see in the photo, there are black, brown, and red wires. Here’s what they were connected to on mine: black = 12V, brown = GND, red = alarm. It’s not a huge problem; just make sure you know before supplying power. There’s a small label on the bottom near each wire that says B+ and AL, which should help.

The alarm pin needs to have a pull-up resistor, but that’s about it. Out of the box, it worked without a problem.
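Since the alarm output is open-collector, the pull-up makes the line idle high and get pulled low on motion. Here’s a minimal sketch of how firmware might interpret sampled levels; the consecutive-sample threshold is my own assumption (a simple glitch filter), not anything from the datasheet:

```c
#include <stdbool.h>
#include <stddef.h>

/* SE-10 alarm line is active low: with the pull-up it idles high (1)
 * and is pulled low (0) during motion. Requiring a few consecutive
 * low samples filters out brief glitches (threshold is an assumed
 * tuning parameter, not a datasheet value). */
static bool se10_motion_detected(const int *samples, size_t n, size_t needed)
{
    size_t run = 0;
    for (size_t i = 0; i < n; i++) {
        run = (samples[i] == 0) ? run + 1 : 0;  /* count consecutive lows */
        if (run >= needed)
            return true;
    }
    return false;
}
```

On a real board, the samples would come from a periodic digital read of the alarm pin.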

You’ll notice the SE-10 takes in 12V. Several comments on the Sparkfun product page deal with powering the device with 5V or 3.3V. All I did was remove the 5V regulator and jumper across it with a wire (the big red one in the photo). Unfortunately, I ripped off some traces while removing the regulator (oops!). At first it wouldn’t work, but I noticed a very small trace running under one of the resistors and connecting to the ground pin of the regulator. After connecting that to ground again (the small red wire in the photo), everything worked fine at 5V.

SE-10 Jumper

The ePIR sensor took me a bit longer to get working. The datasheet is quite long, mostly documenting all the configuration options and the serial interface, but it does have a nice schematic showing Hardware Interface Mode, with all the connections required to get the sensor working without the serial interface. The schematic has three potentiometers, a photoresistor, and three regular resistors. Luckily, you don’t even need those for the most basic operation. The pots set the delay (pin 2), sensitivity (pin 3), and light gate threshold (pin 6). You can just connect the delay and sensitivity pins to ground, and the light gate to VCC (which is 3.3V for this device). This gives the minimum delay (2 seconds), maximum sensitivity, and keeps the device always on.

The light gate is there in case you want to enable/disable your sensor depending on ambient light with a photoresistor. You can also just use the pin as an enable/disable for the whole device. It is disabled while the input is less than 1V.

The delay had me a bit confused initially. It is not a delay before the sensor lets you know something is moving; rather, once it detects something, it holds the ‘detected’ state for the delay period. I had the minimum delay setting of 2 seconds (it goes up to 15 minutes), so after detecting motion, the signal would stay low for that amount of time.
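The hold behavior described above can be modeled as a retriggerable timer: each motion event restarts the window, and the output stays asserted until the delay elapses with no new motion. This is just my sketch of that logic, not code from the device:

```c
#include <stdbool.h>
#include <stddef.h>

/* Model of the ePIR hardware-mode output as described: the motion
 * signal is asserted (pulled low on the real pin) from the moment of
 * a motion event until delay_s seconds after it. Multiple events
 * simply extend the window. Times are in seconds. */
static bool epir_asserted_at(const double *events, size_t n,
                             double delay_s, double t)
{
    for (size_t i = 0; i < n; i++) {
        /* asserted if t falls inside any event's hold window */
        if (events[i] <= t && t < events[i] + delay_s)
            return true;
    }
    return false;
}
```

With the minimum 2-second delay, motion at t = 0 and t = 1.5 keeps the output asserted continuously until t = 3.5.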

So which sensor should you use? As always, it depends… I found the ePIR sensor to be much more sensitive than the SE-10. Maybe it was just the one I received, or maybe they’re all like that, but the ePIR could detect me at a wider angle (and behind the blanket!). It also seemed faster than the SE-10 at longer distances. On the other hand, when I stood a few feet in front of the sensors and turned around, the response was about the same.

Another thing I mentioned in the video is the ‘delay’ time with the ePIR. The output signal is much cleaner because of it. The SE-10, on the other hand, constantly switches while something is moving. I guess that could be useful if you’re trying to count something…
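If you did want to exploit that constant switching for counting, one simple approach is to count falling edges on the pulled-up alarm line. A hedged sketch over sampled pin levels:

```c
#include <stddef.h>

/* Count high-to-low transitions in a series of sampled levels from
 * the SE-10 alarm line. Since the sensor toggles repeatedly while
 * something moves, this gives a rough activity count. */
static int count_falling_edges(const int *samples, size_t n)
{
    int count = 0;
    for (size_t i = 1; i < n; i++)
        if (samples[i - 1] == 1 && samples[i] == 0)
            count++;
    return count;
}
```

In firmware, the same logic would usually live in a pin-change interrupt rather than a polling loop.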

If you’re looking for a very easy-to-set-up sensor and aren’t too worried about sensitivity, I’d go with the SE-10. It’s three wires and a pull-up resistor! It doesn’t get much simpler than that. On the other hand, if you want more control and are willing to spend more time tweaking, the ePIR is probably a better idea.
