AngelHack Vancouver and Bluetooth Low Energy
Event details
This past weekend I participated in the AngelHack hackathon. What is that? Teams pitch their ideas and recruit members who are interested in the same thing. You then have 24 hours to complete/hack your project and present it to the group.
The idea I wanted to work on involved the TI SensorTag. This is similar to Apple's iBeacon technology and uses Bluetooth 4.0 Low Energy. "Low energy" means the devices only work at short range, but also that some will run for over 2 years on a coin battery. We intended to use a number of TI SensorTags and an Android device to get accurate indoor positional data. Once we had an accurate position and an orientation for the device, we planned on letting you interrogate your environment. This environment interrogation would take the form of a kind of Augmented Reality, looking through the device's camera. I had imagined the Terminator's red view of the world, with constant printouts of scanned objects ;)
The plan for getting the BLE data was to fix the SensorTags at known coordinates (lat, lng) and then measure the signal strength (RSSI) between each tag and the device. Using a number of signal strength readings we could then interpolate to get an accurate position for the device. THIS DID NOT WORK :(
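For the curious, here is a minimal sketch of the math we were attempting. The class and method names are my own, not from our hack code: RSSI is converted to an estimated distance with a log-distance path loss model, and then a 2D position is trilaterated from three beacons at known coordinates by subtracting the range-circle equations and solving the resulting 2x2 linear system.

```java
// Sketch of the attempted positioning approach (names and constants are assumptions).
public class Trilateration {

    // txPower: expected RSSI at 1 m; n: path-loss exponent (~2.0 in free space,
    // higher indoors). Inverts the log-distance path loss model.
    static double rssiToDistance(double rssi, double txPower, double n) {
        return Math.pow(10.0, (txPower - rssi) / (10.0 * n));
    }

    // b: three beacon positions {x, y}; d: estimated distance to each beacon.
    // Subtracting circle equations gives two linear equations; solve by Cramer's rule.
    static double[] locate(double[][] b, double[] d) {
        double ax = 2 * (b[1][0] - b[0][0]), ay = 2 * (b[1][1] - b[0][1]);
        double bx = 2 * (b[2][0] - b[0][0]), by = 2 * (b[2][1] - b[0][1]);
        double c1 = d[0] * d[0] - d[1] * d[1]
                  - b[0][0] * b[0][0] + b[1][0] * b[1][0]
                  - b[0][1] * b[0][1] + b[1][1] * b[1][1];
        double c2 = d[0] * d[0] - d[2] * d[2]
                  - b[0][0] * b[0][0] + b[2][0] * b[2][0]
                  - b[0][1] * b[0][1] + b[2][1] * b[2][1];
        double det = ax * by - ay * bx;
        return new double[]{(c1 * by - c2 * ay) / det, (ax * c2 - bx * c1) / det};
    }

    public static void main(String[] args) {
        // Beacons at (0,0), (4,0), (0,4); true position (1,1).
        double[][] beacons = {{0, 0}, {4, 0}, {0, 4}};
        double[] dists = {Math.sqrt(2), Math.sqrt(10), Math.sqrt(10)};
        double[] p = locate(beacons, dists);
        System.out.printf("x=%.2f y=%.2f%n", p[0], p[1]); // prints x=1.00 y=1.00
    }
}
```

The math is fine; the problem (as described below) was that the SensorTag's RSSI was far too noisy to yield usable distances in the first place.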
As it turns out, the TI SensorTag is incapable of providing data that can be used in this manner. About the only predictable use of the signal strength was to determine whether you were really close to the device (like 10 cm close). This was a major letdown, and it took us well into the hack to realize that we were going to be unable to use the devices this way.
TI Sensor Tag
I have now ordered the "Estimote" beacons, with the hope that they will work for the above idea. I have already had a few reports that they are higher powered and should not suffer from the same shortcomings. Here is a link to the product: http://estimote.com
But like any good hacker you adapt and move on. We changed the idea from trying to position a moving target to more of a check-in system. By that I mean you would bring your device right up to the tag. Once we knew you were really close to the tag, we assigned you the same position as that fixed tag. From there we displayed your location and the locations of "points of interest".
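The check-in fallback can be sketched roughly like this. The class name and RSSI threshold are my own assumptions (tune the threshold to whatever your hardware reports at touching distance): each scan result is checked against a "near" threshold, and when a registered tag reads strong enough, the user simply adopts that tag's fixed coordinates.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the check-in fallback (names and threshold are assumptions, not hack code).
// RSSI was only reliable at very short range, so instead of interpolating a position we
// snap the user to a tag's known coordinates once the signal implies near-contact.
public class CheckIn {
    // Roughly the RSSI seen when a phone is held ~10 cm from a tag (assumed value).
    static final int NEAR_RSSI = -45;

    private final Map<String, double[]> tagPositions = new HashMap<>(); // address -> {lat, lng}
    private double[] lastFix; // null until the user has checked in somewhere

    void registerTag(String address, double lat, double lng) {
        tagPositions.put(address, new double[]{lat, lng});
    }

    // Feed every BLE scan result here; returns the current fix, or null if none yet.
    double[] onScan(String address, int rssi) {
        double[] pos = tagPositions.get(address);
        if (pos != null && rssi >= NEAR_RSSI) {
            lastFix = pos; // effectively touching the tag: adopt its position
        }
        return lastFix;
    }
}
```

On Android the `onScan` calls would come from the BLE scan callback, which reports each advertising device's address and RSSI.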
Once you know there are points of interest, you can switch into the AR mode that we built and view information about the objects right on top of the camera feed. You could imagine entire product schemas that you could drill down into to learn more, or even overlaid video or virtual message boards that people could write on, etc.
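The core of pinning a POI over the camera preview is a small geometry problem, sketched below with hypothetical names: compute the compass bearing from the user's fix to the POI, then map the bearing relative to the device's heading onto a horizontal pixel position using the camera's field of view.

```java
// Sketch of POI placement over a camera preview (all names are assumptions).
public class ArOverlay {

    // Initial bearing from (lat1,lng1) to (lat2,lng2) in degrees; 0 = north, clockwise.
    static double bearing(double lat1, double lng1, double lat2, double lng2) {
        double f1 = Math.toRadians(lat1), f2 = Math.toRadians(lat2);
        double dl = Math.toRadians(lng2 - lng1);
        double y = Math.sin(dl) * Math.cos(f2);
        double x = Math.cos(f1) * Math.sin(f2) - Math.sin(f1) * Math.cos(f2) * Math.cos(dl);
        return (Math.toDegrees(Math.atan2(y, x)) + 360.0) % 360.0;
    }

    // Map a POI bearing to a screen x pixel, given the heading the camera faces, its
    // horizontal field of view, and the preview width. Returns -1 when off-screen.
    static double screenX(double poiBearing, double heading, double fovDeg, int widthPx) {
        double rel = ((poiBearing - heading + 540.0) % 360.0) - 180.0; // to [-180, 180)
        if (Math.abs(rel) > fovDeg / 2.0) return -1;                   // outside view cone
        return (rel / fovDeg + 0.5) * widthPx;
    }
}
```

On a real device the heading would come from the orientation sensors, and the vertical placement would use pitch in the same way.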
In the end the project lacked the right visuals to really impress the crowd. However the knowledge gained was really valuable and I will be moving forward with this in the future.
Links to some source code on github
android-scala-ble - the scanning code required for Android to locate devices and read their signal strength.
android-scala-gl - an OpenGL ES 2 library to help draw basic shapes and sprites for the Augmented Reality portion of the project.
Estimote seems interesting! Doesn't look like they have an Android SDK/API per se though (although people seem to have used BLE to detect the Estimote beacons).
Yeah, I am still waiting for mine to show up. I hope to have more success with them. Glad you have reports of them working :)