Brick in the Cloud

The Lego Machine Cloud Project



Realtime Capture of Lego Robot Sensor Data in Force.com

In my last blog I focused on getting the Lego robot to understand a selection of commands given to it via posts to its Chatter persona from Salesforce's Chatter mobile app. In return it gave basic confirmations (as post comments) that it had executed the commands.

This time I wanted to explore the sensors that came with the kit, and ways in which I can push the data from those sensors into Force.com for further processing and analytics, eventually enabling dynamic adaptation of the robot through a combination of Apex code running in the Cloud and the NXC code running on the robot itself.

[Image: the ultrasonic sensor]

The Ultrasonic Sensor is quite an interesting one: it lets the robot read the distance (in cm) to objects in front of it. The video below shows this sensor in action, feeding its readings (ranging from 0 to 255 cm) back to Salesforce in realtime!

Normally Salesforce only shows new data when you refresh the page, but I wanted to see the data as it arrived. For this I turned to the Salesforce Streaming API: having created a small Visualforce page, I was able to subscribe to a topic I created and display the sensor data on the page as the robot sent it.

As you can see in the short video, I have two custom objects, Robot and Sensor Data. Each record on the Robot object represents a robot in the physical world (I've got quite an exciting 'robot social' vision for this concept, btw!). The Sensor Data object is a child of Robot, so the data it captures is linked to the specific robot that generated it.

[Screenshot: the Robot and Sensor Data custom objects]

As before, I am currently using a custom Heroku RESTful service to handle the connection to Salesforce. It is called via the following NXC code.

SetSensorUltrasonic(S4);                    // configure port 4 for the ultrasonic sensor
int sensorVal = SensorUS(S4);               // read the distance in cm (0-255)
string data = NumToStr(sensorVal);          // convert the reading for the query string
string result = quickGETHostname(WB_PORT,
    "brickinthecloud.herokuapp.com",
    "/services/sensor?sensor=Ultrasonic&data=" + data);  // HTTP GET to the Heroku service
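The snippet above sends a single reading. For the continuous feed you see in the video, you would wrap it in a loop along these lines (a sketch only: the 500 ms interval is my own choice, and it reuses the same quickGETHostname helper and WB_PORT constant as above):

// Sketch: stream ultrasonic readings to the Cloud continuously.
task main()
{
    SetSensorUltrasonic(S4);                // configure port 4 once, up front
    while (true)
    {
        int sensorVal = SensorUS(S4);       // distance in cm (0-255)
        string data = NumToStr(sensorVal);
        quickGETHostname(WB_PORT,
            "brickinthecloud.herokuapp.com",
            "/services/sensor?sensor=Ultrasonic&data=" + data);
        Wait(500);                          // throttle to roughly two samples a second
    }
}

Throttling matters here: each reading is an HTTP round trip to Heroku, so posting as fast as the sensor can sample would flood the service for little benefit.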

What's next?

If you looked closely at the video, you will have noticed that my Robot object has a couple of fields I didn't use in the demo: the 'Distance' and 'Breach Action' fields!
[Screenshot: the Robot object's Distance and Breach Action fields]
In a future post I plan to use these fields to dynamically drive parts of the robot's programming: changing their values will change its behaviour on the fly, without the need to download new code to it. To do this, I will revisit the Chatter command integration from my previous blog, adding the ability to respond to Chatter Field Tracking events!
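On the robot side, a threshold-driven behaviour could look something like the sketch below. To be clear, this is purely illustrative: the /services/config endpoint is made up for the example (the real mechanism will flow through the Chatter integration), and the "stop the motors" response is just one possible Breach Action.

// Hypothetical sketch: fetch the 'Distance' threshold from the Cloud and
// react when the ultrasonic reading breaches it.
task main()
{
    SetSensorUltrasonic(S4);
    while (true)
    {
        string raw = quickGETHostname(WB_PORT,
            "brickinthecloud.herokuapp.com",
            "/services/config?field=Distance");  // hypothetical endpoint
        int threshold = StrToNum(raw);           // threshold in cm
        if (SensorUS(S4) < threshold)
        {
            Off(OUT_BC);                         // example breach action: stop both drive motors
        }
        Wait(1000);                              // re-check the threshold once a second
    }
}

The appeal of this pattern is exactly what the post describes: the NXC program never changes, yet editing a field in Salesforce alters the robot's behaviour.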
[Screenshot: Chatter Field Tracking settings]
