Brick in the Cloud

The Lego Machine Cloud Project

Realtime Capture of Lego Robot Sensor Data in Salesforce

In my last blog I focused on getting the Lego robot to understand a selection of commands given to it via posts sent to its Chatter persona through Salesforce's Chatter mobile app. In return it gave basic confirmations (as post comments) that it had executed the commands.

This time I wanted to explore the sensors that came with the kit, and ways in which I can push the data from those sensors into Salesforce for further processing and analytics. Eventually this will enable dynamic adaptation of the robot, through a combination of Apex code running in the Cloud and the NXC code running within the robot.


The Ultrasonic Sensor is quite an interesting one: it allows the robot to read the distance (in cm) to objects in front of it. The video below shows this sensor in action, as it feeds its readings (ranging from 0 to 255 cm) back to Salesforce in realtime!

Normally Salesforce only shows new data when you refresh the page, but I wanted to see the data as it arrived. For this I turned to the Salesforce Streaming API. Having created a small Visualforce page, I was able to subscribe to a topic I had created and display the sensor data on the page as the robot sent it.

As you can see in the short video, I have two custom objects, Robot and Sensor Data. Each record on the Robot object represents a robot in the physical world (I've got quite an exciting 'robot social' vision for this concept, btw!). The Sensor Data object is a child of Robot, so the data it captures is linked with the specific robot that generated it.


As before, I am currently using a custom Heroku RESTful service to handle the connection to Salesforce; it is called from the robot via the following NXC code.

// Read the Ultrasonic Sensor attached to port 4 (returns 0-255 cm)
int sensorVal = SensorUS(S4);
// Convert the reading to a string for use in the query string
string data = NumToStr(sensorVal);
// Call the custom Heroku RESTful service (quickGETHostname and WB_PORT
// come from the helper code described in my earlier posts)
string result = quickGETHostname(WB_PORT,
    "/services/sensor?sensor=Ultrasonic&data=" + data);
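To give a feel for the other side of this call, here is a minimal sketch in Java of what a Heroku-side service might do with the robot's GET request. The class, method, and Salesforce field API names (`Sensor__c`, `Data__c`, `Robot__c`) are my assumptions for illustration; the actual service code (and how it authenticates to Salesforce) is not shown in this post.

```java
import java.util.HashMap;
import java.util.Map;

public class SensorService {

    // Parse the query string sent by the NXC code,
    // e.g. "sensor=Ultrasonic&data=42", into name/value pairs.
    public static Map<String, String> parseQuery(String query) {
        Map<String, String> params = new HashMap<>();
        for (String pair : query.split("&")) {
            String[] parts = pair.split("=", 2);
            params.put(parts[0], parts.length > 1 ? parts[1] : "");
        }
        return params;
    }

    // Build a JSON body for a Sensor Data record, ready to POST to the
    // Salesforce REST API. Field API names here are assumptions.
    public static String toSensorDataJson(Map<String, String> params,
                                          String robotId) {
        return String.format(
            "{\"Sensor__c\":\"%s\",\"Data__c\":\"%s\",\"Robot__c\":\"%s\"}",
            params.get("sensor"), params.get("data"), robotId);
    }

    public static void main(String[] args) {
        Map<String, String> params = parseQuery("sensor=Ultrasonic&data=42");
        System.out.println(toSensorDataJson(params, "a0000000000001"));
    }
}
```

The real service would then insert this record via the Salesforce REST API, which is what causes the Streaming API topic to fire and the Visualforce page to update.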

What's next?

If you looked closely at the video, you will have seen that my Robot object has a couple of fields I didn't use in the demo: the 'Distance' and 'Breach Action' fields!
In a future post I plan to use these fields to dynamically drive parts of the robot's programming: changing their values will change its behaviour dynamically, without the need to download new code to it. To do this, I will revisit the Chatter command integration from my previous blog, adding the ability to respond to Chatter Field Tracking events!
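As a rough sketch of the idea, assuming the 'Distance' field holds a threshold in cm and 'Breach Action' names the behaviour to trigger (both hypothetical values here, since the actual field semantics aren't defined yet), the decision logic might look like this in Java:

```java
public class BreachCheck {

    // Decide what the robot should do for a given ultrasonic reading.
    // SensorUS returns 0-255 cm; thresholdCm and breachAction would be
    // pulled from the Robot record's 'Distance' and 'Breach Action' fields.
    public static String actionFor(int sensorCm, int thresholdCm,
                                   String breachAction) {
        if (sensorCm < thresholdCm) {
            return breachAction; // threshold breached, e.g. "Reverse"
        }
        return "Continue"; // clear ahead, carry on
    }

    public static void main(String[] args) {
        System.out.println(actionFor(30, 50, "Reverse"));  // object too close
        System.out.println(actionFor(120, 50, "Reverse")); // clear ahead
    }
}
```

The point is that the threshold and action live in Salesforce, so editing the Robot record changes behaviour without redeploying code to the brick.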

3 thoughts on "Realtime Capture of Lego Robot Sensor Data in Salesforce"

  1. Pingback: Meanwhile… On BrickInTheCloud… Sensory Input! | Andy in the Cloud

  2. What is the communications method of the Lego robot? Is it doing HTTP pushes, TCP, or UDP? I'm currently exploring the best ways to get non-HTTP protocols to talk to the likes of Salesforce and Heroku. Cheers, Fred.

    • It uses the Bayeux protocol over HTTP, otherwise known as the Salesforce Streaming API. I have included sample Java code that runs on the robot and consumes this API. In general the JVM also supports low-level TCP/IP, and thus effectively other protocols, though keep in mind that Salesforce itself only supports HTTP-based protocols.
