In my last blog I focused on getting the Lego robot to understand a selection of commands given to it via posts sent to its Chatter persona through Salesforce's Chatter mobile app. In return it gave basic confirmations (as post comments) that it had executed the commands.
This time I wanted to explore the sensors that came with the kit, and ways to push the data from those sensors into Force.com for further processing and analytics. Eventually this will enable dynamic adaptation of the robot through a combination of Apex code running in the cloud and the NXC code running within the robot.
The Ultrasonic Sensor is quite an interesting one: it allows the robot to read the distance (in cm) of objects in front of it. The video below shows this sensor in action, feeding its readings (ranging from 0 to 255 cm) back to Salesforce in real time!
Normally Salesforce only shows new data when you refresh the page, but I wanted to see the data as it arrived. For this I turned to the Salesforce Streaming API. Having created a small Visualforce page, I was able to subscribe to a topic I had created and display sensor data on the page as the robot sent it.
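For reference, a Streaming API topic is defined by creating a PushTopic record whose SOQL query determines which record changes are pushed to subscribers. The sketch below builds the JSON body for creating one via the Salesforce REST API; the topic name and the object/field API names (Sensor_Data__c, Data__c) are assumptions for illustration, not necessarily what I used.

```java
public class PushTopicDefinition {
    // Builds the JSON body for creating a PushTopic via the Salesforce REST API
    // (POST /services/data/vXX.X/sobjects/PushTopic). The object and field API
    // names below (Sensor_Data__c, Data__c) are illustrative assumptions.
    public static String pushTopicJson(String name, String soql) {
        return "{"
            + "\"Name\": \"" + name + "\", "
            + "\"Query\": \"" + soql + "\", "
            + "\"ApiVersion\": 29.0, "
            + "\"NotifyForOperationCreate\": true, "
            + "\"NotifyForFields\": \"Referenced\""
            + "}";
    }

    public static void main(String[] args) {
        System.out.println(pushTopicJson("SensorData",
            "SELECT Id, Data__c FROM Sensor_Data__c"));
    }
}
```

With NotifyForOperationCreate set, every new Sensor Data record inserted by the Heroku service is pushed out to the Visualforce page without a refresh.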
As you can see in the short video, I have two custom objects, Robot and Sensor Data. Each record on the Robot object represents a robot in the physical world (I've got quite an exciting 'robot social' vision for this concept, by the way!). The Sensor Data object is a child of Robot, so the data it captures is linked to the specific robot that generated it.
As before, I am currently using a custom Heroku RESTful service to handle the connection to Salesforce; it is called via the following NXC code.
```c
SetSensorUltrasonic(S4);              // configure port 4 as an ultrasonic sensor
int sensorVal = SensorUS(S4);         // read the current distance in cm (0-255)
string data = NumToStr(sensorVal);    // convert the reading for the query string
string result = quickGETHostname(WB_PORT, "brickinthecloud.herokuapp.com",
                                 "/services/sensor?sensor=Ultrasonic&data=" + data);
```
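On the Heroku side, the service's first job is simply to pull the sensor name and reading out of that query string before forwarding them to Salesforce. A minimal sketch of that parsing step (the real service itself is not shown here, and the method names are my own):

```java
import java.util.HashMap;
import java.util.Map;

public class SensorQueryParser {
    // Parses the query string the NXC code sends, e.g. "sensor=Ultrasonic&data=42",
    // into key/value pairs. The Heroku service would then use these to insert a
    // Sensor Data record via the Salesforce REST API.
    public static Map<String, String> parseQuery(String query) {
        Map<String, String> params = new HashMap<>();
        for (String pair : query.split("&")) {
            int eq = pair.indexOf('=');
            if (eq > 0) {
                params.put(pair.substring(0, eq), pair.substring(eq + 1));
            }
        }
        return params;
    }

    public static void main(String[] args) {
        Map<String, String> p = parseQuery("sensor=Ultrasonic&data=42");
        System.out.println(p.get("sensor") + " reading: " + p.get("data") + " cm");
    }
}
```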
If you looked closely at the video, you will have seen that my Robot object has a couple of fields I didn't use in the demo: the 'Distance' and 'Breach Action' fields!
In a future post I plan to use these to dynamically drive parts of the robot's programming, changing these values to alter its behaviour on the fly (without the need to download new code to it). To do this, I will revisit the Chatter command integration from my previous blog, adding the ability to respond to Chatter Field Tracking events!
March 3, 2014 at 5:18 pm
What is the communications method of the Lego robot… is it doing HTTP pushes, TCP or UDP? I'm currently exploring the best ways to get non-HTTP protocols to talk to the likes of SF and Heroku. Cheers, Fred.
March 3, 2014 at 5:22 pm
It uses the HTTP-based Bayeux protocol, otherwise known as the Salesforce Streaming API. I have included sample Java code that runs on the robot and consumes this API. In general the JVM also supports low-level TCP/IP, and thus effectively other protocols, though keep in mind Salesforce itself only supports HTTP-based protocols.
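To make the Bayeux exchange concrete: a client first POSTs a handshake to the /meta/handshake channel, then subscribes to /topic/&lt;PushTopic name&gt;. The helpers below build those two JSON payloads; in practice a CometD client library handles this for you, and the topic name "SensorData" here is an assumption for illustration.

```java
public class BayeuxMessages {
    // Builds the JSON payload for the initial Bayeux handshake request.
    public static String handshake() {
        return "{\"channel\": \"/meta/handshake\", "
             + "\"version\": \"1.0\", "
             + "\"supportedConnectionTypes\": [\"long-polling\"]}";
    }

    // Builds the JSON payload subscribing a handshaken client to a Streaming
    // API topic; Salesforce exposes PushTopics under the /topic/ channel prefix.
    public static String subscribe(String clientId, String topic) {
        return "{\"channel\": \"/meta/subscribe\", "
             + "\"clientId\": \"" + clientId + "\", "
             + "\"subscription\": \"/topic/" + topic + "\"}";
    }

    public static void main(String[] args) {
        System.out.println(handshake());
        System.out.println(subscribe("client123", "SensorData"));
    }
}
```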