For years now we’ve lamented the fact that it takes an entire PC to run a node for System Platform. The bootstrap components themselves are pretty lightweight, so why can’t we get them to run on something smaller than a full-blown PC that demands a square foot of space and 120V power? Well, with the help of some of the most brilliant engineers I’ve known (a couple of guys in our office), we’ve figured out how to get IAS running on an Android device. To be fair, it will probably be a while before we have something we’d be confident releasing into the wild (i.e. production use). Even so, this is a great start on what we think might be a game changer for System Platform. Other vendors are already doing something similar (http://www.inductiveautomation.com/ and http://www.androidblip.com/android-apps/autobase-hmiscada-viewer-127912.html), but what we’re trying to do goes much further. We didn’t want to just run an HMI on your smart end device; we wanted the actual AOS (Application Object Server) to run on an Android device.

You might ask: what in the world would be an application for this? The most obvious one starts in the oil fields of the West. Most wellheads and pump stations have some sort of local PLC that reports back at some infrequent interval over cellular or long-haul radio. There might be a really basic HMI in the form of a PanelView or something similar at the unit to let operators interact. That’s a ton of equipment for some really basic functions. What if we could put a small battery-powered Android device (a smartphone or tablet, maybe) at the unit and run our object server locally instead of remotely?
[Figure: our “Big Picture” for the architecture]
Is an Android device really powerful enough? Well, a Samsung Epic 4G gives us the following specs: a 1 GHz processor, which is plenty of muscle, and currently my Epic shows 350 MB of available RAM. If we only want to run, say, 100 lightweight objects (simple AIs, DIs, etc.), that’s plenty of RAM. Read on below for how we’re using the numerous sensors on the unit.
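To show why 350 MB is comfortable headroom for 100 objects, here’s a back-of-envelope budget. The 50 MB bootstrap overhead and 1 MB-per-object footprint are our rough assumptions for illustration, not measured numbers:

```java
// Back-of-envelope RAM budget for running lightweight objects on a phone.
// The overhead and per-object figures are assumptions, not measurements.
public class RamBudget {
    static final int AVAILABLE_MB = 350;      // free RAM observed on the Epic 4G
    static final int BOOTSTRAP_MB = 50;       // assumed bootstrap/runtime overhead
    static final double MB_PER_OBJECT = 1.0;  // assumed footprint per simple AI/DI

    static int maxObjects() {
        return (int) ((AVAILABLE_MB - BOOTSTRAP_MB) / MB_PER_OBJECT);
    }

    public static void main(String[] args) {
        System.out.println("Max lightweight objects: " + maxObjects());
    }
}
```

Even under these conservative assumptions, the device has roughly 3x the capacity we need for a 100-object deployment.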
Some of the advantages of this system over traditional long-haul SCADA and RTU stations:
1) Vastly improved UI for the operator. The operator could connect wirelessly to the local object server from a laptop or tablet to interact with the local wellhead or pump station while working on the equipment. This sure beats going over cellular or a 900 MHz radio.
2) Dramatically reduced power consumption. We’ve run tests with our setup running active object servers charged by a 16×16 solar cell, and we never dropped below 75% battery on the unit. Obviously there are a lot of variables here, but running the units off solar and an outboard battery pack is feasible and real. We’ve done it.
3) Cheap redundant servers. You can run a pair of smart devices (currently they have to be an exact model match, but we’re working on that) talking Bluetooth between them for your RMC, and now you have redundant AOSs in the field. The max loading we’ve found is about 250 simple objects, yielding about 2,500 checkpointed items per second. Remember, Bluetooth 2.0 has a maximum throughput of 2.1 Mbit/s; that’s plenty to support a dedicated RMC.
4) Cellular-based communications. Sprint gives us unlimited data, so we’re going to use it! These devices communicate back to the central control room over cellular. If we don’t have 3G or 4G, a simple CDMA signal will do. Yes, a number of vendors do this now with their purpose-built RTUs, but those cellular radios can run $1K to $2K, and we haven’t even discussed the cost of service.
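For the redundancy claim in item 3 above, it’s worth a quick sanity check on whether Bluetooth 2.0 can actually carry 2,500 checkpointed items per second. The real unknown is payload size per item (which varies by object type), so this sketch just computes the per-item byte budget the link gives us:

```java
// Sanity check: can Bluetooth 2.0 (2.1 Mbit/s raw) carry 2,500 checkpointed
// items per second? We compute the per-item byte budget; actual item payload
// size and protocol overhead would eat into this.
public class BtBudget {
    static final double BT_BITS_PER_SEC = 2.1e6; // Bluetooth 2.0 + EDR, raw rate
    static final int ITEMS_PER_SEC = 2500;       // observed max checkpoint load

    // Bytes available per checkpointed item, before protocol overhead.
    static double bytesPerItem() {
        return (BT_BITS_PER_SEC / 8) / ITEMS_PER_SEC;
    }

    public static void main(String[] args) {
        System.out.printf("Budget: %.0f bytes per checkpointed item%n", bytesPerItem());
    }
}
```

Roughly 100 bytes per item is a workable envelope for simple AI/DI checkpoint data, though real-world Bluetooth throughput after overhead will be lower than the raw rate.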
Now for the really cool parts. For some of these we have really rudimentary working demos; others are potentially feasible but not yet developed.
5) Wellhead assemblies are sometimes skid-based and can move around depending on numerous geological and business conditions. With the GPS built into the device, we can report our exact location back to the central system, so there’s no more guessing whether PIT-101-15 is at Wellhead 77G or 85F. Add some simple mapping applications with off-the-shelf controls and software and we’re in business.
6) Accidents are a way of life in the oil fields. Explosions can and do occur. The reason you usually don’t hear about them is extraordinary safety systems and the extraordinary individuals who maintain them. Still, wouldn’t it be great to get immediate notification of some “event,” whether that’s an explosion, a pump coming apart, or someone accidentally hitting the unit with their truck? With the built-in accelerometer, these kinds of reporting events are now possible.
7) Did you say 32 GB of store-and-forward? It’s not uncommon for a smartphone to have 32 GB of storage via an external card. Yes, the OS takes up some space, and the bootstrap and objects take up some room, but that leaves plenty of space for store-and-forward if you lose communications for an extended interval.
8) Using the built-in camera for remote monitoring. It would take some twiddling with the physical configuration, but we would love to use the built-in camera to sense motion, then take shots every 5 seconds or so as a record that someone was at the station servicing it. If you want to get really crazy, you might even turn this into a really low-res IP camera, but that may be too much to ask.
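The GPS idea above boils down to matching the device’s fix against a list of known sites. Here’s a sketch of that matching using the standard haversine great-circle distance; the site names and coordinates are made up for illustration (on Android the fix itself would come from the platform’s location APIs):

```java
// Sketch: resolve the device's GPS fix to the nearest known wellhead so the
// central system can label a skid's tags with a site automatically.
// Coordinates and site names are hypothetical.
public class NearestSite {
    static final double EARTH_RADIUS_KM = 6371.0;

    // Great-circle distance between two lat/lon points (haversine formula).
    static double distanceKm(double lat1, double lon1, double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * EARTH_RADIUS_KM * Math.asin(Math.sqrt(a));
    }

    // Return the name of the site closest to the given fix.
    static String nearest(double lat, double lon, String[] names, double[][] sites) {
        int best = 0;
        for (int i = 1; i < sites.length; i++) {
            if (distanceKm(lat, lon, sites[i][0], sites[i][1])
              < distanceKm(lat, lon, sites[best][0], sites[best][1])) best = i;
        }
        return names[best];
    }

    public static void main(String[] args) {
        String[] names = { "Wellhead 77G", "Wellhead 85F" };
        double[][] sites = { { 40.10, -104.90 }, { 40.30, -104.70 } };
        // A fix a kilometer or so from the first site should resolve to 77G.
        System.out.println(nearest(40.11, -104.91, names, sites));
    }
}
```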
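The accelerometer event detection could be as simple as flagging any sample whose magnitude deviates sharply from 1 g, which catches both a hard jolt and free fall. The 4 m/s² threshold is an assumption you’d tune in the field, and the samples here are plain numbers standing in for what Android’s sensor framework would deliver:

```java
// Sketch: flag an "event" when the acceleration magnitude deviates sharply
// from 1 g. The threshold is an assumed, field-tunable value; real samples
// would come from the Android sensor framework rather than hard-coded values.
public class JoltDetector {
    static final double GRAVITY = 9.81;   // m/s^2, magnitude at rest
    static final double THRESHOLD = 4.0;  // assumed deviation threshold, m/s^2

    // True if this x/y/z sample looks like a jolt (or free fall).
    static boolean isEvent(double x, double y, double z) {
        double magnitude = Math.sqrt(x * x + y * y + z * z);
        return Math.abs(magnitude - GRAVITY) > THRESHOLD;
    }

    public static void main(String[] args) {
        System.out.println(isEvent(0.1, 0.2, 9.8));   // device at rest: false
        System.out.println(isEvent(12.0, 9.0, 15.0)); // something hit the unit: true
    }
}
```

A production version would want debouncing and a short sample window so a single noisy reading doesn’t page the control room at 3 a.m.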
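And to put the 32 GB store-and-forward claim in perspective, here’s a rough sizing calculation. The tag count, sample rate, and bytes-per-sample are assumptions chosen for illustration, not measurements from our setup:

```java
// Rough store-and-forward sizing: how many days could the card buffer data
// if comms go down? All inputs are illustrative assumptions.
public class BufferSizing {
    static long daysOfBuffer(long usableBytes, int tags, int samplesPerSec, int bytesPerSample) {
        long bytesPerDay = (long) tags * samplesPerSec * bytesPerSample * 86_400L;
        return usableBytes / bytesPerDay;
    }

    public static void main(String[] args) {
        long usable = 30L * 1_000_000_000L; // assume ~30 GB left after OS + bootstrap
        // 100 tags, 1 sample/sec each, ~50 bytes per timestamped sample
        System.out.println(daysOfBuffer(usable, 100, 1, 50) + " days of buffer");
    }
}
```

Even at one sample per second across 100 tags, the card rides out a couple of months of lost communications, which is far more than any realistic outage.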
But what about PLC connectivity?
Well, that’s the easy part. Use the device’s Wi-Fi connection to a WAP that’s connected to a physical Ethernet switch, which in turn connects to your PLC. We haven’t quite figured out serial connectivity yet, if that’s your only option. We can, however, talk Modbus TCP and EtherNet/IP, which was pretty low-hanging fruit.
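Part of why Modbus TCP was low-hanging fruit is how simple the wire format is: a 7-byte MBAP header followed by a function code and its data, sent over a plain TCP socket. Here’s a standalone sketch of building the standard read-holding-registers request (this shows the frame layout only, not our actual driver code):

```java
// Sketch: build a standard Modbus TCP "read holding registers" request.
// Layout per the Modbus TCP spec: 7-byte MBAP header, then function 0x03
// with a 2-byte start address and 2-byte register count.
public class ModbusFrame {
    static byte[] readHoldingRegisters(int txId, int unitId, int startAddr, int quantity) {
        return new byte[] {
            (byte) (txId >> 8), (byte) txId,   // transaction identifier
            0, 0,                              // protocol identifier (0 = Modbus)
            0, 6,                              // remaining byte count (unit + PDU)
            (byte) unitId,                     // unit (slave) identifier
            0x03,                              // function: read holding registers
            (byte) (startAddr >> 8), (byte) startAddr, // starting register address
            (byte) (quantity >> 8), (byte) quantity    // number of registers
        };
    }

    public static void main(String[] args) {
        byte[] frame = readHoldingRegisters(1, 1, 0, 10);
        System.out.println(frame.length + " bytes, function 0x"
            + Integer.toHexString(frame[7]));
    }
}
```

Writing those 12 bytes to port 502 on the PLC and parsing the response is most of a minimal Modbus TCP master, which is exactly why it made a good first protocol target.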
What if I don’t have a PLC?
We’re not there yet, but we’re looking at interfacing with Arduino-based devices (http://www.arduino.cc/). No point in reinventing the wheel here. We are most definitely not hardware geeks, so anything we do needs to interface with easily obtainable and supportable hardware.
We’ve developed for Froyo (2.2) because of the extensive additional functionality that release affords us. We’ve taken a look at 2.3.3 but haven’t done anything with it yet.
So where do we go from here? Well, first and foremost we probably need to work with the powers that be at Invensys/Wonderware to figure out if they’d be willing to support continuing this effort. We wanted to go off on our own to figure out if this was feasible first, before engaging them for resources or support. If Invensys doesn’t support us, that pretty much shuts it down, as I would never expect anyone to run production systems on an unsupported configuration. At that point it would simply be a side project for experimentation.
For the long-term vision, we need to finish off some of the features mentioned above. Once that’s complete, we need to work on the HMI side of things; we’ve only focused on the AOS side so far. We suspect running InTouch on your tablet may be a serious challenge, so we’re inclined to write our own software using some derivative of MXAccess. If you didn’t know, MXAccess is the layer you use to actually communicate with an ArchestrA system. Our grand vision is not to replace the big HP and Dell servers back in the central control room; that doesn’t make sense on a number of levels. What does make sense, however, is pushing the compute power for local monitoring, alarming, and data collection out to a $200 low-power device in the field that can be easily serviced and replaced by a technician.
What do you think? If you’ve got some under-the-hood IAS experience (i.e. AOT, MXAccess, etc.) and you’re adept at Android programming, drop us a line. We’re looking for people to possibly join the effort and/or serve as sounding boards for new ideas. We haven’t totally decided whether to go the open-source route. Our guess at this point is that there could be legal issues with Invensys/Wonderware, so it doesn’t make sense to go there. As I mentioned above, without long-term support from Invensys we really don’t have a project.
Looking forward to everyone’s ideas and feedback. We hope to roll this out by April Fool’s Day next year.