Saturday, January 28, 2012

GA144 as a low level, low energy Erlang

I've been reading some of the newsgroup threads regarding the GA144, and most of the critiques come from a low level perspective (GA144 vs FPGAs vs ARMs, etc.). One can argue that the processor can't compete when you consider the anemic amount of memory each node has and the limited peripheral support. But let us put aside that argument for a moment (we'll get back to it later).

Here I primarily want to discuss the GA144 from a software (problem solving) architecture perspective. This is where my primary interests reside. I'd like the GA144 to have formal and flexible SPI support. I'd like to see more peripheral capability built in. I'd like to see 3.3-5VDC support on I/O pins so level shifter chips aren't needed. But I think the potential strong point of the GA144 is in what you can do with the cores from a software (problem solving) architectural design perspective.

Notice I keep mentioning software and problem solving together?  I want to make clear that I am not talking about software architecture in terms of library or framework building.  I'm talking about the architecture of the solution space.  I'm talking about the software model of the solution space.
Let's look at an analogy.

If I were to build a large telecommunication switch (handling thousands of simultaneous calls) and I implemented the software in Erlang or C++ (assuming that both would allow me to reach spec -- maybe 99.9999% uptime, no packet loss, etc.), then at the end of the day you wouldn't be able to tell the system performance apart.

However, one of the (advertised) benefits of Erlang is that it allows you to do massive concurrency. This is not a performance improvement, but (perhaps) a closer model to how you want to implement a telco switch.  Lots of calls are happening at the same time. This makes the software easier to reason about and (arguably) safer -- your implementation stays closer to the solution space model.

Do you see what I am getting at?
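To make the analogy concrete, here is the "one lightweight process per call" shape that Erlang encourages, mimicked with Python threads. This is illustrative only: in Erlang these would be cheap processes exchanging messages, and `handle_call`/`run_switch` are made-up names for the sketch.

```python
# Illustrative sketch: each call is its own sequential program with its
# own inbox -- no shared state machine. Python threads stand in for
# Erlang's lightweight processes; names here are invented for the example.
import threading, queue

def handle_call(call_id, inbox, results):
    # Each call handler reads its own events until a None sentinel arrives.
    for event in iter(inbox.get, None):
        results.put((call_id, event))

def run_switch(n_calls):
    results = queue.Queue()
    inboxes, threads = [], []
    for call_id in range(n_calls):
        q = queue.Queue()
        t = threading.Thread(target=handle_call, args=(call_id, q, results))
        t.start()
        inboxes.append(q)
        threads.append(t)
    return inboxes, threads, results
```

Each handler is written as straight-line code about one call; the concurrency lives in how many of them run at once, which is exactly the "implementation stays close to the solution space" point.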

Now, let's look at the problem I've been talking about here on the blog (previously described in vague terms): I want to integrate dozens of wireless sensors with a sensor base station. The base station can run off of outlet power but must be able to run off a battery for days (in case of a power outage). It is designed to be small and discreet (no big panel mounted to the wall with UPS backup). It needs to run 24/7 and be very fault tolerant.

The sensors are small, battery-efficient and relatively "dumb". Each samples data until it reaches a prescribed threshold (perhaps performing light hysteresis/averaging before spending the power to send data), and it is up to the sensor base station to keep track of what is going on. The base station collects data, analyzes it, tracks it and may send alerts via SMS, or perhaps just report it in a daily SMS summary.

Let's consider one typical sensor in my system: A wireless stove range monitor. This sensor, perhaps mounted to a range hood, would monitor the heat coming from the burners. This sensor will be used to (ultimately) let a remote individual know (via SMS) that a stove burner has been left on an unusually long time.  Perhaps grandma was cooking and forgot to turn the burner off.

This stove range sensor probably shouldn't be too smart. In other words, it is not up to the sensor to determine whether the stove has been on "too long". It reports a temperature reading once it recognizes an elevated temperature sustained over a 5 minute interval (grandma is cooking). It then continues to report the temperature every 10 minutes until it drops below a prescribed threshold. This is not a "smart" sensor. But it is not too dumb either (it only uses RF transmit power when it has determined a significant temperature event -- rather than broadcasting arbitrary temperature samples all day long).
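The sensor-side behavior can be sketched roughly as follows. This is a minimal sketch, not real firmware: the thresholds, intervals, and the `read_temp`/`transmit`/`sleep` callables are all illustrative assumptions.

```python
# Hypothetical sketch of the stove-range sensor's reporting loop.
# ELEVATED_C, the intervals, and the injected callables are assumptions.
ELEVATED_C = 40.0      # assumed "elevated" temperature threshold
CONFIRM_S  = 5 * 60    # confirm elevation over a 5 minute interval
REPORT_S   = 10 * 60   # then report every 10 minutes

def sensor_loop(read_temp, transmit, sleep):
    while True:
        # Stay quiet until a sample crosses the elevated threshold.
        if read_temp() < ELEVATED_C:
            sleep(30)
            continue
        # Confirm the elevation over 5 minutes before spending RF power.
        sleep(CONFIRM_S)
        if read_temp() < ELEVATED_C:
            continue               # transient; back to quiet sampling
        # Report every 10 minutes until it drops below the threshold.
        while True:
            t = read_temp()
            if t < ELEVATED_C:
                break
            transmit(t)
            sleep(REPORT_S)
```

Note that all the "not too dumb" behavior is local: the radio is only touched inside the inner reporting loop.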

The sensor base station will need a software model that takes this data, tracks it and determines whether there is an issue. The stove range being on for a few hours may not mean there is a problem. A slow elevated temperature rise followed by stasis may suggest that a pot is just simmering. However, if the stove is exhibiting this elevated temperature past 11pm -- certainly grandma isn't stewing a chicken at that hour! You don't want to get too fancy, but there can be lots of data points to consider when using this stove range monitor.

Here is my solution model (greatly simplified) for this sensor monitor:


  1. Receive a temperature sample
  2. Is it at stasis? If so, keep track of how long
  3. Is it still rising? Compare it with "fire" levels -- there may be no pot on the burner or it is scorching
  4. Is the temperature still rising? Fast?  Send an SMS alert
  5. Is it on late in the evening?  Send an SMS alert
  6. Keep a running summary (timestamped) of what is going on. Log it.
  7. Every night at 11pm, generate a summary of when the range was used and for how long. Send the summary via SMS
Imagine this as a long running process. It is constantly running, considering elapsed time and calendar time in its calculations.
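Written as one long-running coroutine on the base station, the model above might look something like this. It is a simplified sketch under assumed conditions: the thresholds, the `(hour, temp)` sample format, and the `alert`/`log` callables are illustrative, and the nightly summary step is omitted.

```python
# Hypothetical base-station model for the stove sensor, written as one
# long-running coroutine rather than a state machine. All thresholds
# and the (hour, temp) sample format are assumptions for illustration.
ELEVATED_C = 40.0    # assumed "elevated" level
FIRE_C     = 250.0   # assumed "no pot / scorching" level
STASIS_C   = 2.0     # samples within this delta count as stasis
LATE_HOUR  = 23      # 11pm

def stove_monitor(alert, log):
    prev = None
    stasis_samples = 0
    while True:
        hour, temp = yield                  # 1. receive a temperature sample
        log((hour, temp))                   # 6. keep a timestamped record
        if prev is not None and abs(temp - prev) <= STASIS_C:
            stasis_samples += 1             # 2. at stasis: track how long
        else:
            stasis_samples = 0
        if prev is not None and temp > prev and temp >= FIRE_C:
            alert("fire-level temperature") # 3./4. still rising, into fire range
        if hour >= LATE_HOUR and temp >= ELEVATED_C:
            alert("stove elevated late at night")   # 5. late-evening check
        prev = temp
```

The point of the shape, not the thresholds, is what matters: the model reads top to bottom exactly as the numbered list does, and all its state (`prev`, `stasis_samples`) lives in one place across samples.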


Now, this is just one of many types of sensor that the base station must deal with. Each will have its own behavior (algorithm).

I can certainly handle a bunch of sensors with a fast processor (ARM?). But my software model is different for each sensor. Wouldn't it be nice for each sensor model to be independent? I could do this with Linux and multiple processes. But, really, the above model isn't that sophisticated. It could (perhaps) easily fit in a couple of GA144 nodes (the sensor handler, logger, calendar and SMS notifier would exist elsewhere). And it would be nice to code this model as described (without considering state machines or context switches, etc.).
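One way to keep each model independent without an OS is one coroutine per sensor, with incoming samples routed by sensor id. Again a sketch under assumptions: the sensor ids, the handler body, and the dispatch scheme are all invented for illustration.

```python
# Hypothetical dispatcher: one independent coroutine per sensor, no OS,
# no shared state machine. Ids and the handler body are illustrative.
def make_handler(name, log):
    def handler():
        count = 0                 # each model keeps its own local state
        while True:
            sample = yield
            count += 1
            log.append((name, count, sample))
    h = handler()
    next(h)                       # prime the coroutine
    return h

def dispatch(handlers, sensor_id, sample):
    handlers[sensor_id].send(sample)   # route to that sensor's model
```

On a GA144 the analogue would be a model per node (or pair of nodes) with samples arriving over inter-node ports; the dispatcher above is just the single-processor stand-in for that wiring.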

So, back to the argument at the top of this post... I don't care if the GA144 isn't a competitive "MCU". My software models are simple but concurrent.  My design could easily use other MCUs to handle the SMS sending or RF radio receives. What is important is the software model. The less I have to break that up into state machines or deal with an OS, the better.

This is my interest in the GA144: A low level, low energy means of keeping my concurrent software model intact.  I don't need the GA144 to be an SMS modem handler.  I don't need it to perform the duties of a calendar. I need it to help me implement my software model as designed.

3 comments:

Anonymous said...

The GA144 should work at 3 volts, I heard C.M. say in one of the Forth Day videos.

Todd Coram said...

Yeah, I think I remember hearing that too. The spec sheet has max VDD at 2.3V and max voltage on any I/O pin as VDD+0.5V. So, 3V sounds about right.

But, I would probably keep VDD and I/O within recommended operating conditions...

These days it's not too hard to find peripheral devices (MEMS, SRAM, flash, etc.) that work at <=2V. It's just that I have a box full of 3.3V versions of said devices ;-)

esaid said...

Hello,

Are you really sure? In the recommended operating conditions: Vdd supply voltage = 1.8 V, and 2.0 V max.
You can use a 1.8V/3.3V translator (MAX3023).

http://esaid.free.fr/tutoriel_arrayforth/Ga144_pcb/composants.html