Sunday, July 19, 2015

Elder Home Care in an RF noisy house

The BT tags I mentioned in my previous post are acting erratically.  At certain times the tracker tokens lose contact with the server (for minutes), even when just a couple of feet away.  BT LE advertises on channels (37, 38 and 39) deliberately placed to avoid the most-used IEEE 802.11 Wi-Fi channels (1, 6 and 11), so I am not sure what is drowning out the broadcast. I don't have a 2.4 GHz wireless (house) phone, so that isn't the culprit.

I don't have a spectrum analyzer, so I am limited in my investigative resources...

I'd hate to have to drop down to 433 MHz sensors.

The good news is that this can possibly be solved in software.  The problem is the "false positives": since the monitor notifies me when the sensor goes out of range, these RF anomalies trigger false alerts.  One approach is to have a "control" tag permanently installed in the room with the detector. If both the control tag and the tracking tag go "out of range", then it must be an RF anomaly and I shouldn't be notified.
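In rough C, that filter might look like this (a sketch only; the five-second threshold and the little test in main are placeholders to be tuned to the real beacon interval):

#include <stdbool.h>
#include <time.h>

#define OUT_OF_RANGE_SECS 5       /* placeholder; tune to the beacon interval */

static time_t last_seen_control;  /* updated on each control-tag advertisement */
static time_t last_seen_tracker;  /* updated on each tracker-tag advertisement */

/* Alert only when the tracker is silent but the control tag is still
   heard -- i.e. a real departure, not an RF anomaly muting both tags. */
static bool should_alert(time_t now)
{
    bool control_gone = (now - last_seen_control) > OUT_OF_RANGE_SECS;
    bool tracker_gone = (now - last_seen_tracker) > OUT_OF_RANGE_SECS;

    if (tracker_gone && control_gone)
        return false;   /* both silent: assume RF anomaly, suppress alert */
    return tracker_gone;
}

int main(void)
{
    time_t now = time(NULL);
    last_seen_control = now;        /* control tag heard just now */
    last_seen_tracker = now - 10;   /* tracker silent for 10 seconds */
    return should_alert(now) ? 0 : 1;   /* exits 0: only the tracker vanished */
}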

Friday, July 03, 2015

Phase II of Elder Home Care (formerly Elder Home Alone) Monitoring System

It's been a while since I've posted about my home monitoring system.

Short recap:

A couple of years ago, my Mother-in-law lived alone in a condo and was prone to accidentally leaving her stove on, among other forgetful lapses. I started working on a home care monitor for the "Independent Elderly".  It would include basic occupancy trackers, water overflow detectors and stove/kitchen monitoring (to make sure the stove isn't left unattended and to keep an eye on her eating habits).

Well, fast forward: my dementia-diagnosed Mother-in-law moved in just over a year ago.  So, the problems are a bit different.  She wanders. She gets up in the middle of the night to go to the bathroom and can't find her way back to her bedroom. She may go upstairs in the dark and stumble, or venture outside.  Sometimes, during the day, she may decide to walk home... to her childhood home, several states away.  She is old, but fast.

The current system uses "cheap" X10 RF motion detectors and door monitors. I can review past activities (e.g. when did she get up this morning? Did she frequent the bathroom last night?) or I can be alerted to the house door being opened (Is she just checking the weather? Is she going to sit on the porch? Is she going to make a run for it?).
The alerting system consists of some software I wrote (it runs under Linux on a small Intel NUC PC) that currently sends XMPP (Jabber/chat) messages to a cloud server (on Digital Ocean) running the Prosody XMPP server. My wife and I are connected to this server using XMPP chat software (Xabber) on our Android phones.  We can query the monitoring system from our phones or be chimed when the door opens.
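Sending a message through a pipeline like this is less code than you might think. As an illustration only (this is not my actual monitor code; it assumes libstrophe 0.9 or later, and the JIDs are placeholders), a one-shot XMPP notifier in C looks roughly like this:

#include <strophe.h>

/* Once connected, push a single alert message and disconnect. */
static void conn_handler(xmpp_conn_t *conn, xmpp_conn_event_t status,
                         int error, xmpp_stream_error_t *stream_error,
                         void *userdata)
{
    xmpp_ctx_t *ctx = (xmpp_ctx_t *)userdata;

    if (status == XMPP_CONN_CONNECT) {
        xmpp_stanza_t *msg = xmpp_message_new(ctx, "chat",
                                              "me@example.com", NULL);
        xmpp_message_set_body(msg, "Front door opened");
        xmpp_send(conn, msg);
        xmpp_stanza_release(msg);
        xmpp_disconnect(conn);
    } else {
        xmpp_stop(ctx);   /* disconnected: stop the event loop */
    }
}

int main(void)
{
    xmpp_initialize();
    xmpp_ctx_t *ctx = xmpp_ctx_new(NULL, NULL);
    xmpp_conn_t *conn = xmpp_conn_new(ctx);

    xmpp_conn_set_jid(conn, "monitor@example.com");   /* placeholder JID */
    xmpp_conn_set_pass(conn, "secret");               /* placeholder */

    xmpp_connect_client(conn, NULL, 0, conn_handler, ctx);
    xmpp_run(ctx);   /* event loop; returns after xmpp_stop() */

    xmpp_conn_release(conn);
    xmpp_ctx_free(ctx);
    xmpp_shutdown();
    return 0;
}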

It has run well for a year now. :)

But, now that my Mother-in-law is prone to taking long unannounced walks, this system is not enough.
The phone chimes when the door is opened. Is it one of the kids? Is it her checking the weather? Was the door already open from a previous check?  Is she *really* still sitting on the porch 10 minutes from now?

So, after an early-morning phone call from the police (she managed to get several blocks from the house before sunrise), we decided we needed to invest in a tracking solution.

Most tracking solutions involve GPS, which means battery drain and overkill: if we know that she has left, we can pretty much find her in a matter of minutes -- if we know she has left.

Not a lot of solutions out there.  I found one on an Alzheimer's website. It *only* requires recharging every 48 hours. Ugh.  What do we do while it charges? Do we need to buy two?

So, I decided to look into BT LE (Bluetooth Low Energy). I had built several BT LE tags years ago and was interested to see what the state of the art was today.  Apparently, Fitbit uses BT LE beaconing. That is, every second or so it broadcasts its address so your phone can handily connect to it on demand.
BT LE has very limited range, but that's okay.
Also, Apple has been pushing "iBeacon" for their own (non-elderly) tracking purposes. They have a spec and a number of hardware vendors. I found this tag on Amazon for $14. Although meant to be used with Apple devices, it does a simple BT LE beacon/broadcast that I can readily track.  It is the perfect size to be "hidden" in her purse (in a small crevice/pocket) and the battery should last 6 months to 1 year (I'll assume 3 months and schedule an early battery replacement).

Armed with the BT 4.0 PCI card in my NUC, I attacked this challenge a week ago. Now I have a rudimentary system that will let me know when my Mother-in-law has ventured beyond the front porch. My android phone (running Xabber) is notified whenever the tag goes out of range.
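The detection itself is nothing fancy: remember when the tag was last heard, and declare it gone after a quiet spell. Here is a rough C sketch of the idea, reading advertisements from a BlueZ scan (hcitool lescan prints one line per advertisement when run with --duplicates; the MAC address and the 10-second timeout below are placeholders):

#include <stdio.h>
#include <string.h>
#include <time.h>

#define TAG_ADDR   "AA:BB:CC:DD:EE:FF"  /* placeholder for the tag's MAC */
#define TIMEOUT_S  10                   /* assumed out-of-range threshold */

int main(void)
{
    /* --duplicates makes hcitool report every advertisement, not just
       the first; needs root.  A real version would use a timer rather
       than relying on other devices' advertisements to drive the loop. */
    FILE *scan = popen("hcitool lescan --duplicates", "r");
    if (!scan) return 1;

    char line[128];
    time_t last_seen = time(NULL);
    int in_range = 1;

    while (fgets(line, sizeof line, scan)) {
        time_t now = time(NULL);
        if (strncmp(line, TAG_ADDR, strlen(TAG_ADDR)) == 0)
            last_seen = now;   /* our tag advertised; refresh the clock */

        if (in_range && now - last_seen > TIMEOUT_S) {
            in_range = 0;
            puts("tag out of range -- notify the XMPP server here");
        } else if (!in_range && now - last_seen <= TIMEOUT_S) {
            in_range = 1;
            puts("tag back in range");
        }
    }
    pclose(scan);
    return 0;
}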

There is a lot of work to do to perfect this, but I am happy with the preliminary results. I will be moving the notifier beyond the phone (maybe home media -- DLNA/TV/etc. -- or just speakers on the NUC) and making it work locally in case we lose Internet connectivity.

I will be releasing the software into open source within the next few weeks.

Saturday, February 28, 2015

Virtualization: Your PC is a Universe

PCs (and, honestly, I am really talking about laptops and the newer PC-replacement tablets) are so powerful that they no longer have to be thought of as singular "client" resources.  That is, with sufficient memory (let's start at 8GB RAM) and enough SSD-speed storage (>128GB), folks like me typically run many virtual computers inside our computers.

If I need to run Windows, I just fire up VirtualBox. If I need to do server development, I can pick tools like Vagrant or Docker, or go directly to LXC.  I can do Android development. I can do Windows development. I can try out Haiku or some new BSD.  I can do all of this without changing the underlying OS.  The underlying OS, in fact, is starting to become irrelevant.  Give me a Windows box and I can do full Linux stuff on it without replacing the OS: just start up a Linux VM.

The thing is, at any given moment, my laptop is a Universe of virtual computers. I can network these computers together; I can simulate resources; I can test them, probe them and manipulate them.

This is new. Yes, yes -- the tech is pretty old (e.g. virtual machines), but the realization of this tech on a portable computer is new.

If you want to see where we may be heading, check out something like Rump kernels or OSv. We are starting to leave the OS behind and look at computing in terms of "microservices" -- collaborating virtual computers that solve a particular problem.

With the resources we now have on hand, why are we talking about systemd and D-Bus and other single-computer entities?

The next time you approach a design, try thinking about how your laptop can be *everything*. And then let that influence your design.


I will be Cyborg.

I haven't had a lot of time to post to this blog and I am wondering if this is the end of the line for it.
Well, we will see.  But for now...

I am approaching 50 (in 1.5 years) and my eyes are shot (I'm very near-sighted).  The screen is blurry (I have transitional bifocals, so my "clear" view is pretty marginal) and it isn't going to get any better.

So, if my eyesight starts to quickly wane (my eye doctor isn't really concerned... yet), what do I do?
While I can use magnifying glasses for my circuit work (which is starting to become a thing of the past for me anyway), what about my programming and computer science stuff (i.e. my screen work)?

Duh.
I'm a programmer and technologist.  I can hack something together to supplement my poor vision.  Even if I were to go blind (that isn't currently in the cards, but who knows), there are ways to continue to do "Computer Science".
There is technology already out there, and I can always invent what I need to aid me if my eyesight worsens.

Sometimes I forget that, with software and some gadgetry, we invent whatever we need. We are indeed sorcerers and alchemists :)

Wednesday, October 01, 2014

Forth and the Minimalist

Not all Forth programmers are minimalists, but chances are, if you use arrayForth, colorForth or something inspired by it (like MyForth), then you may be a minimalist.

Being a minimalist, you seek the simplest, most concise use of resources.  You tend to avoid rambling code and the idea of calling a (3rd party) library function makes you uncomfortable.

One of the reasons I like using MyForth (and the 8051) is that it forces you to think about how to simplify the problem you are trying to solve.  This is a good exercise, but it also offers advantages when you are working on low power (or very tiny) embedded systems.  No matter how beefy MCUs get, there is always a need for something "smaller" and lower power (e.g. a tiny, low-transistor-count 8-bit MCU has a better chance of running off of "air" than a fast, feature-rich 32-bit MCU).

The 8051 has rather poor math capabilities. Everything is geared toward 8 bits. If you use a C compiler, this is hidden from you.  The compiler will generate a ton of code to make sure that your 16 or 32 bit math works. This causes code bloat and slows you down -- thereby causing more power consumption.  Programming in a minimalist Forth makes you think about whether or not you actually need the math.  Is there a cheat?  You look at old school methods and you may find them. I grew up on the 6502 (Commodore VIC-20/C64, Atari, Apple, etc.).  You did all you could to avoid doing "real" math (especially if it broke the 8-bit barrier).  You had limited resources and you made the most of what you had.
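A contrived illustration of the kind of cheat I mean (mine, not MyForth code): smoothing a sensor reading with a 1/4 weight using nothing but 8-bit adds and shifts -- no multiply, no divide, no 16-bit intermediate:

#include <stdint.h>
#include <stdio.h>

/* Exponential smoothing, weight 1/4: avg = avg*3/4 + sample/4,
   done with shifts.  avg - (avg>>2) never exceeds 191, and adding
   sample>>2 (at most 63) keeps the total within 8 bits, so no wide
   math is ever needed.  The truncation error is the price. */
static uint8_t smooth(uint8_t avg, uint8_t sample)
{
    avg -= avg >> 2;      /* avg * 3/4 */
    avg += sample >> 2;   /* + sample / 4 */
    return avg;
}

int main(void)
{
    uint8_t avg = 0;
    uint8_t readings[] = { 200, 210, 190, 205, 195 };
    for (int i = 0; i < 5; i++) {
        avg = smooth(avg, readings[i]);
        printf("%u\n", avg);   /* converges toward the input values */
    }
    return 0;
}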

But, is this just an "exercise"?  I don't think so. There are practical benefits that go beyond just old school cleverness. You (can) have more compact code that performs better. The less code you produce, the fewer chances for bugs. The less code you produce, the more reliable your product.

Gone are the days (for most of us) of penny-counting component costs.  I'd rather have a bunch of simple components (e.g. logic gates, simple MCUs, peripheral processors, etc.) that do work for me than a big processor with a complex library.  Chip components tend to be "coded" to a higher level of quality assurance than pure libraries.  I trust a USB-to-serial chip more than some USB-to-serial library for my MCU. If the library fails, they say "update". If a chip fails... they risk going out of business -- who trusts production runs to faulty chips?

In the end, the minimalist is fighting the status quo.  It is a futile fight, but we can't seem to give it up. It is in our nature.

Wednesday, July 30, 2014

AFT - an elegant weapon for a more civilized age...

This is a sort of nostalgic post and, in some sense, it is also a "toot your own horn" one as well.  I am writing this mainly for myself.  I am trying to remind myself what I've liked most about programming.

Years ago, actually almost 2 decades ago -- around 1996 -- I wrote a text markup system called AFT.  AFT stood for Almost Free Text. It was inspired by Ward Cunningham's original wiki markup but went further.

I had a problem. I didn't like using WYSIWYG word processors and the world was moving towards HTML.  I liked Ward's markup. He was using it on this new "Wiki Wiki" thing. I answered an invite sent to the Patterns List and became one of the first wiki users in 1995.  (But that is a different story for a different time.)

AFT was my attempt at a writing system to produce publishable (web and print) documentation.  Since then, it has developed a (now waning) user base.  You can see how many web pages/documents use it without changing the default "watermark" with this query.

As of Ubuntu 14.04, you can get AFT by issuing an "apt-get install aft" against the standard repository.
I think it is still part of FreeBSD "world".  I believe it still runs under Windows too.

Various "modern" mark up languages (written in "modern" programming languages) have since surpassed AFT in adoption, but for me, it still is a more elegant and pleasurable experience.

Over the years (although not very recently), I've updated, fixed and generally maintained the code.  There are no known crashes (it literally takes whatever you throw at it and tries to produce good looking output -- although that may fail) and it doesn't require me to look at the HTML (or PDF) manual (written in AFT!) unless I want to do something complex.

AFT is implemented in Perl. Originally it was written in awk, but I taught myself Perl so as to re-implement it in the late 1990s.

It is, for me, interesting Perl code.  I have modernized it over the years, but it still doesn't depend on CPAN (a good thing if you just want to have non-programmers "download it" and run without dependencies -- yes I know there are packaging solutions to that problem today...).

AFT has "back end" support for HTML, LaTeX, lout and some rudimentary RTF.  These days I think mostly HTML and LaTeX is used.

You can customize the HTML or LaTeX to support different styles by modifying or creating a configuration file.  This configuration file is "compiled" into a Perl module and becomes part of the run time script.

AFT has been a pleasure to hack on now and then. It still runs flawlessly on new Perl releases and has proven not too fragile to add experimental features to. I've accepted some small code fixes and fragments over the years, but generally it is mostly my code.

As I wrote (and rewrote) AFT, I thought frequently of Don Knuth's coding approach (as excellently documented in my all-time favorite book on programming: Literate Programming).  I certainly can't match the master, but the slow, thoughtful development he espouses was inspiring.

Over the years I've gotten a few "thank you" notes for AFT (but nothing in the past few years) and that makes it my (to date) proudest contribution to Free Software.

Maybe I'll dust off the code and introduce some more experimental features...



Sunday, July 27, 2014

Concurrency and multi-core MCUs (GA144) in my house monitor

My house monitoring system monitors lots of sensors. This suggests a multi-core approach, doesn't it?

The problem with (the current concept of) multi-cores is that they are typically ruled by a monolithic operating system. Despite what goes on in each core, there is one single point of failure: the operating system. Plus, without core affinity, our code may be moved around.  On an 8-core Intel processor, you are NOT guaranteed to be running a task per core (likely, for execution efficiency, your tasks are load balanced among the cores).  Each core is beefy too. Dedicating a whole core to a single sensor sounds very wasteful.

This, I believe, is flawed thinking in our current concurrency model (at least as far as embedded systems go).

I want multiple "nodes" for computation. I want each node to be isolated and self-reliant.  (I'm talking from an embedded perspective here -- I understand the impracticality of doing this on general purpose computers.)

If I have a dozen sensors, I want to connect them directly to a dozen nodes that independently manage them.  This isn't just about data collection. The nodes should be able to perform some high level functions.  I essentially want one monitoring app per node.

For example: I should be able to instruct a PIR motion-sensor node to watch for a particular motion pattern before it notifies another node to dispatch an alert. There may be some averaging or more sophisticated logic to detect the interesting pattern.
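Here is my sketch of what such a node-level rule could look like (the window and threshold values are made up): report motion only when several PIR triggers cluster inside a short window, so one spurious blip stays quiet.

#include <stdint.h>
#include <stdio.h>

#define WINDOW_S    30   /* assumed observation window, in seconds */
#define MIN_EVENTS   3   /* assumed trigger count for a "real" pattern */

/* Called once per second with the PIR's current state.
   Returns 1 when enough motion events cluster inside the window. */
static int motion_pattern(int pir_triggered)
{
    static uint8_t history[WINDOW_S];  /* 1 if motion in that second */
    static uint8_t idx;
    int count = 0;

    history[idx] = pir_triggered ? 1 : 0;
    idx = (idx + 1) % WINDOW_S;

    for (int i = 0; i < WINDOW_S; i++)
        count += history[i];

    return count >= MIN_EVENTS;
}

int main(void)
{
    /* simulate: seven quiet seconds, then a burst of motion */
    for (int t = 0; t < 10; t++)
        printf("%d\n", motion_pattern(t >= 7));   /* turns 1 at t = 9 */
    return 0;
}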

Normally, you would have a bunch of physically separate sensor nodes (MCU + RF), but RF is not very reliable. Plus, to change the behavior of the sensor nodes you would have to collect and reprogram each MCU.

So, consider for this "use case" that the sensors are either wired or that the sensors are RF modules with very little intelligence built in (i.e. you never touch the RF sensor's firmware): RF is just a "wire".  Now we can focus on the nodes.

The GreenArrays GA144 and the Parallax Propeller are the first widely available MCUs (that I know of) to encourage this "one app per node" approach.  But the Propeller doesn't have enough cores (8), and the GA144 (with 144 cores) doesn't have enough I/O (for the sake of this discussion, since the GA144 has so many cores, I am willing to consider a node to be a "group of cores").

Now, let's consider a concession...
With the GA144, I could fall back to the RF approach.  I can emulate more I/O by feeding the nodes from edge nodes that actually collect the data (via RF).  I can support dozens of sensors that way.

But, what does that buy me over a beefy single core Cortex-M processing dozens of sensors?

With the Cortex-M, I am going to have to deal with interrupts and either state machines or coroutines (although polling could replace the interrupts, the need for a state machine or coroutines remains the same).  This is essentially "tasking".

This can become heinous. So,  I start to think about using an OS (for task management).  Now I've introduced more software (and more problems).  But can I run dozens of "threads" on the Cortex-M? What's my context switching overhead?  Do I have a programming language that lets me do green threads?  (Do I use an RTOS instead?)

All of this begins to smell of  anti-concurrency (or at least one step back from our march towards seamless concurrency oriented programming).

So, let's say I go back to the GA144. The sensor monitoring tasks are pretty lightweight and independent. When I code them I don't need to think about interrupts or state machines. Each monitor sits in a loop, waiting for sensor input or a "request status" message from another node.
In C pseudo-code:

while (1) {
  /* block until something arrives: a sensor reading or a query */
  switch (wait_for_msg()) {
    case SENSOR:   /* fresh data from the attached sensor */
       if (compute_status(get_sensor_data()) == ALERT_NOW)
          send_status(alert_monitor);  /* push an alert to the alert node */
       break;
    case REQUEST:  /* another node asked for our status */
       send_status(requester);
       break;
  }
}

This loop is all there is.  The "compute_status" may talk to other sensor nodes or do averaging, etc.
What about timer events? What if the sensor needs a concept of time or time intervals?  That can be done outside of the node by having a periodic REQUEST trigger.
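In the same pseudo-code spirit (send_msg and the one-second period are my inventions here), the external trigger could be a dedicated "clock" node:

/* A clock node: its whole job is poking other nodes periodically,
   so no sensor node needs its own notion of time. */
while (1) {
    sleep(1);                        /* arbitrary 1-second tick */
    send_msg(sensor_node, REQUEST);  /* drives time-based logic remotely */
}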

(This, by the way, is very similar to what an Erlang app would strive for; see my previous post, GA144 as a low level, low energy Erlang.)

Now, the above code would need to be in Forth to work on the GA144 (ideally arrayForth or PolyForth), but you get the idea (hopefully ;-)


Tuesday, July 22, 2014

A Reboot (of sorts): The IoT has got me down. I think we've lost the plot.

The IoT (Internet of Things) has got me down.  I think we've lost the plot.

In most science fiction I've read (and seen), technology is ubiquitous and blends into the background.  The author of a science fiction book may go into excruciating detail explaining the technology, but that is par for the course.

In science fiction films the technology tends to be taken for granted.  Outside of plot devices, all the cool stuff is "just a part of life".

Re-watch Blade Runner, Minority Report, etc. Do the characters obsess (via smartphone or other personal device) over the temperature of their home while they are away?  Do they gleefully purchase Internet connected cameras and watch what their pets are up to?

It is 2014 and we buy IoT gadgets that demand our attention and time.  Nest and Dropcam: I am looking at you.

Beyond "Where is my Jet Pack?", I want "Set and Forget" technology.  The old antiquated "Security Monitoring" services (e.g. ADT) got it partially right. You lived with it. You didn't focus on it and you weren't visiting web pages to obsess over your house's security state.  But that model is dying (or should be). It is expensive, proprietary and requires a human in the loop ($$$).

What do we replace it with?

I think that the "Internet" in the IoT is secondary.  First, I want a NoT (Network of Things) that is focused on making my house sensors work together.  Sure, if I have a flood, fire or break-in, I want to be notified wherever I am (both in the house and out).  The Internet part of IoT comes into play when I am away from my home.

My current Panoptes prototype (based on X10) monitors my house for motion and door events. My wife or I review events (via our smartphone) in the morning when we wake up. It gives me valuable information, such as "when did my teenage son get to bed?" and "was mother-in-law having a sleepless night?" and "is mother-in-law up right now?".  Reviewing this info doesn't require the Internet but does require a local network connection.

I also register for "door events" before I go to bed. This immediately alerts me (via smartphone) if  "mother-in-law is confused and has wandered outside".

When I leave the house, I can monitor (via XMPP using my smartphone) activity in the house. When I know everyone is out, I can register (also via XMPP) for door/motion events. I can tell if someone is entering my house (our neighborhood has had a recent break-in).

This is an important Internet aspect of Panoptes.  I rarely use it though.  My main use of Panoptes turns out to be when I am at home.

So, I want IoT stuff, but I want it to be "Set and Forget".  This is the primary focus in my series of Monitoring projects.