Wednesday, October 02, 2019

Wandering Tag - Discreet GPS tag for people and pets

It's been a while since I've posted any ramblings.  Those (few) souls who have been following my sporadic blogging and rants (thank you, all dozen or so of you ;) may remember me going on about "elderly monitoring" (I even had a side blog addressing this).

A decent elderly monitoring system would include some lightweight tracking capabilities (e.g. for early dementia sufferers who tend to wander from the house).

I've been working on and off on such a device.  I kept running into walls regarding how to communicate geolocation over long ranges.  Cellular is the obvious choice, but it is costly, requires a "subscription" and is at the mercy of the telcos. I wanted something more in the control of the user (consumer).  Basically: no subscription needed.

Enter LoRa.  Well, enter LoRa-like capabilities: hundreds of meters of RF range.  In testing some simple LoRa modules I've reliably gotten around 800 meters (approx 1/2 mile) in a suburban environment (buildings, trees, hills, etc.), with a small trace antenna broadcasting to a large whip antenna at 915 MHz.  I should be able to get further, but these initial tests were with conservative bandwidth and spreading factor settings.

This is actually an adequate range to track a "wandering" elderly person. When my mother-in-law lived with us and would "take off" early in the AM (before we woke), we always found her within a quarter- to half-mile radius (sitting on a bench or at the local McDonald's).

In addition, this is probably adequate range for tracking pets (e.g. dogs that get out of the yard, cats that decide to go missing, etc).  So, I'll probably focus on that as an initial target.

I've gotten requests for pointers to "pet tracker" products.  I found a few, with Amazon's Fetch being the gorilla in the room.  (Amazon's solution sounds like yet another way for them to gather more private data on people... I don't want to do that.)

Now, to be clear,  I'm talking LoRa and not LoRaWAN.  I know that there are asset tracking devices out there that do LoRaWAN (and rely on infrastructures like the Things Network).  There are also "Tile" trackers that rely on social networks (i.e. people with apps installed on their phones that help the "community" find a lost pet).

What I want is a completely private system that requires no external "subscription" or "social network".  So, you would run a LoRa base station in your home (e.g. nothing fancy, maybe an ESP32 + LoRa + large antenna) and talk with it using your smartphone via Wi-Fi or BLE.  The base station would also be "portable" so you could take it with you (walking or in a vehicle) to do live tracking of the pet.

Of course, it may make sense to use something like Twilio to forward events to your smartphone when you aren't home, but that would be an option (not a requirement).

I am working on this now and have made some quite interesting progress.  I intend the design and code to be open sourced (in particular: GPL'd), and there will be some interesting sizing and power-saving tricks.

More to come... stay tuned.

Thursday, June 06, 2019

Sleepy Bee to MSP430 to STM32L031... Woeful complexity

I need a simple, very, very low power MCU to form the basis of coin-cell-sized IoT sensor monitoring.
Here is what it needs to do:

  1. Control the Semtech SX1276 LoRa chip via SPI
  2. Talk to an accelerometer via I2C
  3. Spend a lot of time sleeping waiting for accelerometer events.
  4. Wake up, send a LoRa command and go back to sleep
  5. Wash, rinse, repeat.

(If you read this blog you'll know that C and pre-written Arduino or STM32Cube or whatever frameworks are not the answer for me.  I'm old and tired of learning your new (potentially half-baked) frameworks. I've got spec sheets for the peripheral chips and know how to use them...)

So, I need to choose...

Sigh... the C8051 (Silabs SleepyBee) is just too low level. I've got enough 16-bit math and string manipulation to do that it simply doesn't make much sense to go this low.

MSP430 is long in the tooth. Sure, the specs are still there and the chips are available, but they seem overpriced for what I need (basically 1KB RAM and maybe 32K flash).  I'm not nickel-and-diming it, but it just feels wrong to pay $3-4 for such an old design.  I've got Forth Inc's SwiftX MSP430 code and there is always Mecrisp too, so from a development perspective the MSP430 has what I need.  But chip availability is spotty and prices are high...

What about ARM Cortex?  It's hard to beat the ARM Cortex these days. Size, power, diversity: it has all that in spades.  I'm well versed in the STM32L496 and STM32L432, but those M4 beasts are overkill for my needs.  They are very complex chips.

So I stumble upon the STM32L031 and I am blown away by the datasheet specs. It's a Cortex-M0+ with MSP430-like low power specs. Heck, it blows away most MSP430s. And it's cheaper... with 8KB RAM and 32KB flash.

A couple of years ago I ported Forth Inc's SwiftX ARM compiler to the STM32L432. It took a little effort, but it worked.  I think I used an STM32F411 as the basis for the port.  I'm not looking forward to porting to the STM32L031.  But wait, it looks like there is a Mecrisp-Stellaris port for that chip. Hurrah!  But, wait again... it's a bare port. No SysTick, no CMSIS mappings, etc.  Okay, I like bare metal stuff... but... I have this thing I want to build and I need something working soon.  Maybe I don't want to bite off that effort.  Let's see... there is already a SysTick interface in Mecrisp for another STM32 part; maybe I can start there.  But then I have to deal with sleep transitions.  On the MSP430 (and Sleepy Bee) they are dead easy; the STM32L has always been a bear.  Is the L031 sleep code the same as the L432's?  I hope so... otherwise I have a lot of reading to do.

I download the L031 reference manual... it's around 900 pages. Okay, I know I don't need to consume it all, but I'll still need to do the laborious register-bit hunt through the requisite ten thousand clocking options.

What am I doing here?  I need a simple low power microcontroller with an RTC sleep mode under a couple of µA and a run mode under 5 mA.  I want enough memory that I can get my stuff done.

To add to this dilemma, the STM32L072 (which should be very close to the STM32L031 I mentioned above) is now the darling of "module" embeds.  Well, "darling" may be overselling it, but there is a rather nice LoRa module I'm looking at that embeds this MCU. If (a big IF) I get the L031 working, it should be a simple port to the L072.

But... I'm stuck on the overall complexity of the STM32L.  I've been there before (the L496), but I was paid to work with that chip for 9 months. Nine months living and breathing the nearly 2000 pages of reference manual.  Tweaking. Tuning.

Maybe the MSP430 isn't dead to me after all. The MSP430F2274 (1KB RAM + 32KB flash) is more than adequate.  I've got a SPI driver written. I've got a development environment.
But at $6 it's insanely pricey.  The STM32L031 (which has more memory, more features, smaller packages and uses less power) is only around $2.50.

Honestly, I am not making millions of devices so a few dollars here and there doesn't really matter.

But I have to ask myself... is it worth the complexity?

Wednesday, May 15, 2019

Frustrated with Complexity

I've been doing a bit of ESP32 development at work (and some at home) and I've hit a wall.  When things fail, they fail badly. I can't say that I understand the Xtensa toolchain (I do know that it takes several minutes to build an ESP32 binary on my laptop: FreeRTOS plus my app).

I've been using LuaRTOS and am generally happy with it (a few crashes due to not-quite-well-debugged libraries), but my problem is...

The more I use other people's code, the less I can say I know what is going on inside my devices.  I am building (critical) house monitoring devices (including one that controls a pump via an H-bridge and PWM).  Not exactly real-time constraints, but certainly 24/7 reliability. I need to know that this system can be trusted.

Okay, okay... so back to Forth.  I'm looking into replacing the ESP32 (w/ LuaRTOS) Copious Free Time project with a SleepyBee (w/ MyForth) and an ESP8266 (in AT command mode) for Wi-Fi connectivity.

Why?  Because I'm pedantic that way.  Close to the metal, baby!

Sunday, March 17, 2019

Alien Technology... of sorts...and a restart?

It's been a long time since I've blogged here.  I have no idea if anybody is still reading this, but I'll go ahead and say that I intend to start writing more blog entries in the coming weeks.

But, in the meantime, a bit of catch-up:

I had a wonderful opportunity during 2017-2018 to work with GreenArrays and their wonderful GA144 chip on an actual, real-world project.  There were lots of starts and stops, and it wasn't quite "full time", but we delivered around 25 boards (and software) to the customer for "evaluation and proof-of-principle" purposes.  They liked what they got (delivered last December), but... they don't know if they want to continue.

I also got to do some more work with MyForth (Charley Shattuck's Arduino version) and have (for my own personal projects) pivoted back to the 8051 version (Charley & Bob Nash's work).
I ported MyForth to Silabs' (new) SleepyBee (EFM8SB2).  I'll write more about that later. Perhaps a lot more, since it is one of my Copious Free Time projects.

Taking a break from intense GA144 work (in PolyFORTH and arrayFORTH), I am getting nostalgic for Forth block editors.  I'm an emacs die-hard, but there is something still very focused and useful in using a block editor (well, especially when it is integrated into your development environment as deeply as PolyFORTH is on the GA144).

I may dust off my GA144 EVB and do some toolsmithing, or I may dive deeper into the MyForth tooling (hmmm... can I fit a block-oriented dev environment on an EFM8SB2?).  Recompilation is the challenge (as all of that is done on the PC under gforth).  I could use the gforth block editor, but MyForth is not an incremental compiler...

Still... I expect to do more Forth in the next few months (if not at work, then on my Copious Free Time projects).  Stay tuned.

Friday, October 13, 2017

Wish List: Clojure tethered to an MCU running Forth

I want to do 2 things:

  1. Use the power of a full desktop system (e.g. Linux, Emacs, Clojure, etc) to play with some SPI/I2C peripherals
  2. Compile a very limited subset of Clojure/Lisp to Forth for flashing into a microcontroller

Essentially, I want to take the "tethered" Forth approach (i.e. a full-blown interactive development environment talking directly to an MCU), but instead of Forth (Gforth, etc.) on the desktop I want to use Clojure/Lisp (basically a language with very rich desktop support).

#1 is pretty easy.  I can pick a popular Forth like Mecrisp (which runs on lots of MCUs) and talk to its Forth interpreter from the Clojure REPL.

#2 is harder, but necessary if I don't want to use #1 just for prototyping.

But why not just use a terminal and Mecrisp (for #1)?

Each ARM Cortex-M chip comes with a ton of definitions (registers, bit names, etc.) that I don't want loaded onto the chip. Also, every little "helper" function I write (to enrich the Forth REPL) takes space on the MCU.

Tethered Forths don't have this problem (as you leverage the desktop Forth to handle such things).

But, while Gforth is very nice, it still isn't as "rich" as I want regarding integration into Linux/Emacs (i.e. not enough batteries included).

Tuesday, June 13, 2017

Low Power MCU Fetishes

So, I am pretty familiar with the STM32L4xx low power Cortex-M4 MCUs.
They have some insanely low power consumption profiles, including a 0.65 µA standby mode with RTC (and just 14 µs wakeup time).

This thing is a beast to program (a 1600+ page reference manual to start with, plus copious app notes).  But I work with this chip for a living.

This current consumption is specified (in the documentation) at 1.8V, so a more typical 3-3.3V supply will likely draw more current.  I am assuming a direct battery hookup; otherwise the quiescent current of an LDO regulator has to be considered.

Still... this is insanely low.

I am looking to play around a bit with some "old" 8-core Parallax Propeller chips (P8X32A) I have laying around, and I read some forum discussion suggesting you could likely get one down to as little as 7-10 µA when running just one COG doing not much (maybe as a timer?).

To these jaded ears that sounds like a lot of power, but... honestly... really?

With a couple of 1200 mAh  AAA batteries (3V) the P8X32A would run around 11 years.  That is greater than the shelf life of AAA batteries.

Most of the current consumption these days isn't from the MCU but from the peripherals and sensors. If I wanted to beacon some temperature measurements via BLE (connection-less -- just as a "tag") maybe every minute, the battery life drops to around 1 year.  So, it would be more reasonable to beacon every 10 minutes.  Then I could get maybe 5-6 years.

We get lured into thinking too much about how low an MCU can go, when in the world of IoT, it is the RF that is killing us.

Just food for thought.

Monday, June 05, 2017

My Forth Story (Part 1)

This is just a few collected thoughts on my 30+ years of using Forth. So, if you are expecting high quality technical content, please move on.  Yes, nothing to see here, move on...

This past weekend I was going through old books, trying to clear out some bookshelf space, when I came upon a yellowing Forth Dimensions from 1986.  It got me thinking about when I first became enamored with Forth and how it has popped up now and then throughout my career.

Back in 1982, armed with my first computer: a Commodore VIC-20, I started my first year in college (I was 16 years old -- I skipped a year in grade school) infatuated by the possibilities offered by computer programming.  I wasn't really college material (I was planning on going into TV repair or maybe an Art school), but I had just (to everyone's surprise) won the Engineering division of the DC Science Fair and was offered a 4 year scholarship to the University of the District of Columbia.  I had prototyped an LED display based oscilloscope using some op-amps and 555 timers.  It was inspired by a design I saw in Popular Electronics.  But I digress...

So, here I was starting college (and a job as a TA in the computer lab!) and I knew it was time to "up" my skills (I was proficient in BASIC and some 6502 assembly).  We had a lab full of the newly purchased Commodore 64s (C64s) and a terminal room (oooh, remember green phosphor terminals?) remotely connected via 1200 baud modems to the school's DEC2060 (more on that later). I would split most of my daytime between the C64 and the DEC2060, and my nights were spent hacking on my venerable Commodore VIC-20.

Suffice it to say, my VIC-20 wasn't cutting it to get me kick started into  the highly competitive CS department.   I saved up money to get a Commodore 64 so I could continue my hacking education from the comfort of home.

On the DEC2060 we didn't have BASIC.  We had a sophisticated macro assembler, Rutgers University Pascal, and Fortran IV and 77.  None of this would work on the C64, and BASIC was quickly running out of steam.

It was either through BYTE (or maybe it was Compute!) magazine that I stumbled upon this language developed by this guy named Chuck Moore.  It was Forth and there were a couple of implementations available on the C64.  An implementation that intrigued me, in particular, came in cartridge form and booted (nearly) instantaneously.  This wouldn't require me to fiddle with the painfully slow floppy drive.

I became obsessed with Forth. The interactivity and the power (to lock up the C64) was addictive.

But, my CS (well, EE -- I started as an EE student and defected to CS) courses were on a DEC2060. The DEC20 was a lovely 36-bit word "mainframe" (shhhh! DEC wasn't allowed to call them mainframes as they didn't want to face the wrath of IBM and their patents). The 36-bit word size happened to be perfect for a Lisp cons cell.  I found Lisp quite lovely, powerful and intriguing, but I was still in the midst of my Forth obsession.

This obsession became even more all-consuming, around 1985, when I read about Chuck Moore's amazing Novix NC4016. I even ordered a fact sheet from Novix Inc so I could pore over as much detail as I could -- knowing I would likely never touch one.

In late 1985, my C64 Forth obsession hit a wall.   This wall was my obsession with fractals, particularly the Mandelbrot set, which I first read about in the August 1985 issue of Scientific American.  The C64 just didn't seem to have enough processing power to execute my naive implementation of Mandelbrot's algorithm.

Eventually, I found a Forth that ran on a DEC20, converted the algorithm to fixed point and managed to get the set rendered on a graphics terminal (over a 1200 baud modem!).  If my memory serves me correctly, the terminal was a fancy Tektronix 4150.  It took a lot of false starts and missed classes, but a couple of days later I had a color fragment of the famous fractal.

As I got further along in my CS curriculum, I discovered that technologies like home computers (C64, etc.) and languages like Forth were not really encouraged as tools of study.  So, I learned TOPS-20 assembly, Pascal, Fortran (IV and 77), TeX and Lisp.  I fell in love with Emacs (the original, written in TECO!) and generally was happy, but I was missing some of the hands-on immediacy of having my own personal computer and a personal happy-to-crash-it language like Forth.  There was a driving need, brewing inside me, to do something low level -- something dangerous.

So when a secretly procured copy of AT&T's UNIX Version 7 arrived at the University Computer Center (where I worked, distracting me from my scholarship and short-circuiting the pursuit of my degree), I worked with a couple of my friends to boot it on the DEC20, play around with it, and then remove it before the next day's classes began.  This was no trivial task, as it required hand-entering the bootloader on the front panel. Fun stuff.

I soon fell for the spartan language that accompanied the UNIX tape: C.

Over the years, I would continue to play with (and implement my own) Forths, but it wouldn't be until some 20 years later (around 2006) that I would get a chance to program in it extensively, when I revisited my low-level hardware past in the form of embedded development on microcontrollers.

To be continued...