Tuesday, January 19, 2016

SNNR web stack: SQLite, NGINX, Node.js, Raspberry Pi

I'm writing to introduce SNNR, a "new" web stack: SQLite, NGINX, and Node.js running on a Raspberry Pi.  I intend to use this stack for home heat monitoring and control.  I had previously built some electric imp based devices to monitor temperature, and then built some to act as thermostats; initially I will use this stack to provide a very simple API for the electric imps to store data.  Later it will provide a web app for users to display temperature data and control the thermostats.  The stack runs on a Raspberry Pi 2 Model B running Raspbian, and uses NGINX (v1.2.1), Node.js (v4.2.4), and SQLite3.
All code is available in this repository on GitHub: https://github.com/dllahr/home_control/tree/v0.1



I installed Raspbian version 4.1.7-v7+ on the Raspberry Pi 2 Model B - downloaded from here:

Instructions here:

Edit:  set it up to accept SSH connections (it does not by default) - instructions here:


From Raspbian command line:
sudo apt-get install nginx
I started NGINX and verified it was running:
sudo service nginx start
curl localhost
The last command returns some simple HTML of the default NGINX response:
<h1>Welcome to nginx!</h1>

Node.js & integration with SQLite3

I initially attempted to install Node.js through the default package - and npm, which had to be installed separately - but npm did not work.  I uninstalled node and npm, removed the dependencies (apt-get autoremove), and then, after trying a few things, got a newer version of node and npm working by following slightly modified versions of the instructions here:
wget https://nodejs.org/dist/v4.2.4/node-v4.2.4-linux-armv7l.tar.gz
tar xzf node-v4.2.4-linux-armv7l.tar.gz
cd node-v4.2.4-linux-armv7l/
sudo cp -R * /usr/local
Important note:  to me this is a very messy install, because it is not obvious how I would uninstall it.  I could use the list of files in the node-v4.2.4-linux-armv7l directory to figure out which files had been copied over, but I don't know a priori whether any existing files were overwritten.

I then used npm to install the Node module to integrate with SQLite3:
npm install sqlite3
It failed with a long string of error messages, but the key one, at the top, was:
In file included from ../src/database.h:10:0,
                 from ../src/database.cc:4:
../node_modules/nan/nan.h:41:3: error: #error This version of node/NAN/v8 requires a C++11 compiler
Based on that message, Google pointed me towards this issue in the GitHub repo for nodejs/nan.  When I searched for which version of gcc was C++11 compliant, this gcc page seemed to indicate that version 4.3 would be sufficient - and I had v4.6 by default.  However, it hadn't worked, and the GitHub issue recommended using g++ 4.8 or later, so I upgraded gcc to version 4.8:
sudo apt-get install gcc-4.8
sudo rm /usr/bin/gcc
sudo ln -s /usr/bin/gcc-4.8 /usr/bin/gcc
(/usr/bin/gcc was just a symbolic link to /usr/bin/gcc-4.6, so by replacing it I changed the default gcc in use.)  This of course did not work, and it took me far too long to realize why:  I still needed to upgrade g++!  I did that analogously to how I upgraded gcc:
sudo apt-get install g++-4.8
sudo rm /usr/bin/g++
sudo ln -s /usr/bin/g++-4.8 /usr/bin/g++
I was then able to install the Node SQLite3 module (using the command above) and ran the Node SQLite3 example successfully!  I then modified it slightly to change from using an in-memory database to a file (code here), re-ran it, and used the SQLite3 command line tool to verify the contents.  NB:  I had also independently installed SQLite3 on my system (sudo apt-get install sqlite3); however, if I'm reading the documentation for the Node SQLite3 module correctly, when it installs it builds its own version of SQLite3, which it statically links.  To make sure I was understanding this correctly, I uninstalled the SQLite3 command line tool and re-ran the Node SQLite3 module test - it worked! (NNB:  I reinstalled the SQLite3 command line tool afterwards.)

I tested the ability to use Node as a server by starting it with some example code (here) with:
node example.js
I tested that it was running with:
curl localhost:8124
and it returned the expected message:  "Hello the Alka Bear and Beary the Polar Bear"

Configure NGINX to act as reverse proxy server for Node.js

Based on these instructions (under the section "Set Up Reverse Proxy Server"), I configured NGINX to forward traffic to the running Node.js instance by adding this code to /etc/nginx/sites-enabled/default under the "server" section:
location /homeControl/ {
    proxy_pass http://localhost:8124;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection 'upgrade';
    proxy_set_header Host $host;
    proxy_cache_bypass $http_upgrade;
}
This causes traffic at the /homeControl/ location to be forwarded to the Node.js instance running on localhost and listening on port 8124.  From a separate machine, I used curl against the address of the Raspberry Pi on my local network to test, and the expected message was returned:  "Hello the Alka Bear and Beary the Polar Bear"

Database for storing measurements

The database currently contains one main table describing the measurements that have been made.  It is generalized / abstracted so that it can contain measurements of any type.  It is initialized so that it can accept measurements of hardware voltage, light level, and temperature in Fahrenheit, but any measurement in any unit of measure can easily be added.  Each row therefore contains the ID of the device making the measurement, the time of the measurement, the type of measurement, the numerical value that was measured, and the units that the value is in.  The SQL code is here:

Code for Node.js to receive data from electric imps

Previously I had set up the electric imps to measure temperatures:

The main takeaway is that the electric imps send a post request with JSON encoded data (code here).

Based on advice here:

I wrote this code to receive JSON from the electric imps and save it to the SQLite database:

Currently, when an electric imp sends data, it includes its device ID, as well as 60 measurements each of hardware voltage, light level, and temperature, along with the time each measurement was recorded.  For each of the 60 measurements of each of the 3 types (180 total), the system adds an entry to the database.


Currently the system is very simple and only provides logging of measurements made by 3 electric imp temperature sensors; however, with those measurements in an efficient database it is already possible to selectively query and then manually display them - e.g. select an individual device and retrieve its latest data (e.g. from within the last day).  Future blog posts will add display of this data, and then control of other electric imps that act as thermostats.
