Backing up Node-Red

For anyone looking in for the first time – I have a home control system based around ESP8266 boards and the odd Arduino. They talk MQTT to a Raspberry Pi which acts as a central controller – using Node-Red – the greatest thing since sliced bread. I’m in the process of putting together a (Linux) script to replicate my Raspberry Pi setup – as this is now central to controlling my various ESP8266 and Arduino gadgets around the place. I’ve learned a lot in the past few days but one last thing that was niggling me was Node-Red.

So installing Node-Red is easy enough and restoring nodes is easy enough – even specials like mine are a couple of simple operations to retrieve… for example my duskdawn node, which in turn needs the suncalc node…

cd /usr/lib/node_modules
npm install suncalc
wget --no-verbose https://bitbucket.org/scargill/duskdawn/get/0305d8365c4b.zip -O node-red-contrib-duskdawn.zip
unzip -q node-red-contrib-duskdawn.zip
mv scargill-duskdawn* node-red-contrib-duskdawn
npm install node-red-contrib-duskdawn

Looks complicated but you don’t need to understand it because it can all be put in a script (or run a line at a time).

So all of that is “easy” enough (I put easy in quotes as it is bloody hard when you are first faced with Linux). So to install you should be in the right directory, pull in suncalc, grab my code, unzip it, move it and install it. Doddle.

What’s not so obvious are the actual flows you use in Node-Red – if you’ve seen mine you’ll know I have several pages of them – items feeding websocket to MQTT, HTTP to MySQL, experiments, comments – there is no way you’d want to recreate this stuff manually. Thankfully, within Node-Red there is the ability to select a page full of flows (or a single one) and copy and paste them – but that doesn’t help if you have many pages and you want to recreate your work on another machine.

Well, it turns out (certainly in my case) that the info is all stored in 2 files – and it also explains why all my work disappears temporarily if I log in as someone else!!!

In your user’s home directory (might be /root, might be /home/pi) there is a node red directory – or rather a hidden .node-red directory (the dot hides directories – I have WinSCP set to show them all). In there are a couple of files, and your hostname is embedded in the file names – so they’re not the same on all machines, and copying them across won’t work unless you change the names.

So on my main Pi2 there are files in /home/pi/.node-red called flows_OtherPi.json and flows_OtherPi_cred.json – yes, you guessed it – the second file contains all the login details etc. that you put into your nodes. My target machine is called raspnas, and in my /root/.node-red directory there were a couple of files called… as you might suspect by now, flows_raspnas.json and flows_raspnas_cred.json.

With Node-Red STOPPED, I wiped the latter, copied the files from OtherPi over and renamed them. Started Node-Red and LO – all my flows and pages work with credentials intact.
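The copy-and-rename step scripts easily. Here’s a minimal sketch – the helper name is mine, and the directories and hostnames in the example are the ones from this post, so adjust to your own (and stop Node-Red before running it):

```shell
#!/bin/sh
# Copy Node-Red flow files from a backup directory to a target .node-red
# directory, renaming them for the new hostname on the way.
# Usage: copy_flows <src_dir> <dst_dir> <old_hostname> <new_hostname>
copy_flows() {
    cp "$1/flows_$3.json"      "$2/flows_$4.json"
    cp "$1/flows_$3_cred.json" "$2/flows_$4_cred.json"
}

# Example with the names from this post:
# copy_flows /home/pi/.node-red /root/.node-red OtherPi raspnas
```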

If only life in general were so easy…


4 thoughts on “Backing up Node-Red”

  1. Peter

    A related question about node-red:

    I have tried numerous times using NPM to add the mysql node to node-red. While NPM appears to succeed in doing so, the new node never shows up in the menu. I have tried doing this as root and as the normal Pi user. I am getting good data but am having to store it in text files for parsing later. Can you suggest what I might be doing wrong?

    1. Perhaps someone in here can help. Personally I gave up on MySQL a long time ago as being too write-hungry for a Raspberry Pi, and so I now use SQLite extensively - the setup is in my script - and I use the normal SQLite node for Node-Red. Slightly different to MySQL of course, but for general logging etc. I find it fine. A thought, depending on what you are doing: I also quite often just store stuff to a file - and read it back when needed. If you are using a Pi or similar, your SD is writing in blocks anyway, and hence your file writing may not be as bad as it looks:

      Here's what I do for saving non-volatile stuff. On startup - and every 5 seconds - I inject into an expanded version of this function.

      // The purpose of this code is to update an object on change. At power-up,
      // check for the file and create it if necessary.
      // The variable is "intel" in this case.

      var fs = global.get("fs");
      var fl = "/home/pi/intel.data";

      if (global.get("intel") !== undefined) // global.get returns undefined, not null, when unset
      {
          if (global.get("intel.Counter") !== 0) // the var exists - now check the counter
          {
              global.set("intel.Counter", global.get("intel.Counter") - 1); // counter is non-zero - decrement it
              if (global.get("intel.Counter") === 0) // if the counter drops to zero, update the file
              {
                  fs.writeFile(fl, JSON.stringify(global.get("intel")), function (err) {
                      if (err) {
                          return console.log(err);
                      }
                  });
              }
          }
      }
      else
      { // No var (power-up scenario)... does the file exist?
          try {
              fs.accessSync(fl, fs.F_OK);
              // The file exists - create the var by reading the file into it
              fs.readFile(fl, 'utf8', function (err, data) {
                  if (err) {
                      return console.log(err);
                  }
                  global.set("intel", JSON.parse(data));
              });
          } catch (e) {
              // Otherwise create both the var AND the file - ensuring the counter
              // is zero. New fields can be added dynamically.
              var intel = {
                  PergolaRgbState: 0,
                  PergolaRgbLevel: 0,
                  PergolaRgbColour: "AAAAAAAA",
                  Counter: 0
              };
              global.set("intel", intel); // store it in the global context, not just a local var
              fs.writeFile(fl, JSON.stringify(intel), function (err) {
                  if (err) {
                      return console.log(err);
                  }
              });
          }
      }
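      The way the rest of the flows use this, as I read it, is to change the data and load the counter; the 5-second inject then counts it down and writes the file only once things have gone quiet. A stand-alone sketch of that debounce pattern in plain Node – "ctx" stands in for Node-Red's global context, and the names (intel, PergolaRgbLevel) just follow the example above:

```javascript
// Stand-in for Node-Red's global context (global.get/global.set in a
// real function node).
const store = new Map();
const ctx = {
  get: (k) => store.get(k),
  set: (k, v) => store.set(k, v),
};

ctx.set("intel", { PergolaRgbLevel: 0, Counter: 0 });

// A flow that changes a setting marks the data dirty by loading the counter;
// the file is only written after that many quiet 5-second ticks.
function updateLevel(level) {
  const intel = ctx.get("intel");
  intel.PergolaRgbLevel = level;
  intel.Counter = 2; // persist after two quiet ticks
  ctx.set("intel", intel);
}

// Called by the 5-second inject; returns true when it is time to write the file.
function tick() {
  const intel = ctx.get("intel");
  if (intel.Counter !== 0) {
    intel.Counter -= 1;
    if (intel.Counter === 0) {
      return true; // fs.writeFile(...) goes here in the real flow
    }
  }
  return false;
}
```

      The point of the counter is that a burst of rapid changes results in one file write, not one per change – kinder to the SD card.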

  2. Not sure if you're aware but you can npm install from git repos. No need to use wget.

    Now if you were installing a bunch of packages and didn't want to rely on the internet being up, I'd direct you to yarn and its offline cache feature.

    What you really want to look into is Docker on the Pi.
