Tuesday, 22 October 2013

Raspberry Pi and Arduino

I am putting together a data logger for the biogas generator.

I would like it networked so I don't have to go out in the cold, so I will use a Raspberry Pi.   To make interfacing the sensors easy I will connect the Pi to an Arduino microcontroller.   This is a bit over the top, as I should be able to do everything I need using the Pi's GPIO pins, but the Arduino has a lot of libraries to save me programming....

To get the Arduino toolchain working on the Pi, I installed the following packages:
apt-get install gcc-avr avr-libc avrdude arduino-core arduino-mk

To test it, copy the Blink.ino sketch from /usr/share/arduino/examples/01.Basics/Blink/ to a user directory.
Then create a Makefile in the same directory that has the following contents:
ARDUINO_DIR  = /usr/share/arduino
TARGET       = Blink
ARDUINO_LIBS =
BOARD_TAG    = uno
ARDUINO_PORT = /dev/ttyACM0
include /usr/share/arduino/Arduino.mk
Then just run 'make' to compile it, and upload it to the Arduino (in this case an Uno) using:
avrdude -F -V -p ATMEGA328P -c arduino -P/dev/ttyACM0  -U build-cli/Blink.hex
The LED on the Arduino Uno starts to blink - success!
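
The Pi side of the data logger then just needs to read whatever the Arduino prints over its USB serial port and store it.  Here is a minimal sketch of that, assuming pyserial is installed and assuming a logging sketch on the Arduino that prints one reading per line at 9600 baud (neither of which exists yet):

import time
import serial   # pyserial: sudo apt-get install python-serial

# Append whatever the Arduino prints, timestamped, to a CSV file on the Pi.
port = serial.Serial("/dev/ttyACM0", 9600, timeout=5)
with open("biogas_log.csv", "a") as log:
    while True:
        line = port.readline().decode("ascii", "ignore").strip()
        if line:
            log.write("%d,%s\n" % (int(time.time()), line))
            log.flush()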

Saturday, 19 October 2013

Small Scale Biogas Generator

I heard on the radio last week that some farmers are using anaerobic digesters to produce methane-rich biogas from vegetable waste.
This got me wondering if we could use our domestic waste to produce usable fuel gas - maybe to heat the greenhouse or something similar.

I thought I would make a small scale experimental digester to see if it works, and what amount of gas it makes, to see if it is worth thinking about something bigger.

My understanding is that the methane-producing bacteria work best at over 40 degC, so I will heat the digester.  I will do this electrically for the experimental set-up because it is simple, and it makes measuring the energy consumption easy.

I am using a 25 litre fermentation vessel for the digester - I got one with a screw-on cap rather than a bucket so I can run it at slightly elevated pressure if it starts to make gas.
For simplicity I got a 1 m² electric underfloor heating blanket to heat the vessel.  I will use an electro-mechanical thermostat as a protection device in case the electronic temperature controller I will produce loses its marbles and tries to melt the vessel.


To start with I just wrapped the blanket around the vessel.

But before I tested it I realised that this approach is no good - the vessel will not be full of liquid, so I do not want the heating element all the way up the sides.

So, I removed the heating element from the underfloor heating mat, and wrapped it around the bottom of the vessel instead.

To improve heat transfer between the heating element and the vessel, I pushed as much silicone grease as I could in around the element wires, then wrapped it all in gaffer tape to make sure it held together and I didn't get covered in grease:

It is looking promising now - the element gets warm, and the thermostat trips it out when it starts to get hot.  The dead band on the thermostat is too big to be useful for this application (it is about 10 degC), so I will just use it as an over-heat protection device, and use an Arduino microcontroller to control and log the temperature.

To get the proof of concept prototype working, I think I need to:
  • Sort out a temperature controller - will use an Arduino and a solid-state relay to switch the heater element on and off (the control logic is sketched in code below).
  • Gas Handling - I will need to do something with the gas that is generated, while avoiding blowing up the house or garage - I have seen a recommendation somewhere to use an aluminised mylar balloon, which sounds like a good idea if I can find one.
  • Gas Composition Measurement - I will need to find out the proportion of methane to carbon dioxide that I am generating - still not sure how to do that.   It would be possible with a tunable IR laser diode, but not sure if that is feasible without spending real money.  Any suggestions appreciated!
  • Gas volume measurement - the other thing I am interested in is how much gas is generated - not sure how best to measure very low gas flow rates.  I am wondering about modifying a U-bend type airlock to detect how many bubbles pass through - maybe detect the water level changing before the bubble passes through.
If this looks feasible, the next stages of development would be:
  • Automate gas handling to use the gas generated to heat the digester - success would be making it self-sustaining, so that it generates enough gas to keep itself warm.  That would mean scaling it up would produce excess gas that I could use for something else.
  • Think about how far I can scale it up - depends on what fuel to use - kitchen and 'soft' garden waste is limited, so might have to look for something else....
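
Here is a minimal sketch of the temperature control logic from the first bullet above, written in Python for clarity even though it will end up as an Arduino sketch driving the solid state relay; the sensor and relay functions are placeholders, not hardware I have chosen yet:

import time

SETPOINT = 40.0      # degC - digester target temperature
HYSTERESIS = 1.0     # degC - stops the relay switching too often

def read_temperature():
    # Stand-in for the real sensor reading (sensor not chosen yet).
    return 35.0

def set_heater(on):
    # Stand-in for switching the solid state relay output.
    print("heater", "ON" if on else "OFF")

heater_on = False
while True:
    t = read_temperature()
    if t < SETPOINT - HYSTERESIS:
        heater_on = True
    elif t > SETPOINT + HYSTERESIS:
        heater_on = False
    set_heater(heater_on)
    time.sleep(10)   # the mechanical thermostat remains the safety backstop
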
Will post an update when I get it doing something.



Saturday, 5 October 2013

Using Raspberry Pi as an IP Camera to Analogue Converter

I have an old-fashioned analogue TV distribution system in our house.   We use it for a video monitor for our disabled son so we can check he is ok.
The quality of the analogue camera we use is not good, but rather than getting a new analogue one, I thought I should really get into digital IP cameras.
I have had quite a nice IP camera with decent infra-red capabilities for a while (a Ycam Knight).   You can view the images and hear the audio on a computer, but that is not as convenient as having it work on the little portable flat-panel TVs we have installed in a few rooms for the old analogue camera.

I am trying an experiment using a Raspberry Pi to take the audio and video from the IP camera and convert them to analogue signals, so my old equipment can be used to view it.

What we have is:

  • IP Camera connected to home network.
  • Raspberry Pi connected to same network.
  • Analogue video and audio signals from Pi connected to an RF modulator, which is connected to our RF distribution system.
Using this I can tune the TVs on the RF distribution system to view the Raspberry Pi output.

I set up the Pi to view the audio and video streams from the IP camera by using the omxplayer video player, which is optimised for the Pi.   I added the following to /etc/rc.local:
omxplayer rtsp://192.168.1.18/live_mpeg4.sdp &
Now when the Pi boots, it displays the video from the IP camera on its screen, which is visible to other monitors via the RF modulator.

My concern is how reliable this will be - I tried earlier in the year and the Pi crashed after a few weeks with a completely mangled root filesystem, which is no good at all.   This time I am using a new Pi and new SD card for the filesystem, so I will see how long it lasts.
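
While I wait to see how long the SD card lasts, one thing that should help reliability is a small watchdog that restarts omxplayer if it ever exits (network glitch, crash, etc.).  A minimal sketch using the stream URL above - this is an idea rather than what is currently on the Pi:

import subprocess
import time

URL = "rtsp://192.168.1.18/live_mpeg4.sdp"

# Keep omxplayer running; if it exits, wait a few seconds and start it
# again rather than leaving a blank picture on the analogue output.
while True:
    player = subprocess.Popen(["omxplayer", URL])
    player.wait()
    time.sleep(5)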

Sunday, 22 September 2013

Human Power Meters

I have just done a triathlon with my disabled son, Benjamin (team-bee.blogspot.co.uk)

While we were training I started to try to calculate the energy requirements for the event, because I was worried about running out of glycogen before the end.   Most of the calculation methods cannot take account of the weather - especially wind - so I am starting to wonder how to make a power meter for our bike.   I could either go for strain gauges in the cranks, which is likely to be difficult mechanically, or I wonder if I can just use my heart rate.
I have just got a Garmin 610 sports watch with a heart rate monitor.  It uses a wireless protocol called ANT+.  I'll have to look at how good heart rate is as a surrogate for power output.
I may have to go to a gym to calibrate myself against a machine that measures work done...a winter project I think!

Sunday, 24 March 2013

Further Development of Video Based Seizure Detector

I have made a bit more progress with the video based epileptic seizure detector.

Someone on the OpenCV Google Plus page suggested that I look at the Lucas-Kanade feature tracking algorithm, rather than trying to analyse all of the pixels at once like I was doing.

This looks quite promising.  First you have to decide which features in the image to use - corners are good for tracking.  OpenCV has a neat cv.GoodFeaturesToTrack function which makes suggestions - you give it a couple of parameters, including a 'quality' parameter to help it choose.  This gives a list of (x,y) coordinates of the good features to track.  Note that this means 'good' mathematically, not necessarily the limbs of the test subject....

Once you have some features to track, OpenCV again provides a function, cv.CalcOpticalFlowPyrLK: you give it the list of features, the previous image and a new image, and it calculates the locations of the features in the new image.

I have then gone on to the Fourier analysis that I have been trying for the other types of seizure detection.  This time I calculate the speed of each feature over a couple of seconds and record it as a time series, then calculate the Fourier transform to give the frequency spectrum of the motion.   If there is oscillation above a threshold amplitude in a given frequency band for a specified time, we raise an alarm as a possible seizure.
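
Here is a minimal sketch of that tracking-plus-FFT idea, written against the newer cv2 interface rather than the cv.* functions mentioned above, so the function names and parameters are illustrative rather than the actual OpenSeizureDetector code:

import cv2
import numpy as np

cap = cv2.VideoCapture(0)          # or the camera's RTSP URL
ret, prev = cap.read()
prev_grey = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

# Ask OpenCV for corners that are mathematically 'good' to track.
pts = cv2.goodFeaturesToTrack(prev_grey, maxCorners=50,
                              qualityLevel=0.1, minDistance=10)

fps = 15.0
history = []                       # per-frame speed of each feature

for _ in range(int(fps * 3)):      # ~3 seconds of video
    ret, frame = cap.read()
    if not ret:
        break
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    new_pts, status, err = cv2.calcOpticalFlowPyrLK(prev_grey, grey, pts, None)
    speeds = np.linalg.norm(new_pts - pts, axis=2).flatten() * fps   # pixels/sec
    history.append(speeds)         # (lost points would need handling here)
    prev_grey, pts = grey, new_pts

# Frequency spectrum of each feature's speed, looking for energy in a band.
spec = np.abs(np.fft.rfft(np.array(history), axis=0))
freqs = np.fft.rfftfreq(len(history), d=1.0 / fps)
band = (freqs > 3) & (freqs < 8)
suspicious = spec[band].max(axis=0) > 1000.0    # threshold is arbitrary here
print("features flagged as possible seizure:", np.count_nonzero(suspicious))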

The code is functioning, but is a fair way off being operational yet.  The code for this is in my OpenSeizureDetector github repository (https://github.com/jones139/OpenSeizureDetector).

The current issues are:

  • I really want to track motion of limbs, but there is no guarantee that cv.GoodFeaturesToTrack will detect these as good features - I can make this more likely by attaching reflective tape, which glows under IR illumination from the night vision camera...if I can persuade Benjamin to wear it.
  • There is something wrong with the frequency calculation still - I can understand a factor of two, but it seems a bit more than that.
  • If the motion is too quick, it loses the points, so I have to set it to re-initialise using GoodFeaturesToTrack periodically.
An example of it working with my daughter doing Benjamin-like behaviour is shown below.   Red circles are drawn around points if a possible seizure is detected.

This does not look too good - lots of points are detected, and even the reflective strips on the wrists and ankles get lost.  It seems to work better in darkness, where I get something like the second video: there are only a few points, and most of those are on my high-vis reflective strips.

It does give some nice debugging graphs of the speed measurements and the frequency spectra though.
So, still a bit of work to do.....

Saturday, 9 March 2013

First go at a Video Based Epileptic Seizure Detector

Background

I have been working on a system to detect epileptic seizures (fits) to raise an alarm without requiring sensors to be attached to the subject.
I am going down three routes to try to do this:

  • Accelerometers
  • Audio
  • Video
This is about my first 'proof of concept' go at a video based system.

Approach

I am trying to detect the shaking of a fit.  I will do this by monitoring the signal from an infrared video camera, so it will work in monochrome.  The approach is:
  1. Reduce the size of the image by averaging pixels into 'meta pixels' - I do this using the OpenCV pyrDown function, which does the averaging (it is normally used to build image pyramids - versions of an image at various resolutions).  I am reducing the 640x480 video stream down to about 10x7 pixels to reduce the amount of data I have to handle.
  2. Collect a series of images to produce a time series of images.  I am using 100 images at 30 fps, which is about 3 seconds of video.
  3. For each pixel in the images, calculate the fourier transform of the series of measured pixel intensities - this gives the frequency at which the pixel intensity is varying.
  4. If the amplitude of oscillation at a given frequency is above a threshold value, treat this as a motion at that particular frequency (ie, it could be a fit).
  5. The final version will check that this motion continues for several seconds before raising an alarm.  In this test version, I am just  highlighting the detected frequency of oscillation on the original video stream.

Code

The code uses the OpenCV library, which provides a lot of video and image handling functions - far more than I understand...
My intention had been to write it in C, but I struggled with memory leaks (I must have been doing something wrong and not releasing storage, because it just ate all my computer's memory until it crashed...).
Instead I used the Python bindings for OpenCV - this ran faster and used much less memory than my C version (this is a sign that I made mistakes in the C one, rather than Python being better!).
The code for the seizure detector is here - a very rough 'proof of concept' at the moment - it will have a major rewrite if it works.
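
The core of the approach above fits in a few lines of Python with OpenCV and numpy.  This is a simplified sketch rather than the actual code - the repeated pyrDown gets a 640x480 frame down to roughly 10x8 pixels, and the threshold value is arbitrary:

import cv2
import numpy as np

fps = 30.0
n_frames = 100                     # ~3 seconds of video
cap = cv2.VideoCapture(0)          # or the IP camera URL

frames = []
for _ in range(n_frames):
    ret, img = cap.read()
    if not ret:
        break
    grey = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    small = grey
    for _ in range(6):             # repeated halving: 640x480 -> ~10x8
        small = cv2.pyrDown(small)
    frames.append(small.astype(np.float32))

stack = np.array(frames)                        # shape (time, rows, cols)
spec = np.abs(np.fft.rfft(stack, axis=0))       # per-pixel spectrum
freqs = np.fft.rfftfreq(stack.shape[0], d=1.0 / fps)

threshold = 50.0                   # arbitrary amplitude threshold
moving = spec[1:] > threshold      # ignore the DC component
for f_idx, r, c in zip(*np.nonzero(moving)):
    print("pixel (%d,%d) oscillating at %.1f Hz" % (r, c, freqs[f_idx + 1]))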

Test Set Up

To test the system, I have created a simple 'test card' video, which has a number of circles oscillating at different frequencies - the test is to see if I can pick out the various frequencies of oscillation.  The code to produce the test video is here....And here is the test video (not very exciting to watch I'm afraid).
The circles are oscillating at between 0 and 8 Hz (when played at 30 fps).
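
For reference, generating that sort of test card is straightforward with OpenCV's VideoWriter.  A sketch (using the current cv2 interface, so not the actual script linked above, and with made-up sizes and positions):

import math
import cv2
import numpy as np

fps, seconds = 30, 10
width, height = 640, 480
out = cv2.VideoWriter("testcard.avi",
                      cv2.VideoWriter_fourcc(*"MJPG"), fps, (width, height))

freqs_hz = [0, 1, 2, 3, 4, 5, 6, 7, 8]          # one circle per frequency
for n in range(fps * seconds):
    t = n / float(fps)
    img = np.zeros((height, width, 3), np.uint8)
    for i, f in enumerate(freqs_hz):
        # Each circle oscillates vertically at its own frequency.
        x = 50 + i * 65
        y = int(240 + 30 * math.sin(2 * math.pi * f * t))
        cv2.circle(img, (x, y), 15, (255, 255, 255), -1)
    out.write(img)
out.release()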

Results

The output of the system is shown in the video below.  The coloured circles indicate areas where motion has been detected.  The thickness of the line and the colour shows the frequency of the detected motion.
  • Blue = <3 Hz
  • Yellow = 3-6 Hz
  • Red = 6-9 Hz
  • White = >9 Hz

The things to note are:
  • No motion detected near the stationary 0 Hz circle (good!).
  • <3 Hz motion detected near the 1 and 2 Hz circles (good)
  • 3-6 Hz motion detected near the 2,3,4 and 5 Hz circles (ok, but why is it near the 2Hz one?)
  • 6-9 Hz motion detected near the 5 and 6 Hz circles (a bit surprising)
  • >9Hz motion detected near the 4 and 7 Hz circles and sometimes the 8Hz one (?)
So, I think it is sometimes getting the frequency too high.  This may be as simple as how I am doing the check - it is using the highest frequency that exceeds the threshold.  I think I should update it to use the frequency with maximum amplitude (which exceeds the threshold).
Also, I have something wrong with positioning the markers to show the motion - I am having to convert from a pixel in the low res image to the location in the high resolution one, and it does not always match up with the position of the moving circles.

But, it is looking quite promising.  It is rather computationally intensive at the moment though - it is using pretty much 100% of one of the CPU cores on my Intel Core i5 laptop, so there is not much chance of getting this to run on a Raspberry Pi, which was my intention.

Saturday, 2 March 2013

Getting Started with OpenCV

I am starting work on the video version of my Epileptic Seizure detector project, while I wait for a very sensitive microphone to arrive off the slow boat from China, which I will use for the Audio version.

I am using the OpenCV computer vision library.  What I am hoping to do is to either:

  • Detect the high frequency movement associated with a seizure, or
  • Detect breathing (and raise an alarm if it stops)
This seems quite similar to the sort of thing that MIT demonstrated some success with last year (http://people.csail.mit.edu/mrub/vidmag/).   Their code is written in MATLAB, which is a commercial package, so not much use to me, so I am looking at doing something similar in OpenCV.

But first things first, I need to get OpenCV working.  I am going to use plain old C, because I know the syntax (no funny '<'s in the code that you seem to get in C++).  I may move to Python if I start to need to plot graphs to understand what is happening, so I can use the matplotlib graphical library.

I am using CMake to sort out the make file.  I really don't know how this works - I must have found a tutorial somewhere that told me to create a file called CMakeLists.txt.  Mine looks like:
cmake_minimum_required(VERSION 2.8)
PROJECT( sd )
FIND_PACKAGE( OpenCV REQUIRED )
ADD_EXECUTABLE( sd Seizure_Detector.c )
TARGET_LINK_LIBRARIES( sd ${OpenCV_LIBS} )
Running 'cmake .' in that directory creates a standard Makefile, and then typing 'make' will compile Seizure_Detector.c and link it into an executable called 'sd', including the OpenCV libraries.   Seems quite clever.

The program to detect a seizure is going to have to look for changes in a series of images in a certain frequency range (a few Hz I think).   To detect this I will need to collect a series of images, process them, and do some sort of Fourier transform to detect the frequency components.

So to get started, grab an image from the networked camera.  This seems to work:
#include <opencv/cv.h>
#include <opencv/highgui.h>

IplImage *origImg = 0;
IplImage *procImg = 0;
char *window1 = "Original";

int main() {
    CvCapture *camera = cvCaptureFromFile("rtsp://192.168.1.18/live_mpeg4.sdp");
    if (camera != NULL) {
        cvNamedWindow(window1, CV_WINDOW_AUTOSIZE);
        while ((origImg = cvQueryFrame(camera)) != NULL) {
            procImg = cvCreateImage(cvGetSize(origImg), 8, 1);  /* grey workspace */
            cvShowImage(window1, origImg);
            if (cvWaitKey(10) >= 0) break;   /* lets HighGUI redraw the window */
            cvReleaseImage(&procImg);        /* release it, or memory vanishes fast */
        }
        cvReleaseCapture(&camera);
    }
    return 0;
}

I can also smooth the image, and do some edge detection:

    /* smoothImg and window2 ("Processed") are declared and created like origImg/window1 above */
    while ((origImg = cvQueryFrame(camera)) != NULL) {
      procImg = cvCreateImage(cvGetSize(origImg), 8, 1);
      smoothImg = cvCreateImage(cvGetSize(origImg), 8, 1);
      cvCvtColor(origImg, procImg, CV_BGR2GRAY);              /* greyscale copy */
      cvSmooth(procImg, smoothImg, CV_GAUSSIAN, 9, 9, 0, 0);  /* 9x9 Gaussian blur */
      cvCanny(smoothImg, procImg, 0, 20, 3);                  /* edge detection */

      cvShowImage(window1, origImg);
      cvShowImage(window2, procImg);
      if (cvWaitKey(10) >= 0) break;

      cvReleaseImage(&procImg);
      cvReleaseImage(&smoothImg);
    }

Full code at https://github.com/jones139/arduino-projects/tree/master/seizure_detector/video_version.

I am about to update the code to maintain a set of the most recent 15 images (=1 second of video), so I can do some sort of time series analysis on it to get the frequencies.....


Sunday, 24 February 2013

Epileptic Seizure Detector (3)

I installed an accelerometer on the underside of the floorboard where my son sleeps to see if there is any chance of detecting him having an epileptic seizure by the vibrations induced in the floor.
I used the software for the seizure detector that I have been working with before (see earlier post).

The software logs data to an SD card in Comma-Separated-Values (CSV) format, recording the raw accelerometer reading, and the calculated spectrum once per second.  This left me with 26 MB of data to analyse after running it all night.....

I wrote a little script in Python that uses the matplotlib library to visualise it.   I create a two-dimensional array with one column for each record in the file (i.e. one column per second).  The rows are the frequency bins from the Fourier transform, and the values in the array are the amplitudes of the spectral components.
The idea is that I can look for periods where I have seen high levels of vibration at different frequencies to see if it could detect a seizure.  The results are shown below:
Here you can see the background noise of a few counts in the 1-7 Hz range.   The 13-15Hz signal is a mystery to me.  I wonder if it is the resonant frequency of our house?
Up to 170 sec is just me walking around the room - discouragingly little response - maybe something at about 10 Hz.  This is followed by me sitting still on the floorboard up to ~200 seconds (The 10 Hz signal disappears?)
The period at ~200 seconds is me stamping vigorously on the floorboard, to prove that the system is alive.
Unfortunately the period after 200 seconds is me lying on the floorboard shaking as vigorously as I could, and it is indistinguishable from the normal activity before 170 seconds.

So, I think attaching a simple IC accelerometer to a floorboard will not work - attaching it directly to the patient's forearm looks very promising, but not the floorboard.

I am working on an audio breathing detector now as the next non-contact option....

The code to analyse the data and produce the above chart can be found on github.  It uses the excellent matplotlib scientific visualisation package.
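
For reference, the plotting boils down to something like the sketch below - this is not the github script, and the CSV layout (spectrum amplitudes in the last N_BINS columns of each one-second record) is an assumption:

import numpy as np
import matplotlib.pyplot as plt

N_BINS = 32                            # number of frequency bins logged per second
data = np.loadtxt("accel_log.csv", delimiter=",")
spectra = data[:, -N_BINS:].T          # rows = frequency bins, columns = seconds

plt.imshow(spectra, aspect="auto", origin="lower",
           extent=[0, spectra.shape[1], 0, N_BINS])
plt.xlabel("time (s)")
plt.ylabel("frequency bin")
plt.colorbar(label="amplitude (counts)")
plt.show()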

Wednesday, 13 February 2013

Epileptic Seizure Detector (2)

Update to add another spectrum...

I have been working on setting up the Epileptic Seizure Detector.  I tried wearing it for a while, and simulating the shaking associated with a tonic-clonic seizure.   Some example spectra collected on the memory card are shown below:
This shows that the background noise level is at about 4 counts.   
Wearing the accelerometer on the bicep gives a peak of up to about 8 counts at 7 Hz, but it is not well defined.
Wearing the accelerometer on the wrist gives a much better defined peak at 6-7 Hz (and it raised an alarm nicely).

I have also tried an ADXL345 digital accelerometer.  The performance is similar to the analogue one, but I think it may be slightly more sensitive.  Example spectra with the accelerometer attached to the bicep are shown below.  One is a simulated fit.  The other is a false alarm going down the stairs.  Not that much difference!


Therefore I think there is scope for this set up to work if it is worn as a wrist watch, but just attaching it to other parts of the body may not be sensitive enough.

I wonder if I could make a wrist sensor that is watch sized, with a wireless link to a processor / alarm unit?

Not sure if I will be able to persuade Benjamin to wear a wrist sensor though....Might have to think about microphones.

Saturday, 9 February 2013

Soldering onto Surface Mount ICs

I recently bought an accelerometer IC to use on my epileptic seizure detector project.  It is a tiny surface mount device as you can see below.
I gave a lot of thought to how to connect wires to it.  I did consider conductive glue, but it would be difficult to hold them all still for long enough for it to set, so I went back to solder.  This is how I did it...

1.  Mount the IC onto stripboard using epoxy adhesive:

2.  While the glue is setting, modify the soldering iron by wrapping some 1 mm² copper wire around the tip to give a very fine point.  Use solder to improve the heat transfer between the wire and the tip:

3.  Tin the solder pads on the IC, using some very fine solder (I got some 32swg solder off ebay).

4.  Obtain some very fine copper wire (I disassembled some cheap alarm flexible cable, and used strands from that).

5.  Hold a strand of wire onto a solder pad, and touch it with the soldering iron to melt the solder and create the joint.

6.  Repeat for all connections:


7.  Route the fine wires to the copper tracks, and solder on.  I used the insulation from the original alarm  cable to prevent short circuits:

Fiddly, and not very neat, but it worked for me - it is being used in my prototype epileptic seizure detector.



Epileptic Seizure Detector (1)

Our son worried us a bit a couple of weeks ago when he had quite a nasty fit, so I have been thinking about making an alarm to warn a carer that a person in their charge is having a seizure.

There are a few different ways to do this that I have thought of:

  1. Detect Movement using an accelerometer
  2. Detect the sounds associated with the movement using a microphone
  3. Monitor the movement with a CCTV camera and use image processing to detect the abnormal movement.
I am trying option 1 (accelerometer) first, but am working on the CCTV approach in parallel by learning OpenCV.

Because our son is autistic, it will be very difficult to get him to wear a device, so I hope to detect movement through the floorboards where he sleeps, but this will be much less sensitive than detecting it directly.  Therefore, this first proof-of-concept version attaches the accelerometer to a limb, just to get it working at all.  The issues with it are:
  1. We do not want false alarms caused by normal movement - I am addressing this by using a fourier transform to filter out only a range of frequencies of movement, in the hope that I can select the characteristic shaking of a seizure, but not detect too much normal movement.
  2. A quick shake should not raise an alarm, so to set off an alarm the acceleration in the appropriate frequency band should be more than a threshold value for a specified length of time (3 sec currently).  This will give a warning 'pip'.   If the shaking continues for 10 sec, it raises a buzzing alarm.  (This logic is sketched in code after this list.)
  3. Sensitivity will be a problem for detecting it through the floor - will need to work on that another evening.
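
Here is a minimal sketch of the band-filter-and-timing logic in item 2, written in Python just to show the idea - the real implementation is the Arduino sketch in the github repository, and the sample rate, frequency band and threshold below are placeholders:

import numpy as np

FS = 50                  # sample rate, Hz (assumed)
BAND = (5.0, 10.0)       # frequency band of interest, Hz (assumed)
THRESHOLD = 100.0        # spectral power threshold (arbitrary)
WARN_SECS, ALARM_SECS = 3, 10

def band_power(samples):
    # Power in the seizure band for one second of acceleration magnitudes.
    spec = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / FS)
    return spec[(freqs >= BAND[0]) & (freqs <= BAND[1])].sum()

shaking_secs = 0
def update(one_second_of_samples):
    # Call once per second; returns 'ok', 'pip' or 'alarm'.
    global shaking_secs
    if band_power(one_second_of_samples) > THRESHOLD:
        shaking_secs += 1
    else:
        shaking_secs = 0
    if shaking_secs >= ALARM_SECS:
        return "alarm"
    if shaking_secs >= WARN_SECS:
        return "pip"
    return "ok"
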
The system uses an Arduino microcontroller, connected to a Freescale MMA7361 three axis accelerometer.   The accelerometer is a tiny (5mm x 3mm) surface mount device, so soldering it is a challenge - you can see how I did it here.
To enable data logging, so I can tune it to get the frequency response, threshold etc., the Arduino is also connected to a real time clock module and an SD card module.
The completed prototype is shown below:
The code is in my Arduino Projects github repository.

And here is a simple demonstration of it working - you can hear the warning 'pip' and the alarm 'buzz' in the background when I shake my arm to simulate a seizure. 
Still quite a bit of work to do - build it on stripboard to make it more robust, then try attaching it to the floor and seeing if I can detect any signal from someone shaking.  If not, I will have to miniaturise it to make it wearable, and train Benjamin to wear it....


Monday, 28 January 2013

Getting Started with Raspberry Pi

I have had a Raspberry Pi single board computer in a box in the attic for a few months - I had forgotten that I had pre-ordered it, and was busy with the Arduino solar panel power meter when it arrived, so didn't do anything with it.

Well, I know that the wheelchair project will need some brackets to mount motors, lights, GPS receiver etc., and have been reading about 3d printing, and thought it would be a handy excuse err... a necessary part of the project, to try out 3d printing for these parts.   And the 3d printer will need a little print server, so I don't tie up my laptop when it is printing.   So, I am dusting off the Raspberry Pi and having a go at setting it up to see if it will be able to do that.

These are my notes, so that I can do it again if I accidentally break it...

Basic Set-Up

  • Download the Debian root filesystem image from the Raspberry Pi web site.
  • Unzip the archive to give us 2012-12-16-wheezy-raspbian.img.
  • Copy it to a 4GB SD card using dd if=2012-12-16-wheezy-raspbian.img of=/dev/sdb.  (Note, write to whole SD card, not to a partition - sdb, not sdb1).
  • Put SD card into raspberry pi, connect HDMI to TV in living room and switch on.
  • Success - boot messages displayed on TV
  • Failure - it lands in an interactive set-up utility, and I don't have a keyboard for it - doh....maybe I should have gone for openWRT.
  • Try a different approach - forget the TV now I know it boots, and just connect it up to the network.  It gets its IP address from my router, and I can now ssh into it, with username pi, password raspberry.
  • Now I can run sudo raspi-config, which is the same config utility that came up on the TV monitor.  Used this to expand the root filesystem to fill the SD card, but didn't see much point in changing anything else (will sort out a user in a minute and do away with the pi user).

3d Printing Stuff

  • Followed instructions at https://github.com/w-A-L-L-e/printerface, with the following exceptions:
  • mv kliment-Printrun-71e5da0/ printrun
  • Node-js needed sudo apt-get install nodejs not node-js.
  • Had to do sudo ln -s /usr/bin/nodejs /usr/bin/node to get npm install.sh to work.
  • needed to run curl https://npmjs.org/install.sh | sudo sh to avoid directory access errors.
  • The forever@0.9.2 failed to install with lots of errors, but npm install -g forever worked.
  • But starting printerface using forever failed with an error on line 404 (monitor.send).
  • node printerface.js works though - web interface appears on port 8080.
Will update when I get further....

Sunday, 6 January 2013

Design Calcs for Power Assisted Wheelchair

Update to correct my deliberate mistake...Answer is still about the same, but I am now designing to a 1 in 3 (18 deg) gradient.

A very quick go at some preliminary design calculations for the power assisted cross country wheelchair.
The idea is that the motor should be capable of preventing it slipping backwards on an 18 deg incline (fairly arbitrary, but I needed to make a design assumption).
  • Based on an assumed mass of 50 kg, this gives a weight of 490 N.
  • This resolves to a force down the 18 deg slope of 152 N.
  • Which is equivalent to a torque on an 18" (0.46 m) OD wheel of 152 N x 0.23 m ≈ 35 Nm (the same numbers are checked in the short script below).
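
A quick check of those numbers in Python (mass and wheel size as assumed above):

import math

mass = 50.0                    # kg, assumed total mass
g = 9.81
slope_deg = 18.0
wheel_od = 0.46                # m (18 inch wheel)

weight = mass * g                                               # ~490 N
force_down_slope = weight * math.sin(math.radians(slope_deg))   # ~152 N
torque = force_down_slope * wheel_od / 2.0                      # ~35 Nm
print("weight %.1f N, downslope force %.1f N, torque %.1f Nm"
      % (weight, force_down_slope, torque))
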
Alas this is more than twice the torque delivered by the bicycle hub motor (15 Nm).  Now the motor is likely to have internal gears, but it could be tricky to make some new ones to reduce its speed and increase its torque... [Update - oh no it doesn't - it is shown as gearless].
So it looks like if I am going to use hub motors, I will have to use two of them.  This would probably be sensible, as it will be better to drive the rear wheels, but it will also be more expensive...

I have ordered an electric wheelchair conversion kit off ebay - will see how that goes.  Torque and speed should be ok, but it looks heavy and clunky, so I expect to upgrade it...

Power Assisted Cross Country Wheelchair

Our son Benjamin does not walk too well, and will suddenly run out of energy, so when we are out in the countryside we take a three wheeler cross country wheelchair for him:
He has got too big for this one, so we are going to get him the biggest one we can find:
This new one has 16" spoked wheels, so this means it should be possible to add some form of power assistance to it, as you can get some nice lightweight motors that fit into bicycle wheel hubs.
So, I intend to get an electric bicycle conversion kit, and fit it to the wheelchair.  There are a few things to deal with to make it work:
  1. Will the new hub fit in the front forks of the wheelchair? (waiting for supplier to measure it for me on Monday).
  2. Although fitting the powered wheel at the front will be the easiest mechanically (assuming it fits), the front wheel has less weight on it, so it may just spin and not be much use, so I may have to look at how to fit it to one of the rear wheels (which then raises the concern about whether it will spin round in circles!).
  3. The bike set-up is intended to go a lot faster than I want this wheelchair to go (I guess it will target around 12 mph, but I think 4 mph will feel quite fast enough for me).   Mounting the hub in a smaller wheel will reduce the speed, but I think it will still be too fast (will do the sum later...), so I think I will have to modify the motor driver.    The motor is brushless, which from what I have read sounds pretty much the same as a stepper motor - you have to feed it with a waveform to get it to turn (and to go in the right direction).   So even if I cannot simply modify the controller, I can use its power transistors etc. and use an Arduino to generate the waveforms.
An alternative may be to go for two electric wheelchair motors, but they look awfully heavy compared to the bike motor, so I am tempted to go with that as a trial.   If it doesn't work, I'll put the electric kit on our Hase Pino tandem to help me up the hills, as Benjamin doesn't put too much effort into pedalling!

I'd be interested to hear if anyone has tried this and has experiences to share.


Sunday, 16 December 2012

Approaching a working version of Arduino Solar Monitor

Christmas is coming, so I have to get a working version of the Arduino Solar Panel monitor.

My original intent was that it would have the following features:

  1. Measure the collector differential temperature.
  2. Infer the water flow rate from pump speed.
  3. Calculate the instantaneous power being collected (see the sketch after this list).
  4. Calculate hourly and daily average powers.
  5. Log this information to an SD card.
  6. To achieve (5) easily, derive the time from an external Real-Time-Clock (RTC).
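
For feature 3, the instantaneous power just comes from the flow rate and the temperature rise across the collector.  A one-function sketch in Python, purely to show the arithmetic (the real calculation lives in the Arduino code on github, and the example numbers are made up):

def collector_power_w(flow_l_per_min, delta_t_c):
    # P = m_dot * c_p * dT, assuming the fluid is water.
    c_p = 4186.0                     # J/(kg.K) for water
    m_dot = flow_l_per_min / 60.0    # kg/s (1 litre of water ~ 1 kg)
    return m_dot * c_p * delta_t_c

# e.g. 2 l/min with a 10 degC rise across the collector:
print(collector_power_w(2.0, 10.0))  # ~1395 W
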
Now, features 1-3 are working, phew.   Feature 4 is implemented, but I need to think about what I really want (is daily average power useful, or should I just integrate the total heat collected in a day?).

Features 5 and 6 are proving troublesome, as I think I am starting to get to the limit of a single Arduino board (or more precisely the ATMega 328 controller on the board).

The main problem is that I am running out of RAM, and am going to have to give some serious thought to how to manage it better (just like old times programming a Zilog Z80A.....).

One problem is the number of different interfaces (and hence libraries) that I am having to use to achieve this.  The base software uses:
  1. OneWire.h and DallasTemperature.h to do the temperature monitoring, using a One-Wire bus.
  2. LiquidCrystal.h to drive the LCD display, using parallel data transfer
To add SD card support and a real time clock, I will need:
  1. Wire.h and DS1307RTC.h to access the real time clock from an I2C interface.
  2. SD.h to access the SD card from a SPI interface.
Each library uses a bit of RAM, and there is only 2 KB of RAM on the chip, so I am running out of it rapidly.  When I tried to add the RTC code, the board rebooted every few seconds, which I think was an out-of-memory issue.

So, the de-scoped system is not going to do SD card logging.  As compensation I have added switches to the two spare digital lines to provide a simple user interface, so you can scroll between instantaneous, hourly and daily data, and maybe even set the clock (but I do worry about running out of RAM again if I get too adventurous!).

Given this, I have put the 'Version 1' hardware together, and mounted it in a cheap 2 gang socket pattress box with a blank cover cut out to hold the board:
The toggle switch on the front is for the display back-light - I thought that would be easier than a push-button if you were trying to use buttons for the interface - the new buttons are facing the bottom of the picture on the side of the front panel, so you can't see them on this photo (and I made a mess of cutting the holes for them, so it looks a bit ugly...).

Right, just got to sort out the software now.  Current version is on github.

Sunday, 9 December 2012

Odd Behaviour of Arduino

I think I have got too used to working on large computers, which have essentially infinite resources, as far as my little projects are concerned, so I am having a bit of trouble with Arduino.   The two interesting problems I have seen are:


  1. Low Battery:  The symptoms were the device operating OK for quite a while (initially around an hour, but later ~5 minutes), then doing very unexpected things - what I saw was the pin 13 LED flashing on and off, but my software was not doing anything with that pin.   It looked like the board had just lost its marbles.   It turned out that the issue was low voltage on the 9V battery that was powering it - I did not realise initially because the LED backlight on the display worked fine, which was the test I was using for 'it has got power'.  When I put a voltmeter on it, the battery was only providing around 5.3V, so I think this was insufficient to start the Arduino properly...but surprisingly it was enough to light the LED.  I have always assumed that LEDs need more power than little electronic devices, so if the LED is OK the board has enough power - this is clearly not true, so I must think of another simple check!
  2. Out of Memory:  My solar thermal monitor now works nicely with an LCD display, and I made a simple test program to write data to an SD card.  The odd thing is that merging the two together results in a program that compiles OK and loads onto the device, but when it tries to write to the SD card, the Arduino restarts (I had worse effects earlier, when it would not start at all until I removed some code in the setup function that writes to the SD card).  I think it must be running out of memory, but I need to do some work to check this...will update once I have fixed it.

Sunday, 2 December 2012

Arduino Based Solar Panel Power Monitor

My dad has a solar thermal collector on his roof (2x20 vacuum tube collectors).   His commercial controller gives total kWh collected, but I want an instantaneous kW indication.   I am developing this using Arduino (see Microcontrollers Revisited).

Version 1 used a 7 segment LED display:
But I have now received an LCD display off the slow boat from China courtesy of Ebay, so I wanted to update it to give more information on the display.
The updated board is shown below:

I made a little mistake when I was modifying the circuit board: when I added the LCD contrast potentiometer, I accidentally left one of the digital output pins connected to ground if the potentiometer was turned too far.   The Arduino board suddenly stopped working altogether, so I thought I had fried it.  I was pleasantly surprised when I powered it from a 9V battery rather than from the USB socket, because it came back to life and appeared to work normally.
We installed it on the panel, and started to get sensible readings off it this morning (see above picture), but it was reported dead this afternoon, with the Pin 13 LED flashing every now and then (which is odd because my software does not use Pin 13)....It starts to work if you power it down for a few minutes, but within 10 mins it crashes again with Pin 13 LED flashing.....So I think it is dead - will de-solder the nano and put another one in instead....

Well, I replaced the Arduino Nano on the PCB with a new one.   It worked fine....for a while (10-15 mins), then as I was packing up, I noticed that the display was very dim, but the LED backlight still worked OK, which made me think the battery was OK.   Five minutes later, it was showing the same behaviour as the previous one - no LCD display, and the Pin 13 LED flashing.   This time though, the new Arduino Nano has a working voltage regulator on the USB port.  When I plugged the board into the computer via USB, the board reset and is working fine again.....I must put a current meter on the 9V battery to make sure it is not pulling a ridiculous current or something.  If I am lucky it was just a dead battery, but I am still surprised that the LED backlight worked OK when the Arduino could not boot...I'll have to think about this a bit more...

There was definitely a dead battery involved - the 9V battery was only giving 5.9V, so I can see why the 5V voltage regulator may have been struggling.  The thing I need to find out, though, is why the battery is dead.  This should be a nice low-power circuit, but I will have to get a new battery and check the current the circuit is drawing.

Sunday, 25 November 2012

Microcontrollers Revisited

Quite a few years ago (probably 10), I started getting interested in using microcontrollers to do small computing tasks that required input-output and low power consumption.   I did not get very far with them, because every time I wanted to do something I would have to write software from scratch (to talk to a display or sensor etc.).  Also there was a lot of soldering involved to put the boards together, with crystals, capacitors etc. to support the microcontroller.

I recently discovered Arduino (http://arduino.cc), which is a simple microcontroller with a standard PCB layout, for which assembled boards are sold cheaply.  The Arduino Uno seems like a good one to use for prototypes, as all the I/O pins are taken out to headers that you can attach jumper wires to easily.   For 'production' versions though, the Arduino Nano seems like a better option, as it is much smaller, and you can solder connections directly onto the board rather than using jumpers etc.  I bought a few of these (or at least clones of them) very cheaply (~£11 each) off Ebay.

You can download a simple development environment where you can write the code for the boards in C/C++, compile it, and load it onto the board via USB - seems to work very well.

By far the best feature of Arduino though is the user contributed libraries - there are libraries for accessing one-wire devices, LCD displays etc., so you do not have to start from scratch for each project, which makes development much, much quicker.

So, I am starting to think of all of those 'I could make one of those, but it is a bit of a waste to use a full-blown computer for it' projects.   The ones I am starting on are:

  • Solar Thermal Monitor (Power Meter for water heater solar panel) - I have a first version working - see the solThMon directory in my Github repository.  A bit more description is provided in the github wiki.
  • Alternative Weather Station Receiver - the idea is to use a simple 433MHz radio receiver to read the signals from our weather station, so we do not need the big LCD display that came with it (no progress yet, but I have the hardware for it...).
To give an idea of what these things look like, here is a picture:

Sunday, 29 July 2012

Making Videos from Images

My daughter collected a nice series of images of the demolition of the old Steetley chimney in Hartlepool:

There are 25 images, so we thought it would be nice to make a little movie of them.
First they need a bit of processing, because the images are not quite level.   We rotated the images and trimmed off the edges to straighten them up, using ImageMagick:
mogrify -resize '700' -rotate 4 -shave '50x60' -resize '640' *.JPG 
This re-sizes the images to 700 pixels wide, rotates them clockwise by 4 deg, shaves 50 pixels off each side and 60 pixels off the top and bottom, then re-sizes to 640 pixels wide.  We then turned the sequence into an AVI movie file (MS-MPEG4v2 codec) using mplayer's mencoder:
mencoder "mf://*.JPG" -mf fps=4 -o Steetley.avi -ovc lavc -lavcopts vcodec=msmpeg4v2:vbitrate=800 -vf scale=640:480
This plays nicely using mplayer, and uploads to flickr too.

Final movie is here:
Or on Flickr here

Saturday, 7 July 2012

Webcam using NSLU2 and OpenWRT

I am trying to set up a simple web camera to take some photos and videos of birds on our bird table.
I have had an Edimax IP camera for a long time, but have had some trouble using and configuring it from Linux.  So, I am attempting a different approach using bits of hardware I have around.   There is a Linksys NSLU2, which is made as a network storage device, and I just picked up a Logitech USB web camera.   I decided to use OpenWRT as the operating system, because I used that on a previous project (bifferboard weather station).
This post is a few notes of where I have got to, and the problems encountered.

Operating System

  • Download OpenWRT:   To start with I used a tarball of OpenWRT 'backfire' 10.03, but it had problems with packages not being compatible.   Rather than work out why, I checked out the latest source code from the 'backfire' branch of the OpenWRT SVN repository.
  • Build OpenWRT:  This is as simple as changing into the source directory and doing 'make menuconfig', selecting the target system as 'IXP4xx', then the profile as NSLU2, exiting and typing 'make'.   This creates a flash disk image as bin/ixp4xx/openwrt-nslu2-squashfs.bin.
  • Put the NSLU2 into upgrade mode by powering off, pressing and holding the reset button with a paper clip then powering on.  Wait for ~10 sec until the top LED changes colour (very difficult for me to tell as I am colour blind...), then release the reset button.  The top LED now flashes two slightly different colours.
  • Connect the computer with the OpenWRT image on it to the same ethernet network as the NSLU2 - wireless connections do not work!
  • Flash the new image onto the NSLU2 using 'sudo upslug2 -i openwrt-nslu2-squashfs.bin'.
  • Once the flashing is complete, the NSLU2 reboots and you should be able to telnet into it - default ip address is 192.168.1.1, which is a pain if that is the same one as your router...
Configuring OpenWRT
The basic OpenWRT build does not do much, so it needs to be configured and the extra required packages added:
  • Default IP Address - this can be set by editing package/base-files/files/etc/config/network [there is probably a better way to do this, but this works...].
  • Add the drivers for the USB video cameras - using menuconfig select them in the kernel config section - make sure they are selected to be built-in [*] rather than modules [M], because otherwise they are not included in the flash drive image.
  • Add some software to do something with the video source - add the mjpg-streamer and motion packages.
  • Re-build OpenWRT and re-flash it.
  • You should now be able to look at :8080?action=stream (on the NSLU2's IP address) and see the image from the camera served by mjpg-streamer
Making it Do Something Useful
  • mjpg-streamer seems to work nicely and produces a jpeg stream that can be viewed on a browser.  No sound though - will have to look into how to deal with that.
  • motion also seems to work - need to modify /etc/motion.conf to tell it what to do.
  • I would really like it to record video clips when motion is detected, so am currently trying to see if I can build ffmpeg to link that to motion.....
  • I have run out of time this weekend.  What I have done is just used motion and removed mjpeg-streamer, as motion provides a mjpeg stream too.  The basic set-up is:
  • Logitec usb web cam.
  • Motion records images to /tmp/cam1/// - separated into different directories to stop the numbers of files getting out of hand.
  • A little php script /cgi-bin/browse.php is used to browse through the directories and view the images.
  • I could not get ffmpeg to compile, so no videos for now.
  • This is going to be installed at my sister's, so I will have to talk her through setting up her router so I can ssh into it remotely to fix it when it does not work.....
  • My version, using the same motion and php script set up, but using the edimax ip camera can be seen at http://maps.webhop.net/webcam.