How Did I Do That? (NT) - A record of me trying to fathom out how to do a few nerdy things - hopefully this will help me remember how I made things work when I have forgotten...
Graham Jones - DIY Gas Chromatograph (2015-01-04)<h2>
Why Bother?</h2>
<div>
A while ago I started looking at making a <a href="http://nerdytoad.blogspot.co.uk/search/label/Biogas">biogas generator</a>, but didn't commission it because I had no way of detecting what gases were coming off it. A few weeks ago I was inspired by someone showing that they had made a <a href="http://www.freetronics.com.au/blogs/news/7793291-diy-arduino-controlled-gas-chromatograph#.VKkEqvmsXVM">DIY gas chromatograph</a>, so I thought that if I made one, we could do real experiments on biogas production from household waste to see what works best, etc.</div>
<div>
<br /></div>
<div>
So, our little project for this Christmas holiday was to make ourselves a DIY gas chromatograph and see if we could use it to detect the difference between CO2 and methane, which are the two main products I expect to see from the fermentation that produces biogas.</div>
<div>
<br /></div>
<div>
Note that when I talked to some chemists about this they advised that I could do this much more easily using wet chemistry because of the significant differences between CO2 and Methane, but I am a physicist, so something using physical properties sounds much more fun! (It is also much more of a useful education project for Laura, but she didn't know that at the start).</div>
<h2>
The Principles</h2>
<div>
A gas chromatograph relies on a constant flow of carrier gas passing through a 'column', which is in a temperature controlled oven. You inject the sample gas into the flow at the inlet of the column, and the constituent parts travel through the column at different rates, so the different constituents come out of the column at different times after injection. See the diagram from the <a href="http://en.wikipedia.org/wiki/Gas_chromatography">Wikipedia article</a> below.</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="http://upload.wikimedia.org/wikipedia/commons/thumb/8/87/Gas_chromatograph.png/350px-Gas_chromatograph.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="http://upload.wikimedia.org/wikipedia/commons/thumb/8/87/Gas_chromatograph.png/350px-Gas_chromatograph.png" height="175" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
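As a rough mental model (mine, not from the original project), the detector trace from such a column can be pictured as a sum of peaks, one per constituent, each centred on that constituent's retention time. A minimal sketch in Python, with purely illustrative retention times, widths and heights:

```python
import math

def detector_signal(t, peaks):
    """Idealised chromatogram: each constituent contributes a Gaussian
    peak centred on its retention time. 'peaks' is a list of
    (retention_time_s, width_s, height) tuples - illustrative values,
    not measurements from this project."""
    return sum(h * math.exp(-0.5 * ((t - rt) / w) ** 2)
               for rt, w, h in peaks)

# Two hypothetical constituents eluting at 30s and 55s after injection:
peaks = [(30.0, 2.0, 1.0), (55.0, 3.0, 0.6)]
```

Well-separated retention times are what makes the constituents distinguishable at a single detector.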
<h2 style="clear: both; text-align: left;">
Components of a DIY Chromatograph</h2>
<h3>
Infrastructure</h3>
<div>
The infrastructure (temperature measurement, temperature control, detector control etc.) can be done using an Arduino microcontroller.</div>
<h4>
Arduino Based Temperature Controller</h4>
<div>
This was Laura's part of the project - she developed an Arduino programme (sketch) that does the following:</div>
<div>
<ul>
<li>Measures the resistance of thermistors (assuming they are wired as a potential divider).</li>
<li>Converts the resistance to temperature in degC.</li>
<li>Performs 3 term (PID) temperature control by varying an 'analogue' output pin to control the oven temperature (see below for oven details).</li>
<li>Outputs relevant data (temperatures etc.) to the controlling computer using the USB serial connection on the arduino.</li>
<li>Responds to commands from the USB serial line to change set point, PID gains etc.</li>
</ul>
<div>
The Arduino Code is here: <a href="https://github.com/jones139/arduino-projects/tree/master/gc/oven">https://github.com/jones139/arduino-projects/tree/master/gc/oven</a>.</div>
</div>
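For reference, the resistance-to-temperature step can be sketched in Python (the Arduino sketch linked above is the authoritative version; the divider arrangement, the Beta-equation approximation and the 10k/B=3950 thermistor constants below are my assumptions, not values taken from the project):

```python
import math

def thermistor_temp_c(adc_reading, adc_max=1023, r_fixed=10000.0,
                      r0=10000.0, t0_c=25.0, beta=3950.0):
    """Convert an ADC reading from a thermistor potential divider to degC.

    Assumes the thermistor is the lower leg of a divider fed from the
    ADC reference voltage, with a fixed resistor r_fixed on top; the
    nominal constants (r0 at t0_c, beta) are hypothetical examples.
    """
    # Divider: Vout/Vref = R_th / (R_th + R_fixed)  =>  solve for R_th
    v_frac = adc_reading / adc_max
    r_th = r_fixed * v_frac / (1.0 - v_frac)
    # Beta-equation approximation: 1/T = 1/T0 + (1/beta) * ln(R/R0)
    t0_k = t0_c + 273.15
    inv_t = 1.0 / t0_k + math.log(r_th / r0) / beta
    return 1.0 / inv_t - 273.15
```

With an NTC thermistor on the lower leg, a lower ADC reading corresponds to a lower resistance and hence a higher temperature.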
<h4>
User Interface</h4>
<div>
We had a difficult design choice for user interface - do we write a 'native' user interface on a computer connected to the arduino, or make a web based system?</div>
<div>
I decided to go for a web based system, which means that you can use any computer as the user interface, so we need a little web server. Although some people use Arduinos for this, I thought it would be much easier to use a Raspberry Pi.</div>
<div>
We recycled the web server code from our <a href="http://www.openseizuredetector.org.uk/">Seizure Detector</a> project, which is a simple python web server.</div>
<div>
The python programme does the following:</div>
<div>
<ul>
<li>Listens for web requests.</li>
<li>If no special commands are given, serves a simple page showing the chromatograph settings and a graph of the temperature history (which will also be the detector output).</li>
<li>The main web page includes javascript code to allow bits of it to be updated without refreshing the whole page every time (the html/javascript code is Laura's).</li>
<li>Responds to specific commands (such as a change of set point) by sending these to the arduino across the serial line.</li>
<li>Collects data from the arduino (it sends a set of data every second), and creates a time series.</li>
<li>Uses the time series data to plot a graph of temperature history etc.</li>
</ul>
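The data-collection steps above can be sketched like this (the actual line format the arduino sends is not given in the post, so the 'key=value' format and the field names are my assumptions):

```python
import time
from collections import deque

def parse_arduino_line(line):
    """Parse a 'key=value,key=value' status line into a dict of floats.
    The real sketch's output format may differ - this is illustrative."""
    out = {}
    for field in line.strip().split(','):
        key, _, val = field.partition('=')
        out[key.strip()] = float(val)
    return out

class TimeSeries:
    """Rolling buffer of (timestamp, value) pairs for the temperature
    history graph; maxlen bounds memory use in a long-running server."""
    def __init__(self, maxlen=3600):
        self.data = deque(maxlen=maxlen)

    def add(self, value, t=None):
        self.data.append((time.time() if t is None else t, value))

    def values(self):
        return [v for _, v in self.data]
```

A bounded deque like this keeps the last hour of one-per-second readings without the server's memory growing forever.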
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi685NaTySLSxS1JcO5f1qfm0AuFECx2rtOw7_DDP4mz4TpKmkNVbm1QnHA9ILe27tLXx8YBLhrdC4S8YUzeh0hwKNEINOjguGwenAU6lIxF-vEEjyzlnw60-JgBZPRbJYamku2hxnS1GM/s1600/Screenshot+from+2015-01-04+10:36:56.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi685NaTySLSxS1JcO5f1qfm0AuFECx2rtOw7_DDP4mz4TpKmkNVbm1QnHA9ILe27tLXx8YBLhrdC4S8YUzeh0hwKNEINOjguGwenAU6lIxF-vEEjyzlnw60-JgBZPRbJYamku2hxnS1GM/s1600/Screenshot+from+2015-01-04+10:36:56.png" height="356" width="640" /></a></div>
<div>
<br /></div>
<div>
The web server code is the python files here: <a href="https://github.com/jones139/arduino-projects/tree/master/gc">https://github.com/jones139/arduino-projects/tree/master/gc</a> (execute runServer.py to start the web server).</div>
</div>
<div>
<br /></div>
<div>
The html and javascript based user interface is all here: <a href="https://github.com/jones139/arduino-projects/tree/master/gc/www">https://github.com/jones139/arduino-projects/tree/master/gc/www</a>.</div>
<div>
<br /></div>
<div>
The infrastructure part went well - we have a web interface to a three-term temperature controller that works fine, and sends data back to the web server, which produces a graph of the temperature history. You can change set point, controller gains etc. over the web interface.</div>
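As an illustration of the three-term control law the Arduino sketch implements, here is a minimal Python equivalent (the gains are placeholders, and the 0-255 output clamp mirrors the range of Arduino's analogWrite; this is a sketch of the idea, not the project's actual code):

```python
class PID:
    """Minimal three-term (PID) controller sketch."""
    def __init__(self, kp, ki, kd, out_min=0.0, out_max=255.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured, dt):
        """One control step: returns the heater drive for this interval."""
        error = setpoint - measured
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * deriv
        # Clamp to the 'analogue' output range (0-255 for analogWrite)
        return max(self.out_min, min(self.out_max, out))
```

With only proportional gain set, the drive is simply kp times the temperature error, saturating at full power for large errors.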
<h3>
Power Supply and Case</h3>
<div>
We will need a variety of power supplies (5V for the Raspberry Pi, 12V for the heaters, mains for the pump). I had an old computer case in the attic, so we used that - its power supply provides 5V, 12V at high current, +/-12V and 3.3V, so plenty for what we need. The case will also house the finished instrument, so it will look neater than most of my projects once I put the lid on!</div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgcZk6eqafCDXRAVYYfEDuE-3NY4eioRZyy7C43o3SwuqYfcTCNr1CvT1lS6EBDjZgndHgRPjIFSd6La1gvoJ4Z994C-rUrNLzTMHr_4SVJHf-loxYGSg2k7lmMGOpgUDQoCJjDhha1nTw/s1600/IMG_20141116_130100.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgcZk6eqafCDXRAVYYfEDuE-3NY4eioRZyy7C43o3SwuqYfcTCNr1CvT1lS6EBDjZgndHgRPjIFSd6La1gvoJ4Z994C-rUrNLzTMHr_4SVJHf-loxYGSg2k7lmMGOpgUDQoCJjDhha1nTw/s1600/IMG_20141116_130100.jpg" height="150" width="200" /></a></td></tr>
<tr><td class="tr-caption" style="font-size: 13px;">The case for the project!</td></tr>
</tbody></table>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8qnr5RH0l8YHqCzz3JVCePVXw5-ZxaEggWVbILbknL-HRJUHCeCUJ0gU9ReRgBh4hyAT8H08PBCktsy2c12WZ3_dFr8ic53ZuCtqD78D876Lssr2RKXFA4PkFD_1a2gGIppWkNQJTapY/s1600/IMG_20141116_130609.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8qnr5RH0l8YHqCzz3JVCePVXw5-ZxaEggWVbILbknL-HRJUHCeCUJ0gU9ReRgBh4hyAT8H08PBCktsy2c12WZ3_dFr8ic53ZuCtqD78D876Lssr2RKXFA4PkFD_1a2gGIppWkNQJTapY/s1600/IMG_20141116_130609.jpg" height="150" width="200" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The case before I removed the old computer boards to make room - the power supply is at the back.</td></tr>
</tbody></table>
The ATX power supply does not start up when the unit is powered on - you need to press the power button on the case, which energises a line to the power supply via the main computer board. You can force the power supply to run by shorting a particular pin on the main connector down to ground:<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjffPVSxmB7nCI1RHlcNfYsR3G41JuUxxl87fVbvIDQdKVNZZB5JfLM7fHyTdyDTfbtolJgfr69g83q7oMojJ_6muGXWm4AMXI97YxlB7MxTF3QhxEaApgneFgbeipQovvbHRURxYH9SOo/s1600/IMG_20141116_144836.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjffPVSxmB7nCI1RHlcNfYsR3G41JuUxxl87fVbvIDQdKVNZZB5JfLM7fHyTdyDTfbtolJgfr69g83q7oMojJ_6muGXWm4AMXI97YxlB7MxTF3QhxEaApgneFgbeipQovvbHRURxYH9SOo/s1600/IMG_20141116_144836.jpg" height="320" width="240" /></a></div>
<br />
<div>
<br /></div>
<h3>
Carrier Gas</h3>
<div>
To keep things simple I propose to use air as the carrier gas, and use a fish tank air pump to push it through the column. Because it is a bit noisy, we made the Arduino and web interface allow you to switch it on and off easily. The pump is mains powered so we used a solid state relay to switch it on and off, and covered the mains connections with plastic to stop us blowing ourselves up with loose wires in the case...</div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjL9bczooZCDCXIfZN-Lk-APz6AQTx-fpi0FM-B_SnBmA_LjRoDThBr04FSkVi1X0NVzoSw0GhhRQcUAzKVxC3uh3hjymCKr5UU3EvP9bDeK3WWxQjCihJSlorJ4PVuD6yWVZGPVhy6mcM/s1600/IMG_20141116_080145.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjL9bczooZCDCXIfZN-Lk-APz6AQTx-fpi0FM-B_SnBmA_LjRoDThBr04FSkVi1X0NVzoSw0GhhRQcUAzKVxC3uh3hjymCKr5UU3EvP9bDeK3WWxQjCihJSlorJ4PVuD6yWVZGPVhy6mcM/s1600/IMG_20141116_080145.jpg" height="240" width="320" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The air pump with sample injector syringe.</td></tr>
</tbody></table>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiJnLz-SjSnv3rXitnArryA2iJ7CJcVSu0sCchP-hnFYqkpWUwfIAUlkyNQx4S8LTXNnZM1ZognjawMfIiVEcX_fL6zdfK8yA3fJK8clxjv_M_DFYmTRca98IbzBifqbSh58PjEG_MY0g0/s1600/IMG_20141227_202217.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiJnLz-SjSnv3rXitnArryA2iJ7CJcVSu0sCchP-hnFYqkpWUwfIAUlkyNQx4S8LTXNnZM1ZognjawMfIiVEcX_fL6zdfK8yA3fJK8clxjv_M_DFYmTRca98IbzBifqbSh58PjEG_MY0g0/s1600/IMG_20141227_202217.jpg" height="240" width="320" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Solid state relay mounted in bottom of the case - all the mains connections are covered in clear plastic to avoid them contacting low voltage parts of the equipment.</td></tr>
</tbody></table>
<div>
<br /></div>
<h3>
Oven</h3>
<div>
For the oven we need an insulated case and some heaters. For the case we used the old CD drive case from the computer, because it fits in the computer case neatly:</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjN9VCnJfyjz8i8fiRHUHBsBxeQ5uRr1-0ySQJmwA4SDpLLETBx5U7-h_8VARlbQEpwZYpNpKSruuGc73ZCkrzFSblI3BReTUC1R181fjg-2FxJNwT5K3hyphenhyphenJ_6RPIZ0eRwfmPpaR2nHRy0/s1600/IMG_20141116_142507.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjN9VCnJfyjz8i8fiRHUHBsBxeQ5uRr1-0ySQJmwA4SDpLLETBx5U7-h_8VARlbQEpwZYpNpKSruuGc73ZCkrzFSblI3BReTUC1R181fjg-2FxJNwT5K3hyphenhyphenJ_6RPIZ0eRwfmPpaR2nHRy0/s1600/IMG_20141116_142507.jpg" height="240" width="320" /></a></div>
<div>
<br /></div>
<div>
We added some polystyrene insulation to the top to reduce heat loss, and a bit of bubble wrap to the bottom (we could not get too much in, or there would be no room in the oven...).
<div>
The heater element is an aluminium plate cut to the size of the oven with three resistors bolted to it.</div>
<div>
A power transistor is also mounted in the case to switch the current flow to the resistors. This means that the 12V power supply only has to go to the oven, and we can provide a 5V switching signal from the arduino to control the heater using the transistor:</div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiJ9MYAdRipMQ-68XxeO_wIyy7LEYjMq9BrjjYs761uRCUteLPNzxytdLAbaLuH8vO0Q6OamVgOBsw33-8tFWV6HCDP52asKp59_I8BkEmnwTh_V7TH2TrhdJCm_TZCnrUd02vo_FxiWCs/s1600/IMG_20141121_111049.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiJ9MYAdRipMQ-68XxeO_wIyy7LEYjMq9BrjjYs761uRCUteLPNzxytdLAbaLuH8vO0Q6OamVgOBsw33-8tFWV6HCDP52asKp59_I8BkEmnwTh_V7TH2TrhdJCm_TZCnrUd02vo_FxiWCs/s1600/IMG_20141121_111049.jpg" height="240" width="320" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Heater plate with resistors attached, along with the power transistor that controls the heater current. Note that we had to disconnect the transistor heat sink from the plate, because grounding it to earth switched the transistor on, so we had an over-heat fault on first commissioning - the arduino tried to switch off the heaters, but they continued at full power. At least we proved that we can get the oven to just over 90degC...</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://raw.githubusercontent.com/jones139/arduino-projects/master/gc/oven/circuit_diagrams.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="400" src="https://raw.githubusercontent.com/jones139/arduino-projects/master/gc/oven/circuit_diagrams.jpg" width="282" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Circuit diagrams for the thermistor measurement and the heater control circuit.</td></tr>
</tbody></table>
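As a sanity check on the heater sizing (the post does not give the resistor values, so the 10R figure below is purely hypothetical), the power available from several equal resistors across the 12V rail is easy to estimate:

```python
def heater_power_w(v_supply=12.0, r_each=10.0, n=3):
    """Total power from n equal resistors wired in parallel across the
    supply: each dissipates V^2/R. The 10R resistor value here is a
    hypothetical example - the post does not state what was used."""
    return n * v_supply ** 2 / r_each
```

Three hypothetical 10R resistors on 12V would give about 43W of heating, the sort of figure you would check against the oven's heat losses before choosing parts.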
<h3>
Detector</h3>
<div>
The detector is my part, and is the bit that is holding up the project at the moment!</div>
<div>
<h4>
First Version - heat loss to environment</h4>
My first go was to rely on the gas coming out of the oven being hot, and to look at how much the sample gas cooled compared to pure carrier gas as it passed through some copper tubes:</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEix8gD7chQcdGySmbjgY-VYRFsDeoZ5jmpSQBCyBXPI3d8CkZc85WdVxJtnpmUdrWeAnPh62zfItMsvu30mTZ4K5XdpLwl452-hdbSJS-lWsKyY8ofJTjVNKvvxqszZfjGUWRZ4eBe0yPs/s1600/IMG_20141115_220152.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEix8gD7chQcdGySmbjgY-VYRFsDeoZ5jmpSQBCyBXPI3d8CkZc85WdVxJtnpmUdrWeAnPh62zfItMsvu30mTZ4K5XdpLwl452-hdbSJS-lWsKyY8ofJTjVNKvvxqszZfjGUWRZ4eBe0yPs/s1600/IMG_20141115_220152.jpg" height="240" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
Unfortunately the gas flow rate is so low that the gas has cooled to ambient temperature before it gets to the detector, so I can't measure anything useful and need a re-think.</div>
<h4 style="clear: both; text-align: left;">
Second version - heated constantan wire</h4>
<div class="separator" style="clear: both; text-align: left;">
Next, I tried a hot wire detector - a loop of constantan wire used to heat a thermistor via a constant current source - the temperature rise above ambient should depend on the thermal properties of the gas surrounding it.</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
Here my lack of practice at electronics design let me down - I made a high current source using a trusty (>30 year old) 741 op-amp and a power transistor. </div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjoHxgPJ11NrWPKT8YrFTPj7Bk5t9cVj5WBLyKKC9M7P220vuGNzHCFSv5qMRdQAckxorMCAdfqzu8qM7ed7kHD8-TiGMELMJf1X2LZFILeLe0g5Noo2fXjLKYoC77Uo6n2pTQAdeSL2xk/w415-h553-no/IMG_20150103_194623.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjoHxgPJ11NrWPKT8YrFTPj7Bk5t9cVj5WBLyKKC9M7P220vuGNzHCFSv5qMRdQAckxorMCAdfqzu8qM7ed7kHD8-TiGMELMJf1X2LZFILeLe0g5Noo2fXjLKYoC77Uo6n2pTQAdeSL2xk/w415-h553-no/IMG_20150103_194623.jpg" height="320" width="240" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Arduino, along with 741 and power transistor current source (the sense resistor is the big grey cylinder above the arduino). The things in the crocodile clip are the heated and ambient thermistors.</td></tr>
</tbody></table>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="" style="clear: both; text-align: left;">
Unfortunately I was using a 100R resistor to sense the current, and my loop of constantan is only about 1R. This meant that I put far more power into the sense resistor than into my 'hot' wire - no detectable increase in wire temperature, but smoke and a warming glow from the sense resistor... I replaced it with the more robust resistor shown in the picture above, which acts as a nice room heater, but still gives no measurable heating of the thermistor.</div>
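The failure mode is easy to see with a quick calculation: in a series circuit the same current flows through both resistors, so the power divides in proportion to resistance (the resistances below match the 100R and ~1R figures quoted in the text; the 100mA current is a hypothetical example):

```python
def series_power_split(current_a, r_sense, r_load):
    """Power dissipated in each element of a series pair: P = I^2 * R.
    With a 100R sense resistor in series with a ~1R constantan loop,
    about 99% of the power ends up in the sense resistor, not the wire."""
    return current_a ** 2 * r_sense, current_a ** 2 * r_load

# At a hypothetical 100mA: 1W in the sense resistor, only 10mW in the wire.
p_sense, p_wire = series_power_split(0.1, 100.0, 1.0)
```

Swapping in a sense resistor much smaller than the heater wire (or measuring current some other way) would put the power where it is wanted.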
<div class="" style="clear: both; text-align: left;">
<br /></div>
<div class="" style="clear: both; text-align: left;">
So, I need a higher resistance heater for the thermistor - I think I will dismantle a 12V light bulb next....
<h2 style="clear: both; text-align: left;">
Summary</h2>
<div class="separator" style="clear: both; text-align: left;">
Quite an interesting holiday project, but not finished. </div>
<div class="separator" style="clear: both; text-align: left;">
What went well:</div>
<div class="separator" style="clear: both; text-align: left;">
</div>
<ul>
<li>A working web interface to an arduino temperature controller.</li>
<li>A working web based data logger.</li>
<li>A working oven and switchable pump.</li>
<li>A nice case with a useful power supply.</li>
<li>Laura learned to programme an Arduino and to write javascript web pages.</li>
</ul>
<br />
<div class="separator" style="clear: both; text-align: left;">
What didn't go well:</div>
<div class="separator" style="clear: both; text-align: left;">
</div>
<ul>
<li>The detector!</li>
<li>I am out of practice at electronics design, and mis-judged the heat losses at very low gas flow rates!</li>
</ul>
<br />
<b>Alternative Operating System (Cyanogenmod) on Samsung S4 mini</b> (Graham Jones, 2014-10-26)<br />
<br />
My Samsung S4 mini Android mobile phone works very well, but it keeps running out of internal storage space for applications, so in practice I cannot have many of my own applications on the device.<br />
<br />
I realised this is because the phone came with a lot of applications pre-installed, which keep getting updated, and the updates take up storage space (in addition to the factory installed version, which is not replaced). And I don't use most of the applications that are installed on it - no need for things like Google Maps when you can use <a href="http://osmand.net/">OsmAnd</a> navigation etc., which uses <a href="http://openstreetmap.org/">OpenStreetMap</a> data so is more detailed.<br />
<br />
So tonight I decided to try installing <a href="http://cyanogenmod.org/">cyanogenmod</a>, which is another build of Android that can replace the factory firmware. I found this a bit nerve-wracking because I was doing it as a bit of a 'black box' - download this file, press these buttons etc. There are also several versions of the S4 mini (mine is a GT-I9192, which seems to be less common). If I were doing it on a Windows computer I would be very worried about viruses etc. - I am still nervous about the firmware that I have downloaded - I might try to build it from source another day to give me a bit more confidence.<br />
<br />
The end result is that my phone seems to work, running cyanogenmod 11, which is good.<br />
<br />
Don't treat this as instructions for how to do it - these are just my notes so I can remember.<br />
<br />
<b>Recovery Image</b><br />
The S4 mini has a recovery mode, which seems to be a very small operating system. You need a replacement for this which will let you do more things (like backup your existing firmware before you start anything more serious).<br />
There are a few different alternative recovery systems around, but the one I found that claims to work on an I9192 is called 'Philz', which is a more advanced version of one called 'clockworkmod'.
<br />
I got the latest version of Philz recovery from the link <a href="http://www.theandroidsoul.com/samsung-galaxy-s4-mini-duos-gt-i9192-philz-touch-advanced-cwm-recovery/">here</a>, and loaded it onto the device using the '<a href="https://github.com/Benjamin-Dobell/Heimdall">heimdall</a>' software running on my xubuntu linux laptop (I just used the ubuntu packaged version rather than building from source) - I did this by following the instructions <a href="http://wiki.cyanogenmod.org/w/Install_CM_for_serranoltexx">here</a>.<br />
<br />
It is now possible to boot the phone into recovery mode by pressing the Volume Up, Home and Power buttons when booting.<br />
<br />
<b>Install Cyanogenmod</b><br />
The most worrying part is that you need the version of cyanogenmod that matches your phone (I am not sure what will happen if you don't, but it might take a bit of recovering from...). I searched the internet to find an unofficial version for my phone (GT-I9192), and got the latest version from <a href="https://s.basketbuild.com/filedl/devs?dev=k2wl&dl=k2wl/cm11_i9192/cm-11-20140814-UNOFFICIAL-serranodsdd.zip">here</a>, which is referenced from a post on the <a href="http://forum.xda-developers.com/showthread.php?t=2790789">xda developers forum</a>.<br />
<br />
This went surprisingly smoothly - you can set the recovery program to install a 'zip' image from the sideloader, and send the image using 'adb sideload &lt;image.zip&gt;'.<br />
<br />
Re-booted, and the phone works again - phew!<br />
<br />
<b>Google Apps</b><br />
One issue with the 'stock' cyanogenmod is that it does not include any of the proprietary Google applications - in particular I wanted GMail, Google Plus and the Play Store.<br />
While it is possible to back them up from the factory firmware and then restore them into cyanogenmod, you can get pre-packaged versions on the internet (there may be issues with licensing here, I suspect...), which are packaged as 'gapps' and can be loaded as a 'zip' file the same way as cyanogenmod.<br />
<br />
This now gives me working GMail etc., and I can install other apps like OsmAnd, National Rail etc. using the Play Store.<br />
<br />
Unfortunately I have installed loads of other google apps that I don't really want, which slightly defeats the object of going to an alternative firmware - I might have to look at doing the backup and restore bit myself and being more selective about what I back up....<br />
<br />
So, I think I have got back to a working phone - I'll have to test it a bit this week before I go travelling again and need it more.<br />
<br />
<b>Charity Document Management System</b> (Graham Jones, 2014-09-28)<br />
<span style="font-family: Helvetica Neue, Arial, Helvetica, sans-serif;">After a bit more development of the Document Management System for our Academy Charitable Trust (<a href="http://github.com/jones139/hdms">HDMS</a>), I have now got something working which I think is usable. There may well be some changes once we use it in anger for a while and find some 'features' annoying!</span><br />
<span style="font-family: Helvetica Neue, Arial, Helvetica, sans-serif;"><br /></span>
<br />
<h2>
Background</h2>
<div style="box-sizing: border-box; color: #333333; font-family: 'Helvetica Neue', Helvetica, 'Segoe UI', Arial, freesans, sans-serif; font-size: 16px; line-height: 25.6000003814697px; margin-bottom: 16px;">
HDMS is a <a href="http://en.wikipedia.org/wiki/Document_management_system" style="background-attachment: initial; background-clip: initial; background-image: initial; background-origin: initial; background-position: initial; background-repeat: initial; background-size: initial; box-sizing: border-box; color: #4183c4; text-decoration: none;">Document Management System</a> that has been developed for Hartlepool Aspire Trust (<a href="http://catcoteacademy.co.uk/" style="background-attachment: initial; background-clip: initial; background-image: initial; background-origin: initial; background-position: initial; background-repeat: initial; background-size: initial; box-sizing: border-box; color: #4183c4; text-decoration: none;">Catcote Academy</a>).</div>
<div style="box-sizing: border-box; color: #333333; font-family: 'Helvetica Neue', Helvetica, 'Segoe UI', Arial, freesans, sans-serif; font-size: 16px; line-height: 25.6000003814697px; margin-bottom: 16px;">
It has been developed because the Trust is expected to have many policies to ensure compliance with statutory regulations, and these policies are implemented within the trust using procedures for detailed instructions, and forms to record information.</div>
<div style="box-sizing: border-box; color: #333333; font-family: 'Helvetica Neue', Helvetica, 'Segoe UI', Arial, freesans, sans-serif; font-size: 16px; line-height: 25.6000003814697px; margin-bottom: 16px;">
It is important that the latest versions of the Policies, Procedures and Forms are available to staff and key stakeholders, and that changes between versions can be tracked and communicated to stakeholders so they know what has changed when a new document is issued.</div>
<h2 style="box-sizing: border-box; color: #333333; font-family: 'Helvetica Neue', Helvetica, 'Segoe UI', Arial, freesans, sans-serif; font-size: 16px; line-height: 25.6000003814697px; margin-bottom: 16px;">
User Interface</h2>
<div style="box-sizing: border-box; color: #333333; font-family: 'Helvetica Neue', Helvetica, 'Segoe UI', Arial, freesans, sans-serif; font-size: 16px; line-height: 25.6000003814697px; margin-bottom: 16px;">
HDMS has been developed to store the Trust's documents in a single repository (a web server) and present the latest version of documents to interested parties. Users are initially presented with a graphical summary of the document structure.<br />
<a href="https://github.com/jones139/hdms/raw/v1.2/doc/home_page_screenshot.png" style="background-attachment: initial; background-clip: initial; background-image: initial; background-origin: initial; background-position: initial; background-repeat: initial; background-size: initial; box-sizing: border-box; color: #4183c4; text-decoration: none;" target="_blank"><img alt="Screenshot Image" src="https://github.com/jones139/hdms/raw/v1.2/doc/home_page_screenshot.png" style="border: 0px; box-sizing: border-box; max-width: 100%;" /></a> The user clicks on parts of the graphical summary to search for specific types of documents (such as Financial Procedures, or Human Resources Policies). This gives a list of documents, showing the latest revision number with date of issue, with clickable icons to download either the PDF version or native version of the file.<br />
<a href="https://github.com/jones139/hdms/raw/v1.2/doc/document_list_screenshot.png" style="background-attachment: initial; background-clip: initial; background-image: initial; background-origin: initial; background-position: initial; background-repeat: initial; background-size: initial; box-sizing: border-box; color: #4183c4; text-decoration: none;" target="_blank"><img alt="Screenshot Image" src="https://github.com/jones139/hdms/raw/v1.2/doc/document_list_screenshot.png" style="border: 0px; box-sizing: border-box; max-width: 100%;" /></a>Authorised users have options to create new revisions, or edit existing draft documents.</div>
<div style="box-sizing: border-box; color: #333333; font-family: 'Helvetica Neue', Helvetica, 'Segoe UI', Arial, freesans, sans-serif; font-size: 16px; line-height: 25.6000003814697px; margin-bottom: 16px;">
Draft versions of documents are not publicly visible, but can be viewed by authorised users. Approval and issue of documents is managed by the draft document being sent electronically to reviewers/approvers.<br />
The document is issued and becomes the latest version once all the reviewers/approvers have approved the document.</div>
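The issue rule described above reduces to an all-approvers check; a minimal sketch (the data structure is my illustration, not HDMS's actual schema):

```python
def document_status(approvals):
    """A draft becomes the issued version only once every listed
    reviewer/approver has approved it. 'approvals' maps reviewer
    name -> bool (hypothetical structure); with no reviewers listed
    the document stays a draft."""
    if approvals and all(approvals.values()):
        return "issued"
    return "draft"
```

A single outstanding (or missing) approval keeps the document in draft, which is exactly the gate that stops half-reviewed policies becoming the public version.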
<div style="box-sizing: border-box; color: #333333; font-family: 'Helvetica Neue', Helvetica, 'Segoe UI', Arial, freesans, sans-serif; font-size: 16px; line-height: 25.6000003814697px; margin-bottom: 16px;">
The workflow for creating, revising and approving a document is shown in a set of slides <a href="https://github.com/jones139/hdms/raw/master/doc/HAT_DMS.pdf">here</a>.</div>
<div style="box-sizing: border-box; color: #333333; font-family: 'Helvetica Neue', Helvetica, 'Segoe UI', Arial, freesans, sans-serif; font-size: 16px; line-height: 25.6000003814697px; margin-bottom: 16px;">
<span style="line-height: 25.6000003814697px;">The system stores both 'native' (e.g. MS Word) documents and PDF documents. By default the PDF version is delivered to the public, as this can not be modified accidentally. The system can also store 'extra' files, which may be the source files for drawings or tables of data that are used in the document - this is useful for future updates so the author can obtain all the data used to produce the original document.</span></div>
<h2 style="box-sizing: border-box; color: #333333; font-family: 'Helvetica Neue', Helvetica, 'Segoe UI', Arial, freesans, sans-serif; font-size: 16px; line-height: 25.6000003814697px; margin-bottom: 16px;">
Live Version</h2>
<div style="box-sizing: border-box; color: #333333; font-family: 'Helvetica Neue', Helvetica, 'Segoe UI', Arial, freesans, sans-serif; font-size: 16px; line-height: 25.6000003814697px; margin-bottom: 16px;">
The live version of the system is running at <a href="http://catcotegb.co.uk/hdms" style="background-attachment: initial; background-clip: initial; background-image: initial; background-origin: initial; background-position: initial; background-repeat: initial; background-size: initial; box-sizing: border-box; color: #4183c4; text-decoration: none;">http://catcotegb.co.uk/hdms</a>.</div>
<div style="box-sizing: border-box; color: #333333; font-family: 'Helvetica Neue', Helvetica, 'Segoe UI', Arial, freesans, sans-serif; font-size: 16px; line-height: 25.6000003814697px; margin-bottom: 16px;">
The software is quite general, so it may be of use to other small and medium-sized organisations that wish to manage their documentation in a systematic way. There is a demonstration version of the system available for testing at <a href="http://catcotegb.co.uk/hdms_demo" style="background-attachment: initial; background-clip: initial; background-image: initial; background-origin: initial; background-position: initial; background-repeat: initial; background-size: initial; box-sizing: border-box; color: #4183c4; text-decoration: none;">http://catcotegb.co.uk/hdms_demo</a> (log in as 'user1' with password 'test'). The source code is available in my <a href="https://github.com/jones139/hdms">GitHub repository</a>.</div>
<div style="box-sizing: border-box; color: #333333; font-family: 'Helvetica Neue', Helvetica, 'Segoe UI', Arial, freesans, sans-serif; font-size: 16px; line-height: 25.6000003814697px; margin-bottom: 16px;">
Please let me know if you are interested in using this for your organisation and I will help explain how to set it up, because my installation instructions may not be complete!</div>
Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com0tag:blogger.com,1999:blog-6042401333290389565.post-24482840762285406112014-08-29T22:15:00.001+01:002015-01-04T09:10:51.665+00:00Academy Charitable Trust Document Management SystemLast year our school converted to an <a href="http://catcoteacademy.co.uk/">academy</a>. To help us with the set-up of the administrative side of the new organisation, I set up an electronic document management system to hold our management documents such as policies and procedures.<br />
<br />
The system I set up was a modified version of <a href="https://github.com/jones139/opendocman">OpenDocMan</a>. This has worked pretty well from the point of view of recording the documents and allowing us to retrieve the issued version, but now that we are updating some of the documents and establishing another part of the organisation, we are finding some limitations. The most significant problem is that a document does not appear publicly while it is waiting for approval - I want the latest issued version to remain available even while we are reviewing and approving the new one.<br />
<br />
I decided that rather than modifying my version of <a href="https://github.com/jones139/opendocman">OpenDocMan</a>, it is probably better to write an alternative simple system based on an established software framework.<br />
<br />
The new <a href="https://github.com/jones139/hdms">Hartlepool Aspire Trust Document Management System (HDMS)</a> is based on the <a href="http://cakephp.org/">CakePHP</a> framework, which makes interfacing with the database and handling HTTP requests very simple, and it automatically generates the code for basic database record creation, deletion etc., so I only had to write the 'business' logic.<br />
<br />
The concepts for the new system and workflow are shown in <a href="https://github.com/jones139/hdms/blob/master/doc/HAT_DMS.pdf?raw=true">these slides</a>, and there is a demo installation <a href="http://catcotegb.co.uk/hdms_demo">here</a>.<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://github.com/jones139/hdms/blob/master/doc/HAT_DMS.pdf?raw=true"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhCfeW8-Lkagtq_z1q90TnwvKO_x2geseculPAZfgH1YluyXRnU5NnxP4__FSxujMjx6fOZ8IFXVvLhaP4XRyO44unu0KFDEfFUEIbD6hsEVHtxlmic6bMinVflWs8HSnYZmaliANvDFNc/s1600/HAT_DMS+(6).jpg" height="360" width="640" /></a></div>
Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com0tag:blogger.com,1999:blog-6042401333290389565.post-58322337869214363662014-01-13T22:36:00.003+00:002014-01-13T22:36:24.183+00:00Breathing Detection with Kinect - A working Prototype Seizure Detector!<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjsRUN4bZ34qPdH6qCdBExfEiufbsOqd630rCwdU7rCfsEaK1wVDqJSBfkFopvCl_1jbYHQeE3WH7Mf4WJm3JwFFtvUO-LveiE6qto4Z2LjdWTqyVAejxlzSL8bCeVC9EBKNeH3HcvqwM4/s1600/20140113_160252.jpg" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" height="112" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjsRUN4bZ34qPdH6qCdBExfEiufbsOqd630rCwdU7rCfsEaK1wVDqJSBfkFopvCl_1jbYHQeE3WH7Mf4WJm3JwFFtvUO-LveiE6qto4Z2LjdWTqyVAejxlzSL8bCeVC9EBKNeH3HcvqwM4/s200/20140113_160252.jpg" width="200" /></a>The seizure detector project has come a long way since I started using the Kinect.<br />
I now have a working prototype that monitors breathing and can alarm if the breathing rate is abnormally low. It sends data to our 'bentv' monitors (image right), and has a web interface so I can see what it is doing (image below). It is on soak test now.....<br />
<br />
Details at <a href="http://openseizuredetector.org.uk/">http://openseizuredetector.org.uk</a>.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhj-wnx8ogtD4X9kfboQWZqmdXNHjncSnO-T7CSlRdbIBR2CaaBdN5uk7HZwCifm90YKguQc2A0Rm-5rooIn2itpx3im1fec3KMKaZR5R5x12XlbR2odJM__wYOPlUrk3YC_HDCBg95LHQ/s1600/website_output.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhj-wnx8ogtD4X9kfboQWZqmdXNHjncSnO-T7CSlRdbIBR2CaaBdN5uk7HZwCifm90YKguQc2A0Rm-5rooIn2itpx3im1fec3KMKaZR5R5x12XlbR2odJM__wYOPlUrk3YC_HDCBg95LHQ/s1600/website_output.png" /></a></div>
<br />Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com0tag:blogger.com,1999:blog-6042401333290389565.post-11056721289980715192014-01-05T21:10:00.002+00:002014-01-05T21:14:29.355+00:00Breathing Detection using Kinect and OpenCV - Part 2 - Peak detectionA few days ago I published a <a href="http://nerdytoad.blogspot.co.uk/2014/01/breathing-detection-using-kinect-and.html">post</a> about how I am using a Microsoft Kinect depth camera and the OpenCV image processing library to identify a test subject from a background, and analyse the series of images from the camera to detect small movements.<br />
<br />
The next stage is to calculate the brightness of the test subject at each frame, and turn that into a time series so we can see how it changes with time, and analyse it to detect specific events.<br />
<br />
We can use the OpenCV 'mean' function to work out the average brightness of the test image easily, then append it to the end of an array and trim the first value off the start to keep the length the same.<br />
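As a rough sketch of that step (the function name and buffer length here are my own, not taken from the benfinder source), assuming the masked frame is a single-channel numpy array:

```python
import numpy as np
from collections import deque

SERIES_LEN = 100  # hypothetical length of the rolling time series

series = deque(maxlen=SERIES_LEN)  # maxlen trims the oldest value automatically

def update_series(masked_img):
    """Append the mean brightness of the current frame to the time series."""
    # cv2.mean(masked_img)[0] would give the same value for a single-channel image
    series.append(float(np.mean(masked_img)))
    return series
```

A fixed-length deque does the append-and-trim in one step, which keeps the time series at a constant length as frames arrive.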
The resulting image and time series are shown below:<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj4gwh1dUERcahFtdBnhWzutAwYAnSwXWI3kLBacazWVsCB9XU7B6vkEBFYNzI-O5g6sop3iyKedvDa-KjoA_DRz5ZDgC5ziwma3oFsgHRPjk8YIC3bsDzp_CsKRxhpWSupqAxBIniZjMI/s1600/example_maskedImg.png" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj4gwh1dUERcahFtdBnhWzutAwYAnSwXWI3kLBacazWVsCB9XU7B6vkEBFYNzI-O5g6sop3iyKedvDa-KjoA_DRz5ZDgC5ziwma3oFsgHRPjk8YIC3bsDzp_CsKRxhpWSupqAxBIniZjMI/s320/example_maskedImg.png" width="320" /></a></div>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjrFnMyUaXvJl0oM0YVtfxAZE-t5QBpqrxT5xZBQRxJ_E4O1F04qnfL6mtvbsLxIzLB5d3Y2-tuI6dt0zI92i0N1wfaHeejNSkTZQS9vbJHgt0F0e-1Z96fp6SXI5mI-yQFAvRNfqiqK3Q/s1600/example_chartImg.png" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjrFnMyUaXvJl0oM0YVtfxAZE-t5QBpqrxT5xZBQRxJ_E4O1F04qnfL6mtvbsLxIzLB5d3Y2-tuI6dt0zI92i0N1wfaHeejNSkTZQS9vbJHgt0F0e-1Z96fp6SXI5mI-yQFAvRNfqiqK3Q/s320/example_chartImg.png" width="320" /></a><br />
The image here shows that we can extract the subject from the background quite accurately (this is Benjamin's body and legs as he lies on the floor). The shading is the movement relative to the average position.<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
The resulting time series is shown here - the measured data is the blue spiky line. The red one is the smoothed version (I know I have a half second offset between the two...).<br />
<br />
The red dots are peaks detected using a very simple peak searching algorithm.<br />
The chart clearly shows a 'fidget' being detected as a large peak. There is a breathing event at about 8 seconds that has been detected too.<br />
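The original peak search isn't listed here, but a minimal algorithm in the same spirit might look like the following (the threshold and the neighbour comparison are my guesses at what "very simple" means):

```python
def find_peaks(series, threshold):
    """Return indices of local maxima that exceed a minimum threshold."""
    peaks = []
    for i in range(1, len(series) - 1):
        # a peak is higher than its left neighbour, at least as high as its
        # right neighbour, and above the noise threshold
        if (series[i] > threshold
                and series[i] > series[i - 1]
                and series[i] >= series[i + 1]):
            peaks.append(i)
    return peaks
```

Counting peaks per unit time then gives a breathing-rate estimate directly from the smoothed series.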
<br />
So, the detection system is looking promising - I have had better breathing detection when I was testing it on myself - I think I will have to change the position of the camera a bit to improve sensitivity.<br />
<br />
I have now set up a simple python based web server to allow other applications to connect to this one to request the data.<br />
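The server itself is in the repository; a minimal stand-in using only the standard library (the JSON field names here are invented for illustration) could be:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

latest_data = {"status": "ok", "breathing_rate": 16}  # updated by the analysis loop

class DataHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        """Serve the latest measurement as JSON to any client that asks."""
        body = json.dumps(latest_data).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

def run(port=8080):
    HTTPServer(("", port), DataHandler).serve_forever()  # blocks forever
```

Any other application (such as the 'bentv' monitors) can then poll the URL and parse the JSON.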
<br />
We are getting there. The outstanding issues are:<br />
<br />
<ul>
<li>Memory Leak - after the application has run for 30 min the computer gets very slow and eventually crashes - I suspect a memory leak somewhere - this will have to be fixed!</li>
<li>Optimum camera position - I think I can get better breathing detection sensitivity by altering the camera position - will have to experiment a bit.</li>
<li>Add some code to identify whether we are looking at Benjamin or just noise - at the moment I analyse the largest bright subject in the image and assume that is Benjamin - I should probably have a minimum size limit so it gives up if it cannot see Benjamin.</li>
<li>Summarise what we are seeing automatically - "normal breathing", "can't see Benjamin", "abnormal breathing", "fidgeting" etc.</li>
<li>Modify our monitors that we use to keep an eye on Benjamin to talk to the new web server and display the status messages and raise an alarm if necessary.</li>
</ul>
The code is available <a href="https://github.com/jones139/OpenSeizureDetector/">here</a>.<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com0tag:blogger.com,1999:blog-6042401333290389565.post-72064424414974410992014-01-01T23:50:00.002+00:002014-01-03T23:02:12.944+00:00Breathing Detection using Kinect and OpenCV - Part 1 - Image ProcessingI have had a go at detecting breathing using an Xbox Kinect depth sensor and the OpenCV image processing library.<br />
I have seen a research paper that did breathing detection, but it relied on fitting the output of the Kinect to a skeleton model to identify the chest area to monitor. I would like to take a less computationally intensive route, so am trying to use image processing alone.<br />
<br />
To detect the small movements of the chest during breathing, I am doing the following:<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh_b2eqBduHgUv4191brz_Gz8dCl67KpWwH0FdzPvCdXQTqPXYxRRImVRT6p32ct3J8bbhiume0PT1Cx1bcCtwWBo_mAJxAUeK9O7nrcaPzVw-ghcLp4GUBc6FkebTGQxhfPyfPMl52lC4/s1600/background_depth.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" height="150" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh_b2eqBduHgUv4191brz_Gz8dCl67KpWwH0FdzPvCdXQTqPXYxRRImVRT6p32ct3J8bbhiume0PT1Cx1bcCtwWBo_mAJxAUeK9O7nrcaPzVw-ghcLp4GUBc6FkebTGQxhfPyfPMl52lC4/s200/background_depth.png" width="200" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Start with a background depth image of empty room.</td></tr>
</tbody></table>
<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi4fCvjFE56sv1hWl1-tdZTSvPEinE6CpDhPEg1i2IPYH9-4HR1Wjatg_P1efnwml83qUcRFxKuVskvh-7YKbZ0AVGpo8Vls98E3TaInlZqzenLp-DfOhYxpd9hVD7S5_MsZWPWHeW1b1M/s1600/example_depth.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" height="150" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi4fCvjFE56sv1hWl1-tdZTSvPEinE6CpDhPEg1i2IPYH9-4HR1Wjatg_P1efnwml83qUcRFxKuVskvh-7YKbZ0AVGpo8Vls98E3TaInlZqzenLp-DfOhYxpd9hVD7S5_MsZWPWHeW1b1M/s200/example_depth.png" width="200" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Grab a depth image from kinect</td></tr>
</tbody></table>
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg82QRu5ff7U241MByyjXyGf0e7Xsq13NwSvnLVsCRkMwbV3zQkCcdIaPc9N2HDrl1fPgRKS5b251bgagFXSbDpP-USpAng6hemJbw3hcrpS8V9vcJ0UUwipdeJbLlGYlIHd2fPlE9K8ko/s1600/example_depth_bgsub.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" height="150" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg82QRu5ff7U241MByyjXyGf0e7Xsq13NwSvnLVsCRkMwbV3zQkCcdIaPc9N2HDrl1fPgRKS5b251bgagFXSbDpP-USpAng6hemJbw3hcrpS8V9vcJ0UUwipdeJbLlGYlIHd2fPlE9K8ko/s200/example_depth_bgsub.png" width="200" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Subtract Background so we have only the test subject.</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEin8GKclKhubWNyA8eAdkFL-Vhogsm8xKk8mp5on4-fpR91Zfw0tRvHSovL5utbNIh716Vq9wCPWZvxHSnYz5GKEMeue8JXswCLsaxAw4DAPZtzX6OP0Z6mV98fX0Er-5Fg6uioTNH6XaY/s1600/example_depth_bgsub_autobg.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" height="150" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEin8GKclKhubWNyA8eAdkFL-Vhogsm8xKk8mp5on4-fpR91Zfw0tRvHSovL5utbNIh716Vq9wCPWZvxHSnYz5GKEMeue8JXswCLsaxAw4DAPZtzX6OP0Z6mV98fX0Er-5Fg6uioTNH6XaY/s200/example_depth_bgsub_autobg.png" width="200" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Subtract a rolling average background image, and amplify the resulting small differences - makes image very sensitive to small movements.</td></tr>
</tbody></table>
<br />
<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: right; margin-left: 1em; text-align: right;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiFWtur4G3g5-k81DCrkJX8agkJfbAznbxm2Wsu1LSN9BrwVqqMr4jw5lIf0eJNGo6EJq5vyOU4r7ojrKMPYN_Qd5wvJzVxaQWBxsav846qUkTc56l248QhQLJhy_GemEjvkMzIAwStDs4/s1600/example_breathing_raw.avi" imageanchor="1" style="clear: right; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" height="150" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiFWtur4G3g5-k81DCrkJX8agkJfbAznbxm2Wsu1LSN9BrwVqqMr4jw5lIf0eJNGo6EJq5vyOU4r7ojrKMPYN_Qd5wvJzVxaQWBxsav846qUkTc56l248QhQLJhy_GemEjvkMzIAwStDs4/s200/example_breathing_raw.avi" width="200" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Resulting video shows image brightness changing due to chest movements from breathing.</td></tr>
</tbody></table>
<div>
<br /></div>
<div>
We can calculate the average brightness of the test subject image - the value clearly changes due to breathing movements - job for tomorrow night is to do some statistics to work out the breathing rate from this data.</div>
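The 'subtract a rolling average background and amplify' step above can be sketched as follows (the smoothing factor and gain are illustrative values, not the ones used in benfinder):

```python
import numpy as np

ALPHA = 1.0 / 30  # weight of each new frame: roughly a 30-frame rolling average
GAIN = 20         # amplification applied to the small depth differences

def movement_image(depth_frame, rolling_bg):
    """Update the rolling background and return an amplified difference image."""
    frame = depth_frame.astype(np.float32)
    rolling_bg = (1 - ALPHA) * rolling_bg + ALPHA * frame
    # centre on mid-grey (128) so movement shows up in both directions
    diff = np.clip(GAIN * (frame - rolling_bg) + 128, 0, 255)
    return diff.astype(np.uint8), rolling_bg
```

A stationary scene comes out uniform mid-grey, while small chest movements appear as bright or dark patches.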
<div>
<br /></div>
<div>
The source code of the python script that does this is the 'benfinder' program in the <a href="https://github.com/jones139/OpenSeizureDetector/tree/master/kinect_version">OpenSeizureDetector archive</a>.</div>
Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com1tag:blogger.com,1999:blog-6042401333290389565.post-82329124867609610162013-12-31T20:17:00.000+00:002014-01-03T23:01:45.653+00:00A Microsoft Kinect Based Seizure Detector?<h2>
Background</h2>
I have been trying to develop an epileptic seizure detector for our son on-and-off for the last year. The difficulty is that it has to be non-contact as he is autistic and will not tolerate any contact sensors, and would not lie on a sensor mat etc.<br />
I had a go at a video based version previously, but struggled with a lot of noise, so put it on hold.<br />
<br />
<a href="http://g-ecx.images-amazon.com/images/G/02/uk-videogames/2010/Xbox/kinectsideways-lg.jpg" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" height="132" src="http://g-ecx.images-amazon.com/images/G/02/uk-videogames/2010/Xbox/kinectsideways-lg.jpg" width="200" /></a>At the weekend I read a book "<a href="http://www.packtpub.com/opencv-computer-vision-with-python/book">OpenCV Computer Vision with Python</a>" by Joseph Howse - this was a really good summary of how to combine OpenCV video processing into an application - dealing with separating the user interface from video processing etc. Most significantly he pointed out that it is now quite easy to use a Microsoft Kinect sensor with OpenCV (it looked rather complicated earlier in the year when I looked), so I thought I should give it a go. <br />
<br />
<h2>
Connecting Kinect</h2>
When I saw a Kinect sensor in a second hand gadgets shop on Sunday, I had to buy it and see what it can do.<br />
<br />
The first pleasant surprise that I got was that it came with a power supply and had a standard USB plug on it (I thought I would have to solder a USB plug onto it) - I plugged it into my laptop (Xubuntu 13.10), and it was immediately detected as a Video4Linux webcam - a very good start.<br />
<h2>
System Software</h2>
I installed the <a href="https://github.com/OpenKinect/libfreenect">libfreenect</a> library and its python bindings (I built it from source, but I don't think I had to - there is an ubuntu package python-freenect which would have done it).<br />
<br />
I deviated from the advice in the book here, because the Author suggested using the <a href="http://www.openni.org/">OpenNI</a> library, but this didn't seem to work - looks like they no longer support Microsoft Kinect sensors (suspect it is a licensing issue...). Also the particularly clever software to do skeleton detection (<a href="http://www.openni.org/files/nite/">Nite</a>) is not open source so you have to install it as a binary package, which I do not like. It seems that the way to get OpenNI working with Kinect is to use a wrapper around libfreenect, so I decided to stick with libfreenect.<br />
<br />
The only odd thing is that sometimes I need to access the Kinect as root, after which it works as a normal user - I will think about this later; it must be something to do with udev rules, so it is not a big deal at the moment.<br />
<h2>
BenFinder Software</h2>
<div>
To see whether the Kinect looks promising as a seizure detector, I wrote a small application based on the framework in Joseph Howse's book. I had to modify it to work with libfreenect - basically it is a custom frame grabber.</div>
<div>
The code does the following:</div>
<div>
<ul>
<li>Display video streams from kinect, from either the video camera or the infrared depth camera on the kinect - works! (switch between the two with the 'd' key).</li>
<li>Save an image to disk ('s' key).</li>
<li>Subtract a background image from the current image, and display the resulting image ('b' key).</li>
<li>Record a video (tab key).</li>
</ul>
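The key handling listed above might be dispatched along these lines (the method names on `app` are invented for illustration; the real BenFinder code may be structured differently):

```python
def handle_keypress(key, app):
    """Map the keys listed above to actions on a hypothetical app object."""
    actions = {
        'd': app.switch_camera,          # toggle video / depth stream
        's': app.save_screenshot,        # write current frame to disk
        'b': app.toggle_background_sub,  # subtract stored background image
        '\t': app.toggle_recording,      # start/stop video recording
    }
    action = actions.get(key)
    if action is None:
        return False  # unrecognised key: ignore
    action()
    return True
```

A dispatch table like this keeps the frame-grabbing loop free of long if/elif chains.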
<div>
The code is in my <a href="https://github.com/jones139/OpenSeizureDetector">Open Seizure Detector github repository</a>.</div>
</div>
<div>
<br /></div>
<div>
The idea is that it should be able to distinguish Benjamin from the background reliably, so we can then start to analyse his image to see if his movements seem odd (those who know Benjamin will know that 'odd' is a bit difficult to define for him!).</div>
<h2>
Output</h2>
<div>
I am very pleased with the output - it looks like it could work - a few images:</div>
<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhsrVay1On1eue9KYquXiOFQukm-m-Jn1WU-PuhTXSpbJCuSIpWYvUda7v45tyGICymghh33WSvJ_T4XTE87mJDC7j2daITShvT7sJiHUC0g_lWY25glg-_zg0w-LvNtMc2kcC-OzTg_jQ/s1600/background_video.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" height="300" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhsrVay1On1eue9KYquXiOFQukm-m-Jn1WU-PuhTXSpbJCuSIpWYvUda7v45tyGICymghh33WSvJ_T4XTE87mJDC7j2daITShvT7sJiHUC0g_lWY25glg-_zg0w-LvNtMc2kcC-OzTg_jQ/s400/background_video.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Output from Kinect Video Camera (note the clutter to make detection difficult!)</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgczkGZguA8Mp4-VBo324rok2j4hPYHADYIRf0TxYUGXUt_0aYMWSgyGpU12erO5nGRWEZkgoTT79Eex1-59om9Tz8n19zZ5-oebp65osGOJcyUx8pEdG9J8r3B-roejPR4c6LkH2ioYsM/s1600/background_depth.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="300" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgczkGZguA8Mp4-VBo324rok2j4hPYHADYIRf0TxYUGXUt_0aYMWSgyGpU12erO5nGRWEZkgoTT79Eex1-59om9Tz8n19zZ5-oebp65osGOJcyUx8pEdG9J8r3B-roejPR4c6LkH2ioYsM/s400/background_depth.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Kinect Depth Camera Output - Note black hole created by open door.</td></tr>
</tbody></table>
<br />
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEge_Bj7sDZhulmCMm3Y_M6sX7KpaNsCcp1Vv-FEg1_8dn1YjlnNYZ5hr8s7pQ6DGlGws659CMpyNfgJH3aWN0hxRGmMVgRbJPqaSnMnRe7FFmwhH4Y9-oOcGHmato2TKEwpfgtBC1EREr0/s1600/example_output.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="300" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEge_Bj7sDZhulmCMm3Y_M6sX7KpaNsCcp1Vv-FEg1_8dn1YjlnNYZ5hr8s7pQ6DGlGws659CMpyNfgJH3aWN0hxRGmMVgRbJPqaSnMnRe7FFmwhH4Y9-oOcGHmato2TKEwpfgtBC1EREr0/s400/example_output.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Depth Camera Output with background image subtracted - note that the subject stands out quite clearly.</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhKD_cg9kO1PdOumMZ4CICh0vBYDaV-c45O4lVKgpLEIZ74R8lYeZqj-aa5fFtvHY0d5omzS-tTsVyo-bfSUJ-HWq-FtYZ1iNuBr87IS0nZovIiyWjsX-eQ1Fh52g0s5yxKPYSqRRmqmh4/s1600/example.avi" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="300" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhKD_cg9kO1PdOumMZ4CICh0vBYDaV-c45O4lVKgpLEIZ74R8lYeZqj-aa5fFtvHY0d5omzS-tTsVyo-bfSUJ-HWq-FtYZ1iNuBr87IS0nZovIiyWjsX-eQ1Fh52g0s5yxKPYSqRRmqmh4/s400/example.avi" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Example of me trying to do Benjamin-like behaviours to see if I can be detected.</td></tr>
</tbody></table>
<h2>
Conclusion & What Next</h2>
<div>
Background subtraction from the depth camera makes the test subject stand out nice and clearly - should be quite easy to detect him computationally.</div>
<div>
The next stage is to see if the depth camera is sensitive enough to detect breathing (when lying still) - I will try subtracting each image from the average of the last 30 or so, and amplifying the differences to see if the movement can be seen.</div>
<div>
If that fails, I will look at <a href="https://github.com/joaquimrocha/Skeltrack">Skeltrack</a> to fit a body model to the images and analyse movement of limbs (but this will be much more computationally costly).</div>
<div>
Then I will have to look at infrastructure to deploy this - I will either need a powerful computer in Benjamin's room to interface with the Kinect and do the analysis, or maybe use a Raspberry Pi to interface with the kinect and serve the depth camera output as a video stream.</div>
<div>
<br /></div>
<div>
Looking promising - will add another post with the breathing analysis in the new year...</div>
Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com0tag:blogger.com,1999:blog-6042401333290389565.post-19739809708711534672013-12-05T20:44:00.002+00:002013-12-05T20:51:10.617+00:00Using a Kobo Ebook Reader as a Gmail Notifier<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiJKxrV_EkhVQ-nvMCRMIte8sfL78L-w8eiH0LuaCJ_CCCdS-y_Am1ZcdctAA1wWvlsykUzXBTfJAkR0slcXdced17WhHb4fAuLNAGQYPRXfCnBkdY7leoALUVnc5NCj9zO4AxJxIAm9_Q/s1600/20131205_200441.jpg" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiJKxrV_EkhVQ-nvMCRMIte8sfL78L-w8eiH0LuaCJ_CCCdS-y_Am1ZcdctAA1wWvlsykUzXBTfJAkR0slcXdced17WhHb4fAuLNAGQYPRXfCnBkdY7leoALUVnc5NCj9zO4AxJxIAm9_Q/s320/20131205_200441.jpg" width="180" /></a>A certain person that I know well does not read her emails very often and sees it as a chore to switch on the computer to see if she has any. And no, I can't interest her in a smartphone that will do email for her....This post is about making a simple device to hang on the wall like a small picture next to the calendar so she can always see if she has emails to know if it is worth putting the computer on.<br />
<br />
I was in WH Smith the other day and realised that they were selling <a href="http://www.whsmith.co.uk/dept/kobo-ereaders/home#kobo-ereaders">Kobo Mini e-book readers</a> for a very good price (under £30). When you think about it, the reader is a small battery-powered computer with a wifi interface and a 5" e-ink touch screen. This sounds like just the thing to hang on the wall and use to display the number of unread emails.<br />
<br />
Fortunately some clever people have worked out how to modify the software on the device - it runs linux and the manufacturers have published the open source part of the device firmware (<a href="https://github.com/kobolabs/Kobo-Reader">https://github.com/kobolabs/Kobo-Reader</a>). I haven't done it myself, but someone else has compiled python to run on the device and use the pygame library to handle writing to the screen (<a href="http://www.mobileread.com/forums/showthread.php?t=219173">http://www.mobileread.com/forums/showthread.php?t=219173</a>). Note that I needed this later build of python to run on my new kobo mini as some of the other builds that are available crashed without any error messages - I think this is to do with the version of some of the c libraries installed on the device.<br />
Finally someone called Kevin Short wrote a programme to use a kobo as a weather monitor, which is very similar to what I am trying to do and was a very useful template to start from - thank you, Kevin! (<a href="http://www.mobileread.com/forums/showthread.php?t=194376">http://www.mobileread.com/forums/showthread.php?t=194376</a>).<br />
<br />
The steps I followed to get this working were:<br />
<br />
<ul>
<li>Enable telnet and ftp access to the kobo (<a href="http://wiki.mobileread.com/wiki/Kobo_Touch_Hacking">http://wiki.mobileread.com/wiki/Kobo_Touch_Hacking</a>)</li>
<li>Put python on the 'user' folder of the device (/mnt/onboard/.python).</li>
<li>Extend the LD_LIBRARY_PATH in /etc/profile to point to the new python/lib and pygame library directories.</li>
<li>Add 'source /etc/profile' into /etc/init.d/rcS so that we have access to the python libraries during boot-up.</li>
<li>Prevented the normal kobo software from starting by commenting out the lines that start the 'hindenburg' and 'nickel' applications in /etc/init.d/rcS.</li>
<li>Killed the boot-up animation screen by adding the following into rcS:<br /> killall on-animator.sh<br /> sleep 1</li>
<li>Added my own boot-up splash screen by adding the following to rcS:<br /> cat /etc/images/SandieMail.raw | /usr/local/Kobo/pickel showpic </li>
<li>Enabled wifi networking on boot up by referencing a new script /etc/network/wifiup.sh in rcS, which contains:<br /> insmod /drivers/ntx508/wifi/sdio_wifi_pwr.ko<br /> insmod /drivers/ntx508/wifi/dhd.ko <br /> sleep 2 <br /> ifconfig eth0 up <br /> wlarm_le -i eth0 up <br /> wpa_supplicant -s -i eth0 -c /etc/wpa_supplicant/wpa_supplicant.conf -C /var/run/wpa_supplicant -B <br /> sleep 2 <br /> udhcpc -S -i eth0 -s /etc/udhcpc.d/default.script -t15 -T10 -A3 -f -q</li>
<li>Started my new gmail notifier program using the following in rcS:<br /> cd /mnt/onboard/.apps/koboGmail<br /> /usr/bin/python gmail.py > /mnt/onboard/gmail.log 2>&1 &</li>
</ul>
<div>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhxwefCIe2KDUcVXhzMRKyD_10yH1vXHhYltFFY0JYIxRiDEkePQzYKj0UERCgoCVF1GW1HjrF1unnKaLUNqgoEuIo22ef1_fog65nQyKtH9kuf8MUZgJp3lY4qq6291mKhCf0qZC8wI5U/s1600/20131205_200441.jpg" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhxwefCIe2KDUcVXhzMRKyD_10yH1vXHhYltFFY0JYIxRiDEkePQzYKj0UERCgoCVF1GW1HjrF1unnKaLUNqgoEuIo22ef1_fog65nQyKtH9kuf8MUZgJp3lY4qq6291mKhCf0qZC8wI5U/s640/20131205_200441.jpg" width="360" /></a>The actual python program to do the logging is quite simple - it uses the pygame program to write to a framebuffer screen, but uses a utility called 'full_update' that is part of the kobo weather project to update the screen. The program does the following:</div>
<div>
<ul>
<li>Get the battery status, and create an appropriate icon to show battery state.</li>
<li>Get the wifi link status and create an appropriate icon to show the link state.</li>
<li>Get the 'atom' feed of the user's gmail account using the url, username and password stored in a configuration file.</li>
<li>Draw the screen image showing the number of unread emails, and the sender and subject of the first 10 unread mails, and render the battery and wifi icons onto it.</li>
<li>Update the kobo screen with the new image.</li>
<li>Wait a while (5 seconds at the moment for testing, but will make it longer in the future - 5 min would probably be plenty).</li>
<li>Repeat indefinitely.</li>
</ul>
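The feed-parsing step in the loop above can be sketched in a few lines of Python. This is a minimal illustration, not the code from the repository: it assumes the Gmail atom feed's old atom 0.3 namespace, and just pulls out the unread count and the first few sender/subject pairs (fetching the feed itself is an authenticated request against the URL stored in the configuration file):

```python
import xml.etree.ElementTree as ET

# namespace used by the Gmail atom feed (an assumption - check your feed's xmlns)
NS = {"a": "http://purl.org/atom/ns#"}

def parse_feed(atom_xml):
    """Return (unread_count, [(sender, subject), ...]) from a Gmail atom feed."""
    root = ET.fromstring(atom_xml)
    count = int(root.findtext("a:fullcount", default="0", namespaces=NS))
    mails = []
    for entry in root.findall("a:entry", NS):
        sender = entry.findtext("a:author/a:name", default="?", namespaces=NS)
        subject = entry.findtext("a:title", default="(no subject)", namespaces=NS)
        mails.append((sender, subject))
    return count, mails[:10]   # only the first 10 get drawn on the screen
```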
<div>
The source code is in my <a href="https://github.com/jones139/koboProjects">github repository</a>.</div>
<div>
<br /></div>
<div>
The resulting display is pretty basic, but functional as shown in the picture.</div>
</div>
<div>
<br /></div>
<h2>
Things to Do</h2>
<div>
There are a few improvements I would like to make to this:</div>
<div>
<ol>
<li>Make it less power intensive by switching off wifi when it is not needed (it can flatten its battery in about 12 hours so will need to be plugged into a mains adapter at the moment).</li>
<li>Make it respond to the power switch - you can switch it off by holding the power switch across for about 15 seconds, but it does not shut down nicely - no 'bye' display on the screen or anything like that - it just freezes.</li>
<li>Get it working as a usb mass storage device again - it does usb networking at the moment instead, so you have to use ftp to update the software or log in and use vi to edit the configuration files - not user friendly.</li>
<li>Make it respond to the touch screen - I will need to interpret the data that appears in /dev/input for this. The python library evdev should help with interpreting the data, but it uses native c code so I need a cross compiler environment for the kobo to use that, which I have not set up yet. Might be as easy to code it myself as I will only be doing simple things.</li>
<li>Get it to flash its LED to show that there are unread emails - might have to modify the hardware to add a bigger LED that faces the front rather than top too.</li>
<li>Documentation - if anyone wants to get this working themselves, they will need to put some effort in, because the above is a long way off being a tutorial. It should be possible to make a kobo firmware update file that would install it if people are interested in trying though.</li>
</ol>
</div>
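For item 4, the kernel delivers touch data as fixed-size input_event structures (a timeval, then a 16-bit type, a 16-bit code and a 32-bit value), so a small pure-Python reader may well be enough without cross-compiling evdev. A sketch under that assumption - the device node name is hypothetical, and struct.calcsize handles the host's long size:

```python
import struct

# struct input_event { struct timeval time; __u16 type; __u16 code; __s32 value; }
EVENT_FMT = "llHHi"                    # 16 bytes on a 32-bit ARM kernel
EVENT_SIZE = struct.calcsize(EVENT_FMT)

def parse_event(buf):
    """Decode one raw input_event into a dict."""
    sec, usec, etype, code, value = struct.unpack(EVENT_FMT, buf)
    return {"time": sec + usec / 1e6, "type": etype, "code": code, "value": value}

def read_events(path="/dev/input/event1"):   # hypothetical node for the touchscreen
    """Yield decoded events from a device node until it closes."""
    with open(path, "rb") as f:
        while True:
            buf = f.read(EVENT_SIZE)
            if len(buf) < EVENT_SIZE:
                return
            yield parse_event(buf)
```

Type 3 with code 53 would then be an EV_ABS multitouch X position, for example (again, worth checking against the kernel headers for this device).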
<div>
<br /></div>
<div>
<br /></div>
Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com1tag:blogger.com,1999:blog-6042401333290389565.post-31053689233725594842013-10-22T21:35:00.001+01:002013-10-22T21:35:41.183+01:00Raspberry Pi and ArduinoI am putting together a data logger for the biogas generator.<br />
<br />
I would like it networked so I don't have to go out in the cold, so will use a raspberry pi. To make interfacing the sensors easy I will connect the Pi to an Arduino microcontroller. This is a bit over the top as I should be able to do everything I need using the Pi's GPIO pins, but Arduino has a lot of libraries to save me programming....<br />
<br />
To get it working I installed the following packages using:<br />
<blockquote class="tr_bq">
apt-get install gcc-avr avr-libc avrdude arduino-core arduino-mk</blockquote>
<br />
To test it, copy the Blink.ino sketch from /usr/share/arduino/examples/01.Basics/Blink/ to a user directory.<br />
Then create a Makefile in the same directory that has the following contents:<br />
<blockquote class="tr_bq">
ARDUINO_DIR = /usr/share/arduino<br />TARGET = Blink<br />ARDUINO_LIBS =<br />BOARD_TAG = uno<br />ARDUINO_PORT = /dev/ttyACM0<br />include /usr/share/arduino/Arduino.mk</blockquote>
<div>
Then just do 'make' to compile it, then upload to the arduino (in this case a Uno) using:</div>
<blockquote class="tr_bq">
avrdude -F -V -p ATMEGA328P -c arduino -P/dev/ttyACM0 -U build-cli/Blink.hex</blockquote>
<div>
The LED on the Arduino Uno starts to blink - success!</div>
<br />Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com0tag:blogger.com,1999:blog-6042401333290389565.post-60529502934954645522013-10-19T21:50:00.002+01:002013-10-19T22:01:25.120+01:00Small Scale Biogas GeneratorI heard on the radio last week that some farmers are using anaerobic digesters to produce methane-rich biogas from vegetable waste.<br />
This got me wondering if we could use our domestic waste to produce usable fuel gas - maybe to heat the greenhouse or something similar.<br />
<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjoPGcL8Qhza0n3zVU5_Wb9y67OUymqAevPJCpJxsPCR5x9rxSItkWSFr89uWTP2p9dTuDWmGJy4apY9rjqFsJ0wHG8qaDoDHn68mqS99IEqwr9zbg4fojReGUGLwES_jE0AjqDIwfJ65k/s1600/20131019_154034.jpg" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" height="180" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjoPGcL8Qhza0n3zVU5_Wb9y67OUymqAevPJCpJxsPCR5x9rxSItkWSFr89uWTP2p9dTuDWmGJy4apY9rjqFsJ0wHG8qaDoDHn68mqS99IEqwr9zbg4fojReGUGLwES_jE0AjqDIwfJ65k/s320/20131019_154034.jpg" width="320" /></a>I thought I would make a small scale experimental digester to see if it works, and what amount of gas it makes, to see if it is worth thinking about something bigger.<br />
<br />
My understanding is that the methane producing bacteria work best at over 40 degC, so I will heat the digester. I will do this electrically for the experimental set up because it is easy, and I can measure the energy consumption easily that way.<br />
<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiVY9dBtV27CRej0174kj3xRsT5dVJ8MSdPJDueN1968R8WwgzUKCEVyzmkqKdfZPI-dxeOS-FXumrYK7OzyEJ7NGbPuZY3YD7mspis_tq0p9kPyK83PoU7cD33nV1fq9P4uRjO7Z2er5c/s1600/20131019_160946.jpg" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiVY9dBtV27CRej0174kj3xRsT5dVJ8MSdPJDueN1968R8WwgzUKCEVyzmkqKdfZPI-dxeOS-FXumrYK7OzyEJ7NGbPuZY3YD7mspis_tq0p9kPyK83PoU7cD33nV1fq9P4uRjO7Z2er5c/s320/20131019_160946.jpg" width="180" /></a>I am using a 25 litre fermentation vessel for the digester - I got one with a screw on cap rather than a bucket so I can run it at slightly elevated pressure if it starts to make gas.<br />
For simplicity I got a 1 m2 electric underfloor heating blanket to heat the vessel. I will use an electro-mechanical thermostat as a protection device in case the electronic temperature controller I will produce loses its marbles and tries to melt the vessel.<br />
<br />
<br />
To start with I just wrapped the blanket around the vessel.<br />
<br />
But before I tested it I realised that this approach is no good - the vessel will not be full of liquid, so I do not want the heating element all the way up the sides.<br />
<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEizJC3Y5LMgSSAuqBxGS0pi9FnmNe5GZLUg5C_xMa9r3SemiGKCW-CnOK2m83mBXOudAvKZafp3Y4nKof9wSilBXdMU_T8rrafQvTNfL9VbRxHtuOQaWWKhmzbksRTya8Qczd8WmB9hsHs/s1600/20131019_192234.jpg" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEizJC3Y5LMgSSAuqBxGS0pi9FnmNe5GZLUg5C_xMa9r3SemiGKCW-CnOK2m83mBXOudAvKZafp3Y4nKof9wSilBXdMU_T8rrafQvTNfL9VbRxHtuOQaWWKhmzbksRTya8Qczd8WmB9hsHs/s320/20131019_192234.jpg" width="180" /></a><br />
<br />
<br />
<br />
<br />
<br />
<br />
So, I removed the heating element from the underfloor heating mat, and wrapped it around the bottom of the vessel instead.<br />
<span style="text-align: center;"><br /></span>
<span style="text-align: center;"><br /></span>
<span style="text-align: center;"><br /></span>
<span style="text-align: center;"><br /></span>
<span style="text-align: center;"><br /></span>
<span style="text-align: center;"><br /></span>
<span style="text-align: center;"><br /></span>
<span style="text-align: center;"><br /></span>
<span style="text-align: center;"><br /></span>
<span style="text-align: center;"><br /></span>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjJHILPUTHioXiUy3O3nWCjgZqQi6ynXwWNe0hFf4dqzmVTXo-UPY3SL7RYRqaeJE1WWuJCa3V0VIwiNwHuFXmHi1WnB8E0NHMYGlPu2Jx0qkIOlV0rLEKJLKvF09W89F-txuaBsb43rHM/s1600/20131019_203309.jpg" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjJHILPUTHioXiUy3O3nWCjgZqQi6ynXwWNe0hFf4dqzmVTXo-UPY3SL7RYRqaeJE1WWuJCa3V0VIwiNwHuFXmHi1WnB8E0NHMYGlPu2Jx0qkIOlV0rLEKJLKvF09W89F-txuaBsb43rHM/s400/20131019_203309.jpg" width="225" /></a><span style="text-align: center;"></span><br />
<span style="text-align: center;"><br /></span>
<span style="text-align: center;"><br /></span>
<span style="text-align: center;"><br /></span>
To improve heat transfer between the heating element and the vessel, I pushed as much silicone grease as I could get in around the element wires, then wrapped it in gaffer tape to make sure it all held together and I don't get covered in grease:<br />
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
It is looking promising now - the element gets warm, and the thermostat trips it out when it starts to get hot. The dead band on the thermostat is too big to be useful for this application (it is about 10 degC), so I will just use that as an over-heat protection device, and use an Arduino microcontroller to control and log the temperature.</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
To get the proof of concept prototype working, I think I need to:</div>
<div class="separator" style="clear: both; text-align: left;">
</div>
<ul>
<li>Sort out a temperature controller - will use an arduino and a solid state relay to switch the heater elements on and off.</li>
<li>Gas Handling - I will need to do something with the gas that is generated, while avoiding blowing up the house or garage - I have seen <a href="http://www.re-energy.ca/biogas-generator">somewhere</a> where they recommend using an aluminised mylar balloon, which sounds like a good idea if I can find one.</li>
<li>Gas Composition Measurement - I will need to find out the proportion of methane to carbon dioxide that I am generating - still not sure how to do that. It would be possible with a tunable IR laser diode, but not sure if that is feasible without spending real money. Any suggestions appreciated!</li>
<li>Gas volume measurement - the other thing I am interested in is how much gas is generated - not sure how best to measure very low gas flow rates. I am wondering about modifying a U-bend type airlock to detect how many bubbles pass through - maybe detect the water level changing before the bubble passes through.</li>
</ul>
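The temperature controller in the first item is essentially bang-bang control with some hysteresis so the relay doesn't chatter around the setpoint. The logic can be sketched like this (in Python for clarity - the real thing would be Arduino code, and the 40 degC setpoint and 1 degC deadband are illustrative values, not tested ones):

```python
SETPOINT = 40.0   # degC - target digester temperature (illustrative)
DEADBAND = 1.0    # degC - hysteresis to stop the relay chattering

def heater_demand(temp_c, heater_on):
    """Bang-bang control with hysteresis: return the new heater state."""
    if temp_c < SETPOINT - DEADBAND:
        return True           # too cold - switch the element on
    if temp_c > SETPOINT + DEADBAND:
        return False          # too hot - switch it off
    return heater_on          # inside the deadband - leave it as it is
```

The mechanical thermostat then only ever trips if this logic (or the solid state relay) fails with the heater stuck on.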
<div>
If this looks feasible, the next stages of development would be:</div>
<div>
<ul>
<li>Automate gas handling to use the gas generated to heat the digester - success would be making it self-sustaining so that it generated enough gas to keep itself warm. That would mean scaling it up would produce excess gas that I could use for something else.</li>
<li>Think about how far I can scale it up - depends on what fuel to use - kitchen and 'soft' garden waste is limited, so might have to look for something else....</li>
</ul>
<div>
Will post an update when I get it doing something.</div>
</div>
<br />
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<br />Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com0tag:blogger.com,1999:blog-6042401333290389565.post-47618525551911814732013-10-05T22:13:00.000+01:002013-10-05T22:13:06.717+01:00Using Raspberry Pi as an IP Camera to Analogue ConverterI have an old-fashioned analogue TV distribution system in our house. We use it for a video monitor for our disabled son so we can check he is ok.<br />
The quality of the analogue camera we use is not good, but rather than getting a new analogue one, I thought I should really get into digital IP cameras.<br />
I have had quite a nice IP camera with decent infra-red capabilities for a while (a <a href="http://www.y-cam.com/">Ycam Knight</a>). You can view the images and hear the audio on a computer, but it is not as useful as it working on the little portable flat panel TVs we have installed in a few rooms for the old analogue camera.<br />
<br />
I am trying an experiment using a raspberry Pi to take the audio and video from the IP camera, and convert it to analogue signals so my old equipment can be used to view it.<br />
<br />
What we have is:<br />
<br />
<ul>
<li>IP Camera connected to home network.</li>
<li>Raspberry Pi connected to same network.</li>
<li>Analogue video and audio signals from Pi connected to an RF modulator, which is connected to our RF distribution system.</li>
</ul>
Using this I can tune the TVs on the RF distribution system to view the Raspberry Pi output.<br />
<br />
I set up the Pi to view the audio and video streams from the IP camera by using the omxplayer video player, which is optimised for the Pi. I added the following to /etc/rc.local:<br />
<blockquote class="tr_bq">
omxplayer rtsp://192.168.1.18/live_mpeg4.sdp &</blockquote>
Now when the Pi boots, it displays the video from the IP camera on its screen, which is visible to other monitors via the RF modulator.<br />
<br />
My concern is how reliable this will be - I tried earlier in the year and the Pi crashed after a few weeks with a completely mangled root filesystem, which is no good at all. This time I am using a new Pi and new SD card for the filesystem, so I will see how long it lasts.<br />
<br />Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com0tag:blogger.com,1999:blog-6042401333290389565.post-12072323406099068202013-09-22T16:35:00.002+01:002013-09-22T16:35:24.576+01:00Human Power MetersI have just done a triathlon with my disabled son, Benjamin (<a href="http://team-bee.blogspot.co.uk/">team-bee.blogspot.co.uk</a>)<br />
<br />
While we were training I started to try to calculate the energy requirements for the event, because I was worried about running out of glycogen before the end. Most of the calculation methods cannot take account of the weather - especially wind - so I am starting to wonder how to make a power meter for our bike. I can either go for strain gauges in the cranks, which is likely to be difficult mechanically, or I am wondering if I can just use my heart rate.<br />
I have just got a Garmin 610 sports watch with heart rate monitor. It uses a wireless protocol called 'ant'. I'll have to look at how good heart rate is as a surrogate for power output. <br />
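Once there is some calibration data, turning heart rate into an estimated power is just a curve fit. A toy sketch, assuming a roughly linear heart-rate/power relationship over the aerobic range (all the numbers here are invented, not measurements):

```python
def fit_hr_to_power(hr, power):
    """Least-squares line power = a*hr + b from gym calibration data."""
    n = len(hr)
    mean_hr = sum(hr) / n
    mean_p = sum(power) / n
    a = (sum((h - mean_hr) * (p - mean_p) for h, p in zip(hr, power))
         / sum((h - mean_hr) ** 2 for h in hr))
    b = mean_p - a * mean_hr
    return a, b

def estimate_power(hr, a, b):
    """Estimated power (watts) at a given heart rate (bpm)."""
    return a * hr + b

# hypothetical calibration points from an ergometer session
cal_hr = [100, 120, 140, 160]      # bpm
cal_watts = [100, 150, 200, 250]   # measured work rate
a, b = fit_hr_to_power(cal_hr, cal_watts)
```

Integrating the estimated power over a logged ride would then give total work done, which is what matters for the glycogen question.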
I may have to go to a gym to calibrate myself against a machine that measures work done...a winter project I think!Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com0tag:blogger.com,1999:blog-6042401333290389565.post-59408978778198296762013-03-24T23:02:00.001+00:002013-03-24T23:10:52.560+00:00Further Development of Video Based Seizure DetectorI have made a bit more progress with the video based <a href="http://nerdytoad.blogspot.co.uk/search/label/Seizure_Detector">epileptic seizure detector</a>.<br />
<br />
Someone on the OpenCV Google Plus page suggested that I look at the Lucas-Kanade feature tracking algorithm, rather than trying to analyse all of the pixels at once like I was doing.<br />
<br />
This looks quite promising. First you have to decide which features in the image to use - corners are good for tracking. OpenCV has a neat cv.GoodFeaturesToTrack function which makes suggestions - you give it a couple of parameters, including a 'quality' parameter to help it choose. This gives a list of (x,y) coordinates of the good features to track. Note that this means 'good' mathematically, not necessarily the limbs of the test subject....<br />
<br />
Once you have some features to track, OpenCV again provides a function, cv.CalcOpticalFlowPyrLK: you give it the list of features, the previous image and a new image, and it calculates the locations of those features in the new image.<br />
<br />
I have then gone into the fourier analysis that I have been trying for the other types of seizure detection. This time I calculate the speed of each feature over a couple of seconds, and record this as a time series, then calculate the fourier transform to give the frequency spectrum of the motion. If there is oscillation above a threshold amplitude in a given frequency band for a specified time we raise an alarm as a possible seizure.<br />
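The speed/spectrum step can be sketched with stdlib Python alone (the project code works on the OpenCV tracker output; the frame rate here is an assumption, and the DFT is a slow stand-in for a real FFT):

```python
import cmath
import math

FPS = 15.0   # camera frame rate in frames per second (assumed)

def dft_amplitudes(samples):
    """Magnitude of each DFT bin - a slow O(n^2) stand-in for a real FFT."""
    n = len(samples)
    return [abs(sum(s * cmath.exp(-2j * math.pi * k * t / n)
                    for t, s in enumerate(samples))) / n
            for k in range(n // 2 + 1)]

def speeds(track):
    """Frame-to-frame speed of one tracked feature from its (x, y) positions."""
    return [math.hypot(x2 - x1, y2 - y1)
            for (x1, y1), (x2, y2) in zip(track, track[1:])]

def peak_frequency(track):
    """Dominant oscillation frequency (Hz) of a feature's speed series."""
    amps = dft_amplitudes(speeds(track))
    k = max(range(1, len(amps)), key=lambda i: amps[i])  # skip the DC bin
    return k * FPS / len(speeds(track))
```

One thing this sketch makes obvious: speed is an unsigned magnitude, so a point whose position oscillates at f Hz produces a speed signal whose fundamental is at 2f Hz (a 1.5 Hz wobble peaks at 3 Hz here), which is at least part of a factor-of-two offset in the detected frequency.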
<br />
The code is functioning, but is a fair way off being operational yet. The code for this is in my OpenSeizureDetector github repository (<a href="https://github.com/jones139/OpenSeizureDetector">https://github.com/jones139/OpenSeizureDetector</a>).<br />
<br />
The current issues are:<br />
<br />
<ul>
<li>I really want to track motion of limbs, but there is no guarantee that cv.GoodFeaturesToTrack will detect these as good features - I can make this more likely by attaching reflective tape, which glows under IR illumination from the night vision camera...if I can persuade Benjamin to wear it.</li>
<li>There is something wrong with the frequency calculation still - I can understand a factor of two, but it seems a bit more than that.</li>
<li>If the motion is too quick, it loses the point, so I have to set it to re-initialise using GoodFeaturesToTrack periodically.</li>
<li>An Example of it working with my daughter doing Benjamin-like behaviour is shown below. Red circles are drawn around points if a possible seizure is detected.</li>
</ul>
<div>
<iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='320' height='266' src='https://www.blogger.com/video.g?token=AD6v5dxfZ8hzdDHWMOVkj9oz2gx5BHtgYTS_q42zld5XRwzQ-X5q035HFpzyHbriQhFJ1JvgAKTLy8DQPf_DPkd2Wg' class='b-hbp-video b-uploaded' frameborder='0'></iframe></div>
<div>
<ul>
<li><span style="text-align: left;">This does not look too good - lots of points detected, and even the reflective strips on the wrists and ankles get lost. It seems to work better in darkness though, where I get something like the second video, where there are only a few points, and most of those are on my high-vis reflective strips.</span></li>
</ul>
<br />
<ul>
<li>It does give some nice debugging graphs of the speed measurements and the frequency spectra though.</li>
</ul>
<div>
So, still a bit of work to do.....</div>
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='320' height='266' src='https://www.blogger.com/video.g?token=AD6v5dwadkcH1aQf920mholWgDe9uJcOsZpmv_tKdxE37bzYeaWvWoTL111aW6w1XHxN7OhKAsTScnzBDUzUnnxySA' class='b-hbp-video b-uploaded' frameborder='0'></iframe></div>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjeeKREzisA2Zq3SmfRXFRKjI8e8PCmXCTEz6D__4zR-M_Fj9pdGRwjHL2fF6FfGNHI_r4ASH7ls6OxZ8rJGkEa_FD9k5zuGYxzQutUSWvp936lkmGJvRVLPSxivmErC-tfDIdL2rN1hfk/s1600/plot.png" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjeeKREzisA2Zq3SmfRXFRKjI8e8PCmXCTEz6D__4zR-M_Fj9pdGRwjHL2fF6FfGNHI_r4ASH7ls6OxZ8rJGkEa_FD9k5zuGYxzQutUSWvp936lkmGJvRVLPSxivmErC-tfDIdL2rN1hfk/s320/plot.png" width="231" /></a></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com1Hartlepool, UK54.691745 -1.212926000000038654.544835 -1.5356495000000385 54.838654999999996 -0.89020250000003864tag:blogger.com,1999:blog-6042401333290389565.post-8598864698561804812013-03-09T23:02:00.001+00:002013-03-09T23:02:36.434+00:00First go at a Video Based Epileptic Seizure Detector <h3>
Background</h3>
I have been working on a system to detect epileptic seizures (fits) to raise an alarm without requiring sensors to be attached to the subject.<br />
I am going down three routes to try to do this:<br />
<br />
<ul>
<li>Accelerometers</li>
<li>Audio</li>
<li>Video</li>
</ul>
<div>
This is about my first 'proof of concept' go at a video based system.</div>
<h3>
Approach</h3>
<div>
I am trying to detect the shaking of a fit. I will do this by monitoring the signal from an infrared video camera, so it will work in monochrome. The approach is:</div>
<div>
<ol>
<li>Reduce the size of the image by averaging pixels into 'meta pixels' - I do this using the OpenCV pyrDown function that does the averaging (it is used to build image pyramids of various resolution versions of an image). I am reducing the 640x480 video stream down to 10x7 pixels to reduce the amount of data I have to handle.</li>
<li>Collect a series of images to produce a time series of images. I am using 100 images at 30 fps, which is about 3 seconds of video.</li>
<li>For each pixel in the images, calculate the fourier transform of the series of measured pixel intensities - this gives the frequency at which the pixel intensity is varying.</li>
<li>If the amplitude of oscillation at a given frequency is above a threshold value, treat this as a motion at that particular frequency (ie, it could be a fit).</li>
<li>The final version will check that this motion continues for several seconds before raising an alarm. In this test version, I am just highlighting the detected frequency of oscillation on the original video stream.</li>
</ol>
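Steps 2 to 4 can be illustrated without OpenCV at all, using a plain (slow) DFT over a stack of tiny frames. This is a sketch of the idea rather than the project code - the frame rate matches the post, but the amplitude threshold is illustrative:

```python
import cmath
import math

FPS = 30.0   # frames per second

def bin_amplitudes(series):
    """DFT magnitudes of one meta-pixel's intensity series (slow stand-in for an FFT)."""
    n = len(series)
    mean = sum(series) / n
    return [abs(sum((s - mean) * cmath.exp(-2j * math.pi * k * t / n)
                    for t, s in enumerate(series))) / n
            for k in range(n // 2 + 1)]

def moving_pixels(frames, threshold):
    """Return (x, y, frequency_hz) for each meta-pixel whose strongest
    oscillation exceeds threshold.  frames[t][y][x] is the intensity of
    meta-pixel (x, y) in frame t."""
    n = len(frames)
    height, width = len(frames[0]), len(frames[0][0])
    hits = []
    for y in range(height):
        for x in range(width):
            amps = bin_amplitudes([frame[y][x] for frame in frames])
            k = max(range(1, len(amps)), key=lambda i: amps[i])  # skip the DC bin
            if amps[k] > threshold:
                hits.append((x, y, k * FPS / n))
    return hits
```

With 100 frames at 30 fps the frequency resolution is 0.3 Hz, which is plenty to separate seizure-like shaking from slower movement.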
<h3>
Code</h3>
<div>
The code uses the <a href="http://opencv.org/">OpenCV</a> library, which provides a lot of video and image handling functions - far more than I understand...</div>
<div>
My intention had been to write it in C, but I struggled with memory leaks (I must have been doing something wrong and not releasing storage, because it just ate all my computer's memory until it crashed...).</div>
<div>
Instead I used the Python bindings for OpenCV - this ran faster and used much less memory than my C version (this is a sign that I made mistakes in the C one, rather than Python being better!).</div>
<div>
The code for the seizure detector is <a href="https://github.com/jones139/arduino-projects/tree/master/seizure_detector/video_version">here</a> - very rough 'proof of concept' one at the moment - it will have a major rewrite if it works.</div>
<h3>
Test Set Up</h3>
</div>
<div>
To test the system, I have created a simple 'test card' video, which has a number of circles oscillating at different frequencies - the test is to see if I can pick out the various frequencies of oscillation. The code to produce the test video is <a href="https://github.com/jones139/arduino-projects/blob/master/seizure_detector/video_version/testcards/makeTestCard.py">here</a>. And here is the test video (not very exciting to watch I'm afraid).</div>
<div>
The circles are oscillating at between 0 and 8 Hz (when played at 30 fps).</div>
<div class="separator" style="clear: both; text-align: center;">
<object width="320" height="266" class="BLOGGER-picasa-video" classid="clsid:D27CDB6E-AE6D-11cf-96B8-444553540000" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=6,0,40,0" data-thumbnail-src="https://lh5.googleusercontent.com/-Xj_AtmZizEE/UTu6zTioskI/AAAAAAAAAxw/MPGVOzAOflc/s1600/testcard.mpg"><param name="movie" value="http://video.google.com/googleplayer.swf?videoUrl=http://redirector.googlevideo.com/videoplayback?id%3D9c70d57ba9a663d1%26itag%3D5%26source%3Dpicasa%26cmo%3Dsensitive_content%253Dyes%26ip%3D0.0.0.0%26ipbits%3D0%26expire%3D1365461130%26sparams%3Did,itag,source,ip,ipbits,expire%26signature%3D333329239BED336B1F5698B1A31DDAAF9F16E98C.B3DC8992BBA6759772DEA40C61262B991CB477DB%26key%3Dlh1" /><param name="bgcolor" value="#FFFFFF" /><param name="allowFullScreen" value="true" /><embed width="320" height="266" src="http://video.google.com/googleplayer.swf?videoUrl=http://redirector.googlevideo.com/videoplayback?id%3D9c70d57ba9a663d1%26itag%3D5%26source%3Dpicasa%26cmo%3Dsensitive_content%253Dyes%26ip%3D0.0.0.0%26ipbits%3D0%26expire%3D1365461130%26sparams%3Did,itag,source,ip,ipbits,expire%26signature%3D333329239BED336B1F5698B1A31DDAAF9F16E98C.B3DC8992BBA6759772DEA40C61262B991CB477DB%26key%3Dlh1" type="application/x-shockwave-flash" allowfullscreen="true"></embed></object></div>
<h3>
Results</h3>
<div>
The output of the system is shown in the video below. The coloured circles indicate areas where motion has been detected. The thickness of the line and the colour shows the frequency of the detected motion.</div>
<div>
<ul>
<li>Blue = <3 hz="" li="">
<li>Yellow = 3-6 Hz</li>
<li>Red = 6-9 Hz</li>
<li>White = >9 Hz</li>
</3></li>
</ul>
<div class="separator" style="clear: both; text-align: center;">
<object width="320" height="266" class="BLOGGER-picasa-video" classid="clsid:D27CDB6E-AE6D-11cf-96B8-444553540000" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=6,0,40,0" data-thumbnail-src="https://lh5.googleusercontent.com/-RgrfoLs3aUU/UTu8dbAYDhI/AAAAAAAAAyg/RBfte7YXBlo/s1600/seizure_test.mpg"><param name="movie" value="http://video.google.com/googleplayer.swf?videoUrl=http://redirector.googlevideo.com/videoplayback?id%3D534220180b04aa20%26itag%3D5%26source%3Dpicasa%26cmo%3Dsensitive_content%253Dyes%26ip%3D0.0.0.0%26ipbits%3D0%26expire%3D1365462114%26sparams%3Did,itag,source,ip,ipbits,expire%26signature%3D7C3CB5C92D83E99B994F423D5074D83B30943449.7AE93FDAA9B87917AB71582180BD50FB97F3BD8E%26key%3Dlh1" /><param name="bgcolor" value="#FFFFFF" /><param name="allowFullScreen" value="true" /><embed width="320" height="266" src="http://video.google.com/googleplayer.swf?videoUrl=http://redirector.googlevideo.com/videoplayback?id%3D534220180b04aa20%26itag%3D5%26source%3Dpicasa%26cmo%3Dsensitive_content%253Dyes%26ip%3D0.0.0.0%26ipbits%3D0%26expire%3D1365462114%26sparams%3Did,itag,source,ip,ipbits,expire%26signature%3D7C3CB5C92D83E99B994F423D5074D83B30943449.7AE93FDAA9B87917AB71582180BD50FB97F3BD8E%26key%3Dlh1" type="application/x-shockwave-flash" allowfullscreen="true"></embed></object></div>
<div>
<br /></div>
<div>
The things to note are:</div>
</div>
<div>
<ul>
<li>No motion detected near the stationary 0 Hz circle (good!).</li>
<li><3hz 1="" 2="" and="" circles="" detected="" good="" hz="" li="" motion="" near="" the="">
<li>3-6 Hz motion detected near the 2,3,4 and 5 Hz circles (ok, but why is it near the 2Hz one?)</li>
<li>6-9 Hz motion detected near the 5 and 6 Hz circles (a bit surprising)</li>
<li>>9Hz motion detected near the 4 and 7 Hz circles and sometimes the 8Hz one (?)</li>
</3hz></li>
</ul>
<div>
So, I think it is sometimes getting the frequency too high. This may be as simple as how I am doing the check - it is using the highest frequency that exceeds the threshold. I think I should update it to use the frequency with maximum amplitude (which exceeds the threshold).</div>
</div>
<div>
Also, I have something wrong with positioning the markers to show the motion - I am having to convert from a pixel in the low res image to the location in the high resolution one, and it does not always match up with the position of the moving circles.</div>
<div>
<br /></div>
<div>
But, it is looking quite promising. Rather computer intensive at the moment though - it is using pretty much 100% of one of the CPU cores on my Intel Core i5 laptop, so not much chance of getting this to run on a Raspberry Pi, which was my intention.</div>
<div>
<br /></div>
<div>
<br /></div>
<br />
<br />Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com0tag:blogger.com,1999:blog-6042401333290389565.post-90887623967596912132013-03-02T23:09:00.000+00:002013-03-02T23:12:22.523+00:00Getting Started with OpenCVI am starting work on the video version of my Epileptic Seizure detector project, while I wait for a very sensitive microphone to arrive off the slow boat from China, which I will use for the Audio version.<br />
<br />
I am using the <a href="http://opencv.org/">OpenCV</a> computer vision library. What I am hoping to do is to either:<br />
<br />
<ul>
<li>Detect the high frequency movement associated with a seizure, or</li>
<li>Detect breathing (and raise an alarm if it stops)</li>
</ul>
<div>
This seems quite similar to the sort of things that MIT have demonstrated some success with last year (<a href="http://people.csail.mit.edu/mrub/vidmag/">http://people.csail.mit.edu/mrub/vidmag/</a>). Their code is written in Matlib, which is a commercial package, so not much use to me, so I am looking at doing something similar in OpenCV.</div>
<div>
<br /></div>
<div>
But first things first, I need to get OpenCV working. I am going to use plain old C, because I know the syntax (no funny '<'s in the code that you seem to get in C++). I may move to Python if I start to need to plot graphs to understand what is happening, so I can use the matplotlib graphical library.</div>
<div>
<br /></div>
<div>
I am using CMake to sort out the make file. I really don't know how this works - I must have found a tutorial somewhere that told me to create a file called CMakeLists.txt. Mine looks like:</div>
<blockquote class="tr_bq">
<blockquote class="tr_bq">
cmake_minimum_required(VERSION 2.8)</blockquote>
<blockquote class="tr_bq">
PROJECT( sd )</blockquote>
<blockquote class="tr_bq">
FIND_PACKAGE( OpenCV REQUIRED )</blockquote>
<blockquote class="tr_bq">
ADD_EXECUTABLE( sd Seizure_Detector.c )</blockquote>
<blockquote class="tr_bq">
TARGET_LINK_LIBRARIES( sd ${OpenCV_LIBS} )</blockquote>
</blockquote>
Running 'cmake' creates a standard Makefile, and then typing 'make' will compile Seizure_Detector.c and link it into an executable called 'sd', including the OpenCV libraries. Seems quite clever.<br />
<br />
The program to detect a seizure is going to have to look for changes in a series of images in a certain frequency range (a few Hz I think). To detect this I will need to collect a series of images, process them, and do some sort of Fourier transform to detect the frequency components.<br />
<br />
So to get started, grab an image from the networked camera. This seems to work:<br />
#include "highgui.h"<br />
<br />
CvCapture *camera = 0;<br />
IplImage *origImg = 0;<br />
char *window1 = "Original";<br />
int main() {<br />
camera = cvCaptureFromFile("rtsp://192.168.1.18/live_mpeg4.sdp");<br />
if(camera!=NULL) {<br />
cvNamedWindow(window1,CV_WINDOW_AUTOSIZE);<br />
while((origImg=cvQueryFrame(camera)) != NULL) {<br />
cvShowImage(window1,origImg);<br />
cvWaitKey(10); /* let highgui redraw the window */<br />
}<br />
cvReleaseCapture(&amp;camera); /* frames from cvQueryFrame belong to the capture, so don't release them */<br />
}<br />
return 0;<br />
}<br />
<br />
I can also smooth the image, and do some edge detection:<br />
<br />
IplImage *procImg = 0;<br />
IplImage *smoothImg = 0;<br />
while((origImg=cvQueryFrame(camera)) != NULL) {<br />
procImg = cvCreateImage(cvGetSize(origImg),8,1);<br />
smoothImg = cvCreateImage(cvGetSize(origImg),8,1);<br />
cvCvtColor(origImg,procImg,CV_BGR2GRAY);<br />
cvSmooth(procImg, smoothImg, CV_GAUSSIAN,9,9,0,0);<br />
cvCanny(smoothImg,procImg,0,20,3);<br />
<br />
cvShowImage(window1,origImg);<br />
cvShowImage(window2,procImg);<br />
/* release the per-frame images - forgetting this is an easy way to eat all the memory */<br />
cvReleaseImage(&amp;procImg);<br />
cvReleaseImage(&amp;smoothImg);<br />
cvWaitKey(10);<br />
}<br />
<br />
Full code at <a href="https://github.com/jones139/arduino-projects/tree/master/seizure_detector/video_version">https://github.com/jones139/arduino-projects/tree/master/seizure_detector/video_version</a>.<br />
<br />
I am about to update the code to maintain a set of the most recent 15 images (=1 second of video), so I can do some sort of time series analysis on it to get the frequencies.....<br />
<br />
<br />Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com0tag:blogger.com,1999:blog-6042401333290389565.post-39964613407805973122013-02-24T17:35:00.004+00:002013-02-24T17:35:45.538+00:00Epileptic Seizure Detector (3)I installed an accelerometer on the underside of the floorboard where my son sleeps to see if there is any chance of detecting him having an epileptic seizure by the vibrations induced in the floor.<div>
I used the software for the seizure detector that I have been working with before (see earlier post).</div>
<div>
<br /></div>
<div>
The software logs data to an SD card in Comma-Separated-Values (CSV) format, recording the raw accelerometer reading, and the calculated spectrum once per second. This left me with 26 MB of data to analyse after running it all night.....</div>
<div>
<br /></div>
<div>
I wrote a little script in Python that uses the matplotlib library to visualise it. I create a two-dimensional array with one column for each record in the file (i.e. one column per second). The rows are the frequency bins from the Fourier transform, and the values in the array are the amplitudes of the spectral components.</div>
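That array-and-plot step can be sketched like this (a simplified stand-in for the real analyse_csv.py script; the spectrum values below are made up for illustration):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")              # render without a display
import matplotlib.pyplot as plt

def plot_spectra(records):
    """records: one entry per second, each entry the FFT bin
    amplitudes for that second.  The image has time across the
    columns, frequency bins down the rows, amplitude as colour."""
    arr = np.array(records).T      # transpose so bins run vertically
    plt.imshow(arr, aspect="auto", origin="lower")
    plt.xlabel("Time (s)")
    plt.ylabel("Frequency bin")
    plt.colorbar(label="Amplitude")
    plt.savefig("spectra.png")
    return arr.shape

# Three seconds of made-up 8-bin spectra:
shape = plot_spectra([[0, 1, 2, 5, 2, 1, 0, 0],
                      [0, 1, 3, 6, 3, 1, 0, 0],
                      [0, 0, 2, 4, 2, 0, 0, 0]])
```

The real script reads the per-second spectra out of the CSV file instead of a hard-coded list, but the imshow call is the essence of the chart below.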
<div>
The idea is that I can look for periods where I have seen high levels of vibration at different frequencies to see if it could detect a seizure. The results are shown below:</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhTU-vGmJCPZnOtI2G9eFCcL0jornIp4LmzJobr2A_aQEM5gurVA8d0B7Vqko6gD8h1fe3QEn7LJJz88feAV_8fVADj0efyC0p6Q7oah0_xgIRox0EfHgDSVanRgplJGY8HnNyonlonUI4/s1600/various_activities.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="482" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhTU-vGmJCPZnOtI2G9eFCcL0jornIp4LmzJobr2A_aQEM5gurVA8d0B7Vqko6gD8h1fe3QEn7LJJz88feAV_8fVADj0efyC0p6Q7oah0_xgIRox0EfHgDSVanRgplJGY8HnNyonlonUI4/s640/various_activities.png" width="640" /></a></div>
<div>
Here you can see the background noise of a few counts in the 1-7 Hz range. The 13-15Hz signal is a mystery to me. I wonder if it is the resonant frequency of our house?</div>
<div>
Up to 170 sec is just me walking around the room - discouragingly little response, maybe something at about 10 Hz. This is followed by me sitting still on the floorboard up to ~200 seconds (the 10 Hz signal disappears?).</div>
<div>
The period at ~200 seconds is me stamping vigorously on the floorboard, to prove that the system is alive.</div>
<div>
Unfortunately the period after 200 seconds is me lying on the floorboard shaking as vigorously as I could, and it is indistinguishable from the normal activity before 170 seconds.</div>
<div>
<br /></div>
<div>
So I think attaching a simple IC accelerometer to a floorboard will not work; attaching it directly to the patient's forearm, on the other hand, looks very promising.</div>
<div>
<br /></div>
<div>
I am working on an audio breathing detector now as the next non-contact option....</div>
<div>
<br /></div>
<div>
The code to analyse the data and produce the above chart can be found on <a href="https://github.com/jones139/arduino-projects/blob/master/seizure_detector/accelerometer_version/data_analysis/analyse_csv.py">github</a>. It uses the excellent <a href="http://matplotlib.org/">matplotlib</a> scientific visualisation package.</div>
Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com0tag:blogger.com,1999:blog-6042401333290389565.post-90775192332185762732013-02-13T22:55:00.001+00:002013-02-15T22:08:08.932+00:00Epileptic Seizure Detector (2)Update to add another spectrum...<br />
<br />
I have been working on setting up the <a href="http://nerdytoad.blogspot.co.uk/2013/02/epileptic-seizure-detector-1.html">Epileptic Seizure Detector</a>. I tried wearing it for a while, and simulating the shaking associated with a tonic-clonic seizure. Some example spectra collected on the memory card are shown below:<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEipVnWL3OJh6Adjomg95519oXhiKDNAENdn0uQuooE9F0gqxyaoHvYcYp_v_7jmSFnK2JoKs_UST79W9hqkC2igIINwshQXADkC_IJsJinTivA998kXSrebGu-r5psfb6IEpq2R24DG0yc/s1600/Test+Spectra.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="480" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEipVnWL3OJh6Adjomg95519oXhiKDNAENdn0uQuooE9F0gqxyaoHvYcYp_v_7jmSFnK2JoKs_UST79W9hqkC2igIINwshQXADkC_IJsJinTivA998kXSrebGu-r5psfb6IEpq2R24DG0yc/s640/Test+Spectra.png" width="640" /></a></div>
<div class="separator" style="clear: both; text-align: left;">
This shows that the background noise level is at about 4 counts. </div>
<div class="separator" style="clear: both; text-align: left;">
Wearing the accelerometer on the bicep gives a peak up to about 8 counts at 7 Hz, but it is not well defined. </div>
<div class="separator" style="clear: both; text-align: left;">
Wearing the accelerometer on the wrist gives a much better defined peak at 6-7 Hz (and it raised an alarm nicely).</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
I have also tried an ADXL345 digital accelerometer. The performance is similar to the analogue one, but I think it may be slightly more sensitive. Example spectra with the accelerometer attached to the bicep are shown below. One is a simulated fit. The other is a false alarm going down the stairs. Not that much difference!</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhZJOlWHklk47f_s6A3MfCz9cyxQ0SKJTiD0hDQqecVTk1kT8eK-uNsCMK5nJluEUIQSf8lqJFIKkgBVuSfS4Sc45x3g0vCiqrmj35ZbEfXVTLpYmekVWAZAjWqIqrXJ6dJ-hr846EnO1Q/s1600/Test+Spectra+2+(digital+accel).png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="386" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhZJOlWHklk47f_s6A3MfCz9cyxQ0SKJTiD0hDQqecVTk1kT8eK-uNsCMK5nJluEUIQSf8lqJFIKkgBVuSfS4Sc45x3g0vCiqrmj35ZbEfXVTLpYmekVWAZAjWqIqrXJ6dJ-hr846EnO1Q/s640/Test+Spectra+2+(digital+accel).png" width="640" /></a></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
Therefore I think there is scope for this set-up to work if it is worn as a wristwatch, but just attaching it to other parts of the body may not be sensitive enough.</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
I wonder if I could make a wrist sensor that is watch sized, with a wireless link to a processor / alarm unit?</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
Not sure if I will be able to persuade Benjamin to wear a wrist sensor though....Might have to think about microphones.</div>
Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com0tag:blogger.com,1999:blog-6042401333290389565.post-58057697429611000102013-02-09T22:28:00.002+00:002013-02-09T22:44:53.874+00:00Soldering onto Surface Mount ICsI recently bought an accelerometer IC to use on my <a href="http://nerdytoad.blogspot.co.uk/2013/02/epileptic-seizure-detector-1.html">epileptic seizure detector</a> project. It is a tiny surface mount device as you can see below.<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiZBXGu5znoAieshj14g11l0IXigxh4pq64qQIa27D5ZNRTyeGcxnt59O7O19aoEPKH-EofLqsPE_BC87ABbqbG7nX7LScCwnzYZJ1ntJ1u8KLYrmov8poKqurmTFflKDDw6jAjtiiZJFM/s1600/IMGP1809.JPG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiZBXGu5znoAieshj14g11l0IXigxh4pq64qQIa27D5ZNRTyeGcxnt59O7O19aoEPKH-EofLqsPE_BC87ABbqbG7nX7LScCwnzYZJ1ntJ1u8KLYrmov8poKqurmTFflKDDw6jAjtiiZJFM/s320/IMGP1809.JPG" width="285" /></a></div>
<div class="separator" style="clear: both; text-align: left;">
I gave a lot of thought to how to connect wires to it. I did consider conductive glue, but it would be difficult to hold them all still for long enough for it to set, so I went back to solder. This is how I did it...</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
1. Mount the IC onto stripboard using epoxy adhesive:</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEihjEBf17OKdK0O2Ov9Qjw0F58boUMYMW0QydDmQhsptURv4G7cxKrKtGBenv_9PXKgB_VfZqOYFQ1leZ09-6pizd_swZgAAofI8VU-NjO6cA4IIrB5zQjHgmToNgo444-fa1ORIpfsphM/s1600/IMGP1816.JPG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEihjEBf17OKdK0O2Ov9Qjw0F58boUMYMW0QydDmQhsptURv4G7cxKrKtGBenv_9PXKgB_VfZqOYFQ1leZ09-6pizd_swZgAAofI8VU-NjO6cA4IIrB5zQjHgmToNgo444-fa1ORIpfsphM/s320/IMGP1816.JPG" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
2. While the glue is setting, modify the soldering iron by wrapping some 1 mm<sup>2</sup> copper wire around the tip to form a much finer point. Use solder to improve the heat transfer between the wire and the tip:</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjC_o2x4_1F5xyePXkwmtqdL2pgpKfDQwP-3cROqovhXmMs3M-1hX1-hCE4TBpO1wa-IMC2O3i1HdWx2bxH0Ve_1SCUPtjuPMUqt9aPbIFkOYsyOWXE2KIEQke7i_o8urfgmURsNp_GGng/s1600/IMGP1818.JPG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjC_o2x4_1F5xyePXkwmtqdL2pgpKfDQwP-3cROqovhXmMs3M-1hX1-hCE4TBpO1wa-IMC2O3i1HdWx2bxH0Ve_1SCUPtjuPMUqt9aPbIFkOYsyOWXE2KIEQke7i_o8urfgmURsNp_GGng/s320/IMGP1818.JPG" width="220" /></a></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
3. Tin the solder pads on the IC, using some very fine solder (I got some 32 SWG solder off eBay).</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
4. Obtain some very fine copper wire (I disassembled some cheap flexible alarm cable and used strands from that).</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
5. Hold a strand of wire onto a solder pad, and touch it with the soldering iron to melt the solder and create the joint.</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
6. Repeat for all connections:</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg7p-emV_b1cysoS85UX_gvjA96Ni6zjvcXCQi4BnH6v5BsaleD4IYNQqYnIYrWk3vWOPEZwAfPW4vNmJHx0Q_zzRSfxXjEXWMnu0gyd8jyulqnt50aFqqMKEXp6DiTPJZpPg9xGfd0pJU/s1600/IMGP1820.JPG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="274" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg7p-emV_b1cysoS85UX_gvjA96Ni6zjvcXCQi4BnH6v5BsaleD4IYNQqYnIYrWk3vWOPEZwAfPW4vNmJHx0Q_zzRSfxXjEXWMnu0gyd8jyulqnt50aFqqMKEXp6DiTPJZpPg9xGfd0pJU/s320/IMGP1820.JPG" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
7. Route the fine wires to the copper tracks, and solder on. I used the insulation from the original alarm cable to prevent short circuits:</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgqsm9hp1LaXparRxmUorHfaRvf_M9Sa5sLMWuAaT1fHVLhzLxmskm4rnRCyCc6spIEO3Zab8Vlp5a0kKeGfHsA7IeX2DWsHbbHENpAh34Nm6QRasEc_D64Jr6gN8AaJ4QYGFDwWqMVuUs/s1600/IMGP1826.JPG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="239" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgqsm9hp1LaXparRxmUorHfaRvf_M9Sa5sLMWuAaT1fHVLhzLxmskm4rnRCyCc6spIEO3Zab8Vlp5a0kKeGfHsA7IeX2DWsHbbHENpAh34Nm6QRasEc_D64Jr6gN8AaJ4QYGFDwWqMVuUs/s320/IMGP1826.JPG" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
Fiddly, and not very neat, but it worked for me - it is being used in my prototype <a href="http://nerdytoad.blogspot.co.uk/2013/02/epileptic-seizure-detector-1.html">epileptic seizure detector</a>.</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<br />Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com0tag:blogger.com,1999:blog-6042401333290389565.post-90628626450566670652013-02-09T22:19:00.002+00:002013-02-13T22:46:08.435+00:00Epileptic Seizure Detector (1)Our son worried us a bit a couple of weeks ago when he had quite a nasty fit, so I have been thinking about making an alarm to warn a carer that a person in their charge is having a seizure.<br />
<br />
There are a few different ways to do this that I have thought of:<br />
<br />
<ol>
<li>Detect Movement using an accelerometer</li>
<li>Detect the sounds associated with the movement using a microphone</li>
<li>Monitor the movement with a CCTV camera and use image processing to detect the abnormal movement.</li>
</ol>
<div>
I am trying option 1 (accelerometer) first, but am working on the CCTV approach in parallel by learning OpenCV.</div>
<div>
<br /></div>
<div>
Because our son is autistic, it will be very difficult to get him to wear a device, so I hope to detect movement through the floorboard where he sleeps, though this will be much less sensitive than detecting it directly. Therefore, for this first proof-of-concept version I am attaching the accelerometer to a limb to check that the approach works at all. The issues with it are:</div>
<div>
<ol>
<li>We do not want false alarms caused by normal movement - I am addressing this by using a Fourier transform to pick out only a limited range of movement frequencies, in the hope that I can select the characteristic shaking of a seizure without detecting too much normal movement.</li>
<li>A quick shake should not raise an alarm, so to set off an alarm the acceleration in the appropriate frequency band should be more than a threshold value for a specified length of time (3 sec currently). This will give a warning 'pip'. If the shaking continues for 10 sec, it raises a buzzing alarm.</li>
<li>Sensitivity will be a problem for detecting it through the floor - will need to work on that another evening.</li>
</ol>
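The alarm timing in point 2 above can be sketched as a simple mapping from 'time above threshold' to alarm state (a Python sketch of the logic only - the real implementation is Arduino code, and these names are mine):

```python
WARN_SEC = 3    # seconds above threshold before the warning 'pip'
ALARM_SEC = 10  # seconds above threshold before the buzzing alarm

def alarm_state(seconds_above_threshold):
    """Map how long the in-band acceleration has exceeded the
    threshold onto one of three states."""
    if seconds_above_threshold >= ALARM_SEC:
        return "ALARM"      # continuous buzz
    if seconds_above_threshold >= WARN_SEC:
        return "WARNING"    # short 'pip'
    return "OK"

# A quick shake (under 3 s) never leaves the OK state:
states = [alarm_state(t) for t in (0, 2, 3, 9, 10, 15)]
```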
<div>
The system uses an Arduino microcontroller, connected to a <a href="http://uk.rs-online.com/web/p/accelerometer-ics/7190999/">Freescale MMA7361 three axis accelerometer</a>. The accelerometer is a tiny (5mm x 3mm) surface mount device, so soldering it is a challenge - you can see how I did it <a href="http://nerdytoad.blogspot.co.uk/2013/02/soldering-onto-surface-mount-ics.html">here</a>.</div>
</div>
<div>
To enable data logging so I can tune the frequency response, threshold etc., the Arduino is also connected to a real-time clock module and an SD card module.<br />
The completed prototype is shown below:<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhMZBjZ4xP2p8akLeKaa-cYZ50ov9kJAZ62aKLUseArg85ZqZ3K0S5Eor5nF22gAg8Efax41PMJWWkKos4dwdpR9SXVmY8qdhp35Xv7T3T4VMmOuJKeDf1scuY9ZSMz6HsTeX-yQyJwGZI/s1600/IMGP0002.JPG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhMZBjZ4xP2p8akLeKaa-cYZ50ov9kJAZ62aKLUseArg85ZqZ3K0S5Eor5nF22gAg8Efax41PMJWWkKos4dwdpR9SXVmY8qdhp35Xv7T3T4VMmOuJKeDf1scuY9ZSMz6HsTeX-yQyJwGZI/s320/IMGP0002.JPG" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: left;">
The code is in my <a href="https://github.com/jones139/arduino-projects">Arduino Projects</a> github repository.</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
And here is a simple demonstration of it working - you can hear the warning 'pip' and the alarm 'buzz' in the background when I shake my arm to simulate a seizure. </div>
<div class="separator" style="clear: both; text-align: center;">
<object class="BLOGGER-picasa-video" classid="clsid:D27CDB6E-AE6D-11cf-96B8-444553540000" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=6,0,40,0" data-thumbnail-src="https://lh5.googleusercontent.com/-7pszvNyPmAk/URa9zC9m8BI/AAAAAAAAAe4/QQrfooWHbUg/s1600/IMGP1828.AVI" height="266" width="320"><param name="movie" value="http://video.google.com/googleplayer.swf?videoUrl=http://redirector.googlevideo.com/videoplayback?id%3Da7169a98d97312cc%26itag%3D5%26source%3Dpicasa%26cmo%3Dsensitive_content%253Dyes%26ip%3D0.0.0.0%26ipbits%3D0%26expire%3D1363040270%26sparams%3Did,itag,source,ip,ipbits,expire%26signature%3D7EF26F92E830E379FE30D06361197B24CCD2993F.DAE7AC8F246FBC1EE12FF84D5E6FA18BA058B614%26key%3Dlh1" /><param name="bgcolor" value="#FFFFFF" /><param name="allowFullScreen" value="true" /><embed width="320" height="266" src="http://video.google.com/googleplayer.swf?videoUrl=http://redirector.googlevideo.com/videoplayback?id%3Da7169a98d97312cc%26itag%3D5%26source%3Dpicasa%26cmo%3Dsensitive_content%253Dyes%26ip%3D0.0.0.0%26ipbits%3D0%26expire%3D1363040270%26sparams%3Did,itag,source,ip,ipbits,expire%26signature%3D7EF26F92E830E379FE30D06361197B24CCD2993F.DAE7AC8F246FBC1EE12FF84D5E6FA18BA058B614%26key%3Dlh1" type="application/x-shockwave-flash" allowfullscreen="true"></embed></object></div>
<div class="separator" style="clear: both; text-align: left;">
Still quite a bit of work to do - build it on stripboard to make it more robust, then try attaching it to the floor to see if I can detect any signal from someone shaking. If not, I will have to miniaturise it to make it wearable, and train Benjamin to wear it....</div>
<div class="separator" style="clear: both; text-align: left;">
<br /></div>
<br /></div>
Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com0tag:blogger.com,1999:blog-6042401333290389565.post-37803314469746270632013-01-28T20:26:00.001+00:002013-01-28T20:26:06.580+00:00Getting Started with Raspberry PiI have had a Raspberry Pi single board computer in a box in the attic for a few months - I had forgotten that I had pre-ordered it, and was busy with the <a href="http://nerdytoad.blogspot.co.uk/2012/12/approaching-working-version-of-arduino.html">Arduino solar panel power meter</a> when it arrived, so didn't do anything with it.<br />
<br />
Well, I know that the <a href="http://nerdytoad.blogspot.co.uk/2013/01/initial-design-calcs-for-power-assisted.html">wheelchair project</a> will need some brackets to mount motors, lights, GPS receiver etc., and have been reading about <a href="http://reprap.org/">3d printing</a>, and thought it would be <strike>a handy excuse</strike> err... a necessary part of the project, to try out 3d printing for these parts. And the 3d printer will need a little print server, so I don't tie up my laptop when it is printing. So, I am dusting off the Raspberry Pi and having a go at setting it up to see if it will be able to do that.<br />
<br />
These are my notes, so that I can do it again if I accidentally break it...<br />
<h2>
Basic Set-Up</h2>
<div>
<ul>
<li>Download the Debian <a href="http://downloads.raspberrypi.org/images/raspbian/2012-12-16-wheezy-raspbian/2012-12-16-wheezy-raspbian.zip">root filesystem image</a> from the <a href="http://www.raspberrypi.org/">Raspberry Pi web site</a>.</li>
<li>Unzip the archive to give us 2012-12-16-wheezy-raspbian.img.</li>
<li>Copy it to a 4GB SD card using dd if=2012-12-16-wheezy-raspbian.img of=/dev/sdb. (Note: write to the whole SD card, not to a partition - sdb, not sdb1.)</li>
<li>Put SD card into raspberry pi, connect HDMI to TV in living room and switch on.</li>
<li>Success - boot messages displayed on TV</li>
<li>Failure - it lands in an interactive set-up utility, and I don't have a keyboard for it - doh....maybe I should have gone for <a href="http://openwrt.org/">openWRT</a>.</li>
<li>Try a different approach - forget the TV now I know it boots, and just connect it to the network. It gets its IP address from my router, and I can now ssh into it with username pi, password raspberry.</li>
<li>Now I can run sudo raspi-config, which is the same config utility that came up on the TV monitor. Used this to expand the root filesystem to fill the SD card, but didn't see much point in changing anything else (will sort out a user in a minute and do away with the pi user).</li>
</ul>
<h2>
3d Printing Stuff</h2>
</div>
<div>
<ul>
<li>Followed instructions at <a href="https://github.com/w-A-L-L-e/printerface">https://github.com/w-A-L-L-e/printerface</a>, with the following exceptions:</li>
<li>mv kliment-Printrun-71e5da0/ printrun</li>
<li>Node.js needed sudo apt-get install nodejs, not node-js.</li>
<li>Had to do sudo ln -s /usr/bin/nodejs /usr/bin/node to get npm install.sh to work.</li>
<li>needed to <span style="color: #333333; font-family: Consolas, 'Liberation Mono', Courier, monospace; font-size: 12px; line-height: 19px;">curl https://npmjs.org/install.sh | sudo sh to avoid directory access errors.</span></li>
<li><span style="color: #333333; font-family: Consolas, Liberation Mono, Courier, monospace;"><span style="font-size: 12px; line-height: 19px;">The forever@0.9.2 failed to install with lots of errors, but npm install -g forever worked.</span></span></li>
<li><span style="color: #333333; font-family: Consolas, Liberation Mono, Courier, monospace;"><span style="font-size: 12px; line-height: 19px;">But starting printerface using forever failed with an error on line 404 (monitor.send).</span></span></li>
<li><span style="color: #333333; font-family: Consolas, Liberation Mono, Courier, monospace;"><span style="font-size: 12px; line-height: 19px;">node printerface.js works though - web interface appears on port 8080.</span></span></li>
</ul>
</div>
Will update when I get further....<br />
<br />Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com0tag:blogger.com,1999:blog-6042401333290389565.post-10100464990026010682013-01-06T19:22:00.002+00:002013-01-20T15:46:02.533+00:00Design Calcs for Power Assisted Wheelchair<b>Update to correct my deliberate mistake...Answer is still about the same, but I am now designing to a 1 in 3 (18 deg) gradient.</b><br />
<br />
A very quick go at some preliminary design calculations for the <a href="http://nerdytoad.blogspot.com/2013/01/power-assisted-cross-country-wheelchair.html">power assisted cross country wheelchair</a>.<br />
The idea is that the motor should be capable of preventing it slipping backwards on an 18 deg incline (fairly arbitrary, but needed to make a design assumption).<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEixrQDp8Y9NaWt-IdhKQUt9Vo3UfEbb-xMxSGkzCFeiyesJIpMysmPBM57mJZR1QsM3_6TvS0DjuiMx2UvXPvtqBe4_qkACvS9vPx_iJfa277BF3KdQXJtDwTyynlbrtjjeDXDO1IZCitg/s1600/Cross-Country+Wheelchair.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="300" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEixrQDp8Y9NaWt-IdhKQUt9Vo3UfEbb-xMxSGkzCFeiyesJIpMysmPBM57mJZR1QsM3_6TvS0DjuiMx2UvXPvtqBe4_qkACvS9vPx_iJfa277BF3KdQXJtDwTyynlbrtjjeDXDO1IZCitg/s400/Cross-Country+Wheelchair.png" width="400" /></a></div>
<ul>
<li>An assumed mass of 50 kg gives a weight of 490 N.</li>
<li>This resolves to a force down the 18 deg slope of 152 N.</li>
<li>This is equivalent to a torque on an 18" (0.46 m) OD wheel of 152 x 0.23 = 35 Nm (force times the 0.23 m wheel radius).</li>
</ul>
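The same sums in a few lines of Python, using the assumptions above (50 kg, 18 deg slope, 0.46 m wheel):

```python
import math

MASS_KG = 50.0       # assumed mass of chair plus occupant
G = 9.81             # gravitational acceleration, m/s^2
SLOPE_DEG = 18.0     # 1-in-3 design gradient
WHEEL_OD_M = 0.46    # 18 inch wheel outside diameter

weight_n = MASS_KG * G                                       # ~490 N
downslope_n = weight_n * math.sin(math.radians(SLOPE_DEG))   # ~152 N
torque_nm = downslope_n * WHEEL_OD_M / 2                     # force x radius, ~35 Nm
```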
<div>
Alas this is more than twice the torque delivered by the bicycle hub motor (15 Nm). Now the motor is likely to have internal gears, but it could be tricky to make some new ones to reduce its speed and increase its torque... [Update - oh no it doesn't - it is <a href="http://www.ebay.co.uk/itm/Brushless-Mini-Hub-Motor-36V-200W-Front-Wheel-Uk-Seller-/110997563530?_trksid=p5197.m1992&_trkparms=aid%3D111000%26algo%3DREC.CURRENT%26ao%3D1%26asc%3D14%26meid%3D4689701705660011606%26pid%3D100015%26prg%3D1006%26rk%3D1%26sd%3D110997563530%26">shown as gearless</a>].<br />
So it looks like if I am going to use hub motors, I will have to use two of them. This would probably be sensible, as it will be better to drive the rear wheels, but also expensive...</div>
<div>
<br /></div>
<div>
I have ordered an <a href="http://bit.ly/10HZZEa">electric wheelchair conversion kit off ebay</a> - will see how that goes. Torque and speed should be ok, but it looks heavy and clunky, so I expect to upgrade it...</div>
Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com3tag:blogger.com,1999:blog-6042401333290389565.post-66455264823717550622013-01-06T09:07:00.001+00:002013-01-06T09:07:12.376+00:00Power Assisted Cross Country WheelchairOur son Benjamin does not walk too well, and will suddenly run out of energy, so when we are out in the countryside we take a three wheeler cross country wheelchair for him:<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://www.specialneedspushchairs.co.uk/images/xl/mountain_buggy_xl05.jpg" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="200" src="http://www.specialneedspushchairs.co.uk/images/xl/mountain_buggy_xl05.jpg" width="165" /></a></div>
<div class="separator" style="clear: both; text-align: left;">
He has got too big for this one, so we are going to get him the biggest one we can find:</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="http://www.specialneedspushchairs.co.uk/babyjogger_freedom.htm" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="200" src="http://www.specialneedspushchairs.co.uk/images/baby_jogger/babyjogger_freedom21.jpg" width="175" /></a></div>
<div class="separator" style="clear: both; text-align: left;">
This new one has 16" spoked wheels, so this means it should be possible to add some form of power assistance to it, as you can get some nice lightweight motors that fit into bicycle wheel hubs.</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="http://cgi.ebay.co.uk/ws/eBayISAPI.dll?ViewItem&item=110986394156&ssPageName=ADME:X:RTQ:GB:1123"><img border="0" height="165" src="http://img689.imageshack.us/img689/2739/motorwheel.png" width="200" /></a></div>
<div class="separator" style="clear: both; text-align: left;">
So, I intend to get an electric bicycle conversion kit, and fit it to the wheelchair. There are a few things to deal with to make it work:</div>
<div class="separator" style="clear: both; text-align: left;">
</div>
<ol>
<li>Will the new hub fit in the front forks of the wheelchair? (waiting for supplier to measure it for me on Monday).</li>
<li>Although fitting the powered wheel at the front will be the easiest mechanically (assuming it fits), the front wheel has less weight on it, so it may just spin and not be much use. I may therefore have to look at fitting it to one of the rear wheels (which then raises the concern about whether the chair will spin round in circles!).</li>
<li>The bike set-up is intended to go a lot faster than I want this wheelchair to go (I guess it will target around 12mph, but I think 4mph will feel quite fast enough for me). Mounting the hub in a smaller wheel will reduce the speed, but I think it will still be too fast (will do the sum later...), so I think I will have to modify the motor driver. The motor is a brushless motor, which from what I have read sounds pretty much like a stepper motor - you have to feed it with a waveform to get it to rotate (and go in the right direction). So even if I cannot simply modify the controller, I can use its power transistors etc. and use an Arduino to generate the waveforms.</li>
</ol>
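A quick version of 'the sum' from point 3: the hub turns at a fixed rpm, so road speed scales with wheel diameter. The 26-inch wheel and 12 mph design speed below are my guesses about the bike kit, not its actual specification:

```python
BIKE_SPEED_MPH = 12.0    # assumed design speed in a bicycle wheel
BIKE_WHEEL_IN = 26.0     # assumed bicycle wheel diameter, inches
CHAIR_WHEEL_IN = 16.0    # wheelchair wheel diameter, inches

# Fixed hub rpm means road speed scales linearly with diameter.
chair_speed_mph = BIKE_SPEED_MPH * CHAIR_WHEEL_IN / BIKE_WHEEL_IN
```

On those assumptions the 16-inch wheel still gives roughly 7.4 mph, comfortably above the ~4 mph I want, which supports the conclusion that the motor driver will need modifying.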
<div>
An alternative may be to go for two electric wheelchair motors, but they look awfully heavy compared to the bike motor, so I am tempted to go with that as a trial. If it doesn't work, I'll put the electric kit on our Hase Pino tandem to help me up the hills, as Benjamin doesn't put too much effort into pedalling!</div>
<div>
<a href="http://www.flickr.com/photos/jones139/7328416558/" title="IMAG0035 by jones139, on Flickr"><img alt="IMAG0035" height="333" src="http://farm8.staticflickr.com/7238/7328416558_f1eb9c2dcd.jpg" width="500" /></a>
<br />
<br />
I'd be interested to hear if anyone has tried this and has experiences to share.</div>
<br />
<br />Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com1tag:blogger.com,1999:blog-6042401333290389565.post-80006551027610199172012-12-16T19:35:00.003+00:002013-02-16T22:09:25.241+00:00Approaching a working version of Arduino Solar MonitorChristmas is coming, so I have to get a working version of the <a href="http://nerdytoad.blogspot.co.uk/2012/12/arduino-based-solar-panel-power-monitor.html">Arduino Solar Panel monitor</a>.<br />
<br />
My original intent was that it would have the following features:<br />
<br />
<ol>
<li>Measure the collector differential temperature.</li>
<li>Infer the water flow rate from pump speed.</li>
<li>Calculate the instantaneous power being collected.</li>
<li>Calculate hourly and daily average powers.</li>
<li>Log this information to an SD card.</li>
<li>To achieve (5) easily, derive the time from an external Real-Time-Clock (RTC).</li>
</ol>
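For feature 3, the instantaneous power follows from the flow rate and the differential temperature via the specific heat of water. A sketch (the flow rate here is an illustrative value, not one inferred from my pump):

```python
CP_WATER = 4186.0    # specific heat of water, J/(kg.K)

def collector_power_w(flow_kg_per_s, delta_t_c):
    """Instantaneous collected power: P = m_dot * cp * dT."""
    return flow_kg_per_s * CP_WATER * delta_t_c

# e.g. 0.02 kg/s (about 1.2 l/min) with a 10 degC rise across the panel:
power_w = collector_power_w(0.02, 10.0)
```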
<div>
Now, features 1-3 are working, phew. Feature 4 is implemented, but I need to think about what I really want (is daily average power useful, or should I just integrate the total heat collected in a day?).</div>
<div>
<br /></div>
<div>
Features 5 and 6 are proving troublesome, as I think I am starting to get to the limit of a single Arduino board (or more precisely the ATMega 328 controller on the board).</div>
<div>
<br /></div>
<div>
The main problem is that I am running out of RAM, and am going to have to give some serious thought to how to manage it better (just like old times programming a Zilog Z80A.....).</div>
<div>
<br /></div>
<div>
One problem is the number of different interfaces (and hence libraries) that I am having to use to achieve this. The base software uses:</div>
<div>
<ol>
<li>OneWire.h and DallasTemperature.h to do the temperature monitoring, using a One-Wire bus.</li>
<li>LiquidCrystal.h to drive the LCD display, using parallel data transfer</li>
</ol>
<div>
To add SD card support and a real time clock, I will need:</div>
</div>
<div>
<ol>
<li>Wire.h and DS1307RTC.h to access the real time clock from an I2C interface.</li>
<li>SD.h to access the SD card from a SPI interface.</li>
</ol>
<div>
Each library uses a bit of RAM, and there is only 2 kB of SRAM on the chip, so I am running out of it rapidly. When I tried to add the RTC code, the board rebooted every few seconds, which I think was an out-of-memory issue.</div>
</div>
<div>
<br /></div>
<div>
So, the de-scoped system is not going to do SD card logging. As compensation, I have added switches on the two spare digital lines to provide a simple user interface, so you can scroll between instantaneous, hourly and daily data, and maybe even set the clock (though I do worry about running out of RAM again if I get too adventurous!).</div>
<div>
<br /></div>
<div>
Given this, I have put the 'Version 1' hardware together, and mounted it in a cheap 2 gang socket pattress box with a blank cover cut out to hold the board:</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhvnY1M711fQ7cCWGj4uWMgze2aRVMfHoU5V6PtrWxDvSnjLxEnVe-R981OmOLZq-3TGSghaQzZYIxoQUrGHlkkjmYkjU6kDBgDkyf07x2KhUOhe17gKeQLgfBEuMNsNc9R2VDzTGvPJ6A/s1600/IMGP1771.JPG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhvnY1M711fQ7cCWGj4uWMgze2aRVMfHoU5V6PtrWxDvSnjLxEnVe-R981OmOLZq-3TGSghaQzZYIxoQUrGHlkkjmYkjU6kDBgDkyf07x2KhUOhe17gKeQLgfBEuMNsNc9R2VDzTGvPJ6A/s320/IMGP1771.JPG" width="320" /></a></div>
<div>
The toggle switch on the front is for the display back-light; I thought a toggle would be easier than a push-button, given that the push-buttons are used for the interface. The new buttons face the bottom of the picture, on the side of the front panel, so you can't see them in this photo (and I made a mess of cutting the holes for them, so it looks a bit ugly...).</div>
<div>
<br /></div>
<div>
Right, just got to sort out the software now. Current version is on <a href="https://github.com/jones139/arduino-projects/tree/master/solThMon">github</a>.</div>
<div>
<br /></div>
Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com0tag:blogger.com,1999:blog-6042401333290389565.post-40453993057234909082012-12-09T15:19:00.001+00:002012-12-09T15:19:15.657+00:00Odd Behaviour of ArduinoI think I have got too used to working on large computers, which have essentially infinite resources, as far as my little projects are concerned, so I am having a bit of trouble with Arduino. The two interesting problems I have seen are:<br />
<br />
<br />
<ol>
<li>Low battery: The symptoms were the device operating fine for a while (initially around an hour, but later ~5 minutes), then doing very unexpected things: the pin 13 LED flashed on and off, even though my software was not doing anything with that pin. It looked like the board had just lost its marbles. The issue turned out to be low voltage on the 9V battery powering it. I did not spot this initially because the LED back-light on the display still worked fine, which was my test for 'it has power'. When I put a voltmeter on the battery, it was only providing around 5.3V, which I think was insufficient to start the Arduino properly, yet was surprisingly still enough to light the LED. I had always assumed that LEDs need more power than small electronic devices, so that if the LED is OK there is enough power; this is not true, so I must think of another simple check!</li>
<li>Out of memory: My <a href="https://github.com/jones139/arduino-projects/tree/master/solThMon">solar thermal monitor</a> now works nicely with an LCD display, and I made a simple test program to write data to an SD card. The odd thing is that merging the two together produces a program that compiles and loads onto the device fine, but when it tries to write to the SD card the Arduino restarts (I had worse effects earlier, when it would not start at all until I removed some code in the setup function that wrote to the SD card). I think it must be running out of memory, but I need to do some work to check this... I will update this once I have fixed it.</li>
</ol>
Graham Joneshttp://www.blogger.com/profile/05191458206760305309noreply@blogger.com0