These are truly magnificent pieces of engineering, and we were very lucky to be able to use one for the day.
Our version had been outfitted with a Lego-studded test bed, which allows for rapid testing and prototyping of Lego models. As you can see from the picture above, the tunnel is a finely engineered beast.
On the left of the tunnel is a laminar air intake, which forces the air through a honeycombed aluminium structure to reduce the turbulence. Once through this, the diameter of the tunnel reduces sharply, increasing the airspeed through the test bed.
The LEGO bed is attached to two load sensors, as can be seen below. One senses movement in the vertical direction, which indicates a lift force in newtons, and the other senses the drag generated by the model, also in newtons. The machine outputs the readings to two digital 7-segment LED displays, which are accurate to 0.01 N. The air exits via the exhaust side, driven by a mains-powered air-conditioning fan. This is adjustable up to 18 m/s at top speed, so the tunnel can really push the limits of most Lego models.
It is very hard to directly visualise the airflow without some seriously dense smoke (which would set off the alarms and aggravate asthma), but we have cotton thread on a wand that can be used to trace the airflow around the outside of a model.
After the initial setup in the school library, I opened up my Big Box o’ Lego (TM), featuring many classic LEGO pieces from the golden era of 1980s space Lego, infused with Dexter’s more contemporary LEGO Star Wars pieces. We even took my younger son’s Duplo aeroplane, as seen above.
The basic procedure was outlined:
(1) Build a model, get it tested in the wind tunnel
(2) Record the drag and lift values from the tunnel
(3) Refine your design, retest.
We had an incredible variety of models made, from the austere and efficient to the beautifully sculpted and adjustable masterpieces. Over 45 different models were made and refined over the course of the day, involving multiple rounds of testing.
The best models of the day tended to have the best lift-to-drag ratio, the winner measuring 1.09 N of lift against 0.38 N of drag. We discussed how you could make them more efficient, including using flat pieces to reduce drag around the studs and removing unnecessary fandangles and gizmos.
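Ranking the models by lift-to-drag ratio is simple enough to script if you want pupils to do the comparison themselves; a minimal Python sketch using the day's best readings:

```python
def lift_to_drag(lift_n, drag_n):
    """Return the lift-to-drag ratio from the tunnel's readings in newtons."""
    if drag_n == 0:
        raise ValueError("drag must be non-zero")
    return lift_n / drag_n

# The day's best model: 1.09 N of lift for 0.38 N of drag
print(round(lift_to_drag(1.09, 0.38), 2))  # -> 2.87
```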
The crowd that gathered was equal parts boys and girls, and equal parts children and adults. LEGO is a great leveller, and many families had to pull away their participants in order to see the amazing things on offer elsewhere in the Science Fair.
Buy, or build, your own robotic arm from the plans available from Phenoptix. Connect it to a Pi using an Adafruit I2C servo controller board. Control it using ScratchGPIO. Get creative. This is perfect for KS2 or KS3 students who want a build challenge but aren’t quite ready to code the robot control in Python.
I bought a MeArm from Phenoptix before the Christmas break, as my STEM club kids had wanted to build a Mars Rover style robot with a robotic arm. I thought I’d start small and work my way up to building my own so I bought a retail kit from the phenoptix website after being really impressed with the V3 at the Poole Raspberry Pi Jam 2014.
The kit is sold by Phenoptix (www.makersify.com) and comes with everything needed to build the robot, including laser-cut acrylic parts, M3 nuts and bolts for fixing, and 4x 9g Turnigy servo motors. They also entered the design into the Hackaday.io contest and open-sourced the hardware, so you can roll your own if you need to. This makes it an amazing choice for school STEM projects if your school has access to a laser cutter or a 3D printer, as you can make the robot for less than £20 and a dig around in the parts bin. Controlling the robot is left up to you: there are build instructions for Arduino, Pi and Beaglebone (and presumably ninja skills with 555 timers and pots) as the control methods. The code for all of these is hosted on GitHub.
The build was moderately difficult for KS3/KS4 students, and was not helped by the sometimes obtuse instructions on www.instructables.com. The STEM club were going to try to document a clearer build – I think the parts could be cut in different-coloured acrylic to make the steps clearer, or perhaps some 3D or wireframe instructions could be added, like an IKEA flatpack build! One of our improvements was to add engraved numbers to adjoining edges, so 2 fits with 2, 5 with 5, etc. But then again, solving problems is part of the whole STEM process!
My initial kit had the incorrect number of parts, so the build stalled about 2/3 of the way through but a quick email to the website meant they dispatched replacement parts in about 4 days. It’s really important to get the initial positions of the servos correct in the build, as it can have two consequences:
Having to disassemble the robot to correct the servo position in order to get the full range of movement. (A pain)
If the base servo is misaligned, the robot can end up pushing against the supports and burning out your servos. The claw servo can also burn out easily if you haven’t set limits in software and misaligned the servo before assembly.
Once assembled, the robot looks pretty awesome, but it won’t be able to move without precisely timed PWM signals generated by a Pi, Arduino, Beaglebone or a dedicated servo controller unit. Each servo requires a PWM pin, +5 V and ground.
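To make the "precisely timed" part concrete: a typical hobby servo expects a pulse every 20 ms (50 Hz), and the width of that pulse, usually somewhere around 1–2 ms, sets the angle. A Python sketch of the mapping (the 1.0–2.0 ms endpoints are typical values, not the MeArm servos' measured limits):

```python
def servo_pulse_ms(angle_deg, min_ms=1.0, max_ms=2.0):
    """Map an angle in [-90, 90] degrees to a pulse width in milliseconds.

    The pulse repeats every 20 ms (50 Hz); only its width changes.
    Real servos vary, so treat the endpoints as tunable assumptions.
    """
    if not -90 <= angle_deg <= 90:
        raise ValueError("angle out of range")
    return min_ms + (angle_deg + 90) / 180 * (max_ms - min_ms)

print(servo_pulse_ms(0))  # centre position -> 1.5
```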
Connections to the GPIO
This how-to is for a Raspberry Pi Model B or B+ (not a Pi 2), using Adafruit’s 16-channel servo controller board and ScratchGPIO 7.
1. Assemble the Adafruit board by soldering in the header pins
2. Enable I2C on your Pi by following the instructions at the Adafruit Learning Centre. This is just a case of editing a config file and then rebooting the Pi
3. Attach the breakout pins on the Adafruit board to the correct GPIO pins as shown below.
This picture is from a B+ GPIO.
Make sure that the Pi is turned off before attaching pins to the controller board!
VCC is attached to 3V3 (physical pin 1). This supplies 3.3 V for the controller chip’s logic.
GND is attached to Ground (physical pin 6). This provides the 0 V reference for the controller chip.
SDA is attached to SDA (physical pin 3). This is the serial data line, over which the PWM settings are sent to the controller.
SCL is attached to SCL (physical pin 5). This is the serial clock line, which carries the timing pulses for I2C master-slave communication.
Run the i2cdetect command and you should get output showing that it has detected your device (mine was at 0x40).
Connecting the servos to the board.
Connect each servo to the board correctly:
Make sure that the black wire is connected to ground, the red wire to V+, and the white wire to PWM.
I connected the servos to the board as follows:
Base servo: Channel 0
‘Shoulder’ servo: Channel 1
‘Elbow’ servo: Channel 2
‘Gripper’ servo: Channel 3
I usually have a whole bunch of female-female header leads for experiments with Raspberry Pis, and I use solid jumper leads to connect these to the servo leads. Thread the leads through the centre of the arm to keep them tidy and to minimise snagging whilst the robot is moving. You’ll need to extend the leads of the micro servos in order to prevent any tension whilst the robot is in motion. I bought 20 cm jumper leads on Amazon and can highly recommend them for STEM projects.
The Adafruit 16-channel 12-bit I2C board allows the Pi to control the 4 servos using I2C over the GPIO. You need to use an external 5 V power source to power this board, otherwise the Pi will brown out whenever the servos are moving. You can damage your Pi if you try to drive 4 servos from the GPIO. I used an external bench power supply to make sure the servos got a constant voltage.
Make sure that you connect the servos to the Adafruit board, the Adafruit board to the GPIO and the External Supply to the Adafruit board whilst the Pi is powered down. Check, check and check all your connections again. The GPIO can be a fickle beast, and connecting it incorrectly can damage the Pi. Be nice to your Pi.
Once you reboot your Pi, you should see the Adafruit board light up, and you can check that the Pi can see the I2C device with the i2cdetect command.
Connecting the Adafruit Servo Board to Scratch GPIO.
Install ScratchGPIO on the pi using
wget http://bit.ly/1wxrqdp -O isgh7.sh
sudo bash isgh7.sh
ScratchGPIO7 should then be installed. If you don’t get a link on the desktop, you might need to dig around in /etc/ to locate the scratchGPIO7.sh file.
Start ScratchGPIO. The program should automagically detect the i2c device if it can be seen using the i2cdetect command.
Set up some variables:
AdaServo0 – This is channel 0, the base servo, and should only be sent values ranging from -90 to 90
AdaServo1 – This is channel 1, the shoulder servo, and should only be sent values ranging from 70 to 0
AdaServo2 – This is channel 2, the ‘elbow’ servo, and should only be sent values ranging from 20 to 70
AdaServo3 – This is channel 3, the gripper servo, and should only be sent values from 0 to 30.
These values are all assuming that you have assembled the arm according to instructions. You might find that your ranges are slightly different.
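When the group graduates to Python, the same limits can be enforced in code before anything reaches the board. A sketch assuming the legacy Adafruit_PWM_Servo_Driver Python library and the channel assignments above; the hardware calls are commented out because they need the actual board, and the pulse endpoints would need tuning for your servos:

```python
# Per-channel angle limits, mirroring the ScratchGPIO variable ranges above
SERVO_LIMITS = {0: (-90, 90), 1: (0, 70), 2: (20, 70), 3: (0, 30)}

def clamp(channel, angle):
    """Keep an angle inside the safe range for its channel."""
    lo, hi = SERVO_LIMITS[channel]
    return max(lo, min(hi, angle))

def angle_to_ticks(angle, min_ms=1.0, max_ms=2.0, freq_hz=50, resolution=4096):
    """Convert an angle in [-90, 90] to the PCA9685's 12-bit 'off' tick count."""
    pulse_ms = min_ms + (angle + 90) / 180.0 * (max_ms - min_ms)
    period_ms = 1000.0 / freq_hz
    return int(round(pulse_ms / period_ms * resolution))

# On the Pi, with the board attached (assumed legacy library):
# from Adafruit_PWM_Servo_Driver import PWM
# pwm = PWM(0x40)
# pwm.setPWMFreq(50)
# pwm.setPWM(0, 0, angle_to_ticks(clamp(0, 45)))  # base servo to 45 degrees
```

Clamping in software is what stops the gripper and base servos burning out against their end stops, as described earlier.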
You can choose to increment the values of the variables using the arrow keys, or you might create a generator script which automatically scans through the values for each variable. We had a ‘panic’ function that set the robot to a home state with a single button press if things got hairy!
Here’s a video of it in action:
Be careful not to exceed the maximum values for the servos. The 9g servos have relatively weak plastic gear trains, and can strip their gears easily!
We’ll share our Scratch code, and will post our Python code as soon as we can get coding again.
For Science and Engineering week, I thought it would be inspiring to make a giant Polargraph machine that would be left all day to draw at school. They occupy a perfect space at the intersection of art, maths and technology.
What is a Polargraph?
A polargraph is a line drawing machine that uses two fixed stepper motors to move a hanging gondola that holds the pen. The machine usually draws pictures using a continuous line.
Why is it called a polargraph?
The machine uses a polar co-ordinate system to calculate where the pen is, and how far to move the motors in order to draw lines. (It can convert polar co-ordinates to Cartesian x-y co-ordinates)
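If you want to show pupils the maths, the two conversions can be sketched in a few lines of Python (assuming the motors sit at the two top corners a known distance apart, with y measured downwards from the line between them):

```python
import math

def cord_lengths(x, y, motor_spacing):
    """Forward: Cartesian pen position -> cord lengths paid out by each motor."""
    a = math.hypot(x, y)                  # cord from the left motor at (0, 0)
    b = math.hypot(motor_spacing - x, y)  # cord from the right motor
    return a, b

def pen_position(a, b, motor_spacing):
    """Inverse: cord lengths -> Cartesian pen position, by triangulation."""
    x = (a**2 - b**2 + motor_spacing**2) / (2 * motor_spacing)
    y = math.sqrt(max(a**2 - x**2, 0.0))
    return x, y
```

The real Polargraph firmware does this continuously, stepping each motor so the cord lengths follow the path of the drawing.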
How does it work?
The Polargraph controller takes a .png, .gif or .jpg image as the input and works out the brightness of different areas in a grid. If an area is dark, the machine tells the Arduino to signal the motors to put the pen lines close together, producing a dark shaded pixel. If an area is light, the machine tells the motors to put the lines further apart.
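The idea can be sketched in a couple of lines of Python; the gap values here are illustrative, not the ones the Polargraph software actually uses:

```python
def line_spacing(brightness, min_gap_mm=1.0, max_gap_mm=8.0):
    """Map a pixel brightness (0 = black, 255 = white) to a gap between
    pen lines: dark areas get lines packed close together."""
    if not 0 <= brightness <= 255:
        raise ValueError("brightness must be in 0-255")
    return min_gap_mm + (brightness / 255) * (max_gap_mm - min_gap_mm)

print(line_spacing(0))    # black pixel -> dense shading
print(line_spacing(255))  # white pixel -> sparse lines
```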
What can it draw?
It can draw any bitmap image that it is given. It works best with high-contrast images. It is quite versatile and can draw in a variety of ways – It can scribble, it can draw circles, or it can build pictures up in squiggles or perfect square waves. The curved bands are due to the way that the motor instructions are generated, and which point is fixed in the picture. It can draw vector files directly, as they are path-based images.
How did I build it?
I followed the truly incredible Instructable from Sandy Noble. It’s one of the best that I’ve ever followed – the instructions are clear, the walkthroughs are detailed to just the right level, he links to all the influences he used for the project, and it produces amazing results.
Although Sandy was inspired by others, including the Der Kritzler machine, he really has nurtured a community around polargraphs and V-plotters through a great forum and excellent communication with customers and interested people. He has also innovated in the area of V-plotters and enabled the community to move forward by open-sourcing his designs and software. It’s entirely possible to buy ready-made plotters from the Polargraph store, but I really wanted to roll my own and follow the Instructable.
Mark 1 – (prototype)
I built a comedy home version at first, with the stepper motors velcroed directly onto the top of the board. This had an unexpected advantage: there was a bit of play in the mounting, which absorbed some of the resonance in the cables.
I built it using an Arduino Uno (legit!) and an Adafruit Motor Shield V1 clone from eBay (only because the real one isn’t made any more). I don’t have access to a 3D printer or a laser cutter at home, so I tried to make a home-grown gondola, but ended up buying a kit from the Polargraph store.
Problems that I had while building the prototype machine and how I fixed them:
1. Motors turned backwards – the steppers were wired in reverse. This became apparent when I moved my Arduino from the front of the machine to the rear between revisions!
2. Gondola moved off the edge of the page when drawing – I hadn’t set the machine size and uploaded it to the Arduino before starting. This is a really important step that has to be done every time, or saved into the default_properties config file.
3. Pen would slip in picture – My counterweights were hitting the floor as the strings were too long. This was solved by shortening the blind cables, but I think in future I will try and use a pulley system to stop the cables vibrating and resonating.
4. Stepper motors were getting crazy hot – I stole some heatsinks from a PC graphics card and added them to both the motor driver ICs on the motor board and the tops of the steppers. I also drove the motors at a lower voltage and reduced the maximum speed. This solved the problem nicely.
5. Pen was lifting from the paper, causing incomplete pictures – I worked out a technique for getting the paper as flat as possible using masking tape. In future, I’ll use low-tack spray mount to attach the paper flat to the board, and then mask the outsides.
6. Servo failing to lift. – It took me a while to find this one out. My servo was loose from the gondola, and was failing to lift in certain parts of the image. This is a major headache if your machine has spent 6 hours drawing a flawless image, only for the servo to unstick, and then draw a line across the middle of the image. Solution – Use a stronger adhesive or change the gondola.
V1 took me about 8 hours in total to construct and, more importantly, about 8 hours to troubleshoot. Getting to know the software is really important. Once I’d worked out what worked and what didn’t (and how to get the best out of my pens, paper and machine), I set my sights on the largest machine that I could build and transport.
Polargraph Mark II.
The Mark II took about 4 hours to build, including a hinged A-frame so that I can change the angle of the board and prop it up in public spaces without too much hassle.
The machine measures 1100 mm x 1800 mm (the largest board I could fit into my car).
The only significant changes I made were that the machine size is effectively tripled; the electronics are all hidden at the back (a bit of a shame for a public machine, I thought, but it reduces tinkering and breakage – I might make a perspex enclosure for all the goodies); and I drilled a 12 mm hole centre-bottom in the board to run the servo cable through from the back. This serves to pull the gondola parallel to the board.
Portraits seem to be the most popular polargraph subjects, as they are universally recognisable, and we have a significant amount of subconscious brainpower dedicated to face recognition. I tried to go for a couple of portraits, but also tried to get kids talking about science, maths, art and technology through the pictures.
I’ve been plotting various famous scientists, impossible shapes, anatomy pictures and other random pieces for the foyer at school.
The machine was almost universally well received, especially for portraits of people. It generates so many questions from curious students, and there seems to be a constant phalanx of children in front of it walking back and forth to try and make the more abstract images resolve with distance. It’s one of the most pleasing builds in terms of satisfaction, as it produces such amazing works of Art.
My next project is to get the machine to run from a Raspberry Pi, and to try and emulate the amazing work of the blackstripes.nl or gocupi projects (who are ‘inspired by’ Sandy Noble’s work themselves).
I’ll add a time-lapse video when I have time to make a full-length one of a long drawing!
The majority of my work at the moment is supporting ICT teachers who want to introduce and develop Computing within their own curriculum ahead of the changes planned for September 2014. Throughout my work with other teachers, I’ve been sharing some of the pedagogic devices and strategies I’ve been using with the aim of ensuring that their teaching of Computing is engaging and inspiring.
One such game that I’ve developed with my classes is called ‘Sabotage’. I’ve discovered that it can be used in a whole variety of ways, but to help you develop an understanding of it I will attempt to describe just one simple example for you.
I found when I first moved from teaching Scratch (a visual programming language) to Python (a text-based programming language) that children became very frustrated with the high numbers of syntax errors which prevented their scripts from working. This created…
tl;dr: I hacked open an Intel QX3 toy microscope, replaced the crummy sensor with a Pi camera, installed RPi_Cam_Web_Interface on it, and now have a wireless electronic microscope that you can control using a web interface. It’s a great little teaching tool; the camera can live with the microscope permanently, while the Pi can be detached and used for other projects.
Background: When the education pyramid for my county was collapsed into two tiers (primary and secondary only – middle schools were closed down), some of the science equipment was redistributed to other schools. We got a bunch of the Intel/Mattel QX3 microscopes, which were ‘legacy equipment’ and hadn’t been used in years (sneeze alert). I saw these sad microscopes and thought they could be revitalised with some ‘Pi inside’ (TM). Back when they were released, they boasted fairly good specifications, including 10x, 60x and 200x optical magnification, USB connectivity, a CCD sensor with a native resolution of 320×240, and incandescent illuminators above and below the stage. They are great toy microscopes and work well for the price, although the drivers haven’t been updated for Windows in years, and the software really only works reliably with XP. This also means that if you do get hold of an old XP box or laptop, you have to use the heinous Digital Blue software bundled with the camera, which stinks of bloat and wacky menus.

Getting it functioning: Mac OS X has a working solution in macam, but it doesn’t allow direct control of the illuminators, and my school doesn’t have any OS X machines to hand. There is a solution for Linux machines, described in an excellent blog post here. Since it works on a Linux distro, and video4linux identifies the camera as a gspca (CPiA) webcam, it should work on a Pi, since V4L is baked into the kernel of Debian. There is a great blog post here detailing how to get it working with Camorama. Naturally, I plugged it straight into a Pi via USB and managed to get it working in X using Camorama.
On running Camorama, it worked, but it was really disappointing: the resolution was low, the framerate from the webcam was sucky, the sensor had a huge amount of noise, and it crashed after about 2 minutes every time. I persevered, but my coding skills weren’t up to the challenge of getting V4L2 to play more nicely with the camera, although I did manage to get the illuminator to turn on and off using:
v4l2-ctl -c illuminator_1=0
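If you want to script the illuminators rather than type the command each time, the call can be wrapped in Python (a sketch; it assumes v4l2-ctl is on the PATH and uses the control name shown above):

```python
import subprocess

def illuminator_cmd(index, on):
    """Build the v4l2-ctl command for one of the QX3's illuminators."""
    return ["v4l2-ctl", "-c", "illuminator_%d=%d" % (index, 1 if on else 0)]

def set_illuminator(index, on):
    """Run the command on the Pi (needs the microscope plugged in)."""
    subprocess.check_call(illuminator_cmd(index, on))

# e.g. set_illuminator(1, True) to switch the illuminator on
```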
At this stage, I was ready to say that the cams were working again – we had a solution that allowed kids to get images from them and use them in education, but it was clunky and didn’t offer that good an experience.

Option 2: Void that warranty. So, I removed the sensor board, stuck a Pi camera inside, and set up a streaming webcam with controls – and it’s awesome. I haven’t added GPIO-controlled LED lights yet, but it’s a huge improvement over the previous resolution with the Camorama option. The steps were as follows:
Detach Microscope from stand
Unscrew the back by undoing the six screws on the back of the microscope.
Disassemble microscope. Carefully take apart the 4 main pieces of the webcam (back plate, front plate, base, microscope body) I was really careful to avoid getting dust inside the optical assembly, which seemed fairly well sealed.
Unclip the cables: the USB and illumination ribbon cables from the control board (you might need a screwdriver here – they are glued).
Drill out the USB cable from the back plate of the microscope (I couldn’t loosen the glue, so had to resort to persuasion).
Unscrew the control board. There are 3 screws with maddening Red Plastic glue over them to prevent fiddling. I cut this gunk up with a razor blade first, then unscrewed them fully. It’s well worth tampering with them.
Modify the Pi camera by unscrewing the lens completely (use tweezers, and a craft knife to break the glue seal). This allows more light onto the sensor, and is far better for microscopy and telescope use. You can reverse this and set the lens focus back to infinity afterwards, but if you are not careful, you risk damaging your Pi camera module. There’s a nice tutorial here on how to do it.
Align the Pi camera over the old sensor hole. You can see this in the photo above.
Live-test using raspistill to position the camera module where the old CMOS sensor was, and to check focal length, focus etc. You might need to move the Pi camera further from the lens assembly to make sure that the stage can focus when you adjust its height with the knobs.
raspistill -o image.jpg
This command will generate a preview screen if you are in X. You should see the camera’s red light come on; the image is saved in your home directory, where you can check it and make adjustments.
Affix the Pi camera in place. I used a medium servo horn (it just happened to have the correct spacing for the screw holes); any other piece of plastic or wood would do. Metal is a bad choice – it will short the connections on the back of the camera module, likely resetting the Pi. It’s possible to 3D print or laser cut a bracket to the correct size, but I didn’t have access to those at home.
Feed ribbon cable out through the usb cable hole. In time I’ll probably Sugru it in place when I’ve found the sweet spot of replacement flex cable length and where I want the Pi. Longer = more flexibility (no pun intended)
Reassemble the microscope by putting all of the bits back together. (Make sure the camera module is aligned so that it has a straight light path to the stage – initially I put it in back to front and couldn’t see anything, and went through a painstaking process of checking camera connections and rebooting before I realised that it was a basic physics fail.)
Install the amazing RPi_Cam_Web_Interface on the Pi. (There is excellent documentation on the elinux site, with installation instructions.) This gets an MJPEG feed from the Pi camera and streams it to a web server at the Pi’s address. The web interface has awesome customisation options and allows you to capture stills, video and time-lapses through a browser.
Add a usb wi-fi dongle to Pi and power it up via battery or mains power.
Browse to the IP address of your Pi on another computer (you can find this by typing ifconfig into the terminal, or by watching the messages that appear during startup).
At this stage it works, and it can be used by anyone who knows how to point a browser at an IP address, providing you have a wireless connection. You could probably plug in a monitor and keyboard and browse to localhost on the Pi itself to control it (although I haven’t tried), but a wireless microscope is SO much cooler!

Next steps:
- Use a Pi A+ and see whether I can secrete the workings into the base, so it is an invisible mod apart from the camera ribbon cable.
- Replace the bulbs with LEDs, control them via GPIO, and add buttons to the RPI_cam script.
- Remove the reboot and shutdown buttons from the script, or at least password protect them.
- Write a proper instructable for people who like to void their warranty with style.

Naturally, not having access to school wifi makes it a bit more tricky, but you can still plug in a keyboard and monitor via HDMI, and you will be able to access the web cam interface at
in a browser on the Pi.
Here’s a sample of 3 zoomed images downloaded from the Camera browser interface.
Following on from my slow-motion combustion reactions, which were really popular for classroom teaching, I thought I’d try the same formula with another crowd-pleasing reaction: neutralisation with universal indicator on a magnetic stirrer.
The first reaction was a real beauty, but there was lots of turbulence from air bubbles caused by the height of the liquid in the conical flask and the high speed of the stirrer.
I decided to film in portrait orientation. Although this gives large black bars when viewed online, it gave a really good view of the conical flask and the beautiful swirling purple colours as I added the sodium hydroxide. It’s a really good talking point for classroom discussions about neutralisation: What’s going on? Why doesn’t it all change at the same time? Why don’t the liquids mix instantly? What will happen next? Why did it go blue?
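For the whiteboard, the reaction behind the colour change can be written out. A sketch assuming hydrochloric acid was the acid in the flask (the same ionic equation holds for any strong acid):

```latex
% Overall equation (HCl assumed):
\mathrm{NaOH(aq) + HCl(aq) \rightarrow NaCl(aq) + H_2O(l)}

% Ionic equation driving the indicator's colour change:
\mathrm{H^+(aq) + OH^-(aq) \rightarrow H_2O(l)}
```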
This was the least successful in terms of colour changes, but the vortex is really beautiful and engulfs the stirring bar. I love the final colour.
Genie in a bottle.
I looked up a recipe from the RSC demonstration handbook for ‘dancing flames’, which is a reaction between aluminium foil and copper chloride solution. The acidified solution eats away at the oxide layer, exposing the aluminium for reaction and then generating sweet hydrogen gas for the exploding. Watch the video for the burn:
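My reading of the chemistry, sketched as equations (check the RSC handbook for the definitive version):

```latex
% Displacement: aluminium is more reactive than copper
\mathrm{2Al(s) + 3CuCl_2(aq) \rightarrow 2AlCl_3(aq) + 3Cu(s)}

% With the oxide layer gone, the acidified solution attacks the
% exposed aluminium, producing the hydrogen that burns:
\mathrm{2Al(s) + 6HCl(aq) \rightarrow 2AlCl_3(aq) + 3H_2(g)}
```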
This is a great experiment that allows a really good and deep discussion of reactivity, and also flame tests. It’s applicable to KS3-5, and is a great point of focus as a demonstration with the lights off.
Here’s an update of what we’ve been up to in the last 4 weeks of STEM club.
We initially built the excellent MeArm from a Phenoptix kit, priced £29.99. It’s a nice robot with 4 degrees of freedom that can be controlled via Arduino, Raspberry Pi, Beaglebone etc. We managed to get it working really nicely with a Raspberry Pi and ScratchGPIO, Simon Walters’ excellent version of Scratch for the Pi. The challenge for the group is to control the arm using Python. It was suggested we use a Wii Nunchuk, or perhaps a Microsoft Kinect, but early experiments have suggested that keyboard input is challenging enough!
Once the robot is mechanically sound and the servo positions are set, we should be able to connect it via I2C using the Adafruit 16-channel servo I2C board. Once that’s done, we’ll release the Python geeks and let them run wild with their programming and imagination.