Aerodynamic testing at the Purbeck School Science Fair

tl:dr – Build LEGO models, test them in our Scale Wind Tunnel, then refine and improve!

At the recent Purbeck School Community Science Fair, I was fortunate enough to be running a Scale Wind Tunnel manufactured by Clive Evans at Scale Engineering.

These are truly magnificent pieces of engineering, and we were very lucky to be able to use one for the day.

Our version had been outfitted with a LEGO-studded test bed, which allows for rapid testing and prototyping of LEGO models.  As you can see from the picture above, the tunnel is a finely-engineered beast.

 

On the left of the tunnel is a laminar air intake, which forces the air through a honeycomb aluminium structure to reduce turbulence.  Once through this, the diameter of the tunnel reduces sharply, which increases the airspeed through the test bed (by conservation of mass, the airspeed scales inversely with the cross-sectional area).
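As a back-of-envelope illustration of why the contraction speeds the air up (the numbers here are made up for the example, not the tunnel's real dimensions):

# For roughly incompressible flow, continuity gives A1*v1 = A2*v2:
# what goes in must come out, so a smaller cross-section means faster air.
def speed_after_contraction(v1, d1, d2):
    """Airspeed after the tunnel narrows from diameter d1 to d2."""
    return v1 * (d1 / d2) ** 2  # area scales with diameter squared

# Example: halving the diameter quarters the area and quadruples the speed.
print(speed_after_contraction(4.5, 1.0, 0.5))  # -> 18.0 (m/s)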

The LEGO bed is attached to two load sensors, as can be seen below.  One senses movement in the vertical direction, indicating the lift force in newtons; the other senses the drag generated by the model, also in newtons.  The machine outputs the readings to two digital 7-segment LED displays, which are accurate to 0.01 N.  The air exits via the exhaust side, driven by a mains-powered air-conditioning fan.  This is adjustable up to 18 m/s at top speed, so the tunnel can really push the limits of most LEGO models.

It is very hard to visualise the airflow directly without some seriously dense smoke (which would set off the fire alarms and aggravate asthma), but we have cotton thread on a wand that can be traced around the outside of a model to visualise the airflow over it.

After the initial setup in the school library, I opened up my Big Box o' LEGO (TM), featuring many classic pieces from the golden era of 1980s space LEGO, infused with Dexter’s more contemporary LEGO Star Wars pieces.  We even took my younger son’s DUPLO airplane, as seen above.

The basic procedure was:

(1) Build a model, get it tested in the wind tunnel

(2) Record the Drag and Lift Values from the tunnel

(3) Refine your design, retest.

(4) Repeat.

We had an incredible variety of models made, from the austere and efficient to the beautifully sculpted and adjustable masterpieces.  Over 45 different models were made and refined over the course of the day, involving multiple rounds of testing.

The best models of the day tended to have the highest lift-to-drag ratio; the day's best scored 1.09 N of lift against 0.38 N of drag (a ratio of about 2.9). We discussed how you could make them more efficient, including using flat pieces to reduce drag around the studs and removing unnecessary fandangles and gizmos.

The crowd that gathered was equal parts boys and girls, and equal parts children and adults.  LEGO is a great leveller, and many families had to pull away their participants in order to see the amazing things on offer elsewhere in the Science Fair.

This close-up shows the control unit, with readouts as well as the test bed. Models to be tested are also lined up on the bottom. That’s me twiddling knobs.
This was a particularly sassy model, with adjustable angle of attack for the wings. The builder decided that inverting the wings would reduce drag. He was right!
I was constantly surprised at the amazing variety of designs that people were able to make out of my limited pool of LEGO.
Two different solutions to aerospace design: reduce the surface friction, or go for maximum lift.
The fan running at 70% power. This was, of course, off limits to members of the public in the rare case of a ‘testing to destruction’ event.
This gives an idea of the beautiful engineering that went into the creation of this incredible wind tunnel. Thanks to Clive Evans for the loan of the wind tunnel.

Getting the Phenoptix MeArm to work with ScratchGPIO

tl:dr

Buy a kit, or build your own robotic arm from the plans available from Phenoptix.  Connect it to a Pi using an Adafruit I2C servo controller board.  Control it using ScratchGPIO.  Get creative.  This is perfect for KS2 or KS3 students who want a build challenge, but aren’t quite ready for the challenge of coding the robot control in Python.

Background

I bought a MeArm from Phenoptix before the Christmas break, as my STEM club kids had wanted to build a Mars Rover-style robot with a robotic arm.  I thought I’d start small and work my way up to building my own, so I bought a retail kit from the Phenoptix website after being really impressed with the V3 at the Poole Raspberry Pi Jam 2014.

The kit is sold by Phenoptix (www.makersify.com) and comes with everything needed to build the robot, including laser-cut acrylic parts, M3 nuts and bolts for fixing, and 4x 9g Turnigy servo motors.  They also entered the design into the Hackaday I/O contest and open-sourced the hardware, so you can roll your own if you need to.  This makes it an amazing choice for school STEM projects if your school has access to a laser cutter or a 3D printer, as you can make the robot for less than £20 and a dig around in the parts bin.  Controlling the robot is left up to you: there are build instructions for Arduino, Pi and Beaglebone (and presumably ninja skills with 555 timers and pots) as the control methods.  The code for all of these is hosted on GitHub.

Our own laser-cut MeArm parts (well, they’ve been removed from the acrylic), as shared by Ben on Thingiverse.
The build was moderately difficult for KS3/KS4 students, and was not helped by the sometimes obtuse instructions on www.instructables.com.  The STEM club were going to try to document a clearer build – I think the instructions could use a different colour of acrylic to make the steps clearer, or perhaps some 3D or wireframe diagrams like an IKEA flatpack build!  One of our improvements was to add engraved numbers to adjoining edges, so 2 fits with 2, 5 with 5, etc.  But then again, solving problems is part of the whole STEM process!

The Build

My initial kit had the incorrect number of parts, so the build stalled about 2/3 of the way through, but a quick email to the website meant they dispatched replacement parts in about 4 days.  It’s really important to get the initial positions of the servos correct during the build, as getting it wrong can have two consequences:

  • Having to disassemble the robot to correct the servo position in order to get the full range of movement. (A pain.)
  • If the base servo is misaligned, the robot can end up pushing against the supports and burning out your servos. The claw servo can also burn out easily if you haven’t set limits in software and have misaligned the servo before assembly.
    Robots building robots. Skynet!
    The retail-kit MeArm is in the background; our own version is in the foreground. I watched them build it without setting the servo positions correctly and kept quiet.  Then they worked it out for themselves, and disassembled, reset and rebuilt without my help!

Once assembled, the robot looks pretty awesome, but it won’t be able to move without precisely timed PWM signals generated from a Pi, Arduino, Beaglebone or a servo controller unit.  Each servo requires a PWM pin, +5 V and ground (hobby servos typically expect a 50 Hz signal whose 1–2 ms pulse width sets the angle).

Connections to the GPIO

This how-to is for a Raspberry Pi Model B or B+ (not a Pi 2), using Adafruit’s 16-channel servo controller board and ScratchGPIO 7.

I decided to use the Adafruit 16-channel servo controller.  It works over I2C, and so only requires 4 pins from the GPIO.  You will have to enable I2C on your Pi first, by following the tutorial online at the Adafruit Learning Centre.

1. Assemble the Adafruit board by soldering in the header pins.

2. Enable I2C on your Pi by following the instructions at the Adafruit Learning Centre.  This is just editing a config file and then rebooting the Pi.

3. Attach the breakout pins on the Adafruit board to the correct GPIO pins as shown below.

The pins we need are physical pin 1 (3V3), pin 3 (SDA / GPIO2), pin 5 (SCL / GPIO3) and pin 6 (GND).
This picture is from a B+ GPIO.

Make sure that the Pi is turned off before attaching pins to the controller board!

VCC is attached to 3V3 (pin 1).  This provides the 3.3 V supply for the controller chip.

GND is attached to GND (pin 6).  This provides the ground reference for the controller chip at 0 V.

SDA is attached to SDA (pin 3).  This is the Serial Data pin, where the I2C commands that set each servo channel's PWM are sent to the board.

SCL is attached to SCL (pin 5).  This is the Serial Clock pin, which carries the timing pulses for I2C master-slave communication.

The correct pins are on the header on the left-hand side: the GND, SDA, SCL and VCC pins are the ones we want.
Once everything is connected, you can check that the Pi has detected the board with i2cdetect, as described below.

Connecting the servos to the board.

Connect each servo to the board correctly:

Make sure that the black wire is connected to ground, the red wire to V+, and the white wire to PWM.

I connected the servos to the board as follows:

  • Base servo: Channel 0
  • ‘Shoulder’ servo: Channel 1
  • ‘Elbow’ servo: Channel 2
  • ‘Gripper’ servo: Channel 3

I usually have a whole bunch of female-female header leads for experiments with Raspberry Pis, and I use solid jumper leads to connect these to the servo leads.  You should thread the leads through the centre of the arm to keep them tidy, and to minimise snagging whilst the robot is moving.  You’ll need to extend the servo leads of the micro servos in order to prevent any tension whilst the robot is in motion.  I’ve got 20 cm jumper leads bought on Amazon, and can highly recommend them for STEM projects.

The Adafruit 16-channel 12-bit I2C board allows the Pi to control the 4 servos using I2C over the GPIO.  You need to use an external 5 V power source to power this board, otherwise the Pi will brown out whenever the servos are moving.  You can damage your Pi if you try to drive 4 servos from the GPIO.  I used an external bench power supply to make sure the servos got a constant voltage supply.

Make sure that you connect the servos to the Adafruit board, the Adafruit board to the GPIO, and the external supply to the Adafruit board whilst the Pi is powered down.  Check, check and check all your connections again.  The GPIO can be a fickle beast, and connecting it incorrectly can damage the Pi.  Be nice to your Pi.

Once you reboot your Pi, the Adafruit board should light up, and you can check that the Pi can see the I2C device with

 sudo i2cdetect -y 1

(use 0 instead of 1 on an early revision 1 Pi).  You should get output showing that it has detected your device (mine was at 0x40).
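The output is a grid of bus addresses, something like this (abridged); 0x40 is the default address of the PCA9685 chip on this board:

     0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f
00:          -- -- -- -- -- -- -- -- -- -- -- -- --
...
40: 40 -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
...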

Connecting the Adafruit Servo Board to ScratchGPIO.

Install ScratchGPIO on the Pi using

 wget http://bit.ly/1wxrqdp -O isgh7.sh

followed by

sudo bash isgh7.sh

ScratchGPIO7 should then be installed.  If you don’t get a link on the desktop, you might need to dig around in /etc/ to locate the scratchGPIO7.sh file.

Start ScratchGPIO.  The program should automagically detect the i2c device if it can be seen using the i2cdetect command.

Set up some variables:

AdaServo0 – channel 0, the base servo; only send it values from -90 to 90.

AdaServo1 – channel 1, the shoulder servo; only send it values from 0 to 70.

AdaServo2 – channel 2, the ‘elbow’ servo; only send it values from 20 to 70.

AdaServo3 – channel 3, the gripper servo; only send it values from 0 to 30.

These values all assume that you have assembled the arm according to the instructions.  You might find that your ranges are slightly different.

You can choose to increment the values of the variables using the arrow keys, or you might create a generator script which automatically scans through the values for each variable.  We had a ‘panic’ function that set the robot to a home state with a single button press if things got hairy!

Here’s a video of it in action:

Be careful not to exceed the maximum values for the servos.  The 9g servos have relatively weak plastic gear trains, and can strip their gears easily!

We’ll share our Scratch code, and will post our Python code as soon as we can get coding again.
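In the meantime, here’s a minimal sketch of the sort of thing the Python version might look like, using Adafruit’s old Adafruit_PWM_Servo_Driver library for this board (the pulse mapping and home-pose angles are illustrative guesses, not our calibrated values):

# A rough sketch, not our final code: drive the MeArm's four servos
# through the Adafruit 16-channel board using the legacy library.
import time
from Adafruit_PWM_Servo_Driver import PWM

pwm = PWM(0x40)     # the I2C address reported by i2cdetect
pwm.setPWMFreq(50)  # standard 50 Hz servo frame

# Safe angle ranges from the ScratchGPIO section above; yours may differ.
LIMITS = {0: (-90, 90), 1: (0, 70), 2: (20, 70), 3: (0, 30)}

def set_angle(channel, angle):
    lo, hi = LIMITS[channel]
    angle = max(lo, min(hi, angle))         # clamp to protect the gear trains
    pulse_ms = 1.5 + (angle / 90.0) * 0.5   # ~1.0-2.0 ms over +/-90 degrees
    ticks = int(pulse_ms / 20.0 * 4096)     # 20 ms frame, 12-bit resolution
    pwm.setPWM(channel, 0, ticks)

def panic():
    """Return everything to a (made-up) home pose with one call."""
    for channel, angle in [(0, 0), (1, 35), (2, 45), (3, 15)]:
        set_angle(channel, angle)
        time.sleep(0.2)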

Polargraph is go!

For Science and Engineering Week, I thought it would be inspiring to make a giant polargraph machine that would be left to draw all day at school.  They occupy a perfect space at the intersection of art, maths and technology.

The Penrose triangle. I drew this using vector drawing mode, and generated the file using StippleGen2 from EvilMadScientist labs. ❤

What is a Polargraph?

A polargraph is a line drawing machine that uses two fixed stepper motors to move a hanging gondola that holds the pen.  The machine usually draws pictures using a continuous line.

Why is it called a polargraph?

The machine uses a polar co-ordinate system to work out where the pen is, and how far to move the motors in order to draw lines: the two string lengths from the fixed motors define the pen’s position, and the software converts between these and Cartesian x-y co-ordinates.
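Here’s the geometry in a hedged sketch (my own illustration, not Sandy Noble’s actual code): with the motors a distance d apart, the two string lengths a and b pin down the pen position by Pythagoras.

import math

def strings_to_xy(a, b, d):
    """String lengths a (left motor) and b (right motor) to x-y,
    origin at the left motor, y measured downwards."""
    x = (d * d + a * a - b * b) / (2 * d)  # from the two right triangles
    y = math.sqrt(max(a * a - x * x, 0.0))
    return x, y

def xy_to_strings(x, y, d):
    """The inverse: string lengths needed to put the pen at (x, y)."""
    return math.hypot(x, y), math.hypot(d - x, y)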

How does it work?

The Polargraph controller takes a .png, .gif or .jpg image as input, and works out the brightness of different areas in a grid.  If an area is dark, the machine tells the Arduino to signal the motors to put the pen lines close together, producing a dark shaded pixel.  If an area is light, the machine tells the motors to put the lines further apart.
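As a toy illustration of that idea (not the actual Polargraph shading code): map each pixel’s brightness to the spacing between pen passes.

def line_spacing(brightness, min_gap=1.0, max_gap=8.0):
    """Map brightness (0 = black, 255 = white) to the gap in mm
    between pen passes: darker areas get tighter, denser lines."""
    return min_gap + (brightness / 255.0) * (max_gap - min_gap)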

What can it draw?

It can draw any bitmap image that it is given, and it works best with high-contrast images.  It is quite versatile and can draw in a variety of ways: it can scribble, it can draw circles, or it can build pictures up in squiggles or perfect square waves.  The curved bands are due to the way the motor instructions are generated, and to which point in the picture is fixed.  It can also draw vector files directly, as they are path-based images.

How did I build it? 

I followed the truly incredible Instructable from Sandy Noble.  It’s one of the best that I’ve ever followed – the instructions are clear, the walkthroughs are detailed to just the right level, he links to all the influences he used for the project, and it produces amazing results.

Polargraph Community

Although Sandy has been inspired by others, including the Der Kritzler machine, he really has nurtured a community around polargraphs and v-plotters through a great forum and excellent communication with customers and interested people.  He has also innovated in the area of v-plotters, and has really enabled the community to move forward by open-sourcing his designs and software.  It’s entirely possible to buy ready-made plotters from the Polargraph store, but I really wanted to roll my own and follow the Instructable.

Mark 1 – (prototype) 

I built a comedy home version at first, with the stepper motors velcroed directly onto the top of the board.  This had an unexpected advantage: there was a bit of play in the mounting, which absorbed some of the resonance in the cables.

I built it using an Arduino Uno (legit!) and an Adafruit Motor Shield V1 clone from eBay (only because the real one isn’t made any more).  I don’t have access to a 3D printer or a laser cutter at home, so I tried to make a home-grown solution for the gondola, but ended up buying a kit from the Polargraph store.

Problems that I had while building the prototype machine and how I fixed them:

1. Motors turned backwards – the steppers were wired in reverse. This became apparent when I moved my Arduino from the front of the machine to the rear between revisions!

2. Gondola moved off the edge of the page when drawing – I hadn’t set the machine size and uploaded it to the Arduino before starting.  This is a really important step that has to be done every time, or saved into the default_properties config file.

3. Pen would slip in the picture – my counterweights were hitting the floor because the strings were too long.  This was solved by shortening the blind cords, but in future I will try a pulley system to stop the cables vibrating and resonating.

4. Stepper motors were getting crazy hot – I stole some heatsinks from a PC graphics card and added them to both the motor driver ICs on the motor board and the tops of the steppers.  I also drove them at a lower voltage and reduced the max speed.  This solved the problem nicely.

5. Pen was lifting from the paper, causing incomplete pictures – I worked out a technique for getting the paper as flat as possible using masking tape.  In future, I’ll use low-tack spray mount to attach the paper flat to the board, and then mask the outsides.

6. Servo failing to lift – this one took me a while to track down.  My servo had come loose from the gondola, and was failing to lift in certain parts of the image.  This is a major headache if your machine has spent 6 hours drawing a flawless image, only for the servo to unstick and then draw a line across the middle of it.  Solution: use a stronger adhesive, or change the gondola.

Comedy Mk1 version.  Gaffer tape all round, counterweights made from bags of coins, stepper motors held with gaffer and velcro.  Amazing that it could still draw, all things considered.  This was my second ever image.

V1 took me about 8 hours in total to construct, and more importantly about 8 hours to troubleshoot.  Getting to know the software is really important.  Once I’d worked out what worked and what didn’t (and how to get the best out of my pens, paper and machine), I set my sights on the largest machine that I could build and transport.

Polargraph Mark II. 

It took about 4 hours to build, including a hinged A-frame so that I can change the angle of the board, and prop it up in public spaces without too much hassle.

The machine measures 1100 mm × 1800 mm (the largest board I could fit into my car).

The only significant changes I made were that the machine size is effectively tripled; the electronics are all hidden at the back (a bit of a shame for a public machine, I thought, but it reduces tinkering and breakage – I might make a perspex enclosure for all the goodies); and I drilled a 12 mm hole in the bottom centre of the board to run the servo cable through from the bottom.  This serves to pull the gondola parallel to the board.

Subjects

Portraits seem to be the most popular polargraph subjects, as they are universally recognisable, and we have a significant amount of subconscious brainpower dedicated to face recognition.  I tried to go for a couple of portraits, but also tried to get kids talking about science, maths, art and technology through the pictures.

I’ve been plotting various famous scientists, impossible shapes, anatomy pictures and other random pieces for the foyer at school.

Madame Curie. The picture looks a bit tatty towards the bottom left: the paper was not sitting flat, and the pen got clogged with paper fibres.  I still love the detail in the top of the image. I’d like to repeat this one under optimum conditions.  Draw time: 11h!

Reception

Impossible shapes – Part 2. I used an image, then used Inkscape to trace vectors. This .svg file was exported to the Polargraph and then drawn. Draw time: 35m

The machine was almost universally well received, especially for portraits of people.  It generates so many questions from curious students, and there seems to be a constant phalanx of children in front of it, walking back and forth to try to make the more abstract images resolve with distance.  It’s one of the most satisfying builds I’ve done, as it produces such amazing works of art.

My next project is to get the machine to run from a Raspberry Pi, and to try to emulate the amazing work of the blackstripes.nl or gocupi projects (who are ‘inspired by’ Sandy Noble’s work themselves).

I’ll add a time-lapse video when I have time to make a full-length one of a long drawing!

Microscopy – with added Pi

tl:dr – I hacked open an Intel QX3 toy microscope, replaced the crummy sensor with a Pi camera, installed RPi_Cam_Web_Interface on it, and now have a wireless electronic microscope that you can control using a web interface.  It’s a great little teaching tool; the camera can live with the microscope permanently, while the Pi can be detached and used for other projects.

An Intel QX3 microscope with a Raspberry Pi camera inside. The Model B is mounted on top temporarily whilst I work on fitting an A+ inside the microscope body.

Background: When the education pyramid for my county was collapsed into two tiers (primary and secondary only – middle schools were closed down), some of the science equipment was redistributed to other schools.  We got a bunch of the Intel/Mattel QX3 microscopes, which were ‘legacy equipment’ and hadn’t been used in years (sneeze alert).  I saw these sad microscopes and thought they could be revitalised with some ‘Pi inside’ (TM).

Back when they were released, they boasted fairly good specifications, including 10x, 60x and 200x optical magnification, USB connectivity, a CCD sensor with a native resolution of 320×240, and incandescent illuminators above and below the stage.  They are great toy microscopes and work well for the price, although the drivers haven’t been updated for Windows in years, and the software really only works reliably with XP.  This also means that if you do get an old XP box/laptop, you have to use the heinous Digital Blue software bundled with the camera, which stinks of bloat and wacky menus.

Getting it functioning: Mac OS X has a working solution with macam, but it doesn’t allow direct control of the illuminators, and my school doesn’t have any OS X machines to hand.  There is a solution for Linux machines, as described in this excellent blog post here.  Since it works on a Linux distro, and video4linux identifies the camera as a gspca (CPiA) webcam, it should work on a Pi, since V4L is baked into the kernel of Debian.  There is a great blog post here detailing how to get it working with Camorama.  Naturally, I plugged it straight into a Pi via USB and managed to get it working in X using Camorama, using the following commands.

sudo apt-get update

sudo apt-get upgrade

sudo apt-get install v4l-utils camorama mplayer ffmpeg

On running Camorama, it worked, but it was really disappointing.  The resolution was low, the framerate from the webcam was sucky, the sensor had a huge amount of noise, and it crashed after about 2 minutes every time.  I persevered with getting it to work, but my coding skills weren’t up to the challenge of getting V4L2 to play more nicely with the camera (although I did manage to get the illuminator to turn on and off using the command below):

v4l2-ctl -c illuminator_1=0

At this stage, I was ready to say that the cams were working again – we had a solution that allowed kids to get images from them and use them in education, but it was clunky, and didn’t offer that good an experience.

Option 2: Void that warranty

So, I removed the sensor board, stuck a Pi camera inside, set up a streaming webcam with controls, and it’s awesome.  I haven’t added GPIO-controlled LED lights yet, but it’s a huge improvement over the previous resolution with the Camorama option.  The steps were as follows:

  • Detach the microscope from its stand.
  • Unscrew the back by undoing the six screws on the back of the microscope.
  • Disassemble the microscope.  Carefully take apart the 4 main pieces (back plate, front plate, base, microscope body).  I was really careful to avoid getting dust inside the optical assembly, which seemed fairly well sealed.
  • Unclip the cables: the USB and illumination ribbon cables from the control board.  (You might need a screwdriver here; they are glued.)
  • Drill out the USB cable from the back plate of the microscope (I couldn’t loosen the glue, so had to resort to persuasion).
Drilling out the USB cable holder at the back of the microscope. It was held with adhesive far stronger than the surrounding plastic, and I risked shattering the microscope casing.
  • Unscrew the control board.  There are 3 screws with maddening red plastic glue over them to prevent fiddling.  I cut this gunk away with a razor blade first, then unscrewed them fully.  It’s well worth tampering with them.
  • Modify the Pi camera by unscrewing the lens completely (use tweezers, and use a craft knife to break the glue seal).  This allows more light onto the sensor, and is far better for microscopy and telescope use.  You can reverse this and set the lens focus back to infinity afterwards, but if you are not careful, you could risk damaging your Pi camera module.  There’s a nice tutorial here on how to do it.
I replaced the CCD sensor module by unscrewing the glued screws and mounting the camera module in exactly the same place. I used a servo horn to hold the camera in place and prevent dust entering the sensor.
      • Align the Pi camera over the old sensor hole.  You can see this in the photo above.
      • Live-test using raspistill to position the camera module where the old sensor was, and to test focal length, focus, etc.  You might need to move the Pi camera further away from the lens assembly to make sure that the stage can focus when you adjust the stage height with the knobs.
        raspistill -o image.jpg

        This command will show a preview screen if you are in X.  You should see the red light come on; then look at the file in your home directory and make adjustments.

    • Affix the Pi camera in place.  I used a medium servo horn (it just had the correct spacing for the screw holes), but any other piece of plastic or wood would do.  Metal is a bad choice: it will short the connections on the back of the camera module, likely resetting the Pi.  It’s possible to 3D print or laser cut a bracket to the correct size, but I didn’t have access to those at home.
    • Feed the ribbon cable out through the USB cable hole.  In time I’ll probably Sugru it in place, once I’ve found the sweet spot of replacement flex cable length and where I want the Pi.  Longer = more flexibility (no pun intended).
You can see the Pi camera mounted inside the microscope casing, and the flex cable is routed out of the original USB cable hole.  The Model B is mounted on top, caseless, for the moment.
  • Reassemble the microscope by putting all of the bits back together.  (Make sure the camera module is aligned so that it has a light path straight to the stage – initially I put it in back to front and couldn’t see anything, and went through a painstaking process of checking camera connections and rebooting before I realised that it was a basic physics fail.)
  • Install the amazing RPi_Cam_Web_Interface on the Pi.  (There is excellent documentation on the elinux site, with installation instructions.)  This gets an MJPEG feed from the Pi camera and streams it to a web server at the Pi’s address.  The web interface has awesome customisation options and allows you to capture stills, video and time-lapses through a browser.
  • Add a USB wi-fi dongle to the Pi and power it up via battery or mains power.
  • Browse to the IP address of your Pi on another computer.  (You can find the address by typing ifconfig into the terminal, or by watching the messages that appear during startup.)

At this stage, it works, and it can be used by anyone who knows how to point a browser at an IP address, providing you have a wireless connection.  You could probably plug in a monitor and keyboard and browse to localhost on the Pi itself to control it (although I haven’t tried), but a wireless microscope is SO much cooler!

Next steps:
-Use a Pi A+ and see whether I can secrete the workings into the base, so it is an invisible mod apart from the camera ribbon cable.
-Replace the bulbs with LEDs, control them via GPIO, and add buttons to the RPi_Cam script (see the sketch below).
-Remove the reboot and shutdown buttons from the script, or at least password protect them.
-Write a proper instructable for people who like to void their warranty with style.

Naturally, not having access to the school wifi makes it a bit more tricky, but you can still plug in a keyboard and monitor via HDMI, and you will be able to access the webcam interface at

 http://localhost/

in a browser on the Pi.
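For the planned LED illuminators, here’s a minimal sketch of the idea (the pin number is an illustrative assumption, and it’s untested on the microscope):

# Switch a stage illuminator LED (with series resistor) wired to GPIO 18.
import RPi.GPIO as GPIO

GPIO.setmode(GPIO.BCM)
GPIO.setup(18, GPIO.OUT)

def illuminator(on):
    """Turn the stage illuminator LED on or off."""
    GPIO.output(18, GPIO.HIGH if on else GPIO.LOW)

illuminator(True)  # lights on for a capture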

Here’s a sample of 3 zoomed images downloaded from the Camera browser interface.

Ladybird at 10x magnification. I found it inside the window casings in my lab. 😦
60x magnification of ladybird showing detail of the compound eye.
This is a focus-stacked jpg made in Photoshop.  I had to correct a red cast on the photos, and focus-stacked 5 images, hence the ganky edges around the frame.

Slow motion chemistry: A real storm in a teacup

Following on from my slow-motion combustion reactions, which were really popular for classroom teaching, I thought I’d try the same formula with another crowd-pleasing reaction: neutralisation with universal indicator on a magnetic stirrer.

The first reaction was a real beauty, but there was lots of turbulence from air bubbles, caused by the height of the liquid in the conical flask and the high speed of the stirrer.

I decided to film in portrait orientation.  Although this gives large black bars when viewed online, it gave a really good view of the conical flask and the beautiful swirling purple colours as I added the sodium hydroxide.  It’s a really good talking point for classroom discussions about neutralisation: What’s going on?  Why doesn’t it all change at the same time?  Why don’t the liquids mix instantly?  What will happen next?  Why did it go blue?

This was the least successful in terms of colour changes, but the vortex is really beautiful and engulfs the stirring bar. I love the final colour.

Genie in a bottle. 

I looked up a recipe from the RSC demonstration handbook for ‘dancing flames‘, which is a reaction between aluminium foil and copper chloride solution.  The acidified solution eats away at the oxide layer, exposing the aluminium for reaction and then generating sweet hydrogen gas for the exploding.  Watch the video for the burn:

This is a great experiment that allows a really good and deep discussion of reactivity, and also flame tests.  It’s applicable to KS3-5, and is a great point of focus as a demonstration with the lights off.
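For that discussion, the underlying chemistry is standard displacement (my summary, not quoted from the RSC handbook): the copper displacement provides the heat and colour, and the acid attacking the freshly exposed aluminium provides the hydrogen.

2Al(s) + 3CuCl2(aq) → 2AlCl3(aq) + 3Cu(s)
2Al(s) + 6HCl(aq) → 2AlCl3(aq) + 3H2(g)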

MeArm update – Work in progress.

Here’s an update of what we’ve been up to in the last 4 weeks of STEM club.

Team MeArm.

We initially built the excellent MeArm from a Phenoptix kit, priced £29.99.  It’s a nice robot with 4 degrees of freedom that can be controlled via Arduino, Raspberry Pi, Beaglebone, etc.  We managed to get it working really nicely with a Raspberry Pi and ScratchGPIO, Simon Walters’ excellent version of Scratch for the Pi.  The challenge for the group is to control the arm using Python.  A Wii Nunchuk and a Microsoft Kinect were both suggested, but early experiments have shown that keyboard input is challenging enough!

Robots building robots.  Skynet!
Progress on our MeArm build. The kit version is in the background, and our JCB-yellow version in the foreground.  We’re about 2 weeks away from movement control!

Once the robot is mechanically sound and the servo positions are set, we should be able to connect it via I2C using the Adafruit 16-channel servo board.  Once that’s done, we’ll release the Python geeks, and let them run wild with their programming and imagination.

Running an in-house Scratch Jam.

Having run a really successful Pi Scratch Jam in my school for 30 Year 9 students, I thought I’d share some ideas, tips and resources from our day.  It involved a lot of planning, prep and problem solving, and we had a surprise visit from Ofsted that day!  They were really impressed with the problem solving, creativity and resilience that the kids showed, as well as the engagement and enjoyment all round.  I was really impressed with the quality of the finished projects.

We’d visited Bournemouth University to try their Scratch Jam, where our kids scooped top prize, and they asked us to run an in-house version.

The brief was fairly simple: Connect Scratch to the real world and Create something Fun.
This was the original brainstorm that Mike and I did on wrapping paper.

All great ideas start out as a sketch.

Originally we’d drafted eight different challenges:
-Make a giant interactive boardgame.
-Make an interactive floor piano.
-Create an e-reader hidden inside a real book.
-Create an interactive Nerf firing range.
-Create a fruit-based controller for Pong.
-Create a skeleton-sensitive Scratch game using a Kinect.
-Create a sound- and light-controlled game.
-Make a tilt-sensitive burglar alarm.

(We ditched the burglar alarm due to the kids’ unfamiliarity with the GPIO, and we felt the e-reader was too challenging, as we couldn’t source a suitable screen at the time.)  We hashed out a quick table of projects and made a shopping list, together with what we thought the programming difficulty would be like, including any Python handlers or requisitions we needed to make.
Project WRAP

We also made a list of what we’d require from IT services, including projectors, portable speakers, phono leads, 4-way extensions etc.  All that stuff makes the biggest difference on the day.

In the end, we used a mixture of Pis and Windows PCs running Scratch.  We used Makey Makeys, a Kinect and a PicoBoard to do the real-world connecting.  (We’ll run a Pi/GPIO challenge next time, but we decided to start with something they already knew.)

Here’s my tips list, in no particular order.

  1. Work backwards – Make sure you have a clear goal state that you want the students to achieve in their groups. Work backwards from this to structure your challenges.
  2. Use code experts to help out with the code challenges in groups – Get KS4 or KS5 students to help out. Teach them to ask questions, not provide ready-made solutions.
  3. Enable flexibility in the challenge – The kids love to be able to solve problems their own way, not a pre-prescribed way.
  4. Provide structure to the challenges – We used programming goals, construction goals and creative goals.
  5. Reward problem solving and teamwork – This means that you are embedding and rewarding the skill sets that you want to see develop.
  6. Have prizes and certificates pre-organised – (Our prize fund dwindled spectacularly, so we had to improvise.)
  7. Mix up the teams a little – I had a STEM team who knew how to code for the Kinect really well in Scratch, so I gave them a project they were unfamiliar with. Their ideas were much better as a result.
  8. Share the judging criteria in advance – If they know what to do in advance, they can plan for it at the start of the day, and there are fewer last-minute inclusions in the code, something which often breaks it at the crucial moment.
  9. Fuel – Biscuits were welcome. Malted Milks were particularly prized for the combination of low crumb, high flavour.
  10. Build in time for a showcase – Sharing their creations is paramount. If they don’t feel like their input is valued or shared, they won’t value the whole activity. We had an hour’s showcase, with a visiting class and the Senior Team from the school. Watching them jump around with the Kinect-controlled shooter was priceless.
  11. Bookend the day with a purpose, and a link to real STEM careers – I chose to link the day to problem solving, and also to different interface designs. We tried to get industry speakers and visitors to come and judge, but we had a last-minute cancellation.
  12. Share all projects online – We uploaded all of the kids’ code to the Scratch website to showcase their work. For future Code Jams, we’ll upload our code to GitHub.

There were some technical issues, but our school’s technicians are really supportive, and we’d made sure to put our requisitions in the week before, so there was not much unexpected hardware faffing to do.

The quality of the final builds and programs was incredible.  The winning team had built a giant version of Operation called Dogeration which, unsurprisingly, had a veterinary twist.  They hooked BBQ tongs up to the Makey Makey with wires, and had different scoring and penalties based on how many times you touched the sides of the cavities.  Their prize for the day was to get their project made into a permanent laser-cut wooden version (still under construction).

Here’s the final document that was printed and shared with the groups on the day: it covered most of what they needed to know, and they could research their code or Instructables using the links.

The Great Purbeck Scratch3 (Word Document – Sorry!)

If you do use it, then give us a shout and tell us how your Jam went down!

I hope this helps other teachers to take the leap and run their own Scratch Jam in-house.  Our next step is to run this in our feeder primary schools, using our students as the code experts to build their skills in mentoring and helping others come to their own solutions to problems.