Here’s an update of what we’ve been up to in the last 4 weeks of STEM club.
We initially built the excellent MeArm from a phenoptix kit, priced £29.99. It’s a nice robot with four degrees of freedom that can be controlled via Arduino, Raspberry Pi, BeagleBone etc. We managed to get it working really nicely with a Raspberry Pi and ScratchGPIO, Simon Walters’s excellent version of Scratch for the Pi. The challenge for the group is to control the arm using Python. A Wii Nunchuk and even a Microsoft Kinect were suggested as controllers, but early experiments have shown that keyboard input is challenging enough!
Once the robot is mechanically sound and the servo positions are set, we should be able to drive it over I2C using the Adafruit 16-channel servo board. Once that’s done, we’ll release the Python geeks and let them run wild with their programming and imagination.
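To give a flavour of the keyboard-control challenge, here’s a minimal sketch of the logic (an assumed design for illustration, not the club’s actual code — the key bindings, step size and servo names are all made up). Each keypress nudges one servo by a fixed step, clamped to a safe range; on the Pi, the resulting angles would then be written out over I2C, e.g. with Adafruit’s servo driver library.

```python
# Sketch of keyboard control for a 4-DOF arm (hypothetical key map).
# On real hardware, each updated angle would be sent to the servo
# board over I2C after every keypress.

STEP = 5                      # degrees moved per keypress
MIN_ANGLE, MAX_ANGLE = 0, 180

# key -> (servo name, direction)
KEYMAP = {
    "a": ("base", -1), "d": ("base", +1),
    "w": ("shoulder", +1), "s": ("shoulder", -1),
    "i": ("elbow", +1), "k": ("elbow", -1),
    "o": ("grip", +1), "l": ("grip", -1),
}

def apply_key(angles, key):
    """Return a new dict of servo angles after one keypress."""
    angles = dict(angles)
    if key in KEYMAP:
        servo, direction = KEYMAP[key]
        new = angles[servo] + direction * STEP
        angles[servo] = max(MIN_ANGLE, min(MAX_ANGLE, new))
    return angles

arm = {"base": 90, "shoulder": 90, "elbow": 90, "grip": 90}
arm = apply_key(arm, "d")   # base nudged from 90 to 95
arm = apply_key(arm, "s")   # shoulder nudged from 90 to 85
```

The clamping matters in practice: pushing a hobby servo past its end stops is one of the quickest ways to burn it out.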
Bournemouth University’s STEM outreach programme came to visit Purbeck School on Monday, and brought with them 15 Lego Mindstorms EV3 kits. The kits had been made into pre-assembled robots with a large servo motor for lateral movement and a medium motor to lift the arm assembly. They had ultrasound sensors for distance sensing, and an IR sensor for manual control via the IR beacon.
The brief of the day was simple: write code for a robot to play a game of ping pong against another robot. The challenge turned out to require problem solving, mathematics, iteration, collaboration, failure, resilience and sabotage.
The challenge was really well structured. The students learned the basics of connecting, writing code blocks and downloading them to the brick, and within about 20 minutes of starting, most groups had control of the servo motors; shortly afterwards, they were able to add another loop to their program to control the flipper. They had to figure out the correct number of degrees to rotate the arm for an efficient flip!
The code challenges were well thought out, and got the students to learn the rudiments of controlling the robot: thinking about how far the robots had to move by setting rotation limits on the motors, and using logic and loops to make the robots respond to the infrared remotes. They had time in the test arena to make sure that the robots were responsive to input and could hit the ball with the paddle. There was a key trade-off between the power and the speed of the paddle, and the students had to experiment to find the sweet spot.
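The “how many degrees?” question the students wrestled with boils down to one line of arithmetic. A sketch (the gear ratio and flip angle here are assumptions for illustration, not measurements from the day):

```python
# If the arm is geared down from the motor, the motor shaft must turn
# further than the arm itself. motor degrees = arm degrees * gear ratio.

def motor_degrees(arm_degrees, gear_ratio):
    """Degrees the motor must rotate for the arm to move arm_degrees."""
    return arm_degrees * gear_ratio

# e.g. with a 3:1 reduction, a 60-degree paddle flip needs the motor
# to turn 180 degrees
flip = motor_degrees(60, 3)
```

Setting this as a rotation limit on the motor block is what stops the paddle over-travelling on each flip.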
Customisation was also a large part of the day, with some groups recording audio samples, drawing their own pictures, or playing short musical sequences at the press of a button. Three groups had ‘entrance moves’ and intimidating aggressive moves.
Once the matches began, it became evident that there were issues with IR interference from other groups since the EV3 kits limit you to 4 channels, and even in a large room, there was significant crosstalk between the groups. (There was also a good amount of comedy sabotage to be had).
There were 20+ matches, the competitive element was very strong, and the quality of the sport got much better as the day went on. The single-member teams did particularly well, ranking in 3rd and 4th place even though they had no previous experience with the EV3 programming environment. There had to be an overall winner, and team Virginia Tech prevailed in the end.
We hope to be involved with Naomi and the STEM outreach team again. The feedback was really positive from the students, and this would work really well with a younger cohort of students.
As a teacher, I end up delivering more CPD than I actually attend myself. It was part of my job role in my last school, and I’ve continued to do it at Purbeck, but I was ridiculously excited to be invited to the Raspberry Pi Educators Workshop during the half term. My application was long and thorough, but I really didn’t think that I’d be invited. I’m looking forward to taking some crazy ideas with me, and speaking to the engineers at the foundation to see what is possible, especially with the new hardware in the form of the Raspberry Pi 2.
We plan to run workshops for our feeder schools using our set of Pis, and it will be incredible to be inspired by this course and to share what we have been doing so far at our STEM club.
Having already bought and assembled a MeArm kit from Makersify, I decided to use the open hardware files from Thingiverse to laser cut all the parts from 3mm acrylic.
The files are available on Thingiverse. If your school’s DT department is as awesome as mine, they’ll show you how to use the cutter so you can plan and cut your own designs.
I had some trouble with the .dxf files opening in 2D Design, so I printed the design out on paper first and did all of the measurements, comparing it to the real MeArm that I built, to make sure the scaling was perfect. (If not, the servos won’t fit in the holes, and the bolts won’t self-tap.)
I also cut a mountboard version for the students to see, and get to measure with vernier callipers to check the dimensions. (It turned out it had scaled wrong vertically, and the test-print would never have assembled as all the circular holes were ellipses!)
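The calliper check the students did can be captured in a few lines. This is a sketch of the idea (the tolerance value is my own assumption, not a measured requirement): a hole that should be circular is measured in both axes, and any disagreement reveals a scaling error.

```python
# Check a laser-cut hole for the vertical-scaling problem we hit:
# a circle that cuts as an ellipse means one axis has scaled wrongly.

def scale_error(width_mm, height_mm):
    """Ratio of height to width; 1.0 means a true circle."""
    return height_mm / width_mm

def is_circular(width_mm, height_mm, tolerance=0.02):
    """True if the hole is round to within the given tolerance."""
    return abs(scale_error(width_mm, height_mm) - 1.0) <= tolerance

# a nominal 3 mm hole squashed to 2.6 mm tall fails the check
print(is_circular(3.0, 2.6))   # the mountboard test piece
print(is_circular(3.0, 3.0))   # a correctly scaled cut
```

Catching this on cheap mountboard, rather than acrylic, is exactly why the test cut was worth doing.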
In order to build the robot, you’ll need a bunch of M3 bolts and some Turnigy 9g micro servos. I picked up a load of servos cheap on eBay, at £11.99 for 4, which seems like a good deal. (I would get a spare set, in case you burn the servos out during a build!)
I used namrick.co.uk to get small batches of the screws, as they were by far the best deal I could find in the UK, cheaper by about £25 than all other dealers.
Once the students had done the inventory, they began to follow the assembly instructions, starting with the baseplate. Here you can see that the longer M3 bolts are used like standoffs in the assembly. You can also see the mounting holes to bolt on an Arduino for controlling the servos.
We’ll keep you posted on the build as it happens, and then on how to control it with Scratch on the Raspberry Pi.
Microsoft Kinect units can be found second-hand on eBay for £20 or so, and they are STEM gold. There are so many amazing pieces of code out there that let you extract the joint info from the Kinect and make it available to other apps in the form of OSC, MIDI or raw data (NI mate is a commercial venture).
One of the best uses of a Microsoft Kinect I’ve seen is with Skanect, a piece of software that maps the video data onto the distance data to make accurate 3D models of small objects, people or rooms rapidly and cheaply. However, that’s quite an advanced STEM project, and I needed something that could use the power of the Kinect and combine it with the rapid prototyping and easily accessible code of Scratch.
Using a great piece of software developed by Stephen Howell called Kinect2Scratch (http://scratch.saorog.com/ for the download), it’s possible to import skeleton data from the Kinect directly into Scratch and code with it. You have to install the Kinect runtime for Windows first, but once that’s running, the Kinect2Scratch program runs in the background, extracts the joint information, and sends it to Scratch. (Setup guide here: http://scratch.saorog.com/setup.pdf)
Once Kinect2Scratch is working in the background, it just chugs away and sends skeleton tracking data to Scratch, with very few issues. The students’ reactions to having such profound input into Scratch were incredible. Within about 20 minutes, they had brainstormed at least 8 different ways of using joint input to make games, animations and presentations.
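Under the bonnet, tools like Kinect2Scratch talk to Scratch 1.4 over its remote sensors protocol: a TCP connection on port 42001, where each message is the text payload prefixed with its length as a 4-byte big-endian integer. Here’s a small sketch of how a sensor-update message is built (the joint name is just an example; sending would be a plain socket write to localhost:42001 with remote sensors enabled in Scratch):

```python
# Encode a Scratch 1.4 remote-sensors message: 4-byte big-endian
# length header followed by the UTF-8 payload, e.g.
#   sensor-update "HeadX" 240

import struct

def sensor_update(name, value):
    """Encode a sensor-update message for Scratch 1.4."""
    payload = 'sensor-update "%s" %s' % (name, value)
    data = payload.encode("utf-8")
    return struct.pack(">I", len(data)) + data

msg = sensor_update("HeadX", 240)
# msg[:4] holds the payload length; msg[4:] is the readable text
```

Once decoded, the value simply appears in Scratch as a sensor you can read in any script, which is what makes the whole thing feel so seamless to the students.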
It comes with some really great sample programs to get you started, but my favourite was the skeleton tracking. It uses some crafty code to draw a live skeleton in Scratch with very little lag, and puts Kitty Cat’s head on top.
Over the course of the Purbeck 2014 Scratch Jam, Team Kinect made a game from the ground up that used the chest-centre tracking point (around where the solar plexus is) to control the horizontal movement of a spaceship in a vertically scrolling shooter. This actually made playing the game a real workout: to avoid incoming meteorites, you need to move your entire body, ducking and jumping at times! The ship fires by raising the arms to prepare the lasers, and then bringing them below the shoulders to fire.
It’s a great game, with randomised meteorites, a good scoring mechanic, a menacing musical score and a great workout to boot.
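The two control ideas in the game can be sketched in a few lines of Python. This is a reconstruction from memory, not Team Kinect’s actual code, and the coordinate ranges are assumptions: the chest x-coordinate is mapped (and clamped) onto the stage, and the fire gesture is a tiny state machine — arm when both hands go above the shoulders, fire when they come back down.

```python
# Sketch of the Kinect game controls: steering by chest position,
# firing by a raise-then-lower arm gesture.

def ship_x(chest_x, kinect_min=-200, kinect_max=200,
           stage_min=-240, stage_max=240):
    """Map the chest tracking point onto the Scratch stage, clamped."""
    span = (chest_x - kinect_min) / (kinect_max - kinect_min)
    span = max(0.0, min(1.0, span))
    return stage_min + span * (stage_max - stage_min)

def should_fire(hands_y, shoulders_y, armed):
    """Arm the lasers when both hands rise above the shoulders;
    fire when they next drop below. Returns (fired, armed)."""
    hands_up = all(h > s for h, s in zip(hands_y, shoulders_y))
    if hands_up:
        return False, True        # hands up: arm, don't fire yet
    return armed, False           # hands down: fire only if armed

fired, armed = should_fire((10, 12), (5, 6), armed=False)  # arms raised
fired, armed = should_fire((0, 0), (5, 6), armed=armed)    # arms lowered
```

The same arm/fire pattern works for any gesture that needs a deliberate two-step motion rather than firing continuously while a condition holds.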
While teaching students about how radio signals are used to send information using FM, I decided to make a Raspberry Pi radio transmitter using the instructions from Makezine and Instructables, so that we could explore the range of transmission and the effects that different aerials would have on the radio signal.
The PiFM Python module was created by Oliver Mattos and Oskar Weigl of the Imperial College Robotics Society, and can broadcast 16-bit mono WAV sound files that can be played on any FM receiver.
Makezine made a pre-compiled install image that autoruns the transmitting software on boot, and will shuffle audio files that have been placed in the root directory. I’ll be honest, their video was a bit Hipsterrific for my liking, but it got me interested enough to tinker.
The basis behind the generation of the signal sounds simple enough: GPIO4 is modulated via PWM fast enough to produce a signal in the FM (megahertz) range. However, I quickly realised that it produces a lot of interference in nearby devices, with clear side bands in neighbouring frequencies. (Just tune the radio through the frequency range and you’ll find many sidebands.) Although the power output of the device is very low, it might still interfere with Emergency Service radio, and as we are located very close to a fire station, we decided to exercise caution!
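The modulation idea is easy to put in numbers. In FM, the audio doesn’t change the loudness of the carrier; it nudges the carrier frequency up and down in proportion to the sample value. A toy model (the carrier frequency is an example choice, not PiFM’s default; 75 kHz is the standard maximum deviation for broadcast FM):

```python
# Toy FM model: instantaneous frequency = carrier + deviation * sample,
# where sample is the audio value in the range -1.0 .. 1.0.

CARRIER_HZ = 103.3e6     # example broadcast frequency
DEVIATION_HZ = 75e3      # standard max deviation for broadcast FM

def instantaneous_freq(sample):
    """Frequency the transmitter should produce for one audio sample."""
    return CARRIER_HZ + DEVIATION_HZ * sample

# silence sits exactly on the carrier; full-scale audio swings +/- 75 kHz
f = instantaneous_freq(0.0)
```

The PWM output is a square wave rather than a clean sine, which is one reason the harmonics and sidebands splatter across neighbouring frequencies.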
With an FM radio right next to the Pi, it picked up the signal very clearly, but it dropped off within 3-5m. With a 15cm crocodile clip wire attached to GPIO4, the range jumped to a 20-30m radius, and was very clear.
Some builds have used duck antennae to boost the broadcast radius, but presumably this would also boost the strength of the side bands, and would drown out commercial stations at short range.
Extremely easy, and very quick to get results.
Minimal materials needed, and little expertise required for students. Easily replicable.
Wanting to illustrate combustion for GCSE chemistry using a classic demo with a new twist, I did a Whoosh bottle experiment filmed at 240fps. The standard experiment involves setting light to methanol, ethanol or propanol in a polycarbonate water cooler bottle, and the rapid exit/entrance of gases through the neck causes a whoosh.
Filming in slow motion allowed us to discuss in much greater detail what was going on during the reaction, and to work out that the actual reaction is far more complicated than a simple equation.
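For reference, the “simple equation” in question is the complete combustion of the alcohol; for methanol it balances as:

```latex
2\,\mathrm{CH_3OH} + 3\,\mathrm{O_2} \rightarrow 2\,\mathrm{CO_2} + 4\,\mathrm{H_2O}
```

The slow-motion footage makes it clear that this single line compresses a lot of real behaviour.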
The experiment was beautiful: filmed on an iPhone 6, and then slowed down in-app using SlowPro.
As an extra experiment, I then tried methanol in a volumetric flask, behind a safety screen in an empty lab, just in case.
A talking point and a teaching point.
Getting the lighting and safety aspects correct! (This is against CLEAPSS advice – never do this in an occupied lab.)
Using a Triggertrap to take a high-res photo during the combustion process.
Simultaneous combustion of methanol, ethanol and propanol to compare burn rates and completeness of combustion.