Lego drawing machine


We’ve been working on an art collaboration and have built a drawing machine based around a pantograph and a rotating turntable. It produces a Spirograph-style pattern, which can be tailored by changing the arm length, pivot point, rotor speeds and turntable speed. The build is a prototype, and can be replicated with Power Functions motors, RCX or NXT kits, or even old-school Technic kits. A build video will follow, but check the time lapse for a quick overview.
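The pattern family is the same one a Spirograph produces. Here's a minimal Python sketch of a hypotrochoid — the radii and pen offset loosely stand in for the machine's speed ratio and arm length, though that mapping is an illustration, not a measurement of this build:

```python
import math

def hypotrochoid(R, r, d, steps=2000):
    """Trace a Spirograph-style hypotrochoid: a pen at distance d from
    the centre of a circle of radius r rolling inside a circle of
    radius R. Returns a list of (x, y) points."""
    points = []
    for i in range(steps):
        t = 2 * math.pi * 10 * i / steps  # several revolutions
        x = (R - r) * math.cos(t) + d * math.cos((R - r) / r * t)
        y = (R - r) * math.sin(t) - d * math.sin((R - r) / r * t)
        points.append((x, y))
    return points

curve = hypotrochoid(R=100, r=40, d=60)
print(len(curve))
```

Changing `R/r` (analogous to the rotor/turntable speed ratio) or `d` (the pen's offset along the arm) changes the number of lobes and how far the pattern reaches, which is exactly the kind of tailoring described above.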

We used the Power Functions speed controller remote so that we could control the rotor speeds and directions, as opposed to the digital remote, which only allows full on/full off. If you were to replicate this build using NXT, you would have to set the servo motor speeds in software and then execute the program. You can see in the video above that Shorna changes the drawing pattern at 0:08.


Kinect2Scratch – The power of joint tracking in Scratch.

Microsoft Kinect units can be found second-hand on eBay for £20 or so, and they are STEM gold. There are so many amazing pieces of code out there that let you extract the joint data from a Kinect and make it ready for other apps in the form of OSC, MIDI, or raw data (NI mate is a commercial venture).

One of the best uses of a Microsoft Kinect I’ve seen is with Skanect, a piece of software that lets you map the video data onto the depth data in order to make accurate 3D models of small objects, people or rooms rapidly and cheaply. However, that’s quite an advanced STEM project, and I needed something that could take the power of the Kinect and combine it with the rapid prototyping and easily accessible code of Scratch.

Using a great piece of software developed by Stephen Howell called Kinect2Scratch (http://scratch.saorog.com/ for the download), it’s possible to import skeleton data from the Kinect directly into Scratch and code with it. You have to install the Kinect Runtime for Windows first, but once running, the Kinect2Scratch program runs in the background, extracts the joint information, and sends it to Scratch (setup guide here: http://scratch.saorog.com/setup.pdf).

Mac users have an option coded by Kenta Hara: http://github.com/mactkg/kinect2Scratch4Mac

Once Kinect2Scratch is working in the background, it just chugs away and sends skeleton tracking data to Scratch, with very few issues. The students’ reactions to having such a profound input into Scratch were incredible. Within about 20 minutes, they had brainstormed at least eight different ways of using joint input to make games, animations and presentations.
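Under the hood, this kind of bridge talks to Scratch 1.4’s remote sensors protocol, in which each sensor-update message is an ASCII payload prefixed with a 4-byte big-endian length. A minimal Python sketch of that framing — the sensor names here are illustrative, not necessarily Kinect2Scratch’s exact ones:

```python
import struct

def scratch_message(sensor_values):
    """Encode a Scratch 1.4 'sensor-update' message. The remote
    sensors protocol frames each message as a 4-byte big-endian
    length followed by the ASCII payload."""
    payload = "sensor-update " + " ".join(
        f'"{name}" {value}' for name, value in sensor_values.items()
    )
    data = payload.encode("utf-8")
    return struct.pack(">I", len(data)) + data

# Hypothetical joint readings pushed to Scratch as sensor values
msg = scratch_message({"HeadX": 120, "HeadY": -45})
print(msg)
```

In a real bridge, each frame would be written to a TCP connection on Scratch’s remote-sensor port; in Scratch itself the values then appear as sensor blocks.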

It comes with some really great sample programs to get you started, but my favourite was the skeleton tracker. It uses some crafty code to draw a live skeleton in Scratch with very little lag, and puts Kitty Cat’s head on top.

Over the course of the Purbeck 2014 Scratch Jam, Team Kinect built a game from the ground up that used the Chest Centre tracking point (around where the solar plexus is) to control the horizontal movement of a spaceship in a vertically scrolling shooter. This made playing the game a real workout: to avoid incoming meteorites, you need to move your entire body, ducking and jumping at times! The ship fires by raising the arms to prepare the lasers, then bringing them below the shoulders to fire.
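The core trick is mapping a tracking coordinate onto the Scratch stage. A hedged Python sketch of the kind of linear mapping the game would have used — the input range here is illustrative, not Kinect2Scratch’s actual units (the stage range, −240 to 240, is Scratch’s real horizontal extent):

```python
def chest_to_ship_x(chest_x, kinect_min=-500, kinect_max=500,
                    stage_min=-240, stage_max=240):
    """Linearly map a chest-tracking X coordinate onto the Scratch
    stage's horizontal range, clamping at the edges."""
    # Clamp to the expected tracking range first
    chest_x = max(kinect_min, min(kinect_max, chest_x))
    # Linear interpolation into stage coordinates
    scale = (stage_max - stage_min) / (kinect_max - kinect_min)
    return stage_min + (chest_x - kinect_min) * scale

print(chest_to_ship_x(0))    # body centred -> ship centred, prints 0.0
```

Clamping matters: without it, a player stepping outside the tracked zone would fling the ship off the stage.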

It’s a great game, with randomised meteorites, a good scoring mechanic, a menacing musical score and a great workout to boot.

The Project is shared at:

http://scratch.mit.edu/projects/37598364/

Raspberry Pi Radio.

While teaching students about how radio signals are used to send information using FM, I decided to make a Raspberry Pi radio transmitter using the instructions from Makezine and Instructables, so that we could explore the range of transmission and the effects that different aerials would have on the radio signal.

http://www.instructables.com/id/Raspberry-Pi-Radio-Transmitter

http://makezine.com/projects/make-38-cameras-and-av/raspberry-pirate-radio

The PiFM Python module was created by Oliver Mattos and Oskar Weigl of the Imperial College Robotics Society, and can broadcast 16-bit mono WAV sound files that can be played on any FM receiver.
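Getting audio into that 16-bit mono WAV format is easy to check with Python’s standard wave module — this sketch writes a short test tone of the kind PiFM can broadcast:

```python
import math
import struct
import wave

def write_tone(path, freq=440.0, seconds=1.0, rate=22050):
    """Write a 16-bit mono WAV test tone (a sine wave at freq Hz)."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)   # mono
        w.setsampwidth(2)   # 16-bit samples
        w.setframerate(rate)
        frames = bytearray()
        for n in range(int(rate * seconds)):
            sample = int(32767 * math.sin(2 * math.pi * freq * n / rate))
            frames += struct.pack("<h", sample)  # little-endian signed 16-bit
        w.writeframes(bytes(frames))

write_tone("tone.wav")
```

A one-second 440 Hz tone is a handy first broadcast: if the receiver picks up a clean note, the transmit chain is working before you try music.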

Makezine made a pre-compiled install image that autoruns the transmitting software on boot, and will shuffle audio files that have been placed in the root directory.  I’ll be honest, their video was a bit Hipsterrific for my liking, but it got me interested enough to tinker.

The basis behind the generation of the signal sounds simple enough: GPIO4 is modulated via PWM fast enough to produce a signal in the FM (megahertz) range. However, I quickly realised that it produces a lot of interference in nearby devices, and there are clear sidebands of interference in neighbouring frequencies (just tune the radio through the frequency range and you’ll find many). Even though the power output of the device is very low, it might interfere with emergency-service radio, and as we are located very close to a fire station, we decided to exercise caution!
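The modulation itself is just arithmetic: each audio sample nudges the carrier frequency by up to the FM deviation. A toy Python sketch of that idea — ±75 kHz is the standard broadcast-FM deviation, though PiFM’s actual per-sample clock arithmetic differs:

```python
def fm_frequency(carrier_mhz, sample, deviation_khz=75.0):
    """Frequency modulation in a nutshell: an audio sample in -1..1
    shifts the carrier by up to +/- deviation_khz."""
    return carrier_mhz + (sample * deviation_khz) / 1000.0

# A full-scale positive sample pushes a 103.3 MHz carrier up 75 kHz
print(fm_frequency(103.3, 1.0))
```

A hard-switched GPIO pin approximates this with a square-ish wave rather than a clean sine, which is one reason energy spills into neighbouring frequencies and harmonics — hence the sidebands heard when tuning across the dial.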

With an FM radio right next to the Pi, it picked up the signal very clearly, but reception dropped off within 3–5 m. With a 15 cm crocodile-clip wire attached to GPIO4, the range jumped to a 20–30 m radius, and was very clear.

Some builds have used duck antennas to boost the broadcast radius, but presumably this would also boost the strength of the sidebands and would drown out commercial stations at short range.

Build Notes:

Extremely easy, and very quick to get results.

Minimal materials needed, and little expertise required for students.  Easily replicable.

Next Steps:

School radio station?

Using the PiFM RDS code on GitHub to send track information and station identification. Code: https://github.com/ChristopheJacquet/PiFmRds
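For reference, PiFmRds is driven from the command line roughly like this — the frequency, station name (PS) and radiotext (RT) values below are illustrative, so check the project README for the exact options in the current version:

```shell
# Broadcast a WAV file with an RDS station name and radiotext
# (needs root for direct GPIO access; values are illustrative)
sudo ./pi_fm_rds -freq 107.9 -audio sound.wav -ps MYRADIO -rt 'Hello from the Pi'
```

The PS field is the short station name that scrolls on a car radio display (limited to 8 characters), while RT carries the longer free-text message such as track information.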

Slow motion combustion experiments.

Wanting to illustrate combustion for GCSE chemistry using a classic demo with a new twist, I filmed a whoosh bottle experiment at 240 fps. The standard experiment involves setting light to methanol, ethanol or propanol vapour in a polycarbonate water-cooler bottle; the rapid exit and entry of gases through the neck causes a whoosh.

Filming in slow motion allowed us to discuss in much greater detail what was going on during the reaction, and to work out that the actual reaction is far more complicated than a simple equation.

The experiment was beautiful; it was filmed on an iPhone 6 and then slowed down in-app using SlowPro.

As an extra experiment, I then tried methanol in a volumetric flask, behind a safety screen in an empty lab, just in case.

A talking point and a teaching point.

Challenges:

Getting the lighting and safety aspects correct! (This is against CLEAPSS advice – never do this in an occupied lab.)

Next Steps:

Using a Triggertrap to take a high-res photo during the combustion process.

Simultaneous combustion of methanol, ethanol and propanol to compare burn rates and completeness of combustion.

Doodlebots @ STEM club.

Inspired by an article at Makezine on scribblebots, we decided to roll our own using some bits at school.

This video shows the bots in action in slow motion, and clearly shows how the eccentric vibration motor allows each bot to move in a circle. We just used rubber bungs stuck onto the motor shafts to create the offset weight. We experimented with a range of pens, pencils and markers: Sharpie-style markers worked well with the sugar paper, but if you are tiling smaller sheets of paper, make sure to tape them on the underside and make the seams as flat as possible. The activity was great fun, and produced some crazy generative art, which was my lab wallpaper for a couple of weeks. Check out the videos: