Polargraph is go!

For Science and Engineering Week, I thought it would be inspiring to make a giant Polargraph machine that could be left to draw all day at school.  These machines occupy a perfect space at the intersection of art, maths and technology.

The Penrose triangle. I drew this using vector drawing mode, and generated the file using StippleGen2 from Evil Mad Scientist Laboratories. ❤

What is a Polargraph?

A polargraph is a line drawing machine that uses two fixed stepper motors to move a hanging gondola that holds the pen.  The machine usually draws pictures using a continuous line.

Why is it called a polargraph?

The machine uses a polar co-ordinate system to work out where the pen is and how far to move the motors in order to draw lines, converting between polar and Cartesian x-y co-ordinates.
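To make that concrete, here is a minimal Python sketch of the geometry (my own illustration, not the actual Polargraph firmware): the pen hangs from two cords of length a and b attached to motors a distance d apart, and Pythagoras converts between cord lengths and x-y positions.

import math

# Origin at the left motor, x to the right, y measured downwards.
def cords_to_xy(a, b, d):
    # Where is the pen, given the two cord lengths a and b?
    x = (a**2 - b**2 + d**2) / (2 * d)
    y = math.sqrt(max(a**2 - x**2, 0.0))
    return x, y

def xy_to_cords(x, y, d):
    # How long must each cord be to put the pen at (x, y)?
    a = math.hypot(x, y)
    b = math.hypot(d - x, y)
    return a, b

# Example: with motors 1100 mm apart, a point centred 600 mm down the
# board needs two equal cords of about 814 mm.
print(xy_to_cords(550, 600, 1100))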

How does it work?

The Polargraph controller takes a .png, .gif or .jpg image as input and samples the brightness of different areas in a grid. If an area is dark, the controller tells the Arduino to signal the motors to put the pen lines close together, producing a dark shaded pixel. If an area is light, the motors space the lines further apart.
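As a toy illustration of the shading idea (a sketch of the principle only, not the controller's actual algorithm), you can map each grid cell's brightness to a spacing between pen lines:

# Darker cells get tighter pen lines; lighter cells get sparser ones.
def hatch_spacing(brightness, min_spacing=1.0, max_spacing=8.0):
    # brightness: 0 (black) to 255 (white) -> line spacing in mm
    t = brightness / 255.0
    return min_spacing + t * (max_spacing - min_spacing)

print(hatch_spacing(30))    # dark cell  -> ~1.8 mm between lines
print(hatch_spacing(220))   # light cell -> ~7.0 mm between lines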

What can it draw?

It can draw any bitmap image that it is given, and it works best with high-contrast images. It is quite versatile and can draw in a variety of ways: it can scribble, it can draw circles, or it can build pictures up in squiggles or perfect square waves. The curved bands are due to the way the motor instructions are generated, and to which point is fixed in the picture. It can also draw vector files directly, as they are path-based images.

How did I build it? 

I followed the truly incredible Instructable from Sandy Noble. It’s one of the best that I’ve ever followed – the instructions are clear, the walkthroughs are detailed to just the right level, he links to all the influences he used for the project, and it produces amazing results.

Polargraph Community

Although Sandy has been inspired by others, including the Der Kritzler machine, he really has nurtured a community around polargraphs and v-plotters through a great forum and excellent communication with customers and interested people. He has also innovated in the area of v-plotters and enabled the community to move forward by open-sourcing his designs and software. It’s entirely possible to buy ready-made plotters from the Polargraph store, but I really wanted to roll my own and follow the Instructable.

Mark 1 – (prototype) 

I built a comedy home version at first, with the stepper motors velcroed directly onto the top of the board. This had one advantage: there was a bit of play in the mounting, which absorbed some of the resonance in the cables.

I built it using an Arduino Uno (legit!) and an Adafruit Motor Shield V1 clone from eBay (only because the real one isn’t made any more). I don’t have access to a 3D printer or a laser cutter at home, so I tried to make a home-grown gondola, but ended up buying a kit from the Polargraph store.

Problems that I had while building the prototype machine and how I fixed them:

1. Motors turned backwards – the steppers were wired in reverse. This became apparent when I moved my Arduino from the front of the machine to the rear between revisions!

2. Gondola moved off the edge of the page when drawing – I hadn’t set the machine size and uploaded it to the Arduino before starting. This is a really important step that has to be done every time, or saved into the default_properties config file.

3. Pen would slip in the picture – my counterweights were hitting the floor because the cords were too long. This was solved by shortening the blind cables, but in future I will try a pulley system to stop the cables vibrating and resonating.

4. Stepper motors were getting crazy hot – I stole some heatsinks from a PC graphics card and added small ones to both the motor driver ICs on the motor shield and the tops of the steppers. I also drove them at a lower voltage and reduced the max speed. This solved the problem nicely.

5. Pen was lifting from the paper, causing incomplete pictures – I worked out a technique for getting the paper as flat as possible with masking tape. In future, I’ll use low-tack spray mount to attach the paper flat to the board, and then mask the edges.

6. Servo failing to lift – it took me a while to track this one down. My servo had come loose from the gondola and was failing to lift in certain parts of the image. This is a major headache if your machine has spent 6 hours drawing a flawless image, only for the servo to unstick and draw a line across the middle of it. Solution: use a stronger adhesive or change the gondola.

Comedy Mk1 version.  Gaffer tape all round, counterweights made from bags of coins, stepper motors held with gaffer and velcro.  Amazing that it could still draw, all things considered.  This was my second ever image.

V1 took me about 8 hours in total to construct and, more importantly, about 8 hours to troubleshoot.  Getting to know the software is really important.  Once I’d worked out what worked and what didn’t (and how to get the best out of my pens, paper and machine), I set my sights on the largest machine that I could build and transport.

Polargraph Mark II. 

It took about 4 hours to build, including a hinged A-frame so that I can change the angle of the board, and prop it up in public spaces without too much hassle.

The machine measures 1100 mm x 1800 mm (the largest board I could fit into my car).

The only significant changes I made were that the machine size is effectively tripled; the electronics are all hidden at the back (a bit of a shame for a public machine, I thought, but it reduces tinkering and breakage – I might make a perspex enclosure for all the goodies); and I drilled a 12 mm hole at the bottom centre of the board to run the servo cable through from the bottom.  This cable serves to pull the gondola parallel to the board.

Subjects

Portraits seem to be the most popular polargraph subjects, as they are universally recognisable, and we have a significant amount of subconscious brainpower dedicated to face recognition.  I tried to go for a couple of portraits, but also tried to get kids talking about science, maths, art and technology through the pictures.

I’ve been plotting various famous scientists, impossible shapes, anatomy pictures and other random pieces for the foyer at school.

Madame Curie. The picture looks a bit tatty towards the bottom left: the paper was not sitting flat, and the pen got clogged with paper fibres. I still love the detail in the top of the image. I’d like to repeat this one under optimum conditions. Draw time: 11h!

Reception

Impossible shapes – Part 2. I used an image, then used Inkscape to trace vectors. This .svg file was exported to the Polargraph and then drawn. Draw time: 35m

The machine was almost universally well received, especially for portraits of people.  It generates so many questions from curious students, and there seems to be a constant phalanx of children in front of it, walking back and forth to try and make the more abstract images resolve with distance.  It’s one of the most satisfying builds I’ve done, as it produces such amazing works of art.

My next project is to get the machine to run from a Raspberry Pi, and to try to emulate the amazing work of the blackstripes.nl or gocupi projects (which are ‘inspired by’ Sandy Noble’s work themselves).

I’ll add a full-length time-lapse video of a long drawing when I have time to make one!


Sabotage: Teach Debugging By Stealth

Incredible post from teachcomputing about how to get students to debug by stealth. Already implemented!

teachcomputing.wordpress.com


The majority of my work at the moment is supporting ICT teachers who want to introduce and develop Computing within their own curriculum ahead of the changes planned for September 2014. Throughout my work with other teachers, I’ve been sharing some of the pedagogic devices and strategies I’ve been using with the aim of ensuring that their teaching of Computing is engaging and inspiring.

One such game that I’ve developed with my classes I call ‘Sabotage’. I’ve discovered that this can be used in a whole variety of ways, but to help you develop an understanding of it I will attempt to describe just one simple example for you.

The Problem:

I found when I first moved from teaching Scratch (a visual programming language) to Python (a text-based programming language) that children became very frustrated with the high numbers of syntax errors which prevented their scripts from working. This created…

View original post 737 more words

Microscopy – with added Pi

tl;dr: I hacked open an Intel QX3 toy microscope, replaced the crummy sensor with a Pi camera, installed RPi_Cam_Web_Interface on it, and now have a wireless electronic microscope that you can control using a web interface.  It’s a great little teaching tool: the camera can live with the microscope permanently, and the Pi can be detached and used for other projects.

An Intel QX3 microscope with a Raspberry Pi camera inside. The Model B is mounted on top temporarily whilst I work on fitting an A+ inside the microscope body.

Background: When the education pyramid for my county was collapsed into two tiers (primary and secondary only – the middle schools were closed down), some of the science equipment was redistributed to other schools.  We got a bunch of the Intel QX3 microscopes made by Mattel, which were ‘legacy equipment’ and hadn’t been used in years (sneeze alert).  I saw these sad microscopes and thought they could be revitalised with some ‘Pi inside’ (TM).

Back when they were released, they boasted fairly good specifications: 10x, 60x and 200x optical magnification, USB connectivity, a CCD sensor with a native resolution of 320×240, and incandescent illuminators above and below the stage.  They are great toy microscopes and work well for the price, although the drivers haven’t been updated for Windows in years and the software really only works reliably with XP.  This also means that if you do get hold of an old XP box or laptop, you have to use the heinous Digital Blue software bundled with the camera, which stinks of bloat and wacky menus.

Getting it functioning: Mac OS X has a working solution with macam, but it doesn’t allow direct control of the illuminators, and my school doesn’t have any OS X machines to hand.  There is a solution for Linux machines, as described in this excellent blog post here.  Since it works on a Linux distro, and video4linux identifies the camera as a gspca (CPiA) webcam, it should work on a Pi, since V4L is baked into the kernel of Debian.  There is a great blog post here detailing how to get it working with Camorama.  Naturally, I plugged it straight into a Pi via USB and managed to get it working in X using Camorama with the following commands:

sudo apt-get update

sudo apt-get upgrade

sudo apt-get install v4l-utils camorama mplayer ffmpeg

On running Camorama, it worked, but it was really disappointing: the resolution was low, the framerate from the webcam was sucky, the sensor had a huge amount of noise, and it crashed after about 2 minutes every time. I persevered, but my coding skills weren’t up to the challenge of getting V4L2 to play more nicely with the camera, although I did manage to get the illuminator to turn on and off using:

v4l2-ctl -c illuminator_1=0

At this stage, I was ready to say that the cams were working again – we had a solution that allowed kids to get images from them and use them in education, but it was clunky and didn’t offer that good an experience.

Option 2: Void that warranty. So I removed the sensor board, stuck a Pi camera inside, and set up a streaming webcam with controls – and it’s awesome.  I haven’t added GPIO-controlled LED lights yet, but it’s a huge improvement over the previous resolution with the Camorama option.  The steps were as follows:

  • Detach Microscope from stand
  • Undo the six screws on the back of the microscope.
  • Disassemble the microscope. Carefully take apart the 4 main pieces (back plate, front plate, base, microscope body). I was really careful to avoid getting dust inside the optical assembly, which seemed fairly well sealed.
  • Unclip the USB and illumination ribbon cables from the control board (you might need a screwdriver here; they are glued).
  • Drill out the USB cable from the back plate of the microscope (I couldn’t loosen the glue, so had to resort to persuasion).
Drilling out the USB cable holder at the back of the microscope. It was held with adhesive far stronger than the surrounding plastic, and I risked shattering the microscope casing.
  • Unscrew the control board.  There are 3 screws with maddening red plastic glue over them to prevent fiddling.  I cut this gunk away with a razor blade first, then unscrewed them fully. It’s well worth tampering with them.
  • Modify the Pi camera by unscrewing the lens completely (use tweezers and a craft knife to break the glue seal). This lets more light onto the sensor and is far better for microscopy and telescope use. You can reverse this and set the lens focus back to infinity afterwards, but if you are not careful, you risk damaging your Pi camera module.  There’s a nice tutorial here on how to do it.
I replaced the CCD sensor module by unscrewing the glued screws and mounting the camera module in exactly the same place. I used a servo horn to hold the camera in place and prevent dust entering the sensor.
      • Align the Pi camera over the old sensor hole. You can see this in the photo above.
      • Live test using raspistill to find the position where the old CMOS sensor sat, and to check focal length, focus etc.  You might need to move the Pi camera further from the lens assembly so that the stage can still focus when you adjust its height with the knobs. (There’s also a minimal Python sketch after this list that does the same live test.)
        raspistill -o image.jpg

        This command will show a preview screen if you are in X.  You should see the red light come on; then look in your home directory, inspect the file, and make adjustments.

    • Affix the Pi camera in place.  I used a medium servo horn (it just happened to have the correct spacing for the screw holes), but any small piece of plastic or wood would do.  Metal is a bad choice: it will short the connections on the back of the camera module, likely resetting the Pi. It’s possible to 3D print or laser cut a bracket to the correct size, but I didn’t have access to those at home.
    • Feed the ribbon cable out through the USB cable hole.  In time I’ll probably Sugru it in place once I’ve found the sweet spot of replacement flex cable length and decided where I want the Pi.  Longer = more flexibility (no pun intended).
You can see the Pi camera mounted inside the microscope casing, and the flex cable is routed out of the original USB cable hole.  The Model B is mounted on top, caseless, for the moment.
  • Reassemble the microscope by putting all of the bits back together. (Make sure the camera module is aligned so that it has a straight light path to the stage – initially I put it in back to front and couldn’t see anything, then went through a painstaking process of checking camera connections and rebooting before I realised that it was a basic physics fail.)
  • Install the amazing RPi_Cam_Web_Interface on the Pi (there is excellent documentation on the elinux site, with installation instructions).  This gets an MJPEG feed from the Pi camera and streams it to a web server at the Pi’s address.  The web interface has awesome customisation options and lets you capture stills, video and time-lapses through a browser.
  • Add a USB wi-fi dongle to the Pi and power it up via battery or mains power.
  • Browse to the IP address of your Pi on another computer. (You can find the address by typing ifconfig into a terminal on the Pi, or by watching the messages that appear during startup.)
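If you’d rather run the live test from Python, here is a minimal sketch using the standard picamera library (my addition – the build above just used raspistill):

from time import sleep
from picamera import PiCamera

camera = PiCamera()
camera.start_preview()                # full-screen preview on the Pi's display
sleep(5)                              # time to nudge the module and check focus
camera.capture('/home/pi/image.jpg')  # grab a still to inspect
camera.stop_preview()
camera.close()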

At this stage, it works, and it can be used by anyone who knows how to point a browser at an IP address, providing you have a wireless connection.  You can probably plug in a monitor and keyboard and browse to localhost on the Pi itself to control it (although I haven’t tried), but a wireless microscope is SO much cooler!

Next steps:

  • Use a Pi A+ and see whether I can secrete the workings into the base, so it’s an invisible mod apart from the camera ribbon cable.
  • Replace the bulbs with LEDs, control them via GPIO, and add buttons to the RPi_Cam script (see the sketch below).
  • Remove the reboot and shutdown buttons from the script, or at least password protect them.
  • Write a proper Instructable for people who like to void their warranty with style.

Naturally, not having access to the school wifi makes it a bit more tricky, but you can still plug in a keyboard and monitor via HDMI, and you will be able to access the webcam interface at

 http://localhost/

in a browser on the Pi.
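For the planned LED mod, the GPIO side should be simple. Here is a hedged sketch using the standard RPi.GPIO library (I haven’t built this yet – the pin number and wiring are assumptions):

import RPi.GPIO as GPIO

LED_PIN = 18                      # assumed BCM pin; use whichever pin you wire the LED to

GPIO.setmode(GPIO.BCM)
GPIO.setup(LED_PIN, GPIO.OUT)

GPIO.output(LED_PIN, GPIO.HIGH)   # illuminator on
# ...capture images here...
GPIO.output(LED_PIN, GPIO.LOW)    # illuminator off
GPIO.cleanup()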

Here’s a sample of 3 zoomed images downloaded from the Camera browser interface.

Ladybird at 10x magnification. I found it inside the window casings in my lab. 😦
60x magnification of ladybird showing detail of the compound eye.
This is a focus-stacked JPEG made in Photoshop. I had to correct a red cast on the photos and stacked 5 images, hence the ganky edges around the frame.

Slow motion chemistry: A real storm in a teacup

Following on from my slow-motion combustion reactions, which were really popular for classroom teaching, I thought I’d try the same formula with another crowd-pleasing reaction: neutralisation with universal indicator on a magnetic stirrer.
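For the board, the underlying equation is the classic acid + alkali → salt + water. Assuming hydrochloric acid in the flask (I won’t swear to which acid was used), that’s HCl + NaOH → NaCl + H2O, with the universal indicator sweeping from red (acid) through green (neutral) towards blue/purple (alkali) as the sodium hydroxide mixes in.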

The first reaction was a real beauty, but there was lots of turbulence from air bubbles, caused by the height of the liquid in the conical flask and the high speed of the stirrer.

I decided to film in portrait orientation. Although this gives large black bars when viewed online, it gave a really good view of the conical flask and the beautiful swirling purple colours as I added the sodium hydroxide.  It’s a really good talking point for classroom discussions about neutralisation: What’s going on? Why doesn’t it all change at the same time? Why don’t the liquids mix instantly? What will happen? Why did it go blue?

This was the least successful in terms of colour changes, but the vortex is really beautiful and engulfs the stirring bar. I love the final colour.

Genie in a bottle. 

I looked up a recipe from the RSC demonstration handbook for ‘dancing flames’, which is a reaction between aluminium foil and copper chloride. The acidified solution eats away at the oxide layer, exposing the aluminium for reaction and generating sweet hydrogen gas for the exploding. Watch the video for the burn:

This is a great experiment that allows a really good and deep discussion of reactivity, and also flame tests.  It’s applicable to KS3-5, and is a great point of focus as a demonstration with the lights off.
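For reference, my summary of the chemistry (not quoted from the RSC handbook): the displacement reaction is 2Al + 3CuCl2 → 2AlCl3 + 3Cu, and the acid attacking the freshly exposed aluminium, 2Al + 6HCl → 2AlCl3 + 3H2, supplies the hydrogen that burns.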

MeArm update – Work in progress.

Here’s an update of what we’ve been up to in the last 4 weeks of STEM club.

Team MeArm.

We initially built the excellent MeArm from a phenoptix kit, priced £29.99. It’s a nice robot with 4 degrees of freedom that can be controlled via Arduino, Raspberry Pi, BeagleBone etc.  We managed to get it working really nicely with a Raspberry Pi and ScratchGPIO, Simon Walters’ excellent version of Scratch for the Pi.  The challenge for the group is to control the arm using Python.  A Wii Nunchuk was suggested, or perhaps a Microsoft Kinect, but early experiments have suggested that keyboard input is challenging enough!

Robots building robots. Skynet!
Progress on our MeArm build. The kit version is in the background, and our JCB-yellow one is in the foreground. We’re about 2 weeks away from movement control!

Once the robot is mechanically sound and the servo positions are set, we should be able to connect it via I2C using the Adafruit 16-channel servo board.  Once that’s done, we’ll release the Python geeks and let them run wild with their programming and imagination.
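As a taster for the Python geeks, here is a hedged sketch of driving the servos through that board with Adafruit’s ServoKit library (the channel assignments are my assumptions – match them to however the servos are actually wired):

from time import sleep
from adafruit_servokit import ServoKit

kit = ServoKit(channels=16)                   # the 16-channel I2C servo board

BASE, SHOULDER, ELBOW, GRIPPER = 0, 1, 2, 3   # assumed channel numbers

kit.servo[BASE].angle = 90                    # centre the base
kit.servo[SHOULDER].angle = 60
kit.servo[ELBOW].angle = 120
sleep(1)
kit.servo[GRIPPER].angle = 30                 # close the gripper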