Here are two new PCBs for controlling a v-plotter. These were designed with Pycupi in mind but should be adaptable for other drawing machines.
The first is an add-on board for a Raspberry Pi 2 or 3.
The second is a kind of mini motherboard for a Pi Zero (the Pi Zero plugs into this board).
Both boards are currently untested and will require additional software modifications to work (either modifying the Linux install to free up the serial port or enabling comms over I2C).
A caveat with the Pi Zero board… we haven’t yet run Pycupi on a Pi Zero. It should work, but it might be slow rendering jpg representations of plots.
Both boards also supply power to their respective Raspberry Pis from the 12V DC connector via a DC-DC converter module.
Also, the Arduinos on these boards are 3.3V and run at 8MHz, so there may be timing issues to address.
I’ve included a ‘Master’ power switch that kills power to the Pi/Steppers and Arduino. I’ve also included a single push button that doesn’t have any specific purpose yet other than it might be useful as a shutdown button for the Pi. Or it could potentially be used as a pause button to pause a drawing.
Before getting PCBs made, I need to wait for samples of the additional components, print off a paper copy of the PCB and test for fit. If everything looks good I'll get some sample boards manufactured and post an update.
MakeBmth held its regular monthly meeting at Eagle Labs Bournemouth on Thursday 10th March 2016. We plan to meet there once a month (the second regular monthly meeting will be held at the Uni or under the new Hilton Hotel – check the meetup group or forum for details).
We spent the evening learning about Brian's Python port/rewrite of Gocupi, called Pycupi.
Pycupi (and Gocupi) is a cut-down firmware that implements a command queue plus stepper motor and servo control functions, along with a control application written in Python (or ‘Go’ in Gocupi’s case). The whole thing is meant to be run on a Raspberry Pi (though you could in theory run it on any computer).
Pycupi does the image processing on the Raspberry Pi and can (currently) generate simulations (jpeg images) of the expected output, or it can plot it using a suitable stepper controller. The firmware in the Pycupi GitHub repository is currently configured to work with my Eggbot/Polargraph electronics.
We learned about the multiple co-ordinate systems that are working together to produce a drawing.
Native: The co-ordinate system that works in motor steps.
System: The co-ordinate system that Pycupi maps to the native co-ordinates.
Image: The drawing co-ordinates, which define the drawing area.
We looked at how the native co-ordinates are not polar (angle and distance) but are positions triangulated from the lengths of the two motor strings or chains. To calculate a position we need to know the machine width, how many mm of string or chain each step of the motor represents, and finally a reference or starting point (home).
A bit of trigonometry gives us the lengths of the strings/chains for a given position.
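As a rough illustration of that trigonometry (a Python sketch with made-up names, not Pycupi's actual API), treat the left motor mount as the origin and the right motor as sitting at (machine_width, 0); each string length is then just the hypotenuse of a right triangle:

```python
import math

def string_lengths(x, y, machine_width):
    """Triangulate the two string/chain lengths for a pen at (x, y).

    (x, y) is measured in mm from the left motor mount, with y pointing
    down the drawing surface. Illustrative only - not Pycupi's real code.
    """
    left = math.hypot(x, y)                   # left motor to pen
    right = math.hypot(machine_width - x, y)  # right motor to pen
    return left, right

def to_steps(length_mm, mm_per_step):
    # Convert a string length into native motor steps (rounded).
    return round(length_mm / mm_per_step)

# Pen centred on a (hypothetical) 6mm-wide machine, 4mm down:
left, right = string_lengths(3, 4, 6)  # both strings are 5.0mm
```

The home position then just fixes which (x, y) the machine starts from, so every move is a change in these two lengths.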
We looked at the different renderers currently available in Pycupi.
Spiral Arc: Renders an image using tiny spirals of varying size & density; the whole thing plots along an arc path from a corner (defaults to top left (NW)).
renderSpiralArc(filename, x, y, image_width, pixel_size, drawing_object)
filename = the image to render (jpg/png/bmp)
x = the horizontal position of the top left of the image (bottom left if rotated)
y = the vertical position of the top left of the image (bottom left if rotated)
image_width = how many pixels wide the render should be
pixel_size = how big (in pixels) the spiral elements will be (bigger = less detail)
drawing_object = the python object you created to represent the machine
Norwegian Spiral: Renders an image using amplitude modulation along a spiral path from the centre of the image.
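The Norwegian Spiral idea can be sketched like this (purely illustrative Python, not Pycupi's actual renderer): walk an Archimedean spiral out from the centre and wobble the radius by an amount driven by how dark the image is at each point:

```python
import math

def norwegian_spiral(brightness, turns=20, points_per_turn=200,
                     max_radius=100.0, amplitude=3.0, wobble_freq=30):
    """Sketch of amplitude-modulated spiral rendering.

    brightness(x, y) -> 0..1 samples the source image; darker pixels
    get a larger wobble. Returns (x, y) points centred on (0, 0).
    All parameter names here are made up for illustration.
    """
    path = []
    steps = turns * points_per_turn
    for i in range(steps):
        theta = 2 * math.pi * i / points_per_turn
        r = max_radius * i / steps               # base Archimedean spiral
        x, y = r * math.cos(theta), r * math.sin(theta)
        darkness = 1.0 - brightness(x, y)        # sample the image here
        r += amplitude * darkness * math.sin(wobble_freq * theta)
        path.append((r * math.cos(theta), r * math.sin(theta)))
    return path
```

Fed to the plotter, a path like this reproduces the image as varying 'thickness' along the spiral, much like the grooves of a record.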
Last Wednesday I ran a small ‘Messy Workshop’, where I introduced several students to the processes of moulding and casting. I only had a short amount of time – around 3 hours. Within this I had to squeeze quite a bit in, so I put on my best motor mouth for the occasion!
I brainstormed what I could include, and got some guidance from the tech in the casting room – I’d create some basic moulds using the vac former, a silicone mould of the same shape in order to compare the two processes, and use a couple of different resin types: polyester (Clear Cast in this case) and polyurethane (Fast Cast).
After half an hour of me waffling on about the contents of each canister, the types of facilities available, and the associated H&S, we created some basic alginate moulds and made plaster casts of our hands/fingers.
This was pretty decent and took us to around 11am, when we took a short break – this gave the plaster some time to cure. We were all rather hasty in our de-moulding and lost a few fingers; some people thought this was the most distressing thing they had ever seen, whilst others found the whole thing rather hilarious!
Typical art college fare!
After all that, we continued on to the resin room where we created the reproductions using the vac form moulds. I prepared several, but gave the option for students to create their own if they wanted to make more than one. I also showed the silicone mould off and we poured that one as well. Over the remaining hour, resin was mixed and moulds were poured.
Well, on to one of the real reasons I made this a forum post: to show off some nifty mould-making processes I had a go at. Not only are these midweek workshops a way to introduce some cool stuff (mostly hard skills) to students, they are also a way for me to indulge my inner nerd and try to create/explore new ways of doing the stuff we do already, which can all be summed up as making.
Tuesday was spent mostly drafting some nifty tile pieces in Rhino 3D – the kind of thing we might 3D print as one-offs, then reproduce using a variety of casting mediums (or maybe in the future we will simply 3D print the lot?). Anyway – I made a bunch of tiles.
A typical process would be to create a plastic/wooden/Lego box to place them in so we can pour silicone over them.
Well, now that we’re in CAD and can 3D print the form, why not add a mould box into the mix, print it all out, and pour straight away?
The next thing I went on to illustrate was the process of creating a shape from a two part mould.
The parts for a two piece mould
All the components we went on to make on the Wednesday were simple one-piece moulds with an open back – this enabled us to quickly make some shapes. I created both side A and side B in Rhino, as if they were made from silicone. The components also included the pour hole and the bleed lines already in place. This is something you would usually create by submerging the parts 50% in clay and sculpting out the pour and bleed lines.
The next cool thing you add is the ball bearings/marbles, as these create locator pins for sides A and B, ensuring that you don’t misalign the mould and create a miscast.
In the real world that’s a day’s work (sort of), and it’s also rather messy – you have to take precautions so you don’t get clay all over your original work. Once you have one side in silicone, you dismantle everything, take it all out, and then reconstruct the box, this time with the silicone and the part in the bottom. Now you can pour a new batch of silicone on top (silicone won’t stick to silicone) and you’re on your way to having a two-piece mould. Sounds tricky? It is, until you do it yourself!
Well – that will all take two days with the cure time, if I’m lucky – and I only have three hours. I did have the benefit of a couple of hours to prep for the session, however. I thought to myself: what if I took the workflow of creating the walls for my simple shape and turning it into its own mould box, and created the mould boxes for side A and side B of a two-piece mould?! Well – here goes. Mouldception.
This is what the mould will look like, rendered blue to represent the silicone material! Note the pour holes, bleeder lines and the positive and negative locators.
Everything went as planned! I created the mould boxes for side A and side B, poured in the silicone, and 12 hrs later, had the silicone in my hands.
The silicone parts visualised next to their respective mould boxes.
A quick pour of the parts and we can see how things turned out. The pour hole could have done with being a little larger – perhaps we could use a syringe to force in the material, as the working time is around 1 or 2 minutes.
The two parts of the mould taped up and ready to pour
The bleed holes worked really well… except perhaps for my positioning of them – I was still left with a hole in the first reproduction.
I was able to position the two together, make the pour, and 15–20 mins later I got the parts out. The striking thing here was how much they resembled the 3D printed parts!
The parts were printed on the ‘fast print’ setting, so they contain all the features of a fast 3D print – it’s really amazing how much detail is carried through the mould/casting process, and how many comments we got thinking the resin part was made by a 3D printer, as it had a similar weight, finish, and surface quality.
I’ll come back with some HQ images of the printed and resin parts later – I was so into the process that I guess I forgot to document the later (more interesting?) steps!
Collection of moulds, in both PLA and silicone.
Plaster cast finger mountain
The two part moulds.
The principle is that you can’t cast a material that sets hard (plaster/resin) in a rigid mould. So here you can see we played around and injected silicone into the PLA moulds.
In the silicone moulds of the same shape, you can see we have poured/cast the resin. The reproduction (minus the air bubble) is great! All details captured.
The two types of mould for the larger tile piece: the vac-form moulds and the silicone mould. The silicone moulds (condensation cure) are only good for the polyurethane resins (fast casts), and the vac forms for the polyester resin (clear cast).
Clear cast with pigments – and loosely mixed inside the vac form!
MakeBournemouth was invited to an evening at the new Eagle Lab, an initiative by Barclays to encourage the local community to get together and learn from each other.
That’s the only image I got of the evening – somehow I forgot to take pictures of the larger space. Their Twitter feed is here and has a few images of the space.
The meetup went great, and no doubt the number of new faces was linked to the opportunity to be hosted there tonight – we’ll have to work hard to get the new guys to continue to attend!
After a brief tour and overview of the offerings of Eagle Labs, we broke off into groups and carried on as we normally do – getting updates on each of our projects and bashing our heads together to solve some issues.
We’ll be having an ongoing discussion to see how we can work with Eagle Labs on future events/workshops.
While digging out the Christmas decos I found the WS2811 addressable RGB LED Xmas lights from an old workshop.
They were missing the Arduino that was controlling them (probably pinched for another project), so I decided to hook them up to an Arduino Pro Mini that I could leave permanently attached. I uploaded the FastLED library’s DemoReel100 example and strung the lights up in the porch.
All nice and Xmassy – except I now have to remember to turn the lights on and off… I figured I could get the Arduino to do that for me. Sure, I could plug the whole thing into a mains timer, but where’s the fun in that? I had a few DS3231 I2C real-time clock modules that I’d been meaning to play with, so this seemed like a reasonable excuse for a quick and dirty project. It also gave me an excuse to try interrupts on the Arduino, as the DS3231 has two settable alarms that generate an ‘interrupt’ signal.
After testing the RTC and interrupts on their own to prove they worked (using JChristensen’s DS3232RTC library), I connected the DS3231 to pins A4 & A5 on the Pro Mini (these analogue pins double up as the I2C bus on this board) and the interrupt output from the DS3231 to pin 3 (the Pro Mini supports hardware interrupts on pins 2 & 3). After hacking together the FastLED demo code and the RTC demo code with some horrible glue code (it’s so bad I’m not going to include it here), I had a working set of Xmas lights that turn themselves on and off at predefined times – plus the Arduino now knows the time even after a power-off.
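The on/off decision the glue code makes is simple enough to sketch in Python (the times below are assumptions, not my actual alarm settings); the only subtlety is handling a schedule that crosses midnight:

```python
from datetime import time

def lights_should_be_on(now, on_at, off_at):
    """Decide whether the LEDs should be lit at time `now`.

    Handles schedules that cross midnight (e.g. on at 16:30, off at
    01:00). A sketch of the logic only - the real thing lives in the
    Arduino glue code and fires off the DS3231's two alarms.
    """
    if on_at <= off_at:
        # Normal window within one day.
        return on_at <= now < off_at
    # Window wraps past midnight.
    return now >= on_at or now < off_at

# Hypothetical schedule: lights on 16:30, off 23:00.
lights_should_be_on(time(18, 0), time(16, 30), time(23, 0))
```

On the Arduino, each alarm interrupt just flips the same state this function computes, so the lights recover the correct state after a power cut too.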
The Minions Operation with Minions audio samples is finally finished. Finishing a project completely (like making enclosures and making sure things are nicely finished) is really time consuming. Still, it’s done now and in time for Christmas. Hopefully the nieces and nephews will find it amusing, because I have endured hours of Minions samples doing this…
So I’ve gone from this:
With the electronics tucked neatly away in the 3d printed orange box.
And from the other side:
Lid off, electronics exposed. Holes drilled for speaker which was glued in place with epoxy.
Electronics up close:
I’ve replaced the original light bulb with an LED (foreground), the motor with a mobile phone buzzer motor (the little metal disk with the black top, middle left), and added a prototyping board with the Arduino Pro clone (middle), the WT588D audio board (top), and some resistors, a transistor, a diode and a cap.
Originally I had intended to use the buzzer motor that came with the game. It all worked perfectly when running from a power supply, but as soon as I tried to run it all from 3 x AA batteries, the Arduino would reset whenever the motor was triggered.
Testing that the electronics worked (the little green speaker was too quiet and was replaced with a 0.5W 8-ohm speaker):
I added the capacitor to try to smooth out the current draw, but it wasn’t enough. I tried reducing the current through the transistor by increasing the resistance of the resistors feeding the transistor base, but that didn’t work until the resistance was enough to stop the motor powering up. In the end I swapped the original buzzer for a tiny version from a mobile phone (these can be found for less than a quid on eBay) and that worked happily from battery power.
More recently I have had problems with the WT588D not working after power-on. It takes a few power-off/power-on attempts before it will start playing samples. The Arduino is working, because the buzzer and the LED work. I plan to add a power-on delay and then play a sample to confirm it is working. It should be possible to have the Arduino detect whether a sample is playing by reading the ‘busy’ signal from the WT588D, and reset it if no busy signal is detected during a startup routine.
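That planned startup routine might look something like this (a Python sketch of the logic only; `play_sample`, `is_busy` and `reset` stand in for the real pin I/O on the Arduino):

```python
import time

def wt588d_startup_ok(play_sample, is_busy, reset, attempts=3, timeout=1.0):
    """Planned power-on check - a sketch, not the shipped code.

    Plays a short known sample and polls the WT588D 'busy' line; if it
    never asserts within `timeout` seconds, reset the module and retry.
    The three callables stand in for the real digitalWrite/digitalRead.
    """
    for _ in range(attempts):
        play_sample(0)                      # any short known sample
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            if is_busy():                   # busy asserted: module is alive
                return True
            time.sleep(0.01)
        reset()                             # no busy signal: reset and retry
    return False
```

On the Arduino this would run once in `setup()`, with the number of attempts kept small so a genuinely dead module doesn't stall the game forever.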
I bought a Minions Operation game for my nephew & nieces (who are Minion mad), but it felt like (MB Games?) Hasbro Games missed a trick by not having it play Minions samples. So I added that feature with a WT588D voice module controlled by an Arduino Pro Mini.
This meant dismantling the whole thing and breaking the plastic rivet things… I guess it won’t be going back to Amazon any time soon 🙂
I made use of the ‘buzzer’ by driving it with a transistor for 500ms. I’ll replace the light bulb with an LED that stays lit while the sample is playing.
The original version doesn’t need a power switch, as the tweezers and metal base are the switch. My version draws power when idle, so I needed to remind people it was still switched on. I added a ‘nag’ mode that plays a ~60 second sample of one of the 3 Minions cover songs (selected at random) every 2 minutes if nothing happens, until someone turns off the power (once I add a power switch). Nag mode is really, really annoying after a while – especially when you are trying to test that it works properly…
I’ve got to put all of the electronics on a prototype board and then cram all of this stuff, plus batteries, underneath the playing surface. There isn’t enough room in the original battery/buzzer box, but it looks like the plastic frame is designed with extra modules in mind, so I should be able to design and 3D print a suitable enclosure.
The WT588D has 32MB of flash memory which was more than enough for 40+ samples at 22000Hz Mono as well as the 3 x nag tunes and then some. I think it was showing 88% used in the upload software.
It is run in 3-line mode, with 3 pins used to ‘select’ the device, send a clock signal and pulse out the binary data that corresponds to a randomly chosen sample. I think it was playing sample 32 when the scope captured this clock (top) and data.
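The data-line pattern on the scope is just the sample number clocked out one bit at a time. A tiny Python model of the frame (bit order as per the common WT588D Arduino examples – check the datasheet before relying on it):

```python
def wt588d_bits(command):
    """Model of the WT588D three-line serial frame (a sketch only).

    The Arduino pulls chip-select low, then for each of 8 bits (least
    significant bit first) sets the data line and pulses the clock,
    with the module latching data on the clock's rising edge.
    Returns the data-line levels in the order they are clocked out.
    """
    return [(command >> i) & 1 for i in range(8)]

wt588d_bits(32)  # sample 32 = 0b00100000, sent LSB first
```

So for sample 32 the data line stays low for five clocks, goes high for one, then low again, which is roughly the pattern in the scope trace.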
Edit – copyright claim!! OMG. Yeah, so I will dispute it as fair usage!
A quick demo of a project Asha is working on; hopefully I’ll edit this post with a bit more info as it comes.
The short story is that it’s a flex sensor running through a Pro Mini, using the radio chips that Neil kindly donated. The data is fed through Max MSP, which in turn increases and decreases the volume level on the PC.
Maybe I can persuade her to write about its process here….