Category Archives: technical

making technical thoughts

Making real things

I've been busy the past few months, jumping from one project to the next without taking time to pause, but I've just finished the Iron Cat helmet that I've been working on for a few months, so it feels like a good time to sit down and do a quick round-up of everything I've been busy with. I'll do a full write-up of each one over the next few weeks, so keep a watch out for those if you want to know more.

In October and November 2016 I was lucky to get onto a series of workshops run by Code Liberation in conjunction with the V&A, Goldsmiths College and Machines Room. This culminated in showing our games at the November V&A Lates. It was a great introduction to programming in Unity for me. I got to use the Blender skills I had been learning on my Red Robot project by building a 3D duck model for the game, and I also realised how much I love making things with wood.

At the start of this year I took a six-week course in using a metalworking lathe at Uxbridge College. It was a bit like being back on my BTEC after all these years. The course wasn't amazing, but I do feel a lot more confident using a lathe safely, which is what I wanted. Now I just need to work out how to get a lathe and a workshop of my own.

In January I started making what I have called #IronCat, an Iron Man helmet for a lucky arm-waving cat. The first attempt was scrapped because the pattern was too complicated when shrunk in size, but I managed to find a simpler version. Lots of hot glue, papier mâché, filling, sanding and spraying later, it's finally finished. There are lots of mistakes in it and things I would do differently, but all in all it's not turned out too bad.

 

Inspired by this tutorial by Frank Ippolito on the Tested YouTube channel, I painted a child's face mask to be half beautiful and half horror.

 

I then made a wooden skeleton model kit and applied and improved the same techniques I started learning on the mask. I was so happy with this that I've already bought another skeleton to repeat and improve on, with the aim of selling them on Etsy.

 

So that's that then. Oh, and there was a Saturday afternoon taxidermying a hamster that sneaked in as well, and a model car from the same website I downloaded the skull model from.

It wasn't really a conscious decision to spend so much time making physical things rather than coding or playing with LEDs, but I wanted to do more solid 3D work to help me think in three dimensions when working with Blender, and I want to apply some of these techniques to exploring VR and mix them with practical-effects work.

 

What's next? Well, I'm hacking a Tiny Tears doll into something more interesting and fun. This will be back to coding and Raspberry Pi, but also mixing in some sculpting, silicone moulding and resin casting to keep up the theme of playing with real things.


Art design developer making Projects technical

What I did with lightning data, a Raspberry Pi and anti-static bags

Do you ever wake up in the middle of the night when there is a thunderstorm happening and think "I wonder if it's possible to make a digital sculpture that visualises lightning data using LEDs"?
No?
Must just be me then.

That was back in March, and after lots of faffing around with wire, plastic, metal and code I have made it happen. Here's how.

The first thing was to see if it was actually possible, and for that I had to see if the data was available. There are commercial services that provide the information, but they tend to be quite expensive, as the data is valuable to (I think) insurance and similar industries. I found an amateur network of people who run detectors and upload their readings, but they don't publish the data unless you host one of the detectors. I tried the Met Office, but they only publish lightning data for around the U.K., they publish an image file rather than actual data I could do anything with, and there isn't enough lightning happening around the U.K. for it to be interesting.

After finding lots of Flash-based sites from which it would be either difficult or impossible to extract the data, I found the University of Washington's global map of lightning data at http://wwlln.net/new/map/. A bit of poking around and it was possible to find the underlying data to work with: http://wwlln.net/new/map/data/current.json

Next was the technology choice. It would be powered by a Raspberry Pi and use NeoPixel strips for the LEDs. I considered both Python and NodeJS for writing the code. Python would probably have been the better choice, and I think it would have been faster and easier, but I hadn't done any NodeJS for a while and wanted to dip my toes in again, so that's what I plumped for.

The first part of the development was controlling the NeoPixels from the Raspberry Pi. I used this library https://www.npmjs.com/package/rpi-ws281x-native and set to work playing with the NeoPixels.

It was around this time I had the idea of using metallic anti-static bags.

The eagle-eyed among you might notice that I'm actually using an Arduino for that test.

I also started to think about what the sculpture would look like. Not straight and angular like a single forked lightning strike, not a sphere with dots plotted on it; I wasn't going for an artistic interpretation so much as a simple data visualisation. I began to think about the swirliness and chaos of clouds and storms and started sketching some ideas.

I eventually settled on four long strips of NeoPixels, intertwined, each strip representing one quadrant of the Earth, taking the point where the equator and the Greenwich meridian intersect as the centre, as that seemed a sensible thing to do. I fell into a bit of a hole learning about map projections and plotting the data onto images of the Earth using P5JS. It wasn't really necessary for the final outcome, but I wanted to check I was doing everything correctly and it felt like the right thing to do.

lightning data plotted onto map of Earth

I did some prototyping in P5JS to get an idea of what it would look like and to start working with the data.
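The quadrant mapping itself is simple enough to sketch. This is an illustrative Python version rather than the project's actual NodeJS code, and the strip length here is a made-up number:

```python
# Illustrative sketch: map a lightning strike's latitude and longitude to one
# of four strips and a position along it. Quadrants are taken around the point
# where the equator meets the Greenwich meridian. LEDS_PER_STRIP is invented.

LEDS_PER_STRIP = 60


def strike_to_led(lat, lon):
    """Return (strip_index, led_index) for a strike at (lat, lon).

    Strips: 0 = north-east, 1 = north-west, 2 = south-east, 3 = south-west.
    The LED position is taken from how far through its 180-degree span of
    longitude the strike sits.
    """
    strip = (0 if lat >= 0 else 2) + (0 if lon >= 0 else 1)
    led = min(int(abs(lon) / 180.0 * LEDS_PER_STRIP), LEDS_PER_STRIP - 1)
    return strip, led
```

Picking longitude for the position along the strip is just one choice; any consistent projection onto the strip would do.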

Now that I knew I could control the LEDs, had an idea of what I wanted the end result to look like and was comfortable with the data, I began to pull it all together.

To stop the NeoPixels from just dangling down limply, I came up with the idea of using the waterproof strips that have a clear plastic covering over them. Into this I pushed a thin solid wire that I had inserted into clear heat-shrink sleeving to prevent it electrically shorting the LED strips. This was a lot harder than I had anticipated, and there was a lot more swearing than I thought there would be.

These strips were then covered in sleeves I had made by cutting and gluing anti-static bags. That was also more difficult than I thought it would be, involved burning my fingers with hot glue, and I didn't really work out the best way to do it until it was finished. But it kept me amused on Twitter for a few evenings.

A few hours were spent wiring it all up and crimping the connectors together, then taking it all apart and doing it again properly so it would actually work, and it's now finished. It can run either with the Raspberry Pi connected to the internet, pulling down updates to the data, or offline if there is some data loaded.

There are still some bugs in the software that need fixing. I want to update the code so that it automatically detects whether there is a live internet connection and, if not, runs in offline mode. It has run successfully for several hours a day for five days at an art show for Science Museum staff.

 

I've already started on getting rid of the breadboard wiring for the chip that converts the output from the Pi to a voltage level suitable for the NeoPixels, but that's a story for another day.

PCB next to breadboard with chip and wires


I'd like to exhibit it further but don't really know how to go about that. I would love to hear from anyone who could help me with further exhibiting or knows how I would do that. Leave a comment or get in touch on Twitter.

So here it is running:

 

developer making technical

Going Beyond Arduino part 1

This is part one of probably three, but that might change.

I love Arduino, you should love Arduino, we all should love Arduino. Good, that's settled then. But if Arduino is so good, why would I want to move away from it?

Like a lot of things I start playing with, I want to take them apart, explore them deeper and find out how they work, even if that means things get broken in the process. With Arduino that started when I began listening to the Embedded.FM podcast. If you don't listen to Embedded.FM I really recommend it; Elecia and Chris cover everything from hacking BB-8 toys, through STEM education, to the control of quadcopters and satellites, with the occasional cat interview thrown in for good measure.

Right, back to basics and a few terms explained. The Arduino is a microcontroller board. A microcontroller is a simple computer that combines, on a single chip, all the components that are normally on separate chips wired together.

To make programming easier, the microcontrollers on Arduino boards are pre-programmed with a bootloader. The bootloader is a small program that runs on the microcontroller and allows code to be sent straight from your computer to the microcontroller without any additional hardware.

My motivation when I started this was to completely remove the Arduino environment. That meant no bootloader, no Arduino IDE or libraries, and writing the code in C.

The first step I took was to buy an ATtiny85 microcontroller. These are the chips found on the Adafruit Gemma; they are simple and cheap enough not to have to worry if things go wrong.

After getting the ATtiny85 working, I started playing with the chips on an existing Arduino board that were already programmed with the bootloader, which threw up other surprises I didn't know about. I'll cover those in a future part.

Before I begin, it's worth mentioning that all the hardware and software I'm talking about is relevant to the Atmel AVR chips which power the majority of Arduino boards. There are some boards that use ARM or Intel chips; I haven't explored those yet.

To program the microcontroller without the Arduino environment, two things are needed: a programmer to talk to the microcontroller, and software to send the code to the microcontroller.

There are lots of different hardware programmers available; it's even possible to set up an existing Arduino board as a programmer.
This is the one I have:
USBASP Programmer

It plugs into the USB port and includes a cable and an adapter to convert the header on the cable from 10 pins to 6 pins. The other small device in the centre of the cable is a breadboard adapter from Adafruit that makes using the programmer with a breadboard easy.

Searching for usbasp on eBay will turn up loads of these, some as bare boards and some in a nice green enclosure like mine.

The software needed to send the programs from your computer is called AVRDUDE. There might be other software for doing this, but I'm not aware of any.

For installing AVRDUDE, Limor Fried (Lady Ada of Adafruit) has tutorials for Windows, Mac and Linux.

I installed AVRDUDE by installing CrossPack, as suggested.

It's worth having a read of all of Lady Ada's AVR programming tutorial. I found it really useful.

So that's the beginning covered: the chips, the hardware and the software. Next time we'll build a simple circuit and program the ATtiny85 using the USBASP programmer and AVRDUDE.

technical

Hacking a rowing Robot

Robots are cool, right? And the coolest robot around is the Nao robot, right? And wearable technology is cool, right? So what could be better than a hackathon themed around interfacing Nao with some interesting wearable technology? I couldn't think of anything cooler, so when I saw exactly that I signed up straight away.

Originally I thought it would be interesting to do something museum-related, but as I'd already spent a day with Nao and museum people exploring how Nao could be used in a museum, I decided to look at something different.

Recently I'd used Blender to design a little version of the rowing boat that is going to be used by Sarah Weldon when she attempts to become the first woman to row solo around the U.K. in 2016. I thought it would be interesting to imagine what would happen if Sarah could take a Nao on board with her, and how it could be used.

The day started off with an introduction to Nao, the devices that were available to interface with, and a good introduction to programming Nao using Choregraphe and Python.

I teamed up with Sam Ahern, who I'd previously met at a Flossie event and who is working with Lego Mindstorms for her MSc, and Alan Rushforth, who brought a Nao robot for us to use and much-needed knowledge of C++.

Getting started, we managed to get a Thalmic Labs Myo armband and a pulse oximeter. I started playing around with the Myo armband and Sam started on the oximeter.

With the help of Alan and his C++ skills we managed to get data out of the armband and write a Python program to roughly detect a rowing motion, based on the roll axis of the device. Sam had been fighting with her new laptop and the oximeter, so we weren't able to use that.

On day 2 we put together our demo. Sam worked on making Nao move to indicate whether the strokes per minute you were rowing at was too high, too low or just about right. With a bit of fine tuning and practice it worked pretty well, right up to the point we had to demo it.
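The detection idea can be sketched roughly like this. This is a re-creation of the approach rather than the actual hackathon code, and the threshold and pace bounds are invented numbers:

```python
# Hypothetical sketch of stroke detection from a stream of roll-angle samples.
# A stroke is counted each time the roll angle crosses a threshold upwards.

def count_strokes(roll_samples, threshold=0.5):
    """Count upward crossings of `threshold` in a sequence of roll angles."""
    strokes = 0
    prev = roll_samples[0] if roll_samples else 0.0
    for roll in roll_samples[1:]:
        if prev < threshold <= roll:
            strokes += 1
        prev = roll
    return strokes


def pace_feedback(strokes, minutes, low=18, high=28):
    """Classify strokes per minute, as Nao's movement did in our demo.

    The low/high bounds here are made up, not our tuned values.
    """
    spm = strokes / minutes
    if spm < low:
        return "too low"
    if spm > high:
        return "too high"
    return "just right"
```

In practice the Myo's roll signal is noisy, so some smoothing before the crossing test would be needed.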

Nao sitting
Our Nao robot for the weekend

It was really great to watch what the other teams had worked on. My favourite was the group who used Nao to treat people with sleep apnoea: the oximeter detects the sleep problem, and Nao either taps the sleeping person to bring them out of the state or directs another person to move the sleeper into a better position.

It was a really great weekend and I learnt so much. It was all about thinking of novel ideas for how Nao and robots could be used. I could really imagine Nao being taken on long solo voyages to act as both a companion and an intelligent dashboard: setting the rowing pace by interpreting sensor data from the rower or sailor, taking data feeds from GPS and radio to warn of obstacles, and acting as a mini cox. A robot has already been taken to the International Space Station, so maybe by the time Sarah is ready to depart on her expedition around the U.K., Nao will have got his sea legs and be ready to accompany her.

If you want to learn more about Sarah and her epic row around the U.K., go to her website, Oceans Project, and follow her on YouTube.

To find out more about Nao, look on the Aldebaran Robotics website.

And a big thank you to Carl Clement and everyone at UKNAO and QMUL for making the great hackathon possible.


technical thoughts

Making A Nothing - The story of two Twitter bots

Making nothing is just sitting around not doing anything, but making A Nothing is making something that has no content. While developing the Twitter bot @X3Prospero I became fascinated with Twitter bots: not just their types, variety and content, but the continuous metronomic beat of their function, sending out messages repeatedly, never stopping or resting or needing any user input.

Yes, there are ones that are intermittent, maybe linked to physical phenomena, data feeds or machines, but these don't hold the same fascination for me. What fascinated me was that the operation of the Twitter bot was more the message than the content it carried.

Just after Christmas I decided to start experimenting with this thought, and I have developed two Twitter bots so far. The first, @ColoursAll, is a bot that will tweet every colour represented by the RGB colour model, from 0x000000 (black) to 0xFFFFFF (white). The code is finished but I'm just sorting out hosting for it; it should be up and running in about a week.
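The sequence itself is trivial to generate. A minimal sketch of the idea, leaving out the actual tweeting through the Twitter API (this isn't the real bot's code):

```python
# Sketch of @ColoursAll's content: every 24-bit RGB colour in order,
# formatted as a tweetable hex string.

def colour_tweets(start=0x000000, end=0xFFFFFF):
    """Yield a '#RRGGBB' hex string for every RGB colour from start to end."""
    for value in range(start, end + 1):
        yield '#{:06X}'.format(value)
```

There are 16,777,216 of them, so at any realistic tweet rate the bot will be running for years, which is rather the point.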

Yesterday I visited the Whitechapel Gallery and saw the David Batchelor exhibition. He has taken five hundred photographs of white squares and rectangles in cities around the world. Seeing the projections of the squares quickly cycle through, the background changing but always a white rectangle in the centre of the screen, made me think about making a Twitter bot that had even less content than @ColoursAll but went through the most limited content I could think of with that same beat.

Last night I ran @TickTockBot for just over an hour; its tweets were either the word 'tick' or 'tock' followed by an ascending count. The count was only there because Twitter doesn't allow duplicate tweets. After two hundred and ninety-three tweets it stopped: the bot had hit the limit imposed by Twitter.
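The bot's output is almost nothing at all; roughly this (a re-creation, not the original code, and whether the real bot started on 'tick' or 'tock' I'll leave ambiguous; this one starts on 'tick'):

```python
# Sketch of @TickTockBot's output: alternating 'tick'/'tock', with an
# ascending count appended only because Twitter rejects duplicate tweets.

def tick_tock(n):
    """Return the first n tweets the bot would send."""
    return ['{} {}'.format('tick' if i % 2 else 'tock', i)
            for i in range(1, n + 1)]
```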

 

The code for @ColoursAll is on Github

The Storify of @TickTockBot is here


museums technical wearable

Virtual Reality - Is this real life or just fantasy?

virtual reality helmet from the 1990s

Back in the 1990s, Virtual Reality (VR) was going to be the next big thing, and for a moment it did look that way; then it all went wrong. The headsets were big and clumsy and generally not very good. VR went away for a long time, until the Oculus Rift took Kickstarter by surprise. The company went on to be bought by Facebook for over $2 billion and itself kickstarted a new VR industry.

With this new interest in VR, I thought a round-up of some of the headsets and applications would be a good idea. The Oculus Rift is now in its fourth developer version, with a full consumer version intended to launch next year.

Other manufacturers haven't been resting on their laurels.

Samsung have taken a slightly different approach. The Gear VR does not have its own display but is a holder for the Samsung Galaxy Note 4; the phone acts as both the display and the processing power of the device.

All the current devices can be split into either Rift-type or Gear VR-type devices, and each approach has its own advantages and disadvantages. The Oculus Rift requires a wired connection to a PC or games console. This limits its mobility and flexibility, but enables the more detailed and responsive graphics that a powerful graphics processor can deliver. The Gear VR can be easier to set up, and with no tether to a PC it is more mobile, both in where it can be used and for the person using it. The disadvantages of this approach are battery life, the power of the graphics processor (GPU) in the phone, and the fact that using the phone like this for long periods can cause the GPU to overheat; to protect itself it will scale back its output quality, either the frames per second or the detail that is shown.

Google took the Gear VR approach to the extreme when it launched Cardboard at I/O, its developer conference, earlier this year: a cardboard kit that folds up into a VR headset and can accept a wide range of phones. I recently bought one of these and was genuinely impressed at how good it is. There is a difference between what Cardboard can do and what the Rift can do, but there is a place for both. The Rift is such an immersive experience that if you aren't used to it, it can be better to use sat down; I have seen people stumble around as they become consumed in the virtual world. Having taken Cardboard to work and let a few people try it, it can be a much more social experience, passing the headset around and comparing and sharing experiences. Less totally immersive; different, but not less.

Similar to Cardboard is the DIY VR headset, which also adds trucker-cap mounting and includes the Leap Motion sensor, as the Oculus Rift has done. Headsets like Cardboard and DIY VR will get more people trying out VR, developing for it and thinking up ideas and applications, and that can only be a good thing.

Other devices that fall somewhere between Cardboard and the Gear VR are the Archos VR headset and the Zeiss VR One.

Don't worry, I haven't missed the obligatory 3D printing and Arduino mention; here it is, with the Adafruit VR headset.

The most well known of the other Oculus Rift-type devices is the Sony Morpheus: not yet released and, like the Rift, probably launching in 2015, but at the moment it hasn't had the same amount of public testing. Morpheus is intended for use with Sony consoles, though just as the Microsoft Kinect was soon hacked to work on other devices after launch, the same could well happen with Morpheus.

Microsoft themselves probably have a VR headset in development, but less is known about it, and it's currently more rumour than confirmed product.

There are lots of applications for virtual reality devices; really they need a separate post, but a couple of notable ones are the Volvo app for Cardboard, the Paul McCartney app for Cardboard and the Thomas Cook 360 experience for the Gear VR. These aren't small niche ideas but large brands using VR to demonstrate their products in new and interesting ways.

I've not heard much about VR being used in museums or galleries yet, except as an experience to try the technology. The only exhibition I'm aware of using VR is De/coding the Apocalypse at Somerset House. Overall the exhibition is very good; it uses digital technology in a very restrained and grown-up way, but the use of the Oculus Rift didn't really work for me. I think mainly because it was tethered, I'm guessing for security reasons, and the cord was too short, again guessing, but probably for health and safety. I wanted more freedom of movement. Glad they tried, though. I'm looking forward to seeing more applications for VR; given the number of headsets available and the software being developed, there will surely be some interesting concepts.

Caught in a landslide, no escape from reality

technical thoughts

A New Steering wheel?

I'm not a massive film fan, but there are a few I really love: the original Star Wars trilogy, the original Italian Job and just about every Bond film. Maybe I just enjoy films with classic and beautiful cars in them; nothing wrong with that. One of my favourite features of classic cars, apart from the lovely styling of the bodies and the sound of the engines, is the steering wheel. Admittedly modern steering wheels are much safer, well padded and with an airbag in them to prevent serious injuries, but the simplicity of the classic Moto-Lita wheel is a work of art.

Aston Martin DB5 from Skyfall
Aston Martin DB5 from Skyfall

The modern Formula One steering wheel is packed full of electronics and technology and costs around $50,000. I say 'around $50,000'; whether they actually cost that I have no idea. Every article and commentator quotes that exact, conveniently round figure, so let's just say they aren't cheap. I do like that the trophies of the Australian Grand Prix are replicas of the steering wheel of one of Jack Brabham's cars. A lovely nod to the past.

The steering wheel for Bloodhound SSC, the car hoping to be the first to reach 1,000 mph, will be 3D printed and has been designed to fit Andy Green's hands exactly. Again, an amazing piece of technology, but not simply beautiful in the way the Moto-Lita is.

Early cars didn't have wheels to steer with but used tillers, as found on boats. Were tillers used just because there was no suitable analogy from existing land vehicles? Horses use a bridle and reins, which aren't suitable as there is too much play in the mechanism. Trains just have rails, so aren't steered at all.

As cars moved from three to four wheels and the technology advanced, the steering wheel became the standard way to steer a car.

There have been one or two attempts to do things differently. Early models of the Austin Allegro had a square 'quartic' steering wheel, which was a flop and was replaced with a standard circular wheel on later models. When I was a child our family had an Austin Allegro. It was a 'V' reg from 1980, so it must have had the standard wheel, although I don't remember as I sat in the back. I do remember thinking it was very luxurious as it had a pull-down armrest. I'm easily impressed.

The Mirov 2, a 'revolutionary sports turbo from the Soviet Union', had a steering wheel that could change from the right to the left-hand side. Except it didn't: it was a fictional car created for a Norwich Union advert. However, the McLaren F1, whilst having a conventional steering wheel, did feature an unusual seating layout, with the driver in the centre and two passengers either side and slightly further back. The F1 wasn't the first car to have the central driving position; that honour goes to the 1935 Alfa Romeo 6C 2300 Aerodinamica Spider, a beautiful car with many innovations. The prototype of the Land Rover had a central driving position too, but as the project developed it reverted to a conventional layout.

1935 Alfa Romeo 6C 2300 Aerodinamica Spider

The Matra Bagheera of 1973 had three seats, but the driver sat conventionally on the left-hand side. All I can say about this car is that it looks horrible, and so are all the websites and videos on YouTube about it, especially the 'sexy' ones. I'm not linking to any; the internet is no place for videos of ladies taking their clothes off and doing rude things.

So why a blog post all about steering wheels? Well, it was seeing this competition to design a steering wheel for Ford. How odd, I thought: while Google and other hi-tech firms in Silicon Valley are designing self-driving cars, Ford are updating the steering wheel.

Sadly I didn't enter the competition, as I was too busy with my own work and study and only saw it a few days before the closing date. From reading the competition description, it really is aimed at serious designers who can actually do things with pens and pencils.

Here is my idea for the competition. Rather than have a steering wheel, have a flat surface in front of the driver. On that surface, have a model of the car; using cameras around the car plus live satellite and other sensor data, project the environment that is currently outside the vehicle, including all the other cars on the road and the buildings and surrounding areas.
Imagine something like the video below, small and fitted into the dashboard area of a car, but rather than the marketing shots of the car driving around the track, all the live data, projected from within your own vehicle.

PERCH Car from PERCH on Vimeo.

 

The majority of the time the car would be computer-driven, but if you did want to take over, it would simply be a case of moving the model car on the flat surface in the same way you move a computer mouse.

I don't think anything I've described here is going to be possible for the next few years. The cost and bulk of the projection, and getting the data into the car, will need a reliable high-speed data connection.

What I do think is possible is that there could be big changes coming to the whole automobile industry very soon if self-driving cars become a reality, and it does look like they could. There are still lots of hurdles to get over, not just technical but legal and administrative.

A quote often incorrectly attributed to Henry Ford is "If I had asked people what they wanted, they would have said faster horses."  It would have been so much cooler for this post if he had said it.

Let's just pretend he did say it. Are Ford currently trying to make faster horses while the technology industry works on new cars that will radically change how we think about cars? It's quite possible. Cars are seen as less of a status symbol by many younger people now, and environmental pressures are demanding they be made cleaner.

I'm looking forward to seeing what the winning entry of the competition looks like. Hopefully it's not just a piece of circular metal and plastic. If it isn't radically different, I can honestly imagine Ford soon joining Nokia as one of those companies that once dominated an industry but were quickly forgotten when a competitor came from nowhere and introduced a massive change.

 

developer python technical thoughts

Do you remember the Mackerel?

You might remember my recent post on writing a short Python program to solve the Yakult problems on the Tube.
I said then that I didn't think it was the only or best way to solve the problem. I've spent a few minutes tonight improving it, and I have also made it take a command-line argument so it can be run from the shell,

e.g
>> python yakult-problem.py mackerel

st. Johns wood

Not going to go into details here, but if you are interested it is up on github
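The command-line handling is only a few lines. Something like this sketch (the real script is on GitHub; the usage message and structure here are my illustration, not the actual code):

```python
# Sketch of reading the word to test from the command line with sys.argv.
import sys


def main(argv):
    """Take the word as the first command-line argument."""
    if len(argv) < 2:
        print('usage: python yakult-problem.py <word>')
        return 1
    word = argv[1].lower()
    # The real script would now search the station list for names
    # sharing no letters with `word`.
    print('searching for stations sharing no letters with', word)
    return 0


if __name__ == '__main__':
    sys.exit(main(sys.argv))
```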

developer python technical thoughts

Solving the 'Yakult Problem' in Python

They say one of the best ways to learn something is to teach it. I think they do; I'm sure I have heard that somewhere, though I'm never sure who 'they' are, and even if they don't say it, I'm saying it now. As I'm doing a proper course in learning Python, I decided to write this short tutorial on one aspect of the Python language.

If you live in London you may have seen posters advertising the Yakult pro-bio-whatever drink stuff. I haven't the faintest idea if it's any good, but I do like the posters with a short teaser on them. Something similar to:

Yakult riddle


O.K., let's break the problem down.

1. Load the file in.

2. Go over (or iterate over, as we say in computer speak) all the station names.

3. Iterate over all the letters in the word that I am testing the station names against.

4. Iterate over all the letters in the station name and compare them against the word being tested.

5. If there is a match of letters, stop and move on to the next station; no point in carrying on.

6. If no letter matches are found, store that station name for retrieval later.

Loading the File

It took a little while to find a list of all the station names in an easy-to-use format, but I managed it after a bit of searching and messing. The stations are stored in a file called station_list.txt.

#Create a list to store the station names in

station_list = []

#Open the file called station_list.txt and store it in the object called stations_file

stations_file = open('./station_list.txt', 'r')

#Read the stations in from the file and store them in the list, converting the characters to lower case

for line in stations_file:
    station_list.append(line.lower())

stations_file.close()

I don't want to focus on the file reading as it is not core to this exercise. Converting the characters to lower case was something picked up in testing, when I realised that the upper-case initial letters of the station names were not triggering matches against lower-case letters in the chosen string.

OK, rather than going line by line, it will make more sense if I add in the three iterations of steps 2, 3 and 4, then flesh those out with what happens in each one.

 

for station in station_list:
    for my_char in my_string:
        for station_char in station:

 

 

As described above, the program iterates over station_list. my_string is the name of the variable that holds the string passed to the function when it is called; that is then iterated over, and finally every character in the current station name is iterated over.

So the first thing to do is compare the current station character with the current character in my_string, and if they match, stop the current loop:

if station_char == my_char:
    break

 

That stops any more tests on that station name happening for that one particular letter in my_string, but carrying on testing my_string at all is wasteful, and we haven't done anything to store the station name if none of the characters match.

 

if my_char_count == len(my_string) - 1 and not found_letter:
    result_stations_list.append(station)

 

You will see that these lines test a variable called my_char_count against the length of my_string, and a variable called found_letter. This finds out whether the current character is the last one in the word and whether no matches have been found.

If that test is passed, the station name is added to the list of results with

     result_stations_list.append(station)

Finally, my_char_count is incremented, and when all the iterations are complete, result_stations_list is returned from the function:

 

my_char_count += 1

return result_stations_list
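Putting the fragments together, the whole function ends up looking something like this condensed sketch (the exact version is on GitHub; I've used a found_letter flag in place of the counter to keep it short):

```python
# The pieces assembled: return the stations sharing no letters with my_string.

def find_stations(my_string, station_list):
    """Return every station name containing none of my_string's letters."""
    result_stations_list = []
    for station in station_list:
        found_letter = False
        for my_char in my_string:
            for station_char in station:
                if station_char == my_char:
                    found_letter = True
                    break  # a shared letter rules this station out
            if found_letter:
                break  # no point testing the rest of my_string
        if not found_letter:
            result_stations_list.append(station)
    return result_stations_list
```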

Because the station names are separated by newlines, the list shows each one ending in '\n'. These can be removed if you like, but I'll leave that as an exercise for you to try.

 

That shows the basic workings of the program; the full thing is on GitHub, so have a look and see what you think. I'm not saying this is the best way to solve this problem; I'm sure it's not the only way. But it's the way I wanted to solve it using the Python language constructs we have been learning in class, and hopefully it nails the workings of the language into my head.

For another take on the problem, a solution in Java can be found on this blog.

 

Oh, and if you are interested: 'St Johns Wood'.

making silly technical thoughts

Real Cloud Storage

A few days ago I twittered the tweet above, or is it tweeted the tweet? I don't know; anyway, it doesn't matter. What does matter is that that is the odd sort of thought that goes through my head. Wouldn't it be interesting to send digital data to the clouds? Both the data in our computers and the clouds in the sky are somehow ephemeral and yet long-lived. We see the data only because it is represented on our screens by glowing pixels, turned on or off. That data could have been around for months or years. Data we think can easily be deleted may be out of our control, stored in far-off servers: the cloud.
The water vapour that forms the clouds changes form all the time, falling as rain into the seas and oceans, flowing through rivers and streams. The clouds may last only a few seconds; the water lives on.

A few months ago the Daily Mail published an article trying to explain the leak of Snapchat data, and in it they used the explanation that 'the cloud' is 'not an actual cloud'. Don't worry, that's not a link to the Daily Mail.

But what if it were possible to store data in the clouds? Why shouldn't it be? Not having to rely on energy-hungry data centres, not having to wonder whether our data is safe, and not having to worry about who might have unauthorised access to our private photos or documents would be great.

So I propose the following idea; it's beta at the moment.
cloud_storage

The Cloud making machine would be something like this:

The Arduino would take the data from the computer and control a fan to send long and short pulses of cloud out of the machine. I was thinking of Morse code, as it is easily encoded and would suit the low bandwidth.
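The encoding side is easy enough to sketch. A toy version in Python (the machine itself would run this sort of thing on the Arduino; the timing unit is arbitrary, and only part of the Morse table is included):

```python
# Turn a message into the long and short fan pulses the machine would puff out.

MORSE = {  # a small subset of the International Morse Code table
    'a': '.-', 'b': '-...', 'c': '-.-.', 'd': '-..', 'e': '.',
    'h': '....', 'i': '..', 'l': '.-..', 'o': '---', 's': '...',
}


def to_pulses(message):
    """Encode a message as a list of pulse lengths: 1 = short, 3 = long.

    Characters missing from the table are silently skipped.
    """
    pulses = []
    for char in message.lower():
        for symbol in MORSE.get(char, ''):
            pulses.append(1 if symbol == '.' else 3)
    return pulses
```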

Reading the data back from the cloud may be difficult, but hey, at least it's secure.