Eliza vs GPT

Many years ago there was a program called Eliza. It was very good at making people think it was human. I came across a Basic version in “More Basic Computer Games”, typed it into my Micro-Tan and got it working. It was great fun. It pretended to be a type of psychiatrist but all it ever did was parrot back to you what you had entered. My favourite bit of the code was the part which changed “I” to “You” and “My” to “Your” and so on, so that your words could be sent back to you as if the computer understood what you had just written. What struck me about the reaction of other people to the program was how easy it was to make them think the thing understood what they had entered and, much more scary, how keen folks were for this to be the case. They really wanted to believe the software was properly clever. Me, I just wanted to type in things like “I’ve just shot grandma” so that I could get back the response “You’ve just shot grandma? Tell me more about your family”.
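The trick is surprisingly small. Here’s a minimal sketch of the pronoun-swapping part in Python; the word list is my own guess at the idea rather than the original Basic listing.

REFLECTIONS = {
    "i": "you",
    "i've": "you've",
    "i'm": "you're",
    "my": "your",
    "me": "you",
    "am": "are",
}

def reflect(sentence):
    # Swap each word for its "reflected" version so the input
    # can be parroted back as if the program understood it.
    words = sentence.lower().split()
    swapped = [REFLECTIONS.get(word, word) for word in words]
    return " ".join(swapped).capitalize()

print(reflect("I've just shot grandma") + "? Tell me more about your family")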

I was strongly reminded of Eliza when Ross was showing me how good ChatGPT is at writing programs. He asked for some Arduino code to make lights flash in response to sensors and what came back looked like fairly convincing C. It was very impressive. But it is still not clever. It is just taking a bunch of stuff from you, looking things up and then crafting a response that chimes with what you expected to see. Sometimes it might combine things in ways you don’t expect, sometimes it will find things that strike you as original. And it might react differently from Eliza if you tell it you just shot grandma. But I don’t think it’s clever like we are. That’s not to say that it won’t change the world though. It will. For one thing, search engines are going to get a lot easier to use and a lot more conversational. For another, the essay and the programming exercise are about to get massively devalued as a way of assessing knowledge. Some students will use ChatGPT to craft their submissions. Others will question why they are being asked to write something which can be done better by a machine.

For me the hardest thing about writing and programming has never been turning out the prose or getting the code to work (although that can be fiddly); it has been working out what the program needs to do, or thinking up a good subject and then crafting a narrative that works well with it. I like to think that with tools like ChatGPT taking more of the “grunt work” out of the way we could focus our efforts on these human parts of problem solving. I’m looking forward to playing with it.

Christmas Meetup Fun at Hull Makerspace

We had our Hardware Group Christmas Meetup at Hull Makerspace this evening. We had sweets, mince pies, instant pictures, a robot dog and a digital trombone. As you do. Much fun was had. I’d taken along “Bluey” my Sony Aibo (he’s called Bluey because he’s, well, bluey) and he was on fine form. He could easily see his pink ball against the Makerspace floor and it was great to watch him chasing it around the place. We’ll be having more events next year. I’ll keep you posted.

Instax Overdrive

It is quite fun working with chemical photography instead of digital. It behaves in ways you might not expect. If you put too much light on the Instax film it goes dark and then produces a negative of the image. The picture on the left shows this in action. It is supposed to be a multiple exposure of some lights.

The centres of the bulbs should be bright white rather than purple or green. This is due to some chemical reaction or other. People have even managed to create proper looking pictures by shining very powerful flashes onto negative images. Instant photography has made picture taking harder, costlier and more of a lottery in terms of what you get. I love it.

Dall-E 2 and Kids

“Two unicorns on a rooftop garden having tea”

Number one granddaughter came to see us today. We were talking about stuff and playing games, as you do. And then I thought it might be fun to show her Dall-E 2. This is a lump of cloud-based AI that will take text from you and make a matching picture. We decided we wanted to see two unicorns on a rooftop garden having tea. Above you can see one of the results.

Then we tried some more phrases and we found that some worked, and some didn’t. All the time I was wondering whether showing this stuff to a five-year-old was the right thing to do. Would she now give up drawing and work on the basis that she can just ask for pictures to be drawn when she wants one? I really hope not. And actually, I don’t think she will.

The program is very clever but pretty soon we started to find that it tended to head off for the one thing it knew about, which was not always what we wanted. Tools like these are going to find their way into our lives whether we want them or not and knowing what they are and how they are limited is really important.

DDD North in Hull was completely wonderful

DDD North today at Hull University. Lots of interesting talks. Including one from me. First live event since forever. What’s not to love. Although I really wish I’d taken a notebook and pen. Completely forgot how useful they are.

The first session I went to was by Derek Graham. How to be psychic. It turns out that you don’t actually have to be psychic to write useful code. But if you get it right you can certainly appear to be. This was a great description of sensible things to do when writing software. I particularly liked his tip for someone who didn’t know what to make. He said “Make something”. There’s a hugely important lesson here for folks learning to program: “Your first attempt doesn’t have to be right”. Even if you are an expert it is virtually guaranteed that your first attempt will not do everything the problem demands. So why not lean into that, make something that kind of works and then have a framework that lets you iterate. I was also impressed by the term “walking skeleton”: a version that does the absolute minimum the requirements need but will serve as the basis of more discussion and development to get to the finished solution. Derek even mentioned UML (great stuff) and a tool that I’m going to look at for making diagrams from text: https://plantuml.com/

The second session was by Luce Carter. Productivity++: Things I Have Learned from Managing My ADHD. This was a very confident presentation of strong content. Turns out that it is really all about organisation and mindset. I’m not sure I’m prone to ADHD any more than I think I might be psychic. But it was a great description of tools and techniques that you can use to keep yourself moving forwards. Luce mentioned a tool called Notion which looks really interesting. I organise my work using a single Word file that contains my diary and all my projects. Not optimal. Notion looks super useful. It stores data in the cloud and runs across all my devices.

Third session was by John Stavely. Getting started with Satellite IoT. It turns out that there is low-cost gear you can get which lets you send packets of data to a satellite as it flies overhead. Then, when the satellite goes over a downlink it will send the data down to earth where it appears in MQTT messages that can pop up in your Azure IoT Hub application. You have to do some work to predict when to send to the satellite. They come along every now and then and precess as they orbit the earth. You also have to do a bit of error correction and bit twiddling to make the best of the 64 bytes you can send. But it means you could make something that can send data from anywhere on the surface of the earth. Amazing. At the moment it is even free to use. You’ll have to buy some kit and you need somewhere with a good view of the sky, but it works. John has a GitHub site here with his software and more details.
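The receiving end doesn’t need to be complicated either. Here’s a rough sketch of an MQTT subscriber in Python using the paho-mqtt library (1.x style callbacks); the broker address and topic name are made up for illustration and are not the real satellite service.

import paho.mqtt.client as mqtt

# Placeholder broker and topic - substitute the details of your own service.
BROKER = "broker.example.com"
TOPIC = "satellite/downlink/#"

def on_connect(client, userdata, flags, rc):
    # Subscribe once the connection is up
    print("Connected with result code", rc)
    client.subscribe(TOPIC)

def on_message(client, userdata, msg):
    # Each downlinked packet arrives as a small payload (up to 64 bytes)
    print(msg.topic, msg.payload)

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883, 60)
client.loop_forever()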

Thanks to DDD North for the picture

Then it was time for my talk. I was talking about getting started on the internet. In rhyme. I changed into my red jacket and went for it. I really enjoyed the talk. I just hope the audience did. I’m not sure how many folks learned much, but I like to think they picked up a few interesting rhymes in amongst the cheese puns. I was quite merciless in my extraction of funds for Red Nose Day. I think for everyone who came along it was the most expensive experience they’ve had for a while. But by the end we’d raised over 120 pounds for a super-good cause. The audience were fantastic. They rolled with all the punches, threw money at me (in the form of carefully folded five-pound notes - not painful coins) and went along with everything. I’d taken my Mint TL70 and I was taking pictures of the audience wearing my big hat (for a fee of course). The camera did a great job. People loved having a physical picture of themselves to take away. You can donate too if you like. Go here.

The final session I went to was from Don Wibier of DevExpress. State Management in Blazor. I’m getting very interested in Blazor. It will be featuring in the next version of the C# Yellow Book. There was some great technical content, but my mind was a bit full of bad rhymes and not in a state to absorb a great deal. Fortunately, Don has put a whole slew of videos on YouTube which I now intend to search out.

Thanks so much to the DDD team, and particularly Boss, for setting up such a wonderful event.

Red Nose Day Reprise at DDD North on Saturday

I’m doing a session at DDD North this Saturday, 3rd of December. I thought it might be fun (and charitable) to reprise the Red Nose Day talk I did this year and try to raise a bit of cash for the cause. The lovely folks at DDD North agreed, and so I’ve switched my session to “How the Web Works”. It’s at 2:30 pm on Saturday 3rd December on the University of Hull Campus. You can find out more about DDD North and register here. You can donate here.

I’ve really missed going to Red Nose Day sessions. Let’s hope the audience feel the same way on Saturday.

The Man from Toronto

The man from Toronto is on the right

Tonight we watched “The Man from Toronto”. It’s a caper movie about a failing fitness instructor who gets mistaken for a ruthless hitman with hilarious and action-packed consequences.

In the old days they used to have “B” movies. These were made because for some reason a trip to the movies used to involve seeing two films. The main feature and the “B” feature. I guess this gave them more time to sell popcorn. Anyhoo, B movies had a slightly sub-par cast and budget and got released once before appearing decades later on Sunday afternoons on ITV. Some of them turned into classics. Some of them gave directors and stars their first taste of success before they made it to the big time.

Now that streaming companies are putting money into movie making I think we are seeing the return of the “B movie”. Much better than a “Made for TV” movie but not quite at the level of a cinematic release and probably destined to be watched on a Sunday afternoon. I’m perhaps being a bit harsh on “The Man from Toronto” by saying it is a bit of a modern “B” movie. It is nicely done and everyone plays their part well. But it is no Fast and Furious film. Worth a watch though.

Mint RF70 photography tips

Took the Mint RF70 camera to Burnby Hall for lunch today. And by that I don’t mean that we ate it when we got there. We had a nice meal at the cafe and then a wander round the garden taking a few pictures before coming home where I got on with Chapter 10. Too much detail? Two words: My Blog.

Anyhoo, I think I’m getting more of a handle on instant photography. Rob’s tips from this trip:

  • Over exposure (too bright) is better than under exposure (too dark).

  • The camera meter takes a reading based on the overall brightness of the scene in front. You can half-press the shutter to set the metering and then frame your subject. If you want to increase the exposure (brighten things), point the camera more at the ground and half press the button. If you want to decrease the exposure (darken things), point the camera at the sky and half press it. Then frame your shot and press the button all the way down to take the picture.

  • Instant photography seems to work well with a big, strong subject rather than lots of little things.

  • Camera shake is a thing. Use a light meter to make sure that the camera is not going to pick a shutter speed slower than 1/125th of a second, which will lead to shaky shots.

Watch Wednesday

Talking of things to watch while the football is on (and there really is a Thing in it too) you might like to take a look at Wednesday. It’s an offspring of the Addams Family franchise which you can find on Netflix. Some of the episodes are directed by Tim Burton and they seem to have spent a lot of money on this. The early episodes are really good with some great sardonic humour and lovely set pieces. The later ones turn a bit “Harry Potterish” but the whole thing is carried along by the acting and production. Well worth a look.

Heading to Driffield

David asked me if I fancied giving a talk to some sixth formers at Driffield. A chance to perform in front of an interested audience? Count me in. I took along a few toys, the trombone controller and my cut-price laptop. Much fun was had. The students were great. Lots of lovely questions (although one person did ask how tall I am - and after I had specifically told them not to do that). Kids eh?

I love telling the tale of embedded development. This is the best time ever to be doing it. Making stuff has never been so easy, so cheap, and so useful for building up your personal brand. I’m looking forward to going back some time in the future to see what they have been making.

David had brought in some trombones for me to look at, including the super shiny one above. It was nice to be able to compare the action of my sensor with something real.

Breath detecting with an environmental sensor

Turns out you can do it. I’ve been wondering how I can detect people blowing into a trombone. As you do. A microphone is one possibility, but that involves analogue to digital conversion and sound processing and stuff. And owning a suitable microphone. I do however have a bunch of BMP280 environmental sensors. These contain an air pressure detector. They are supposed to be used for weather data and determining your height above sea level. Would it work for breath?

The answer is yes. If you put a sensor in a closed box (see above) and then blow into the box you can make a detectable difference to the pressure inside. All you have to do is sample the air pressure at the start and then look for a change of around 5 or so during gameplay.
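Here’s a rough sketch of the idea in CircuitPython using the Adafruit BMP280 library; the wiring, threshold and timing are illustrative guesses rather than the exact code in my controller.

import time
import board
import adafruit_bmp280

# Rough sketch of breath detection with a BMP280 in a closed box.
# Assumes the sensor is wired to the board's default I2C pins.
i2c = board.I2C()
sensor = adafruit_bmp280.Adafruit_BMP280_I2C(i2c)

baseline = sensor.pressure   # pressure at the start, before anyone blows
THRESHOLD = 5                # how big a rise counts as a blow - tune for your box

while True:
    if sensor.pressure - baseline > THRESHOLD:
        print("Blow detected")
    time.sleep(0.1)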

It worked really well for a while. Then the BMP280 stopped working. I had a look in the box and discovered why. It was rather disgusting. Breathing into a box produces not just air, but a lot of water vapour too. The inside of the box and the sensor itself were covered in what you could politely call “dew” but was actually something slightly different. Trombones have a “spit valve” on one end to release all the stuff that accumulates. I’m happy to have proved the principle. I guess I could engineer some baffles or a waterproof membrane over the sensor to keep it dry, but the thought has occurred that in these virus-laden times, passing around something that you take it in turns to breathe into might not be a great idea.

So I’m building a version of the controller that uses buttons rather than breathing.

Hello Harrogate

Yesterday we went to Harrogate for the Knitting and Stitching Show. That is, some people in my party did that part. I’m not into knitting or stitching just yet. I went round Harrogate looking for things to photograph using the Mint RF70 that I’d taken with me. Harrogate was doing its best to be interesting, what with Transformers wandering around and inflatable Thunderbirds vehicles. The weather really wasn’t helping much though. The rain was pretty much constant. I wasn’t the only person there wearing waterproof trousers.

However, we all had a great time. Wool was bought, embroidery was done and we had a really nice lunch at the Fat Badger (strongly recommended). Then I went round one of the many lovely parks near the town centre and managed to grab a few more shots before the heavens opened again. Great day.

Add graphs to your IoT projects

I’m building a trombone controller. And why not? I’m using a distance sensor to track the position of the trombone slide. The output is a bit noisy. But how noisy? A graph would help, but how do I get that? Very easily, as it turns out. I just added a print statement to my CircuitPython application:

print(raw,",",average)

This prints out my raw and averaged values with a comma between them. Then I used Thonny to run the program in the trombone for a while and moved the slide. Then I stopped the program, copied the output of the terminal window into Notepad and saved it with the file extension “.csv” (comma-separated values).

Then I opened the file with Excel (other spreadsheets are available) and made the above graph. It shows how my rolling average (the red trace) cleans up a lot of noise but makes the values lag slightly (look at how the red trace rises slightly after the blue one).
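For completeness, the rolling average itself is only a few lines. This is a sketch rather than the code in the trombone; read_distance() below is a stand-in for whatever your sensor actually returns.

import random

def read_distance():
    # Stand-in for the real distance sensor: a noisy value around 500
    return 500 + random.uniform(-20, 20)

WINDOW = 10        # number of readings to average over
readings = []

def smooth(raw):
    # Keep the last WINDOW readings and return their mean
    readings.append(raw)
    if len(readings) > WINDOW:
        readings.pop(0)
    return sum(readings) / len(readings)

for _ in range(100):
    raw = read_distance()
    average = smooth(raw)
    print(raw, ",", average)   # paste this output into a .csv file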

If you aren’t sure what your signals look like, this is a very easy way to find out. The Arduino IDE has a graphing feature built in that I’ve used once or twice, but there’s nothing like dropping your values into a proper spreadsheet for analysis. And it is very easy to do using the magic of cut and paste.