AI and Wife Knowledge
I enjoyed this article by Martin Rowson: “I asked AI to name my wife. To the hopelessly incorrect people it cited, my deepest apologies” but I fear he may be missing the point. I wouldn’t consider a new acquaintance stupid if they couldn’t name my wife, but I would worry a bit if they didn’t know what a wife was or asked me to explain marriage.
Conversational AI is built around plausible guesses designed to keep the conversation going, not absolute facts. In this respect the AI seems to be working well. I use AI a lot to help me write software, but I don’t mind that it doesn’t know who my wife is. In fact, I work with lots of other folks in the same position.
I regard AI as a useful tool that increases my efficiency and lets me realise my ideas in areas where I have limited skills. I thought I might enhance this post with a cartoon example of what might happen if a person or an AI confused “wife” with “wine”, so I asked ChatGPT to illustrate this with a cartoon “in the style of Martin Rowson”.
I was pleased when it refused to do this owing to content guidelines. I was less pleased when it helpfully suggested weasel words to get around this restriction. But then these didn’t work either. We went back and forth for a while and eventually I congratulated ChatGPT on how well it was protecting the interests of artists.
I then asked if it would refuse to write “in the style of Rob Miles” if anyone asked. It said it would not reject such a request. Apparently images and words are treated differently. If I had learned to draw, my output would be safer against AI impersonation. Oh well. I eventually got my image by asking it to draw two penguins at a party in an ice palace.
The dangers of confusing “wife” with “wine”. The penguin on the right is saying “I’ve got lots that I keep in a cellar under my house”.
If you are interested you can read the entire exchange here.