Love and Large Language Models

We went to see the Fantastic Four movie last night. It was OK. Before the film they had a bunch of adverts, including one for Google Gemini, one of the many AI assistants being forced down our throats at the moment. I found this one particularly depressing when it showed the sample query “How do I know if I am really in love?”. Ugh.

This is not what you should use AI for. AI is for things like “how do I unblock a toilet”, or “how do I create a tuple containing only one element in Python”. Not for affairs of the heart. I guess that the creators of AI have decided that most of us don’t need to unblock toilets or create tuples very often (unless our lives have taken a particularly strange turn), so they are moving into other aspects of the human condition.
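(For the record, that last question does have a nice crisp answer: in Python it is the trailing comma, not the brackets, that makes a one-element tuple. A quick sketch, should your life ever take that particular turn:

    # The brackets alone don't make a tuple; the trailing comma does.
    not_a_tuple = (42)    # this is just the integer 42
    one_tuple = (42,)     # this is a tuple containing a single element
    print(type(not_a_tuple))   # <class 'int'>
    print(type(one_tuple))     # <class 'tuple'>

No heartbreak involved at any point.)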

Please don’t use the tool for things like this. For one thing, you need to remember that one of the aims of an AI assistant is to keep you talking for as long as possible (a bit like a hostage negotiator), and to do this it will tell you things it thinks you might like to hear. For another, remember that, since you aren’t paying for the service, Google will soon move on to monetising your engagements, so questions about love might well mean that your next searches return lots of adverts for chocolates and underwear.

I must admit I quite enjoy talking to AI when I’m doing stuff with it, and it is not a huge step from there to thinking that the software understands me and cares about what I am doing. But it doesn’t, and it doesn’t. It just wants to keep me talking.