It probably comes under the law of unintended consequences, but apparently Google allows people to think they’re cleverer than they are.
This is because people believe they know the answer to something, when they don’t. What they know is that they can look up the answer, and they confuse that with being smart. There’s a big difference.
Which is why the speculation that ChatGPT may do away with the need for writers won’t match reality.
Just because an algorithm can spit back a whole lot of generic information doesn’t mean it has created a story that is useful, relevant or nuanced.
(A side note worth being aware of: the way ChatGPT’s constantly blinking cursor spells out its answer plays into the labour illusion. Though the information it provides could easily appear on our screens instantaneously, the pretence that something is working away on our behalf makes it feel more valuable.)
But that’s a digression from the main point: we shouldn’t over-value programs such as ChatGPT.
AI is a tool. Tools are most effective when they’re wielded by knowledgeable people.
You don’t hire a hammer. You hire a carpenter.
You don’t hire a paintbrush and palette. You hire an artist.
You don’t hire a utensil. You hire a chef.
It’s another way of saying that tools such as ChatGPT are simply that: tools.
From a writing point of view, they’re wonderful research assistants. But gathering information on its own isn’t enough.
ChatGPT and its ilk simply provide generic information (which will increasingly be spotted, and downgraded as such, by search engines).
It takes a writer to finesse and refine content for a client’s context. Equally, it takes a human being to ask the questions that form the basis of a cogent article.
AI writers don’t have that ability.
Carpenters, artists and chefs use tools.
So do writers…and writing isn’t dead.