A funny thing happened with ChatGPT
Have you heard of the new chatbot, ChatGPT (Generative Pre-trained Transformer)?
It was launched just a couple of months ago and has gone viral. I've been using it on and off, mainly instead of Googling. The general consensus is that despite its great capabilities, its style is flat and dry, and it 'lacks emotions'.
On this last, I beg to differ, as the following anecdote will show.
Every Sunday I post a different book dedication on Instagram. The author's dedication is the first thing a reader sees when opening a physical copy. It is usually an expression of gratitude from the writer to a person who is very important to them. Reading these dedications and understanding the context behind them can make for great posts.
This week, I chose the following dedication:
"To Her
Hand in hand we come
Christopher Robin and I
To lay this book in your lap.
Say you're surprised?
Say you like it?
Say it's just what you wanted?
Because it's yours -
Because we love you."
Beautiful, isn't it?
I had an educated guess that this heart-wrenching cry for love and attention, which opens "Winnie-the-Pooh", is addressed to Dorothy "Daphne" de Sélincourt, who was Christopher Robin's mother and Milne's wife.
Still, I asked ChatGPT who "Her" was, and the answer I received was:
The dedication in the first book of the Winnie-the-Pooh stories, which was published in 1926, reads: "To Christopher Robin, who has been the chief author of all these stories." Christopher Robin was the son of A. A. Milne, and the character of Christopher Robin in the stories is based on him. So the dedication is to Christopher Robin, not to someone named "Her."
A little bit snarky, are we?
Since I have the physical copy in my library (click here for photographic evidence), I know that the great, all-knowing deep learning AI is wrong (yes, what you're hearing is a definite note of gloating).
And not only was it wrong, but its tone was also aggrieved! From this day onwards, I don't want to hear people say that ChatGPT lacks emotions. The thingy definitely has feelings. It is also as fallible as the humans who are the real brains behind it.
A little further reading about the AI that everyone is equal parts scared of and excited about: https://www.nytimes.com/2022/12/05/technology/chatgpt-ai-twitter.html