
Maybe AI needs to write its own dictionary

An explosion of interest in artificial intelligence is irking a lot of people concerned about the encroachment by computers into human domains such as visual arts, music, and literature. More recently, even the terms we use to describe these systems don’t sit well with some who take issue with digging up old words for new uses or anthropomorphizing machines.

Both phenomena were already common well before electric circuits started writing poems, so much of the recent drama is based on fear rather than logic. Which is fitting: Machines don’t feel fear, and human logic often fails.

Even the term “artificial intelligence” is taken as an affront by sentient beings when it’s ascribed to non-living objects. It’s a valid point, one that’s been argued for centuries even as philosophers struggle to define “intelligence.” To that end, computer scientists such as Alan Turing, famed for helping crack German cryptography during World War II, speculated about how we might test whether a machine could mimic humans well enough to fool us, and devised the imitation game for that purpose.

“The original question, ‘Can machines think?’ I believe to be too meaningless to deserve discussion,” Turing wrote in 1950. “Nevertheless I believe that at the end of the century the use of words and general educated opinion will have altered so much that one will be able to speak of machines thinking without expecting to be contradicted.”

He was, unfortunately, wrong. The debate is far from settled, and humans constantly contradict each other: to many, the words “machines” and “thinking” are oxymoronic. As for the use of words being altered to match our progress on the topic, well, that’s where humans seem to struggle.

After “intelligence” and “thinking,” the expression “hallucination” is next in the firing line. When an AI tool, such as ChatGPT, confidently makes a statement that isn’t true, it is said to be hallucinating. But critics are quick to say that the machine isn’t doing this at all; it’s simply making things up.

They’re right. But there are a few problems with this rush to judgment. First of all, as executives at Alphabet Inc. and Microsoft Corp. have been at pains to point out, chatbots are not search engines; they’re merely trained to mimic human prose, not to serve up correct information. Getting a fact wrong is neither success nor failure. The other issue is that a hallucination is, by definition, “an unfounded or mistaken impression or notion.”

More broadly, though, the outrage over assigning old words to new use cases is unwarranted.

Human language is living and constantly evolving. For centuries, we have engaged in anthropomorphism, attributing human characteristics to non-human beings. We name our pets, even when there’s only one animal in the house and it can’t speak. We assume dolphins are happy because their long mouths look like a smile. And we impose human-like judgments such as cunning (cats), loyalty (dogs), and bravery (lions).

Non-living things get the same treatment. Computer software is said to have bugs, a term popularized when American computer scientist Grace Hopper found a moth trapped in a piece of equipment, and engine power is still measured in horses. There’s no rodent on your desk, even though you rest your hands on it daily, and you may be caught red-handed without having butchered a poached animal.

[Image: This is what happens when a computer is asked to write its own dictionary.]

And we still do this today, with some AI systems described as being built with neural networks. They do not, of course, have neurons or neural pathways. In physical form, an AI neural network is a collection of numerical values, stored in binary by the transistors within a chip, that define the relationships between pieces of data. We accept the term “iPhone keyboard” even though, as Steve Jobs proudly announced, no such physical thing exists.

If people are to fend off the rise of the machines, they’ll need to do what they do best: evolve. That means accepting that even our language shifts over time according to new circumstances. In fact, this linguistic adaptability could be precisely what makes humans more resilient. Computers are said to be like mischievous genies: They’ll do just what you ask for, even if that’s not what you meant. Ask even the smartest chatbot, such as OpenAI’s ChatGPT, to explain a word, and it’ll provide a textbook definition: the one we gave it.

And if humans don’t like the idea of terms people use for one another being applied to computers that are starting to mimic us, then there’s an alternative: Get the machines to define themselves. I tried this with ChatGPT, and the humanity-purists may be disappointed that the results were rather anthropomorphic. Among the eight examples it offered was one, defined in the image above, for “the gradual change in the way an AI model perceives and interacts with the world over time, often due to exposure to new data or changing conditions.”

It’s not surprising that many of the terms are derivatives of ones we invented ourselves. Humans have thousands of years of life and evolution to build upon; machines have only the history we give them, plus the ability to hallucinate.

Still, what the bot presented offers a useful middle ground. We can look at each term it coined and have a guess at what it may mean. Yet since it was a computer, not a human, concocting the definition, we have only the bot to blame. And if machines aren’t sentient, then really, they’re blameless anyway.

If humans refuse to adapt their language in a world where computers are more prevalent, then a far worse scenario will emerge — one where machines define things on their own terms.


This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
