Anything but human: Tech world grapples with AI

CHARLOTTE, N.C. (QUEEN CITY NEWS) – In the last few years, you’ve likely seen it for yourself: software and advanced technology that can seem almost human.

That’s because it is anything but human: artificial intelligence, or intuitive programs, ranging from ChatGPT to Midjourney to advanced robotics, that open up our world but also raise a lot of questions.

Many of those programs and technologies are aimed at the consumer level.


One example of this consumer-level technology is a feature recently added to the newest iOS operating system for iPhones (along with newer-model iPads and Macs). It lets a user create a voice profile so that a phone, tablet, or computer can use a “clone” of the user’s voice to say whatever they want it to.

An accessibility feature, the “voice cloning” tool, known as personal voice, has the user read a series of phrases aloud, which are then processed on the device into a profile the device uses to say whatever it is prompted. While it does not allow you to make the clone the voice of your phone or the “Siri” digital assistant, it does let you type whatever you wish into a prompt, which the device will then say aloud.

The results are extremely close to the real thing.

“I am talking to you, and I heard your voice (clone). I’m not sure I could recognize the difference,” said Dr. Lee Tiedrich, a distinguished faculty fellow in law and responsible technology at the Duke Initiative for Science and Society, in an interview with Queen City News anchor Derek Dellinger, who recently used the personal voice feature for this story.

Tiedrich said the curriculum in her field has been in a state of flux for years as new technology and software become available.

“There is so much focus on artificial intelligence,” she said. “The advancements are coming at a record pace.”


Queen City News spoke with Tiedrich about these consumer-level advances, using Apple’s personal voice feature as an example. The discussion focused on its beneficial uses and its potentially harmful misuses.

As mentioned previously, personal voice is an accessibility feature. It could help those who are able to speak now but may not be in the future, possibly due to a medical issue.

“To have a tool that allows them to be able to communicate more, whether it be a temporary or permanent issue, may be a bridge for them to use their voice more,” said Tiedrich.

Other beneficial uses of this emerging technology include personalized shopping, AI assistants, and fraud prevention.

However, experts also note that in the wrong hands it can enable fraud.

Mason Wilder, with the Association of Certified Fraud Examiners, said that the wild frontier of fraud with this technology is already here.

“There are already cases of organizations being targeted with deepfake audio clips that often are voicemails, and it could be the fraudsters pretending to be somebody’s executive or boss.”

The potential for scams, fraud, and what are known as “deepfakes” is something authorities of all types are still addressing.

Examples include an audio or video clip of someone saying something they never actually said, or a piece of art claimed to be the work of a legendary artist that is instead AI-generated.

“All of these scams have been around in one form or another for forever,” said Kathy Stokes, director of fraud prevention programs at AARP.  “But as technology allows it, they get better and more convincing and less obvious that they are scams.”

Stokes said the best advice for now is to trust your instincts.

“We need to engage our inner skeptic on everything,” she said.

The potential for the technology to fall into the wrong hands is real.

In the case of Apple’s personal voice feature, the voice recordings and profile are created on the device, though the profile can be exported and shared with someone else. That, or a stolen phone, could open a proverbial “Pandora’s box”: a person’s voice and likeness circulating in the world without their knowledge or permission.

There are also ethical concerns with emerging technology. Examples include using the voice and likeness of someone who has died, and how that could affect a loved one’s grief; a stalker gaining access to someone’s voice and likeness; or the possibility of addiction.

“People create these digital companions, and do we as a society want people retreating into this digital world?” asked Tiedrich.

Tiedrich said the quality of the technology will only get better, and added that a number of technology companies have signed on to an effort to put safeguards around emerging technology, especially artificial intelligence.

The idea, Tiedrich said, is not centered on keeping AI from destroying humanity, but on integrating it into everyday life.
