
Meet GPT-4o: A Fresh Chapter in AI

Just last Monday, OpenAI rolled out a new model called GPT-4o. It's a version of their AI that can handle voice conversations, understand images, and respond in real time better than ever before, and it's built to be quick and easy to use. At the launch event, they showed how you can interrupt it mid-sentence, how fast it responds, and how it can even pick up on how a user is feeling, which is a big deal in the world of AI.


A Voice That Sounds Familiar

During the event, OpenAI researchers Mark and Barret ran live demos showing what GPT-4o can do. But what really got people talking was how one of the AI voices sounded a lot like actress Scarlett Johansson. Remember her voice as the AI assistant in the movie "Her"? It was a lot like that. The resemblance sparked plenty of discussion and a bit of drama about using celebrity-like voices in AI.

The Scarlett Johansson Mix-Up

OpenAI had actually asked Scarlett Johansson if they could use her voice for GPT-4o, but she said no because she wants to keep her artistic work personal. Despite that, when GPT-4o spoke, many listeners thought it sounded just like her, which led to confusion and some hurt feelings. OpenAI said the voice was never meant to imitate her, but admitted they should have communicated more clearly.

The Big Questions About AI and Ethics

This whole situation with the Scarlett Johansson sound-alike voice raises some big ethical questions about AI. As AI gets more capable, we need to think about who owns what AI creates, when it's acceptable to use someone's voice or likeness, and how to make sure everyone understands what's going on.

Is AI Stealing Creativity?

AI is getting really good at producing creative work, which raises the question: who actually "owns" what AI makes? It gets tricky when AI starts doing things that feel deeply human, like writing a song or drawing a picture. Artists and the people building AI need to talk this through and figure it out together.

Building AI the Right Way

To keep things fair and above board, building AI needs clear rules, genuine consent about how people's work and likenesses are used, and a wide range of voices involved in the process. Everyone should be treated fairly, and we should all be honest about how AI is changing things.



FAQ: OpenAI’s GPT-4o and the Scarlett Johansson Controversy

1. Why was there a controversy around GPT-4o’s voice sounding like Scarlett Johansson?

The fuss started when people noticed that one of the voices used in the GPT-4o demo sounded a lot like Scarlett Johansson, and in particular like her role in the movie "Her." That caught everyone off guard, because Johansson had actually turned down OpenAI's request to use her voice for the AI. The situation led to a mix of surprise and concern over how AI uses and mimics human voices, even when the person involved has said no. OpenAI later clarified that it wasn't their intention to copy her voice, but acknowledged they could have handled the communication better.

2. What does this incident tell us about AI and creativity?

This whole scenario opens up a bigger conversation about the lines between AI-generated content and human creativity. When AI starts mimicking human traits—like a voice—it raises questions about originality, ownership, and even the ethics of AI in creative fields. How do we credit creation when a machine is involved? How do we protect the personal and artistic rights of individuals? These are complex issues that need a lot of thought and careful handling as AI technology continues to advance.

3. How can we ensure ethical AI development in the future?

Ensuring ethical AI development means setting up clear rules that respect both innovation and individual rights. This involves creating guidelines that cover how AI can use personal data, like someone’s voice or image, and ensuring these guidelines are followed. It also means that the people developing AI need to work closely with legal experts, policymakers, and the creative community to make sure everyone’s on the same page. Transparency, accountability, and open dialogue are key to navigating the exciting but challenging landscape of AI technology.

Source: The Guardian