
Deepfakes Didn’t Impact 2024 Election Because They’re Not So Good, Study Finds


It seems that even as the internet becomes increasingly flooded with fake imagery, humans can still sniff out BS when needed. A recent study suggests that AI-generated disinformation had little effect on this year's elections around the world because it simply was not very good.

There has been concern for years that fabricated media could mislead audiences in harmful ways. The rise of generative AI has rekindled those fears, as the technology makes it easy for anyone to produce fake visuals and audio that look and sound real. Back in August, a political consultant used an AI-generated clone of President Biden's voice in robocalls telling New Hampshire voters to stay home during the state's Democratic primary.

Tools like ElevenLabs make it possible to upload a short clip of a person speaking and generate a clone of their voice that says whatever the user wants. While many commercial AI tools include guardrails against this kind of misuse, open-source alternatives without them are available.

Despite these advances, a new piece in the Financial Times looking back over the year found that, across the world, surprisingly little AI-driven political disinformation actually materialized.

It cited a report from the Alan Turing Institute which found that only 27 pieces of viral AI-generated content spread during the summer's elections in Europe. The report concluded there was no evidence the elections were swayed by AI disinformation because "the majority of exposure was concentrated among a minority of users with political beliefs already aligned to the ideological narratives embedded within such content." In other words, the few who saw the content (before it was presumably flagged) were already inclined to believe its message, and it reinforced their existing beliefs even when they knew the material was AI-generated. The report cited as an example AI-generated footage showing Kamala Harris speaking at a rally in front of Soviet flags.

In the US, the News Literacy Project found more than 1,000 examples of misinformation about the presidential election, but only 6% of them were made using AI. On X, mentions of "deepfake" or "AI-generated" in Community Notes spiked around the release of new image-generation models, not around elections.

Interestingly, social media users were more likely to misidentify real images as AI-generated than the other way around, but in general, users showed healthy skepticism. And fake media can still be debunked through official communication channels or through means like reverse image searches on Google.

It is hard to quantify with certainty how many people have been influenced by deepfakes, but the finding that they have been ineffective makes sense. AI imagery is everywhere these days, but images generated with artificial intelligence still tend to have telltale flaws. A hand may be strangely elongated, or a face may fail to reflect properly in a mirror; there are many small signs that give away an image as fake. Photoshop can be used to create far more convincing forgeries, but doing so requires skill.

AI boosters should not necessarily take comfort in this story. It means that generated imagery still has a long way to go. Anyone who has tried OpenAI's Sora knows that the video it produces is just not very good; it looks like something rendered by a video game engine (speculation holds that it was trained on video game footage), and it does not understand things like physics very well.

All that being said, there are still concerns to be had. The Alan Turing Institute report noted that beliefs can be reinforced by a deepfake even when the audience knows the media is not real; that confusion over which media is authentic undermines trust in online sources in general; and that AI imagery has already been used to target female politicians with pornographic deepfakes, which can harm them psychologically and damage their reputations by promoting sexist beliefs.

This technology will continue to improve, so it’s something to keep an eye on.


