Microsoft AI Bing Faces FAKE News Concerns: Here’s What Happened
Microsoft has opened the AI gates to more people with Copilot, but the company has a fake news concern that could become a serious problem in 2024.

Microsoft has a big leg up on the other tech giants in the AI arena, but it seems the company has some concerns that need to be addressed quickly. Microsoft's AI chatbot, Copilot, has been found generating inaccurate information about the 2024 US elections.

Researchers have observed that the chatbot answers questions about upcoming election events with details drawn from past events, and much of what it offers in response to these queries is simply wrong. AI chatbots have already raised concerns about misinforming people, and Microsoft has a lot on its plate to fix the reported mess, especially if it wants to avoid major government intervention in the months building up to the big election.

The company has gone all in on AI, and at one point it came close to hiring OpenAI CEO Sam Altman to head its own AI research lab.

Its reported $10 billion investment in OpenAI has given Microsoft early access to the latest versions and features of ChatGPT, which is now headed toward its fifth-generation model.

AI chatbots can be fun for basic queries and requests, but this is a serious matter that needs to be investigated before it creates major issues for the company and everyone else involved.

Copilot is also getting a wider release now, which means more people will be signing up for the chatbot and relying on its responses; if those responses are inaccurate or fabricated, Microsoft has a serious problem that can no longer be sidelined. Instances like this also underline the need for tighter regulation so that AI cannot pose a danger to governments and other institutions.
