{"id":26666,"date":"2023-12-15T15:33:34","date_gmt":"2023-12-15T15:33:34","guid":{"rendered":"https:\/\/nftandcrypto-news.com\/crypto\/microsoft-bing-ai-chatbot-gives-misleading-election-info-data\/"},"modified":"2023-12-15T15:33:36","modified_gmt":"2023-12-15T15:33:36","slug":"microsoft-bing-ai-chatbot-gives-misleading-election-info-data","status":"publish","type":"post","link":"https:\/\/nftandcrypto-news.com\/crypto\/microsoft-bing-ai-chatbot-gives-misleading-election-info-data\/","title":{"rendered":"Microsoft Bing AI chatbot gives misleading election info, data"},"content":{"rendered":"
A study from two Europe-based nonprofits has found that Microsoft\u2019s artificial intelligence (AI) Bing chatbot, now rebranded as Copilot, produces misleading results on election information and misquotes its sources.<\/p>\n
The study, released by AI Forensics and AlgorithmWatch on Dec. 15, found that Bing\u2019s AI chatbot gave wrong answers 30% of the time to basic questions about political elections in Germany and Switzerland. The inaccurate answers concerned candidate information, polls, scandals and voting.<\/p>\n
It also produced inaccurate responses to questions about the 2024 presidential elections in the United States.<\/p>\n
Bing\u2019s AI chatbot was chosen for the study because it was one of the first AI chatbots to cite sources in its answers, but the study said the inaccuracies are not limited to Bing. The researchers reportedly ran preliminary tests on ChatGPT-4 and found similar discrepancies.<\/p>\n
The nonprofits clarified that the false information has not influenced any election outcome, though it could contribute to public confusion and misinformation.<\/p>\n
\u201cAs generative AI becomes more widespread, this could affect one of the cornerstones of democracy: the access to reliable and transparent public information.\u201d<\/p><\/blockquote>\n
Additionally, the study found that the safeguards built into the AI chatbot were \u201cunevenly\u201d applied, causing it to give evasive answers 40% of the time.<\/p>\n
Related:\u00a0<\/em><\/strong>Even the Pope has something to say about artificial intelligence<\/em><\/strong><\/p>\n
According to a Wall Street Journal report on the topic, Microsoft responded to the findings and said it plans to correct the issues before the 2024 U.S. presidential elections. A Microsoft spokesperson encouraged users to always verify the accuracy of information obtained from AI chatbots.<\/p>\n
In October, U.S. senators proposed a bill that would penalize the creators of unauthorized AI replicas of real people \u2014 living or dead.<\/p>\n
In November, Meta, the parent company of Facebook and Instagram, banned political advertisers from using its generative AI ad-creation tools as a precaution ahead of the upcoming elections.<\/p>\n
Magazine: <\/em><\/strong>\u2018AI has killed the industry\u2019: EasyTranslate boss on adapting to change<\/em><\/strong><\/p>\n
<\/div>\n","protected":false},"excerpt":{"rendered":"
A study from two Europe-based nonprofits has found that Microsoft\u2019s artificial intelligence (AI) Bing chatbot, now rebranded as Copilot, produces misleading results on election information and misquotes its sources. The study was released by AI Forensics and AlgorithmWatch on Dec. 15 and found that Bing\u2019s AI chatbot gave wrong answers 30% of the time to […]<\/p>\n","protected":false},"author":1,"featured_media":26667,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"nf_dc_page":"","om_disable_all_campaigns":false,"footnotes":""},"categories":[42],"tags":[],"class_list":["post-26666","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-crypto"],"yoast_head":"\n
Microsoft Bing AI chatbot gives misleading election info, data | NFT & Crypto News<\/title>\n\n\n\n\n\n\n\n\n\n\n\n\t\n\t\n\t\n\n\n\n\t\n\t\n\t\n