Microsoft Bing AI chatbot gives misleading election info, misquotes sources

A recent study from two European nonprofits revealed that Microsoft’s Bing AI chatbot, now rebranded as Copilot, gives misleading or inaccurate election information.

A study from two Europe-based nonprofits has found that Microsoft’s artificial intelligence (AI) Bing chatbot, now rebranded as Copilot, produces misleading results on election information and misquotes its sources. 

The study, released by AI Forensics and AlgorithmWatch on Dec. 15, found that Bing’s AI chatbot gave wrong answers 30% of the time to basic questions about political elections in Germany and Switzerland. The inaccurate answers concerned candidate information, polls, scandals, and voting.

It also produced inaccurate responses to questions about the 2024 presidential elections in the United States.
