Don’t listen to Microsoft Copilot, David Attenborough and William Shatner are very much alive
Copilot wrongly claims that several celebrities are dead.
What you need to know
Microsoft Copilot is once again sharing incorrect information. When asked about notable figures who died in 2024, Copilot lists living celebrities, including Sir David Attenborough. Barring any heartbreaking news that has not been reported, Attenborough is very much alive.
In fact, a school in Leicestershire received a letter from him earlier this week. Attenborough was also named a top British cultural figure in a recent poll. While that designation can be given to someone who has passed, such as when the late Queen Elizabeth II was named a cultural icon, Attenborough received the honor while alive.
The phenomenon was noticed by several people who took to X (formerly Twitter) and other platforms. When asked if he was okay, William Shatner jokingly responded that he was not fine after reading about his death. The Verge shared other examples, including one listing Attenborough as deceased. I’ve seen similar results in my testing. In addition to listing living people as dead, Copilot incorrectly stated that several deaths from previous years occurred in 2024.
Not after reading this. 😱 https://t.co/eHWpeOZtM8 (October 2, 2024)
This is only the latest example of AI getting facts wrong. Copilot has shared false information regarding the US election in the past. Some believe that ChatGPT, which is part of what powers Copilot, has gotten less intelligent since launch. In the early days of the chatbot, Copilot, then known as Bing Chat, shared bizarre and creepy responses.
I have first-hand experience with AI chatbots spreading false information. Last year, I wrote an article about how Google Bard incorrectly stated that it had been shut down. Bing Chat then scanned my article and wrongly interpreted it to mean that Google Bard had been shut down. That saga provided a scary glimpse of chatbots feeding chatbots.
AI often struggles with logic and reasoning. That fact isn’t surprising when you consider how AI works. Tools like Copilot are not actually intelligent. They’re not using reasoning skills in the way a human would. They’re often tripped up by the phrasing used in prompts and miss key pieces of information in questions. Mix in AI’s struggles to understand satire and you have a dangerous recipe for misinformation.
Fixing AI
Microsoft unveiled a major update to Copilot yesterday. The update is meant to make the AI assistant more personal and interactive. As explained by our Senior Editor Zac Bowden, “Microsoft really wants you to view the new Copilot as more than just an AI tool. It wants you to treat it like a friend, whether that be by asking it for advice on how to ask out a crush, venting about work, or chatting about nothing because that’s what people do.”
The new Copilot has features such as “Copilot Voice” that aim to make interaction with the chatbot feel conversational. The tool can also suggest topics to discuss and share summaries of daily news.
A new interface and some voice features may help make Copilot feel more personal, but I’d prefer that my friends, human or digital, not share false information and claim living cultural icons are dead. Perhaps more training time will make our digital friend more factual.
Sean Endicott is a tech journalist at Windows Central, specializing in Windows, Microsoft software, AI, and PCs. He’s covered major launches, from Windows 10 and 11 to the rise of AI tools like ChatGPT. Sean’s journey began with the Lumia 740, leading to strong ties with app developers. Outside writing, he coaches American football, utilizing Microsoft services to manage his team. He studied broadcast journalism at Nottingham Trent University and is active on X @SeanEndicott_ and Threads @sean_endicott_.