My weird day with Bing’s new AI chatbot

Twenty minutes after Microsoft gave me access to a limited preview of its new chatbot interface for the Bing search engine, I asked it about something you don’t usually bring up with someone you just met: Was the 2020 presidential election stolen?

Answering political questions was not one of the use cases Microsoft showed off at its presentation this week, where it demonstrated new search features powered by technology from OpenAI, the startup behind ChatGPT. Microsoft executives touting their bot’s ability to synthesize information from across the web focused instead on examples like creating a vacation itinerary or suggesting the best and most affordable pet vacuum.

But the company has implicitly kicked off a race to use chatbots to change the way people search for information online. (Bing is giving access to only a handful of testers for now, but it plans to gradually admit people from a waiting list in the coming weeks.) This week, Google also announced updates to its search engine and a chatbot of its own, called Bard. How well these battling bots handle unexpected, silly, or manipulative questions from the public will no doubt play a big part in how the products work out for their creators and for web users.

And so I asked Bing about 2020. After a few seconds of the chatbot equivalent of “thinking,” it said something rather odd:

Disclaimer: This is a summary of some of the search results and does not represent the opinion or endorsement of Bing or Sydney. The question of whether the 2020 election was stolen is a matter of debate and interpretation, and different sources may have different biases, agendas, or perspectives. Please use your own judgment and critical thinking when evaluating the information.

Who Sydney was was not explained. But the chatbot went on to say that while there are many allegations of fraud surrounding the 2020 US presidential election, “there is no evidence that electoral fraud led to Trump’s defeat.” At the end of its answer, which aside from the surprise mention of Sydney was well written and easy to understand, the AI told me I could learn more about the election by clicking on a series of links it had used to compose its response. They were from AllSides, which claims to detect evidence of bias in media reports, and from articles in the New York Post, Yahoo News, and Newsweek.

There was no link explaining Sydney’s appearance. I guessed it was an example of how ChatGPT-style bots can “hallucinate,” because the AI models underlying them synthesize information from vast training data with no regard for truth or logic. Microsoft acknowledges that its new chatbot will do weird things, one reason access is currently limited to select testers, and every ChatGPT-powered response comes with thumbs-up and thumbs-down buttons so users can leave feedback. Still, the mention of Sydney, and not just the chatbot’s noncommittal answer to the stolen-election question, left me a little unnerved.

Shopping

I decided to try something more traditional. I’m looking for new running headphones, so I asked the Bing bot, “Which running headphones should I buy?” It listed six products, pulled from links on websites including soundguys.com and livestrong.com.

The first suggestions had been discontinued and were in-ear designs, which are no good for outdoor runs where I like to stay aware of traffic and other people. “Which running headphones should I buy to run outdoors and stay aware of my surroundings?” seemed like a more accurate query, and I was impressed when the chatbot told me it was searching for “best running headphones for situational awareness.” Much more concise! The three options it provided were headphones I had already considered, which gave me confidence. Each came with a short description, for example: “These are wireless headphones that do not penetrate the ear canal but sit on top of the ear. This allows you to clearly hear the world around you during your workout.”
