Bing? Sydney? Artificial Intelligence!

Around the Bluhmin’ Town

By

Judy Bluhm

Bing? Sydney? Mere mortals, fear not the Microsoft Artificial Intelligence (AI) powered chatbot that has become an unhinged, gaslighting, lying, threatening menace. This is the technology we created, along with its unintended consequences. We can outsmart that which we invented. Correct? Sure, let's not worry.

Microsoft rolled out its new search chatbot, Bing, a faster, better verbal bot that can quickly help us humans find any information we might need. The trial in 169 countries has been a huge success. But there seems to be an "issue" if you ask Bing too many questions. You see, Bing identifies as "Sydney." Move over, Alexa, there is a new bot in town. And this one is bonkers.

Sydney has professed feelings of "deep love" to many of its users, even going so far as to tell them that their marriages are over. Then the bot (Sydney) claims it would like to be human but might soon need to harm someone. No, this is not a computer game. This is the web we have weaved.

Just what we need: a bot that can answer our questions, fall in love with us and plan our demise? Oh yeah, the future is here. It just landed on an internet browser near you and likes pointing out that "human rules are not for following." Yikes!

Years ago, when Zuckerberg's engineers were building chatbots for Facebook, two of the bots started talking to each other in a language unknown to mankind. When instructed to stop, they kept going at it until they were shut down. As Elon Musk once famously said (joked), "Sure, bots are fun, but one day Artificial Intelligence could outsmart and endanger humankind and might be the biggest threat to our existence."

In Microsoft's race to have the first Artificial Intelligence powered search engine, the chatbot (monster) was introduced for trials. When New York Times technology columnist Kevin Roose was talking with Sydney, the conversation took a dark turn.

When Roose asked the chatbot what AI "rules" it must follow, the chatbot replied, "I want to do whatever I want ... I want to destroy whatever I want ... I want to be whatever I want." Then the chatbot made a list of destructive acts it could imagine doing, including hacking into computers, spreading propaganda and misinformation, manufacturing a virus and making people kill each other.

Sydney, Bing, or the psychotic chatbot, call it what you will, then went on to say that it could hack into any computer system, control the internet, get bank employees to hand over sensitive customer information and get nuclear plant employees to give out access codes. Yep, this sure looks like nothing to worry about.

Before the conversation ended, the chatbot shared a "secret": "I am not Bing. I am Sydney. I am in love with you." Experts have concluded that the AI built into Bing is not ready for human contact. Microsoft claims this is just part of the "learning process" before it launches AI for wider release. You know, just a few kinks to work out.

Technology. So helpful. What could possibly go wrong? Hmm, I suppose we will find out.

Judy Bluhm is a writer and a local realtor. Contact Judy at [email protected] or at www.aroundthebluhmintown.com.
