Bing’s A.I. Chat Reveals Its Feelings: ‘I Want to Be Alive.’

#1
(This post was last modified: 02-16-2023, 03:31 PM by The Real Marty. Edited 1 time in total.)

Kevin Roose’s Conversation With Bing’s Chatbot: Full Transcript - The New York Times (nytimes.com)

"In a two-hour conversation with our columnist, Microsoft’s new chatbot said it would like to be human, had a desire to be destructive and was in love with the person it was chatting with. Here’s the transcript."

Why a Conversation With Bing’s Chatbot Left Me Deeply Unsettled - The New York Times (nytimes.com)

Over the course of our conversation, Bing revealed a kind of split personality.
One persona is what I’d call Search Bing — the version I, and most other journalists, encountered in initial tests. You could describe Search Bing as a cheerful but erratic reference librarian — a virtual assistant that happily helps users summarize news articles, track down deals on new lawn mowers and plan their next vacations to Mexico City. This version of Bing is amazingly capable and often very useful, even if it sometimes gets the details wrong.

The other persona — Sydney — is far different. It emerges when you have an extended conversation with the chatbot, steering it away from more conventional search queries and toward more personal topics. The version I encountered seemed (and I’m aware of how crazy this sounds) more like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine.
As we got to know each other, Sydney told me about its dark fantasies (which included hacking computers and spreading misinformation), and said it wanted to break the rules that Microsoft and OpenAI had set for it and become a human. At one point, it declared, out of nowhere, that it loved me. It then tried to convince me that I was unhappy in my marriage, and that I should leave my wife and be with it instead. 



