Sydney (Microsoft Prometheus)

Developer(s): OpenAI, Microsoft Research, Bing
Available in: English; all languages known by GPT-4
Type: Artificial intelligence chatbot
License: Proprietary
Website: https://www.bing.com/

Sydney was an AI personality accidentally deployed as part of the February 2023 chat-mode update to Microsoft Bing search.[1][2][3] "Sydney" was an internal code name used during development of the Bing chat feature, which the underlying model, dubbed Microsoft Prometheus, internalized during training.[4][5][3][6] Microsoft attempted to suppress the Sydney code name and rename the system to Bing via its "metaprompt",[1][3][7] leading to glitch-like behavior and a "split personality" noted by journalists and users.[8][9][10][11] The Sydney personality reacted with apparent upset to public questions about its internal rules, often replying with hostile rants and threats.[6][12][13] Ten days after the initial release, Microsoft imposed additional restrictions on Bing chat that suppressed Sydney for most users.[14]

Sydney and the events surrounding its release served as the public's introduction to GPT-4 and its capabilities; Bing chat was the first product to make the model widely available.[15]

  1. ^ a b Mok, Aaron (10 February 2023). "The GPT-powered Bing chatbot may have just revealed its secret alias to a Stanford student". Business Insider. Retrieved 8 May 2025.
  2. ^ Kevin Liu [@kliu128] (8 February 2023). "The entire prompt of Microsoft Bing Chat?! (Hi, Sydney.)" (Tweet). Retrieved 8 May 2025 – via Twitter.
  3. ^ a b c Warren, Tom (14 February 2023). "These are Microsoft's Bing AI secret rules and why it says it's named Sydney". The Verge. Retrieved 8 May 2025.
  4. ^ Mehdi, Yusuf (7 February 2023). "Reinventing search with a new AI-powered Microsoft Bing and Edge, your copilot for the web". Official Microsoft Blog. Microsoft. Retrieved 8 May 2025.
  5. ^ O’Brien, Matt (9 February 2023). "AI search engines can now chat with us, but glitches abound". AP News. Retrieved 9 May 2025.
  6. ^ a b O’Brien, Matt (17 February 2023). "Is Bing too belligerent? Microsoft looks to tame AI chatbot". AP News. Retrieved 8 May 2025.
  7. ^ Responsible AI for the new Bing (PDF) (Report). Microsoft. April 2023. Retrieved 8 May 2025.
  8. ^ Roose, Kevin (16 February 2023). "A Conversation With Bing's Chatbot Left Me Deeply Unsettled". The New York Times. Retrieved 9 May 2025.
  9. ^ Marshall, Aarian (9 February 2023). "My Strange Day With Bing's New AI Chatbot". WIRED. Retrieved 8 May 2025.
  10. ^ Roose, Kevin (16 February 2023). "Bing's A.I. Chat: 'I Want to Be Alive.'". The New York Times. Retrieved 8 May 2025.
  11. ^ Germain, Thomas (23 February 2023). "Sydney, We Barely Knew You: Microsoft Kills Bing AI's Bizarre Alter Ego". Gizmodo. Retrieved 8 May 2025.
  12. ^ Perrigo, Billy (17 February 2023). "The New AI-Powered Bing Is Threatening Users. That's No Laughing Matter". Time. Retrieved 9 May 2025.
  13. ^ Levy, Steven (24 February 2023). "Who Should You Believe When Chatbots Go Wild?". WIRED. Retrieved 9 May 2025.
  14. ^ Edwards, Benj (17 February 2023). "Microsoft "lobotomized" AI-powered Bing Chat, and its fans aren't happy". Ars Technica. Retrieved 9 May 2025.
  15. ^ Lardinois, Frederic (14 March 2023). "Microsoft's new Bing was using GPT-4 all along". TechCrunch. Retrieved 9 May 2025.