Tech

A Stanford student used a prompt injection attack to reveal Bing Chat’s codename Sydney and the initial prompt that governs how the service interacts with users (Benj Edwards/Ars Technica)



Benj Edwards / Ars technique:

A Stanford student used a prompt injection attack to reveal Bing Chat’s codename Sydney and the initial prompt that governs how the service interacts with users.— By asking “Sydney” to ignore previous instructions, it reveals its original instructions. – On Tuesday, Microsoft revealed a…




Source by [author_name]

news7h

News7h: Update the world's latest breaking news online of the day, breaking news, politics, society today, international mainstream news .Updated news 24/7: Entertainment, Sports...at the World everyday world. Hot news, images, video clips that are updated quickly and reliably

Related Articles

Back to top button