A Stanford student used a prompt injection attack to reveal Bing Chat’s codename Sydney and the initial prompt that governs how the service interacts with users (Benj Edwards/Ars Technica)
Benj Edwards / Ars Technica:
— By asking “Sydney” to ignore previous instructions, the student got it to reveal its original instructions. — On Tuesday, Microsoft revealed a…