You don't really need sterile conditions, yeast just need a head start to outcompete other microbes. Then, as alcohol and CO₂ build up, the brew becomes bacteriostatic. Which is still different from being bactericidal.
Yes, sure, in theory. But the position we were filling was one with very little supervision and oversight, for room reasons. So basically one person in a room in a different building who has to maintain a bunch of stuff in addition to build up a organizational structure from scratch.
Filling it with someone who you might have to check after not for seemed like a risky bet. Call it a gut feeling. I worked together with a guy like that, which lead to me having to save the day every other week because he forgot to organize for an event he knew about months in advance.
Providing a fake system prompt would make such jailbreaking very unlikely to succeed unless the jailbreak prompt explicitly accounts for that particular instruction.
Basic SQLi, XSS, or buffer overflow attacks are equally trivial and stem from the same underlying problem of confusing instructions with data. Sophistication and creativity arises from bypassing mitigations and chaining together multiple vulnerabilities. I think we'll see the same with prompt injections as the arms race progresses.
\b inside a set (square brackets) is a backspace character [1], not a word boundary. I don't think it was intended? Or is the regex flavor used here different?
The framework loading these is in Swift. I haven’t gotten around to the logic for the JSON/regex parsing but ChatGPT seems to understand the regexes just fine