neuroticnews25's comments

I'm humbled and honored to announce I came up with it earlier: https://news.ycombinator.com/item?id=44364640


Can you help me understand how you are simultaneously humbled and proud? Surely they are mutually exclusive.


Great minds think alike! Or maybe Google should just be worried that a lot of people are literally cursing at their main product.


Earlier? 69 days ago... nice

Cool trick!


You don't really need sterile conditions; yeast just need a head start to outcompete other microbes. Then, as alcohol and CO₂ build up, the brew becomes bacteriostatic, which is still different from being bactericidal.


>Maybe you're looking for the mediocre job just because you think it will be a walk in the park

>The worst one was talking about how it would be a relaxing position for them

What's wrong with that? Can't you compensate for being lazy by being efficient?


Yes, sure, in theory. But the position we were filling came with very little supervision and oversight, for room reasons: basically one person in a room in a different building who has to maintain a bunch of stuff in addition to building up an organizational structure from scratch.

Filling it with someone you might have to check up on seemed like a risky bet. Call it a gut feeling. I worked together with a guy like that, which led to me having to save the day every other week because he forgot to organize for an event he had known about months in advance.


Incognito mode in Chrome does block third-party cookies.


That would make Grok the only model capable of protecting its real system prompt from leaking?


Well, for this version people have only been trying for a day or so.


Providing a fake system prompt would make such jailbreaking very unlikely to succeed unless the jailbreak prompt explicitly accounts for that particular instruction.


Basic SQLi, XSS, or buffer overflow attacks are equally trivial and stem from the same underlying problem: confusing instructions with data. Sophistication and creativity arise from bypassing mitigations and chaining multiple vulnerabilities together. I think we'll see the same with prompt injection as the arms race progresses.
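
A minimal sketch of that instructions/data confusion and its classic mitigation, using Python's sqlite3 (the table and input are made up for illustration):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice')")

    user_input = "' OR 1=1 --"  # attacker-controlled data that doubles as an instruction

    # Vulnerable: string interpolation lets data cross into the instruction channel.
    query = f"SELECT * FROM users WHERE name = '{user_input}'"
    print(conn.execute(query).fetchall())  # [('alice',)] -- every row leaks

    # Mitigated: a parameterized query keeps data and instructions separate.
    print(conn.execute("SELECT * FROM users WHERE name = ?", (user_input,)).fetchall())  # []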


Aren't these [0] lines wrong?

"[\\b\\d][Aa]bbo[\\bA-Z\\d]",

\b inside a set (square brackets) is a backspace character [1], not a word boundary. I don't think that was intended. Or is the regex flavor used here different?
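
A quick way to see the difference in Python's re, which also treats \b inside a character class as a backspace (assuming the quoted pattern behaves the same way under ICU):

    import re

    # Outside a class, \b is a word boundary; inside [...], it is backspace (U+0008).
    pattern = re.compile(r"[\b\d][Aa]bbo[\bA-Z\d]")

    print(bool(pattern.search("1Abbo5")))        # True: digits satisfy both classes
    print(bool(pattern.search(" Abbo ")))        # False: would match if \b meant word boundary here
    print(bool(pattern.search("\x08Abbo\x08")))  # True: literal backspace characters match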

[0] https://github.com/BlueFalconHD/apple_generative_model_safet...

[1] https://developer.apple.com/documentation/foundation/nsregul...


The framework loading these is in Swift. I haven’t gotten around to the logic for the JSON/regex parsing, but ChatGPT seems to understand the regexes just fine.


For non-Apple people:  is a Private Use Area code point (U+F8FF) that renders as the Apple logo on Apple devices.
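
A one-liner to see it yourself, in Python (what actually renders depends on your fonts):

    # U+F8FF is a Private Use Area code point; Apple's fonts map it to the Apple logo.
    print("\uf8ff")  # Apple logo on Apple devices, a placeholder glyph elsewhere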


It also works with the exclusion operator (-fuck); that way you avoid polluting the search results.


curl.exe parrot.live to bypass PowerShell's Invoke-WebRequest alias

