Hacker News

Things like financial and medical data should be required to have an audit log that you can see, in real-time and subscribe to updates for, including extraction into "anonymised" formats, along with a description of that process, format and a justification for why it is robust against deanonymisation. If data is handled well, there is nothing to fear here. Fiddly, perhaps. Expensive, probably. But personal data processing should be risky and expensive.

Deliberately extracting personal data into un-audited environments without good reason (eg printing a label for shipping), should be punished with GDPR-style global turnover-based penalties and jail for those responsible.
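The subscribable, per-subject audit log described above could be sketched roughly like this. This is a hypothetical toy (the `AuditLog`/`AuditEvent` names and design are my own illustration, not any real GDPR tooling): every touch of a person's data is appended with a stated purpose, subscribers are notified in real time, and the full history for one subject is always retrievable.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable, List

@dataclass
class AuditEvent:
    subject_id: str   # whose data was touched
    action: str       # e.g. "read", "export", "anonymise"
    purpose: str      # stated justification, e.g. "print shipping label"
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

class AuditLog:
    """Append-only log a data subject could inspect and subscribe to."""

    def __init__(self) -> None:
        self._events: List[AuditEvent] = []
        self._subscribers: List[Callable[[AuditEvent], None]] = []

    def record(self, event: AuditEvent) -> None:
        # Append, then push the event to subscribers in real time.
        self._events.append(event)
        for notify in self._subscribers:
            notify(event)

    def subscribe(self, callback: Callable[[AuditEvent], None]) -> None:
        self._subscribers.append(callback)

    def history_for(self, subject_id: str) -> List[AuditEvent]:
        # Everything ever done with one person's data.
        return [e for e in self._events if e.subject_id == subject_id]

log = AuditLog()
log.subscribe(lambda e: print(f"[{e.timestamp}] {e.action}: {e.purpose}"))
log.record(AuditEvent("subject-42", "export", "print shipping label"))
```

A real system would need tamper-evidence (e.g. hash chaining) and access control, but the shape of the interface is the point: history and live updates are first-class, not a special request.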



> Deliberately extracting personal data into un-audited environments without good reason (eg printing a label for shipping), should be punished with GDPR-style global turnover-based penalties and jail for those responsible.

There already are such penalties, but only for Europeans, through the GDPR.


Technically not quite: even in the EU, you don't have to provide an audit log for someone's data specifically, and as a data subject you have to make specific requests to delete or retrieve your data; it's not made transparent to you as a default position. But yes, you can't just dump it out anywhere you want.

How it should be is that personal data's current and historical disposition is always available to the person in question.

If that's a problem for the company processing the data (other than being fiddly to implement at first), that sounds like the company is up to some shady shit that they want to keep quiet about.

Nothing to hide, nothing to fear should apply here, and companies should be fucking terrified, with an existential dread of screwing up their data handling, and looking for ways to avoid handling PII at all costs. The analogy of PII being like radioactive material is a good one: you need excellent processes and excellent reasons to be doing it in the first place, you must show you can do it safely and securely, and if you fuck up, you'd better hope your process documentation is top tier or you'll be in the dock. Or, better, you can decide that you can actually make do by handling the nuclear material only in a safer form, like encapsulated vitrified blocks, at least for most of your processes.
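The "safer form" idea maps onto pseudonymisation/tokenisation: most processes handle opaque tokens, and only one tightly controlled, audited component can map a token back to raw PII. A minimal sketch, assuming a hypothetical `TokenVault` of my own invention (not a standard or a real library):

```python
import secrets

class TokenVault:
    """Illustrative token vault: downstream code sees only tokens."""

    def __init__(self) -> None:
        self._token_to_value: dict[str, str] = {}
        self._value_to_token: dict[str, str] = {}

    def tokenise(self, pii: str) -> str:
        # Return a stable random token for a given PII value.
        if pii in self._value_to_token:
            return self._value_to_token[pii]
        token = "tok_" + secrets.token_hex(8)
        self._token_to_value[token] = pii
        self._value_to_token[pii] = token
        return token

    def detokenise(self, token: str) -> str:
        # In a real system this would be a privileged, audited operation.
        return self._token_to_value[token]

vault = TokenVault()
order = {"item": "book", "customer": vault.tokenise("jane@example.com")}
# Analytics, logging, debugging etc. only ever see the "tok_..." value;
# the raw address is recovered only at the audited shipping step.
label_address = vault.detokenise(order["customer"])
```

The design choice is that a leak of the order database exposes nothing personal; the blast radius is confined to the vault, which is exactly the component you audit hardest.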

The data processing industry has repeatedly demonstrated that they cannot be trusted, and so they should reap the whirlwind.


It doesn't require audited environments as such, but as a baseline you are required to use secure environments that you control. What "secure" means can always be discussed, but in general it depends on what data you process and what you do with it; if it's a large volume, a big population, or Article 9 data, auditable environments should be expected (though not publicly auditable, although that would be nice...).

Fully agree with what you are saying, and my popcorn is ready for August, when the penalties part of the AI Act comes into force. There is a two-year grace period for certain systems already on the market, but any new model introduced after August this year has to be compliant. AI Act + GDPR will be a great show to watch...



