> Col. Serhiy Demydiuk, the head of Ukraine’s national Cyberpolice unit, has not accused anyone at MeDoc of being involved with the attack. He has said that the company was warned multiple times about potential security vulnerabilities in its systems. “They knew about it,” Demydiuk told the Associated Press. “They were told many times by various anti-virus firms... For this neglect, the people in this case will face criminal responsibility."
Criminal charges for not fixing known vulnerabilities? That's a risky road to travel down, especially given the general state of infosec among various governments around the world and the offensive-first mindset of their security agencies.
This sounds like the result of being pressured to appear to be doing something about a very public problem rather than using the justice system to set a meaningful precedent. Especially if enforcement is going to be arbitrarily based on only exceptionally bad cases.
People love to use the courts as a knee jerk reaction to every problem without considering more effective or efficient alternatives first.
>Criminal charges for not fixing known vulnerabilities? That's a risky road to travel down.
It is a risky road, but I believe that something (not necessarily this, and certainly not "criminal" charges) needs to be done to give these firms some responsibility.
Instead of the road comparison, let's try another one: you have an electrician (certified/authorized/whatever) come to your house and replace a wall socket.
By mistake (not intentionally) he mis-wires something, or doesn't notice that the wires' insulation is bad, and as soon as you insert a plug, the wiring catches fire and the fire spreads to your home.
Would you ask him to reimburse the damages?
I would guess yes, and most probably he has suitable insurance to cover these (hopefully rare) cases.
Wouldn't this infection by malware be a similar form of malpractice by someone that should know what they are doing?
This is not the case of someone downloading a new ringtone from some unknown site; it is a professional software firm that provides tax software to nearly an entire country and sends you an update.
"appear to be doing something about a very public problem"
Well, that's what authoritarian governments do. They also use raids to project their power and get loyalty from the police in return for ignoring all the corruption and extortion.
Sounds more like corruption than authoritarianism - questionable criminal charges that go away for the right price. (Authoritarianism would be questionable prosecutions, rigidly enforced).
It's more like corruption-fueled authoritarianism: the police are allowed to extort as long as they only extort companies the government wants to harm, and the judges are corrupt and afraid of the government, but as long as they side with the government, nobody investigates their corruption. So there are both elements, rigidly enforced questionable prosecutions and questionable criminal charges that go away for the right price. And I'm pretty sure they always go hand in hand.
So you'd like to have agencies set up to fine every company that gets hacked through 'known vulnerabilities'? Enforcing this arbitrarily after big hacks is hardly comparable to enforcing traffic violations. It'd have to be consistent, well defined, and widely enforced to be at all effective.
To me this is an emotional reaction that has no regard for cause/effect.
>So you'd like to have agencies set up to fine every company that gets hacked through 'known vulnerabilities'?
Not exactly, but I do feel that entities recklessly handling PII, or, possibly as in this case, their update servers, should face consequences.
>Enforcing this arbitrarily after big hacks is hardly an equivalent analogy to enforcing traffic violations. It'd have to be consistent, well defined, and widely enforced to be at all effective.
We definitely agree on this.
>To me this is an emotional reaction that has no regard for cause/effect.
The likely end result is more companies wasting time on useless theatrics like PCI compliance to shield themselves from legal liability, rather than meaningfully protecting user data and preventing their systems from being launch points for bigger attacks.
This is why I'm highly doubtful about the ROI of burdening companies, courts, and law enforcement with this 'solution'.
Even though it feels good to punish a faceless corporation for making a seemingly obvious mistake.
What's wrong with a PCI-like compliance that ensures companies that affect this many people have their servers patched on a regular basis?
Rubber stamps like PCI compliance might look like time wasters. Not all of them are. Given the huge increase in the amount of online credit card transactions, the number of cases where payment information is compromised is very low. That is partly due to PCI compliance IMO.
You wouldn't introduce these fines just like that, of course. You would have some reasonable formal procedure. For example, the bug must be documented somewhere (publicly or not), such as in a CVE, and you must have been given enough time to fix it.
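As a toy illustration of such a procedure, the liability test could reduce to a simple grace-period rule. This is only a sketch: the function name, inputs, and the 90-day window are all made-up assumptions, not anything from an actual statute or the article.

```python
from datetime import date, timedelta

# Hypothetical policy parameter -- not from any real law or regulation.
GRACE_PERIOD = timedelta(days=90)  # time allowed to patch after disclosure

def liable_for_breach(cve_published: date, breach_date: date, patched: bool) -> bool:
    """A vendor is liable only if the bug was documented (e.g. in a CVE),
    the grace period had elapsed, and the bug was still unpatched
    when the breach occurred."""
    deadline = cve_published + GRACE_PERIOD
    return (not patched) and breach_date > deadline

# Breach 120 days after disclosure, still unpatched: past the deadline.
print(liable_for_breach(date(2017, 1, 1), date(2017, 5, 1), patched=False))
```

A real procedure would of course need far more nuance (severity tiers, proof of notification, mitigations short of a full patch), but the point is that the trigger can be made objective rather than arbitrary.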
I think it is absolutely necessary, in these days when vulnerable IoT devices are made into botnets, that people be held responsible for negligence. The damage this can cause is potentially huge.
There should also be a way to punish people if they find a vulnerability internally, and willfully neglect to fix it. The bar for this should be reasonably high, but it is IMHO the same as if a car manufacturer finds a problem with their brakes and ignores it.
Also, there probably would have to be a way for a manufacturer to throw their hands up and say "sorry, we can't fix this" - declaring technical debt bankruptcy. In that case, I think it should not necessarily result in criminal charges, but it must have some consequences: maybe allowing third parties to take the code and deploy fixes, maybe banning sales, loss of IP rights, or a fine.
> Enforcing this arbitrarily after big hacks is hardly an equivalent analogy to enforcing traffic violations. It'd have to be consistent, well defined, and widely enforced to be at all effective.
"Consistent" and "widely enforced" don't apply to traffic violations either.
Criminal negligence usually has to cause death or something serious like that. Not sure why it shouldn't be extended to causing other types of harm. In this case, it seems like they were warned they were putting their customers at risk and did it anyway. That seems much more serious than just having a bug.
This amounts to enforcing whatever the software vendor of choice wants to install on your drive. Taking the choice out of an update by law - without a counter-law ensuring that at least two suppliers of choice exist - is really sinister.