Yes, but it's not exactly the same situation. "The EPA’s equipment isn’t able to detect lower levels of Butyl Acrylate that can still pose a threat to human health." In Chernobyl, the Soviets had the opposite problem: the maximum level their standard equipment could read was far below the actual radiation level.
Unfortunately this is not the case with many Geiger counters, especially older ones. Geiger-Muller (GM) tubes detect radiation by counting ionization events. When a high-energy particle passes through the tube, it leaves a little trail through which electricity can flow, but only for a moment. This temporary flow of electricity is amplified and fed into the detection circuit, and it is also what generates the typical clicking noise. The Geiger counter is basically an edge detector + summing circuit with decay.
When a GM tube is overloaded, the particles create too many overlapping trails and the tube cannot stop conducting electricity. This continuous flow of electricity never triggers the edge detector, so the counter thinks the level of radiation is zero. Some modern equipment has an alarm for this condition.
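Here's a toy model of that edge-detector behavior (purely illustrative, not real instrument firmware): a saturated tube produces at most one rising edge and then goes silent.

```ruby
# Toy model: the counter reacts to rising edges in tube current, not to the
# current itself, so continuous conduction registers as (almost) nothing.
def count_clicks(samples)
  clicks = 0
  previous = 0
  samples.each do |conducting|
    clicks += 1 if conducting == 1 && previous == 0  # rising edge only
    previous = conducting
  end
  clicks
end

normal    = [0, 1, 0, 1, 0, 0, 1, 0]  # discrete ionization pulses
saturated = [1, 1, 1, 1, 1, 1, 1, 1]  # tube never stops conducting

count_clicks(normal)     # => 3
count_clicks(saturated)  # => 1 (one edge when saturation begins, then silence)
```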
When you die, your heirs get the stock with a "step-up" in basis, meaning the cost basis used to calculate capital gains resets to the market price at the time of death. So if they sell right after you die, they pay no capital gains tax.
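A made-up illustration of the mechanics (all numbers hypothetical):

```ruby
# Hypothetical numbers, purely to show the step-up mechanics.
original_cost_basis = 10_000     # what the deceased paid for the stock
market_at_death     = 1_000_000  # the basis "steps up" to this on death
sale_price          = 1_000_000  # heirs sell immediately after death

taxable_gain = sale_price - market_at_death  # => 0
# The $990,000 of appreciation during the original owner's lifetime is
# never taxed as a capital gain.
```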
Heirs in the United States don't pay capital gains tax, but doesn't the estate of the super rich guy have to pay up to 40% in estate taxes on all his assets (with an exemption on the first ~$5M)?
Jeff Bezos is worth $200B. If he died, his estate would have to pay 40% estate tax on essentially all $200B in assets (the ~$5M exemption barely registers at that scale), which is about $80B in tax. That's still a hell of a lot of tax. I know there were some loopholes used by the Walton family and other super rich, but isn't the above how it's supposed to work?
I'm no super genius, but it does sound like simply changing the tax code to require paying back all loans before the cost basis can step up would be both easier and less potentially catastrophic than suddenly taxing unrealized gains.
A jury's findings are final only in determining the facts of the case. The actual legal implications of those facts can still be appealed, reviewed, and modified.
You could issue a double-spend transaction with a higher fee that sends the coins to another wallet you control, and the network will probably confirm that one first.
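A toy sketch of the idea; this is not real Bitcoin code, and miner policy varies, hence "probably":

```ruby
# Toy mempool: two transactions spend the same output; a fee-maximizing
# miner will generally mine the higher-fee one, orphaning the other.
Tx = Struct.new(:spends, :pays, :fee_btc)

original    = Tx.new("utxo:abc123", "merchant_wallet", 0.0001)
replacement = Tx.new("utxo:abc123", "my_other_wallet", 0.0010)

mempool = [original, replacement]
winners = mempool.group_by(&:spends).map { |_, txs| txs.max_by(&:fee_btc) }
# winners == [replacement]; the payment to the merchant never confirms
```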
I'm assuming the ones in denial will say something like "It's up to the individual to evaluate the quality of the content e̶v̶e̶n̶ ̶t̶h̶o̶u̶g̶h̶ ̶w̶e̶ ̶e̶n̶g̶i̶n̶e̶e̶r̶ ̶o̶u̶r̶ ̶p̶l̶a̶t̶f̶o̶r̶m̶ ̶t̶o̶ ̶g̶e̶n̶e̶r̶a̶t̶e̶ ̶a̶n̶ ̶i̶n̶s̶t̶a̶n̶t̶a̶n̶e̶o̶u̶s̶ ̶e̶m̶o̶t̶i̶o̶n̶a̶l̶ ̶r̶e̶s̶p̶o̶n̶s̶e̶ ̶t̶o̶ ̶m̶a̶x̶i̶m̶i̶z̶e̶ ̶e̶n̶g̶a̶g̶e̶m̶e̶n̶t̶."
I've been to Husum! My family and I stayed there when we visited. I'm an American, but my ancestors emigrated from that area. It's a beautiful town, and we were shown the utmost hospitality both by the people there and by our distant relatives who live nearby.
"The tar archive spec does not specify an upper limit on archive size, but my kernel keeps saying 'no space left on device'. It's obviously a bad kernel."
Well, in your example, there is a physical limit on how much space is available. In the case of deeply nested json, we are talking about structures that fit comfortably into memory and could be decoded in a fraction of a second if only a different algorithm were used.
I mean, it'd be a pathological 202-byte JSON, but yeah (I guess it could actually be a DoS attack of some kind, hmm...)
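For reference, here's what I believe such a 202-byte document looks like: 101 levels of array nesting, one past the default limit.

```ruby
require 'json'

doc = "[" * 101 + "]" * 101
doc.bytesize    # => 202
JSON.parse(doc) # raises JSON::NestingError with the default max_nesting of 100
```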
Ruby is a good example, because the `oj` gem, presumably on the same system, is listed as "infinite." Obviously not truly "infinite"; eventually it'll run out of RAM. But this shows it is _not_ really an issue of machine resources.
As the OP notes, it's because if you implement using recursion, you'll run out of stack in most languages (now let's start talking about tail call optimization...), but this isn't the only implementation choice: you _can_ implement it without this limitation (sketch below), and probably should.
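A minimal sketch of the non-recursive approach, as a toy that handles arrays only and assumes well-formed input, using an explicit stack on the heap instead of the call stack:

```ruby
# Depth is now bounded by heap memory, not by the call stack.
def parse_nested_arrays(str)
  root = nil
  stack = []
  str.each_char do |ch|
    case ch
    when "["
      arr = []
      stack.last << arr unless stack.empty?
      root ||= arr
      stack << arr
    when "]"
      stack.pop
    end
  end
  root
end

deep = "[" * 100_000 + "]" * 100_000
parse_nested_arrays(deep)  # fine; the recursive equivalent would blow the stack
```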
> the standard Ruby "json" gem can be made to overflow on just 202 bytes of valid JSON input.
It is not overflowing; it is aborting with a proper error.
You can argue whether 100 is a reasonable default, but I think it is not too stupid to have a maximum depth here and bail out as soon as possible. Because what happens if you accept arbitrary nesting? The next guy in the chain, who actually works with the parsed json tree, will also have to handle arbitrary nesting. And if they are not careful, you get an error deep in some quick-and-dirty user-written code (which might actually be exploitable, whereas rejecting the json in the first place is not).
I would say you can think of the 100 as a kind of input sanitization.
> max_nesting: The maximum depth of nesting allowed in the parsed data structures.
> Disable depth checking with :max_nesting => false. It defaults to 100.
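So in practice, assuming the standard "json" gem behaves as the docs above describe:

```ruby
require 'json'

doc = "[" * 101 + "]" * 101  # 101 levels deep

begin
  JSON.parse(doc)            # default max_nesting: 100
rescue JSON::NestingError => e
  puts e.message             # e.g. "nesting of 101 is too deep"
end

JSON.parse(doc, max_nesting: false)  # no depth check at all
JSON.parse(doc, max_nesting: 200)    # or raise the cap instead of removing it
```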
I did that intentionally, and not because of FUD. I hear ">100 levels of nesting" and think "huge document", which as demonstrated here isn't necessarily true. So I said "202 bytes" to emphasize that this is not a huge document; deep nesting doesn't really imply much about document size.
100+ levels of nesting implies nothing about the size of a JSON file, just that it's an extremely niche use case. OTOH almost all JSON files are 202+ bytes. Couple that with the title and it just leads to confusion.
I would love to have a hardware-backed soft U2F daemon for Linux. It would be great if this could use a TPM to encrypt the data. What would it take to add such capability to your software? I would be happy to lend my (limited) Rust experience to help make it happen.
I would too. Go ahead and open an issue and we can chat more, but in short I think the biggest hurdle is understanding TPM2 and which API to use to interact with the device. (TPM2 is necessary so that the key material never leaves the TPM; signing happens in the device itself.)