Hacker News | besselheim's comments

I've found Compiler Explorer to be very useful indeed, for two main use cases: gaining insight into how security mitigations are implemented in different compilers, and for quickly checking my working when reverse engineering tricky C++ compiled code.

Really interesting to see how the tool works behind the scenes.


To give another example, Amon Tobin's Verbal achieves a similar effect through piecing together fragments of vocal samples: https://youtu.be/HsdBgQqBfsM


It makes sense to collect everything possible, then discard what you don't need later, because when conducting an investigation you don't know in advance which data are of interest.

If your communications are intercepted and stored, but then never looked at and eventually deleted, this is functionally equivalent to their never having been collected at all.


There's one big difference: if your data was never collected, then a breach (internal or external) doesn't endanger it. If it was "collected but never looked at," then it's subject to the integral of every mistake, malicious action, or rule change from now until they lose it. One of the major things Snowden revealed was that random nobodies had huge access to sigint material.


It's the responsibility of the communicating parties to protect themselves against interception if they consider it an unacceptable risk: use end-to-end encryption for message content secrecy, and obfuscate message routes (e.g. via Tor) to help mask source and destination pairs.


The US government can break Tor and pretty much everything else when it really wants to. That's no protection.


Do you have any evidence of this?



We have already had proof of the LEAs' inability to keep TS information safe; expect a worse level of protection for routinely intercepted phone calls, emails, etc.

It's very, very difficult (some would say impossible) to guarantee that people's data has been safely stored and transmitted for its whole retention lifetime.


That's an argument for ensuring that such data is properly secured, not an argument against collecting it in the first place.


If you ask me, keep one or the other. The inability to maintain the operational reliability of a datastore (including backups) does not inspire confidence.

Assuming this is an incident and not a coverup.


Assuming 32-bit unsigned integers (but similarly for other integer sizes), multiplying by 0x11111111 and shifting right 28 bits would give a function of period 15, at least for inputs between 1 and 0x1000000e inclusive. Faster than dividing by 15.


You're absolutely on the right track with a (wrapping) multiply and a shift right, but the constants aren't quite right.


The WebSDR at http://websdr.ewi.utwente.nl:8901 (based in the Netherlands) can be used to hear the station as it is broadcast. Set to 4625 kHz, USB.


This site is really cool. Thanks for sharing, though I'm afraid I'll be wasting quite some time today playing around with it :)

[edit]

It looks like the site doesn't usually get this much traffic, and the CPU load is getting too high to hear anything clearly. For clearer audio of the buzzer, try:

http://websdr.printf.cc:8901/


More web based receivers are also available on http://sdr.hu and http://websdr.org


Neato - it's picking up the buzzer station right now!


If you're unfamiliar, search for 'numbers stations.' They are kinda neat and still in use.


MOV RAX, 1 is seven bytes: 48 C7 C0 01 00 00 00


The parent has a good point actually, because "mov eax,1" automatically zero-extends in 64-bit mode.

It's still one byte longer than the equivalent AArch64 instruction, though.


Fine. If you are really into getting the shortest instruction, try "xorl %eax,%eax" then "incl %eax" which is four bytes (31 c0 ff c0).


push $1/pop %eax (6a 01 58) is shorter, but perhaps not the best idea.


Actually, there is no 32-bit pop instruction in x86-64 mode. Your code won't work.


You're right: it should be pushq $1/pop %rax (which is also three bytes, although there will be a prefix byte for registers r8 through r15).


The work by Hayashi published last year had derived precursor egg cells, but the sperm was from normal mouse testes as usual. As described in the article, they did successfully create healthy offspring, though the success rate was very small.


The distinction does make sense though. If an intercept is taken, never looked at, and eventually deleted, then from a 'privacy violation' point of view it may as well not have been collected at all.

But these still need to be intercepted in the first place, because you don't know in advance which of the 0.01% of records are of interest.


From a privacy standpoint, the fact that they make a copy of your data is alarming, as this multiplies the attack surface for your data.

Also, all the data is "looked at". Sure, not by a human, but if the algorithm can flag you as a terrorist and then humans look through the data, what's the difference?

You also seem to be under the impression that they delete the data once they're done. But why would they ever delete that copy of your data unless they need the space for something else? They never will, and that's why they're building a huge datacenter in Utah, to store EVERYTHING INDEFINITELY.

As to your last point, I'll flip the table: if only 0.01% of the records are interesting and only 0.01% of those records lead to any meaningful insight into terrorist activities, why should we continue these programs knowing that they are, on average, 99.999999% ineffective?


"...But these still need to be intercepted in the first place, because you don't know in advance which of the 0.01% of records are of interest...."

They do not, and the reason why is interesting.

In free countries, there is an order to things. This order is important.

First there is an event. A person applies for security clearance, there's a robbery, somebody reports a crime, and so on.

Second there is an investigation. Based on the nature of the event reported, facts are gathered to determine whether or not the state needs to intervene. More importantly, the people who need intervention are identified. Based on the investigation, sometimes the state might intervene with force, as when a SWAT team shows up at your door.

The order is clear, and necessary: event > data gathering > investigation > people. We start with an event and end up with people who might be criminals. Because everybody is guilty of something, it's important to limit the state to going after people only where there is a clear, independent prompt for action, i.e., an event.

But what if we change it around? What if we collect data all the time? Well then we are no longer limited to having a good and independent reason for taking action. Instead, now we can start with the person and then figure out what they're guilty of until it justifies the action we've already selected. And guess what? Everybody is guilty of something.

All we've done is create a machine where we can identify people we don't like, push a button, and then find reasons to put them in jail or apply force to them in other ways. Free countries cannot continue operating in an environment like that. Because we've made the law do our bidding, there is effectively no rule of law. And people are smart enough to figure that out.


If you can go ahead and mail me a duplicate of your debit card, I'd like to intercept it. I promise not to collect it.

The fact that the infrastructure is in place, and that rampant abuse has happened in the past, makes the distinction essentially meaningless.

If someone is going to justify interception they need to justify collection, because we all know it's going to happen unjustly to at least some citizens.


That doesn't help you if the code running outside your filesystem is compromised.

For example, a backdoor implanted in the disk firmware would be virtually undetectable for the vast majority of users.


I agree, expressing non-authentic feelings is so common we even have figurative expressions to describe it, e.g. to "put on a brave face" or "grin and bear it".

On the other hand, it seems that autistics have the unpleasant pressure of doing this in almost every social interaction.

