
Even if these extensions are being used, I'm sure the attack surface of X11, its extensions, and its drivers is large enough to make it trivial to breach, even compared to a modern browser like Chrome.


Do you mean Xorg rather than X11? X11 is just a protocol (or a set of protocols, depending on how you look at it). Xorg, on the other hand, is the most popular implementation. There are (were?) others, some of them much smaller in terms of source code (but also features, platform support, etc.).


I wish they would throw a dialog or interstitial that would let people know about it.

The more people asking why they have a "reduced quality" video in 2016 despite paying $90 for their phone plan the better.


Dick Clarke has been saying the same.

> Clarke explained that the FBI was trying to get the courts to essentially compel speech from Apple with the All Writs Act. "This is a case where the federal government using a 1789 law trying to compel speech. What the FBI is trying to do is make code-writers at Apple, to make them write code that they do not want to write that will make their systems less secure," he said. "Compelling them to write code. And the courts have ruled in the past that computer code is speech."

http://arstechnica.com/tech-policy/2016/03/former-cyber-czar...


"During his testimony today, Comey dismissed the notion that Apple’s assistance in the San Bernardino case would impact other phones, reiterating his belief that any code Apple created to help in this case would only work on Farook’s phone."

And that belief is based on what exactly?

Apple has been saying the opposite. Apple doesn't know its own code? The FBI knows it better?


It's true, but very narrow, I think. The FBI means that Apple could sign the update to work only on that phone. Apple means that once the compromised version of the OS is built, the only thing stopping it from spreading is the device ID check, which could be pointed at other phones or taken out entirely.


Right. If you look at it from a security point of view, once the compromised OS is created you've created a much more valuable and vulnerable target for hacking.

Let's say that some attacker wants to create a compromised OS and install it on a certain device.

If Apple never creates the compromised OS, they would need to hack into Apple, get all of the source code necessary to build iOS, figure out how to build it, figure out how to modify it in the desired ways and how to get it installed on a phone, steal the crypto keys necessary to do the signing, and sign the bad build.

If Apple has created the compromised OS, they would just need to hack into Apple, get the compromised OS build, steal the crypto keys, and sign it.

The first scenario is a large-scale software engineering project. Anyone that's been given a large source dump will tell you that it's horrible and takes forever to do anything, and iOS is going to be absolutely huge and tricky. You'd need a large, highly trained team of security/OS devs, which is hard to come by and would be extremely expensive.

The second scenario could conceivably be done by a single hacker, if they can find vulnerabilities in Apple's security.


Apple also has a huge firewall (in the figurative sense) right now, in that it would take a large amount of effort for them to create this new security-relaxed version of the OS, which makes it much harder for the government to compel them to produce it.

Now, let's say that they have written it for some reason, but it is restricted to a single device ID. Well, it's now a lot easier for the government to compel Apple to hack another phone, because they can credibly argue that all Apple has to do is change some string constant and re-sign the package. The burden of work is now much, much less than if the tool didn't already exist.

Apple doesn't want to ever create the tool. If they have to create it for any reason, even if it starts out being locked to a single device id, they've lost the war.


While it makes things easier, you don't need to have the source code to break protections.

Crackers broke copy protections for decades without having access to source code of protected games.

The only thing you would need is access to the private key used to sign the new code so that the phone will accept it, and even that could be worked around by a hardware engineer.

Anyway, the whole thing does not make much sense. Those shooters are already dead, they destroyed their private phones, this was a work phone, and the FBI can already get the metadata (outgoing/incoming calls, etc.) from the cell provider. The FBI went public with this even though it would have been in their best interest to do it secretly. What does the FBI expect from doing this publicly? Did they expect us to cheer for them and complain about evil Apple not helping to break evil terrorists' phone?

It doesn't make much sense... unless the real goal was to make people trust Apple more after Snowden's disclosures. Isn't it interesting that Google, Facebook, Microsoft... every company which was previously involved in PRISM is supporting Apple? Trusting them benefits both the agencies and those corporations.


> Did they expect us to cheer for them and complain about evil Apple not helping to break evil terrorists' phone?

I think that is exactly what they expected. Terrorists and pedophiles are the best way for federal TLAs to expand their powers.


Except digital signing makes the compromised OS totally and utterly useless for other phones. Changing the OS would cause the signature check to fail.

And if you can get around the digital signing, you don't need the compromised OS.

Conway's technical interpretation of the Apple deliverables is right. There's a legal-precedent concern which could lead to reuse (and that is rightly a matter for debate, and for outright refusal of the FBI's position), but if you just debate the technical merits, Apple has been very misleading about the consequences.


For this one case you're right, because the FBI will allow Apple to lock the special OS build to this device's ID. The problem is that if the FBI can force Apple to create a special build to order, with features specified by the FBI, they can also order an OS build that isn't locked to one device. And if the FBI can make them do this, so can any law enforcement or government agency capable of finding an amenable judge, such as, say, the CIA, the DEA, the NSA, or any random public prosecutor. THAT is the problem.


Not to mention that once this is done in the US, what is to prevent other governments in countries where Apple does business from compelling Apple to do the same?

China (or Russia or Germany or whoever) could force Apple to backdoor phones used by CIA informants in that country.


And who is to assure that Apple doesn't leave one or more bugs around that make the compromised OS not as tied to a single device as they meant it to be?

It's a ticking bomb, man.


The FBI can't legally do any of that.


"Except digital signing makes the compromised OS totally and utterly useless for other phones."

This carries with it the assumption that the digital signing and verification mechanisms are infallible and impervious to attack. That is an unwise assumption. Even if a software system appears to be perfectly secure at a given time, it is reasonable to assume that at some point a vulnerability will be discovered.


> And if you can get around the digital signing, you don't need the compromised OS.

Not necessarily. Someone could get their hands on the signing keys or find a vulnerability in the signature verification without having the knowledge or resources to create something worth signing. Or figure out a way to bypass the check by changing something that isn't covered by the signature, or use something like rowhammer or hardware hacking to flip the bit from saying the check failed to saying the check passed, etc.
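
To make that last point concrete: even a sound signature check can be defeated if its verdict lives in attacker-influenceable memory. A minimal sketch, with purely illustrative names (this is not real iOS code):

    #include <stdio.h>
    #include <stdbool.h>

    /* The verifier can be cryptographically sound; what matters is where
       its verdict is stored. If an attacker can flip one bit of this flag
       (rowhammer, voltage glitching, etc.), the boot decision flips too. */
    static bool signature_valid = false;   /* set by the unmodified verifier */

    int main(void) {
        unsigned char *p = (unsigned char *)&signature_valid;
        *p ^= 1;   /* one induced bit flip */
        if (signature_valid)
            printf("boot: check 'passed' without a valid signature\n");
        return 0;
    }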


It is useful on other phones if someone figures out how to hack whatever mechanism is used to do the phone ID check. If that happens, suddenly this patch works against all phones.


> And if you can get around the digital signing, you don't need the compromised OS.

signing <> encrypting


Find?

Wait three weeks or three months for the FBI to request n copies of the evil thing, tailored to each of the n phones it wants to open. Better still, wait for a few others to make similar requests. Now penetrate or impersonate a law enforcement agency of your choice and send Apple a routine request for the n+1th copy, tailored to the phone of your choice.


> steal the crypto keys

Once you have done that, the other steps are easy.


That is an interesting assertion that you do not back up in any way. I don't know about you, but I don't see the other steps as anything like easy. And I've been doing this software thing for a while now, so I think I have some benefit of experience to draw on.


Every single version of the iOS kernel has been dumped. That gives you most [0] of what you need to craft a modified version. The largest barrier to running these modified versions is getting the target hardware to accept them as authentic. All public bootrom/iBoot exploits on the iPhone 3GS/4 patch the bootloaders' RSA authentication out in some form or another. There are no public bootrom exploits out for iPhone 4S+ devices.

Thus, having the signing key (or the power to compel signing at will) is an incredible capability that belongs to Apple alone.

[0] Some Mach-O information is lost. Decryption of the imgX formatted kernel is preferable.



How hard do you think it is to sign software with keys you already have? Cracking software to avoid erasing the device and to support talking to the security hardware without a timeout... I don't even know what one would think is hard there.


Why do you say that the first scenario requires a large-scale software engineering effort? They don't want new features, or major changes to the crypto systems - just disable the function that causes the phone to lock for longer and longer times when wrong PIN codes are entered.

Comment out the function call. Change the number of allowed guesses to INT_MAX. Change the time increment to zero. Click build.

This is not a hard task!
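
For a sense of scale, here is a sketch of the kind of diff being described, with hypothetical names (these are not actual iOS identifiers):

    #include <limits.h>
    #include <stdbool.h>

    static int failed_attempts = 0;

    bool passcode_attempt_allowed(void) {
        /* sleep_ms(escalating_delay_ms(failed_attempts));  <- patched out */
        failed_attempts++;
        return failed_attempts < INT_MAX;  /* was: < 10, then wipe_device() */
    }

    int main(void) { return passcode_attempt_allowed() ? 0 : 1; }

That's the whole argument: a couple of one-line changes, if the code really is that cleanly separated.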


That's a good point; I was assuming it would be difficult. I've since done a little bit of reading up on what we know about how difficult it would be.

This is the best writeup I found: http://blog.trailofbits.com/2016/02/17/apple-can-comply-with...

So, you'd need an update to iOS/the phone firmware, and for newer devices you'd also need an update to the secure enclave firmware. You can't do anything about the 80ms delay, because that's baked into the hashing function (and changing the hashing function would generate invalid results). The FBI is also asking for the ability to enter passcodes electronically rather than via the touchscreen, which would be new code.
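
Taking the 80ms floor at face value, the brute-force arithmetic is easy to sketch (a back-of-the-envelope calculation, assuming one key derivation per guess):

    #include <stdio.h>

    int main(void) {
        const double per_guess = 0.080;                   /* 80 ms per try */
        printf("4-digit PIN:  %.0f minutes\n", 1e4 * per_guess / 60);
        printf("6-digit PIN:  %.1f hours\n",   1e6 * per_guess / 3600);
        /* 62^6 mixed-case alphanumeric codes */
        printf("6-char code:  %.0f years\n",
               56800235584.0 * per_guess / (3600.0 * 24 * 365));
        return 0;
    }

So the delay only really bites once you move past short numeric passcodes.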

If iOS and the SE firmware are factored in a way that makes disabling security clean, and it's not hard to add the new functionality, then this might not be too much work. However, I doubt that will be the case. The whole point of the security system is to make it difficult to crack, so there might be other countermeasures involved, tricky dependencies, and low-level hardware hackery. If it were simple to do, why wouldn't it already have been done by others reverse-engineering the compiled code? There is certainly financial motivation to do so.


> Click Build

I'm willing to bet iOS has a huge build infrastructure, many different components, and about a snowball's chance in hell of having a single nice clean Makefile for you to type one command to get a build without access to that infrastructure.


But the point is that Apple has that infrastructure. If the FBI asked the right person or people, I would bet they could get this done on their lunch break.

People are making it out like the FBI is asking Apple to rewrite a big part of iOS. That's not the part of the request that's the problem.


It depends on the source code you find: if you find source code without comments, or even worse obfuscated code, it would be a really difficult task to modify it the right way.


But what it's really about is the legal precedent it would set, allowing the government to force companies to unlock devices for them.


It's not the unlocking that's the precedent. It's the door into "modify your source code in such and such a way."

Come to think of it, why aren't free market standard bearers rallying against this as government intrusion into market features?


The answer is that they are, and this is something of a litmus test for who actually stands behind free markets and who doesn't, despite what they often claim.

An example from an organization supporting free markets: http://fee.org/articles/apple-defies-fbi/

And then there is no shortage of examples from the presidential candidates who claim to support free markets, yet none are standing behind Apple.


Because it isn't politically expedient for them to do so. Apple is very, very successful, and as such, most uninformed people (voters in this case) love the theater and the cheap political pot shots lobbed at Apple. It's clear that there is no discussion at a national or international level of the actual implications of what this means, not just for Apple, but for the American economy and software built in the US.

For instance, do you really think Microsoft will be able to sell Office software to the Netherlands government if the DOJ/NSA/whoever can use the All Writs Act to force Microsoft to implement a backdoor into their software? Would the NSA be able to use the AWA in conjunction with an NSL and a secret court to force the hand of companies? Politicians and the public don't really grasp what's at stake here. What we're really talking about is creating a complete and very real artificial handicap for all software companies located/based in the US. There is already pushback in China, Europe, and Australia to ditch American-made software after the Snowden revelations. A ruling in favor of the FBI will only compound and accelerate this issue and will have a marked and measurable effect on the revenues of software and hardware companies located in the US.

While Google/Apple/Facebook/Microsoft/Cisco et al. may not be able to relocate, this will definitely cause small and medium-sized firms to relocate, or possibly never to incorporate within the US to begin with. This may effectively and preemptively scare away the next Google. Law of unintended consequences and all that.


Cause most of them never were actually for the free market.

Also: Terrorism.


Correct. That's what I should have written.


"why aren't free market standard bearers rallying against this"

because it has nothing to do with the "free market".


Even worse than that, the precedent will be set that government agencies can ask any tech company to subvert its own security. So TVs, phones, Echos, webcams, computers, and analytics.js could all legally be modified to become surveillance devices, and if doing one, why not do them all?


My question is: why hasn't the FBI gone after a smaller player to set this kind of precedent? One that wouldn't have the huge legal resources to oppose the request that Apple has.

I think what's coming out of this is that the FBI is riddled with incompetence and an inability to face modern threats, plus a silly hubris that is the foundation for silly strategic mistakes.


Presumably because smaller players don't have such elaborate security. Those can always argue that the government should use one of the well-known exploits.

I also could imagine that Apple would aid such a case anyway.


There is nothing in the digital signature check that allows it to be locked to a device, so that's logic that has to kick in AFTER the trusted layer has validated the code. At that point, it is a simple matter of altering the device ID check in unsecured RAM and you've now got another cracked phone.


Actually, I would argue that this is not true. Before installing, the device wants a ticket signed by Apple that contains a hash of the firmware to be installed, the phone's identifier, and a nonce it has just generated. See here: https://www.theiphonewiki.com/wiki/SHSH

So by not signing any requests for that particular firmware hash, Apple can effectively neuter that firmware and make sure it's never installed anywhere but on the target phone.
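
A toy version of that check, where apple_sig_ok() stands in for real RSA verification and the field sizes are illustrative rather than Apple's actual format:

    #include <stdbool.h>
    #include <string.h>

    typedef struct {
        unsigned char firmware_hash[32];
        unsigned char ecid[8];      /* unique chip ID of the target phone */
        unsigned char nonce[20];    /* freshly generated for this install */
        unsigned char signature[256];
    } ticket_t;

    /* Stand-in for RSA verification against Apple's public key. */
    static bool apple_sig_ok(const ticket_t *t) { (void)t; return true; }

    static bool may_install(const ticket_t *t,
                            const unsigned char my_ecid[8],
                            const unsigned char my_nonce[20],
                            const unsigned char fw_hash[32]) {
        return apple_sig_ok(t)                          /* Apple signed it  */
            && !memcmp(t->ecid, my_ecid, 8)             /* for this device  */
            && !memcmp(t->nonce, my_nonce, 20)          /* for this install */
            && !memcmp(t->firmware_hash, fw_hash, 32);  /* this firmware    */
    }

Replaying a captured ticket on another phone fails the ECID and nonce comparisons, which is why refusing to sign is such an effective veto.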

The problem is, though: if Apple can be compelled to do this once, they can also be compelled to do it any other time.


That's not part of the boot chain. That's for OTA updates.


So would a signature check in the trusted layer, against a signature generated with the device ID (you'd need to distribute a different binary for every device ID), permit the generation of an OS image that could only run on a single device?


It would, in theory. If there weren't any catches with this approach, though... Apple could have avoided being in this position in the first place.


Once you've changed the device ID check code, you'd still need to sign it again if you wanted to distribute it widely.


Sure, but once the FBI has forced Apple to write the code, forcing them to update the device ID and sign the new build is trivial by comparison, which means that breaking into any random iPhone will become routine.


Is it possible to change a phone's device id to match the initial target, though?


I believe the FBI is suggesting that Apple tie the update to the phone's IMEI, which I believe phone thieves routinely change by desoldering and replacing a chip.


Apple firmware updates are signed on a per-install basis.


Not sure why this got downvoted, I'm not too familiar with iOS but AFAIK this is exactly how the SHSH system works with the modern iPhones.

Quick googling seems to support this.


I didn't downvote you, but I think you're being downvoted because the information content isn't much more than "but cryptography something something!"

I mentioned that the most common method for uniquely identifying a handset (the IMEI) can be changed by switching a chip on the iPhone's main board. (At least this was true 6 years ago.)

So, unless Apple uses an interactive signature scheme or prevents the FBI/intelligence agencies from ever seeing the signature (using TLS with hard-coded certs), then the signature can be replayed.

If the signature can be replayed, then in order to prevent FBiOS being used on multiple phones, it must be tied to one or more unique identifiers, probably excluding the IMEI.

Many people understood my post as shorthand for the above. Responding to this with "[But] Apple firmware updates are signed on a per-install basis." doesn't add to the conversation unless you provide further details. At least, that's my best guess as to why you've been downvoted.


>I mentioned that the most common method for uniquely identifying a handset (the IMEI) can be changed by switching a chip on the iPhone's main board. (At least this was true 6 years ago.)

https://www.theiphonewiki.com/wiki/ECID Firmware updates use this, not IMEIs. And I think the IMEI is more commonly used to identify the radio, not the device itself. But I could be wrong about that.

>So, unless Apple uses an interactive signature scheme or prevents the FBI/intelligence agencies from ever seeing the signature (using TLS with hard-coded certs), then the signature can be replayed.

Every time you update an iPhone, it generates a nonce that goes into what's called the APTicket. Apple signs that, your ECID, and the firmware. The nonce essentially makes replay attacks impossible, even if you managed to swap a device's ECID.


Thanks for the research! If they're signing the ECID using an interactive signature algorithm, then it sounds like they've thought it through pretty well.

> And I think the IMEI is more commonly used to identify the radio, not the device itself.

Across manufacturers, I'm not sure there's another quasi-unique identifier in common use.

> Every time you update an iPhone it generates a nonce, called APTicket. Apple signs that, your ECID and the firmware.

This is one variant of an interactive signature scheme.


Yes, but that is not nearly the same burden as actually writing the compromised OS. It's probably as easy to compel them to sign the update as it is to compel them to turn over iCloud data.


They could easily request the device signing keys via a different case, or again using the All Writs Act, stating that it's necessary for whatever.

Not turning over the encryption/signing keys would be followed up with jail time / contempt of court charges for any officers/developers/etc refusing to remand the keys into federal custody.


Yes, someone can modify the OS image to take out the check for the phone ID, and then it can work on any phone.

Also the FBI would want to use it on other phones as well. They might modify it themselves to work with other phones.


Moreover, once the compromised OS is created, Apple will be compelled to unlock iPhones in every country where it does business, including countries like China and Russia.


Apple releases updates for their phones often. They could engineer a hack that can then be patched by new versions of iOS.


> Comey dismissed the notion

Seems just for a moment, because, via Reuters:

http://www.reuters.com/article/us-apple-encryption-congress-...

"FBI Director James Comey told a congressional panel on Tuesday that a final court ruling forcing Apple Inc (AAPL.O) to give the FBI data from an iPhone used by one of the San Bernardino shooters would be “potentially precedential” in other cases where the agency might request similar cooperation from technology companies."

"Manhattan District Attorney Cyrus Vance testified in support of the FBI on Tuesday, arguing that default device encryption "severely harms" criminal prosecutions at the state level, including in cases in his district involving at least 175 iPhones."


Apple is talking about source code, the FBI is talking about a signed binary. I'm fairly certain Apple has the technical ability to create a signed binary that only executes on a single phone.


More importantly, once "GovtOS" (as Apple's filing calls it) is developed -- even if the government is billed $800K for the privilege -- each subsequent writ will be much less expensive to fulfill, creating a tidal wave of LEO requests to unlock phones. So Apple wants to head this off right now, because otherwise the floodgates will open.


Not necessarily. Apple could simply delete all code modified to make that change, necessitating a similar amount of work for each phone unlocked.


Apple has said that for legal reasons, it may be forced to keep the code permanently and will have to secure it permanently out of concern for future legal/court obligations specific to this case.


In support of this, someone forwarded me a very interesting article written by someone who creates forensic software for a living. The legal requirements surrounding the creation of a software tool for forensic purposes, which this proposed effort requested by the government might fall under, are nothing less than herculean in scope.

http://www.zdziarski.com/blog/?p=5645


Zdziarski's arguments are very illuminating on how this is not a simple or one-off request. Excellent read.


I'm sure defense counsel would want to be able to verify that it isn't modifying file access times, or deleting data, or planting data, or otherwise disturbing evidence when the update is put in.


If Apple had designed the iPhone to require user authentication before updating the software/firmware, they wouldn't be in this mess, because they would not be able to comply with the court order short of hacking/jailbreaking the phone. If the PIN were required before installing new software, the FBI would first need to know the PIN to load GovtOS onto the phone, so they could not crack the PIN using this method. And once Apple patches their software to require user authentication before installing updates, they will no longer be able to comply with any similar request.
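
In sketch form, the proposed design change is one extra gate in the update path (the stub functions and names here are hypothetical):

    #include <stdbool.h>

    static bool signature_valid(const void *img) { (void)img; return true; }
    static bool user_entered_correct_pin(void)   { return false; }

    static bool accept_update(const void *img) {
        if (!signature_valid(img))         /* existing check */
            return false;
        if (!user_entered_correct_pin())   /* proposed gate: no PIN, no update */
            return false;
        return true;
    }

With that gate in place, a forced firmware load can't be used to attack the PIN, because loading the firmware already requires the PIN.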


It's a pretty good bet that China, Russia, the NSA, and other state security agencies have access to Apple's source code (not by Apple providing it to them, but by having pwned an employee's laptop). If Apple creates the source code to do this, these state agencies will be a digital signature away from being able to crack any iPhone that ends up in their physical possession. This applies even if Apple deletes the source code soon after providing the binary to the FBI, since it will have been siphoned off the corp network while under development.

Still a good idea?


> "It's a pretty good bet that China, Russia, the NSA, and other state security agencies have access to Apple's source code (not by Apple providing it to them, but by having pwned an employee's laptop). If Apple creates the source code to do this, these state agencies will be a digital signature away from being able to crack any iPhone"

In the scenario you lay out, these security agencies are incapable of writing their own modifications to iOS, even though they possess the source to iOS.

Absolutely ridiculous. If they can steal the source and signing key, they certainly have access to the technical expertise to do it themselves.

I mean christ, exactly how complicated do you think this pin timeout logic is? If they can hire sufficiently skilled hackers, they can certainly hire sufficiently skilled developers.

The security of the system lies in the secrecy of the signing key. If they can meet that bar, they can surmount any other obstacle.


> These state agencies will be a digital signature away from being able to crack any iPhone that ends up in their physical possession.

Which in the current world, is about as far from having an exploit as one can be. Digital signing works pretty well.


Can you explain how that would be implemented cryptographically? Doesn't seem like an obvious feature to have included to me.


My understanding is that when you install iOS on an iPhone, an Apple server signs the OS as part of a challenge-response protocol. The challenge includes a unique device ID, and I believe the signed iOS is only installable on a device with that ID. http://www.saurik.com/id/12 has more details.

Think about this in the context of jailbreaking to understand why such a facility exists. Apple doesn't want users to install their own modifications to iOS, and they also don't want users to install old versions of iOS that have vulnerabilities that would allow people to modify the OS.

One way you could implement something like this is to have a public/private keypair within the device and have updates encrypted with the public key; then design the device to only run an OS that it could decrypt with its private key. To do this well, you would need a TPM that did not allow the private key to leave the device, nor to be reset.
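
A toy of that boot-side logic, where tpm_decrypt() stands in for decryption with a private key that never leaves the hardware (illustrative only, not any real TPM API):

    #include <stdbool.h>
    #include <stddef.h>

    /* Real hardware would fail here for any image not encrypted to this
       device's public key; the stub just shows where the decision lives. */
    static bool tpm_decrypt(const void *blob, size_t n, void *out) {
        (void)blob; (void)n; (void)out;
        return true;
    }

    static bool boot_image(const void *encrypted_os, size_t n, void *plain_os) {
        /* Only an image encrypted to *this* device decrypts successfully;
           anything else is refused at boot. */
        return tpm_decrypt(encrypted_os, n, plain_os);
    }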


All iOS software updates, even the normal ones, bear a digital signature that incorporates the device's UDID. The bulk of the software update is the same for all devices, but Apple must generate a new signature for each device using Apple's private signing key.


I don't know if Apple has any specific capability as part of the firmware verification, but even if they didn't they could just put something like this early in the boot process:

    if (unique_device_id != SAN_BERNARDINO_DEVICE_ID) {
        halt();
    }
If this code must be signed to execute then it can't be modified to work on another device without Apple signing it again.

This assumes there's a unique device ID that is known to the FBI and can't be tampered with. Maybe the serial number or IMEI?


Fixed that for you:

    if (unique_device_id != SAN_BERNARDINO_DEVICE_ID) {
        goto fail;
    }


My understanding is that phone thieves routinely change the IMEI by desoldering and replacing a chip. If this weren't the case, I think it would be fairly easy for detectives to call up the person currently in possession of any given stolen iPhone.



It looks like there's something called a UDID which is a SHA-1 hash of a bunch of identifying information. So, difficult to fake even if you can twiddle the source values or swap in new chips.

https://www.theiphonewiki.com/wiki/UDID


Except they have the shooter's phone, which has the identifying information which results in the correct UDID. To get the same UDID on another phone they just need to change the source values to the same values as in the shooter's phone. The fact that it's a cryptographic hash doesn't really help here, assuming they can change all the source values at will.


I'm not sure how Apple could develop GovtOS without at least testing it on other iPhones.


Clearly it's based on his deep and extensive 30-year work experience as an information security and cryptology developer at Apple. Additionally, I believe Comey was the first person to jailbreak the original iPhone. (sarcasm)


Comey in testimony today:

"Whatever the judge's decision is in California ... will be instructive for other courts, and there may well be other cases that involve the same kind of phone and the same operating system"

It's a little strange for him to dismiss the notion that this will set a precedent because it will just be for one phone and then imply that this case will set a precedent for other phones.


> It's a little strange for him to dismiss the notion that this will set a precedent

Here's the report that he confirmed the precedent:

"Comey told a congressional panel" "that a final court ruling" "would be “potentially precedential” in other cases where the agency might request similar cooperation from technology companies."

http://www.reuters.com/article/us-apple-encryption-congress-...


I think Apple signs every single OS installation using a mechanism jailbreakers refer to as SHSH (https://www.theiphonewiki.com/wiki/SHSH), so it could be argued that, yes, Apple is in full control over which phone the firmware gets installed on.

However, if they can be compelled to do this once for one phone, they can be compelled to do it many more times, for as many phones as the FBI or anyone else wants.

I would say: Both are right in this case.


Actually, he didn't. I don't remember the exact words, but he made it clear that they were interested in setting a precedent with this case, and that's what this whole case is about.


No, he's right.

Apple would have to change some config files to unlock the other phones, resulting in new code.


Comey's magical thinking knows no bounds.


I recently put a turbo on my van since I live high in the mountains where the atmospheric pressure is low and the performance loss is noticeable.

Of course the ECU needs adjustments to the fuelling tables (you need to run rich under boost to prevent detonation) and the spark timing tables, as well as a patch to the OS to allow the use of a different manifold pressure sensor (the default OS doesn't recognise pressure above 100 kPa).

I guess this was illegal.

I have heard rumblings from the professional engine tuners that the OEMs are already starting to lock down ECUs. Not only via DRM, but by having enough checks in the code that modifying parameters to up performance results in error codes and limp-home mode. They expect to have to move to aftermarket ECUs soon.

Some of them would have cost more than my whole project:

https://www.holley.com/products/fuel_systems/fuel_injection/...

Fortunately there are more DIY friendly set ups:

http://megasquirt.info/products/diy-kits/

But it seems a waste to have to throw away a perfectly good ECU because the OEM (or gov) decided to lock it down.


I'm a dealer for Holley ECU products. They're fantastic! But you are correct, people in my industry are now forced to tell the modern hot rodder that step one (for the most part) is purchasing a $1200+ computer replacement plus rewiring their car in order to do anything.


You can modify your ECU all you want so long as it doesn't have any DRM-like protections.


I'm a bit worried that these stories floating around about Title II being imminent will give the monopolies time to throw more money at the problem to make it go away.

i.e. "convince" Wheeler or get congress to pass a bill.

Or it could be a ploy by Wheeler. Make noise about full regulation, then make the carriers think they have won some concession when he negotiates and implements something lesser.


I foresee an arms race here.

Next gen FTDI clones will work around this driver detection. Next FTDI driver has new detection code.

Iterate until the counterfeit chips are indistinguishable from the real thing via software.


That's the arms race that's already in place. The new one will take place in a courtroom.


> Iterate until the counterfeit chips are indistinguishable from the real thing via software.

Will they still cost less?


Probably


But... innovation!

Seriously, though, this just looks like extracting economic rents from the fact that their vendor IDs come pre-installed in certain OSes.


It also seems like a task with inherent parallelism.

Upload the corpus and throw a bunch of cloud instances at it.


Why leave out PostgreSQL?


We picked the most popular from the RDBMS, NoSQL, and NewSQL worlds. In the future we will compare against more systems. We are also in the process of releasing benchmarks for each of the above systems. We will be sure to include Postgres.


This is like comparing apples and oranges. What technique do you use for benchmarking?


Something like TodoMVC for backends would be nice... in essence, you create a backend for the TodoMVC front end, each implementation using the same web-server platform, language, and front end. The only difference would be the backend SQL server, with as much processing done on the server (if it supports procedures) as possible. Maybe extend the example with a location and a local date/time.

Using Node.js and Angular for the server/front end, it would be easy enough to swap out the "todo-mvc-server-data" module... as long as each supported the same interface(s), it could be a good test...

Set up the same hardware for each backend, and then run performance tests against a node cluster for the front end. It would by no means be comprehensive, but would be a nice comparison point (like TodoMVC itself).


> We will be sure to include postgres

Yeah, right. Come back when you do.


Exactly.

It's not like the ISPs aren't making a profit. They are extremely profitable and just want more.

Makes me wonder how much this discussion is being astroturfed.

