Hacker News | noahm's comments

Man, this makes me sad beyond words. It's a great loss for all of us, even those who don't realize it.


Yeah, I was really hoping for Detroit. It's got a lot of really underutilized infrastructure at this point and could be really revitalized by a large tech company expanding its footprint there. Real estate prices would be relatively low, too.

I'm surprised to see a place like Boston on the list. Having lived there for ~15 years, I just can't imagine adding an additional 50,000 well paid tech workers and their families to that area. It's already densely populated and expensive.


Fargate uses the same task definition abstraction as Amazon ECS. See http://docs.aws.amazon.com/AmazonECS/latest/developerguide/l... So yes, you can launch multiple containers in a single logical unit.


A word of caution: ECS multi-container tasks do not have the same semantics as Kubernetes pods. In particular, there is no support for bidirectional network discovery.
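For reference, a multi-container task is just a list of container definitions inside one task. A sketch as a Python dict (field names follow the ECS task-definition schema, but the family name, images, and ports here are hypothetical):

```python
import json

# Hypothetical two-container ECS task definition. The field names come
# from the ECS task-definition schema; the family, images, and ports
# are made up for illustration.
task_definition = {
    "family": "web-with-sidecar",
    "networkMode": "awsvpc",                 # required for Fargate
    "requiresCompatibilities": ["FARGATE"],
    "cpu": "256",
    "memory": "512",
    "containerDefinitions": [
        {
            "name": "web",
            "image": "example/web:latest",
            "essential": True,
            "portMappings": [{"containerPort": 8080, "protocol": "tcp"}],
        },
        {
            "name": "log-forwarder",
            "image": "example/log-forwarder:latest",
            "essential": False,
        },
    ],
}

# Both containers are launched and stopped together as one logical unit.
print(len(task_definition["containerDefinitions"]))  # 2
print(json.dumps(task_definition)[:20])
```

In practice a dict like this would be passed to RegisterTaskDefinition (e.g. via boto3's ECS client); note that, per the caution above, the containers share a task but do not get Kubernetes-pod-style bidirectional discovery.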


Almost every account? You mean almost every account that NYT bothered to include in that article.

As a current employee at Amazon, that article might as well be fiction as far as I can tell. It describes nothing like what I or any of my co-workers have ever experienced here. We chuckle about it, occasionally, making jokes about crying at our desk, etc. But seriously, that article does not describe the place where I work.

When the article came out, Jeff sent a response to the company that has subsequently been published online. In it he said, paraphrased, "I would quit if my work environment was what the NYT article described, and I'm sure any of you would, too. If you're here, and your work environment is anything like what the article describes, please _tell me_ so we can fix it."

You've mentioned elsewhere that Amazon has among the worst employee retention in the industry, but that's a claim I'd also dispute. I've seen articles like http://www.slate.com/blogs/business_insider/2013/07/28/turno..., but as far as I can tell they're using median tenure as a proxy for employee retention. That's going to skew the results in any company that is hiring at a high rate. Imagine a hypothetical company that has gone from 50 to 100 employees in the past 12 months. Median tenure at that company is going to be barely a year, regardless of whether or not anybody has quit. Amazon also hires lots of seasonal workers to handle shipping, customer support, etc., which very likely skews the results further.
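The median-tenure effect is easy to see with toy numbers. A sketch (the tenures are invented, and the hypothetical company here more than doubles in a year to make the effect stark):

```python
from statistics import median

# Hypothetical company that grew from 40 to 100 employees over the past
# year, with ZERO attrition: 40 veterans at 5 years of tenure each, plus
# 60 hires spread evenly over the last 12 months.
veterans = [5.0] * 40
recent_hires = [i / 60 for i in range(60)]  # tenures from 0 up to ~1 year

# Median tenure lands among the recent hires, so it is under a year even
# though nobody has quit.
print(median(veterans + recent_hires) < 1)  # True
```

So a sub-one-year median tenure tells you a company is hiring fast, not that people are fleeing.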


These tech workers seem to say otherwise about both working conditions and turnover: https://www.reddit.com/r/cscareerquestions/comments/4fg44p/1...

Glassdoor is filled with them, too.

I'm glad you are happy at your job. Many people disagree.


Sure, but just as with customers, the dissatisfied (ex-)employees are going to be most vocal.

Do you really think Amazon could build things like AWS, Alexa, the Go Store, etc, if it was a sweatshop where everybody but Jeff was in hell? I don't.


> Sure, but just as with customers, the dissatisfied (ex-)employees are going to be most vocal.

Agreed that Glassdoor is an outrageously bad indicator about a company, taken in isolation.

> Do you really think Amazon could build things like AWS, Alexa, the Go Store, etc, if it was a sweatshop where everybody but Jeff was in hell?

Nobody is claiming that every single employee has a terrible life at Amazon. But new college grads who don't mind, and/or who do mind but don't really believe better working conditions are possible, can build most of what you referenced.

I have only anecdotal evidence, but the majority of ex-Facebook, ex-Apple, and ex-Google employees I know say pretty balanced pro/con things about their former workplaces.

The six ex-Amazon employees I know well enough to discuss this with universally decry the horrible work cultures they experienced there.

So, I'm glad you like your work culture, but I know I'd never even consider working at a Jeff Bezos company, despite how awesome I think Blue Origin is.


4 people I graduated with in 2013, plus my wife (who went to work there more recently) all work at AWS, and they all love it. More anecdotal evidence.


If I need to apply for a license to read the law, can I also claim that I need to be licensed in order to actually be bound by the law?

Obviously I cannot, but the absurdity of the idea that I can be legally bound by something that is not freely available to me is striking.


The update from 1password indicated that there was application layer encryption happening in addition to the TLS encryption, so a breach of the TLS protection did not expose any sensitive data. Presumably other sites are in similar situations. But don't take my word for it, go change all your passwords.


Any hosted password manager should be "host proof". They should not have the decryption keys and it should not be possible for them to disclose your unencrypted passwords no matter how careless they or their intermediaries are. They should be sending an encrypted blob over the wire which is only decrypted in your client app or browser when you enter the passphrase.
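The "host proof" idea above can be sketched in a few lines. This is a toy only, NOT real cryptography: actual products use vetted AEAD ciphers (e.g. AES-GCM); the HMAC-based keystream here just keeps the example stdlib-only. The point is that key derivation and decryption happen entirely client-side:

```python
import hashlib, hmac, os

def derive_key(master_password: str, salt: bytes) -> bytes:
    # The key is derived client-side from the master password; the
    # server never sees the password or the key.
    return hashlib.pbkdf2_hmac("sha256", master_password.encode(), salt, 100_000)

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # Toy symmetric cipher: XOR with an HMAC-derived keystream.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hmac.new(key, nonce + counter.to_bytes(8, "big"),
                         hashlib.sha256).digest()
        out.extend(block)
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

salt, nonce = os.urandom(16), os.urandom(16)
key = derive_key("correct horse battery staple", salt)

blob = keystream_xor(key, nonce, b"hunter2")  # what the server stores
restored = keystream_xor(key, nonce, blob)    # decrypted only on the client
print(restored)  # b'hunter2'
```

However careless the host or its intermediaries are, all they can leak is `blob`, `salt`, and `nonce`; without the master password those are opaque.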


1Password said that even though they were not affected, they will still move away from Cloudflare due to bad optics.


> Presumably other sites are in similar situations.

Not to my understanding. 1password uses client-side encryption, using keys generated from your master password. This means that any data transmitted over the wire is already encrypted, whether over SSL or not.

Most other sites do not do this, at all, in any way. If you use a website that used CloudFlare's SSL termination, change your passwords, and cancel your credit card if you sent it to that site in the past few months (e.g. Uber/Lyft).

> go change all your passwords.

Yes, correct =].


If you'd seriously cancel your credit cards over this, I'd love to hear how you model that threat relative to all the other risks inherent in using a credit card anywhere (not just online).


It is an awesome time to be alive, but I do wish we'd start doing more interesting stuff with our technology than simply streamlining the ability to buy stuff. Convenience is wonderful, and I make use of lots of modern technological improvements in that regard, but I'm skeptical as to whether any of it has really made a measurable improvement on my standard of living.

https://medium.com/the-mission/silicon-valley-has-a-problem-... is largely what I mean. I don't blame the tech industry, necessarily. It's society as a whole that seems to have developed such a warped set of values, and I have no idea what to do about it.



I used to work with a number of LISP machine believers at the MIT AI Lab/CSAIL. They all had more modern computers for day to day tasks, but used the lispm for most of their programming. This wasn't that long ago (I left in 2010), and I suspect that those machines will remain in active use for as long as people can keep them running.

They all believed that the demise of the lisp machine was a serious loss to society and were very much saddened by it. I never used the system enough to come to my own conclusions in that regard, but it was interesting food for thought. As somebody for whom Linux/POSIX is very deeply entrenched, would I even recognize a truly superior system if it were dropped in my lap? More importantly, would society in general? The superior technology is rarely the "winner".


It was by far the most productive programming environment I have ever used. The level of integration of the editor, debugger, IO system, and interpreted and compiled code is unparalleled. Interestingly it philosophically descended from MACLISP development on a machine (PDP-10) that was designed with Lisp in mind and that had an O/S (ITS) whose "shell" was a debugger, so you could also do pretty tightly coupled development with EMACS (in TECO) and your code in a mix of interpreted and compiled Lisp. In theory this deep level of integration need not be Lisp-specific, but I haven't seen it that often.

The closest I've used were the three environments at PARC when I was there: Smalltalk, Mesa/Cedar and Interlisp-D. When I use Xcode or Eclipse I feel removed from the machine. In these other environments I felt simultaneously able to think at a higher level and yet more tightly coupled to the hardware.

I've used various GNU Emacs modes and the coupling between them and the runtime environment is not tight enough. Today I use SLIME+SBCL and it's OK. It too lacks the tight coupling of the lispm. However for production we'll end up re-coding in C++ for performance.

PS: A good friend of mine scorns the lispm-style of development as "programming by successive approximation." There's some truth in that.


"Programming by successive approximation" is hardly scorn worthy in my mind.


I prefer to think of it as "rapid prototyping".


I too am left wondering, what is the alternative?


Well, with exploratory programming you tend to build up a couple of data structures, add some functions to manipulate them, and then extend out from there. In a more structured approach you think up front about a lot more things. In the second you probably doodle some stuff out on the whiteboard or on paper before coding; in the first you probably simply start with an empty buffer and type straight into the REPL. There's a time and place for each, and certainly no well-defined dividing line between the two (except in some highly structured UML/TDD processes which may not even exist any more outside aerospace).

For example, I talked about the Lisp implementation of the code I'm working on: for deployment it looks like it'll be an implementation in C++, based on what we end up learning about performance; one informed by the implementation decisions that were particularly good or particularly bad, or that we iterated on several times before settling on something good; and one that handles memory management more directly.


> programming by successive approximation

That's called 'Agile' nowadays...


How was using Mesa/Cedar?

From my Native Oberon experience and having devoured all Xerox PARC related papers, I imagine it was a great experience.

But having used the real thing is quite different assessment.


It was fun although at that point in my life I was not comfortable with statically typed languages. So it was good for me as well.

I really just experimented in it and the (more welcoming to me) Smalltalk environment. I used InterLisp-D as my "day job" language (actually we implemented 3-Lisp in it, with some custom microcode).

BTW there was a good paper from the Mesa group which I can't find online (my copy must be buried in a box someplace) comparing the performance of counted strings vs delimited strings (e.g. [3, 'f', 'o', 'o'] vs ['f', 'o', 'o', \0] in C syntax). According to the paper the counted strings were much faster. All three languages (Smalltalk, Mesa and Lisp) used counted strings.
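The tradeoff that paper measured can be illustrated (in Python rather than Mesa, so this only shows the asymptotic difference, not the original measurements): a counted string carries its length, so reading it is O(1), while a terminated string has to be scanned.

```python
# Counted vs. NUL-terminated strings, sketched in Python. The helper
# name c_strlen is made up for illustration.

def c_strlen(buf: bytes) -> int:
    """Length of a NUL-terminated string: an O(n) scan for the terminator."""
    n = 0
    while buf[n] != 0:
        n += 1
    return n

counted = b"foo"         # Python bytes store a length: len() is O(1)
terminated = b"foo\x00"  # C-style: the length must be recomputed by scanning

print(len(counted), c_strlen(terminated))  # 3 3
```

Every strlen, concatenation, and bounds check pays that scan in the terminated scheme, which is consistent with the paper finding counted strings much faster.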


I think that is the one discussing rope structures.

Thanks for the feedback.

Nowadays I use .NET and Java environments as an "almost like" experience of what mainstream computing could have looked like.

But when I see companies like Apple releasing playgrounds, Oracle adding a REPL and the edit/continue and REPL in .NET, there is some hope left.


> I think that is the one discussing about ropes structures.

No, this was simply counted vs terminated strings. I would like to find that paper and revisit the data again to see if it is still true.


Emacs + Geiser + Scheme =)


> The superior technology is rarely the "winner"

True dat. Unfortunately, as Theo de Raadt once said:

"In some industry markets, high quality can be tied to making more money, but I am sure by now all of us know the computer industry is not like that."

I think the only thing we can do is follow the trend. This whole industry is not perfect anyway.


We can do more than that. If everybody followed the trend, then Lisp would be dead, and we'd all just write Java.

shivers


Thankfully Rich Hickey & co wrote Clojure so we can program on a modern Lisp in the Java Virtual Machine and in the browser! (ClojureScript) (even the .NET CLR is supported)

[1] http://clojure.org

[2] http://clojurescript.org


Though it kind of sucks that Tail Call Elimination is such a difficult task on the JVM.

Scheme kind of gets you thinking in a way that works iteratively, but gets expressed recursively. It's easy to read, and can make for some great optimisation without being premature.

The JVM does not really support this style of programming - despite LISP's syntax leading towards it.
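CPython happens to share the JVM's lack of tail-call elimination, so both the problem and the mechanical rewrite it forces are easy to sketch (function names here are made up):

```python
def countdown(n):
    # Tail-recursive in form, but each call still pushes a stack frame
    # on a runtime without tail-call elimination.
    if n == 0:
        return "done"
    return countdown(n - 1)

try:
    countdown(100_000)
except RecursionError:
    print("stack blown")  # what happens without TCO

def countdown_loop(n):
    # The rewrite a TCO-less runtime forces on you: the tail call
    # becomes an assignment and a jump (cf. Clojure's loop/recur).
    while n != 0:
        n -= 1
    return "done"

print(countdown_loop(100_000))  # done
```

With TCO the first version would run in constant stack space and the second would be unnecessary.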


Ironically, the Lisp Machines didn't have TCO either.


I wouldn't say that it was appropriate there, either.

I really wish I could get my hands on a LISP Machine.

I love the idea of LISP being so close to the metal, but that power means some design tradeoffs.

Scheme makes sense with TCE.

I'm not sure InterLISP and the like need it - memory is limited and you work without so many of the system overheads I'm used to with modern systems. Iteration is less costly here, and that makes it easier to not need recursive design. (Which is more expensive).

In simple terms:

Why worry about blowing up a stack you don't need, when you've thought long and hard before you allocated it?


One of their great failings. I'm a Scheme user, so I feel a bit weird without TCO.


The designers argued that a million-line software system (the OS + basic applications) was easier to debug/develop without having TCO everywhere. It makes stack traces useless, unless one thinks of clever ways to keep tail calls recorded, which makes things complex/tricky. The basic machine architecture is a stack machine with compact and nice instructions. Stack traces were useful then. The compiler also was not very sophisticated when it comes to optimizations.


I've long thought the right thing for Common Lisp would be to provide a way to declare sets of functions such that any tail call from one member of the set to another would be TCO'd; all other calls would push stack. This lets you write sets of mutually tail-recursive routines without forcing TCO on the entire system.
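A trampoline gives a rough feel for those semantics, sketched in Python rather than Common Lisp. The declaration mechanism itself is the hypothetical part; here, membership in the set is implicit in which functions return thunks:

```python
# Sketch: tail calls *within* the set {is_even, is_odd} are flattened by
# a driver loop, while ordinary calls from outside the set push stack as
# usual. Function names are made up for illustration.

def is_even(n):
    return True if n == 0 else (lambda: is_odd(n - 1))

def is_odd(n):
    return False if n == 0 else (lambda: is_even(n - 1))

def run(thunk_or_value):
    # Driver: bounces between members of the set in constant stack space.
    while callable(thunk_or_value):
        thunk_or_value = thunk_or_value()
    return thunk_or_value

print(run(is_even(1_000_001)))  # False
```

The mutual recursion goes a million calls deep without growing the stack, while any call into `run` from elsewhere behaves like a normal, stack-pushing call, which is the selective behavior the proposed declaration would provide.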


It's an entirely reasonable tradeoff, which I understand. But not having TCO is a strong negative from the language perspective, IMHO. Not an impossibly bad one, but it's really nice.


Funny enough, it seems JavaScript will get TCO before Java. JavaScript, for f*ck's sake! This will provide even more argumentative power to the people who claim JavaScript is 'Scheme in C clothes'.


recur in Clojure is also easy to read, in my opinion.


True.

However, a named let in Scheme is not a loop. It's still a lambda.

Which means you can have nested lets with TCE, and you can construct them on the fly, depending on what your needs are, using patterns like currying.

That flexibility just doesn't work on the JVM. You can force it, but it'll be slow and horrible compared to another pattern - and I don't think a LISP should tell me how to do something. That's Python's ideology.


It pops up every now and then in Java Language Summit presentations, usually in comparison with .NET, which does support it.

So who knows, maybe some day the JVM finally gets it.


Well the .NET CLR was designed by a lisper.


Yeah, it's super nice.


Not a super fan of Clojure, but it's a good language. Shame it's still stuck in the Java ecosystem, though.


I believe the whole point was to target the JVM, because of reuse and maturity. And actually it's not stuck in just the Java ecosystem; Clojure got wings years ago. You can find it inside a browser today too :)


I wonder if there has ever been talk of a native Clojure? I guess it may not be very usable without the JVM ecosystem, though. Frankly, I find JVM library calls from Clojure to be quite ugly; they really stand out in the code (mostly because of the mix of the lower-case-dash-delimited variable and fn naming convention of Clojure and the mixed-case/camelCase naming style of Java).


Yes, there are a few aborted attempts.

The problem with all languages that decide to implement their own runtime, instead of building on top of the JVM or .NET ecosystems, is that their native code generation and GC implementation are always going to be worse.

Also there is the issue of having to implement the whole set of third party libraries from scratch, just like PyPy and JRuby have issues using libraries that rely on CPython or Ruby FFI.

So unless you get a set of developers really committed to go through the efforts of making it succeed, everyone will ignore it.


The closest thing to a native Clojure is Pixie[1]. As the authors note, it's a "Clojure inspired lisp", not a "Clojure Dialect".

>Pixie implements its own virtual machine. It does not run on the JVM, CLR or Python VM. It implements its own bytecode, has its own GC and JIT. And it's small. Currently the interpreter, JIT, GC, and stdlib clock in at about 10.3MB once compiled down to an executable.

[1] https://github.com/pixie-lang/pixie


Yeah, but for many of us the Java ecosystem is a feature.

You only get tooling comparable to Visual VM or Mission Control in commercial Common Lisps.

Also I think many that bash Java don't realise it is the only language ecosystem that matches C and C++ in availability across OSes, including many embedded ones.


I realize, I just can't stand the ecosystem. Everything is super verbose, frustratingly overcomplicated, and full of XML. Yuck.


It is a consequence of being an enterprise language.

I imagine you never had the pleasure of doing enterprise distributed computing projects via CORBA, DCOM, SUN-RPC, DCE in C, C++, Visual Basic and Smalltalk.

Guess where those enterprise architects moved on.

EDIT: Should have mentioned Delphi and Objective-C as well.


I didn't, it sounds unpleasant, I can guess, and I don't like it, which is why I'm not a fan of Java.


Even though I prefer static typing to dynamic typing, Clojure is a pleasure to do some coding for fun while traveling.

It would be nicer if the performance was better, though; many who only know Java via Clojure don't realize the performance impact Clojure adds.

At least 1.8 did bring some improvements in that direction.


I used to think likewise (you can check my previous comments). But at the end of the day, you gotta pay the bills...

Nonetheless, I have huge respect for CL. I think I'm gonna learn it well once and for all. (maybe use it for back-end).


True 'nuff.

I prefer Scheme to CL. The tooling isn't as developed, but Chicken and Guile are both plenty usable, and Scheme is less crufty than CL.


Chicken plus some of the extensions they provide is more than enough for most people! Great project.


Indeed. It's actually my preferred language for writing code. It may not have as many libraries, and it might not be as mature as, say, CL, but it's just so pleasant to program in.


The blub paradox would indicate that you wouldn't recognize it.

Richard P. Gabriel's famous "Lisp: The Good News, The Bad News, And How To Win Big" (aka "Worse is Better") discusses this very thing. I'd recommend reading it.

But frankly, it's hard to say that an environment is objectively better. What one person may view as a step up, another may view as a step down, and we all have a kneejerk reaction to unfamiliar environments. The lispm is a really nice environment, to be sure, but I'll likely never know if it's better. Emacs will have to be good enough (which it certainly is).


There are still significant things on the list below you can't do with Linux, C, etc. So, yeah, I'd say a modern version of Genera would give you a worthwhile experience.

http://www.symbolics-dks.com/Genera-why-1.htm


At a conference (IIRC it was LISA in 2000) somebody had posted a joke to the bulletin board. It read: "Lost: one version control system, never used. If found, please return to Linus Torvalds"

Version control worked by posting your patches to the mailing list. Linus would apply your patches to his tree, which he'd occasionally tar up and upload to the mirrors.

