
I feel like I'm nitpicking, but I couldn't let it go. Yes, it's already been mentioned once, but I feel the argument offered by confuzatron was lacking. I'm referring to your characterization of LINQ in C# as something inextricably linked to the language, how you can't even begin to separate it from the language.

Right off the bat, the most glaring problem with that statement is that LINQ is not part of C# in any way, shape, or form: since all C# code is compiled to bytecode, it would be literally impossible for C# to have LINQ without it also being available to the rest of the languages supported by .NET. The funny part is that the expressive form of LINQ that reads like a sentence is not even fully supported by C#! Only VB.NET has fully implemented LINQ's expressiveness.

There are many more reasons that argument is wrong, but I feel just pointing out that one above shows how far off it is.

In my experience it's almost always best to shy away from hyping your idea of better by knocking the competition. Your article stands well on its own, and C# has so many obvious flaws that there's no need to add to them. Stick to what's good about X, not what's crappy about alternative Y.

Again, all in all an interesting read, since I'm not familiar with Clojure, but I just had to nitpick.


Right off the bat, the most glaring problem with that statement is that LINQ is not part of C# in any way, shape, or form: since all C# code is compiled to bytecode, it would be literally impossible for C# to have LINQ without it also being available to the rest of the languages supported by .NET.

There's no such thing as "LINQ bytecode". LINQ is just syntactic sugar that can coexist with vanilla C# (or VB.NET, or whatever) because Microsoft decided to modify its compiler and IDE to allow it. It is fundamentally impossible for you or me to make a similar change, unless we're willing to eschew the Microsoft toolchain.
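To make the "sugar" point concrete, here's a minimal sketch (my own toy example, not from the article) of the rewrite the compiler performs:

    using System.Linq;

    var cities = new[] { "London", "Lisbon", "Madrid" };

    // Query syntax...
    var q1 = from c in cities
             where c.StartsWith("L")
             select c;

    // ...is mechanically rewritten by the compiler into method calls:
    var q2 = cities.Where(c => c.StartsWith("L"));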

In Clojure, you do not have the same limitation. That's the only point the article was making, and it's completely correct in that respect.


see my reply to nathanmarz


Right off the bat, the most glaring problem with that statement is that LINQ is not part of C#

Parts of it are. Query expression keywords[1] are defined in the C# 3.0 Language Specification (see 7.15 Query expressions)[2].

[1] http://msdn.microsoft.com/en-us/library/bb310804.aspx

[2] http://www.microsoft.com/downloads/details.aspx?FamilyID=dfb...


I'm not bashing C#. I'm just saying that you couldn't define LINQ in vanilla C#. It had to be implemented as part of the compiler.

In Clojure, you can create embedded languages without modifying the compiler. That's all I'm saying.


I can appreciate and understand that you didn't mean to come off as combative, but to me it did come off that way, and to others too, since I'm not the only one pointing this out.

Can you name the compiler change that enabled LINQ without looking it up? Can you define LINQ, what it basically is? It feels like if you could, you wouldn't make such a statement, since all LINQ is at its core is an iteration engine. It's literally just methods you dump your collection into, plus an anonymous method on top, and vrooom goes the engine, applying the method to each item. The pretty syntax form you usually see is not actually LINQ the framework, and it leads to significantly more problems than it solves; it's basically just for PR purposes.
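To illustrate what I mean, here's roughly the shape of that engine's core, sketched from memory (illustrative only, not the actual framework source):

    using System;
    using System.Collections.Generic;

    static class MyLinq
    {
        // The whole "engine": dump a collection in, hand it an anonymous
        // method on top, and it lazily applies the method to each item.
        public static IEnumerable<T> Where<T>(
            this IEnumerable<T> source, Func<T, bool> predicate)
        {
            foreach (var item in source)
                if (predicate(item))
                    yield return item;
        }
    }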

Edit: This feels like it might get out of hand and spin into a good old nitpicking programmers' war. Reading my post again, it came out way more confrontational than I meant, and I didn't mean any insult by it. My main contention is that the LINQ syntax is often confused for the LINQ framework; they are not even in the same state, let alone the same ballpark. Plus, the argument can be made (and I would agree to a large extent) that it wasn't the compiler that was modified to allow LINQ; it was LINQ that was waiting in the wings for the compiler team to implement features they had planned quite some time before. However, I am also being a stickler and stubborn. I nitpicked when, even from my point of view, it wasn't such a large error; it was the way it disrupted the flow of a pretty good article I was getting into, that's what made it stick out for me.


I'm not the OP, but here are some compiler features missing from C# 2.0 whose absence would have prevented a LINQ-type library: lambda expressions, implicitly typed locals (var), and anonymous types.

None of these required modification to the runtime; they were purely compiler features. And, as pointed out, they were features only Microsoft had the ability to implement.
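For example, all three show up in one typical query; a toy sketch of mine, not anything from the spec:

    using System;
    using System.Linq;

    class Demo
    {
        static void Main()
        {
            var orders = new[] {              // implicitly typed local (var)
                new { Id = 1, Total = 250m }, // anonymous type
                new { Id = 2, Total = 40m },
            };

            var big = orders
                .Where(o => o.Total > 100)    // lambda expression
                .Select(o => new { o.Id, o.Total });

            foreach (var o in big) Console.WriteLine(o);
        }
    }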


Strictly speaking, none of those features were necessary to implement LINQ. Anonymous functions were already possible in C# 2.0 with the ugly delegate syntax. Type inference saves a lot of keystrokes, and anonymous types save big families of generic tuple types, but you could do without -- hell, people write functional Java.
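For instance, here's the same filter both ways; a sketch only, and note that I'm calling the 3.5-era Enumerable.Where in both cases purely to contrast the two syntaxes:

    using System;
    using System.Collections.Generic;
    using System.Linq;

    class Demo
    {
        static void Main()
        {
            string[] cities = { "London", "Lisbon", "Madrid" };

            // C# 2.0 style: anonymous method via the delegate keyword
            IEnumerable<string> q1 = cities.Where(
                delegate(string s) { return s.StartsWith("L"); });

            // C# 3.0 style: the same predicate as a lambda
            IEnumerable<string> q2 = cities.Where(s => s.StartsWith("L"));

            Console.WriteLine(string.Join(", ", q1)); // London, Lisbon
        }
    }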

(A better comparison would be to look at LINQ-to-SQL specifically; that would not be possible in any sense without the expression tree libraries and compiler support introduced in C# 3.0, since there was no way to "quote" a C# expression and look into it. That's much closer to the mark here.)

However, I agree with you in spirit. A big enough quantitative difference becomes a qualitative one; nobody would actually want to use LINQ+C#2. And likewise, few people want to write small-scale DSLs in C#.


Well, your argument is tiptoeing down a very narrow path based on the definition of "necessary". Here's what Eric Lippert has to say about which features were necessary for LINQ: http://blogs.msdn.com/b/ericlippert/archive/2009/10/05/why-n...


Yes, you're right. Expression trees are especially interesting from an integrated DSL perspective, since you can do ridiculous things like turn lambda expressions into a syntax for hashes and so on. I omitted them because I believe they really did require a runtime modification to support.

But I would argue that since DSLs are all about affordances, it's not sufficient to say that similar functionality would be "possible". If the new approach doesn't represent significant semantic compression (which your hypothetical LINQ+C#2 would not), no one will use it. By that measure, the syntactic sugar added to C# 3.0 was absolutely a necessary precondition for LINQ.


LinqBridge lets LINQ run on .NET 2.0. The description of how it works might provide some insight:

http://www.albahari.com/nutshell/linqbridge.aspx


That requires the C# 3.0 compiler to work, which was my point: LINQ is largely a compile-time feature.


No, the features that were used to build LINQ were added to C#. LINQ was then built using them, and those same features are available to you to build anything similar.


I'm obviously just referring to the syntax added to C# to support LINQ.

I'm not even criticizing LINQ/C#; I'm just using it as a point of comparison to help the reader understand Clojure concepts. How that comes across as combative I don't understand.


I'm afraid we will have to agree to disagree; I think we've both stated our cases to the extent they can be clearly stated. I wanted to edit my previous post to take out the combativeness, but responses had been put up and it would look like a cop-out if I did that. I banged it out without double-checking it for overall tone, but again, I do appreciate the article and overall found it a good read.


I think what Nathan is referring to is the SQL-like syntax for LINQ, which really couldn't be implemented in C#, as C# doesn't have any syntax extension features.

However, expression-based LINQ is implemented in C#. That's possible because of several language features added to C# 3.0, notably expression trees (code as data) and anonymous functions (lambda). Those features make C# expressive enough to do things like LINQ in C#, though in a somewhat clumsier way than Clojure does it.

For example, if you write cities.Where(s => s.StartsWith("L")), that "s => " is a lambda expression, but if the Where method in play takes an expression tree (as IQueryable's Where does), the expression is turned into a data structure rather than executable code. This is similar (not identical!) to Clojure recognizing that you're calling a macro rather than a function and letting you see code as data.
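A quick way to see that switch in action (my own minimal example, not from the article):

    using System;
    using System.Linq.Expressions;

    class Demo
    {
        static void Main()
        {
            // The same lambda text yields two very different things
            // depending on the declared type:
            Func<string, bool> asCode = s => s.StartsWith("L");
            Expression<Func<string, bool>> asData = s => s.StartsWith("L");

            Console.WriteLine(asCode("London")); // True -- runs the code
            Console.WriteLine(asData.Body);      // s.StartsWith("L") -- inspects the data
        }
    }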


That's why it felt like a cheap detour to me. To me it read like LINQ is completely unchangeable and you're stuck with whatever MS/the C# language dictates it is, but almost every single component of it can be swapped out with your own implementations and overloads, so that in practice you can do much of what is implied to be unachievable. It's been so long that I've lost some of my grasp on it, but from what I recall you can swap out even the core methods that the C# sentence-type syntax ends up being converted to (rough sketch below); is that not really close to what was highlighted in the article as impossible? With expression trees added on top to build out queries with decision trees at runtime, you can make it work against whatever kind of datastore you're interacting with. In the end I felt it's not nearly as immovable as portrayed in the article, and at the time I felt like the article was past its prime since it had been an hour with few comments; though I'm kinda wishing I'd kept my mouth shut now that there's been an invasion of comments.
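Roughly what I mean, reconstructed from memory with made-up names: if your type exposes its own Where, the sentence-type syntax compiles against it instead of the standard one:

    using System;

    class Logged<T>
    {
        readonly T[] items;
        public Logged(params T[] items) { this.items = items; }

        // Because this Where is in scope, "from c in data where ..."
        // compiles against it instead of the standard Enumerable.Where.
        public Logged<T> Where(Func<T, bool> pred)
        {
            Console.WriteLine("custom Where called");
            return new Logged<T>(Array.FindAll(items, x => pred(x)));
        }
    }

    class Demo
    {
        static void Main()
        {
            var data = new Logged<string>("London", "Lisbon", "Madrid");
            var q = from c in data where c.StartsWith("L") select c;
        }
    }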

To start splitting hairs about the exact definition of what was written in the article seems to miss the point to me; the general flow and feel was dismissive of its ability. Is it clumsier? From what I can tell so far, yes, it's clumsier, but it's not powerless and immovable, which is how it came off. Maybe I'm just sensitive, though, or maybe I'm insensitive in how I portrayed my argument, but I really did just mean my original comment as constructive criticism.


Was going to delete this item, but maybe someone will find this useful. I assumed JS was active since the query string contained the variable "post_form_id_source=AsyncRequest", which to me seemed like an obvious nod to the now-ubiquitous XMLHttpRequest being used. While it doesn't prove conclusively that there wasn't JS running, it does make it plausible that it was purely an HTTP connection kept alive by the X-Cnoection header. I'm still fuzzy as to how this works or how it's done, but I'll post a quick excerpt of what I found after some searching:

"Missed Cneonctions

This header:

Cneonction: close and its variant:

nnCoection: close were two of the headers which first spurred my interest in HTTP headers.

imdb.com, amazon.com, gamespy.com, and google.com have all at various times used these or similar misspellings of connection, and I’m not by any means the first to have noticed. My first thought was that this was just a typo. After more consideration, however, I now believe this is something done by a hackish hardware load balancer trying to “remove” the connection close header when proxying for an internal server. That way, the connection can be held open and images can be transmitted through the same TCP connection, while the backend web server doesn’t need to be modified at all. It just closes the connection and moves on to the next request. Ex-coworker and Mudd alumus jra has a similar analysis."

source: http://www.nextthing.org/archives/2005/08/07/fun-with-http-h...
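If I understand the trick, the scrambled spelling has the same byte length as "Connection", so a balancer can patch the raw bytes in place without re-framing the response. Here's a speculative sketch of that idea (mine, not from the linked article):

    using System;
    using System.Text;

    class HeaderMangler
    {
        static void Main()
        {
            byte[] response = Encoding.ASCII.GetBytes(
                "HTTP/1.1 200 OK\r\nConnection: close\r\nContent-Length: 0\r\n\r\n");

            byte[] needle = Encoding.ASCII.GetBytes("Connection:");
            byte[] patch  = Encoding.ASCII.GetBytes("Cneonction:"); // same length!

            // Overwrite the header name in place; clients no longer recognize
            // it, the connection stays open, and no byte offsets shift.
            int i = IndexOf(response, needle);
            if (i >= 0) Buffer.BlockCopy(patch, 0, response, i, patch.Length);

            Console.WriteLine(Encoding.ASCII.GetString(response));
        }

        static int IndexOf(byte[] haystack, byte[] needle)
        {
            for (int i = 0; i <= haystack.Length - needle.Length; i++)
            {
                int j = 0;
                while (j < needle.Length && haystack[i + j] == needle[j]) j++;
                if (j == needle.Length) return i;
            }
            return -1;
        }
    }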


You can see it in the logs, but I'll point it out specifically: the origination point seems to be money.cnn.com, which I was reading at one point. I do not visit Facebook directly or have a Facebook account, if it's relevant.


Title of this item should be changed to "Eternal September has arrived."


You're pretty much taking the evolution of email spam and going backwards. Unless you can point to a specific reason Twitter is different from the general evolution of spam on every network up till now, I can't see how that's right.

Given a million people sending out 10 spams each, there's bound to be a very similar way they carry it out. However, 10 dedicated spammers churning out the maximum capacity they can achieve will put up a fight no matter how you try to stop them.

Email spam originated with people like hood, generally not doing much damage and being very easy to block and shut down. It eventually led to the current situation where the vast majority of spam originates from a handful of spammers. So did comment spamming and early social network spamming; they all pretty much followed this pattern.

In my view you're letting yourself get wrapped up in the minutiae of the argument. What hood did was wrong, and no amount of justification will change the fact that he externalized the cost of commercial advertising. However, none of us can look back at our lives and say we didn't have such morally questionable moments. We generally react as hood has hopefully done and don't make it a career; assholes keep going even when they only make $1 for every $100 they externalize.


"Externalized the cost of commercial advertising." That's a very succinct way to put it, thanks.

But I'm not sure how you conclude that that was happening. People who clicked on the affiliate links only paid him off if they made a purchase. If no purchase was made, no money changed hands between any players. Right? An affiliate link is not like an ad.

In fact, even if an affiliate link were like an ad, there's no data showing that the users clicking on the link are poorly targeted, or that the CTR would significantly suffer.


He used Twitter's resources against its T&Cs, costing it money, to make himself money.

"Externalized the cost of commercial advertising" is a perfect way to describe what happened.


I can still remember the first time I put together a VMware VirtualCenter server with a few servers running the hypervisor. It was the first time I encountered software that left me speechless; it felt like it was built with magic, since at the time I simply couldn't comprehend how it could be done.

I ended up quitting the company I built it at because they didn't see the point of virtualization and weren't going to implement it.


Indeed. Generally speaking, virtualization adoption hasn't set any speed records. But slow and steady wins the race, as they say. :)


Did you read the article in full? I know it's a rude question to ask, but it really seems like you didn't, because you're falling prey to exactly the type of thinking it highlights as the problem. The solution doesn't "feel" right, and we humans are without peer in justifying our discomfort avoidance in cloaks of righteousness, truth, etc. That the correct solution just happens to coincide with the solution that avoids self-doubting discomfort is serendipity. That this serendipity is so rarely noticed, let alone questioned: doesn't that seem odd?

Take, for instance, healthcare and its spiraling costs. A doctor ordering what some would call excessive tests for a patient is doing the right thing; it is better to be safe than sorry, after all. The trial lawyers, insurance companies, government handouts, and so on are the reason it's skyrocketing.

What about the human fault of judging one's own immoral actions and improprieties on a relative scale against those around us? What about compounding that by generally giving everyone an "I am a good person" foundational belief? Examining it, even with no intention of challenging it, will cause most people to feel a tangible unease, because they're treading on dangerous ground: a sense of fear that it's best not to mess around in this place in case you break something accidentally. Each human with an intact "I am good" core applies goodness to all their actions by default, meaning that if a person is not truly judging their motives or reasoning, their actions are deemed "good" and their motives pure, because they are good and nothing specifically shows they're doing "bad" at that moment.

When the city with one of the fastest-rising health costs in the US was examined, it was found that the rising costs had a partner along for the trip: medical procedures and tests were rising as well. The doctors were doing nothing illegal, and obviously none of them thought they were doing anything wrong. So why the rise? It was just doctors increasingly exploring that grey area that doesn't challenge your central belief in your own goodness. A patient complains of headaches and worries about something they read online about brain tumors; the doctor thinks it's unlikely, but it's better safe than sorry, and imagine if he does have a brain tumor: not only would I have missed it, I might be sued too. Repeat this process over and over and the goalposts move; more behavior becomes acceptable, such as "OK, so maybe she didn't need hip replacement surgery right now, but she definitely would have needed it soon, and I'm a far cry from Doctor EvilCompetitor; I can't believe he talked that poor schmuck into allowing brain surgery!"

Consider the complicated medical conditions that the homeless contract on the streets, illness on top of illness, requiring week-long ICU visits; these become drastically reduced if a person is living in a home. A single avoided hospital visit of that kind alone can justify the free apartment for a year or two or three.

I'm trying to find that article again; I'll post a link when I find it.


> Did you read the article in full?

My point is that the article doesn't contain all relevant data.

> A doctor ordering what some would call excessive tests for a patient is doing the right thing.

May be doing the right thing. Some people think that my life is worth $X. Who's to say that they're correct?

> Consider the complicated medical conditions that the homeless contract on the streets, illness on top of illness, requiring week-long ICU visits; these become drastically reduced if a person is living in a home.

Assumes behavior not in evidence. The homeless in SF have "not street" options that they refuse. What makes you think that they'll take different options AND that their risks will change correspondingly?

I note that lots of folks with homes manage to get "street illnesses" so it's not true that homes solve disease. There's a decent correlation for current "homed" populations, but that doesn't tell us what would happen if we "homed" other populations.


It doesn't seem right to state "He's pretty unclear on economics, yes." and then go on to make an unqualified statement of amazement that the Soviets were able to maintain some level of parity as long as they did.

Like it or not, Russia defeated the bulwark of the German war machine. There is no question they suffered the brunt of a German military force that would have easily overwhelmed the other Allied nations, possibly even the US. Russia did so using that inefficient system to allocate resources.

They did not do this by simply chucking their soldiers at the Germans with the threat of being shot by a People's Party officer if they tried to run back, an often-repeated myth of why Russia was able to halt and then pursue the Germans back to Berlin.


I am fully aware of the quality of Russian military equipment and the military during WW2 (they had the best mass-produced tank in the war and one of the best air-support fighters, and they had much simplified logistics vs. the US). However, after Stalin purged most of the officers, the damage of WW2 to the industrial base, and then the results of the planned economy (shrinking the overall size of the economy), it's pretty amazing they had the resources to maintain near-parity with the US for ~10-20 years. They only really achieved this by devoting a much larger fraction of their industrial base to the military than was done in the West.

Also, any discussion of Russia during WW2 needs to include Simo Hayha ("White Death"), just because he's the most amazing sniper in the history of the world.


Are you disillusioned with the current status of programming/IT/tech in general?

I can't say if I'm reading this right, but it really sounds like you're suffering a crisis of confidence in your decision to spend your time/career in tech (tech just being a broad catch-all). I might be making a leap, but to me your tone of frustration and harshness rings familiar, and it's an open secret that a lot of us hit this brick wall at one point or another.


For what it's worth, in my experience in enterprise administration for a 10,000+ computer network at one company and a 4,000+ computer network at another, computer crime is _never_ reported, rarely receives even a cursory investigation, and a full investigation is forbidden by policy.

I define computer crime as a successful network intrusion where an attacker gains access to the internal network, which occurs at a frighteningly high rate.

