

Are there similar projects for the Ethereum blockchain? Imagine a censorship resistant Darknet.


Ethereum really isn't great for stuff like this. Check this article:

https://www.yours.org/content/blockchain-computing-on-ethere...


Bitcoin (SV) is also going to make blockchains like Ethereum redundant.

Bitcoin can do all the things people typically go to Ethereum for: smart contracts, tokens, on-chain computation, etc. Bitcoin can do them even better.


You also have hdom & co. coming from the i-am-not-a-framework side - https://github.com/thi-ng/umbrella/

Production bundle size and evaluation times are a real concern with React apps.


What's the canonical way to deduce the types in Rust from JSON? JSON Schemas?


This particular crate, serde_json, lets you create any type that implements Serde's Deserialize trait when you're parsing.


... and it provides a dynamic interface too, if you need to work that way. You probably don't...


I will probably never understand IQ and its purpose for real life problems. It is either me who doesn't get it or all the people out there who lack introspection to understand what exactly it was that enabled them to grasp some new concept.


Perhaps you don't give a f* anymore?


This is anecdotal, but there are important activities that have gotten harder for me in recent years, just as the OP described. It's pretty bold to blame it all on a lack of motivation or commitment.


Even if true, that's more of a description than a solution. Generating new f*s to give is a hard problem.


"If you wish to derive a commercial advantage by not releasing your application under the GPLv3 or any other compatible open source license, you must purchase a non-exclusive commercial SOD license. By purchasing a commercial license, you no longer have to release your application's source code." --


I argue this is even better: anybody who plans to benefit commercially must pass on the fruits of their success to upstream.


There are now more ML-related jobs on Stack Overflow than AngularJS jobs.


This is likely more about the faddishness of employer choices in front-end frameworks.

Overall, web development & front-end jobs dwarf machine learning jobs on Stack Overflow.


~400 (Machine Learning) vs ~900 (Frontend).

Apparently 30% of Web Devs have < 5 years experience. Time to start specializing in some other domain.


I don't get the sentiment of the article either. I can't speak for researchers but software engineers are living through very exciting times.

  State of the art in numbers:
  Image Classification - ~$55, 9hrs (ImageNet)
  Object Detection - ~$40, 6hrs (COCO)
  Machine Translation - ~$40, 6hrs (WMT '14 EN-DE)
  Question Answering - ~$5, 0.8hrs (SQuAD)
  Speech recognition - ~$90, 13hrs (LibriSpeech)
  Language Modeling - ~$490, 74hrs (LM1B)
"If you think Deep (Reinforcement) Learning is going to solve AGI, you are out of luck" --

I don't know. Duplex equipped with a way to minimize its own uncertainties sounds quite scary.


Duplex was impressive but cheap street magic: https://medium.com/@Michael_Spencer/google-duplex-demo-witch...

Microsoft OTOH quietly shipped the equivalent in China last month: https://www.theverge.com/2018/5/22/17379508/microsoft-xiaoic...

Google has lost a lot of steam lately IMO. Facebook is releasing better tools and Microsoft, the company they nearly vanquished a decade ago, is releasing better products. Google does remain the master of its own hype though.


> Microsoft, the company they nearly vanquished a decade ago, is releasing better products.

Google nearly vanquished Microsoft a decade ago? Where can I read more about this bit of history :) ?

IMO, Axios [0] seem to do a better job of criticizing Google's Duplex AI claims, as they repeatedly reached out to their contacts at Google for answers.

0: https://www.axios.com/google-ai-demo-questions-9a57afad-9854...


I think they are overselling Google's contributions a bit. It was more "Web 2.0" that shook Microsoft's dominance in tech. Google was a big curator and pushed the state of the art. Google was built on a large network of commodity hardware, and they were able to do that because of open-source software; Microsoft licensing would have been prohibitive to such innovation. There was some reinforcement that helped Linux gain momentum in other domains like mobile and desktop. Google helped curate "Web 2.0" with developments and acquisitions like Maps and Gmail. When more of your life was spent on the web, the operating system meant less, and that's also why Apple was able to make strides with their platforms. People weren't giving up as much when they switched to Mac as they would have previously.

Microsoft was previously the gatekeeper to almost every interaction with software (roughly 1992 - 2002). I don't know of good books on it but Tim O'Reilly wrote quite a bit about Web 2.0.


My question was actually tongue-in-cheek, which I tried to communicate with the smiley face.

I'm quite familiar with Google's history and would not characterize them as having vanquished Microsoft.

For the most part, Microsoft doesn't need to lose for Google to win (except of course in the realm of web search and office productivity).


You're right, it was Steve Ballmer who nearly vanquished Microsoft at a time when Google was the company to work for in tech and kept doing amazing things. At least IMO.

Unfortunately, by the time of my brief stint at Google, the place was a professional dead-end where most of the hirees got smoke blown up their patooties at orientation about how amazing they were to be accepted into Google, only to be blind allocated into me-too MVPs of stuff they'd read about on TechCrunch. All IMO of course.

That said, I met the early Google Brain team there and I apparently made a sufficiently negative first impression for one of their leaders to hold a grudge against me 6 years later, explaining at last who it was that had blacklisted me there. So at least that mystery is solved.

PS It was pretty obvious these were voice actors in a studio conversing with the AI. That is impressive, but speaking as a former DJ myself, when one has any degree of voice training, one pronounces words without much accent and without slurring them together. Google will likely never admit anything here: they don't have to.

But I will give Alphabet a point for Waymo being the most professionally-responsible self-driving car effort so far. Compare and contrast with Tesla and Uber.


My thoughts on AGI (at least in the sense of being indistinguishable from interaction with a human) are the same as my thoughts on extraterrestrial life: I'll believe it only when I see it (or at least when provided with proof that the mechanism is understood). This extrapolation on a sample size of one is something I don't understand. How is the fact that machine learning can do specific stuff better than humans different in principle than the fact that a hand calculator can do some specific stuff better than humans? On what evidence can we extrapolate from this to AGI?

We haven't found life outside this planet, and we haven't created life in a lab, therefore n=1 for assessing probability of life outside earth (which means we can't calculate a probability for this yet). Likewise, we haven't created anything remotely like animal intelligence (let alone human) and we have no good theory regarding how it works, so n=1 for existing forms of general intelligence.

Note that I'm not saying there can be no extraterrestrial life or that we will never develop AGI, just that I haven't seen any evidence at this point in time that any opinions for or against their possibility are anything more than baseless speculation.


This is what we know from Google about Duplex:

"To train the system in a new domain, we use real-time supervised training. This is comparable to the training practices of many disciplines, where an instructor supervises a student as they are doing their job, providing guidance as needed, and making sure that the task is performed at the instructor’s level of quality. In the Duplex system, experienced operators act as the instructors. By monitoring the system as it makes phone calls in a new domain, they can affect the behavior of the system in real time as needed. This continues until the system performs at the desired quality level, at which point the supervision stops and the system can make calls autonomously." --


If the dollar amounts refer to the training cost for the cheapest DL model, do you have references for them? A group of people at fast.ai trained an ImageNet model for $26, presumably after spending a couple hundred on getting everything just right: http://www.fast.ai/2018/04/30/dawnbench-fastai/


That's what you get with Google TPUs on reference models. The ImageNet numbers are from RiseML; the rest is from here - https://youtu.be/zEOtG-ChmZE?t=1079


Lisp is perfectly suited for this task due to its minimal syntax, which is approachable for non-programmers. Math should be taught in prefix notation if you ask me.
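The uniformity being claimed here is easy to demonstrate: a complete evaluator for Lisp-style prefix arithmetic fits in a few dozen lines, because there is no operator precedence to handle. A hypothetical sketch (not from the thread), in Rust:

```rust
// Evaluate a prefix expression like "(+ 1 (* 2 3))".
// Tokens are split on whitespace after padding the parentheses.
fn eval_prefix(src: &str) -> f64 {
    let padded = src.replace('(', " ( ").replace(')', " ) ");
    let tokens: Vec<&str> = padded.split_whitespace().collect();
    let (value, rest) = eval_tokens(&tokens);
    assert!(rest.is_empty(), "trailing tokens after expression");
    value
}

fn eval_tokens<'a>(tokens: &'a [&'a str]) -> (f64, &'a [&'a str]) {
    match tokens {
        ["(", op, rest @ ..] => {
            // Collect operands until the matching close paren.
            let mut args = Vec::new();
            let mut rem = rest;
            while rem[0] != ")" {
                let (v, r) = eval_tokens(rem);
                args.push(v);
                rem = r;
            }
            let result = match *op {
                "+" => args.iter().sum(),
                "*" => args.iter().product(),
                "-" => args[1..].iter().fold(args[0], |a, b| a - b),
                "/" => args[1..].iter().fold(args[0], |a, b| a / b),
                _ => panic!("unknown operator {}", op),
            };
            (result, &rem[1..]) // skip the ")"
        }
        [num, rest @ ..] => (num.parse().expect("a number"), rest),
        [] => panic!("unexpected end of input"),
    }
}

fn main() {
    // Same operator position everywhere, no precedence rules to memorize.
    println!("{}", eval_prefix("(+ 1 (* 2 3))")); // prints "7"
}
```

Note the grammar also makes variadic operators natural: `(+ 1 2 3 4)` needs no repeated symbol, which is part of the pedagogical argument for prefix notation.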


Familiar? Seriously?!? That syntax is not familiar to non-programmers.

> Math should be taught in prefix notation if you ask me.

Perhaps it should, but it's not.

Or was this whole post sarcasm, and it was done so well that I missed it?


There must be a name for that, when someone casually says something is simple when it is in fact a very complicated subject.

EDIT: After some googling, TIL humblebragging.


Well, hopefully nobody ever asks you!

