mungoman2's comments

No, the tool rtl_433 repackages the payload data as JSON for easier downstream consumption.
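
For example, the output can be piped straight into a small script (a minimal sketch; the field names are illustrative and vary by sensor):

    # usage: rtl_433 -F json | python consume.py
    import sys, json

    for line in sys.stdin:
        event = json.loads(line)  # one JSON object per decoded transmission
        # typical fields include "model", "id", "temperature_C" (sensor-dependent)
        print(event.get("model"), event.get("temperature_C"))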


I agree! Seems like very interesting work.


As proposed in the article, calculating the area weights all points in the viewshed equally. I wonder if it makes sense to give more weight to far-away points? Or maybe less weight? Not sure! But it would be interesting to explore and see whether it yields anything.
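
Something like this, perhaps (a toy sketch, not from the article; the weighting exponent is made up):

    # toy comparison: plain viewshed area vs a distance-weighted score
    import numpy as np

    visible = np.random.rand(512, 512) > 0.5      # stand-in visibility mask
    ys, xs = np.indices(visible.shape)
    dist = np.hypot(ys - 256, xs - 256)           # distance from the observer cell

    area = visible.sum()                          # every visible cell counts as 1
    weighted = (visible * dist ** 1.0).sum()      # weight far points more; the exponent is a free knob
    print(area, weighted)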


I'm the author. Well there's actually a term in the kernel to normalise the surface area of each visible point by a factor of tan(1 radian): https://github.com/tombh/total-viewsheds/blob/main/crates/ke...

I don't know if that's related to what you're thinking about?


Wow, with this in place the incentives are enormous for OpenAI to let sponsors pay for a slight nudge in the recommendations one way or the other.

This will replace the current ad economy.


Yeah, it seems obvious that this is how models will be monetized in the future. The free version of ChatGPT will stop being a loss leader for the subscription and start paying for itself with commissions. The vast majority of people will use the free version.

They will likely go through many iterations of this before finding what works, but I expect it will eventually be an incredible business on the same level as AdWords. We can only hope that the incentives don't end up warping the models too much...


> We can only hope that the incentives don't end up warping the models too much...

Oh my sweet summer child. How can you be so optimistic after seeing Google's decline? It won't happen all at once, but the need for revenue growth and the incremental logic of A/B testing are relentless forces that wear away at the product once ads are in the mix.


It's been 27 years since Google Search launched, and I still find it very useful despite some perverse incentives. A pretty good run, I would say. If OpenAI declines that slowly and is then displaced by something better, I'd say it was a good outcome.


Google search declined slowly and then all at once.


I hope it takes as long for AI enshittification, but the tech "cycle" has become shorter and the pressure to make revenue is much more intense than it was for Google.


I don't really get the comments about unreasonable labor here and in TFA. It's what, 16 x 16 = 256 SIM cards per machine. At 10 seconds per card, that's roughly 43 minutes, call it an hour, to fill one machine. At 8 hours per day, one person could fill about 40 such machines with SIM cards in a week.


Yeah, I don't think labor is a problem here, since the people willing to make this happen somehow also found 1) the money to buy these racks, 2) many multiples of 256 SIM cards, and 3) a few locations in NYC.


Also, from the article:

> The exact devices [..] are sold for an eye-watering $3,730.

That seems just a tad hyperbolic.


I'm interested, as I have a basement full of stuff in questionable order.

Could you explain a bit more? If a box in the basement is marked with 4 emoticons, how does that help you understand its contents, context, and history?


I can imagine that a QR generator for "obsidian://" links, which would open tagged locations, could be very useful for identifying the contents of boxes without opening them.

If the codes are written by hand, then typing them into the UI and manually searching for them could be tedious.
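
A rough sketch with the Python qrcode package (the vault and note names are made up):

    # print a QR code per box that deep-links into Obsidian
    import qrcode

    uri = "obsidian://open?vault=Basement&file=Box%2042"  # hypothetical vault/note
    qrcode.make(uri).save("box-42.png")                   # tape the printout to the box

Scanning that with a phone would then jump straight to the note for that box.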

---

Emojis or random images on boxes could be used to quickly and visually find the right box in a sea of identical gray boxes.


Wouldn't a slower tick make it easier, since you get more wall time to do the same challenge?


No? Wall time (which the challenge runs on) is unchanged, while game time (vsync) runs at 83% of full speed (50 Hz vs 60 Hz). So if something tied to frame rate (animation, walking speed, etc.) takes 1 second on NTSC, it'll take 1.2 seconds on PAL.


For story writing, you generate into the framework of https://news.ycombinator.com/item?id=45134144


This is extremely cool! Is this the first type of tool that is genuinely enabled by AI?

Can we distill the story into something to feed to Z3 to prove there are no plot holes?
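
A toy sketch of what that could look like with the Python z3-solver bindings (the "facts" here are invented; real plot-hole checking would need much richer modelling):

    from z3 import Int, Solver, sat

    s = Solver()
    # locations as integers: 0 = Paris, 1 = London
    alice_ch1 = Int("alice_ch1")   # where Alice is in chapter 1
    alice_ch2 = Int("alice_ch2")   # where Alice is in chapter 2

    s.add(alice_ch1 == 0)          # chapter 1: Alice is in Paris
    s.add(alice_ch2 == 1)          # chapter 2: Alice is in London
    s.add(alice_ch1 == alice_ch2)  # but the text implies she never travels

    # unsat means the extracted facts contradict each other, i.e. a plot hole
    print("consistent" if s.check() == sat else "plot hole")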


Yes, agreed. What we see is simply a clever way to differentiate the customers that can pay a premium from those that can't. The end goal is to extract the maximum amount of money.


Or, equivalently, to enable the largest number of customers to use the product, by decreasing prices for smaller customers and increasing them for large ones.


I would call it obvious instead of clever, but otherwise fully agreed.


And leaves the rest as a security risk

