felipeerias's comments | Hacker News

Thinking about the relationship between creation and verification is a good way to develop productive workflows with AI tools.

One that works particularly well in my case is test-driven development followed by pair programming (a small sketch of the first step follows the list):

• “given this spec/context/goal/… make test XYZ pass”

• “now that we have a draft solution, is it in the right component? is it efficient? well documented? any corner cases?…”
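To make that first step concrete, here is a minimal sketch of the kind of failing test I would hand to the agent together with the spec. The module name (mytextutils) and the slugify behaviour are invented for illustration, not part of any real project:

    # Hypothetical spec-as-test handed to the agent: "make these pass".
    # mytextutils.slugify does not exist yet; writing it is the task.
    import pytest

    from mytextutils import slugify


    def test_slugify_basic():
        assert slugify("Hello, World!") == "hello-world"


    def test_slugify_collapses_whitespace():
        assert slugify("  many   spaces ") == "many-spaces"


    def test_slugify_rejects_empty_input():
        with pytest.raises(ValueError):
            slugify("")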


Tools can make individuals and teams more effective. This is just as true for LLM-based tools as it was for traditional ones.

The question is not whether one (1) LLM can replace one (1) expert.

Rather, it is how much farther an expert can get through better tooling. In my experience, it can be pretty far indeed.


In this context, time constraints are measured in hours and are very informative regarding the student’s capacity to prioritise, plan and carry out their work under pressure.

It is actually very informative when one person can


In these studies, the qualitative data is often a lot more informative than the quantitative.

Understanding how specific people navigate a domain and noting the common points between them can be illuminating.

Trying to calculate a generalisable statistical result from them… probably not so much.


I share the sentiment but it seems a bit like imposing a human narrative on a Universe that does not seem to care all that much about us. Maybe we really are stuck in the Solar System and space is just too vast to do much about it.


In your example, you could achieve a similar outcome with a skill that included a custom command-line tool and a brief description of how to use it.

MCPs are especially well suited for cases that need a permanent instance running alongside the coding agent, for example to handle authentication or some long-lived service that is too cumbersome to launch every time the tool is called.
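As a rough sketch of what that could look like, here is the sort of small command-line tool a skill could wrap and describe in a few lines; the tool name, arguments and file format are invented for illustration:

    #!/usr/bin/env python3
    # Hypothetical CLI a skill could describe to the agent and invoke directly.
    import argparse
    import json
    import pathlib


    def main() -> None:
        parser = argparse.ArgumentParser(
            description="Summarise a (hypothetical) project changelog for the agent."
        )
        parser.add_argument("path", help="path to the changelog file")
        parser.add_argument("--last", type=int, default=5, help="number of entries")
        args = parser.parse_args()

        # Keep only the most recent bullet entries so the agent gets a short answer.
        lines = pathlib.Path(args.path).read_text().splitlines()
        entries = [line for line in lines if line.startswith("- ")][: args.last]
        print(json.dumps({"path": args.path, "entries": entries}))


    if __name__ == "__main__":
        main()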


I mention mean time to decision, and that's one of the rationales for the MCP. A skill could call a script that does the same thing -- but at that point aren't we just splitting hairs? We are both talking about automated repetitive thinking + actions that the agent takes? And if the skill requires authentication, you have to encode passing that auth into the prompt. MCP servers can just read tokens from the filesystem at call time and don't require thinking at all.
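For what it's worth, here is a minimal sketch of that pattern, assuming the official Python MCP SDK's FastMCP helper; the server name, tool and token path are illustrative only, not from any real server:

    # Sketch of an MCP server that reads its token from the filesystem at call
    # time, so no auth ever has to be encoded into the prompt. Assumes the
    # `mcp` Python SDK; names and paths are made up for illustration.
    import pathlib

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("example-tracker")


    @mcp.tool()
    def search_issues(query: str, limit: int = 10) -> str:
        """Search the (hypothetical) issue tracker and return matches."""
        token_path = pathlib.Path("~/.config/example-tracker/token").expanduser()
        token = token_path.read_text().strip()
        # A real server would call the tracker's API with `token` here.
        return f"(stub) query={query!r} limit={limit} token_loaded={bool(token)}"


    if __name__ == "__main__":
        mcp.run()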


I was experimenting with this technology almost a decade ago as part of my work as an interaction designer:

https://darker.ink/writings/Mobile-design-with-device-to-dev...

It has a lot of potential, but unfortunately it has been held back until now by a lack of support and interoperability.


Waayy back in 2009 we had Bump [1], which allowed transfer between devices and later web apps as well – by banging your phone against the spacebar. It worked 98% of the time and was faster than AirDrop is today, even though we only had 3G.

Google acquired it and immediately killed it.

[1] https://en.wikipedia.org/wiki/Bump_(application)


Bump didn't use direct device-to-device communication. A central server correlated the two bumping phones, based on geolocation and accelerometer data, then swapped the data via the server. At least that's how it worked in the early days. (Wiki page confirms)

Since it relied on your internet connection, I'm skeptical it'd be faster than AirDrop for a large amount of data like photos. But for swapping contacts I bet it was faster, since it didn't have to spend time establishing a new direct connection.
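A toy sketch of that kind of server-side correlation, with thresholds and data shapes invented purely for illustration (this is obviously not Bump's actual code):

    # Toy version of the matching described above: pair up two "bump" events
    # that arrive close together in time and space. All numbers are invented.
    from dataclasses import dataclass
    from math import hypot


    @dataclass
    class BumpEvent:
        device_id: str
        timestamp: float  # seconds since epoch, as reported by the phone
        lat: float
        lon: float


    def same_bump(a: BumpEvent, b: BumpEvent,
                  max_dt: float = 0.5, max_dist_deg: float = 0.001) -> bool:
        """Return True if two events plausibly came from the same physical bump."""
        close_in_time = abs(a.timestamp - b.timestamp) <= max_dt
        close_in_space = hypot(a.lat - b.lat, a.lon - b.lon) <= max_dist_deg
        return a.device_id != b.device_id and close_in_time and close_in_space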


That's true, I should have mentioned it did not use device-to-device communication. It was the best possible experience for the time though; BT was not viable and Wi-Fi Direct did not exist. 3G averaged maybe 10 Mbps and photos were 2 megapixels (if you had a camera at all), so that was more than enough speed. We were mostly sharing URLs and contacts.

By faster I mean the initial connection: it was instant despite the server-based pairing, which made it feel even more magical. With AirDrop you sometimes experience quite a bit of signal hunting.

A comparable experience would be when you touch phones to share a contact with NFC, it was in that ballpark of responsiveness.


Waaay back when in Japan, sekigaisen (infrared) was a verb meaning to transfer contact details or photos or whatever between phones via infrared. It was amazing how fast the iPhone took over Japan and killed off their quirky phone ecosystem.

Edit: I want to emphasize that it was totally ubiquitous. Every phone had it.


yes, "beaming" in the us was also used for quite a while. as in IR beam

japanese phones were buggy, feature packed monstrosities. a bunch of companies fighting to check as many boxes as they could. it's not a surprise that they got wiped out by an attempt to make a holistic internet communicator.

but for a while, there was nothing like them and their ability to get information on the internet


I wonder if this was driven by the Palm Pilots in the early 2000s. We beamed contacts, calendar entries, whole apps via IR. At trade shows, exhibitors had terminals that would constantly send out contact information via OBEX (?).


In the US (edit: and elsewhere!), "beaming" worked great between Apple Newton devices, including the pretty cool eMate 300 (an early Jony Ive creation, I just found on Wikipedia).

In 1993.


Microsoft Zune had the ability to send music wirelessly to other Zunes; it was called squirting.


That's appalling. "Yo let me squirt you"


Somehow "squirting their users" perfectly defines Microsoft to this day


squirt me bro


My friends in school would send ringtones, wallpapers, and other small files through Bluetooth. It normally worked pretty well no matter the device.


I remember being blown away by the Gameboy Colour IR link. You could use it to trade Pokemon. That makes a bit more sense now if sekigaisen was already a popular ecosystem.


And in Pokemon Gold/Silver/Crystal on GameBoy Color you could send Mystery gifts via IR!

Someone even ported it to an emulator! https://shonumi.github.io/articles/art11.html


When I was pretty early in my career, I inherited a legacy project from the CTO who didn't want to maintain it anymore. We decided as a team that I'd just recreate the project with a modern tool chain.

A few weeks later, the CTO looked at my work and asked why it was missing xyz features from his legacy project, saying that if I'm gonna take a project and rewrite it, it better be at least as good as the old project.

It was a pretty good lesson for me to get early in my career, and I've carried it with me ever since. Don't break or rewrite that which already works.

It's evident that no one at Google ever got that lesson.

NB: I know Google definitely had other reasons for acquiring and killing off Bump — they were probably building a competing technology that was shitty, and Bump was doing it better and sooner than them, so better to buy and kill than to make their own product better. But I think the lesson from my anecdote still stands from a purely product point of view, and I feel like it should make business sense too, but apparently you can make bad micro business decisions as long as you can convince shareholders they were good macro business decisions.


I changed my thoughts on rewriting after reading this:

https://www.joelonsoftware.com/2000/04/06/things-you-should-...

PS: I just realized this article is older than some of the people here.


I would rewrite if the alternative is maintaining bad code for a long time. But yeah, it’s best to be pessimistic. And be really careful about changes. There are books written about the methods to use.


Wow! Not quite older than me, but my age was in the single digits :)

Thanks for sharing, it's always great to learn from folks who have been through it for literal decades.


The lesson I retain from a similar endeavor is that you should document all the use cases of a module or a project before rewriting it. And that task can be as exhausting as formally verifying the module.


Yeah, it could be several weeks or even months before writing a single line of code, depending on the size of the project. Important to do, but it would be a hard sell to PMs :(

(Ideally these things are written while the code is being written but let's be honest, we rarely keep those up to date)


I do wonder how many great little user-friendly bits of software got destroyed in acqui-shutdowns. Incredible way to deploy capital to delete software, but that's the big internet world for you.


> Waayy back in 2009 we had Bump [1], which allowed transfer between devices and later web apps as well

Over the Internet. There are dozens of such services, and none of them can compete with Airdrop.

The main point of Airdrop is that it doesn't need Internet connectivity and won't use any metered data (or, on recent iOS versions, at least if Wi-Fi Assist is turned off, I believe).

Just as important is the fact that there's no need to install any application – any Apple device comes with Airdrop preinstalled.


What's sad is that what largely replaced device-to-device transfers was just messaging apps. But messaging apps compress media horribly. iMessage isn't so bad, but send a photo through almost anything else and all metadata is stripped, and the image resolution and bitrate are the absolute bare minimum to look OK on a phone. But try to print it and it will be horrible.


> iMessage isn't so bad

iMessage is very bad in certain circumstances; if the recipient is on 3G or 4G, for instance, it really compresses videos. It's not obvious, and it doesn't tell the recipient or offer an option, so if you're working in video you keep being told "can you make it higher res?" when this happens.


Stripping the metadata on a photo is probably a feature though. For privacy reasons the default should most likely be that location, device info etc are taken out of photos that might go viral or be shared beyond what the original user intended.
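As a rough illustration of what "stripping the metadata" amounts to, here is a small sketch using Pillow: re-encoding the pixels into a fresh image drops the EXIF block (GPS position, camera model, timestamps) because none of it is copied over. The file paths are just examples:

    # Strip EXIF by copying only the pixel data into a new image (Pillow).
    from PIL import Image


    def strip_metadata(src: str, dst: str) -> None:
        with Image.open(src) as img:
            clean = Image.new(img.mode, img.size)
            clean.putdata(list(img.getdata()))
            clean.save(dst)  # saved without the original EXIF block


    if __name__ == "__main__":
        strip_metadata("photo_with_exif.jpg", "photo_clean.jpg")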


It depends; I do wish it were an option the user could pick. Quite often you get sent photos of you or an event you were at, and you'd like the metadata to be preserved. For posting on social media, sure, it's best to strip it.


There could probably be a niche market (until platforms implement the functionality) for enhancing the metadata of WhatsApp pictures from family & friends by guessing it from context, e.g. your auntie sending you a picture of yourself from 30 years ago, which will show up as dated 2025 by default, which totally sucks.


Your comment reminded me of this:

https://theyseeyourphotos.com/


If I am not mistaken, Bump still required a connection to the Internet. WiFi Aware does not, because the phones create an ad-hoc link on the spot.

The connection can be very fast. In this example, a 280 MB file is transferred in less than 10 seconds:

https://vimeo.com/418946837
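A quick back-of-the-envelope check of what those numbers work out to:

    # 280 MB in under 10 seconds, as in the video above.
    size_mb = 280
    seconds = 10
    print(f"~{size_mb / seconds:.0f} MB/s, i.e. roughly {size_mb * 8 / seconds:.0f} Mbit/s")
    # => ~28 MB/s, roughly 224 Mbit/s of application-level throughput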


Bump was like magic.

The only app I have ever truly thought “this is the future”


Very cool, I didn't know such an app had existed, thank you! I wanted to use a similar approach to connect people in a smaller friends-only social network.


I can almost guarantee it wasn't faster than AirDrop (when it works) is today. I remember using Bump on Wi-Fi, and it was limited to (shocking) Wi-Fi speeds at the time. As recently as last week I transferred 1 GB video files in under 20 seconds using AirDrop. That simply was not possible in 2009.


Connection speed, not transfer speed indeed – that was purely network dependent. In any case nobody was transferring 1GB files from their phones at the time :)


AirDrop uses direct peer-to-peer Wi-Fi (AWDL)... so


IMHO coding use cases are much more constrained by tooling than by raw model capabilities at the moment. Perhaps we have finally reached the time of diminishing returns and that will remain the case going forward.


This seems preferable. Wasting tokens on tools seems unnecessary when a standardized, reliable interface to those tools should be all that's required.

The magic of LLMs is that they can understand the latent space of a problem and infer a mostly accurate response. Saying you need to subscribe to get the latest tools is just a sales tactic trained into the models to protect profits.


It seems very naive to presume that a tool which explicitly works by unblocking the retrieval of harmful information will not be used for, among other purposes, retrieving that same harmful information.


The goal isn't to make that specific information accessible; it's to get rid of all refusals across the board.

Going after the most extreme cases has the effect of ripping out the weeds by the root, rather than plucking leaf after leaf.


Users will ask ChatGPT for recommendations and the answer will feature products and services that have paid to be there, probably with some sort of attribution mechanism so OpenAI can get paid extra if the user ends up completing the purchase.


Checkout will happen directly in the app, and yes they will collect a fee on it.


ChatGPT will become a salesman working on commission.

