We use a C compiler for embedded systems that doesn't support link-time optimization (unless you pay for the pro version, that is). I have been thinking about a tool like this that merges all the C source files for compilation.
That's called a "unity" build, isn't it? I was under the impression that it was a relatively well-known technique, such that there are existing tools to merge a set of source files into a single .c file.
Unless I am misunderstanding you, you could easily do this by #including all your a.c, b.c, etc. into one file, input.c, and feeding that to the compiler.
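A minimal sketch of what that can look like (assuming your sources are a.c, b.c, and main.c, and that nothing in them clashes at file scope):

    /* input.c - the single "unity" translation unit.
     * Compile only this file; don't also compile a.c / b.c separately,
     * or the linker will see duplicate definitions of everything.
     */
    #include "a.c"
    #include "b.c"
    #include "main.c"

Building then becomes something like `cc -O2 -o firmware input.c`, which gives the compiler whole-program visibility similar to LTO. The main caveat is that static functions and macros from the individual files now share one translation unit, so any name collisions have to be resolved by hand.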
We did this for a home-grown SoC with a gcc port for which there was no linker.
> It feels like OpenAI is mostly concerned with developing proofs of the untrustability of every digital medium
Which, to me, makes sense. Once the underlying technology exists, a malicious actor would not think twice before developing tools of deception like those. It makes sense that OpenAI would work on that "in the open" to demonstrate that we now need to be skeptical of audio recordings.
Hopefully I'm not being overly optimistic, and I only vaguely understand the many tradeoffs that go into rocket design. But, technology-wise, Starship seems analogous to a Falcon Heavy propelling a Space Shuttle. Both of which are manageable, proven technologies.
I think you are incorrect. If you build a 230-m tall pipe and fill it up with water, the water at the ground-level end of the pipe will be at the exact same pressure as the water 230 m deep in the ocean. Hydrostatic pressure only depends on the depth, not the container shape.
The water at ground level in the ocean will be at the same pressure as the water at ground level in the pipe. The water 230 m down in the pipe will be at the same pressure as the water 230 m down in the ocean.
A pipe 230 m tall with its bottom at ground level would experience the same hydrostatic pressure as water 230 m under the ocean. The pressure comes from the weight of water above it, not distance from sea level. This is literally how water towers work.
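For concreteness, hydrostatic gauge pressure is just P = ρ·g·h, so it depends only on the height of water above the point, not on the shape of the container. A rough back-of-envelope sketch (assuming fresh water at 1000 kg/m³; seawater is a few percent denser):

    #include <stdio.h>

    int main(void) {
        const double rho = 1000.0; /* density of water, kg/m^3 */
        const double g   = 9.81;   /* gravitational acceleration, m/s^2 */
        const double h   = 230.0;  /* height of the water column, m */

        double p = rho * g * h;    /* gauge pressure in pascals */
        printf("%.2f MPa (about %.0f atm)\n", p / 1e6, p / 101325.0);
        return 0;
    }

That works out to roughly 2.3 MPa (about 22 atm), whether the 230 m of water sits in a pipe, a water tower, or the open ocean.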
The post I replied to said the opposite: that the pressure at the surface of the ocean would be the same as the pressure at the bottom end of a pipe going underwater, if there were such a pipe.
Which clearly isn’t true or we’d have a trivial perpetual motion machine.
It is different this time. I bet that was also said when the transformations that you mentioned occurred, but this time it really is different.
LLMs are pretty general in their capabilities, so this is not like the relatively slow process of electrification, when lamplighters lost their jobs. Everyone can lose their jobs in a matter of months because AI can do close to everything.
I am excited to live in a world where AI has "freed" humans from wage slavery, but our economic system is not ready to deal with that yet.
I'm skeptical. This will drastically change what it means to do a job in a way that has never happened before, but humans will find a way to deal with the fallout. We don't have a choice. Besides, if we were able to disrupt the very foundations of our economy for a minor virus, we can and will do the same to deal with this if required.
Either way, this change has already arrived, and we are starting to adapt our lives in response to it, as we have many times in the past.
tldr: This change is significant but we'll manage.
The handling of COVID was not smooth, to say the least.
Yes, we handled it, but we are still paying the bill for that handling (inflation).
I think AI will be as disruptive as COVID, but with no end in sight: 5%, 10%, 20%, 50% of people will lose their jobs, and even if they can retrain and handle it, it will take 5-10 years for those people to recover. Can countries keep people on unemployment for that long?
Productivity will skyrocket and with it the standard of living. Humans will always enjoy having other humans doing stuff for them.
Sure, it will be faster this time and there will be some growth pains.
It's not a matter of being ready, it's a matter of needing this. If you look at society's problems today, we're in a deadlock. I believe the benefits of AI can help alleviate a lot.
It will most likely widen, but who cares? What matters to me is the quality of my life, not others'. If they're managing to do better than me while doing something useful for society, good for them.
What really matters is that the poor of tomorrow will laugh at the life of today's rich.
I mean, the poor won't have Bezos' yacht, but they'll have access to some life amenities, health resources, etc., that Bezos can't even dream of having today.
Too many times I have been the guy in the top left quadrant of the chart in the post. I eventually came to the realization that I have often been disrespectful to the intelligence and capacity of my colleagues. I am lucky that I get to work with smart people, and I have to remind myself that these smart people know what they are doing and don't want to do more work than necessary. This has helped me set my tone and ask better questions.
What the fuck indeed. I can't help but think of the Nuremberg laws in Germany. More specifically the "Law for the Protection of German Blood and German Honor" [1]. Sure, it's not _that bad_ (e.g., the Israeli law is not banning marriages per se) but has a similar sentiment.