MrEldritch's comments

I primarily use it as a toy, but it's also come in genuinely handy for me as a scripting language and calculator (but a calculator that also works with strings). The terseness isn't really a code-golf thing; once you're familiar with the glyphs (which are really a fairly well-chosen set of powerful algorithmic primitives) it makes it more straightforward than any conventional language I've worked with to just Implement An Algorithm, with no boilerplate or fluff. Arrays with array broadcasting, combined with Uiua's stack combinators, are just a really flexible and general tool for expressing how data flows through a program, and the glyphs make it possible to (once you're used to them) translate those flows into code very naturally and smoothly.
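(The array-broadcasting half of that isn't Uiua-specific, so here's a rough sketch of the same flavor in NumPy, with made-up data; Uiua expresses the equivalent in a handful of glyphs:)

    import numpy as np

    # Made-up data: 3 samples x 4 features.
    data = np.array([[1.0, 2.0, 3.0, 4.0],
                     [2.0, 4.0, 6.0, 8.0],
                     [1.0, 1.0, 1.0, 1.0]])

    # Broadcasting divides every row by its own total in one
    # expression: no loops, no boilerplate.
    normalized = data / data.sum(axis=1, keepdims=True)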

It's difficult for me to express just how fun Uiua can be to use. It's rather like one of those Zachtronics games - both in that figuring out how to fit your task into the array-programming model can be a bit of a puzzle sometimes, and in that once you've done that it's an extremely quick and non-frustrating process to make it work. The pure essence of what makes programming enjoyable, for better and for worse. There's also just a lot less plumbing and documentation-reading involved; the flexibility and terseness of the glyphs mean that a lot of things you might have to call out to a standard library for in other languages you can Just Write, because the entire implementation could easily be of length comparable to the name. (There's also a lot less plumbing and documentation-reading involved because there's only, like, four Uiua libraries anyway. Less to plumb together. Like I said, what makes programming enjoyable, for better and for worse.)

(Why Uiua and not, like, APL? I actually find APL enormously more difficult to read, due to the syntax - APL glyphs have two context-dependent readings, depending on whether they're being used as a binary infix or a unary prefix (APL's "-", for instance, means subtract in one position and negate in the other), and figuring out how the parse tree breaks down when squinting at a sea of glyphs is painful. It's like a whole language of garden path sentences. Uiua glyphs have fixed arity and always mean exactly one thing; so there are twice as many glyphs, but parsing by sight-reading is way more straightforward.)


"Directed energy weapon" is essentially the more technical way of saying "death ray". Lasers, microwaves, particle beams, that sort of thing.


It seems very unlikely - the researchers estimate a lifetime for phosphine of just thousands of years in the more temperate parts of the atmosphere, and just thousands of seconds nearer to the surface.


The energy flux available in ambient radio waves passing through something the size of a handheld is microscopic, pretty much.
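(To put numbers on that, a hedged back-of-envelope; the power density and antenna figures below are assumptions, not measurements:)

    import math

    # Assumed urban ambient RF power density, W/m^2.
    power_density = 1e-6
    # Effective aperture of an isotropic antenna at ~900 MHz.
    wavelength = 0.33                          # meters
    aperture = wavelength**2 / (4 * math.pi)   # ~0.009 m^2
    # Harvestable power: on the order of nanowatts,
    # versus the ~1 W it takes to actually run a phone.
    print(power_density * aperture)            # ~8.7e-9 W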


While on the subject of FORTH for the 6502, I can't resist mentioning Eloraam's now-defunct (but huge in its heyday) Minecraft mod Redpower 2! It included fully-functional computer blocks that could interface with redstone machinery ... running a custom FORTH on an emulated 6502 processor :)


I think what they're doing with those sculptures is that they're taking three sine waves of different frequencies A, B, and C, and that the sculpture is the path that the point (cos(A t), cos(B t), cos(C t)) takes over the combined period of all three frequencies. That's why it's cube-shaped; x, y, and z each vary from -1 to 1 over the respective periods of each component. Like a 3D Lissajous figure; see this (https://gfycat.com/angelicdismalamazonparrot) for a 2D example.
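(If you want to check that guess yourself, here's a minimal sketch; the frequencies are arbitrary, and matplotlib is assumed:)

    import numpy as np
    import matplotlib.pyplot as plt

    A, B, C = 3, 4, 5                      # arbitrary integer frequencies
    t = np.linspace(0, 2 * np.pi, 5000)    # one combined period

    # Trace the point (cos(A t), cos(B t), cos(C t)): a 3D
    # Lissajous curve that stays inside the unit cube.
    ax = plt.figure().add_subplot(projection="3d")
    ax.plot(np.cos(A * t), np.cos(B * t), np.cos(C * t))
    plt.show()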

But I also agree that I shouldn't have to be guessing like this, and that it'd be better if they had more explanation.


Does Hacker News have a "Report" button?


It's hidden behind the timestamp. Click the timestamp, get to the post information page, and finally click the flag link at the top.


The Flag link?


Why, what'd OP comment that required reporting?

Paraphrase is fine but I gotta know!


If you enable showdead in your preferences you can read the whole thing, which is a rambling off-topic comment on their product called "interference".


TIL about showdead - thanks!


Perhaps I'm missing something, but ... this malware's initial infection vector is via email. If your computer is connected to email, it cannot possibly be air-gapped, unless I'm severely misunderstanding what "air-gapped" means.

I assume what's meant is that the malware infects an internet-connected computer, jumps to removable storage, and then hopefully that storage is plugged into the target computer, possibly through multiple intermediate infections? But the fact that viruses can spread via thumb drives is hardly novel either.

And the kicker, how you actually get the files out of the air gap, is also not addressed: "ESET says that during its research, it was not able to identify any Ramsay exfiltration module just yet." I'm certainly aware of a number of sexy proof-of-concept side-channel attacks that modulate things like fan noise or graphics card activity or infrasound to try to exfiltrate data in a way that an external agent could pick up, but there's no evidence that this malware uses any of them; perhaps the hope is that another infected flash drive gets plugged in with an exfiltration module, slurps up the data, and then transmits it out when it's plugged back into a network-connected machine.


Obviously, like everyone, I'm a little concerned about "Embrace, Extend, Extinguish" and I don't really believe that Microsoft are suddenly "the good guys".

But I don't, ultimately, think that this will lead to the demise of Linux. Desktop Linux as a serious competitor for a general-use operating system was already not happening, and not really showing any signs of growth beyond a tiny percentage of geeks. If it were going to happen, then I think this would be a major barrier to it (why bother abandoning Windows or learning to dual-boot for Linux functionality when you can just use WSL?), but since it wasn't going to happen anyway, one more reason it's not happening won't really matter.

(and on the other hand, by providing a less-scary way to get familiar with Linux from inside the operating system you're already used to, it might lower the barrier by an equal amount)

And Windows is resource-heavy and full of enough overhead (not to mention license fees) that it's never going to replace Linux in the server, high-performance, or embedded space, and this certainly won't affect that.

I suppose it may end up leading to fewer devs directly running Linux in the workplace, since the corp can issue them a more corp-comfy Windows machine instead and trust that they'll still be able to get work done; all the odd troubles of getting Linux to work comfortably on a laptop just sort of vanish if you can use Windows anyway. That could erase a good chunk of the tiny desktop/laptop marketshare that Linux already has. Or it may not. Who knows!


I've been using Linux on my home machine for several years and it's been fantastic. I first installed Linux in the mid-'90s, though, so maybe I'm just familiar with it.

I did use Mac OS X and Windows as my primary desktop until maybe 5 years ago.

And since I've been working from home, I've found that I don't really have a need for Windows except for server stuff. Interestingly enough, I also found that I can SSH into a Windows server and use PowerShell to do basically everything I ever need to.
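(For anyone who wants to script that from the Linux side, a minimal sketch using paramiko; the hostname and credentials are placeholders, and it assumes the Windows box has the OpenSSH server feature enabled:)

    import paramiko

    # Placeholder host and credentials.
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect("winserver.example", username="admin", password="hunter2")

    # Run a PowerShell command remotely and print its output.
    _, stdout, _ = client.exec_command(
        'powershell -Command "Get-Service | Where-Object Status -eq Running"')
    print(stdout.read().decode())
    client.close()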

So now my workflow has flipped: instead of having a Windows desktop that I use to SSH into a Linux VM, I have a Linux desktop that I use 90% of the time, and I SSH into a Windows VM to do the rest.

I even ran into a bug in Evolution's EWS (Exchange plugin), filed a bug report, and it's already fixed.

I know that systems engineering/admin work is probably more suited to Linux than other careers are, but when this is all over I think I'll be using Linux in my office also.


I believe that Windows will soon 'do an Apple'.

That is, just as Apple built its proprietary desktop on top of a BSD UNIX base, Microsoft is gradually moving towards building its Windows desktop on top of a Linux base.

That would free up the hundreds or thousands of highly paid MSFT developers who currently have to maintain the underlying OS, obtaining that functionality instead from the thousands of 'unpaid by Microsoft' Linux OS developers.


Are you saying we'll finally see the year of the Linux desktop?


> I believe that Windows will soon 'do an Apple'.

You are comparing apples with oranges; 'MS doing an orange' will be decades slow, and will result in a two-faced monstrosity that they will probably rename to JANUS or something of the sort.


Given that that story, with that specific wording, has been a running joke circulating around the Internet for years, I suspect your co-worker's kid has never said any such thing.

