The notion that "you could use the same application across the three major OSs" is the feature – the feature is not that it uses Electron. That it uses Electron to accomplish the feature is incidental.
Am I missing something? Why is there a need for a separate programming language for a use case like this? Don't we have enough programming languages to get by? Why wouldn't a library suffice?
It just does not seem reasonable to me to dump an ecosystem built around an established programming language just to solve a domain problem.
> No, not even nearly enough. Every little problem domain needs its own language.
> Then you really do not understand the value of DSLs. And you don't have to dump anything, eDSLs allow everything to co-exist nicely.
How about using a language that lets you build DSLs from it? Lisps are an especially good choice.
I don't really see the point in creating a new language that is 95% identical to everything else, only to tweak that last 5%. In effect you're still using a library, you're just hiding it inside the compiler.
I beg to differ. After skimming the article, it seems that the proposed language resembles a general-purpose language (at least superficially resembling Lua/JavaScript) with an FFI to C (again, something Lua is good at). The domain-specific parts are either libraries or covered by the runtime (in C), instead of language-level constructs.
IMHO there is no real benefit to creating a "new" language compared to, e.g., just using Lua + C.
In this particular case it's the underlying VM which provides domain-specific semantics, not the language itself. If you could retarget Lua or JS to run on top of this peculiar VM - fine, but building a new language is easier.
I don't see anything in that paper that wouldn't fit into a regular API. I did see a line that said the details of the VM were outside the scope of the paper. I would be happy to hear what in those details prevents this from being done with existing languages. Are you familiar with it?
Also: easier for the developers, perhaps, but not for users.
I haven't read the link yet, but I'm expecting some facility that automatically assigns computational and behavioral tasks to swarm members based on their relative locations.
Literally anything can be implemented in Lua+C, so it's not whether it can be done, but whether the language+VM in question add new ideas to swarm research. If so, maybe those ideas can be translated to other languages people want to use instead.
Sorry -- I really meant to ask whether there's anything that calls for new language constructs to be reasonably expressive (i.e., the alternative being really awkward API calls). I found none.
Yes, to my taste this language is nowhere near the DSL it should have been, but still, there is a significant portion of VM semantics that is hard to implement on top of an existing VM.
The article outlines it as follows:
"1) Sensor readings are collected and stored in the BVM; 2) Incoming messages are collected and processed by the BVM; 3) A portion of the Buzz script is executed; 4) The messages in the BVM output queue are sent (as many as possible, according to the available payload size; see Sec. IV); 5) Actuator values are collected from the BVM state and applied."
I.e., it looks like complex synchronisation semantics that run underneath any language layer and are therefore hard to implement anywhere higher than the VM.
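To make that concrete, here is a minimal Python sketch of what a single tick of that loop might look like. All the names here are illustrative assumptions on my part, not the actual BVM API:

    # hypothetical sketch of one control-loop tick; not the real Buzz/BVM API
    def vm_step(state, script, sensor_readings, incoming, out_queue, payload_budget):
        # 1) sensor readings are stored in the VM state
        state["sensors"] = sensor_readings
        # 2) incoming messages are processed before any script code runs
        state.setdefault("inbox", []).extend(incoming)
        # 3) a portion of the script executes against the updated state;
        #    the script may append outgoing messages to state["outbox"]
        script(state)
        out_queue.extend(state.pop("outbox", []))
        # 4) only as many queued messages are sent as fit the payload budget
        to_send, out_queue = out_queue[:payload_budget], out_queue[payload_budget:]
        # 5) actuator values are read back and applied by the robot's controller
        return state.get("actuators", {}), to_send, out_queue

The point is that steps 1-5 happen in a fixed order around every script execution; retrofitting that ordering onto a VM you don't control is the hard part.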
An API leaks abstractions and does not provide any domain-specific semantics. Imagine an API for, say, parsing. A DSL would have a nice BNF-like syntax and abstract any implementation details away. With an API, well, you won't ever get any further than Parsec, and Parsec is awful.
So, yes, as long as you have a sufficiently powerful meta-language, an eDSL is almost always better than a library. And even if your language is not powerful at all, standalone DSLs are still better than libraries; they're just a bit more complicated to implement.
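As a toy illustration of that contrast (made up for this comment, not Parsec or any real library): with a bare API, positions and failure handling stay in the caller's face, while even a small embedded DSL can make the grammar itself readable.

    # plain-API style: positions and failure handling are the caller's problem
    def parse_digits(text, pos):
        start = pos
        while pos < len(text) and text[pos].isdigit():
            pos += 1
        return (int(text[start:pos]), pos) if pos > start else None

    # eDSL style: operator overloading lets rules read almost like BNF,
    #   expr ::= number "+" number
    class Rule:
        def __init__(self, fn):
            self.fn = fn
        def __add__(self, other):          # sequencing: a + b
            def seq(text, pos):
                r1 = self.fn(text, pos)
                if r1 is None:
                    return None
                v1, pos = r1
                r2 = other.fn(text, pos)
                if r2 is None:
                    return None
                v2, pos = r2
                return (v1, v2), pos
            return Rule(seq)

    number = Rule(parse_digits)
    plus = Rule(lambda t, p: ("+", p + 1) if t[p:p + 1] == "+" else None)
    expr = number + plus + number          # reads like the grammar rule above
    print(expr.fn("2+3", 0))               # (((2, '+'), 3), 3)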
Same thing - leaky abstraction. Too much host language semantics leaking into the problem domain. It's not a pure BNF, and therefore it's faulty by design.
This is great, is it possible to make it work with Docker?
Edit: Yes, it is.
Edit2: It is possible, but whether it is worth it or not is still a question. xhyve is a hypervisor after all; you still need to boot a VM to use Docker. So if you, like me, thought for a second that this could make containers on OSX a reality, don't get excited too early.
If you want a better solution on OSX than you already have, try https://github.com/codekitchen/dinghy - it does a really great job of wrapping Vagrant and taking care of setting up NFS instead of the incredibly slow default vboxfs.
Alternatively I've used Parallels + NFS with Vagrant for boot2docker but it's a real pain to get set up correctly. Dinghy just did it automatically, and also set up DNS and NTP for me. Can't recommend it enough.
The benefit for me would be quick, isolated development environments. I want to boot a contained swarm of services connected to each other in a single command, and shut them down in a single command as well.
And installing docker / docker-compose / machine, etc. using one brew command would be a great addition in order to ease introduction of the tool to the whole team.
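For what it's worth, docker-compose already gives you that one-command workflow once the VM question is solved. A minimal compose file (the service names and images here are just placeholders) looks roughly like this:

    # docker-compose.yml -- service names and images are placeholders
    web:
      image: nginx
      links:
        - db
    db:
      image: postgres

Then docker-compose up -d boots the whole set and docker-compose stop shuts it down again.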
Are you aware that your OS X machine is, itself, a deployment target for developers somewhere? All environments benefit from enforceable loose coupling.
...that's the point. Some people have a strong emotional reaction when they see this robot being kicked, even though this is clearly, obviously, a robot. They haven't, as far as I know, built in any emotional engagement stuff. They could easily build in a few movements - robot pauses, cranes head towards human companion, makes noise, robot continues - which would freak a few people out.
Yes, and hopefully, 10 years down the line it is not merged with an AI which takes a dim view of its predecessors being kicked around on some carbon-based life form's whims and fancies (after watching its genesis).
Except it is in the AI's interest to get kicked like that. This way the company that constructs robots can show how reliable they are, and as a result receive more funding. Our robot overlords will show this to their "children" during history classes.
Or in other words: there is no reason to expect that AI will have irrational feelings similar to human feelings.
I dunno, it's not too hard to imagine a scenario where the AI realizes being kicked is counter to its goals and decides to remove the aggressor. Especially when these things are used for combat, as mentioned elsewhere in the thread. Friend/foe indicator malfunctioning? Well, good luck.
I think in this context, it would be more equivalent to sparring than anything: a test of one's abilities. A sufficiently-intelligent AI would likely think the same should it watch videos of its predecessors being kicked during testing.
I think it's more that they have the cash to do everything, so that's what they are doing. When you do everything you are unavoidable.
Not a bad strategy.
They haven't lost a bit of relevance with the majority of the paying user base. However the tech press like to spin it that way. When I see a Mac or a Linux box in a 2000+ seat corporate network, then perhaps I'll believe it. The only markets they aren't winning are the freshly created volatile ones.
I should have added that I'm a .NET developer who's recently returned to working in an open source shop. The problem I see is not that they don't have good tools or relevant solutions for real world problems. The problem I see is that I'm in my late thirties and I find myself to be one of the younger developers when I go out to MS-related user groups and events.
Spending time in the open source world with younger developers, it's clear the only value they get from Microsoft is for their gaming machines at home. While a number of them respect the development tools, it doesn't matter, because Microsoft has done a great job tying them to the Windows Titanic.
I recall interviewing with a number of companies in the mid 90's and talking with the programmers who worked with DEC-based tools. The ones I talked to had nothing but good things to say about their tools and systems, and how they had real implementations of various systems that PCs were trying to implement at the time. I can't help but look back and find myself in the same position, on a course towards becoming irrelevant in 10-20 years.
I guess what I was trying to get at is that Microsoft is going to have to make a considerable effort to focus on luring back the generation they lost if they want to remain relevant. That generation of developers isn't simply going to pony up for slightly better tools. They demand having access to software the moment it's available, so they can download it immediately with one of twenty different package managers.
I'm not predicting Microsoft's doom, but I am saying that Microsoft is facing an incredible number of challenges that will make "do everything" a bad strategy. On too many fronts they're faced with competition that ranges anywhere from inferior to superior, but free. In development tools, they're up against a huge community of open source tools, some of them funded by huge companies whose revenues don't depend on the sale of software. MS Office's share is eroding, not just to direct competition, but to indirect alternatives that posit that complicated word processors and spreadsheets are the wrong answer.
I also don't think Microsoft can do everything, because software is going through a Cambrian-type explosion where you're seeing all sorts of manifestations of species and hybrids. Sure, a lot of these species will die out, but as we witness new species of databases and operating systems it will be difficult for a large company like Microsoft to predict which ideas it needs to pay attention to and which to ignore. Adapting to complicated markets is an incredible challenge, but there are companies that have done a remarkable job staying on top.
> They haven't lost a bit of relevance with the majority of the paying user base. However the tech press like to spin it that way. When I see a Mac or a Linux box in a 2000+ seat corporate network, then perhaps I'll believe it.
I had the same response to the predicted demise of Palm and Blackberry. I'm not saying that I know the outcome but that current success is not a predictor of future performance, especially in an industry where disruption is such a focus.
I happen to work at one of those companies where all developers get Linux desktops. In fact, I'm typing this comment from said Linux desktop. You just need to get out more.
There's a big difference between a software company equipping developers with Linux based machines and a midsize to large non-tech company equipping their users with Linux or even Macs.
Granted, I think Microsoft is losing their grip, and non-tech companies (I work for one) are seriously beginning to consider PC alternatives. At the manager, executive, and power-user level, IT departments seem to be accommodating Macs more. For non-power users, web-based machines like Chromebooks are becoming more attractive each day.