
I have similar feelings toward ROS, but I don't have experience with industrial alternatives. Would you be willing to elaborate (insofar as you're allowed to) about what made the "step-function" level of difference between ROS and the internal systems you've used? Or on what you would want to see in such a next-generation system?


Robotics is very much split between research users and industry. In industrial automation you buy a device (say, a sensor or a programmable logic controller) that comes with a few supported protocols: typically a fieldbus for realtime communication and, for the fancier devices, OPC UA (IEC 62541) as well.

I don't know of any automation vendors that sell devices with ROS support out of the box. With the use of Python, it must be hard to get ROS safety-certified.

OPC UA is like a merge of ROS (channels for PubSub communication) and "CORBA done right" (an object-model that can be transparently interacted with over the network).

Full transparency, I maintain open62541, an open source implementation of OPC UA.
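To give a concrete feel for it, the basic flow with open62541 looks roughly like the sketch below. This is based on the project's tutorial examples from memory, so take the exact headers and signatures as an assumption (they have shifted between versions): you stand up a server exposing an address space of nodes, and clients can then browse it, read/write variables, and subscribe to changes.

    #include <open62541/server.h>
    #include <open62541/server_config_default.h>

    int main() {
        // Create a server and apply the default configuration
        // (an endpoint on opc.tcp://<host>:4840).
        UA_Server *server = UA_Server_new();
        UA_ServerConfig_setDefault(UA_Server_getConfig(server));

        // Run the event loop. Clients can now browse the object model,
        // read and write variable nodes, and subscribe to value changes.
        volatile UA_Boolean running = true;
        UA_StatusCode retval = UA_Server_run(server, &running);

        UA_Server_delete(server);
        return retval == UA_STATUSCODE_GOOD ? 0 : 1;
    }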


When I was using it, I didn't get any vibes of "anything done right"; I actually found OPC UA to be a horrible (and underdocumented) mess. This also shows in nearly all the various language bindings: it looks as if nearly everyone either hit problems with the protocol or ran out of steam at the most basic things.

open62541 is really nice though, thank you for your work.

I do find any comparison of ROS and OPC UA a bit odd, though that may depend on your use case, especially if we're not strictly talking about what you mentioned.


I DM'd your Twitter


Huh, interesting - this is news to me, as I use Eigen all the time and see it used all over in robotics. Is there a good replacement for robotics-specific operations and small matrices generally (I see some people mentioning DirectXMath)? Or is the tradeoff just between spending the time and effort to write SIMD intrinsics yourself vs. accepting lower performance but greater convenience with Eigen?

One advantage of Eigen's approach that I haven't seen mentioned here is that its templated design makes it easy to substitute custom scalar types for operations, which helps enable straightforward automatic differentiation and other such tools (e.g. I'm currently using Eigen to make a tracing JIT for computations like FK, etc. over scenegraphs).
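To make that concrete, here is a small sketch of my own (not code from the JIT project): the routine is templated on the scalar type, so it runs with plain double today and could be instantiated with an autodiff or tracing scalar once Eigen::NumTraits is specialized for that type.

    #include <cmath>
    #include <Eigen/Dense>

    // A toy "forward kinematics"-style computation, templated on the scalar
    // so doubles, autodiff duals, or tracing types can be swapped in.
    template <typename Scalar>
    Eigen::Matrix<Scalar, 3, 1> rotateAboutZ(const Eigen::Matrix<Scalar, 3, 1>& p,
                                             const Scalar& angle) {
        using std::cos;
        using std::sin;
        Eigen::Matrix<Scalar, 3, 3> R;
        R << cos(angle), -sin(angle), Scalar(0),
             sin(angle),  cos(angle), Scalar(0),
             Scalar(0),   Scalar(0),  Scalar(1);
        return R * p;
    }

    int main() {
        Eigen::Vector3d p(1.0, 0.0, 0.0);
        // Instantiated with double; a custom scalar would be used the same way.
        Eigen::Vector3d q = rotateAboutZ<double>(p, 1.5707963267948966);
        return q.y() > 0.99 ? 0 : 1;
    }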


Blaze [0] looks promising. There was a little discussion here [1] comparing its performance with Eigen's, especially for small sizes (basic usage is sketched below the links).

[0] https://bitbucket.org/blaze-lib/blaze/src/master/

[1] https://bitbucket.org/blaze-lib/blaze/issues/266/blaze-vs-ei...
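A rough, unbenchmarked sketch of that usage for small fixed-size types, based on my reading of the docs:

    #include <iostream>
    #include <blaze/Math.h>

    int main() {
        // Compile-time-sized types, roughly analogous to Eigen's Matrix3d/Vector3d.
        blaze::StaticMatrix<double, 3UL, 3UL> R{{1.0, 0.0,  0.0},
                                                {0.0, 0.0, -1.0},
                                                {0.0, 1.0,  0.0}};
        blaze::StaticVector<double, 3UL> p{0.1, 0.2, 0.3};

        // Like Eigen, Blaze uses expression templates, so the right-hand side
        // is evaluated in one pass rather than through temporaries.
        blaze::StaticVector<double, 3UL> q = R * p + p;
        std::cout << q << std::endl;
        return 0;
    }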


Thanks, this looks promising indeed! Especially because it similarly supports custom scalar types: https://bitbucket.org/blaze-lib/blaze/wiki/Vector%20and%20Ma...


One small downside of Blaze vs. Eigen for robotics is that Blaze seems to lack the highly convenient geometry types and operations that Eigen ships with. This wouldn't be difficult to build on top of Blaze (and could lead to some interesting performance comparisons), but it does mean some additional work before Blaze can be used in many robotics applications.
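For example, a quaternion-to-rotation-matrix helper like the hypothetical sketch below (quatToRotation is my own name, not part of Blaze) is the kind of thing Eigen already provides via Eigen::Quaterniond but that you would have to write yourself on top of Blaze:

    #include <blaze/Math.h>

    using Mat3 = blaze::StaticMatrix<double, 3UL, 3UL>;

    // Rotation matrix from a unit quaternion (w, x, y, z).
    Mat3 quatToRotation(double w, double x, double y, double z) {
        return Mat3{
            {1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)},
            {2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)},
            {2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)}
        };
    }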


glm is also good for 3d math. It mimics the API of OpenGL shaders, so it's a good option if you already know how to write shaders (or are interested in learning).

https://github.com/g-truc/glm
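A rough sketch of typical usage (my assumption of how most people use it, mirroring GLSL's vec/mat types):

    #include <glm/glm.hpp>
    #include <glm/gtc/matrix_transform.hpp>

    int main() {
        glm::vec3 position(1.0f, 2.0f, 3.0f);

        // Build a model matrix: translate, then rotate 90 degrees about +Z,
        // composed the same way you would reason about it in a shader.
        glm::mat4 model(1.0f);
        model = glm::translate(model, position);
        model = glm::rotate(model, glm::radians(90.0f), glm::vec3(0.0f, 0.0f, 1.0f));

        // Vectors and matrices multiply just like GLSL's mat4 * vec4.
        glm::vec4 transformed = model * glm::vec4(position, 1.0f);
        return transformed.w == 1.0f ? 0 : 1;
    }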


This is an interesting read, thanks!

I'm a bit surprised by https://viralinstruction.com/posts/badjulia/#julia_cant_easi... - I've been considering embedding Julia (as opposed to LuaJIT) in a project I'm working on that requires fast, mostly numerical computation.

Based on https://docs.julialang.org/en/v1/manual/embedding/, this seemed fairly simple, and it looked as though I could avoid repeatedly paying the compilation-latency cost by just calling my embedded functions at least once before the performance-critical loop (to ensure they're compiled). Is that not the case? There's definitely still the high memory overhead to consider, but (admittedly without having tried it yet) embedding Julia doesn't seem too terrible to do. Worse than, e.g., compiling a shared object library, but not at all unreasonably hard.
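For reference, this is roughly what I had in mind (my reading of the embedding docs, untested; real code would also need GC rooting via JL_GC_PUSH and friends):

    #include <julia.h>

    int main() {
        jl_init();

        // Define the Julia function (in practice, evaluate/include your own code).
        jl_eval_string("f(x) = 2.0 * x + 1.0");
        jl_function_t *f = jl_get_function(jl_main_module, "f");

        // Warm-up call: pay the JIT-compilation latency for f(::Float64) once,
        // before the performance-critical loop.
        jl_call1(f, jl_box_float64(0.0));

        double acc = 0.0;
        for (int i = 0; i < 1000; ++i) {
            jl_value_t *ret = jl_call1(f, jl_box_float64(3.0));
            if (ret != NULL && jl_typeis(ret, jl_float64_type))
                acc += jl_unbox_float64(ret);
        }

        jl_atexit_hook(0);
        return acc > 0.0 ? 0 : 1;
    }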


That might very well be true! But it's still much less pleasant than just using a compiled binary from a static language.


Totally fair, yep. Thanks again!


There's also Wmail[1], which I've been using for the past few months. It's essentially just an Electron wrapper around the web app, with all that entails. It's worked fairly well for me, though, and enables easier inbox switching.

[1]: http://thomas101.github.io/wmail/


Huh. Thanks for posting this, I actually hadn't heard of it. It avoids some of my complaints about Boxy[1]; I'll have to give it a try.

[1]: http://www.boxyapp.co/


Great, I hope it works for you!


We're planning on it! We'll be doing a write-up of the whole process and stack, to be posted after the launch. We're actually using an analog transceiver designed for FPV use to transmit video to the ground, and a separate digital transceiver to send telemetry down and receive commands.


What platform/browser are you on?


Sorry, we meant for the text beneath the counter and the tweet feed to give details on the event. You'll be able to control where our camera points during the flight.


I was one of the students in this course. If you're interested, I would recommend reading the professor's "course wrap-up" (http://rust-class.org/pages/course-wrapup.html) - it discusses his philosophy going into the semester. In particular, it addresses the most common comment I've seen in this thread, namely the rationale behind choosing to focus on general systems coding rather than kernel hacking.

Personally, I found the course enjoyable and worthwhile. It was not what I was expecting, and I was admittedly disappointed that we didn't go much into actual bare-metal programming. That said, I found it to be a good introduction to the concepts of systems programming, with a few detours. As I am interested in OS coding, I'm performing my own independent study to learn the material which might have been covered in a more "traditional" OS course, but the course did a solid job of covering concurrency, memory allocation, etc.

As for the choice of Rust: I really like the language, and plan to continue using it for personal projects. I'm not sure it was quite mature enough for exclusive use this past semester (the 0.7 to 0.8 switch midway through threw the TAs for a bit of a loop), but I've been very impressed so far, and think that this coming semester of the course (which will also use Rust) should have a much better time of it. The one annoyance I had with Rust was the lack of documentation, but the Rust community (including a fair number of those in this thread) were incredibly helpful, and I'm currently working on a project to provide an alternate tutorial, using a series of small "stepping-stone" programs to build from "Hello, World" to a MapReduce implementation. Personally, I think I might have picked either Go or a blend of C and Rust for this past semester, but Rust is becoming a better choice as time proceeds.

I was also one of the members on the student team that made the ARM kernel in Rust. Due to the timing of the project (right in the middle of preparation for final exams and the rest of our end-of-semester project due dates) and some "lazily evaluated" work ethic on the part of our team, it was fairly rushed, and is much less complete and polished than I would like. I'm planning on continuing work on it. Though I would disagree with the assertion that separating Rust from its runtime is as "painless" as it has been said to be, it wasn't terribly difficult, and solutions for that particular problem are also constantly being improved.

