I didn't watch it (I read a plot summary), but I did watch what I think was Captain America or Captain America: The Winter Soldier, and they had a guy uploaded into a computer. That's earlier in the timeline, right? So it's proven doable in the universe they set up with the films.
How likely do you think it is that governments/central banks would keep away from inflating the currency they have complete control over, when it's normally in their best interest to print currency?
I haven't read your entire reply, but the top link has a graph relating CPI with M2, if I'm not mistaken. Isn't CPI a terrible measurement since it uses units that change constantly? A bit like measuring distance with a meter stick that changes in length every day.
> Isn't CPI a terrible measurement since it uses units that change constantly?
CPI's "units" are what people buy in day to day life. It has to change because it is a model of what is happening in the real world, and while imperfect that doesn't mean it's not useful:
As a Canadian, I can say StatCan regularly changes what's in the CPI basket of goods in my country, and that is a good thing. If they didn't, we'd still be looking at video rental prices (removed in 2015), 35mm film (2013), and, going back further, lard (1967).
Things added over the years: ride share services, ISP prices, mobile phone plans, etc.
The goods and services people use change over the years and decades of their lifetime, so why shouldn't the CPI (which can determine the wage raises they get to pay for living) reflect that reality?
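To make the "changing units" point concrete, here's a toy fixed-basket index in Python. This is purely my own illustration (made-up items, weights, and prices, not StatCan's actual methodology): swapping a dead item for a live one changes what is being measured, not the arithmetic.

```python
# Toy fixed-basket price index (illustration only, not StatCan's real method).
# Each item has a weight (share of spending) plus a base-period and current price.

def basket_index(basket, base_prices, current_prices):
    """Weighted average of price relatives, scaled so the base period = 100."""
    index = 0.0
    for item, weight in basket.items():
        index += weight * (current_prices[item] / base_prices[item])
    return 100.0 * index

# An old-style basket still carrying 35mm film...
old_basket = {"rent": 0.30, "groceries": 0.25, "gasoline": 0.15, "35mm_film": 0.05, "other": 0.25}
# ...versus an updated basket where that weight moved to a mobile phone plan.
new_basket = {"rent": 0.30, "groceries": 0.25, "gasoline": 0.15, "mobile_plan": 0.05, "other": 0.25}

base = {"rent": 1000, "groceries": 100, "gasoline": 1.20, "35mm_film": 8, "mobile_plan": 60, "other": 50}
now  = {"rent": 1100, "groceries": 110, "gasoline": 1.50, "35mm_film": 8, "mobile_plan": 75, "other": 52}

print(basket_index(old_basket, base, now))  # tracks an item nobody buys any more
print(basket_index(new_basket, base, now))  # tracks what people actually pay for today
```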
Colleges and universities normally teach students the basics of digital logic and electronics before diving into Verilog or VHDL. At least that's how it was when I was a student many years ago.
You can learn computer architecture without knowing an HDL but it helps.
In my experience (commercial ASIC design in the US), VHDL is more popular in colleges and Verilog is more popular in industry.
Whichever one you like - they are basically equivalent in terms of what they can do. Verilog is more like C while VHDL is more like Ada.
We're in a golden age of YouTube videos teaching you the basics of Verilog/VHDL, digital design, computer architecture, FPGAs, CPU implementation, etc. Starting in simulation (e.g., with Verilator and GTKWave) is a great idea because 1) it's free, 2) you get easy visibility into the low-level behavior of your design, and 3) simulation is an important part of hardware design, testing, and debugging.
Besides HDLs (which are great) you can also use block design tools; for example, hneemann's Digital (which was influenced/inspired by Logisim) looks like it could be a nice learning, development, and simulation tool - and it also exports to Verilog and VHDL. There are several web-based environments as well.
I've also recently run into some other HDLs which look interesting, such as Migen, which is used in the LiteX SoC framework and implements an HDL as a Python library. (It also has successors which I have not used.)
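To give a flavour of what "an HDL as a Python library" looks like, here is roughly the classic blinker example from the Migen tutorial, reproduced from memory (so treat names and details as approximate and check the Migen docs before relying on it):

```python
from migen import Module, Signal, If
from migen.fhdl import verilog

class Blinker(Module):
    def __init__(self, led, maxperiod):
        # Down-counter in the default 'sync' clock domain; toggle the LED when it wraps.
        counter = Signal(max=maxperiod + 1)
        self.sync += If(counter == 0,
            led.eq(~led),
            counter.eq(maxperiod)
        ).Else(
            counter.eq(counter - 1)
        )

led = Signal()
# Elaborate the design and emit plain Verilog for a downstream toolchain.
print(verilog.convert(Blinker(led, 30_000_000), ios={led}))
```

The appeal is that the hardware structure is built with ordinary Python, so parameters, loops, and generated logic come for free.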
And on a semi-related note, Patterson and Hennessy's "Computer Organization and Design (RISC-V Edition)" is an updated classic introduction to computer architecture (and its sequel is legendary). I also recently discovered Harris and Harris' "Digital Design and Computer Architecture (RISC-V Edition)", and its follow-on lab courses, RVfpga and RVSoC, which cover implementation on an FPGA board and/or a simulator.
A personal reason: I'm building a global simulator for shadows [1]. I started with SRTM-based elevation data, which allowed me to cast mountain shadows. I'm now offering some LiDAR data which also includes buildings, structures, and trees (~50 cm resolution). People use my website for sun-mapping their gardens, real estate, farms, events, photography, academic research, and solar systems. It would be nice to push that resolution up by another factor.
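For anyone curious what "casting mountain shadows from elevation data" boils down to, here's a heavily simplified sketch (my own illustration with made-up names, not the code behind the site): march from a DEM cell toward the sun and check whether any terrain rises above the sun ray.

```python
import math
import numpy as np

def in_shadow(dem, x, y, sun_azimuth_deg, sun_elevation_deg, cell_size_m, max_dist_m=20_000):
    """True if DEM cell (x, y) is shadowed by terrain in the sun's direction.

    dem: 2D numpy array of elevations in metres (rows indexed by y, columns by x).
    Simplifications: flat earth, nearest-neighbour sampling, +y assumed to point north.
    """
    az = math.radians(sun_azimuth_deg)
    tan_elev = math.tan(math.radians(sun_elevation_deg))
    dx, dy = math.sin(az), math.cos(az)   # unit step toward the sun
    z0 = dem[y, x]
    for i in range(1, int(max_dist_m / cell_size_m)):
        px, py = int(round(x + dx * i)), int(round(y + dy * i))
        if not (0 <= px < dem.shape[1] and 0 <= py < dem.shape[0]):
            return False                  # ray left the tile: assume lit
        # Terrain higher than the sun ray at this distance blocks the sun.
        if dem[py, px] > z0 + i * cell_size_m * tan_elev:
            return True
    return False

# Tiny synthetic example: a 500 m ridge east of the query point, with the sun low in the east.
dem = np.zeros((100, 100)); dem[:, 60:] = 500.0
print(in_shadow(dem, x=50, y=50, sun_azimuth_deg=90, sun_elevation_deg=10, cell_size_m=30))
```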
Making better maps seems like an obvious use case, and by "maps" I mostly mean technical documentation. I assume most EU countries require new building permits to come with precisely measured outlines; this could help increase their precision and/or make the measurements cheaper.
Sometimes we need to give awesome tools to creative people and see what they come up with, even when we don't understand the implications ourselves.
I think millimeter-accurate GPS is one of those tools. It has the power to enable so many things, things we cannot imagine without using the tool itself.
40 cm vs 1 mm is the difference between landing a quadcopter smoothly or crashing it into the ground.
20 cm vs 1 mm is the difference between a robot navigating through a door or crashing into the wall.
20 cm vs 1 mm is the difference between mowing the lawn or cutting through your flower bed.
Unfortunately, it doesn't look like we'll be getting millimeter-accurate GPS anytime soon. The Genesis satellite might be a prerequisite, though.
- The satellite will accomplish this [precision] by having the usual main Earth-measuring techniques co-located on board [satellite navigation, interferometry, laser ...]. When used together, the ESA expects to be able to correct for biases inherent in each technique.
- An updated International Terrestrial Reference Frame (ITRF) will have immediate benefits for satellite-based systems, impacting Galileo-enabled applications in fields like aviation, traffic management, autonomous vehicles, positioning and navigation.
- The space agency added that meteorology, natural hazard prediction, monitoring climate change effects, land management and surveying – as well as the study of gravitational and non-gravitational forces as fields – would also see benefits.