Having electric cars just moves the pollution to another place, where the power is generated. Forests and trees are the filter of the planet; if you don't have enough of them, the pollution stays no matter what.
Having electric vehicles moves power production, and the resulting emissions, elsewhere. That said, it decreases the total amount of emissions, since grid power is generally cleaner. You still keep creating more parks and planting/sustaining forests. This is all part of a multi-pronged assault on climate change. You are presenting a false choice, as if it were only one or the other.
> You are presenting a false choice, as if it were only one or the other.
Exactly my concern: the pollution problem should not be turned into an electric-car sales booster. It is a wider problem that needs to be tackled from different directions, and I don't hear anybody making a wider analysis or plan.
A phone app could replace this, right? Using the front-facing camera, you could run some image recognition to count the flow of people and show the statistics on screen. Put the phone on a stand and leave it running. Much cheaper.
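For a sense of how little is needed, here is a rough Python/OpenCV sketch of per-frame person counting using the library's built-in HOG pedestrian detector. The camera index, resolution and detector settings are just illustrative, and a real app would track people across frames to count entries/exits rather than just reporting a per-frame headcount:

    # Rough sketch of camera-based people counting with OpenCV's built-in
    # HOG pedestrian detector. Camera index, resize resolution and stride
    # are illustrative; a real app would track detections across frames to
    # count entries/exits instead of per-frame headcounts.
    import cv2

    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    cap = cv2.VideoCapture(0)  # front-facing / default camera
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            frame = cv2.resize(frame, (640, 480))
            # Detect people in the current frame; returns bounding boxes + scores.
            boxes, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
            print(f"people visible right now: {len(boxes)}")
    finally:
        cap.release()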
Their whole selling point seems to be that their hardware + software can count people accurately without needing a camera. The premise being that cameras spook people.
Cameras are already commonplace enough where I work, for security purposes. So not sure how easily cameras can be eliminated.
The dashboards they provide don't seem to need to be coupled to the kind of ($850) hardware they are selling. Like you said, cameras should be able to do the job.
I don't think cameras will be displaced / eliminated. It's just a question of whether or not they will be accepted as a form of active surveillance or remain a method of security. Today, they are largely security but it's entirely possible the world moves toward mass surveillance.
Many have tried / are trying this. It's harder to pull off than it seems. Privacy is a big part of it, but distributing the infra to do this at scale is nontrivial. A number of years back, Placemeter used to pay you 50/mo to install your Android device on a window sill so they could understand movement. It never quite took off.
MAC address tracking is one approach, but it's imprecise, and with the proliferation of "things with antennas" you have to do a lot of reconciling on the backend to not count 1 person as 3 when they have multiple devices on them. Euclid Analytics tried this. It's a common but flawed approach to counting (depending on the use case).
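To illustrate the reconciliation problem, here is a toy sketch (entirely made-up data and thresholds, not Euclid's method) that merges MACs whose sighting windows overlap heavily, on the assumption that they ride in the same pocket:

    # Toy illustration of the "1 person counted as 3" problem: merge MAC
    # addresses whose sightings overlap heavily in time, on the (weak)
    # assumption that they travel together in one pocket. Real systems
    # also have to deal with MAC randomization, which this ignores.
    from itertools import combinations

    # sightings: MAC -> set of minutes-of-day in which the device was seen
    sightings = {
        "aa:aa:aa:aa:aa:01": {600, 601, 602, 603, 604},  # phone
        "aa:aa:aa:aa:aa:02": {600, 601, 602, 604},       # watch, same pocket
        "bb:bb:bb:bb:bb:03": {610, 611, 612},            # someone else
    }

    def overlap(a, b):
        """Jaccard overlap of two sighting sets."""
        return len(a & b) / len(a | b)

    # Start with each MAC as its own "person", then merge pairs whose
    # overlap exceeds a threshold (union-find style).
    person_of = {mac: mac for mac in sightings}

    def find(mac):
        while person_of[mac] != mac:
            mac = person_of[mac]
        return mac

    for m1, m2 in combinations(sightings, 2):
        if overlap(sightings[m1], sightings[m2]) > 0.6:
            person_of[find(m1)] = find(m2)

    people = {find(mac) for mac in sightings}
    print(f"devices seen: {len(sightings)}, estimated people: {len(people)}")
    # -> devices seen: 3, estimated people: 2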
Have you looked at how much stuff is packed in those AirPods? There’s certainly not enough room for a screwable cap and a consumer-safe lithium battery, while retaining rigidity and weatherproofing.
Besides, how many AirPods are likely to be dying from poor battery life compared to being lost, damaged or upgraded?
If this did happen, I suspect there would be more instances of people losing the battery cap, or throwing it away because of undiagnosed battery terminal faults, or of children dying because they swallowed the impossibly small pill shaped battery...
You could screw something on there. Just make the part that hangs outside your ear 2 mm thicker; it wouldn't be in your ear. Apple wants those things to fail: the battery life runs out and isn't replaceable. It's completely obvious.
Native apps are generally a lot easier to scrape since they rely on an API which can't be changed willy-nilly without breaking compatibility with older apps. Also, you can't do CAPTCHAs etc. on API calls in the same way you can on websites. And of course the data is neatly formatted to be machine-readable.
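The endpoint, parameters and response shape below are hypothetical, but the pattern is typical of why app APIs are easy to scrape: replay the request the app makes (usually discovered with a proxy like mitmproxy) and you get clean JSON back, with no HTML parsing or CAPTCHA in the way:

    # Hypothetical example of scraping a native app's backend API:
    # replay the app's request and consume the already-structured JSON.
    import requests

    resp = requests.get(
        "https://api.example.com/v2/listings",   # hypothetical endpoint
        params={"page": 1, "per_page": 50},
        headers={"User-Agent": "ExampleApp/3.1 (iPhone; iOS 17.0)"},
        timeout=10,
    )
    resp.raise_for_status()

    for item in resp.json().get("listings", []):  # already machine-readable
        print(item.get("id"), item.get("title"))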
The Podcasts and TV apps are native Catalyst (iPad apps on macOS) apps.
The Music app is still some sort of Frankenstein iTunes thing. The Library section views are native; the Apple Music section is still web. It is still slow compared to Spotify, but much better than it was in iTunes.
Bear in mind that humans don't generally get involved with the lower levels of chip design these days; the design is maintained at a fairly high level, with gate placement and routing tending to be driven by genetic algorithms. (This is why, when you look at the dies of new microcontrollers/processors, they tend to look less ordered than dies from the 70s and 80s.)
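As a very loose illustration of the idea (nothing like what real EDA tools do), here is a toy mutation-only genetic search that places a handful of gates on a grid to minimize the total Manhattan wirelength of a made-up netlist:

    # Toy genetic-style placement: assign each gate a slot on a small grid,
    # keep the layouts with the shortest total wirelength, and mutate them
    # by swapping gate positions. Illustrative only.
    import random

    GRID = 4                                    # 4x4 grid of slots
    GATES = list(range(8))                      # 8 gates to place
    NETS = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 7), (7, 0)]

    def random_placement():
        # gate i sits in grid slot placement[i]
        return random.sample(range(GRID * GRID), len(GATES))

    def wirelength(placement):
        total = 0
        for a, b in NETS:
            ax, ay = divmod(placement[a], GRID)
            bx, by = divmod(placement[b], GRID)
            total += abs(ax - bx) + abs(ay - by)
        return total

    def mutate(placement):
        child = placement[:]
        i, j = random.sample(range(len(GATES)), 2)
        child[i], child[j] = child[j], child[i]  # swap two gate positions
        return child

    population = [random_placement() for _ in range(50)]
    for _ in range(200):                         # generations
        population.sort(key=wirelength)
        survivors = population[:10]              # keep the 10 best layouts
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(40)]

    best = min(population, key=wirelength)
    print("best wirelength:", wirelength(best))

Real placers optimize millions of cells under timing and congestion constraints, which is why the resulting layouts look far less regular than hand-drawn 70s-era dies.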
For chip design specifically, that's rather tricky. In my experience, it generally requires a graduate EE degree. However, there are a lot of affiliated roles if silicon interests you. With the rise of SoCs and other digitally controlled parts, there is a bigger demand for supplemental software for validation and configuration. You can also get into an Applications Engineering role generally with an undergrad EE or CE degree.
> What would be the career path for a job in chip design and manufacturing?
Very long and very lucky.
I looked into microelectronics when I was a teenager. In 2009, you needed to be at the top of your class just to get an internship at a fab or a fabless company.
In general, there is more automation every year on both the design and the fab side.
Companies have the luxury of employing people with 20-year careers at the price of an average software dev in California.
The only exception is, of course, China: their fabs offer six-figure salaries for senior engineers, but the work experience is famously bad.
All of them expect foreign specialists to arrive with some "magic trick" solution for whatever isn't working in their business, and of course they get pissed off if told there are no magic tricks in this business.
I diverged from a very similar career path due to layoffs spurred by my employer being acquired. The remaining HW design team was told to promptly relocate to the Bay Area, whereupon they were gradually weeded out by extremely heavy workloads. Those who made it were then told to promptly relocate to SE Asia. The layoff, in hindsight, seemed the most appealing option.
That's the market signalling low demand, and a dying field. I'm so glad I didn't get that gig out of college at Altera and got diverted into software by market forces.
Is EE actually the correct field to study for formal verification? I looked into this at one point, and it seemed very much within the realm of computer scientists.
For instance, our EE program does not include even basic computability theory, the formal logic/graph theory coverage was pretty minimal, and besides some K-maps there was zero Boolean function theory.
Compare this with the CS program, which contains multiple very in-depth courses on exactly those topics, which seem pretty critical for formal verification of hardware.
You could go the computer engineering route. It's a nice combination of EE and CS classes that will give you electrical theory experience that a CS major will not get, and more experience with digital/software design than an EE will get. Not every university offers it. There is usually a large focus on writing C and Verilog/VHDL in the higher-level classes. If you can't pursue a computer engineering degree at your school, an EE degree with a heavy focus on digital design will get you close. If you can, try to get approval to take CS classes that expose you to more FPGA work, if your school's CS track offers that. There are a bunch of free online MIT classes for this kind of stuff. One thing that helps: go out and buy a cheap Xilinx or Altera FPGA and start messing around on your own. Learn how to write fast assembly for MIPS or 8051 and fast C for RISC-V or ARM processors.
MSCoE here. FPGAs have limited applicability due to their high cost per unit and relatively low clock speeds. They were all the rage ~15 years ago during my grad study, but then flattened out. That said, it does look like they're bouncing back, likely due to newfound popularity for mining and cheaper/better product lines coming out. Cheap embedded stuff based on MIPS, ARM, even mobile processors like NVidia Tegra, will probably enjoy an order of magnitude broader market footprint, however.
I work in aerospace, where FPGAs are the dominant chip. Everything from basic microprocessors to complex SDRs runs on them. Xilinx is the big one, but there are smaller shops making more application-specific chips.
The EE degree is only there to make you employable as an EE. You'll have to teach yourself formal methods on your own. That's why it's a way to stand out.
"You have to teach yourself on your own" is always a good idea, but it would be an immense waste of time not to use the classes that your educational institution provides. Double majoring exists specifically for this purpose.
Read lots of books, make simple electronic toys. Read some more books, hang out on forums, visit some conferences. Read more, talk to specialists. Understand the bleeding edge in some area. Find an idea there, start a YC startup in chips, gather a team, and dive in head-first in a CTO role.