I don't know that there's any great solution here. Something that works well most of the time is going to lull drivers into not paying attention. Calling it autopilot probably doesn't help.
Tesla is either going to have to hobble it enough that drivers pay attention, or significantly improve it, if that's possible.
It seems obvious that there's room for improving a system that can't detect an object that's 75 feet long and 13 1/2 feet high directly in front of it.
Self-driving cars are decades away from reality, and self-driving trucks are even further away.
If only there were some other way to notice solid objects, instead of relying on statistical analysis of video frames relative to one another, over time, in real time.
If only we could figure out how to detect solid objects, but alas, such a difficult problem. So hard to implement that no one should hold their breath in hopeful anticipation that a carmaker might add solid object detection to their sensor array package, so that cars can do the most important thing everyone expects from them.
But, you know, when you make a self-driving car, hazard avoidance and collision detection actually turn up much lower on the list of priorities than you'd expect.
(in driver's ed, we learned not to overdrive our headlights: not to drive so fast that our stopping distance extended beyond our visual field of good lighting surrounded by darkness. Meaning that should an object suddenly appear in the light cone as you approach it, your human reaction time, your tires' capacity for friction against the road surface, and the total inertia of the car in motion should all remain in the cognitive focus of your situational awareness as you drive at night; without this, as a human, you are not a safe driver;
and yet robots are judged by different standards, even though, for them, it is always nighttime, and the world is perpetually a cloudy environment shrouded in mysterious darkness; too bad that's not what sells cars)
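The "overdriving your headlights" rule above is just arithmetic, and it's worth spelling out. A rough sketch, with illustrative numbers (reaction time, friction coefficient, and low-beam reach are all assumptions, not measured values):

```python
# Hedged sketch of the driver's-ed rule: you're overdriving your
# headlights when stopping distance exceeds how far you can see.
def stopping_distance_m(speed_mps, reaction_s=1.5, friction=0.7, g=9.81):
    """Reaction distance plus idealized braking distance on dry pavement."""
    reaction = speed_mps * reaction_s
    braking = speed_mps ** 2 / (2 * friction * g)
    return reaction + braking

speed = 30.0   # ~67 mph, in m/s
sight = 60.0   # rough low-beam reach in metres (assumption)
needed = stopping_distance_m(speed)
print(f"need {needed:.0f} m to stop, can see {sight:.0f} m")
print("overdriving headlights" if needed > sight else "ok")
```

At highway speed the numbers come out to well over 100 m needed versus roughly 60 m visible, which is exactly why the rule exists.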
> If only we could figure out how to detect solid objects, but alas, such a difficult problem.
Can’t we just put signs on everything?
By placing an industry-standardised logo on the corners of vehicles, and giving them away free as stickers / screw-on signs, we could identify anything that ought to be considered solid and necessary to avoid, regardless of its velocity.
In my opinion we aren’t going to see Level 4+ autonomous vehicles until we start including their requirements in the design of our built environment.
No. Your plan shifts the responsibility for avoiding collisions away from the “autonomous” vehicle (where it belongs) to everything else in the world. If these vehicles can’t operate in the world as it exists, then they should not be allowed to operate autonomously.
LIDAR would help a lot. The radar Teslas have lacks the angular resolution to distinguish between a stationary object next to the road and a stationary object in the middle of the road. Tesla claims that CV will eventually pick up the slack, but I'm pretty damn skeptical of that.
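The angular resolution point is easy to make concrete with a back-of-envelope calculation. The beamwidth figures below are ballpark assumptions for typical automotive sensors, not Tesla specs:

```python
import math

# Rough cross-range footprint of one angular resolution cell at a
# given range: footprint = 2 * R * tan(beamwidth / 2).
def cross_range_m(range_m, beamwidth_deg):
    """Width of ground covered by one angular resolution cell."""
    return 2 * range_m * math.tan(math.radians(beamwidth_deg) / 2)

LANE_WIDTH = 3.7  # typical US highway lane, metres

radar = cross_range_m(100, 5.0)   # automotive radar: a few degrees azimuth
lidar = cross_range_m(100, 0.1)   # scanning lidar: ~0.1 degree

print(f"radar cell at 100 m: {radar:.1f} m wide")
print(f"lidar cell at 100 m: {lidar:.2f} m wide")
# A radar cell several lanes wide can't say whether a stationary return
# is in your lane or on the shoulder; the lidar cell easily can.
```

With those numbers, the radar's resolution cell at 100 m is wider than two lanes, which is the whole problem with stationary objects.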
Consider that Google/Waymo seems a lot more serious about CV research than Tesla, and yet their prototypes all use LIDAR and CV sensor fusion. If Waymo thinks LIDAR is important, why should I trust Tesla when they say it isn't? And remember there is a strong profit motive for Tesla to downplay the necessity of LIDAR: LIDAR is expensive and bulky right now, but Tesla wants to advertise their cars as containing all the hardware necessary for self-driving so they can profit from the automation hype. Waymo uses LIDAR because they're trying to make it function while Tesla scoffs at LIDAR because they're trying to sell cars with hype.
I disagree on the grounds that it already happened. Perhaps survivor bias is obscuring the fact that Teslas alone have already driven roughly a billion miles on autopilot. Perhaps we don't hear about those because they are uneventful.
Yes, there will be bugs (all software has bugs), and they will decline over time. Meanwhile, the accident rate of autopilot is probably better than that of human drivers.
The numbers for autopilot are a lot more complicated than that, and you cannot claim it is statistically safer based on the data we have. But if Tesla did have data showing autopilot is safer, I'm sure they'd be ecstatic to share it with the public. Instead they're burying the data and it can only be obtained after years of legal battles.
Fair enough, though I'm more concerned about the claims for fully autonomous vehicles without driver controls. How many times in those billion miles did a human driver intervene? Or rather, would it have been two billion miles on autopilot if the human driver had not had to take control?
Key point is that the autopilot will get better over time, while the humans will not. This is one of those questions that is best answered by looking at the first derivative rather than the quantity in question.
I've driven several thousand miles with no hands, only my knee. Of course that number is deceptive because it doesn't tell you how frequently I intervened with my hands.
I completely disagree that they are decades away. I'd be shocked if there weren't completely autonomous vehicles operating somewhere in the U.S.A. within 5 years. No idea why you think trucks are further off, either.
Trucks are further because a) they are heavier and require much more distance to stop, b) they are longer and require much more room to turn and maneuver, and c) they are much more likely to carry large quantities of hazardous materials. Driving a truck safely is a lot more complex than driving a car safely.
There may be a perception that trucks will happen sooner because you could focus on interstate highways only, with "ports" of some kind near on/off ramps. It would have value, and it would still keep the vehicles on high-quality, well-marked and well-maintained roads with few intersections.
And the rear of a dry van truck is 8 1/2 feet wide by 13 1/2 feet high, so if it can't detect that, how is it supposed to avoid another car, let alone a pedestrian?
This case (truck perpendicular to travel path) should literally be the easiest test case for object detection, other than larger stationary objects like buildings and bridges.
I seem to recall that the last time this scenario (truck across the path of travel, although in that case it wasn't moving) led to an accident, the Tesla mistook it for a bridge.
I suspect this is going to be a perpetual problem for vehicles without lidar, but I'm not particularly clueful.
Tesla is pushing the bounds of the signal they can extract from the forward-facing radar (after the first Autopilot trailer accident, which resulted in a death).
That's interesting. I can imagine, lacking real depth information, why that would happen. Especially for specialty trailers, like the ones that pull logs or wind turbine blades... where there's a large open area under the cargo.
> Something that works well most of the time is going to lull drivers into not paying attention.
That's my biggest objection: I want either 100% disengagement or nothing. A system that wants, but doesn't require, you to pay attention, and allows you to check out, is absurd. It's hard enough sometimes reacting to things when you are driving, but a system that lets you stop paying attention yet might demand immediate responses to situations it can't handle? Hell no.
We've had our Tesla since November, and it has given us two 30-day trials of Autopilot; the "no thanks" is so obfuscated that it's easier to say yes and then opt out after it enables it.
Some of the features, like Intelligent Cruise Control, would be nice, but no way in hell do I want the half-assed autopilot feature.
Stop marketing Tesla's Level-2 ADAS as 'AutoPilot', maybe? It would be a pretty radical change, but it would get rid of what is by far the most common misconception about it: namely, that it allows for even momentary distraction by the driver (as the "autopilot" on an aircraft would). In fact, it does not. On the flip side, they could fairly reintroduce the AutoPilot name when they reach at least Level 3 on the ADAS scale (allowing for momentary distraction in highly selective conditions; for a start, when the car is basically restricted to moving at pedestrian-safe speeds anyway, due to intense congestion or other reasons).