I'm so lost. The guy decided to pick up his phone from the floor while driving at high speed.
1. It could have been ANY car with similar auto-steer capabilities at the time.
2. Why the hate? Because of some false promise? Because as of today the same car would save the guy in the exact same situation, since FSD now handles red lights perfectly. Far better and safer than ANY other tech included in the average car of the same price segment ($40-50k).
Not sure if it's using the same FSD decision matrix, but last night my Model S chimed at me to drive into the intersection while I was sitting at a red light, with absolutely zero possibility that it saw a green light anywhere in the intersection.
"Perfectly" isn't a descriptor I would use. But this is just anecdotal.
Another name for a "false promise" made for capital gain is "fraud". And when the fraud is in the context of vehicular autonomy, it becomes "fraud with reckless endangerment". And when it leads to someone's death, that makes it a "proximate cause of manslaughter".
As the source article says, the jury did agree that the driver was mostly liable. They found Tesla partially liable because they felt that Tesla's false promise led to the driver picking up his phone. If they'd been more honest about the limitations of their Autopilot system, as other companies are about their assisted driving functionalities, the driver might have realized that he needed to stop the car before picking up his phone.
This is literally one of the 1-3 companies that have a decent strategy in the age of AI; the rest are pretending the changes will not affect them. Even this judgement:
The guy decided to pick up his phone while driving a car not capable of red-light detection. It could have been any other car with similar auto-steer capabilities. Right now the same car, with OTA updates, would keep him alive. Sure, they are doing something wrong.
Is it responsible to let users run auto speed and auto lane-keeping on a high-speed highway without the other autopilot features?
Roll out both technologies at scale, and try to guess which one will cause more harm, given the fact that there will be users in both cars trying to put their legs up on the steering wheel:
A stupid tech that will not even try to do the safe thing, or
Software that is, let's say, 4x less safe than the average human but still very capable of maneuvering without hitting obvious walls, etc.?
Giving people more ways to shoot themselves in the foot does not improve safety.
I find the entire thing a kind of dark pattern: the system, along with misleading marketing, makes you lax over time just to catch you off guard.
You get used to the system working correctly, and then, when you least expect it, it does the unthinkable, and the whole world blames you for not supervising a beta software product on the road on day 300 with the same rigour you did on day one.
I can see a very direct parallel with LLM systems. Claude had been working great for me until one day it git reset the entire repo, and I lost two days of work because it couldn't revert a file it had corrupted. This happened because I supervised it just like you would supervise an FSD car in "bypass" mode. Fortunately it didn't kill anyone; just two days of work lost. If there were a risk of someone being killed, I would never allow a bypass/FSD/supervised mode, regardless of how unlikely that is to happen.
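Ever since, I put a trivial guard between the agent and my shell. A minimal sketch of the idea, assuming the agent hands over each command as a string before running it (the names here are mine, not any real Claude API):

```python
import shlex

# Hypothetical guard for an agent's shell access: destructive git commands
# are held for a human instead of being executed. DENYLIST and these
# function names are my own invention, not any real agent API.
DENYLIST = [
    ("git", "reset"),     # can discard uncommitted work
    ("git", "checkout"),  # can silently overwrite modified files
    ("git", "clean"),     # deletes untracked files
    ("rm",),              # obvious
]

def is_destructive(command: str) -> bool:
    """True if the command starts with a denylisted prefix."""
    tokens = shlex.split(command)
    return any(tuple(tokens[:len(p)]) == p for p in DENYLIST)

def run_agent_command(command: str) -> None:
    if is_destructive(command):
        # Don't execute; surface the command to the human instead.
        print(f"BLOCKED, needs human confirmation: {command}")
    else:
        print(f"ok: {command}")  # a real version would subprocess.run() here

run_agent_command("git status")                # ok
run_agent_command("git reset --hard HEAD~10")  # BLOCKED
```

Crude, but like a driver-attention check, it only has to catch the one command that wipes two days of work.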
They have very good guardrails to prevent that, unlike auto-lane etc.
Teslas have sensors, eye trackers, etc. Is it possible to shoot yourself in the foot? Sure. But not in any way different from a human doing irrational things in the car: makeup, arguing, love, etc.
A human being is an irrational creature that should not drive, except for fun in an isolated environment. Tesla or Waymo or anyone else... it is good to remove humans from the road, the faster the better.
>> It is good to remove humans from the road, the faster the better.
I'm all for this, but not to replace dumb people with dumb software. I think FSD should be treated more like airplane safety. We have the opportunity to do this right, not just in the cheapest way we can get away with.
Well, if you don't read news that tries to panic about everything new, that's more or less exactly how people currently use FSD.
When I'm driving and I want to drink, eat, etc., instead of doing the weird one-handed tricks every driver has done, I just turn on FSD and let it drive. When I'm tired, I do the same. Again, the attention monitoring works really well; it doesn't let you sit on the phone, unlike many other cars with less advanced features. You can't be on FSD + phone, but you can easily be on the phone + lane control in another car.
The phone is by far the biggest real killer of people, and nobody is trying to create a campaign against phone mounts, etc.
As a software developer I only really started to understand how divorced from reality Elon is when he started talking about software development. That was a sinking feeling: "Oh he doesn't actually know something I don't know. He's just keeping the music playing, so he doesn't have to sit down on the chairs that don't exist. Fuck."
The only reason for source code to exist is for humans to read it, but when source code gets churned out (by AI agents) in too large a quantity for any human to realistically read and analyze, what's the point of having source code in the first place? Generating binaries directly simply makes sense. So does working with binaries, even when a human is involved, as long as there's an AI helper as well. The human can simply ask the AI assistant to explain whatever logical aspects lie behind the binary code and instruct the AI agent to modify the binary directly, if necessary. That may be scary and not easy to accept. Going further with this idea, even written text may become "too costly to work with" once there's an AI agent to verbally or graphically serve the human whatever informational aspect of a given text is of interest in a given situation.
LLMs are trained on source code, so that's what they can (barely) write. Decompiling is a *lossy* operation, which means that training directly on its output would carry much less information and would be a nightmare if anyone (human or LLM) needs to debug.
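A toy way to see that lossiness, using Python bytecode (a much milder case than machine code, which after stripping doesn't even keep names):

```python
import dis

def gross_margin(revenue: float, cogs: float) -> float:
    # This comment and the formatting exist only for the human reader;
    # none of it survives compilation.
    profit = revenue - cogs
    return profit / revenue

# The disassembly is just stack operations plus the bare names the
# interpreter still needs. Intent, comments, and layout are already gone,
# and a native binary discards far more than this.
dis.dis(gross_margin)
```

Training on the compiled side of that mapping means training on strictly less information.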
Real business people keep running production lines if they are profitable and build additional production lines for new businesses. Real business people do not shut down profitable lines because they are making a pivot; that is what failing startups do.
EVs are an even bigger threat now if you're outside the regulated bubble in the US. Everywhere else, China dominates the market with cheaper and cheaper EVs, while EU/US automakers fail to compete. Replace Tesla with China.
```
The incidents included a collision with a fixed object at 17 miles per hour, a crash with a bus while the Tesla vehicle was stopped, a crash with a truck at four miles per hour, and two cases where Tesla vehicles backed into fixed objects at low speeds.
```
So in reality there was one crash with a fixed object; the rest are... questionable, and not crashes the way you portray them. Statistics like that would not even make it into human-driver reports, since they'd be filed under non-driving incidents, parking-lot mishaps, etc.
What dataset? Doesn't the article clearly specify a different number?
Your context sucks, and it's as good as a lie.
>Waymo reports 51 incidents in Austin alone in this same NHTSA database, but its fleet has driven orders of magnitude more miles in the city than Tesla’s supervised “
You are talking about 5 incidents; that is not statistics. It's just a fluctuation of random numbers, plus random events like a bus hitting the taxi while it was idle. That one alone means 20% of your data is incorrect, lol, since it's 1 out of 5.
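Back-of-the-envelope on why 5 events is just noise, using the textbook exact Poisson interval (my own sketch, nothing from the article):

```python
from scipy.stats import chi2

# Exact (Garwood) 95% confidence interval for a Poisson count k:
#   lower = chi2.ppf(0.025, 2k) / 2,  upper = chi2.ppf(0.975, 2(k+1)) / 2
k = 5
lower = chi2.ppf(0.025, 2 * k) / 2
upper = chi2.ppf(0.975, 2 * (k + 1)) / 2
print(f"observed {k} incidents -> plausible true mean: {lower:.1f} to {upper:.1f}")
# observed 5 incidents -> plausible true mean: 1.6 to 11.7
```

A roughly 7x range on the true rate, before even dividing by miles driven; counts this small can't distinguish "twice as safe" from "twice as dangerous".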
So far, you can clearly tell:
1. Tesla works decently in a limited environment with no crazy patterns.
2. It's a limited environment, which means nothing. Scale is still not there. They need to prove themselves.