
Tesla Autonomy Day Autopilot Full Self Driving With Model 3 Explained

So we're already at 770 million miles of Navigate on Autopilot, which is really cool, and I think one thing worth highlighting is that we continue to accelerate and learn from this data. As Andrej talked about with the data engine, as this accelerates we're actually making more and more assertive lane changes, and we're learning from the cases where people intervene, whether it's because we didn't detect the exit correctly or because they wanted the car to be a little bit more joyful in different environments, and we just want to keep progressing.

So to start all of this off, we began by trying to understand the world around us. We've talked about the different sensors in the vehicle, but I want to go a little bit deeper here: we have eight cameras, but we also have 12 ultrasonic sensors, a radar, a GPS, and an inertial measurement unit, and then one thing we often forget is that we also have the pedal and steering inputs. So not only can we see what's happening around the vehicle, but also how humans chose to interact with that environment. I'll talk over this clip right now.
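
As a rough illustration of that last point, here is a minimal Python sketch of what one logged data record could look like if the driver's pedal and steering inputs are captured alongside the sensor readings, so that human behavior can double as supervision. The class names, field layout, and the label_from_driver helper are assumptions made for illustration, not Tesla's actual logging format.

from dataclasses import dataclass
from typing import Dict, List

@dataclass
class DriverInputs:
    """What the human did at this instant (hypothetical field names)."""
    steering_angle_deg: float    # signed steering-wheel angle
    accel_pedal_pct: float       # 0-100 accelerator position
    brake_pedal_pct: float       # 0-100 brake position
    turn_signal: str             # "left", "right", or "none"

@dataclass
class SensorSnapshot:
    """One timestamped slice of everything the vehicle senses, plus how
    the driver chose to act in that same environment."""
    timestamp_us: int
    camera_frames: Dict[str, bytes]    # 8 cameras, keyed by camera name
    ultrasonic_ranges_m: List[float]   # 12 ultrasonic distances
    radar_targets: List[dict]          # forward radar returns
    gps_lat_lon: tuple
    imu_accel_gyro: tuple
    driver: DriverInputs               # pedal and steering inputs

def label_from_driver(snapshot: SensorSnapshot) -> dict:
    """The key idea: driver inputs double as free supervision, e.g. a
    signalled lane change becomes a training signal for the policy."""
    return {
        "wanted_lane_change": snapshot.driver.turn_signal != "none",
        "was_braking": snapshot.driver.brake_pedal_pct > 5.0,
    }
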
This basically shows what is happening in the car today, and we will continue to push this. We start with a single neural network and look at the detections around it, then we build all of that together from multiple neural networks and multiple inferences, bring in the other sensors, and turn it into something called vector space, an understanding of the world around us. As we continue to get better and better at this, we are increasingly moving this logic into the neural networks themselves, and the obvious endgame here is that the neural network looks across all the cameras, gathers all the information, and ultimately generates a source of truth for the world around us. This is actually not an artistic representation in many ways; it is actually the output of one of the debugging tools that we use on the team every day to understand what the world around us looks like.

Another thing that is really exciting to me: when I hear about sensors like lidar, a common question is about having additional sensory modalities, like why not have some redundancy in the vehicle, and I want to delve into one thing that isn't always obvious about the neural networks themselves. We have one neural network running on our wide fisheye camera, for example; that neural network doesn't make one prediction about the world, but rather it makes many separate predictions, some of which actually audit each other. As a real example, we have the ability to detect a pedestrian, something we train very carefully and work hard on, and we also have the ability to detect obstacles on the road.
A pedestrian is an obstacle, and it shows up differently to the neural network; it says, oh, there's something we can't drive through, and these together combine to give us a better idea of what we can and can't do in front of the vehicle and how to plan for it. Then we do it through multiple cameras, because we have overlapping fields of view in many places around the vehicle, and in front we have a particularly large number of overlapping fields of view. Lastly, we can combine that with things like radar and ultrasound to build extremely precise understandings of what's happening in front of the car. We can use that to learn future behaviors that are very precise, and we can also build very precise predictions of how things will continue to unfold in front of us. One example that I think is really exciting is that we can look at cyclists and people and not just ask where they are now, but where they are going.
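
To make the cross-auditing idea concrete, here is a minimal Python sketch of how two prediction heads from the same camera network could check each other: a dedicated pedestrian head and a generic obstacle head, where a confident pedestrian that the obstacle head fails to corroborate gets flagged, and the planner treats the union of both as non-drivable. The Detection class, the one-metre association radius, and the function names are illustrative assumptions, not the actual network outputs.

from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    x: float       # position in the ground plane, metres
    y: float
    score: float   # confidence in [0, 1]

def same_object(a: Detection, b: Detection, radius_m: float = 1.0) -> bool:
    """Crude association: two detections refer to the same thing if they
    land within radius_m of each other."""
    return (a.x - b.x) ** 2 + (a.y - b.y) ** 2 <= radius_m ** 2

def audit_heads(pedestrians: List[Detection],
                obstacles: List[Detection]) -> dict:
    """Cross-check two task heads from one camera network. A pedestrian is
    also an obstacle, so every confident pedestrian detection should be
    corroborated by the generic obstacle head; disagreements are flagged
    (e.g. to trigger a clip upload), and planning stays conservative by
    treating the union of both heads as regions we cannot drive through."""
    unconfirmed = [p for p in pedestrians
                   if p.score > 0.5
                   and not any(same_object(p, o) for o in obstacles)]
    return {
        "flag_for_review": unconfirmed,
        "blocked_regions": obstacles + pedestrians,
    }
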
This is actually at the heart of our next-generation automatic emergency braking system, which will not only stop for people who are in your path but for people who will be in your path, and that is running in shadow mode right now and will go out to the fleet this quarter. I'll talk about shadow mode in a second.

So when you want to start a feature like this, Navigate on Autopilot on the highway system, you can start by learning from the data and simply observing how humans do things today: what their assertiveness profile is, how they change lanes, what causes them to abort or change their maneuvers. You can see things that aren't immediately obvious, like, oh yeah, simultaneous merging is rare but very complicated and very important, and you can start generating opinions on different scenarios, like a vehicle overtaking quickly. So this is what we do: when we initially have some algorithm that we want to test, we can put it in the fleet and see what it would have done in a real-world scenario, like this car that is passing us very quickly, and this is taken from our real simulation environment, showing the different paths we have considered taking and how they overlap with the real-world behavior of a user. When you tune those algorithms and feel good about them, and this is really taking that out of the neural network, putting it into that vector space, and building and tuning these parameters on top of that, ultimately with more and more machine learning over time, you move out to a controlled rollout, which for us is our early access program. This is getting it out to a couple thousand people who are really excited to give us very thoughtful and very useful data.
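
As a rough picture of what running an algorithm in shadow mode could look like, here is a small Python sketch: the candidate policy is evaluated on logged frames but never actuates, and only its disagreements with what the human driver actually did are kept for later analysis. The Frame schema, the action strings, and the example policy are hypothetical placeholders, not the fleet software.

from dataclasses import dataclass
from typing import Callable, Iterable, List

@dataclass
class Frame:
    """One logged moment: the planner's view of the scene plus what the
    human driver actually did next (hypothetical schema)."""
    scene: dict          # vector-space objects, lanes, gaps, ...
    human_action: str    # e.g. "keep_lane", "lane_change_left", ...

def shadow_evaluate(policy: Callable[[dict], str],
                    log: Iterable[Frame]) -> List[Frame]:
    """Shadow mode, roughly: the candidate policy runs alongside the human
    but never controls the car. We record only the frames where it would
    have acted differently from the driver; those clips feed the data
    engine for tuning."""
    disagreements = []
    for frame in log:
        proposed = policy(frame.scene)         # what we *would* have done
        if proposed != frame.human_action:     # the human is ground truth
            disagreements.append(frame)
    return disagreements

# An overly eager policy like this would surface many disagreements,
# e.g. the fast-overtaking-car case mentioned above.
always_change = lambda scene: "lane_change_left"
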
They run it in closed loop in the real world, and you see their interventions. We've talked about how, when someone takes over, we can get that clip and try to understand what's going on, and one thing we can actually do is play it back in open loop again and ask, as we build our software, whether we're getting closer to or further away from how humans behave in the real world. And one thing that's cool with the Full Self-Driving computers: we're actually building our own racks and infrastructure so that we can basically take our Full Self-Driving computers, build them into our own cluster, and run this very sophisticated infrastructure to understand, over time, as we tune and fix things, that these algorithms are getting closer and closer to the behavior of humans, and finally to know whether we can exceed their capabilities.
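
Here is a minimal sketch of what that open-loop comparison could look like: each recorded intervention clip is replayed through the current software build, and the divergence between the path the software would have planned and the path the human actually drove is aggregated into a single per-build score that the cluster can track over time. The clip schema, the waypoint representation, and plan_fn are stand-ins assumed for illustration, not the real pipeline.

import math
from typing import Callable, List, Tuple

# (x, y) waypoints, equally spaced in time
Trajectory = List[Tuple[float, float]]

def mean_divergence(planned: Trajectory, human: Trajectory) -> float:
    """Average pointwise distance between the path the software would have
    taken and the path the human actually drove in the replayed clip."""
    pairs = list(zip(planned, human))
    return sum(math.dist(p, h) for p, h in pairs) / len(pairs)

def replay_regression(clips: List[dict],
                      plan_fn: Callable[[dict], Trajectory]) -> float:
    """Open-loop replay: feed each recorded intervention clip to the
    current build, score how close its plan comes to the human, and track
    the aggregate per software version (lower is closer to human)."""
    scores = [mean_divergence(plan_fn(clip["sensors"]),
                              clip["human_trajectory"])
              for clip in clips]
    return sum(scores) / len(scores)
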
Once we had this and felt very good about it, we wanted to do our broad launch, but to start, we asked everyone to confirm the behavior of the car through a stalk confirmation. So we started making lots and lots of predictions about how we should navigate the highway, and we asked people to tell us whether this was right or wrong, and this is again an opportunity to fire up that data engine. We spotted some really difficult and interesting long-tail cases here. I think a really fun example is these very interesting cases of simultaneous merging, where you start to go and then someone moves in behind you before you realize it: what is the appropriate behavior here, and what tuning of the neural network do we have to do to be super precise about the appropriate behavior? We worked on these, adjusted them in the background, and improved them, and over time we got 9 million successfully accepted lane changes, and we used them again with our continuous-integration infrastructure to really understand when we think we are ready.
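
One way to picture that confirmation-driven data engine, sketched in Python under assumptions of my own: each proposed lane change is logged with a scenario tag and whether the driver confirmed it on the stalk, and scenarios with unusually high rejection rates, such as simultaneous merges, bubble up as the long-tail cases to go collect and retune. The event schema and the rejection threshold are hypothetical.

from collections import Counter
from typing import Iterable, List

def mine_hard_cases(events: Iterable[dict],
                    min_reject_rate: float = 0.2) -> List[str]:
    """Rank scenario tags by how often drivers declined the proposed lane
    change; frequently rejected scenarios are the ones worth collecting
    more data for and retuning."""
    proposed, rejected = Counter(), Counter()
    for event in events:
        proposed[event["scenario"]] += 1
        if not event["confirmed"]:
            rejected[event["scenario"]] += 1
    hard = [s for s in proposed
            if rejected[s] / proposed[s] >= min_reject_rate]
    return sorted(hard, key=lambda s: rejected[s] / proposed[s], reverse=True)
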
Something that's also very exciting for me: since we own the entire software stack, right from the kernel all the way to, like, fine-tuning the image signal processor, we can start collecting even more data that is even more precise, and this allows us to tune better and better with faster iteration cycles. So earlier this month we felt we were ready to deploy an even more seamless version of Navigate on Autopilot on the highway system, and that seamless version does not require a stalk confirmation, so you can sit there, relax, keep your hand on the wheel, and just supervise while the car drives. In this case we are actually seeing over a hundred thousand automated lane changes every day on the highway system, and this is something that has been super cool for us to deploy at scale.

What excites me most about all of this is the actual life cycle of it and how we're able to spin that data engine faster and faster over time, and I think one thing that's really becoming very clear is that with the combination of the infrastructure we've built, the tools we've built on top of that, and the combined power of the Full Self-Driving computer, I think we can do this even faster as we move forward now to take Autopilot from the highway system onto city streets. And yes, with that I will hand it over.
Yes, okay, to the best of my knowledge and understanding, all those lane changes have been with zero accidents. That's correct. Yes, I review every accident, so it's conservative obviously, but having millions of lane changes and zero accidents is a great achievement for the team. Yes, thank you.
