
Watch Tesla Unveil Its Full Self-Driving Computer In Under 5 Minutes

Feb 27, 2020
At first it seems unlikely that Tesla, which had never designed a chip before, could have designed the best chip in the world, but that is objectively what has happened. All the Teslas being produced now have this computer.

In October 2016, Tesla announced that its vehicles would start shipping with the cameras and sensors needed to support fully autonomous driving, to be enabled later through a software update and a computer update. According to Elon Musk, from that day on, all the cars being produced have had all the hardware necessary for autonomous driving. Tesla held an event for its investors, called Autonomy Investor Day, at its Palo Alto office.
The event showcased software updates for Tesla's self-driving technology that will be rolled out in a series of over-the-air updates starting now. Tesla's Autopilot feature enables autonomous driving on highways, on-ramps and off-ramps, with regular input from the driver. Tesla CEO Elon Musk also unveiled a new computer called the Tesla FSD, or Full Self-Driving Computer, formerly known as Autopilot Hardware 3.0. The FSD computer is said to be currently in production and is what will allow Tesla's software updates to work. Here's what it looks like.

I would like to point out that this is actually a fairly small computer. It fits behind the glove box, between the glove box and the firewall of the car; it doesn't take up half your trunk, as was said before. There are two fully independent computers on the board, and the principle here is that any part of this can fail and the car keeps driving: the cameras can fail, the power circuits can fail, one of the Tesla full self-driving computer chips can fail, and the car keeps driving. The chance of this computer failing is substantially lower than the chance of the driver losing consciousness. At the top left you can see the camera interface: we can ingest 2.5 billion pixels per second, which is more than enough for all the sensors we know of.
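As a rough sanity check on that 2.5-billion-pixels-per-second figure, here is a minimal back-of-the-envelope sketch. The camera count (eight) comes from the presentation; the per-camera resolution and frame rate below are assumptions for illustration, not published Tesla specifications.

    # Back-of-the-envelope check of the 2.5 GPix/s ingest figure quoted above.
    # Camera count (8) comes from the transcript; the per-camera resolution
    # (1280x960) and frame rate (36 fps) are assumptions for illustration only.
    CAMERAS = 8
    WIDTH, HEIGHT = 1280, 960   # assumed sensor resolution
    FPS = 36                    # assumed frames per second

    pixels_per_second = CAMERAS * WIDTH * HEIGHT * FPS
    print(f"Estimated camera ingest: {pixels_per_second / 1e9:.2f} GPix/s")  # ~0.35 GPix/s
    print("Claimed ingest ceiling:  2.50 GPix/s")
    # Under these assumptions the eight-camera suite uses well under the
    # 2.5 GPix/s the chip is said to ingest, which is the headroom point
    # being made in the quote.

Under those assumed numbers the camera suite sits at roughly a seventh of the stated ingest capacity, which is consistent with the claim that the interface is more than enough for the sensors on the car.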

In case you were to use, say, lidar, could you process the lidar? It's stupid, and anyone relying on lidar is doomed to failure and to expensive sensors that are unnecessary. It's like having a bunch of expensive appendices: one appendix is bad, well, now you have a bunch of them, that's ridiculous.

If you want full autonomy, level four and level five systems that can handle all possible situations in 99.99% of cases, then chasing some of the last few nines is going to be very complicated and very difficult, and it will require a very powerful visual system. So I'm showing you some images of what you can encounter at any point along that line: at first you only have very simple cars in front of you, then those cars start to look a little weird, then maybe you have bikes on cars and then cars on cars, then maybe you start getting into really weird events, like overturned cars or even cars in the air.
We see a lot of things coming from the fleet, and we see them at a really good pace compared to all of our competitors, so the rate of progress at which you can actually tackle these problems, iterate on the software and really feed the neural networks with the right data is proportional to how often you encounter these situations in the wild, and we encounter them much more frequently than anyone else, so we're going to do extremely well. We have eight cameras, but we also have twelve ultrasonic sensors, a radar, an inertial measurement unit, GPS, and one thing people forget about is the pedal and steering actions, so not only can we observe what is happening around the vehicle, we can observe how humans chose to interact with that environment. I'll talk over this clip now; it basically shows what's happening in the car today, and we'll continue to push this forward.
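The point about logging both the surroundings and the driver's pedal and steering actions can be made concrete with a small sketch. The sensors listed (eight cameras, twelve ultrasonics, radar, IMU, GPS) come from the talk; the record layout, field names and types below are purely illustrative assumptions, not Tesla's actual data format.

    # Hypothetical sketch of what one fleet-logged training sample could contain.
    # Only the sensor list comes from the transcript; everything else is assumed.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class DriverAction:
        steering_angle_deg: float   # how the human chose to steer
        accelerator_pct: float      # pedal position, 0-100
        brake_pct: float

    @dataclass
    class FleetSample:
        timestamp_s: float
        camera_frames: List[bytes]        # 8 encoded images, one per camera
        ultrasonic_ranges_m: List[float]  # 12 distance readings
        radar_targets: List[dict]         # range / velocity per detected object
        imu: dict                         # accelerometer and gyro readings
        gps: dict                         # lat / lon / heading
        driver: DriverAction              # what the human actually did

    # The key idea from the quote: each sample pairs "what was around the vehicle"
    # with "how the human chose to act", which is what lets rare situations found
    # in the wild be turned into labeled training data.

The design point is simply that the driver's actions ride along with the sensor data in the same record, so every encounter with a rare situation also records a human response to it.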
We start with a single neural network looking at the detections around the car, then build that up into multiple neural networks looking in multiple directions, bring in the other sensors and fuse it all into what Elon called a vector space, an understanding of the world around us. As we continue to get better at this, we are moving more and more of this logic into the neural networks, and the obvious end point is that the neural network ingests all the cameras, brings all the information together, and ultimately generates a single source of truth for the world around us.
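A minimal sketch of that fusion idea, under toy assumptions: each camera's network produces detections independently, and a fusion step projects them into one shared vehicle-frame "vector space" that downstream planning treats as the single source of truth. The camera names, projection shortcut and data shapes below are illustrative assumptions, not Tesla's actual pipeline.

    # Toy sketch: merge per-camera detections into one vehicle-frame object list.
    from typing import Dict, List, Tuple

    def project_to_vehicle_frame(camera: str, det: Dict) -> Tuple[float, float]:
        # Placeholder: a real system would use camera calibration (intrinsics and
        # extrinsics) to map an image-space detection into the vehicle frame.
        offsets = {"front": (1.0, 0.0), "left": (0.0, 1.0), "right": (0.0, -1.0)}
        dx, dy = offsets.get(camera, (0.0, 0.0))
        return det["range_m"] * dx, det["range_m"] * dy

    def fuse(per_camera_detections: Dict[str, List[Dict]]) -> List[Dict]:
        """Merge independent per-camera detections into one shared object list."""
        world: List[Dict] = []
        for camera, detections in per_camera_detections.items():
            for det in detections:
                x, y = project_to_vehicle_frame(camera, det)
                world.append({"label": det["label"], "x_m": x, "y_m": y})
        return world  # the single "source of truth" handed to planning

    if __name__ == "__main__":
        detections = {
            "front": [{"label": "car", "range_m": 20.0}],
            "left":  [{"label": "bike", "range_m": 5.0}],
        }
        print(fuse(detections))

The follow-up point in the talk is that, over time, more of this hand-written fusion logic moves inside the neural networks themselves rather than living in code like the sketch above.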
Does full self-driving capability make you want to buy a Tesla a little more?
