
Will a robot take my job? | The Age of A.I.

Jun 18, 2020
This is a bit wild. "Will a robot take my job?" It is one of the most searched questions on Google. It makes some sense. You know, when telephones were invented, there were probably horse messengers running amok all over the place. Vinyl is making a comeback, but fidelity aside, why bother when you can have a curated playlist with thousands of songs right on your phone? Come to think of it, I can't tell you the last time I shot a movie...on film. Look, change makes people panic, but things evolve anyway, so we accept it and move on.
And now that we are entering the so-called fourth industrial revolution, yes, automation will replace some jobs, but it will also create new ones and probably give rise to industries that never existed. That's the topic some of these people are exploring: how machine learning is making things more productive, more efficient, safer, and cleaner. And cut! That felt good. Do you want to go again? -No, we're good. -Okay. Maybe just back off that mark. Here? -Yeah. -Okay. That's good. Okay, yeah, I think I understand. The trucking industry is at a crossroads. E-commerce is driving demand for transportation higher than ever, but the harsh realities of long-haul trucking are sending professional drivers looking for other work, leaving a shortage of 50,000 drivers in the United States alone.

Of course, there are some industries that will be affected by A.I. in terms of workforce decline, but there are some industries that don't have the workforce to sustain them, and that's where A.I. can fit very well. My name is Maureen Fitzgerald. She has been driving trucks for 32 years and was ready to make a change. Three, two, one. -Engaged. Autonomous driving started. As she approached retirement, Maureen saw automation coming and assumed it would replace her. She was partly right. Automation is coming, but her work will not be lost.
It will evolve. I have been in this career for a long time; I have seen how everything has changed. The monotony from coast to coast, the low pay and, you know, you're sleeping in a bunk, you're sleeping at a truck stop. It's hard. It's a hard life. I saw an ad for a test pilot and said, "That has to be the perfect job." Okay, but this isn't just about Maureen. We've all heard about self-driving cars, but why are self-driving trucks setting the pace for the autonomous driving industry? We already have robotic planes. Most planes fly on their own, but self-driving cars are a more difficult problem, because a lot more happens on the roads than in the air, and driving on the highway is much easier than driving in the city.
So I think we'll see fleets of automated trucks long before we see driverless cars in the city. The company TuSimple started with a dream... a dream that we wanted to build A.I. technology for autonomous trucks. Can you give me a full clockwise rotation? Originally, the truck was not designed to be automated. Okay, can you give me a 50% speedup? We added our intelligence, the servers and all the sensors, to make this work. We are in the bunk of a TuSimple tractor. We've removed what would normally be beds and refrigerators and added our computer, which is the brain of the A.I.
Driving a car is what's called a "full A.I. problem." If you solve it, you solve all the other problems in A.I. However, to solve it, you have to solve all of A.I. It requires vision, it requires robotic control, it requires movement and navigation, but it also requires social interaction with other drivers. So at the end of the day, to drive a car well in the city, you need everything. It requires an enormous amount of common sense. Common sense. That's what Maureen is doing here... supervising the A.I. truck while it learns to navigate real-world conditions, ready to take control if something goes wrong.
Today, they're taking a ten-mile loop to test scenarios a human driver might encounter. Can this 10-ton vehicle safely travel on a busy highway? Can it merge and change lanes? Alright, ready. Three, two, one. Engaged. Autonomous driving started. My name is David Ruggiero and I am a test engineer at TuSimple. My job is to let her know what's going on, to let her feel absolutely comfortable that the truck is doing what it's supposed to do. Shifting gears... There we are. When the vehicle is engaged, she is fully aware of what is happening around the truck, but she does not have to touch anything.
Alright, clear on the right. Not clear on the left. I see the left approaching. No trip or day is ever the same. The left is now clear. Will it go? -We're good! -There we go. The truck was cautious... So I still see hard braking. It was probably too cautious. Just go manual? Yes, we can do that. Autonomous driving. Here we go. Very good. -Engaged. Autonomous driving started. I'm doing 45. It looks good. Center lane. I feel like I have a relationship with the truck. Easy, easy... I'm talking to the truck. I tell it when it's good.
I had never been a test driver before, and I had never been in a self-driving truck before. In fact, we are both learning together. Still green. Still green... We're off the throttle, light braking. Okay, red light. Turning left. The turn signal is on. I see the lead van. Human drivers cause 30,000 or more deaths a year. Long-haul road transport can be dangerous and boring, and therefore ripe for automation. On the right. Okay, I see the other car moving, I see green. Still green... Still green. I don't see anything approaching. We are making the turn. That will work.
Very good, the on-ramp coming up. Merging onto a highway is difficult and dangerous. It requires mastering a complicated set of physical and mental skills, having acute sensory and spatial awareness, preparing for unpredictability and human fallibility... And we're on it. ...and doing it all quickly, fresh every time, otherwise you risk failure or worse. The turn signal is on. Okay, I'm showing a left intent. We'll have a truck next to us, but it is moving. Okay. Confirmed. Looks like we can get through; the van is moving. This truck is measuring what's coming up behind it, how fast they're coming up behind it...
It seems a little congested up there. Yes. You have a full 360 view, all the time. The A.I. combines the images that come from the cameras with other sensors, LiDAR and radar. LiDAR is a laser rangefinder that measures the distance to objects 360 degrees around it. It gives us a three-dimensional image of the world. In the virtual world, there is a direct correspondence between the map and the location of the vehicle, and the behavior of other vehicles, their locations, their speeds, their future intentions, and once all these things are recreated, it is basically asking the computer to play a game.
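As a rough sketch of the geometry behind that (the function name and interface here are illustrative, not TuSimple's actual stack): each LiDAR return is a distance measured at a known angle, and converting those polar returns into Cartesian points is what builds the three-dimensional picture of the world. A minimal 2-D version in Python:

```python
import math

def lidar_to_points(ranges, angle_min=0.0, angle_increment=None):
    """Convert one 360-degree ring of LiDAR range returns (meters)
    into 2-D Cartesian points in the sensor frame."""
    if angle_increment is None:
        # Returns are assumed evenly spaced around the full circle.
        angle_increment = 2 * math.pi / len(ranges)
    points = []
    for i, r in enumerate(ranges):
        theta = angle_min + i * angle_increment
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# Four returns at 10 m, one per quadrant: ahead, left, behind, right.
pts = lidar_to_points([10.0, 10.0, 10.0, 10.0])
```

A real unit returns many thousands of points per sweep across multiple vertical beams; the same trigonometry just gains an elevation angle.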
Okay, front tire, continuing at 57. Good job. When I look at the data the truck provides me, I not only see the current environment, but I also see the future. And we have a slow vehicle cutting in on us. I see the cut-in, at 51. That was good. Okay, I saw the cut-in. If we need to make a lane change, the truck will know before I do. Okay, I currently don't have... There we go, I have an intent. We're going right now. Left shift. Alright. We're good. Okay, let's go. Well, this is where things get interesting. This truck is equipped with sophisticated cameras and software, but it still lacks one critical element... something the A.I. may never have.
The central problem in A.I. is that human beings have common sense and computers do not. We take common sense for granted. We know how the world works, and everything we do in our daily lives involves common sense... A fairly constant stream of traffic now to our left. I see it. ...and people have been trying to imbue A.I. with common sense from the beginning, and the extraordinary thing is, at the end of the day, how little progress we have made. The bottom line is that one day, the A.I. truck won't need Maureen, David, or anyone else to accompany it.
Each of these tests is a building block toward that goal, designed to help it learn how humans operate and make decisions, even when thrown a curveball. This truck is playing with us. It's just tracking the vehicle. It's all over the place... -It's getting close to the line. -Mmm-hmm. I have a left intent. The lane is blocked. We were having some challenges from one of the camera trucks. All I need is a gap. I'm just looking for the gap. Come on, you can pass it, it's okay. The truck was cautious because it couldn't predict what the camera truck was doing. Why is it going so slow?
The film crew's vehicle had been filming the truck all afternoon and, yes, it was not driving normally. The camera truck was what we would consider a non-compliant vehicle. It was doing things that a normal driver wouldn't expect it to do. It doesn't trust it. Okay, I see the cut-in. Sixty and slowing. Fifty-two and slowing. I still see a left intent. We'll see. Okay, we're not clear anymore. No, it's full. Calling off the left shift. We're not going. It's all over the place, that guy. The truck wants to overtake the camera crew, but the constant changes in speed and unusual behavior confuse the A.I... so it refuses to take the risk.
Cancelled. There are behaviors that the truck has to learn to become human, in a sense, because not everyone else operates at the same level. Alright. It turns out that getting the first basic systems working was a big milestone, but then there's this long tail of sticky situations that are much harder to solve. When you merge, do you have to yield, or does it make sense to move ahead? That requires a level of understanding that humans barely have, and machines have not yet reached that level. It definitely has to be the most interaction I've ever seen with a vehicle.
The big takeaway from today's run... there is no such thing as a perfect run. Over time, we expect trucks to react to a change in the environment as quickly as possible, regardless of what we throw at them. Okay, slow down toward the gate, and it looks like they're going to let us turn around. Beautiful. Thanks, folks. We still have a lot of work to do to validate the system and ensure everything is perfect, and this will require substantial driving and simulation experience. Watching something develop over time, you never thought it would learn like this, and now it's learning and you feel like a proud parent.
It was making some incredible decisions that I had never seen it make before. People ask me all the time, "Oh, by having autonomous trucks you're taking away jobs from truck drivers," and I don't think that's true, because it's not about taking away jobs, it's about filling the space of a job that a driver does not want to do. Thank you for riding with TuSimple. Ta-da! High five. Autonomous driving. What Maureen thought was the twilight of her working life actually became the dawn of something new, and it's an age-old phenomenon. Tractors didn't make farming obsolete, and video didn't kill the radio star.
It just provided new opportunities. New technologies are changing the way we think about work. Hey, Garrett, east on 424 should be fine. 10-4. I think the biggest changes we face when we think about A.I. and the future of A.I. are what we do when A.I. changes what the job functions are and what the workforce looks like, what they're going to do. So this is a fear, because people don't like change, and change that you don't understand... is scary for a lot of people. The Port of Long Beach is the second busiest port in North America... and home to the first fully automated container terminal in the United States.
For most people, shipping and logistics aren't that appealing, but for an A.I., it's a dream. With more than 14,000 people moving more than eight million giant containers around the world, this port handles around $200 billion in cargo a year. Automating this industry will make it more efficient, without a doubt, but also safer. As you can see from this fence here, from this point on, for about an eighth of a mile in that direction, there is a fenced no-man's zone, which is run completely by software. In the old design, ship activity was mixed with truck activity, competing for space... congestion, safety issues... ...but the size of the ships continued to grow, seven times larger than the ships we had at the beginning, so a new model was necessary, which is what LBCT is.
When a ship approaches, we use different algorithms to calculate traffic, scheduling, dispatch and planning. We are doing this with 10 different cranes at the same time, 50 different vehicles at the same time, and we have 30,000 different places to place that container in the yard. When we unload a container from a ship, the first thing that happens is it gets picked up by a ship-to-shore crane that has an operator inside. -Hey, Lily, it's Boyle. -Yes, I copy. Okay, Lily, which side do you want to start on? The ship-to-shore cranes place the containers on a platform. On that platform, there are more than 20 cameras that use optical character recognition, or OCR.
Then another crane will pick it up and place it on a fully automated vehicle. That vehicle will cross the yard and arrive at the correct place, planned by the system. What look like small wads of gum are transponders buried in the ground. They are laid out in a grid; there are more than 10,000 in the production field. There is an antenna at the front and rear, and the vehicle reads them as it drives and transmits them to the system, to let it know where it is. Once it reaches the yard, an automatic stacking crane will pick it up and deposit it on a block in the yard.
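A hedged sketch of how such a grid could be used (the grid layout, IDs, spacing and geometry here are invented for illustration; LBCT's actual system is proprietary): given a surveyed map of transponder positions, the front and rear antenna reads are enough to recover both the vehicle's position and its heading:

```python
import math

# Hypothetical surveyed grid: transponder ID -> (x, y) position, meters.
# Real yards bury thousands of these at known, surveyed locations.
TRANSPONDERS = {(i, j): (i * 2.0, j * 2.0) for i in range(5) for j in range(5)}

def locate_agv(front_id, rear_id):
    """Estimate the AGV pose from the transponders read by the front
    and rear antennas: position is the midpoint of the two reads,
    heading is the rear-to-front bearing."""
    fx, fy = TRANSPONDERS[front_id]
    rx, ry = TRANSPONDERS[rear_id]
    x, y = (fx + rx) / 2.0, (fy + ry) / 2.0
    heading = math.atan2(fy - ry, fx - rx)  # radians, 0 = +x axis
    return x, y, heading

# Front antenna over (4, 6), rear over (4, 2): facing straight up +y.
x, y, h = locate_agv((2, 3), (2, 1))
```

In practice the reads would be fused with wheel odometry between grid points, but the transponders pin down the absolute position.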
From the block, it will eventually be deposited onto a truck or a bomb cart, from where it can reach a train. I don't have my key. I'm on my way. That's why we work in teams. -What would you say? -They say that we can eventually learn to adapt to an environment that mixes robotic and human elements, but it is harder for machines to adapt to us, so here it is necessary to separate them for safety reasons. Because humans are unpredictable, automation can be physically dangerous for people, because what if I'm a 200-ton machine and I don't see you?
I could run over you. This is the biggest danger area. When we go out into the field, we request a restriction, and they block off a certain area of the production field, and no AGV will enter that area, so we can work safely. Do you want these two right here? -Yeah... -Okay, great. This setup is completely automated, but it's not like someone walks in, presses a button, and puts their feet up on the desk. It still employs hundreds of people on any given day... What does it look like on that side, Eric? I have a broken railing here, Julio.
We'll have to weld it real quick. Okay. Grab my hood from inside the truck, please? Here we go, guys. Watch your eyes. Eyeballs! You know, this is an asset-intensive facility. Without a doubt, the number of mechanics required to maintain this facility has increased. My background is automation. That's why I decided to come here, because it was high-tech. Instead of having a yard crane operator sitting alone in a cab all day, we now work remotely in the operations room. About 80% of the time, most faults can be reset from here. Well, easy fix. Easy fix. I'm Garrett Garedo, a crane mechanic here at LBCT.
This is the core operating system. A bypass key will allow the operator to bypass a function. Automation can make the workplace safer, and this is a good example... taking someone out of the yard and putting them in the control room. Right here, I'm going to have to move this container because it's not seated very well. Are you ready? -Yes, go ahead. A.I., like any technology that comes into our society, changes the workforce. There is a misconception that there will be fewer jobs. Actually, that is not the case. Let me move it back. There are new jobs, there are different types of jobs.
I was a crane operator. In the past, we had to climb five flights to the cab. It was a little difficult, especially on the back, and then when the automation started they just repositioned me, and now it's a lot easier. If you don't accept it, you'll just fight it, and no... You have to use it as a tool. Right on the money. You know, our obligation was to retrain those same people for the new jobs that that technology has created. We are the progress that was needed for the freight movement industry. For the foreseeable future, there will continue to be many things that only humans can really do.
So I sent a mechanic to STS-1. Okay. So the future of work is not about humans being replaced by machines. It's humans discovering how to do their jobs better with the help of machines. Structured environments are good for automation. Smooth surfaces, right angles, less chance of chaos. But what about more complex human environments? Can we learn to work with robots hand in hand? Ah, there we go. Wait a second. Stop. Let's see if that works. RoboHub is a premier robotics incubator at the University of Waterloo in Ontario, Canada. The gripper isn't working on the left side.
Maybe if I turn it around a little... One thing they're doing is developing A.I. and robotics to be used in unstructured, more human environments. Like the home. Easy, easy. TALOS is one of the most advanced humanoids on the planet. It can walk, it can talk, it can see you in 3D... but it can't do most of those things right away, so you really have to take it as a tool and teach it to do a lot of these things. With TALOS, we want to explore two different aspects of A.I. First, perceiving objects in the world...
So it has cameras in its eyes, and it also has a depth camera where it can see how far away things are within its vision, much like what we do with our depth perception, and as soon as TALOS has that ability to see, "Oh, there's a human here, there's an object back here," then it can start building a map of its world, so that it knows in 3D what's around it. Computer vision is a way to mimic how we see the world. I can say, "Well, that's a TV," or "That's a cat," or "That's a dog," so what am I looking at when I look at a picture?
Computer vision tries to figure out which objects in an image represent what we would understand. And now I can teach the robot, basically by grabbing it, and I'll just gently move it and guide it towards the bottle... and you just press the right trigger button to execute the same movement. Should I close it? No, no, it will do it automatically this time, because I have already programmed it in. We also want to explore the application of A.I. to understanding locomotion in terms of balance. No, it's going to crash. It went too far. The biggest misconception about robots is that they are more capable than they are, that they are more widespread than they are... ...and they are not there yet.
Well, let's give it one more chance. Maybe you've seen humanoid robots that can do parkour and backflips. The only ones that are really comparable are the Valkyrie, which was designed by NASA and of which there are only two or three in the world... and the Boston Dynamics Atlas, but most have very few sensors, and many of them, at the same time, are controlled remotely or given a very clear, pre-established path, and in a situation like that, what they are doing is pushing the limits of the mechanical systems. That's very similar to what we're doing here, except in our case, we're pushing the software side.
The next step forward is to start replacing all our dumb, blind and dangerous machines with machines that have built-in sensors and vision, that can work side by side with humans to be more productive and safer at the same time. Maybe we can carry a table with the robot together? So if we put the table in its hands... So for a task where the robot might be carrying something with a human, it will need to be able to sense how hard the human is pushing and pulling, as if it is being guided in a dance. Can we tighten the clamps too, Alex?
Yes, yes. So as humans, we have the sense of touch, and we have an idea of how much force we are applying and how much is applied to us. With TALOS, there is no skin; it's hard plastic everywhere, so it can only really feel what's in its motors... This will take a second. We have to teach it how to translate that into a kind of sense of touch. Try to keep it somewhat level, but what if you also rotate the table? Let me show you how much is really possible. Whoever writes the controller is always braver than the rest of us.
And you start pulling a little... and then the robot will walk with you. TALOS is using compliant control on the upper body, which basically softens it. The robot automatically detects the forces applied to the table. The human still has full control. This is very important for the safety of the people around the robot. Within robotics, the areas where we will really see major advances are those environments that are relatively controlled and predictable, so a good example is an Amazon warehouse. In those environments, you already see a lot of people, and also a lot of robots... and they are working together.
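A minimal sketch of that "softening" idea (the mass-damper form and the gains here are illustrative assumptions, not TALOS's actual controller): an admittance law turns the sensed external force into motion, so the arm yields when pushed instead of holding a rigid position:

```python
def admittance_step(force, velocity, dt, mass=5.0, damping=40.0):
    """One step of a mass-damper admittance law:
        m * dv/dt = f_ext - b * v
    The sensed external force f_ext is converted into commanded motion,
    so the arm 'gives' in the direction it is pushed."""
    accel = (force - damping * velocity) / mass
    return velocity + accel * dt

# A human pushes with a steady 8 N for 10 s of 10 ms steps:
# the commanded velocity converges toward f/b = 0.2 m/s.
v = 0.0
for _ in range(1000):
    v = admittance_step(8.0, v, dt=0.01)
```

The damping gain sets how "heavy" the robot feels to the person guiding it; a full controller would run this per axis and feed the velocity to the joint-level control loop.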
Once you have a process and you've boiled it down to an algorithm, you can replicate it, so that if one robot can learn a completely new skill, you can copy that knowledge across the cloud to all the other robots, and now they all have that same ability. It's a whole new world, a whole new type of economy, and we're just beginning to understand its implications. Is it still difficult to imagine a completely automated world? Maybe it's more about what we're gaining than what we're losing. Will automation make things faster? More efficient? More productive? Tastier?
I've been making pizza for 15 years. In my life, I've probably made two million pizzas... so coming to Zume was amazing, because it was like, wow, this is different than any other pizza place. So I came up with the idea for Zume about seven years ago, when I realized that Big Pizza was based on one core concept... which was that there was no better way to do it. I thought, "I wonder if there's a better way to do this." Big Pizza. It doesn't sound the same as Big Data or Big Pharma, but it's not just about pizza.
It is also about huge waste. About $200 billion worth of food was wasted in the United States last year. $200 billion. That's a lot down the drain, and it's also in the sky. Waste contributes greatly to greenhouse gas emissions. Perhaps half of all the food produced in the world is wasted. Artificial intelligence can help us change supply chains to be much more efficient. If we can have an impact there, that will already have huge, huge implications. One of the classic tensions in any food business is producing too much and wasting food, or not producing enough and running out.
It takes years and years of practice for a human to predict that, but it's actually a data problem, and machines, and A.I. in particular, are really good at getting it right. So at its core, Zume is a fundamentally new logistics model. We predict what we're going to sell before you even ask for it, and that entire predictive layer of the business is powered by artificial intelligence. When Zume first called me, I obviously thought, "Why does a pizza company want a CTO?" But then when I heard the vision, the fact that we were going to be able to change an industry by pushing this incredible platform to the edge, and one that also makes amazing pizza, then it made sense. To disrupt Domino's, Zume uses machine learning to try to forecast how much pizza people want, what type, and when. Variables... location, day of the week, weather, what's on TV, past order trends... and then it predicts how many pepperonis San Francisco will want on a Tuesday, for example. Our supply chains are incredibly inefficient and waste a lot of food. We can eliminate part of that waste. We would know what the demand is going to be, where the products will be, and ultimately that would benefit us all. Absolute perfection would be that no pizza is wasted, and every time a customer looks, the type of pizza they would like is always available.
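As an illustration of why this really is a data problem (a naive seasonal baseline, nothing like Zume's proprietary model; names and numbers are invented): even averaging past sales per weekday already gives a usable first forecast that a richer model with weather and TV-schedule features would refine:

```python
from collections import defaultdict

def forecast_demand(history):
    """Forecast demand per weekday as the average of past counts for
    that weekday -- a naive seasonal baseline. `history` is a list of
    (weekday, pizzas_sold) pairs, with weekday in 0..6."""
    totals, counts = defaultdict(float), defaultdict(int)
    for weekday, sold in history:
        totals[weekday] += sold
        counts[weekday] += 1
    return {day: totals[day] / counts[day] for day in totals}

# Two hypothetical weeks of pepperoni orders: Tuesdays (1) run hot.
history = [(0, 80), (1, 120), (2, 90), (0, 100), (1, 140), (2, 70)]
demand = forecast_demand(history)
```

Overstocking against this forecast wastes food; understocking loses sales. That trade-off is exactly the optimization problem described next.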
So it's an optimization problem. How close can you get? Optimization, true, but what else? "End to end" is not just about predictions. It also involves automation, or... cooking. Pizza and robots, a very interesting concept. Pepe and Giorgio dispense the sauce. Here we have Marta, who spreads the sauce on the pizzas... So we continue with Veggie Zupremes, guys. These look good. ...and down here at the end we have Bruno, who helps carry the pizza from the assembly line to the ovens, or around the loop, and behind us, Vincenzo, who helps load the pizzas into the cartridges.
Based on its algorithms, the A.I. predicts how many pizzas, and what type, to load into the mobile kitchens. When orders arrive, it also decides which mobile kitchen will bake them and exactly when to put each one in the oven. Each pizza has its own cooking profile and recipe. So each of these ovens is actually a robot. They are connected to cloud services that are always monitored, making it possible for one person working here to cook up to 120 pizzas per hour. It's humans and technology working together. It's pretty impressive. That, and the pizza. I love pizza. Pizza is one of my favorite foods.
We use A.I. in simulation to work out the correct ETA for delivery. The A.I. can suggest the best place to park a truck so that we can offer our customers the freshest, best-quality product. Then we have a BBQ chicken. BBQ... When we receive orders, our system controls the ovens, which control the cooking time and how long the pizzas need to cook. The A.I. can change the workflow in one of our vehicles to make sure the pizza is cooked at the last moment. ...and then the system chooses which driver to send it to.
Here's your order... a BBQ chicken ranch. It's algorithms, and it's A.I. alongside humans, not A.I. instead of humans. Now we have a pineapple ready to go. Its predictions about the number of pizzas we carry in the truck are pretty accurate. Last Zupreme of the night. Sometimes we have a bunch of leftover pizzas at the end of the night, and we think the prediction was probably wrong for today, and then boom, an order arrives, and the next thing you know, another order comes in for like eight pizzas at a time. All that data is fed back into the learning algorithm, so every week we try to evolve our learning algorithms to do a better job than the week before.
Part of what we are optimizing is reducing waste. Maybe if you could use prediction to go back into the supply chain and more accurately predict what you need to grow and where you need to source that product, I think you could fundamentally change things. We are in the early stages of a massive shift in technology that is allowing machines to perform many tasks that only humans could previously perform. The future of work is evolving. Automation is making the workplace safer, greener and more efficient, no doubt. If we can figure out how to integrate A.I. technologies, not to replace humans, but to augment their capabilities, to make life more fun, to make us more productive and more creative...
I think that is where the power of the machine will come from. Thousands of years ago, we were hunter-gatherers. Over time, we became farmers, and now that we may be entering this fourth industrial revolution, one that connects the physical, the biological, and the digital, A.I. is not only changing jobs, but us too. We are the progress. So why focus on the rearview mirror when we can look down the road and see what we are becoming? In any case, it is a real challenge to build robots with the kind of mobility and dexterity that is beginning to approach that of humans.
A kind of cliché that we all think about is that we would like to have a robot that can go to the refrigerator and grab a beer for us. I mean, if you think about what that entails, it's still a huge challenge. A robot that is capable of doing that has to be heavy enough to open a refrigerator. It has to be able to open the door, and it has to have the visual perception and dexterity to reach in and grab the beer, and that already means a pretty heavy machine, otherwise it would just tip over.
Building a robot to do that is not only difficult, but will be quite expensive, so I think it will be quite a while before we actually see the types of robots we imagine in the world of science fiction operating in our daily lives. Get ready for a photo. Please smile.
