- "onclick =" window.open (this.href, 'win2', 'status = no, toolbar = no, scrollbars = yes, titlebar = no, menubar = no, resizable = yes, w> Print
No industry has been transformed by the advent of robots more than the automotive industry. Robots are far better suited than humans to monotonous, tedious work such as the assembly of cars.
On production lines, robots stamp car parts, weld and paint them, install accessories, and transport components between workstations. Robots can even check the quality of their own work: their sensors catch the slightest defects that can escape the human eye. The use of robots has markedly advanced the performance of the automotive industry. Automata not only work around the clock; if properly programmed, they make almost no mistakes. This makes it possible to build cars faster and more cheaply.
The most common type of robot in the automotive industry is a robot arm. A wide range of movements and sensitive manipulators allow such a robot to perform various types of operations.
As the skeletons of half-assembled cars move along the conveyor, a robot arm reaches in to weld parts. Robots carry out all the welding work needed to manufacture a car.
Spot welding is an integral part of automobile production. The robot arm presses two metal sheets together at one point, then a strong electric current passes through the metal, fusing the parts into a joint called a nugget.
Like programmed painters, two robots cover a sports car with a layer of red paint of standard thickness, working at a fixed angle and distance. The result exceeds all expectations.
A robot arm inserts a windshield into a nearly finished vehicle. Assembly robots that secure wiring and batteries receive the necessary parts and components from a transport robot.
Robot car without steering wheel
Current autonomous car projects use the following situation-analysis systems:
- GPS-based positioning
- odometry
- computer vision.
Modern control systems interpret the information received from these sensors to determine navigation paths and to detect obstacles, signs, and so on.
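As a rough illustration of how two of the sources listed above can complement each other, here is a minimal sketch (my own, not from any production autopilot) of blending smooth but drifting odometry with noisy but unbiased GPS fixes. Function names, the gain, and the drift figures are all illustrative assumptions.

```python
# Minimal complementary filter: trust odometry for short-term motion,
# pull toward GPS to cancel long-term drift. Illustrative only.

def fuse_position(odometry_pos, gps_pos, gps_weight=0.2):
    """Weighted blend of a dead-reckoned position and an absolute GPS fix."""
    return (1.0 - gps_weight) * odometry_pos + gps_weight * gps_pos

# Simulated drive: odometry drifts 0.5 m per 1 m step; GPS (here noise-free
# for clarity) reports the true position.
position = 0.0
for step in range(5):
    odometry_estimate = position + 1.0 + 0.5   # commanded 1 m plus drift
    gps_fix = (step + 1) * 1.0                 # true position along the road
    position = fuse_position(odometry_estimate, gps_fix)

print(round(position, 2))
```

Even with a modest GPS weight, the fused estimate stays far closer to the true 5 m than raw odometry (which would report 7.5 m after five steps).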
Autonomous cars can also analyze the scene perceptually, determining, for example, the types of vehicles moving along the highway.
This ability is a significant help when planning a trip to a destination.
The history of the development of a robot car
Demonstrations of robot cars can be found in documents dating back to the 1920s and 1930s.
However, the first robotic cars that were genuinely autonomous vehicles appeared only in the mid-1980s.
These were chiefly the work of enthusiasts at Carnegie Mellon University (the Navlab project) and of the ALV (Autonomous Land Vehicle) program.
Since 2014, more interesting designs have come from the American company Tesla.
However, no project has yet reached the fifth level of system automation, that is, fully driverless operation.
Tesla Motors has produced a prototype robot car that structurally corresponds only to the second level.
In effect, it is a semi-autonomous vehicle that does not entirely exclude human intervention. Even so, back in October 2014 Tesla Motors began mass production of cars equipped with an autopilot.
Owners need to purchase the software that implements the autopilot functions as an optional extra.
The Tesla autopilot is currently classified at Level 2 to Level 3 on the scale established by SAE International (the Society of Automotive Engineers).
This classification means, in particular, that the autopilot can act autonomously but requires constant human attention.
Driving in this mode involves being ready to switch quickly from automatic to manual control at a critical moment.
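The SAE scale mentioned above (from standard J3016) can be written down as a small lookup table. The level names follow the standard; the one-line summaries and the helper function are my paraphrase.

```python
# SAE J3016 driving automation levels as a lookup table.
SAE_LEVELS = {
    0: "No Driving Automation: the human does everything",
    1: "Driver Assistance: steering OR speed is assisted",
    2: "Partial Automation: steering AND speed automated, human supervises",
    3: "Conditional Automation: system drives, human must take over on request",
    4: "High Automation: no takeover needed within a limited domain",
    5: "Full Automation: no human driver needed anywhere",
}

def requires_constant_human_attention(level: int) -> bool:
    """At Level 2 and below, the human must monitor the road at all times."""
    return level <= 2

print(requires_constant_human_attention(2))  # True: a Level 2 autopilot
```

This is exactly why a Level 2 to 3 system like the Tesla autopilot still demands a driver ready to take the wheel.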
The eighth version of autopilot
At the end of August 2016, Elon Musk announced new autopilot software (version 8.0).
The program uses radar signals to navigate in conditions of poor visibility.
In September 2016, Tesla Motors released firmware version 8.0, which uses the radar as the primary sensor.
The software processes radar data obtained from radio waves reflected off the roadway and the vehicles ahead.
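The core principle behind that radar processing is simple: range follows from the round-trip time of the radio wave. A minimal illustration (my own sketch, not Tesla code):

```python
# Range from a radar echo: the wave travels to the reflector and back,
# so the one-way distance is half the round-trip distance.
C = 299_792_458.0  # speed of light in vacuum, m/s

def radar_range(round_trip_seconds: float) -> float:
    """Distance to the reflector, in meters."""
    return C * round_trip_seconds / 2.0

# An echo returning after 200 nanoseconds means the car ahead is ~30 m away.
print(round(radar_range(200e-9), 1))
```

A real automotive radar additionally measures the Doppler shift of the echo to get the relative speed of the vehicle ahead.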
However, the radar hardware alone proved insufficient. The designers are now counting on NVIDIA CUDA technology; time will tell what comes of it.
Assembly robots at Mercedes
Robots assemble Mercedes cars: a production process without people.
The Mercedes-Benz conveyor, fully automated.
Watching these videos, one cannot help admiring the beauty of robotic technology: not a single wasted movement, perfect installation accuracy. Beauty performed by electromechanical labor. Yet this beauty was originally created by people. So why are they practically absent from the frame?
Paradoxes of automation
The other side of automation is the displacement of people from production. People are becoming redundant in a world of people!
And the main paradox: Mercedes today is far from the legendary example of German quality and reliability. Reports of extraordinary maintenance caused by manufacturing defects have become the norm. Truly reliable Mercedes cars were produced only when people worked on the assembly line...
Robots began to lose out to people
After studying the deterioration in quality, the management of the Daimler AG concern made an unprecedented decision: to abandon robotic conveyors and replace the robots with people. This paradoxical decision was announced by the head of production, Markus Schäfer, and it rests on the fact that the Mercedes-Benz lineup has expanded so much in recent years that "soulless automation" could no longer cope quickly with the huge variety of available options.
Assembly robots proved economically disadvantageous when an individual approach to this growing variety of tasks was required.
Automation crowded out people
Restructuring will begin with the largest factory, in Sindelfingen. Every year, 400,000 new cars roll off the assembly line of this Daimler AG division. Its production processes face dramatic changes that will require significant investment; nevertheless, the Germans have Napoleonic plans for this conveyor. After the technology is updated, the plant will launch 30 new models within the next four years, and all of them will carry the label "manual assembly". A good kick to competitors.
Of course, the hype around replacing robots with humans is partly a clever publicity stunt. No one is going to get rid of automation entirely. Some of the robots will remain; however, as promised, they will now operate under human control and carry out only the monotonous, heavy, uncomplicated routine.
Why did Duckietown appear and how are autonomous toy cars arranged
In 2016, the Massachusetts Institute of Technology (MIT) received a grant to develop autopilots for cars. The researchers had the idea that the same software could run on both real and toy cars, which could be taught to drive autonomously in a model city. Thus the student project Duckietown was born.
MIT instructors assembled a group of students with different skills and set them the task of building a toy city in which cars drive completely autonomously. The students received boxes of hardware, chips, and controllers, and assembled the cars.
Then a wheel controller and a miniature Raspberry Pi computer were installed on each car; this is how the cars learned to drive.
Next, the Robot Operating System (ROS) was installed on this small computer. ROS is a set of tools for all kinds of robotic systems, industrial and toy alike. As a result, each student had an autonomous mobile robot called a Duckiebot.
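On a real Duckiebot, the wheel controller runs as a ROS node; the arithmetic it performs, though, is plain differential-drive kinematics. Here is a pure-Python sketch of that computation (not actual Duckietown or ROS code; the baseline value and names are illustrative assumptions):

```python
# Differential drive: a desired forward speed and turn rate are converted
# into two wheel surface speeds. A Duckiebot steers only by spinning its
# wheels at different speeds; it has no steering axle.

WHEEL_BASELINE_M = 0.1  # distance between the two wheels; illustrative

def wheel_speeds(linear_mps: float, angular_rps: float):
    """Return (left, right) wheel speeds in m/s for the requested motion."""
    left = linear_mps - angular_rps * WHEEL_BASELINE_M / 2.0
    right = linear_mps + angular_rps * WHEEL_BASELINE_M / 2.0
    return left, right

# Drive straight at 0.2 m/s: both wheels run at the same speed.
print(wheel_speeds(0.2, 0.0))
# Turn in place at 1 rad/s: the wheels spin in opposite directions.
print(wheel_speeds(0.0, 1.0))
```

In the ROS version, the (linear, angular) pair would arrive as a velocity message on a topic and the resulting wheel speeds would be sent to the motor driver.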
The tasks were then divided: one group of students developed algorithms for detecting road markings, another worked on algorithms for recognizing traffic signals, and others developed a protocol through which the cars "agree" on the order of passage at intersections. Over the year, the students solved the basic problems of autonomous driving along a marked road.
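The lane-following part of that work reduces to a simple control loop: the vision pipeline reports the car's lateral offset from the lane center, and a controller steers to drive that offset to zero. A minimal proportional-control sketch (the gain, sign convention, and names are my illustrative assumptions, not the project's actual code):

```python
# Proportional lane-following controller: steer against the lateral error.

STEERING_GAIN = 2.0  # radians of steering per meter of lateral error

def steering_command(lateral_offset_m: float) -> float:
    """Positive offset = drifted left of the lane center, so steer right
    (negative); negative offset = drifted right, so steer left."""
    return -STEERING_GAIN * lateral_offset_m

print(steering_command(0.05))   # drifted 5 cm left  -> steer -0.1 rad
print(steering_command(-0.02))  # drifted 2 cm right -> steer +0.04 rad
```

Real implementations typically add derivative and heading terms so the car does not oscillate around the lane center, but the core idea is the same.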
In February, I met Liam Paull (one of the authors of the Duckietown project; editor's note by "Paper") and asked what needed to be done to pursue research on autonomous car traffic in Russia as part of this project. He said: "Nothing special, just organize it; no problem."
At first, together with two students from LETI and AU, we tried to quickly retrace the path the MIT students had taken the previous semester: starting from scratch, we built one car, a road, and an environment, then launched and debugged the basic algorithms. From MIT we took the main idea, the descriptions, and the source code.
In 2017, we launched a course at the Computer Science Center, where students and schoolchildren created a full-scale model city with a dozen autonomous Duckiebots and solved several new algorithmic problems.
One can draw an analogy between Duckietown and Linux: when the source code of that OS was first published, it had very few capabilities. But everyone interested in new features began contributing their own work to the project. The idea here is the same, only the goal is a standard open platform for autonomous navigation and unmanned vehicles. We joined this project: we mastered what already existed and started adding new features.
Duckietown is designed so that the entry threshold is very low: you do not need five years of university; even a school student can handle the basic things. Anyone interested in the topic can join the project, start building something, and immediately see the results of their efforts.
How robots navigate in space
An autonomous car is a robot that must solve a seemingly unsolvable problem: navigating unknown surroundings and acting independently. For now only a human is capable of this, and many researchers dream of creating autonomous algorithms.
There is a class of problems called SLAM, Simultaneous Localization and Mapping, whose general formulation is very simple. A mobile robot has a set of sensors, for example a camera and a rangefinder. The robot is placed in an unknown environment and, using the sensor readings, it moves around while simultaneously solving two problems: building a map of the space and determining its own coordinates on that map.
If you take a good robot vacuum cleaner and put it in a room, then after a series of movements it will understand how the space is arranged and will be able to build a map and plan its route. But this is almost never enough, because the room also contains objects that must be recognized and reacted to.
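The mapping half of that vacuum-cleaner example can be caricatured in a few lines: the robot marks every grid cell it passes through. This toy sketch (entirely my own; real SLAM must also handle noisy sensors and uncertain position) shows only the map-building side:

```python
# Toy occupancy map: the robot records every cell it visits on a grid.
# Real SLAM estimates the pose probabilistically; here motion is exact.

def build_map(grid_size, path):
    """Follow a path of compass moves and return the set of visited cells."""
    x = y = 0
    visited = {(0, 0)}
    moves = {"N": (0, 1), "S": (0, -1), "E": (1, 0), "W": (-1, 0)}
    for step in path:
        dx, dy = moves[step]
        # Clamp to the grid, as if bumping into a wall.
        x = max(0, min(grid_size - 1, x + dx))
        y = max(0, min(grid_size - 1, y + dy))
        visited.add((x, y))
    return visited

# Sweep along one wall, then turn: the map now covers those cells.
cells = build_map(5, "EEEN")
print(sorted(cells))
```

The hard part that this sketch omits is exactly what SLAM is about: when motion and sensing are noisy, the robot must estimate both the map and its own position in it at the same time.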
A similar task applies to cars: the car drives, refines the map, and localizes itself in it; for this there are markers that the autonomous navigation system can recognize. For example, a Duckiebot looks primarily at the lane markings. If it sees them, it reasons: "Okay, I am somewhere in the lane; I must drive parallel to this line." But it still cannot build a global map.
The task, however, is not simply to follow the lane; the car must also be controlled. Sometimes there are stop lines or obstacles, and the machine must respond to the circumstances. It "looks" at the world with its sensors, sees a stop line, for example, and stops. It analyzes how the intersection is arranged, whether there is a traffic light or other cars, and how it can exit. Then it must plan a trajectory that will take it through the intersection and back into the lane.
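The stop-line behavior just described is often structured as a small state machine. Here is one possible sketch (states and transition rules are my illustrative assumptions, not any real autopilot's logic):

```python
# A tiny behavior state machine for the stop-line scenario described above.

def next_state(state, sees_stop_line, intersection_clear):
    """Advance the driving behavior one step based on what the sensors see."""
    if state == "LANE_FOLLOWING" and sees_stop_line:
        return "STOPPED"                 # brake at the stop line
    if state == "STOPPED" and intersection_clear:
        return "CROSSING"                # plan and follow a crossing trajectory
    if state == "CROSSING" and not sees_stop_line:
        return "LANE_FOLLOWING"          # back in the lane, resume following
    return state                         # otherwise keep doing what we do

state = "LANE_FOLLOWING"
state = next_state(state, sees_stop_line=True, intersection_clear=False)  # stop
state = next_state(state, sees_stop_line=True, intersection_clear=False)  # wait
state = next_state(state, sees_stop_line=True, intersection_clear=True)   # go
state = next_state(state, sees_stop_line=False, intersection_clear=True)  # done
print(state)
```

The intersection-negotiation protocol the students built determines when `intersection_clear` becomes true for each car.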
Speaking of autonomous driving more globally, real cars have more powerful aids, for example the GPS system. But GPS gives inaccurate coordinates: the error sometimes reaches 3-5 meters, which on the road can mean the oncoming lane. So GPS alone is not enough. There are electronic maps that the navigator loads, but they have problems too: a road is dug up here, a sign is put up there, and the maps do not reflect this instantly.
Duckietown is interesting because the focus is on autonomy: there is no external system. Many navigation projects assume a powerful server nearby that monitors and manages everything. Here, each car is completely autonomous.
What should be the infrastructure for autonomous cars
Infrastructure certainly helps, although by itself it will not solve the problem of autonomous driving. First, consider why localization is difficult. Take indoor navigation: a mobile robot moves, looks at a wall, takes sensor measurements, and localizes itself with an error caused by noise in the data. This error can be corrected by finding distinctive points in the measurements, say, in the image from the camera.
But even here, not everything is smooth. For example, I look at a wall and see some irregularities; then the sun starts shining from the other side, and I no longer see those irregularities but something else instead. The vast majority of algorithms suffer from a lack of robustness: the robot returns to a point where it has been before but cannot recognize it. The observation conditions have changed, and it thinks it is in a different part of the space.
Outdoors there is morning and evening, summer and autumn. In winter everything is covered with snow, in autumn with leaves. It starts to rain, and everything looks completely different. How is the robot to recognize the place? Infrastructure can help here: the space can be marked with artificial tags, for example QR codes that encode exact coordinates. Then an unmanned vehicle that sees a QR code can read the coordinates and radically correct its accumulated error.
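The drift-correction idea is worth making concrete. In this one-dimensional sketch (all numbers and names are illustrative), odometry has accumulated error, but one sighting of a tag with known coordinates fixes the estimate:

```python
# Landmark-based correction: the tag's world coordinate is known (it is
# encoded in the QR code), and the sensor measures where the tag is
# relative to the robot, so the robot's own position follows directly.

def correct_with_landmark(estimated_pos, tag_world_pos, tag_relative_pos):
    """Replace the drifted estimate with the landmark-derived position."""
    return tag_world_pos - tag_relative_pos

estimate = 10.7          # drifted odometry estimate, meters along the road
tag_world = 12.0         # coordinate stored inside the QR code
tag_seen_at = 2.0        # rangefinder: the tag is 2 m ahead of the robot
estimate = correct_with_landmark(estimate, tag_world, tag_seen_at)
print(estimate)          # 10.0: the ~0.7 m of accumulated drift is gone
```

In practice the landmark measurement is itself noisy, so real systems blend it with the running estimate (e.g. in a Kalman filter) rather than overwrite it, but the effect is the same: a hard bound on accumulated drift at every tag.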
A person has a very important ability that robots do not yet possess: recognizing images, objects in particular. I look at a chair and know that it is a chair. That is enough for me to understand its shape, to picture the legs underneath, to estimate how far it is from me. I have a three-dimensional model in my head. The robot does not; it must build one from a flat picture that looks different from different points of view.
There is one more problem, which may well be solved soon. There is a class of algorithms called Structure from Motion (SfM): reconstructing the structure of a space from observations made while moving. You can track the feature points of objects and recover their structure. Machine learning methods, which are now flourishing, will help solve this problem. Neural networks recognize images quite well, which means they can help with orientation in a real environment, and a great deal of training data has now been accumulated.
Technically, creating infrastructure that helps unmanned vehicles feel confident in an urban environment is quite simple. One could, for example, allocate separate lanes for such vehicles; that is not hard even now. But the problem that remains is teaching the car to move in an unknown, dynamic environment, reasoning from the traffic situation.
How toy cars relate to global science
Duckietown is real technology, and that is exactly the beauty of the project. Everything used in industry is used here. Moreover, if a new method or algorithm appears outside the project, why not try it in Duckietown? There are no restrictions. And if the new algorithm is open source, it can simply be taken and integrated.
The Duckietown project can be seen as a platform for experimentation. If a new scientific article comes out, you can implement its ideas here. Conversely, if something fundamentally new arises within the project, it can be written up as an article, presented at a conference, and published. One of the goals of the project is to conduct research on its basis and produce scientific work.
As for the value of Duckietown for the students who take part in it, they can be sure that their knowledge is at the forefront of technology.