
Challenges in current deployments 

Companies have started public testing of autonomous taxi services in the US. These vehicles typically operate at low speeds and nearly always with a safety driver on board.

A few of these autonomous taxi services are listed in the following table:

The fully autonomous vehicle announcements (including testing and beyond) are listed in the following table:

Note: Due to the COVID-19 pandemic, global lockdown timelines might be impacted.

However, despite these advances, one question remains: SDC development has been going on for decades, so why is it taking so long to become a reality? The reason is that an SDC is made up of many components, and the dream can only become a reality when those components are properly integrated. What we have today, therefore, is a collection of SDC prototypes from multiple companies, each showcasing its most promising technologies.

The key ingredients, or differentiators, of an SDC are the sensors, hardware, software, and algorithms it uses. A great deal of system and software engineering is required to bring these four differentiators together, and even the choice of differentiators plays an important role in SDC development.

In this section, we will cover existing deployments and the challenges associated with them. Tesla has recently revealed its advancements and the research it has conducted on SDCs. Currently, most Tesla vehicles are capable of supplementing the driver's abilities: they can take over the tedious task of maintaining lanes on highways, monitor and match the speeds of surrounding vehicles, and can even be summoned to you while you are not in the vehicle. These capabilities are impressive and, in some cases, even life-saving, but they are still far from a full SDC. Tesla's current vehicles still require regular input from the driver to ensure they are paying attention and capable of taking over when needed.

There are four primary challenges that automakers such as Tesla need to overcome in order to succeed in replacing the human driver. We'll go over these now.

Building safe systems

The first challenge is building a safe system. In order to replace human drivers, an SDC needs to be safer than a human driver. So, how do we quantify that? It is impossible to guarantee that accidents will not occur without real-world testing, and real-world testing carries that inherent risk.

We can start by quantifying how good human drivers are. In the US, the current fatality rate is about one death per one million hours of driving. This includes human error and irresponsible driving, so we can probably hold the vehicles to a higher standard, but that's the benchmark nonetheless. Therefore, an SDC needs to cause fewer fatalities than one every one million hours of driving, and currently, that is not the case. We do not have enough data to calculate accurate statistics here, but we do know that Uber's SDC required a human to intervene approximately every 19 kilometers (km). The first pedestrian fatality was reported in 2018, after a pedestrian was hit by one of Uber's autonomous test vehicles.
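Before we look at that incident in detail, it helps to put the benchmark into numbers. The following back-of-the-envelope sketch is purely illustrative: interventions are not fatalities, and the 40 km/h average test speed is our own assumption, but the gap between the two timescales shows how far current systems are from the human benchmark:

```python
# Back-of-the-envelope comparison of the human-driver benchmark with
# Uber's reported intervention rate. The average test speed below is
# an assumption for illustration only.

HUMAN_HOURS_PER_FATALITY = 1_000_000   # ~1 death per million driving hours (US)
UBER_KM_PER_INTERVENTION = 19          # a human had to intervene roughly every 19 km
ASSUMED_AVG_SPEED_KMH = 40             # assumed average urban test speed

hours_per_intervention = UBER_KM_PER_INTERVENTION / ASSUMED_AVG_SPEED_KMH
print(f"Hours between human interventions: ~{hours_per_intervention:.2f}")
print(f"Hours between human-driver fatalities: ~{HUMAN_HOURS_PER_FATALITY:,}")
```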

The car was in self-driving mode, with a human backup driver sitting in the driver's seat. Uber halted its SDC testing in Arizona, where such testing had been approved since August 2016, and opted not to renew its California self-driving trial permit when it expired at the end of March 2018. The vehicle that hit the pedestrian was fitted with LIDAR sensors, which, unlike cameras, do not depend on ambient light to detect objects. Even so, the test vehicle made no attempt to slow down, and the human backup driver, who was supposed to be monitoring the road, was not paying attention.

According to the data Uber obtained, the vehicle first observed the pedestrian with its RADAR and LIDAR sensors 6 seconds before the impact, while traveling at 70 kilometers per hour. The vehicle continued at the same speed, and as the paths of the pedestrian and the car converged, the classification algorithm kept trying to work out what kind of object was in its view, switching its identification from an unidentified object, to a car, to a cyclist, without ever predicting the pedestrian's path. Only 1.3 seconds before the crash did the vehicle recognize the pedestrian and determine that emergency braking was required, but it did not brake, as it had been programmed not to perform emergency braking maneuvers on its own.

The algorithm's own prediction was that a deceleration of more than 6.5 meters per second squared was needed to avoid the collision. The human operator was expected to intervene at this point, but the vehicle was not designed to alert the driver. The driver did intervene moments before the impact, grabbing the steering wheel and braking, which brought the vehicle's speed down to 62 kilometers per hour, but it was too late to save the pedestrian. Nothing in the car malfunctioned and everything worked as designed; it was a clear case of bad programming. The internal computer had simply not been programmed to deal with this uncertainty, whereas a human driver would normally slow down when confronted with an unknown hazard. Even with high-resolution LIDAR, the vehicle failed to recognize the pedestrian in time.
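The physics of the reported timeline is easy to reproduce. The following sketch assumes a constant speed of 70 km/h until braking would have started, which is a simplification of the real event, but it shows why recognizing the pedestrian only 1.3 seconds before impact left no room for a normal stop:

```python
# Kinematics sketch of the reported timeline. Assumes constant speed
# until braking starts; values are approximations for illustration.

speed_ms = 70.0 / 3.6                        # 70 km/h is roughly 19.4 m/s

def required_deceleration(speed_ms: float, time_to_impact_s: float) -> float:
    """Deceleration (m/s^2) needed to stop within the remaining distance."""
    remaining_m = speed_ms * time_to_impact_s
    return speed_ms ** 2 / (2 * remaining_m)

for t in (6.0, 1.3):
    a = required_deceleration(speed_ms, t)
    print(f"{t:4.1f} s before impact: ~{speed_ms * t:6.1f} m left, "
          f"~{a:4.1f} m/s^2 needed to stop")
```

At 6 seconds out, a gentle deceleration of under 2 m/s² would have been enough; at 1.3 seconds out, the required deceleration already exceeds the 6.5 m/s² figure mentioned above.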

The cheapest computer and hardware 

Computer and hardware architecture plays a significant role in SDCs. As we know, a large part of the capability lies in the hardware itself and the programming that goes into it. Tesla has unveiled a new, purpose-built computer: a chip specifically optimized for running neural networks, designed so that it can be retrofitted into existing vehicles. The computer is of a similar size and power draw to the existing self-driving computer, yet it increases Tesla's processing capability by roughly 2,100%, handling 2,300 frames per second, 2,190 frames more than the previous iteration. This is a massive performance jump, and that processing power is needed to analyze the footage from Tesla's suite of sensors.
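The quoted figures are easy to sanity-check; the short sketch below simply back-calculates the previous throughput and the speed-up:

```python
# Sanity check on the quoted performance figures.
new_fps = 2_300          # frames per second on the new computer
gain_fps = 2_190         # improvement over the previous iteration
old_fps = new_fps - gain_fps

print(f"Previous throughput: {old_fps} fps")               # 110 fps
print(f"Speed-up: {new_fps / old_fps:.1f}x "
      f"(~{new_fps / old_fps * 100:.0f}% of the old capability)")
```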

The Tesla Autopilot hardware currently includes three forward-facing cameras, all mounted behind the windshield. One is a 120-degree wide-angle fisheye lens, which provides situational awareness by capturing traffic lights and objects moving into the path of travel. The second is a narrow-angle lens that provides the longer-range information needed for high-speed driving. The third is the main camera, which sits between the other two. Four additional cameras on the sides of the vehicle watch for vehicles unexpectedly entering a lane and provide the information needed to safely enter intersections and change lanes. The eighth and final camera is located at the rear; it doubles as a parking camera and is also used to avoid crashes from rear hazards.

The vehicle does not completely rely on visual cameras. It also makes use of 12 ultrasonic sensors, which provide a 360-degree picture of the immediate area around the vehicle, and one forward-facing RADAR:

Fig 1.3: Camera views
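For reference, the suite described above can be summarized as plain data. The structure and field names below are our own shorthand for illustration, not an official Tesla specification:

```python
# Illustrative summary of the sensor suite described above; the field
# names are our own shorthand, not an official Tesla specification.
SENSOR_SUITE = {
    "cameras": {
        "front_wide_fisheye": {"fov_deg": 120, "role": "traffic lights, cut-ins"},
        "front_main":         {"role": "general forward view"},
        "front_narrow":       {"role": "long-range, high-speed driving"},
        "side": 4,            # lane changes and intersections
        "rear": 1,            # parking and rear hazards
    },
    "ultrasonic_sensors": 12, # 360-degree close-range coverage
    "radar_forward": 1,
}

camera_count = 3 + SENSOR_SUITE["cameras"]["side"] + SENSOR_SUITE["cameras"]["rear"]
print(f"Total cameras: {camera_count}")   # 8
```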

Finding the correct sensor fusion has been a subject of debate among competing SDC companies. Elon Musk recently stated that anyone relying on LIDAR sensors (which work similarly to RADAR but utilize light instead of radio waves) is doomed. To understand why he said this, we will plot the strengths of each sensor on a RADAR chart, as follows:

Fig 1.4: RADAR chart
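A chart like the one above is straightforward to draw yourself using matplotlib's polar projection. In the sketch below, the axis names follow the discussion in the text, while the 0-5 scores are placeholder values rather than measured figures:

```python
# Minimal radar-chart sketch for comparing sensor strengths. The 0-5
# scores are placeholder values for illustration, not measured figures.
import numpy as np
import matplotlib.pyplot as plt

axes_labels = ["Resolution", "Range", "Speed detection",
               "Works in the dark", "Bad weather", "Small and cheap"]
lidar_scores = [5, 4, 3, 5, 3, 1]            # placeholder scores

angles = np.linspace(0, 2 * np.pi, len(axes_labels), endpoint=False).tolist()
scores = lidar_scores + lidar_scores[:1]     # repeat the first point to close the polygon
angles = angles + angles[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, scores, linewidth=2)
ax.fill(angles, scores, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(axes_labels)
ax.set_title("LIDAR strengths (illustrative scores)")
plt.show()
```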

LIDAR has great resolution; it provides highly detailed information about what it's detecting. It works in both low- and high-light situations and is also capable of measuring speed. It has a good range and works moderately well in poor weather conditions. Its biggest weakness is that these sensors are expensive and bulky. This is where the second challenge of building an SDC comes into play: building an affordable system that the average person can buy.

Let's look at the following RADAR chart:

Fig 1.5: RADAR chart - strengths

LIDAR sensors are the big sensors we see on the vehicles of Waymo, Uber, and most competing SDC companies. Elon Musk became well aware of LIDAR's potential after SpaceX used it in the DragonEye navigation sensor. For now, though, it is a drawback for Tesla, which has focused on building not just a cost-effective vehicle, but a good-looking one. Fortunately, LIDAR technology is gradually becoming smaller and cheaper.

Waymo, a subsidiary of Google's parent company, Alphabet, sells its LIDAR sensors to any company that does not intend to compete with its plans for a self-driving taxi service. When Waymo started in 2009, the per-unit cost of a LIDAR sensor was around $75,000, but by manufacturing the units itself it had reduced this to $7,500 as of 2019. Waymo vehicles use four LIDAR sensors, one on each side of the vehicle, putting the total cost of these sensors for a third party at $30,000. This sort of pricing does not line up with Tesla's mission of accelerating the world's transition to sustainable transport, and it has pushed Tesla toward a cheaper sensor fusion setup.

Let's look at the strengths and weaknesses of the three other sensor types (RADAR, camera, and ultrasonic sensors) to see how Tesla is moving forward without LIDAR.

First, let's look at RADAR. It works well in all conditions. RADAR sensors are small and cheap, they are capable of detecting speed, and their range is good for both short- and long-distance detection. Where they fall short is in the low-resolution data they provide, but this weakness can easily be compensated for by combining them with cameras. The plots for RADAR and cameras can be seen in the following image:

Fig 1.6: RADAR and camera plot

When we combine the two, this yields the following plot:

Fig 1.7: Added RADAR and camera plot

This combination has excellent range and resolution, provides the color and contrast information needed for reading street signs, and is extremely small and cheap. Combining RADAR and cameras allows each to cover the weaknesses of the other. They are still weak in terms of proximity detection, but using two cameras in stereo allows them to work like our eyes to estimate distance. When fine-tuned distance measurements are needed, we can use the ultrasonic sensors; we will sketch both of these distance-estimation ideas shortly. An example of an ultrasonic sensor can be seen in the following photo:

Fig 1.8: Ultrasonic sensor
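The stereo-camera distance estimation mentioned above comes down to triangulation: distance = focal length x baseline / disparity. The following is a minimal sketch with made-up camera parameters:

```python
# Stereo triangulation sketch: two cameras a known distance apart see
# the same object at slightly different pixel positions (disparity).
# The camera parameters below are made up for illustration.

FOCAL_LENGTH_PX = 1400.0    # focal length expressed in pixels
BASELINE_M = 0.3            # distance between the two cameras, in meters

def depth_from_disparity(disparity_px: float) -> float:
    """Estimated distance (m) to an object from its pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("Disparity must be positive")
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

for d in (60.0, 20.0, 5.0):
    print(f"Disparity {d:5.1f} px -> distance ~{depth_from_disparity(d):5.1f} m")
```

Note how the estimate becomes less precise as the disparity shrinks, which is why stereo cameras are best for mid-range distances.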

Ultrasonic sensors are the small circular sensors dotted around the car. In Tesla cars, eight surround cameras provide 360-degree coverage around the car at a range of up to 250 meters. This vision is complemented by 12 upgraded ultrasonic sensors, which can detect both hard and soft objects at nearly twice the range of the prior system. A forward-facing RADAR with enhanced processing provides additional data about the world on a redundant wavelength and is able to see through heavy rain, fog, dust, and even beyond the car ahead. This is a cost-effective solution. According to Tesla, its hardware is already capable of allowing its vehicles to self-drive; now, it just needs to keep improving the software algorithms. Tesla is in a fantastic position to make this work.
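Ultrasonic sensors, in turn, measure distance by time of flight: a pulse is emitted and the sensor times how long the echo takes to return. A minimal sketch, assuming the speed of sound in air is about 343 m/s:

```python
# Ultrasonic ranging sketch: distance = (speed of sound * echo time) / 2,
# since the pulse travels to the object and back.

SPEED_OF_SOUND_MS = 343.0   # m/s in air at roughly 20 degrees Celsius

def ultrasonic_distance(echo_time_s: float) -> float:
    """Distance (m) to the nearest obstacle given the round-trip echo time."""
    return SPEED_OF_SOUND_MS * echo_time_s / 2.0

for echo in (0.001, 0.006, 0.03):
    print(f"Echo after {echo * 1000:5.1f} ms -> obstacle at ~{ultrasonic_distance(echo):5.2f} m")
```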

In the following screenshot, we can see object detection using the camera sensor in Tesla cars:

Fig 1.9: Object detection using a camera sensor

When training a neural network, data is key. Waymo has logged millions of kilometers of driving to gather data, while Tesla has logged over a billion. Around 33% of all driving in Tesla cars is done with Autopilot engaged, and the data collection extends beyond Autopilot use: Tesla cars also collect data while being driven manually in areas where Autopilot is not allowed, such as city streets. Accounting for all the unpredictability of driving requires an immense amount of training data for a machine learning algorithm, and this is where Tesla's data gives it an advantage. We will read about neural networks in later chapters; one key point to note is that the more data you have to train a neural network on, the better it will be. Tesla's machine vision does a decent job, but there are still plenty of gaps.

The Tesla software places bounding boxes around the objects it detects, categorizing them as cars, trucks, bicycles, or pedestrians. It labels each with its velocity relative to the vehicle and the lane it occupies. It highlights drivable areas, marks the lane dividers, and projects a path between them. The system still struggles in complicated scenarios, and Tesla is working on improving the accuracy of its models while adding new functionality. Its latest SDC computer radically increases the available processing power, which will allow Tesla to keep adding functionality without immediately running into processing limits. However, even if Tesla manages to develop the perfect computer vision application, programming the vehicle to handle every scenario is another hurdle. This is a vital part of building not only a safe vehicle, but a practical self-driving vehicle.
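The per-object output described above can be thought of as a small record per detection. The sketch below mirrors that description with our own field names; it is not Tesla's internal format:

```python
# Illustrative representation of one detected object, mirroring the
# outputs described above (class, bounding box, relative velocity, lane).
# Field names are our own, not Tesla's internal format.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str                     # "car", "truck", "bicycle", or "pedestrian"
    bbox: tuple                    # (x_min, y_min, x_max, y_max) in pixels
    relative_velocity_ms: float    # positive = moving away from our vehicle
    lane: str                      # e.g. "ego", "left", "right"

frame_detections = [
    Detection("car", (412, 220, 590, 360), -2.5, "ego"),
    Detection("pedestrian", (120, 240, 160, 350), 0.0, "left"),
]

for det in frame_detections:
    print(f"{det.label:>10} in {det.lane} lane, "
          f"relative velocity {det.relative_velocity_ms:+.1f} m/s")
```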

Software programming

Another challenge is programming for safety and practicality, which are often at odds with each other. If we programmed a vehicle purely for safety, its safest option would be not to drive at all. Driving is an inherently dangerous activity, and programming for the multitude of scenarios that arise on the road is an extremely difficult task. It is easy to say "follow the rules of the road," but the problem is that humans don't follow the rules of the road perfectly, so programmers need to enable SDCs to react to that. Sometimes, the computer will have to make difficult decisions, perhaps even decisions that endanger the lives of its occupants or of people outside the vehicle. This is a daunting task, but if we continue improving the technology, we could start to see reduced road deaths, all while making taxi services drastically cheaper and freeing many people from the financial burden of purchasing a vehicle.

Tesla is in a fantastic position to gradually update their software as they master each scenario. They don't need to create the perfect SDC out of the gate, and with this latest computer, they are going to be able to continue their technological growth.

Fast internet

Finally, many processes in SDCs require fast internet connectivity. The 4G network is good enough for online streaming and smartphone games, but SDCs need next-generation technology such as 5G. Companies such as Nokia, Ericsson, and Huawei are researching how to deliver internet technology that specifically meets the needs of SDCs.

In the next section, we will look at the levels of autonomy for autonomous vehicles, as defined by the Society of Automotive Engineers (SAE).
