A close-up shows the illuminated blind spot monitoring icon on a 2021 Toyota Sequoia.

Behind the Curtain – How Advanced Sensors Enable Automated Driving

Automated driving is here, and we’re not just talking about the recent release of Tesla’s Full Self-Driving 9.0 Beta. Pretty much every new car (and most recent used models) is officially considered to have some level of driving automation according to the industry standards set out by the SAE. While there is still no commercially available car that has reached the vaunted Level 3, it is difficult to find a modern vehicle that isn’t Level 1 or Level 2, and the best are only a few technicalities short of being Level 3. But what sort of hardware changes to modern cars have enabled this nearly universal level of self-driving technology?

The development of modern computers with high reliability and fast processor speeds has been instrumental to the implementation of automated driver-assist technologies. However, a computer alone is useless without some way to perceive the environment around it. Rudimentary driver-assist features like anti-lock brakes, traction control, and stability control needed only inputs describing what your own vehicle was doing (wheel speeds, steering angle, and the like), which were fairly simple to incorporate into cars. But accurately understanding the environment and obstacles outside of your car required a much higher level of technology.

Electronic Eyes

While every manufacturer has developed its own specific systems that allow its models to perceive the world around them, they all draw on the same handful of sensor types. Some driver-assist systems blend different sensor types for different purposes, while others rely on a single type of sensor. The main categories are:

  • Cameras – The cheapest and most common type of sensor, cameras are generally mounted individually or in pairs looking through the front windshield. They are very flexible but have the same limitations as the human eye – they don’t perform well in darkness or bad weather and can be confused by bright lights and optical illusions.
  • Thermal – Technically a type of camera, thermal imagers “see” in the IR spectrum rather than the visual spectrum. This allows them to detect heat sources like cars, pedestrians, or animals, even in darkness or bad weather. However, thermal imagers cannot provide the same range of information as normal cameras.
  • Ultrasonic – A fairly simple short-range system, ultrasonic sensors emit high-pitched sound waves and time the returning echoes to detect nearby obstacles (the basic ranging math is sketched just after this list). They are commonly used in parking assist systems but lack the resolution for more complicated tasks.
  • Radar – While you might imagine big, rotating arrays when you hear the word “radar,” automotive radars use small flat panels with digital beamforming to scan their surroundings. Modern millimeter wave arrays can provide extremely detailed pictures of their surroundings and work in all weather and lighting conditions. The downside? Higher costs compared to cameras.
  • Lidar – As you might guess from the name, lidar is very similar to radar. However, rather than emitting radio waves, lidar uses lasers. This technology provides greater precision than even millimeter wave radar but is far more expensive and requires bulky equipment.
  • GPS – Although technically not a type of sensor, more and more manufacturers are beginning to integrate GPS navigation data into their automated driving systems. This can give the car a heads-up as to what lies down the road, allowing it to slow down before approaching a sharp corner or intersection, for instance.
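
Ultrasonic sensors, radar, and lidar all measure distance the same basic way: they emit a pulse, wait for the echo, and convert the round-trip time into a range. Here is a minimal sketch of that time-of-flight calculation in Python; the echo times are made-up illustrative values, not figures from any particular sensor.

```python
# Minimal time-of-flight ranging sketch for ultrasonic, radar, and lidar sensors.
# The sensor emits a pulse and times how long the echo takes to return; the
# distance to the obstacle is half the round-trip path length.

SPEED_OF_SOUND = 343.0        # m/s in air (ultrasonic)
SPEED_OF_LIGHT = 299_792_458  # m/s (radar and lidar)

def distance_from_echo(round_trip_seconds: float, wave_speed: float) -> float:
    """Convert a round-trip echo time into a one-way distance in meters."""
    return wave_speed * round_trip_seconds / 2

# An ultrasonic echo returning after ~6 ms corresponds to roughly 1 m.
print(f"Ultrasonic: {distance_from_echo(0.006, SPEED_OF_SOUND):.2f} m")

# A radar or lidar echo returning after ~200 ns corresponds to roughly 30 m.
print(f"Radar/lidar: {distance_from_echo(200e-9, SPEED_OF_LIGHT):.2f} m")
```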

The most common sensor types in modern vehicles are cameras, ultrasonic sensors, and radar. Thermal imagers only appear on a handful of high-end models, while lidar is yet to be offered on a commercially available vehicle (although many designers of self-driving cars see it as the ideal sensor technology and have used it on their prototypes).

To get a better idea of how these sensors work together, let’s take a look at two examples at opposite ends of the spectrum: Toyota and Tesla. Both of these companies are aggressively pursuing automated driving technologies but are doing so in different ways. As usual, Toyota has adopted a conservative approach very similar to what most other manufacturers use. In contrast, Tesla has forged its own path with a truly one-of-a-kind system.

A close-up shows the infotainment screen in a 2021 Subaru Forester with the EyeSight driver-assist system.

Toyota Safety Sense 2.5 Plus (TSS 2.5+)

Now standard on nearly every new Toyota model, TSS 2.5+ was introduced just last year and is the latest generation of Toyota’s driver-assist technology. It includes six main features and relies on two sensors: a forward-facing camera mounted near the rearview mirror behind the windshield and a forward-facing radar mounted behind the Toyota badge on the grille.

Higher trims of current Toyota models supplement the basic TSS 2.5+ features with additional safety technologies such as Blind Spot Monitor and Rear Cross Traffic Alert. Although advertised as two separate features, these systems always come as a package because they use the same pair of rear-facing radars hidden behind the rear bumper – two radars are needed to cover both sides of the car.

Another safety feature that brings added sensors is Toyota’s Bird’s Eye View Camera. This system uses small cameras mounted in the front bumper and the bottom of each side mirror, plus the standard rearview camera, to construct a picture of the vehicle’s surroundings. The final Toyota safety feature is Intelligent Clearance Sonar, which relies on no fewer than eight ultrasonic sensors – four in the front bumper and four in the rear bumper – to detect nearby obstacles at low speeds.

Most manufacturers offer similar advanced safety features with similar sensor arrangements to what Toyota employs, although there are differences between brands. For instance, Subaru EyeSight uses a trademark pair of forward-facing cameras instead of the more common single camera, the argument being that a stereoscopic system can estimate distance in much the same way human eyes do (a rough sketch of that math follows below).
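
The stereo idea is easy to put into numbers: the farther away an object is, the smaller the shift (the disparity) between where it appears in the left and right images. Below is a rough sketch of the standard pinhole-camera stereo relationship; the focal length, camera spacing, and disparity are illustrative values, not Subaru’s actual EyeSight parameters.

```python
# Estimating distance from a stereo camera pair, the principle behind systems
# like Subaru EyeSight. All numbers here are illustrative, not real EyeSight specs.

def stereo_depth(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole-camera stereo relationship: depth = focal length * baseline / disparity.

    focal_length_px -- camera focal length expressed in pixels
    baseline_m      -- spacing between the two camera lenses, in meters
    disparity_px    -- horizontal shift of the object between the two images
    """
    return focal_length_px * baseline_m / disparity_px

# Example: 1,400 px focal length, cameras 35 cm apart, 10 px disparity -> ~49 m.
print(f"Estimated distance: {stereo_depth(1400, 0.35, 10):.1f} m")
```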

Interestingly, Toyota may have a small leg up on its competitors, as the company owns 35% of Denso Ten, one of the leading suppliers of automotive millimeter wave radars. Denso Ten’s parent company, Denso, is also part of the larger Toyota group alongside the Toyota Motor Corporation. This is fairly uncommon, as most manufacturers source their sensors from third-party suppliers. That said, Toyota’s program is not entirely in-house: the company recently contracted ZF and Mobileye to develop next-generation sensors.

Tesla Autopilot

If there is an automaker that is known for going its own way, it is Tesla. The EV brand is well-known for aggressively developing self-driving technologies, but it recently made waves by announcing that, on top of not pursuing lidar technology, it would end its use of radar and rely entirely on cameras. This decision was immediately criticized by most commentators, but it does have advantages in terms of affordability, simplicity, and the amount of information the system can gather.

With this update, Tesla Autopilot will rely on a suite of eight cameras positioned around the vehicle. Three forward-facing cameras are mounted behind the windshield, one each dedicated to detecting long-range, mid-range, and short-range obstacles. Each side of the car has two cameras: a rearward-facing camera mounted on the front fender and a forward-facing camera mounted in the B pillar. The eighth camera is a fairly conventional rearview camera mounted on the back of the vehicle.
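
Restated as a simple data structure, the camera layout described above looks something like the sketch below. The labels and groupings just follow the article’s description; they are not Tesla’s internal naming.

```python
# The eight-camera Autopilot layout described above, restated as a dictionary.
# Position labels follow the article's description, not Tesla's own terminology.

autopilot_cameras = {
    "windshield (forward)": ["long range", "mid range", "short range"],
    "left front fender":    ["rearward looking"],
    "left B pillar":        ["forward looking"],
    "right front fender":   ["rearward looking"],
    "right B pillar":       ["forward looking"],
    "rear":                 ["rearview"],
}

total = sum(len(views) for views in autopilot_cameras.values())
print(f"Total cameras: {total}")  # prints 8
```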

However, despite dropping radar, Tesla has not yet committed to cameras alone. It still includes an array of 12 ultrasonic sensors sourced from Valeo, a French automotive supplier. Six sensors are installed in the front bumper and six in the rear bumper, providing detection and ranging of obstacles too close for the cameras to see properly.

A close-up shows a Tesla Autopilot camera.

The Future of Sensors

Compared to Toyota’s more mainstream approach, Tesla uses only two types of sensors but actually employs a much larger number of individual sensors: 20 in total (eight cameras plus 12 ultrasonic sensors) compared to 16 for a fully equipped Toyota. Meanwhile, a base model Toyota will have only two individual sensors, plus a backup camera that is not integrated with any automated driving technology. While it could be argued that the use of radar means Toyota doesn’t need as many individual sensors, it seems likely that Tesla’s greater number of fully integrated sensors can provide a much more detailed picture of the vehicle’s surroundings.
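
For anyone who wants to check the math, the tallies come straight from the sensor counts quoted earlier in the article; the groupings below are just a convenient restatement, not official parts lists.

```python
# Tallying the individual sensors described earlier in the article.

fully_equipped_toyota = {
    "forward-facing camera": 1,
    "forward-facing radar": 1,
    "rear-facing radars (blind spot / rear cross traffic)": 2,
    "bird's eye view cameras (front bumper, two mirrors, rearview)": 4,
    "ultrasonic sensors (clearance sonar)": 8,
}

tesla_autopilot = {
    "cameras": 8,
    "ultrasonic sensors": 12,
}

print(sum(fully_equipped_toyota.values()))  # prints 16
print(sum(tesla_autopilot.values()))        # prints 20
```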

As driver-assist and self-driving technologies continue to advance, the development of the sensors needed to support them will have to keep pace. There is still a large effort to develop new types of sensors, from lidar to vehicle-to-vehicle communications, but we may well have already seen the peak of sensor diversity. If Tesla’s new camera-based approach pans out, the bulk of future development may well be focused on the software and processing side of the equation rather than on designing more advanced (and expensive) sensor technologies.