October 12, 2017
Getting to the finish line faster: Optical technologies for self-driving cars
The automotive industry is primed for historic disruption. Practically every car company is investing in autonomous vehicles, with Ford, General Motors, Honda, Toyota, Tesla, and others racing to bring them to market. Google, Apple, and several other high-tech companies are also serious contenders, and their substantial R&D investments in autonomous vehicles are putting pressure on traditional car manufacturers.
Ford is expected to roll out a fleet of driverless cars in 2021. And, in an April 2017 TED Talk, Tesla CEO Elon Musk said Tesla is on track “for being able to go cross-country from LA to New York by the end of the year, fully autonomous…no controls touched at any point during the journey.” People who are babies today may never need a driver’s license. In just four years, teenagers could opt to hail a self-driving car to the school dance instead of asking Mom to drop them off.
By 2030, McKinsey predicts, up to 15 percent of new vehicles sold could be fully autonomous. With such stiff competition, a faster time to market is critical – a fact that our customers know well. Many appear on this Business Insider list of those most likely to reach the finish line first.
How optical technologies help autonomous vehicles
Autonomous cars are possible thanks to a combination of optical technologies that utilize multiple lasers and cameras. Together, these technologies enable self-driving vehicles to navigate the road. In addition to getting riders from point A to B, guidance systems for autonomous vehicles can:
Warn against collisions
Detect vehicles in blind spots
Maneuver cars into parking spots
See things our eyes cannot
Some technologies are more cost-effective than others. LIDAR uses pulsed laser light to build a 3D map of a car’s surroundings, but it is one of the costlier optical technologies.
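The 3D mapping idea can be sketched in a few lines: a LIDAR unit times how long each laser pulse takes to bounce back, converts that round trip into a range, and turns the range plus the beam's direction into a 3D point. The function names and the example timing below are illustrative assumptions, not the API of any specific sensor.

```python
# Illustrative sketch of LIDAR ranging: time-of-flight to range, then
# (range, azimuth, elevation) to a Cartesian point. Not a real sensor API.
import math

C = 299_792_458.0  # speed of light in m/s

def tof_to_range(round_trip_s: float) -> float:
    """Range in meters from a pulse's round-trip time (out and back)."""
    return C * round_trip_s / 2.0

def to_cartesian(r: float, azimuth_deg: float, elevation_deg: float):
    """Convert one return (range, beam azimuth, beam elevation) to (x, y, z)."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    return (r * math.cos(el) * math.cos(az),
            r * math.cos(el) * math.sin(az),
            r * math.sin(el))

# A pulse returning after roughly 133 ns corresponds to a target about 20 m away.
r = tof_to_range(133.4e-9)
point = to_cartesian(r, azimuth_deg=45.0, elevation_deg=0.0)
```

Sweeping the beam through many azimuth and elevation angles and repeating this calculation is what produces the familiar LIDAR point cloud.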
Cameras mounted on self-driving cars are the most cost-efficient of the technologies. These omnidirectional camera systems provide a 360-degree field of view in the horizontal plane; some designs capture an approximately spherical or hemispherical visual field. But cameras can’t work alone. All the technologies must work in harmony to ensure a safe autonomous ride. Given this, engineering teams designing optical systems for autonomous vehicles can’t afford design uncertainties and delays. The development of each optical component must be spot on.
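To make the 360-degree field of view concrete, here is a minimal sketch of how a pixel column in a stitched, equirectangular panorama maps to a bearing around the vehicle. The image width and the convention that column 0 looks straight ahead are assumptions for illustration, not part of any particular camera system.

```python
# Illustrative sketch: in an equirectangular 360-degree panorama, each pixel
# column corresponds to a fixed azimuth around the vehicle.
def column_to_azimuth(col: int, image_width: int) -> float:
    """Azimuth in degrees [0, 360) for a pixel column, assuming
    column 0 points straight ahead and azimuth increases to the right."""
    return (col % image_width) / image_width * 360.0

# In a 4000-pixel-wide panorama, column 1000 looks 90 degrees to the side.
bearing = column_to_azimuth(1000, 4000)
```

A detection at a given column can therefore be handed to the rest of the guidance system as a direction, which is one reason cameras pair naturally with range sensors such as LIDAR.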
Learn how to design an omnidirectional, catadioptric sensor in OpticStudio >
Solution from Zemax
To get to a high-confidence design more quickly, many companies designing optical systems for autonomous vehicles use OpticStudio and LensMechanix. Together, these modern virtual prototyping tools streamline the workflow and information exchange between optical and mechanical engineers.