<p>Elon Musk built his electric car company, Tesla, around the promise that it represented the future of driving — a phrase emblazoned on the automaker’s website.</p><p>Much of that promise was centered on Autopilot, a system of features that could steer, brake and accelerate the company’s sleek electric vehicles on highways. Over and over, Musk declared that truly autonomous driving was nearly at hand — the day when a Tesla could drive itself — and that the capability would be whisked to drivers over the air in software updates.</p><p>Unlike technologists at almost every other company working on self-driving vehicles, Musk insisted that autonomy could be achieved solely with cameras tracking a car’s surroundings. But many Tesla engineers questioned whether it was safe enough to rely on cameras without the benefit of other sensing devices — and whether Musk was promising drivers too much about Autopilot’s capabilities.</p><p>Now those questions are at the heart of an investigation by the National Highway Traffic Safety Administration after at least 12 accidents in which Teslas using Autopilot drove into parked firetrucks, police cars and other emergency vehicles, killing one person and injuring 17 others.</p><p>Families are suing Tesla over fatal crashes, and Tesla customers are suing the company for misrepresenting Autopilot and a set of sister services called Full Self Driving, or FSD.</p><p>As the guiding force behind Autopilot, Musk pushed it in directions other automakers were unwilling to take with this kind of technology, interviews with 19 people who worked on the project over the past decade show. Musk repeatedly misled buyers about the services’ abilities, many of those people say. All spoke on the condition of anonymity, fearing retaliation from Musk and Tesla.</p><p>Musk and a top Tesla lawyer did not respond to multiple email requests for comment for this article over several weeks, including a detailed list of questions. But the company has consistently said that the onus is on drivers to stay alert and take control of their cars should Autopilot malfunction.</p><p>Since the start of Tesla’s work on Autopilot, there has been a tension between safety and Musk’s desire to market Tesla cars as technological marvels.</p><p>For years, Musk has said Tesla cars were on the verge of complete autonomy. “The basic news is that all Tesla vehicles leaving the factory have all the hardware necessary for Level 5 autonomy,” he declared in 2016. The statement surprised and concerned some of those working on the project, since the Society of Automotive Engineers defines Level 5 as full driving automation.</p><p>More recently, he has said that new software — currently part of a beta test by a limited number of Tesla owners who have bought the FSD package — will allow cars to drive themselves on city streets as well as highways.
But as with Autopilot, Tesla documentation says drivers must keep their hands on the wheel, ready to take control of the car at any time.</p><p>Regulators have warned that Tesla and Musk have exaggerated the sophistication of Autopilot, encouraging some people to misuse it.</p><p>“Where I get concerned is the language that’s used to describe the capabilities of the vehicle,” said Jennifer Homendy, chair of the National Transportation Safety Board, which has investigated accidents involving Autopilot and criticized the system’s design. “It can be very dangerous.”</p><p>In addition, some who have long worked on autonomous vehicles for other companies — as well as seven former members of the Autopilot team — have questioned Tesla’s practice of constantly modifying Autopilot and FSD through software updates pushed out to drivers, saying the practice can be hazardous because buyers are never quite sure what the system can and cannot do.</p><p>Hardware choices have also raised safety questions. Within Tesla, some argued for pairing cameras with radar and other sensors that worked better in heavy rain and snow, bright sunshine and other difficult conditions. For several years, Autopilot incorporated radar, and for a time Tesla worked on developing its own radar technology. But three people who worked on the project said Musk had repeatedly told members of the Autopilot team that humans could drive with only two eyes and that this meant cars should be able to drive with cameras alone.</p><p>They said he saw this as “returning to first principles” — a term Musk and others in the technology industry have long used to refer to sweeping aside standard practices and rethinking problems from scratch.</p><p>In May, Musk said on Twitter that Tesla was no longer putting radar on new cars. He said the company had tested the safety implications of not using radar but provided no details.</p><p>Some people have applauded Musk, saying a certain amount of compromise and risk was justified as he strove to reach mass production and ultimately change the automobile industry.</p><p>But recently, even Musk has expressed some doubts about Tesla’s technology. After repeatedly describing FSD in speeches, in interviews and on social media as a system on the verge of full autonomy, Musk in August called it “not great.” The team working on it, he said on Twitter, “is rallying to improve as fast as possible.”</p><p><strong>Cameras as Eyes</strong></p><p>Tesla began developing Autopilot more than seven years ago as an effort to meet new safety standards in Europe, which required technology such as automatic braking, according to three people familiar with the origins of the project.</p><p>The company originally called this an “advanced driver assistance” project but was soon exploring a new name.
Executives led by Musk decided on “Autopilot,” although some Tesla engineers objected to the name as misleading, favoring “Copilot” and other options, these three people said.</p><p>The name was borrowed from the aviation systems that allow planes to fly themselves in ideal conditions with limited pilot input.</p><p>At Autopilot’s official announcement in October 2014, Tesla said that the system would brake automatically and keep the car in a lane but added that “the driver is still responsible for, and ultimately in control of, the car.” It said that self-driving cars were “still years away from becoming a reality.”</p><p>At the beginning, Autopilot used cameras, radar and sound-wave sensors. But Musk told engineers that the system should eventually be able to drive autonomously from door to door — and that it should do so solely with cameras, according to three people who worked on the project.</p><p>They said the Autopilot team continued to develop the system using radar and even planned to expand the number of radar sensors on each car, while also exploring lidar — “light detection and ranging” devices that measure distances using laser pulses.</p><p>But Musk insisted that his two-eyes metaphor was the way forward and questioned whether radar was ultimately worth the headache and expense of buying and integrating the technology from third parties, four people who worked on the Autopilot team said.</p><p>Over time, the company and the team moved closer to his way of thinking, placing more emphasis on camera technology, these people said.</p><p>Other companies developing driver-assistance systems and fully autonomous cars thought cameras were not enough. Google, for example, outfitted its self-driving test cars with expensive lidar devices as big as buckets mounted on the roof.</p><p>Cameras, by contrast, were cheap and small, which made them appealing to Tesla for its sleek cars. Radar, which uses radio waves and has been around for decades, was cheaper than lidar, a less common technology. But according to three people who worked on the project, some engineers backed Musk’s cameras-only approach, arguing that radar was not always accurate and that it was difficult to reconcile radar data with information from cameras.</p><p>Autonomous driving experts said Musk’s cameras-as-eyes analogy was deeply flawed, as did eight former Autopilot engineers interviewed for this article, although some said there were colleagues who supported Musk’s view.</p><p>Aesthetics also influenced decisions about radar.</p><p>In late 2014, Tesla began installing radar on its Model S sedans as it prepared to roll out the first version of Autopilot.
But Musk did not like the way the radar looked, sitting in an open hole in the front of the cars, and told his engineers to install a rubber seal, according to two people who worked on the project at the time, even though some employees warned that the seal could trap snow and ice and prevent the system from working properly.</p><p>These people said the company went ahead with Musk’s instructions without testing the design in winter weather — and addressed the problem only after customers complained that the radar stopped working in winter.</p><p>In mid-2015, Musk met with a group of Tesla engineering managers to discuss their plans for the second version of Autopilot. One manager, an auto industry veteran named Hal Ockerse, told Musk he wanted to include a computer chip and other hardware that could monitor the physical components of Autopilot and provide backup if parts of the system suddenly stopped working, according to two people with knowledge of the meeting.</p><p>But Musk slapped down the idea, they said, arguing it would slow the progress of the project as Tesla worked to build a system that could drive cars by themselves. Already angry after Autopilot had malfunctioned on his morning drive that day, Musk berated Ockerse for even suggesting the idea. Ockerse soon left the company.</p><p>By the end of 2015, Musk was publicly saying that Teslas would drive themselves within about two years. “I think we have all the pieces, and it’s just about refining those pieces, putting them in place, and making sure they work across a huge number of environments — and then we’re done,” he told Fortune magazine.</p><p>Other companies exploring autonomous driving, such as Google, Toyota and Nissan, were not nearly as optimistic in their public statements.</p><p><strong>A Fatal Accident</strong></p><p>In May 2016, about six months after Musk’s remarks appeared in Fortune, a Model S owner, Joshua Brown, was killed in Florida when Autopilot failed to recognize a tractor-trailer crossing in front of him. His car had radar and a camera.</p><p>Musk held a short meeting with the Autopilot team and briefly addressed the accident. He did not delve into the details of what went wrong but told the team that the company must work to ensure that its cars did not hit anything, according to two people who were part of the meeting.</p><p>Tesla later said that during the crash, Autopilot’s camera could not distinguish between the white truck and the bright sky. Tesla has never publicly explained why the radar did not prevent the accident. Radar technology, like cameras and lidar, is not flawless. But most in the industry believe that this is precisely why cars need as many types of sensors as possible.</p><p>Less than a month after the crash, Musk said at an event hosted by Recode, a tech publication, that autonomous driving was “basically a solved problem” and that Teslas could already drive more safely than humans. He made no mention of the accident in which Brown was killed, although Tesla said in a blog post a few weeks later — headlined “A Tragic Loss” — that it had immediately reported the episode to federal regulators.</p><p>Although it is not clear that they were influenced by the fatal accident, Musk and Tesla soon showed a renewed interest in radar, according to three engineers who worked on Autopilot. The company began an effort to build its own radar technology rather than using sensors built by other suppliers.
The company hired Duc Vu, an expert in the field, from the auto parts company Delphi in October 2016.</p><p>But 16 months later, Vu abruptly parted ways with the company after a disagreement with another executive over a new wiring system in Tesla’s cars, the three people said. In the weeks and months that followed, other members of the radar team left as well.</p><p>In the months after those departures, Tesla reclassified the radar effort as a research undertaking rather than one actively aimed at production, the three people said.</p><p><strong>The Quest for Fully Autonomous Cars</strong></p><p>As Tesla approached the introduction of Autopilot 2.0, most of the Autopilot team dropped their normal duties to work on a video meant to show just how autonomous the system could be. But the final video did not provide a full picture of how the car operated during the filming.</p><p>The route taken by the car had been charted ahead of time by software that created a 3D digital map, a feature unavailable to drivers using the commercial version of Autopilot, according to two former members of the Autopilot team. At one point during filming, the car hit a roadside barrier on Tesla property while using Autopilot and had to be repaired, three people who worked on the video said.</p><p>The video was later used to promote Autopilot’s capabilities, and it is still on Tesla’s website.</p><p>When Musk unveiled Autopilot 2.0 in October 2016, he said at the news conference that all new Tesla cars now included the cameras, computing power and all other hardware they would need for “full self driving” — not a technical term, but one that suggested truly autonomous operation.</p><p>His statements took the engineering team by surprise, and some felt that Musk was promising something that was not possible, according to two people who worked on the project.</p><p>Sterling Anderson, who led the project at the time and later started an autonomous driving company called Aurora, told Tesla’s sales and marketing teams that they should not refer to the company’s technology as “autonomous” or “self-driving” because this would mislead the public, according to two former employees.</p><p>Some in the company may have heeded the advice, but Tesla was soon using the term “full self driving” as a standard way of describing its technology.</p><p>By 2017, Tesla was selling a set of services that the company has described as a more advanced version of Autopilot, calling the package Full Self Driving. Its features include responding to traffic lights and stop signs — and changing lanes without being prompted by the driver. The company sold the package for up to $10,000.</p><p>Engineers who have worked on the technology acknowledge that these services have yet to reach the full autonomy implied by the package’s name and promised by Musk in public statements. “I’m highly confident the car will drive itself for the reliability in excess of a human this year,” he said during an earnings call in January. “This is a very big deal.”</p><p>In early November, Tesla recalled nearly 12,000 vehicles that were part of the beta test of new FSD features after deploying a software update that the company said might cause crashes because of unexpected activation of the cars’ emergency braking system.</p><p>Schuyler Cullen, who oversaw a team that explored autonomous-driving possibilities at the South Korean tech giant Samsung, said in an interview that Musk’s cameras-only approach was fundamentally flawed.
“Cameras are not eyes! Pixels are not retinal ganglia! The FSD computer is nothing like the visual cortex!” said Cullen, a computer-vision specialist who now runs a startup that is building a new kind of camera-based sensor.</p><p>Amnon Shashua, CEO of Mobileye, a former Tesla supplier that has been testing technology similar to the electric-car maker’s, said Musk’s idea of using only cameras in a self-driving system could ultimately work, although other sensors may be needed in the short term. He added that while Musk might exaggerate the capabilities of the company’s technology, those statements shouldn’t be taken too seriously.</p><p>“One should not be hung up on what Tesla says,” Shashua said. “Truth is not necessarily their end goal. The end goal is to build a business.”</p>