Musk Pushed Back Against Tesla Employees’ Autopilot Concerns: Report

2016 Tesla Model S

Tesla CEO Elon Musk’s drive to develop and market new driving technology is well known, but former employees say he brushed aside their concerns about the safety of the company’s Autopilot system.

Several employees, including a former Autopilot engineer, told CNNMoney that their concerns fell on deaf ears, as Musk usually fell back on a “bigger picture” position on safety.

The automaker’s semi-autonomous driving system came under scrutiny in the wake of a fatal May crash. Musk claims that although Autopilot didn’t recognize a transport truck in that case, the system makes roads safer. He’s pledged to do more to educate owners on how to use Autopilot effectively, but has no plans to stop offering the system.

Musk told the Wall Street Journal “we knew we had a system that on balance would save lives.”

Speaking to CNNMoney, ex-Autopilot engineer Eric Meadows claims he was pulled over by police in 2015 while testing Autopilot on a Los Angeles highway, a few months before the system’s release. The Tesla had difficulty handling turns, and the police suspected him of being intoxicated.

Meadows was later fired for performance reasons, but he claims his worries about Autopilot’s safety, specifically the possibility that owners would “push the limits” of the technology, grew over time.

“The last two months I was scared someone was going to die,” he said.

The report mentions a former Tesla executive who worked closely with Musk, and claims the CEO was regularly at loggerheads with “overly cautious” employees. Tesla’s self-parking feature went ahead as planned, another source claims, despite worries that sensors wouldn’t function properly if the vehicle was close to the edge of a steep slope. Again, the greater good of preventing driveway deaths overruled those concerns.

The employee mix at Tesla falls into two categories: younger, data-driven staff and seasoned automotive industry types. The report cites several sources who claim that data is the guiding factor in Tesla’s decisions, meaning slight risk is tolerated if it means greater potential for overall safety.

While this bothers some engineers and consumer safety groups, even the agency investigating the May crash sides with Musk’s views on safety. Recently, National Highway Traffic Safety Administration administrator Mark Rosekind said the industry “cannot wait for perfect” when it comes to marketing potentially life-saving autonomous technology.

[Image: Tesla Motors]


Driver in Fatal Tesla Crash Was Speeding While on Autopilot: NHTSA

2013 Tesla Model S Rear

The National Transportation Safety Board didn’t assign any blame in its initial report on the fatal May 7 crash of a Tesla Model S, but it did confirm new details.

The agency claims Joshua Brown’s vehicle was in Autopilot mode at the time of the crash, and was traveling above the 65 mile-per-hour speed limit before colliding with a tractor-trailer, according to Reuters.

Both the NTSB and the National Highway Traffic Safety Administration are investigating the crash, billed as the first fatality involving a self-driving car. Much of the investigation’s focus is on what role the semi-autonomous Autopilot system played in the crash.

According to the NTSB’s findings, the Traffic-Aware Cruise Control and Autosteer lane-keeping system on Brown’s vehicle were activated at the time of the crash. Tesla admitted that the vehicle’s Autopilot didn’t recognize the truck as it crossed the highway in front of Brown. The bright sunlight reflecting off the side of the trailer confused the system.

The preliminary report also found that Brown’s vehicle was traveling 74 miles per hour. While that is above the speed limit on that highway, many drivers set their cruise control to nine miles per hour over the limit to make time and avoid speeding tickets.

An NTSB official said speeding could have contributed to the crash, but is not the cause. A full report is due a year from now.

The collision sheared off the top of Brown’s Tesla, which traveled 297 feet after emerging from underneath the trailer. The Model S then hit a power pole, snapping it, before coming to rest 50 feet away. Truck driver Frank Baressi claimed he heard the film Harry Potter playing in the wreckage, but police stated that the portable DVD player and laptop found in the car weren’t running after the crash.

Baressi hasn’t been charged by Florida police.


Consumer Watchdog Slams Elon Musk, Demands Tesla Pull the Plug on Autopilot

Tesla HQ

America’s highest-profile consumer advocacy group is calling out Tesla CEO Elon Musk for waiting a month to disclose the potential risk posed to owners by the company’s Autopilot technology.

In a letter to Musk, Consumer Watchdog demands that Tesla sideline its Autopilot program until it can be proven safe, criticizes the CEO for sidestepping blame in several crashes, and accuses him of putting the public at risk.

Tesla’s semi-autonomous Autopilot system is continually updated based on owner feedback. The company’s practice of “beta testing” its products was called out by safety advocates after it was revealed on June 30 that Autopilot played a role in a fatal May 7 crash on a Florida highway.

For Consumer Watchdog, founded in 1985 with help from auto safety advocate Ralph Nader, the details of the crash are proof of a dangerous Autopilot flaw.

“An autopilot whose sensors can’t distinguish between the side of a white truck and a bright sky simply is not ready to be deployed on public roads,” reads the letter, signed by president Jamie Court and two executives. “Tesla must immediately disable the autopilot feature on all your cars until it can be proven to be safe. At a minimum, autopilot must be disabled until the full results of NHTSA’s investigation are released.”

Tesla’s admission of the crash coincided with the National Highway Traffic Safety Administration opening an investigation into the incident. Since then, the NHTSA has launched another investigation into the July 1 rollover crash of a Model X in Pennsylvania. That vehicle was allegedly driving in Autopilot mode at the time.

The automaker claims it informed the NHTSA of the May 7 crash on May 16, and sent an investigator to examine the wreckage on May 18. The automaker’s investigation was completed during the final week of May.

In its letter to Musk, Consumer Watchdog mentions Tesla’s “inexplicable delay” in notifying owners of the crash, calling the month-long gap “inexcusable.” The group goes on to say that beta testing shouldn’t be used on products that could have fatal consequences for the user, and accuses Tesla of using its customers as “guinea pigs.”

“You want to have it both ways with autopilot,” the letter reads. “On the one hand you extoll the supposed virtues of autopilot, creating the impression that, once engaged, it is self-sufficient. Your customers are lulled into believing their car is safe to be left to do the driving itself. On the other hand you walk back any promise of safety, saying autopilot is still in Beta mode and drivers must pay attention all the time.”

Statements made by Musk and the “Tesla Team” in the aftermath of the recent crashes and a rear-end collision last November amount to “victim blaming,” the group said. It demands that Autopilot only return when it can be proven safe, with a pledge from Tesla to be liable if any faults occur while the system is activated.


Tesla Faces Backlash Over Autopilot Technology in Wake of Crash

Tesla Model S, Image: Tesla Motors

Safety advocates are claiming that Tesla’s reputation as a top innovator in the automotive world could breed overconfidence in its new technology, putting drivers in danger.

The May 7 death of a Tesla driver whose car collided with a tractor trailer while in “Autopilot” mode sparked renewed calls for proper vetting of advanced technology in production vehicles, particularly when the technology allows the car to drive itself.

Joshua Brown was killed on a Florida highway after his 2015 Tesla Model S’s Autopilot mistook a brightly lit tractor trailer crossing the highway for the sky. The autonomous driving system didn’t react to the obstacle, leading to a fatal collision. The National Highway Traffic Safety Administration is now investigating the Model S and its Autopilot system.

Following the crash, the truck’s driver, Frank Baressi, claimed the victim was watching a movie at the time of the collision, saying he could hear the film Harry Potter playing from the Tesla’s wreckage.

Tesla vehicles cannot play videos on their infotainment screens, but Reuters now reports that the Florida Highway Patrol found a portable, aftermarket DVD player in the wreckage of Brown’s car. Brown was a big fan of Tesla and its Autopilot technology, uploading many dashcam videos to his YouTube page, including one showing the system avoiding a collision with a truck earlier this year.

Police said no video recording device, whether mounted to the dash or elsewhere, was found in the wreckage.

Tesla markets the Autopilot system as a driver’s aid, maintaining that drivers still need to be aware of their surroundings and ready to respond to danger while the system is activated. The mere presence of the technology, however, could lead to overconfidence in its abilities.

Speaking to Bloomberg, Jackie Gillan, president of Advocates for Highway and Auto Safety, criticized the practice of “beta” testing, in which consumers test and help improve new technology through real-world use.

“Allowing automakers to do their own testing, with no specific guidelines, means consumers are going to be the guinea pigs in this experiment,” said Gillan. “This is going to happen again and again and again.”

Joan Claybrook, automotive safety advocate and former NHTSA administrator, said the “trial-and-error method” is a threat to public safety.

“The history of the auto industry is they test and test and test,” she told Bloomberg. “This is a life-and-death issue.”

Expect the Florida crash to make other automakers extra cautious about perfecting their own autonomous driving technology (or semi-autonomous driving aids) before making it available in production cars. In March, NHTSA administrator Mark Rosekind gave the regulator a six-month timeline in which to develop federal rules for self-driving vehicles.

[Image: Tesla Motors]


NHTSA Investigating Tesla Model S Following Fatal ‘Autopilot’ Crash

Tesla Model S

A recent fatal crash of a 2015 Tesla Model S operating in “Autopilot” mode prompted the National Highway Traffic Safety Administration to open a preliminary investigation into the model, Reuters is reporting.

Because the crash occurred while the vehicle was under the control of an autonomous driving system, the NHTSA said it is planning “an examination of the design and performance of any driving aids in use at the time of the crash.”

A preliminary investigation is the first step the agency can take if it believes a vehicle is unsafe and may need to be recalled. The probe covers a total of 25,000 Tesla Model S vehicles.

Tesla responded to the news on its website with a post titled “A Tragic Loss”:

This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.

Following our standard practice, Tesla informed NHTSA about the incident immediately after it occurred. What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.

Tesla went on to explain that drivers are presented with a message explaining how to use Autopilot safely when they engage the feature:

When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times,” and that “you need to maintain control and responsibility for your vehicle” while using it.

CNN is reporting that the crash happened May 7 in Williston, Florida. Tesla’s stock sank in after-hours trading when news of the investigation broke.
