The technology is pretty cool, but don't let the new capabilities of partially self-driving cars distract you from your responsibilities at the wheel.

SAN FRANCISCO – Tesla's innovative and controversial Autopilot software, which powers the partially self-driving features of its electric cars, is widely used for highway driving, according to initial findings from a volunteer MIT study.

The findings, shared at a conference in Cambridge, Massachusetts, on Wednesday, came a day after the latest crash of a Tesla using Autopilot, and as two consumer groups renewed their criticism that the software's name and marketing mislead drivers into dangerous behavior. The software, which launched in 2015, could eventually be "safer than humans," Tesla CEO Elon Musk has said, as the number of Teslas on the road increases and other automakers roll out their own partially autonomous vehicles.

The Massachusetts Institute of Technology, in an ongoing study of 34 Tesla owners who volunteered for the project, found that Autopilot was used during 36% of the miles driven by the test group's 22 cars (some of the owners are couples sharing a car).

"The fact that most drivers often use it is in line with what Musk said," Bryan Reimer, a research scientist at the MIT AgeLab and associate director of the New England University Transportation Center at MIT, told USA TODAY. Drivers were most likely to use it for highway driving, with the next-largest cluster of use at speeds between 25 and 45 mph, Reimer said Wednesday in a talk at the annual tech conference of the New England Motor Press Association.

More: 'Consumer Reports' reverses course, recommends Tesla Model 3 after Elon Musk improves the brakes

More: Your fancy new car steers and brakes for you; so why keep your hands on the wheel?

More: Tesla, Uber crashes put automatic braking in the spotlight. Here's what it won't do.

More: Warning, driver: Your car can drive itself, but you're still responsible

Tesla owners seem to come in a few flavors. Some admire their state-of-the-art tech sedans but never use Autopilot, while most are so taken with the semi-autonomous features that they regularly use them to take away the monotony of driving. (In extreme cases, they film themselves while the car drives itself.)

Reimer says that interviews with participants in other test groups reveal a "glaring gap" in the understanding of automation and safety technology. He says driver education needs to be improved by stakeholders such as automakers, dealers and perhaps even licensing authorities.

This confusion may have played a role in some of Tesla's recent crashes involving Autopilot.

The latest took place on Tuesday in Laguna Beach, California, where a Model S struck a parked but unoccupied police car.

In an earlier crash in Utah, the driver – who suffered only a broken ankle after hitting a fire truck at 60 mph – was looking at her phone and had been disengaged from driving for 80 seconds before impact. In March, the driver of a Model X died after his Autopilot-capable car drove into a highway divider in Mountain View, California.

A Tesla sedan with the semi-autonomous Autopilot feature crashed into the rear of a fire truck at 60 mph (97 km/h) on May 11, 2018, apparently without slowing before impact; police say it is unknown whether Autopilot was engaged. (Photo: South Jordan Police Department / AP)

In each of these cases, Tesla has advised consumers that the system is not intended to make the vehicle a self-driving car, and that it requires constant attention from the driver.

But Musk's enthusiastic predictions for Autopilot's capabilities, as well as its name, often drown out those admonitions, two consumer groups say.

Consumer Watchdog and the Center for Auto Safety held a press conference in Los Angeles on Wednesday urging state and federal regulators to make Tesla rename Autopilot and possibly require further testing of the software.

"People who rely on Tesla (autopilot) are killed, and that's what we want to stop," said John Simpson, Consumer Watchdog's privacy and technology project manager, who has been tracking Tesla for years.

Simpson said the two groups are urging officials at the California Department of Motor Vehicles to investigate how the electric carmaker's claims about the technology square with Autopilot's reality.

Jason Levine, executive director of the Center for Auto Safety, said the consumer groups are also urging the Federal Trade Commission to look into what they call Tesla's "dangerously misleading and deceptive marketing practices."

"Tesla captured the imagination of the buying public with a car pitched directly to consumers by a celebrity CEO," Levine said. "But the software and hardware (Autopilot uses radar and cameras to scan the road) need to be improved, or the name needs to change – these recent deaths should give policymakers pause."

One of the most difficult aspects of self-driving cars is the moment when a situation requires the driver to regain control, known as the "handoff."

Automakers from Tesla to Nissan to Cadillac have used various types of feedback to force the driver to re-engage with the vehicle while driver-assistance technology is in use.

In the MIT study of almost 20,000 Autopilot disengagements – moments when control was returned to the driver – a small 0.5% were initiated by the car, with humans taking over for reasons ranging from planned maneuvers to complex road scenarios.

This result suggests that this particular test group did not abuse the system, which would otherwise trigger escalating warnings to regain control of the vehicle.

The public is intrigued by the coming age of self-driving cars, a future heralded by some of the biggest names in technology. But those efforts suffered a blow in March when an autonomous Uber car killed a pedestrian in Arizona, prompting the company to suspend its testing nationwide.

The extent of the confusion is confirmed by responses to an MIT AgeLab survey question that asked: "To your awareness, are self-driving vehicles available for purchase today?"

Almost 23% said yes. But no self-driving vehicles are for sale today; only cars with semi-autonomous features such as Autopilot are.

"As I mentioned," says Reimer, "there is a lot of confusion."

Follow USA TODAY tech writer Marco della Cava on Twitter.
