A Tesla Model 3 driver was checking on his dog in the back seat when his car, on Autopilot mode, smashed into two parked cars, according to a police report out of Connecticut.

The incident happened on the state's I-95 highway early in the morning of Saturday, December 7. According to a post by the Connecticut State Police department, one of the vehicles the Tesla struck was a police car pulled over to the side of the road, assisting with a separate vehicle accident.

Both of the patrol cars assisting with the other accident had their lights flashing, and a flare was set up, so they should have been easy for anybody to see. Unless, of course, you're relying on your car to drive itself while you're turned around checking on your dog in the back seat, which is exactly what happened.

The driver of the Model 3 was issued a misdemeanor summons for Reckless Driving and Reckless Endangerment, but luckily no person or animal was seriously injured in the accident.

The CSP issued a warning to remind motorists that although a number of vehicles have some automated capabilities, there are no vehicles currently on sale that are fully automated or self-driving.

Tesla has not commented on the incident, and the investigation has not revealed whether or not the Model 3's brakes were applied prior to the collision, by either the driver or the Autopilot system.
Origin: Tesla driver distracted by dog in car blames Autopilot after smashing into police
Tesla Autopilot users rate feature’s safety high, even after close calls, crashes
2018 Tesla Model 3 (Peter Bleakney photo)

The vast majority of Tesla Model 3 owners consider the car's Autopilot feature a real safety benefit, despite the fact the system has sometimes put them in danger.

The overwhelming consensus about Autopilot is that it made owners feel safer while driving, according to a survey of Model 3 owners conducted by Bloomberg. Of the 5,000 owners polled, over 90 per cent touted the safety benefits of the system.

The survey also found 13 per cent of owners say the Autopilot mode has put them into a dangerous situation before.

Perhaps most interestingly, the overlap between Model 3 owners who answered those two questions that way was considerable: most of the drivers who reported being put in a dangerous situation by the system also said it made them feel safer. We're not sure what is going through those people's minds, exactly.

One owner surveyed, for example, admitted their car didn't slow down enough to take a bend in the road and ended up hitting a truck; they rated Autopilot's overall safety four stars out of five. To be fair to the capability of Tesla's software, some 28 per cent of owners also say Autopilot has saved them from a dangerous situation.

It would be interesting to see what other drivers think of Tesla's Autopilot when they are driving down the same stretch of road as a Model 3. Do they feel more nervous that the Tesla might be driving itself? Or do they feel safer knowing that there is a second set of eyes on the road?

For more on electric vehicles, listen to Driving's EV podcast Plugged In. Plugged In is available on Apple Podcasts, Spotify, Stitcher, and Google Podcasts.
Origin: Tesla Autopilot users rate feature’s safety high, even after close calls, crashes
Tesla sued by family of Florida man killed in Autopilot crash
Tesla was sued for the second time in three months by the family of a car owner who was killed in a crash while using the driver-assistance system Autopilot.

Jeremy Banner, 50, died when the Model 3 sedan he was driving failed to brake or steer to avoid a semi trailer that ran a stop sign on a Florida highway in March, according to the lawsuit, which also names the driver of the semi as a defendant. Banner had engaged the Autopilot system about 10 seconds before the collision.

Representatives for Tesla didn't immediately respond to a request for comment on the suit, which was filed Thursday.

The National Transportation Safety Board issued a preliminary report on the crash in May and said data from the vehicle showed Autopilot was active at the time of the incident. The preliminary data indicated that neither the driver nor the Autopilot system executed evasive maneuvers.

Banner is survived by his wife and three children. "We're not just talking about the consequences of this defect to the Banner family, which is horrific," Trey Lytal, a lawyer for the family, said during a press conference. "These products are defective."

Lytal compared Banner's accident to that of Joshua Brown, a Tesla Model S owner who died in a similar crash involving a tractor trailer in 2016. The family of Walter Huang, an Apple Inc. engineer who died in a Model X last year in Mountain View, California, sued the company in April.
Origin: Tesla sued by family of Florida man killed in Autopilot crash
YouTube moron films himself in Tesla on Autopilot with no one at the wheel
2018 Tesla Model 3 (Peter Bleakney photo)

In a truly stupid video recently posted to Instagram, YouTuber Alex Choi attempted to make himself look cool by sitting in the back seat of a semi-autonomous Tesla Model 3 with nobody behind the wheel.

Choi posted the eight-second clip to his Instagram Stories, clearly showing himself taking a video from the back seat of the car as it self-navigated heavy traffic, a friend in the front passenger seat and the driver's seat completely empty, just like the part of his brain where the common sense is supposed to be. Needless to say, if something were to go wrong with the system, or even if it were turned off suddenly – it deactivates when someone turns the wheel or touches the brakes – he would have been totally screwed, and likely would've hurt somebody else on the road.

"meanwhile, YouTuber and new #TeslaModel3 Performance owner, Alex Choi posted this video to his Instagram story last night. It's probably the most reckless thing that he's done, and that's saying something. pic.twitter.com/TK5zwgRohX" – Det Ansinn (@detansinn), June 4, 2019

This isn't the first time Choi has done something stupid that endangers other people's lives. A video was posted a while back with him making an extremely ill-timed merge in a Lamborghini that almost ended a motorcycle rider's life. Honestly, if you're going to do something for the 'gram, at least make it cool or interesting; this is just stupid, and doesn't help the reputation of the Autopilot system.

Once again, Tesla's Autopilot system is not some stupid toy to be played with. It's a driver assistance aid aimed at making driving more comfortable, so quit pretending it's your own personal chauffeur. It isn't. Tesla is also partly to blame for this kind of behavior, since the automaker still uses language like "full self-driving capability" to describe Autopilot's advantages, when that's something it really doesn't have.
Origin: YouTube moron films himself in Tesla on Autopilot with no one at the wheel
Autopilot was on when Tesla hit semi trailer in fatal crash: safety board
A Tesla Model 3 involved in a March 1 fatal crash in Florida was being driven by the vehicle's semi-autonomous Autopilot system and the driver's hands weren't on the steering wheel, according to the U.S. National Transportation Safety Board.

The driver was killed when the car slammed into the side of a semi-truck that was crossing a highway in Delray Beach, the NTSB said in a preliminary report released Thursday. The driver apparently wasn't steering in the eight seconds before the collision, according to the NTSB.

"Preliminary vehicle data show that the Tesla was traveling about 68 mph (109 km/h) when it struck the semitrailer," the report said. "Neither the preliminary data nor the videos indicate that the driver or the ADAS executed evasive maneuvers." ADAS refers to Tesla's advanced driver assistance system.

The NTSB used video from a nearby surveillance camera that captured the collision, as well as footage from the video devices Teslas use to help them steer and perform other functions.

"We are deeply saddened by this accident and our thoughts are with everyone affected by this tragedy," Tesla said in an emailed statement. The company informed the NTSB and the National Highway Traffic Safety Administration about the Autopilot activation after reviewing the car's computerized data log, the company said.

The crash is the latest Tesla accident under investigation by the NTSB and is strikingly similar to a 2016 case in which a Model S hit the side of a truck without braking. In that fatal collision, the NTSB found that the design of Tesla's Autopilot system was partially responsible for the crash and issued two recommendations to the company and other manufacturers to improve the safety of such partially autonomous driving tools. Among the NTSB's findings was that the car's sensors weren't designed to identify the side of the truck and, therefore, didn't slow the car. The NTSB preliminary report on the March 1 collision doesn't spell out what the car's sensors detected as the vehicle approached the truck.

The safety board is also looking at another fatal crash involving Autopilot in 2018 in California. In that case, a Model X struck a concrete highway barrier, killing the driver. NTSB investigators are also probing how the electric Tesla's batteries behave after accidents, following several post-crash fires.
Origin: Autopilot was on when Tesla hit semi trailer in fatal crash: safety board
Tesla sued over fatal crash blamed on Autopilot navigation error
2018 Model X sits on display outside a Tesla showroom. (David Zalubowski / AP Photo)

Tesla Inc. was sued by the family of a man who died as the result of a crash allegedly caused when the Autopilot navigation system of his 2017 Model X malfunctioned.

The family of Walter Huang, 38, said in a complaint filed April 26 in California state court that the vehicle, which was sold as a "state-of-the-art" automobile, lacked safety features, such as an automatic emergency braking system. Such features are available on much less expensive vehicles from other carmakers, as well as on more recent Model Xs, Huang's family said.

The family also alleges that Tesla knew, or should have known, "that the Tesla Model X was likely to cause injury to its occupants by leaving travel lanes and striking fixed objects when used in a reasonably foreseeable manner." The carmaker should have issued a recall or provided a warning "in light of the risk of harm," the family said in the complaint.

Huang died because "Tesla is beta testing its Autopilot software on live drivers," B. Mark Fong, a lawyer for the family, said in a statement. "The Huang family wants to help prevent this tragedy from happening to other drivers using Tesla vehicles or any semi-autonomous vehicles."

Huang crashed on the morning of March 23, 2018, while driving on U.S. Highway 101 in Mountain View, California, when the Tesla Autopilot allegedly turned the vehicle left, straight into the concrete median. He's survived by his wife, two children, and his parents.

The State of California Department of Transportation is also named as a defendant for failing to repair or restore a crash attenuator that had been damaged in a collision a week before Huang's crash.

The case is Huang v. Tesla Inc., 19CV346663, California Superior Court, Santa Clara County.
Origin: Tesla sued over fatal crash blamed on autopilot navigation error