I can’t tell if we’re on the verge of some new kind of circus or if this video unintentionally demonstrates that the power to amaze us will always stand on the flesh side of things.
Quite a cool self-built prototype though. Definitely a lot of clown potential.
Let’s first look at this curious choice of the word “recall” to describe what is really a software reversion. In the industry, a recall usually means pulling physical products back: it sounds like Waymo had to take their whole fleet off the streets because some software update went wrong, the way Tesla had to recall all their cars sold in the US because of a non-compliant emergency light design. Waymo didn’t do that. They just reverted the software update and pushed a patched version. Calling it a recall is a bit of a misnomer, there to make them look compliant with the safety practices that exist for regular consumer cars. But that framework is clearly not adapted to this new software-defined vehicle ownership model.
The second most interesting bit here, which seems to have been overlooked by the journalist reporting this incident, is a “minor” (according to Waymo) software failure that caused two consecutive, identical accidents between Waymo cars and the same pickup truck. Read that again. One unfortunate pickup truck was hit by two different Waymo cars within a couple of minutes because it looked weird. Imagine if that pickup truck had crossed the path of more vehicles running that particular faulty software update. How many crashes would that have generated?
The robotaxi’s vision model had not accounted for a certain configuration of pickup truck, so none of these robotaxis could behave correctly around it, resulting in multiple crashes. Which raises the question: should a fleet of hundreds or even thousands of robotaxis run on the same software version (with potentially the same bugs)? If you happen to drive a vehicle or wear a piece of garment that makes a robotaxi behave dangerously, every robotaxi out there is suddenly out to get you.
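Just to make that monoculture point concrete, here’s a toy sketch, entirely hypothetical and in no way Waymo’s actual deployment process, contrasting a fleet-wide big-bang update with a staged rollout where a small canary group meets the weird pickup truck before anyone else does:

```python
import random

# Entirely hypothetical illustration: a toy fleet where every vehicle runs
# whatever software version the operator pushes. All names and numbers are
# made up; this is not how any real robotaxi operator ships updates.

FLEET_SIZE = 1000
CANARY_FRACTION = 0.05       # update 5% of the fleet first
INCIDENT_THRESHOLD = 1       # any incident in the canary group halts the rollout


def deploy_everywhere(fleet, version):
    """Big-bang rollout: every vehicle gets the same (possibly faulty) version."""
    for vehicle in fleet:
        vehicle["version"] = version


def deploy_staged(fleet, version, incidents_observed):
    """Canary rollout: update a small slice, watch it, then continue or halt."""
    canary_count = int(len(fleet) * CANARY_FRACTION)
    canary, rest = fleet[:canary_count], fleet[canary_count:]

    for vehicle in canary:
        vehicle["version"] = version

    # If the canary group meets the weird-looking pickup truck first,
    # the remaining 95% of the fleet never receives the faulty version.
    if incidents_observed(canary) >= INCIDENT_THRESHOLD:
        return False  # halt the rollout (and presumably roll the canaries back)

    for vehicle in rest:
        vehicle["version"] = version
    return True


if __name__ == "__main__":
    def new_fleet():
        return [{"id": i, "version": "v1.0"} for i in range(FLEET_SIZE)]

    # Pretend 1 in 20 canary vehicles hits the edge case during observation.
    def incidents_observed(group):
        return sum(1 for _ in group if random.random() < 1 / 20)

    def exposed(fleet):
        return sum(1 for v in fleet if v["version"] == "v1.1-faulty")

    fleet_a, fleet_b = new_fleet(), new_fleet()
    deploy_everywhere(fleet_a, "v1.1-faulty")
    completed = deploy_staged(fleet_b, "v1.1-faulty", incidents_observed)

    print(f"big-bang rollout: {exposed(fleet_a)} vehicles on the faulty version")
    print(f"staged rollout ({'completed' if completed else 'halted'}): "
          f"{exposed(fleet_b)} vehicles on the faulty version")
```

The numbers are invented, but the point stands: when one update reaches every vehicle at once, one perception bug is on every street at once.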
San Franciscans celebrate Chinese New Year by setting a Waymo robotaxi on fire.
Waymo Vehicle surrounded and then graffiti’d, windows were broken, and firework lit on fire inside the vehicle which ultimately caught the entire vehicle on fire. #SFFD Photos by Séraphine Hossenlopp pic.twitter.com/aOTqL3Rk8V
Waymo, the robotaxi company from Alphabet/Google, broke Asimov’s First Law.
Way more interesting is how the robocompany describes the incident:
“The cyclist was occluded by the truck and quickly followed behind it, crossing into the Waymo vehicle’s path. When they became unoccluded, our vehicle applied heavy braking but was not able to avoid the collision,” Waymo said.
Let me emphasize that: “the cyclist crossed into the Waymo vehicle’s path”. That’s such an engineering thing to say. It’s your two-ton metal box on wheels that fails to account for a small moving vehicle hidden by a larger one in its computer vision model. Your software calculates a trajectory to pass behind that truck. Oops, there was a cyclist there. But it’s the cyclist who crossed your path? How convenient.
Philips DreamStation, a robot to help you breathe at night, turned out to be a killing machine.
Since April 2021, the FDA has received more than 116,000 MDRs [Medical Device Reports], including 561 reports of death, reportedly associated with the PE-PUR foam breakdown or suspected foam breakdown.
Manufacturers […] are required to submit medical device reports (MDRs) when they become aware of an event that reasonably suggests that one of their devices may have caused or contributed to a death or serious injury, or has malfunctioned and that device or a similar device marketed by the manufacturer would be likely to cause or contribute to a death or serious injury if the malfunction were to recur.
Not sure if it’s intentional, but “agile” and “safe” in the same sentence is sure to rank high on search-engine confusion, especially with a GitHub website to promote your paper. You’re going to get a ton of hits from webshits with 99 problems, but robotdog obstacle avoidance ain’t gonna be one.
Also, calling something “but safe” is, how do I put it clearly but nicely, shooting yourself in the bearing balls. You’re not going to make me think for one second that this noisy, cocaine-high articulated pet is inoffensive.
Looks like you know your classics, though. The “robotdog kicking bloopers” are always welcome. But you seemed a little too careful not to hurt the animal. A little too safe?
Driverless cars have been documented running red lights, blocking emergency responders and swerving into construction zones.
[…] When driverless cars break the rules of the road, there’s not much law enforcement can do. In California, traffic tickets can be written only if there is an actual driver in the car.
A meme doing the rounds on social media sent me down a rabbit hole of clickbait articles, all strangely converging on this ice-cream-truck-shaped android called Promobot.
Grey surveillance video showing a strange-looking android robot standing still on the side of a road. A Tesla passes by and the robot tips over. A person runs across the street to assist the robot.
The meme mainly emphasized how Promobot made the news in 2019, when its inventor claimed a self-driving Tesla ran over it and “killed” it.
A robot doing seppuku with the help of another robot has quite a bit of meme power, I agree.
It’s that same robot that is sooooo intelligent it “escaped from its lab” and blocked traffic for a few hours while its creators were busy taking photos of the incident.
A pretty good summary of the issues with robotaxis right now. The gap between young Silicon Valley entrepreneurs and tenured city officials is abysmal. The best part of the video is right at the beginning, when we see the journalist trapped in an expensive, fully automated metal box getting kicked by an angry citizen who isn’t allowed to park their car because of the “intelligence” of said metal box.
We have no standards on which to base whether these vehicles are actually as safe as humans, safer than humans, or not as safe as humans, except to trust that these companies are telling us the truth about their safety statistics.