Fragility Robotics

An ExTwitter post from Agility Robotics includes a video of a bipedal robot moving boxes in a simulated factory setup at what looks like a trade show. The robot collapses on itself at the end, in what appears to be a failure of its legs.

So many things are happening in this ExTwitter post; let’s unpack.

It’s quite interesting for a robotics company to put up a social media post of its own product failures, especially when that failure happens in a product demo at a trade fair. Knowing the price per square foot at these trade shows, a setup with a conveyor belt and a box shelf is no small marketing budget. Failures in the lab are OK, and there’s a history of robodog video bloopers, but failures when you’re trying to convince a large crowd to buy your tech, maybe much less so.

So, marketing probably thought, Well, that’s a million-dollar fuckup, so let’s change strategy and use this to our advantage. We’ll make it a viral social media event. And while we’re at it, let’s make up our own metrics: “99% success rate over 20 hours”.

That robot was not going to get back to work without serious repairs, so forget having it move boxes for the rest of the day. Since we’re talking about human-looking robots replacing humans doing machine jobs, we might as well expect robothings to do the work 24/7 as the metric of success, not 20/6 or 20/4, in that particular context. So let’s rewrite that as “82.5% success rate over 24 hours” if robothing gets repaired in a few hours, or “20.6% success rate over 4 days” if you forgot to bring a set of “quick change limbs” to the show.
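A quick back-of-the-envelope check of those figures (a sketch: it assumes the 99% holds for the 20 hours actually worked, counts all downtime as failure, and the function name is mine):

```python
def effective_success_rate(base_rate, hours_worked, hours_total):
    """Success rate over the full window, counting downtime as failure."""
    return base_rate * hours_worked / hours_total

# 99% over 20 hours, judged against a 24/7 expectation:
print(f"{effective_success_rate(0.99, 20, 24):.1%}")  # 82.5%

# ...and against 4 days (96 hours), if the robot never comes back:
print(f"{effective_success_rate(0.99, 20, 96):.1%}")  # 20.6%
```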

Lastly, I can’t stop looking at the crowd standing on the other side of the conveyor belt, witnessing the scene. The lack of response or interest in what just unfolded is palpable. No one seems surprised, amused or alarmed. Even what look like members of the sales team from [Fr-]Agility Robotics barely turned around to see what was happening behind their backs, then just ignored their flagship robotic product having a meltdown.

Staging robots in manufacturing settings is boring. Breaking a leg is no way to impress.

Crashing at scale

Waymo is voluntarily recalling the software that powers its robotaxi fleet after two vehicles crashed into the same towed pickup truck

Waymo recalls and updates robotaxi software after two cars crashed into the same towed truck

Let’s first look at this curious choice of the word “recall”, as it’s more generally used in the industry, to describe what is really a software reversion. It sounds like Waymo had to take its whole fleet off the street because a software update went wrong, the way Tesla had to recall all the cars it sold in the US because of a non-compliant emergency light design. Waymo didn’t do that. They just reverted the software update and uploaded a patched version. Calling it a recall is a bit of a misnomer, there to make them look compliant with safety practices that exist for regular consumer cars. But that framework is clearly not adapted to this new software-defined vehicle ownership model.

The second most interesting bit here, which seems overlooked by the journalist reporting the incident, is that a “minor” (according to Waymo) software failure created two consecutive, identical accidents between Waymo cars and the same pickup truck. Read that again. One unfortunate pickup truck was hit by two different Waymo cars within a couple of minutes because it looked weird. Imagine if that pickup truck had crossed the path of more vehicles running that particular faulty software update. How many crashes would that have generated?

The robotaxi’s vision model had not taken into account a certain type of pickup truck, so none of these robotaxis were able to behave correctly around it, resulting in multiple crashes. Which raises the question: should a fleet of hundreds or even thousands of robotaxis run on the same software version (with potentially the same bugs)? If you happen to drive a vehicle, or wear a piece of garment, that makes a robotaxi behave dangerously, every robotaxi is suddenly out there to get you.

Driving isn’t an autonomous activity

Driverless cars are often called autonomous vehicles – but driving isn’t an autonomous activity. It’s a co-operative social activity, in which part of the job of whoever’s behind the wheel is to communicate with others on the road. Whether on foot, on my bike or in a car, I engage in a lot of hand gestures – mostly meaning ‘wait!’ or ‘go ahead!’ – when I’m out and about, and look for others’ signals. San Francisco Airport has signs telling people to make eye contact before they cross the street outside the terminals. There’s no one in a driverless car to make eye contact with, to see you wave or hear you shout or signal back. The cars do use their turn signals – but they don’t always turn when they signal.

“In the Shadow of Silicon Valley” by Rebecca Solnit

/via @clive@saturation.social

Absolutely Robody Cares

Rothing speaks bullshit more than an android nobot in a wheelchair, teleoperated by a white dude wearing a HoloLens, as if he were in some kind of alternate hospitality world we’d all want to live in.

Making it a square video does not make it cooler either.

“Remote caregivers”, I suppose that’s what they call “robodies”, providing “companionship”, “genuine human connection” and “the warmth of human interaction” is now a “comforting reality”, says the soothing female narrator voice trying to hit every keyword in the marketing 101 playbook.

“Nurturing human connection, enriching lives and redefining the essence of care”, more like nurturing the techno dream of total surveillance, enriching stock value for shareholders and pouring gasoline over healthcare.

Robotaxis are on fire

San Franciscans celebrate the Chinese New Year by setting a Waymo robotaxi on fire.

More than meets the vision sensor

Waymo, the robotaxi company from Alphabet/Google, broke Asimov’s first law.

Way more interesting is to read how the robocompany describes the incident:

“The cyclist was occluded by the truck and quickly followed behind it, crossing into the Waymo vehicle’s path. When they became unoccluded, our vehicle applied heavy braking but was not able to avoid the collision,” Waymo said.

https://boingboing.net/2024/02/07/waymo-autonomous-car-hit-bicyclist.html

Let me emphasize that: “the cyclist crossed into the Waymo vehicle’s path”. That’s such an engineering thing to say. It’s your two-ton metal box on wheels whose computer vision model does not account for a small moving vehicle hidden by a larger one. Your software calculates a trajectory to pass behind that truck. Oops, there was a cyclist there. But it’s the cyclist who crossed your path? How convenient.

Robots put you to sleep… forever

Product shot of Philips DreamStation device.

Philips DreamStation, a robot to help you breathe at night, turned out to be a killing machine.

Since April 2021, the FDA has received more than 116,000 MDRs [Medical Device Reports], including 561 reports of death, reportedly associated with the PE-PUR foam breakdown or suspected foam breakdown.

Problems Reported with Recalled Philips Ventilators, BiPAP Machines, and CPAP Machines

Manufacturers […] are required to submit medical device reports (MDRs) when they become aware of an event that reasonably suggests that one of their devices may have caused or contributed to a death or serious injury, or has malfunctioned and that device or a similar device marketed by the manufacturer would be likely to cause or contribute to a death or serious injury if the malfunction were to recur.

ibid.

Philips recalled the machines and stopped selling them in the US. Where else are they still on sale?

/ht @boingboing