"[Self-driving cars] don't drive like people, they drive like robots," Mike Ramsey, an analyst who specializes in advanced automotive technology at Gartner Inc, told Bloomberg. "They're odd and that's why they get hit."
Per Bloomberg, accidents happened mostly at intersections when self-driving vehicles were waiting to make a right turn into oncoming traffic. The outlet noted that Waymo's now-retired "Firefly" car was rear-ended on two separate occasions when it stopped to yield before making a right turn.
Another incident saw a robotic vehicle get clipped by a truck trying to get past its grandma-like driving, otherwise known as going the speed limit.
"You put a car on the road which may be driving by the letter of the law, but compared to the surrounding road users, it's acting very conservatively," explained Karl Iagnemma, CEO of NuTonomy, a self-driving software developer. "This can lead to situations where the autonomous car is a bit of a fish out of water."
"They were cutting the corners really close, closer than humans would," Missy Cummings, a robotics professor at Duke University, said. "We typically take wider turns."
Working alongside the Virginia Tech Transportation Institute, the Ford Motor Company went so far as to conduct an experiment to see how autonomous cars could communicate with humans via light signals.
"Humans violate the rules in a safe and principled way, and the reality is that autonomous vehicles in the future may have to do the same thing if they don't want to be the source of bottlenecks," Iagnemma added.
Though California is currently the only state that requires collision reports when an autonomous car is struck, it's a safe bet that at least some humans need to dust off their old driving-school manuals.