One station involved robots and had several light curtains to protect operators. The idea is that if someone crosses into a certain area, the robots stop moving so they can't hurt the operator. This was specified in the requirements document. So as a test, while the robot was moving we pushed a chair into the light curtain, and watched in horror as the robot kept moving and eventually finished its move operation.
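The failure above is what happens when the curtain is only checked once, at the start of a move. A minimal sketch (entirely hypothetical class and method names, not any real robot API) of how an interlock should behave: the curtain state is re-checked on every control step, so a breach mid-move halts the robot immediately instead of letting the move finish.

```python
# Hypothetical sketch of a light-curtain safety interlock gating robot motion.
# The key point: the curtain is polled every control cycle, not just once
# before the move begins.

class LightCurtain:
    def __init__(self):
        self.breached = False  # True when the beam is interrupted

class Robot:
    def __init__(self, curtain):
        self.curtain = curtain
        self.position = 0
        self.stopped = False

    def move_to(self, target):
        # Re-check the curtain on every step of the move.
        while self.position < target:
            if self.curtain.breached:
                self.stopped = True  # emergency stop: abandon the move
                return
            self.position += 1

curtain = LightCurtain()
robot = Robot(curtain)

robot.move_to(3)        # completes normally
curtain.breached = True # simulate a chair pushed into the curtain
robot.move_to(10)       # halts immediately, does not finish the move
print(robot.position)   # still 3: no motion after the breach
print(robot.stopped)    # True
```

A real safety circuit would of course be hardware-based (a safety relay cutting motor power), not software polling, but the logic error in the anecdote is the same one this sketch avoids.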
Was their first response "the requirements don't say anything about chairs"?
Guess that will be a problem when AI starts kicking in. "I'm programmed to stop for operators. The cleaner I killed wasn't an operator; their overalls were blue, not red."