Jailbreaking LLM-Controlled Robots

Surprising no one, it’s easy to trick an LLM-controlled robot into ignoring its safety instructions.

Tags: LLM, hacking, robotics, social engineering