Tool use has long been a hallmark of human intelligence, as well as a practical problem to solve for a wide range of robotic applications. But machines still struggle to exert just the right amount of force to control tools that aren't rigidly attached to their hands.
To manipulate these tools more robustly, researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), in collaboration with the Toyota Research Institute (TRI), have designed a system that can grasp tools and apply the appropriate amount of force for a given task, such as sucking up liquid or writing a word with a pen.
The system, called Series Elastic End Effectors, or SEED, uses soft bubble grippers (think of an airbag that inflates and deflates) with embedded cameras to measure how the grippers deform over a six-dimensional space and to apply force to a tool. With six degrees of freedom, a tool can be moved left and right, up and down, and forward and backward, and can roll, pitch, and yaw. The closed-loop controller, a self-regulating system that maintains a desired state without human intervention, uses SEED's visuotactile feedback to adjust the position of the robotic arm until the desired force is applied.
For example, the system can help a robot use tools when there is uncertainty about the height of a table, where a pre-programmed path could miss the table completely. "We have relied heavily on the work of Mason, Raibert, and Craig on what we call a hybrid force-position controller," says Hyung Ju Suh, a Ph.D. student in electrical engineering and computer science at MIT, CSAIL affiliate, and lead author of a new paper on SEED. "The idea is that, if you have three dimensions to move in when writing on a board, you want to be able to control the position along some of the axes while controlling the force along the other axis."
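The hybrid force-position idea Suh describes can be sketched in a few lines: each task axis is flagged as either position-controlled or force-controlled, and the two error signals are blended with a selection vector. This is a minimal illustration of the classic Mason/Raibert/Craig scheme, not SEED's actual controller; all names and gains are assumptions.

```python
# Minimal sketch of a hybrid force/position controller.
# A selection vector S chooses, per axis, whether to track a
# desired position (S = 1) or a desired force (S = 0).
# Gains and structure here are illustrative, not from the SEED paper.
import numpy as np

def hybrid_control(x, x_des, f, f_des, S, kp_pos=1.0, kp_force=0.01):
    """Return a commanded displacement for each task axis."""
    pos_cmd = kp_pos * (x_des - x)       # track desired position
    force_cmd = kp_force * (f_des - f)   # press harder/softer to track force
    return S * pos_cmd + (1 - S) * force_cmd

# Writing on a board: control x/y position on the board plane,
# and control force along z (pressing into the board).
S = np.array([1.0, 1.0, 0.0])
delta = hybrid_control(
    x=np.array([0.0, 0.0, 0.0]), x_des=np.array([0.1, 0.0, 0.0]),
    f=np.array([0.0, 0.0, 2.0]), f_des=np.array([0.0, 0.0, 5.0]),
    S=S)
# delta moves 0.1 along x (position axis) and presses slightly
# harder along z (force axis).
```

The selection vector makes the trade-off explicit: axes constrained by contact (the board surface) are regulated in force, while the free axes are regulated in position.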
Rigid-body robots can only take us so far; softness and compliance afford the ability to deform, to feel the interaction between the tool and the hand.
With SEED, at every step the robot captures a live 3D image from its grippers, tracking in real time how the grippers change shape around an object. These images are used to reconstruct the tool's position, and the robot uses a learned model to map the gripper deformation to the measured force. The model is calibrated in advance by pressing the grippers against a force-torque sensor to find out how stiff the bubbles are. Once the robot has sensed the force, it compares it to the force the user commands and might say to itself, "it turns out that the force I'm sensing right now isn't quite there. I need to press harder." It then moves in the direction that increases the force, all over 6D space.
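The sense-compare-correct loop above can be sketched as a small feedback step: a learned stiffness model (here simplified to a linear one) turns measured deformation into an estimated 6D wrench, and the arm is nudged toward the commanded wrench. The stiffness values, gain, and function names are illustrative assumptions, not SEED's implementation.

```python
# Hedged sketch of SEED's closed-loop force correction, assuming a
# simple linear stiffness model calibrated against a force-torque sensor.
import numpy as np

# Per-axis stiffness (N or N*m per unit deformation); illustrative values,
# in SEED this mapping is learned from calibration data.
STIFFNESS = np.full(6, 80.0)

def sensed_wrench(deformation):
    """Map measured 6D gripper deformation to an estimated 6D wrench."""
    return STIFFNESS * deformation

def correction_step(deformation, commanded_wrench, gain=0.005):
    """One closed-loop update: if the sensed force 'isn't quite there,'
    move the arm a little in the direction that increases it."""
    error = commanded_wrench - sensed_wrench(deformation)
    return gain * error  # small 6D motion toward the target wrench

# Example: the bubbles are barely deformed, but we command 4 N along z,
# so the correction pushes the arm further along z.
deform = np.array([0.0, 0.0, 0.01, 0.0, 0.0, 0.0])
cmd = np.array([0.0, 0.0, 4.0, 0.0, 0.0, 0.0])
step = correction_step(deform, cmd)
```

Running this repeatedly drives the sensed wrench toward the commanded one, which is the essence of the "press harder" behavior described above.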
During the "scraper task," SEED applied the right amount of force to scrape liquid off a plane, where baseline methods struggled to get the right sweep. When asked to put pen to paper, the robot legibly wrote "MIT," and it was also able to apply the right amount of torque to drive in a screw.
While SEED is aware of the force or torque it must regulate for a given task, a workpiece will inevitably slip if gripped too hard, so there is an upper limit on the stiffness it can exert. A rigid robot can also simulate systems softer than its natural mechanical stiffness, but not the other way around.
Currently, the system assumes a very specific geometry for its tools: a tool must be cylindrical, and there are still many limitations on how the system generalizes when it encounters new kinds of shapes. Future work may involve generalizing the framework to different forms so that it can handle arbitrary tools in the wild.
“No one will be surprised that compliance can help with tools, or that force sensing is a good idea; the question here is where on the robot compliance should go and how soft it should be,” says co-author Russ Tedrake, Toyota Professor of Electrical Engineering and Computer Science, Aeronautics and Astronautics, and Mechanical Engineering at MIT and principal investigator at CSAIL. “Here we explore regulating a quite soft six-degree-of-freedom stiffness directly at the hand/tool interface and show that there are some nice advantages to doing so.”
H.J. Terry Suh et al., SEED: Series Elastic End Effectors in 6D for Visuotactile Tool Use, arXiv:2111.01376v1 [cs.RO], arxiv.org/abs/2111.01376
Provided by the MIT Computer Science & Artificial Intelligence Lab
Citation: Soft robots that grip with the right amount of force (2022, September 23) Retrieved September 23, 2022, from https://techxplore.com/news/2022-09-soft-robots-amount.html
This document is subject to copyright. Except for any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for informational purposes only.