PopSci Q&A: A Robot Masters the Art of Chicken Deboning
Using 3-D imaging and autonomous arms to take apart a bird cleaner and faster
Deboning a chicken, duck or other bird can be an arduous and unpleasant task — even Julia Child said it could take way too long “because of fright.” Yet with patience and the right knife, any human can do it. But a robot?
Robots may be able to assist with surgery, but chicken butchering is, in some ways, more of a challenge, and one that engineers at Georgia Tech are hoping to solve.
Their new Intelligent Cutting and Deboning System uses 3-D cameras and force-feedback algorithms to determine the proper cut.
Poultry is one of the biggest industries in Georgia — if the Peach State were a country, it would be the sixth-largest poultry producer in the world, according to Gary McMurray, chief of Georgia Tech Research Institute’s Food Processing Technology Division. So a lot of chickens come from Georgia, and a lot of Georgia farms could benefit from automation.
McMurray and his colleagues are designing an autonomous chicken butcher bot using a pair of robotic arms. One arm moves with two degrees of freedom, for making simple cuts across a plane. The other arm grabs the bird and can move with six degrees of freedom, aligning the bird in any desirable position relative to the knife.
The algorithms that control this placement are highly complex, and the mathematical models used to develop them could be used in other robot meat-slicing situations — either butchering other types of livestock, or for surgical procedures. PopSci talked to McMurray about building a robot butcher and the difficulties of deboning a bird every two seconds.
Below is an edited transcript of our conversation.
PopSci: Why is it so difficult to automate the process of boning a chicken?
Gary McMurray: In the automotive industry, in electronics, in food — anybody who does manufacturing has been in pursuit of the “lot size of one,” where everything you make is unique and customized. Ford, for instance, would love to be able to set up one plant, and that plant just takes orders. In one minute they can make the Explorer, and in the next minute they can make the Fiesta — every car is unique and different. That’s a really hard problem to do, but that’s what we’re trying to do with this intelligent cutting system. Every bird is unique and different, not only its anatomy, but in how it’s placed on the cone.
You have to take every bird and look at it, and make decisions. You can’t see inside the bird, to know its anatomical structure, so you have to adapt. That’s what the human does — the human just does it in an incredible way.
Think about when you carve a turkey for the holidays. You know generally where to put the knife in. You might sometimes feel with your hand, but you put the blade in and you start cutting. You feel the backbone, and you are able to just guide the blade. This is really difficult on a poultry floor, because they’re trying to do so many. But it’s something that a human does fairly well.
PS: How does this machine decide where to cut?
GM: We have two stereo cameras to identify the external points, and a six-degree-of-freedom force/torque cell to which the blade is attached, so we are sensing the forces on the blade. The robot gives position and velocity feedback about how fast we are moving.
We use three external points on the bird, and we’re able to predict the internal structure, to guess where some of the key bones are. That works to about plus or minus 3 mm, so it’s not perfect, but it’s pretty good. We use image processing to be able to identify those points, and create a nominal cutting trajectory of the blade that’s unique to every bird. The location of the shoulder joint, which is one of the big things we’re trying to cut, can vary plus or minus a quarter of an inch.
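The landmark-based prediction McMurray describes — estimating where an internal bone sits from a few external points — can be illustrated with a toy sketch. Everything here is hypothetical: the simulated anatomy, the affine model, and the noise level are invented for illustration, not GTRI’s actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical generator: pretend the internal shoulder-joint position is
# an unknown affine function of three external landmarks (9 coordinates),
# plus per-bird anatomical noise on the order of a millimetre per axis.
true_W = rng.normal(size=(9, 3))
true_b = np.array([10.0, 5.0, -2.0])

def simulate_bird():
    ext = rng.normal(size=9)                                 # external points
    joint = ext @ true_W + true_b + rng.normal(scale=1.0, size=3)
    return ext, joint

# "Training" birds whose internal anatomy was measured directly.
train = [simulate_bird() for _ in range(200)]
X = np.array([e for e, _ in train])
Y = np.array([j for _, j in train])

# Affine least-squares fit: augment with a constant column for the bias.
A = np.hstack([X, np.ones((len(X), 1))])
coef, *_ = np.linalg.lstsq(A, Y, rcond=None)

def predict_joint(external_points):
    """Estimate the internal joint location from external landmarks."""
    return np.append(external_points, 1.0) @ coef

# On fresh birds, the prediction error is limited by the per-bird
# anatomical noise -- a few millimetres, much like the +/- 3 mm quoted.
errs = [np.linalg.norm(predict_joint(e) - j)
        for e, j in (simulate_bird() for _ in range(100))]
mean_err = float(np.mean(errs))
print(round(mean_err, 2))
```

The point of the sketch is only that a small number of external measurements can pin down an internal landmark to within the bird-to-bird variation, which is why the real system still needs force feedback to close the last few millimetres.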
We have high-speed force-control algorithms, so when you make first contact with the bone, it can guide the blade around the bone without cutting into it. We want to be able to detect whether we are cutting meat, tendon or bones. If you are cutting meat or tendon, that’s good. As soon as you hit bone, that’s bad. You need to make a decision: Do I pull out, do I go around the bone, or something else? You want to make sure you do not generate a bone fragment or a bone chip.
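The meat/tendon/bone decision McMurray outlines can be caricatured as a threshold rule on the sensed cutting force. The thresholds and function names below are invented for illustration; the actual system uses high-speed force-control algorithms running far faster than this sketch suggests.

```python
# Toy decision rule in the spirit of the blade's force-feedback loop:
# guess what the knife is touching from the force magnitude and its rate
# of change, then choose an action. All thresholds are hypothetical.

def classify_material(force_n, dforce_dt):
    """Crude material guess from cutting force (N) and its rate (N/s)."""
    if force_n < 2.0:
        return "meat"
    if force_n < 8.0 and dforce_dt < 50.0:
        return "tendon"
    return "bone"

def blade_action(force_n, dforce_dt):
    material = classify_material(force_n, dforce_dt)
    if material == "bone":
        # Stop feeding forward and steer around the bone rather than into
        # it, so the cut never generates a bone fragment or chip.
        return "deflect_around_bone"
    return "continue_cut"

print(blade_action(1.0, 5.0))     # soft resistance: keep cutting
print(blade_action(12.0, 120.0))  # sharp force spike: go around the bone
```

In practice the classification cannot be a static threshold — forces vary with blade speed, meat temperature, and bird size — which is why the interview stresses that cutting biomaterials is a poorly understood, highly nonlinear process.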
PS: Could you use X-ray to determine the exact bone locations, like the automated lamb boning room developed by Scott Technologies in New Zealand?
GM: They [the lamb processors] can afford to use these different sensing modalities because they take longer to collect the data. Some of these lamb facilities, they might be processing one lamb or one sheep every 30 seconds, some one a minute. In a poultry processing facility, where we’re doing the deboning task, they are doing one bird every two seconds. They don’t have time — the birds are just going by too fast.
PS: What’s the most difficult part of this problem?
GM: They have all been extremely challenging, because none of this really existed. The force control in one sense was one of the easier problems, because there was an existing body of literature we could build upon, so that helped. But with everything else, there was not really as much to build upon. Cutting, itself, is not a very well-understood process, especially when you are cutting bio-materials. They are so incredibly nonlinear, there’s a lot more complexity in that. So that’s something we had to expend a lot of energy modeling.
PS: There’s another robot that can cut through flesh — the Da Vinci surgical robot. Did you work with them at all on these force-control algorithms? Will this work impact robot surgery at all?
GM: Most of the robotic surgery stuff is tele-operated, so there’s a man in the loop. They’re not really as concerned about trying to automatically detect these things. They are more concerned about filtering out the tremors, providing the right visual image of what the person is doing. They’re not trying to automate surgery, but this work could really impact that.
I do see some possibility of expanding this out into surgery, but people are not ready for a fully automated robotic surgeon. What I’d like to do is talk about the first step. There may be some simple steps a surgeon doesn’t want to do. A doctor doesn’t open, and he certainly doesn’t close — that’s done by other people. So is there a task that maybe a robot can do? Or, vice versa, if you’re doing robotic surgery, would it help to collect this force-feedback data and provide some feedback to the doctor? “Hey, you just ran into bone,” or “the force is increasing, danger.” There may be some way to augment or help the doctor.
Chicken Boning Arm
PS: What do you want this system to do, ultimately?
GM: In the short term, I would like to be able to simply remove the worker doing these cuts, and put this system in. That would help the plants with an ergonomics issue, and help them increase their yield. In the long term, we have a program to design the poultry plant of the future. We would like to be able to radically change the way we do the process.
Now that we understand how important food safety is, and we want to minimize water use, etc., how would I do the process completely differently? We’re kicking around a lot of ideas. If I’ve got a smart deboning system, why do I eviscerate the bird, which can lead to a lot of the contamination issues? Maybe I should just debone the bird before I eviscerate it. That’s called hot deboning, where the meat is still warm. When you look at the food safety implications, the fact that you now have technology that could guarantee you are not going to cut too far and cut the intestines — maybe that’s a viable solution.
PS: Have you lost your appetite for chicken since embarking on this project?
GM: When I took this job I was kind of worried about that, because I ate a lot of poultry. But I learned that the industry does a really good job. It’s not this ugly or disgusting or horrible process. It is done very well, and I actually do have confidence that what I get is safe food. So I was really happy about that. If I had to give up chicken, what would I go to?