Cover Story

AI-Powered Ergonomics
Motion capture technology revolutionizing safety in manufacturing
By Benita Mehta, Chief Editor
The manufacturing industry has long struggled with musculoskeletal disorders (MSDs), often relying on manual observation to mitigate risk, and these injuries remain one of its biggest challenges. As safety professionals look for better ways to reduce risk, they are increasingly turning to motion capture combined with AI. The technology can identify movement patterns and high-risk body mechanics so companies can address problems before injuries occur.
Kristianne Egbert, a certified professional ergonomist at Briotix Health and an expert in motion capture technology, spoke to ISHN about how motion capture AI is being utilized in the manufacturing space to improve ergonomics and overall safety.
The shift from traditional "stopwatch and clipboard" assessments to high-tech motion capture and AI-driven analysis has begun. Egbert, a 25-year veteran in the field, explains that while the science of ergonomics remains constant, the tools used to apply it are undergoing a massive evolution.
Motion capture technology — via computer vision and wearable sensors — addresses the inherent blind spots of human observation. By removing human error in estimating joint angles and counting repetitions, AI provides more accurate, granular data. However, Egbert emphasizes that technology doesn't replace the specialist; rather, it shifts the ergonomist’s role from data collection to high-level problem-solving and coaching.
“For me as an ergonomist, it lets me spend less time measuring and counting and more time solving. It allows me to focus on the part that I really love... translating data into actionable change,” she said.
Human observers can only capture moments in time and are prone to estimation errors, emphasizes Egbert. Motion capture provides a continuous, data-rich view of a worker's entire shift. "The accuracy of motion capture is likely to be higher than that of a human... because it is actually measuring those joint angles. It really can remove the blind spots that we find in traditional assessments,” she said.
By aggregating motion data with historical claims and injury reports, companies can now predict risks at the individual and departmental levels. Beyond just identifying risks (e.g., "this person is bending too much"), new AI tools are beginning to suggest specific engineering controls and vendor links to fix the problem.
"AI is starting to identify patterns for us... we are starting to be able to correlate those with higher injury rates or higher reports of discomfort,” Egbert said. “As more data is loaded, that learning will become more and more valuable."
ISHN: Are there any blind spots in traditional assessments that motion capture addresses?
Egbert: I think (this question is) very thoughtful, because the question I usually hear asks about the blind spots of motion capture, particularly with video capture: What if it can't see it? What if somebody walks in front of it? Ultimately, every tool is going to have some sort of blind spot associated with it, and humans aren't perfect either. There's a lot that isn't captured, or that a person can't see, because they're looking at a moment in time, or a segment of the job task they're analyzing. They can't follow somebody and capture every movement across their entire workday. Every assessment, whether it's observational, checklist-based or technology-driven, is going to have its strengths and weaknesses. The goal is understanding what each method sees well and what it doesn't, and filling in those gaps appropriately, maybe not just relying on one or the other.
ISHN: When we talk about AI in this context, is AI itself identifying the risks based on preset ergonomic standards, or is it identifying new risk patterns based on what it is observing and assessing?
Egbert: I think when people hear AI, they think, oh, this is something new that's being invented, something new and fancy, right? Ultimately, I don't think that's actually the case. In reality, the AI tools, at least the ones that I'm most familiar with in terms of ergonomics, are really grounded in the existing science that's out there. We know the risk thresholds. We're simply using AI, whether computer vision or motion capture sensors, to help identify those risk factors and to say, here's the risk threshold that you hit. Ultimately, it's still rooted in science that's been around for a really long time.
The next part of that is that the world of AI is rapidly changing, like literally day to day. And that's when we move into things like predictive analytics, the movement patterns that somebody is working in that the AI is starting to identify for us. We are starting to be able to correlate those with higher injury rates or higher reports of discomfort. And as more and more of that data is being loaded into these AI systems, that learning will become more and more valuable.
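To make that concrete, here is a minimal sketch (in Python) of the kind of threshold check Egbert describes: per-frame joint angles from a sensor or computer-vision model compared against a preset ergonomic limit. The 45-degree trunk-flexion cutoff, the function name and the sample angles are illustrative assumptions, not values or code from any specific product.

```python
# Illustrative sketch: flagging frames where a measured joint angle exceeds
# a preset ergonomic threshold. The 45-degree trunk-flexion cutoff and the
# sample data are hypothetical, not taken from any specific tool.

TRUNK_FLEXION_LIMIT_DEG = 45.0  # assumed risk threshold, for illustration only

def flag_high_risk_frames(trunk_flexion_deg):
    """Return the indices of frames whose trunk flexion exceeds the limit."""
    return [i for i, angle in enumerate(trunk_flexion_deg)
            if angle > TRUNK_FLEXION_LIMIT_DEG]

# Hypothetical per-frame angles (degrees) from a wearable sensor or vision model
angles = [12.0, 30.5, 47.2, 55.0, 41.8, 62.3, 18.4]

risky = flag_high_risk_frames(angles)
print(f"{len(risky)} of {len(angles)} frames exceed the threshold: {risky}")
```

The point of the sketch is simply that the "intelligence" sits in applying long-established risk thresholds to continuously measured angles, rather than in inventing new ergonomic science.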
ISHN: Does the presence of motion capture technology change how employees move and interact?
Egbert: Yeah, it definitely alters their behavior; it usually does change the way people move and how they interact. But is that because of the presence of motion capture technology specifically? I can tell you that if an ergonomist or, even worse, a safety team comes to observe a job task and they're standing there with their clipboards and tape measures, maybe a force gauge and a camera, do you think people are going to change their behavior? They will. So whatever is capturing a person's movements is likely to alter the way they behave.
I don't think this is exclusive to AI technology. I don't know that anybody likes Big Brother watching, regardless of who Big Brother is. Is that difference measurable or not? I don't know.
ISHN: We often talk about leading indicators when it comes to safety. Does this technology allow us to predict an injury to specific individuals or is it mainly used to identify broader risks associated with certain tasks?
Egbert: Well, it's a little bit of both, realistically. In the case of the individual, we can use the video playback to show that worker how they're actually working. That's basically predicting their areas of risk. And it'll give us data on it too, to say you were in a high-risk position for this body part for X percent of the cycle that was uploaded. Obviously those change for each person: if somebody is really tall, maybe they're not having to reach up as high, so their risks are going to be different than somebody who is a little bit shorter. So yes, it can help on that individual level. But as the AI gets better and better in the broader sense, we can aggregate that data across more and more workers. Then we can also start to match it with claims data and injury prevention data so that we can really start to understand and deliver risk prediction at a higher level, for different locations, job roles or body parts, and still be able to tie it back to the specific tasks that are happening, which is cool.
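As a rough illustration of the "X percent of this cycle" summary and the role-level aggregation Egbert mentions, the sketch below computes, for made-up data, the share of a work cycle spent above an assumed joint-angle limit and averages it by job role. Every name, threshold and number here is a hypothetical placeholder, not output from any real system.

```python
# Illustrative sketch: turning per-frame risk flags into a per-cycle summary
# ("high-risk position for X percent of this cycle") and aggregating by job
# role. All workers, roles, thresholds and angles are hypothetical.

from collections import defaultdict

def percent_of_cycle_at_risk(angles_deg, limit_deg=45.0):
    """Share of frames in one work cycle spent above an assumed joint-angle limit."""
    if not angles_deg:
        return 0.0
    over = sum(1 for a in angles_deg if a > limit_deg)
    return 100.0 * over / len(angles_deg)

# Hypothetical cycles keyed by (worker, job role)
cycles = {
    ("worker_a", "palletizing"): [38.0, 52.0, 61.0, 47.0, 20.0],
    ("worker_b", "palletizing"): [15.0, 22.0, 49.0, 18.0, 12.0],
    ("worker_c", "assembly"):    [10.0, 14.0, 19.0, 23.0, 11.0],
}

by_role = defaultdict(list)
for (worker, role), angles in cycles.items():
    pct = percent_of_cycle_at_risk(angles)
    print(f"{worker} ({role}): {pct:.0f}% of cycle in a high-risk posture")
    by_role[role].append(pct)

# Department-level view: average exposure per job role, which in practice
# might later be compared against claims or discomfort reports.
for role, pcts in by_role.items():
    print(f"{role}: average {sum(pcts) / len(pcts):.0f}% of cycle at risk")
```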
ISHN: With AI doing the heavy lifting on analysis and identification, how does the role of ergonomics specialists change or evolve?
Egbert: I like the word evolve. I think that's a good way to think of it. It's probably one of the most interesting shifts that has happened in the field of ergonomics in a really long time. For me as an ergonomist, it basically lets me spend less time measuring and counting and more time solving. I always love to compare it: I'm going to do this risk assessment manually, and then I'm going to run it through the AI and see how it does, and pat myself on the back when it comes up exactly the same. But ultimately, it's so much faster, so it allows me to focus on the part that I really love.
I love coming up with solutions for things, and this gives me the time to really dive into those solutions and focus my efforts on understanding what's going to work and what's not, because it's a lot easier for me to quantify what those risks are. We still have to interpret it. You still have to go in and check it, because sometimes it glitches, somebody walks in front of the camera, or one of the sensors loses its charge or gets disconnected. That still requires expertise, but it allows us to focus less on actually gathering the data and more on translating it into actionable change.

