Many manufacturers aren’t doing enough to measure human activities alongside automated processes, found a study by A.T. Kearney and Drishti.
Manufacturing is widely considered a mature area for automation, but most analytics don’t apply to human activities in production, a recent survey found. This lack of insight impedes capacity planning, workforce management, and process engineering.
Humans still perform 72% of manufacturing tasks, according to a survey of more than 100 leading manufacturers. Management consultancy A.T. Kearney and Drishti Technologies Inc. conducted the survey and announced the results today.
Industry 4.0 ‘blind spot’
“Despite the prominence of people on the factory floor, digital transformation strategies for even the most well-known, progressive manufacturers in the world remain largely focused on machines,” said Michael Hu, partner at A.T. Kearney. “This massive imbalance in the analytics footprint leaves manufacturers around the globe with a human-shaped blind spot, which prevents them from realizing the full potential of Industry 4.0.”
“If you walk along automotive assembly lines, you can see the value of robots,” said Prasad Akella, founder and CEO of Drishti. “From the mid-1980s, we saw robots move through the body shop and take over the paint shop, both of which were extremely hazardous and presented well-defined problems.”
“But when you get to final assembly, it’s still human activities,” he told Robotics Business Review. “Robotics is not yet at a point where it can completely take over.”
“Having been in this space for 35 years, I’ve seen how the simple act of manipulation, which we take for granted, is a nontrivial problem,” Akella said. “The only place where I’ve seen a lights-out plant work is a FANUC plant at the base of Mt. Fuji.”
“I’ve had conversations with executives at many manufacturers,” he said. “Even if you read all the Industry 4.0 materials coming out of the German federal government, plus marketing materials and VC portfolios, you’ll see that all of this effort is for keeping track of machines.”
“There are more than 1.5 million robots in the world, according to the International Federation of Robotics,” noted Akella. “Suppliers are producing 250,000 to 500,000 per year, but contrast that with the fact that [according to Goldman Sachs Research] there are about 340 million humans on the line.”
“We’ve got some decades before robots take over,” he added. “To be pragmatic, we must measure human activities and understand and optimize how they can work with robots.”
Measuring human activities
Since the time of Henry Ford a century ago, time-and-motion studies have been the standard method for measuring human activities. Survey respondents said that skilled engineers spend about 37% of their time manually gathering analytics data.
“The principles underlying these 100-year-old measurement techniques are still valid, but they are too manual to scale, return incomplete data sets, and are subject to observation biases,” said Akella. “Manufacturers need larger and more complete data sets from human activities to help empower operators to contribute value to their fullest potential. This data will benefit everyone in the assembly ecosystem: plant managers, supervisors, engineers, and, most importantly, the operators themselves.”
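As a rough illustration of the scale argument (a hypothetical sketch, not Drishti’s actual method), the difference between a sampled time-and-motion study and continuous measurement is that the latter computes the same statistics over every cycle, so variability and outliers that a spot check would miss become visible:

```python
from statistics import mean, stdev

def cycle_time_report(cycle_times_s, threshold_sigma=1.5):
    """Summarize observed cycle times (in seconds) and flag outlier cycles.

    A manual study samples a handful of cycles; a continuously captured
    stream lets the same summary cover every cycle on the line.
    """
    mu = mean(cycle_times_s)
    sigma = stdev(cycle_times_s)
    outliers = [t for t in cycle_times_s if abs(t - mu) > threshold_sigma * sigma]
    return {"mean_s": mu, "stdev_s": sigma, "outliers": outliers}

# Toy data: one slow cycle stands out against a stable baseline.
report = cycle_time_report([41.2, 40.8, 41.5, 40.9, 55.3, 41.1])
print(report["outliers"])  # → [55.3]
```

The function names and threshold here are illustrative only; the point is that once cycle data is captured automatically, this kind of analysis is a few lines of code rather than weeks of clipboard observation.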
In addition, A.T. Kearney and Drishti found that “73% of variability on the factory floor stems from humans, and 68% of defects are caused by human activities.” The survey also found that 39% of engineering time is spent investigating the root causes of defects.
“If your intent is to produce cars, phones, or TVs of high quality, you have to approach it differently,” Akella said. “Companies have huge data architectures, but executives get a ‘deer in the headlights’ look when you mention human beings.”
Robots plus people
“With collaborative robots, you want the capabilities of both human beings and robots,” said Akella. “You want the dexterity and intelligence of humans combined with the computer vision and repetitive abilities of robots. The efficiency of a combined system is much higher than either independently.”
“For example, at General Motors, it would take six months to reprogram a line for a new product,” he recalled. “But you can retrain people in weeks and change a smartphone every year. The value of optimizing human activities and machines in a holistic sense is clear.”
“Today, the only way to get information is from the plant manager or a machine looking at one point,” Akella said. Seventy-one percent of survey respondents said that manual time-and-motion studies were important sources of data about human activities.
“With kaizen, or continuous improvement, we take gobs of data and drive decisions from data that reflects all the variability of humans on the line,” he said. “An inexpensive camera sitting above the line can extract assembly actions from video, as well as time-and-motion data.”
“From that data set, we can extract percentage points of process improvements, so people keep jobs and companies perform better,” Akella said.
Drishti applies AI to manufacturing
Drishti uses “action recognition” and AI to include human activities in the digital transformation of factories. The company’s headquarters are in Palo Alto, Calif., and its engineering office is in Bangalore, India. Drishti says its mission “is to extend human capabilities in an increasingly automated world.”
“Using our deep-learning engine that we’ve built ourselves, a manufacturer can derive an understanding of what operators are doing,” explained Akella. “It becomes the core of the data set that we’re building out.”
“As with a self-driving car or Google Photos, the deep-learning engine can identify a stop sign, a human, or other vehicles at an intersection by looking at a single frame,” he said. “The much harder problem is to do this over time, in video. This enables us to be pioneers.”
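The distinction Akella draws can be made concrete with a minimal sketch (hypothetical, not Drishti’s engine): single-frame classification yields one label per frame, but the temporal problem is grouping those labels into coherent, time-stamped actions while discarding brief misclassification flicker:

```python
from itertools import groupby

def segment_actions(frame_labels, fps=30, min_duration_s=0.5):
    """Turn per-frame action labels into time-stamped action segments.

    Runs of identical labels become segments; runs shorter than
    min_duration_s are treated as flicker and dropped.
    """
    segments = []
    frame = 0
    for label, run in groupby(frame_labels):
        n = len(list(run))
        if n / fps >= min_duration_s:
            segments.append((label, frame / fps, (frame + n) / fps))
        frame += n
    return segments

# Toy 30 fps stream: "pick" for 1 s, a 2-frame flicker, "screw" for 1 s.
labels = ["pick"] * 30 + ["idle"] * 2 + ["screw"] * 30
print(segment_actions(labels))
```

Real action-recognition systems are far more sophisticated, but even this toy version shows why video is harder than photos: the output that matters is the segment boundaries over time, not any individual frame’s label.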
Akella described three “dimensions” of Drishti’s offering. “The first is that there are opportunities to drive greater efficiency at big organizations,” he said.
“The second is quality improvements, such as verifying that all screws are fastened, to prevent rework,” Akella said.
“The third dimension is traceability,” he said. “For instance, Samsung didn’t know at first why Galaxy Notes were blowing up. It had to examine process data from two plants and separate vendors.”
Drishti raised $10 million this past summer.
“We’re in expansion mode, and we’re growing rapidly,” said Akella. “We’re looking to connect with potential employees who believe in the centrality of human beings, which is something very different.”
“We’ll name customers next year,” he said. “We’re working with some of the biggest players in the automotive, electronics manufacturing, and medical device spaces.”