Removing Barriers to AI Adoption in Manufacturing


Few industries stand to benefit from AI more than manufacturing. The sector generates enormous amounts of data, involves repetitive manual tasks, and presents multi-dimensional problems beyond the scope of many conventional tools. Whether the goal is improving quality, reducing downtime, or optimizing efficiency, AI is well suited to many complex manufacturing problems. In a recent Deloitte survey on AI adoption in manufacturing, 93% of companies said they believe AI will be a pivotal technology for driving growth and innovation in the sector. Yet most manufacturing companies must first overcome significant barriers that impede digital transformation and AI initiatives:
 

  • Lack of talent: Organizations across all industries are discovering that experienced AI professionals are difficult to hire. Experienced data scientists tend to concentrate at a handful of Fortune 500 companies, and competing for them is beyond most businesses’ reach. AI projects require an interdisciplinary team of data scientists, ML engineers, software architects, BI analysts, and subject-matter experts (SMEs). Given the variety of AI projects and the substantial amount of data manipulation required, building and retaining this type of team is challenging. The problem is compounded in manufacturing, which is rarely seen as a glamorous destination for data science talent. Moreover, the industry is likely to face a severe workforce shortage due to the looming retirement of its SMEs. AI Automation and AutoML 2.0 are key technologies that can address this skill gap and accelerate digital transformation in manufacturing.

  • Data quality: Data quality and data management issues are critical because AI and machine learning tools rely on clean, meaningful data to train their underlying algorithms. Access to such data is essential for the success of AI initiatives, yet manufacturing data can be biased, outdated, and full of errors. Heavy-manufacturing production floors in particular are characterized by harsh operating conditions: fluctuating temperatures, noise, and vibration can corrupt sensor readings. The site may be in a remote location, adding complexity around data storage, and security policies may not allow data to be shared with the cloud, requiring on-premises solutions. Operational data is spread across multiple databases in multiple formats, is not suitable for analytics as-is, and requires preprocessing. For example, a predictive maintenance application needs access to the computerized maintenance management system (CMMS) or process historians, and may require connectors or custom scripts to retrieve and manipulate the data. The solution lies in leveraging automation for AI-focused data preparation (a minimal data-preparation sketch follows this list).

  • Technology infrastructure & interoperability: Manufacturing sites have a wide variety of machines, tools, and systems that use disparate and often competing technologies. The infrastructure may run older software versions that are incompatible with other systems and lack interoperability. In the absence of standards and common frameworks, customers have to think carefully about machine-to-machine communication, how to connect legacy machines, and which sensors or converters to install. An ecosystem of players offering compatible components that use standard rules and frameworks to connect to ERP, MES, and PLC/SCADA systems will address the interoperability problem. OPC UA is becoming the key protocol for Industry 4.0 communication and data modeling (see the connectivity sketch after this list).

  • Real-time decision making: Many applications in manufacturing are sensitive to latency and require an ultra-fast response. These applications cannot wait for the round trip to the cloud to process data and return actionable insights. Decisions have to be made and acted upon in real time, within minutes or sometimes milliseconds. Such rapid decision-making requires streaming analytics and real-time prediction services. Real-time data processing allows manufacturers to act immediately and prevent undesirable consequences. For example, using predictive analytics for quality, manufacturers can identify defective components and rework or replace them before they trigger a product recall.

  • Edge deployments: The concept of edge computing is fundamental in manufacturing. Processing data locally, near where it is generated, is more efficient and yields a faster response. Real-time decision making and intelligent local control systems require edge-based computing, so the ability to deploy predictive models on edge devices such as machines, local gateways, or servers is critical for smart manufacturing applications (a local scoring-service sketch follows this list).

  • Trust & transparency: One key hurdle preventing broad AI adoption is the complexity of the technology and the lack of trust that has created the AI transparency paradox. The AI technology stack is extraordinarily complex, and people without a data science background struggle to understand how predictive modeling works and do not trust the abstract algorithms behind it. Transparency means providing information about the AI pipeline: the input data used, the algorithms selected, and how the model arrives at its predictions. That includes explaining how raw data is transformed into machine learning inputs (feature engineering) and how the model combines hundreds of features or more to produce a prediction. This is where Explainable AI can help. By giving insight into how prediction models work and the reasoning behind individual predictions, manufacturing organizations can build trust and increase transparency (a simple explainability sketch follows this list).
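
To make the data-preparation challenge concrete, here is a rough, illustrative sketch of the kind of cleanup a predictive maintenance project typically needs before any modeling can start. The file names, columns, and thresholds are hypothetical placeholders, not a specific CMMS or historian schema.

```python
# Illustrative preprocessing sketch; file names, columns, and thresholds are
# hypothetical placeholders, not a specific CMMS or historian schema.
import pandas as pd

# Sensor readings exported from a process historian.
sensors = pd.read_csv("historian_export.csv", parse_dates=["timestamp"])

# Corrective work orders exported from the CMMS.
work_orders = pd.read_csv("cmms_work_orders.csv", parse_dates=["opened_at"])

# Drop physically implausible readings caused by noisy or failing sensors.
sensors = sensors[sensors["temperature_c"].between(-40, 200)]

# Resample irregular readings onto a common 1-minute grid per machine and
# bridge only short gaps, so longer outages remain visible as missing data.
sensors = (
    sensors.set_index("timestamp")
    .groupby("machine_id")
    .resample("1min")
    .mean(numeric_only=True)
    .interpolate(limit=5)
    .reset_index()
    .sort_values("timestamp")
)

# Attach the most recent work order to each reading so downstream feature
# engineering can relate sensor behavior to maintenance events.
merged = pd.merge_asof(
    sensors,
    work_orders.sort_values("opened_at"),
    left_on="timestamp",
    right_on="opened_at",
    by="machine_id",
    direction="backward",
)
```

Even this simplified version has to deal with sensor noise, irregular sampling, and joins across operational systems, which is exactly why automating AI-focused data preparation matters.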

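On the interoperability side, open standards make the connectivity problem tractable. As a minimal sketch, the snippet below reads a single value over OPC UA using the open-source python-opcua client; the endpoint URL and node id are made-up examples, since real ones depend on the PLC or gateway's address space.

```python
# Minimal OPC UA read using the open-source python-opcua client.
# The endpoint URL and node id are hypothetical examples.
from opcua import Client

client = Client("opc.tcp://192.168.0.10:4840")  # placeholder PLC/gateway endpoint
client.connect()
try:
    # Node ids come from the server's address space; "ns=2;i=1001" is illustrative.
    temperature = client.get_node("ns=2;i=1001").get_value()
    print("Spindle temperature:", temperature)
finally:
    client.disconnect()
```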
 
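For real-time decisions at the edge, one common pattern is to package a trained model behind a small local scoring service that machines and gateways can call over the plant network, avoiding the round trip to the cloud. The sketch below uses Flask and a joblib-serialized model purely as an illustration; the model file, feature order, and port are assumptions rather than a specific product's deployment format.

```python
# Sketch of a local (edge) scoring service. The model artifact, feature order,
# and port are assumptions for illustration only.
import joblib
from flask import Flask, jsonify, request

app = Flask(__name__)
model = joblib.load("quality_model.pkl")  # hypothetical pre-trained classifier

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json()
    # Expect {"features": [...]} with values in a fixed, agreed-upon order.
    features = [payload["features"]]
    defect_probability = float(model.predict_proba(features)[0][1])
    return jsonify({"defect_probability": defect_probability})

if __name__ == "__main__":
    # Bind to all interfaces so gateways on the shop-floor network can reach it.
    app.run(host="0.0.0.0", port=8080)
```

Packaging a service like this in a container makes it straightforward to roll the same model out to a gateway or an on-premises server on the manufacturing floor.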
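Explainability does not have to stay abstract, either. The sketch below trains a toy classifier on synthetic data and uses SHAP, one widely used open-source explainability library, to attribute predictions to individual feature contributions; the features and labels are purely illustrative.

```python
# Explainability sketch using SHAP on a toy model; the data and feature names
# are synthetic placeholders, not real production data.
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = pd.DataFrame(
    rng.normal(size=(500, 3)),
    columns=["vibration_rms", "spindle_temp", "cycle_time"],
)
# Toy defect label driven mostly by vibration and temperature.
y = (X["vibration_rms"] + 0.5 * X["spindle_temp"] > 0).astype(int)

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# TreeExplainer attributes each prediction to per-feature contributions,
# the kind of reasoning a domain expert can inspect and challenge.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X.iloc[:5])
print(shap_values)
```

A domain expert reviewing these contributions can see which signals drove a given prediction, which is far easier to trust than an opaque score.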
A domain-agnostic, end-to-end AI Automation platform that offers the analytical flexibility to address multiple use cases will dramatically improve operational experts’ lives and address these barriers. AI automation allows SMEs to focus on their day-to-day responsibilities, while automated data pre-processing and feature engineering enable them to build predictive models at the click of a button. Explainable AI and transparent features create trust and garner buy-in from domain experts. Containerized prediction models provide real-time prediction capability and accelerate AI deployment at the edge on the manufacturing floor. Empowering manufacturing and production SMEs to do more with less through AI automation is the right way to accelerate manufacturing’s digital transformation.

About The Author


Ryohei Fujimaki is the founder & CEO of dotData, a spin-off of NEC Corporation and the first company focused on delivering end-to-end data science automation for the enterprise. Dr. Fujimaki is a world-renowned data scientist and was the youngest research fellow appointed in the 119-year history of NEC.

