
Factory of the Future: From Vision to Reality

March 25, 2015

By Andy Chang, Senior Program Manager, Academic Research, NI

Over the past decade, we have become increasingly dependent on the latest technologies in electronics and communications, from mobile devices and intelligent vehicles to home automation. This technological advancement has drastically improved products, but the rapid change in design requirements challenges manufacturers who want to develop cost-effective products in today’s competitive market. Although factories today are nothing like the hectic and noisy production facilities of the past, their inflexible, monolithic production systems make it extremely difficult and costly for manufacturers to adapt to new technologies. The Factory of the Future is a research and technology initiative that applies emerging technologies, such as cyber-physical systems and big data analytics, to improve the competitiveness of manufacturing processes and enable smarter, operator-centric production.

Fig 1. Connected Elements in Factory of the Future

One of the key components for improving efficiency in the Factory of the Future is the smarter tool, such as a drill or a tightening tool. These smart devices are designed to communicate either with a main infrastructure or locally with operators and other tools. In the latter case, the devices must provide situational awareness and make real-time decisions based on local and distributed intelligence in the network.
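
A minimal sketch of that fallback behavior is shown below, in Python; the infrastructure object, its request_decision call, and the field names are all hypothetical stand-ins, not an actual NI or vendor API. The idea is simply that the tool prefers a central decision within a real-time budget and otherwise decides locally from its own state and what its peers report.

```python
INFRASTRUCTURE_TIMEOUT_S = 0.5  # assumed real-time budget for a central decision

def decide_next_action(tool_state, peer_states, infrastructure=None):
    """Prefer a decision from the central infrastructure; fall back to
    local and distributed intelligence when it is unreachable or too slow."""
    if infrastructure is not None:
        try:
            # Ask the factory backbone for a decision within the time budget.
            return infrastructure.request_decision(
                tool_state, timeout=INFRASTRUCTURE_TIMEOUT_S
            )
        except TimeoutError:
            pass  # backbone did not answer in time; decide locally instead

    # Local situational awareness: combine the tool's own state with what
    # neighbouring tools report about the stations they currently occupy.
    busy_stations = {peer["station"] for peer in peer_states if peer["busy"]}
    free_stations = [
        s for s in tool_state["reachable_stations"] if s not in busy_stations
    ]
    if free_stations:
        return {"action": "move_to", "station": free_stations[0]}
    return {"action": "wait"}

# Example: no infrastructure connection, two peer tools reporting their status.
print(decide_next_action(
    {"reachable_stations": ["A1", "A2"]},
    [{"station": "A1", "busy": True}, {"station": "A2", "busy": False}],
))
```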

In a manufacturing facility, smart tools can help simplify the production process and improve efficiency by eliminating paper data logs and manuals. Operators must focus on their operational tasks and keep their hands free to use the appropriate tools. For example, building an airplane involves tens of thousands of steps that operators must follow, with many checks in place to ensure quality.

When manufacturers add intelligence to their systems, the smart tools understand the action that the operator must perform next, automatically adjust to the proper settings, and simplify the task for the operator. Once the action is complete, the smart tools can also monitor and log its results, which improves the efficiency of the production process.

Consider the example of an airplane subassembly that has roughly 400,000 points that need to be tightened down, which requires over 1,100 basic tightening tools in the current production process. The operator has to closely follow a list of steps and ensure the proper torque settings for each location using the correct tool. Because the process is manual, human error adds significant risk to production; even a single location tightened down incorrectly could cost hundreds of thousands of dollars in the long run. Smarter tools and devices use vision to process their surroundings, understand which task the operator is about to perform, and automatically adjust the tool settings accordingly.
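
As a rough illustration of one such tightening step, the Python sketch below assumes a hypothetical camera object with an identify_fastener routine, a hypothetical tool object with set_target_torque and run_tightening_cycle methods, an invented torque lookup table, and an assumed 5 percent tolerance; none of these names come from a real product.

```python
import json
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)

# Hypothetical lookup table: fastener ID -> required torque in newton-metres.
TORQUE_SPEC_NM = {"wing-panel-0001": 12.0, "wing-panel-0002": 9.5}

def tighten_next_fastener(camera, tool):
    """Vision-guided tightening step: recognise the fastener in front of the
    operator, push the matching torque setting to the tool, and log the result."""
    fastener_id = camera.identify_fastener()   # assumed vision routine
    target_nm = TORQUE_SPEC_NM[fastener_id]    # spec lookup replaces the paper manual
    tool.set_target_torque(target_nm)          # the tool configures itself

    applied_nm = tool.run_tightening_cycle()   # torque actually applied
    record = {
        "fastener": fastener_id,
        "target_nm": target_nm,
        "applied_nm": applied_nm,
        "ok": abs(applied_nm - target_nm) <= 0.05 * target_nm,  # assumed tolerance
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    logging.info(json.dumps(record))           # digital log replaces the paper checklist
    return record
```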

Fig 2. Examples of Vision Algorithms Used for a Smart Measuring and Tracking Tool

On the automation side, robot manipulator systems have been used for decades across various industries for a wide variety of applications. These systems are typically designed as proprietary or custom end-to-end solutions, and adding functionality to them is challenging because the vendor-defined black boxes expose only limited access. Configuring these robotic systems can also be extremely costly because each configuration or solution applies only to a specific vendor.

As production systems evolve into “lean” systems, not only in organization but also in planning and technologies, a common communication layer or architecture is needed to allow scalability and adaptability. For example, many robotics system architectures can be divided into three main parts: sensing, thinking, and acting. Sensing typically involves reading sensor data; most manipulators are outfitted with sensors such as an encoder for motor position feedback and a vision tracking system to perceive the environment. Thinking functions use the sensor data to plan movements.

Industrial manipulators usually feature inverse kinematics and obstacle avoidance algorithms. The “act” portion of the control regime translates the positioning commands into drive signals for specific actuators. Many advanced algorithms such as sensor fusion leveraging 3D cameras have emerged in academic research and can make existing manipulator systems drastically more efficient and effective. This common layer not only provides the ability to conduct rapid algorithm prototyping and validation but also acts as a gateway to communicate across the entire factory infrastructure.
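
To make the sense-think-act split and the common layer concrete, here is a minimal Python sketch. It assumes a simple in-process publish/subscribe bus standing in for the shared communication layer; the topic names, the placeholder planner, and the drive callback are all hypothetical.

```python
from collections import defaultdict

class MessageBus:
    """Tiny in-process stand-in for a common communication layer that sensing,
    thinking, and acting components all publish to and read from."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self._subscribers[topic]:
            callback(message)

def sense(bus, encoder_reading, camera_frame):
    # Sensing: push raw measurements onto the shared layer.
    bus.publish("manipulator/joint_angles", encoder_reading)
    bus.publish("manipulator/scene", camera_frame)

def think(bus):
    # Thinking: consume sensor topics, plan a motion, publish setpoints.
    def on_joints(angles):
        # A real planner would run inverse kinematics and obstacle avoidance here.
        setpoints = [a + 0.01 for a in angles]  # placeholder "plan"
        bus.publish("manipulator/setpoints", setpoints)
    bus.subscribe("manipulator/joint_angles", on_joints)

def act(bus, drive):
    # Acting: translate setpoints into drive commands for specific actuators.
    bus.subscribe("manipulator/setpoints", drive)

# Usage: wire the three stages to the same bus, then feed in one sensor sample.
bus = MessageBus()
think(bus)
act(bus, drive=lambda setpoints: print("drive command:", setpoints))
sense(bus, encoder_reading=[0.10, 0.25, 1.57], camera_frame=None)
```

Because every component talks through the same layer, a new algorithm (for example, 3D-camera sensor fusion) can be prototyped by subscribing to existing topics rather than by modifying a vendor black box.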

Fig 3. Common Communication Layer

Fig 4. Connected Devices in a Factory Infrastructure

Various technology silos make up the factory floor today, and each technique, design, and piece of equipment helps make modern manufacturing efficient, organized, and structured. Many leading manufacturers have launched a series of research projects in these areas and have demonstrated the viability and scalability of a platform-based approach that combines software and embedded hardware. For example, Airbus has used NI LabVIEW software and reconfigurable hardware as part of its Factory of the Future testbed to accelerate development and create a horizontal technology platform that the company can scale for each technology silo. As technology grows more complex and advances more rapidly, the ongoing challenge for the Factory of the Future is to identify a common framework that can leverage the advances in each silo, be applied across the platform, and maintain high quality assurance and full traceability throughout the process.

About the Author

Andy Chang is a senior manager for academic research at National Instruments in Austin, Texas. He is currently a PhD candidate in the Department of Mechanical Engineering at the University of Texas at Austin. He received his B.S. degree in Mechanical Engineering from the University of California, San Diego, and his M.S. degree in Mechanical Engineering from the University of Michigan, Ann Arbor. His research interests include optimal control, dynamic system modeling, robotics, and cyber-physical systems. He recently participated in the White House initiative on cyber-physical systems, is a member of euRobotics, and served as vice-chair of industry for the IFAC Mechatronics Symposium.
