It should come as no surprise that the Internet of Things (IoT) is one of the most eagerly anticipated trends in heavy industry. Powered by a host of technologies, including low-cost sensors, IP and wireless networks, private and public clouds, and powerful edge infrastructure, industrial IoT promises to transform the way companies provide products and services and interact with customers and partners.
Simultaneously, another revolution is taking place in artificial intelligence (AI). For years, programmed intelligence based on simple rules and limited data inputs has powered various industrial applications. A robot arm that extracts a molded part from a chemical wash after certain conditions are met is an example of a narrow, “weak” AI.
Recently, a new generation of industrial AI has emerged: systems powered by machine learning, a subset of AI that predicts events or optimizes performance. The secret sauce in machine learning is algorithms that are capable of quickly processing large, interconnected data sets that would be too complex for the human brain.
With the ability of IoT sensors and applications to generate massive amounts of data pertaining to individual components as well as the health of entire facilities, and the ability of machine learning to derive insights from an array of inputs at scale, there is a lot of anticipation about the possibilities of AI-powered IoT.
“Edge analytics and IoT are enabling technologies that bring the power of IoT to a growing number of industries, including areas where there is low connectivity or where power is an issue,” says David Schatsky, a managing director at Deloitte who specializes in emerging technology and business trends. “When you start to get really large amounts of data, that’s what machine learning is good for.”
There is pent-up data in the “things” in the IoT, says Dr. Tom Bradicich, vice president and general manager for servers, converged edge, and IoT systems for Hewlett Packard Enterprise. This data holds new and valuable insights, which can be unleashed to improve operational efficiency, lower costs, and increase safety and security. “AI and machine learning, applied to IoT and things data, is a key next step in more fully exploiting the business, engineering, and scientific insights derived from this new source of big data.”
Bradicich says that cognitive technologies will have a wide-ranging impact on industrial firms, and not just in operational settings. “What would you do if you could be perpetually connected to your products, customers, and facilities?” Bradicich asks. “Such connectivity affords the opportunity to apply cognitive technologies such as predictive analytics to the behaviors of products and customers, which in turn can drive better planning for warranties, inventories, and new products.”
Predictive maintenance to the rescue
Maintenance is a promising area for AI-powered IoT. Industrial firms tend to be reactive when it comes to maintenance, sending a crew to fix a piece of equipment when it stops working or emits an alarm. Some firms will also have regularly scheduled maintenance, based on accumulated usage or calendar dates. For instance, a logistics company may have biannual service checks of every vehicle in its fleet and replace certain parts or entire vehicles on a set schedule.
The problem with reactive repairs and scheduled maintenance is that both are wasteful and expensive. A broken piece of machinery is not only costly to fix or replace, but it can take out an entire assembly line or otherwise limit operations. Scheduled maintenance can be wasteful, too, requiring time and other resources even if the equipment in question doesn’t need to be maintained or replaced.
“Machine learning is really good for automatically identifying patterns or anomalies,” Schatsky says. “Most of the predictive maintenance apps that you see being deployed make some use of machine learning to process the data that’s collected and to indicate anomalies that can then draw the attention of engineers to see if maintenance or intervention is required.”
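A minimal sketch of the kind of anomaly detection Schatsky describes: a simple rolling z-score over simulated sensor readings. The data, window size, and threshold here are invented for illustration; a production predictive maintenance system would use trained models and real telemetry.

```python
from statistics import mean, stdev

def find_anomalies(readings, window=10, threshold=3.0):
    """Flag readings that deviate more than `threshold` standard
    deviations from the rolling mean of the preceding window."""
    anomalies = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(readings[i] - mu) > threshold * sigma:
            anomalies.append(i)
    return anomalies

# Simulated vibration-sensor data with one injected fault spike
data = [10.0 + 0.1 * (i % 5) for i in range(30)]
data[20] = 25.0  # sudden spike, e.g. a failing bearing
print(find_anomalies(data))  # → [20]
```

Only the flagged indices would be surfaced to engineers, who then decide whether intervention is actually required.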
Heavy industry leverages AI
Another opportunity for AI on the edge is in manufacturing environments that require robotics and other types of machinery that can operate autonomously. Robotic devices already serve in front-line capacities in assembly lines, warehouses, and other industrial environments, taking on tasks that are especially repetitive or dangerous for human workers.
Moreover, robots are increasingly part of the wider industrial IoT ecosystem. When connected to networks, supply chain applications, and other systems, they can boost efficiency and achieve scale.
Where do AI technologies such as machine learning fit in? AI can multiply the effectiveness of industrial automation beyond the narrow tasks that robots once performed. For instance, semi-autonomous trucks, trains, and loaders have long been a part of the mining industry, but they are typically guided by pre-programmed routines, fixed tracks, and remote human operators. A new generation of mining technology uses AI, GIS, and GPS data and programmable logic controllers, which enable driverless vehicles and loaders to operate autonomously and determine optimal routes and positioning.
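As a toy illustration of the route determination such vehicles perform, here is Dijkstra’s shortest-path algorithm over a hypothetical mine-site graph. The location names and travel costs are invented; real systems fuse GIS, GPS, and sensor data rather than a hand-written map.

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm over a weighted graph of site locations.
    Returns (total_cost, path) for the cheapest route, or None."""
    queue = [(0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, weight in graph.get(node, {}).items():
            if neighbor not in visited:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return None

# Hypothetical haul roads between points on a mine site (minutes of travel)
site = {
    "pit": {"junction": 4, "haul_road": 9},
    "junction": {"crusher": 6, "haul_road": 3},
    "haul_road": {"crusher": 2},
    "crusher": {},
}
print(shortest_route(site, "pit", "crusher"))  # → (9, ['pit', 'junction', 'haul_road', 'crusher'])
```

The planner correctly prefers the three-leg route over the shorter-looking direct road, the same trade-off a driverless loader makes continuously as site conditions change.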
According to a report by the International Institute for Sustainable Development and Columbia University, algorithm-driven driverless technology can contribute to a 20 percent increase in output, a 15 percent decrease in fuel consumption, and an 8 percent decrease in maintenance costs.
Moreover, as Ethernet replaces proprietary networks in mining environments and industrial IoT is extended to mining sites and processing facilities on the edge of the network, new types of sensors, controllers, and intelligent instruments can further boost operations. These technologies are more cost-effective to provision and maintain, and the insights derived from them provide far better clarity into plant operations. Mining applications that use an algorithm-driven optimization technology called multivariable predictive control (MPC) can lead to improvements in yield, capacity, and energy consumption.
Collaboration between robots and humans
While robots were once kept in cages, strictly segregated from human workers to avoid injury, cutting-edge AI is allowing robots and humans to work more closely together.
“The biggest difference is this generation of robots is designed to work among people,” Schatsky says. “The new generation of robots can use AI technologies such as computer vision, speech recognition, and more sophisticated analytics of the sensors that they have, all with the goal of making them safe to deploy among people.”
Claudia Pérez D’Arpino, a robotics and AI researcher at MIT, says that bringing robots and humans together in the same workspace requires a whole new set of capabilities that weren’t even considered 10 years ago. It’s not enough for robots to work safely among humans; she notes that they also have to be efficient.
“In manufacturing, every second is important, so if the robot is going to be very slow just to collaborate with the human, it is not going to work,” she says. “The robot has to collaborate, but it still has to do its own tasks on time.”
D’Arpino and her colleagues are developing a machine learning system called C-Learn (the C stands for “constraints”), which aims to allow non-coders to train robots to perform specific industrial tasks, such as assembling a component or welding a piece of metal. The tasks can be transferred from one robot to another and even to different environments, such as an assembly line in a different country.
This approach represents a huge improvement in industrial automation, D’Arpino says. “Currently, it takes a month of coding to train a robot to do a task,” she says. “That only works if you are going to execute this task for years.” She envisions AI-powered robots spreading to areas of manufacturing that require frequent changes, an area that up until now has not been cost-effective for industrial robots.
AI on the edge vs. AI in the cloud
As AI-powered IoT and robotics enable new manufacturing paradigms, there is the question of how the IT infrastructure will work. Do AI systems require storage and compute resources from powerful clouds, or do such systems need to be distributed to the factory floor or remote facilities where industrial processes are taking place?
The answer to this question: It depends. Clearly, certain types of sensors and devices cannot afford to wait for data or commands from the cloud, because latency or processing delays may cause safety or performance issues. An autonomous vehicle that moves parts across the factory floor will need to make decisions on the fly about which route to take or how to best position itself when loading and unloading parts to avoid spillage. An AI-enabled manufacturing robot will similarly need to crunch real-time sensor data from its own inputs as well as data from nearby equipment to optimize performance, carry out specific tasks, and avoid causing injury to nearby workers.
If hundreds of robots and IoT devices are working in tandem, the edge infrastructure requirements will be significant. According to IDC, at least 40 percent of IoT-created data will be stored, processed, analyzed, and acted upon close to or at the edge of the network by 2019.
On the other hand, cloud services are a realistic option to evaluate performance across the system, to store massive amounts of data thrown off by thousands of sensors and IoT devices, and to look for insights based on archived data.
“The cloud is most useful when you are trying to get a view across multiple facilities,” Deloitte’s Schatsky says. “For tracking, monitoring, and diagnosing individual equipment, it probably makes sense to do it closer to the edge.”
There are additional cost considerations that favor locating compute and storage resources at the edge. Transmitting large quantities of raw data to a faraway data center or cloud service will require more time and electricity than if it’s processed locally. This is a huge concern for battery-powered IoT devices, which must conserve power to extend their service life.
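One common pattern for cutting that transmission cost is to aggregate raw samples on the device and uplink only a compact summary plus any alarms. A rough sketch, with the device name, threshold, and sensor data all invented for illustration:

```python
import json
from statistics import mean

def summarize_batch(readings, device_id, alarm_threshold=80.0):
    """Reduce a batch of raw sensor samples to a compact summary for
    uplink; the raw samples never leave the edge device."""
    return json.dumps({
        "device": device_id,
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(mean(readings), 2),
        "alarm": max(readings) > alarm_threshold,
    })

# 1,000 simulated motor-temperature samples with one overheating spike
raw = [70.0 + (i % 7) * 0.3 for i in range(1000)]
raw[500] = 92.4
summary = summarize_batch(raw, "press-07")
print(summary)
print(f"uplink: {len(summary)} bytes vs {len(json.dumps(raw))} bytes raw")
```

The summary is a small fraction of the raw payload, and the radio (often the biggest battery drain) stays off except for the brief uplink.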
AI can’t solve all industrial problems
AI-powered IoT and robots will not solve all of the problems on the factory floor or remote site. “AI is good for some things, but it’s not relevant in other areas,” Schatsky says. “Like any technology investment, it has to be rooted in a real business case.”
When it comes to evaluating how AI might yield true benefits in an industrial setting, Schatsky advises companies to focus on applications that have the biggest potential return. This might include tasks or processes that are known to be highly inefficient or applications that are capable of generating large amounts of data that could provide useful insights to improve operations.
HPE’s Bradicich says companies can take a smart approach to evaluating AI and conducting pilots. “Businesses can quantify the ROI and minimize the risks of AI and machine learning benefits by applying a smaller controlled test use case, engaging an application subject matter expert, and partnering with vendors with proven track records,” says Bradicich. If results from that are positive, “incremental scaling up while quantifying the ROI after each step will help ensure success.”
AI on the edge: Lessons for leaders
MIT’s D’Arpino notes that while there is a lot of excitement in some quarters regarding the potential benefits of using AI-powered robots in industrial settings, there is also a lot of misunderstanding as to what such technologies are capable of doing. “Because of press stories and movies, some people think robots can do everything and that they will have the same level of cognitive processing that we have,” D’Arpino says. “That’s not true. And I don’t think it’s going to happen in my lifetime.”
- AI can multiply the effectiveness of industrial automation beyond the narrow tasks that robots once performed.
- Machine learning is good for automatically identifying patterns or anomalies.
- Like any technology investment, AI must be rooted in a real business case.