Across the manufacturing industry, innovation is happening at the edge. Edge computing allows manufacturers to process data closer to the source where it is being generated, rather than sending it offsite to a cloud or data center for analysis and response. 

For an industry defined by machinery and supply chains, this comes as no surprise. The proliferation of smart equipment, robotics and AI-powered devices designed for the manufacturing sector underscores the value edge presents to manufacturers. 

Yet, when surveyed, a significant gap appears between organizations that recognize the value of edge computing (94%) and those that are currently running mature edge strategies (10%). Running edge devices and smart-manufacturing machines does not always mean there is a fully functioning edge strategy in place.

Why the gap? 

What is holding back successful edge implementation in an industry that clearly recognizes its benefits?

The same survey suggests that complexity is to blame, with 85% of respondents saying that a simpler path to edge operations is needed.

What specifically do these complexities consist of? Chief among them are:

- Data security constraints: Managing large volumes of data generated at the edge, maintaining adequate risk protections, and adhering to regulatory compliance policies creates edge uncertainty.
- Infrastructure decisions: Choosing, deploying, and testing edge infrastructure solutions can be a complex, costly proposition. Components and configuration options vary significantly based on manufacturing environments and desired use cases.
- Overcoming the IT/OT divide: Barriers between OT (operational technology) devices on the factory floor and enterprise applications (IT) in the cloud limit data integration and time to value for edge initiatives. Seamless implementation of edge computing solutions is difficult to achieve without solid IT/OT collaboration in place.
- Lack of edge expertise: A scarcity of edge experience limits the implementation of effective edge strategies. The move to real-time streaming data, data management, and mission-critical automation has a steep learning curve.

Combined, these challenges are holding back the manufacturing sector today, limiting edge ROI (return on investment), time to market and competitiveness across a critical economic sector. 

As organizations aspire toward transformation, they must find a holistic approach to simplifying — and reaping the benefits of — smart factory initiatives at the edge.

Build a Simpler Edge 

What does a holistic approach to manufacturing edge initiatives look like? It begins with these best practices: 

- Start with proven technologies to overcome infrastructure guesswork and obtain a scalable, unified edge architecture that ingests, stores, and analyzes data from disparate sources in near-real time and is ready to run advanced smart-factory applications in a matter of days, not weeks.
- Deliver IT and OT convergence by eliminating data silos between edge devices on the factory floor (OT) and enterprise applications in the cloud (IT), rapidly integrating diverse data types for faster time to value.
- Streamline the adoption of edge use cases with easy and quick deployment of new applications, such as machine vision for improved production quality and digital twin composition for situational modeling, monitoring, and simulation.
- Scale securely using proven security solutions that protect the entire edge estate, from IT to OT. Strengthen industrial cybersecurity using threat detection, vulnerability alerts, network segmentation, and remote incident management.
- Establish a foundation for future innovation with edge technologies that scale with your business and are easily configured to adopt new use cases — like artificial intelligence, machine learning, and private 5G — minimizing the complexity that holds manufacturers back from operating in the data age.

Don’t go it alone

The best way to apply these practices is to start with a tested solution designed specifically for manufacturing edge applications. Let your solution partner provide much of the edge expertise your organization may not possess internally. A partner who has successfully developed, tested and deployed edge manufacturing solutions for a wide variety of use cases will help you avoid costly mistakes and reduce time to value along the way. 

You don’t need to be an industry expert to know that the manufacturing sector is highly competitive and data-driven. Every bit of information, every insight matters and can mean the difference between success and failure.

Product design and quality, plant performance and safety, team productivity and retention, customer preferences and satisfaction — all are contained in your edge data. Your ability to access and understand that data depends entirely on the practices you adopt today.

Digitally transforming edge operations is essential to maintaining and growing your competitive advantage moving forward.

A trusted advisor at the edge

Dell has been designing and testing edge manufacturing solutions for over a decade, with customers that include Ericsson, McLaren, Linde, and the Laboratory for Machine Tools at RWTH Aachen University.

You can learn more about our approach to edge solutions for the manufacturing sector, featuring Intel® Xeon® processors, at Dell Manufacturing Solutions. The latest 4th Gen Intel® Xeon® Scalable processors have built-in AI acceleration for edge workloads, with up to 10x higher PyTorch real-time inference performance using built-in Intel® Advanced Matrix Extensions (Intel® AMX) (BF16) vs. the prior generation (FP32).¹

¹ See [A17] at intel.com/processorclaims: 4th Gen Intel® Xeon® Scalable processors. Results may vary.
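As a rough illustration of what that acceleration path looks like in code, the sketch below runs a PyTorch model in bfloat16 on the CPU via autocast; on processors with Intel AMX, PyTorch's oneDNN backend can route these matrix multiplications through the AMX units. The model and input shapes are invented for illustration.

```python
# Minimal sketch: PyTorch CPU inference in bfloat16.
# On 4th Gen Xeon CPUs, the oneDNN backend can dispatch bfloat16
# matmuls to Intel AMX; the model and shapes here are illustrative.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(512, 1024),
    nn.ReLU(),
    nn.Linear(1024, 10),
).eval()

x = torch.randn(64, 512)

with torch.inference_mode(), torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    logits = model(x)

print(logits.shape)  # torch.Size([64, 10])
```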

Edge Computing

Private 5G is the next evolution of networking for mission-critical applications used in factories, logistics centers and hospitals. In fact, it suits any environment that needs the reliability, security and speed of a wired connection combined with the movement of people, things and data.

The element of movement is often a factor in Industry 4.0 digital transformation – and that’s where private 5G shines.

Private 5G is deployed as an extension of an organization’s WAN. It’s fast, secure, reliable and has low latency. You can rely on it to transmit data. But if you don’t have a computing resource at the edge where the data is collected to create actionable intelligence in real time, you’re missing out on revolutionary possibilities.

Edge computing brings out the real potential of private 5G

Bringing managed private 5G together with managed edge computing enables businesses to analyze situations in the now – no more waiting for data to be collected (often a slow process) and sent to a data center to be processed first.

In manufacturing, this combined-platform approach quickly delivers the right information to where decisions have to be made: the factory floor. This has implications for everything from an evolutionary increase in productivity and quality, to greater flexibility and customization.

Organizations also have to control data sovereignty, ownership and location. Private 5G can protect data by ensuring that all traffic remains on-premises.

While private 5G is a powerful tool, use cases make it exciting

Switching to private 5G helps avoid Wi-Fi access-point proliferation as well as blind spots in monitoring: asset-based sensors can collect and transmit huge volumes of data quickly, and indoor positioning can be accurate to less than one meter.

It’s also a much simpler exercise to reconfigure connectivity between devices and improve the timing and synchronization of data feeds from sensors.

Last year, Cisco’s Strategic Execution Office ran a study on private 5G in collaboration with Deloitte, titled “Vertical Use Cases Offer Development”, which delves into the main applications of private 5G through use cases.

They found that the highest demand for private 5G is in the manufacturing, logistics and government industries. Their findings match our experience, as these are the sectors in which NTT’s Private 5G and Edge as a Service are most in demand.

Moving from broad themes to specific applications

The study identified four themes: enabling hybrid connectivity; activation and policy setup for varied sensor profiles; advanced intelligence with private 5G and the edge-computing stack; and integrated app and infrastructure to enable business outcomes.

NTT’s experience has taught us that these themes can be translated into five main areas of application:

- Group wireless communications (push-to-talk) enable workers to communicate across locations, with real-time location tracking.
- Private 5G supports augmented reality and virtual reality, allowing for self-assist, work-assist and remote-assist capabilities.
- Private 5G makes real-time connectivity and control possible for autonomous guided vehicles.
- Computer vision for automatic video surveillance, inspection and guidance is faster and more efficient on a private 5G network.
- Connected devices can remain reliably and securely connected to the enterprise network throughout the work shift without relying on Wi-Fi or portable hot spots.

Exploring the difference 5G will make in manufacturing

The study also explores how private 5G can optimize assets and processes in manufacturing, assembly, testing, and storage facilities. Private 5G allows for faster and more precise asset tracking, system monitoring, and real-time schedule and process optimization using location and event data from sensors and factory systems.

The research provides two examples of private 5G use cases in factories:

- Factory asset intelligence: Traceability from parts to product, with increased sensor enablement across manufacturing, assembly and testing sites.
- Dynamic factory scheduling: Closed-loop control and safety applications enabled by real-time actuation, sensor fusion and dynamic process schedules.

As we continue to explore the potential of private 5G, it is clear that this technology has the power to transform the manufacturing industry and pave the way for a more efficient and effective future.

To find out more about the use cases private 5G unlocks and how they can offer business benefits, download NTT’s white paper: Smart manufacturing: accelerating digital transformation with private 5G networks and edge computing.

Edge Computing, Manufacturing Industry, Manufacturing Systems, Private 5G

The manufacturing industry is undergoing a renaissance, thanks in part to advances in information technology. Two IT leaders who have been on the forefront of that are Kim Mackenroth and Chris Nardecchia.

Kim Mackenroth is vice president and global CIO of Textron, a Fortune 302 multi-industry company with around 33,000 employees worldwide. Her global IT organization comprises five business-segment CIOs, as well as shared services provided by the CISO, CTO, and the leader of enterprise business systems. CIO 100 award-winner Chris Nardecchia also wears multiple leadership hats in his role as senior vice president and chief digital and information officer of Rockwell Automation, the world’s largest pure-play industrial automation and IoT company.

These two industry leaders have much in common, from their parallel career paths to their leadership philosophies and experiences. When the three of us spoke for a recent episode of the Tech Whisperers podcast, we explored how Mackenroth and Nardecchia are succeeding with their transformation journeys, winning with people, and not only answering the CEO’s call but also changing the IT narrative to get those calls in the first place. Afterwards, we spent some time talking about their career journeys and the technology that excites them about the future of manufacturing and business. What follows is that conversation, edited for length and clarity.

Dan Roberts: You have similar career stories in that neither of you started out in IT and never intended to get into this profession. Where did you start, and how did you get here?

Kim Mackenroth, vice president and global CIO, Textron

Textron

Kim Mackenroth: I believe a career path is not a ladder but a jungle gym of experiences — some lateral, some vertical — that provide a solid foundation for the ones that follow. While I never intended to be a CIO, I have always had the philosophy of ‘take the role that scares you the most because that is where you will grow the most.’

Throughout my career at Textron, I have had many roles, spanning supply chain, manufacturing, integrated product teams, and working on helicopter programs. There was an opportunity to be part of a new way of conducting business at our Bell business unit, so I took that role, and as a result, the CIO took notice and asked me to join his organization as a direct report. He rotated me throughout IT, and then I became a CIO of two other businesses prior to becoming Global CIO.

Chris Nardecchia, SVP and chief digital and information officer, Rockwell Automation

Rockwell Automation

Chris Nardecchia: I started as a chemical engineer doing chemical engineering things, building and operating chemical and nuclear processes, producing polymers and nuclear fuel, etc. That led me into a role in the pharmaceutical industry, again building and operating processes to manufacture pharmaceuticals. As part of that role, I got involved with computer control of manufacturing processes, and that’s when I started to code. That led to my increasing involvement in moving data across the enterprise to compare manufacturing operations around the world.

During this period, when we were expanding at a rapid pace and building new manufacturing plants around the globe, I was approached by the head of our manufacturing division to lead the implementation of SAP across our manufacturing network. I realized it was career-limiting to tell the president of a division ‘no,’ and that started my journey into IT.

From that point forward, I was fortunate enough to lead both the IT and OT teams within global pharmaceutical manufacturing and supply chain organizations. Those experiences prepared me well for what Rockwell Automation needed in their next IT leader — someone who can not only run the IT operations but also understand manufacturing in the OT environment.

Dan Roberts: I recently spoke with Charlie Feld, who said that, before the internet, we had more time to build relationships and to think. But we also didn’t have the technology to do all these great things we’ve come up with since. What technology are you most excited about now and as you look to the future?

Kim Mackenroth: Earlier in my career, I worked at Bell’s drive systems center, where we build all the high-tolerance parts for our complex gearboxes that go into our helicopters. We had large batches of parts, which would have to get through an enormous amount of processing, machines, and external providers to complete. We used to say they travel many miles to hopefully yield the parts that we needed for assemblies.

We had a dream at that time: Wouldn’t it be great to live in a world where we could have one-piece part flow? Where we could have machines capable of digital loops, so they could adapt, produce a quality part every time, perform multiple operations, and significantly reduce the amount of equipment required, the number of operations, and the amount of span time.

Bell now has a manufacturing technology center — a purposeful factory of the future. It was all about creating and testing the capabilities that I just talked about, and how to embed that back into our core manufacturing processes. Everything that I just described is happening, and it’s the culmination of engineering, manufacturing, modern machines and software, all coming together to yield the future we dreamed about.

Chris Nardecchia: Those things really excite me. We’re clearly in the age of AI. We’ve seen the amazing progress with open-source AI, with ChatGPT, and previously with DALL-E. This is just the tip of the iceberg. Applying similar capabilities to manufacturing, as Kim just went through, is in progress, and it’s just going to accelerate. I’m extremely excited about the exponential impacts that applying these technologies can have on manufacturing operations.

To cite just a few results that have been achieved with digitization at Rockwell, we’ve seen a 40% improvement in quality, a seven-figure improvement in productivity, and, prior to recent supply chain issues, our on-time delivery improved from 82% to 96%. These are big numbers, but imagine what the possibilities are when you apply advanced AI algorithms.

Here’s a real-life example at Rockwell. Part of our manufacturing process is to create electronic components with circuit boards, and you embed computer chips in them. We have six plants with 24 manufacturing lines and 50 machines that contain 2,000 nozzles that place these chips from a spool at a very high speed onto the printed circuit board. The exact placement on that board is critical. If you get these off by a few millimeters, then you’re scrapping boards.

Over time the nozzles can wear out and drift away from the proper location. To avoid bad boards being created, we used to perform maintenance on a time schedule, and we’d frequently replace good nozzles that still had life in them. Now, we replace them just before failure by leveraging an AI solution that predicts the drift of these nozzles at very high speed and notifies the operator via a visual application when they’re predicted to fail.

At a cost of $5 to $500 per nozzle, it saves significant costs, but more importantly, it maximizes the machine utilization and uptime. This is just one example. If you think about the countless manufacturing lines across the world, there’s just huge opportunities.
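Nardecchia's nozzle story is a compact template for drift-based predictive maintenance. As a hedged illustration only, not Rockwell's actual system, the sketch below fits a linear trend to a nozzle's recent placement offsets and estimates the cycles remaining before a failure threshold is crossed; the threshold, window size, and data are all invented.

```python
# Minimal sketch of drift-based predictive maintenance for placement nozzles.
# Assumes each nozzle logs a placement-offset measurement (mm) per cycle;
# the 0.5 mm failure threshold and 200-cycle window are illustrative.
import numpy as np

FAILURE_THRESHOLD_MM = 0.5
WINDOW = 200  # recent cycles used to estimate the drift trend

def cycles_until_failure(offsets_mm: np.ndarray):
    """Fit a linear trend to recent offsets and extrapolate to the threshold."""
    recent = offsets_mm[-WINDOW:]
    cycles = np.arange(len(recent))
    slope, intercept = np.polyfit(cycles, recent, deg=1)
    if slope <= 0:
        return None  # no measurable drift toward failure
    return max((FAILURE_THRESHOLD_MM - recent[-1]) / slope, 0.0)

# Simulated nozzle that drifts ~0.001 mm per cycle plus measurement noise.
rng = np.random.default_rng(0)
history = 0.1 + 0.001 * np.arange(300) + rng.normal(0, 0.01, 300)

eta = cycles_until_failure(history)
if eta is not None and eta < 50:
    print(f"Notify operator: nozzle predicted to fail in ~{eta:.0f} cycles")
```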

Roberts: Kim, you talk about software like driving a car. What do you mean by that?

Mackenroth: Everything was previously hardware-centric. It’s like the example that Chris gave: we had those windows of opportunity where we’d replace hardware, because hardware was the center of everything. Now, software eats hardware for lunch.

Look at Tesla. When you get into a Tesla, it’s not all the bells and whistles from a hardware perspective. That’s not what it’s about. It’s about that software experience — having those monthly updates, getting the new features and functions. Probably not the most comfortable seat, not all the little ecosystems that you would have in a luxury vehicle, but the enthusiasm is off the charts because of that customer engagement, that customer experience of, what am I going to get next? What am I going to be able to do next? None of that could have been accomplished with the previous automotive industry approach.

Roberts: Back in the day, we used the phrase eating our own dog food — or drinking our own champagne. Chris, you talk about it as Rockwell on Rockwell. What does that look like?

Nardecchia: The concept here is borrowed from the software world, where we use our own products, our own solutions, and our own manufacturing facilities not only to improve our own operations but to showcase them for our customers. So in our manufacturing headquarters today, where there previously wasn’t any manufacturing because everything had shifted overseas, we’ve now brought back manufacturing and demonstrated, in an almost lights-out facility with one operator, all the technology advancements that people can apply.

This is us walking the talk, not only with our own products but our partner ecosystem so that we can feel good about what we’re promoting to our customers and find the flaws of the implementation experience. If the customer is going to experience this, we want to experience it first and then modify what that experience is for the end customer. Demonstrating these capabilities in our own four walls allows us to speak about them with conviction with our customers.

Roberts: Is there going to be a manufacturing renaissance in the US?

Nardecchia: I think it’s happening and it’s driven by two or three things. One is the supply chain and the geopolitical events that are happening. That’s awakened people to say, ‘What do we do and how do we secure our supply chain?’ It’s also driven a little bit by the labor shortage — how do we sustain a society in a growing population through automation and that marriage between machines and human intelligence? How does that work?

A number of companies that moved manufacturing to low-tax havens are now announcing facilities being built in the US in the coming year. We’re seeing semiconductor manufacturing move from Asia over to the US. We’ll have to see if it stays and sticks, but I believe you’re going to see more manufacturing centered in the US.

Roberts: Kim, after 27 years at Textron, what keeps you excited about what you do?

Mackenroth: I get this question a lot, and I would go all the way back to the beginning. I am so grateful for the group of leaders that brought me in at Bell where I began my career, the mentors that challenged me, and the wonderful teammates and colleagues I get to work with. There’s that phrase, ‘Hire people who amaze you and then teach them how to amaze themselves.’ I feel like Textron has done that for me, and it’s part of my legacy to those people that follow me to make sure they’re having a career journey that amazes themselves.

But there’s two big reasons why I’m here outside of all of that. I love the talent philosophy. It is rare in a multi-industry, global organization to have such a passion for developing and promoting people from within. That’s incredibly special and supports the opportunities that we can offer. My CHRO talks about community, cause, and career. That is important, but I would add people, purpose, and passion. My number one, most enthusiastic item is purpose, and if I summarize everything that we do at Textron, we literally defend freedom. We protect the warfighter. We save lives. We build time machines. We move humanity. Who else can say that?

Dig deeper into the career journeys and leadership playbooks of Mackenroth and Nardecchia by tuning in to the Tech Whisperers podcast.

Innovation, Manufacturing Industry

As part of its ongoing strategy to expand its roster of public cloud regions and catch up with larger cloud service providers such as AWS, Microsoft and Google, Oracle has launched a new cloud region in Chicago to cater to enterprises operating out of the US Midwest.

The Chicago region, which will be Oracle’s fourth public cloud region in the US and 41st globally, will primarily cater to manufacturing and financial services firms among other industries operating in that part of the country, said Leo Leung, vice president of products and strategy at Oracle.  

The Midwest region, according to Oracle, is home to more than 60% of all US manufacturing firms and houses the world’s largest financial derivatives exchange.

“This is just going to give them (enterprise customers in the region) the capability of running their workloads closer to their headquarters versus other parts of the country,” Leung said, adding that the demand in the region is fueling the company’s growing bookings for Oracle Cloud Infrastructure (OCI).

CEO Safra Catz, during an earnings call for the quarter ended November, had said that the company saw triple-digit bookings growth across its infrastructure-as-a-service (IaaS) offerings for the past two quarters and that, based on this growth, it planned to invest approximately $2.4 billion every quarter for the next few quarters.

The new region in Chicago will offer over 100 OCI services and applications, including Oracle Autonomous Database, MySQL HeatWave, OCI Data Science, Oracle Container Engine for Kubernetes, and Oracle Analytics, the company said.

Oracle has three other regions in the US, situated in Ashburn, Virginia; San Jose, California; and Phoenix, Arizona.

Globally, the company has a total of 55 cloud regions including national security regions.

Nine new regions are currently being built, Catz had said during the earnings call, according to a transcript from Motley Fool.

For the quarter ended November, the company’s total revenue grew 25% in constant currency, buoyed by revenue growth from its infrastructure and applications cloud businesses, which grew 59% and 45% respectively, in constant currency.

Cloud Computing, Finance and Accounting Systems, Manufacturing Industry, Technology Industry

As the chief engineer and head of the department for digital transformation of manufacturing technologies at the Laboratory for Machine Tools and Production Engineering (WZL) within RWTH Aachen University, I’ve seen a lot of technological advancements in the manufacturing industry over my tenure. I hope to help other manufacturers struggling with the complexities of AI in manufacturing by summarizing my findings and sharing some key themes.

The WZL has been synonymous with pioneering research and successful innovations in the field of production technology for more than a hundred years, and we publish over a hundred scientific and technical papers on our research activities every year. The WZL is focused on a holistic approach to production engineering, covering the specifics of manufacturing technologies, machine tools, production metrology and production management, helping manufacturers test and refine advanced technology solutions before putting them into production at the manufacturing edge. In my team, we have a mix of computer scientists, like me, working together with mathematicians and mechanical engineers to help manufacturers use advanced technologies to gain new insights from machine, product, and manufacturing data.

Closing the edge AI insight gap starts and ends with people 

Manufacturers of all sizes are looking to develop AI models they can use at the edge to translate their data into something that’s helpful to engineers and adds value to the business. Most of our AI efforts are focused on creating a more transparent shop floor, with automated, AI-driven insights that can:

- Enable faster and more accurate quality assessment
- Reduce the time it takes to find and address process problems
- Deliver predictive maintenance capabilities that reduce downtime

However, AI at the manufacturing edge introduces some unique challenges. IT teams are used to deploying solutions that work for a lot of different general use cases, while operational technology (OT) teams usually need a specific solution for a unique problem. For example, the same architecture and technologies can enable AI at the manufacturing edge for various use cases, but more often than not, the way to extract data from edge OT devices and systems that move their data into the IT systems is unique for each case. 

Unfortunately, when we start a project, there usually isn’t an existing interface for getting data out of OT devices and into the IT system that is going to process it. And each OT device manufacturer has its own systems and protocols. In order to take a general IT solution and transform it into something that can answer specific OT needs, IT and OT teams must work together at the device level to extract meaningful data for the AI model. This requires IT to start speaking the language of OT, developing a deep understanding of the challenges OT faces daily, so the two teams can work together. In particular, it requires clear communication about how responsibilities are divided between the two domains, and a commitment to common goals.
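What such a device-level bridge looks like varies by vendor and protocol, but as one hedged illustration, the sketch below subscribes to machine telemetry over MQTT, a protocol commonly used on factory floors (the article does not name one), and hands each reading to the IT side. The broker address, topic layout, and payload fields are hypothetical.

```python
# Minimal sketch of one OT-to-IT bridge: subscribe to machine telemetry
# over MQTT and hand each reading to the IT side for processing.
# The broker address, topic layout, and payload format are hypothetical;
# in practice each device maker's protocol (OPC UA, PROFINET, proprietary)
# needs its own adapter.
import json
import paho.mqtt.client as mqtt

BROKER = "factory-broker.local"     # hypothetical on-premises broker
TOPIC = "plant1/line3/+/telemetry"  # one topic per machine on line 3

def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)
    # Hand off to the IT side (data lake, stream processor, AI pipeline).
    print(f"{msg.topic}: temp={reading['temp_c']} vibration={reading['vib_mm_s']}")

client = mqtt.Client()  # paho-mqtt 1.x callback API
client.on_message = on_message
client.connect(BROKER, 1883)
client.subscribe(TOPIC)
client.loop_forever()
```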

Simplifying data insights at the manufacturing edge

Once IT and OT can work together to successfully get data from OT systems to the IT systems that run the AI models, that’s just the beginning. A challenge I see a lot in the industry is when an organization still uses multiple use-case-specific architectures and pipelines to build their AI foundation. The IT systems themselves often need to be upgraded, because legacy systems can’t handle the transmission needs of these very large data sets. 

Many of the companies we work with throughout our various research communities, industry consortia or conferences, such as WBAICNAP or AWK2023 — especially the small to medium manufacturers — ask us specifically for technologies that don’t require highly specialized data scientists to operate. That’s because manufacturers can have a hard time justifying the ROI if a project requires adding one or more data scientists to the payroll. 

To answer these needs, we develop solutions that manufacturers can use to get results at the edge as simply as possible. As a mechanical engineering institute, we’d rather not spend a lot of time doing research about infrastructure and managing IT systems, so we often seek out partners like Dell Technologies, who have the solutions and expertise to help reduce some of the barriers to entry for AI at the edge.

For example, when we did a project that involved high-frequency sensors, there was no product available at the time that could deal with our volume and type of data. We were working with a variety of open-source technologies to get what we needed, but securing, scaling, and troubleshooting each component led to a lot of management overhead.

We presented our use case to Dell Technologies, and they suggested their Streaming Data Platform. This platform reminds me of the way the smartphone revolutionized usability in 2007. When the smartphone came out, it had a very simple and intuitive user interface so anyone could just turn it on and use it without having to read a manual. 

The Streaming Data Platform is like that. It reduces friction to make it easier for people who are not computer scientists to capture the data flow from an edge device without having technical expertise in these systems. The platform also makes it easy to visualize the data at a glance, so engineers can quickly achieve insights.

When we applied it to our use case, we found that it deals with these data streams very naturally and efficiently, and it reduced the amount of time required to manage the solution. Now, developers can focus on developing the code, not dealing with infrastructure complexities. By reducing the management overhead, we can use the time saved to work with data and get better insights.
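To give a feel for what working with data rather than infrastructure means in practice, here is a minimal pandas sketch of the analysis layer once a high-frequency stream has been captured: it collapses kilohertz samples into per-second aggregates an engineer can chart at a glance. The signal name, rates, and data are invented, and this is not the Streaming Data Platform's API.

```python
# Minimal sketch of the "insight" layer over a captured high-frequency
# stream: downsample 1 kHz samples into one-second aggregates and chart
# them. The signal name and data are invented for illustration.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
raw = pd.Series(
    rng.normal(1.0, 0.05, 60_000),
    index=pd.date_range("2023-01-01", periods=60_000, freq="ms"),
    name="vibration_mm_s",
)

# Collapse millisecond samples into per-second aggregates engineers can scan.
summary = raw.resample("1s").agg(["mean", "max"])

summary.plot(title="Spindle vibration (1 s aggregates)")
plt.show()
```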

The future of AI at the manufacturing edge

With all of this said, one of the biggest challenges I see overall with AI for edge manufacturing is recognizing that AI insights are an augmentation to people and knowledge, not a replacement, and that people must work together in managing and analyzing that data to ensure the end goal of producing business insights that serve a particular problem is met.

When manufacturers piece together many different solutions to find insights, it might work, but it’s unnecessarily difficult. There are technologies available today that can remedy these challenges; it’s just a matter of finding and evaluating them. We’ve found that the Dell Streaming Data Platform can capture data from edge devices, analyze the data using AI models in near real time, and feed insights back to the business to add value that benefits both IT and OT teams.

Learn more

If you are interested in current challenges, trends and solutions to empower sustainable production, find out more at the AWK2023 where more than a thousand participants from production companies all around the world come together to discuss solutions for green production.

Find out more about AI at the manufacturing edge solutions from Dell Technologies and Intel.  

***

Intel® Technologies Move Analytics Forward

Data analytics is the key to unlocking the most value you can extract from data across your organization. To create a productive, cost-effective analytics strategy that gets results, you need high performance hardware that’s optimized to work with the software you use.

Modern data analytics spans a range of technologies, from dedicated analytics platforms and databases to deep learning and artificial intelligence (AI). Just starting out with analytics? Ready to evolve your analytics strategy or improve your data quality? There’s always room to grow, and Intel is ready to help. With a deep ecosystem of analytics technologies and partners, Intel accelerates the efforts of data scientists, analysts, and developers in every industry. Find out more about Intel advanced analytics.

IT Leadership

Over the past 184 years, The Procter & Gamble Co. (P&G) has grown to become one of the world’s largest consumer goods manufacturers, with worldwide revenue of more than $76 billion in 2021 and more than 100,000 employees. Its brands are household names, including Charmin, Crest, Dawn, Febreze, Gillette, Olay, Pampers, and Tide.

In summer 2022, P&G sealed a multiyear partnership with Microsoft to transform P&G’s digital manufacturing platform. The partners say they will create the future of digital manufacturing by leveraging the industrial internet of things (IIoT), digital twin, data, and AI to bring products to consumers faster and increase customer satisfaction, all while improving productivity and reducing costs.

“The main purpose of our digital transformation is to help create superior solutions for daily problems of millions of consumers around the world, while generating growth and value for all stakeholders,” says Vittorio Cretella, CIO of P&G. “We do that by leveraging data, AI, and automation with agility and scale across all dimensions of our business, accelerating innovation and increasing productivity in everything we do.”

The digital transformation of P&G’s manufacturing platform will enable the company to check product quality in real-time directly on the production line, maximize the resiliency of equipment while avoiding waste, and optimize the use of energy and water in manufacturing plants. Cretella says P&G will make manufacturing smarter by enabling scalable predictive quality, predictive maintenance, controlled release, touchless operations, and manufacturing sustainability optimization. These things have not been done at this scale in the manufacturing space to date, he says.

Smart manufacturing at scale

The company has already undertaken pilot projects in Egypt, India, Japan, and the US that use Azure IoT Hub and IoT Edge to help manufacturing technicians analyze insights to create improvements in the production of baby care and paper products.

For instance, the production of diapers involves assembling many layers of material at high speed with great precision to ensure optimal absorbency, leak protection, and comfort. The new IIoT platform uses machine telemetry and high-speed analytics to continuously monitor production lines to provide early detection and prevention of potential issues in the material flow. This, in turn, improves cycle time, reduces network losses, and ensures quality, all while improving operator productivity.
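P&G has not published its detection logic, but the early-warning pattern described here is classic statistical process control. As a hedged sketch, the code below applies an exponentially weighted moving average (EWMA) chart to one invented telemetry channel and flags a slow drift well before it breaches spec; the signal, target, and limits are illustrative.

```python
# Minimal sketch of early detection on machine telemetry: an EWMA chart
# flags a slow drift in a material-tension signal before it breaches spec.
# The signal name, target, and control limits are invented for illustration.
import numpy as np

rng = np.random.default_rng(2)
target, sigma, lam = 10.0, 0.2, 0.1           # target tension, noise, EWMA weight
readings = target + rng.normal(0, sigma, 500)
readings[300:] += np.linspace(0, 0.6, 200)    # slow drift starting at sample 300

limit = 3 * sigma * np.sqrt(lam / (2 - lam))  # steady-state EWMA control limit
ewma = target
for i, x in enumerate(readings):
    ewma = lam * x + (1 - lam) * ewma
    if abs(ewma - target) > limit:
        print(f"Early warning at sample {i}: EWMA={ewma:.2f}")
        break
```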

P&G is also piloting the use of IIoT, advanced algorithms, machine learning (ML), and predictive analytics to improve manufacturing efficiencies in the production of paper towels. P&G can now better predict finished paper towel sheet lengths.

Smart manufacturing at scale is a challenge. It requires taking data from equipment sensors, applying advanced analytics to derive descriptive and predictive insights, and automating corrective actions. The end-to-end process requires several steps, including data integration and algorithm development, training, and deployment. It also involves large amounts of data and near real-time processing.

“The secret to scale is to lessen complexity by providing common components at the edge and in the Microsoft cloud that engineers can work with to deploy diverse use cases into a specific manufacturing environment — without having to create everything from scratch,” Cretella says.

Using Microsoft Azure as the foundation, Cretella says P&G will now be able to digitize and integrate data from more than 100 manufacturing sites around the world and enhance AI, ML, and edge computing services for real-time visibility. In turn, this will enable P&G employees to analyze production data and leverage AI to support decisions that drive improvement and exponential impact.

“Accessing this level of data, at scale, is rare within the consumer goods industry,” Cretella says.

Data and AI as digital fundamentals

P&G took the first steps in its AI journey more than five years ago. It has moved past what Cretella calls the “experimentation phase” with scaled solutions and increasingly sophisticated AI applications. Data and AI have since become central to the company’s digital strategy.

“We leverage AI across all dimensions of our business to predict outcomes and increasingly to prescribe actions through automation,” Cretella says. “We have applications in our product innovation space, where thanks to modelling and simulation we can shorten the lead time to develop a new formula from months to weeks, and in the way we engage and communicate with our consumers, using AI to deliver brand messages to each of them at the right time, on the right channel, and with the right content.”

P&G also uses predictive analytics to help ensure the company’s products are available at retail partners “where, when, and how consumers shop for them,” Cretella says, adding that P&G engineers also use Azure AI to ensure quality control and equipment resilience on the production line.

While P&G’s recipe for scale relies on technology, including investment in a scalable data and AI environment centered on cross-functional data lakes, Cretella says P&G’s secret sauce is the skills of hundreds of talented data scientists and engineers who understand the company’s business inside and out. To that end, P&G’s future is about embracing automation of AI, which will allow its data engineers, data scientists, and ML engineers to spend less time on manual, labor-intensive tasks so they can focus on the areas where they add value.

“Automation of AI also allows us to deliver with consistent quality and to manage bias and risk,” he says, adding that automating AI will also “make these capabilities accessible to an increasingly larger number of employees, thus making the benefits of AI pervasive across the company.”

The power of people

Another element to achieving agility at scale is P&G’s “composite” approach to building teams in the IT organization. P&G balances the organization between central teams and teams embedded in its categories and markets. The central teams create enterprise platforms and technology foundations, while the embedded teams use those platforms and foundations to build digital solutions that address their units’ specific business opportunities. Cretella also notes that the company prioritizes insourcing talent, especially in areas such as data science, cloud management, cybersecurity, software engineering, and DevOps.

To accelerate P&G’s transformation, Microsoft and P&G have created a Digital Enablement Office (DEO) staffed by experts from both organizations. The DEO will serve as an incubator to create high-priority business scenarios in the areas of product manufacturing and packaging processes that P&G can implement across the company. Cretella considers it more of a project management office than a center of excellence.

“It coordinates all the efforts of the different innovation teams that work on business use cases and ensures an efficient, scaled deployment of the proven solutions they develop,” he says.

Cretella has some advice for CIOs trying to drive digital transformation in their organizations: “First, be driven and find your energy in the passion for the business and how to apply technology to create value. Second, be equipped with tons of learning agility and genuine curiosity to learn. Last, invest in people — your teams, your peers, your bosses — because technology alone does not change things; people do.”

Artificial Intelligence, Digital Transformation, Manufacturing Industry

The manufacturing industry is experiencing its “fourth industrial revolution,” with manufacturers focused on leveraging IT to stay competitive and meet the demand for digital services that can enhance their physical wares. Sensors, AI, and robotics are key Manufacturing 4.0 technologies fueling data strategies aimed at identifying inefficiencies, streamlining processes, and improving the ability to forecast and predict industry trends.

Because of this, IT professionals are in high demand in the manufacturing industry — even more so as supply chain issues persist and manufacturers consider bringing more of their operations back onshore. Demand has increased so much that IT job postings in manufacturing doubled between May 2021 and 2022, according to Dice.com, with increased demand for skills such as agile development, Python, software development, automation, C++, SQL, and Java, among others.

If you’re an IT pro looking to break into the manufacturing industry, or a manufacturing IT leader wanting to know where hiring will be the most competitive, here are nine of the most in-demand tech jobs in manufacturing, according to data from Dice.

Software engineer

The demand for software in manufacturing has only increased, as nearly every piece of equipment or hardware is now connected to the internet in some form — and this has also increased the demand for software engineers. In this role, you’ll be expected to design, code, debug, improve, and maintain software to support the organization. You may be put to work on automation, modernization, equipment installation, equipment support, or designing software to meet business needs. Software design and implementation can sometimes take years, depending on what’s being developed, so it’s a pivotal role for ensuring a company stays on track and on budget with digital transformation.

The average salary for a software engineer is $119,593 per year, with a reported salary range of $88,000 to $177,000 per year, according to data from Glassdoor.

Principal software engineer

Principal software engineers are typically responsible for managing large-scale, high-level projects that may require larger teams or longer lead times, and it’s one of the highest levels software engineers can reach. In this role, you’ll need to manage and oversee the technical aspects of the organization’s biggest projects and initiatives. Duties vary depending on the type of manufacturing, but these tech pros are often tasked with implementing plans developed by software architects and managing a team of engineers to code and script software. It’s a technical role that also requires a level of soft skills such as leadership, communication, and analytical skills.

The average salary for a principal software engineer is $169,453 per year, with a reported salary range of $128,000 to $235,000 per year, according to data from Glassdoor.

Systems engineer

Systems engineers focus solely on the systems and infrastructure in an organization: identifying areas for improvement, designing new solutions, and advising the company on the best hardware or software to meet a client’s requirements. You’ll be tasked with ensuring that the systems in an organization are always available, reliable, and well-maintained, as well as troubleshooting any problems or issues that arise. In manufacturing, systems engineers are typically expected to focus on process flows, identifying issues in development processes, developing management control systems, implementing quality control procedures, and working directly with clients to better understand requirements and needs.

The average salary for a systems engineer is $110,245 per year, with a reported salary range of $82,000 to $158,000 per year, according to data from Glassdoor.

Principal systems engineer

Principal systems engineer is a high-level engineering role for designing and implementing complex computer systems across a variety of teams. These tech pros work closely with software developers, hardware engineers, and other tech professionals to ensure that the company’s products are up to industry standards and address customer needs. It’s the highest level for a systems engineer, and you’ll typically act as a supervisor, overseeing the day-to-day operations and performance of servers, storage, and network infrastructures. You’ll also be responsible for managing the building, patching, testing, and deployment of systems and platforms, ensuring that they meet clients’ needs.

The average salary for a principal systems engineer is $169,453 per year, with a reported salary range of $128,000 to $235,000 per year, according to data from Glassdoor.

Embedded software engineer

Embedded software engineers are responsible for designing and developing software for embedded devices and systems. They are typically expected to work on systems and software designed for specific tasks. Depending on the role, you may also be expected to work on the entire system that the embedded software functions within, in order to test it to see whether it works properly. But overall, it’s a role that requires a very narrow focus and designing software solutions to meet unique or specific needs in the organization.

The average salary for an embedded software engineer is $114,884 per year, with a reported salary range of $88,000 to $166,000 per year, according to data from Glassdoor.

Data scientist

Data scientists are critical in manufacturing because they help businesses collect, manage, and store relevant data for customers, clients, products, and services. It’s a role that helps manufacturing companies identify areas for process improvements, potential risks, and waste that can be eliminated in various processes. Job listings typically ask for skills such as machine learning, AI, SQL, Python, AWS, and the ability to work with relational databases, analyze and model engineering data, and identify emerging technologies that will help the company meet its goals.

The average salary for a data scientist is $122,004 per year, with a reported salary range of $90,000 to $176,000 per year, according to data from Glassdoor.

Software developer

As the manufacturing industry increasingly relies on technology and software, there’s a big demand for software developers. Software developers are expected to recommend software programs to help address manufacturing needs, run software tests on internal computer programs, modify open-source code to suit business needs, or design and develop custom software for the organization. In manufacturing, software developers are tasked with working on software for internal and external clients to manage projects, suppliers, supply chains, data analysis, and smart technologies for products.  

The average salary for a software developer is $111,729 per year, with a reported salary range of $78,000 to $181,000 per year, according to data from Glassdoor.

Business analyst

In the manufacturing industry, business analysts are responsible for using data and analytics to help the business make decisions and to support digital transformation. Business analysts are expected to perform requirement analysis, document processes and communicate analysis insights to various departments and leadership. As a business analyst, you’ll also be expected to recommend new or emerging technology and processes to improve automation, and to stay on top of the latest process and IT advancements to automate and modernize systems.

The average salary for a business analyst is $97,744 per year, with a reported salary range of $70,000 to $155,000 per year, according to data from Glassdoor.

DevSecOps engineer

DevSecOps is the intersection of development, security, and operations — it’s an expansion on DevOps, with an added priority on security. DevSecOps engineers are responsible for monitoring processes, conducting risk analysis, automating security control, managing incidents and security protocol, maintaining internal and external systems, and implementing safety practices within the organization. It’s a role that requires strong communication, leadership, and administrative skills as well as hard skills such as Python, Java, C++, Ruby, DAST, SAST, and modeling tools such as Rhapsody and SysML.

The average salary for a DevSecOps engineer is $120,117 per year, with a reported salary range of $89,000 to $169,000 per year, according to data from Glassdoor.

Careers, Hiring, IT Jobs, Manufacturing Industry

The manufacturing industry has fully entered the digital era, with the most digitally advanced manufacturers integrating their information technology and operational technology environments to give themselves an edge in the marketplace.

They’re also advancing their use of automation and analytics to streamline back-office functions, logistics, and production, and to optimize resources, reduce costs, and identify new opportunities, industry analysts say.

In its 2022 Digital Factory Update, ABI Research predicted that spending on smart manufacturing would grow at a 12% compound annual growth rate, from $345 billion in 2021 to more than $950 billion in 2030 — all to support digital transformation initiatives.

The research firm further predicted that manufacturers would continue to increase their spending on analytics, collaborative industrial software, and wireless connectivity in upcoming years, with factories increasingly adopting Industry 4.0 solutions, including autonomous mobile robots, asset tracking, simulation, and digital twins.

Michael Larner, a research director at ABI Research, says technology innovation will remain instrumental for manufacturers’ success as they confront myriad challenges, such as escalating energy costs, supply chain disruptions, staffing shortages, and the need to optimize resources.

“We’re seeing a move away from basic automation to integrating IT and OT teams, optimizing using analytics to make sure things are as efficient as possible, and to do things like predictive maintenance and proactive monitoring,” Larner says.

Consequently, manufacturers now prioritize IT investments that create and extend their “digital thread,” in which applications within their IT architecture talk to one another, with data from one system informing and directing action in others, Larner says.

“Manufacturers appreciate they cannot just react to events but need to be proactive,” he adds. “A successful CIO needs to be equally at home on the factory floor understanding issues and influencing change as in the boardroom talking strategically and obtaining project funds.”

CIO.com recently recognized the following 10 manufacturers as part of the CIO 100 Awards for IT Innovation and Leadership. Here is a look at how they are capitalizing on the value of IT.

Avery Dennison brings intelligence to its supply chain

Nicholas Colisto, VP and CIO, Avery Dennison

Avery Dennison

Organization: Avery Dennison

Project: Advanced Planning System (APS) for Operational Production Planning and Detailed Scheduling

IT Leader: Nicholas Colisto, VP and CIO

Like many manufacturers, Avery Dennison saw the need to strategically use technology to add more agility, efficiency, and speed to its increasingly large and complex supply chain.

In response, Avery Dennison IT partnered with the company’s global supply chain operations to create the Advanced Planning System (APS) for Operational Production Planning and Detailed Scheduling, a system that provides granular insights into supply issues and constraints, giving the supply chain team the tools needed to make effective and timely decisions.

APS gathers, captures, and combines all relevant inputs and outputs of supply chain events in near real-time to form a digital twin of the supply chain. The system’s easy-to-use interface enables the centralized team to monitor and take immediate action on events.

Working with the supply chain function to define and align organizational priorities and needs, Avery Dennison IT set about creating a mathematical model for establishing short-, mid-, and long-term operational plans. IT also automated a range of tasks, using intelligent optimization algorithms to identify anomalies, interpret impacts on downstream supply chain actors, and communicate that information to relevant stakeholders — thereby enabling the supply chain team to quickly respond to and remediate issues.
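Avery Dennison has not disclosed its model, but the flavor of this kind of planning optimization is easy to sketch: a small linear program that allocates production across machines against demand and capacity constraints. All products, costs, and capacities below are invented for illustration.

```python
# Minimal sketch of the planning-optimization flavor behind an APS:
# allocate two products across two machines to minimize cost while
# meeting demand and respecting machine capacity. All numbers are invented.
from scipy.optimize import linprog

# Decision variables: x = [p1_m1, p1_m2, p2_m1, p2_m2] (units produced).
cost = [2.0, 2.5, 3.0, 2.2]  # per-unit production cost by product/machine

# Demand: each product's output across machines must meet its demand.
A_eq = [[1, 1, 0, 0],
        [0, 0, 1, 1]]
b_eq = [100, 80]  # units demanded of product 1 and product 2

# Capacity: hours per unit on each machine must fit the available shift.
A_ub = [[0.5, 0, 0.4, 0],   # machine 1 hours per unit
        [0, 0.6, 0, 0.3]]   # machine 2 hours per unit
b_ub = [60, 60]             # available hours per machine

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq)
print(res.x, res.fun)  # optimal allocation and total cost
```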

“This system enables our supply chain function to gain insights into the constraints and issues within the different assets of the supply chain, giving us all the tools in a single place to make effective and timely decisions,” says Vice President and CIO Nicholas Colisto. “Together, our new robust data lake, predictive analytics, and digital-twin capabilities are helping Avery Dennison to optimize the utilization of our assets, avoid downtime, and provide better visibility and analysis of the complete supply chain, allowing the organization to respond quickly to disruptions and deviations.”

CoorsTek streamlines production operations

Matt Mehlbrech, VP of IT, CoorsTek

CoorsTek

Organization: CoorsTek

Project: Model Plant Implementation

IT Leader: Matt Mehlbrech, VP of IT

Production operators at CoorsTek had an efficiency issue. To report on production, monitor machine performance, record quality readings, and search and retrieve documents, they had to use multiple computer systems and paper-based processes, rendering their work inefficient and prone to errors due to the lack of integration and data validation.

To remedy this, and create a foundation for future transformation, CoorsTek IT developed Model Plant, an integrated systems strategy focused on increasing production operator efficiency, providing key metrics to manufacturing management, and preparing CoorsTek for a future ERP implementation.

At its core is a manufacturing operations management (MOM) system, which combines manufacturing execution and maintenance management in a single integrated platform. As envisioned, Model Plant will integrate this MOM system with the company’s ERP, quality management, and machine connectivity systems into a single console for CoorsTek production operators, helping them stay focused on producing high-quality ceramic parts.

IT designed, developed, and implemented Model Plant at six CoorsTek plants in under 18 months, with the environment already improving production operators’ efficiency. Implementation is ongoing, with IT seeing it as foundational to a smooth transition to a new ERP.

“Our Model Plant systems enable real-time visibility of our entire shop floor and create a comprehensive data set that we can analyze and learn from to unlock tangible productivity and quality gains,” says Matt Mehlbrech, the company’s vice president of IT.

Dow digitizes its manufacturing facilities

Melanie Kalmar, corporate vice president, CIO, and CDO, Dow

Dow

Organization: Dow

Project: Dow Digital Manufacturing Acceleration (DMA) Program

IT Leader: Melanie Kalmar, corporate vice president, CIO, and CDO

Dow in fall 2020 launched its Digital Manufacturing Acceleration (DMA) program with the goal of accelerating the deployment of digital technologies across its global manufacturing and maintenance areas.

One program deliverable to date is a private, secure, high-speed cellular network that is fully within the enterprise network. This network provides real-time access to data and collaboration tools, thereby enabling employees to extend their work beyond the traditional office out into the manufacturing plant environment.

DMA also includes a secure cloud for hosting, integrating, and contextualizing data for employees through purpose-built applications, and a mobile-first platform where employees can access their data and actionable insights.

Through such deliverables, DMA correlates and integrates manufacturing data and analysis that employees can use to make better, faster decisions. Capabilities delivered through the program also help improve asset reliability, safety, quality performance, and operational efficiency while reducing operating costs and unplanned events.

Together, DMA solutions create an improved employee experience, a digitally enabled workforce, and a resilient manufacturing organization by integrating and contextualizing manufacturing data from engineering, operations, maintenance, logistics, ERP and related ecosystem data sources, and by delivering data and insights from multiple systems through a single, user-centric interface.

“Dow’s DMA’s success would not be possible without the outstanding collaboration between our information systems and manufacturing teams. By deepening our understanding of what our manufacturing teams need to safely and reliably meet our customers’ needs and what IS can deliver for manufacturing, we have been able to arrive at these high-performing DMA solutions,” says Corporate Vice President, CIO, and CDO Melanie Kalmar.

Dow’s DMA solutions are in use at the company’s largest facility, where they’re already generating value through increased manufacturing asset utilization, fewer unplanned events and production outages, higher production volumes, and other quantifiable key performance indicators. Implementation of DMA solutions at other Dow facilities is ongoing.

Analytics proves key to Eastman materials innovation

Organization: Eastman Chemicals

Project: Fluid Genius Digital Product

IT Leader: Aldo Noseda, vice president and CIO

Eastman Chemicals in 2021 launched Fluid Genius, a patent-pending digital platform that enables manufacturing plant workers to monitor issues that could impact operations, safety, yield, and maintenance budgets.

Fluid Genius can monitor, analyze, and extend the life of Eastman’s customers’ heat transfer fluid. The technology can predict fluid life expectancy and advise how best to extend it while also avoiding unplanned manufacturing shutdowns.

Fluid Genius also provides forward-looking insights, which allows plant maintenance engineers and operations managers to plan the optimal time for maintenance, thereby minimizing risk and costs. Fluid Genius’s recommendation engine also helps plant engineers better understand the factors impacting the quality of their heat transfer fluid, which in turn helps them operate their plants safely and inform budgeting for future maintenance needs.

To create the application, Eastman drew on nearly 50 years of sample analysis data from operating systems to derive in-depth insights into fluid chemistry behaviors over time. The company combined that data with end-user input from its maintenance and incident logs, applying advanced artificial intelligence and machine learning techniques to build the platform’s proprietary fluid analytics.

“Digital products and services like Fluid Genius are critical to Eastman becoming a leading material innovation company because of the value they provide to our customers and how they differentiate us from competitors,” says Eastman VP and CIO Aldo Noseda.

Eastman has already seen strong uptake in use of this platform, and company officials note that the platform benefits its customer acquisition and retention efforts.

General Motors turns to IT to navigate semiconductor shortage

Organization: General Motors

Project: Semiconductor Shortage: Protecting GM Revenue

IT Leader: Fred Killeen, CIO and VP of Global IT

Like all modern vehicle makers, General Motors relies on semiconductors to build its products; in fact, a typical vehicle relies on more than 3,000 chips within its electronic control units (ECUs) to run essential vehicle functions.

As a result, General Motors faced the possibility of shutting down production when a global shortage of semiconductors arose in 2021. But thanks to an innovative technology that enabled GM to build vehicles without key electronic components and park them in lots for completion later when semiconductors became available, factories in North America could keep running.

The IT-developed solution — an industry first — essentially extended the secure plant network into vehicle storage locations, some of which were hundreds of miles away from manufacturing sites in the United States and Mexico.

This allowed repair teams, who were equipped with GM standard test tools configured with mobile printers and portable hot spots, to connect to the GM network and company systems to validate parts, complete repairs, download engine control software to ECUs, and test the vehicles. Completed vehicles were driven or returned to plants on carriers for the final stages of quality testing before being shipped to dealerships and customers.

The IT team also built tools to collect data about chips and suppliers and to provide that data on demand, enabling GM to route chips to suppliers who produce ECUs for its most popular vehicles.

Additionally, business intelligence and analytics teams developed tools and reporting features to manage the ECU repair process and prioritize fixes based on how long a vehicle had been sitting in a lot and its impact on GM revenue. These teams also developed a unified view that displays all of the data in one place.
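The article names the two prioritization criteria but not the logic behind them. As a rough illustration only, a repair queue could be ranked on those two factors as in the sketch below; the class, field names, and equal weighting are all hypothetical, not GM's actual tooling.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class PendingRepair:
    vin: str                  # vehicle awaiting its missing ECUs
    parked_since: date        # when the vehicle entered the storage lot
    revenue_at_stake: float   # estimated revenue tied up in the vehicle (USD)

def prioritize(repairs: list[PendingRepair], today: date) -> list[PendingRepair]:
    # Rank fixes by how long each vehicle has waited and by the revenue
    # at stake, the two criteria described above; the weighting is a guess.
    return sorted(
        repairs,
        key=lambda r: ((today - r.parked_since).days, r.revenue_at_stake),
        reverse=True,
    )
```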

“Working with teams across the organization, they delivered an industry-first technology solution that allowed us to build vehicles without some electronic components and complete them remotely as semiconductors became available — it’s a process that continues to prove valuable today,” says Fred Killeen, CIO and global VP of IT.

Oshkosh goes digital to optimize its supply chain

Organization: Oshkosh

Project: Enhanced Supply Chain Efficiency

IT Leader: Anupam Khare, senior vice president and CIO

Oshkosh had identified three issues within its supply chain function, which is tasked with acquiring the right parts at the right time to support the manufacturing of customized, premium products requiring sophisticated components.

First, there was a lack of parts visibility across Oshkosh’s supply network, which led to unexpected parts shortages, putting the company at risk for production delays and shutdowns. That visibility issue in turn had created inefficient logistics and inventory leveling processes, driving up costs. And then there was the company’s supply chain technology itself, which comprised disparate, unconnected systems.

To address those issues, Oshkosh’s digital technology team partnered with its supply chain function to develop an advanced digital platform complete with system-to-system integration, cohesive data environments, data enhancements, and machine learning. Together, the teams also created a series of advanced analytics solutions to establish a more connected and optimized supply chain.

The platform pulls together critical data sources across several systems and functional areas and serves as the foundation for agile delivery of additional digital products. For example, the digital technology team quickly deployed a parts shortage prediction model soon after the platform was created. The team also deployed multiple advanced analytics solutions to optimize inventory levels, logistics, and shipping costs.

Initiated in January 2021 and now fully deployed, the collection of technologies has already increased supply chain visibility, part shortage prediction accuracy, and operational efficiency, and has helped Oshkosh avoid stockouts and subsequent costly production delays and shutdowns.

“This project showed the importance of IT staying well-connected with business partners to anticipate opportunities/challenges early and to co-create value through digital solutions,” says Anupam Khare, senior vice president and CIO.

Otis takes smart elevator to new heights

Organization: Otis Elevator

Project: Otis ONE

IT Leader: Renee Zaugg, vice president and CIO (Zaugg has since left Otis.)

To bring more transparent, proactive, and predictive services to customers, Otis in mid-2020 started work on a global digital transformation program called Otis ONE.

Leveraging technologies such as IoT, big data, AI, mobile, and cloud, Otis ONE provides real-time visibility of elevator health, insights for predictive maintenance, and remote assistance and troubleshooting.

Otis ONE is tailored to deliver real-time data insights to a range of personas, from campus owners to field engineers to maintenance officers, in their day-to-day operations, empowering stakeholders to make more informed decisions and deliver predictive maintenance more effectively.

The Otis ONE architecture consists of three tiers: edge, platform, and enterprise. The edge tier relies on various gateway and sensor packages to collect a range of elevator data and send it to the cloud via a cellular network, where it is integrated with the platform tier through an IoT hub or event hub. The platform tier features real-time business processing fueled by a rules engine that analyzes data, immediately determines various conditions on the state of the elevator, and proactively notifies Otis workers and external customers in the event of anomalies.
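Otis doesn’t disclose how its rules engine is implemented. Purely as a minimal sketch of the idea, with hypothetical telemetry fields and thresholds, a rules engine of this kind evaluates each incoming event against a set of conditions and collects alerts for any that fire:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class ElevatorEvent:
    elevator_id: str
    door_cycle_ms: int     # duration of the last door open/close cycle
    vibration_rms: float   # vibration reading from the car sensor

# Each rule inspects one telemetry event and returns an alert message,
# or None if the reading is within normal bounds.
Rule = Callable[[ElevatorEvent], Optional[str]]

RULES: list[Rule] = [
    lambda e: "door cycle degrading" if e.door_cycle_ms > 6_000 else None,
    lambda e: "excessive vibration" if e.vibration_rms > 0.8 else None,
]

def evaluate(event: ElevatorEvent) -> list[str]:
    """Run every rule against an incoming event and collect any alerts,
    which would then be routed to Otis workers and external customers."""
    return [msg for rule in RULES if (msg := rule(event)) is not None]
```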

The Otis ONE enterprise tier brings together multiple applications that leverage the information coming from the edge and platform tiers, integrating it with elevator master and service data to give a 360-degree, real-time view of elevators around the globe.

“The connected elevator creates value for our customers with greater equipment uptime. Otis ONE enhances transparency to our customers, productivity of our own teams in prediction and proactiveness,” says Ezhil Nanjappan, executive director and CTO, adding that the Otis ONE ecosystem serves as a “foundation for any smart cities and smart buildings.”

Owens Corning leverages low-code to improve plant safety

Organization: Owens Corning

Project: Low-Code Digital Platform for High Impact to Plant Safety and Compliance

IT Leader: Steve Zerby, senior vice president and CIO

Owens Corning must monitor and manage the vapors from asphalt tanks for both safety and regulatory compliance reasons. But it found its process for doing so was ineffective and slow. Data was collected in offline databases, and was plant-specific and difficult to share across the company. In addition, the analysis required to uncover potential hazards took days.

So, in August 2021, Owens Corning’s asphalt business function and IT teamed up to co-create a digital platform that could transform and improve the effectiveness of that process through improved data collection and actionable analytics.

Low-code application development technology enabled the two teams to quickly create a proof of concept, gather feedback, and fine-tune the platform. Within three weeks, a pilot was delivered to a plant, and the platform itself was implemented across 20-plus plants in only three months.

The platform provides a single source of truth for loss prevention data, with proactive monitoring and analytics that alert plant leaders with real-time insights to ensure hazard prevention.

The company has since seen measurable improvements in its loss prevention program. Hazards are now identified and addressed in real time, ensuring more effective personnel safety and regulatory compliance reporting. That visibility has also enabled more proactive preventive maintenance of equipment, minimizing unplanned production outages. And, thanks to the platform’s analytics capabilities, insights are generated in minutes instead of days, enabling the company to make informed decisions quickly.

“Digitizing sensor measure data taken from tanks, integrating other data points, and providing easy-to-use visuals and analytics empowered plant operators. They can quickly assess potential hazards and risks in real-time and proactively take preventive actions,” says Steve Zerby, senior vice president and CIO, noting that a key element of success was having “a dedicated business product owner who is passionate about bringing others along in using the tools.”

Rockwell Automation launches customer-centric transformation

Organization: Rockwell Automation

Project: Rockwell Automation Drives Innovation, Customer Centricity and Moves Towards a Data-Driven Operating Model Through its Enterprise Transformation Office

IT Leader: Chris Nardecchia, SVP and chief information and digital officer

As Rockwell Automation works to help customers accelerate their digital transformations, the company has likewise embarked on its own DX journey. In doing so, it is creating new customer experiences, transforming business models, and empowering workforce innovation.

Launched in August 2020, the initiative is helping to build a data-driven operating model that is agile and centered around the customer so it can both respond to changing needs and create lasting customer connections instead of one-time transactional product sales.

To do that, the Enterprise Transformation Office (ExO) mapped out end-to-end business processes and customer experience needs for the target state and prioritized the investments required to deliver a holistic customer experience.

The office is moving Rockwell Automation to a data-driven operating model, in which it will execute more than two dozen product development cycles annually and use customer information to shape those products. Additionally, the ExO is working to transform business models, processes, software, and service portfolios to deliver more subscription-based offerings to customers.

Rockwell Automation is using telemetry to gather data and insights as well as creating a comprehensive, unified view of customer data to enhance the company’s customer engagements. It’s updating processes and systems to support its future state. It’s also leveraging data gathered from other initiatives to innovate products, grow the market, and grow the business. And it has implemented a structured approach for governing its transformation roadmap.

“This program is critical to realizing our vision of reinventing our products to outcomes, reimagining our customers’ experience, and redefining our operating model towards subscriptions and annual recurring revenue,” says Senior Vice President and Chief Information and Digital Officer Chris Nardecchia.

Schneider Electric secures against rise in OT cyberattacks

Organization: Schneider Electric

Project: Cybersecurity Connected Service Hub

IT Leader: Elizabeth Hackenson, SVP and CIO

In response to the growing number and sophistication of cyberattacks directed at operational technology (OT), Schneider Electric in 2020 created a cybersecurity OT operating model aimed at optimizing the cybersecurity performance of its 220 manufacturing plants and 35 distribution centers around the globe.

The model establishes a single Security Operations Center (SOC) for all Schneider IT and OT. It also establishes a threat detection platform at each plant that raises security alerts to the SOC via an interface with the security information and event management (SIEM) system.
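As a hedged illustration of that plant-to-SOC flow, the sketch below packages a detection as a structured syslog message bound for a SIEM. The field names, message format, and plain UDP transport are assumptions made for illustration, not Schneider Electric’s actual interface; a production deployment would use the SIEM vendor’s connector over an authenticated, encrypted channel.

```python
import json
import socket
from datetime import datetime, timezone

def forward_alert(siem_host: str, plant: str, device: str,
                  severity: int, detail: str, port: int = 514) -> None:
    """Package a plant-level detection as a structured syslog message
    and send it to the SIEM over UDP (illustrative transport only)."""
    event = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "site": plant,
        "device": device,
        "severity": severity,  # e.g., 1 (informational) through 5 (critical)
        "detail": detail,
    }
    message = f"<134>OT-IDS {json.dumps(event)}"  # <134> = local0.info priority
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(message.encode("utf-8"), (siem_host, port))
```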

The operating model also features the Cybersecurity Connected Service Hub (CSH), a 24/7 global team to support the company’s industrial sites in every type of cybersecurity event. Its primary responsibilities are to identify and baseline cybersecurity posture of OT and IT devices; detect and remediate security alerts; and monitor for and remediate vulnerabilities.

The CSH is also charged with driving continuous improvement, progressively strengthening the company’s cybersecurity posture across OT devices and processes.

The CSH team receives alerts from the SOC, analyzes them, and defines how each must be remediated. It then works with cybersecurity site leaders (CSLs) to carry out that remediation. CSLs are OT experts trained in cybersecurity; one is assigned to each plant and is accountable for the cybersecurity of that plant’s OT, able to remediate a cyber incident and manage business continuity.

The CSH also provides cybersecurity training for all CSLs. It builds and promulgates standard operating procedures (SOPs) for each type of security event and vulnerability. It also monitors cybersecurity KPIs in all plants, remediating any gaps it identifies in conjunction with the CSL.

“The creation of the Cybersecurity Connected Services Hub is a milestone in our continuous journey towards greater resilience and cybersecurity at Schneider Electric,” says SVP and CIO Elizabeth Hackenson.

At the Laboratory for Machine Tools and Production Engineering (WZL) of RWTH Aachen University, scientists, mathematicians, and software developers conduct manufacturing research, working together to gain new insights from machine, product, and manufacturing data. Manufacturers partner with the team at WZL to refine solutions before putting them into production in their own factories. 

Recently, WZL has been looking for ways to help manufacturers analyze changes in processes, monitor output and process quality, then adjust in real-time. Processing data at the point of inception, or the edge, would allow them to modify processes as required while managing large data volumes and IT infrastructure at scale.

Connected devices generate huge volumes of data

According to IDC, the amount of digital data worldwide will grow at a 23% compound annual rate through 2025, driven in large part by the rising number of connected devices. Juniper Research projects that the total number of IoT connections will reach 83 billion by 2024, a projected 130% increase from 35 billion connections in 2020.

WZL is no stranger to this rise in data volume. As part of its manufacturing research, WZL’s fine blanking presses generate massive amounts of data that must first be recorded at the machine and then processed extremely quickly. Specialized sensors for vibrations, acoustics, and other manufacturing conditions can generate more than 1 million data points per second.

Traditionally, WZL’s engineers have processed small batches of this data in the data center, but gaining insights that way could take days or even weeks. They wanted a solution that would let them implement and use extremely low-latency streaming models to garner insights in real time, without much in-house development.

Data-driven automation at the edge 

WZL implemented a platform that could ingest, store, and analyze its continuously streaming data as it was created. The system gives organizations a single solution for all their data, streaming or not, with out-of-the-box functionality and support for high-speed data ingestion via an open-source, auto-scaling streaming storage solution.

Now, up to 1,000 characteristic values are recorded every 0.4 milliseconds – nearly 80TB of data every 24 hours. This data is immediately stored and pre-analyzed in real time at the edge on powerful compact servers, enabling further evaluation using artificial intelligence and machine learning. The characteristic values are derived from huge amounts of streaming image, X-ray, and IoT data and are used to detect and predict abnormalities throughout the metal stamping process.
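Those figures reconcile once the image and X-ray payloads are accounted for: 1,000 values every 0.4 ms works out to 2.5 million values per second, and reaching nearly 80TB a day implies an average of roughly 370 bytes per value, far more than a bare scalar reading. A quick back-of-envelope check:

```python
# Back-of-envelope check of the data rates quoted above.
values_per_batch = 1_000        # characteristic values per recording
batch_interval_s = 0.4e-3       # one recording every 0.4 milliseconds
seconds_per_day = 24 * 60 * 60

values_per_second = values_per_batch / batch_interval_s    # 2.5 million/s
values_per_day = values_per_second * seconds_per_day       # ~2.16e11

daily_bytes = 80e12                                        # "nearly 80TB"
avg_bytes_per_value = daily_bytes / values_per_day         # ~370 bytes

print(f"{values_per_second:,.0f} values/s; ~{avg_bytes_per_value:.0f} bytes/value")
```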

The WZL team found that once the system was implemented, it could be scaled without constraint. “No matter how many sensors we use, once we set up the analytics pipeline and the data streams, we don’t have to address any load-balancing issues,” said Philipp Niemietz, Head of Digital Technologies at WZL. 

With conditions like speed and temperature under constant AI supervision, the machinery is now able to adjust itself automatically to prevent interruptions. By monitoring the machines in this way, WZL has also enhanced its predictive maintenance capabilities. Learn more about how you can leverage Dell Technologies edge solutions.

***

Intel® Technologies Move Analytics Forward

Data analytics is the key to unlocking the most value you can extract from data across your organization. To create a productive, cost-effective analytics strategy that gets results, you need high-performance hardware that’s optimized to work with the software you use.

Modern data analytics spans a range of technologies, from dedicated analytics platforms and databases to deep learning and artificial intelligence (AI). Just starting out with analytics? Ready to evolve your analytics strategy or improve your data quality? There’s always room to grow, and Intel is ready to help. With a deep ecosystem of analytics technologies and partners, Intel accelerates the efforts of data scientists, analysts, and developers in every industry. Find out more about Intel advanced analytics.

April 14, 2022

Source:  Tim Guido, Corporate Director, Performance Improvement, Sanmina | Manufacturing Tomorrow

Many industries such as the automotive, medical and semiconductor sectors must comply with third party standards to control processes, reduce risk and ensure quality during the manufacturing of products. Over the past few years, organizations have begun to embrace an even broader mindset towards risk-based thinking, motivated by the growing discipline of regulatory compliance and an increasing number of unexpected global events that have impacted their operations.

When manufacturers implement a new production line today, they examine all of the possible risks and plan scenarios for every reasonable action that could either prevent or mitigate a risk if it materializes. Some people call this business continuity, risk management, or disaster management. Nothing has made these concerns more top of mind than the past few years of trade wars, the pandemic, extreme weather, and supply chain shortages.

Risk Management Checklist

Risk management is about the practical investment in preventative and mitigating measures that can be tapped during a crisis. There are four main areas to consider when building a risk management or business continuity program:

Risk Assessment. The first action to take is to put a stake in the ground in terms of what could go wrong at each plant, whether it happens to be a fire, earthquake, chemical spill, cyber attack or something else. This will vary for different regions. The possibility of an earthquake impacting operations in California is much higher than in Alabama. An act of terrorism may be more likely to happen in certain countries versus others.

Let’s say a manufacturer is setting up a new production line. The first step would be to complete a risk assessment form that spans different areas – Employee Health and Safety, Finance, HR, IT, Operations, and Program Management. Based on the guidelines provided, the person completing the form identifies possible issues and potential impacts – anything from production or shipment delays to impacts on employee health and safety. Then a threat rating between 1 and 5 is assigned for both occurrence and impact, with 5 being a critical situation that warrants the most attention.

Then, preventative and mitigating measures are determined based on the factors that could contribute to the adverse event. Are there inadequate controls, a lack of monitoring, or poor training that might add to a problem? Could these areas be improved to either prevent or lessen the potential impact? While an earthquake isn’t preventable, an organization could retrofit its buildings and upgrade IT systems to ensure that people are safe and can still perform their job duties if a temblor hits.
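The article describes the two 1-to-5 ratings but not how they are combined into a priority. A common risk-matrix convention, shown below purely as an illustrative sketch, is to multiply occurrence by impact and work the register from the highest score down; the example threats and ratings are hypothetical.

```python
RATING_SCALE = range(1, 6)  # 1 = negligible ... 5 = critical

def risk_score(occurrence: int, impact: int) -> int:
    """Combine the two 1-5 threat ratings into one priority score.
    Multiplying them is a common risk-matrix convention; the article
    doesn't specify how the two ratings are actually combined."""
    assert occurrence in RATING_SCALE and impact in RATING_SCALE
    return occurrence * impact

# Hypothetical entries from a completed assessment form:
register = {
    "Earthquake (California plant)": risk_score(occurrence=4, impact=5),
    "Chemical spill": risk_score(occurrence=2, impact=4),
    "Cyber attack": risk_score(occurrence=3, impact=5),
}

# Address the highest-scoring threats first.
for threat, score in sorted(register.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{score:>2}  {threat}")
```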

Incident Management & Business Recovery Planning. Building out all of the details for incident management and business recovery is essential, if not glamorous. A contact list needs to be created so that a key lead can reach all affected employees, customers, and suppliers during a disaster; getting customers and suppliers in the loop early could enable them to become part of the solution. A call notification script should be drafted to provide consistent communications to impacted parties, and decisions need to be made about who gets told what in certain scenarios. Checklists and drills should also be included, such as how to safely clear employees from a facility.

 

Internal Audit Checks. Once the business recovery plan is drafted, it should be audited annually. This ensures that the right action plans are included and the correct project leaders and backup leads are identified and verified. Each section, such as advanced planning, revision histories and recovery priorities, must be evaluated as part of the audit to ensure that there’s a solid plan in place and that all participants are properly trained and on board with the approach.

 

Test Exercise. Every plant should run through a drill for its highest-priority emergencies to evaluate preparedness. The plant must be able to prove that there’s an IT data recovery capability and have a rough idea of what can be done for a customer within the scope of the test exercise. If work needs to be moved to another location, can the team confirm the backup plant’s capacity and a timeline for the transfer? Does it understand the open orders that need to be transferred? How does the detailed recovery plan work in terms of getting operations back up and running? For each action, what would be considered a success, and how soon? A sample objective would be to get access to a site within one hour and have at least 80 percent of the team notified within the hour of a situation; a simple way to measure that second criterion is sketched below.
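Building on that sample objective, a small sketch of measuring notification coverage during a drill might look like the following. It assumes acknowledgment timestamps are logged per team member; the function name and data shapes are illustrative.

```python
from datetime import datetime, timedelta

def notification_coverage(incident_start: datetime,
                          ack_times: dict[str, datetime],
                          team_size: int,
                          window: timedelta = timedelta(hours=1)) -> float:
    """Fraction of the team that acknowledged the call notification within
    the window, for comparison against an objective like 'at least 80% of
    the team notified within the hour.'"""
    on_time = sum(1 for t in ack_times.values() if t - incident_start <= window)
    return on_time / team_size

# During a drill: passed = notification_coverage(start, acks, team_size=25) >= 0.8
```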

After running a drill, the next steps are to evaluate its effectiveness, make improvements to the plan, and communicate those changes to the team. If actions such as getting access to the site, notifying the team, understanding orders, getting alternate facility confirmation, and knowing the right customer contacts can all be demonstrated during the exercise, then the majority of functional activities are ready to go, even if an actual crisis requires some fine-tuning of processes. Just like the overall plan, the test runs should be performed at least once a year to verify their continued relevance.

Preventing Problems Before They Happen

At Sanmina, we are seeing increasingly robust expectations for risk management programs across the markets that we serve. Customers are more eager to get involved in understanding the details of these plans than ever before and are considering them an integral part of their manufacturing strategy.

It’s vitally important to understand potential risks, evaluate the scope and effectiveness of an action plan, and cultivate a living risk management process that is periodically reviewed and updated. It’s also critical to instill a preventative mindset within an organization’s culture, because it’s not always an intuitive thought process. Fixing a problem in the moment is beneficial, but the goal is a mindset that goes beyond corrective thinking to a proactive approach: identifying potential root causes so that future problems can be prevented or lessened.
