As part of its ongoing strategy to expand its roster of public cloud regions and catch up with larger cloud service providers such as AWS, Microsoft and Google, Oracle has launched a new cloud region in Chicago to cater to enterprises operating out of the US Midwest.

The Chicago region, which will be Oracle’s fourth public cloud region in the US and 41st globally, will primarily cater to manufacturing and financial services firms among other industries operating in that part of the country, said Leo Leung, vice president of products and strategy at Oracle.  

The Midwest region, according to Oracle, is home to more than 60% of all US manufacturing firms and houses the world’s largest financial derivatives exchange.

“This is just going to give them (enterprise customers in the region) the capability of running their workloads closer to their headquarters versus other parts of the country,” Leung said, adding that the demand in the region is fueling the company’s growing bookings for Oracle Cloud Infrastructure (OCI).

During the earnings call for the quarter ended November, CEO Safra Catz said the company had seen triple-digit bookings growth across its infrastructure-as-a-service (IaaS) offerings for the past two quarters and that, on the basis of this growth, it planned to invest approximately $2.4 billion every quarter for the next few quarters.

The new region in Chicago will offer over 100 OCI services and applications, including Oracle Autonomous Database, MySQL Heatwave, OCI Data Science, Oracle Container Engine for Kubernetes, and Oracle Analytics, the company said.  

Oracle has three other regions in the US, situated in Ashburn, Virginia; San Jose, California; and Phoenix, Arizona.

Globally, the company has a total of 55 cloud regions including national security regions.

Nine new regions are currently being built, Catz had said during the earnings call, according to a transcript from Motley Fool.

For the quarter ended November, the company’s total revenue grew 25% in constant currency, buoyed by revenue growth from its infrastructure and applications cloud businesses, which grew 59% and 45% respectively, in constant currency.


As the chief engineer and head of the department for digital transformation of manufacturing technologies at the Laboratory for Machine Tools and Production Engineering (WZL) within RWTH Aachen University, I’ve seen a lot of technological advancements in the manufacturing industry over my tenure. I hope to help other manufacturers struggling with the complexities of AI in manufacturing by summarizing my findings and sharing some key themes.

The WZL has been synonymous with pioneering research and successful innovations in the field of production technology for more than a hundred years, and we publish over a hundred scientific and technical papers on our research activities every year. The WZL is focused on a holistic approach to production engineering, covering the specifics of manufacturing technologies, machine tools, production metrology and production management, helping manufacturers test and refine advanced technology solutions before putting them into production at the manufacturing edge. In my team, we have a mix of computer scientists, like me, working together with mathematicians and mechanical engineers to help manufacturers use advanced technologies to gain new insights from machine, product, and manufacturing data.

Closing the edge AI insight gap starts and ends with people 

Manufacturers of all sizes are looking to develop AI models they can use at the edge to translate their data into something that’s helpful to engineers and adds value to the business. Most of our AI efforts are focused on creating a more transparent shop floor, with automated, AI-driven insights that can:

- Enable faster and more accurate quality assessment
- Reduce the time it takes to find and address process problems
- Deliver predictive maintenance capabilities that reduce downtime

However, AI at the manufacturing edge introduces some unique challenges. IT teams are used to deploying solutions that work for a lot of different general use cases, while operational technology (OT) teams usually need a specific solution for a unique problem. For example, the same architecture and technologies can enable AI at the manufacturing edge for various use cases, but more often than not, the way data is extracted from edge OT devices and systems and moved into IT systems is unique to each case.

Unfortunately, when we start a project, there usually isn’t an existing interface for getting data out of OT devices and into the IT system that is going to process it. And each OT device manufacturer has its own systems and protocols. In order to take a general IT solution and transform it into something that can answer specific OT needs, IT and OT teams must work together at the device level to extract meaningful data for the AI model. This will require IT to start speaking the language of OT, developing a deep understanding of the challenges OT faces daily, so the two teams can work together. In particular, this requires a clear communication of divided responsibilities between both domains and a commitment to common goals.
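In practice, this integration work often boils down to writing small adapters that map each vendor's payload into one schema the IT side agrees on. A minimal sketch of the pattern, where the vendor formats, field names, and units are entirely hypothetical:

```python
# Hypothetical sketch: normalizing vendor-specific OT readings into a
# common schema before handing them to IT systems. The payload layouts
# below are illustrative assumptions, not any real vendor's protocol.
from dataclasses import dataclass


@dataclass
class Reading:
    machine_id: str
    metric: str        # e.g. "spindle_temp_c"
    value: float
    timestamp: float   # Unix epoch seconds


def from_vendor_a(payload: dict) -> Reading:
    # Vendor A (invented) nests device info and uses millisecond timestamps.
    return Reading(
        machine_id=payload["dev"]["id"],
        metric=payload["tag"],
        value=float(payload["val"]),
        timestamp=payload["ts_ms"] / 1000.0,
    )


def from_vendor_b(payload: dict) -> Reading:
    # Vendor B (invented) uses flat fields and epoch seconds.
    return Reading(
        machine_id=payload["machine"],
        metric=payload["sensor_name"],
        value=float(payload["reading"]),
        timestamp=float(payload["epoch_s"]),
    )


ADAPTERS = {"vendor_a": from_vendor_a, "vendor_b": from_vendor_b}


def normalize(source: str, payload: dict) -> Reading:
    """Dispatch to the right adapter so downstream code sees one schema."""
    return ADAPTERS[source](payload)
```

The point of the design is that the AI pipeline only ever sees `Reading` objects; adding a new machine vendor means writing one more adapter, not touching the model code.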

Simplifying data insights at the manufacturing edge

Once IT and OT can work together to successfully get data from OT systems to the IT systems that run the AI models, that’s just the beginning. A challenge I see a lot in the industry is when an organization still uses multiple use-case-specific architectures and pipelines to build their AI foundation. The IT systems themselves often need to be upgraded, because legacy systems can’t handle the transmission needs of these very large data sets. 

Many of the companies we work with throughout our various research communities, industry consortia or conferences, such as WBAICNAP or AWK2023 — especially the small to medium manufacturers — ask us specifically for technologies that don’t require highly specialized data scientists to operate. That’s because manufacturers can have a hard time justifying the ROI if a project requires adding one or more data scientists to the payroll. 

To answer these needs, we develop solutions that manufacturers can use to get results at the edge as simply as possible. As a mechanical engineering institute, we’d rather not spend a lot of time doing research about infrastructure and managing IT systems, so we often seek out partners like Dell Technologies, who have the solutions and expertise to help reduce some of the barriers to entry for AI at the edge.

For example, when we did a project that involved high-frequency sensors, there was no product available at the time that could deal with our volume and type of data. We were working with a variety of open-source technologies to get what we needed, but securing, scaling, and troubleshooting each component led to a lot of management overhead.

We presented our use case to Dell Technologies, and they suggested their Streaming Data Platform. This platform reminds me of the way the smartphone revolutionized usability in 2007. When the smartphone came out, it had a very simple and intuitive user interface so anyone could just turn it on and use it without having to read a manual. 

The Streaming Data Platform is like that. It reduces friction so that people who are not computer scientists can capture the data flow from an edge device without deep technical expertise in the underlying systems. The platform also makes it easy to visualize the data at a glance, so engineers can quickly derive insights.

When we applied it to our use case, we found that it deals with these data streams very naturally and efficiently, and it reduced the amount of time required to manage the solution. Now, developers can focus on developing the code, not dealing with infrastructure complexities. By reducing the management overhead, we can use the time saved to work with data and get better insights.
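As a generic illustration of the pattern such a platform abstracts away (this is not the Streaming Data Platform's actual API), a high-frequency stream can be reduced to at-a-glance statistics by keeping only a fixed-size window of recent readings:

```python
# Generic sketch of windowed stream summarization: consume readings one
# at a time, retain only the most recent N, and expose simple summary
# statistics for a dashboard. Not any vendor's real API.
from collections import deque


class WindowedStats:
    """Fixed-size sliding window over a sensor stream."""

    def __init__(self, size: int):
        # deque with maxlen silently evicts the oldest reading when full
        self.window = deque(maxlen=size)

    def push(self, value: float) -> None:
        self.window.append(value)

    def summary(self) -> dict:
        """At-a-glance numbers an engineer might chart."""
        n = len(self.window)
        return {
            "count": n,
            "mean": sum(self.window) / n,
            "peak": max(self.window),
        }
```

The bounded window is the key design choice: memory stays constant no matter how fast the sensor emits, which is exactly the property high-frequency use cases need.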

The future of AI at the manufacturing edge

With all of this said, one of the biggest challenges I see with AI at the manufacturing edge is recognizing that AI insights are an augmentation to people and knowledge, not a replacement. It is far more important that people work together in managing and analyzing that data to ensure the end goal of getting business insights that serve a particular problem is being met.

When manufacturers piece together many different solutions to find insights, it might work, but it’s unnecessarily difficult. There are technologies out there today that can remedy these challenges; it’s just a matter of finding and evaluating them. We’ve found that the Dell Streaming Data Platform can capture data from edge devices, analyze the data using AI models in near real time, and feed insights back to the business to add value that benefits both IT and OT teams.

Learn more

If you are interested in current challenges, trends and solutions to empower sustainable production, find out more at the AWK2023 where more than a thousand participants from production companies all around the world come together to discuss solutions for green production.

Find out more about AI at the manufacturing edge solutions from Dell Technologies and Intel.  



Over the past 184 years, The Procter & Gamble Co. (P&G) has grown to become one of the world’s largest consumer goods manufacturers, with worldwide revenue of more than $76 billion in 2021 and more than 100,000 employees. Its brands are household names, including Charmin, Crest, Dawn, Febreze, Gillette, Olay, Pampers, and Tide.

In summer 2022, P&G sealed a multiyear partnership with Microsoft to transform P&G’s digital manufacturing platform. The partners say they will create the future of digital manufacturing by leveraging the industrial internet of things (IIoT), digital twin, data, and AI to bring products to consumers faster and increase customer satisfaction, all while improving productivity and reducing costs.

“The main purpose of our digital transformation is to help create superior solutions for daily problems of millions of consumers around the world, while generating growth and value for all stakeholders,” says Vittorio Cretella, CIO of P&G. “We do that by leveraging data, AI, and automation with agility and scale across all dimensions of our business, accelerating innovation and increasing productivity in everything we do.”

The digital transformation of P&G’s manufacturing platform will enable the company to check product quality in real-time directly on the production line, maximize the resiliency of equipment while avoiding waste, and optimize the use of energy and water in manufacturing plants. Cretella says P&G will make manufacturing smarter by enabling scalable predictive quality, predictive maintenance, controlled release, touchless operations, and manufacturing sustainability optimization. These things have not been done at this scale in the manufacturing space to date, he says.

Smart manufacturing at scale

The company has already undertaken pilot projects in Egypt, India, Japan, and the US that use Azure IoT Hub and IoT Edge to help manufacturing technicians analyze insights to create improvements in the production of baby care and paper products.

For instance, the production of diapers involves assembling many layers of material at high speed with great precision to ensure optimal absorbency, leak protection, and comfort. The new IIoT platform uses machine telemetry and high-speed analytics to continuously monitor production lines to provide early detection and prevention of potential issues in the material flow. This, in turn, improves cycle time, reduces network losses, and ensures quality, all while improving operator productivity.

P&G is also piloting the use of IIoT, advanced algorithms, machine learning (ML), and predictive analytics to improve manufacturing efficiencies in the production of paper towels. P&G can now better predict finished paper towel sheet lengths.
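As a hedged illustration of this kind of prediction (P&G's actual models are not public), sheet length could be regressed on a single piece of line telemetry with ordinary least squares. The choice of roller speed as the feature, and all the numbers below, are invented:

```python
# Illustrative sketch: predict finished sheet length from one telemetry
# signal using ordinary least squares. Feature choice and data are
# assumptions for demonstration, not P&G's actual model.
def fit_line(xs: list[float], ys: list[float]) -> tuple[float, float]:
    """Return (slope, intercept) minimizing squared prediction error."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (
        sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        / sum((x - mx) ** 2 for x in xs)
    )
    return slope, my - slope * mx


speeds = [10.0, 12.0, 14.0, 16.0]   # roller speed (assumed units)
lengths = [27.9, 28.1, 28.3, 28.5]  # measured sheet length, cm (invented)
m, b = fit_line(speeds, lengths)
predicted = m * 15.0 + b            # predicted length at speed 15.0
```

In a real deployment the model would use many telemetry features and be retrained continuously, but the structure (telemetry in, predicted dimension out) is the same.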

Smart manufacturing at scale is a challenge. It requires taking data from equipment sensors, applying advanced analytics to derive descriptive and predictive insights, and automating corrective actions. The end-to-end process requires several steps, including data integration and algorithm development, training, and deployment. It also involves large amounts of data and near real-time processing.
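The sense-analyze-act loop those steps feed into can be sketched as follows; the window size, threshold values, and action names are illustrative assumptions, not any vendor's implementation:

```python
# Minimal sketch of the sense -> analyze -> act loop: derive a predictive
# score from recent sensor history, then map it to a corrective action.
# All thresholds and limits are invented for illustration.
def predict_failure_risk(history: list[float]) -> float:
    """Toy predictive score: fraction of the last 10 readings above a limit."""
    LIMIT = 80.0  # e.g. degrees C; assumed value
    recent = history[-10:]
    return sum(1 for v in recent if v > LIMIT) / len(recent)


def corrective_action(risk: float) -> str:
    """Automated decision step keyed off the predictive score."""
    if risk > 0.5:
        return "schedule_maintenance"
    if risk > 0.2:
        return "raise_alert"
    return "no_action"
```

A production system replaces the toy score with a trained model and the string actions with calls into maintenance and alerting systems, but the loop's shape is the same.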

“The secret to scale is to lessen complexity by providing common components at the edge and in the Microsoft cloud that engineers can work with to deploy diverse use cases into a specific manufacturing environment — without having to create everything from scratch,” Cretella says.

Using Microsoft Azure as the foundation, Cretella says P&G will now be able to digitize and integrate data from more than 100 manufacturing sites around the world and enhance AI, ML, and edge computing services for real-time visibility. In turn, this will enable P&G employees to analyze production data and leverage AI to support decisions that drive improvement and exponential impact.

“Accessing this level of data, at scale, is rare within the consumer goods industry,” Cretella says.

Data and AI as digital fundamentals

P&G took the first steps in its AI journey more than five years ago. It has moved past what Cretella calls the “experimentation phase” with scaled solutions and increasingly sophisticated AI applications. Data and AI have since become central to the company’s digital strategy.

“We leverage AI across all dimensions of our business to predict outcomes and increasingly to prescribe actions through automation,” Cretella says. “We have applications in our product innovation space, where thanks to modelling and simulation we can shorten the lead time to develop a new formula from months to weeks; in the way we engage and communicate with our consumers, using AI to deliver brand messages to each of them at the right time, on the right channel, and with the right content.”

P&G also uses predictive analytics to help ensure the company’s products are available at retail partners “where, when, and how consumers shop for them,” Cretella says, adding that P&G engineers also use Azure AI to ensure quality control and equipment resilience on the production line.

While P&G’s recipe for scale relies on technology, including investment in a scalable data and AI environment centered on cross-functional data lakes, Cretella says P&G’s secret sauce is the skills of hundreds of talented data scientists and engineers who understand the company’s business inside and out. To that end, P&G’s future is about embracing automation of AI, which will allow its data engineers, data scientists, and ML engineers to spend less time on manual, labor-intensive tasks so they can focus on the areas where they add value.

“Automation of AI also allows us to deliver with consistent quality and to manage bias and risk,” he says, adding that automating AI will also “make these capabilities accessible to an increasingly larger number of employees, thus making the benefits of AI pervasive across the company.”

The power of people

Another element to achieving agility at scale is P&G’s “composite” approach to building teams in the IT organization. P&G balances the organization between central teams and teams embedded in its categories and markets. The central teams create enterprise platforms and technology foundations, while the embedded teams use those platforms and foundations to build digital solutions that address their units’ specific business opportunities. Cretella also notes that the company prioritizes insourcing talent, especially in areas such as data science, cloud management, cybersecurity, software engineering, and DevOps.

To accelerate P&G’s transformation, Microsoft and P&G have created a Digital Enablement Office (DEO) staffed by experts from both organizations. The DEO will serve as an incubator to create high-priority business scenarios in the areas of product manufacturing and packaging processes that P&G can implement across the company. Cretella considers it more of a project management office than a center of excellence.

“It coordinates all the efforts of the different innovation teams that work on business use cases and ensures an efficient scaled deployment of the proven solutions they develop,” he says.

Cretella has some advice for CIOs trying to drive digital transformation in their organizations: “First, be driven and find your energy in the passion for the business and how to apply technology to create value. Second, be equipped with tons of learning agility and genuine curiosity to learn. Last, invest in people — your teams, your peers, your bosses — because technology alone does not change things; people do.”


The manufacturing industry is experiencing its “fourth industrial revolution,” with manufacturers focused on leveraging IT to stay competitive and meet the demand for digital services that can enhance their physical wares. Sensors, AI, and robotics are key Manufacturing 4.0 technologies fueling data strategies aimed at identifying inefficiencies, streamlining processes, and improving the ability to forecast and predict industry trends.

Because of this, IT professionals are in high demand in the manufacturing industry — even more so as supply chain issues persist and manufacturers consider bringing more of their operations back onshore. Demand has increased so much that IT job postings in manufacturing doubled between May 2021 and May 2022, according to Dice.com, with increased demand for skills such as agile development, Python, software development, automation, C++, SQL, and Java, among others.

If you’re an IT pro looking to break into the manufacturing industry, or a manufacturing IT leader wanting to know where hiring will be the most competitive, here are nine of the most in-demand tech jobs in manufacturing, according to data from Dice.

Software engineer

The demand for software in manufacturing has only increased, as nearly every piece of equipment or hardware is now connected to the internet in some form — and this has also increased the demand for software engineers. In this role, you’ll be expected to design, code, debug, improve, and maintain software to support the organization. You may be put to work on automation, modernization, equipment installation, equipment support, or designing software to meet business needs. Software design and implementation can sometimes take years, depending on what’s being developed, so it’s a pivotal role for ensuring a company stays on track and on budget with digital transformation.

The average salary for a software engineer is $119,593 per year, with a reported salary range of $88,000 to $177,000 per year, according to data from Glassdoor.

Principal software engineer

Principal software engineers are typically responsible for managing large-scale, high-level projects that may require larger teams or longer lead times, and it’s one of the highest levels software engineers can reach. In this role, you’ll need to manage and oversee the technical aspects of the organization’s biggest projects and initiatives. Duties vary depending on the type of manufacturing, but these tech pros are often tasked with implementing plans developed by software architects and managing a team of engineers to code and script software. It’s a technical role that also requires a level of soft skills such as leadership, communication, and analytical skills.

The average salary for a principal software engineer is $169,453 per year, with a reported salary range of $128,000 to $235,000 per year, according to data from Glassdoor.

Systems engineer

Systems engineers focus on the systems and infrastructure in an organization, identifying areas for improvement, designing new solutions, and advising the company on the best hardware or software to meet a client’s requirements. You’ll be tasked with ensuring that the systems in an organization are always available, reliable, and well-maintained, as well as troubleshooting any problems or issues that arise. In manufacturing, systems engineers are typically expected to focus on process flows, identify issues in development processes, develop management control systems, implement quality control procedures, and work directly with clients to better understand their requirements and needs.

The average salary for a systems engineer is $110,245 per year, with a reported salary range of $82,000 to $158,000 per year, according to data from Glassdoor.

Principal systems engineer

Principal systems engineer is a high-level engineering role for designing and implementing complex computer systems across a variety of teams. These tech pros work closely with software developers, hardware engineers, and other tech professionals to ensure that the company’s products are up to industry standards and address customer needs. It’s the highest level for a systems engineer, and you’ll typically act as a supervisor, overseeing the day-to-day operations and performance of servers, storage, and network infrastructures. You’ll also be responsible for managing the building, patching, testing, and deployment of systems and platforms, ensuring that they meet clients’ needs.

The average salary for a principal systems engineer is $169,453 per year, with a reported salary range of $128,000 to $235,000 per year, according to data from Glassdoor.

Embedded software engineer

Embedded software engineers are responsible for designing and developing software for embedded devices and systems. They are typically expected to work on systems and software designed for specific tasks. Depending on the role, you may also be expected to work on the entire system that the embedded software functions within, in order to test it to see whether it works properly. But overall, it’s a role that requires a very narrow focus and designing software solutions to meet unique or specific needs in the organization.

The average salary for an embedded software engineer is $114,884 per year, with a reported salary range of $88,000 to $166,000 per year, according to data from Glassdoor.

Data scientist

Data scientists are critical in manufacturing because they help businesses collect, manage, and store relevant data for customers, clients, products, and services. It’s a role that helps manufacturing companies identify areas for process improvements, potential risks, and waste that can be eliminated in various processes. Job listings typically ask for skills such as machine learning, AI, SQL, Python, AWS, and the ability to work with relational databases, analyze and model engineering data, and identify emerging technologies that will help the company meet its goals.

The average salary for a data scientist is $122,004 per year, with a reported salary range of $90,000 to $176,000 per year, according to data from Glassdoor.

Software developer

As the manufacturing industry increasingly relies on technology and software, there’s a big demand for software developers. Software developers are expected to recommend software programs to help address manufacturing needs, run software tests on internal computer programs, modify open-source code to suit business needs, or design and develop custom software for the organization. In manufacturing, software developers are tasked with working on software for internal and external clients to manage projects, suppliers, supply chains, data analysis, and smart technologies for products.  

The average salary for a software developer is $111,729 per year, with a reported salary range of $78,000 to $181,000 per year, according to data from Glassdoor.

Business analyst

In the manufacturing industry, business analysts are responsible for using data and analytics to help the business make decisions and to support digital transformation. Business analysts are expected to perform requirements analysis, document processes, and communicate analysis insights to various departments and leadership. As a business analyst, you’ll also be expected to recommend new or emerging technologies and processes to improve automation, and to stay on top of the latest process and IT advancements to automate and modernize systems.

The average salary for a business analyst is $97,744 per year, with a reported salary range of $70,000 to $155,000 per year, according to data from Glassdoor.

DevSecOps engineer

DevSecOps is the intersection of development, security, and operations — it’s an expansion on DevOps, with an added priority on security. DevSecOps engineers are responsible for monitoring processes, conducting risk analysis, automating security control, managing incidents and security protocol, maintaining internal and external systems, and implementing safety practices within the organization. It’s a role that requires strong communication, leadership, and administrative skills as well as hard skills such as Python, Java, C++, Ruby, DAST, SAST, and modeling tools such as Rhapsody and SysML.

The average salary for a DevSecOps engineer is $120,117 per year, with a reported salary range of $89,000 to $169,000 per year, according to data from Glassdoor.


The manufacturing industry has fully entered the digital era, with the most digitally advanced manufacturers integrating their information technology and operational technology environments to give themselves an edge in the marketplace.

They’re also advancing their use of automation and analytics to streamline back-office functions, logistics, and production, and to optimize resources, reduce costs, and identify new opportunities, industry analysts say.

In its 2022 Digital Factory Update, ABI Research predicted that spending on smart manufacturing would grow at a 12% compound annual growth rate, from $345 billion in 2021 to more than $950 billion in 2030 — all to support digital transformation initiatives.

The research firm further predicted that manufacturers would continue to increase their spending on analytics, collaborative industrial software, and wireless connectivity in upcoming years, with factories increasingly adopting Industry 4.0 solutions, including autonomous mobile robots, asset tracking, simulation, and digital twins.

Michael Larner, a research director at ABI Research, says technology innovation will remain instrumental for manufacturers’ success as they confront myriad challenges, such as escalating energy costs, supply chain disruptions, staffing shortages, and the need to optimize resources.

“We’re seeing a move away from basic automation to integrating IT and OT teams, optimizing using analytics to make sure things are as efficient as possible, and to do things like predictive maintenance and proactive monitoring,” Larner says.

Consequently, manufacturers now prioritize IT investments that create and extend their “digital thread,” in which applications within their IT architecture talk to one another, with data from one system informing and directing action in others, Larner says.

“Manufacturers appreciate they cannot just react to events but need to be proactive,” he adds. “A successful CIO needs to be equally at home on the factory floor understanding issues and influencing change as in the boardroom talking strategically and obtaining project funds.”

CIO.com recently recognized the following 10 manufacturers as part of the CIO 100 Awards for IT Innovation and Leadership. Here is a look at how they are capitalizing on the value of IT.

Avery Dennison brings intelligence to its supply chain

Nicholas Colisto, VP and CIO, Avery Dennison

Avery Dennison

Organization: Avery Dennison

Project: Advanced Planning System (APS) for Operational Production Planning and Detailed Scheduling

IT Leader: Nicholas Colisto, VP and CIO

Like many manufacturers, Avery Dennison saw the need to strategically use technology to add more agility, efficiency, and speed to its increasingly large and complex supply chain.

In response, Avery Dennison IT partnered with the company’s global supply chain operations to create the Advanced Planning System (APS) for Operational Production Planning and Detailed Scheduling, a system that provides granular insights into supply issues and constraints, giving the supply chain team the tools needed to make effective and timely decisions.

APS gathers, captures, and combines all relevant inputs and outputs of supply chain events in near real-time to form a digital twin of the supply chain. The system’s easy-to-use interface enables the centralized team to monitor and take immediate action on events.

Working with the supply chain function to define and align organizational priorities and needs, Avery Dennison IT set about creating a mathematical model for establishing short-, mid-, and long-term operational plans. IT also automated a range of tasks, using intelligent optimization algorithms to identify anomalies, interpret impacts on downstream supply chain actors, and communicate that information to relevant stakeholders — thereby enabling the supply chain team to quickly respond to and remediate issues.
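The anomaly-identification step described above can be illustrated with a simple z-score rule over historical lead times; the threshold and data below are invented for demonstration and are not Avery Dennison's actual model:

```python
# Sketch of anomaly identification in supply chain data: flag events
# whose lead time deviates sharply from the historical mean (z-score
# rule). Threshold and data are illustrative assumptions only.
import statistics


def find_anomalies(lead_times: list[float], z_threshold: float = 2.0) -> list[int]:
    """Return indices of lead times more than z_threshold stdevs from the mean."""
    mean = statistics.mean(lead_times)
    stdev = statistics.pstdev(lead_times)
    return [
        i for i, v in enumerate(lead_times)
        if stdev and abs(v - mean) / stdev > z_threshold
    ]


history = [5.0, 5.2, 4.9, 5.1, 9.8, 5.0]  # lead times in days; index 4 is a disruption
flagged = find_anomalies(history)
```

A production system would pair each flagged event with the downstream orders it affects and route an alert to the relevant planner, which is the "interpret impacts and communicate" half of the automation described above.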

“This system enables our supply chain function to gain insights into the constraints and issues within the different assets of the supply chain, giving us all the tools in a single place to make effective and timely decisions,” says Vice President and CIO Nicholas Colisto. “Together, our new robust data lake, predictive analytics, and digital-twin capabilities are helping Avery Dennison to optimize the utilization of our assets, avoid downtime, and provide better visibility and analysis of the complete supply chain, allowing the organization to respond quickly to disruptions and deviations.”

CoorsTek streamlines production operations

Matt Mehlbrech, VP of IT, CoorsTek

CoorsTek

Organization: CoorsTek

Project: Model Plant Implementation

IT Leader: Matt Mehlbrech, VP of IT

Production operators at CoorsTek had an efficiency issue. To report on production, monitor machine performance, record quality readings, and search and retrieve documents, they had to use multiple computer systems and paper-based processes, rendering their work inefficient and prone to errors due to the lack of integration and data validation.

To remedy this, and create a foundation for future transformation, CoorsTek IT developed Model Plant, an integrated systems strategy focused on increasing production operator efficiency, providing key metrics to manufacturing management, and preparing CoorsTek for a future ERP implementation.

At its core is a manufacturing operations management (MOM) system, which combines manufacturing execution and maintenance management in a single integrated platform. As envisioned, Model Plant will integrate this MOM system with the company’s ERP, quality management, and machine connectivity systems into a single console for CoorsTek production operators, helping them stay focused on producing high-quality ceramic parts.

IT designed, developed, and implemented Model Plant at six CoorsTek plants in under 18 months, with the environment already improving production operators’ efficiency. Implementation is ongoing, with IT seeing it as foundational to a smooth transition to a new ERP.

“Our Model Plant systems enable real-time visibility of our entire shop floor and creates a comprehensive data set that we can analyze and learn from to unlock tangible productivity and quality gains,” says Matt Mehlbrech, the company’s vice president of IT.

Dow digitizes its manufacturing facilities

Melanie Kalmar, corporate vice president, CIO, and CDO, Dow

Dow

Organization: Dow

Project: Dow Digital Manufacturing Acceleration (DMA) Program

IT Leader: Melanie Kalmar, corporate vice president, CIO, and CDO

Dow in fall 2020 launched its Digital Manufacturing Acceleration (DMA) program with the goal of accelerating the deployment of digital technologies across its global manufacturing and maintenance areas.

One program deliverable to date is a private, secure, high-speed cellular network that is fully within the enterprise network. This network provides real-time access to data and collaboration tools, thereby enabling employees to extend their work beyond the traditional office out into the manufacturing plant environment.

DMA also includes a secure cloud for hosting, integrating, and contextualizing data for employees through purpose-built applications, and a mobile-first platform where employees can access their data and actionable insights.

Through such deliverables, DMA correlates and integrates manufacturing data and analysis that employees can use to make better, faster decisions. Capabilities delivered through the program also help improve asset reliability, safety, quality performance, and operational efficiency while reducing operating costs and unplanned events.

Together, DMA solutions create an improved employee experience, a digitally enabled workforce, and a resilient manufacturing organization by integrating and contextualizing manufacturing data from engineering, operations, maintenance, logistics, ERP and related ecosystem data sources, and by delivering data and insights from multiple systems through a single, user-centric interface.

“Dow’s DMA’s success would not be possible without the outstanding collaboration between our information systems and manufacturing teams. By deepening our understanding of what our manufacturing teams need to safely and reliably meet our customers’ needs and what IS can deliver for manufacturing, we have been able to arrive at these high-performing DMA solutions,” says Corporate Vice President, CIO, and CDO Melanie Kalmar.

Dow’s DMA solutions are in use at the company’s largest facility, where they’re already generating value through increases in manufacturing asset utilization, reductions in unplanned events and production outages, production volume increases, and other quantifiable key performance indicators. Implementation of DMA solutions at other Dow facilities is ongoing.

Analytics proves key to Eastman materials innovation

Aldo Noseda, vice president and CIO, Eastman

Eastman

Organization: Eastman Chemicals

Project: Fluid Genius Digital Product

IT Leader: Aldo Noseda, vice president and CIO

Eastman Chemicals in 2021 launched Fluid Genius, a patent-pending digital platform that enables manufacturing plant workers to monitor issues that could impact operations, safety, yield, and maintenance budgets.

Fluid Genius can monitor, analyze, and extend the life of Eastman’s customers’ heat transfer fluid. The technology can predict fluid life expectancy and advise how best to extend it while also avoiding unplanned manufacturing shutdowns.

Fluid Genius also provides forward-looking insights, which allows plant maintenance engineers and operations managers to plan the optimal time for maintenance, thereby minimizing risk and costs. Fluid Genius’s recommendation engine also helps plant engineers better understand the factors impacting the quality of their heat transfer fluid, which in turn helps them operate their plants safely and inform budgeting for future maintenance needs.

To create this application, Eastman drew on its nearly 50 years of operating system sample analysis data to draw in-depth insights on fluid chemistry behaviors over time. The company combined that data with end-user input into its maintenance and incident logs and advanced artificial intelligence/machine learning techniques to create the platform’s proprietary fluid analytics.
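Eastman has not published the internals of Fluid Genius, but the core idea of projecting fluid life from sampled degradation trends can be sketched as a simple linear extrapolation. Everything below (the marker, the condemning limit, the sample values) is hypothetical and purely illustrative:

```python
# Hypothetical sketch (not Eastman's model): extrapolate remaining heat
# transfer fluid life from a degradation marker trending toward a limit.

def remaining_life_years(samples, limit):
    """samples: list of (age_years, marker_pct) readings.
    Fit a least-squares line through the readings and project when the
    marker crosses the condemning limit; return years left from the
    most recent sample."""
    n = len(samples)
    mean_x = sum(x for x, _ in samples) / n
    mean_y = sum(y for _, y in samples) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in samples)
             / sum((x - mean_x) ** 2 for x, _ in samples))
    intercept = mean_y - slope * mean_x
    crossing_age = (limit - intercept) / slope
    return crossing_age - samples[-1][0]

# Marker rising ~1.5 points/year, condemning limit at 12%:
print(remaining_life_years([(1, 3.0), (2, 4.5), (3, 6.0)], limit=12.0))  # 4.0
```

A production system would of course fold in maintenance logs and nonlinear chemistry effects, as the article describes; the point here is only the prediction-then-plan pattern.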

“Digital products and services like Fluid Genius are critical to Eastman becoming a leading material innovation company because of the value they provide to our customers and how they differentiate us from competitors,” says Eastman VP and CIO Aldo Noseda.

Eastman has already seen strong uptake of this platform, and company officials note that it benefits the company’s customer acquisition and retention efforts.

General Motors turns to IT to navigate semiconductor shortage

Fred Killeen, CIO and VP of Global IT, General Motors

General Motors

Organization: General Motors

Project: Semiconductor Shortage: Protecting GM Revenue

IT Leader: Fred Killeen, CIO and VP of Global IT

Like all modern vehicle makers, General Motors relies on semiconductors to make its products; in fact, a typical vehicle relies on more than 3,000 chips within its electronic control units (ECUs) to run essential vehicle functions.

As a result, General Motors faced the possibility of shutting down production when a global shortage of semiconductors arose in 2021. But thanks to an innovative technology that enabled GM to build vehicles without key electronic components and park them in lots for completion later when semiconductors became available, factories in North America could keep running.

The IT-developed solution — an industry first — essentially extended the secure plant network into vehicle storage locations, some of which were hundreds of miles away from manufacturing sites in the United States and Mexico.

This allowed repair teams, who were equipped with GM standard test tools configured with mobile printers and portable hot spots, to connect to the GM network and company systems to validate parts, complete repairs, download engine control software to ECUs, and test the vehicles. Completed vehicles were driven or returned to plants on carriers for the final stages of quality testing before being shipped to dealerships and customers.

The IT team also built tools to collect data about chips and suppliers and to provide that data on demand, enabling GM to route chips to suppliers who produce ECUs for its most popular vehicles.

Additionally, business intelligence and analytics teams developed tools and reporting features to manage the ECU repair process and prioritize fixes based on how long a vehicle had been in a lot and its impact on GM revenue. These teams also developed a unified view that displays all the data in one place.
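GM has not disclosed its scoring logic, but ranking repairs by time in the lot and revenue impact can be sketched minimally. The weighting, field names, and figures below are all hypothetical:

```python
# Hypothetical sketch (not GM's actual tooling): rank parked vehicles for
# ECU repair by how long they have waited and their revenue impact.

vehicles = [
    {"vin": "V1", "days_in_lot": 45, "revenue_usd": 62_000},
    {"vin": "V2", "days_in_lot": 10, "revenue_usd": 95_000},
    {"vin": "V3", "days_in_lot": 80, "revenue_usd": 38_000},
]

def priority(v, day_weight=1_000):
    # Assumed weighting: each day in the lot counts like $1,000 of revenue,
    # so long-parked vehicles rise in the queue even at lower price points.
    return v["days_in_lot"] * day_weight + v["revenue_usd"]

queue = sorted(vehicles, key=priority, reverse=True)
print([v["vin"] for v in queue])  # ['V3', 'V1', 'V2']
```

Tuning the day weight trades off inventory age against revenue; the actual balance GM struck is not public.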

“Working with teams across the organization, they delivered an industry-first technology solution that allowed us to build vehicles without some electronic components and complete them remotely as semiconductors became available — it’s a process that continues to prove valuable today,” says Fred Killeen, CIO and global VP of IT.

Oshkosh goes digital to optimize its supply chain

Anupam Khare, senior vice president and CIO, Oshkosh

Oshkosh

Organization: Oshkosh

Project: Enhanced Supply Chain Efficiency

IT Leader: Anupam Khare, senior vice president and CIO

Oshkosh had identified three issues within its supply chain function, which is tasked with acquiring the right parts at the right time to support the manufacturing of customized, premium products requiring sophisticated components.

First, there was a lack of parts visibility across Oshkosh’s supply network, which led to unexpected parts shortages, putting the company at risk for production delays and shutdowns. That visibility issue in turn had created inefficient logistics and inventory leveling processes, driving up costs. And then there was the company’s supply chain technology itself, which comprised disparate, unconnected systems.

To address those issues, Oshkosh’s digital technology team partnered with its supply chain function to develop an advanced digital platform complete with system-to-system integration, cohesive data environments, data enhancements, and machine learning. Together, the teams also created a series of advanced analytics solutions to establish a more connected and optimized supply chain.

The platform pulls together critical data sources across several systems and functional areas and serves as the foundation for agile delivery of additional digital products. For example, the digital technology team quickly deployed a parts shortage prediction model soon after the platform was created. The team also deployed multiple advanced analytics solutions to optimize inventory levels, logistics, and shipping costs.
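The article doesn’t detail the parts shortage prediction model, but the basic mechanic of flagging a future shortfall can be sketched as a running projection of on-hand inventory against demand and inbound receipts. All names and numbers below are illustrative, not Oshkosh’s:

```python
# Illustrative sketch (assumed logic, not Oshkosh's model): flag parts whose
# projected on-hand balance goes negative within the planning horizon.

def projected_shortage_day(on_hand, daily_demand, inbound):
    """inbound: {day_offset: qty} of scheduled receipts. Return the first
    day the projected balance dips below zero, or None if coverage holds
    for the horizon (last receipt plus a 30-day buffer)."""
    balance = on_hand
    horizon = max(inbound, default=0) + 30
    for day in range(1, horizon + 1):
        balance += inbound.get(day, 0) - daily_demand
        if balance < 0:
            return day
    return None

# 500 on hand, burning 40/day, 300 arriving on day 10:
print(projected_shortage_day(500, 40, {10: 300}))  # 21
```

With a signal like this, planners get lead time to expedite orders before the stockout actually lands, which is the visibility gap the Oshkosh team set out to close.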

Initiated in January 2021 and now fully deployed, the collection of technologies has already increased supply chain visibility, part shortage prediction accuracy, and operational efficiency, and has helped Oshkosh avoid stockouts and subsequent costly production delays and shutdowns.

“This project showed the importance of IT staying well-connected with business partners to anticipate opportunities/challenges early and to co-create value through digital solutions,” says Anupam Khare, senior vice president and CIO.

Otis takes smart elevator to new heights

Organization: Otis Elevator

Project: Otis ONE

IT Leader: Renee Zaugg, vice president and CIO (Zaugg has since left Otis.)

To bring more transparent, proactive, and predictive services to customers, Otis in mid-2020 started work on a global digital transformation program called Otis ONE.

Leveraging technologies such as IoT, big data, AI, mobile, and cloud, Otis ONE provides real-time visibility of elevator health, insights for predictive maintenance, and remote assistance and troubleshooting.

Otis ONE is tailored to deliver real-time data insights to a range of personas, from campus owners to field engineers to maintenance officers, in their day-to-day operations, empowering stakeholders to make more informed decisions and deliver predictive maintenance more effectively.

The Otis ONE architecture consists of three tiers — edge, platform, and enterprise. Edge relies on various gateway/sensor packages to collect and send a range of elevator data to the cloud via a cellular network, which is then integrated with the platform tier via its IoT hub or event hub. The platform tier features real-time business processing fueled by a rules engine that analyzes data and immediately determines various conditions on the state of the elevator, notifying Otis workers and external customers proactively in the event of anomalies.

The Otis ONE enterprise tier brings together multiple applications that leverage the information coming from the edge and platform tiers, integrating it with elevator master and service data to give a 360-degree, real-time view of elevators around the globe.
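Otis has not published its rules engine, but the platform-tier pattern described above, evaluating incoming telemetry against rules and raising alerts on anomalies, can be sketched minimally. The rule names and thresholds below are invented for illustration:

```python
# Illustrative sketch (not Otis code): a minimal platform-tier rules engine
# that evaluates edge telemetry against thresholds and names the violations.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    check: Callable[[dict], bool]  # returns True when the reading is anomalous

RULES = [
    Rule("door_cycle_slow", lambda r: r.get("door_close_ms", 0) > 4000),
    Rule("vibration_high", lambda r: r.get("vibration_g", 0.0) > 0.5),
]

def evaluate(reading: dict) -> list[str]:
    """Return the names of all rules the telemetry reading violates."""
    return [rule.name for rule in RULES if rule.check(reading)]

alerts = evaluate({"door_close_ms": 5200, "vibration_g": 0.2})
print(alerts)  # ['door_cycle_slow']
```

In a real deployment each non-empty result would fan out to notifications for Otis workers and customers, as the article describes.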

“The connected elevator creates value for our customers with greater equipment uptime. Otis ONE enhances transparency to our customers, productivity of our own teams in prediction and proactiveness,” says Ezhil Nanjappan, executive director and CTO, adding that the Otis ONE ecosystem serves as a “foundation for any smart cities and smart buildings.”

Owens Corning leverages low-code to improve plant safety

Steve Zerby, senior vice president and CIO, Owens Corning

Owens Corning

Organization: Owens Corning

Project: Low-Code Digital Platform for High Impact to Plant Safety and Compliance

IT Leader: Steve Zerby, senior vice president and CIO

Owens Corning must monitor and manage the vapors from asphalt tanks for both safety and regulatory compliance reasons. But it found its process for doing so was ineffective and slow. Data was collected in offline databases, and was plant-specific and difficult to share across the company. In addition, the analysis required to uncover potential hazards took days.

So, in August 2021, Owens Corning’s asphalt business function and IT teamed up to co-create a digital platform that could transform and improve the effectiveness of that process through improved data collection and actionable analytics.

Low-code application development technology enabled the two teams to quickly create a proof of concept, gather feedback, and fine-tune the platform. Within three weeks, a pilot was delivered to a plant, and the platform itself was implemented across 20-plus plants in only three months.

The platform provides a single source of truth for loss prevention data, with proactive monitoring and analytics that alert plant leaders with real-time insights to ensure hazard prevention.

The company has seen improvements in its program. Loss prevention hazards are now identified and addressed in real-time, thereby ensuring more effective personnel safety and regulatory compliance reporting. That visibility has also enabled more proactive preventive maintenance of equipment, thereby minimizing unplanned production outages. And, thanks to the platform’s analytics capabilities, insights are generated in minutes instead of days, enabling the company to make informed decisions quickly.

“Digitizing sensor measure data taken from tanks, integrating other data points, and providing easy-to-use visuals and analytics empowered plant operators. They can quickly assess potential hazards and risks in real-time and proactively take preventive actions,” says Steve Zerby, senior vice president and CIO, noting that a key element of success was having “a dedicated business product owner who is passionate about bringing others along in using the tools.”

Rockwell Automation launches customer-centric transformation

Chris Nardecchia, SVP and chief information and digital officer, Rockwell

Rockwell

Organization: Rockwell Automation

Project: Rockwell Automation Drives Innovation, Customer Centricity and Moves Towards a Data-Driven Operating Model Through its Enterprise Transformation Office

IT Leader: Chris Nardecchia, SVP and chief information and digital officer

As Rockwell Automation works to help customers accelerate their digital transformations, the company has likewise embarked on its own DX journey. In doing so, it is creating new customer experiences, transforming business models, and empowering workforce innovation.

Launched in August 2020, the initiative is helping to build a data-driven operating model that is agile and centered around the customer so it can both respond to changing needs and create lasting customer connections instead of one-time transactional product sales.

To do that, the Enterprise Transformation Office (ExO) mapped out end-to-end business processes and customer experience needs for the target state and prioritized the investments required to deliver a holistic customer experience.

The office is moving Rockwell Automation to a data-driven operating model, in which it will execute more than two dozen product development cycles annually and use customer information to shape those products. Additionally, the ExO is working to transform business models, processes, software, and service portfolios to deliver more subscription-based offerings to customers.

Rockwell Automation is using telemetry to gather data and insights as well as creating a comprehensive, unified view of customer data to enhance the company’s customer engagements. It’s updating processes and systems to support its future state. It’s also leveraging data gathered from other initiatives to innovate products, grow the market, and grow the business. And it has implemented a structured approach for governing its transformation roadmap.

“This program is critical to realizing our vision of reinventing our products to outcomes, reimagining our customers’ experience, and redefining our operating model towards subscriptions and annual recurring revenue,” says Senior Vice President and Chief Information and Digital Officer Chris Nardecchia.

Schneider Electric secures against rise in OT cyberattacks

Elizabeth Hackenson, SVP and CIO, Schneider Electric

Schneider Electric

Organization: Schneider Electric

Project: Cybersecurity Connected Service Hub

IT Leader: Elizabeth Hackenson, SVP and CIO

In response to the growing number and sophistication of cyberattacks directed at operational technology (OT), Schneider Electric in 2020 created a cybersecurity OT operating model aimed at optimizing the cybersecurity performance of its 220 manufacturing plants and 35 distribution centers around the globe.

The model establishes one Security Operations Center (SOC) for all Schneider IT and OT. It also establishes a threat detection platform at each plant that raises security alerts to the SOC via an interface with the security information and event management (SIEM) system.

The operating model also features the Cybersecurity Connected Service Hub (CSH), a 24/7 global team to support the company’s industrial sites in every type of cybersecurity event. Its primary responsibilities are to identify and baseline cybersecurity posture of OT and IT devices; detect and remediate security alerts; and monitor for and remediate vulnerabilities.

CSH is also charged with driving continuous improvement and progressively improving the company’s cybersecurity posture in OT devices and processes.

The CSH team receives alerts from the SOC, analyzes them, and defines how each must be remediated. It works with cybersecurity site leaders (CSLs) to remediate those alerts. CSLs are OT experts trained in cybersecurity, one assigned to each plant, who are accountable for the cybersecurity of that plant’s OT; they can remediate a cyber incident and manage business continuity.

The CSH also provides cybersecurity training for all CSLs. It builds and promulgates standard operation procedures (SOP) for each type of security event and vulnerability. It also monitors cybersecurity KPIs in all plants, remediating in conjunction with the CSL any gaps that it identifies.

“The creation of the Cybersecurity Connected Services Hub is a milestone in our continuous journey towards greater resilience and cybersecurity at Schneider Electric,” says SVP and CIO Elizabeth Hackenson.


At the Laboratory for Machine Tools and Production Engineering (WZL) of RWTH Aachen University, scientists, mathematicians, and software developers conduct manufacturing research, working together to gain new insights from machine, product, and manufacturing data. Manufacturers partner with the team at WZL to refine solutions before putting them into production in their own factories. 

Recently, WZL has been looking for ways to help manufacturers analyze changes in processes, monitor output and process quality, then adjust in real-time. Processing data at the point of inception, or the edge, would allow them to modify processes as required while managing large data volumes and IT infrastructure at scale.

Connected devices generate huge volumes of data

According to IDC, the amount of digital data worldwide will grow by 23% through 2025, driven in large part by the rising number of connected devices. Juniper Research found that the total number of IoT connections will reach 83 billion by 2024. This represents a projected 130% growth rate from 35 billion connections in 2020.

WZL is no stranger to this rise in data volume. As part of its manufacturing processes, fine blanking incubators generate massive amounts of data that must first be recorded at the sharp end and processed extremely quickly. Its specialized sensors for vibrations, acoustics, and other manufacturing conditions can generate more than 1 million data points per second.

Traditionally, WZL’s engineers have processed small batches of this data in the data center. But this method could take days to weeks to gain insights. They wanted a solution that would enable them to implement and use extremely low-latency streaming models to garner insights in real-time without much in-house development.

Data-driven automation at the edge 

WZL implemented a platform that could ingest, store, and analyze its continuously streaming data as it was created. This system gives organizations a single solution for all their data (whether streaming or not) that provides out-of-the-box functionality and support for high-speed data ingestion with an open-source, auto-scaling streaming storage solution.

Now, up to 1,000 characteristic values are recorded every 0.4 milliseconds – nearly 80TB of data every 24 hours. This data is immediately stored and pre-analyzed in real-time at the edge on powerful compact servers, enabling further evaluation using artificial intelligence and machine learning. These characteristic values leverage huge amounts of streaming image, X-ray and IoT data to detect and predict abnormalities throughout the metal stamping process. 
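A quick back-of-the-envelope check of the figures quoted above (1,000 values every 0.4 milliseconds, roughly 80TB per day) shows why the scalar readings alone cannot account for the volume; the accompanying image and X-ray streams must carry most of it:

```python
# Back-of-the-envelope check using the figures as quoted in the article.
values_per_batch = 1_000          # characteristic values per recording
batch_period_s = 0.4e-3           # one batch every 0.4 ms
values_per_second = values_per_batch / batch_period_s
values_per_day = values_per_second * 86_400

print(f"{values_per_second:,.0f} values/s")   # 2,500,000 values/s
print(f"{values_per_day:,.0f} values/day")    # 216,000,000,000 values/day

# Spreading ~80 TB/day across those scalar recordings:
bytes_per_value = 80e12 / values_per_day
print(f"~{bytes_per_value:.0f} bytes/value")  # ~370 bytes per value, far more
# than an 8-byte float: the bulk of the volume is the image and X-ray streams.
```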

The WZL team found that once the system was implemented, it could be scaled without constraint. “No matter how many sensors we use, once we set up the analytics pipeline and the data streams, we don’t have to address any load-balancing issues,” said Philipp Niemietz, Head of Digital Technologies at WZL. 

With conditions like speed and temperature under constant AI supervision, the machinery can now automatically adjust itself to prevent interruptions. By monitoring the machines in this way, WZL has also enhanced its predictive maintenance capabilities. Learn more about how you can leverage Dell Technologies edge solutions.

***

Intel® Technologies Move Analytics Forward

Data analytics is the key to unlocking the most value you can extract from data across your organization. To create a productive, cost-effective analytics strategy that gets results, you need high performance hardware that’s optimized to work with the software you use.

Modern data analytics spans a range of technologies, from dedicated analytics platforms and databases to deep learning and artificial intelligence (AI). Just starting out with analytics? Ready to evolve your analytics strategy or improve your data quality? There’s always room to grow, and Intel is ready to help. With a deep ecosystem of analytics technologies and partners, Intel accelerates the efforts of data scientists, analysts, and developers in every industry. Find out more about Intel advanced analytics.


April 14, 2022

Source: Tim Guido, Corporate Director, Performance Improvement, Sanmina | Manufacturing Tomorrow

Many industries, including the automotive, medical, and semiconductor sectors, must comply with third-party standards to control processes, reduce risk, and ensure quality during the manufacturing of products. Over the past few years, organizations have begun to embrace an even broader mindset toward risk-based thinking, motivated by the growing discipline of regulatory compliance and an increasing number of unexpected global events that have impacted their operations.

When manufacturers implement a new production line, they examine all of the possible risks and plan scenarios for every reasonable action that could either prevent or mitigate a risk if it materializes. Some call this business continuity, risk management, or disaster management. Nothing has brought these concerns more to top of mind than the past few years of trade wars, the pandemic, extreme weather, and supply chain shortages.

Risk Management Checklist

Risk management is about the practical investment in preventative and mitigating measures that can be tapped during a crisis. There are four main areas to consider when building a risk management or business continuity program:

Risk Assessment. The first action to take is to put a stake in the ground in terms of what could go wrong at each plant, whether it happens to be a fire, earthquake, chemical spill, cyber attack or something else. This will vary for different regions. The possibility of an earthquake impacting operations in California is much higher than in Alabama. An act of terrorism may be more likely to happen in certain countries versus others.

Let’s say a manufacturer is setting up a new production line. The first step would be to complete a risk assessment form that spans different areas – Employee Health and Safety, Finance, HR, IT, Operations and Program Management. Based on the guidelines provided, the person completing the form identifies possible issues and potential impacts – this could be anything from production or shipment delays to impacts to employee health and safety. Then a threat rating is assigned between 1 and 5 for both occurrence and impact, with 5 being a critical situation that warrants the most attention.

Then, preventative and mitigating measures are determined based on factors that could contribute to the adverse event. Are there inadequate controls, lack of monitoring or poor training that might add to a problem? Could these areas be improved to either prevent or lessen the potential impact? While an earthquake isn’t preventable, an organization could retrofit their building and upgrade IT systems to ensure that people are safe and can still perform their job duties if a temblor hits.
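The occurrence and impact ratings described above are commonly combined into a single risk score by multiplication. The sketch below assumes that convention; the escalation threshold is an invented example, not from the article:

```python
# Minimal sketch of the rating step described above. The 1-5 scales come
# from the article; the multiplicative score and the escalation cutoff are
# assumed, widely used conventions rather than Sanmina's documented method.

def threat_score(occurrence: int, impact: int) -> int:
    """Combine the 1-5 occurrence and impact ratings into a 1-25 score."""
    for rating in (occurrence, impact):
        if not 1 <= rating <= 5:
            raise ValueError("ratings must be between 1 and 5")
    return occurrence * impact

def needs_escalation(occurrence: int, impact: int, threshold: int = 15) -> bool:
    # Assumed cutoff: scores of 15+ (e.g. a 3x5 or 4x4) get priority attention.
    return threat_score(occurrence, impact) >= threshold

print(threat_score(2, 5), needs_escalation(2, 5))  # 10 False
print(threat_score(4, 4), needs_escalation(4, 4))  # 16 True
```

A scheme like this makes the form's output directly sortable, so the riskiest plant-scenario pairs surface first when preventative budgets are allocated.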

Incident Management & Business Recovery Planning. Building out all of the details for incident management and business recovery is essential, if not glamorous. A contact list needs to be created so that a key lead can reach all affected employees, customers, and suppliers during a disaster; getting customers and suppliers in the loop early could enable them to become part of the solution. A call notification script should be drafted that provides consistent communications to impacted parties, and decisions need to be made about who gets told what in certain scenarios. Checklists and drills should also be included, such as how to safely clear employees from a facility.


Internal Audit Checks. Once the business recovery plan is drafted, it should be audited annually. This ensures that the right action plans are included and the correct project leaders and backup leads are identified and verified. Each section, such as advanced planning, revision histories and recovery priorities, must be evaluated as part of the audit to ensure that there’s a solid plan in place and that all participants are properly trained and on board with the approach.


Test Exercise. Every plant should run through a drill for its highest-priority emergencies to evaluate preparedness. It must be able to prove that there’s an IT data recovery capability and have a rough idea of what can be done for a customer within the scope of the test exercise. If work needs to be moved to another location, can the team confirm the backup plant’s capacity and a timeline for the transfer? Does it understand the open orders that need to be transferred? How does the detailed recovery plan work in terms of getting operations back up and running? For each action, what would be considered a success, and how soon? A sample objective would be to get access to a site within one hour and have at least 80 percent of the team notified within the hour of a situation.
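The sample objective above translates directly into checkable pass/fail criteria. A minimal sketch (the one-hour and 80 percent figures come from the article; the function and field names are invented):

```python
# Sketch of the sample drill objective: site access within one hour and at
# least 80% of the team notified within an hour of the situation.

def drill_passed(access_minutes, notified, team_size):
    """Return True only if both drill criteria are met."""
    access_ok = access_minutes <= 60
    notify_ok = notified / team_size >= 0.80
    return access_ok and notify_ok

print(drill_passed(access_minutes=45, notified=17, team_size=20))  # True
print(drill_passed(access_minutes=75, notified=19, team_size=20))  # False
```

Scoring drills this way keeps the annual test runs comparable year over year, which is what makes the "evaluate and improve" loop described below workable.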

After running a drill, the team should evaluate its effectiveness, make improvements to the plan, and communicate the changes. If actions such as getting access to the site, notifying the team, understanding orders, confirming an alternate facility, and knowing the right customer contacts can all be demonstrated during the exercise, then the majority of functional activities are ready to go, even if an actual crisis requires some fine-tuning of processes. Just like the overall plan, the test runs should be performed at least once a year to verify their continued relevance.

Preventing Problems Before They Happen

At Sanmina, we are seeing increasingly robust expectations for risk management programs across the markets that we serve. Customers are more eager to get involved in understanding the details of these plans than ever before and are considering them an integral part of their manufacturing strategy.

It’s vitally important to understand potential risks, evaluate the scope and effectiveness of an action plan and cultivate a living risk management process that is periodically reviewed and updated. It’s also critical to instill a preventative mindset within an organization’s culture because it’s not always an intuitive thought process. While fixing a problem in the moment may be beneficial, it’s important to build a mindset that’s not just about corrective thinking but a proactive approach that identifies potential root causes that could help prevent or lessen a problem that may occur in the future.

The post Four Steps to Reducing Manufacturing Risk appeared first on Internet of Business.


With increasing pressure on pricing, speed, individualization, and sustainability from customers and competitors, already-intricate supply chains and manufacturing processes are becoming ever more complex.

A resilient business needs digitalized manufacturing operations, customer service and supply chains to run with speed and flexibility.

Lean manufacturing (also known as lean production, just-in-time manufacturing, or JIT) is a production method aimed primarily at reducing times within the production system as well as response times from suppliers and to customers. Central to the concept is the elimination of waste: activities that add no value to the process. This in turn provides a basis for operational excellence through standardized processes, proactive equipment monitoring and maintenance, empowered employees, and a culture of continuous improvement.

Industry 4.0 refers to the digital transformation of industrial processes through Industrial Internet of Things (IIoT) and cyber-physical systems – smart, autonomous systems that use computer-based algorithms to monitor and control physical things like machinery, robots, and vehicles.  

In this workshop, we shall discuss and showcase use cases where Industry 4.0 meets lean manufacturing – with the aim of increasing operational efficiency!

SPEAKERS:

Sujit Hemachandran – Sr Lead, Industry 4.0 Digital Transformation, SAP Labs
Sujit is a technologist who specializes in understanding and helping customers adopt technologies and applications. He is currently engaged with SAP customers in the discrete and process sector to help them on their Industry 4.0 and industrial IoT journeys. His experience with SAP software includes strategizing and developing various enterprise technologies and platforms, such as mobile, integration, IoT, and cloud communications.

Ben Hughes – Industry 4.0 Hub OT Specialist, SAP Labs
Ben has 18 years of engineering experience in a wide range of industries. His experience includes working with many manufacturers to deploy automation solutions on plant floors, developing automated residential lighting systems, and developing custom control systems for the US Navy and Coast Guard. During his time off, Ben enjoys spending time with family, travelling, and reading, and is a volunteer with his son’s boy scout troop.

Jack LaMaina – Industry 4.0 Hub Specialist, SAP Labs
Jack graduated from The College of New Jersey with a Major in Business Management and a Minor in International Studies. Early in his career at SAP, Jack supported the Mission Control Center before taking on the role of a Solution Advisor for SAP’s Customer Experience Suite of solutions. Jack supports SAP customers by bridging the gap between business and technology, turning complex problems into value creating opportunities for customers by showing the power of SAP software in support of customers’ digital transformation goals.

Matt Ruff – Industry 4.0 Hub Specialist, SAP Labs
Matt graduated with an engineering degree and spent his early career learning the ins and outs of cocoa & chocolate processing. Matt has ?5+ years of consulting experience with SAP’s Manufacturing Portfolio & is a previous owner of SAP’s Model Company for Connected Manufacturing. Currently, Matt is focused on optimizing and integrating business processes and showcasing SAP’s strengths to customers.

Vivekananda Panigrahy – Industry 4.0 Hub Specialist, SAP Labs
Vivek is a full-stack developer on SAP Business Technology Platform with 9+ years of experience and a strong focus on design patterns, solution architecture, and problem-solving in service of customer success. Currently, Vivek supports SAP’s Digital Supply Chain Industry 4.0 Innovation Hub team, helping customers throughout their digital transformation journeys.


The post Digitize Manufacturing Operations with Industry 4.0 appeared first on Internet of Business.