For chemists, finding just the right molecule for a particular application can be like searching for a needle in a haystack. With several million compounds to choose from, chemists often must resort to intuition when trying to solve complex problems around chemical processes.

US multinational Dow Chemical was working with a pulp and paper manufacturer to improve inefficiencies in its chemical process, with the goal of producing a better pulp yield through a safer process. But to hit that narrow target, Dow needed a faster, more thorough method of identifying candidate molecules. So it launched a collaboration with Chemical Abstracts Service (CAS), a division of the American Chemical Society, to leverage CAS SciFinder-n, which, unlike generic search engines, is optimized for searching for chemical molecules across an electronic catalog of more than 200 million compounds.

The resulting project, SmartSearch, has earned Dow a spot on the 2023 CIO 100 Award list for IT leadership and innovation, and now enables thousands of Dow chemists to discover in minutes molecules that once took weeks to identify.

“This is what caused us to go down the road of finding a faster way to search molecules through a database. That was the start of this project,” says Nathan Wilmot, IT director of data client partnerships in Dow’s enterprise data and analytics group. “The innovation here is previously [chemists] relied very heavily on intuition and thumbing through catalogs to identify molecules.”

With SmartSearch for Dow, company chemists can filter down the “chemical, physical, and commercial availability [of molecules, as well as the] health and safety properties for a molecule or set of molecules that might fit a given application” in a matter of minutes, he says.

In its pulp pilot, for example, CAS SciFinder helped Dow identify a set of 8 to 12 safer, more sustainable molecules as candidates to replace the manufacturer’s existing material.

Dow has since narrowed that pool down to two to four molecular options and expects to commercialize one of those within the next year or two, Wilmot says. “It eliminates a lot of experimentation time … and accelerates our research quite dramatically.”

A partnership based on chemistry

Chemical Abstracts Service was founded in 1903 as a resource for chemists globally and is the “keeper” of a registry that contains more than 200 million substances, indexed and curated from a variety of sources, including patents and meeting notes, according to a CAS spokeswoman.

SmartSearch for Dow is a custom project that uses the CAS SciFinder proprietary search technology and a platform with a “knowledge graph” of more than 2.5 billion entities and 20-plus billion chemical relationships, according to Venki Rao, CTO of CAS.

Its content is supported by modern natural language processing (NLP) and neural network–based relevance technologies, Rao says, noting that the union of its platform with new AI technologies “appended” to the existing platform provides Dow with new capabilities to power its R&D and achieve next-generation business goals.

Dow has partnered with CAS for many years but approached the company about a deeper collaboration that would unify the strengths of each to create SmartSearch for Dow, says Dow’s Wilmot. He would not say whether the collaboration will extend to actual development or the use of AI in R&D, but the possibilities, business and scientific, could be life-altering.

Millions of molecules exist, and digital tools for tying the right molecule to the right application did not exist before; the capability will have a profound impact on Dow’s business, Wilmot says.

“We have worked closely with them as an innovation partner specifically in this area since 2019–2020, when [CAS] started to refresh its business model and build out a services organization,” Wilmot says. “They’ve done a great job of collaborating with us on this and other areas.”

Next steps in computational chemistry

Dow’s chemists continue to build their own database of chemicals and are augmenting it with CAS’s expertise to “make better decisions that they could not have otherwise,” Wilmot says.

SmartSearch for Dow is now being used on a number of projects across Dow’s plastics, silicone, and polyurethane businesses, as well as for many industrial solutions, the IT director notes. At any given time, Dow has 10 to 12 people committed to the collaborative solution.

“Instead of a needle in a haystack [approach], SmartSearch allows us to find the best molecule based on the data we have,” Wilmot says. “We can get this to all the researchers to find the best targets available — the most sustainable, cost-effective, high-performance, and high-value materials as quickly as they can, giving us a competitive advantage.”

All born of what Wilmot calls “a cool collaboration between our IT organization with an external partner,” which, as it scales, will help Dow be more efficient and sustainable, he says, with potential impacts in industrial chemistry, catalysis, and other areas to continue producing sustainable and safe materials across the industry. “That’s the critical piece for us.”


For research institutions, a solid IT foundation can prove to be the difference in delivering meaningful results for scientific endeavors — and thereby in securing valuable funding for further research.

To that end, University of California, Riverside has launched an ambitious cloud transformation to shift from a small on-premises data center to an advanced research platform powered by Google Cloud Platform and its various service offerings.

As part of a three-year partnership with Google Public Sector, which kicked off in January, UC Riverside aims to empower its researchers in computer science, materials and quantum engineering, genomics, and precision agriculture to fully exploit Google’s location-agnostic application modernization platform, as well as its scalable compute and high performance computing (HPC) capabilities, says Matthew Gunkel, CIO of IT solutions at UCR.

Gunkel enlisted Google Public Sector professional services specifically as part of a strategy to quickly evolve UC Riverside’s small data center into an advanced cloud hub with robust research computing capabilities that would enable researchers to better compete for grants and funding opportunities.

“We identified Google as being well aligned with us strategically,” says Gunkel. “They have an agile infrastructure. They have the ability to facilitate industry-leading service concepts in additional clouds through a service they run called Anthos.”

Google’s Anthos is a hybrid cloud container platform for managing Kubernetes workloads across on-premises and public cloud environments. Gunkel also cited Google’s Looker and BigQuery data analysis tools and its Chronicle security operations suite as important for enabling the university to operate a wide variety of applications and research in the cloud.

A partnership and cloud training model

With roughly 180 staff members, UC Riverside IT is relatively small, with largely traditional on-premises IT skills. As such, migrating to the cloud alone was not part of Gunkel’s plan.

Google’s assistance in developing a more efficient cloud architecture and training UCR’s IT staff in cloud technologies has been an immeasurably valuable service, he says, adding that Google is in a support role and is not running the show. UCR’s cloud architecture, for example, has been designed to be location-agnostic so the university is not locked into any one vendor and can adopt a multicloud platform over the long term.

“The services engagement is consulting and training to assist us in moving initial cloud workloads and to assist in our architecture to align to GCP services,” Gunkel says. “This is a ‘teach us to fish’ model. It’s all our work.”

UC Riverside IT is well on its way to migrating its core data to the cloud, developing its research platform, and shifting a range of applications to support the needs of its user base, which ranges from quantum engineering researchers to administrators, faculty, and students.

To date, UCR has moved the “vast majority of our data stores to Google,” Gunkel says, noting that his staff is currently refining the architecture and ETL processes for management and organization of the data long term.

In addition, UC Riverside IT is aligning its data to be accessed from Looker, Google’s enterprise BI and analytics platform, through which UCR will be deploying its Oracle Finance application for scaled reporting. UC Riverside is also rewriting a number of legacy applications to be cloud-native while revamping others for the cloud; there will be no ‘lift and shift’ of any applications, Gunkel says.

To that end, Google helped UC Riverside re-architect and migrate certain legacy services, including an LDAP configuration on a Solaris Unix server, as part of a process of identifying increased efficiencies for the deployment and operation of those services, which has been “an educational experience for a lot of my staff,” Gunkel says, noting that the overall transformation has required “cultural change management.”

Empowering research in the cloud

But the university’s evolving research hub is the crown jewel of the cloud migration.

“We have been working with a number of researchers on a platform that we are calling ‘Ursa Major’ where we committed to a number of compute instances and storage and RAM and GPUs that would be available to our researchers over a three-year time period,” Gunkel says.

Jim Kennedy, CTO of UC Riverside, says Google is helping architect the research hub and is also helping the IT chiefs make connections with researchers beyond UCR to help train UCR’s research faculty on Ursa Major, which will expand and grow beyond the three-year agreement with Google.

“Google connects us to experts in various research fields, and have conversations with our faculty directly, such as our genomics researcher on campus. There are experts on Google’s side, too,” Kennedy says.

Google also helped Gunkel and Kennedy extend the university’s subscription-based compute and storage services to researchers in a multitude of disciplines. In the past, if a materials engineering researcher wanted to run workloads on several thousand processors, they would often have to write proposals to gain access to external supercomputer clusters.

With HPC requiring vast computing power, Gunkel also notes the benefit for efficiency and sustainability of shifting those workloads to the cloud. “We’re in a fairly constrained region against mountains and our ability to bring power into the university is something we’re constantly battling,” Gunkel says. “One of the things our researchers were very concerned about was [building] a sustainable, more eco-friendly solution. It’s something UCR values heavily but it’s also a challenge for us locally.”

The migration, still in its early days, is being designed to accommodate a wide range of computing constituencies. For instance, UCR is also using Salesforce and MuleSoft, as well as Google’s API layer, to provide the “connective tissue” required across the university’s many enterprise platforms.

“The best way to think of the university is really as a collection or community of small businesses,” Gunkel says. “A lot of what we try to provide on the service stack side are tools that empower all of them in their different endeavors.”


Strong performances in software and consulting helped IBM’s profit and revenue increase in the first quarter, even as a post-pandemic slowdown hit much of the technology industry.

IBM’s software and consulting revenue both rose 3% year over year. In the software segment, IBM’s enterprise Linux unit, Red Hat, saw growth of 8%, while application operations saw the highest level of growth in the consulting segment, rising by 7%.

The strong showing from software and consulting helped boost total profit by a healthy 26%, to $927 million, and revenue by 0.4%, to $14.3 billion, for the quarter ending March 31, according to IBM’s earnings report, issued late Thursday. The strong dollar had a negative effect on sales; in constant currency, which strips out the effect of currency fluctuations, IBM revenue rose by 4%.
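The gap between the two growth figures comes down to how constant-currency results are computed: current-period foreign revenue is restated at the prior period's exchange rates, removing the effect of currency swings. A minimal sketch, using entirely hypothetical figures (not IBM's actual currency mix):

```python
# Illustrative sketch of a constant-currency growth calculation.
# All figures below are hypothetical, chosen only to show the mechanics.

def constant_currency_growth(prior_revenue, current_local, prior_rates):
    """prior_revenue: total prior-period revenue in USD.
    current_local: {currency: current-period revenue in local units}.
    prior_rates: {currency: USD per local unit in the prior period}.
    Restates current-period revenue at prior-period FX rates."""
    restated = sum(amount * prior_rates[ccy]
                   for ccy, amount in current_local.items())
    return (restated - prior_revenue) / prior_revenue

prior = 10_000.0                                  # USD, prior period
current = {"USD": 6_200.0, "EUR": 4_000.0}        # local-currency revenue
prior_fx = {"USD": 1.0, "EUR": 1.05}              # prior-period USD rates

# EUR revenue grew in local terms, but a stronger dollar shrank its
# reported USD value; at prior rates the underlying growth shows through.
print(f"{constant_currency_growth(prior, current, prior_fx):.1%}")
```

The same local-currency results can thus read as near-flat at current rates while showing mid-single-digit growth at constant currency.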

IBM now receives about three-quarters of its annual revenue from tech services, but the company is seeing some deceleration in consulting from the previous robust growth levels, chairman and CEO Arvind Krishna said on a call with analysts after the results had been published.

“More recently, clients are prioritizing digital transformation projects that focus on cost takeout, productivity and quick returns,” he said.

This trend was further reflected in IBM’s infrastructure segment, which was down 3.7% year on year, although within the hybrid infrastructure business, z Systems revenue was up 7%. This was driven by fourth-quarter z16 availability, with performance outpacing prior cycles as customers leveraged the platform for AI at scale and energy efficiency, Krishna told analysts.

Focus on artificial intelligence

Commenting on IBM’s ongoing focus, Krishna said that hybrid cloud and artificial intelligence would be a priority for driving both business outcomes and innovation this year.

He noted that while AI is projected to add $16 trillion to the global economy by 2030, enterprise use cases differ widely from the AI being offered to consumers, given the enterprise’s need for more accurate results, trusted data, and governance tools.

“We are seeing a lot more interest from business in using AI to boost productivity and reduce costs,” Krishna said. “Productivity gains will come from enterprises turning their workflows into simpler automated processes with AI.”

Krishna said IBM will be focused on using AI to help organizations enhance internal audit and compliance processes and automate call center responses to improve accuracy and customer satisfaction.

GlobalFoundries lawsuit

On the same day IBM posted its first quarter 2023 financial results, chip manufacturer GlobalFoundries announced it was launching a lawsuit against the company, accusing it of unlawfully sharing confidential intellectual property and trade secrets with Japanese chip maker Rapidus.

IBM announced in 2021 that it had developed a 2nm chip, and then late last year unveiled a partnership with Rapidus calling for commercial production of 2nm chips, with manufacturing done in Japan. Chips made with the 2nm manufacturing process will be used for a wide range of applications and machines, from laptops to high performance computing servers, and are expected to slash the carbon footprint of data centers due to optimized performance.

The GlobalFoundries complaint asserts that IBM unlawfully disclosed its confidential IP and trade secrets after IBM sold its microelectronics business in 2015.

The company also alleges that IBM has been undertaking “unlawful recruitment efforts” by poaching GlobalFoundries engineering staff, a practice that it alleges has accelerated since the Rapidus partnership was announced in December 2022.

GlobalFoundries has asked the court to order an end to those recruitment efforts and is seeking compensatory and punitive damages as well as an injunction against IBM to stop the “unlawful disclosure and use” of GlobalFoundries’ trade secrets.

IBM has denied the allegations, which are contained in a complaint filed in federal court in the Southern District of New York.


Theo Blackwell MBE, Chief Digital Officer at the London Mayor’s Office, sits down with CIO UK editor Doug Drinkwater on CIO UK Leadership Live to give a whistle-stop tour on CDO misconceptions, smart city futures, fostering local government collaboration and balancing technological innovation with digital inclusion.


To reduce its carbon footprint and mitigate climate change, the National Hockey League (NHL) has turned to data and analytics to gauge the sustainability performance of the arenas where its teams play. In October, the league, with partner SAP, launched NHL Venue Metrics, a sustainability platform that teams and their venue partners can use for data collection, validation, and reporting and insights.

The new platform furthers the sustainability journey the NHL started in 2010 when it inaugurated its NHL Green initiative to promote sustainable business practices across the league. It followed that in 2014 with the first sustainability report issued by a North American professional sports league and, in 2015, a commitment to counterbalance the league’s entire carbon footprint for three consecutive seasons.

“It’s meaningful for us because the roots of our game are people playing on frozen ponds,” says Omar Mitchell, vice president of sustainable infrastructure and growth initiatives at the NHL. “We need fresh water; we need cold weather. And when it comes to arenas, when you think about it, we play in a giant refrigerator. So, we use a lot of energy, a lot of resources, to play on a frozen water sheet.”

When the NHL began its sustainability journey, Mitchell’s role did not yet exist. He joined the league in 2012 as its first sustainability director with a mandate to find ways to embed sustainable business practices across the league and its member clubs.

“The most important thing about any sustainability platform is you cannot impact what you cannot measure,” Mitchell says. “That’s consistent across whatever your functional role, whatever your industry focuses on. The only way you can really advance change is by measuring, and then from measurement, impact. Sustainability is all about continuous business improvement. Sustainability is all about innovation and business optimization. The only way for you to speak in the language of business is to have the data that help you derive those insights.”

Benchmarking best practices

Driving sustainability practices in the NHL has unique challenges given the league’s structure. The NHL consists of 32 franchises across North America (seven teams in Canada and 25 in the US), each of which is owned and operated by separate entities, most of which also own and operate a venue. Washington, DC-based Monumental Sports & Entertainment, for example, owns the NHL’s Washington Capitals, NBA’s Washington Wizards, WNBA’s Washington Mystics, and the Capital One Arena in DC. The NHL can influence and promote practices among its franchisees but cannot mandate them.

Mitchell notes that it often helps to showcase the business benefits of various initiatives in addition to their environmental benefits. For example, more than two-thirds of NHL arenas have converted to LED game lights, leading to substantial energy savings in those facilities.

“These are the lights that illuminate the ice surface,” Mitchell says. “The old technology was 1,000-watt metal halide lights.”
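The business case for such a retrofit is simple arithmetic. The 1,000-watt metal halide figure is from the article; the LED wattage, fixture count, operating hours, and electricity rate below are illustrative assumptions, not NHL data:

```python
# Back-of-the-envelope sketch of arena lighting energy savings.
METAL_HALIDE_W = 1_000   # watts per fixture (from the article)
LED_W = 400              # watts per fixture (assumed)
FIXTURES = 200           # fixtures per arena (assumed)
HOURS_PER_YEAR = 2_000   # operating hours (assumed)
USD_PER_KWH = 0.12       # electricity rate (assumed)

def annual_kwh(watts_per_fixture):
    """Annual energy use in kWh for the whole fixture bank."""
    return watts_per_fixture * FIXTURES * HOURS_PER_YEAR / 1_000

saved_kwh = annual_kwh(METAL_HALIDE_W) - annual_kwh(LED_W)
print(f"Energy saved: {saved_kwh:,.0f} kWh/yr "
      f"(~${saved_kwh * USD_PER_KWH:,.0f}/yr)")
```

Even with conservative assumptions, halving or better the per-fixture wattage across a few hundred game lights yields the "substantial energy savings" the league cites.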

Mitchell notes the new LED lighting technology actually makes the ice sheet look brighter, making the surface pop.

“We’re not telling the venues, ‘You must change your lights,’” Mitchell says. “We’re showing them all of the examples and best cases for why this innovation is so important and successful, as well as the benefits from an environmental standpoint. So, the majority of all of our buildings now have LED lights.”

Mitchell says the league is thinking of NHL Venue Metrics in the same way.

“We are using our technology and our platform to write the rules of how they should be measuring their venue operations and reporting against those venue operations and providing insights into benchmarks of how they should be operating their venue,” he says.

From there, the league can gather and collate those results to spot trends, gain insight into where venues are doing better or worse, and share best practices.

“Benchmarking, analyzing, and then showcasing those best practices, that’s the power of this tool,” Mitchell adds.

IT-driven sustainability

The league released sustainability reports in 2014 and 2018. In the process, it determined that venue operations comprise about 70% of its overall carbon footprint. That finding led it to ask SAP for help creating a technology solution that would allow it to track the carbon output of venues and ultimately start moving the needle in the right direction.

NHL Venue Metrics is an end-to-end, cloud-based platform to help venues measure and analyze the carbon footprint they generate across areas such as energy, water, waste, and recycling. It consists of three main components:

Data collection: An interface platform sits between the clubs and their venue partners, allowing them to share relevant resource consumption and environmental data with the league.

Data calculation and validation: A processing and verification engine enables the league to track data consistency, identify reporting errors, and apply calculation formulas to create the league’s carbon inventory.

Data reporting and insights: A visualization dashboard shows environmental, consumption, and financial metrics.
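The calculation-and-validation step can be pictured as converting each venue's consumption readings into CO2-equivalent emissions while flagging reporting errors. This is a minimal sketch of that idea, not the NHL/SAP implementation; the emission factors and records are illustrative:

```python
# Sketch of a carbon-inventory calculation with basic validation.
# Emission factors (kg CO2e per unit) are illustrative placeholders,
# not official figures.

EMISSION_FACTORS = {
    ("electricity", "kWh"): 0.4,
    ("natural_gas", "therm"): 5.3,
    ("water", "m3"): 0.34,
}

def carbon_inventory(records):
    """records: list of (resource, unit, quantity) tuples from a venue.
    Returns (total kg CO2e, list of reporting errors)."""
    total = 0.0
    errors = []
    for resource, unit, qty in records:
        factor = EMISSION_FACTORS.get((resource, unit))
        if factor is None:
            # Unit-consistency check: unknown resource/unit pairs are
            # flagged rather than silently skewing the inventory.
            errors.append(f"unknown resource/unit: {resource} in {unit}")
        elif qty < 0:
            errors.append(f"negative reading for {resource}")
        else:
            total += qty * factor
    return total, errors

venue = [("electricity", "kWh", 10_000), ("water", "m3", 500),
         ("electricity", "MWh", 3)]      # wrong unit -> flagged
total, errs = carbon_inventory(venue)
print(f"{total:,.0f} kg CO2e; {len(errs)} reporting error(s)")
```

Aggregating these per-venue totals is what lets the league benchmark arenas against one another.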

The operational data is processed using SAP HANA Cloud and visualized with SAP Analytics Cloud.

SAP is the technical lead on NHL Venue Metrics. Mitchell’s team also works closely with the NHL’s club business and analytics group for data capture and the processing of ticketing and premium concessions data, for example, and with the IT group to ensure the platform and its data are secure.

The league launched the NHL Venue Metrics platform in October, so it’s still in the early stages. At this point, Mitchell says the team has learned a lot about data collection.

“This is an iterative process where we’re getting constant feedback from the venues about things like units of measure and what’s important for verification of the data that’s reported,” Mitchell says.

As more data comes in, the league will be looking to identify resource consumption reduction opportunities and operational enhancements such as increasing diversion of waste from landfill to recycling. Mitchell’s hope is to glean insights from venues that are doing well, build those insights into best practices, and share them with other clubs to be adopted at their venues.

“That’s what success will look like,” he says. “That’s where we will move the needle on really embedding environmental sustainability across the league.”


Efficient supply chain operations are increasingly vital to business success, and for many enterprises, IT is the answer.

With over 2,000 suppliers and 35,000 components, Kanpur-based Lohia Group was facing challenges in managing its vendors and streamlining its supply chain. The capital goods company, which has been in textiles and flexible packaging for more than three decades, is a major supplier of end-to-end machinery for the flexible woven (polypropylene and high-density polyethylene) packaging industry.

“In the absence of an integrated system, there was no control on vendor supply, which led to an increased and unbalanced inventory,” says Jagdip Kumar, CIO of Lohia. “There was also a mismatch between availability of stock and customer deliveries. At the warehouse level, we had no visibility with respect to what inventory we had and where it was located.”

Those issues were compounded by the fact that the lead time for certain components required to fulfill customer orders ranges from four to eight months. With such long component delivery cycles, client requirements often change. “The customer would want a different model of the machine, which required different components. As we used Excel and email, we were unable to quickly make course corrections,” Kumar says.

Moreover, roughly 35% of the components involved in each customer order are customized based on the customer’s specific requirements. Long lead times and a lack of visibility at the supplier’s end meant procurement planning for these components was challenging, he says, adding that, in the absence of any ability to forecast demand, Lohia was often saddled with unbalanced inventory, either excess or shortfall.

The solution? Better IT.

Managing suppliers to enhance efficiency and customer experience

To manage its inventory and create a win-win situation for the company and its suppliers, Kumar opted to implement a vendor management solution.

“The solution was conceptualized with the goal of removing the manual effort required during the procurement process by automating most of the tasks of the company and the supplier while providing the updates that the former needed,” says Kumar.

“We roped in KPMG to develop the vendor portal for us on this SAP platform, which is developed on SAP BTP (Business Technology Platform), a business-centric, open, and unified platform for the entire SAP ecosystem,” he says.

The application was developed using SAP Fiori/UI5, while the backend was developed using SAP OData/ABAP services. The cloud-based front end is integrated with Lohia’s ERP system, thereby providing all relevant information in real time. It took four months to implement the solution, which went live in September 2021.

With the new deployment, the company now knows about changes as they happen, be it the non-availability of material, a customer not making a payment, or a customer wanting to delay delivery of an ordered machine. “All these changes now get communicated to the vendors, who prepone or postpone accordingly. Armed with complete visibility, we were able to reduce our inventory by 10%, which resulted in cost savings of around ₹200 million,” says Kumar.

The vendor portal has also automated several tasks such as schedule generation and gate entry, which have led to increases in productivity and efficiency.

“The schedules are now automatically generated through MRP [material requirement planning] giving visibility to our suppliers for the next three to four months, which helps them to plan their raw material requirements in advance and provide us timely material,” Kumar says. The result is a material shortage reduction of 15% and a 1.5X increase in productivity. “It has also helped us to give more firm commitments to our customers and our customers delivery has improved significantly, increasing customer trust,” he says.

“Earlier there was always a crowd at the gate, as the entry of each truck took 10-15 minutes. The new solution automatically picks up the consignment details when the vendor ships it. At the gate, only the barcode is scanned, and the truck is allowed entry. With 100 trucks coming in every day, we now save 200-300 minutes of precious time daily,” he says.

Kumar’s in-house development team worked in tandem with KPMG to build custom capabilities on the platform, such as automatic scheduling and FIFO (first in, first out) inventory valuation.
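FIFO valuation means issued stock is costed against the oldest receipts first, layer by layer. A minimal sketch of the idea, with illustrative quantities and costs (not Lohia's actual implementation):

```python
# Sketch of FIFO (first in, first out) inventory valuation.
from collections import deque

def fifo_issue(layers, qty):
    """layers: deque of [units, unit_cost] receipt layers, oldest first.
    Issues `qty` units, consuming the oldest layers first, and returns
    the total cost of the issued stock."""
    cost = 0.0
    while qty > 0:
        if not layers:
            raise ValueError("insufficient stock")
        units, unit_cost = layers[0]
        take = min(units, qty)
        cost += take * unit_cost
        qty -= take
        if take == units:
            layers.popleft()          # oldest layer fully consumed
        else:
            layers[0][0] -= take      # partially consume oldest layer
    return cost

# Illustrative: two receipts at different unit costs, oldest first.
stock = deque([[100, 10.0], [50, 12.0]])
print(fifo_issue(stock, 120))         # 100 units @ 10 + 20 units @ 12
```

The remaining layer after the issue reflects the newest stock, which is exactly what makes FIFO reporting match physical rotation of parts through a warehouse.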

To ensure suppliers would adopt the solution, Lohia deployed its own team at each vendor’s premises for two to three days to teach them how to use the portal.

“We showcased the benefits that they could gain over the next two to three months by using the solution,” Kumar says. “We have been able to onboard 200 suppliers, who provide 80% of the components, on this portal. We may touch 90-95% by the end of this year.”

Streamlining warehouse operations to enhance productivity

At the company’s central warehouse in Kanpur, Kumar faced traceability issues related to its spare parts business. Also, stock was spread across multiple locations and most processes were manual, leading to inefficient and inaccurate spare parts dispatches.

“There were instances when a customer asked for 100 parts, and we supplied only 90 parts. There were also cases wherein a customer had asked for two different parts in different quantities, and we dispatched the entire quantity comprising only one part,” says Kumar. “Then there was the issue of preference. As we take all the payment upfront from our customers, our preference is to supply the spare part on a ‘first come first serve’ basis. However, there could be another customer whose factory was down because he was awaiting a part. We could not prioritize that customer’s delivery over others.”

Another bottleneck: the contract workers were not literate, and the company depended too heavily on their experience.

To overcome these problems, and to integrate its supply chain logistics with its warehouse and distribution processes, Lohia partnered with KPMG to deploy the SAP EWM (Extended Warehouse Management) application on the cloud.

“We decided to optimize the warehouse processes with the usage of barcodes, QR codes, and Wi-Fi-enabled RF-based devices. There was also a need to synchronize warehouse activities through the integration of warehouse processes with tracking and traceability functions,” says Kumar. The implementation commenced on April 1, 2022, and went live on August 1, 2022.

To achieve traceability, Kumar barcoded Lohia’s entire stock. “We now get a list from the system on the dispatchable order and its sequence. Earlier there was a lot of time wastage, as we didn’t know which part was kept in which portion of the warehouse. Employees no longer take the zig-zag path as the new solution provides the complete path and the sequence in which they must go and pick up the material,” Kumar says.

Kumar also implemented aATP (advanced available-to-promise), which responds to order fulfillment inquiries in sales and production planning. This feature within the EWM solution provides a check based on the present stock situation and any planned or anticipated stock receipts.
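The core of an available-to-promise check is simple: walk forward through on-hand stock plus planned receipts until the requested quantity is covered, and return the earliest date the full order can be promised. A minimal sketch of that logic, with illustrative data (not SAP's aATP internals):

```python
# Sketch of an available-to-promise (ATP) style check: confirm an order
# from present stock plus planned/anticipated receipts. Dates and
# quantities are illustrative.
from datetime import date

def earliest_full_availability(on_hand, planned_receipts, qty):
    """planned_receipts: list of (receipt_date, units), sorted by date.
    Returns the earliest date the full quantity is available, or None
    if it cannot be promised from the current plan."""
    available = on_hand
    if available >= qty:
        return date.today()           # promisable from stock on hand
    for receipt_date, units in planned_receipts:
        available += units
        if available >= qty:
            return receipt_date       # covered once this receipt lands
    return None

receipts = [(date(2022, 9, 10), 30), (date(2022, 9, 25), 50)]
print(earliest_full_availability(40, receipts, 100))
```

In practice an ATP engine also nets out quantities already promised to other orders, which is what lets a firm delivery commitment be given at order entry.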

“The outcome was as per the expectations. There was improved inventory visibility across the warehouse as well as in-transit stock. The EWM dashboard helped warehouse supervisor to have controls on inbound, outbound, stocks overview, resource management, and physical inventory,” says Kumar.

“Earlier one person used to complete only 30 to 32 parts in a day but after this implementation, the same person dispatches 47 to 48 parts in a day, which is a significant jump of 50% in productivity. The entire process has become 100% accurate with no wrong supply. If there is short supply, it is known to us in advance. There is also a 25% reduction in overall turnaround time in inbound and outbound processes,” he adds.


Carhartt’s signature workwear is near-ubiquitous, and its continuing presence on factory floors and at skate parks alike is fueled in part by an ongoing digital transformation that is advancing the 133-year-old Midwest company’s operations to make the most of advanced digital technologies, including the cloud, data analytics, and AI.

The company, which operates four factories in Kentucky and Tennessee and designs all its products at its Dearborn, Mich., headquarters, began its digital transformation roughly four years ago. Today, more than 90% of its applications run in the cloud, with most of its data housed and analyzed in a homegrown enterprise data warehouse.

Katrina Agusti, a 19-year veteran of the company who was named CIO six months ago, played a pivotal role in retooling the workwear retailer for the modern era under previous CIO John Hill.

Now Agusti, who began her Carhartt tenure as a senior programmer analyst, is charged with leading the company’s transformation into its next phase, one that is accelerating daily with the barrage of complex technologies changing the global supply chain and business practices, Agusti says.

As part of that transformation, Agusti plans to integrate a data lake into the company’s data architecture and expects two AI proofs of concept (POCs) to be ready to move into production within the quarter. Like all manufacturers in the information age, Carhartt is also increasingly relying on automation and robotics at its service and fulfillment centers as it faces challenges in finding talent on the technology side and in the labor force to meet growing demand.

And demand certainly is on the rise for the workwear manufacturer, which is currently experiencing double-digit growth in all three of its lines of business — direct to consumer, direct to business, and wholesale.

Tuning a transformation to make the most of data

Carhartt launched its Cloud Express initiative as part of a foundational transformation to shift the company’s 220 applications to Microsoft Azure. Two legacy applications, its warehouse management solution and its payroll and benefits solutions, still run on premises, but those applications may soon be replaced in favor of cloud-native solutions, Agusti says.

Moving to the cloud — even amid the pandemic — was a major win for Carhartt. Aside from the obvious speed-to-market and scalability gains, the vast improvements in stability, performance, uptime, maintenance, failover monitoring, and alerting have automated many of the costly, time-consuming IT tasks, freeing up the IT team to tackle advanced data analytics and to experiment with other new technologies.

Agusti says Carhartt will likely embrace a multicloud architecture in the long run, but for now she and her team are ramping up their cloud expertise in part through conversations with other CIOs about best practices.

“We’re still learning and building the muscle internally to properly run in the cloud and how to manage in the cloud, and not just the management of systems but how to size them,” she says, adding that she is also homing in on data architecture and retention strategies. “It’s a different beast to manage workloads in the cloud versus workloads on premise. We’re still in that journey.”

Like many CIOs, Carhartt’s top digital leader is aware that data is the key to making advanced technologies work. Carhartt opted to build its own enterprise data warehouse even as it built a data lake with Microsoft and Databricks to ensure that its handful of data scientists have both engines with which to manipulate structured and unstructured data sets.

“Today, we backflush our data lake through our data warehouse. Architecturally, what we’d like to do is bring the data in first into the data lake, whether it is structured or unstructured, and then feed it into our data warehouse,” Agusti says, adding that they continue to design a data architecture that is ideal for different data sets.
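The lake-first flow Agusti describes, landing every record raw and then promoting schema-conforming records into the warehouse, can be sketched as follows (illustrative only, with hypothetical names; Carhartt's real pipeline runs on Microsoft and Databricks tooling):

```python
import json

def ingest_to_lake(record, lake):
    """Land every record in the lake as-is, structured or not."""
    lake.append(json.dumps(record))
    return record

def feed_warehouse(lake, warehouse, required_fields=("sku", "qty")):
    """Promote only records that conform to the warehouse schema;
    unstructured records stay in the lake for data science use."""
    for raw in lake:
        record = json.loads(raw)
        if all(f in record for f in required_fields):
            warehouse.append(record)

lake, warehouse = [], []
ingest_to_lake({"sku": "K87", "qty": 3}, lake)             # structured sale
ingest_to_lake({"review_text": "great pocket tee"}, lake)  # unstructured
feed_warehouse(lake, warehouse)
print(len(lake), len(warehouse))  # → 2 1
```

The point of the ordering is that the lake keeps everything, while the warehouse only ever sees curated, certified shapes, which is the reverse of the backflush pattern Carhartt runs today.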

She does not currently have plans to retire the homegrown data warehouse in favor of the data lake because the team has customized many types of certified data sets for it.

“The data lake will be more in service to our data science team and consumer-facing teams that are building out journeys using unstructured data to inform that personalization,” Agusti says, noting Carhartt’s six data scientists have built several machine learning models that are currently in test mode.

Two such projects are nearing production, the first of which supports Carhartt’s replenishment of inventory for its five distribution centers and three different businesses.

“We’re trying to use it for decision support and to plan all of that inventory into different distribution centers based on service levels,” she says, noting that the model can optimize Carhartt’s distribution network by taking into account capacities as well as supply and demand and inventory levels.
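A toy version of service-level-aware allocation gives the flavor of such decision support (purely illustrative; all names and numbers here are hypothetical, and Carhartt's actual model is a trained ML model rather than a fixed formula):

```python
def allocate(total_units, centers):
    """Split inventory across distribution centers in proportion to
    forecast demand weighted by each center's service-level target.
    Simple rounding may leave a unit or two unallocated in general."""
    weights = {name: demand * svc for name, (demand, svc) in centers.items()}
    total_w = sum(weights.values())
    return {name: round(total_units * w / total_w) for name, w in weights.items()}

plan = allocate(1000, {
    "DC-East":  (600, 0.98),   # (forecast demand, service-level target)
    "DC-West":  (300, 0.95),
    "DC-South": (100, 0.90),
})
print(plan)
```

A real optimizer would also respect the capacity, supply, and in-transit constraints Agusti mentions; this sketch only shows how service levels can tilt the split.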

The second POC is aimed at helping data scientists collect consumer data that can be leveraged to “personalize the consumer journey,” including demographics information and data from consumer surveys, Agusti says.

The power of tech

Like many CIOs, Agusti’s biggest challenge is change management — especially when it comes to persuading employees that the company’s AI models really work.

“Teams are skeptical that technology can provide the decision support and automation that they do today,” the CIO says. “We have a lot of use cases and we’re running them in POC mode because we need to prove to our end users and business community that these models can make those decisions for you.”

Agusti expects many companies are in this transition mode. “There are different functions along the maturity curve,” she says of the AI efforts under way, “but I think there are so many potential applications that can leverage technology especially in data analytical spaces.”

To renew her conviction in the power of technology, all the CIO has to do is think about how, without investments in technology and talent, the pandemic might have derailed the company’s business.

Early in the pandemic, many essential workers needed to be equipped with Carhartt work gear for extra protection. As a result, the company’s revenue stream grew in the double digits, even when certain business segments were curtailed due to widespread work stoppages.

Once work stoppages started taking hold, Carhartt gained a rare glimpse into its supply chain, enabling its data analysts to view each of its steps in exquisite detail, like the individual frames in a film.

“What the pandemic did was create the need for that visibility and proactive exception management,” Agusti says. “Every leg of that journey becomes important when you’re having disruption. It was the catalyst for us to get more granular in the visibility and exception management of every single step in the supply chain.”

Thanks to that visibility — and IT’s push to keep Carhartt’s businesses humming — the company is in a better place with its supply chain. It’s still not at the “predictable” level that it was pre-pandemic, Agusti says, but “we’re starting to see logistical lead times level out and improvements of lead times for goods creation getting better.”

Analytics, Artificial Intelligence, Data Management

For pharmaceutical companies in the digital era, intense pressure to achieve medical miracles falls as much on the shoulders of CIOs as on lead scientists.

Rigid requirements to ensure the accuracy of data and the veracity of scientific formulas, machine learning algorithms, and data tools are common in modern laboratories.

Before being promoted to CIO at Regeneron Pharmaceuticals in 2018, Bob McCowan had run the data center infrastructure for the $81.5 billion company’s scientific, commercial, and manufacturing businesses since joining the company in 2014.

In that capacity, he knew that, in addition to having the right team and technical building blocks in place, data was the key to Regeneron’s future success.

“It is all about the data. Everything we do is data-driven, and at that time, we were very datacenter-driven, but the technology had lots of limitations,” says McCowan. “It worked for us to keep the company successful, but it wasn’t giving us the scale and horsepower needed.”

To achieve what the company would need going forward, McCowan knew Regeneron would have to undergo a major transformation and build an enhanced data pipeline that could ingest data from up to 1,000 data sources in “analytical ready formats” for both the business and the scientists to consume, the CIO says.

And to do this, a move to the cloud was essential. “The only way to enable our scientists and scale up and grow in the future is to really embrace the cloud, and not just in terms of computational power and storage, but being able to deploy into different environments, different countries,” McCowan says. “If you are not on the cloud, you are going to be left behind.”

Empowering scientists through the cloud

McCowan set about migrating Regeneron to Amazon Web Services in late 2018. By 2020, IT had moved roughly 60% of all company data to the cloud — no minor task for an international firm that generated $16 billion in revenue in 2021, employs more than 10,000 people, and holds nine FDA- and EMA-approved drugs with an additional 30 in clinical trials.

The company’s multicloud infrastructure has since expanded to include Microsoft Azure for business applications and Google Cloud Platform to provide its scientists with a greater array of options for experimentation.

“Google created some very interesting algorithms and tools that aren’t available in AWS,” McCowan says. “And some things [Regeneron’s scientists] can only try out in the Google cloud. So, we are using all three mainstream clouds, but really the core of it is around AWS.”

Due to the complexity of Regeneron’s experimentation and testing, the company uses a variety of standard SaaS tools for analysis. But the crown jewel of its analytics operations is its enhanced cloud-based MetaBio Data Discovery Platform, which provides a wide array of data services, data management tools, and machine learning tools as “icing on the cake,” McCowan says.

MetaBio, which received a 2022 CIO 100 Award, provides a single source for datasets in a unified format, enabling researchers to quickly extract information about various therapeutic functions without having to worry about how to prepare or find the data.

“Scientists come to us with white papers which may be identifying theoretical ways that you could analyze a scientific experiment,” McCowan says. “We’ll work with those scientists and actually build the computer models and go run it, and it can be anything from sub-visual particle imaging to protein folding,” he says. “In other cases, it’s more of a standard computational requirement and we help them provide the data in the right formats. Then the data is consumed by SaaS-based computational tools, but it still sits within our organization and sits within the controls of our cloud-based solutions.”

Much of Regeneron’s data, of course, is confidential. For that reason, many of its data tools — and even its data lake — were built in-house using AWS.

“We have our own data lakehouses in AWS,” says McCowan, who also led Regeneron IT to a 2020 CIO 100 Award for developing the Regeneron Deva Platform, a research computing platform built to simplify, scale, and accelerate the early discovery analytical experience. “By creating some small adjustments, we are allowing scientists to connect data in ways they were not able to before. Our vision for the data lake is that we want to be able to connect every group, from our genetic center through manufacturing through clinical safety and early research. That’s hard to do when you have 30 years of data.”

The data platform provides constant access to connected and contextualized data via data lakes, scalable clouds, data processing and AI services, the CIO says, adding that the company’s data lakes manage roughly 200 terabytes of data.

Fueling innovation with data

McCowan is cautious not to restrict the use of external tools — particularly cloud-native tools — that help scientists dig for discoveries. At the infrastructure level, Regeneron scientists use AWS EMR and Cloudera. At the data pipeline level, scientists use Apigee, Airflow, NiFi, and Kafka. At the data warehouse level, scientists use Redshift. As you go up the stack, different data analytics come into play, such as DataIQ. From a language perspective, scientists use Python and Jupyter Notebooks.

For McCowan, the key is to give scientists any and all tools that allow them to explore their hypotheses and test theories. “One of the fantastic things about Regeneron is that we’re driven by curiosity,” the CIO says. “We’re driven by science, and by innovation, and we try to avoid putting hard boundaries around what we do because it tends to stifle innovation.”

Although Regeneron scientists have AI and ML tools at their disposal, data remains the key, McCowan says, and it’s the power of the cloud and analytics that may reveal the next big breakthrough from data that is 10 years old.

“I can’t tell you how many times I’ve read about these fantastic projects using AI and ML, but you never see the output because they fail,” McCowan says. “And the reason they are failing is that people are not putting enough thought into where the data is coming from. That is why we built our data infrastructure. So, by the time that data lands in the data lakes, and we start applying AI and ML, we know we are using it against high-quality data.”

As the company’s chief technologist, McCowan’s job is to digitize everything and help scientists make the best use of the data and metadata regardless of how it is generated.

“It always comes back to the data and the insights that we can provide using different technologies and increasing the speed of decision-making,” McCowan says, adding that providing scientists with the ability to run experimentation mathematically through engines using AI and ML models speeds up discovery, but it will never replace the wet lab.

The combination of enhanced IT and science is what will drive maximum innovation at Regeneron, McCowan says. And here, the MetaBio data platform will play a key role in facilitating breakthrough discoveries far faster than previously possible.

“The level of detail there with us digitizing everything, we’re able to apply technology and tools to help scientists make connections that they were just not able to make before,” McCowan says. “If you look at it from a pure data perspective, what we can do is find ways to [enable scientists] to connect the data better and faster and make those insights and bring drugs to market down to a five-year or four-year [process], when before it was a 10-year process.”

Analytics, CIO 100, Cloud Computing, Healthcare Industry, Machine Learning

No one sounds more amazed about his career path than Anthony Osijo himself. He was a career banker living in Hong Kong when friends from university pitched their new utility startup called Bboxx. 

“I was an angel investor,” laughs Osijo from his London home office. Now, as Chief Financial Officer of Bboxx, it’s Osijo’s job to successfully steer the utility from startup to profitable scale-up company.    

With some understatement, Osijo explains, “There’s a huge difference. As a startup, we must focus on surviving. As a scale-up, we must grow and become profitable.”

Ending energy poverty

The award-winning company’s mission is to end energy poverty in developing countries, like Rwanda, Nigeria, and Mali. Bboxx manufactures, distributes, and finances solar-powered electricity for nearly 30 million people, in accordance with the United Nations Sustainable Development Goals.

It is an incredibly ambitious, arduous, and expensive goal to get electricity to millions of underserved people. “We need to provide innovative solutions now, before competitors ‘steal’ our opportunity.”

Osijo explains to investors, “These markets will be big, but later.” 

Investors want results

Bboxx needed to serve more people by growing quickly and sustainably in the cloud.

“It is vital to have a strategic global partner like SAP that is flexible and innovative enough to want to work with us on that journey,” explains Osijo. 

Bboxx turned to SAP to better manage accounting, supply chains, and inventory with SAP’s cloud-based platform. Other SAP cloud solutions are integrated to help Bboxx automate processes and smoothly expand its footprint into new markets.

Osijo says working with SAP has proven a ‘door-opener’ for the company. He can show investors how the company is meeting environmental and sustainability standards, even while Bboxx copes with real world business challenges like the pandemic, war, weather catastrophes, and supply chain dilemmas.  

Life-changing impact

Bboxx is rapidly expanding into new territories, doubling in size every two years over the past six years. The results are life-changing for impoverished communities that now have electricity and clean water for the first time. Bboxx customers use their mobile phones to track electricity usage and pay bills on Bboxx’s mobile app.

 “We’re positively impacting millions of peoples’ lives,” shares Osijo. “I am passionate about the company’s purpose-driven mission to use technology to transform lives through access to energy.” 

Bboxx is an SAP Innovation Award winner.

Cloud Management

Over the past 184 years, The Procter & Gamble Co. (P&G) has grown to become one of the world’s largest consumer goods manufacturers, with worldwide revenue of more than $76 billion in 2021 and more than 100,000 employees. Its brands are household names, including Charmin, Crest, Dawn, Febreze, Gillette, Olay, Pampers, and Tide.

In summer 2022, P&G sealed a multiyear partnership with Microsoft to transform P&G’s digital manufacturing platform. The partners say they will create the future of digital manufacturing by leveraging the industrial internet of things (IIoT), digital twin, data, and AI to bring products to consumers faster and increase customer satisfaction, all while improving productivity and reducing costs.

“The main purpose of our digital transformation is to help create superior solutions for daily problems of millions of consumers around the world, while generating growth and value for all stakeholders,” says Vittorio Cretella, CIO of P&G. “We do that by leveraging data, AI, and automation with agility and scale across all dimensions of our business, accelerating innovation and increasing productivity in everything we do.”

The digital transformation of P&G’s manufacturing platform will enable the company to check product quality in real time directly on the production line, maximize the resiliency of equipment while avoiding waste, and optimize the use of energy and water in manufacturing plants. Cretella says P&G will make manufacturing smarter by enabling scalable predictive quality, predictive maintenance, controlled release, touchless operations, and manufacturing sustainability optimization. These things have not been done at this scale in the manufacturing space to date, he says.

Smart manufacturing at scale

The company has already undertaken pilot projects in Egypt, India, Japan, and the US that use Azure IoT Hub and IoT Edge to help manufacturing technicians analyze insights to create improvements in the production of baby care and paper products.

For instance, the production of diapers involves assembling many layers of material at high speed with great precision to ensure optimal absorbency, leak protection, and comfort. The new IIoT platform uses machine telemetry and high-speed analytics to continuously monitor production lines to provide early detection and prevention of potential issues in the material flow. This, in turn, improves cycle time, reduces network losses, and ensures quality, all while improving operator productivity.
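Early detection on machine telemetry of this kind often boils down to flagging readings that deviate sharply from recent behavior. A simple rolling z-score sketch (illustrative only; P&G's platform uses Azure IoT Hub, IoT Edge, and high-speed analytics rather than this toy logic):

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(stream, window=5, threshold=3.0):
    """Flag readings more than `threshold` sample standard deviations
    away from the trailing window's mean. Returns (index, value) pairs."""
    recent = deque(maxlen=window)
    alerts = []
    for i, x in enumerate(stream):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(x - mu) / sigma > threshold:
                alerts.append((i, x))
        recent.append(x)
    return alerts

# Hypothetical layer-tension readings with one spike a technician
# would want surfaced before it disrupts the material flow.
readings = [10.0, 10.1, 9.9, 10.0, 10.2, 10.1, 14.5, 10.0]
print(detect_anomalies(readings))  # → [(6, 14.5)]
```

A production system would run this kind of check continuously at the edge, against many sensors at once, and route alerts to operators instead of printing them.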

P&G is also piloting the use of IIoT, advanced algorithms, machine learning (ML), and predictive analytics to improve manufacturing efficiencies in the production of paper towels. P&G can now better predict finished paper towel sheet lengths.

Smart manufacturing at scale is a challenge. It requires taking data from equipment sensors, applying advanced analytics to derive descriptive and predictive insights, and automating corrective actions. The end-to-end process requires several steps, including data integration and algorithm development, training, and deployment. It also involves large amounts of data and near real-time processing.

“The secret to scale is to lessen complexity by providing common components at the edge and in the Microsoft cloud that engineers can work with to deploy diverse use cases into a specific manufacturing environment — without having to create everything from scratch,” Cretella says.

Using Microsoft Azure as the foundation, Cretella says P&G will now be able to digitize and integrate data from more than 100 manufacturing sites around the world and enhance AI, ML, and edge computing services for real-time visibility. In turn, this will enable P&G employees to analyze production data and leverage AI to support decisions that drive improvement and exponential impact.

“Accessing this level of data, at scale, is rare within the consumer goods industry,” Cretella says.

Data and AI as digital fundamentals

P&G took the first steps in its AI journey more than five years ago. It has moved past what Cretella calls the “experimentation phase” with scaled solutions and increasingly sophisticated AI applications. Data and AI have since become central to the company’s digital strategy.

“We leverage AI across all dimensions of our business to predict outcomes and increasingly to prescribe actions through automation,” Cretella says. “We have applications in our product innovation space, where thanks to modelling and simulation we can shorten the lead time to develop a new formula from months to weeks; in the way we engage and communicate with our consumers, using AI to deliver to each of them brand messages delivered at their right time, right channel, and with the right content.”

P&G also uses predictive analytics to help ensure the company’s products are available at retail partners “where, when, and how consumers shop for them,” Cretella says, adding that P&G engineers also use Azure AI to ensure quality control and equipment resilience on the production line.

While P&G’s recipe for scale relies on technology, including investment in a scalable data and AI environment centered on cross-functional data lakes, Cretella says P&G’s secret sauce is the skills of hundreds of talented data scientists and engineers who understand the company’s business inside and out. To that end, P&G’s future is about embracing automation of AI, which will allow its data engineers, data scientists, and ML engineers to spend less time on manual, labor-intensive tasks so they can focus on the areas where they add value.

“Automation of AI also allows us to deliver with consistent quality and to manage bias and risk,” he says, adding that automating AI will also “make these capabilities accessible to an increasingly larger number of employees, thus making the benefits of AI pervasive across the company.”

The power of people

Another element to achieving agility at scale is P&G’s “composite” approach to building teams in the IT organization. P&G balances the organization between central teams and teams embedded in its categories and markets. The central teams create enterprise platforms and technology foundations, while the embedded teams use those platforms and foundations to build digital solutions that address their units’ specific business opportunities. Cretella also notes that the company prioritizes insourcing talent, especially in areas such as data science, cloud management, cybersecurity, software engineering, and DevOps.

To accelerate P&G’s transformation, Microsoft and P&G have created a Digital Enablement Office (DEO) staffed by experts from both organizations. The DEO will serve as an incubator to create high-priority business scenarios in the areas of product manufacturing and packaging processes that P&G can implement across the company. Cretella considers it more of a project management office than a center of excellence.

“It coordinates all the efforts of the different innovation teams that work on business use cases and ensures an efficient scaled deployment of the proven solutions they develop,” he says.

Cretella has some advice for CIOs trying to drive digital transformation in their organizations: “First, be driven and find your energy in the passion for the business and how to apply technology to create value. Second, be equipped with tons of learning agility and genuine curiosity to learn. Last, invest in people — your teams, your peers, your bosses — because technology alone does not change things; people do.”

Artificial Intelligence, Digital Transformation, Manufacturing Industry