When I was a CIO, I always dreaded the annual budget season because I knew, somewhere during the process, the CEO, my boss, would ask, “What are we getting for this constantly growing IT department?”

It’s the question that keeps most CIOs up at night when they must defend IT investments, and one all CIOs should expect to answer, given that IT expenditures can range from 1% to more than 50% of a company’s total revenue.

For most IT departments, this is a very difficult question to answer because the systems IT develops are used not by IT itself but by other departments to increase their sales, reduce their expenses, or become more competitive in the marketplace.

As such, an IT leader’s usual response to this question is a general statement about how IT has implemented projects across the corporation that have achieved corporate strategic objectives. We seldom have any empirical data to back up our claims. So what’s a CIO to do?

IT as a business

There are two ways to address this issue. The first option is to transition from a non-charge-out environment, where IT absorbs all development costs, to a charge-out environment where all IT costs are assigned to the user departments based on their use of the resources. In this case, IT operates as a zero-cost department and there are no annual budget issues. All IT has to do is tell the user departments how much to budget for IT.
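Mechanically, the allocation itself is simple. Here is a minimal sketch, with invented department names, usage figures, and cost pool, of how a chargeback model might split a shared IT cost across departments in proportion to measured use:

```python
# Hypothetical chargeback sketch: allocate a shared IT cost pool to
# departments in proportion to their measured resource consumption.

def allocate_chargeback(total_it_cost: float, usage_by_dept: dict[str, float]) -> dict[str, float]:
    """Split total_it_cost across departments proportionally to usage units."""
    total_usage = sum(usage_by_dept.values())
    return {dept: total_it_cost * usage / total_usage
            for dept, usage in usage_by_dept.items()}

# Example: a $1.2M annual IT cost pool, usage measured in compute-hours (illustrative).
bills = allocate_chargeback(1_200_000, {"Sales": 500, "Manufacturing": 300, "HR": 200})
print(bills)  # {'Sales': 600000.0, 'Manufacturing': 360000.0, 'HR': 240000.0}
```

The arithmetic is the easy part; as the next paragraphs argue, the organizational consequences are where the trouble starts.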

But there are great downsides to this approach that far outweigh its ease of use for IT. First, it tends to place the automation agenda in the hands of individual departments or profit centers rather than treating IT and digitalization as an overall company necessity. The development of artificial intelligence systems is one example: the ramifications of such a system would affect all departments.

Second, with a charge-out system, IT sends each department a monthly bill charging its P&L for a range of services, including development costs, IT infrastructure usage costs, and the dreaded overhead costs. This bill makes it a huge challenge for IT to maintain cordial relationships, especially if it comes in higher than the budget estimates.

Perhaps worse, shifting to a charge-out, or chargeback, approach treats IT like a business, a model that might sound good on the surface but means user departments may begin looking to outside IT organizations to develop shadow IT systems sold as a cheaper alternative. These systems can only make internal system maintenance more complicated and drive a wedge between the company and its automation agenda.

The better way

The second and better way to approach the problem of IT value is to measure the effectiveness of the IT operation. Why should IT be the only department that is immune from corporate oversight? The advertising department is routinely measured on whether it is increasing corporate sales. HR is constantly being questioned on how its salary system compares to the industry. Manufacturing is always being challenged on its costs and if there are alternative methods and locations. Marketing must assure top management that its brand positioning is the best for the company.

The only way to measure IT is to enforce a requirement that all large-scale new or modified system projects are analyzed after completion to verify that the objectives were met and the ROI was proven.
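To make concrete what such an audit checks, here is a minimal sketch, with hypothetical figures, comparing the ROI promised in a project’s business case against the ROI actually realized a year after go-live:

```python
# Hypothetical post-implementation ROI check: compare the return promised
# in the project's business case against the return actually realized.

def roi(benefit: float, cost: float) -> float:
    """Classic ROI: net benefit divided by cost."""
    return (benefit - cost) / cost

promised = roi(benefit=2_000_000, cost=1_250_000)   # from the original business case
realized = roi(benefit=1_600_000, cost=1_400_000)   # measured a year after go-live
print(f"promised ROI: {promised:.0%}, realized ROI: {realized:.0%}")
# promised ROI: 60%, realized ROI: 14% -- a gap the audit would have to explain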

In my book, The 9 1/2 Secrets of a Great IT Organization, the 1/2 secret is the post-implementation audit. I called it a half secret because few companies do it. It should be treated as a full secret, however, because it will assure a much more effective and successful department. But it is generally not done, for a number of reasons.

First, conducting a post-implementation audit requires a significant amount of very detailed analysis that can span several years. Just gathering the data can be time-consuming, especially given that many of the project personnel may have changed jobs or even companies since the project was completed.

Next, it cannot be done until at least a year after the system has gone live, since no system is fully functional on day one. And it is sometimes hard to convince both IT and the user department that analyzing a completed system is worth the time when there are more important projects to complete.

Moreover, the user department is often not interested in proving ROI for several reasons. Perhaps they inflated the initial ROI to get the attention of the IT steering committee. A close analysis may discover this practice. Additionally, the ROI may have contained significant headcount reductions that were used to generate a better return. The department may desire to forget these moves once the project is completed.

Of course, it’s not always about the user department. IT may also not want to see the audit done because it may have underestimated the cost or completion date on the original estimate.

The recommended way to complete this audit is to remove the responsibility from the user department and from IT. An independent organization, preferably under the auspices of the financial arm of the company, should conduct the post-implementation audit. This group should have been involved in developing the original ROI for the project and is in the best position to assure objectivity in the result.

If done this way, the user department will be held to its ROI commitment, IT will be held to its performance objectives, and the CIO will be able to answer the CEO’s question about IT investments by saying, for example, “We implemented 17 projects this year, which increased sales by 35% and reduced expenses by 14%.”

Wouldn’t that be a great conversation to have, not only with the CEO but with the entire company?


Like most CIOs, you’ve no doubt leaned on ROI, TCO, and KPIs to measure the business value of your IT investments. Maybe you’ve even surpassed expectations against each of these yardsticks.

Those Three Big Acronyms are still important for fine-tuning your IT operations, but success today is increasingly measured in business outcomes. Put another way: Did you achieve the desired results for your IT investments?

For more than a decade, IT departments derived business value from cloud computing, whether public, private, or hybrid. Of late, concerns about the public “cloud-first” approach have emerged, challenging business value and skewing ROI, TCO, and KPIs. And they have pulled back the curtain on a critical reality: IT estates are much more complex than they once were.

A more thoughtful approach to procuring and managing assets is needed to help hurdle the challenges posed by those diverse estates. To understand how to get there, it helps to first unpack how we got here.

When Diminishing Returns Become Budget Busters

For years, enterprises scrambled to build applications in public cloud environments; there was legitimate business value in rapid innovation, deployment, and scalability, as well as unfettered access to more geographical regions.

“Cloud-first strategy” became a cure-all for datacenter impediments, as well as an IT leader’s tentpole for digital transformation.

More recently, some organizations have reported diminishing returns from their public cloud implementations. Some companies calculated savings after moving workloads from the public cloud back on-premises, a practice known as cloud repatriation. Others conducted apples-to-apples comparisons of public cloud versus on-premises costs.
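As a flavor of what an apples-to-apples comparison involves, here is a deliberately simplified three-year cost sketch. Every number is invented, and a real comparison would also account for migration, egress, hardware refresh cycles, and staffing differences:

```python
# Illustrative apples-to-apples cost comparison (all numbers invented):
# 3-year cost of running a steady-state workload in public cloud vs. on-prem.

YEARS = 3
cloud_monthly = 42_000            # compute + storage + egress, per month
onprem_capex = 600_000            # hardware purchased up front
onprem_monthly_opex = 15_000      # power, space, support staff, per month

cloud_tco = cloud_monthly * 12 * YEARS
onprem_tco = onprem_capex + onprem_monthly_opex * 12 * YEARS
print(f"cloud: ${cloud_tco:,}  on-prem: ${onprem_tco:,}")
# cloud: $1,512,000  on-prem: $1,140,000 -- steady workloads can favor on-prem
```

The point is not that one side always wins, but that the answer depends on the workload’s shape, which is exactly the nuance the next section turns to.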

In some instances, poor implementation and faulty configurations were the culprits for deteriorating ROI, TCO and KPI values. Collectively these factors have dulled the initial sheen of agility and innovation around the public cloud.

The reality is that the decision to put applications in the public cloud or on on-premises systems is not an either-or argument; rather, it requires a nuanced conversation, as consultant Ian Miell points out in this sober assessment.

Smart Workload Placement Is Key

Miell is right. The real question about where to run applications to generate business value is which location is most appropriate for each workload. Because, again, IT environments are far more complex these days. They’ve become multicloud estates.

To accommodate a growing accumulation of disparate applications, you’re likely running a mix of public clouds (probably more than one) and (maybe) private clouds in addition to your traditional on-premises systems. You might even operate out of a colo facility for the benefits cloud adjacency affords in reducing latency. Maybe you manage edge devices, too.

Workload placement is based on several factors, including performance, latency, cost, and data governance rules. How, where, and when you opt to place workloads helps determine the business value of your IT investments.

For example, you may elect to place a critical HR application on-premises to comply with data locality rules that govern the geographies in which employee data can reside. Or perhaps you choose to offload an analytics application to the public cloud for rapid scalability during peak traffic cycles. And maybe you need to move an app to the edge for speedier data retrieval.
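One hedged way to make such decisions systematic is a weighted scoring pass over candidate locations, as in the sketch below. The factors, weights, and scores are purely illustrative, and hard constraints such as data residency would act as filters before any scoring:

```python
# Illustrative workload-placement scorer: rank candidate locations by a
# weighted sum of factor scores (higher is better). Weights and scores
# are made up for the example.

WEIGHTS = {"performance": 0.3, "latency": 0.2, "cost": 0.3, "governance": 0.2}

candidates = {
    "public_cloud": {"performance": 8, "latency": 6, "cost": 5, "governance": 6},
    "on_premises":  {"performance": 7, "latency": 9, "cost": 7, "governance": 9},
    "edge":         {"performance": 6, "latency": 10, "cost": 4, "governance": 7},
}

def score(factors: dict[str, float]) -> float:
    return sum(WEIGHTS[f] * v for f, v in factors.items())

best = max(candidates, key=lambda loc: score(candidates[loc]))
print(best, {loc: round(score(f), 1) for loc, f in candidates.items()})
```

In practice the weights would come from your own SLAs, cost models, and governance rules rather than from a static table.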

Of course, achieving business value via strategic workload placement isn’t a given. There is no setting them and forgetting them.

As you navigate the intricacies of workload placement, you face many challenges: economic uncertainty (the market is whipsawing); a deficit in IT talent (do you honestly recall a time this wasn’t an issue?); abundant risk (data resiliency, cybersecurity, governance, natural disasters); and other disruptions that threaten to crimp innovation (long IT procurement cycles and slow provisioning of developer services).

You can try to tackle those challenges with a piecemeal approach, but you’ll get more value if you deploy an intentional approach to running workloads in their most optimal location. This planning is part of a multicloud-by-design strategy that will enable you to run your IT estate with a modern cloud experience.

A Cloud Experience Boosts Business Value

As it happens, an as-a-Service model can help deliver the cloud experience you seek.
For instance, developers can access the resources needed to build cloud-native applications via a self-service environment, freeing your staff from racking and stacking, provisioning, and configuring assets to focus on other business-critical tasks.

To help you better align cost structure with business value, pay-as-you-go consumption reduces your reliance on the rigorous IT procurement process. This cloud experience also helps you reduce the risk associated with unplanned downtime, latency, and other issues that impact performance, with availability SLAs aligned to your needs.

Leveraging such a model in conjunction with trusted partners, IT departments can reduce overprovisioning by 42% and support costs by up to 70%, as well as realize a 65% reduction in unplanned downtime events, according to IDC research commissioned by Dell[1].

The Dell Technologies APEX portfolio of services can help you successfully manage applications and data spanning core datacenters to the edge, as well as the mix of public and private clouds that comprises your multicloud environment. This will help you achieve the business outcomes you seek.

Regardless of where you opt to run your assets, doing so without a modern cloud experience is bound to leave business value languishing on your (or someone else’s) datacenter floor.

Learn more about our portfolio of cloud experiences delivering simplicity, agility and control as-a-Service: Dell Technologies APEX.

[1] The Business Value of Dell Technologies APEX as-a-Service Solutions, Dell Technologies and IDC, August 2021


Generative AI such as ChatGPT has of late captured the imagination of business leaders across industries. While enterprise IT orgs by and large are taking a measured approach, some early movers are showing impressive results.

CarMax’s IT team, for one, was working with Microsoft and OpenAI to leverage GPT-3.x for business value even before ChatGPT became a household name.

That is why the omnichannel used-car retailer earned a coveted spot on the 2023 CIO 100 Award list: for its early, innovative use of a nascent AI technology that led to a spike in page views as well as higher SEO ranking and placement that drove substantial business growth.

CarMax EVP and CITO Shamim Mohammad, the brains behind his company’s digital transformation and AI push, would not specify how much money CarMax has netted to date from its AI investments and Microsoft Azure OpenAI Service, but the increased customer traffic no doubt has had an impact on revenue, while the cost implications of leveraging AI have also contributed to the bottom line, he says.

“We would have had to have hired tens or maybe hundreds of content writers and taken years to generate this content,” says Mohammad. “We were able to do this literally in a matter of hours.”

The combination of CarMax’s revolutionary digital business model for the used-car business with AI tools now available to all makes for powerful and profitable business outcomes. Despite the current overall economic slowdown, CarMax’s Q4 2022 revenues rose 48.8% to $7.7 billion compared with Q4 2021, and revenues for fiscal 2022 increased 68.3% to $31.9 billion overall.

First-mover AI benefits

CarMax’s IT leaders and staff were experimenting with OpenAI’s GPT-3.x natural language models on pilots before Microsoft’s well-publicized $10 billion investment in the AI firm at the outset of 2023.

In this early use case, the CarMax team employed GPT-3.5’s enhanced “iteration on prompts” to feed scrubbed and formatted data for thousands of used cars into a DaVinci model. Following that, a small dataset was sent for editing and fine-tuning, and the content was pumped into the DaVinci model for mass publishing and consumer consumption.

Because use of the DaVinci natural language model itself requires fewer data scientists, who are scarce and super expensive, CarMax was also able to realize IT cost benefits in addition to content creation savings.

“The model’s ability to learn with just a few examples of intended outputs, a process called few-shot learning, helps CarMax’s 60-plus product teams use the models without requiring additional teams of data scientists,” according to a CarMax representative.
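CarMax has not published its code, so the following is only a generic sketch of few-shot prompting against a Davinci-class completion model, written for the legacy (pre-1.0) OpenAI Python SDK that was current at the time. The prompt, model name, and API-key handling are placeholders:

```python
# Generic few-shot prompting sketch (not CarMax's actual code) using the
# legacy OpenAI Python SDK (pre-1.0), which exposed completion models such
# as text-davinci-003 via openai.Completion.create.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# A few examples of the intended output teach the model the format,
# so no fine-tuning or dedicated data-science team is required.
prompt = """Summarize customer reviews into one sentence.

Reviews: "Great mileage." "Roomy interior." "A bit pricey."
Summary: Owners praise the mileage and roomy interior but note the price.

Reviews: "Smooth ride." "Weak AC." "Reliable so far."
Summary:"""

response = openai.Completion.create(
    model="text-davinci-003",
    prompt=prompt,
    max_tokens=60,
    temperature=0.3,
)
print(response.choices[0].text.strip())
```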

Creating car research content using AI eases the car purchasing process for consumers and the content creation process for editors. But perhaps most important, it drives an order of magnitude more inventory, eyeballs, reviews, and sales for CarMax.com. This is due to CarMax’s use of the cloud-based OpenAI API natural language model, which enables the extraction of millions — not thousands — of keywords.

As a pioneer, CarMax is reaping the early benefits of what will likely be a major business driver across the globe, one analyst says.

“As the use of generative AI becomes more widespread, it is causing significant disruption in many industries and sectors,” says Ritu Jyoti, group vice president of AI and automation research at IDC. “CarMax was at the forefront of embracing generative AI responsibly in partnership with Microsoft and has been a successful industry disrupter that has transformed the process of buying a used car.”

Looking forward

This is just the beginning, says Mohammad. Although he would not specify how CarMax is currently using the enhanced GPT-4 models released in March 2023, he is moving full steam ahead, streamlining content creation for car research pages and scaling the use of GPT to help make buying a used car “effortless,” he says. He adds that CarMax programmers are working on new aspects of the customer experience and will use GPT to gain efficiencies within the company.

“My team is using the latest version for some other use cases, but we haven’t gone public yet,” he says. “That’s the way my team works. They like to experiment and try new things in a controlled way.”

Mohammad started CarMax’s cloud journey and digital transformation when he became CIO in 2014. Everything new that CarMax builds is done in the cloud, but the company still has a small data center that will eventually be phased out.

Customer security is critical for CarMax, Mohammad says. The company says Microsoft Azure’s security, compliance, reliability, and other enterprise-grade capabilities are what enable it to scale its use of AI to extract keywords and “filter out any harmful content in the user reviews.”

As a Microsoft Azure shop, CarMax relies on Azure Data Lake, an essential component of the company’s AI output, the CIO notes. “Data is the core of everything we’re doing because it feeds our machine learning algorithm that feeds our AI capability,” he says.

In fact, the blueprint CarMax has created within Azure can also be used by other companies, according to CarMax representatives.

CarMax is continuing to experiment and innovate with Azure OpenAI and GPT, but Mohammad insists he is putting strong governance in place to ensure that machine learning and automation are used by its programmers in a disciplined manner to fulfill the company’s business goals.

“It’s not the wild west,” he says. “We have a good sense of what are the guardrails and what are the ways we’re going to be leveraging AI. How we deploy and utilize AI needs to be very much consistent with who we are as a company.”


The emergence of data-driven business models, along with the evolution of modern analytics and cloud capabilities, has increased interest in data management multifold. As a result, enterprises are breaking down data silos, transforming their data architectures, and democratizing access to data tools to accelerate decision-making.

But the journey to the data-driven enterprise remains challenging, riddled with roadblocks from budgeting issues to buy-in difficulties. And sound data governance practices can’t be given short shrift in the rush to unlock hidden insights from data.

With all that, in addition to privacy and compliance laws continually evolving across the globe, the chief data officer role has become a highly challenging, enterprise-critical balancing act. To learn more about how data leaders are embracing the challenge, CIO.com caught up with Tejasvi Addagada, chief data officer at HDFC Bank, to discuss the various aspects of data impacting enterprises today.

Tapping the business value of data while keeping it secure is a complex balancing act. How can IT leaders convert data into dollars while ensuring its security?   

Addagada: A much-desired culture change toward data awareness in an organization can be achieved through data democratization, the practice of making data accessible to anyone. By making data available and easily accessible, revenue streams can be improved through direct and indirect monetization of data.

Data protection enables responsible data consumption on the heels of data democratization. A data marketplace cannot provide free access to all data; rather, there must be risk-based controls that are actively managed. A few of these controls are privacy, security, authentication, encryption, entitlements, user access management, device management, and data rights management.

New privacy laws are coming into force while existing ones are under constant review. Technology leaders must account for the laws of every geography they do business in, as a breach can bring strong penalties. How can data officers meet regulations confidently?

Privacy policy is constantly evolving across geographies, toward giving customers more control over their personal data while letting companies and public authorities share what is required for efficient governance, better service, and the public good. Privacy engineering as a discipline must provide geographical awareness backed by technology advancements like catalogs and privacy and security analytics.

Assessment of the threat surface area begins with determining the classification of personal data in a geographical area. It is crucial that the catalog has the intelligence to apply geographical rules to classify personal data, since what constitutes personal data differs between countries. As an example, financial information may be considered sensitive personal data in India but not in Europe.
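A minimal sketch of what such a geography-aware classification rule might look like in a catalog, using the financial-data example above; the rule table is invented for illustration and is not legal guidance:

```python
# Hypothetical geography-aware classifier: whether a data element counts as
# sensitive personal data depends on the jurisdiction's rules. The rule
# table below is illustrative only.

SENSITIVE_BY_GEO = {
    "IN": {"financial", "health", "biometric"},   # e.g., financial data sensitive in India
    "EU": {"health", "biometric"},                # financial data treated differently
}

def classify(element_type: str, geo: str) -> str:
    sensitive = SENSITIVE_BY_GEO.get(geo, set())
    return "sensitive personal data" if element_type in sensitive else "personal data"

print(classify("financial", "IN"))  # sensitive personal data
print(classify("financial", "EU"))  # personal data
```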

Over 137 countries have legislation to protect data and privacy. The data office can formalize, as part of the overall breach incident response, the integration of privacy intelligence, and thereby privacy reporting tasks, with geographical context. Further, data offices can partner with legal teams to ensure compliance with regulatory requirements.

Siloed data undercuts its value. What approach should IT decision-makers take to ensure an end-to-end data discovery process across the network?

If data is siloed, it cannot be used for developing insights and products. For an organization that has yet to invest in managing its data and thinks centralization is costly or a bottleneck, a data mesh architecture offers a decentralized approach at its core, with each domain team ingesting its own operational and analytical data and developing data products.

However, even in a decentralized setup, data needs to be discovered, as what is not known cannot be used. IT as a function will have to support a data discovery platform, with the objective of understanding the technical data estate, which domain teams can then enrich with business meaning.

Implementing data governance is both imperative and challenging to prevent multiple versions of the truth within an organization. How can proper data governance be ensured?

From the initial concept of corporate governance, IT governance has evolved into the recent concept of data governance. Globally, the adoption of cloud services, the evolution of modern data stacks, and improved data literacy have led to a greater interest in governing data over the past years. 

Implementing data governance is necessary to get sustainable value from data. A subfunction can be formalized as an authorized provisioning service. It can support activities that help ensure a data element is rightfully sourced from a designated provisioning point. In addition, it can have domain teams express their trust by certifying data as a system of record that is authorized to provision.

Other technologies that can help with the identification and certification of a single version of the truth are data discovery, profiling, quality, and observability, to name a few.

If properties of an entity like a customer carry multiple values, technology like master data management can translate the know-how of operational personnel into prioritization and survivorship rules that create and maintain a version of the truth that can be consumed universally within an organization.
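As a rough sketch of that idea, survivorship can be expressed as an ordered preference over sources, with the most trusted non-empty value winning. The source names, priorities, and records below are invented:

```python
# Illustrative master-data survivorship: when several systems hold different
# values for the same customer attribute, pick the value from the most
# trusted source that actually has one. Source priorities are invented here.

SOURCE_PRIORITY = ["crm", "billing", "web_signup"]  # most to least trusted

records = {
    "crm":        {"phone": None, "email": "a.kumar@example.com"},
    "billing":    {"phone": "+91-98765-43210", "email": None},
    "web_signup": {"phone": "+91-00000-00000", "email": "old@example.com"},
}

def golden_record(records: dict, attributes: list[str]) -> dict:
    golden = {}
    for attr in attributes:
        for source in SOURCE_PRIORITY:
            value = records.get(source, {}).get(attr)
            if value:  # survivorship rule: first trusted, non-empty value wins
                golden[attr] = value
                break
    return golden

print(golden_record(records, ["phone", "email"]))
# {'phone': '+91-98765-43210', 'email': 'a.kumar@example.com'}
```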

Data-driven projects demand a substantial investment of budget and resources. How can data officers justify both?  

Investments in data capabilities and the development of data products have increased multifold over the past years. This requires investment in tools; in people, including augmented knowledge workers such as consultants; and in new processes and interventions.

Formalizing the management of data through data governance can increase transparency, accountability, responsibility, independence, and fairness in implementing corporate governance. One crucial aspect of formalization for data offices is assessing the return on these investments and maintaining the value of data assets.

What tips would you share with IT leaders looking to establish a data strategy and direction for their companies?  

The 1994 Hawley Committee report first identified data as an asset, defining it as ‘data that is or should be documented, and that has value or potential value.’ Data offices can focus on the decision rights related to the data assets and the network of relations to ensure data is qualitative, consistent, usable, secure, protected, and yet available.  

In the past decade, interest in data management has increased multifold with the evolution of data-driven business models along with the modern data stack and cloud capabilities. This has in fact resulted in a need for improved data literacy around the globe. Industry bodies like DAMA and the EDM Council, along with other data communities, are providing global literacy around the benefits of managing data with standard frameworks.

During the process of determining the company’s goals, the board is entrusted with exercising critical judgment, while the data office is responsible for designing the data strategy and policies to ensure that the goals with a data contribution are met.

The emphasis on people and process capabilities is not a knock on information technology; rather, technology should be considered when planning future investments that can enable the achievement of the goals outlined by the data and business strategy.

IT leaders can keep up with rapid advancements in data technology, including data collection, cloud storage and processing, machine learning operations, automation in data operations, and data security, to name a few domains of interest. Within the organization, the data officer can build a data-driven culture by imparting awareness of the benefits of managing data activities through interactive newsletters, roadshows, board representation, and the formalization of people and processes that involve data.


Jeff Dirks is fascinated by new technologies like generative AI. But when it comes to implementation, the chief information and technology officer of workforce augmentation firm TrueBlue chooses a path that trails early adopters. “We’re in the early majority,” is the CIO/CTO’s blunt self-assessment.

Although many IT leaders would like to think of themselves — and have others think of them — as in the vanguard of new technology adoption, the vast majority find themselves in the middle of a bell curve, with innovators leading the way and laggards trailing behind, according to Everett Rogers’ diffusion of innovations theory [see chart]. But there is no one “right” place to be along the curve. The trick is to know where your organization belongs — and to make the most of it.

The diffusion of innovations theory, published by Everett Rogers in 1962, places most organizations in the middle of a bell curve of technology adoption. 


“Organizations that are willing to explore new technologies and be the first in their industry face the highest risk and highest potential return on new technology. But that is very few companies,” says Brian Burke, research vice president for technology innovation at Gartner. A far savvier approach for most organizations is to exploit adjacency — to keep an eye on innovators in similar industries and, when the time is right, adopt the technologies they are using.

“If you’re in banking and you see that an insurance company has adopted a technology, you might adopt it as the first in your industry, gaining a first-mover advantage with less risk,” says Burke.

Dirks buys into that philosophy for TrueBlue, whose core business, PeopleReady, provides a platform for efficiently matching day laborers with companies that have contingent labor needs. For TrueBlue, gig-economy companies like Uber and Lyft and venture-backed competitors such as Wonolo and Instawork are adjacent bellwethers. While such companies are technology-first, TrueBlue is evolving from brick-and-mortar to digital, a path that calls for incremental, rather than radical, innovation.

TrueBlue’s award-winning Affinix app, which aims to make recruiters more efficient by predicting which candidates have the highest probability of success, implements data science, machine learning, and RPA, technologies that Dirks calls mainstream. Consistent with that approach, Dirks is exploring the use of blockchain technology to create a trusted ledger of contingent laborers’ credentials, including facets such as drug and background checks. Although blockchain is no longer new, using it in this way would be a first in the day-labor industry, Dirks says.

Institutionalizing innovation

Far from the fast track of venture-backed data science, government agencies are often found among late-majority and laggard organizations. “In the public sector, there is a propensity to not rock the boat. But that’s not always the right approach,” says Feroz Merchhiya, CIO and CISO of the City of Glendale, Ariz. He ranks the city between early majority and late majority, depending on the project.

When the state of Arizona centralized the collection of state sales taxes several years ago, municipalities such as Glendale struggled with the inefficiency of collecting taxes, sending them to the state, and then receiving their share back in disbursements.

“We did not have a solution to do this. It was very cumbersome for cities as well as businesses. There were a slew of complex interactions that needed to take place,” says Merchhiya. Having scouted industry events and discussed the situation with analysts, he and his team discovered there was no off-the-shelf solution.

“When we realized there was nothing available, we thought it would be a good idea to develop it on our own,” he says. With his own staff of 50 and a like number of contingent workers, Merchhiya led the development of AZ Tax Central, which Glendale is now using and making available to other cities for a charge that covers costs only.

With that successful project under his belt, Merchhiya is seeking to institutionalize the innovation process. Knowing that innovation in government can never succeed without administrative support and budget dollars, Merchhiya convenes annual meetings with civic officials to learn the issues they are facing and to brainstorm how technology might address them.

“It’s one thing to do accidental innovation. It’s another to put a process in place. I’m trying not to be an accidental innovator — the only way I can do that is to engage the business by inviting everybody to get together and have a conversation,” he says. 

Tailoring innovation for real-world impact

Engagement is also important at the District of Columbia Water and Sewer Authority (DC Water), an agency that serves the nation’s capital and the surrounding region, including Dulles Airport. “You have to embrace people to make them comfortable with suggesting ideas — and not being disappointed if their idea doesn’t move forward,” says Thomas Kuczynski, vice president of IT at DC Water, which is responsible for 1,300 miles of water distribution pipe and 1,900 miles of sewer pipe.

Although DC Water’s budget does contain an allocation for experimental work, IT must keep its eyes on real-world challenges. Kuczynski says the agency’s technology adoption sometimes falls into each of Rogers’ classifications, depending on the project. “Our focus is to have a positive impact on the business as quickly as possible,” says Kuczynski. “We focus on opportunities where we believe we can create efficiencies and improve operations.”

For example, DC Water began using an AI-based product called PipeSleuth to inspect the sewer system. An improvement over previous CCTV systems, PipeSleuth sends a sausage-shaped drone on wheels through the system to look for pipe anomalies. Using deep-learning, neural-network technology, it tags defects and produces a report, rather than requiring operators to view reams of video as is necessary with CCTV-based systems.

Using PipeSleuth has enabled DC Water to substantially reduce the cost of pipe inspections, allowing the agency to inspect more pipe at the same cost. “The more we can inspect regularly, the more improvements we can make because we better understand the system. If we can do more inspections per dollar, we can apply that repair dollar to a bigger problem because we know more about my system,” says Kuczynski.

Another DC Water innovation is an event management system that integrates into a single dashboard SCADA data, inbound calls, work orders, USGS data, rain-gauge levels, and other IoT inputs from sensors tracking water pressure, flow, and level. The system also tracks personnel and vehicles via GPS to quickly dispatch repair staff to the highest-priority trouble spots.

“The dispatch system integrates IT and OT in a single dashboard. Now we can manage emergencies more effectively. In the past, if we got 20 calls about one problem, there might be 20 work orders. Now those calls are consolidated into one work order,” he says.  
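A simplified sketch of that consolidation logic follows; the field names and the matching rule (same asset within a two-hour window) are invented for illustration, not drawn from DC Water’s system:

```python
# Simplified sketch of call-to-work-order consolidation: calls reporting the
# same asset within a short window are grouped into a single work order.
# Field names and the matching rule are invented for illustration.
from datetime import datetime, timedelta

WINDOW = timedelta(hours=2)

calls = [
    {"asset": "hydrant-4417", "time": datetime(2023, 5, 1, 9, 5)},
    {"asset": "hydrant-4417", "time": datetime(2023, 5, 1, 9, 40)},
    {"asset": "main-valve-12", "time": datetime(2023, 5, 1, 10, 0)},
]

def consolidate(calls: list[dict]) -> list[dict]:
    work_orders = []
    open_orders = {}  # asset -> open work order with its last call time
    for call in sorted(calls, key=lambda c: c["time"]):
        order = open_orders.get(call["asset"])
        if order and call["time"] - order["last_call"] <= WINDOW:
            order["calls"] += 1            # same asset, recent: fold into order
            order["last_call"] = call["time"]
        else:                              # new asset or stale: open a new order
            order = {"asset": call["asset"], "calls": 1, "last_call": call["time"]}
            open_orders[call["asset"]] = order
            work_orders.append(order)
    return work_orders

print(consolidate(calls))  # two work orders, not three
```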

The art of selection

Ty Tastepe, senior vice president and CIO of Cedar Fair Entertainment Company, an operator of 13 amusement parks and resorts in the US and Canada, says his company tends to be in the “fast-follower” category, a space that’s generally recognized to sit somewhere between early-adopter and early-majority groupings.

“We look at technology implementations not only in our own industry but also in adjacent industries like retail and food services,” Tastepe says. He adds, “If there were no proven technology, we would entertain being an early adopter or try to pilot something new to address a business challenge.”  

“The good ideas are plentiful — the challenge is to identify the must dos, and whether a project fits within budget and resource constraints,” says the CIO. Because Cedar Fair is a public company, Tastepe thinks in terms of delivering results. He says new technologies must answer yes to at least one of three questions: Does it generate revenue? Does it improve efficiency? Does it satisfy compliance requirements?

“There will always be budgeting constraints; that’s why due diligence up front is important. We do a discovery process. Once the business sponsor puts together a charter for an initiative that can be enabled by a technology implementation, we discuss the merits of the initiative at the portfolio management committee,” he says. The next step is to flesh out the business model and the costs, sometimes with the assistance of a partner. “If we decide to move forward with the project, we might do a proof of concept before we deploy at scale,” he adds.

Gartner’s Burke agrees with Tastepe that winnowing down the field of technologies to a few plausible candidates is essential. “When you’re scouting new technologies it’s a bit of an art as well as a science,” says the analyst. Typically, organizations scan several hundred technologies, a number that must be reduced to a couple dozen for serious study, Burke says.

Knowing when to pull the plug

One company that is widely recognized to be in the innovator category is Amazon.com, which willingly invests in technology-heavy concepts such as Amazon Go, a convenience store with no checkout. Although Amazon has built a couple dozen such stores, the giant retailer recently announced the closure of several. In addition to reining in Go stores, Amazon.com has reportedly curtailed its drone delivery initiative. While such explorations and reversals are more than most companies can risk, they underscore the importance of continually evaluating pilots to ensure there is enough value to warrant going forward with the concept.

“In Amazon’s case, they are very good at testing technology and then abandoning it if it doesn’t work,” says Ananda Chakravarty, vice president of research for retail merchandising and marketing at IDC. One reason for the Go retrenchment, according to the analyst, is that the cost of cameras dropped significantly, making them a better technology choice than the sensors that Go implemented to track inventory on shelves.

Even so, according to Chakravarty, behemoths like Amazon and Walmart, which has also experimented with Scan-and-Go checkouts and delivery drones, can afford the luxury of trying — and learning from — things that others cannot.

“There’s some real value-added to a test-and-learn approach,” says the analyst. For example, he explains, Amazon.com initially planned to sell Go technology to other retailers, and when that didn’t pan out, the company decided to target Go to niche markets such as airports, stadiums, and transportation venues, where consumers place the highest value on the convenience of a frictionless experience.

Checks and balances

Rajiv Garg, associate professor of information systems and operations management at Emory University, says putting together a diverse team to evaluate new technologies is an essential step. The group should encompass multiple corporate departments and a variety of demographics. “You might need a millennial on your team to ask how your organization is impacting the environment and society,” he suggests. Once the team is complete, he advises, turn them loose to explore.

“Send them to conferences, buy them VR headsets. If the team doesn’t like one technology, that’s fine because they might find something else they like,” says the professor. Generative AI has reached a point at which it demands to be evaluated, according to Garg. “ChatGPT and DALL-E are going to be embedded in our work somehow. You need to engage employees and use them in your workplace,” he says.

Too often, says Gartner’s Burke, the excitement of working with new technologies causes organizations to jump the gun, failing to perform due diligence ahead of time.

“Companies that have assessed a tech opportunity before they launch a proof of concept are a small minority of companies. In contrast, most organizations identify a nifty technology and then rush into a pilot without having done the easier thing — to determine whether it will actually help them in some way,” says Burke.

Even if most organizations could benefit from more careful up-front analysis, gaining an edge in the market ultimately depends on the willingness to give new technologies a try.

“We like taking a lot of swings. The more swings you take, the higher the probability that one of them will hit something translating into competitive advantage,” says Dirks.


IT is no longer perceived as a cost factor or a pure support function at many organizations, according to management consultancy 4C Group’s Markus Matschi. And the digitization push during the pandemic accelerated this. But despite such advances, the question of the value contribution of IT isn’t always clearly answered. “Due to the increasing relevance and added value of IT in companies, it’s essential for CIOs to not think in terms of costs, but value contributions,” says Matschi.

In a joint study with Markus Westner and Tobias Held from the department of computer science and mathematics at the University of Regensburg, the 4C experts examined the topic by focusing on how the IT value proposition is measured, made visible, and communicated. From many discussions with CIOs and current data, they developed a process model for practice.

“To develop a practical approach, it was important for us to understand the current challenges of the CIOs together with scientific findings and then integrate them,” says Martin Stephany, another consultant at 4C. 

Although the discussion has been ongoing for a long time in both science and practice, there’s often disagreement on the basics. “It starts with the fact that IT value contribution is defined in various ways, and there’s no uniform understanding or definition of it in companies,” adds Westner.

Because the contribution to value and innovation is often unclear, employees from business-related departments perceive IT as a black box and can’t always judge what it does and what added value it delivers. This is also why some companies appoint a CDO to close the value-contribution gap between business and IT.

IT must be able to show the departments and the management team their possibilities and the added value, says Heiko Weigelt, CIO of Funke Media Group. But because it’s determined by the respective stakeholders and not by IT itself, transparency and understanding are necessary in both directions.

A challenge for determining the value contribution is the selection of suitable key figures. According to the study, IT departments today primarily use technical and IT-related metrics. That is legitimate, but in this way, there’s no direct connection to the business.

Plus, there’s often a lack of affinity for meaningful KPIs, both in IT and in the specialist departments, says Jürgen Stoffel, CIO at global reinsurer Hannover Re. In practice, therefore, only a few metrics suitable for both sides are found, and as a result the IT value proposition often goes unseen.

“A consistent portfolio of metrics coordinated with the business would be helpful,” says Thomas Kleine, CIO of Pfizer Germany, and Held from the University of Regensburg adds: “Companies have to get away from purely technical key figures and develop both quantitative and qualitative metrics with a business connection.”

To make progress along this path, the consultants developed a process model with several development and evaluation phases, drawing on current scientific findings and conversations with CIOs. They also tested the concept at a German mechanical engineering company. The result is a six-step process model with which the IT value proposition can be measured and communicated.

1. Analysis of business goals and business environment

The consultants caution that many CIOs start with their own metrics without knowing what’s vital to the business. IT managers should first look at the business goals and the business environment, since without knowing goals and market trends it’s difficult to link added value with IT.

2. Analysis of stakeholders

The next step is a thorough analysis of stakeholders, with continuous and organized management, since it’s key for CIOs to identify and prioritize key stakeholders. Then they should talk to them individually to find out what their goals are and where IT can help. “CIOs should act as stakeholders’ partners in these discussions,” says Stephany, adding that the core question must be: “How can IT create added value so we can become better together?”

3. Modeling the business capabilities

In order to create a common basis for discussion, more transparency is needed at the interface between business and IT, the study authors explain. Business capabilities structured in a business capability map (BCM) can serve as a starting point. It all revolves around what the company is doing today and where its future potential lies. “The BCM can be the central tool for discussions with stakeholders,” explains Matschi. In this way, companies can find out for themselves which business capabilities are differentiating and how IT can support them.

4. Modeling the business-IT relationships

Based on the BCM, dependencies and relationships between business and IT can be made visible. CIOs can demonstrate how and where IT provides concrete support today and in the future. This serves as a starting point for measuring the IT value proposition. In this way, maximum transparency is achieved and, according to the authors, the IT black box is opened.

Christian Büchner, CIO of SachsenEnergie, has already gained BCM experience: “At SachsenEnergie, we work with a BCM that’s arranged according to business-related departments and all of our more than 600 applications, or IT capabilities, are assigned to these business capabilities,” he says. 
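As a toy illustration of that kind of mapping, with invented capability and application names, a BCM can be held as a capability-to-application index that also answers the reverse question of which business capabilities depend on a given application:

```python
# Minimal BCM sketch (invented names): business capabilities linked to the
# applications (IT capabilities) that support them.

bcm = {
    "customer billing":  ["sap-fi", "invoice-portal"],
    "grid maintenance":  ["asset-mgmt", "field-dispatch"],
    "customer service":  ["crm", "invoice-portal"],
}

# Reverse view: which business capabilities depend on a given application?
def supported_by(app: str) -> list[str]:
    return [cap for cap, apps in bcm.items() if app in apps]

print(supported_by("invoice-portal"))  # ['customer billing', 'customer service']
```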

Christian Graf, CIO at toy manufacturer Schüco, is taking a similar approach. The points of contact with IT are mapped and managed along the digital customer journey.

5. Measuring the value proposition

“We’ve learned in our discussions and from scientific findings that there is no general, standardized measurement with definitive key figures,” says Matschi. “Each measurement is individual and dependent on the stakeholder and the business case under consideration.”

Against this background, the consultants developed a 3×3 matrix structured according to IT- and business-oriented key figures. Both quantitative and qualitative measurement approaches are presented in order to measure the value contribution individually for a stakeholder or a business scenario. Depending on the focus of the activity (operations, projects, innovation) and the business architecture level (IT capabilities, business capabilities, business goals), different types of added value can be derived, each backed by certain metrics.

Discussions with the CIOs made it clear where the hurdles lie in the measurement. It’s particularly difficult to measure the IT value added in day-to-day operations, for example, especially for commodity IT services such as network or workplace services, says Holger Blumberg, CIO at mechanical engineering company Krones.

6. Planning the communication

Once the value contribution is identified in its respective forms, the next step is well-planned communication. CIOs should consider what information they want to provide to whom. From 4C’s point of view, the form of presentation is also decisive for success, and it must look different for qualitative metrics than for quantitative ones.

To gain more added-value visibility, and to find out where IT can provide even better support, CIOs take different paths. Kleine of Pfizer Germany, for example, introduced IT ambassadors in the departments to support communication. Büchner, CIO of SachsenEnergie, relies on the additional role of “demand managers”: IT employees with business skills who act as an interface between IT and business and take care of communication with the department.

The sum of its parts

Discussions with CIOs have shown that the process model can be used in practice. Krones CIO Blumberg, who helped develop the model with his team as a practical partner, reports successful piloting and success in cooperation with stakeholders. In addition, the survey revealed other useful results that could help make the IT value contribution more tangible.

Matschi and Held add that there’s no single value behind it, but a conglomerate of metrics. These can be qualitative or quantitative in nature and ultimately show how satisfied stakeholders in the departments are with IT. It’s important to connect facts with perceptions, and to measure metrics in dimensions that are relevant for the respective stakeholders. The bottom line is that it’s about shared success across departmental boundaries. Both IT and business departments should be able to see how IT affects business performance; neither can create value on its own. In the end, it’s the combination of IT and organizational skills that counts.

(This post is based on an article from our sister publication CIO Germany.)


CIOs collaborate with C-suite colleagues on a regular basis. Given the high value of data and analytics to business, among the most important of these relationships is the one a CIO develops with their chief data officer (CDO).

The CDO is responsible for enterprise-wide governance and use of information as an asset, through data analysis, processing, mining, and other means. Considering the nature of their responsibilities, CIOs and CDOs are bound to have many opportunities to work together, and must do so in a way that benefits the enterprise as a whole.

“The CIO role has been around for over four decades, with the responsibility of managing the systems and infrastructure that produce business data and administering those repositories that contain data, including the oversight of business intelligence initiatives to exploit data assets for reporting and business insight,” says Su Rayburn, vice president of information management and analytics at Delta Community Credit Union (DCCU).

As data and analytics have become more critical to business, data volumes have increased significantly, and companies have turned to cloud and other technologies to scale and democratize their analytics strategies. “This has prompted many companies to add the CDO position to executive management, to be responsible for managing the strategy, quality, and governance of an increasingly crucial asset,” Rayburn says.

Vesting a senior leader with the exclusive role of managing all aspects of data, including governance, risk, compliance, policy, and business value realization management is increasingly seen as a solution for data-driven companies, Rayburn says. And with the introduction of these new C-suite colleagues, divisions of labor and lines of collaboration among data and IT teams are shifting along the systems/data divide.

Collaboration is key

The emergence of the CDO doesn’t mean CIOs are no longer involved in data projects. In fact, the two executives can work collaboratively to ensure that an organization is getting the most from its information resources.

“The CDO and the CIO should work in close collaboration and build a partnership and an alliance,” says Helena Schwenk, vice president and chief data and analytics officer at database software company Exasol.

“This alliance could prove quite useful, especially as many organizations remain keen on driving digital transformation,” Schwenk says. “As we very well know, data is [the] beating heart of any digital transformation.” It’s to the benefit of both to work closely together, she says.

“One of the most critical relationships I have is with our CIO,” says Kathy Rudy, chief data and analytics officer with technology research and advisory firm ISG.

“I look at the relationship as both tactical and strategic,” Rudy says. “From a tactical perspective, we need to have a common understanding and agreement around data security, privacy, retention, storage, taxonomy, data structures, and base technology for managing data across the enterprise. Once you have a common understanding of the foundational aspects of data, you can move on to the strategic; how to leverage data to drive business results.”

Rudy works with the CIO and business partners to develop products that monetize data for the firm.

“Having the tactical elements of our data ‘handled,’ we aren’t bogged down with questions about how we are going to do something, but rather we can focus on the art of the possible — what we can do with speed and agility,” Rudy says. “We have a creative license to develop new products leveraging APIs or microservices that combine our data into new and hopefully revenue-generating products.”

The team can also quickly respond to business requests for data that supports running the business, “which honestly are endless and previously took weeks to implement,” Rudy says. “We now look like superstars when we can say, ‘Yes, we can do that,’ and enable it overnight.”

Excelling at analytics

A strong CIO-CDO partnership positions companies well to leverage emerging technologies and data strategies, such as edge analytics.

“Data and analytics are essential to timely decision-making and fuel digital transformation,” Rayburn says. “It is hard to imagine efficient analytics without a well-designed data architecture working seamlessly with enterprise architecture. Given the co-custodianship of data, successful implementation of enterprise strategies will depend upon CDOs and CIOs working well together.”

Today, many applications employ embedded analytics to interact smartly with end users, Rayburn says. “Most of these apps employ edge analytics as they take in and analyze data in real-time at the application level to maintain a sub-second response,” she says. “Pulling this off will require a good partnership between the CIO and CDO, with data architects working with enterprise systems architects to ensure the requisite performance and scalability.”

DCCU has deployed analytics in its mobile banking app that required the company’s data scientists to develop alongside its systems architects, to ensure a cohesive architecture, Rayburn says.

Another area where CDOs and CIOs can work together is DataOps, a set of practices that combine an integrated and process-oriented perspective on data with automation and methods from agile software engineering to improve quality and speed and to promote a culture of continuous improvement in the area of data analytics.

“DataOps allows for the application of the DevOps methodology to ongoing deployment and maintenance of data or data analytics-intensive applications,” Rayburn says. “By adding data specialists to operational processes typically handled by IT, DataOps ensures the much-needed collaboration and integration between IT and data teams happens with the objective of seamless orchestration of data, tools, code, and environments.”

DCCU uses DataOps for its mobile apps to improve time to market for some of its customer-facing analytics products, through continuous delivery. “A data scientist works hand-in-hand with IT to test and implement analytics iterations in sandboxes for quick and continuous deployment of models,” Rayburn says.
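As a flavor of the automated checks such a pipeline might run before promoting a new analytics iteration out of the sandbox, here is a generic data-quality gate sketch; the thresholds and column names are invented, and this is not DCCU’s actual implementation:

```python
# Generic DataOps-style data-quality gate (illustrative, not DCCU's pipeline):
# assertions that run automatically before a new analytics iteration is
# promoted out of the sandbox. Thresholds and column names are invented.

def quality_gate(rows: list[dict]) -> list[str]:
    failures = []
    if not rows:
        failures.append("batch is empty")
        return failures
    null_balances = sum(1 for r in rows if r.get("balance") is None)
    if null_balances / len(rows) > 0.01:   # allow at most 1% null balances
        failures.append("too many null balances")
    if any(r.get("balance") is not None and r["balance"] < 0 for r in rows):
        failures.append("negative balance found")
    return failures

batch = [{"balance": 1200.0}, {"balance": None}, {"balance": 50.5}]
problems = quality_gate(batch)
print("promote" if not problems else f"block: {problems}")
```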

When CDOs and CIOs work together, “joint strategizing, planning, developing, and coordinating will ensure an efficient division of labor that eliminates data silos and accelerates digital transformation,” Rayburn says.

Avoiding friction

Given that there will likely be overlap between CIO and CDO responsibilities in some areas, there’s bound to be friction.

“Data is generated by or consumed by the applications that enable the business,” says Marcus Murph, leader of CIO advisory at consulting firm KPMG. “This creates a natural friction between the CIO and CDO, as choices about data architecture, data governance, tools — and their costs — can conflict with broader IT operating model preferences.”

In addition, data must be secured, and this creates potential conflicts between data solutions and cybersecurity standards typically established by the CISO, Murph says. This can also create friction between CDOs and other executives. “None of these friction points needs [to] be counterproductive,” he says. “Proper operating model design explicitly identifies these points of friction and provides mechanisms to avoid or resolve conflicts.”

Both the CIO and CDO roles have a similar purpose in that they lead corporate efforts to drive positive business outcomes through the optimal use of technology, including data and related technology, Rayburn says.

Because data can’t be easily separated from its underlying technology infrastructure, there could be a conflict in terms of responsibilities, she says.

“Too many IT shops are still more vested in the infrastructure and technologies that house the data than in the expertise to drive value from data, unless the CIO has traditionally maintained a strong analytics focus,” Rayburn says.

The role of the CIO emerged in the mid-1980s, whereas the chief data officer is a relatively new appointment recently gaining traction in the face of increasing digitalization, Schwenk says.

“The lack of clarity around roles and responsibilities and the drivers for CDO appointments can mean there is friction between these two roles,” Schwenk says. “Their roles and responsibilities are dependent on the overarching business goals and where the organization is on its digital transformation journey. Still, there could be friction when it comes to how data is managed within the IT infrastructure, which could make CIOs feel threatened.”

For more progressive organizations, a clearer distinction between the roles and responsibilities of these senior leaders and where they fit into the organization is more common, Schwenk says.

Reporting structures

Where CIOs and CDOs fit into a company’s reporting structure varies based on the data maturity, industry, and state of digitalization in the business, Schwenk says. “Why the CDO has been appointed bears considerable weight [on] where they typically report,” she says.

According to CIO.com’s State of the CIO 2023 survey, 53% of CDOs report to the CIO or top IT executive, with 35% reporting to the CEO and 7% reporting to the CFO or top finance exec.

But the CDO role is evolving. The first generation focused on governance and compliance and building out a trustworthy data foundation, Schwenk says. The second generation had a solid foundation in the data governance and compliance area, but was also looking to drive business value from the data.

“This is actually where we see this tie-in with digital transformation cementing itself,” Schwenk says. “These CDOs placed greater emphasis on a more proactive approach to data management, rather than reacting to GDPR [General Data Protection Regulation] and other privacy laws and regulations.”

For example, they began using data to serve customers online with a better customer experience, to optimize or digitalize supply chains, and the like, Schwenk says.

“CDOs from the first generation, [who] tend to care more about managing and governing their data, report to the CIO or IT leader,” Schwenk says. “The second generation tends to be more business-oriented, which means they could report to a leader of the business function or the CEO. And research shows that CDOs reporting to the CEO tend to have more success. They have sponsorship, a clear sense of direction, etc.”

All this matters “because the use, exploitation, control, management, and governance of data isn’t a purely technical decision, just for the IT department, or assumed solely from the business side,” Schwenk says.

While CIOs typically have been more likely than CDOs to report to the CEO, this is changing, adds Abhijit Mazumder, CIO and global head of sales enablement at consulting firm TCS. “Increasingly, CDOs may report directly to the CEO,” he says. “In other cases, a CDO may report to the [general manager] of an individual business unit.”

To ensure success, the CIO and CDO must be closely aligned, Mazumder says. “In situations where both report to the CEO, this is even more important,” he says. “Because the roles may overlap in platforms and practices, conversations about new platforms, vendors, or even new revenue lines should always involve both leaders and their teams.”


By Hock Tan, Broadcom President & CEO

In the years that I have led Broadcom, I have found two things to be true for technology leaders: First, success with your customers starts with success with your ecosystem partners; and second, driving ecosystem growth is key to maintaining the growth of your own business.

This is why, at Broadcom, we bring innovation, investment, and attention to making customer value a lasting reality through our pioneering partner programs. These programs help us drive two pivotal customer objectives: innovation in technology and innovation in business models.

From joint innovation to accessing new markets, our pioneering partner programs help us do more for customers. As digital transformation accelerates, customers need fully integrated solutions that address their needs.

Today, we have more than 35,000 partners in our IT infrastructure and cybersecurity software ecosystem, and every single one plays a vital role in bringing value and success to our customers. We work with many kinds of partners across the entire value chain – including the production, procurement, distribution, and deployment of our products. They help us expand the reach of our technology and drive better business efficiency and experiences for customers.

When we set out to make any business decision, we always ask ourselves the following three questions:

Does it drive a better outcome for the customer?
Does it allow and enable profitability for a partner?
Does it drive better efficiencies for Broadcom?

If the answer to any of these is “no”, it’s not a path worth pursuing. Our partners and customers should always benefit from the decisions we make.

What partners bring to Broadcom’s customers

At Broadcom, we understand that the key to growth isn’t being all things to all people. Instead, we believe our customer-first mindset, coupled with purposeful partnerships, is the key to delivering untapped value for customers.

Broadcom’s innovative and industry-first partnership models provide that purposeful plan for how our partners integrate into the overall value chain, and they empower each company to leverage its core competencies and do what it does best. Our highly capable partners help us provide solutions for customers ranging from the world’s largest public and private organizations to small- and medium-sized businesses (SMBs). Through Broadcom’s unique, friction-free Expert Advantage Partner Program, partners deliver high-value services to customers of all sizes – including our largest enterprise accounts.

Yet the value our partners deliver goes far beyond services. On our Insights Marketplace at expert.broadcom.com, customers can find partner-built applications that extend our product capabilities and tailor them for specific use cases – unlocking more value from our customers’ investments. In short, for every challenge there’s a Broadcom partner ready to deliver the solution and support the specialized needs of businesses – regardless of size.

What Broadcom brings to partners

At Broadcom, we are unique in how we engage with and support our partner ecosystem. Often, commercial vendors will attempt to control how their partners conduct business. But at Broadcom, we empower partners to identify and pursue their own commercial strategies, so they can bring sales and services to end-user customers on their own terms. We introduce industry-first, go-to-market partner models with shared risk and significant rewards. 

Our Global Cyber Security Aggregator Program (CSAP) is proof. CSAP was launched to expand our market reach and deliver enhanced levels of service to a subset of commercial enterprises with unique needs. The program brings together Broadcom’s Symantec cybersecurity solutions and partners’ resources, along with their in-country expertise, to offer a best-in-class customer experience. We have made significant investments, including in sales training, to ensure our distribution partners are well equipped to provide better customer support and quicker response times to evolving threats.

Our customers can also receive hands-on technical help through our unique Broadcom Software Knights Program. We vet certified partners and provide them with ongoing technical training, product presale intelligence, and sales intelligence so that they can handle any complex issue put in front of them. We provide them with the best so that our customers experience the best.

Together, we have a shared goal and responsibility: addressing our customers’ needs and delivering superior outcomes. It’s a win-win-win. Our message to our customers, current partners, and future partners is this: Our goal is to deliver superior outcomes for customers of all sizes, and our partners’ success is our success. We understand the value our partner ecosystem brings to Broadcom and our mutual customers, and we are committed to our partners’ and customers’ continued success.

Learn more about Broadcom here.

About Hock Tan:


Hock Tan is Broadcom President, Chief Executive Officer and Director. He has held this position since March 2006. From September 2005 to January 2008, he served as chairman of the board of Integrated Device Technology. Prior to becoming chairman of IDT, Mr. Tan was the President and Chief Executive Officer of Integrated Circuit Systems from June 1999 to September 2005. Prior to ICS, Mr. Tan was Vice President of Finance with Commodore International from 1992 to 1994, and previously held senior management positions with PepsiCo and General Motors. Mr. Tan served as managing director of Pacven Investment, a venture capital fund in Singapore from 1988 to 1992, and served as managing director for Hume Industries in Malaysia from 1983 to 1988.


Companies across nearly every vertical are finding a transformational lifeline in industry clouds. Swiss biopharmaceutical Idorsia is one such company, having embraced a partnership with industry cloud provider Veeva to survive.

In June 2017, Idorsia had a lot on its plate, namely a new company to stand up, with 650 scientists and employees, a robust discovery pipeline, early-stage clinical assets, and plans to launch commercial products within five years.

More challenging still, its spin-off from Actelion following Johnson & Johnson’s acquisition meant there were no existing systems or technology platforms. Idorsia needed a partner to help it move through the arduous scientific process and the multinational regulatory processes that accompany drug launches.

“We started with a blank page. I actually had no other choice than going for the cloud at that time,” says Joseph Bejjani, CIO of Idorsia, who selected Veeva, an industry cloud for life sciences. “Veeva covers a large scope of our environment, from clinical development to quality and regulatory affairs, with one user experience in one interface.”

That’s just one of the benefits of an industry cloud, he says. Veeva’s life sciences cloud, for example, not only handles Idorsia’s regulatory, sustainability, and commercial processes but also provides predefined FDA formatting.

Perhaps most important, Idorsia taps into Veeva’s evolving knowledge base, which encompasses data from other customers such as pharmaceutical companies Merck, Bayer, and Kronos, the CIO says. And that is a major gain for a startup: getting the know-how and experience of Veeva’s entire customer base, he says.

“Compliance is key for us, but industry knowledge is extremely important for a relatively small company. We get the collective knowledge of our industry,” he says, noting that Idorsia also relies on Veeva to navigate regulatory issues that vary in each nation. “[The] Veeva cloud solution provides us with industry best practices.”

Going vertical

Hundreds of “industry clouds” tailored to specific verticals have been developed by a range of vendors, from hyperscalers that sponsor vertical solutions to consulting firms that have built custom clouds for select clients.

These clouds are also often distinguished by the underlying partnership that resulted in the solution or the underlying platform on which the cloud runs. Veeva, for example, runs on Salesforce CRM.

Idorsia’s Bejjani says there are two components to the biopharmaceutical’s Veeva cloud: one for R&D and another for commercial requirements. Idorsia chose Veeva when it was in the last phase of a clinical study of its first commercial insomnia drug. The company had only nine months to complete the process before submitting its application to the FDA.


Veeva’s solution captures all the management and technology checkpoints — the structure, output, and terminology, Bejjani says, adding that it then “integrates with another fundamental tool in our clinical operations. This file captures all the data that we use to submit our procedure. It’s pre-defined with standard chapters. When you submit to the FDA, you must have clearly defined chapters.”

Idorsia could have built its own Salesforce-based solution, but the value Veeva adds is immeasurable, its CIO notes. “It’s better to go with an industry cloud because you inherit what research work the industry cloud provider has,” he says.

Idorsia currently has two products on the market. The insomnia drug has launched in the US, Italy, and Germany. Last year, the company launched another product in Japan, and it currently has 10 products in clinical development, roughly half of which are in the late stages.

“The configuration we have today has been extremely beneficial because I do have the vendor’s attention. I have a unique setup. I have a large scope of functionality and systems,” Bejjani says, noting that his needs for speed, simplicity, and sustainability led him to choose a cloud-platform-based solution from a preferred vendor with strong industry knowledge and presence.

Making sense of a complex market

Given the variety of approaches and solutions, the industry cloud market has grown vast and complex. Consulting firms such as KPMG and Accenture agree there is no clear definition of what an industry cloud is, and its components, services, and technology stacks are still evolving.

“It’s a term that is still forming, but [what] we would all agree on now is that it’s using cloud technology to solve problems specific to an industry sector,” says Marcus Murph, KPMG’s US leader for cloud.

Murph points out, for example, that Microsoft has a financial solutions industry cloud, yet many enterprises use IBM for financial services in the cloud, and still other financial companies have developed a high-end solution in conjunction with NASDAQ that includes analytics and machine learning models.

Much of the focus of industry clouds to date has been on foundational aspects of doing business in the cloud, such as which workloads to migrate and whether to lift and shift them directly or redesign them from scratch as cloud-native applications. But as Idorsia’s use of Veeva shows, some industry clouds also offer industry-based tooling, such as “solutions that address regulatory challenges and controls in different sectors, as well as data models specific to different vertical sectors,” Murph says.

Due to their nature, industry clouds likely will remain collaborative affairs. “The industry-based cloud has to be an ecosystem that stitches different technologies together and solves different problems,” Murph says. “I don’t know that you’ll ever see one company dominate an industry cloud on its own.”

CIOs must think strategically before selecting an existing industry cloud solution or building a custom industry cloud with a partner, says Ashley Skyrme, global cloud first strategy and consulting lead at Accenture.

This involves “rewiring their value chain” of products, solutions, and services: rethinking their tech stack more strategically, orchestrating multiple data assets, and unlocking data from many sources.

Skyrme points to Volkswagen as a prime example of an enterprise that built an automotive cloud platform by opening up and collaborating across different industries to bring its supply chain together.

“It’s not a pre-formulated kit,” Skyrme says about defining the industry cloud. “We think of it as much more exhaustive across the cloud continuum. It’s an evolving ecosystem of standardized, reusable, and interoperable digital assets. That’s the holy grail of the industry cloud. Driving differentiation and growth … new products, new platforms, and new experiences.”

As for Idorsia, embracing an early but established life sciences industry cloud has no doubt enabled the startup to turn its R&D into a profitable business — which can be more challenging than the science itself. And Bejjani is one CIO who is glad he didn’t try to tackle it alone.

“We could do it, but it would be very time consuming and expensive,” he says. “Veeva already created the vertical for the pharmaceutical industry and the workflows and terminologies of the industry are pre-configured and embedded in their product.”

For enterprises like Idorsia whose tech stacks aren’t their key differentiator, the value proposition of industry clouds is compelling.


The past several years have thrown numerous challenges at consumer packaged goods (CPG) companies. The pandemic has led to shifting consumer channel preferences, a supply chain crunch, and cost pressure, to name just a few. CPG titan Unilever has been answering the challenge with analytics and artificial intelligence (AI).

The 93-year-old, London-based CPG company is the world’s largest soap producer. Its products include food and condiments, toothpaste, beauty products and much more, including brands like Dove, Hellmann’s, and Ben & Jerry’s ice cream.

Alessandro Ventura, CIO and vice president of analytics and business services for North America at Unilever, has been at the forefront of helping the company apply AI to its businesses for years. While originally in the role of IT director, he has since added analytics and people services to his portfolio.

“That’s everything from facility management, fleet management, employee and facilities services, and people data, and that kind of stuff,” Ventura explains.

Unilever believes AI is not a technology of tomorrow. It’s already being widely used, and Ventura feels all industries will need to adapt to it.

In recent months, Unilever has developed a number of new technology applications to help its lines of business in the markets of tomorrow. One of the most important is “Alex,” short for Alexander the Great. Alex, powered by ChatGPT, filters emails in Unilever’s Consumer Engagement Center, sorting spam from real consumer messages. For the legitimate messages, it then recommends responses to Unilever’s human agents.

“Although Alex is good at what it does, it may lack a bit of a personal touch that instead our consumer engagement center agents have in big quantities,” Ventura says. “So, we let them decide whether they want to respond to our consumer as Alex suggested, or they want to add some personal recommendation; if the answer suggested by Alex is wrong or doesn’t have an answer, they can flag it so Alex can learn it the following time.” 

Generative AI in action

Alex was created using a system of neural networks, with ChatGPT for content generation. Ventura says the tool can understand what a consumer is asking and even capture the tone. It can then store the answer and sentiment in Salesforce. Importantly, he says, the tool does the heavy lifting on those tasks, giving the human agents more time to dedicate to what they do best. To date, Ventura says Alex has helped Unilever reduce the amount of time agents spend drafting an answer by more than 90%.

Another Unilever tool, called Homer, leverages ChatGPT to generate content. It’s a neural network that takes a few details about a product and generates an Amazon product listing, with a short description and a long description that match the brand tone.

“We want to ensure we captured the voice of the brand so, for example, that we differentiate between a TRESemmé and a Dove shampoo, and the system got it absolutely nailed,” Ventura says. 

Another AI-based tool, which Unilever launched in the week of US Thanksgiving, supports the Hellmann’s mayonnaise brand. Its purpose is to reduce food waste.

“It links up with the recipe management system that we have at Hellmann’s, so somebody can go in and select two or three ingredients that they have in the fridge and get, in exchange, recipes for what they can do with those ingredients,” Ventura says.

In its first week, the tool attracted 80,000 users, who reported loving it.

For Ventura, that’s the magic of analytics and AI in the CPG space: It enables personalization at scale.

“In CPG, we rely more and more on analytics and AI for different things,” he says. “Consumers are more and more specific about what they want. It’s a bit of a cliché, but they really do want personalized products and experiences. Analytics helps CPG to understand the context they’re navigating through and what the consumer wants, and then, with AI, we can scale that one-to-one relationship across all the multitude of consumers that we have.”

Co-creation key to AI success

Beyond the consumer relationship, analytics and AI are also key to making CPG companies more sustainable. Ventura points to examples like ingredient traceability and using machine learning (ML) to automate forecasting, which in turn helps the company minimize waste. Unilever is also applying analytics and AI to logistics, including tracking inventory and optimizing routes.

“The old interpretation of elasticity, we threw it out the window,” Ventura says of operations in the wake of the inflation crisis. “We had to come up with new calculations because the traditional ones were giving us very different scenarios from what we were seeing happening at the shelves. Going forward, we will continue to see that pressure from all the different challenges coming from the geopolitical situation around the world.” 

To support its innovation around analytics and AI, Unilever has adopted a hybrid model. It has a global center of excellence, but also keeps some data scientists embedded with business units.

“It’s basically a two-gear system,” Ventura says. “The local team can be activated very quickly, ingest the data very quickly, and then create a statistical model and analytics model together with the business, sitting next to each other. Then, if that model can be leveraged across and scaled, we pass it on to the global team so they can move data sets in the global data lake that we have and can start creating and maintaining that model at a global level.” 

Ventura believes co-creation and co-ownership of analytics and AI capabilities with the business function is essential to success.

“Whether it is machine learning for automating the forecast or Alex with the Consumer Engagement Center, if we show up with a black box and say, ‘Hey, follow whatever the machine tells you,’ it will take a long time and probably will never get to 100% trust in the machine,” Ventura says. “With co-creation and co-ownership, I feel like we get to start with the right foot, with the human and the machine working alongside each other in partnership, almost as colleagues. Also, you get a much less biased system in the end because you’re able to introduce a much more diverse angle in your algorithms, both from a business perspective and a technology perspective.” 
