Enterprises seeking to thrive in an innovation-centric economy are capitalizing on multi-cloud strategies to leverage unique cloud services. These services help accelerate initiatives supporting AI, data processing, and other pursuits, such as driving compute to the edge.

That’s all well and good – until the CIO gets the bill.

In a survey of more than 1,000 global IT decision makers conducted by Forrester Research with HashiCorp, 94% of respondents said their organization was currently paying for avoidable cloud expenses.1

Meanwhile, IDC’s Archana Venkatraman, research director for cloud data management in Europe, adds: “While cloud adoption has accelerated, cloud governance and control mechanisms haven’t kept pace. As a result, up to 30% of cloud spend is categorized as ‘waste’ spend.”2

Examples of cloud cost surprises

Even inside the controlled environment of an enterprise’s datacenter, it’s not always easy for IT staffers to keep track of resource utilization. Now imagine the challenge of tracking usage from dozens of engineers in a multi-cloud environment where each service provider has its own tooling, processes, and procedures. Being able to bring all that data into a single view is extremely difficult.

Here are the top three cloud cost surprises that CIOs are likely to encounter.

Unused resources: Over time, cloud environments inevitably sprawl, leading to unused storage volumes, idle databases and zombie test instances.

Modernization: Businesses are slow to adopt newer instance types, which can offer efficiency gains of 20% or more, due to a lack of visibility into and understanding of the upgrade path. Given the hundreds of instance types available, it’s easy to understand why.

Anomalies: The biggest cloud cost surprise is the one that comes out of nowhere: an unexpected spike that could be caused by any of a variety of factors, such as misconfigurations, orphaned instances, runaway crypto-mining malware, or unauthorized deployments.

Enter FinOps

FinOps has become the de facto way enterprises manage cloud cost uncertainty, typically with machine learning used to help deliver insights to DevOps, CloudOps, and the C-suite. What’s more, FinOps provides a common language among developers, infrastructure teams, and business leaders, helping show return on investment per project or set of services.

A disciplined FinOps practice, coupled with tools like OpsNow, helps in three ways:

FinOps tools can discover unused or orphaned resources and enable organizations to “rightsize” their cloud deployments.

Through the use of anomaly detection, FinOps provides an early warning system that alerts IT teams to usage spikes and budget overruns before they get out of control.

Cloud environments are constantly changing, so FinOps is never done; it’s an ongoing process that can help organizations optimize their cloud spend over time, do a better job of budgeting and forecasting, as well as avoid those billing surprises.
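To make the anomaly-detection idea above concrete, here is a minimal sketch of the kind of spend check a FinOps tool automates: flag any day whose cost exceeds the trailing average by more than a chosen multiple of the trailing standard deviation. The numbers and thresholds are illustrative, not real billing data or any vendor's actual algorithm.

```python
# Minimal spend-anomaly sketch: compare each day's cost to a trailing
# window and flag days that deviate sharply from the recent baseline.
from statistics import mean, stdev

def find_spend_anomalies(daily_costs, window=7, threshold=3.0):
    """Return indices of days whose cost spikes above the trailing window."""
    anomalies = []
    for i in range(window, len(daily_costs)):
        trailing = daily_costs[i - window:i]
        mu, sigma = mean(trailing), stdev(trailing)
        # Guard against a perfectly flat window (sigma == 0)
        if sigma > 0 and daily_costs[i] > mu + threshold * sigma:
            anomalies.append(i)
    return anomalies

costs = [102, 98, 101, 99, 103, 100, 97, 99, 340, 101]  # day 8 spikes
print(find_spend_anomalies(costs))  # -> [8]
```

Production systems layer seasonality models and per-service baselines on top of this, but the core early-warning loop is the same: model normal, alert on deviation before the bill arrives.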

The OpsNow approach

There are many options for FinOps, ranging from developing tools in-house to purchasing FinOps platforms. Managing your own platform and tooling presents CIOs with investment and ongoing maintenance challenges. In the fast-moving world of the cloud, that is a risk many prefer not to take.

OpsNow offers a different take, allowing you to deploy without that upfront investment or ongoing maintenance. Coupled with its methodologies and metrics, it gives you a way to monitor and track your success.

OpsNow, a spinoff from Bespin Global, provides a SaaS platform that uses a “shared savings” model: customers pay only a small percentage of actual savings and are otherwise free to use the breadth of its capabilities at no cost.
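The arithmetic behind a shared-savings model is simple enough to sketch: the fee is charged only on realized savings, so the platform costs nothing if it saves nothing. The 15% rate below is a made-up example for illustration, not OpsNow's actual pricing.

```python
# Illustrative shared-savings arithmetic: the customer pays a fee only
# on the savings actually realized, never on gross spend.
def shared_savings(baseline_spend, optimized_spend, fee_rate=0.15):
    """Return realized savings, the vendor fee, and the customer's net benefit."""
    savings = max(baseline_spend - optimized_spend, 0)  # no savings, no fee
    fee = savings * fee_rate
    return {"savings": savings, "fee": fee, "net_benefit": savings - fee}

print(shared_savings(100_000.0, 80_000.0))
# -> {'savings': 20000.0, 'fee': 3000.0, 'net_benefit': 17000.0}
```

The appeal for a CIO is that the incentive is aligned by construction: the vendor's revenue rises only when the customer's bill falls.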

The OpsNow platform provides a single pane of glass across multi-cloud environments to identify unused resources, recommend capacity adjustments, provide AI insights for optimization, and deliver no-risk cost savings through its AutoSavings tool, which beats even the multi-year commitments offered by the cloud providers.

Advanced cost analytics and machine learning also support a more efficient approach to cloud spend, so IT leaders can innovate without worrying about unexpected costs. Automation is one aspect, but the savings typically add up in the ability to closely model true usage and keep coverage and utilization aligned to the forecast.

Begin your journey to better cloud cost efficiency now.

1 HashiCorp, Forrester Research Report: Unlocking Multicloud’s Operational Potential, 2022
2 IDC, IDC Blog, The Era of FinOps: Focus is Shifting from Cloud Features to Cloud Value, February 2023


Jeff Dirks is fascinated by new technologies like generative AI. But when it comes to implementation, the chief information and technology officer of workforce augmentation firm TrueBlue chooses a path that trails early adopters. “We’re in the early majority,” is the CIO/CTO’s blunt self-assessment.

Although many IT leaders would like to think of themselves — and have others think of them — as in the vanguard of new technology adoption, the vast majority find themselves in the middle of a bell curve, with innovators leading the way and laggards trailing behind, according to Everett Rogers’ diffusion of innovations theory [see chart]. But there is no one “right” place to be along the curve. The trick is to know where your organization belongs — and to make the most of it.

The diffusion of innovations theory, published by Everett Rogers in 1962, places most organizations in the middle of a bell curve of technology adoption. 


“Organizations that are willing to explore new technologies and be the first in their industry face the highest risk and highest potential return on new technology. But that is very few companies,” says Brian Burke, research vice president for technology innovation at Gartner. A far savvier approach for most organizations is to exploit adjacency — to keep an eye on innovators in similar industries and, when the time is right, adopt the technologies they are using.

“If you’re in banking and you see that an insurance company has adopted a technology, you might adopt it as the first in your industry, gaining a first-mover advantage with less risk,” says Burke.

Dirks buys into that philosophy for TrueBlue, whose core business, PeopleReady, is to provide a platform for efficiently matching day laborers with companies that have contingent labor needs. For TrueBlue, the so-called “gig-economy” companies like Uber and Lyft and venture-backed competitors such as Wonolo and Instawork are adjacent bellwethers. While such companies are technology-first, TrueBlue is evolving from brick-and-mortar to digital, a path that calls for incremental, rather than radical innovation.

TrueBlue’s award-winning Affinix app, which aims to make recruiters more efficient by predicting which candidates have the highest probability of success, implements data science, machine learning, and RPA, technologies that Dirks calls mainstream. Consistent with that approach, Dirks is exploring the use of blockchain technology to create a trusted ledger of contingent laborers’ credentials, including facets such as drug and background checks. Although blockchain is no longer new, using it in this way would be a first in the day-labor industry, Dirks says.

Institutionalizing innovation

Far from the fast-track of venture-backed data science, government agencies often are found among late-majority and laggard organizations. “In the public sector, there is a propensity to not rock the boat. But that’s not always the right approach,” says Feroz Merchhiya, CIO and CISO of the City of Glendale, Ariz. He ranks the city between early majority and late majority, depending on the project.

When the state of Arizona centralized the collection of state sales taxes several years ago, municipalities such as Glendale struggled with the inefficiency of collecting taxes, sending them to the state, and then receiving their share back in disbursements.

“We did not have a solution to do this. It was very cumbersome for cities as well as businesses. There were a slew of complex interactions that needed to take place,” says Merchhiya. Having scouted industry events and discussed the situation with analysts, he and his team discovered there was no off-the-shelf solution.

“When we realized there was nothing available, we thought it would be a good idea to develop it on our own,” he says. With his own staff of 50 and a like number of contingent workers, Merchhiya led the development of AZ Tax Central, which Glendale is now using and making available to other cities for a charge that covers costs only.

With that successful project under his belt, Merchhiya is seeking to institutionalize the innovation process. Knowing that innovation in government can never succeed without administrative support and budget dollars, Merchhiya convenes annual meetings with civic officials to learn the issues they are facing and to brainstorm how technology might address them.

“It’s one thing to do accidental innovation. It’s another to put a process in place. I’m trying not to be an accidental innovator — the only way I can do that is to engage the business by inviting everybody to get together and have a conversation,” he says. 

Tailoring innovation for real-world impact

Engagement is also important at the District of Columbia Water and Sewer Authority (DC Water), an agency that serves the nation’s capital and the surrounding region, including Dulles Airport. “You have to embrace people to make them comfortable with suggesting ideas — and not being disappointed if their idea doesn’t move forward,” says Thomas Kuczynski, vice president of IT at DC Water, which is responsible for 1,300 miles of water distribution pipe, and 1,900 miles of sewer pipe.

Although DC Water’s budget does contain an allocation for experimental work, IT must keep its eyes on real-world challenges. Kuczynski says the agency’s technology adoption sometimes falls into each of Rogers’ classifications, depending on the project. “Our focus is to have a positive impact on the business as quickly as possible,” says Kuczynski. “We focus on opportunities where we believe we can create efficiencies and improve operations.”

For example, DC Water began using an AI-based product called PipeSleuth to inspect the sewer system. An improvement over previous CCTV systems, PipeSleuth sends a sausage-shaped drone on wheels through the system to look for pipe anomalies. Using deep-learning, neural-network technology, it tags defects and produces a report, rather than requiring operators to view reams of video as is necessary with CCTV-based systems.

Using PipeSleuth has enabled DC Water to substantially reduce the cost of pipe inspections, allowing the agency to inspect more pipe at the same cost. “The more we can inspect regularly, the more improvements we can make because we better understand the system. If we can do more inspections per dollar, we can apply that repair dollar to a bigger problem because we know more about my system,” says Kuczynski.

Another DC Water innovation is an event management system that integrates in a single dashboard SCADA data, inbound calls, work orders, USGS data, rain-level gauge, and other IoT inputs from sensors tracking water pressure, flow, and level. The system also tracks personnel and vehicles via GPS to dispatch repair staff to the highest priority trouble spots quickly.

“The dispatch system integrates IT and OT in a single dashboard. Now we can manage emergencies more effectively. In the past, if we got 20 calls about one problem, there might be 20 work orders. Now those calls are consolidated into one work order,” he says.  
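The consolidation behavior Kuczynski describes, twenty calls about one problem becoming one work order instead of twenty, is essentially a grouping step. Here is a toy sketch of that logic; the field names and records are invented for illustration, not drawn from DC Water's system.

```python
# Toy call-consolidation sketch: inbound calls reporting the same asset
# collapse into a single work order rather than one order per call.
from collections import defaultdict

def consolidate_calls(calls):
    """Group calls by the asset they report, yielding one work order per asset."""
    by_asset = defaultdict(list)
    for call in calls:
        by_asset[call["asset_id"]].append(call["caller"])
    return [{"asset_id": asset, "callers": callers}
            for asset, callers in by_asset.items()]

calls = [
    {"asset_id": "hydrant-114", "caller": "resident-1"},
    {"asset_id": "hydrant-114", "caller": "resident-2"},
    {"asset_id": "main-valve-7", "caller": "crew-3"},
]
orders = consolidate_calls(calls)
print(len(orders))  # -> 2 work orders from 3 calls
```

The real system keys on richer signals (location, SCADA alarms, sensor readings) rather than a single ID, but the payoff is the same: dispatchers see one prioritized incident, not a pile of duplicates.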

The art of selection

Ty Tastepe, senior vice president and CIO of Cedar Fair Entertainment Company, an operator of 13 amusement parks and resorts in the US and Canada, says his company tends to be in the “fast-follower” category, a space that’s generally recognized to sit somewhere between early-adopter and early-majority groupings.

“We look at technology implementations not only in our own industry but also in adjacent industries like retail and food services,” Tastepe says. He adds, “If there were no proven technology, we would entertain being an early adopter or try to pilot something new to address a business challenge.”  

“The good ideas are plentiful — the challenge is to identify the must dos, and whether a project fits within budget and resource constraints,” says the CIO. Because Cedar Fair is a public company, Tastepe thinks in terms of delivering results. He says new technologies must answer yes to at least one of three questions: Does it generate revenue? Does it improve efficiency? Does it satisfy compliance requirements?

“There will always be budgeting constraints; that’s why due diligence up front is important. We do a discovery process. Once the business sponsor puts together a charter for an initiative that can be enabled by a technology implementation, we discuss the merits of the initiative at the portfolio management committee,” he says. The next step is to flesh out the business model and the costs, sometimes with the assistance of a partner. “If we decide to move forward with the project, we might do a proof of concept before we deploy at scale,” he adds.

Gartner’s Burke agrees with Tastepe that winnowing down the field of technologies to a few plausible candidates is essential. “When you’re scouting new technologies it’s a bit of an art as well as a science,” says the analyst. Typically, organizations scan several hundred technologies, a number that must be reduced to a couple dozen for serious study, Burke says.

Knowing when to pull the plug

One company that is widely recognized to be in the innovator category is Amazon, which willingly invests in technology-heavy concepts such as Amazon Go, a convenience store with no checkout. Although Amazon has built a couple dozen such stores, the giant retailer recently announced the closure of several. In addition to reining in Go stores, Amazon has reportedly curtailed its drone delivery initiative. While such explorations and reversals are more than most companies can risk, they underscore the importance of continually evaluating pilots to ensure there is enough warranted value in going forward with the concept.

“In Amazon’s case, they are very good at testing technology and then abandoning it if it doesn’t work,” says Ananda Chakravarty, vice president of research for retail merchandising and marketing at IDC. One reason for the Go retrenchment, according to the analyst, is that the cost of cameras dropped significantly, making them a better technology choice than the sensors that Go implemented to track inventory on shelves.

Even so, according to Chakravarty, behemoths like Amazon and Walmart, which has also experimented with Scan-and-Go checkouts and delivery drones, can afford the luxury of trying — and learning from — things that others cannot.

“There’s some real value-added to a test-and-learn approach,” says the analyst. For example, he explains, Amazon initially planned to sell Go technology to other retailers, and when that didn’t pan out, the company decided to target Go to niche markets such as airports, stadiums, and transportation venues, where consumers place the highest value on the convenience of a frictionless experience.

Checks and balances

Rajiv Garg, associate professor of information systems and operations management at Emory University, says putting together a diverse team to evaluate new technologies is an essential step. The group should encompass multiple corporate departments and a variety of demographics. “You might need a millennial on your team to ask how your organization is impacting the environment and society,” he suggests. Once the team is complete, he advises, turn them loose to explore.

“Send them to conferences, buy them VR headsets. If the team doesn’t like one technology, that’s fine because they might find something else they like,” says the professor. Generative AI has reached a point at which it demands to be evaluated, according to Garg. “ChatGPT and DALL-E are going to be embedded in our work somehow. You need to engage employees and use them in your workplace,” he says.

Too often, says Gartner’s Burke, the excitement of working with new technologies causes organizations to jump the gun, failing to perform due diligence ahead of time.

“Companies that have assessed a tech opportunity before they launch a proof of concept are a small minority of companies. In contrast, most organizations identify a nifty technology and then rush into a pilot without having done the easier thing — to determine whether it will actually help them in some way,” says Burke.

Even if most organizations could benefit from more careful up-front analysis, gaining an edge in the market ultimately depends on the willingness to give new technologies a try.

“We like taking a lot of swings. The more swings you take, the higher the probability that one of them will hit something translating into competitive advantage,” says Dirks.


Cloud architectures and remote workforces have effectively dissolved the network perimeter, the traditional line of defense for IT security. Lacking that decisive boundary, the work of security teams has changed. Now to guard against data breaches, ransomware, and other types of cyber threats, protecting network endpoints is more important than ever. 

But protecting endpoints is a priority with a massive scope. Endpoints encompass everything from employee laptops, desktops, and tablets to on-premises servers, containers, and applications running in the cloud. Endpoint security requires a comprehensive and flexible strategy that goes way beyond what security teams relied on a decade or more ago. Then IT assets were nearly all on-premises and protected by a firewall. Those days are over.

Ransomware continues to evolve

Ransomware continues to be a major threat to organizations of all sizes. After declining for a couple of years, ransomware attacks are on the rise again. They increased 23% from 2021 to 2022. 

Not only are attacks more frequent, they’re also more disruptive. In 2021, 26% of attacks led to disruptions that lasted a week or longer. In 2022, that number jumped to 43%.

On average, each of these attacks cost its victim $4.54 million, including ransom payments made as well as costs for remediation. As bad as these numbers are, they’re poised to get worse. That’s because in the past year, attackers have adopted new models for extorting money from victims.

Business email compromise attacks

Another prevalent form of attack is business email compromise (BEC), where criminals send an email impersonating a trusted business contact, such as a company CEO, an HR director, or a purchasing manager. The email, often written to convey a sense of urgency, instructs the recipient to pay an invoice, wire money, send W-2 information, send serial numbers of gift cards, or to take some other action that appears legitimate, even if unusual. If the recipient follows these instructions, the requested money or data is actually sent to the criminals, not the purported recipient.

Between June 2016 and December 2021, the FBI recorded over 240,000 national and international complaints about BEC attacks, which cumulatively resulted in losses of $43 billion. Ransomware might make more headlines, but BEC attacks are 64 times as costly. And they are becoming more frequent, increasing 65% between 2019 and 2021.

“Endpoint monitoring won’t stop a BEC attack,” explains Tim Morris, Chief Security Advisor, Americas at Tanium. “But it might tell you a little more about the person who opened the email and what they did with it. Context can give you the clues you need for determining whether the attack is part of a broader campaign, reaching other recipients with deceptive messages.”

Practical tips for endpoint management

How should CIOs and other IT leaders respond to these evolving threats? Here are five tips.

1. Treat endpoints as the new network edge.

With so many people working remotely and 48% of applications running in the cloud, it’s time to recognize that the new line of defense is around every endpoint, no matter where it is and what type of network connection—VPN or not—it’s operating with.

2. Identify all devices connecting to the network, even personal devices not officially authorized.

“You can’t secure what you can’t manage,” says Morris. “And you can’t manage what you don’t know.” Security operations center (SOC) teams need to know all the endpoints they’re responsible for. Audits of enterprise networks routinely find that endpoint management systems miss about 20% of endpoints. SOC teams should put tools and processes in place to ensure they have a complete inventory of endpoints and can monitor their status in real time.
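The audit finding above implies a concrete reconciliation exercise: diff what the endpoint-management system knows against what network discovery actually sees, and quantify the blind spot. A minimal sketch, with invented hostnames for illustration:

```python
# Inventory-reconciliation sketch: find endpoints seen on the network
# that the management system doesn't know about, and size the gap.
def inventory_gap(managed, discovered):
    """Return unmanaged endpoints and the coverage gap as a percentage."""
    managed, discovered = set(managed), set(discovered)
    unmanaged = discovered - managed
    gap_pct = 100 * len(unmanaged) / len(discovered) if discovered else 0.0
    return unmanaged, gap_pct

managed = {"laptop-01", "laptop-02", "srv-db-01", "srv-web-01"}
seen_on_network = {"laptop-01", "laptop-02", "srv-db-01", "srv-web-01", "nas-legacy"}

unmanaged, gap = inventory_gap(managed, seen_on_network)
print(unmanaged, f"{gap:.0f}% unmanaged")  # -> {'nas-legacy'} 20% unmanaged
```

In practice the "discovered" side would come from sources such as DHCP leases, switch tables, or scan results rather than a hand-built set, but the set-difference at the core is the same.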

3. Patch continually.

Patching has always been important to ensure endpoints have access to the latest features and bug fixes. But now that software vulnerabilities have emerged as a major inroad for attackers, it’s critically important to ensure patches are applied promptly. Organizations can’t hope to respond to supply chain attacks like Log4j without putting in place automated solutions for software bills of materials and patching.

4. Drill.

Once you have a cybersecurity plan, a cybersecurity toolset, and a trained staff, it’s important to practice hunting for threats and responding to attacks of all kinds. It’s helpful to take a Red Team/Blue Team approach, assigning a team of security analysts to break into a network while another team tries to defend it. These drills almost always uncover gaps in security coverage. Drills also help teams build trust and work together more effectively.

5. Get endpoint context.

When attacks occur, it’s important to respond as quickly as possible. To respond effectively, security teams need to understand what’s happening on affected endpoints, no matter where they are. Which processes are running? What network traffic is taking place? What files have been recently downloaded? What’s the patch status?

Analysts often need answers in minutes from endpoints thousands of miles away. And they don’t have time to install new software or hope the remote user can help them set up a connection. Security teams need to have a system already in place for analyzing endpoints and collecting this data, so that when any type of attack occurs—even attacks like BEC attacks—they can collect the contextual information needed for understanding what happened and what threats remain active.

Cyber threats are becoming more prevalent, more sophisticated, and harder to identify and track. For more tips—five more in fact—on how to reduce the risk of cyberattacks and ensure that when attacks occur, they can be contained quickly and efficiently, check out this eBook.


By Sean Duca, vice president and regional chief security officer for Asia Pacific and Japan at Palo Alto Networks

Some economists predict that we could soon face a global recession, and history suggests this does not bode well for levels of cybercrime: there is evidence that macroeconomic conditions impact it directly. In times of economic downturn, cybercrime may increase as people turn to illegal activities to make money. During the 2008–2009 Global Financial Crisis (GFC) and subsequent recession, researchers noted that cybercrime rates increased dramatically. Their report, which focused exclusively on financial cybercrime, including identity theft, attributed the rise to the proliferation of new technologies in regions around the world, with many more people than ever before possessing IT skills.

Adapting to an evolving cybercrime landscape

Cybercrime is constantly evolving. To keep ahead of cybercriminals, organizations must remain agile, pivoting to embrace new strategies and technologies whenever necessary in order to outrun attacks. When businesses are struggling due to the recession, there’s a strong temptation to look for ways to cut spending.

This is the time when many CFOs trawl through every line item in the budget, looking for potential savings. Any non-essential expenditure that can be cut, or postponed, will be. This can be a challenge to cyber budgets in organizations where security is viewed primarily as a cost center.

However, another, smaller group of CFOs views cybersecurity as an area in which to discover potential savings. After all, most cybersecurity expenditure is aimed at preventing potential problems, which makes its ROI hard to calculate. But knowing that cybercrime rates are likely to rise during tight economic times, wiser leaders understand that cutting cybersecurity expenditures could, in fact, cost far more than it saves.

Taking a long-term approach to cybersecurity expenditures

Investments in cybersecurity are cumulative and allow organizations to build up their resilience to cyberthreats incrementally over time. So when an organization starts cutting back on its cybersecurity program, it can take many years to build back up to the level of cyber maturity it had before the belt-tightening.

Any minor savings achieved by cutting cybersecurity budgets in the near term could make your organization easy prey for cybercriminals whilst they actively hunt for their next victim. Moreover, if breached, your cybercrime losses could dwarf any budgetary savings. As a result, it’s essential to work closely with your CISO and security team to understand the tools they rely on most to keep up with the relentless pace of cyberthreats catalyzed by competent, well-funded, and motivated adversaries.

How organizations can gain efficiencies from security choices

First, complexity can make managing and securing an organization’s systems and data difficult. Keeping track of all the components and ensuring they are correctly configured and secured is a challenge in itself, and that same complexity makes identifying and responding to potential threats harder.

Second, complexity can make it more difficult for an organization to communicate effectively and coordinate its cybersecurity efforts. For example, suppose an organization has multiple teams or departments responsible for different parts of its cybersecurity strategy. In that case, ensuring that everyone is working together and following the same processes and procedures can be challenging.

Overall, it is vital for organizations to carefully manage the complexity of their cybersecurity efforts to protect their systems and data effectively. Doing so may involve implementing processes and technologies that reduce complexity and streamline cybersecurity management, and training employees to understand and follow best practices.

In these challenging macroeconomic times, consolidation becomes a key priority: replacing a dozen point tools by leveraging a platform or, as Gartner calls it, a cybersecurity mesh. Gartner believes that by 2024, organizations adopting a cybersecurity mesh architecture will reduce the financial impact of security incidents by an average of 90%.1 That’s a number no CFO would want to ignore.

Cyber resilience isn’t always about increased expenditures. Cyber budgets must be spent wisely, often without increasing costs, and targeted at the most likely risks. That starts with your crown jewels: who has access to them, and how they are protected. You can work with your security leaders to align on platforms that positively impact the health of your security.

1. Felix Gaehtgens et al, Top Strategic Technology Trends for 2022: Cybersecurity Mesh, Gartner, October 18, 2021

For more cybersecurity thought leadership, please visit us online.

About Sean Duca:

Sean is vice president and regional chief security officer for Asia Pacific and Japan at Palo Alto Networks. In this role, Sean spearheads the development of thought leadership, threat intelligence, and security best practices for the cybersecurity community and business executives. With more than 20 years of experience in the IT and security industry, he acts as a trusted advisor to organizations across the region and helps them improve their security postures and align security strategically with business initiatives.

Prior to joining Palo Alto Networks, he spent 15 years in a variety of roles at Intel Security (McAfee), with his last position as the Chief Technology Officer for Asia Pacific. Before this, Sean was involved in software development, technical support, and consulting services for a range of Internet security solutions.


A significant number of organizations are operating in a hybrid model — and expect to continue with that hybrid environment for the foreseeable future.

Global analytics and advice firm Gallup has found that the percentage of remote-capable workers working in a hybrid environment increased in 2022, moving from 42% in February to 49% in June. Gallup also found that only 20% of remote-capable employees were working fully on-site while 30% were fully remote in 2022. Furthermore, the firm had predicted that the number of hybrid workers would continue to increase, hitting 55% as we enter 2023.

Other reports confirm that hybrid work is here to stay. In its Future of Work predictions for 2023, IDC called hybrid work “a mainstay for our global future work landscape,” adding that “hybrid work will drive new technology solutions across functions and industries alike.”

Technologies cited by IDC include intelligent space and capacity planning tools, which the firm predicts 55% of global enterprises will use to reinvent office locations by 2024. IDC also predicts 65% of G2000 companies will consider online presence to be at parity to real life across their engaged workforce by 2025, with 30% of those same organizations adopting immersive metaverse conferencing tech by 2027.

While metaverse adoption is a ways off, CIOs are making significant investments in technologies aimed at improving the hybrid work experience. For example, Foundry’s 2022 State of the CIO survey shows that IT executives are investing in cybersecurity improvements and collaboration platforms better suited to their organizational needs as well as network reliability and performance improvements.

With competition for talent still tight and pressure on organizations to maximize employee productivity, Anthony Abbatiello, workforce transformation practice leader at professional services firm PwC, says CIOs should focus on what and how they can improve the hybrid experience for users.

He advises CIOs to partner with their counterparts in HR to identify the worker archetypes that exist in their organizations to understand how they work and what they need to succeed.

“CIOs should be asking how to create the right experience that each worker needs and what do they need to be productive in their job,” Abbatiello says. “Even if you’ve done that before, the requirements of people in a hybrid environment have changed.”

Hybrid workers today are looking for digital workplace experiences that are seamless as they move between home and office, Abbatiello says. This includes technologies that enable them to replicate in cyberspace the personal connections and spontaneous collegiality that happen more easily in person, as they seek experiences that are consistent regardless of where they’re working on any given day.

Here is a look at how some CIOs are bolstering their technology offerings to improve the hybrid working experience of their IT and business users.

Consistent, high-quality digital experiences

As senior vice president and CIO at Nutanix, Wendy M. Pfeiffer says she’s seeking to do just that: deliver consistent, high-quality digital experiences that drive productivity and efficiency whether team members are working remotely, in the office, or a combination of the two.

“The solutions going forward required some innovative technology but also process updates; it’s a mix of those things,” Pfeiffer says, adding that fully remote workers, always on-premises employees, and those who alternate between remote and in-office each face unique challenges in a hybrid environment.

To ensure she and her IT team deliver the right tools for every worker type, Pfeiffer co-developed with other executives the company’s “Principles for Effective Hybrid Work.” She says these helped her, her colleagues, and IT better understand workplace dynamics and requirements so IT can best address them.

“These principles form the basis of our prioritization, our choice of technologies, and are reflected in our [objectives and key results],” Pfeiffer explains. “So now whenever we do something in IT, like release a capability, we have these principles in mind. And we say what we’re doing, the principle it’s related to, and here’s how we measure it.”

Those principles, along with lessons learned during recent years, have helped Pfeiffer sharpen her tech strategy for supporting hybrid work in 2023.

To start, Nutanix IT has decided to go with applications that perform, act, and look the same whether accessed in the office or from a remote location, eliminating any confusion or extra layers that workers might have to think through as they do their work. The same goes for any hardware (e.g., laptops) being used. This, Pfeiffer says, will also help minimize or even eliminate lost productivity from context-switching — the shifting from one task to another unrelated one.

As for new tools heading into 2023, Nutanix IT is rolling out Lucidspark, a virtual whiteboard for real-time collaboration from anywhere. Pfeiffer says her team worked with Zoom, which her company uses, to add Lucidspark into that platform so Nutanix employees can seamlessly access it even when in those virtual meetings.

IT is also launching a tool from Huddle that uses artificial intelligence for note-taking to support teams that are working asynchronously and need to share ideas and conversations. Pfeiffer says she expects this tool, which workers can also access through the company’s Zoom application, to bring more efficiency to information hand-offs — a critical advantage for employees who, thanks in part to virtual work capabilities, are now often working at different times.

Another addition will be 360-degree cameras and microphones from Owl Labs, which will be deployed in company conference rooms and integrated with Zoom to make those joining meetings remotely feel more like they’re in the room.

Pfeiffer says IT also has a big focus in 2023 on bringing in technologies that enable a consumer-like experience, as employees who have been working from home “are using consumer tech like mobile phones and comfy chairs and gaming computers, and consumer tech runs circles around traditional enterprise technology” in terms of experience. For example, IT is bringing in the Discord voice, video, and text chat app so that workers can have multiple synchronous conversations in parallel, from wherever they are, just as they would in an office environment.

“Everywhere we can, we’re taking lessons from the consumer experience and bringing that to the office,” Pfeiffer adds.

Refining the hybrid work experience

Michael Error, vice president of IT and CIO for Blue Cross Blue Shield of North Dakota, is likewise looking to advance his company’s hybrid work experience with the tech investments he’s making.

“Early on, it was good enough to get our employees home and that they stayed safe and they had the capabilities to get on [company systems],” he says. “Since that time, we have moved to ‘What’s the experience?’ That has meant regardless of where you are, do you feel like it’s all the same? And it’s not just having the same technology whether you’re remote or not; it’s about making the people all feel they’re part of a team regardless of their location.”

He adds: “We’re trying to drive the same experience whether you’re in the office or remote and driving consistent, high-quality experiences.”

To do that, Error says he and his IT team have been educating employees about using and maximizing existing capabilities, whether it’s conference room cameras or the whiteboard features in Microsoft Teams.

Error is also rolling out new software-based phones that he says will provide a “less clunky” feel, more flexibility, and a more consistent experience as workers move between in-office and remote locations.

And IT is deploying a collaboration platform from Miro to better meet the needs expressed by hybrid teams, Error says.

Moreover, IT is working with the various business-side groups “to see what’s of interest to them, to evaluate products, and to see if those [products] could benefit the entire organization.”

Error adds: “In 2023, IT will be focused on refinement, what are the little things we don’t have today that will fill in the blanks.”

Turning to tech for seamless continuity

Abhijit Mazumder, CIO at Tata Consultancy Services (TCS), is similarly looking to how his IT team can improve the hybrid experience it has already enabled.

“With things inching back to normalcy, our aim right now is to ensure that employees work with seamless continuity whether from home or the office,” he says, adding that TCS has been investing in technologies — including collaboration software and cloud-based applications — to support a hybrid workforce for more than five years.

“Basic amenities such as provision of huddle rooms linked to other TCS centers have made remote collaboration across the globe possible in a matter of minutes. We are continuously working to ensure a seamless transition to our employees and customers, independent of their physical location,” he adds.

For 2023, Mazumder says the focus will be on “tweaking, sustaining, and scaling our secure borderless workspaces [SBWS] solution and enabling additional capabilities in our offices to enable collaboration.”

He also plans to continue his investments in cloud collaboration, organizational resiliency and cybersecurity as well as the personnel to support all that.

“As a consulting and IT services organization, evolving technologies means scaling talent development. Though we have invested significantly in building a culture of agility within the organization, I think there is more to our journey. Our key focus would be scaling talent development in hybrid work model,” Mazumder adds.

The end-user device as office

Other CIOs are also focused on deploying technologies that can step up their company’s existing hybrid work environment.

Thomas Phelps, CIO at Laserfiche, for example, says his company decided in early 2020 to create a tech stack that would support employees being independent from any specific workstation.

“My philosophy is your office is your laptop and everything you need to be productive is enabled through that laptop,” he says.

That strategy enabled the company to quickly pivot to remote work when the pandemic sent workers home that March and then hybrid as restrictions eased.

Still, he says there’s room for improvement. For example, existing digital platforms aren’t great at fostering ad hoc sessions and replicating spontaneous interactions — in other words, the proverbial “watercooler conversations,” Phelps says, noting, “It is still easier to have those when someone is physically present.”

Phelps is now searching for technologies that can help with that, and as part of that he’s deploying the Miro online whiteboard platform to support hybrid collaboration.

Phelps is also rolling out a new conference room videoconferencing system with zoom cameras, auto framing, streaming options, and other features to create a better, more equitable experience for remote attendees.

Similarly, Laserfiche is enabling captioning on Zoom, including captioning in multiple languages, to support equitable communication quality for all attendees. And he’s focused on ensuring whatever technologies IT deploys can be used on whatever devices workers want to use.

“Enabling work through different devices not just your typical laptop but also your mobile devices is going to be key, as is supporting work from anywhere through the right security architecture centered on zero-trust principles,” he says.

Upgrades and improvements

Today, as organizations enter their fourth year of providing remote and hybrid work to many or most of their workers, CIOs often speak about the need to bring continuous improvement — a longtime IT principle — to this space.

For example, Ramon Richards, senior vice president and CIO of Fannie Mae, where most employees can choose where to do their day-to-day work most of the time, talks about improving and enhancing the hybrid work environment.

His plans include evolving the organization’s conferencing capabilities “to create a more immersive experience using new camera solutions and virtual reality.” As Richards explains: “This will further enhance the overall employee experience to securely collaborate remotely and on-site with people both inside and outside the organization.”

He also plans to enhance existing IT support capabilities by adding modern end-user contact channels, such as crowdsourcing to troubleshoot common IT issues so that employees will have additional options to access and receive tech support.

And he is simplifying the organization’s device management capabilities by completing the migration to a common cloud-based intelligence-driven digital workspace platform for unified endpoints, virtual desktops, and mobile application management.

Moreover, IT “is focusing on the end-user experience for all service deliveries in 2023, leveraging automation, when possible, to proactively detect and resolve device issues with self-healing capabilities and provide more effective self-service information,” he says. “Coupled with this effort is a cohesive change management process to ensure effective employee communication and adoption of new technologies.”

Bart Murphy, chief technology and information officer of OCLC, a library technology and research organization, says he, too, is continuing to look at technology investments that can improve the hybrid work environment that his company adopted in April 2022 after working almost entirely remotely through the first two years of the pandemic.

“This includes continually upgrading WiFi access points, conference room technology, and meeting spaces for improved in-office connectivity and collaboration. We are in the process of moving to Teams for our phone system to provide more efficient ways to connect both internally with associates and externally with customers,” he says. “We continue to invest in our VPN capacity to ensure those working from home have all the access and performance they need to be productive.”

He adds: “We will continue to be intentional about creating events for all associates intended to experience and build on the strong culture that exists at OCLC and seek input from associates.”


The tech industry has long been known for its lack of diversity and, as a result, there’s been a big push for companies to take DEI strategies seriously. Diversity not only helps organizations perform better but fostering equity and inclusion can also strengthen recruiting and retention rates, as well as overall employee satisfaction.

In fact, diverse companies have been shown to generate 2.5 times higher cash flow per employee, and three in four job seekers and workers prefer diverse companies, according to data from a 2022 BuiltIn report. The report also found that diverse management increases revenue by 19% and that gender-diverse companies report financial returns 15% above the industry median.

These five companies provide strong examples of successfully implemented DEI strategies that have helped diversify the talent pipeline, close skills gaps, and create opportunity for underserved populations. Through internship programs, apprenticeships, returnships, and other unique talent and upskilling programs, these examples can help inspire the right DEI strategy for your organization.

AllianceBernstein diversifies its tech talent pipeline

AllianceBernstein has been heralded in the financial services industry for its employee satisfaction, work environment, compensation, career opportunities, and diversity by The Everest Group. And it is the Nashville, Tenn.-based company’s wide-ranging DEI efforts that truly stand out.

AllianceBernstein works directly with job training nonprofit Year Up, which connects the global asset management corporation with tech talent from underserved populations. The company also offers career transition programs for former pro athletes and military members, and partners with the Nashville Software School to offer vocational training. AllianceBernstein also recently added full-time, paid apprenticeships, and tuition-free web development bootcamps alongside the Greater Nashville Technology Council (NTC). The company also recently launched an HBCU Scholars Program, providing up to 20 students with scholarships after completing a 9-week summer internship.

“The more perspectives we can have in the room, having different ideas, people that have different lived experiences, people that have different backgrounds — that really matters,” says Janessa Cox-Irvin, global head of diversity and inclusion at AllianceBernstein.

LinkedIn partners up for a more diverse IT future 

Like AllianceBernstein, LinkedIn also partners with job training nonprofit Year Up, but takes the relationship a step further by pairing an employee volunteership program with workshop events for Year Up students. Through a long-standing partnership that started in 2011 under then-CEO Jeff Weiner, LinkedIn has long been working to open and diversify its talent pipeline.

The company has a dedicated internal team with employees tasked with maintaining and growing LinkedIn’s relationship with Year Up, which works to connect underserved young adults with opportunities in the IT industry through a career readiness program that includes an internship. Students are given training on hard and soft skills to prepare them for IT careers, along with hands-on experience in their desired field.

LinkedIn offers two unique programs alongside Year Up: LinkedIn Coaches and Investment Days, which are referred to as InDays. Through the LinkedIn Coaches program, employees receive training to become coaches who connect Year Up students with job advice and other career resources. On InDays, which occur twice a year, LinkedIn hosts all current Year Up students for an on-site event featuring career workshops, mock interviews, networking events, and resume workshops.

Corporate partners typically offer financial support and internship opportunities, but LinkedIn has invested even more by setting aside significant resources to support the relationship with Year Up. Year Up students get the chance to connect directly with LinkedIn employees, receive support on their resumes and LinkedIn profiles, attend mock interview workshops, and have the chance to network with professionals in the industry.

AAMC’s intentional DEI strategy helps employees flourish

The Association of American Medical Colleges (AAMC) is a prime example of how a strong and focused DEI strategy can help a company flourish. The organization’s DEI Council focuses on establishing employee resource groups, evaluating reporting systems for biases, and having a critical eye on inclusion at every level. The organization also hired a DEI director to build a small team under HR to serve as a dedicated leader focused on fostering inclusivity in the organization.

It was important to the leadership at AAMC to create a formal strategy and framework to address racism and to offer anti-racism training through the Sustained Dialogue Institute. The organization has worked to eliminate bias in hiring to ensure a more diverse workplace, identifying and removing problematic terminology, and implementing DEI goals that are tied to leadership performance. It became important to the organization to model DEI from the top down, making it something put into regular practice rather than just lip service.

“Our formalized DEI strategy focuses on our workforce and challenges our workplace, culture, leadership, the ability to lead in an inclusive way, including partnerships, community engagement, outreach, and more,” says Yvonne Massenburg, chief human resources officer at AAMC.

The organization has focused on creating more avenues for employees to voice opinions and feel heard through town hall meetings, management meetings, and ongoing meetings with leadership where employees can connect with leadership. The goal is to create various pathways for employees to feel heard and seen, without fear that it will impact their career opportunities. It’s about fostering a welcoming environment where people feel comfortable challenging the status quo.

IBM returnship helps restart IT careers 

Taking time away from the tech industry can make it daunting to return, but programs such as IBM’s Tech Re-Entry Program help ease the transition. Program participants are called “returners,” and they aren’t interns, entry-level hires, or apprentices — they are viewed as highly experienced professionals with extensive backgrounds in IT. Therefore, the experience is shaped differently than it would be for a less experienced intern or apprentice.

One IBM returner, Anju Nair, took a 15-year break from her IT career to focus on her health and her family, deciding to return once her daughter entered grade school. She knew the transition wouldn’t be seamless, but she soon found IBM’s Tech Re-Entry Program and joined the data scientist program. She credits the program with helping her build confidence and recognize her value to the industry.

“I was confident in my skills, because I had the prior experience, I knew what I was working towards, and I knew I was open to learning; but to get that opportunity is the toughest part,” Nair says.

The program is unique in that it focuses on experienced professionals, rather than entry-level workers or young adults. It’s meant to demystify the process of getting back into the industry, bridge any gaps in technologies or skills, and help candidates get a feel for the current corporate landscape. The program diversifies the pool of talented candidates by breaking down unfair barriers that have historically been in place for those with gaps on their resumes.

Accenture bridges IT gaps with apprenticeships

Accenture’s apprenticeship program is a strong example of how these training programs can fill skills gaps and diversify the IT talent pipeline. Apprenticeships offer the opportunity to “earn while you learn,” providing a nontraditional path toward a lucrative IT career. Accenture decided to launch its apprenticeship program when it realized it wasn’t accessing all available pools of talent, especially candidates without a traditional four-year degree in IT, according to Pallavi Verma, Accenture’s senior managing director of North America Quality and Risk Lead.

“It’s really about providing opportunity for people and for us to open up our pool of people. There’s many pools of talent and we recognize that we shouldn’t be looking at just one pool of talent,” Verma says.

The apprenticeship programs are developed at a local level, enabling Accenture to ensure it is recruiting local talent and building relationships with local colleges and training programs. The program supports candidates enrolled in community college and those with a four-year degree in a nontechnical field who want to change career paths. Examples include an architect, a food truck operator, a natural gas pipe fitter, and an English teacher who all wanted to be reskilled for IT roles through the apprenticeship program.

The on-the-job training provided by the apprenticeships enables students to learn through real-life scenarios that can’t always be replicated in a classroom setting. The apprenticeships create opportunity where it may not have existed before and enable Accenture to tap into fresh perspectives, ideas, and backgrounds to create better products, services, and software.


Recently, chief information officers, chief data officers, and other leaders got together to discuss how data analytics programs can help organizations achieve transformation, as well as how to measure that value contribution. We shared our insights at this CIO Online virtual roundtable event, which included leaders from organizations in healthcare, financial services, utilities, communications, and more.

The discussion focused mostly on data management issues and opportunities around “supply” — its quality, ownership, access, and other matters. Supply and consumption are symbiotic principles that together maximize the value of the enterprise data asset. While “consumption” matters are just as critical, getting data supply right is essential to ensuring that data — and the insights it drives — are available and trustworthy.

Our participants report encountering the view that data analytics programs don’t justify the effort to implement and operate them — even as companies spend more on big data and analytics every year. These leaders also struggle to set up metrics that demonstrate their programs’ achievements of transformation objectives.

These two challenges are closely linked: Better metrics on data analytics program value would go a long way toward dispelling the perception that these programs are not worthwhile. In our conversation, we acknowledged that doubts about their worth are a symptom that the business is not equipped to derive full value from the program. It’s a situation that calls for empowering the business to read, analyze, work, and even argue with data — effectively and confidently.

This post summarizes our conversation and describes some strategies we discussed to derive and demonstrate data analytics program value.

A community of teams

Organizations can support data analytics program effectiveness by ensuring that all impacted stakeholders (e.g., business, IT, data management, security, risk, and compliance) are engaged appropriately in sustained development and management of trusted data and insights. In other words, treat data and the insights it provides like any other critical corporate asset.

Mark Carson, Managing Director – Data & Analytics Leader at Protiviti, described this sustained engagement model as “a holistic, cross-functional approach that starts, and ends, with business value,” a way of looking at modern data and analytics that resonated with our participants.

In addition to consistent cross-functional engagement, senior leadership buy-in and support is paramount to ensure the organization’s corporate strategy and its data and analytics strategy remain symbiotic in realizing business value. This alignment is essential to any data analytics program because it focuses program efforts on mission-critical transformation.

Arguing with the data

In successful data analytics programs, business users absorb, assess, and act on the data by reading, working, analyzing, and arguing with it.

… Arguing with data? When data owners achieve maximum comfort with analytical tools, when they attain maximum literacy with the data itself, then they will use data “to support a larger narrative intended to communicate some message to a particular audience,” in the words of a formative early paper on data literacy. This is when data analytics programs deliver their greatest value. Empowering the business to argue with data is the highest goal.

Measuring data analytics’ value

Participants shared questions about how to measure data analytics program value. Metrics fall within the governance domain, which is the purview of owners and stewards together.

Lucas Lau, Senior Director – Machine Learning & AI Practice Leader at Protiviti, outlined a list of categories to simplify metrics. Considering only these four dimensions can help leaders simplify the seemingly complex problem of demonstrating value:

- How does our use of data analytics increase revenue?
- How does the data analytics program reduce losses for the business?
- How is data analytics helping us drive down capital expenditure and operating costs?
- How are we using data analytics to manage enterprise risk?

Measuring along these dimensions dispels the notion that data analytics programs don’t deliver value. The answers to these four questions help program leaders gain traction: in the absence of doubt, more gets done. The questions offer an additional benefit when they generate new ideas about further advantages a data analytics program could deliver.

Gaining momentum through early wins

Early wins provide momentum as well as a foundation from which teams build a sense of community and confidence. As organizations acknowledge that business transformation is not a project but an ongoing process, the confidence-boosting, competence-building experience gained via early wins provides the basis for ongoing tracking of achievements and corrections to data-driven decisions as needed.

The power of the roundtable

Leaders can help their data analytics programs deliver value by articulating the data ownership role for the business community. Then, they can measure program value along the lines of increased revenue, reduced losses, lower costs, and better-managed risk. They can focus on early wins that build team confidence and competence as the foundation for ongoing program effectiveness.

The ideas we generated together highlight what can happen when a community of leaders discusses barriers to success in a confidential environment. We thank CIO Online for the opportunity to meet with and advise the leaders who joined us for this roundtable.

Connect with the authors:

Mark Carson

Managing Director – Data & Analytics Leader at Protiviti

Lucas Lau

Senior Director – Machine Learning & AI Practice Leader at Protiviti


By Andy Nallappan, Chief Technology Officer and Head of Software Business Operations, Broadcom Software

The information technology that enables scientific and commercial breakthroughs, from precision medicine to digital transformation, demonstrates tech’s boundless potential to improve our world. Yet, tech practitioners have long traded progress for increased complexity.

IT complexity, seen in spiraling IT infrastructure costs, multi-cloud frameworks that require larger teams of software engineers, the proliferation of data capture and analytics, and overlapping cybersecurity applications, is the hallmark—and also the bane—of the modern enterprise.

With a better understanding of IT complexity, large enterprises can partner with their strategic vendors to reduce IT complexity and drive more innovation and business success from it.

Complexity Continues to Increase

Complexity exhausts IT budgets and workers—a widespread problem that worsened during the pandemic. Seventy percent of executives say that IT complexity has been a growing challenge for their organization over the past two years, according to Broadcom Software-sponsored research by Harvard Business Review Analytic Services. While 85% of executives state that reducing IT complexity is an organizational priority, only 27% said their company had managed it effectively.

Regarding complexity, David Linthicum, managing director and chief cloud strategy officer at Deloitte Consulting LLP, comments: “Over the last five years, people have been migrating to the cloud and using more complex distributed deployments, such as multi-cloud, edge computing, IoT, and things like that.”

The Harvard Business Review Analytic Services report “Taming IT Complexity through Effective Strategies and Partnerships” discusses the root causes of IT complexity and the penalties organizations pay for failing to undertake an effective complexity-reduction strategy. In the report, Linthicum describes multi-cloud as “the straw that broke the camel’s back” because rather than centralize all enterprise data in a single cloud, companies are expanding their investments in multiple areas and trying to stitch all the pieces together.

Seeking Solutions to IT Complexity    

Not all IT solutions work as intended. A recurring problem known as “legacy spaghetti” occurs when IT teams layer on new technology solutions “that are disparate, siloed, and unorganized,” according to the report.

Nearly two-thirds of executives in the study indicate that incompatible systems and technologies are the top factors fueling IT complexity. Not surprisingly, 67% of respondents said employees are frustrated or confused by persistent complexity, with three in five noting that it costs money and creates unnecessary additional work.

Though complexity has plagued data centers since the days of mainframes and punch cards, the study indicates that:

- 82% of executives view IT complexity as an impediment to success.
- 81% believe that reducing it creates a competitive advantage.
- 73% agree that IT complexity is an organizational expense.
- 36% of respondents say that reducing complexity in security creates more resilient systems—less vulnerable to security breaches.

At long last, the tide may be turning. According to the study, most organizations have developed specific strategies to reduce IT complexity and create operational efficiency. While there are many steps a company can take to minimize complexity, starting with simpler tools, enterprises can also partner with firms that specialize in complexity reduction to boost IT agility. In the study, three in five executives agree that working with a trusted partner is key to reducing complexity.

The report does point out, however, that not all complexity is bad. Innovation is the cornerstone of success for nearly all companies today. But innovation can require a certain level of complexity, meaning that ongoing innovation requires continual attention to the complexity it inevitably will create. “Executing innovation initiatives at scale involves risk-reward calculations all along the journey because there will be missteps and even failures. Innovation initiatives almost always require individuals with diverse expertise and experience, including from outside the organization, to collaborate and experiment together,” says Linda Hill, the Wallace Brett Donham Professor of Business Administration at the Harvard Business School, where she is also chair of the Leadership Initiative. “The more complex whatever you’re trying to do, by definition, the riskier it is.”

“A partner might have done this dozens of times with several other companies, and so we learn from that and ask, ‘What were lessons learned, and how can we accelerate quick-win improvements?’” said Jason Duigou, CIO of Indianapolis-based Medxcel, a healthcare facilities services company, in the Harvard Business Review Analytic Services report.

Nine Ways to Improve IT Complexity

The report also highlights nine practices to enable firms to improve the way they manage IT complexity:

1. Develop a common language around IT complexity—identifying a project’s complexity risk helps firms prioritize resources better.
2. Find a balance between innovation and complexity—too much customization creates burdens, adds costs, and spikes risk.
3. Take a modular approach to transformation—tackle new features incrementally to reduce stress on legacy systems.
4. Get the C-suite on board—the study indicates that not all senior executives understand the weight of the problem. Complexity reduction needs funding and executive support.
5. Retire redundant technologies—phasing out incompatible systems and redundant technologies is the most common complexity-reduction program, according to the study.
6. Connect systems and increase compatibility—API-first strategies can help reduce the complexity of connecting disparate applications and improve system interoperability.
7. Train employees—organizations that excel at reducing complexity tend to train their employees to promote better utilization of existing tools and systems.
8. Create feedback loops—mitigating complexity starts with listening to how customers and employees experience a service or product. Feedback can help eliminate unnecessary features that lead to unwanted complexity.
9. Develop metrics and measure progress, but see things through—measuring progress against objectives and appropriate peer groups can help improve IT complexity reduction efforts.

The report points out that there are no easy or inexpensive ways to reduce IT complexity. Companies that stand out as leaders in this effort tend to focus on limiting rather than trying to halt the constant march of complexity. Download the Harvard Business Review Analytic Services report now to learn more.

About Andy Nallappan:


Andy is the Chief Technology Officer and Head of Software Business Operations for Broadcom Software. He oversees the DevOps, SaaS Platform & Operations, and Marketing for the software business divisions within Broadcom.


For probably the umpteenth time, we invoke the phrase “garbage in, garbage out” when summarizing problems with data quality. It has indeed become a cliché. Various industry studies have uncovered the high cost of bad data: poor data quality is estimated to cost organizations an average of $12 million yearly, and data teams waste 40% of their time troubleshooting data downtime, even at mature data organizations using advanced data stacks.

Data quality, which has always been a critical component of enterprise data governance, remains an Achilles heel for CIOs, CCOs, and CROs. In fact, data quality has become even more challenging to tackle with the prolific increase in data volume and types — structured, unstructured, and semi-structured data.

Data quality is not just a technology problem and never will be because we rarely think of the quality of the data we source when implementing new business initiatives and technology. Technology is only an enabler, and to get the most from the technology, we need to think about the business processes and look for opportunities to re-engineer or revamp these business processes when we start a new technology project. Some of the aspects of understanding these business processes are:

What data do we need?Do we understand the sources of this data?Do we have control over these sources?Do we need to apply any transformations (i.e., changes to this data)?Most importantly, do our end users trust the data for their usage and reporting?

- What data do we need?
- Do we understand the sources of this data?
- Do we have control over these sources?
- Do we need to apply any transformations (i.e., changes to this data)?
- Most importantly, do our end users trust the data for their usage and reporting?

These questions sound basic and obvious. However, most organizations have trust issues with their data. The end users rarely know the source of truth, so they end up building their data fiefdoms, creating their own reports, and maintaining their own dashboards.

Eventually, this creates multiple “sources of truth,” each a different version of the others. The result is sleepless nights, especially when we need to submit a regulatory report, make executive decisions, or prepare SEC filings. Not only does this waste valuable engineering time, it also costs precious revenue and diverts attention from initiatives that move the business’s needle. It also misuses data scientists’ core skills, adding costs and time that could be better spent on the organization’s business priorities.

Over time, data quality issues have become more extensive, complex, and costlier to manage. A survey conducted by Monte Carlo suggests that nearly half of all organizations measure data quality most often by the number of customer complaints their company receives, highlighting the ad hoc nature of this vital element of modern data strategy. Most organizations address the issue in piecemeal fashion, a practical approach but one that requires tremendous effort to understand the data, document the lineage, identify data owners, identify key data elements (KDEs), maintain those KDEs, and apply the data governance lifecycle to the data.

No wonder this is only a tactical solution; sooner or later, we need to start working on another tactical project to resolve the issues caused by the previous tactical project and so on. This means an endless cycle of massive spending on IT, frustration because of low return on investment from technology projects, and buying new technology products that promise a total overhaul.

What is data quality management?

Data quality management (DQM) is the set of procedures, policies, and processes an enterprise uses to maintain reliable data in a data warehouse as a system of record, golden record, master record, or single version of the truth. First, the data must be cleansed using a structured workflow involving profiling, matching, merging, correcting, and augmenting source data records. DQM workflows must also ensure the data’s format, content, handling, and management comply with all relevant standards and regulations.
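The cleansing workflow described above (profiling, matching, merging, correcting, augmenting) can be sketched in a few lines. This is a minimal illustration, not a production DQM implementation; the field names and merge policy are hypothetical:

```python
# Minimal sketch of a DQM cleansing pass: profile -> correct -> match/merge.
# Field names ("name", "email", "phone") are illustrative assumptions.

def profile(records):
    """Count missing values per field to surface quality hotspots."""
    counts = {}
    for rec in records:
        for field, value in rec.items():
            if value in (None, ""):
                counts[field] = counts.get(field, 0) + 1
    return counts

def correct(rec):
    """Standardize formats so records can be compared reliably."""
    rec = dict(rec)
    rec["email"] = (rec.get("email") or "").strip().lower()
    rec["name"] = (rec.get("name") or "").strip().title()
    return rec

def match_and_merge(records):
    """Merge records sharing an email into one golden record,
    keeping the most complete value for each field."""
    golden = {}
    for rec in map(correct, records):
        merged = golden.setdefault(rec["email"], {})
        for field, value in rec.items():
            if value and not merged.get(field):
                merged[field] = value
    return list(golden.values())

source = [
    {"name": "ada lovelace", "email": "ADA@example.com ", "phone": ""},
    {"name": "Ada Lovelace", "email": "ada@example.com", "phone": "555-0100"},
]
print(profile(source))          # {'phone': 1}
print(match_and_merge(source))  # one golden record with name, email, phone
```

Real DQM platforms add survivorship rules, fuzzy matching, and audit trails on top of this basic shape, but the profile/correct/match/merge sequence is the same.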

So how do we tackle data quality with a proactive approach? There are a few options, from the traditional approach to the real-time solution.

Traditional approach: data quality at the source

This is the traditional and, in most cases, the best approach to handling data quality:

- Identify all the data sources (external and internal).
- Document the data quality requirements and rules.
- Apply these rules at the source level (for external sources, apply them where the data enters our environment).
- Once quality is handled at the source level, publish this data for end users through applications such as a data lake or a data warehouse. This data lake or warehouse becomes the “system of insight” for everyone in the organization.

Pros of this approach:

- The most reliable approach.
- A one-time, strategic solution.
- It helps you optimize your business processes.

Cons of this approach:

- It requires a cultural shift to address data quality at the source level, ensuring this is applied every time there is a new data source.
- It is possible only with executive sponsorship, i.e., a top-down decision-making approach that makes it an integral part of every employee’s daily activity.
- Data owners must be ready to invest time and funding to implement data quality at the sources they are responsible for.

Implementation of a data quality management tool

Modern DQM tools automate profiling, monitoring, parsing, standardizing, matching, merging, correcting, cleansing, and enhancing data for delivery into enterprise data warehouses and other downstream repositories. The tools enable creating and revising data quality rules, and they support workflow-based monitoring and corrective actions, both automated and manual, in response to quality issues.

This approach includes working with business stakeholders to develop an overall data quality strategy and framework, then selecting and implementing the best tool for that framework:

- The implemented tool should be able to discover all data, profile it, and find patterns.
- The tool then needs to be trained with data quality rules.
- Once the tool is trained to a satisfactory level, it starts applying the rules, which helps improve overall data quality.
- Training of the tool is perpetual: it keeps learning as you discover and input new rules.

Pros of this approach:

- Easy to implement, with quick results.
- There is no need to work separately on in-depth lineage documentation (the tool automates data lineage) or governance methodology; we only need to define the DQ workflows so the tool can automate them.

Cons of this approach:

- Training the tool requires a good understanding of the data and the data quality requirements.
- There is a tendency to expect that everything will be automated. This is not the case.
- It is not a strategic solution, and it does not help with business process improvement.
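The "train the tool with data quality rules" step boils down to registering checks and applying them to each record. The sketch below illustrates the idea with hand-written predicate rules; real DQM tools manage, version, and partially learn these rules, and the field names here are hypothetical:

```python
# Hypothetical rule registry: each rule is a (field, description, check)
# triple, applied to every record to flag violations.

RULES = []

def rule(field, description):
    """Decorator that registers a predicate as a data quality rule."""
    def register(check):
        RULES.append((field, description, check))
        return check
    return register

@rule("age", "age must be an integer between 0 and 120")
def valid_age(v):
    return isinstance(v, int) and 0 <= v <= 120

@rule("country", "country must be a 2-letter uppercase code")
def valid_country(v):
    return isinstance(v, str) and len(v) == 2 and v.isupper()

def apply_rules(record):
    """Return the descriptions of every rule the record violates."""
    return [desc for field, desc, check in RULES
            if not check(record.get(field))]

print(apply_rules({"age": 34, "country": "US"}))   # [] -> clean record
print(apply_rules({"age": 230, "country": "usa"})) # both rules flagged
```

As new rules are discovered, they are simply added to the registry, which mirrors the "perpetual training" point above.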

Based on the above considerations, we believe the best approach is a combination of the traditional and the DQM tools approach:

1. First, set up a business-driven data quality framework and an organization responsible for supporting it.
2. Second, define an enterprise DQ philosophy: “Whoever creates the data owns the data.” Surround this with guiding principles and appropriate incentives, organize around domain-driven design, and treat data as a product.
3. Third, develop an architectural blueprint that treats good data and bad data separately, and deploy a robust real-time exception framework that notifies the data owner of data quality issues. This framework should include a real-time dashboard highlighting success and failure with clear, well-defined metrics. Bad data should never flow into the good data pipeline.
4. Fourth, mandate adoption of this holistic DQ ecosystem for each domain, source, and application within a reasonable timeframe, and for every new application going forward.
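The third step, keeping bad data out of the good pipeline while notifying the owner, can be sketched as a simple routing function. The `validate` predicate and `notify_owner` hook below are hypothetical stand-ins for a real rules engine and alerting system:

```python
# Sketch of "bad data never flows into the good data pipeline":
# records failing validation are quarantined and their owner is notified.

def validate(record):
    """Hypothetical check: require a non-empty id and non-negative amount."""
    return bool(record.get("id")) and record.get("amount", -1) >= 0

def notify_owner(record, reason):
    # In practice: alert the data owner and update the DQ dashboard.
    print(f"exception for owner of {record.get('source')}: {reason}")

def route(records):
    """Split a batch into the good pipeline and the exception queue."""
    good, bad = [], []
    for rec in records:
        if validate(rec):
            good.append(rec)
        else:
            bad.append(rec)
            notify_owner(rec, "failed validation")
    return good, bad

batch = [
    {"id": "a1", "amount": 10.0, "source": "crm"},
    {"id": "", "amount": 5.0, "source": "erp"},  # missing id -> exception
]
good, bad = route(batch)
# good holds one clean record; bad holds the ERP record for remediation
```

The essential design choice is that quarantined records are routed for remediation rather than silently dropped or passed downstream.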

Data quality remains one of the foremost challenges for most organizations. There is no guaranteed approach to solving this problem. One needs to look at the various factors, such as the organization’s technology landscape, legacy architecture, existing data governance operating model, business processes, and, most importantly, the organizational culture. The problem cannot be solved only with new technology or by adding more people. It needs to be a combination of business process re-engineering, a data-driven decision-making culture, and the ability to use the DQ tools most optimally. It is not a one-time effort, but a lifestyle change for the organization.

Learn more about Protiviti data and analytics services.


There’s no doubt that today’s small- and medium-sized business leaders are facing several unprecedented challenges. And building resilience to weather any upcoming storms is essential.

In this first episode of our 5-episode podcast, Essential Connections: The Business Owner’s Guide to Growth During Economic Uncertainty, we welcome Jamie Domenici, Chief Marketing Officer at GoTo. Jamie’s unique expertise in marketing and hands-on experience in small- and mid-sized companies makes her wisdom and best practices especially relevant and meaningful.

“My passion is around small businesses,” she says. “And my whole life honestly, has been associated with small businesses. And now working at larger companies, I really spend a lot of my time and my efforts thinking about how we help small businesses navigate some of those more challenging experiences.”

Navigating today’s turbulence requires a strategy that allows for agility and strength. Jamie’s advice? “Resilience is all about the ability to bounce back,” she says. “I think a resilient strategy is one that allows you to adapt quickly to market changes, view mistakes and failures as learning opportunities, and most importantly, anticipate problems so you can prepare for them in the future.”

Listen in to learn all the details, including Jamie’s actionable insights on how to use marketing and technology for differentiation.
