The greatest challenge for any CIO, undoubtedly, is aligning technology with an organization’s business goals.

Most CIOs know it is the No. 1 objective and far easier said than done. For a multitude of reasons — some political, some budgetary, some cultural — often the desired outcome of using technology to achieve business objectives seems out of reach.

Steve Taylor took the IT reins for mortgage subservicer Cenlar because he was confident his longtime experience as a consultant would help him bridge the gap between IT and the business — and deliver “material business impact” that is demonstrable and measurable, he says.

But getting results requires more than just sophisticated tech know-how and business acumen. It takes cooperation across the board, from the C-suite to the employee base, and that is the tough part, says Taylor, who was named senior vice president and CIO in March as part of a companywide reorganization that also brought in a new chairman of the board, two co-CEOs, and a talent director to the Ewing, N.J.-based company.

“A lot of times companies tend to outpace the technology and they don’t work together,” says Taylor, who prior to launching his own consulting business worked for large companies like Fidelity Investments. “From the standpoint of technology and business alignment, we need to make things dynamic.”

A cross-functional approach

As part of his alignment strategy, Taylor pairs his company’s business analysts and IT professionals in real time — either in the same room or via videoconference — to collaborate on projects simultaneously.

This side-by-side approach enables colleagues from different sides of the aisle to catch all the business and technical nuances of a business process, identify the best workflow processes to achieve the specified business goal, and together ensure the accuracy of the data selected, the formulas used, and the code developed.

“What I am trying to do is make IT part of the business,” Taylor says, noting that automation is a business tool — not an IT product. “Our goal is to implement it and teach the business how to build their own workflows versus putting in a request. We want to push that back.”

To enable this, Taylor has reassigned business analysts to work within the company’s IT staff and moved IT employees into business roles. The aim, he says, is to align business and technology practitioners and processes seamlessly to achieve maximum business impact.

Cenlar, for instance, has also created new roles within IT called business information officers (BIOs) and solutions architects “whom we place side by side with the business, even in operations,” Taylor says. “They see exactly what the workflow is, what are their business needs and what their clients are asking for. Then the goal for this team is to come back to me, the CIO, and I work with IT leadership and the business.”

The power of the cloud

None of this would be possible without the cloud, says Taylor, who inherited a core cloud platform on Microsoft Azure, which is rounded out by MuleSoft middleware, commercial analytics, and automation tools such as UiPath, as well as a data warehouse and essential SaaS offerings such as Teams.

The mortgage subservicing company migrated to Microsoft Azure in 2019 and spent the next three years (effectively 18 months, accounting for downtime during the pandemic) moving numerous assets such as Citrix databases and test systems to Azure in seven “waves.”

When Taylor came on board, he was responsible for moving most of Cenlar’s corporate data, including its client-facing data and interfaces, in the last three “waves” to the Azure cloud. “Wave 10 was really the meat of getting to the cloud because now all of our data was there,” Taylor says.

With that transition complete, Cenlar could start delivering real-time data to its BIOs rather than static weekly reports. Only 200 servers remain in Cenlar’s data center, and the data they hold will also be migrated to the cloud.

Taylor is also committed to implementing a hybrid, multicloud approach to avoid lock-in and expand Cenlar’s capabilities. For example, the company will implement more SaaS solutions on Amazon Web Services. “We want to make sure we’re not so stuck in one cloud [if we need] to pivot at a later date,” he says.

Overhauling the IT strategy

Cenlar’s chief clients are the banks and credit unions that provide loans to homeowners. In that sense, the company serves two constituencies: its corporate clients, and those clients’ customers, the homeowners. For the latter, Cenlar employs Avaya call center technology to assist customers with mortgage information.

“We have two faces of technology — one very focused on our homeowners, which is very digital and very transformative, and then for banks and mortgage clients, for which we do a lot of data analytics and management of customer portfolios,” Taylor notes.

Cenlar’s automation and workflow processes, many of which predate Taylor, are highly effective at eliminating costly human errors. Taylor points out that if one of Cenlar’s 3,000 employees makes a single mistake on a single transaction, such as misplacing one digit in a financial transaction, it has a ripple effect on the homeowner, the bank and, of course, Cenlar’s efficiency.

To address this, Cenlar’s technologists and analysts started developing automated workflows using industry-standard products, such as the Decisions analytics tool and AI chatbots, to deliver on core business objectives for consumers and corporate clients — for example, to provide fast, accurate answers to consumer questions and business analysts’ requests for data.

When he signed on six months ago, Taylor began unifying the IT and business teams, while also expanding Cenlar’s automation efforts and use of AI. His IT staff currently numbers 200, plus a handful of data scientists, with more hires hoped for soon.

To make Cenlar more agile, Taylor will have his work cut out for him re-engineering business processes across the board. Consulting companies refer to these macro changes — such as instituting agility and flexibility across multiple lines of business — as waterfall transformations.

One Gartner analyst says Cenlar’s CIO is addressing one of the most advanced challenges facing enterprise IT today: fusing IT with business analysts in an effort to design more collaborative business processes based on objectives and key results (OKRs).

“CIOs have tried to solve this by using forms of business relationship management to get closer to the business. This provides advantages for sure and is useful. However, if the business is unclear in what they need to achieve, getting closer and being a better listener will not solve the problem,” says Irving Tyler, a vice president and analyst at Gartner. “The solution is for IT to provide leadership, to help business leaders increase their knowledge of technology and how it can solve business challenges.”

Many CIOs, like Taylor, are addressing this by forming cross-functional teams comprising business subject matter experts, business technologists such as data scientists, and IT experts, Tyler says.

While Taylor eyes Cenlar’s waterfall transformation globally, he is currently focused on building teams tailored to deliver immediate micro changes that matter, he says — an approach HVAC manufacturer Carrier is also taking.

Cenlar’s BIOs, who are assigned to ensure constant information exchange and precisely developed workflows, ensure there is fusion, the CIO says.

“This is very different than what I have seen in other companies where they don’t have technology embedded in the business side-by-side with them,” says Taylor, who participated in global changes that affect multiple business lines at Fidelity. “Just putting people together next to each other does not always make for success. But when IT is a business capability and leaders in IT and the business share objectives, the disconnects are removed.”


Most organizations understand the profound impact that data is having on modern business. In Foundry’s 2022 Data & Analytics Study, 88% of IT decision-makers agree that data collection and analysis have the potential to fundamentally change their business models over the next three years.

The ability to pivot quickly to address rapidly changing customer or market demands is driving the need for real-time data. But poor data quality, siloed data, entrenched processes, and cultural resistance often present roadblocks to using data to speed up decision making and innovation.

We asked the CIO Experts Network, a community of IT professionals, industry analysts, and other influencers, why real-time data is so important for today’s business and how data helps organizations make better, faster decisions. Based on their responses, here are four recommendations for improving your ability to make data-driven decisions. 

Use real-time data for business agility, efficient operations, and more

Business and IT leaders must keep pace with customer demands while dealing with ever-shifting market forces. Gathering and processing data quickly enables organizations to assess options and take action faster, leading to a variety of benefits, said Elitsa Krumova (@Eli_Krumova), a digital consultant, thought leader and technology influencer.

“The enormous potential of real-time data not only gives businesses agility, increased productivity, optimized decision-making, and valuable insights, but also provides beneficial forecasts, customer insights, potential risks, and opportunities,” said Krumova.

Other experts agree that access to real-time data provides a variety of benefits, including competitive advantage, improved customer experiences, more efficient operations, and confidence amid uncertain market forces:

“Business operations must be able to make adjustments and corrections in near real time to stay ahead of the competition. Few companies have the luxury of waiting days or weeks to analyze data before reacting. Customers have too many options. And in some industries — like healthcare, financial services, and manufacturing — not having real-time data to make rapid critical adjustments can lead to catastrophic outcomes.” — Jack Gold (@jckgld), President and Principal Analyst at J. Gold Associates LLC.

“When insights from the marketplace are not transmitted in real time, the ability to make critical business decisions disappears. We’ve all experienced the pain of what continues to happen with the disconnect between customer usage metrics and gaps in supply chain data.” — Frank Cutitta (@fcutitta), CEO and Founder, HealthTech Decisions Lab

“Operationally, think of logistics. Real-time data provides the most current intelligence to manage the fleet and delivery, for example. Strategically, with meaningful real-time data, systemic issues are easier to identify, portfolio decisions faster to make, and performance easier to evaluate. At the end of the day, it drives better results in safety, customer satisfaction, the bottom line, and ESG [environmental, social, and governance].” — Helen Yu (@YuHelenYu), Founder and CEO, Tigon Advisory Corp.

“Businesses are facing a rapidly evolving set of threats from supply chain constraints, rising fuel costs, and shipping delays. Taking too much time to make a decision based on stale data can increase overall costs due to changes in fuel prices, availability of inventory, and logistics impacting the shipping and delivery of products. Organizations utilizing real-time data are the best positioned to deal with volatile markets.” — Jason James (@itlinchpin), CIO at Net Health

Build a foundation for continuous improvement

The experts offered several practical examples of how real-time data can drive continuous improvement across the business, with automation serving as a key capability for making data actionable.

“In the process of digital transformation, businesses are moving from human-dependent to digital business processes,” said Nikolay Ganyushkin (nikolaygan), CEO and Co-founder of Acure. “This means that all changes, all transitions, are instantaneous. The control of key parameters and business indicators should also be based on real-time data, otherwise such control will not keep up with the processes.”

Real-time data and automated processes present a powerful combination for improving cybersecurity and resiliency.

“When I was coming up in InfoSec, we could only do vulnerability scanning between midnight and 6 am. We never got good results because systems were either off, or there was just nothing going on at those hours,” said George Gerchow (@georgegerchow), CSO and SVP of IT, Sumo Logic. “Today, we do them at the height of business traffic and can clearly see trends of potential service outages or security incidents.”

Will Kelly (@willkelly), an analyst and writer focused on the cloud and DevOps, said that harnessing real-time data is critical “in a world where delaying business and security decisions can prove even more costly than just a couple of years ago. Tapping into real-time data provides decision-makers with immediate access to actionable intelligence, whether a security alert on an attack in-progress or data on a supply chain issue as it happens.”

Real-time data facilitates timely, relevant, and insightful decisions down to the business unit level, said Gene De Libero (@GeneDeLibero), Chief Strategy Officer at GeekHive.com. Those decisions can have a direct impact on customers. “Companies can uncover and respond to changes in consumer behavior to promote faster and more efficient personalization and customization of customer experiences,” he said.

Deploy an end-to-end approach to storing, accessing, and analyzing data

To access data in real time — and ensure that it provides actionable insights for all stakeholders — organizations should invest in the foundational components that enable more efficient, scalable, and secure data collection, processing, and analysis. These components, including cloud-based databases, data lakes, and data warehouses, artificial intelligence and machine learning (AI/ML) tools, analytics, and internet of things capabilities, must be part of a holistic, end-to-end strategy across the enterprise:

“Real-time data means removing the friction and latency from sourcing data, processing it, and enabling more people to develop smarter insights. Better decisions come from people trusting that the data reflects evolving customer needs and captures an accurate state of operations.” — Isaac Sacolick (@nyike), StarCIO Leader and Author of Digital Trailblazer

“Organizations must use a system that draws information across integrated applications. This is often made simpler if the number of platforms is kept to a minimum. This is the only way to enable a real-time, 360-degree view of everything that is happening across an organization — from customer journeys to the state of finances.” — Sridhar Iyengar (@iSridhar), Managing Director, Zoho Europe

“Streaming processing platforms allow applications to respond to new data events instantaneously. Whether you’re distributing news events, moving just-in-time inventory, or processing clinical test results, the ability to process that data instantly is the power of real-time data.” — Peter B. Nichol (@PeterBNichol), Chief Technology Officer at OROCA Innovations
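To make the stream-processing pattern concrete, here is a minimal, self-contained Python sketch. The event feed, SKU names, and reorder threshold are invented stand-ins for a real platform such as Kafka or Kinesis; the point is simply that each event is acted on the moment it arrives rather than in a nightly batch.

```python
import itertools
import random
import time
from typing import Iterator

def event_stream() -> Iterator[dict]:
    """Simulated endless feed of inventory events (a stand-in for Kafka, Kinesis, etc.)."""
    skus = ["SKU-1", "SKU-2", "SKU-3"]
    while True:
        yield {"sku": random.choice(skus), "on_hand": random.randint(0, 50), "ts": time.time()}

REORDER_THRESHOLD = 5  # illustrative just-in-time reorder point

def process(events: Iterator[dict]) -> None:
    """React to each event as it arrives instead of waiting for a batch report."""
    for event in events:
        if event["on_hand"] < REORDER_THRESHOLD:
            # In production this would trigger a replenishment order downstream.
            print(f"Reorder {event['sku']}: only {event['on_hand']} on hand")

if __name__ == "__main__":
    process(itertools.islice(event_stream(), 100))  # bounded run for the demo
```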

As your data increases, expand your data-driven capabilities

The volume and types of data organizations collect will continue to increase. Forward-thinking leadership teams will continue to expand their ability to leverage that data in new and different ways to improve business outcomes.

“The power of real-time data is amplified when your organization can enrich data with additional intelligence gathered from the organization,” said Nichol. “Advanced analytics can enhance events with scoring models, expanded business rules, or even new data.”

Nichol offered the example of combining a customer’s call — using an interactive voice response system — with their prior account history to enrich the interaction. “By joining events, we can build intelligent experiences for our customers, all in real time,” he said.
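As a rough illustration of that enrichment step, the Python sketch below joins an incoming IVR call event with a prior-history lookup; the account store, field names, and routing rule are all hypothetical, invented for illustration.

```python
# Hypothetical account-history store; in practice this would be a CRM or
# feature-store lookup keyed by caller ID.
ACCOUNT_HISTORY = {
    "+1-555-0100": {"tier": "gold", "open_tickets": 1, "last_order": "2022-09-14"},
}

def enrich_call_event(call_event: dict) -> dict:
    """Join an incoming IVR call event with prior account history in real time."""
    history = ACCOUNT_HISTORY.get(call_event["caller_id"], {})
    enriched = {**call_event, **history}
    # A scoring model or business rule can now run on the enriched event,
    # e.g., routing gold-tier callers with open tickets straight to an agent.
    enriched["route_to_agent"] = history.get("tier") == "gold" and history.get("open_tickets", 0) > 0
    return enriched

print(enrich_call_event({"caller_id": "+1-555-0100", "intent": "billing"}))
```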

It’s one of the many ways that new technologies are increasing the opportunities to use real-time data to fundamentally change how businesses operate, now and in the future.

“As businesses become increasingly digitalized, the amount of data they have available is only going to increase,” said Iyengar. “We can expect real-time data to have a more significant impact on decision-making processes within leading, forward-thinking organizations as we head deeper into our data-centric future.”

Learn more about ways to put your data to work on the most scalable, trusted, and secure cloud.


When Steve Pimblett joined The Very Group in October 2020 as chief data officer, reporting to the conglomerate’s CIO, his task was to help the enterprise uncover value in its rich data heritage.

For a company that made its name in mail-order catalog sales, the idea of building an enterprise-wide data catalog seemed to be an appropriate part of that process.

Very grew from the successive mergers of a number of mail order catalog companies, the oldest dating back to the 1890s. Its constituent companies later moved into high-street retail, launched new mail-order brands selling clothing on credit, and even created a consumer financial data broker, later spun off like so many of the group’s other non-core activities.

The group’s move online began in the 1990s with its first steps into e-commerce, followed by the closure of its physical stores in 2005. It launched its first online-only brand, Very, in 2009 and finally abandoned its printed catalogs to go all-in online in 2015.

The whole company rebranded as Very in 2020, the year Pimblett joined. He found a rich collection of data assets, including information on over 2.2 million daily website visits, 4.8 million active customers and 49 million items delivered annually.

Behind the flagship brand, though, he says data remained scattered in silos across many legacy business units and applications, with limited automation, multiple glossaries, and complex data lineage and stewardship, making it hard to govern and audit.

Data and analytics experts were also spread across the organization, with some under the technology team but others embedded in the various business units.

“There was no one to help everybody with standards and central approaches, so every business vertical was doing it differently,” he says. “‘It’ being everything from how they collect and measure data, to how they understand it and their own glossary. It was very fragmented, and I brought it together into a hub-and-spoke model.”

The new model enables Very to design once and deploy everywhere, while maintaining a product focus.

As a result, Pimblett now runs the organization’s data warehouse, analytics, and business intelligence. “We’re a Power BI shop,” he says. “I run the infrastructure and a central enterprise BI team.”

Establishing a clear and unified approach to data

But getting to this stage was an intricate process that involved creating centers of excellence for things like data analytics that own the end-to-end infrastructure, applications, and skill sets, as well as career plans for staff.

Pimblett took a carrot-and-stick approach to get everyone working together, partnering with them on value creation (the carrot of profit) and risk mitigation (the stick of compliance). “It’s about making sure we understand the legal basis by which we’re capturing data, what we’re doing with it, where it flows, how we use it, and that we govern all those things,” he says.

Enterprises need to be aware of the dual nature of the data they hold: it can be both an asset and a liability, he says.

One early project where he was able to add value, through a partnership between his data hub and one of the business unit spokes, was building a new demand forecasting tool.

“We’re a multi-category retailer with over 160,000 SKUs, so forecasting how much stock to buy of each SKU is a business challenge, but also very much a technology and mathematical challenge,” he says.


To get buy-in from business units for projects like this, he says, “you have to sell them the benefit and the outcome of shared platforms, reuse, shared data, and the efficiencies that they’ll get,” and not the technology you’ll use.

“A lot of roles in data just talk about the data,” he says. “Where do we store it? What’s the infrastructure? What’s our warehousing technology? You know, good old DBAs, modelers, and analysts.”

Instead, says Pimblett, he and his data colleagues ask business managers, “Where do you think you can create value from data? What type of decisions are you making? Where is there opportunity to automate? And how can we delight the customer or empower your colleagues to take better decisions? Turn it into an outcome, a value and an action conversation. That tends to get them engaged,” he says.

A more nimble catalog business

Very has come full circle as a business built on catalog data, but it took some introspection to figure out the best way to get there.

“Cataloging your data is more important than ever for many companies, with so many technology options, different data silos, enterprise warehousing, lakehouses, data lakes, and all those types of capabilities,” says Pimblett. “Understanding what data you’ve got locked in all these different stores is a big part of the jigsaw puzzle.”

So he began working on a pilot project with data catalog and governance tool vendor Alation about a year ago, after it responded to Very’s RFP. In a first test of the technology, he used Alation to catalog a subset of Very’s data held in an old Teradata database. It took about nine weeks to set up the infrastructure, make the connection to the database, and index and understand the metadata. Very is focusing on short sprints like this, rather than on monolithic 12-month projects that may not fit the business when finished.
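For a sense of what that indexing step involves, here is a generic Python sketch of harvesting table and column metadata from a relational database with SQLAlchemy’s inspector. This is not Alation’s API, and the connection URL is a placeholder (Very’s pilot ran against Teradata); it only illustrates the raw harvest that stewards then enrich with owners, glossary terms, and lineage.

```python
from sqlalchemy import create_engine, inspect

# Placeholder connection; any SQLAlchemy-supported database would work here.
engine = create_engine("sqlite:///example.db")
inspector = inspect(engine)

catalog = {}
for table in inspector.get_table_names():
    catalog[table] = [
        {"name": col["name"], "type": str(col["type"]), "nullable": col["nullable"]}
        for col in inspector.get_columns(table)
    ]

# The raw index is the starting point; stewardship adds meaning on top of it.
for table, columns in catalog.items():
    print(table, "->", [c["name"] for c in columns])
```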

“Run a pilot within nine weeks, prove it, prove the value, and then roll it forward into production is very much how we think about our full technology agenda,” he says.

Pimblett hasn’t yet catalogued all of Very’s data, however. It’s always going to be a work in progress. “We’re picking off the highest potential value and highest risk areas,” he says. “We’ve done it in our financial services area, and some of our marketing area. Those tend to hold the biggest amount of our customer information.”

The next step will be to roll it out across the whole company.

“We’ve got some massive systems that take time to index — not from a tech perspective, but from a data stewardship and understanding perspective,” he says.

Value, not vanity

Reflecting on things he might have done differently over the two years since he joined Very, Pimblett cautions against embarking on new technology projects for the sake of it and recommends always thinking about the desired outcome or action first.

If you don’t, he says, “there’ll be an occasion when you realize you didn’t comply with your own principles and start with the action and outcome.” In those situations, he says, you need to tell yourself: “Get back to your strategy. You’ve thrown value away because you’ve had a team working on a vanity project rather than creating business value.”

One of the next value-creating projects to which Very will be applying its rich data legacy centers on loans: By the end of 2022, it will pilot a new personal finance business, offering its existing customer base loans of up to £7,500 ($8,800) over one to five years.

“We’ve got a trusted brand and we’ve just started to innovate based on our technology and data capabilities,” he says.


There’s a cyber security arms race happening right now – and the criminals are winning.

The cost of cybercrime is estimated to top $10.5 trillion by 2025, with financial institutions particularly vulnerable. One study found the average cost of cybercrime to financial services companies was $18 million – 40% higher than the average for other sectors.

To counter this rising threat, companies need to spot problems before they occur, rather than simply sit back and hope their defenses can withstand the attack when it does inevitably come.

Plugging the dam

As cyber threats increase, trying to stop the onslaught is akin to trying to plug holes in a dam. Not only are organized criminals becoming more sophisticated, but state actors are also taking aim at financial institutions. Banks spend on average between 6% and 14% of their annual budgets on countering cyber threats. This includes both beefing up IT infrastructure and attracting the top cyber security talent.

But the number of incidents and their cost continue to rise. It is not enough to build a security infrastructure that simply sits there and waits to be attacked. Companies also have to find the smaller clues to what may become a bigger problem: they must seal the cracks in the dam before holes even develop.

From detection to prevention

Cyber criminals leave clues as to where they may strike next, like footprints in the sand – but it takes sophisticated AI analytics to spot them.

Teradata’s technology allows companies to go from fraud detection to fraud prevention, providing a bird’s-eye view of operations down to granular details. It monitors and tracks all user interactions across digital channels, building identity graphs of individual users, creating contextual views of each transaction, and acting in real time to stop only fraudulent transactions, not genuine ones.

This enables companies to proactively detect, prevent, and address first- and third-party fraud, money laundering, and other financial crimes.
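A toy Python sketch of the underlying idea: score each transaction against the user’s own behavioral baseline and block only clear outliers, so genuine transactions pass untouched. The features and thresholds here are invented for illustration; production platforms run far richer models over many more signals.

```python
from statistics import mean, stdev

def is_suspicious(user_history: list[float], amount: float, new_device: bool) -> bool:
    """Flag a transaction that deviates sharply from this user's own pattern."""
    if len(user_history) < 5:
        return new_device  # too little history: fall back on the device signal
    mu, sigma = mean(user_history), stdev(user_history)
    z = (amount - mu) / sigma if sigma else 0.0
    # Block only clear outliers so genuine transactions are not disrupted.
    return z > 4.0 or (z > 2.5 and new_device)

history = [42.0, 55.0, 38.0, 61.0, 47.0, 52.0]
print(is_suspicious(history, 49.0, new_device=False))   # False: normal spend
print(is_suspicious(history, 900.0, new_device=True))   # True: outlier on a new device
```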

The £100m footprints

One example of deploying this kind of analytics to tackle fraudsters: Teradata’s work with a global top-five bank that was under attack from remote-access takeover fraud.

The problem grew by 15% during Covid, which brought not only financial losses but pressure from regulators. After deploying Celebrus and Teradata Vantage, the bank was able to establish a hyper-personalized behavioral fraud solution.

It worked by capturing digital interactions in real time and analyzing the data for transactional and behavioral patterns, running millions of micro models to assess behaviors, and deploying insights with sub-second response times. It detected over £100m in preventable fraud, and the bank is now able to detect and prevent 70% of fraud cases.

Levelling up the arms race

The cyber fraud arms race is one of move and countermove, but by investing in analytics, financial services firms can deploy a powerful new weapon in the fight against fraudsters.



With so much diverse data available, why do so many companies still struggle to embrace the real-time, data-driven decision-making they need to be more agile? One thing is clear: The challenge isn’t solved by technology alone.

“You can’t buy transformation,” says Tom Godden, Principal Technical Evangelist with the Enterprise Strategy team at AWS. “Real change doesn’t come just from new technology—it comes from rethinking your processes, which are enabled by the technology.”

Godden offers three tips to help organizations break down data silos, improve data quality, and overcome other longstanding data challenges to foster a culture of decision-making that drives business agility.

1. Build the foundation for managing data in real time.

A modern data strategy must emphasize data quality at the source of origin, rather than traditional methods of cleansing and normalizing at the point of consumption. Make sure you have the proper infrastructure, tools, and services in place to capture data from a variety of sources, ensure the quality of the data you’re collecting, and manage it securely, end to end.

The technical underpinnings of a modern data strategy include cloud-based databases, data lakes, and data warehouses; artificial intelligence and machine learning (AI/ML) tools; and analytics. The infrastructure must be supported by a comprehensive plan to manage, access, analyze, and protect data across its entire lifecycle, with fully automated processes and robust integration to make data actionable across the organization.

“It may sound obvious, but if you do not build the right processes to capture all the data, you can’t act on the data,” says Godden.
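A minimal sketch of what quality-at-the-source can look like in practice: validate each record as it is captured, before it ever enters the pipeline. The schema and rules below are illustrative assumptions.

```python
from datetime import datetime

REQUIRED_FIELDS = {"order_id", "customer_id", "amount", "ts"}

def validate_at_source(record: dict) -> tuple[bool, list[str]]:
    """Accept or reject a record at the point of capture, not at consumption."""
    errors = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - record.keys())]
    if "amount" in record and not isinstance(record["amount"], (int, float)):
        errors.append("amount must be numeric")
    if "ts" in record:
        try:
            datetime.fromisoformat(record["ts"])
        except (TypeError, ValueError):
            errors.append("ts must be an ISO-8601 timestamp")
    return (not errors, errors)

ok, errs = validate_at_source({"order_id": "A1", "customer_id": "C9", "amount": "12"})
print(ok, errs)  # False ['missing field: ts', 'amount must be numeric']
```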

2. Don’t just democratize data – democratize the decisions based on that data.

Investing in the data management infrastructure, tools, and processes necessary to capture data in real time through a variety of data feeds and devices is just the first step. If you aren’t simultaneously creating a culture that allows people to act on data, you’re just creating frustration.

To that end, avoid “reporting ghost towns” that require people to stop what they’re doing and access a different tool for insights. Instead, build analytics capabilities directly into their workflows, with context, so they can easily apply the insights to their daily activities.

3. Provide the types of guardrails that spur innovation instead of inhibiting it.

Building automated processes for metadata, including information on data lineage and shelf life, builds confidence in the data. By storing data in its raw or native format, you can apply access policies to individuals without having to modify the data.

This approach ensures more flexibility for how people can use the data they need without compromising the fidelity of the data itself. A data lake can serve as a foundational element of a data unification strategy, providing a single source of truth with supporting policies for real-time provisioning based on permissions.
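The sketch below illustrates that idea in Python: the raw records are never modified; instead, each role’s policy filters rows and columns at read time. The roles, fields, and records are invented for illustration.

```python
RAW_RECORDS = [
    {"customer": "C1", "region": "EU", "email": "c1@example.com", "spend": 120.0},
    {"customer": "C2", "region": "US", "email": "c2@example.com", "spend": 80.0},
]

POLICIES = {
    # role -> (fields visible to the role, row-level predicate)
    "analyst":  ({"customer", "region", "spend"}, lambda r: True),
    "marketer": ({"customer", "email"},           lambda r: r["region"] == "US"),
}

def provision(role: str) -> list[dict]:
    """Return only the rows and columns this role is permitted to see."""
    fields, predicate = POLICIES[role]
    return [{k: v for k, v in rec.items() if k in fields}
            for rec in RAW_RECORDS if predicate(rec)]

print(provision("analyst"))   # all rows, no email addresses
print(provision("marketer"))  # only US rows, no spend figures
```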

Agile decision making: How three companies are benefiting from a modern data strategy

Organizations are already capturing the benefits of real-time access to data based on roles and permissions. Here are three examples:

Swimming Australia, the nation’s top governing body for swimming, has long been at the forefront of science. Now, it’s using data to analyze race performance and create bespoke training programs for individual athletes. A data lake unified athlete statistics and metrics in a single location, and AI/ML tools are helping the team tailor training programs and track competitors. Analysts and coaches capture real-time physiological data during training sessions and combine that information with race analysis to determine how to evolve training efforts for individual swimmers. Coaches and athletes can easily track progress in real time from their phones via cloud-based dashboards. Today, with its modern data architecture, the national team can create benchmarking reports in minutes, an innovation that helped make the Australians the most successful relay team in the 2020 Tokyo Olympic games.

Coca-Cola Andina, which produces and distributes products licensed by The Coca-Cola Company within South America, needed a solution to collect all relevant information on the company, its customers, logistics, coverage, and assets within a single accurate source. The answer was a cloud-based data lake, which allowed the company to implement new products and services to customize the different value propositions for its more than 260,000 customers. With all the resources and functionality that the data lake enables, Coca-Cola Andina ensures its partners and customers have access to reliable information for making strategic decisions for the business. Coca-Cola Andina ingested more than 95% of the data from its different areas of interest, which allows it to build excellence reports in just a few minutes and implement advanced analytics. The cloud infrastructure increased productivity of the analysis team by 80%.

Vyaire, a global medical company, needed a way to help its 4,000 employees make better, data-based decisions utilizing both first- and third-party data. Adopting AWS Data Exchange to find, subscribe to, and use third-party data has made it easier to incorporate data sources into the company’s own data ecosystem, resulting in quicker insights to help teams focus on getting results, not administration. Easy access to third-party data via the AWS Data Exchange catalog has encouraged more experimentation and innovation, giving Vyaire’s leadership confidence that it can meet the changing market for respiratory care products and direct investment in the right area to improve its product portfolio.

Too many organizations continue to be held back from using data effectively to drive all aspects of their business. A modern data strategy will empower teams and individuals, regardless of role or organizational unit, to analyze and use data to make better, faster decisions – enabling the sustainable advantage that comes from business agility.



Most organizations realize that using data to better understand customer needs and preferences is vital to creating consistently great customer experiences. The challenge many face is how to put all of the data they’re collecting to work toward that goal. 

We asked the CIO Experts Network, a community of IT professionals, industry analysts, and other influencers, how businesses can make better use of their data to improve customer experiences. Here are four key takeaways from their responses.  

Be a better listener

Positive customer experience is good for business. McKinsey research has found that improving the customer experience (CX) can increase revenues by 2-7%, boost profitability by 1-2%, and increase shareholder return by 7-10%. Yet organizations continue to struggle with customer experience management. A McKinsey CX survey found that just 7% of the customer voice is shared with CX leaders and only 13% of CX leaders are confident that their organization can take action on CX issues in near real-time. 

“Customers are the heart of every company. However, too many organizations don’t listen carefully enough to them,” said Scott Schober (@ScottBVS), President/CEO at Berkeley Varitronics Systems, Inc. 

“Each customer’s data tells a story that cannot be expressed through gut instinct, emotion, or by committee,” Schober said. “When organizations understand a customer’s habits, buying patterns, and preferences, they will gain that customer’s trust for a lifetime of recurring revenue.”

The good news is that, given the wealth of data that organizations have access to in our digitally driven world, they have more opportunity to analyze customer behaviors and preferences to develop improved experiences, according to the experts:

“Data is ultimately the footprint and behavioral pattern of one’s customers. The data can show what they are using, as well as how, where, and how long they are using it. For example, solutions like website heatmaps can be used to understand a customer’s behavior on a website. Studying that behavior can allow developers to improve the customer experience.”

Jason James (@itlinchpin), CIO of Net Health

“Businesses applying customer data to website personalization [should focus on] creating unique online experiences that delight customers and lead to 1:1 connections that increase return visits to the site and, more importantly, sales.”

Will Kelly (@willkelly), Senior Product Marketing Manager at Section

Invest in tools for managing, analyzing, and using data

Deeper insights about customers require a modern data foundation and tools for gathering, verifying, and integrating data. Investments should focus on improving data quality, ensuring data governance, and layering in tools such as artificial intelligence (AI) and machine learning to accelerate insights and make predictions that drive innovation:

“Many businesses don’t do a very good job of verifying data, which leads to customer frustration as they try to navigate through a business. Data quality is even more important than data quantity. Estimates are that up to 30% of corporate data is inaccurate or corrupted. That is a major impediment. Utilize as much data verification as you can manage, especially as there are many tools available to help in this task.”

Jack Gold (@jckgld), President and Principal Analyst at J. Gold Associates, LLC

“Business leaders should centralize real-time customer data profiles by integrating data from across the customer journey. Access to the profiles should go through a data governance process that enables business leaders from sales, marketing, customer service, and operations to create segmentations based on their objectives. Segmentations help drive ongoing experimentation to learn about customer objectives, which leads to creating personalized experiences.”

Isaac Sacolick (@nyike), StarCIO Leader and Author of Digital Trailblazer

“Capturing and analyzing implicit and explicit data from available data streams to answer specific questions and spot patterns improves the customer experience and provides the business with actionable insights. Adding AI and ML to the mix offers a recipe to whip up a competitive advantage that not only delights customers but also helps a company leapfrog — and stay ahead of — its competitors.”

Gene De Libero (@GeneDeLibero), Chief Strategy Officer at GeekHive.com
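As a rough sketch of Sacolick’s suggestion above, the Python below merges touchpoints from different systems into a single profile per customer and derives a simple segment from the unified view. The sources, fields, and segmentation rule are invented for illustration, not any particular vendor’s model.

```python
from collections import defaultdict

# Touchpoints arriving from different systems across the customer journey.
events = [
    {"customer_id": "C7", "source": "web",     "pages_viewed": 12},
    {"customer_id": "C7", "source": "support", "open_tickets": 0},
    {"customer_id": "C7", "source": "sales",   "lifetime_value": 4200.0},
]

profiles: dict[str, dict] = defaultdict(dict)
for event in events:
    cid = event.pop("customer_id")
    event.pop("source")
    profiles[cid].update(event)  # fold each touchpoint into one profile

def segment(profile: dict) -> str:
    """Toy segmentation rule over the unified profile."""
    if profile.get("lifetime_value", 0) > 1000 and profile.get("open_tickets", 1) == 0:
        return "high-value, healthy"
    return "nurture"

for cid, profile in profiles.items():
    print(cid, segment(profile), profile)
```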

Break down data silos to open up collaboration and innovation

Unifying data helps businesses “eliminate data silos and ensures all departments — from product and services to sales and marketing — can observe how their work impacts the customer experience, and can work towards improving it,” said Sridhar Iyengar (@iSridhar), Managing Director at Zoho Europe.

It’s also important to recognize that “the customer experience doesn’t end at the front of the house,” said Peter B. Nichol (@PeterBNichol), Chief Technology Officer at OROCA Innovations. “Leaders can coordinate treatments, tactics, and offers across channels by linking back-end processes with front-end services or interactions.” 

To bring it all together, Nikolay Ganyushkin (LinkedIn: nikolaygan), CEO and Co-founder of Acure, offers an example of a telecom company that reconfigured how user requests for technical support were processed. “By connecting this data to the CRM system and to the network monitoring system,” he said, “we were able to set up automatic reporting of problems, which reduced customer churn and increased customer loyalty.”

Build awareness of data literacy and privacy across the business

Although data and technology infrastructure play a critical role in improving customer experience, our experts also note the importance of supplementing technology with proper training as well as awareness around privacy issues.

“Teach the data user how to use the insights,” said Frank Cutitta (@fcutitta), CEO and Founder at HealthTech Decisions Lab. “There are illusions that if we simply give them data, we will see the results; or if we visualize the data they will better understand it on their own. Data requires coaching and storytelling, not just do-it-yourself PowerPoint decks with no notes or talking points.” 

Iyengar (@iSridhar) added: “There is a fine line between use of customer data to enhance customer experience and abuse of data privacy. It’s essential that businesses avoid data exploitation, which requires customer consent in all areas of data usage related to marketing, clearly listing all data practices in an upfront privacy policy, and using business tools that are industry-compliant with security and data protection regulations.”

The bottom line

Today’s business and IT leaders realize that data is critical for creating better experiences, but many continue to struggle to enable their people to act on that insight. Organizations can get closer to gaining a 360-degree view of their customers by investing in a modern infrastructure and tools, unifying data across the business, and training the workforce to apply analytics and insights to their daily activities. That’s how a data-driven approach to CX can drive better business outcomes. 

“Hyper-competitive companies know that data-driven environments change customer behavior for the better,” said Nichol (@PeterBNichol). “The conventional request-reply encounters are a thing of the past. Instead, customers demand a superior experience, designed around their data, from purchasing to production.”



For the healthcare sector, siloed data is a major bottleneck in the way of innovative use cases such as drug discovery, clinical trials, and predictive healthcare. Aster DM Healthcare, an Indian healthcare institution, has now found a solution to this problem, one that could lead to several cutting-edge applications.

A single patient generates nearly 80MB of data annually through imaging and electronic medical records. RBC Capital Markets projects that the annual growth rate of healthcare data will reach 36% by 2025. “Genomic data alone is predicted to be 2 to 40 exabytes by 2025, eclipsing the amount of data acquired by all other technological platforms,” it says.

Although AI-enabled solutions in areas such as medical imaging are helping to address pressing challenges such as staffing shortages and aging populations, accessing silos of relevant data spread across various hospitals, geographies, and other health systems, while complying with regulatory policies, is a massive challenge.


“In a distributed learning setup, data from different hospitals must be brought together to create a centralised data repository for model training, raising a lot of concerns about data privacy. Hospitals are sceptical about participating in such initiatives, fearing loss of control over patient data, even though they see immense value in it,” says Dr Harsha Rajaram, COO at Aster Telehealth, India & GCC. Its parent firm Aster DM Healthcare is a conglomerate with hospitals, clinics, pharmacies, and healthcare consultancy services in its portfolio.

To overcome these challenges, Aster Innovation and Research Centre, the innovation hub of Aster DM Healthcare, has deployed its Secure Federated Learning Platform (SFLP) that securely and rapidly enables access to anonymised and structured health data for research and collaboration.

Federated learning is a method of training AI algorithms on data stored at multiple decentralised sources, without moving that data. The SFLP allows access to diverse data sources without compromising data privacy, because the data remains at the source while the model is trained across multiple sites.
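To make the mechanics concrete, here is a minimal federated-averaging (FedAvg) sketch in PyTorch, in which each simulated “hospital” trains on its own private data and only model weights travel to the aggregator. It is a generic illustration of the technique, not Aster’s OpenFL configuration; the model, data, and round counts are toy placeholders.

```python
import copy
import torch
import torch.nn as nn

def local_update(model: nn.Module, data, targets, lr=0.01, steps=5) -> dict:
    """Train a copy of the global model on one site's private data."""
    local = copy.deepcopy(model)
    opt = torch.optim.SGD(local.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss_fn(local(data), targets).backward()
        opt.step()
    return local.state_dict()  # only weights leave the site, never raw records

def fed_avg(states: list[dict]) -> dict:
    """Aggregator step: average the sites' weights into a new global model."""
    avg = copy.deepcopy(states[0])
    for key in avg:
        avg[key] = torch.stack([s[key] for s in states]).mean(dim=0)
    return avg

global_model = nn.Linear(10, 1)  # toy model standing in for a clinical network
# Two sites with private synthetic data standing in for hospital records.
sites = [(torch.randn(32, 10), torch.randint(0, 2, (32, 1)).float()) for _ in range(2)]

for _ in range(3):  # three federated rounds
    states = [local_update(global_model, x, y) for x, y in sites]
    global_model.load_state_dict(fed_avg(states))
```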

“The platform marks a paradigm shift by getting the compute to the data rather than getting the data to the compute,” says Dr Lalit Gupta, consultant AI scientist-innovation at Aster Digital Health.

“Federated technology provided us a platform through which we can unlock the immense potential data provides to draw better insights into clinical, operational, and business challenges and tap on newer opportunities without the fear of losing control of our data. It will allow data scientists from multiple organisations to perform AI training without sharing raw data. By gaining access to larger data sets, they can develop more accurate AI models. It will also ensure data compliance and governance,” COO Rajaram says.

The building blocks of SFLP

Before deploying the platform, Aster conducted a capability demonstration, or proof of concept, of the platform using hospital data from the Bengaluru and Vijayawada clusters of Aster Hospital.

“The platform comprised a two-node collaboration with machines physically located in Bengaluru and Vijayawada. The director/aggregator was in Bengaluru, and the two envoys/collaborators were located in Bengaluru and Vijayawada, respectively. The software setup included Ubuntu 20.04.02 with kernel version 5.4.0-65-generic, the OpenFL Python library for collaboration, the PyTorch Python library for developing deep learning models, and an Nvidia Quadro RTX 6000 GPU,” says Gupta.


“The Aster IT team helped to install and set up the three servers, enabled ports, installed the operating system and necessary drivers, and maintained the servers. The IT team also helped to fetch the data from PACS and HIS, which was required for federated learning experiments,” he says. PACS refers to picture archiving and communication system, a medical imaging technology used to store and transmit electronic images and reports. An HIS or health information system is designed to manage healthcare data.

As part of the capability demonstration, more than 125,000 chest X-ray images, including 18,573 images from more than 30,000 unique patients in Bengaluru, were used to train a CheXNet AI model, developed in Python, to detect abnormalities in X-ray reports. The additional 18,573 images provided a 3% accuracy boost, thanks to real-world data that was otherwise not available for training the AI model.
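CheXNet is built on the DenseNet-121 architecture, so the model setup for such an experiment might look roughly like the PyTorch sketch below. The 14-class head reflects CheXNet’s thoracic finding labels; weight initialization, data loading, and training specifics are omitted, and Aster’s actual pipeline may differ.

```python
import torch.nn as nn
from torchvision import models  # torchvision >= 0.13 API assumed for `weights=`

NUM_FINDINGS = 14  # CheXNet predicts 14 thoracic abnormality labels

model = models.densenet121(weights=None)  # ImageNet weights are a common starting point
model.classifier = nn.Linear(model.classifier.in_features, NUM_FINDINGS)

# Multi-label setup: one logit per finding, trained with a sigmoid-based loss
# on (image, label-vector) pairs, e.g., 224x224 X-ray crops exported from PACS.
criterion = nn.BCEWithLogitsLoss()
```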

The platform can accommodate any analytical tool and does not have any restrictions on the size of data. “We decide on the size of data based on the use case. For our capability demonstration experiments, we used a chest X-ray image database of around 30GB,” says COO Rajaram.

It took Aster about eight months, including four months for the capability demonstration, to deploy the system. The platform went live in June 2022. “We are in our early days, with hardware and software deployed at only two hospitals currently. We intend to increase these deployments to multiple hospitals and look forward to other providers joining hands to leverage the ecosystem,” says Rajaram.

Addressing new data security challenges

While federated learning is a well-acknowledged approach to addressing data privacy challenges, it also introduces additional security risks, because data and AI model assets are more exposed to possible hacking. Hence, it is essential to provide security capabilities to go with the privacy.

A set of security-related instruction codes is built into the servers’ central processing units, providing the hardware-based memory encryption that isolates specific application code and data in memory. “The platform combines federated learning with security guarantees enabled by its hardware. This helps to protect data and AI models in storage, when transmitted over the network, and during execution of federated learning training jobs. The security features in the platform provide confidentiality, integrity, and attestation capabilities that prevent stealing or reverse-engineering of the data distribution,” says Rajaram.

“Annotation was already in our PACS system. We used its API for data extraction. Though anonymisation was not required since it was within our network, for the pilot we did anonymise the data from the back end,” he says.


Much of the hype around big data and analytics focuses on business value and bottom-line impacts. Those are enormously important in the private and public sectors alike. But for government agencies, there is a greater mission: improving people’s lives.

Data makes the most ambitious and even idealistic goals—like making the world a better place—possible.

This is intrinsically worthwhile, but it has now been codified as part of the Federal Data Strategy and its stated mission to “fully leverage the value of federal data for mission, service, and the public good.” Making the world a better place with data is not only noble, it’s required.

This comes at a critical time for the US and the world—the global population faces complex challenges that cannot be ignored, including:

- Caring for a large, aging population that is living longer than ever: For the first time in US history, older adults are expected to outnumber children by 2034. (US Census)
- Combating climate change: The US experienced 20 different billion-dollar weather and climate disasters in 2021, the second-most in history after the record 21 billion-dollar disasters in 2020. (Climate.gov)
- Navigating growing geopolitical tensions and conflicts: Agencies operate in a time of immense complexity, from rising inflation to war in Ukraine to a vast cybersecurity threat landscape (and the recent Executive Order on Improving the Nation’s Cybersecurity).
- Adapting to new social, economic, and public health realities: The COVID-19 pandemic and other factors have had indelible impacts on how (and where) people live and work, including a significant increase in remote and hybrid work, as well as creating a renewed urgency around improving public health.

The public sector has a massive, growing asset at its disposal for tackling these and other pressing issues: data. It is central to the kinds of intelligent analysis and insights required to take swift, informed action that positively impacts people’s lives and improves the world around them.

To achieve this vision, agencies must modernize and optimize how they collect, organize, manage, analyze, and act on that data—in an open manner that fosters trust and accountability with citizens, partners, and other stakeholders.

There are four fundamental attributes of this transformation that must be brought together as part of a data framework for a better world. These pillars include:

- Agility: Agencies need the tools, skills, and culture to rapidly adopt and collect new data sources, build new applications and data products, and deliver actionable insights. They cannot be asked to solve tomorrow’s problems with yesterday’s technology.
- Trust: For data to make the world a better place, people must be able to trust it. Agencies need tools and policies that balance transparency, security, privacy, and regulatory compliance. They will need to root out bias and ensure they—and their machine or algorithmic counterparts—leverage data in a manner that is equitable and ethical.
- Data-driven culture: The public sector must operate with a shared commitment to learning and adopting a data-driven culture in their own work—and to fostering that same data-driven culture in society at large.
- Open AI and ML collaboration to improve government services: The scope and scale of public data—and the challenges we must tackle—mean that human effort and ingenuity alone won’t be enough. To accomplish the most ambitious goal—improving people’s lives—government agencies will need to increasingly rely on automation and an open AI and machine learning collaboration that bridges research and services with cutting-edge data products.

Cloudera is well-positioned to support this modern data framework for a better world. We’ve built a hybrid data platform leveraging data lakehouse, data mesh, and data fabric as post-movement tools for optimal mission performance. 

That starts with bringing the vast amounts of data in motion—structured and unstructured, from a wide range of sources—into a single platform so that it can be utilized for the greater good. And we continuously invest in the tools and capabilities agencies need to ensure trust and transparency, from robust security protocols to data lineage to metadata and more.

You can learn more in our interactive ebook, “Data in Motion to Accelerate Your Mission.”

Together, we can use the power of data to make the world a better place.

Want to learn more? Watch Carolyn Duby’s presentation at Cloudera Government Forum.


There’s no roadmap. No standard approach. No required certifications. No tried-and-true methods to guarantee success. Executive coaching sounds like a profession at the opposite end of the spectrum from the CIO role.

Yet for former CIOs Jim Rinaldi of NASA’s Jet Propulsion Lab and Jim DiMarzio of Toyo Tires and Mazda North American Operations, their recent career transitions to coaching felt like a natural next step after the mentoring work they’ve done for decades.

“Coaching is very individual,” says DiMarzio, who retired from Toyo in July 2021 and now serves as an IT Executive Coach with IDG’s CIO Executive Council. “It’s really about exploring the person’s plans and getting to know who they are professionally.” His clients are first-level through mid-level IT managers. “The good thing is, they all want to talk to you and improve, so it’s positive right out of the box.”

“More people in our profession should learn how to coach, not just how to manage,” says Rinaldi, who retired from NASA/JPL earlier this year and now serves as executive director for Innovate@UCLA. “The workforce we have coming up today is digitally enabled but lacking experience.”

Catching up with these longtime IT leaders recently, we talked about the challenges they’re hearing about and helping with in their new coaching roles.

Maryfran Johnson: In today’s hybrid/remote workplaces, how are coaching needs changing when it comes to developing next-gen leaders?


Jim DiMarzio: There are a lot of nuances with the way people are managing remotely today, but communication is still the No. 1 concern. In some cases, managers are finding it easier to reach those people who used to hide in their offices and not answer the phone. But it’s also more difficult now—without those face-to-face meetings—to get your visions and ideas across about where technology can take the business next.

Jim Rinaldi: I see two changes that are really happening. First, we have a growing workforce that can work from anywhere, and second, this new workforce expects managers to treat them the way they want to work. How do a manager and team look at the expectations of this new workforce? How do you make sure it’s inclusive and has much greater decision transparency?

As we get through all this business and digital transformation, there needs to be a management transformation, too. We need to rethink how this new workforce is being managed and motivated, and there’s not enough focus on it yet. One resource I’ve been recommending lately is Keith Ferrazzi’s new book, “Competing in the New World of Work.”

What leadership lessons did you learn the hard way (and are now sharing as a coach)?


Rinaldi:  Acceptance of change and how you deal with it, personally. I grew up in the days when if you didn’t move upward in the company, you weren’t moving. My reward system was always based on that. Now I wonder: Why didn’t I go and do something more, like get an advanced degree in computer science or math?

When I’m talking to someone today who has an opportunity to grow in a different way, I say, ‘Why not?’ Should it always be about money, promotions, and titles? Make sure you’re doing what you have a passion for, not just another stepping stone.

DiMarzio: There are three top things I talk about in my coaching work. No. 1: Have a vision and a strategy, regardless of your level, that you can rally your team around and feel good about. At the director level and up, that needs to be a business strategy everyone understands. No. 2: Always ensure you’re talking honestly to your staff, especially at the middle management level. Make sure they are talking to their staff, too. You have to have trust in those people to tell you what’s going on. No. 3: Make sure you have a good environment for the team. It takes only one person to poison that atmosphere.

Looking at your own career strategically, what was the best decision you ever made?

DiMarzio: I was working at Subaru on the East Coast in the 1980s, and it was a very large organization at the time, doing really well. As an IT manager in a very large shop, I knew I needed to work with the business more. But the reaction I’d get was always ‘You’re the IT guy, why are you talking?’ So, I went back to school to get my MBA. Once I was able to talk about the business, working with them on a vision of how technology could help, that was a turning point in my career.

Rinaldi: I’d say the best decisions are always around hiring good people. When you get the right people for the organization and job, your life is so much better! When you don’t, you’ve failed and have to fix it. I figured out early on that I enjoyed working with people like scientists, executives and professionals who I could trust and build relationships with. I learned that I enjoyed working at places that celebrated their successes and valued me as an employee. My desire to exceed expectations was driven by the environment I worked in.

What do you wish you had figured out earlier in your career, and what would you advise other IT leaders to do today?

DiMarzio: I’d have to say the value of networking and talking to other CIOs—not just for career purposes but to find out the best info about industry vendors or hear about where others are finding talent. The networking I did in the automotive industry was also important. For example, I kept in touch with the president of Land Rover after I left there, and he was one of the reasons I ended up at Mazda, where I stayed for 15 years.

Rinaldi:  My advice is to find a leader you admire, and then watch them, listen to them. Most CEOs and bosses are pretty good and you can learn from them, but also look at CEOs you don’t know. The important point is to observe people and watch their styles. And realize that in your 20s and 30s, you don’t have a leadership style yet! But you will develop one.   

This article originally appeared in CIO’s Career Strategist newsletter.
