Data is critical to success for universities. Data provides insights that support the overall strategy of the university. It can also help with specific use cases: from understanding where to invest resources and discovering new ways to engage students, to measuring academic outcomes and boosting student performance. Data also lies at the heart of creating a secure, Trusted Research Environment to accelerate and improve research.

Yet most universities struggle to collect, analyse, and activate their data resources. There are many reasons why.

For a start, data is often siloed according to the departments or functions it relates to. That means the dots between these datasets are never joined, and potentially valuable insights are missed.

This has not been helped by the fact that universities have traditionally lagged the private sector in terms of cloud adoption, a key technology enabler for effective data storage and analysis. One thing holding universities back has been a reluctance to move away from traditional buying models. Long-term CapEx agreements have helped universities manage costs, but such models are inflexible. In the age of the cloud, what’s needed is a more agile OpEx-based approach that enables universities to upgrade their data infrastructure as and when required.

Finally, the skills gap remains a barrier to the better use of data. Eighty-five percent of education leaders identify data skills as important to their organisation, yet they report a 19% shortfall in the skilled professionals required to meet their needs.

How can universities overcome these barriers? The first step is to put in place a robust data strategy. Each strategy will be different according to the unique needs of the university, but at a minimum it should include the following:

- Evaluation of the current data estate to understand pinch points and siloes so these can start to be tackled.
- Alignment of organisation strategy with technical requirements.
- Evaluation of the cloud market and a cloud adoption roadmap to enable data transformation and agile, integrated data use.
- A comprehensive upskilling programme to overcome data skills gaps.

As universities embark on this journey, finding the right partner will be critical. One option is to team up with a company like SoftwareONE, which has extensive experience in enabling data strategies for large organisations.

Significantly, SoftwareONE is an Amazon Web Services (AWS) Premier Consulting Partner, which means it can bring to bear the capabilities of one of the world’s leading cloud platforms. SoftwareONE adds value by optimising and automating AWS infrastructure as code, which makes it faster and less expensive for universities to get their cloud data programmes up and running. The company also offers a rapid, cost-effective, and secure path to building trusted cloud-based research environments. 
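To illustrate what “infrastructure as code” means in practice, here is a minimal sketch using the AWS CDK for Python. The stack and bucket shown are invented for illustration, not SoftwareONE’s actual tooling.

```python
# Minimal infrastructure-as-code sketch (AWS CDK v2, Python).
# The construct names and settings below are illustrative assumptions.
from aws_cdk import App, RemovalPolicy, Stack
from aws_cdk import aws_s3 as s3
from constructs import Construct

class ResearchDataStack(Stack):
    """Provisions an encrypted, versioned bucket suitable for research data."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        s3.Bucket(
            self,
            "ResearchData",
            encryption=s3.BucketEncryption.S3_MANAGED,           # encrypt at rest
            versioned=True,                                      # retain prior versions
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,  # no public exposure
            removal_policy=RemovalPolicy.RETAIN,                 # keep data on teardown
        )

app = App()
ResearchDataStack(app, "research-data")
app.synth()  # emits a CloudFormation template that can be reviewed and redeployed
```

Because the infrastructure is expressed as reviewable, versioned code, the same secure baseline can be repeated across projects instead of being rebuilt by hand each time.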

What’s more, partners like SoftwareONE can help address the skills challenge, and not only through automation. SoftwareONE helps to upskill IT teams at universities and provides a full infrastructure as a managed service. Whatever your organisation’s level of comfort with the cloud, SoftwareONE can help you leverage cloud-based data tools with ease.

For more information about how SoftwareONE can help build your data strategy, click here.


Cloud adoption continues its meteoric rise, with IT leaders increasingly going all-in on the platform. But succeeding in the cloud can be complex, and CIOs have continued to fumble their cloud strategies in 2022 in a variety of ways, industry observers say.

Topping the list of typical cloud strategy mistakes are three that fall under the heading of mental blueprint blunders: assuming that a cloud strategy is an IT-only endeavor, that all data must be moved to the cloud, and that a cloud strategy is the same as a data center strategy.

These fundamental issues resonate strongly with Liberty Mutual Executive Vice President and CIO James McGlennon, who admits to having dropped the ball on these strategic priorities while developing the insurance company’s advanced cloud infrastructure.

“These three are still a work in progress,” says McGlennon, adding that every mistake identified and fixed leads to a healthier cloud transformation. “We have continued to evolve our cloud strategy as we gain more insight into the leverage we can gain in engineering, resilience, scaling, security, and market testing.”

Business-IT alignment — pairing the cloud’s technology prowess with an organization’s business goals — is a top priority for all cloud journeys, but remains a challenging issue for CIOs.

“C-level executives must be involved in assuring that the IT agenda is in line with the business plan. This is their job relative to IT,” says Paul Ingevaldson, a retired CIO based in St Charles, Ill., who says this should be in the rear-view mirror by now.

“It cannot be delegated, and it certainly should not be the job of the CIO — although she or he has a voice. We should no longer be discussing this issue,” he laments. “It should be a given for any CIO.”

As 2022 ends, CIOs know that business outcomes are only as good as the cloud strategy behind them. But often they push forward with a blueprint ripe for disappointment, as these common cloud strategy mistakes show.

No way in, no way out

In addition to going it alone, insisting on moving all data to the cloud, and approaching the cloud the way they would a data center, CIOs also often fall prey to flawed thinking about the scope of their digital transformation, either by failing to have an exit strategy if cloud plan 1.0 flops or believing it is too late to implement a cloud strategy at all.

Research firm Gartner notes, for example, that a cloud exit strategy is an essential “insurance policy” that enables CIOs to bail out of a faulty implementation at the lowest cost possible. But this backup plan often slips through the cracks, analysts say.

On the flipside, many CIOs harbor the misconception that it is too late to reap the prime rewards of IT in the cloud. In some cases, however, CIOs just beginning their cloud journeys are better off than early adopters because the related toolsets and services are more plentiful, mature, and far easier to implement. This makes the cloud transformation less expensive and the outcomes better, CIOs say.

“Sometimes, waiting is better than rushing into cloud computing,” claims Dawei Jiang, chief cloud engineer at the US Patent and Trademark Office (USPTO). “Everyone has a roadmap. Do not rush; pick the best and smartest solution, and avoid unnecessary work.”

Serendipity proved this point for boot and shoe manufacturer Wolverine, which hit pause on its cloud journey during COVID, delaying a hybrid-cloud transformation that has leapt forward of late, given newly available tools and an IT culture better-suited to making the most of the cloud.

Superficial planning — or worse, outsourcing your strategy

CIOs also blow their cloud strategy out of the gate by confusing a cloud strategy with an implementation plan or confusing an “executive mandate” or “cloud first” motto with an actual cloud strategy, according to Gartner. Such superficial approaches to the cloud are certain recipes for remorse, the research firm says.

But worse off may be those who pass the buck on their cloud strategy to partners, Gartner maintains. Relying exclusively on a single cloud vendor, such as Microsoft, Amazon, or Google, or a top IT outsourcing firm, for example, to design an enterprise cloud strategy is a big mistake, Gartner and CIOs agree.

The blueprint for each company’s digital transformation is unique and requires a deep dive into all IT systems by the entire C-suite and IT team to optimize the outcome, analysts note. No third party knows an enterprise better than its executives and employees.

Cloud vendors have created a collection of cookie-cutter blueprints that may be customized for specific purposes, and IT consulting firms can be instrumental in helping enterprises implement their cloud strategies, Gartner maintains. But under no circumstances should a cloud vendor or a consulting firm lead the strategic blueprint, analysts and CIOs say.

“Maybe the biggest blunder is just pushing a strategy but not looking at the management and leadership capabilities,” says Vince Kellen, CIO at the University of California San Diego, who hired two executives from consulting firms as university IT employees to help him build the university’s cloud strategy.

“The way to beat the odds on both cost and quality is for the CIO to have high team IQ within their unit, meaning the IT unit is able to apply local context to the technical solution in a way that saves money and/or builds better quality,” Kellen says.

Major sins of omission

Another top cloud expert points out three additional mistakes CIOs often make when formulating their cloud strategies.

“Not architecting for the cloud,” says IDC analyst Dave McCarthy, when asked where CIOs commonly go wrong when building their cloud strategy. “While it is possible to ‘lift and shift’ existing workloads, enterprises often experience less than desirable costs and performance with this approach. You need to adapt applications to cloud-native concepts to realize the full value.”

CIOs also often make the mistake of “not implementing enough automation,” says McCarthy, who is research vice president of cloud and edge infrastructure services for IDC. “Best practices in cloud include automating everything from the deployment of infrastructure and applications to management and security. Most outages or security breaches are the result of manual misconfigurations.”

But perhaps the worst sin CIOs can commit, analysts across the spectrum agree, is failing to plan for the shift in culture and skills required to devise and implement a successful cloud strategy. The cloud functions differently than traditional IT systems, and a cloud strategy requires not only new skills but also a change in thinking about how to design and manage the environment, McCarthy says.

“When you go to the cloud, it’s like moving from a bicycle to a high-performance vehicle, and you can’t assume that what you did before [in the data center] will work the same; otherwise, you’ll have the same mess on your hands,” says Craig Williams, CIO at Ciena. “It can be worse, too, because your costs can get out of control if you don’t get in front of it.”


Kidney diseases are a leading cause of death in the US, claiming more than a quarter million lives each year. Roughly 37 million people in the US are afflicted with chronic kidney disease (CKD), although most are undiagnosed. Left untreated, CKD may advance and can lead to more serious medical issues such as high blood pressure, diabetes, and complete kidney failure, or End Stage Renal Disease (ESRD).

The solution for many at that extreme stage is dialysis or a kidney transplant — both of which have a significant impact on the quality of life. Every 24 hours, 360 people in the US begin dialysis treatment for kidney failure, according to the CDC.

One organization at the forefront of clinical care and innovation is DaVita, one of the largest providers of kidney care services, with more than 2,800 outpatient dialysis centers in the US and close to 400 outpatient dialysis centers in 11 countries worldwide. This year alone, the company has served nearly 200,000 patients in the US at its outpatient centers and is actively pushing the kidney care industry to adopt high-quality standards of care for all patients.

While treating ESRD patients through its large network of dialysis centers is the Fortune 200 company’s primary business, the company is also involved in efforts to reduce CKD cases and the need for dialysis treatment and transplants as well. Here, IT is playing a significant role.

“We’ve been working to enable world-class integrated care at scale and transform the delivery of care at each point of a patient’s journey,” says Alan Cullop, SVP & CIO at DaVita.  “Our digital transformation strategy is centered around establishing a consumer-oriented model that helps us customize chronic care management based on the ever-changing conditions of each patient.”

The foundation for DaVita’s digital transformation will be a new technical platform and clinical documentation system that “allows for deeper integration across our applications and improves our ability to capture data throughout the patient’s care,” Cullop says, noting that development has been a multi-year process and deployment is now underway and will be completed in 2023. “We’re providing our physician partners, clinical teams, and patients with digital capabilities that support our efforts to proactively improve the quality of care our patients receive.”  

DaVita also provides millions of dollars in funding to address ancillary issues affecting kidney disease sufferers, such as food insecurity, and to support patients impacted by environmental catastrophes such as hurricanes and earthquakes.

In this CIO Executive Council Future Forward podcast, we talk with DaVita’s technology chief about the company’s plan to expand activities from CKD treatment to disease prevention. Cullop also talks about DaVita’s strategies for AI and data analytics, as well as the importance of passion and culture as drivers of technology innovation.

The following are edited excerpts from that discussion. Click on the podcast players below to listen to Parts 1 & 2 of the conversation.

Tim Scannell:  How much of a role do technologies like data analytics and AI play in DaVita’s overall technology and business strategy?

Alan Cullop: We have a very large and very focused effort on AI and data analytics. There’s so much power in data and the insights doctors get early in the care continuum and how we engage with patients even before they are in dialysis. We’re using predictive algorithms to identify signs of undiagnosed kidney disease. We’re also doing a lot with population health management, performing more comprehensive patient assessments, managing our high-risk patients, stabilizing transitions of care, optimizing outpatient care, and really trying to call out things that help us understand disease progression.

We’re looking at a variety of sources of data, putting it in data lakes, and then using that to drive predictive models that really help our doctors and our care teams to stratify our patient’s risk by taking actions at the right time.
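As a purely illustrative sketch of what risk stratification can look like, the toy model below trains a classifier on a handful of invented patient features and buckets patients by predicted risk. The features, thresholds, and data are hypothetical and are not DaVita’s actual algorithms.

```python
# Toy risk-stratification sketch; all features and data are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per patient: [age, eGFR, systolic BP, HbA1c]
X = np.array([
    [55, 85, 120, 5.4],
    [67, 48, 145, 7.9],
    [72, 30, 160, 8.5],
    [40, 95, 118, 5.1],
])
y = np.array([0, 1, 1, 0])  # 1 = disease progression observed

model = LogisticRegression().fit(X, y)

def stratify(patient: list[float]) -> str:
    """Bucket a patient by the model's predicted probability of progression."""
    risk = model.predict_proba(np.array([patient]))[0, 1]
    if risk > 0.7:
        return "high"
    return "elevated" if risk > 0.3 else "low"

print(stratify([65, 52, 150, 8.0]))  # a care team could prioritize outreach accordingly
```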

A lot of innovation initiatives right now are small and more closely aligned with tangible results. While this may be a great short-term strategy, how might this impact the concept of innovation in general as we move forward?

Alan Cullop: Innovation usually starts with a problem or something we’re trying to solve. And with AI sometimes you stumble on it unexpectedly as a search for the unknown unknown. The trick is to not let the outcomes and particular things we’re trying to solve get in the way of innovative thinking or block our sense of what’s possible.

Sometimes smaller focused innovation efforts do lead to much bigger ideas. But ideation sessions, innovation sessions, and hackathons have led to some interesting insights that we’ve built upon and that can be applied across the board. We encourage our teams to really embrace it, but we’re going to make mistakes. One of the better ways to learn is if you make a mistake, and now you know more than you did before and you know how to perhaps not repeat it.

IT culture is important today, especially in retaining and recruiting talent. As companies shift to a new normal of hybrid working, do you think there’ll be a significant impact on traditional cultural structures?

Alan Cullop: I think there are three fundamental issues or points that help build and sustain a strong culture. First, I am very excited by the increased focus on diversity, inclusion, and belonging in our society. We’ve been very focused on these issues for quite some time and really interested in embracing different perspectives. We’ve made great progress, but it’s one of those things that I would say your work is never done. I’m proud to be a part of the conversation and proud of the engagement level of our teammates.

Second, I think flexibility is crucial. We need to understand how and where our teammates want to work, which roles are conducive to work being done remotely versus hybrid models, and find ways to keep our engagement high. For us, that’s a balance. We’re exploring more flexible work arrangements and talking to teammates about how and where they want to work to meet their needs.

Finally, leadership needs to be visible and consistent in terms of demonstrating the importance of culture and engagement. It’s easy to talk about culture, but it’s certainly harder to carve out time to be present and to genuinely engage with teammates on culture. Everyone looks to leadership to be role models and set examples. So, it’s important that we take the time to walk the walk and not just talk the talk.

In earlier conversations, you talked about the ‘power of purpose.’ Can you tell me just what this means to you and how it comes into play at DaVita?

Alan Cullop: I think it’s super important and something we take very seriously. We talk about the power of purpose all the time in healthcare and what it means to stay connected to our patients and their families, and what we can do to really improve their quality of life and in many cases save lives. We bake this into our IT strategy, our team meetings, and our engagement approaches. I love the innovation and enablement that we bring. It personally gives me a lot of energy and passion and a sense of purpose. We’re doing something and we’re giving back to others, which I think for a lot of us helps bring a true sense of purpose.


Once relegated to the back office, CIOs are now key customer-facing leaders charged with delivering great customer experiences.

According to our State of the CIO 2022 survey, 57% of responding IT leaders say improving customer experience (CX) is a top goal, while 81% are implementing new technologies to support customer interactions.

Research also indicates that many organizations are struggling to reach their CX objectives, as they face challenges in bringing together the required skills, tools, and data. To help overcome those hurdles, we asked researchers and CX leaders to list steps CIOs can take to strengthen their organization’s customer experience function. Here are the ten action items that will have the greatest impact.

1. Embrace a customer-centric mindset

CIOs themselves will have to think and work in new ways as their organizations make CX a strategic objective, says Elli Rader, a partner at digital services firm West Monroe who works in its Product Experience and Engineering Lab.

Some CIOs, however, will have to work harder than others to make this shift, Rader says.

“Some CIOs — particularly in times of pressure, which is now — tend to fall back on things they’ve done for a long time. They fall back on things that are familiar because they have muscle memory. And those things tend to be around delivery, like ship fast, but, unfortunately, they miss what’s best for the customer when that’s your focus,” Rader says.

“They need to be open to new ways of working, so not just building the same things well but building the right things,” she adds, explaining that CIOs need to learn enough about human-centered design and other CX-related methodologies so they’re able to effectively lead their CX teams.

2. Evolve the culture

“As a CIO my role is to first create an organization and culture that supports and enables customer-centric, design thinking, technically innovative, agile ways of working and secondly to enable and empower our colleagues to be successful in re-imagining the way we work and the way we provide value to our customers, both internally and externally,” says Mark Mintz, senior vice president and CIO of Charles River Laboratories.

“When we are evaluating opportunities for how best to serve our internal and external customers, we continuously seek to understand the core of the opportunity from a customer perspective as well as from a business perspective and strive to re-imagine those experiences based on what we learn,” he explains. “From there we identify our key metrics that we expect to impact through our re-imagination and then we stay focused on regular collaboration with our customers, guided by data and metrics, to ensure we are delivering the highest value experiences that not only are delights for our customers but also creating real business value for them.”

3. Create shared responsibility for CX within the C-suite

Customer experience technologies were among the top IT investments over the past year, according to Foundry’s 2022 State of the CIO Study. But CIOs have to own more of the CX program than technology delivery; they must share ownership for it, along with the marketing, operations, sales, and revenue officers, as well as the chief customer or chief experience officers if those roles exist within the company.

Research has found that companies that deliver great customer experiences make CX a shared responsibility, says Sudhir Rajagopal, research director for Future of Customer Experience at research firm IDC.

“It’s not about any one person or one function,” he says. “It’s one thing to say it, but what it means is that each one of those functions, every one of those leaders within their functions, thinks about how they’re delivering on their operational pieces against the same customer objectives.”

4. Build joint ownership through cross-functional teams

That joint ownership must extend throughout the organization, says Sam Bright, chief product and experience officer at Upwork, a platform serving the freelance work market.

“In order to ensure better product and business outcomes, it’s essential that tech leaders put a structure in place that tightly couples product, experience, and data teams and even combines them at certain junctures where it makes sense. This instills a culture of collaboration across the organization that brings multiple departments — whether directly or indirectly responsible for the customer experience — closer together,” he explains.

Bright says Upwork has focused on creating a team structure that “binds together product and experience.” That includes co-locating these teams as well as promoting transparency and collaboration across the organization.

“With a more unified team, where VPs and product managers of our various products or delivery models sit right alongside our UX function, analytics team as well as customer support and trust experts, we get a true 360-degree view of customer problems we’re trying to solve while creating a real feedback loop that revolves around customer obsession,” Bright says.

5. Align CX strategy to the enterprise strategy

Delivering great customer experiences is a good goal, but the objective should be delivering great experiences in ways that also help the organization achieve its goals, says Thomas Randall, advisory director at Info-Tech Research Group and its SoftwareReviews division.

“IT has to work with the business to ask, ‘What are our goals for customer experience?’ whether that’s upselling or customer retention or customer acquisition,” Randall says, explaining that aligning CX with the corporate strategy enables IT to identify the technologies, features, and functions to pursue and advance.

For example, CIOs whose companies are focused on customer retention will need to prioritize the delivery of technologies that let customers solve problems on their own and, where they can’t, enable customer service reps to do so quickly.

6. Define the problem before defining the solution

“As technologists, it’s easy for us to quickly jump to the addition or purchase of a technology solution without clearly defining the reason and problem it is meant to solve,” says Erin Howard, executive director of product, service, and experience design at Charles River Laboratories.

“Using methodologies found in design-thinking practices can help to define the problem space; journey and process mapping, persona definition, ideation workshops and many other broad thinking tactics can help you and your team spend time in the problem, empathize with the customer and ultimately define and design the solution with a focus on the needs and experience of your customer,” she explains.

That’s critical for determining not just how to build a great customer experience but how to build the right ones.

“Fully understanding the problem, looking at it from different angles, and spending more time in the problem space will lead to better solution evaluation, understanding of needs, and definition of goals before purchasing your next technology solution,” Howard says.

7. Get a handle on the data

Teams looking to improve CX often focus on the experience layer, putting most or all of their efforts into understanding customer journeys and customer personas.

But teams need good data to do all that successfully. “That’s the layer that needs to be fundamentally addressed,” Rajagopal says, pointing out that IT is instrumental in building the infrastructure needed to bring in first-, second-, and third-party data and create a “unified customer data view.”

Justin Skinner, CIO of SmileDirectClub, knows the value of building data sets to leverage for enhanced CX, pointing to the company’s SmileMaker Platform, which uses artificial intelligence to enable customers to see in minutes how the company’s treatment plans could change their smile.

“With this technology, we’re able to utilize data from 1.8 million smiles to create an educational experience for customers that makes it easier for them to get started,” Skinner says. “Analyzing the smiles of our existing users allows new customers to go through the process more quickly, while also enabling us to reach consumers across the globe at scale.”

8. Prioritize and prune

“There are way more ideas and work than time and money,” says Amy Evins, executive vice president and CIO of LPL Financial. That’s why she believes high-performing CX functions have a disciplined approach to prioritizing their work.

To that point, Evins’ team calculates the value of proposals and prioritizes them based on the impact that they are each expected to bring.

“That also empowers the team to say no, so when they push back, they have the data to show why, and then they can prioritize the customer experience products that will drive the most value,” Evins says.

Rader advocates for that approach, too. She also advises CIOs to use a similar approach with existing features and functions by determining which ones don’t deliver value so they can be retired.

“Look at what features are being used and kill those that aren’t. Why spend money maintaining that stuff when you can spend money building new features that customers will actually want?” she asks.
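A toy illustration of this prioritize-and-prune discipline, with an invented scoring formula (expected value divided by effort) and invented data:

```python
# Rank proposals by expected value per unit of effort, and flag barely-used
# existing features for retirement. Scores and data are invented examples.
from dataclasses import dataclass

@dataclass
class Feature:
    name: str
    value: float   # expected business impact (e.g., $K per year)
    effort: float  # estimated cost to build or maintain (e.g., $K)
    usage: float   # fraction of customers using it (0 for new proposals)

portfolio = [
    Feature("self-service returns", value=400, effort=80, usage=0.0),
    Feature("order tracking revamp", value=250, effort=100, usage=0.0),
    Feature("legacy loyalty widget", value=10, effort=30, usage=0.02),
]

# Prioritize: highest value-for-effort first, giving the team data to say no.
for f in sorted(portfolio, key=lambda f: f.value / f.effort, reverse=True):
    print(f"{f.name}: score {f.value / f.effort:.1f}")

# Prune: existing features with negligible usage are retirement candidates.
print("retire:", [f.name for f in portfolio if 0 < f.usage < 0.05])
```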

9. Measure the impact of CX efforts

To prioritize and prune effectively, and to determine which activities are actually improving customer experiences, organizations must have accurate ways to measure the value of their CX work, Rajagopal says.

That means using more than the Net Promoter Score (NPS) and instead using additional customer-centric metrics such as the Customer Effort Score (see the sketch at the end of this section) and measuring whether CX initiatives bring increased consumer spending.

There’s some movement on that front but not a lot. IDC predicts that at least 30% of organizations will introduce new success metrics to track and measure the internal and external flows of customer value creation by 2024. IDC also predicts that one-fourth of global brands will abandon the Customer Satisfaction Score (CSAT) as a measure of customer experience and adopt a Customer Effort Score correlated to outcomes as a key indicator of journey satisfaction and success by 2027.

Rajagopal says CIOs can bring in tools that enable such measurements, adding that CIOs can also deploy AI technologies that analyze text and verbal tone, enabling organizations to gauge customer sentiment during interactions.
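To make the metrics named above concrete, here is a minimal sketch of how the two survey scores are typically computed; the response scales and sample data are illustrative assumptions.

```python
# Net Promoter Score: on a 0-10 scale, % promoters (9-10) minus % detractors (0-6).
def nps(ratings: list[int]) -> float:
    promoters = sum(r >= 9 for r in ratings)
    detractors = sum(r <= 6 for r in ratings)
    return 100 * (promoters - detractors) / len(ratings)

# Customer Effort Score: commonly the mean of 1-7 "how easy was it?" ratings.
def ces(ratings: list[int]) -> float:
    return sum(ratings) / len(ratings)

print(nps([10, 9, 8, 6, 3]))  # 0.0: promoters exactly offset detractors
print(ces([6, 7, 5, 6]))      # 6.0: customers report the interaction was easy
```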

10. Build CX teams and a true CX discipline

Like other CIOs, Mintz lists attention to hiring the right people as one of his top tasks in support of a strong CX function.

“A large focus of my time is spent on attracting, exciting, and retaining world-class talent and giving them the support they need to use the best of their expertise to deliver on our objectives,” he says.

CIOs should also be attentive to how they organize their talent, Rajagopal says, adding that research shows that organizations with the best CX have specialized CX teams with dedicated resources to execute experience transformation. Furthermore, those teams are coupled with the organization’s digital transformation strategy.

“So the customer experience team is looking across all [functions and touchpoints], asking whether they’re addressing customer needs while also addressing some of the operational constraints,” Rajagopal explains.

Such moves, experts add, enable organizations to move past seeing CX as a series of one-off projects to creating a true CX practice that’s capable of continuous improvement and evolution as market demands change.


“Our sustainability goals and key performance indicators are important to us. We’re committed to achieve net-zero carbon emissions in areas we can influence by 2030 with a three-pronged approach that includes avoidance, reduction and compensation.” — Arthur Schneider, Head of Sustainability Management

One of Europe’s largest IT services providers, Bechtle offers a full IT portfolio to more than 70,000 enterprises and public sector customers in 14 countries throughout Europe. Providing hardware, software, and IT solutions and services, the company also delivers an extensive array of cloud offerings from dedicated private clouds to deployments with the world’s largest hyperscalers, including Amazon Web Services, Google Cloud, and Microsoft Azure.

Bechtle also offers the Google Cloud VMware Engine, an offering that empowers enterprises to quickly realize the power of Google Cloud while using the VMware technologies they know and trust. Notably, Bechtle also achieved the VMware Cloud Verified distinction.

We recently connected with Arthur Schneider, Head of Sustainability Management at Bechtle, to learn more about the company’s sustainability initiatives, what it means to be part of the VMware Zero Carbon Committed initiative, and what he sees in his work with customers. We also took the opportunity to learn where he sees the greatest opportunities to make a difference.

“Bechtle is unique for a number of reasons, among them the fact that we offer a unique blend of IT services, from systems integration to running our customers’ full IT infrastructure if desired,” says Schneider. “We offer this through more than 80 facilities we refer to as IT system houses, and also provide IT leaders with IT E-commerce throughout Europe, a one-stop source of the supplies they need, from printer cartridges and laptops to full infrastructure-as-a-service or a custom-designed server farm.”

Schneider notes that because Bechtle’s business extends far beyond the company’s data center operations, its sustainability efforts also encompass a much wider array of business functions. This includes a large effort not only to achieve a core tenet of the VMware Zero Carbon Committed initiative – to power its VMware operations and datacenters with 100% renewable energy by 2030 – but also to radically reduce the carbon footprint of Bechtle’s logistics and fulfillment operations.

“On the most basic level, in our Sustainable Strategy 2030 effort we’ve defined four main areas of action we will focus on, including Ethical Business, Environment, People, and Digital Future,” adds Schneider. “Some of our goals include achieving net-zero status by 2030, extending our portfolio of sustainable products, increasing diversity across management levels, and achieving sustainability goals in our procurement and supply chain operations.”

This multi-faceted approach can be seen in the scale and scope of Bechtle’s sustainability efforts. For example, the company’s headquarters at Neckarsulm has been powered entirely by green energy since 2021, a noteworthy achievement. The facility, which serves more than 2,000 people, also uses geothermal, photovoltaic, and solar thermal systems. Datacenters are also built with sustainability in mind. The company’s Pfalzkom datacenter in Mutterstadt, for example, is entirely powered by green electricity.

Additional investments extend Bechtle’s efforts further. Examples include the purchase of 50 Volkswagen ID.3 all-electric cars and the installation of numerous electric vehicle charging stations across the company’s operations. Bechtle even unveiled the Bechtle box, a special shipping container made out of 80% recyclable materials that can be reused many times and folded for return to Bechtle during subsequent deliveries.

“We are consistently looking at sustainability efforts from a three-pronged approach that includes avoidance, reduction, and compensation and that means looking at our entire operation,” says Schneider. “We’ve defined a sustainable portfolio: sustainable product consulting, green logistics and packaging services, repair services, product end-of-life services, and perhaps most importantly energy efficient cloud hosting.”

Schneider stresses that the cloud by its very nature offers perhaps the greatest potential, both because it consolidates power consumption into far more efficient facilities, and because it empowers others to likewise reduce their carbon footprint, whether it’s enabling remote work or replacing inefficient legacy hardware.

“Enabling enterprises to move from on premises locations to the co-locations that are powered by sustainable sources of course offers the opportunity to dramatically reduce emissions,” he says. “Sustainability has become a responsibility for all of society. Our customers have their own goals and they want to see similar actions on behalf of their vendors. VMware’s Zero Carbon Committed initiative is a great way to help demonstrate and act on that shared vision.”

Learn more about Bechtle and its partnership with VMware here.  


Every enterprise needs a data strategy that clearly defines the technologies, processes, people, and rules needed to safely and securely manage its information assets and practices.

As with just about everything in IT, a data strategy must evolve over time to keep pace with evolving technologies, customers, markets, business needs and practices, regulations, and a virtually endless number of other priorities.

Here’s a quick rundown of seven major trends that will likely reshape your organization’s current data strategy in the days and months ahead.

1. Real-time data gets real — as does the complexity of dealing with it

CIOs should prioritize their investment strategy to cope with the growing volume of complex, real-time data that’s pouring into the enterprise, advises Lan Guan, global data and AI lead at business consulting firm Accenture.

Guan believes that having the ability to harness data is non-negotiable in today’s business environment. “Unique insights derived from an organization’s data constitute a competitive advantage that’s inherent to their business and not easily copied by competitors,” she observes. “Failing to meet these needs means getting left behind and missing out on the many opportunities made possible by advances in data analytics.”

The next step in every organization’s data strategy, Guan says, should be investing in and leveraging artificial intelligence and machine learning to unlock more value out of their data. “Initiatives such as automated predictive maintenance on machinery or workforce optimization through operational data are only a few of the many opportunities enabled by the pairing of a successful data strategy with the impactful deployment of artificial intelligence.”

2. In-house data access demands take center stage

CIOs and data leaders are facing a growing demand for internal data access. “Data is no longer just used by analysts and data scientists,” says Dinesh Nirmal, general manager of AI and automation at IBM Data. “Everyone in their organization — from sales to marketing to HR to operations — needs access to data to make better decisions.”

The downside is that providing easy access to timely, relevant data has become increasingly challenging. “Despite massive investments, the data landscape within enterprises is still overly complex, spread across multiple clouds, applications, locations, environments, and vendors,” Nirmal says.

As a result, a growing number of IT leaders are looking for data strategies that will allow them to manage the massive amounts of disparate data located in silos without introducing new risk and compliance challenges. “While the need for data access internally is rising, [CIOs] also have to keep pace with rapidly evolving regulatory and compliance measures, like the EU Artificial Intelligence Act and the newly released White House Blueprint for an AI Bill of Rights,” Nirmal says.

3. External data sharing gets strategic

Data sharing between business partners is becoming far easier and much more cooperative, observes Mike Bechtel, chief futurist at business advisory firm Deloitte Consulting. “With the meaningful adoption of cloud-native data warehouses and adjacent data insights platforms, we’re starting to see interesting use cases where enterprises are able to braid their data with counterparties’ data to create altogether new, salable, digital assets,” he says.

Bechtel envisions an upcoming sea change in external data sharing. “For years, boardroom and server room folks alike have talked abstractly about the value of having all this data, but the geeks among us have known that the ability to monetize that data required it to be more liquid,” he says. “Organizations may have petabytes of interesting data, but if it’s calcified in an aging on-premises warehouse, you’re not going to be able to do much with it.”

4. Data fabric and data mesh adoption rises

Data fabric and data mesh technologies can help organizations squeeze the maximum value out of all the elements in a technical stack and hierarchy in a practical and usable manner. “Many enterprises still utilize legacy solutions, old and new technologies, inherited policies, processes, procedures, or approaches, but wrestle with having to blend it all within a new architecture that enables more agility and speed,” says Paola Saibene, principal consultant at IT advisory firm Resultant.

Mesh enables an organization to draw the information and insights it needs from the environment in its current state without having to radically change it or massively disrupt it. “This way, CIOs can take advantage of [tools] they already have, but add a layer on top that allows them to make use of all those assets in a modern and fast way,” Saibene explains.

Data fabric is an architecture that enables the end-to-end integration of various data pipelines and cloud environments through the use of intelligent and automated systems. The fabric, especially at the active metadata level, is important, Saibene notes. “Interoperability agents will make it look like everything is incredibly well-connected and has been intentionally architected that way,” she says. “As such, you’re able to gain all the insights you need while avoiding having to overhaul your environment.”

5. Data observability becomes business-critical

Data observability extends the concept of data quality by closely monitoring data as it flows in and out of the applications. The approach provides business-critical insights into application information, schema, metrics, and lineage, says Andy Petrella, founder of data observability provider Kensu and the author of Fundamentals of Data Observability (O’Reilly, 2022).

A key data observability attribute is that it acts on metadata, providing a safe way to monitor data directly within applications: no sensitive data leaves the data pipeline, as the observability agent collects only metadata, Petrella says. “Thanks to this information, data teams can troubleshoot data issues faster and prevent them from propagating, lowering maintenance costs, restoring trust in data, and scaling up value creation from data,” he adds.
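A minimal sketch of that metadata-only idea: a pipeline step is wrapped so an “agent” records schema and simple metrics, never the records themselves. The observe decorator below is a hypothetical illustration, not Kensu’s actual API.

```python
# Hypothetical observability agent: collects metadata about each pipeline step.
import functools
import pandas as pd

collected: list[dict] = []  # stand-in for an observability backend

def observe(step):
    """Wrap a pipeline step so its output's metadata is recorded."""
    @functools.wraps(step)
    def wrapper(df: pd.DataFrame) -> pd.DataFrame:
        out = step(df)
        collected.append({
            "step": step.__name__,
            "schema": list(out.columns),           # structure, not content
            "rows": len(out),                      # volume metric
            "nulls": int(out.isna().sum().sum()),  # simple quality metric
        })
        return out
    return wrapper

@observe
def drop_incomplete(df: pd.DataFrame) -> pd.DataFrame:
    return df.dropna()

drop_incomplete(pd.DataFrame({"id": [1, 2, None], "value": [10, 20, 30]}))
print(collected)  # lineage and metrics were captured; no row data left the app
```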

Data observability creates an entirely new solution category, Petrella claims. “CIOs should first understand the different approaches to observing data and how it differs from quality management,” he notes. They should then identify the stakeholders in their data team, since they will be responsible for adopting observability technology.

An inability to improve data quality will likely hinder data team productivity while decreasing data trust across the entire data chain. “In the long term, this could push data activities into the background, impacting the organization’s competitiveness and ultimately its revenue,” Petrella states.

IT leaders are contending with soaring complexity and unfathomable volumes of data spread across the technology stack, observes Gregg Ostrowski, executive CTO of Cisco AppDynamics. “They’re having to integrate a massively expanding set of cloud-native services with existing on-premise technologies,” he notes. “From a data strategy perspective, the biggest trend is the need for IT teams to get clear visualization and insight in their applications irrespective of domain, whether on-premises, in the cloud or hybrid environments.”

6. ‘Data as a product’ begins delivering business value

Data as a product is a concept that aims to solve real-world business problems through the use of blended data captured from many different sources. “This capture-and-analyze approach provides a new level of intelligence for companies that can result in a real, bottom-line impact,” says Irvin Bishop, Jr., CIO at Black & Veatch, a global engineering, procurement, consulting, and construction company.

Understanding how to harvest and apply data can be a game-changer in many ways, Bishop states. He reports that Black & Veatch is working with clients to develop data product roadmaps and establish relevant KPIs. “One example is how we utilize data within the water industry to better manage the physical health of critical infrastructure,” he notes. “Data gives our water clients the ability to predict when a piece of equipment will likely need to be replaced and what type of environmental impact it can withstand based on past performance data.” Bishop says that the approach gives participating clients more control over service reliability and their budgets.

7. Cross-functional data product teams arise

As organizations begin treating data as a product, it’s becoming necessary to establish product teams that are connected across IT, business, and data science sectors, says Traci Gusher, data and analytics leader at business advisory firm EY Americas.

Data collection and management shouldn’t be classified as just another project, Gusher notes. “Data needs to be viewed as a fully functional business area, no different than HR or finance,” she claims. “The move to a data product approach means your data will be treated just like a physical product would be — developed, marketed, quality controlled, enhanced, and with a clear tracked value.”


Organizations that have embraced a cloud-first model are seeing a myriad of benefits. The elasticity of the cloud allows enterprises to easily scale up and down as needed. And in today’s world of more distributed organizations shaped by Covid-19, many enterprises prefer not to commit to just one cloud service, instead sourcing multiple cloud solutions from a variety of vendors.

The cloud also helps to enhance security, improve insight into data, and aid with disaster recovery and cost savings. Cloud has become a utility for successful businesses. Gartner predicted that around 75% of enterprise customers using cloud infrastructure as a service (IaaS) would adopt a deliberate multi-cloud strategy by 2022, up from 49% in 2017.

“Businesses don’t want to be locked into one particular cloud,” says Tejpal Chadha, Global Head, Digitate SaaS Cloud & Cyber Security. “They want to run their applications on different clouds so they’re not dependent on one in case it were to temporarily shut down. Multi-cloud has really become a must-have for organizations.”

Yet, at the same time, companies that tap into these multi-cloud solutions are opening themselves up to additional, and potentially significant, security risks. They become increasingly vulnerable in an age of more sophisticated, active cyberhackers.

To address security risks, cloud services have their own monitoring processes and tools that are designed to keep data secure. Many offer customers basic monitoring tools for free. But if companies want a particularly robust monitoring service, they often must pay added fees. With multiple clouds, this added expense can be significant.

“The cost goes up when you have to have specific monitoring tools for each cloud,” Chadha says. “Monitoring also needs to be instantaneous or real-time to be effective.”

Organizations using multi-cloud solutions are also susceptible to cloud sprawl, which happens when an organization lacks visibility into or control over its cloud computing resources. The organization therefore ends up with excess, unused servers or paying higher rates than necessary.

For enterprises safeguarding their multi-cloud solutions, a better tactic is to use just one third-party overarching tool for all clouds – one that monitors everything instantaneously. ignio™, the award-winning enterprise automation platform from AIOps vendor Digitate, does just that.

ignio AIOps, Digitate’s flagship product, facilitates autonomous cloud IT operations by tapping into AI and machine learning to provide a closed-loop solution for Azure and AWS, with versions for Google Cloud (GCP) and private clouds also coming soon. With visibility and intelligence across layers of cloud services, ignio AIOps provides multi-cloud support by leveraging cloud-native technologies and APIs. It also provides actionable insights to better manage your cloud technology stack.

ignio is unique in that it cuts across multiple data centers, both private and public clouds, and seamlessly handles everything in a single window. It gets a bird’s eye view of the health of a company’s data centers and clouds. Then, ignio continuously monitors, predicts, and takes corrective action across clouds while also automating previously manual tasks, which ignio calls “closed-loop remediation.” This closed-loop remediation enables companies to automate actions for remediation, compliance, and other essential CloudOps tasks.
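As a toy sketch of that single-window, closed-loop idea: health signals from several clouds flow through one interface, with a single policy triggering automated remediation. The provider names, thresholds, and actions below are invented, not ignio’s implementation.

```python
# Toy closed-loop, multi-cloud monitor; all values and actions are invented.
from dataclasses import dataclass

@dataclass
class HealthCheck:
    provider: str
    resource: str
    cpu_utilization: float  # 0.0-1.0

def collect_checks() -> list[HealthCheck]:
    # In practice these would come from each provider's own monitoring APIs.
    return [
        HealthCheck("aws", "web-fleet", 0.42),
        HealthCheck("azure", "batch-pool", 0.93),
    ]

def remediate(check: HealthCheck) -> None:
    # Stand-in for an automated corrective action, e.g., scaling out.
    print(f"[{check.provider}] {check.resource} at "
          f"{check.cpu_utilization:.0%} CPU: scaling out automatically")

# One policy applied uniformly across clouds, instead of per-cloud tools.
for check in collect_checks():
    if check.cpu_utilization > 0.85:
        remediate(check)
```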

“The ignio AIOps software first comes in and, in the blueprinting process, gives a holistic view of what companies have in their universe,” Chadha says. “We call that blueprinting or discovery. Then, we help automate tasks. We’re completely agnostic when it comes to monitoring or taking corrective action, or helping increase automation across all of these clouds.”

As Digitate ignio customers automate processes and reduce manual IT efforts, they’re finding they’re saving money — some saving millions of dollars a year. For many companies, tasks that once took three days now take only an hour.

“The biggest benefits are that less manual work is ultimately needed, and then there’s also the cost savings,” Chadha says. “Enterprises using this tool are managing their multi-cloud estate much more efficiently.”

To learn more about Digitate ignio and how Digitate’s products can help you thread the multi-cloud needle, visit Digitate.


Developing a strategy for controlling hard-to-predict cloud costs remains difficult, especially when considering the new decentralized model of procurement.

Consider this: “Any person who can commit code to the cloud can commit your organization to spend,” said Jennifer Hays, senior vice president of engineering efficiency and assurance at Fidelity Investments and the FinOps Foundation’s governance board chairperson.

Yet there may be solutions. Hays – along with National Grid CIO Andi Karaboutis, McDermott CIO Vagesh Dave, and Accenture’s Cloud First Global Strategy and Consulting Lead Ashley Skyrme – are among the speakers at CIO’s Future of Cloud Summit, taking place virtually on November 8.

The event will drill into key areas of cloud innovation, challenges, and leadership. It also will feature insights and tactical tips from the winners of IDC’s Future of Digital Infrastructure Awards.

The summit kicks off with an introduction to the new era of digital fluency from MJ Petroni, CEO and “Cyborg Anthropologist” at Causeit. Petroni will move beyond the latest industry buzzwords to explain exactly how advanced technologies can and will affect attendees’ businesses. Petroni will return later in the day for an interactive discussion with attendees to give more context and answer questions live.

One challenge for today’s CIO is to get beyond the cloud hype and determine where it’s useful and where it’s not. Arthur Hu, senior vice president and global CIO of Lenovo, will talk about the company’s strategy on building scalable architecture with resiliency and agility in mind. Hu also will share how Lenovo incubated its own hybrid cloud module and spun it out as a new business unit.

Corralling the complexities of the cloud is another challenge. Stu Kippleman, CIO of Parsons Corporation, will give practical tips for business success.

One way to streamline operations is to migrate to an industry cloud. Discussing the pros and cons of these specialized clouds are Chad Wright, CIO of Boston Dynamics; Kumar Iyer, business technology senior leader at KeyBank; and IDC Research Manager Nadia Ballard.

What does a functioning cloud strategy look like in practice? Jeremy Meller, CIO at Children’s Healthcare of Atlanta, will share how the organization is using cloud to advance data analytics and provide new patient services.

Finally, listen in to conversations with the winners of IDC’s Future of Digital Infrastructure Awards. Mary Johnston Turner, IDC’s research vice president of digital infrastructure, will go inside the strategies and process of the winners in three categories: Autonomous Operations winner WSP, featuring Director of Global Networks Services Richard Evers; Cloud Technology winner Pac-12 Networks, featuring Ryan Currier, senior vice president of engineering and products; and Ubiquitous Deployment winner Utah Division of Technology Services, featuring Chief Operating Officer Daniel Harmuth.

Throughout the summit, sponsors including IBM Cloud and Freshworks will offer thought leadership and solutions on subjects such as creating a modern digital infrastructure.

Check out the full summit agenda here. The event is free to attend for qualified attendees. Don’t miss out – register today.


This article was co-authored by Duke Dyksterhouse, an Associate at Metis Strategy

Data & Analytics is delivering on its promise. Every day, it helps countless organizations do everything from measuring their ESG impact to creating new streams of revenue, and consequently, companies without strong data cultures or concrete plans to build one are feeling the pressure. Some are our clients—and more of them are asking for our help with their data strategy.

Often their ask is a thinly veiled admission of overwhelm. They struggle to even articulate their objective, or don’t know where to start. The variables seem endless: data—security, science, storage, mining, management, definition, deletion, integration, accessibility, architecture, collection, governance, and the ever-elusive, data culture. But for all that technical complexity, their overwhelm is more often a symptom of mindset. They think that when carving out their first formal data strategy, they must have all the answers up front—that all the relevant people, processes, and technologies must be lined up neatly, like dominos. 

We discourage that thinking. Mobilizing data is more like getting a flywheel spinning: it takes tremendous effort to get the wheel moving, but its momentum is largely self-sustaining; and thus, as you incrementally apply force, the wheel spins faster and faster, until fingertip touches are enough to sustain a blistering velocity. As the wheel builds to that speed, the people, processes, and technologies needed to support it make themselves apparent. 

In this article, we offer four things you can do to get your flywheel spinning faster, and examine each through the story of Alina Parast, Chief Information Officer of ChampionX, and how she is helping transform the company (which delivers solutions to the upstream and midstream oil and gas industry) into a data-driven powerhouse. 

Step 1: Choose the right problem 

When ChampionX went public, its cross-functional team (which included supply chain, digital/IT, and commercial experts) avoided or at least tempered any grandiose, buzzword-filled declarations about “transformations” and “data-driven cultures” in favor of real-world problem solving. But also, it didn’t choose just any problem: it chose the right problem—which is the first and most crucial step to getting your flywheel spinning. 

At the time, one of ChampionX’s costliest activities in its Chemical Technologies business was monitoring and maintaining customer sites, many of which were in remote parts of the country. “It was more than just labor and fuel,” Alina explained. “We had to spend a lot on maintaining vehicles capable of navigating the routes to those sites, and on figuring out what, exactly, those routes were. There were, and still are, no Google maps for where our field technicians need to go.” Those costs were the price of “keeping customers’ tanks full, not dry”– one of ChampionX’s guiding principles and the core of its value proposition to improve the lives of its customers. “And so, we wondered, ‘how can we serve that end?’” 

The problem the team chose to solve—lowering the cost of site trips—might appear mundane, but it had all the right ingredients to get the flywheel moving. First, the problem was urgent, as it was among ChampionX’s most significant expenses. Second, the problem was simple (even if its solution was not). It was easy to explain: It costs us a lot to trek to these sites. How can we lower that cost? Third, it was tangible. It concerned real world objects—trucks, wells, equipment, and other things people could see, hear, or feel. Equally important, the team could point to the specific financial line items their efforts would move. Finally, the problem was shared by the enterprise at large. As part of the cross-functional leadership team, Alina didn’t limit herself to solving what were ostensibly CIO-related problems. She understood: if it was a problem she and her organization could help solve, then it was a CIO-related problem.

IT executives talk often of people, processes, and technology as the cornerstones of IT strategy, but they sometimes forget to heed the nucleus of all strategy: solving real business problems. When you’re getting started, set aside your concerns about who you will hire, what tools you will use, and how your people will work together—those things will make themselves apparent in time. First get your leaders in a room. Forego the slides, the spreadsheets, and the roadmaps. Instead, ask, with all sincerity: What problem are we trying to solve? The answer will not come as easily as you expect, but the conversation will be invaluable. 

Step 2: Capture the right data 

Once you’ve identified a problem worthy of solving, the next step is to capture the data you need to solve it. If you’ve defined your problem well, you’ll know what that data is, which is key. Just as defining your problem narrows the variety of data you might capture, figuring out what data you need, where to get it, and how to manage it will narrow the vast catalog of people, processes, and technologies that could compose your data environment. 

Consider how this played out for Alina and ChampionX. Once the team knew the problem—site visits were costly—they quickly identified the logical solution: Reduce the number of required site visits. Most visits were routine, rather than in response to an active problem, so if ChampionX could glean what was happening at the site remotely, they could save considerable time, fuel, and money. That insight told them what data they would need, which in turn allowed ChampionX’s IT and Commercial Digital teams to discern who and what they needed to capture it. They needed IoT sensors, for example, to extract relevant data from the sites. And they needed a place to store that data—they lacked infrastructure that could manage both the terabytes pouring off the sensors and the accompanying customer data (which resided within enterprise platforms such as ERP, transportation, and supply & demand planning). So, they built a data lake.

Each of these initiatives—standing up secure cloud infrastructure, designing the data lake, deploying the sensors and storage, delivering the necessary training—was a major undertaking and continues to evolve. But the ChampionX team not only solved the site-visit problem; they laid the foundation for the company’s data environment and the data-driven initiatives that would follow. The data lake, for example, came to serve as a home for an ever-growing volume and variety of data from ChampionX’s other business units, which in turn led to some valuable insights (more on that in the next section).

Knowing what data to capture provides the context you need to start selecting people, tools, and processes. Whatever you select will lend itself to unpredictable ends, so it’s a taxing and fruitless exercise to try to map every way in which one component of your data environment will tie to all the others—and, from that, to choose a toolkit. Instead, figure out what you need for the problem—and the data—in front of you. Because you’ll be making selections in relation to something real and important in your organization, odds are your selections will end up serving something else real and important. And you’ll be able to specify the names, costs, and sequencing of the things you need—details that will make your data strategy real and get your flywheel spinning faster.

Step 3: Connect dots that once seemed disparate 

As you begin to capture data and your flywheel spins faster, new opportunities and data will reveal themselves. It wasn’t long after ChampionX’s team had installed the IoT sensors to remotely monitor customer sites that they realized the same data could be applied elsewhere. ChampionX now had a wealth of topographical data that no one else had, and it used this data to move both the top and the bottom lines. It moved the bottom line by optimizing the routes ChampionX’s vehicles took to sites—solving the no-Google-Maps-where-we’re-going problem—and it moved the top line by monetizing the data as a new revenue stream.
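Route optimization is a well-studied problem, and the bottom-line half of this win is easy to illustrate. The sketch below applies a simple nearest-neighbor heuristic to site coordinates; a production system would use road and terrain data, and all coordinates here are invented.

```python
# Illustrative route-optimization sketch: a nearest-neighbor heuristic over
# site coordinates. Real routing would use road/terrain data; this only
# shows the shape of the problem. Coordinates are made up.
from math import radians, sin, cos, asin, sqrt


def haversine_km(a: tuple[float, float], b: tuple[float, float]) -> float:
    """Great-circle distance between two (lat, lon) points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))


def nearest_neighbor_route(depot: tuple[float, float],
                           sites: dict[str, tuple[float, float]]) -> list[str]:
    """Greedily visit the closest unvisited site next."""
    route, here, remaining = [], depot, dict(sites)
    while remaining:
        nxt = min(remaining, key=lambda s: haversine_km(here, remaining[s]))
        route.append(nxt)
        here = remaining.pop(nxt)
    return route


sites = {"well-A": (31.9, -102.1), "well-B": (32.2, -101.8), "well-C": (31.7, -102.4)}
print(nearest_neighbor_route((31.85, -102.36), sites))  # e.g. ['well-C', 'well-A', 'well-B']
```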

The data lake, too, took on new purpose. Other business initiatives began parking their data in it, which prompted cross-functional teams to contemplate the various kinds of information swirling around together and how they might amount to more than the sum of their parts. One such kind was customer, order, and supply chain data, which ChampionX regularly had to pull and merge with site data to perform impact analyses—reports of which of its customers were affected by a disruption in the supply chain network, and how. Merging those datasets used to take weeks, largely because the two had always lived in different ecosystems. Now, the same analyses took only hours.
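The speedup is easy to picture once both datasets live in the same environment: an impact analysis reduces to a single join. The sketch below shows the idea with pandas; the column names and the “at risk” threshold are assumptions for illustration, not ChampionX’s actual reports.

```python
# Sketch of the "hours instead of weeks" impact analysis: once site telemetry
# and enterprise data share one lake, the merge is a single join. Column
# names and the disruption filter are illustrative assumptions.
import pandas as pd

# In practice these would be reads from the data lake (e.g., Parquet files)
site_status = pd.DataFrame({
    "site_id": ["well-A", "well-B", "well-C"],
    "tank_level_pct": [12.0, 71.0, 8.5],
})
orders = pd.DataFrame({
    "site_id": ["well-A", "well-B", "well-C"],
    "customer": ["Acme Oil", "Basin Co", "Acme Oil"],
    "open_order_value": [18000, 5000, 22000],
})

impact = (
    site_status.merge(orders, on="site_id")
    .query("tank_level_pct < 15")          # sites at risk of running dry
    .groupby("customer", as_index=False)["open_order_value"].sum()
)
print(impact)  # which customers a supply disruption would hit, and how hard
```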

There are two takeaways here. The first is that it’s okay if your data flywheel spins slowly at the start—just get it going. Attracting even a few new opportunities or types of data will afford you the chance to draw connections between things that once seemed disparate. That pattern recognition will speed up your flywheel at an exponential rate and encourage an appropriately complex data environment to take shape around it. 

The second takeaway echoes those of the first two steps: Choose wisely among the opportunities you could pursue. Not every insight that is interesting is useful; pursue the ones that are most valuable and real, the ones people can see, measure, and feel. These will overlap significantly with the tedious, recurring organizational activities (like pulling together impact reports). If you can solve these problems, you will prove the viability of data as a force for change in your organization, and a richer data culture will begin to emerge, pushing the flywheel to an intimidating pace.

Step 4: Build outward from your original problem 

The story of ChampionX that we’ve examined is only one chapter of a much larger tale. As the company has collected more data and its people have gleaned new insights, the problems that Alina and her business partners take on have grown in scope and complexity, and ChampionX’s flywheel has reached a speed capable of powering data-first problem-solving across the company’s entire supply chain.

Yet, most of the problems in some way trace back to the simple question of how the company might spend less on site-checks. ChampionX’s team has not hopped willy-nilly from problems that concern the supply chain to those that concern Marketing, or HR, or Finance; the team is expanding outward in logical progression from their original problem. And because they have, their people, processes, and technologies, in terms of maturity, are only ever a stone’s throw from being able to tackle the next challenge—which is always built on the one before it. 

As your flywheel spins faster, you will have more problems to choose among. Prioritize those that are not only feasible and valuable but also thematically consistent with the problems you’ve already solved. That way, you’ll be able to leverage the momentum you’ve built. Your data environment will already include many of the people and tools you need for the job. You won’t feel as if you’re starting anew or have to argue a from-scratch case to your stakeholders. 

Building a data strategy is like spinning a flywheel. It’s cyclical, iterative, gradual, perpetual. There is no special line that, if crossed, will deem your organization “data-driven.” Likewise, there is no use in thinking of your data strategy as something binary, as if it were a building under construction that will one day be complete. The best thing you can do is focus on using your data to solve problems that are urgent, simple, tangible, and valuable. Assemble the people, processes, and technologies you need to tackle those problems. Then move on to the next, and then the next, and then the next, allowing the elements of a vibrant data ecosystem to emerge along the way. You cannot will your data strategy into existence; you can only draw it in by focusing on the flywheel. And when it appears, you, and everyone else, will know it.

Analytics, Data Management

Modernizing and future-proofing your analytics

Executive-level commitment to a broad data governance strategy is gaining momentum as organizations seek to balance technology, people, and processes. In a recent Gartner survey, 78% of CFOs said they will increase or maintain enterprise digital investments, and a Gartner forecast puts worldwide IT spending growth at 3% for 2022.

The counterbalance to this positive trend comes from NewVantage Partners’ Data and AI Leadership Executive Survey 2022, in which only 26% of respondents claimed to have reached their data goals. The gap between data winners and stragglers is widening.

Technology balance

One look at the Andreessen Horowitz framework for the modern data infrastructure and you can see that data ecosystem complexity is becoming a nightmare to manage. The need to properly secure this new smorgasbord of data platform choices only adds to the management challenge.

[Figure: the Andreessen Horowitz framework for the modern data infrastructure]

People balance

Until recently, data management and analysis was almost solely an IT function. Today, the business ranks are filled with data stewards, data analysts, and data scientists tasked with building a data security governance platform, while CISOs, CIOs, and CDOs think through compliance requirements and implementation. The expansion of data-related roles has many positives, but it has also meant dwindling IT resources directly dedicated to data consumers, even as IT is tasked with servicing a growing data landscape.

Process balance

On-premises technologies have moved to the cloud, often in an à la carte, buy-as-you-go style, without significant forward-looking strategy. In addition, a stream of new regulations demands new processes to regulate and assure the responsible discovery, access, and use of data. Add to this the federation of our data expertise into the business functions, and organizations now require a scalable approach to data governance processes.

The growing cost of getting it wrong

While many proof points exist for the value of data and its positive impact, the cost of doing nothing, or of getting it wrong, has gone largely unnoticed. Key considerations include:

- The average cost of a security breach in 2022 is around $4.35m, up from $3.8m two years ago (Source: IBM’s Cost of a Data Breach Report 2022).
- Regulatory fines, such as those under GDPR, are becoming real, with companies such as Amazon and WhatsApp receiving multi-hundred-million-dollar penalties.
- Analyst, data engineer, and data scientist productivity remains a major challenge, with practitioners continuing to report that 80% of their time is spent finding, getting access to, and cleaning the right data.
- The intangible cost of delayed business decisions when projects are put on hold or severely delayed.
- Loss of consumer trust once confidence is broken by mishandled data, causing lasting damage to a company’s brand as well as severe financial repercussions.

Modernizing your data security governance thinking

Modernization starts with thinking differently about the approach to people, processes, and technology.

Modernizing data security governance technology: Security and data governance need to exist across every part of the data lifecycle. Maintaining that security posture on a point-by-point basis is simply not viable. What’s required is a broad-based data security platform that provides a centralized data security control plane across your entire hybrid data estate.

Modernizing the roles of your data stakeholders: Key stakeholders have expanded beyond the traditional experts employed by IT; data experts now live in the business. Data scientists embedded in business teams are widely embraced, but business-side data governance stakeholders have yet to receive the formal recognition they deserve, even though the data owners are business people. Formalize security and data governance objectives early, and empower your business data stakeholders to meet those objectives in a scalable, automated manner.

Modernizing your data governance processes: Gartner speaks extensively of the evolution of data governance from dictated (IT command and control) to distributed (everything left to be performed at the edges of the process). Implement a blended model based on federated responsibilities with centralized control and auditability.
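As a rough illustration of that blended model, the sketch below lets each business domain own its policy content (federated responsibility) while every access decision flows through one central, audited function (centralized control). The domains, datasets, roles, and rules are all hypothetical.

```python
# Illustrative sketch of the blended governance model: domain stewards own
# the policy content for their data, while policy evaluation and the audit
# trail run through one central function. All names here are hypothetical.
from datetime import datetime, timezone

# Each business domain registers and maintains its own rules (federated)
POLICIES = {
    "finance": {"invoice_data": {"analysts": "read", "stewards": "write"}},
    "supply_chain": {"site_telemetry": {"analysts": "read"}},
}
AUDIT_LOG: list[dict] = []


def check_access(domain: str, dataset: str, role: str, action: str) -> bool:
    """Central decision point: evaluate the domain's rule, log every call."""
    granted = POLICIES.get(domain, {}).get(dataset, {}).get(role)
    allowed = granted in (action, "write")  # write permission implies read
    AUDIT_LOG.append({
        "ts": datetime.now(timezone.utc).isoformat(),
        "domain": domain, "dataset": dataset,
        "role": role, "action": action, "allowed": allowed,
    })
    return allowed


check_access("finance", "invoice_data", "analysts", "read")   # True
check_access("finance", "invoice_data", "analysts", "write")  # False, and audited
```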

Unified data security governance

AWS, Snowflake, Databricks, Azure, and Google continue to deliver more choices on their ecosystems, which offer more opportunities for your business. But more choices inherently increase the difficulty of enforcing security across this increasingly diverse landscape. The only way to future-proof your analytics, along with your security and privacy posture, is through a unified data security governance approach. Privacera was co-founded by the innovators who led the charge in creating Apache Ranger™, one of the most widely used open source data security and access control frameworks. As the only scalable data policy management solution based on open standards, Privacera offers a proven way to future-proof while preventing vendor lock-in. Read more about the immense benefits of a data security platform based on open standards.
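For a sense of what policy-as-code on open standards looks like, here is a rough sketch that creates an access policy through Apache Ranger’s public REST API. The endpoint and payload follow the general shape of Ranger’s documented v2 API, but the host, credentials, service, and resource names are placeholders, not a definitive implementation.

```python
# Rough sketch of programmatic policy management with Apache Ranger's public
# REST API (the open-source framework mentioned above). Host, credentials,
# service, and resource names are hypothetical placeholders.
import requests

RANGER_URL = "https://ranger.example.com:6182"

policy = {
    "service": "example_hive",           # a Hive service registered in Ranger
    "name": "analysts-read-orders",
    "resources": {
        "database": {"values": ["sales"]},
        "table": {"values": ["orders"]},
        "column": {"values": ["*"]},
    },
    "policyItems": [{
        "groups": ["analysts"],
        "accesses": [{"type": "select", "isAllowed": True}],
    }],
}

resp = requests.post(
    f"{RANGER_URL}/service/public/v2/api/policy",
    json=policy,
    auth=("admin", "change-me"),         # placeholder credentials
    timeout=30,
)
resp.raise_for_status()
print("Created policy id:", resp.json()["id"])  # Ranger echoes the created policy
```

Because the policy is plain JSON against an open framework, the same definition can be versioned, reviewed, and applied across environments, which is the portability argument behind open standards.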

Data and Information Security