Economic instability and uncertainty are the leading causes of technology budget decreases, according to the IDG/Foundry 2022 annual State of the CIO survey. Yet even as budgets tighten, data remains the key factor in a business's success – especially during economic uncertainty. According to the Harvard Business Review, data-driven companies have better financial performance, are more likely to survive, and are more innovative.[1]

So how do companies find this balance and create a cost-effective data stack that delivers real value to the business? A new survey from Databricks, Fivetran, and Foundry of 400-plus senior IT decision-makers in data analytics/AI roles at large global companies finds that 96% of respondents report negative business effects due to integration challenges. However, many IT and business leaders are discovering that modernizing their data stack overcomes those integration hurdles, providing the basis for a unified and cost-effective data architecture.

Building a performant & cost-effective data stack 

The Databricks, Fivetran, and Foundry report points to four investment priorities for data leaders:

1. Automated data movement. Data pipelines are critical to the modern data infrastructure: they ingest and move data from popular enterprise SaaS applications and from operational and analytic workloads to cloud-based destinations like data lakehouses. As the volume, variety, and velocity of data grow, businesses need fully managed, secure, and scalable data pipelines that can automatically adapt as schemas and APIs change while continuously delivering high-quality, fresh data (a simplified sketch of this kind of schema-drift handling appears after this list). Modernizing analytic environments with an automated data movement solution reduces operational risk, ensures high performance, and simplifies ongoing management of data integration.

2. A single system of insight. A data lakehouse incorporates integration tools that automate ELT, moving data to a central location in near real time. By combining structured and unstructured data and eliminating separate silos, a single system of insight like the data lakehouse enables data teams to handle all data types and workloads (a short query example appears after this list). This unified approach dramatically simplifies the data architecture and combines the best features of a data warehouse and a data lake, improving data management, security, and governance in a single architecture to increase efficiency and innovation. Finally, it supports all major data and AI workloads, making data more accessible for decision-making.

A unified data architecture results in a data-driven organization that gains BI, analytics, and AI/ML insights at speeds comparable to those of a data warehouse, an important differentiator for tomorrow’s winning companies.

3. Designed for AI/ML from the ground up. AI/ML is gaining momentum, as more than 80% of organizations are using or exploring the use of AI to stay competitive. “AI remains a foundational investment in digital transformation projects and programs,” says Carl W. Olofson, research vice president with IDC, who predicts worldwide AI spending will exceed $221B by 2025.[2] Despite that commitment, becoming a data-driven company fueled by BI analytics and AI insights is proving to be beyond the reach of many organizations that find themselves stymied by integration and complexity challenges. The data lakehouse solves this by providing a single solution for all major data workloads, from streaming analytics to BI, data science, and AI, and it empowers data science and machine learning teams to access, prepare, and explore data at scale.

4. Solving the data quality issue. Data quality tools (59%) stand out as the most important technology for modernizing the data stack, according to IT leaders in the survey. Why is data quality so important? Traditionally, business intelligence (BI) systems enabled queries of structured data in data warehouses for insights. Data lakes, meanwhile, contained unstructured data retained for AI and machine learning (ML). Maintaining these siloed systems, or attempting to integrate them through complex workarounds, is difficult and costly. In a data lakehouse, metadata layers on top of open file formats increase data quality, while advances in query engines improve speed and performance. This serves the needs of both BI analytics and AI/ML workloads, helping assure the accuracy, reliability, relevance, completeness, and consistency of data (a simple validation sketch appears after this list).
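
To make the schema-drift handling described in item 1 concrete, here is a minimal Python sketch. The table, column names, and infer_type helper are hypothetical, and a managed pipeline service would perform these steps automatically rather than relying on hand-written code:

# Minimal sketch of schema-drift-aware ingestion (hypothetical names;
# a managed pipeline service performs these steps automatically).
from datetime import datetime, timezone

destination_schema = {"id": "BIGINT", "email": "STRING", "created_at": "TIMESTAMP"}

def infer_type(value):
    # Naive type inference for the sketch; real pipelines use richer typing.
    if isinstance(value, bool):
        return "BOOLEAN"
    if isinstance(value, int):
        return "BIGINT"
    if isinstance(value, float):
        return "DOUBLE"
    return "STRING"

def ingest_batch(records, schema):
    """Evolve the destination schema when the source adds columns, then load."""
    for record in records:
        for column, value in record.items():
            if column not in schema:
                # Schema drift detected: add the new column instead of failing.
                schema[column] = infer_type(value)
                print(f"Added column {column} ({schema[column]})")
    # load_to_lakehouse(records) would append the batch to the destination table.
    print(f"Loaded {len(records)} records at {datetime.now(timezone.utc).isoformat()}")

ingest_batch(
    [{"id": 1, "email": "a@example.com", "plan_tier": "pro"}],  # new 'plan_tier' column
    destination_schema,
)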
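
Items 2 and 3 come down to one governed copy of the data serving every workload. The PySpark sketch below assumes a Spark session with lakehouse tables already registered; the sales.orders and ml.customer_features names are invented for illustration. It builds a BI-style aggregate and an ML feature table from the same source table:

# Sketch of one lakehouse table serving both a BI aggregate and ML feature prep.
# Assumes a lakehouse-enabled Spark session; table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("lakehouse-demo").getOrCreate()

orders = spark.read.table("sales.orders")  # the same governed table for every workload

# BI-style aggregate for a dashboard.
monthly_revenue = (
    orders.groupBy(F.date_trunc("month", "order_ts").alias("month"))
          .agg(F.sum("amount").alias("revenue"))
)
monthly_revenue.show()

# ML-style feature preparation from the same table, with no copy into a separate silo.
features = (
    orders.groupBy("customer_id")
          .agg(F.count("*").alias("order_count"),
               F.avg("amount").alias("avg_order_value"))
)
features.write.mode("overwrite").saveAsTable("ml.customer_features")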
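
For item 4, data quality tooling typically applies declarative rules before data is published to BI and AI/ML consumers. The following pure-Python sketch is illustrative only; dedicated data-quality tools express such rules far more richly and run them at scale:

# Minimal sketch of declarative data-quality checks applied before data is
# published for BI and ML consumers. Rule names and thresholds are illustrative.
rows = [
    {"id": 1, "email": "a@example.com", "amount": 42.0},
    {"id": 2, "email": None, "amount": -5.0},
]

rules = {
    "id_not_null": lambda r: r["id"] is not None,
    "email_present": lambda r: bool(r["email"]),
    "amount_non_negative": lambda r: r["amount"] is not None and r["amount"] >= 0,
}

failures = {name: [] for name in rules}
for row in rows:
    for name, check in rules.items():
        if not check(row):
            failures[name].append(row["id"])

for name, bad_ids in failures.items():
    status = "PASS" if not bad_ids else f"FAIL (rows {bad_ids})"
    print(f"{name}: {status}")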

According to the Databricks, Fivetran, and Foundry report, nearly two-thirds of IT leaders are already using a data lakehouse, and more than four out of five say they’re likely to consider implementing one. At a moment when cost pressure is calling into question open-ended investments in data warehouses and data lakes, savvy IT leaders are responding by placing a high priority on modernizing their data stack.

Download the full report to discover exclusive insights from IT leaders into their data pain points, how they plan to address them, and what roles they expect cloud and data lakehouses to play in their data stack modernization.

[1] https://mitsloan.mit.edu/ideas-made-to-matter/why-data-driven-customers-are-future-competitive-strategy

[2] Source: IDC’s Worldwide Artificial Intelligence Spending Guide, Feb V1 2022.


Stop me if you’ve heard this one before. Several economists, a bank president, and a couple of reporters walk into a bar. The economists lament, “A thick fog of uncertainty still surrounds us.” The bank president wails, “Economic hurricane.” The reporters keen about “gut-churning feelings of helplessness” and “a world of confusion.”

Sitting in a booth with his hard-working direct reports, the chief information officer sighs. “Same-old, same-old. Uncertainty is our jam.”

For as long as there has been an IT organization, CIOs have been charged with “keeping the lights on” (KTLO), delivering five-nines stability (99.999% uptime, or roughly 5.26 minutes of downtime per year), and expanding digital capabilities in a world characterized by massive economic, political, social, and technological uncertainty.

In other words, IT leaders know there is nothing uncertain about uncertainty. After all, uncertainty is the one certainty. We should not run from uncertainty; we should embrace it. Paraphrasing poet Robert Frost, when it comes to uncertainty, there is “no way out but through.”

What really drives uncertainty

One of the smartest guys on this planet, Ed Gough, former chief scientist and CTO at the NATO Undersea Research Center (NURC) and former technical director at the US Navy’s Naval Meteorology and Oceanography Command, explained to me that ignorance is at the root of uncertainty.

As John Kay and Mervyn King set forth in Radical Uncertainty: Decision-Making Beyond the Numbers, “Uncertainty is the result of our incomplete knowledge of the world, or about the connection between our present actions and their future outcomes.”

There will always be uncertainties external to the organization. But the uncertainties that do the most to destroy IT value are the self-inflicted ones.

The No. 1 source of uncertainty in the workplace is the absence of strategy. Business scholars, think tanks, and some members of the media are discovering that many organizations have not explicitly stated where they want to go and how they plan to get there. To wit, two-thirds of enterprises do not have a data strategy.

And among the companies that do have a strategy, just 14% of their employees “have a good understanding of their company’s strategy and direction.”

It all boils down to what Warner Moore, founder and CISO at Columbus, Ohio-based Gamma Force, recently told me: “Uncertainty isn’t the problem; lack of leadership is the problem.”

Focus on what matters most

Business school is where a lot of today’s business leaders learn their trade. And if you examine how business schools approach uncertainty, you can begin to see where this leadership issue takes root.

In business schools around the world, MBAs are counseled to combat uncertainty by compiling a comprehensive list of possible outcomes and then attaching numerical probabilities to each scenario. This approach is untenable: possible outcomes are infinite, and assigning probabilities, which are subject to assumptions and biases, creates a false sense of precision.

One of the guiding principles for those who would master uncertainty is to recognize that there has always been something irresistible about advice in mathematical form. Over-reliance on metrics has given rise to the term “McNamara fallacy,” referring to the tragic missteps associated with the misaligned quantifications used during the Vietnam War.

Instead of flailing around trying to enumerate everything that could happen, executives need to place intense scrutiny on a subset of critical uncertainties. In other words, neglect the right uncertainties.

I spoke with a subset of the 75 senior IT executives attending the Digital Solutions Gallery hosted by The Ohio State University to get their thoughts about which zones of uncertainty they were focusing on. The general consensus was that one of the best places to start managing hyper-uncertainty was talent.

Speaking earlier this year at a “mini-conference,” the Survey of Business Uncertainty: Panel Member Economic Briefing and Policy Discussion, Atlanta Fed President Raphael Bostic told attendees, “Finding workers is a big problem.”

Finding workers might be an uncertain undertaking, but retaining key performers is not. Leaders have it in their power to know what their high performers are thinking, and for these key employees it is possible to paint reasonably clear pictures of what happens next.

Mike McSally, a human capital advisor with 20-plus years of experience in executive recruiting, does not believe recruiting has to be a problem. Reducing talent uncertainty is a simple matter of managing personal networks. McSally suggests having your top ten performers take out a yellow sheet of paper and write down “the top twenty people you have ever worked with.” Then give those people a call.

When you find a qualified candidate, give them an authentic “what-a-day-at-work-really-looks-like” depiction of the role being filled. When that depiction aligns with your strategic vision and your company’s mission, you’ll have a leg up on converting that candidate into a new team member.

That kind of leadership approach will help you handle talent uncertainties, better positioning your organization for the future. I am certain of that.
