Economic instability and uncertainty are the leading causes of technology budget decreases, according to the IDG/Foundry 2022 annual State of the CIO survey. Yet even as budgets tighten, data remains a key factor in a business succeeding – especially during economic uncertainty. According to the Harvard Business Review, data-driven companies have better financial performance, are more likely to survive, and are more innovative.[1]

So how do companies find this balance and create a cost-effective data stack that can deliver real value to their business? A new survey from Databricks, Fivetran, and Foundry of more than 400 senior IT decision-makers in data analytics/AI roles at large global companies finds that 96% of respondents report negative business effects due to integration challenges. However, many IT and business leaders are discovering that modernizing their data stack overcomes those integration hurdles, providing the basis for a unified and cost-effective data architecture.

Building a performant & cost-effective data stack 

The Databricks, Fivetran, and Foundry report points to four investment priorities for data leaders: 

1. Automated data movement. Data pipelines are critical to the modern data infrastructure. They ingest and move data from popular enterprise SaaS applications and operational and analytic workloads to cloud-based destinations such as data lakehouses. As the volume, variety, and velocity of data grow, businesses need fully managed, secure, and scalable data pipelines that can automatically adapt as schemas and APIs change while continuously delivering high-quality, fresh data. Modernizing analytic environments with an automated data movement solution reduces operational risk, ensures high performance, and simplifies ongoing management of data integration. 
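As a rough illustration of the schema-drift problem such pipelines solve, here is a minimal Python sketch; the field names and batches are hypothetical, not any connector's real API. It widens the known schema when a source adds a field and pads older records so downstream loads see a stable column set.

```python
# Hypothetical sketch of schema-drift-tolerant ingestion: field names
# and batches are illustrative, not a real connector's API.

def evolve_schema(schema, batch):
    """Widen the known schema with any new fields seen in this batch."""
    for record in batch:
        for field, value in record.items():
            schema.setdefault(field, type(value).__name__)
    return schema

def normalize(batch, schema):
    """Pad every record so downstream loads see a stable column set."""
    return [{col: rec.get(col) for col in schema} for rec in batch]

schema = {}
day1 = [{"id": 1, "plan": "pro"}]
day2 = [{"id": 2, "plan": "free", "region": "eu"}]  # source added 'region'

schema = evolve_schema(schema, day1)
schema = evolve_schema(schema, day2)
rows = normalize(day1 + day2, schema)
print(sorted(schema))     # ['id', 'plan', 'region']
print(rows[0]["region"])  # None: older rows are padded with nulls
```

A managed pipeline applies the same idea continuously, at scale, and with type reconciliation rather than simple padding.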

2. A single system of insight. A data lakehouse incorporates integration tools that automate ELT, moving data to a central location in near real time. By combining structured and unstructured data and eliminating separate silos, a single system of insight like the data lakehouse enables data teams to handle all data types and workloads. This unified approach dramatically simplifies the data architecture, combining the best features of a data warehouse and a data lake. It enables improved data management, security, and governance in a single data architecture to increase efficiency and innovation. Last, it supports all major data and AI workloads, making data more accessible for decision-making.

A unified data architecture results in a data-driven organization that gains both BI/analytics and AI/ML insights at speeds comparable to those of a data warehouse, an important differentiator for tomorrow’s winning companies. 
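The ELT pattern behind this unification (load raw data into the destination first, then transform it there with SQL) can be sketched minimally as follows. Python's built-in sqlite3 stands in for a lakehouse engine purely for illustration, and the event data is made up.

```python
# Hypothetical sketch of the ELT pattern: land raw data first, then
# transform it inside the destination with SQL. sqlite3 stands in for
# a lakehouse engine purely for illustration; the events are made up.
import sqlite3

raw_events = [
    ("a", "click", 120),
    ("b", "view", 45),
    ("a", "click", 80),
]

db = sqlite3.connect(":memory:")
# Extract + Load: land the records as-is, with no upfront remodeling.
db.execute("CREATE TABLE raw_events (user TEXT, type TEXT, ms INTEGER)")
db.executemany("INSERT INTO raw_events VALUES (?, ?, ?)", raw_events)

# Transform: shape the data after loading, inside the destination.
db.execute("""
    CREATE TABLE clicks_per_user AS
    SELECT user, COUNT(*) AS clicks
    FROM raw_events
    WHERE type = 'click'
    GROUP BY user
""")
print(db.execute("SELECT * FROM clicks_per_user").fetchall())  # [('a', 2)]
```

Because transformation happens after loading, new questions can be answered from the raw records without re-ingesting data, which is the practical advantage ELT holds over transform-before-load ETL.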

3. Designed for AI/ML from the ground up. AI/ML is gaining momentum, as more than 80% of organizations are using or exploring the use of AI to stay competitive. “AI remains a foundational investment in digital transformation projects and programs,” says Carl W. Olofson, research vice president with IDC, who predicts worldwide AI spending will exceed $221B by 2025.[2] Despite that commitment, becoming a data-driven company fueled by BI analytics and AI insights is proving to be beyond the reach of many organizations, which find themselves stymied by integration and complexity challenges. The data lakehouse solves this by providing a single solution for all major data workloads, from streaming analytics to BI, data science, and AI. It empowers data science and machine learning teams to access, prepare, and explore data at scale.

4. Solving the data quality issue. Data quality tools (59%) stand out as the most important technology for modernizing the data stack, according to IT leaders in the survey. Why is data quality so important? Traditionally, business intelligence (BI) systems enabled queries of structured data in data warehouses for insights, while data lakes contained unstructured data retained for AI and machine learning (ML). However, maintaining siloed systems, or attempting to integrate them through complex workarounds, is difficult and costly. In a data lakehouse, metadata layers on top of open file formats improve data quality, while query engine advances boost speed and performance. This serves the needs of both BI analytics and AI/ML workloads, helping ensure the accuracy, reliability, relevance, completeness, and consistency of data. 
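To make the kinds of checks a data quality layer enforces concrete (validity, completeness, consistency), here is a small Python sketch; the column names and rules are hypothetical, not any vendor's API.

```python
# Hypothetical sketch of batch data-quality checks of the kind a
# lakehouse metadata layer might run before committing a write.
# Column names and rules are illustrative only.

RULES = {
    "order_id": lambda v: isinstance(v, int) and v > 0,     # validity
    "amount":   lambda v: isinstance(v, float) and v >= 0,  # validity
    "country":  lambda v: v in {"US", "DE", "JP"},          # consistency
}

def check_batch(rows):
    """Count failures per column; a missing value counts as a failure."""
    failures = {col: 0 for col in RULES}
    for row in rows:
        for col, rule in RULES.items():
            if col not in row or not rule(row[col]):
                failures[col] += 1
    return failures

rows = [
    {"order_id": 1, "amount": 9.99, "country": "US"},
    {"order_id": -3, "amount": 5.00, "country": "DE"},  # invalid id
    {"order_id": 4, "country": "FR"},                   # missing amount
]
print(check_batch(rows))  # {'order_id': 1, 'amount': 1, 'country': 1}
```

In a production lakehouse these rules would be declared once in the metadata layer and enforced on every write, so both BI and ML consumers read from the same validated tables.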

According to the Databricks, Fivetran, and Foundry report, nearly two-thirds of IT leaders are using a data lakehouse, and more than four out of five say they’re likely to consider implementing one. At a moment when cost pressure is calling into question open-ended investments in data warehouses and data lakes, savvy IT leaders are responding by placing a high priority on modernizing their data stack. 

Download the full report to discover exclusive insights from IT leaders into their data pain points, how they plan to address them, and what roles they expect cloud and data lakehouses to play in their data stack modernization.

[1] https://mitsloan.mit.edu/ideas-made-to-matter/why-data-driven-customers-are-future-competitive-strategy

[2] Source: IDC’s Worldwide Artificial Intelligence Spending Guide, Feb V1 2022.

Most organizations understand the profound impact that data is having on modern business. In Foundry’s 2022 Data & Analytics Study, 88% of IT decision-makers agree that data collection and analysis have the potential to fundamentally change their business models over the next three years.

The ability to pivot quickly to address rapidly changing customer or market demands is driving the need for real-time data. But poor data quality, siloed data, entrenched processes, and cultural resistance often present roadblocks to using data to speed up decision making and innovation.

We asked the CIO Experts Network, a community of IT professionals, industry analysts, and other influencers, why real-time data is so important for today’s business and how data helps organizations make better, faster decisions. Based on their responses, here are four recommendations for improving your ability to make data-driven decisions. 

Use real-time data for business agility, efficient operations, and more

Business and IT leaders must keep pace with customer demands while dealing with ever-shifting market forces. Gathering and processing data quickly enables organizations to assess options and take action faster, leading to a variety of benefits, said Elitsa Krumova (@Eli_Krumova), a digital consultant, thought leader and technology influencer.

“The enormous potential of real-time data not only gives businesses agility, increased productivity, optimized decision-making, and valuable insights, but also provides beneficial forecasts, customer insights, potential risks, and opportunities,” said Krumova.

Other experts agree that access to real-time data provides a variety of benefits, including competitive advantage, improved customer experiences, more efficient operations, and confidence amid uncertain market forces:

“Business operations must be able to make adjustments and corrections in near real time to stay ahead of the competition. Few companies have the luxury of waiting days or weeks to analyze data before reacting. Customers have too many options. And in some industries — like healthcare, financial services, and manufacturing — not having real-time data to make rapid critical adjustments can lead to catastrophic outcomes.” — Jack Gold (@jckgld), President and Principal Analyst at J. Gold Associates LLC.

“When insights from the marketplace are not transmitted in real time, the ability to make critical business decisions disappears. We’ve all experienced the pain of what continues to happen with the disconnect between customer usage metrics and gaps in supply chain data.” — Frank Cutitta (@fcutitta), CEO and Founder, HealthTech Decisions Lab

“Operationally, think of logistics. Real-time data provides the most current intelligence to manage the fleet and delivery, for example. Strategically, with meaningful real-time data, systemic issues are easier to identify, portfolio decisions faster to make, and performance easier to evaluate. At the end of the day, it drives better results in safety, customer satisfaction, the bottom line, and ESG [environmental, social, and governance].” — Helen Yu (@YuHelenYu), Founder and CEO, Tigon Advisory Corp.

“Businesses are facing a rapidly evolving set of threats from supply chain constraints, rising fuel costs, and shipping delays. Taking too much time to make a decision based on stale data can increase overall costs due to changes in fuel prices, availability of inventory, and logistics impacting the shipping and delivery of products. Organizations utilizing real-time data are the best positioned to deal with volatile markets.” — Jason James (@itlinchpin), CIO at Net Health

Build a foundation for continuous improvement

The experts offered several practical examples of how real-time data, combined with automation to make that data actionable, can help deliver continuous improvement in a variety of areas across the business.

“In the process of digital transformation, businesses are moving from human-dependent to digital business processes,” said Nikolay Ganyushkin (@nikolaygan), CEO and Co-founder of Acure. “This means that all changes, all transitions, are instantaneous. The control of key parameters and business indicators should also be based on real-time data, otherwise such control will not keep up with the processes.”

Real-time data and automated processes present a powerful combination for improving cybersecurity and resiliency.

“When I was coming up in InfoSec, we could only do vulnerability scanning between midnight and 6 am. We never got good results because systems were either off, or there was just nothing going on at those hours,” said George Gerchow (@georgegerchow), CSO and SVP of IT, Sumo Logic. “Today, we do them at the height of business traffic and can clearly see trends of potential service outages or security incidents.”

Will Kelly (@willkelly), an analyst and writer focused on the cloud and DevOps, said that harnessing real-time data is critical “in a world where delaying business and security decisions can prove even more costly than just a couple of years ago. Tapping into real-time data provides decision-makers with immediate access to actionable intelligence, whether a security alert on an attack in-progress or data on a supply chain issue as it happens.”

Real-time data facilitates timely, relevant, and insightful decisions down to the business unit level, said Gene De Libero (@GeneDeLibero), Chief Strategy Officer at GeekHive.com. Those decisions can have a direct impact on customers. “Companies can uncover and respond to changes in consumer behavior to promote faster and more efficient personalization and customization of customer experiences,” he said.

Deploy an end-to-end approach to storing, accessing, and analyzing data

To access data in real time — and ensure that it provides actionable insights for all stakeholders — organizations should invest in the foundational components that enable more efficient, scalable, and secure data collection, processing, and analysis. These components, including cloud-based databases, data lakes, and data warehouses, artificial intelligence and machine learning (AI/ML) tools, analytics, and internet of things capabilities, must be part of a holistic, end-to-end strategy across the enterprise:

“Real-time data means removing the friction and latency from sourcing data, processing it, and enabling more people to develop smarter insights. Better decisions come from people trusting that the data reflects evolving customer needs and captures an accurate state of operations.” — Isaac Sacolick (@nyike), StarCIO Leader and Author of Digital Trailblazer

“Organizations must use a system that draws information across integrated applications. This is often made simpler if the number of platforms is kept to a minimum. This is the only way to enable a real-time, 360-degree view of everything that is happening across an organization — from customer journeys to the state of finances.” — Sridhar Iyengar (@iSridhar), Managing Director, Zoho Europe

“Streaming processing platforms allow applications to respond to new data events instantaneously. Whether you’re distributing news events, moving just-in-time inventory, or processing clinical test results, the ability to process that data instantly is the power of real-time data.” — Peter B. Nichol (@PeterBNichol), Chief Technology Officer at OROCA Innovations

As your data increases, expand your data-driven capabilities

The volume and types of data organizations collect will continue to increase. Forward-thinking leadership teams will continue to expand their ability to leverage that data in new and different ways to improve business outcomes.

“The power of real-time data is amplified when your organization can enrich data with additional intelligence gathered from the organization,” said Nichol. “Advanced analytics can enhance events with scoring models, expanded business rules, or even new data.”

Nichol offered the example of combining a customer’s call — using an interactive voice response system — with their prior account history to enrich the interaction. “By joining events, we can build intelligent experiences for our customers, all in real time,” he said.

It’s one of the many ways that new technologies are increasing the opportunities to use real-time data to fundamentally change how businesses operate, now and in the future.

“As businesses become increasingly digitalized, the amount of data they have available is only going to increase,” said Iyengar. “We can expect real-time data to have a more significant impact on decision-making processes within leading, forward-thinking organizations as we head deeper into our data-centric future.”

Learn more about ways to put your data to work on the most scalable, trusted, and secure cloud.
