By George Trujillo, Principal Data Strategist, DataStax

I recently had a conversation with a senior executive who had just landed at a new organization. He had been trying to gather new data insights but was frustrated at how long it was taking. (Sound familiar?) After he walked his executive team through the data hops, flows, integrations, and processing across different ingestion software, databases, and analytical platforms, they were shocked by the complexity of their current data architecture and technology stack. It was obvious that things had to change for the organization to execute at speed in real time.

Data is a key component of making accurate and timely recommendations and decisions in real time, particularly when organizations try to implement real-time artificial intelligence. Real-time AI involves processing data to make decisions within a given time frame; that window can be minutes, seconds, or milliseconds, based on the use case. Real-time AI brings together streaming data and machine learning algorithms to make fast and automated decisions; examples include recommendations, fraud detection, security monitoring, and chatbots.

A whole lot has to happen behind the scenes to succeed and get tangible business results. The underpinning architecture needs to include event-streaming technology, high-performing databases, and machine learning feature stores. All of this needs to work cohesively in a real-time ecosystem and support the speed and scale necessary to realize the business benefits of real-time AI.

It isn’t easy. Most current data architectures were designed for batch processing with analytics and machine learning models running on data warehouses and data lakes. Real-time AI requires a different mindset, different processes, and faster execution speed. In this article, I’ll share insights on aligning vision and leadership, as well as reducing complexity to make data actionable for delivering real-time AI solutions.

A real-time AI north star

More than once, I’ve seen senior executives completely aligned on mission while their teams fight subtle yet intense wars of attrition across different technologies, siloes, and beliefs on how to execute the vision.

A clear vision for executing a real-time AI strategy is a critical step to align executives and line-of-business leaders on how real-time AI will increase business value for the organization.

The execution plan must come from a shared, transparent vision that defines methodologies, technology stacks, scope, processes, cross-functional impacts, resources, and measurements in sufficient detail that cross-functional teams have enough direction to collaborate and achieve operational goals.

Machine learning models (algorithms that comb through data to recognize patterns or make decisions) rely on the quality and reliability of data created and maintained by application developers, data engineers, SREs, and data stewards. How well these teams work together will determine the speed at which they deliver real-time AI solutions. As real-time becomes pervasive across the organization, several questions begin to arise:

● How are cross-functional teams enabled to support the speed of change, agility, and quality of data for real-time AI, as ML models evolve?
● What level of alerting, observability, and profiling can be counted on to ensure trust in the data by the business?
● How do analysts and data scientists find, access, and understand the context around real-time data?
● How is data, process, and model drift managed for reliability? Downstream teams can create strategy drift without a clearly defined and managed execution strategy; is the strategy staying consistent, evolving, or beginning to drift?
● Real-time AI is a science project until benefits to the business are realized. What metrics are used to understand the business impact of real-time AI?

As scope increases, so does the need for broad alignment

The growth of real-time AI in the organization impacts execution strategy. New projects or initiatives, like adding intelligent devices for operational efficiency, improving real-time product recommendations, or opening new business models for real-time, tend to be executed at an organization’s edge—by specialized experts, evangelists, and other individuals who innovate.

The edge is away from the business center of gravity—away from entrenched interests, vested political capital, and the traditional way of thinking.

The edge has less inertia, so it’s easier to facilitate innovation, new ways of thinking, and approaches that are novel compared to an organization’s traditional lines of business, institutional thinking, and existing infrastructure. Business transformation occurs when innovation at the edge can move into the center lines of business such as operations, e-commerce, customer service, marketing, human resources, inventory, and shipping/receiving.

A real-time AI initiative is a science project until it demonstrates business value. Tangible business benefits such as increased revenue, reduced costs, improved operational efficiency, and better decisioning must be shared with the business.

Expanding AI from the edge into the core business units requires continuous effort in risk and change management, demonstrating value and strategy, and strengthening the culture around data and real-time AI. One should not move AI deeper into the core of an organization without metrics and results that demonstrate business value that has been achieved through AI at the current level. Business outcomes are the currency for AI to grow in an organization.

A real-time data platform

The current state of most data ecosystems looks very different from the real-time data stack necessary to drive real-time AI success.


Leaders face challenges in executing a unified and shared vision across these environments.  Real-time data doesn’t exist in silos; it flows in two directions across a data ecosystem. The data used to train ML models may exist in memory caches, the operational data store, or in the analytic databases. Data must get back to the source to provide instructions to devices or to provide recommendations to a mobile app. A unified data ecosystem enables this in real time.


Within the real-time data ecosystem, the heart of real-time decisioning is made up of the real-time streaming data, the ML feature store, and the ML feature engine. Reducing complexity here is critical.


Data for real-time decisioning flows in both directions across data sources, streaming data, databases, analytic data platforms, and the cloud. Machine learning features contain the data used to train machine learning models and serve as inference data when the models run in production. Models that make decisions in real time require an ecosystem that supports the speed and agility to update existing models and put new models into production across all of these data dimensions.


A real-time data ecosystem includes two core components: the data ingestion platform that receives real-time messages and event streams, and the operational data store that integrates and persists the real-time events, operational data, and the machine learning feature data.  These two foundational cores need to be aligned for agility across the edge, on-premises, hybrid cloud, and multi-vendor clouds. 
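To make those two cores concrete, here is a minimal, illustrative Python sketch (not DataStax code); the event fields, in-memory store, and decision rule are hypothetical stand-ins for a real ingestion platform and operational data store, and the point is simply that events flow in, get persisted, and a decision flows back toward the source.

```python
# Minimal sketch of the two cores of a real-time data ecosystem.
# The queue stands in for the data ingestion platform and the dict
# stands in for the operational data store; all names are hypothetical.
from queue import Queue

event_stream = Queue()   # ingestion platform stand-in
operational_store = {}   # operational data store stand-in


def handle_event(event: dict) -> dict:
    """Persist the event, then send a decision back toward the source."""
    device_id = event["device_id"]
    # Persist the raw event for ML features, reporting, and model training.
    operational_store.setdefault(device_id, []).append(event)
    # The decision flows in the other direction (e.g., to a device or mobile app).
    action = "recommend" if event["score"] > 0.8 else "ignore"
    return {"device_id": device_id, "action": action}


# Example: one event arriving from the edge.
event_stream.put({"device_id": "sensor-42", "score": 0.93})
print(handle_event(event_stream.get()))
```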

Complexity from disparate data platforms will not support the speed and agility that real-time AI demands of data. Changing criteria, new data, and evolving customer conditions can cause machine learning models to get out of date quickly. The data pipeline flows across memory caches, dashboards, event streams, databases, and analytical platforms that have to be updated, changed, or infused with new data criteria. Complexity across the data ecosystem slows how quickly those updates can be made accurately.

A unified, multi-purpose data ingestion platform and operational data store greatly reduce the number of technology languages teams must speak and the complexity of working with real-time data flows across the ecosystem. A unified stack also improves the ability to scale real-time AI across an organization. As mentioned earlier, reducing complexity also improves the cohesiveness of the different teams supporting the real-time data ecosystem.

New real-time AI initiatives need to look at the right data technology stack through the lens of what it takes to support evolving machine learning models running in real time. This doesn’t necessarily require ripping and replacing existing systems. Minimize disruption by running new data through an updated, agile, real-time data ecosystem and gradually migrating from existing data platforms to the real-time AI stack as needed.

Wrapping up

Moving real-time AI from the edge of innovation to the center of the business will be one of the biggest challenges for organizations in 2023. A shared vision driven by leadership and a unified real-time data stack are key factors for enabling innovation with real-time AI. Growing a community around innovation with real-time AI makes the whole stronger than the parts, and is the only way that AI can bring tangible business results.

Learn how DataStax enables real-time AI.

About George Trujillo:

George is principal data strategist at DataStax. Previously, he built high-performance teams for data-value driven initiatives at organizations including Charles Schwab, Overstock, and VMware. George works with CDOs and data executives on the continual evolution of real-time data strategies for their enterprise data ecosystem. 

Artificial Intelligence, IT Leadership, Machine Learning

By Chet Kapoor, Chairman & CEO of DataStax

Every business needs an artificial intelligence strategy, and the market has been validating this for years. Gartner® predicts that “By 2027, over 90% of new software applications that are developed in the business will contain ML models or services, as enterprises utilize the massive amounts of data available to the business.”[1] And with the rise of tools like ChatGPT, more organizations than ever are thinking about how AI and ML can transform their business.

Still, most companies have not yet benefited from real-time AI. They fail because data is served too slowly in complicated environments, making real-time actions almost impossible. AI cannot work with the wrong data, at the wrong time, delivered by the wrong infrastructure.

So, how do leading enterprises use AI to drive business outcomes? And why should you care about real-time AI? Let’s dive in.

Winning with AI: It starts with data

A successful AI strategy starts with data. More specifically, you need real-time data at scale. Leaders like Google, Netflix, and Uber have already mastered this. Their ML models are embedded in their applications and use the same real-time data. They aggregate events and actions in real-time through streaming services, and expose this data to ML models. And they build it all on a database that can store massive volumes of event data.

Ultimately, it’s about acting on your data in the moment and serving millions of customers in real-time. Think about these examples:

● Netflix tracks every user’s actions to refine its recommendation engine, then it uses this data to propose the content you will love most
● Uber gathers driver, rider, and partner data to update a prediction engine that informs customers about wait times, or suggests routes to drivers
● FedEx aggregates billions of package events to optimize operations and share visibility with its customers on delivery status

How DataStax helps: A new class of apps

We have been working on unlocking real-time data for a long time at DataStax. We started with Apache Cassandra® 12 years ago, serving the largest datasets in the world. Then we made it a database-as-a-service with Astra DB and added Astra Streaming to make real-time data a reality.

Now, we have another exciting piece of the puzzle: Kaskada, a machine-learning company that recently joined forces with DataStax. Their feature engine helps customers get more value from real-time data. By adding Kaskada’s technology, we’ll be able to provide a simple, end-to-end stack that brings ML to data—not the other way around.

This unlocks a whole new class of applications that can deliver the instantaneous, personalized experiences customers demand – all in one unified open-source stack. Take the conversational AI company Uniphore, for example. Uniphore has an AI assistant that does sentiment analysis on sales calls. It helps sellers build better customer relationships and loyalty. Without the ability to process data in real-time, their solution would not be possible. Uniphore relies on DataStax to power its AI experience – with speed, scale, and affordability.

The future is bright

We believe every company should be able to deploy real-time AI at 3X the scale and half the cost. Our new mandate is clear: Real-time AI for everyone. We have the right data, at the right time, and the right infrastructure.

Now, it’s about executing the vision with our customers, communities, and partners. I’m super excited about making this a reality.

Click here to learn more about the power of real-time AI.

[1] Gartner, “A Mandate for MLOps, ModelOps and DevOps Coordination,” Van Baker, Nov. 22, 2022

GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and is used herein with permission.

About Chet Kapoor:

Chet is Chairman and CEO of DataStax. He is a proven leader and innovator in the tech industry with more than 20 years in leadership at innovative software and cloud companies, including Google, IBM, BEA Systems, WebMethods, and NeXT. As Chairman and CEO of Apigee, he led company-wide initiatives to build Apigee into a leading technology provider for digital business. Google (Apigee) is the cross-cloud API management platform that operates in a multi- and hybrid-cloud world. Chet successfully took Apigee public before the company was acquired by Google in 2016. Chet earned his B.S. in engineering from Arizona State University.

Artificial Intelligence, IT Leadership

By Thomas Been, DataStax

Building data-driven, high-growth businesses takes a certain kind of roll-up-your-sleeves, determined, and smart builder who understands the importance of building a unified, foundational data architecture.

We call these people Digital Champions. They’re visionaries in using real-time data and the cloud to deliver unprecedented value to their organizations and, in turn, to their customers.

Last year, we set out to identify enterprises and builders that build powerful, real-time applications that define the future of data. They’ve planned and prepared to scale to whatever demand their growing businesses and customers place upon them. Since we named our first batch of Digital Champions in November, the program has gained some significant momentum.

Our latest group of Digital Champions—SupPlant, K2View, and Ibexa—all hail from Europe and all have taken on big challenges to lead their organizations to success with real-time data.

SupPlant

Revital Kremer is greedy for data. To the chief technology officer of SupPlant, which supplies farmers with critical agronomic insights, the more real-time weather, plant, soil, and water data the company’s solution can gather, the bigger the improvements in crop yield and productivity for customers.

However, as Kremer puts it, collecting data is hard—especially when it involves the variety of sensors and broad range of locales where SupPlant’s customers work. It’s a good thing she’s passionate about what she does and draws energy from the ability of technology to solve the world’s problems. Kremer has worked in a variety of sectors, from gaming to defense, but, as she put it recently, “The agtech domain, and SupPlant in particular, is the first time I’ve built technology that really makes a difference and can help farmers overcome water scarcity and climate changes.”

Kremer embodies the Digital Champion ethos, so we’re proud to honor her here.

Learn more about why SupPlant is a Digital Champion here.

K2View

The team at K2View isn’t afraid of challenges (the company’s name, after all, is a nod to K2, the world’s second-tallest and notoriously difficult-to-climb mountain). The company enables large enterprises—like AT&T, American Express, and Hertz—to gain 360-degree views of their customers to predict churn, fraud, and more—all in real-time.

Throughout K2View chief technology officer Yuval Perlov’s career climb through the telecom, e-commerce, and financial sectors, there’s been at least one consistent challenge: data. So it’s no surprise that he chose Apache Cassandra to provide the extremely high throughput and rock-solid reliability required to build a real-time data platform for K2View’s customers.

For the trust they’ve built in bringing real-time data to big enterprises, K2View is an obvious choice to earn the title of Digital Champion. 

Learn more about why K2View is a Digital Champion here.

Ibexa

Nazariy Kostiv, senior Java engineer at Norway’s Ibexa, doesn’t have time to think about infrastructure. Recently named the leader of the company’s Java team, Kostiv is intently focused on developing innovations for Ibexa’s Digital Experience Platform, which enables customers to build customized B2B e-commerce experiences.

Building real-time recommendation engines and other applications at scale is how Kostiv and his team make an impact—not managing database resources. It’s one reason Ibexa chose DataStax Astra DB. (“We don’t lose sleep over concerns of node failure or other issues,” as Kostiv puts it).

From how they built new user experiences for a major movie theater chain in Colombia to their work modernizing outdated websites for the French government, Kostiv and the Ibexa team exemplify the data-driven, roll-up-your-sleeves approach of the Digital Champion.

Learn more about why Ibexa is a Digital Champion here.

Does your organization fit the Digital Champion mold? Contact me to nominate a Digital Champion today!

About Thomas Been:


Thomas leads marketing at DataStax. He has helped businesses transform with data and technology for more than 20 years, in sales and marketing leadership roles at Sybase (acquired by SAP), TIBCO, and Druva.

Digital Transformation, IT Leadership

By Bryan Kirschner, Vice President, Strategy at DataStax

Imagine getting a recommendation for the perfect “rainy Sunday playlist” midway through your third Zoom meeting on Monday.

Or receiving a text about a like-for-like substitute for a product that was out of stock at your preferred e-commerce site 10 minutes after you’d already paid a premium for it on another.

Or arriving late for lunch with a long-time friend and being notified that “to have arrived earlier, you should have avoided the freeway.”

We all expect apps to be both “smart” and “fast.” We can probably all call to mind some that do both so well that they delight us. We can also probably agree that failures like those above are a recipe for brand damage and customer frustration—if not white-hot rage.

We’re at a critical juncture for how every organization calibrates its definition of “fast” and “smart” when it comes to apps—which brings significant implications for its technology architecture.

It’s now critical to ensure that all of an enterprise’s real-time apps will be artificial-intelligence capable, while every AI app is capable of real-time learning.

“Fast enough” isn’t any more

First: Meeting customer expectations for what “fast enough” means has already become table stakes. By 2018, for example, the BBC knew that for every additional second a web page took to load, 10% of users would leave—and the media company was already building technical strategy and implementation accordingly. Today, Google considers load time such an important part of a positive experience that it factors it into search rankings—making “the speed you need” a moving target that’s as much up to competitors as not.

The bar will keep rising, and your organization needs to embrace that.

Dumb apps = broken apps

Second: AI has gotten real, and we’re in the thick of competition to deploy use cases that create leverage or drive growth. Today’s winning chatbots satisfy customers. Today’s winning recommendation systems deliver revenue uplift. The steady march toward every app doing some data-driven work on behalf of the customer in the very moment that it matters most—whether that’s a spot-on “next best action” recommendation or a delivery time guarantee—isn’t going to stop.

Your organization needs to embrace the idea that a “dumb app” is synonymous with a “broken app.”

We can already see this pattern emerging: In a 2022 survey of more than 500 US organizations, 96% of those who currently have AI or ML in wide deployment expect all or most of their applications to be real-time within three years.

Beyond the batch job

The third point is less obvious—but no less important. There’s a key difference between applications that serve “smarts” in real time and those capable of “getting smarter” in real time. The former rely on batch processing to train machine learning models and generate features (measurable properties of a phenomenon). These apps accept some temporal gap between what’s happening in the moment and the data driving an app’s AI.

If you’re predicting the future position of tectonic plates or glaciers, a gap of even a few months might not matter. But what if you are predicting “time to curb?”

Uber doesn’t rely solely on what old data predicts traffic “ought to be” when you order a ride: it processes real-time traffic data to deliver bang-on promises you can count on. Netflix uses session data to customize the artwork you see in real time.

When the bits and atoms that drive your business are moving quickly, going beyond the batch job to make applications smarter becomes critical. And this is why yesterday’s AI and ML architectures won’t be fit for purpose tomorrow: The inevitable trend is for more things to move more quickly.

Instacart offers an example: the scope and scale of e-commerce and the digital interconnectedness of supply chains are creating a world in which predictions about item availability based on historical data can be unreliable. Today, Instacart apps can get smarter about real-time availability using a unique data asset: the previous 15 minutes of shopper activity.
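As a rough illustration of this idea (and not Instacart’s actual pipeline), the sketch below computes an availability feature from only the last 15 minutes of hypothetical shopper events; the item names and fallback value are invented.

```python
# Illustrative only: a feature derived from the last 15 minutes of events,
# in the spirit of the example above. Events older than the window are dropped.
import time
from collections import deque

WINDOW_SECONDS = 15 * 60
recent_events = deque()  # (timestamp, item_id, was_found) tuples


def record_shopper_event(item_id: str, was_found: bool) -> None:
    recent_events.append((time.time(), item_id, was_found))


def availability_feature(item_id: str) -> float:
    """Share of the last 15 minutes of shopper events that found the item in stock."""
    cutoff = time.time() - WINDOW_SECONDS
    while recent_events and recent_events[0][0] < cutoff:
        recent_events.popleft()  # expire events outside the window
    hits = [found for _, item, found in recent_events if item == item_id]
    return sum(hits) / len(hits) if hits else 0.5  # neutral prior when no signal


record_shopper_event("oat-milk-1l", True)
record_shopper_event("oat-milk-1l", False)
print(availability_feature("oat-milk-1l"))  # 0.5 from one hit and one miss
```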

‘I just wish this AI was a little dumber,’ said no one

Your organization needs to embrace the opportunity to bring true real-time AI to real-time applications.

Amazon founder Jeff Bezos famously said, “I very frequently get the question: ‘What’s going to change in the next 10 years?’ … I almost never get the question: ‘What’s not going to change in the next 10 years?’ And I submit to you that that second question is actually the more important of the two—because you can build a business strategy around the things that are stable in time.”

This sounds like a simple principle, but many companies fail to execute on it.

He articulated a clear north star: “It’s impossible to imagine a future 10 years from now where a customer comes up and says, ‘Jeff, I love Amazon; I just wish the prices were a little higher.’ ‘I love Amazon; I just wish you’d deliver a little more slowly.’ Impossible.”

What we know today is that it’s impossible to imagine a future a decade from now where any customer says, “I just wish the app was a little slower,” “I just wish the AI was a little dumber,” or “I just wish its data was a little staler.”

The tools to build for that future are ready and waiting for those with the conviction to act on this.

Learn how DataStax enables real-time AI.

About Bryan Kirschner:

Bryan is Vice President, Strategy at DataStax. For more than 20 years he has helped large organizations build and execute strategy when they are seeking new ways forward and a future materially different from their past. He specializes in removing fear, uncertainty, and doubt from strategic decision-making through empirical data and market sensing.

Artificial Intelligence, IT Leadership

By George Trujillo, Principal Data Strategist, DataStax

Increased operational efficiencies at airports. Instant reactions to fraudulent activities at banks. Improved recommendations for online transactions. Better patient care at hospitals. Investments in artificial intelligence are helping businesses to reduce costs, better serve customers, and gain competitive advantage in rapidly evolving markets. Titanium Intelligent Solutions, a global SaaS IoT organization, even saved one customer over 15% in energy costs across 50 distribution centers, thanks in large part to AI.  

To succeed with real-time AI, data ecosystems need to excel at handling fast-moving streams of events, operational data, and machine learning models to leverage insights and automate decision-making. Here, I’ll focus on why these three elements and capabilities are fundamental building blocks of a data ecosystem that can support real-time AI.


Real-time data and decisioning

First, a few quick definitions. Real-time data involves a continuous flow of data in motion. It’s streaming data that’s collected, processed, and analyzed on a continuous basis. Streaming data technologies unlock the ability to capture insights and take instant action on data that’s flowing into your organization; they’re a building block for developing applications that can respond in real-time to user actions, security threats, or other events. AI is the perception, synthesis, and inference of information by machines, to accomplish tasks that historically have required human intelligence. Finally, machine learning is essentially the use and development of computer systems that learn and adapt without following explicit instructions; it uses models (algorithms) to identify patterns, learn from the data, and then make data-based decisions.

Real-time decisioning can occur in minutes, seconds, milliseconds, or microseconds, depending on the use case. With real-time AI, organizations aim to provide valuable insights during the moment of urgency; it’s about making instantaneous, business-driven decisions. What kinds of decisions need to be made in real time? Here are some examples:

Fraud: It’s critical to identify bad actors using high-quality AI models and data.

Product recommendations: It’s important to stay competitive in today’s ever-expanding online ecosystem with excellent product recommendations and aggressive, responsive pricing against competitors. Ever wonder why an internet search for a product reveals similar prices across competitors, or why surge pricing occurs?

Supply chain: With companies trying to stay lean with just-in-time practices, it’s important to understand real-time market conditions, delays in transportation, and raw supply delays, and adjust for them as the conditions are unfolding.

Demand for real-time AI is accelerating

Software applications enable businesses to fuel their processes and revolutionize the customer experience. Now, with the rise of AI, this power is becoming even more evident. AI technology can autonomously drive cars, fly aircraft, create personalized conversations, and transform the customer and business experience into a real-time affair. ChatGPT and Stable Diffusion are two popular examples of how AI is becoming increasingly mainstream. 

With organizations looking for increasingly sophisticated ways to employ AI capabilities, data becomes the foundational energy source for such technology. There are plenty of examples of devices and applications that drive exponential growth with streaming data and real-time AI:  

● Intelligent devices, sensors, and beacons are used by hospitals, airports, and buildings, or even worn by individuals. Devices like these are becoming ubiquitous and generate data 24/7. This has also accelerated the execution of edge computing solutions so compute and real-time decisioning can be closer to where the data is generated.
● AI continues to transform customer engagements and interactions with chatbots that use predictive analytics for real-time conversations. Augmented or virtual reality, gaming, and the combination of gamification with social media leverage AI for personalization and enhancing online dynamics.
● Cloud-native apps, microservices, and mobile apps drive revenue with their real-time customer interactions.

It’s clear how these real-time data sources generate data streams that need new data and ML models for accurate decisions. Data quality is crucial for real-time actions because decisions often can’t be taken back. Determining whether to close a valve at a power plant, offer a coupon to 10 million customers, or send a medical alert has to be dependable and on time. The need for real-time AI has never been more urgent or necessary.

Lessons not learned from the past

Over the past decade, organizations have put a tremendous amount of energy and effort into becoming data-driven, yet many still struggle to achieve the ROI from data that they’ve sought. A 2023 New Vantage Partners/Wavestone executive survey highlights how being data-driven is not getting any easier, as many blue-chip companies still struggle to maximize ROI from their plunge into data and analytics and to embrace a real data-driven culture:

● 19.3% report they have established a data culture
● 26.5% report they have a data-driven organization
● 39.7% report they are managing data as a business asset
● 47.4% report they are competing on data and analytics

Outdated mindsets, institutional thinking, disparate siloed ecosystems, applying old methods to new approaches, and a general lack of a holistic vision will continue to impact success and hamper real change. 

Organizations have balanced competing needs to make more efficient data-driven decisions and to build the technical infrastructure to support that goal. While big data technologies like Hadoop were used to get large volumes of data into low-cost storage quickly, these efforts often lacked the appropriate data modeling, architecture, governance, and speed needed for real-time success.

This resulted in complex ETL (extract, transform, and load) processes and difficult-to-manage datasets. Many companies today struggle with legacy software applications and complex environments, which leads to difficulty in integrating new data elements or services. To truly become data- and AI-driven, organizations must invest in data and model governance, discovery, observability, and profiling while also recognizing the need for self-reflection on their progress towards these goals.

Achieving agility at scale with Kubernetes

As organizations move into the real-time AI era, there is a critical need for agility at scale. AI needs to be incorporated into their systems quickly and seamlessly to provide real-time responses and decisions that meet customer needs. This can only be achieved if the underlying data infrastructure is unified, robust, and efficient. A complex and siloed data ecosystem is a barrier to delivering on customer demands, as it prevents the speedy development of machine learning models with accurate, trustworthy data.

Kubernetes is a container orchestration system that automates the management, scaling, and deployment of microservices. It’s also used to deploy machine learning models, data streaming platforms, and databases. A cloud-native approach with Kubernetes and containers brings scalability and speed with increased reliability to data and AI the same way it does for microservices. Real-time needs a tool and an approach to support scaling requirements and adjustments; Kubernetes is that tool and cloud-native is the approach. Kubernetes can align a real-time AI execution strategy for microservices, data, and machine learning models, as it adds dynamic scaling to all of these things. 

Kubernetes is a key tool to help do away with the siloed mindset. That’s not to say it’ll be easy. Kubernetes has its own complexities, and creating a unified approach across different teams and business units is even more difficult. However, a data execution strategy has to evolve for real-time AI to scale with speed. Kubernetes, containers, and a cloud-native approach will help. (Learn more about moving to cloud-native applications and data with Kubernetes in this blog post.)
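As a small example of the dynamic scaling described above, the following sketch uses the official Kubernetes Python client to resize a hypothetical model-serving deployment; it assumes the kubernetes package is installed, a kubeconfig is available, and a deployment with this (made-up) name already exists in the cluster.

```python
# A minimal sketch of dynamic scaling with the official Kubernetes Python client.
# The deployment name and namespace are hypothetical.
from kubernetes import client, config


def scale_model_serving(replicas: int) -> None:
    config.load_kube_config()  # use load_incluster_config() when running inside a pod
    apps = client.AppsV1Api()
    # Scale a hypothetical model-serving deployment, e.g. ahead of a traffic spike.
    apps.patch_namespaced_deployment_scale(
        name="fraud-scoring-model",
        namespace="realtime-ai",
        body={"spec": {"replicas": replicas}},
    )


if __name__ == "__main__":
    scale_model_serving(6)
```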

Unifying your organization’s real-time data and AI strategies

Data, when gathered and analyzed properly, provides the inputs necessary for functional ML models. An ML model is an application created to find patterns and make decisions from the datasets it accesses; it contains the mathematical ML algorithms that do that work. Once ML models are trained and deployed, they help to more effectively guide decisions and actions that make the most of the data input. So it’s critical that organizations understand the importance of weaving together data and ML processes in order to make meaningful progress toward leveraging the power of data and AI in real time. From architectures and databases to feature stores and feature engineering, a myriad of variables must work in sync for this to be accomplished.

ML models need to be built, trained, and then deployed in real time. Flexible, easy-to-work-with data models are the oil that keeps the model-building engine running smoothly. ML models require data for developing and testing the model, and for inference when the models are put into production (ML inference is the process of a model making calculations or decisions on live data).

Data for ML is made up of individual variables called features. The features can be raw data, or data that has been processed, analyzed, or derived. ML model development is about finding the right features for the algorithms. The ML workflow for creating these features is referred to as feature engineering, and the storage for these features is referred to as a feature store. Data and ML model development fundamentally depend on one another.
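The sketch below illustrates that relationship in a deliberately simplified form, using an in-memory dictionary as a stand-in for a feature store; the customer IDs, event fields, and feature names are hypothetical, and the same stored values would serve both training and inference.

```python
# Simplified feature engineering with an in-memory stand-in for a feature store,
# keyed by entity ID. All names and fields are hypothetical.
from datetime import datetime, timezone

feature_store: dict[str, dict] = {}


def engineer_features(customer_id: str, raw_events: list[dict]) -> dict:
    """Derive features from raw events and write them to the feature store."""
    amounts = [e["amount"] for e in raw_events]
    features = {
        "txn_count_24h": len(raw_events),
        "avg_amount_24h": sum(amounts) / len(amounts) if amounts else 0.0,
        "max_amount_24h": max(amounts, default=0.0),
        "updated_at": datetime.now(timezone.utc).isoformat(),
    }
    feature_store[customer_id] = features  # written once, read for training and inference
    return features


def get_features(customer_id: str) -> dict:
    """Low-latency read path used at inference time."""
    return feature_store.get(customer_id, {})


engineer_features("cust-123", [{"amount": 40.0}, {"amount": 125.0}])
print(get_features("cust-123"))
```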

That’s why it is essential for leadership to build a clear vision of the impact of data-and-AI alignment—one that can be understood by executives, lines of business, and technical teams alike. Doing so sets up an organization for success, creating a unified vision that serves as a foundation for turning the promise of real-time AI into reality.

A real-time AI data ingestion platform and operational data store

Real-time data and supporting machine learning models are about data flows and machine-learning-process flows. Machine learning models require quality data for model development and for decisioning when the machine learning models are put in production. Real-time AI needs the following from a data ecosystem:

● A real-time data ingestion platform for messaging, publish/subscribe (“pub/sub” asynchronous messaging services), and event streaming
● A real-time operational data store for persisting data and ML model features
● An aligned data ingestion platform for data in motion and an operational data store working together to reduce the data complexity of ML model development
● Change data capture (CDC) that can send high-velocity database events back into the real-time data stream, analytics platforms, or other destinations
● An enterprise data ecosystem architected to optimize data flowing in both directions


Let’s start with the real-time operational data store, as this is the central data engine for building ML models. A modern real-time operational data store excels at integrating data from multiple sources for operational reporting, real-time data processing, and support for machine learning model development and inference from event streams. Working with the real-time data and the features in one centralized database environment accelerates machine learning model execution.

Data that takes multiple hops through databases, data warehouses, and transformations moves too slowly for most real-time use cases. A modern real-time operational data store (Apache Cassandra® is a great example of a database used for real-time AI by the likes of Apple, Netflix, and FedEx) makes it easier to integrate data from real-time streams and CDC pipelines.

Apache Pulsar is an all-in-one messaging and streaming platform, designed as a cloud-native solution and a first-class citizen of Kubernetes. DataStax Astra DB, my employer’s database-as-a-service built on Cassandra, runs natively in Kubernetes. Astra Streaming is a cloud-native managed real-time data ingestion platform that completes the ecosystem with Astra DB. These stateful data solutions bring alignment to applications, data, and AI.

The operational data store needs a real-time data ingestion platform with the same type of integration capabilities, one that can ingest and integrate data from streaming events. The streaming platform and data store will be constantly challenged with new and growing data streams and use cases, so they need to be scalable and work well together. This reduces the complexity for developers, data engineers, SREs, and data scientists to build and update data models and ML models.  
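As a hedged sketch of how the two components pair up, the example below consumes events with the open-source pulsar-client package and persists them with the open-source Cassandra Python driver. The topic, subscription, keyspace, and table names are hypothetical, and the table is assumed to already exist; a production pipeline would add batching, schema management, and error handling.

```python
# Sketch: ingestion platform (Pulsar) feeding an operational data store (Cassandra).
# Assumes a local Pulsar broker, a local Cassandra node, and an existing
# realtime.order_events table; all names here are hypothetical.
import json

import pulsar
from cassandra.cluster import Cluster

pulsar_client = pulsar.Client("pulsar://localhost:6650")
consumer = pulsar_client.subscribe("order-events", subscription_name="feature-writer")

session = Cluster(["127.0.0.1"]).connect("realtime")
insert = session.prepare(
    "INSERT INTO order_events (customer_id, event_time, amount) VALUES (?, ?, ?)"
)

try:
    while True:
        msg = consumer.receive()                 # event arrives on the stream
        event = json.loads(msg.data())
        session.execute(                         # persist for features and reporting
            insert, (event["customer_id"], event["event_time"], event["amount"])
        )
        consumer.acknowledge(msg)                # acknowledge only once it is persisted
finally:
    pulsar_client.close()
```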

A real-time AI ecosystem checklist

Despite all the effort that organizations put into being data-driven, the New Vantage Partners survey mentioned above highlights that organizations still struggle with data. Understanding the capabilities and characteristics for real-time AI is an important first step toward designing a data ecosystem that’s agile and scalable.  Here is a set of criteria to start with:

● A holistic strategic vision for data and AI that unifies an organization
● A cloud-native approach designed for scale and speed across all components
● A data strategy to reduce complexity and break down silos
● A data ingestion platform and operational data store designed for real-time
● Flexibility and agility across on-premises, hybrid-cloud, and cloud environments
● Manageable unit costs for ecosystem growth

Wrapping up

Real-time AI is about making data actionable with speed and accuracy. Most organizations’ data ecosystems, processes and capabilities are not prepared to build and update ML models at the speed required by the business for real-time data. Applying a cloud-native approach to applications, data, and AI improves scalability, speed, reliability, and portability across deployments. Every machine learning model is underpinned by data. 

A powerful data store, along with enterprise streaming capabilities, turns a traditional ML workflow (train, validate, predict, retrain …) into one that is real-time and dynamic, where the model augments and tunes itself on the fly with the latest real-time data.
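One simplified way to picture that shift (illustrative only, not a specific DataStax workflow) is incremental model updates, for example with scikit-learn’s partial_fit, so a model keeps learning from labeled real-time events without a full retrain; the feature values and labels below are invented.

```python
# Illustrative only: turning batch train/validate/predict into an incremental loop.
# A production system would add validation, drift checks, and model versioning.
import numpy as np
from sklearn.linear_model import SGDClassifier

model = SGDClassifier()

# Initial batch training (the traditional train/validate step). Made-up features:
# [transaction amount, is_known_device] with 1 = fraud, 0 = legitimate.
X_hist = np.array([[25.0, 1.0], [900.0, 0.0], [30.0, 1.0], [1200.0, 0.0]])
y_hist = np.array([0, 1, 0, 1])
model.partial_fit(X_hist, y_hist, classes=[0, 1])

# As labeled real-time events arrive, the model keeps learning without a full retrain.
for features, label in [([40.0, 1.0], 0), ([1500.0, 0.0], 1)]:
    x = np.array([features])
    print("prediction:", model.predict(x)[0])  # score the live event first
    model.partial_fit(x, [label])              # then fold the new example in
```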

Success requires defining a vision and execution strategy that delivers speed and scale across developers, data engineers, SREs, DBAs, and data scientists. It takes a new mindset and an understanding that all the data and ML components in a real-time data ecosystem have to work together for success. 

Special thanks to Eric Hare at DataStax, Robert Chong at Employers Group, and Steven Jones of VMware for their contributions to this article.

Learn how DataStax enables real-time AI.

About George Trujillo:

George is principal data strategist at DataStax. Previously, he built high-performance teams for data-value driven initiatives at organizations including Charles Schwab, Overstock, and VMware. George works with CDOs and data executives on the continual evolution of real-time data strategies for their enterprise data ecosystem. 

Artificial Intelligence, IT Leadership

By Bryan Kirschner, Vice President, Strategy at DataStax

In their 2020 book Competing in the Age of AI, Harvard Business School professors Marco Iansiti and Karim Lakhani make some bold predictions about the winning enterprises of the future.

These organizations, which they refer to as “AI factories,” build a “virtuous cycle between user engagement, data collection, algorithm design, prediction, and improvement,” unlocking new paths to growth as software moves to the core of the enterprise.

A little more than two years after the publication of their seminal work, data gathered from IT leaders and practitioners lend a lot of credence to Iansiti and Lakhani’s hypotheses — particularly those regarding the kind of technology architectures and strategies that engender success with AI.

The AI factory

Successful AI companies — think Apple, Netflix, Google, Uber, or FedEx — build innovative applications and, as they scale, start the flywheel of data, growth, and improvement spinning by gathering ever-growing amounts of real-time data, accessing it instantly, and tuning their predictions.

User experiences become more personal and intuitive; key decisions can be made nearly instantaneously; and predictions can occur in real-time, empowering a business to improve outcomes in the moment.

This unlocks new paths to growth: in the authors’ words, as AI factories “accumulate data by increasing scale (or even scope), the algorithms get better and the business creates greater value, something that enables more usage and thus the generation of even more data.”

For more traditional firms to achieve this kind of success requires a host of changes in both their operating models and technology profiles.

Open-source software and AI success

The State of the Data Race 2022 report is based on a survey of over 500 IT leaders and practitioners that delved into their organizations’ data strategies.

For the purpose of this analysis, responses were divided into three groups:

● those where both AI and ML are already in widespread deployment
● those where AI and ML are at most in the pilot phase or early days
● those in between these two extremes, characterized as being in “limited deployment”

The study assumed the organizations with AI/ML widely in production provide useful information about the evolving shape of the “AI factory” and looked for differences across the three stages of maturity.

Iansiti and Lakhani wrote that AI factories will evolve “from a focus on proprietary technologies and software to an emphasis on shared development and open source” because the competitive advantage they enjoy comes from data they accumulate — not the software they develop in-house.

The survey data backs this up in spades. A strong majority of each of the three AI/ML groups considers open-source software (OSS) at least “somewhat” important to their organization (73%, 96%, and 97%, respectively, ordered from “early days” to “wide deployment”).

But ratings of “very” important closely track AI/ML maturity: 84% of companies with AI/ML in wide deployment describe OSS this way (22% of “early days” organizations do, and this jumps to 46% of those with AI/ML in limited deployment).

Perhaps even more striking, organizations not using OSS are a tiny minority (1%, 1%, and 7%, ordered from “wide deployment” to “early days”). But a majority of those with AI/ML in wide deployment (55%) join companies like The Home Depot in having a company-wide mandate for use of OSS.

Real-time data and AI

Consider the AI leaders mentioned above. These companies have assembled technology infrastructures that enable instantaneous changes and decisions based on real-time feedback. Relying on day-old data and batch processing to update the routing of a package to ensure on-time delivery just doesn’t cut it at FedEx.

So, it isn’t surprising that Iansiti and Lakhani report that AI factories lean into real time. “The top enterprises … develop tailored customer experiences, mitigate the risk of customer churn, anticipate equipment failure, and enable all kinds of process decisions in real time,” they say.

Much like with OSS, findings from The State of the Data Race point to real-time data (and the technology architecture that enables it) as a matter of core strategy for the AI leaders. The substantial use of this correlates with AI maturity: 81% of companies that have broadly deployed AI/ML say real-time data is a core strategy. Forty-eight percent of organizations with limited AI/ML deployment describe it as a core strategy; the figure was 32% for companies in the early stages of AI/ML.

But among the advanced group, a full 61% say that leveraging real-time data is a strategic focus across their organization (four times that of organizations in the early days, and more than twice that of those with limited deployment). And 96% of today’s AI/ML leaders expect all or most of their apps to be real time within three years.

This makes sense: as an enterprise intentionally rewires its operations to make the most of AI/ML, it becomes especially important to eliminate any arbitrary architectural barriers to new use cases that require “speed at scale” anywhere in the business.

Today’s OSS as-a-service ecosystem makes that possible for everyone, freeing the future organization to make the most of its unique customer interactions and datasets.

Uniphore: A case study in real-time data, AI, and OSS

Uniphore helps its enterprise customers cultivate more fruitful relationships with their customers by applying AI to sales and customer service communications. The company relies on real-time data to quickly analyze and provide feedback to salespeople upon thousands of customer reactions during video calls.

“We have about fourteen different AI models we run in real time to coalesce the data into something meaningful for our clients,” says Saurabh Saxena, Uniphore’s head of technology and VP of engineering. “Any kind of latency is going to have a negative effect on the real time side.”

“Without the ability to process data in real-time, our solution really wouldn’t be possible,” he adds.

To get “the speed they need,” Uniphore relies on open-source Apache Cassandra® delivered as a service via DataStax (my employer) Astra DB. Its performance and reliability are key to ensuring Uniphore’s system is something every salesperson is motivated to rely on in order to be more effective in the moment.

But winning adoption among line staff points to another of Iansiti and Lakhani’s insights on the implications of AI for senior management. As the latter explained in a 2021 interview, “AI is good at predictions” — and predictions are “the guts of an organization.” Senior executives need to constantly ask, “Do I have data now to improve my prediction power — my accuracy, my speed?”

As Uniphore points out, sales forecast accuracy is something most sales leaders are concerned about. As a knock-on effect of using Uniphore’s tools, quantitative data on sentiment and engagement can flow into sales forecasts without the need for more staff time. In addition to the direct uplift that sellers experience, forecasts improve, freeing management to spend their time on more important things, like investing for growth, with greater confidence.

This closes the loop on Iansiti and Lakhani’s insight that AI factories can unlock a more powerful operating model over and above the benefits of individual use cases and point solutions.

Building an AI factory

Organizations that leaned into the insights in Competing in the Age of AI may have stolen a march on their competition. Judging from our survey data, they’ve been amply rewarded for doing so. The good news is that they’ve proven best practices for success — and the tools you need to accelerate your own progress on the journey to becoming an “AI factory” are ready and waiting.

Learn how DataStax enables AI-powered apps

About Bryan Kirschner:

Bryan is Vice President, Strategy at DataStax. For more than 20 years he has helped large organizations build and execute strategy when they are seeking new ways forward and a future materially different from their past. He specializes in removing fear, uncertainty, and doubt from strategic decision-making through empirical data and market sensing.

Artificial Intelligence, IT Leadership

Imagine an airport that uses computer vision to track errant luggage in real time, or a commercial kitchen able to detect refrigeration conditions and prevent spoilage. Imagine an amusement park outfitting its rides with sensors that can talk directly to operations for upgraded safety and better guest experiences. Imagine a factory or a chain of retailers reducing energy and cutting equipment downtime. 

These scenarios are not imaginary. They are playing out across industries with the help of edge computing, Internet of Things (IoT) devices and an innovative approach known as Business Outcomes-as-a-Service.[1]

In each case, the company has volumes of streaming data and needs a way to quickly analyze it for outcomes such as greater asset availability, improved site safety and enhanced sustainability. In each case, they are taking strategic advantage of data generated at the edge, using artificial intelligence and cloud architecture. And they’re achieving significant wins.[2]

Here, we explore the demands and opportunities of edge computing and how an approach to Business Outcomes-as-a-Service can provide end-to-end analytics with lowered operational risk.

From the Edge to the Cloud and Back

Computing at the edge and the far edge allows data to be processed near the point where it’s generated. The speed and volume of data flowing, often in real time, from sensors and other IoT devices come with the potential for enormous gains in business and operational intelligence. But this advancement also adds complexity.

Most organizations still need methods for analyzing data at the point where it’s created so it can be acted upon immediately. Some have managed to derive meaningful, rapid, and repeatable business outcomes from their IoT data streams and analytics using Business Outcomes-as-a-Service (Atos BOaaS), developed by Atos, an international leader in digital transformation. Already, Atos customers have reported positive experiences.

“For a retail customer, we’re talking about 66,000 hours saved in maintenance and compliance for maintaining the edge environment, which translates into about 480 metric tons of CO2 saved every year — thanks to automation and end-to-end monitoring,” said Arnaud Langer, Global Edge and IoT Senior Product Director at Atos.

Four Key Benefits of an End-to-End Analytics Service

As many tech and industry leaders are noting,[3] businesses are now prioritizing value and speed to deployment. Outcome-based solutions delivered in an as-a-service model allow companies to realize this rapid time-to-value. 

Those using a turnkey, scalable BOaaS platform are quickly able to manage an entire AI and IoT ecosystem from one dashboard, across the cloud, edge and far edge.[4] The solution allows them to generate value from real-time data using a platform for ingesting, storing and analyzing continuously streaming data. It’s bringing advanced analytics and AI capabilities where they’re needed most – the edge. Already deployed in commercial kitchens and retail chains, on factory floors and at amusement parks, the solution has shown the following benefits.

● Value: Increased uptime of critical assets with predictive maintenance
● Sustainability: Reduced onsite support and carbon footprint with touchless operations
● Safety: AI-enhanced computer vision for safer, efficient operations
● Cost-effectiveness: Full operational expense (OpEx) as-a-service pricing and operational model

For a manufacturer or retailer, for instance, an equipment or IT interruption would typically impact employees, customers, and revenue due to the traditionally painful restoration process. But the BOaaS monitoring system reduces downtime by detecting issues before the equipment fails, so remediation can happen while it is still running and before any downtime is experienced, often remotely. If an immediate remedy is not possible, the system will alert staff, then procure and ship a replacement part to arrive on site. Employees can securely connect to the platform and deploy the applications they need via the cloud, minimizing impact to business operations.
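The general pattern, sketched below purely for illustration (this is not Atos’s implementation, and the thresholds and readings are invented), is to evaluate streaming sensor readings against warning and failure thresholds so remediation can start before downtime occurs.

```python
# Illustrative predictive-maintenance pattern only; thresholds and readings are made up.
WARN_TEMP_C = 8.0    # refrigeration case drifting warm
FAIL_TEMP_C = 12.0   # spoilage risk / imminent failure


def evaluate_reading(unit_id: str, temp_c: float) -> str:
    if temp_c >= FAIL_TEMP_C:
        return f"{unit_id}: FAIL - alert staff and dispatch a replacement part"
    if temp_c >= WARN_TEMP_C:
        return f"{unit_id}: WARN - schedule remote remediation before failure"
    return f"{unit_id}: OK"


for reading in [("fridge-07", 4.2), ("fridge-07", 8.9), ("fridge-07", 12.4)]:
    print(evaluate_reading(*reading))
```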

Across industries, data streams often surpass the ability to capture and analyze information. By tapping into hundreds of endpoints and millions of data points that were previously underutilized, the Atos system enables real-time innovations such as AI-based predictive maintenance and computer vision that monitor all hardware, lowering support costs, decreasing IT complexity, and driving decarbonization.

The Technology Behind Business Outcomes

It was a tall order for Atos: Harness the power of data by bringing together hardware, software, and AI in one OpEx solution. 

To most effectively develop BOaaS as a touchless, end-to-end managed service, Atos leveraged the compute and storage power of Dell Technologies. Atos chose Dell Technologies’ Streaming Data Platform[5] for its ability to deliver reliable, fast and secure data pipelines from edge to cloud.

“Using Dell Technologies solutions, we’ve already achieved a 10% reduction in downtime. This can save up to millions of dollars annually,” Langer said. “In the future, we expect to triple that to 30% lower downtime, saving untold millions per customer, per location.”

Watch this video to learn more about how Atos and Dell are enabling game-changing innovation at the edge. 

[1] https://atos.net/en/2022/press-release_2022_05_04/atos-launches-innovative-edge-to-cloud-5g-and-ai-enabled-solution

[2] https://atos.net/en/portfolio/accelerate-decisions-with-data-for-better-business-outcomes

[3] https://www.enterprisetimes.co.uk/2022/12/30/business-and-technology-trends-for-professional-services-in-2023/

[4] https://www.engineering.com/story/bring-ai-and-automation-to-the-far-edge-of-the-factory-floor

[5] https://www.dell.com/en-us/dt/storage/streaming-data-platform.htm

IT Leadership

Since the pandemic began, 60 million people in Southeast Asia have become digital consumers. The staggering opportunities Asia’s burgeoning digital economy presents are reason enough to spur you into rethinking the way you do business.

This means one thing: digital transformation. Cloud adoption empowers organisations to adapt quickly to sudden market disruptions. Back when the pandemic was at its peak, hybrid work and enterprise mobile apps ensured critical operations were able to maintain business-as-usual despite lockdowns and border closures. Today, they are empowering an increasingly mobile workforce to stay productive—on their terms.

Facilitating this transformation saw organisations dismantling legacy infrastructures and adopting decentralised networks, cloud-based services, and the widespread use of employees’ personal devices.

But with this new cloud-enabled environment of mobile devices and apps, remote workspaces, and edge-computing components came substantial information gaps. Ask yourself if you have complete visibility of all your IT assets; there’s a good chance you’d answer no. This shouldn’t come as a surprise, as 94% of organisations find 20% or more of their endpoints undiscovered and therefore unprotected.

Why you can’t ignore your undiscovered (and unprotected) endpoints

The rapid proliferation of endpoints, which increases the complexity of today’s IT environments and introduces a broader attack surface for cyber criminals to exploit, only serves to underscore the importance of knowing all your endpoints. Here’s what will happen if you don’t.

Exposure to security risk. You need to keep your doors and windows locked if you want to secure your home. But what if you don’t know how many you have or where they are located? It’s the same with endpoints: you can’t protect what you can’t see. Knowing your endpoints and getting real-time updates on their status will go a long way to proactively keeping cyber threats at bay and responding to an incident rapidly—and at scale.

Poor decision-making. Access to real-time data relies on instantaneous communication with all your IT assets, the data from which enable your teams to make better-informed decisions. Yet current endpoint practices work with data collected at an earlier point in time. What this means is that by the time your team utilises the data, it’s already outdated. This, in turn, renders the insights they derived inaccurate, and in some instances, unusable.

Inefficient operations. Despite IT assets being constantly added to or decommissioned from the environment due to workforce shifts and new requirements, many enterprises still track their inventory manually with Excel spreadsheets. You can imagine their struggle to get a complete and accurate inventory of every single asset and the resulting guessing games IT teams need to play to figure out what to manage and patch without that inventory.

Getting a better handle on ever-present security threats 

Having a bird’s-eye view of your endpoints requires you to have the right tools to manage them, no matter the size or complexity of your digital environment. These should help regain real-time visibility and complete control by:

- Identifying unknown endpoints that are yet to be discovered, evaluated, and monitored
- Finding issues by comparing the installations and versions of your software on each endpoint against defined software bundles and updates (a minimal sketch of this kind of check follows this list)
- Standardising your environment by instantly applying updates to out-of-date installations and installing software missing from endpoints that require it
- Enabling automation of software management to further reduce reliance on IT teams by governing end-user self-service
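To make the second point concrete, here is a minimal, vendor-neutral sketch of a baseline check: it compares a hypothetical software inventory for each endpoint against a defined bundle of required packages and minimum versions, and flags anything missing or out of date. The data structures and names are illustrative assumptions, not any particular product’s API.

```python
# Minimal sketch: flag endpoints whose installed software deviates from a
# defined baseline bundle. The inventory and baseline shapes below are
# illustrative assumptions, not a real endpoint-management API.

# Baseline bundle: package name -> minimum required version.
BASELINE = {"openssl": (3, 0, 8), "edr-agent": (7, 2, 0), "browser": (120, 0, 0)}

# Example inventory: endpoint id -> {package: installed version}.
INVENTORY = {
    "laptop-001": {"openssl": (3, 0, 8), "edr-agent": (7, 1, 4), "browser": (121, 0, 0)},
    "laptop-002": {"openssl": (1, 1, 1), "browser": (120, 0, 0)},  # no edr-agent at all
}

def audit(inventory, baseline):
    """Return, per endpoint, the packages that are missing or below the baseline."""
    findings = {}
    for endpoint, installed in inventory.items():
        issues = []
        for package, required in baseline.items():
            version = installed.get(package)
            if version is None:
                issues.append(f"{package}: missing")
            elif version < required:  # tuple comparison: (7, 1, 4) < (7, 2, 0)
                issues.append(f"{package}: {version} below required {required}")
        if issues:
            findings[endpoint] = issues
    return findings

if __name__ == "__main__":
    for endpoint, issues in audit(INVENTORY, BASELINE).items():
        print(endpoint, "->", "; ".join(issues))
```

In practice an endpoint-management platform runs this kind of comparison continuously and at scale, but the logic is the same: a trustworthy inventory plus a defined baseline is what turns “undiscovered and unprotected” into something you can actually remediate.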

You only stand to gain when you truly understand the importance of real-time visibility and complete control over your endpoints—and commit to it. In the case of insurer Zurich, having a high-resolution view over an environment with over 100,000 endpoints worldwide meant greater cyber resilience, savings of up to 100 resource hours a month, and deeper collaboration between cybersecurity and operations.

Secure your business with real-time visibility and complete control over your endpoints. Learn how with Tanium.

Endpoint Protection

By Bryan Kirschner, Vice President, Strategy at DataStax

Not too long ago, many CIOs might have had to wrestle with a chief financial officer who viewed enterprise IT as a utility. To use a household metaphor, a CFO might be sorely tempted to save money if budgets were tight by lowering the thermostat on cold days.

Now technology has become such an indispensable tool to drive revenue or remove costs that CIOs are just as likely to be asked to do more in the face of a downturn rather than less.

In one recent large-scale survey, 83% of respondents said their companies were concerned about a recession in 2023 – but 90% indicated IT spending would either increase (51%) or stay the same.

More tellingly, organizations that were more concerned about the effects of a possible recession were more likely to be planning bigger IT spending than those who were less concerned, the report said. “Just 30% of companies with ‘no plans’ to make major preparations for a recession reported that they were getting ready to hike IT spending, in contrast to solid majorities – 68% and 55% – for companies who were already making recession plans or planned to in the near future, respectively,” the survey said.

Some of the drivers are expanding proven use cases, such as personalization. Target’s internal platform, for example, optimizes advertising placements on Target.com to deliver a more personalized guest experience; it drove more than $1 billion in value in 2021 and, with continued investment, is expected to double that in a few years.

A bigger slice of a shrinking pie

In-the-moment experiences like personalization are a proven way to drive revenue. The recent State of the Data Race 2022 report found that 71% of respondents could tie revenue growth directly to real-time data.

But technology investments that delight customers can also be a way to win a “bigger slice” when macro trends are “shrinking the pie.” Given rising US interest rates, “As the size of the mortgage industry shrinks … now is the time to ramp up tech spending to create a better customer experience and gain market share,” according to Brian Woodring, CIO of Detroit-based Rocket Mortgage.

And robotic process automation (RPA) offers broadly relevant potential to attack rote work and waste. According to one survey, organizations that moved beyond piloting automation achieved an average cost reduction of 32% in their targeted areas.

Multi-dimensional challenges

Any organization can benefit from building a strong cultural focus on taking advantage of best-of-breed technologies and on rallying teams to pivot and double down on taking share, bolstering revenue, or protecting margins as the need arises.

But it’s worth pointing out that there’s a broader strategic dimension to this, too. Sometimes “tough operating conditions” aren’t simply a matter of, say, a temporary pullback in consumer spending. Challenges can be multi-dimensional and can have lasting impacts. The COVID-19 pandemic, for example, simultaneously created a shortage of pharmacists while increasing demand for revenue-generating services such as administering vaccines and tests.

Under these radically changed circumstances, Walgreens executives asked a pointed question: “We looked at our system and said, ‘Why are we filling prescriptions the way we did in 1995?’”

Now a growing network of automated, centralized drug-filling centers (complete with rows of robot arms) sort and bottle pills. According to Walgreens, this cuts pharmacist workloads by at least 25% and will save the company more than $1 billion a year.

Ask yourself these two questions

“When the going gets tough, the tough get coding” looks to be advice enterprises are taking to heart. But it’s also worth considering the distinction between these two questions: “Must we make better use of technology because tough circumstances give us no choice in the matter?” and “Might we be able to do better than the legacy practices we’ve simply grown accustomed to?”

A strong culture of asking the latter question in good times will always set your organization up to have better answers to the former question when tough times force the issue.

Learn more about DataStax here.

About Bryan Kirschner:

Bryan is Vice President, Strategy at DataStax. For more than 20 years he has helped large organizations build and execute strategy when they are seeking new ways forward and a future materially different from their past. He specializes in removing fear, uncertainty, and doubt from strategic decision-making through empirical data and market sensing.

CIO, IT Leadership

By Bryan Kirschner, Vice President, Strategy at DataStax

From delightful consumer experiences to attacking fuel costs and carbon emissions in the global supply chain, real-time data and machine learning (ML) work together to power apps that change industries.

New research co-authored by Marco Iansiti, the co-founder of the Digital Initiative at Harvard Business School, sheds further light on how a data platform with robust real-time capabilities contributes to delivering competitive, ML-driven experiences in large enterprises.

It’s yet another key piece of evidence showing that there is a tangible return on a data architecture that is cloud-based and modernized – or, as this new research puts it, “coherent.”

Data architecture coherence

In the new report, titled “Digital Transformation, Data Architecture, and Legacy Systems,” researchers defined a range of measures of what they summed up as “data architecture coherence.” Then, using rigorous empirical analysis of data collected from Fortune 1000 companies, they found that every “yes” answer to a question about data architecture coherence results in about 0.7–0.9 more machine learning use cases across the company. Moving from the bottom quartile to the top quartile of data architecture coherence leads to more intensive machine learning capabilities across the corporation, and to about 14% more applications and use cases being developed and turned into products.

They identified two architectural elements for processing and delivering data: the “data platform,” which covers the sourcing, ingestion, and storage of data sets, and the “machine learning (ML) system,” which trains and productizes predictive models using input data.

They conclude that what they describe as coherent data platforms “deliver real-time capabilities in a robust manner: they can incorporate dynamic updates to data flows and return instantaneous results to end-user queries.”
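As a rough illustration of that pairing of dynamic updates with instantaneous queries, here is a minimal, in-memory sketch: one method ingests streaming events as they arrive, another serves fresh, low-latency aggregates that an ML model could score against. The event shape and class names are assumptions made for illustration; they are not taken from the report or from any specific product.

```python
# Minimal sketch of the pattern the researchers describe: streaming updates
# flow into a materialized view, and end-user queries read the freshest state.
# The Event shape and feature names are illustrative assumptions.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Event:
    user_id: str
    amount: float

class RealTimeFeatureView:
    """Per-user aggregates, updated as events arrive and queryable at any moment."""

    def __init__(self) -> None:
        self.spend_total = defaultdict(float)
        self.event_count = defaultdict(int)

    def ingest(self, event: Event) -> None:
        # Dynamic update to the data flow: state changes as soon as the event lands.
        self.spend_total[event.user_id] += event.amount
        self.event_count[event.user_id] += 1

    def query(self, user_id: str) -> dict:
        # Instantaneous end-user query: features an ML model could score against.
        count = self.event_count[user_id]
        total = self.spend_total[user_id]
        return {"total_spend": total, "avg_spend": total / count if count else 0.0}

if __name__ == "__main__":
    view = RealTimeFeatureView()
    for e in [Event("u1", 25.0), Event("u1", 75.0), Event("u2", 10.0)]:
        view.ingest(e)
    print(view.query("u1"))  # {'total_spend': 100.0, 'avg_spend': 50.0}
```

A production system would put an event-streaming platform and a low-latency database behind these two methods, but the division of labor is the point: ingestion keeps the view current, and queries never wait on a batch job.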

These kinds of capabilities enable companies like Uniphore to build a platform that applies AI to sales and customer interactions to analyze sentiment in real time and boost sales and customer satisfaction.

Putting data in the hands of the people who need it

The study results don’t surprise us. In the latest State of the Data Race survey report, over three quarters of the more than 500 tech leaders and practitioners (78%) told us real-time data is a “must have.” And nearly as many (74%) have ML in production.

Coherent data platforms also can “combine data from various sources, merge new data with existing data, and transmit them across the data platform and among users,” according to Iansiti and his co-author Ruiqing Cao of the Stockholm School of Economics.

This is critical, because at the end of the day, competitive use cases are built, deployed, and iterated by people: developers, data scientists, and business owners – potentially collaborating in new ways at established companies.

The authors of the study call this “co-invention,” and it’s a key requirement. In their view a coherent data architecture “helps traditional corporations translate technical investments into user-centric co-inventions.” As they put it, “Such co-inventions include machine learning applications and predictive analytics embedded across the organization in various business processes, which increase the value of work conducted by data users and decision-makers.”

We agree and can bring some additional perspective on the upside of that kind of approach. In The State of the Data Race 2022 report, two-thirds (66%) of respondents at organizations that made a strategic commitment to leveraging real-time data said developer productivity had improved. And, specifically among developers, 86% of respondents from those organizations said, “technology is more exciting than ever.” That represents a 24-point bump over organizations where real-time data wasn’t a priority.

The focus on a modern data architecture has never been clearer

Nobody likes data sprawl, data silos, and manual or brittle processes – all aspects of a data architecture that hamper developer productivity and innovation. But the urgency and the upside of modernizing and optimizing the data architecture keeps coming into sharper focus.

For all the current macroeconomic uncertainty, this much is clear: the path to future growth depends on getting your data architecture fit to compete and primed to deliver real-time, ML-driven applications and experiences.

Learn more about DataStax here.

About Bryan Kirschner:

Bryan is Vice President, Strategy at DataStax. For more than 20 years he has helped large organizations build and execute strategy when they are seeking new ways forward and a future materially different from their past. He specializes in removing fear, uncertainty, and doubt from strategic decision-making through empirical data and market sensing.

Data Architecture, IT Leadership