Since the pandemic began, 60 million people in Southeast Asia have become digital consumers. The staggering opportunities Asia’s burgeoning digital economy presents are reason enough to spur you into rethinking the way you do business.

This means one thing: digital transformation. Cloud adoption empowers organisations to adapt quickly to sudden market disruptions. Back when the pandemic was at its peak, hybrid work and enterprise mobile apps ensured critical operations were able to maintain business-as-usual despite lockdowns and border closures. Today, they are empowering an increasingly mobile workforce to stay productive—on their terms.

To facilitate this transformation, organisations dismantled legacy infrastructures and adopted decentralised networks, cloud-based services, and the widespread use of employees’ personal devices.

But with this new cloud-enabled environment of mobile devices and apps, remote workspaces, and edge-computing components came substantial information gaps. Ask yourself if you have complete visibility of all your IT assets; there’s a good chance you’d answer no. This shouldn’t come as a surprise, as 94% of organisations find 20% or more of their endpoints undiscovered and therefore unprotected.

Why you can’t ignore your undiscovered (and unprotected) endpoints

The rapid proliferation of endpoints increases the complexity of today’s IT environments and gives cyber criminals a broader attack surface to exploit. That only underscores the importance of knowing all your endpoints. Here’s what happens if you don’t.

Exposure to security risk. You need to keep your doors and windows locked if you want to secure your home. But what if you don’t know how many you have or where they are located? It’s the same with endpoints: you can’t protect what you can’t see. Knowing your endpoints and getting real-time updates on their status will go a long way toward proactively keeping cyber threats at bay and responding to incidents rapidly—and at scale.

Poor decision-making. Real-time insight relies on instantaneous communication with all your IT assets; the data they return enables your teams to make better-informed decisions. Yet current endpoint practices work with data collected at an earlier point in time, which means that by the time your team uses the data, it’s already outdated. This renders the insights derived from it inaccurate and, in some instances, unusable.

Inefficient operations. Despite IT assets constantly being added to or decommissioned from the environment due to workforce shifts and new requirements, many enterprises still track their inventory manually in Excel spreadsheets. Without a complete and accurate inventory of every asset, IT teams are left playing guessing games about what to manage and patch.

Getting a better handle on ever-present security threats 

Having a bird’s-eye view of your endpoints requires the right tools to manage them, no matter the size or complexity of your digital environment. These tools should help you regain real-time visibility and complete control by:

- Identifying unknown endpoints that are yet to be discovered, evaluated, and monitored
- Finding issues by comparing installations and versions of your software for each endpoint against defined software bundles and updates (sketched in code below)
- Standardising your environment by instantly applying updates to out-of-date installations and installing software missing from endpoints that require them
- Enabling automation of software management to further reduce reliance on IT teams by governing end-user self-service
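To make the comparison step concrete, here is a minimal sketch in Python of checking each endpoint’s installed software against a defined bundle. All endpoint names, packages, versions, and the bundle itself are hypothetical; a real endpoint-management platform would gather this inventory from live agents rather than a hard-coded dict.

```python
# Hypothetical "defined software bundle": package name -> minimum approved version.
REQUIRED_BUNDLE = {
    "openssl": (3, 0, 8),
    "browser": (109, 0, 0),
    "edr-agent": (7, 2, 1),
}

# Hypothetical inventory as reported by each endpoint.
endpoints = {
    "laptop-014": {"openssl": (3, 0, 8), "browser": (108, 0, 1), "edr-agent": (7, 2, 1)},
    "kiosk-202": {"openssl": (1, 1, 1)},  # missing the browser and EDR agent entirely
}

def audit(name, installed):
    """Report out-of-date and missing software for one endpoint."""
    for package, minimum in REQUIRED_BUNDLE.items():
        version = installed.get(package)
        if version is None:
            print(f"{name}: {package} MISSING -> install")
        elif version < minimum:  # tuple comparison: (1, 1, 1) < (3, 0, 8)
            print(f"{name}: {package} {version} out of date -> update to {minimum}")

for name, installed in endpoints.items():
    audit(name, installed)
```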

You only stand to gain when you truly understand the importance of real-time visibility and complete control over your endpoints—and commit to it. In the case of insurer Zurich, having a high-resolution view over an environment with over 100,000 endpoints worldwide meant greater cyber resilience, savings of up to 100 resource hours a month, and deeper collaboration between cybersecurity and operations.

Secure your business with real-time visibility and complete control over your endpoints. Learn how with Tanium.


By Bryan Kirschner, Vice President, Strategy at DataStax

Not too long ago, many CIOs might have had to wrestle with a chief financial officer who viewed enterprise IT as a utility. To use a household metaphor, a CFO might be sorely tempted, if budgets were tight, to save money by lowering the thermostat on cold days.

Now technology has become such an indispensable tool for driving revenue or removing costs that CIOs are just as likely to be asked to do more in the face of a downturn, not less.

In one recent large-scale survey, 83% of respondents said their companies were concerned about a recession in 2023 – but 90% indicated IT spending would either increase (51%) or stay the same.

More tellingly, organizations that were more concerned about the effects of a possible recession were more likely to be planning bigger IT spending than those who were less concerned, the report said. “Just 30% of companies with ‘no plans’ to make major preparations for a recession reported that they were getting ready to hike IT spending, in contrast to solid majorities – 68% and 55% – for companies who were already making recession plans or planned to in the near future, respectively,” the survey said.

Some of the drivers are expansions of proven use cases, such as personalization. For example: Target’s internal platform that optimizes advertising placements on Target.com to deliver a more personalized guest experience drove more than $1 billion in value in 2021 – a figure that, with continued investment, is expected to double in a few years.

A bigger slice of a shrinking pie

In-the-moment experiences like personalization are a proven way to drive revenue. The recent State of the Data Race 2022 report found that 71% of respondents could tie revenue growth directly to real-time data.

But technology investments that delight customers can also be a way to win a “bigger slice” when macro trends are “shrinking the pie.” With US interest rates rising, “as the size of the mortgage industry shrinks … now is the time to ramp up tech spending to create a better customer experience and gain market share,” according to Brian Woodring, CIO of Detroit-based Rocket Mortgage.

And robotic process automation (RPA) offers broadly relevant potential to attack rote work and waste. According to one survey, organizations that moved beyond piloting automation achieved an average cost reduction of 32% in their targeted areas.

Multi-dimensional challenges

Any organization can benefit from building a strong culture of taking advantage of best-of-breed technologies and rallying teams to pivot and double down on taking share, bolstering revenue, or protecting margins as the need arises.

But it’s worth pointing out that there’s a broader strategic dimension to this, too. Sometimes “tough operating conditions” aren’t simply a matter of, say, a temporary pullback in consumer spending. Challenges can be multi-dimensional and can have lasting impacts. The COVID-19 pandemic, for example, created a shortage of pharmacists while simultaneously increasing demand for revenue-generating services such as administering vaccines and tests.

Under these radically changed circumstances, Walgreens executives asked a pointed question: “We looked at our system and said, ‘Why are we filling prescriptions the way we did in 1995?’”

Now a growing network of automated, centralized drug-filling centers (complete with rows of robot arms) sort and bottle pills. According to Walgreens, this cuts pharmacist workloads by at least 25% and will save the company more than $1 billion a year.

Ask yourself these two questions

“When the going gets tough, the tough get coding” looks to be advice enterprises are taking to heart. But it’s also worth considering the distinction between these two questions: “Must we make better use of technology because tough circumstances give us no choice in the matter?” and “Might we do better because we’ve grown accustomed to legacy practices?”

A strong culture of asking the latter question in good times will always set your organization up to have better answers to the former question when tough times force the issue.

Learn more about DataStax here.

About Bryan Kirschner:

Bryan is Vice President, Strategy at DataStax. For more than 20 years he has helped large organizations build and execute strategy when they are seeking new ways forward and a future materially different from their past. He specializes in removing fear, uncertainty, and doubt from strategic decision-making through empirical data and market sensing.


By Bryan Kirschner, Vice President, Strategy at DataStax

From delightful consumer experiences to attacking fuel costs and carbon emissions in the global supply chain, real-time data and machine learning (ML) work together to power apps that change industries.

New research co-authored by Marco Iansiti, the co-founder of the Digital Initiative at Harvard Business School, sheds further light on how a data platform with robust real-time capabilities contributes to delivering competitive, ML-driven experiences in large enterprises.

It’s yet another key piece of evidence showing that there is a tangible return on a data architecture that is cloud-based and modernized – or, as this new research puts it, “coherent.”

Data architecture coherence

In the new report, titled “Digital Transformation, Data Architecture, and Legacy Systems,” researchers defined a range of measures of what they summed up as “data architecture coherence.” Then, using rigorous empirical analysis of data collected from Fortune 1000 companies, they found that every “yes” answer to a question about data architecture coherence results in about 0.7–0.9 more machine learning use cases across the company. Moving from the bottom quartile to the top quartile of data architecture coherence leads to more intensive machine learning capabilities across the corporation, and about 14% more applications and use cases being developed and turned into products.

They identified two architectural elements for processing and delivering data: the “data platform,” which covers the sourcing, ingestion, and storage of data sets, and the “machine learning (ML) system,” which trains and productizes predictive models using input data.

They conclude that what they describe as coherent data platforms “deliver real-time capabilities in a robust manner: they can incorporate dynamic updates to data flows and return instantaneous results to end-user queries.”

These kinds of capabilities enable companies like Uniphore to build a platform that applies AI to sales and customer interactions to analyze sentiment in real time and boost sales and customer satisfaction.

Putting data in the hands of the people that need it

The study results don’t surprise us. In the latest State of the Data Race survey report, over three-quarters (78%) of the more than 500 tech leaders and practitioners told us real-time data is a “must have.” And nearly as many (74%) have ML in production.

Coherent data platforms also can “combine data from various sources, merge new data with existing data, and transmit them across the data platform and among users,” according to Iansiti and his co-author Ruiqing Cao of the Stockholm School of Economics.

This is critical, because at the end of the day, competitive use cases are built, deployed, and iterated by people: developers, data scientists, and business owners – potentially collaborating in new ways at established companies.

The authors of the study call this “co-invention,” and it’s a key requirement. In their view a coherent data architecture “helps traditional corporations translate technical investments into user-centric co-inventions.” As they put it, “Such co-inventions include machine learning applications and predictive analytics embedded across the organization in various business processes, which increase the value of work conducted by data users and decision-makers.”

We agree and can bring some additional perspective on the upside of that kind of approach. In The State of the Data Race 2022 report, two-thirds (66%) of respondents at organizations that made a strategic commitment to leveraging real-time data said developer productivity had improved. And, specifically among developers, 86% of respondents from those organizations said, “technology is more exciting than ever.” That represents a 24-point bump over those organizations where real-time data wasn’t a priority.

The focus on a modern data architecture has never been clearer

Nobody likes data sprawl, data silos, and manual or brittle processes – all aspects of a data architecture that hamper developer productivity and innovation. But the urgency and the upside of modernizing and optimizing the data architecture keeps coming into sharper focus.

For all the current macroeconomic uncertainty, this much is clear: the path to future growth depends on getting your data architecture fit to compete and primed to deliver real-time, ML-driven applications and experiences.

Learn more about DataStax here.

About Bryan Kirschner:

Bryan is Vice President, Strategy at DataStax. For more than 20 years he has helped large organizations build and execute strategy when they are seeking new ways forward and a future materially different from their past. He specializes in removing fear, uncertainty, and doubt from strategic decision-making through empirical data and market sensing.


Most of us have experienced breaks in supply chains recently. In the automotive industry, for example, no one can tell you exactly when a much-needed part will arrive. And manufacturing has slowed to the point that popular used cars, available now, cost more than new models stuck on a wait list.

This lack of transparency frustrates Schnellecke Logistics SE. As one of the world’s leading automotive logistics service providers, the company not only delivers automotive parts in sync with its customers’ production processes, it also supports their process optimization.

Logistics supply inefficiency hinders process optimization

External factors like volatile demand, employee shortages, and agile supply chains drove Schnellecke to become more efficient, but what it really needed to transform the business was real-time insight and transparency into the entire supply-chain process.

In their way were disparate systems that made it hard to retrieve and share vital information on shop-floor production processes and operations. And without mobile access from the shop floor, they lacked the real-time data necessary to anticipate potential issues.

That blind spot was a driving factor in the decision for Schnellecke and its subsidiary, LOGIS GmbH IT, to build a digital control tower to manage logistics. The ability to visualize real-time data gives them (and their customers by extension) management oversight from transportation and warehousing to packaging, assembly, and supply.

Managing supply from the digital boardroom in the control tower

Now, from the control tower’s digital boardroom, Schnellecke and its customers can make decisions based on real-time data from all their supply logistics processes. The role-specific applications and analytics drive greater efficiency and transparency with real-time operational data, automated KPI reports, digital shift protocols, and more.

“We developed a ‘digital boardroom for logistics’ to enable the integrated, end-to-end, and real-time monitoring of our logistics process. By also providing this tool to our customers, we have established a new level of customer service for our core business,” says Karsten Keil, CIO, Schnellecke and VP group IT and digitization LOGIS GmbH.

As an SAP intelligent enterprise, LOGIS built the digital control tower solution on the SAP Business Technology Platform. This solution is an important building block in Schnellecke’s digital strategy. All digital initiatives are combined and synchronised in the Schnellecke iX+ framework.

70% faster: operational insights and impact analysis

“With SAP Business Technology Platform, we have built a digital boardroom for logistics that provides us with a complete view by role and product to see what is happening on the shop floor from procurement to line feeding,” explains Denis Wirries, head of competence center IoT, LOGIS GmbH.

Schnellecke has achieved operational efficiency and excellence through process optimization. With mobile access to the dashboard from the shop floor, they can see deviations easily and take action. By aggregating and processing real-time data digitally from various internal and external sources, they are able to make better decisions. The automated reporting alone saves them two hours per day. With a 70% increase in operational insight and impact analysis, both employees and customers have embraced a digital mindset.

3x faster issue resolution, 25% more issues predicted

Now, Schnellecke resolves issues 3x faster than before and predicts 25% more issues — improving operations, productivity, and customer satisfaction. All relevant logistics supply data is centralized and displayed in real time on the dashboard, which is accessible from all commonly used devices, including mobile. Manual data searches have become obsolete, and operational processes are stable.

Schnellecke’s innovation has lit up automotive and manufacturing supply chain and logistics processes end to end. Everyone benefits. In recognition, Keil was named one of the top 10 CIOs of 2021 by the German IT magazine Computerwoche.

“You have an idea, say it out loud and you have a team of colleagues who tirelessly and passionately fight for this idea together with you,” Keil exclaims. “This is what makes Schnellecke so special for me. And that is also the reason for this success.”

Schnellecke is also a 2022 SAP Innovation Awards winner. You can read the company’s pitch deck to learn more about the digital architecture.


Most organizations understand the profound impact that data is having on modern business. In Foundry’s 2022 Data & Analytics Study, 88% of IT decision-makers agree that data collection and analysis have the potential to fundamentally change their business models over the next three years.

The ability to pivot quickly to address rapidly changing customer or market demands is driving the need for real-time data. But poor data quality, siloed data, entrenched processes, and cultural resistance often present roadblocks to using data to speed up decision making and innovation.

We asked the CIO Experts Network, a community of IT professionals, industry analysts, and other influencers, why real-time data is so important for today’s business and how data helps organizations make better, faster decisions. Based on their responses, here are four recommendations for improving your ability to make data-driven decisions. 

Use real-time data for business agility, efficient operations, and more

Business and IT leaders must keep pace with customer demands while dealing with ever-shifting market forces. Gathering and processing data quickly enables organizations to assess options and take action faster, leading to a variety of benefits, said Elitsa Krumova (@Eli_Krumova), a digital consultant, thought leader and technology influencer.

“The enormous potential of real-time data not only gives businesses agility, increased productivity, optimized decision-making, and valuable insights, but also provides beneficial forecasts, customer insights, potential risks, and opportunities,” said Krumova.

Other experts agree that access to real-time data provides a variety of benefits, including competitive advantage, improved customer experiences, more efficient operations, and confidence amid uncertain market forces:

“Business operations must be able to make adjustments and corrections in near real time to stay ahead of the competition. Few companies have the luxury of waiting days or weeks to analyze data before reacting. Customers have too many options. And in some industries — like healthcare, financial services, and manufacturing — not having real-time data to make rapid critical adjustments can lead to catastrophic outcomes.” — Jack Gold (@jckgld), President and Principal Analyst at J. Gold Associates LLC.

“When insights from the marketplace are not transmitted in real time, the ability to make critical business decisions disappears. We’ve all experienced the pain of what continues to happen with the disconnect between customer usage metrics and gaps in supply chain data.” — Frank Cutitta (@fcutitta), CEO and Founder, HealthTech Decisions Lab

“Operationally, think of logistics. Real-time data provides the most current intelligence to manage the fleet and delivery, for example. Strategically, with meaningful real-time data, systemic issues are easier to identify, portfolio decisions faster to make, and performance easier to evaluate. At the end of the day, it drives better results in safety, customer satisfaction, the bottom line, and ESG [environmental, social, and governance].” — Helen Yu (@YuHelenYu), Founder and CEO, Tigon Advisory Corp.

“Businesses are facing a rapidly evolving set of threats from supply chain constraints, rising fuel costs, and shipping delays. Taking too much time to make a decision based on stale data can increase overall costs due to changes in fuel prices, availability of inventory, and logistics impacting the shipping and delivery of products. Organizations utilizing real-time data are the best positioned to deal with volatile markets.” — Jason James (@itlinchpin), CIO at Net Health

Build a foundation for continuous improvement

The experts offered several practical examples of how real-time data, paired with automation, a key capability for making data actionable, can help deliver continuous improvement in a variety of areas across the business.

“In the process of digital transformation, businesses are moving from human-dependent to digital business processes,” said Nikolay Ganyushkin (@nikolaygan), CEO and Co-founder of Acure. “This means that all changes, all transitions, are instantaneous. The control of key parameters and business indicators should also be based on real-time data, otherwise such control will not keep up with the processes.”

Real-time data and automated processes present a powerful combination for improving cybersecurity and resiliency.

“When I was coming up in InfoSec, we could only do vulnerability scanning between midnight and 6 am. We never got good results because systems were either off, or there was just nothing going on at those hours,” said George Gerchow (@georgegerchow), CSO and SVP of IT, Sumo Logic. “Today, we do them at the height of business traffic and can clearly see trends of potential service outages or security incidents.”

Will Kelly (@willkelly), an analyst and writer focused on the cloud and DevOps, said that harnessing real-time data is critical “in a world where delaying business and security decisions can prove even more costly than just a couple of years ago. Tapping into real-time data provides decision-makers with immediate access to actionable intelligence, whether a security alert on an attack in-progress or data on a supply chain issue as it happens.”

Real-time data facilitates timely, relevant, and insightful decisions down to the business unit level, said Gene De Libero (@GeneDeLibero), Chief Strategy Officer at GeekHive.com. Those decisions can have a direct impact on customers. “Companies can uncover and respond to changes in consumer behavior to promote faster and more efficient personalization and customization of customer experiences,” he said.

Deploy an end-to-end approach to storing, accessing, and analyzing data

To access data in real time — and ensure that it provides actionable insights for all stakeholders — organizations should invest in the foundational components that enable more efficient, scalable, and secure data collection, processing, and analysis. These components, including cloud-based databases, data lakes, and data warehouses, artificial intelligence and machine learning (AI/ML) tools, analytics, and internet of things capabilities, must be part of a holistic, end-to-end strategy across the enterprise:

“Real-time data means removing the friction and latency from sourcing data, processing it, and enabling more people to develop smarter insights. Better decisions come from people trusting that the data reflects evolving customer needs and captures an accurate state of operations.” — Isaac Sacolick (@nyike), StarCIO Leader and Author of Digital Trailblazer

“Organizations must use a system that draws information across integrated applications. This is often made simpler if the number of platforms is kept to a minimum. This is the only way to enable a real-time, 360-degree view of everything that is happening across an organization — from customer journeys to the state of finances.” — Sridhar Iyengar (@iSridhar), Managing Director, Zoho Europe

“Streaming processing platforms allow applications to respond to new data events instantaneously. Whether you’re distributing news events, moving just-in-time inventory, or processing clinical test results, the ability to process that data instantly is the power of real-time data.” — Peter B. Nichol (@PeterBNichol), Chief Technology Officer at OROCA Innovations
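As a hedged illustration of Nichol’s point about stream processing, here is a minimal sketch in Python using the open source kafka-python client. The broker address, topic name, message shape, and reorder rule are all hypothetical; the point is that the application reacts the moment each event arrives rather than waiting for a nightly batch.

```python
# A minimal event-at-a-time consumer sketch (pip install kafka-python).
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "inventory-events",                    # hypothetical topic name
    bootstrap_servers=["localhost:9092"],  # hypothetical broker address
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

# The loop blocks and yields each event as soon as it is produced.
for event in consumer:
    record = event.value
    if record.get("stock_level", 0) < 10:  # hypothetical business rule
        print(f"Reorder {record['sku']}: only {record['stock_level']} left")
```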

As your data increases, expand your data-driven capabilities

The volume and types of data organizations collect will continue to increase. Forward-thinking leadership teams will continue to expand their ability to leverage that data in new and different ways to improve business outcomes.

“The power of real-time data is amplified when your organization can enrich data with additional intelligence gathered from the organization,” said Nichol. “Advanced analytics can enhance events with scoring models, expanded business rules, or even new data.”

Nichol offered the example of combining a customer’s call — using an interactive voice response system — with their prior account history to enrich the interaction. “By joining events, we can build intelligent experiences for our customers, all in real time,” he said.

It’s one of the many ways that new technologies are increasing the opportunities to use real-time data to fundamentally change how businesses operate, now and in the future.

“As businesses become increasingly digitalized, the amount of data they have available is only going to increase,” said Iyengar. “We can expect real-time data to have a more significant impact on decision-making processes within leading, forward-thinking organizations as we head deeper into our data-centric future.”

Learn more about ways to put your data to work on the most scalable, trusted, and secure cloud.


With so much diverse data available, why do so many companies still struggle to embrace the real-time, data-driven decision-making they need to be more agile? One thing is clear: The challenge isn’t solved by technology alone.

“You can’t buy transformation,” says Tom Godden, Principal Technical Evangelist with the Enterprise Strategy team at AWS. “Real change doesn’t come just from new technology—it comes from rethinking your processes, which are enabled by the technology.”

Godden offers three tips to help organizations break down data silos, improve data quality, and overcome other longstanding data challenges to foster a culture of decision-making that drives business agility.

1. Build the foundation for managing data in real time.

A modern data strategy must emphasize data quality at the source of origin, rather than traditional methods of cleansing and normalizing at the point of consumption. Make sure you have the proper infrastructure, tools, and services in place to capture data from a variety of sources, ensure the quality of the data you’re collecting, and manage it securely, end to end.

The technical underpinnings of a modern data strategy include cloud-based databases, data lakes, and data warehouses; artificial intelligence and machine learning (AI/ML) tools; and analytics. The infrastructure must be supported by a comprehensive plan to manage, access, analyze, and protect data across its entire lifecycle, with fully automated processes and robust integration to make data actionable across the organization.

“It may sound obvious, but if you do not build the right processes to capture all the data, you can’t act on the data,” says Godden.

2. Don’t just democratize data – democratize the decisions based on that data.

Investing in the data management infrastructure, tools, and processes necessary to capture data in real time through a variety of data feeds and devices is just the first step. If you aren’t simultaneously creating a culture that allows people to act on data, you’re just creating frustration.

To that end, avoid “reporting ghost towns” that require people to stop what they’re doing and access a different tool for insights. Instead, build analytics capabilities directly into their workflows, with context, so they can easily apply the insights to their daily activities.

3. Provide the types of guardrails that spur innovation instead of inhibiting it.

Automating metadata processes, including information on data lineage and shelf life, builds confidence in the data. By storing data in its raw or native format, you can apply access policies to individuals without having to modify the data.

This approach ensures more flexibility for how people can use the data they need without compromising the fidelity of the data itself. A data lake can serve as a foundational element of a data unification strategy, providing a single source of truth with supporting policies for real-time provisioning based on permissions.
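As a hedged illustration of that idea, here is a minimal sketch that lands a record in an S3-based data lake in its raw, native format with lineage metadata attached. The bucket name, key layout, and metadata fields are illustrative assumptions, not a prescribed convention.

```python
# A minimal raw-zone ingestion sketch; assumes AWS credentials are
# configured in the environment (pip install boto3).
import boto3

s3 = boto3.client("s3")

with open("order-1042.json", "rb") as raw:
    s3.put_object(
        Bucket="example-data-lake",                 # hypothetical bucket
        Key="raw/orders/2023/01/order-1042.json",   # raw zone, payload unmodified
        Body=raw,
        Metadata={                                  # lineage travels with the object
            "source-system": "orders-api",          # where the record came from
            "ingested-at": "2023-01-15T10:00:00Z",
            "shelf-life-days": "365",               # hypothetical retention hint
        },
    )
```

Because the payload itself is never rewritten, access policies can be applied per user or role on top of the stored objects while the raw data keeps its full fidelity.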

Agile decision making: How three companies are benefiting from a modern data strategy

Organizations are already capturing the benefits of real-time access to data based on roles and permissions. Here are three examples:

Swimming Australia, the nation’s top governing body for swimming, has long been at the forefront of science. Now, it’s using data to analyze race performance and create bespoke training programs for individual athletes. A data lake unified athlete statistics and metrics in a single location, and AI/ML tools are helping the team tailor training programs and track competitors. Analysts and coaches capture real-time physiological data during training sessions and combine that information with race analysis to determine how to evolve training efforts for individual swimmers. Coaches and athletes can easily track progress in real time from their phones via cloud-based dashboards. Today, with its modern data architecture, the national team can create benchmarking reports in minutes, an innovation that helped make the Australians the most successful relay team in the 2020 Tokyo Olympic games.

Coca-Cola Andina, which produces and distributes products licensed by The Coca-Cola Company within South America, needed a solution to collect all relevant information on the company, its customers, logistics, coverage, and assets within a single accurate source. The answer was a cloud-based data lake, which allowed the company to implement new products and services to customize the different value propositions for its more than 260,000 customers. With all the resources and functionality that the data lake enables, Coca-Cola Andina ensures its partners and customers have access to reliable information for making strategic decisions for the business. Coca-Cola Andina ingested more than 95% of the data from its different areas of interest, which allows it to build excellence reports in just a few minutes and implement advanced analytics. The cloud infrastructure increased productivity of the analysis team by 80%.

Vyaire, a global medical company, needed a way to help its 4,000 employees make better, data-based decisions utilizing both first- and third-party data. Adopting AWS Data Exchange to find, subscribe to, and use third-party data has made it easier to incorporate data sources into the company’s own data ecosystem, resulting in quicker insights to help teams focus on getting results, not administration. Easy access to third-party data via the AWS Data Exchange catalog has encouraged more experimentation and innovation, giving Vyaire’s leadership confidence that it can meet the changing market for respiratory care products and direct investment in the right area to improve its product portfolio.

Too many organizations continue to be held back from using data effectively to drive all aspects of their business. A modern data strategy will empower teams and individuals, regardless of role or organizational unit, to analyze and use data to make better, faster decisions – enabling the sustainable advantage that comes from business agility.

Learn more about ways to put your data to work on the most scalable, trusted, and secure cloud.


By Chet Kapoor, Chairman and CEO, DataStax

There is no doubt that this decade will see more data produced than ever before.

But what’s truly going to transform our lives, define the trajectory of each of our organizations, and reshape industries is not the massive volume of data. It’s the unmatched degree to which this data can now be activated in applications that drive action in real time: minute by minute (or even second by second), across work, play, and commerce. Where technology might have been a constraint in the past, it’s now an enabler.

Here, we’ll take a look at why real-time apps are no longer just the domain of internet giants and discuss three ways that your organization can move toward delivering real-time data.

The future is here

IDC predicts that by next year there will be more than 500 million new cloud native digital apps and services – more than the total created over the past 40 years.

We’re already living in this future. We get turn-by-turn driving directions while listening to an AI-recommended playlist, and then arrive at the exact time our e-commerce order is brought to us curbside – along with a cup of hot coffee.

The real-time data powering apps that change industries is no longer just offered by a Google or a Spotify.

Companies like Target excel at it. The retailer delights customers with an app that shows users what they most want to see, ensures no one ever misses a deal, has a near-perfect record of intelligent substitutions for out-of-stock items, and gets users their orders on their terms (and it might just include a drink from Starbucks, another enterprise that is a real-time app powerhouse).

Smaller businesses are making real-time data core to their offerings, too. Ryzeo offers a marketing platform that leverages real-time data generated by events on its clients’ e-commerce websites. An item that a shopper views or searches for instantly results in an AI-driven recommendation through its “suggested items.”  Real-time data – and the technology that supports it – is how Ryzeo makes this happen. 

Inaction isn’t an option

The door is open to you and your organization, too.

The best-of-breed technologies that power winning real-time apps are open source and available as a service, on demand to all. There are tons of proven use cases across industries. When you leverage these use cases and technologies, there’s a big payoff – you increase your organization’s ability to innovate and turn data into delightful customer experiences.

This will not only transform how your business grows, but how your business works.

As consumers, we never want to go back to dumb apps that evolve slowly, don’t know our context, and fail to act intelligently on our behalf. In fact, we desire the opposite.

When you put the customer’s digital experience at the center of agile workflows, make fast decisions, and rapidly iterate, you create a powerful feedback loop. Every win shows the power of a new and more fulfilling way of working. So does every failure – by providing valuable learnings.

The one thing you can count on is that inaction is not an option. And at this moment in time, why would we want to wait?

There is no doubt that real-time data can reduce waste, increase safety, help the environment, and make people happier and healthier. And we’re only just getting started.

So how do you get started? You can make three important choices right now to set your organization on a path to excel at delivering real-time data.

Step 1: Pick up the right tools

The technology to deliver outstanding, data-powered, real-time experiences has arrived – and we’ve got it in spades. The best-of-breed tools are open source. They grew out of the “best of the internet” to solve novel problems of scale and data velocity. Apache Cassandra®, for example, was developed at Facebook to manage massive amounts of messaging data.

Joining the open source ecosystem means you don’t have to reinvent the wheel. This is important because what sets your organization’s real-time data experiences apart won’t be the infrastructure. It’ll be how you put your domain knowledge to use in new ways that delight your users.

Most of these technologies are available to everyone, on demand, as a service. If you didn’t add them to your data infrastructure yesterday, do it today.
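To show how low the barrier to entry is, here is a minimal sketch that connects to Apache Cassandra with the open source Python driver and creates a keyspace and table. It assumes a node reachable on localhost (or swap in the contact points of a managed Cassandra service); the keyspace and table names are illustrative.

```python
# A minimal getting-started sketch (pip install cassandra-driver).
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])   # assumption: a local or hosted Cassandra node
session = cluster.connect()

# A single-node replication setting, fine for a first experiment only.
session.execute("""
    CREATE KEYSPACE IF NOT EXISTS demo
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
""")
session.execute("""
    CREATE TABLE IF NOT EXISTS demo.events (
        id uuid PRIMARY KEY,
        payload text
    )
""")
print("Connected to", cluster.metadata.cluster_name)
```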

Step 2: Assemble the right teams

When every company is a software company, every executive must also be a software executive. This includes your line of business owners, general managers, and functional leaders.

Winning companies reorganize team structures and accountability to match. The days of data scientists experimenting alone in an ivory tower and developers working under requirements that were “thrown over the wall” to IT are over. “The business” can no longer think of data and technology as “IT’s problem.”

All of your employees need to be trained to identify and capitalize on opportunities for using data and technology to drive business results. Your line of business owners must be held accountable for making it happen.

To empower them, assign your developers, data scientists, and technical product managers to cross-functional teams working side-by-side with the business domain colleagues who own customer experiences. This is a ticket out of “pilot purgatory” and a key to democratizing innovation across your company.

Step 3: Ask the right questions

As you advance on your journey, more and more smart systems will be working every minute of every day to answer your industry’s key questions, like “what’s the most compelling personalized offer for this customer?” or “what’s the optimal inventory for each store location?”

What those systems can’t do is ask questions that only humans can, such as “how do we want to evolve our relationship with our customers?” Or “how can we deploy our digital capabilities in ways that differentiate us from our competitors?”

No algorithm is going to kick out the brilliant and empathetic idea to “Show Us Your Tarzhay,” which turned what might have otherwise been the unfortunate necessity of having to shop on a limited budget into the opportunity to celebrate and share a distinctive personal style. Similarly, it took human creativity to expand the concept from clothing into a new category (groceries).

If you take the first two steps listed above, you will start to free up your people’s time to ask creative questions and improve their ability to deliver on the answers using best-of-breed technology. Equip, challenge, and inspire them to think big about where you want to take your customers next, and you’ll get your organization moving in the right direction to provide the benefits of real-time data to your customers.

Learn more about DataStax here.

About Chet Kapoor:

Chet is Chairman and CEO of DataStax. He is a proven leader and innovator in the tech industry with more than 20 years in leadership at innovative software and cloud companies, including Google, IBM, BEA Systems, WebMethods, and NeXT. As Chairman and CEO of Apigee, he led company-wide initiatives to build Apigee into a leading technology provider for digital business. Google (Apigee) is the cross-cloud API management platform that operates in a multi- and hybrid-cloud world. Chet successfully took Apigee public before the company was acquired by Google in 2016. Chet earned his B.S. in engineering from Arizona State University.


By Aaron Ploetz, Developer Advocate

There are many statistics that link business success to application speed and responsiveness. Google tells us that a one-second delay in mobile load times can impact mobile conversions by up to 20%. And a 0.1 second improvement in load times improved retail customer engagement by 5.2%, according to a study by Deloitte.

It’s not only the whims and expectations of consumers that drive the need for real-time or near real-time responsiveness. Think of a bank’s requirement to detect and flag suspicious activity in the fleeting moments before real financial damage can happen. Or an e-tailer providing locally relevant product promotions to drive sales in a store. Real-time data is what makes all of this possible.

Let’s face it – latency is a buzz kill. The time it takes for a database to receive a request, process the transaction, and return a response to an app can be a real detriment to an application’s success. Keeping it at acceptable levels requires an underlying data architecture that can handle the demands of globally deployed real-time applications. The open source NoSQL database Apache Cassandra® has two defining characteristics that make it perfectly suited to meet these needs: it’s geographically distributed, and it can respond to spikes in traffic without adverse effects on its unmatched throughput and low latency.

Let’s explore what both of these mean to real-time applications and the businesses that build them.

Real-time data around the world

Even as the world has gotten smaller, exactly where your data lives still makes a difference in terms of speed and latency. When users reside in disparate geographies, supporting responsive, fast applications for all of them can be a challenge.

Say your data center is in Ireland, and you have data workloads and end users in India. Your data might pass through several routers to get to the database, and this can introduce significant latency into the time between when an application or user makes a request and the time it takes for the response to be sent back.

To reduce latency and deliver the best user experience, the data needs to be as close to the end user as possible. If your users are global, this means replicating data in the geographies where they reside.

Cassandra, built at Facebook in 2007, is designed as a distributed system for deploying large numbers of nodes across multiple data centers. Its architecture is robust and flexible enough that you can configure clusters (collections of Cassandra nodes, visualized as a ring) for optimal geographical distribution, for redundancy, for failover and disaster recovery, or even for a dedicated analytics center that’s replicated from your main data storage centers.
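As a concrete sketch of that kind of geographic configuration, the keyspace definition below asks Cassandra to keep three replicas of every row in each of two data centers. The contact point, data center names, and replica counts are illustrative assumptions and would need to match the cluster’s actual topology (as reported by nodetool status).

```python
# A minimal multi-data-center replication sketch (pip install cassandra-driver).
from cassandra.cluster import Cluster

cluster = Cluster(["10.0.0.1"])   # hypothetical contact point, e.g. in Ireland
session = cluster.connect()

# Keep three replicas of every row in each region, so users in India read
# from nearby nodes instead of making a round trip to Ireland.
session.execute("""
    CREATE KEYSPACE IF NOT EXISTS orders
    WITH replication = {
        'class': 'NetworkTopologyStrategy',
        'eu_west': 3,
        'ap_south': 3
    }
""")
```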

But even if your data is geographically distributed, you still need a database that’s designed for speed at scale.

The power of a fast, transactional database

NoSQL databases primarily evolved over the last decade as an alternative to single-instance relational database management systems (RDBMS), which had trouble keeping up with the throughput demands and sheer volume of web-scale internet traffic.

They solve scalability problems through a process known as horizontal scaling, where multiple server instances of the database are linked to each other to form a cluster.

Some NoSQL database products were also engineered with data center awareness, meaning the database is configured to logically group together certain instances to optimize the distribution of user data and workloads. Cassandra is both horizontally scalable and data-center aware. 

Cassandra’s seamless and consistent ability to scale to hundreds of terabytes, along with its exceptional performance under heavy loads, has made it a key part of the data infrastructures of companies that operate real-time applications – the kind that are expected to be extremely responsive, regardless of the scale at which they’re operating. Think of the modern applications and workloads that have to be reliable, like online banking services, or those that operate at huge, distributed scale, such as airline booking systems or popular retail apps.

Logate, an enterprise software solution provider, chose Cassandra as the data store for the applications it builds for clients, including user authentication, authorization, and accounting platforms for the telecom industry.

“From a performance point of view, with Cassandra we can now achieve tens of thousands of transactions per second with a geo-redundant set-up, which was just not possible with our previous application technology stack,” said Logate CEO and CTO Predrag Biskupovic.

Or what about Netflix? When it launched its streaming service in 2007, it used an Oracle database in a single data center. As the number of users and devices (and data) grew rapidly, the limitations on scalability and the potential for failures became a serious threat to Netflix’s success. Cassandra, with its distributed architecture, was a natural choice, and by 2013, most of Netflix’s data was housed there. Netflix still uses Cassandra today, but not only for its scalability and rock-solid reliability. Its performance is key to the streaming media company – Cassandra runs 30 million operations per second on its most active single cluster, and 98% of the company’s streaming data is stored on Cassandra.

Cassandra has been shown to perform exceptionally well under heavy load. It can sustain very fast write throughput even on basic commodity hardware, and its desirable properties are maintained as more servers are added, without sacrificing performance.
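Here is a minimal sketch of what exercising that write path looks like with the Python driver, using asynchronous writes so requests are pipelined. The keyspace and table are hypothetical (assume something like CREATE TABLE demo.readings (id int PRIMARY KEY, value double) already exists), and the printed number is an illustration, not a benchmark; results vary widely with hardware and cluster size.

```python
# A minimal asynchronous-write sketch (pip install cassandra-driver).
import time
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])
session = cluster.connect("demo")   # hypothetical keyspace
insert = session.prepare("INSERT INTO readings (id, value) VALUES (?, ?)")

N = 10_000
start = time.time()
# Fire writes without waiting for each response; the driver pipelines them.
futures = [session.execute_async(insert, (i, float(i))) for i in range(N)]
for f in futures:
    f.result()                      # block until every write is acknowledged
elapsed = time.time() - start
print(f"{N / elapsed:,.0f} writes/second")
```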

Business decisions that need to be made in real time require high-performing data storage, wherever the principal users may be. Cassandra enables enterprises to ingest and act on that data in real time, at scale, around the world. If acting quickly on business data is where an organization needs to be, then Cassandra can help you get there.

Learn more about DataStax here.

About Aaron Ploetz:

Aaron is a Developer Advocate at DataStax. He has been a professional software developer since 1997 and has several years of experience working on and leading DevOps teams for startups and Fortune 50 enterprises.
