Thanks to cloud, Internet of Things (IoT), and 5G technologies, every link in the retail supply chain is becoming more tightly integrated. These technologies are also allowing retailers to capture more data and glean insights from it – with a big assist from artificial intelligence (AI) and machine learning (ML) technologies – to become more efficient and achieve evolving sustainability goals.

From maintaining produce at the proper temperature to optimizing a distributor’s delivery routes, retail organizations are transforming their businesses to streamline product storage and delivery and take customer experiences to a new level of convenience—saving time and resources and reinforcing new mandates for sustainability along the entire value chain.

“Transformation using these technologies is not just about finding ways to reduce energy consumption now,” says Binu Jacob, Head of IoT, Microsoft Business Unit, Tata Consultancy Services (TCS). “It’s also about being able to capture the insights needed to better forecast energy consumption in the future.”

Reducing energy consumption across the value chain

For example, AI/ML technologies can detect the outside temperature and regulate warehouse refrigeration equipment to keep foods appropriately chilled, preventing spoilage and saving energy.

“The more information we can collect about energy consumption of in-store food coolers, and then combine that with other data such as how many people are in the store or what the temperature is outside, the more efficiently these systems can regulate temperature for the coolers to optimize energy consumption,” says K.N. Shanthakumar, Solution Architect – IoT, Retail Business Unit, TCS.
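As a rough illustration of the idea, a setpoint controller might combine outside temperature and store occupancy into a single adjustment. This is a minimal sketch only; the coefficients and the safe band are hypothetical, and a production system would learn them from historical energy and temperature data:

```python
def cooler_setpoint(outside_temp_c, occupancy, base_setpoint_c=3.0):
    """Illustrative rule: nudge the cooler setpoint within a food-safe band.

    Hotter weather and a busier store (more door openings, more heat load)
    both push the setpoint lower; mild conditions allow a slightly higher
    setpoint, which saves compressor energy. Coefficients are hypothetical.
    """
    adjustment = 0.05 * (22.0 - outside_temp_c) - 0.01 * occupancy
    # Clamp to a food-safe band (1-4 degrees C for chilled produce).
    return max(1.0, min(4.0, base_setpoint_c + adjustment))
```

In a mild, empty store the setpoint stays at its baseline; a hot day with a crowded store drives it lower, trading a little extra energy now for spoilage prevention.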

Landmark Group, one of the largest retail and hospitality organizations in the Middle East, wanted to reduce energy consumption and carbon footprint, improve operational excellence, and make progress toward its sustainability goals. Working with TCS, Landmark Group deployed TCS Clever Energy at more than 500 sites, including stores, offices, warehouses, and malls, resulting in significant improvements in energy efficiency and carbon emissions at these sites.

“Retail customers are looking to achieve net zero goals by creating sustainable value chains and reducing the environmental impact of their operations,” says Marianne Röling, Vice President Global System Integrators, Microsoft. “TCS’ extensive portfolio of sustainability solutions, built on Microsoft Cloud, provides a comprehensive approach for businesses to embrace sustainability and empower retail customers to reduce their energy consumption, decarbonize their supply chains, meet their net zero goals, and deliver on their commitments.”

Optimizing delivery workflows

For delivery to retail outlets, logistics programs—TCS DigiFleet is one example—increasingly rely on AI/ML to help distributors plan optimized routes for drivers, reducing fuel consumption and associated costs. Video and visual analytics ensure that trucks are filled before they leave the warehouse or distribution center, consolidating deliveries into fewer trips. Sensors and other IoT devices track inventory and ensure that products are safe and secure. Postnord implemented this solution to increase its fill rate, improving operations and reducing costs.

“Instead of dispatching multiple trucks with partially filled containers, you can send fewer trucks with fully loaded containers on a route that has been optimized for the most efficient delivery,” says Shanthakumar. “5G helps with the monitoring of contents of the containers and truck routes in real time while dynamically making adjustments as needed and communicating with the driver for effective usage.”
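The consolidation step Shanthakumar describes can be approximated with classic bin packing. The sketch below uses a first-fit-decreasing heuristic to pack shipment volumes into as few trucks as possible; a real fleet platform would also account for routes, delivery time windows, and weight limits:

```python
def consolidate_loads(shipments, truck_capacity):
    """First-fit-decreasing bin packing: assign shipment volumes to as few
    trucks as possible. A simplified stand-in for the load-consolidation
    step a fleet platform performs before route planning."""
    remaining = []  # remaining capacity of each truck
    loads = []      # shipments assigned to each truck
    for size in sorted(shipments, reverse=True):
        for i, free in enumerate(remaining):
            if size <= free:
                remaining[i] -= size
                loads[i].append(size)
                break
        else:
            # No existing truck fits this shipment; dispatch a new one.
            remaining.append(truck_capacity - size)
            loads.append([size])
    return loads
```

For example, six shipments of volumes 4, 8, 1, 4, 2, and 1 fit into two trucks of capacity 10 instead of the three a naive first-come assignment might use.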

More data, better insights

With cloud-driven modernization, intelligence derived from in-store systems and sensors can automatically feed into the supply chain to address consumer expectations on a real-time basis. In keeping with the farm-to-fork movement, for example, consumers can scan a barcode to find out where a product originated and what cycles it went through before landing on the grocery store shelf.

With 5G-enabled smart mirrors, a person can virtually try on apparel. By means of a touchpad or kiosk, the mirror technology can superimpose a garment on a picture to show the shopper how it will look, changing colors and other variables with ease.

Retail transformation enabled by AI/ML, IoT and 5G technologies is still evolving, but we’re already seeing plenty of real-world examples of what the future holds, including autonomous stores and drone deliveries. The key for retail organizations is building a cloud-based infrastructure that not only accelerates this type of innovation, but also helps them become more resilient, adaptable, and sustainable while staying compliant, maintaining security, and preventing fraud.

Learn more about how TCS’ Sustainability and Smart Store solution empowers retailers to reimagine store operations, optimize operational costs, improve security, increase productivity, and enhance customer experience.

Cloud Computing, Digital Transformation, Retail Industry

After moving resources and applications to the cloud, pioneering enterprises in the finance and telecommunications sectors have been stepping up efforts to dive into the cloud. They are trying to leverage innovation on the cloud to drive exponential business growth.

Huawei Cloud has suggested three ways for enterprises to dive into cloud: tapping into more cloud-native technologies, developing more application innovation on cloud, and drawing on state-of-the-art expertise and experience.

Enterprises have encountered many challenges as they seek to dive into cloud:

- Difficult technologies: Cloud native comes with a wide range of tech stacks. There are many different infrastructure, application, data, AI, and IoT technologies to be mastered, which can be challenging for many enterprises.
- Complex scenarios: As digital transformation continues, enterprises have to deal with more and more scenarios where different business needs are involved, where there are no unified standards, and where successful experience is hard to share. This hinders innovation.
- Difficulty acquiring state-of-the-art experience: There have been no platforms available to consolidate and replicate state-of-the-art expertise and experience from industry pioneers.

How Huawei Cloud Helps Enterprises Dive into Cloud with Three Approaches

Huawei Cloud Stack is a cloud solution provided by Huawei Cloud to help large enterprises dive into cloud. Deployed on-premises, Huawei Cloud Stack enables enterprises to establish a secure, reliable, and efficient hybrid cloud with improved security and compliance and continuous service innovation.

To help enterprises rise to these challenges, and in response to the shift from cloud migration to a dive into cloud, Huawei Cloud Stack provides three key approaches: simplifying technologies, developing easier scenario-specific solutions, and sharing experience.

Setting Up a Preferred Digital Foundation

Over the past two years, Huawei has set up teams dedicated to serving customers in different industries. Huawei Cloud Stack has established a preferred digital foundation to enable these dedicated teams to cooperate with Huawei’s internal ICT solution organizations, so they can jointly help customers speed up digital transformation.

We enhanced coordination with ICT products and adopted different technologies to provide product portfolios tailored to more customer needs.

We developed cloud-native disaster recovery and security protection capabilities to free customers from having to spend time maintaining infrastructure, so they can focus better on innovation.

Plus, we standardized and visualized the processes of AI development, application integration, and data governance, and harnessed aPaaS capabilities to offer industry-specific expertise as APIs that can be called with ease.

Providing Standardized Scenario-specific Solutions

To accommodate the diverse needs of enterprises in different industries, Huawei Cloud Stack aims to provide scenario-specific solutions by tapping into continuous operations and cooperating with ecosystem partners.

We identify consistent customer needs in similar scenarios and integrate hardware, cloud service offerings, networking, configurations, and applications into a standard solution that can easily meet those needs.

We work with our ecosystem partners to pre-test the industry applications included in the solution and update them continuously to keep pace with current customer needs. Only proven solutions are launched to market.

We also consolidate helpful procedures and tools such as an application migration procedure consisting of 12 steps in four stages and an automated migration tool. This helps standardize the process of continuous operations.

Distilling Experience into Professional Services

Huawei Cloud Stack provides over 70 professional services in 8 categories to distill Huawei’s own experience in digital transformation and its experience enabling digital transformation for customers in diverse industries. These services help customers in numerous industries build, move to, use, and manage cloud. Additionally, in the Huawei Cloud Stack zone of Huawei’s KooGallery, an online store, there are more than 500 popular applications available for enterprises to use. It is expected to act as an important application distribution platform for enterprises in the future. At HUAWEI CONNECT 2022, Huawei worked with Financial Information Technology Institute (FITI) to release the Modern Financial Core System White Paper, aiming to share the experience and expertise of developing modern core systems in the finance sector with customers across all sectors.

Huawei Cloud Stack has helped many customers dive into the cloud to unleash digital power. For the government of Foshan, a prefecture-level city in China, Huawei Cloud Stack’s Astro low-code development platform helped government application developers use iteration and visual orchestration to quickly create and roll out applications. Thanks to its low skill requirements, the platform also allows the officers who use the applications to develop them themselves, which means much higher efficiency. Huawei Cloud Stack’s mining industrial Internet solution integrates big data, application governance, IoT, ecosystem setup, and other technological advances. It has helped many mining companies, like Hongliulin Coal Mine of Shaanxi Coal and Chemical Industry Group, transform their mining operations. Also, using Huawei Cloud Stack’s professional data governance service, HydroLancang, a subsidiary of the China Huaneng Group, set up a data governance system that unifies data standards and lets data empower operations in the energy sector.

New Huawei Cloud Stack Version Unveiled to Unleash Digital Power

Over the next decade, more enterprises will dive into the cloud to reap more benefits and positive outcomes. As a trusted partner of enterprises in their cloud journey, Huawei Cloud has adopted two strategies:

- Think cloud native and act cloud native in composable delivery, data-driven operations, DevOps development, service architecture design, and security and trustworthiness assurance to help enterprises promote application modernization.
- Explore a new model for industrial-scale AI development to operationalize AI in the core applications of enterprises in diverse sectors.

Huawei Cloud Stack 8.2 was launched in line with Huawei Cloud’s strategies. This new version has been improved in many ways:

- We have reinforced the infrastructure’s capabilities, including cloud-edge synergy and all-scenario disaster recovery. The cloud-native architecture they are built on provides a solid, resilient foundation for digital transformation.
- We developed a new security protection system consisting of Cloud Security Brain and seven layers of protection. Cloud Security Brain is a cloud-native security operations center backed by Huawei Cloud’s international security operations experience. It accesses more than 200 kinds of security data and provides over 100 security event response plans to offer actionable insights into risks and help close 99% of security events in minutes. The seven-layer system of protection provides comprehensive protection for physical machines, cloud servers, applications, and data.
- AI training, AI Cortex, and industrial Internet solutions have been put in place to drive cloud innovation and intelligent upgrades. The CityCore solution and Pangu mining model infuse AI into city governance and industrial development, at scale, and with continuous iteration and optimization.
- We have launched scenario-specific solutions such as industrial Internet for mining, a new financial distributed core, integrated finance management, and digital twins for cities. We also worked with third parties to release the Modern Financial Core System White Paper, the Financial Digital Transformation Best Practice White Paper, and Introduction to City Digital Twins. In this way, we have distilled expertise and addressed the core needs of customers in industry-specific scenarios.

Huawei Cloud Stack 8.2 helps enterprises in more sectors to reap more benefits from modernization and smart transformation:

- Shenzhen’s Futian District leveraged Huawei Cloud Stack’s AI-powered CityCore solution to enable automated allocation of service tickets. Ticket allocation that took 4 minutes now needs just 50 seconds, and the accuracy reaches over 90%.
- Hongliulin Coal Mine of Shaanxi Coal and Chemical Industry Group worked in tandem with Huawei Cloud Stack to set up an intelligent integrated management and control platform with an industrial Internet architecture for mining. The platform centrally captures huge volumes of device data, provides over 100 device models and 400 operational models, and offers data insights into mining operations. This means distilled mining expertise, less manpower, and more secure and efficient operations.

Huawei Cloud Stack is now used to accelerate digital transformation for more than 4,800 customers across 150 countries, including more than 800 e-Government cloud customers and 300 financial institutions. We have released more than 30 scenario-specific solutions tailored to multiple sectors, such as government, finance, transportation, energy, and manufacturing. The solutions include unified government affair management, financial intelligent data lake, and smart airport solutions.

In the future, Huawei Cloud Stack will continue to innovate to lay a more powerful foundation for digital transformation of every sector while joining hands with more customers to dive into cloud and unleash digital power.

Digital Transformation

Now, more than ever, global businesses have an opportunity. With people and infrastructure touching every point on the planet — and new technology empowering us to radically change the way we consume resources — we can lead the world toward a better, more sustainable future. 

That optimism stems from three core beliefs: 

- We can build our business ecosystems to promote environmental stewardship, achieving ambitious goals like net zero and zero waste.
- We can foster well-being and growth, as well as diversity, equity, inclusion, and accessibility (DEIA), for all our people — those who work for us and those who live in the communities we touch.
- We can grow our businesses through innovation and digitization, establishing long-term prosperity without social or environmental compromise.

Learn how TCS and Microsoft are powering sustainability through innovation.  

Green IT, Retail Industry


At UL Solutions, CIO Karriem Shakoor has identified clear cultural and architectural requirements for achieving data democratization so that IT can get out of the reports business and into driving revenue.

Recently, I had the chance to speak at length with Shakoor about data strategy at the global safety science company, which has over 15,000 employees in 40 countries. What follows is an edited version of our interview.

Martha Heller: How is software changing UL Solutions as a business?

Karriem Shakoor: UL Solutions’ ambition is to be our customers’ most trusted, science-based safety, security, and sustainability partner, which means that we need best-in-class technology infrastructure. For example, investing in industry-leading customer relationship management software lets us leverage the collective innovation of that software company’s entire customer base toward meeting our own transformation goals, rather than starting from scratch. That allows our sales teams to run and track their activities with feature-rich and fully integrated processes.

But the software tools are only as powerful as our ability to create a consistent view of our customer base. We can digitize our services and enable their appropriate pricing and configuration for a customer, but to fully leverage the software investment, we also need reliable, accurate customer and account data to support direct marketing, lead generation, and personalization.

What are the steps toward having a data strategy that fully leverages the software?

Good governance is a must if you want to harness the full power of data for new products and services and achieve data democratization.

Every company must be intentional about governing and proactively managing the quantities of data it creates each year, using effective standards and quality rules. If not, they risk diluting the value they can derive, and slowing decision-making.

Email offers a simple example. In order to use email marketing to engage customers, it’s critical to build an accurate and trusted repository of email addresses. Without enforcing a convention for how those addresses are formatted and ensuring that the systems that record those addresses — whether manually or using automation — conform to that convention, you jeopardize the usability of key data.
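A minimal sketch of such a convention might normalize every incoming address to one canonical form and reject anything that fails a format check. The pattern below is pragmatic rather than RFC-complete, and real systems often add deliverability verification on top:

```python
import re

# Pragmatic (not RFC-complete) email pattern used as the storage convention.
EMAIL_RE = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$")

def normalize_email(raw):
    """Return the canonical form of an address, or None if it is unusable.

    Trimming and lowercasing enforce one storage convention regardless of
    whether the address was keyed in manually or captured by automation.
    """
    candidate = raw.strip().lower()
    return candidate if EMAIL_RE.fullmatch(candidate) else None
```

Running every capture path through one normalizer like this is what keeps the repository trustworthy enough for direct marketing to rely on.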

What is data democratization and why is it important?

Democratizing data empowers stakeholders to access and use that data to answer questions on their own without working through an IT broker. For example, a stakeholder should be able to run a report without having to request that IT pull the information. After IT certifies datasets that meet validated stakeholder needs and makes them available internally, end users can draw from those datasets on demand, speeding stakeholder decision-making and getting IT out of the business of running reports.

In addition to standards and governance, what else does an organization need for data democratization?

Effective data democratization requires a data management culture that empowers business stakeholders to define how certain information will be used. This also means holding people accountable for using that information appropriately and subject to good governance.

Data democratization also requires subject matter experts inside business units and functions who understand data analytics and reporting. IT alone simply cannot drive successful data democratization.

What is your architectural strategy for enabling the democratization of data?

There really is no single best architectural design. You need adherence to strong data governance and consistent practices for defining your data and mastering it with the right tools to achieve a standard, concise view of key data types across your business.

What is the CIO’s role in leading data strategy?

An effective data strategy must connect to a business imperative. Every CIO needs to understand the company’s multi-year strategy and desired outcomes, and the data-related capabilities necessary to drive those outcomes.

At UL Solutions, tapping into our data to build a deeper understanding of customer needs and buying behaviors can help expand our relationships with existing customers.

What advice do you have for CIOs on developing a culture of data democratization?

Start with a clear strategic intention. Connecting our data democratization proposals to the company’s business strategy went a long way toward helping our executive team appreciate why we prioritized building a single, consistent view of our customers. This approach really helped generate enthusiasm and build the commitment we needed.

I also recommend that CIOs resist trying to execute a data strategy with their IT teams alone. In any company, there are at least three different groups outside of IT that think about your key data every day. For example, pricing managers, product managers, and inside sales teams need to buy into the data strategy, and so do your executive peers. You need your chief revenue or chief commercial officers sitting right next to you, championing the importance of data governance and quality.

Finally, understand that your most important work as CIO is to bring the right data leadership into the IT organization. You cannot wait to be asked to build out a data team; as CIO, you have to be one step ahead.

Data Management, IT Leadership

By George Trujillo, Principal Data Strategist, DataStax

Increased operational efficiencies at airports. Instant reactions to fraudulent activities at banks. Improved recommendations for online transactions. Better patient care at hospitals. Investments in artificial intelligence are helping businesses to reduce costs, better serve customers, and gain competitive advantage in rapidly evolving markets. Titanium Intelligent Solutions, a global SaaS IoT organization, even saved one customer over 15% in energy costs across 50 distribution centers, thanks in large part to AI.  

To succeed with real-time AI, data ecosystems need to excel at handling fast-moving streams of events, operational data, and machine learning models to leverage insights and automate decision-making. Here, I’ll focus on why these three elements and capabilities are fundamental building blocks of a data ecosystem that can support real-time AI.

Real-time data and decisioning

First, a few quick definitions. Real-time data involves a continuous flow of data in motion. It’s streaming data that’s collected, processed, and analyzed on a continuous basis. Streaming data technologies unlock the ability to capture insights and take instant action on data that’s flowing into your organization; they’re a building block for developing applications that can respond in real-time to user actions, security threats, or other events. AI is the perception, synthesis, and inference of information by machines, to accomplish tasks that historically have required human intelligence. Finally, machine learning is essentially the use and development of computer systems that learn and adapt without following explicit instructions; it uses models (algorithms) to identify patterns, learn from the data, and then make data-based decisions.
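As a toy illustration of the streaming half of these definitions, the generator below consumes an event stream one element at a time and emits a continuously updated aggregate, rather than waiting for a complete batch of data at rest:

```python
from collections import deque

def rolling_mean(stream, window=3):
    """Process an unbounded event stream element by element, emitting a
    continuously updated windowed average -- the essence of stream
    processing as opposed to batch analysis of data at rest."""
    buf = deque(maxlen=window)  # only the most recent `window` events
    for value in stream:
        buf.append(value)
        yield sum(buf) / len(buf)
```

Each incoming event immediately produces an updated result, which is what lets downstream logic react while the data is still in motion.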

Real-time decisioning can occur in minutes, seconds, milliseconds, or microseconds, depending on the use case. With real-time AI, organizations aim to provide valuable insights during the moment of urgency; it’s about making instantaneous, business-driven decisions. What kinds of decisions are necessary to be made in real-time? Here are some examples:

Fraud: It’s critical to identify bad actors using high-quality AI models and data.

Product recommendations: It’s important to stay competitive in today’s ever-expanding online ecosystem with excellent product recommendations and aggressive, responsive pricing against competitors. Ever wonder why an internet search for a product reveals similar prices across competitors, or why surge pricing occurs?

Supply chain: With companies trying to stay lean with just-in-time practices, it’s important to understand real-time market conditions, delays in transportation, and raw supply delays, and adjust for them as the conditions are unfolding.
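As a toy example of real-time decisioning in the fraud case, a scorer might flag any transaction that deviates sharply from an account's recent history. Real systems score with trained models in milliseconds; this simple z-score rule only illustrates the shape of the decision:

```python
def flag_transaction(amount, history, z_threshold=3.0):
    """Flag a transaction whose amount deviates sharply from the account's
    recent history -- a toy stand-in for the model-driven scoring a real
    fraud system performs in real time."""
    if len(history) < 2:
        return False  # not enough history to judge
    mean = sum(history) / len(history)
    variance = sum((x - mean) ** 2 for x in history) / len(history)
    std = variance ** 0.5
    if std == 0:
        return amount != mean
    # Flag when the amount sits far outside the account's normal range.
    return abs(amount - mean) / std > z_threshold
```

The decision has to be made before the transaction completes, which is why the data feeding `history` must itself be fresh and trustworthy.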

Demand for real-time AI is accelerating

Software applications enable businesses to fuel their processes and revolutionize the customer experience. Now, with the rise of AI, this power is becoming even more evident. AI technology can autonomously drive cars, fly aircraft, create personalized conversations, and transform the customer and business experience into a real-time affair. ChatGPT and Stable Diffusion are two popular examples of how AI is becoming increasingly mainstream. 

With organizations looking for increasingly sophisticated ways to employ AI capabilities, data becomes the foundational energy source for such technology. There are plenty of examples of devices and applications that drive exponential growth with streaming data and real-time AI:  

- Intelligent devices, sensors, and beacons are used by hospitals, airports, and buildings, or even worn by individuals. Devices like these are becoming ubiquitous and generate data 24/7. This has also accelerated the adoption of edge computing, so compute and real-time decisioning can sit closer to where the data is generated.
- AI continues to transform customer engagements and interactions with chatbots that use predictive analytics for real-time conversations. Augmented and virtual reality, gaming, and the combination of gamification with social media leverage AI for personalization and enhanced online dynamics.
- Cloud-native apps, microservices, and mobile apps drive revenue with their real-time customer interactions.

These real-time data sources generate data streams that need fresh data and ML models to make accurate decisions. Data quality is crucial for real-time actions because decisions often can’t be taken back. Determining whether to close a valve at a power plant, offer a coupon to 10 million customers, or send a medical alert has to be dependable and on time. The need for real-time AI has never been more urgent or necessary.

Lessons not learned from the past

Over the past decade, organizations have put a tremendous amount of energy and effort into becoming data driven, yet many still struggle to achieve the ROI from data that they’ve sought. A 2023 New Vantage Partners/Wavestone executive survey highlights how being data-driven is not getting any easier: many blue-chip companies still struggle to maximize ROI from their plunge into data and analytics and to embrace a real data-driven culture:

- 19.3% report they have established a data culture
- 26.5% report they have a data-driven organization
- 39.7% report they are managing data as a business asset
- 47.4% report they are competing on data and analytics

Outdated mindsets, institutional thinking, disparate siloed ecosystems, applying old methods to new approaches, and a general lack of a holistic vision will continue to impact success and hamper real change. 

Organizations have balanced competing needs to make more efficient data-driven decisions and to build the technical infrastructure to support that goal. While big data technologies like Hadoop were used to get large volumes of data into low-cost storage quickly, these efforts often lacked the appropriate data modeling, architecture, governance, and speed needed for real-time success.

This resulted in complex ETL (extract, transform, and load) processes and difficult-to-manage datasets. Many companies today struggle with legacy software applications and complex environments, which leads to difficulty in integrating new data elements or services. To truly become data- and AI-driven, organizations must invest in data and model governance, discovery, observability, and profiling while also recognizing the need for self-reflection on their progress towards these goals.

Achieving agility at scale with Kubernetes

As organizations move into the real-time AI era, there is a critical need for agility at scale. AI needs to be incorporated into their systems quickly and seamlessly to provide real-time responses and decisions that meet customer needs. This can only be achieved if the underlying data infrastructure is unified, robust, and efficient. A complex and siloed data ecosystem is a barrier to delivering on customer demands, as it prevents the speedy development of machine learning models with accurate, trustworthy data.

Kubernetes is a container orchestration system that automates the management, scaling, and deployment of microservices. It’s also used to deploy machine learning models, data streaming platforms, and databases. A cloud-native approach with Kubernetes and containers brings scalability and speed with increased reliability to data and AI the same way it does for microservices. Real-time needs a tool and an approach to support scaling requirements and adjustments; Kubernetes is that tool and cloud-native is the approach. Kubernetes can align a real-time AI execution strategy for microservices, data, and machine learning models, as it adds dynamic scaling to all of these things. 
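As an illustrative sketch, a model-serving microservice might be deployed declaratively like this. The image name, labels, and resource figures are hypothetical placeholders, not a definitive configuration:

```yaml
# Minimal sketch: a model-serving microservice as a Kubernetes Deployment.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: fraud-model-serving
spec:
  replicas: 3                # Kubernetes keeps three copies running
  selector:
    matchLabels:
      app: fraud-model
  template:
    metadata:
      labels:
        app: fraud-model
    spec:
      containers:
      - name: scorer
        image: registry.example.com/fraud-scorer:1.4   # hypothetical image
        ports:
        - containerPort: 8080
        resources:
          requests:
            cpu: "500m"
            memory: 512Mi
```

Raising `replicas` (or attaching a HorizontalPodAutoscaler) scales the service without changing the application, which is the dynamic scaling the text describes.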

Kubernetes is a key tool to help do away with the siloed mindset. That’s not to say it’ll be easy. Kubernetes has its own complexities, and creating a unified approach across different teams and business units is even more difficult. However, a data execution strategy has to evolve for real-time AI to scale with speed. Kubernetes, containers, and a cloud-native approach will help. (Learn more about moving to cloud-native applications and data with Kubernetes in this blog post.)

Unifying your organization’s real-time data and AI strategies

Data, when gathered and analyzed properly, provides the inputs necessary for functional ML models. An ML model is an application, built around ML algorithms, that finds patterns and makes decisions when accessing datasets. Once ML models are trained and deployed, they help to more effectively guide decisions and actions that make the most of the data input. So it’s critical that organizations understand the importance of weaving together data and ML processes in order to make meaningful progress toward leveraging the power of data and AI in real time. From architectures and databases to feature stores and feature engineering, a myriad of variables must work in sync for this to be accomplished.

ML models need to be built, trained, and then deployed in real time. Flexible, easy-to-work-with data models are the oil that keeps the model-building engine running smoothly. ML models require data for testing and developing the model and for inference once the models are put into production (ML inference is the process of an ML model making calculations or decisions on live data).

Data for ML is made up of individual variables called features. Features can be raw data, or data that has been processed, analyzed, or otherwise derived. ML model development is about finding the right features for the algorithms. The ML workflow for creating these features is referred to as feature engineering, and the storage for these features is referred to as a feature store. Data and ML model development fundamentally depend on one another.
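A minimal sketch of the relationship between feature engineering and a feature store follows. The field names, derived features, and the dict-backed store are all illustrative; production feature stores add versioning, point-in-time lookups, and online/offline consistency:

```python
def engineer_features(raw_events):
    """Feature engineering: derive model-ready features from raw
    transaction events. Field names are illustrative."""
    amounts = [event["amount"] for event in raw_events]
    return {
        "txn_count": len(amounts),
        "avg_amount": sum(amounts) / len(amounts),
        "max_amount": max(amounts),
    }

# A feature store persists engineered features keyed by entity, so the
# same values serve both model training and real-time inference.
feature_store = {}

def publish_features(entity_id, raw_events):
    feature_store[entity_id] = engineer_features(raw_events)

def get_features(entity_id):
    return feature_store[entity_id]
```

Training jobs and live inference both read through `get_features`, which is what keeps the model seeing the same feature definitions in both settings.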

That’s why it is essential for leadership to build a clear vision of the impact of data-and-AI alignment—one that can be understood by executives, lines of business, and technical teams alike. Doing so sets up an organization for success, creating a unified vision that serves as a foundation for turning the promise of real-time AI into reality.

A real-time AI data ingestion platform and operational data store

Supporting real-time data and machine learning models is a matter of managing data flows and ML process flows. Machine learning models require quality data for model development and for decisioning once they are put in production. Real-time AI needs the following from a data ecosystem:

- A real-time data ingestion platform for messaging, publish/subscribe (“pub/sub” asynchronous messaging services), and event streaming
- A real-time operational data store for persisting data and ML model features
- An aligned data ingestion platform for data in motion and an operational data store that work together to reduce the data complexity of ML model development
- Change data capture (CDC) that can send high-velocity database events back into the real-time data stream, into analytics platforms, or to other destinations
- An enterprise data ecosystem architected to optimize data flowing in both directions
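The pub/sub and CDC pieces of this list can be sketched in a few lines of Python. The in-process broker below is a toy stand-in for a real messaging platform such as Pulsar or Kafka, and the topic and field names are illustrative assumptions:

```python
import queue

# Minimal in-process stand-in for a pub/sub broker; a real deployment
# would use a streaming platform such as Pulsar or Kafka.
class Broker:
    def __init__(self):
        self.topics = {}

    def subscribe(self, topic):
        q = queue.Queue()
        self.topics.setdefault(topic, []).append(q)
        return q

    def publish(self, topic, message):
        for q in self.topics.get(topic, []):
            q.put(message)

broker = Broker()
ml_stream = broker.subscribe("orders")

# Change data capture: each database write is re-published as an event,
# so downstream ML pipelines see changes in near real time.
def cdc_write(db, key, value):
    db[key] = value
    broker.publish("orders", {"op": "upsert", "key": key, "value": value})

db = {}
cdc_write(db, "order-1", {"total": 42})
event = ml_stream.get_nowait()
print(event)  # {'op': 'upsert', 'key': 'order-1', 'value': {'total': 42}}
```

The point of the sketch is the shape of the flow: writes land in the operational store and simultaneously fan out to every subscriber, so analytics and ML consumers never have to poll the database.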


Let’s start with the real-time operational data store, as this is the central data engine for building ML models. A modern real-time operational data store excels at integrating data from multiple sources for operational reporting, real-time data processing, and support for machine learning model development and inference from event streams. Working with the real-time data and the features in one centralized database environment accelerates machine learning model execution.

Data that takes multiple hops through databases, data warehouses, and transformations moves too slowly for most real-time use cases. A modern real-time operational data store (Apache Cassandra® is a great example of a database used for real-time AI by the likes of Apple, Netflix, and FedEx) makes it easier to integrate data from real-time streams and CDC pipelines.
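The “one store, no extra hops” idea can be illustrated with a toy operational store that upserts records arriving from two sources into a single keyed view. The class, keys, and field names are illustrative assumptions, not an actual Cassandra API:

```python
# Sketch of a single operational store ingesting from two sources
# (a real-time event stream and a CDC feed) so model code reads one
# place instead of hopping across databases and warehouses.
class OperationalStore:
    def __init__(self):
        self.rows = {}

    def upsert(self, key, fields):
        # Merge new fields into the existing row for this key.
        self.rows.setdefault(key, {}).update(fields)

    def get(self, key):
        return self.rows.get(key, {})

store = OperationalStore()
store.upsert("cust-7", {"last_click": "2023-01-05"})  # from the event stream
store.upsert("cust-7", {"lifetime_value": 310.0})     # from a CDC pipeline
print(store.get("cust-7"))
```

A model serving a real-time prediction for customer `cust-7` would make a single point read against this store rather than joining across systems at request time.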

Apache Pulsar is an all-in-one messaging and streaming platform, designed as a cloud-native solution and a first-class citizen of Kubernetes. DataStax Astra DB, my employer’s database-as-a-service built on Cassandra, runs natively in Kubernetes. Astra Streaming is a cloud-native managed real-time data ingestion platform that completes the ecosystem with Astra DB. These stateful data solutions bring alignment to applications, data, and AI.

The operational data store needs a real-time data ingestion platform with the same type of integration capabilities, one that can ingest and integrate data from streaming events. The streaming platform and data store will be constantly challenged with new and growing data streams and use cases, so they need to be scalable and work well together. This reduces the complexity for developers, data engineers, SREs, and data scientists to build and update data models and ML models.  

A real-time AI ecosystem checklist

Despite all the effort that organizations put into being data-driven, the New Vantage Partners survey mentioned above highlights that organizations still struggle with data. Understanding the capabilities and characteristics required for real-time AI is an important first step toward designing a data ecosystem that’s agile and scalable. Here is a set of criteria to start with:

- A holistic strategic vision for data and AI that unifies an organization
- A cloud-native approach designed for scale and speed across all components
- A data strategy that reduces complexity and breaks down silos
- A data ingestion platform and operational data store designed for real time
- Flexibility and agility across on-premises, hybrid-cloud, and cloud environments
- Manageable unit costs as the ecosystem grows

Wrapping up

Real-time AI is about making data actionable with speed and accuracy. Most organizations’ data ecosystems, processes, and capabilities are not prepared to build and update ML models at the speed the business requires for real-time data. Applying a cloud-native approach to applications, data, and AI improves scalability, speed, reliability, and portability across deployments. Every machine learning model is underpinned by data.

A powerful data store, along with enterprise streaming capabilities, turns a traditional ML workflow (train, validate, predict, retrain…) into one that is real-time and dynamic, where the model augments and tunes itself on the fly with the latest real-time data.
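That predict-then-update loop can be sketched with a toy online learner. The model, learning rate, and synthetic stream below are illustrative assumptions to show the mechanics, not a production technique:

```python
# Toy online learner: fit a line y = w*x + b one observation at a time,
# approximating the "tune itself on the fly" loop described above.
class OnlineLinearModel:
    def __init__(self, lr=0.1):
        self.w = 0.0
        self.b = 0.0
        self.lr = lr

    def predict(self, x):
        return self.w * x + self.b

    def update(self, x, y):
        # One stochastic gradient step on the squared error for this point.
        err = self.predict(x) - y
        self.w -= self.lr * err * x
        self.b -= self.lr * err

model = OnlineLinearModel()
stream = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)] * 50  # live (x, y) pairs, y = 2x
for x, y in stream:
    model.update(x, y)

print(round(model.predict(4.0), 2))  # close to 8.0, since the data follows y = 2x
```

In a real-time ecosystem the `stream` would come from the ingestion platform and the learned parameters would be persisted to the operational data store, so every serving instance sees the freshest model.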

Success requires defining a vision and execution strategy that delivers speed and scale across developers, data engineers, SREs, DBAs, and data scientists. It takes a new mindset and an understanding that all the data and ML components in a real-time data ecosystem have to work together for success. 

Special thanks to Eric Hare at DataStax, Robert Chong at Employers Group, and Steven Jones of VMware for their contributions to this article.

Learn how DataStax enables real-time AI.

About George Trujillo:

George is principal data strategist at DataStax. Previously, he built high-performance teams for data-value driven initiatives at organizations including Charles Schwab, Overstock, and VMware. George works with CDOs and data executives on the continual evolution of real-time data strategies for their enterprise data ecosystem. 


The Communications Platform as a Service (CPaaS) market is big and growing fast. Already worth more than $8 billion, the market is predicted by analysts to nearly triple in size to $22 billion by 2025. CPaaS is a cloud platform that exposes communications functions such as SMS, voice, video, and IP chat via programmable application programming interfaces (APIs) so that developers can more easily code these functions into applications, workflows, and systems.

Recently, Ericsson’s CTO Erik Ekudden interviewed Vinod Lala, Chief Strategy Officer at Vonage, on the opportunities on offer at the nexus of CPaaS and 5G. According to Lala, one of the key drivers behind the growth of CPaaS is business-to-customer engagement through mobile apps.

“As customers increasingly…‘live’ in their mobile phones, so businesses have reacted by communicating with them that way, via SMS, chat, or now even video and voice,” he said.

CPaaS simplifies the deployment of multi-modal communications integration and is therefore an ideal solution for both digital native brands and non-native brands looking to reach customers through apps.

The second key driver behind the evolution of the CPaaS market is the arrival of 5G APIs. As Ekudden explained, being able to connect to 5G is “very handy” for developers and enterprises as it provides access to a range of pre-built tools and capabilities including Quality of Service (QoS), network slicing, advanced security features, device status information, and precise positioning.

As Ekudden noted: “There’s a real opportunity to look at what 5G already brings and how that can fuel and enhance capabilities in existing applications and enterprises.”

5G APIs are therefore profoundly important to the development of CPaaS, unlocking network controls and thereby offering a whole new dimension to developers for enhanced use cases.

According to Lala, applications that rely on low latency, such as in gaming or telemedicine, will benefit the most from these capabilities. Indeed, many of the network features synonymous with 5G were built specifically to support such advanced applications.

5G APIs will therefore provide what Ekudden describes as a “missing gearbox” for developers to create new CPaaS use cases on a global scale.

5G APIs are helping to create win-win collaborations: developers on one side have the freedom to innovate, while communications service providers on the other can create revenue streams through the network value they reveal to developers.

Ultimately, this evolution is about exposing and enabling new capabilities guided by the needs of enterprises and third-party developer communities, closing the gap between network capabilities and the needs of applications, enterprises, and consumers. CPaaS promises to unlock new layers of innovation both within enterprises and the developer ecosystem that supports them, which in turn will deliver better customer experiences and enhanced value for all.

To learn more about how 5G is empowering and extending the expanding CPaaS market, you can watch the fireside chat in full here.


Nowadays, the world seems to experience once-in-a-century storms almost monthly. These cataclysmic weather events often cause extensive property damage, including major disruptions to the power grid that can cripple IT systems. More commonly, human error and power fluctuations can be just as costly and devastating to continued IT service delivery. To avoid costly outages and data loss, businesses must ensure continued operations with power protection delivered by a smart solution like Dell VxRail and the APC by Schneider Electric Smart UPS with PowerChute Network Shutdown software.

If the outage is prolonged, the Dell-APC solution enables remote shut down to protect IT systems and ensure a non-disruptive restart.

When the power goes out, gracefully shutting down connected IT devices — like servers, storage devices, and hyper-converged infrastructure (HCI) — helps prevent further damage to those devices. It also prevents loss of business data and damage to enterprise workloads and helps ensure a smoother process for restarting and getting the business back up and running.

Why is this so important? Because the cost of downtime can be catastrophic. Estimates of IT service downtime costs range from $80,000 an hour on the lower end of the scale to $5 million an hour for larger enterprises. And that doesn’t account for damage to business reputation — whether a retailer loses its POS systems, or a larger organization loses its online customer service and sales systems.


With so much at stake, a UPS with remote management capabilities is critical to protect the HCI system and the workloads it supports. HCI systems, like Dell VxRail, have become the backbone for data centers and larger organizations. HCI has historically been used to support specific workloads like virtual desktops (VDI). However, it has emerged as a workhorse for running mission-critical workloads that require elevated levels of performance and availability. Enterprises should consider deploying an intelligent UPS like the Dell-APC PowerChute solution to protect those mission-critical workloads running on HCI.

While HCI is also well-suited for supporting multiple sites, losing power at remote sites can still cause system damage and data corruption. To prevent this type of damage, organizations must install a UPS at every HCI installation. Ideally, the UPS will keep systems operating throughout an outage. However, if an outage lasts too long, businesses must have a process in place to ensure an automated graceful shutdown, followed by a sequenced infrastructure restart. 

To gracefully shut down the HCI, the UPS must be able to communicate over a distributed network. Then it has to initiate a step-by-step restart sequence to ensure hardware and data protection. The automated restart should begin once power is restored. This automated remedy for power interruption can save time and money — and, ultimately, minimize downtime.
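The shutdown-then-sequenced-restart logic described above can be sketched as follows. The device names and their ordering are illustrative assumptions, not actual PowerChute configuration:

```python
# Sketch of ordered shutdown and restart: dependents (VMs) stop first and
# start last; core infrastructure stops last and starts first.
SHUTDOWN_ORDER = ["virtual-machines", "hci-nodes", "storage", "network-switch"]

def graceful_shutdown(order, stop_fn):
    for device in order:            # dependents first, infrastructure last
        stop_fn(device)

def sequenced_restart(order, start_fn):
    for device in reversed(order):  # infrastructure first, dependents last
        start_fn(device)

log = []
graceful_shutdown(SHUTDOWN_ORDER, lambda d: log.append(("stop", d)))
sequenced_restart(SHUTDOWN_ORDER, lambda d: log.append(("start", d)))
print(log[0], log[-1])  # VMs are the first to stop and the last to start
```

The design choice the sketch captures is that restart is the exact reverse of shutdown, which is what lets an automated remedy bring the stack back without operator intervention once power is restored.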

Integrated systems like Dell VxRail HCI and the APC by Schneider Electric Smart UPS with PowerChute Network Shutdown software can help businesses simplify and automate the process during catastrophic power outages and ensure business continuity by enabling graceful shutdown and the ability to simply move virtual machines to another system. This level of network protection acts as insurance against catastrophic downtime that could otherwise lead to the loss of all IT services.  

To learn more about how integrated IT solutions like Dell VxRail and the APC by Schneider Electric Smart UPS with PowerChute Network Shutdown software protect business data assets and ensure business continuity, please visit us here.

Watch this video to learn more:


Kidney diseases are a leading cause of death in the US, claiming more than a quarter million lives each year. Roughly 37 million people in the US are afflicted with chronic kidney disease (CKD), although most are undiagnosed. Left untreated, CKD may advance and lead to more serious medical issues such as high blood pressure, diabetes, and complete kidney failure, or End Stage Renal Disease (ESRD).

The solution for many at that extreme stage is dialysis or a kidney transplant — both of which have a significant impact on the quality of life. Every 24 hours, 360 people in the US begin dialysis treatment for kidney failure, according to the CDC.

One organization at the forefront of clinical care and innovation is DaVita, one of the largest providers of kidney care services, with more than 2,800 outpatient dialysis centers in the US and close to 400 outpatient dialysis centers in 11 countries worldwide. This year alone, the company has served nearly 200,000 patients in the US at its outpatient centers and is actively pushing the kidney care industry to adopt high-quality standards of care for all patients.

While treating ESRD patients through its large network of dialysis centers is the Fortune 200 company’s primary business, the company is also involved in efforts to reduce CKD cases and the need for dialysis treatment and transplants as well. Here, IT is playing a significant role.

“We’ve been working to enable world-class integrated care at scale and transform the delivery of care at each point of a patient’s journey,” says Alan Cullop, SVP & CIO at DaVita.  “Our digital transformation strategy is centered around establishing a consumer-oriented model that helps us customize chronic care management based on the ever-changing conditions of each patient.”

The foundation for DaVita’s digital transformation will be a new technical platform and clinical documentation system that “allows for deeper integration across our applications and improves our ability to capture data throughout the patient’s care,” Cullop says, noting that development has been a multi-year process and deployment is now underway and will be completed in 2023. “We’re providing our physician partners, clinical teams, and patients with digital capabilities that support our efforts to proactively improve the quality of care our patients receive.”  

DaVita also provides millions of dollars in funding to address ancillary issues related to kidney disease sufferers, such as food insecurity and even support to patients impacted by environmental catastrophes such as hurricanes and earthquakes.

In this CIO Executive Council Future Forward podcast, we talk with DaVita’s technology chief about the company’s plan to expand activities from CKD treatment to disease prevention. Cullop also talks about DaVita’s strategies for AI and data analytics, as well as the importance of passion and culture as drivers of technology innovation.

The following are edited excerpts from that discussion. Click on the podcast players below to listen to Parts 1 & 2 of the conversation.

Tim Scannell:  How much of a role do technologies like data analytics and AI play in DaVita’s overall technology and business strategy?

Alan Cullop: We have a very large and very focused effort on AI and data analytics. There’s so much power in data and the insights doctors get early in the care continuum and how we engage with patients even before they are in dialysis. We’re using predictive algorithms to identify signs of undiagnosed kidney disease. We’re also doing a lot with population health management, performing more comprehensive patient assessments, managing our high-risk patients, stabilizing transitions of care, optimizing outpatient care, and really trying to call out things that help us understand disease progression.

We’re looking at a variety of sources of data, putting it in data lakes, and then using that to drive predictive models that really help our doctors and our care teams to stratify our patients’ risk by taking actions at the right time.

A lot of innovation initiatives right now are small and more closely aligned with tangible results. While this may be a great short-term strategy, how might this impact the concept of innovation in general as we move forward?

Alan Cullop: Innovation usually starts with a problem or something we’re trying to solve. And with AI, sometimes you stumble on it unexpectedly, as a search for the unknown unknown. The trick is to not let the outcomes and particular things we’re trying to solve get in the way of innovative thinking or block our sense of what’s possible.

Sometimes smaller, focused innovation efforts do lead to much bigger ideas. Ideation sessions, innovation sessions, and hackathons have led to some interesting insights that we’ve built upon and that can be applied across the board. We encourage our teams to really embrace it, but we’re going to make mistakes. One of the better ways to learn is by making a mistake: now you know more than you did before, and you know how to perhaps not repeat it.

IT culture is important today, especially in retaining and recruiting talent. As companies shift to a new normal of hybrid working, do you think there’ll be a significant impact on traditional cultural structures?

Alan Cullop: I think there are three fundamental issues or points that help build and sustain a strong culture. First, I am very excited by the increased focus on diversity, inclusion, and belonging in our society. We’ve been very focused on these issues for quite some time and really interested in embracing different perspectives. We’ve made great progress, but it’s one of those things that I would say your work is never done. I’m proud to be a part of the conversation and proud of the engagement level of our teammates.

Second, I think flexibility is crucial. We need to understand how and where our teammates want to work and which roles are conducive to remote versus hybrid models, and find ways to keep our engagement high. For us, that’s a balance. We’re exploring more flexible work arrangements and talking to teammates about how and where they want to work to meet their needs.

Finally, leadership needs to be visible and consistent in terms of demonstrating the importance of culture and engagement. It’s easy to talk about culture, but it’s certainly harder to carve out time to be present and to genuinely engage with teammates on culture. Everyone looks to leadership to be role models and set examples. So, it’s important that we take the time to walk the walk and not just talk the talk.

In earlier conversations, you talked about ‘the power of purpose.’ Can you tell me just what this means to you and how it comes into play at DaVita?

Alan Cullop: I think it’s super important and something we take very seriously. We talk about the power of purpose all the time in healthcare and what it means to stay connected to our patients and their families, and what we can do to really improve their quality of life and in many cases save lives. We bake this into our IT strategy, our team meetings, and our engagement approaches. I love the innovation and enablement that we bring. It personally gives me a lot of energy and passion and a sense of purpose. We’re doing something and we’re giving back to others, which I think for a lot of us helps bring a true sense of purpose.


This article was co-authored by Duke Dyksterhouse, an Associate at Metis Strategy.

A lobby television isn’t all that uncommon or remarkable for a $4.5 billion company, but what’s on the 85-inch screen in the lobby of Generac’s headquarters certainly is. Rather than the predictable advertisements or staged photos featuring happy employees, it’s a demo of the energy management firm’s latest innovation, called PowerINSIGHTS.

It’s an interactive platform. Zip and click and zoom about a map of North America bespeckled with glowing, Generac-orange dots, and as you dance about, watch the handful of key metrics in the UI change to reflect the region examined: UtilityScore, OpportunityScore, PowerScore. Simple metrics, but dense with information, telling not only of any one region’s energy landscape but of the entire energy market’s trajectory. 


“Every day that I come into the office,” explains Tim Dickson, CIO of Generac, “I see people I’ve never met, people I’ve never even seen, standing around the demo screen in the lobby. And ideas for how to improve it are pouring in. Other business units, like our subsidiary Ecobee, have already gotten involved. They’ve added their assets to the platform.” 

In the world of energy management, Generac’s PowerINSIGHTS platform is a riveting achievement in the race to extract an unprecedented level of intelligence from power grids, which have become more difficult to manage with the rise of Distributed Energy Resources (DERs) like solar, EVs, and, of course, Generac generators. DERs are hard to visualize as they come in many forms and run on unpredictable schedules. PowerINSIGHTS changes that. Its glowing orange dots represent the once “hidden” DERs, and its accompanying metrics reveal how such energy in a geography is managed, used, distributed, and so on. 

“This platform brings an incredible amount of unseen energy into play,” says Amod Goyal, one of Generac’s development experts and the manager of the PowerINSIGHTS implementation. “We can see where there’s idle power that a customer might want to sell and where we can redistribute it to help people in need, like after a hurricane. We can do this all without providing any external access to customer data, and we never disclose any personal identifiable information.” 

PowerINSIGHTS’ value and novelty may make you think the platform is the premeditated outcome of an arduous program. Its display in the Generac lobby encourages that suspicion. But PowerINSIGHTS is the unexpected outcome of a hackathon led by Tim and his IT organization. Even more notably, the hackathon was one of Tim’s first initiatives after taking the helm as CIO in August of 2020. 

Conventional wisdom suggests CIOs should master IT fundamentals before they get innovative. The helpdesk must run like a German train station, the Wi-Fi can’t drop (ever), and the conference room must be easier to navigate than an iPhone. While getting the basics right is table stakes for any CIO, if you wait to innovate until your peers commend you for doing so, bring a comfy chair because you’re going to be waiting for a while. Additionally, the master-the-rules-before-you-break-them philosophy is exceedingly narrow. Who made Wi-Fi or conference-room navigation the rule? The CIO is meant to enable the business, and there are many ways to do that beyond ensuring network uptime.  

The best CIOs want to rattle their departments, change their organizations’ stars, and lunge at the big ideas white-boarded in a frenzy of inspiration. But, as is often the case, what if they don’t have the resources, the time, the money, or the mandate?  

Do it anyway, Tim says. You might surprise yourself. On the heels of the successful hackathon and PowerINSIGHTS development, he offered three points of advice and encouragement for technology leaders who want to drive innovation, even if they aren’t sure they are ready: You have more at your disposal than you think, your people are more talented than you know, and you will be known for what you do. 

You have more at your disposal than you think 

Despite what some IT leaders think, innovation is not reserved only for the Googles and the Teslas of the world. Additionally, not all innovative organizations need to be built from scratch. You don’t have to invest in a new kitchen to cook something new; sometimes you need only to step back and consider how you might differently combine the ingredients you already have.  

PowerINSIGHTS is a perfect example of this. No element of the platform is all that novel, Tim says, and Generac had the underlying data for years. What’s more, the geospatial visualization of that data was made possible by a feature of Microsoft Azure that had been hiding in plain sight. The innovation came from a new combination of these elements.  

There may also be significant change agents in your broader ecosystem. For example, to build momentum behind his hackathon, Tim recruited vendors to sponsor it. Microsoft, Databricks, and others sent in experts a month ahead of time to upskill Generac’s workforce. Suddenly, IT employees found themselves learning the things that interested them and developing the skills they wanted to develop. Other departments, feeling the excitement, jumped into the mix and IT employees found themselves solving problems alongside their peers from Connectivity and Engineering, a demonstration of the business partnership CIOs dream of. 

Often, the best inventions seem obvious in retrospect. Keep that in mind when you think your department lacks the resources to build something new. Tim recruited partners to support the hackathon, yes, but what made the difference was Tim’s push to give employees the chance to innovate with what they had. Without that push, it’s likely that the PowerINSIGHTS idea would not have seen the light of day.  

Your people are more talented than you know 

As corporate IT departments evolve, so too are the qualities their leaders seek in candidates. Where nuts-and-bolts, black-and-white problem-solving once may have sufficed, skills like ownership, autonomy, creativity, big-picture thinking, and continuous learning are quickly becoming essential. Because many IT leaders have yet to see their current employees exhibit these traits, they tend to think they lack them altogether. Therefore, they decide they cannot transform their department or make it innovative until they first hire the “right” people. Since that often requires a budget they don’t have, it’s a good excuse to stand still. 

Oftentimes, however, employees already have the autonomy, creativity, and all the attributes that companies covet; they just lack an avenue to showcase those attributes. As Tim predicted it would, the hackathon opened that avenue to Generac’s employees. He elaborated on this insight last year in Metis Strategy’s Digital Symposium: “We had 16 teams participate, 70 people, and we’ve implemented over half of [their] ideas in production deployment. What that showed me is that there was a significant amount of pent-up demand…a significant desire for folks who aspired to do more…and show and present their ideas…in a form that they didn’t necessarily have before.” 

The hackathon revealed such an explosive appetite for innovation that, in its wake, Tim and his colleagues stood up a digital center of excellence (COE) as a central muscle for nurturing that appetite on an ongoing basis. The COE helps anyone in the organization, regardless of their position or business unit, develop their ideas with emerging technologies. “It allows those people with the ideas an avenue to bring them to light,” explained Tim. “When you have that type of engagement from team members, where they feel their voices are being heard, that’s a model that can scale…so we’ve embraced that here at Generac.”

You don’t always need better talent to innovate. Sometimes, you need to innovate to find out how good your talent is. That’s the paradox that drove Tim to host his hackathon in the first place. He wanted to learn who and what he was working with. Dickson likens it to karaoke: “You just don’t know who’s going to hop up, grab the mic, and just wail it out,” he says. “It’s one of the most inspiring things to witness. But you have to play a tune worth singing to.” 

You are known for what you do 

Aristotle once wrote, “We are what we repeatedly do.” Tim’s rendition is, “You will be known for what you do.” In either case, the emphasis is on the “do.” Tim’s gentle reminder to his employees, and his advice to CIOs, is that the most eloquent memos and best-laid plans are meaningless if there’s no action behind them. Don’t try to convince anyone that you or your department are innovators or wait for permission to become innovators. Be innovators. 

The key is to get to something real, however rough. If the idea is even halfway decent, says Tim, that will change everything. And PowerINSIGHTS is the perfect example. Prior to the hackathon, Tim and his team could have frittered away time around the water cooler, spitballing the merits of such an innovation to anyone who would listen. But they didn’t. Instead, they built it, crude as the first iteration may have been. At first the user interface was spartan and the user experience clunky, but no matter.

“Once we had something people could see and touch, the whole mood shifted,” Tim said. “The CEO actually… proposed some of the first use cases for PowerINSIGHTS and has remained very involved in the project since.”  

That initial action on the innovation front led to real transformation for legacy processes and technologies as well. In Generac’s case, one of the biggest shifts has been an embrace of cloud infrastructure. “Everything was on-prem when I started. But cloud will be essential to supporting PowerINSIGHTS in the long run, so we’ve stood up a cloud-first infrastructure. And of course, the benefits of that have reached beyond PowerINSIGHTS.”  

We preach often in this column that you don’t have to have all the answers before embarking on an innovation initiative. Tim and PowerINSIGHTS are clear evidence of that. His team had a plan, of course, but they didn’t wait for anyone’s permission to leap. CIOs hoping to reposition their organizations need not wait. By engaging teams across the organization and acting quickly, you will likely discover new opportunities for innovation, energize a team of talented and passionate people, and win respect, quickly, from your peers. 
