By George Trujillo, Principal Data Strategist, DataStax

I recently had a conversation with a senior executive who had just landed at a new organization. He had been trying to gather new data insights but was frustrated at how long it was taking. (Sound familiar?) After he walked his executive team through the data hops, flows, integrations, and processing across different ingestion software, databases, and analytical platforms, they were shocked by the complexity of their current data architecture and technology stack. It was obvious that things had to change for the organization to be able to execute at speed in real time.

Data is a key component of making accurate and timely recommendations and decisions in real time, particularly when organizations try to implement real-time artificial intelligence. Real-time AI involves processing data to make decisions within a given time frame; the window can be minutes, seconds, or milliseconds, depending on the use case. Real-time AI brings together streaming data and machine learning algorithms to make fast, automated decisions; examples include recommendations, fraud detection, security monitoring, and chatbots.

A whole lot has to happen behind the scenes to succeed and get tangible business results. The underpinning architecture needs to include event-streaming technology, high-performing databases, and machine learning feature stores. All of this needs to work cohesively in a real-time ecosystem and support the speed and scale necessary to realize the business benefits of real-time AI.
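To make these moving parts concrete, here is a minimal Python sketch of the loop a real-time AI service runs: consume an event from the stream, enrich it with stored features, score it, and act within a latency budget. The event fields, feature names, model logic, and budget are all invented for illustration and simply stand in for the streaming, database, and feature-store components described above.

```python
import random
import time

LATENCY_BUDGET_MS = 50  # assumed per-decision budget; real budgets vary by use case

def next_event():
    """Stand-in for a consumer pulling the next message off an event stream."""
    return {"user_id": random.randint(1, 5), "amount": round(random.uniform(1, 500), 2)}

# Stand-in for an ML feature store keyed by entity ID.
FEATURE_STORE = {uid: {"avg_amount_30d": random.uniform(20, 200)} for uid in range(1, 6)}

def score(event, features):
    """Toy model: flag purchases far above the user's recent average."""
    return 1.0 if event["amount"] > 3 * features["avg_amount_30d"] else 0.0

def handle(event):
    start = time.perf_counter()
    features = FEATURE_STORE.get(event["user_id"], {"avg_amount_30d": 0.0})
    action = "review" if score(event, features) >= 1.0 else "approve"
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"user={event['user_id']} amount={event['amount']} "
          f"action={action} latency={elapsed_ms:.2f}ms (budget {LATENCY_BUDGET_MS}ms)")

for _ in range(5):
    handle(next_event())
```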

It isn’t easy. Most current data architectures were designed for batch processing with analytics and machine learning models running on data warehouses and data lakes. Real-time AI requires a different mindset, different processes, and faster execution speed. In this article, I’ll share insights on aligning vision and leadership, as well as reducing complexity to make data actionable for delivering real-time AI solutions.

A real-time AI north star

More than once, I’ve seen senior executives completely aligned on mission while their teams fight subtle yet intense wars of attrition across different technologies, silos, and beliefs on how to execute the vision.

A clear vision for executing a real-time AI strategy is a critical step to align executives and line-of-business leaders on how real-time AI will increase business value for the organization.

The execution plan must come from a shared, transparent vision that defines methodologies, technology stacks, scope, processes, cross-functional impacts, resources, and measurements in sufficient detail for cross-functional teams to collaborate toward operational goals.

Machine learning models (algorithms that comb through data to recognize patterns or make decisions) rely on the quality and reliability of data created and maintained by application developers, data engineers, SREs, and data stewards. How well these teams work together will determine the speed at which they deliver real-time AI solutions. As real-time AI becomes pervasive across the organization, several questions begin to arise:

How are cross-functional teams enabled to support the speed of change, agility, and quality of data for real-time AI as ML models evolve?
What level of alerting, observability, and profiling can be counted on to ensure the business trusts the data?
How do analysts and data scientists find, access, and understand the context around real-time data?
How are data, process, and model drift managed for reliability? Downstream teams can create strategy drift without a clearly defined and managed execution strategy; is the strategy staying consistent, evolving, or beginning to drift? (A minimal data-drift check is sketched after this list.)
Real-time AI is a science project until benefits to the business are realized. What metrics are used to understand the business impact of real-time AI?
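The drift question above lends itself to simple automation. Below is a minimal sketch, assuming a single numeric feature and a stored training-time baseline (both invented for illustration), of the kind of mean-shift check that alerting and observability tooling could run before a model's inputs quietly change underneath it:

```python
import statistics

def drift_score(baseline, current):
    """How many baseline standard deviations the current mean has moved."""
    base_mean = statistics.mean(baseline)
    base_std = statistics.stdev(baseline) or 1e-9  # guard against a zero-variance baseline
    return abs(statistics.mean(current) - base_mean) / base_std

# Hypothetical feature values: training-time baseline vs. the last hour in production.
training_window = [102, 98, 105, 99, 101, 97, 103, 100]
production_window = [131, 127, 135, 129, 133, 128, 130, 132]

score = drift_score(training_window, production_window)
if score > 3.0:  # assumed alert threshold; tune per feature
    print(f"ALERT: feature mean drifted {score:.1f} standard deviations from baseline")
else:
    print(f"OK: drift score {score:.1f}")
```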

As scope increases, so does the need for broad alignment

The growth of real-time AI in the organization affects execution strategy. New projects or initiatives, such as adding intelligent devices for operational efficiency, improving real-time product recommendations, or opening new real-time business models, tend to be executed at an organization’s edge—by specialized experts, evangelists, and other individuals who innovate.

The edge is away from the business center of gravity—away from entrenched interests, vested political capital, and the traditional way of thinking.

The edge has less inertia, so it’s easier to facilitate innovation, new ways of thinking, and approaches that are novel compared to an organization’s traditional lines of business, institutional thinking, and existing infrastructure. Business transformation occurs when innovation at the edge can move into the center lines of business such as operations, e-commerce, customer service, marketing, human resources, inventory, and shipping/receiving.

A real-time AI initiative is a science project until it demonstrates business value. Tangible business benefits, such as increased revenue, reduced costs through operational efficiency, and better decisioning, must be shared with the business.

Expanding AI from the edge into the core business units requires continuous effort in risk and change management, demonstrating value and strategy, and strengthening the culture around data and real-time AI. AI should not move deeper into the core of an organization without metrics and results that demonstrate the business value achieved at the current level. Business outcomes are the currency for AI to grow in an organization.

A real-time data platform

Consider the current state of most data ecosystems compared with the real-time data stack necessary to drive real-time AI success.


Leaders face challenges in executing a unified and shared vision across these environments. Real-time data doesn’t exist in silos; it flows in two directions across a data ecosystem. The data used to train ML models may exist in memory caches, the operational data store, or analytic databases. Data must get back to the source to provide instructions to devices or recommendations to a mobile app. A unified data ecosystem enables this in real time.


Within the real-time data ecosystem, the heart of real-time decisioning is made up of the real-time streaming data, the ML feature store, and the ML feature engine. Reducing complexity here is critical.


Data for real-time decisioning flows in both directions across data sources, streaming data, databases, analytic data platforms, and the cloud. Machine learning features contain the data used to train machine learning models and serve as inference data when the models run in production. Models that make decisions in real time require an ecosystem that supports the speed and agility to update existing models and put new models into production across these data dimensions.
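A practical consequence of the paragraph above is that the same feature logic should feed both training and inference. Here is a minimal sketch, with an invented event schema and a deliberately simple feature, of defining a feature once and reusing it both to build offline training rows and to compute the online, inference-time value:

```python
def purchase_count_feature(history, user_id):
    """Feature: number of prior purchases by this user (deliberately simple)."""
    return sum(1 for e in history if e["user_id"] == user_id)

# Historical events (offline), used to build training rows.
historical_events = [
    {"user_id": "u1", "amount": 20}, {"user_id": "u2", "amount": 15},
    {"user_id": "u1", "amount": 75}, {"user_id": "u1", "amount": 30},
]

training_rows = []
seen = []
for event in historical_events:
    feature = purchase_count_feature(seen, event["user_id"])
    training_rows.append({"purchase_count": feature, "label_amount": event["amount"]})
    seen.append(event)
print("training rows:", training_rows)

# Live event (online): the identical function produces the inference-time feature value.
live_event = {"user_id": "u1", "amount": 42}
print("online feature:", purchase_count_feature(historical_events, live_event["user_id"]))
```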


A real-time data ecosystem includes two core components: the data ingestion platform that receives real-time messages and event streams, and the operational data store that integrates and persists the real-time events, operational data, and the machine learning feature data.  These two foundational cores need to be aligned for agility across the edge, on-premises, hybrid cloud, and multi-vendor clouds. 
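As a loose illustration of those two cores working together, and not a description of any particular product, the sketch below uses Python's built-in sqlite3 module as a stand-in operational store: ingested events are persisted, a machine learning feature derived from them lives in the same store, and the serving path reads it back.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for an operational data store
conn.execute("CREATE TABLE events (user_id TEXT, amount REAL, ts INTEGER)")
conn.execute("CREATE TABLE features (user_id TEXT PRIMARY KEY, total_spend REAL)")

# "Ingestion": persist incoming real-time events.
incoming = [("u1", 19.99, 1), ("u2", 5.00, 2), ("u1", 42.50, 3)]
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", incoming)

# Derive and persist ML feature data alongside the operational data.
conn.execute("""
    INSERT OR REPLACE INTO features
    SELECT user_id, SUM(amount) FROM events GROUP BY user_id
""")

# Serving path: look up the feature when a decision is needed.
row = conn.execute("SELECT total_spend FROM features WHERE user_id = ?", ("u1",)).fetchone()
print("total_spend feature for u1:", row[0])
```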

Complexity from disparate data platforms cannot support the speed and agility at which data must move for real-time AI. Changing criteria, new data, and evolving customer conditions can cause machine learning models to get out of date quickly. The data pipeline flows across memory caches, dashboards, event streams, databases, and analytical platforms that have to be updated, changed, or infused with new data criteria. Complexity across the data ecosystem impacts the speed at which these updates can be performed accurately.

A unified, multi-purpose data ingestion platform and operational data store greatly reduce the number of technology languages teams must speak and the complexity of working with real-time data flows across the ecosystem. A unified stack also improves the ability to scale real-time AI across an organization. As mentioned earlier, reducing complexity also improves the cohesiveness of the different teams supporting the real-time data ecosystem.

New real-time AI initiatives need to evaluate the data technology stack through the lens of what it takes to support evolving machine learning models running in real time. This doesn’t necessarily require ripping and replacing existing systems. Minimize disruption by running new data through an updated, agile, real-time data ecosystem and gradually migrating from legacy data platforms to the real-time AI stack as needed.

Wrapping up

Moving real-time AI from the edge of innovation to the center of the business will be one of the biggest challenges for organizations in 2023. A shared vision driven by leadership and a unified real-time data stack are key factors for enabling innovation with real-time AI. Growing a community around innovation with real-time AI makes the whole stronger than the parts, and it is the only way that AI can bring tangible business results.

Learn how DataStax enables real-time AI.

About George Trujillo:

George is principal data strategist at DataStax. Previously, he built high-performance teams for data-value driven initiatives at organizations including Charles Schwab, Overstock, and VMware. George works with CDOs and data executives on the continual evolution of real-time data strategies for their enterprise data ecosystem. 

Artificial Intelligence, IT Leadership, Machine Learning

By Chet Kapoor, Chairman & CEO of DataStax

Every business needs an artificial intelligence strategy, and the market has been validating this for years. Gartner® predicts that “By 2027, over 90% of new software applications that are developed in the business will contain ML models or services, as enterprises utilize the massive amounts of data available to the business.”[1] And with the rise of tools like ChatGPT, more organizations than ever are thinking about how AI and ML can transform their business.

Still, most companies have not yet benefited from real-time AI. They fail because data is served too slowly in complicated environments, making real-time actions almost impossible. AI cannot work with the wrong data, at the wrong time, delivered by the wrong infrastructure.

So, how do leading enterprises use AI to drive business outcomes? And why should you care about real-time AI? Let’s dive in.

Winning with AI: It starts with data

A successful AI strategy starts with data. More specifically, you need real-time data at scale. Leaders like Google, Netflix, and Uber have already mastered this. Their ML models are embedded in their applications and use the same real-time data. They aggregate events and actions in real time through streaming services and expose this data to ML models. And they build it all on a database that can store massive volumes of event data.
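Much of "aggregate events and actions in real time" reduces to a windowed count or sum keyed by entity. Here is a minimal sketch (window size, user IDs, and timestamps are invented) of maintaining a sliding-window aggregate that an ML model could consume as an input feature:

```python
from collections import defaultdict, deque

WINDOW_SECONDS = 60  # assumed window size

windows = defaultdict(deque)  # user_id -> timestamps of that user's recent events

def record_event(user_id, ts):
    """Add an event and drop anything older than the window."""
    window = windows[user_id]
    window.append(ts)
    while window and ts - window[0] > WINDOW_SECONDS:
        window.popleft()
    return len(window)  # events in the last WINDOW_SECONDS for this user

# Simulated event stream: (user_id, timestamp in seconds).
stream = [("u1", 1), ("u1", 10), ("u2", 12), ("u1", 65), ("u1", 70)]
for user_id, ts in stream:
    count = record_event(user_id, ts)
    print(f"t={ts:>3}s user={user_id} events_in_window={count}")
```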

Ultimately, it’s about acting on your data in the moment and serving millions of customers in real-time. Think about these examples:

● Netflix tracks every user’s actions to refine its recommendation engine, then it uses this data to propose the content you will love most
● Uber gathers driver, rider, and partner data to update a prediction engine that informs customers about wait times, or suggests routes to drivers
● FedEx aggregates billions of package events to optimize operations and share visibility with its customers on delivery status

How DataStax helps: A new class of apps

We have been working on unlocking real-time data for a long time at DataStax. We started with Apache Cassandra® 12 years ago, serving the largest datasets in the world. Then we made it a database-as-a-service with Astra DB and added Astra Streaming to make real-time data a reality.

Now, we have another exciting piece of the puzzle: Kaskada, a machine-learning company that recently joined forces with DataStax. Their feature engine helps customers get more value from real-time data. By adding Kaskada’s technology, we’ll be able to provide a simple, end-to-end stack that brings ML to data—not the other way around.

This unlocks a whole new class of applications that can deliver the instantaneous, personalized experiences customers demand, all in one unified open-source stack. Take the conversational AI company Uniphore, for example. Uniphore has an AI assistant that does sentiment analysis on sales calls. It helps sellers build better customer relationships and loyalty. Without the ability to process data in real time, its solution would not be possible. Uniphore relies on DataStax to power its AI experience with speed, scale, and affordability.

The future is bright

We believe every company should be able to deploy real-time AI at 3X the scale and half the cost. Our new mandate is clear: Real-time AI for everyone. We have the right data, at the right time, and the right infrastructure.

Now, it’s about executing the vision with our customers, communities, and partners. I’m super excited about making this a reality.

Click here to learn more about the power of real-time AI.

[1] Gartner, “A Mandate for MLOps, ModelOps and DevOps Coordination,” Van Baker, Nov. 22, 2022

GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and is used herein with permission.

About Chet Kapoor:

Chet is Chairman and CEO of DataStax. He is a proven leader and innovator in the tech industry with more than 20 years in leadership at innovative software and cloud companies, including Google, IBM, BEA Systems, WebMethods, and NeXT. As Chairman and CEO of Apigee, he led company-wide initiatives to build Apigee into a leading technology provider for digital business. Apigee, now part of Google, is a cross-cloud API management platform that operates in a multi-cloud and hybrid-cloud world. Chet successfully took Apigee public before the company was acquired by Google in 2016. Chet earned his B.S. in engineering from Arizona State University.

Artificial Intelligence, IT Leadership

As the chief engineer and head of the department for digital transformation of manufacturing technologies at the Laboratory for Machine Tools and Production Engineering (WZL) within RWTH Aachen University, I’ve seen a lot of technological advancements in the manufacturing industry over my tenure. I hope to help other manufacturers struggling with the complexities of AI in manufacturing by summarizing my findings and sharing some key themes.

The WZL has been synonymous with pioneering research and successful innovations in the field of production technology for more than a hundred years, and we publish over a hundred scientific and technical papers on our research activities every year. The WZL is focused on a holistic approach to production engineering, covering the specifics of manufacturing technologies, machine tools, production metrology and production management, helping manufacturers test and refine advanced technology solutions before putting them into production at the manufacturing edge. In my team, we have a mix of computer scientists, like me, working together with mathematicians and mechanical engineers to help manufacturers use advanced technologies to gain new insights from machine, product, and manufacturing data.

Closing the edge AI insight gap starts and ends with people 

Manufacturers of all sizes are looking to develop AI models they can use at the edge to translate their data into something that’s helpful to engineers and adds value to the business. Most of our AI efforts are focused on creating a more transparent shop floor, with automated, AI-driven insights that can:

Enable faster and more accurate quality assessment
Reduce the time it takes to find and address process problems
Deliver predictive maintenance capabilities that reduce downtime

However, AI at the manufacturing edge introduces some unique challenges. IT teams are used to deploying solutions that work for a lot of different general use cases, while operational technology (OT) teams usually need a specific solution for a unique problem. For example, the same architecture and technologies can enable AI at the manufacturing edge for various use cases, but more often than not, the way to extract data from edge OT devices and systems and move it into the IT systems is unique to each case.

Unfortunately, when we start a project, there usually isn’t an existing interface for getting data out of OT devices and into the IT system that is going to process it. And each OT device manufacturer has its own systems and protocols. In order to take a general IT solution and transform it into something that can answer specific OT needs, IT and OT teams must work together at the device level to extract meaningful data for the AI model. This will require IT to start speaking the language of OT, developing a deep understanding of the challenges OT faces daily, so the two teams can work together. In particular, this requires clear communication of divided responsibilities between both domains and a commitment to common goals.
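To make that IT/OT hand-off concrete without assuming any particular vendor protocol, here is a minimal sketch of the adapter pattern such work usually produces: one small parser per device type, each normalizing its vendor-specific payload into a common schema the IT-side AI pipeline can consume. The device names, payload formats, and field names are all invented for illustration.

```python
import json

def parse_vendor_a(raw):
    """Vendor A sends JSON with metric names spelled out."""
    data = json.loads(raw)
    return {"machine": data["machine_id"], "temp_c": data["temperature_celsius"]}

def parse_vendor_b(raw):
    """Vendor B sends a terse line: <machine>;<temperature in tenths of a degree>."""
    machine, temp_tenths = raw.split(";")
    return {"machine": machine, "temp_c": int(temp_tenths) / 10.0}

PARSERS = {"vendor_a": parse_vendor_a, "vendor_b": parse_vendor_b}

def ingest(device_type, raw_payload):
    """Normalize a device payload and hand it to the IT-side pipeline (here: print)."""
    reading = PARSERS[device_type](raw_payload)
    print("normalized reading:", reading)

# Simulated payloads from two different OT devices.
ingest("vendor_a", '{"machine_id": "press-07", "temperature_celsius": 71.5}')
ingest("vendor_b", "mill-03;684")
```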

Simplifying data insights at the manufacturing edge

Once IT and OT can work together to successfully get data from OT systems to the IT systems that run the AI models, that’s just the beginning. A challenge I see a lot in the industry is organizations still using multiple use-case-specific architectures and pipelines to build their AI foundation. The IT systems themselves often need to be upgraded, because legacy systems can’t handle the transmission needs of these very large data sets.

Many of the companies we work with through our various research communities, industry consortia, and conferences such as WBAICNAP and AWK2023, especially the small to medium manufacturers, ask us specifically for technologies that don’t require highly specialized data scientists to operate. That’s because manufacturers can have a hard time justifying the ROI if a project requires adding one or more data scientists to the payroll.

To answer these needs, we develop solutions that manufacturers can use to get results at the edge as simply as possible. As a mechanical engineering institute, we’d rather not spend a lot of time doing research about infrastructure and managing IT systems, so we often seek out partners like Dell Technologies, who have the solutions and expertise to help reduce some of the barriers to entry for AI at the edge.

For example, when we did a project that involved high-frequency sensors, there was no product available at the time that could deal with our volume and type of data. We were working with a variety of open-source technologies to get what we needed, but securing, scaling, and troubleshooting each component led to a lot of management overhead.

We presented our use case to Dell Technologies, and they suggested their Streaming Data Platform. This platform reminds me of the way the smartphone revolutionized usability in 2007. When the smartphone came out, it had a very simple and intuitive user interface so anyone could just turn it on and use it without having to read a manual. 

The Streaming Data Platform is like that. It reduces friction to make it easier for people who are not computer scientists to capture the data flow from an edge device without having technical expertise in these systems. The platform also makes it easy to visualize the data at a glance, so engineers can quickly achieve insights.

When we applied it to our use case, we found that it deals with these data streams very naturally and efficiently, and it reduced the amount of time required to manage the solution. Now, developers can focus on developing the code, not dealing with infrastructure complexities. By reducing the management overhead, we can use the time saved to work with data and get better insights.

The future of AI at the manufacturing edge

With all of this said, one of the biggest challenges I see overall with AI for edge manufacturing is recognizing that AI insights are an augmentation to people and knowledge, not a replacement. It is far more important for people to work together in managing and analyzing that data to ensure that the end goal of delivering business insights that serve a particular problem is being met.

When manufacturers use many different solutions pieced together to find insights, it might work, but it’s unnecessarily difficult. There are technologies out there today that can remedy these challenges; it’s just a matter of finding and evaluating them. We’ve found that the Dell Streaming Data Platform can capture data from edge devices, analyze the data using AI models in near real time, and feed insights back to the business to add value that benefits both IT and OT teams.

Learn more

If you are interested in current challenges, trends, and solutions to empower sustainable production, find out more at AWK2023, where more than a thousand participants from production companies all around the world come together to discuss solutions for green production.

Find out more about AI at the manufacturing edge solutions from Dell Technologies and Intel.  

***

Intel® Technologies Move Analytics Forward

Data analytics is the key to unlocking the most value you can extract from data across your organization. To create a productive, cost-effective analytics strategy that gets results, you need high performance hardware that’s optimized to work with the software you use.

Modern data analytics spans a range of technologies, from dedicated analytics platforms and databases to deep learning and artificial intelligence (AI). Just starting out with analytics? Ready to evolve your analytics strategy or improve your data quality? There’s always room to grow, and Intel is ready to help. With a deep ecosystem of analytics technologies and partners, Intel accelerates the efforts of data scientists, analysts, and developers in every industry. Find out more about Intel advanced analytics.

IT Leadership

Artificial intelligence (AI) has become a hot topic for countries worldwide, and both public- and private-sector organizations have already started leveraging it as a response to continuous digital disruption. According to IDC’s 2022 Artificial Intelligence Spending Guide, global AI spending reached $88.6 billion in 2021, and it is forecast to grow at a compound annual growth rate (CAGR) of 25.6% over the 2021–2025 period.

Canada, China, and the United States are among the countries in which many organizations began their AI journeys early, supported by government initiatives. Saudi Arabia is no different in terms of its commitment to becoming an AI powerhouse. As an extension of the country’s Vision 2030, the Saudi Data and AI Authority (SDAIA) was established in 2019, followed by the release of the National Strategy for Data and AI in 2020. Given its strong focus on and dedication to fostering an AI ecosystem, Saudi Arabia is poised to see strong growth in overall AI spending, with a CAGR of 29% forecast for the 2021–2025 period that will see the market reach a value of $563 million in 2025.

According to a recent IDC survey of CIOs in the Kingdom, almost half the Saudi organizations that invest in AI technologies prefer to customize off-the-shelf solutions to meet their needs. This suggests an appetite among end users to work with local and global technology solution providers to meet their organizations’ specific business requirements. In terms of in-demand AI applications, more than one-third of Saudi organizations are investing in digital assistants, chatbots, and conversational agents augmented by Arabic language capabilities. Recommendation engines, as well as prediction and forecasting, are other key investment areas that will be leveraged across various vertical markets.

Organizations in Saudi Arabia should approach AI as a strategic initiative rather than as a one-off project so they can drive organization-wide innovation and become AI disruptors. To understand an organization’s AI readiness, five maturity traits of AI disruptors need to be evaluated: vision, people, process, technology, and data readiness. Responding to IDC’s recent Saudi Arabia CIO Survey, 35% of organizations said that identifying and hiring talent and the cost of implementation were major challenges. The lack of a centralized strategy and unclear processes, the lack of alignment across business and IT functions, and the lack of a single technology architecture and platform were also highlighted as major challenges.

Like all other emerging technologies, AI is also prone to cybersecurity threats. Threat actors and adversaries launch attacks to compromise the confidentiality, availability, and integrity of AI systems. Training data security, algorithm security, trained model security, and platform security, as well as model transparency, ethics, and responsibility, are the key focus areas for building a secure and sustainable AI practice.

SBM has been heavily investing in its emerging technology capabilities and views data and AI as the key enablers of many sophisticated technology use cases and as critical instruments for helping Saudi Arabia to achieve its Vision 2030 goals. SBM has in-house developed solutions and has been a longstanding IBM partner, dating back to 1968. SBM and IBM both possess experienced emerging-technology professionals on the ground in the Kingdom. These capabilities enable them to work closely with customers to deliver customized products and services around Big Data and analytics, AI, automation, data governance and compliance, and security.

SBM will continue to introduce innovative new solutions through its strong partnerships and emerging technology capabilities to meet the requirements of businesses in the Kingdom, thereby contributing to the realization of Saudi Vision 2030.

To view and download the full whitepaper, click here.

Artificial Intelligence

Increasing margins is critical to achieving sustained success in the retail industry. To maximize margins, leaders consider how to run the store more efficiently, how to deliver the best services to customers, and how to grow new services. Traditionally, they have used rear-view-mirror data to help accomplish these goals—that is, examining historical data from months prior and coming up with a plan.

Today, retailers are relying more on proactive and contextual data in real time. For instance, what are the online shopper’s preferences? Do they tend to buy button-down shirts and khakis or jeans and t-shirts? How does a brick-and-mortar store’s layout affect purchasing decisions? Context involves gathering data about human behavior throughout the customer journey to figure out why customers buy what they buy. But how do you capture human emotions and activities in the moment, and then turn that data into useful information? How do you account for changes in behavior and preferences over time?

Obtaining the right insights, consistently, at a micro level about the consumer is key to delivering a more meaningful and personalized customer experience. Combining consumer shopping preferences with historical data can give you a contextually rich, action-reaction paradigm. To accomplish this, retailers are turning to computer vision complemented by artificial intelligence.

Watch the video: Reimagine the Future of Retail

Computer vision provides video and audio for additional context, complementing other types of data. Together, these data points become part of an analytics workflow delivering a tangible outcome. Using a federated approach, data can be analyzed where it is collected, producing insights used to make decisions in real time. 

This federated approach to analytics enables forward-thinking retailers to incorporate new approaches to using and orchestrating their data, using computer vision systems that grow as they grow. New use cases become more achievable, and IT can leverage these technologies to scale and drive further processes that build momentum toward the digitally driven store of the future.

Let’s look at how computer vision is impacting the customer experience, store security and operations, revenue growth and sustainability today and what that means going forward. 

Continuing to address a top priority for the retail industry by improving safety and security 

Most retail establishments started their computer vision journey years ago when they brought in video camera systems for security purposes, providing them with a foundation to build on. Now it’s paying off.

When tied to a computer vision system, the visual data, historical data and AI can offer real-time situational awareness. Analysis occurs mainly on-site, at the edge. It’s quick and accurate, reducing staff response time. For example, a maintenance crew member can react almost immediately to spilled substances that could cause an accident. Anomaly detection can enhance a store’s loss prevention processes such as alerting security personnel to people who are concealing stolen items, and a real-time video analytics platform can even help with finding missing children.

Tackling current and future operational efficiency challenges 

The conventional store, where you build a structure and stock it with products and displays, is being transformed by customers’ buying patterns. The Intelligent Store (see Figure 1) consists of processes around employees (scheduling and reduction of effort), inventory, and customers that can be constantly monitored and improved in real time. With the intelligent store, retailers can transform, adapt, and respond to their customers’ needs and behavior with context and personalization.

With accurate data, managers today can utilize hyper-personalization to drive more sales, demand forecasting to maintain inventories and optimized route planning to cut costs. For this, you need real-time insights using sensors and cameras, and a strategy that aligns operations with the customer experience, autonomous retail and a host of integrated technologies to make it all happen.


Figure 1. The Intelligent Store extends across all facets of the retail industry to deliver benefits including real-time operational improvements, hyper-personalization and automation, scalability and security.

One goal of an Intelligent Store is to empower customers by reducing friction in the buying experience. That means touchless checkout, where items are “rung up” automatically as customers leave the store. For staffed checkouts, computer vision can monitor customer lines and move staff where needed in real time. Video-based inventory tracking ensures items are always in stock and enables traceability, as well as optimized picking for fulfilling ecommerce grocery orders. And curbside delivery is improved by combining visual data such as number plate and/or vehicle recognition, and sensor data so staff begin preparing to deliver groceries as soon as a customer drives into the lot.
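As a rough illustration of the checkout-line monitoring described above, and not any retailer's actual system, the sketch below uses OpenCV's stock HOG person detector to count people in a single camera frame; the camera index and staffing threshold are assumptions, and a production deployment would use a far more accurate detector and a continuous video stream.

```python
import cv2  # OpenCV, which ships with a HOG-based person detector

QUEUE_ALERT_THRESHOLD = 5  # assumed: open another register above this many people

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture(0)  # assumed camera index for the checkout-area camera
ok, frame = cap.read()
if ok:
    # Detect people in the frame; each returned box is one detected person.
    boxes, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
    people = len(boxes)
    print(f"people in frame: {people}")
    if people > QUEUE_ALERT_THRESHOLD:
        print("alert: queue is long, consider opening another checkout lane")
cap.release()
```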

The digital twin is another technology that boosts operational efficiency. Using software models, a retailer can run simulations of a real-world environment before committing to expensive changes. Imagine a designer creating a store planogram or distribution center in 3D, and using AI to determine the freshness of perishable items (to reduce spoilage), to optimize customer flow and merchandising, and for predictive analysis. A digital twin can be rendered on-site without the need to exchange huge amounts of data with a data center as the processing occurs at the edge.

Watch the video: Edge and computer vision are enabling better Retail

Enhancing the customer experience while increasing revenues

Happy customers inevitably buy more, so it is up to retailers to provide the right product with the right value. And by investing in the customer experience, retailers maximize revenues.

Consider virtual try-on, which combines computer vision, AI and augmented reality to allow shoppers to try on glasses, clothing and other items using their mobile device’s camera, or an in-store digital kiosk or mirror. “See it in your room” for furniture and electronics is similar. Virtual try-on is both immersive and a time-saver for customers, potentially resulting in higher per-session sales. 

Computer vision systems linked to inventory management systems are also a boon for the customer experience and for optimizing revenue. Where cameras are used to scan existing inventory and update records, stock-level checks are more accurate, helping to ensure the customer’s item isn’t backordered. Automatic updates to inventory after sales are completed save on back-of-house time. From a merchandising perspective, computer vision can identify which areas of a store get the most foot traffic and target hot spots where product should be placed.

On the flip side is how to avoid losing revenue. Shrinkage in the global retail sector accounts for a staggering $100 billion USD* in annual losses, creating demand for technology and/or processes to prevent theft and fraud, and to better secure transactions. Many grocery stores now use cameras mounted at checkout stations to watch for sweetheart checking, prevent or detect item swapping and identify inaccurate scanning and payments.

Read the IDC Whitepaper “Future Loss Prevention: Advancing Fraud Detection Capabilities at self-checkout and throughout the retail store.”

Becoming environmental stewards and following sustainability practices

Many corporations today support initiatives to conserve resources and reduce waste. Computer vision is helping stores, malls, distribution centers and the like accomplish their sustainability goals.

The retail industry has several avenues to sustainability. Two of the most constructive are reducing energy consumption and using modern inventory management techniques.

Most of us are familiar with refrigerated cases with motion sensors that turn the lights on when a door is opened. Entire facilities can use the same principles, like smart HVAC, overhead and outdoor lighting to minimize power consumption.

Reducing food waste is another way to save money while having a positive impact on the community and environment. According to RTS, about 30% of the food in U.S. grocery stores is thrown away every year. Optimized cold chain management reduces spoilage as well as the energy needed to maintain perishables from the loading dock to the freezer case or produce bin. Proactive restocking, based on historical data and AI, further ensures that items are available when needed and in sellable quantities for a particular store.

Although the pandemic boosted online and curbside pickup sales, the resulting supply chain issues have left customers somewhat disillusioned and wondering which important item will become hard to get. Customers will accept some inconvenience due to a worldwide event; however, retailers need to be prepared for the near-term future shopper who has high expectations and whose loyalty may be harder to keep. That can be done through a data-driven approach using computer vision and AI.

Retail organizations can build on the safety and security infrastructure already deployed in their stores, at a pace that’s right for their business. Digital transformation is an ongoing process, and many retailers are already engaged with Dell Technologies in developing the right framework to guide them through their journey while enabling their business to remain agile and innovate.

For an overview of computer vision and its impact on retail, read the Solution Brief, “Protecting retail assets and unlocking the potential of your data with AI-driven Computer Vision.”

Learn more about how computer vision is positively impacting other industries: 

The Future Is Computer Vision – Real-Time Situational Awareness, Better Quality and Faster Insights
Computer Vision Is Transforming the Transportation Industry, Making It Safer, More Efficient and Improving the Bottom Line
How Computer Vision is revolutionizing the Manufacturing Supply Chain
How the Sports and Entertainment Industry Is Reinventing the Fan Experience and Enhancing Revenues with Computer Vision

* Sensormatic Global Shrink Index: https://www.sensormatic.com/landing/shrink-index-sensormatic

Artificial Intelligence

Whether the cause is a disability, long illness, psychological challenges, or parental choice, many children find themselves facing the daunting task of acquiring an education at home.

During the global pandemic, homeschooling issues became a particular concern, as teachers and parents attempted to use their time and energy most productively and vulnerable students worried about falling behind, among other anxieties.

On a practical level, in some cases, homeschooling can eat up 10% of a particular school’s budget even when less than 1% of the pupils require the service.

As the world shut down, international technology company NTT DATA Business Solutions attempted to remedy apprehensions by creating a digital teaching engine to help children learn, teachers teach, and parents homeschool.

Not only would the Artificial Intelligence (AI) Learning Helper assist students in improving reading and other skills, but the new platform would also meet the emotional needs of each child.

Human avatars

From its headquarters in Bielefeld, Germany, NTT DATA assists a variety of industries, including chemical, pharmaceutical, wholesale, and consumer products, always searching for new places to innovate.

The company also works closely with the Media Lab at the Massachusetts Institute of Technology (MIT), drawing research from such disciplines as technology, media, science, art, and design.

“The possibilities of artificial intelligence fascinate us,” noted Thomas Nørmark, NTT DATA’s global head of AI and robotics.

Previously, the company’s “digital human platform” allowed NTT DATA to develop avatars for receptionists, sales associates, shop floor assistants, and car vendors, among others.

Those experiences would prove invaluable in the development of the avatars that would both teach children and interact with them in a personalized way.

Emotional fluency

Turning to enterprise resource planning software leader SAP for its foundation, NTT DATA used a number of SAP solutions to make the AI Learning Helper come to life: SAP Data Intelligence for both AI and machine learning (ML), SAP Analytics Cloud to amass data about individual students’ learning progress, and SAP Conversational AI to manage the conversations between the Learning Helper and the students.

These allowed the platform to utilize AI specialties like body language detection, emotional feedback, micro-expression recognition (which registers facial expressions that sometimes last only 1/30th of a second), and summarization algorithms that create new phrases to relay information in a language every student could grasp.

Each screen would be the equivalent of a virtual buddy who could understand a pupil’s changing emotions, detecting whether he or she was frustrated or unmotivated, and patiently adjust to the situation.

At times, the Learning Helper would conclude, a child simply needed to engage in small talk for a few minutes before turning back to the lesson.

The innovation would provide much needed relief for parents and teachers who are not always able to exhibit the same type of sensitivity when they are dealing with so many other obligations.

Tracking for success

The app was deployed in January 2021, with the Danish municipality of Toender’s school district and a British hospital school becoming the first places to use the AI Learning Helper.

Students discovered that they could access the platform at any time on any device, and there was no limit on how long a session could last.

In addition to assisting pupils with vocabulary, pronunciation, and story comprehension, the avatar generated and answered questions.

Through classwork, as opposed to testing, the solution could track each child’s progress, communicating that information to parents and teachers, and come up with lessons tailored to areas where the student could improve.

Participating schools noted that estimated homeschooling costs decreased by 50%, leading to a 5% reduction in the overall budget.

For creating a personalized virtual helper to bring out student strengths and alleviate both loneliness and frustration, NTT DATA Business Solutions received a 2022 SAP Innovation Award – part of an annual ceremony rewarding organizations using SAP technologies to improve the world.

You can read all about what NTT DATA did to win this coveted award, and how, in their Innovation Awards pitch deck.

As developing nations gain greater access to the Internet and education becomes more democratized, the company plans to use the AI Learning Helper to teach thousands more.

Artificial Intelligence, Machine Learning

Artificial intelligence (AI) has captured the imagination of a wide variety of businesses. I have this image of CEOs in boardrooms around the globe declaring, “We must have AI! Our competitors use AI! We can’t be left behind!” There might be some table-pounding associated with this scenario. There will certainly be corporate minions scurrying around to fulfill the AI dreams of their CEO.