How are modern CIOs making an impact with multi-cloud? A recently released VMware report, “CIO Essential Guidance: Modernizing Applications in a Multi-Cloud World,” outlines these four key factors that influence success:

Drive Developer Velocity

The best applications are created by the most talented developers, so it’s crucial to attract and retain the best talent. Taking it a step further, a recent Forrester poll found that 69% of business leaders agree that a good Developer Experience (DevEx) results in a better customer experience (CX). In fact, DevEx directly impacts CX: 45% of enterprise IT executives report that their dev teams push software releases at a monthly or faster pace, on average.

With so much at stake, it’s critical to enable your developers to do what they do best: code. But all too often, barriers prevent this from happening. Cumbersome legacy platforms and tools slow down developers, which is why CIOs need to remove friction from the underlying development infrastructure and create an environment where teams can focus on achieving outcomes.

This may include creating agile workflows and automating manual processes as well as handoffs, provisioning and even meetings and paperwork. Adopting a cross-cloud development platform that offers the tools and languages your developers prefer, including pre-selected open-source products, will also elevate velocity, improve DevEx and unleash innovation.


Embrace Unified Cloud Management

Despite all the advantages of multi-cloud development environments, if you can’t easily manage your cloud estate, you’re not reaping the full benefits. You could be underutilizing resources from one cloud provider, while maxing out on another. Moreover, lack of visibility means greater risk.


A successful strategy for overcoming these types of challenges is to choose the best cloud provider for each app – whether it’s an application platform for developers, an observability app for risk management, automation for operations, or something else — yet manage all clouds as if they were one, using a single platform. This approach reduces operational complexity and presents opportunities for greater governance, cost savings and risk management. 

Shift Security Left

Shifting security left – meaning building in features that bolster security across the entire app pipeline, from the build phase all the way through deployment and optimization – is essential in today’s complex threat landscape. This approach, combined with a unifying security platform and modern development principles, reduces risks and allows you to identify vulnerabilities and issues faster.
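To make the idea concrete, here is a minimal, self-contained sketch in Python of the kind of gate that shifting security left implies: a build-time check that fails the pipeline when a declared dependency matches a known-vulnerable version. The package names and advisory data are hypothetical placeholders, not real advisories, and a real pipeline would pull this information from a scanner or vulnerability feed.

```python
# Toy illustration of a "shift-left" security gate that could run as a CI build step.
# The dependency list and the advisory data below are hypothetical examples.

import sys

# Dependencies declared by the project: (package, version)
declared_dependencies = [
    ("web-framework", "2.4.1"),
    ("image-parser", "1.0.3"),
    ("crypto-utils", "0.9.2"),
]

# Hypothetical advisory data: package -> set of known-vulnerable versions
known_vulnerable = {
    "image-parser": {"1.0.3", "1.0.4"},
    "crypto-utils": {"0.8.0"},
}

def find_vulnerable(deps, advisories):
    """Return the declared dependencies that match a known-vulnerable version."""
    return [
        (name, version)
        for name, version in deps
        if version in advisories.get(name, set())
    ]

if __name__ == "__main__":
    findings = find_vulnerable(declared_dependencies, known_vulnerable)
    for name, version in findings:
        print(f"BLOCKED: {name}=={version} has a known vulnerability")
    # A non-zero exit code fails the build, surfacing the issue at build time
    # rather than after deployment.
    sys.exit(1 if findings else 0)
```

The point is less the specific check than where it runs: the earlier in the pipeline a vulnerability is caught, the cheaper it is to fix.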


When security management controls reside on a central platform, CIOs can better manage risk, compliance, and more across their overall cloud strategy – spanning the entire application development and operations process.

Take a Platform-as-a-Product Approach

Unifying platforms are vitally important to the success of your modern apps in a multi-cloud world. Operating these platforms well should be of the utmost importance, because they are the product that keeps the company running.

If you view your unifying multi-cloud platforms as drivers of innovation, growth and data protection, and run them as a product, you can reimagine the way you prioritize and manage your apps and cloud estate. With a Platform-as-a-Product approach, it’s easier to keep your focus on the big picture.

Bringing It All Together

With almost 75% of businesses operating across multiple public clouds, it’s become clear that efforts to modernize need to be executed strategically. CIOs who’ve enjoyed success in this area are realizing cost savings, revenue growth, and improved innovation. They have taken the time to standardize functions across clouds, chosen the clouds that best meet the needs of their apps, and operated unifying platforms that maintain seamless business control over multiple cloud providers. They have also improved DevEx to make the best use of one of their greatest assets: their Dev team.

For more ways to influence multi-cloud success, download the complete report: CIO Essential Guidance: Modernizing Applications in a Multi-Cloud World

Learn more by clicking here.

According to a PwC report, one in three consumers (32%) say they will walk away from a brand they love after just one bad experience. Unlike personal relationships, loyalty in the consumer world can be surprisingly transitory. This gets worse in the digital world where it takes just a few clicks and minutes to uninstall one app and replace it with a competitor’s app. There are similarities between how loyalty is formed in the physical and digital world. It all boils down to two things – how you feel about that relationship and how much time you are investing in it.

Deliver Delightful Customer Experiences

156. That’s the number of apps I’ve installed on my mobile phone. On any given day, I will be using at least 10% of them. And out of these, my favourite app is a local banking app. It’s one app that I feel was designed just for me. It’s completely intuitive, allows me to perform most tasks in less than 3 clicks, has all the functions that I need to perform banking on-the-go, is constantly updated with new features, comes with great performance and stability and most of all is very secure. These are what I’d refer to as key ingredients to provide delightful customer experiences.

A great amount of design thinking goes into building modern apps that deliver intuitive user experiences. A pod-based team structure can be set up in which all the stakeholders responsible for delivering the app work together. There needs to be strong alignment amongst these stakeholders, ranging from the software developer and the product manager to the line of business and the quality engineer. Everyone should know what they are delivering, why they are delivering it and how they will deliver it.

Leveraging the right set of technologies will be a key success criterion for such apps. The app should adopt a cloud native architecture to ensure agility, scalability, and resilience. Security should be incorporated from the earliest stages of app development to minimize risk, time, and costs. These best practices coupled with a sound design thinking approach can help enhance customer experience and as a result improve loyalty.

Elevate Customer Engagement

Another way to measure loyalty in the digital world is by the amount of time consumers spend using an app. App engagement time is crucial as it influences revenues through ads and in-app spending, as well as consumer data that can be monetized in the future. To maximize engagement and app stickiness, companies are increasingly introducing more revenue-generating offerings within their apps. To that end, we’ve seen the rise of the one-app-to-rule-them-all, aka the Superapp. Some of the well-known Superapps in Asia are household names, e.g., Grab, Gojek, WeChat and PayTM. Grab, for example, started out as a ride-hailing app. Today its offerings include deliveries, mobility and financial services, among others. Gartner anticipates that Superapps will be one of the top 10 strategic technology trends for 2023.

A major downside of a Superapp is that if it is compromised due to security vulnerabilities in the app’s code, malware in its libraries, or a configuration error, it can become the-one-key-to-access-them-all for bad actors. It can be a free pass to not just tamper with, but also exfiltrate, all types of sensitive consumer data. According to a McKinsey report, 71% of consumers said they would stop doing business with a company if it gave away sensitive data without permission.

To tackle this data privacy issue, all data exchanges within a Superapp should be encrypted. In addition, we should also perform real-time monitoring for leaks of sensitive data such as credit card numbers and other personally identifiable information (PII).
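As an illustration of what such monitoring might look like, here is a minimal Python sketch that scans an outbound payload for suspected card numbers (validated with the Luhn check) and email addresses. The patterns and the sample payload are illustrative only; a production system would use far broader classifiers and enforce a policy when a leak is detected.

```python
# Minimal sketch of pattern-based detection of sensitive data (card numbers,
# email addresses) in a message payload. Patterns and sample data are illustrative.

import re

CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,19}\b")
EMAIL_PATTERN = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b")

def luhn_valid(number: str) -> bool:
    """Check a digit string with the Luhn algorithm to cut down false positives."""
    digits = [int(d) for d in re.sub(r"\D", "", number)][::-1]
    total = 0
    for i, d in enumerate(digits):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def scan_payload(payload: str) -> list:
    """Return a list of findings describing suspected PII in the payload."""
    findings = []
    for match in CARD_PATTERN.finditer(payload):
        if luhn_valid(match.group()):
            findings.append(f"possible card number at offset {match.start()}")
    for match in EMAIL_PATTERN.finditer(payload):
        findings.append(f"email address at offset {match.start()}")
    return findings

sample = "order=123&card=4111 1111 1111 1111&contact=jane.doe@example.com"
for finding in scan_payload(sample):
    print(finding)
```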

Engage a Trusted Partner

To build customer loyalty in the digital world, businesses need to delight customers and keep them engaged. Leveraging cloud-native architectures, incorporating sound security and data privacy practices, and using design thinking methodology will be instrumental in building modern, secure, and engaging apps. In addition, it will also be important to engage the right technology partner who can support you on this journey.

For the past 30 years, SUSE has been helping customers realize their business goals through transformative and cutting-edge open-source technologies.

Rancher Prime is an industry-leading platform that helps companies roll out scalable and resilient cloud native and container-based apps across a distributed IT landscape. It empowers DevOps teams to build and deploy modern apps and updates in a rapid and automated manner.

SUSE NeuVector protects apps from bad actors throughout their software lifecycle, from development to production environments. It helps security teams implement zero trust controls for apps that may be running in a distributed environment. NeuVector also comes with advanced threat-prevention capabilities to prevent data loss in real time.

Learn more at this link: Rancher by SUSE.

SUSE

Vishal Ghariwala is the Senior Director & CTO, APJ and Greater China for SUSE, a global leader in true open source solutions. In this capacity, he engages with customer and partner executives across the region, and is responsible for growing SUSE’s mindshare by being the executive technical voice to the market, press, and analysts. He also has a global charter with the SUSE Office of the CTO to assess relevant industry, market and technology trends and identify opportunities aligned with the company’s strategy.

Prior to joining SUSE, Vishal was the Director for Cloud Native Applications at Red Hat where he led a team of senior technologists responsible for driving the growth and adoption of the Red Hat OpenShift, API Management, Integration and Business Automation portfolios across the Asia Pacific region. Before that, he spent a significant amount of time with leading middleware vendors such as IBM, ILOG and Intalio, as well as the public sector.

Vishal has over 20 years of experience in the Software industry and holds a Bachelor’s Degree in Electrical and Electronic Engineering from the Nanyang Technological University in Singapore.

Vishal is here on LinkedIn: https://www.linkedin.com/in/vishalghariwala/


The transition to a modern business intelligence model requires IT to adopt a collaborative approach that includes the business in all aspects of the overall program. This guide focuses on the platform evaluation and selection. It is intended for IT to use collaboratively with business users and analysts as they assess each platform’s ability to execute on the modern analytics workflow and address the diverse needs of users across the organization.

“It all went live in less than two months,” said Paul Egan, IT Manager of Business Intelligence at Tableau. “The CEO had his new production-strength dashboards in Tableau in less than two months of the server being deployed—and that was a pretty phenomenal turnaround.”

Download this free whitepaper to learn more.



Introduction
Since its inception decades ago, the primary objective of business intelligence has been the creation of a top-down single source of truth from which organizations would centrally track KPIs and performance metrics with static reports and dashboards. This stemmed from the proliferation of data in spreadsheets and reporting silos throughout organizations, often yielding different and conflicting results. With this new mandate, BI-focused teams were formed, often in IT departments, and they began to approach the problem in the same manner as traditional IT projects, where the business makes a request of IT, IT logs a ticket, then fulfills the request following a waterfall methodology.

While this supplier/consumer approach to BI appeared to be well-suited for the task of centralizing an organization’s data and promoting consistency, it sacrificed business agility. There was a significant lag between the time the question was asked, and the time the question was answered. This delay and lack of agility within the analysis process led to lackluster adoption and low overall business impact.

The emergence of self-service BI in recent years has challenged the status quo, especially for IT professionals who have spent the better part of the past two decades building out a BI infrastructure designed for developing top-down, centralized reporting and dashboards. Initially, this self-service trend was viewed as a nuisance by most IT departments and was virtually ignored. The focus remained on producing a centrally-managed single source of truth for the organization.

Fast-forward to today and IT finds itself at a crossroads, with self-service BI as the new normal that can no longer be ignored. The traditional approach to BI is becoming less and less relevant as the business demands the agility that comes with self-service to drive adoption and improve organizational outcomes. This, paired with the continued exponential growth in data volume and complexity, presents IT with an important choice.

Either the demand for self-service BI is embraced, and IT evolves to become the enabler of the broader use and impact of analytics throughout their organizations, or it is ignored and IT continues as the producer of lower-value enterprise reporting stifled by the limitations of traditional tools. IT professionals who are ready to serve as a catalyst and embrace this opportunity will deliver far greater value to their organizations than those who choose to ignore the real needs of their business users and analysts.

As organizations begin the transition from a traditional top-down approach driven by IT to a self-service approach enabled by IT and led by the business, a new framework and overall strategy is required. This means that past decisions supporting the core foundational components of a BI program—people, process, and platform—must be revisited. Adjustments are needed in these three core areas to support the shift from a model of top-down BI development and delivery to a self-service-based modern BI model which is driven, and primarily executed on, by the business.

People
A successful transition to self-service business analytics begins with people and should be the top priority for IT when considering changes required for BI modernization. In a traditional BI model, people were often considered last, after platform and process. The widely-used mantra “if you build it, they will come” exemplifies the belief that business users would gravitate toward a well-built system of record for BI that would answer all of their business questions.

This desired end-state rarely came to fruition, since there was little to no collaboration between business users and IT during the process of building the solution after an upfront requirements-gathering phase. In the absence of active engagement and feedback from the business between requirements gathering and project completion, many opportunities for failure emerge. A few of the most common include:

• Business or organizational changes occur during the development process that render the initial requirements obsolete or invalid.
• Incomplete or inaccurate requirements are given in the initial process phases.
• Errors are made in the process of translating business requirements into technical requirements.

The end result of these situations is often that business users disengage from the BI program completely and an organization’s investment in time and resources is wasted due to lack of adoption. Business users and analysts need to use analytics in order for it to have any impact and deliver organizational value. A BI model that embraces self-service puts these users first and allows them to explore, discover, and build content that they will ultimately use to make better business decisions and transform business processes.

Collaboration between the business and IT is critical to the success of the implementation: IT knows how to manage data, and the business knows how to interpret and use data within the business processes it supports. The business has the context within which analytics, and the insight derived from it, will be used to make better decisions and ultimately improve outcomes. Collaborating early on will not only lead to the deployment of a platform that meets the needs of the business but also drive adoption and impact of the platform overall.

Process
Self-service analytics does not mean end users are allowed unfettered access to any and all data and analytic content. It means they have the freedom to explore pertinent business data that is trusted, secure, and governed. This is where process comes into play and represents the component that requires the most significant shift in traditional thinking for IT. A successful modern BI program is able to deliver both IT control and end-user autonomy and agility. A well-established and well-communicated process is required for an organization to strike this delicate balance.

A top-down, waterfall-based process only addresses the IT control part of the equation. A traditional BI deployment focuses primarily on locking down data and content with governance. This means limiting access and freedom to only a few people with specialized technical skills who are expected to meet the needs and answer the questions of the many. This typically involves developer-centric processes to design and build the enterprise data warehouse (EDW) model, build the ETL jobs to transform and load data into the model, construct the semantic layer to mask the complexity of the underlying data structures, and finally build the business-facing reports and dashboards as originally requested by the business.

The unfortunate reality is that this approach often fails to deliver on the vision and promise of BI—to deliver significant and tangible value to the organization through improved decision making with minimal time, effort, and cost. Top-down, IT-led BI models often have an inverse profile of time, effort, and cost relative to the value they deliver to the organization.

A modern analytics solution requires new processes and newly-defined organizational roles and responsibilities to truly enable a collaborative, self-service-based development process. IT and users must collaborate to jointly develop the rules of the road that both sides abide by in a secure environment, maximizing the business value of analytics without compromising the governance or security of the data.

IT’s success is highlighted, and its value to the organization realized, when the business derives significant value and benefit from investments in analytics and BI. Should IT still be considered successful even if not a single end user utilizes the BI system to influence a single business decision? Traditional processes intended to serve top-down BI deployments are too often measured by metrics that are not tied to outcomes or organizational impact.

If the ETL jobs that IT created ran without failure and the EDW was loaded without error and all downstream reports refreshed, many IT organizations would consider themselves successful.

Merely supplying data and content to users without any regard for whether or not it is adopted and provides value through improved outcomes is simply not enough. Modern BI requires updated processes to support self-service analytics across the organization. It also requires the definition of new success metrics for which IT and the business are jointly accountable and are therefore equally invested.

Where processes and technology intertwine, there is tremendous opportunity. Technical innovations, especially with AI, will continue to make it easier to automate processes and augment users of all skill levels throughout the analytics workflow. And while process can accelerate, rather than inhibit, successful analytics outcomes, it’s important to recognize that this relies on a system and interface that people are eager to use. If processes aren’t supported by the right platform choice, they will stifle adoption.

Platform
Since BI has been historically viewed as an IT initiative, it is not surprising that IT drove virtually every aspect of platform evaluation, selection, purchasing, implementation, deployment, development, and administration. But with drastic changes required to modernize the people and process components of a BI and analytics program, IT must change the criteria for choosing the technology to meet these evolving requirements. Perhaps the most obvious change is that IT must intimately involve business users and analysts from across the organization in evaluating and ultimately deciding which modern platform best fits the organization and addresses the broad needs of the users. For more information on selecting the right analytics platform, check out the Evaluation Guide.

A modern platform must address a wide range of needs and different personas as well as the increased pace of business and the exponential growth in data volume and complexity. IT requires that the chosen platform enables governance and security, while end users require easy access to content and the ability to explore and discover in a safe environment.

The chosen platform must also be able to evolve with the landscape and integrate easily with other systems within an organization. A centralized EDW containing all data needed for analysis, which was the cornerstone of traditional BI, is simply not possible in the big-data era. Organizations need a platform that can adapt to an evolving data landscape and insulate users from increased complexity and change.

The most critical aspect is the ability to meet these diverse needs in an integrated and intuitive way. This integration is depicted in the full whitepaper as the modern analytics workflow, a diagram that highlights the five key capabilities that must flow seamlessly in order for the three personas at its center to effectively leverage the platform.

The BI and analytics platform landscape has passed a tipping point, as the market for modern products is experiencing healthy growth while the traditional segment of the market is declining with little to no net new investment. IT leaders should capitalize on this market shift and seize the opportunity to redefine their role in BI and analytics as a far more strategic one that is critical to the future success of the organization. Adopting a collaborative approach to recast the foundational aspects of the BI program and truly support self-service is the key to changing the perception of IT from a producer to a strategic partner and enabler for the organization.

Promise
In today’s era of digital transformation, IT leaders are increasingly expected to take on digital business initiatives in their organizations, including identifying cost savings and finding new revenue streams. Realizing the value of data for these transformational efforts, many businesses are modernizing and increasing their analytics investments to innovate and accelerate change.
Everyone agrees that putting data at the center of conversations promises change. However, most organizations are failing to successfully implement an enterprise-wide analytics program.

IT is well positioned for a leadership role in these efforts, and is essential for the task of giving people the relevant data they need for decision-making. Modern analytics shifts IT’s role to a more strategic partner for the business, empowering users to navigate a trusted, self-service environment. But beyond access to the data, everyone needs the motivation and confidence to properly make decisions with it. You need to identify the relationships between job functions and data and change behaviors that run deep into the fabric of your organization’s culture.

This also means expanding your definition of self-service so that business users participate in some of the traditionally IT-led responsibilities associated with data and analytics—like administration, governance, and education. With the right processes, standards, and change management, business users can help manage data sources, analytics content, and users in the system, as well as contribute to training, evangelism, and the internal community. When users value and participate in these efforts, IT can focus on strategic initiatives like meeting business SLAs and ensuring the security of company assets.

Although every organization’s journey to building a data-driven organization will differ, achieving your transformational goals requires a deliberate and holistic approach to developing your analytics practice. Success at scale relies on a systematic, agile approach to identify key sources of data, how data is selected, managed, distributed, consumed, and secured, and how users are educated and engaged. The better you understand your organization’s requirements, the better you will be able to proactively support the broad use of data.

Tableau Blueprint provides concrete plans, recommendations, and guidelines as a step-by-step guide to creating a data-driven organization with modern analytics. We worked with thousands of customers and analytics experts to capture best practices that help turn repeatable processes into core capabilities to build and reinforce a data-driven mindset throughout your organization.
Learn more and get started today.

About Tableau
Tableau is a complete, integrated, and enterprise-ready visual analytics platform that helps people and organizations become more data driven. Whether on-premises or in the cloud, on Windows or Linux, Tableau leverages your existing technology investments and scales with you as your data environment shifts and grows. Unleash the power of your most valuable assets: your data and your people.


Every company and government entity is tasked with striking a critical balance between data access and security. As Forrester’s Senior Analyst Richard Joyce stated, “For a typical Fortune 1000 company, just a 10 percent increase in data accessibility will result in more than $65 million additional net income.” As the need to become more data-driven accelerates, it’s imperative that enterprises balance that access equally with privacy and governance requirements.

To achieve this balance, we need to change how we perceive data security. Amidst growing friction between teams — i.e., those who create and manage data access policies and those who need data to perform their duties — we must accept that security, IT, and privacy teams want to provision as much data as possible. But those teams face constraints and compliance complexity.

Traditionally, data security, privacy, and regulations have been thought of as a cost center expense. Instead, we need to look at data security as the means for positive change, a driver for greater data accessibility, enhanced operational efficiency, and actual business value.

Many remain far short of the goal

Enterprises of all sizes struggle with the shift. NewVantage Partners’ Data and AI Leadership Executive Survey 2023 found that less than a quarter of firms report having a data-driven organization or data culture. And in its State of Data and Analytics Governance research, Gartner suggests that by 2025, 80 percent of organizations seeking to scale digital business, including analytical initiatives, will fail because they do not modernize their data and analytics governance.

Data access drives growth. So, what’s the reason for low data-culture adoption? Being truly data-driven requires tight collaboration between many different functions, and there is often little certainty about individual roles and responsibilities. These strategic gaps must be addressed.

The reality of data security and access

When it comes to data security and access, companies are typically either:

Overly restrictive on data access. Data security is seen as an impediment to overall company growth. This is typically due to data, organizational, and technological complexity.

Or, overly focused on perimeter and application defenses, leveraging cyberdefenses and coarse-grained identity and access management (IAM). Data systems are open to exploitation in the event of a breach.

Most experience the worst of both these scenarios, where data security and access are simply broken — inconsistent, atomistic.

A primary challenge of solving the data democratization balancing act lies in the complex web of internal and external privacy, security, and governance policies. They change over time and need to be applied and maintained consistently and efficiently across business teams.

In the middle are the technical teams managing the complex data and analytical system. Due to constraints, security, privacy, and data management teams default to a tight lockdown of data to ensure compliance and security. It’s not any one team’s fault, but a major blocker to becoming data-driven.

Unified data security platform

Siloed, decentralized, and inefficient, with unclear roles and responsibilities and no holistic strategy: that is where many organizations find themselves. So, what’s the solution as more companies face costly data breaches and low data usability rates? An enterprise-wide, scalable strategy that leverages a unified data security platform, one that includes integrated capabilities to simplify and automate universal data security processes across the entire data and analytics ecosystem. With the ability to discover and classify sensitive data, data attributes can be used to automatically deliver instantaneous data access to authorized users. Proper data security governance helps teams get access to more data faster.

Additional data masking and encryption layers can be added to make sensitive data available for analytics without compromising security. Even if a breach occurs, fine-grained access limits exposure, and audit capabilities quickly identify compromised data.
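To illustrate how classification-driven access and masking can work together, here is a minimal Python sketch. The roles, column tags, and masking rule are hypothetical, and a real platform would enforce equivalent policies inside the query or storage engine rather than in application code.

```python
# Minimal sketch of attribute-based access control combined with data masking.
# Roles, column classifications, and the masking rule are hypothetical examples.

from dataclasses import dataclass

@dataclass
class User:
    name: str
    roles: set

# Hypothetical classification tags produced by a discovery/classification step.
COLUMN_TAGS = {
    "email": "pii",
    "card_number": "pii",
    "order_total": "public",
}

# Policy: which roles may see unmasked values for a given classification tag.
UNMASKED_ACCESS = {
    "pii": {"privacy_officer"},
    "public": {"analyst", "privacy_officer"},
}

def mask(value: str) -> str:
    """Keep only the last four characters so the data stays useful for joins and QA."""
    return "*" * max(len(value) - 4, 0) + value[-4:]

def read_row(user: User, row: dict) -> dict:
    """Return the row with sensitive columns masked unless the user's role allows access."""
    result = {}
    for column, value in row.items():
        tag = COLUMN_TAGS.get(column, "public")
        allowed = UNMASKED_ACCESS.get(tag, set())
        result[column] = value if user.roles & allowed else mask(str(value))
    return result

analyst = User("pat", {"analyst"})
row = {"email": "jane.doe@example.com", "card_number": "4111111111111111", "order_total": "42.50"}
print(read_row(analyst, row))
# email and card_number come back masked; order_total stays readable for the analyst.
```

The design point is that access decisions hang off classification attributes rather than hand-maintained user-by-user grants, which is what lets policy keep up as new data sources are onboarded.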

Executing a proper data security strategy provides the last mile of the data governance and cataloging journey. All of it is key to the balancing act of data democratization, with comprehensive data governance enabling faster insights while maintaining compliance.

Enterprise-wide governed data sharing

Privacera helps Fortune companies modernize their data architecture via a holistic, enterprise-wide data security platform and automated data governance. A data security platform empowers the data democratization you need to increase data usability and nurture your data-driven culture. Analysts and business units get more data faster. IT liberates time and resources. Security and privacy teams easily monitor and implement data security policies.

Learn more about achieving modern data security governance and democratized analytics for faster insights here.


Here’s a proposition to consider: among the ranks of large enterprises, commercial success increasingly relies on digital transformation. In turn, digital transformation relies on modernized enterprise networks that deliver flexibility, performance and availability from the edge to the cloud. Intuitively, this hypothesis makes a lot of sense.

In many enterprises, it’s also increasingly becoming the subject of painstaking debate. After two years of quick-fix digitalization on top of pre-COVID-era network technologies, the limits of the status quo are becoming evident. All too often, legacy networks limit the potential for digital transformation. In many organizations, it’s way past time to address the fundamentals.

If this debate sounds familiar to you, it’s worth looking at the 2022-23 Global Network Report from NTT, a new piece of research that offers an intriguing view of how enterprises around the world are managing their networks.

Among other things, NTT’s survey suggests a strong correlation between a willingness to invest in modernizing networks and high levels of commercial performance. At the other end of the spectrum, NTT’s survey confirms many enterprise networks suffer from long-term underinvestment and increasing levels of technical debt. The distance between these two different approaches feels substantial.

NTT’s report – based on responses from over 1,300 network specialists and IT and business decision-makers worldwide – defines high levels of commercial performance using straightforward criteria. To qualify as a “top-performer”, organizations in the survey needed to have generated year-on-year revenue growth of over 10%. They also needed to have generated operating margins of over 15% in the last financial year.

In network terms, what do these organizations look like? It’s here that a willingness to invest in modern network technologies starts to look like an indispensable ingredient for high performance in commercial terms.

Nine out of 10 top-performing organizations are increasing network investment to support digital transformation. Many are spending over 2% of their annual revenues – a significant sum – on their networks, deploying technologies designed to enable rapid transformation, provide greater availability and flexibility, and support not just today’s requirements, but tomorrow’s requirements as well.

Eight out of 10 high-performing organizations say their network strategy is aligned with their business goals. In practice, this involves a clear understanding that the quality of the network directly affects their ability to address the most pressing business and digital transformation challenges. (By contrast, only 42% of underperformers share this sentiment.)

The underperformers in NTT’s survey are a mirror image of these overachieving organizations. Most CIOs and CTOs at these companies agree that networks play a vital role in delivering revenue growth. They also recognize business demands for increased speed, agility and innovation can only be satisfied by new operating models. And yet these organizations typically suffer from delayed upgrades, high levels of technical debt and poor visibility across the network.

The older the network is, the greater the chance of negative impacts on service delivery, customer satisfaction and the employee experience. Some 69% of the CIOs and CTOs surveyed by NTT say technical debt continues to accumulate. Asked to identify the risks generated by underinvestment, respondents most frequently pointed to classic effects of technical debt: inflated IT operational costs and limited availability of new services required for digital transformation.

For these enterprises, networks threaten to become a cross between a millstone and a minefield (slowing down progress and continually threatening to blow up in the face of network professionals).

In this hybrid and hyperconnected world where organizations need to deliver great employee and customer experiences, the network provides the fabric of the digital organization. NTT’s intelligent and secure Network as a Service enables a complete edge-to-cloud strategy, delivering a wide array of benefits: increased agility, reduced risk, greater flexibility, scalability, automation, predictability and control.

Given today’s high-performance hybrid environment, Matthew Allen, Vice President, Service Offer Management – Networking at NTT, suspects that the status quo is time-limited for underperforming enterprises.

“You can start to transform your business on the networks you have. However, as this business transformation drives a distribution of applications and business functions across many, diverse locations (SaaS, PaaS, IaaS, private cloud, etc.), a legacy network solution will not be able to keep pace with this change – it will become increasingly difficult for distributed applications and workloads to communicate effectively and securely, at the speed the business requires.”

NTT’s survey suggests organizations that delay network modernization run the risk of ending up in an unsustainable position – technical debt will continue to accumulate, downtime will occur as networks fail, and the increased operational complexity of stitching together and maintaining networks to support distributed workloads will eventually cause something to slip. Certainly, the commercial implications look unpleasant.

On this basis alone, it’s worth looking at NTT’s survey. It’s also worth asking yourself about your organization’s network strategy. Does it look like the strategy of a top-performing organization or an underperforming one? NTT’s analysis suggests that the difference between the two is more important than we might imagine. To learn more, read the 2022–23 Global Network Report from NTT – you can view the key findings infographic or download the complete report with access to the full data set.


Live shopping is one of the most exciting retail experiences in a long time. As shoppers become increasingly eager to buy via live shopping on social platforms such as Instagram and TikTok, retailers face new challenges: How to capture the shoppers’ attention on social media when the urge to buy hits? How can retailers create seamlessly interactive and immersive experiences that prompt consumers to hit the “buy button” during the live stream?

While retailers see the opportunity to pounce on this new trend, the hurdles they face are immense:

Connecting social commerce touchpoints like TikTok and Instagram, as well as built-in chat, video and voice functions, to commerce experiences is a must for seamless omnichannel experiences. Still, it’s not always easy to deploy new features and channels in their current commerce platform.

Autoscaling traffic peaks from the live shopping experience is necessary, so consumers don’t face slowdowns or crashes when buying a product. When shoppers see a 404 error page, they’re not likely to come back. Unfortunately, most retailers still face scalability problems during sudden traffic spikes.

Experimenting with new touchpoints enables brands and retailers to be ahead of changing customer demands now and in the future. Yet, experimentation is challenging to achieve in today’s IT environment.

These challenges stem from the rigid and monolithic nature of commerce platforms most retailers and brands still use today. These solutions, built for the desktop eCommerce era, lack the flexibility and scalability needed for spontaneous and high-volume sales powered by live shopping.

Flexible, scalable, agile: modern commerce starts with MACH

What’s the alternative for retailers looking at live shopping and beyond? As customer demands and market conditions change quickly, retailers and brands must move faster to capitalise on new ways to sell, such as omnichannel commerce, digital clienteling and personalisation. This means adapting customer experiences on the fly by adding new touchpoints, products, features, locales, currencies and every aspect of commerce without hassle.

This maximum flexibility and scalability philosophy is powered by the principles of MACH (Microservices-based, API-first, Cloud-native and Headless). In a nutshell, MACH-based architecture breaks down functionalities, such as integrating Instagram as a commerce channel, into modular pieces that can be easily customized, deployed, scaled and managed over time. In contrast to all-in-one legacy platforms, retailers using MACH can experiment, scale, change and adapt any functions at any time without disrupting their commerce backend or customer-facing storefronts. As 81% and 88% of adults under 55 years of age in Australia and New Zealand respectively shop online, and nine in 10 retail dollars spent offline during the Australian peak season were influenced by digital, retailers are urged to rethink their digital commerce infrastructure to succeed in this new landscape.
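To illustrate the API-first, headless idea, here is a minimal Python sketch of the kind of cart call a live-shopping overlay might make. The host, endpoint paths, and payload shape are hypothetical rather than any specific vendor’s API; the point is that any touchpoint can drive the same commerce backend over plain HTTP.

```python
# Minimal sketch of an API-first, headless commerce call from a social touchpoint
# (e.g., a live-shopping overlay). Host, paths, and payload shape are hypothetical;
# a real integration would follow the chosen platform's API contract.

import json
import urllib.request

API_BASE = "https://commerce.example.com/api"  # hypothetical headless commerce API

def add_to_cart(cart_id: str, sku: str, quantity: int, channel: str) -> dict:
    """POST a line item to the cart, tagging the originating channel (e.g. 'instagram-live')."""
    payload = json.dumps({
        "sku": sku,
        "quantity": quantity,
        "channel": channel,  # lets downstream analytics attribute the sale to the touchpoint
    }).encode("utf-8")
    request = urllib.request.Request(
        f"{API_BASE}/carts/{cart_id}/line-items",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())

# The same backend call can be triggered from a TikTok overlay, an Instagram
# storefront, or the web shop, because the storefront is decoupled from the
# commerce engine.
# add_to_cart("cart-123", "SKU-42", 1, "instagram-live")
```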

For example, the Canadian menswear retail chain Harry Rosen implemented digital clienteling that sparked online traffic peaks. With MACH, the company had 0% downtime even as page views per session increased by 150%, coping with a three-fold increase in online sales without disruption.

Australian retail giant Kmart opted for MACH-based infrastructure to elevate personalisation and product categorisation, as well as autoscaling capabilities. During the COVID-19 pandemic, Kmart handled three times the online volume compared to pre-pandemic levels, and still, their eCommerce infrastructure was twice as fast. The company doubled the conversion rate and, just as importantly, operated at a third of its previous infrastructure costs.

Lastly, fashion retailer Express saw online traffic spikes that were three times higher than the busiest hour of Black Friday after a sales promotion went viral. Thanks to MACH, the company could avoid downtime and slowdowns to its webshop, fully capitalising on the sudden surge in sales.

Many retailers are shifting to MACH-based platforms to cope with traffic spikes from digital shopping. With modern commerce, retailers can add new touchpoints and experiment with new ways of reaching customers without constraints to the commerce infrastructure. 


With 190 participating countries and 24 million visitors, Expo 2020 Dubai was one of the world’s largest events, connecting everyone to innovative and inspiring ideas for a brighter future. But what does it take to support an event on such a grand scale? The answer is a robust cloud and modern IT infrastructure that allowed 1,200 employees to collaborate with one another amidst nationwide lockdowns and supply chain disruptions.

As part of the World Expos, Expo 2020 Dubai sought to create one of the smartest and most connected places to give its participants a lasting impression beyond the event. Beyond having visitors depart with the knowledge and connections gained from the Expo, the organization also wanted to impart the rich cultural heritage of the United Arab Emirates (UAE). This meant delivering a deeply personalized and hyper-relevant experience, from a smooth ticketing journey to chatbots that offered real-time assistance in multiple languages.

To bring this ambition to life, a robust foundation of technology was necessary, one that could support the seamless integration of systems and apps, and a myriad of digital services, and meet numerous, diverse IT requirements. It was with these in mind that Expo 2020 Dubai decided on a multi-cloud infrastructure that was hyper-flexible, scalable, secure, and reliable enough to support the event’s operations while serving as a platform to manage the build process for the event.

Behind The Winning Cloud Partnership

Expo 2020 Dubai was built from the ground up: a 4.38 km² site comprising sprawling parks, a shopping mall, and the Expo Village. In the same vein, its cloud journey also underpinned the various stages of its development, including civil infrastructure, building construction, crowd management, smart city operations, and marketing. Key to this multi-cloud infrastructure were flexibility, scalability, and security, upon which its integrated, intelligent systems were built. This enabled the Expo teams, vendors, suppliers, and volunteers across nations to work seamlessly together.

It is through the collaborative effort of e& and Accenture that Etisalat OneCloud and Amazon Web Services (AWS) were successfully integrated to make Expo 2020 Dubai one of the first and largest true multi-cloud infrastructures in the region. Etisalat OneCloud provided the resilient, reliable, and secure environment the event needed for its localized business-critical apps, whereas AWS delivered the structure necessary to support global digital services and apps, such as websites, participant portals and eCommerce platforms.

But what brought both solutions together was Accenture Service Delivery Platform, which offered the interconnectivity for enabling several layers of integration at the app and security level.

As the technological groundwork of Expo 2020 Dubai consisted of over 90 applications, Accenture Service Delivery Platform delivered the integration the multi-cloud infrastructure required without any external systems while meeting the stringent app requirements around scalability, security, and hyper-reliability. This was done across six months of development and throughout the entire customer lifecycle spanning awareness, discovery, purchase, and post-sales.

Delivering An Unprecedented Experience

Through this sprawling multi-cloud infrastructure, Expo 2020 Dubai could host all the Pavilion designs, themes, and content from over 190 participating countries while integrating authorizations, supply chain management, and workforce licensing functions. At the same time, the event realized seamless and highly personalized experiences for its visitors with a suite of visitor-facing digital channels, including the official Expo 2020 mobile app, a virtual assistant, and the official website.

Expo 2020 Dubai also incorporated a central information hub and a best-in-class ticketing journey alongside digital services tailored to a visitor’s personal preferences in real-time and in their preferred language. Then there was AMAL, a chatbot powered by artificial intelligence, instrumental in gathering critical information on the Expo shows and attractions while giving live feedback as the event took place.

It is clear that behind this global gathering of nations designed around enhancing our collective knowledge, aspirations, and progress, a large-scale digital transformation took place: one which enabled the multi-cloud environment for Expo 2020 Dubai and was instrumental to the success of this life-changing event.

The Expo’s key themes of opportunity, mobility, and sustainability were succinctly captured in its infrastructure, demonstrating the potential of cloud in unlocking intelligent operations and business agility. As evident in the successes of Expo 2020 Dubai and other businesses, such as leading transport fuels provider Ampol, cloud has become an indispensable cornerstone to succeed in today’s digital-first economy. And it’s this very cloud continuum that will continue to bring businesses one step closer to innovation, aiding them in delivering truly transformative services and experiences.

Read the full story here:  https://www.accenture.com/ae-en/case-studies/applied-intelligence/expo-2020-dubai


By: Lars Koelendorf, EMEA Vice President, Solutions & Enablement at Aruba, a Hewlett Packard Enterprise company

Can an enterprise CEO today be successful without having a strong relationship with the CIO and the corporate network?

The short answer is no. Technology today powers and enables so much of how businesses function. Given the pace of digitization, the corporate network, led by the CIO, is increasingly becoming a critical business decision center for the CEO within the broader context of running a large enterprise.

In particular, there are three points CEOs today must consider when examining the network and their relationship with the CIO.

1. Investing in the network is foundational to achieving business goals

Is there any department across the modern enterprise that would not benefit from the ability to work better, faster, smarter, cheaper, more easily, and more securely?

The COVID-19 pandemic has already proven again and again why digital transformation is now fundamental to business growth and survival, especially in the face of outside, unanticipated events severely impacting normal business operations.

Matching technology with how the business engages key publics, from clients to the community to investors and beyond, allows employees to create higher-quality work while producing more competitive products and services that keep pace with ever-evolving demands. It means empowering back-end functions to support the rest of the business better than before. Meanwhile, regardless of which department they belong to or where they choose to work, employees must have the best experience possible, without any technical roadblocks and complications that can stop them from delivering their best work. Otherwise, employees will seek out that environment elsewhere, and many already are. Indeed, many employees experienced very good connectivity while working from home during the pandemic – and now demand that same easy and seamless experience coming back into the workplace or while on the road.

The key to creating that effective work environment is ensuring the CIO makes clear to the CEO the value of automated systems, which includes not only streamlining operations but also eliminating human error, overcoming human limitations, and freeing up employees to focus on projects that drive real value. In short, with the right technology, CIOs can derive actionable insights from the deluge of data a given company has been accumulating, insights that support the CEO’s long-term vision and business goals.

Enterprise data has the potential to deliver significant cost savings, improve operational efficiency, and even unlock new business opportunities and revenue streams. But first, it needs to be stored, secured, sorted, and analyzed – all of which a great enterprise network can facilitate.

To unlock its full potential, CEOs need to work closely with their CIOs and other department heads to understand the exact impact that the network could have on every area of the business.

2. The network also plays a vital role in achieving sustainability goals

Sustainability is not just a strategic priority. For most companies around the world, sustainability has become the priority, given that it’s being driven both from the top down (by company boards, investors, and governments) and from the bottom up (by employees, the general public, and key communities affected by business operations). In essence, networking capabilities must align with corporate sustainability goals and initiatives to truly achieve their full potential.

The network plays an integral role in empowering enterprises to become more sustainable, to measure and prove that sustainability, and to build more sustainable products and services. Therefore, investing in the right network infrastructure should be at the top of any CEO’s agenda, and they will need to work in tandem with the CIO and other relevant department heads to achieve those aims.

3. A modern network can help the enterprise stay ahead of potential pitfalls

Given the rate of change and disruption, any CEO simply investing just enough in the network to keep operations moving has already lost the plot. The CEO instead must work closely with the CIO to anticipate future business needs, opportunities, and threats, outlining clear goals and corresponding initiatives that ensure the modern network is flexible and nimble enough to meet the challenges.

It used to be that if the network were down, employees could do other manual work while waiting for a fix. Today, however, if there are issues with the network, everything stops, from the factory floor to the storefront to the corporate headquarters. In that sense, the network is mission-critical to keeping the business running.

But the network has so much more potential than this – to help the business continually stay ahead of and be differentiated from the competition. The reason is that an agile network creates the foundation for every area of the business to innovate, from IT to R&D and logistics.

With an agile network, the infrastructure is always ready to integrate, support, secure, and fund any new technological developments that might help the business to move the needle on its goals.

Creating Strong C-suite Connections

While this particular article has focused on the relationship between the CEO and the network, at the end of the day, the CEO must empower the CIO to be an advocate for the network and support all C-suite members to work together towards building one that helps them achieve both individual departmental and collective organizational goals.

For more on creating a modern, agile network, learn about Aruba ESP (Edge Services Platform): https://www.arubanetworks.com/solutions/aruba-esp/

Networking

Introduction
Since its inception decades ago, the primary objective of business intelligence has been the creation of a top-down single source of truth from which organizations would centrally track KPIs and performance metrics with static reports and dashboards. This stemmed from the proliferation of data in spreadsheets and reporting silos throughout organizations, often yielding different and conflicting results. With this new mandate, BI-focused teams were formed, often in IT departments, and they began to approach the problem in the same manner as traditional IT projects, where the business makes a request of IT, IT logs a ticket, then fulfills the request following a waterfall methodology.

While this supplier/consumer approach to BI appeared to be well-suited for the task of centralizing an organization’s data and promoting consistency, it sacrificed business agility.

There was a significant lag between the time the question was asked, and the time the question was answered. This delay and lack of agility within the analysis process led to lackluster adoption and low overall business impact.

The emergence of self-service BI in recent years has challenged the status quo, especially for IT professionals who have spent the better part of the past two decades building out a BI infrastructure designed for developing top-down, centralized reporting and dashboards. Initially, this self-service trend was viewed as a nuisance by most IT departments and was virtually ignored. The focus remained on producing a centrally-managed single source of truth for the organization.

Fast-forward to today and IT finds itself at a crossroad with self-service BI as the new normal that can no longer be ignored. The traditional approach to BI is becoming less and less relevant as the business demands the agility that comes with self-service to drive adoption and improve organization outcomes. This, paired with the continued exponential growth in data volume and complexity, presents IT with an important choice.

As organizations begin the transition from a traditional top-down approach driven by IT to a self-service approach enabled by IT and led by the business, a new framework and overall strategy is required. This means that past decisions supporting the core foundational components of a BI program—people, process, and platform—must be revisited. Adjustments are needed in these three core areas to support the shift from a model of top-down BI development and delivery to a self-service-based modern BI model which is driven, and primarily executed on, by the business.

People
A successful transition to self-service business analytics begins with people and should be the top priority for IT when considering changes required for BI modernization. In a traditional BI model, people were often considered last after platform and process. The widely-used mantra “if you build it, they will come” exemplifies the belief that business users would gravitate toward a well-built system of record for BI that would answer all of their business questions.

This desired end-state rarely came to fruition since there was little to no collaboration between the business users and IT during the process of building the solution after an upfront requirements-gathering phase. In the absence of active engagement and feedback from the business during the time between requirements gathering and project completion, there are many opportunities for failure that typically emerge. A few of the most common include:

• Business or organizational changes occur during the development process that render the initial requirements obsolete or invalid.
• Incomplete or inaccurate requirements are given in the initial process phases.
• Errors are made in the process of translating business requirements into technical requirements.

The end result of these situations is often that business users disengage from the BI program completely and an organization’s investment in time and resources are wasted due to lack of adoption. Business users and analysts need to use analytics in order for it to have any impact and deliver organizational value. A BI model that embraces self-service puts these users first and allows them to explore, discover, and build content that they will ultimately use to make better business decisions and transform business processes.

Collaboration between the business and IT is critical to the success of the implementation as IT knows how to manage data and the business knows how to interpret and use data within the business processes they support. They have the context within which analytics and the insight derived from it will be used to make better business decisions and ultimately improve outcomes. This collaboration of the groups early on will not only lead to the deployment of a platform that meets the needs of the business but also drives adoption and impact of the platform overall.

Process
Self-service analytics does not mean end users are allowed unfettered access to any and all data and analytic content. It means they have the freedom to explore pertinent business data that is trusted, secure, and governed. This is where process comes into play and represents the component that requires the most significant shift in traditional thinking for IT. A successful modern BI program is able to deliver both IT control and end-user autonomy and agility. A well-established and well-communicated process is required for an organization to strike this delicate balance.

A top-down, waterfall-based process only addresses the IT control part of the equation. A traditional BI deployment focuses primarily on locking down data and content with governance. This means limiting access and freedom to only a few people with specialized technical skills who are expected to meet the needs and answer the questions of the many. This typically involves developer-centric processes to design and build the enterprise data warehouse (EDW) model, build the ETL jobs to transform and load data into the model, construct the semantic layer to mask the complexity of the underlying data structures, and finally build the businessfacing reports and dashboards as originally requested by the business.

The unfortunate reality is that this approach often fails to deliver on the vision and promise of BI—to deliver significant and tangible value to the organization through improved decision making with minimal time, effort, and cost. Top-down, IT-led BI models often have an inverse profile of time, effort, and cost relative to the value they deliver to the organization.


A modern analytics solution requires new processes and newly defined organizational roles and responsibilities to truly enable a collaborative, self-service development process. IT and business users must jointly develop the rules of the road for a secure environment that both groups abide by, in order to maximize the business value of analytics without compromising the governance or security of the data.

IT’s success is highlighted, and its value to the organization demonstrated, when the business derives significant value and benefit from its investments in analytics and BI. Should IT still be considered successful if not a single end user uses the BI system to influence a single business decision? Traditional processes built to serve top-down BI deployments are too often measured by metrics that are not tied to outcomes or organizational impact: if the ETL jobs that IT created ran without failure, the EDW loaded without error, and all downstream reports refreshed, many IT organizations would consider themselves successful.

Merely supplying data and content to users, with no regard for whether it is adopted and provides value through improved outcomes, is simply not enough. Modern BI requires updated processes to support self-service analytics across the organization. It also requires defining new success metrics for which IT and the business are jointly accountable and in which both are therefore equally invested.
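
As one illustration of a jointly owned, outcome-oriented metric, the sketch below computes a 30-day adoption rate from usage data rather than from job status. The user list, the log format, and the field names are hypothetical assumptions; most modern BI platforms expose comparable audit or usage information.

```python
# Sketch of an outcome-oriented success metric: 30-day active analytics users
# as a share of licensed users. The usage log and user list are hypothetical.
from datetime import date, timedelta

licensed_users = {"ana", "ben", "carla", "deepa", "eli"}

# (user, date of most recent content view), as pulled from a usage/audit log.
usage_log = [
    ("ana", date(2024, 5, 28)),
    ("ben", date(2024, 5, 2)),
    ("carla", date(2024, 3, 14)),
]

as_of = date(2024, 5, 31)
window_start = as_of - timedelta(days=30)

active = {user for user, last_view in usage_log if last_view >= window_start}
adoption_rate = len(active & licensed_users) / len(licensed_users)

# Unlike "did the nightly jobs succeed?", this number only moves when business
# users actually engage with the content.
print(f"30-day adoption: {adoption_rate:.0%}")  # -> 40%
```

Because a metric like this improves only when people use the platform, IT and the business stay invested in the same outcome.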

Where processes and technology intertwine, there is tremendous opportunity. Technical innovations, especially with AI, will continue to make it easier to automate processes and augment users of all skill levels throughout the analytics workflow. And while process can accelerate, rather than inhibit, successful analytics outcomes, it’s important to recognize that this relies on a system and interface that people are eager to use. If processes aren’t supported by the right platform choice, they will stifle adoption.

Platform
Since BI has historically been viewed as an IT initiative, it is not surprising that IT drove virtually every aspect of platform evaluation, selection, purchasing, implementation, deployment, development, and administration. But with drastic changes required to modernize the people and process components of a BI and analytics program, IT must change the criteria for choosing the technology that meets these evolving requirements. Perhaps the most obvious change is that IT must intimately involve business users and analysts from across the organization in evaluating and ultimately deciding which modern platform best fits the organization and addresses the broad needs of its users. For more information on selecting the right analytics platform, check out the Evaluation Guide.

A modern platform must address a wide range of needs and personas, as well as the increased pace of business and the exponential growth in data volume and complexity. IT requires a platform that enables governance and security, while end users require easy access to content and the ability to explore and discover in a safe environment.

The chosen platform must also be able to evolve with the landscape and integrate easily with other systems within an organization. A centralized EDW containing all data needed for analysis, which was the cornerstone of traditional BI, is simply not possible in the big-data era. Organizations need a platform that can adapt to an evolving data landscape and insulate users from increased complexity and change.

The most critical aspect is the ability to meet these diverse needs in an integrated and intuitive way. This integration is depicted in the modern analytic workflow diagram below, which highlights the five key capabilities that must flow seamlessly in order for the three personas at its center to effectively leverage the platform.

[Figure: The modern analytic workflow (Tableau)]

The BI and analytics platform landscape has passed a tipping point, as the market for modern products is experiencing healthy growth while the traditional segment of the market is declining with little to no net new investment. IT leaders should capitalize on this market shift and seize the opportunity to redefine their role in BI and analytics as a far more strategic one that is critical to the future success of the organization. Adopting a collaborative approach to recast the foundational aspects of the BI program and truly support self-service is the key to changing the perception of IT from a producer to a strategic partner and enabler for the organization.

Promise
In today’s era of digital transformation, IT leaders are increasingly expected to take on digital business initiatives in their organizations, including identifying cost savings and finding new revenue streams. Realizing the value of data for these transformational efforts, many businesses are modernizing and increasing their analytics investments to innovate and accelerate change. Everyone agrees that putting data at the center of conversations promises change. However, most organizations are failing to successfully implement an enterprise-wide analytics program.

IT is well positioned for a leadership role in these efforts and is essential for giving people the relevant data they need for decision-making. Modern analytics shifts IT’s role to that of a more strategic partner for the business, empowering users to navigate a trusted, self-service environment. But beyond access to the data, everyone needs the motivation and confidence to make decisions with it. You need to identify the relationships between job functions and data, and change behaviors that run deep in the fabric of your organization’s culture.

This also means expanding your definition of self-service so that business users participate in some of the traditionally IT-led responsibilities associated with data and analytics, such as administration, governance, and education. With the right processes, standards, and change management, business users can help manage data sources, analytics content, and users in the system, as well as contribute to training, evangelism, and the internal community. When users value and participate in these efforts, IT can focus on strategic initiatives like meeting business SLAs and securing company assets.

Although every organization’s journey to becoming a data-driven organization will differ, achieving your transformational goals requires a deliberate and holistic approach to developing your analytics practice. Success at scale relies on a systematic, agile approach to identifying key sources of data, determining how data is selected, managed, distributed, consumed, and secured, and deciding how users are educated and engaged. The better you understand your organization’s requirements, the better you will be able to proactively support the broad use of data.

Tableau Blueprint provides concrete plans, recommendations, and guidelines as a step-by-step guide to creating a data-driven organization with modern analytics. We worked with thousands of customers and analytics experts to capture best practices that help turn repeatable processes into core capabilities to build and reinforce a data-driven mindset throughout your organization. Learn more and get started today.

IT Leadership