Every company and government entity is tasked with striking a critical balance between data access and security. As Forrester’s Senior Analyst Richard Joyce stated, “For a typical Fortune 1000 company, just a 10 percent increase in data accessibility will result in more than $65 million additional net income.” As the need to become more data-driven accelerates, it’s imperative that enterprises give equal weight to privacy and governance requirements.

To achieve this balance, we need to change how we perceive data security. Amid growing friction between teams — those who create and manage data access policies and those who need data to do their jobs — we must accept that security, IT, and privacy teams want to provision as much data as possible. They simply face real constraints and compliance complexity.

Traditionally, data security, privacy, and regulations have been thought of as a cost center expense. Instead, we need to look at data security as the means for positive change, a driver for greater data accessibility, enhanced operational efficiency, and actual business value.

Many remain far short of the goal

Enterprises of all sizes struggle with the shift. NewVantage Partners’ Data and AI Leadership Executive Survey 2023 found that less than a quarter of firms reported having a data-driven organization or data culture. And in the State of Data and Analytics Governance, Gartner suggests that by 2025, 80 percent of organizations seeking to scale digital business, including analytical initiatives, will fail because they don’t modernize their data and analytics governance.

Data access drives growth. So, what’s the reason for low data-culture adoption? Becoming truly data-driven requires tight collaboration between many different functions, and there’s a lack of clarity about individual roles and responsibilities. These strategic gaps must be addressed.

The reality of data security and access

When it comes to data security and access, companies are typically either:

• Overly restrictive on data access. Data security is seen as an impediment to overall company growth, typically due to data, organizational, and technological complexity.
• Or, overly focused on perimeter and application defenses, leveraging cyberdefenses and coarse-grained identity and access management (IAM). Data systems remain open to exploitation in the event of a breach.

Most experience the worst of both scenarios, where data security and access are simply broken — inconsistent and piecemeal.

A primary challenge of solving the data democratization balancing act lies in the complex web of internal and external privacy, security, and governance policies. They change over time and need to be applied and maintained consistently and efficiently across business teams.

In the middle are the technical teams managing complex data and analytical systems. Given these constraints, security, privacy, and data management teams default to tightly locking down data to ensure compliance and security. It’s not any one team’s fault, but it is a major blocker to becoming data-driven.

Unified data security platform

Siloed and decentralized operations, inefficiency, unclear roles and responsibilities, and the absence of a holistic strategy: so what’s the solution as more companies face costly data breaches and low data usability rates? An enterprise-wide, scalable strategy that leverages a unified data security platform — one with integrated capabilities to simplify and automate data security processes across the entire data and analytics ecosystem. With the ability to discover and classify sensitive data, data attributes can be used to grant authorized users access automatically and instantaneously. Proper data security governance helps teams get access to more data, faster.
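The idea of attribute-driven access can be sketched in a few lines. This is a hypothetical illustration, not Privacera’s implementation: classification tags discovered on data columns are compared against a user’s clearances, so access is granted automatically instead of through a manual ticket.

```python
from dataclasses import dataclass

@dataclass
class User:
    id: str
    clearances: set  # sensitivity levels this user may see, e.g. {"pii"}

@dataclass
class Column:
    name: str
    tags: set        # classification tags discovered on the data, e.g. {"pii"}

def authorized_columns(user, columns):
    """Grant access to every column whose sensitivity tags are fully
    covered by the user's clearances -- no per-request ticket needed."""
    return [c.name for c in columns if c.tags <= user.clearances]

# Example: an analyst cleared for non-sensitive data only
catalog = [
    Column("order_id", set()),
    Column("amount", set()),
    Column("email", {"pii"}),
]
analyst = User("u1", set())
print(authorized_columns(analyst, catalog))  # ['order_id', 'amount']
```

The same check, run at query time against centrally maintained tags, is what lets policy changes propagate consistently across teams.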

Additional data masking and encryption layers can be added to make sensitive data available for analytics without compromising security. Even if a breach occurs, fine-grained access limits exposure, and audit capabilities quickly identify compromised data.
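Two common masking techniques mentioned in this vein are partial redaction and deterministic tokenization. The sketch below is illustrative only (the function names and formats are assumptions, not any vendor’s API); the point is that masked data stays useful for analytics — the domain survives redaction, and equal inputs tokenize to equal tokens so joins still work.

```python
import hashlib

def mask_email(value: str) -> str:
    """Partial redaction: keep the domain for analytics, hide the local part."""
    local, _, domain = value.partition("@")
    return f"{'*' * len(local)}@{domain}"

def tokenize(value: str) -> str:
    """Deterministic token: equal inputs map to equal tokens, so joins
    and group-bys still work on the masked data."""
    return hashlib.sha256(value.encode()).hexdigest()[:12]

print(mask_email("jane.doe@example.com"))  # ********@example.com
```

Even if tokenized data leaks, the original values are not directly recoverable, which is what limits exposure after a breach.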

Executing a proper data security strategy provides the last mile of the data governance and cataloging journey — all key to the balancing act of data democratization, with comprehensive data governance enabling faster insights while maintaining compliance.

Enterprise-wide governed data sharing

Privacera helps Fortune companies modernize their data architecture via a holistic, enterprise-wide data security platform and automated data governance. A data security platform empowers the data democratization you need to increase data usability and nurture your data-driven culture. Analysts and business units get more data faster. IT frees up time and resources. Security and privacy teams easily monitor and implement data security policies.

Learn more about achieving modern data security governance and democratized analytics for faster insights here.

Data and Information Security

Here’s a proposition to consider: among the ranks of large enterprises, commercial success increasingly relies on digital transformation. In turn, digital transformation relies on modernized enterprise networks that deliver flexibility, performance and availability from the edge to the cloud. Intuitively, this hypothesis makes a lot of sense.

In many enterprises, it’s also increasingly becoming the subject of painstaking debate. After two years of quick-fix digitalization on top of pre-COVID-era network technologies, the limits of the status quo are becoming evident. All too often, legacy networks limit the potential for digital transformation. In many organizations, it’s way past time to address the fundamentals.

If this debate sounds familiar to you, it’s worth looking at the 2022-23 Global Network Report from NTT, a new piece of research that offers an intriguing view of how enterprises around the world are managing their networks.

Among other things, NTT’s survey suggests a strong correlation between a willingness to invest in modernizing networks and high levels of commercial performance. At the other end of the spectrum, NTT’s survey confirms many enterprise networks suffer from long-term underinvestment and increasing levels of technical debt. The distance between these two different approaches feels substantial.

NTT’s report – based on responses from over 1,300 network specialists and IT and business decision-makers worldwide – defines high levels of commercial performance using straightforward criteria. To qualify as a “top-performer”, organizations in the survey needed to have generated year-on-year revenue growth of over 10%. They also needed to have generated operating margins of over 15% in the last financial year.

In network terms, what do these organizations look like? It’s here that a willingness to invest in modern network technologies starts to look like an indispensable ingredient for high performance in commercial terms.

Nine out of 10 top-performing organizations are increasing network investment to support digital transformation. Many are spending over 2% of their annual revenues – a significant sum – on their networks, deploying technologies designed to enable rapid transformation, provide greater availability and flexibility, and support not just today’s requirements, but tomorrow’s requirements as well.

Eight out of 10 high-performing organizations say their network strategy is aligned with their business goals. In practice, this involves a clear understanding that the quality of the network directly affects their ability to address the most pressing business and digital transformation challenges. (By contrast, only 42% of underperformers share this sentiment.)

The underperformers in NTT’s survey are a mirror image of these overachieving organizations. Most CIOs and CTOs at these companies agree that networks play a vital role in delivering revenue growth. They also recognize that business demands for increased speed, agility and innovation can only be satisfied by new operating models. And yet these organizations typically suffer from delayed upgrades, high levels of technical debt and poor visibility across the network.

The older the network is, the greater the chance of negative impacts on service delivery, customer satisfaction and the employee experience. Some 69% of the CIOs and CTOs surveyed by NTT say technical debt continues to accumulate. Asked to identify the risks generated by underinvestment, respondents most frequently pointed to classic effects of technical debt: inflated IT operational costs and limited availability of new services required for digital transformation.

For these enterprises, networks threaten to become a cross between a millstone and a minefield (slowing down progress and continually threatening to blow up in the face of network professionals).

In this hybrid and hyperconnected world where organizations need to deliver great employee and customer experiences, the network provides the fabric of the digital organization. NTT’s intelligent and secure Network as a Service enables a complete edge-to-cloud strategy, delivering a wide array of benefits: increased agility, reduced risk, greater flexibility, scalability, automation, predictability and control.

Given today’s high-performance hybrid environment, Matthew Allen, Vice President, Service Offer Management – Networking at NTT, suspects that the status quo is time-limited for underperforming enterprises.

“You can start to transform your business on the networks you have. However, as this business transformation drives a distribution of applications and business functions across many, diverse locations (SaaS, PaaS, IaaS, private cloud, etc.), a legacy network solution will not be able to keep pace with this change – it will become increasingly difficult for distributed applications and workloads to communicate effectively and securely, at the speed the business requires.”

NTT’s survey suggests organizations that delay network modernization run the risk of ending up in an unsustainable position – technical debt will continue to accumulate, downtime will occur as networks fail, and the increased operational complexity of stitching together and maintaining networks to support distributed workloads will eventually cause something to slip. Certainly, the commercial implications look unpleasant.

On this basis alone, it’s worth looking at NTT’s survey. It’s also worth asking yourself about your organization’s network strategy. Does it look like the strategy of a top-performing organization or an underperforming one? NTT’s analysis suggests that the difference between the two is more important than we might imagine. To learn more, read the 2022–23 Global Network Report from NTT – you can view the key findings infographic or download the complete report with access to the full data set.

Networking

Live shopping is one of the most exciting retail experiences in a long time. As shoppers become increasingly eager to buy via live shopping on social platforms such as Instagram and TikTok, retailers face new challenges: How to capture the shoppers’ attention on social media when the urge to buy hits? How can retailers create seamlessly interactive and immersive experiences that prompt consumers to hit the “buy button” during the live stream?

While retailers see the opportunity to pounce on this new trend, the hurdles they face are immense:

• Connecting social commerce touchpoints like TikTok and Instagram, as well as built-in chat, video and voice functions, to commerce experiences is a must for seamless omnichannel experiences. Still, it’s not always easy to deploy new features and channels in their current commerce platform.
• Autoscaling through traffic peaks from the live shopping experience is necessary so consumers don’t face slowdowns or crashes when buying a product. When shoppers see a 404 error page, they’re not likely to come back. Unfortunately, most retailers still face scalability problems during sudden traffic spikes.
• Experimenting with new touchpoints enables brands and retailers to stay ahead of changing customer demands now and in the future. Yet experimentation is challenging to achieve in today’s IT environment.

These challenges stem from the rigid and monolithic nature of commerce platforms most retailers and brands still use today. These solutions, built for the desktop eCommerce era, lack the flexibility and scalability needed for spontaneous and high-volume sales powered by live shopping.

Flexible, scalable, agile: modern commerce starts with MACH

What’s the alternative for retailers looking at live shopping and beyond? As customer demands and market conditions change quickly, retailers and brands must move faster to capitalise on new ways to sell, such as omnichannel commerce, digital clienteling and personalisation. This means adapting customer experiences on the fly by adding new touchpoints, products, features, locales, currencies and every aspect of commerce without hassle.

This maximum flexibility and scalability philosophy is powered by the principles of MACH (Microservices-based, API-first, Cloud-native and Headless). In a nutshell, MACH-based architecture breaks down functionalities — such as integrating Instagram as a commerce channel — into modular pieces that can be easily customized, deployed, scaled and managed over time. In contrast to all-in-one legacy platforms, retailers using MACH can experiment, scale, change and adapt any function at any time without disrupting their commerce backend or customer-facing storefronts. As 81% and 88% of adults under 55 years of age in Australia and New Zealand respectively shop online, and nine in 10 retail dollars spent offline during the Australian peak season were influenced by digital, retailers are urged to rethink their digital commerce infrastructure to succeed in this new landscape.
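The headless, API-first idea can be illustrated with a toy sketch (service names, fields, and data here are all hypothetical, not any platform’s actual API): each capability lives behind its own service, and any touchpoint — web, mobile, or a live-shopping overlay — composes the same backend data.

```python
# Each capability is its own service behind an API; touchpoints compose
# them instead of living inside one monolithic platform.

def product_service(sku):
    # Hypothetical catalog microservice
    return {"sku": sku, "name": "Linen Shirt", "price_aud": 89.0}

def inventory_service(sku):
    # Hypothetical inventory microservice, scaled independently of catalog
    return {"sku": sku, "in_stock": True}

def storefront_view(sku, channel):
    """Headless rendering: the same backend data serves any touchpoint --
    web, mobile app, or a live-shopping overlay on a social platform."""
    product = product_service(sku)
    stock = inventory_service(sku)
    return {
        "channel": channel,
        "title": product["name"],
        "price": product["price_aud"],
        "buyable": stock["in_stock"],
    }

print(storefront_view("SKU-1", "instagram-live"))
```

Because each service is deployed and scaled on its own, a traffic spike on the live-shopping channel can be absorbed by scaling only the services it touches.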

For example, the Canadian menswear retail chain Harry Rosen implemented digital clienteling that sparked online traffic peaks. With MACH, the company had zero downtime even as page views per session increased by 150%, coping with a three-fold increase in online sales without disruption.

Australian retail giant Kmart opted for MACH-based infrastructure to elevate personalisation and product categorisation, as well as autoscaling capabilities. During the COVID-19 pandemic, Kmart handled three times the online volume compared to pre-pandemic levels, and still, their eCommerce infrastructure was twice as fast. The company doubled the conversion rate and, just as importantly, operated at a third of its previous infrastructure costs.

Lastly, fashion retailer Express saw online traffic spikes that were three times higher than the busiest hour of Black Friday after a sales promotion went viral. Thanks to MACH, the company could avoid downtime and slowdowns to its webshop, fully capitalising on the sudden surge in sales.

Many retailers are shifting to MACH-based platforms to cope with traffic spikes from digital shopping. With modern commerce, retailers can add new touchpoints and experiment with new ways of reaching customers without constraints to the commerce infrastructure. 

E-commerce Services

With 190 participating countries and 24 million visitors, Expo 2020 Dubai was one of the world’s largest events, connecting everyone to innovative and inspiring ideas for a brighter future. But what does it take to support an event on such a grand scale? The answer is a robust cloud and modern IT infrastructure, which would allow 1,200 employees to collaborate with one another, amidst nationwide lockdowns and supply chain disruptions.

As part of the World Expos, Expo 2020 Dubai sought to create one of the smartest and most connected places to give its participants a lasting impression beyond the event. More than just departing with the knowledge and connections gained from the Expo, the organization also wanted to impart the rich cultural heritage of the United Arab Emirates (UAE). This means delivering a deeply personalized and hyper-relevant experience, from a smooth ticketing journey to chatbots that offered real-time assistance in multiple languages.

To bring this ambition to life, a robust foundation of technology was necessary, one that could support the seamless integration of systems and apps, and a myriad of digital services, and meet numerous, diverse IT requirements. It was with these in mind that Expo 2020 Dubai decided on a multi-cloud infrastructure that was hyper-flexible, scalable, secure, and reliable enough to support the event’s operations while serving as a platform to manage the build process for the event.

Behind The Winning Cloud Partnership

Expo 2020 Dubai was built from the ground up: a 4.38km² site comprising sprawling parks, a shopping mall, and the Expo Village. In the same vein, its cloud journey underpinned the various stages of its development, including civil infrastructure, building construction, crowd management, smart city operations, and marketing. Key to this multi-cloud infrastructure were flexibility, scalability, and security, upon which its integrated, intelligent systems were built. This enabled the Expo teams, vendors, suppliers, and volunteers across nations to work seamlessly together.

It is through the collaborative effort of e& and Accenture that the Etisalat OneCloud and Amazon Web Services (AWS) were successfully integrated to make Expo 2020 Dubai one of the first and largest true multi-cloud infrastructures in the region. Etisalat OneCloud provided the resilient, reliable, and secure environment the event needed for its localized business-critical apps, whereas AWS delivered the structure necessary to support global digital services and apps, such as websites, participant portals and eCommerce platforms.

But what brought both solutions together was Accenture Service Delivery Platform, which offered the interconnectivity for enabling several layers of integration at the app and security level.

As the technological groundwork of Expo 2020 Dubai consisted of over 90 applications, Accenture Service Delivery Platform delivered the integration the multi-cloud infrastructure required without any external systems while meeting the stringent app requirements around scalability, security, and hyper-reliability. This was done across six months of development and throughout the entire customer lifecycle spanning awareness, discovery, purchase, and post-sales.

Delivering An Unprecedented Experience

Through this sprawling multi-cloud infrastructure, Expo 2020 Dubai could host all the Pavilion designs, themes, and content from over 190 participating countries while integrating authorizations, supply chain management, and workforce licensing functions. At the same time, the event realized seamless and highly personalized experiences for its visitors with a suite of visitor-facing digital channels. These included the Expo 2020 official mobile app, a virtual assistant, and the official website.

Expo 2020 Dubai also incorporated a central information hub and a best-in-class ticketing journey alongside digital services tailored to a visitor’s personal preferences in real-time and in their preferred language. Then there was AMAL, a chatbot powered by artificial intelligence, instrumental in gathering critical information on the Expo shows and attractions while giving live feedback as the event took place.

It is clear that behind this global gathering of nations designed around enhancing our collective knowledge, aspirations, and progress, a large-scale digital transformation took place: one which enabled the multi-cloud environment for Expo 2020 Dubai and was instrumental to the success of this life-changing event.

The Expo’s key themes of opportunity, mobility, and sustainability were succinctly captured in its infrastructure, demonstrating the potential of cloud in unlocking intelligent operations and business agility. As evident in the successes of Expo 2020 Dubai and other businesses, such as leading transport fuels provider Ampol, cloud has become an indispensable cornerstone to succeed in today’s digital-first economy. And it’s this very cloud continuum that will continue to bring businesses one step closer to innovation, aiding them in delivering truly transformative services and experiences.

Read the full story here:  https://www.accenture.com/ae-en/case-studies/applied-intelligence/expo-2020-dubai

Hybrid Cloud, Infrastructure Management, Multi Cloud

By: Lars Koelendorf, EMEA Vice President, Solutions & Enablement at Aruba, a Hewlett Packard Enterprise company

Can an enterprise CEO today be successful without having a strong relationship with the CIO and the corporate network?

The short answer is no. Technology today powers and enables so much of how businesses function. Given the pace of digitization, the corporate network, led by the CIO, is increasingly becoming a critical business decision center for the CEO within the broader context of running a large enterprise.

In particular, there are three points CEOs today must consider when examining the network and their relationship with the CIO.

1. Investing in the network is foundational to achieving business goals

Is there any department across the modern enterprise business that would not benefit from the ability to work better, faster, easier, smarter, cheaper, and more securely?

The COVID-19 pandemic has already proven again and again why digital transformation is now fundamental to business growth and survival, especially in the face of outside, unanticipated events severely impacting normal business operations.

Matching technology with how the business engages key publics, from clients to the community to investors and beyond, allows employees to create higher quality work while producing more competitive products and services that keep pace with ever-evolving demands. It means empowering back-end functions to support the rest of the business better than before. Meanwhile, regardless of which department they belong to or where they choose to work, employees must have the best experience possible, without any technical roadblocks and complications that can stop them from delivering their best work. Otherwise, employees will seek out — and are already seeking out — that environment elsewhere. Indeed, many employees experienced very good connectivity while working from home during the pandemic, and now demand that same easy and seamless experience back in the workplace or while on the road.

The key to creating that effective work environment is ensuring the CIO makes clear to the CEO the value of automated systems, which not only includes streamlining operations, but eliminating human error, overcoming human limitations, and freeing up employees to focus on projects that drive real value. In short, with the right technology, CIOs can drive actionable insights from the deluge of data that a given company has been accumulating that support the CEO’s long-term vision and business goals.

Enterprise data has the potential to deliver significant cost savings, improve operational efficiency, and even unlock new business opportunities and revenue streams. But first, it needs to be stored, secured, sorted, and analyzed – all of which a great enterprise network can facilitate.

To unlock its full potential, CEOs need to work closely with their CIOs and other department heads to understand the exact impact that the network could have on every area of the business.

2. The network also plays a vital role in achieving sustainability goals

Sustainability is not just a strategic priority. For most companies around the world, sustainability has become the priority, given that it’s being driven both from the top down (by company boards, investors, and governments) and from the bottom up (by employees, the general public, and key communities affected by business operations). In essence, networking capabilities must align with corporate sustainability goals and initiatives to truly achieve their full potential.

The network plays an integral role in empowering enterprises to become more sustainable, to measure and prove that sustainability, and to build more sustainable products and services. Therefore, investing in the right network infrastructure should be at the top of any CEO’s agenda, and they will need to work in tandem with the CIO and other relevant department heads to achieve those aims.

3. A modern network can help the enterprise stay ahead of potential pitfalls

Given the rate of change and disruption, any CEO who invests just enough in the network to keep operations moving has already lost the plot. The CEO instead must work closely with the CIO to anticipate future business needs, opportunities, and threats, outlining clear goals and corresponding initiatives that ensure the modern network is flexible and nimble enough to meet the challenges.

It used to be that if the network were down, employees could do other manual work while waiting for a fix. Today, however, if there are issues with the network, everything stops, from the factory floor to the storefront to the corporate headquarters. In that sense, the network is mission-critical to keeping the business running.

But the network has so much more potential than this – to help the business continually stay ahead of and be differentiated from the competition. The reason is an agile network creates the foundation for every area of the business to innovate, from IT to R&D and logistics.

With an agile network, the infrastructure is always ready to integrate, support, secure, and fund any new technological developments that might help the business to move the needle on its goals.

Creating Strong C-suite Connections

While this particular article has focused on the relationship between the CEO and the network, at the end of the day, the CEO must empower the CIO to be an advocate for the network and support all C-suite members to work together towards building one that helps them achieve both individual departmental and collective organizational goals.

For more on creating a modern, agile network, learn about Aruba ESP (Edge Services Platform): https://www.arubanetworks.com/solutions/aruba-esp/

Networking

Introduction
Since its inception decades ago, the primary objective of business intelligence has been the creation of a top-down single source of truth from which organizations would centrally track KPIs and performance metrics with static reports and dashboards. This stemmed from the proliferation of data in spreadsheets and reporting silos throughout organizations, often yielding different and conflicting results. With this new mandate, BI-focused teams were formed, often in IT departments, and they began to approach the problem in the same manner as traditional IT projects, where the business makes a request of IT, IT logs a ticket, then fulfills the request following a waterfall methodology.

While this supplier/consumer approach to BI appeared to be well-suited for the task of centralizing an organization’s data and promoting consistency, it sacrificed business agility.

There was a significant lag between the time a question was asked and the time it was answered. This delay and lack of agility within the analysis process led to lackluster adoption and low overall business impact.

The emergence of self-service BI in recent years has challenged the status quo, especially for IT professionals who have spent the better part of the past two decades building out a BI infrastructure designed for developing top-down, centralized reporting and dashboards. Initially, this self-service trend was viewed as a nuisance by most IT departments and was virtually ignored. The focus remained on producing a centrally-managed single source of truth for the organization.

Fast-forward to today and IT finds itself at a crossroads, with self-service BI the new normal that can no longer be ignored. The traditional approach to BI is becoming less and less relevant as the business demands the agility that comes with self-service to drive adoption and improve organizational outcomes. This, paired with the continued exponential growth in data volume and complexity, presents IT with an important choice.

As organizations begin the transition from a traditional top-down approach driven by IT to a self-service approach enabled by IT and led by the business, a new framework and overall strategy is required. This means that past decisions supporting the core foundational components of a BI program—people, process, and platform—must be revisited. Adjustments are needed in these three core areas to support the shift from a model of top-down BI development and delivery to a self-service-based modern BI model which is driven, and primarily executed on, by the business.

People
A successful transition to self-service business analytics begins with people and should be the top priority for IT when considering changes required for BI modernization. In a traditional BI model, people were often considered last after platform and process. The widely-used mantra “if you build it, they will come” exemplifies the belief that business users would gravitate toward a well-built system of record for BI that would answer all of their business questions.

This desired end-state rarely came to fruition since there was little to no collaboration between the business users and IT during the process of building the solution after an upfront requirements-gathering phase. In the absence of active engagement and feedback from the business during the time between requirements gathering and project completion, there are many opportunities for failure that typically emerge. A few of the most common include:

• Business or organizational changes occur during the development process that render the initial requirements obsolete or invalid.
• Incomplete or inaccurate requirements are given in the initial process phases.
• Errors are made in the process of translating business requirements into technical requirements.

The end result of these situations is often that business users disengage from the BI program completely and an organization’s investment in time and resources is wasted due to lack of adoption. Business users and analysts need to use analytics in order for it to have any impact and deliver organizational value. A BI model that embraces self-service puts these users first and allows them to explore, discover, and build content that they will ultimately use to make better business decisions and transform business processes.

Collaboration between the business and IT is critical to the success of the implementation as IT knows how to manage data and the business knows how to interpret and use data within the business processes they support. They have the context within which analytics and the insight derived from it will be used to make better business decisions and ultimately improve outcomes. This collaboration of the groups early on will not only lead to the deployment of a platform that meets the needs of the business but also drives adoption and impact of the platform overall.

Process
Self-service analytics does not mean end users are allowed unfettered access to any and all data and analytic content. It means they have the freedom to explore pertinent business data that is trusted, secure, and governed. This is where process comes into play and represents the component that requires the most significant shift in traditional thinking for IT. A successful modern BI program is able to deliver both IT control and end-user autonomy and agility. A well-established and well-communicated process is required for an organization to strike this delicate balance.

A top-down, waterfall-based process only addresses the IT control part of the equation. A traditional BI deployment focuses primarily on locking down data and content with governance. This means limiting access and freedom to only a few people with specialized technical skills who are expected to meet the needs and answer the questions of the many. This typically involves developer-centric processes to design and build the enterprise data warehouse (EDW) model, build the ETL jobs to transform and load data into the model, construct the semantic layer to mask the complexity of the underlying data structures, and finally build the business-facing reports and dashboards as originally requested by the business.
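To make the developer-centric pipeline concrete, here is a deliberately minimal sketch of one ETL job of the kind described above — extract raw rows, transform them to the warehouse grain, load a fact table. Table names, columns, and data are all illustrative assumptions, not from any real deployment.

```python
import sqlite3

def run_etl(conn):
    """One minimal ETL job: extract raw rows, transform (cast and
    aggregate to the warehouse grain), load into a fact table."""
    conn.execute("CREATE TABLE IF NOT EXISTS fact_sales (day TEXT, revenue REAL)")
    # Extract: raw source rows (hard-coded here in place of a source system)
    raw = [("2024-01-01", "100.50"), ("2024-01-01", "49.50"), ("2024-01-02", "75.00")]
    # Transform: cast strings to numbers and aggregate to daily grain
    totals = {}
    for day, amount in raw:
        totals[day] = totals.get(day, 0.0) + float(amount)
    # Load: write the modeled rows into the warehouse table
    conn.executemany("INSERT INTO fact_sales VALUES (?, ?)", sorted(totals.items()))
    conn.commit()

conn = sqlite3.connect(":memory:")
run_etl(conn)
print(conn.execute("SELECT * FROM fact_sales").fetchall())
# [('2024-01-01', 150.0), ('2024-01-02', 75.0)]
```

Every question the business asks must be anticipated by jobs like this one, which is precisely why the waterfall model struggles to keep up with changing requirements.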

The unfortunate reality is that this approach often fails to deliver on the vision and promise of BI—to deliver significant and tangible value to the organization through improved decision making with minimal time, effort, and cost. Top-down, IT-led BI models often have an inverse profile of time, effort, and cost relative to the value they deliver to the organization.

Tableau

A modern analytics solution requires new processes and newly defined organizational roles and responsibilities to truly enable a collaborative, self-service-based development process. IT and users must collaborate to jointly develop the rules of the road for their secure environment, rules both sides must abide by in order to maximize the business value of analytics without compromising the governance or security of the data.

IT’s success is highlighted, and its value to the organization realized, when the business derives significant value and benefit from its investments in analytics and BI. Should IT still be considered successful if not a single end user uses the BI system to influence a single business decision? Traditional processes intended to serve top-down BI deployments are too often measured by metrics that are not tied to outcomes or organizational impact. If the ETL jobs that IT created ran without failure, the EDW was loaded without error, and all downstream reports refreshed, many IT organizations would consider themselves successful.

Merely supplying data and content to users without any regard for whether or not it is adopted and provides value through improved outcomes is simply not enough. Modern BI requires updated processes to support self-service analytics across the organization. It also requires the definition of new success metrics for which IT and the business are jointly accountable and are therefore equally invested.

Where processes and technology intertwine, there is tremendous opportunity. Technical innovations, especially with AI, will continue to make it easier to automate processes and augment users of all skill levels throughout the analytics workflow. And while process can accelerate, rather than inhibit, successful analytics outcomes, it’s important to recognize that this relies on a system and interface that people are eager to use. If processes aren’t supported by the right platform choice, they will stifle adoption.

Platform
Since BI has been historically viewed as an IT initiative, it is not surprising that IT drove virtually every aspect of platform evaluation, selection, purchasing, implementation, deployment, development, and administration. But with drastic changes required to modernize the people and process components of a BI and analytics program, IT must change the criteria for choosing the technology to meet these evolving requirements. Perhaps the most obvious change is that IT must intimately involve business users and analysts from across the organization in evaluating and ultimately deciding which modern platform best
fits the organization and addresses the broad needs of the users. For more information on selecting the right analytics platform, check out the Evaluation Guide.

A modern platform must address a wide range of needs and different personas, as well as the increased pace of business and the exponential growth in data volume and complexity. IT requires that the chosen platform enable governance and security, while end users require easy access to content and the ability to explore and discover in a safe environment.

The chosen platform must also be able to evolve with the landscape and integrate easily with other systems within an organization. A centralized EDW containing all data needed for analysis, which was the cornerstone of traditional BI, is simply not possible in the big-data era. Organizations need a platform that can adapt to an evolving data landscape and insulate users from increased complexity and change.

The most critical aspect is the ability to meet these diverse needs in an integrated and intuitive way. This integration is depicted on the following page as the modern analytic workflow. The diagram highlights the five key capabilities that must flow seamlessly in order for the three personas depicted in the center to effectively leverage the platform.

The BI and analytics platform landscape has passed a tipping point, as the market for modern products is experiencing healthy growth while the traditional segment of the market is declining with little to no net new investment. IT leaders should capitalize on this market shift and seize the opportunity to redefine their role in BI and analytics as a far more strategic one that is critical to the future success of the organization. Adopting a collaborative approach to recast the foundational aspects of the BI program and truly support self-service is the key to changing the perception of IT from a producer to a strategic partner and enabler for the organization.

Promise
In today’s era of digital transformation, IT leaders are increasingly expected to take on digital business initiatives in their organizations, including identifying cost savings and finding new revenue streams. Realizing the value of data for these transformational efforts, many businesses are modernizing and increasing their analytics investments to innovate and accelerate change. Everyone agrees that putting data at the center of conversations promises change. However, most organizations are failing to successfully implement an enterprise-wide analytics program.

IT is well positioned for a leadership role in these efforts, and is essential for the task of giving people the relevant data they need for decision-making. Modern analytics shifts IT’s role to a more strategic partner for the business, empowering users to navigate a trusted, self-service environment. But beyond access to the data, everyone needs the motivation and confidence to properly make decisions with it. You need to identify the relationships between job functions and data and change behaviors that run deep into the fabric of your organization’s culture.

This also means expanding your definition of self-service so that business users participate in some of the traditionally IT-led responsibilities associated with data and analytics—like administration, governance, and education. With the right processes, standards, and change management, business users can help manage data sources, analytics content, and users in the system, as well as contribute to training, evangelism, and the internal community. When users value and participate in these efforts, IT can manage strategic initiatives like meeting business SLAs and ensuring the security of company assets.

Although every organization’s journey to building a data-driven organization will differ, achieving your transformational goals requires a deliberate and holistic approach to developing your analytics practice. Success at scale relies on a systematic, agile approach to identify key sources of data, how data is selected, managed, distributed, consumed, and secured, and how users are educated and engaged. The better you understand your organization’s requirements, the better you will be able to proactively support the broad use of data.

Tableau Blueprint provides concrete plans, recommendations, and guidelines as a step-by-step guide to creating a data-driven organization with modern analytics. We worked with thousands of customers and analytics experts to capture best practices that help turn repeatable processes into core capabilities to build and reinforce a data-driven mindset throughout your organization. Learn more and get started today.

IT Leadership

In today’s dynamic world of work from anywhere, organizations are experiencing new pressure points. IT and security leaders find themselves grappling with extended enterprises of employees, contractors, and suppliers remotely located across the globe using an expanded set of technologies. The broad adoption of cloud apps, platforms, and infrastructure has led to a complete re-thinking of access, governance, and security.

While remote, extended enterprises accessing cloud-based technology bring potential risks, this shift also offers significant upside for businesses. CIOs have recognized how strategic their organizations can be in driving business growth and productivity and reducing complexity by pushing rapid technology adoption and creating seamless, secure, and simple authentication and authorization experiences for their broad workforces.

Collectively, these changes have emphasized the need for a more holistic identity-first approach to technology adoption, implementation, and security. Much of that starts with understanding who has access to what, when they received access, and who authorized that access. That technology domain has traditionally been known as Identity Governance and Administration (IGA), but as new ways of working collide with new security paradigms, those definitions are shifting and evolving to match modern enterprise IT environments.

This broad need for IGA capabilities is well-founded, as enterprises are recognizing the side effects of distributed, fragmented user bases and tech stacks: a sharp rise in orphaned accounts, which are a major security risk and a resource drain, and a lack of control and visibility into cloud application security posture, with no clear reporting of who has access or for how long.
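The orphaned-account side effect mentioned above is easy to make concrete: given an application's account list and the set of currently active users, any account whose owner is no longer active is a candidate for revocation or review. The following is a minimal sketch with hypothetical data shapes; it is not an Okta or any IGA product's API.

```python
from dataclasses import dataclass

@dataclass
class Account:
    app: str          # application the account belongs to
    owner: str        # user the access is assigned to
    granted_by: str   # who authorized the access
    granted_on: str   # when the access was granted (ISO date)

def find_orphaned(accounts: list[Account], active_users: set[str]) -> list[Account]:
    """Flag accounts whose owner is no longer an active user."""
    return [a for a in accounts if a.owner not in active_users]
```

Answering "who has access to what, when, and authorized by whom" reduces to keeping fields like these accurate; the hard part in practice is collecting them reliably across a fragmented stack.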

The weakness of traditional IGA systems

As companies start shifting to an identity-first approach to security, IGA is becoming a more sought-after capability for organizations requiring better visibility of identity administration and access entitlements across their IT infrastructure. This is a major departure from traditional, compliance-driven models, as IGA comes to be seen as an enabler rather than merely a risk-mitigation exercise.

Traditional IGA solutions primarily solve a legacy problem and were not built to manage identities in cloud-first IT environments. They are difficult to integrate with modern applications and challenging to implement, often taking 12-18 months to deploy, requiring professional services, and incurring considerable maintenance costs along the way. Too often the outcome is that traditional IGA solutions are bolted on and left alone, resulting in out-of-date software with potentially greater security holes than before. To make matters worse, legacy systems are generally designed with a small subset of users in mind, with user experiences that make broad adoption and education a significant challenge.

In a world where cloud technologies have democratized access and adoption, IGA solutions should make it possible for more users within an organization to compliantly engage with applications either as an end user or as an authorizer, ultimately driving the business forward.

The modern approach to identity governance

As enterprises continue to adopt more cloud technologies and work in a distributed environment across a broad set of users, IGA must evolve to enable rather than disrupt modern enterprises. IT leaders need a cloud-native, enterprise-grade solution that approaches identity governance not as a bolt-on capability but as one foundationally incorporated into a broader identity-first security posture. To keep pace with today’s speed of innovation and adoption, a modern solution must deploy in days and be easy to use and maintain. Lastly, a modern IGA solution must deliver a seamless, frictionless experience for the workforce and help boost the productivity and agility of the IT organization.

Okta’s cloud-first approach to identity governance

As the first born-in-the-cloud identity provider, Okta has taken its modern approach to identity and access management (IAM) and applied it to IGA with Okta Identity Governance, which is now generally available. Okta Identity Governance is part of Okta’s broader workforce identity vision, unifying IAM and IGA to improve enterprises’ security posture, helping them mitigate modern security risks, improve their IT efficiency, and meet today’s productivity and compliance challenges.

Deeply integrated with Okta’s existing IAM solutions, Okta Identity Governance provides an unparalleled, comprehensive view of every user’s access patterns. Enriched user context allows reviewers not only to simplify the access certification process but also to make informed decisions about user access, ensuring only the right people have access to the right resources. It meets users where they are by providing easy-to-use self-service access request capabilities, tightly integrated with collaboration tools and built on a converged IAM and governance solution that automates the provisioning of access to an enterprise’s applications and cloud resources.

With a network of 7,000+ pre-built integrations, Okta Identity Governance provides intelligent, easy-to-use identity governance capabilities and the ability to automate complex identity processes at scale.

Analyst firms and the federal government have agreed on the broad, foundational role identity plays in securing today’s organizations. Identity is the number one pillar of zero trust architecture, and that approach is built on the principle of least privilege with identity governance serving as a critical component. As organizations continue to adopt a zero trust framework, they are starting to realize the importance of moving away from a distributed identity architecture to a unified approach. Okta’s unified platform extends access and identity administration to include the key access governance tools that modern organizations need to mitigate modern security risks and improve IT resource efficiency. 

To learn more about Okta Identity Governance, visit the Okta blog.

About the Author

Paresh Bhaya is the Senior Director of Product Marketing for the Identity Management business at Okta. He has been in the security industry for 10+ years and has experience in all phases of product development and marketing. He is passionate about security, and you can always find him chatting about some deep security problem. Prior to Okta, he led Product Marketing efforts at Salesforce and worked at successful startups before that. He has an M.S. in Electrical Engineering from the University of Texas.

Security

The global supply chain isn’t just bending, it’s on the verge of breaking.

Pent-up demand from the lifting of lockdowns, geopolitical instability, war, problems in China, and inflation and energy costs are all factors creating an unprecedented level of supply chain instability – which poses an existential threat to companies the world over.

Such major supply chain disruptions, known as ‘black swan events’, were once relatively rare but are now the norm.

Companies have responded – understandably – by shifting to crisis mode, taking actions such as diversifying their supply chains as best they can. But this has created new problems.

Crisis response

For decades, companies around the world have been able to rely on a certain level of predictability in their supply chains. Globalization was the watchword, China played a pivotal role in manufacturing affordable goods and prices were stable. Those days are gone.

The new world requires a multi-faceted approach to supply chain planning across road, sea, and air. Regional supply chains as well as global ones have to be employed, backup providers need to be on call, and routes and suppliers have to be switched at the drop of a hat to keep things moving. The supply chain is now a sprawling network in which each department has to make constant quick-fire decisions, often unaware of what other parts of the network are doing.

The need for oversight

What’s missing is a view of the whole picture. Immutable data that can inform strategic decision making and allow users to see what works best, and what doesn’t.

This new level of oversight could take many forms. It could be overseen by a new department head, or simply someone with regular access to the right information. But data is vital if any new approach is to be successful.

To this end, companies will increasingly turn to solutions that create a dynamic supply chain, such as Teradata’s Transportation and Logistics Data Model (TLDM). It gathers and analyzes data in real time, allowing companies to monitor ambient data such as weather and road conditions, which provides insight into possible deviations from the expected delivery lead time. GPS data captured via sensors located on a carrier, along with driver break schedules and speed changes en route, provides insight into delays.

Predictive analytics helps to anticipate consumption patterns, while a combination of predictive demand modeling and real-time assessment provides clear visibility into the supply chain – enabling actions such as rerouting, reprioritizing production and shipping schedules, and adjusting inventory levels.

Having this level of insight provides a two-pronged advantage. Real-time data allows for quick decisions to be made to keep things flowing, while analytics allows for more strategic planning and for more problems to be avoided before they even happen.
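As a concrete illustration of the real-time half of that advantage, consider extrapolating a shipment's total transit time from its GPS-derived route progress and flagging it for rerouting when the planned lead time will slip. This is an illustrative sketch only, not Teradata TLDM code, and the tolerance threshold is an assumed parameter.

```python
from dataclasses import dataclass

@dataclass
class Shipment:
    planned_hours: float      # promised delivery lead time
    hours_elapsed: float      # time in transit so far
    fraction_complete: float  # route progress from GPS, 0.0-1.0

def projected_delay(s: Shipment) -> float:
    """Extrapolate total transit time from the current pace and return the
    projected overrun (in hours) versus the planned lead time."""
    if s.fraction_complete <= 0:
        return float("inf")  # no progress yet: treat as maximally at risk
    projected_total = s.hours_elapsed / s.fraction_complete
    return projected_total - s.planned_hours

def needs_reroute(s: Shipment, tolerance_hours: float = 2.0) -> bool:
    """Flag the shipment when the projected overrun exceeds the tolerance."""
    return projected_delay(s) > tolerance_hours
```

A rule this simple only works when the underlying progress data is trustworthy and current, which is precisely why the article stresses immutable, real-time data as the foundation.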

Supply chain of the future

It’s clear that with uncertainty and instability at its core, the modern supply chain will rely on immutable, actionable data that provides oversight of all aspects of supply chain operations. This should help bring an end to siloed working and enable a responsive, agile way of working fit for the future.

For more information on Teradata’s dynamic supply chain solutions click here.

Supply Chain