Stop me if you’ve heard this one before. Several economists, a bank president, and a couple of reporters walk into a bar. The economists lament, “A thick fog of uncertainty still surrounds us.” The bank president wails, “Economic hurricane.” The reporters keen about “gut-churning feelings of helplessness” and “a world of confusion.”

Sitting in a booth with his hard-working direct reports, the chief information officer sighs. “Same-old, same-old. Uncertainty is our jam.”

For as long as there has been an IT organization, CIOs have been charged with “keeping the lights on” (KTLO), delivering five-nines stability (99.999% uptime, or roughly 5.26 minutes of downtime per year), and expanding digital capabilities in a world characterized by massive economic, political, social, and technological uncertainty.
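The five-nines arithmetic is worth making explicit. A quick back-of-the-envelope calculation (a minimal sketch; the loop bounds are just illustrative):

```python
# Back-of-the-envelope downtime budgets for "N nines" of availability.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

for nines in range(2, 6):
    availability = 1 - 10 ** -nines          # e.g., 0.99999 for five nines
    downtime_min = MINUTES_PER_YEAR * (1 - availability)
    print(f"{availability:.5%} uptime -> {downtime_min:,.2f} min/year of downtime")

# Five nines (99.999%) allows roughly 5.26 minutes of downtime per year.
```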

In other words, IT leaders know there is nothing uncertain about uncertainty. After all, uncertainty is the one certainty. We should not run from uncertainty; we should embrace it. Paraphrasing poet Robert Frost, when it comes to uncertainty, there is “no way out but through.”

What really drives uncertainty

One of the smartest guys on this planet, Ed Gough, formerly chief scientist and CTO at NATO Undersea Research Center (NURC) and former technical director at the US Navy’s Naval Meteorology and Oceanography Command, explained to me that ignorance is at the root of uncertainty.

As John Kay and Mervyn King set forth in Radical Uncertainty: Decision-Making Beyond the Numbers, “Uncertainty is the result of our incomplete knowledge of the world, or about the connection between our present actions and their future outcomes.”

There will always be uncertainties external to the organization. But the uncertainties that do the most to destroy IT value are the self-inflicted ones.

The No. 1 source of uncertainty in the workplace is absence of strategy. Business scholars, think tanks, and some members of the media are discovering that many organizations have not explicitly stated where they want to go and how they plan to get there. To wit, two-thirds of enterprises do not have a data strategy.

And among the companies that do have a strategy, just 14% of their employees “have a good understanding of their company’s strategy and direction.”

It all boils down to what Warner Moore, founder and CISO at Columbus, Ohio-based Gamma Force, recently told me: “Uncertainty isn’t the problem; lack of leadership is the problem.”

Focus on what matters most

Business school is where a lot of today’s business leaders learn their trade. And if you examine how business schools approach uncertainty, you can begin to see where this leadership issue takes root.

In business schools around the world, MBAs are counseled to combat uncertainty by compiling a comprehensive list of possible outcomes and then attaching numerical probabilities to each scenario. This approach is untenable: possible outcomes are infinite, and assigning probabilities — subject to assumptions and biases — creates a false sense of precision.
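To see that false precision at work, consider a small illustrative sketch (the scenarios, probabilities, and payoffs are hypothetical): nudging each subjective probability by a few points and renormalizing moves the “precise” expected value materially.

```python
import random

# Hypothetical scenarios: name -> (subjective probability, payoff in $M).
scenarios = {"expand": (0.50, 12.0), "hold": (0.35, 4.0), "retrench": (0.15, -8.0)}

def expected_value(probs):
    # Pair each candidate probability with the scenario's payoff.
    return sum(p * payoff for p, (_, payoff) in zip(probs, scenarios.values()))

base = [p for p, _ in scenarios.values()]
print(f"baseline EV: {expected_value(base):.2f}")

# Perturb each probability by up to +/-5 points and renormalize:
# the seemingly precise expected value shifts with every small assumption change.
random.seed(42)
for _ in range(3):
    noisy = [max(p + random.uniform(-0.05, 0.05), 0.01) for p in base]
    noisy = [p / sum(noisy) for p in noisy]
    print(f"perturbed EV: {expected_value(noisy):.2f}")
```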

One of the guiding principles for those who would master uncertainty is to recognize that there has always been something irresistible about advice in mathematical form. Over-reliance on metrics gave rise to the term “McNamara fallacy,” a reference to the tragic missteps of the Vietnam War, when decisions were driven by quantifications that measured the wrong things.

Instead of flailing around trying to enumerate everything that could happen, executives need to place intense scrutiny on a subset of critical uncertainties. In other words, neglect the right uncertainties.

I spoke with a subset of the 75 senior IT executives attending the Digital Solutions Gallery hosted by The Ohio State University to get their thoughts about which zones of uncertainty they were focusing on. The general consensus was that one of the best places to start managing hyper-uncertainty was talent.

Atlanta Fed President Raphael Bostic, speaking earlier this year at a “mini-conference,” the Survey of Business Uncertainty: Panel Member Economic Briefing and Policy Discussion, told attendees, “Finding workers is a big problem.”

Finding workers might be an uncertain undertaking, but retaining key performers is not. Leaders have it in their power to know what their high performers are thinking. For these key employees, it is possible to paint reasonably clear pictures of what happens next.

Mike McSally, a human capital advisor with 20-plus years of experience in executive recruiting, does not believe recruiting has to be a problem. Reducing talent uncertainty is a simple matter of managing personal networks. McSally suggests having your top ten performers take out a yellow sheet of paper and write down “the top twenty people you have ever worked with.” Give them a call.

When you find a qualified candidate, give them an authentic “what-a-day-at-work-really-looks-like” depiction of the role being filled. When that depiction aligns with your strategic vision and your company’s mission, you’ll have a leg up on converting that candidate into a new team member.

That kind of leadership approach will help you handle talent uncertainties, better positioning your organization for the future. I am certain of that.


Introduction
Since its inception decades ago, the primary objective of business intelligence has been the creation of a top-down single source of truth from which organizations would centrally track KPIs and performance metrics with static reports and dashboards. This stemmed from the proliferation of data in spreadsheets and reporting silos throughout organizations, often yielding different and conflicting results. With this new mandate, BI-focused teams were formed, often in IT departments, and they began to approach the problem in the same manner as traditional IT projects, where the business makes a request of IT, IT logs a ticket, then fulfills the request following a waterfall methodology.

While this supplier/consumer approach to BI appeared to be well-suited for the task of centralizing an organization’s data and promoting consistency, it sacrificed business agility.

There was a significant lag between the time a question was asked and the time it was answered. This delay and lack of agility within the analysis process led to lackluster adoption and low overall business impact.

The emergence of self-service BI in recent years has challenged the status quo, especially for IT professionals who have spent the better part of the past two decades building out a BI infrastructure designed for developing top-down, centralized reporting and dashboards. Initially, this self-service trend was viewed as a nuisance by most IT departments and was virtually ignored. The focus remained on producing a centrally-managed single source of truth for the organization.

Fast-forward to today, and IT finds itself at a crossroads, with self-service BI as the new normal that can no longer be ignored. The traditional approach to BI is becoming less and less relevant as the business demands the agility that comes with self-service to drive adoption and improve organizational outcomes. This, paired with the continued exponential growth in data volume and complexity, presents IT with an important choice.

As organizations begin the transition from a traditional top-down approach driven by IT to a self-service approach enabled by IT and led by the business, a new framework and overall strategy is required. This means that past decisions supporting the core foundational components of a BI program—people, process, and platform—must be revisited. Adjustments are needed in these three core areas to support the shift from a model of top-down BI development and delivery to a self-service-based modern BI model which is driven, and primarily executed on, by the business.

People
A successful transition to self-service business analytics begins with people and should be the top priority for IT when considering changes required for BI modernization. In a traditional BI model, people were often considered last after platform and process. The widely-used mantra “if you build it, they will come” exemplifies the belief that business users would gravitate toward a well-built system of record for BI that would answer all of their business questions.

This desired end-state rarely came to fruition since there was little to no collaboration between the business users and IT during the process of building the solution after an upfront requirements-gathering phase. In the absence of active engagement and feedback from the business during the time between requirements gathering and project completion, there are many opportunities for failure that typically emerge. A few of the most common include:

• Business or organizational changes occur during the development process that render the initial requirements obsolete or invalid.
• Incomplete or inaccurate requirements are given in the initial process phases.
• Errors are made in the process of translating business requirements into technical requirements.

The end result of these situations is often that business users disengage from the BI program completely, and an organization’s investment of time and resources is wasted due to lack of adoption. Business users and analysts need to use analytics in order for it to have any impact and deliver organizational value. A BI model that embraces self-service puts these users first and allows them to explore, discover, and build content that they will ultimately use to make better business decisions and transform business processes.

Collaboration between the business and IT is critical to the success of the implementation: IT knows how to manage data, and the business knows how to interpret and use data within the business processes they support. The business has the context within which analytics, and the insight derived from it, will be used to make better decisions and ultimately improve outcomes. This early collaboration will not only lead to the deployment of a platform that meets the needs of the business but also drive adoption and overall impact of the platform.

Process
Self-service analytics does not mean end users are allowed unfettered access to any and all data and analytic content. It means they have the freedom to explore pertinent business data that is trusted, secure, and governed. This is where process comes into play and represents the component that requires the most significant shift in traditional thinking for IT. A successful modern BI program is able to deliver both IT control and end-user autonomy and agility. A well-established and well-communicated process is required for an organization to strike this delicate balance.

A top-down, waterfall-based process only addresses the IT control part of the equation. A traditional BI deployment focuses primarily on locking down data and content with governance. This means limiting access and freedom to only a few people with specialized technical skills who are expected to meet the needs and answer the questions of the many. This typically involves developer-centric processes to design and build the enterprise data warehouse (EDW) model, build the ETL jobs to transform and load data into the model, construct the semantic layer to mask the complexity of the underlying data structures, and finally build the business-facing reports and dashboards as originally requested by the business.

The unfortunate reality is that this approach often fails to deliver on the vision and promise of BI—to deliver significant and tangible value to the organization through improved decision making with minimal time, effort, and cost. Top-down, IT-led BI models often have an inverse profile of time, effort, and cost relative to the value they deliver to the organization.


A modern analytics solution requires new processes and newly defined organizational roles and responsibilities to truly enable a collaborative, self-service-based development process. IT and users must collaborate to jointly develop the rules of the road that both must abide by in order to maximize the business value of analytics without compromising the governance or security of the data.

IT’s success is highlighted, and its value to the organization realized, when the business derives significant value and benefit from investments in analytics and BI. Should IT still be considered successful if not a single end user utilizes the BI system to influence a single business decision? Traditional processes intended to serve top-down BI deployments are too often measured by metrics that are not tied to outcomes or organizational impact. If the ETL jobs that IT created ran without failure, the EDW was loaded without error, and all downstream reports refreshed, many IT organizations would consider themselves successful.

Merely supplying data and content to users without any regard for whether or not it is adopted and provides value through improved outcomes is simply not enough. Modern BI requires updated processes to support self-service analytics across the organization. It also requires the definition of new success metrics for which IT and the business are jointly accountable and are therefore equally invested.
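To make that concrete, here is a minimal sketch (the event schema and numbers are invented) of what outcome-oriented metrics that IT and the business own jointly could look like, in contrast to ETL-job-status metrics:

```python
# Hypothetical success metrics that tie BI to outcomes rather than uptime.
# Usage events are invented for illustration.
usage_events = [
    {"user": "ana", "viewed": True, "decision_logged": True},
    {"user": "raj", "viewed": True, "decision_logged": False},
    {"user": "mei", "viewed": False, "decision_logged": False},
]
licensed_users = 50  # hypothetical license count

active = {e["user"] for e in usage_events if e["viewed"]}
decisions = {e["user"] for e in usage_events if e["decision_logged"]}

# Adoption: share of licensed users actually engaging with analytics.
print(f"adoption rate: {len(active) / licensed_users:.0%}")
# Impact: share of active users who attribute a decision to the content.
print(f"decision-influence rate: {len(decisions) / len(active):.0%}")
```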

Where processes and technology intertwine, there is tremendous opportunity. Technical innovations, especially with AI, will continue to make it easier to automate processes and augment users of all skill levels throughout the analytics workflow. And while process can accelerate, rather than inhibit, successful analytics outcomes, it’s important to recognize that this relies on a system and interface that people are eager to use. If processes aren’t supported by the right platform choice, they will stifle adoption.

Platform
Since BI has been historically viewed as an IT initiative, it is not surprising that IT drove virtually every aspect of platform evaluation, selection, purchasing, implementation, deployment, development, and administration. But with drastic changes required to modernize the people and process components of a BI and analytics program, IT must change the criteria for choosing the technology to meet these evolving requirements. Perhaps the most obvious change is that IT must intimately involve business users and analysts from across the organization in evaluating and ultimately deciding which modern platform best fits the organization and addresses the broad needs of the users. For more information on selecting the right analytics platform, check out the Evaluation Guide.

A modern platform must address a wide range of needs and different personas, as well as the increased pace of business and the exponential growth in data volume and complexity. IT requires that the chosen platform enable governance and security, while end users require easy access to content and the ability to explore and discover in a safe environment.

The chosen platform must also be able to evolve with the landscape and integrate easily with other systems within an organization. A centralized EDW containing all data needed for analysis, which was the cornerstone of traditional BI, is simply not possible in the big-data era. Organizations need a platform that can adapt to an evolving data landscape and insulate users from increased complexity and change.

The most critical aspect is the ability to meet these diverse needs in an integrated and intuitive way. This integration is the modern analytic workflow: five key capabilities that must flow seamlessly in order for the three core personas to effectively leverage the platform.


The BI and analytics platform landscape has passed a tipping point, as the market for modern products is experiencing healthy growth while the traditional segment of the market is declining with little to no net new investment. IT leaders should capitalize on this market shift and seize the opportunity to redefine their role in BI and analytics as a far more strategic one that is critical to the future success of the organization. Adopting a collaborative approach to recast the foundational aspects of the BI program and truly support self-service is the key to changing the perception of IT from a producer to a strategic partner and enabler for the organization.

Promise
In today’s era of digital transformation, IT leaders are increasingly expected to take on digital business initiatives in their organizations, including identifying cost savings and finding new revenue streams. Realizing the value of data for these transformational efforts, many businesses are modernizing and increasing their analytics investments to innovate and accelerate change. Everyone agrees that putting data at the center of conversations promises change. However, most organizations are failing to successfully implement an enterprise-wide analytics program.

IT is well positioned for a leadership role in these efforts, and is essential for the task of giving people the relevant data they need for decision-making. Modern analytics shifts IT’s role to a more strategic partner for the business, empowering users to navigate a trusted, self-service environment. But beyond access to the data, everyone needs the motivation and confidence to properly make decisions with it. You need to identify the relationships between job functions and data and change behaviors that run deep into the fabric of your organization’s culture.

This also means expanding your definition of self-service so that business users participate in some of the traditionally IT-led responsibilities associated with data and analytics—like administration, governance, and education. With the right processes, standards, and change management, business users can help manage data sources, analytics content, and users in the system, as well as contribute to training, evangelism, and the internal community. When users value and participate in these efforts, IT can manage strategic initiatives like business SLAs and ensuring the security of company assets.

Although every organization’s journey to building a data-driven organization will differ, achieving your transformational goals requires a deliberate and holistic approach to developing your analytics practice. Success at scale relies on a systematic, agile approach to identifying key sources of data; how data is selected, managed, distributed, consumed, and secured; and how users are educated and engaged. The better you understand your organization’s requirements, the better you will be able to proactively support the broad use of data.

Tableau Blueprint provides concrete plans, recommendations, and guidelines as a step-by-step guide to creating a data-driven organization with modern analytics. We worked with thousands of customers and analytics experts to capture best practices that help turn repeatable processes into core capabilities to build and reinforce a data-driven mindset throughout your organization. Learn more and get started today.


Real World Evidence (RWE) refers to the analysis of Real World Data (RWD), which is used by life sciences companies and healthcare organizations to securely obtain, store, analyze, and gain insights about how a drug or medical intervention performs. RWE helps medical professionals and other stakeholders demonstrate a particular drug’s effectiveness in treating medical conditions in a real-world setting.

Today, life sciences companies have a huge opportunity to unlock the potential of RWD and improve patient outcomes.

Why the buzz around RWE?

For some time now, the use of mobile devices, computers, and other connected equipment to collect and store massive amounts of health data has been accelerating. This data has the potential to allow life sciences companies to conduct clinical trials more effectively.

Moreover, the introduction of more sophisticated and modern analytical capabilities has paved the way for advanced analysis of the data acquired and its application in medical product development and approval.

The ever-changing healthcare sector

Even today, healthcare payors and governments continue to face enormous data management and storage capacity challenges. Drug prices are increasing, owing to development costs and an increase in demand for personalized treatments, which is placing unimaginable pressure on life sciences companies.

These companies are focusing on the development of integrated solutions and therefore are moving “beyond the pill” and becoming solution providers. With a renewed focus on offering value to various stakeholders, these companies are creating new commercial models. RWE helps them respond to these trends successfully.

Harnessing the power of cloud

With the AWS cloud, it is now easier for pharma companies to assemble huge datasets from a vast pool of sources. Companies are equipped with the organizational intelligence to understand stakeholder needs and to mitigate the challenges of large-scale data storage, analytics, and sharing.

The best way to maximize the utility of RWE is to successfully integrate disparate data types. Life sciences companies must store, search, analyze, and normalize data of different types and sizes coming from different sources, including medical devices, wearables, genomics reports, claims data, clinical trials, and electronic medical records.

One common solution for these disparate data types is a data lake, which allows organizations to store data in a centralized repository — both in its original form and in a searchable format. One benefit of the AWS data lake is that data can be stored as-is; there is no need to transform it into a predefined schema. Unlike with a data warehouse, companies do not need to structure the data first. They can use the data for tagging, searching, and analysis without worrying about its original format.
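To make the “store as-is, structure later” idea concrete, here is a minimal sketch using the AWS SDK for Python (boto3); the bucket name, keys, and records are hypothetical, and this illustrates the schema-on-read pattern rather than a production design:

```python
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "example-rwe-data-lake"  # hypothetical bucket name

# Two records with different shapes: no predefined schema is required,
# so each document is stored exactly as it arrives (schema-on-read).
records = [
    ("raw/emr/patient-001.json", {"patient_id": "001", "dx": ["E11.9"], "visits": 4}),
    ("raw/wearables/patient-001.json", {"patient_id": "001", "steps": [8123, 9456]}),
]

for key, doc in records:
    s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(doc))

# Structure is applied later, at analysis time -- for example, by cataloging
# s3://example-rwe-data-lake/raw/ and querying it with SQL.
```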

When it comes to pharma companies, the artificial intelligence and machine learning capabilities of AWS help process data for real-world evidence, such as:

• Quick access to all types of data, like genomics, clinical trials, and claims
• The integration of new RWE data with existing data in the data lake

Advanced RWE analytics use predictive and probabilistic causal models, unsupervised algorithms, and machine learning to extract deeper insights from these data sets, helping build large data sets with significant information on thousands of patient variables.
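As a toy illustration of the unsupervised piece (the features and data below are synthetic and carry no clinical meaning), a clustering algorithm can surface latent patient cohorts without any labels:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic patient variables: [age, baseline biomarker, adherence %].
patients = np.vstack([
    rng.normal([45, 1.2, 90], [5, 0.2, 5], size=(50, 3)),   # hypothetical cohort A
    rng.normal([70, 2.8, 60], [6, 0.3, 10], size=(50, 3)),  # hypothetical cohort B
])

# Standardize, then look for two latent cohorts without any labels.
X = StandardScaler().fit_transform(patients)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("patients per cluster:", np.bincount(labels))
```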

With on-premises data repositories being replaced by cloud-based data lakes, pharma companies now have access to a scalable platform that provides cutting-edge analytics. These companies will be at the forefront of technological innovation as RWE takes center stage in the world of pharma in the years to come.

Author Bio


Ph: +91 9818804103
E-mail: a.goel1@tcs.com

Dr. Ashish Goel is a molecular oncologist from Johns Hopkins University, with more than 23 years of experience across different facets of the pharmaceutical industry, ranging from new drug discovery to decision support analytics. He has held leadership positions in pharma and pharma-centric organizations, assisting in key decisions ranging from designing NCEs like Lipaglyn (Zydus) to revenue forecasting and, lately, real-world evidence and HEOR (IQVIA and TCS). He currently leads the Value Demonstration team at TCS, focusing on business transformation via evidence generation management and automation. He has published extensively in peer-reviewed international journals, owns patents, and has been an invited speaker at conferences and thought leadership forums and interviews.

To learn more, visit us here.


By: Stephanie Crawford, Solutions Marketing Manager at Aruba, a Hewlett Packard Enterprise company.

Recent discussion about hybrid work has centered around enabling employees to work remotely, but remote work is only half of the hybrid equation. Equally important is ensuring that the corporate office environment provides the workspace and the technology to support this new way of working. Naturally, the network infrastructure plays a big role in this.

As a key business tool and an enabler of a successful hybrid workplace, a modern, flexible network should provide: 

• Robust campus Wi-Fi that supports the traffic demands of bandwidth-hungry applications at scale
• A platform for the increasing number of IoT devices used to support a fluctuating population of on-campus workers and a smart office setup
• A consistent experience for workers, regardless of where they are when they log in
• A unified infrastructure that simplifies network management and security for wireless, wired, and WAN infrastructure across campus, branch, remote, and data center locations—with acquisition, deployment, and management options to suit business needs

The evolution of the office 

The way people work has changed, and the office has to adapt to meet their new needs. This means that although workers are “returning to the office,” it may no longer be the familiar place they remember. For example, the corporate campus may only be a part-time location for many full-time employees.

While knowledge workers have become adept at using home offices for independent work, the campus provides a better environment for face-to-face meetings and collaborative efforts. This subtle shift changes the requirements for both the office space and the corporate network.

Creating a collaborative campus with Wi-Fi 6

To support the campus as a collaborative space, companies are reworking floorplans to create more shared spaces in a variety of configurations. There are new video-enabled conference rooms, and cubicles are being cleared out in favor of wide-open spaces—both indoors and out—that are more conducive to conversation and impromptu collaboration. These common areas require robust, ubiquitous Wi-Fi to support an increased appetite for bandwidth-hungry, low-latency business applications such as video conferencing and Wi-Fi calling.

After all, these apps aren’t going away just because people are back in the office. A distributed work force, differing in-office schedules, and a reduced need for travel all mean meetings are going hybrid too, with a mix of in-person and remote participants. Hybrid employees are bringing their bandwidth-intense apps to the office with them, and if the infrastructure can’t handle all that network traffic, it can negatively impact the business. 

Fortunately, Aruba’s portfolio of AI-powered Wi-Fi 6 and Wi-Fi 6E access points offers a perfect fit for these new demands, providing prioritized, high-bandwidth access in high density areas. These APs are equipped with Air Slice, application assurance that goes beyond device-based airtime fairness to identify and prioritize mission-critical applications, ensuring an excellent user experience. This is just one example of how the modern network, an essential enabler of hybrid work, delivers business—not just technology—outcomes.

APs as IoT platforms

With a more transient workforce comes even more opportunity to create smart spaces with IoT devices. Already in use within smart buildings, environmental sensors to control things like lighting, heating, and cooling can deliver even more savings—without compromising safety or comfort—in a hybrid setting where occupancy can vary significantly from one day to another. Today’s offices may also take advantage of IoT devices to enable air quality monitoring, wayfinding, access, and door-locking systems, or even hot-desking or hoteling solutions used to book in-office workspaces. 

Since Aruba’s Wi-Fi 6 and Wi-Fi 6E APs can be used as IoT platforms, those IoT devices can become part of the corporate network—no need for a separate IoT overlay network. While IoT devices have earned a reputation for lackluster security and for joining networks without the knowledge of or permission from IT, Aruba ESP’s identity-based access control supports Zero Trust and SASE frameworks to ensure only authorized devices join the network. Meanwhile, AI-powered Client Insights, part of Aruba Central’s cloud-native network management, ensures those devices are all identified and governed by established policy enforcement rules, so traffic is segmented and each device can only access the IT resources it’s entitled to access.

A consistent experience, wherever you’re working

There are clearly differences between working on campus and working remotely, but the network experience shouldn’t be among them. Although employees may work in multiple locations, they expect a consistent experience whether they’re logging on from their remote office or a corporate campus—and Aruba EdgeConnect Microbranch delivers it. Role- and device-based security provide a consistent level of access, while policy-based routing intelligently routes cloud traffic, eliminating the need to direct all traffic through the data center for inspection. This improves performance while ensuring policies are applied universally, regardless of location.
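To illustrate the routing idea in vendor-neutral terms (a hypothetical sketch, not Aruba's actual implementation), a policy-based routing decision might look like this: traffic bound for trusted cloud applications breaks out locally, while everything else is tunneled to the data center for inspection. The app list and rules below are invented.

```python
# Hypothetical policy-based routing decision, illustrating local breakout
# for trusted cloud apps vs. tunneling everything else for inspection.
TRUSTED_SAAS = {"collab.example.com", "crm.example.com"}  # hypothetical apps

def next_hop(destination_host: str, user_role: str) -> str:
    # Guests never get local breakout, regardless of destination.
    if user_role == "guest":
        return "tunnel-to-datacenter"
    # Trusted SaaS traffic is routed directly to the internet,
    # avoiding the hairpin through the data center.
    if destination_host in TRUSTED_SAAS:
        return "local-internet-breakout"
    # Everything else goes through centralized security inspection.
    return "tunnel-to-datacenter"

print(next_hop("collab.example.com", "employee"))   # local-internet-breakout
print(next_hop("unknown.example.net", "employee"))  # tunnel-to-datacenter
```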

Similarly, Aruba Central provides network administrators visibility and management for the whole network—wired, wireless, WAN, branch, and remote offices—all from an AI-powered single pane of glass with deep visibility into total network and client health.

Network as a service for greater flexibility

Just as Aruba ESP provides a flexible, modern network, HPE GreenLake for Aruba provides a wide variety of flexible network as a service (NaaS) options to suit unique business needs. OpEx options allow customers to upgrade to the latest hardware when it’s available, rather than waiting for capital depreciation on existing hardware. Organizations can also opt for the flexibility to scale the network up or down according to business needs, and they can choose to manage the network in-house, outsource the management, or a combination of the two—all either ongoing or on a project basis. This allows businesses to get the network they need without the constraints of traditional acquisition, deployment, and management approaches.

Going back to something better

The way the workforce works continues to evolve, and workplaces need to keep up. In order for a hybrid workforce to take full advantage of the benefits the on-campus experience can offer for better business outcomes, the network they’re going back to will likely need to be better than the one they left. Additional information can be found in these resources:

• Home Office vs Workplace: How Does Your Network Measure Up? (webinar)
• The Hybrid Workplace Solution Overview (PDF)
• Unify IT, IoT, and Operational Technology (OT) networks (web page)
• Extend the WAN to remote workers with EdgeConnect Microbranch (tech brief)



By Keith O’Brien, Chief Technology Officer for the Service Provider line of business at Palo Alto Networks

With each successive generation, advances in mobile technology have trained us to expect ever-faster mobile speeds and the ability of the signal to transport ever-greater loads of data.

Increased data transfer rates enabled 3G to handle larger capacities, and that generation was the first to have serious broadband capabilities. As 4G LTE rolled out, mobile signals could now support interactive multimedia, voice, and video with greater speed and efficiency.

Therefore, it’s not surprising that 5G is characterized mostly by its ability to download very large data files in the blink of an eye and to carry broader sets of data, such as HD video streaming, virtual reality applications, and massive IoT implementations.

But we’re missing a big part of the picture by looking at 5G as just a faster, stronger horse in the race. It’s not just what it does but how it does it. The software-defined architecture of 5G, including 5G security, brings forward use cases that were not previously imaginable.

5G security has a tremendous enabling role to play in this new mobile generation. Cybersecurity doesn’t have to be an afterthought, a risk to be mitigated, or a barrier to be overcome. Done right, building cybersecurity into the 5G architecture can be simple enough to reduce operational complexity, intelligent enough to provide contextual security outcomes, and predictive enough to swiftly respond to known and unknown threats in real time.

In a sense, the closer analogy for the advent of 5G is the jump that was made from 1G to 2G. Just as the move toward a digital, cellular network — Global System for Mobile Communications (GSM) — made possible an expanded scope of voice and data capabilities, so too the move to software-defined network functionality in 5G is revolutionizing what mobile networks can do for users, enterprises, and the mobile network operators (MNO) that serve them.

Here are just a few of the new mobile capabilities:

• Network slices, organized by vertical industry functions, make it possible to create custom private 5G networks with specific service characteristics.
• Multi-access Edge Computing (MEC), with its distributed support for low latency and capacity for rapid delivery of massive amounts of data, enables mission-critical enterprise applications and creates richer experiences across remote devices.
• Dense, small cell architecture, using femtocells, picocells, and microcells, plays a critical role in densifying the network so that devices and applications that require a low-latency, ultra-reliable connection can be supported.

Simply put, there is a lot more to mobile technology than a phone call these days.

And with that expansion of capabilities comes both greater opportunities for products and services delivered by 5G, as well as greater risks from exploitation of the architecture and technology. Here are some signs that we’re not paying close enough attention to how 5G is changing the game.

Lateral movements and network complexity: hacking has never had it so good.

The distributed 5G network no longer has a clear perimeter. The assets and workloads of service providers, enterprises, and CSPs are intertwined. Only through visibility and control across the whole system can service providers and enterprises detect breaches and lateral movement, and stop the kill chains.

The US government has sounded the alarm on lateral movements, particularly in cloud-based platforms: “Communications paths that rely on insecure authentication mechanisms, or that are not locked down sufficiently by policy, could be used as a lateral movement path by an attacker, allowing the attacker to change between planes or pivot to gain privileges on another set of isolated network resources.”[1]

The CISA/NSA joint guidance statement then gives several recommendations for mitigating damage caused by such breaches, among them, “Sophisticated analytics, based on machine learning and artificial intelligence, can help detect adversarial activity within the cloud and provide the 5G stakeholder with the means to detect malicious use of customer cloud resources.”

Protecting this complex, distributed network environment needs a platform approach with ML-powered threat detection that secures the key 5G interfaces under a single umbrella — no matter if they are distributed across private and public telco clouds and data centers.

Speed is a two-edged sword: taming the beast with automation and AI.

By now it is clear that the expanded surface area of a 5G network — including MEC, network slices, and instantaneous service orchestration — creates territory where speed is both expected and routine. However, highly distributed cloud and edge environments, as well as a proliferation of IoT and user-owned devices, also add up to a 5G security environment where threats evolve faster, move more quickly, get more opportunities to attack, and can potentially cause more damage. Additionally, the complexity of modern mobile network architectures increases the risk of misconfiguration.

These complex, distributed environments are making it very difficult for MNOs to manage networks or to implement changes without the use of automation and artificial intelligence that can tame the speed.

Automation is the key to building 5G networks that have security built in from the moment they are deployed or the moment new services are configured.

Additionally, there is the concept of proportional response in managing threats. Attacks evolve rapidly and automatically: attackers use machines to morph attacks and artificial intelligence (AI) to automate and obfuscate them, so similar techniques are needed to defend.

Automation and AI should be at the core of 5G security to analyze vast amounts of telemetry, proactively assist in intelligently stopping attacks and threats and recommend security policies.
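As a concrete, if simplified, illustration of ML-driven telemetry analysis (the feature set and data below are synthetic and purely illustrative), an unsupervised anomaly detector can flag sessions that deviate from a learned baseline:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)

# Synthetic per-session telemetry: [bytes sent, packets/sec, distinct ports].
normal = rng.normal([5e5, 120, 3], [1e5, 20, 1], size=(2000, 3))
attack = np.array([[9e6, 15, 250]])  # hypothetical exfiltration-like outlier

# Learn what "normal" telemetry looks like, then score new sessions.
model = IsolationForest(contamination=0.01, random_state=0).fit(normal)
print(model.predict(attack))      # [-1] -> flagged as anomalous
print(model.predict(normal[:3]))  # mostly [1] -> considered normal
```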

5G innovation and architecture are driving new use cases into unexpected quarters.

For example, the ultra-dense, low latency world of small cell architecture (femtocells, picocells, and microcells) is growing exponentially. Recent industry reports project small cell deployment to grow by 800% over the next decade[2]. Some of the largest growth in the US will come in suburban areas.

The advent of mobile slice services, with their customizable network features and service level agreements (SLAs), is already making waves within specific vertical industry use cases, such as automation of field services in upstream energy enterprises, exciting new potential for real-time clinical analysis in healthcare provider settings, and ultra-reliable, very low latency data services for automated vehicles. The potential has been barely considered thus far.

Meanwhile, U.S. fixed wireless access (FWA) where mobile is used as the primary form of connectivity is expected to grow at 16.1% CAGR through 2026[3]. This growth will be driven by the expansion of 5G networks, particularly in rural areas, coupled with increasing demand for broadband internet in the post-Covid-19 era.

The challenge, of course, will be to meet the demand for 5G use cases in areas well outside of the densely populated core city areas.

CSPs aren’t picking up on enterprise appetite for 5G security-as-a-service.

A 2020 study[4] by BearingPoint conducted with analyst firm Omdia found that only one in five (21%) of early enterprise 5G deals is being led by communication service providers (CSPs), while secondary CSP suppliers such as hardware suppliers and systems integrators are leading 40% and large enterprises themselves are leading about a third.

Another BearingPoint survey of 250 executives[5] from large enterprises and small/medium businesses (SMBs) found a predominant appetite for all-encompassing platform solutions. Some 62% of enterprises and 68% of SMBs want to buy a 5G solution that is “ready to go.” This, of course, means solutions that act on and deliver outcomes that are designed from the beginning to be secure and to keep privileged data private.

Skills are not keeping up with needs for 5G security.

Perhaps the biggest flashing signal that 5G security needs more attention comes from a GSMA study last year that reported[6] 48% of surveyed operators said that not having enough knowledge or tools to discover and solve security vulnerabilities was their greatest challenge. This is further exacerbated for 39% of surveyed operators who indicated that they were faced with a limited pool of security experts. Operators want to focus on security, but to do so they will need to partner to build up their offerings.

There are tremendous opportunities and appetite for the capabilities that 5G mobile technology can deliver. We can open new markets for private networks, industry-centric slice solutions, small cell deployment, and MEC applications. But it’s equally clear that all of us in the mobile industry, whether MNOs, network equipment providers, integrators, OEMs, cybersecurity software vendors or cloud players, must work together to develop the simpler platform solutions that the markets are seeking if we are to make good on the promise of 5G.

Learn more about 5G here.

About Keith O’Brien:

Keith serves as the Chief Technology Officer for the Service Provider line of business at Palo Alto Networks and has over 25 years of leadership experience in the cybersecurity industry. His specialties include secure network design and architecture, application security, DevSecOps, and service provider security.

[1] Security Guidance for 5G Cloud Infrastructures, October 2021. US National Security Agency (NSA) and Cybersecurity and Infrastructure Security Agency (CISA).

[2] The Suburban Migration: New Mapping Analysis Reveals Surprising U.S. Small Cell Growth, Altman Solon. July 2022.

[3] US FWA residential subscribership will grow at 16.1% CAGR through 2026, Global Data Technology. July 2021.

[4] DTWS: CSPs losing big on 5G enterprise deals, BearingPoint and Omdia. October 2020.

[5] DTWS: CSPs losing big on 5G enterprise deals, BearingPoint. October 2020.

[6] Securing private networks in the 5G era, GSMA. June 2021.

By Serge Lucio, Vice President and General Manager, Agile Operations Division, Broadcom Software

Over the course of 2020 and 2021, the fabric of our work lives was radically altered, ushering in a world in which hybrid models were the norm. In 2022, these changes will be solidified and entrenched. Because of this, it is now more critical than ever to address continued demands for agility, while establishing methodologies, tools, and technologies that unite hybrid teams and facilitate new, and better, ways of working.

Meeting these objectives and adapting to new realities simply won’t happen by working in the same way. That’s why the adoption of approaches like value stream management (VSM) has continued to accelerate. In Broadcom Software’s recent survey of business and IT executives, conducted to better understand how they are adapting to the new realities by utilizing value stream management, respondents reported adopting VSM to increase efficiency (70%), improve product quality (67%), and increase customer value (63%).

Value stream management: A brief introduction

VSM is a framework that enables new ways of working by helping organizations focus on optimizing the delivery of value to customers.

In the early days, VSM was used across DevOps teams to boost product delivery efficiency by identifying and eliminating waste. Today, top organizations are expanding from their laser focus on efficiency to also include maximizing effectiveness across the enterprise by establishing end-to-end project visibility and funding the initiatives that matter most.

Through the effective adoption of VSM, teams can realize a number of key advantages:

Get to market faster. VSM helps optimize product lifecycles and enables teams to get releases out the door on an agile cadence and make enhancements iteratively so they remain better aligned with evolving customer priorities. Further, teams can deliver value faster than the competition and get revenue sooner.

Build high-quality products that customers value. VSM promotes a user-centric approach, enabling teams to deliver value in short cycles. VSM also promotes the effective integration of customer feedback into the value stream, so teams can effectively align strategy and development work around building what customers want most.

Reduce risk and waste. Teams are contending with significant product development challenges that are stifling their ability to meet key objectives. Teams lack visibility throughout the product lifecycle and struggle with inefficient processes. Through VSM, they can gain a clearer picture of resources, work, and customer requirements, enabling them to make more informed trade-off decisions. By aligning with customer feedback and iterating more quickly, teams can respond more effectively and reduce the risk of expensive market misses.

Collaborate more effectively. Silos continue to represent a persistent challenge for organizations in 2022, and with the expansion of work from anywhere, these challenges aren’t going away. Through VSM, teams can be more fully empowered to build better products and increase productivity. VSM supports alignment between teams through transparency, reducing inefficiencies and improving ROI and business outcomes.

How to get started with VSM

The first step is the creation of a value stream map. Your map will provide details on how work flows through your organization, from idea to customer value. Every step throughout the process needs to be documented to achieve transparency throughout the organization.
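As a minimal illustration (the steps and durations below are hypothetical), a first-pass value stream map can be just a list of steps with active-work and wait times, from which lead time and the classic lean flow-efficiency metric (value-adding time divided by total lead time) fall out directly:

```python
# Hypothetical value stream: (step, active work hours, waiting hours).
steps = [
    ("idea intake", 2, 40),
    ("design", 16, 24),
    ("development", 40, 16),
    ("testing", 8, 48),
    ("release", 2, 6),
]

work = sum(active for _, active, _ in steps)
wait = sum(waiting for _, _, waiting in steps)
lead_time = work + wait

# Flow efficiency = value-adding time / total lead time.
print(f"lead time: {lead_time} h, flow efficiency: {work / lead_time:.0%}")
# Long waits between steps, not slow work, usually dominate lead time.
```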

As teams are often focused on their own projects and tasks, it helps for everyone to see the broader picture and understand what is happening in other areas of the business and how things get done. When people understand how critical their contribution is to the value stream, they make better decisions because they understand the impact those decisions have.

Also, VSM technologies can provide transparency and consistency across stakeholders, no matter what their discipline, role or location. Ultimately, this empowers organizations to realize the true value of VSM and helps them maximize the value they deliver to their customers.

Conclusion

VSM is the framework that enables the new ways of working that are required to succeed now. Through well-executed VSM initiatives, organizations can accelerate digital transformation and boost agility, which can, in turn, increase the ROI of products and thus company profitability.

VSM can help establish value-focused teams and align people around customer value. Ultimately, this sets the groundwork for building responsiveness into an organization’s DNA, enabling it to become a market disruptor rather than being the victim of disruption.

To learn more about how Broadcom Software can help you with your digital transformation, contact us here.

About Serge Lucio:


Serge Lucio is Vice President and General Manager of the Agile Operations Division at Broadcom Software. In this role, he is responsible for the company’s solutions that help organizations scale their digital transformation by fusing business and IT.


Much of the hype around big data and analytics focuses on business value and bottom-line impacts. Those are enormously important in the private and public sectors alike. But for government agencies, there is a greater mission: improving people’s lives.

Data makes the most ambitious and even idealistic goals—like making the world a better place—possible.

This is intrinsically worthwhile, but it has now been codified as part of the Federal Data Strategy and its stated mission to “fully leverage the value of federal data for mission, service, and the public good.” Making the world a better place with data is not only noble, it’s required.

This comes at a critical time for the US and the world—the global population faces complex challenges that cannot be ignored, including:

• Caring for a large, aging population that is living longer than ever: For the first time in US history, older adults are expected to outnumber children by 2034. (US Census)
• Combating climate change: The US experienced 20 different billion-dollar weather and climate disasters in 2021, the second-most in history after the record 21 billion-dollar disasters in 2020. (Climate.gov)
• Navigating growing geopolitical tensions and conflicts: Agencies operate in a time of immense complexity, from rising inflation to war in Ukraine to a vast cybersecurity threat landscape (and the recent Executive Order on Improving the Nation’s Cybersecurity).
• Adapting to new social, economic, and public health realities: The COVID-19 pandemic and other factors have had indelible impacts on how (and where) people live and work, including a significant increase in remote and hybrid work, as well as creating a renewed urgency around improving public health.

The public sector has a massive, growing asset at its disposal for tackling these and other pressing issues: data. It is central to the kinds of intelligent analysis and insights required to take swift, informed action that positively impacts people’s lives and improves the world around them.

To achieve this vision, agencies must modernize and optimize how they collect, organize, manage, analyze, and act on that data—in an open manner that fosters trust and accountability with citizens, partners, and other stakeholders.

There are four fundamental attributes of this transformation that must be brought together as part of a data framework for a better world. These pillars include:

• Agility: Agencies need the tools, skills, and culture to rapidly adopt and collect new data sources, build new applications and data products, and deliver actionable insights. They cannot be asked to solve tomorrow’s problems with yesterday’s technology.
• Trust: For data to make the world a better place, people must be able to trust it. Agencies need tools and policies that balance transparency, security, privacy, and regulatory compliance. They will need to root out bias and ensure they—and their machine or algorithmic counterparts—leverage data in a manner that is equitable and ethical.
• Data-Driven Culture: The public sector must operate with a shared commitment to learning and adopting a data-driven culture in their own work—and to fostering that same data-driven culture in society at large.
• Open AI and ML Collaboration to Improve Government Services: The scope and scale of public data—and the challenges we must tackle—mean that human effort and ingenuity alone won’t be enough. To accomplish the most ambitious goal—improving people’s lives—government agencies will need to increasingly rely on automation and an open AI and machine learning collaboration that bridges research and services with cutting-edge data products.

Cloudera is well-positioned to support this modern data framework for a better world. We’ve built a hybrid data platform leveraging data lakehouse, data mesh, and data fabric as post-movement tools for optimal mission performance. 

That starts with bringing the vast amounts of data in motion—structured and unstructured, from a huge number of sources—into a single platform so that it can be utilized for the greater good. And we continuously invest in the tools and capabilities agencies need to ensure trust and transparency, from robust security protocols to data lineage to metadata and more.

You can learn more in our interactive ebook, “Data in Motion to Accelerate Your Mission.”

Together, we can use the power of data to make the world a better place.

Want to learn more? Watch Carolyn Duby’s presentation at Cloudera Government Forum.
