Anywhere work comes with all-around security risks. When HP moved its workforce of 70,000 employees and contractors to a hybrid model, the umbrella of devices it had to protect expanded exponentially.

HP had to ensure that it could:

Meet today’s rising security challenges

Guard employee data

Make layers of security transparent and automated

Keep software up to date

Ensure security policy compliance

How HP achieved it

HP’s CISO and security team looked to the company’s own extensive arsenal of services and chose HP Adaptive Endpoint Management, HP Sure Click Enterprise, and HP Wolf Security to protect its growing hybrid work environment.

Automated security policies

HP Adaptive Endpoint Management helps automate security updates from the cloud. When a cyberthreat is on the horizon, employees no longer have to wait until they are back on a corporate network—they can get the latest patches and upgrades from anywhere. Adaptive Endpoint Management enabled HP to rationalize its device management practices by streamlining policy management and using a corporate-ready device image.

Vigilant protection for end users

HP Sure Click Enterprise isolates key applications in their own micro-virtual containers—trapping and deleting risky programs, viruses, and malware as soon as the task is closed, keeping them from infecting the PC or network.

Built-in device protections

HP Wolf Security provides hardware-enforced full stack security that works below, in, and above the OS. IT departments are better able to improve lifecycle management as well as incident and disaster recovery while helping security teams minimize risk and increase team productivity—without interruption to employees.

Click here to read the full case study. To find out more about HP’s Workforce Security Solutions click here.

Remote Work, Security

When it comes to facilities, IT, staffing, and supply chain, businesses today need a whole new kind of blueprint to thrive in the new era of uncertainty.

Discover the five ways to help prepare for whatever is thrown your way while still meeting your desired business outcomes.

Workforce Experience

The whole purpose of gathering people in offices is so they can connect and interact; replicating this can be challenging when your workforce is distributed across multiple locations. Connectedness is about more than ongoing operations; it’s also about getting and keeping the best people.

IT Optimization

If the old style of blueprint was dogmatic, with a top-down structure offering little room to maneuver, the new style of blueprint must embrace change and put people at the center. Give your staff the equipment they need to let them use the tools they’re already comfortable with, whether that’s a laptop, a tablet, or a workstation.

Manageability

Nothing is more important than making sure that business can keep going whatever challenges arise. Today, enterprise IT has emerged not just as a vendor of services to internal corporate clients but as a full strategic partner in keeping the business running no matter where workers or customers may be.

Security Fortification

Whatever your security blueprint is, ensure it acknowledges the very real threats that are already occurring and likely to become more intense. That means having endpoints with built-in protection against the biggest threats and ensuring you have a mechanism to keep those endpoints up to date wherever they may be.

Sustainable Impact

Ensuring that technology is used in a way that aligns with your Environmental, Social, and Governance (ESG) goals is crucial for protecting the planet and creating sustainable working practices. By actively seeking out solutions that reduce your carbon footprint and promote environmentally responsible practices, organizations can make a positive impact on the world. Read the full guide from HP here.

To find out more about HP’s Workforce Solutions click here. To read more about HP’s Sustainable Impact click here. Discover more on HP’s Workforce Experience here. Learn about HP’s Workforce Security Solutions by visiting here. And find out more about HP’s Managed Device Services here.

Remote Work

Data is the powerhouse of digital transformation. That’s no surprise. But did you know that data is also one of the most significant factors in whether a company can achieve its sustainability goals? 

Business leaders are at a crossroads. On one hand, a perilous financial landscape threatens to stall growth, with companies of all sizes retreating to more well-established profit drivers. On the other hand, environmental stewardship has swelled into a legitimate consideration underpinning nearly every business decision, underscored by the severity of recent climate reports and a surge in consumer activism.  

This raises the question: as digital and environmental transitions both hit the top of corporate agendas, what are the most important changes needed to achieve this dual transition?

In this webinar, Microsoft’s Rosie Mastrandrea, TCS’ Jai Mishra, and Equinor’s Vegard Torset explore the crossroads of data and digital transformation — and how the right approach can unlock your sustainability goals. 

Watch the webinar.  

Digital Transformation, Financial Services Industry

As organizations shape the contours of a secure edge-to-cloud strategy, it’s important to align with partners that prioritize both cybersecurity and risk management, with clear boundaries of shared responsibility.

The security-shared-responsibility model is essential when choosing as-a-service offerings, which make a third-party partner responsible for some element of the enterprise operational model. Outsourcing IT operations has become a smart business strategy. But outsourcing operational risk is untenable, given the criticality of data-first modernization to overall enterprise success.

“Intellectual property is key to a company’s success,” notes Simon Leech, operational security lead for HPE GreenLake Cloud Services. “Therefore, it’s up to CIOs to do due diligence about what sort of security controls are in place and to ensure data is well protected in an [as-a-service] operating model. The security-shared-responsibility model provides a clear definition of the roles and responsibilities for security.”

Having a well-articulated and seamlessly integrated security-shared-responsibility model is table stakes. Organizations are spending far more time grappling with the costs and consequences of highly complex cyberattacks, to the tune of a 72% spike in costs over the last five years, according to the Accenture/Ponemon Institute’s “Ninth Annual Cost of Cybercrime” study. Specifically, the study attributed an average $4 million loss to business disruption, with another $5.9 million associated with information losses. In total, the global cost of cybercrime is skyrocketing, expected to grow 15% annually to hit the $10.5 trillion mark by 2025, noted the “2020 Cybersecurity Ventures” report. 
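For context on that trajectory: compounding 15% a year from the roughly $6 trillion figure commonly cited for 2021 (a baseline assumed here, not stated in this article) does land near $10.5 trillion by 2025. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of 15% annual growth in global cybercrime costs.
# The ~$6 trillion 2021 baseline is an assumed figure, not taken from this article.
cost_trillions = 6.0
for year in range(2022, 2026):
    cost_trillions *= 1.15
    print(f"{year}: ${cost_trillions:.1f} trillion")
# The output ends near $10.5 trillion in 2025, consistent with the projection above.
```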

HPE GreenLake: Security by Design

Against this backdrop of heightened cybercrime activity, organizations are more vulnerable as the proliferation of platforms, internet-of-things (IoT) devices, and cloud applications has created an expanded attack surface and widened security gaps. A new security-by-design approach infuses security practices and capabilities directly into new systems as they are built — versus addressing security requirements later as an afterthought. 

An organization’s approach to security must also scale at the speed of digital transformation. This means that security must be automated and integrated directly into continuous-integration/continuous-delivery (CI/CD) pipelines, ensuring that safeguards are applied consistently across workloads, no matter where data resides. This also makes it easier for developers to create secure code. As organizations grapple with additional complexity challenges, they need access to third-party security experts to close any internal security gaps.
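As one hedged illustration of what “automated and integrated directly into CI/CD pipelines” can look like in practice, the sketch below is a hypothetical build-gate script: an earlier pipeline stage is assumed to have written scanner findings to a scan_results.json file, and the gate fails the build on critical or high-severity findings. The file name, record schema, and severity threshold are assumptions made for illustration, not part of any HPE offering.

```python
#!/usr/bin/env python3
"""Hypothetical CI gate: fail the build if a prior scan step found severe issues.

Assumes an earlier pipeline stage (for example, a SAST or dependency scanner)
wrote its findings to scan_results.json as a list of {"id": ..., "severity": ...} records.
"""
import json
import sys

BLOCKING_SEVERITIES = {"CRITICAL", "HIGH"}  # policy threshold, applied identically on every branch

def main(path: str = "scan_results.json") -> int:
    with open(path) as fh:
        findings = json.load(fh)

    blocking = [f for f in findings if f.get("severity", "").upper() in BLOCKING_SEVERITIES]
    for f in blocking:
        print(f"BLOCKING: {f.get('id', 'unknown')} ({f['severity']})")

    # A non-zero exit code makes the CI stage, and therefore the deployment, fail.
    return 1 if blocking else 0

if __name__ == "__main__":
    sys.exit(main())
```

Running the same gate for every workload, regardless of where the code will be deployed, is what keeps the safeguards consistent.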

The HPE GreenLake security-shared-responsibility model differs from that of the typical cloud provider, because the as-a-service platform delivers a public cloud experience everywhere, including in a company’s private data center and/or in a shared colocation facility. The company or colocation provider maintains responsibility for securing the connectivity and physical data center, and HPE’s responsibilities vary, depending on the chosen HPE GreenLake consumption model. For example:

In a bare metal model, HPE is responsible for securing the HPE GreenLake infrastructure and cloud experience, but the customer takes ownership of everything on top of that infrastructure, including the operating system (OS), hypervisor, container orchestration, applications, and more.

With containers and virtual machines, the responsibility shifts and HPE GreenLake handles security for the lower layers, including the hypervisors, software-defined networking, and container orchestration. Here again, the customer is responsible for securing the guest OS, applications, and data.

For workloads, such as SAP HANA delivered as a service or electronic health records as a service, HPE GreenLake takes security responsibility for everything up through the application layer, while the customer maintains ownership of data security.

“In all three scenarios, security of customer data is always the responsibility of the customer,” Leech says. “It’s ultimately their responsibility to decide what data they put in the cloud, what data they keep out of the cloud, and how they keep that data protected.”
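The three consumption models above can be summarized as a simple responsibility matrix. The sketch below is an illustrative encoding of the split as described in this article; the authoritative matrix is defined by HPE GreenLake documentation and your contract.

```python
# Illustrative encoding of the shared-responsibility split described above.
# The "HPE" / "Customer" assignments paraphrase this article; consult HPE GreenLake
# documentation and your contract for the authoritative matrix.
RESPONSIBILITY = {
    # layer                          bare metal        containers/VMs    workload-as-a-service
    "physical data center":         ("Customer/colo",  "Customer/colo",  "Customer/colo"),
    "infrastructure":               ("HPE",            "HPE",            "HPE"),
    "hypervisor/SDN/orchestration": ("Customer",       "HPE",            "HPE"),
    "guest OS":                     ("Customer",       "Customer",       "HPE"),
    "application":                  ("Customer",       "Customer",       "HPE"),
    "data":                         ("Customer",       "Customer",       "Customer"),
}

MODELS = ("bare metal", "containers/VMs", "workload-as-a-service")

def owner(layer: str, model: str) -> str:
    """Return who secures a given layer under a given consumption model."""
    return RESPONSIBILITY[layer][MODELS.index(model)]

print(owner("data", "workload-as-a-service"))  # -> Customer (data is always the customer's responsibility)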

Best Practices for Security Success

Drill down into the details. Leech cautions that the No. 1 rule for security success is understanding the boundaries of responsibility and not making any premature assumptions. Organizations should confer with their cloud service provider to clearly understand and delineate who has responsibility for what. Most cloud providers, including HPE, offer collateral that drills down into the details of their security-shared-responsibility model, and customers should take full advantage.

“The risk is really one of blissful ignorance,” he says. “The assumption can be made that security is there, but unless you actually go into the contract and look at the details, you might be making a wrong assumption.”

Include the enterprise risk management team. Invite the enterprise risk management team into the discussion early on, so it has a clear understanding of the potential risks. With that knowledge, it can help determine what is acceptable, based on a variety of factors, including the industry, specific regulatory climate, and customer demands. 

Follow security-by-design principles. Use the security-shared-responsibility model as an opportunity to address security early on and identify potential gaps. In addition to automation and ensuring that security is code-driven, embrace zero trust and identity and privilege as foundational principles. “By understanding what those gaps are early enough, you can build compensating controls into your environment and make sure it is protected in a way you’d expect it to be,” Leech explains.

Know that visibility is essential. Security monitoring should be a part of the routine to gain a full understanding of what’s happening in the environment. Organizations can opt to do security monitoring on their own or enlist additional services as part of an HPE GreenLake contract. “It goes back to that idea of blissful ignorance,” Leech says. “If I’m not doing any security monitoring, then I never have any security incidents, because I don’t know about them.”

The HPE GreenLake edge-to-cloud platform was designed with zero-trust principles and scalable security as cornerstones of its architecture and development — leveraging common security building blocks, from silicon to cloud, that continuously protect your infrastructure, workloads, and data so you can adapt to increasingly complex threats. For more information, visit https://www.hpe.com/us/en/solutions/security.html

Security

In 2022, with the pandemic subsiding, the National Museum of African American History and Culture at the Smithsonian Institution in Washington, DC, once again served more than 1 million visitors. But thanks to an inventive digital offering, called Searchable Museum, the museum has been able to reach even more.

The searchable replica of the museum, which launched in November 2021, allows anyone around the globe to experience all elements of the museum, even on low-bandwidth internet connections and inexpensive phones and tablets — a big benefit, especially for those without the means to visit the physical museum, which opened to much fanfare in 2016, including the attendance of then President Barack Obama and former President George W. Bush.

“After we opened the museum, the director and curatorial staff were talking about how to reach even more people who wouldn’t be able to come to the museum and experience it,” says Jill Roberts, program manager of the Searchable Museum at the Office of Digital Strategy and Engagement (ODSE). “We wanted a digital platform that would bring the museum to other people in the world.”

Curators and IT experts who designed the digital replica continue to enhance the platform with technologies that make it not only more widely accessible but also replete with fresh, robust content. The project was awarded a 2022 US CIO 100 Award for leadership and innovation.

And it’s working. To date, the Searchable Museum has served more than 1.6 million page views, with more than 65% of that traffic coming from mobile phones, Roberts says. A good deal of the traffic has been driven by a QR code deployed across social media channels and other communications outlets to promote awareness of the digital museum’s existence.

Digital storytelling

To entice a technical partner to build the digital site, the ODSE published an RFP and received responses from 15 qualified IT firms that wanted to take on the immense task of digitally recreating a multifloor museum.

Baltimore-based digital design firm Fearless was awarded the contract to build and maintain the Smithsonian’s first and only digital museum. The startup, which focuses on federal contracts, earned its first contract with the Secret Service in 2017. Today, it employs more than 200 people.

John Foster, COO of Fearless, says the project goes beyond the term ‘searchable’ to provide instead a digital storytelling of the African American experience as depicted in the physical National Museum of African American History and Culture.  

John Foster, COO, Fearless


The digital version’s first exhibit, “Slavery and Freedom,” is a foundational exhibit of the physical museum; it is followed by 10 additional exhibits found in both the physical and digital museums.

To develop the project, Fearless leveraged the Smithsonian’s APIs to access a massive catalog of digital content, including 3D models, videos, podcasts, and imagery not available in the physical building, to create an immersive, rich experience that rivals a walk-through.

“From the beginning, we challenged ourselves to follow a more audience-centered, data-informed approach to its design and development,” says Adam Martin, CDO of the museum. “Together with our partners, the museum engaged in an iterative process to reimagine and transform the in-person visitor experience for online audiences.”

Adam Martin, chief digital officer, National Museum of African American History and Culture, Smithsonian Institution


For instance, the History Elevator, narrated by Maya Angelou, transports visitors to the early 1400s using images from various centuries, along with digital displays and statistics on the 40,000 slave ships of the transatlantic slave trade and a feature on the domestic slave trade that displays authentic excerpts of bills of sale of human beings and slave auction lists of names. The “Paradox of Liberty” exhibit depicts Thomas Jefferson’s ownership of 609 slaves, as well as Sugar Pot and Tower of Cotton artifacts that convey the “juxtaposition of profit and power and the human cost” of slave production.

The Searchable Museum also features content not available in the physical museum. For instance, while the physical museum features a replica of the Point of Pines Slave Cabin, one of two highly protected slave cabins on Edisto Island, S.C., only the digital Searchable Museum offers a 360-degree look inside the cabin.

Fearless software engineer Avery Smith agreed that leading-edge technologies ushered in by the metaverse might one day transport museum visitors digitally using virtual reality and augmented reality, but he — and all those involved in the Searchable Museum — insist on keeping the digital museum accessible to all. A VR/AR experience requires headsets that can often cost hundreds of dollars, he notes.

The primary aim of the Searchable Museum, after all, is to lower the barrier to entry for all and make it widely accessible globally. To date, visitors from nearly 50 nations outside of the US have accessed the Searchable Museum. Foster is proud that his team accomplished its mission without shortchanging any of the experiences or content in the building.

“I can experience that story on my laptop as well as on my phone,” Foster says of the digital replica, which mimics the multifloor building, starting with a darker tone for the slavery exhibit on the bottom floor and progressing to brighter tones as the viewer ascends through the stairs of African American history.

 “As you go through the history, the higher you get, the feeling of lightness should come about because you see some of the progress being made in the African American story,” Foster says.

The technology behind the Searchable Museum

The Searchable Museum runs on Amazon Web Services and uses APIs created by the Smithsonian IT team to access all the metadata available in the massive catalog of artifacts, images, video clips, 3D objects, and other components that reside within the 11 inaugural exhibitions in the building.
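For readers curious what pulling catalog metadata over an API looks like, here is a minimal sketch against the Smithsonian’s public Open Access search API. The Searchable Museum’s internal APIs are not publicly documented, so the endpoint, query syntax, and field names below are assumptions made for illustration, and an api.data.gov API key is assumed to be available in the environment.

```python
"""Minimal sketch: query museum object metadata from a public search API.

The endpoint, query syntax, and field names below follow the Smithsonian Open
Access API as an assumption for illustration; the Searchable Museum's internal
APIs are not publicly documented. An api.data.gov API key is required.
"""
import os
import requests

API_URL = "https://api.si.edu/openaccess/api/v1.0/search"

def search_objects(query: str, rows: int = 5) -> list[dict]:
    params = {
        "q": query,                      # free-text or fielded query (assumed syntax)
        "rows": rows,
        "api_key": os.environ["SI_API_KEY"],
    }
    resp = requests.get(API_URL, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json().get("response", {}).get("rows", [])

if __name__ == "__main__":
    # Hypothetical query: 3D media records attributed to the NMAAHC unit.
    for record in search_objects('unit_code:"NMAAHC" AND online_media_type:"3D Images"'):
        print(record.get("title"), "-", record.get("id"))
```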

Fearless designers also employed unique technologies to ensure viewers have a speedy, rich experience regardless of the device they use. Many of these technologies are time-tested frameworks and tools that don’t get a lot of publicity but are known platforms that have evolved and aged well. Fearless chose Gatsby, a JavaScript framework for fast web page creation, and Craft CMS, which provides a speedy yet robust reach into the museum’s sizable catalog of metadata stored in AWS S3.

The Gatsby static-site generator scaffolds the front end of the website, enabling developers to build ultrafast web page rendering, while the headless CMS fetches and renders the metadata stored in the back-end cloud systems at high speeds.
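Gatsby itself is a JavaScript and React framework, but the underlying build-time idea (fetch and template the content once, before deployment, so visitors only download pre-rendered pages) can be sketched in a few lines of Python. This is a concept illustration only, not the Searchable Museum’s actual Gatsby and Craft CMS wiring; the fetch_records function and the HTML template are hypothetical.

```python
"""Concept sketch of static pre-rendering: do the data fetching and templating
once at build time, then serve plain files. A hypothetical stand-in for what a
static-site generator such as Gatsby automates; not the Searchable Museum's code."""
from pathlib import Path

def fetch_records() -> list[dict]:
    # Hypothetical: in a real build this step would call the CMS / metadata APIs.
    return [
        {"slug": "slavery-and-freedom", "title": "Slavery and Freedom", "body": "..."},
        {"slug": "paradox-of-liberty", "title": "The Paradox of Liberty", "body": "..."},
    ]

def render(record: dict) -> str:
    # Real generators render React components; a format string keeps the idea visible.
    return f"<html><head><title>{record['title']}</title></head><body>{record['body']}</body></html>"

def build(out_dir: str = "public") -> None:
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    for record in fetch_records():            # all processing happens before deploy
        (out / f"{record['slug']}.html").write_text(render(record))

if __name__ == "__main__":
    build()  # the resulting files can be served from any static host or CDN
```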

“Gatsby takes all that processing that used to happen in real-time and runs the processing before the website is deployed,” says Smith, of Fearless, about the pre-rendering framework. “I advocated for its use with React instead of JavaScript because of my experience working with the Small Business Administration. I learned with Gatsby how clever it is and how it allows for a very, very, very fast web experience … and those speed gains are important for an image-heavy website.”

Avery Smith, software engineer, Fearless 


Smith also notes that the Smithsonian’s APIs provide access to 3D images in various resolutions that are immensely helpful for the curators designing the content and to the IT experts delivering on their vision. “Technology doesn’t need to be super complex to get the job done,” Smith says.

The Searchable Museum continues to add compelling content, such as a special exhibit on Afrofuturism in March, and its technology partners are exploring technologies such as geofencing, which will allow students to play games and learn along the way. But the most important aspect of the Searchable Museum, all say, is accessibility — and Fearless deployed proven rather than bleeding-edge technologies to deliver on that dream.

CIO 100, Digital Transformation, Government IT


Introduction
Since its inception decades ago, the primary objective of business intelligence has been the creation of a top-down single source of truth from which organizations would centrally track KPIs and performance metrics with static reports and dashboards. This stemmed from the proliferation of data in spreadsheets and reporting silos throughout organizations, often yielding different and conflicting results. With this new mandate, BI-focused teams were formed, often in IT departments, and they began to approach the problem in the same manner as traditional IT projects, where the business makes a request of IT, IT logs a ticket, then fulfills the request following a waterfall methodology.

While this supplier/consumer approach to BI appeared to be well-suited for the task of centralizing an organization’s data and promoting consistency, it sacrificed business agility. There was a significant lag between the time the question was asked, and the time the question was answered. This delay and lack of agility within the analysis process led to lackluster adoption and low overall business impact.

The emergence of self-service BI in recent years has challenged the status quo, especially for IT professionals who have spent the better part of the past two decades building out a BI infrastructure designed for developing top-down, centralized reporting and dashboards. Initially, this self-service trend was viewed as a nuisance by most IT departments and was virtually ignored. The focus remained on producing a centrally-managed single source of truth for the organization.

Fast-forward to today and IT finds itself at a crossroad with self-service BI as the new normal that can no longer be ignored. The traditional approach to BI is becoming less and less relevant as the business demands the agility that comes with self-service to drive adoption and improve organization outcomes. This, paired with the continued exponential growth in data volume and complexity, presents IT with an important choice.

Either the demand for self-service BI is embraced, and IT evolves to become the enabler of the broader use and impact of analytics throughout their organizations, or it is ignored and IT continues as the producer of lower-value enterprise reporting stifled by the limitations of traditional tools. IT professionals who are ready to serve as a catalyst and embrace this opportunity will deliver far greater value to their organizations than those who choose to ignore the real needs of their business users and analysts.

As organizations begin the transition from a traditional top-down approach driven by IT to a self-service approach enabled by IT and led by the business, a new framework and overall strategy is required. This means that past decisions supporting the core foundational components of a BI program—people, process, and platform—must be revisited. Adjustments are needed in these three core areas to support the shift from a model of top-down BI development and delivery to a self-service-based modern BI model which is driven, and primarily executed on, by the business.

Process
Self-service analytics does not mean end users are allowed unfettered access to any and all data and analytic content. It means they have the freedom to explore pertinent business data that is trusted, secure, and governed. This is where process comes into play and represents the component that requires the most significant shift in traditional thinking for IT. A successful modern BI program is able to deliver both IT control and end-user autonomy and agility. A well-established and well-communicated process is required for an organization to strike this delicate balance.

A top-down, waterfall-based process only addresses the IT control part of the equation. A traditional BI deployment focuses primarily on locking down data and content with governance. This means limiting access and freedom to only a few people with specialized technical skills who are expected to meet the needs and answer the questions of the many. This typically involves developer-centric processes to design and build the enterprise data warehouse (EDW) model, build the ETL jobs to transform and load data into the model, construct the semantic layer to mask the complexity of the underlying data structures, and finally build the business-facing reports and dashboards as originally requested by the business.

The unfortunate reality is that this approach often fails to deliver on the vision and promise of BI—to deliver significant and tangible value to the organization through improved decision making with minimal time, effort, and cost. Top-down, IT-led BI models often have an inverse profile of time, effort, and cost relative to the value they deliver to the organization.

A modern analytics solution requires new processes and newly defined organizational roles and responsibilities to truly enable a collaborative, self-service-based development process. IT and users must collaborate to jointly develop the rules of the road for their secure environment, rules that both sides must abide by in order to maximize the business value of analytics without compromising the governance or security of the data.

IT’s success is highlighted, and its value to the organization realized, when the business can realize significant value and benefit from investments in analytics and BI. Should IT still be considered successful even if not a single end-user utilizes the BI system to influence a single business decision? Traditional processes intended to serve top-down BI deployments are too often measured by metrics that are not tied to outcomes or organizational impact.

If the ETL jobs that IT created ran without failure and the EDW was loaded without error and all downstream reports refreshed, many IT organizations would consider themselves successful.

Merely supplying data and content to users without any regard for whether or not it is adopted and provides value through improved outcomes is simply not enough. Modern BI requires updated processes to support self-service analytics across the organization. It also requires the definition of new success metrics for which IT and the business are jointly accountable and are therefore equally invested.

Where processes and technology intertwine, there is tremendous opportunity. Technical innovations, especially with AI, will continue to make it easier to automate processes and augment users of all skill levels throughout the analytics workflow. And while process can accelerate, rather than inhibit, successful analytics outcomes, it’s important to recognize that this relies on a system and interface that people are eager to use. If processes aren’t supported by the right platform choice, they will stifle adoption.

Platform
Since BI has been historically viewed as an IT initiative, it is not surprising that IT drove virtually every aspect of platform evaluation, selection, purchasing, implementation, deployment, development, and administration. But with drastic changes required to modernize the people and process components of a BI and analytics program, IT must change the criteria for choosing the technology to meet these evolving requirements. Perhaps the most obvious change is that IT must intimately involve business users and analysts from across the organization in evaluating and ultimately deciding which modern platform best fits the organization and addresses the broad needs of the users. For more information on selecting the right analytics platform, check out the Evaluation Guide.

A modern platform must address a wide range of needs and different personas as well as the increased pace of business and the exponential growth in data volume and complexity. IT requires that the chosen platform enables governance and security while end users require easy access to content and the ability to explore and discover in a safe environment.

The chosen platform must also be able to evolve with the landscape and integrate easily with other systems within an organization. A centralized EDW containing all data needed for analysis, which was the cornerstone of traditional BI, is simply not possible in the big-data era. Organizations need a platform that can adapt to an evolving data landscape and insulate users from increased complexity and change.

The most critical aspect is the ability to meet these diverse needs in an integrated and intuitive way. This integration is depicted on the following page as the modern analytic workflow. The diagram highlights the five key capabilities that must flow seamlessly in order for the three personas depicted in the center to effectively leverage the platform.

The BI and analytics platform landscape has passed a tipping point, as the market for modern products is experiencing healthy growth while the traditional segment of the market is declining with little to no net new investment. IT leaders should capitalize on this market shift and seize the opportunity to redefine their role in BI and analytics as a far more strategic one that is critical to the future success of the organization. Adopting a collaborative approach to recast the foundational aspects of the BI program and truly support self-service is the key to changing the perception of IT from a producer to a strategic partner and enabler for the organization.

Promise
In today’s era of digital transformation, IT leaders are increasingly expected to take on digital business initiatives in their organizations, including identifying cost savings and finding new revenue streams. Realizing the value of data for these transformational efforts, many businesses are modernizing and increasing their analytics investments to innovate and accelerate change.
Everyone agrees that putting data at the center of conversations promises change. However, most organizations are failing to successfully implement an enterprise-wide analytics program.

IT is well positioned for a leadership role in these efforts, and is essential for the task of giving people the relevant data they need for decision-making. Modern analytics shifts IT’s role to a more strategic partner for the business, empowering users to navigate a trusted, self-service environment. But beyond access to the data, everyone needs the motivation and confidence to properly make decisions with it. You need to identify the relationships between job functions and data and change behaviors that run deep into the fabric of your organization’s culture.

This also means expanding your definition of self-service so that business users participate in some of the traditionally IT-led responsibilities associated with data and analytics—like administration, governance, and education. With the right processes, standards, and change management, business users can help manage data sources, analytics content, and users in the system, as well as contribute to training, evangelism, and the internal community. When users value and participate in these efforts, IT can manage strategic initiatives like business SLAs and ensuring the security of company assets.

Although every organization’s journey to building a data-driven organization will differ, achieving your transformational goals requires a deliberate and holistic approach to developing your analytics practice. Success at scale relies on a systematic, agile approach to identify key sources of data, how data is selected, managed, distributed, consumed, and secured, and how users are educated and engaged. The better you understand your organization’s requirements, the better you will be able to proactively support the broad use of data.

Tableau Blueprint provides concrete plans, recommendations, and guidelines as a step-by-step guide to creating a data-driven organization with modern analytics. We worked with thousands of customers and analytics experts to capture best practices that help turn repeatable processes into core capabilities to build and reinforce a data-driven mindset throughout your organization.
Learn more and get started today.

About Tableau
Tableau is a complete, integrated, and enterprise-ready visual analytics platform that helps people and organizations become more data driven. Whether on-premises or in the cloud, on Windows or Linux, Tableau leverages your existing technology investments and scales with you as your data environment shifts and grows. Unleash the power of your most valuable assets: your data and your people.

Analytics

Stop me if you’ve heard this one before. Several economists, a bank president, and a couple of reporters walk into a bar. The economists lament, “A thick fog of uncertainty still surrounds us.” The bank president wails, “Economic hurricane.” The reporters keen about “gut-churning feelings of helplessness” and “a world of confusion.”

Sitting in a booth with his hard-working direct reports, the chief information officer sighs. “Same-old, same-old. Uncertainty is our jam.”

For as long as there has been an IT organization, CIOs have been charged with “keeping the lights on” (KTLO), delivering five-nines stability (just 5.26 minutes of downtime per year), and expanding digital capabilities in a world characterized by massive economic, political, social, and technological uncertainty.
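For the record, the “five nines” arithmetic is straightforward; the short calculation below shows how small that downtime budget really is.

```python
# Downtime budget implied by an availability target.
minutes_per_year = 365.25 * 24 * 60              # about 525,960 minutes

for availability in (0.999, 0.9999, 0.99999):    # three, four, and five nines
    downtime = minutes_per_year * (1 - availability)
    print(f"{availability:.3%} availability -> {downtime:.2f} minutes of downtime per year")
# Five nines works out to roughly 5.26 minutes per year.
```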

In other words, IT leaders know there is nothing uncertain about uncertainty. After all, uncertainty is the one certainty. We should not run from uncertainty; we should embrace it. Paraphrasing poet Robert Frost, when it comes to uncertainty, there is “no way out but through.”

What really drives uncertainty

One of the smartest guys on this planet, Ed Gough, formerly chief scientist and CTO at NATO Undersea Research Center (NURC) and former technical director at the US Navy’s Naval Meteorology and Oceanography Command, explained to me that ignorance is at the root of uncertainty.

As John Kay and Mervyn King set forth in Radical Uncertainty: Decision-Making Beyond the Numbers, “Uncertainty is the result of our incomplete knowledge of the world, or about the connection between our present actions and their future outcomes.”

There will always be uncertainties external to the organization. But the uncertainties that do the most to destroy IT value are the self-inflicted ones.

The No. 1 source of uncertainty in the workplace is absence of strategy. Business scholars, think tanks, and some members of the media are discovering that many organizations have not explicitly stated where they want to go and how they plan to get there. To wit, two-thirds of enterprises do not have a data strategy.

And among the companies that do have a strategy, just 14% of their employees “have a good understanding of their company’s strategy and direction.”

It all boils down to what Warner Moore, founder and CISO at Columbus, Ohio-based Gamma Force, recently told me: “Uncertainty isn’t the problem; lack of leadership is the problem.”

Focus on what matters most

Business school is where a lot of today’s business leaders learn their trade. And if you examine how business schools approach uncertainty, you can begin to see where this leadership issue takes root.

In business schools around the world, MBAs are counseled to combat uncertainty by compiling a comprehensive list of possible outcomes and then attaching numerical probabilities to each scenario. This approach is untenable, as possible outcomes are infinite and assigning probabilities — subject to assumptions and biases — creates a false sense of precision.

One of the guiding principles for those who would master uncertainty is to recognize that there has always been something irresistible about advice in mathematical form. Over-reliance on metrics has given rise to the term “McNamara fallacy,” referring to the tragic missteps associated with the misaligned quantifications used during the Vietnam War.

Instead of flailing around trying to enumerate everything that could happen, executives need to place intense scrutiny on a subset of critical uncertainties. In other words, neglect the right uncertainties.

I spoke with a subset of the 75 senior IT executives attending the Digital Solutions Gallery hosted by The Ohio State University to get their thoughts about which zones of uncertainty they were focusing on. The general consensus was that one of the best places to start managing hyper-uncertainty was talent.

Speaking earlier this year at a “mini-conference,” the Survey of Business Uncertainty: Panel Member Economic Briefing and Policy Discussion, Atlanta Fed President Raphael Bostic told attendees, “Finding workers is a big problem.”

Finding workers might be an uncertain undertaking but retaining key performers is not. Leaders have it in their power to know what their high performers are thinking. For these key employees it is possible to paint reasonably clear pictures of what happens next.

Mike McSally, a human capital advisor with 20-plus years of experience in executive recruiting, does not believe recruiting has to be a problem. Reducing talent uncertainty is a simple matter of managing personal networks. McSally suggests having your top ten performers take out a yellow sheet of paper and write down “the top twenty people you have ever worked with.” Give them a call.

When you find a qualified candidate, deliver to them an authentic “what-a-day-at-work-really-looks-like” depiction of the role being filled. When that depiction aligns with your strategic vision and your company’s mission, you’ll have a leg up on converting that candidate to a new team member.

That kind of leadership approach will help you handle talent uncertainties, better positioning your organization for the future. I am certain of that.

Hiring, IT Leadership, IT Strategy, Staff Management

Introduction
Since its inception decades ago, the primary objective of business intelligence has been the creation of a top-down single source of truth from which organizations would centrally track KPIs and performance metrics with static reports and dashboards. This stemmed from the proliferation of data in spreadsheets and reporting silos throughout organizations, often yielding different and conflicting results. With this new mandate, BI-focused teams were formed, often in IT departments, and they began to approach the problem in the same manner as traditional IT projects, where the business makes a request of IT, IT logs a ticket, then fulfills the request following a waterfall methodology.

While this supplier/consumer approach to BI appeared to be well-suited for the task of centralizing an organization’s data and promoting consistency, it sacrificed business agility.

There was a significant lag between the time the question was asked, and the time the question was answered. This delay and lack of agility within the analysis process led to lackluster adoption and low overall business impact.

The emergence of self-service BI in recent years has challenged the status quo, especially for IT professionals who have spent the better part of the past two decades building out a BI infrastructure designed for developing top-down, centralized reporting and dashboards. Initially, this self-service trend was viewed as a nuisance by most IT departments and was virtually ignored. The focus remained on producing a centrally-managed single source of truth for the organization.

Fast-forward to today and IT finds itself at a crossroad with self-service BI as the new normal that can no longer be ignored. The traditional approach to BI is becoming less and less relevant as the business demands the agility that comes with self-service to drive adoption and improve organization outcomes. This, paired with the continued exponential growth in data volume and complexity, presents IT with an important choice.

As organizations begin the transition from a traditional top-down approach driven by IT to a self-service approach enabled by IT and led by the business, a new framework and overall strategy is required. This means that past decisions supporting the core foundational components of a BI program—people, process, and platform—must be revisited. Adjustments are needed in these three core areas to support the shift from a model of top-down BI development and delivery to a self-service-based modern BI model which is driven, and primarily executed on, by the business.

People
A successful transition to self-service business analytics begins with people and should be the top priority for IT when considering changes required for BI modernization. In a traditional BI model, people were often considered last after platform and process. The widely-used mantra “if you build it, they will come” exemplifies the belief that business users would gravitate toward a well-built system of record for BI that would answer all of their business questions.

This desired end-state rarely came to fruition since there was little to no collaboration between the business users and IT during the process of building the solution after an upfront requirements-gathering phase. In the absence of active engagement and feedback from the business between requirements gathering and project completion, many opportunities for failure typically emerge. A few of the most common include:

• Business or organizational changes occur during the development process that render the initial requirements obsolete or invalid.
• Incomplete or inaccurate requirements are given in the initial process phases.
• Errors are made in the process of translating business requirements into technical requirements.

The end result of these situations is often that business users disengage from the BI program completely and an organization’s investment of time and resources is wasted due to lack of adoption. Business users and analysts need to use analytics in order for it to have any impact and deliver organizational value. A BI model that embraces self-service puts these users first and allows them to explore, discover, and build content that they will ultimately use to make better business decisions and transform business processes.

Collaboration between the business and IT is critical to the success of the implementation, as IT knows how to manage data and the business knows how to interpret and use data within the business processes they support. The business has the context within which analytics and the insight derived from it will be used to make better business decisions and ultimately improve outcomes. This early collaboration between the groups will not only lead to the deployment of a platform that meets the needs of the business but will also drive adoption and impact of the platform overall.

Process
Self-service analytics does not mean end users are allowed unfettered access to any and all data and analytic content. It means they have the freedom to explore pertinent business data that is trusted, secure, and governed. This is where process comes into play and represents the component that requires the most significant shift in traditional thinking for IT. A successful modern BI program is able to deliver both IT control and end-user autonomy and agility. A well-established and well-communicated process is required for an organization to strike this delicate balance.

A top-down, waterfall-based process only addresses the IT control part of the equation. A traditional BI deployment focuses primarily on locking down data and content with governance. This means limiting access and freedom to only a few people with specialized technical skills who are expected to meet the needs and answer the questions of the many. This typically involves developer-centric processes to design and build the enterprise data warehouse (EDW) model, build the ETL jobs to transform and load data into the model, construct the semantic layer to mask the complexity of the underlying data structures, and finally build the business-facing reports and dashboards as originally requested by the business.

The unfortunate reality is that this approach often fails to deliver on the vision and promise of BI—to deliver significant and tangible value to the organization through improved decision making with minimal time, effort, and cost. Top-down, IT-led BI models often have an inverse profile of time, effort, and cost relative to the value they deliver to the organization.


A modern analytics solution requires new processes and newly defined organizational roles and responsibilities to truly enable a collaborative, self-service-based development process. IT and users must collaborate to jointly develop the rules of the road for their secure environment, rules that both sides must abide by in order to maximize the business value of analytics without compromising the governance or security of the data.

IT’s success is highlighted, and its value to the organization realized, when the business can realize significant value and benefit from investments in analytics and BI. Should IT still be considered successful even if not a single end-user utilizes the BI system to influence a single business decision? Traditional processes intended to serve top-down BI deployments are too often measured by metrics that are not tied to outcomes or organizational impact. If the ETL jobs that IT created ran without failure and the EDW was loaded without error and all downstream reports refreshed, many IT organizations would consider themselves successful.

Merely supplying data and content to users without any regard for whether or not it is adopted and provides value through improved outcomes is simply not enough. Modern BI requires updated processes to support self-service analytics across the organization. It also requires the definition of new success metrics for which IT and the business are jointly accountable and are therefore equally invested.

Where processes and technology intertwine, there is tremendous opportunity. Technical innovations, especially with AI, will continue to make it easier to automate processes and augment users of all skill levels throughout the analytics workflow. And while process can accelerate, rather than inhibit, successful analytics outcomes, it’s important to recognize that this relies on a system and interface that people are eager to use. If processes aren’t supported by the right platform choice, they will stifle adoption.

Platform
Since BI has been historically viewed as an IT initiative, it is not surprising that IT drove virtually every aspect of platform evaluation, selection, purchasing, implementation, deployment, development, and administration. But with drastic changes required to modernize the people and process components of a BI and analytics program, IT must change the criteria for choosing the technology to meet these evolving requirements. Perhaps the most obvious change is that IT must intimately involve business users and analysts from across the organization in evaluating and ultimately deciding which modern platform best fits the organization and addresses the broad needs of the users. For more information on selecting the right analytics platform, check out the Evaluation Guide.

A modern platform must address a wide range of needs and different personas as well as the increased pace of business and the exponential growth in data volume and complexity. IT requires that the chosen platform enables governance and security while end users require easy access to content and the ability to explore and discover in a safe environment.

The chosen platform must also be able to evolve with the landscape and integrate easily with other systems within an organization. A centralized EDW containing all data needed for analysis, which was the cornerstone of traditional BI, is simply not possible in the big-data era. Organizations need a platform that can adapt to an evolving data landscape and insulate users from increased complexity and change.

The most critical aspect is the ability to meet these diverse needs in an integrated and intuitive way. This integration is depicted on the following page as the modern analytic workflow. The diagram highlights the five key capabilities that must flow seamlessly in order for the three personas depicted in the center to effectively leverage the platform.


The BI and analytics platform landscape has passed a tipping point, as the market for modern products is experiencing healthy growth while the traditional segment of the market is declining with little to no net new investment. IT leaders should capitalize on this market shift and seize the opportunity to redefine their role in BI and analytics as a far more strategic one that is critical to the future success of the organization. Adopting a collaborative approach to recast the foundational aspects of the BI program and truly support self-service is the key to changing the perception of IT from a producer to a strategic partner and enabler for the organization.

Promise
In today’s era of digital transformation, IT leaders are increasingly expected to take on digital business initiatives in their organizations, including identifying cost savings and finding new revenue streams. Realizing the value of data for these transformational efforts, many businesses are modernizing and increasing their analytics investments to innovate and accelerate change. Everyone agrees that putting data at the center of conversations promises change. However, most organizations are failing to successfully implement an enterprise-wide analytics program.

IT is well positioned for a leadership role in these efforts, and is essential for the task of giving people the relevant data they need for decision-making. Modern analytics shifts IT’s role to a more strategic partner for the business, empowering users to navigate a trusted, self-service environment. But beyond access to the data, everyone needs the motivation and confidence to properly make decisions with it. You need to identify the relationships between job functions and data and change behaviors that run deep into the fabric of your organization’s culture.

This also means expanding your definition of self-service so that business users participate in some of the traditionally IT-led responsibilities associated with data and analytics—like administration, governance, and education. With the right processes, standards, and change management, business users can help manage data sources, analytics content, and users in the system, as well as contribute to training, evangelism, and the internal community. When users value and participate in these efforts, IT can manage strategic initiatives like business SLAs and ensuring the security of company assets.

Although every organization’s journey to building a data-driven organization will differ, achieving your transformational goals requires a deliberate and holistic approach to developing your analytics practice. Success at scale relies on a systematic, agile approach to identify key sources of data, how data is selected, managed, distributed, consumed, and secured, and how users are educated and engaged. The better you understand your organization’s requirements, the better you will be able to proactively support the broad use of data.

Tableau Blueprint provides concrete plans, recommendations, and guidelines as a step-by-step guide to creating a data-driven organization with modern analytics. We worked with thousands of customers and analytics experts to capture best practices that help turn repeatable processes into core capabilities to build and reinforce a data-driven mindset throughout your organization. Learn more and get started today.

IT Leadership

Real World Evidence (RWE) refers to the analysis of Real World Data (RWD), which life sciences companies and healthcare organizations securely obtain, store, and analyze to gain insights into how a drug or medical intervention performs. RWE helps medical professionals and other stakeholders demonstrate a particular drug’s effectiveness in treating medical conditions in a real-world setting.

Today, life sciences companies have a huge opportunity to unlock the potential of RWD and improve patient outcomes.

Why the buzz around RWE?

For some time now, the use of mobile devices, computers, and other connected equipment to collect and store massive amounts of health data has been accelerating. This data has the potential to allow life sciences companies to conduct clinical trials more effectively.

Moreover, the introduction of more sophisticated and modern analytical capabilities has paved the way for advanced analysis of the data acquired and its application in medical product development and approval.

The ever-changing healthcare sector

Even today, healthcare payors and governments continue to face enormous data management and storage capacity challenges. Drug prices are increasing, owing to development costs and an increase in demand for personalized treatments, which is placing unimaginable pressure on life sciences companies.

These companies are focusing on the development of integrated solutions and therefore are moving “beyond the pill” and becoming solution providers. With a renewed focus on offering value to various stakeholders, these companies are creating new commercial models. RWE helps them respond to these trends successfully.

Harnessing the power of cloud

With the AWS cloud, it is now easier for pharma companies to derive huge datasets from a vast pool of sources. Companies are equipped with the organizational intelligence to understand the needs of stakeholders and to mitigate the challenges of large-scale data storage, data analytics, and sharing.

The best way to maximize the utility of RWE is to successfully integrate disparate data types. Life sciences companies must store, search, analyze, and normalize data of different types and sizes coming from different sources, including medical devices, wearables, genomics reports, claims data, clinical trials, and electronic medical records.

One common solution for these disparate data types is a data lake, which allows organizations to store data in a centralized repository — both in its original form and in a searchable format. One benefit of the AWS data lake is that data can be stored as-is; there is no need to transform it into a predefined schema. Unlike with a data warehouse, companies do not need to structure the data first. They can use the data for tagging, searching, and analysis without worrying about its original format.
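As a rough sketch of that “store as-is, tag for later search” pattern, the snippet below uses the AWS SDK for Python (boto3) to land a raw file in S3 unchanged and attach searchable tags. The bucket name, key layout, region, and tag values are placeholders, not a prescribed design.

```python
"""Minimal data-lake ingest sketch with boto3: land a raw file in S3 unchanged,
attach searchable tags, and defer any schema work to query time.
Bucket name, key layout, region, and tag values are placeholders."""
import boto3

s3 = boto3.client("s3", region_name="us-east-1")  # placeholder region

def ingest_raw(local_path: str, source: str, data_type: str) -> str:
    key = f"raw/{source}/{data_type}/{local_path.rsplit('/', 1)[-1]}"
    with open(local_path, "rb") as fh:
        s3.put_object(
            Bucket="example-rwe-data-lake",      # placeholder bucket
            Key=key,
            Body=fh,                              # stored exactly as received, no transformation
            Tagging=f"source={source}&data_type={data_type}",  # enables tag-based search later
        )
    return key

# Example: land a wearables export next to claims and EMR extracts.
# ingest_raw("exports/wearable_2024_03.json", source="wearables", data_type="json")
```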

When it comes to pharma companies, the artificial intelligence and machine learning capabilities of AWS help process data for real-world evidence, such as:

Quick access to all types of data like genomics, clinic trials, and claimsThe integration of new RWE data to existing data in the data lake

Advanced RWE analytics use predictive and probabilistic causal models, unsupervised algorithms, and machine learning to extract deeper insights from these data sets. These help in building large data sets with significant information on thousands of patient variables.
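To make that concrete, here is an illustrative scikit-learn sketch of the kind of unsupervised analysis described above, clustering synthetic patient records into cohorts. The feature names and data are made up; a real RWE pipeline would draw de-identified data from the data lake and apply far more rigorous methodology.

```python
"""Illustrative unsupervised-learning sketch for RWE-style data: cluster
synthetic patient records to surface cohorts. Feature names and data are
made up; a real pipeline would pull de-identified data from the data lake."""
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

# Synthetic stand-ins for patient variables (age, BMI, lab value, adherence score).
patients = rng.normal(loc=[55, 27, 6.5, 0.8], scale=[12, 4, 1.2, 0.15], size=(500, 4))

features = StandardScaler().fit_transform(patients)     # put variables on a common scale
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

for cluster in range(3):
    cohort = patients[labels == cluster]
    print(f"cluster {cluster}: n={len(cohort)}, mean age={cohort[:, 0].mean():.1f}")
```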

With on-premises data repositories being replaced by cloud-based data lakes, pharma companies now have access to a scalable platform that provides cutting-edge analytics. These companies will be at the forefront of technological innovation as RWE becomes central to the world of pharma in the years to come.

Author Bio

TCS

Ph: +91 9818804103
E-mail: a.goel1@tcs.com

Dr. Ashish Goel is a molecular oncologist from Johns Hopkins University, with more than 23 years of experience across different facets of the pharmaceutical industry, ranging from new drug discovery to decision support analytics. He has held leadership positions in pharma and pharma-centric organizations, assisting in key decisions ranging from designing NCEs such as Lipaglyn (Zydus) to revenue forecasting and, more recently, Real World Evidence and HEOR (IQVIA and TCS). He currently leads the Value Demonstration team at TCS, focusing on business transformation via evidence generation management and automation. He has published extensively in peer-reviewed international journals, holds patents, and has been an invited speaker at conferences and thought leadership forums and interviews.

To learn more, visit us here.

Cloud Computing