A common misconception is that high-performance computing (HPC) and exascale supercomputing are too powerful for traditional businesses, reserved for mammoth university and government programs that tackle humanity’s biggest questions, like how galaxies form or how to address global crises such as climate change and hunger.

But the reality is that HPC and exascale also have business-critical use cases across every industry, transforming operations and solving everyday business challenges.

And with the release of HPE Cray Frontier, the world’s first and most powerful public exascale supercomputer, the possibilities expand dramatically.

How HPC and exascale supercomputing accelerate business transformation

Innovations in blockchain, artificial intelligence, machine learning, augmented reality, and other technologies are opening doors to new opportunities and better, faster ways of doing business. Grabbing hold of these opportunities and remaining agile in a constantly evolving world means that every industry must embrace digital transformation.

But without centralized data and insights, it can be hard to be agile and anticipate what’s next — especially for large corporations. That’s where HPC and exascale supercomputing can help. For example, at HPE, we employed HPC to centralize and consolidate multiple ERPs as well as our analytics platforms — something we couldn’t do with traditional servers. This effort transformed our business and positioned us to offer more than 80 products as a service.

And there are plenty of other use cases. Carestream Health uses HPC with AI-as-a-Service and other offerings from HPE to improve medical imaging accuracy, automate processes, and reduce cycle times. NCAR uses HPC to forecast summer rainfall and reservoir inflows for smarter water management. Oil and gas companies use HPC to analyze seismic data and determine new well locations. Healthcare organizations use HPC to accelerate vaccine development. Retailers use HPC to reduce inventory shrinkage. The list goes on.

HPE Cray Frontier: The world’s fastest supercomputer

Developed for the US Department of Energy’s Oak Ridge National Laboratory (ORNL), HPE Cray Frontier is the world’s fastest supercomputer. Problems that once took months to solve can now be worked out in a matter of days, surfacing answers to questions humans never thought to ask.

Delivering 1.1 exaflops of performance, HPE Cray Frontier is faster than the next seven most powerful supercomputers combined. Powered by 3rd Gen AMD EPYC processors and AMD Instinct MI250X accelerators, Frontier calculates 10x faster than the most powerful supercomputers that preceded it, solving problems that are 8x as complex.

Frontier is also a major IT sustainability win for customers thanks to efficient AMD processing technology and sophisticated liquid-cooling capabilities. It is the most power-efficient supercomputing architecture in the world, topping the Green500 list.

“This technology is going to change the world,” remarks Dr. Thomas Zacharia, ORNL Director, “from medicine to biology to materials to deep space to climate change to energy transitions.” Frontier can predict mutations to combat future pandemics, develop new medical treatments faster, and ultimately save lives.

GE is also using Frontier in its drive to design more powerful, sustainable wind turbines. Frontier allows its researchers to model physics and flow across an entire farm of wind turbines, studying interdependencies and impacts at scale so the company can manufacture more efficient, reliable, and cost-effective turbines.

Discover if HPC supercomputing is right for your business

Are you ready to explore the superpowers of HPC and exascale supercomputing for your business? Together, GDT and HPE can lead you through a free, one-day transformation workshop where we examine your business challenges and priorities and build a digital roadmap. As a leader in this space, we apply our expertise and the lessons we’ve learned to customize a realistic roadmap that accelerates your business.

To get started with your roadmap, contact the experts at GDT.

Digital Transformation

Every company and government entity is tasked with striking a critical balance between data access and security. As Forrester Senior Analyst Richard Joyce stated, “For a typical Fortune 1000 company, just a 10 percent increase in data accessibility will result in more than $65 million additional net income.” As the pressure to become more data-driven accelerates, it’s imperative that enterprises balance that access with privacy and governance requirements.

To achieve this balance, we need to change how we perceive data security. There is growing friction between the teams that create and manage data access policies and the teams that need data to do their jobs. We must accept that security, IT, and privacy teams want to provision as much data as possible, but that they face real constraints and significant compliance complexity.

Traditionally, data security, privacy, and regulatory compliance have been treated as a cost center. Instead, we need to look at data security as a means for positive change: a driver of greater data accessibility, enhanced operational efficiency, and real business value.

Many remain far short of the goal

Enterprises of all sizes struggle with this shift. NewVantage Partners’ Data and AI Leadership Executive Survey 2023 found that less than a quarter of firms report having established a data-driven organization or a data culture. And in its State of Data and Analytics Governance research, Gartner predicts that by 2025, 80 percent of organizations seeking to scale digital business, including analytics initiatives, will fail because they do not modernize their data and analytics governance.

Data access drives growth. So why is data-culture adoption so low? Becoming truly data-driven requires tight collaboration across many different functions, and responsibilities for individual roles are often unclear. These strategic gaps must be addressed.

The reality of data security and access

When it comes to data security and access, companies are typically either:

Overly restrictive on data access, where data security is seen as an impediment to overall company growth. This is typically due to data, organizational, and technological complexity.

Or overly focused on perimeter and application defenses, leveraging cyberdefenses and coarse-grained identity and access management (IAM). Here, data systems are open to exploitation in the event of a breach.

Most companies experience the worst of both scenarios, where data security and access are simply broken: inconsistent and fragmented.

A primary challenge of solving the data democratization balancing act lies in the complex web of internal and external privacy, security, and governance policies. They change over time and need to be applied and maintained consistently and efficiently across business teams.

In the middle are the technical teams managing complex data and analytics systems. Given these constraints, security, privacy, and data management teams default to a tight lockdown of data to ensure compliance and security. It’s not any one team’s fault, but it is a major blocker to becoming data-driven.

Unified data security platform

Siloed and decentralized systems, inefficient processes, unclear roles and responsibilities, and the absence of a holistic strategy: these are the obstacles. So what’s the solution as more companies face costly data breaches and low data usability rates? An enterprise-wide, scalable strategy built on a unified data security platform, one with integrated capabilities that simplify and automate data security processes across the entire data and analytics ecosystem. With the ability to discover and classify sensitive data, data attributes can be used to automatically grant instantaneous access to authorized users. Proper data security governance helps teams get access to more data, faster.

Additional data masking and encryption layers can be added to make sensitive data available for analytics without compromising security. Even if a breach occurs, fine-grained access limits exposure, and audit capabilities quickly identify compromised data.
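To make the idea concrete, here is a minimal sketch of attribute-based access control with masking. The column tags, roles, and masking rule are illustrative assumptions, not any vendor’s actual API or policy model.

```python
# Minimal sketch of attribute-based access control with masking.
# Column tags, roles, and the masking rule are illustrative assumptions,
# not any vendor's actual API or policy model.

import hashlib

# Classification tags discovered for each column (e.g., by an automated scanner).
COLUMN_TAGS = {
    "email": "PII",
    "ssn": "PII",
    "purchase_total": "PUBLIC",
    "region": "PUBLIC",
}

# Policy: which tags each role may see in the clear; everything else is masked.
ROLE_CLEAR_TAGS = {
    "analyst": {"PUBLIC"},
    "privacy_officer": {"PUBLIC", "PII"},
}

def mask(value: str) -> str:
    """Deterministic mask so masked columns can still be joined and counted."""
    return hashlib.sha256(value.encode()).hexdigest()[:12]

def apply_policy(row: dict, role: str) -> dict:
    """Return the row with every column the role is not cleared for masked."""
    clear_tags = ROLE_CLEAR_TAGS.get(role, set())
    return {
        col: (val if COLUMN_TAGS.get(col) in clear_tags else mask(str(val)))
        for col, val in row.items()
    }

record = {"email": "jane@example.com", "ssn": "123-45-6789",
          "purchase_total": 42.50, "region": "EMEA"}

print(apply_policy(record, "analyst"))          # PII columns are masked
print(apply_policy(record, "privacy_officer"))  # PII visible for the authorized role
```

Because access decisions hang off classification tags rather than hard-coded column lists, the same policy keeps working as new datasets are discovered and tagged.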

Executing a proper data security strategy provides the last mile of the data governance and cataloging journey. All of this is key to the balancing act of data democratization, with comprehensive data governance enabling faster insights while maintaining compliance.

Enterprise-wide governed data sharing

Privacera helps Fortune companies modernize their data architecture with a holistic, enterprise-wide data security platform and automated data governance. A data security platform empowers the data democratization you need to increase data usability and nurture a data-driven culture. Analysts and business units get more data, faster. IT frees up time and resources. Security and privacy teams can easily monitor and implement data security policies.

Learn more about achieving modern data security governance and democratized analytics for faster insights here.

Data and Information Security

Since the premiere of the wildly popular 1993 dinosaur cloning film Jurassic Park, the sciences featured in the film, genetic engineering and genomics, have advanced at breathtaking rates. When the film was released, the Human Genome Project was already working on sequencing the entire human genome for the first time. The project was completed in 2003, after 13 years and at a cost of roughly $1 billion. Today, a human genome can be sequenced in less than a day and at a cost of less than $1,000.

One leading genomics research organization, the Wellcome Sanger Institute in England, is on a mission to improve the health of all humans by developing a comprehensive understanding of the genome carried on the 23 pairs of human chromosomes. The Institute relies on cutting-edge technology to operate at incredible speed and scale, reading and analyzing an average of 40 trillion DNA base pairs a day.

Alongside advances in DNA sequencing techniques and computational biology, high-performance computing (HPC) is at the heart of the advances in genomic research. Powerful HPC helps researchers process large-scale sequencing data to solve complex computing problems and perform intensive computing operations across massive resources.

Genomics at Scale

Genomics is the study of an organism’s genes, or genome. From curing cancer and combatting COVID-19 to better understanding human, parasite, and microbe evolution and cellular growth, the science of genomics is booming. The global genomics market is projected to grow from $27.81 billion in 2021 to $94.65 billion by 2028, according to Fortune Business Insights. Enabling this growth is an HPC environment that contributes daily to a greater understanding of our biology, helping to accelerate the production of vaccines and other approaches to health around the world.

Using HPC resources and the computational and statistical techniques known as bioinformatics, genomics researchers analyze enormous amounts of DNA sequence data to find variations and mutations that affect health, disease, and drug response. Searching through the approximately 3 billion base pairs of DNA and roughly 23,000 genes in a human genome, for example, requires massive amounts of compute, storage, and networking resources.

After sequencing, billions of data points must be analyzed to look for things like mutations and variations in viruses. Computational biologists use pattern-matching algorithms, mathematical models, image processing, and other techniques to obtain meaning from this genomic data.
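As a toy illustration of the kind of pattern matching involved, far simpler than production bioinformatics pipelines, the sketch below finds a short motif in a DNA sequence and reports positions where a sample differs from a reference. The sequences and positions are invented for demonstration.

```python
# Toy illustration of sequence pattern matching and simple variant detection.
# Production bioinformatics pipelines (aligners, variant callers) are far more
# sophisticated; the sequences below are made up for demonstration.

reference = "ATGCGTACGTTAGCATGCGTACGTTAGC"
sample    = "ATGCGTACGTTAGCATGCGAACGTTAGC"

def find_motif(sequence: str, motif: str) -> list:
    """Return every position at which a motif occurs in a DNA sequence."""
    return [i for i in range(len(sequence) - len(motif) + 1)
            if sequence[i:i + len(motif)] == motif]

def find_substitutions(ref: str, alt: str) -> list:
    """Report positions where the sample differs from the reference (substitutions only)."""
    return [(i, r, a) for i, (r, a) in enumerate(zip(ref, alt)) if r != a]

print(find_motif(reference, "ACGTT"))          # [6, 20]
print(find_substitutions(reference, sample))   # [(19, 'T', 'A')]
```

Scaled up to billions of base pairs across millions of samples, operations like these are what make HPC resources indispensable to genomics.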

A Genomic Powerhouse

At the Sanger Institute, scientific research is happening at the intersection of genomics and HPC informatics. Scientists at the Institute tackle some of the most difficult challenges in genomic research to fuel scientific discoveries and push the boundaries of our understanding of human biology and pathogens. Among many other projects, the Institute’s Tree of Life program explores the diversity of complex organisms found in the UK through sequencing and cellular technologies. Scientists are also creating a reference map of the different types of human cells.

Science on the scale of that conducted at the Sanger Institute requires access to massive amounts of data processing power. The Institute’s Informatics Support Group (ISG) helps meet this need by providing high performance computing environments for Sanger’s scientific research teams. The ISG team provides support, architecture design and development services for the Sanger Institute’s traditional HPC environment and an expansive OpenStack private cloud compute infrastructure, among other HPC resources.

Responding to a Global Health Crisis

During the COVID-19 pandemic, the Institute began working closely with UK public health agencies and academic partners to sequence and analyze the SARS-CoV-2 virus as it evolved and spread. The work has been used to inform public health measures and help save lives.

As of September 2022, more than 2.2 million coronavirus genomes had been sequenced at Wellcome Sanger, and they are immediately made available to researchers around the world for analysis. Mutations that affect the virus’s spike protein, which the virus uses to bind to and enter human cells and which is the target of current vaccines, are of particular interest. Scientists combine genomic data with other information to ascertain which mutations may affect the virus’s ability to transmit, cause disease, or evade the immune response.

Society’s greater understanding of genomics, and the informatics that goes with it, has accelerated the development of vaccines and our ability to respond to disease in a way that’s never been possible before. Along the way, the world is witnessing firsthand the amazing power of genomic science.

Read more about genomics, informatics, and HPC in this white paper and case study of the Wellcome Sanger Institute.

***

Intel® Technologies Move Analytics Forward

Data analytics is the key to unlocking the most value you can extract from data across your organization. To create a productive, cost-effective analytics strategy that gets results, you need high-performance hardware that’s optimized to work with the software you use.

Modern data analytics spans a range of technologies, from dedicated analytics platforms and databases to deep learning and artificial intelligence (AI). Just starting out with analytics? Ready to evolve your analytics strategy or improve your data quality? There’s always room to grow, and Intel is ready to help. With a deep ecosystem of analytics technologies and partners, Intel accelerates the efforts of data scientists, analysts, and developers in every industry. Find out more about Intel advanced analytics.

High-Performance Computing

Most organizations understand the profound impact that data is having on modern business. In Foundry’s 2022 Data & Analytics Study, 88% of IT decision-makers agree that data collection and analysis have the potential to fundamentally change their business models over the next three years.

The ability to pivot quickly to address rapidly changing customer or market demands is driving the need for real-time data. But poor data quality, siloed data, entrenched processes, and cultural resistance often present roadblocks to using data to speed up decision making and innovation.

We asked the CIO Experts Network, a community of IT professionals, industry analysts, and other influencers, why real-time data is so important for today’s business and how data helps organizations make better, faster decisions. Based on their responses, here are four recommendations for improving your ability to make data-driven decisions. 

Use real-time data for business agility, efficient operations, and more

Business and IT leaders must keep pace with customer demands while dealing with ever-shifting market forces. Gathering and processing data quickly enables organizations to assess options and take action faster, leading to a variety of benefits, said Elitsa Krumova (@Eli_Krumova), a digital consultant, thought leader and technology influencer.

“The enormous potential of real-time data not only gives businesses agility, increased productivity, optimized decision-making, and valuable insights, but also provides beneficial forecasts, customer insights, potential risks, and opportunities,” said Krumova.

Other experts agree that access to real-time data provides a variety of benefits, including competitive advantage, improved customer experiences, more efficient operations, and confidence amid uncertain market forces:

“Business operations must be able to make adjustments and corrections in near real time to stay ahead of the competition. Few companies have the luxury of waiting days or weeks to analyze data before reacting. Customers have too many options. And in some industries — like healthcare, financial services, manufacturing, etc., — not having real-time data to make rapid critical adjustments can lead to catastrophic outcomes.” — Jack Gold (@jckgld), President and Principal Analyst at J. Gold Associates LLC.

“When insights from the marketplace are not transmitted in real time, the ability to make critical business decisions disappears. We’ve all experienced the pain of what continues to happen with the disconnect between customer usage metrics and gaps in supply chain data.” — Frank Cutitta (@fcutitta), CEO and Founder, HealthTech Decisions Lab

“Operationally, think of logistics. Real-time data provides the most current intelligence to manage the fleet and delivery, for example. Strategically, with meaningful real-time data, systemic issues are easier to identify, portfolio decisions faster to make, and performance easier to evaluate. At the end of the day, it drives better results in safety, customer satisfaction, the bottom line, and ESG [environmental, social, and governance].” — Helen Yu (@YuHelenYu), Founder and CEO, Tigon Advisory Corp.

“Businesses are facing a rapidly evolving set of threats from supply chain constraints, rising fuel costs, and shipping delays. Taking too much time to make a decision based on stale data can increase overall costs due to changes in fuel prices, availability of inventory, and logistics impacting the shipping and delivery of products. Organizations utilizing real-time data are the best positioned to deal with volatile markets.” — Jason James (@itlinchpin), CIO at Net Health

Build a foundation for continuous improvement

The experts offered several practical examples of how real-time data, combined with automation, can help deliver continuous improvement across the business. Automation is a key capability for making data actionable.

“In the process of digital transformation, businesses are moving from human-dependent to digital business processes,” said Nikolay Ganyushkin (@nikolaygan), CEO and Co-founder of Acure. “This means that all changes, all transitions, are instantaneous. The control of key parameters and business indicators should also be based on real-time data, otherwise such control will not keep up with the processes.”

Real-time data and automated processes present a powerful combination for improving cybersecurity and resiliency.

“When I was coming up in InfoSec, we could only do vulnerability scanning between midnight and 6 am. We never got good results because systems were either off, or there was just nothing going on at those hours,” said George Gerchow (@georgegerchow), CSO and SVP of IT, Sumo Logic. “Today, we do them at the height of business traffic and can clearly see trends of potential service outages or security incidents.”

Will Kelly (@willkelly), an analyst and writer focused on the cloud and DevOps, said that harnessing real-time data is critical “in a world where delaying business and security decisions can prove even more costly than just a couple of years ago. Tapping into real-time data provides decision-makers with immediate access to actionable intelligence, whether a security alert on an attack in-progress or data on a supply chain issue as it happens.”

Real-time data facilitates timely, relevant, and insightful decisions down to the business unit level, said Gene De Libero (@GeneDeLibero), Chief Strategy Officer at GeekHive.com. Those decisions can have a direct impact on customers. “Companies can uncover and respond to changes in consumer behavior to promote faster and more efficient personalization and customization of customer experiences,” he said.

Deploy an end-to-end approach to storing, accessing, and analyzing data

To access data in real time, and to ensure that it provides actionable insights for all stakeholders, organizations should invest in the foundational components that enable more efficient, scalable, and secure data collection, processing, and analysis. These components, including cloud-based databases, data lakes, and data warehouses; artificial intelligence and machine learning (AI/ML) tools; analytics; and internet of things capabilities, must be part of a holistic, end-to-end strategy across the enterprise:

“Real-time data means removing the friction and latency from sourcing data, processing it, and enabling more people to develop smarter insights. Better decisions come from people trusting that the data reflects evolving customer needs and captures an accurate state of operations.” — Isaac Sacolick (@nyike), StarCIO Leader and Author of Digital Trailblazer

“Organizations must use a system that draws information across integrated applications. This is often made simpler if the number of platforms is kept to a minimum. This is the only way to enable a real-time, 360-degree view of everything that is happening across an organization — from customer journeys to the state of finances.” — Sridhar Iyengar (@iSridhar), Managing Director, Zoho Europe

“Streaming processing platforms allow applications to respond to new data events instantaneously. Whether you’re distributing news events, moving just-in-time inventory, or processing clinical test results, the ability to process that data instantly is the power of real-time data.” — Peter B. Nichol (@PeterBNichol), Chief Technology Officer at OROCA Innovations
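Peter Nichol’s point about stream processing can be illustrated with a minimal sketch. The in-memory queue below is only a stand-in for a real streaming platform (such as Kafka or a managed cloud equivalent), and the event fields and reorder threshold are invented for the example.

```python
# Minimal sketch of reacting to events the moment they arrive. An in-memory
# queue stands in for a streaming platform; the event fields and the reorder
# threshold are invented for the example.

import queue
import threading
import time

events = queue.Queue()

def handle(event: dict) -> None:
    # React immediately, e.g., flag low inventory as soon as the event lands.
    if event["type"] == "inventory" and event["units"] < 10:
        print(f"Reorder triggered for {event['sku']} ({event['units']} units left)")

def consumer() -> None:
    while True:
        event = events.get()   # blocks until a new event arrives
        if event is None:      # sentinel to stop the consumer
            break
        handle(event)

worker = threading.Thread(target=consumer)
worker.start()

# Producer side: each event is processed the instant it is published.
events.put({"type": "inventory", "sku": "SKU-42", "units": 7})
events.put({"type": "inventory", "sku": "SKU-77", "units": 120})
time.sleep(0.1)
events.put(None)
worker.join()
```

The same consume-and-react pattern applies whether the events are news items, inventory movements, or clinical test results; only the handler logic changes.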

As your data increases, expand your data-driven capabilities

The volume and types of data organizations collect will continue to increase. Forward-thinking leadership teams will continue to expand their ability to leverage that data in new and different ways to improve business outcomes.

“The power of real-time data is amplified when your organization can enrich data with additional intelligence gathered from the organization,” said Nichol. “Advanced analytics can enhance events with scoring models, expanded business rules, or even new data.”

Nichol offered the example of combining a customer’s call — using an interactive voice response system — with their prior account history to enrich the interaction. “By joining events, we can build intelligent experiences for our customers, all in real time,” he said.
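A hedged sketch of that kind of enrichment follows, with made-up customer fields and a simple priority rule standing in for a real scoring model.

```python
# Hypothetical sketch of enriching a real-time event (an inbound IVR call)
# with prior account history. The customer fields and the priority rule are
# illustrative stand-ins for a real scoring model.

account_history = {
    "cust-1001": {"open_tickets": 2, "last_purchase": "2022-11-03", "tier": "gold"},
}

def enrich_call_event(event: dict) -> dict:
    history = account_history.get(event["customer_id"], {})
    enriched = {**event, **history}
    # Simple rule: prioritize callers with open tickets or a premium tier.
    high = history.get("open_tickets", 0) > 0 or history.get("tier") == "gold"
    enriched["priority"] = "high" if high else "normal"
    return enriched

ivr_event = {"customer_id": "cust-1001", "intent": "billing_question"}
print(enrich_call_event(ivr_event))
# {'customer_id': 'cust-1001', 'intent': 'billing_question', 'open_tickets': 2, ...}
```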

It’s one of the many ways that new technologies are increasing the opportunities to use real-time data to fundamentally change how businesses operate, now and in the future.

“As businesses become increasingly digitalized, the amount of data they have available is only going to increase,” said Iyengar. “We can expect real-time data to have a more significant impact on decision-making processes within leading, forward-thinking organizations as we head deeper into our data-centric future.”

Learn more about ways to put your data to work on the most scalable, trusted, and secure cloud.

Business Intelligence

With so much diverse data available, why do so many companies still struggle to embrace the real-time, data-driven decision-making they need to be more agile? One thing is clear: The challenge isn’t solved by technology alone.

“You can’t buy transformation,” says Tom Godden, Principal Technical Evangelist with the Enterprise Strategy team at AWS. “Real change doesn’t come just from new technology—it comes from rethinking your processes, which are enabled by the technology.”

Godden offers three tips to help organizations break down data silos, improve data quality, and overcome other longstanding data challenges to foster a culture of decision-making that drives business agility.

1. Build the foundation for managing data in real time.

A modern data strategy must emphasize data quality at the source of origin, rather than traditional methods of cleansing and normalizing at the point of consumption. Make sure you have the proper infrastructure, tools, and services in place to capture data from a variety of sources, ensure the quality of the data you’re collecting, and manage it securely, end to end.

The technical underpinnings of a modern data strategy include cloud-based databases, data lakes, and data warehouses; artificial intelligence and machine learning (AI/ML) tools; and analytics. The infrastructure must be supported by a comprehensive plan to manage, access, analyze, and protect data across its entire lifecycle, with fully automated processes and robust integration to make data actionable across the organization.

“It may sound obvious, but if you do not build the right processes to capture all the data, you can’t act on the data,” says Godden.
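As a simplified illustration of building quality checks into capture rather than cleansing downstream, the following sketch routes records that fail basic checks to a quarantine list. The field names and rules are assumptions for the example, not any AWS service’s API.

```python
# Simplified sketch of enforcing data quality at the point of capture rather
# than cleansing downstream. Field names and rules are assumptions.

from datetime import datetime

REQUIRED_FIELDS = {"order_id", "customer_id", "amount", "timestamp"}

def validate(record: dict) -> list:
    """Return a list of quality problems; an empty list means the record is clean."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys()]
    if "amount" in record and not isinstance(record["amount"], (int, float)):
        problems.append("amount is not numeric")
    if "timestamp" in record:
        try:
            datetime.fromisoformat(record["timestamp"])
        except (TypeError, ValueError):
            problems.append("timestamp is not ISO 8601")
    return problems

def ingest(record: dict, clean: list, quarantine: list) -> None:
    """Route each record at capture time instead of fixing it later."""
    (quarantine if validate(record) else clean).append(record)

clean, quarantine = [], []
ingest({"order_id": "A1", "customer_id": "C9", "amount": 19.99,
        "timestamp": "2022-10-01T12:30:00"}, clean, quarantine)
ingest({"order_id": "A2", "amount": "oops"}, clean, quarantine)
print(len(clean), len(quarantine))  # 1 1
```

Pushing validation to the source keeps bad records from silently propagating into downstream analytics.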

2. Don’t just democratize data – democratize the decisions based on that data.

Investing in the data management infrastructure, tools, and processes necessary to capture data in real time through a variety of data feeds and devices is just the first step. If you aren’t simultaneously creating a culture that allows people to act on data, you’re just creating frustration.

To that end, avoid “reporting ghost towns” that require people to stop what they’re doing and access a different tool for insights. Instead, build analytics capabilities directly into their workflows, with context, so they can easily apply the insights to their daily activities.

3. Provide the types of guardrails that spur innovation instead of inhibiting it.

Automating processes for metadata, including information on data lineage and shelf life, builds confidence in the data. And by storing data in its raw or native format, you can apply access policies to individuals without having to modify the data itself.

This approach ensures more flexibility for how people can use the data they need without compromising the fidelity of the data itself. A data lake can serve as a foundational element of a data unification strategy, providing a single source of truth with supporting policies for real-time provisioning based on permissions.
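To make the metadata idea concrete, here is a minimal, hypothetical sketch of recording lineage and shelf-life information alongside datasets in a lake, so consumers can check provenance and freshness before acting on the data. The paths and fields are illustrative only.

```python
# Illustrative sketch of recording lineage and shelf-life metadata alongside
# raw datasets so consumers can check provenance and freshness before acting
# on the data. The paths and fields are hypothetical.

from datetime import date, timedelta

catalog = {
    "s3://datalake/raw/orders/2022-10-01/": {
        "source": "pos_system_export",
        "ingested_on": date(2022, 10, 1),
        "shelf_life_days": 30,
        "derived_from": [],   # raw data has no upstream datasets
    },
    "s3://datalake/curated/daily_sales/": {
        "source": "spark_job_daily_sales",
        "ingested_on": date(2022, 10, 2),
        "shelf_life_days": 7,
        "derived_from": ["s3://datalake/raw/orders/2022-10-01/"],
    },
}

def is_fresh(path: str, today: date) -> bool:
    """True while the dataset is within its declared shelf life."""
    meta = catalog[path]
    return today <= meta["ingested_on"] + timedelta(days=meta["shelf_life_days"])

def lineage(path: str) -> list:
    """Walk upstream so consumers can see where a dataset came from."""
    upstream = catalog[path]["derived_from"]
    return upstream + [p for u in upstream for p in lineage(u)]

print(is_fresh("s3://datalake/curated/daily_sales/", date(2022, 10, 6)))  # True
print(lineage("s3://datalake/curated/daily_sales/"))
```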

Agile decision making: How three companies are benefiting from a modern data strategy

Organizations are already capturing the benefits of real-time access to data based on roles and permissions. Here are three examples:

Swimming Australia, the nation’s top governing body for swimming, has long been at the forefront of science. Now, it’s using data to analyze race performance and create bespoke training programs for individual athletes. A data lake unified athlete statistics and metrics in a single location, and AI/ML tools are helping the team tailor training programs and track competitors. Analysts and coaches capture real-time physiological data during training sessions and combine that information with race analysis to determine how to evolve training efforts for individual swimmers. Coaches and athletes can easily track progress in real time from their phones via cloud-based dashboards. Today, with its modern data architecture, the national team can create benchmarking reports in minutes, an innovation that helped make the Australians the most successful relay team at the 2020 Tokyo Olympic Games.

Coca-Cola Andina, which produces and distributes products licensed by The Coca-Cola Company within South America, needed a solution to collect all relevant information on the company, its customers, logistics, coverage, and assets within a single accurate source. The answer was a cloud-based data lake, which allowed the company to implement new products and services to customize the different value propositions for its more than 260,000 customers. With all the resources and functionality that the data lake enables, Coca-Cola Andina ensures its partners and customers have access to reliable information for making strategic decisions for the business. Coca-Cola Andina ingested more than 95% of the data from its different areas of interest, which allows it to build excellence reports in just a few minutes and implement advanced analytics. The cloud infrastructure increased productivity of the analysis team by 80%.

Vyaire, a global medical company, needed a way to help its 4,000 employees make better, data-based decisions utilizing both first- and third-party data. Adopting AWS Data Exchange to find, subscribe to, and use third-party data has made it easier to incorporate data sources into the company’s own data ecosystem, resulting in quicker insights that help teams focus on getting results, not administration. Easy access to third-party data via the AWS Data Exchange catalog has encouraged more experimentation and innovation, giving Vyaire’s leadership confidence that it can meet the changing market for respiratory care products and direct investment in the right areas to improve its product portfolio.

Too many organizations continue to be held back from using data effectively to drive all aspects of their business. A modern data strategy will empower teams and individuals, regardless of role or organizational unit, to analyze and use data to make better, faster decisions – enabling the sustainable advantage that comes from business agility.

Learn more about ways to put your data to work on the most scalable, trusted, and secure cloud.

Digital Transformation