Fujitsu is the tenth largest IT service provider in the world, and its more than 124,000 employees can be found on the leading edge of digital transformation in virtually every industry, including the automotive, financial services, health care, law enforcement, manufacturing, and retail sectors. The company is also active in efforts to accelerate environmental, social, and governance (ESG) initiatives.

We recently caught up with Antonio Medianero, director of cloud application at Fujitsu Services España S.A., to learn more about the company’s cloud services and solutions, what it means to be VMware Cloud Verified, and what he sees as the next big thing in cloud. We also took the opportunity to explore Fujitsu’s vision for hybrid IT.

“On the most basic level, as a company we have more than 80 years of experience working in various industries around the world,” says Medianero. “We design, develop, implement, manage, and optimize the systems businesses of all kinds need to address their operational, application, and infrastructure needs.”

Fujitsu offers a wide range of consulting services and infrastructure and business continuity solutions. Its team of cloud specialists helps clients deliver services to their customers and achieve strong, seamless operations that reflect the power of the cloud and the innumerable technological gains it makes possible.

“Everything we provide, from rapid advice and guidance to extensive application services, robust platforms, and multi-cloud functionality, reflects the breadth and depth of our teams’ technical and business expertise,” he says. “We scale to meet demand, offer new ways of working, create the foundations for future growth, and design clouds that meet the cost, data protection, compliance, and security requirements of our customers. We believe it’s imperative for enterprises to have the freedom and flexibility to use the right cloud for every task, application, and business need.”

Fujitsu supports all major public clouds. The company’s Fujitsu Cloud is a private cloud based on VMware technology that delivers high levels of security and privacy to ensure that sensitive operations and data are protected. Fujitsu also offers Infrastructure-as-a-Service, Containers-as-a-Service, and Platform-as-a-Service in private cloud environments.

“For hybrid cloud we seamlessly combine public, private, and managed cloud infrastructures with traditional on-premises IT,” he adds. “We can do this on a modest budget to ensure that every customer’s hybrid IT integration is as cost-effective as possible. Our approach provides an intelligent architecture that seamlessly integrates into the existing environment through a management framework that can adapt and evolve as business needs change. It also offers the right mix of services for the rapid deployment of new cloud-based solutions while ensuring strict alignment with governance, compliance, privacy, and security needs – all while enabling data availability across any geography and device.”

Notably, Fujitsu’s PRIMEFLEX is an integrated system that includes pre-configured, pre-tested hybrid cloud-enabled systems – including those for VMware technologies – and all of the services needed to offer a fast path for the development of hybrid data architectures. PRIMEFLEX accelerates the entire process, including design, deployment, and maintenance. It also lets customers easily turn their on-premises infrastructure into a private cloud, introduce cloud pricing models, and connect on-premises systems to the cloud of their choice.

“Our hybrid cloud solutions and services enable growth, reduce risk, lower costs, and increase productivity,” says Medianero. “They also deliver the centralized control that is so important in a world where business units are using the cloud directly.”

Medianero stresses that being VMware Cloud Verified is an important distinction for the Fujitsu team in Spain. It also reflects a longstanding and productive partnership between both companies.

“The VMware Cloud Verified distinction is an important milestone for us because it demonstrates to those in Spain that we comply with, and achieve, the highest standards,” he says. “Fujitsu and VMware’s longstanding partnership also allows us to leverage world-class public and private cloud-based technologies to drive fast-paced continuity strategies. Whether an organization is looking for the performance, resilience, and control of the private cloud or the scale, innovation, and changeability of the public cloud, together we can offer the ideal solutions, services, and seamless integration needed to fully support those efforts and the work of employees today and tomorrow.”

Learn more about Fujitsu Spain and its partnership with VMware here or review success stories from across the globe.

Cloud Management, IT Leadership

For enterprises looking to wrest the most value from their data, especially in real-time, the “data lakehouse” concept is starting to catch on.

The idea behind the data lakehouse is to combine the best of what data lakes and data warehouses have to offer, says Gartner analyst Adam Ronthal.

Data warehouses, for their part, enable companies to store large amounts of structured data with well-defined schemas. They are designed to support a large number of simultaneous queries and to deliver the results quickly to many simultaneous users.

Data lakes, on the other hand, enable companies to collect raw, unstructured data in many formats for data analysts to hunt through. These vast pools of data have grown in prominence of late thanks to the flexibility they provide enterprises to store vast streams of data without first having to define the purpose of doing so.  

The market for these two types of big data repositories is “converging in the middle, at the lakehouse concept,” Ronthal says, with established data warehouse vendors adding the ability to manage unstructured data, and data lake vendors adding structure to their offerings.

For example, on AWS, enterprises can now pair Amazon Redshift, a data warehouse, with Amazon Redshift Spectrum, which enables Redshift to reach into Amazon’s unstructured S3 data lakes. Meanwhile, data warehouse Snowflake can now support unstructured data with external tables, Ronthal says.
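
To make the pattern concrete, here is a minimal sketch of the warehouse-reaches-into-lake approach: registering an S3-backed external schema and table with Redshift Spectrum, then querying lake data in place. The cluster endpoint, IAM role, bucket, and column layout are hypothetical placeholders, though the DDL follows Redshift’s documented external-schema syntax.

```python
# Minimal sketch: exposing an S3 data lake to a Redshift warehouse via
# Redshift Spectrum. Endpoint, credentials, role ARN, bucket, and columns
# are hypothetical placeholders.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="dev", user="analyst", password="...",
)
conn.autocommit = True  # external-table DDL cannot run inside a transaction
cur = conn.cursor()

# Register a Glue Data Catalog database that describes files in S3.
cur.execute("""
    CREATE EXTERNAL SCHEMA IF NOT EXISTS lake
    FROM DATA CATALOG DATABASE 'game_events'
    IAM_ROLE 'arn:aws:iam::123456789012:role/ExampleSpectrumRole'
""")

# Define an external table over Parquet files in the lake; Redshift
# queries the files in place -- no copy into the warehouse is needed.
cur.execute("""
    CREATE EXTERNAL TABLE lake.events (
        event_id    VARCHAR,
        event_type  VARCHAR,
        occurred_at TIMESTAMP
    )
    STORED AS PARQUET
    LOCATION 's3://example-bucket/events/'
""")

# A single query can now combine warehouse tables with lake data.
cur.execute("SELECT event_type, COUNT(*) FROM lake.events GROUP BY 1")
print(cur.fetchall())
```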

When companies have separate lakes and warehouses, and data needs to move from one to the other, it introduces latency and costs time and money, Ronthal adds. Combining the two in one platform reduces effort and data movement, thereby accelerating the pace of uncovering data insights.

And, depending on the platform, a data lakehouse can also offer other features, such as support for data streaming, machine learning, and collaboration, giving enterprises additional tools for making the most of their data.

Here is a look at the benefits of data lakehouses and how several leading organizations are making good on their promise as part of their analytics strategies.

Enhancing the video game experience

Sega Europe’s use of data repositories in support of its video games has evolved considerably in the past several years.

In 2016, the company began using the Amazon Redshift data warehouse to collect event data from its Football Manager video game. At first this event data consisted simply of players opening and closing games. The company had two staff members looking into this data, which streamed into Redshift at a rate of ten events per second.

“But there was so much more data we could be collecting,” says Felix Baker, the company’s head of data services. “Like what teams people were managing, or how much money they were spending.”

By 2017, Sega Europe was collecting 800 events a second, with five staff working on the platform. By 2020, the company’s system was capturing 7,000 events per second from a portfolio of 30 Sega games, with 25 staff involved.

At that point, the system was starting to hit its limits, Baker says. Because of the data structures needed for inclusion in the data warehouse, data was coming in batches and it took half an hour to an hour to analyze it, he says.

“We wanted to analyze the data in real-time,” he adds, but this functionality wasn’t available in Redshift at the time.

After performing proofs of concept with three platforms — Redshift, Snowflake, and Databricks — Sega Europe settled on using Databricks, one of the pioneers of the data lakehouse industry.

“Databricks offered an out-of-the-box managed services solution that did what we needed without us having to develop anything,” he says. That included not just real-time streaming but machine learning and collaborative workspaces.

In addition, the data lakehouse architecture enabled Sega Europe to ingest unstructured data, such as social media feeds.

“With Redshift, we had to concentrate on schema design,” Baker says. “Every table had to have a set structure before we could start ingesting data. That made it clunky in many ways. With the data lakehouse, it’s been easier.”

Sega Europe’s Databricks platform went into production in the summer of 2020. Two or three consultants from Databricks worked alongside six or seven people from Sega Europe to get the streaming solution up and running, matching what the company had in place previously with Redshift. The new lakehouse is built in three layers, the base layer of which is just one large table that everything gets dumped into.

“If developers create new events, they don’t have to tell us to expect new fields — they can literally send us everything,” Baker says. “And we can then build jobs on top of that layer and stream out the data we acquired.”
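
That “one big table” base layer maps naturally onto Spark Structured Streaming writing to a Delta table. The sketch below assumes a Kafka event source with hypothetical broker, topic, and table names (the article doesn’t specify Sega Europe’s actual ingestion pipeline); it keeps each event’s payload as an opaque JSON string so new fields flow through without schema changes.

```python
# Sketch of a "dump everything" base layer: raw events land in one wide
# Delta table, and downstream jobs refine them later. The Kafka source,
# broker, topic, and table names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("base-layer-ingest").getOrCreate()

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "game-events")
    .load()
)

# Store the payload as-is plus minimal metadata: no per-event schema is
# required up front, so developers can send new fields at any time.
base = raw.select(
    F.col("key").cast("string").alias("event_key"),
    F.col("value").cast("string").alias("payload_json"),
    F.col("timestamp").alias("ingested_at"),
)

(
    base.writeStream
    .format("delta")
    .option("checkpointLocation", "/chk/base_events")
    .outputMode("append")
    .toTable("base_events")  # the base-layer table everything gets dumped into
)
```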

The transition to Databricks, which is built on top of Apache Spark, was smooth for Sega Europe, thanks to prior experience with the open-source engine for large-scale data processing.

“Within our team, we had quite a bit of expertise already with Apache Spark,” Baker says. “That meant that we could set up streams very quickly based on the skills we already had.”

Today, the company processes 25,000 events per second, with more than 30 data staffers and 100 game titles in the system. Instead of taking 30 minutes to an hour to process, the data is ready within a minute.

“The volume of data collected has grown exponentially,” Baker says. In fact, after the pandemic hit, usage of some games doubled.

The new platform has also opened up new possibilities. For example, Sega Europe’s partnership with Twitch, a streaming platform where people watch other people play video games, has been enhanced to include a data stream for its Humankind game, so that viewers can get a player’s history, including the levels they completed, the battles they won, and the civilizations they conquered.

“The overlay on Twitch is updating as they play the game,” Baker says. “That is a use case that we wouldn’t have been able to achieve before Databricks.”

The company has also begun leveraging the lakehouse’s machine learning capabilities. For example, Sega Europe data scientists have designed models to figure out why players stop playing games and to make suggestions for how to increase retention.

“The speed at which these models can be built has been amazing, really,” Baker says. “They’re just cranking out these models, it seems, every couple of weeks.”
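
Baker doesn’t describe the models themselves, but as an illustration, a player-retention classifier over aggregated play features might look like the following in Spark MLlib. The table name and feature columns here are hypothetical.

```python
# Illustrative churn/retention model sketch in Spark MLlib; the source
# table and feature columns are hypothetical, not Sega Europe's actual models.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import GBTClassifier
from pyspark.ml.evaluation import BinaryClassificationEvaluator

spark = SparkSession.builder.appName("churn-sketch").getOrCreate()

# One row per player: aggregated play features plus a binary churn label.
players = spark.table("player_features")
train, test = players.randomSplit([0.8, 0.2], seed=42)

assembler = VectorAssembler(
    inputCols=["sessions_last_30d", "avg_session_minutes", "days_since_last_play"],
    outputCol="features",
)
model = Pipeline(stages=[
    assembler,
    GBTClassifier(labelCol="churned", featuresCol="features"),
]).fit(train)

# Evaluate on the held-out split before using predictions to target
# retention campaigns.
auc = BinaryClassificationEvaluator(labelCol="churned").evaluate(model.transform(test))
print(f"Holdout AUC: {auc:.3f}")
```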

The business benefits of data lakehouses

The flexibility and catch-all nature of data lakehouses are fast proving attractive to organizations looking to capitalize on their data assets, especially as part of digital initiatives that hinge on quick access to a wide array of data.

“The primary value driver is the cost efficiencies enabled by providing a source for all of an organization’s structured and unstructured data,” says Steven Karan, vice president and head of insights and data at consulting company Capgemini Canada, which has helped implement data lakehouses at leading organizations in financial services, telecom, and retail.

Moreover, data lakehouses store data in such a way that it is readily available for use by a wide array of technologies, from traditional business intelligence and reporting systems to machine learning and artificial intelligence, Karan adds. “Other benefits include reduced data redundancy, simplified IT operations, a simplified data schema to manage, and easier-to-enable data governance.”

One particularly valuable use case for data lakehouses is in helping companies get value from data previously trapped in legacy or siloed systems. For example, one Capgemini enterprise customer, which had grown through acquisitions over a decade, couldn’t access valuable data related to resellers of its products.

“By migrating the siloed data from legacy data warehouses into a centralized data lakehouse, the client was able to understand at an enterprise level which of their reseller partners were most effective, and how changes such as referral programs and structures drove revenue,” he says.

Putting data into a single data lakehouse makes it easier to manage, says Meera Viswanathan, senior product manager at Fivetran, a data pipeline company. Companies that have traditionally used both data lakes and data warehouses often have separate teams to manage them, making it confusing for the business units that need to consume the data, she says.

In addition to Databricks, Amazon Redshift Spectrum, and Snowflake, other vendors in the data lakehouse space include Microsoft, with its lakehouse platform Azure Synapse, and Google, with its BigLake on Google Cloud Platform, as well as data lakehouse platform Starburst.

Accelerating data processing for better health outcomes

One company capitalizing on these and other benefits of data lakehouses is life sciences analytics and services company IQVIA.

Before the pandemic, pharmaceutical companies running drug trials used to send employees to hospitals and other sites to collect data about things such as adverse effects, says Wendy Morahan, senior director of clinical data analytics at IQVIA. “That is how they make sure the patient is safe.”

Once the pandemic hit and sites were locked down, however, pharmaceutical companies had to scramble to figure out how to get the data they needed — and to get it in a way that was compliant with regulations and fast enough to enable them to spot potential problems as quickly as possible.

Moreover, with the rise of wearable devices in healthcare, “you’re now collecting hundreds of thousands of data points,” Morahan adds.

IQVIA has been building technology to do just that for the past 20 years, says her colleague Suhas Joshi, also a senior director of clinical data analytics at the company. About four years ago, the company began using data lakehouses for this purpose, including Databricks and the data lakehouse functionality now available with Snowflake.

“With Snowflake and Databricks you have the ability to store the raw data, in any format,” Joshi says. “We get a lot of images and audio. We get all this data and use it for monitoring. In the past, it would have involved manual steps, going to different systems. It would have taken time and effort. Today, we’re able to do it all in one single platform.”

The data collection process is also faster, he says. In the past, the company would have to write code to acquire data. Now, the data can even be analyzed without having to be processed first to fit a database format.
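
On Spark-based lakehouse platforms, that workflow looks roughly like the sketch below: binary media files and semi-structured records are queried in place, with no upfront table design. The paths and field names are hypothetical examples, not IQVIA’s actual pipeline.

```python
# Sketch: querying raw files in place without first fitting them to a
# database schema. Paths and field names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("raw-data-sketch").getOrCreate()

# Images and audio land as-is; Spark's binaryFile source exposes the
# path, modification time, length, and raw bytes of each file as columns.
media = spark.read.format("binaryFile").load("/lake/raw/site_uploads/")
media.select("path", "length", "modificationTime").show(truncate=False)

# Semi-structured JSON (e.g., wearable-device readings) is queryable
# immediately via schema inference -- no upfront table design step.
readings = spark.read.json("/lake/raw/wearables/")
readings.groupBy("device_type").agg(F.avg("heart_rate")).show()
```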

Take the example of a patient in a drug trial who gets a lab result that shows she’s pregnant, but the pregnancy form wasn’t filled out properly, and the drug is harmful during pregnancy. Or a patient who has an adverse event and needs blood pressure medication, but the medication was not prescribed. Not catching these problems quickly can have drastic consequences. “You might be risking a patient’s safety,” says Joshi.

Analytics, Data Architecture, Data Management

Founded in 1784 by Alexander Hamilton, BNY Mellon is one of the oldest banks in the U.S. and is the world’s largest custodian bank and securities services company, with $2.4 trillion in assets under management, another $46 trillion in assets under custody, and more than $307 billion in private wealth.

It is also evolving to become a digital bank, with cloud a key element of this transformation.

Laying the foundation for its cloud strategy, BNY Mellon undertook a multi-year application modernization effort. “In the course of that journey, we virtualized and containerized approximately 95% of our distributed applications in our internal ecosystem. We essentially built an orchestration layer and viewed public cloud as just another landing zone outside our data centers,” says Joe Sieczkowski, CIO of architecture and engineering at BNY Mellon.

At CIO’s recent Future of Cloud Summit, John Gallant, enterprise consulting director with Foundry, sat down with Sieczkowski to learn more about his cloud strategy, governance in the cloud, and leveraging cloud where it is most effective. What follows are edited excerpts of that conversation. For more insights, watch the full video embedded below.

On BNY Mellon’s cloud strategy:

First and foremost, we view cloud as a journey, not a destination. Our strategy is essentially to leverage the public cloud’s economies of scale, to drive business value, to lower risk, to increase resiliency, and really to ensure our infrastructure is evergreen. Essentially, cloud enables us to better serve our stakeholders.

Today, our strategy is to have a multi-cloud approach. We have to go to where our clients are. We are going to pick best-of-breed solutions and maintain our ability to pivot as needed. We are going to limit lock-in, and we are going to understand our exit strategy. And frankly, for critical business workloads, we may actually run a process on multiple providers, such as Azure and GCP, or AWS and Azure.

Governance in multicloud environments:

BNY Mellon already has a rigorous governance process. And our approach has been to extend and enhance that process to cover cloud. So as an example, every development initiative has to go through a permit-to-design, permit-to-build, and permit-to-operate tollgate. And that is where we do the architecture reviews, the security reviews, the risk reviews, and even the operational reviews to make sure that we are appropriately securing, monitoring, and governing everything we do for our stakeholders.

BNY Mellon’s modernization journey:

BNY Mellon is evolving into a digital bank. The key point here is that our cloud strategy is part of our overall technology and digital journey, as we continually modernize. So we view it this way: we have laid the foundation for public cloud with our internal modernization journey. This included enhancing our designs, our standards, our controls, and quality assurance, as well as the governance and tollgates around it. We have established well-defined design patterns and blueprints that are constantly evolving, and we have also established anti-patterns that people must avoid. Technology is always evolving, and we have to evolve with it and continue to manage it professionally.

How cloud bolsters resilience:

BNY Mellon has a very strong resiliency posture. However, we believe cloud will afford us an opportunity to really think about next-generation resiliency. This includes scaling during market events and avoiding missed windows and missed service-level agreements. And frankly, we have also been thinking about the notion of a lifeboat in the cloud, meaning that if there was a really drastic event, we could spin up a lifeboat—in the cloud—to process workloads. We are thinking about it as a cost-effective way to further improve our resiliency posture.

Where cloud is most effective:

One area I think cloud is just going to be really effective is any area which involves experimentation and has a high opportunity cost. Because when you can experiment, you can potentially enter a new business quickly, test an idea. So, for instance, let’s say I have an idea or one of our leading data scientists has an idea for a next-generation fraud model. We can spin up 1,000 GPUs in the cloud for 2 weeks, test a new fraud model, and then turn them off. I just completely removed the opportunity cost.

Another thing that comes to mind is that there are customers that want their data close to them. So cloud helps in another area: if a customer happens to want their data in their own country—both for latency reasons and perhaps for data domicile reasons—it allows us, as an organization, to go to the customer rather than have the customer come to us.

On AI in the enterprise:

I believe data science, ML, and AI will actually transform the enterprise, in and outside of financial services. From my point of view, regardless of industry, effective firms will be deriving insights from data and driving actionable strategies to provide value for their customers and stakeholders.

At the end of the day, AI and ML aren’t magic. At their core, they are sophisticated and complex math on data. And you need to understand purpose and outcome, and establish the right due diligence, peer review, and processes to test your algos, to ensure effectiveness.

And you hear this a lot. But at the end of the day, you need to make sure your results are explainable and free of bias. It is all about data driving insights and insights driving strategy.

Cloud Computing, Financial Services Industry, Multi Cloud

Skills shortages continue to plague the IT sector, causing UK technology job vacancies to shoot up by almost 200% since 2020, according to BCS.

Not having the right skills or team is the third biggest worry among senior IT decision-makers in the UK, with two-thirds of technology executives (66%) highlighting that their organisation’s digital transformation projects are being stalled due to struggles in recruiting IT professionals with the skills they need.

Cybersecurity is the UK tech sector’s most sought-after skill set according to the Nash Squared Digital Leadership Report, with 43% of respondents reporting a shortage, followed by big data specialists and analysts (36%), technical architects (33%), and developers (32%). Other in-demand skill sets include network engineering and DevOps.

Sadly, there’s no quick fix to the problem of tech skills shortages. With the biggest cause of IT skill shortfalls in the UK being a lack of STEM graduates coming through the education system, changes to public policy are key. However, there are steps that CIOs can take to begin easing recruitment challenges.

1. Change the perception of a career in IT

One of these is to work towards changing the general perception of IT careers and giving people a better understanding of how varied work in the technology sector can be.

“IT often has the perception that it’s solely focused on the lone ranger sitting in a darkened room responding to the bad guys. In terms of attracting talent, this may not appeal to those who’re searching for a career that’s people-focused and revolves around being part of a team,” says Heather Hinton, CISO of cloud-based comms company RingCentral.

Hiring, IT Leadership, IT Skills, IT Training