Already a leader in Malaysia’s burgeoning cloud services and solutions sector when it was acquired in 2021 by Time dotCom, one of the region’s largest fixed-line communications companies, AVM Cloud recently became one of a select group of VMware Cloud Verified providers to earn the VMware Sovereign Cloud distinction.

Originally known as Integrated Global Solutions Technologies, AVM Cloud has a long relationship with VMware going back to 2010.

David Chan, CEO, AVM Cloud


AVM Cloud’s CEO David Chan explains that “being named VMware’s Hybrid Cloud Provider of the Year FY 2018 reflected our commitment to provide customers with choices that enable them to optimize their unique cloud journey and in many ways our decision to pursue and earn the VMware Sovereign Cloud distinction is a natural progression of that effort. Now our customers can choose to have their data safely and securely kept, maintained, and safeguarded by Malaysian citizens in Malaysian territory.”

Chan notes that AVM Cloud’s commitment to providing enterprises with choices is readily apparent in the depth and breadth of the company’s portfolio. This includes not only its hybrid cloud products, but also the private AVM Cloud offered in multi-tenant and dedicated versions, Infrastructure-as-a-Service and Platform-as-a-Service, the company’s Fusion backup to cloud solution, and AVM’s Cloud-In-A-Box – a ready-made offering that lets organizations deploy a private cloud with robust security features on premises or in a co-located data center.

Notably, AVM Cloud also offers a number of custom cloud solutions. This includes an ever-growing portfolio of cloud-native applications based on VMware Tanzu.

Chan says AVM Cloud’s top priority in achieving the distinction was to be able to cater to the full spectrum of customers’ workloads, including those that are best served when data resides in, is safeguarded in, and is managed and maintained within sovereign territory without intervention from foreign entities.

Sovereignty is increasingly a priority for many organizations in Malaysia. In the case of AVM Cloud, this includes customers in numerous industries, including financial services and manufacturing.

“The regulatory requirements on sovereign cloud are still nascent and developing in Malaysia,” he says. Data sovereignty is nonetheless reflected in existing legal and policy frameworks, which encompass a comprehensive, cross-sector framework to protect personal data in commercial transactions and play an important role in helping companies address data sovereignty issues.

These issues are directly addressed by the five criteria and numerous requirements that must be met to achieve the VMware Sovereign Cloud distinction: data sovereignty and jurisdiction control, data access and integrity, data security and compliance, data independence and mobility, and data innovation and analytics. AVM Cloud addresses each of them.

“Our sovereign clouds are architected and built to deliver security and data access that meets the strict requirements of regulated industries and local jurisdiction laws on data privacy, access, and control,” Chan says. “We deliver this national capability for digital resilience while still enabling our customers to access a hyperscale cloud in another region for ancillary workloads or analytics. In this way, Malaysian companies can demonstrate to their customers that they value their trust and treat their personal data with the utmost care. Ultimately, this commitment will benefit all Malaysian citizens.”

Learn more about AVM Cloud and its partnership with VMware here.

Cloud Management, IT Leadership

Before any innovation initiative starts, there are questions (and usually lots of them). What is innovation and, more importantly, what does it mean for your organization? What fears or misperceptions hold innovation back? If you haven’t yet, check out this blog before reading this follow-up piece.

Decades ago, Netflix mailed DVDs to homes and a copy of the Yellow Pages sat next to every landline phone. Today, Netflix is a streaming juggernaut producing award-winning content, and the Yellow Pages’ website is one of the world’s largest online directories, used by millions of businesses.

How did these well-established brands simultaneously run and reinvent? Their stories prove that innovation without disruption is possible. Here’s what we can learn from their successes… 

Introduce incremental value while keeping your core service

In the prelude to this blog, we discussed that innovation doesn’t have to be a breakthrough new technology or a completely new business model. It can be something simple that adds value to the day-to-day lives of your customers or employees. The Yellow Pages is a perfect example. While the company eventually ended its printed product, it took two years to finally get there. The company worked incrementally, first by strengthening its market share through strategic acquisitions and then by adding multi-channel marketing as a business service. Today, Yellow Pages provides the same service as its first publication back in 1966, only faster and better. Likewise, to this day, you can still have movies delivered to your home as part of your Netflix service.

Great innovation shows that you don’t need to forsake the old in favor of the new. By retaining their core service while laying the groundwork for next-phase evolution, these brands were able to retain their loyal customer base while reaching out to new audiences. 

How can your organization take small steps toward innovation while still following its North Star? Perhaps this means augmenting your voice-only customer service with new digital channels that make it easier for your customers to contact you. Maybe it’s adding analytics that allow you to better understand what your customers want so you can give it to them. 

Digital channels, analytics, AI, and automation are key technologies companies are leveraging to drive innovation, but they’re largely delivered via the cloud. What does that mean for your decades-old on-premises systems?

Innovation starts with ideation and follows through with implementation, but current investments shouldn’t be abandoned in the name of innovation.

You may not be ready to move on from your rock-solid on-premises systems (and that’s okay), but you know how greatly you can benefit from new capabilities delivered via the cloud. Even if you are ready, the systems you have in place are deeply entwined with other frameworks. Even something seemingly simple like adding new digital channels to your voice-only contact center can create a lot of new challenges, such as disparate agent interfaces, split reporting of customer interactions between voice and digital channels, and changing communications modes in real time to address customers’ needs (i.e., shifting from chat to voice to video).

You need to move at a pace and path that fits your business needs. If your technology isn’t mature enough to deliver what needs to be developed, you’ll need to collaborate with partners who offer enabling technologies to gain access. 

Here are a few examples of innovation without disruption using Avaya’s Experience Platform:

Superior Propane – Canada’s leading supplier for propane delivery and tank installs – uses Avaya’s Experience Platform to open the door to cloud and AI without having to rip and replace its existing on-premises systems. 

Our partner alliances also helped speed time to value by offering access to dozens of leading tech vendors that could seamlessly integrate with the company’s Avaya solution. Superior Propane began driving innovation internally and externally using proactive automated outbound notifications, real-time reporting, and speech and desktop analytics that helped reduce Average Handle Time (AHT) by 30 seconds per call.

Standard Chartered Bank uses Avaya’s Experience Platform for a personalized path to cloud adoption.

Public cloud was not yet a suitable model for the bank when weighed against its business and client requirements. Avaya Enterprise Cloud created the global platform it needed while ensuring optimal security and data privacy. With Avaya, the bank has been able to innovate externally with digital, personalized CX and internally with enhancements to its agent desktop.

Unlock new value through the cloud at your own pace 

At the heart of business is the propensity to keep moving forward, opening new doors, and doing new things. Embracing new ideas, however, shouldn’t mean disruption to your existing business operations. Create your own path to cloud technologies that drive innovation – Avaya can help every step of the way. 

IT Leadership

Organizations that have embraced a cloud-first model are seeing a myriad of benefits. The elasticity of the cloud allows enterprises to easily scale up and down as needed. In practice, with organizations more distributed due to Covid-19, many enterprises prefer not to commit to just one cloud service, instead sourcing multiple cloud solutions from a variety of vendors.

The cloud also helps to enhance security, improves insight into data, and aids with disaster recovery and cost savings. Cloud has become a utility for successful businesses. Gartner predicted that around 75% of enterprise customers using cloud infrastructure as a service (IaaS) would adopt a deliberate multi-cloud strategy by 2022, up from 49% in 2017.

“Businesses don’t want to be locked into one particular cloud,” says Tejpal Chadha, Global Head, Digitate SaaS Cloud & Cyber Security. “They want to run their applications on different clouds so they’re not dependent on one in case it were to temporarily shut down. Multi-cloud has really become a must-have for organizations.”

Yet, at the same time, companies that tap into these multi-cloud solutions are opening themselves up to additional, and potentially significant, security risks. They become increasingly vulnerable in an age of more sophisticated, active cyberhackers.

To address security risks, cloud services have their own monitoring processes and tools that are designed to keep data secure. Many offer customers basic monitoring tools for free. But if companies want a particularly robust monitoring service, they often must pay added fees. With multiple clouds, this added expense can be significant.

“The cost goes up when you have to have specific monitoring tools for each cloud,” Chadha says. “Monitoring also needs to be instantaneous or real-time to be effective.”

Organizations using multi-cloud solutions are also susceptible to cloud sprawl, which happens when an organization lacks visibility into or control over its cloud computing resources. The organization therefore ends up with excess, unused servers or paying higher rates than necessary.
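Cloud sprawl can often be surfaced with a simple utilization audit. The sketch below is purely illustrative and not tied to any particular provider’s API; the instance records and the 5% threshold are invented for the example. In practice, the utilization figures would come from a provider’s monitoring service.

```python
# Illustrative sketch: flag underutilized cloud instances as sprawl candidates.
# The instance data and threshold here are hypothetical; real utilization
# metrics would come from a cloud provider's monitoring API.

def find_sprawl_candidates(instances, cpu_threshold=5.0):
    """Return IDs of instances whose average CPU utilization (%) is below threshold."""
    return [
        inst["id"]
        for inst in instances
        if inst["avg_cpu_percent"] < cpu_threshold
    ]

fleet = [
    {"id": "web-01", "avg_cpu_percent": 42.5},
    {"id": "batch-07", "avg_cpu_percent": 1.2},    # likely forgotten
    {"id": "dev-test-3", "avg_cpu_percent": 0.4},  # likely forgotten
]

print(find_sprawl_candidates(fleet))  # ['batch-07', 'dev-test-3']
```

Instances flagged this way become candidates for decommissioning or downsizing, which is one concrete way to claw back the excess spend the article describes.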

For enterprises safeguarding their multi-cloud solutions, a better tactic is to use just one third-party overarching tool for all clouds – one that monitors everything instantaneously. ignio™, the award-winning enterprise automation platform from AIOps vendor Digitate, does just that.

ignio AIOps, Digitate’s flagship product, facilitates autonomous cloud IT operations by tapping into AI and machine learning to provide a closed-loop solution for Azure and AWS, with versions for Google Cloud (GCP) and private clouds also coming soon. With visibility and intelligence across layers of cloud services, ignio AIOps provides multi-cloud support by leveraging cloud-native technologies and APIs. It also provides actionable insights to better manage your cloud technology stack.

ignio is unique in that it cuts across multiple data centers and both private and public clouds, seamlessly handling everything in a single window. It provides a bird’s-eye view of the health of a company’s data centers and clouds. Then, ignio continuously monitors, predicts, and takes corrective action across clouds while also automating previously manual tasks, an approach Digitate calls “closed-loop remediation.” Closed-loop remediation enables companies to automate actions for remediation, compliance, and other essential CloudOps tasks.

“The ignio AIOps software first comes in and, in the blueprinting process, gives a holistic view of what companies have in their universe,” Chadha says. “We call that blueprinting or discovery. Then, we help automate tasks. We’re completely agnostic when it comes to monitoring or taking corrective action, or helping increase automation across all of these clouds.”
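The monitor-then-remediate cycle described here can be pictured as a simple control loop. The following is a minimal sketch of that general pattern, not ignio’s actual API; the metric, threshold, and remediation function are hypothetical stand-ins.

```python
# Illustrative closed-loop remediation sketch (generic pattern, not ignio's API):
# read a metric, detect a breach, apply a remediation, then verify the fix.

def closed_loop_step(read_metric, threshold, remediate):
    """One cycle: detect -> remediate -> verify. Returns a status string."""
    if read_metric() <= threshold:
        return "healthy"
    remediate()                      # e.g., restart a service, free disk space
    if read_metric() <= threshold:   # verify the remediation took effect
        return "remediated"
    return "escalate"                # hand off to a human operator

# Hypothetical example: a disk-usage metric that cleanup brings back down.
state = {"disk_percent": 97}

def read_disk():
    return state["disk_percent"]

def purge_old_logs():
    state["disk_percent"] = 60       # simulated effect of the cleanup

print(closed_loop_step(read_disk, 90, purge_old_logs))  # remediated
```

The “closed loop” is the verify step: the system checks its own fix and only escalates to a person when automation fails, which is what removes routine manual toil.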

As Digitate ignio customers automate processes and reduce manual IT efforts, they’re finding they’re saving money, in some cases millions of dollars a year. For many companies, tasks that once took three days now take only an hour.

“The biggest benefits are that less manual work is ultimately needed, and then there’s also the cost savings,” Chadha says. “Enterprises using this tool are managing their multi-cloud estate much more efficiently.”

To learn more about Digitate ignio and how Digitate’s products can help you thread the multi-cloud needle, visit Digitate.

IT Leadership

Based in Italy and with more than 20 years of experience helping enterprises, from large international firms to emerging mid-sized operations, grow their businesses with technology, WIIT serves a rapidly expanding and diverse customer base. With a full portfolio that includes an extensive array of cloud offerings – including private, public, and hybrid cloud services – the company is well-known for its track record of success in helping organizations realize the full potential of the cloud while bypassing the complexity often associated with large-scale digital transformations.

Serving leaders in the energy, fashion, financial services, food, healthcare, manufacturing, media, pharmaceutical, professional services, retail, and telecommunications industries, WIIT works with organizations that have stringent business continuity needs, mission-critical applications, and crucial data security and sovereignty requirements. Customers draw on the company’s full suite of solutions, which includes everything from digital collaboration tools and a full cybersecurity stack to extensive software development services and innovations that let customers embrace the Internet of Things.

We recently caught up with Alessandro Cozzi, founder and CEO of WIIT to learn about the company, what he’s seeing in its burgeoning cloud business, and what he feels will be the next big thing. We also took the opportunity to learn why it was important to achieve the VMware Cloud Verified distinction, not just for WIIT, but the companies it serves in Italy, Germany, and around the globe.

“The traditional IT model is no longer sustainable,” says Cozzi. “Often the greatest value of the cloud lies in hybrid architectures that, for the vast majority of enterprises, are complex to design and manage. We offer a platform that secures and optimizes the full mix of disparate infrastructure, from edge computing to public cloud, that many organizations need. We also govern it with specialized expertise, certifications, and top-tier proprietary assets that enable us to exceed the most demanding service level agreements.”

Notably, WIIT offers a highly customized Premium Cloud that is uniquely tailored to each organization, a Premium Private offering for critical applications, and a Public Cloud that offers seamless connectivity to Amazon Web Services, Google Cloud, and Microsoft Azure. The company’s Premium Multicloud services enable customers to combine elements of each to best address their needs.

Cozzi notes that WIIT’s Premium Private Cloud ensures extremely high levels of security, scalability, and data reliability, while public clouds complement it, especially for applications that aren’t critical. He also points out that hyperscalers are becoming specialized, prompting more companies to rely on services from different cloud providers.

“The hybrid cloud is often the answer when there is a need to host systems in more than one location for international processes, regulatory needs, network latency, data sovereignty, or application requirements,” he adds. “At WIIT we engineer, implement, and govern custom hybrid cloud and multi-cloud models in conjunction with the complex IT architectures our customers need. And we make the most of the unique capabilities of different clouds and data centers to ensure that our clients can continually evolve their businesses.”

Cozzi stresses that VMware’s trusted technologies and new innovations play a pivotal role in these efforts. It’s what led the company to achieve the VMware Cloud Verified distinction.

“WIIT is among the most innovative cloud solutions and service providers in Europe,” he says. “Most of our customers rely on VMware technologies for critical services. Showing that we have deep expertise with them is yet another way that we provide peace of mind and a serene cloud journey.”

Not surprisingly, Cozzi only sees cloud adoption increasing in light of customers’ success in growing their businesses with the cloud and the new capabilities a flexible, hybrid approach makes possible.

“It gives me great pride that today we’re able to remove so much of the complexity involved in even the largest, most involved cloud deployments so that customers simply experience the full potential of the cloud,” says Cozzi. “But I’m also really excited about the innovations we are seeing in the world of applications and the advancements in cloud microservices now taking shape. The impact of the cloud will only increase. We’re intent on growing a business that continues to offer customers secure and innovative cloud services that recognize not only people, but also the environment, as a strategic priority. Ultimately, we are committed to being a key player not just in the realm of digital transformations, but also in just and sustainable transitions of infrastructure and the business processes and practices it supports.”

Learn more about WIIT and its partnership with VMware here.

Cloud Management, IT Leadership, VMware

An organization depends on its financial institution to complete a major transaction, but a glitch holds up funds, negatively impacting cash flow. Meanwhile, regulators fine a different financial institution for failing to catch fraudulent transactions.

In both situations, better business transaction monitoring could have helped prevent negative, costly outcomes. In the former, more seamless monitoring would have ensured the bank client’s transaction was completed faster, maintaining and even boosting customer satisfaction. For the latter, it could have prevented regulatory fines.

Today, especially, as modern applications and systems have become more complicated and IT infrastructure more complex, seamless business transaction monitoring is crucial. Enterprises need the proper tools to detect incidents early, automate wherever they can to make processes more efficient and free up IT workers for more complicated tasks, and be able to automatically solve any issues quickly.

“Business transaction monitoring is transitioning from a support function to a critical element in any organization’s operations,” says Digitate CEO Akhilesh Tripathi.

French utility company ENGIE, for one, needed a solution that could monitor its workload automation processes across its extensive IT infrastructure and business applications, and reduce dependency on manual issue resolution.

One of the world’s largest independent power producers, ENGIE conducts some two million meter readings and generates over 150,000 invoices nearly every day. ENGIE approached Digitate for help to digitally transform its billing and payment process. The company wanted to move away from manual monitoring and remediation, which were inefficient and risk-prone, increasing operational cost and often leading to inaccuracies and delays in revenue generation.

Within 18 months, ENGIE was able to transform its workload process through a closed-loop solution that uses intelligent automation to automatically identify and solve any issues, further cementing its transformation into a digital enterprise. Digitate collaborated with ENGIE to provide a layered solution for monitoring workload processes to create a “blueprint” of the company’s entire batch system.

Now, ENGIE spends less time and effort on manual monitoring. It has reduced impacts to downstream processes like billing and payment communications by 80%, realized a 95% reduction in customer complaint tickets, and prevented €5 million per day ($4.87 million) in revenue loss. For its successful digital transformation, powered by Digitate, ENGIE was named Order-to-Cash winner in the Hackett Group’s 2022 Digital Awards.

With better business transaction monitoring as a part of their digital transformation, companies find business processes are well supported to generate cash, while transparency increases enterprise-wide. Financial institutions completing tens of millions of transactions daily can feel confident they can detect high-risk and suspicious activities.

Regardless of industry, business transaction monitoring has a critical impact on enterprises with tangible results. For instance, utility companies monitoring hundreds of thousands of bills have peace of mind that delays can be quickly remedied or prevented from happening at all. Yet, to be successful, organizations working to become digital enterprises must have the right technology.

“With the complexity of more and more systems in place, companies need tools to create visibility and confidence in IT to execute fast and protect the business,” Tripathi says.

With its closed-loop solutions, Digitate helps companies monitor all events across their IT infrastructure to create an integrated view. Tools detect anomalies, investigate, and self-heal to correct. Routine activities are automated, doing away with repetitive manual tasks and freeing up valuable IT workers’ time.

“We close the loop and resolve problems to create a value proposition for our customers,” Tripathi says. “They’re able to do this in one-tenth of the time it would otherwise take them.”

To learn more about better business transaction monitoring and how Digitate’s products can help you, visit Digitate.

IT Leadership

Founded in 2011, Lintasarta Cloudeka is a division of Lintasarta, Indonesia’s leading provider of information and communications technology. Offering everything from fiber optics to data centers and satellite networks, Lintasarta has a presence throughout Indonesia, with 54 facilities spread throughout the nation and more than 2,400 enterprise customers. These include leading businesses in a wide range of industries, including agriculture, banking, government, health care, higher education, manufacturing, retail, telecommunications, and technology.

We recently caught up with Ginandjar, Marketing & Solution Director at Lintasarta, to learn what he’s seeing in Indonesia’s rapidly growing market for cloud services and solutions, what’s accelerating cloud adoption in the country, what he sees as the next driver of growth, and what it means for the company to achieve the VMware Cloud Verified distinction.

“Throughout the country Lintasarta Cloudeka is known as the cloud provider from and for Indonesia,” says Ginandjar. “We have a really strong understanding of the needs of the industries we serve and provide end-to-end cloud services to a diverse customer base that includes large global enterprises and small- and medium-sized companies. We really pride ourselves on helping businesses realize the full potential of the cloud, and in the process enhance and grow their businesses.”

Lintasarta Cloudeka’s wide array of cloud solutions and services includes robust public, private, and multi-cloud offerings and an extensive portfolio of managed services, from full Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) to Backup and Disaster Recovery-as-a-Service, cloud object storage, and everything in between.

Notably, the company’s Deka Prime public cloud solution and Deka Premium private cloud solution are both based on VMware technologies, as are the company’s IaaS, PaaS, and other offerings. Ginandjar notes that customers manage them, as well as on-premises VMware-based infrastructure, with ease from a single dashboard using VMware vCloud Director.

“Most of our customers require a flexible infrastructure that doesn’t require them to manage it on their own, but enables them to effortlessly spin up or spin down infrastructure capacity and computing power as needed,” adds Ginandjar. “With Lintasarta Cloudeka they only pay for what they need and use, which is one of the reasons many enterprises initially turn to us. With other providers, they often receive surprise bills or are forced to pay for services they don’t use.”

He also stresses that reliability and performance are key concerns for most organizations. It’s a reality he says leads many to Lintasarta Cloudeka.

“Because we are directly supported by Lintasarta’s high-speed, high-performance networks that are renowned for their unflinching reliability, and we utilize VMware technologies that are proven and trusted by enterprises in every industry and relied on by most organizations in their on-site data centers, prospective customers know they are getting a solution that will free them to focus more on their applications and their business rather than their new software-defined infrastructure.”

The resulting peace-of-mind is something he believes will be even stronger because Lintasarta Cloudeka now holds the VMware Cloud Verified distinction.

“Achieving the VMware Cloud Verified distinction is not an easy task for any cloud solutions or services provider,” says Ginandjar. “For customers, VMware Cloud Verified is more than a distinction – it’s how they seek a trusted cloud service provider. They can be certain that all technology was implemented using best practices by individuals that really understand the solutions being deployed.”

While Ginandjar still feels that some misconceptions about the cloud remain among many companies in Indonesia – including that the primary use cases for the cloud, albeit valuable ones, are storage and backup – he feels that is changing.

 “We’re seeing a dramatic increase in microservices and autoscaling here in Indonesia,” he says. “And with VMware, we’re ideally positioned to speed up the development process involved. Now our customers can react faster and offer new services faster without ever worrying if they have the infrastructure needed to make them possible.”

Learn more about Lintasarta Cloudeka and its partnership with VMware here.

Cloud Computing, IT Leadership, Managed Cloud Services

Companies have learned to thrive — and in some cases survive — by leveraging data for competitive advantage. But how many organizations are truly data-driven enterprises?

“Data is becoming increasingly valuable, especially from a business perspective,” says Lakshmanan Chidambaram, president of Americas strategic verticals at global IT consulting firm Tech Mahindra. “After all, data can tell us a lot about a company’s processes and activities. It shows whether one is moving in the right direction, identifies areas of improvement, and suggests an appropriate process to make those improvements.”

Here are some key traits of a data-driven enterprise, according to experts.

They operate with an organization-wide data strategy

To be a data-driven enterprise requires having a cohesive, comprehensive data strategy that applies across the organization. This encompasses technology and automation, including the use of artificial intelligence (AI). But it also includes culture, governance, cybersecurity, data privacy, skills, and other components.

“The market for data governance, storage, and analytical tools has grown considerably, yet enterprises are still struggling to wrap their arms around the scope of the challenge,” Chidambaram says. “CIOs, CTOs, and [chief administrative officers] must step back and establish an enterprise-wide strategy to harness the value of data for their enterprise and integrate AI to enable sales, marketing, and operational excellence.”

This includes ensuring that the data architecture provides both data professionals and non-technical decision-makers with the tools they need to move beyond instinctual and anecdotal decisions, Chidambaram says.

“Many corporate and government enterprises are leveraging data-driven insights for improving customer service, reducing operating expenses, creating new business streams, and achieving overall business efficiency,” Chidambaram says.

Getting an organization’s leadership and workforce to commit to a data-driven approach is key to determining success, Chidambaram says. “Organizations must ensure that they [address] the following question to call themselves a truly data-driven organization: Is everybody willing to embrace data as part of the business culture?” he says.

They optimize resource allocation

It’s one thing to develop a data-driven strategy; it’s another thing entirely to effectively execute on the plan. That’s where having the right resources in place and updating them as needed is important.

“Once the strategy is defined, the people, process, and tools to support the strategy are critical for a data-driven organization,” says Kathy Rudy, partner and chief data and analytics officer at global technology research and advisory firm ISG.

For example, organizations need to have a process for building data catalogs; procedures and tools for data cleansing and data quality; defined data use cases and the right tools to support them; effective and secure access to data for internal and external users; overall security to support the use cases; and a data center of excellence to support complex data requests.

From a people perspective, being a data-driven organization means having a solid team of data analysts, data scientists, data engineers, and other professionals in place, and providing the necessary training when skills need to be updated.

They emphasize data governance

Data governance is another component of the overall data strategy that warrants extra attention. Governance encompasses data security, privacy, reliability, integrity, accuracy, and other areas. It’s essential to maintaining a data-driven operation.

“Without data governance, you cannot trust that the data you are using is of high quality, is synced across data sets by a common taxonomy, or is secure,” Rudy says. “Data governance also provides the foundation for access to the data.”

ISG is often faced with disparate databases with differing taxonomies and ways of maintaining the datasets, Rudy says. “Once we established a centralized data governance methodology — with people, processes, and tools — we were able to develop new ways to use our data internally and externally for client delivery, products, and data monetization.”

The centralized approach also established proper security protocols for data access inside the business, Rudy says. “Many people think data should be democratized, though I’m not convinced of that,” she says. “Unless you truly understand the source for the data, how it was collected, the context of the data, and [how to] analyze data, improper use can lead to bad decisions.”

For example, when the ISG sales teams asked for account information, the data team began pulling reports and discovered there were multiple names for the same client. “This made it very hard to pull together a snapshot of business over time, what was selling, by whom, etc.,” Rudy says. “Lack of governance over our data led to dirty data in our system, and an incomplete picture of a client that might have led us to incorrectly design an account strategy.”
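The duplicate-client-name problem Rudy describes is typically tackled with a canonicalization step during data cleansing: records are grouped under a normalized key before reporting. The sketch below is a hedged illustration of that idea; the normalization rules, client names, and amounts are all invented for the example, not ISG’s actual process.

```python
# Illustrative sketch of canonicalizing client names during data cleansing.
# The normalization rules and sample records are hypothetical.

import re

def normalize_client(name):
    """Lowercase, strip punctuation, and drop common legal suffixes to form a merge key."""
    key = re.sub(r"[^a-z0-9 ]", "", name.lower())
    key = re.sub(r"\b(inc|llc|ltd|corp|co)\b", "", key)
    return " ".join(key.split())

def group_sales(records):
    """Aggregate sales amounts under one canonical key per client."""
    totals = {}
    for rec in records:
        key = normalize_client(rec["client"])
        totals[key] = totals.get(key, 0) + rec["amount"]
    return totals

sales = [
    {"client": "Acme, Inc.", "amount": 100},
    {"client": "ACME Inc",   "amount": 250},
    {"client": "Globex LLC", "amount": 75},
]

print(group_sales(sales))  # {'acme': 350, 'globex': 75}
```

With a governed, shared merge key like this, “Acme, Inc.” and “ACME Inc” roll up into a single account view, which is exactly the snapshot-over-time reporting the sales team was missing.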

Responsible data use is paramount for data-driven organizations, says Deepika Duggirala, senior vice president of global technology platforms at TransUnion, a provider of financial services, healthcare, and insurance technologies.

“This means securing all data within an enterprise’s data ecosystem — both in motion and at rest — while maintaining the privacy of associates and consumers,” Duggirala says. “An enterprise must be able to evolve alongside growing data protection regulations, doing so by educating all associates on US and international data privacy and protection regulations, and building security and compliance into the initial design of all data storage and consumption. This mindset is how TransUnion makes trust possible and protects our data ecosystem and its compliance.” 

They establish a broad data mindset

Building a data culture and mindset is part of the overall data strategy, but it bears special mention because it truly helps bring the strategy to life.

“All aspects of decision-making are influenced by data,” Duggirala says. “Associates are fluent in its interpretation to better understand the market and make sound decisions. This is the core of TransUnion’s product development process — product managers, customer experience designers, and developers all leverage a different facet of our data to identify solutions that solve specific needs, define launch timelines, and ensure simple, intuitive features.”

At companies that are data-driven, “there is an organization-wide acknowledgement that data is at the heart of decision-making,” Rudy says. “So, when challenges are posed, questions are asked or strategy is designed, people automatically reach for data to support decision-making.”

At ISG, “from marketing and sales materials that describe our credentials, to client deliverables where data is used to substantiate recommendations, industry briefings where we back up our knowledge and expertise with data and facts, data truly is at the heart of everything we do,” Rudy says. “Data gives businesses a competitive advantage. We view data as circular. We are continuously in the process of collecting, validating, managing, curating, and analyzing data to drive insights for all our stakeholders.”

Data-driven organizations have many drivers, says Theresa Kushner, head of the Innovation Center for North America at consulting firm NTT Data. “This means that no matter where you sit in the organization, you can have access to the data you need to do your job,” she says. “Non-data-driven organizations are usually siloed in their approach to data management.”

NTT Data research shows that a minority of organizations say data is shared seamlessly across the enterprise. “In a data-driven enterprise this is not the case,” Kushner says. “Because these groups are directed by their leadership to make decisions based on data and because they have teams that pay special attention to key data sets, they can move quickly and drive their businesses using accurate, readily available data.”

Regular collaboration is key to having a data mindset. “Data is nothing without people sharing and using it,” Kushner says. “Effective data-driven cultures depend on efficient collaboration and open communication between owners of the data and its users. This trait of a data-driven organization supersedes all others, such as training, certification, data governance, and regular process updates.”

They make data collection a primary concern

Many AI projects are shelved in short order because data scientists cannot find the data that is needed for a proposed model, Kushner says. “Oftentimes this is because the data was never collected,” she says. “Data-driven organizations do not have this problem. They know which data domains are important and necessary to the running of the business, and they ensure that these datasets are protected and curated.”

For example, most companies have customer relationship management (CRM) systems that are used by sales to record and track opportunities, Kushner says. But the data in these systems is often incomplete for customers and their transactions, especially if data entry is the responsibility of the salesperson, she says.

“This means that when data scientists want to create a customer model that identifies those customers who will buy at a particular time or from a specific channel, the data they need might not be available or complete enough to support the model,” Kushner says. “Data-driven organizations, however, understand that this data is primary to running the business and as a result ensure that data management practices are thorough for key areas.”

In many cases, to ensure that data is entered appropriately, these organizations automate sales entry processes to free sales from tedious entry tasks. “Depending on the business type or industry, key areas may change,” Kushner says. “For example, manufacturers may find that managing the information on their suppliers more closely is their key data domain. No matter what industry, data-driven organizations have a plan for collecting, managing, and using key data.”
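One lightweight way to implement the thorough data management Kushner describes is to validate CRM records for completeness before they are accepted. A hypothetical sketch (the required fields are illustrative, not any particular CRM's schema):

```python
# Fields a sales opportunity must carry before a data scientist
# can build a reliable model on top of it (illustrative set).
REQUIRED_FIELDS = {"client", "amount", "close_date", "channel"}

def missing_fields(record: dict) -> set:
    """Return the required CRM fields that are absent or empty."""
    return {f for f in REQUIRED_FIELDS if not record.get(f)}

# An automated entry process would reject or flag this record
# instead of letting an incomplete row reach the model later.
opportunity = {"client": "Acme", "amount": 5000, "channel": ""}
gaps = missing_fields(opportunity)
# gaps == {"close_date", "channel"}
```

Running such a check at entry time, rather than at modeling time, is what keeps key datasets "protected and curated" in the sense used above.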

They foster strong collaboration between IT and the business

Data-driven enterprises tend to feature good working relationships between IT and business leaders. For example, when the CIO works closely with the finance department, a company can maximize the value of financial data.

“Delivering the right information, at the right time, in the right format to executives and managers requires a close partnership between the CFO and CIO,” says Lynn Calhoun, CFO at professional services firm BDO.

“This includes getting the finance and IT teams together to define information requirements, collaborating on setting up the right IT systems and architecture to meet those requirements, and working closely to implement and support agile systems and processes that can keep pace with today’s rapidly changing business environment,” Calhoun says.

In BDO’s case, “we work closely together to understand what the business ‘needs,’ not just what they ask for, which is usually constrained by what they know,” Calhoun says. That constraint limits the ability of the business leaders to achieve their goals, he says.


A vast majority of enterprises surveyed globally are overspending in the cloud, according to a new HashiCorp-Forrester report.

In a survey of over 1,000 IT decision makers across North America, Europe, the Middle East, and Asia-Pacific, 94% of respondents said their organizations had notable, avoidable cloud expenses due to a combination of factors, including underused and overprovisioned resources and a lack of skills to utilize cloud infrastructure.

Underused resources were among the top reasons for overspending, the report showed, with more than 66% of respondents listing them, followed by overprovisioned resources (59%) and a lack of needed skills (47%).

Another 37% of respondents also listed manual containerization as a contributor to overspending in the cloud.

Nearly 60% of respondents said they are already using multicloud infrastructures, with an additional 21% saying they will be moving to such an architecture within the next 12 months.


Multicloud infrastructure works for most enterprises

Further, the report said that 90% of respondents claimed a multicloud strategy is working for their enterprises. This contrasts with just 53% of respondents in last year’s survey claiming that such a strategy was working for them.

Reliability was the major driver of multicloud adoption this year, with 46% of respondents citing it as the top reason for adopting the computing architecture. Digital transformation came in second place this year, cited by 43% of respondents as the main driver for the move to multicloud, slipping from first place last year.

Other factors driving multicloud adoption this year included scalability, security and governance, and cost reduction.

Almost 86% of respondents claimed they are dependent on cloud operations and strategy teams, which perform critical tasks such as standardizing cloud services, creating and sharing best practices and policies, and centralizing cloud security and compliance.

Skill shortages were the top barrier to multicloud adoption, with 41% of respondents listing it as the top reason. Some of the other barriers listed by respondents included teams working in silos, compliance, risk management and lack of training.

Additionally, almost 99% of respondents said that infrastructure automation is important for multicloud operations as it can provide benefits such as faster, reliable self-service IT infrastructure, better security, and better utilization of cloud resources, along with faster incident response.

Eighty-nine percent of respondents said they see security as a key driver for multicloud success, with nearly 88% of respondents claiming they already relied on security automation tools. Another 83% of respondents said they already use some form of infrastructure as code and network infrastructure automation, according to the report.


The pace of business is accelerating. Enterprises today need robust networks and infrastructure to effectively manage and protect an ever-increasing volume of data. They must also deliver the speed and low latency that great customer experiences require in an era marked by dramatic innovations in edge computing, artificial intelligence, machine learning, the Internet of Things, unified communications, and other transformative computing trends now synonymous with business success.

We recently caught up with Mike Fuhrman, Chief Product and Information Officer at Flexential, to learn how the company is helping customers gain the connectivity and cloud solutions they need and what it means to be VMware Cloud Verified. We also took the opportunity to learn what he sees as the next big transformative trends in cloud computing.

“We serve companies in numerous industries, including those in software and IT services, manufacturing, finance, insurance, retail, health care, transportation, media and Internet, and telecommunications,” says Fuhrman. “The kinds of organizations that benefit most from our unique approach are those that require a seamless and integrated solution, low-latency connectivity across North America and beyond—and access to an all-inclusive solution from a single vendor so they can offload as much or as little of the IT management burden as they want. Organizations also turn to us for access to an extensive bench of professional services and IT experts who can assist in everything from strategy and design to implementation, ongoing management and optimization regardless of how their businesses scale.” 

Flexential’s 40 state-of-the-art data centers in North America – located in metropolitan areas across the country to ensure short hops and low latency – are connected by a 100 Gbps network backbone that is easily upgradable to 400 Gbps for those customers that require ultra-high speeds. Notably, the company offers cloud solutions with built-in security and compliance to the hypervisor for the peace of mind that results when infrastructure is audit-ready at all times. Industry-leading SLAs also guarantee that applications and the data within and used by them – the very lifeblood of the enterprise – are always accessible and protected.

The company’s unified cloud platform, FlexAnywhere™, integrates colocation, cloud, edge, connectivity, data protection, and managed and professional services to deliver a true hybrid IT approach. Flexential also offers two types of private clouds: a hosted private cloud with dedicated resources for compute, networking, and storage; and Advanced Access, a hosted private cloud offering that enables enterprises to maintain control of the provider-owned VMware vCenter server. An industry first, it allows for greater personalization in a virtual environment that performs and functions like an on-premises solution.

Flexential also provides a managed public cloud service that lets customers use AWS and Azure through a managed, consumption-based model that ensures they only pay for the public cloud services and capacity they use. And the Flexential Cloud Fabric, a Network-as-a-Service solution, makes it easy to spin up, configure, and manage all cloud connections from a single pane of glass, an innovation that makes true multi-cloud operations a reality.

The company also offers Desktop-as-a-Service, Disaster Recovery-as-a-Service, and managed containers, as well as managed services for networking and infrastructure, security, and compliance. Additional services address each stage of the cloud journey, from design and migration to optimization.

All of the company’s cloud offerings are based on VMware technology. This includes vSphere, HCX, vROps, and NSX, accessed through Flexential’s FXP portal, which is tightly integrated with VMware vCloud Director.

“Being VMware Cloud Verified is important to us because it not only allows us to demonstrate that Flexential delivers best-in-class solutions to our customers today, but it also contributes to an important strategic partnership that allows us to innovate and easily deliver market-leading future enhancements to our customers going forward,” adds Fuhrman. “Being VMware Cloud Verified is also important for our customers because it gives them the confidence and assurance that our cloud solutions utilize best-in-class network, storage, and compute solutions that are future-proofed and based on industry-leading technology that will continue to leverage market-leading innovations.”

Fuhrman notes that this confidence and assurance will be crucial as enterprises embrace what he sees as the next big transformative trend in cloud computing. It’s a trend he says already has significant momentum and will be increasingly commonplace in the near-term future.

“Multi-cloud deployments that enable enterprises to capitalize on the strengths of individual clouds for specific use cases, software-defined data centers, and enhancements to virtualized networks are where we will see a dramatic increase in activity,” says Fuhrman. “And that will only increase as more organizations stretch workloads across multi-cloud environments to dramatically improve customer experiences while simultaneously lowering their overall IT costs.”

Learn more about Flexential and its partnership with VMware here.


For enterprises looking to wrest the most value from their data, especially in real-time, the “data lakehouse” concept is starting to catch on.

The idea behind the data lakehouse is to merge together the best of what data lakes and data warehouses have to offer, says Gartner analyst Adam Ronthal.

Data warehouses, for their part, enable companies to store large amounts of structured data with well-defined schemas. They are designed to support a large number of simultaneous queries and to deliver the results quickly to many simultaneous users.

Data lakes, on the other hand, enable companies to collect raw, unstructured data in many formats for data analysts to hunt through. These vast pools of data have grown in prominence of late thanks to the flexibility they provide enterprises to store vast streams of data without first having to define the purpose of doing so.  

The market for these two types of big data repositories is “converging in the middle, at the lakehouse concept,” Ronthal says, with established data warehouse vendors adding the ability to manage unstructured data, and data lake vendors adding structure to their offerings.

For example, on AWS, enterprises can now pair Amazon Redshift, a data warehouse, with Amazon Redshift Spectrum, which enables Redshift to reach into Amazon’s unstructured S3 data lakes. Meanwhile, Snowflake can now support unstructured data with external tables, Ronthal says.
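The warehouse/lake distinction described above boils down to schema-on-write versus schema-on-read. A toy sketch in plain Python (hypothetical schema and payloads; real platforms enforce this in the storage engine, not in application code):

```python
import json

# Schema-on-write: the warehouse declares column types up front
# and rejects rows that don't conform (illustrative schema).
WAREHOUSE_SCHEMA = {"user_id": int, "event": str, "ts": float}

def warehouse_insert(row: dict) -> dict:
    """Validate a row against the declared schema before storing it."""
    for col, col_type in WAREHOUSE_SCHEMA.items():
        if not isinstance(row.get(col), col_type):
            raise ValueError(f"bad or missing column: {col}")
    return row

def lake_append(store: list, payload) -> None:
    """Schema-on-read: land the raw payload as-is; structure is
    applied later, at query time."""
    store.append(json.dumps(payload, default=str))

lake = []
lake_append(lake, {"user_id": "42", "clicks": [1, 2], "note": "free-form"})
warehouse_insert({"user_id": 42, "event": "login", "ts": 1.0})
```

A lakehouse aims to offer both behaviors on one copy of the data, which is why moving rows between two separate systems becomes unnecessary.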

When companies have separate lakes and warehouses, and data needs to move from one to the other, it introduces latency and costs time and money, Ronthal adds. Combining the two in one platform reduces effort and data movement, thereby accelerating the pace of uncovering data insights.

And, depending on the platform, a data lakehouse can also offer other features, such as support for data streaming, machine learning, and collaboration, giving enterprises additional tools for making the most of their data.

Here is a look at the benefits of data lakehouses and how several leading organizations are making good on their promise as part of their analytics strategies.

Enhancing the video game experience

Sega Europe’s use of data repositories in support of its video games has evolved considerably in the past several years.

In 2016, the company began using the Amazon Redshift data warehouse to collect event data from its Football Manager video game. At first this event data consisted simply of players opening and closing games. The company had two staff members looking into this data, which streamed into Redshift at a rate of ten events per second.

“But there was so much more data we could be collecting,” says Felix Baker, the company’s head of data services. “Like what teams people were managing, or how much money they were spending.”

By 2017, Sega Europe was collecting 800 events a second, with five staff working on the platform. By 2020, the company’s system was capturing 7,000 events per second from a portfolio of 30 Sega games, with 25 staff involved.

At that point, the system was starting to hit its limits, Baker says. Because of the data structures needed for inclusion in the data warehouse, data was coming in batches and it took half an hour to an hour to analyze it, he says.

“We wanted to analyze the data in real-time,” he adds, but this functionality wasn’t available in Redshift at the time.

After performing proofs of concept with three platforms — Redshift, Snowflake, and Databricks — Sega Europe settled on using Databricks, one of the pioneers of the data lakehouse industry.

“Databricks offered an out-of-the-box managed services solution that did what we needed without us having to develop anything,” he says. That included not just real-time streaming but machine learning and collaborative workspaces.

In addition, the data lakehouse architecture enabled Sega Europe to ingest unstructured data, such as social media feeds, as well.

“With Redshift, we had to concentrate on schema design,” Baker says. “Every table had to have a set structure before we could start ingesting data. That made it clunky in many ways. With the data lakehouse, it’s been easier.”

Sega Europe’s Databricks platform went into production in the summer of 2020. Two or three consultants from Databricks worked alongside six or seven people from Sega Europe to get the streaming solution up and running, matching what the company had in place previously with Redshift. The new lakehouse is built in three layers, the base layer of which is just one large table that everything gets dumped into.

“If developers create new events, they don’t have to tell us to expect new fields — they can literally send us everything,” Baker says. “And we can then build jobs on top of that layer and stream out the data we acquired.”
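Baker's base layer of one large table, with jobs streaming refined data out of it, can be sketched in miniature (event names and fields here are invented; Sega Europe's actual pipeline runs on Databricks and Apache Spark):

```python
# Base layer: every raw event lands in one append-only table,
# with no up-front schema agreement required.
bronze = []

def ingest(event: dict) -> None:
    bronze.append(event)

def match_results(events):
    """A downstream job that projects only the fields it needs,
    silently tolerating fields it has never seen before."""
    return [
        {"player": e["player"], "score": e["score"]}
        for e in events
        if e.get("type") == "match_end" and "score" in e
    ]

# Developers can send brand-new fields without coordinating first.
ingest({"type": "match_end", "player": "p1", "score": 3, "new_field": "x"})
ingest({"type": "login", "player": "p2"})
# match_results(bronze) -> [{"player": "p1", "score": 3}]
```

The key property is that new event shapes never break ingestion; only the downstream jobs that care about a field need to know it exists.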

The transition to Databricks, which is built on top of Apache Spark, was smooth for Sega Europe, thanks to prior experience with the open-source engine for large-scale data processing.

“Within our team, we had quite a bit of expertise already with Apache Spark,” Baker says. “That meant that we could set up streams very quickly based on the skills we already had.”

Today, the company processes 25,000 events per second, with more than 30 data staffers and 100 game titles in the system. Instead of taking 30 minutes to an hour to process, the data is ready within a minute.

“The volume of data collected has grown exponentially,” Baker says. In fact, after the pandemic hit, usage of some games doubled.

The new platform has also opened up new possibilities. For example, Sega Europe’s partnership with Twitch, a streaming platform where people watch other people play video games, has been enhanced to include a data stream for its Humankind game, so that viewers can get a player’s history, including the levels they completed, the battles they won, and the civilizations they conquered.

“The overlay on Twitch is updating as they play the game,” Baker says. “That is a use case that we wouldn’t have been able to achieve before Databricks.”

The company has also begun leveraging the lakehouse’s machine learning capabilities. For example, Sega Europe data scientists have designed models to figure out why players stop playing games and to make suggestions for how to increase retention.

“The speed at which these models can be built has been amazing, really,” Baker says. “They’re just cranking out these models, it seems, every couple of weeks.”

The business benefits of data lakehouses

The flexibility and catch-all nature of data lakehouses is fast proving attractive to organizations looking to capitalize on their data assets, especially as part of digital initiatives that hinge on quick access to a wide array of data.

“The primary value driver is the cost efficiencies enabled by providing a source for all of an organization’s structured and unstructured data,” says Steven Karan, vice president and head of insights and data at consulting company Capgemini Canada, which has helped implement data lakehouses at leading organizations in financial services, telecom, and retail.

Moreover, data lakehouses store data in such a way that it is readily available for use by a wide array of technologies, from traditional business intelligence and reporting systems to machine learning and artificial intelligence, Karan adds. “Other benefits include reduced data redundancy, simplified IT operations, a simplified data schema to manage, and easier-to-enable data governance.”

One particularly valuable use case for data lakehouses is in helping companies get value from data previously trapped in legacy or siloed systems. For example, one Capgemini enterprise customer, which had grown through acquisitions over a decade, couldn’t access valuable data related to resellers of their products.

“By migrating the siloed data from legacy data warehouses into a centralized data lakehouse, the client was able to understand at an enterprise level which of their reseller partners were most effective, and how changes such as referral programs and structures drove revenue,” he says.

Putting data into a single data lakehouse makes it easier to manage, says Meera Viswanathan, senior product manager at Fivetran, a data pipeline company. Companies that have traditionally used both data lakes and data warehouses often have separate teams to manage them, making it confusing for the business units that needed to consume the data, she says.

In addition to Databricks, Amazon Redshift Spectrum, and Snowflake, other vendors in the data lakehouse space include Microsoft, with its lakehouse platform Azure Synapse, and Google, with its BigLake on Google Cloud Platform, as well as data lakehouse platform Starburst.

Accelerating data processing for better health outcomes

One company capitalizing on these and other benefits of data lakehouses is life sciences analytics and services company IQVIA.

Before the pandemic, pharmaceutical companies running drug trials used to send employees to hospitals and other sites to collect data about things such as adverse effects, says Wendy Morahan, senior director of clinical data analytics at IQVIA. “That is how they make sure the patient is safe.”

Once the pandemic hit and sites were locked down, however, pharmaceutical companies had to scramble to figure out how to get the data they needed — and to get it in a way that was compliant with regulations and fast enough to enable them to spot potential problems as quickly as possible.

Moreover, with the rise of wearable devices in healthcare, “you’re now collecting hundreds of thousands of data points,” Morahan adds.

IQVIA has been building technology to do just that for the past 20 years, says her colleague Suhas Joshi, also a senior director of clinical data analytics at the company. About four years ago, the company began using data lakehouses for this purpose, including Databricks and the data lakehouse functionality now available with Snowflake.

“With Snowflake and Databricks you have the ability to store the raw data, in any format,” Joshi says. “We get a lot of images and audio. We get all this data and use it for monitoring. In the past, it would have involved manual steps, going to different systems. It would have taken time and effort. Today, we’re able to do it all in one single platform.”

The data collection process is also faster, he says. In the past, the company would have to write code to acquire data. Now, the data can even be analyzed without having to be processed first to fit a database format.

Take the example of a patient in a drug trial who gets a lab result that shows she’s pregnant, but the pregnancy form wasn’t filled out properly, and the drug is harmful during pregnancy. Or a patient who has an adverse event and needs blood pressure medication, but the medication was not prescribed. Not catching these problems quickly can have drastic consequences. “You might be risking a patient’s safety,” says Joshi.
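The cross-checks Joshi describes amount to safety rules evaluated over consolidated trial data. A hypothetical sketch of two such rules (field names are invented, not IQVIA's data model):

```python
def safety_flags(patient: dict) -> list:
    """Cross-check data sources for one trial patient and flag
    inconsistencies that need reviewer attention."""
    flags = []
    labs = patient.get("labs", {})
    forms = patient.get("forms", {})
    meds = patient.get("medications", [])

    # Rule 1: a positive pregnancy lab must have a matching form.
    if labs.get("pregnancy_test") == "positive" and not forms.get("pregnancy_form"):
        flags.append("positive pregnancy lab but no pregnancy form on file")

    # Rule 2: an adverse event requiring medication must match a prescription.
    for event in patient.get("adverse_events", []):
        if event.get("requires") and event["requires"] not in meds:
            flags.append(
                f"adverse event requires {event['requires']} but none prescribed"
            )
    return flags

patient = {
    "labs": {"pregnancy_test": "positive"},
    "forms": {},
    "adverse_events": [{"name": "hypertension", "requires": "bp_medication"}],
    "medications": [],
}
# safety_flags(patient) returns two flags, one per rule
```

Because the lakehouse keeps labs, forms, and prescriptions on one platform, rules like these can run continuously instead of through manual cross-system review.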
