Economic instability and uncertainty are the leading causes of technology budget decreases, according to the IDG/Foundry 2022 annual State of the CIO survey. Yet even as budgets tighten, data remains a key factor in business success – especially during economic uncertainty. According to the Harvard Business Review, data-driven companies have better financial performance, are more likely to survive, and are more innovative.[1]

So how do companies find this balance and create a cost-effective data stack that delivers real value to their business? A new survey from Databricks, Fivetran, and Foundry of more than 400 senior IT decision-makers in data analytics/AI roles at large global companies finds that 96% of respondents report negative business effects due to integration challenges. However, many IT and business leaders are discovering that modernizing their data stack overcomes those integration hurdles, providing the basis for a unified and cost-effective data architecture.

Building a performant & cost-effective data stack 

The Databricks, Fivetran, and Foundry report points to four investment priorities for data leaders: 

1. Automated data movement. Data pipelines are critical to modern data infrastructure: they ingest and move data from popular enterprise SaaS applications and from operational and analytic workloads to cloud-based destinations such as data lakehouses. As the volume, variety, and velocity of data grow, businesses need fully managed, secure, and scalable pipelines that automatically adapt as schemas and APIs change while continuously delivering high-quality, fresh data. Modernizing analytic environments with an automated data movement solution reduces operational risk, ensures high performance, and simplifies ongoing management of data integration. 
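For illustration, the short Python sketch below shows the kind of step an automated pipeline performs behind the scenes: it pulls new records from a hypothetical SaaS API and appends them to a lakehouse table, merging any new columns instead of failing when the source schema drifts. The endpoint, table path, and the choice of Spark with Delta Lake are assumptions for the example, not details from the report.

```python
# Minimal sketch: pull new records from a hypothetical SaaS API and append them
# to a lakehouse table, tolerating schema drift by merging new columns.
import requests
from pyspark.sql import Row, SparkSession

spark = SparkSession.builder.appName("saas_ingest").getOrCreate()

# Hypothetical REST endpoint and incremental cursor; a managed pipeline
# tracks this state (and retries, rate limits, etc.) for you.
resp = requests.get(
    "https://api.example-saas.com/v1/invoices",
    params={"updated_since": "2022-11-01T00:00:00Z"},
    timeout=30,
)
records = resp.json()["data"]  # assumed to be a list of JSON objects

if records:
    df = spark.createDataFrame([Row(**r) for r in records])
    (df.write
       .format("delta")                 # Delta Lake as the lakehouse table format
       .mode("append")
       .option("mergeSchema", "true")   # adapt automatically if the source added columns
       .save("/lakehouse/raw/invoices"))
```

A managed data movement service automates this loop – cursor tracking, retries, and schema handling – across many connectors rather than one hand-built script per source.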

2. A single system of insight. A data lakehouse incorporates integration tools that automate ELT, enabling data movement to a central location in near real time. By combining structured and unstructured data and eliminating separate silos, a single system of insight like the data lakehouse lets data teams handle all data types and workloads. This unified approach dramatically simplifies the data architecture and combines the best features of a data warehouse and a data lake, improving data management, security, and governance in a single architecture to increase efficiency and innovation. Finally, it supports all major data and AI workloads, making data more accessible for decision-making.

A unified data architecture results in a data-driven organization that gains BI, analytics, and AI/ML insights at speeds comparable to those of a data warehouse – an important differentiator for tomorrow’s winning companies. 
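As a rough sketch of what a single system of insight looks like in practice, the example below (with invented paths and table names) lands a structured sales extract and semi-structured clickstream events in the same open storage layer as Delta tables, then serves a BI-style SQL aggregate and an ML-ready DataFrame from that one copy of the data.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("lakehouse_demo").getOrCreate()

# Structured data (a warehouse-style sales extract) and semi-structured event
# data land in the same open storage layer as Delta tables.
(spark.read.option("header", True).option("inferSchema", True).csv("/landing/sales.csv")
      .write.format("delta").mode("overwrite").save("/lakehouse/sales"))
(spark.read.json("/landing/clickstream/*.json")
      .write.format("delta").mode("overwrite").save("/lakehouse/clickstream"))

# Register the sales table so analysts can hit it with plain SQL.
spark.sql("CREATE TABLE IF NOT EXISTS sales USING DELTA LOCATION '/lakehouse/sales'")

# A BI-style aggregate and an ML-ready DataFrame are served from the same data.
revenue_by_region = spark.sql(
    "SELECT region, SUM(amount) AS revenue FROM sales GROUP BY region")
features = spark.read.format("delta").load("/lakehouse/clickstream")
```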

3. Designed for AI/ML from the ground up. AI/ML is gaining momentum, as more than 80% of organizations are using or exploring the use of AI to stay competitive. “AI remains a foundational investment in digital transformation projects and programs,” says Carl W. Olofson, research vice president with IDC, who predicts worldwide AI spending will exceed $221B by 2025.[2] Despite that commitment, becoming a data-driven company fueled by BI, analytics, and AI insights is proving to be beyond the reach of many organizations that find themselves stymied by integration and complexity challenges. The data lakehouse solves this by providing a single solution for all major data workloads, from streaming analytics to BI, data science, and AI, and it empowers data science and machine learning teams to access, prepare, and explore data at scale.
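To illustrate that last point, here is a minimal sketch of a data scientist working directly against lakehouse data: a curated Delta table is read into pandas and a model is trained on it, with no export to a separate ML silo. The table path, column names, and libraries (the delta-rs Python bindings and scikit-learn) are assumptions for the example.

```python
# Rough sketch: read a curated lakehouse table straight into pandas and train
# a model on it. Table path and column names are illustrative assumptions.
from deltalake import DeltaTable                    # delta-rs Python bindings
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

df = DeltaTable("/lakehouse/churn_features").to_pandas()
X = df[["tenure_months", "monthly_spend", "support_tickets"]]
y = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)
model = RandomForestClassifier(n_estimators=200, random_state=42).fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))
```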

4. Solving the data quality issue. Data quality tools (59%) stand out as the most important technology for modernizing the data stack, according to IT leaders in the survey. Why is data quality so important? Traditionally, business intelligence (BI) systems enabled queries of structured data in data warehouses for insights. Data lakes, meanwhile, contained unstructured data retained for AI and machine learning (ML). However, maintaining siloed systems, or attempting to integrate them through complex workarounds, is difficult and costly. In a data lakehouse, metadata layers on top of open file formats improve data quality, while advances in query engines improve speed and performance. This serves the needs of both BI/analytics and AI/ML workloads, helping assure the accuracy, reliability, relevance, completeness, and consistency of data. 
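Dedicated data quality tools and the lakehouse’s metadata layer handle this at scale, but even a hand-rolled check conveys the idea: validate freshly landed rows against a few expectations and quarantine the failures before they reach dashboards or training sets. The column names and rules below are invented for the example.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_check").getOrCreate()
orders = spark.read.format("delta").load("/lakehouse/raw/orders")

# A few illustrative expectations: keys present, amounts sane, dates not in the future.
valid = (
    orders.filter(F.col("order_id").isNotNull())
          .filter(F.col("amount") >= 0)
          .filter(F.col("order_date") <= F.current_date())
)
rejected = orders.subtract(valid)

# Good rows flow to the curated zone; failures are quarantined for review.
valid.write.format("delta").mode("append").save("/lakehouse/curated/orders")
rejected.write.format("delta").mode("append").save("/lakehouse/quarantine/orders")

print(f"passed: {valid.count()}, quarantined: {rejected.count()}")
```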

According to the Databricks, Fivetran, and Foundry report, nearly two-thirds of IT leaders are using a data lakehouse, and more than four out of five say they’re likely to consider implementing one. At a moment when cost pressure is calling into question open-ended investments in data warehouses and data lakes, savvy IT leaders are responding by placing a high priority on modernizing their data stack. 

Download the full report to discover exclusive insights from IT leaders into their data pain points, how they plan to address them, and what roles they expect cloud and data lakehouses to play in their data stack modernization.

[1] https://mitsloan.mit.edu/ideas-made-to-matter/why-data-driven-customers-are-future-competitive-strategy

[2] Source: IDC’s Worldwide Artificial Intelligence Spending Guide, Feb V1 2022.

Data Architecture

Efficient supply chain operations are increasingly vital to business success, and for many enterprises, IT is the answer.

With over 2,000 suppliers and 35,000 components, Kanpur-based Lohia Group was facing challenges in managing its vendors and streamlining its supply chain. The capital goods company, which has been in textiles and flexible packaging for more than three decades, is a major supplier of end-to-end machinery for the flexible woven (polypropylene and high-density polyethylene) packaging industry.

“In the absence of an integrated system, there was no control on vendor supply, which led to an increased and unbalanced inventory,” says Jagdip Kumar, CIO of Lohia. “There was also a mismatch between availability of stock and customer deliveries. At the warehouse level, we had no visibility with respect to what inventory we had and where it was located.”

Those issues were compounded by the fact that the lead time for certain components required to fulfill customer orders ranges from four to eight months. With such long component delivery cycles, client requirements often change. “The customer would want a different model of the machine, which required different components. As we used Excel and email, we were unable to quickly make course correction,” Kumar says. 

Jagdip Kumar, CIO, Lohia Corp


Moreover, roughly 35% of the components involved in each customer order are customized based on the customer’s specific requirements. Long lead times and a lack of visibility at the supplier’s end meant procurement planning for these components was challenging, he says, adding that, in the absence of any ability to forecast demand, Lohia was often saddled with unbalanced (either excess or insufficient) inventory.

The solution? Better IT.

Managing suppliers to enhance efficiency and customer experience

To manage its inventory and create a win-win situation for the company and its suppliers, Kumar opted to implement a vendor management solution.

“The solution was conceptualized with the goal of removing the manual effort required during the procurement process by automating most of the tasks of the company and the supplier while providing the updates that the former needed,” says Kumar.

“We roped in KPMG to develop the vendor portal for us on this SAP platform, which is developed on SAP BTP (Business Technology Platform), a business-centric, open, and unified platform for the entire SAP ecosystem,” he says.

The application was developed using SAP Fiori/UI5, while the backend was built with SAP OData/ABAP services. The cloud-based front end is integrated with Lohia’s ERP system, providing all relevant information in real time. It took four months to implement the solution, which went live in September 2021.

With the new deployment, the company now learns of changes in real time, be it the non-availability of material, a customer not making a payment, or a customer wanting to delay delivery of an ordered machine. “All these changes now get communicated to the vendors, who prepone or postpone accordingly. Armed with complete visibility, we were able to reduce our inventory by 10%, which resulted in cost savings of around ₹200 million,” says Kumar.

The vendor portal has also automated several tasks such as schedule generation and gate entry, which have led to increases in productivity and efficiency.

“The schedules are now automatically generated through MRP [material requirement planning], giving our suppliers visibility for the next three to four months, which helps them plan their raw material requirements in advance and provide us timely material,” Kumar says. The result is a 15% reduction in material shortages and a 1.5X increase in productivity. “It has also helped us to give firmer commitments to our customers, and our customers’ deliveries have improved significantly, increasing customer trust,” he says.

“Earlier there was always a crowd at the gate, as the entry of each truck took 10-15 minutes. The new solution automatically picks up the consignment details when the vendor ships it. At the gate, only the barcode is scanned, and the truck is allowed entry. With 100 trucks coming in every day, we now save 200-300 minutes of precious time daily,” he says.

Kumar’s in-house development team worked in tandem with KPMG to build custom capabilities on the platform, such as automatic scheduling and FIFO (first in, first out) inventory valuation.
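FIFO valuation itself is simple enough to sketch in a few lines. The example below is not Lohia’s SAP implementation – just an illustration of the logic: stock issues consume the oldest receipt lots first and are valued at those lots’ costs, with the figures invented for the example.

```python
from collections import deque

# Each receipt lot is (quantity, unit_cost); the oldest lot sits at the left.
receipts = deque([(100, 52.0), (200, 55.5), (150, 57.0)])

def issue_fifo(lots: deque, qty: int) -> float:
    """Consume the oldest lots first and return the cost of goods issued."""
    cost = 0.0
    while qty > 0:
        lot_qty, unit_cost = lots[0]
        take = min(qty, lot_qty)
        cost += take * unit_cost
        qty -= take
        if take == lot_qty:
            lots.popleft()                       # lot fully consumed
        else:
            lots[0] = (lot_qty - take, unit_cost)  # partially consume the oldest lot
    return cost

print(issue_fifo(receipts, 250))   # 100*52.0 + 150*55.5 = 13525.0
```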

To ensure suppliers would adopt the solution, Lohia deployed its own team at each vendor’s premises for two to three days to teach them how to use the portal.

“We showcased the benefits that they could gain over the next two to three months by using the solution,” Kumar says. “We have been able to onboard 200 suppliers, who provide 80% of the components, on this portal. We may touch 90-95% by the end of this year.”

Streamlining warehouse operations to enhance productivity

At the company’s central warehouse in Kanpur, Kumar faced traceability issues related to its spare parts business. Also, stock was spread across multiple locations and most processes were manual, leading to inefficient and inaccurate spare parts dispatches.

“There were instances when a customer asked for 100 parts, and we supplied only 90 parts. There were also cases wherein a customer had asked for two different parts in different quantities, and we dispatched the entire quantity comprising only one part,” says Kumar. “Then there was the issue of preference. As we take all the payment upfront from our customers, our preference is to supply spare parts on a ‘first come, first served’ basis. However, there could be another customer whose factory was down because he was awaiting a part. We could not prioritize that customer’s delivery over others.”

Another bottleneck was that the contract workers were not literate and the company depended too heavily on their experience.

To overcome these problems, and to integrate its supply chain logistics with its warehouse and distribution processes, Lohia partnered with KPMG to deploy the SAP EWM (Extended Warehouse Management) application on the cloud.

“We decided to optimize the warehouse processes with the use of barcodes, QR codes, and Wi-Fi-enabled RF devices. There was also a need to synchronize warehouse activities through the integration of warehouse processes with tracking and traceability functions,” says Kumar. The implementation commenced on April 1, 2022, and the solution went live on August 1, 2022.

To achieve traceability, Kumar barcoded Lohia’s entire stock. “We now get a list from the system on the dispatchable order and its sequence. Earlier there was a lot of time wastage, as we didn’t know which part was kept in which portion of the warehouse. Employees no longer take the zig-zag path as the new solution provides the complete path and the sequence in which they must go and pick up the material,” Kumar says.

Kumar also implemented aATP (Advanced Available-to-Promise), which provides a response to order fulfilment inquiries in Sales and Production Planning. This feature within the EWM solution provides a check based on the present stock situation and any planned or anticipated stock receipts.
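The logic behind an available-to-promise check can be sketched independently of SAP: project on-hand stock plus planned receipts forward in time and test whether the requested quantity is covered by the requested date. The quantities and dates below are illustrative only, not Lohia’s data.

```python
from datetime import date

on_hand = 40
planned_receipts = [          # (expected_date, quantity) from purchase orders / production
    (date(2022, 9, 10), 30),
    (date(2022, 9, 25), 50),
]

def available_to_promise(qty_requested: int, need_by: date) -> bool:
    """Return True if on-hand stock plus receipts due by `need_by` cover the request."""
    projected = on_hand + sum(q for d, q in planned_receipts if d <= need_by)
    return projected >= qty_requested

print(available_to_promise(100, date(2022, 9, 20)))   # False: only 40 + 30 available
print(available_to_promise(100, date(2022, 9, 30)))   # True: 40 + 30 + 50 = 120
```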

“The outcome was as per expectations. There was improved inventory visibility across the warehouse as well as for in-transit stock. The EWM dashboard helped warehouse supervisors keep control over inbound, outbound, stock overview, resource management, and physical inventory,” says Kumar.

“Earlier one person used to complete only 30 to 32 parts in a day but after this implementation, the same person dispatches 47 to 48 parts in a day, which is a significant jump of 50% in productivity. The entire process has become 100% accurate with no wrong supply. If there is short supply, it is known to us in advance. There is also a 25% reduction in overall turnaround time in inbound and outbound processes,” he adds.

Supply Chain Management Software

Every organization pursuing digital transformation needs to optimize IT from edge to cloud to move faster and speed time to innovation. But the devil’s in the details. Each proposed IT infrastructure purchase presents decision-makers with difficult questions. What’s the right infrastructure configuration to meet our service level agreements (SLAs)? Where should we modernize — on-premises or in the cloud? And how do we demonstrate ROI in order to proceed?

There are no easy, straightforward answers. Every organization is at a different stage in the transformation journey, and each one faces unique challenges. The conventional approach to IT purchasing decisions has been overwhelmingly manual: looking through spreadsheets, applying heuristics, and trying to understand all the complex dependencies of workloads on underlying infrastructure.

Partners and sellers are similarly constrained. They must provide a unique solution for each customer with little to no visibility into a prospect’s IT environment. This has created an IT infrastructure planning and buying process that is inaccurate, time-consuming, wasteful, and inherently risky from the perspective of meeting SLAs.

Smarter solutions make for smarter IT decisions

It’s time to discard legacy processes and reinvent IT procurement with a new approach that leverages the power of data-driven insights. For IT decision makers and their partners and sellers, a modern approach involves three essential steps to optimize procurement — and accelerate digital transformation:

1. Understand your VM needs

Before investing in infrastructure modernization, it’s critical to get a handle on your current workloads. After all, you must have a clear understanding of what you already have before deciding on what you need. To reach that understanding, enterprises, partners, and sellers should be able to collect and analyze fine-grained resource utilization data per virtual machine (VM) — and then leverage those insights to precisely determine the resources each VM needs to perform its job.

Why is this so important? VM admins often select from a menu of different-sized VM templates when they provision a workload. They typically do so without access to data — which can lead to slowed performance due to under-provisioning, or to wasted capacity and oversubscribed hosts if they choose an oversized template. It’s essential to right-size your infrastructure plan before proceeding.
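A simple way to picture the analysis is the sketch below: given per-VM utilization samples (expressed as a percent of what is currently allocated), size each VM to its 95th-percentile demand plus some headroom. The sample data, headroom factor, and percentile choice are assumptions for illustration, not how any particular product computes its recommendations.

```python
import math

# Per-VM utilization samples as a percent of currently allocated resources.
# In practice these are fine-grained time series; the values here are illustrative.
samples = {
    "vm-web-01": {"cpu_pct": [12, 18, 22, 15, 30, 25], "mem_pct": [40, 42, 45, 44, 47, 46]},
    "vm-db-01":  {"cpu_pct": [70, 85, 92, 88, 95, 90], "mem_pct": [80, 83, 86, 88, 90, 91]},
}
allocated = {
    "vm-web-01": {"vcpu": 8, "mem_gb": 32},
    "vm-db-01":  {"vcpu": 4, "mem_gb": 16},
}

def p95(values):
    """95th percentile of a small sample (nearest-rank method)."""
    ranked = sorted(values)
    return ranked[math.ceil(0.95 * len(ranked)) - 1]

for vm, util in samples.items():
    cpu_need = allocated[vm]["vcpu"] * p95(util["cpu_pct"]) / 100
    mem_need = allocated[vm]["mem_gb"] * p95(util["mem_pct"]) / 100
    # Recommend the observed peak need plus ~20% headroom, rounded up.
    print(vm,
          "recommended vCPU:", math.ceil(cpu_need * 1.2),
          "recommended mem GB:", math.ceil(mem_need * 1.2))
```

Run against these sample numbers, the web VM turns out to be heavily over-provisioned while the database VM actually needs more than it has — exactly the kind of insight template-based provisioning misses.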

2. Model and price infrastructure with accuracy

Any infrastructure purchase requires a budget, or at least an understanding of how much money you intend to spend. To build that budget, an ideal IT procurement solution should provide an overview of your inventory, including aggregate information on storage, compute, virtual resource allocation, and configuration details. It should also provide a simulator for on-premises IT that lets you input your actual costs of storage, hosts, and memory. Bonus points for the ability to customize your estimate with a depreciation term, as well as options for third-party licensing and hypervisor and environmental costs.

Taken together, these capabilities will tell you how much money you’re spending to meet your needs — and help you to avoid overpaying for infrastructure.
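The sketch below shows the shape of such an estimate: a back-of-the-envelope annual on-premises cost built from hardware spread over a depreciation term, plus storage, hypervisor licensing, and environmental costs. Every figure is an assumed input, not a quoted price.

```python
# Back-of-the-envelope on-premises cost model of the kind such a simulator automates.
hosts            = 12
host_price       = 28_000      # $ per server
depreciation_yrs = 4           # straight-line depreciation term
storage_tb       = 200
storage_per_tb   = 180         # $ per TB per year (arrays + support)
hypervisor_lic   = 6_500       # $ per host per year
environmental    = 1_200       # $ per host per year (power, cooling, space)

annual_cost = (
    hosts * host_price / depreciation_yrs   # hardware, spread over its useful life
    + storage_tb * storage_per_tb
    + hosts * (hypervisor_lic + environmental)
)
print(f"estimated annual on-prem cost: ${annual_cost:,.0f}")
```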

3. Optimize workloads across public and private clouds

Many IT decision makers wonder about the true cost of running particular applications in the public cloud versus keeping them on-premises. Public cloud costs often start out attractively low but can increase precipitously as usage and data volumes grow. As a result, it’s vital to have a clear understanding of cost before deciding where workloads will live. A complete cost estimate involves identifying the ideal configurations for compute, memory, storage, and network when moving apps and data to the cloud.

To do this, your organization and your partners and sellers need a procurement solution that can map your entire infrastructure against current pricing and configuration options from leading cloud providers. This enables you to make quick, easy, data-driven decisions about the costs of running applications in the cloud based on the actual resource needs of your VMs.
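Conceptually, that mapping can be reduced to the sketch below: take the right-sized requirements from step 1, find the cheapest catalog instance that fits each VM, and total the run rate. Instance names and hourly rates are placeholders rather than real provider pricing.

```python
# Sketch of mapping right-sized VM requirements onto a cloud price list.
catalog = [
    {"name": "gp.medium",  "vcpu": 2, "mem_gb": 8,  "usd_hr": 0.10},
    {"name": "gp.large",   "vcpu": 4, "mem_gb": 16, "usd_hr": 0.19},
    {"name": "gp.xlarge",  "vcpu": 8, "mem_gb": 32, "usd_hr": 0.38},
    {"name": "mem.xlarge", "vcpu": 8, "mem_gb": 64, "usd_hr": 0.55},
]

# Right-sized needs per VM (e.g., the output of the step-1 analysis), also illustrative.
vm_needs = {"vm-web-01": {"vcpu": 3, "mem_gb": 19}, "vm-db-01": {"vcpu": 5, "mem_gb": 18}}

HOURS_PER_MONTH = 730
total = 0.0
for vm, need in vm_needs.items():
    fitting = [i for i in catalog
               if i["vcpu"] >= need["vcpu"] and i["mem_gb"] >= need["mem_gb"]]
    best = min(fitting, key=lambda i: i["usd_hr"])   # cheapest instance that fits
    monthly = best["usd_hr"] * HOURS_PER_MONTH
    total += monthly
    print(f"{vm}: {best['name']} at ~${monthly:,.0f}/month")
print(f"estimated cloud run rate: ~${total:,.0f}/month")
```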

And, since you’ve already right-sized your infrastructure (step 1), you won’t have to worry about moving idle resources to the cloud and paying for capacity you don’t need.

HPE leads the way in modern IT procurement

HPE has transformed the IT purchasing experience with a simple procurement solution delivered as a service: HPE CloudPhysics. Part of the HPE GreenLake edge-to-cloud platform, HPE CloudPhysics continuously monitors and analyzes your IT infrastructure, models that infrastructure as a virtual environment, and provides cost estimates of cloud migrations. Since it’s SaaS, there’s no hardware or software to deal with — and no future maintenance.

HPE CloudPhysics is powered by some of the most granular data capture in the industry, with over 200 metrics for VMs, hosts, data stores, and networks. With insights and visibility from HPE CloudPhysics, you and your sellers and partners can seamlessly collaborate to right-size infrastructure, optimize application workload placement, and lower costs. Installation takes just minutes, with insights generated in as little as 15 minutes.

Across industries, HPE CloudPhysics has already collected more than 200 trillion data samples from more than one million VM instances worldwide. With well over 4,500 infrastructure assessments completed, HPE CloudPhysics already has a proven record of significantly increasing the ROI of infrastructure investments.

This is the kind of game-changing solution you’re going to need to transform your planning and purchasing experience — and power your digital transformation.

____________________________________

About Jenna Colleran

HPE

Jenna Colleran is a Worldwide Product Marketing Manager at HPE. With over six years in the storage industry, Jenna has worked in primary storage and cloud storage, most recently in cloud data and infrastructure services. She holds a Bachelor of Arts degree from the University of Connecticut.

Cloud Management, HPE, IT Leadership