Understanding the student lifecycle isn’t easy. With more higher education institutions attempting to embrace digital learning, there is a growing need for visibility throughout the student journey. By gathering data across every student, faculty and alumni touchpoint, institutions can optimise each stage of the admission and onboarding process. 

The appetite for insights among higher education institutions is such that the global market for big data analytics in education is expected to grow from $18.02 billion in 2022 to $36.12 billion by 2027.

Unfortunately, many institutions remain reliant on legacy solutions with siloed data, which forces ad hoc manual work that slows the process of attracting and nurturing prospects.

Automation will play a key role in enabling providers to implement a data-first approach, better supporting prospects and recruitment teams to ensure the student lifecycle runs as smoothly as possible.

The problem with legacy tools in higher education 

Most higher education institutions today rely on legacy middleware they are familiar with but that offers no visibility over the student lifecycle. These solutions make it difficult to access student records, accommodation and financial data, and third-party or cloud platforms.

Data is also isolated and siloed in on-premises solutions, making it difficult to generate insights and optimise the student experience. 

To generate concrete insights, data needs to be collected at the network edge and across campus and fed into a centralised analytics solution, where it can be processed to reveal how to improve operations over the long term.

How Boomi addresses these challenges 

The answer for these organisations is to undergo digital transformation by migrating datasets to the cloud. Ultimately, this will generate concrete insights to enhance the experience for students and faculty.

While this transition is already underway, with 54.3% of higher education institutions reporting they were cloud-based in 2021, there are many that still need to migrate to the cloud. 

Integration platform as a service (iPaaS) solutions like the Boomi AtomSphere Platform can help enable this transition by unifying application data to ensure insights are accessible throughout the environment via a single cloud platform. 

Essentially, Boomi offers organisations the ability to connect data from a variety of sources, helping with the process of migrating data to the cloud and connecting data sources wherever they may be.  

Connecting data allows decision makers to generate the insights needed to make faster admissions decisions and to streamline the onboarding experience for prospects and recruitment teams.

The easy way to move to the cloud 

Boomi has emerged as a key provider in enabling higher education institutions to move to the cloud. Boomi supports Amazon Web Services (AWS) data migration and application modernisation to link data, systems, applications, processes, and people together as part of a cohesive ecosystem. 

This approach enables higher education institutions to leverage a growing number of services through AWS, simplify data pipelines and improve transparency for decision makers. 

Ultimately, by providing decision makers with access to high-quality data, institutions will not only increase the quality of the student experience but also become more cost efficient by maximising retention.

To find out more about Boomi click here.



The successful journey to cloud adoption for Banking, Financial Services, and Insurance (BFSI) enterprises cannot be completed without addressing the complexities of core business systems. Many businesses have been able to migrate corporate support systems – such as ERP and CRM, as well as IT security and infrastructure systems – to the public cloud. However, security concerns, legacy architecture, country-specific regulations, latency requirements, and transition challenges continue to keep core systems out of the cloud.

BFSI enterprises will be unable to realize the full potential of the cloud until their core business systems use cloud platforms and services. Firms are looking for solutions that let them continue operating out of their own data centers while gaining access to cloud shared infrastructure delivered into those data centers.

To address these challenges, leading cloud service providers have launched hybrid integrated solution offerings that allow enterprises to access cloud services from their respective data centers via shared infrastructure provided by cloud providers. These allow enterprises to deploy their applications on either the cloud shared infrastructure or on their own data centers without having to rewrite the code.

Enterprises have two options: run applications directly on the cloud, or run compute and storage on-premises using the same APIs. To provide a consistent experience across on-premises and cloud environments, the on-premises cloud solution is linked to the nearest cloud service provider region. As with public cloud services, the cloud infrastructure, services, and updates are managed by the cloud service provider.

AWS Outposts is a leading hybrid integrated solution that provides enterprises with seamless public cloud services in their own data centers. Outposts is a managed AWS service that includes compute and storage. It gives enterprises the option to stay close to the data center, since many BFSI core systems require very low latency, and to the ecosystem of business applications residing in on-premises data centers.

AWS Outposts will deliver value to BFSI enterprises

Several BFSI enterprises use appliance-based databases for high-performance, high-availability computing. In the short and medium term, it is unlikely that these enterprises will migrate their appliance-based databases to the cloud; however, AWS provides an option to run these systems on Outposts while keeping the databases on the appliances. Outposts also assists in migrating databases from proprietary, expensive operating systems and hardware to more cost-effective hardware options.

Other use cases, such as commercial off-the-shelf BFSI products that require high-end servers, can be easily moved to AWS Outposts, lowering the total cost of ownership. As a strategy, legacy monolithic core applications that require reengineering can be easily moved to AWS Outposts first and then modernized incrementally onto the public cloud.

A unified hybrid cloud system is the way forward for BFSI enterprises

AWS Outposts offers BFSI enterprises a solution that combines public and private infrastructure, consistent service APIs, and centralized management interfaces. The service can help BFSI enterprises deal with the many appliance-based core systems that run on expensive, proprietary, vendor-provided operating systems and hardware.

AWS Outposts will allow BFSI enterprises to gradually migrate to the public cloud while maintaining core application dependencies, enabling a true hybrid cloud for BFSI.

Author Bio

Asim Kar has 25+ years of IT experience spanning large-scale transformation programs and running technology organizations in BFSI, in the migration and reengineering space. He currently heads the cloud technology focus group in BFSI and leads complex transformation projects in traditional technologies across telecom, insurance, banking, and financial services programs.

Ph: +91 9841412619
E-mail: asim.kar@tcs.com

To learn more, visit us here.


Ask anyone who’s lost an online auction or abandoned a shopping cart: in today’s hybrid and multi-cloud world, speed matters for transactions and business decisions. Nanoseconds count.

As cloud migration continues apace, accessing and using the data that runs and informs your applications has become a challenge for organizations of all sizes. Search company Elastic takes on the challenges of data availability, observability, and security head on.

Defining the challenge

For a typical data team, 80% of time is spent on data discovery, preparation, and protection, and only 20% of time is spent on actual analytics and getting to insight, says IDC. Data silos, legacy data management tools and skill sets, and the impact of the COVID-19 pandemic all have hobbled organizations’ efforts to unify, share and analyze data for forecasting and better business decisions.

“It’s astonishing how much inefficiency exists across the industry,” says Brian Bergholm, Senior Marketing Manager in the Cloud Product Marketing team at Elastic. “In fact, based on some recent survey work that Elastic conducted with Wakefield Research, we found that 81% of knowledge workers say they have a hard time finding documents under pressure.”[1] 

That inefficiency has some hard costs associated with it, Bergholm says, which are manifested in three trends noted by Elastic.

“One, information is becoming harder to find, and this inability to find information actually costs the average enterprise $2.5 million per year,” he says. “Second, enterprise IT is becoming harder to keep performant, and system downtime costs the average enterprise $1.5 million per hour. Third, cyber threats are becoming harder to prevent, and these data breaches cost the average enterprise $3.8 million per incident. Adding these all together accrues to millions of dollars of unnecessary costs.”

Components for data success

To tackle these costly inefficiencies, organizations are turning increasingly to an integrated approach as opposed to a suite of point solutions. Bergholm points to three advantages of an integrated solution: speed, scale, and relevance.

“From a speed standpoint, you can find matches in milliseconds within both structured and unstructured datasets,” says Bergholm. “You can scale massively and horizontally across literally hundreds of systems, and the most important aspect is that you can generate highly relevant results and actionable insights from your data.”
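For readers curious what such a search call looks like in practice, here is a minimal sketch built around Elasticsearch's multi_match query from the Query DSL. The index name, field names, and endpoint are hypothetical, and the network call itself is shown only in a comment so the snippet stays self-contained:

```python
# Build a multi_match query (Elasticsearch Query DSL) that searches
# structured and unstructured fields at once. Field and index names
# here are invented for illustration.

def build_search_body(text, fields=("title", "body")):
    """Return a request body asking for the top 10 matches."""
    return {
        "size": 10,
        "query": {
            "multi_match": {
                "query": text,
                "fields": list(fields),
            }
        },
    }

body = build_search_body("quarterly revenue forecast")

# With the official Python client, this body would be sent roughly as:
#   from elasticsearch import Elasticsearch
#   es = Elasticsearch("http://localhost:9200")
#   hits = es.search(index="documents", body=body)
print(body)
```

The same body shape works whether the target index holds log lines, documents, or security events, which is what makes one query language usable across search, observability, and security.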

Integrated solutions also increasingly take advantage of technologies like artificial intelligence and machine learning, he says.

“We’ve also built machine learning into the suite, and these capabilities can be leveraged across all three solution areas: search, observability, and security,” he says.

The security mandate

When talking about data challenges, security is a prime consideration. There are two sides to the coin: data security, and leveraging data for security intelligence.

First, any search solution itself must be secure and compliant. “Elastic takes data sovereignty very seriously,” says Bergholm. “We’ve invested to ensure Elastic is operating in compliance with the principles of GDPR, and, in fact, Elastic Cloud is available in 17 Google Cloud regions. This allows you to place applications where the data lives and supports local data sovereignty and governance requirements.”

Second, an integrated search approach can be applied to the security data that’s collected routinely by organizations.

“By using advanced search analytics, you can leverage petabytes of data, enriched with threat intelligence, to glean the insights you need to protect your organization,” says Bergholm. “Search also helps mitigate cyber threats by correlating diverse data to expose unfolding attacks. We use machine learning algorithms, natural language processing capabilities, and other tools to better understand context and meaning from a wider array of data types and formats, and all of this helps your SecOps teams quickly identify issues.”


The meager supply and high salaries of data scientists have led many companies to a decision totally in keeping with artificial intelligence: automate whatever is possible. Case in point is machine learning. A Forrester study found that automated machine learning (AutoML) has been adopted by 61% of data and analytics decision makers in companies using AI, with another 25% of companies saying they’ll do so in the next year.

Automated machine learning (AutoML) automates repetitive and manual machine learning tasks. That’s no small thing, especially when data scientists and data analysts now spend a majority of their time cleaning, sourcing, and preparing data. AutoML allows them to outsource these tasks to machines to more quickly develop and deploy AI models. 

If your company is still hesitating to adopt AutoML, here are some very good reasons to deploy it sooner rather than later.

1. AutoML Super Empowers Data Scientists

AutoML transfers data to a training algorithm. It then searches for the best neural network for each desired use case. Results can be generated within 15 minutes instead of hours. Deep neural networks in particular are notoriously difficult for a non-expert to tune properly. AutoML automates the process of training a large selection of deep learning and other types of candidate models. 

With AutoML, data scientists can say goodbye to repetitive, tedious, time-consuming tasks. They can iterate faster and explore new approaches to what they’re modeling. The ease of use of AutoML allows more non-programmers and senior executives to get involved in conceiving and executing projects and experiments.

2. AutoML Can Have Big Financial Benefits

With automation comes acceleration. Acceleration can be monetized. 

Companies using AutoML have experienced increased revenue and savings from their use of the technology. A healthcare organization saved $2 million per year from reducing nursing hours and $10 million from reduced patient stays. A financial services firm saw revenue climb 1.5-4% by using AutoML to handle pricing optimization.

3. AutoML Improves AI Development Efforts

AutoML simplifies the process of choosing and optimizing the best algorithm for each machine learning model. The technology selects from a wide array of choices (e.g., decision trees, logistic regression, gradient boosted trees) and automatically optimizes the model. It then transfers data to each training algorithm to help determine the optimal architecture. Automating ML modeling also reduces the risk of human error.
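The selection loop described above can be sketched with ordinary scikit-learn primitives rather than any specific AutoML product. This toy example (dataset and candidate list chosen purely for illustration) cross-validates three of the algorithm families mentioned and keeps the best performer:

```python
# Toy version of AutoML model selection: cross-validate several
# candidate algorithms on a sample dataset and keep the best one.
# (Real AutoML also tunes hyperparameters, preprocessing, and more.)
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=5000),
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "gradient_boosted_trees": GradientBoostingClassifier(random_state=0),
}

# Score each candidate with 5-fold cross-validation.
scores = {
    name: cross_val_score(model, X, y, cv=5).mean()
    for name, model in candidates.items()
}
best_name = max(scores, key=scores.get)
print(best_name, round(scores[best_name], 3))
```

An AutoML product wraps this loop, extends the search to architectures and hyperparameters, and surfaces the winner automatically.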

One company reduced time-to-deployment of ML models by a factor of 10 over past projects. Others boosted lead scoring and prediction accuracy and reduced engineering time. Using ML models created with AutoML, customers have reduced customer churn, reduced inventory carryovers, improved email opening rates, and generated more revenue.

4. AutoML is Great at Many Use Cases

Use cases where AutoML excels include risk assessment in banking, financial services, and insurance; cybersecurity monitoring and testing; chatbot sentiment analysis; predictive analytics in marketing; content suggestions by entertainment firms; and inventory optimization in retail. AutoML is also being put to work in healthcare and research environments to analyze and develop actionable insights from large data sets.

AutoML is being used effectively to improve the accuracy and precision of fraud detection models. One large payments company improved the accuracy of their fraud detection model from 89% to 94.7% and created and deployed fraud models 6 times faster than before. Another company that connects retailers with manufacturers reduced false positive rates by 55% and sped up deployment of models from 3-4 weeks to 8 hours. 

A Booming Market for AutoML

The global AutoML market is booming, with revenue of $270 million in 2019 and predictions that the market will approach $15 billion by 2030, a CAGR of 44%. A report by P&S Intelligence summed up the primary areas of growth for the automation technology: “The major factors driving the market are the burgeoning requirement for efficient fraud detection solutions, soaring demand for personalized product recommendations, and increasing need for predictive lead scoring.”

Experts caution that AutoML is not going to replace data scientists any time soon. It is merely a powerful tool that accelerates their work and allows them to develop, test, and fine-tune their strategies. With AutoML, more people can participate in AI and ML projects, bringing their understanding of their data and business and letting automation do much of the drudgery.

The Easy Button

Whether you’re just getting started or you’ve been doing AI, ML, and DL for some time, Dell Technologies can help you capitalize on the latest technological advances, making AI simpler and speeding time to insight with proven Validated Designs for AI.

Validated Designs for AI are jointly engineered and validated to make it quick and easy to deploy a hardware-software stack optimized to accelerate AI initiatives. These integrated solutions leverage H2O.ai for automatic machine learning. NVIDIA AI Enterprise software can increase data scientist productivity, while VMware® vSphere with Tanzu simplifies IT operations. Customers report that Validated Designs enable 18–20% faster configuration and integration, save 12 employee hours a week with automated reconciliation feeds, and reduce support requirements by 25%.

Validated Designs for AI speed time to insight with automatic machine learning, MLOps and a comprehensive set of AI tools. Dell PowerScale storage improves AI model training accuracy with fast access to larger data sets, enabling AI at scale to drive real‑time, actionable responses. VxRail enables 44% faster deployment of new VMs, while Validated Designs enable 18x faster AI models.

You can confidently deploy an engineering‑tested AI solution backed by world‑class Dell Technologies Services and support for Dell Technologies and VMware solutions. Our worldwide Customer Solution Centers with AI Experience Zones enable you to leverage engineering expertise to test and optimize solutions for your environments. Our expert consulting services for AI help you plan, implement and optimize AI solutions, while more than 35,000 services experts can meet you where you are on your AI journey. 

AI for AI is here, making it easier and faster than ever to scale AI success. For more information, visit Dell Artificial Intelligence Solutions.  


Intel® Technologies Move Analytics Forward

Data analytics is the key to unlocking the most value you can extract from data across your organization. To create a productive, cost-effective analytics strategy that gets results, you need high performance hardware that’s optimized to work with the software you use.

Modern data analytics spans a range of technologies, from dedicated analytics platforms and databases to deep learning and artificial intelligence (AI). Just starting out with analytics? Ready to evolve your analytics strategy or improve your data quality? There’s always room to grow, and Intel is ready to help. With a deep ecosystem of analytics technologies and partners, Intel accelerates the efforts of data scientists, analysts, and developers in every industry. Find out more about Intel advanced analytics.


Whether the cause is a disability, long illness, psychological challenges, or parental choice, many children find themselves facing the daunting task of acquiring an education at home.

During the global pandemic, homeschooling issues became a particular concern, as teachers and parents attempted to use their time and energy most productively and vulnerable students worried about falling behind, among other anxieties.

On a practical level, homeschooling can in some cases eat up 10% of a school’s budget even when fewer than 1% of its pupils require the service.

As the world shut down, international technology company NTT DATA Business Solutions attempted to remedy apprehensions by creating a digital teaching engine to help children learn, teachers teach, and parents homeschool.

Not only would the Artificial Intelligence (AI) Learning Helper assist students in improving reading and other skills, the new platform would also manage to meet the emotional needs of each child.

Human avatars

From its headquarters in Bielefeld, Germany, NTT DATA assists a variety of industries, including chemical, pharmaceutical, wholesale, and consumer products, always searching for new places to innovate.

The company also works closely with the Media Lab at the Massachusetts Institute of Technology (MIT), drawing research from such disciplines as technology, media, science, art, and design.

“The possibilities of artificial intelligence fascinate us,” noted Thomas Nørmark, NTT DATA’s global head of AI and robotics.

Previously, the company’s “digital human platform” allowed NTT DATA to develop avatars for receptionists, sales associates, shop floor assistants, and car vendors, among others.

Those experiences would prove invaluable in the development of the avatars that would both teach children and interact with them in a personalized way.

Emotional fluency

Turning to enterprise resource planning software leader SAP for its foundation, NTT DATA used several SAP solutions to bring the AI Learning Helper to life: SAP Data Intelligence for AI and machine learning (ML), SAP Analytics Cloud to amass data about individual students’ learning progress, and SAP Conversational AI to manage the conversations between the Learning Helper and the students.

These allowed the platform to utilize AI specialties like body language detection, emotional feedback, micro-expression recognition – which registers facial expressions that sometimes last only 1/30th of a second – and summarization algorithms that create new phrases to relay information in language every student could grasp.

Each screen would be the equivalent of a virtual buddy who could understand a pupil’s changing emotions – detecting whether he or she were frustrated or unmotivated – and patiently adjust to the situation.

At times, the Learning Helper would conclude, a child simply needed to engage in small talk for a few minutes before turning back to the lesson.

The innovation would provide much needed relief for parents and teachers who are not always able to exhibit the same type of sensitivity when they are dealing with so many other obligations.

Tracking for success

The app was deployed in January 2021, with the Danish municipality of Toender’s school district and a British hospital school becoming the first places to use the AI Learning Helper.

Students discovered that they could access the platform at any time on any device, and there was no limit on how long a session could last.

In addition to assisting pupils with vocabulary, pronunciation, and story comprehension, the avatar generated and answered questions.

Through classwork, as opposed to testing, the solution could track each child’s progress, communicate that information to parents and teachers, and come up with lessons tailored to areas where the student could improve.

Participating schools noted that estimated homeschooling costs decreased by 50%, leading to a 5% reduction in the overall budget.

For creating a personalized virtual helper to bring out student strengths and alleviate both loneliness and frustration, NTT DATA Business Solutions received a 2022 SAP Innovation Award – part of an annual ceremony rewarding organizations using SAP technologies to improve the world.

You can read all about what NTT DATA did to win this coveted award, and how, in their Innovation Awards pitch deck.

As developing nations gain greater access to the Internet and education becomes more democratized, the company plans to use the AI Learning Helper to teach thousands more.


Anil Bhatt, Global Chief Information Officer at Elevance Health, joins host Maryfran Johnson for this CIO Leadership Live interview, jointly produced by CIO.com and the CIO Executive Council. They discuss using AI in predictive healthcare, blockchain collaborations, the future of digital healthcare, global innovation trends and more.



Artificial intelligence (AI) is one, if not the, key technology of our decade. Technological advances in this field are not only fundamentally changing our economies, industries, and markets, but are also exerting enormous influence on traditional business practices, many of which will disappear, while others will be transformed or completely reinvented.


What is business analytics?

Business analytics is the practical application of statistical analysis and technologies on business data to identify and anticipate trends and predict business outcomes. Research firm Gartner defines business analytics as “solutions used to build analysis models and simulations to create scenarios, understand realities, and predict future states.”

While quantitative analysis, operational analysis, and data visualizations are key components of business analytics, the goal is to use the insights gained to shape business decisions. The discipline is a key facet of the business analyst role.

Wake Forest University School of Business notes that key business analytics activities include:

- Identifying new patterns and relationships with data mining
- Using quantitative and statistical analysis to design business models
- Conducting A/B and multivariable testing based on findings
- Forecasting future business needs, performance, and industry trends with predictive modeling
- Communicating findings to colleagues, management, and customers
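The A/B-testing activity in that list reduces, in its simplest form, to a significance test on conversion counts. Here is a sketch with invented numbers, using scipy's chi-squared contingency test as a stand-in for whatever tooling a team actually uses:

```python
# Compare conversion rates of two page variants with a chi-squared
# test of independence. All counts below are invented.
from scipy.stats import chi2_contingency

#            converted, not converted
variant_a = [120, 880]   # 12.0% conversion
variant_b = [150, 850]   # 15.0% conversion

chi2, p_value, dof, _ = chi2_contingency([variant_a, variant_b])
print(f"p-value: {p_value:.3f}")  # below 0.05 would suggest a real difference
```

With more than two variants or additional factors, the same contingency-table approach extends to multivariable testing.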


What are the benefits of business analytics?

Business analytics can help you improve operational efficiency, better understand your customers, project future outcomes, glean insights to aid in decision-making, measure performance, drive growth, discover hidden trends, generate leads, and scale your business in the right direction, according to digital skills training company Simplilearn.


What is the difference between business analytics and data analytics?

Business analytics is a subset of data analytics. Data analytics is used across disciplines to find trends and solve problems using data mining, data cleansing, data transformation, data modeling, and more. Business analytics also involves data mining, statistical analysis, predictive modeling, and the like, but is focused on driving better business decisions.


What is the difference between business analytics and business intelligence?

Business analytics and business intelligence (BI) serve similar purposes and are often used as interchangeable terms, but BI can be considered a subset of business analytics. BI focuses on descriptive analytics, data collection, data storage, knowledge management, and data analysis to evaluate past business data and better understand currently known information. Whereas BI studies historical data to guide business decision-making, business analytics is about looking forward. It uses data mining, data modeling, and machine learning to answer “why” something happened and predict what might happen in the future.

Business analytics techniques

According to Harvard Business School Online, there are three primary types of business analytics:

- Descriptive analytics: What is happening in your business right now? Descriptive analytics uses historical and current data to describe the organization’s present state by identifying trends and patterns. This is the purview of BI.
- Predictive analytics: What is likely to happen in the future? Predictive analytics is the use of techniques such as statistical modeling, forecasting, and machine learning to make predictions about future outcomes.
- Prescriptive analytics: What do we need to do? Prescriptive analytics is the application of testing and other techniques to recommend specific solutions that will deliver desired business outcomes.

Simplilearn adds a fourth technique:

- Diagnostic analytics: Why is it happening? Diagnostic analytics uses analytics techniques to discover the factors or reasons for past or current performance.
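A toy example makes the descriptive/predictive split concrete: descriptive analytics summarises the sales history below, while predictive analytics fits a least-squares trend and extrapolates one period ahead (the figures are invented):

```python
# Descriptive vs. predictive analytics on invented monthly sales.
monthly_sales = [100, 104, 109, 115, 120, 126]
n = len(monthly_sales)

# Descriptive: what happened? Summarise the existing data.
average = sum(monthly_sales) / n
growth = monthly_sales[-1] - monthly_sales[0]

# Predictive: what is likely next? Least-squares trend, one step ahead.
xs = range(n)
x_mean = sum(xs) / n
slope = sum((x - x_mean) * (y - average) for x, y in zip(xs, monthly_sales))
slope /= sum((x - x_mean) ** 2 for x in xs)
intercept = average - slope * x_mean
forecast = intercept + slope * n

print(round(average, 1), growth, round(forecast, 1))
```

Prescriptive analytics would then ask what action (pricing, promotion, inventory) to take given that forecast.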

Examples of business analytics

San Jose Sharks build fan engagement

Starting in 2019, the San Jose Sharks began integrating its operational data, marketing systems, and ticket sales with front-end, fan-facing experiences and promotions to enable the NHL hockey team to capture and quantify the needs and preferences of its fan segments: season ticket holders, occasional visitors, and newcomers. It uses the insights to power targeted marketing campaigns based on actual purchasing behavior and experience data. When implementing the system, Neda Tabatabaie, vice president of business analytics and technology for the San Jose Sharks, said she anticipated a 12% increase in ticket revenue, a 20% projected reduction in season ticket holder churn, and a 7% increase in campaign effectiveness (measured in click-throughs).

GSK finds inventory reduction opportunities

As part of a program designed to accelerate its use of enterprise data and analytics, pharmaceutical titan GlaxoSmithKline (GSK) designed a set of analytics tools focused on inventory reduction opportunities across the company’s supply chain. The suite of tools included a digital value stream map, safety stock optimizer, inventory corridor report, and planning cockpit.

Shankar Jegasothy, director of supply chain analytics at GSK, says the tools helped GSK gain better visibility into its end-to-end supply chain and then use predictive and prescriptive analytics to guide decisions around inventory and planning.

Kaiser Permanente streamlines operations

Healthcare consortium Kaiser Permanente uses analytics to reduce patient waiting times and the amount of time hospital leaders spend manually preparing data for operational activities.

In 2018, the consortium’s IT function launched Operations Watch List (OWL), a mobile app that provides a comprehensive, near real-time view of key hospital quality, safety, and throughput metrics (including hospital census, bed demand and availability, and patient discharges).

In its first year, OWL reduced patient wait time for admission to the emergency department by an average of 27 minutes per patient. Surveys also showed hospital managers reduced the amount of time they spent manually preparing data for operational activities by an average of 323 minutes per month.

Business analytics tools

Business analytics professionals need to be fluent in a variety of tools and programming languages. According to the Harvard Business Analytics program, the top tools for business analytics professionals are:

- SQL: SQL is the lingua franca of data analysis. Business analytics professionals use SQL queries to extract and analyze data from transactions databases and to develop visualizations.
- Statistical languages: Business analytics professionals frequently use R for statistical analysis and Python for general programming.
- Statistical software: Business analytics professionals frequently use software including SPSS, SAS, Sage, Mathematica, and Excel to manage and analyze data.
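The SQL workflow can be shown end to end with Python's built-in sqlite3 module; the orders table and its rows are invented for the sketch, but the GROUP BY query has the same shape analysts run against transactions databases:

```python
# Extract-and-aggregate with plain SQL on an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("west", 120.0), ("west", 80.0), ("east", 200.0), ("east", 50.0)],
)

# Revenue and order count per region, highest revenue first.
rows = conn.execute(
    """SELECT region, COUNT(*) AS n_orders, SUM(amount) AS revenue
       FROM orders
       GROUP BY region
       ORDER BY revenue DESC"""
).fetchall()
conn.close()

print(rows)  # [('east', 2, 250.0), ('west', 2, 200.0)]
```

The result set would typically feed a visualization or a statistical model downstream.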

Business analytics dashboard components

According to analytics platform company OmniSci, the main components of a typical business analytics dashboard include:

- Data aggregation: Before it can be analyzed, data must be gathered, organized, and filtered.
- Data mining: Data mining sorts through large datasets using databases, statistics, and machine learning to identify trends and establish relationships.
- Association and sequence identification: Predictable actions that are performed in association with other actions or sequentially must be identified.
- Text mining: Text mining is used to explore and organize large, unstructured datasets for qualitative and quantitative analysis.
- Forecasting: Forecasting analyzes historical data from a specific period to make informed estimates predictive of future events or behaviors.
- Predictive analytics: Predictive business analytics use a variety of statistical techniques to create predictive models that extract information from datasets, identify patterns, and provide a predictive score for an array of organizational outcomes.
- Optimization: Once trends have been identified and predictions made, simulation techniques can be used to test best-case scenarios.
- Data visualization: Data visualization provides visual representations of charts and graphs for easy and quick data analysis.
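Of these components, text mining is the easiest to illustrate in a few lines: reduce unstructured text to countable terms. The documents and stopword list below are invented:

```python
# Minimal text mining: tokenize short documents and count terms.
import re
from collections import Counter

docs = [
    "Shipping delay reported by the customer",
    "Customer happy with fast shipping",
    "Delay in refund processing",
]

stopwords = {"the", "by", "with", "in"}
terms = Counter(
    word
    for doc in docs
    for word in re.findall(r"[a-z]+", doc.lower())
    if word not in stopwords
)
print(terms.most_common(3))
```

Real text-mining pipelines add stemming, phrase detection, and weighting (e.g., TF-IDF), but the core move is the same: structure the unstructured.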

Business analytics salaries

Here are some of the most popular job titles related to business analytics and the average salary for each position, according to data from PayScale:

Analytics manager: $71K-$132K

Business analyst: $48K-$84K

Business analyst, IT: $51K-$100K

Business intelligence analyst: $52K-$98K

Data analyst: $46K-$88K

Market research analyst: $42K-$77K

Quantitative analyst: $61K-$131K

Research analyst, operations: $47K-$115K

Senior business analyst: $65K-$117K

Statistician: $56K-$120K

Cairn Oil & Gas is a major oil and gas exploration and production company in India. It currently accounts for 25% of India’s domestic crude production (about 28.4 MMT) and is aiming to account for 50% of the total output. The company plans to spend ₹3,160 crore (₹31.6 billion) over the next three years to boost its production.

The oil and gas industry currently confronts three major challenges: huge price fluctuations driven by volatile commodity prices, capital-intensive processes with long lead times, and managing production decline.

Sandeep Gupta, chief digital and information officer at Cairn Oil & Gas, is using state-of-the-art technologies to overcome these challenges and achieve business goals. “We have adopted a value-focused approach to deploying technological solutions. We partner with multiple OEMs and service integrators to deploy highly scalable projects across the value chain,” he says.

Reducing operational costs with drones, AI, and edge computing

Sandeep Gupta, chief digital and information officer, Cairn Oil & Gas


The oil and gas industry is facing huge price fluctuation due to volatile commodity prices and geopolitical conditions. In such a scenario, it becomes crucial for the business to manage costs.

Sustained oil production depends on uninterrupted power supply. However, managing transmission lines is a high-cost, resource-intensive task. For Cairn, it meant managing 250km of power lines spread across 3,111 square kilometers. They supply power to the company’s Mangala, Bhagyam, and Aishwarya oil fields and its Rageshwari gas fields in Rajasthan.

To reduce operational costs, the company decided to use drones. The images captured by the drones are run through an AI image-recognition system. The system analyses potential damage to power lines, predicts possible failure points, and suggests preventive measures, thereby driving data-driven decision-making instead of operator-based judgment.
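As a rough illustration of this compare-against-baseline idea (deliberately much simpler than the CNN pipeline Cairn describes), the sketch below flags captured frames whose mean pixel deviation from a baseline image exceeds a threshold. The function, data, and threshold are all hypothetical:

```python
def flag_anomalies(baseline, captures, threshold=10.0):
    """Flag captured frames that deviate from a baseline image.

    A simplified stand-in for the CNN comparison described in the
    article: each frame is a flat list of pixel intensities, and a
    frame is anomalous when its mean absolute deviation from the
    baseline exceeds `threshold` (an illustrative value).
    """
    flagged = []
    for i, frame in enumerate(captures):
        mad = sum(abs(p - q) for p, q in zip(frame, baseline)) / len(baseline)
        if mad > threshold:
            flagged.append(i)  # would be logged for the maintenance team
    return flagged

baseline = [50, 50, 50, 50]          # power line in ideal condition
captures = [
    [51, 49, 50, 50],                # within tolerance
    [90, 95, 20, 10],                # e.g. a damaged span
]
print(flag_anomalies(baseline, captures))  # [1]
```

A trained CNN replaces the naive pixel comparison with learned features that tolerate lighting and angle changes, but the workflow — baseline, periodic capture, deviation score, maintenance ticket — is the one the sketch shows.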

“Algorithms such as convolutional neural networks were trained on images captured when the overhead power lines were running in their ideal condition. The algorithm then compares subsequent images, taken at six-month intervals, and captures any anomalies. An observation is then put into a portal for the maintenance team to take corrective and preventive action,” says Gupta.

This is a service-based contract between Cairn and the maintenance provider, where monitoring is carried out on a biannual basis for 220kV power lines and annually for 500kV power lines.

“Since the implementation of drone-based inspection, the mean time between failures has increased from 92 to 182 days. This has reduced oil loss by 2,277 barrels per year, leading to cost savings worth approximately ₹12 crores [₹120 million]. As it enables employees to carry out maintenance activities more effectively, a small team can work more efficiently, and the manpower required reduces,” Gupta says.

The remote location of operations, coupled with the massive volume of data generated (about 300GB per day at Cairn), makes the oil and gas industry an ideal candidate for edge computing.

With smart edge devices, critical parameters are stored and processed at remote locations. The devices are installed in the field and send data via the MQTT protocol wherever cellular network connectivity is available. They store up to 250GB of data on the Microsoft Azure cloud, perform analytics using machine-learning algorithms, and raise intelligent alarms.

Without these devices, the data generated would be transported to faraway data centres, clogging the network bandwidth. “Edge computing helps reduce our IT infrastructure cost as lower bandwidth is sufficient to handle the large volume of data. These devices deployed are tracking critical operational parameters such as pressure, temperature, emissions, and flow rate. The opportunity cost of not having edge computing would result in requiring a higher bandwidth of network, which would amount to around 2X of the current network cost,” says Gupta. “This also has an implication on the health and safety risk of our personnel and equipment.”
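The edge pattern described above — process and alarm locally, transmit only when connectivity allows — can be sketched in a few lines. Everything here (the EdgeBuffer class, the field names, the 100-unit pressure limit) is illustrative rather than Cairn's actual implementation, and a real device would publish over MQTT instead of returning a list:

```python
from collections import deque

class EdgeBuffer:
    """Sketch of an edge device's store-and-forward loop."""

    def __init__(self, capacity=1000):
        self.pending = deque(maxlen=capacity)  # local store; oldest dropped first
        self.alarms = []

    def ingest(self, reading, limit=100.0):
        # Process locally: raise an intelligent alarm at the edge
        # instead of waiting for a cloud round-trip.
        if reading["pressure"] > limit:
            self.alarms.append(reading)
        self.pending.append(reading)

    def flush(self, connected):
        # Only transmit when cellular connectivity is available.
        if not connected:
            return []
        sent = list(self.pending)
        self.pending.clear()
        return sent

edge = EdgeBuffer()
edge.ingest({"well": "A-1", "pressure": 95.0})
edge.ingest({"well": "A-2", "pressure": 120.0})  # exceeds limit, raises alarm
sent = edge.flush(connected=True)
print(len(edge.alarms), len(sent))  # 1 2
```

The bandwidth saving Gupta describes comes from exactly this split: alarms fire at the edge immediately, while bulk telemetry is batched and shipped only when the link is up.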

Reducing lead times through a cloud-first strategy

The oil exploration process has a lead time of around three to five years and requires huge capital commitment. Out of these three to five years, a significant amount of time is taken up by petrotechnical experts (geologists, geophysicists, petroleum engineers, and reservoir engineers) in simulating models that require massive computational power.

Petrotechnical workflow entails evaluation of subsurface reservoir characteristics to identify the location for drilling the wells. These workflows are carried out by petrotechnical experts via multiple suites of software applications that can help identify the location and trajectory of wells to be drilled.

“Capital allocation and planning for future exploration has become riskier due to long lead times. To achieve our goals, increasing computing capabilities are essential. For this, we have adopted and executed a cloud-first strategy,” says Gupta. Thus, Cairn has completely migrated the workloads for petrotechnical workflows to the cloud. “This migration has removed the constraints of on-premises computational capabilities. As a result, there is almost 30% reduction in time to first oil,” he says.

Managing decline in production through predictive analytics

Cairn has considerable volume, variety, and velocity of data coming from different sources across production, exploration, and administration. “Using this data, we have deployed multiple large-scale projects, including predictive analytics, model predictive control, and reservoir management, which have been scaled across multiple sites,” says Gupta. Model predictive control (MPC) is a technology where the equipment is monitored for various operating parameters and is then operated in a particular range to get maximum efficiency, while maintaining the constraints in the system.
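The MPC idea — steer equipment toward its efficiency optimum while never leaving the allowed operating band — can be illustrated with a greatly simplified control step. Real MPC optimises over a prediction horizon using a process model; this sketch, with hypothetical pump-speed numbers, shows only the nudge-and-clip behaviour:

```python
def mpc_step(current, target, constraints, gain=0.5):
    """One step of a (greatly simplified) predictive control loop.

    Nudges a setpoint toward the efficiency-optimal target while
    clipping it to the constraint band. Illustrative only: real MPC
    optimises a cost function over a prediction horizon.
    """
    lo, hi = constraints
    proposed = current + gain * (target - current)  # move toward optimum
    return max(lo, min(hi, proposed))               # enforce constraints

# Hypothetical pump speed in RPM: optimum 1800, allowed band 1000-1600.
speed = 1200.0
for _ in range(5):
    speed = mpc_step(speed, target=1800.0, constraints=(1000.0, 1600.0))
print(speed)  # settles at the constraint ceiling, 1600.0
```

The controller settles at the constraint boundary nearest the optimum, which is the behaviour Gupta describes: maximum efficiency subject to the system's limits.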

At the heart of this lies Disha, a business intelligence initiative that uses dashboards driving critical actionable insights. “The philosophy for developing Disha was to make the right data available to the right people at the right time. We wanted to remove file-based data sharing and reporting as significant time goes in creating these reports. We connected data from various sources such as SAP HANA, Historian, Microsoft SharePoint, Petrel, LIMS, and Microsoft Azure cloud onto a single Microsoft PowerBI ecosystem where customized reports can be created,” says Gupta.

Disha was developed in a hybrid mode with an in-house team and an analytics provider over the course of three years. It offers more than 200 customized dashboards, including a well-monitoring dashboard, a production-optimisation dashboard, a CEO and CCO dashboard, and a rig-scheduling dashboard.

“With data now easily and quickly accessible in an interactive format across the organisation, which was earlier restricted to a select few, the corrective actions for resource allocation are now based on the data,” Gupta says. “For instance, we leverage Disha to monitor the parameters and output of the electronic submersible pump, which handles oil and water. It helps us in tracking the gains achieved through MPC implementation. All this enables better decision-making and has helped to allocate resources in an optimized manner, thus managing the decline in productivity.”

Going forward, Cairn plans to partner with a few big analytics providers and build a single platform to help contextualize its data and deploy micro solutions according to business needs. “This will be a low-code platform that will enable individual teams to build solutions on their own,” Gupta says. “The initiatives are oriented towards sustaining the production levels, while reducing time to first oil. Some of the initiatives include artificial lift system monitoring, well monitoring, and well-test validation.”
