Generative artificial intelligence (GenAI) tools such as Azure OpenAI have been drawing attention in recent months, and there is widespread consensus that these technologies can significantly transform the retail industry. The most well-known GenAI application is ChatGPT, an AI agent that can generate a human-like conversational response to a query. Other well-known GenAI applications can generate narrative text to summarize or query large volumes of data, generate images and video in response to descriptive phrases, or even generate complex code based on natural language questions.

GenAI technologies offer significant potential benefits for retail organizations, including speedy price adjustments, customized behavior-based incentives, and personalized recommendations in response to searches and customer preferences. These technologies can create new written, visual, and auditory content based on natural language prompts or existing data. Their advanced analytic capabilities can help determine better locations for new stores or where to target new investments. Generative AI chatbots can provide faster, more relevant customer assistance, leading to increased customer satisfaction and, in some cases, reduced costs and customer churn. To gain a deeper understanding of how retail organizations can benefit from Generative AI applications, we spoke with James Caton, Practice Leader, Data and Artificial Intelligence, at Microsoft, and Girish Phadke, Technology Head, Microsoft and Cloud Platforms, at Tata Consultancy Services (TCS). James and Girish discussed three ways Generative AI is transforming retail: speeding innovation, creating a better customer experience, and driving growth.

How can Generative AI speed innovation in retail?

James Caton: We’re already seeing a lot of data-driven innovation in the industry. Microsoft Azure OpenAI Service, which provides access to OpenAI’s large language models, allows more probing and deep questioning of data. A frontline worker could have the ability to “chat with their data,” to conversationally query inventory or shipping options for example, see the response in a chart, and ask for trend analysis and deeper insights.

It essentially gives you an assistant, or a copilot, to help do your job. Imagine having several assistants that are parsing the data, querying the data, and bringing data reports and visual graphs back to you. And you can send the copilot back and say, “please look here,” and “I want more information there.” As a retail sales manager, you could use OpenAI to develop more innovative solutions, more tailored strategies, and more personalized experiences.

How does Generative AI’s conversational flow enable a more compelling customer experience?

Girish Phadke: Existing call center tools can be conversational, and they do have access to 360-degree customer views, but there is a limit in terms of how far back they can go and what kind of data they can process to answer the customer’s query.

The new Generative AI models can go deeper into historical information, summarize it, and then present it in a human-like conversation. These models can pull data from multiple interactions and sources, from a huge amount of information, and create a response that is the best fit to answer a particular customer’s question. Essentially, tailoring the answer not only based on a massive knowledge base of data, but also on the individual customer’s preferences.

Can you share an example of how one of your customers has benefited from using OpenAI to process and analyze vast amounts of information?

Caton: CarMax has customer reviews covering millions of vehicles. The challenge for new buyers was that there were too many reviews, and they could not get a good sense of why people liked or disliked a certain vehicle. CarMax used the Azure OpenAI Service to analyze millions of reviews and present a summary. If a customer was looking at a certain make and model, the Azure OpenAI Service summarized the reviews and presented the top three reasons people liked it and the top three reasons they disliked it. The technology summarized millions of comments so that customers didn’t have to, thus improving the customer experience and satisfaction.
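To make the pattern concrete, here is a minimal, illustrative sketch of a review-summarization call against Azure OpenAI Service using the openai Python SDK. It is not CarMax’s implementation; the endpoint, API key, deployment name, and reviews below are placeholders.

```python
# Minimal sketch (not CarMax's actual implementation): summarizing product
# reviews with Azure OpenAI Service via the openai Python SDK (v1.x).
# Endpoint, API key, deployment name, and review data are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com/",  # placeholder
    api_key="YOUR-API-KEY",                                    # placeholder
    api_version="2024-02-01",
)

reviews = [
    "Great fuel economy, but the infotainment system is sluggish.",
    "Roomy interior; brakes felt soft after a year.",
    # ...in practice, thousands of reviews would be batched or pre-aggregated
]

prompt = (
    "Summarize the following vehicle reviews. List the top three reasons "
    "buyers liked the vehicle and the top three reasons they did not:\n\n"
    + "\n".join(f"- {r}" for r in reviews)
)

response = client.chat.completions.create(
    model="gpt-35-turbo",  # the Azure deployment name; an assumption here
    messages=[
        {"role": "system", "content": "You summarize customer reviews."},
        {"role": "user", "content": prompt},
    ],
)

print(response.choices[0].message.content)
```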

Are there steps that retailers can take to get ready for OpenAI and similar tools?

Caton: If a retailer wants to take advantage of these capabilities, the first thing they need to do is move their data to the Microsoft Cloud. Then, partners like TCS can help them develop their preferred use case, such as applying Generative AI to inventory or sales data or helping develop more tailored marketing campaigns. TCS knows the industry as well as most retailers. They understand the technology, how to manage and migrate data, and how to optimize to make best use of the new capabilities.

Phadke: We understand this is a new technology; retailers are likely to be cautious. They can start by augmenting existing capabilities, for example by adding more comprehensive conversational experiences built on Azure OpenAI, and adjust their governance models as they learn more about their data and processes. As confidence grows, they can begin to automate larger deployments.

How long does it typically take for an organization to see a return on investment from Generative AI?

Phadke: With the right strategy and right set of use cases, a system can start generating a positive ROI very quickly. TCS offers a six-week discovery assessment to help with ideation and strategy development. Within 12 to 16 weeks of adopting Azure OpenAI Service, an organization can have a more scaled-out implementation.

Do retail organizations have to embrace Generative AI technologies right now if they want to be able to compete?

Phadke: I think if some retailers choose to ignore this technology, they risk falling behind. Earlier adopters might get a competitive advantage. This technology is disruptive in nature and will have a significant impact on many industries, including retail.

Caton: ChatGPT is the fastest application to hit 100 million users — faster than Facebook, Instagram, or WhatsApp. The risk for slow adopters is that their competitors are adopting it and might gain a competitive advantage. It is being adopted very widely, very quickly.

Learn how to master your cloud transformation journey with TCS and Microsoft Cloud.

TCS

Girish Phadke, Technology Head, Microsoft and Cloud Platforms, TCS
Girish Phadke leads the Edge to Cloud Solutions, AI, and Innovation focus areas within the TCS Microsoft Business Unit. He advises customers on next-generation architectures and business solutions, and he tracks and incubates new technologies through the TCS Microsoft Business Unit’s innovation hubs across the globe. Girish is based in Mumbai, India, and in his free time loves watching science fiction movies.
https://www.linkedin.com/in/girish-phadke-ab25034/

Microsoft

James Caton, Practice Leader, Data & Artificial Intelligence, Microsoft
James Caton serves as an AI Practice Leader at Microsoft, helping global system integrators build sustainable Azure Artificial Intelligence businesses. He has held technical and commercial leadership positions at software companies SAS and IBM, as well as with Larsen & Toubro Construction, where he led their India Smart Cities business. James lives in Ave Maria, Florida, with his wife and three daughters.
https://www.linkedin.com/in/jmcaton/

Artificial Intelligence, Retail Industry

Over the next five years, the healthcare industry is expected to go through dramatic changes as service providers expand value-based care models and equipment manufacturers strive to keep pace in a digital-first world. One factor driving global transformation is the push to bring healthcare services, as well as responsibilities and control, closer to consumers.

By 2027, 70% of healthcare organizations will rely on digital-first strategies to empower patients to take a more active role in health responsibilities and experiences, notes IDC in its FutureScape Worldwide Healthcare Industry 2023 Predictions report. Spending on digital patient engagement and experience technologies that target such areas as mental health, telehealth, clinical trials, digital therapeutics, and ‘care anywhere’ solutions is expected to rise, according to the report.

Bedford, Mass.-based Novanta is a trusted technology partner to medical and advanced industrial OEMs, with deep proprietary expertise in photonics, vision, and precision motion technologies. Novanta engineers mission-critical components and subsystems that deliver extreme precision and performance, enabling customers to improve productivity, achieve breakthrough performance, and enhance people’s lives.

Leading the technology strategy at Novanta is Dr. Sarah Betadam, an experienced digital executive who, as the company’s CIO and CISO, is charged with delivering transformational solutions that scale operations and improve business agility, while also helping drive business strategies and global objectives. Betadam is well suited for the task, given the range of her experience prior to joining Novanta more than four years ago; she holds a doctorate in engineering and engineering management, as well as degrees in computer science from noted universities.

A self-confessed data analytics and research junkie, Betadam wrote a thesis presented to George Washington University a few years ago that outlines a contemporary model for IT program management that challenges many existing models, which she calls ‘overly subjective’ and less viable in today’s world.

“Everybody wants to have a business impact and business outcome,” she explains. “But business value is what is important for a particular business, which stems from the business capabilities rolled up from the business strategy. There’s revenue generation, the compliance aspect, operational efficiency, and legal. If you total all of this up it gives you a spectrum,” Betadam says, noting that this model was used at a previous organization and resulted in a more focused and successfully executed strategy.

More details on Betadam’s program management model, as well as her thoughts and insights on the challenges that lie ahead in terms of leading a transformational digital business, are presented in this CIO Executive Council/IDC Future Forward podcast interview. The following are edited excerpts from the podcast. Click on the links at the end of the excerpts to listen to Parts 1 & 2 of the full conversation.

Tim Scannell: IT organizations are going through a lot of structural changes, with reporting lines changing and the focus becoming more intense on business and customer facing objectives. How has your IT organization changed over the past year or two, and how has this made Novanta more flexible and adaptable?

Sarah Betadam: We’re expanding our competency levels — not just within IT, but across the business — to be successful. There’s also another aspect in terms of the agile transformation within our team and flexibility in terms of product roadmap ownership. This means really emphasizing the business partnership and making sure we’re aligning with the business strategy. We’re very big into Lean processes; aligning to those is how you get the momentum going. It depends on the resilience of the CIO to make sure there’s a match between the capabilities you’re trying to add as components to the Novanta organization and the current flow of programs and projects that each business unit is running.

Technology initiatives that have a direct impact on the bottom line are more of an emphasis today. Does this create any pressures on you and your IT organization to complete projects more quickly and with less risk involved?

Betadam: This is why agile is important for our entire organization. It’s educating the business that it’s okay that something is not going to be perfect the first time around. I also use Lean language, which is more familiar, letting them know there will be continuous development, process improvement, and iterations, and that everything is not going to be perfect. It’s a journey, and people are more comfortable with that approach than they were two years ago.

A lot of companies today are investing more in data analytics and business intelligence. Some are even establishing digital strategy teams consisting of people from both IT and the business communities. The problem, however, is finding experienced IT and business talent. How is that impacting your data analytics strategies?

Betadam: We started our journey pre-pandemic with one or two people and now have a BI Council. We have brought in experts from the business to take ownership of the different data that we are trying to centralize, but finding people and talent is very difficult. One solution that has worked for us is to expand your search to places where you wouldn’t think that there would be talent. We’ve started doing analytics training, where everybody is invited to join, and more people from all departments are interested in learning more about data reporting. We are evangelizing this across the organization, so you don’t have to be an IT person. You could be anywhere within Novanta, and we will train you to make use of the data that could help you and your team.

A lot of IT organizations are being asked to minimize risk and focus on projects that deliver a definable and maximum ROI. Do you see this as impacting your innovation initiatives in any way?

Betadam: I think there is definitely a slowdown in innovation activities, especially during times of economic uncertainty. Innovation comes with a certain tolerance level of accepting failure. At the same time, you are trying to keep the lights on and making sure customers are satisfied when there is a lot happening outside of your own control. Depending on the leader, however, there might be some incubation to keep things going. We’ve been looking at different efficiency solutions within the organization and at tools like AI. But this is done by a smaller group because we really needed to shift our focus to meet the customer amid all these different vectors that surround us. So, it slows down, but I don’t think as a leader you should shut it down because then you’d be behind.

Digital Transformation, IT Leadership

As CIO Neil Holden moved his company, Halfords Group, further into the cloud, he sought to do more than simply “lift-and-shift” IT operations.

Rather, Holden — like most CIOs — wanted his increasing use of cloud to enable and shape the company’s transformation agenda. To succeed in that objective, he knew he had to transform not just the tech stack but his own IT department as well.

“You definitely have to look at your own IT [department] structure with any kind of cloud adoption,” he says. “IT has to operate very differently now not just because of cloud but because of what cloud means to the business.”

So Holden, who has been CIO at Halfords — the UK’s largest retailer of motoring and cycling products and services — since 2017, developed a strategy to reorganize his tech team. He did that as he was devising the company’s overall cloud strategy, seeing that as the best way to ensure his staff could seize on the capabilities that the cloud provides and the business opportunities that the cloud could enable.

“You have to have the right organization to achieve that, because if you’re just going to stick your stuff in the cloud, you’re not going to leverage those investments [to their fullest],” he says.

CIOs along with researchers, consultants, and advisors agree that IT must change itself, how it works and how it organizes its workers, if it wants to gain the most benefits out of cloud computing.

Otherwise, they say, IT simply moves the location of its servers from its own data centers to someone else’s — and risks missing out on the innovation, transformation, and speed to market that cloud adoption enables.

“You can’t take your same skills and teams from on-prem to the cloud. That’s where you’re going to fail,” says Sushant Tripathi, vice president and North American lead for cloud transformation at Tata Consultancy Services. Instead, CIOs need to retrain and reorganize IT to take advantage of all the bells and whistles that cloud offers, he says.

Here, four IT leaders detail how they have taken action on this front.

Leaving behind linear processes

Holden’s reorganization focused in part on eliminating linear software development, the linear project process, and the departmental team structure that accommodated that linear approach to getting work done.

Neil Holden, CIO, Halfords Group

Halfords Group

“We changed our structure entirely,” he says.

Previously, Halfords’ IT function was conventionally organized, with a structure made up of separate teams for business analysis, solutions design, infrastructure, and so on. Under that organization, work moved from one team to the next down the line.

“Someone would talk to the business, hand off the requirements to the design team, and that’s then handed over to delivery and infrastructure teams,” he says, explaining that the teams worked independently and created agreements between themselves that hashed out each team’s deliverables and timelines. “Now all that [work] happens within an agile circle with iterative delivery, so the linear process has all been crushed together.”

Here’s how he did it: Holden hired cloud architects, who brought cloud integration experience and training to the agile methodology Holden embraced. He also trained existing staffers in cloud skills and the agile method. And he hired agile coaches to work with his IT team. Then he broke up those distinct, independent teams and created Scrum teams staffed by product owners, business analysts, solution architects, front-end developers, back-end developers, and testers.

The new Scrum teams worked iteratively instead of in a linear mode to speed up the delivery of new capabilities and features, allowing IT — and the business as a whole — to capitalize on the company’s cloud investments.

“A big part of this change wasn’t just cloud but changing the hearts and minds of people. So we put a huge amount of effort into training,” Holden says, adding that he orchestrated a nearly clean cut-over to this new structure in late 2021.

Holden says he sees the value of this reorganization in his team’s ability to deliver more quickly. He calculated that one project, created and deployed in 42 days by his revamped IT team, would have taken the old IT department 152 days to complete.

Cores and chapters to unlock cloud talent

Arizona State University CIO Lev Gonick has similarly reconfigured his IT team to better seize on the opportunities cloud provides.

Lev Gonick, CIO, Arizona State University

Arizona State University

That reconfiguring didn’t happen right away, Gonick says. ASU started its cloud journey a decade ago with experiments, before becoming more strategic and aggressive about cloud adoption when Gonick became CIO in 2017. ASU now has about 85% of its workloads in the cloud.

Gonick says his team had to change if it was going to be agile enough to keep pace with business needs and scale as the university grows. Gonick’s solution was to “fundamentally flatten the organization.”

“It was a high-stakes gamble on my part,” he says, noting that he decided to make the changes during the early part of the pandemic. “Instead of having vertically oriented teams, we created a series of ‘cores,’ which is the language of large software development shops.”

Gonick says these cores represent “rapidly reconfigurable pools of talent,” each focused on a specific area. He says the majority of the teams and their work is organized around five cores, which are professional development communities built around a common practice. There are four technical cores: engineering, service delivery, product and programs, and data and analytics; the fifth core is related to learning experience.

Managers in the product and programs core bring the right combination of talent together to work in chapters, which Gonick likens to work groups; for example, there are 30 engineering chapters.

“The reason we did this is to make sure we’re aligned to what the cloud affords us the opportunity to do,” he says, adding that this organizational structure lets IT professionals stretch and exercise their talent by working on diverse projects “rather than be in a salt mine and work day in and day out with the same set of tools.”

He adds: “It’s really about unleashing human talent. This is my own personal view here, but most enterprise technical teams are steeped in hierarchical organizations that suffocate way too much of the talent. Most [professionals] have a breadth of knowledge that they rarely have a chance to explore, share, and build. But this affords our teams the opportunities to grow as a professional community and to be highly engaged — not only between themselves but with the business.”

Centralizing teams for cloud success

Like ASU, Liberty Mutual Insurance has been on its cloud journey for the past decade, starting off with experimentation before going all-in six years ago for its ability to “give us speed to market, drive costs down, and give us flexibility in turning on and off capabilities,” says Monica Caldas, who became Liberty Mutual’s executive vice president and global CIO in January after serving in two other executive IT roles at the company since 2018.

Monica Caldas, EVP and global CIO, Liberty Mutual

Liberty Mutual

Throughout Liberty Mutual’s cloud journey, IT leadership has focused on developing the talent and skills needed to move from an on-premises environment to one that’s mostly in the cloud, Caldas says. “It became a large-scale transformation [in which] everyone had a part to play.”

As part of that, Liberty Mutual’s infrastructure team needed to be reshaped, as it no longer needed to maintain the same expanse of hardware that it had managed over the years. Instead, the infrastructure team was transformed into a centralized digital services unit with a global mandate focused on cloud capabilities to be leveraged across the company.

Caldas says infrastructure professionals previously had been focused on working with and supporting the company’s business units, “but they didn’t have one single set of strategies with one roadmap on where they were going.”

Under the new structure, the digital services team, with its global mandate, is creating repeatable processes that the entire company can easily access and use, which “generates that flywheel on the speed of delivery,” Caldas says.

Moreover, because it is centralized, the digital services team does this more efficiently, saving costs, she says. And the team is able to do so more effectively, as team members are able to hone their skills, refine processes, and thus deliver high-quality results.

“Our Global Digital Services [GDS] team is a centralized function that ensures critical business applications are always available. With over 70% of Liberty Mutual’s application and infrastructure footprint operating in the public cloud, GDS has oversight of the company’s global cloud and DevOps architecture and operations, helping the company work faster,” Caldas says.

Other IT teams then focus on delivering solutions for business needs.

“We have technology teams who are also focused on driving outcomes for our core business units with a mission of delivering differentiating capabilities in service of our customers, agents, clients and partners,” Caldas explains.

On a similar note, Liberty Mutual IT leaders have also created a centralized cybersecurity and operational resilience team with a global mandate to ensure “secure, stable systems.”

Caldas adds: “Today, we are aligned as a global organization oriented around a single set of strategies, roadmaps, and vision statements about where we are going. Everything is digital-first for our customers, agents, clients, and partners, and our cloud journey brings it full circle in terms of how we use technology as a competitive enabler.”

Consolidating teams for better security operations

For Brad Stone, CIO of Booz Allen Hamilton, cloud enables the speedy delivery of needed capabilities and supports business innovation and transformation.

Brad Stone, CIO, Booz Allen Hamilton

Booz Allen Hamilton

“We have organized ourselves to make sure that we can capitalize on that,” he says.

That includes how he has approached his security strategy, an area Stone believed needed to be transformed to ensure Booz Allen achieved the successes it sought from its cloud investments.

“You’ve got to set a strong foundation between your cybersecurity and your IT operations teams,” he says, stressing that consistent security operations across the enterprise helps identify and reduce risks.

Stone, who also oversees security, says security operations at Booz Allen had previously been structured under three infrastructure-based units: one supporting on-premises infrastructure, another supporting cloud, and a third supporting the company’s software-as-a-service platforms.

Despite all three teams reporting up to a single leader, each team optimized for itself, resulting in individual stovepipes and technology inefficiencies, Stone says. Moreover, the differences between the stovepipes also meant more work for security teams trying to manage and mitigate risks. In fact, it fostered “a legacy mindset” as well as a fragmented approach, with security thinking it needed certain tools for on-prem, others for SaaS, and still others for commercial cloud.

“We struggled with commonality, common visibility, and we just had too much churn,” Stone says. So he collapsed the three infrastructure teams into one integrated infrastructure and compute team, so “it was not on-prem versus cloud” but instead “made it more of a team sport” — with workers from each of the original three teams being cross-trained to break down the silos and work as a cohesive unit capable of supporting all flavors of infrastructure that exist at Booz Allen.

That work took about eight months, happening in 2021 and into 2022, Stone says, adding that the integrated infrastructure and compute team has enabled Stone to modernize security operations to better suit the company’s mixed technical environment.

“Our security team could better integrate,” he says. So instead of security operations addressing availability, reliability, and confidentiality requirements for each individual stovepipe, it could address that triad across the entire infrastructure landscape. That makes security both more effective and more efficient.

To illustrate this, Stone offers the following example: “Let’s say you have a critical vulnerability in an open source piece of software. When you have your different infrastructure run separately and you’ve created some stovepipes between them, you have a harder time responding at the speed of the threat, discovering and remediating,” he says.

But by collapsing all three into an integrated infrastructure team, security is able to use common tools, a single IT service management solution, and one configuration management database across the whole environment — upping the ability to discover and respond to security issues in a timely manner.

Cloud Computing, IT Leadership, IT Skills, IT Strategy

Straumann Group’s Sridhar Iyengar has a bold mission: To transform the nearly 70-year-old company’s data and technology organization into a data-as-a-service provider for the global manufacturer and supplier of dental implants, prosthetics, orthodontics, and digital dentistry — and to provide business stakeholders machine learning (ML) as a service as well.

“My vision is that I can give the keys to my businesses to manage their data and run their data on their own, as opposed to the Data & Tech team being at the center and helping them out,” says Iyengar, director of Data & Tech at Straumann Group North America.

Doing so will be no small feat. The Basel, Switzerland-based company, which operates in more than 100 countries, has petabytes of data, including highly structured customer data, data about treatments and lab requests, operational data, and a massive, growing volume of unstructured data, particularly imaging data. The company’s orthodontics business, for instance, makes heavy use of image processing to the point that unstructured data is growing at a pace of roughly 20% to 25% per month.

Advances in imaging technology present Straumann Group with the opportunity to provide its customers with new capabilities to offer their clients. For example, imaging data can be used to show patients how an aligner will change their appearance over time.

“It gives a lot of power to our providers in selling their services and at the same time gets more NPS [net promoter score] for us from the patient,” says Iyengar, who believes AI will play a critical role in Straumann’s image processing and lab treatments businesses. Hence the drive to provide ML as a service to the Data & Tech team’s internal customers.

“All they would have to do is just build their model and run with it,” he says.

But to augment its various businesses with ML and AI, Iyengar’s team first had to break down data silos within the organization and transform the company’s data operations.

“Digitizing was our first stake at the table in our data journey,” he says.

Selling the value of data transformation

Iyengar and his team are 18 months into a three- to five-year journey that started by building out the data layer — corralling data sources such as ERP, CRM, and legacy databases into data warehouses for structured data and data lakes for unstructured data.

That step, primarily undertaken by developers and data architects, established data governance and data integration. Now, the team’s information architects, in conjunction with business analysts, are working on the semantic layer, which feeds data from data warehouses and data lakes into data marts, including a finance mart, sales mart, supply chain mart, and market mart. The next goal, with the aid of partner Findability Sciences, will be to build out ML and AI pipelines into an information delivery layer that can support predictive and prescriptive analytics.

“As the information layer gets mature, that’s where the ML and the AI will start seeing some green shoots,” he says, adding that although data transformation was a pressing need when he signed on in 2021, he wanted a more compelling vision to sell the board and business leaders on tackling it.

For that, he relied on a defensive and offensive metaphor for his data strategy. The defensive side includes traditional elements of data management, such as data governance and data quality. The offensive side? That is the domain of AI and advanced analytics that serve a role beyond just insight and business optimization.

“The offensive side is how to generate revenue, all of the insights from the historical data that we have collected and, in fact, forecast the trends that are coming,” Iyengar says. “Most of the data that we get on the offensive side are unstructured, and we want to make sure that it makes sense to the business leaders and help them harmonize and enrich it in such a manner that they can serve their customers more efficiently and that the customers get served and leverage Straumann’s services in a much more robust, frictionless manner.”

Not surprisingly, it was this offensive side that got Straumann’s board invested in Iyengar’s plan for transformation.

“When the customer-centricity and the digital transformation piece was proposed — along with data transformation — I think that resonated with them,” Iyengar says.

Skilling up for the future

Iyengar’s team found success by adopting a use-case approach, not unlike that of one of Straumann’s core businesses. “We pretty much took the same principle of the pre-treatment and the post-treatment images that we show to our patients,” Iyengar says.

The team asked company leaders to pick a number of customer-centric vectors to illustrate how data innovations could be used to drive business outcomes. One of the targets was driving down customer churn. The team started by splitting churn propensity into two values: one for retention of existing customers and one for new customer acquisition. It used typical customer lifetime values and analyzed buying patterns to provide the marketing team and sales team with insights they could use to drive their strategies.
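As a rough illustration of the kind of churn-propensity scoring described here (not Straumann’s actual model), the sketch below trains a simple classifier on invented customer features, including a lifetime-value figure, and outputs a churn probability for held-out customers. Every column name and number is hypothetical.

```python
# Illustrative churn-propensity sketch (not Straumann's actual pipeline).
# Feature names and data are invented for the example.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical customer history: order frequency, recency, lifetime value.
data = pd.DataFrame({
    "orders_last_12m": [14, 2, 9, 1, 22, 3, 6, 0],
    "days_since_last_order": [12, 210, 45, 300, 7, 160, 90, 365],
    "lifetime_value": [18000, 900, 7200, 400, 31000, 1500, 4300, 250],
    "churned": [0, 1, 0, 1, 0, 1, 0, 1],   # 1 = customer lapsed
})

X = data.drop(columns="churned")
y = data["churned"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = LogisticRegression().fit(X_train, y_train)

# Probability of churn for each held-out customer; high scores would be
# routed to retention efforts, while low-risk lookalikes can inform
# new-customer acquisition targeting.
print(model.predict_proba(X_test)[:, 1])
```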

Iyengar says adopting this approach to selling digital transformation internally has made the job much easier. “We are seeing a lot of investments being approved from all the businesses in order to support that initiative,” he says.

In the meantime, as the team begins to build out ML and AI capabilities, it is also imperative to transform the Data & Tech team itself.

“The skill set that we have inherently from our traditional school point of view doesn’t suit the ML and AI part of it,” Iyengar says. “What you need there is statisticians and mathematicians, not programmers and coders, right? So, we have been transforming ourselves as well, culturally and from a skill point of view. That takes its own time. We have a learning curve at our end to build the right skill set within us.”

Iyengar is supplementing his team’s skill set with help from enterprise AI specialist Findability Sciences. The company’s Findability.ai platform combines machine learning, computer vision, and natural language processing (NLP) to aid customers in their AI journey.

“I have a lot of traditional ETL skills in my team,” he says. “What I don’t have is the ML/AI skill set right now. Partners are helping us in that space.”

Ultimately, Iyengar says, these changes will transform how the Data & Tech team interfaces with the business. For now, it operates under a centralized “hub and spokes” model. But he says hiring statisticians and mathematicians in his team won’t be scalable. Instead, what he really wants within three to five years is to embed them in teams closer to the lines of business, so the businesses can run models by themselves.

“Right now, we’re driving the bus at 100 miles an hour and changing the tires at the same time, which is not going to be scalable by any means, though I’m proud of my team that we are doing it,” he says.

Artificial Intelligence, Data Management, Predictive Analytics

Predictive analytics definition

Predictive analytics is a category of data analytics aimed at making predictions about future outcomes based on historical data and analytics techniques such as statistical modeling and machine learning. The science of predictive analytics can generate future insights with a significant degree of precision. With the help of sophisticated predictive analytics tools and models, any organization can now use past and current data to reliably forecast trends and behaviors milliseconds, days, or years into the future.

Predictive analytics has captured the support of a wide range of organizations, with a global market size of $12.49 billion in 2022, according to a research study published by The Insight Partners in August 2022. The report projects the market will reach $38 billion by 2028, growing at a compound annual growth rate (CAGR) of about 20.4% from 2022 to 2028.

Predictive analytics in business

Predictive analytics draws its power from a wide range of methods and technologies, including big data, data mining, statistical modeling, machine learning, and assorted mathematical processes. Organizations use predictive analytics to sift through current and historical data to detect trends and forecast events and conditions that should occur at a specific time, based on supplied parameters.

With predictive analytics, organizations can find and exploit patterns contained within data in order to detect risks and opportunities. Models can be designed, for instance, to discover relationships between various behavior factors. Such models enable the assessment of either the promise or risk presented by a particular set of conditions, guiding informed decision-making across various categories of supply chain and procurement events.

For tips on how to effectively harness the power of predictive analytics, see “7 secrets of predictive analytics success.”

Benefits of predictive analytics

Predictive analytics makes looking into the future more accurate and reliable than earlier tools could. As such, it can help adopters find ways to save and earn money. Retailers often use predictive models to forecast inventory requirements, manage shipping schedules, and configure store layouts to maximize sales. Airlines frequently use predictive analytics to set ticket prices reflecting past travel trends. Hotels, restaurants, and other hospitality industry players can use the technology to forecast the number of guests on any given night in order to maximize occupancy and revenue.

By optimizing marketing campaigns with predictive analytics, organizations can also generate new customer responses or purchases, as well as promote cross-sell opportunities. Predictive models can help businesses attract, retain, and nurture their most valued customers.

Predictive analytics can also be used to detect and halt various types of criminal behavior before any serious damage is inflicted. By using predictive analytics to study user behaviors and actions, an organization can detect activities that are out of the ordinary, ranging from credit card fraud to corporate spying to cyberattacks.

Predictive analytics use cases

Organizations today use predictive analytics in a virtually endless number of ways. The technology helps adopters in fields as diverse as finance, healthcare, retailing, hospitality, pharmaceuticals, automotive, aerospace, and manufacturing.

Here are a few ways organizations are making use of predictive analytics:

Aerospace: Predict the impact of specific maintenance operations on aircraft reliability, fuel use, availability, and uptime.
Automotive: Incorporate records of component sturdiness and failure into upcoming vehicle manufacturing plans. Study driver behavior to develop better driver assistance technologies and, eventually, autonomous vehicles.
Energy: Forecast long-term price and demand ratios. Determine the impact of weather events, equipment failure, regulations, and other variables on service costs.
Financial services: Develop credit risk models. Forecast financial market trends. Predict the impact of new policies, laws, and regulations on businesses and markets.
Manufacturing: Predict the location and rate of machine failures. Optimize raw material deliveries based on projected future demands.
Law enforcement: Use crime trend data to define neighborhoods that may need additional protection at certain times of the year.
Retail: Follow an online customer in real time to determine whether providing additional product information or incentives will increase the likelihood of a completed transaction.

Predictive analytics examples

Organizations across all industries leverage predictive analytics to make their services more efficient, optimize maintenance, find potential threats, and even save lives. Here are three examples:

Rolls-Royce optimizes maintenance schedules and reduces carbon footprint

Rolls-Royce, one of the world’s largest manufacturers of aircraft engines, has deployed predictive analytics to help dramatically reduce the amount of carbon its engines produce while also optimizing maintenance to help customers keep their planes in the air longer.

DC Water drives down water loss

The District of Columbia Water and Sewer Authority (DC Water) is using predictive analytics to drive down water loss in its system. Its flagship tool, Pipe Sleuth, uses an advanced, deep learning neural network model to do image analysis of small diameter sewer pipes, classify them, and then create a condition assessment report.

PepsiCo tackles supply chain with predictive analytics

PepsiCo is transforming its ecommerce sales and field sales teams with predictive analytics to help it know when a retailer is about to be out of stock. The company has created the Sales Intelligence Platform, which combines retailer data with PepsiCo’s supply chain data to predict out-of-stocks and alert users to reorder.

Predictive analytics tools

Predictive analytics tools give users deep, real-time insights into an almost endless array of business activities. Tools can be used to predict various types of behavior and patterns, such as how to allocate resources at particular times, when to replenish stock, or the best moment to launch a marketing campaign, basing those predictions on an analysis of data collected over a period of time.

Some of the top predictive analytics software platforms and solutions include:

Alteryx Analytics Automation Platform
Amazon SageMaker
H2O AI Cloud
IBM SPSS
RapidMiner
SAP Analytics Cloud
SAS Viya
TIBCO

For more on the tools that drive predictive analysis, see “Top 8 predictive analytics tools.”

Predictive analytics models

Models are the foundation of predictive analytics — the templates that allow users to turn past and current data into actionable insights, creating positive long-term results. Some typical types of predictive models include:

Customer Lifetime Value Model: Pinpoint customers who are most likely to invest more in products and services.
Customer Segmentation Model: Group customers based on similar characteristics and purchasing behaviors.
Predictive Maintenance Model: Forecast the chances of essential equipment breaking down (a minimal sketch follows this list).
Quality Assurance Model: Spot and prevent defects to avoid disappointments and extra costs when providing products or services to customers.
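For instance, a predictive maintenance model can be as simple as a classifier trained on recent sensor readings. The sketch below is purely illustrative: the features, thresholds, and synthetic data are assumptions rather than a production model.

```python
# Minimal predictive-maintenance sketch: estimating the probability that a
# machine fails soon from recent sensor readings. Data is synthetic and the
# feature set is an assumption for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
n = 500

vibration = rng.normal(1.0, 0.3, n)          # mm/s RMS
temperature = rng.normal(60, 8, n)           # degrees C
hours_since_service = rng.uniform(0, 2000, n)

# Synthetic ground truth: failures get likelier with heat, vibration, wear.
risk = 0.002 * hours_since_service + 2.0 * vibration + 0.05 * temperature
failed_within_30d = (risk + rng.normal(0, 1, n) > 9).astype(int)

X = np.column_stack([vibration, temperature, hours_since_service])
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, failed_within_30d)

# Score a machine that is running hot and is overdue for service.
new_reading = [[1.6, 75.0, 1800.0]]
print("failure probability:", model.predict_proba(new_reading)[0][1])
```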

Predictive modeling techniques

Model users have access to an almost endless range of predictive modeling techniques. Many methods are unique to specific products and services, but a core of generic techniques, such as decision trees, regression, and even neural networks, is now widely supported across a wide range of predictive analytics platforms.

Decision trees, one of the most popular techniques, rely on a schematic, tree-shaped diagram that’s used to determine a course of action or to show a statistical probability. The branching method can also show every possible outcome of a particular decision and how one choice may lead to the next.
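Here is a minimal, hypothetical decision-tree example using scikit-learn; the scenario (predicting whether a shopper completes a purchase from cart value and visit count) and the data are invented, but printing the fitted tree shows the branching structure described above.

```python
# Minimal decision-tree sketch with scikit-learn. Scenario and data are
# invented for illustration.
from sklearn.tree import DecisionTreeClassifier, export_text

# Features: [cart_value_usd, visits_this_month]; label: 1 = purchase completed
X = [[20, 1], [150, 4], [75, 2], [200, 6], [30, 1], [90, 5], [10, 1], [180, 3]]
y = [0, 1, 0, 1, 0, 1, 0, 1]

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# The printed branches show every possible path from a decision to an outcome.
print(export_text(tree, feature_names=["cart_value_usd", "visits_this_month"]))
```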

Regression techniques are often used in banking, investing, and other finance-oriented models. Regression helps users forecast asset values and comprehend the relationships between variables, such as commodities and stock prices.
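A comparable sketch for regression, again with invented numbers, fits a linear relationship between a commodity price and a related stock price and then forecasts the stock price at a new commodity level.

```python
# Minimal regression sketch: fitting a linear relationship between a
# commodity price and a stock price, then forecasting. Values are invented.
import numpy as np
from sklearn.linear_model import LinearRegression

commodity_price = np.array([[60], [65], [70], [75], [80], [85]])  # USD/bbl
stock_price = np.array([22.0, 24.1, 26.3, 27.9, 30.2, 32.0])      # related equity

model = LinearRegression().fit(commodity_price, stock_price)

# Forecast the stock price if the commodity rises to $90.
print(model.predict([[90]]))          # predicted stock price
print(model.coef_, model.intercept_)  # strength of the fitted relationship
```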

On the cutting edge of predictive analytics techniques are neural networks — algorithms designed to identify underlying relationships within a data set by mimicking the way a human mind functions.

Predictive analytics algorithms

Predictive analytics adopters have easy access to a wide range of statistical, data-mining and machine-learning algorithms designed for use in predictive analysis models. Algorithms are generally designed to solve a specific business problem or series of problems, enhance an existing algorithm, or supply some type of unique capability.

Clustering algorithms, for example, are well suited for customer segmentation, community detection, and other social-related tasks. To improve customer retention, or to develop a recommendation system, classification algorithms are typically used. A regression algorithm is typically selected to create a credit scoring system or to predict the outcome of many time-driven events.
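As an example of the first case, a clustering algorithm such as k-means can group customers into segments from a couple of behavioral features. The sketch below uses invented annual-spend and order-frequency figures and an assumed cluster count of three.

```python
# Minimal customer-segmentation sketch with k-means clustering.
# The two features and the cluster count are assumptions for illustration.
import numpy as np
from sklearn.cluster import KMeans

# Columns: [annual_spend_usd, orders_per_year]
customers = np.array([
    [200, 2], [250, 3], [220, 2],         # low spend, infrequent
    [1200, 15], [1100, 12], [1300, 18],   # high spend, frequent
    [600, 6], [650, 7], [580, 5],         # mid-tier
])

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(customers)

print(kmeans.labels_)           # segment assigned to each customer
print(kmeans.cluster_centers_)  # average profile of each segment
```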

Predictive analytics in healthcare

Healthcare organizations have become some of the most enthusiastic predictive analytics adopters for a very simple reason: The technology is helping them save money.

Healthcare organizations use predictive analytics in several ways, including intelligently allocating facility resources based on past trends, optimizing staff schedules, identifying patients at risk for a costly near-term readmission and adding intelligence to pharmaceutical and supply acquisition and management.

Healthcare consortium Kaiser Permanente has used predictive analytics to create a hospital workflow tool that it uses to identify non-intensive care unit (ICU) patients who are likely to deteriorate rapidly within the next 12 hours. NorthShore University HealthSystem has embedded a predictive analytics tool in patients’ electronic medical records (EMRs) that helps it identify which chest pain patients should be admitted for observation and which can be sent home.

For a deeper look, see “Healthcare analytics: 4 success stories.”

How should an organization begin with predictive analytics?

While getting started in predictive analytics isn’t a snap, it’s a task that virtually any business can handle as long as one remains committed to the approach and is willing to invest the time and funds necessary to get the project moving. Beginning with a limited-scale pilot project in a critical business area is an excellent way to cap start-up costs while minimizing the time before financial rewards begin rolling in. Once a model is put into action, it generally requires little upkeep as it continues to grind out actionable insights for many years.

For a deeper look, see “How to get started with predictive analytics.”

Predictive analytics salaries

Here are some of the most popular job titles related to predictive analytics and the average salary for each position, according to data from PayScale.

Analytics manager: $72K-$134K
Director of analytics: $84K-$179K
Business analyst: $49K-$87K
Chief data scientist: $133K-$290K
Data analyst: $46K-$89K
Data scientist: $70K-$137K

More on predictive analytics:

7 projects primed for predictive analytics
7 ways predictive analytics can improve customer experience
Top 8 predictive analytics tools compared
Identifying high-risk patients with predictive analytics
Predictive analytics: 4 success stories
Analytics, Artificial Intelligence, Predictive Analytics

Business intelligence definition

Business intelligence (BI) is a set of strategies and technologies enterprises use to analyze business information and transform it into actionable insights that inform strategic and tactical business decisions. BI tools access and analyze data sets and present analytical findings in reports, summaries, dashboards, graphs, charts, and maps to provide users with detailed intelligence about the state of the business.

The term business intelligence often also refers to a range of tools that provide quick, easy-to-digest access to insights about an organization’s current state, based on available data.

Benefits of BI

BI helps business decision-makers get the information they need to make informed decisions. But the benefits of BI extend beyond business decision-making, according to data visualization vendor Tableau, including the following:

Data-driven business decisions: The ability to drive business decisions with data is the central benefit of BI. A strong BI strategy can deliver accurate data and reporting capabilities faster to business users to help them make better business decisions in a more timely fashion.
Faster analysis and intuitive dashboards: BI improves reporting efficiency by condensing reports into dashboards that are easy for non-technical users to analyze, saving them time when seeking to glean insights from data.
Increased organizational efficiency: BI can help provide holistic views of business operations, giving leaders the ability to benchmark results against larger organizational goals and identify areas of opportunity.
Improved customer experience: Ready access to data can help employees charged with customer satisfaction provide better experiences.
Improved employee satisfaction: Providing business users access to data without having to contact analysts or IT can reduce friction, increase productivity, and facilitate faster results.
Trusted and governed data: Modern BI platforms can combine internal databases with external data sources into a single data warehouse, allowing departments across an organization to access the same data at one time.
Increased competitive advantage: A sound BI strategy can help businesses monitor their changing market and anticipate customer needs.

Business intelligence examples

Reporting is a central facet of BI and the dashboard is perhaps the archetypical BI tool. Dashboards are hosted software applications that automatically pull together available data into charts and graphs that give a sense of the immediate state of the company.

Although business intelligence does not tell business users what to do or what will happen if they take a certain course, neither is BI solely about generating reports. Rather, BI offers a way for people to examine data to understand trends and derive insights by streamlining the effort needed to search for, merge, and query the data necessary to make sound business decisions.

For example, a company that wants to better manage its supply chain needs BI capabilities to determine where delays are happening and where variabilities exist within the shipping process. That company could also use its BI capabilities to discover which products are most commonly delayed or which modes of transportation are most often involved in delays.
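In code, that kind of question often reduces to a simple aggregation. The sketch below, with hypothetical column names and data, shows how a delay breakdown by product and transport mode might be computed before being surfaced on a dashboard.

```python
# Illustrative BI-style query: finding which products and transport modes are
# most often delayed. Column names and data are hypothetical.
import pandas as pd

shipments = pd.DataFrame({
    "product": ["widget", "gadget", "widget", "sprocket", "gadget", "widget"],
    "carrier": ["road", "air", "road", "sea", "road", "sea"],
    "days_late": [0, 2, 5, 0, 3, 4],
})

shipments["delayed"] = shipments["days_late"] > 0

# Share of delayed shipments by product and by transport mode -- the kind of
# breakdown a dashboard would surface automatically.
print(shipments.groupby("product")["delayed"].mean())
print(shipments.groupby("carrier")["delayed"].mean())
```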

The potential use cases for BI extend beyond the typical business performance metrics of improved sales and reduced costs.

Tableau and software review site G2 also offer concrete examples of how organizations might put business intelligence tools to use:

A co-op organization could use BI to keep track of member acquisition and retention.
BI tools could automatically generate sales and delivery reports from CRM data.
A sales team could use BI to create a dashboard showing where each rep’s prospects are on the sales pipeline.

Business intelligence vs. business analytics

Business analytics and BI serve similar purposes and are often used as interchangeable terms, but BI should be considered a subset of business analytics. BI focuses on descriptive analytics, data collection, data storage, knowledge management, and data analysis to evaluate past business data and better understand currently known information. Whereas BI studies historical data to guide business decision-making, business analytics is about looking forward. It uses data mining, data modeling, and machine learning to answer why something happened and predict what might happen in the future.

Business intelligence is descriptive, telling you what’s happening now and what happened in the past to get your organization to that state: Where are sales prospects in the pipeline today? How many members have we lost or gained this month? Business analytics, on the other hand, is predictive (what’s going to happen in the future?) and prescriptive (what should the organization be doing to create better outcomes?).

This gets to the heart of the question of who business intelligence is for. BI aims to deliver straightforward snapshots of the current state of affairs to business managers. While the predictions and advice derived from business analytics require data science professionals to analyze and interpret, one of the goals of BI is that it should be easy for relatively non-technical end users to understand, and even to dive into the data and create new reports.

Business intelligence systems and tools

A variety of different types of tools fall under the business intelligence umbrella. The software selection service SelectHub breaks down some of the most important categories and features:

Dashboards
Visualizations
Reporting
Data mining
ETL (extract, transform, load — tools that import data from one data store into another; a minimal sketch follows this list)
OLAP (online analytical processing)
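As noted above, here is a minimal ETL sketch: it extracts rows from a hypothetical CSV export, applies a few transformations, and loads the result into a SQLite reporting table. The file name, columns, and table name are all assumptions.

```python
# Minimal ETL sketch: extract rows from a CSV, transform them, and load them
# into a SQLite table. File name, columns, and table are hypothetical.
import sqlite3
import pandas as pd

# Extract: read raw sales data exported from a source system.
raw = pd.read_csv("sales_export.csv")  # hypothetical export file

# Transform: normalize column names, drop incomplete rows, derive a field.
raw.columns = [c.strip().lower() for c in raw.columns]
clean = raw.dropna(subset=["order_id", "amount"]).copy()
clean["amount_usd"] = clean["amount"].round(2)

# Load: write the cleaned rows into the reporting database.
with sqlite3.connect("reporting.db") as conn:
    clean.to_sql("sales", conn, if_exists="append", index=False)
```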

Of these tools, dashboards and visualization are by far the most popular; they offer the quick and easy-to-digest data summaries that are at the heart of BI’s value proposition.

Some of the top BI tools include:

Domo
Dundas BI
Microsoft Power BI
MicroStrategy
Oracle Analytics Cloud
Qlik
SAS
Sisense
Tableau
TIBCO

Business intelligence jobs

Any company that’s serious about BI will need to have business intelligence analysts on staff. BI analysts use data analytics, data visualization, and data modeling techniques and technologies to identify trends. The role combines hard skills such as programming, data modeling, and statistics, with soft skills like communication, analytical thinking, and problem-solving.

Even if your company relies on self-service BI tools on a day-to-day basis, BI analysts have an important role to play, as they are necessary for managing and maintaining those tools and their vendors. They also set up and standardize the reports that managers are going to be generating to make sure that results are consistent and meaningful across your organization. And to avoid garbage in/garbage out problems, business intelligence analysts need to make sure the data going into the system is correct and consistent, which often involves getting it out of other data stores and cleaning it up.

Business intelligence analyst jobs often require only a bachelor’s degree, at least at the entry level, though to advance up the ranks an MBA may be helpful or even required. As of January 2023, the median business intelligence analyst salary is around $72,000, though depending on your employer it could range from $53,000 to $97,000.

More on business intelligence:

8 keys to a successful business intelligence strategy
Top 12 business intelligence tools
6 BI challenges IT teams must address
What is a business intelligence analyst? A role for driving business value with data
Career roadmap: Business intelligence analyst
9 business intelligence certifications to advance your BI career
Top 10 BI data visualization tools
Top 7 business intelligence trends
Big Data, Business Intelligence, Enterprise Applications

Technologies like the Internet of Things (IoT), artificial intelligence (AI), and advanced analytics provide tremendous opportunities to increase efficiency, safety, and sustainability. However, for businesses with operations in remote locations, the lack of public infrastructure, including cloud connectivity, often places these digital innovations out of reach.

Until recently, this has been the predicament of oil and gas companies operating oil wells, pipelines, and offshore rigs in remote, hard-to-reach locales. But the arrival of private 5G for oil and gas has changed this. Here’s how private 5G is transforming oil and gas operations in the field.

Secure bandwidth & real-time monitoring in remote locales

5G is a hardened telco network environment that provides one of the most secure networks in the world. Using this same technology, private 5G delivers an ultra-secure, restricted-access mobile network that gives businesses reliable connectivity and bandwidth to support their data transmission needs.

Private 5G enables a transportable “network-in-a-box” solution that can be relocated to provide connectivity and bandwidth in remote locations. This self-contained network offers the low-latency connectivity needed to configure, provision, and monitor a network. Private 5G is also highly reliable, especially compared to traditional Wi-Fi, enabling superior communications and bandwidth-intensive, edge-to-cloud data transmission.

Increased productivity and efficiency

This highly reliable network solution is transforming oil and gas companies, which rely on heavy equipment with lots of moving parts, often running 24×7. By implementing intelligent IoT solutions that track vibrations, odors, and other conditions, oil and gas companies can monitor distributed, remote sites and equipment from a central location.

This is a game changer from an efficiency and productivity standpoint. For example, private 5G accelerates time to production for remote locations by eliminating the cost and time associated with coordinating with a telco to build infrastructure. Additionally, private 5G helps oil and gas companies keep sites running smoothly, combining IoT solutions with AI and machine learning to enable predictive maintenance. This reduces costly equipment breakdowns and repairs, minimizes operational disruptions, and extends the life of hardware.

Furthermore, private 5G enables operators to diagnose and upgrade firmware and machinery and perform maintenance remotely. This decreases the need for travel and the number of crews in the field and reduces equipment downtime.

Private 5G enables improved safety and sustainability

Private 5G supports advanced solutions that boost workplace safety. Oil and gas companies can apply intelligent edge solutions to monitor for security breaches and safety hazards. IoT sensors can detect gas and equipment leaks, temperature fluctuations, and vibrations to avoid catastrophic events and keep employees safe.

From a sustainability standpoint, private 5G enables solutions that help prevent oil and gas leaks, reducing environmental impacts. Furthermore, oil and gas companies can implement smart solutions that minimize energy and resource usage and reduce emissions in the field.

Unlock the potential of private 5G

Private 5G is transforming oil and gas operations as well as businesses in other industries with remote, hard-to-reach operations. As an award-winning, international IT solutions provider and network integrator, GDT can help your organization design and implement an HPE private 5G solution to meet your specific needs.

HPE brings together cellular and Wi-Fi for private networking across multiple edge-to-cloud use cases. HPE’s private 5G solution is based on HPE 5G Core Stack, an open, cloud-native, container-based 5G core network solution.

To discover how private 5G can transform your business, contact the experts at GDT for a free consultation.

5G

GITEX GLOBAL is the largest event for global enterprise and government technology, with public digital transformation top of the agenda. Though different countries had slightly different priorities, they were all rebuilding from the pandemic through initiatives in three areas:

- Digital public utilities, which include e-services for essentials such as gas, water, and communication systems
- Digital economy initiatives, such as lowering the barriers to entry for businesses to transform and expand
- Digital society, focusing on simplifying citizen-government interactions

The blueprint for national digital transformation

Huawei’s Global Chief Public Services Industry Scientist, Hong-Eng Koh, detailed Huawei’s long involvement in digital transformation.

“Over the past decade, Huawei has been involved in over 700 cities and 100 countries in terms of public services and transformation. And we’ve noticed certain similarities required for public service digital transformation to be successful,” says Koh.

Government agencies tasked with digital transformation projects should keep some key considerations in mind to ensure success. Koh further outlined seven questions transformation leaders should ask themselves:

1. Do you have a vision and a champion that can drive the transformation?
2. Have you identified the department, governor, and process to implement the transformation? If none are suitable, can you create a department dedicated to it?
3. Do you know which laws and regulations are impacted by the transformation?
4. Have you planned for an operating model that is sustainable from a budget standpoint?
5. Do you have the right skillsets/talent in place to execute the transformation?
6. Do you have a digital and data strategy in place?
7. Do you have the right technology ecosystem to support transformation?

Developing the technology ecosystem

Huawei’s experience in public digital transformation projects has given it a leading position in the technology ecosystem that supports them. “Data is a new crude oil. By itself, it has no value. At Huawei, we have the end-to-end technologies to process the data and extract value from it,” Koh explains.

It’s important to select a full-stack tech vendor that can support all the foundational elements of transformed business processes. This includes data awareness technologies, such as Wi-Fi 6, LTE, and IoT platforms, which provide the mesh of communication technologies needed to efficiently capture data from sensors spread across wide areas and communities.

Transmitting data to edge processing points and central data centres also needs to be both high-performing and cost-efficient. Huawei supports more than 50% of global national backbones and last-mile connectivity.

In the zettabyte era where data growth is unprecedented, Huawei builds data centres and other essential computing infrastructure—in particular, storage—to provide a robust infrastructure that efficiently supports the data.

However, as Koh pointed out, data only delivers value when it is processed and refined. To that end, Huawei offers the software (such as database-as-a-service, security-as-a-service, and computing-as-a-service), the platform (cloud with AI), and advanced services such as AI and ML frameworks to support and accelerate data processing and reduce time-to-value.

The digital transformation imperative

The recent pandemic has served as a catalyst to advance the business case for the digitisation of governments globally. Having experienced how digital services enabled a semblance of normalcy during lockdowns and border closures, citizens will only continue to demand more efficient digital government services that can ultimately enhance their way of life.

As a result, leading governments that have embraced this opportunity are reaping the benefits of improved productivity, shorter lead times to action community requests, reduced rates of error, and improved workforce and community satisfaction.

Find out more at HUAWEI CONNECT 2022

Accelerate your government’s digital transformation agenda by attending HUAWEI CONNECT 2022 in Bangkok, Dubai, Paris, or Shenzhen.

Learn how enterprises, ecosystem partners, application providers, and developers can best work together to lead best-practice industry development and build an open, sound ecosystem.

Join us there to Unleash Digital. For more information, visit https://www.huawei.com/en/events/huaweiconnect

Digital Transformation

One trap IT leaders often fall into when seeking a new job is viewing their resume as a historical document of their career. The reality is that your resume should paint a clear picture of your career’s future, detailing your past work experience as a roadmap that leads inevitably to your next leadership gig.

But striking that balance between detailing the past and mapping toward the future can be challenging, especially while keeping your resume to the point. A few key strategies, however, can help you tell your career story through your resume without getting bogged down in the past.

With more than 20 years of experience in data and analytics, Gloria Edsall, whose identity has been changed for this article, is an aspiring CDO looking to break into a C-suite role. We paired Edsall up with Stephen Van Vreede, president, executive resume writer, and coach for ITtechExec.com, to help her strengthen her resume to convey how her career path qualifies her for a CDO role. 

“Our meetings focused on discussing how the company and its customers derived value from the actions of the candidate and their team. We spent a good deal of time talking about [her] goals and interests so that the new resume could be tailored to the future and not simply a post-mortem on her career,” says Van Vreede.

Following is a report on that process and some tips outlining how Van Vreede’s work with Edsall can help you better shape your resume as a leadership journey toward new opportunities.

Highlight your leadership skills from the top

Van Vreede worked with Edsall to create a cohesive theme for her resume based on her career goals and interests. The first two to three pages of Edsall’s original resume focused on credentials such as education, certification, training, and publications, which were a “big distraction,” according to Van Vreede.

“For someone looking to pursue leadership roles that focus on turning data science and analytics organizations into a value driver for the business, there was almost no content that spoke to the candidate’s track record providing tangible business value,” he says.

> Download: Gloria Edsall’s original resume

To remedy this, Van Vreede included an executive summary at the start of the resume to clearly outline how Edsall’s experience and knowledge help her excel in leadership roles, detailing what type of leader Edsall is and how her experience with management and data analytics makes her an ideal candidate for a CDO role.

Van Vreede also pulled out three top career highlights for Edsall: generating $25M in value by creating a predictive analytics engine, bringing in $25M in new contracts and $14M in business value through advanced and predictive analytics practices, and driving $5M in value by maturing data science practices. These main accomplishments show Edsall’s ability to benefit an organization’s bottom line through her analytics experience, while also leading successful and resilient teams in the process.

Keep it brief and strategic

One glaring issue with Edsall’s original resume was its length: at seven pages, it was, according to Van Vreede, “more appropriately used as an academic CV instead of a personal marketing document for corporate roles.” This is one of the most common resume mistakes tech professionals make. As your career history grows longer, it can be difficult to know what to keep and what to leave off your resume.

As a general rule, a professional resume should be a concise 1-2 pages when applying for corporate roles. Recruiters read through thousands of resumes, so they’re more likely to lose focus or abandon your resume altogether if they can’t get a sense of your qualifications within the first few minutes.

Edsall knew her resume was too long, and wasn’t happy with the formatting around her skills summary, job achievements, and past work experience. She didn’t know the best way to consolidate experience from more than a decade ago, or how to highlight her achievements and connect those to the value she’d bring to a CDO role.

In addition to being too lengthy, the original resume was also “highly technical and tactical in nature,” according to Van Vreede. This is a common issue that technologists run into when writing a resume: They include technical verbiage that might alienate recruiters or hiring managers who aren’t as familiar with the technical side of the role.

Van Vreede addressed this by consolidating her professional experience and creating a sidebar along the right side of both pages to showcase Edsall’s education, credentials, and key skills. This sidebar enabled Van Vreede to bring Edsall’s chronological work history to the forefront without having to bury her education and credentials at the bottom of the resume.

Including an executive summary and a sidebar with your education, skills, and credentials is a great way to remove redundancies from your work experience, allowing you to focus on specific accomplishments in each role while consolidating your evergreen skills, expertise, and knowledge into short, simple lists. It also gives recruiters and hiring managers an easy way to confirm, at a glance, that you have the necessary skills and qualifications. Ultimately, you want to grab a recruiter’s or hiring manager’s attention from the jump, encouraging them to delve further into your overall experience.

The final results

In the end, Van Vreede helped take Edsall’s resume from “technical and tactical” to “strategic and achievements based.” Most importantly, he focused on highlighting achievements that illustrated how Edsall has built or transformed data science and analytics units at each company and driven profits and business value through her efforts. These achievements help tell the story of her career and how those experiences will make her a strong candidate for a CDO role.

> Download: Gloria Edsall’s final resume

Edsall says she was most surprised by how Van Vreede was able to take the original resume down from seven pages to just two, and notes that the process helped her “learn what leaders look for when recruiting leaders.” She is happy with the final resume, which better highlights her qualifications for a C-suite role.

Van Vreede, too, sees Edsall’s experience and skills now standing out with a resume that is “aesthetically pleasing and packed with great content, but all wrapped up in just two pages,” he says. Overall, the final document will help Edsall demonstrate the value-add she brings to the table when interviewing with potential employers.

Careers, IT Leadership, Resumes

The digital transformation of the education sector is accelerating at pace. You don’t need to look far to find powerful examples of how data is helping to enrich both student and educator outcomes. By digitising its collections with the cloud, Gardens, Libraries and Museums of the University of Oxford reduced storage costs by 50-60% and avoided a 13% increase in management costs.

Anjanesh Babu of the University of Oxford worked with CirrusHQ to deliver the service. He said: “Cirrus HQ have been an engaging, proactive and learning partner for us. Every one of their team – from the account manager, to solution architects to accounts have a clear sense of dedication, purpose and focus putting the customer first. They are willing to learn and adapt from customer inputs which puts them solidly ahead in the partnership space. CirrusHQ are considered an extension of our internal team with shared expertise as well as knowledge.”

At the core of this transformation lies the need to leverage data and associated apps and services in a way that is agile, cost effective, secure and scalable. Migrating data, apps and services to a market-leading cloud provider, such as Amazon Web Services (AWS), delivers all of this and more. And working with a trusted AWS partner such as CirrusHQ can help education providers unlock the full potential of these benefits.

Bursting cloud-migration myths

Institutions are often concerned about losing control of their data, but cloud migration actually empowers data access and agility. Ultimately cloud migration, using a solution such as AWS, enables educators to focus more time, money and effort on delivering first-class outcomes rather than being distracted by the very real demands of running hardware and software.

Cloud services have evolved rapidly in recent years and many of the perceived barriers to migration no longer exist.

There are a couple of myths when it comes to the cloud. The first is that the cloud is too expensive. There is a cost to migration, but cloud providers such as AWS have reduced the total cost of ownership significantly. Cloud services now generally cost less than owning and managing a physical data centre.
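To make the cost point concrete, the following is a minimal, hypothetical sketch using the AWS SDK for Python (boto3): it applies an S3 lifecycle policy that moves older objects to cheaper storage classes, one common way storage bills shrink after migration. The bucket name, prefix, and transition windows are illustrative assumptions, not any institution’s actual configuration.

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket holding digitised collection scans; adjust names and
# timings to match your own data access patterns.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-digitised-collections",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-cold-scans",
                "Filter": {"Prefix": "scans/"},
                "Status": "Enabled",
                "Transitions": [
                    # Rarely accessed objects move to progressively cheaper tiers
                    {"Days": 90, "StorageClass": "STANDARD_IA"},
                    {"Days": 365, "StorageClass": "GLACIER"},
                ],
            }
        ]
    },
)
```

Rules like this keep rarely accessed material retained at archive-tier prices rather than on the most expensive storage class, which is one way institutions reduce storage costs without discarding data.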

The second myth is that the cloud isn’t secure. Market leaders, such as AWS, subject themselves to some of the strictest security controls and audits in the industry. The biggest risks instead now lie with apps and services which have been poorly designed by organisations themselves.

Six benefits of cloud migration in education

Cloud use cases can be found right across the education sector. For example, a step-change in innovation, performance and student provision can be achieved in administration and assessment processing, teaching practices, remote learning and continual professional development. Andre Zelenka, from Birkbeck, University of London said: “AWS Technology is vast and CirrusHQ have engaged with us to understand our requirements, propose a sensible way forward, and help us to execute that. All without recourse to AWS tech speak, smoothing the path for our projects.”

The wider benefits of cloud migration also include: 

- Cost reduction – Education companies can, on average, save just under a third (31%) of data management costs.
- Digital transformation – Cloud isn’t just a great way to store data, it is transformational. For example, it enables public sector organisations to innovate and adopt an entrepreneurial ‘fail fast’ mentality, accelerating time to market.
- Agility, staff productivity and staff retention – AWS migration is shown to trigger a 66% boost in administrator productivity.
- Security and resilience – According to IDC, IT systems downtime costs the global economy up to $2.5 billion annually. With AWS, however, companies operate on one secure, robust platform, enabling superior governance.
- Avoiding vendor lock-in – Market-leading cloud services such as AWS do all the heavy lifting, making it easy for customers using end-of-life products to migrate their databases and servers to the cloud and modernise in a streamlined way. Organisations are urged, however, to take action before vendor lock-in becomes an urgent problem.
- Scalability at speed – Cloud services such as AWS enable education sector organisations to futureproof their IT ecosystems, scale at their own pace, with no limits, adding resources at the right time and expanding their cloud environment to meet changing needs and goals.

Finding the right partner

Organisations in the education sector looking to move to the cloud will need help throughout the migration process. The ideal partner will have AWS partner certifications, a high customer satisfaction score and case studies demonstrating that they follow best practice.

Proven industry experience in this fast-moving area is also essential. CirrusHQ, for example, has an AWS migration track record going back 15 years, and answers an average of 4,000 customer support requests every month.

It is also worth working with a migration partner with a good geographic spread, so you can be confident of support whenever and wherever you need it. CirrusHQ has capability to deliver in both the UK and EMEA.

To find out more about how CirrusHQ can help, click here.

Education Industry