Seemingly since the beginning of time, CIOs have been working to change their IT organizations from “order takers” into “business partners.” They have established business relationship management functions, developed “we are the business” rallying cries, and built leadership development programs emphasizing influence, courage, and business acumen.

These efforts have had a positive impact, but an incremental one. Yes, most IT leadership teams have stronger relationships with their business partners than, say, five years ago, but there is still a long way to go. “Raise the credibility of the IT organization” continues to appear at the top of the wish list that our clients give to our firm when we launch a new CIO search.

Encouraging technologists, often introverts who have spent their careers mastering complex skills, to deepen their understanding of marketing, commercial operations, supply chain, and finance is a slow march. But with the movement of software into the heart of most organizations’ products, services, and growth strategies, a slow march is not sufficient.

So, how do CIOs expedite the business partnership skills of their teams? They adopt what is clearly becoming the gold standard of IT and business team integration: a capabilities (or product) management model.

Carissa Rollins, who became the CIO of Illumina in April of this year, strongly believes in the capabilities management model. With the $5 billion biotech company expanding from R&D and manufacturing into clinical-based genomic health, Rollins sees IT playing an increasingly critical role in business growth and patient care.

“Traditionally, Illumina has focused on the lab, but we are now moving out of the lab and into personalized patient care,” says Rollins. “We are working with physicians and payers on ways to help people understand their genomic health.”

One area where IT can make its business impact felt is Illumina’s recently announced NovaSeq X Series, a powerful set of sequencers that promises to advance the real-world impact of genomic sequencing.

“NovaSeq X is an amazing machine,” Rollins says. “Now we need to surround it with a great customer experience that helps providers and patients understand the impact of data on patient health.”

Refocusing IT for business impact

Illumina’s shift from the lab to the patient requires Rollins’ IT team to focus more acutely on customer data and experience. It also requires IT to take the initiative to co-create business solutions.

“In IT, we are too focused on doing what the business wants us to do, so we don’t take the time to invest and learn about the business,” she says. “So, when they tell us what system they want, we don’t have enough knowledge to say, ‘Here is a better idea.’”

Rollins believes a capabilities model is essential, and it starts by establishing standards and reining in shadow IT.

“It is important to strike the right balance between standards and citizen development,” she says. “In our complex world, IT cannot control everything, but we need standards, especially in our regulated environment. At the same time, we have to allow for citizen development, which will only grow as we hire young tech-savvy people who will work with RPA [robotic process automation] and ML [machine learning] on their own. They won’t wait for IT.”

RPA presents an excellent opportunity for citizen development, but not without the right foundation, as Rollins learned in a previous role. “Our business partners had created more than 300 bots without IT’s knowledge,” she says. “When we upgraded the system, we broke all of them.”

With standards and governance in place, the next step is defining the company’s target capabilities. On which capabilities does the company need to spend more time and money? On customer self-service to ensure a seamless experience? On IoT to be more efficient in device manufacturing?

“The good news is that most industries have a standard capability map to start with,” says Rollins. “Once we have that map, we need to socialize it with our business partners to make sure we all agree that these are Illumina’s target capabilities. This process never ends. The map is always evolving.”

With an agreed-upon capabilities map in hand, the next step is to assess how the current investment strategy aligns to it. “Once IT understands what we are investing in each capability, they become much more focused on our overall business strategy,” says Rollins. “They start to ask why we are spending so much on transportation management and so little on customer self-service, for example. The IT team starts to think like investors, not order-takers.” 

Down to execution

The next chapter in the capabilities story is, of course, delivery. “As CIO, my job is to build a model that gives IT and our business partners a roadmap that ties into our business strategy,” she says. “At that point, my role in capabilities management recedes, and the CTO position becomes more important.”

The CTO role has many different definitions in the market, but for Rollins and Illumina, that person is the lead architect and engineer of the platforms that support the capabilities roadmap. The CTO makes sure the platforms integrate, through APIs, into partner and customer platforms. “The CTO sets the standards for reusable platforms, while the capability manager knows what functions we need to deliver,” says Rollins.

Once you have a capability model that defines your investment strategy and a CTO to build your platforms, it is all about building the product teams to execute the capabilities map. “The temptation is to jump right in and build all of your capability teams at once,” says Rollins. “But my advice is to pick a few pilot areas, because how the capabilities teams will work together is very different from how work was done in the past.”

Let’s take customer self-service. The capability manager for the customer self-service team would likely be a very senior person from the customer service organization. That person listens to customer feedback to determine a features roadmap with sub-capabilities. The capability manager brings onto the team a lead engineer, responsible for architecture and design all the way through to testing. “These roles are no longer separate, which is a big shift for IT,” says Rollins. “Before, you had solution architects, developers, and testers. But in the capability model, the engineers are responsible for all those activities, which gives them greater responsibility for delivering the right capability.”

Rollins points out that each step toward a capability model is not linear, but should be run in parallel, and that not all capabilities, like those running on packaged software, will move into the new model right away. 

But regardless of the approach, it is important for CIOs to move to the new model. “In a capability model, IT is no longer accountable just for delivering a new website in China; they are accountable for delivering the customer experience and the sales around that website,” she says. “CIOs cannot build technology-forward businesses with a traditional IT delivery model. We have to shift from delivering IT to delivering capabilities.”


The cloud offers limitless scalability and flexibility, powering digital transformation across every industry. But without long-term strategic management, cloud spending can quickly escalate and impact margins, cost of goods sold (COGS), and cost of revenue (COR).

To optimize cloud investments, C-level executives are increasingly adopting cloud financial operations (FinOps). This framework positions organizations to manage their cloud investments more effectively, driving increased accountability to maximize business value. In this article, I’ll explore common cloud optimization and FinOps challenges and strategies for overcoming them.

1. Cloud usage & costs

Most enterprise companies have shared infrastructure, and managing cost allocation across marketing, HR, accounting, and other departments can be tricky. How they handle this depends upon the business-unit driver and the organization’s culture, typically defined at the C-level.

Cost allocation for each business unit must tie back to the key performance indicators (KPIs) and objectives and key results (OKRs) associated with that domain. Managing and aligning cost allocation to the business unit requires real-time visibility and reporting around cloud costs and usage, with cost allocation constructs aligned to departmental needs. Organizations must examine shared resources, storage costs, network costs, platform services, monitoring, logging, and licensing. Then they must choose a financial model, whether an even split, fixed, or proportional model.
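
The three financial models mentioned above can be sketched in a few lines. This is a minimal illustration, not a production chargeback system; the unit names and usage figures are hypothetical, and real allocations would draw metered usage from the provider's billing data.

```python
def allocate_shared_cost(total_cost, usage_by_unit, model="proportional", fixed_shares=None):
    """Split a shared cloud cost across business units.

    usage_by_unit: dict of unit -> metered usage (e.g., CPU-hours).
    fixed_shares:  dict of unit -> fraction of the bill, required for
                   the "fixed" model (fractions should sum to 1.0).
    """
    units = list(usage_by_unit)
    if model == "even":
        # Every unit pays the same share regardless of usage.
        return {u: total_cost / len(units) for u in units}
    if model == "fixed":
        # Shares negotiated up front, independent of actual consumption.
        return {u: total_cost * fixed_shares[u] for u in units}
    if model == "proportional":
        # Each unit pays in proportion to its metered usage.
        total_usage = sum(usage_by_unit.values())
        return {u: total_cost * usage_by_unit[u] / total_usage for u in units}
    raise ValueError(f"unknown model: {model}")

# Hypothetical monthly shared-infrastructure bill of $10,000.
usage = {"marketing": 120, "hr": 30, "accounting": 50}
print(allocate_shared_cost(10_000, usage, model="proportional"))
```

Which model fits depends on culture: an even split is simple but penalizes light users, while the proportional model rewards efficiency at the cost of more metering overhead.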

From a strategic perspective, some organizations set up executive sponsorship focusing on the FinOps maturity model and decision framework. Others start with a FinOps maturity assessment, establishing an actionable roadmap that defines the FinOps domain and organization roles and objectives, all aligned to business spending, efficiency, transparency, and compliance.

2. Performance tracking and benchmarking

When it comes to performance tracking and benchmarking, organizations frequently face challenges around resource utilization and efficiency. Utilization and efficiency provide vital insights for understanding the business value of expenses incurred. However, it can be challenging to measure the business value associated with each type of cloud resource based on performance, availability, and other factors.

Overcoming these challenges goes back to KPIs and OKRs. Organizations must define and track KPIs that meet efficiency and utilization objectives and deliver value creation. For example, if the goal is to reduce hot storage, a KPI must be defined to meet the efficiency objective and deliver value creation—and it must be measured. This requires adopting the right FinOps tools, processes, and people.

3. Real-time decision making

A framework and accountability structure form the foundation for real-time decisions around usage, costs, and performance to meet organizational goals. However, establishing a FinOps decision framework and accountability structure can pose a challenge, particularly for those organizations with low FinOps maturity.

The organization must first perform a maturity assessment to understand the role of FinOps within the context of the overall organization. Once in place, the organization can develop and assign a repeatable process that enables real-time decision-making to a center of excellence, steering committee, or governance structure, depending upon the organization.

4. Cloud rate optimization

In this domain, organizations define and adjust pricing model goals based on historical data and make purchase decisions based on goals and discounts being offered—all to optimize cloud spending. Cloud service providers offer slightly different products with unique embedded discounts: some providers have compute unit discounts, and some have utilization-based pricing.

As a result, organizations often face challenges around data analysis, show-back, and managing commitment-based discounts. To address this, many enterprises use a KPI-driven, hybrid-cloud purchasing strategy to align their commitment period with infrastructure workload characteristics and lifecycle. This strategy aligns well with the concept of a smart cloud.
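
The core trade-off in commitment-based discounts can be illustrated with a simple break-even comparison. This is a sketch under simplified assumptions: the rates, hours, and utilization figures below are hypothetical, and a real analysis would cover the whole portfolio and each provider's specific discount mechanics.

```python
def commitment_savings(hourly_on_demand, committed_rate, hours_in_term, expected_utilization):
    """Compare on-demand cost vs. a commitment-based discount for one resource.

    expected_utilization is the fraction of the term the resource is
    actually running. The commitment is paid for whether used or not.
    """
    on_demand_cost = hourly_on_demand * hours_in_term * expected_utilization
    committed_cost = committed_rate * hours_in_term
    return on_demand_cost - committed_cost  # positive => commitment saves money

# One year (8,760 hours) at hypothetical rates of $0.10 on demand
# vs. $0.06 committed.
print(commitment_savings(0.10, 0.06, 8760, 0.9))  # steady baseline: commit
print(commitment_savings(0.10, 0.06, 8760, 0.4))  # spiky workload: stay on demand
```

The point of a KPI-driven purchasing strategy is exactly this: commit only to the steady-state baseline that will keep the commitment busy, and leave variable demand on flexible pricing.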

5. Cloud usage optimization

Dynamically matching cloud resources to demand to optimize cloud usage and ensure sufficient business value can also pose challenges.

That’s why organizations increasingly implement automated workload management solutions that match running resources to workload demand, scaling, de-scaling, and even turning off unused resources in real-time to maximize ROI while minimizing the TCO. Business-driven KPIs and OKRs help organizations define outliers and set the thresholds that inform alerts and actions.
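
The threshold-driven decisions described above can be sketched as a simple policy function. The threshold values here are illustrative placeholders; in practice they would come from the business-driven KPIs and OKRs the article mentions, and the actions would be wired to the cloud provider's scaling APIs.

```python
def scaling_action(utilization, scale_up_at=0.75, scale_down_at=0.30, idle_at=0.05):
    """Map a utilization sample (0.0-1.0) to a workload-management action.

    Threshold defaults are hypothetical; real values are set from
    KPI/OKR analysis of each workload's demand profile.
    """
    if utilization >= scale_up_at:
        return "scale_up"        # demand exceeds capacity: add resources
    if utilization <= idle_at:
        return "shut_down"       # effectively unused: stop it to cut spend
    if utilization <= scale_down_at:
        return "scale_down"      # over-provisioned: shrink to save cost
    return "hold"                # within the efficient operating band
```

In a real deployment this policy runs continuously against streamed metrics, with alerts fired when samples cross the outlier thresholds.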

6. Organizational alignment

FinOps capabilities are embedded within organizational processes and units, and often, this is where companies fail at FinOps. Unfortunately, many start with technology (tools) instead of organizational alignment, creating conflicts and challenges around policies, governance, and areas of responsibility.

Organizations must establish a FinOps framework at the C-level, complete with policies, processes, best practices, and a playbook that help ensure organizational alignment and buy-in. Once aligned, organizations can harvest the benefits of FinOps including:

- Centralized smart-cloud cost management
- Alignment to and accountability of cloud roles and users
- Improved confidence and accuracy around budgets and forecasts
- Advanced communication throughout the organization, creating a FinOps culture

Improve cloud optimization and FinOps maturity

As data volumes and usage grow, cloud FinOps enables organizations to manage cloud investments more strategically, efficiently, and cost-effectively. Understanding FinOps maturity can help organizations identify and resolve trouble areas to improve cloud optimization and accelerate business outcomes.

GDT can help your organization improve cloud optimization and FinOps maturity. By engaging with GDT, you’ll get a portfolio-level analysis of your FinOps maturity, along with cloud service optimization opportunities and recommendations to help you maximize spend efficiency, reduce cloud costs, and develop strategies for proactively and dynamically managing future expenses and utilization.

Contact the experts at GDT to learn more about improving cloud optimization and FinOps maturity.


You’ve invested in state-of-the-art, end-to-end security solutions. You’ve implemented robust security and privacy policies and outlined best practices. You’ve got monitoring and detection in place at every level and you apply updates as soon as they’re available. You’ve done everything a smart and responsible organization needs to do to safeguard your systems, networks, data, and other assets from cyber threats.

The question is: are you prepared to recover from a cyber event?

As cyber threats increase in frequency and sophistication, most businesses will eventually fall prey to a cyber event, despite their best efforts. The longer it takes to recover, the more it will cost. Swift recovery is paramount to minimizing damage. Simply put, organizations must prepare to recover from a cyber event before it occurs.

Why a disaster recovery plan may not be good enough

Many organizations have disaster recovery plans and assume the concept of disaster recovery and cyber recovery are the same: a system or location goes down, you shift operations, complete recovery efforts, and return to normal. However, the two scenarios have some vital differences.

When a disaster happens — such as a data center fire or dying server hardware — you get alerted right away. You know when and where the disaster occurred and have a predictable recovery point objective (RPO).

On the other hand, with a cyber event, you’re sure of only one thing: there’s been an attack. You don’t know when it began, where it happened, the scope of the damage, or how to mitigate the intrusion. Although you may have been alerted on a Tuesday at 8 AM, the cyber event may have occurred days, weeks, or even months earlier — which means that the initial damages you’re aware of may only scratch the surface of a much bigger problem.

Furthermore, cyber-attacks have become increasingly sophisticated and commonplace, and a 5-year-old disaster recovery plan may not cover modern scenarios. If you don’t have a cyber event recovery plan, it could take days or even weeks to recover, costing time, money, customer trust, and lost business.

Store secondary copies of information offsite or off-network

In addition to the potential for natural disasters, storing data solely onsite exposes your business to risks such as backup file corruption should your local network suffer an attack. As part of your cyber event recovery plan, ensure you’re storing secondary copies of information offsite or off-network. Keep these copies readily available, so you can begin recovery efforts immediately to limit damage and costs.

Secondary data storage solutions range from offsite servers or tape storage to private or public cloud backups. Cloud storage is your best bet when it comes to accelerating the time to recovery. Data is easily accessible and does not require manual intervention, meaning recovery work can start quickly.   

Determine data classification and order of recovery

Data classification involves categorizing information based on sensitivity and business value. Organizations have many reasons to perform data classification, ranging from security and data compliance to risk management and storage cost control.

When recovering from a cyber event, data classification makes it easier to identify what data has been lost, the scope of the damage, and, ultimately, the event’s cause. When organizations do not understand data classification, recovery efforts take far longer and require far more work. And in some cases, they may not be able to recover fully.

Another critical and related piece of the puzzle is understanding the order in which your environment needs to be recovered. While many organizations are aware of this need, many are not prepared. Data classification enables you to identify codependencies within your IT topology. If your most critical application relies on lesser or noncritical systems to function, those systems need to be labeled as critical too.
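
Once those dependencies are mapped, recovery order falls out of a topological sort: every system comes back only after everything it relies on. The dependency map below is hypothetical; a real one would be derived from your configuration management database or IT topology.

```python
from graphlib import TopologicalSorter

# Hypothetical map: each system -> the systems it relies on,
# which must therefore be recovered first.
deps = {
    "billing_app":   {"auth_service", "customer_db"},
    "customer_db":   {"storage_layer"},
    "auth_service":  {"storage_layer"},
    "storage_layer": set(),
}

# static_order() yields prerequisites before dependents, so any
# system a critical app relies on is restored first — effectively
# inheriting the critical label regardless of its own classification.
recovery_order = list(TopologicalSorter(deps).static_order())
print(recovery_order)  # storage_layer first, billing_app last
```

`graphlib` is in the Python standard library (3.9+); it also raises `CycleError` if the dependency map is circular, which is itself a useful planning check.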

Have a failback plan

Once you have mitigated damages from the cyber event, you need to return operations from the secondary location to your original location. Having a failback plan in place — whether moving back to infrastructure that’s on-premises or in the cloud — enables your company to resume business as soon as possible with minimal downtime or data loss. Unfortunately, very few companies are positioned to do this quickly, costing additional time and money.

Your failback plan should incorporate all data and data changes as well as workflows. The failback plan should include your data classifications and order of recovery as well as testing to verify data accuracy, primary systems, and network quality. Ideally, the failback process should be automated.

Get industry-leading cyber recovery as a service

No matter how secure you’ve made your infrastructure, the likelihood of a cyber event impacting your organization eventually is relatively high. With cyber recovery plans in place, you can minimize damages and costs and accelerate the time to recovery.

One of the best things your organization can do is to take advantage of data or cyber recovery as a service (CRaaS). Based in the cloud, CRaaS saves time and money when a cyber event happens because it streamlines information recovery.

Zerto on HPE GreenLake makes cyber recovery faster and easier and frees up your organization to mitigate the threat and stop the intrusion, reducing the overall cost and damages caused by the cyber event. Benefits include down-to-the-second RPOs via continuous data protection and journal-based recovery. Zerto also offers the industry’s fastest recovery time objectives.

To learn more about how you can improve your readiness for a cyber event, talk to one of GDT’s cyber recovery specialists.


3M Health Information Systems (3M HIS), one of the world’s largest providers of software solutions for the healthcare industry, exemplifies 3M Co.’s legendary culture of innovation. By combining the power of a cloud-based data ecosystem with artificial intelligence (AI) and machine learning (ML), 3M HIS is transforming physician workflows and laborious “back office” processes to help healthcare organizations streamline clinical documentation and billing, enhance security and compliance, and redesign the physician-patient experience.

The cloud served as the foundation for this transformation. Migrating its 3M™ 360 Encompass™ System clients to Amazon Web Services (AWS) is helping 3M HIS improve the capture, management, and analysis of patient information across the continuum of care. 3M 360 Encompass is a collection of applications that work together to help hospitals streamline processes, receive accurate reimbursement, promote compliance, and make data-informed decisions. The cloud-based version of the platform has helped 3M HIS and its clients address three primary challenges: a disjointed patient care journey; the byzantine processes that often inhibit timely and accurate billing, reimbursement, and other record-keeping; and the ongoing need to protect and properly use patient data.

Improving the patient care journey with data and the cloud

The broader objective of 3M HIS’s evolving cloud transformation strategy is to help caregivers improve patient outcomes and staff efficiencies by removing barriers to care and providing access to contextually relevant insights at the time of care, according to Detlef Koll, Vice President of Product Development with 3M HIS. Caregivers now work with consistent, reliable tools within 3M 360 Encompass that improve communication and reduce the types of errors and delays that cause patient anxiety and revenue cycle inefficiencies.

The journey a patient takes through the healthcare system can span years and touch multiple providers, from primary care to specialists, test labs, medical imaging, and pharmacies. During each step, multiple types of data are captured in the patient’s medical record, which serves as an ongoing “narrative” of the patient’s clinical condition and the care delivered. Physician notes from visits and procedures, test results, and prescriptions are captured and added to the patient’s chart and reviewed by medical coding specialists, who work with tens of thousands of codes used by insurance companies to authorize billing and reimbursement.

A complete, compliant, structured, and timely clinical note created in the electronic health record (EHR) empowers many downstream users and is essential for delivering collaborative care and driving appropriate reimbursement. Supporting physicians with cloud-based, speech-enabled documentation workflows, 3M HIS further creates time to care by delivering proactive, patient-specific, and in-workflow clinical insights as the note is being created.

The goal of this automated computer-assisted physician documentation (CAPD) technology is to reduce the cognitive overload on physicians regarding coding requirements while closing gaps in patient care and clinical documentation. Without CAPD closing that loop in real time, errors or ambiguities in the clinical note lead to what Koll describes as an “asynchronous” process, requiring physicians to review and correct the note on a patient seen days earlier, thus taking the physician’s time away from patient care and causing delays in the revenue cycle.

To address the issue, 3M HIS needed a way to semantically integrate information from multiple data sources based on the needs of various use cases, so it deployed AWS data management tools and services, including Amazon RDS, Amazon Redshift, and Amazon Athena, for what Koll calls “opportunistic aggregation of information.” For example, for billing coding, the platform extracts only the relevant billable elements such as an office visit for which a claim will be submitted. This type of flexible, cloud-based data management allows 3M HIS to aggregate different data sets for different purposes, ensuring both data integrity and faster processing. “This is a dynamic view on data that evolves over time,” said Koll.

Improving workflows through intelligent, automated processes

The process for gathering data about a patient’s care, then extracting the billable elements to submit to an insurance company for reimbursement, has long been handled by professional coders who can accurately tag each medical procedure with the correct code out of tens of thousands of possibilities. Errors in that process can lead to rejected claims and additional time required by caregivers to correct any gaps or inconsistencies in clinical documentation, which in turn delays cash flows across the complex network of physicians, hospitals, labs, pharmacies, suppliers, and insurers. 

3M HIS’s cloud transformation strategy addressed this challenge by giving clients access to a new suite of data management and AI/ML tools that deliver levels of processing power, functionality, and scale unthinkable in the former on-premises model.  

“If you had to build some of the capabilities yourself, you would probably never get there,” said Michael Dolezal, Vice President of 3M Digital Science Community. With AWS tools such as Amazon QuickSight and Amazon SageMaker, 3M HIS’s clients can “get there” today: “Now our clients not only have a cloud-based instance for their data, but they gain access to tools they never had before and get the ability to do things they otherwise wouldn’t,” Dolezal said. By bringing 3M 360 Encompass to the AWS Cloud, 3M HIS has been able to scale natural language processing and automation capabilities and leverage tools such as Amazon Textract to improve data input and processing to more efficiently organize a patient’s chart.

Automatic speech recognition to capture the clinical narrative at the point of care, along with AWS AI/ML services, helps 3M HIS aggregate, structure, and contextualize data to enable the development of task-specific workflow applications. For instance, to mitigate the administrative burden on physicians, real-time dictation and transcription workflows can be enhanced with automated, closed-loop CAPD, whereby a physician dictating an admit note can be “nudged” that a certain condition is not fully specified in the note and can fix the gap in real time.

Taking frontline physician-assistive solutions to the next level, embedded virtual assistant technology can automate everyday tasks like placing orders for medications and tests. Innovating incrementally toward smarter and more automated workstreams, the 3M HIS ambient clinical documentation solution makes documentation in the EHR a byproduct of the natural patient-physician conversation and not a separate, onerous task for the doctor. This frees the physician to focus completely on the patient during the visit, thereby transforming the experience of healthcare for all stakeholders.

“We want to reduce the inefficient steps in the old model by unifying and information-enabling workflows so that documentation of the procedure and the coding of that procedure are no longer separate work steps,” said Koll. “It has the potential to save hours of time per day for a doctor.”  

Enhancing the security of patient data

The security and governance of patient data is non-negotiable in healthcare, an industry subject to the most stringent data privacy regulations. Administrators are obligated to make sure patient data is consistently used only for its intended purpose, is processed only by the application it was collected for, and stored and retained according to the specific national regulations involved. The cloud gives 3M HIS more confidence that the data passing through its platform remains secure throughout its journey.

“Using a cloud-based solution means you can apply the same security practices and protocol monitoring across all of your data in a very consistent way,” said Dolezal. The platform ensures a shared responsibility for security across 3M HIS, its clients, and AWS.  

Securing patient data in an on-premises health information system puts the onus to protect that information on the client’s infosec team, with risks compounded by each client’s unique IT infrastructure and internal data access policies. Security by design is one of the underlying operating principles for AWS. With a single set of code to monitor, maintain, and patch, 3M HIS is able to keep its platform current, quickly respond to new threats, and vigorously protect patient data centrally, with more certainty that its clients are protected as well.

4 best practices for data-driven transformation

Dolezal and Koll advise anyone considering moving large sets of data to the cloud to follow some fundamental precepts in designing a new solution:

- Start with the client and work backward to a solution: Be clear on the problem you want to solve and the outcomes you want to deliver to the caregiver and patient, and work backward from there to identify the right technology tools and services to help achieve those goals.
- Don’t over-engineer the solution: Many IT organizations are moving away from traditional point solutions for collecting, storing, and analyzing patient information. To reduce complexity, enhance security, and improve flexibility, consider an end-to-end solution that is easier to deploy and update than traditional on-premises solutions, and lets organizations add new functionality incrementally.
- Bake in security from the start: In highly regulated industries, such as healthcare and financial services, regulations demand high levels of security and personal privacy protection. These capabilities must be built in as foundational components of any system used to collect, manage, and analyze that data.
- Don’t constrain native data: Create a data management strategy that accommodates all types of data and isn’t confined to a specific set of use cases today. With both structured and unstructured data flowing into the system, the future ability to analyze the past means having a data schema that doesn’t need to be re-architected.

In an intense environment with a relentless focus on cost reduction and improved clinical outcomes in conjunction with greater patient and physician well-being, 3M HIS helps clients efficiently capture and access patient data, gain meaningful insights from all the data, and drive high-value action to meet complex goals.

Learn more about ways to put your data to work on the most scalable, trusted, and secure cloud.


Over the past several years, IT has undergone a profound shift in which a formerly support-oriented organization has taken on a much more prominent customer-centric role. Much of this has occurred thanks to the power of data to drive decisions and digital transformation’s impact in enabling companies to create new service- and data-based offerings around their core products.    

After hearing recently that Art Hu’s CIO role at Lenovo had been significantly expanded to incorporate customer solutions, I was eager to talk to him and learn more about how he has approached this shift to a “CIO-plus,” customer-centric IT leadership position. Our conversation touched on how this new opportunity evolved, the central role that data plays for CIOs today, and how his new CTO duties differ from those of his CIO post. What follows is an edited version of our interview.

Martha Heller: In addition to Lenovo CIO, this past April you became CTO of Lenovo Solutions and Services Group. What is that new business?

Art Hu: Last year, Lenovo stood up our Solutions and Services Group (SSG) as a part of our pivot from hardware sales to solutions, including offering our products as a service. Device-as-a-service (DaaS), for example, is our fastest growing business. It allows customers to avoid making large hardware capital expenditures and move to a “consume as you go” model where we manage, configure, and deploy their devices for them. 

Think of SSG as a high-growth startup. Last year, Lenovo earned more than $70 billion in revenue, of which SSG contributed a little more than $5 billion. Our goal is to double our revenue in the next few years.

What does SSG’s rapid growth indicate about the evolving world of business and technology?

The first point is the shift from delivery to outcomes. It is no longer our goal to simply manufacture and ship a piece of hardware. Our goal is to deliver value through business outcomes. But you can only pivot to a customer-outcomes mindset if you have customer intimacy and understand the context in which your technology is being used.

One example is Lenovo’s AIOps service, where we analyze a customer’s data about hybrid cloud service and make recommendations to optimize it for stability and performance. The rapid growth of SSG points to the fact that today, you need more than the right technology: The technology needs to make your data visible, accessible, and actionable.

What lessons learned can you offer CIOs on generating actionable data?

The first lesson is to create clear data standards. We learned that some data needs to be common (the definition of a ‘shipment,’ for example), but we also learned that we cannot standardize everything. We need to allow for some variation in how people operate, even within a global model.

The second is to start building guardrails around the data. When we told our business partners what they could and could not do with the data, they just went off on their own and did it anyway. We realized that we needed to strike the right balance between standards, guardrails, and flexibility. It took us a couple of iterations to get it right.

The third lesson is all about education. You cannot assume that your business partners know what to do with the data. Early adopters will know exactly what they want, but others may not. As an example, we had significantly upgraded our ability to gather feedback on social media, and we started to send customer comments from Twitter and LinkedIn back to our product development teams. We assumed those teams would love all that data, but we were wrong. Someone actually said, “Please stop sending me all of this data. I don’t know what to do with it.” We learned that delivering data is not enough. We need to help our business partners understand it and then use it, to take advantage of these marketplace insights.

What does the SSG CTO role entail?

The first part of my job is to choose the technology investments that will enhance our customer offerings. Where can RPA (robotic process automation) complement our solutions portfolio? How will AR and VR extend our capabilities? The second is to expand the solutions portfolio to bring more choices to our customers. And the third is to leverage SSG as a platform for innovation that drives the future of our solutions and services strategy.

How is that job different from your CIO role?

In both roles, I am continuously scanning the technology landscape to identify opportunities and building a strong engineering team and culture to deliver. But there are differences as well.

One difference is the level of external engagement. A CIO’s stakeholders are typically the business users within the company. As Lenovo’s CIO, I build our core business applications, social media and e-commerce sites, and spend time thinking about business scenarios, deployment, and ways to capture value. As CTO, I spend more time in the market, understanding emerging trends and the competitive landscape. That gives me a strong base of customer insight, which is then sharpened in discussions with our business leaders and sales teams. The result is a better-informed offering development process.

My perspective on budgets and investments is also different. As CIO, you typically have a budget against which you prioritize investments and initiatives. But as the CTO in a new P&L, if I cannot articulate a clear value proposition for my technology investment roadmap, my development budget is zero. The conversation shifts from “your budget needs to decrease by 10%, so you can do some but not all of your priorities” to “your budget is zero because we don’t believe your technology strategy will grow our P&L.” (That hasn’t happened yet, fortunately.)

Finally, there is the difference of working in a startup versus working at global scale. As CIO of Lenovo, I manage the teams that support a multibillion-dollar business. The SSG CTO role, on the other hand, has required me to get into the nitty-gritty of incubating a business. Being a leader in a $70 billion business is very different than supporting a new line of business in the “from-zero-to-one” stage of maturity. 

How did you wind up in the role?  

I was asked to take on both roles for several reasons. The first is that for years, industry trends have indicated that the future would be increasingly software-defined, so we have been building up our software capability within IT for quite a while. 

At the same time, SSG’s approach to developing our offerings has been to tap into the best of Lenovo’s assets from across the enterprise and integrate them into a single offering, with software being part of the “glue” that ties it together. The requirements to execute this type of exercise were a natural fit with IT’s software capabilities, especially the engineering methodologies, processes, and platforms that were needed to build SSG’s R&D platform.

Second is our “Lenovo powers Lenovo” concept. Over the years, our customers have wanted to know how we, at Lenovo, run our business: Supply chain planning, warehouse operations, and globalization are good examples. They also wanted to know more about the hybrid cloud solution we built in-house. They said, ‘Can’t we just buy what you’re doing?’ So, we took some of the solutions we had developed internally to run Lenovo, productized them, and began offering those to our customers. Although I didn’t know it then, I was incubating a small business within IT, and that was one of the seeds that ultimately led to my taking on the CTO role.

What advice do you have for CIOs who would like to take on a CTO position?

Don’t wait. If you are developing software solutions internally that could be valuable to your customers, start thinking about those solutions as the beginning of a business. As CIO, you are perfectly positioned to do this, since so many ingredients of a software-enabled solutions business are already sitting within your purview. My advice to CIOs who are looking to do more would be to look at the assets they already have.

IT Leadership

A household name in household goods, with annual sales of $22 billion, Whirlpool has 54 manufacturing and tech research centers worldwide and a portfolio of familiar brands including KitchenAid, Maytag, Amana, and Yummly. The company employs 69,000 people around the world, and Danielle Brown, its SVP and CIO, has a unique perspective on how best to lead the company’s digital transformation strategy.

Having joined the company in November 2020 to lead its Global Information Systems, Brown understands that cross-collaboration and effectively leveraging data to create new products and services are not only essential to future success, but also carry the responsibility of a privileged seat at the table: having a voice and an understanding of where technology is headed.

“Our vision is to be the best kitchen and laundry company in constant pursuit of improving life at home, which has become even more evident and important over the past couple of years,” she says. “Data shows that people continue to use our products on a more continuous basis. We’ve also seen people at home researching, browsing, and purchasing more online. All of these things have been transformational for our business.”

Of course, the end-to-end consumer journey is always a work in progress at Whirlpool, which began prior to Brown’s arrival. “But working across our leadership team,” she says, “one of the things I always say about IT is we have a unique view of the company. We can see all of the various processes, so with that unique vantage point, part of our role is to connect a number of those dots. That’s where we have the opportunity to talk about this as a full journey and know what a consumer has. We have to think about that technology and how it’s layered together as an IT organization. That is part of the value we bring to the table. So with my coming in, those are some of the things the IT organization focuses on as a leadership team.”

Brown recently spoke with CIO Leadership Live host Maryfran Johnson about advancing product features via sensor data, accelerating digital twin strategies, reinventing supply chain dynamics and more. Here are some edited excerpts of that conversation. Watch the full video below for more insights.

On four strategic priorities: One is delivering product leadership, which includes data and technology that support things like the digital twin and digital thread throughout a product’s lifecycle. And that is where the IT organization really has a hand in helping to enable that product leadership. The second is leveraging IoT and AI to support new digital services and new digital products that we can offer our consumers. Third is about winning that digital consumer journey by utilizing the technology to engage with a customer from pre- to post-purchase. And our fourth strategic priority is about reinventing the value chain with greater visibility. That’s another way in which our IT organization was able to work side by side with our business partners to advance this one. So end to end, our strategic priorities have stood the test of time.

On re-recruiting talent: Employees today have more options than they had in the past. As a company, we have to ensure we promote our value proposition. There’s the saying, “People will leave a boss, but not necessarily leave the company.” And what they want from their boss is someone who cares about their career. It’s the employee’s role first but they’re partnering with their boss or supervisor because they only have a limited view. So we have a tool called Career Compass, which shares employee experiences and helps let an employee know a manager cares about their career. When you have different leaders or new leaders in an organization, you don’t want your experiences to be forgotten. So we start with what that person’s career has been to date and then explore where you want to go in your career, but not on the traditional ladder. I’ve heard it referred to as the lattice. There are many different routes to take. It’s not necessarily about a job or promotion; it’s about the experiences that someone wants to have in their career, because it’s those experiences that are required if you want to be a global CIO or an enterprise architect. Things like that really matter and will allow companies to retain talent.

On innovation ecosystems: You have to think about what technology is really mature versus the technology that is more speculative. AI and machine learning are mature today. You also have natural language processing, doing technology through RPAs and things of that nature. So we’re leveraging those things in our business and market. But you also have the more speculative technology, like metaverse and blockchain and things of that nature. For emerging technologies like those, we’ll experiment internally and think about how they might apply to our business and how it could create new or different opportunities. But things have to add value for the end consumer. It can’t just be the technology for the sake of it.

On the enterprise data strategy: I am a self-admitted data geek. When you leverage internal data, you need governance around that data. The two are extremely important. Our priority is around delivering product innovation and having that digital twin or that digital thread where data is fundamental. This is working in partnership with the strategy of our product organization, and how to simplify the data and ensure it’s threaded throughout—in a digital way—or whether it’s embedded within our systems of record. The right governance around that product data has to be in place too so it can be used throughout the full product lifecycle. That’s how data governance is critical to our organization and analytics are a way to unlock value.

CIO, Data Governance, Digital Transformation, IT Leadership

It’s clear that in the last two years, the global pandemic has created unique circumstances for business and IT leaders at small- and mid-sized businesses. Yet strategic technology decisions, such as vendor consolidation, can help support a business’s ability to handle volatility and remain agile in challenging times.

In this second episode of our 5-episode podcast, Essential Connections: The Business Owner’s Guide to Growth During Economic Uncertainty, we examine how to future-proof your business with agile IT leadership.

In fact, according to GoTo research, 95% of those surveyed have consolidated, or plan to consolidate, to multipurpose vendors as they seek to streamline and optimize in the face of global uncertainty. Our guest, Paddy Srinivasan, GoTo’s CEO, says this is essential.

“When the pandemic set upon us, most IT departments were in a scrambling mode. And they bought a lot of technologies just to make ends meet as their workforce started dispersing across the world,” he explains. “So, there was a mad scramble for solutions for remote connectivity, monitoring, management, security, and the whole nine yards.”

Fast forward, and most IT departments are looking for ways to streamline these investments, he explains. “Having multiple vendors and tools drive up costs significantly. And it also becomes super hard to share information and have seamless workflows for all the tasks that the employees (both in IT and outside IT) must accomplish on a day-to-day basis.” The answer: Streamlining IT vendors.

Listen in to learn all the details, including Paddy’s actionable insights on how to reduce the burden on IT while keeping employees and customers happy.

Be sure to listen to other episodes in our series, Essential Connections: The Business Owner’s Guide to Growth During Economic Uncertainty, and learn how you can future-proof your business with agile IT leadership.

IT Leadership, Small and Medium Business

The good news for CIOs wanting to enable domain experts to develop their own apps to solve business problems is that there’s a vast array of low-code development platforms to choose from. The bad news: there’s a vast array of platforms to choose from.

CIOs may, therefore, have mixed feelings about SAP’s release of yet another low-code development platform into that crowded market at its TechEd developer conference this week.

Gartner calls the potential users of all these low-code development platforms “business technologists” — employees whose responsibilities include creating technology solutions, not simply using the tools IT gives them.

“Smart CIOs are looking for ways to exploit these skill sets, and that often includes the use of low-code platforms for enabling faster delivery of solutions,” said Dennis Gaughan, distinguished VP analyst at Gartner. “But one of the main concerns they have is trying to avoid a huge proliferation of low-code tools that create management and governance challenges.”

SAP’s new platform, SAP Build, aims to give business technologists — or builders, as SAP calls them — secure access to business processes and data to augment enterprise applications and automate processes through a drag-and-drop interface, while letting CIOs manage that access.

“The use of something like SAP Build offers them a platform that builds on existing investments in SAP and provides a more centralized mechanism for managing and governing the applications created by technologists across the enterprise,” Gaughan added.

Bernhard Schaffrik, principal analyst at Forrester, echoed the point about SAP Build offering ease of access to data, applications, and structures from the SAP world.

But however simple the low-code platforms an enterprise adopts, CIOs will still need to ensure users understand some of the complexity that lies behind them, he said. “What’s key is that business developers must be trained and educated about the impacts of building and running applications and automations regarding architecture, compliance, IT and information security,” said Schaffrik.

Build’s three pillars

SAP Build isn’t all new: the Build name previously referred to a user-interface prototyping service. Now a complete end-to-end development environment, Build contains elements of older offerings, including SAP Launchpad, a central point of access to in-house and third-party application extensions, and AppGyver, the no-code development platform vendor SAP acquired in February 2021. The AppGyver name will now disappear from the SAP environment, said SAP’s head of low-code and no-code products, Sebastian Schroetel.

Build also contains Work Zone, formerly a role-based web and workflow content offering; the name lives on as Build’s tool for creating pages, editing menus, and adding UI integration cards. Work Zone is one of Build’s three pillars, Schroetel said; the others are Build Apps and Build Process Automation.

SAP has been piloting Build internally and at Qualtrics, the customer experience company in which SAP owns a majority stake, Schroetel added. SAP’s talent attraction (recruitment) team used Build Process Automation to accelerate the headcount approval process, he said.

The new low-code platform shares some features with SAP’s pro-code development tools: both are built on Business Technology Platform, SAP’s data management layer, and share the same API hub, making it possible for business technologists and professional developers within an enterprise to collaborate.

The recycled name isn’t the only thing about Build that may give business technologists déjà vu: Last year at TechEd, SAP CTO Juergen Mueller talked up a unified low-code no-code experience formed by the fusion of AppGyver Composer and SAP Business Application Studio.

This year, SAP is taking that fusion further, adding more functionality and simplifying the interface. Build works with non-SAP systems, allowing the automation of processes in cloud-based productivity suites, for example. SAP Build’s process automation pillar also includes functionality from Signavio, a business process intelligence tool it acquired around the same time as AppGyver, providing visibility into existing processes and customized recommendations on how to simplify and optimize them.

Per-user pricing

Another area in which SAP is hoping to simplify things is pricing. SAP Build will be available as a subscription starting at €1,000 per month for 25 active users, with additional user licenses starting at €18 per active user per month, a company spokesperson said.

SAP Build won’t be the sole answer to everyone’s low-code needs, however, nor even every SAP shop’s needs.

“Any time you try to standardize on a single platform for development, you’re going to deal with tradeoffs,” Gartner’s Gaughan said. Compared to SAP’s plodding approach, “Smaller, independent low-code vendors that have been doing this for a while and are solely focused on low code can potentially add new capabilities more quickly,” he added.

Since uptake of such independent platforms is dependent on their ability to connect to other products, they may also have more integrations with other applications and processes, but they likely won’t be able to connect to SAP applications as deeply as Build can.

Forrester’s Schaffrik noted that SAP Build is as open to extension as most other low-code development and automation platforms out there, but cautioned, “The devil’s in the details, so I’m recommending decision makers to diligently look at what they require regarding ‘openness’ of a platform.” For Gaughan, it’s not a matter of choosing one development platform over another. “Ultimately, I think a lot of large enterprises will have multiple low-code tools in their kit bag,” he said, “and I suspect that, for existing SAP customers, Build will likely be a core part of their low-code strategy.”

Application Management, Build Automation, Cloud Management, IT Governance, No Code and Low Code, SAP

Decision support systems definition

A decision support system (DSS) is an interactive information system that analyzes large volumes of data for informing business decisions. A DSS supports the management, operations, and planning levels of an organization in making better decisions by assessing the significance of uncertainties and the tradeoffs involved in making one decision over another.

A DSS leverages a combination of raw data, documents, personal knowledge, and/or business models to help users make decisions. The data sources used by a DSS could include relational data sources, cubes, data warehouses, electronic health records (EHRs), revenue projections, sales projections, and more.

The concept of DSS grew out of research conducted at the Carnegie Institute of Technology in the 1950s and 1960s, but really took root in the enterprise in the 1980s in the form of executive information systems (EIS), group decision support systems (GDSS), and organizational decision support systems (ODSS). With organizations increasingly focused on data-driven decision making, decision science (or decision intelligence) is on the rise, and decision scientists may be the key to unlocking the potential of decision science systems. Bringing together applied data science, social science, and managerial science, decision science focuses on selecting between options to reduce the effort required to make higher-quality decisions.

Decision support system examples

Decision support systems are used in a broad array of industries. Example uses include:

GPS route planning. A DSS can be used to plan the fastest and best routes between two points by analyzing the available options. These systems often include the capability to monitor traffic in real time to route around congestion.

Crop planning. Farmers use DSS to help them determine the best time to plant, fertilize, and reap their crops. Bayer Crop Science has applied analytics and decision support to every element of its business, including the creation of “virtual factories” to perform “what-if” analyses at its corn manufacturing sites.

Clinical DSS. These systems help clinicians diagnose their patients. Penn Medicine has created a clinical DSS that helps it get ICU patients off ventilators faster.

ERP dashboards. These systems help managers monitor performance indicators. Digital marketing and services firm Clearlink uses a DSS to help its managers pinpoint which agents need extra help.
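The GPS example above is, at heart, a shortest-path computation over a weighted road graph. A minimal sketch in Python using Dijkstra’s algorithm (the road network and travel times below are invented for illustration):

```python
import heapq

def fastest_route(graph, start, goal):
    """Dijkstra's algorithm: returns (total_minutes, route) or (None, []).
    graph maps node -> list of (neighbor, travel_minutes)."""
    queue = [(0, start, [start])]   # (cost so far, node, path taken)
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, minutes in graph.get(node, []):
            if neighbor not in seen:
                heapq.heappush(queue, (cost + minutes, neighbor, path + [neighbor]))
    return None, []

# Toy road network: congestion on the direct A->D road makes the detour faster.
roads = {
    "A": [("B", 5), ("D", 30)],   # A->D is jammed: 30 minutes
    "B": [("C", 5)],
    "C": [("D", 5)],
}
print(fastest_route(roads, "A", "D"))  # (15, ['A', 'B', 'C', 'D'])
```

A real traffic-aware DSS would refresh the edge weights continuously from live congestion data and re-run the same computation, which is why these systems can “route around congestion.”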

Decision support systems vs. business intelligence

DSS and business intelligence (BI) are often conflated. Some experts consider BI a successor to DSS. Decision support systems are generally recognized as one element of business intelligence systems, along with data warehousing and data mining.

Whereas BI is a broad category of applications, services, and technologies for gathering, storing, analyzing, and accessing data for decision-making, DSS applications tend to be more purpose-built for supporting specific decisions. For example, a business DSS might help a company project its revenue over a set period by analyzing past product sales data and current variables. Healthcare providers use clinical decision support systems to make the clinical workflow more efficient: computerized alerts and reminders to care providers, clinical guidelines, condition-specific order sets, and so on.

DSS vs. decision intelligence

Research firm Gartner declared decision intelligence a top strategic technology trend for 2022. Decision intelligence seeks to update and reinvent decision support systems with a sophisticated mix of tools including artificial intelligence (AI) and machine learning (ML) to help automate decision-making. According to Gartner, the goal is to design, model, align, execute, monitor, and tune decision models and processes.

Types of decision support system

In the book Decision Support Systems: Concepts and Resources for Managers, Daniel J. Power, professor of management information systems at the University of Northern Iowa, breaks down decision support systems into five categories based on their primary sources of information.

Data-driven DSS. These systems include file drawer and management reporting systems, executive information systems, and geographic information systems (GIS). They emphasize access to and manipulation of large databases of structured data, often a time-series of internal company data and sometimes external data.

Model-driven DSS. These DSS include systems that use accounting and financial models, representational models, and optimization models. They emphasize access to and manipulation of a model. They generally leverage simple statistical and analytical tools, but Power notes that some OLAP systems that allow complex analysis of data may be classified as hybrid DSS systems. Model-driven DSS use data and parameters provided by decision-makers, but Power notes they are usually not data-intensive.

Knowledge-driven DSS. These systems suggest or recommend actions to managers. Sometimes called advisory systems, consultation systems, or suggestion systems, they provide specialized problem-solving expertise based on a particular domain. They are typically used for tasks including classification, configuration, diagnosis, interpretation, planning, and prediction that would otherwise depend on a human expert. These systems are often paired with data mining to sift through databases to produce data content relationships.

Document-driven DSS. These systems integrate storage and processing technologies for document retrieval and analysis. A search engine is an example.

Communication-driven and group DSS. Communication-driven DSS focuses on communication, collaboration, and coordination to help people working on a shared task, while group DSS (GDSS) focuses on supporting groups of decision makers to analyze problem situations and perform group decision-making tasks.
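The “suggestion system” flavor of knowledge-driven DSS described above can be sketched as an ordered list of domain rules evaluated against the facts of a case. A toy illustration in Python (the triage thresholds are invented for the example, not clinical guidance):

```python
# Each rule: (condition on the case facts, recommended action).
# Rules are ordered from most to least severe; the first match wins.
RULES = [
    (lambda c: c["temp_c"] >= 39.5 or c["spo2"] < 90, "escalate: urgent review"),
    (lambda c: c["temp_c"] >= 38.0, "monitor: recheck in 4 hours"),
    (lambda c: True, "routine: no action needed"),  # default fallback
]

def recommend(case):
    """Return the action of the first rule whose condition matches the case."""
    for condition, action in RULES:
        if condition(case):
            return action

print(recommend({"temp_c": 38.4, "spo2": 96}))  # monitor: recheck in 4 hours
```

Production knowledge-driven systems encode far larger rule bases, often authored by domain experts and paired with data mining, but the advisory pattern is the same: facts in, recommendation out.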

Components of a decision support system

According to Management Study HQ, decision support systems consist of three key components: the database, software system, and user interface.

DSS database. The database draws on a variety of sources, including data internal to the organization, data generated by applications, and external data purchased from third parties or mined from the Internet. The size of the DSS database will vary based on need, from a small, standalone system to a large data warehouse.
DSS software system. The software system is built on a model (including decision context and user criteria). The number and types of models depend on the purpose of the DSS. Commonly used models include:
Statistical models. These models are used to establish relationships between events and factors related to that event. For example, they could be used to analyze sales in relation to location or weather.
Sensitivity analysis models. These models are used for “what-if” analysis.
Optimization analysis models. These models are used to find the optimum value for a target variable in relation to other variables.
Forecasting models. These include regression models, time series analysis, and other models used to analyze business conditions and make plans.
Backward analysis sensitivity models. Sometimes called goal-seeking analysis, these models set a target value for a particular variable and then determine the values other variables need to hit to meet that target value.
DSS user interface. Dashboards and other user interfaces that allow users to interact with and view results.
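The sensitivity (“what-if”) and goal-seeking models above fit in a few lines of code. A sketch under a deliberately simple, hypothetical revenue model (price times units, with unit sales falling as price rises):

```python
def revenue(price, base_units=10_000, elasticity=50):
    """Hypothetical model: every $1 added to price costs `elasticity` unit sales."""
    units = base_units - elasticity * price
    return price * units

# Sensitivity ("what-if") analysis: sweep an input, observe the output.
for price in (80, 100, 120):
    print(price, revenue(price))

# Goal seeking (backward sensitivity): find the price that hits a revenue
# target via binary search. Assumes revenue rises with price on [lo, hi].
def goal_seek(target, lo=0.0, hi=100.0, tol=0.01):
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if revenue(mid) < target:
            lo = mid
        else:
            hi = mid
    return round((lo + hi) / 2, 2)

print(goal_seek(400_000))  # price needed for $400k revenue under this model
```

The sketch is purely illustrative; a production DSS would fit models like this to real sales data and put governance around both the inputs and the assumptions.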

Decision support system software

According to Capterra, popular decision support system software includes:

Checkbox. This no-code service automation software for enterprises uses a drag-and-drop interface for building applications with customizable rules, decision-tree logic, calculations, and weighted scores.

Yonyx. Yonyx is a platform for creating DSS applications. It features support for creating and visualizing decision tree–driven customer interaction flows. It especially focuses on decision trees for call centers, customer self-service, CRM integration, and enterprise data.

Parmenides Eidos. Geared for midsize/large companies, Parmenides Eidos provides visual reasoning and knowledge representation to support scenario-based strategizing, problem solving, and decision-making.

XLSTAT. XLSTAT is an Excel data analysis add-on geared for corporate users and researchers. It boasts more than 250 statistical features, including data visualization, statistical modeling, data mining, stat tests, forecasting methods, machine learning, conjoint analysis, and more.

1000minds. 1000minds is an online suite of tools and processes for decision-making, prioritization, and conjoint analysis. It is derived from research at the University of Otago in the 1990s into methods for prioritizing patients for surgery.

Information Builders WebFOCUS. This data and analytics platform is geared for enterprise and midmarket companies that need to integrate and embed data across applications. It offers cloud, multicloud, on-prem, and hybrid options.

QlikView. QlikView is Qlik’s classic analytics solution, built on the company’s Associative Engine. It’s designed to help users with their day-to-day tasks using a configurable dashboard.

SAP BusinessObjects. BusinessObjects consists of reporting and analysis applications to help users understand trends and root causes.

TIBCO Spotfire. This data visualization and analytics software helps users create dashboards and power predictive applications and real-time analytics applications.

Briq. Briq is a predictive analytics and automation platform built specifically for general contractors and subcontractors in construction. It leverages data from accounting, project management, CRM, and other systems to power AI for predictive and prescriptive analytics.

Analytics, Data Science

Even though 90% of IT leaders in the UK expect an economic downturn, technology spending this year is set to grow at its third fastest rate in over 15 years, and most tech executives expect their budget to rise in 2023, according to the latest Digital Leadership report from talent and technology solutions firm Nash Squared (formerly Harvey Nash Group).

More than half (52%) of IT leaders in the UK polled for the report expect their technology budget to rise, and 56% of UK organisations expect to increase their technology headcount in 2023. Only approximately one in seven believe their budget will fall.

The Nash Squared Digital Leadership Report marks the 24th year the firm has polled leading CIOs, CTOs and CDOs, with this year’s study finding a surprising uptick in global, European and UK tech spending, as organisations accelerate digital transformation initiatives and adapt to hybrid working.

The report polled 1,783 digital leaders across 87 countries, including 746 in the UK, and discovered that improving operational efficiency, customer experience and developing new products and services are the top three priorities for digital leaders, markedly similar to last year’s findings.

It also suggested, however, that many of those same leaders are yet to feel the pinch from the recession and cost-of-living pressures.

“Economic headwinds are gathering and indicators are turning negative—but despite or even because of this, UK businesses know that investment in technology remains crucial. Both to maximise the efficiency of what they already have and to become more agile and responsive in highly unpredictable conditions, technology is the key enabler,” said Bev White, CEO of Nash Squared.

Tech spend is rising as lines between IT and business blur

Nash Squared’s report pointedly highlights that business and technology spending are becoming increasingly entwined, and that how businesses define technology spending is getting hazy.

“What defines ‘technology spend’ is a point of debate,” according to the report, which added that the average IT budget for all respondents was a surprisingly high 7% of organisation revenue. “Most would agree spending on IT infrastructure is technology spend. Most would agree spending on Google adverts is not. But in the middle ground sit applications like customer systems, new technology products and apps. Is even defining it as ‘technology spend’ helpful?”

White believes that the growth in spending can be seen as a lasting impact from Covid-19, which gave CIOs the encouragement “to go faster and further”, in particular accelerating their investments in data and cybersecurity.

Alex Bazin, CTO at legal firm Lewis Silkin LLP, said that the Nash report’s headlines echo what he sees on a “day-to-day basis”, with investment rising at his company—a City firm focused on driving internal efficiencies. “We’ve got to keep the investment even, maybe even especially, through economic downturn,” he said.

Part of this investment, he said, can be attributed to hybrid and remote working, with Lewis Silkin’s new office in Manchester representing a “significant investment in itself.”

Bazin—who said that ROI timeframes remain relatively unaffected, with most projects needing to deliver value within two years—does however note that the lines between IT and business spending are blurring. This potentially muddies the waters in terms of understanding where tech investment growth is coming from, and the impact it has on more traditional IT budgets.

“It’s hard to draw the line sometimes,” he said, giving the example of library services falling into his remit and budget, as subscription services fall into the realm of data and knowledge sharing. He adds that other industries, such as automotive manufacturing, have the additional complexity of IT/OT (operational technology) convergence and budgets falling between the cracks of technology, product and operations.

Tech executives in a variety of industries agree the lines between spending on IT and other segments of business are blurring. Nadine Thomson, Global CTO at Mediacom, gave one such example at the WPP-owned media agency.

“If you think about product, product doesn’t necessarily always sit in an IT or even in a CTO function,” she says. “In my role, I kind of share product…with our chief product officer. So that’s one example of an area where you wouldn’t necessarily see it all on the IT line.”

Business-led tech is on the rise

There has also been a general increase in business-led technology, she notes, highlighting that this would not be cloud hosting or licensing costs, but rather areas like business analysis and product management.

“I wonder if some organisations are starting to think about how they’re accounting for technology more broadly,” she said, adding that budgets are now under ‘more pressure’ than two months ago.

In fact, a lot has changed recently. The Nash Squared survey gathered responses between 20 July and 10 October; chancellor Jeremy Hunt axed most of the so-called ‘mini budget’ seven days after the survey closed, and then-prime minister Liz Truss announced her resignation on 20 October. The political tumult has added to general economic uncertainty.

For Scott Petty, the CDIO (chief digital information officer) at Vodafone, the financial uncertainty represents another opportunity for CIOs to drive change through crisis, even if investments are more acutely focused on projects which can save energy and drive operational efficiencies.

“So things like investments to save energy, and automation. Anything that can reduce power consumption, suddenly has an amazing business case,” he said on the sidelines of Gartner’s symposium in Barcelona, which ended Thursday. “So you’re seeing a wave of investments in those areas,” he said, pointing to data centre consolidation and cloud migration plans moving from “three-year plans to 18-month plans.”

“Will that continue? It really depends how long the downturn lasts, how long the headwinds are and how big the energy upside is,” Petty said.

AI and RPA cuts as projects get prioritised

Even though CIOs seemingly have yet to feel the impact of economic headwinds, some technologies have already been scaled back. Although investment remains strong in cloud (67% of executives polled by Nash reported large-scale usage in the UK), companies are cutting back their investments in big data and RPA (robotic process automation).

“As the CIO, you want to make sure that you’re putting things on the board that show real value and help the business transform itself, grow and scale, and be more productive,” said Nash Squared’s White, adding that organisations are committing to bigger infrastructure projects rather than smaller, more iterative AI projects which “start out small and then permeate.”

Mediacom’s Thomson expressed surprise at the relative fall from grace for RPA, suggesting that efficiency must be king in financially uncertain times.

“We know that talent is getting harder to get, so cutting back on anything that’s going to drive automation or RPA is a strange decision,” she said.

Lewis Silkin’s Bazin has a similar stance on AI, saying that this suite of technologies is “front and centre” to the company’s ambitions, with “nothing on pause.” In particular, he said the law firm is looking to AI for document discovery to help build legal cases and give advice to clients, as well as for contract analysis, contract automation and fact-checking on case law. “Anything that decouples effort from the equation makes a quick difference, and a rapid ROI,” he said.

Skills gap and diversity progress—but sustainability stalls

Elsewhere in Nash Squared’s report, there was concern over how cost-of-living pressures were affecting salary demands and thus recruitment, and frustration with the UK government’s inability to tackle the digital skills divide. But there was promising news for gender diversity: approximately 28% of new hires are female, and the recruitment firm attributes the slight rise in the number of female leaders (up from 12% to 15% year-on-year) and in female hires at least in part to greater flexibility in the workplace.

Thomson, though encouraged by the findings, believes that building diverse teams is an ongoing journey. At Mediacom, she points to a “reasonably diverse” technology team, built over three years through partnerships with the likes of D&I organisation Tech Talent Charter, connections in the CIO-CTO world, and internal schemes like WPP ‘Visible Start’, which gives women an opportunity to come back after a career break. But she believes that developing the company culture, as well as active role modelling, is pivotal to get to a point where word-of-mouth drives diversity and employee retention.

“People recommend or bring in other people. And that’s actually really helpful, because they stick—and they stick because the culture is already there. You’re coming into a welcoming culture.”

Big data analysts, cybersecurity experts and technical architects were the top three job types sought in the UK, according to the report, but rising salaries were a concern, with almost two-thirds of UK leaders saying that the rising cost of living has made salary demands ‘unsustainable’.

Sustainability, as was the case last year, remains somewhat down on the CIO’s priority list. In the UK, while 43% of respondents think technology has a ‘big part to play’ in sustainability, only 22% are using technology to measure their carbon footprint to any great extent.

A fifth of digital leaders polled in the UK (20%) think sustainability had only a ‘negligible or no part to play in 2022’, leading Nash Squared to question if there’s a vacuum in leadership responsibility for sustainability. “We expected to see it playing a greater role than when we measured it last year when in fact it appears little has changed,” the report said. “Do digital leaders have their heads firmly in the sand or is the board not focusing them on this?”

Bazin notes that IT concerns about sustainability could differ widely by sector—a logistics company may, for instance, see a bigger environmental impact from haulage by air or sea than from IT—but believes an altogether bigger challenge is focusing on goals for the year ahead. In the study, the concern UK leaders cited most often was a lack of focus on digital innovation (21%), followed by under-resourcing (18%) and prioritising ideas (10%).

“IT has been so busy in innovating core IT for the pandemic, and adapting for the hybrid workplace. But it is really important that business owns business innovation, and IT owns its part of that,” Bazin said, adding that getting the right team and process structures in place is at the top of his agenda.
