Large IT projects are hard to execute, particularly when in-house staff are pulled away by their day jobs and distracted by other priorities. This can be costly for organizations. In fact, McKinsey suggests that early cost and schedule overruns can cause projects to ultimately cost twice as much as anticipated. One common way to address this challenge is for companies to seek outside support to ensure success.

There are four critical ways that outside support can make a difference.

Rapid talent aggregation

One of the most challenging aspects of software engineering in today’s environment is assembling quality talent. CIO magazine, in identifying the top 10 most in-demand tech jobs, reports that 86% of technology managers find it challenging to hire skilled professionals. If your current team does not have the capacity or skills to tackle the required project, consider an outsourced partner.

The right partner can bring together a team of highly skilled engineers in a matter of days or weeks, allowing you to accelerate development and deliver your key projects in a timely fashion. Key talent can be added to and removed from projects as needed. Selected properly, your outsourced provider will have a group of tried-and-tested experts with deep knowledge of the chosen tech stack, and can therefore iterate and build much faster.

Developer velocity 

Time to market is critical for the success of any project, particularly when it impacts revenue. Therefore, IT projects must be scoped, ramped, and run expeditiously in order to take advantage of market dynamics. 

In an in-depth study of 440 large enterprises, McKinsey identified the most critical factors that enabled organizations to achieve high developer velocity. Four key areas have the greatest impact on software development performance: tools, culture, product management and talent management. The study revealed those with higher developer velocity outperform competitors by up to five times. 

When selecting an outsourcer, validate what tools and project management structure they will bring to the table, and validate past project success in terms of both budget and on-time delivery. Inspect project plans to ensure they include full and rigorous testing, especially around security, full quality assurance, and performance optimization.

A project outsourced to an established applications platform provider with dedicated experts, like Edgio, will include rigorous testing and rollout plans, full quality assurance, and performance optimization—ensuring that your investment ultimately delivers peak efficiency for your customers and your business.

Knowledge sharing

Great professional services teams accumulate best practices over time and will bring complementary skill sets into the business they’re partnering with. Shared knowledge helps grow the skillset of your internal team, and enables them to contribute more meaningfully to the success of your business.

Employee satisfaction can even increase from the sense of personal and professional progress that comes with learning new technologies, frameworks, or languages during major IT projects developed in partnership with external experts.

This aspect cannot be overlooked, given that 91% of employees report being frustrated with inadequate workplace technology and 71% consider looking for a new employer as a consequence. Expert teams have depth of knowledge on a breadth of tools that helps save tremendous time and many headaches by creating efficient, automated workflows.

Ensure that your team gets the opportunity to work directly with your outsourced development team to facilitate knowledge sharing.

Faster deployment cadence

Companies integrating software development with IT operations are seeing increased productivity and 83% faster releases. We’ve personally seen deployment cadences double through the use of Edgio’s integrated workflow for web application deployment. 

Leveraging experts who start on day one with automated deployment and testing, standardized processes, and improved development and operations communication can bring releases to market faster. Enable your team to innovate more and wait for code less. 

To outsource or not to outsource?

Large projects can take a significant toll on an organization if they are not managed properly. To be effective and efficient, project teams need a common vision, shared team processes, and a high-performance culture.

If you’re asking yourself the following questions, consider hiring a team of experts: 

What architecture do we need to support a next-generation operating model?
How can we rapidly build, scale, and sustain a cutting-edge, customer-centric tech stack?
What technologies, frameworks, or API integrations provide a high-quality experience?
How do we create the most secure workflow for fast releases and updates?

At first glance, outsourcing can seem an expensive option. However, I advise businesses considering software development outsourcing to think long-term. The right team will minimize costs and bring more value by delivering a better product quicker with a more robust and flexible IT architecture, and will ultimately generate significant ROI.

Edgio accelerates your web development and application performance. Learn more about Edgio and our expert services.


Data is what drives digital business. Consider how strategically important it has become for companies to leverage advanced analytics to uncover trends that can help them gain decisive insights they might not otherwise possess.

But data-driven projects are not always easy to launch, let alone complete. In fact, enterprises face several challenges as they look to leverage their information resources to gain a competitive advantage.

Foundry’s recent Data & Analytics Study looked into why organizations have difficulty making good on the promise of data-driven projects, and revealed several key roadblocks to success. Here are the top six reasons data initiatives fail to materialize and deliver, as revealed by the research, along with tips from IT leaders and data experts on how to overcome them.

1. Lack of funding for data initiatives

Funding can be hard to come by for any technology initiatives, particularly in an uncertain economy. This certainly applies to data projects. These undertakings might be competing with a host of other initiatives in need of financing, so it’s important for IT leaders and their data teams to present a strong business case for each project, and to not make them overly complex.

“While budget is always tricky, this is a question of priorities and right-sizing the body of work,” says Craig Susen, CTO and technology enablement lead at management consulting firm Unify Consulting. “Looking for obvious outcomes [does not] always require reworking the entire infrastructure.”

Being data-driven is as much a cultural pursuit as it is anything else, Susen says. “It requires designing/rethinking key performance indicators, capturing data in a smart timely manner, landing it in common areas quickly,” he says. “Then it can be evaluated and aggregated, either applying advanced visualization technologies or working it against machine learning algorithms. It’s all a complicated bit of science. Having said that, many companies overcomplicate this process by trying to do too much all at once or over-indexing in places that don’t drive true value to their businesses and customers.”

CIOs and other technology leaders need to develop strong working relationships with fellow C-suite members, particularly CFOs. In many cases it’s the finance executive who makes the decision on budget approvals, so to improve the likelihood of getting the needed funding, technology chiefs need to be able to demonstrate why data-driven projects are important to the bottom line.

2. Lack of a clearly articulated data strategy

Lacking a complete data strategy to guide data-driven projects “is like not having an outline to guide a thesis,” says Charles Link, senior director of data and analytics at Covanta, a provider of sustainable materials management and environmental solutions.

“Every project should contribute some paving stones to the road leading to the desired destination,” Link says. “A data strategy identifies how to align information and technology to help you get there. Your business should be able to travel down the road as you deliver value.”

To be successful, a data strategy should have both a data management component — generally IT tools, technologies, and methods — and a data use strategy, Link says.

Oftentimes there isn’t a clear understanding within enterprises of what data is available, how the data is defined, how frequently it changes, and how it is being used, says Mike Clifton, executive vice president and chief information and digital officer at Alorica, a global customer service outsourcing firm.

Companies need to create a common language among stakeholders in advance of establishing any data-driven projects, Clifton says. “If you don’t have a solid foundation, budget and funding are too unpredictable and often get cut first due to a lack of clear scope and achievable outcome,” he says.

3. Technology to implement data projects is too costly

Making the challenge of getting sufficient funding for data projects even more daunting is the fact that they can be expensive endeavors. Data-driven projects require a substantial investment of resources and budget from inception, Clifton says.

“They are generally long-term projects that can’t be applied as a quick fix to address urgent priorities,” Clifton says. “Many decision makers don’t fully understand how they work or deliver for the business. The complex nature of gathering data to use it efficiently to deliver clear [return on investment] is often intimidating to businesses because one mistake can exponentially drive costs.”

When done correctly, however, these projects can streamline and save the organization time and money over the long haul, Clifton says. “That’s why it is essential to have a clear strategy for maximizing data and then ensuring that key stakeholders understand the plan and execution,” he says.

In addition to investing in the tools needed to support data-driven projects, organizations need to recruit and retain professionals such as data scientists. These in-demand positions typically command high levels of compensation.

4. Other digital transformation initiatives took priority

Digital transformations are under way at organizations in virtually every industry, and it’s easy to see how projects related to these efforts could be given a high priority. That doesn’t mean data-driven projects should be put on the back burner.

“If digital transformation efforts are taking priority over data initiatives, then you need to re-evaluate,” Link says. “All digital transformation initiatives should envelop data initiatives. You cannot have one without the other.”

Ignoring the data aspects of transformation could invite failure of other initiatives. “I would be concerned to pursue digital transformation without a solid data strategy, as the results, iterations, and pivots needed to be successful should all be data-driven decisions,” says David Smith, vice president and CIO at moving and logistics company Atlas Van Lines.

“If this is an organizational roadblock, I would recommend using the digital transformation initiative as the genesis of a data strategy execution,” Smith says.

5. Lack of executive buy-in or advocacy for data initiatives

If senior executives are not sold on data-driven projects, their chance of success will likely diminish because of lack of adequate funding and resources.

“Lack of buy-in from the top can kill a data-driven project before it starts,” says Scott duFour, global CIO at Fleetcor, a provider of business payments services. “I am fortunate that isn’t a problem at Fleetcor, as I get buy-in for projects from our CEO by partnering with leadership running lines of business to validate the importance of big data for company growth and success.”

To get executive buy-in, technology leaders must be able to articulate from the beginning what the outcomes of data projects will be and align them to business priorities or pain points, Clifton says. Ironically, all digital-related deployments depend heavily on data to achieve benefits, “so whether or not the executives realize it, they are funding data initiatives,” he says.

The organization’s data strategy should inform executives about how data projects can support the goals of the business. “The data initiatives should focus on the accomplishment of those objectives through actionable intelligence and automation,” Link says.

In some cases, the lack of support might stem from the fact that business leaders do not really know what they want from data projects, and therefore do not understand the value, Smith says. “If they cannot see the value, then they won’t support it,” he says.

It’s a good practice to use small proof of concept opportunities to show the value through operational dashboards or the automation of manual tasks, Smith says. “This will create interest from the executive team,” he says.

6. Lack of appropriate skill sets

The technology skills shortage is affecting nearly every area of IT, including data-driven projects.

“Without enough IT talent and people with the right skill sets, it’s tough to get data-driven projects done,” duFour says. “And the IT employee shortage is real in several areas of IT.” To try to draw technology workers, Fleetcor offers flexible working arrangements and provides training so employees can improve their skills.

“We have also cast a wider net in the talent search,” duFour says. “Although a four-year degree or more is ideal, companies should look for potential employees with associate degrees, IT-type certifications, and other pertinent skills that can help move data-driven projects forward.”

Hiring talent with the specific technical experience needed to lead and manage data-driven projects “is a challenge in this competitive job market, but it’s key in ensuring you have the right skills in place to successfully implement the projects,” Clifton says. “Without the right skills and expertise up front, companies can start a project and then run into issues where the team is unable to quickly and effectively identify and resolve the problem.”

Data scientists, data stewards, and data forensics experts are becoming mainstay roles, Clifton says, whereas data architects were the higher-end skills most needed in prior years.

“Affordable talent has been my biggest challenge,” Link says. “There is no one right answer. I have brought in fresh talent from recent graduates and invested time, only to have them poached at crazy salaries. In my experience, there is a lot of value in having people co-located for faster learning and collaboration. My latest approach is to work with organizations like Workforce Opportunity Services to build my own team from high-caliber workers. It will take time to get there but we are focused on the long-term results.”


Topping the list of executive priorities for 2023—a year heralded by escalating economic woes and climate risks—is the need for data-driven insights to propel efficiency, resiliency, and other key initiatives. Many companies have been experimenting with advanced analytics and artificial intelligence (AI) to fill this need. Now, they must turn their proofs of concept into a return on investment. But how?

Organizations are making great strides, putting into place the right talent and software. Yet many are struggling to move into production because they don’t have the right foundational technologies to support AI and advanced analytics workloads. Some are relying on outmoded legacy hardware systems. Others are stymied by the cost and control issues that come with leveraging a public cloud. Most have been so drawn to the excitement of AI software tools that they have overlooked selecting the right hardware.

As the pace of innovation in these areas accelerates, now is the time for technology leaders to take stock of everything they need to successfully leverage AI and analytics.

Look at Enterprise Infrastructure

An IDC survey[1] of more than 2,000 business leaders found a growing realization that AI needs to reside on purpose-built infrastructure to be able to deliver real value. In fact, respondents cited the lack of proper infrastructure as a primary culprit for failed AI projects. Blocking the move to a more AI-centric infrastructure, the survey noted, are concerns about cost and strategy plus overly complex existing data environments and infrastructure.

Though experts agree on the difficulty of deploying new platforms across an enterprise, there are options for optimizing the value of AI and analytics projects.[2] Foundational considerations include compute power and memory architecture, as well as data processing, storage, and security.

It’s About the Data

For companies that have succeeded in an AI and analytics deployment, data availability is a key performance indicator, according to a Harvard Business Review report.[3] In short, the report’s successful leaders have democratized their company’s data—making it accessible to staff, acquiring it from customers and suppliers, and sharing it back. Dealing with data is where core technologies and hardware prove essential. Here’s what to consider:

Ingesting the data: To be able to analyze more data at greater speeds, organizations need faster processing via high-powered servers and the right chips for AI—whether CPUs or GPUs. Modern compute infrastructures are designed to enhance business agility and time to market by supporting workloads for databases and analytics, AI and machine learning (ML), high performance computing (HPC), and more.

Storing the data: Many organizations have plenty of data to glean actionable insights from, but they need a secure and flexible place to store it. The most innovative unstructured data storage solutions are flexible and designed to be reliable at any scale without sacrificing performance. And modern object storage solutions offer performance, scalability, resilience, and compatibility on a globally distributed architecture to support enterprise workloads such as cloud-native, archive, IoT, AI, and big data analytics.

Protecting the data: Cyber threats are everywhere—at the edge, on-premises, and across cloud providers. An organization’s data, applications, and critical systems must be protected. Many leaders are seeking a trusted infrastructure that can operate with maximum flexibility and business agility without compromising security. They are looking to adopt a zero-trust architecture, embedding security capabilities across an enterprise-wide line of storage, servers, hyperconverged, networking, and data protection solutions.

Moving the data: As the landscape of data generation shifts and data traffic patterns grow more complex, surging demands require a network reevaluation in most organizations. For data to travel seamlessly, they must have the right networking system. However, traditional proprietary networks often lack scalability, proven cloud-based solutions, and automation, while open-source solutions can be expensive and inflexible. Open networking answers the challenge by accommodating software choice, ecosystem integration, and automation for the modern enterprise from edge to core to cloud.

Accessing the data: Increasingly, AI development and deployment is taking place on powerful yet efficient workstations. These purpose-built systems enable teams to do AI and analytics work smarter and faster during all stages of AI development, and increasingly during deployment as they support inferencing at the edge. And to give employees access to the data they need, organizations will need to move away from legacy systems that are siloed, rigid, and costly to new solutions that enable analytics and AI with speed, scalability, and confidence. A data lakehouse supports business intelligence (BI), analytics, real-time data applications, data science, and ML in one place. It provides rapid, direct access to trusted data for data scientists, business analysts, and others who need data to drive business value.

Focus on Outcomes

Analytics and AI hold the promise of driving better business insights from data warehouses, streams, and lakes. But first, enterprises will need to honestly assess their ability to not just develop but successfully deploy an AI or analytics project. Most will need to modernize critical infrastructure and hardware to be able to support AI development and deployment from edge to data center to cloud. Those that do so will find their data and applications to be force multipliers. Along the way, they will have implemented upgrades that keep data secure and accessible—imperatives for meeting IT and business objectives in the months and years to come. 

To learn more, read the IDC white paper, Creating an End-to-End Infrastructure for AI Success.


Intel® Technologies Move Analytics Forward

Data analytics is the key to unlocking the most value you can extract from data across your organization. To create a productive, cost-effective analytics strategy that gets results, you need high performance hardware that’s optimized to work with the software you use.

Modern data analytics spans a range of technologies, from dedicated analytics platforms and databases to deep learning and artificial intelligence (AI). Just starting out with analytics? Ready to evolve your analytics strategy or improve your data quality? There’s always room to grow, and Intel is ready to help. With a deep ecosystem of analytics technologies and partners, Intel accelerates the efforts of data scientists, analysts, and developers in every industry. Find out more about Intel advanced analytics.





This post is brought to you by NVIDIA and CIO. The views and opinions expressed herein are those of the author and do not necessarily represent the views and opinions of NVIDIA.

CIOs seeking big wins in high business-impacting areas where there’s significant room to improve performance should review their data science, machine learning (ML), and AI projects.

A recent IDC report on AI projects in India[1] found that 30-49% of AI projects failed at about one-third of organizations, and another study, from Deloitte, classifies 50% of respondents’ organizations as AI starters or underachievers.

That same study found 94% of respondents say AI is critical to success over the next five years. Executives see the AI opportunity for competitive differentiation and are looking for leaders to deliver successful outcomes.

ML and AI are still relatively new practice areas, and leaders should expect ongoing learning and an improving maturity curve. But CIOs, CDOs, and chief scientists can take an active role in improving how many AI projects go from pilot to production.

Are data science teams set up for success?

A developing playbook of best practices for data science teams covers the development process and technologies for building and testing machine learning models. Developing models isn’t trivial, and data scientists certainly have challenges cleansing and tagging data, selecting algorithms, configuring models, setting up infrastructure, and validating results.

Leaders who want to improve AI delivery performance should address this first question: are data scientists set up for success? Are they working on problems that can yield meaningful business outcomes? Do they have the machine learning platforms (such as NVIDIA AI Enterprise), infrastructure access, and ongoing training time to improve their data science practices?

CIOs and CDOs should lead ModelOps and oversee the lifecycle

Leaders can review and address issues if the data science teams struggle to develop models. But to launch models and ensure success, CIOs and CDOs must establish a model lifecycle or ModelOps.

The lifecycle starts before model development and requires educating business leaders on their roles in contributing to AI projects. It also requires steps for planning the infrastructure at scale, instituting compliance and governance, creating an edge security strategy, and partnering with impacted teams to ensure a successful transformation.

Here are several factors to consider:  

Educate business leaders about their roles in ML projects. Have business leaders defined realistic success criteria and areas of low-risk experimentation? Are they involved in pilots and providing feedback? Are they ready to transform business processes with machine learning capabilities, or will they slow down investments at the first speed bump?

Adopt a build, buy, or partner approach when developing models. Sometimes, developing proprietary models makes sense, but also evaluate frameworks such as recommendation engines or speech AI SDKs.

Think a step ahead regarding production infrastructure requirements. The lab infrastructure used to develop models, and the lower scale required to pilot an AI capability, may not be the optimal production infrastructure. For example, AI in healthcare, smart buildings, and industrial applications that impact human safety may require edge or embedded computing options to ensure reliability and performance.

Plan for large-scale AI applications on the edge. Where there are thousands of IoT devices, there are opportunities to deploy AI applications to run on the devices. For example, fleets of vehicles, including delivery trucks, construction tools, and farming equipment, can use device-deployed AI apps to provide real-time feedback to their operators that improves productivity and safety. An edge management solution that deploys the apps to the devices, supports communications, and provides monitoring capabilities is critical.

Establish MLOps, ModelOps, and infrastructure-monitoring capabilities. The data science teams will need MLOps to automate paths to production, while compliance teams should require ModelOps and want model updates to address model drift. Infrastructure and operations teams will want monitoring to help them review cloud infrastructure costs, performance, and reliability.
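To make the model-drift point above concrete, here is a minimal, hypothetical sketch of a drift check a ModelOps team might schedule against production feature logs. It is not any vendor's tooling; the Population Stability Index (PSI) statistic and the rough 0.2 threshold are common illustrative conventions, not anything prescribed in this article.

```python
import numpy as np

def psi(expected, actual, bins=10):
    """Population Stability Index between a baseline (training) feature
    distribution and a live one; larger values indicate more drift."""
    edges = np.histogram_bin_edges(expected, bins=bins)
    e_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    a_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    # Clip empty bins so the log term stays finite
    e_pct = np.clip(e_pct, 1e-6, None)
    a_pct = np.clip(a_pct, 1e-6, None)
    return float(np.sum((a_pct - e_pct) * np.log(a_pct / e_pct)))

rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, 10_000)  # distribution the model was trained on
live = rng.normal(0.5, 1.0, 10_000)   # shifted live traffic
score = psi(train, live)              # ~0.2 is a common "investigate" threshold
```

In a real pipeline, a score above the team's chosen threshold would open a retraining ticket or trigger a model update rather than merely being logged.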

IT teams don’t just deploy apps. They participate in planning to deliver business outcomes and then institute DevOps to ensure delivery and ongoing enhancements. Applying similar practices to data science, machine learning, and AI will improve successful pilot and production deliveries.

[1] IDC FutureScape: Worldwide Artificial Intelligence 2021 Predictions — India Implications


For many of today’s IT teams, there’s a common, recurring question that keeps being posed: Why are we doing this?

This question is fundamental, some may say basic, but it is often one that teams don’t get good, solid answers to. Further, this speaks to a broader lack of visibility and insight. Among the many potential initiatives considered, why was the current one selected? At a higher level, what’s the reasoning behind the relative staffing and budgeting priorities across products, support, and operations? Why do some teams have ample resources, while others are running lean? Even worse, what if teams find out that they’ve been wasting significant time on a “zombie” project, that is, one that’s lost executive sponsorship? For example, weeks after an executive departed, teams may find out there’s no longer support for an initiative they’ve dedicated significant effort to, and that it is going to be shelved before it ever sees the light of day.

When teams are wasting time focusing on zombie projects or they’re operating in the dark in terms of how their efforts map to business priorities, their productivity, morale, and results can all suffer.

This underscores why instituting Value Stream Management (VSM) is such a vital endeavor.

Simply put, VSM is about maximizing the delivery of value to customers. Through VSM, teams seek to ensure they’re properly funding, defining, aligning, measuring, and optimizing value streams.

VSM can provide leaders with transparency into the work that’s being done and how that aligns with their investments. Further, VSM is about bridging the gap between business and IT.

Through VSM, development teams and their management can gain better insight into the purpose of their work, and how it translates into value for customers and the organization.

VSM platforms can be integral in realizing this potential. VSM platforms can help teams manage work and investments across the enterprise. Not only can work be seen from a top-down view but also from a bottom-up perspective. This is beneficial for many different reasons. At the most basic level, teams that understand why they are being asked to work on a particular initiative will have a higher level of engagement and commitment to making that initiative a success.

More practically, the understanding of how work aligns with the organization’s strategic goals allows project teams to make more informed decisions on the approach to use, the way to overcome problems, and so on. In an agile environment, where success is dependent on empowered teams, that level of transparency is critical.

Transparency isn’t just important to the teams doing the work either. PMOs and related support functions have to ensure the effective use of resources, and that means delivering the best possible combination of business outcomes. Only when they can view the end-to-end integration of work — from goal, to investment decision, to work plan — can they intelligently make those prioritization and scheduling choices. The same holds true for product owners. Their focus is different than the PMO, but they must translate business vision into technical priorities, and that can only happen with end-to-end transparency.

Advanced VSM platforms can represent a solution for teams across the enterprise. From IT to business executives, these platforms can support planning, decision making, investment, and resources management. Through VSM platforms, teams can stop operating in the dark, and they can avoid having to waste time on zombie projects.

To learn more about ValueOps, the VSM platform from Broadcom, be sure to visit our ValueOps VSM page. Find out how you can deliver more value to your customers and gain increased visibility and alignment in your organization.

Explore ValueOps Value Stream Management, built to manage what you value most.


One of the most important parameters for measuring the success of any technology implementation is the return on investment (ROI). Providing a compelling ROI on technology initiatives also puts CIOs in a stronger position for securing support and funds from the business for future projects. This compounding effect shows just how imperative it is for enterprise technology leaders to ramp up the ROI from their deployments.

Here are a few strategies that CIOs have employed to extract maximum returns from their technology endeavours.

Align projects with business goals

Too often IT initiatives are undertaken solely as technical projects, with only loose affiliation with line-of-business stakeholders, ushering in the risk of drifting too far from the overall goals and business objectives of the organization. For organizations to work optimally, “information technology must be aligned with business vision and mission,” says Shuvankar Pramanick, deputy CIO at Manipal Health Enterprises. If well aligned, such IT projects can even help generate new business opportunities, he says.

Citing an example, Pramanik says that if the discharge process for hospital patients holding third-party health insurance, which typically takes five to eight hours, can be brought down to one hour with the help of technology intervention, a new patient can be admitted and given that bed faster, leading to substantial business gain.

Moreover, “by aligning project goals with broader organizational goals, you can create several more opportunities for the organization,” Pramanik says.

“Suppose a hospital develops an app for patients’ appointment and consultation. In due course of time, this app will gather a lot of patient (demographic) data that can be leveraged to offer new promotional features (discounts, for instance) or enhanced services,” he says. “If the data shows there are more pregnant patients at the gynecology department, they can be offered certain attractive packages so that they are engaged with the hospital till their delivery. Later, the hospital can send suggestions to them for continuing consultation in the pediatrics department in the same hospital after their delivery.”

In this way, a simple app initially targeted at booking appointments can not only help retain patients through their pregnancy but also convert them into pediatric patients, thereby opening a new line of business, Pramanik says.

Embrace diversity of thought

Jaspreet Singh, partner at advisory and consulting firm Grant Thornton, says that diverse teams have repeatedly proven to be smarter and more efficient if core business goals are aligned with technology implementations.

Depending on the objective of the project, diverse viewpoints from stakeholders across roles (users, business leaders, developers, etc.), across functions (HR, Finance, IT, etc.), and across backgrounds, identities, and abilities can play a key role in maximizing the return on an initiative — especially an organization-wide project, such as an ERP implementation.

Pramanik says diverse teams have different life experiences, which enable individuals to approach problems differently, in turn helping them to empathize with end users in new ways. “It is important to value diversity to learn new things, and to exchange ideas and experiences to achieve the best outcome. To achieve a particular objective there can be multiple ways. Different opinions will help zero in on the best possible option and help lead the project in the best possible manner,” he says.

Diversity of thought helps negate biased decision making. It also draws a more holistic vision for the project by compounding varied viewpoints, Singh says. “Diversity of thought processes pushes the team to challenge themselves continuously and strive to think outside the box. Diverse teams constantly re-examine facts and remain objective,” he says.

Savvy CIOs believe in bringing all parties to a common table from the beginning. But Pramanik emphasizes that IT leaders must formulate the best way to exchange ideas while giving voice to all stakeholders. This requires breaking down silos and understanding each party’s thinking style so that the final implementation meets with smooth acceptance.

“In case of building an application for HR department, the chief human resource manager needs to take the opinion of and get consent from all the downline HR heads, so that all the aspects of the applications are covered and there are no acceptability hiccups later on,” Pramanik says. “If the perfection level of the app is 90% to 95%, your customization will be minimal in the future and the application will bear a high ROI. But this perfection can be achieved only if (in this case) all the HR heads put in their thoughts and suggestions during the discussion phase itself.”

Deploy scalable technology

From ERP implementation to application building, CIOs swear by scalable technology. “And why not, tech scalability can make or break ROI,” says Pramanik.

“When building a mobile app for offering a particular service to the customers, say e-pharmacy, you should develop it in such a way that if the business wants to offer more diverse services (x-ray or blood sample collection) through the app in the next two to three years, the requirements should fit into the application without changing its core architecture or else there will be a big dent on the ROI. Initially building a scalable app may look expensive but changing the complete architecture later will incur more cost,” he says.

Similar considerations of follow-on returns should be kept in mind when considering legacy modernization initiatives. For example, even if an existing ERP is working efficiently, and there is no tangible ROI in migrating to the newer version, an upgrade can still be advantageous. As V Ranganathan Iyer, Group CIO of auto component manufacturer JBM Group, says, “the new version of the ERP will accommodate newer technology enhancements, which can be leveraged to derive unforeseen returns out of the project in the long run.”

Here, a cloud-native approach can be beneficial, Singh says.

“A business may have introduced a new product or solution that requires less infrastructure because it has less capacity needs in the beginning. Eventually, however, it will be necessary to scale up the infrastructure as the customer base grows. This is simple to accomplish with the aid of a cloud service provider since there are options for real-time scalability in the infrastructure. Hosting the entire infrastructure on-premise will turn out to be exorbitant,” he says.

Adopt the agile methodology

While many CIOs have implemented agile methodologies for project deployment, those who haven’t are missing opportunities to streamline their project investments.

Anjani Kumar, CIO at pharmaceutical company Strides, says agile offers the flexibility to break down a project into various small parts, which are delivered through cycles or iterations. This iteration-by-iteration visibility during the project is extremely effective.

“At the end of each sprint or iteration, a minimum viable product is released, which can be used by end users. Any changes in market demands and user requirements can be considered in subsequent sprints. This approach ensures that the ROI is boosted by considering the changing demands on the IT initiative and making course corrections as per market demands,” he says.

“For instance, in the case of a mobile app built for a company’s sales representatives, the process can be split into three components — the UI/UX component, data integration, and integration with other third-party apps. As the final step for ensuring payment, compliance for payment integration must be introduced through PCI-compliant coding. This is followed by end-to-end testing, built up from component-level testing,” Kumar says. “If we have 30 screens for UI/UX, then in the typical waterfall delivery, the 30 UI/UX screens would be available only at the end of the project, after which the users might give feedback, which in turn might require changes, kicking off further refinement of the UI/UX. As opposed to this, in agile delivery, 7 out of the 30 may be considered in one sprint delivery. At the end of that sprint, users would have a UI/UX for testing and it would be open to any changes if required instead of being there till the end of the project.”

Projects are executed — and adopted — much more effectively when users can provide feedback in iterations, Kumar says. “Consequently, the adoption of such an initiative would also be much higher and that would lead to boosting of ROI.”

Leverage a platform-based approach

Gopinath Jayaraj, CIO at automotive manufacturing company Tata Motors, says, “ROI can be multiplied several-fold by leveraging a platform-based approach as opposed to deploying point solutions.”

Investments made in digital initiatives that are standalone in nature may not necessarily integrate with the rest of the business processes or IT landscape of an organization, thereby yielding limited returns, he says.

“However, if one takes a platform approach where requirements from multiple business streams are considered and then a solution is designed leveraging components of existing landscape, the investment requirement is reduced while the return increases as we are now fulfilling beyond point requirements,” Jayaraj says. “As more requirements are added, the need to invest in them individually is eliminated and the overall ROI is boosted because of reusing the already established component. Platforms establish flexible, loosely coupled integrations among key enterprise system components and can be leveraged for multiple use cases in the future.”

Giving an example, Jayaraj says an appropriately designed enterprise digital platform with a harmonious data model can be reused for multiple applications on the customer side. For example, a well-designed auto spares digital service can be used effectively for supporting a customer-browsable spares catalogue, effective real-time assistance for auto mechanics during service, and an e-commerce platform for selling spares and accessories.

The DigiVOR project at Tata Motors is an example of a platform-based approach. It not only addresses customer service and inventory management for Tata Motors but also helps increase revenue through spare part sales for those customers who might have taken up the grey market route for fulfilling spare parts orders.

“The moment Tata Motors receives a VOR (vehicle-off-road) call, registered either through the digital portal or by phone, the support team starts figuring out the critical spare that’s required and putting a DigiVOR order into the system. DigiVOR initiates an automatic search across all spare inventories that Tata Motors has with dealer distributors and authorised service centres in the vicinity of the vehicle,” says Jayaraj. Consequently, this platform-based approach has helped Tata Motors in two business areas: customer service and sales. Not only does DigiVOR reduce turnaround time for procuring inventory from a dealership near where an incident has occurred; it also ensures increased revenue for Tata Motors through spare part sales, as customers can now procure original spare parts through this initiative.


The manufacturing industry has fully entered the digital era, with the most digitally advanced manufacturers integrating their information technology and operational technology environments to give themselves an edge in the marketplace.

They’re also advancing their use of automation and analytics to streamline back-office functions, logistics, and production, and to optimize resources, reduce costs, and identify new opportunities, industry analysts say.

In its 2022 Digital Factory Update, ABI Research predicted that spending on smart manufacturing would grow at a 12% compound annual growth rate, from $345 billion in 2021 to more than $950 billion in 2030 — all to support digital transformation initiatives.
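Those figures are internally consistent; a quick back-of-the-envelope check (using Python purely for illustration) compounds the 2021 baseline forward at 12% per year:

```python
# Sanity-check ABI Research's projection: $345B of smart manufacturing
# spending in 2021, growing at a 12% compound annual growth rate.
base_2021 = 345.0        # spending in billions of dollars
cagr = 0.12              # 12% compound annual growth rate
years = 2030 - 2021      # nine compounding periods

projected_2030 = base_2021 * (1 + cagr) ** years
print(f"Projected 2030 spending: ${projected_2030:.0f}B")
```

The result lands just under $960 billion, which squares with the firm's "more than $950 billion" figure.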

The research firm further predicted that manufacturers would continue to increase their spending on analytics, collaborative industrial software, and wireless connectivity in upcoming years, with factories increasingly adopting Industry 4.0 solutions, including autonomous mobile robots, asset tracking, simulation, and digital twins.

Michael Larner, a research director at ABI Research, says technology innovation will remain instrumental for manufacturers’ success as they confront myriad challenges, such as escalating energy costs, supply chain disruptions, staffing shortages, and the need to optimize resources.

“We’re seeing a move away from basic automation to integrating IT and OT teams, optimizing using analytics to make sure things are as efficient as possible, and to do things like predictive maintenance and proactive monitoring,” Larner says.

Consequently, manufacturers now prioritize IT investments that create and extend their “digital thread,” in which applications within their IT architecture talk to one another, with data from one system informing and directing action in others, Larner says.

“Manufacturers appreciate they cannot just react to events but need to be proactive,” he adds. “A successful CIO needs to be equally at home on the factory floor understanding issues and influencing change as in the boardroom talking strategically and obtaining project funds.”

CIO recently recognized the following 10 manufacturers as part of the CIO 100 Awards for IT Innovation and Leadership. Here is a look at how they are capitalizing on the value of IT.

Avery Dennison brings intelligence to its supply chain

Nicholas Colisto, VP and CIO, Avery Dennison


Organization: Avery Dennison

Project: Advanced Planning System (APS) for Operational Production Planning and Detailed Scheduling

IT Leader: Nicholas Colisto, VP and CIO

Like many manufacturers, Avery Dennison saw the need to strategically use technology to add more agility, efficiency, and speed to its increasingly large and complex supply chain.

In response, Avery Dennison IT partnered with the company’s global supply chain operations to create the Advanced Planning System (APS) for Operational Production Planning and Detailed Scheduling, a system that provides granular insights into supply issues and constraints, giving the supply chain team the tools needed to make effective and timely decisions.

APS gathers, captures, and combines all relevant inputs and outputs of supply chain events in near real-time to form a digital twin of the supply chain. The system’s easy-to-use interface enables the centralized team to monitor and take immediate action on events.

Working with the supply chain function to define and align organizational priorities and needs, Avery Dennison IT set about creating a mathematical model for establishing short-, mid-, and long-term operational plans. IT also automated a range of tasks, using intelligent optimization algorithms to identify anomalies, interpret impacts on downstream supply chain actors, and communicate that information to relevant stakeholders — thereby enabling the supply chain team to quickly respond to and remediate issues.

“This system enables our supply chain function to gain insights into the constraints and issues within the different assets of the supply chain, giving us all the tools in a single place to make effective and timely decisions,” says Vice President and CIO Nicholas Colisto. “Together, our new robust data lake, predictive analytics, and digital-twin capabilities are helping Avery Dennison to optimize the utilization of our assets, avoid downtime, and provide better visibility and analysis of the complete supply chain, allowing the organization to respond quickly to disruptions and deviations.”

CoorsTek streamlines production operations

Matt Mehlbrech, VP of IT, CoorsTek


Organization: CoorsTek

Project: Model Plant Implementation

IT Leader: Matt Mehlbrech, VP of IT

Production operators at CoorsTek had an efficiency issue. To report on production, monitor machine performance, record quality readings, and search and retrieve documents, they had to use multiple computer systems and paper-based processes, rendering their work inefficient and prone to errors due to the lack of integration and data validation.

To remedy this, and create a foundation for future transformation, CoorsTek IT developed Model Plant, an integrated systems strategy focused on increasing production operator efficiency, providing key metrics to manufacturing management, and preparing CoorsTek for a future ERP implementation.

At its core is a manufacturing operations management (MOM) system, which combines manufacturing execution and maintenance management in a single integrated platform. As envisioned, Model Plant will integrate this MOM system with the company’s ERP, quality management, and machine connectivity systems into a single console for CoorsTek production operators, helping them stay focused on producing high-quality ceramic parts.

IT designed, developed, and implemented Model Plant at six CoorsTek plants in under 18 months, with the environment already improving production operators’ efficiency. Implementation is ongoing, with IT seeing it as foundational to a smooth transition to a new ERP.

“Our Model Plant systems enable real-time visibility of our entire shop floor and create a comprehensive data set that we can analyze and learn from to unlock tangible productivity and quality gains,” says Matt Mehlbrech, the company’s vice president of IT.

Dow digitizes its manufacturing facilities

Melanie Kalmar, corporate vice president, CIO, and CDO, Dow


Organization: Dow

Project: Dow Digital Manufacturing Acceleration (DMA) Program

IT Leader: Melanie Kalmar, corporate vice president, CIO, and CDO

Dow in fall 2020 launched its Digital Manufacturing Acceleration (DMA) program with the goal of accelerating the deployment of digital technologies across its global manufacturing and maintenance areas.

One program deliverable to date is a private, secure, high-speed cellular network that is fully within the enterprise network. This network provides real-time access to data and collaboration tools, thereby enabling employees to extend their work beyond the traditional office out into the manufacturing plant environment.

DMA also includes a secure cloud for hosting, integrating, and contextualizing data for employees through purpose-built applications, and a mobile-first platform where employees can access their data and actionable insights.

Through such deliverables, DMA correlates and integrates manufacturing data and analysis that employees can use to make better, faster decisions. Capabilities delivered through the program also help improve asset reliability, safety, quality performance, and operational efficiency while reducing operating costs and unplanned events.

Together, DMA solutions create an improved employee experience, a digitally enabled workforce, and a resilient manufacturing organization by integrating and contextualizing manufacturing data from engineering, operations, maintenance, logistics, ERP and related ecosystem data sources, and by delivering data and insights from multiple systems through a single, user-centric interface.

“Dow’s DMA’s success would not be possible without the outstanding collaboration between our information systems and manufacturing teams. By deepening our understanding of what our manufacturing teams need to safely and reliably meet our customers’ needs and what IS can deliver for manufacturing, we have been able to arrive at these high-performing DMA solutions,” says Corporate Vice President, CIO, and CDO Melanie Kalmar.

Dow’s DMA solutions are in use at the company’s largest facility, where they’re already generating value through increases in manufacturing asset utilization, reductions in unplanned events and production outages, production volume increases, and other quantifiable key performance indicators. Implementation of DMA solutions at other Dow facilities is ongoing.

Analytics proves key to Eastman materials innovation

Aldo Noseda, vice president and CIO, Eastman


Organization: Eastman Chemicals

Project: Fluid Genius Digital Product

IT Leader: Aldo Noseda, vice president and CIO

Eastman Chemicals in 2021 launched Fluid Genius, a patent-pending digital platform that enables manufacturing plant workers to monitor issues that could impact operations, safety, yield, and maintenance budgets.

Fluid Genius can monitor, analyze, and extend the life of Eastman’s customers’ heat transfer fluid. The technology can predict fluid life expectancy and advise how best to extend it while also avoiding unplanned manufacturing shutdowns.

Fluid Genius also provides forward-looking insights, which allows plant maintenance engineers and operations managers to plan the optimal time for maintenance, thereby minimizing risk and costs. Fluid Genius’s recommendation engine also helps plant engineers better understand the factors impacting the quality of their heat transfer fluid, which in turn helps them operate their plants safely and inform budgeting for future maintenance needs.

To create this application, Eastman drew on its nearly 50 years of operating system sample analysis data to draw in-depth insights on fluid chemistry behaviors over time. The company combined that data with end-user input into its maintenance and incident logs and advanced artificial intelligence/machine learning techniques to create the platform’s proprietary fluid analytics.

“Digital products and services like Fluid Genius are critical to Eastman becoming a leading material innovation company because of the value they provide to our customers and how they differentiate us from competitors,” says Eastman VP and CIO Aldo Noseda.

Eastman has already seen strong uptake in use of this platform, and company officials note that the platform benefits its customer acquisition and retention efforts.

General Motors turns to IT to navigate semiconductor shortage

Fred Killeen, CIO and VP of Global IT, General Motors


Organization: General Motors

Project: Semiconductor Shortage: Protecting GM Revenue

IT Leader: Fred Killeen, CIO and VP of Global IT

Like all modern vehicle-makers, General Motors relies on semiconductors to make its products; in fact, a typical vehicle relies on more than 3,000 chips within its electronic control units (ECU) to run essential vehicle functions.

As a result, General Motors faced the possibility of shutting down production when a global shortage of semiconductors arose in 2021. But thanks to an innovative technology that enabled GM to build vehicles without key electronic components and park them in lots for completion later when semiconductors became available, factories in North America could keep running.

The IT-developed solution — an industry first — essentially extended the secure plant network into vehicle storage locations, some of which were hundreds of miles away from manufacturing sites in the United States and Mexico.

This allowed repair teams, who were equipped with GM standard test tools configured with mobile printers and portable hot spots, to connect to the GM network and company systems to validate parts, complete repairs, download engine control software to ECUs, and test the vehicles. Completed vehicles were driven or returned to plants on carriers for the final stages of quality testing before being shipped to dealerships and customers.

The IT team also built tools to collect data about chips and suppliers and to provide that data on demand, enabling GM to route chips to suppliers who produce ECUs for its most popular vehicles.

Additionally, business intelligence and analytics teams developed tools and reporting features to manage the ECU repair process and prioritize fixes based on how long a vehicle had been in a lot and its impact on GM revenue. These teams also developed a unified view, displaying all data in one place.
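Prioritization of this kind can be pictured as sorting by a composite score over lot time and revenue impact. The sketch below is a hypothetical illustration only: the VINs, dates, dollar values, and the $500-per-day urgency weight are invented, not GM's actual model.

```python
from datetime import date

# Hypothetical repair-prioritization sketch: rank unfinished vehicles by how
# long they have sat in a lot and by estimated revenue impact. All figures
# and the scoring weight below are invented for illustration.
vehicles = [
    {"vin": "VIN001", "lot_entry": date(2021, 6, 1), "revenue_usd": 48_000},
    {"vin": "VIN002", "lot_entry": date(2021, 8, 15), "revenue_usd": 72_000},
    {"vin": "VIN003", "lot_entry": date(2021, 5, 10), "revenue_usd": 35_000},
]

def priority(vehicle, today=date(2021, 9, 1)):
    """Composite urgency score: lot time plus revenue at stake."""
    days_in_lot = (today - vehicle["lot_entry"]).days
    # Treat each day in the lot as roughly $500 of urgency (illustrative).
    return days_in_lot * 500 + vehicle["revenue_usd"]

repair_queue = sorted(vehicles, key=priority, reverse=True)
print([v["vin"] for v in repair_queue])
```

Note how the oldest vehicle can outrank a higher-revenue one once its time in the lot dominates the score, which matches the article's description of weighing both factors.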

“Working with teams across the organization, they delivered an industry-first technology solution that allowed us to build vehicles without some electronic components and complete them remotely as semiconductors became available — it’s a process that continues to prove valuable today,” says Fred Killeen, CIO and global VP of IT.

Oshkosh goes digital to optimize its supply chain

Anupam Khare, senior vice president and CIO, Oshkosh


Organization: Oshkosh

Project: Enhanced Supply Chain Efficiency

IT Leader: Anupam Khare, senior vice president and CIO

Oshkosh had identified three issues within its supply chain function, which is tasked with acquiring the right parts at the right time to support the manufacturing of customized, premium products requiring sophisticated components.

First, there was a lack of parts visibility across Oshkosh’s supply network, which led to unexpected parts shortages, putting the company at risk for production delays and shutdowns. That visibility issue in turn had created inefficient logistics and inventory leveling processes, driving up costs. And then there was the company’s supply chain technology itself, which comprised disparate, unconnected systems.

To address those issues, Oshkosh’s digital technology team partnered with its supply chain function to develop an advanced digital platform complete with system-to-system integration, cohesive data environments, data enhancements, and machine learning. Together, the teams also created a series of advanced analytics solutions to establish a more connected and optimized supply chain.

The platform pulls together critical data sources across several systems and functional areas and serves as the foundation for agile delivery of additional digital products. For example, the digital technology team quickly deployed a parts shortage prediction model soon after the platform was created. The team also deployed multiple advanced analytics solutions to optimize inventory levels, logistics, and shipping costs.

Initiated in January 2021 and now fully deployed, the collection of technologies has already increased supply chain visibility, part shortage prediction accuracy, and operational efficiency, and has helped Oshkosh avoid stockouts and subsequent costly production delays and shutdowns.

“This project showed the importance of IT staying well-connected with business partners to anticipate opportunities/challenges early and to co-create value through digital solutions,” says Anupam Khare, senior vice president and CIO.

Otis takes smart elevator to new heights

Organization: Otis Elevator

Project: Otis ONE

IT Leader: Renee Zaugg, vice president and CIO (Zaugg has since left Otis.)

To bring more transparent, proactive, and predictive services to customers, Otis in mid-2020 started work on a global digital transformation program called Otis ONE.

Leveraging technologies such as IoT, big data, AI, mobile, and cloud, Otis ONE provides real-time visibility of elevator health, insights for predictive maintenance, and remote assistance and troubleshooting.

Otis ONE is tailored to deliver real-time data insights to a range of personas — from campus owners to field engineers to maintenance officers — in their day-to-day operations, thereby empowering various stakeholders to make more informed decisions and deliver more predictive maintenance, more effectively.

The Otis ONE architecture consists of three tiers — edge, platform, and enterprise. Edge relies on various gateway/sensor packages to collect and send a range of elevator data to the cloud via a cellular network, which is then integrated with the platform tier via its IoT hub or event hub. The platform tier features real-time business processing fueled by a rules engine that analyzes data and immediately determines various conditions on the state of the elevator, notifying Otis workers and external customers proactively in the event of anomalies.
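A rules engine of the kind the platform tier describes can be pictured as a set of condition checks applied to each incoming telemetry reading. The sketch below is purely illustrative: the field names, thresholds, and messages are invented for the example, not Otis ONE's actual schema or rules.

```python
# Hypothetical rules-engine sketch: evaluate elevator telemetry against
# simple threshold rules and flag anomalies for proactive notification.
# Field names and thresholds are illustrative, not Otis ONE's actual rules.
RULES = [
    ("door_cycle_ms", lambda v: v > 4000, "door closing slower than expected"),
    ("motor_temp_c", lambda v: v > 85, "motor temperature above safe range"),
    ("vibration_g", lambda v: v > 0.5, "abnormal car vibration detected"),
]

def evaluate(reading: dict) -> list[str]:
    """Return an alert message for every rule the reading violates."""
    alerts = []
    for field, condition, message in RULES:
        value = reading.get(field)
        if value is not None and condition(value):
            alerts.append(f"{reading['elevator_id']}: {message} ({field}={value})")
    return alerts

reading = {"elevator_id": "EL-1042", "door_cycle_ms": 5200, "motor_temp_c": 71}
for alert in evaluate(reading):
    print(alert)  # only the door-cycle rule fires for this reading
```

In a production system each fired rule would feed the notification path the article describes, alerting Otis workers and external customers rather than printing to a console.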

The Otis ONE enterprise tier brings together multiple applications that leverage the information coming from the edge and platform tiers, integrating it with elevator master and service data to give a 360-degree, real-time view of elevators around the globe.

“The connected elevator creates value for our customers with greater equipment uptime. Otis ONE enhances transparency to our customers, and productivity of our own teams in prediction and proactiveness,” says Ezhil Nanjappan, executive director and CTO, adding that the Otis ONE ecosystem serves as a “foundation for any smart cities and smart buildings.”

Owens Corning leverages low-code to improve plant safety

Steve Zerby, senior vice president and CIO, Owens Corning


Organization: Owens Corning

Project: Low-Code Digital Platform for High Impact to Plant Safety and Compliance

IT Leader: Steve Zerby, senior vice president and CIO

Owens Corning must monitor and manage the vapors from asphalt tanks for both safety and regulatory compliance reasons. But it found its process for doing so was ineffective and slow. Data was collected in offline databases, and was plant-specific and difficult to share across the company. In addition, the analysis required to uncover potential hazards took days.

So, in August 2021, Owens Corning’s asphalt business function and IT teamed up to co-create a digital platform that could transform and improve the effectiveness of that process through improved data collection and actionable analytics.

Low-code application development technology enabled the two teams to quickly create a proof-of-concept, gather feedback, and fine-tune the platform. Within three weeks, a pilot was delivered to a plant, and the platform itself was implemented across 20-plus plants in only three months.

The platform provides a single source of truth for loss prevention data, with proactive monitoring and analytics that alert plant leaders with real-time insights to ensure hazard prevention.

The company has seen marked improvements in its loss prevention program. Hazards are now identified and addressed in real-time, thereby ensuring more effective personnel safety and regulatory compliance reporting. That visibility has also enabled more proactive preventive maintenance of equipment, thereby minimizing unplanned production outages. And, thanks to the platform’s analytics capabilities, insights are generated in minutes instead of days, enabling the company to make informed decisions quickly.

“Digitizing sensor measure data taken from tanks, integrating other data points, and providing easy-to-use visuals and analytics empowered plant operators. They can quickly assess potential hazards and risks in real-time and proactively take preventive actions,” says Steve Zerby, senior vice president and CIO, noting that a key element of success was having “a dedicated business product owner who is passionate about bringing others along in using the tools.”

Rockwell Automation launches customer-centric transformation

Chris Nardecchia, SVP and chief information and digital officer, Rockwell


Organization: Rockwell Automation

Project: Rockwell Automation Drives Innovation, Customer Centricity and Moves Towards a Data-Driven Operating Model Through its Enterprise Transformation Office

IT Leader: Chris Nardecchia, SVP and chief information and digital officer

As Rockwell Automation works to help customers accelerate their digital transformations, the company has likewise embarked on its own DX journey. In doing so, it is creating new customer experiences, transforming business models, and empowering workforce innovation.

Launched in August 2020, the initiative is helping to build a data-driven operating model that is agile and centered around the customer so it can both respond to changing needs and create lasting customer connections instead of one-time transactional product sales.

To do that, the Enterprise Transformation Office (ExO) mapped out end-to-end business processes and customer experience needs for the target state and prioritized the investments required to deliver a holistic customer experience.

The office is moving Rockwell Automation to a data-driven operating model, in which it will execute more than two dozen product development cycles annually and use customer information to shape those products. Additionally, the ExO is working to transform business models, processes, software, and service portfolios to deliver more subscription-based offerings to customers.

Rockwell Automation is using telemetry to gather data and insights as well as creating a comprehensive, unified view of customer data to enhance the company’s customer engagements. It’s updating processes and systems to support its future state. It’s also leveraging data gathered from other initiatives to innovate products, grow the market, and grow the business. And it has implemented a structured approach for governing its transformation roadmap.

“This program is critical to realizing our vision of reinventing our products to outcomes, reimagining our customers’ experience, and redefining our operating model towards subscriptions and annual recurring revenue,” says Senior Vice President and Chief Information and Digital Officer Chris Nardecchia.

Schneider Electric secures against rise in OT cyberattacks

Elizabeth Hackenson, SVP and CIO, Schneider Electric


Organization: Schneider Electric

Project: Cybersecurity Connected Service Hub

IT Leader: Elizabeth Hackenson, SVP and CIO

In response to the growing number and sophistication of cyberattacks directed at operational technology (OT), Schneider Electric in 2020 created a cybersecurity OT operating model aimed at optimizing the cybersecurity performance of its 220 manufacturing plants and 35 distribution centers around the globe.

The model establishes one Security Operation Center (SOC) for all Schneider IT and OT. It also establishes a threat detection platform at each plant that raises security alerts to the SOC via an interface with the security information and event management (SIEM) system.

The operating model also features the Cybersecurity Connected Service Hub (CSH), a 24/7 global team to support the company’s industrial sites in every type of cybersecurity event. Its primary responsibilities are to identify and baseline cybersecurity posture of OT and IT devices; detect and remediate security alerts; and monitor for and remediate vulnerabilities.

CSH is also charged with driving continuous improvement and progressively improving the company’s cybersecurity posture in OT devices and processes.

The CSH team receives alerts from the SOC, analyzes them, and defines how each must be remediated. It works with cybersecurity site leaders (CSLs) to carry out that remediation. CSLs are OT experts trained in cybersecurity; one is assigned to each plant and is accountable for the cybersecurity of that plant's OT, with the ability to remediate a cyber incident and manage business continuity.

The CSH also provides cybersecurity training for all CSLs. It builds and promulgates standard operation procedures (SOP) for each type of security event and vulnerability. It also monitors cybersecurity KPIs in all plants, remediating in conjunction with the CSL any gaps that it identifies.
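The triage flow described above, in which a SOC alert is matched to the plant's CSL and the SOP for its event type, can be sketched in a few lines of Python. This is purely illustrative: the plant names, event types, SOP identifiers, and severity threshold below are hypothetical, not Schneider Electric's.

```python
# Hedged sketch of the CSH alert-routing workflow: a SOC alert is assigned
# to the plant's cybersecurity site leader (CSL) along with the standard
# operating procedure (SOP) for its event type. All data is illustrative.

from dataclasses import dataclass

@dataclass
class Alert:
    plant: str
    event_type: str
    severity: int  # 1 (low) .. 5 (critical)

# One CSL per plant, one SOP per event type (hypothetical values).
CSL_BY_PLANT = {"Lyon": "A. Dupont", "Monterrey": "L. Ortiz"}
SOP_BY_EVENT = {"malware": "SOP-MAL-01", "unauthorized-access": "SOP-UA-02"}

def route_alert(alert: Alert) -> dict:
    """Assign an alert to the plant's CSL with the matching SOP."""
    return {
        "assignee": CSL_BY_PLANT[alert.plant],
        "sop": SOP_BY_EVENT.get(alert.event_type, "SOP-GEN-00"),
        "escalate": alert.severity >= 4,  # critical events go back to the SOC
    }

ticket = route_alert(Alert(plant="Lyon", event_type="malware", severity=4))
print(ticket)
```

The per-plant lookup mirrors the article's point that each plant has exactly one accountable CSL, while the SOP table reflects the standardized procedures the CSH promulgates per event type.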

“The creation of the Cybersecurity Connected Services Hub is a milestone in our continuous journey towards greater resilience and cybersecurity at Schneider Electric,” says SVP and CIO Elizabeth Hackenson.


The financial services and insurance industries often claim the title of digital leaders, thanks to their use of technology to create new products and services, and to deliver those offerings in frictionless, seamless ways.

This emphasis on digital transformation was underscored in a recent Gartner study, which found that, for the first time ever, technology has surpassed growth as a strategic business priority among the financial services CEOs and senior business executives surveyed.

“This dramatic shift signals the extent to which executives view the role of digital, where for many it is now a critical means to an end,” says Jasleen Sindhu, a senior director of research within Gartner’s Financial Services Practice.

While other industries remain on par with the financial sector in terms of their degree of digitalization, the financial services industry shows notable maturity when it comes to digital customer interactions where it “stands out as being better than most other industries,” Sindhu says.

Organizations leading the charge in the financial services and insurance industries exhibit certain characteristics in their approach to IT, helping them to outperform the competition, she says. “These characteristics — composable thinking, business architecture, and composable technologies — … together allow firms to quickly adapt and respond to opportunities and threats.”

Sindhu adds: “The focus for these firms is no longer about ‘having a good digital strategy’ but it is about ‘having a business strategy that creates value and differentiates, enabled by digital.’”

CIO recently recognized seven companies in the financial services and insurance space whose use of technology stands out as part of the CIO 100 Awards for IT Innovation and Leadership. Here is a look at their award-winning work.

Aflac virtualizes the customer enrollment experience

Rich Gilbert, chief digital information officer, Aflac


Organization: Aflac

Project: Reimagining the Enrollment Experience

IT Leader: Rich Gilbert, chief digital information officer

When the pandemic hit in 2020, Aflac quickly realized its reliance on in-person meetings to sell its products would not work in a world seeking to limit contact. So, it reimagined its business model, tasking teams to develop new digital experiences.

Within months, Aflac IT rolled out new end-to-end digital capabilities that enabled the insurer’s agents to inform, educate, book, and enroll customers on any device from anywhere.

“Without this initiative, the impact of the pandemic could have been much more severe given Aflac’s business model is centered around a face-to-face interaction. Thanks to the hard work of the team, we were able to understand key moments of our agent’s and customer’s journey to reimagine their ‘jobs to be done’ in a new, digitally enabled way. What we learned was that when you involve your end-users in the process, seek to understand rather than respond, and bring everyone to the table without silos, you can deliver great digital products with significant impact,” says Rich Gilbert, chief digital information officer.

Hatch, Aflac’s US innovation lab, took the lead, pulling together leaders and teams to develop and implement a virtual enrollment experience. The teams took a human-centered design approach, engaging in rapid experimentation while validating the proposed solutions. The work resulted in a collection of capabilities that were then distributed across agile teams to advance.

Fully deployed by the end of 2020, those capabilities transformed processes throughout the enrollment lifecycle. For example, agents who once relied on physical materials such as product brochures gained tools to generate on-brand, compliance-approved sites at scale in minutes. Customers gained personalized sites where they can view their specific product offerings, watch informational videos, view dates for enrollment, and book time directly with their agent.

Additionally, Hatch accelerated a new AI-driven decision support tool that recommends products to customers based on past purchases by customers with similar profiles. The team also enabled digital enrollment with features such as e-signature.

To deliver these capabilities, Hatch worked with the other teams to overcome multiple challenges, such as integrating legacy systems, ensuring legal and regulatory compliance, and contending with resource fluctuations as a result of COVID.

Ally optimizes customer savings experience

Sathish Muthukrishnan, chief information, data, and digital officer, Ally Financial


Organization: Ally Financial

Project: Smart Savings: Bucket Goals

IT Leader: Sathish Muthukrishnan, chief information, data, and digital officer

Ally Financial set out to create a suite of financial tools to help its customers overcome their struggles to save money. Ally’s Smart Savings Tools, first launched in February 2020, help customers maximize savings while minimizing the time and effort required to do so.

One tool within the suite, Bucket Goals, enables customers to set funding and timeframe targets for their savings goals, and to monitor their progress, providing features to help them, such as customized prompts and automated savings.
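A bucket goal as described, with a funding target, a timeframe, and progress monitoring, reduces to simple arithmetic. The sketch below is a minimal illustration of that mechanic, not Ally's implementation; the class and field names are invented.

```python
# Illustrative model of a "bucket goal": a savings target with a deadline,
# a progress readout, and the even monthly contribution needed to stay on
# pace. A hedged sketch only; not Ally's production code.

from dataclasses import dataclass

@dataclass
class BucketGoal:
    name: str
    target: float        # funding target in dollars
    saved: float         # amount saved so far
    months_left: int     # time remaining toward the goal

    def progress(self) -> float:
        """Fraction of the target already saved, capped at 100%."""
        return min(self.saved / self.target, 1.0)

    def monthly_needed(self) -> float:
        """Even contribution required each remaining month."""
        remaining = max(self.target - self.saved, 0.0)
        if self.months_left <= 0:
            return remaining
        return remaining / self.months_left

vacation = BucketGoal("Vacation", target=3000.0, saved=1200.0, months_left=6)
print(f"{vacation.progress():.0%} funded, ${vacation.monthly_needed():.2f}/month to finish")
```

The customized prompts the article mentions would presumably be driven by exactly these two numbers: how far along the goal is, and how much must be set aside per month to hit the deadline.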

Ally’s design approach started with TM Studio, its concept testing and innovation team. The team engaged consumers, asking about their savings objectives and challenges, and then used that information to identify savings obstacles as well as ways to overcome them.

Meanwhile, using human-centric design principles, its consumer research and usability teams worked with engineering to understand and implement the right customer experience.

Ally built the front-end UI with React and chose Ember to power a dynamic digital experience. The tool is API-driven and uses real-time transactional data to enable a personalized experience. It also uses native mobile languages Swift (iOS) and Kotlin (Android) to optimize mobile app performance.

Customers who have used Ally’s Smart Savings Tools have saved twice as much as those who haven’t, according to Ally data, which also shows that more than 2 million “bucket goals” have been created.

“In many ways this project was a springboard for Ally,” says Sathish Muthukrishnan, chief information, data, and digital officer. “We created a unique digital experience framework and model — a combo of human-centered design thinking, digital-native experience development, and scalable technologies — that allows us to quickly identify consumer pain points and develop and refine solutions. This listen-first and design-second approach enables us to fully leverage our unique hybrid persona of fintech and established financial institution and create a transformational experience for the consumer.”

City National Bank invests in data-driven customer relationships

Organization: City National Bank of Florida

Project: Innovative Data Grouping Approach to Optimize Customer Relationship Value

IT Leader: Ariel Carrion, CIO

At City National Bank (CNB), customer relationship management needed an overhaul. The bank’s customers were assigned relationship managers based on their accounts’ point of entry or branch origin. As a result, high-net-worth customers did not always get paired with the most suitable relationship manager, and customers with multiple accounts were assigned numerous relationship managers to serve their needs.

To optimize service, balance resource workloads, and maximize revenue, CNB wanted to group customer household and organizational relationships, identify high-value relationships, and more effectively assign the relationship managers best suited to develop and grow those accounts. To do so, the bank set out to implement a data-grouping solution that could handle the complexities of accounts without negatively impacting customers. Partnering with consulting firm Protiviti, CNB implemented a graph data structure and developed algorithms to uncover nested customer relationships, gauge the size of each banking relationship, and optimize resource assignment.
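The grouping idea at the core of this project can be illustrated with a classic graph technique: treat accounts as nodes, shared linkages (such as a common owner or address) as edges, and let connected components define the household or organizational relationships. CNB's production solution ran on a distributed graph platform; the pure-Python union-find below is only a sketch of the underlying data structure, with hypothetical account IDs.

```python
# Hedged sketch: accounts linked by shared attributes collapse into one
# "relationship" via union-find connected components. Illustrative only.

def group_accounts(links: list[tuple[str, str]]) -> list[set[str]]:
    """Union-find over (account, account) linkage pairs."""
    parent: dict[str, str] = {}

    def find(x: str) -> str:
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving for speed
            x = parent[x]
        return x

    for a, b in links:
        parent[find(a)] = find(b)

    groups: dict[str, set[str]] = {}
    for node in parent:
        groups.setdefault(find(node), set()).add(node)
    return list(groups.values())

# Accounts linked by a shared owner or address form one relationship.
links = [("ACCT-1", "ACCT-2"), ("ACCT-2", "ACCT-3"), ("ACCT-7", "ACCT-8")]
relationships = group_accounts(links)
print(sorted(sorted(g) for g in relationships))
```

Note how nesting falls out for free: ACCT-1 and ACCT-3 share no direct link, yet land in the same relationship through ACCT-2, which is the kind of nested connection the bank needed to uncover.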

As part of this project, the technology team established consistent logic rules for account linkages; defined new policies and processes to assign relationship managers; and continuously refined the criteria based on new insights gleaned from the advanced analytics. It also implemented a distributed data platform (Databricks) on Azure to process enormous amounts of customer data on a daily basis.

The technology facilitated data-based decisions for resource assignment, thereby improving and simplifying CNB’s relationship creation and maintenance process. It also enabled CNB to measure the size of each banking relationship, automate nearly all client relationship assignments, and drive other improvements in its customer relationship function.

Global Payments automates code security scanning

Organization: Global Payments

Project: Macroscope

IT Leader: Guido Sacchi, executive vice president and CIO

Global Payments sought to create a more efficient, scalable process for securing the code it developed, while doing more to cut application security risk. So, the application security team developed Macroscope, an automated code scanning tool.

Macroscope scans all code in all repositories, including all branches. It can scan code written in many different programming languages, scan code at check-in, and scan infrastructure as code. It also features centralized vulnerability management, email notifications for new findings, vulnerability aging reports, false positive tracking, and remediation metrics.
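Macroscope's internals are not public, but the centralized vulnerability management it describes, merging findings from many scans while suppressing duplicates and triaged false positives, can be sketched as follows. Every name here is hypothetical.

```python
# Hedged sketch of centralized finding consolidation: results from many
# branch scans are deduplicated by fingerprint, and known false positives
# are dropped, so developers see each real issue exactly once.
# Not Macroscope itself; an illustration of the described behavior.

from dataclasses import dataclass

@dataclass(frozen=True)
class Finding:
    repo: str
    file: str
    rule: str  # e.g. a CWE or scanner rule ID

def consolidate(scans: list[list["Finding"]],
                false_positives: set["Finding"]) -> list["Finding"]:
    """Merge per-branch scan results, dropping duplicates and known FPs."""
    seen: set[Finding] = set()
    merged: list[Finding] = []
    for scan in scans:
        for f in scan:
            if f in seen or f in false_positives:
                continue
            seen.add(f)
            merged.append(f)
    return merged

main_scan = [Finding("payments-api", "auth.py", "CWE-89")]
branch_scan = [Finding("payments-api", "auth.py", "CWE-89"),   # duplicate
               Finding("payments-api", "util.py", "CWE-798")]
fps = {Finding("payments-api", "util.py", "CWE-798")}          # triaged as FP

print(consolidate([main_scan, branch_scan], fps))
```

This is exactly the pain the pre-Macroscope process left unsolved: independent scans produced overlapping reports that were never reconciled, so the same issue surfaced many times.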

Prior to Macroscope, Global Payments application teams had to manually request scans, which were conducted independently, a process that yielded multiple reports that were not reconciled for false positives and duplicate results, leaving developers to sift through them. The volume of scan requests annually resulted in thousands of reports, which application teams then had to review. Moreover, teams usually requested scans late in the development process, and the resulting remediation work made it difficult for development teams to keep their release schedules on track.

Now with Macroscope, application security is better integrated into the software development lifecycle, including Global Payments’ continuous integration/continuous delivery (CI/CD) pipelines. Macroscope also reduces friction in finding security vulnerabilities by enabling developers to fix these vulnerabilities during their normal software development lifecycle (SDLC) process. The project has also delivered other business benefits, such as reducing the time developers spend fixing security vulnerabilities in code and improving speed to market.

Hastings Mutual turns to data to tame weather impacts on policies

Eshwar Pastapur, vice president and CIO, Hastings Mutual Insurance Co.


Organization: Hastings Mutual Insurance Co.

Project: Geospatial Intelligence

IT Leader: Eshwar Pastapur, vice president and CIO

The CIO and senior vice president of operations at Hastings Mutual Insurance had a very specific objective: to use the company’s data visualization tools to surface weather-related policy exposures. This would bring in-house, and improve, a function the company had been getting as a fee-based service, with information provided only monthly via spreadsheet.

Developing such capabilities would enable Hastings Mutual to more accurately predict the number of claims in each of its coverage areas. It would also help forecast the necessary claim support levels and improve fraud detection and the accuracy of loss reporting.

The project relies on open-source weather data from the National Oceanic and Atmospheric Administration and Iowa State University, overlaying company policies and delivering greater claim predictability through the ability to drill down at the policy level.

To deliver high-performance rendering of data, the team used single-page applications (SPAs) on the front end. This not only allows for easy delivery of information via web browsers but also helps with performance as pages are rendered directly on client devices. The team also used elastic techniques to expedite searching through large amounts of business data.

Geographic information is provided using PostGIS, a geospatial extension to the PostgreSQL relational database that is optimized for geocoded data. Sisense business intelligence software provides the interface to present the resulting screens and support user interaction. The resulting tool, with self-service visualization and analytics, allows for quick access and easy manipulation of the data by users. Moreover, it enables Hastings Mutual to add more capabilities in the future, such as incorporating AI for underwriting decision-making.

Implemented in under six months and fully deployed in 2021, the in-house geospatial solution already has reduced claim-related and other costs and improved relevancy and timely decision-making.

“This started as an expense-saving project, but now it has become a powerful competitive advantage for us,” says Vice President and CIO Eshwar Pastapur, explaining that IT has layered, and can continue to layer, more capabilities onto the tool as business needs arise.

TIAA modernizes with microservices

Ajit Naidu, CIO of retirement, marketing & digital client technology, TIAA


Organization: TIAA

Project: Remittance Modernization

IT Leader: Ajit Naidu, CIO of retirement, marketing, and digital client technology

As TIAA’s client base and demand for services grew, its remittance system wasn’t keeping up, resulting in a scaling issue that impacted customers and left TIAA developers propping up the technology to ensure the company adhered to its contractual service-level agreements.

The company’s concerns over such issues gave rise to the Remittance Platform Modernization project, which was launched to develop an application that performed 100 times better with increased stability and scalability, as well as support for accelerated feature delivery, a seamless client experience, and a vastly improved user interface.

TIAA used domain-driven design to develop the solution and associated services, establishing an internal community-managed framework for building microservices, which are the core of TIAA’s new platform architecture. The company also deployed a container-first operating model that delivers all applications and services within a containerized environment for scalability and flexibility.

“The Remittance modernization effort was the refactoring and re-engineering of a complex, monolithic and legacy platform,” says Ajit Naidu, CIO of retirement, marketing, and digital client technology. “This involved creating a component- and services-based architecture enabling the dynamic assembly of solutions and automated hierarchical unit testing. All the services created are built on a framework that optimizes distributed processing of data and files with caching capabilities. [We] also implemented the capability to replay/unwind transactions to easily fix data and/or processing errors.”
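The replay/unwind capability Naidu mentions is a well-known pattern: keep an append-only transaction log so state can be rebuilt by replaying entries, and correct a faulty transaction by appending a compensating reversal rather than mutating history. TIAA's actual platform is far richer; the sketch below, with invented names, only illustrates the idea.

```python
# Hedged sketch of replay/unwind over an append-only transaction log.
# Replaying the log rebuilds the balance; unwinding appends a reversing
# entry instead of editing history. Illustrative, not TIAA's code.

from dataclasses import dataclass

@dataclass(frozen=True)
class Txn:
    txn_id: str
    amount: float  # positive = credit, negative = debit

class RemittanceLedger:
    def __init__(self) -> None:
        self.log: list[Txn] = []

    def apply(self, txn: Txn) -> None:
        self.log.append(txn)

    def replay(self) -> float:
        """Rebuild the balance from the full log."""
        return sum(t.amount for t in self.log)

    def unwind(self, txn_id: str) -> None:
        """Reverse a faulty transaction with a compensating entry."""
        bad = next(t for t in self.log if t.txn_id == txn_id)
        self.apply(Txn(f"{txn_id}-reversal", -bad.amount))

ledger = RemittanceLedger()
ledger.apply(Txn("T1", 500.0))
ledger.apply(Txn("T2", 125.0))   # suppose T2 was posted in error
ledger.unwind("T2")
print(ledger.replay())
```

Because history is never edited, the log remains a complete audit trail, which is what makes "easily fix data and/or processing errors" safe in a regulated remittance context.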

The results: a 220% improvement in system performance and a 72% reduction in data movement resulting in significant reduction in failure points, with 98% testing now automated.

“In addition, this led to more than 60% reduction in time to market with over 70% reduction in operational costs,” Naidu adds.

Zenus Bank goes mobile for global business model

Pedro Martinez, CIO, Zenus Bank


Organization: Zenus Bank

Project: Zenus Bank app

IT Leader: Pedro Martinez, CIO

Zenus, a newly chartered digital bank with a global business model, is offering individuals and businesses around the world the ability to open a US bank account without having to be a US citizen or resident. It offers that capability through its native iOS and Android apps.

But developing the Zenus app presented multiple unique challenges for the new bank. Teams had to ensure the app met all regulatory requirements while also performing to technological standards that ensure a user-friendly omnichannel banking experience for clients all over the world. They also had to successfully onboard vendors to take on their subsets of the project and manage complex process integrations.

“The US banking industry has many regulatory and licensing obligations. To ensure that we met all of them, we needed to have an IT infrastructure that was secure, stable, and fast. We also wanted to take a modular approach to banking, which requires many integrations. We found it essential to be working alongside software and implementation partners who are customer-centric, adhere to the latest standards, and can picture one unified front and backend platform,” says CIO Pedro Martinez. “Through our partnerships, with the likes of Microsoft, we’re achieving operational excellence, so our end users have a positive banking experience.”

Using agile methodologies, developers created a patent-pending omnichannel remote onboarding process and a patent-pending proprietary customer risk score. The app also delivers real-time, cloud-hosted fraud management on all channels.

The technology has fueled the bank’s business objectives, enabling Zenus to achieve cross-border American Visa card issuance and giving it the ability to offer instant and unlimited cross-border USD transfers. To date, approximately 80% of international applicants are automatically approved by the bank’s patent-pending banking system in less than 10 minutes.

The Zenus App is live in Apple and Google Play stores in more than 180 countries.


Once a laggard in IT adoption, the healthcare industry now universally embraces digital transformation.

Consider the figures: According to a 2022 survey from healthcare consultancy The Chartis Group, 99% of the 143 US health system executives it polled agreed on the importance of investing in digital initiatives.

Research points to several drivers that are pushing healthcare entities forward on their digital journeys, with the need to improve patient outcomes and reduce the costs of care topping the list. Other motivations include the desire to provide better patient experiences, to better aid clinicians and support staff in their jobs, and to compete with digital natives that are entering the healthcare market.

The pandemic accelerated the healthcare sector’s digital journey, forcing institutions to work remotely where possible and to find ways to keep up with the high demand for care brought on by COVID, notes Abhishek Singh, a partner at Everest Group and leader of the research firm’s cloud and legacy transformation practice.

Singh says healthcare is seeking transformation throughout its operations — from back-office processes through mid-office functions to front door–type engagements with patients, with digitalization initiatives in the first two areas supporting the last.

“Organizations are more willing and able to spend, as they’re seeing the benefits of their [digital initiatives],” he adds.

Like other sectors, the healthcare industry faces challenges such as financial and resource constraints as it moves forward with tech projects, industry experts say. Healthcare also must contend with industry-specific challenges that can slow DX, says Taylor Davis, president of KLAS Research.

“It’s the most complex service industry the world has ever seen,” Davis says. The sector rests on an extensive, complex body of knowledge that’s rapidly increasing. Its core systems hold hundreds of thousands of data fields per individual — significantly more than the per-person count held by core systems in other industries. It’s governed by complicated regulations. It has complicated funding and payment systems. And it’s a high-risk environment, where the consequences of being wrong can be catastrophic or even fatal.

Yet, despite that staggering complexity, experts are seeing leaps in digitalization and transformation to deliver better experiences, more efficient services, and improved care outcomes.

The following eight winners of CIO 100 Awards for IT Innovation and Leadership highlight the digital successes that the healthcare industry has delivered.

Atlantic Health taps AI to expedite critical radiology reviews

Sunil Dadlani, SVP and CIO, Atlantic Health System


Organization: Atlantic Health System

Project: Radiology Imaging AI Analysis Project

IT Leader: Sunil Dadlani, SVP and CIO

For radiologists at Morristown, N.J.-based Atlantic Health System, speed is critical when reviewing patients’ imaging studies.

But the radiologists saw that the work was taking longer than they wanted, particularly for highly advanced studies or studies from patients with previous imaging that also required reviews.

Moreover, while the images could reveal patients in need of immediate additional care — care that was being delayed by the time it took to interpret the images — radiologists didn’t know which patients needed expedited treatment until they had reviewed their images.

Atlantic Health System turned to artificial intelligence to solve that conundrum, teaming IT up with the radiology department to select software that uses AI to analyze images, with Food and Drug Administration–approved algorithms trained to spot and flag acute abnormalities.

More specifically, the AI analyzes images in queue for review, looking for and identifying clinical abnormalities in those images and then flagging those images for priority review by the radiologists. The technology ensures that images that could indicate the need for time-sensitive treatment are seen first, enabling patients to get the care they need as quickly as possible.
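The queue mechanics described above, flagged studies jumping ahead of routine ones while each tier preserves arrival order, map directly onto a standard priority queue. The FDA-cleared detection itself is a vendor model; in this hedged sketch a boolean flag stands in for it, and the study IDs are invented.

```python
# Hedged sketch of the prioritized worklist: AI-flagged studies are read
# first; within a tier, studies keep first-in-first-out order. The AI
# detection step is stubbed out as a boolean flag. Illustrative only.

import heapq
import itertools

class ReviewQueue:
    FLAGGED, ROUTINE = 0, 1  # lower tier value is reviewed first

    def __init__(self) -> None:
        self._heap: list[tuple[int, int, str]] = []
        self._seq = itertools.count()  # preserves arrival order within a tier

    def add(self, study_id: str, flagged: bool) -> None:
        tier = self.FLAGGED if flagged else self.ROUTINE
        heapq.heappush(self._heap, (tier, next(self._seq), study_id))

    def next_study(self) -> str:
        """Pop the highest-priority study for the radiologist to read."""
        return heapq.heappop(self._heap)[2]

q = ReviewQueue()
q.add("CT-100", flagged=False)
q.add("CT-101", flagged=True)   # e.g. a suspected acute abnormality
q.add("CT-102", flagged=False)
print([q.next_study() for _ in range(3)])
```

The (tier, sequence, id) tuple ordering is the whole trick: the heap compares the tier first, so a late-arriving flagged study still outranks every routine one already waiting.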

The technology, now fully deployed, is delivering results: By November 2021, the AI prioritized 469 images out of the 8,479 scans it analyzed.

“As with all technological innovations throughout our organization, the patient is always first and foremost at the center of anything we do. This program embodies that approach in a unique way — it is both virtually a clinical assistant to our radiologists, and at the same time an advocate for our patients with the most urgent needs,” says Senior Vice President and CIO Sunil Dadlani.

CommonSpirit Health goes digital to enhance patient experience

Organization: CommonSpirit Health

Project: Connected Patient Journeys and Experiences

IT Leader: Suja Chandrasekaran, system SEVP and chief information and digital officer (Chandrasekaran has since left the company)

CommonSpirit Health has more than 1,000 care sites and 141 hospitals in 21 states. That made the creation and delivery of integrated, standardized, and connected patient experiences across a continuum of provided care essential.

With that in mind, its IT and digital teams in July 2020 embarked on a project to create connected, personalized patient journeys and experiences, with a goal to deliver better, more holistic healthcare for patients and to develop better operating models for providers.

Engaging all key stakeholders, processes, and systems, the teams created governance structures, working groups, cross-platform application architectures, common goals, and key performance indicators to ensure the project successfully progressed.

They then charted key patient journeys, infusing digital and human experiences in each journey to promote patient engagement and deliver a better care experience. The teams then identified, evaluated, and prioritized those experiences, working with operational and clinical teams to build new products and processes to further support them.

CommonSpirit Health has enjoyed positive returns on its investments. For example, in fiscal 2021 its Unified Web Experience served more than 6.5 million page views in two of its transitioned markets. And its Search and Schedule Experience served more than 2.6 million clinician profile views to patients, some 94,000 online appointment bookings and more than 720,000 click-to-call connections to care providers.

ChenMed IntuneHealth offers VIP care to senior patients, wherever they need it

Hernando Celada, chief innovation and strategic initiatives officer, ChenMed


Organization: IntuneHealth – A ChenMed Company

Project: IntuneHealth

IT Leader: Hernando Celada, CIO

Miami-based ChenMed has created a new entity, IntuneHealth, that uses technology to deliver concierge, VIP care when and where its senior patients need it.

IntuneHealth is a fully integrated system of care featuring a patient-facing app through which patients can access care at any time, including virtual visits with the touch of a button.

Patients can use the app to order medicines, schedule appointments, and retrieve lab results and their medical history. They can also use it to reach concierge and support team members around the clock, year-round.

Moreover, they can use the app to manage their care whether that care is delivered virtually, in office, or at their own homes, as well as whether the care is provided by their own primary care physicians (PCPs) or specialists.

IntuneHealth also uses technology to enable doctor-to-doctor coordination of care, so PCPs can guide their patients through specialty care and visits.

Other features include an easy-to-use interface with data-driven personalization; a patient portal featuring appointment scheduling and reminders; medication refill requests and reminders; referral management; and the capability to link with monitoring devices, wearables, and other such tools.

Patients can use the app to connect to their caregivers via video, phone, or in-app chat. Additionally, they can use it to access virtual events and social activities, such as cooking classes, fitness competitions, and games.

“IntuneHealth is taking the hassle out of healthcare and delivering high-quality, convenient care to our members, whether that’s in one of our centers, virtually through our app, or in the comfort of their own home,” says CIO Hernando Celada. “We believe healthcare should be accessible, simple, and coordinated to deliver the best experience and outcomes for our patients — and we look forward to providing that at IntuneHealth.”

Jackson Healthcare automates patient record-sharing

Michael Garcia, SVP and CIO, Jackson Healthcare System


Organization: Jackson Healthcare System

Project: Medical Record Surveillance and Record Sharing

IT Leader: Michael Garcia, SVP and CIO

Miami-based Jackson Healthcare System faced challenges around efficiently sharing patient records with other entities. This was a particular problem when working with smaller medical organizations, such as private practices and nursing facilities, that referred their own patients to the larger Jackson Healthcare System for medical services.

More specifically, many of those smaller entities rely on manual processes to retrieve patient records from Jackson. That created additional work for Jackson Healthcare workers and those at referring organizations, which often had to manually match retrieved records with their own patient files. This manual work meant records requests could take weeks to complete.

To address those issues, cross-disciplinary teams turned to process automation, data manipulation, an integration engine, and other technologies for the Medical Record Surveillance and Record Sharing project. The in-house solution uses data surveillance to scan medical records in real time, match them to partner organizations, and then automate the record transfers.

This solution also sends medical records to the referring organization via the method of its choice: files, direct messages, electronic fax, or even physical fax. Moreover, the technology can scale, thereby helping Jackson efficiently handle an increasing number of referrals.
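The match-then-route flow described in the last two paragraphs can be sketched simply: identify which partner a record belongs to, then look up that partner's preferred delivery channel. The partner names, patient IDs, and matching key below are hypothetical; the production system matches on richer patient demographics than a single identifier.

```python
# Hedged sketch of record surveillance routing: match an outbound record
# to the referring organization, then deliver via its preferred channel.
# All organizations, IDs, and channels are illustrative.

from dataclasses import dataclass

@dataclass
class Partner:
    name: str
    patient_ids: set    # hypothetical match key; real matching is richer
    delivery: str       # "file", "direct_message", "efax", or "fax"

PARTNERS = [
    Partner("Bayside Nursing", {"P-10", "P-11"}, "direct_message"),
    Partner("Coral Way Practice", {"P-20"}, "efax"),
]

def route_record(patient_id: str):
    """Return (partner, channel) for a record, or None if unmatched."""
    for p in PARTNERS:
        if patient_id in p.patient_ids:
            return (p.name, p.delivery)
    return None  # unmatched records would fall back to manual handling

print(route_record("P-20"))
```

Automating this lookup is what removes the weeks of manual matching the article describes: once a record is matched, the transfer can fire immediately over whichever channel the partner registered.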

“For too long, healthcare has faced the hurdle of a challenging technological landscape and limited options to share patient records easily. The automation of medical record-sharing overcame this big hurdle for referring organizations and enabled Jackson to grow partnerships while simultaneously benefiting our patients and community,” says Senior Vice President and CIO Michael Garcia.

He adds: “This technology has become a strategic differentiator to keep Jackson as a community leader and a place for everyone to receive the highest-level care.”

Marshfield Clinic turns to telehealth to overcome staffing shortages

Organization: Marshfield Clinic Health System

Project: Telehealth Solutions in Critical Access Hospitals

IT Leader: Jeri Koester, CIO

Marshfield Clinic Health System was facing a staffing challenge at its Neillsville, Wisc., facility. The healthcare organization had a shortage of physicians, mostly primary care providers and emergency department clinicians who supervised its nurse practitioner (NP) hospitalists and provided direct patient care.

MCHS saw telehealth as a solution to that problematic scenario. It deployed a Teladoc Health RP-VITA robot, which enabled MD hospitalists to conduct virtual visits with patients and to virtually supervise and consult with NP hospitalists at the Neillsville facility.

Remote providers can move the self-driving robotic cart themselves by simply clicking a link to direct it to its destination, with onboard sonar keeping the cart from colliding with people or objects in its path.

Buoyed by the success of that program, MCHS expanded the use of telehealth in its facilities to enable remote specialty consults for patients. For example, the organization deployed InTouch TV Pro devices to each patient room, first at MMC-Neillsville and later at its Ladysmith, Wisc., hospital. The technology converts every in-room television into a telehealth endpoint capable of offering video and audio visits with healthcare providers across a range of care, from medical specialties to nutrition needs to spiritual services. The InTouch TV Pro also enables patients to facilitate their own video visits with friends and family when in-person visitations aren’t possible.

According to MCHS, these telehealth solutions have helped increase healthcare access and quality while decreasing costs and provider dissatisfaction.

“Part of our mission at Marshfield Clinic Health System is to bring affordable healthcare to our communities. We need to use innovation to meet the needs of such a rural healthcare population,” says CIO Jeri Koester. “In some of our communities, if we weren’t present, patients would drive hours to access care. By maximizing our capabilities with a telehealth solution, we are able to bring expert-level physician care to wherever our patients are.”

Novant Health tele-ICU improves patient care virtually

Organization: Novant Health

Project: Digital Enhancement and Virtualization of Traditional Care Channels and Processes

IT Leader: Onyeka Nchege, SVP and CIO

Novant Health, based in Winston-Salem, N.C., has focused on deploying technology to create virtually enhanced care in its own facilities, in its patients’ homes, and even in community locations.

Novant Health’s tele-ICU program demonstrates this vision, enabling critical care physicians (called intensivists) and experienced critical care nurses to remotely monitor patients who are being seen onsite in community care facilities. These intensivists and nurses can work from Novant Health’s two command centers, other care sites, or even their own homes.

Novant Health built this tele-ICU program on various technologies, including mobile advanced sensor capabilities that can be moved into standard hospital rooms to provide an intensive-care degree of monitoring for the patient; tele-ICU workstations in the command centers with multiple monitors, headsets, and webcams for intensivists; and an electronic medical record (EMR) system with tools tailored to support digital tele-ICU workflows.

To ensure the tele-ICU program enabled a seamless patient care experience, Novant Health’s Digital Products and Services (DPS) team automated communication between care teams. The DPS team also worked with clinicians to develop and deploy algorithms to score medical and safety alerts based on patient acuity levels and then automated notifications.
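The acuity-based alert scoring described above can be sketched roughly as follows. All of the acuity levels, weights, thresholds, and field names here are illustrative assumptions for the sake of the sketch, not Novant Health's actual implementation.

```python
# Illustrative sketch of acuity-weighted alert scoring (all values hypothetical).
from dataclasses import dataclass

# Hypothetical acuity weights: higher-acuity patients get higher-priority alerts.
ACUITY_WEIGHT = {"low": 1, "moderate": 2, "high": 3}

@dataclass
class Alert:
    patient_id: str
    kind: str          # e.g. "medical" or "safety"
    severity: int      # raw severity from the monitoring system, 1-5
    acuity: str        # patient acuity level

def score_alert(alert: Alert) -> int:
    """Combine raw severity with patient acuity into a routing score."""
    return alert.severity * ACUITY_WEIGHT[alert.acuity]

def should_notify(alert: Alert, threshold: int = 6) -> bool:
    """Only scores at or above the threshold trigger an automated notification."""
    return score_alert(alert) >= threshold

# A moderate-severity alert for a high-acuity patient is escalated (score 9)...
print(should_notify(Alert("p1", "medical", 3, "high")))
# ...while the same alert for a low-acuity patient is not (score 3).
print(should_notify(Alert("p2", "medical", 3, "low")))
```

The point of weighting by acuity is that the same raw alert can mean very different things depending on how sick the patient is, so notifications are routed by the combined score rather than by severity alone.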

“Our work developing our tele-ICU program has enhanced the patient experience and virtualized our traditional care channels to deliver business value, increase access to care, and improve the quality of that care,” says Senior Vice President and Chief Information Officer Onyeka Nchege. “This project was a transformative component of Novant Health’s path through the pandemic surge and laid the foundation for virtual extensions now woven throughout all previously existing physical locations.”

In addition to tele-ICU, Novant Health’s DPS team has delivered other advanced technologies (such as AI/machine learning, advanced sensors, autonomous drones, and robotics) to support next-level healthcare and provide the foundation for a highly advanced, interwoven composite care delivery method.

Penn Medicine digitally transforms DNA sample collection

Organization: Penn Medicine

Project: Accelerating Research by Integrating Clinical and Research IT Systems to Rapidly Expand Penn Medicine’s Biorepository

IT Leader: Michael Restuccia, SVP and CIO

As Penn Medicine anticipated a dramatic growth in patient DNA sample collections, officials concluded that the organization’s existing processes were too labor-intensive to enable that expected expansion.

So Penn Medicine set out to develop a technology-enabled process that could support collecting large numbers of specimens as well as obtaining and storing patient consent — all without driving up costs and increasing manual work.

To deliver on that objective, multiple teams collaborated to design and implement a new process for managing the physical samples and the data associated with them.

This multidisciplinary project built a process that leverages the electronic medical record (EMR) system and its patient portal to obtain patient consent; uses the clinical lab system to barcode samples; uses the laboratory information management system (LIMS) to process samples; and works across those multiple systems to link research samples to clinical identifiers.

In another key innovation, the LIMS and data analytics teams set up an integration with the EMR and clinical lab information systems to identify the patient’s electronic patient identifier from the collected sample’s clinical accession number, with linking occurring automatically each night. The new setup increases quality and speed by eliminating high-cost manual effort and delivering a higher degree of accuracy than the legacy manual approach.
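The core of that nightly linking step is essentially a lookup join: resolve each research sample's clinical accession number to the patient's electronic identifier, and flag anything that fails to resolve. The sketch below is a minimal illustration under assumed data shapes; the identifiers, field names, and tables are hypothetical, not Penn Medicine's schema.

```python
# Minimal sketch of the nightly linking step: map each research sample's
# clinical accession number to the patient's electronic identifier.
# All identifiers and field names below are illustrative.

# Hypothetical extract from the clinical lab system: accession -> patient ID
lab_index = {
    "ACC-1001": "EPI-555",
    "ACC-1002": "EPI-777",
}

# Hypothetical LIMS records for research samples awaiting linkage
samples = [
    {"sample_id": "S-1", "accession": "ACC-1001", "patient_id": None},
    {"sample_id": "S-2", "accession": "ACC-9999", "patient_id": None},  # no match
]

def link_samples(samples, lab_index):
    """Fill in patient IDs where the accession number resolves; collect failures."""
    unresolved = []
    for s in samples:
        pid = lab_index.get(s["accession"])
        if pid is None:
            unresolved.append(s["sample_id"])  # flag for manual review
        else:
            s["patient_id"] = pid
    return unresolved

unresolved = link_samples(samples, lab_index)
print(samples[0]["patient_id"])  # EPI-555
print(unresolved)                # ['S-2']
```

Running the resolution as an automated nightly batch, rather than by hand, is what removes the manual effort the article credits with the accuracy and cost gains: unmatched samples surface as an exception list instead of silently lingering unlinked.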

Penn Medicine officials said the project is now successfully supporting its expanding collection of biological samples. But they also noted that the collaborative effort to re-imagine a process leveraging established, centralized systems, staff, and processes was itself a key innovation.

“Over the past decade, we have invested significantly in foundational technologies to support our clinical and research operations. When our research community proposed a significant expansion of our central sample bio-repository, we seized upon the opportunity to modernize their patient consenting and sample collection processes,” says Michael Restuccia, Penn Medicine senior vice president and CIO.

University of Miami Health System enlists IoT to ensure COVID vigilance

Organization: University of Miami Health System

Project: Enhancing Tele-Vigilance

IT Leader: Dr. David Reis, VP of IT and CIO

Seeking to minimize COVID-19 transmissions as cases surged in Florida, the University of Miami Health System IT department developed what it termed “tele-vigilance” using IoT-enabled remote monitoring capabilities.

The IT department partnered with the IoT company TytoCare and with Epic Systems, maker of electronic medical record (EMR) systems, to develop an end-to-end solution.

TytoCare’s all-in-one modular IoT device remotely monitors vital signs and breath sounds. Patients can also use the device to capture images of the skin, ears, eyes, and throat, and then upload all that information to their medical record.

Doctors could enroll patients in the tele-vigilance program by using the EMR to order the TytoCare IoT device and establish for each patient which vitals to monitor and for how long. Each order would route to the tele-vigilance group, which would assist each individual — whether a student, employee, or patient — with device setup.

The IT department developed a new capability within the EMR to link the IoT device with each patient’s medical record and to automatically pull in the patient’s vital signs from the device.

IT developed automated data integrity methods, so that a real-time notification would be sent to the tele-vigilance team to contact patients if there were linking issues.

Additionally, the IT team integrated the IoT device with the patient portal, so individuals could see the vital signs that were sent to the tele-vigilance doctor.

And the team created a real-time push notification capability, which would send alerts to the tele-vigilance provider’s smartphone if any patient’s vital signs were out of range.
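The range check behind those push notifications can be sketched as a simple threshold comparison per vital sign. The vital-sign names and "normal" ranges below are illustrative assumptions, not the health system's actual clinical thresholds.

```python
# Rough sketch of the out-of-range check behind the push notifications.
# Vital-sign names and normal ranges below are illustrative assumptions.

NORMAL_RANGES = {
    "heart_rate": (50, 110),      # beats per minute
    "spo2": (92, 100),            # blood oxygen saturation, %
    "temperature": (36.0, 38.0),  # degrees Celsius
}

def out_of_range(vitals: dict) -> list:
    """Return the names of any vitals outside their normal range."""
    flagged = []
    for name, value in vitals.items():
        lo, hi = NORMAL_RANGES[name]
        if not (lo <= value <= hi):
            flagged.append(name)
    return flagged

reading = {"heart_rate": 128, "spo2": 95, "temperature": 37.1}
flagged = out_of_range(reading)
if flagged:
    # In the real system, this is where a push notification would be sent
    # to the tele-vigilance provider's smartphone.
    print(f"ALERT: {', '.join(flagged)} out of range")
```

In practice such thresholds would likely be set per patient by the ordering physician (the article notes doctors establish which vitals to monitor for each patient); the fixed table here is purely for illustration.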

The University of Miami Health System deployed thousands of devices to students, employees, and patients, allowing them to be closely monitored at home after a positive COVID-19 test, thereby reducing unnecessary hospitalizations and exposure of others to the virus while ensuring escalation of care when needed.
