The increasing amount of data generated by today's enterprises continues to challenge organizations as they look to extract valuable insights from it. The inability to leverage all kinds of data, and the expertise data scientists need to prep that data, strain many enterprises. IT groups need a new approach that puts better data analysis tools and technologies into the hands of domain experts.

“It’s a very competitive marketplace for well trained, smart data scientists, and for the public and private sector, getting access to those resources and keeping them is very difficult,” says Andy MacIsaac, director of solutions marketing, public sector, at Alteryx. “But many times, organizations don’t always need to have that high-level resource for every data issue or need for answers. What we need to do is democratize the approach to data analytics and give access to the domain experts who can self-serve their data and analytics needs.”

Business leaders unable to access these data tools end up asking the data scientists for answers to their questions, taking them away from their larger, more mission-critical data projects. This creates friction as the two groups clash over what should be a cooperative effort.

“You want your data scientists to be working on bigger, more impactful projects,” says MacIsaac. “They don’t want to be data janitors, cleaning up data and running standard reports. They need to be tackling the big problems, because that’s what they were trained for. By lowering the barrier to analytics through new self-service and low-code data tools, you are putting a lot of capability into the hands of a larger group of people.”

Tools such as those offered by Alteryx can be quickly deployed by enterprises and public agencies alike, offering new insights for those groups. With automation and data-cleanup tools freeing up time, data experts can pursue answers to new questions and take advantage of more data sources, all without worrying about the effort to code algorithms or figure out which data is good or bad.

For example, a government agency was able to use the tools to discover whether sailors on ships during the Vietnam War were eligible for benefits related to the deployment of Agent Orange, by determining what ship they were aboard on a given day.

“Typically a business or agency will start with a business problem or challenge, and there’s always a question that needs to be figured out,” says MacIsaac. “With these tools, you don’t have to be a data scientist to get the answers if you know the data you need, pull it down and create the analysis.”



Even the modern workplace can be boring and repetitive. Enter robotic process automation (RPA): a smart set of tools that deploys AI and low-code options to simplify workflows and save everyone time while also adding safeguards that can prevent costly mistakes.

What is RPA?

Robotic process automation (RPA) is an application of technology, governed by business logic and structured inputs, aimed at automating business processes. Using RPA tools, a company can configure software, or a “robot,” to capture and interpret applications for processing a transaction, manipulating data, triggering responses, and communicating with other digital systems.

In some organizations, RPA is a way to modernize old software without replacing it. Most organizations have business applications that work perfectly well but require users to click on the same boxes in the same patterns all day long. RPA tools aim to replace that tedium, adding a new layer to automate repetitive tasks without having to reinvent the application at the core.

RPA benefits

RPA is also a relatively simple way to integrate AI algorithms into old applications. Many RPA platforms offer computer vision and machine learning tools that can guide the older code. Optical character recognition, for example, might extract a purchase order from an uploaded document image and trigger accounting software to deal with it. The ability to pull words and numbers from images is a big help for document-heavy businesses such as insurance or banking.
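Once an OCR step has turned a document image into raw text, the downstream bot often needs only simple pattern matching to decide what to hand off. Here is a minimal sketch of that idea in Python; the field labels and formats are invented for illustration and don't come from any particular RPA product:

```python
import re

def extract_purchase_order(ocr_text):
    """Pull a hypothetical PO number and total from OCR'd invoice text."""
    po = re.search(r"PO\s*#?\s*(\d{4,10})", ocr_text)
    total = re.search(r"Total:?\s*\$?([\d,]+\.\d{2})", ocr_text)
    if not po:
        return None  # nothing recognizable to hand to the accounting system
    return {
        "po_number": po.group(1),
        "total": float(total.group(1).replace(",", "")) if total else None,
    }

sample = "INVOICE\nPO# 100234\nTotal: $1,499.00"
print(extract_purchase_order(sample))  # {'po_number': '100234', 'total': 1499.0}
```

A real platform would wrap this kind of rule in a visual designer, but the underlying logic is often no more complicated than this.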

The biggest benefit, however, may be how RPA tools are "programmed," or "trained": a process by which the platforms' robots "learn" by watching business users click away. This job, sometimes called "process discovery," can use a click stream to imitate what your users just did, similar to how spreadsheet macros are recorded.
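In spirit, process discovery is a record-and-replay loop, much like a spreadsheet macro recorder. The following toy sketch (plain Python, not any real RPA platform's API) captures a user's actions as a list of steps and replays them against an application object:

```python
class Recorder:
    """Captures a user's click stream as a list of (action, target) steps."""
    def __init__(self):
        self.steps = []

    def capture(self, action, target):
        self.steps.append((action, target))

def replay(steps, app):
    """Replay recorded steps against any app exposing the same actions."""
    for action, target in steps:
        getattr(app, action)(target)

class FakeApp:
    """Stand-in for a GUI application; just logs what the bot does."""
    def __init__(self):
        self.log = []
    def click(self, target):
        self.log.append(f"click:{target}")
    def type_text(self, target):
        self.log.append(f"type:{target}")

rec = Recorder()
rec.capture("click", "File>Export")
rec.capture("type_text", "report.csv")

app = FakeApp()
replay(rec.steps, app)
print(app.log)  # ['click:File>Export', 'type:report.csv']
```

Commercial tools add screen recognition, retries, and scheduling on top, but the record-once, replay-many pattern is the core.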

Still, RPA isn’t automatic. Manual intervention and tweaking are necessary during training, and sometimes code must be written to handle what a preconfigured bot can’t. But you won’t have to do much of this. Moreover, the bots keep getting smarter, making training easier and edge cases less frequent. AI routines can also help look for patterns that may speed up the bots in the future.

Top RPA tools

RPA tools have grown into parts of larger ecosystems that map out and manage the enterprise computing architecture. These systems can manage the various APIs and services, with additional bots helping data flow between them.

RPA tools are also starting to take on roles managing the cloud. While the first iterations were aimed at desktop users, functions that help with backend control are increasingly common, and the boundary between desktop RPA and maintaining databases and services continues to blur.

The RPA marketplace offers a mixture of new, purpose-built tools and older tools that have been given features for adding automation. Some began as business process management (BPM) tools and expanded with new features. Some vendors market their tools as “workflow automation” or “work process management.” Others distinguish RPA from “business process automation” by saying that RPA includes more sophisticated AI and machine vision routines.

Following is an alphabetical list of the top RPA tools available today. For more on how to determine which RPA tool is best for your organization, see “How to choose RPA software: 10 key factors to consider.”

| RPA tool | Major features | Use cases |
| --- | --- | --- |
| Airslate | Document editing and signature tracking | Contract and agreement processing |
| Appian | Java-centric bots offer cross-platform range | Client management and compliance paperwork processing |
| Automation Anywhere | The Center of Excellence (CoE) manager tracks the performance of the various bots in a centralized dashboard; Bot Insight drills down to track the performance of each bot | Opening up bot development and deployment across the enterprise |
| AutomationEdge | Pay-as-you-go pricing simplifies adoption | Chatbot management; front-, middle-, and back-office document processing |
| AWS Lambda | Automating backend data flows in the Amazon cloud | Fixing problems and smoothing data movement between services |
| Cyclone Robotics | Built for the Chinese market with a wide range of plugins tackling major platforms and services with AI | A wide range of markets, including mobile |
| Datamatics | Integration with AI for OCR and language analysis; mainframe integration; desktop version | Chatbot and call center support; desktop automation |
| EdgeVerve Systems | Open-source edition; tighter integration with AI for contextual and visual processing | Supply chain management, financial transactions |
| Fortra Automate | Integration with Microsoft desktop applications | Claims processing, service industries |
| IBM Automation | Deep experience with enterprise workflow; integration with many mainframes | Data capture, scientific process management; business decision automation, front-line customer care |
| Kofax (ImageTech Systems) | Integration with enterprise content management tools; microapps platform to simplify deployment | Managing content collections; data pipeline integration |
| Laiye | AI-powered chatbots and cloud-native robots | The Work Execution System offers general support for document-centric business tasks |
| Microsoft Power Automate | Focus on Windows 10 or 11 platform on the desktop or on Azure | Broad, enterprise-wide empowerment; AI integration |
| MuleSoft RPA (formerly Servicetrace) | AI-based OCR and a good editor encourage development; recent merger will bolster integration with API-based workflows | Banking, utilities, and other industries with heavy compliance-driven work |
| NICE | Integration between desktop assistants and server-side backend | Call center automation; customer service tools; robots that first learn by assisting humans before graduating to full autonomy in the back office |
| Nintex (formerly Kryon) | Tight integration with dominant desktop tools | Compliance pipelines dominated by documents |
| NTT-AT WinActor | Heavy integration with Microsoft tools | Email processing and database integration; spreadsheet automation |
| Pega | Fully integrated suite of enterprise tools for developing, deploying, and automating data processing | Regulatory compliance and integration |
| Rocketbot | Python-based bots | Document processing and data extraction |
| Samsung SDS Brity RPA | Aimed at improving industrial and enterprise business flow through automation | Time-saving and quality improvement for enterprise-driven tasks |
| SAP | Integration with the SAP stack | Automating the business processes tracked and driven by SAP |
| SS&C Blue Prism | Big investment in AI, including machine vision and sentiment analysis for classifying and responding to messages | Building a full chain of document and message processing |
| UiPath | Open environment allows integration of VB.Net, C#, Python, and Java code when challenges grow | Integration with full legacy stack solutions; transaction processing |
| WorkFusion | Digital workers tuned to common roles for RPA and workforce automation | Email and client interaction; task routing |

Airslate

Document-centric tasks, such as PDF editing and generating eSignatures for contracts, are a core focus for Airslate. The bots for simplifying the workflow are programmed with the drag-and-drop Flow Creator. Preprogrammed resources include connections to major backends such as Salesforce as well as a collection of templates for common processes.

Major features: Document editing and signature tracking
Major use cases: Contract and agreement processing


Appian

Appian acquired Jidoka in 2020 and changed the product’s name to Appian RPA while integrating it with its Digital Process Automation suite. Jidoka is a Japanese term that might be translated as “automation with a human touch,” a reference to how its software robots are trained to emulate humans interacting with the standard systems — mainframe terminal, web, databases, and so on. Appian RPA’s low-code integrated development environment (IDE) encourages fast creation of custom bots, while the dashboard tracks all the operating robots and can create a video of the screen to help debug the bots deployed across Appian’s cloud. The information is ingested into what Appian calls a “Data Fabric” filled with not just numbers and letters, but relationships between elements. Deeper integration across both desktop platforms and mobile brings the tool to the edges of any enterprise network.

Major features: Java-centric bots offer cross-platform range
Major use cases: Client management and compliance paperwork processing

Automation Anywhere

The Bot Store at Automation Anywhere offers a collection of tools for the Automation 360 platform that perform standard clicking and tracking as well as processes that glue together complex data files. There are bots for extracting information from spreadsheets, files, or web pages, and bots for storing this information in databases for issue tracking, invoice processing, and more. Many of the bots rely on APIs such as Microsoft Azure’s image analysis API. One of the goals is opening up access across the enterprise with easy-to-automate tools such as AARI, which can turn any web application into an automated worker. Automation Anywhere also offers a “community edition” that is free for small businesses with limited workflows, as well as a cloud-based service that saves you the trouble of installing and maintaining the RPA platform yourself.

Major features: The Center of Excellence (CoE) manager tracks the performance of the various bots in a centralized dashboard; Bot Insight drills down to track the performance of each bot
Major use cases: Opening up bot development and deployment across the enterprise


AutomationEdge

The bots at AutomationEdge offer “hyperautomation” through a mixture of API interaction and AI. The focus is interacting with web pages, databases, and Excel spreadsheets. Its “Conversational RPA” brings a natural language interface to many interactions. Many bots in the bot store are preconfigured for specific industries or sections of a business, such as human resources or customer relations. AutomationEdge also offers a free version that’s limited in time, steps, and reach. Some AI-driven options such as the Conversational RPA and Intelligent Document Processing aren’t included. A cloud-based service is also available for those who don’t want to install it.

Major features: Pay-as-you-go pricing simplifies adoption
Major use cases: Chatbot management; front-, middle-, and back-office document processing

AWS Lambda

The Amazon cloud is filled with options for data processing. Lambda functions act like logical glue for connecting services and automating work flowing through their networks. The functions can be as small or as large as needed and they can be triggered when new data arrives. Lambda functions are aimed more at automating work on the backend, and they are most efficient when working with AWS services but can be connected to any service with extra work.
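A Lambda function is, at bottom, just a handler invoked with an event. The sketch below shows the shape of one that reacts to a new S3 object and decides where to route it; the bucket name, key prefix, and routing convention are invented for illustration, and in a real deployment the event would come from an S3 event notification rather than being built by hand:

```python
# Minimal sketch of an AWS Lambda handler for S3 "ObjectCreated" events.
# The Records/s3/bucket/object layout follows the S3 notification format.

def handler(event, context=None):
    results = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Route by prefix: "invoices/" is a made-up convention for this demo.
        dest = "invoice-queue" if key.startswith("invoices/") else "archive"
        results.append({"bucket": bucket, "key": key, "route": dest})
    return results

# Hand-built event for local testing, mimicking an S3 notification.
fake_event = {
    "Records": [
        {"s3": {"bucket": {"name": "uploads"},
                "object": {"key": "invoices/po-100234.pdf"}}}
    ]
}
print(handler(fake_event))
```

Because the handler is plain code with no long-lived server, it can be tested locally with a synthetic event like this before being wired to the bucket.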

Major features: Automating backend data flows in the Amazon cloud
Major use cases: Fixing problems and smoothing data movement between services

Cyclone Robotics

The Cyclone toolset is growing into a broad selection of tools that support low code and not-so-low code development. Its RPA Studio brings together basic automation tools for building data pipelines with advanced AI tools for OCR and computer vision. It also offers a low-code option for integrating multiple tools into a cohesive, automated workflow. Small and midsize businesses can also run the tools in Cyclone’s cloud using the EasyPie service.

Major features: Built for the Chinese market with a wide range of plugins tackling major platforms and services with AI
Major use cases: A wide range of markets, including mobile


Datamatics

TruBots, the name Datamatics gives its individual programs, are created with TruBot Designer, a tool that enables you to create and edit the software. Much of the work is accomplished by dragging and dropping components in a visual designer, but developers can also adjust the system-generated code in an IDE. The bots can be coordinated with TruBot Cockpit, and the system emphasizes text processing with special tools for scanning images and making sense of unstructured text. The tool runs in the cloud, but some features can be installed on your own machine with a personal edition for handling more personal tasks, something Datamatics calls the “democratization of RPA.” Teams with document-heavy workloads can use TruCap, a tool for template-free data ingestion.

Major features: Integration with AI for OCR and language analysis; mainframe integration; desktop version
Major use cases: Chatbot and call center support; desktop automation

EdgeVerve Systems

The AssistEdge system helps build out your data processing infrastructure by integrating with major data sources and tracking users to discover common work patterns with AssistEdge Discover. Call centers and customer help portals can use AssistEdge Engage to automate the repetitive tasks of orchestrating multiple legacy systems. When possible, EdgeVerve relies on AI to provide contextual help and process incoming forms and other data. The document processing system, XtractEdge, for instance, offers OCR to speed form processing. The company also has systems optimized for industries such as supply chain management (TradeEdge) or banking. It offers migration from desktop to a cloud solution, and an open-source edition.

Major features: Open-source edition; tighter integration with AI for contextual and visual processing
Major use cases: Supply chain management, financial transactions

Fortra Automate

The RPA tools from Fortra (formerly HelpSystems) tackle business tasks ranging from responding to inquiries to generating reports. The core Desktop Automation tool scrapes data sources and interacts with web apps and local software by simulating events in the Windows GUI. There’s an emphasis on Microsoft Office tools to produce reports, both textual and graphical, consumed while managing a business. Larger jobs that span multiple desktops can use Automate Plus and Automate Ultimate for added scale. Document scanning is performed with Automate Intelligent Capture. All integrate security and audit capabilities to help managers after development.

Major features: Integration with Microsoft desktop applications
Major use cases: Claims processing, service industries

IBM Automation

IBM offers a wide range of options for automating menial tasks, split into separate products, and bundled under the umbrella of IBM Automation. IBM Cloud Pak for Business Automation, for example, provides a low-code studio for testing and developing automation strategies. AI tools provide optical character recognition for documents. The Watson Assistant provides customer care with integrated bots. Teams can iterate over the workflows and explore hypothetical strategies with the Processing Mining tools. All the software can be deployed locally or in IBM’s cloud.

Major features: Deep experience with enterprise workflow; integration with many mainframes
Major use cases: Data capture, scientific process management; business decision automation, front-line customer care


Kofax

ImageTech Systems makes Kofax, a set of bots for document processing and workflow automation. Its Design Studio offers an IDE for turning code written in Java, Python, or another programming language into instructions for its bots. Some users will want to use its Automated Process Discovery code for tracking existing workflows and producing bots. Code can also be spun off into smaller tools called Kapow Kapplets that handle focused chores locally. All the behavior is tracked with standard analytics and reported through a dashboard so you can watch for robotic glitches.

Major features: Integration with enterprise content management tools; microapps platform to simplify deployment
Major use cases: Managing content collections; data pipeline integration


Laiye

Laiye is another platform emerging from the Chinese marketplace to target retail groups and others with extensive customer requirements. The Automation Creator is a drag-and-drop IDE for turning workflows into robots that can be deployed and tracked with tools like the Creativity Center.

Major features: AI-powered chatbots and cloud-native robots
Major use cases: The Work Execution System offers general support for document-centric business tasks

Microsoft Power Automate

The Power Automate tool from Microsoft is part of the company’s Power platform for creating apps, virtual agents, and BI reports. The Desktop tool focuses on automating common Windows 10 (and higher) operations while the Cloud tool handles server-side tasks. The user-friendly interface enables everyone to track their workflow and convert it into an automated, editable routine. Power Advisor tracks statistics about performance to locate bottlenecks and other issues. Microsoft is integrating some of its AI into Power. Users can build new Automation scripts with natural language by describing what should happen. (It’s said to be in preview.) AI Builder can also create and deploy models that make predictions and even decisions, taking more work off of users’ shoulders.

Major features: Focus on Windows 10 or 11 platform on the desktop or on Azure
Major use cases: Broad, enterprise-wide empowerment; AI integration

MuleSoft RPA (formerly Servicetrace)

The Mulesoft RPA tool from Salesforce, once known as Servicetrace, is now part of a larger platform for workplace automation and enterprise architecture. The RPA tools use AI and machine learning to help decode documents and automatically collect data. Automation can be scripted with the drag-and-drop RPA Builder, which brings wizard-driven solutions, or collected automatically with RPA Recorder, which watches users to capture repetitive tasks. When the results are deployed with the bot Manager, the system’s vertical scaling enhances parallel operations enabling multiple bots to run simultaneously. 

Major features: AI-based OCR and a good editor encourage development; recent merger will bolster integration with API-based workflows
Major use cases: Banking, utilities, and other industries with heavy compliance-driven work


NICE

The NICE robots are designed to run as supervised assistants for humans or, if they’re competent enough, as unsupervised back-office tools. The goal is “Journey Orchestration” so customers or staff are helped along at each step of the digital pipeline. One assistant, NEVA, is billed as a friendly assistant and “workforce multiplier” for customer service issues. The Scene Composer for the Real-Time Designer can track how clicks and keystrokes interact with web pages. Data from other sources can be gathered through Connectors to standard back-office sources such as SAP, Siebel, and .Net servers. Its CXexchange offers hundreds of extensions and agents that speed integration. CXone, its open cloud platform, helps support this growth around the globe.

Major features: Integration between desktop assistants and server-side backend
Major use cases: Call center automation; customer service tools; speeding workflow by creating robots that first learn by assisting humans before graduating to full autonomy in the back office


Nintex

The RPA tools from Kryon are now part of the Nintex data automation constellation, creating a complete platform for managing processes and business workflows. Process Discovery helps find the work that needs to be automated and turned into bots that can be deployed and tracked. For document-heavy processes that may require signatures, Nintex’s collection of RPA bots focuses on integration with Office365, Salesforce, and Adobe tools to automate the process of creating documents and signing them in a digitized legal pipeline. The results can run either in the cloud or on premises.

Major features: Tight integration with dominant desktop tools
Major use cases: Compliance pipelines dominated by documents

NTT-AT WinActor

NTT-AT’s WinActor was built to save Windows users’ time by automating the most common steps. It integrates with major Microsoft tools to build sophisticated workflows by recording user actions. These are turned into scenarios, and users can trigger these scenarios when a new event occurs such as the arrival of an email. A new request for information, for instance, can be turned into a qualified lead for the sales database with a few clicks. A wide variety of supplemental libraries can extend the tool to handle specific tasks such as creating PDF versions.
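The email-to-lead scenario described above reduces to a simple rule: match the trigger, extract the fields, and append a record. A language-agnostic sketch in Python of that pattern (the subject keyword and lead fields are hypothetical, not WinActor's actual data model):

```python
def email_to_lead(email):
    """Turn an information-request email into a sales-lead record.

    Mirrors the 'scenario triggered by an arriving email' pattern;
    the trigger phrase and field names are invented for illustration.
    """
    if "request for information" not in email["subject"].lower():
        return None  # not a trigger event, so the scenario does not fire
    return {
        "name": email["from_name"],
        "email": email["from_addr"],
        "source": "inbound-email",
        "status": "qualified",
    }

msg = {"subject": "Request for Information: pricing",
       "from_name": "A. Buyer", "from_addr": "buyer@example.com"}
print(email_to_lead(msg))
```

Tools like WinActor wrap this logic in recorded scenarios and connectors rather than code, but the trigger-extract-append structure is the same.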

Major features: Heavy integration with Microsoft tools
Major use cases: Email processing and database integration; spreadsheet automation


Pega

Pega from Pegasystems offers a wide variety of tools that speed up integration and processing for enterprises, including AI classifiers, chatbots, DevOps support tools, and pure RPA. Creating the right automation can begin with Pega’s AI-driven workforce intelligence tool, a bot that installs on desktops to track how people work. This survey will reveal bottlenecks where poor back-end processing can be automated now and in the future. Pega wants to deliver “self-healing” and “self-learning” applications that can use AI and other statistics to recognize new opportunities for better automation. Pega supports common use cases such as reconciling financial transactions and onboarding new customers. The company also offers low-code options for BPM.

Major features: Fully integrated suite of enterprise tools for developing, deploying, and automating data processing
Major use cases: Regulatory compliance and integration


Rocketbot

Juggling documents with Python-based bots on Linux, Mac, or Windows desktops is the main focus of Rocketbot. Text can be extracted using Rocketbot Telescope and then fed into backend systems using bots trained by Rocketbot Studio’s drag-and-drop editor. Rocketbot Orquestador manages the bots, running them as needed while compiling statistics.

Major features: Python-based bots
Major use cases: Document processing and data extraction

Samsung SDS Brity RPA

Samsung SDS’s Brity RPA is split into three parts. Designer offers drag-and-drop flowcharting for both desktop and enterprise back-end legacy services through a variety of connectors. Bot schedules and runs the various jobs at pre-set times or in response to events, rebooting virtual machines and simulating all events that might be generated by a real human. Bigger, more independent jobs can be split off to run in the Bot processor. Samsung is also integrating a wide variety of AI routines (ML, NLP, visual, and analytic ) and is expanding to deliver collaboration software for teams.

Major features: Aimed at improving industrial and enterprise business flow through automation
Major use cases: Time-saving and quality improvement for enterprise-driven tasks


SAP

SAP now offers a Robotic Process Automation option to simplify many of the workflow operations with its software. SAP’s tool can watch current teams to imitate their actions. When it’s done, you can tweak the process in a drag-and-drop low-code IDE. The results are deployed into the SAP environment to live as either attended or unattended bots. Teams that want to leverage the work of others can turn to the SAP RPA store to download bots for common tasks such as unpacking Excel spreadsheets looking for orders to recognize and categorize.

Major features: Integration with the SAP stack
Major use cases: Automating the business processes tracked and driven by SAP

SS&C Blue Prism

SS&C Blue Prism, one of the earliest RPA companies that began in 2012, pushes “intelligent automation” that mixes more AI into the process to simplify scaling and adaptive processes. The emphasis is on using AI and machine learning to “create journeys” for your data as it’s handed off along a chain of bots that often make fully automated decisions through sophisticated machine learning algorithms. You string together a sequence of actions at the beginning, but then each action generates statistics that can be used to train and improve the choices made. The company also maintains a digital exchange where third-party plugins and add-ons can be purchased to extend the powers by creating connections with traditional databases such as MySQL, larger providers such as AWS, and social media outlets such as Twitter.

Major features: Big investment in AI, including machine vision and sentiment analysis for classifying and responding to all messages
Major use cases: Building a full chain of document and message processing


UiPath

UiPath offers a full collection of tools for discovering workflows through Process Mining and Task Analysis and turning them into autonomous processes that can be edited and tweaked. These robots are controlled by Orchestrator, which triggers them in response to events while tracking behavior, generating reports, and controlling access where needed for compliance. UiPath is expanding into AI and is emphasizing machine vision tools that can extract information from images or screenshots. These are often focused on OCR to convert letters and numbers into machine-understandable forms.

Major features: Open environment allows integration of VB.Net, C#, Python, and Java code when challenges grow
Major use cases: Integration with full legacy stack solutions; transaction processing


WorkFusion

The digital workers from WorkFusion come with individual human names and special focuses. Tara, for instance, is a “top OFAC / AML expert who is laser-focused on keeping your transactions risk-free.” Casey is a customer relations specialist who is “obsessed with creating a better, faster customer experience.” Enterprises can use them as a starting point or create a custom version that can deploy OCR and some AI for their particular tasks. The digital workers are deployed with Workforce Enterprise to either run autonomously or work as assistants for humans who remain in the loop.

Major features: Digital workers tuned to common roles for RPA and workforce automation
Major use cases: Email and client interaction; task routing

Open source

The major companies generally sell proprietary tools, although community editions with limited functionality are common. Open-source options are less common, but you can often accomplish many of the simple tasks by stringing together some open-source projects. You will likely have to do much more of the training work yourself, often by typing code into an editor. Still, they remain an interesting option. Check out Puppeteer, Selenium, and Headless Firefox for a basic start.

Major features: Full open source access to code; no vendor lock-in
Major use cases: Web integration; data collection; testing and verification

It feels like just yesterday that we were promised that cloud servers cost just pennies. You could rent a rack with the spare change behind the sofa cushions and have money left for an ice cream sandwich.

Those days are long gone. When the monthly cloud bill arrives, CFOs are hitting the roof. Developer teams are learning that the pennies add up, sometimes faster than expected, and it’s time for some discipline.

Cloud cost managers are the solution. They track all the bills, allocating them to the various teams responsible for their accumulation. That way the group that added too many fancy features that need too much storage and server time will have to account for their profligacy. The good programmers who don’t use too much RAM and disk space can be rewarded.
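At its core this accounting job is simple showback: group every line item on the bill by the team that owns the resource. A minimal sketch of that grouping in Python (the tag names and costs are invented for illustration; real tools read tagged billing exports from the cloud providers):

```python
from collections import defaultdict

def allocate_costs(line_items):
    """Sum cloud-bill line items per owning team (simple showback)."""
    totals = defaultdict(float)
    for item in line_items:
        # Untagged resources get flagged rather than silently dropped,
        # since unowned spend is usually where surprises hide.
        team = item.get("team", "UNTAGGED")
        totals[team] += item["cost"]
    return dict(totals)

bill = [
    {"service": "compute", "team": "search", "cost": 1200.0},
    {"service": "storage", "team": "search", "cost": 300.0},
    {"service": "compute", "team": "ads", "cost": 800.0},
    {"service": "logging", "cost": 50.0},  # nobody tagged this one
]
print(allocate_costs(bill))
```

The hard part in practice is not the arithmetic but enforcing the tagging discipline that makes every resource attributable to a team.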

Smaller teams with simple configurations can probably get by with the stock services of the cloud companies. Cost containment is a big issue for many CIOs now and the cloud companies know it. They’ve started adding better accounting tools and alarms that are triggered before the bills reach the stratosphere. See Azure Cost Management, Google Cloud Cost Management, and AWS Cloud Financial Management tools for the big three clouds.

Once your cloud commitment gets bigger, independent cost management tools start to become attractive. They’re designed to work with multiple clouds and build reports that unify the data for easy consumption. Some even track the machines that run on premises so you can compare the cost of renting versus building out your own server room.

In many cases, cloud cost managers are part of a larger suite designed to not just watch the bottom line but also enforce other rules such as security. Some are not marketed directly as cloud control tools but have grown to help solve this problem. Some tools for surveying enterprise architectures or managing software governance now track costs at the same time. They can offer the same opportunities for savings that purpose-built cloud cost tools do — and they help with their other management chores as well.

What follows is an alphabetical list of the best cloud cost tracking tools. The area is rapidly expanding as enterprise managers recognize they need to get a grip on their cloud bills. All of them can help govern the burgeoning empire of server instances that may stretch around the world.


Anodot

The first job for Anodot’s collection of cloud monitoring tools is to track the flow of data through the various services and applications. If there’s an anomaly or hiccup that will affect users, it will raise a flag. Tracking the cost of instances and pods across your multiple clouds is part of this larger job. The dashboard produces a collection of infographics that make it possible to study each microservice or API and determine just how much it costs to keep it running in times of high demand and low. This granular detail gives you the ability to spot the expensive workloads and find a way to prune them.

Standout features:

- Integrated with a broader monitoring system to deliver better customer experience at a reasonable price
- Available as a white-label platform for integration and reselling


Cisco AppDynamics
Tracking and reining in containers in a Kubernetes environment is the goal for Cisco’s AppDynamics, formerly known as Replex. The tool is now part of a larger system that watches clusters in public clouds or running locally to ensure they are performing correctly. Tracking costs is just one small part of a system that is constantly gathering statistics and watching for anomalies. One important reporting process charges back costs to the teams responsible for them so everyone can understand what’s creating the monthly bill. AppDynamics also offers a proprietary machine learning engine to turn historical data into a plan for efficient deployment. A policy control layer offers granular restrictions to ensure teams have access to what they need but are locked out of what they don’t.

Standout features:

- Integrates cost management with general application monitoring
- Connects user experiences and business results across every layer of the software stack

Apptio Cloudability

Apptio makes a large collection of tools for managing IT shops, and Cloudability is its tool for handling cloud costs. The tool breaks down the various cloud instances in use, allocating them to your teams for accounting purposes. Ideally, teams will be able to control their own costs and predict future usage with the reports and dashboards on offer. Cloudability’s True Cost Explorer, for instance, offers pivotable charts to switch between aggregated variables to establish accurate plans and predict future usage. Cloudability integrates with ticketing tools such as Jira for planning and with tracking tools such as PagerDuty or Datadog for monitoring.

Standout features:

- Planning future purchases of reserved instances to lock in savings for constant baseline demand
- Allocating upcoming workloads to available instances with the right capabilities


CloudAdmin
Dashboards created by CloudAdmin are simple and direct. The tool tracks cloud usage and offers suggestions for rightsizing your servers or converting them to reserved instances. Server instances can be allocated to teams and then tracked with a budget. If spending crosses a defined line, alerts are integrated with email or other common communication tools such as PagerDuty to notify personnel of the need for attention.

Standout features:

- Carefully filtered data feeds extract the key details about spending, saving time otherwise lost wading through too much information
- Automated alerts can stop runaway spending when it crosses thresholds


CloudCheckr
CloudCheckr focuses on controlling cloud costs and security. The tool is part of NetApp’s Spot constellation for cloud management and is responsible for cost management by tracking standard spending events, such as consumption, forecasting, and the rightsizing of instances. The tool supports reselling for companies that add their own layers to commodity cloud instances. A white label option makes it possible to pass through all the reporting and charts to help your customers understand their billing. There’s also a focus on supporting public clouds used by governments.

Standout features:

- Monitor compliance with privacy regulations by tracking security configuration
- Rightsize reserved instances by tracking baseline consumption


Datadog
Watching over cloud machines, networks, serverless platforms, and other applications is the first job for Datadog’s collection of tools. Tracking cloud costs is just one part of the workload. Its telemetry gathers data about performance and cost, and Datadog builds this into a dashboard to help organizations understand both application cost and performance. The goal is to facilitate decisions about application performance with an eye on the price of delivering it. Understanding the tradeoff can lead to cost savings.

Standout features:

- Broad suite for infrastructure monitoring across multiple clouds
- Monitoring of both real and simulated users makes it easier to deliver a better user experience


Densify
Densify builds a collection of tools for managing cloud infrastructure by juggling containers and VMware instances. The best way to run your clusters, according to Densify, is to keep precise, meticulous records of load and then use this data to scale up and down quickly. Densify’s optimizers focus on cloud resources such as instances, Kubernetes clusters, and VMware machines. Densify suggests this approach improves scaling by 30%. Densify’s FinOps tool generates extensive reports to help keep application developers and bean counters happy.

Standout features:

- Track loads on machines to ensure rightsized instance allocation
- Build reports summarizing consumption to help developers rightsize hardware

Flexera One

The Flexera One cloud management suite tackles many cloud management tasks, such as tracking assets or organizing governance to orchestrate control. An important section of the suite is devoted to controlling the budget. The tool offers multicloud accounting for tracking spending with elaborate reporting broken down by team and project. Flexera One also offers suggestions for optimizing consumption by targeting wasteful allocations, and it provides automated systems to put these observations into practice. The tool also integrates machine learning and artificial intelligence to help analyze consumption patterns across multiple clouds.

Standout features:

- Integrates reporting across multiple clouds to help business groups understand costs
- Identifies options for rightsizing instances and eliminating wasteful spending


Harness
DevOps teams can use the CI/CD pipeline that’s the central part of Harness to automate deployment and then, once the code is running, track usage to keep budgets in line. Harness’s cost management features watch for anomalies compared to historic spending, generating alerts for teams. A feature for automatically stopping unused instances can work with spot machines, effectively unlocking their potential for cost savings while working around their ephemeral nature.

Standout features:

- Deep integration with the development pipeline to make cost savings part of the software creation process
- Automated compliance integrates cost management with regulatory and governance work


Kubecost
Teams that rely on Kubernetes to deploy pods of containers can install Kubecost to track spending. It will work across all major (and minor) clouds as well as pods hosted on premises. Costs are tracked as Kubernetes adjusts to handle loads and are presented in a unified set of reports. Large jumps or unexpected deployments can trigger alerts for human intervention.
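At its core, this kind of cost allocation multiplies each pod's resource requests by per-unit prices and sums the result by namespace. A simplified sketch, with made-up prices and field names rather than Kubecost's actual data model:

```python
# Assumed hourly prices; real tools pull these from cloud billing data.
CPU_PRICE = 0.031  # dollars per vCPU-hour (hypothetical)
MEM_PRICE = 0.004  # dollars per GiB-hour (hypothetical)

def allocate_costs(pods):
    """Sum each namespace's cost from its pods' resource requests."""
    costs = {}
    for pod in pods:
        hourly = pod["cpu"] * CPU_PRICE + pod["mem_gib"] * MEM_PRICE
        costs[pod["namespace"]] = costs.get(pod["namespace"], 0.0) + hourly * pod["hours"]
    return costs
```

Production tools refine this with actual usage versus requests, shared-cost splitting, and discounts, but the chargeback reports rest on the same arithmetic.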

Standout features:

- Optimized for tracking how Kubernetes deployments affect costs
- Dynamic recommendations track opportunities for lowering spending


ManageEngine CloudSpend
DevOps teams rely on ManageEngine to track a range of potential issues, from security to API endpoint overload. Its CloudSpend tool extracts data from cloud billing exports and aggregates it into a useful, actionable level of understanding. Costs can be charged back to specific teams, and ManageEngine’s predictive analytics will plan reserved instances based on historical data. It’s currently available for AWS and Azure.

Standout features:

- Spend Analysis drills down into granular spending detail
- Multi-currency support for worldwide deployments

Nutanix Cost Management

Organizations with large multicloud deployments can use Nutanix Cost Management (formerly Beam) to track costs across a range of installations, including private cloud machines hosted on premises. The tool can be customized to generate accurate cost estimates of private installations by taking into account heating and cooling costs, hardware, and data center rent. This makes it easier to make accurate decisions about allocating workloads to the lowest-cost deployment. The process can be automated to simplify management and forward-planning for budgeting for reserved instances.

Standout features:

- Metering of private clouds builds direct insight into the costs of on-prem hardware
- Budget alerting and dynamic optimization help rightsize consumption to minimize costs


ServiceNow
Teams running extensive collections of microservices rely on ServiceNow to manage parts of the stack. Many of its tools are customer-facing solutions such as IT automation, but there are also backend tools for optimizing IT operations by intelligently managing performance. Newer AIOps features layer artificial intelligence onto these tasks as well.

Standout features:

- Broad selection of tools for tracking and optimizing IT assets
- Risk management well integrated with governance tools


IBM Turbonomic
IBM relies on Turbonomic to deliver an AI-powered solution for managing deployment to match application demand with infrastructure. The tool will automatically start, stop, and move applications in response to demand. The data driving these decisions is stored in a warehouse to train the AI that will be making future decisions. The latest version includes a new dashboard and reporting framework based on Grafana.
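The demand-matching logic behind such tools can be reduced to a rightsizing calculation: pick the instance count that brings utilization back toward a target. A deliberately simplified, integer-math sketch (not Turbonomic's actual method; the 60% target is an arbitrary assumption):

```python
def scale_decision(current_instances, utilization_pct, target_pct=60):
    """Suggest an instance count that brings utilization toward a target.

    Works in integer "load units" (instances times utilization percent)
    and uses ceiling division so each instance lands at or below target_pct.
    """
    load = current_instances * utilization_pct
    desired = -(-load // target_pct)  # ceiling division
    return max(desired, 1)
```

A fleet of 10 instances at 90% utilization would be grown to 15, while the same fleet at 30% would be shrunk to 5; real systems add hysteresis and cost weighting so they don't thrash.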

Standout features:

- Full-stack integrated graphics to understand demand and cost across an application
- Designed to automate resource allocation to save engineering teams from the chore

VMware Aria CloudHealth

VMware built Aria Cost and Aria Automation under the CloudHealth brand to manage deployments across all major cloud platforms as well as hybrid clouds. The cost accounting module tracks spending, allocating it to business teams while optimizing deployments to minimize costs. The modeling layer can build out amortization and consumption schedules to forecast future demand. Financial managers and development teams can drill down into these forecasts to focus on specific applications or constellations of services. The larger product line integrates the cost management with automated deployment and security enforcement.

Standout features:

- Spending governance ensures that teams are following individual budgets for resource consumption
- Integrates cloud costs with business metrics and key performance indicators to connect computational costs to the bottom line


Yotascale
Much of the responsibility for cloud costs rests with the engineers who write and deploy the code. They make the granular decisions to start up more instances and store more data. Yotascale wants to put more information in their hands, enabling them to optimize their hardware consumption with tools designed to track machines and allocate their costs directly to the teams responsible. The forecasting tools can also spot anomalies, raising alerts to prevent any surprise bills at the end of the month.

Standout features:

- Engineer-targeted tools deliver budget information directly to the teams building the software and starting up the machines
- Automated tracking delivers forecasts and flags problems and overconsumption


Zesty
While many cloud managers offer insights through sophisticated reports, Zesty is designed to automate the work of spinning up and shutting down extra instances. A key feature watches the spot market for deeply discounted instances backed by the cloud’s excess capacity. A tool informed by artificial intelligence algorithms works with AWS’s API to keep just enough machines running to satisfy users without breaking the budget. The tool can even control the amount of disk space allocated to individual machines while buying and selling capacity on the spot and reserved-instance marketplaces.

Standout features:

- Deep management of details such as storage space allocation to minimize costs
- Integration with the spot market to take advantage of the lowest possible costs

As CIO of United Airlines, Jason Birnbaum is laser-focused on using technology and data to enable the company’s 86,000 employees to create as seamless a customer travel experience as possible. “Our goal is to improve the entire travel process from when you plan a trip to when you plan the next trip,” says Birnbaum, who joined the airline in 2015 and became CIO last July.

One opportunity for improvement was with customers who are frustrated about arriving at the gate after boarding time and unable to board because the doors are shut, while the plane is sitting on the ground. “The situation is not only frustrating to our customers, but also to our employees,” Birnbaum says. “We are in the business of getting people to where they want to go. If we can’t help them do that, it drives us crazy.”

So, Birnbaum and his team built ConnectionSaver, an analytics-driven engine that assesses arriving connections, calculates a customer’s distance from the gate, considers all other passenger itineraries, the plane’s destination, and whether the winds will allow the flight to make up time, and then makes a real-time determination about waiting for the connecting passenger. ConnectionSaver also tells the customer directly that the agents are holding the plane.
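United hasn't published ConnectionSaver's model, but the tradeoff it weighs can be illustrated with a toy decision function. Every input, threshold, and rule here is hypothetical, meant only to show the shape of the real-time determination described above:

```python
def should_hold(minutes_late, walk_minutes, recoverable_minutes,
                connecting_passengers, downstream_at_risk):
    """Decide whether to hold the door (illustrative only).

    Hold when the extra wait can be made up en route (favorable winds,
    schedule padding) and waiting helps more passengers than it puts
    at risk of missing their own connections downstream.
    """
    extra_wait = minutes_late + walk_minutes
    if extra_wait > recoverable_minutes:
        return False  # the flight cannot make up the lost time
    return connecting_passengers > downstream_at_risk
```

The real engine folds in live wind forecasts, gate distances, and every passenger's full itinerary, but the core question is the same: does waiting cost more than it saves?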

ConnectionSaver is a great example of how a “simple” solution resulted from a tremendous amount of cultural, organizational, and process transformation, so I asked Birnbaum to describe the transformation “chapters” behind this kind of innovation.

Chapter 1: IT trust and credibility

“For years, it was common for technology organizations to have too little credibility to drive transformation,” says Birnbaum. “That was our story, and we worked very diligently to change the narrative.”

Key to changing the narrative was giving senior IT leaders end-to-end business process ownership responsibilities. “We started moving toward a process ownership model several years ago, and since then, we’ve made significant improvements in technology reliability, user satisfaction, and our employees’ trust in the tools,” Birnbaum says. “This is important because every transformation chapter depends on use of the technology. If our employees don’t trust the tools, we will never get to transformation.”

A process could be gate management, buying a ticket, managing baggage, or boarding a plane, each of which runs on multiple systems. “Before we moved from systems to process ownership, people would see that their system is up, so they would assume the problem belonged to someone else,” says Birnbaum. “In that model, no one was looking out for the end user. Now, we have collaborative conversations about accountability for business outcomes, not system performance.”

Chapter 2: Improving the employee experience

Like every company, United Airlines has been working to improve the customer experience for years, but more recently has expanded its “design thinking” energies to tools for employees. To facilitate this expansion, Birnbaum grew the Digital Technology employee user experience team from three people to 60, all acutely focused on integrating the employee experience into the customer experience.

The employee user experience team spends time with gate agents, contact centers, and airplane technicians to identify technology to help employees help customers. “The goal of the employee user experience team is to provide tools that are intuitive enough for the employee to create a great customer experience, which in turn, creates a great employee experience,” says Birnbaum. “It is important for companies to invest in change management, but you need less change management if you give employees tools that they really want to use.”

For example, the user experience team learned that flight attendants felt ill equipped to improve the experience of a customer once the customer is on the plane. If a customer agreed to change seats or check a bag, for example, there was little a flight attendant could do to improve the experience in real-time. “All they had was a book of discount coupons, but the customer had to call a contact center with a code to get the discount,” says Birnbaum. “The reward required five more steps for the customer; it did not feel immediate.”

So, the team developed a tool called “In the Moment Care,” which uses an AI engine to make reward recommendations to the flight attendant who can offer compensation, miles, or discounts in any situation. The customer can see the reward on his or her phone right away, which immediately improves both the customer and employee experience. “We knew the customers would be happier with having their problem solved in real-time, but we were surprised by how much the flight attendants loved the tool,” says Birnbaum.  “They said, ‘I get to be the hero. I get to save the day.’”

The employee user experience team then turned its attention to the process of “turning the plane,” which includes every task that happens from the minute a plane lands to when it takes off again. It involves at least 35 employees in a 30-minute window.  

Take baggage, for example. Traditionally, during the boarding process, if the overhead bins were starting to fill up at the back of the plane, that flight attendant had no way to communicate to the flight attendant in the front of the plane that it is time to start checking bags. Their only option was to call the captain to call the network center to call the gate to get them to start checking bags.

To create a better communication channel, the employee user experience team worked with the developers to create a new tool, Easy Chat, that puts every employee responsible for a turn activity into one chat room for the length of the turn. “Whether the bins are filling up, or they need more orange juice, or they are waiting for two more customers to come down the ramp, the team can communicate directly to digitally coordinate the turn,” says Birnbaum. “Once the flight is gone, each employee will be connected to another group in another time and place.”

Again, Birnbaum sees that the value of Easy Chat is well beyond the customer experience. “I was just talking to a few flight attendants the other day, who told me that Easy Chat makes them feel like they are a part of a team, rather than a bunch of people with individual roles,” says Birnbaum. “United has a lot of employees, and they don’t work with the same people every day. The new tool allows them to work as a team and to feel connected to each other.”

Chapter 3: Data at scale

To improve the analytics capabilities of the company, Birnbaum and his team built a hub and spoke model with a central advanced analytics team in IT that collaborates with each operational area to develop the right data models. 

“The operating teams live and breathe the analytics — they are the people scheduling the planes — so they are key to unlocking the value of the analytics,” says Birnbaum. “Digital Technology’s job is to collect, structure, and secure the data, and help our operational groups exploit it. We want the data scientists in the operating areas to take the lead on how to make the data valuable at scale.”

For example, United has always worked to understand the cause of a flight delay. Was it a mechanical problem? Did the crew show up late? “The teams would spend hours figuring out whose fault it was, which was a huge distraction from running the operation,” says Birnbaum. To solve this problem, the analytics team, in partnership with the operations team, created a “Root Cause Analyzer” that collects operational data about the flight.

“Now, instead of spending time debating why the flight was delayed, we can quickly see exactly what happened and spend all of our time on process improvement,” says Birnbaum.

With the foundational, employee experience, and data chapters now under way, Birnbaum is thinking about the next chapter: Using technology and analytics to integrate and personalize a customer’s entire travel experience.

“If you have a rough time getting to the airport, but the flight attendant greets you by your name and knows what you ordered, you will still have a good trip,” says Birnbaum.  “It is our job to use technology to help our employees deliver that great customer experience.”


In a bid to help retailers transform their in-store, inventory-checking processes and enhance their e-commerce sites, Google on Friday said that it is enhancing Google Cloud for Retailers with a new shelf-checking, AI-based capability, and updating its Discovery AI and Recommendation AI services.

Shelf-checking technology for inventory at physical retail stores has been a sought-after capability, since low or no inventory is a troubling issue for retailers. Empty shelves cost US retailers $82 billion in missed sales in 2021 alone, according to an analysis from NielsenIQ.

The new AI-based tool for shelf-checking, according to the company, can be used to improve on-shelf product availability, provide better visibility into current conditions at the shelves, and identify where restocks are needed.

The tool, which is built on Google’s Vertex AI Vision and powered by two machine learning models — product recognizer and tag recognizer — can be used to identify different product types based on visual imaging and text features, the company said, adding that retailers don’t have to spend time and effort training their own AI models.

Further, the shelf-checking tool can identify products from images taken from a variety of angles and across devices such as a ceiling-mounted camera, a mobile phone or a store robot, Google said in a statement. Images from these devices are fed into Google Cloud for Retailers.

The capability, which is currently in preview and is expected to be generally available to retailers globally in the coming months, will not share any retailer’s imagery and data with Google and can only be used to identify products and tags, the company added.

Improving retail website experience

To help retailers make their online browsing and product discovery experience better, Google Cloud is also introducing a new AI-powered browse feature in its Discovery AI service for retailers.

The capability uses machine learning to select the optimal ordering of products to display on a retailer’s e-commerce site once shoppers choose a category, the company said, adding that the algorithm learns the ideal product ordering for each page over time based on historical data.

As it learns, the algorithm can optimize how and what products are shown for accuracy, relevance, and the likelihood of making a sale, Google said, adding that the capability can be used on different pages within a website.
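At its simplest, learning a product ordering from historical data means ranking by an observed signal such as conversion rate. The sketch below is a crude stand-in for what the article describes; Google's actual models are far more sophisticated, and all names here are invented:

```python
def reorder(products, views, sales):
    """Rank a category page by observed conversion rate (sales per view),
    a crude stand-in for a learned product ordering."""
    def conversion(p):
        v = views.get(p, 0)
        return sales.get(p, 0) / v if v else 0.0
    return sorted(products, key=conversion, reverse=True)
```

A product with 5 sales from 50 views would outrank one with 5 sales from 100 views; a production system would also correct for position bias, seasonality, and sparse data.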

“This browse technology takes a whole new approach, self-curating, learning from experience, and requiring no manual intervention. In addition to driving significant improvements in revenue per visit, it can also save retailers the time and expense of manually curating multiple ecommerce pages,” the company said in a statement.

The new capability, which has been made generally available, currently supports 72 languages.

Personalized recommendations for customers

In order to help retailers deliver hyperpersonalized experiences to their online customers, Google Cloud has released a new AI-based capability for its Recommendation AI service for retailers.

The new capability, which is expected to advance Google Cloud’s existing Retail Search service, is underpinned by a product-pattern recognizer machine learning model that can study a customer’s behavior on a retail website, such as clicks and purchases, to understand the person’s preferences.

The AI then moves products that match those preferences up in search and browse rankings for a personalized result, the company said.
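The reranking idea can be illustrated by weighting each candidate product by how often the shopper has clicked on or bought from its category. This is a toy sketch under that assumption, not Google's algorithm; the 3-to-1 purchase weighting is arbitrary:

```python
from collections import Counter

def personalize(results, events):
    """Boost products whose category matches the shopper's on-site history;
    purchases are weighted more heavily than clicks. Python's sort is
    stable, so products with equal scores keep their original order."""
    prefs = Counter()
    for e in events:
        prefs[e["category"]] += 3 if e["type"] == "purchase" else 1
    return sorted(results, key=lambda p: prefs[p["category"]], reverse=True)
```

Note that, matching the article, the only inputs are the shopper's interactions on that one site; no cross-site or account data is involved.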

“A shopper’s personalized search and browse results are based solely on their interactions on that specific retailer’s ecommerce site, and are not linked to their Google account activity,” Google said, adding that the shopper is identified either through an account they have created with the retailer’s site, or by a first-party cookie on the website.

The capability has been made generally available.


Low-code/no-code visual programming tools promise to radically simplify and speed up application development by allowing business users to create new applications using drag and drop interfaces, reducing the workload on hard-to-find professional developers. A September 2021 Gartner report predicted that by 2025, 70% of new applications developed by enterprises will use low-code or no-code technologies, up from less than 25% in 2020.

Many customers find the sweet spot in combining them with similar low code/no code tools for data integration and management to quickly automate standard tasks, and experiment with new services. Customers also report they help business users quickly test new services, tweak user interfaces and deliver new functionality. However, low code/no code is not a silver bullet for all application types and can require costly rewriting if a customer underestimates the degree to which applications must scale or be customized once they reach production. So there’s a lot in the plus column, but there are reasons to be cautious, too.

Here are some examples of how IT pros are using low code/no code tools to deliver benefits beyond just reducing the workload on professional developers.

Experimenting with user interfaces, delivering new services

Sendinblue, a provider of cloud-based marketing communication software, uses low code workflow automation, data integration and management tools to quickly experiment with features such as new pricing plans, says CTO Yvan Saule. Without low code, which allows him to test new features at 10 to 15% of the cost of traditional development, “we couldn’t afford all the experiments we’re doing,” he says. “If we had to write 15 different pricing systems, it could’ve taken years,” requiring backend fulfillment systems and credit checks for each specific price.

Financial technology and services company Fidelity National Information Services (FIS) uses the low code WaveMaker to develop the user interfaces for the customer-facing applications it builds for its bank customers, using APIs to connect those applications to the customer’s or FIS’ back-end systems. “It’s for speed to market,” says CTO Vikram Ramani. This approach is especially valuable given the shortage of skilled developers. While FIS is still evaluating the results, Ramani says they expect at least a 20 to 30% speed improvement.


Among low-code tools, FIS chose WaveMaker because its components seemed more scalable than competitors’ and its per-developer licensing model was less expensive than the per-runtime model of other tools.

At Joist, a startup developing financial and sales management software for contractors, CEO Rohan Jawali is using the no code AppMachine platform to quickly build application prototypes, get customer feedback, and then build the actual application in order to skip a few iterations in the design process. At a previous employer, he could spin out a simple information and contact sharing mobile app for construction workers in a couple of days, compared with several weeks using conventional languages. Tapping the content management system within AppMachine made it easy for users to upload the required data into it, he says.

Process automation and data gathering

At bottled water producer Blue Triton Brands, Derek Lichtenwalner used Microsoft’s low code Power Apps to build an information sharing and communications application for production workers. Before becoming an IS/IT business analyst in early 2022, Lichtenwalner had no formal computer training, but he was able to build the app in about a month. It’s now in use at six facilities with about 1,200 users, with plans to expand it to 3,000 users across the company’s 27 facilities.


Using non-IT users such as Lichtenwalner to develop apps that share information and automate processes is a good option for industries with small staffs of skilled developers, such as construction, where “there are many processes that need to be digitized and low code and no code can make that easier,” says Jawali.

Some vendors and customers are using low code/no code concepts to ease not only app development, but data sharing among apps. Sendinblue, for example, abstracts its application programming interfaces (APIs) into reusable widgets. By mapping its APIs with those used by its customers’ systems, says Saule, his developers “can drag-and-drop the integration functions, and build new capabilities atop that integration.”

Understand your needs

Low code/no code “can be an IT professional’s best friend for traditional, day-to-day challenges such as workflow approvals and data gathering,” says Carol Dann, VP information technology at the National Hockey League (NHL). But she warns against trying to use such a tool for a new application just because it’s enjoyable to use. And choosing the wrong solution can backfire quickly, she adds, with any quick wins erased by the need to code or work around the shortcomings.

“No code is a good fit when you have a simple application architecture and you want to quickly deploy and test an application,” adds Jawali. It’s best, he says, in “innovative experiments where you want a lot of control over the user experience, user interface—something you can’t get with a low code platform. Low code is more useful when you need to introduce more security and links to other applications, but at the cost of greater complexity and the need for more technical developers.”

The drag and drop simplicity of no code makes it difficult to achieve that final percent of differentiation that makes an application useful for a specific organization, and makes low code a better fit, says Mark Van de Wiel, field CTO at data integration software provider Fivetran. That extra customization might include, for example, the use of fuzzy logic to correct misspelled names in a customer database or the business logic needed to calculate a one to 10 score of the effectiveness of various marketing assets based on metrics such as views versus click-throughs.
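The marketing-score calculation Van de Wiel mentions is exactly the kind of small business-logic function that pushes a project from no code into low code. One possible version, assuming (arbitrarily, for illustration) that a 5% click-through rate or better earns the top score:

```python
def asset_score(views, clicks):
    """Map click-through rate onto a 1-10 effectiveness score.

    The 5% CTR ceiling is an assumed calibration point, not a standard.
    """
    if views == 0:
        return 1
    ctr = clicks / views
    return max(1, min(10, round(ctr / 0.05 * 10)))
```

An asset with 50 clicks from 1,000 views scores a 10, while 10 clicks from the same views scores a 2; it's ten lines of code, but no drag-and-drop component will produce it for you.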

Low code/no code apps can also be difficult to scale, says Saule. He recommends limiting their use to non-core strategic parts of the business, and when you want to experiment. “When an experiment succeeds and needs to scale is when it’s time to think about rewriting it,” he says, but in a more traditional yet scalable language.

Just because no code/low code lets more users create their own applications doesn’t mean they should, says Van de Wiel. At Fivetran, an analytics group prepares the marketing effectiveness dashboards used by the rest of the organization. “This allows our employees to focus on what they’re good at,” he says, rather than wasting time creating duplicate dashboards, or wasting money tying up IT infrastructure downloading the same data.

Organizations allowing extensive DIY app development must also ensure this larger pool of non-professional developers follows corporate and regulatory requirements to protect customer data, he says.

Know when to use low code vs no code tools

AppMachine was a good fit for the information-sharing apps Jawali developed because they must only serve hundreds of users rather than tens of thousands or more. It’s less suited to applications that require high levels of security, he says, because it can’t create the user profiles needed for role-based access. It also lacked support for the APIs required to connect to other applications such as product or issue management systems, or delivery tracking applications.

“Trouble can seep in when folks try to make tools do things they weren’t really meant to do rather than concluding it’s not the right fit,” says Dann. Developing a close relationship with the vendor’s customer success managers and understanding everything the tools have to offer—what the tool does well and what it was never intended to do—are critical to make well educated decisions. “Saying, ‘No, that’s really not what our platform does’ is a perfectly acceptable answer from a vendor,” she says.

Among other assessment questions, Dann recommends asking if a no code/low code vendor is willing to take part in an information security review, whether their solution has a robust API to integrate with other applications and whether it has an authentication and authorization strategy that fits with the customer’s security processes.

Don’t overlook “off the shelf” low code/no code solutions

If you think of low code/no code as a strategy to simplify development rather than a product category, you can find opportunities to speed app development with software you already own, or with off-the-shelf capabilities in familiar products such as cloud storage. This is especially true for routine, well-defined functions such as managing documents and workflows.

The NHL configured the integration capabilities of its workflow management system as a low code/no code solution to replace a legacy system for channeling requests to its creative staff. It deployed this replacement system in about two weeks, compared to at least six months using traditional development methods, says Dann.

The league met another need—alerting its scouts to new information about promising players—by configuring the Box Relay no code workflow automation tool in the Box cloud-based data storage service to automatically assign tasks and move documents to the proper folders once they were processed. “This took the whole process out of email and streamlined workflow,” she says, “by using a ‘what you see is what you get’ interface rather than writing code.”       

Used properly, low code/no code tools can speed new apps to customers and employees while slashing costs and delays, and can head off security or policy breaches or the need to rewrite an application that can’t scale once it succeeds. But as with any tool, understanding what they can and can’t do—and what you need them to do—is essential.


In the beginning, no one needed enterprise architecture tools. The back of an envelope would do in the early years. Thomas Watson Sr., the longtime leader of International Business Machines, supposedly said in the 1940s, “I think there is a world market for about five computers.”

The modern enterprise, however, is much different. Some employees have more than five computers on their desk alone. Even a small organization may have thousands of machines; some can easily have more than a million. That’s before the sensors and smart gadgets that make up the internet of things are taken into account. Enterprise architecture (EA) systems track all these machines and the software that runs on them — not to mention how these software layers interact. Because of this, EA tools are the single source of truth for managing these burgeoning virtual worlds.

The state of the EA tool market

The EA marketplace is robust, with several dozen serious competitors. Some specialize in specific platforms or clouds. Others offer deeper integration with business intelligence (BI) and business process management (BPM) software. Some began life as generic modeling software; others were purpose-built for enterprise architecture. All compile long lists of machines and offer various tabular and graphical dashboards for tracking them.

EA systems gather device and software information in a variety of ways. The most manual process involves asking stakeholders and developers to fill out forms detailing who owns what machines. The most automated tools log into a company’s clouds directly, counting the machines themselves. Most use a hybrid approach. Some offer drag-and-drop widgets so that developers, architects, and managers can create a model of all the machines, the software those machines run, and how the data flows from one machine to another.
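
The core entities described above—machines and systems, the software they run, and the data flows between them—can be sketched as a minimal data model. This is an illustrative toy, not any vendor’s actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class System:
    """A machine or software system, as reported by its stakeholder."""
    name: str
    owner: str                                   # who filled out the form
    software: list[str] = field(default_factory=list)

@dataclass
class DataFlow:
    """A directed flow of data from one system to another."""
    source: str
    target: str
    description: str

class EAModel:
    """A toy enterprise architecture model: systems plus data flows."""
    def __init__(self) -> None:
        self.systems: dict[str, System] = {}
        self.flows: list[DataFlow] = []

    def add_system(self, system: System) -> None:
        self.systems[system.name] = system

    def connect(self, source: str, target: str, description: str) -> None:
        self.flows.append(DataFlow(source, target, description))

    def downstream(self, name: str) -> list[str]:
        """Systems that receive data from `name` -- the starting point
        for tracing how one failure cascades into others."""
        return [f.target for f in self.flows if f.source == name]
```

Real EA tools populate a model like this from forms, cloud APIs, or both, and render it as dashboards; the `downstream` query is the kind of lookup a rapid-response team runs when a system goes down.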

Everyone from the CIO to the rapid-response team can use the charts and graphs from an EA dashboard to look up processes and track the flow of data. Some watch for failing machines or overloaded pipelines and repair problems by tracing a cascade of failures. Others plan for the future by finding bottlenecks or shortcuts. All rely on the data in the system as a springboard for making quick decisions.

Many of the tools use ArchiMate, an open modeling standard designed to capture much of the complexity of enterprise architecture. It’s built to work closely with the TOGAF open framework. The views and visualizations are created in a manner similar to that of UML (Unified Modeling Language), another generalized approach for visualizing design.

An important consideration is the level of integration with the type of software in your local stack. All of the EA tools support big collections of modules that can gather data from particular clouds or operating systems but some support various clouds and operating systems better than others.

Another consideration is their ability to connect with computing and service clouds. Some EA tools specialize in cloud instances and compute pods. All cloud companies maintain their own tools for tracking your systems and some EA tools can absorb this data directly. Single-cloud teams often rely more on the cloud’s own management software to track deployments.

Choosing the best solution for your organization requires evaluating the tools’ ability to integrate with your technology stack and then weighing the usefulness of the charts and tables that the software produces. Following is an overview, in alphabetical order, of the top enterprise architecture platforms available today.

Top 20 enterprise architecture tools

Ardoq
Atoll Group SAMU
Avolution Abacus
BOC Group ADOIT
BiZZdesign HoriZZon
Capsifi
Capstera
Clausmark Bee360
EAS
LeanIX Enterprise Architecture Suite
Mega Hopex
Orbus Software iServer
Planview Enterprise One
QualiWare Enterprise Architecture
Quest Erwin Evolve
ServiceNow
Software AG Alfabet
Sparx
Unicom System Architect
ValueBlue Blue Dolphin


Ardoq

Ardoq creates a “digital twin” of your organization by collecting information from a variety of users, developers, and stakeholders throughout your enterprise with a collection of simplified forms. The goal is to engage people who understand the roles of various systems. This data creates a more “democratic” opportunity for everyone to use the visualizations of the network and data flows to support and modernize the systems supporting their roles. The tool offers integrations with the major clouds and an API that’s open to customization through all major languages (Python, C#, Java, etc.).

Major use cases:

Simulating architectural stress when demand spikes in order to plan for major events
Understanding how user behavioral shifts lead to demand changes
Application portfolio management for better strategic planning

Atoll Group SAMU

The Atoll Group created SAMU EA Tool to track enterprise architecture by examining deep connections throughout on-prem hardware, the cloud layer, and BPM tools. It offers integration with monitoring tools (Tivoli, ServiceNow, etc.), cloud virtualization managers (VMware, AWS, etc.), configuration management databases (CA, BMC), and service organization tools (BMC, HP). These integrations feed a centralized data model that is augmented with input from stakeholders.

Major use cases:

Creating visualizations of architecture
Informing the architectural review and strategic planning process
Improving communications by creating a visual foundation for understanding

Avolution Abacus

Avolution created Abacus to deliver a diagram-driven dashboard that captures the range and extent of your enterprise architecture. The core integrations with office tools such as SharePoint, Excel, Visio, Google Sheets, Technopedia, and ServiceNow simplify usage for workflows that use them. The tool has begun adding a machine-learning layer, and users can now experiment with training a model that can help answer questions such as which staff member is responsible for a particular system.

Major use cases:

Opening up IT to the larger workplace to empower the entire organization to understand data flows
Using extensive enterprise modeling to build a roadmap for future development
Tracking business metrics that integrate with enterprise performance


BOC Group ADOIT

Helping teams manage resources, predict demand, and track all assets is the goal for BOC Group’s ADOIT, a wide-open tool that maps each system or software package to an object. Data flows between the systems are turned into relationships captured by the objects using a metamodel that can be customized. Business processes are modeled similarly by a well-integrated companion product, ADONIS. The web-based tool also integrates with tools such as Atlassian’s Confluence for faster data capture and evolution.

Major use cases:

Creating an enterprise-wide model so all team members can understand and improve the stack
Providing full access to EA data while away from a desktop with the ADOIT mobile app
Orchestrating tech mergers and acquisitions through thorough mapping of assets

BiZZdesign HoriZZon

The philosophy from BiZZdesign is to use its tool HoriZZon to model business workflows and the tech stack that supports them. HoriZZon offers a graph-based model for collecting data from all stakeholders so its analytics engine can generate charts illustrating the current state of the system. Managing change and planning for the future is a big emphasis for BiZZdesign and HoriZZon is designed to help manage the risk of redesign. The tool set supports major standards such as ArchiMate, TOGAF, and BPMN.

Main use cases:

Anticipating future demands through predictive modeling
Working with both business and tech architecture to orchestrate workflows
Anticipating issues with risk, security, and governance by modeling data security needs


Capsifi

Jalapeno from Capsifi creates business models in its cloud-based platform. The goal is not just capturing the workflow in a model but enabling leadership to understand enough to drive a transformation through innovation. The software allows users to knit together modeling concepts such as “customer journeys” or “value streams” and to integrate these with data gathered from tools such as Jira. This data can yield metrics reported through a collection of charts and gauges designed to measure progress or “burndown.”

Main use cases:

Planning strategically for the future of the enterprise stacks
Creating a nexus of communication to coordinate all enterprise stakeholders
Managing continuous transformation through Kanban-style tools for agile teams


Capstera

The Business Architecture tool from Capstera focuses on creating a map of the business architecture itself. The value and process maps help define and track the roles of the various sections of the business. The connections to the underlying software and tools can be added along the way.

Main use cases:

Producing reports that explore the business architecture first
Thinking about the connections between people, divisions, and work requirements
Developing strategic summaries for long-term planning

Clausmark Bee360

The team members who turn to Clausmark’s flagship product Bee360 (formerly known as Bee4IT) are coming for a system designed to offer a simple source of truth about a business’s workflow so that many roles can make smarter decisions. The system also offers the ability to track and allocate costs with Bee360 FM (financial management).

Main use cases:

Empowering C-suite level management of projects and asset allocation
Evolving an accurate digital twin for both understanding current data flows and planning future enhancements
Building an integrated knowledge base to track all digital workflows


EAS

The Essential package from EAS, or Enterprise Architecture Solutions, began as an open-source project and evolved into a commercially available cloud solution. It creates a metamodel describing the interactions between systems and business processes. The commercial version includes packages for tracking some common business workflows such as data management or GDPR compliance.

Main use cases:

Evaluating the technical maturity of your architecture
Driving security and governance through better tracking of all assets
Controlling and managing complexity as it evolves in your system

LeanIX Enterprise Architecture Suite

The LeanIX collection of tools includes Enterprise Architecture Management and several other tools that perform tasks such as SaaS Management and Value Stream Management to track cloud deployments and the services that run on them. Together, they collect data on your IT infrastructure and present it in a graphical dashboard. The tool is tightly integrated with several major cloud workflow tools, including Confluence, Jira, Signavio, and Lucidchart, an advantage for teams that are already using these to plan and execute development strategies.

Main use cases:

Managing application modernization and cloud migration
Evaluating obsolescence for software services
Controlling and managing cost

Mega Hopex

Mega built the Hopex platform to support modeling enterprise applications while understanding the business workflows they support. Data governance and risk management are a big part of the equation. The tool is built on Azure and relies on a collection of open standards, including GraphQL and REST queries, to gather information from component systems. Reporting is integrated with Microsoft’s Office tools as well as graphical solutions such as Tableau and Qlik.

Main use cases:

Deriving data-driven insights to guide cloud and application deployment
Creating accurate models of usage to understand architectural demands
Capturing an estimate of demand with surveys and other tools to plan for future needs

Orbus Software iServer

Orbus originally built its iServer tools on the Microsoft stack, and its product will be familiar and usable to any team that’s tightly aligned with Microsoft’s tools. Reports emerge in Microsoft Word. The data is formatted for Excel. Everything runs well on Azure. The tools aren’t limited to Microsoft environments, though, because its collection of modules supports the dominant open standards for integration to gather data. Orbus is also expanding connections to other reporting platforms such as ServiceNow and Lucidchart.

Main use cases:

Controlling security and compliance risks through better visibility and deeper vision of the underlying architecture
Destroying information silos in organizations by opening up access and spreading understanding
Managing technical debt and cloud migration

Planview Enterprise One

Planview offers a constellation of products for tracking teamwork, processes, and enterprise architecture. Its enterprise tools are broken into three tiers for Strategic Portfolio Management, Product Portfolio Management, and Project Portfolio Management. Together they create databases of machines and software layers that deliver role-based views for managers and team members. The tool is integrated with common ticket-tracking systems such as Jira for creating workflow analytics and reporting. Planview has integrated tools formerly known as Daptiv, Barometer, and Projectplace that were acquired during a merger.

Main use cases:

Building a long-term, strategic vision for architectural evolution
Tracking development work at a project level and integrating this into any strategy
Focusing on customer experience and product structure to drive change

QualiWare Enterprise Architecture

The Enterprise Architecture tool from QualiWare is part of a broad collection of modeling tools aimed at capturing all business processes. It offers a clean slate for building a digital twin that can document just how a customer’s journey progresses. The company is integrating various artificial intelligence algorithms to enhance both documentation and process discovery.

Main use cases:

Establishing a collaborative ecosystem for business managers to understand the enterprise architecture
Capturing architectural design elements to build a knowledge ecosystem around the stack
Encouraging broad participation in documentation creation and review

Quest Erwin Evolve

Quest’s Erwin Evolve tool began life as a data modeling system and grew to offer enterprise architecture and business process modeling. Teams can use customized data structures to capture the complexity of modern, interlocking software systems and the business workflows they manage. The web-based tool creates models that generate role-based graphs and other visualizations that form dashboards for all team members. Quest also offers an AI-based modeling tool that can integrate whiteboard sketches.

Main use cases:

Building a digital twin for strategic modeling of the enterprise data architecture
Understanding customer journeys through outward-facing systems
Tracking services and systems using application portfolio management


ServiceNow

The collection of tools from ServiceNow is broken down to focus on particular types of architecture, including Assets, DevOps, Security, and Service. They catalog the various machines and software platforms to map and understand the various workflows and dataflows in the enterprise. Careful analysis of the reports and dashboards can minimize risk and build resilience into the system.

Main use cases:

Tracking possible assets, services, and systems defining the enterprise
Uniting governance issues, risk containment, and IT management and security operations in a single platform
Managing customer-facing services by integrating CRM tools

Software AG Alfabet

Alfabet is one of a large collection of products for managing APIs, cloud computing, and applications supporting devices from the internet of things. The system gathers information from a variety of interfaces and produces hundreds if not thousands of potential reports filled with lifecycle maps, charts, rankings, and geographic coordinates. While traditionally Software AG offers tools such as ADABAS that are closely aligned with IBM’s offerings, Alfabet offers tight integration with all major platforms, including collaboration spaces such as Microsoft Teams. Its latest version will include an audible option, Alfabot, that delivers a “conversational user interface.”

Major use cases:

Driving change through tracking projects and running code
Enforcing compliance and software standards
Using reports, maps, and dashboards to implement business-driven change


Sparx

Sparx created four levels of its tool so that teams of various sizes can tackle projects of various sizes and complexity. All offer UML-based modeling that tracks the parts of increasingly complex systems. A simulation engine enables war gaming and understanding how failure can propagate and cascade, an essential part of disaster planning. Sparx recognizes that models can be built for a variety of reasons, from pure analysis to software development to strategic planning, and it provides hundreds of pre-built design patterns to guide modeling.

Major use cases:

Simulating changes in demand and load to understand and project future needs
Tracing problems and potential issues through a matrix of connections
Generating documentation

Unicom System Architect

One of the offerings from Unicom’s TeamBlue portfolio is System Architect, a tool that uses a metamodel to gather as much data as possible about running systems automatically, sometimes by reverse engineering the data flows. This system-wide data model can be presented in user-customizable dashboards for team members of all roles. Forward-looking managers can also run simulations to help optimize resource allocation.

Major use cases:

Asking “what if” questions about the architectural model
Building a meta-model of data and systems
Creating migration and transformation plans

ValueBlue Blue Dolphin

ValueBlue’s BlueDolphin gathers data in three ways. First, it depends on standards-driven automation (ITSM, SAM) to import basic data. Second, it works with architects and systems designers in file formats such as ArchiMate or BPMN. Finally, it surveys other stakeholders with questionnaires driven by customizable templates. All of this is delivered in a visual environment that tracks the historical evolution of systems.

Major use cases:

Gathering system-wide data from internal and external stakeholders through automated and form-based collection
Generating forward-looking reports to monitor and drive change
Nurturing cooperation and collaboration through open data reporting

More on enterprise architecture:

What is enterprise architecture? A framework for transformation
What is an enterprise architect? A vital role for IT operations
7 traits of successful enterprise architects
The dark secrets of enterprise architecture
6 reasons to invest in enterprise architecture tools
12 certifications for enterprise architects
Why you need an enterprise architect
Why enterprise architecture maximizes organizational value
Enterprise architects as digital transformers

What is project management?

Project management is a business discipline that involves applying specific processes, knowledge, skills, techniques, and tools to successfully deliver outcomes that meet project goals. Project management professionals drive, guide, and execute company-identified value-added goals by applying processes and methodologies to plan, initiate, execute, monitor, and close all activities related to a given business project in alignment with the organization’s overall strategic objectives.

Project management steps

Project management is broken down into five phases, or life-cycle stages. Each phase intersects with any of 10 knowledge areas: integration, scope, time, cost, quality, human resources, communications, risk, procurement, and stakeholder management. The phases, processes, and associated knowledge areas provide an organized approach for project managers and their teams to work through projects, according to the following outline:

Initiating phase:

Integration management: Develop project charter.
Stakeholder management: Identify stakeholders.

Planning phase:

Integration management: Develop project management plan.
Scope management: Define scope, create work breakdown structure (WBS), gather requirements.
Time management: Plan and develop schedules and activities, estimate resources and timelines.
Cost management: Estimate costs, determine budgets.
Quality management: Identify quality requirements.
Human resource management: Plan and identify human resource needs.
Communications management: Plan stakeholder communications.
Risk management: Perform qualitative and quantitative risk analysis, plan risk mitigation strategies.
Procurement management: Identify and plan required procurements.
Stakeholder management: Plan for stakeholder expectations.

Execution phase:

Integration management: Direct and manage all project work.
Quality management: Perform all aspects of managing quality.
Human resource management: Select, develop, and manage the project team.
Communications management: Manage all aspects of communications.
Procurement management: Secure necessary procurements.
Stakeholder management: Manage all stakeholder expectations.

Monitoring and controlling phase:

Integration management: Monitor and control project work and manage any necessary changes.
Scope management: Validate and control the scope of the project.
Time management: Control the project schedule.
Cost management: Control project costs.
Quality management: Monitor quality of deliverables.
Communications management: Monitor all team and stakeholder communications.
Procurement management: Keep on top of any necessary procurements.
Stakeholder management: Take ownership of stakeholder engagements.

Closing phase:

Integration management: Close all phases of the project.
Procurement management: Close out all project procurements.
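
The five-phase outline above can be captured as a simple lookup table, for instance to generate per-phase checklists. A sketch in Python, mirroring the outline (the table itself is just a restatement of the phases listed above):

```python
# Which knowledge areas are active in each of the five phases,
# per the outline above.
PHASES = {
    "initiating": ["integration", "stakeholder"],
    "planning": ["integration", "scope", "time", "cost", "quality",
                 "human resources", "communications", "risk",
                 "procurement", "stakeholder"],
    "executing": ["integration", "quality", "human resources",
                  "communications", "procurement", "stakeholder"],
    "monitoring and controlling": ["integration", "scope", "time", "cost",
                                   "quality", "communications",
                                   "procurement", "stakeholder"],
    "closing": ["integration", "procurement"],
}

def phases_for(knowledge_area: str) -> list[str]:
    """List the phases in which a given knowledge area is active."""
    return [phase for phase, areas in PHASES.items()
            if knowledge_area in areas]
```

For example, `phases_for("risk")` shows that risk management work is concentrated in the planning phase, while integration management runs through all five.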

Stakeholder expectations

Stakeholders can be any person or group with a vested stake in the success of a project, program, or portfolio, including team members, functional groups, sponsors, vendors, and customers. Expectations of all stakeholders must be carefully identified, communicated, and managed. Missing this can lead to misunderstandings, conflict, and even project failure.

Here are some tips for managing stakeholder expectations.

Assemble a team specific to project goals, ensuring team members have the right mix of skills and knowledge to deliver.
Leave sufficient time in advance of a project for key individuals to delve into and discuss issues and goals before the project begins.
Ensure the project timeline and scheduled tasks are realistic.

Project scope

During the planning phase, all project details must be solidified, including goals, deliverables, assumptions, roles, tasks, timeline, budget, resources, quality aspects, terms, and so on. The customer and key stakeholders work together to solidify and agree on the scope before the project can begin. The scope guides the project work and any changes to the scope of the project must be presented and approved as a scope change request.

Project budgets

Budgets play a large role in whether a project progresses, or whether it can be completed at all. Few companies have an unlimited budget, so the first thing project stakeholders look at in determining whether a project succeeded or failed is the bottom line. This fact fuels the pressure project leaders and their teams face with each passing day. As such, effective budget management is a primary area of focus for project managers who value their careers. The following are five strategies for maintaining control of your project budget before it succumbs to whopping cost overruns:

Understand stakeholder’s true needs and wants
Budget for surprises
Develop relevant KPIs
Revisit, review, re-forecast
Keep everyone informed and accountable
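
The "revisit, review, re-forecast" strategy above is often carried out with standard earned-value arithmetic. The following is a minimal sketch; the formulas are the common earned-value-management (EVM) approach, not something this article prescribes:

```python
def cost_performance_index(earned_value: float, actual_cost: float) -> float:
    """CPI = EV / AC. Above 1 means under budget; below 1 means over."""
    return earned_value / actual_cost

def estimate_at_completion(budget: float, earned_value: float,
                           actual_cost: float) -> float:
    """Re-forecast the total cost, assuming current spending
    efficiency (CPI) continues for the rest of the project."""
    return budget / cost_performance_index(earned_value, actual_cost)
```

For instance, a $100,000 project that has delivered $40,000 of planned work at a cost of $50,000 has a CPI of 0.8 and re-forecasts to $125,000 at completion, a signal to revisit scope or budget well before the overrun lands.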

Project management methodologies

Most projects are conducted according to a specific methodology, chosen based on a range of factors, to ensure project outcomes. As such, choosing the right project management methodology (PMM) is a vital step for success. There are many, often overlapping, approaches to managing projects, the most popular of which are waterfall, agile, hybrid, critical path method, and critical chain project management. Agile, which includes subvariants such as Lean and Scrum, is increasing in popularity and is being used in virtually every industry. Originally adopted by software developers, agile uses short development cycles called sprints to focus on continuous improvement in developing a product or service.


PMOs and EPMOs

Successful organizations codify project management efforts under an umbrella organization, either a project management office (PMO) or an enterprise project management office (EPMO).

A PMO is an internal or external group that sets direction and maintains and ensures standards, best practices, and the status of project management across an organization. PMOs traditionally do not assume a lead role in strategic goal alignment.

An EPMO has the same responsibilities as a traditional PMO, but with an additional key high-level goal: to align all project, program, and portfolio activities with an organization’s strategic objectives. Organizations are increasingly adopting the EPMO structure, whereby project, program, and portfolio managers are involved in strategic planning sessions right from the start, to increase project success rates.

PMOs and EPMOs both help organizations apply a standard approach to shepherding projects from initiation to closure. In setting standard approaches, PMOs and EPMOs offer the following benefits:

ground rules and expectations for the project teams
a common language for project managers, functional leaders, and other stakeholders that smooths communication and ensures expectations are fully understood
higher levels of visibility and increased accountability across an entire organization
increased agility when adapting to other initiatives or changes within an organization
the ready ability to identify the status of tasks, milestones, and deliverables
relevant key performance indicators for measuring project performance

Project management roles

Depending on factors such as industry, the nature and scope of the project, the project team, the company, or the methodology, projects may need the help of schedulers, business analysts, business intelligence analysts, functional leads, and sponsors. Here is a comparison of the three key roles within the PMO or EPMO; all are in high demand due to their leadership skill sets.

Project manager: Plays the lead role in planning, executing, monitoring, controlling, and closing of individual projects. Organizations can have one or more project managers.

Program manager: Oversees and leads a group of similar or connected projects within an organization. One or more project managers will typically report to the program manager.

Portfolio manager: This role is at the highest level of a PMO or EPMO and is responsible for overseeing the strategic alignment and direction of all projects and programs. Program managers will typically report directly to the portfolio manager.

Project management certification

Successful projects require highly skilled project managers, many with formal training or project management certifications. Some may have project management professional certifications or other certifications from the PMI or another organization. Project management certifications include:

PMP: Project Management Professional
CAPM: Certified Associate in Project Management
PgMP: Program Management Professional
PfMP: Portfolio Management Professional
CSM: Certified Scrum Master
CompTIA Project+ Certification
PRINCE2 Foundation/PRINCE2 Practitioner
CPMP: Certified Project Management Practitioner
Associate in Project Management
MPM: Master Project Manager
PPM: Professional in Project Management

Project management tools

Project management software and templates increase team productivity and effectiveness and prepare the organization for changes brought about by high-impact projects. A well-stocked project management toolkit, including open-source project management tools, can help you plan, execute, monitor, and successfully polish off your next high-impact project.

Project management software falls into multiple categories. Some tools are categorized as project management software; others are more encompassing, such as project portfolio management (PPM) software. Some are better suited for small businesses and others for larger organizations. Project managers will also often use task management, schedule management, collaboration, workflow management, and other types of tools. These are just a few examples of the project management software and tools available to help simplify project management.



Project management skills

Effective project managers need more than technical know-how. The role also requires several non-technical skills, and it is these softer skills that often determine whether a project manager — and the project — will be successful. Project managers must have these seven non-technical skills: leadership, motivation, communication, organization, prioritization, problem-solving, and adaptability. It’s also beneficial to have a strategic mindset, have change management and organizational development expertise, agility, and conflict resolution capabilities, among other skills.

Project management jobs and salaries

By 2027, employers will need 87.7 million individuals working in project management roles, according to PMI, but these hires won’t all carry project manager titles. While the most common titles are project manager, program manager, and portfolio manager, each role may differ depending on industry and specialization. There are also coordinators, schedulers, and assistant project managers, among other roles.

Project managers have historically commanded high salaries, often upwards of six figures, depending on the role, seniority, and location. Indeed provides a searchable database of job salaries, including the annual salaries companies are offering for these project management roles:

Project manager: $85,311 base salary, plus $13,500 average bonus
Program manager: $85,796
Portfolio manager: $100,742
Software/IT project manager: $106,568
Project administrator: $62,093
Project planner: $69,528
Project controller: $90,342
Document controller: $74,899
Project leader: $130,689
Program/project director: $101,126
Head of program/project: $128,827


Companies today face disruptions and business risks the likes of which haven’t been seen in decades. The enterprises that ultimately succeed are the ones that have built up resilience.

To be truly resilient, an organization must be able to continuously gather data from diverse sources, correlate it, draw accurate conclusions, and in near-real time trigger appropriate actions. This requires continuous monitoring of events both within and outside an enterprise to detect, diagnose, and resolve issues before they can cause any damage.  

This is especially true when it comes to enterprise procurement. Upwards of 70% of an organization’s revenue can flow through procurement. This highlights the critical need to detect potential business disruptions, spend leakages (purchases made at sub-optimal prices by deviating from established contracts, catalogs, or procurement policies), non-compliance, and fraud. Large organizations can have a dizzying array of data related to thousands of suppliers and accompanying contracts.
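Spend leakage of this kind can be detected with a straightforward check: compare each purchase's unit price against the contracted price for that item. The sketch below is a minimal illustration of that idea; the field names and the 5% tolerance are assumptions for the example, not details of any specific product.

```python
# Illustrative sketch: flag purchases whose unit price exceeds the
# contracted price by more than a tolerance (spend leakage).
# Field names and the 5% tolerance are hypothetical.

TOLERANCE = 0.05  # allow 5% deviation before flagging

def find_spend_leakage(purchases, contract_prices):
    """Return purchases priced above contract by more than TOLERANCE.

    purchases: list of dicts with 'item', 'unit_price', 'quantity'
    contract_prices: dict mapping item -> contracted unit price
    """
    flagged = []
    for p in purchases:
        contracted = contract_prices.get(p["item"])
        if contracted is None:
            continue  # off-contract item; handled by a separate check
        overage = (p["unit_price"] - contracted) / contracted
        if overage > TOLERANCE:
            flagged.append({
                "item": p["item"],
                "overage_pct": round(overage * 100, 1),
                "leakage": (p["unit_price"] - contracted) * p["quantity"],
            })
    return flagged

purchases = [
    {"item": "laptop", "unit_price": 1200.0, "quantity": 10},
    {"item": "laptop", "unit_price": 1000.0, "quantity": 5},
]
contracts = {"laptop": 1000.0}
flagged = find_spend_leakage(purchases, contracts)
```

Running such a check continuously over daily purchase feeds, rather than in periodic batches, is what turns it from an audit exercise into an early-warning signal.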

Yet amassing and extracting value from such large amounts of data is more than humans can keep up with, as the number of data sources and the volume of data continue to grow exponentially. Current data monitoring and analysis methods are no longer sufficient.

“While periodic spend analysis was okay up until a few years ago, today it’s essential that you do this kind of data analysis continuously, on a daily basis, to spot issues and address them quicker,” says Shouvik Banerjee, product owner for ignio Cognitive Procurement at Digitate.

Enterprises need a tool that continuously monitors data so they can use their funds more effectively. Companies across industries have found success with ignio Cognitive Procurement, an AI-based analytics solution for procure-to-pay. The solution screens purchase transactions to detect and predict anomalies that increase risk, spend leakage, cycle time, and non-compliance.

For example, the product flags purchase requests with suppliers who have a poor track record of compliance with local labor laws. Likewise, it flags urgent purchases whose fulfillment is likely to be delayed based on patterns observed in similar transactions in the past.  It also flags invoices that need to be prioritized to take advantage of early payment discounts.
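Flags like these amount to rules evaluated against each transaction together with the supplier's history. The sketch below illustrates the general pattern only; the supplier attributes and thresholds are invented for the example and are not the product's actual logic.

```python
# Illustrative rule-based screening of purchase transactions.
# Supplier attributes and thresholds here are hypothetical.

def screen_transaction(txn, supplier_history):
    """Return the list of flags raised for one purchase transaction."""
    flags = []
    history = supplier_history.get(txn["supplier"], {})

    # Supplier with a poor labor-law compliance track record
    if history.get("labor_compliance_score", 1.0) < 0.6:
        flags.append("supplier_compliance_risk")

    # Urgent order likely to be late, judging by past delivery performance
    if txn.get("urgent") and history.get("on_time_rate", 1.0) < 0.8:
        flags.append("fulfillment_delay_risk")

    # Invoice eligible for an early-payment discount
    if txn.get("early_payment_discount_pct", 0) > 0:
        flags.append("prioritize_for_discount")

    return flags

history = {"acme": {"labor_compliance_score": 0.4, "on_time_rate": 0.7}}
txn = {"supplier": "acme", "urgent": True, "early_payment_discount_pct": 2}
flags = screen_transaction(txn, history)
# flags -> ['supplier_compliance_risk', 'fulfillment_delay_risk',
#           'prioritize_for_discount']
```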

“It’s a system of intelligence versus other products in the market, which are systems of record,” says Banerjee. Not only does ignio Cognitive Procurement analyze an organization’s array of transactions, it also takes into account relevant market data on suppliers and categories on a daily basis.

ignio Cognitive Procurement is unique for its ability to correlate what’s currently happening in the market with what’s going on inside an organization, and it makes specific recommendations to stakeholders. For example, the solution can simplify category managers’ work, helping them source the best deals for their company, or make decisions such as whether to place an order now or hold off for a month.

Charged with finding the best suppliers and monitoring their success within the context of the market, category managers work better and smarter when they can tap into ignio Cognitive Procurement.

ignio Cognitive Procurement also identifies other opportunities to save money and improve the effectiveness of procurement. For instance, the solution proactively makes business recommendations that seamlessly take into account not only price, but also a variety of key factors like timeliness, popularity, external market indicators, suppliers’ market reputation, and their legal, compliance, and sustainability records.
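Weighing price against timeliness, reputation, and compliance reduces to a multi-criteria score per supplier. The sketch below shows one simple way to do it; the factor names and weights are assumptions for illustration, not values from any real solution.

```python
# Hypothetical weighted scoring of suppliers across multiple factors.
# Each factor is normalized to [0, 1], higher is better; the weights
# are illustrative only.

WEIGHTS = {
    "price": 0.35,       # cheapest supplier scores 1.0 after normalization
    "timeliness": 0.25,
    "reputation": 0.20,
    "compliance": 0.20,
}

def score_supplier(factors):
    """Weighted sum of normalized factor scores."""
    return sum(WEIGHTS[name] * factors[name] for name in WEIGHTS)

def recommend(suppliers):
    """Return suppliers sorted best-first by weighted score."""
    return sorted(suppliers, key=lambda s: score_supplier(s["factors"]),
                  reverse=True)

suppliers = [
    {"name": "A", "factors": {"price": 1.0, "timeliness": 0.5,
                              "reputation": 0.6, "compliance": 0.4}},
    {"name": "B", "factors": {"price": 0.8, "timeliness": 0.9,
                              "reputation": 0.9, "compliance": 0.95}},
]
best = recommend(suppliers)[0]["name"]
# Supplier B wins despite a higher price, because its timeliness,
# reputation, and compliance scores outweigh A's price advantage.
```

The design point is that price alone never decides the ranking; shifting the weights lets a category manager encode how much delivery risk or compliance history should matter for a given category.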

“Companies also use the software to analyze that part of spend that’s not happening through contracts,” says Banerjee, “and they’ve been able to identify items which have significant price variance.”

To avoid irreversible damage or missed opportunities and to keep a competitive advantage, organizations across industries urgently need an AI-based analytics solution for procure-to-pay that can augment their human capabilities.

To learn more about Digitate’s ignio Cognitive Procurement, click here.


By Milan Shetti, CEO Rocket Software

According to PwC, almost two-thirds (60%) of Chief Information Officers (CIOs) see digital transformation as one of the most important drivers of growth this year. The cloud has been a major part of most organizations’ IT investments and digital transformation journeys. In fact, Gartner forecasts worldwide public cloud end-user spending will reach nearly $500 billion in 2022. But with all the hype around the cloud, many organizations overlook one of their business’s most critical tools: terminal emulation.

Terminal emulators for IBM Z, IBM i, and other mainframe systems aren’t exactly as popular as the cloud nowadays, but they play a critical role in ensuring secure access to stored data for organizations that rely on the mainframe. From ensuring regulatory compliance to serving customers more efficiently, terminal emulation is key to enabling a range of business processes.

Read on to learn why IT leaders can’t afford to overlook terminal emulation any longer — and what steps to take next.

Terminal emulators are not what they once were

Until recently, terminal emulators were limited in their capabilities, hampering users with a lack of configuration options and cumbersome interfaces. Because of this, organizations experienced a loss of productivity and an increase in overall frustration for end users and administrators alike.

However, today’s new generation of emulators allows IT teams to access their business-critical applications from home computers, or even mobile devices, without any compromise in functionality. The pandemic was a significant catalyst for innovation in terminal emulator capabilities. When the at-home workforce spiked from 17% pre-pandemic to 44% during the pandemic, professionals needed secure access to critical systems no matter where they were located.

The latest terminal emulators deliver exceptional configurability and seamless access regardless of a user’s physical location, allowing remote team members to be as productive and efficient as they would be in a traditional office. Additionally, unlike the terminal emulators of 20 years ago, today’s modern terminal emulators provide the latest security features as well as customized, feature-rich customer and user experiences. Terminal emulators that are kept up to date with today’s business needs enable access to applications from any browser, allowing employees to manage even the most complex functions from wherever they are.

Not every emulator is created equal

As companies upgrade their legacy systems, updated terminal emulators will help them access data from centralized systems and facilitate automation. But not all terminal emulators are created equal. Organizations saddled with a weak, inflexible terminal emulator can experience disrupted user workflows, frustrating users and slowing processes.

One important aspect an organization should look for when evaluating terminal emulator solutions is the ability to unify and customize their existing IT environment. One of the primary motivators for enterprises evaluating their existing terminal emulators is the operational benefits from the perspective of IT teams. Unifying terminal emulation reduces the number of supported solutions needed, which saves IT professionals time and resources. The ability to customize the terminal environment is also a key feature, as many users with decades of experience have strong preferences and have even developed shortcuts for their most often-used tasks. Terminal emulator customization allows users to feel familiar and comfortable with the new environment.

Another critical aspect of today’s terminal emulators for business longevity is their security capabilities. While others may let their terminal emulator solution remain stagnant, vendors like Rocket Software are continually developing and improving their emulation solution to keep up with current security protocols, ensuring it is secure and compliant. This is important, especially for companies in highly regulated industries such as financial services, to avoid costly fines and penalties.

Terminal emulation is one of your business’s most critical tools because it provides access to the data enterprises need to meet customer demands, while also delivering improved experiences for employees. Unlike terminal emulators of the past, modern solutions provide customizable terminal environments, ensure security compliance, and allow for excellent customer and user experiences.

To learn more about the power of Rocket Terminal Emulator, visit our website.
