In today’s fast-paced business world, where companies must constantly innovate to keep up with competitors, depending on fully custom software solutions created through manual coding is insufficient.

Instead, enterprises are increasingly pursuing no-code and low-code solutions for application development. No-code and low-code development entails creating software applications through a user-friendly graphical interface, often with drag-and-drop functionality. These solutions require less coding expertise, making application development accessible to a larger swath of workers. That accessibility is critical, especially as companies continue to face a shortage of highly skilled IT workers. In fact, IDC has identified low-code/no-code mobile applications as a driver of the future of work.

“The key difference between traditional and no-code and low-code solutions is just how easy and flexible the user experience can be with no-code and low-code,” says Alex Zhong, director of product marketing at GEP. “Speed has become more and more important in the business environment today. You need to get things done in a rapid way when you’re responding to the disruptive environment and your customers.”

The traditional application development process is both complicated and multilayered. It entails zeroing in on the business need, assessing the idea, submitting the development request to IT, getting evaluations and approvals to secure funding, designing and building the application, and doing user testing.

“Traditionally it’s a lengthy process with many people involved,” Zhong says. “This can take quite a few weeks and often longer.” Not only does workers’ time accrue, but various costs also add up quickly. “The new way of application development reduces complexity, tremendously shortens the process, and puts application development more in users’ hands.”

Here are some other benefits of no-code/low-code solutions over the traditional approach:

Projects are more malleable. “With low-code solutions, you can make changes quicker,” says Kelli Smith, GEP’s head of product management for platform. With fewer levels of approval and fewer cooks in the kitchen, it’s easy to tweak ideas on the fly and improve applications as you go.

Ideas are less likely to get lost in translation. With traditional development, sometimes ideas aren’t perfectly translated into a product. With the user at the helm working closely with IT, ideas are more likely to be accurately executed.

IT and the business work better together. No-code and low-code solutions are typically driven by someone close to the business, but IT is still involved in an advisory role — especially in initial stages. The relationship becomes more of a collaborative one. “The business is developing together with IT,” Smith says.

Developers are freed up for more complex work. With the business more involved in application development, IT workers’ time is freed up to dedicate to more complicated tasks and projects rather than an excess of manual or administrative work.

Often, moving away from traditional application development is a process for enterprises. Companies may start with low-code solutions and gradually shift toward no-code solutions. The evolution requires a culture change, vision from leadership, and endorsement from IT.

Importantly, employees also need to be empowered to participate.

GEP believes that no-code/low-code is the way of the future. The company is leading efforts in no-code and low-code development through partnerships and investments in solutions. “In today’s environment,” Zhong says, “no-code/low-code is simply key to giving enterprises more flexibility.”

At GEP, we help companies become more agile and resilient with transformative, holistic supply chain solutions. Our comprehensive, unified, end-to-end solutions harness technology to change organizations for the better. To find out more, visit GEP.


As the popularity of project collaboration software grows along with that of other software-as-a-service (SaaS) products, a research report from SaaS purchasing platform Vertice shows that more than 90% of enterprises are overpaying for these tools.

The project collaboration software market is estimated to reach a value of $27.40 billion by the end of 2022, from $21.69 billion in 2021, according to a report from Grand View Research, which links the growth to factors such as the evolution of the workplace and the rising need to incorporate effective means of team collaboration across different geographies in an enterprise due to the pandemic.

Another survey, conducted by market research firm Gartner, showed that there was a 44% increase in the use of SaaS-based project collaboration tools between 2019 and 2021.

This increased usage has led more than 80% of vendors to raise their list prices by 10% every year since 2019, the Vertice report showed.

Lack of pricing transparency is a challenge

A lack of transparency in pricing seems to be the biggest challenge for enterprises buying project collaboration software, resulting in overpayment for these tools, according to the Vertice report.

A meager 14% of software vendors selling project collaboration software list prices on their website or through third parties, the report showed.

The nondisclosure of pricing poses a significant challenge for enterprises as they are not able to compare pricing across a variety of vendors, the company said in its report, adding that most project collaboration vendors require a consultation with their sales teams before they quote a price.

Lack of pricing transparency plagues the broader SaaS category as well. Only 45% of vendors list pricing online, while the remaining 55% obscure pricing from potential customers, a separate report from OpenView Venture Partners showed.

Is long-term commitment the answer?

Committing to long-term or multiyear contracts may be the only option an enterprise has for securing discounts when buying project collaboration tools, Vertice said.

Currently, 89% of project collaboration vendors offer discounts based on term length, the report showed.

In addition, autorenewal clauses in software contracts also contribute to price increases, Vertice said, noting that 91% of vendors stipulate autorenewal clauses in their contracts and nearly 72% of project collaboration vendors have clauses that allow them to change their pricing at any time.

“It’s typical for vendors to automate the cost increases that are passed onto customers,” the company said in a statement.


While a new forecast released Monday by Spiceworks/Ziff Davis said that overall IT spending will be largely unhampered by recessionary trends in the economic outlook, much of that spending will be driven by large enterprises, leaving the picture much murkier for small and medium-size businesses.

The forecast is based on a survey of IT professionals in the US and Europe, which was performed this summer by Aberdeen Research. Fully 90% of respondents said that they either planned to increase spending or keep it steady in 2023. However, the impulse to buy is not evenly distributed across companies—while 61% of large enterprises said that they plan an expansion of IT spending in 2023, just 41% of smaller companies said the same.

Counterintuitively, the researchers said, companies more worried about the effects of a possible recession were more likely to have bigger IT spending in their future plans than those who were not. Just 30% of companies with “no plans” to make major preparations for a recession reported that they were getting ready to hike IT spending, in contrast to solid majorities—68% and 55%—for companies who were already making recession plans or planned to in the near future, respectively.

That level of preparedness, coupled with the fact that some companies may be planning to reinvest cost savings from other areas into IT, reflects lessons learned during past economic downturns, according to Jim Rapoza, vice president and principal analyst at Aberdeen.

IT spending during a recession shows benefits

“Businesses that invested in technology during the pandemic saw significant benefits,” he said on a conference call announcing the study’s results. “Our research revealed improvements across performance, reliability, security and even reduced overall IT costs among organizations that modernized their infrastructure—even if that was initially out of necessity.”

Essentially, he said, recessions shouldn’t spur IT cutbacks. Companies that made such cuts in 2001 and 2008 were frequently punished for it by the market. Hence, larger businesses, particularly those that have already weathered past economic crises, tend to be much more likely to maintain their IT spending levels or even to increase them during economic headwinds.

That trend is already recognizable in the figures for uptake of newer technologies, the study found, particularly 5G, edge, serverless computing, and AI. Part of the reason for that is that many of them are interrelated. The type of connectivity enabled by 5G makes it easier for some companies to deploy edge computing, which creates the volumes of data required to feed AI models, and so on. Hence, companies with the financial wherewithal to either build those capabilities out on their own or hire managed service providers to take care of them—that is to say, big businesses—are much more likely to be working on them, and thus are more likely to reap the benefits.

That fact, along with the higher uptake of managed services in general among larger companies, could mean that such enterprises are better prepared to weather an economic downturn, or any other kind of large-scale headwind in the market, according to Peter Tsai, head of tech insights at Spiceworks/Ziff Davis.

“The pandemic’s not over—what if there’s another deadly wave that forces everyone to go remote again?” he said on the conference call. “Having that hybrid infra makes it easier to flip that switch to ‘remote’ back on.”


John Hill, chief digital information officer for MSC Industrial Supply, received his Doctor of Business Administration degree in May. His dissertation research examined the question: After years of investment in project management, change management, and more recently, agile principles, why do most companies still have a stunningly high failure rate for digital initiatives? 

Hill’s conclusion was that companies need to do a better job with organizational digital agility (ODA), which is a business’s ability to create digital capabilities quickly and efficiently.

“My hypothesis was that in many companies, there are organizational behaviors that impede project and change management practices,” says Hill. “You cannot deliver anything with terrible project and change management, but excellence in those areas alone is not sufficient for developing digital capabilities.”

What is organizational digital agility?

ODA, says Hill, is made up of three components: slack, alignment, and speed.  

In ODA, “slack” means resources that you can move from one project to another without losing any value. “When you have everyone fully engaged in an operations project, but you have to pull them off that project for something more urgent, you pose risk to the first initiative,” Hill says.

With slack, CIOs do not allocate all their resources to operational issues. They allocate some to innovation, education, or continuous improvement, work they can pause in order to move those resources to something more urgent. “The key to slack is that moving resources off of innovation to a high-priority project poses little operational risk,” says Hill. “Those resources can then return to the innovation activity without losing productivity.”

The second component of ODA is alignment. Hill points out in his research that most companies have a governance process through which executives align on the top list of projects for the year. But if you ask the management team which project is No. 5, 6, or 7, they will likely not know. “When conflict of resources occurs among those top projects, no one knows how to resolve it, because the prioritization is not granular enough,” Hill says. “Without clear alignment, you lose efficiency when those resources move from project to project. You increase the likelihood that the squeaky wheel gets the resources.”
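To make that concrete, here is a toy Python sketch (our illustration, not drawn from Hill’s research) of why a granular, ordinal ranking matters: once every project has a unique rank, any resource conflict resolves itself. The project names are hypothetical.

# Toy sketch: a fully ordinal project ranking resolves resource
# conflicts mechanically. Project names are hypothetical.
PRIORITIES = [
    "erp-migration",       # rank 1
    "customer-portal",     # rank 2
    "data-warehouse",      # rank 3
    "chatbot-pilot",       # rank 4
    "invoice-automation",  # rank 5
]

def resolve_conflict(a: str, b: str) -> str:
    """Return the project that keeps the contested resource."""
    return min(a, b, key=PRIORITIES.index)

print(resolve_conflict("chatbot-pilot", "data-warehouse"))  # data-warehouse

With only a top-three list, anything ranked lower would be missing from PRIORITIES and the contested resource would go to the squeakiest wheel; the granularity is what makes the rule mechanical.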

The last component of ODA is speed, which is becoming critically important at a time when the rate of technology change in the market is so high. 

For his research, Hill conducted a study of 132 CIOs from companies ranging from $300 million to over $10 billion in annual revenues, and he was able to demonstrate a correlation between a firm’s ODA and its creation of digital capabilities. His research further demonstrated the connection between a firm’s digital capabilities and its ability to compete in the marketplace. “The impact of ODA is pretty stunning,” he says.

Advice for CIOs looking to improve ODA

For slack, Hill suggests setting aside sprints for innovation, education, and reducing technical debt. “This way, if you need those resources to fight a fire, you are not posing a risk to the organization,” he says.

For alignment, Hill has three recommendations:

1. List more than just your top priorities, and be sure to create an ordinal ranking. At MSC, the executive team has taken that first step.

2. Develop processes to understand the true capacity in the organization. Hill likes the enterprise Kanban board, so you “see who is available and who is tapped out,” he says.

3. Create end-to-end teams and reduce the number of resources that move from project to project. “Call them squads or product teams or scrums, but you want all the resources necessary to build a new digital capability on one team,” says Hill. “An end-to-end team could be made up of business analysts, BI analysts, data engineers, developers, and QA, with a few traveling shared resources.”

Hill acknowledges that speed is the trickiest component of ODA because it can be very difficult to find the root cause of why you are not moving faster. Here, Hill suggests shifting your emphasis from old-school steering committees to a product management model and the “end-to-end teams” concept.

“At the heart of improving speed is empowering the product owners to make decisions about the backlog and execute,” he says. “Having to wait for steering committee meetings can really slow things down.”

To increase speed at MSC, Hill is working on the concept of portfolio allocation. With an ordinal ranking of investment priorities, he aims to allocate spend to the product owners in each business unit or functional area. Those product owners then prioritize that spend within their product team. “The teams are closest to understanding what new capabilities are needed in each platform,” says Hill. “They don’t need to slow down the work by asking for direction.”

How ODA impacts the CIO role

During his first five months at MSC, which distributes metalworking and other industrial products, Hill has been reducing the time it takes to review team activity. “When I spend time with the team, I just want to know about any blockers, issues, and risks and how they are being dealt with,” he says. “The team should not spend time building project status presentations. I want them to be fast and use Jira, not PowerPoint. Our meetings should last 15 minutes.”

Finally, Hill suggests acknowledging that ODA does not come from changing a few processes. It is an entirely new way of working for the IT team.

“IT leaders are used to functionally and operationally managing the people that work for them,” Hill says. “Now, the person taking a squad leadership role must work across the entire organization to bring together business analysts, data scientists, developers, and operations people to ensure we are not duplicating work. At the same time, some associates who functionally report to them may be taking work instruction from another squad leader. Changing the behaviors of IT leaders is a big challenge.”

But an even greater challenge than changing behaviors in IT is getting executives to embrace the prioritization process. “When C-level executives try to squeeze in a new priority, rather than recognizing that other work needs to stop, they usually compress everything else. When you have too much work in progress, you reduce efficiency,” he says.

To Hill, ODA is a perfect example of how the CIO role is evolving away from technology expert and toward an orchestrator role. “CIOs are not the people who need to be technology experts,” says Hill. “Today’s CIOs are the designers of organizations who can keep 25 balls in the air at the same time, and know how all those balls fit into the digital vision years from now.”


The meager supply and high salaries of data scientists have led many companies to a decision totally in keeping with artificial intelligence ― to automate whatever is possible. Case in point: machine learning. A Forrester study found that automated machine learning (AutoML) has been adopted by 61% of data and analytics decision-makers in companies using AI, with another 25% of companies saying they’ll adopt it in the next year.

AutoML automates repetitive and manual machine learning tasks. That’s no small thing, especially when data scientists and data analysts now spend a majority of their time cleaning, sourcing, and preparing data. AutoML allows them to hand these tasks off to machines and more quickly develop and deploy AI models.

If your company is still hesitating to adopt AutoML, here are some very good reasons to deploy it sooner rather than later.

1. AutoML Super Empowers Data Scientists

AutoML transfers data to a training algorithm. It then searches for the best neural network for each desired use case. Results can be generated within 15 minutes instead of hours. Deep neural networks in particular are notoriously difficult for a non-expert to tune properly. AutoML automates the process of training a large selection of deep learning and other types of candidate models. 
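As a rough illustration of that workflow, here is a minimal sketch using the open-source H2O AutoML package for Python (the same H2O.ai technology referenced later in this piece); the dataset path and target column name are placeholders, not details from the article.

# Minimal AutoML sketch using the open-source h2o Python package.
# The CSV path and "target" column name are hypothetical placeholders.
import h2o
from h2o.automl import H2OAutoML

h2o.init()
train = h2o.import_file("training_data.csv")  # your training data

# Search up to 20 candidate models, capped at 15 minutes of runtime.
aml = H2OAutoML(max_models=20, max_runtime_secs=900, seed=1)
aml.train(y="target", training_frame=train)

print(aml.leaderboard.head())  # candidates ranked by cross-validated score
model = aml.leader             # best model, ready to predict or export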

With AutoML, data scientists can say goodbye to repetitive, tedious, time-consuming tasks. They can iterate faster and explore new approaches to what they’re modeling. The ease of use of AutoML allows more non-programmers and senior executives to get involved in conceiving and executing projects and experiments.

2. AutoML Can Have Big Financial Benefits

With automation comes acceleration. Acceleration can be monetized. 

Companies using AutoML have experienced increased revenue and savings from their use of the technology. A healthcare organization saved $2 million per year from reducing nursing hours and $10 million from reduced patient stays. A financial services firm saw revenue climb 1.5-4% by using AutoML to handle pricing optimization.

3. AutoML Improves AI Development Efforts

AutoML simplifies the process of choosing and optimizing the best algorithm for each machine learning model. The technology selects from a wide array of choices (e.g., decision trees, logistic regression, gradient boosted trees) and automatically optimizes the model. It then transfers data to each training algorithm to help determine the optimal architecture. Automating ML modeling also reduces the risk of human error.
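Stripped to its essence, that selection step is a search over candidate algorithms. The hand-rolled scikit-learn sketch below (our illustration, on synthetic data) shows the core loop, which real AutoML systems extend with hyperparameter tuning, ensembling, and early stopping.

# Hand-rolled illustration of what AutoML automates: evaluate several
# candidate algorithms and keep the best-scoring one.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "gradient_boosted_trees": GradientBoostingClassifier(random_state=0),
}

# Score each candidate with 5-fold cross-validation and keep the winner.
scores = {name: cross_val_score(est, X, y, cv=5).mean()
          for name, est in candidates.items()}
best = max(scores, key=scores.get)
print(f"best model: {best} (accuracy {scores[best]:.3f})")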

One company reduced time-to-deployment of ML models by a factor of 10 over past projects. Others boosted lead scoring and prediction accuracy and reduced engineering time. Using ML models created with AutoML, customers have reduced customer churn, reduced inventory carryovers, improved email opening rates, and generated more revenue.

4. AutoML Is Great at Many Use Cases

Use cases where AutoML excels include risk assessment in banking, financial services, and insurance; cybersecurity monitoring and testing; chatbot sentiment analysis; predictive analytics in marketing; content suggestions by entertainment firms; and inventory optimization in retail. AutoML is also being put to work in healthcare and research environments to analyze and develop actionable insights from large data sets.

AutoML is being used effectively to improve the accuracy and precision of fraud detection models. One large payments company improved the accuracy of their fraud detection model from 89% to 94.7% and created and deployed fraud models 6 times faster than before. Another company that connects retailers with manufacturers reduced false positive rates by 55% and sped up deployment of models from 3-4 weeks to 8 hours. 

A Booming Market for AutoML

The global AutoML market is booming, with revenue of $270 million in 2019 and predictions that the market will approach $15 billion by 2030, a CAGR of 44%. A report by P&S Intelligence summed up the primary areas of growth for the automation technology: “The major factors driving the market are the burgeoning requirement for efficient fraud detection solutions, soaring demand for personalized product recommendations, and increasing need for predictive lead scoring.”

Experts caution that AutoML is not going to replace data scientists any time soon. It is merely a powerful tool that accelerates their work and allows them to develop, test, and fine-tune their strategies. With AutoML, more people can participate in AI and ML projects, applying their understanding of their data and business while letting automation do much of the drudgery.

The Easy Button

Whether you’re just getting started or you’ve been doing AI, ML and DL for some time, Dell Technologies can help you capitalize on the latest technological advances, making AI simpler and speeding time to insights with proven Validated Designs for AI.

Validated Designs for AI are jointly engineered and validated to make it quick and easy to deploy a hardware-software stack optimized to accelerate AI initiatives. These integrated solutions leverage H2O.ai for automatic machine learning. NVIDIA AI Enterprise software can increase data scientist productivity, while VMware® vSphere with Tanzu simplifies IT operations. Customers report that Validated Designs enable 18–20% faster configuration and integration, save 12 employee hours a week with automated reconciliation feeds, and reduce support requirements by 25%.

Validated Designs for AI speed time to insight with automatic machine learning, MLOps and a comprehensive set of AI tools. Dell PowerScale storage improves AI model training accuracy with fast access to larger data sets, enabling AI at scale to drive real‑time, actionable responses. VxRail enables 44% faster deployment of new VMs, while Validated Designs enable 18x faster AI models.

You can confidently deploy an engineering‑tested AI solution backed by world‑class Dell Technologies Services and support for Dell Technologies and VMware solutions. Our worldwide Customer Solution Centers with AI Experience Zones enable you to leverage engineering expertise to test and optimize solutions for your environments. Our expert consulting services for AI help you plan, implement and optimize AI solutions, while more than 35,000 services experts can meet you where you are on your AI journey. 

AI for AI is here, making it easier and faster than ever to scale AI success. For more information, visit Dell Artificial Intelligence Solutions.  

***

Intel® Technologies Move Analytics Forward

Data analytics is the key to unlocking the most value you can extract from data across your organization. To create a productive, cost-effective analytics strategy that gets results, you need high performance hardware that’s optimized to work with the software you use.

Modern data analytics spans a range of technologies, from dedicated analytics platforms and databases to deep learning and artificial intelligence (AI). Just starting out with analytics? Ready to evolve your analytics strategy or improve your data quality? There’s always room to grow, and Intel is ready to help. With a deep ecosystem of analytics technologies and partners, Intel accelerates the efforts of data scientists, analysts, and developers in every industry. Find out more about Intel advanced analytics.


Most businesses have a long-term vision and core strategic goals. But many struggle to make them a reality. The problem often arises because the strategic goals are so overarching and distant that they receive no consistent focus amid the day-to-day work.

Deeper cultural issues typically underlie the problem. Often, there is no clear link between the people working on important projects and the targeted business outcomes. People may be toiling on tech projects to improve website functionality or automate processes, yet have no idea of the ultimate goals those tasks are designed to feed, such as achieving a specific percentage increase in conversion or customer retention.

There is a better way. When people understand clearly what they are working towards and why, it is highly motivational. It changes how they approach their work. They will try different ways of achieving the same goals and become significantly more innovative.

The successful approach involves using empiricism, the recognition that knowledge comes from evidence and first-hand experiences. This approach involves staff noting exactly what is happening at various stages of a project and regularly matching their progress against ultimate goals.

It also means creating intermediate and sprint goals within the larger targets. Intermediate goals are milestones set along the way that are essential for overall success. Sprint goals are the specific smaller achievements that move teams towards the intermediate goals.

To use a human example: imagine I were to weigh 120kg, and wanted to lose 40kg in a year. Within the year, it would be easy to lose sight of making steady progress towards that goal. So, I would split my main goal of losing 40kg in a year into intermediate targets of losing 3.5kg every month. But achieving those intermediate targets relies on small sprint goals, such as making a weekly meal plan, joining a gym, using the treadmill twice a week and riding my bike to work two days a week.

Crucially, with these sprint and intermediate goals, there is regular achievement, measurement, and accountability. They allow for early course correction. They are outcome-oriented rather than just tasks the team has to perform, and they can be measured against the strategic goal, making progress transparent.

In business: A hotel bookings company might be targeting full automation in two years, replacing costly phone-based or in-person bookings. Its intermediate goal, in one year, might be to have automated 50% of processes. Its sprint goals would then be to survey customers to understand their requirements around automated booking, create easier and shorter forms, improve the website layout, then introduce a chatbot that helps customers complete the booking process. The sprint goals’ outcomes would also be measured continuously, to check on progress being made and whether there is a need to adapt.  

We’ve been on this journey at GfK, in a drive to further improve our project measurement, transparency and adaptability. To maximize our efficiency in driving towards this goal, we’ve split the required changes into intermediate goals, with numerous sprint goals in between.

Every two months we have a review meeting with all product teams, to inspect and adapt. We look at the overall task, and measure against it what has changed. If we’re not achieving our targets or need to adapt the targets, this regular review gives us immediate flexibility to shift course. Our focus on evidence-based change, and transparent accountability, means we are witnessing a dramatic enhancement in our ability to hit larger goals successfully.

Looking ahead, what excites me most is the ability to continue experimenting in my own, and my teams’, breakdown of projects to hit ultimate goals. This way of working identifies everyone as a direct part of the big picture and specifies the contribution they need to deliver. By breaking down a strategic goal into smaller goals like intermediate and sprint goals, making them outcome focused, measuring them continuously, and adapting the direction towards the ultimate goal, we achieve the desired outcome. To sum it all up: it might take longer to achieve the desired result, but the end result will be the one desired.

GfK is recruiting! We have various tech roles open, and we welcome applications from everyone – with or without a tech background. Find out more at gfk.com/careers.  
