Software drives the modern digital business, which makes ensuring strong software security and integrity more important than ever. High-profile vulnerabilities discovered over the past few years, each with the potential to expose the organizations using the affected software to attack, have hammered home the need to be vigilant about vulnerability management.

Perhaps the most dramatic recent example was the zero-day vulnerability discovered in Apache’s popular open-source Log4j logging library. The logging utility is used by millions of Java applications, and the underlying flaw—called Log4Shell—can be exploited relatively easily to enable remote code execution on a vulnerable machine. The impact of the vulnerability was felt worldwide, and security teams had to scramble to find and mitigate the issue.

In November 2022, open-source toolkit developers announced two high-severity vulnerabilities that affect all versions of OpenSSL 3.0.0 up to 3.0.6. OpenSSL is a toolkit supporting secure communications in web servers and applications. As such, it’s a key component of the Transport Layer Security (TLS) protocol, which ensures that data sent over the internet is secure.
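When an advisory like this lands, the first question for defenders is which library version each runtime actually links against. As a minimal illustration (one runtime only, not a fleet-wide inventory), Python’s standard ssl module can report the OpenSSL build it was compiled against:

```python
import ssl

# Print the OpenSSL (or compatible) library this Python runtime links against,
# e.g. a string like "OpenSSL 3.0.x ..." that can be checked against the
# affected range (3.0.0 up to 3.0.6 for the November 2022 advisories).
print(ssl.OPENSSL_VERSION)

# A numeric tuple that is easier to compare programmatically.
print(ssl.OPENSSL_VERSION_INFO)
```

Note that this sees only the copy one interpreter links against; statically linked or vendored copies of OpenSSL elsewhere on a host will not show up, which is why a fuller inventory is needed.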

SBOMs as a solution

One of the most effective tools for finding and addressing such vulnerabilities and keeping software secure is the software bill of materials (SBOM). SBOMs are formal, machine-readable records that contain the details and supply chain relationships and licenses of all the different components used to create a particular software product. They are designed to be shared across organizations to provide transparency of the software components provided by different players in the supply chain.

Many software providers build their applications by relying on open-source and commercial software components. An SBOM enumerates these components, creating a “recipe” for how the software was created.
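As a rough sketch, a heavily simplified SBOM entry in the CycloneDX JSON format might look like the following; the component names and versions here are invented for illustration, and real SBOMs carry many more fields (hashes, licenses, supplier data):

```json
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.4",
  "components": [
    {
      "type": "library",
      "name": "log4j-core",
      "version": "2.14.1",
      "purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.14.1"
    },
    {
      "type": "library",
      "name": "openssl",
      "version": "3.0.5",
      "purl": "pkg:generic/openssl@3.0.5"
    }
  ]
}
```

Because the format is machine-readable, tooling can match the `purl` identifiers against vulnerability advisories automatically.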

A toolkit such as OpenSSL, for example, includes dependencies that are difficult—and in many cases impossible—for traditional vulnerability scanners to uncover. Identifying the third-party libraries associated with a software package requires a multilayered approach, and this is where an SBOM can help.

The U.S. Department of Commerce has stated that SBOMs provide those who produce, purchase, and operate software with information that enhances their understanding of the supply chain. This enables multiple benefits, most notably the ability to track both known and newly emerged vulnerabilities and risks.

These records form a foundational data layer on which further security tools, practices, and assurances can be built, the Commerce Department says, and serve as the foundation for an evolving approach to software transparency.

A 2022 report by Linux Foundation Research, based on a survey of 412 organizations from around the world, showed that 90% of the organizations had started their SBOM journey.

More than half of the survey participants said their organizations are addressing SBOMs in a few, some, or many areas of their business, and 23% said they are addressing them across nearly all areas of their business or have standard practices that include the use of SBOMs. Overall, 76% of organizations had a degree of SBOM readiness at the time of the survey.

The research showed that the use of open-source software is widespread and that software security is a top organizational priority. Given the worldwide efforts to address software security, SBOMs have emerged as a key enabler, it said. SBOM production or consumption was expected to grow by about 66% during 2022, bringing its use to 78% of organizations.

The top-three benefits of producing SBOMs identified by survey participants were that SBOMs made it easier for developers to understand dependencies across components in an application, monitor components for vulnerabilities, and manage license compliance.

Key features to consider

SBOMs are key to finding and fixing vulnerabilities quickly, before it’s too late. That’s because they dig deep into the various dependencies among software components, examining the compressed files within applications so that risk can be managed effectively. Without that visibility, it might take a software vendor days or weeks to confirm with its developers whether its products are affected. That’s too long a window of opportunity for cybercriminals to exploit vulnerabilities.

With SBOMs, security teams can know exactly where an affected component is being used across applications in use within their organizations.

It’s important for organizations to understand that not all SBOM offerings from vendors are alike. An ideal solution delivers critical, real-time visibility into an organization’s software environments, enabling them to make better-informed decisions to manage risk.

SBOMs should be able to answer questions such as:

- Exactly where is a particular software package located?
- Which open-source dependencies, if any, does an application use?
- Which version of the software package is running?
- Do any other applications use the software package?
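As a sketch of how such questions become one-line queries once component data is machine-readable, consider the toy inventory below; the application names, versions, and paths are all invented, and a real system would run the same lookup over SPDX or CycloneDX documents collected across the fleet:

```python
# Toy, SBOM-like inventory: which components each application ships, at which
# version and filesystem location. All values are invented for illustration.
SBOMS = {
    "billing-service": [
        {"name": "log4j-core", "version": "2.14.1", "path": "/opt/billing/lib"},
        {"name": "openssl", "version": "3.0.5", "path": "/usr/lib"},
    ],
    "web-frontend": [
        {"name": "openssl", "version": "3.0.2", "path": "/usr/lib"},
    ],
}

def find_component(name):
    """Answer: which applications use this package, at which version and path?"""
    return [
        (app, c["version"], c["path"])
        for app, components in SBOMS.items()
        for c in components
        if c["name"] == name
    ]

# Both applications ship OpenSSL, at different versions.
print(find_component("openssl"))
```

When an advisory names an affected version range, this kind of lookup immediately tells a security team which applications and endpoints to remediate first.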

A key capability is the ability to understand every software component at runtime, and to uncover software packages and break them apart to examine all constituent components, without needing to engage the software vendor.

An SBOM solution should also be able to address any vulnerabilities or misconfigurations found in the various software components; take quick action to mitigate supply chain risk, even removing applications completely across affected endpoints; and optimize an organization’s investments in third-party tools by populating them with granular, accurate, and real-time SBOM data.

The takeaway 

Digital businesses today rely on software to support all kinds of processes. In fact, it’s difficult to imagine any company operating without applications. Keeping software secure and reliable is essential for success today.

With solutions such as SBOMs, security teams at organizations can be confident that they have a good handle on all the complexities inherent in the software world, and that they are keeping up on any flaws that need to be addressed to keep applications secure.

Learn how Tanium’s Converged Endpoint Management (XEM) platform can address SBOMs to give your organization real-time visibility—even in the most complex software environments.


The need for efficient software development has taken on greater importance as enterprises introduce more and more digital services and add automation capabilities to enhance business processes. Managing software projects might not be at the top of CIOs’ priority lists, but it is something that IT leaders will have to master.

There are plenty of challenges involved in managing software projects, and IT executives who learn how to address these hurdles can help their organizations build better applications to drive business growth and enhance customer experience.

Here are some of the more likely challenges IT leaders and teams face with software projects, and how they can address them.

Delivering on time and on budget

Completing software projects in a timely manner while staying within budget is a long-time challenge of software development. Any number of things can happen to cause delays and drive up costs.

One possible solution is to embrace the agile methodology of software development. Agile calls for collaboration among cross-functional teams as well as end users, and promotes adaptive planning, evolutionary development, continual improvement, flexibility in responding to changes in requirements and early delivery of products.

“Agile software development projects iterate the cycle of plan, do, check, adjust — and the end user or representative sponsoring [the project] is key in all these stages,” says Ola Chowning, a partner with global technology research and advisory firm ISG.

The waterfall method of gathering all the requirements, designing the entire software capability to meet all the requirements, building all the needed capabilities and reviewing and obtaining buy-in from end users is rarely used today, Chowning says. “This older method, by the way, is where on-time, on-budget challenges were the most onerous, because of the guessing game created when the software team had to estimate large bodies of work and assume some level of acceptance or rework by end users,” she says.

Agile enables the software team and end users to “collectively learn to plan better, work better and adjust more quickly, and outcomes become far more predictable as the way the team works becomes more predictable,” Chowning says. “On time and on budget are much easier to judge with the finger on the pulse of expectations of both the users and the developers.”

Creating and maintaining an agile culture

While adopting agile makes sense for software development at many organizations, it can come with hurdles. And many IT leaders who think their organizations have instituted agile practices fail to understand that what their teams are undertaking isn’t, in fact, agile.

“The intersection of agile software development practices and ‘traditional’ project management remains a challenge for many organizations,” Chowning says. “By now, you would think we would have cracked this nut, but it still seems to stymie many of our clients.”

Whereas software development is approached with a strictly agile way of working — repeated sprints, stories, and multiple releases that build up the end software product in an iterative manner — many organizations continue to struggle as they attempt to manage projects in a waterfall manner, Chowning says.

“This often begins during the project’s business case for funding, where we are typically asked to estimate the total outlay to be reported against in a traditional waterfall-phased framework of requirements, design, build, test and deploy,” Chowning says.

More mature organizations are turning to a project management approach that instead lays out the estimate of overall cost to overall value in a more steady approach across time, Chowning says. “Those who are using this agile project management approach are able to reap some of the real key benefits of agile, [but] it may require some adjustment of investment decisions or even financial practices in terms of project spend.”

A huge challenge for IT organizations is driving the agile model at the enterprise level, says Christian Kelly, managing director at technology consulting and services firm Accenture. “Agile at the team level is now widespread, but recent data suggest that it’s not going as well at the enterprise level, as most organizations struggle to connect strategies to the work their teams are doing,” he says.

This limits organizations’ ability to prioritize portfolios, plan for capacity, manage dependencies and connect goals to outcomes, Kelly says. “To deliver on the promise of agile, organizations need to implement the agile culture, systems, and best-in-class tools needed to better connect strategies to outcomes,” he says.

Aligning projects with overall organizational goals

“IT projects cannot be done in a bubble,” says Chetna Mahajan, chief digital and information officer at analytics platform provider Amplitude. “If your initiative is not aligned with business priorities, you are not set up for success from the outset and you will be swimming against the current at all times.”

To ensure business alignment and buy-in, all software projects should have a business executive sponsor, Mahajan says. When her previous company was implementing configure, price, quote (CPQ) software, the executive sponsors included Mahajan and the chief revenue officer.

“This provided us with an escalation channel for both business and technical decisions and deliverables,” Mahajan says. “It was no longer perceived as a technology initiative and it got the visibility and attention it needed across the company. We not only came in under budget and on time, but also were able to increase automation 30% and reduce sales cycle by a couple weeks.”

Most technology projects fail because they lack concrete key performance indicators (KPIs), Mahajan says. “I categorize project metrics mainly into two buckets, one that monitors project execution and the other that measures business outcome,” she says. “What we can’t measure we can’t improve. While it is important to stay the course on budget, scope, and timeline, we must keep a constant eye on the business KPIs.”

The KPIs for a project should be specific and linked to company goals. “This not only helps create a culture of accountability, but also allows for companies to validate their business case to inform future investment decisions,” Mahajan says.

Winning over stakeholders and sponsors

Culture is often a key challenge in the ability to manage software projects in an agile fashion, Chowning says, because sponsors and key stakeholders of the project need to be comfortable and willing to work in the manner most suited to agile.

“Many may still want, instead, to try to work in a more traditional manner—build all requirements, design the entire end state, and only then build and deploy the entire end state,” Chowning says. “This can present a dilemma, as the software development practice and the project management practice try to proceed in two completely different and disconnected approaches.”

Educating the sponsors and key stakeholders in an agile project management approach, and helping them adjust behaviors, is key to managing expectations and enabling software development to proceed in the most effective and efficient manner, Chowning says.

It’s important to engage user representation up front and then continually throughout the iterations of the software development, regardless of the methodology being used, Chowning says.

“Gone are the days when it is sufficient to talk to users up front, and then not engage them again until some mystical user acceptance testing towards the end of the project,” Chowning says. “Users, or user representation [should] be engaged in all aspects of the software development and designs. Small feature developments, prototypes, trials and showcases are all useful means of ensuring users are both engaged and feedback is obtained constantly.”

Need for new development — and management — skills

“One of the biggest challenges we face [is] how to ensure we are continuously providing a strong developer experience and managing the ongoing upskilling of our employees as technologies evolve,” says Amit Sharma, CTO at financial technology company Broadridge Financial Solutions and former CIO at financial services provider Western Union.

“This means creating [automated] solutions, providing a secure, stable environment to develop and test, and equipping our developers with a suite of tools that facilitates a simple, manageable experience and alleviates the overhead and burden of heavy administration,” Sharma says.

With the rapid pace of change in software development, companies need to train software engineers and others to adapt to new technologies, languages, and development processes. IT and product leaders need to acknowledge that there might be interruptions in projects because of the need to develop new skill sets, Sharma says. They also need to value the people involved in software development.

“It is critical that we recognize that our technology teams have built solutions over the course of many years, and as a result have become the subject matter experts not just of the system, but of the product as well,” Sharma says. “It is [vital] to bring them with us into the next generation of our product, no matter the technology it is founded on.”

In addition to the need for new developer skills, many IT leaders need to hone their own skills.

“Many IT leaders suffer from a massive talent gap in the ability to understand user needs, to create software roadmaps that meet business needs, to drive trade-offs against these roadmaps, and to move from process-based thinking to customer value and customer journey thinking,” Accenture’s Kelly says. “This is why concepts like value stream mapping, customer jobs/value propositions, and design thinking have become so important.”


By Bryan Kirschner, Vice President, Strategy at DataStax

In their 2020 book Competing in the Age of AI, Harvard Business School professors Marco Iansiti and Karim Lakhani make some bold predictions about the winning enterprises of the future.

These organizations, which they refer to as “AI factories,” build a “virtuous cycle between user engagement, data collection, algorithm design, prediction, and improvement,” unlocking new paths to growth as software moves to the core of the enterprise.

A little more than two years after the publication of their seminal work, data gathered from IT leaders and practitioners lend a lot of credence to Iansiti and Lakhani’s hypotheses — particularly those regarding the kind of technology architectures and strategies that engender success with AI.

The AI factory

Successful AI companies — think Apple, Netflix, Google, Uber, or FedEx — build innovative applications and, as they scale, start the flywheel of data, growth, and improvement spinning by gathering ever-growing amounts of real-time data, accessing it instantly, and tuning their predictions.

User experiences become more personal and intuitive; key decisions can be made nearly instantaneously; and predictions can occur in real-time, empowering a business to improve outcomes in the moment.

This unlocks new paths to growth: in the authors’ words, as AI factories “accumulate data by increasing scale (or even scope), the algorithms get better and the business creates greater value, something that enables more usage and thus the generation of even more data.”

For more traditional firms to achieve this kind of success requires a host of changes in both their operating models and technology profiles.

Open-source software and AI success

The State of the Data Race 2022 report is based on a survey of over 500 IT leaders and practitioners that delved into their organizations’ data strategies.

For the purpose of this analysis, responses were divided into three groups:

- those where both AI and ML are already in widespread deployment
- those where AI and ML are at most in the pilot phase or early days
- those in between these two extremes, characterized as being in “limited deployment”

The study assumed the organizations with AI/ML widely in production provide useful information about the evolving shape of the “AI factory” and looked for differences across the three stages of maturity.

Iansiti and Lakhani wrote that AI factories will evolve “from a focus on proprietary technologies and software to an emphasis on shared development and open source” because the competitive advantage they enjoy comes from data they accumulate — not the software they develop in-house.

The survey data backs this up in spades. A strong majority of each of the three AI/ML groups considers open-source software (OSS) at least “somewhat” important to their organization (73%, 96%, and 97%, respectively, ordered from “early days” to “wide deployment”).

But ratings of “very” important closely track AI/ML maturity: 84% of companies with AI/ML in wide deployment describe OSS this way (22% of “early days” organizations do, and this jumps to 46% of those with AI/ML in limited deployment).

Perhaps even more striking, organizations not using OSS are a tiny minority (1%, 1%, and 7%, ordered from “wide deployment” to “early days”). But a majority of those with AI/ML in wide deployment (55%) join companies like The Home Depot in having a company-wide mandate for use of OSS.

Real-time data and AI

Consider the AI leaders mentioned above. These companies have assembled technology infrastructures that enable instantaneous changes and decisions based on real-time feedback. Relying on day-old data and batch processing to update the routing of a package to ensure on-time delivery just doesn’t cut it at FedEx.

So, it isn’t surprising that Iansiti and Lakhani report that AI factories lean into real time. “The top enterprises … develop tailored customer experiences, mitigate the risk of customer churn, anticipate equipment failure, and enable all kinds of process decisions in real time,” they say.

Much like with OSS, findings from The State of the Data Race point to real-time data (and the technology architecture that enables it) as a matter of core strategy for the AI leaders. Its substantial use correlates with AI maturity: 81% of companies that have broadly deployed AI/ML say real-time data is a core strategy, compared with 48% of organizations with limited AI/ML deployment and 32% of companies in the early stages of AI/ML.

But among the advanced group, a full 61% say that leveraging real-time data is a strategic focus across their organization (four times that of organizations in the early days, and more than twice that of those with limited deployment). And 96% of today’s AI/ML leaders expect all or most of their apps to be real time within three years.

This makes sense: as an enterprise intentionally rewires its operations to make the most of AI/ML, it becomes especially important to eliminate any arbitrary architectural barriers to new use cases that require “speed at scale” anywhere in the business.

Today’s OSS as-a-service ecosystem makes that possible for everyone, freeing the future organization to make the most of its unique customer interactions and datasets.

Uniphore: A case study in real-time data, AI, and OSS

Uniphore helps its enterprise customers cultivate more fruitful relationships with their customers by applying AI to sales and customer service communications. The company relies on real-time data to quickly analyze and provide feedback to salespeople upon thousands of customer reactions during video calls.

“We have about fourteen different AI models we run in real time to coalesce the data into something meaningful for our clients,” says Saurabh Saxena, Uniphore’s head of technology and VP of engineering. “Any kind of latency is going to have a negative effect on the real time side.”

“Without the ability to process data in real-time, our solution really wouldn’t be possible,” he adds.

To get “the speed they need,” Uniphore relies on open-source Apache Cassandra® delivered as a service via Astra DB from DataStax (my employer). Its performance and reliability are key to ensuring Uniphore’s system is something every salesperson is motivated to rely on in order to be more effective in the moment.

But winning adoption among line staff points to another of Iansiti and Lakhani’s insights on the implications of AI for senior management. As the latter explained in a 2021 interview, “AI is good at predictions” — and predictions are “the guts of an organization.” Senior executives need to constantly ask, “Do I have data now to improve my prediction power — my accuracy, my speed?”

As Uniphore points out, sales forecast accuracy is something most sales leaders are concerned about. As a knock-on effect of using Uniphore’s tools, quantitative data on sentiment and engagement can flow into sales forecasts without the need for more staff time. In addition to the direct uplift that sellers experience, forecasts improve, freeing management to spend its time on more important things, like investing for growth, with greater confidence.

This closes the loop on Iansiti and Lakhani’s insight that AI factories can unlock a more powerful operating model over and above the benefits of individual use cases and point solutions.

Building an AI factory

Organizations that leaned into the insights in Competing in the Age of AI may have stolen a march on their competition. Judging from our survey data, they’ve been amply rewarded for doing so. The good news is that they’ve proven best practices for success — and the tools you need to accelerate your own progress on the journey to becoming an “AI factory” are ready and waiting.

Learn how DataStax enables AI-powered apps

About Bryan Kirschner:

Bryan is Vice President, Strategy at DataStax. For more than 20 years he has helped large organizations build and execute strategy when they are seeking new ways forward and a future materially different from their past. He specializes in removing fear, uncertainty, and doubt from strategic decision-making through empirical data and market sensing.


As companies shift their focus from the digital transformation of individual processes to the business outcomes enabled by a digitally transformed organisation, software engineering will become a core enterprise capability. To become a software-powered organisation, companies must first identify and address the concerns of their developers in areas such as developer experience, developer velocity, and software security.

According to the recent IDC InfoBrief “The Significance of Open Source Software in the Digital-First Future Enterprise”, open source software (OSS) is an important driver of enterprise digital innovation and provides greater agility, performance, and security compared to proprietary software. The research, conducted by International Data Corporation (IDC) and commissioned by SUSE, surveyed 838 respondents in 11 Asia/Pacific countries across a range of industries such as financial services and insurance, telecommunications, and government. It examines key trends, challenges, and priorities in DevOps and security solutions and the impact of OSS on developer productivity.

According to Linus Lai, Research Vice President, IDC, “In a digital-first world where every organisation is a software-driven business, our research shows that open source software plays a very significant role across the enterprise technology stack, driving innovation, improved customer experiences, and overall digital transformation”.

Key Research Findings

- OSS accounts for almost 70% of all software used in a typical Asia/Pacific enterprise to drive digital innovation.
- OSS improves developer satisfaction by addressing concerns specific to the business and technology environments developers operate in.
- The ability of OSS to address enterprise cybersecurity issues is unmatched due to robust vendor and community testing and expert software support from principal enterprise-grade OSS vendors.
- 61% of respondents rated the performance of OSS as being superior compared to proprietary software.

The research shows that although OSS is relevant across the breadth of the enterprise technology stack, its usage varies. For example, close to 60% of respondents are using OSS for Database and 53% for Operating Systems. When it comes to container-related technologies, only 30% are using OSS. This is primarily because enterprises in the Asia/Pacific region are, on average, earlier in their container adoption journey than in mature markets.

With the rise of microservices and cloud-native applications, new security challenges are arising for enterprise IT departments. This is forcing enterprises to consider innovative new ways to manage security at the container level and pay due consideration to these challenges when choosing a container security solution. Development teams are also under pressure to stay current with newer cloud-native technologies and deliver applications faster.

In today’s highly competitive IT environment, skilled developers are not a cheap resource and top talent is not readily available. Therefore, maximising the developer’s value to the organisation should be high on the agenda for the business to operate at peak efficiency. The research findings show that the use of OSS not only improves developer productivity, but also increases developers’ access to emerging technologies and open source innovations. OSS is also more robust and secure than proprietary software due to rigorous community reviews and shorter development cycles. This enables enterprise development teams to innovate quickly.

As enterprises progress along their digital journey, they will expand the use of OSS in new domains that are crucial to their success. 60% of respondents identified security as the top technology domain for use of OSS in the future. The list also includes newer and emerging technology domains such as AI/big data analytics, Container Management and Metaverse. In addition, the use of containers will be essential as hybrid cloud computing becomes the default enterprise operating model, and enterprises look to expand into new areas such as “the edge”.

To access the full IDC InfoBrief: The Significance of Open Source Software in the Digital-First Future Enterprise, please click here.

Learn more about SUSE here.  


Vishal Ghariwala is the Chief Technology Officer for the APJ and Greater China regions for SUSE, a global leader in true open source solutions. In this capacity, he engages with customer and partner executives across the region, and is responsible for growing SUSE’s mindshare by being the executive technical voice to the market, press, and analysts. He also has a global charter with the SUSE Office of the CTO to assess relevant industry, market and technology trends and identify opportunities aligned with the company’s strategy.

Prior to joining SUSE, Vishal was the Director for Cloud Native Applications at Red Hat where he led a team of senior technologists responsible for driving the growth and adoption of the Red Hat OpenShift, API Management, Integration and Business Automation portfolios across the Asia Pacific region.

Vishal has over 20 years of experience in the Software industry and holds a Bachelor’s Degree in Electrical and Electronic Engineering from the Nanyang Technological University in Singapore.

Vishal is here on LinkedIn: https://www.linkedin.com/in/vishalghariwala/


GPU manufacturer Nvidia is expanding its enterprise software offering with three new AI workflows for retailers it hopes will also drive sales of its hardware accelerators.

The workflows are built on Nvidia’s existing AI technology platform. One tracks shoppers and objects across multiple camera views as a building block for cashierless store systems; one aims to prevent ticket-switching fraud at self-service checkouts; and one is for building analytics dashboards from surveillance camera video.

Nvidia isn’t packaging these workflows as off-the-shelf applications, however. Instead, it will make them available for enterprises to integrate themselves, or to buy as part of larger systems developed by startups or third-party systems integrators.

“There are several of them out there, globally, that have successfully developed these kinds of solutions, but we’re making it easier for more software companies and also system integrators to build these kinds of solutions,” said Azita Martin, Nvidia’s VP of retail.

She expects that demand for the software will drive sales of edge computing products containing Nvidia’s accelerator chips, as latency issues mean the algorithms for cashierless and self-checkout systems need to be running close to the checkout and not in some distant data center.

In addition to tracking who is carrying what items out of the store, the multiple camera system can also recognize when items have been put back on the wrong shelf, directing staff to reshelve them so that other customers can find them and stock outages are avoided, she said.

“We’re seeing huge adoption of frictionless shopping in Asia-Pacific and Europe, driven by shortage of labor,” said Martin.

Nvidia will face competition from Amazon in the cashierless store market, though. While Amazon initially developed its Just Walk Out technology for use in its own Amazon Go and Amazon Fresh stores, it’s now offering it to third-party retailers, too. The first non-Amazon supermarket to use the company’s technology opened in Kansas City in December.

Assessing cost control

The tool to prevent ticket switching is intended to be integrated with camera-equipped self-service point-of-sale terminals, augmenting them with the ability to identify the product being scanned and verify it matches the barcode.
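As a rough sketch of the check this implies (the function, product labels, and catalog lookup below are hypothetical illustrations, not Nvidia's API), a ticket-switch flag compares the camera's product classification against the catalog entry for the scanned barcode:

```python
# Hypothetical ticket-switch check: flag a scan when the product the camera
# recognizes does not match the product the barcode claims to be.
def is_ticket_switch(camera_label: str, scanned_barcode: str,
                     barcode_catalog: dict) -> bool:
    """Return True when the recognized product differs from the barcode's."""
    expected = barcode_catalog.get(scanned_barcode)
    # Unknown barcodes are not flagged here; a real system would handle them.
    return expected is not None and expected != camera_label

catalog = {"012345": "tide-detergent", "067890": "ribeye-steak"}
# Camera sees steak, barcode says detergent: a suspected ticket switch.
print(is_ticket_switch("ribeye-steak", "012345", catalog))  # True
```

In a real deployment the camera label would come from an image classifier, and the comparison would likely use confidence thresholds rather than exact string equality.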

The cost of training the AI model to recognize these products went beyond the usual spending on computing capacity.

“We bought tens of thousands of dollars of products like steak and Tide and beer and razors, which are the most common items stolen, and we trained these algorithms,” said Martin.

Nvidia kept its grocery bill under control using its Omniverse simulation platform. “We didn’t buy every size of Tide and every packaging of beer,” she adds. “We took Omniverse and created synthetic data to train those algorithms even further for higher accuracy.”

Beer presents a particular challenge for the image recognition system, as it often sells in different-size multipacks or in special-edition packaging associated with events like the Super Bowl. However, the system continues to learn about new product formats and packaging from images captured at the checkout.

While implementation will be left up to retailers and their systems integrators, Martin suggested the tool might be used to lock up a point-of-sale terminal when ticket switching is suspected, summoning a member of staff to reset it and help the customer rescan their items.

Nvidia is touting high accuracy for its algorithms, but it remains to be seen how this will work out in deployment.

“These algorithms will deliver 98% accuracy in detecting theft and shutting down the point of sale and preventing it,” she said.

But that still leaves 2% of cases judged incorrectly. CIOs will want to monitor carefully how false positives, which lock up terminals and inconvenience honest customers, affect profitability and customer satisfaction.

A $100 billion problem

A 2022 survey by the National Retail Federation found that inventory shrink amounted to 1.44% of revenue — a relatively stable figure over the last decade — and in 2021, losses due to shrink totaled almost $100 billion, the NRF estimated.

Of that, survey respondents said 26% was due to process or control failures, 29% due to employee or internal theft, and 37% due to external theft.

But Nvidia suggests that its loss prevention technology could eliminate 30% of shrinkage. That would mean preventing roughly four-fifths of all external retail theft, even though that category includes not only ticket switching but also shoplifting and organized retail crime activities such as cargo theft and the use of stolen or cloned credit cards to obtain merchandise.
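The arithmetic behind that four-fifths figure can be checked directly from the NRF numbers cited above:

```python
# If external theft accounts for 37% of shrink and the technology removed
# 30 percentage points of total shrink, it would have to stop about
# four-fifths of all external theft. Figures from the NRF survey above.
external_theft_share = 0.37   # share of shrink attributed to external theft
claimed_reduction = 0.30      # Nvidia's suggested shrink elimination

fraction_of_external = claimed_reduction / external_theft_share
print(f"{fraction_of_external:.0%}")  # 81%
```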

Plus, potential gains must be weighed against the cost of deploying the technology, which, Martin says, “depends on the size of the store, the number of cameras and how many stores you deploy it to.”

More positively, Nvidia is also offering AI workflows that can process surveillance camera video feeds to generate a dashboard of retail analytics, including a heatmap of the most popular aisles and hour-by-hour trends in customer count and dwell time. “All of this is incredibly important in optimizing the merchandising, how the store is laid out, where the products go, and on what shelves to drive additional revenue,” Martin said.
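The kind of aggregation such a dashboard implies can be sketched in a few lines. The visit tuples and field names below are illustrative assumptions, not Nvidia's data model:

```python
# Roll per-visit detections (aisle, hour, dwell seconds) into hourly
# customer counts and average dwell time per aisle, the raw material for
# a heatmap of popular aisles and hour-by-hour trends.
from collections import defaultdict

def aisle_hourly_stats(visits):
    """visits: iterable of (aisle, hour, dwell_seconds) tuples."""
    totals = defaultdict(lambda: {"count": 0, "dwell_total": 0.0})
    for aisle, hour, dwell in visits:
        bucket = totals[(aisle, hour)]
        bucket["count"] += 1
        bucket["dwell_total"] += dwell
    # Convert running totals into count and average dwell per (aisle, hour).
    return {
        key: {"count": v["count"], "avg_dwell": v["dwell_total"] / v["count"]}
        for key, v in totals.items()
    }

demo = [("dairy", 9, 40.0), ("dairy", 9, 80.0), ("snacks", 10, 25.0)]
print(aisle_hourly_stats(demo)[("dairy", 9)])  # {'count': 2, 'avg_dwell': 60.0}
```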


By Andy Nallappan, Chief Technology Officer and Head of Software Business Operations, Broadcom Software

The information technology that enables scientific and commercial breakthroughs, from precision medicine to digital transformation, demonstrates tech’s boundless potential to improve our world. Yet, tech practitioners have long traded progress for increased complexity.

IT complexity, seen in spiraling IT infrastructure costs, multi-cloud frameworks that require larger teams of software engineers, the proliferation of data capture and analytics, and overlapping cybersecurity applications, is the hallmark—and also the bane—of the modern enterprise.

With a better understanding of IT complexity, large enterprises can partner with their strategic vendors to reduce that complexity and drive more innovation and business success.

Complexity Continues to Increase

Complexity exhausts IT budgets and workers—a widespread problem that worsened during the pandemic. Seventy percent of executives say that IT complexity has been a growing challenge for their organization over the past two years, according to Broadcom Software-sponsored research by Harvard Business Review Analytic Services. While 85% of executives state that reducing IT complexity is an organizational priority, only 27% said their company had managed it effectively.

Regarding complexity, David Linthicum, managing director and chief cloud strategy officer at Deloitte Consulting LLP, comments that over the last five years, people have been "migrating to the cloud and using more complex distributed deployments, such as multi-cloud, edge computing, IoT, and things like that."

The Harvard Business Review Analytic Services report Taming IT Complexity through Effective Strategies and Partnerships discusses the root causes of IT complexity and the penalties organizations pay for failing to undertake an effective complexity-reduction strategy. In the report, Linthicum describes multi-cloud as “the straw that broke the camel’s back” because rather than centralize all enterprise data in a single cloud, companies are expanding their investments in multiple areas and trying to stitch all the pieces together.

Seeking Solutions to IT Complexity    

Not all IT solutions work as intended. A recurring problem known as “legacy spaghetti” occurs when IT teams layer on new technology solutions “that are disparate, siloed, and unorganized,” according to the report.

Nearly two-thirds of executives in the study indicate that incompatible systems and technologies are the top factors fueling IT complexity. Not surprisingly, 67% of respondents said employees are frustrated or confused by persistent complexity, with three-in-five noting that it costs money and creates unnecessary additional work.

Though complexity has plagued data centers since the days of mainframes and punch cards, the study indicates that:

- 82% of executives view IT complexity as an impediment to success.
- 81% believe that reducing it creates a competitive advantage.
- 73% agree that IT complexity is an organizational expense.
- 36% of respondents say that reducing complexity in security creates more resilient systems—less vulnerable to security breaches.

At long last, the tide may be turning. According to the study, most organizations have developed specific strategies to reduce IT complexity and create operational efficiency. While there are many steps a company can take to minimize complexity, starting with simpler tools, enterprises can also partner with firms that specialize in complexity reduction to boost IT agility. In the study, three-in-five executives agree that working with a trusted partner is key to reducing complexity.

The report does point out, however, that not all complexity is bad. Innovation is the cornerstone of success for nearly all companies today. But innovation can require a certain level of complexity, meaning that ongoing innovation requires continual attention to the complexity it inevitably will create. “Executing innovation initiatives at scale involves risk-reward calculations all along the journey because there will be missteps and even failures. Innovation initiatives almost always require individuals with diverse expertise and experience, including from outside the organization, to collaborate and experiment together,” says Linda Hill, the Wallace Brett Donham Professor of Business Administration at the Harvard Business School, where she is also chair of the Leadership Initiative. “The more complex whatever you’re trying to do, by definition, the riskier it is.”

“A partner might have done this dozens of times with several other companies, and so we learn from that and ask, ‘What were lessons learned, and how can we accelerate quick-win improvements?’” said Jason Duigou, CIO of Indianapolis-based Medxcel, a healthcare facilities services company, in the Harvard Business Review Analytic Services report.

Nine Ways to Better Manage IT Complexity

The report also highlights nine practices to enable firms to improve the way they manage IT complexity:

1. Develop a common language around IT complexity—identifying a project’s complexity risk helps firms prioritize resources better.
2. Find a balance between innovation and complexity—too much customization creates burdens, adds costs, and spikes risk.
3. Take a modular approach to transformation—tackle new features incrementally to reduce stress on legacy systems.
4. Get the C-suite on board—the study indicates that not all senior executives understand the weight of the problem. Complexity reduction needs funding and executive support.
5. Retire redundant technologies—phasing out incompatible systems and redundant technologies is the most common complexity-reduction program, according to the study.
6. Connect systems and increase compatibility—API-first strategies can help reduce the complexity of connecting disparate applications and improve system interoperability.
7. Train employees—organizations that excel at reducing complexity tend to train their employees to promote better utilization of existing tools and systems.
8. Create feedback loops—mitigating complexity starts with listening to how customers and employees experience a service or product. Feedback can help eliminate unnecessary features that lead to unwanted complexity.
9. Develop metrics and measure progress, but see things through—measuring progress against objectives and appropriate peer groups can help improve IT complexity reduction efforts.

The report points out that there are no easy or inexpensive ways to reduce IT complexity. Companies that stand out as leaders in this effort tend to focus on limiting rather than trying to halt the constant march of complexity. Download the Harvard Business Review Analytic Services report now to learn more.

About Andy Nallappan:

Broadcom Software

Andy is the Chief Technology Officer and Head of Software Business Operations for Broadcom Software. He oversees the DevOps, SaaS Platform & Operations, and Marketing for the software business divisions within Broadcom.


Citing currency fluctuations, Microsoft is set to increase the prices of its on-premises software, online services, and Windows licenses in India by up to 11%.

The new prices that are expected to take effect from February 1, 2023, are meant to “harmonize” prices for Microsoft software and services between India and the Asian region, the company said, adding that it “periodically assesses the impact of its local pricing for software products and online services to ensure there is reasonable alignment across regions.”

The change will see India prices for commercial on-premises software rise by 4.5%, Microsoft said in a blog post. Prices for online services are set to increase by 9%, bringing these services close to prevailing dollar prices in the Asian region.

Come February, Windows licenses, with prices set to increase by 11%, will be the most affected.

Further, the company said that pricing for select services such as Microsoft 365 and Dynamics 365 for “direct customers” in India will start reflecting the new rates in February.

The price rise will not affect existing product orders for business users that are under price protection licensing agreements, Microsoft said.

“However, prices for new product additions under such licensing agreements and purchases under new contracts will be as defined by the pricelist at the time of order,” the company said.

This means that if an enterprise adds new services before February under Microsoft’s price protection program, it will not have to pay the increased prices.

Microsoft claims that despite the increase in prices, its customers in India “buying online services in Indian rupee will continue to find Microsoft cloud offerings highly competitive.”

The change in pricing does not cover Microsoft’s hardware products, such as Surface devices, or Office and Windows consumer products, the company said, adding that the price changes will also not directly affect reseller prices, as those continue to be determined by resellers themselves.

Microsoft, like its competitors, such as AWS, Google and Oracle, continues to face revenue slowdown in the wake of the pandemic, uncertain macroeconomic conditions, and geopolitical issues. The company recently reported its slowest growth in five years for the first quarter of its fiscal 2023 despite seeing revenue increase across business segments such as cloud, Dynamics 365 and Office 365.


By Milan Shetti, CEO Rocket Software

If you ask business leaders to name their company’s most valuable asset, most will say data. But while businesses recognize the value of data, few have the processes and tools in place to access its full potential. In our most recent Rocket survey, 46% of IT professionals indicate that at least half of their content is “dark data”—meaning it’s processed but never used.

A big reason for the proliferation of dark data is the amount of unstructured data within business operations. Research shows that more than 60% of today’s corporate data is unstructured, and a significant amount of it is in the form of non-traditional “records,” like text and social media messages, audio files, video, and images. These numbers are growing with the continuation of remote work and the continued adoption of collaborative cloud software. Organizations that have not evolved their data management processes to reflect this exponential growth in data are vulnerable to missed opportunities and non-compliance. Therefore, many have accelerated digital transformation efforts to modernize operations and ensure they are reaping their data’s potential value.

In August of 2022, Rocket Software surveyed more than 500 corporate IT and line of business professionals to better understand the current challenges content management teams face and the steps to overcome these obstacles. Let’s take a closer look at the key findings in Rocket’s 2022 Survey Report, Content Management: The Movement to Modernization, and how content management software is helping businesses take advantage of the vast amount of information while staying compliant in today’s fast-paced, complex markets.

The challenges of today’s content management

The transition to remote work and migration to open cloud systems has allowed businesses to provide employees with more flexible working schedules and introduce cutting-edge technology to operations. However, remote work has led to employees communicating and sharing information on personal devices, which has increased the amount of valuable business content scattered across shared and personal drives. Scattered and hard-to-track data — also known as data sprawl — has posed significant threats to companies’ ability to govern their content and has left them susceptible to regulatory infractions and missed opportunities. Our survey found that 37% of IT professionals believe employees saving content on shared and personal drives presents the greatest challenge to content management.

Along with data sprawl, IT professionals have grown concerned over the amount of redundant, obsolete, or trivial (ROT) data building up in companies’ systems. In fact, over a third of those surveyed reported ROT as the greatest threat to data management. Teams’ inability to properly locate, process, and remove excess or unnecessary data has led to data overload, higher costs, data discrepancies, and increased risk of data corruption and non-compliance.
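As a hedged illustration of one common ROT-cleanup step (the retention window and file-based approach are assumptions for the sketch, not a specific vendor's method), redundant and long-untouched files can be flagged by content hash and modification time:

```python
# Flag files that are byte-for-byte duplicates (by SHA-256 hash) or that
# have not been modified within a retention window.
import hashlib
import os
import tempfile
import time

def flag_rot(paths, max_age_days=3 * 365):
    """Return (duplicates, stale): redundant copies and long-untouched files."""
    seen, duplicates, stale = {}, [], []
    cutoff = time.time() - max_age_days * 86400
    for path in paths:
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        if digest in seen:
            duplicates.append(path)   # byte-for-byte copy of an earlier file
        else:
            seen[digest] = path
        if os.path.getmtime(path) < cutoff:
            stale.append(path)        # untouched past the retention window
    return duplicates, stale

# Tiny demo: two identical files and one unique file.
tmp = tempfile.mkdtemp()
paths = []
for name, data in [("a.txt", b"report"), ("b.txt", b"report"), ("c.txt", b"memo")]:
    p = os.path.join(tmp, name)
    with open(p, "wb") as f:
        f.write(data)
    paths.append(p)

dupes, stale = flag_rot(paths)
print(len(dupes))  # 1: b.txt duplicates a.txt
```

Real content management suites apply far richer signals (access logs, record classifications, legal holds), but hashing and age checks like these are a common first pass.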

While modernization may be the answer to data sprawl and ROT, IT professionals say that disruptions caused by the movement to cloud systems have also significantly hindered employees’ ability to manage data. Thirty-six percent of respondents consider cloud migration the greatest challenge to content governance moving forward.

While cloud migration may pose short-term challenges, unstructured and unmanaged data is a greater risk to operations, as it will inevitably affect productivity, compliance, and a company’s overall success. Teams need the visibility and resources provided by cloud technologies to keep up with and eliminate data sprawl and ROT. Businesses that fail to build innovative tools into their content processes put operations at a significant disadvantage.

Taking advantage of unstructured data with content management software

Nearly one-fifth of professionals admitted that all of their organization’s data is unstructured. While most unstructured data goes unused, 59% of IT professionals say their company’s unstructured data and content is extremely or very significant. This revelation should leave business leaders with the fear of missing out (FOMO) when it comes to their unprocessed and underutilized content.

To get the most out of their data, businesses have begun introducing integrative content tools into operations. Integrating content software with popular collaboration and communication tools, like Microsoft 365 and Microsoft Teams, provides content teams with a unified view of all their company’s data from a single platform. This enables data professionals to easily monitor, process, store, and remove all their companies’ data while helping organizations get the most out of their unstructured data.

The need to automate content governance

The introduction of complex data privacy regulations globally has made data governance more challenging. In fact, only 33% of IT professionals feel their organization’s unstructured data is adequately governed. This lack of accountability leaves businesses vulnerable to regulatory infractions, substantial fines, and potential criminal prosecution. Many IT professionals are pushing for automated data governance processes, with over three-quarters believing their company would gain a significant competitive advantage if information security and compliance operations were automated.

While innovative content management technologies that automate and streamline content governance exist, they are still unavailable in many of today’s business operations due to ongoing misconceptions about the limitations of mainframe technology. However, by utilizing the right combination of content modernization solutions, like Rocket Software’s Mobius Content Services suite, teams can bring the power of automation and innovative practices to content operations along any infrastructure. 

The role of content management software moving forward

While outdated processes and a lack of automation continue to hinder content management efforts, it appears that businesses are finally recognizing their importance. Over the next 18 months, about one-third of IT professionals say their companies plan to continue modernization efforts by introducing more artificial intelligence, reporting and analytics tools, and integrative content software into operations.


Moving forward, businesses will continue to deal with stringent compliance regulations, the growing adoption of remote work, and increasing customer demands for faster service. Organizations must implement solutions into their operations that enable them to govern and leverage all of their content to gain a competitive advantage. By utilizing content management software, like Rocket Software’s Mobius Content Services, businesses can provide IT professionals with an unparalleled overview of a company’s entire content system and reduce human activity with automation capabilities. This will enable companies to ensure compliance while also taking data operations and utilization to the next level.


By Milan Shetti, CEO Rocket Software

In today’s fast-paced digital business world, organizations have become highly adaptive and agile to keep up with the ever-evolving demands of consumers and the market. This has pushed many organizations to accelerate their digital transformation efforts in order to remain competitive and better serve their constituents — and there is no sign of slowing down. Statista estimates that global investment in digital transformation is expected to increase significantly between 2022 and 2025, from $1.8 trillion to $2.8 trillion. While a recent Rocket survey on the state of the mainframe showed that the mainframe — due to its reliability and superior security — is here to stay, many organizations are moving to hybrid infrastructure with a “cloud-first approach” to operations. A key component to the appeal of cloud-first business models is cloud technologies’ ability to simplify processes and streamline workflows through integration and automation.

This is especially true for content management operations looking to navigate the complexities of data compliance while getting the most from their data. According to IBM, every day people create an estimated 2.5 quintillion bytes of data (that’s 2.5 followed by 18 zeros!). IT professionals tasked with managing, storing, and governing the vast amount of incoming information need help. Content management solutions can simplify data governance and provide the tools needed to simplify data migration and facilitate a cloud-first approach to content management.

Let’s take a closer look at the essential features cloud-first businesses should look for in content management software.

Enhanced content-rich automation

Data analysts looking to streamline content processes need content-rich automation software that allows them to easily design and deploy workflows, integrate processes, and customize applications. The best modern content solutions leverage low-code/no-code process and presentation services to streamline the construction of business applications and provide a secure and collaborative platform for execution. This gives companies the ability to quickly adapt software and processes and implement innovative methodologies — like DevOps and Continuous Integration/Continuous Development (CI/CD) testing — to continually improve operations, bring products and services to market faster, and develop better customer outcomes.

Expanded collaboration support

Digital transformation brings about a lot of change — in technology, processes, communication channels, and so on. To minimize business disruptions and avoid misunderstandings or important information being overlooked, it is critical for teams to maintain healthy communication and collaboration throughout transformation. Nothing can hinder digital team collaboration more than a lack of connectivity. As much as content management teams need to stay connected in order to maintain data integrity and compliance, so too does their content software. Teams need a highly integrative content management technology that can connect them across third-party vendors and popular communication tools (Slack, Microsoft Teams) and management tools (Microsoft SharePoint and 365) to centralize internal communications and shared information.

Extended cloud governance

The move to cloud-first operations brings both positives and negatives for content management teams. Implementing a cloud data management operation provides teams with unparalleled data availability, mobility, and visibility. However, cloud applications are less secure than mainframe environments and increase exposure to data breaches. To combat these cloud-based challenges, businesses must look for content management solutions that support immutable cloud storage technologies, like AWS Object Lock, which allows users to store data using a write-once-read-many (WORM) model that mitigates tampering by disabling the ability to edit content once it is stored.
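As a minimal sketch of how such a WORM write might be parameterized (the bucket, key, and retention period are illustrative; the dict matches the Object Lock parameters that boto3's `put_object` accepts, and the target bucket must have Object Lock enabled):

```python
# Build put_object parameters for a compliance-mode (WORM) S3 write.
from datetime import datetime, timedelta, timezone

def worm_put_params(bucket, key, retain_days, now=None):
    """Return parameters for a write that cannot be edited or deleted
    until the retention date passes. Pass this dict (plus Body) to
    boto3's s3.put_object()."""
    now = now or datetime.now(timezone.utc)
    return {
        "Bucket": bucket,
        "Key": key,
        "ObjectLockMode": "COMPLIANCE",  # strictest WORM mode: no one,
                                         # not even root, can shorten it
        "ObjectLockRetainUntilDate": now + timedelta(days=retain_days),
    }

# Hypothetical example: retain a contract record for about seven years.
params = worm_put_params("records-bucket", "contracts/2023-001.pdf", 2555)
print(params["ObjectLockMode"])  # COMPLIANCE
```

AWS also offers a softer GOVERNANCE mode, which privileged users can override; compliance requirements usually dictate which mode applies.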

Modernized infrastructure deployment

Manual data migration can be a heavy lift for content management teams. Each piece of valuable information must be manually pulled, copied, reformatted, and moved to the new cloud system. While these tasks are not difficult, they are tedious and vulnerable to mistakes that can delay operations or jeopardize valuable information. They also pull employees away from more important content governance tasks, which can leave an organization vulnerable to missed opportunities and regulatory infractions. Organizations need content management tools to facilitate migration efforts, streamline processes, and mitigate business disruptions. Teams need software that can automate the tedious manual processes involved in deploying, managing, and scaling containerized applications while maintaining the integrity and security of essential documents throughout cloud migration. By eliminating the potential for human error, such tools free employees to focus on more business-critical content management tasks.

Organizations looking to optimize their content management operations throughout data migration must leverage content management technology. Tools like Rocket Software’s Mobius Content Services Suite of technologies deliver the agility and adaptability needed to make the most of your content while maintaining compliance. Mobius Content Services provides content-rich automation, modernization deployment and connectivity to streamline processes, facilitate collaboration, and support a business’s transition to a cloud-first approach.

To learn more about Rocket Software’s Mobius Content Services Suite, click here.


By Andy Nallappan, Chief Technology Officer and Head of Software Business Operations, Broadcom Software

Last month at Gartner Symposium in Orlando, Fla., I enjoyed talking with ZDNet’s Chris Preimesberger and Sahana Sarma, leader of Google Cloud’s transformation advisory, about the enterprise software landscape and how it is growing more complex and business-critical daily. Transforming and modernizing software are key priorities for global organizations and critical to achieving the highest level of security and compliance. Many industries, from manufacturing to automotive to financial services, are becoming increasingly software-driven, changing their traditional portfolio mix and business models.

Here are some key points from that discussion:

Investing in R&D

At Broadcom, we see some of the same challenges our customers face that lead them to transform their software: business transformation, talent risks, and managing costs. In a hypercompetitive market where startups are challenging established enterprises, companies are looking to pivot their business models through digital transformation. For Broadcom, which is experiencing some of the same challenges, it’s interesting to note that our R&D spend has outpaced our revenue by almost 50%. That tells you quite a bit about how we approach our business and the importance we place on innovation.

Broadcom wasn’t originally a software company – it started in the early 1960s as the processor-making division of HP – but we got into the software business when we acquired CA Technologies and the Symantec Enterprise business. They were not modern cloud-based software companies and their portfolios were a mix of traditional on-prem software, cloud services, and some cloud-native.

One of the biggest challenges Broadcom faced was standardizing the platforms and processes of our acquired software companies, so Broadcom Software worked closely with Google Cloud on this transformation journey, leading us to modernize and transform our own enterprise software. We wanted to standardize all of our software on a single, cloud-native platform where it could give the biggest benefits to customers and could scale, stay resilient, and remain secure.

Continuous Innovation

With our Google Cloud journey, we’ve brought all of our software platforms onto a single SaaS platform. One of the reasons our customers want SaaS applications is that they want to see innovation happening at a faster rate. If you’re using a traditional on-prem application, you have to do upgrades and reinstalls, and it takes years to get it completed.

So as a software provider, we’ve got to deliver those SaaS apps in that space and the new features that go with them, and we need to do them in a speedy DevOps manner. Going to the cloud and modernizing (these systems) enables our developers to deliver all of this to the expectations of our customers in order to help them transform their businesses.

At Broadcom, we worked to give each of our software divisions a single pane of glass to better manage the business and track what was happening from the sales motion to customer adoption to R&D spend.  We also centralized software operations so the engineers could focus on delivering technologies that solved big, complex problems for our customers — in a way we liberated them to focus on great innovations and stronger customer experiences.

Sahana shared that when Google Cloud’s customers transform their software, they get to introduce more modern practices, like containerization, into their technology stack. By having a more modern software stack, you can easily add in newer technologies — which introduces innovations. And by developing, adopting, and promoting open-source technologies, Google Cloud ensures the neutrality of the cloud and helps safeguard investments. Finally, by using Google Cloud as the backbone, you can free up software engineers to focus on new technologies and new ideas, since they are not bogged down by the complexities of different architectures and platforms.

Exceptional Experiences

Transforming and modernizing your software stack can help customers deliver better experiences for their customers and employees, including:

- Always available and auto-scalable: Clients deliver an improved experience to their end customers with an always-available and auto-scalable technology stack, meeting customer demands with unparalleled responsiveness by leveraging our fastest and safest global network.
- Modernizing applications: Using solutions like Apigee and Anthos, customers are unlocking their traditional systems to leverage the flexibility and agility of the cloud. If systems are unified, end users can get what they need done in fewer steps, which improves the overall experience they have with the product or brand.
- AI & ML: A modernized technology stack allows customers to leverage AI and ML tools, making it easier to anticipate the needs of their end users and deliver a better experience, besides significantly improving operational efficiencies.

Reducing IT Complexity

Reducing IT complexity helps customers see a large range of benefits when they modernize, such as faster delivery of products, improved compliance, and a higher level of security. The more complex your IT architecture is, with different platforms or silos, the more risk you introduce. By modernizing and having an open system, you can reduce IT complexity and therefore reduce risk. Most importantly, a modernized technology stack helps you quickly adapt and respond to market, economic, and customer demands. By transforming and modernizing, you can lessen IT complexity and not only lower risk but deliver more success for your business and your customers.

To learn more about how Broadcom Software can help you modernize, optimize, and protect your enterprise, contact us here.

