If you believe the hype, generative AI has the potential to transform how we work and play with digital technologies.

Today’s eye-popping text- and image-generating AI captures most of the limelight, but this newfangled automation is also coming to software development.

It is too soon to say what impact this emerging class of code-generating AI will have on the digital world. Descriptors ranging from “significant” to “profound” are regularly tossed around.

What we do know: IT must take a more proactive role in supporting software developers as they begin to experiment with these emergent technologies.

Generative AI Could Change the Game

Many generative AI coding tools have come to the fore, but perhaps none possesses more pedigree than Copilot, developed by GitHub, Microsoft’s code-hosting and project management hub.

A type of virtual assistant, Copilot uses machine learning to recommend the next line of code a programmer might write. Just as OpenAI’s ChatGPT gathers reams of text from large corpora of web content, Copilot draws its insights from the large body of software projects stored on GitHub.
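
Copilot itself is a large neural model trained on public code, but the core idea, predicting a plausible next line from patterns seen before, can be sketched in miniature. The corpus, function names, and matching heuristic below are invented purely for illustration; they are not how Copilot actually works.

```python
# Toy illustration of next-line code suggestion (NOT Copilot's actual model):
# score candidate completions by token overlap between the programmer's
# prompt and prompts seen in a tiny "training corpus".

CORPUS = {
    "def add(a, b):": "return a + b",
    "def is_even(n):": "return n % 2 == 0",
    "for item in items:": "print(item)",
}

def suggest_next_line(prompt):
    """Return the completion whose prompt shares the most tokens with ours."""
    tokens = set(prompt.replace("(", " ").replace(")", " ").split())
    best, best_score = None, 0
    for seen_prompt, completion in CORPUS.items():
        seen = set(seen_prompt.replace("(", " ").replace(")", " ").split())
        score = len(tokens & seen)
        if score > best_score:
            best, best_score = completion, score
    return best

print(suggest_next_line("def add(x, y):"))  # closest toy match: "return a + b"
```

A real assistant replaces the overlap heuristic with a model trained on billions of lines of code, but the interaction is the same: context in, suggested line out.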

Although it’s early days for such tools, developers are excited about Copilot’s potential for enhanced workflows, productivity gained and time saved. Empirical and anecdotal evidence suggests it can shave anywhere from 10% to 55% off coding time, depending on whom you listen to.

Today Copilot is targeted at professional programmers who have mastered GitHub and committed countless hours to creating and poring over code. Yet it’s quite possible that Copilot and other tools like it will follow the money and migrate downstream to accommodate so-called citizen developers.

DIY AI, for Non-Coders

Typically sitting in a business function such as sales or marketing, citizen developers (cit-devs) are non-professional programmers who use low-code or no-code software to build field service, marketing, and analytics apps through drag-and-drop interfaces rather than via the rigors of traditional hand-coding.

If the low-code/no-code evolution has come to your company, you may have marveled at how this capability freed your staff to focus on other tasks, even as you helped these erstwhile developers color within the governance lines.

Considering their all-around efficacy, self-service, do-it-yourself tools are in demand: The market for low-code and no-code platforms is poised to top $27 billion in 2023, according to Gartner.

Now imagine what organizations will pony up for similar tools that harness AI to strap rocket boosters onto software development for non-techie coders. In the interest of catering to these staff, GitHub, OpenAI and others will likely create versions of their coding assistants that streamline development for cit-devs. GitHub, for example, is adding voice and chat interfaces to simplify its UX even more.

It’s not hard to imagine where it goes from there. Just as the API economy fostered new ecosystems of software interoperability, generative AI plugins will facilitate more intelligent information services for big brands. Already OpenAI plugins are connecting ChatGPT to third-party applications, enabling the conversational AI to interact with APIs defined by developers.
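
The plugin pattern is straightforward to sketch: a developer describes an API in machine-readable form, and the conversational layer routes a user’s request to it. Everything in the snippet below, the registry, the `track_order` intent, and the handler, is a hypothetical stand-in, not OpenAI’s actual plugin schema.

```python
# Minimal sketch of the plugin pattern: developers register API handlers
# with descriptions, and an assistant dispatches user intents to them.

PLUGINS = {}

def register_plugin(name, description, handler):
    """A developer makes an API surface available to the assistant."""
    PLUGINS[name] = {"description": description, "handler": handler}

def dispatch(intent, **params):
    """Route a resolved user intent to the matching plugin's handler."""
    plugin = PLUGINS.get(intent)
    if plugin is None:
        raise KeyError(f"no plugin registered for {intent!r}")
    return plugin["handler"](**params)

# A brand's developer defines the API...
register_plugin(
    "track_order",
    "Look up shipping status for an order number.",
    lambda order_id: {"order_id": order_id, "status": "in transit"},
)

# ...and the conversational layer calls it on the user's behalf.
print(dispatch("track_order", order_id="A1001"))
```

In a production plugin the handler would be an HTTP call to the brand’s service, described by an OpenAPI document the model reads to decide when and how to call it.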

One imagines this AI-styled plug-and-play will broaden the potential for developers, both of the casual and professional persuasion. Workers will copilot coding tasks alongside generative AI, ideally enhancing their workflows. This emerging class of content creation tools will foster exciting use cases and innovation while affording your development teams more options for how they execute their work. It also means development will continue to become more decentralized, moving outside the realm of IT.

Keep an Open Mind for the Future

The coming convergence of generative AI and software development will have broad implications and pose new challenges for your IT organization.

As an IT leader, you will have to strike a balance between your human coders—be they professionals or cit-devs—and their digital coworkers to ensure optimal productivity. You must provide your staff with the guidance and guardrails that are typical of organizations adopting new and experimental AI.

Use good judgment. Don’t enter proprietary or otherwise sensitive corporate information and assets into these tools.

Make sure the output aligns with the input, which requires an understanding of what you hope to achieve. This step, aimed at pro programmers versed in garbage-in/garbage-out pitfalls, will help catch some of the problems associated with new technologies.

When in doubt, give IT a shout.

Or however you choose to lay down the law on responsible AI use. Regardless of your stance, the rise of generative AI underscores how software is poised for its biggest evolution since the digital Wild West known as Web 2.0.

No one knows what the generative AI landscape will look like a few months from now, let alone how it will impact businesses worldwide.

Is your IT house in order? Are you prepared to shepherd your organization through this exciting but uncertain future?

Learn more about our Dell Technologies APEX portfolio of cloud experiences, which affords developers more options for how and where to run workloads while meeting corporate safeguards: Dell Technologies APEX

Technology innovation is happening at breakneck speed, creating new opportunities and threats for companies of all sizes and industries. At the same time, ever-evolving macroeconomic conditions are pressuring leaders to drive business outcomes against tighter margins.

While today’s business climate certainly feels like a test of survival of the fittest, your goal should not be just to survive, but to thrive. In this new reality, where a bank can go under in two weeks and new innovations such as ChatGPT are fundamentally changing our ways of life, the speed with which businesses are able to respond to changing market dynamics and customer needs is critical.

In a world where companies are defined by the digital services they can deliver, software agility IS business agility, and in turn results in better business outcomes.

Software agility translates to teams being able to quickly build and deploy applications in order to address changing customer needs, create new revenue streams, and scale to meet demand. Software agility also means that if the organization would like to leverage a new technology to innovate and remain competitive, teams can easily deliver an app that does just that.

However, as organizations rush to adopt cloud computing, Kubernetes, and open source software, they are faced with a sprawling landscape of tools and services that can be overwhelming and costly. And while it’s inspiring to behold the innovation happening around us – or, as Stephen O’Grady of RedMonk called it, “an embarrassment of riches” – developers can find themselves suffering from “analysis paralysis” and, even worse, having to stitch their favorite tools and services together, taking time away from writing code and delivering the applications that run the business and delight end users.

Instead of tools serving to enhance productivity, today’s developers are caught in a tangled web of tools and services just to support their day-to-day. Additionally, this has added to the complexity of managing apps and infrastructure in a multi-cloud environment, where teams have to control costs, ensure performance, and manage consistent security policies across these diverse and distributed environments. Businesses with a multi-cloud environment also face a growing number of IT silos that can lead to inefficiencies and increased risk. And, as is increasingly common in today’s tightening labor market, a lack of skilled professionals adds yet another layer of difficulty. The complexity of today’s digital systems has multiple points of friction that need to be addressed if we are to achieve true business agility.

Organizations need a new approach: establish a common platform, supported by a platform engineering function, to unify application delivery and management across different apps and clouds. This will help optimize application development and operations and strengthen security and compliance. What we are finding is that, for developers to be successful and productive, they need: (1) a common data platform; (2) a seamless, self-service experience with minimal hand-offs and a clear separation of concerns; and (3) strong knowledge of open source technologies.

With our comprehensive Tanzu and Aria platforms, VMware has become a trusted partner to help organizations become more agile and responsive in software development. Delivering accessible, developer-friendly experiences and IT operational management is core to our DNA. With the depth and breadth of our technologies, alongside the consulting expertise of Tanzu Labs, we help organizations transform their methodologies and practices as well. We deliver near real-time insights across the application lifecycle with simplified, streamlined, and powerful solutions for developers, platform engineering and cloud operations teams to seamlessly provision, monitor, secure, and optimize apps across multiple clouds and their entire app portfolio.

We help customers across nearly every industry – from retail and healthcare to financial services and the public sector – build a culture that empowers developer productivity to enable the delivery of better, more secure software so they can meet changing market dynamics and customer expectations.

When market conditions change, how that change impacts your business – whether positively or negatively – depends on how fast you can move and adapt. At a time when organizations are pressured to perform and achieve business outcomes more efficiently than ever, now is the time to accelerate app delivery and digital transformation efforts. The best defense is an optimized and more productive offense. Change is inevitable, as we’ve learned in the last few years. So now and in the months and years ahead, it’s imperative for organizations to continue on their journeys to be future-ready for the next disruption.

To learn more, visit us here.

Strong performances in software and consulting helped IBM’s profit and revenue increase in the first quarter, even as a post-pandemic slowdown hit much of the technology industry.

IBM’s software and consulting revenue both rose 3% year over year. In the software segment, IBM’s enterprise Linux unit, Red Hat, saw growth of 8%, while application operations saw the highest level of growth in the consulting segment, rising by 7%.

The strong showing from software and consulting helped boost total profit by a healthy 26%, to $927 million, and revenue by 0.4%, to $14.3 billion, for the quarter ending March 31, according to IBM’s earnings report, issued late Thursday. The strong dollar had a negative effect on sales — in constant currency, stripping out the effect of currency fluctuations, IBM revenue rose by 4%.

IBM now receives about three-quarters of its annual revenue from tech services, but the company is seeing some deceleration in consulting from the previous robust growth levels, chairman and CEO Arvind Krishna said on a call with analysts after the results had been published.

“More recently, clients are prioritizing digital transformation projects that focus on cost takeout, productivity and quick returns,” he said.

This trend was further reflected in IBM’s infrastructure segment, which was down 3.7% year on year; within the Hybrid Infrastructure business, however, z Systems revenue was up 7%. That growth was driven by the z16’s fourth-quarter availability, a cycle that has outpaced prior ones as customers leverage the platform for AI at scale and energy efficiency, Krishna told analysts.

Focus on artificial intelligence

Commenting on IBM’s ongoing focus, Krishna said that hybrid cloud and artificial intelligence would be a priority for driving both business outcomes and innovation this year.

He noted that while AI is projected to add $16 trillion to the global economy by 2030, its use case within the enterprise differs widely from the AI being offered to consumers, given its need for more accurate results, trusted data and governance tools.

“We are seeing a lot more interest from business in using AI to boost productivity and reduce costs,” Krishna said. “Productivity gains will come from enterprises turning their workflows into simpler automated processes with AI.”

Krishna said IBM will be focused on using AI to help organizations enhance internal audit and compliance processes and automate call center responses to improve accuracy and customer satisfaction.

GlobalFoundries lawsuit

On the same day IBM posted its first quarter 2023 financial results, chip manufacturer GlobalFoundries announced it was launching a lawsuit against the company, accusing it of unlawfully sharing confidential intellectual property and trade secrets with Japanese chip maker Rapidus.

IBM announced in 2021 that it had developed a 2nm chip, and then late last year unveiled a partnership with Rapidus calling for commercial production of 2nm chips, with manufacturing done in Japan. Chips made with the 2nm manufacturing process will be used for a wide range of applications and machines, from laptops to high performance computing servers, and are expected to slash the carbon footprint of data centers due to optimized performance.

The GlobalFoundries complaint asserts that IBM unlawfully disclosed its confidential IP and trade secrets after IBM sold its microelectronics business in 2015.

The company also alleges that IBM has been undertaking “unlawful recruitment efforts” by poaching GlobalFoundries engineering staff, a practice that it alleges has accelerated since the Rapidus partnership was announced in December 2022.

GlobalFoundries has asked the court to order an end to those recruitment efforts and is seeking compensatory and punitive damages as well as an injunction against IBM to stop the “unlawful disclosure and use” of GlobalFoundries’ trade secrets.

IBM has denied the allegations, which GlobalFoundries made in a complaint filed in federal court in the Southern District of New York.

Ensuring strong software security and integrity has never been more important because software drives the modern digital business. High-profile vulnerabilities discovered over the past few years, with the potential to lead to attacks against organizations using the software, have hammered home the need to be vigilant about vulnerability management.

Perhaps the most dramatic recent example was the zero-day vulnerability discovered in Apache’s popular open-source Log4j logging library. The logging utility is used by millions of Java applications, and the underlying flaw—called Log4Shell—can be exploited relatively easily to enable remote code execution on a compromised machine. The impact of the vulnerability was felt worldwide, and security teams had to scramble to find and mitigate the issue.
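
After Log4Shell was disclosed, one common first-pass triage was simply sweeping filesystems for log4j-core JARs with affected version numbers. The sketch below illustrates that approach; it conservatively flags everything before the 2.17.1 fix line, and because it matches only filenames, it will miss shaded or renamed copies, so treat it as triage, not a complete audit.

```python
# First-pass sweep for vulnerable log4j-core JARs by filename.
# Conservative: flags all 2.x releases prior to 2.17.1, since the 2.15/2.16
# fixes themselves had follow-on CVEs. Filename matching misses shaded JARs.

import re
from pathlib import Path

VULN_PATTERN = re.compile(r"log4j-core-(\d+)\.(\d+)\.(\d+)\.jar$")

def is_vulnerable(filename):
    """True for log4j-core 2.x JARs older than the 2.17.1 fix line."""
    m = VULN_PATTERN.search(filename)
    if not m:
        return False
    version = tuple(int(p) for p in m.groups())
    return (2, 0, 0) <= version < (2, 17, 1)

def scan(root):
    """Walk a directory tree and list suspect JAR paths."""
    return [str(p) for p in Path(root).rglob("*.jar") if is_vulnerable(p.name)]

print(is_vulnerable("log4j-core-2.14.1.jar"))  # True: affected release
print(is_vulnerable("log4j-core-2.17.1.jar"))  # False: patched release
```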

In November 2022, open-source toolkit developers announced two high-severity vulnerabilities affecting all OpenSSL versions from 3.0.0 through 3.0.6. OpenSSL is a toolkit supporting secure communications in web servers and applications. As such, it’s a key component of the Transport Layer Security (TLS) protocol, which ensures that data sent over the internet is secure.
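
As a small practical aside, Python’s standard `ssl` module reports which OpenSSL build the current runtime links against, which makes for a handy first check when an advisory like this one lands:

```python
# Report the OpenSSL (or compatible) library linked into this Python runtime.
# Useful triage when an OpenSSL advisory is published.

import ssl

print(ssl.OPENSSL_VERSION)  # e.g. "OpenSSL 3.0.2 15 Mar 2022"
```

This only covers the interpreter’s own linkage; other services on the host may bundle their own OpenSSL copies, which is exactly the visibility gap SBOMs address.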

SBOMs as a solution

One of the most effective tools for finding and addressing such vulnerabilities and keeping software secure is the software bill of materials (SBOM). SBOMs are formal, machine-readable records that contain the details and supply chain relationships and licenses of all the different components used to create a particular software product. They are designed to be shared across organizations to provide transparency of the software components provided by different players in the supply chain.

Many software providers build their applications by relying on open-source and commercial software components. An SBOM enumerates these components, creating a “recipe” for how the software was created.
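
A minimal sketch of that “recipe,” loosely modeled on formats such as CycloneDX and SPDX, can make the idea concrete. The field names and components below are simplified for illustration and do not follow any real schema exactly:

```python
# A pared-down, illustrative SBOM: each entry records a component, its
# version, and its license, so a consumer can see the software's "recipe"
# without access to the source. Real SBOMs (CycloneDX, SPDX) carry more
# fields, such as hashes and supplier identity.

sbom = {
    "product": "example-web-app",
    "version": "1.4.0",
    "components": [
        {"name": "openssl", "version": "3.0.5", "license": "Apache-2.0"},
        {"name": "log4j-core", "version": "2.17.1", "license": "Apache-2.0"},
        {"name": "left-pad", "version": "1.3.0", "license": "MIT"},
    ],
}

def component_names(doc):
    """List the component names recorded in an SBOM document."""
    return sorted(c["name"] for c in doc["components"])

print(component_names(sbom))
```

Because the record is machine-readable, a security team can answer “do we ship OpenSSL 3.0.5 anywhere?” with a query rather than an email thread to the vendor.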

For example, a toolkit like OpenSSL includes dependencies that are difficult or, in many cases, impossible for traditional vulnerability scanners to uncover. Identifying the third-party libraries associated with a software package requires a multilayered approach. This is where an SBOM can help.

The U.S. Department of Commerce has stated that SBOMs provide those who produce, purchase, and operate the software with information that enhances their understanding of the supply chain. This enables multiple benefits, most notably the potential to track known newly emerged vulnerabilities and risks.

These records form a foundational data layer on which further security tools, practices, and assurances can be built, the Commerce Department says, and serve as the foundation for an evolving approach to software transparency.

A 2022 report from Linux Foundation Research, based on a survey of 412 organizations around the world, showed that 90% of the organizations had started their SBOM journey.

More than half of the survey participants said their organizations are addressing SBOMs in a few, some, or many areas of their business, and 23% said they are addressing them across nearly all areas of their business or have standard practices that include the use of SBOMs. Overall, 76% of organizations had a degree of SBOM readiness at the time of the survey.

The research showed that the use of open-source software is widespread and that software security is a top organizational priority. Given the worldwide efforts to address software security, SBOMs have emerged as a key enabler, it said. Growth of SBOM production or consumption was expected to accelerate by about 66% during 2022, leading to SBOM production or consumption use by 78% of organizations.

The top three benefits of producing SBOMs identified by survey participants were that SBOMs make it easier for developers to understand dependencies across components in an application, monitor components for vulnerabilities, and manage license compliance.

Key features to consider

SBOMs are key to quickly finding and fixing vulnerabilities before it’s too late. That’s because they dig deep into the various dependencies among software components, examining the compressed files that ship with applications to effectively manage risk. It might take a software vendor days or weeks to confirm with its developers whether its products are affected. That’s too long a window of opportunity for cybercriminals to exploit vulnerabilities.

With SBOMs, security teams can know exactly where an affected component is being used across applications in use within their organizations.

It’s important for organizations to understand that not all SBOM offerings from vendors are alike. An ideal solution delivers critical, real-time visibility into an organization’s software environments, enabling them to make better-informed decisions to manage risk.

SBOMs should be able to answer questions such as:

Exactly where is a particular software package located?
Which open-source dependencies, if any, does an application use?
Which version of the software package is running?
Do any other applications use the software package?
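
Given SBOM data collected per application, questions like these reduce to simple index lookups. The schema and application names below are invented for illustration, not any vendor’s actual format:

```python
# Answer "which applications use package X, and at which version?" from
# collected (illustrative) per-application SBOM data.

from collections import defaultdict

sboms = {
    "billing-service": [("openssl", "3.0.5"), ("zlib", "1.2.13")],
    "web-frontend": [("openssl", "3.0.7"), ("react", "18.2.0")],
    "etl-worker": [("log4j-core", "2.17.1")],
}

def apps_using(package):
    """Map each application using `package` to the versions it ships."""
    hits = defaultdict(list)
    for app, components in sboms.items():
        for name, version in components:
            if name == package:
                hits[app].append(version)
    return dict(hits)

print(apps_using("openssl"))
```

When an advisory names an affected version range, the same index immediately tells the team which applications, and therefore which endpoints, need attention.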

A key capability is the ability to understand every software component at runtime: to uncover software packages and break them apart, examining all constituent components without the need to engage the software vendor.

SBOMs should also be able to address any vulnerabilities or misconfigurations found in the various software components; take quick action to mitigate supply chain risk, even removing applications completely across affected endpoints; and optimize an organization’s investments in third-party tools by populating them with granular, accurate and real-time SBOM data.

The takeaway 

Digital businesses today rely on software to support all kinds of processes. In fact, it’s difficult to imagine any company operating without applications. Keeping software secure and reliable is essential for success today.

With solutions such as SBOMs, security teams at organizations can be confident that they have a good handle on all the complexities inherent in the software world, and that they are keeping up on any flaws that need to be addressed to keep applications secure.

Learn how Tanium’s Converged Endpoint Management (XEM) platform can address SBOMs to give your organization real-time visibility—even in the most complex software environments.

The need for efficient software development has taken on greater importance as enterprises introduce more and more digital services and add automation capabilities to enhance business processes. Managing software projects might not be at the top of CIOs’ priority lists, but it is something that IT leaders will have to master.

There are plenty of challenges involved in managing software projects, and IT executives who learn how to address these hurdles can help their organizations build better applications to drive business growth and enhance customer experience.

Here are some of the more likely challenges IT leaders and teams face with software projects, and how they can address them.

Delivering on time and on budget

Completing software projects in a timely manner while staying within budget is a long-time challenge of software development. Any number of things can happen to cause delays and drive up costs.

One possible solution is to embrace the agile methodology of software development. Agile calls for collaboration among cross-functional teams as well as end users, and promotes adaptive planning, evolutionary development, continual improvement, flexibility in responding to changes in requirements and early delivery of products.

“Agile software development projects iterate the cycle of plan, do, check, adjust — and the end user or representative sponsoring [the project] is key in all these stages,” says Ola Chowning, a partner with global technology research and advisory firm ISG.

The waterfall method of gathering all the requirements, designing the entire software capability to meet all the requirements, building all the needed capabilities and reviewing and obtaining buy-in from end users is rarely used today, Chowning says. “This older method, by the way, is where on-time, on-budget challenges were the most onerous, because of the guessing game created when the software team had to estimate large bodies of work and assume some level of acceptance or rework by end users,” she says.

Agile enables the software team and end users to “collectively learn to plan better, work better and adjust more quickly, and outcomes become far more predictable as the way the team works becomes more predictable,” Chowning says. “On time and on budget are much easier to judge with the finger on the pulse of expectations of both the users and the developers.”

Creating and maintaining an agile culture

While adopting agile makes sense for software development at many organizations, it can come with hurdles. And many IT leaders who think their organizations have instituted agile practices fail to understand that what their teams are undertaking isn’t, in fact, agile.

“The intersection of agile software development practices and ‘traditional’ project management remains a challenge for many organizations,” Chowning says. “By now, you would think we would have cracked this nut, but it still seems to stymie many of our clients.”

Whereas software development is approached with a strictly agile way of working — repeated sprints, stories, and multiple releases that build up the end software product in an iterative manner — many organizations continue to struggle as they attempt to manage projects in a waterfall manner, Chowning says.

“This often begins during the project’s business case for funding, where we are typically asked to estimate the total outlay to be reported against in a traditional waterfall-phased framework of requirements, design, build, test and deploy,” Chowning says.

More mature organizations are turning to a project management approach that instead lays out the estimate of overall cost to overall value in a more steady approach across time, Chowning says. “Those who are using this agile project management approach are able to reap some of the real key benefits of agile, [but] it may require some adjustment of investment decisions or even financial practices in terms of project spend.”

A huge challenge for IT organizations is driving the agile model at the enterprise level, says Christian Kelly, managing director at technology consulting and services firm Accenture. “Agile at the team level is now widespread, but recent data suggest that it’s not going as well at the enterprise level, as most organizations struggle to connect strategies to the work their teams are doing,” he says.

This limits organizations’ ability to prioritize portfolios, plan for capacity, manage dependencies and connect goals to outcomes, Kelly says. “To deliver on the promise of agile, organizations need to implement the agile culture, systems, and best-in-class tools needed to better connect strategies to outcomes,” he says.

Aligning projects with overall organizational goals

“IT projects cannot be done in a bubble,” says Chetna Mahajan, chief digital and information officer at analytics platform provider Amplitude. “If your initiative is not aligned with business priorities, you are not set up for success from the outset and you will be swimming against the current at all times.”

To ensure business alignment and buy-in, all software projects should have a business executive sponsor, Mahajan says. When her previous company was implementing configure, price, quote (CPQ) software, the executive sponsors included Mahajan and the chief revenue officer.

“This provided us with an escalation channel for both business and technical decisions and deliverables,” Mahajan says. “It was no longer perceived as a technology initiative and it got the visibility and attention it needed across the company. We not only came in under budget and on time, but also were able to increase automation 30% and reduce the sales cycle by a couple of weeks.”

Most technology projects fail because they lack concrete key performance indicators (KPIs), Mahajan says. “I categorize project metrics mainly into two buckets, one that monitors project execution and the other that measures business outcome,” she says. “What we can’t measure we can’t improve. While it is important to stay the course on budget, scope, and timeline, we must keep a constant eye on the business KPIs.”

The KPIs for a project should be specific and linked to company goals. “This not only helps create a culture of accountability, but also allows for companies to validate their business case to inform future investment decisions,” Mahajan says.

Winning over stakeholders and sponsors

Culture is often a key challenge in the ability to manage software projects in an agile fashion, Chowning says, because sponsors and key stakeholders of the project need to be comfortable and willing to work in the manner most suited to agile.

“Many may still want, instead, to try to work in a more traditional manner—build all requirements, design the entire end state, and only then build and deploy the entire end state,” Chowning says. “This can present a dilemma, as the software development practice and the project management practice try to proceed in two completely different and disconnected approaches.”

Educating the sponsors and key stakeholders in an agile project management approach, and helping them adjust behaviors, is key to managing expectations and enabling software development to proceed in the most effective and efficient manner, Chowning says.

It’s important to engage user representation up front and then continually throughout the iterations of the software development, regardless of the methodology being used, Chowning says.

“Gone are the days when it is sufficient to talk to users up front, and then not engage them again until some mystical user acceptance testing towards the end of the project,” Chowning says. “Users, or user representation [should] be engaged in all aspects of the software development and designs. Small feature developments, prototypes, trials and showcases are all useful means of ensuring users are both engaged and feedback is obtained constantly.”

Need for new development — and management — skills

“One of the biggest challenges we face [is] how to ensure we are continuously providing a strong developer experience and managing the ongoing upskilling of our employees as technologies evolve,” says Amit Sharma, CTO at financial technology company Broadridge Financial Solutions and former CIO at financial services provider Western Union.

“This means creating [automated] solutions, providing a secure, stable environment to develop and test, and equipping our developers with a suite of tools that facilitates a simple, manageable experience and alleviates the overhead and burden of heavy administration,” Sharma says.

With the rapid pace of change in software development, companies need to train software engineers and others to adapt to new technologies, languages, and development processes. IT and product leaders need to acknowledge that there might be interruptions in projects because of the need to develop new skill sets, Sharma says. They also need to value the people involved in software development.

“It is critical that we recognize that our technology teams have built solutions over the course of many years, and as a result have become the subject matter experts not just of the system, but of the product as well,” Sharma says. “It is [vital] to bring them with us into the next generation of our product, no matter the technology it is founded on.”

In addition to the need for new developer skills, many IT leaders need to hone their own skills.

“Many IT leaders suffer from a massive talent gap in the ability to understand user needs, to create software roadmaps that meet business needs, to drive trade-offs against these roadmaps, and to move from process-based thinking to customer value and customer journey thinking,” Accenture’s Kelly says. “This is why concepts like value stream mapping, customer jobs/value propositions, and design thinking have become so important.”

By Bryan Kirschner, Vice President, Strategy at DataStax

In their 2020 book Competing in the Age of AI, Harvard Business School professors Marco Iansiti and Karim Lakhani make some bold predictions about the winning enterprises of the future.

These organizations, which they refer to as “AI factories,” build a “virtuous cycle between user engagement, data collection, algorithm design, prediction, and improvement,” unlocking new paths to growth as software moves to the core of the enterprise.

A little more than two years after the publication of their seminal work, data gathered from IT leaders and practitioners lend a lot of credence to Iansiti and Lakhani’s hypotheses — particularly those regarding the kind of technology architectures and strategies that engender success with AI.

The AI factory

Successful AI companies — think Apple, Netflix, Google, Uber, or FedEx — build innovative applications and, as they scale, start the flywheel of data, growth, and improvement spinning by gathering ever-growing amounts of real-time data, accessing it instantly, and tuning their predictions.

User experiences become more personal and intuitive; key decisions can be made nearly instantaneously; and predictions can occur in real-time, empowering a business to improve outcomes in the moment.

This unlocks new paths to growth: in the authors’ words, as AI factories “accumulate data by increasing scale (or even scope), the algorithms get better and the business creates greater value, something that enables more usage and thus the generation of even more data.”

For more traditional firms to achieve this kind of success requires a host of changes in both their operating models and technology profiles.

Open-source software and AI success

The State of the Data Race 2022 report is based on a survey of over 500 IT leaders and practitioners that delved into their organizations’ data strategies.

For the purpose of this analysis, responses were divided into three groups:

- those where both AI and ML are already in widespread deployment
- those where AI and ML are at most in the pilot phase or early days
- those in between these two extremes, characterized as being in “limited deployment”

The study assumed the organizations with AI/ML widely in production provide useful information about the evolving shape of the “AI factory” and looked for differences across the three stages of maturity.

Iansiti and Lakhani wrote that AI factories will evolve “from a focus on proprietary technologies and software to an emphasis on shared development and open source” because the competitive advantage they enjoy comes from data they accumulate — not the software they develop in-house.

The survey data backs this up in spades. A strong majority of each of the three AI/ML groups considers open-source software (OSS) at least “somewhat” important to their organization (73%, 96%, and 97%, respectively, ordered from “early days” to “wide deployment”).

But ratings of “very” important closely track AI/ML maturity: 84% of companies with AI/ML in wide deployment describe OSS this way, compared with 46% of those with AI/ML in limited deployment and just 22% of “early days” organizations.

Perhaps even more striking, organizations not using OSS are a tiny minority (1%, 1%, and 7%, ordered from “wide deployment” to “early days”). But a majority of those with AI/ML in wide deployment (55%) join companies like The Home Depot in having a company-wide mandate for use of OSS.

Real-time data and AI

Consider the AI leaders mentioned above. These companies have assembled technology infrastructures that enable instantaneous changes and decisions based on real-time feedback. Relying on day-old data and batch processing to update the routing of a package to ensure on-time delivery just doesn’t cut it at FedEx.

So, it isn’t surprising that Iansiti and Lakhani report that AI factories lean into real time. “The top enterprises … develop tailored customer experiences, mitigate the risk of customer churn, anticipate equipment failure, and enable all kinds of process decisions in real time,” they say.

Much like with OSS, findings from The State of the Data Race point to real-time data (and the technology architecture that enables it) as a matter of core strategy for the AI leaders. Substantial use of real-time data correlates with AI maturity: 81% of companies that have broadly deployed AI/ML say real-time data is a core strategy, compared with 48% of organizations with limited AI/ML deployment and 32% of companies in the early stages of AI/ML.

But among the advanced group, a full 61% say that leveraging real-time data is a strategic focus across their organization (four times the share of organizations in the early days, and more than twice that of those with limited deployment). And 96% of today’s AI/ML leaders expect all or most of their apps to be real time within three years.

This makes sense: as an enterprise intentionally rewires its operations to make the most of AI/ML, it becomes especially important to eliminate any arbitrary architectural barriers to new use cases that require “speed at scale” anywhere in the business.

Today’s OSS as-a-service ecosystem makes that possible for everyone, freeing the future organization to make the most of its unique customer interactions and datasets.

Uniphore: A case study in real-time data, AI, and OSS

Uniphore helps its enterprise customers cultivate more fruitful relationships with their customers by applying AI to sales and customer service communications. The company relies on real-time data to analyze thousands of customer reactions during video calls and quickly feed that insight back to salespeople.

“We have about fourteen different AI models we run in real time to coalesce the data into something meaningful for our clients,” says Saurabh Saxena, Uniphore’s head of technology and VP of engineering. “Any kind of latency is going to have a negative effect on the real time side.”

“Without the ability to process data in real-time, our solution really wouldn’t be possible,” he adds.

To get “the speed they need,” Uniphore relies on open-source Apache Cassandra® delivered as a service via Astra DB from DataStax (my employer). Its performance and reliability are key to ensuring Uniphore’s system is something every salesperson is motivated to rely on in order to be more effective in the moment.

But winning adoption among line staff points to another of Iansiti and Lakhani’s insights on the implications of AI for senior management. As Lakhani explained in a 2021 interview, “AI is good at predictions” — and predictions are “the guts of an organization.” Senior executives need to constantly ask, “Do I have data now to improve my prediction power — my accuracy, my speed?”

As Uniphore points out, sales forecast accuracy is something most sales leaders are concerned about. As a knock-on effect of using Uniphore’s tools, quantitative data on sentiment and engagement can flow into sales forecasts without the need for more staff time. In addition to the direct uplift that sellers experience, forecasts improve, freeing management to spend its time with greater confidence on more important things, like investing for growth.

This closes the loop on Iansiti and Lakhani’s insight that AI factories can unlock a more powerful operating model over and above the benefits of individual use cases and point solutions.

Building an AI factory

Organizations that leaned into the insights in Competing in the Age of AI may have stolen a march on their competition. Judging from our survey data, they’ve been amply rewarded for doing so. The good news is that they’ve proven best practices for success — and the tools you need to accelerate your own progress on the journey to becoming an “AI factory” are ready and waiting.

Learn how DataStax enables AI-powered apps

About Bryan Kirschner:

Bryan is Vice President, Strategy at DataStax. For more than 20 years he has helped large organizations build and execute strategy when they are seeking new ways forward and a future materially different from their past. He specializes in removing fear, uncertainty, and doubt from strategic decision-making through empirical data and market sensing.

Artificial Intelligence, IT Leadership

As companies shift their focus from the digital transformation of individual processes to the business outcomes enabled by a digitally transformed organisation, software engineering will become a core enterprise capability. To become a software-powered organisation, a company must first identify and address the concerns of its developers in areas such as developer experience, developer velocity and software security.

According to the recent IDC InfoBrief “The Significance of Open Source Software in the Digital-First Future Enterprise”, open source software (OSS) is an important driver of enterprise digital innovation and provides greater agility, performance and security compared to proprietary software. The research, conducted by International Data Corporation (IDC) and commissioned by SUSE, surveyed 838 respondents in 11 Asia/Pacific countries across a range of industries such as financial services and insurance, telecommunications, and government. It examines key trends, challenges, and priorities in DevOps and security solutions and the impact of OSS on developer productivity.

According to Linus Lai, Research Vice President, IDC, “In a digital-first world where every organisation is a software-driven business, our research shows that open source software plays a very significant role across the enterprise technology stack, driving innovation, improved customer experiences, and overall digital transformation”.

Key Research Findings

- OSS accounts for almost 70% of all software used in a typical Asia/Pacific enterprise to drive digital innovation.
- OSS improves developer satisfaction by addressing concerns specific to the business and technology environments developers operate in.
- The ability of OSS to address enterprise cybersecurity issues is unmatched due to robust vendor and community testing and expert software support from principal enterprise-grade OSS vendors.
- 61% of respondents rated the performance of OSS as being superior compared to proprietary software.

The research shows that although OSS is relevant across the breadth of the enterprise technology stack, its usage varies. For example, close to 60% of respondents are using OSS for Database and 53% for Operating Systems. When it comes to container-related technologies, only 30% are using OSS. This is primarily because enterprises in the Asia/Pacific region are, on average, earlier in their container adoption journey than in mature markets.

With the rise of microservices and cloud-native applications, new security challenges are arising for enterprise IT departments. This is forcing enterprises to consider innovative new ways to manage security at the container level and pay due consideration to these challenges when choosing a container security solution. Development teams are also under pressure to stay current with newer cloud-native technologies and deliver applications faster.

In today’s highly competitive IT environment, skilled developers are not a cheap resource and top talent is not readily available. Therefore, maximising the developer’s value to the organisation should be high on the agenda for the business to operate at peak efficiency. The research findings show that the use of OSS not only improves developer productivity, but also increases developers’ access to emerging technologies and open source innovations. OSS is also more robust and secure than proprietary software due to rigorous community reviews and shorter development cycles. This enables enterprise development teams to innovate quickly.

As enterprises progress along their digital journey, they will expand the use of OSS in new domains that are crucial to their success. 60% of respondents identified security as the top technology domain for use of OSS in the future. The list also includes newer and emerging technology domains such as AI/big data analytics, Container Management and Metaverse. In addition, the use of containers will be essential as hybrid cloud computing becomes the default enterprise operating model, and enterprises look to expand into new areas such as “the edge”.

To access the full IDC InfoBrief: The Significance of Open Source Software in the Digital-First Future Enterprise, please click here.

Learn more about SUSE here.  

SUSE

Vishal Ghariwala is the Chief Technology Officer for the APJ and Greater China regions for SUSE, a global leader in true open source solutions. In this capacity, he engages with customer and partner executives across the region, and is responsible for growing SUSE’s mindshare by being the executive technical voice to the market, press, and analysts. He also has a global charter with the SUSE Office of the CTO to assess relevant industry, market and technology trends and identify opportunities aligned with the company’s strategy.

Prior to joining SUSE, Vishal was the Director for Cloud Native Applications at Red Hat where he led a team of senior technologists responsible for driving the growth and adoption of the Red Hat OpenShift, API Management, Integration and Business Automation portfolios across the Asia Pacific region.

Vishal has over 20 years of experience in the Software industry and holds a Bachelor’s Degree in Electrical and Electronic Engineering from the Nanyang Technological University in Singapore.

Vishal is here on LinkedIn: https://www.linkedin.com/in/vishalghariwala/

Open Source

GPU manufacturer Nvidia is expanding its enterprise software offering with three new AI workflows for retailers it hopes will also drive sales of its hardware accelerators.

The workflows are built on Nvidia’s existing AI technology platform. One tracks shoppers and objects across multiple camera views as a building block for cashierless store systems; one aims to prevent ticket-switching fraud at self-service checkouts; and one is for building analytics dashboards from surveillance camera video.

Nvidia isn’t packaging these workflows as off-the-shelf applications, however. Instead, it will make them available for enterprises to integrate themselves, or to buy as part of larger systems developed by startups or third-party systems integrators.

“There are several of them out there, globally, that have successfully developed these kinds of solutions, but we’re making it easier for more software companies and also system integrators to build these kinds of solutions,” said Azita Martin, Nvidia’s VP of retail.

She expects that demand for the software will drive sales of edge computing products containing Nvidia’s accelerator chips, as latency issues mean the algorithms for cashierless and self-checkout systems need to be running close to the checkout and not in some distant data center.

In addition to tracking who is carrying what items out of the store, the multiple camera system can also recognize when items have been put back on the wrong shelf, directing staff to reshelve them so that other customers can find them and stock outages are avoided, she said.

“We’re seeing huge adoption of frictionless shopping in Asia-Pacific and Europe, driven by shortage of labor,” said Martin.

Nvidia will face competition from Amazon in the cashierless store market, though. Amazon initially developed its Just Walk Out technology for its own Amazon Go and Amazon Fresh stores, but it now offers the technology to third-party retailers, too. The first non-Amazon supermarket to use it opened in Kansas City in December.

Assessing cost control

The tool to prevent ticket switching is intended to be integrated with camera-equipped self-service point-of-sale terminals, augmenting them with the ability to identify the product being scanned and verify it matches the barcode.
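The verification step described here amounts to comparing the vision model’s prediction against the catalog entry behind the scanned barcode. The sketch below illustrates that logic under stated assumptions; the catalog, function name, labels, and confidence threshold are all hypothetical, not Nvidia’s actual workflow API:

```python
# Hypothetical sketch of a ticket-switching check: compare what the camera
# model recognizes against what the scanned barcode claims the item is.

PRODUCT_CATALOG = {
    "012345678905": "laundry detergent 1.5L",
    "036000291452": "ribeye steak 400g",
}

def check_scan(barcode: str, vision_label: str, confidence: float,
               threshold: float = 0.9) -> str:
    """Return 'ok', 'mismatch', or 'uncertain' for a single scan event."""
    expected = PRODUCT_CATALOG.get(barcode)
    if expected is None:
        return "uncertain"   # unknown barcode: hand off to staff
    if confidence < threshold:
        return "uncertain"   # low-confidence vision result: don't block the sale
    return "ok" if vision_label == expected else "mismatch"

# A steak scanned under a detergent barcode should be flagged.
print(check_scan("012345678905", "ribeye steak 400g", 0.97))  # mismatch
print(check_scan("036000291452", "ribeye steak 400g", 0.97))  # ok
```

In practice the "uncertain" path matters as much as the mismatch path: blocking a terminal on a low-confidence prediction is exactly the false-positive cost discussed later.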

The cost of training the AI model to recognize these products went beyond the usual spending on computing capacity.

“We bought tens of thousands of dollars of products like steak and Tide and beer and razors, which are the most common items stolen, and we trained these algorithms,” said Martin.

Nvidia kept its grocery bill under control using its Omniverse simulation platform. “We didn’t buy every size of Tide and every packaging of beer,” she adds. “We took Omniverse and created synthetic data to train those algorithms even further for higher accuracy.”

Beer presents a particular challenge for the image recognition system, as it often sells in different-size multipacks or in special-edition packaging associated with events like the Super Bowl. However, the system continues to learn about new product formats and packaging from images captured at the checkout.

While implementation will be left up to retailers and their systems integrators, Martin suggested the tool might be used to lock up a point-of-sale terminal when ticket switching is suspected, summoning a member of staff to reset it and help the customer rescan their items.

Nvidia is touting high accuracy for its algorithms, but it remains to be seen how this will work out in deployment.

“These algorithms will deliver 98% accuracy in detecting theft and shutting down the point of sale and preventing it,” she said.

But that still leaves a 2% false positive rate, so CIOs will want to carefully monitor the impact of erroneous detections and the resulting terminal resets on profitability and customer satisfaction.

A $100 billion problem

A 2022 survey by the National Retail Federation found that inventory shrink amounted to 1.44% of revenue — a relatively stable figure over the last decade — and in 2021, losses due to shrink totaled almost $100 billion, the NRF estimated.

Of that, survey respondents said 26% was due to process or control failures, 29% due to employee or internal theft, and 37% due to external theft.

But Nvidia suggests that its loss prevention technology could eliminate 30% of shrinkage. That is a bold claim: with external theft accounting for 37% of shrink, eliminating 30% of the total would mean preventing roughly four-fifths of all external retail theft, even though that category includes, in addition to ticket switching, shoplifting and organized retail crime activities such as cargo theft and the use of stolen or cloned credit cards to obtain merchandise.
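The back-of-envelope arithmetic behind that comparison can be checked in a few lines, using the NRF figures quoted above:

```python
# Rough arithmetic on the NRF shrink figures cited above.
total_shrink = 100e9          # NRF estimate of 2021 shrink losses, in dollars
external_theft_share = 0.37   # share of shrink attributed to external theft
nvidia_claim = 0.30           # share of total shrink Nvidia says it could eliminate

external_theft = total_shrink * external_theft_share   # ~$37B
claimed_savings = total_shrink * nvidia_claim          # ~$30B
share_of_external = nvidia_claim / external_theft_share  # ~0.81, i.e. four-fifths

print(f"External theft: ${external_theft / 1e9:.0f}B")
print(f"Claimed savings: ${claimed_savings / 1e9:.0f}B "
      f"= {share_of_external:.0%} of external theft")
```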

Plus, potential gains must be weighed against the cost of deploying the technology, which, Martin says, “depends on the size of the store, the number of cameras and how many stores you deploy it to.”

More positively, Nvidia is also offering AI workflows that can process surveillance camera video feeds to generate a dashboard of retail analytics, including a heatmap of the most popular aisles and hour-by-hour trends in customer count and dwell time. “All of this is incredibly important in optimizing the merchandising, how the store is laid out, where the products go, and on what shelves to drive additional revenue,” Martin said.

Artificial Intelligence, IT Strategy, Retail Industry

By Andy Nallappan, Chief Technology Officer and Head of Software Business Operations, Broadcom Software

The information technology that enables scientific and commercial breakthroughs, from precision medicine to digital transformation, demonstrates tech’s boundless potential to improve our world. Yet, tech practitioners have long traded progress for increased complexity.

IT complexity, seen in spiraling IT infrastructure costs, multi-cloud frameworks that require larger teams of software engineers, the proliferation of data capture and analytics, and overlapping cybersecurity applications, is the hallmark—and also the bane—of the modern enterprise.

With a better understanding of IT complexity, large enterprises can partner with their strategic vendors to reduce IT complexity and drive more innovation and business success from it.

Complexity Continues to Increase

Complexity exhausts IT budgets and workers—a widespread problem that worsened during the pandemic. Seventy percent of executives say that IT complexity has been a growing challenge for their organization over the past two years, according to Broadcom Software-sponsored research by Harvard Business Review Analytic Services. While 85% of executives state that reducing IT complexity is an organizational priority, only 27% said their company had managed it effectively.

Regarding complexity, David Linthicum, managing director and chief cloud strategy officer at Deloitte Consulting LLP, comments that “over the last five years, people have been migrating to the cloud and using more complex distributed deployments, such as multi-cloud, edge computing, IoT, and things like that.”

The Harvard Business Review Analytic Services report Taming IT Complexity through Effective Strategies and Partnerships discusses the root causes of IT complexity and the penalties organizations pay for failing to undertake an effective complexity-reduction strategy. In the report, Linthicum describes multi-cloud as “the straw that broke the camel’s back” because rather than centralize all enterprise data in a single cloud, companies are expanding their investments in multiple areas and trying to stitch all the pieces together.

Seeking Solutions to IT Complexity    

Not all IT solutions work as intended. A recurring problem known as “legacy spaghetti” occurs when IT teams layer on new technology solutions “that are disparate, siloed, and unorganized,” according to the report.

Nearly two-thirds of executives in the study indicate that incompatible systems and technologies are the top factors fueling IT complexity. Not surprisingly, 67% of respondents said employees are frustrated or confused by persistent complexity, with three-in-five noting that it costs money and creates unnecessary additional work.

Though complexity has plagued data centers since the days of mainframes and punch cards, the study indicates that:

- 82% of executives view IT complexity as an impediment to success.
- 81% believe that reducing it creates a competitive advantage.
- 73% agree that IT complexity is an organizational expense.
- 36% of respondents say that reducing complexity in security creates more resilient systems—less vulnerable to security breaches.

At long last, the tide may be turning. According to the study, most organizations have developed specific strategies to reduce IT complexity and create operational efficiency. While there are many steps a company can take to minimize complexity, starting with simpler tools, enterprises can also partner with firms that specialize in complexity reduction to boost IT agility. In the study, three-in-five executives agree that working with a trusted partner is key to reducing complexity.

The report does point out, however, that all complexity is not bad. Innovation is the cornerstone of success for nearly all companies today. But innovation can require a certain level of complexity, meaning that ongoing innovation requires continual attention to the complexity it inevitably will create. “Executing innovation initiatives at scale involves risk-reward calculations all along the journey because there will be missteps and even failures. Innovation initiatives almost always require individuals with diverse expertise and experience, including from outside the organization, to collaborate and experiment together,” says Linda Hill, the Wallace Brett Donham Professor of Business Administration at the Harvard Business School, where she is also chair of the Leadership Initiative. “The more complex whatever you’re trying to do, by definition, the riskier it is.”

“A partner might have done this dozens of times with several other companies, and so we learn from that and ask, ‘What were lessons learned, and how can we accelerate quick-win improvements?’” said Jason Duigou, CIO of Indianapolis-based Medxcel, a healthcare facilities services company, in the Harvard Business Review Analytic Services report.

Nine Ways to Manage IT Complexity

The report also highlights nine practices to enable firms to improve the way they manage IT complexity:

1. Develop a common language around IT complexity—identifying a project’s complexity risk helps firms prioritize resources better.
2. Find a balance between innovation and complexity—too much customization creates burdens, adds costs, and spikes risk.
3. Take a modular approach to transformation—tackle new features incrementally to reduce stress on legacy systems.
4. Get the C-suite on board—the study indicates that not all senior executives understand the weight of the problem. Complexity reduction needs funding and executive support.
5. Retire redundant technologies—phasing out incompatible systems and redundant technologies is the most common complexity-reduction program, according to the study.
6. Connect systems and increase compatibility—API-first strategies can help reduce the complexity of connecting disparate applications and improve system interoperability.
7. Train employees—organizations that excel at reducing complexity tend to train their employees to promote better utilization of existing tools and systems.
8. Create feedback loops—mitigating complexity starts with listening to how customers and employees experience a service or product. Feedback can help eliminate unnecessary features that lead to unwanted complexity.
9. Develop metrics and measure progress, but see things through—measuring progress against objectives and appropriate peer groups can help improve IT complexity reduction efforts.

The report points out that there are no easy or inexpensive ways to reduce IT complexity. Companies that stand out as leaders in this effort tend to focus on limiting rather than trying to halt the constant march of complexity. Download the Harvard Business Review Analytic Services report now to learn more.

About Andy Nallappan:

Broadcom Software

Andy is the Chief Technology Officer and Head of Software Business Operations for Broadcom Software. He oversees the DevOps, SaaS Platform & Operations, and Marketing for the software business divisions within Broadcom.

IT Leadership

Citing currency fluctuations, Microsoft is set to increase prices of its on-premises software, online services and Windows licenses in India by up to 11%.

The new prices, expected to take effect on February 1, 2023, are meant to “harmonize” prices for Microsoft software and services between India and the Asian region, the company said, adding that it “periodically assesses the impact of its local pricing for software products and online services to ensure there is reasonable alignment across regions.”

The change will see India prices for commercial on-premises software rise by 4.5%, Microsoft said in a blog post. Prices for online services are set to increase by 9%, bringing these services close to prevailing dollar prices in the Asian region.

Come February, Windows licenses, whose prices are set to increase by 11%, will be the most impacted.
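The arithmetic of the announced increases is straightforward to apply per category; the base prices in the example below are invented purely for illustration:

```python
# Announced category increases (from the Microsoft blog post cited above).
# Base prices used in the example are hypothetical.
INCREASES = {
    "on_premises_software": 0.045,  # commercial on-premises software: +4.5%
    "online_services": 0.09,        # online services: +9%
    "windows_license": 0.11,        # Windows licenses: +11%
}

def new_price(category: str, current_price: float) -> float:
    """Apply the announced uplift for a category, rounded to 2 decimals."""
    return round(current_price * (1 + INCREASES[category]), 2)

print(new_price("windows_license", 10000.0))  # 11100.0
```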

Further, the company said that the new pricing for select services such as Microsoft 365 and Dynamics 365 for “direct customers” in India will take effect in February.

The price rise will not affect existing product orders for business users that are under price protection licensing agreements, Microsoft said.

“However, prices for new product additions under such licensing agreements and purchases under new contracts will be as defined by the pricelist at the time of order,” the company said.

This means that if an enterprise adds new services before February under the Microsoft price protection program, it would not have to pay the increased prices.

Microsoft claims that despite the increase in prices, its customers in India “buying online services in Indian rupee will continue to find Microsoft cloud offerings highly competitive.”

The change in pricing does not cover Microsoft’s hardware products, such as Surface devices, or Office and Windows consumer products, the company said, adding that the changes will also not directly affect reseller prices, which continue to be determined by the resellers themselves.

Microsoft, like competitors such as AWS, Google and Oracle, continues to face a revenue slowdown in the wake of the pandemic, uncertain macroeconomic conditions, and geopolitical issues. The company recently reported its slowest growth in five years for the first quarter of its fiscal 2023, despite seeing revenue increase across business segments such as cloud, Dynamics 365 and Office 365.

Microsoft, Pricing