Like its predecessors, COP27 offered mixed results. As the conference’s detractors have long lamented, if COPs were truly effective, we wouldn’t have needed 27 of them. Still, there are some genuine marks of progress to celebrate. A landmark “loss and damage” fund will come as welcome news for the many vulnerable countries that have been disproportionately affected by climate change. But compensating for damage isn’t the same as preventing it, and the continued reluctance of major nations to outlaw or even cut back on fossil fuel projects suggests there will be no shortage of fresh damage in the future.

Welcome as public discussions around climate change like COP may be, words alone are insufficient. Until leaders start turning promises into actionable policies, each conference will be marred by the sense that more could have been done. One of the great frustrations is that tangible solutions exist. They are fully formed, just waiting for forward-thinking leaders to put them into practice. City and business decision makers serious about demonstrating their sustainable credentials have an obligation – and an opportunity – to turn the tide.

One of those tangible solutions is LED lighting conversion. It’s also one of the easiest and most overlooked ways of making measurable progress toward climate goals quickly. For example, if a mid-sized industrial town upgraded all of its approximately 1,000,000 conventional light points to LED, carbon emissions could be cut by over 18,000 tons of CO2 per year—equivalent to the amount of CO2 saved by taking 7,000 fossil-fueled cars off the road.
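The arithmetic behind such an estimate is simple to sketch. Below is a minimal calculation, assuming roughly 18.5 kg of CO2 saved per light point per year and about 2.6 tons of CO2 per car per year; both are illustrative figures back-derived from the totals above, not sourced values:

```python
# Back-of-the-envelope carbon savings for a city-wide LED conversion.
# All inputs are illustrative, derived from the example figures above.
light_points = 1_000_000        # conventional light points to convert
saving_per_point_kg = 18.5      # assumed kg CO2 saved per point per year

total_saving_tons = light_points * saving_per_point_kg / 1000
print(f"Annual saving: {total_saving_tons:,.0f} tons CO2")  # 18,500 tons

# Express the saving as cars taken off the road, assuming
# ~2.6 tons CO2 per fossil-fueled car per year.
co2_per_car_tons = 2.6
cars_equivalent = total_saving_tons / co2_per_car_tons
print(f"Equivalent to roughly {cars_equivalent:,.0f} cars")
```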

The good

The new “loss and damage” fund is by some distance the most important development to emerge from this year’s COP. It ensures that countries from the developing world that have borne the brunt of climate change, despite being only minor contributors to it, will receive help from many of the world’s largest emitters.

There was skepticism going into the conference as to whether such a fund really stood a chance of being accepted by the power players. The United States was expected to oppose any measures that might leave it financially liable for its enormous historical emissions, but after the EU pivoted toward supporting the measure, the US followed suit – a win that should not be overlooked.

However, it’s worth tempering the celebrations: there’s still the crucial matter of deciding exactly who will pay what to whom, a task reserved for future COPs. So-called transition economies like China and India may be reluctant to agree they owe as much as established superpowers like the US. And while these measures will have a huge positive impact in dealing with climate change’s effects, they do nothing to halt its causes.

Another less reported win is the commitment to deliver the trillions of dollars needed to drastically cut emissions and help societies adapt to the increasingly severe impacts of climate disruption. Suggested reforms within multilateral development banks (MDBs) and international financial institutions like the IMF and World Bank would see more investment in green initiatives and renewable energy, as opposed to funneling further funding into detrimental fossil fuel endeavors.

The bad

The mood in general was low heading into COP27. The triumphalism that coloured some aspects of COP26 had long since faded in the glare of the 2022 energy crisis. Prices soared, citizens felt the squeeze, and leaders from around the world who had proudly preached the need for sustainable solutions in simpler times were quick to walk back their promises.

So too did COP27 walk back pledges made in Glasgow a year earlier. New wording in COP27’s final agreement calls for the accelerated development of “low-emission” energy systems. Experts warn this diluted language is a very intentional way of leaving the door open to further natural gas development.

Agreements around phasing down coal were made at COP26, but no further progress was made at COP27. The burning of fossil fuels remains the heel on our planet’s throat, and leaders seem reluctant to alleviate the pressure. As noted by Dave Reay, policy director at ClimateXChange and executive director at Edinburgh Climate Change Institute, “COP27 seems to have been more a case of trying to prevent backsliding on what was agreed at COP26 in Glasgow a year ago, rather than getting new and stronger ambition and action on reducing emissions.”

The ugly

Leaders continue to talk about their commitment to the Paris Agreement’s main goal of limiting global heating to no more than 1.5°C above pre-industrial levels by the end of the century. But this increasingly feels like a pipe dream. Even limiting warming to a 2°C rise seems more unlikely with each passing year. Keeping the promises made in Paris would require a 43% cut in global greenhouse gas emissions by 2030. Current policies are on track to cut only around 0.3% by that date.

The results of further inaction are clear. They can be seen everywhere from Bangladesh and Pakistan to Australia and California. So, what can be done now that will make a difference?

The things we can control

Lighting accounts for 13% of all electricity usage worldwide and 4% of global greenhouse gas emissions. Cities account for about 78% of global energy consumption, and lighting accounts for around 40% of that. Too often, lighting is overlooked as a sustainable solution, but transitioning to connected LED is a sure-fire way to reduce emissions and benefit from immediate energy savings – as much as 80% over conventional non-connected lighting.
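Those shares can be combined into a rough upper bound on what a full conversion could avoid. A back-of-the-envelope sketch, using the 4% emissions share quoted above and assuming the full 80% saving applied everywhere (illustrative only):

```python
# Upper-bound estimate: what share of global GHG emissions could a
# complete switch to connected LED avoid? Illustrative figures only.
lighting_share_of_ghg = 0.04   # lighting's share of global GHG emissions
led_saving_fraction = 0.80     # assumed energy saving vs conventional lighting

avoided_share = lighting_share_of_ghg * led_saving_fraction
print(f"~{avoided_share:.1%} of global GHG emissions")  # ~3.2%
```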

If the EU switched all the existing conventional light points in its 27 member countries to LED, it would save around €65 billion per year — money that could be spent on developing sustainable energy and other initiatives while also relieving the burden on citizens struggling to pay their energy bills.

What about on a bigger scale? A global switch to LED could see energy consumption drop by 5% globally by 2030, even while the total number of light points continues to rise.

Relying on an annual climate conference isn’t enough to stave off catastrophe. Action is needed now, by all of those with the power to enact change. Connected LED technology is here and available today — it’s proven, cost-effective, and relatively easy to deploy. Buildings and cities simply can’t achieve net zero or energy-neutral targets without it.

The barriers to switching to connected LED lighting are low, especially with funding available from programmes such as the EU Green Deal and the IIJA in the US. It’s time to pick up the pace.

Learn more about the advantages of switching to public LED lighting here.

Green IT

Cloud, sustainability, scale, and exponential data growth—these major factors that set the tone for high performance computing (HPC) in 2022 will also be key in driving innovation for 2023. As more organizations rely on HPC to speed time to results, especially for their data-intensive applications, the $40B market[1] faces challenges and opportunities. Fortunately, the HPC community is both collaborative and transparent in our work to apply and advance supercomputing technologies. 

Recently members of our community came together for a roundtable discussion, hosted by Dell Technologies, about trends, trials, and all the excitement around what’s next. 

Answers to Supercomputing Challenges

We identified the following challenges and offer leading thoughts on each.

Sustainability: As the HPC market grows, so do the implications of running such energy-intensive and complex infrastructure. In an effort to achieve sustainability, industry leaders are prioritizing ways to reduce CO2 impact and even decarbonize HPC—not an easy task with total power usage increasing. What’s motivating our move to energy reduction:

- A dependence on expensive fossil fuels that makes supercomputing more costly overall
- The high and unfavorable environmental impact of fossil fuel-based energy
- Pressure from governmental agencies to design more efficient solutions and data centers

Though HPC customers want to measure their own energy usage, the tools to do so may not yet offer sufficient metrics at the application scale. Meanwhile, some in the European Union are shifting their priority from running operations faster to running them with lower power consumption. This calls for experimenting with more efficient choices, like running workloads on different chips.
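The trade-off can be made concrete: energy-to-solution is power draw multiplied by runtime, so a slower but more frugal chip can still come out ahead. A hypothetical comparison (chip names and all numbers invented for illustration):

```python
def energy_to_solution_kwh(power_kw: float, runtime_hours: float) -> float:
    """Energy consumed to complete a workload: power x time."""
    return power_kw * runtime_hours

# Chip A: faster but power-hungry. Chip B: slower but more efficient.
chip_a_kwh = energy_to_solution_kwh(power_kw=30.0, runtime_hours=2.0)  # 60.0
chip_b_kwh = energy_to_solution_kwh(power_kw=12.0, runtime_hours=4.0)  # 48.0

# Despite taking twice as long, chip B finishes the job on 20% less energy.
print(f"Chip A: {chip_a_kwh} kWh, Chip B: {chip_b_kwh} kWh")
```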

Cooling: Over the next few years, we will see an increase in the use of silicon processors and accelerators that require significantly more power—and thus generate more heat. As leaders in the HPC industry, we are worried about how to cool these data centers.

Many are looking at innovative data center designs, including modular centers and colocation. Another big focus is liquid cooling.[2] Direct liquid cooling offers superior thermal management and five times the cooling capacity of air flow. Immersion cooling leverages oil-based engineered fluids for a high-performance, complex cooling solution. Deployed successfully around the globe, liquid cooling is becoming essential to future-proofing data centers.
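The physics behind liquid cooling’s advantage is basic heat-transfer arithmetic: the heat a coolant stream removes is Q = ṁ · c_p · ΔT, and water carries roughly four times more heat per kilogram than air, while its density makes high mass flow far easier to achieve. A sketch with typical property values (flow rates are illustrative):

```python
def heat_removed_kw(mass_flow_kg_s: float, cp_kj_per_kg_k: float,
                    delta_t_k: float) -> float:
    """Heat carried away by a coolant stream: Q = m_dot * c_p * delta_T."""
    return mass_flow_kg_s * cp_kj_per_kg_k * delta_t_k

# Specific heat: air ~1.0 kJ/(kg*K), water ~4.18 kJ/(kg*K).
# Same mass flow and temperature rise for a like-for-like comparison.
air_q = heat_removed_kw(mass_flow_kg_s=2.0, cp_kj_per_kg_k=1.0, delta_t_k=10.0)
water_q = heat_removed_kw(mass_flow_kg_s=2.0, cp_kj_per_kg_k=4.18, delta_t_k=10.0)

print(f"Air: {air_q} kW vs water: {water_q} kW at the same mass flow")
```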

Scaling out and developing large-scale systems: To meet demand, the HPC industry is developing and honing strategies to effectively scale and deploy large systems that are both efficient and reliable. It’s a tall order, and one that will hinge on a few factors:

- Accelerator deployment and management at scale
- Changes to power and cooling design decisions at very large scale
- Open-source deployment of high-performance clusters to run simulation, AI, and data analytics workloads

What’s New and Growing Among HPC Users? 

In the HPC industry, we are experiencing and driving massive shifts in terms of what we do, and how and where we do it. Here are the shifts we noted:

Delivery Models: Moving from an almost strictly on-premises systems approach (with some remote access services), HPC is embracing remote delivery models. Customer interest in HPC delivery models like colocation, managed services, and cloud computing is driven by tremendous growth in service-based models (including IaaS/PaaS/SaaS) along with on-demand and subscription payment models. Of course, data center challenges are also driving demand for these alternatives. New solutions, including Dell APEX HPC Services and HPC on Demand, address these customer requirements and wants.

Workflow Optimization from Edge to Core/Cloud and Back: Integration with edge devices as well as integration with different HPC systems is currently designed in-house or otherwise customized. Moving forward, we will see workflows that are more capable and widely adopted to facilitate edge-core-cloud needs like generating meshes, performing 3D simulations, performing post-simulation data analysis, and feeding data into machine learning models—which support, guide, and in some cases replace the need for simulation.

Advances in Artificial Intelligence and Machine Learning (AI/ML): AI/ML will continue growing as an important workload in HPC. Due to the rapid growth in data sizes, there’s an increased need for HPC solutions that can run large training models. At the same time, these models can complement simulation, guiding targets or reducing parameter space for some problems. In the HPC community, we recognize a need for tools to support machine learning operations and data science management; these tools must be able to scale and integrate with HPC software, compute and storage environments.

Data Processing Units: We anticipate a jump in DPU usage but must support customers in figuring out which use cases offer quantifiable advantages in price/performance and performance/watt. It’s important to note that more research and benchmark comparisons are needed to help customers make the best decisions. Some examples of when DPUs can be advantageous for HPC workloads include: 

- Collective operations
- Bare metal provisioning for maximum HPC performance by moving the hypervisor to the DPU, freeing up CPU cycles
- Improving communications by task offload; if codes are task-based, the user can potentially move tasks to less busy nodes

Composable Infrastructure: We note the benefits in resource utilization offered by composable infrastructure, but still see uncertainty about whether it’s the future or a niche. As with DPUs, more research and quantifiable comparisons are needed to support customers in determining whether composable infrastructure is right for their next system. It’s true that specific AI workflows require special hardware configurations, and composable infrastructure may remove the restrictions of traditional architectures, but there’s debate[3] about whether it can scale and whether increased flexibility and utilization would deliver the ROI.

Quantum computing

As an HPC community, we share a growing consensus that quantum computing (QC) systems will and must be integrated with ‘classical’ HPC systems. QC systems are superior only at certain types of calculations and thus may best serve as accelerators. At Dell Technologies we have developed a hybrid classical/quantum platform that leverages Dell PowerEdge servers with Qiskit Dell Runtime, and IonQ Aria quantum processing units. With the platform, classical and quantum simulation workloads can execute on-premises, while quantum workloads, such as modeling larger and more complex molecules for pharmacological development, can be executed with IonQ QPUs.[4]

The HPC Outlook for 2023

The impressively large HPC market continues to grow at a healthy rate, fueled by commercial demand for data processing and AI/ML training. HPC workloads and delivery models are more diverse than ever, leading to a more diverse customer community. And in HPC, community is important. Though we face some of the greatest challenges in application, system and data center scaling, HPC technologies remain at the leading edge of computing. 

As a community, we keep sharing information and we stay on top of developments to realize the maximum benefits of HPC. A robust resource for sharing, learning, and networking is the Dell HPC Community. This global community of HPC customers, users, and companies comes together for weekly online events that are open to all, as well as in-person meetings three times a year.

Engage with the Dell HPC community by visiting






Intel® Technologies Move Analytics Forward

Data analytics is the key to unlocking the most value you can extract from data across your organization. To create a productive, cost-effective analytics strategy that gets results, you need high-performance hardware that’s optimized to work with the software you use.

Modern data analytics spans a range of technologies, from dedicated analytics platforms and databases to deep learning and artificial intelligence (AI). Just starting out with analytics? Ready to evolve your analytics strategy or improve your data quality? There’s always room to grow, and Intel is ready to help. With a deep ecosystem of analytics technologies and partners, Intel accelerates the efforts of data scientists, analysts, and developers in every industry. Find out more about Intel advanced analytics.

IT Leadership

Next-generation chatbots are now writing poetry and giving math lessons, but these smart applications have a bigger job to do. Advanced chatbots simulate human interaction via complex artificial intelligence (AI) processes, or conversational AI. As business-ready systems, conversational AI is joining mainstream tech to deliver strategic benefits to customers and employees. For companies looking to adopt or expand their use of conversational AI, there’s quite a bit to understand and consider.

Now that humans and machines are talking to each other, decision-makers will need clarity around the capabilities—especially as they vet various products and platforms. It helps to start by defining some key terms.

Artificial intelligence (AI): A wide-ranging category of technology that allows computers to “strive to mimic human intelligence through experience and learning.”[1] Common AI applications involve analysis of language, imagery, video, and data.

Machine learning (ML): In its definition of AI, Gartner cites ML as one of AI’s notable “advanced analysis and logic-based techniques,”[2] whereby computer systems can learn from their experiences without explicit programming.

Natural language processing (NLP): Focuses on machine reading comprehension through grammar and context, enabling a computer to determine the intended meaning of a sentence. Known for applications such as voice-to-text and language translation, NLP uses AI and often ML to enable a computer to understand spoken or written human language.

Natural language generation (NLG): Focuses on text generation, or the construction of text in English or other languages, by a machine and based on a given dataset.

Conversational AI: This advanced application of NLP is what allows people to have a spoken or written conversation with a computer system. At their best, conversational AI systems closely match human conversation—passing a measure called the Turing test.[3] Here’s how it works from a technical perspective: during the automatic speech recognition (ASR) stage, a person may ask a question and the application converts that audio waveform to text. During the NLP phase, the question is interpreted, and the device generates a smart response. Finally, the text is converted back into audio for the user during the text-to-speech (TTS) stage.

A Quick Rundown of How Conversational AI Works

Asking a smartphone whether it’s going to rain, telling a virtual assistant to play ’90s hip hop, requesting a navigation system give directions to a new sushi restaurant—each is an example of interacting with conversational AI. By speaking in a normal voice, a person can communicate with a device that understands, finds answers, and replies with natural-sounding speech.

Conversational AI may seem simple to the end user. But the technology behind it is intricate, involving multiple steps, a massive amount of computing power, and computations that occur in less than 300 milliseconds. When an application is presented with a question, the audio waveform is converted to text in what’s known as the automatic speech recognition stage. Using NLP, the question is interpreted and a response is generated. At the next step, called text-to-speech, the text response is converted into speech signals to generate audio. 
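The three stages can be expressed as a simple pipeline. The sketch below is purely illustrative: the function names and canned outputs are invented stand-ins, and a real system would call dedicated ASR, NLP, and TTS engines at each stage:

```python
# Illustrative sketch of the ASR -> NLP -> TTS pipeline described above.
# Every function here is a stub with invented, canned behavior.

def automatic_speech_recognition(audio_waveform: bytes) -> str:
    """Stage 1 (ASR): convert the audio waveform to text."""
    return "will it rain today"  # canned transcript for illustration

def interpret_and_respond(question: str) -> str:
    """Stage 2 (NLP): interpret the question and generate a response."""
    return "Light rain is expected this afternoon."

def text_to_speech(text: str) -> bytes:
    """Stage 3 (TTS): convert the response text back into audio."""
    return text.encode("utf-8")  # stand-in for synthesized audio

def handle_utterance(audio_waveform: bytes) -> bytes:
    """Run one user utterance through all three stages."""
    transcript = automatic_speech_recognition(audio_waveform)
    reply_text = interpret_and_respond(transcript)
    return text_to_speech(reply_text)

audio_reply = handle_utterance(b"\x00\x01")  # fake waveform bytes
print(audio_reply.decode("utf-8"))
```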

Why Customers and Employees Prefer Conversational AI

Most people have experienced the frustration of talking to a legacy chatbot, and perhaps even resorted to anger or shouting “Representative!” But once chatbots are enhanced with conversational AI capabilities, research shows customer satisfaction rates to be three times higher, attributed to shorter wait times and more accurate, consistent customer support.[4]

For employees, conversational AI can reduce stress and boost productivity by handling most low-level tasks and easing their day-to-day human-machine interactions. This frees up staff for other valuable and higher-level functions, benefiting customers and increasing morale.

Overall, for companies, the benefits may seem obvious: more productive staff and better customer service leading to increased productivity as well as higher customer satisfaction and retention rates. An additional benefit comes from the learning and training of models that continually improve and enhance employee and customer experiences.

Conversational AI in Action, From Retail to Healthcare to Real Estate

In constant search of competitive advantage, companies are increasing their investments in AI to the tune of a projected $204 billion by 2025.[5] Across industries, the technology promises to deepen customer insights, drive employee efficiency, and accelerate innovation. 

In retail, conversational AI is giving shoppers a streamlined experience with call centers and customer service interactions. As the clunky chatbots of yore are replaced with savvy AI chatbots, customers can quickly get their questions answered, receive product recommendations, find the proper digital channel for their inquiry, or connect with a human service agent. 

In healthcare, applications for conversational AI can support telehealth patient triage to identify potential medical conditions. Systems can also be trained to securely manage patient data—making it easier to access information such as test results or immunization records. And the technology can support patients who are scheduling an appointment, checking on insurance eligibility, or looking for a provider.

In real estate, conversational AI tools are being applied to the time-sensitive lead generation process, automating functions for accuracy and efficiency. Chatbots are also handling initial conversations to assess what a customer is looking to buy or sell. Given AI’s ability to handle thousands of calls per day, a program can be integrated with the customer relationship management system, or CRM, to create more positive experiences.

Five Questions to Ask Before Deploying a Conversational AI System

Once a company is ready to explore a conversational AI project, there will be groundwork. Here are five essential questions—and clues to finding the answers.

1. What kind of hardware do you need? The answer depends on the application scope and throughput needs. Some implementations rely on ML tools and run best on high-performance computing. Others may be more limited in scope. In any case, Dell Technologies Validated Designs offer tested and proven configurations to fit needs based on specific use cases.
2. Which user interface options will your project support? Whether it’s a text-only chatbot or the more user-friendly voice interface, the decision must be based on what’s best for the customer and the budget.
3. What platforms will be supported? Determine how customers might access the chatbot—via mobile app, web, social media—and think about whether to integrate with popular voice assistants.
4. Will you build your own or rely on a vendor? Doing it in-house requires expertise and time but offers more control. If selecting a vendor, consider whether one vendor or multiple vendors will be needed for the end-to-end system.
5. What kind of infrastructure will you need? This depends on whether the implementation will be hosted in a private or public cloud service. For those hosting in their own data centers, likely for compliance or security reasons, be sure the vendor’s systems are designed specifically to meet the speed and performance needs of conversational AI.

As consumers become more familiar with AI, using it to create art and pay bills and plan their workouts, the technology holds greater professional promise. Conversational AI is already supporting a number of essential business functions—a boon for customers, staff, and the bottom line. Executives can set the foundation for their own advanced chatbots and other applications by ensuring their IT systems are ready for innovation. 

Read the Guide to Conversational AI for Financial Services, and explore AI solutions from Dell Technologies and Intel.

IT Leadership

The demand for ongoing transformation and innovation is going to continue to drive IT budgets into 2023. Faced with the challenges of inflation, recession, geopolitical instability, and the broader economy, organisations see IT as the way forward.

Research shows that more than half – 52 per cent – of organisations expect to increase IT spending. Among those that have already commenced the transformation journey, that number rises to 67 per cent.

Transformation is broad, however, and what IT leaders will see as a priority for investment will be technologies that bring human-centricity to the experience of interacting with technology. What this comes down to is two priorities – helping staff work at maximum productivity and efficiency, and ensuring that they’re happy in their jobs.

As Forrester puts it, recession fears and talent constraints make paying attention to existing employees more important than ever. Deep within the “Great Resignation” and facing an unemployment rate of just 3.4 per cent, every member of the executive team is grappling with the challenge of talent.

By deploying IT spending correctly, the CIO is in the best position to solve this issue. Lenovo research shows that 75 per cent of employees seek purpose-driven work, and that transformation spend can be most effectively deployed in delivering solutions that engage workers, free up focus time, and improve business outcomes by helping everyone to do their best work.

Priorities for human centricity

The first priority for CIOs is to understand that remote work is here to stay, and the focus needs to be on turning that into a strategic opportunity. The Forrester research suggests that four in ten hybrid-working companies will try to roll back that policy – and that doing so will backfire on them.

In contrast, Lenovo’s research reveals that investing in remote work offers advantages. Employees are more productive, and 78 per cent of employees report that better collaboration technology has unlocked the opportunity to recruit a more diverse workforce, giving access to broader skillsets. In this context, it’s no surprise that 44 per cent of IT leaders plan to invest in hybrid collaboration tools to improve remote communication.

Elsewhere, employees also want to believe in the work they’re doing, and this means being a good corporate citizen and investing in sustainable solutions. This needs to be a multi-faceted approach, including the use of renewable energy, leveraging technology that is energy efficient, and reducing waste by ensuring that technology is reparable and has a long life. Hybrid work has a role to play here, too, by helping to eliminate the energy-inefficient commute to work.

However, human centricity also means understanding how people work. In a hybrid work environment, there will be times when employees want to come into the office, and there they need a seamless, superior experience: the ability to continue working as they had at home, in a better-equipped setting. Communication between onsite and offsite employees also needs to be seamless, and for CIOs this need to modernise the in-office experience will result in new IT projects. For example, Lenovo research shows that 32 per cent of companies have subscriptions to workplace collaboration tools that help manage IT-related tasks. CIOs may need to invest in further solutions to continue strengthening the “work anywhere” experience.

Delivering human centricity needs holistic solutions

As the popular saying goes, too many cooks spoil the broth. The more individual pieces of technology and services that a company uses, the more likely it is to compromise the employee experience, as solutions don’t work seamlessly together. Vendor consolidation is expected to be a key theme for CIOs going into 2023.

For Lenovo, being able to provide an end-to-end solution to CIOs has been a key priority. By choosing a trusted partner to help streamline solutions, Lenovo promotes a superior human-centric experience in three ways:

- Lead with a purpose-driven vision – Lenovo solutions equip employees with durable, repairable, energy-efficient technology, from individual devices right up to the datacentre solutions in the office.
- Super-charge collaboration – Lenovo solutions leverage AR devices, as well as smart platforms, to allow for complex, rich real-time collaboration, screen, and device sharing.
- Support from a trusted end-to-end technology partner – Lenovo solutions make Device-as-a-Service, Infrastructure-as-a-Service and custom cloud solutions from a single partner possible, allowing the organisation to consolidate their touchpoints and vendors.

A full 82 per cent of IT leaders want to work with technology that delivers on the value of the transformed workforce. This means technology that allows for human centricity – and, in 2023, it will determine whether hybrid work becomes a point of competitive advantage for the organisation.

Remote Work

GITEX GLOBAL is the largest event for global enterprise and government technology, with public digital transformation top of the agenda. Though different countries had slightly different priorities, they were all rebuilding from the pandemic through initiatives in three areas:

- Digital public utilities, which include e-services for essentials such as gas, water, and communication systems
- Digital economy initiatives, such as lowering the barriers to entry for businesses to transform and expand
- Digital society, focusing on simplifying citizen-government interactions

The blueprint for national digital transformation

Huawei’s Global Chief Public Services Industry Scientist, Hong-Eng Koh, detailed Huawei’s long involvement in digital transformation.

“Over the past decade, Huawei has been involved in over 700 cities and 100 countries in terms of public services and transformation. And we’ve noticed certain similarities required for public service digital transformation to be successful,” says Koh.

Government agencies tasked with digital transformation projects should keep some key considerations in mind to ensure success. Koh further outlined seven questions transformation leaders should ask themselves:

1. Do you have a vision and a champion that can drive the transformation?
2. Have you identified the department, governor, and process to implement the transformation? If none are suitable, can you create a department dedicated to it?
3. Do you know which laws and regulations are impacted by the transformation?
4. Have you planned for an operating model that is sustainable from a budget standpoint?
5. Do you have the right skillsets/talent in place to execute the transformation?
6. Do you have a digital and data strategy in place?
7. Do you have the right technology ecosystem to support transformation?

Developing the technology ecosystem

Huawei’s experience in public digital transformation projects has given it a leading position in the technology ecosystem that supports them. “Data is a new crude oil. By itself, it has no value. At Huawei, we have the end-to-end technologies to process the data and extract value from it,” Koh explains.

It’s important to select a full-stack tech vendor that can support all the foundational elements of transformed business processes. This includes data awareness technologies, such as Wi-Fi 6, LTE, and IoT platforms that provide the mesh of communication technologies across wide areas to efficiently gather data from sensors across the community.

Transmitting data to edge processing points and central data centres also needs to be both high-performing and cost-efficient. Huawei supports more than 50% of global national backbones and last-mile connectivity.

In the zettabyte era where data growth is unprecedented, Huawei builds data centres and other essential computing infrastructure—in particular, storage—to provide a robust infrastructure that efficiently supports the data.

However, as Koh pointed out, data will only deliver value when processed and refined. To that end, Huawei offers the software (such as database-as-a-service, security-as-a-service, and computing-as-a-service), the platform (cloud with AI), and advanced services such as AI and ML frameworks to support and speed up data processing and reduce time-to-value.

The digital transformation imperative

The recent pandemic has served as a catalyst to advance the business case for digitisation of governments globally. And with citizens having experienced how digital has enabled a semblance of normalcy during lockdowns and border closures, they will only continue demanding more efficient digital government delivery that can ultimately enhance their way of life.

As a result, leading governments that have embraced this opportunity are reaping the benefits of improved productivity, shorter lead times to action community requests, reduced rates of error, and improved workforce and community satisfaction.

Find out more at HUAWEI CONNECT 2022

Accelerate your government’s digital transformation agenda by attending HUAWEI CONNECT 2022 in Bangkok, Dubai, Paris, or Shenzhen.

Learn how enterprise, ecosystem partners, application providers, and developers can work best together to lead best practice industry development and build an open, sound ecosystem.

Join us there to Unleash Digital. For more information, visit


In the second of this two part CIO webinar series ‘Driving business success with true enterprise applications’, we speak with DXC Technology, brewing giant Lion and analysts Ecosystm about ‘How to take customer experience to the next level’.

Today more than ever before, the customer is king.

And having been conditioned – some might say spoilt – over the past several years to hyper-personalisation via the growing number of intelligent digital platforms, delivering them the best possible experiences has now emerged as a major competitive differentiator.

Key to this is ensuring you have the right data and systems to underpin every customer interaction. This means having all customer-related data in a single repository, updated in real time, and accessible by business critical systems.

But as with so many challenges in this business, it’s a lot easier said than done. You need the right strategy, the right teams and the right culture, as well as the right solutions in place. And you need a proper plan to bring everyone in the organisation along with you on the journey.

Brewing giant Lion is one of the most established companies in Australasia, having carried some of our most famous and enduring beverage brands over a more than one hundred year history. And the explosion of new beer brands, especially in the craft space over the past several years, means it’s a more dynamic and complex business than ever.

Back in 2019, before COVID, Lion’s head of customer business process excellence, Nicole Parés, and her team embarked on a mission to gain a “360 degree view” of the company’s partner customers.

Shortly after, they met and quickly partnered with DXC Technology to set out a strategy to optimise the SAP Marketing solution and provide a customer experience that only a fully integrated tech stack could provide.

Changing customer – and partner – expectations demanded that the company undertake a full transformation if it was going to develop proper systems for managing its many moving parts and remain competitive into the future.

“I recognised that we had three hundred legacy systems with very little integration between them, making it difficult to utilise data and analytics, to draw insights and make business decisions,” Parés recalls.

Deployment of multiple SAP modules followed, until one day the legacy systems were switched off and Lion was operating in a new digital world.

Data driven

Perhaps the most profound change, Parés says, has been the cultural shift towards becoming an organisation that properly understands the value of data, especially in terms of driving better customer experiences.

“I can see the impacts throughout our entire value chain … it [data] gives you the avenue through the technology of that marketing cloud to be really targeted in who your customer segments are.

“That in itself has been a transformative journey for our business”.

Matthew Varone, SAP CX consultant with DXC Technology, says he and his team were initially brought in to work on a fairly narrow use case deploying SAP CX applications and marketing cloud.

From there the remit was expanded to encompass a broader plan to improve engagement with Lion’s extensive partner channel.

“[We started] working towards the right solution that’s 100 percent focussed on excellent customer experiences”.

This meant making sure the solution was right for everyone “and so that involved partnering very closely and doing lots of discovery, and going on the journey together to make sure we get the right outcome.”

Alan Hesketh, principal adviser with Ecosystm and author of Start fast: Achieving rapid impact from digital transformation, has himself been at the coalface of several major digital transformation projects, in particular in the retail and fast-moving consumer goods sectors.

He notes that for companies like Lion the challenge is how to harness data so that people find it easier – and are more inclined – to do business with you.

It’s essential therefore, to have “accurate data”.

“[It means] you’re able to use the data that they’re giving you to do the right kind of promotional activities to get the expected returns … and so when they’re doing that electronic ordering, it just flows through smoothly.”

Varone explains that DXC has helped Lion connect with its customers across multiple touchpoints.

“We’re measuring click through rates and engagement when they’re on the phone with agents and through the CRM system. You know those interactions are coming back, and all of those are being centralized and building this vision of one picture of a customer.”
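Centralising interactions like this can be pictured as a simple fold over touchpoint events. The sketch below is a minimal illustration, not Lion’s or DXC’s actual SAP pipeline; the event fields (`customer_id`, `channel`, `type`, `timestamp`) are assumed names for the purposes of the example.

```python
from collections import defaultdict

def build_customer_profiles(events):
    """Fold touchpoint events (email clicks, call-centre contacts, CRM
    updates) into one profile per customer, keyed on a shared ID."""
    profiles = defaultdict(lambda: {"touchpoints": set(), "events": []})
    for event in events:
        profile = profiles[event["customer_id"]]
        profile["touchpoints"].add(event["channel"])
        profile["events"].append((event["timestamp"], event["type"]))
    # Keep each customer's history in chronological order.
    for profile in profiles.values():
        profile["events"].sort()
    return dict(profiles)

# Example: three events across two channels collapse into one view per customer.
events = [
    {"customer_id": "C1", "channel": "email", "type": "click", "timestamp": 1},
    {"customer_id": "C1", "channel": "phone", "type": "agent_call", "timestamp": 3},
    {"customer_id": "C2", "channel": "email", "type": "open", "timestamp": 2},
]
profiles = build_customer_profiles(events)
print(sorted(profiles["C1"]["touchpoints"]))  # ['email', 'phone']
```

The design point is the shared key: the “one picture of a customer” only emerges once every channel tags its events with the same identifier.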

Parés adds that being able to demonstrate that the project would give rise to new revenue streams was a key factor in getting buy-in from the broader executive team. Further endearing her and her team was the fact that there were “zero issues” after go-live; a remarkable achievement for a project of this scale.

“It’s amazing now to look back and reflect on what that journey was like for us, putting that case forward and getting the buy-in. Now we have the right technology, we can turn that into a competitive advantage.”

Working on the marriage

Much is made of the importance of having great teams if major digital projects are to achieve their most ambitious objectives. And that also means different teams being able to work well together.

Parés reflects that her team and Varone’s team at DXC were united by a strong sense of purpose, but without any illusions about what needed to be done.

“We kept each other accountable, challenging what it was we were trying to achieve, and how we would get there.

“I think for me what I really enjoyed was it wasn’t a case of saying ‘here’s what I want to do. Go and do it.’ It was a partnership to understand, ‘well, have you considered that this is possible, or why are you doing this process right now?’.”

Varone feels that that’s what made the “marriage” between Lion and DXC special.

“Did it take work? Yes, like most teams like this, forming and storming and norming, and all of those sorts of progressions that we all make together.”

“But there was always a lot of respect for each other”.


By Andy Nallappan, Chief Technology Officer and Head of Software Business Operations, Broadcom Software

Last month at Gartner Symposium in Orlando, Fla., I enjoyed talking with ZDNet’s Chris Preimesberger and Sahana Sarma, leader of Google Cloud’s transformation advisory, about the enterprise software landscape and how it is growing more complex and business-critical daily. Transforming and modernizing software are key priorities for global organizations and critical to achieving the highest level of security and compliance. Many industries, from manufacturing to automotive to financial services, are becoming increasingly software-driven, changing their traditional portfolio mix and business models.

Here are some key points from that discussion:

Investing in R&D

At Broadcom, we are seeing some of the same challenges our customers face that lead them to transform their software, including business transformation, talent risks, and managing costs. In a hyper-competitive market where start-ups are challenging established enterprises, companies are looking to pivot their business models through digital transformation. For Broadcom, which is experiencing some of the same challenges, it’s interesting to note that our R&D spend has outpaced our revenue by almost 50%. That tells you quite a bit about how we approach our business and the importance we place on innovation.

Broadcom wasn’t originally a software company – it started in the early 1960s as the processor-making division of HP – but we got into the software business when we acquired CA Technologies and the Symantec Enterprise business. They were not modern cloud-based software companies and their portfolios were a mix of traditional on-prem software, cloud services, and some cloud-native.

One of the biggest challenges Broadcom had was to standardize platforms and processes of our acquired software companies, and so Broadcom Software worked closely with Google Cloud on this transformation journey, leading us to modernize and transform our own enterprise software. We wanted to standardize and put into a single, cloud-native platform all of our software, where it could give the biggest benefits to customers, and so it could scale, be resilient, and secure.

Continuous Innovation

With our Google Cloud journey, we’ve brought all of our software platforms onto a single SaaS platform. One of the reasons our customers want SaaS applications is that they want to see innovation happening at a faster rate. If you’re using a traditional on-prem application, you have to do upgrades and reinstalls, and it takes years to get it completed.

So as a software provider, we’ve got to deliver those SaaS apps in that space and the new features that go with them, and we need to do them in a speedy DevOps manner. Going to the cloud and modernizing (these systems) enables our developers to deliver all of this to the expectations of our customers in order to help them transform their businesses.

At Broadcom, we worked to give each of our software divisions a single pane of glass to better manage the business and track what was happening from the sales motion to customer adoption to R&D spend.  We also centralized software operations so the engineers could focus on delivering technologies that solved big, complex problems for our customers — in a way we liberated them to focus on great innovations and stronger customer experiences.

Sahana shared that at Google Cloud what they see from their customers when they transform their software is that they get to introduce more modern practices into their technology stack like containerization. So by having a more modern software stack, you can easily add in newer technologies — which introduces innovations. And by developing, adopting and promoting Open Source technologies Google Cloud ensures the neutrality of cloud and helps safeguard investments. Finally, by using Google Cloud as the backbone, you can free up software engineers to focus on new technologies and new ideas since they are not bogged down by complexities of different architectures and platforms.

Exceptional Experiences

Transforming and modernizing your software stack can help customers deliver better experiences for their customers and employees, including:

- Always available and auto-scalable: Clients deliver improved experience to their end customers with an always available and auto-scalable technology stack to meet customer demands with unparalleled responsiveness, leveraging our fastest and safest global network.
- Modernizing applications: Using solutions like Apigee and Anthos, customers are unlocking their traditional systems to leverage the flexibility and agility of Cloud. If systems are unified, end-users can get what they need done in fewer steps. It improves the overall experience the end users have with the product or brand.
- AI & ML: A modernized technology stack allows them to leverage AI & ML tools, making it easier for our customers to anticipate the needs of their end-users to deliver a better experience, besides significantly improving operational efficiencies.

Reducing IT Complexity

Reducing IT complexity brings customers a large range of benefits, like faster delivery of products, improved compliance, and a higher level of security. The more complex your IT architecture is, with different platforms or silos, the more risk you introduce. By modernizing and having an open system, you can reduce IT complexity and therefore reduce risk. Most importantly, a modernized technology stack helps you quickly adapt and respond to market, economic, and customer demands, not only lowering risk but delivering more success for your business and your customers.

To learn more about how Broadcom Software can help you modernize, optimize, and protect your enterprise, contact us here.

About Andy Nallappan:

Broadcom Software

Andy is the Chief Technology Officer and Head of Software Business Operations for Broadcom Software. He oversees the DevOps, SaaS Platform & Operations, and Marketing for the software business divisions within Broadcom.


2022 could be a turning point for pairing edge computing and 5G in the enterprise. Let’s examine trends to watch.

The distributed, granular nature of edge computing – where an “edge device” could mean anything from an iPhone to a hyper-specialized IoT sensor on an oil rig in the middle of an ocean – is reflected in the variety of its enterprise use cases.

There are some visible common denominators powering edge implementations: Containers and other cloud-native technologies come to mind, as does machine learning. But the specific applications of edge built on top of those foundations quickly diversify.

“Telco applications often have little in common with industrial IoT use cases, which in turn differ from those in the automotive industry,” says Gordon Haff, technology evangelist, Red Hat.

This reflects the diversity of broader edge computing trends he sees expanding in 2022.

When you pair maturing edge technologies and the expansion of 5G networks, the enterprise strategies and goals could become even more specific.

Simply put, “the 5G and edge combination varies by the type of enterprise business,” says Yugal Joshi, partner at Everest Group, where he leads the firm’s digital, cloud, and application services research practices.

Broadly speaking, the 5G-edge tandem is poised to drive the next phases of digital transformations already underway in many companies. As Joshi sees it, there will be a new wave of high-value production assets (including the copious amounts of data that edge devices and applications produce) becoming mainstream pieces of the IT portfolio – and subsequently creating business impact.

“Enterprises combine 5G to edge locations and create a chain of smart devices that can communicate with each other and back-end systems, unlike earlier times where network transformation didn’t touch the last-mile device,” Joshi says.


Edge computing’s turning-point year

The 5G-edge pairing is a long-tail event for enterprises. But there are plenty of reasons – including, of course, the expansion of telco-operated 5G networks – to think 2022 will be a turning-point year.

“We’ll see the transition from many smaller, early-stage deployments to wide-scale, global deployments of production 5G networks, following cloud-native design principles,” says Red Hat CTO Chris Wright. “As we provide a cloud-native platform for 5G, we have great visibility into this transition.”

“In 2022, 5G and edge will unify as a common platform to deliver ultra-reliable and low latency applications,” says Shamik Mishra, CTO for connectivity, Capgemini Engineering. A confluence of broader factors is feeding this type of belief, including, of course, more widely available 5G networks.

“Edge use cases have a potential to go mainstream in 2022 with the development of edge-to-cloud architecture patterns and the rollout of 5G,” says Saurabh Mishra, senior manager of IoT at SAS.

The “last mile” concept is key. From a consumer standpoint, the only thing most people really care about when it comes to 5G is: “This makes my phone faster.”

The enterprise POV is more complex. At its core, though, the 5G-edge relationship also boils down to speed, usually expressed in two related terms more familiar to the world of IT: latency and performance. The relentless pursuit of low latency and high performance is embedded in the DNA of IT leaders and telco operators alike.
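In practice, latency targets like these are tracked as percentiles rather than averages, because it is the slow tail of requests that users actually feel. A minimal sketch using Python’s standard library (the sample values are invented):

```python
import statistics

def latency_percentiles(samples_ms, points=(50, 95, 99)):
    """Summarise round-trip samples by percentile; tail percentiles
    (p95/p99) expose the outliers that averages hide."""
    # quantiles() with n=100 returns the 1st..99th percentile cut points.
    cuts = statistics.quantiles(samples_ms, n=100)
    return {f"p{p}": cuts[p - 1] for p in points}

# One slow request barely moves the mean but dominates the p99.
samples = [10.0] * 99 + [250.0]
summary = latency_percentiles(samples)
print(summary["p50"], summary["p99"])
```

This is why edge placement matters: moving processing closer to the device shortens the round trip for every request, which pulls the whole distribution down, not just the median.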

New horizons, familiar challenges

Consumer adoption of 5G and edge is enviably straightforward: Do I live in a coverage area, and do I need a new phone?

Obviously, there’s a little more to it from both the operator and broader enterprise perspective. While the potential of 5G-enabled edge architectures and applications is vast – and potentially lucrative – there will be some challenges for IT and business leaders along the way. Many of them may seem familiar.

For one, the 5G-edge combo in an enterprise context invariably means deploying and managing not just IT but OT (operational technology), and lots of it. As with other major initiatives, there will be a lot of moving parts and pieces to manage.

“Governance and scale will continue to be a challenge given the disparate people and systems involved – OT versus IT,” says Mishra from SAS. “Decision-making around what workloads live in the cloud versus the edge and a lack of understanding about the security profile for an edge-focused application will also be a challenge.”

Scale may be the biggest mountain to climb. It will require pinpoint planning, according to Kris Murphy, senior principal software engineer at Red Hat.

“Standardize ruthlessly, minimize operational ‘surface area,’ pull whenever possible over push, and automate the small things,” Murphy says.
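Murphy’s “pull whenever possible over push” advice can be pictured as a reconciliation loop: each edge node periodically fetches the centrally declared desired state and applies the diff itself, rather than waiting for a central system to push changes to thousands of sites. A minimal sketch under that assumption, with `fetch_desired` standing in for a real call to a config service:

```python
def reconcile(current, desired):
    """Return the minimal set of changes to move an edge node from its
    current state to the centrally declared desired state."""
    changes = {}
    for key, value in desired.items():
        if current.get(key) != value:
            changes[key] = value
    removals = [key for key in current if key not in desired]
    return changes, removals

def poll_once(node_state, fetch_desired):
    """One iteration of a pull loop: the node asks for desired state
    (rather than waiting to be pushed to) and applies the diff."""
    desired = fetch_desired()
    changes, removals = reconcile(node_state, desired)
    node_state.update(changes)
    for key in removals:
        del node_state[key]
    return node_state

# fetch_desired stands in for an HTTP call to a config service.
state = {"firmware": "1.0", "log_level": "debug"}
desired = {"firmware": "1.1", "log_level": "info", "telemetry": "on"}
poll_once(state, lambda: desired)
print(state == desired)  # True
```

The pull model scales better for exactly the reason Murphy gives: the central service only has to answer requests, while each node owns the work of converging, so intermittently connected sites simply catch up the next time they poll.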

5G and edge will also breed another familiar issue for CIOs – the occasional gap between what a vendor or provider says it can do and what it can actually do in your organization. Joshi says this is one important area that enterprise leaders can prepare for now, while the underlying technologies advance and mature.

“What will be more important for enterprise IT is to enhance its business understanding of operational technology, as well as be comfortable working with a variety of network equipment providers, cloud vendors, and IT service providers,” Joshi says.

Lock-in could be another familiar challenge for enterprise IT, Joshi says, underlining the need for rigorous evaluation of potential platforms and providers.

“Open source adoption and openness of the value chain, [including] RAN software, towers, base stations, cloud compute, and storage” will be an important consideration, Joshi says, as well as a nose for finding substance amidst hype.

That brings us back to use cases. If you’re unsure about what’s next for 5G and edge in your organization, then start with the potential business applications. That should ultimately guide any further strategic development. Joshi sees growing adoption of remote training using digital twins, remote health consultations, media streaming, and real-time asset monitoring, among other uses.

“Any enabling factors in 5G such as small cells and low latency, strongly align to an edge architecture,” Joshi says. “However, the intention should not be to enable 5G, but to have a suitable business scenario where 5G adoption can enhance impact.”

To learn more, visit Red Hat here.


From new Google and Facebook algorithms to GDPR, every so often a seismic change happens which can catch businesses on the backfoot. The impending loss of third party cookies is potentially one of those changes.

Google announced it would phase out third-party cookies in Chrome by late 2023 in a move dubbed by some as the ‘cookiepocalypse’. It follows other industry moves to address privacy concerns, such as Apple’s Intelligent Tracking Prevention, which stops companies from identifying and profiling their customers using third party cookies. Already, approximately 30% of companies’ web visitors can no longer be profiled. This will jump to around 90% next year when Google Chrome removes support for third party cookies.

The consequences for marketers who aren’t prepared could be profound. Analysis by McKinsey found the publishing industry alone could lose up to £10 billion in advertising revenue as a result of the changes.

A vital tool for business

The third party cookie has had something of a bad reputation in some circles, particularly where privacy is concerned. But cookies improve the user experience and provide an invaluable tool for tracking website visitors and collecting data, which allows companies to better target ads to the right audiences.

A web browser can actually use up to eight types of cookies, although first and third party cookies are the types most people are familiar with. When someone is online, a cookie just stores a small amount of data on their computer. The data could include things such as which sites they visited and pages they viewed during their online session.

While first party cookies only get that data from the site the user visited, a third party one lets other sites access that data. This helps build up customer profiles and tailor ads and offers specifically to people you know will be interested in what you’ve got to offer. For marketers, they’re invaluable for measuring campaign effectiveness and conversion rates as well as personalising content.  
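The first/third party distinction is not a property of the cookie itself but of whose domain set it relative to the page being visited. A minimal sketch using Python’s standard library (the domain names are invented; note that modern browsers additionally require `SameSite=None; Secure` before they will send a cookie in a third-party context at all):

```python
from http.cookies import SimpleCookie

def classify(cookie_domain, page_domain):
    """A cookie is 'first party' when its domain matches the site the
    user is actually visiting, and 'third party' otherwise."""
    cookie_domain = cookie_domain.lstrip(".")
    if page_domain == cookie_domain or page_domain.endswith("." + cookie_domain):
        return "first-party"
    return "third-party"

# Constructing a cookie an ad network could still use cross-site today:
# it must be marked SameSite=None and Secure.
cookie = SimpleCookie()
cookie["uid"] = "abc123"
cookie["uid"]["domain"] = ".adnetwork.example"
cookie["uid"]["samesite"] = "None"
cookie["uid"]["secure"] = True
print(cookie.output())

print(classify(".adnetwork.example", "news.example"))  # third-party
print(classify(".news.example", "shop.news.example"))  # first-party
```

The same `uid` cookie is first party on the ad network’s own site and third party everywhere else, which is why the browser changes, not the cookie mechanism itself, are what cut off cross-site profiling.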

A world after cookies

The void created for marketers and advertisers by the demise of third party cookies can’t be overstated. Business agility will be vital as companies search for alternatives.

The focus going forward will have to be on first party data – information which the customer has provided directly and which, while sometimes harder to get, is by its nature more accurate in helping build a picture of what they want and how they behave. For marketers this means gathering data directly through apps, websites, and surveys rather than tracking users’ devices.

One such solution is offered by Teradata in partnership with Celebrus. It delivers the world’s only true first-party digital identity solution, which is unaffected by browser restrictions and third-party cookie deprecation.

The bottom line is that business is about to lose a gold mine of data. But there are solutions out there to mitigate this loss if action is taken now. For more information on how to close the gap on digital identity management click here.
