Economic instability and uncertainty are the leading causes of technology budget decreases, according to the IDG/Foundry 2022 annual State of the CIO survey. Yet even as budgets tighten, data remains a key factor in business success – especially during economic uncertainty. According to the Harvard Business Review, data-driven companies have better financial performance, are more likely to survive, and are more innovative.[1]

So how do companies find this balance and create a cost-effective data stack that delivers real value to their business? A new survey from Databricks, Fivetran, and Foundry of 400-plus senior IT decision-makers in data analytics/AI roles at large global companies finds that 96% of respondents report negative business effects due to integration challenges. However, many IT and business leaders are discovering that modernizing their data stack overcomes those integration hurdles, providing the basis for a unified and cost-effective data architecture.

Building a performant & cost-effective data stack 

The Databricks, Fivetran, and Foundry report points to four investment priorities for data leaders:

1. Automated data movement. Data pipelines are critical to the modern data infrastructure. They ingest and move data from popular enterprise SaaS applications and from operational and analytic workloads to cloud-based destinations such as data lakehouses. As the volume, variety and velocity of data grow, businesses need fully managed, secure and scalable data pipelines that can automatically adapt as schemas and APIs change while continuously delivering high-quality, fresh data. Modernizing analytic environments with an automated data movement solution reduces operational risk, ensures high performance, and simplifies ongoing management of data integration.

2. A single system of insight. A data lakehouse incorporates integration tools that automate ELT, moving data to a central location in near real time. By combining structured and unstructured data and eliminating separate silos, a single system of insight like the data lakehouse enables data teams to handle all data types and workloads. This unified approach dramatically simplifies the data architecture and combines the best features of a data warehouse and a data lake, improving data management, security, and governance in a single architecture to increase efficiency and innovation. Finally, it supports all major data and AI workloads, making data more accessible for decision-making.

A unified data architecture results in a data-driven organization that gains both BI/analytics and AI/ML insights at speeds comparable to those of a data warehouse, an important differentiator for tomorrow’s winning companies.

3. Designed for AI/ML from the ground up. AI/ML is gaining momentum, with more than 80% of organizations using or exploring the use of AI to stay competitive. “AI remains a foundational investment in digital transformation projects and programs,” says Carl W. Olofson, research vice president with IDC, who predicts worldwide AI spending will exceed $221B by 2025.[2] Despite that commitment, becoming a data-driven company fueled by BI analytics and AI insights is proving to be beyond the reach of many organizations that find themselves stymied by integration and complexity challenges. The data lakehouse solves this by providing a single solution for all major data workloads, from streaming analytics to BI, data science, and AI. It empowers data science and machine learning teams to access, prepare and explore data at scale.

4. Solving the data quality issue. Data quality tools (59%) stand out as the most important technology for modernizing the data stack, according to IT leaders in the survey. Why is data quality so important? Traditionally, business intelligence (BI) systems enabled queries of structured data in data warehouses for insights, while data lakes contained unstructured data retained for AI and machine learning (ML). However, maintaining siloed systems, or attempting to integrate them through complex workarounds, is difficult and costly. In a data lakehouse, metadata layers on top of open file formats increase data quality, while advances in query engines improve speed and performance. This serves both BI analytics and AI/ML workloads, helping assure the accuracy, reliability, relevance, completeness, and consistency of data, as the sketch after this list illustrates.
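To make the pattern concrete, here is a minimal, illustrative sketch of the lakehouse flow described above: raw data landed by an automated ingestion pipeline is validated and published to a single open-format table that serves both BI and AI/ML workloads. It assumes a PySpark environment with Delta Lake already configured; the paths, table names and quality rules are hypothetical examples, not taken from the report.

```python
# Minimal sketch, assuming PySpark with Delta Lake already configured.
# Paths, table names and quality rules below are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("lakehouse-elt-sketch").getOrCreate()

# Load: read raw JSON files landed by an automated ingestion pipeline.
raw = spark.read.json("/landing/crm/accounts/")

# Transform: apply simple data quality rules before publishing to the curated layer.
clean = (
    raw.filter(F.col("account_id").isNotNull())   # completeness check
       .dropDuplicates(["account_id"])            # consistency check
       .withColumn("ingested_at", F.current_timestamp())
)

# Publish: write to an open-format (Delta) table that both BI dashboards
# and ML feature pipelines can query from the same copy of the data.
clean.write.format("delta").mode("append").saveAsTable("curated.accounts")
```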

According to the Databricks, Fivetran, and Foundry report, nearly two-thirds of IT leaders are using a data lakehouse, and more than four out of five say they’re likely to consider implementing one. At a moment when cost pressure is calling into question open-ended investments in data warehouses and data lakes, savvy IT leaders are responding by placing a high priority on modernizing their data stack.

Download the full report to discover exclusive insights from IT leaders into their data pain points, how they plan to address them, and what roles they expect cloud and data lakehouses to play in their data stack modernization.

[1] https://mitsloan.mit.edu/ideas-made-to-matter/why-data-driven-customers-are-future-competitive-strategy

[2] Source: IDC’s Worldwide Artificial Intelligence Spending Guide, February 2022 (V1).


Chris Mills, Head of Customer Success, EMEA at Slack

The roles of the CTO and CIO have grown enormously in recent years, proving fundamental in facilitating the rapid shift from traditional working to hybrid working during the pandemic. But this was no short-term shift; the value of the CTO and CIO continues to rise.

The next challenge on the horizon? Ensuring workers everywhere remain aligned, efficient and productive despite the economic turbulence organisations are buckling in for. Now, the spotlight is on tech leaders to once again steward businesses through another technological revolution—one in which the digital headquarters (HQ) is key.

Technology revolution 2.0

The majority of businesses are now adept at hybrid working, with many establishing policies to meet the needs of the workforce. That does not mean, however, that these setups are free of wrinkles: urgent issues include duplicate tools, bloating costs and unoptimised processes.

The digital HQ solves these challenges by uniting teams, partners, and the tools they use in a single digital space—making how work gets done simpler, more pleasant and more efficient. In fact, teams that use Slack as their digital HQ are 49% more productive.

In the digital HQ, bottomless email inboxes are replaced with Slack channels: a way of organising conversations based on topic, project, or initiative. Information-sharing, meanwhile, is no longer tethered to inflexible meetings; instead, it happens in Slack Huddles or Clips, free-flowing real-time or asynchronous audio and video messages that mean, on average, teams have 36% fewer meetings.

Another area where the digital HQ really shines is its ability to drive productivity by automating tasks, with Slack users launching 1.7 million automated workflows a day. For an example of this in action, businesses should look at Vodafone.

Automating for efficiency

Vodafone first started using the digital HQ as a foundation for modern engineering practices, but now uses it to enhance collaboration worldwide. This has created opportunities for efficiency, in particular for the DevOps team, where release requests were ripe for streamlining.

With the digital HQ, the team is able to simplify release requests and use Slack’s Workflow Builder to automate a complex process. Developers now add the details of a release to a simple form, which is then used to populate a dedicated Slack channel so that the wider team has a real-time view of what’s going on.
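Vodafone’s workflow is built with Slack’s no-code Workflow Builder, but the same pattern can be sketched in code. The snippet below is a hypothetical illustration rather than Vodafone’s implementation: it posts a structured release request into a dedicated channel using Slack’s official Python SDK, with the channel name, function name and form fields invented for the example.

```python
# Hypothetical sketch of posting a release request into a dedicated Slack channel.
# Uses the official slack_sdk Web API client; channel and fields are made up.
import os
from slack_sdk import WebClient

client = WebClient(token=os.environ["SLACK_BOT_TOKEN"])

def announce_release(service: str, version: str, requested_by: str) -> None:
    """Share a release request so the wider team has a real-time view."""
    client.chat_postMessage(
        channel="#release-requests",  # hypothetical channel name
        text=f"Release request: {service} {version}",
        blocks=[
            {
                "type": "section",
                "text": {
                    "type": "mrkdwn",
                    "text": (
                        "*Release request*\n"
                        f"• Service: {service}\n"
                        f"• Version: {version}\n"
                        f"• Requested by: {requested_by}"
                    ),
                },
            }
        ],
    )

announce_release("billing-api", "v2.4.1", "jane.doe")
```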

Through the digital HQ, Vodafone has developed an efficient way of dealing with release requests that can number over 100 a month. They remain productive and focussed on the work, not the admin, while other teams retain visibility over an integral part of the business.


An HQ for challenging times

The pandemic demonstrated that, even in challenging times, productive and efficient ways of working are possible through technology. Vodafone is living proof that our technology evolution didn’t stop there, with the digital HQ providing a new foundation for the future of work.

The macroeconomic challenges we face will surely pass, and something else will undoubtedly follow close behind. Yet the digital HQ is no one-trick pony. By supercharging not just productivity, efficiency and collaboration but resilience too, the digital HQ lets businesses prepare for the future, whatever it looks like.

For more information on how Slack’s Digital HQ can help your business click here.


Government organizations have the longest average technology buying cycle of any sector, with procurement laws injecting complexity into the process, according to a Gartner report.

The report, which is based on a survey of 1,120 executives, including 79 public sector employees, across the US, Canada, France, Germany, the UK, Australia and Singapore, showed that the average buying cycle for government entities is 22 months.

By contrast, at least 48% of all respondents said that their buying cycle for technology averaged around six to seven months.

“Technology acquisition brings challenges to the public sector that do not commonly exist in other industries,” said Dean Lacheca, vice president and analyst at Gartner.   

“Each jurisdiction has its own procurement laws and policies, and within that, each agency or department can have its own interpretation of them. A failure to conform to the rules can have serious consequences, from unwanted publicity to personal risk of prosecution,” Lacheca added.

Other reasons behind the delays include changes in scope, additional research and evaluation, and reaching agreement on budgeting.

Many respondents also said that these delays occur before the beginning of the procurement process, with at least 74% of public sector respondents claiming that developing a business case for purchases takes a long time.

More than 76% said that scope changes requiring additional research and evaluation were another major factor behind delays, Gartner said, adding that 75% of respondents listed reaching agreement on budgeting as a major cause of delays in the buying decision.

“While government buying cycles can be long, it is important to note that these time frames are not set,” said Lacheca.

“Initial planned timelines can be delayed as a result of a combination of both controllable and uncontrollable factors, especially when no external deadlines exist.”

Government procurement teams are large

A typical public sector buying team has 12 participants with varying levels of involvement in the process, Gartner said. Government C-level executives tend to be less involved in the technology buying process than their private sector counterparts, in order to avoid association with the process and the perception of political influence over the outcome.

This also makes government C-level executives less willing to defend the process if challenged by unsuccessful vendors or the media, the research firm said.

Further, the survey shows that public sector buying teams are more likely to be composed of lower-level operational staff, who act as subject matter experts providing recommendations to their C-suite.

At least 68% of public sector respondents claim that another reason for delay is their inability to obtain specific product or implementation requirements details from the provider, Gartner said.

The research firm adds that public sector organizations are significantly more likely to value references from existing clients than non-public sector buyers, partly because public sector organizations are rarely in direct competition and often share common challenges.
