Veeam expert predicts the 2020 technology landscape

2019 has been another year of rapid progress and we continue to see the successful Digital Transformation of many enterprises, shaping the future of technology. Dave Russell, Vice President of Enterprise Strategy at Veeam, discusses some of the key technology trends that businesses will look to take advantage of and prepare for in the year ahead.

Throughout 2019, technology has continued to have a transformative impact on businesses and communities. From the first deployments of 5G to businesses getting to grips with how they use Artificial Intelligence (AI), it’s been another year of rapid progress.

From an IT perspective, we have seen two major trends that will continue into 2020. The first is that on-premises and public cloud will increasingly become equal citizens. Cloud is becoming the new normal model of deployment, with 85% of businesses self-identifying as predominantly hybrid-cloud or multi-cloud today. The second, closely related, is the growing weight of cybersecurity and data privacy, which remain the top cloud concerns of IT decision makers. In 2020, cyberthreats will increase rather than diminish, so businesses must ensure that 100% of their business-critical data can be recovered.

Here are some of the key technology trends that businesses will look to take advantage of and prepare for in the year ahead.

Container adoption will become more mainstream

In 2020, container adoption will lead to faster software production through more robust DevOps capabilities, and Kubernetes will consolidate its status as the de facto container orchestration platform. The popularity of container adoption, or ‘containerisation’, is driven by two things: speed and ease. Containers are lightweight, standalone packages that isolate an application from the underlying operating system, bundling microservices together with their dependencies and configuration. This makes it faster and easier to develop, ship and deploy services. The trend towards multi-cloud means businesses need data to be portable across various clouds – especially those of the major providers: AWS, Microsoft Azure and Google Cloud. 451 Research projects the market size of application container technologies to reach US$4.3 billion by 2022, and in 2020 more businesses will view containers as a fundamental part of their IT strategy.
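
To make the packaging and orchestration idea concrete, here is a minimal sketch that deploys a containerised microservice with the official Kubernetes Python client. The service name, image, labels and namespace are hypothetical placeholders used purely for illustration; they are not drawn from Veeam or the report.

```python
# Minimal sketch: deploying a packaged microservice to Kubernetes via the
# official Python client. Names, image and namespace are hypothetical.
from kubernetes import client, config

def deploy_microservice() -> None:
    config.load_kube_config()  # reads the local kubeconfig (e.g. ~/.kube/config)
    apps_v1 = client.AppsV1Api()

    container = client.V1Container(
        name="orders-service",                      # hypothetical service name
        image="registry.example.com/orders:1.0.0",  # image bundles the app and its dependencies
        ports=[client.V1ContainerPort(container_port=8080)],
    )

    deployment = client.V1Deployment(
        api_version="apps/v1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name="orders-service"),
        spec=client.V1DeploymentSpec(
            replicas=3,  # Kubernetes keeps three identical copies running
            selector=client.V1LabelSelector(match_labels={"app": "orders"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "orders"}),
                spec=client.V1PodSpec(containers=[container]),
            ),
        ),
    )

    apps_v1.create_namespaced_deployment(namespace="default", body=deployment)

if __name__ == "__main__":
    deploy_microservice()
```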

Cloud data management will increase data mobility and portability

Businesses will look to Cloud Data Management (CDM) to guarantee the availability of data across all storage environments in 2020. Data needs to be fluid in the hybrid and multi-cloud landscape, and CDM’s capacity to increase data mobility and portability is the reason it has become an industry in and of itself. The 2019 Veeam Cloud Data Management report revealed that organisations pledged to spend an average of US$41 million on deploying cloud data management technologies this year. To meet changing customer expectations, businesses are constantly looking for new methods of making data more portable within their organisation. The vision of ‘your data, when you need it, where you need it’ can only be achieved through a robust CDM strategy, so its importance will only grow over the course of next year.
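
The portability idea can be illustrated with a small sketch that copies an object from AWS S3 to Google Cloud Storage using the standard boto3 and google-cloud-storage SDKs. The bucket and object names are hypothetical, and this shows the concept only, not how any particular CDM product works.

```python
# Illustrative sketch of data portability across clouds: copy one object from
# an AWS S3 bucket to a Google Cloud Storage bucket. Bucket and object names
# are hypothetical; real CDM tooling adds scheduling, cataloguing and verification.
import boto3
from google.cloud import storage

def copy_s3_object_to_gcs(s3_bucket: str, key: str, gcs_bucket: str) -> None:
    # Download the object from S3 into memory (fine for small objects;
    # large objects would be streamed in chunks instead).
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=s3_bucket, Key=key)["Body"].read()

    # Upload the same bytes to Google Cloud Storage under the same key.
    gcs = storage.Client()
    gcs.bucket(gcs_bucket).blob(key).upload_from_string(body)

if __name__ == "__main__":
    # Hypothetical names used purely for illustration.
    copy_s3_object_to_gcs("reports-aws", "2019/q4-sales.parquet", "reports-gcp")
```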

Backup success and speed give way to restore success and speed

Data availability Service Level Agreements (SLAs) and expectations will rise in the next 12 months, while the threshold for downtime, or any discontinuity of service, will continue to decrease. Consequently, the emphasis of the backup and recovery process has shifted towards the recovery stage. Backup used to be challenging, labour-intensive and cost-intensive. Faster networks and backup target devices, as well as improved data capture and automation capabilities, have accelerated backup. According to our 2019 Cloud Data Management report, almost one-third (29%) of businesses now continuously back up and replicate high-priority applications. The main concern for businesses now is that 100% of their data is recoverable and that a full recovery is possible within minutes. As well as providing peace of mind when it comes to maintaining data availability, a full complement of backed-up data can be used for research, development and testing purposes. This leveraged data helps the business make the most informed decisions on Digital Transformation and business acceleration strategies.
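
What shifting the emphasis to restore looks like in practice can be sketched as a routine that restores a backup to a scratch location, verifies a checksum against the source, and checks the elapsed time against a recovery-time objective. The paths and the RTO value below are hypothetical, and the sketch is a generic illustration rather than a description of any Veeam feature.

```python
# Illustrative restore verification: restore a backup copy to a scratch
# location, confirm its checksum matches the original, and check the restore
# finished within a target recovery time. Paths and RTO are hypothetical.
import hashlib
import shutil
import time
from pathlib import Path

RTO_SECONDS = 15 * 60  # hypothetical target: full restore within 15 minutes

def sha256_of(path: Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_restore(backup_copy: Path, source_checksum: str, scratch_dir: Path) -> bool:
    start = time.monotonic()
    restored = Path(shutil.copy2(backup_copy, scratch_dir))  # stand-in for a real restore job
    elapsed = time.monotonic() - start

    checksum_ok = sha256_of(restored) == source_checksum
    within_rto = elapsed <= RTO_SECONDS
    return checksum_ok and within_rto

if __name__ == "__main__":
    # Hypothetical locations and checksum, used purely for illustration.
    ok = verify_restore(Path("/backups/orders.db.bak"),
                        "expected-sha256-here",
                        Path("/tmp/restore-test"))
    print("restore verified" if ok else "restore check failed")
```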

Everything is becoming software-defined

Businesses will continue to pick and choose the storage technologies and hardware that work best for their organisation, but data centre management will become even more about software. Manual provisioning of IT infrastructure is fast becoming a thing of the past, and Infrastructure as Code (IaC) will continue its proliferation into mainstream consciousness. By allowing businesses to create a blueprint of what infrastructure should do and then deploy it across all storage environments and locations, IaC reduces the time and cost of provisioning infrastructure across multiple sites. Software-defined approaches such as IaC and cloud-native – a strategy which natively utilises services and infrastructure from cloud computing providers – are not all about cost, though. Automating replication procedures and leveraging the public cloud offers precision, agility and scalability – enabling organisations to deploy applications with speed and ease. With over three-quarters (77%) of organisations using Software-as-a-Service (SaaS), a software-defined approach to data management is now relevant to the vast majority of businesses.
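
The blueprint idea behind IaC can be shown in a stripped-down sketch: infrastructure is described once as declarative data, and the same description is applied to every environment. The resource names and environments below are hypothetical, and production tools such as Terraform or Pulumi layer state tracking, drift detection and provider plugins on top of this basic pattern.

```python
# Stripped-down illustration of the IaC "blueprint" idea: describe the desired
# infrastructure once as data, then apply it to every environment.
# Names and environments are hypothetical placeholders.
from dataclasses import dataclass

@dataclass(frozen=True)
class StorageBucket:
    name: str
    replicas: int
    encrypted: bool

# One declarative blueprint, reused everywhere.
BLUEPRINT = [
    StorageBucket(name="app-backups", replicas=3, encrypted=True),
    StorageBucket(name="audit-logs", replicas=2, encrypted=True),
]

ENVIRONMENTS = ["on-prem-dc1", "aws-eu-west-1", "azure-uk-south"]

def provision(environment: str, resource: StorageBucket) -> None:
    # A real tool would call the relevant provider API here; printing keeps
    # the sketch self-contained and runnable.
    print(f"[{environment}] ensure bucket {resource.name} "
          f"(replicas={resource.replicas}, encrypted={resource.encrypted})")

if __name__ == "__main__":
    for env in ENVIRONMENTS:
        for resource in BLUEPRINT:
            provision(env, resource)
```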

Organisations will replace, not refresh, when it comes to backup solutions

In 2020, the trend towards replacing backup technologies rather than augmenting them will gather pace. Businesses will prioritise simplicity, flexibility and reliability in their business continuity solutions as the need to accelerate technology deployments becomes even more critical. In 2019, organisations said they had experienced an average of five unplanned outages in the last 12 months. Concerns over the ability of legacy vendors to guarantee data availability are driving businesses towards total replacement of backup and recovery solutions, rather than adding further backup tools to run alongside the legacy solution(s). The drivers away from patching and updating solutions towards replacing them completely include maintenance costs, a lack of virtualisation and cloud capabilities, and shortcomings in speed of data access and ease of management. Starting afresh gives businesses peace of mind that they have the right solution to meet user demands at all times.

All applications will become mission-critical

The number of applications that businesses classify as mission-critical will rise during 2020 – paving the way to a landscape in which every app is considered a high priority. Previously, organisations have been prepared to distinguish between mission-critical and non-mission-critical apps. As businesses become completely reliant on their digital infrastructure, making this distinction becomes very difficult. The 2019 Veeam Cloud Data Management report revealed that, on average, IT decision makers say their business can tolerate a maximum of two hours’ downtime for mission-critical apps. But which apps can any enterprise realistically afford to have unavailable for this amount of time? Application downtime costs organisations a total of US$20.1 million globally in lost revenue and productivity each year, with lost data from mission-critical apps costing an average of US$102,450 per hour. The truth is that every app is critical.
