Software, cloud to define IT in 2019

Damian Wilk, senior regional director for the Middle East at Veritas Technologies, on how businesses can navigate the new IT architecture

Wilk: "The need for simplification and autonomy is growing, so backup and backup systems are becoming easier to manage."
Wilk: "The need for simplification and autonomy is growing, so backup and backup systems are becoming easier to manage."

Hardware is losing relevance as companies roll out software-defined architectures and concepts on top of off-the-shelf hardware, a major trend that started several years ago and will continue to intensify in 2019. Additionally, more and more organisations are guided by a cloud-first approach, setting up all new processes and applications from the outset as virtual workloads. However, this raises questions and creates consequences elsewhere that companies must address in the coming year.

Companies will want to collect, analyse and exchange more data

Whether big data, IoT or edge computing – data is the lowest common denominator in all the big business trends. Companies of all sizes are, step by step, covering their manufacturing, development, marketing and sales processes, and even their products, with a digital layer across the entire life cycle to better understand what the customer wants. The mere concern that competitors may be actively seeking such insights may drive smaller companies to start their own digitisation projects, as nobody wants to be left behind. Every sensor, every new workload and every big data analysis will generate new and, above all, even more data that has to be stored, evaluated, linked, backed up, maintained and archived according to legal requirements. The amount of data held by our customers already grows by 49% per year on average, as our studies show, a figure that will only increase in 2019.

Software-defined architectures become more dominant

Companies are looking for quick and easy ways to create storage for their data, and most of them find their answer in the cloud, where several terabytes can be booked with a mouse click. According to a recent UAE Databerg survey conducted in October this year, cloud is set to change the data management environment in the UAE dramatically over the next two years. While 38% of organisational data is currently stored in the cloud, this is expected to increase to 55% within the next 12 months.

One reason: cloud infrastructure providers can build capacity fast and scale it to millions of customers thanks to an extremely high degree of virtualisation and software-defined architectures (SDA). The resources of standardised hardware are exposed through software, securely and automatically configured, and assigned to customers and their workloads. If new capacity is needed, the nodes in the software-defined architecture can simply be extended with additional physical machines. The SDA and its software interfaces then integrate them and intelligently share out the new resources.
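To make the scale-out idea more tangible, here is a minimal conceptual sketch in Python, assuming a hypothetical StoragePool and Node abstraction rather than any vendor's actual API: adding a physical machine simply contributes its capacity to the shared, software-managed pool.

# Minimal conceptual sketch of scale-out in a software-defined storage pool.
# All names (Node, StoragePool, add_node) are hypothetical illustrations,
# not any vendor's actual API.

from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    capacity_tb: float          # raw capacity contributed by this physical machine

@dataclass
class StoragePool:
    nodes: list = field(default_factory=list)

    def add_node(self, node: Node) -> None:
        """Integrate a new physical machine; the software layer simply
        absorbs its capacity into the shared pool."""
        self.nodes.append(node)

    @property
    def total_capacity_tb(self) -> float:
        return sum(n.capacity_tb for n in self.nodes)

pool = StoragePool([Node("node-1", 100), Node("node-2", 100)])
pool.add_node(Node("node-3", 100))   # scale out by adding hardware, not by reconfiguring it
print(pool.total_capacity_tb)        # 300.0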

This scale-out model will become increasingly prevalent among large companies that need to scale their private and public clouds quickly and easily. The underlying hardware is then dominated by standard components and hyper-converged appliances, which bring their own network, storage and virtualisation stack. Questions about hardware become less relevant in software-defined models, since standard components often suffice.

New workloads in heavily mixed architectures emerge

While the infrastructure and operations of SDA are becoming increasingly automated and unified, the variety of applications and workloads will continue to grow in 2019. Big data is arriving in the mainstream, and small and medium-sized companies are using the corresponding services of the large public cloud providers for initial analyses. According to a Teradata survey of 260 global corporations, 80% of respondents invested in artificial intelligence and machine learning last year to extract new knowledge from their data. The revenue forecasts for platforms such as MongoDB, Cassandra, OpenSQL, Hadoop or SAP HANA clearly show that SMEs are also becoming increasingly active in this area.

These platforms generate their own workloads and bring their own architectures. They can usually interact closely with the services of the cloud providers, because they share common interfaces or because the cloud service providers themselves use the platforms as the foundation of their own business.

Ideally, organisations will be able to choose the right service from the cloud for each task, like a virtual toolbox. The workloads will connect multiple different clouds, so that a company will operate a colourful mix of hybrid and public cloud as well as local resources.

Lack of expert talent

The EU Commission expects that by 2020 around ten million EU citizens will work as so-called data workers. However, it takes a great deal of IT expertise to develop and set up the necessary architectures and application concepts. In 2019 and beyond, companies can be expected to compete ever harder for this talent. Two of the top four IT certifications already relate to cloud development and cloud solution architecture. A lack of knowledge and of skilled employees is seen as one of the hurdles to digitisation. The shortage of experts has led to another trend: IT systems must become radically simpler to operate. After all, IT departments can hardly expect growing budgets for the digitisation process. At the same time, they must keep their existing infrastructure highly available. With traditional operating models and without the appropriate subject matter experts, this balancing act cannot be achieved.

The need for simplification and autonomy

The need for simplification and autonomy is growing, so backups and backup systems are becoming easier to manage and increasingly autonomous. Aligning them with new workloads and software-defined architectures will be the deciding factor in the years to come if existing teams are to be relieved. Like other internal IT tasks, backup should, ideally like virtualisation, completely abstract away the physical world and hide it from the user. Automation is also indispensable, both during initial implementation and during operation. This requires very complex algorithms that guide the backup software and hardware through initial configuration using machine learning and artificial intelligence as soon as the Infrastructure-as-Code parameters are passed to them.
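As a purely illustrative sketch of that idea, assuming a made-up workload_spec schema and a derive_backup_policy helper rather than any Veritas product syntax, declared Infrastructure-as-Code parameters could drive backup configuration roughly like this:

# Purely illustrative: hypothetical Infrastructure-as-Code parameters driving
# automatic backup configuration. Not any vendor's actual schema or API.

workload_spec = {
    "name": "orders-db",
    "type": "database",          # hint the backup layer can use for sensible defaults
    "size_gb": 500,
    "change_rate_pct": 20,       # rough daily change rate
    "rpo_hours": 4,              # recovery point objective declared as code
}

def derive_backup_policy(spec: dict) -> dict:
    """Derive a backup policy from declared workload parameters instead of
    asking an administrator to configure schedules by hand."""
    frequency_hours = min(spec["rpo_hours"], 24)
    incremental = spec["change_rate_pct"] < 50     # full backups only for highly volatile data
    return {
        "workload": spec["name"],
        "frequency_hours": frequency_hours,
        "method": "incremental" if incremental else "full",
        "retention_days": 30 if spec["type"] == "database" else 14,
    }

print(derive_backup_policy(workload_spec))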

Backup systems need to scale both horizontally and vertically, as data volumes continue to grow massively and spread geographically across ever more infrastructures. Federated backup systems need to talk to each other to learn about their current state and, at the same time, use artificial intelligence to analyse trends, identify future tasks, peak loads and bottlenecks, and respond accordingly. Thanks to artificial intelligence, the physical backup infrastructure becomes a smart and dynamic network that, like a good team, takes into account the strengths and weaknesses of each member and allows them to interact in a dynamic and agile way.
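As a deliberately simple stand-in for this kind of trend analysis (a plain least-squares projection rather than real machine learning, with all figures invented), forecasting next week's backup volume and flagging a bottleneck might look like this:

# Deliberately simple stand-in for the trend analysis described above:
# project future daily backup volume with a least-squares line.
# All figures are invented for illustration only.

daily_backup_tb = [4.0, 4.2, 4.1, 4.5, 4.8, 5.0, 5.3]   # last seven days

n = len(daily_backup_tb)
xs = range(n)
x_mean = sum(xs) / n
y_mean = sum(daily_backup_tb) / n
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, daily_backup_tb)) / \
        sum((x - x_mean) ** 2 for x in xs)
intercept = y_mean - slope * x_mean

forecast_day = n + 7                       # one week ahead
expected_tb = intercept + slope * forecast_day
if expected_tb > 6.0:                      # hypothetical capacity threshold
    print(f"Expected load {expected_tb:.1f} TB exceeds the assumed threshold; consider scaling out")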

All of this has to happen as autonomously as possible, without the need for human intervention. In addition, the backup concept should be able to redistribute responsibility for data, workloads and their recovery across multiple shoulders. For example, big data team members may want to control their own virtual resources and roll back to earlier versions without relying on backup teams. This should take the big data teams only a few clicks and require no backup training, so they can concentrate on their core tasks instead of wasting time on ancillary ones. Veritas is working on such automation so that customers can reliably recover their data tomorrow with extremely little effort.
