Nutanix: “In the Age of Hybrid Cloud and Multicloud, Portability is the Key”.

We interviewed Alejandro Solana, technical director of Nutanix Iberia, to review the technological landscape and the latest innovations the software manufacturer is launching on the market.

Let’s take a look back to 2009. Nutanix was founded that year by three ex-Google engineers (Dheeraj Pandey, Mohit Aron and Ajeet Singh), who left the search engine giant to realise an idea that had not been taken up by the company.

Shortly afterwards, and thanks to a significant financial injection from various venture capital investors, they brought to market appliances intended to facilitate and accelerate hardware and software deployments for IT departments. Nutanix was probably the company that invented the concept of hyperconvergence, an architecture that allows compute, storage and networking hardware to scale on demand, without the tedious configuration and testing traditionally required before bringing it into production.

Deployment times for new resources fell from months to days in many cases, which is why hyperconvergence became a benchmark architecture in the technology sector.

Time has proven Nutanix right: today, all major vendors offer a similar approach to meet market needs.

Today, the great challenge for Nutanix lies in bringing this scalability to hybrid and multicloud environments, where what really matters is data, applications and operations, along with their portability between platforms, regardless of where processes are executed.

In the following interview, Alejandro Solana, technical director of Nutanix Iberia and an old acquaintance of Silicon from his previous role at VMware, talks about all this and more:

-Innovation in the technology sector has accelerated exponentially in recent years. Why do you think this is, and how important is the cloud computing model in this acceleration?

In this time we have realised that what is important for any organisation is the applications, the data and how both are operated. Everything that has happened in the ICT market over the last few years is related to one of these three areas.

At the application level, we have been looking for architectures that facilitate horizontal scaling. We have seen traditional client/server, Java, the web, SOA, microservices, containers and, finally, serverless. We have reached the point where application logic has become a function.

From all this has come the consumerisation of technology. The cloud means industrialising everything that is typically built in an IT environment: infrastructure, platform, applications… to offer it as a service that facilitates and simplifies all processes.

Customers are now looking for application architectures that break dependencies with the traditional IT world.

A similar thing is happening with data. Years ago we had large databases and huge SAN (Storage Area Network) systems that scaled vertically, but now we are seeing the opposite trend with Big Data, data lakes, Artificial Intelligence, etc., all connected with algorithms that make the most of the information and the data lifecycle.

So in these two big areas of applications and data we have moved towards a horizontal scaling model, a model that seeks to reduce dependence on a particular technology or a particular manufacturer.

“We should not be afraid of change, we have to adapt to it and see the opportunities it creates”.

It is true that the large hyperscalers try to tie customers to their own technologies and APIs, but customers do not want to repeat the dependencies of the past with architectures such as the mainframe, UNIX or Windows. Instead, companies are looking to develop and structure information to be as standard as possible and to facilitate portability when needed. In the era of hybrid cloud and multicloud, portability is key.

Finally, the third key line is related to operations. IT departments have had to deal with installations, maintenance, upgrades, backups, etc., even having to do it on weekends so as not to disrupt services. With the cloud, this is all a thing of the past. The cloud has shown us that, if you are aware of the tasks you have to perform to maintain, update and manage the lifecycle of a service, you can carry them out with increasingly sophisticated levels of automation.

-To what extent has automation changed the rules of the game in the industry?

Automation is the next big wave. So far we have automated infrastructure by encapsulating it, first in virtual machines and then in containers, but now the same is being done with operations, which can be encapsulated to be deployed and consumed via APIs.

-Of all the technological innovations you have experienced over the years, which one would you consider the most disruptive?

There have been many innovations, but I would approach disruption from another point of view. Organisations face a challenge the pandemic has highlighted: the ability to respond and adapt to a new context in an agile way, so they can react quickly to any circumstance. This has been a major challenge not only at the technological level but also at the organisational level. Hierarchical, silo-based organisational models, where people don't talk to each other, are becoming a thing of the past, and we are moving towards a more collaborative model, where departments can communicate more easily and share existing resources to make the most of them.

-Tell me about the balance between hardware and software on the data centre side. This is one of the things that has changed the most in the last decade, with software becoming more and more relevant in the choice of one solution or another. To what extent is this the case?

We are heading towards a model that works like an assembly line. In previous technology waves, the type of application conditioned the hardware, but the current model seeks to avoid dependencies in any of the layers. It's like having a switch you can flip to consume what you need when you need it, tied to the application and data architecture running on top. The hardware is still going to be there, but what organisations are looking for now is capacity (compute, memory, storage, network) that can be consumed on an as-needed basis and can scale quickly to meet application and data requirements. Hardware manufacturers, hyperscale providers such as AWS, Microsoft Azure and Google Cloud, and private cloud providers alike are building their infrastructure with architectures that deliver it as a service.

-What is your assessment of 2022? What has the year just ended been like for Nutanix?

We are at a very interesting time because the pandemic has accentuated the digitalisation needs of organisations. With the hierarchical model I mentioned earlier, the capacity to respond was practically non-existent. Hence, a lot of projects were launched to adapt organisations to the future. One of the processes in this adaptation was digitalisation, at the level of business processes as well as infrastructure and applications.

And that is precisely what Nutanix provides: industrialising and digitising infrastructure management regardless of location and in record time. Deployment processes that used to take months can be put into production in a matter of days with our platform. These years have been a turning point for Nutanix because we have entered into very important projects and many doors have opened.

Alejandro Solana, technical director of Nutanix Iberia

In the last two years we have tripled our business volume and the trend is significantly upward, with a quarterly growth forecast of close to 20 percent, led by the increase in subscriptions and the volume of contracted projects, which is also increasing. In other words, we are getting bigger and more complex contracts.

One detail I like to emphasise here is that our customers are investing more and more in our platform. For example, the second contract with us is often three or four times the investment made in the first one. When the customer realises the ease of the learning curve and the problems that are solved with our platform, they usually repeat and expand capacity.

-I understand you are referring to global figures. How is Nutanix's business going in Spain?

Although the Spanish market is a bit conservative when it comes to innovation, the truth is that we are currently moving forward with greater agility than in other countries in our area.

On the other hand, the contracts we have been closing in previous stages were more aimed at medium-sized companies, but this is something that, as I said, is changing in recent quarters because we are reaching larger companies such as those in the IBEX 35.

-To what extent are your partners important in resolving this capillarity when it comes to reaching small and medium-sized companies?

They are very important. In fact, this area is being strengthened. Last year we revamped our channel programme to facilitate recruitment and create new specialisations that were not considered until now. Until a year ago, partners were talking about “hyperconvergence”, but this concept has evolved to “hybrid cloud infrastructure”.

For example, a more specialised ecosystem is developing in areas such as VDI (Virtual Desktop Infrastructure) with Red Hat Linux.

It is also notable that the channel programme is moving towards the SaaS model: Nutanix as a Service. This is a portal where partners can access resources, content, training, education, marketing tools, etc. on a self-service basis.

-All this is closely related to the Centre of Excellence that Nutanix has opened in Barcelona, isn’t it?

Yes, we opened this centre in 2022 to develop the channel programme and provide greater support to customers and partners. This opening partly explains the growth of Nutanix in Iberia, with 77% more professionals on staff.

It is a centre that is boosting brand visibility and the expansion we are making in the market. Spain has become a spearhead in Europe as this centre supports the entire EMEA region.

-An expansion that, in a way, puts hyperconvergence on the back burner because you are addressing and offering more and more technologies as part of your platform. How has the company evolved in the last four years since you joined?

Indeed, these four years seem much longer to me precisely because of how much we have changed in such a short period of time. The current company has nothing to do with the one I knew when the transition from hardware to software was taking place [Nutanix started out marketing appliances and shortly afterwards focused on software development]. In the last two years we have developed areas such as the subscription model and have expanded to the hyperscalers and other providers such as OVHcloud, Equinix, etc.

Even the message I give to companies has changed completely because I talk about many more functionalities and services, which, on a personal level, is enriching.

-What are the company’s expectations for 2023?

In 2023 we will continue along the same line, integrating the new launches and services being developed, such as the 'Bare Metal as a Service' model through OVHcloud and Equinix that I mentioned. In a context where hardware supply chains are suffering, it is important to be able to offer this type of agreement, in which hardware is also consumed as a service.

We also want to focus on the agreement closed with Microsoft Azure a month ago, so that customers of this hyperscaler can run the Nutanix platform there, especially to develop a multicloud strategy.

-That's probably another big change in the Nutanix strategy, moving from private cloud environments to multicloud environments, but with a unified experience. What can you tell me about that?

Virtually all organisations are considering the multi-cloud model. Our approach is to provide a consistent, standardised platform, with the same experience regardless of the location of data, applications and operations. In practice, customers can easily enable or disable workloads based on their needs no matter where they are located, be it public or private cloud. All under the same dashboards, in an automated and integrated way.

Nutanix addresses this model. In fact, at the last Nutanix .NEXT conference, Nutanix Portal was announced to further facilitate the integration of different services and workloads.

But that's not all: it will also allow the incorporation of services that monitor the security of the environment and suggest improvements to systems, that monitor costs so workloads can be moved to more economical locations, and so on.

A differentiator at this point is that this movement between clouds is complete: through 360º observability, policies, security and information flows can be moved, along with the applications and data themselves.

-What is the status of hyperconvergence? The concept is not talked about as much anymore, yet it seems more and more present.

The basis of hyperconvergence is already defined, and the objective now is to reinforce the "composable" concept so that all the pieces of the puzzle in these multicloud environments fit together, independently of the provider and transparently for the companies consuming these services. New needs and new innovations will arise, such as those based on artificial intelligence, that will have to fit in as well, so our goal is to simplify the environment so all these pieces fit together easily. Something similar happens at the management and administration level, as I mentioned before: security processes, data flows, costs, etc. also have to be "composable".

-Yes, there is more and more talk about automation and you mentioned it earlier. Do you think it is the next big thing in the industry?

Yes, it is one of the major developments, because we are moving towards a context of hyper-automation. Until recently, automation was applied to manual tasks performed by humans, but the trend is for it to be increasingly present inside the solutions themselves, for example through APIs, to perform all kinds of processes. These components will increasingly come "out of the box", included as part of the solution, giving way to hyper-automation.

These innovations aim to industrialise and standardise deployments more effectively and quickly.

We are heading towards a world where all manual tasks, regardless of their type, will be automated and available as part of the service, while the human factor will be focused on bringing ideas, vision and value to the business.

This is not necessarily a negative development, quite the opposite: as hyper-automation spreads, it is also creating new skilled profiles and roles that will need to be filled. We should not be afraid of change; we have to adapt to it and see the opportunities it generates.