By 31 March 2026 | Categories: interviews


Much has changed in 2026, but some fundamentals remain constant. Chief among them are the growing importance of data and the accelerating role of AI in shaping how businesses operate. 

But what is becoming more apparent is that AI is not all promise and potential. Even as it is accelerating innovation across every industry, it is also exposing deep fault lines in how organisations manage, govern and protect their data.

Recently, I sat down with Mena Migally, Regional Vice President for Eastern Europe, the Middle East and Africa at Veeam to talk about data, security, and enterprise business today, and asked him what is changing in the industry.

One of these changes is that the era of siloed backup and reactive security is over. Instead, Migally stressed that IT's future belongs to unified data resilience that is built for AI speed, regulatory complexity and a threat landscape that is evolving just as fast as the technology itself.

He explained that when organisations first embraced cloud computing, adoption unfolded gradually. Multi‑cloud strategies, governance frameworks and operational maturity took years to develop. And I still remember well the security misgivings companies had about putting their data off-prem. AI, by contrast, has arrived at a very different pace.

“Today, organisations can spin up a full AI environment in a matter of days, and the pressure is coming from everywhere, including from customers, from competitors and from inside the business itself,” he explained.

That speed has created a paradox. AI promises unprecedented operational insight and efficiency, but it is also magnifying long‑standing weaknesses in how enterprises handle data. This is especially true for unstructured data that now sprawls across clouds, data centres, applications and backup environments.

According to Veeam, this tension is driving a fundamental shift in how organisations think about data protection, governance and recovery, marking a shift from traditional backup towards holistic data resilience.

The elephant in the room

Unstructured data has been growing at scale for years, long before generative AI entered the mainstream. Documents, emails, sensor data, application logs and media files accumulate across environments, often with little oversight.

“What’s changed is that AI feeds on that same unstructured data,” he explained.

As organisations race to extract value from AI projects, new questions surface quickly: Who has access to sensitive information? How long has data been sitting unused? Where is it located, and does it comply with local sovereignty laws?

Another question worth asking is whether organisations are governing only their primary production data, or also the secondary data stored in backups and archives, where information can remain untouched for years.

“These challenges didn’t disappear when AI arrived,” Migally says. “They multiplied.”

The result is what Veeam describes as data sprawl: in a nutshell, fragmented, poorly governed data estates that increase operational cost, compliance risk and exposure to cyber threats.

The latter, to my mind, is the most worrying, given the rapid increase in cybercrime and the fact that cybercriminals are now using AI to extend their reach too.

Resilience is key

Historically, data protection focused on backup and recovery: ensuring that systems could be restored after failure. In today’s environment, that definition is no longer sufficient.

Migally explained that while data protection is reactive, resilience is proactive.

But true data resilience, as Veeam defines it, is about more than restoring files. It is about bouncing forward, whereby an organisation is able to resume operations rapidly, even in the face of ransomware, system compromise or regulatory disruption.

Exacerbating matters, downtime tolerance has shrunk dramatically. One point that comes up frequently in interviews and conversations with businesses is that today's customers and consumers are different.

They expect faster service and have far less tolerance for downtime or system failures, with even milliseconds of interruption triggering customer complaints and having a negative impact on the business.

At the same time, attackers are becoming more sophisticated, using automation and AI‑driven techniques to scale ransomware and cybercrime with fewer resources.

“The attackers are getting smarter, and they are moving faster,” notes Migally. That reality has pushed resilience from an IT concern into a core business priority.

Breaking down the silos

One of the central themes emerging from Veeam’s platform strategy is the need to eliminate silos between primary and secondary data, between cloud and on‑premises systems, and between backup, security and governance tools.

A unified data platform, Migally argues, changes the equation entirely.

“When you have complete visibility, you can make decisions faster,” he says. “You know who has access, both human users and AI systems alike. You know where the data sits, how long it’s been there, and what it’s costing you.”

This “single pane of glass” approach allows organisations to respond more effectively to regulatory change, optimise operational costs and reduce risk — without waiting for audits or reacting after incidents occur.

It also addresses a critical reality of AI adoption: speed. “Everyone is operating at AI speed now, but without visibility, there’s no way to control it,” he points out.

Embedding AI into operations, safely

AI is not just something Veeam’s customers are deploying; it is also embedded within Veeam’s own platforms.

According to Migally, AI plays a key role in operational insight, automated risk detection and policy enforcement. Rather than presenting raw data, the platform provides summarised, human‑readable guidance that helps teams make faster decisions — such as identifying data that no longer needs to be retained or backed up multiple times.

Automation also brings some tantalising benefits. Self‑healing and automated policy actions allow organisations to enforce retention or deletion policies consistently, lowering the chance that sensitive data is stored unnecessarily or exposed inadvertently.
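To make the idea concrete, here is a minimal, purely illustrative Python sketch of how a retention policy might be applied automatically across a data inventory. This is not Veeam's API or product behaviour; the inventory, field names and three-year window are all hypothetical assumptions for the sake of the example.

```python
from datetime import date, timedelta

# Hypothetical inventory of data objects: name, location, last-accessed date.
INVENTORY = [
    {"name": "q1-report.docx", "location": "cloud-eu", "last_accessed": date(2023, 2, 1)},
    {"name": "sensor-dump.log", "location": "on-prem", "last_accessed": date(2020, 6, 15)},
    {"name": "customer-db.bak", "location": "cloud-us", "last_accessed": date(2025, 11, 3)},
]

# Example retention window: anything untouched for three years is a candidate
# for deletion (a real policy would vary by data class and jurisdiction).
RETENTION = timedelta(days=3 * 365)

def flag_for_deletion(inventory, today=None):
    """Return the names of items untouched for longer than the retention window."""
    today = today or date.today()
    return [item["name"] for item in inventory
            if today - item["last_accessed"] > RETENTION]

print(flag_for_deletion(INVENTORY, today=date(2026, 3, 31)))
# → ['q1-report.docx', 'sensor-dump.log']
```

The point of automating even a rule this simple is consistency: the policy runs the same way every time, rather than depending on someone remembering to audit old repositories.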

“Automation reduces operational cost, but more importantly, it reduces operational risk,” Migally says.

Crucially, Veeam does not position AI as a standalone solution. Rather, he stressed that integration across platforms is essential.

“AI is too big to be tackled by any one player. It must be part of a broader operational ecosystem,” he contends.

The heart of data sovereignty

As data sovereignty laws tighten across regions and industries, organisations face increasing pressure to understand not just where data is stored but where it can be recovered. Migally asserts that in fact recoverability is the most critical aspect of sovereignty.

Backing up data in a compliant location is only part of the equation. In the event of an incident, organisations must be able to recover quickly into an environment that also meets regulatory requirements, whether that is a specific cloud region or an alternative infrastructure.

Time becomes the deciding factor. During recovery, the luxury of extended planning disappears. He notes that organisations that have already mapped their data, access controls and recovery paths are far better positioned to respond.

Moving beyond checkbox compliance

Another bugaboo that haunts businesses is compliance. Part of the reason why compliance can be such a headache is because it has traditionally been reactive: a new regulation emerges, an audit follows, gaps are identified and remediated.

Veeam’s unified approach aims to reverse that cycle.

“If you already understand where your data is, who has access to it and how it flows, applying a new regulation becomes much easier,” Migally explains.

Instead of scrambling to assemble reports, organisations can apply compliance rules directly to an existing data map, immediately identifying what is compliant and what is not.

This shifts compliance from a periodic exercise to a continuous capability; one that is especially important as regulations proliferate across countries, industries and AI‑specific frameworks.

New risks in an AI‑driven world

While many data risks are familiar, AI introduces new complexities, particularly around speed and scale.

Migally points to “toxic combinations” of data as a growing challenge. Static rules that worked in traditional environments struggle to keep up when access patterns, tools and data relationships change dynamically.

“With AI, access changes faster than ever before,” he says. “You need to detect and respond at the same speed.”

Deepfakes, synthetic data and increasingly convincing generative outputs add another layer of trust concerns, even if Veeam itself focuses on the underlying unstructured data rather than generated content.

The real threat

Despite heightened awareness of cyber risk, Migally believes the greatest vulnerability isn't lack of technology, advanced persistent threats or ransomware, but rather, a very human one.

“The biggest challenge isn’t lack of tools,” he says. “It’s complacency.”

He elaborated that the assumption that “nothing bad, such as a security breach, has happened yet” (and the presumption that it won't) can delay critical investment until after an incident, when the cost is far higher. Migally likens it to insurance, pointing out that purchasing protection after an accident rarely delivers the same outcome as being prepared in advance.

Internally, he adds, AI has become a foundational capability at Veeam, embedded across operations from HR to order management.

“If you’re not using AI, you’re not operating at the speed your customers expect,” Migally says.

For this reason, he explained, multiple AI‑driven initiatives have been rolled out across departments within Veeam, reinforcing AI’s role as a core enablement pillar rather than a standalone experiment.

The path forward

Looking three to five years ahead, Migally sees a clear dividing line between organisations that succeed and those that struggle: between those that attain data resilience, and those that fall by the wayside.

He asserts that organisations that adopt a holistic, unified approach to data management, rather than investing only in a best-in-class backup solution or a best-in-class access control tool, will be better positioned to scale, comply and recover.

Migally emphasises that the key distinction between resilient organisations and those at risk lies in comprehensive knowledge of all the assets and data within their operations. Companies that invest the effort to fully understand and account for everything they possess, whether data, tools or systems, lay the foundation for true resilience. That detailed awareness allows them to build robust defences and recovery strategies, ensuring data protection and operational continuity.

In contrast, organisations that lack this level of insight leave themselves vulnerable. Migally warns that even a seemingly minor oversight, such as an unmonitored tool or a forgotten data repository, can result in significant exposure, potentially opening the organisation's entire estate to external threats.

Establishing a complete inventory and understanding of all organisational resources is therefore not just a best practice, but a critical requirement for resilience in 2026.
