
Securing Cloud Workloads Beyond “At Rest”: Soumya Debabrata Pani On The Next Frontier Of Trust

Cloud-security expert Soumya Debabrata Pani argues the next frontier is protecting data in use—securing AI workloads, runtime environments, and trust at scale across public-cloud platforms.

Soumya Debabrata Pani

Public-cloud security has travelled far in a decade—but not, says Soumya Debabrata Pani, far enough. The Microsoft-and-Google veteran argues that the industry’s real test now lies in protecting data while it is being processed, not just when it sits on a disk.

“Encryption at rest solved yesterday’s audit question,” Pani notes. “The tougher question is: what happens to an AI model’s training data once it’s loaded into GPU memory?”

The Cost of Standing Still

IBM’s 2024 Cost of a Data Breach study puts the global average impact of a breach at USD 4.88 million—a 10 percent jump in a single year. Such figures, Pani says, force boards to look beyond checkbox compliance.

“Regrettably, we still see security treated as a project completed at deployment. Modern attackers exploit runtime blind spots, not just unpatched servers,” he explains.

Those blind spots are expanding. IDC forecasts public-cloud spending to hit USD 1.35 trillion by 2027, nearly double 2023 levels, as regulated workloads flood off-premises. “Every new workload is a fresh piece of attack surface,” Pani warns. “If we don’t scale protections proportionally, costs will rise faster than adoption.”

From Credentials to Confidential Computing

Pani’s early work at Microsoft focused on automated credential management: rotating, vaulting and retiring passwords for millions of Azure hosts without human touch. The approach, he says, removed roughly “80 percent of what an attacker typically tries first.”
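The mechanics of that zero-touch loop are not public, but the idea can be sketched in a few lines. The snippet below is a hypothetical illustration, not Azure's implementation: a new secret is minted, staged in a vault, and the previous version is retired so that no human ever sees or types the credential. The `Vault` class and host names are invented for the example.

```python
# Hypothetical sketch of zero-touch credential rotation (not Azure's actual
# implementation): mint a new secret, stage it in a vault, then retire the
# old version so no human ever handles the password.
import secrets
from dataclasses import dataclass, field


@dataclass
class Vault:
    """Toy in-memory secret store standing in for a managed vault service."""
    versions: dict = field(default_factory=dict)

    def put(self, name: str, value: str) -> int:
        version = len(self.versions.get(name, [])) + 1
        self.versions.setdefault(name, []).append(value)
        return version

    def retire_old_versions(self, name: str) -> None:
        # Keep only the latest version; older credentials become unusable.
        self.versions[name] = self.versions[name][-1:]


def rotate(vault: Vault, host: str) -> None:
    new_secret = secrets.token_urlsafe(32)  # generated, never displayed
    version = vault.put(f"{host}/admin", new_secret)
    # In a real fleet this step would push the new credential to the host
    # and verify login before retiring the previous version.
    vault.retire_old_versions(f"{host}/admin")
    print(f"{host}: rotated to credential version {version}")


if __name__ == "__main__":
    v = Vault()
    for host in ("az-host-001", "az-host-002"):
        rotate(v, host)
```

At fleet scale the same loop simply runs on a schedule against every host, which is why Pani describes it as removing the credentials an attacker "typically tries first."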

But passwords are only the starting line. In 2020 his team led a fleet-wide BitLocker rollout that encrypted every byte written to disk, including on legacy servers. That project—once deemed “logistically impossible” at hyperscale—has since become table stakes across major providers.

Today, Pani concentrates on privacy-enhancing computation (PEC) techniques at Google Cloud, especially for AI and machine-learning accelerators. Gartner predicts that roughly 60 percent of large organizations will adopt at least one PEC method by 2025 to meet regulatory and competitive pressure.

“Model weights are intellectual property; training data is often personal data,” Pani says. “Techniques like trusted-execution environments, federated learning and selective homomorphic encryption let you compute without exposing either.”
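Federated learning is the most intuitive of the techniques Pani lists: each party computes a model update on its own data and shares only the update, never the records themselves. The following is a minimal sketch of that idea using a linear model and NumPy; it is illustrative only and does not represent any provider's production pipeline.

```python
# Minimal federated-averaging sketch: each client computes a model update on
# its own private data and shares only the update; raw training data never
# leaves the client. Linear model, one gradient step per round (illustrative).
import numpy as np


def local_update(weights, X, y, lr=0.1):
    """One gradient-descent step on a client's private data."""
    grad = 2 * X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad


def federated_round(weights, client_datasets):
    """Average the clients' locally computed weights (FedAvg)."""
    updates = [local_update(weights, X, y) for X, y in client_datasets]
    return np.mean(updates, axis=0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])
    clients = []
    for _ in range(3):  # three clients, each holding private data
        X = rng.normal(size=(50, 2))
        y = X @ true_w + rng.normal(scale=0.1, size=50)
        clients.append((X, y))

    w = np.zeros(2)
    for _ in range(200):
        w = federated_round(w, clients)
    print("recovered weights:", np.round(w, 2))  # approaches [2.0, -1.0]
```

Trusted-execution environments and homomorphic encryption attack the same problem from different angles: the former isolates the computation in attested hardware, the latter performs it on ciphertext.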

Practical Advice for CISOs

1. Treat runtime as a first-class attack surface.

“Many architectures still hand off decrypted data to memory without attestation. Start by mapping where clear-text still appears inside the pipeline.”

2. Build a chain of trust that spans procurement to decommission.

“Hardware-rooted keys should follow the server—literally remain soldered—through its entire life-cycle.”


3. Assume cultural debt equals technical debt.

“Automation fails if engineers default to ‘break-glass’ overrides. Enforce policy through code reviews and compensating controls, not memos.”
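Enforcing that last point "through code" can be as simple as a CI gate that rejects risky configuration before it ships. The sketch below is hypothetical: the config shape and field names are invented for illustration and do not correspond to any real provider's schema.

```python
# Hypothetical policy-as-code gate: fail the build if a deployment config
# enables a break-glass override without an expiry and a logged approver.
# Config shape and field names are illustrative only.
import sys


def violations(config: dict) -> list[str]:
    problems = []
    for name, svc in config.get("services", {}).items():
        bg = svc.get("break_glass", {})
        if bg.get("enabled"):
            if not bg.get("expires_at"):
                problems.append(f"{name}: break-glass override has no expiry")
            if not bg.get("approved_by"):
                problems.append(f"{name}: break-glass override has no approver")
    return problems


if __name__ == "__main__":
    sample = {
        "services": {
            "billing-api": {"break_glass": {"enabled": True}},  # should fail
            "audit-log": {"break_glass": {"enabled": False}},
        }
    }
    found = violations(sample)
    for p in found:
        print("POLICY VIOLATION:", p)
    sys.exit(1 if found else 0)
```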

Looking Ahead

Pani is sceptical of any single “silver-bullet” technology. Instead, he likens modern cloud security to aviation safety: “You fly because many imperfect safeguards overlap.” Confidential-computing enclaves, differential-privacy overlays and post-quantum key hierarchies, he argues, must interlock rather than compete.

What keeps him engaged after a decade of 3 a.m. pager alerts? “We’re building trust at industrial scale,” he says. “Everything people do online—banking, telemedicine, generative-AI diagnostics—assumes the cloud is a safe substrate. Our job is to make that assumption correct before reality proves it wrong.”

As regulators tighten rules on data sovereignty and AI ethics, and as enterprises deepen their investments in cloud infrastructure, securing data in use is becoming a defining challenge of the cloud's next phase. Pani is already working on it.

About Soumya Debabrata Pani

Soumya Debabrata Pani is an experienced cloud-security specialist whose career spans leadership roles at Microsoft and Google Cloud. With more than a decade of experience in public-cloud infrastructure and cybersecurity, Pani has worked on critical initiatives that enhance trust in cloud platforms, particularly through the integration of privacy-enhancing computation (PEC) techniques and runtime protection mechanisms.


At Microsoft, Pani led efforts in automating credential management across Azure, eliminating large classes of vulnerabilities through zero-touch rotation and vaulting. He later worked on the rollout of disk-level encryption using BitLocker across legacy and modern hosts—an initiative leading experts once considered logistically impractical at hyperscale. His contributions helped standardize encryption-at-rest practices across major cloud providers.

Currently at Google Cloud, Pani focuses on securing AI and machine-learning workloads using confidential computing. His work explores advanced security technologies such as trusted-execution environments (TEEs), federated learning, and selective homomorphic encryption—methods designed to protect data while in active use, not just at rest or in transit. He is particularly concerned with protecting model weights and training data, which often contain proprietary and sensitive information.

Pani is a vocal advocate for treating runtime as a primary attack surface and for embedding security throughout the entire hardware and software lifecycle. His philosophy aligns with the belief that overlapping, multi-layered safeguards—rather than a single solution—are essential to maintaining trust at scale.


As enterprises accelerate cloud adoption and regulators demand tighter controls, Pani’s work focuses on the infrastructure underpinning secure, ethical, and resilient digital systems.
