
Image: Green data center racks

AI as a Service: A Sovereign AI solution from Switzerland

AI offers significant opportunities: higher efficiency, faster innovation, and long-term competitiveness. But how can organizations use AI models securely without regulatory requirements or cross-border data processing becoming a risk?

Phoenix and Green have launched a sovereign AI solution that is operated and hosted locally in Switzerland. Built on the latest NVIDIA H200 GPUs, it enables enterprises to run leading large language models in a controlled environment. Thomas Taroni, Chairman of the Board at Phoenix Technologies, and Marco Stadler, Chief Sales Officer at Green, explain how organizations can execute their AI initiatives using this new platform.
 

Data sovereignty, control, and compliance are critical when deploying AI. Thomas Taroni, how does your solution address these requirements?

Thomas Taroni: Enterprises need to maintain full control over their data at all times. To achieve that, data processing must remain local. When data is used by AI models, it needs to be decrypted briefly for computation. If this happens outside the country, the data may be exposed to additional risk. With our solution, data is processed exclusively in Switzerland. No data is transferred or processed abroad.

Image: Thomas Taroni, Phoenix, in conversation with Green

In AI deployments, data protection is not a choice. It is a fundamental requirement.

Thomas Taroni, Chairman of the Board at Phoenix Technologies


Beyond data protection, sovereignty also means freedom of choice at the technology level. Our customers are not locked into a single model. They have the flexibility to deploy different models depending on their use case, including DeepSeek, Granite, Llama, Apertus, and others. With our solution, enterprises can select the model that best aligns with their technical and business requirements.
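The interview does not describe the platform's API, but model flexibility of this kind is commonly expressed as a simple mapping from use case to model identifier, with the chosen name passed as a request parameter to the inference endpoint. A minimal sketch; the catalogue entries and model names below are illustrative assumptions, not the platform's actual identifiers:

```python
# Hypothetical model catalogue: names are placeholders, not the actual
# identifiers offered by the Phoenix/Green platform.
MODEL_CATALOGUE = {
    "code-assistance": "deepseek-coder",
    "document-summarization": "llama-3.1-70b-instruct",
    "swiss-legal-qa": "apertus-70b",
    "general-chat": "granite-3-8b-instruct",
}

def select_model(use_case: str, default: str = "general-chat") -> str:
    """Pick a model identifier for a given use case, falling back to a default."""
    return MODEL_CATALOGUE.get(use_case, MODEL_CATALOGUE[default])
```

In such a setup, switching models for a new use case is a configuration change rather than a platform migration, which is the freedom of choice described above.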
 

Many enterprise data sets already reside in the public cloud today. In this context, does it still make sense to adopt the new AI as a Service offering from Phoenix and Green?

Marco Stadler: Most organizations operate hybrid IT architectures: some workloads run in the public cloud, SaaS services from global providers are in wide use, and other systems are still operated on-premises or in private clouds. From a cost and architecture perspective, it is advisable to bring data sources and data processing as close together as possible, including physically. This consideration complements the broader requirements around digital sovereignty.

Green is the preferred data center provider for many hyperscalers with Swiss data regions. This allows AI clusters based on the new AI platform to be deployed in close proximity to relevant data sources. The approach is particularly attractive for enterprises that host their private infrastructure at Green. Cost-efficient cross-connects enable direct connectivity between enterprise environments and sovereign AI services.

 

Image: Marco Stadler, CSO of Green

Data gravity makes one principle clear: AI should run close to the data.

Marco Stadler, Chief Sales Officer Green

 


Can local AI clusters compete technologically?

Thomas Taroni: Yes, absolutely. Inference, meaning the use of trained models, does not require gigafactories of the kind we associate with training large language models. What matters is the ability to deploy trained models efficiently for real-world applications. That is why we rely on the latest NVIDIA H200 GPUs. They are the first of their kind to run in production in Europe, hosted in Green’s data centers.
 

What capabilities do modern data centers need to support these workloads?

Marco Stadler: First and foremost, the fundamentals: high availability, security, and redundancy. Beyond that, modern GPUs demand significantly higher power density. The pace of innovation is rapid, and data centers must be able to support hybrid cooling architectures that combine air cooling with liquid cooling. Many data centers built just a few years ago were not designed for modern GPU clusters. Green’s facilities are.


Many data centers built just a few years ago are not designed for modern GPU clusters. Green’s facilities are.

Marco Stadler, Chief Sales Officer at Green
 

 


Why did Phoenix Technologies choose to partner with Green?

Thomas Taroni: We have been working together for more than twelve years. Rack space is widely available, but a truly high-availability environment capable of running AI clusters is not. Cooling and power requirements tie AI infrastructure much more closely to the physical building, which means data centers and AI infrastructure increasingly evolve together. Connectivity was another decisive factor. Only at Green did we find the ability to operate across multiple locations with high-performance interconnection. From our perspective, Green is clearly a technology leader.
 

What role does connectivity play for customers?

Marco Stadler: A critical one. It’s about private direct connections, but also about the ability to move very large volumes of data within milliseconds. An AI cluster can generate half a petabyte of output in seconds: images, video, simulation data. These volumes need to be transported without bottlenecks. High-performance connectivity is fundamental. That’s why we rely on direct cross-connects and 100-gigabit Layer-2 links between data center locations.
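A back-of-envelope calculation shows why link capacity and parallelism matter at these volumes. The sketch below is a simplified lower bound: it ignores protocol overhead and assumes the links can be fully aggregated.

```python
def transfer_time_seconds(data_bytes: float, link_gbit_per_s: float, links: int = 1) -> float:
    """Time to move data_bytes over `links` parallel links at the given line rate.

    Ignores protocol overhead and congestion, so this is a lower bound.
    """
    bits = data_bytes * 8
    return bits / (link_gbit_per_s * 1e9 * links)

half_petabyte = 0.5e15  # bytes

# A single 100-gigabit link: roughly 40,000 seconds, i.e. over 11 hours.
t_single = transfer_time_seconds(half_petabyte, 100)

# Aggregating many parallel links shrinks the floor proportionally.
t_forty = transfer_time_seconds(half_petabyte, 100, links=40)
```

The arithmetic makes the point in the answer concrete: a burst of half a petabyte saturates even very fast links for a long time, so avoiding bottlenecks depends on both line rate and the number of direct paths between locations.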
 

When companies start AI projects with large volumes of unstructured data, what do you recommend?

Thomas Taroni: We have developed a seven-step program. The first three steps are what we call “Get ready for AI.” I’m happy to share some key takeaways.
 


Checklist: From unstructured data to AI applications

Preparation

  • Analyze and clean existing data sets

  • Understand input and output relationships: AI learns from input and output. Only then can it deliver meaningful results and identify patterns

  • Define guardrails: Which values and rules should the AI follow?

Pilot

  • Select and test initial use cases

  • Validate results and train employees

Scale

  • Enable broad adoption across the organization

  • Adapt infrastructure and processes
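The preparation phase above can be sketched in code. This is an illustrative minimal example, not the Phoenix program itself: it assumes raw records arrive as dictionaries with hypothetical `input` and `output` fields, and keeps only clean, de-duplicated input/output pairs, the relationships an AI model learns from.

```python
def prepare_pairs(records):
    """Filter raw records down to clean input/output pairs.

    Field names 'input' and 'output' are illustrative, not a prescribed
    schema. Records missing either field are dropped, as are exact
    duplicates.
    """
    seen = set()
    pairs = []
    for rec in records:
        inp = (rec.get("input") or "").strip()
        out = (rec.get("output") or "").strip()
        if not inp or not out:
            continue  # incomplete pair: no input/output relationship to learn
        key = (inp, out)
        if key in seen:
            continue  # duplicates would skew the patterns the model sees
        seen.add(key)
        pairs.append({"input": inp, "output": out})
    return pairs
```

Even a filter this simple covers the first two preparation steps: it cleans the data set and makes the input/output relationship explicit before any pilot begins.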


Why are guardrails so important?

Thomas Taroni: AI does not have built-in ethics. It learns solely from the data we provide. That’s why guardrails are essential. They define how AI should respond to sensitive questions and prevent uncontrolled behavior. Guardrails are like traffic signs. They guide how AI operates in everyday situations. Only once these rules work reliably in a pilot phase should AI be rolled out to the wider organization.
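In their simplest form, the traffic signs described above can be a rule check applied before a model answer reaches the user. A minimal sketch; the blocked topics and refusal wording are placeholder assumptions, and production systems typically use dedicated guardrail frameworks rather than keyword matching:

```python
# Illustrative guardrail rules: topics and wording are placeholders,
# not the actual policy of any deployment.
BLOCKED_TOPICS = ("salary data", "medical records")

def apply_guardrails(prompt: str, model_answer: str) -> str:
    """Return the model answer, or a refusal if the prompt hits a blocked topic."""
    lowered = prompt.lower()
    for topic in BLOCKED_TOPICS:
        if topic in lowered:
            return f"I can't help with questions about {topic}."
    return model_answer
```

The point of the pilot phase is exactly to exercise rules like these against real questions until the refusals and pass-throughs behave predictably.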
 

Should companies start with a vision or with concrete use cases?

Marco Stadler: Both. Organizations need a clear vision, but they also need to take pragmatic first steps. A simple initial use case helps build experience quickly. It’s equally important to involve employees from the beginning. Adoption only happens when people are enabled to work effectively with AI.

Thomas Taroni: Exactly. These initiatives don’t work top-down. There’s no value if two-thirds of the workforce cannot use AI effectively. What matters is that employees understand how to solve tasks with AI and how to achieve high-quality results.


Conclusion

The sovereign AI solution from Phoenix and Green demonstrates that enterprises no longer have to choose between security and innovation. It combines control and security with the freedom to deploy the most suitable technology.

The pace of development will continue to accelerate. Organizations that start now build expertise early and secure a lasting advantage.

 

Talk to us about AI as a Service:

+41 56 460 23 80 or business@green.ch

 
