Rafay Systems
Rafay builds infrastructure orchestration and workflow automation software that powers self-service compute consumption for Sovereign AI Clouds, Cloud Service Providers, and large Enterprises. Customers leverage the Rafay Platform to orchestrate multi-tenant consumption of cloud-native and AI infrastructure, along with AI platforms and applications such as AI-Models-as-a-Service, Accenture's AI Refinery, and other third-party applications.
On this channel, you’ll find tutorials, demos, and expert discussions on AI and cloud-native (Kubernetes) infrastructure orchestration and workflow automation. Learn how to build governed, multi-tenant GPU clouds, accelerate AI/ML workflows, and deliver modern, self-service environments across public clouds, private data centers, and sovereign deployments.
Subscribe to stay ahead of the curve with best practices for scaling AI workloads, optimizing GPU utilization, and empowering teams to innovate—without complexity or bottlenecks.
Rafay Platform enables enterprises in public clouds to accelerate AI and cloud-native use cases.
Rafay Platform enables enterprises in private clouds to accelerate AI and cloud-native use cases.
Rafay Platform enables GPU cloud orchestration for enterprises and cloud providers.
SLURM Clusters with GPU Nodes using Rafay
End User Self Service Experience for Access to a Bare Metal Server using GPU PaaS
Configure and Create a SKU for End User Self Service Bare Metal Servers
How Rafay Solves the GPU utilization problem
Provision and Use Bare Metal Servers using Rafay GPU PaaS
1-Click Deployments of DataRobot using Rafay GPU PaaS
Roles and RBAC in Rafay's PaaS
End User Self Service Access to SLURM Clusters on Kubernetes using Project Slinky
Sharing Data between Jupyter Notebooks using NFS
Run BioContainers using Docker on a Remote VM
Autoscaling Airflow with KEDA using Rafay
Autoscaling Kafka with KEDA using Rafay
End User Self Service Provisioning and Access to VMs on VMware vSphere
How Cloud Providers Can Offer Multi-Tenant, Serverless Inference to Their Customers
End User Self Service: Select, Configure, Provision, and Use a Virtual Machine using Rafay
1-Click Deployments of Dataiku for End Users of GPU Clouds and Cloud Providers using Rafay
Self Service LLM Inference for Developers using Rafay GPU PaaS
Rafay GPU PaaS Demo For GPU Clouds & Enterprises
End User Self Service Access to SageMaker AI
Fractional GPUs using Nvidia's KAI Scheduler
Installation Steps for Rafay's Air-Gapped Controller
Enabling Self-Service Consumption of GPUs and AI Applications with Nvidia
The Growing Shift Towards Data Center and GPU Investments in the Cloud Market
Utilizing Infrastructure and Meeting Customer Demands
Blue/Green Management of Software Add-ons for AKS clusters using Rafay Cluster Blueprints
Blue/Green Kubernetes Upgrades of a Fleet of AKS Clusters
User Self Service Provisioning and Access to DeepSeek on Amazon EKS using Rafay PaaS