GPU Servers
Dedicated GPU instances with NVIDIA hardware and Ollama pre-installed. Run LLMs privately on your own hardware — your prompts and data never leave the VM.
Most Popular
GPU A16
Entry-level GPU for local inference and AI experimentation.
$449/month
CPU: 6 vCPU
RAM: 64 GB
Disk: 500 GB NVMe
BW: 8 TB Transfer
GPU: A16 — 16 GB VRAM
- Everything in Ultra
- NVIDIA A16 — 16 GB GDDR6
- Ollama pre-installed
- Private on-device inference
- Run 7B-13B parameter models
- 3 workspaces
- Daily backups
- Firewall management (50 rules)
- SSH access
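The 7B-13B sizing guidance above can be sanity-checked with a rough rule of thumb (our assumption, not a vendor spec): a 4-bit quantized model needs on the order of 0.6 bytes per parameter, plus context overhead.

```shell
# Back-of-envelope VRAM check (heuristic only; real usage varies with
# quantization level and context length).
params_b=13   # model size in billions of parameters
vram_gb=$(python3 -c "print(round($params_b * 0.6, 1))")
echo "estimated VRAM for ${params_b}B model: ${vram_gb} GB"
# A ~13B model at 4-bit lands well under the A16's 16 GB of VRAM.
```

By the same heuristic, 30B-70B models need roughly 18-42 GB, which is why those tiers call for the A100 or L40S.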
GPU A100
High-performance GPU for production inference and fine-tuning.
$1249/month
CPU: 6 vCPU
RAM: 60 GB
Disk: 800 GB NVMe
BW: 10 TB Transfer
GPU: A100 — 40 GB VRAM
- Everything in GPU A16
- NVIDIA A100 — 40 GB HBM2e
- Run 30B-70B parameter models
- Fine-tuning capable
- 5 workspaces
- Hourly backups
- Firewall management (50 rules)
- SSH access
GPU L40S
Dedicated L40S GPU for private inference and production AI workloads.
$1499/month
CPU: 16 vCPU
RAM: 180 GB
Disk: 1200 GB NVMe
BW: 12 TB Transfer
GPU: L40S — 48 GB VRAM
- Everything in GPU A100
- NVIDIA L40S — 48 GB GDDR6
- Run 70B+ parameter models
- 5 workspaces
- Hourly backups
- Firewall management (50 rules)
- SSH access
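Since every plan ships with Ollama pre-installed and SSH access, a minimal private-inference call looks like this (model name and endpoint are assumptions; 11434 is Ollama's default port):

```shell
# Sketch of calling the Ollama HTTP API on one of these instances.
# Build and locally validate the request payload before sending it.
payload='{"model": "llama3:8b", "prompt": "Say hello", "stream": false}'
echo "$payload" | python3 -m json.tool > /dev/null && echo "payload ok"
# On the VM itself (or through an SSH tunnel), send it to the local daemon:
# curl http://localhost:11434/api/generate -d "$payload"
```

To keep traffic off the public internet, you can forward the port over SSH from your workstation (`ssh -N -L 11434:localhost:11434 root@<server>`) and point local tools at `localhost:11434`.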
Compare GPU plans
| | GPU A16 | GPU A100 | GPU L40S |
| --- | --- | --- | --- |
| Price | $449/month | $1249/month | $1499/month |
| vCPU | 6 | 6 | 16 |
| RAM | 64 GB | 60 GB | 180 GB |
| Storage | 500 GB | 800 GB | 1200 GB |
| GPU | A16 | A100 | L40S |
| VRAM | 16 GB | 40 GB | 48 GB |
| Bandwidth | 8 TB Transfer | 10 TB Transfer | 12 TB Transfer |
| Workspaces | 3 | 5 | 5 |
| Backups | Daily | Hourly | Hourly |
| Firewall Rules | 50 | 50 | 50 |
| REST API | | | |
| BYOK (any LLM) | | | |
| Workspace Sandboxing | | | |
| Ollama Pre-installed | ✓ | ✓ | ✓ |
| Private Inference | ✓ | ✓ | ✓ |
| SSH Access | ✓ | ✓ | ✓ |