GriswoldLabs

Tagged: vllm

1 post

January 29, 2026
Tags: gpu, v100, ai, unraid, vllm, inference

Adding V100 GPUs to an Unraid Server for LLM Inference

Planning guide for installing NVIDIA V100 PCIe 32GB GPUs in a Dell R7525 homelab server. Model sizing, power budgets, and revenue potential.
