GPU AI Calculator
Calculate VRAM requirements and costs for LLM training and inference
Model Configuration
Precision: lower precision reduces memory but may affect accuracy.
Context length — common values: 4096, 8192, 32768, 131072 (128k)
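The precision setting drives memory use roughly linearly. A minimal sketch of the idea, assuming a typical bytes-per-parameter mapping (the names `BYTES_PER_PARAM` and `weight_bytes` are illustrative, not part of the tool):

```python
# Approximate bytes per parameter for common precisions (assumed mapping,
# consistent with the 2 bytes/param FP16 figure this calculator uses).
BYTES_PER_PARAM = {
    "fp32": 4.0,
    "fp16": 2.0,
    "bf16": 2.0,
    "int8": 1.0,
    "int4": 0.5,
}

def weight_bytes(params: float, precision: str) -> float:
    """Raw weight memory in bytes for a model at a given precision."""
    return params * BYTES_PER_PARAM[precision]

# An 8B-parameter model's weights shrink 8x from FP32 to INT4:
print(weight_bytes(8e9, "fp32") / 1e9)  # → 32.0 (GB)
print(weight_bytes(8e9, "int4") / 1e9)  # → 4.0 (GB)
```

Quantized formats (INT8/INT4) trade this memory saving against potential accuracy loss, as the note above says.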
Selected GPU: NVIDIA H100 SXM × 1
VRAM: 80 GB
TFLOPS (FP16): 1979
Bandwidth: 3350 GB/s
Price: $4.00/hr
Total VRAM: 80.00 GB
VRAM Requirement
Required VRAM
17.88 GB
Model weights: 8B params × 2 bytes (FP16) × 1.2 (overhead) = 19.2 GB ≈ 17.88 GiB
Fit Status
Fits comfortably
Available: 80.00 GB
VRAM Usage: 22.4%
Used: 17.88 GB / Total: 80.00 GB
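The VRAM estimate and fit check above can be sketched as follows. This is a minimal reconstruction of the calculator's formula (weights × bytes/param × 1.2 overhead, reported in GiB); the 80% "fits comfortably" threshold is an assumption, not something the tool documents:

```python
def required_vram_gib(params: float, bytes_per_param: float = 2.0,
                      overhead: float = 1.2) -> float:
    """Inference VRAM estimate: params x bytes/param x overhead, in GiB."""
    return params * bytes_per_param * overhead / 2**30

def fit_status(required_gib: float, available_gib: float) -> str:
    """Classify usage; the 80% comfort threshold is an assumed cutoff."""
    usage = required_gib / available_gib
    verdict = ("fits comfortably" if usage < 0.8
               else "tight fit" if usage <= 1.0
               else "does not fit")
    return f"usage {usage:.1%}: {verdict}"

req = required_vram_gib(8e9)   # 8B params at FP16 (2 bytes/param)
print(f"{req:.2f} GiB")        # → 17.88 GiB
print(fit_status(req, 80.0))   # → usage 22.4%: fits comfortably
```

Note the GiB divisor (2^30): 8e9 × 2 × 1.2 = 19.2 GB decimal, which is the 17.88 GiB shown above.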
Cost Estimator
Hourly Cost
$4.00
1 × NVIDIA H100 SXM @ $4.00/hr
Monthly Cost (24/7)
$2,920.00
Assumes 730 hours/month (8,760 hours/year ÷ 12)
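The cost figures follow directly from GPU count × hourly rate, scaled by the 730-hour month. A minimal sketch (the function name `cost` is illustrative):

```python
HOURS_PER_MONTH = 365 * 24 / 12   # 730 h: the 24/7 monthly average used above

def cost(gpu_count: int, hourly_rate: float) -> tuple[float, float]:
    """Return (hourly, monthly-24/7) USD cost for a GPU configuration."""
    hourly = gpu_count * hourly_rate
    return hourly, hourly * HOURS_PER_MONTH

hourly, monthly = cost(1, 4.00)   # 1 × NVIDIA H100 SXM @ $4.00/hr
print(f"${hourly:.2f}/hr, ${monthly:,.2f}/month")  # → $4.00/hr, $2,920.00/month
```

Spot or preemptible instances can cut the hourly rate substantially, which is why the note below suggests them for interruptible training jobs.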
Note: Prices are estimated hourly rates. Actual costs may vary by cloud provider and region. Consider spot instances for training workloads.