
Add Docker as alternative installation and serving method for vLLM

cd1f0f1
Merged

Add Docker as alternative vLLM installation and serving method #54
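The updated page (deployment/gpu-inference/vllm.mdx) is not shown in this log, but as a rough sketch of what "Docker as an alternative serving method" typically looks like, vLLM publishes an official OpenAI-compatible server image, `vllm/vllm-openai`. The model name below is illustrative, not taken from the PR:

```shell
# Run the official vLLM OpenAI-compatible server in Docker.
# Mounts the local Hugging Face cache so downloaded weights persist across runs;
# --ipc=host gives the container the shared memory PyTorch needs.
docker run --runtime nvidia --gpus all \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  -p 8000:8000 \
  --ipc=host \
  vllm/vllm-openai:latest \
  --model mistralai/Mistral-7B-v0.1
```

Once the container is up, the server can be exercised like any OpenAI-compatible endpoint, e.g.:

```shell
curl http://localhost:8000/v1/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "mistralai/Mistral-7B-v0.1", "prompt": "Hello", "max_tokens": 16}'
```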

Mintlify / Mintlify Deployment succeeded Apr 9, 2026 in 36s

Deployment Succeeded


Verified update permissions
Fetching and validating config file...
Fetching .mintignore file...
Fetching ASSISTANT.md file...
Successfully validated docs.json
Successfully fetched .mintignore
No Assistant.md file found
No docsConfig evaluation needed
DocsConfig evaluation complete
Fetched all file paths
Fetched 0 OpenApi file(s)
Fetched 0 AsyncApi file(s)
Skipped OpenAPI navigation generation
Skipped AsyncAPI navigation generation
Successfully updated API reference metadata.
Identified 87 stale file(s) and 0 stale rss file(s)
Updating targeted paths:
  deployment/gpu-inference/vllm.mdx
Successfully updated deployment
LaTeX configuration is unchanged
Successfully saved config
No stale tracked assets found
Updating navigation...
Navigation updated
Skipped editor nav regeneration (no existing record)
Queued search indexing
Skipped search indexing (non-manual update).
Contextual starter questions are not enabled.
Skipped deployment summary for preview deployment.
Successfully deleted stale OpenAPI document(s)
Successfully deleted stale AsyncAPI document(s)
Starting page revalidation...
Revalidating all pages...
Revalidating 109 paths
Page revalidation complete
Successfully deleted stale tracked asset(s)
Queued update of llms-full.txt
Skipped agent readiness scoring for preview deployment
Skipping Vercel revalidation (subdomain not in revalidation list)
Updated Cloudflare deployment cache