TinyComputers.io
    Featured AI

    Running vLLM in Docker with AMD ROCm and the Continue.dev CLI

    2026-01-25  · 10 min read

    A practical guide to running vLLM on AMD GPUs using Docker and ROCm, then connecting it to Continue.dev's cn CLI for AI-assisted coding.
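The teaser above can be sketched as a minimal setup: launch vLLM's ROCm container with the AMD GPU device nodes passed through, then point an OpenAI-compatible client (such as Continue.dev's cn CLI) at the served endpoint. The image tag and model name here are illustrative assumptions, not details taken from the article.

```shell
# Sketch only: "rocm/vllm:latest" and the model name are assumptions.
# Pass the AMD GPU device nodes through to the container
# (standard requirements for ROCm inside Docker).
docker run -it --rm \
  --device=/dev/kfd --device=/dev/dri \
  --group-add video \
  --ipc=host \
  -p 8000:8000 \
  rocm/vllm:latest \
  vllm serve Qwen/Qwen2.5-Coder-7B-Instruct

# vLLM exposes an OpenAI-compatible API on port 8000, so a client
# like Continue.dev's cn CLI can be configured against
# http://localhost:8000/v1.
```

This is infrastructure configuration rather than runnable code; consult the full article for the exact image, model, and Continue.dev settings it uses.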


Contents © 2026 A.C. Jokela. This work is licensed under a Creative Commons Attribution-ShareAlike license.
