Kai Robinson

    How to: Use Ollama with unsupported GPU

    NVIDIA has CUDA, AMD has ROCm, and if you're not so lucky, you have nothing for GPU-accelerated LLM workloads? No! Because Ollama now supports Vulkan. This means that if, like me, you have a machine that's not cutting edge, you can still leverage the power of the GPU to accelerate the...
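
    The post preview cuts off above, but the gist can be sketched in a couple of shell commands. This is a minimal sketch, not the author's exact steps: the `OLLAMA_VULKAN` environment variable name is an assumption based on Ollama's experimental opt-in Vulkan backend, and `llama3.2` is just a placeholder model — check the docs for your Ollama version before relying on either.

    ```shell
    # Opt in to the experimental Vulkan backend (variable name assumed;
    # without it, Ollama falls back to CUDA/ROCm or plain CPU inference)
    export OLLAMA_VULKAN=1

    # Start the Ollama server so it enumerates Vulkan-capable GPUs
    ollama serve &

    # Pull and run a model; check the server log to confirm that the
    # Vulkan backend, rather than the CPU, was selected
    ollama run llama3.2 "Hello from my otherwise-unsupported GPU"
    ```

    The appeal of Vulkan here is that it only needs a reasonably modern graphics driver, not a vendor compute stack, so older or integrated GPUs that CUDA and ROCm ignore can still offload inference.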