ollama: ROCm usage #77
Depends on: #76 (ollama)
Following #76: we need to find a way to run LLMs using ROCm.
See:
The iGPU was detected, but it didn't work because the ROCm build has no module for that target. This needs to be tested on a full-size (discrete) GPU.
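For targets that the ROCm build doesn't ship a module for (many iGPUs), ollama's AMD documentation describes forcing a nearby supported LLVM target via the `HSA_OVERRIDE_GFX_VERSION` environment variable. A minimal sketch of how that could be wired into the compose service follows; the service name and the override value are illustrative only and must match the actual hardware.

```yaml
# Sketch: spoofing the ROCm GFX target for an unsupported GPU (value is an example, not a recommendation).
services:
  ollama:
    environment:
      # Treat the GPU as a gfx1030-class target; pick the version matching your card.
      HSA_OVERRIDE_GFX_VERSION: "10.3.0"
```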
Already fixed in a recent version. The current `ollama` stack works fine on a discrete GPU.
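For reference, a minimal sketch of a ROCm-enabled ollama service, assuming the upstream `ollama/ollama:rocm` image; this is not necessarily the exact configuration used in this repository. ROCm containers need access to `/dev/kfd` and `/dev/dri`, and the volume name here is hypothetical.

```yaml
# Sketch of a ROCm-enabled ollama compose service (not the exact stack in this repo).
services:
  ollama:
    image: ollama/ollama:rocm     # upstream ROCm build of ollama
    devices:
      - /dev/kfd                  # ROCm compute interface
      - /dev/dri                  # GPU render nodes
    volumes:
      - ollama-data:/root/.ollama # model storage (hypothetical volume name)
    ports:
      - "11434:11434"             # default ollama API port

volumes:
  ollama-data:
```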