ollama: ROCM usage #77

Open
opened 2024-02-23 22:34:21 +00:00 by skobkin · 1 comment
Following #76. We need to find a way to run LLMs using ROCm. See:

- https://github.com/ollama/ollama/issues/738
- https://github.com/ollama/ollama/blob/main/docs/development.md#linux-rocm-amd
- ✅ https://hub.docker.com/r/bergutman/ollama-rocm
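For context, ollama also publishes an upstream ROCm image (`ollama/ollama:rocm`) that expects the AMD device nodes to be passed through. A minimal compose sketch, assuming the upstream image and the standard `/dev/kfd` and `/dev/dri` device nodes (untested in this stack):

```yaml
# Sketch only: assumes the upstream ollama/ollama:rocm image and
# standard AMD device nodes; not yet verified in this stack.
services:
  ollama:
    image: ollama/ollama:rocm
    devices:
      - /dev/kfd   # ROCm compute interface
      - /dev/dri   # GPU render nodes
    volumes:
      - ollama:/root/.ollama   # model storage
    ports:
      - "11434:11434"          # ollama API
volumes:
  ollama:
```

The `bergutman/ollama-rocm` image linked above may differ in entrypoint or required devices, so check its README before swapping it in.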
skobkin added the `to be researched`, `bug`, `enhancement` labels 2024-02-23 22:34:21 +00:00
skobkin self-assigned this 2024-02-23 22:34:21 +00:00
skobkin added a new dependency 2024-02-23 22:34:35 +00:00
skobkin referenced this issue from a commit 2024-03-07 02:04:30 +00:00
Detected the iGPU, but it didn't work due to a missing ROCm module for that target. Need to test on a full-size GPU.
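One workaround documented in ollama's ROCm notes for GPUs that ROCm doesn't list (which can include some iGPUs) is overriding the detected GFX version via `HSA_OVERRIDE_GFX_VERSION`. A hedged sketch; the concrete value is an assumption and must be chosen to match a supported gfx generation close to the actual hardware:

```yaml
# Sketch only: HSA_OVERRIDE_GFX_VERSION is documented by ollama as a
# workaround for unsupported AMD targets; the value below (9.0.0) is
# a placeholder example, not a tested setting for this iGPU.
services:
  ollama:
    image: ollama/ollama:rocm
    environment:
      - HSA_OVERRIDE_GFX_VERSION=9.0.0
    devices:
      - /dev/kfd
      - /dev/dri
```

If the iGPU's target is too far from any supported gfx generation, the override will not help and a discrete GPU is the realistic path, which matches the plan to retest on full-size hardware.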

Depends on
#76 ollama
Reference: skobkin/docker-stacks#77