LinuxTek Canada linuxtek-canada

@@@@@
Skipping rebuild
@@@@@
If you are experiencing issues with the pre-compiled builds, try setting REBUILD=true
If you are still experiencing issues with the build, try setting CMAKE_ARGS and disabling the instruction sets as needed:
CMAKE_ARGS="-DLLAMA_F16C=OFF -DLLAMA_AVX512=OFF -DLLAMA_AVX2=OFF -DLLAMA_FMA=OFF"
see the documentation at: https://localai.io/basics/build/index.html
Note: See also https://github.com/go-skynet/LocalAI/issues/288
@@@@@
CPU info:
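The log above suggests disabling instruction sets via CMAKE_ARGS. A quick way to see which of those SIMD extensions the CPU actually reports, and therefore which -DLLAMA_* options may need to be OFF, is to check /proc/cpuinfo. This is a minimal sketch for Linux; the flag names (f16c, avx2, avx512f, fma) are the /proc/cpuinfo spellings that correspond to the LLAMA options named in the log:

```shell
# List the SIMD extensions relevant to the CMAKE_ARGS shown above and
# report whether this CPU advertises each one in /proc/cpuinfo.
for flag in f16c avx2 avx512f fma; do
  if grep -qw "$flag" /proc/cpuinfo; then
    echo "$flag: supported"
  else
    echo "$flag: not supported (disable the matching LLAMA option)"
  fi
done
```

Any flag reported as not supported is a candidate for the corresponding -DLLAMA_*=OFF setting from the log.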
@linuxtek-canada
linuxtek-canada / docker-compose.yaml
Last active April 25, 2024 23:45
LocalAI - docker-compose fail - [llama-cpp] Fails: could not load model: rpc error: code = Unavailable desc = error reading from server: EOF
version: '3.8'
services:
  localai:
    image: localai/localai:latest-aio-gpu-hipblas
    deploy:
      resources:
        limits:
          cpus: 8.0
          memory: 32G
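To apply the REBUILD=true and CMAKE_ARGS suggestions from the log to this compose file, the variables can be set in the service's environment. A sketch, using the exact values printed in the log above (whether all four sets need to be disabled depends on the actual CPU):

```yaml
version: '3.8'
services:
  localai:
    image: localai/localai:latest-aio-gpu-hipblas
    environment:
      # Force a source rebuild instead of using the pre-compiled binaries.
      - REBUILD=true
      # Values taken from the log output; drop the options your CPU supports.
      - CMAKE_ARGS=-DLLAMA_F16C=OFF -DLLAMA_AVX512=OFF -DLLAMA_AVX2=OFF -DLLAMA_FMA=OFF
    deploy:
      resources:
        limits:
          cpus: 8.0
          memory: 32G
```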
@linuxtek-canada
linuxtek-canada / gist:115043e27ed3849a2b3d0f0b4b2a07d4
Created April 24, 2024 13:13
LocalAI docker-compose configuration
version: '3.8'
services:
  localai:
    image: localai/localai:latest-aio-gpu-hipblas
    deploy:
      resources:
        limits:
          cpus: 8.0
          memory: 32G