LocalAI version:
d7dee3a
Environment, CPU architecture, OS, and Version:
24.2.0 Darwin Kernel Version 24.2.0: Fri Dec 6 19:03:40 PST 2024; root:xnu-11215.61.5~2/RELEASE_ARM64_T6041 arm64
Describe the bug
```
 > [stage-9 10/13] RUN if [[ ( "${EXTRA_BACKENDS}" =~ "coqui" || -z "${EXTRA_BACKENDS}" ) && "extras" == "extras" ]]; then make -C backend/python/coqui ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "parler-tts" || -z   80.9s
 => [stage-9 11/13] RUN if [[ ( "${EXTRA_BACKENDS}" =~ "vall-e-x" || -z "${EXTRA_BACKENDS}" ) && "extras" == "extras" ]]; then make -C backend/python/vall-e-x ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "openvoice"  122.6s
 => ERROR [stage-9 12/13] RUN if [[ ( "${EXTRA_BACKENDS}" =~ "vllm" || -z "${EXTRA_BACKENDS}" ) && "extras" == "extras" ]]; then make -C backend/python/vllm ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "autogptq" || -  12.9s
------
 > [stage-9 12/13] RUN if [[ ( "${EXTRA_BACKENDS}" =~ "vllm" || -z "${EXTRA_BACKENDS}" ) && "extras" == "extras" ]]; then make -C backend/python/vllm ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "autogptq" || -z "${EXTRA_BACKENDS}" ) && "extras" == "extras" ]]; then make -C backend/python/autogptq ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "bark" || -z "${EXTRA_BACKENDS}" ) && "extras" == "extras" ]]; then make -C backend/python/bark ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "rerankers" || -z "${EXTRA_BACKENDS}" ) && "extras" == "extras" ]]; then make -C backend/python/rerankers ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "mamba" || -z "${EXTRA_BACKENDS}" ) && "extras" == "extras" ]]; then make -C backend/python/mamba ; fi:
0.490 make: Entering directory '/build/backend/python/vllm'
0.490 python3 -m grpc_tools.protoc -I../.. --python_out=. --grpc_python_out=. backend.proto
0.513 bash install.sh
0.515 Initializing libbackend for vllm
0.520 Using CPython 3.10.12 interpreter at: /usr/bin/python3.10
0.520 Creating virtual environment at: venv
0.521 virtualenv created
0.522 virtualenv activated
0.522 activated virtualenv has been ensured
0.522 starting requirements install for /build/backend/python/vllm/requirements-install.txt
0.542 Using Python 3.10.12 environment at: venv
0.544 Resolved 3 packages in 1ms
0.837 Installed 3 packages in 292ms
0.837  + packaging==24.2
0.837  + setuptools==75.8.0
0.837  + wheel==0.45.1
0.838 finished requirements install for /build/backend/python/vllm/requirements-install.txt
0.838 starting requirements install for /build/backend/python/vllm/requirements.txt
0.841 Using Python 3.10.12 environment at: venv
0.845 Resolved 4 packages in 3ms
0.922 Installed 3 packages in 77ms
0.922  + certifi==2024.12.14
0.922  + grpcio==1.69.0
0.922  + protobuf==5.29.3
0.924 finished requirements install for /build/backend/python/vllm/requirements.txt
0.924 starting requirements install for /build/backend/python/vllm/requirements-cpu.txt
0.927 Using Python 3.10.12 environment at: venv
0.933 Resolved 25 packages in 5ms
8.096 Installed 23 packages in 7.16s
8.096  + accelerate==1.2.1
8.096  + charset-normalizer==3.4.1
8.096  + filelock==3.16.1
8.096  + fsspec==2024.12.0
8.096  + huggingface-hub==0.27.1
8.096  + idna==3.10
8.096  + jinja2==3.1.5
8.096  + markupsafe==3.0.2
8.096  + mpmath==1.3.0
8.096  + networkx==3.4.2
8.096  + numpy==2.2.1
8.096  + psutil==6.1.1
8.096  + pyyaml==6.0.2
8.096  + regex==2024.11.6
8.096  + requests==2.32.3
8.096  + safetensors==0.5.2
8.096  + sympy==1.13.3
8.096  + tokenizers==0.21.0
8.096  + torch==2.4.1
8.096  + tqdm==4.67.1
8.096  + transformers==4.48.0
8.096  + typing-extensions==4.12.2
8.096  + urllib3==2.3.0
8.098 finished requirements install for /build/backend/python/vllm/requirements-cpu.txt
8.098 starting requirements install for /build/backend/python/vllm/requirements-cpu.txt
8.101 Using Python 3.10.12 environment at: venv
8.103 Audited 3 packages in 2ms
8.104 finished requirements install for /build/backend/python/vllm/requirements-cpu.txt
8.104 starting requirements install for /build/backend/python/vllm/requirements-after.txt
8.106 Using Python 3.10.12 environment at: venv
9.608 Resolved 123 packages in 1.50s
12.74   × Failed to build `vllm==0.6.6.post1`
12.74   ├─▶ The build backend returned an error
12.74   ╰─▶ Call to `setuptools.build_meta.build_wheel` failed (exit status: 1)
12.74
12.74       [stderr]
12.74       Traceback (most recent call last):
12.74         File "<string>", line 11, in <module>
12.74         File "/build/backend/python/vllm/venv/lib/python3.10/site-packages/setuptools/build_meta.py", line 435, in build_wheel
12.74           return _build(['bdist_wheel'])
12.74         File "/build/backend/python/vllm/venv/lib/python3.10/site-packages/setuptools/build_meta.py", line 426, in _build
12.74           return self._build_with_temp_dir(
12.74         File "/build/backend/python/vllm/venv/lib/python3.10/site-packages/setuptools/build_meta.py", line 407, in _build_with_temp_dir
12.74           self.run_setup()
12.74         File "/build/backend/python/vllm/venv/lib/python3.10/site-packages/setuptools/build_meta.py", line 320, in run_setup
12.74           exec(code, locals())
12.74         File "<string>", line 15, in <module>
12.74       ModuleNotFoundError: No module named 'setuptools_scm'
12.74
12.74       hint: This usually indicates a problem with the package or the build environment.
12.74 make: Leaving directory '/build/backend/python/vllm'
12.74 make: *** [Makefile:3: vllm] Error 1
------
Dockerfile:468
--------------------
 467 |
 468 | >>> RUN if [[ ( "${EXTRA_BACKENDS}" =~ "vllm" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
 469 | >>>       make -C backend/python/vllm \
 470 | >>>     ; fi && \
 471 | >>>     if [[ ( "${EXTRA_BACKENDS}" =~ "autogptq" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
 472 | >>>       make -C backend/python/autogptq \
 473 | >>>     ; fi && \
 474 | >>>     if [[ ( "${EXTRA_BACKENDS}" =~ "bark" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
 475 | >>>       make -C backend/python/bark \
 476 | >>>     ; fi && \
 477 | >>>     if [[ ( "${EXTRA_BACKENDS}" =~ "rerankers" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
 478 | >>>       make -C backend/python/rerankers \
 479 | >>>     ; fi && \
 480 | >>>     if [[ ( "${EXTRA_BACKENDS}" =~ "mamba" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
 481 | >>>       make -C backend/python/mamba \
 482 | >>>     ; fi
 483 |
--------------------
ERROR: failed to solve: process "/bin/bash -c if [[ ( \"${EXTRA_BACKENDS}\" =~ \"vllm\" || -z \"${EXTRA_BACKENDS}\" ) && \"$IMAGE_TYPE\" == \"extras\" ]]; then make -C backend/python/vllm ; fi && if [[ ( \"${EXTRA_BACKENDS}\" =~ \"autogptq\" || -z \"${EXTRA_BACKENDS}\" ) && \"$IMAGE_TYPE\" == \"extras\" ]]; then make -C backend/python/autogptq ; fi && if [[ ( \"${EXTRA_BACKENDS}\" =~ \"bark\" || -z \"${EXTRA_BACKENDS}\" ) && \"$IMAGE_TYPE\" == \"extras\" ]]; then make -C backend/python/bark ; fi && if [[ ( \"${EXTRA_BACKENDS}\" =~ \"rerankers\" || -z \"${EXTRA_BACKENDS}\" ) && \"$IMAGE_TYPE\" == \"extras\" ]]; then make -C backend/python/rerankers ; fi && if [[ ( \"${EXTRA_BACKENDS}\" =~ \"mamba\" || -z \"${EXTRA_BACKENDS}\" ) && \"$IMAGE_TYPE\" == \"extras\" ]]; then make -C backend/python/mamba ; fi" did not complete successfully: exit code: 2
```
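One possible fix, sketched here as an untested assumption: the log above shows that `requirements-install.txt` already pre-installs the build prerequisites (`packaging`, `setuptools`, `wheel`) before anything is built, so `setuptools_scm` could be appended there as well, making it available before the vllm sdist's `setup.py` tries to import it:

```shell
# Untested sketch: add setuptools_scm to the vllm backend's
# requirements-install.txt so it is installed alongside
# packaging/setuptools/wheel before the sdist build runs.
# Run from the LocalAI checkout root; mkdir -p only guards a fresh tree.
mkdir -p backend/python/vllm
printf 'setuptools_scm\n' >> backend/python/vllm/requirements-install.txt
cat backend/python/vllm/requirements-install.txt
```

Whether this is enough depends on how install.sh invokes the build (if the resolver builds vllm in an isolated environment, a pre-installed module would not be visible there), so treat it as a direction, not a verified patch.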
Tried adding `setuptools_scm` to `/build/backend/python/vllm/requirements-after.txt`. This resulted in:
```
 => [stage-9 11/13] RUN if [[ ( "${EXTRA_BACKENDS}" =~ "vall-e-x" || -z "${EXTRA_BACKENDS}" ) && "extras" == "extras" ]]; then make -C backend/python/vall-e-x ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "openvoice"  164.8s
 => ERROR [stage-9 12/13] RUN if [[ ( "${EXTRA_BACKENDS}" =~ "vllm" || -z "${EXTRA_BACKENDS}" ) && "extras" == "extras" ]]; then make -C backend/python/vllm ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "autogptq" || -  12.5s
------
 > [stage-9 12/13] RUN if [[ ( "${EXTRA_BACKENDS}" =~ "vllm" || -z "${EXTRA_BACKENDS}" ) && "extras" == "extras" ]]; then make -C backend/python/vllm ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "autogptq" || -z "${EXTRA_BACKENDS}" ) && "extras" == "extras" ]]; then make -C backend/python/autogptq ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "bark" || -z "${EXTRA_BACKENDS}" ) && "extras" == "extras" ]]; then make -C backend/python/bark ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "rerankers" || -z "${EXTRA_BACKENDS}" ) && "extras" == "extras" ]]; then make -C backend/python/rerankers ; fi && if [[ ( "${EXTRA_BACKENDS}" =~ "mamba" || -z "${EXTRA_BACKENDS}" ) && "extras" == "extras" ]]; then make -C backend/python/mamba ; fi:
0.630 make: Entering directory '/build/backend/python/vllm'
0.630 python3 -m grpc_tools.protoc -I../.. --python_out=. --grpc_python_out=. backend.proto
0.653 bash install.sh
0.656 Initializing libbackend for vllm
0.662 Using CPython 3.10.12 interpreter at: /usr/bin/python3.10
0.662 Creating virtual environment at: venv
0.662 virtualenv created
0.663 virtualenv activated
0.663 activated virtualenv has been ensured
0.663 starting requirements install for /build/backend/python/vllm/requirements-install.txt
0.683 Using Python 3.10.12 environment at: venv
0.687 Resolved 3 packages in 2ms
0.985 Installed 3 packages in 297ms
0.985  + packaging==24.2
0.985  + setuptools==75.8.0
0.985  + wheel==0.45.1
0.986 finished requirements install for /build/backend/python/vllm/requirements-install.txt
0.986 starting requirements install for /build/backend/python/vllm/requirements.txt
0.988 Using Python 3.10.12 environment at: venv
0.992 Resolved 4 packages in 3ms
1.070 Installed 3 packages in 77ms
1.070  + certifi==2024.12.14
1.070  + grpcio==1.69.0
1.070  + protobuf==5.29.3
1.072 finished requirements install for /build/backend/python/vllm/requirements.txt
1.072 starting requirements install for /build/backend/python/vllm/requirements-cpu.txt
1.075 Using Python 3.10.12 environment at: venv
1.081 Resolved 28 packages in 5ms
8.185 Installed 25 packages in 7.10s
8.185  + accelerate==1.2.1
8.185  + charset-normalizer==3.4.1
8.185  + filelock==3.16.1
8.185  + fsspec==2024.12.0
8.185  + huggingface-hub==0.27.1
8.185  + idna==3.10
8.185  + jinja2==3.1.5
8.185  + markupsafe==3.0.2
8.185  + mpmath==1.3.0
8.185  + networkx==3.4.2
8.185  + numpy==2.2.1
8.185  + psutil==6.1.1
8.185  + pyyaml==6.0.2
8.185  + regex==2024.11.6
8.185  + requests==2.32.3
8.185  + safetensors==0.5.2
8.185  + setuptools-scm==8.1.0
8.185  + sympy==1.13.3
8.185  + tokenizers==0.21.0
8.185  + tomli==2.2.1
8.185  + torch==2.4.1
8.185  + tqdm==4.67.1
8.185  + transformers==4.48.0
8.185  + typing-extensions==4.12.2
8.185  + urllib3==2.3.0
8.187 finished requirements install for /build/backend/python/vllm/requirements-cpu.txt
8.187 starting requirements install for /build/backend/python/vllm/requirements-cpu.txt
8.190 Using Python 3.10.12 environment at: venv
8.193 Audited 4 packages in 3ms
8.193 finished requirements install for /build/backend/python/vllm/requirements-cpu.txt
8.193 starting requirements install for /build/backend/python/vllm/requirements-after.txt
8.196 Using Python 3.10.12 environment at: venv
8.880 Resolved 123 packages in 683ms
12.33   × Failed to build `vllm==0.6.6.post1`
12.33   ├─▶ The build backend returned an error
12.33   ╰─▶ Call to `setuptools.build_meta.build_wheel` failed (exit status: 1)
12.33
12.33       [stderr]
12.33       /bin/sh: 1: lsmod: not found
12.33       Traceback (most recent call last):
12.33         File "<string>", line 11, in <module>
12.33         File "/build/backend/python/vllm/venv/lib/python3.10/site-packages/setuptools/build_meta.py", line 435, in build_wheel
12.33           return _build(['bdist_wheel'])
12.33         File "/build/backend/python/vllm/venv/lib/python3.10/site-packages/setuptools/build_meta.py", line 426, in _build
12.33           return self._build_with_temp_dir(
12.33         File "/build/backend/python/vllm/venv/lib/python3.10/site-packages/setuptools/build_meta.py", line 407, in _build_with_temp_dir
12.33           self.run_setup()
12.33         File "/build/backend/python/vllm/venv/lib/python3.10/site-packages/setuptools/build_meta.py", line 320, in run_setup
12.33           exec(code, locals())
12.33         File "<string>", line 606, in <module>
12.33         File "<string>", line 508, in get_vllm_version
12.33       RuntimeError: Unknown runtime environment
12.33
12.33       hint: This usually indicates a problem with the package or the build environment.
12.34 make: Leaving directory '/build/backend/python/vllm'
12.34 make: *** [Makefile:3: vllm] Error 1
------
Dockerfile:471
--------------------
 470 |
 471 | >>> RUN if [[ ( "${EXTRA_BACKENDS}" =~ "vllm" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
 472 | >>>       make -C backend/python/vllm \
 473 | >>>     ; fi && \
 474 | >>>     if [[ ( "${EXTRA_BACKENDS}" =~ "autogptq" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
 475 | >>>       make -C backend/python/autogptq \
 476 | >>>     ; fi && \
 477 | >>>     if [[ ( "${EXTRA_BACKENDS}" =~ "bark" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
 478 | >>>       make -C backend/python/bark \
 479 | >>>     ; fi && \
 480 | >>>     if [[ ( "${EXTRA_BACKENDS}" =~ "rerankers" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
 481 | >>>       make -C backend/python/rerankers \
 482 | >>>     ; fi && \
 483 | >>>     if [[ ( "${EXTRA_BACKENDS}" =~ "mamba" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]; then \
 484 | >>>       make -C backend/python/mamba \
 485 | >>>     ; fi
 486 |
--------------------
ERROR: failed to solve: process "/bin/bash -c if [[ ( \"${EXTRA_BACKENDS}\" =~ \"vllm\" || -z \"${EXTRA_BACKENDS}\" ) && \"$IMAGE_TYPE\" == \"extras\" ]]; then make -C backend/python/vllm ; fi && if [[ ( \"${EXTRA_BACKENDS}\" =~ \"autogptq\" || -z \"${EXTRA_BACKENDS}\" ) && \"$IMAGE_TYPE\" == \"extras\" ]]; then make -C backend/python/autogptq ; fi && if [[ ( \"${EXTRA_BACKENDS}\" =~ \"bark\" || -z \"${EXTRA_BACKENDS}\" ) && \"$IMAGE_TYPE\" == \"extras\" ]]; then make -C backend/python/bark ; fi && if [[ ( \"${EXTRA_BACKENDS}\" =~ \"rerankers\" || -z \"${EXTRA_BACKENDS}\" ) && \"$IMAGE_TYPE\" == \"extras\" ]]; then make -C backend/python/rerankers ; fi && if [[ ( \"${EXTRA_BACKENDS}\" =~ \"mamba\" || -z \"${EXTRA_BACKENDS}\" ) && \"$IMAGE_TYPE\" == \"extras\" ]]; then make -C backend/python/mamba ; fi" did not complete successfully: exit code: 2
```
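The second failure happens before pip-level dependency resolution: vllm's `setup.py` (see `get_vllm_version` in the traceback) probes the runtime to decide which build target it is on, and the probe (which shells out to `lsmod`, absent in this slim image) recognizes nothing, hence `RuntimeError: Unknown runtime environment`. vllm's build also reads a `VLLM_TARGET_DEVICE` environment variable to pin the target explicitly; whether that bypasses this particular probe in 0.6.6.post1 is an assumption worth checking against its `setup.py`, but the idea sketches as:

```shell
# Hedged workaround sketch (assumes VLLM_TARGET_DEVICE is honored by the
# vllm version being built): pin the build target up front so setup.py
# does not have to auto-detect the runtime environment.
export VLLM_TARGET_DEVICE=cpu
echo "building vllm with VLLM_TARGET_DEVICE=${VLLM_TARGET_DEVICE}"
```

In the Docker build this would need to be exported inside install.sh (or passed as an `ENV`/`ARG`) before the vllm requirements step runs.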
To Reproduce
```
git clone git@github.com:mudler/LocalAI.git
cd LocalAI
docker build -t localai .
```
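As a stopgap while this is investigated, the Dockerfile conditionals quoted in the error show that the vllm step runs whenever `EXTRA_BACKENDS` is empty or contains `vllm`. Extracting that gate makes the behavior easy to confirm locally (the backend names below are illustrative; check the Dockerfile for the accepted values):

```shell
# The gate from the Dockerfile, extracted so it can be sanity-checked:
# vllm builds when EXTRA_BACKENDS is empty or mentions "vllm"
# (and IMAGE_TYPE is "extras").
should_build_vllm() {
  local EXTRA_BACKENDS="$1" IMAGE_TYPE="$2"
  [[ ( "${EXTRA_BACKENDS}" =~ "vllm" || -z "${EXTRA_BACKENDS}" ) && "$IMAGE_TYPE" == "extras" ]]
}

should_build_vllm "" "extras"           && echo "empty EXTRA_BACKENDS: vllm builds"
should_build_vllm "coqui bark" "extras" || echo "list without vllm: step skipped"
```

So a build like `docker build --build-arg EXTRA_BACKENDS="coqui bark" -t localai .` should skip the failing vllm step entirely.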
Expected behavior
A successful docker build and then run.

Logs
Added above.

Additional context