This repository was archived by the owner on Nov 24, 2021. It is now read-only.
ERROR: Failed building wheel for tokenizers, even though setuptools-rust is installed #4
Open
Description
Hi!
While trying to run pip install docly, I get this error:
Building wheel for tokenizers (PEP 517) ... error
ERROR: Command errored out with exit status 1:
command: /Users/krisku/opt/miniconda3/envs/auto_doc/bin/python /Users/krisku/opt/miniconda3/envs/auto_doc/lib/python3.9/site-packages/pip/_vendor/pep517/_in_process.py build_wheel /var/folders/mr/5nhwtm3n4dvfdkh3sf2xs41h0000gn/T/tmpb3b95crr
cwd: /private/var/folders/mr/5nhwtm3n4dvfdkh3sf2xs41h0000gn/T/pip-install-syhlo38v/tokenizers_eb68e965c7654f6bbe50cefd6fff55af
Complete output (36 lines):
running bdist_wheel
running build
running build_py
creating build
creating build/lib
creating build/lib/tokenizers
copying tokenizers/__init__.py -> build/lib/tokenizers
creating build/lib/tokenizers/models
copying tokenizers/models/__init__.py -> build/lib/tokenizers/models
creating build/lib/tokenizers/decoders
copying tokenizers/decoders/__init__.py -> build/lib/tokenizers/decoders
creating build/lib/tokenizers/normalizers
copying tokenizers/normalizers/__init__.py -> build/lib/tokenizers/normalizers
creating build/lib/tokenizers/pre_tokenizers
copying tokenizers/pre_tokenizers/__init__.py -> build/lib/tokenizers/pre_tokenizers
creating build/lib/tokenizers/processors
copying tokenizers/processors/__init__.py -> build/lib/tokenizers/processors
creating build/lib/tokenizers/trainers
copying tokenizers/trainers/__init__.py -> build/lib/tokenizers/trainers
creating build/lib/tokenizers/implementations
copying tokenizers/implementations/byte_level_bpe.py -> build/lib/tokenizers/implementations
copying tokenizers/implementations/sentencepiece_bpe.py -> build/lib/tokenizers/implementations
copying tokenizers/implementations/base_tokenizer.py -> build/lib/tokenizers/implementations
copying tokenizers/implementations/__init__.py -> build/lib/tokenizers/implementations
copying tokenizers/implementations/char_level_bpe.py -> build/lib/tokenizers/implementations
copying tokenizers/implementations/bert_wordpiece.py -> build/lib/tokenizers/implementations
copying tokenizers/__init__.pyi -> build/lib/tokenizers
copying tokenizers/models/__init__.pyi -> build/lib/tokenizers/models
copying tokenizers/decoders/__init__.pyi -> build/lib/tokenizers/decoders
copying tokenizers/normalizers/__init__.pyi -> build/lib/tokenizers/normalizers
copying tokenizers/pre_tokenizers/__init__.pyi -> build/lib/tokenizers/pre_tokenizers
copying tokenizers/processors/__init__.pyi -> build/lib/tokenizers/processors
copying tokenizers/trainers/__init__.pyi -> build/lib/tokenizers/trainers
running build_ext
running build_rust
error: Can not find Rust compiler
----------------------------------------
ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers which use PEP 517 and cannot be installed directly
This happens even though setuptools-rust is installed.
macOS Catalina 10.15.7
Python 3.9.1
conda 4.9.2 + pip 21.0.1
I'd appreciate your help.
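For what it's worth, the failing step is "running build_rust" with "error: Can not find Rust compiler", which suggests setuptools-rust alone is not enough: it is only the build bridge and still needs an actual Rust toolchain (rustc/cargo) on PATH. Here is what I plan to try, sketched under the assumption that rustup is used as the installer:

```shell
# Install the Rust toolchain via rustup (assumes it is not yet installed).
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y

# Put cargo/rustc on PATH for the current shell session.
source "$HOME/.cargo/env"

# Confirm the compiler pip will invoke is now visible, then retry.
rustc --version
pip install docly
```

If rustc --version prints a version before retrying, the "Can not find Rust compiler" error from build_rust should no longer be the blocker.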