conda environment compile tutorial! #16

Open · xyb1314 opened this issue Feb 24, 2025 · 2 comments

xyb1314 commented Feb 24, 2025

Start with the commands below to install the necessary system libraries and set up the packages in the conda environment.

apt-get install libgl1 libglib2.0-0 libsm6 libxrender1 libxext6 libssl-dev build-essential g++ libboost-all-dev libsparsehash-dev git-core perl libegl1-mesa-dev libgl1-mesa-dev -y
conda create -n texgen python=3.10 -y
conda activate texgen
conda install ninja -y
conda install -c "nvidia/label/cuda-11.8.0" cuda-toolkit -y
conda install pytorch==2.1.0 torchvision==0.16.0 pytorch-cuda=11.8 -c pytorch -c nvidia -y
conda install h5py pyyaml -c anaconda -y
conda install sharedarray tensorboard tensorboardx yapf addict einops scipy plyfile termcolor timm gxx=11.1.0 lightning -c conda-forge -y
conda install pytorch-cluster pytorch-scatter pytorch-sparse -c pyg -y
pip install -r requirements.txt
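As a quick sanity check (my suggestion, not part of the original steps), confirm that this torch build actually sees CUDA before compiling the extensions below:

```bash
# should print "2.1.0 True" on a machine with a working CUDA 11.8 setup
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```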

When you run pip install -r requirements.txt, this error occurs (on my server):

Usage: pip [options]
ERROR: Invalid requirement: flash-attn --no-build-isolation
pip: error: no such option: --no-build-isolation
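This happens because pip treats each line of requirements.txt as a requirement specifier, and --no-build-isolation is a pip command-line flag, not part of a requirement, so it has to be run as its own command:

```bash
# install flash-attn as a standalone step so the flag is passed to pip itself
pip install flash-attn --no-build-isolation
```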

Comment out the flash-attn --no-build-isolation line in requirements.txt and continue running pip install -r requirements.txt. The next error occurs:

ERROR: Could not find a version that satisfies the requirement xformers==0.0.22.post7+cu118 (from versions: 0.0.1, 0.0.2, 0.0.3, 0.0.4, 0.0.5, 0.0.6, 0.0.7, 0.0.8, 0.0.9, 0.0.10, 0.0.11, 0.0.12, 0.0.13, 0.0.16rc424, 0.0.16rc425, 0.0.16, 0.0.17rc481, 0.0.17rc482, 0.0.17, 0.0.18, 0.0.19, 0.0.20, 0.0.21, 0.0.22, 0.0.22.post7, 0.0.23, 0.0.23.post1, 0.0.24, 0.0.25, 0.0.25.post1, 0.0.26.post1, 0.0.27, 0.0.27.post1, 0.0.27.post2, 0.0.28, 0.0.28.post1, 0.0.28.post2, 0.0.28.post3, 0.0.29, 0.0.29.post1, 0.0.29.post2, 0.0.29.post3, 0.0.30.dev989, 0.0.30.dev1002)
Could not fetch URL https://pypi.ngc.nvidia.com/pip/: There was a problem confirming the ssl certificate: HTTPSConnectionPool(host='pypi.ngc.nvidia.com', port=443): Max retries exceeded with url: /pip/ (Caused by SSLError(SSLEOFError(8, '[SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1007)'))) - skipping
ERROR: No matching distribution found for xformers==0.0.22.post7+cu118

If you have trouble connecting to the index to download xformers, you can comment it out in requirements.txt, download the wheel (0.0.22.post7+cu118, cp310) from https://download.pytorch.org/whl/xformers/, and pip install the .whl offline.
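A minimal sketch of the offline step; the platform tag in the filename is a placeholder, so use the exact name of the wheel you downloaded from the index:

```bash
# install the locally downloaded wheel instead of resolving xformers from an index
pip install ./xformers-0.0.22.post7+cu118-cp310-<platform-tag>.whl
```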

After installing xformers, rerun pip install -r requirements.txt; it then spends a long time on the torchsparse installation.

Collecting git+https://github.com/mit-han-lab/torchsparse.git (from -r requirements.txt (line 24))
Cloning https://github.com/mit-han-lab/torchsparse.git to /tmp/pip-req-build-fl0t81py
Running command git clone --filter=blob:none --quiet https://github.com/mit-han-lab/torchsparse.git /tmp/pip-req-build-fl0t81py
Resolved https://github.com/mit-han-lab/torchsparse.git to commit 6d24aa581c2b598210fdaa2abdab425f3c561336
Preparing metadata (setup.py) ... -

An error occurs after pip clones https://github.com/mit-han-lab/torchsparse.git and runs its setup.py build:

torchsparse/torchsparse/backend/others/query_cpu.cpp:6:10: fatal error: google/dense_hash_map: No such file or directory
6 | #include <google/dense_hash_map>
| ^~~~~~~~~~~~~~~~~~~~~~~
compilation terminated.
ninja: build stopped: subcommand failed.

The solution is in https://github.com/facebookresearch/SparseConvNet/issues/96: install google-sparsehash from the bioconda channel and add its headers to the compiler's include search path. (Note: g++ finds headers via CPLUS_INCLUDE_PATH or CPATH, not PATH, so the include directory must be exported there.) The detailed steps are sketched below.
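A sketch of the fix, assuming the texgen env is active so $CONDA_PREFIX points at it:

```bash
# install the sparsehash headers (google/dense_hash_map etc.) into the conda env
conda install google-sparsehash -c bioconda
# g++ searches CPLUS_INCLUDE_PATH, not PATH, for header files
export CPLUS_INCLUDE_PATH=$CONDA_PREFIX/include:$CPLUS_INCLUDE_PATH
# retry the torchsparse build
pip install git+https://github.com/mit-han-lab/torchsparse.git
```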

That solved it; the other required packages and modules in requirements.txt then installed fine.


xyb1314 commented Feb 25, 2025

Download the flash_attn wheel and pip install the .whl offline; it saves the compile time of pip install flash_attn --no-build-isolation and is easier!
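For reference, a sketch of the offline route; the version and tags in the filename are placeholders, so pick the prebuilt wheel matching torch 2.1 / CUDA 11.8 / Python 3.10 from the flash-attention releases page (https://github.com/Dao-AILab/flash-attention/releases):

```bash
# <version> and <abi-tag> are placeholders -- copy the exact filename from the releases page
pip install ./flash_attn-<version>+cu118torch2.1<abi-tag>-cp310-cp310-linux_x86_64.whl
```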


xyb1314 commented Feb 25, 2025

Also, downgrade huggingface-hub to 0.25.2; with that it can download the model from Hugging Face directly (if huggingface.co is reachable).
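The downgrade itself is just:

```bash
# pin huggingface-hub to 0.25.2 so the model downloads directly from the Hub
pip install huggingface-hub==0.25.2
```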
