Dependency hell is a well-known problem in the machine learning ecosystem. Combining hardware (e.g. NVIDIA RTX GPUs) with major libraries such as PyTorch and NumPy can easily create all sorts of dependency issues. That is why setting up a system for maximum reproducibility is important.

I’ve previously used Poetry and conda for Python package management, but I sometimes found them hard and clunky to use. Recently I came across a fast-rising package manager called uv.
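For context, uv manages projects through a standard `pyproject.toml` plus a `uv.lock` lockfile, which is where the reproducibility benefit comes from: the lockfile pins exact versions (with hashes) across machines. A minimal sketch of what the project file might look like after running `uv add torch` — the project name and the exact version specifier here are placeholders; uv writes whatever version it resolves on your machine:

```toml
[project]
name = "my-ml-project"       # hypothetical project name
version = "0.1.0"
requires-python = ">=3.12"   # example; uv records your interpreter constraint
dependencies = [
    "torch>=2.7.1",          # lower bound written by `uv add torch` (version is illustrative)
]
```

Running `uv sync` on another machine then reproduces the same resolved environment from `uv.lock`.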

I had heard that uv is very fast, since it is written in Rust, so I wanted to test how fast it actually is compared to pip install.

uv: ~2m 35s, which is not bad at all

❯ uv add torch  
Resolved 31 packages in 381ms  
Prepared 25 packages in 2m 35s  
Installed 25 packages in 240ms  
+ filelock==3.18.0  
...
+ typing-extensions==4.14.1

pip: ~9m 34s. Not gonna lie, it took forever to install everything.

time pip install torch  
Collecting torch  
...
noglob pip install torch 65.50s user 23.14s system 15% cpu 9:34.48 total

Conclusion: uv is much faster and more efficient at installing packages. (Note the 15% CPU in the pip timing above: most of that wall time was likely spent waiting on downloads and sequential work, which is exactly where uv's parallel resolver and installer help.)

Check out uv:
https://github.com/astral-sh/uv/
https://docs.astral.sh/uv/