I previously created a requirements.txt file on another computer with

conda list -e > requirements.txt

requirements.txt:
https://github.com/penguinsAreFunny/bugFinder-machineLearning/blob/master/requirements.txt

Now, running

conda create --name ml --file ./requirements.txt

fails with:
PackagesNotFoundError: The following packages are not available from current channels:

  - protobuf==3.19.1=pypi_0
  - tensorboard-data-server==0.6.1=pypi_0
  - pygments==2.10.0=pypi_0
  - scikit-learn==1.0.1=pypi_0
  - tensorflow-estimator==2.4.0=pypi_0
  - flake8==4.0.1=pypi_0
  - nest-asyncio==1.5.1=pypi_0
  [...]

Current channels:

  - https://conda.anaconda.org/conda-forge/win-64
  - https://conda.anaconda.org/conda-forge/noarch
  - https://repo.anaconda.com/pkgs/main/win-64
  - https://repo.anaconda.com/pkgs/main/noarch
  - https://repo.anaconda.com/pkgs/r/win-64
  - https://repo.anaconda.com/pkgs/r/noarch
  - https://repo.anaconda.com/pkgs/msys2/win-64
  - https://repo.anaconda.com/pkgs/msys2/noarch
  - https://conda.anaconda.org/pickle/win-64
  - https://conda.anaconda.org/pickle/noarch
  - https://conda.anaconda.org/nltk/win-64
  - https://conda.anaconda.org/nltk/noarch
Why can't conda find the packages in the channels? I thought the missing packages should be in conda-forge, shouldn't they?

conda 4.11.0
These packages may well be on Conda Forge as suggested, but the build string "pypi_0" indicates that they were installed from PyPI in the previous environment. The conda list -e command captures this information, but the conda create command cannot process it.
The quickest fix is probably to edit the file to remove the build string specification on those packages. That is, something like:
## remove all PyPI references
sed -e 's/=pypi_0//' requirements.txt > reqs.nopip.txt
## try creating only from Conda packages
conda create -n ml --file reqs.nopip.txt
Conda will then try to interpret those PyPI package specifications as Conda packages. However, this is not always reliable, since some packages go by different names in the two repositories.
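As a quick sanity check of that substitution, here is the sed edit applied to a few specs taken from the file above; no Conda is needed to see the text rewrite itself (package names are from the OP's file, the illustration is mine):

```shell
# Strip the "=pypi_0" build string; Conda-built specs are left untouched.
printf '%s\n' \
  'protobuf==3.19.1=pypi_0' \
  'scikit-learn==1.0.1=pypi_0' \
  'python=3.8.12=h7840368_2_cpython' \
| sed -e 's/=pypi_0//'
# -> protobuf==3.19.1
# -> scikit-learn==1.0.1
# -> python=3.8.12=h7840368_2_cpython
```

Note that the PyPI-style double equals survives; Conda accepts name==version as an exact-version match, which is why the rewritten specs can still be fed to conda create.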
或者,序列化为 YAML 可以处理捕获和重新安装 Pip 安装的包。因此,如果您仍然使用旧环境,请考虑使用:
conda env export > environment.yaml
which can be recreated (on the same platform) with:
conda env create -n ml -f environment.yaml
Converting requirements.txt to YAML

If the environment no longer exists, or the requirements.txt was provided by another user, another option is to convert the file to YAML format. Here is an AWK script for doing that:
list_export_to_yaml.awk
#!/usr/bin/env awk -f
#' Author: Mervin Fansler
#' GitHub: @mfansler
#' License: MIT
#'
#' Basic usage
#' $ conda list --export | awk -f list_export_to_yaml.awk
#'
#' Omitting builds with 'no_builds'
#' $ conda list --export | awk -v no_builds=1 -f list_export_to_yaml.awk
#'
#' Specifying channels with 'channels'
#' $ conda list --export | awk -v channels="conda-forge,defaults" -f list_export_to_yaml.awk
BEGIN {
FS="=";
if (channels) split(channels, channels_arr, ",");
else channels_arr[0]="defaults";
}
{
# skip header
if ($1 ~ /^#/) next;
if ($3 ~ /pypi/) { # pypi packages
pip=1;
pypi[i++]=" - "$1"=="$2" ";
} else { # conda packages
if ($1 ~ /pip/) pip=1;
else { # should we keep builds?
if (no_builds) conda[j++]=" - "$1"="$2" ";
else conda[j++]=" - "$1"="$2"="$3" ";
}
}
}
END {
# emit channel info
print "channels: ";
for (k in channels_arr) print " - "channels_arr[k]" ";
# emit conda pkg info
print "dependencies: ";
for (j in conda) print conda[j];
# emit PyPI pkg info
if (pip) print " - pip ";
if (length(pypi) > 0) {
print " - pip: ";
for (i in pypi) print pypi[i];
}
}
For the OP's example, we get:
$ wget -O requirements.txt 'https://github.com/penguinsAreFunny/bugFinder-machineLearning/raw/master/requirements.txt'
$ awk -f list_export_to_yaml.awk requirements.txt > bugfinder-ml.yaml
which has the following contents:
channels:
- defaults
dependencies:
- brotlipy=0.7.0=py38h294d835_1003
- ca-certificates=2021.10.8=h5b45459_0
- cffi=1.15.0=py38hd8c33c5_0
- chardet=4.0.0=py38haa244fe_2
- cryptography=35.0.0=py38hb7941b4_2
- future=0.18.2=py38haa244fe_4
- h2o=3.34.0.3=py38_0
- openjdk=11.0.9.1=h57928b3_1
- openssl=1.1.1l=h8ffe710_0
- pycparser=2.20=pyh9f0ad1d_2
- pyopenssl=21.0.0=pyhd8ed1ab_0
- pysocks=1.7.1=py38haa244fe_4
- python=3.8.12=h7840368_2_cpython
- python_abi=3.8=2_cp38
- requests=2.26.0=pyhd8ed1ab_0
- setuptools=58.5.3=py38haa244fe_0
- sqlite=3.36.0=h8ffe710_2
- tabulate=0.8.9=pyhd8ed1ab_0
- ucrt=10.0.20348.0=h57928b3_0
- urllib3=1.26.7=pyhd8ed1ab_0
- vc=14.2=hb210afc_5
- vs2013_runtime=12.0.21005=1
- vs2015_runtime=14.29.30037=h902a5da_5
- wheel=0.37.0=pyhd8ed1ab_1
- win_inet_pton=1.1.0=py38haa244fe_3
- pip
- pip:
- absl-py==0.15.0
- appdirs==1.4.4
- astroid==2.7.3
- astunparse==1.6.3
- autopep8==1.6.0
- backcall==0.2.0
- backports-entry-points-selectable==1.1.0
- black==21.4b0
- cachetools==4.2.4
- certifi==2021.10.8
- cfgv==3.3.1
- charset-normalizer==2.0.7
- click==8.0.3
- cycler==0.11.0
- deap==1.3.1
- debugpy==1.5.1
- decorator==5.1.0
- dill==0.3.4
- distlib==0.3.3
- entrypoints==0.3
- filelock==3.3.2
- flake8==4.0.1
- flatbuffers==1.12
- gast==0.3.3
- google-auth==2.3.3
- google-auth-oauthlib==0.4.6
- google-pasta==0.2.0
- grpcio==1.32.0
- h5py==2.10.0
- identify==2.3.3
- idna==3.3
- importlib-resources==5.4.0
- ipykernel==6.5.0
- ipython==7.29.0
- isort==5.10.0
- jedi==0.18.0
- jinja2==3.0.2
- joblib==1.1.0
- jupyter-client==7.0.6
- jupyter-core==4.9.1
- keras-preprocessing==1.1.2
- kiwisolver==1.3.2
- markdown==3.3.4
- markupsafe==2.0.1
- matplotlib==3.4.3
- matplotlib-inline==0.1.3
- mypy==0.910
- mypy-extensions==0.4.3
- nest-asyncio==1.5.1
- nodeenv==1.6.0
- numpy==1.19.5
- oauthlib==3.1.1
- opt-einsum==3.3.0
- pandas==1.3.4
- parso==0.8.2
- pathspec==0.9.0
- pickleshare==0.7.5
- pillow==8.4.0
- platformdirs==2.4.0
- pre-commit==2.15.0
- prompt-toolkit==3.0.22
- protobuf==3.19.1
- pyasn1==0.4.8
- pyasn1-modules==0.2.8
- pycodestyle==2.8.0
- pyflakes==2.4.0
- pygments==2.10.0
- pylint==2.10.2
- pyparsing==3.0.4
- python-dateutil==2.8.2
- pytz==2021.3
- pywin32==302
- pyyaml==6.0
- pyzmq==22.3.0
- regex==2021.11.2
- requests-oauthlib==1.3.0
- rsa==4.7.2
- scikit-learn==1.0.1
- scipy==1.7.1
- six==1.15.0
- stopit==1.1.2
- sweetviz==2.1.3
- tensorboard==2.7.0
- tensorboard-data-server==0.6.1
- tensorboard-plugin-wit==1.8.0
- tensorflow==2.4.4
- tensorflow-estimator==2.4.0
- termcolor==1.1.0
- threadpoolctl==3.0.0
- tornado==6.1
- tpot==0.11.7
- tqdm==4.62.3
- traitlets==5.1.1
- typing-extensions==3.7.4.3
- update-checker==0.18.0
- virtualenv==20.10.0
- wcwidth==0.2.5
- werkzeug==2.0.2
- xgboost==1.5.0
- zipp==3.6.0
Note that since conda list --export does not capture channel information, the user must determine this themselves. By default, the script inserts defaults, but it also provides an argument (channels) for specifying additional channels in the YAML, in comma-separated format. For example,
awk -f list_export_to_yaml.awk -v channels='conda-forge,defaults' requirements.txt
would output
channels:
- conda-forge
- defaults
in the YAML.

There is also a no_builds argument for suppressing builds (i.e., versions only). For example,
awk -f list_export_to_yaml.awk -v no_builds=1 requirements.txt
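For illustration, with no_builds set the Conda entries keep only name and version (the pip section is unaffected), so the export above would instead begin something like:

```yaml
channels:
  - defaults
dependencies:
  - brotlipy=0.7.0
  - ca-certificates=2021.10.8
  - cffi=1.15.0
  # ... remaining Conda packages, versions only ...
```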