I am trying to save a tokenizer from Hugging Face so that I can later load it from a container that has no internet access.
BASE_MODEL = "distilbert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
tokenizer.save_vocabulary("./models/tokenizer/")
tokenizer2 = AutoTokenizer.from_pretrained("./models/tokenizer/")
However, the last line gives the error:
OSError: Can't load config for './models/tokenizer3/'. Make sure that:
- './models/tokenizer3/' is a correct model identifier listed on 'https://huggingface.co/models'
- or './models/tokenizer3/' is the correct path to a directory containing a config.json file
Transformers version: 3.1.0
Unfortunately, How to load the saved tokenizer from pretrained model in Pytorch did not help.
Thanks to @ashwin's answer below, I tried save_pretrained instead, and I get the following error:
OSError: Can't load config for './models/tokenizer/'. Make sure that:
- './models/tokenizer/' is a correct model identifier listed on 'https://huggingface.co/models'
- or './models/tokenizer/' is the correct path to a directory containing a config.json file
I tried renaming tokenizer_config.json to config.json, and then I get the error:
ValueError: Unrecognized model in ./models/tokenizer/. Should have a `model_type` key in its config.json, or contain one of the following strings in its name: retribert, t5, mobilebert, distilbert, albert, camembert, xlm-roberta, pegasus, marian, mbart, bart, reformer, longformer, roberta, flaubert, bert, openai-gpt, gpt2, transfo-xl, xlnet, xlm, ctrl, electra, encoder-decoder
save_vocabulary() only saves the tokenizer's vocabulary file (the list of BPE tokens).
To save the entire tokenizer, you should use save_pretrained() instead.
Thus, as follows:
BASE_MODEL = "distilbert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
tokenizer.save_pretrained("./models/tokenizer/")
tokenizer2 = DistilBertTokenizer.from_pretrained("./models/tokenizer/")
EDIT:
For some unknown reason,
tokenizer2 = AutoTokenizer.from_pretrained("./models/tokenizer/")
does not work, but
tokenizer2 = DistilBertTokenizer.from_pretrained("./models/tokenizer/")
does.
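The likely reason is that AutoTokenizer must first discover the model type before it can pick a tokenizer class, and in Transformers 3.1.0 it reads that from a config.json with a model_type key, which save_pretrained() for a tokenizer does not write; DistilBertTokenizer skips that lookup because the class is already known. The sketch below is an illustration of that resolution logic, not the actual transformers source, and the mapping and function names are hypothetical:

```python
import json
import os
import tempfile

# Hypothetical mapping for illustration: model_type -> tokenizer class name.
TOKENIZER_CLASSES = {"distilbert": "DistilBertTokenizer"}

def resolve_tokenizer_class(model_dir):
    """Mimic the Auto* lookup: require a config.json with a model_type key."""
    config_path = os.path.join(model_dir, "config.json")
    if not os.path.exists(config_path):
        # Analogue of: OSError: Can't load config for '...'
        raise OSError(f"Can't load config for '{model_dir}'.")
    with open(config_path) as f:
        model_type = json.load(f).get("model_type")
    if model_type not in TOKENIZER_CLASSES:
        # Analogue of: ValueError: Unrecognized model in ...
        raise ValueError(f"Unrecognized model in {model_dir}.")
    return TOKENIZER_CLASSES[model_type]

with tempfile.TemporaryDirectory() as d:
    # A directory holding only tokenizer files (no config.json) fails the lookup...
    try:
        resolve_tokenizer_class(d)
    except OSError as e:
        print(e)
    # ...while a config.json that carries model_type resolves cleanly.
    with open(os.path.join(d, "config.json"), "w") as f:
        json.dump({"model_type": "distilbert"}, f)
    print(resolve_tokenizer_class(d))
```

This also explains why renaming tokenizer_config.json to config.json in the question above moved the error from OSError to ValueError: the file was found, but it carries no model_type key.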