Post by Abh*_*k K

Huggingface Transformers question-answering confidence score

How do we get a confidence score for the answer from the Huggingface Transformers question-answering example code? I see that the pipeline does return a score, but can the code below also return a confidence score?

from transformers import AutoTokenizer, TFAutoModelForQuestionAnswering
import tensorflow as tf

tokenizer = AutoTokenizer.from_pretrained("bert-large-uncased-whole-word-masking-finetuned-squad")
model = TFAutoModelForQuestionAnswering.from_pretrained("bert-large-uncased-whole-word-masking-finetuned-squad")

text = r"""
Transformers (formerly known as pytorch-transformers and pytorch-pretrained-bert) provides general-purpose
architectures (BERT, GPT-2, RoBERTa, XLM, DistilBert, XLNet…) for Natural Language Understanding (NLU) and Natural
Language Generation (NLG) with over 32+ pretrained models in 100+ languages and deep interoperability between
TensorFlow 2.0 and PyTorch.
"""

questions = [
    "How many pretrained models are available in Transformers?",
    "What does Transformers provide?",
    "Transformers provides interoperability between which frameworks?",
]

for question in questions:
    inputs = …
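For context on what such a score would look like: the question-answering model returns raw start and end logits per token, and a confidence score can be derived by applying a softmax to each and multiplying the probabilities of the chosen start and end positions (this is essentially the quantity the pipeline reports). A minimal sketch of that computation, using dummy logit arrays in place of the real `outputs.start_logits` / `outputs.end_logits` so it runs without downloading a model:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array of logits."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Dummy per-token logits standing in for the model's
# outputs.start_logits[0] and outputs.end_logits[0].
start_logits = np.array([0.1, 6.0, 1.2, 0.3])
end_logits = np.array([0.2, 0.5, 7.0, 0.1])

start_probs = softmax(start_logits)
end_probs = softmax(end_logits)

# Most likely answer span: argmax of each distribution.
answer_start = int(np.argmax(start_probs))
answer_end = int(np.argmax(end_probs))

# Confidence: joint probability of the chosen start and end tokens.
confidence = float(start_probs[answer_start] * end_probs[answer_end])
print(answer_start, answer_end, round(confidence, 4))
```

In the question's loop, the same idea would apply to the logits returned for each `question`, with the argmax restricted to valid spans (start before end, within the context) as the pipeline does internally.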

python huggingface-transformers
