Post by sam*_*sam

PyTorch expects each tensor to be of equal size

When I run this line: embedding_matrix = torch.stack(embeddings)

I get this error:

RuntimeError: stack expects each tensor to be equal size, but got [7, 768] at entry 0 and [8, 768] at entry 1
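For context, the failure can be reproduced with two tensors of the shapes mentioned in the message (a minimal, standalone sketch, not my original code):

    import torch

    # Two "sentences" with a different number of token embeddings (7 vs. 8),
    # matching the shapes reported in the error.
    a = torch.randn(7, 768)
    b = torch.randn(8, 768)

    try:
        torch.stack([a, b])  # stack requires every tensor to have the same shape
    except RuntimeError as e:
        print(e)  # stack expects each tensor to be equal size, ...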

I am trying to build the embeddings with BERT in the following way:

    split_sent = sent.split()
    tokens_embedding = []
    j = 0
    for full_token in split_sent:
        curr_token = ''
        x = 0
        for i, _ in enumerate(tokenized_sent[1:]):
            token = tokenized_sent[i + j]
            piece_embedding = bert_embedding[i + j]
            if token == full_token and curr_token == '':
                tokens_embedding.append(piece_embedding)
                j += 1
                break
    # one tensor of shape [num_tokens, 768] per sentence; num_tokens varies with sentence length
    sent_embedding = torch.stack(tokens_embedding)
    embeddings.append(sent_embedding)
embedding_matrix = torch.stack(embeddings) …
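Since every sentence yields a different number of token embeddings, the tensors in embeddings cannot be stacked directly. One common workaround (a sketch, assuming zero-padding the shorter sentences is acceptable downstream) is torch.nn.utils.rnn.pad_sequence, which pads to the longest sentence instead of requiring equal sizes:

    import torch
    from torch.nn.utils.rnn import pad_sequence

    # Stand-in for the per-sentence embeddings built above:
    # each entry has shape [num_tokens, 768] and num_tokens varies.
    embeddings = [torch.randn(7, 768), torch.randn(8, 768)]

    # Zero-pad along the token dimension so everything fits in one tensor.
    embedding_matrix = pad_sequence(embeddings, batch_first=True)
    print(embedding_matrix.shape)  # torch.Size([2, 8, 768])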

python pytorch tensor bert-language-model

Score: 6 · Answers: 1 · Views: 30k
