How to resolve a clash between a torch tensor and `input`: "'Tensor' object is not callable"
Because of the `torch.tensor` line in my code, I get the error "'Tensor' object is not callable" when I add `input`. Does anyone know how I can fix this?
import torch
from torch.nn import functional as F
from transformers import GPT2Tokenizer,GPT2LMHeadModel
tokenizer = GPT2Tokenizer.from_pretrained('gpt2')
model = GPT2LMHeadModel.from_pretrained('gpt2')
text0 = "In order to"
text = tokenizer.encode("In order to")
input,past = torch.tensor([text]),None
logits,past = model(input,past = past)
logits = logits[0,-1]
probabilities = torch.nn.functional.softmax(logits)
best_logits,best_indices = logits.topk(5)
best_words = [tokenizer.decode([idx.item()]) for idx in best_indices]
text.append(best_indices[0].item())
best_probabilities = probabilities[best_indices].tolist()
for i in range(5):
    f = ('Generated {}: {}'.format(i, best_words[i]))
    print(f)
option = input("Pick a Option:")
z = text0.append(option)
print(z)
Error stack trace:
TypeError Traceback (most recent call last)
<ipython-input-2-82e8d88e81c1> in <module>()
25
26
---> 27 option = input("Pick a Option:")
28 z = text0.append(option)
29 print(z)
TypeError: 'Tensor' object is not callable
Solution
The problem is that you have defined a variable named `input`, which then shadows the built-in `input` function. Simply use a different name for the variable and the code will run as expected.
Also, Python strings have no `append` method.
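As a quick illustration (the `option` value here is a hypothetical user choice), concatenation with `+` builds a new string, which is the string counterpart of what `append` does for lists:

```python
text0 = "In order to"
option = "learn"   # hypothetical user choice

# str objects are immutable and have no append(); build a new string instead
z = text0 + ' ' + option
print(z)           # In order to learn

assert not hasattr(text0, 'append')
```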
import torch
from torch.nn import functional as F
from transformers import GPT2Tokenizer,GPT2LMHeadModel
tokenizer = GPT2Tokenizer.from_pretrained('gpt2')
model = GPT2LMHeadModel.from_pretrained('gpt2')
text0 = "In order to"
text = tokenizer.encode("In order to")
myinput,past = torch.tensor([text]),None
logits,past = model(myinput,past = past)  # transformers v2-style call; newer versions return a ModelOutput and rename past to past_key_values
logits = logits[0,-1]
probabilities = torch.nn.functional.softmax(logits, dim=-1)  # pass dim explicitly to avoid the deprecation warning
best_logits,best_indices = logits.topk(5)
best_words = [tokenizer.decode([idx.item()]) for idx in best_indices]
text.append(best_indices[0].item())
best_probabilities = probabilities[best_indices].tolist()
for i in range(5):
    f = ('Generated {}: {}'.format(i, best_words[i]))
    print(f)
option = input("Pick a Option:")
z = text0 + ' ' + option
print(z)
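For readers who want to see what `softmax` and `topk` compute in the code above, here is a plain-Python sketch of the same two operations (the logits values are made up for illustration; real code should use the torch versions):

```python
import math

def softmax(logits):
    # subtract the max for numerical stability, then normalize exponentials
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def topk(values, k):
    # return (values, indices) of the k largest entries, like Tensor.topk
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)[:k]
    return [values[i] for i in order], order

logits = [2.0, 1.0, 0.1, 3.0, -1.0, 0.5]
probs = softmax(logits)           # probabilities summing to 1
best_vals, best_idx = topk(logits, 5)
print(best_idx)                   # indices of the 5 highest logits: [3, 0, 1, 5, 2]
```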