cannot import 'AutoModelForSequenceClassification' from 'transformers'


Posted: 2021-06-28 18:17:30

Question:

The code is:

from transformers import AutoTokenizer, AutoModelForSequenceClassification, pipeline

t = AutoTokenizer.from_pretrained('/some/directory')
m = AutoModelForSequenceClassification.from_pretrained('/some/directory')
c2 = pipeline(task = 'sentiment-analysis', model=m, tokenizer=t)

The error is:

cannot import 'AutoModelForSequenceClassification' from 'transformers'
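This `ImportError` usually means the installed `transformers` release is missing or too old to ship `AutoModelForSequenceClassification`. A minimal sketch for checking what is installed, using only the standard library; `installed_version` is a helper name of my own, not part of `transformers`:

```python
from importlib import metadata

def installed_version(package):
    """Return the installed version of *package*, or None if it is absent."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return None

# A comment below reports that 4.3.3 works, so if this prints None or an
# older version, upgrading should fix the import:
#   pip install --upgrade transformers
print(installed_version("transformers"))
```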

Comments:

What is your transformers version? It works for me; I have 4.3.3 installed.

Answer 1:

from transformers import AutoModelForSequenceClassification, BertForSequenceClassification
from transformers import (XLMRobertaConfig, XLMRobertaTokenizer, TFXLMRobertaModel)
from transformers import AutoTokenizer, AutoConfig, TFAutoModel

# Registry mapping a model type to its (config, model, tokenizer) classes
# and a pretrained checkpoint name.
PRETRAINED_MODEL_TYPES = {
    'xlmroberta': (AutoConfig, AutoModelForSequenceClassification, AutoTokenizer, 'akhooli/xlm-r-large-arabic-toxic')
}

# model = AutoModelForSequenceClassification.from_pretrained("akhooli/xlm-r-large-arabic-toxic")

config_class, model_class, tokenizer_class, model_name = PRETRAINED_MODEL_TYPES['xlmroberta']

# Download vocabulary from huggingface.co and cache it.
tokenizer = AutoTokenizer.from_pretrained(model_name, use_fast=False)  # use the slow (Python) tokenizer

tokenizer
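The registry pattern in the answer above (storing classes, not instances, in a dict keyed by model type) can be sketched without `transformers` installed; `DummyConfig`, `DummyModel`, and `DummyTokenizer` below are hypothetical stand-ins for `AutoConfig`, `AutoModelForSequenceClassification`, and `AutoTokenizer`:

```python
# Hypothetical stand-in classes for the real transformers Auto* classes.
class DummyConfig: pass
class DummyModel: pass
class DummyTokenizer: pass

# Map each model type to its classes plus a checkpoint name, then dispatch
# by key exactly as the answer does.
PRETRAINED_MODEL_TYPES = {
    'xlmroberta': (DummyConfig, DummyModel, DummyTokenizer,
                   'akhooli/xlm-r-large-arabic-toxic'),
}

config_class, model_class, tokenizer_class, model_name = \
    PRETRAINED_MODEL_TYPES['xlmroberta']

print(model_name)
```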

Comments:

The above covers "cannot import 'AutoModelForSequenceClassification' from 'transformers'". If it did not solve your problem, see the following related questions:

ImportError: cannot import name 'AutoModelWithLMHead' from 'transformers'

transformers: error importing the package. "ImportError: cannot import name 'SAVE_STATE_WARNING' from 'torch.optim.lr_scheduler'"

Cannot save a transformers model

Cannot install tensorflow for the Hugging Face transformers library

Make the initial layers of a transformers BertForSequenceClassification model untrainable in torch

Cannot serialize an Apache Spark transformer in MLeap