huggingface | how to get the number of parameters of a model
Posted by CSU迦叶
The main logic is in the how_big(model) function below.
# Investigate the parameter counts of PLBART and CodeGPT
from transformers import PLBartModel, GPT2Model

def how_big(model):
    # Count every parameter, and separately those with requires_grad=True
    total_num = sum(p.numel() for p in model.parameters())
    trainable_num = sum(p.numel() for p in model.parameters() if p.requires_grad)
    print("Total: {}, Trainable: {}".format(total_num, trainable_num))

def main():
    bart = PLBartModel.from_pretrained("uclanlp/plbart-java-cs")
    how_big(bart)
    # Load CodeGPT from a local checkpoint: pass the folder containing pytorch.bin
    codegpt = GPT2Model.from_pretrained("<folder containing pytorch.bin>")
    how_big(codegpt)

if __name__ == "__main__":
    main()
The output is:
Total: 139220736, Trainable: 139220736
Total: 124442112, Trainable: 124442112
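Since the counting logic relies only on PyTorch's Module.parameters() and Tensor.numel(), it can be sanity-checked on a tiny stand-in model without downloading any pretrained weights. The sketch below uses a toy nn.Linear module (an illustrative assumption, not from the original post) and also shows how freezing a tensor changes the trainable count:

```python
import torch.nn as nn

def how_big(model):
    # Same counting logic as above: numel() gives the element count per tensor
    total_num = sum(p.numel() for p in model.parameters())
    trainable_num = sum(p.numel() for p in model.parameters() if p.requires_grad)
    return total_num, trainable_num

# Toy model: Linear(10, 4) has 10*4 weights + 4 biases = 44 parameters
toy = nn.Linear(10, 4)
print(how_big(toy))  # (44, 44)

# Freezing a tensor removes it from the trainable count only
toy.bias.requires_grad = False
print(how_big(toy))  # (44, 40)
```

Recent versions of transformers also expose model.num_parameters(only_trainable=...) on PreTrainedModel, which computes essentially the same sums without a hand-written helper.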