Conformers
1.0.0
This is an informal implementation of conformal language modeling. I found the paper interesting and wanted to play with it. It is still at a very early stage - currently the only rigorous statistical guarantee is that there are bugs and misunderstandings. Please excuse the current state of the code - I promise I will clean it up!
No PyPI package yet. To install, clone the repository and run

```shell
pip install poetry
poetry install
```

The Python API is not yet set in stone, but the intent is to make it easy to use different admission, group confidence, and rejection functions. There could be some very interesting combinations with recent CFG language-model papers. Below is an example using GPT2.
```python
from conformer import Calibrator, Sampler, Components
import torch

x = [
    "What is the capital of France?",
    "Which prime-minster of the UK was the biggest nob?",
]

from transformers import GPT2LMHeadModel, GPT2Tokenizer

model_name = "gpt2"
model = GPT2LMHeadModel.from_pretrained(model_name).cuda()
tokenizer = GPT2Tokenizer.from_pretrained(model_name)
tokenizer.pad_token_id = tokenizer.eos_token_id

calibrator = Calibrator(
    model=model,
    tokenizer=tokenizer,
    calibration_prompts=x,
)
calibrator.set_admission_function(Components.admission.debug)
calibrator.set_group_confidence_function(Components.group_confidence.debug, torch.tensor([0.1, 0.5, 1]))
calibrator.add_rejection_function(Components.rejection.debug, torch.tensor([0.1, 0.5, 1]))
calibrator.set_FWER(Components.FWER.debug)

lambdaz = calibrator.search()

sampler = Sampler.from_calibrator(calibrator)
sampler.sample_with_rejection("What is the capital of France?")
```

This uses some of the built-in admission/group-confidence/FWER/rejection functions. You can also use your own functions, e.g.:
```python
calibrator.set_group_confidence_function(lambda x: x > 0.5, torch.tensor([0.1, 0.5, 1]))
```
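To make the roles of these components concrete, here is a minimal, self-contained sketch of a sample-with-rejection loop in plain Python. All names here (`sample_with_rejection`, the stubbed scoring functions, the threshold values) are illustrative assumptions for exposition, not the library's actual internals: candidates failing the rejection check are discarded individually, and sampling stops once the set-level group confidence clears its threshold.

```python
from itertools import cycle

def sample_with_rejection(generate, rejection, group_confidence,
                          lambda_reject, lambda_conf, k_max=20):
    """Grow an output set until its group confidence clears the
    confidence threshold, skipping individually rejected samples."""
    output_set = []
    for _ in range(k_max):
        y = generate()
        # Rejection function: discard low-quality candidates outright.
        if rejection(y) > lambda_reject:
            continue
        output_set.append(y)
        # Group confidence: stop once the set as a whole is confident enough.
        if group_confidence(output_set) >= lambda_conf:
            break
    return output_set

# Toy usage with stubbed components: "generation" draws (text, score)
# pairs from a fixed list instead of an actual language model.
candidates = cycle([("Paris", 0.9), ("Lyon", 0.2), ("Paris.", 0.8)])
gen = lambda: next(candidates)
rejected = lambda y: 1.0 - y[1]          # reject low-scoring samples
conf = lambda s: sum(y[1] for y in s)    # set confidence = summed scores

result = sample_with_rejection(gen, rejected, conf,
                               lambda_reject=0.5, lambda_conf=1.5)
# "Lyon" is rejected; the two "Paris" variants are kept.
```

In the real library the thresholds would come from `calibrator.search()` rather than being hand-picked, which is what gives the procedure its statistical guarantee.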