BERT-keras
Status: Archived (code is provided as-is, no updates expected)
Keras implementation of Google BERT (Bidirectional Encoder Representations from Transformers) and OpenAI's Transformer LM, capable of loading pretrained models with a fine-tuning API.
Update: TPU support for both inference and training is now available thanks to @HighCWu; see this Colab notebook.
# this is pseudocode; see tutorial.ipynb or the colab notebook for an actual working example
text_encoder = MyTextEncoder(**my_text_encoder_params)  # create a text encoder (SentencePiece and OpenAI's BPE are included)
lm_generator = lm_generator(text_encoder, **lm_generator_params)  # essentially your data reader (single- and double-sentence readers with masking and is_next labels are included)
task_meta_datas = [lm_task, classification_task, pos_task]  # your tasks (the lm_generator must generate the labels for these tasks too)
encoder_model = create_transformer(**encoder_params)  # or simply load_openai(), or write your own encoder (a BiLSTM, for example; see the sketch below)
trained_model = train_model(encoder_model, task_meta_datas, lm_generator, **training_params)  # does both pretraining and finetuning
trained_model.save_weights('my_awesome_model')  # save it
model = load_model('my_awesome_model', encoder_model)  # load it later and use it!
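As the pseudocode's comment notes, the encoder is pluggable: any Keras model that maps token ids to per-token representations could stand in for create_transformer(). Below is a minimal sketch of such a custom BiLSTM encoder. The function name, parameter names, and the (batch, seq_len) integer-input / per-token-output interface are assumptions for illustration, not the repo's documented API.

# A minimal sketch of a custom BiLSTM encoder; the input/output shapes are
# assumptions about what train_model expects, not the repo's documented interface.
from tensorflow import keras
from tensorflow.keras import layers

def create_bilstm_encoder(vocab_size=30000, max_len=512,
                          embed_dim=256, lstm_units=256):
    # Fixed-length sequence of token ids (assumed input convention).
    token_ids = keras.Input(shape=(max_len,), dtype='int32', name='token_ids')
    x = layers.Embedding(vocab_size, embed_dim, name='token_embedding')(token_ids)
    # A bidirectional LSTM with return_sequences=True emits one hidden state
    # per token, analogous to the per-token outputs of a transformer encoder.
    x = layers.Bidirectional(layers.LSTM(lstm_units, return_sequences=True))(x)
    return keras.Model(inputs=token_ids, outputs=x, name='bilstm_encoder')

encoder_model = create_bilstm_encoder()  # hypothetical drop-in for create_transformer() above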