BERT keras
1.0.0
Status: Archived (code is provided as-is; no updates are expected)
A Keras implementation of Google BERT (Bidirectional Encoder Representations from Transformers) and OpenAI's Transformer LM, capable of loading pretrained models with a finetuning API.
Update: TPU support for inference and training (as in this Colab notebook), thanks to @highcwu.
# This is pseudocode; see tutorial.ipynb or the Colab notebook for an actual working example.
text_encoder = MyTextEncoder(**my_text_encoder_params)  # create a text encoder (SentencePiece and OpenAI's BPE are included)
lm_generator = lm_generator(text_encoder, **lm_generator_params)  # essentially your data reader (single-sentence and double-sentence readers with masking and is_next labels are included)
task_meta_datas = [lm_task, classification_task, pos_task]  # your tasks (the lm_generator must generate the labels for these tasks too)
encoder_model = create_transformer(**encoder_params)  # or simply load_openai(), or write your own encoder (a BiLSTM, for example)
trained_model = train_model(encoder_model, task_meta_datas, lm_generator, **training_params)  # does both pretraining and finetuning
trained_model.save_weights('my_awesome_model')  # save it
model = load_model('my_awesome_model', encoder_model)  # load it later and use it!
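The pseudocode above notes that the encoder can be swapped for any sequence model, a BiLSTM for example. A minimal sketch of such a custom encoder in plain Keras follows; the layer sizes and names here are illustrative assumptions, not part of this repo's API:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical sizes: vocabulary of 1000 tokens, sequences of length 32.
VOCAB_SIZE, SEQ_LEN, EMBED_DIM, HIDDEN = 1000, 32, 64, 128

# A custom encoder maps token ids to one contextual vector per position.
tokens = keras.Input(shape=(SEQ_LEN,), dtype="int32")
x = layers.Embedding(VOCAB_SIZE, EMBED_DIM)(tokens)
# return_sequences=True keeps per-token outputs so token-level task heads
# (e.g. LM or POS tagging) can attach on top of the encoder.
x = layers.Bidirectional(layers.LSTM(HIDDEN, return_sequences=True))(x)
encoder_model = keras.Model(tokens, x, name="bilstm_encoder")

# Sanity check: a batch of 2 sequences yields (2, 32, 256) contextual outputs
# (256 = forward HIDDEN + backward HIDDEN from the bidirectional wrapper).
out = encoder_model.predict(np.zeros((2, SEQ_LEN), dtype="int32"), verbose=0)
print(out.shape)
```

Any model with this input/output contract (token ids in, per-token vectors out) could play the role of `encoder_model` in the training loop sketched above.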