nanogptrs
1.0.0
This is a Rust implementation of the nanoGPT model from Andrej Karpathy's YouTube video: https://www.youtube.com/watch?v=kCc8FmEB1nY
With some help from https://github.com/laurentmazare/tch-rs/blob/main/examples/min-gpt/main.rs and https://github.com/karpathy/nanoGPT/blob/master/model.py.
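In the video, the small model is trained on the tiny Shakespeare text with a character-level tokenizer: every distinct character in the corpus gets an integer id, and encoding/decoding are simple table lookups. Below is a minimal sketch of that idea in plain Rust; the toy corpus and variable names are illustrative only, and this crate's actual tokenizer code may differ.

```rust
use std::collections::BTreeMap;

fn main() {
    // Toy corpus; the real one is the Shakespeare text fetched by ./data/download.sh.
    let corpus = "First Citizen:\nBefore we proceed any further, hear me speak.";

    // The vocabulary is simply every distinct character, in sorted order.
    let mut chars: Vec<char> = corpus.chars().collect();
    chars.sort();
    chars.dedup();

    // char -> id and id -> char lookup tables.
    let stoi: BTreeMap<char, usize> =
        chars.iter().enumerate().map(|(i, &c)| (c, i)).collect();
    let itos: Vec<char> = chars;

    // Encoding maps each character to its id; decoding reverses it.
    let encode = |s: &str| -> Vec<usize> { s.chars().map(|c| stoi[&c]).collect() };
    let decode = |ids: &[usize]| -> String { ids.iter().map(|&i| itos[i]).collect() };

    let ids = encode("hear me");
    println!("{:?}", ids);        // token ids
    println!("{}", decode(&ids)); // "hear me"
}
```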
Create the micromamba (or conda) environment:
micromamba env create -f environment.yml

Activate the environment:
micromamba activate nanogptrs
export LD_LIBRARY_PATH=/opt/conda/lib/python3.10/site-packages/torch/lib/:$LD_LIBRARY_PATH

Download the dataset and the pretrained GPT-2 weights:
./data/download.sh
./models/download.sh gpt2
Generate text from the pretrained GPT-2 weights:
cargo run --release -- --device=cuda --restore-from models/gpt2/model.safetensors generate --max-len 32 --prompt " Once upon a time " gpt2

Train the nano-GPT model:
cargo run --release -- --device=cuda train --n-epochs=3 --final-checkpoint-path=models/nanogptrs.safetensors nano-gpt

Training should eventually (~5 h on my Titan XP) produce something like this:
DUCHESS OF YORK:
Here comes already.
EXTOLY:
O, by the means of your crown?
KING HENRY VI:
Brother, that my lord, change thou givest queen.
KING RICHARD II:
Mine honour, because I am advertised
The queen our is not your voice. Would thy sight
Next Rome, among, insible express to dictliffe:
For ere for goings
Abova drunking redel her food pain soul to every it.
QUEEN MARGARET:
I took! O, if you so, good and the Montague of slave,
That he's breathing which holy a holy brats.
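The sample above comes from autoregressive sampling: at each step the model turns its output logits into a probability distribution and draws the next token from it. Below is a minimal sketch of that sampling step in plain Rust using the rand crate; the function name, the temperature parameter, and the toy logits are illustrative assumptions, not part of this crate's API.

```rust
use rand::distributions::{Distribution, WeightedIndex};
use rand::Rng;

/// Turn raw logits into a softmax distribution (scaled by `temperature`)
/// and draw one token id from it. Illustrative helper, not this crate's API.
fn sample_next_token(logits: &[f32], temperature: f32, rng: &mut impl Rng) -> usize {
    // Subtract the max logit before exponentiating for numerical stability.
    let max = logits.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let weights: Vec<f32> = logits
        .iter()
        .map(|&l| ((l - max) / temperature).exp())
        .collect();
    // Draw one index with probability proportional to its softmax weight.
    WeightedIndex::new(&weights).unwrap().sample(rng)
}

fn main() {
    let mut rng = rand::thread_rng();
    // Toy logits over a 4-token vocabulary.
    let logits = [2.0_f32, 0.5, -1.0, 0.1];
    let next = sample_next_token(&logits, 0.8, &mut rng);
    println!("sampled token id: {next}");
}
```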