LLM_json_schema
1.0.0
LLM_json_schema can force the output of an LLM to follow a given JSON schema. The following types are available: String, Number, Boolean, Array, Object.
The output is guaranteed to have the correct format.
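To make the guarantee concrete, the check below is a minimal recursive validator for the five supported types. It is an illustrative sketch using only the standard library, not the tool's internal validation code:

```python
import json

def matches(value, schema):
    """Return True if a parsed JSON value conforms to a simple schema
    using the five supported types (illustrative sketch only)."""
    t = schema["type"]
    if t == "string":
        return isinstance(value, str)
    if t == "number":
        # bool is a subclass of int in Python, so exclude it explicitly
        return isinstance(value, (int, float)) and not isinstance(value, bool)
    if t == "boolean":
        return isinstance(value, bool)
    if t == "array":
        return isinstance(value, list) and all(
            matches(v, schema["items"]) for v in value)
    if t == "object":
        return isinstance(value, dict) and all(
            matches(value[k], s)
            for k, s in schema["properties"].items() if k in value)
    return False

schema = {"type": "object",
          "properties": {"country": {"type": "string"},
                         "capital": {"type": "string"}}}
assert matches(json.loads('{"country": "France", "capital": "Paris"}'), schema)
```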
python3 LLM_json_schema.py \
    --model models/Mistral-7B-Instruct-v0.1.gguf \
    --json-schema '{"type":"object", "properties":{"country":{"type":"string"}, "capital":{"type":"string"}}}' \
    --prompt "What is the capital of France?\n\n"

Output:

{"country": "France", "capital": "Paris"}

python3 LLM_json_schema.py \
    --model models/Mistral-7B-Instruct-v0.1.gguf \
    --json-schema '{"type":"array", "items":{"type":"number"}}' \
    --prompt "Count until 20.\n\n"

Output:

[1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20]

It works by adding a bias to the logits emitted by the LLM, so that only valid tokens can be selected.
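The logit-biasing idea can be sketched as follows: before sampling, the logits of all tokens that would make the output invalid are pushed to negative infinity, so they receive zero probability after softmax. This is a standalone illustration with hypothetical helper names, not the repository's actual implementation:

```python
import math

def mask_logits(logits, allowed_token_ids):
    """Set the logits of disallowed tokens to -inf so that, after
    softmax, only allowed tokens have non-zero probability.
    (Illustrative sketch; names are hypothetical.)"""
    return [l if i in allowed_token_ids else -math.inf
            for i, l in enumerate(logits)]

def softmax(xs):
    m = max(xs)                      # subtract max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

logits = [1.0, 2.0, 0.5, 3.0]        # raw scores from the model
masked = mask_logits(logits, allowed_token_ids={0, 2})
probs = softmax(masked)               # tokens 1 and 3 get probability 0.0
```

In practice the set of allowed tokens is recomputed at every decoding step from the JSON-schema state (e.g. "inside a string", "expecting a comma or closing brace").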
cd LLM_json_schema
pip3 install -r requirements.txt

Download an LLM model and convert it to the GGUF format.
Contoh:
mkdir models
cd models
git clone https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1
git clone https://github.com/ggerganov/llama.cpp.git
pip install -r llama.cpp/requirements.txt
python3 llama.cpp/convert.py Mistral-7B-Instruct-v0.1 \
    --outfile Mistral-7B-Instruct-v0.1.gguf \
    --outtype q8_0
cd ..

usage: LLM_json_schema.py [-h] --model-path MODEL_PATH --prompt PROMPT [--json-schema JSON_SCHEMA]
options:
-h, --help show this help message and exit
--model-path MODEL_PATH
Path to the LLM model in gguf format
--prompt PROMPT Input prompt
--json-schema JSON_SCHEMA
JSON schema to enforce
python3 LLM_json_schema.py --model models/Mistral-7B-Instruct-v0.1.gguf --json-schema '{"type":"object", "properties":{"country":{"type":"string"}, "capital":{"type":"string"}}}' --prompt "What is the capital of France?\n\n"

from LLM_json_schema import run_inference_constrained_by_json_schema
import os

script_path = os.path.dirname(os.path.realpath(__file__))
model_path = os.environ.get('MODEL_PATH', os.path.join(script_path, "./models/Mistral-7B-Instruct-v0.1.gguf"))
prompt = "\n\n### Instruction:\nWhat is the capital of France?\n\n### Response:\n"
json_schema = {"type": "object", "properties": {"country": {"type": "string"}, "capital": {"type": "string"}}}

for chunk in run_inference_constrained_by_json_schema(model_path=model_path, json_schema=json_schema, prompt=prompt):
    print(chunk, end="", flush=True)
print("")

If you use this work, please cite the following:
@article{duchenne2023llm_json_schema,
title={LLM Json Schema},
author={Olivier Duchenne},
journal={Github},
url={https://github.com/olivierDuchenne/LLM_json_schema},
year={2023}
}