If you do not specify a provider, `ollama` will be the default provider, `http://localhost:11434` will be your endpoint, and `mistral:7b` will be your default model unless you update the configuration.
-- Simple, minimal Lazy.nvim configuration
{
  "huynle/ogpt.nvim",
  event = "VeryLazy",
  opts = {
    default_provider = "ollama",
    providers = {
      ollama = {
        api_host = os.getenv("OLLAMA_API_HOST") or "http://localhost:11434",
        api_key = os.getenv("OLLAMA_API_KEY") or "",
      }
    }
  },
  dependencies = {
    "MunifTanjim/nui.nvim",
    "nvim-lua/plenary.nvim",
    "nvim-telescope/telescope.nvim"
  }
}

OGPT.nvim ships with the defaults below. You can override any field by passing your own configuration as the setup argument.
https://github.com/huynle/ogpt.nvim/blob/main/lua/ogpt/config.lua
OGPT is a Neovim plugin that allows you to effortlessly utilize the Ollama API, empowering you to generate natural language responses from Ollama directly within the editor in response to your prompts.
It requires `curl`. A custom Ollama API host can be set with the configuration option `api_host_cmd` or the environment variable `$OLLAMA_API_HOST`, which is useful if you run Ollama remotely.
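As an example, here is a minimal sketch of pointing the `ollama` provider at a remote host. The hostname is a placeholder, and the commented `api_host_cmd` line assumes that option simply runs a shell command whose output is used as the host, as described above.

```lua
-- Sketch only: remote Ollama host (placeholder URL).
opts = {
  default_provider = "ollama",
  providers = {
    ollama = {
      -- read the host from the environment, or hard-code it:
      api_host = os.getenv("OLLAMA_API_HOST") or "http://my-ollama-box:11434",
      -- or, assuming api_host_cmd resolves the host from a shell command:
      -- api_host_cmd = "echo http://my-ollama-box:11434",
    },
  },
}
```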

The edgy.nvim plugin provides a side window (on the right by default) that gives you a parallel workspace, so you can keep working on your project while interacting with OGPT. Here is an example "edgy" configuration.
{
{
" huynle/ogpt.nvim " ,
event = " VeryLazy " ,
opts = {
default_provider = " ollama " ,
edgy = true , -- enable this!
single_window = false , -- set this to true if you want only one OGPT window to appear at a time
providers = {
ollama = {
api_host = os.getenv ( " OLLAMA_API_HOST " ) or " http://localhost:11434 " ,
api_key = os.getenv ( " OLLAMA_API_KEY " ) or " " ,
}
}
},
dependencies = {
" MunifTanjim/nui.nvim " ,
" nvim-lua/plenary.nvim " ,
" nvim-telescope/telescope.nvim "
}
},
{
" folke/edgy.nvim " ,
event = " VeryLazy " ,
init = function ()
vim . opt . laststatus = 3
vim . opt . splitkeep = " screen " -- or "topline" or "screen"
end ,
opts = {
exit_when_last = false ,
animate = {
enabled = false ,
},
wo = {
winbar = true ,
winfixwidth = true ,
winfixheight = false ,
winhighlight = " WinBar:EdgyWinBar,Normal:EdgyNormal " ,
spell = false ,
signcolumn = " no " ,
},
keys = {
-- -- close window
[ " q " ] = function ( win )
win : close ()
end ,
-- close sidebar
[ " Q " ] = function ( win )
win . view . edgebar : close ()
end ,
-- increase width
[ " <S-Right> " ] = function ( win )
win : resize ( " width " , 3 )
end ,
-- decrease width
[ " <S-Left> " ] = function ( win )
win : resize ( " width " , - 3 )
end ,
-- increase height
[ " <S-Up> " ] = function ( win )
win : resize ( " height " , 3 )
end ,
-- decrease height
[ " <S-Down> " ] = function ( win )
win : resize ( " height " , - 3 )
end ,
},
right = {
{
title = " OGPT Popup " ,
ft = " ogpt-popup " ,
size = { width = 0.2 },
wo = {
wrap = true ,
},
},
{
title = " OGPT Parameters " ,
ft = " ogpt-parameters-window " ,
size = { height = 6 },
wo = {
wrap = true ,
},
},
{
title = " OGPT Template " ,
ft = " ogpt-template " ,
size = { height = 6 },
},
{
title = " OGPT Sessions " ,
ft = " ogpt-sessions " ,
size = { height = 6 },
wo = {
wrap = true ,
},
},
{
title = " OGPT System Input " ,
ft = " ogpt-system-window " ,
size = { height = 6 },
},
{
title = " OGPT " ,
ft = " ogpt-window " ,
size = { height = 0.5 },
wo = {
wrap = true ,
},
},
{
title = " OGPT {{{selection}}} " ,
ft = " ogpt-selection " ,
size = { width = 80 , height = 4 },
wo = {
wrap = true ,
},
},
{
title = " OGPt {{{instruction}}} " ,
ft = " ogpt-instruction " ,
size = { width = 80 , height = 4 },
wo = {
wrap = true ,
},
},
{
title = " OGPT Chat " ,
ft = " ogpt-input " ,
size = { width = 80 , height = 4 },
wo = {
wrap = true ,
},
},
},
},
}
}

The plugin exposes the following commands:
The `OGPT` command opens an interactive window for communicating with the LLM backend. The interactive window is composed of four panes:
| Pane | Default keybinding | Description |
|---|---|---|
| Common | Ctrl-o | Toggle between the parameters panel (OGPT Parameters) and the sessions panel (OGPT Sessions). |
| | Ctrl-n | Create a new session. |
| | Ctrl-c | Close OGPT. |
| | Ctrl-i | Yank the code from the latest LLM response in the OGPT output text area. |
| | Ctrl-x | Stop generating the response. |
| | Tab | Cycle through the panes. |
| OGPT | k | Previous response. |
| | j | Next response. |
| | Ctrl-u | Scroll up. |
| | Ctrl-d | Scroll down. |
| OGPT Chat | Enter (Normal mode) | Send the prompt to the LLM. |
| | Alt-Enter (Insert mode) | Send the prompt to the LLM. |
| | Ctrl-y | Yank the latest LLM response in the OGPT output text area. |
| | Ctrl-r | Toggle the role (assistant or user). |
| | Ctrl-s | Toggle the system message. |
| OGPT Parameters | Enter | Change the selected parameter. |
| OGPT Sessions | Enter | Switch to the selected session. |
| | d | Delete the selected session. Note that the active session cannot be deleted. |
| | r | Rename the selected session. |
The keybindings of the OGPT interactive window can be modified via `opts.chat.keymaps`.
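For example, a minimal sketch that remaps a couple of the chat keys in your Lazy.nvim `opts` (the key names match the defaults listed further below; the chosen bindings are only illustrative):

```lua
opts = {
  chat = {
    keymaps = {
      -- illustrative overrides; any of the default chat keymaps can be remapped
      new_session = "<C-t>",     -- default is <C-n>
      stop_generating = "<C-q>", -- default is <C-x>
    },
  },
}
```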
The `OGPTActAs` command opens a prompt selection from Awesome ChatGPT Prompts to be used with the `mistral:7b` model.
`OGPTRun [action_name]` runs the LLM using the predefined action named `[action_name]`. OGPT provides a set of default actions, plus any custom actions defined by the user.
Actions take parameters that configure their behavior. The default actions are defined under `actions.<action_name>` in `config.lua`; custom actions are defined in your own OGPT configuration or in a separate actions file (e.g. `actions.json`).
Some of the action parameters are:

- `type`: the type of OGPT interface. Currently there are three types:
  - `popup`: a lightweight popup window
  - `edit`: the OGPT edit window
  - `completions`: no window is opened; the completion is written directly into the editing window
- `strategy`: determines how the OGPT interface behaves. Each `type` supports specific strategies:
  - `type = "popup"`: `display`, `replace`, `append`, `prepend`, `quick_fix`
  - `type = "completions"`: `display`, `replace`, `append`, `prepend`
  - `type = "edits"`: `edit`, `edit_code`
- `system`: the system prompt.
- `params`: model parameters. The default model parameters are defined at `opts.providers.<provider_name>.api_params`; you can override them in `opts.actions.<action_name>.params`.
- `model`: the LLM model to use for the action.
- `stop`: conditions under which the LLM stops generating a response. For example, a useful stop condition for `codellama` is "```". See the `optimize_code` action in the example lazy.nvim configuration.
- `temperature`: variance of the LLM response.
- `frequency_penalty`: please, someone explain.
- `max_tokens`: maximum number of tokens.
- `top_p`: please, someone explain.
- `template`: the prompt template. The template defines the general instructions the LLM has to follow. It can contain template arguments of the form `{{{<argument_name>}}}`, where `<argument_name>` is an argument defined in `args`. Besides the arguments defined in `args`, the following are always available:
  - `{{{input}}}`: the text selected in visual mode.
  - `{{{filetype}}}`: the filetype of the file you are interacting with.
- `args`: template arguments. Template arguments define the values that replace the parameters of the same name in `template`. Some common template arguments:
  - `instruction`: a custom instruction for the LLM to follow; it tends to be more specific and structured than the general instructions that go into `template`.
  - `lang`: a language, typically used for language-related actions such as translation or grammar checking.

The default actions are defined under `actions.<action_name>` in `config.lua`.
`OGPTRun edit_with_instructions` opens an interactive window that uses the model defined in `config.<provider_name>.api_params` to edit the selected text or the whole buffer.
By default, the interface type of `OGPTRun edit_with_instructions` is `edit`, which opens an interactive window on the right. Inside the window, you can open and close the parameters panel with `<C-o>` (the default keymap, which can be customized). Note that this screenshot uses edgy.nvim.


You can customize OGPT actions by defining them in your own OGPT configuration or in a separate actions file. When writing your own custom actions, the default action configurations in `config.lua` are a good reference.
You can configure actions in your own OGPT configuration file (typically a Lua file such as `ogpt.lua` if you use Neovim). In your configuration, define them under `actions.<action_name>`, just like the default actions in `config.lua`.
-- config options (Lua)
opts = {
  ...
  actions = {
    grammar_correction = {
      -- type = "popup", -- can be a string or a table to override
      type = {
        popup = { -- overrides the default popup options - https://github.com/huynle/ogpt.nvim/blob/main/lua/ogpt/config.lua#L147-L180
          edgy = true
        }
      },
      strategy = "replace",
      provider = "ollama", -- defaults to "default_provider" if not provided
      model = "mixtral:7b", -- defaults to "provider.<default_provider>.model" if not provided
      template = "Correct the given text to standard {{{lang}}}:\n\n```{{{input}}}```",
      system = "You are a helpful note writing assistant, given a text input, correct the text only for grammar and spelling error. You are to keep all formatting the same, e.g. markdown bullets, should stay as a markdown bullet in the result, and indents should stay the same. Return ONLY the corrected text.",
      params = {
        temperature = 0.3,
      },
      args = {
        lang = {
          type = "string",
          optional = "true",
          default = "english",
        },
      },
    },
    ...
  }
}

The `edit` type shows the output side by side with the input, and the output can be used for further editing prompts.
The `display` strategy shows the output in a floating window. `append` and `replace` modify the text directly in the buffer with "a" or "r".
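For instance, here is a hedged sketch of an action that appends its output into the buffer; the action name and prompt are made up purely for illustration:

```lua
opts = {
  actions = {
    add_summary = { -- hypothetical action name
      type = "popup",
      strategy = "append", -- write the response into the buffer after the selection
      template = "Summarize the following text in one short paragraph:\n\n{{{input}}}",
      params = { temperature = 0.3 },
    },
  },
}
```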
Custom actions can also be defined with a JSON file. See the example `actions.json` for reference.
An example of a custom action may look like this (`#` marks comments):
{
  "action_name": {
    "type": "popup", # "popup" or "edit"
    "template": "A template using possible variable: {{{filetype}}} (neovim filetype), {{{input}}} (the selected text) and {{{argument}}} (provided on the command line)",
    "strategy": "replace", # or "display" or "append" or "edit"
    "params": { # parameters according to the official Ollama API
      "model": "mistral:7b", # or any other model supported by `"type"` in the Ollama API, use the playground for reference
      "stop": [
        "```" # a string used to stop the model
      ]
    },
    "args": {
      "argument": "some value" # or a function
    }
  }
}

To use additional action files, append their paths to `actions_paths` in the defaults of `config.lua`.
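As a minimal sketch of that, assuming an extra actions file at a placeholder path:

```lua
opts = {
  -- append paths of additional action files (placeholder path shown here)
  actions_paths = { vim.fn.expand("~/.config/nvim/ogpt-actions.json") },
}
```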
On the fly, you can execute a command line call to OGPT. An example that overrides the grammar_correction call is provided below.

:OGPTRun grammar_correction {provider="openai", model="gpt-4"}

To make it more dynamic, you can change it so that the user fills in the provider/model, or any other option, on the spot when the command is executed.

:OGPTRun grammar_correction {provider=vim.fn.input("Provider: "), type={popup={edgy=false}}}

Additionally, in the example above, edgy.nvim can be turned off; that way the response pops up wherever the cursor is. For other popup options, look through https://github.com/huynle/ogpt.nvim/blob/main/lua/ogpt/config.lua#L147-L180. For example, you can make it a popup and set `enter = false`, which leaves the cursor in the same place instead of moving it into the popup window.
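If you use this often, here is a small sketch of wiring the dynamic call to a keymap (the `<leader>]G` binding is just a placeholder):

```lua
-- Sketch: ask for the provider each time before running grammar_correction.
vim.keymap.set({ "n", "v" }, "<leader>]G", function()
  vim.cmd('OGPTRun grammar_correction {provider=vim.fn.input("Provider: "), type={popup={edgy=false}}}')
end, { desc = "OGPT grammar correction (ask for provider)" })
```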
Additionally, for advanced users, this allows you to use Vim autocommands; for example, a completion can be triggered automatically when the cursor has been idle.
Check out the various template helpers for this advanced option: the given input to the API is currently scanned for `{{{<template_helper_name>}}}`. This is helpful when you want to provide more context to your API request, or simply to hook in other function calls.
See this file for the latest template helpers. If you have more template helpers, please put up a PR; your contribution is appreciated!
https://github.com/huynle/ogpt.nvim/blob/main/lua/ogpt/flows/actions/template_helpers.lua
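As a sketch of the autocommand idea above, assuming the `infill_visible_code` action from the next snippet exists in your configuration (the event and pattern are purely illustrative):

```lua
-- Sketch: trigger a custom OGPT action when the cursor is idle in insert mode.
vim.api.nvim_create_autocmd("CursorHoldI", {
  pattern = "*.py", -- illustrative: only in Python buffers
  callback = function()
    vim.cmd("OGPTRun infill_visible_code")
  end,
})
```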
Here is a custom action I have been using; it takes the visible window as context so the AI can answer inline questions.
....
-- Other OGPT configurations here
....
actions = {
infill_visible_code = {
type = " popup " ,
template = [[
Given the following code snippets, please complete the code by infilling the rest of the code in between the two
code snippets for BEFORE and AFTER, these snippets are given below.
Code BEFORE infilling position:
```{{{filetype}}}
{{{visible_window_content}}}
{{{before_cursor}}}
```
Code AFTER infilling position:
```{{{filetype}}}
{{{after_cursor}}}
```
Within the given snippets, complete the instructions that are given in between the
triple percent sign '%%%-' and '-%%%'. Note that the instructions as
could be multilines AND/OR it could be in a comment block of the code!!!
Lastly, apply the following conditions to your response.
* The response should replace the '%%%-' and '-%%%' if the code snippet was to be reused.
* PLEASE respond ONLY with the answers to the given instructions.
]] ,
strategy = " display " ,
-- provider = "textgenui",
-- model = "mixtral-8-7b",
-- params = {
-- max_new_tokens = 1000,
-- },
},
-- more actions here
}
....

To change the model, open the parameters panel (Ctrl-o by default, or whatever you have it mapped to) and press Enter on the model field to change it. It should list all of the models available from your LLM provider.
In the parameters panel, use the "a" and "d" keys to add and delete parameters, respectively.
The following keybindings are available under `config.chat.keymaps` when using OGPT:
https://github.com/huynle/ogpt.nvim/blob/main/lua/ogpt/config.lua#L51-L71
Interactive popup used by the `edit` and `edit_code` strategies: https://github.com/huynle/ogpt.nvim/blob/main/lua/ogpt/config.lua#L18-L28
https://github.com/huynle/ogpt.nvim/blob/main/lua/ogpt/config.lua#L174-L181
When the parameters panel is open (toggled with `<C-o>`), a setting can be modified by pressing Enter on the relevant entry. Settings are saved across sessions.
lazy.nvim configuration:

return {
{
" huynle/ogpt.nvim " ,
dev = true ,
event = " VeryLazy " ,
keys = {
{
" <leader>]] " ,
" <cmd>OGPTFocus<CR> " ,
desc = " GPT " ,
},
{
" <leader>] " ,
" :'<,'>OGPTRun<CR> " ,
desc = " GPT " ,
mode = { " n " , " v " },
},
{
" <leader>]c " ,
" <cmd>OGPTRun edit_code_with_instructions<CR> " ,
" Edit code with instruction " ,
mode = { " n " , " v " },
},
{
" <leader>]e " ,
" <cmd>OGPTRun edit_with_instructions<CR> " ,
" Edit with instruction " ,
mode = { " n " , " v " },
},
{
" <leader>]g " ,
" <cmd>OGPTRun grammar_correction<CR> " ,
" Grammar Correction " ,
mode = { " n " , " v " },
},
{
" <leader>]r " ,
" <cmd>OGPTRun evaluate<CR> " ,
" Evaluate " ,
mode = { " n " , " v " },
},
{
" <leader>]i " ,
" <cmd>OGPTRun get_info<CR> " ,
" Get Info " ,
mode = { " n " , " v " },
},
{ " <leader>]t " , " <cmd>OGPTRun translate<CR> " , " Translate " , mode = { " n " , " v " } },
{ " <leader>]k " , " <cmd>OGPTRun keywords<CR> " , " Keywords " , mode = { " n " , " v " } },
{ " <leader>]d " , " <cmd>OGPTRun docstring<CR> " , " Docstring " , mode = { " n " , " v " } },
{ " <leader>]a " , " <cmd>OGPTRun add_tests<CR> " , " Add Tests " , mode = { " n " , " v " } },
{ " <leader>]<leader> " , " <cmd>OGPTRun custom_input<CR> " , " Custom Input " , mode = { " n " , " v " } },
{ " g? " , " <cmd>OGPTRun quick_question<CR> " , " Quick Question " , mode = { " n " } },
{ " <leader>]f " , " <cmd>OGPTRun fix_bugs<CR> " , " Fix Bugs " , mode = { " n " , " v " } },
{
" <leader>]x " ,
" <cmd>OGPTRun explain_code<CR> " ,
" Explain Code " ,
mode = { " n " , " v " },
},
},
opts = {
default_provider = " ollama " ,
-- default edgy flag
-- set this to true if you prefer to use edgy.nvim (https://github.com/folke/edgy.nvim) instead of floating windows
edgy = false ,
providers = {
ollama = {
api_host = os.getenv ( " OLLAMA_API_HOST " ),
-- default model
model = " mistral:7b " ,
-- model definitions
models = {
-- alias to actual model name, helpful to define same model name across multiple providers
coder = " deepseek-coder:6.7b " ,
-- nested alias
cool_coder = " coder " ,
general_model = " mistral:7b " ,
custom_coder = {
name = " deepseek-coder:6.7b " ,
modify_url = function ( url )
-- completely modify the URL of a model, if necessary. This function is called
-- right before making the REST request
return url
end ,
-- custom conform function. Each provider have a dedicated conform function where all
-- of OGPT chat info is passed into the conform function to be massaged to the
-- correct format that the provider is expecting. This function, if provided will
-- override the provider default conform function
-- conform_fn = function(ogpt_params)
-- return provider_specific_params
-- end,
},
},
-- default model params for all 'actions'
api_params = {
model = " mistral:7b " ,
temperature = 0.8 ,
top_p = 0.9 ,
},
api_chat_params = {
model = " mistral:7b " ,
frequency_penalty = 0 ,
presence_penalty = 0 ,
temperature = 0.5 ,
top_p = 0.9 ,
},
},
openai = {
api_host = os.getenv ( " OPENAI_API_HOST " ),
api_key = os.getenv ( " OPENAI_API_KEY " ),
api_params = {
model = " gpt-4 " ,
temperature = 0.8 ,
top_p = 0.9 ,
},
api_chat_params = {
model = " gpt-4 " ,
frequency_penalty = 0 ,
presence_penalty = 0 ,
temperature = 0.5 ,
top_p = 0.9 ,
},
},
textgenui = {
api_host = os.getenv ( " TEXTGEN_API_HOST " ),
api_key = os.getenv ( " TEXTGEN_API_KEY " ),
api_params = {
model = " mixtral-8-7b " ,
temperature = 0.8 ,
top_p = 0.9 ,
},
api_chat_params = {
model = " mixtral-8-7b " ,
frequency_penalty = 0 ,
presence_penalty = 0 ,
temperature = 0.5 ,
top_p = 0.9 ,
},
},
},
yank_register = " + " ,
edit = {
edgy = nil , -- use global default, override if defined
diff = false ,
keymaps = {
close = " <C-c> " ,
accept = " <M-CR> " ,
toggle_diff = " <C-d> " ,
toggle_parameters = " <C-o> " ,
cycle_windows = " <Tab> " ,
use_output_as_input = " <C-u> " ,
},
},
popup = {
edgy = nil , -- use global default, override if defined
position = 1 ,
size = {
width = " 40% " ,
height = 10 ,
},
padding = { 1 , 1 , 1 , 1 },
enter = true ,
focusable = true ,
zindex = 50 ,
border = {
style = " rounded " ,
},
buf_options = {
modifiable = false ,
readonly = false ,
filetype = " ogpt-popup " ,
syntax = " markdown " ,
},
win_options = {
wrap = true ,
linebreak = true ,
winhighlight = " Normal:Normal,FloatBorder:FloatBorder " ,
},
keymaps = {
close = { " <C-c> " , " q " },
accept = " <C-CR> " ,
append = " a " ,
prepend = " p " ,
yank_code = " c " ,
yank_to_register = " y " ,
},
},
chat = {
edgy = nil , -- use global default, override if defined
welcome_message = WELCOME_MESSAGE ,
loading_text = " Loading, please wait ... " ,
question_sign = " " , -- ?
answer_sign = " ﮧ " , -- ?
border_left_sign = " | " ,
border_right_sign = " | " ,
max_line_length = 120 ,
sessions_window = {
active_sign = " ? " ,
inactive_sign = " ? " ,
current_line_sign = " " ,
border = {
style = " rounded " ,
text = {
top = " Sessions " ,
},
},
win_options = {
winhighlight = " Normal:Normal,FloatBorder:FloatBorder " ,
},
},
keymaps = {
close = { " <C-c> " },
yank_last = " <C-y> " ,
yank_last_code = " <C-i> " ,
scroll_up = " <C-u> " ,
scroll_down = " <C-d> " ,
new_session = " <C-n> " ,
cycle_windows = " <Tab> " ,
cycle_modes = " <C-f> " ,
next_message = " J " ,
prev_message = " K " ,
select_session = " <CR> " ,
rename_session = " r " ,
delete_session = " d " ,
draft_message = " <C-d> " ,
edit_message = " e " ,
delete_message = " d " ,
toggle_parameters = " <C-o> " ,
toggle_message_role = " <C-r> " ,
toggle_system_role_open = " <C-s> " ,
stop_generating = " <C-x> " ,
},
},
-- {{{input}}} is always available as the selected/highlighted text
actions = {
grammar_correction = {
type = " popup " ,
template = " Correct the given text to standard {{{lang}}}: nn ```{{{input}}}``` " ,
system = " You are a helpful note writing assistant, given a text input, correct the text only for grammar and spelling error. You are to keep all formatting the same, e.g. markdown bullets, should stay as a markdown bullet in the result, and indents should stay the same. Return ONLY the corrected text. " ,
strategy = " replace " ,
params = {
temperature = 0.3 ,
},
args = {
lang = {
type = " string " ,
optional = " true " ,
default = " english " ,
},
},
},
translate = {
type = " popup " ,
template = " Translate this into {{{lang}}}: nn {{{input}}} " ,
strategy = " display " ,
params = {
temperature = 0.3 ,
},
args = {
lang = {
type = " string " ,
optional = " true " ,
default = " vietnamese " ,
},
},
},
keywords = {
type = " popup " ,
template = " Extract the main keywords from the following text to be used as document tags. nn ```{{{input}}}``` " ,
strategy = " display " ,
params = {
model = " general_model " , -- use of model alias, generally, this model alias should be available to all providers in use
temperature = 0.5 ,
frequency_penalty = 0.8 ,
},
},
do_complete_code = {
type = " popup " ,
template = " Code: n ```{{{filetype}}} n {{{input}}} n ``` nn Completed Code: n ```{{{filetype}}} " ,
strategy = " display " ,
params = {
model = " coder " ,
stop = {
" ``` " ,
},
},
},
quick_question = {
type = " popup " ,
args = {
-- template expansion
question = {
type = " string " ,
optional = " true " ,
default = function ()
return vim . fn . input ( " question: " )
end ,
},
},
system = " You are a helpful assistant " ,
template = " {{{question}}} " ,
strategy = " display " ,
},
custom_input = {
type = " popup " ,
args = {
instruction = {
type = " string " ,
optional = " true " ,
default = function ()
return vim . fn . input ( " instruction: " )
end ,
},
},
system = " You are a helpful assistant " ,
template = " Given the follow snippet, {{{instruction}}}. nn snippet: n ```{{{filetype}}} n {{{input}}} n ``` " ,
strategy = " display " ,
},
optimize_code = {
type = " popup " ,
system = " You are a helpful coding assistant. Complete the given prompt. " ,
template = " Optimize the code below, following these instructions: nn {{{instruction}}}. nn Code: n ```{{{filetype}}} n {{{input}}} n ``` nn Optimized version: n ```{{{filetype}}} " ,
strategy = " edit_code " ,
params = {
model = " coder " ,
stop = {
" ``` " ,
},
},
},
},
},
dependencies = {
" MunifTanjim/nui.nvim " ,
" nvim-lua/plenary.nvim " ,
" nvim-telescope/telescope.nvim " ,
},
},
}

When you update your actions frequently, I recommend adding the following key to your lazy.nvim OGPT configuration. It simply reloads ogpt.nvim on the spot, so you can immediately try out your updated actions.
...
-- other config options here
keys = {
{
" <leader>ro " ,
" <Cmd>Lazy reload ogpt.nvim<CR> " ,
desc = " RELOAD ogpt " ,
},
...
}
-- other config options here
...

Here is an example of how to set up an Ollama Mixtral model that may be served from a different host. Note that in the example below:
- `secret_model` is an alias for `mixtral-8-7b`, so in your `actions` you can simply refer to `secret_model`. This is useful when multiple providers serve the same Mixtral capability and you want to swap providers depending on your development environment, or for any other reason.
- Because `mixtral-8-7b` is defined as a model, it will show up in the model options of the `chat` and `edit` actions.
- `conform_messages_fn` overrides the provider's default `conform_messages` function, which massages the API request parameters to fit a specific model. This is really useful when you need to modify the messages to match the template the model was trained on.
- `conform_request_fn` overrides the provider's default `conform_request` function. This function (or the provider default) is called last, right before the API call is made, so any final massaging can happen here.

-- advanced model, can take the following structure
providers = {
ollama = {
model = " secret_model " , -- default model for ollama
models = {
...
secret_model = " mixtral-8-7b " ,
[ " mixtral-8-7b " ] = {
params = {
-- the parameters here are FORCED into the final API REQUEST, OVERRIDDING
-- anything that was set before
max_new_token = 200 ,
},
modify_url = function ( url )
-- given a URL, this function modifies the URL specifically to the model
-- This is useful when you have different models hosted on different subdomains like
-- https://model1.yourdomain.com/
-- https://model2.yourdomain.com/
local new_model = " mixtral-8-7b "
-- local new_model = "mistral-7b-tgi-predictor-ai-factory"
local host = url : match ( " https?://([^/]+) " )
local subdomain , domain , tld = host : match ( " ([^.]+)%.([^.]+)%.([^.]+) " )
local _new_url = url : gsub ( host , new_model .. " . " .. domain .. " . " .. tld )
return _new_url
end ,
-- conform_messages_fn = function(params)
-- Different models might have different instruction format
-- for example, Mixtral operates on `<s> [INST] Instruction [/INST] Model answer</s> [INST] Follow-up instruction [/INST] `
-- look in the `providers` folder of the plugin for examples
-- end,
-- conform_request_fn = function(params)
-- API request might need custom format, this function allows that to happen
-- look in the `providers` folder of the plugin for examples
-- end,
}
}
}
}
TBD
If you like the edgy.nvim setup, use something like the following for your edgy.nvim plugin options. After setting this up, make sure you enable `edgy = true` in your ogpt.nvim configuration options.
opts = {
right = {
{
title = " OGPT Popup " ,
ft = " ogpt-popup " ,
size = { width = 0.2 },
wo = {
wrap = true ,
},
},
{
title = " OGPT Parameters " ,
ft = " ogpt-parameters-window " ,
size = { height = 6 },
wo = {
wrap = true ,
},
},
{
title = " OGPT Template " ,
ft = " ogpt-template " ,
size = { height = 6 },
},
{
title = " OGPT Sessions " ,
ft = " ogpt-sessions " ,
size = { height = 6 },
wo = {
wrap = true ,
},
},
{
title = " OGPT System Input " ,
ft = " ogpt-system-window " ,
size = { height = 6 },
},
{
title = " OGPT " ,
ft = " ogpt-window " ,
size = { height = 0.5 },
wo = {
wrap = true ,
},
},
{
title = " OGPT {{{selection}}} " ,
ft = " ogpt-selection " ,
size = { width = 80 , height = 4 },
wo = {
wrap = true ,
},
},
{
title = " OGPt {{{instruction}}} " ,
ft = " ogpt-instruction " ,
size = { width = 80 , height = 4 },
wo = {
wrap = true ,
},
},
{
title = " OGPT Chat " ,
ft = " ogpt-input " ,
size = { width = 80 , height = 4 },
wo = {
wrap = true ,
},
},
},
}

- `providers`
- "default_provider" in `config.lua`, defaults to `ollama`
- `OGPTRun` shows a Telescope selector
- `type="popup"` and `strategy="display"` -- or `append`, `prepend`, `replace`, `quick_fix`

Thanks to the author of jackMort/ChatGPT.nvim for creating a seamless framework to interact with LLMs in Neovim!
Buy me a coffee