
AI Code Editor Integration

Sources:

Summary: Integrate the Mistral.ai 7B model: install it locally, run it with Ollama, and use it from Neovim with the gen.nvim extension created by David Kunz

  • Mistral model: comes in text and instruct variants; use instruct
  • Install and run Ollama
  • gen.nvim will invoke Ollama from Neovim
    • Call using :Gen
    • Example prompts like:
      • Correct grammar and spelling
      • Summarize text
      • Make concise
      • Make list
      • Make table for markdown
      • Ask question
      • Code
        • Review code
        • Change code
    • Create new prompts as needed (see the Lua sketch after this list)
      • User input as variables
      • Can replace text or not with generated output
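
A custom prompt can be registered by adding an entry to gen.nvim's prompts table; the sketch below assumes the plugin's documented $text and $input placeholders, and the prompt names are made up for illustration.

Neovim Lua (e.g. init.lua)
-- Register a custom prompt: $text is the current selection
require('gen').prompts['Fix_Wording'] = {
  prompt = 'Improve the wording of the following text, keeping the meaning:\n$text',
  replace = true,   -- replace the selected text with the generated output
}
-- $input asks the user for extra input before running
require('gen').prompts['Ask_About_Selection'] = {
  prompt = 'Regarding the following text, $input:\n$text',
  replace = false,  -- show the answer in a window instead of replacing
}
-- Invoke with :Gen Fix_Wording or :Gen Ask_About_Selection
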
Description                      Shortcut
Start                            :Gen
Chat, follow up in window        :Gen Chat
Response, replace text           Ctrl-Enter
Select model                     :lua require('gen').select_model()
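
A minimal plugin setup pointing gen.nvim at the local Ollama instance; this is a sketch that assumes the lazy.nvim plugin manager and the plugin's model/host/port options.

Neovim Lua (lazy.nvim plugin spec)
{
  'David-Kunz/gen.nvim',
  opts = {
    model = 'mistral',    -- default model, served by Ollama
    host = 'localhost',   -- Ollama's default address
    port = '11434',       -- Ollama's default port
  },
}
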
Terminal window
# Run the Ollama server without the desktop app
ollama serve
# Pull and run Mistral, the default model setting for gen.nvim
ollama run mistral
# Then open nvim and use :Gen
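
Once ollama serve is running, open Neovim, visually select some text, and run :Gen to pick a prompt from the list, or call one directly, e.g. :Gen Summarize (prompt names assume gen.nvim's defaults).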