Configuring Continue for a local LLM via Ollama
by 개발자   2025-11-10 01:07:44   Views: 39

~/.continue/config.yaml

Command to launch Continue:

cn --config ~/.continue/config.yaml
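
The config below references several Ollama models, which need to be available locally before Continue can use them. A minimal sketch, assuming Ollama is installed and running on its default port (11434) -- the tags match the model entries in the config:

ollama pull llama3.1:8b
ollama pull codellama:latest
ollama pull qwen2.5-coder:1.5b-base
ollama pull nomic-embed-text:latest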


Configuration file:

name: Local Assistant
version: 1.0.0
schema: v1
models:
  - name: Llama 3.1 8B
    provider: ollama
    model: llama3.1:8b
    roles:
      - chat
      - edit
      - apply
  - name: Code Llama
    provider: ollama
    model: codellama:latest
    roles:
      - chat
      - edit
      - apply
  - name: Qwen2.5-Coder 1.5B
    provider: ollama
    model: qwen2.5-coder:1.5b-base
    roles:
      - autocomplete
  - name: Nomic Embed
    provider: ollama
    model: nomic-embed-text:latest
    apiBase: http://localhost:11434  # explicitly set the API address
    roles:
      - embed
  - name: Autodetect
    provider: ollama
    model: AUTODETECT
context:
  - provider: code
  - provider: docs
  - provider: diff
  - provider: terminal
  - provider: problems
  - provider: folder
  - provider: codebase
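
As a quick sanity check (not part of the Continue setup itself), the Ollama server can be queried at the apiBase address used above to confirm it is reachable and the models are installed:

curl http://localhost:11434/api/tags

This should return a JSON list of the locally installed models. If the connection is refused, start the server with ollama serve (or check the Ollama service) before running cn.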