#local-model
    Ollama Chat
    a year ago by Brumik
    Score: 48/100
    Category: 3rd Party Integrations
    The Ollama Chat plugin lets users query a locally hosted large language model (LLM) about their Obsidian notes. It indexes files at startup and updates the index when files change, so responses reflect the current state of the vault. A modal, opened via a shortcut or command, accepts questions for the LLM. The plugin runs against a local model, and planned features include real-time text streaming and predefined commands for common queries, such as summarizing a note or topic.
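    A plugin like this typically grounds the model by assembling note content into a prompt and sending it to Ollama's local HTTP API. The sketch below is illustrative, not the plugin's actual code: the endpoint and JSON shape follow Ollama's documented `/api/generate` API, but the `Note` type, the keyword-based note selection, and the prompt wording are hypothetical stand-ins for the plugin's real index lookup.

    ```typescript
    // Hypothetical sketch of grounding a local Ollama model in note content.

    interface Note {
      path: string;
      content: string;
    }

    // Pick notes whose content mentions a word from the question
    // (a simple stand-in for the plugin's real file index).
    function relevantNotes(notes: Note[], question: string): Note[] {
      const words = question.toLowerCase().split(/\W+/).filter(w => w.length > 3);
      return notes.filter(n =>
        words.some(w => n.content.toLowerCase().includes(w))
      );
    }

    // Build the request body for POST http://localhost:11434/api/generate.
    function buildRequest(notes: Note[], question: string) {
      const context = relevantNotes(notes, question)
        .map(n => `# ${n.path}\n${n.content}`)
        .join("\n\n");
      return {
        model: "llama3", // any model already pulled into the local Ollama
        prompt: `Answer using only these notes:\n\n${context}\n\nQuestion: ${question}`,
        stream: false,
      };
    }

    const notes: Note[] = [
      { path: "gardening.md", content: "Tomatoes need full sun." },
      { path: "cooking.md", content: "Simmer the sauce slowly." },
    ];
    const body = buildRequest(notes, "What do tomatoes need?");
    console.log(body.prompt.includes("gardening.md")); // true
    ```

    Sending `body` to `http://localhost:11434/api/generate` with `fetch` would return the model's answer; `stream: true` would instead deliver it incrementally, which is presumably how the planned real-time streaming feature would work.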