This plugin is currently in beta. While it is considered safe to use, be aware that its API may change in backward-incompatible ways in future releases, or the plugin may become unsupported.
# Ollama Model Provider

```yaml
type: "io.kestra.plugin.langchain4j.provider.Ollama"
```

## Examples
### Chat completion with Ollama

```yaml
id: chat_completion
namespace: company.team
inputs:
  - id: prompt
    type: STRING
tasks:
  - id: chat_completion
    type: io.kestra.plugin.langchain4j.ChatCompletion
    provider:
      type: io.kestra.plugin.langchain4j.provider.Ollama
      modelName: llama3
      endpoint: http://localhost:11434
    messages:
      - type: SYSTEM
        content: You are a helpful assistant, answer concisely, avoid overly casual language or unnecessary verbosity.
      - type: USER
        content: "{{inputs.prompt}}"
## Properties

- **endpoint** (string, required): The URL of the Ollama server, e.g. `http://localhost:11434`.
- **modelName** (string, required): The name of the Ollama model to use, e.g. `llama3`.
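
For reference, a minimal provider block using only the two required properties might look like the sketch below; the values are taken from the example above, and the model must already be available on the Ollama server (for instance pulled with `ollama pull llama3`).

```yaml
provider:
  type: io.kestra.plugin.langchain4j.provider.Ollama
  # Required: URL of the running Ollama server.
  endpoint: http://localhost:11434
  # Required: name of a model available on that server.
  modelName: llama3
```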
