Source: ollama-python
Maintainer: Home Assistant Team
Uploaders: Edward Betts,
Section: python
Priority: optional
Build-Depends: debhelper-compat (= 13),
               dh-sequence-python3,
               pybuild-plugin-pyproject,
               python3-all,
               python3-poetry-core,
Build-Depends-Indep: python3-httpx,
                     python3-pillow,
                     python3-pydantic,
                     python3-pytest,
                     python3-pytest-asyncio,
                     python3-pytest-cov,
                     python3-pytest-httpserver,
Rules-Requires-Root: no
Standards-Version: 4.7.0
Homepage: https://github.com/jmorganca/ollama-python
Vcs-Browser: https://salsa.debian.org/homeassistant-team/deps/ollama-python
Vcs-Git: https://salsa.debian.org/homeassistant-team/deps/ollama-python.git

Package: python3-ollama
Architecture: all
Depends: ${misc:Depends},
         ${python3:Depends},
Description: Library for interacting with the Ollama server and its AI models
 This library provides functionality for integrating with an Ollama server
 to interact with AI language models and create conversational experiences.
 It allows querying and generating text through an API that communicates
 with the server, supporting operations such as model management, message
 exchange, and prompt handling.
 .
 Ollama requires configuration to connect to a network-accessible server,
 after which it can be used to fetch and generate information based on
 context received from Home Assistant or similar platforms. Through model
 specification and prompt templates, the library adapts responses to the
 specific environment, although it operates without direct command over
 connected devices.