Local Model Demo: Real Providers
Real provider mode is opt-in. CI and normal tests use the mock adapter.
Ollama
Minimum setup:
```
ollama serve
ollama pull llama3.2
```

Environment:

```
CORVID_RUN_REAL=1
CORVID_MODEL=ollama:llama3.2
OLLAMA_BASE_URL=http://localhost:11434
```

OLLAMA_BASE_URL is optional when Ollama listens on http://localhost:11434.
Use it when running Ollama on another host or port. If your local tooling uses
OLLAMA_HOST, set OLLAMA_BASE_URL to the same URL before running Corvid.
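The fallback behaviour described above can be sketched as follows. This is a minimal illustration, not Corvid's actual implementation: the helper name is hypothetical, and only the variable names and default port come from this page.

```python
import os

def ollama_base_url() -> str:
    """Resolve the Ollama endpoint to talk to.

    OLLAMA_BASE_URL is optional: when it is unset, the standard local
    Ollama address is assumed. (Helper name is hypothetical.)
    """
    return os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434")

# If your tooling exports OLLAMA_HOST instead, mirror it before starting:
# os.environ["OLLAMA_BASE_URL"] = os.environ["OLLAMA_HOST"]
```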
Mock Mode
Offline tests and CI use:
```
CORVID_TEST_MOCK_LLM=1
CORVID_TEST_MOCK_LLM_RESPONSE=provider-neutral local inference with deterministic replay.
CORVID_MODEL=ollama:llama3.2
```

The mock returns the same String prompt result consumed by the Corvid program;
the surrounding LocalChatTurn typed surface is unchanged.
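A minimal sketch of how such a mock short-circuit can work, assuming the env-variable contract above. The `complete` function is hypothetical; only the environment variables are from this page, and the real-provider branch is elided.

```python
import os

def complete(prompt: str) -> str:
    """Return a reply for prompt, honouring the offline mock switch.

    When CORVID_TEST_MOCK_LLM=1, the canned CORVID_TEST_MOCK_LLM_RESPONSE
    is returned verbatim, so tests are deterministic and need no running
    provider. (Function name is hypothetical.)
    """
    if os.environ.get("CORVID_TEST_MOCK_LLM") == "1":
        return os.environ.get("CORVID_TEST_MOCK_LLM_RESPONSE", "")
    raise NotImplementedError("real provider dispatch not shown in this sketch")
```

Because the mock only replaces the final string result, callers that consume the typed chat surface need no changes.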