Direct — autoyou_lib
pip install autoyou-lib. That's it.
autoyou_lib is the lightweight AutoYou Python server. When
it starts, it checks for a running Ollama instance on the default port
(11434) and automatically routes chat from the paired phone to your
installed models. Nothing extra to configure — install the library,
start it, pair your phone, and your local AI is ready.
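The startup check described above can be sketched in a few lines. This is an illustration, not autoyou_lib's actual internals: it assumes Ollama's default endpoint at localhost:11434 and uses Ollama's public /api/tags route to list installed models; the function name is hypothetical.

```python
import json
import urllib.error
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default port


def detect_ollama(timeout: float = 2.0) -> list[str]:
    """Return the names of locally installed Ollama models,
    or an empty list if no Ollama instance is reachable."""
    try:
        with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=timeout) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return []


models = detect_ollama()
if models:
    print(f"Ollama detected; chat will route to: {models[0]}")
else:
    print("No Ollama instance found on the default port")
```

If the probe succeeds, the server simply forwards chat to whichever model is installed; if not, it can report that Ollama isn't running instead of failing silently.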
This is the simplest path. You get Ollama chat from your phone with two
commands and no accounts.
pip install autoyou-lib
python -m autoyou_lib
Via OpenClaw
Ollama as one option among many.
If you're already running OpenClaw, you can add
Ollama as one of its model providers — alongside Claude, GPT-4,
Gemini, and others. The AutoYou plugin routes your phone's messages to
whichever model OpenClaw currently has selected. Switch between a local
Ollama model and a cloud model in OpenClaw without changing anything in
the phone app.
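The routing behavior described above amounts to a provider dispatch: the phone always talks to the plugin, and the plugin forwards to whatever OpenClaw has selected. A minimal sketch of that pattern, with entirely hypothetical names (OpenClaw's real plugin API may look nothing like this):

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class Provider:
    """One model backend the plugin can forward to."""
    name: str
    send: Callable[[str], str]  # message in, reply out


# Stand-ins for a local Ollama call and a cloud API call.
def local_ollama(message: str) -> str:
    return f"[ollama] reply to: {message}"


def cloud_claude(message: str) -> str:
    return f"[claude] reply to: {message}"


providers = {
    "ollama": Provider("ollama", local_ollama),
    "claude": Provider("claude", cloud_claude),
}

# Switching this selection is the only change needed --
# the phone app never sees which backend is active.
selected = "ollama"


def route_from_phone(message: str) -> str:
    """Forward the phone's message to the currently selected provider."""
    return providers[selected].send(message)


print(route_from_phone("hello"))
```

The point of the pattern is that the selection lives server-side: flipping `selected` from a local model to a cloud one changes nothing about how the phone connects.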
This is the more powerful path: the full OpenClaw AI pipeline, with Ollama
available as a local, private, zero-cost option whenever you want it.