AI Ollama Helper
Local LLM Hub

Install Ollama on Windows

A quick, reliable setup: download the official installer, verify the CLI, pull your first model, and run locally — all in a few minutes.

⬇️ Download Ollama for Windows (Official)

Step‑by‑step

Download the installer

Open the official download page and get the Windows .exe.


Run the installer

Launch the .exe and follow the prompts. The installer adds the ollama CLI to your PATH and registers a background service. If a Windows Firewall dialog appears, allow access so the local server can listen on its default port.

Verify the CLI

Open Command Prompt or PowerShell and check that Ollama is available. If the command isn’t found, open a new terminal window so the updated PATH is picked up.

ollama --version
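If you prefer a script-friendly check over eyeballing the output, a guarded version looks like this. It uses POSIX-shell syntax (e.g. Git Bash); in PowerShell, `Get-Command ollama` plays the same role:

```shell
# Detect whether the ollama CLI is on PATH before calling it.
if command -v ollama >/dev/null 2>&1; then
  FOUND="yes"
  ollama --version
else
  FOUND="no"
  echo "ollama not on PATH - open a new terminal window and try again"
fi
```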

Pull your first model

Download a starter model like Llama 3:

ollama pull llama3
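After the pull completes, `ollama list` confirms the model is stored locally, showing its name, size, and when it was last modified. A guarded sketch (so the snippet also behaves sensibly before Ollama is installed):

```shell
# List locally stored models; guarded so the snippet degrades
# gracefully when ollama is not installed yet.
if command -v ollama >/dev/null 2>&1; then
  MODELS="$(ollama list)"
else
  MODELS="(ollama not installed)"
fi
echo "$MODELS"
```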

Run locally

Start a local chat session with the model:

ollama run llama3
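Besides the interactive chat, `ollama run` accepts a prompt as an argument, which is handy for one-shot, non-interactive use (the prompt text here is just an example):

```shell
# One-shot prompt: prints the model's reply and exits instead of
# opening an interactive chat session. Guarded for missing installs.
PROMPT="Why is the sky blue?"
if command -v ollama >/dev/null 2>&1; then
  ollama run llama3 "$PROMPT"
else
  echo "install ollama first, then: ollama run llama3 \"$PROMPT\""
fi
```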

To verify the local API is up, you can also check the tags endpoint in a browser: http://localhost:11434/api/tags.
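The same check works from the command line with curl (available in recent Windows builds), probing Ollama's default port 11434:

```shell
# Probe the local Ollama API; STATUS records whether the server answered.
if curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1; then
  STATUS="up"
else
  STATUS="down"
fi
echo "Ollama API is $STATUS"
```

If the status is "down", make sure the Ollama background service is running (it normally starts automatically after installation).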

What’s next?

Community‑driven guide. Not affiliated with the official Ollama project.