ai.Local

An open-source local AI assistant that learns from your documents, runs on your machine, and stays private.

Copy and paste the following command into your terminal to get started:

$ curl -fsSL https://smartloop.ai/install | bash

Features

Attach Personal Documents

Local Small Language Model (GGML)

Connect to MCP Servers

Create Custom Skills

Open Model, Local Agents

Your Own AI Model

OpenAI Compatible API
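Because the API is OpenAI-compatible, existing client code can be pointed at the local server instead of a hosted one. A minimal sketch of building such a request, assuming a local endpoint at http://localhost:8080/v1 and a model named "local-slm" (both hypothetical; the actual host, port, and model name depend on your installation):

```python
import json

# Hypothetical local endpoint; the real host/port depends on your setup.
BASE_URL = "http://localhost:8080/v1"

def build_chat_request(prompt, model="local-slm"):
    """Build an OpenAI-style chat completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }

# To send it (requires the local server to be running):
#   import urllib.request
#   req = urllib.request.Request(
#       BASE_URL + "/chat/completions",
#       data=json.dumps(build_chat_request("Summarize my notes")).encode(),
#       headers={"Content-Type": "application/json"},
#   )
#   print(urllib.request.urlopen(req).read().decode())
```

Any client library that lets you override the base URL should work the same way.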

Coming Soon

Smartloop Studio

A native desktop experience for macOS, Windows, and Linux. Chat with your documents and run local models — all from one app.


FAQ

01. What is Smartloop?

A free and open-source AI assistant that extracts information from documents and generates content 100% locally using open models.

02. How does it work?

We use open models that are quantized (compressed) to run on your local devices.
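As a rough illustration of why quantization matters for running locally: a model's weight memory scales with parameter count times bits per weight. A back-of-the-envelope sketch (the 7B-parameter figure is illustrative, not a statement about which model ships):

```python
def model_size_gb(params_billions, bits_per_weight):
    """Approximate weight memory: parameters x bits per weight, in gigabytes."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# A 7B-parameter model at 16-bit needs ~14 GB just for weights,
# but quantized to 4-bit it fits in ~3.5 GB.
print(model_size_gb(7, 16))  # 14.0
print(model_size_gb(7, 4))   # 3.5
```

This is why a quantized small model can fit in the memory of a consumer GPU or an Apple Silicon Mac, while the full-precision original cannot.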

03. How much does it cost?

It is free to use and open-source. There is also an optional paid plan, which helps us maintain the models and improve the framework.

04. What model do you use?

We primarily use small language models (SLMs) that can run and be tuned within the limits of your device. We recommend Apple Silicon Macs (M-series) or NVIDIA CUDA devices with at least 4 GB of GPU memory. Based on available memory, inputs such as documents, attachments, skills, conversations, and MCP data are vectorized and tuned by local agents. Our approach is to tune models in order to free up context, since context is not infinite when running locally.

Accelerated by

NVIDIAMicrosoftGoogle for Startups