Tabby: self-hosted AI coding assistant

Tabby is a self-hosted AI coding assistant, offering an open-source and on-premises alternative to GitHub Copilot.

It boasts several key features:

  • Self-contained, with no need for a DBMS or cloud service.
  • OpenAPI interface, making it easy to integrate with existing infrastructure (e.g. Cloud IDEs).
  • Supports consumer-grade GPUs.

Run Tabby in 1 Minute

The easiest way to start a Tabby server is by using the following Docker command:

docker run -it \
  --gpus all -p 8080:8080 -v $HOME/.tabby:/data \
  tabbyml/tabby \
  serve --model StarCoder-1B --device cuda --chat-model Qwen2-1.5B-Instruct

For additional options (e.g. inference type, parallelism), please refer to the documentation page.
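
Once the server is up, the OpenAPI interface can be exercised directly. The sketch below is a minimal, hedged example: it assumes the server is reachable at `http://localhost:8080` and exposes a `/v1/completions` endpoint; the exact request schema may vary between Tabby versions, so consult the OpenAPI documentation served by your instance.

```python
import json
from urllib import request, error

# Hypothetical completion request payload — field names are assumptions
# based on a typical Tabby setup; verify against your server's OpenAPI spec.
payload = {
    "language": "python",                          # language of the file being edited
    "segments": {"prefix": "def fib(n):\n    "},   # code before the cursor
}

def request_completion(url="http://localhost:8080/v1/completions"):
    """POST the payload to a locally running Tabby server, if one is up."""
    req = request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    try:
        with request.urlopen(req, timeout=5) as resp:
            return json.loads(resp.read())
    except (error.URLError, OSError):
        return None  # server not reachable; nothing to show

completion = request_completion()
if completion is not None:
    print(json.dumps(completion, indent=2))
```

If the server is running, the response typically contains one or more suggested completions; otherwise the function returns None rather than raising.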