Coze Studio (open-source local version) deployment guide

1. Introduction

Coze is ByteDance's open-source agent platform. It consists of Coze Studio (the development platform) and Coze Loop (the operations platform), and supports local deployment.
Main features of the local version:

  • One-click deployment with Docker Compose
  • Agent development, plugin, and knowledge base management features
  • No large model is bundled by default; you must configure a cloud or local model API yourself

2. Environmental preparation

  • Operating system: Linux / macOS / Windows (WSL2 recommended)
  • Dependencies: Docker and Docker Compose

3. Deployment steps

1. Get source code

git clone https://github.com/coze-dev/coze-studio.git
cd coze-studio/docker

2. Configure environment variables

Copy .env.example to .env and modify it as needed:

cp .env.example .env

Common configurations:

# Service port
APP_PORT=8888

# Database account
POSTGRES_USER=coze
POSTGRES_PASSWORD=coze123
POSTGRES_DB=coze_local

# Redis settings
REDIS_PASSWORD=coze123

3. Start the services

docker compose --profile '*' up -d

Once the containers have finished starting, visit:

http://localhost:8888

to open the Coze Studio management console.
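To confirm the stack came up cleanly, you can list the containers and probe the web port. A quick sketch; the port assumes APP_PORT=8888 from the .env above:

```shell
# Run from coze-studio/docker, where the compose file lives;
# all services should show a "running" status
docker compose ps

# Probe the web UI; 200 (or a redirect code) means the app is serving
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8888
```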

4. Connecting a model

1. Cloud model (OpenAI as an example)

In Coze Studio → Settings → Large Model, fill in:

  • Base URL: https://api.openai.com/v1
  • API Key: sk-xxxxxx

2. Local model (Ollama / FastChat recommended)

Option A: Ollama

Install Ollama and run:

ollama run qwen:7b

By default, Ollama exposes an OpenAI-compatible endpoint at:

http://127.0.0.1:11434/v1

Configure in Coze:

  • Base URL: http://127.0.0.1:11434/v1
  • API Key: Any string
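Before wiring it into Coze, you can sanity-check the Ollama endpoint directly. A sketch; the model name qwen:7b matches the ollama run command above:

```shell
# Ask Ollama's OpenAI-compatible endpoint for a short reply;
# a JSON response with a "choices" array means the endpoint is up
curl -s http://127.0.0.1:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen:7b",
    "messages": [{"role": "user", "content": "Say hi"}]
  }'
```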

Option B: FastChat

Deploy FastChat (supports Qwen, LLaMA, ChatGLM, etc.):

docker run -it -p 8000:8000 \
  -v ./models:/models lmsys/fastchat \
  bash -c "python3 -m fastchat.serve.controller & \
  python3 -m fastchat.serve.model_worker --model-path /models/qwen-7b & \
  python3 -m fastchat.serve.openai_api_server --host 0.0.0.0 --port 8000"

Configure in Coze:

  • Base URL: http://127.0.0.1:8000/v1
  • API Key: Any string
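FastChat's OpenAI-compatible server also exposes a model-listing endpoint, which is a quick way to confirm the worker registered its model. A sketch; assumes the server above is running on port 8000:

```shell
# List served models; "qwen-7b" should appear in the "data" array
curl -s http://127.0.0.1:8000/v1/models
```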

5. Verify the connection

1. Test the API directly

Use curl to check whether the large-model endpoint responds:

curl http://127.0.0.1:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer test" \
  -d '{
    "model": "qwen-7b",
    "messages": [{"role": "user", "content": "Hello"}]
  }'

A response containing choices → message means the API is working.
A 404 or connection refused means the model service is not started or the URL is wrong.
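To pull just the assistant's reply out of a Chat Completions response, you can pipe it through jq. A sketch assuming jq is installed; the response below is a canned example of the expected shape:

```shell
# A minimal Chat Completions response with the shape Coze expects
response='{"choices":[{"message":{"role":"assistant","content":"Hello!"}}]}'

# Extract the reply text; an empty result would mean the shape is wrong
echo "$response" | jq -r '.choices[0].message.content'
# → Hello!
```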

2. Check the Coze logs

docker compose logs -f app

Common errors:

  • ECONNREFUSED → the model service is not running
  • 404 Not Found → incorrect URL (missing /v1)
  • invalid_request_error → the response format does not match; a translation proxy is needed

6. Common issues and notes

  1. No large model by default
    The Coze local version is only the agent platform; you must configure a large-model API yourself.
  2. API format requirements
    The API must follow the OpenAI Chat Completions format: { "model": "xxx", "messages": [{ "role": "user", "content": "Hello" }] }. If the local model's API is incompatible, wrap it with a translation proxy.
  3. API Key is required
    Even for a local model that does not check the key, you must fill in an arbitrary string, otherwise requests will fail.
  4. Performance and latency
    • Cloud model → subject to network latency
    • Local model → depends on GPU performance (16 GB of VRAM is a recommended starting point)
  5. Offline environment deployment
    • On an internet-connected machine, docker pull the images → docker save → docker load on the intranet host
    • This avoids image download failures caused by network restrictions
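The offline transfer in point 5 can be sketched as follows. The image and file names are illustrative; substitute the images your deployment actually uses:

```shell
# On the internet-connected machine: pull and export an image to a tarball
docker pull lmsys/fastchat
docker save -o fastchat.tar lmsys/fastchat

# Copy fastchat.tar to the intranet host, then import it there
docker load -i fastchat.tar
```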

7. Summary

  • The Coze local version lets you run the agent platform safely on an intranet, but you must configure the large model yourself.
  • To connect a local LLM, Ollama (lightweight) or FastChat (broad compatibility) is recommended.
  • When you hit problems, first verify the large-model API with curl, then check the Coze configuration and logs.

Download the local version of Coze Studio: https://github.com/coze-dev/coze-studio
