This document describes the DB-GPT monorepo structure, build system architecture, and package management approach. It covers the workspace organization, build tooling (uv package manager and Hatchling build backend), inter-package dependencies, version management, and the CI/CD pipeline for package publishing.
For information about setting up a development environment, see Development Environment Setup. For information about testing, see Testing and Quality Assurance. For deployment configurations, see Deployment and Configuration.
DB-GPT uses a monorepo architecture managed by uv workspaces. The repository root contains a workspace configuration that coordinates eight independent packages (listed below), each with its own pyproject.toml file and versioning.
Sources: pyproject.toml31-40 packages/dbgpt-core/pyproject.toml1-205 packages/dbgpt-ext/pyproject.toml1-106 packages/dbgpt-app/pyproject.toml1-65 packages/dbgpt-serve/pyproject.toml1-44 packages/dbgpt-client/pyproject.toml1-47
The workspace root pyproject.toml defines the workspace members and establishes workspace sources that allow packages to reference each other during development:
| Configuration Key | Purpose |
|---|---|
tool.uv.workspace.members | Lists all package directories included in the workspace |
tool.uv.sources | Defines workspace dependencies as local paths |
tool.uv.dev-dependencies | Shared development tools (pytest, ruff, mypy, pre-commit) |
Sources: pyproject.toml31-40 pyproject.toml15-23 pyproject.toml44-60
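As a hedged sketch of what such a root-level configuration looks like (member globs and the specific package names shown are illustrative, not copied from the repository):

```toml
[tool.uv.workspace]
members = ["packages/*"]

# Resolve sibling packages from local paths during development
[tool.uv.sources]
dbgpt = { workspace = true }
dbgpt-ext = { workspace = true }
dbgpt-serve = { workspace = true }
```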
| Package | Primary Purpose | Key Dependencies |
|---|---|---|
dbgpt-core | Core framework abstractions (AWEL, agents, storage interfaces) | aiohttp, pydantic, SQLAlchemy |
dbgpt-ext | Storage implementations (vector stores, knowledge graphs, data sources) | dbgpt, pymysql |
dbgpt-serve | Service layer (RAG service, evaluation service) | dbgpt-ext |
dbgpt-app | Application logic integrating all components | All packages |
dbgpt-client | Python client library for DB-GPT APIs | dbgpt[client,cli], dbgpt-ext |
dbgpt-acc-auto | Hardware acceleration (CUDA, quantization, vLLM) | torch, vllm, bitsandbytes |
dbgpt-acc-flash-attn | Flash Attention acceleration | flash-attn |
dbgpt-sandbox | Code execution sandbox | - |
Sources: packages/dbgpt-core/pyproject.toml14-26 packages/dbgpt-ext/pyproject.toml12-15 packages/dbgpt-serve/pyproject.toml12-14 packages/dbgpt-app/pyproject.toml12-20 packages/dbgpt-client/pyproject.toml12-22 packages/dbgpt-accelerator/dbgpt-acc-auto/pyproject.toml26-103
DB-GPT uses uv as its package manager: a modern, Rust-based tool that provides fast dependency resolution and installation, replacing traditional pip-based workflows.
The workspace maintains a single uv.lock file for reproducible builds across all platforms and Python versions.

Sources: uv.lock1-7
The uv.lock file contains hundreds of resolution markers that specify different dependency sets based on:
- Python version (`python_full_version >= '3.13'`, `python_full_version < '3.11'`)
- Operating system (`sys_platform == 'linux'`, `sys_platform == 'win32'`, `sys_platform == 'darwin'`)
- CPU architecture (`platform_machine == 'aarch64'`, `platform_machine != 'aarch64'`)
- Package extras (`extra == 'extra-14-dbgpt-acc-auto-cuda124'`)

This enables conditional dependencies such as CUDA-specific PyTorch installations.
Sources: uv.lock4-44
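For illustration, a resolution-marker entry in uv.lock looks roughly like this (the specific marker combinations shown are representative examples, not copied from the actual lock file):

```toml
resolution-markers = [
    "python_full_version >= '3.13' and sys_platform == 'linux'",
    "python_full_version < '3.11' and platform_machine == 'aarch64'",
    "sys_platform == 'darwin' and extra == 'extra-14-dbgpt-acc-auto-cuda124'",
]
```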
Each package's pyproject.toml includes uv-specific configuration:
The managed = true setting indicates that uv manages the virtual environment. The tool.uv.sources section tells uv to resolve workspace packages from local paths during development rather than from PyPI.
Sources: pyproject.toml42-60 packages/dbgpt-core/pyproject.toml171-174
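A representative sketch of these per-package settings (the exact entries vary by package and are illustrative here):

```toml
[tool.uv]
managed = true

[tool.uv.sources]
dbgpt = { workspace = true }
```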
DB-GPT uses Hatchling as the PEP 517 build backend. Each package specifies Hatchling in its pyproject.toml:
Sources: packages/dbgpt-core/pyproject.toml167-169
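The standard PEP 517 declaration for Hatchling looks like this:

```toml
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
```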
Each package configures its build targets to specify which files to include in distribution packages. The tool.hatch.build.targets.wheel section controls wheel generation:
This configuration includes only the `src/dbgpt` package directory in the built wheel.

Sources: packages/dbgpt-core/pyproject.toml194-203 packages/dbgpt-ext/pyproject.toml96-105 packages/dbgpt-app/pyproject.toml54-63
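For dbgpt-core, a sketch of this wheel-target configuration using Hatchling's `packages` key would look like:

```toml
[tool.hatch.build.targets.wheel]
packages = ["src/dbgpt"]
```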
Each pyproject.toml defines standard package metadata:
| Field | Description | Example |
|---|---|---|
name | Package name on PyPI | "dbgpt", "dbgpt-ext" |
version | Current version | "0.7.5" |
description | Package description | Core framework description |
authors | Package maintainers | [{ name = "csunny", email = "..." }] |
license | Software license | "MIT" |
readme | README file | "README.md" |
requires-python | Python version constraint | ">= 3.10" |
Sources: packages/dbgpt-core/pyproject.toml1-13
Sources: packages/dbgpt-core/pyproject.toml14-26 packages/dbgpt-ext/pyproject.toml12-15 packages/dbgpt-app/pyproject.toml12-20 packages/dbgpt-accelerator/dbgpt-acc-auto/pyproject.toml67-103
Packages define optional dependency groups using the project.optional-dependencies section. This allows users to install only the features they need:
| Extra | Purpose | Key Packages |
|---|---|---|
client | HTTP client support | httpx, fastapi, tenacity |
cli | Command-line tools | click, rich, prettytable |
agent | Multi-agent framework | pandas, numpy, mcp |
simple_framework | Basic framework | jinja2, uvicorn, SQLAlchemy, duckdb |
framework | Full framework | alembic, GitPython, graphviz |
hf | HuggingFace models | transformers, sentence-transformers |
llama_cpp | LLAMA.cpp support | llama-cpp-python |
proxy_openai | OpenAI proxy | openai>=1.59.6, tiktoken |
Sources: packages/dbgpt-core/pyproject.toml34-137
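As a sketch based on the table above (package lists are abbreviated, and any version pins beyond those shown in the table are assumptions):

```toml
[project.optional-dependencies]
client = ["httpx", "fastapi", "tenacity"]
cli = ["click", "rich", "prettytable"]
proxy_openai = ["openai>=1.59.6", "tiktoken"]
```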
| Extra | Purpose | Key Packages |
|---|---|---|
rag | Document processing | spacy, pypdf, python-docx |
graph_rag | Knowledge graphs | networkx, dbgpt-tugraph-plugins, neo4j |
storage_milvus | Milvus vector store | pymilvus |
storage_chromadb | ChromaDB vector store | chromadb>=0.4.22 |
datasource_mysql | MySQL connector | mysqlclient==2.1.0 |
datasource_postgres | PostgreSQL connector | psycopg2-binary |
Sources: packages/dbgpt-ext/pyproject.toml27-88
| Extra | Purpose | Key Packages |
|---|---|---|
auto | Auto-detect hardware | torch>=2.2.1 (platform-specific) |
cpu | CPU-only PyTorch | torch>=2.2.1 (CPU version) |
cuda118 | CUDA 11.8 support | torch from pytorch-cu118 index |
cuda121 | CUDA 12.1 support | torch from pytorch-cu121 index |
cuda124 | CUDA 12.4 support | torch from pytorch-cu124 index |
vllm | vLLM inference | vllm>=0.7.0 (Linux only) |
mlx | Apple Silicon acceleration | mlx-lm>=0.25.2 (macOS only) |
quant_bnb | BitsAndBytes quantization | bitsandbytes, accelerate |
quant_gptq | GPTQ quantization | optimum, auto-gptq |
Sources: packages/dbgpt-accelerator/dbgpt-acc-auto/pyproject.toml26-103
The dbgpt-acc-auto package defines custom PyPI indexes for different CUDA versions:
The tool.uv.sources section maps these indexes to specific extras using markers:
Sources: packages/dbgpt-accelerator/dbgpt-acc-auto/pyproject.toml125-143 packages/dbgpt-accelerator/dbgpt-acc-auto/pyproject.toml150-189
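A representative sketch of this pattern, following uv's documented index-plus-extra mapping (index names and URLs here are illustrative):

```toml
[[tool.uv.index]]
name = "pytorch-cu124"
url = "https://download.pytorch.org/whl/cu124"
explicit = true

[tool.uv.sources]
torch = [
    { index = "pytorch-cu124", extra = "cuda124" },
]
```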
The uv configuration specifies mutually exclusive dependency groups using conflicts:
This prevents users from simultaneously installing multiple CUDA versions or mixing CPU and GPU PyTorch builds.
Sources: packages/dbgpt-accelerator/dbgpt-acc-auto/pyproject.toml110-118 packages/dbgpt-core/pyproject.toml176-193
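A sketch of how such mutually exclusive groups are declared with uv's `conflicts` setting (extra names taken from the table above):

```toml
[tool.uv]
conflicts = [
    [
        { extra = "cpu" },
        { extra = "cuda118" },
        { extra = "cuda121" },
        { extra = "cuda124" },
    ],
]
```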
All packages in the DB-GPT monorepo share the same version number: 0.7.5. This synchronized versioning simplifies release management and ensures compatibility across packages.
Each package maintains its version in two locations:
- `pyproject.toml`: the `project.version` field
- `_version.py`: a Python module containing the version string

Version File Locations:

- packages/dbgpt-core/src/dbgpt/_version.py
- packages/dbgpt-ext/src/dbgpt_ext/_version.py
- packages/dbgpt-app/src/dbgpt_app/_version.py
- packages/dbgpt-serve/src/dbgpt_serve/_version.py
- packages/dbgpt-client/src/dbgpt_client/_version.py
- packages/dbgpt-accelerator/dbgpt-acc-auto/src/dbgpt_acc_auto/_version.py
- packages/dbgpt-accelerator/dbgpt-acc-flash-attn/src/dbgpt_acc_flash_attn/_version.py

Sources: packages/dbgpt-core/src/dbgpt/_version.py1-2 packages/dbgpt-ext/src/dbgpt_ext/_version.py1-2 packages/dbgpt-app/src/dbgpt_app/_version.py1-2 packages/dbgpt-serve/src/dbgpt_serve/_version.py1-2 packages/dbgpt-client/src/dbgpt_client/_version.py1-2 packages/dbgpt-accelerator/dbgpt-acc-auto/src/dbgpt_acc_auto/_version.py1-2 packages/dbgpt-accelerator/dbgpt-acc-flash-attn/src/dbgpt_acc_flash_attn/_version.py1-2
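Each `_version.py` module is tiny; as an illustrative sketch (the variable name `version` is an assumption, not verified against the repository):

```python
# Illustrative sketch of a _version.py module, e.g.
# packages/dbgpt-core/src/dbgpt/_version.py
version = "0.7.5"
```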
The GitHub Actions workflow for publishing packages includes an automated version update step using the scripts/update_version_all.py script:
This script updates the version number across all pyproject.toml files and _version.py files in a single operation.
Sources: .github/workflows/python-publish.yml50-60
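The script itself is not reproduced here; as a hedged sketch, a version-update script of this kind might look like the following (the function name, regexes, and file-discovery logic are illustrative assumptions, not the actual contents of scripts/update_version_all.py):

```python
import re
from pathlib import Path


def update_version(root: Path, new_version: str) -> int:
    """Rewrite version strings in every pyproject.toml and
    _version.py under `root`; return the number of files changed."""
    changed = 0
    # Update the `version = "..."` line at the top level of each pyproject.toml
    for path in root.rglob("pyproject.toml"):
        text = path.read_text()
        new_text = re.sub(
            r'(?m)^version\s*=\s*"[^"]+"',
            f'version = "{new_version}"',
            text,
        )
        if new_text != text:
            path.write_text(new_text)
            changed += 1
    # Update the version string in each _version.py module
    for path in root.rglob("_version.py"):
        text = path.read_text()
        new_text = re.sub(
            r'version\s*=\s*"[^"]+"',
            f'version = "{new_version}"',
            text,
        )
        if new_text != text:
            path.write_text(new_text)
            changed += 1
    return changed
```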
The repository includes a Makefile that provides convenient commands for common build tasks. While the exact Makefile content is not shown in the provided files, the GitHub Actions workflow references Make commands:
The `make build` command builds distribution packages (wheels and source distributions) for all packages in the workspace. The output is placed in the `dist/` directory.
Sources: .github/workflows/python-publish.yml65-68
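Since the Makefile contents are not shown, the following is only a hedged sketch of what the build target might contain; the use of `uv build --all-packages` is an assumption:

```makefile
# Illustrative sketch; not the repository's actual Makefile.
build:
	uv build --all-packages --out-dir dist/
```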
Referenced in the version update step, this command likely prepares packages for release.
Sources: .github/workflows/python-publish.yml53
To install all packages from source with development dependencies:
This command, `uv sync --all-packages --dev`, installs every workspace package together with the shared development dependencies into the uv-managed virtual environment.
Sources: .github/workflows/python-publish.yml63
Sources: .github/workflows/python-publish.yml1-88
The publishing workflow can be triggered in two ways:
When a release is published on GitHub, the workflow automatically builds and publishes the packages.
Sources: .github/workflows/python-publish.yml11-13
Developers can manually trigger the workflow with custom parameters:
| Input Parameter | Type | Description |
|---|---|---|
version | string (required) | Package version (e.g., 0.7.0rc0) |
publish_to_testpypi | boolean | Whether to publish to TestPyPI |
publish_to_pypi | boolean | Whether to publish to PyPI |
Sources: .github/workflows/python-publish.yml14-29
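The two triggers described above would be declared roughly as follows (a sketch based on the inputs table, not the actual workflow file):

```yaml
on:
  release:
    types: [published]
  workflow_dispatch:
    inputs:
      version:
        required: true
        type: string
      publish_to_testpypi:
        type: boolean
      publish_to_pypi:
        type: boolean
```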
The workflow executes the following steps:
1. Setup Environment: check out the repository, install uv, and select the Python version from the `.python-version` file
2. Version Management: run `update_version_all.py` to synchronize version numbers
3. Build: run `uv sync --all-packages --dev`, then `make build`, and verify the output with `ls dist/`
4. Artifact Upload: upload the built distributions as workflow artifacts
5. Publishing: publish to TestPyPI when `publish_to_testpypi=true` and to PyPI when `publish_to_pypi=true`

Sources: .github/workflows/python-publish.yml39-88
The workflow authenticates to PyPI and TestPyPI using API tokens stored as GitHub secrets.
Sources: .github/workflows/python-publish.yml77-88
The root pyproject.toml defines a TestPyPI index for testing package uploads before production release:
The explicit = true setting means packages must explicitly opt-in to use this index.
Sources: pyproject.toml25-29
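A sketch of such an index definition (the `publish-url` line is an assumption based on uv's publishing documentation):

```toml
[[tool.uv.index]]
name = "testpypi"
url = "https://test.pypi.org/simple/"
publish-url = "https://test.pypi.org/legacy/"
explicit = true
```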
The dbgpt-acc-flash-attn package requires special build configuration due to the complexity of building the flash-attn package from source. The pyproject.toml disables build isolation:
This configuration:

- installs `torch` first in the "direct" group
- installs build dependencies (such as `setuptools`) in the "build" group
- builds `flash-attn` without isolation so it can access the pre-installed `torch`

Sources: packages/dbgpt-accelerator/dbgpt-acc-flash-attn/pyproject.toml19-32
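A hedged sketch of this pattern (the group contents are illustrative; `no-build-isolation-package` is uv's documented setting for this purpose):

```toml
[dependency-groups]
direct = ["torch>=2.2.1"]
build = ["setuptools", "wheel"]

[tool.uv]
no-build-isolation-package = ["flash-attn"]
```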
Similarly, the dbgpt-acc-auto package disables build isolation for gptqmodel:
Sources: packages/dbgpt-accelerator/dbgpt-acc-auto/pyproject.toml194-196
The workspace root defines pytest configuration that applies to all packages:
This configuration:

- adds the `packages/` directory to the Python path for test discovery
- matches test files with the `test_*.py` or `*_test.py` patterns

Sources: pyproject.toml63-66
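A sketch matching the behavior described above (the exact key values are assumptions):

```toml
[tool.pytest.ini_options]
pythonpath = ["packages"]
python_files = ["test_*.py", "*_test.py"]
```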
The workspace uses Ruff for code formatting and linting, with its key settings defined in the root pyproject.toml.
Sources: pyproject.toml68-86
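The repository's exact Ruff settings are not reproduced here; as a hedged sketch of a typical workspace-level configuration (all values are illustrative assumptions, with the target version inferred from the `requires-python = ">= 3.10"` constraint):

```toml
[tool.ruff]
line-length = 88
target-version = "py310"

[tool.ruff.lint]
select = ["E", "F", "I"]
```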
The workspace includes comprehensive development tooling:
| Tool | Version | Purpose |
|---|---|---|
pytest | >=7.0.0 | Testing framework |
pytest-asyncio | - | Async test support |
pytest-mock | >=3.14.0 | Mocking utilities |
pytest-cov | >=6.0.0 | Coverage reporting |
ruff | >=0.9.1 | Linter and formatter |
mypy | >=1.15.0 | Type checking |
pre-commit | >=4.2.0 | Git hooks |
twine | - | Package publishing |
jupyter | - | Interactive notebooks |
Sources: pyproject.toml44-60
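Based on the table above, the shared development dependencies would be declared roughly as follows (entries shown unpinned correspond to the "-" versions in the table):

```toml
[tool.uv]
dev-dependencies = [
    "pytest>=7.0.0",
    "pytest-asyncio",
    "pytest-mock>=3.14.0",
    "pytest-cov>=6.0.0",
    "ruff>=0.9.1",
    "mypy>=1.15.0",
    "pre-commit>=4.2.0",
    "twine",
    "jupyter",
]
```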