This document explains the contribution workflow for DB-GPT, the CI/CD pipeline architecture, versioning strategy, and the process for publishing packages to PyPI/TestPyPI. This covers how developers can contribute code changes and how maintainers create and publish releases.
For information about setting up your local development environment, see Development Environment Setup. For details on the project structure and build system, see Project Structure and Build System. For testing procedures, see Testing and Quality Assurance.
The DB-GPT project follows a standard fork-and-pull-request workflow for external contributions. Contributors must fork the repository, make changes in a feature branch, and submit a pull request for review.
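The fork-and-branch flow can be sketched with a throwaway local repository (repository name, branch name, and messages are illustrative; a real contribution is pushed to your GitHub fork and submitted as a pull request upstream):

```shell
# Self-contained sketch of the feature-branch workflow using a local repo
set -e
workdir="$(mktemp -d)"
git init -q "$workdir/db-gpt-fork"
cd "$workdir/db-gpt-fork"
git config user.email "you@example.com"
git config user.name "Your Name"
git commit -q --allow-empty -m "initial commit"   # stand-in for the forked history
git checkout -q -b feat/my-change                 # create a feature branch
echo "change" > feature.txt
git add feature.txt
git commit -q -m "feat: describe the change"
git log --oneline -1                              # shows the feature commit
```

In the real workflow, the final steps are `git push origin feat/my-change` followed by opening a pull request against the upstream repository.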
The monorepo contains seven packages that are versioned and released together:
| Package | Path | Purpose |
|---|---|---|
| dbgpt (core) | packages/dbgpt-core/ | Core framework and AWEL |
| dbgpt-ext | packages/dbgpt-ext/ | Storage implementations |
| dbgpt-serve | packages/dbgpt-serve/ | Service layer |
| dbgpt-app | packages/dbgpt-app/ | Application logic |
| dbgpt-client | packages/dbgpt-client/ | Client library |
| dbgpt-acc-auto | packages/dbgpt-accelerator/dbgpt-acc-auto/ | Hardware acceleration |
| dbgpt-acc-flash-attn | packages/dbgpt-accelerator/dbgpt-acc-flash-attn/ | Flash attention support |
All packages share synchronized version numbers, currently at 0.7.5.
Sources: pyproject.toml:1-86 packages/dbgpt-core/pyproject.toml:1-205 packages/dbgpt-ext/pyproject.toml:1-106 packages/dbgpt-app/pyproject.toml:1-65
When contributing code changes:
- Use `uv` to install dependencies: `uv sync --all-packages --dev`
- Use `ruff` for code formatting according to project standards

Sources: pyproject.toml:42-60 pyproject.toml:68-86
DB-GPT uses a synchronized versioning approach across all packages in the monorepo. All packages share the same version number, which simplifies dependency management and ensures compatibility.
Each package contains a _version.py file that defines the current version:
Version File Locations:

- packages/dbgpt-core/src/dbgpt/_version.py
- packages/dbgpt-ext/src/dbgpt_ext/_version.py
- packages/dbgpt-serve/src/dbgpt_serve/_version.py
- packages/dbgpt-app/src/dbgpt_app/_version.py
- packages/dbgpt-client/src/dbgpt_client/_version.py
- packages/dbgpt-accelerator/dbgpt-acc-auto/src/dbgpt_acc_auto/_version.py
- packages/dbgpt-accelerator/dbgpt-acc-flash-attn/src/dbgpt_acc_flash_attn/_version.py
Sources: packages/dbgpt-core/src/dbgpt/_version.py:1-2 packages/dbgpt-ext/src/dbgpt_ext/_version.py:1-2 packages/dbgpt-app/src/dbgpt_app/_version.py:1-2 packages/dbgpt-serve/src/dbgpt_serve/_version.py:1-2 packages/dbgpt-client/src/dbgpt_client/_version.py:1-2 packages/dbgpt-accelerator/dbgpt-acc-auto/src/dbgpt_acc_auto/_version.py:1-2 packages/dbgpt-accelerator/dbgpt-acc-flash-attn/src/dbgpt_acc_flash_attn/_version.py:1-2
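The version files are trivially small; a representative sketch of their contents (the real files may differ slightly in detail):

```python
# Representative contents of a package's _version.py; every package in the
# monorepo carries the same synchronized value (0.7.5 at the time of writing).
version = "0.7.5"
```

Code elsewhere in a package can then import it, e.g. `from dbgpt._version import version`.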
The update_version_all.py script in the scripts/ directory automates version updates across all packages. This script:
- Updates the `version` field in all `pyproject.toml` files
- Updates the `_version.py` files in each package

The script is invoked during the release process via the GitHub Actions workflow.
Sources: .github/workflows/python-publish.yml:50-60
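The core of such a synchronized bump can be sketched as follows. This is a simplified illustration, not the actual contents of `scripts/update_version_all.py`, which may differ in structure and arguments:

```python
# Simplified sketch of a synchronized version bump across a monorepo.
import re
from pathlib import Path


def bump_version(root: Path, new_version: str) -> int:
    """Rewrite the version in every pyproject.toml and _version.py under root.

    Returns the number of files changed.
    """
    changed = 0
    # Update the `version = "..."` field in each pyproject.toml.
    for toml_path in root.rglob("pyproject.toml"):
        text = toml_path.read_text()
        updated = re.sub(
            r'(?m)^version\s*=\s*"[^"]*"',
            f'version = "{new_version}"',
            text,
        )
        if updated != text:
            toml_path.write_text(updated)
            changed += 1
    # Rewrite each package's _version.py.
    for version_file in root.rglob("_version.py"):
        version_file.write_text(f'version = "{new_version}"\n')
        changed += 1
    return changed
```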
Each pyproject.toml file contains package metadata including:
| Field | Description |
|---|---|
name | Package name (e.g., dbgpt, dbgpt-ext) |
version | Current version (synchronized across all packages) |
requires-python | Minimum Python version (>=3.10) |
dependencies | Core dependencies for the package |
[project.optional-dependencies] | Optional feature groups |
[build-system] | Build backend configuration (uses hatchling) |
Sources: packages/dbgpt-core/pyproject.toml:1-26 packages/dbgpt-ext/pyproject.toml:1-25 packages/dbgpt-app/pyproject.toml:1-30
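An illustrative excerpt showing how these fields fit together (values mirror the table above; dependency lists are elided rather than reproduced):

```toml
[project]
name = "dbgpt"
version = "0.7.5"
requires-python = ">=3.10"
dependencies = [
    # core runtime dependencies listed here
]

[project.optional-dependencies]
# optional feature groups, e.g. client, cli, agent, framework

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
```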
The CI/CD pipeline is implemented using GitHub Actions and orchestrates the build, test, and publish process. The main workflow is defined in .github/workflows/python-publish.yml.
The workflow supports two trigger types: publication of a GitHub release, and manual runs via workflow_dispatch.
Sources: .github/workflows/python-publish.yml:1-88
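A sketch of the trigger configuration; the input names follow the options described later in this document, but the real workflow file may differ in detail:

```yaml
on:
  release:
    types: [published]          # production releases publish to PyPI
  workflow_dispatch:            # manual runs, e.g. for TestPyPI
    inputs:
      version:
        description: "Version to publish (e.g. 0.7.0rc0)"
        required: false
      publish_to_testpypi:
        type: boolean
        default: false
      publish_to_pypi:
        type: boolean
        default: false
```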
The python-publish.yml workflow executes the following steps:
| Step | Action | Purpose |
|---|---|---|
| Checkout | actions/checkout@v4 | Clone the repository |
| Install uv | astral-sh/setup-uv@v5 | Install the uv package manager |
| Setup Python | actions/setup-python@v5 | Configure Python from .python-version file |
| Update Version | uv run update_version_all.py | Synchronize version across all packages (manual dispatch only) |
| Install Dependencies | uv sync --all-packages --dev | Install all workspace packages and dev dependencies |
| Build Packages | make build | Build distribution packages (wheels and source distributions) |
| Upload Artifacts | actions/upload-artifact@v4 | Store built packages for 7 days |
| Publish to TestPyPI | pypa/gh-action-pypi-publish@release/v1 | Upload to test.pypi.org (conditional) |
| Publish to PyPI | pypa/gh-action-pypi-publish@release/v1 | Upload to pypi.org (conditional) |
Sources: .github/workflows/python-publish.yml:39-88
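Condensed, the build portion of the job looks roughly like the following (action versions taken from the table above; step names and ordering are illustrative):

```yaml
steps:
  - uses: actions/checkout@v4
  - uses: astral-sh/setup-uv@v5
  - uses: actions/setup-python@v5
    with:
      python-version-file: ".python-version"
  - name: Install dependencies
    run: uv sync --all-packages --dev
  - name: Build packages
    run: make build
  - uses: actions/upload-artifact@v4
    with:
      name: dist
      path: dist/
      retention-days: 7
```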
The workflow uses make build to trigger the build process. The build system uses:
- `hatchling` as the build backend (configured in each package's `[build-system]` section)
- Per-package wheel (`.whl`) and source distribution (`.tar.gz`) outputs
- Wheel contents controlled by each package's `[tool.hatch.build.targets.wheel]` configuration

The build process creates distribution files in the `dist/` directory, which are then uploaded to PyPI registries.
Sources: .github/workflows/python-publish.yml:65-68 packages/dbgpt-core/pyproject.toml:167-203
The workflow requires two secrets for publishing:
| Secret | Usage | Description |
|---|---|---|
TEST_PYPI_API_TOKEN | TestPyPI publishing | API token for https://test.pypi.org |
PYPI_API_TOKEN | PyPI publishing | API token for https://pypi.org |
These secrets must be configured in the repository settings under "Secrets and variables" → "Actions".
Sources: .github/workflows/python-publish.yml:81-82 .github/workflows/python-publish.yml:87-88
DB-GPT supports publishing to both TestPyPI (for testing) and PyPI (for production releases). The publishing process is automated through GitHub Actions but can also be performed manually.
Sources: .github/workflows/python-publish.yml:11-88
TestPyPI (https://test.pypi.org) is used for testing package distribution before production release. To publish to TestPyPI:
1. Trigger the workflow manually (workflow_dispatch)
2. Enter the target `version` (e.g., `0.7.0rc0` for release candidates)
3. Check the `publish_to_testpypi` checkbox
4. Leave `publish_to_pypi` unchecked

The TestPyPI publish step uses `pypa/gh-action-pypi-publish@release/v1` pointed at the TestPyPI upload endpoint.
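Sketched as a workflow step, this looks roughly like the following (`repository-url` is the documented input of `pypa/gh-action-pypi-publish` for non-default indexes; the condition and names are illustrative):

```yaml
- name: Publish to TestPyPI
  if: ${{ inputs.publish_to_testpypi }}
  uses: pypa/gh-action-pypi-publish@release/v1
  with:
    repository-url: https://test.pypi.org/legacy/
    password: ${{ secrets.TEST_PYPI_API_TOKEN }}
```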
Packages published to TestPyPI can be installed by pointing pip at the TestPyPI index while letting dependencies resolve from PyPI, e.g. `pip install --index-url https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple/ dbgpt`.
Sources: .github/workflows/python-publish.yml:77-82 pyproject.toml:25-29
Production releases are published to PyPI (https://pypi.org). This happens automatically when a GitHub release is published, or can be triggered manually:
- Trigger the workflow manually with `publish_to_pypi` enabled

The PyPI publish step uses `pypa/gh-action-pypi-publish@release/v1` authenticated with the `PYPI_API_TOKEN` secret.
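As a workflow step, this looks roughly like the following (illustrative; the action defaults to the production PyPI index when no repository URL is given):

```yaml
- name: Publish to PyPI
  uses: pypa/gh-action-pypi-publish@release/v1
  with:
    password: ${{ secrets.PYPI_API_TOKEN }}
```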
Published packages are immediately available for installation from PyPI, e.g. `pip install dbgpt`.
Sources: .github/workflows/python-publish.yml:84-88
Each package in the monorepo has specific build configuration:
Wheel Package Structure:
The [tool.hatch.build.targets.wheel] section defines what gets included in the built package:
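An illustrative wheel target for dbgpt-core; the exact include/exclude patterns live in each package's `pyproject.toml`:

```toml
[tool.hatch.build.targets.wheel]
packages = ["src/dbgpt"]
exclude = ["**/tests/**", "**/examples/**"]
```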
This ensures that test files and examples are excluded from the distributed package, reducing package size.
Sources: packages/dbgpt-core/pyproject.toml:194-203 packages/dbgpt-ext/pyproject.toml:96-106 packages/dbgpt-app/pyproject.toml:54-63
Creating a new release involves several coordinated steps to ensure all packages are versioned correctly and published successfully.
Sources: .github/workflows/python-publish.yml:11-88
Before creating a release, ensure:

- All CI checks pass on the branch being released
- The target version number follows the versioning scheme described below
- Release notes summarizing the changes are prepared
DB-GPT follows semantic versioning with the format MAJOR.MINOR.PATCH:
| Version Component | Meaning | Example |
|---|---|---|
| MAJOR | Breaking changes | 0.x.x → 1.0.0 |
| MINOR | New features, backward compatible | 0.7.x → 0.8.0 |
| PATCH | Bug fixes, backward compatible | 0.7.5 → 0.7.6 |
| Pre-release suffix | Release candidates, alpha, beta | 0.8.0rc1, 0.8.0a1 |
Current version: 0.7.5 across all packages.
Sources: packages/dbgpt-core/pyproject.toml:3 packages/dbgpt-ext/pyproject.toml:3 packages/dbgpt-app/pyproject.toml:3
To create a new release on GitHub:
1. Navigate to the repository's Releases page and draft a new release
2. Create a new tag such as `v0.7.6` (prefix the version with `v`)
3. Add release notes and publish the release

Upon publishing, the GitHub Actions workflow automatically:

- Builds all packages
- Publishes them to PyPI using `PYPI_API_TOKEN`
- Makes them installable, e.g. `pip install dbgpt==0.7.6`

Sources: .github/workflows/python-publish.yml:11-14
After publishing a release, verify:
1. PyPI Availability: Check https://pypi.org/project/dbgpt/
2. Installation Test: Install in a clean environment
3. All Packages Published: Verify all seven packages are available:
   - dbgpt
   - dbgpt-ext
   - dbgpt-serve
   - dbgpt-app
   - dbgpt-client
   - dbgpt-acc-auto
   - dbgpt-acc-flash-attn
4. Dependency Resolution: Test with different optional dependencies
Sources: pyproject.toml:31-40
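The "all packages published" check can be scripted against the public PyPI JSON API. A sketch (network access is required when the check actually runs; the endpoint shape is PyPI's standard `https://pypi.org/pypi/<name>/json` route):

```python
# Sketch of a post-release availability check using the PyPI JSON API.
import json
import urllib.request

PACKAGES = [
    "dbgpt", "dbgpt-ext", "dbgpt-serve", "dbgpt-app",
    "dbgpt-client", "dbgpt-acc-auto", "dbgpt-acc-flash-attn",
]


def latest_pypi_version(name: str) -> str:
    """Return the latest published version of a package on PyPI."""
    url = f"https://pypi.org/pypi/{name}/json"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["info"]["version"]


# Example usage after a release (requires network):
#   for pkg in PACKAGES:
#       print(pkg, latest_pypi_version(pkg))
```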
If a release has critical issues:

1. Yank the affected version on PyPI so that new installs no longer resolve to it
2. Fix the issue and publish a new patch release (e.g., `0.7.6` → `0.7.7`)
Note: PyPI does not allow deleting or replacing published packages, only yanking them.
The project uses uv as its package manager, which provides fast dependency resolution and reproducible builds through the uv.lock file.
The uv.lock file contains:
- Pinned versions of every direct and transitive dependency
- The resolved Python requirement (`>=3.10`)

The lock file ensures reproducible builds across different environments by pinning exact versions of all transitive dependencies.
Sources: uv.lock:1-10
The monorepo uses uv's workspace feature, defined in the root pyproject.toml (lines 31-40).
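The workspace block looks roughly like the following (member globs are illustrative; the nested accelerator directory is inferred from the package paths listed earlier):

```toml
[tool.uv.workspace]
members = [
    "packages/*",
    "packages/dbgpt-accelerator/*",
]
```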
This allows:
- All workspace packages to be developed and installed together
- Cross-package dependencies to resolve against local workspace sources
- A single `uv.lock` file for the entire monorepo

Sources: pyproject.toml:31-40
Packages define optional dependencies using [project.optional-dependencies]. For example, dbgpt-core defines:
- `client`: HTTP client dependencies
- `cli`: Command-line interface tools
- `agent`: Multi-agent framework dependencies
- `simple_framework`: Core framework dependencies
- `framework`: Extended framework features
- `hf`: HuggingFace transformers support
- `proxy_openai`, `proxy_ollama`, etc.: proxy model providers

Users can install specific feature sets, e.g. `pip install "dbgpt[agent,client]"`.
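The shape of these groups in TOML is as follows (member packages here are invented for illustration; see packages/dbgpt-core/pyproject.toml for the real lists):

```toml
[project.optional-dependencies]
client = ["httpx"]            # HTTP client dependencies (illustrative)
cli = ["click"]               # command-line tooling (illustrative)
hf = ["transformers"]         # HuggingFace support (illustrative)
```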