airbyte_ops_mcp.cli.registry
CLI commands for connector registry operations.
This module provides CLI wrappers for registry operations. The core logic
lives in the `airbyte_ops_mcp.registry` capability module.
Command groups:
- airbyte-ops registry progressive-rollout create - Create the release_candidate/ marker in GCS
- airbyte-ops registry progressive-rollout cleanup - Delete the release_candidate/ marker from GCS
- airbyte-ops registry connector list - List all connectors in registry
- airbyte-ops registry connector-version list - List versions for a connector
- airbyte-ops registry connector-version next - Compute next version tag (prerelease/RC)
- airbyte-ops registry connector-version yank - Mark a connector version as yanked
- airbyte-ops registry connector-version unyank - Remove yank marker from a connector version
- airbyte-ops registry connector-version metadata get - Read connector metadata from GCS
- airbyte-ops registry connector-version artifacts generate - Generate version artifacts locally via docker
- airbyte-ops registry connector-version artifacts publish - Publish version artifacts to GCS
- airbyte-ops registry store mirror --local|--gcs-bucket|--s3-bucket - Mirror entire registry for testing
- airbyte-ops registry store compile --store coral:dev|coral:prod - Compile registry indexes and sync latest/ dirs
- airbyte-ops registry marketing-stubs sync --store coral:prod - Sync connector_stubs.json to GCS
- airbyte-ops registry marketing-stubs check --store coral:prod - Compare local file with GCS
## CLI reference
The commands below are regenerated by `poe docs-generate` via cyclopts's
programmatic docs API; see `docs/generate_cli.py`.
airbyte-ops registry COMMAND
Connector registry operations (GCS metadata service).
Commands:
- connector: Connector listing operations.
- connector-version: Connector version operations (list, yank, unyank, artifacts).
- marketing-stubs: Marketing connector stubs GCS operations (whole-file sync).
- progressive-rollout: Progressive rollout lifecycle operations (create/cleanup rollout marker in GCS).
- store: Whole-registry store operations (mirror, compile).
airbyte-ops registry connector
Connector listing operations.
airbyte-ops registry connector list
airbyte-ops registry connector list [OPTIONS] STORE
List connectors in the registry.
When filters are applied, reads the compiled cloud_registry.json index
for fast lookups. Without filters, falls back to scanning individual
metadata blobs (captures all connectors including OSS-only).
Requires GCS_CREDENTIALS environment variable to be set.
Parameters:
- STORE, --store: Store target (e.g. 'coral:dev', 'coral:prod'). [required]
- --certified-only, --no-certified-only: Include only certified connectors. [default: False]
- --support-level: Exact support level to match (e.g. `certified`, `community`, `archived`). [choices: archived, community, certified]
- --min-support-level: Minimum support level (inclusive). Levels from lowest to highest: `archived`, `community`, `certified`. [choices: archived, community, certified]
- --connector-type: Filter by connector type: `source` or `destination`. [choices: source, destination]
- --language: Filter by implementation language (e.g. `python`, `java`, `manifest-only`). [choices: python, java, low-code, manifest-only]
- --format: Output format: 'json' for JSON array, 'text' for newline-separated. [choices: json, text] [default: json]
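The `--min-support-level` filter relies on the ordering stated in the help text (archived is lowest, certified is highest). A minimal sketch of that inclusive ordering check, using a hypothetical `meets_min_support_level` helper (the real CLI uses its `SupportLevel` enum internally):

```python
# Ordering taken from the option's help text: archived < community < certified.
SUPPORT_LEVEL_ORDER = ["archived", "community", "certified"]


def meets_min_support_level(level: str, minimum: str) -> bool:
    """Return True if `level` is at or above `minimum` (inclusive). Sketch only."""
    return SUPPORT_LEVEL_ORDER.index(level) >= SUPPORT_LEVEL_ORDER.index(minimum)
```

With this ordering, `--min-support-level community` keeps both community and certified connectors while excluding archived ones.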
airbyte-ops registry connector-version
Connector version operations (list, yank, unyank, artifacts).
airbyte-ops registry connector-version metadata
Connector metadata inspection.
Commands:
get: Read a connector version's metadata from the registry.
airbyte-ops registry connector-version metadata get
airbyte-ops registry connector-version metadata get [OPTIONS] NAME STORE
Read a connector version's metadata from the registry.
Returns the full metadata.yaml content for a connector at the specified version.
Requires GCS_CREDENTIALS environment variable to be set.
Parameters:
- NAME, --name: Connector name (e.g., 'source-faker', 'destination-postgres'). [required]
- STORE, --store: Store target (e.g. 'coral:dev', 'coral:prod'). [required]
- --version: Version to read (e.g., 'latest', '1.2.3'). [default: latest]
- --format: Output format: 'json' for JSON, 'raw' for YAML. [choices: json, raw] [default: json]
airbyte-ops registry connector-version artifacts
Version artifact generation and publishing.
Commands:
- generate: Generate version artifacts for a connector locally.
- publish: Publish version artifacts to GCS using fsspec rsync.
airbyte-ops registry connector-version artifacts generate
airbyte-ops registry connector-version artifacts generate METADATA-FILE DOCKER-IMAGE [ARGS]
Generate version artifacts for a connector locally.
Runs the connector's docker image with DEPLOYMENT_MODE=cloud and DEPLOYMENT_MODE=oss to obtain both spec variants, then generates the registry entries (cloud.json, oss.json) by applying registryOverrides from the metadata.
The generated metadata.yaml is enriched with git commit info, SBOM URL,
and (when applicable) components SHA before writing. Validation is run
after generation by default; pass --no-validate to skip.
This is a local-only operation -- no files are uploaded to GCS.
Use artifacts publish to upload generated artifacts to GCS.
Parameters:
- METADATA-FILE, --metadata-file: Path to the connector's metadata.yaml file. [required]
- DOCKER-IMAGE, --docker-image: Docker image to run spec against (e.g., 'airbyte/source-faker:6.2.38'). [required]
- OUTPUT-DIR, --output-dir: Directory to write artifacts to. If not specified, a temp directory is created.
- REPO-ROOT, --repo-root: Root of the Airbyte repo checkout (for resolving doc.md). If not specified, inferred by walking up from metadata-file.
- DRY-RUN, --dry-run, --no-dry-run: Show what would be generated without running docker or writing files. [default: False]
- WITH-VALIDATE, --with-validate, --no-validate: Run metadata validators after generation (default: enabled). Use --no-validate to skip. [default: True]
- WITH-SBOM, --with-sbom, --no-sbom: Generate spdx.json (SBOM) for connectors (default: enabled). Use --no-sbom to skip. [default: True]
- WITH-DEPENDENCY-DUMP, --with-dependency-dump, --no-dependency-dump: Generate dependencies.json for Python connectors (default: enabled). Use --no-dependency-dump to skip. [default: True]
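The description above says the registry entries (cloud.json, oss.json) are generated by applying registryOverrides from the metadata. A rough sketch of what such an override merge could look like; the field names and merge semantics below are illustrative assumptions, not the tool's actual implementation:

```python
def apply_registry_overrides(base: dict, overrides: dict) -> dict:
    """Recursively overlay override values onto the base metadata (sketch only)."""
    merged = dict(base)
    for key, value in overrides.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            # Nested dicts merge key-by-key rather than replacing wholesale.
            merged[key] = apply_registry_overrides(merged[key], value)
        else:
            merged[key] = value
    return merged


# Hypothetical metadata fragment; a real metadata.yaml has many more fields.
base = {"name": "source-faker", "dockerImageTag": "6.2.38", "limits": {"maxSyncs": 10}}
cloud_entry = apply_registry_overrides(base, {"limits": {"maxSyncs": 5}})
```

The base metadata is left untouched, so the same dict can be reused to derive both the cloud and OSS variants.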
airbyte-ops registry connector-version artifacts publish
airbyte-ops registry connector-version artifacts publish [OPTIONS] NAME VERSION ARTIFACTS-DIR STORE
Publish version artifacts to GCS using fsspec rsync.
Uploads locally generated artifacts (from artifacts generate) to the
versioned path in GCS. By default, metadata is validated before upload;
pass --no-validate to skip.
Uses --store to select the destination store and environment:
- `coral:dev` → coral dev bucket at root
- `coral:prod` → coral prod bucket at root
- `coral:dev/aj-test100` → coral dev bucket under `aj-test100/` prefix
Requires GCS_CREDENTIALS environment variable to be set.
Parameters:
- NAME, --name: Connector name (e.g., 'source-faker'). [required]
- VERSION, --version: Version to publish artifacts for (e.g., '1.2.3'). [required]
- ARTIFACTS-DIR, --artifacts-dir: Directory containing generated artifacts to publish (from 'artifacts generate'). [required]
- STORE, --store: Store target (e.g. 'coral:dev', 'coral:prod', 'coral:dev/prefix'). [required]
- --dry-run, --no-dry-run: Show what would be published without writing to GCS. [default: False]
- --with-validate, --no-validate: Validate metadata before uploading (default: enabled). Use --no-validate to skip. [default: True]
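Store targets follow the 'registry:env[/prefix]' shape shown in the examples ('coral:dev', 'coral:prod', 'coral:dev/aj-test100'). A minimal parser sketch assuming only that shape; the CLI's own `resolve_registry_store` may accept additional forms:

```python
def parse_store_target(store: str) -> tuple[str, str, str]:
    """Split a store target into (registry, environment, prefix). Sketch only."""
    registry, _, rest = store.partition(":")
    env, _, prefix = rest.partition("/")
    if not registry or not env:
        raise ValueError(f"Malformed store target: {store!r}")
    return registry, env, prefix
```

An empty prefix means artifacts land at the bucket root; a non-empty prefix is prepended to every blob path.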
airbyte-ops registry connector-version next
airbyte-ops registry connector-version next NAME SHA [ARGS]
Compute the next version tag for a connector.
Outputs the version tag to stdout for easy capture in shell scripts. This is the single source of truth for pre-release version format.
The command fetches the connector's metadata.yaml from GitHub at the given SHA to determine the base version. It also compares against the master branch and prints a warning to stderr if no version bump is detected.
If --base-version is provided, it is used directly instead of fetching from GitHub.
Parameters:
- NAME, --name: Connector name (e.g., 'source-github'). [required]
- SHA, --sha: Git commit SHA (full or at least 7 characters). [required]
- BASE-VERSION, --base-version: Base version override. If not provided, fetched from metadata.yaml at the given SHA.
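The module's docstring examples show that base version 1.2.3 with SHA `abcdef1234567` yields `1.2.3-preview.abcdef1`. That observed format can be sketched as below; the command itself remains the source of truth (it delegates to `compute_prerelease_docker_image_tag` internally), and may handle edge cases differently:

```python
def prerelease_tag(base_version: str, sha: str) -> str:
    """Approximate the '<base>-preview.<7-char sha>' format from the examples."""
    if len(sha) < 7:
        raise ValueError("SHA must be at least 7 characters")
    return f"{base_version}-preview.{sha[:7]}"
```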
airbyte-ops registry connector-version list
airbyte-ops registry connector-version list [OPTIONS] NAME STORE
List all versions of a connector in the registry.
Scans the registry bucket to find all versions of a specific connector.
Requires GCS_CREDENTIALS environment variable to be set.
Parameters:
- NAME, --name: Connector name (e.g., 'source-faker'). [required]
- STORE, --store: Store target (e.g. 'coral:dev', 'coral:prod'). [required]
- --format: Output format: 'json' for JSON array, 'text' for newline-separated. [choices: json, text] [default: json]
airbyte-ops registry connector-version yank
airbyte-ops registry connector-version yank [OPTIONS] NAME VERSION STORE
Mark a connector version as yanked.
Writes a version-yank.yml marker file to the version's directory in GCS. Yanked versions are excluded when determining the latest version of a connector.
Requires GCS_CREDENTIALS environment variable to be set.
Parameters:
- NAME, --name: Connector name (e.g., 'source-faker'). [required]
- VERSION, --version: Version to yank (e.g., '1.2.3'). [required]
- STORE, --store: Store target (e.g. 'coral:dev', 'coral:prod'). [required]
- --reason: Reason for yanking this version. [default: ""]
- --dry-run, --no-dry-run: Show what would be done without making changes. [default: False]
airbyte-ops registry connector-version unyank
airbyte-ops registry connector-version unyank [OPTIONS] NAME VERSION STORE
Remove the yank marker from a connector version.
Deletes the version-yank.yml marker file from the version's directory in GCS, making the version eligible again when determining the latest version.
Requires GCS_CREDENTIALS environment variable to be set.
Parameters:
- NAME, --name: Connector name (e.g., 'source-faker'). [required]
- VERSION, --version: Version to unyank (e.g., '1.2.3'). [required]
- STORE, --store: Store target (e.g. 'coral:dev', 'coral:prod'). [required]
- --dry-run, --no-dry-run: Show what would be done without making changes. [default: False]
airbyte-ops registry store
Whole-registry store operations (mirror, compile).
airbyte-ops registry store mirror
airbyte-ops registry store mirror [ARGS]
Create a mirror of the connector registry from the source store.
Reads all connector metadata from the source store and copies it to the specified output target. Supports local filesystem, GCS, and S3 as output targets via fsspec.
Output targets are mutually exclusive: specify exactly one of --local, --gcs-bucket, or --s3-bucket.
The production bucket is categorically disallowed as an output target.
To clean up legacy artifacts (e.g. disabled strict-encrypt connectors)
after mirroring, run `compile` with `--with-legacy-migration v1`:

    airbyte-ops registry store compile --store coral:dev/my-prefix \
        --with-legacy-migration v1
Parameters:
- LOCAL, --local: Write output to a local directory. Mutually exclusive with --gcs-bucket and --s3-bucket. [default: False]
- GCS-BUCKET, --gcs-bucket: Write output to a GCS bucket. Must not be the prod bucket. Mutually exclusive with --local and --s3-bucket.
- S3-BUCKET, --s3-bucket: Write output to an S3 bucket. Mutually exclusive with --local and --gcs-bucket.
- OUTPUT-PATH-ROOT, --output-path-root: Root path/prefix for the output. For --local, this is a directory path (defaults to a new temp dir if omitted). For --gcs-bucket/--s3-bucket, this prefix is prepended to all blob paths.
- DRY-RUN, --dry-run, --no-dry-run: Show what would be rebuilt without writing any files. [default: False]
- SOURCE-STORE, --source-store: Source store to read from (e.g. 'coral:prod', 'coral:dev'). [default: coral:prod]
- CONNECTOR-NAME, --connector-name, --empty-connector-name: Only rebuild these connectors (by name). Can be specified multiple times, e.g. --connector-name source-faker --connector-name destination-bigquery.
airbyte-ops registry store compile
airbyte-ops registry store compile [OPTIONS] STORE
Compile the registry: sync latest/ dirs, write global and per-connector indexes.
Scans all version directories in the target store, determines the latest GA
semver per connector (excluding yanked and pre-release versions), ensures
each latest/ directory matches the computed latest, and writes:
- `registries/v0/cloud_registry.json` -- global cloud registry index
- `registries/v0/oss_registry.json` -- global OSS registry index
- `metadata/airbyte/<connector>/versions.json` -- per-connector version index
With --with-secrets-mask, also regenerates:
- `registries/v0/specs_secrets_mask.yaml` -- properties marked as secrets
With --with-legacy-migration=v1, deletes cloud.json / oss.json
files for connectors whose registryOverrides.cloud.enabled or
registryOverrides.oss.enabled is false.
With --force, resyncs all latest/ directories even if the version marker
matches the computed latest version.
Uses efficient glob patterns for scanning (no file downloads during discovery).
Requires GCS_CREDENTIALS environment variable to be set.
Parameters:
- STORE, --store: Store target (e.g. 'coral:dev', 'coral:prod', 'coral:dev/prefix'). [required]
- --connector-name, --empty-connector-name: Only compile these connectors (can be repeated).
- --dry-run, --no-dry-run: Show what would be done without writing. [default: False]
- --with-secrets-mask, --no-with-secrets-mask: Also regenerate specs_secrets_mask.yaml by scanning all connector specs for airbyte_secret properties. [default: False]
- --with-legacy-migration: Run a one-time legacy migration step during compile. Currently supported: 'v1', which deletes cloud.json / oss.json files for connectors whose registryOverrides.cloud.enabled or registryOverrides.oss.enabled is false. This cleans up artifacts produced by the legacy pipeline that did not respect the enabled flag.
- --force, --no-force: Force resync of latest/ directories even if version markers are current. Useful when metadata changes without a version bump. [default: False]
airbyte-ops registry store delete-dev-latest
airbyte-ops registry store delete-dev-latest [OPTIONS] STORE
Delete all latest/ directories from a dev registry store.
Discovers every connector that has a latest/ directory and
deletes each one in parallel using a thread pool.
This is useful before a full re-compile to prove that latest/ directories can be correctly regenerated from versioned data.
Only dev stores are allowed (store must begin with 'coral:dev').
Requires GCS_CREDENTIALS environment variable to be set.
Parameters:
- STORE, --store: Store target (must begin with 'coral:dev'). [required]
- --connector-name, --empty-connector-name: Only delete latest/ for these connectors (can be repeated).
- --dry-run, --no-dry-run: Show what would be done without deleting. [default: False]
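The deletion pattern described above (discover the latest/ directories, then delete each one in parallel on a thread pool) looks roughly like this local-filesystem stand-in; the actual command deletes GCS prefixes, and the helper name here is hypothetical:

```python
import shutil
import tempfile
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path


def delete_latest_dirs(latest_dirs: list[Path], max_workers: int = 8) -> int:
    """Delete each latest/ directory in parallel (local stand-in for GCS)."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # Consume the iterator so every deletion actually runs.
        list(pool.map(lambda p: shutil.rmtree(p, ignore_errors=True), latest_dirs))
    return len(latest_dirs)


# Usage against a throwaway directory tree:
root = Path(tempfile.mkdtemp())
dirs = []
for name in ("source-faker", "source-github"):
    d = root / name / "latest"
    d.mkdir(parents=True)
    (d / "metadata.yaml").write_text("stub")
    dirs.append(d)
deleted = delete_latest_dirs(dirs)
```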
airbyte-ops registry store compare
airbyte-ops registry store compare [OPTIONS] STORE REFERENCE-STORE
Compare a store against a reference store and report differences.
Evaluates the --store target against --reference-store and reports
per-connector artifact diffs and global index diffs.
Requires GCS_CREDENTIALS environment variable to be set.
Parameters:
- STORE, --store: Store target being evaluated (e.g. 'coral:dev/20260306-mirror-compile'). [required]
- REFERENCE-STORE, --reference-store: Known-good reference store to compare against. [required]
- --connector-name, --empty-connector-name: Only compare these connectors (can be repeated).
- --with-artifacts, --no-artifacts: Compare per-connector artifact files (metadata.yaml, cloud.json, oss.json, spec.json). [default: True]
- --with-indexes, --no-indexes: Compare global registry index files (cloud_registry.json, oss_registry.json). [default: True]
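Conceptually, the comparison reduces to classifying each artifact path as missing, extra, or changed relative to the reference. A sketch over hypothetical path-to-checksum maps (the real command compares actual GCS contents):

```python
def diff_artifacts(
    store: dict[str, str], reference: dict[str, str]
) -> dict[str, list[str]]:
    """Classify blob paths by how the evaluated store differs from the
    reference. Keys are paths, values are content checksums. Sketch only."""
    return {
        "missing": sorted(set(reference) - set(store)),
        "extra": sorted(set(store) - set(reference)),
        "changed": sorted(
            k for k in store.keys() & reference.keys() if store[k] != reference[k]
        ),
    }


diff = diff_artifacts(
    {"metadata.yaml": "aaa", "cloud.json": "bbb"},
    {"metadata.yaml": "aaa", "cloud.json": "ccc", "oss.json": "ddd"},
)
```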
airbyte-ops registry progressive-rollout
Progressive rollout lifecycle operations (create/cleanup rollout marker in GCS).
airbyte-ops registry progressive-rollout create
airbyte-ops registry progressive-rollout create [OPTIONS] NAME STORE
Create the release_candidate/ metadata marker in GCS.
Copies the versioned metadata (e.g. 1.2.3-rc.1/metadata.yaml or
2.1.0/metadata.yaml) to release_candidate/metadata.yaml so the
platform knows a progressive rollout is active for this connector.
The connector's metadata.yaml on disk must declare a version that is valid for progressive rollout (i.e. not a -preview build) and the versioned blob must already exist in GCS.
Uses --store to select the registry store and environment.
Requires GCS_CREDENTIALS environment variable to be set.
Parameters:
- NAME, --name: Connector technical name (e.g., source-github). [required]
- STORE, --store: Store target (e.g. 'coral:dev', 'coral:prod'). [required]
- --repo-path: Path to the Airbyte monorepo. Defaults to current directory.
- --dry-run, --no-dry-run: Show what would be done without making changes. [default: False]
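The eligibility rule above ("not a -preview build") is simple to express directly. An illustrative predicate consistent with the examples, where both `1.2.3-rc.1` and plain `2.1.0` are valid for rollout:

```python
def is_rollout_eligible(version: str) -> bool:
    """A version qualifies for progressive rollout unless it is a -preview
    build. Illustrative only; the command also requires the versioned
    blob to already exist in GCS before creating the marker."""
    return "-preview" not in version
```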
airbyte-ops registry progressive-rollout cleanup
airbyte-ops registry progressive-rollout cleanup [OPTIONS] NAME STORE
Delete the release_candidate/ metadata marker from GCS.
This command deletes only the release_candidate/metadata.yaml file
for the given connector. The versioned directory (e.g. 1.2.3-rc.1/)
is intentionally preserved as an audit trail of what was actually deployed
during the progressive rollout.
Both promote and rollback workflows call this same command -- the only
difference between the two is in the git-side steps (version bump vs.
not), which are handled by the finalize_rollout GitHub Actions
workflow.
Uses --store to select the registry store and environment.
Requires GCS_CREDENTIALS environment variable to be set.
Parameters:
- NAME, --name: Connector technical name (e.g., source-github). [required]
- STORE, --store: Store target (e.g. 'coral:dev', 'coral:prod'). [required]
- --repo-path: Path to the Airbyte monorepo. Defaults to current directory.
- --dry-run, --no-dry-run: Show what would be done without making changes. [default: False]
airbyte-ops registry marketing-stubs
Marketing connector stubs GCS operations (whole-file sync).
airbyte-ops registry marketing-stubs check
airbyte-ops registry marketing-stubs check [OPTIONS] STORE
Compare local connector_stubs.json with the version in GCS.
This command reads the entire local connector_stubs.json file and compares it with the version currently published in GCS.
Exit codes:
- 0: Local file matches GCS (check passed)
- 1: Differences found (check failed)
Output:
- STDOUT: JSON representation of the comparison result
- STDERR: Informational messages and comparison details
Parameters:
- STORE, --store: Store target (e.g. 'coral:dev', 'coral:prod'). [required]
- --repo-root: Path to the airbyte-enterprise repository root. Defaults to current directory.
airbyte-ops registry marketing-stubs sync
airbyte-ops registry marketing-stubs sync [OPTIONS] STORE
Sync local connector_stubs.json to GCS.
This command uploads the entire local connector_stubs.json file to GCS, replacing the existing file. Use this after merging changes to master in the airbyte-enterprise repository.
Exit codes:
- 0: Sync successful (or dry-run completed)
- 1: Error (file not found, validation failed, etc.)
Output:
- STDOUT: JSON representation of the sync result
- STDERR: Informational messages and status updates
Parameters:
- STORE, --store: Store target (e.g. 'coral:dev', 'coral:prod'). [required]
- --repo-root: Path to the airbyte-enterprise repository root. Defaults to current directory.
- --dry-run, --no-dry-run: Show what would be uploaded without making changes. [default: False]
1# Copyright (c) 2025 Airbyte, Inc., all rights reserved. 2"""CLI commands for connector registry operations. 3 4This module provides CLI wrappers for registry operations. The core logic 5lives in the `airbyte_ops_mcp.registry` capability module. 6 7Command groups: 8 airbyte-ops registry rc create - Create the release_candidate/ marker in GCS 9 airbyte-ops registry rc cleanup - Delete the release_candidate/ marker from GCS 10 airbyte-ops registry connector list - List all connectors in registry 11 airbyte-ops registry connector-version list - List versions for a connector 12 airbyte-ops registry connector-version next - Compute next version tag (prerelease/RC) 13 airbyte-ops registry connector-version yank - Mark a connector version as yanked 14 airbyte-ops registry connector-version unyank - Remove yank marker from a connector version 15 airbyte-ops registry connector-version metadata get - Read connector metadata from GCS 16 airbyte-ops registry connector-version artifacts generate - Generate version artifacts locally via docker 17 airbyte-ops registry connector-version artifacts publish - Publish version artifacts to GCS 18 airbyte-ops registry store mirror --local|--gcs-bucket|--s3-bucket - Mirror entire registry for testing 19 airbyte-ops registry store compile --store coral:dev|coral:prod - Compile registry indexes and sync latest/ dirs 20 airbyte-ops registry marketing-stubs sync --store coral:prod - Sync connector_stubs.json to GCS 21 airbyte-ops registry marketing-stubs check --store coral:prod - Compare local file with GCS 22 23## CLI reference 24 25The commands below are regenerated by `poe docs-generate` via cyclopts's 26programmatic docs API; see `docs/generate_cli.py`. 27 28.. include:: ../../../docs/generated/cli/registry.md 29 :start-line: 2 30""" 31 32from __future__ import annotations 33 34# Hide Python-level members from the pdoc page for this module; the rendered 35# docs for this CLI group come entirely from the grafted `.. 
include::` in 36# the module docstring above. 37__all__: list[str] = [] 38 39import contextlib 40import sys 41from pathlib import Path 42from typing import Annotated, Literal 43 44import yaml 45from cyclopts import Parameter 46 47from airbyte_ops_mcp.cli._base import App, app 48from airbyte_ops_mcp.cli._shared import ( 49 error_console, 50 exit_with_error, 51 print_error, 52 print_json, 53 print_success, 54) 55from airbyte_ops_mcp.github_api import ( 56 get_file_contents_at_ref, 57 resolve_default_github_token, 58) 59from airbyte_ops_mcp.mcp.prerelease import ( 60 compute_prerelease_docker_image_tag, 61) 62from airbyte_ops_mcp.registry import ( 63 resolve_registry_store, 64) 65from airbyte_ops_mcp.registry._enums import ( 66 ConnectorLanguage, 67 ConnectorType, 68 SupportLevel, 69) 70from airbyte_ops_mcp.registry.compare import compare_stores 71from airbyte_ops_mcp.registry.connector_stubs import ( 72 CONNECTOR_STUBS_FILE, 73) 74from airbyte_ops_mcp.registry.generate import generate_version_artifacts 75from airbyte_ops_mcp.registry.registry_store_base import ( 76 Registry, 77 get_registry, 78) 79 80# Create the registry sub-app 81registry_app = App( 82 name="registry", help="Connector registry operations (GCS metadata service)." 
83) 84app.command(registry_app) 85 86# Create the connector sub-app under registry 87connector_app = App(name="connector", help="Connector listing operations.") 88registry_app.command(connector_app) 89 90# Create the connector-version sub-app under registry 91connector_version_app = App( 92 name="connector-version", 93 help="Connector version operations (list, yank, unyank, artifacts).", 94) 95registry_app.command(connector_version_app) 96 97# Create the metadata sub-app under connector-version 98metadata_app = App( 99 name="metadata", 100 help="Connector metadata inspection.", 101) 102connector_version_app.command(metadata_app) 103 104# Create the artifacts sub-app under connector-version 105artifacts_app = App( 106 name="artifacts", 107 help="Version artifact generation and publishing.", 108) 109connector_version_app.command(artifacts_app) 110 111# Create the store sub-app under registry (whole-registry operations) 112store_app = App( 113 name="store", 114 help="Whole-registry store operations (mirror, compile).", 115) 116registry_app.command(store_app) 117 118# Create the progressive-rollout sub-app under registry 119progressive_rollout_app = App( 120 name="progressive-rollout", 121 help="Progressive rollout lifecycle operations (create/cleanup rollout marker in GCS).", 122) 123registry_app.command(progressive_rollout_app) 124 125# Create the marketing-stubs sub-app under registry (for whole-file GCS operations) 126marketing_stubs_app = App( 127 name="marketing-stubs", 128 help="Marketing connector stubs GCS operations (whole-file sync).", 129) 130registry_app.command(marketing_stubs_app) 131 132 133AIRBYTE_REPO_OWNER = "airbytehq" 134AIRBYTE_ENTERPRISE_REPO_NAME = "airbyte-enterprise" 135AIRBYTE_REPO_NAME = "airbyte" 136CONNECTOR_PATH_PREFIX = "airbyte-integrations/connectors" 137 138 139def _resolve_store(store: str) -> Registry: 140 """Resolve `--store` to a `Registry` instance. 
141 142 Wraps `resolve_registry_store` and `get_registry`, converting 143 `ValueError` into a user-friendly CLI error via `exit_with_error`. 144 """ 145 try: 146 resolved = resolve_registry_store(store=store) 147 except ValueError as e: 148 exit_with_error(str(e)) 149 raise # unreachable; satisfies type checker 150 return get_registry(resolved) 151 152 153def _get_connector_version_from_github( 154 connector_name: str, 155 ref: str, 156 token: str | None = None, 157) -> str | None: 158 """Fetch connector version from metadata.yaml via GitHub API. 159 160 Args: 161 connector_name: Connector name (e.g., "source-github") 162 ref: Git ref (commit SHA, branch name, or tag) 163 token: GitHub API token (optional for public repos) 164 165 Returns: 166 Version string from metadata.yaml, or None if not found. 167 """ 168 path = f"{CONNECTOR_PATH_PREFIX}/{connector_name}/metadata.yaml" 169 contents = get_file_contents_at_ref( 170 owner=AIRBYTE_REPO_OWNER, 171 repo=AIRBYTE_REPO_NAME, 172 path=path, 173 ref=ref, 174 token=token, 175 ) 176 if contents is None: 177 return None 178 179 metadata = yaml.safe_load(contents) 180 return metadata.get("data", {}).get("dockerImageTag") 181 182 183@connector_version_app.command(name="next") 184def compute_next_version( 185 name: Annotated[ 186 str, 187 Parameter(help="Connector name (e.g., 'source-github')."), 188 ], 189 sha: Annotated[ 190 str, 191 Parameter(help="Git commit SHA (full or at least 7 characters)."), 192 ], 193 base_version: Annotated[ 194 str | None, 195 Parameter( 196 help="Base version override. If not provided, fetched from metadata.yaml at the given SHA." 197 ), 198 ] = None, 199) -> None: 200 """Compute the next version tag for a connector. 201 202 Outputs the version tag to stdout for easy capture in shell scripts. 203 This is the single source of truth for pre-release version format. 204 205 The command fetches the connector's metadata.yaml from GitHub at the given SHA 206 to determine the base version. 
It also compares against the master branch and 207 prints a warning to stderr if no version bump is detected. 208 209 If --base-version is provided, it is used directly instead of fetching from GitHub. 210 211 Examples: 212 airbyte-ops registry connector-version next --name source-github --sha abcdef1234567 213 # Output: 1.2.3-preview.abcdef1 214 215 airbyte-ops registry connector-version next --name source-github --sha abcdef1234567 --base-version 1.2.3 216 # Output: 1.2.3-preview.abcdef1 (uses provided version, skips GitHub API) 217 """ 218 # Try to get a GitHub token (optional, but helps avoid rate limiting) 219 # Token resolution may fail if no token is configured, which is fine for public repos 220 token: str | None = None 221 with contextlib.suppress(ValueError): 222 token = resolve_default_github_token() 223 224 # Determine base version 225 version: str 226 if base_version: 227 version = base_version 228 else: 229 # Fetch version from metadata.yaml at the given SHA 230 fetched_version = _get_connector_version_from_github(name, sha, token) 231 if fetched_version is None: 232 print( 233 f"Error: Could not fetch metadata.yaml for {name} at ref {sha}", 234 file=sys.stderr, 235 ) 236 sys.exit(1) 237 version = fetched_version 238 239 # Compare with master branch version and warn if no bump detected 240 master_version = _get_connector_version_from_github(name, "master", token) 241 if master_version and master_version == version: 242 print( 243 f"Warning: No version bump detected for {name}. 
" 244 f"Version {version} matches master branch.", 245 file=sys.stderr, 246 ) 247 248 # Compute and output the prerelease tag 249 tag = compute_prerelease_docker_image_tag(version, sha) 250 print(tag) 251 252 253# ============================================================================= 254# PROGRESSIVE ROLLOUT COMMANDS 255# ============================================================================= 256 257 258@progressive_rollout_app.command(name="create") 259def progressive_rollout_create( 260 name: Annotated[ 261 str, 262 Parameter(help="Connector technical name (e.g., source-github)."), 263 ], 264 store: Annotated[ 265 str, 266 Parameter( 267 help="Store target (e.g. 'coral:dev', 'coral:prod').", 268 ), 269 ], 270 *, 271 repo_path: Annotated[ 272 Path, 273 Parameter(help="Path to the Airbyte monorepo. Defaults to current directory."), 274 ] = Path.cwd(), 275 dry_run: Annotated[ 276 bool, 277 Parameter(help="Show what would be done without making changes."), 278 ] = False, 279) -> None: 280 """Create the release_candidate/ metadata marker in GCS. 281 282 Copies the versioned metadata (e.g. `1.2.3-rc.1/metadata.yaml` or 283 `2.1.0/metadata.yaml`) to `release_candidate/metadata.yaml` so the 284 platform knows a progressive rollout is active for this connector. 285 286 The connector's metadata.yaml on disk must declare a version that 287 is valid for progressive rollout (i.e. not a -preview build) and the 288 versioned blob must already exist in GCS. 289 290 Uses `--store` to select the registry store and environment. 291 292 Requires `GCS_CREDENTIALS` environment variable to be set. 
293 """ 294 if not repo_path.exists(): 295 exit_with_error(f"Repository path not found: {repo_path}") 296 297 registry = _resolve_store(store) 298 299 result = registry.progressive_rollout_create( 300 repo_path=repo_path, 301 connector_name=name, 302 dry_run=dry_run, 303 ) 304 305 print_json(result.model_dump()) 306 307 if result.status == "failure": 308 exit_with_error(result.message or "Operation failed", code=1) 309 310 311@progressive_rollout_app.command(name="cleanup") 312def progressive_rollout_cleanup( 313 name: Annotated[ 314 str, 315 Parameter(help="Connector technical name (e.g., source-github)."), 316 ], 317 store: Annotated[ 318 str, 319 Parameter( 320 help="Store target (e.g. 'coral:dev', 'coral:prod').", 321 ), 322 ], 323 *, 324 repo_path: Annotated[ 325 Path, 326 Parameter(help="Path to the Airbyte monorepo. Defaults to current directory."), 327 ] = Path.cwd(), 328 dry_run: Annotated[ 329 bool, 330 Parameter(help="Show what would be done without making changes."), 331 ] = False, 332) -> None: 333 """Delete the release_candidate/ metadata marker from GCS. 334 335 This command deletes *only* the `release_candidate/metadata.yaml` file 336 for the given connector. The versioned directory (e.g. `1.2.3-rc.1/`) 337 is intentionally preserved as an audit trail of what was actually deployed 338 during the progressive rollout. 339 340 Both promote and rollback workflows call this same command -- the only 341 difference between the two is in the git-side steps (version bump vs. 342 not), which are handled by the `finalize_rollout` GitHub Actions 343 workflow. 344 345 Uses `--store` to select the registry store and environment. 346 347 Requires `GCS_CREDENTIALS` environment variable to be set. 
348 """ 349 if not repo_path.exists(): 350 exit_with_error(f"Repository path not found: {repo_path}") 351 352 registry = _resolve_store(store) 353 354 result = registry.progressive_rollout_cleanup( 355 repo_path=repo_path, 356 connector_name=name, 357 dry_run=dry_run, 358 ) 359 360 print_json(result.model_dump()) 361 362 if result.status == "failure": 363 exit_with_error(result.message or "Operation failed", code=1) 364 365 366# ============================================================================= 367# REGISTRY I/O - READ COMMANDS 368# ============================================================================= 369 370 371@metadata_app.command(name="get") 372def get_connector_version_metadata_cmd( 373 name: Annotated[ 374 str, 375 Parameter( 376 help="Connector name (e.g., 'source-faker', 'destination-postgres')." 377 ), 378 ], 379 store: Annotated[ 380 str, 381 Parameter( 382 help="Store target (e.g. 'coral:dev', 'coral:prod').", 383 ), 384 ], 385 *, 386 version: Annotated[ 387 str, 388 Parameter(help="Version to read (e.g., 'latest', '1.2.3')."), 389 ] = "latest", 390 format: Annotated[ 391 Literal["json", "raw"], 392 Parameter(help="Output format: 'json' for JSON, 'raw' for YAML."), 393 ] = "json", 394) -> None: 395 """Read a connector version's metadata from the registry. 396 397 Returns the full metadata.yaml content for a connector at the specified version. 398 399 Requires GCS_CREDENTIALS environment variable to be set. 

    Examples:
        airbyte-ops registry connector-version metadata get --name source-faker --store coral:dev
        airbyte-ops registry connector-version metadata get --name source-faker --store coral:dev --version 6.2.38
        airbyte-ops registry connector-version metadata get --name source-faker --store coral:prod
    """
    registry = _resolve_store(store)

    try:
        metadata = registry.get_connector_metadata(
            connector_name=name,
            version=version,
        )
    except FileNotFoundError as e:
        exit_with_error(str(e), code=1)
    except Exception as e:
        exit_with_error(f"Error reading metadata: {e}", code=1)

    if format == "json":
        print_json(metadata)
    else:
        print(yaml.dump(metadata, default_flow_style=False))


@connector_app.command(name="list")
def list_connectors_cmd(
    store: Annotated[
        str,
        Parameter(
            help="Store target (e.g. 'coral:dev', 'coral:prod').",
        ),
    ],
    *,
    certified_only: Annotated[
        bool,
        Parameter(help="Include only certified connectors."),
    ] = False,
    support_level: Annotated[
        SupportLevel | None,
        Parameter(
            help=(
                "Exact support level to match "
                "(e.g. `certified`, `community`, `archived`)."
            )
        ),
    ] = None,
    min_support_level: Annotated[
        SupportLevel | None,
        Parameter(
            help=(
                "Minimum support level (inclusive). "
                "Levels from lowest to highest: `archived`, `community`, `certified`."
            )
        ),
    ] = None,
    connector_type: Annotated[
        ConnectorType | None,
        Parameter(help="Filter by connector type: `source` or `destination`."),
    ] = None,
    language: Annotated[
        ConnectorLanguage | None,
        Parameter(
            help=(
                "Filter by implementation language "
                "(e.g. `python`, `java`, `manifest-only`)."
            )
        ),
    ] = None,
    format: Annotated[
        Literal["json", "text"],
        Parameter(
            help="Output format: 'json' for JSON array, 'text' for newline-separated."
        ),
    ] = "json",
) -> None:
    """List connectors in the registry.

    When filters are applied, reads the compiled `cloud_registry.json` index
    for fast lookups. Without filters, falls back to scanning individual
    metadata blobs (captures all connectors including OSS-only).

    Requires GCS_CREDENTIALS environment variable to be set.
    """
    registry = _resolve_store(store)

    # `--certified-only` is sugar for `--support-level certified`.
    effective_support_level = support_level
    if certified_only:
        if support_level and support_level != SupportLevel.CERTIFIED:
            exit_with_error(
                "`--certified-only` conflicts with `--support-level "
                f"{support_level}`. Use one or the other.",
                code=1,
            )
        effective_support_level = SupportLevel.CERTIFIED

    try:
        connectors = registry.list_connectors(
            support_level=effective_support_level,
            min_support_level=min_support_level,
            connector_type=connector_type,
            language=language,
        )
    except Exception as e:
        exit_with_error(f"Error listing connectors: {e}", code=1)

    if format == "json":
        print_json({"connectors": connectors, "count": len(connectors)})
    else:
        for connector in connectors:
            print(connector)


@connector_version_app.command(name="list")
def list_connector_versions_cmd(
    name: Annotated[
        str,
        Parameter(help="Connector name (e.g., 'source-faker')."),
    ],
    store: Annotated[
        str,
        Parameter(
            help="Store target (e.g. 'coral:dev', 'coral:prod').",
        ),
    ],
    *,
    format: Annotated[
        Literal["json", "text"],
        Parameter(
            help="Output format: 'json' for JSON array, 'text' for newline-separated."
        ),
    ] = "json",
) -> None:
    """List all versions of a connector in the registry.

    Scans the registry bucket to find all versions of a specific connector.

    Requires GCS_CREDENTIALS environment variable to be set.
538 """ 539 registry = _resolve_store(store) 540 541 try: 542 versions = registry.list_connector_versions( 543 connector_name=name, 544 ) 545 except Exception as e: 546 exit_with_error(f"Error listing versions: {e}", code=1) 547 548 if format == "json": 549 print_json({"connector": name, "versions": versions, "count": len(versions)}) 550 else: 551 for v in versions: 552 print(v) 553 554 555@marketing_stubs_app.command(name="check") 556def marketing_stubs_check( 557 store: Annotated[ 558 str, 559 Parameter( 560 help="Store target (e.g. 'coral:dev', 'coral:prod').", 561 ), 562 ], 563 *, 564 repo_root: Annotated[ 565 Path, 566 Parameter( 567 help="Path to the airbyte-enterprise repository root. Defaults to current directory." 568 ), 569 ] = Path.cwd(), 570) -> None: 571 """Compare local connector_stubs.json with the version in GCS. 572 573 This command reads the entire local connector_stubs.json file and compares it 574 with the version currently published in GCS. 575 576 Exit codes: 577 0: Local file matches GCS (check passed) 578 1: Differences found (check failed) 579 580 Output: 581 STDOUT: JSON representation of the comparison result 582 STDERR: Informational messages and comparison details 583 584 Example: 585 airbyte-ops registry marketing-stubs check --store coral:prod --repo-root /path/to/airbyte-enterprise 586 airbyte-ops registry marketing-stubs check --store coral:dev 587 """ 588 registry = _resolve_store(store) 589 590 try: 591 result = registry.marketing_stubs_check(repo_root=repo_root) 592 except FileNotFoundError as e: 593 exit_with_error(str(e)) 594 except ValueError as e: 595 exit_with_error(str(e)) 596 597 error_console.print( 598 f"Comparing local {CONNECTOR_STUBS_FILE} with {result.get('bucket', '')}/{result.get('path', '')}" 599 ) 600 601 differences = result.get("differences", []) 602 if differences: 603 error_console.print( 604 f"[yellow]Warning:[/yellow] {len(differences)} difference(s) found:" 605 ) 606 for diff in differences: 607 
error_console.print(f" {diff['id']}: {diff['status']}") 608 print_json(result) 609 sys.exit(1) 610 611 error_console.print( 612 f"[green]Local file is in sync with GCS ({result.get('local_count', 0)} stubs)[/green]" 613 ) 614 print_json(result) 615 616 617@marketing_stubs_app.command(name="sync") 618def marketing_stubs_sync( 619 store: Annotated[ 620 str, 621 Parameter( 622 help="Store target (e.g. 'coral:dev', 'coral:prod').", 623 ), 624 ], 625 *, 626 repo_root: Annotated[ 627 Path, 628 Parameter( 629 help="Path to the airbyte-enterprise repository root. Defaults to current directory." 630 ), 631 ] = Path.cwd(), 632 dry_run: Annotated[ 633 bool, 634 Parameter(help="Show what would be uploaded without making changes."), 635 ] = False, 636) -> None: 637 """Sync local connector_stubs.json to GCS. 638 639 This command uploads the entire local connector_stubs.json file to GCS, 640 replacing the existing file. Use this after merging changes to master 641 in the airbyte-enterprise repository. 642 643 Exit codes: 644 0: Sync successful (or dry-run completed) 645 1: Error (file not found, validation failed, etc.) 

    Output:
        STDOUT: JSON representation of the sync result
        STDERR: Informational messages and status updates

    Example:
        airbyte-ops registry marketing-stubs sync --store coral:prod --repo-root /path/to/airbyte-enterprise
        airbyte-ops registry marketing-stubs sync --store coral:dev
        airbyte-ops registry marketing-stubs sync --store coral:dev --dry-run
    """
    registry = _resolve_store(store)

    try:
        result = registry.marketing_stubs_sync(
            repo_root=repo_root,
            dry_run=dry_run,
        )
    except FileNotFoundError as e:
        exit_with_error(str(e))
    except ValueError as e:
        exit_with_error(str(e))

    bucket_name = result.get("bucket", "")
    path = result.get("path", "")
    stub_count = result.get("stub_count", 0)

    if dry_run:
        error_console.print(
            f"[DRY RUN] Would upload {stub_count} stubs to {bucket_name}/{path}"
        )
    else:
        error_console.print(
            f"[green]Synced {stub_count} stubs to {bucket_name}/{path}[/green]"
        )
    print_json(result)


# =============================================================================
# REGISTRY REBUILD COMMANDS
# =============================================================================


@store_app.command(name="mirror")
def mirror_cmd(
    local: Annotated[
        bool,
        Parameter(
            help="Write output to a local directory. Mutually exclusive with --gcs-bucket and --s3-bucket.",
            negative="",
        ),
    ] = False,
    gcs_bucket: Annotated[
        str | None,
        Parameter(
            help="Write output to a GCS bucket. Must not be the prod bucket. "
            "Mutually exclusive with --local and --s3-bucket.",
        ),
    ] = None,
    s3_bucket: Annotated[
        str | None,
        Parameter(
            help="Write output to an S3 bucket. "
            "Mutually exclusive with --local and --gcs-bucket.",
        ),
    ] = None,
    output_path_root: Annotated[
        str | None,
        Parameter(
            help="Root path/prefix for the output. "
            "For --local, this is a directory path "
            "(defaults to a new temp dir if omitted). For --gcs-bucket/--s3-bucket, "
            "this prefix is prepended to all blob paths.",
        ),
    ] = None,
    dry_run: Annotated[
        bool,
        Parameter(help="Show what would be rebuilt without writing any files."),
    ] = False,
    source_store: Annotated[
        str,
        Parameter(
            help="Source store to read from (e.g. 'coral:prod', 'coral:dev').",
        ),
    ] = "coral:prod",
    connector_name: Annotated[
        tuple[str, ...] | None,
        Parameter(
            help="Only rebuild these connectors (by name). "
            "Can be specified multiple times, e.g. "
            "--connector-name source-faker --connector-name destination-bigquery.",
        ),
    ] = None,
) -> None:
    """Create a mirror of the connector registry from the source store.

    Reads all connector metadata from the source store and copies it
    to the specified output target. Supports local filesystem, GCS, and S3
    as output targets via fsspec.

    Output targets are mutually exclusive: specify exactly one of
    --local, --gcs-bucket, or --s3-bucket.

    The production bucket is categorically disallowed as an output target.

    To clean up legacy artifacts (e.g.
    disabled strict-encrypt connectors)
    after mirroring, run compile with `--with-legacy-migration v1`::

        airbyte-ops registry store compile --store coral:dev/my-prefix \\
            --with-legacy-migration v1

    Examples:
        airbyte-ops registry store mirror --local
        airbyte-ops registry store mirror --local --source-store coral:dev
        airbyte-ops registry store mirror --gcs-bucket dev-airbyte-cloud-connector-metadata-service-2 \\
            --output-path-root test-run-123
        airbyte-ops registry store mirror --s3-bucket my-test-bucket --dry-run
    """
    # Validate mutually exclusive output targets
    targets = [local, gcs_bucket is not None, s3_bucket is not None]
    if sum(targets) != 1:
        exit_with_error(
            "Specify exactly one output target: --local, --gcs-bucket, or --s3-bucket."
        )

    if local:
        output_mode = "local"
    elif gcs_bucket is not None:
        output_mode = "gcs"
    else:
        output_mode = "s3"

    registry = _resolve_store(source_store)

    result = registry.mirror(
        output_mode=output_mode,
        output_path_root=output_path_root,
        gcs_bucket=gcs_bucket,
        s3_bucket=s3_bucket,
        dry_run=dry_run,
        connector_name=list(connector_name) if connector_name else None,
    )

    print_json(
        {
            "status": result.status,
            "source_bucket": result.source_bucket,
            "output_mode": result.output_mode,
            "output_root": result.output_root,
            "connectors_processed": result.connectors_processed,
            "blobs_copied": result.blobs_copied,
            "blobs_skipped": result.blobs_skipped,
            "error_count": len(result.errors),
            "dry_run": result.dry_run,
        }
    )

    if result.errors:
        for err in result.errors[:10]:
            print_error(err)
        if len(result.errors) > 10:
            print_error(
                f"... "
                f"and {len(result.errors) - 10} more errors"
            )

    if result.status == "success":
        print_success(result.summary())
    elif result.status == "dry-run":
        print_success(f"[DRY RUN] {result.summary()}")
    else:
        error_console.print(f"[yellow]{result.summary()}[/yellow]")


# =============================================================================
# VERSION YANK COMMANDS
# =============================================================================


@connector_version_app.command(name="yank")
def yank_cmd(
    name: Annotated[
        str,
        Parameter(help="Connector name (e.g., 'source-faker')."),
    ],
    version: Annotated[
        str,
        Parameter(help="Version to yank (e.g., '1.2.3')."),
    ],
    store: Annotated[
        str,
        Parameter(
            help="Store target (e.g. 'coral:dev', 'coral:prod').",
        ),
    ],
    *,
    reason: Annotated[
        str,
        Parameter(help="Reason for yanking this version."),
    ] = "",
    dry_run: Annotated[
        bool,
        Parameter(help="Show what would be done without making changes."),
    ] = False,
) -> None:
    """Mark a connector version as yanked.

    Writes a version-yank.yml marker file to the version's directory in GCS.
    Yanked versions are excluded when determining the latest version of a
    connector.

    Requires GCS_CREDENTIALS environment variable to be set.

    Examples:
        airbyte-ops registry connector-version yank --name source-faker --version 1.2.3 --store coral:dev
        airbyte-ops registry connector-version yank --name source-faker --version 1.2.3 --store coral:dev --reason "Critical bug"
        airbyte-ops registry connector-version yank --name source-faker --version 1.2.3 --store coral:prod
    """
    registry = _resolve_store(store)

    result = registry.yank(
        connector_name=name,
        version=version,
        reason=reason,
        dry_run=dry_run,
    )

    print_json(result.to_dict())

    if result.success:
        print_success(result.message)
    else:
        exit_with_error(result.message, code=1)


@connector_version_app.command(name="unyank")
def unyank_cmd(
    name: Annotated[
        str,
        Parameter(help="Connector name (e.g., 'source-faker')."),
    ],
    version: Annotated[
        str,
        Parameter(help="Version to unyank (e.g., '1.2.3')."),
    ],
    store: Annotated[
        str,
        Parameter(
            help="Store target (e.g. 'coral:dev', 'coral:prod').",
        ),
    ],
    *,
    dry_run: Annotated[
        bool,
        Parameter(help="Show what would be done without making changes."),
    ] = False,
) -> None:
    """Remove the yank marker from a connector version.

    Deletes the version-yank.yml marker file from the version's directory in GCS,
    making the version eligible again when determining the latest version.

    Requires GCS_CREDENTIALS environment variable to be set.

    Examples:
        airbyte-ops registry connector-version unyank --name source-faker --version 1.2.3 --store coral:dev
        airbyte-ops registry connector-version unyank --name source-faker --version 1.2.3 --store coral:prod
    """
    registry = _resolve_store(store)

    result = registry.unyank(
        connector_name=name,
        version=version,
        dry_run=dry_run,
    )

    print_json(result.to_dict())

    if result.success:
        print_success(result.message)
    else:
        exit_with_error(result.message, code=1)


# =============================================================================
# ARTIFACT GENERATION COMMANDS
# =============================================================================


@artifacts_app.command(name="generate")
def generate_version_artifacts_cmd(
    metadata_file: Annotated[
        Path,
        Parameter(help="Path to the connector's metadata.yaml file."),
    ],
    docker_image: Annotated[
        str,
        Parameter(
            help="Docker image to run spec against (e.g., 'airbyte/source-faker:6.2.38')."
        ),
    ],
    output_dir: Annotated[
        Path | None,
        Parameter(
            help="Directory to write artifacts to. If not specified, a temp directory is created."
        ),
    ] = None,
    repo_root: Annotated[
        Path | None,
        Parameter(
            help=(
                "Root of the Airbyte repo checkout (for resolving doc.md). "
                "If not specified, inferred by walking up from metadata-file."
            ),
        ),
    ] = None,
    dry_run: Annotated[
        bool,
        Parameter(
            help="Show what would be generated without running docker or writing files."
        ),
    ] = False,
    with_validate: Annotated[
        bool,
        Parameter(
            help=(
                "Run metadata validators after generation (default: enabled). "
                "Use --no-validate to skip."
            ),
            negative="--no-validate",
        ),
    ] = True,
    with_sbom: Annotated[
        bool,
        Parameter(
            help=(
                "Generate spdx.json (SBOM) for connectors "
                "(default: enabled). Use --no-sbom to skip."
            ),
            negative="--no-sbom",
        ),
    ] = True,
    with_dependency_dump: Annotated[
        bool,
        Parameter(
            help=(
                "Generate dependencies.json for Python connectors "
                "(default: enabled). Use --no-dependency-dump to skip."
            ),
            negative="--no-dependency-dump",
        ),
    ] = True,
) -> None:
    """Generate version artifacts for a connector locally.

    Runs the connector's docker image with DEPLOYMENT_MODE=cloud and
    DEPLOYMENT_MODE=oss to obtain both spec variants, then generates
    the registry entries (cloud.json, oss.json) by applying
    registryOverrides from the metadata.

    The generated metadata.yaml is enriched with git commit info, SBOM URL,
    and (when applicable) components SHA before writing. Validation is run
    after generation by default; pass `--no-validate` to skip.

    This is a local-only operation -- no files are uploaded to GCS.
    Use `artifacts publish` to upload generated artifacts to GCS.

    Examples:
        airbyte-ops registry connector-version artifacts generate \\
            --metadata-file path/to/metadata.yaml \\
            --docker-image airbyte/source-faker:6.2.38

        airbyte-ops registry connector-version artifacts generate \\
            --metadata-file path/to/metadata.yaml \\
            --docker-image airbyte/source-faker:6.2.38 \\
            --output-dir ./artifacts --with-validate
    """
    result = generate_version_artifacts(
        metadata_file=metadata_file,
        docker_image=docker_image,
        output_dir=output_dir,
        repo_root=repo_root,
        dry_run=dry_run,
        with_validate=with_validate,
        with_dependency_dump=with_dependency_dump,
        with_sbom=with_sbom,
    )

    print_json(result.to_dict())

    if result.success:
        print_success(
            f"Generated {len(result.artifacts_written)} artifacts to {result.output_dir}"
        )
    else:
        all_errors = result.errors + result.validation_errors
        exit_with_error(
            f"Generation completed with {len(all_errors)} error(s): "
            + "; ".join(all_errors),
            code=1,
        )


@artifacts_app.command(name="publish")
def publish_version_artifacts_cmd(
    name: Annotated[
        str,
        Parameter(help="Connector name (e.g., 'source-faker')."),
    ],
    version: Annotated[
        str,
        Parameter(help="Version to publish artifacts for (e.g., '1.2.3')."),
    ],
    artifacts_dir: Annotated[
        Path,
        Parameter(
            help="Directory containing generated artifacts to publish (from 'artifacts generate')."
        ),
    ],
    store: Annotated[
        str,
        Parameter(
            help="Store target (e.g. 'coral:dev', 'coral:prod', 'coral:dev/prefix').",
        ),
    ],
    *,
    dry_run: Annotated[
        bool,
        Parameter(help="Show what would be published without writing to GCS."),
    ] = False,
    with_validate: Annotated[
        bool,
        Parameter(
            help=(
                "Validate metadata before uploading (default: enabled). "
" 1076 "Use --no-validate to skip." 1077 ), 1078 negative="--no-validate", 1079 ), 1080 ] = True, 1081) -> None: 1082 """Publish version artifacts to GCS using fsspec rsync. 1083 1084 Uploads locally generated artifacts (from `artifacts generate`) to the 1085 versioned path in GCS. By default, metadata is validated before upload; 1086 pass `--no-validate` to skip. 1087 1088 Uses `--store` to select the destination store and environment: 1089 1090 * `coral:dev` → coral dev bucket at root 1091 * `coral:prod` → coral prod bucket at root 1092 * `coral:dev/aj-test100` → coral dev bucket under `aj-test100/` prefix 1093 1094 Requires GCS_CREDENTIALS environment variable to be set. 1095 1096 Examples: 1097 airbyte-ops registry connector-version artifacts publish \\ 1098 --name source-faker --version 6.2.38 \\ 1099 --artifacts-dir ./artifacts --store coral:dev --with-validate 1100 1101 airbyte-ops registry connector-version artifacts publish \\ 1102 --name source-faker --version 6.2.38 \\ 1103 --artifacts-dir ./artifacts --store coral:prod 1104 1105 airbyte-ops registry connector-version artifacts publish \\ 1106 --name source-faker --version 6.2.38 \\ 1107 --artifacts-dir ./artifacts --store coral:dev/aj-test100 1108 """ 1109 registry = _resolve_store(store) 1110 1111 result = registry.publish_version_artifacts( 1112 connector_name=name, 1113 version=version, 1114 artifacts_dir=artifacts_dir, 1115 dry_run=dry_run, 1116 with_validate=with_validate, 1117 ) 1118 1119 print_json( 1120 { 1121 "status": result.status, 1122 "connector_name": result.connector_name, 1123 "version": result.version, 1124 "target": result.target, 1125 "gcs_destination": result.gcs_destination, 1126 "files_uploaded": result.files_uploaded, 1127 "errors": result.errors, 1128 "validation_errors": result.validation_errors, 1129 "dry_run": result.dry_run, 1130 } 1131 ) 1132 1133 if result.success: 1134 print_success( 1135 f"Published {len(result.files_uploaded)} artifacts for " 1136 
f"{result.connector_name}@{result.version} → {result.gcs_destination}" 1137 ) 1138 else: 1139 all_errors = result.errors + result.validation_errors 1140 exit_with_error( 1141 f"Publish completed with {len(all_errors)} error(s): " 1142 + "; ".join(all_errors), 1143 code=1, 1144 ) 1145 1146 1147# ============================================================================= 1148# COMPILE COMMAND 1149# ============================================================================= 1150 1151 1152@store_app.command(name="compile") 1153def compile_cmd( 1154 store: Annotated[ 1155 str, 1156 Parameter( 1157 help="Store target (e.g. 'coral:dev', 'coral:prod', 'coral:dev/prefix').", 1158 ), 1159 ], 1160 *, 1161 connector_name: Annotated[ 1162 tuple[str, ...] | None, 1163 Parameter( 1164 help="Only compile these connectors (can be repeated).", 1165 ), 1166 ] = None, 1167 dry_run: Annotated[ 1168 bool, 1169 Parameter(help="Show what would be done without writing."), 1170 ] = False, 1171 with_secrets_mask: Annotated[ 1172 bool, 1173 Parameter( 1174 help=( 1175 "Also regenerate specs_secrets_mask.yaml by scanning all " 1176 "connector specs for airbyte_secret properties." 1177 ), 1178 ), 1179 ] = False, 1180 with_legacy_migration: Annotated[ 1181 str | None, 1182 Parameter( 1183 help=( 1184 "Run a one-time legacy migration step during compile. " 1185 "Currently supported: 'v1' — delete cloud.json / oss.json " 1186 "files for connectors whose registryOverrides.cloud.enabled " 1187 "or registryOverrides.oss.enabled is false. This cleans up " 1188 "artifacts produced by the legacy pipeline that did not " 1189 "respect the enabled flag." 1190 ), 1191 ), 1192 ] = None, 1193 force: Annotated[ 1194 bool, 1195 Parameter( 1196 help=( 1197 "Force resync of latest/ directories even if version markers are current. " 1198 "Useful when metadata changes without a version bump." 
            ),
        ),
    ] = False,
) -> None:
    """Compile the registry: sync latest/ dirs, write global and per-connector indexes.

    Scans all version directories in the target store, determines the latest GA
    semver per connector (excluding yanked and pre-release versions), ensures
    each `latest/` directory matches the computed latest, and writes:

    * `registries/v0/cloud_registry.json` -- global cloud registry index
    * `registries/v0/oss_registry.json` -- global OSS registry index
    * `metadata/airbyte/<connector>/versions.json` -- per-connector version index

    With `--with-secrets-mask`, also regenerates:

    * `registries/v0/specs_secrets_mask.yaml` -- properties marked as secrets

    With `--with-legacy-migration=v1`, deletes `cloud.json` / `oss.json`
    files for connectors whose `registryOverrides.cloud.enabled` or
    `registryOverrides.oss.enabled` is `false`.

    With `--force`, resyncs all latest/ directories even if the version marker
    matches the computed latest version.

    Uses efficient glob patterns for scanning (no file downloads during discovery).

    Requires GCS_CREDENTIALS environment variable to be set.

    Examples:
        airbyte-ops registry store compile --store coral:dev --dry-run

        airbyte-ops registry store compile --store coral:dev/aj-test100 \\
            --connector-name source-faker --connector-name destination-bigquery

        airbyte-ops registry store compile --store coral:prod --with-secrets-mask

        airbyte-ops registry store compile --store coral:dev \\
            --with-legacy-migration v1
    """
    registry = _resolve_store(store)

    result = registry.compile(
        connector_name=list(connector_name) if connector_name else None,
        dry_run=dry_run,
        with_secrets_mask=with_secrets_mask,
        with_legacy_migration=with_legacy_migration,
        force=force,
    )

    print_json(
        {
            "status": result.status,
            "target": result.target,
            "connectors_scanned": result.connectors_scanned,
            "versions_found": result.versions_found,
            "yanked_versions": result.yanked_versions,
            "latest_updated": result.latest_updated,
            "latest_already_current": result.latest_already_current,
            "cloud_registry_entries": result.cloud_registry_entries,
            "oss_registry_entries": result.oss_registry_entries,
            "version_indexes_written": result.version_indexes_written,
            "specs_secrets_mask_properties": result.specs_secrets_mask_properties,
            "errors": result.errors,
            "dry_run": result.dry_run,
        }
    )

    if result.status in ("success", "dry-run"):
        print_success(result.summary())
    else:
        exit_with_error(
            f"Compile completed with {len(result.errors)} error(s): "
            + "; ".join(result.errors),
            code=1,
        )


# =============================================================================
# DELETE-DEV-LATEST COMMAND
# =============================================================================


@store_app.command(name="delete-dev-latest")
def delete_dev_latest_cmd(
    store: Annotated[
        str,
        Parameter(
1287 help="Store target (must begin with 'coral:dev').", 1288 ), 1289 ], 1290 *, 1291 connector_name: Annotated[ 1292 tuple[str, ...] | None, 1293 Parameter( 1294 help="Only delete latest/ for these connectors (can be repeated).", 1295 ), 1296 ] = None, 1297 dry_run: Annotated[ 1298 bool, 1299 Parameter(help="Show what would be done without deleting."), 1300 ] = False, 1301) -> None: 1302 """Delete all latest/ directories from a dev registry store. 1303 1304 Discovers every connector that has a `latest/` directory and 1305 deletes each one in parallel using a thread pool. 1306 1307 This is useful before a full re-compile to prove that latest/ 1308 directories can be correctly regenerated from versioned data. 1309 1310 Only dev stores are allowed (store must begin with 'coral:dev'). 1311 1312 Requires GCS_CREDENTIALS environment variable to be set. 1313 1314 Examples: 1315 airbyte-ops registry store delete-dev-latest --store coral:dev --dry-run 1316 1317 airbyte-ops registry store delete-dev-latest --store coral:dev/aj-test100 1318 1319 airbyte-ops registry store delete-dev-latest --store coral:dev \\ 1320 --connector-name source-faker --connector-name destination-bigquery 1321 """ 1322 if not store.startswith("coral:dev"): 1323 exit_with_error( 1324 "delete-dev-latest only supports dev stores " 1325 f"(store must begin with 'coral:dev', got '{store}').", 1326 code=1, 1327 ) 1328 1329 registry = _resolve_store(store) 1330 1331 result = registry.delete_dev_latest( 1332 connector_name=list(connector_name) if connector_name else None, 1333 dry_run=dry_run, 1334 ) 1335 1336 print_json( 1337 { 1338 "status": result.status, 1339 "target": result.target, 1340 "connectors_found": result.connectors_found, 1341 "latest_dirs_deleted": result.latest_dirs_deleted, 1342 "errors": result.errors, 1343 "dry_run": result.dry_run, 1344 } 1345 ) 1346 1347 if result.status in ("success", "dry-run"): 1348 print_success(result.summary()) 1349 else: 1350 exit_with_error( 1351 f"Delete 
            f"completed with {len(result.errors)} error(s): "
            + "; ".join(result.errors[:5]),
            code=1,
        )


# =============================================================================
# STORE COMPARE COMMAND
# =============================================================================


@store_app.command(name="compare")
def compare_cmd(
    store: Annotated[
        str,
        Parameter(
            help="Store target being evaluated (e.g. 'coral:dev/20260306-mirror-compile').",
        ),
    ],
    reference_store: Annotated[
        str,
        Parameter(
            help="Known-good reference store to compare against.",
        ),
    ],
    *,
    connector_name: Annotated[
        tuple[str, ...] | None,
        Parameter(
            help="Only compare these connectors (can be repeated).",
        ),
    ] = None,
    with_artifacts: Annotated[
        bool,
        Parameter(
            help="Compare per-connector artifact files "
            "(metadata.yaml, cloud.json, oss.json, spec.json).",
            negative="--no-artifacts",
        ),
    ] = True,
    with_indexes: Annotated[
        bool,
        Parameter(
            help="Compare global registry index files "
            "(cloud_registry.json, oss_registry.json).",
            negative="--no-indexes",
        ),
    ] = True,
) -> None:
    """Compare a store against a reference store and report differences.

    Evaluates the `--store` target against `--reference-store` and reports
    per-connector artifact diffs and global index diffs.

    Requires GCS_CREDENTIALS environment variable to be set.

    Examples:
        airbyte-ops registry store compare --store coral:dev/20260306-mirror \\
            --reference-store coral:prod

        airbyte-ops registry store compare --store coral:dev/my-test \\
            --connector-name source-faker --no-indexes

        airbyte-ops registry store compare --store coral:dev/my-test \\
            --no-artifacts
    """
    store_target = _resolve_store(store)
    ref_target = _resolve_store(reference_store)

    store_prefix = f"{store_target.prefix}/" if store_target.prefix else ""
    ref_prefix = f"{ref_target.prefix}/" if ref_target.prefix else ""

    result = compare_stores(
        store_bucket=store_target.bucket_name,
        store_prefix=store_prefix,
        reference_bucket=ref_target.bucket_name,
        reference_prefix=ref_prefix,
        connector_name=list(connector_name) if connector_name else None,
        with_artifacts=with_artifacts,
        with_indexes=with_indexes,
    )

    print_json(result.to_dict())

    if result.status == "match":
        print_success(result.summary())
    elif result.status == "differences-found":
        error_console.print(f"[yellow]{result.summary()}[/yellow]")

        # Print a concise per-connector diff summary
        for diff in result.connector_diffs:
            if diff.status in ("only_in_store", "only_in_reference"):
                error_console.print(f"  {diff.connector}: {diff.status}")
            else:
                for ad in diff.artifact_diffs:
                    error_console.print(
                        f"  {diff.connector}/{ad.file}: {ad.status}"
                        + (f" ({ad.details})" if ad.details else "")
                    )

        for idx_diff in result.index_diffs:
            if idx_diff.status != "match":
                error_console.print(
                    f"  [index] {idx_diff.file}: {idx_diff.status}"
                    + (
                        f" (store={idx_diff.entry_count_store},"
                        f" ref={idx_diff.entry_count_reference})"
                        if idx_diff.entry_count_store or idx_diff.entry_count_reference
                        else ""
                    )
                )

        sys.exit(1)
    else:
        exit_with_error(
            f"Compare "
            f"completed with {len(result.errors)} error(s): "
            + "; ".join(result.errors[:5]),
            code=1,
        )