airbyte_ops_mcp.cli.local

CLI commands for local Airbyte monorepo operations.

Commands:

  • airbyte-ops local connector list - List connectors in the monorepo
  • airbyte-ops local connector info - Get metadata for a single connector
  • airbyte-ops local connector get-version - Get connector version (current or next)
  • airbyte-ops local connector bump-version - Bump connector version
  • airbyte-ops local connector qa - Run QA checks on a connector
  • airbyte-ops local connector qa-docs-generate - Generate QA checks documentation
  • airbyte-ops local connector changelog check - Check changelog entries for issues
  • airbyte-ops local connector changelog fix - Fix changelog entry dates
  • airbyte-ops local connector bump-deps - Update Poetry-managed dependencies
  • airbyte-ops local connector marketing-stub check - Validate marketing stub entries
  • airbyte-ops local connector marketing-stub sync - Sync stub from connector metadata

CLI reference

The commands below are regenerated by poe docs-generate via cyclopts's programmatic docs API; see docs/generate_cli.py.

airbyte-ops local COMMAND

Local Airbyte monorepo operations.

Commands:

  • connector: Connector operations in the monorepo.

airbyte-ops local connector

Connector operations in the monorepo.

airbyte-ops local connector list

airbyte-ops local connector list REPO-PATH [ARGS]

List connectors in the Airbyte monorepo with filtering options.

Parameters:

  • REPO-PATH, --repo-path: Absolute path to the Airbyte monorepo. [required]
  • CERTIFIED-ONLY, --certified-only, --no-certified-only: Include only certified connectors. [default: False]
  • MODIFIED-ONLY, --modified-only, --no-modified-only: Include only modified connectors (requires PR context). [default: False]
  • LOCAL-CDK, --local-cdk, --no-local-cdk: Include connectors using local CDK reference. When combined with --modified-only, adds local-CDK connectors to the modified set. [default: False]
  • LANGUAGE, --language, --empty-language: Languages to include (python, java, low-code, manifest-only).
  • EXCLUDE-LANGUAGE, --exclude-language, --empty-exclude-language: Languages to exclude.
  • CONNECTOR-TYPE, --connector-type: Connector types to include (source, destination). Accepts CSV or newline-delimited values.
  • MIN-SUPPORT-LEVEL, --min-support-level: Minimum support level (inclusive). Accepts integer (100, 200, 300) or keyword (archived, community, certified).
  • MAX-SUPPORT-LEVEL, --max-support-level: Maximum support level (inclusive). Accepts integer (100, 200, 300) or keyword (archived, community, certified).
  • PR, --pr: PR number or GitHub URL for modification detection.
  • GH-TOKEN, --gh-token: GitHub API token. When provided together with --pr, the GitHub API is used to detect modified files instead of local git diff (avoids shallow-clone issues).
  • EXCLUDE-CONNECTORS, --exclude-connectors, --empty-exclude-connectors: Connectors to exclude from results. Accepts CSV or newline-delimited values. Can be specified multiple times.
  • FORCE-INCLUDE-CONNECTORS, --force-include-connectors, --empty-force-include-connectors: Connectors to force-include regardless of other filters. Accepts CSV or newline-delimited values. Can be specified multiple times.
  • CONNECTORS-FILTER, --connectors-filter, --empty-connectors-filter: Intersect results with this explicit set of connector names. Only connectors present in both the filtered results and this set are returned. Useful for composing multiple filter passes (e.g. combining separate source and destination lists). Accepts CSV or newline-delimited values. Can be specified multiple times.
  • OUTPUT-FORMAT, --output-format: Output format: "csv" (comma-separated), "lines" (one connector per line), "json-gh-matrix" (GitHub Actions matrix JSON). [choices: csv, lines, json-gh-matrix] [default: lines]
  • UNPUBLISHED, --unpublished, --no-unpublished: Filter to only connectors whose local dockerImageTag has not been published to the GCS registry. Requires --store. [default: False]
  • STORE, --store: Store target for unpublished check (e.g. 'coral:prod', 'coral:dev'). Required when --unpublished is set.
  • ASSERT-NONE, --assert-none, --no-assert-none: Exit with non-zero status if any connectors match the filters. Useful for audit/CI checks (e.g. --unpublished --assert-none). [default: False]
  • SORT-BY, --sort-by: Sort connectors by the given key. Only 'name' is supported for now. [choices: name] [default: name]
  • SORT-DIRECTION, --sort-direction: Sort direction: 'asc' (ascending, default) or 'desc' (descending). [choices: asc, desc] [default: asc]
  • LIMIT, --limit: Maximum number of connectors to return. Applied after sorting. Useful for batched processing.
  • OFFSET, --offset: Number of connectors to skip from the start of the (sorted) list. Applied after sorting, before --limit. Useful for batched processing (e.g. --offset=200 --limit=200 for batch 2). [default: 0]

airbyte-ops local connector info

airbyte-ops local connector info CONNECTOR-NAME [ARGS]

Get metadata for a single connector.

Prints JSON output with connector metadata. When running in GitHub Actions (CI env var set), also writes each field to GitHub step outputs.

Parameters:

  • CONNECTOR-NAME, --connector-name: Name of the connector (e.g., source-github). [required]
  • REPO-PATH, --repo-path: Path to the Airbyte monorepo. Can be inferred from context.
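
The GitHub step-outputs behaviour mentioned above can be sketched as below. This is an illustrative helper, not the actual implementation; it assumes single-line field values (the `GITHUB_OUTPUT` file format requires delimiter syntax for multiline values):

```python
import json
import os


def emit_github_outputs(info: dict) -> None:
    """Append each field as key=value to the GitHub step-outputs file, if set."""
    output_path = os.environ.get("GITHUB_OUTPUT")
    if not output_path:  # Not running under GitHub Actions
        return
    with open(output_path, "a", encoding="utf-8") as f:
        for key, value in info.items():
            # Non-string values are JSON-encoded; strings are written as-is.
            rendered = value if isinstance(value, str) else json.dumps(value)
            f.write(f"{key}={rendered}\n")
```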

airbyte-ops local connector get-version

airbyte-ops local connector get-version NAME REPO-PATH [ARGS]

Get the current or next version for a connector.

By default, prints the current version from metadata.yaml.

With --next, computes and prints the next version. Requires either --bump-type or --prerelease to be specified.

Parameters:

  • NAME, --name: Connector technical name (e.g., source-github). [required]
  • REPO-PATH, --repo-path: Absolute path to the Airbyte monorepo. [required]
  • NEXT, --next, --no-next: Compute the next version instead of the current version. [default: False]
  • BUMP-TYPE, --bump-type: Version bump type (requires --next). Standard: patch, minor, major. RC: patch_rc, minor_rc, major_rc, rc, promote. [choices: patch, minor, major, patch_rc, minor_rc, major_rc, rc, promote]
  • PRERELEASE, --prerelease, --no-prerelease: Compute a prerelease (preview) tag using the repo HEAD SHA (requires --next). [default: False]
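
For the standard bump types, the next-version computation follows ordinary semver rules, which can be sketched as below (a minimal illustration; the real `calculate_new_version` also handles the RC and prerelease variants):

```python
def bump(version: str, bump_type: str) -> str:
    """Return the next semver string for a standard bump type."""
    major, minor, patch = (int(part) for part in version.split("."))
    if bump_type == "major":
        return f"{major + 1}.0.0"
    if bump_type == "minor":
        return f"{major}.{minor + 1}.0"
    if bump_type == "patch":
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"unsupported bump type: {bump_type}")

print(bump("1.2.3", "minor"))  # → 1.3.0
```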

airbyte-ops local connector bump-version

airbyte-ops local connector bump-version NAME REPO-PATH [ARGS]

Bump a connector's version across all relevant files.

Updates version in metadata.yaml (always), pyproject.toml (if exists), and documentation changelog (if --changelog-message provided).

Note: --changelog-message is ignored when --no-changelog is set.

Either --bump-type or --new-version must be provided.

Parameters:

  • NAME, --name: Connector technical name (e.g., source-github). [required]
  • REPO-PATH, --repo-path: Absolute path to the Airbyte monorepo. [required]
  • BUMP-TYPE, --bump-type: Version bump type. Standard: patch, minor, major. RC: patch_rc, minor_rc, major_rc, rc, promote. [choices: patch, minor, major, patch_rc, minor_rc, major_rc, rc, promote]
  • NEW-VERSION, --new-version: Explicit new version (overrides --bump-type if provided).
  • CHANGELOG-MESSAGE, --changelog-message: Message to add to changelog. Ignored if --no-changelog is set.
  • PR-NUMBER, --pr-number: PR number for changelog entry.
  • DRY-RUN, --dry-run, --no-dry-run: Show what would be changed without modifying files. [default: False]
  • NO-CHANGELOG, --no-changelog, --no-no-changelog: Skip changelog updates even if --changelog-message is provided. Useful for ephemeral version bumps (e.g. pre-release artifact generation). [default: False]
  • PROGRESSIVE-ROLLOUT-ENABLED, --progressive-rollout-enabled, --no-progressive-rollout-enabled: Explicitly set enableProgressiveRollout in metadata.yaml. Pass false to disable progressive rollout (e.g. for preview builds). When omitted, the automatic behaviour based on --bump-type is used.

airbyte-ops local connector bump-base-image

airbyte-ops local connector bump-base-image NAME REPO-PATH [ARGS]

Update a connector's base image.

Two modes:

  • Default: bump to the latest stable tag within the same major version. Major version changes are treated as breaking-change boundaries.
  • --force-latest: bump to the absolute latest stable tag regardless of semver.

Parameters:

  • NAME, --name: Connector technical name (e.g., source-github). [required]
  • REPO-PATH, --repo-path: Absolute path to the Airbyte monorepo. [required]
  • FORCE-LATEST, --force-latest, --no-force-latest: Bump to the absolute latest stable base image, ignoring major-version boundaries. Without this flag the bump stays within the current major version. [default: False]
  • DRY-RUN, --dry-run, --no-dry-run: Show what would be changed without modifying files. [default: False]
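
The within-major selection rule can be sketched as below; `pick_base_image_tag` is a hypothetical helper that assumes plain `X.Y.Z` tags:

```python
def pick_base_image_tag(
    current: str, available: list[str], force_latest: bool = False
) -> str:
    """Pick the newest tag, staying within the current major unless forced."""

    def key(tag: str) -> tuple[int, ...]:
        return tuple(int(part) for part in tag.split("."))

    candidates = (
        available
        if force_latest
        else [tag for tag in available if key(tag)[0] == key(current)[0]]
    )
    return max(candidates, key=key)

tags = ["1.2.0", "1.3.1", "2.0.0"]
print(pick_base_image_tag("1.2.0", tags))                     # → 1.3.1
print(pick_base_image_tag("1.2.0", tags, force_latest=True))  # → 2.0.0
```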

airbyte-ops local connector bump-cdk

airbyte-ops local connector bump-cdk NAME REPO-PATH [ARGS]

Bump a connector's CDK dependency.

Two modes:

  • Default: refresh the lock file so it resolves the newest CDK that satisfies the existing constraint. The constraint is NOT changed.
  • --force-latest: rewrite the constraint to >=LATEST.

For Java connectors, updates build.gradle to the latest CDK version.

Parameters:

  • NAME, --name: Connector technical name (e.g., source-github). [required]
  • REPO-PATH, --repo-path: Absolute path to the Airbyte monorepo. [required]
  • FORCE-LATEST, --force-latest, --no-force-latest: Rewrite the CDK constraint to >=LATEST. [default: False]
  • DRY-RUN, --dry-run, --no-dry-run: Show what would be changed without modifying files. [default: False]

airbyte-ops local connector bump-deps

airbyte-ops local connector bump-deps NAME REPO-PATH [ARGS]

Update a connector's dependencies.

For Python / low-code connectors using Poetry, this runs poetry update --lock to refresh the lock file with the latest versions allowed by existing constraints.

For connectors that do not use Poetry (manifest-only, Java, etc.), this is a no-op.

Parameters:

  • NAME, --name: Connector technical name (e.g., source-github). [required]
  • REPO-PATH, --repo-path: Absolute path to the Airbyte monorepo. [required]
  • DRY-RUN, --dry-run, --no-dry-run: Show what would be changed without modifying files. [default: False]
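
For Poetry-managed connectors the refresh step is literally `poetry update --lock`, which can be sketched as below (illustrative only; detecting Poetry via the presence of `pyproject.toml` is an assumption of this sketch):

```python
import subprocess
from pathlib import Path


def refresh_lock(connector_dir: Path, dry_run: bool = False) -> None:
    """Refresh poetry.lock in place for Poetry-managed connectors; no-op otherwise."""
    if not (connector_dir / "pyproject.toml").exists():
        return  # Not Poetry-managed (manifest-only, Java, etc.)
    cmd = ["poetry", "update", "--lock"]
    if dry_run:
        print(f"would run: {' '.join(cmd)} in {connector_dir}")
        return
    subprocess.run(cmd, cwd=connector_dir, check=True)
```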

airbyte-ops local connector qa

airbyte-ops local connector qa [ARGS]

Run QA checks on connector(s).

Validates connector metadata, documentation, packaging, security, and versioning. Exit code is non-zero if any checks fail.

Parameters:

  • NAME, --name, --empty-name: Connector technical name(s) (e.g., source-github). Can be specified multiple times.
  • CONNECTOR-DIRECTORY, --connector-directory: Directory containing connectors; checks run on all connectors in this directory.
  • CHECK, --check, --empty-check: Specific check(s) to run. Can be specified multiple times.
  • REPORT-PATH, --report-path: Path to write the JSON report file.

airbyte-ops local connector qa-docs-generate

airbyte-ops local connector qa-docs-generate OUTPUT-FILE

Generate documentation for QA checks.

Creates a markdown file documenting all available QA checks organized by category.

Parameters:

  • OUTPUT-FILE, --output-file: Path to write the generated documentation file. [required]

airbyte-ops local connector changelog

Changelog operations for connectors.

Commands:

  • add: Add a changelog entry for a connector using its current version.
  • check: Check changelog entries for issues.
  • fix: Fix changelog entry dates to match PR merge dates.

airbyte-ops local connector changelog check

airbyte-ops local connector changelog check [ARGS]

Check changelog entries for issues.

Validates changelog dates match PR merge dates and checks for PR number mismatches.

Parameters:

  • CONNECTOR-NAME, --connector-name: Connector technical name (e.g., source-github).
  • ALL, --all, --no-all: Check all connectors in the repository. [default: False]
  • REPO-PATH, --repo-path: Path to the Airbyte monorepo. Can be inferred from context.
  • LOOKBACK-DAYS, --lookback-days: Only check entries with dates within this many days.
  • STRICT, --strict, --no-strict: Exit with error code if any issues are found. [default: False]
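
The date validation amounts to comparing each entry's recorded date with the PR's actual merge date. A minimal sketch, with hypothetical data shapes rather than the real `check_changelog` API:

```python
from datetime import date


def find_date_mismatches(
    entries: list[tuple[str, int, date]],  # (version, pr_number, recorded_date)
    merge_dates: dict[int, date],          # PR number -> actual merge date
) -> list[tuple[str, date, date]]:
    """Return (version, recorded_date, merge_date) for each mismatched entry."""
    issues = []
    for version, pr_number, recorded in entries:
        merged = merge_dates.get(pr_number)
        if merged is not None and merged != recorded:
            issues.append((version, recorded, merged))
    return issues

entries = [("1.0.1", 123, date(2025, 1, 2)), ("1.0.0", 100, date(2024, 12, 1))]
merge_dates = {123: date(2025, 1, 3), 100: date(2024, 12, 1)}
print(find_date_mismatches(entries, merge_dates))
```

The `fix` command applies the same comparison but writes the merge date back into the changelog table.
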
airbyte-ops local connector changelog fix

airbyte-ops local connector changelog fix [ARGS]

Fix changelog entry dates to match PR merge dates.

Looks up the actual merge date for each PR referenced in the changelog and updates the date column to match.

Parameters:

  • CONNECTOR-NAME, --connector-name: Connector technical name (e.g., source-github).
  • ALL, --all, --no-all: Fix all connectors in the repository. [default: False]
  • REPO-PATH, --repo-path: Path to the Airbyte monorepo. Can be inferred from context.
  • LOOKBACK-DAYS, --lookback-days: Only fix entries with dates within this many days.
  • DRY-RUN, --dry-run, --no-dry-run: Print changes without modifying files. [default: False]

airbyte-ops local connector changelog add

airbyte-ops local connector changelog add CONNECTOR-NAME PR-NUMBER MESSAGE [ARGS]

Add a changelog entry for a connector using its current version.

Reads the version from metadata.yaml and writes a single changelog entry to the connector's documentation file. Does not modify any version files.

Parameters:

  • CONNECTOR-NAME, --connector-name: Connector technical name (e.g., source-github). [required]
  • PR-NUMBER, --pr-number: PR number for the changelog entry. [required]
  • MESSAGE, --message: Changelog entry message. [required]
  • REPO-PATH, --repo-path: Path to the Airbyte monorepo. Can be inferred from context.
  • DRY-RUN, --dry-run, --no-dry-run: Print changes without modifying files. [default: False]

airbyte-ops local connector marketing-stub

Marketing connector stub operations (local file validation and updates).

Commands:

  • check: Validate marketing connector stub entries.
  • sync: Sync connector stub(s) from connector metadata.yaml file(s).

airbyte-ops local connector marketing-stub check

airbyte-ops local connector marketing-stub check [ARGS]

Validate marketing connector stub entries.

Checks that stub entries have valid required fields (id, name, url, icon) and optionally validates that the stub matches the connector's metadata.yaml.

Exit codes:

  • 0: All checks passed
  • 1: Validation errors found

Output:

  • STDOUT: JSON validation result
  • STDERR: Informational messages

Parameters:

  • CONNECTOR, --connector: Connector name to check (e.g., 'source-oracle-enterprise').
  • ALL, --all, --no-all: Check all stubs in the file. [default: False]
  • REPO-ROOT, --repo-root: Path to the airbyte-enterprise repository root. Defaults to current directory.
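
The required-field portion of the check can be sketched as below (hypothetical helper; the real command also cross-checks the stub against the connector's metadata.yaml):

```python
REQUIRED_STUB_FIELDS = ("id", "name", "url", "icon")


def validate_stub(stub: dict) -> list[str]:
    """Return an error message for each missing or empty required field."""
    return [
        f"missing or empty required field: {field}"
        for field in REQUIRED_STUB_FIELDS
        if not stub.get(field)
    ]

ok = {"id": "abc", "name": "Oracle Enterprise",
      "url": "https://example.com", "icon": "oracle.svg"}
print(validate_stub(ok))             # → []
print(validate_stub({"id": "abc"}))  # three errors: name, url, icon
```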
airbyte-ops local connector marketing-stub sync

airbyte-ops local connector marketing-stub sync [ARGS]

Sync connector stub(s) from connector metadata.yaml file(s).

Reads the connector's metadata.yaml file and updates the corresponding entry in connector_stubs.json with the current values.

Exit codes:

  • 0: Sync successful (or dry-run completed)
  • 1: Error (connector not found, no metadata, etc.)

Output:

  • STDOUT: JSON representation of the synced stub(s)
  • STDERR: Informational messages

Parameters:

  • CONNECTOR, --connector: Connector name to sync (e.g., 'source-oracle-enterprise').
  • ALL, --all, --no-all: Sync all connectors that have metadata.yaml files. [default: False]
  • REPO-ROOT, --repo-root: Path to the airbyte-enterprise repository root. Defaults to current directory.
  • DRY-RUN, --dry-run, --no-dry-run: Show what would be synced without making changes. [default: False]
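
The sync step amounts to copying current values from the parsed metadata into the matching stub entry. A minimal sketch with a hypothetical field mapping (the real command reads metadata.yaml and writes connector_stubs.json):

```python
def sync_stub(stub: dict, metadata: dict) -> dict:
    """Return a copy of `stub` refreshed with current values from parsed metadata."""
    data = metadata.get("data", {})
    updated = dict(stub)
    # Illustrative mapping only; the real field set lives in the stub schema.
    for field in ("name", "icon"):
        if field in data:
            updated[field] = data[field]
    return updated

stub = {"id": "abc", "name": "Old Name",
        "url": "https://example.com", "icon": "old.svg"}
metadata = {"data": {"name": "New Name", "icon": "new.svg"}}
print(sync_stub(stub, metadata))
```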
# Copyright (c) 2025 Airbyte, Inc., all rights reserved.
"""CLI commands for local Airbyte monorepo operations.

Commands:
    airbyte-ops local connector list - List connectors in the monorepo
    airbyte-ops local connector info - Get metadata for a single connector
    airbyte-ops local connector get-version - Get connector version (current or next)
    airbyte-ops local connector bump-version - Bump connector version
    airbyte-ops local connector qa - Run QA checks on a connector
    airbyte-ops local connector qa-docs-generate - Generate QA checks documentation
    airbyte-ops local connector changelog check - Check changelog entries for issues
    airbyte-ops local connector changelog fix - Fix changelog entry dates
    airbyte-ops local connector bump-deps - Update Poetry-managed dependencies
    airbyte-ops local connector marketing-stub check - Validate marketing stub entries
    airbyte-ops local connector marketing-stub sync - Sync stub from connector metadata

## CLI reference

The commands below are regenerated by `poe docs-generate` via cyclopts's
programmatic docs API; see `docs/generate_cli.py`.

.. include:: ../../../docs/generated/cli/local.md
   :start-line: 2
"""

from __future__ import annotations

# Hide Python-level members from the pdoc page for this module; the rendered
# docs for this CLI group come entirely from the grafted `.. include::` in
# the module docstring above.
__all__: list[str] = []

import json
import os
import subprocess
import sys
from pathlib import Path
from typing import Annotated, Literal

import yaml
from cyclopts import Parameter
from jinja2 import Environment, PackageLoader, select_autoescape
from rich.console import Console

from airbyte_ops_mcp.airbyte_repo.bump_base_image import (
    BaseImageError,
    bump_base_image,
)
from airbyte_ops_mcp.airbyte_repo.bump_cdk import (
    CdkBumpError,
    bump_cdk,
)
from airbyte_ops_mcp.airbyte_repo.bump_deps import (
    DepsError,
    bump_deps,
)
from airbyte_ops_mcp.airbyte_repo.bump_version import (
    ConnectorNotFoundError,
    InvalidVersionError,
    VersionNotFoundError,
    bump_connector_version,
    calculate_new_version,
    get_connector_doc_path,
    get_connector_path,
    get_current_version,
    update_changelog,
)
from airbyte_ops_mcp.airbyte_repo.changelog_fix import (
    ChangelogCheckResult,
    ChangelogFixResult,
    check_all_changelogs,
    check_changelog,
    fix_all_changelog_dates,
    fix_changelog_dates,
)
from airbyte_ops_mcp.airbyte_repo.list_connectors import (
    CONNECTOR_PATH_PREFIX,
    METADATA_FILE_NAME,
    _detect_connector_language,
    get_connectors_with_local_cdk,
)
from airbyte_ops_mcp.cli._base import App, app
from airbyte_ops_mcp.cli._shared import error_console, exit_with_error, print_json
from airbyte_ops_mcp.connector_ops.utils import Connector
from airbyte_ops_mcp.connector_qa.checks import ENABLED_CHECKS
from airbyte_ops_mcp.connector_qa.consts import CONNECTORS_QA_DOC_TEMPLATE_NAME
from airbyte_ops_mcp.connector_qa.models import (
    Check,
    CheckCategory,
    CheckStatus,
    Report,
)
from airbyte_ops_mcp.connector_qa.utils import (
    get_all_connectors_in_directory,
    remove_strict_encrypt_suffix,
)
from airbyte_ops_mcp.mcp.github_repo_ops import list_connectors_in_repo
from airbyte_ops_mcp.mcp.prerelease import compute_prerelease_docker_image_tag
from airbyte_ops_mcp.registry._enums import ConnectorType, SupportLevel
from airbyte_ops_mcp.registry.audit import (
    AuditResult,
    find_unpublished_connectors,
    generate_connector_list_summary,
)
from airbyte_ops_mcp.registry.connector_stubs import (
    CONNECTOR_STUBS_FILE,
    ConnectorStub,
    find_stub_by_connector,
    load_local_stubs,
    save_local_stubs,
)
from airbyte_ops_mcp.registry.store import resolve_registry_store
from airbyte_ops_mcp.regression_tests.ci_output import write_github_summary

console = Console()

OutputFormat = Literal["csv", "lines", "json-gh-matrix"]
SortBy = Literal["name"]


def _parse_support_level(value: str) -> SupportLevel:
    """Parse a support level string into a `SupportLevel` enum member.

    Accepts a keyword ("certified", "community", "archived") or a legacy
    integer string ("100", "200", "300").
    """
    return SupportLevel.parse(value.strip().lower())


def _get_connector_support_level(connector_dir: Path) -> SupportLevel | None:
    """Read support level from connector's metadata.yaml."""
    metadata_file = connector_dir / METADATA_FILE_NAME
    if not metadata_file.exists():
        return None
    metadata = yaml.safe_load(metadata_file.read_text())
    support_level_str = metadata.get("data", {}).get("supportLevel")
    if not support_level_str:
        return None
    try:
        return SupportLevel(support_level_str.lower())
    except ValueError:
        return None


def _parse_connector_types(value: str) -> set[ConnectorType]:
    """Parse connector types from CSV or newline-delimited string."""
    types: set[ConnectorType] = set()
    for item in value.replace(",", "\n").split("\n"):
        item = item.strip().lower()
        if item:
            types.add(ConnectorType.parse(item))
    return types


def _get_connector_type(connector_name: str) -> str:
    """Derive connector type from name prefix."""
    if connector_name.startswith("source-"):
        return ConnectorType.SOURCE
    elif connector_name.startswith("destination-"):
        return ConnectorType.DESTINATION
    return "unknown"


def _parse_connector_names(value: str) -> set[str]:
    """Parse connector names from CSV or newline-delimited string."""
    names = set()
    for item in value.replace(",", "\n").split("\n"):
        item = item.strip()
        if item:
            names.add(item)
    return names


def _get_connector_version(connector_dir: Path) -> str | None:
    """Read connector version (dockerImageTag) from metadata.yaml."""
    metadata_file = connector_dir / METADATA_FILE_NAME
    if not metadata_file.exists():
        return None
    metadata = yaml.safe_load(metadata_file.read_text())
    return metadata.get("data", {}).get("dockerImageTag")


def _support_level_to_int(level: SupportLevel | None) -> int | None:
    """Convert a `SupportLevel` to its legacy integer representation for JSON output."""
    if level is None:
        return None
    return level.precedence


def _get_connector_info(
    connector_name: str, connector_dir: Path
) -> dict[str, str | int | None]:
    """Get full connector metadata as a dict with connector_ prefixed keys.

    This is shared between the `list --output-format json-gh-matrix` and `info` commands.
    """
    return {
        "connector": connector_name,
        "connector_type": _get_connector_type(connector_name),
        "connector_language": _detect_connector_language(connector_dir, connector_name)
        or "unknown",
        "connector_support_level": _support_level_to_int(
            _get_connector_support_level(connector_dir)
        ),
        "connector_version": _get_connector_version(connector_dir),
        "connector_dir": f"{CONNECTOR_PATH_PREFIX}/{connector_name}",
    }


# Create the local sub-app
local_app = App(name="local", help="Local Airbyte monorepo operations.")
app.command(local_app)

# Create the connector sub-app under local
connector_app = App(name="connector", help="Connector operations in the monorepo.")
local_app.command(connector_app)


@connector_app.command(name="list")
def list_connectors(
    repo_path: Annotated[
        str,
        Parameter(help="Absolute path to the Airbyte monorepo."),
    ],
    certified_only: Annotated[
        bool,
        Parameter(help="Include only certified connectors."),
    ] = False,
    modified_only: Annotated[
        bool,
        Parameter(help="Include only modified connectors (requires PR context)."),
    ] = False,
    local_cdk: Annotated[
        bool,
        Parameter(
            help=(
                "Include connectors using local CDK reference. "
                "When combined with --modified-only, adds local-CDK connectors to the modified set."
            )
        ),
    ] = False,
    language: Annotated[
        list[str] | None,
        Parameter(help="Languages to include (python, java, low-code, manifest-only)."),
    ] = None,
    exclude_language: Annotated[
        list[str] | None,
        Parameter(help="Languages to exclude."),
    ] = None,
    connector_type: Annotated[
        str | None,
        Parameter(
            help=(
                "Connector types to include (source, destination). "
                "Accepts CSV or newline-delimited values."
            )
        ),
    ] = None,
    min_support_level: Annotated[
        str | None,
        Parameter(
            help=(
                "Minimum support level (inclusive). "
                "Accepts integer (100, 200, 300) or keyword (archived, community, certified)."
            )
        ),
    ] = None,
    max_support_level: Annotated[
        str | None,
        Parameter(
            help=(
                "Maximum support level (inclusive). "
                "Accepts integer (100, 200, 300) or keyword (archived, community, certified)."
            )
        ),
    ] = None,
    pr: Annotated[
        str | None,
        Parameter(help="PR number or GitHub URL for modification detection."),
    ] = None,
    gh_token: Annotated[
        str | None,
        Parameter(
            help=(
                "GitHub API token. When provided together with --pr, the GitHub API "
                "is used to detect modified files instead of local git diff "
                "(avoids shallow-clone issues)."
            )
        ),
    ] = None,
    exclude_connectors: Annotated[
        list[str] | None,
        Parameter(
            help=(
                "Connectors to exclude from results. "
                "Accepts CSV or newline-delimited values. Can be specified multiple times."
            )
        ),
    ] = None,
    force_include_connectors: Annotated[
        list[str] | None,
        Parameter(
            help=(
                "Connectors to force-include regardless of other filters. "
                "Accepts CSV or newline-delimited values. Can be specified multiple times."
            )
        ),
    ] = None,
    connectors_filter: Annotated[
        list[str] | None,
        Parameter(
            help=(
                "Intersect results with this explicit set of connector names. "
                "Only connectors present in both the filtered results and this set are returned. "
                "Useful for composing multiple filter passes (e.g. combining separate source "
                "and destination lists). "
                "Accepts CSV or newline-delimited values. Can be specified multiple times."
            )
        ),
    ] = None,
    output_format: Annotated[
        OutputFormat,
        Parameter(
            help=(
                'Output format: "csv" (comma-separated), '
                '"lines" (one connector per line), '
                '"json-gh-matrix" (GitHub Actions matrix JSON).'
            )
        ),
    ] = "lines",
    unpublished: Annotated[
        bool,
        Parameter(
            help=(
                "Filter to only connectors whose local dockerImageTag "
                "has not been published to the GCS registry. Requires --store."
            )
        ),
    ] = False,
    store: Annotated[
        str | None,
        Parameter(
            help=(
                "Store target for unpublished check (e.g. 'coral:prod', 'coral:dev'). "
                "Required when --unpublished is set."
            )
        ),
    ] = None,
    assert_none: Annotated[
        bool,
        Parameter(
            help=(
                "Exit with non-zero status if any connectors match the filters. "
                "Useful for audit/CI checks (e.g. --unpublished --assert-none)."
            )
        ),
    ] = False,
    sort_by: Annotated[
        SortBy,
        Parameter(
            help="Sort connectors by the given key. Only 'name' is supported for now."
        ),
    ] = "name",
    sort_direction: Annotated[
        Literal["asc", "desc"],
        Parameter(
            help="Sort direction: 'asc' (ascending, default) or 'desc' (descending)."
        ),
    ] = "asc",
    limit: Annotated[
        int | None,
        Parameter(
            help=(
                "Maximum number of connectors to return. "
                "Applied after sorting. Useful for batched processing."
            )
        ),
    ] = None,
    offset: Annotated[
        int,
        Parameter(
            help=(
                "Number of connectors to skip from the start of the (sorted) list. "
                "Applied after sorting, before --limit. "
                "Useful for batched processing (e.g. --offset=200 --limit=200 for batch 2)."
            )
        ),
    ] = 0,
) -> None:
 390    """List connectors in the Airbyte monorepo with filtering options."""
 391    # Validate mutually exclusive flags
 392    if language and exclude_language:
 393        exit_with_error("Cannot specify both --language and --exclude-language.")
 394    if assert_none and (offset or limit is not None):
 395        exit_with_error(
 396            "Cannot combine --assert-none with --offset or --limit. "
 397            "--assert-none checks all connectors matching the filters, "
 398            "which is incompatible with pagination."
 399        )
 400
 401    # Map CLI flags to MCP tool parameters
 402    certified: bool | None = True if certified_only else None
 403    modified: bool | None = True if modified_only else None
 404
 405    language_filter: set[str] | None = set(language) if language else None
 406    language_exclude: set[str] | None = (
 407        set(exclude_language) if exclude_language else None
 408    )
 409
 410    # Parse connector type filter
 411    connector_type_filter: set[str] | None = None
 412    if connector_type:
 413        try:
 414            connector_type_filter = _parse_connector_types(connector_type)
 415        except ValueError as e:
 416            exit_with_error(str(e))
 417
 418    # Parse support level filters
 419    min_level: SupportLevel | None = None
 420    max_level: SupportLevel | None = None
 421    if min_support_level:
 422        try:
 423            min_level = _parse_support_level(min_support_level)
 424        except ValueError as e:
 425            exit_with_error(str(e))
 426    if max_support_level:
 427        try:
 428            max_level = _parse_support_level(max_support_level)
 429        except ValueError as e:
 430            exit_with_error(str(e))
 431
 432    # Parse exclude/force-include connector lists (merge multiple flag values)
 433    exclude_set: set[str] = set()
 434    if exclude_connectors:
 435        for value in exclude_connectors:
 436            exclude_set.update(_parse_connector_names(value))
 437
 438    force_include_set: set[str] = set()
 439    if force_include_connectors:
 440        for value in force_include_connectors:
 441            force_include_set.update(_parse_connector_names(value))
 442
 443    connectors_filter_set: set[str] | None = None
 444    if connectors_filter:
 445        connectors_filter_set = set()
 446        for value in connectors_filter:
 447            connectors_filter_set.update(_parse_connector_names(value))
 448
 449    result = list_connectors_in_repo(
 450        repo_path=repo_path,
 451        certified=certified,
 452        modified=modified,
 453        language_filter=language_filter,
 454        language_exclude=language_exclude,
 455        pr_num_or_url=pr,
 456        gh_token=gh_token,
 457    )
 458    connectors = list(result.connectors)
 459    repo_path_obj = Path(repo_path)
 460
 461    # Add connectors with local CDK reference if --local-cdk flag is set
 462    if local_cdk:
 463        local_cdk_connectors = get_connectors_with_local_cdk(repo_path)
 464        connectors = sorted(set(connectors) | local_cdk_connectors)
 465
 466    # Apply connector type filter
 467    if connector_type_filter:
 468        connectors = [
 469            name
 470            for name in connectors
 471            if _get_connector_type(name) in connector_type_filter
 472        ]
 473
 474    # Apply support level filters (requires reading metadata)
 475    if min_level is not None or max_level is not None:
 476        filtered_connectors = []
 477        for name in connectors:
 478            connector_dir = repo_path_obj / CONNECTOR_PATH_PREFIX / name
 479            level = _get_connector_support_level(connector_dir)
 480            if level is None:
 481                continue  # Skip connectors without support level
 482            if min_level is not None and level.precedence < min_level.precedence:
 483                continue
 484            if max_level is not None and level.precedence > max_level.precedence:
 485                continue
 486            filtered_connectors.append(name)
 487        connectors = filtered_connectors
 488
 489    # Apply exclude filter
 490    if exclude_set:
 491        connectors = [name for name in connectors if name not in exclude_set]
 492
 493    # Apply connectors filter (intersection, narrows the set)
 494    if connectors_filter_set is not None:
 495        connectors = [name for name in connectors if name in connectors_filter_set]
 496
 497    # Apply force-include (union, overrides all other filters)
 498    if force_include_set:
 499        connectors_set = set(connectors)
 500        connectors_set.update(force_include_set)
 501        connectors = sorted(connectors_set)
 502
 503    # Apply unpublished filter (requires GCS check)
 504    audit_result: AuditResult | None = None
 505    if unpublished:
 506        if not store:
 507            exit_with_error("--store is required when --unpublished is set.")
 508
 509        target = resolve_registry_store(store=store)
 510        audit_result = find_unpublished_connectors(
 511            repo_path=repo_path,
 512            bucket_name=target.bucket,
 513            connector_names=connectors,
 514        )
 515        unpublished_names = {entry.connector_name for entry in audit_result.unpublished}
 516        connectors = [name for name in connectors if name in unpublished_names]
 517
 518        # Log audit summary to stderr
 519        error_console.print(
 520            f"Audit: {audit_result.checked_count} checked, "
 521            f"{len(audit_result.unpublished)} unpublished, "
 522            f"{len(audit_result.skipped_archived)} archived-skipped, "
 523            f"{len(audit_result.skipped_disabled)} disabled-skipped, "
 524            f"{len(audit_result.skipped_rc)} rc-skipped"
 525        )
 526        if audit_result.errors:
 527            for err in audit_result.errors:
 528                error_console.print(f"  Warning: {err}")
 529
 530    # --- Sort, offset, limit (applied after all filters) ---
 531    if offset < 0:
 532        exit_with_error("--offset must be non-negative.")
 533    if limit is not None and limit < 0:
 534        exit_with_error("--limit must be non-negative.")
 535
 536    total_before_slice = len(connectors)
 537    reverse = sort_direction == "desc"
 538    if sort_by == "name":
 539        connectors = sorted(connectors, reverse=reverse)
 540
 541    if offset:
 542        connectors = connectors[offset:]
 543    if limit is not None:
 544        connectors = connectors[:limit]
 545
 546    if offset or limit is not None:
 547        error_console.print(
 548            f"Slice: showing {len(connectors)} of {total_before_slice} "
 549            f"(offset={offset}, limit={limit})"
 550        )
 551
 552    if output_format == "csv":
 553        sys.stdout.write(",".join(connectors) + "\n")
 554    elif output_format == "lines":
 555        for name in connectors:
 556            sys.stdout.write(name + "\n")
 557    elif output_format == "json-gh-matrix":
 558        # Build matrix with full connector metadata
 559        include_list = []
 560        for name in connectors:
 561            connector_dir = repo_path_obj / CONNECTOR_PATH_PREFIX / name
 562            include_list.append(_get_connector_info(name, connector_dir))
 563        matrix = {"include": include_list}
 564        sys.stdout.write(
 565            json.dumps(
 566                matrix,
 567                indent=2,
 568                default=str,
 569            )
 570            + "\n"
 571        )
 572
 573    # Write GitHub Step Summary when GITHUB_STEP_SUMMARY is set
 574    _write_connector_list_summary(
 575        connectors=connectors,
 576        assert_none=assert_none,
 577        is_unpublished=unpublished,
 578        audit_result=audit_result,
 579    )
 580
 581    # Exit non-zero if --assert-none and any connectors matched
 582    if assert_none and connectors:
 583        error_console.print(f"FAIL: {len(connectors)} connector(s) matched.")
 584        sys.exit(1)
 585
 586
 587def _write_github_step_outputs(outputs: dict[str, str | int | None]) -> None:
 588    """Write outputs to GitHub Actions step output file if running in CI."""
 589    github_output = os.getenv("GITHUB_OUTPUT")
 590    if not (os.getenv("CI") and github_output):
 591        return
 592
 593    with open(github_output, "a", encoding="utf-8") as f:
 594        for key, value in outputs.items():
 595            if value is None:
 596                continue
 597            f.write(f"{key}={value}\n")


def _write_connector_list_summary(
    connectors: list[str],
    *,
    assert_none: bool,
    is_unpublished: bool,
    audit_result: AuditResult | None,
) -> None:
    """Write a GitHub Step Summary for the connector list command.

    This is a no-op when `GITHUB_STEP_SUMMARY` is not set.
    """
    github_summary = os.getenv("GITHUB_STEP_SUMMARY")
    if not github_summary:
        return

    summary = generate_connector_list_summary(
        connectors,
        assert_none=assert_none,
        unpublished=is_unpublished,
        audit_result=audit_result,
    )
    write_github_summary(summary)


@connector_app.command(name="info")
def connector_info(
    connector_name: Annotated[
        str,
        Parameter(help="Name of the connector (e.g., source-github)."),
    ],
    repo_path: Annotated[
        str | None,
        Parameter(help="Path to the Airbyte monorepo. Can be inferred from context."),
    ] = None,
) -> None:
    """Get metadata for a single connector.

    Prints JSON output with connector metadata. When running in GitHub Actions
    (CI env var set), also writes each field to GitHub step outputs.
    """
    # Infer repo_path if not provided: walk up from cwd looking for the
    # airbyte-integrations/connectors directory that marks the monorepo root.
    if repo_path is None:
        cwd = Path.cwd()
        for parent in [cwd, *cwd.parents]:
            if (parent / CONNECTOR_PATH_PREFIX).exists():
                repo_path = str(parent)
                break
        if repo_path is None:
            exit_with_error(
                "Could not infer repo path. Please provide --repo-path or run from within the Airbyte monorepo."
            )

    repo_path_obj = Path(repo_path)
    connector_dir = repo_path_obj / CONNECTOR_PATH_PREFIX / connector_name

    if not connector_dir.exists():
        exit_with_error(f"Connector directory not found: {connector_dir}")

    info = _get_connector_info(connector_name, connector_dir)

    # Print JSON output
    print_json(info)

    # Write to GitHub step outputs if in CI
    _write_github_step_outputs(info)


BumpType = Literal[
    "patch",
    "minor",
    "major",
    "patch_rc",
    "minor_rc",
    "major_rc",
    "rc",
    "promote",
]


def _get_git_head_sha(repo_path: Path) -> str:
    """Get the HEAD commit SHA from a git repository.

    Args:
        repo_path: Path to the git repository.

    Returns:
        Full commit SHA string.

    Raises:
        SystemExit: If the git command fails (e.g., not a git repo).
    """
    try:
        result = subprocess.run(
            ["git", "rev-parse", "HEAD"],
            cwd=repo_path,
            capture_output=True,
            text=True,
            check=True,
        )
    except subprocess.CalledProcessError:
        exit_with_error(
            f"Failed to get HEAD SHA from {repo_path}. Is it a git repository?"
        )
    except OSError:
        exit_with_error("Could not run 'git'. Is git installed and on PATH?")
    return result.stdout.strip()
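The SHA from `_get_git_head_sha` feeds `compute_prerelease_docker_image_tag`, which is imported from elsewhere in the package. Judging only from the `bump-version` docstring examples further down (e.g. `1.2.3-preview.abc1234`), the tag combines the base version with a short HEAD SHA; this sketch shows that shape and is an assumption, not the real implementation:

```python
def prerelease_tag(current_version: str, head_sha: str) -> str:
    """Illustrative sketch: '<base version>-preview.<short sha>', as in the CLI examples."""
    base = current_version.split("-", 1)[0]  # drop any existing prerelease suffix
    return f"{base}-preview.{head_sha[:7]}"


print(prerelease_tag("1.2.3", "abc1234def5678"))  # → 1.2.3-preview.abc1234
```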


@connector_app.command(name="get-version")
def get_version(
    name: Annotated[
        str,
        Parameter(help="Connector technical name (e.g., source-github)."),
    ],
    repo_path: Annotated[
        str,
        Parameter(help="Absolute path to the Airbyte monorepo."),
    ],
    compute_next: Annotated[
        bool,
        Parameter(
            name="--next",
            help="Compute the next version instead of the current version.",
        ),
    ] = False,
    bump_type: Annotated[
        BumpType | None,
        Parameter(
            help="Version bump type (requires --next). "
            "Standard: patch, minor, major. RC: patch_rc, minor_rc, major_rc, rc, promote."
        ),
    ] = None,
    prerelease: Annotated[
        bool,
        Parameter(
            help="Compute a prerelease (preview) tag using the repo HEAD SHA (requires --next)."
        ),
    ] = False,
) -> None:
    """Get the current or next version for a connector.

    By default, prints the current version from metadata.yaml.

    With --next, computes and prints the next version. Requires either
    --bump-type or --prerelease to be specified.

    Examples:
        # Current version
        airbyte-ops local connector get-version --name source-github --repo-path /path/to/airbyte

        # Next patch version
        VERSION=$(airbyte-ops local connector get-version --name source-github --repo-path /path/to/airbyte --next --bump-type patch)

        # Next prerelease (preview) tag based on HEAD SHA
        TAG=$(airbyte-ops local connector get-version --name source-github --repo-path /path/to/airbyte --next --prerelease)
    """
    repo = Path(repo_path)

    # Validate flag combinations
    if bump_type and not compute_next:
        exit_with_error("--bump-type requires --next.")
    if prerelease and not compute_next:
        exit_with_error("--prerelease requires --next.")
    if bump_type and prerelease:
        exit_with_error("--bump-type and --prerelease are mutually exclusive.")
    if compute_next and not bump_type and not prerelease:
        exit_with_error(
            "--next requires either --bump-type or --prerelease. "
            "Automatic detection of bump type is not yet supported."
        )

    try:
        connector_path = get_connector_path(repo, name)
        current_version = get_current_version(connector_path)
    except (ConnectorNotFoundError, VersionNotFoundError) as e:
        exit_with_error(str(e))

    if not compute_next:
        sys.stdout.write(current_version + "\n")
        return

    if prerelease:
        sha = _get_git_head_sha(repo)
        try:
            tag = compute_prerelease_docker_image_tag(current_version, sha)
        except InvalidVersionError as e:
            exit_with_error(str(e))
        sys.stdout.write(tag + "\n")
        return

    # bump_type is set (validated above)
    try:
        new_version = calculate_new_version(
            current_version=current_version,
            bump_type=bump_type,
        )
    except (InvalidVersionError, ValueError) as e:
        exit_with_error(str(e))

    sys.stdout.write(new_version + "\n")
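`calculate_new_version` is imported from elsewhere in the package; for the standard bump types its behaviour is ordinary semver arithmetic. A minimal sketch under that assumption (RC bump types, `promote`, and validation are deliberately omitted):

```python
def bump_semver(version: str, bump_type: str) -> str:
    """Bump a plain MAJOR.MINOR.PATCH version string (RC/prerelease handling omitted)."""
    major, minor, patch = (int(part) for part in version.split("."))
    if bump_type == "patch":
        return f"{major}.{minor}.{patch + 1}"
    if bump_type == "minor":
        return f"{major}.{minor + 1}.0"
    if bump_type == "major":
        return f"{major + 1}.0.0"
    raise ValueError(f"Unsupported bump type: {bump_type}")


print(bump_semver("1.2.3", "minor"))  # → 1.3.0
```

Note that `minor` and `major` bumps reset the lower components to zero, which is why the command refuses to guess a bump type automatically: the choice changes the result materially.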


@connector_app.command(name="bump-version")
def bump_version(
    name: Annotated[
        str,
        Parameter(help="Connector technical name (e.g., source-github)."),
    ],
    repo_path: Annotated[
        str,
        Parameter(help="Absolute path to the Airbyte monorepo."),
    ],
    bump_type: Annotated[
        BumpType | None,
        Parameter(
            help="Version bump type. Standard: patch, minor, major. RC: patch_rc, minor_rc, major_rc, rc, promote."
        ),
    ] = None,
    new_version: Annotated[
        str | None,
        Parameter(help="Explicit new version (overrides --bump-type if provided)."),
    ] = None,
    changelog_message: Annotated[
        str | None,
        Parameter(
            help="Message to add to changelog. Ignored if --no-changelog is set."
        ),
    ] = None,
    pr_number: Annotated[
        int | None,
        Parameter(help="PR number for changelog entry."),
    ] = None,
    dry_run: Annotated[
        bool,
        Parameter(help="Show what would be changed without modifying files."),
    ] = False,
    no_changelog: Annotated[
        bool,
        Parameter(
            help="Skip changelog updates even if --changelog-message is provided. "
            "Useful for ephemeral version bumps (e.g. pre-release artifact generation)."
        ),
    ] = False,
    progressive_rollout_enabled: Annotated[
        bool | None,
        Parameter(
            help="Explicitly set `enableProgressiveRollout` in metadata.yaml. "
            "Pass `false` to disable progressive rollout (e.g. for preview builds). "
            "When omitted, the automatic behaviour based on --bump-type is used.",
        ),
    ] = None,
) -> None:
    """Bump a connector's version across all relevant files.

    Updates version in metadata.yaml (always), pyproject.toml (if exists),
    and documentation changelog (if --changelog-message provided).

    Note: --changelog-message is ignored when --no-changelog is set.

    Either --bump-type or --new-version must be provided.

    Examples:
        airbyte-ops local connector bump-version --name source-github --repo-path /path/to/airbyte --bump-type patch
        airbyte-ops local connector bump-version --name source-github --repo-path /path/to/airbyte --new-version 1.2.3-preview.abc1234 --no-changelog
        airbyte-ops local connector bump-version --name source-github --repo-path /path/to/airbyte --new-version 1.2.3-preview.abc1234 --no-changelog --progressive-rollout-enabled=false
    """
    try:
        result = bump_connector_version(
            repo_path=repo_path,
            connector_name=name,
            bump_type=bump_type,
            new_version=new_version,
            changelog_message=changelog_message,
            pr_number=pr_number,
            dry_run=dry_run,
            no_changelog=no_changelog,
            progressive_rollout_enabled=progressive_rollout_enabled,
        )
    except (
        ConnectorNotFoundError,
        VersionNotFoundError,
        InvalidVersionError,
        ValueError,
    ) as e:
        exit_with_error(str(e))

    # Build output matching the issue spec
    output = {
        "connector": result.connector,
        "previous_version": result.previous_version,
        "new_version": result.new_version,
        "files_modified": result.files_modified,
        "dry_run": result.dry_run,
    }
    print_json(output)

    # Write to GitHub step outputs if in CI
    _write_github_step_outputs(
        {
            "connector": result.connector,
            "previous_version": result.previous_version,
            "new_version": result.new_version,
        }
    )


@connector_app.command(name="bump-base-image")
def bump_base_image_cmd(
    name: Annotated[
        str,
        Parameter(help="Connector technical name (e.g., source-github)."),
    ],
    repo_path: Annotated[
        str,
        Parameter(help="Absolute path to the Airbyte monorepo."),
    ],
    force_latest: Annotated[
        bool,
        Parameter(
            help=(
                "Bump to the absolute latest stable base image, ignoring "
                "major-version boundaries. Without this flag the bump stays "
                "within the current major version."
            )
        ),
    ] = False,
    dry_run: Annotated[
        bool,
        Parameter(help="Show what would be changed without modifying files."),
    ] = False,
) -> None:
    """Update a connector's base image.

    Two modes:

    * Default: bump to the latest stable tag within the same major version.
      Major version changes are treated as breaking-change boundaries.
    * --force-latest: bump to the absolute latest stable tag regardless of semver.

    Examples:
        airbyte-ops local connector bump-base-image --name source-github --repo-path /path/to/airbyte
        airbyte-ops local connector bump-base-image --name source-github --repo-path /path/to/airbyte --force-latest
    """
    try:
        result = bump_base_image(
            repo_path=repo_path,
            connector_name=name,
            force_latest=force_latest,
            dry_run=dry_run,
        )
    except (ConnectorNotFoundError, BaseImageError, ValueError) as e:
        exit_with_error(str(e))

    output = {
        "connector": result.connector,
        "previous_base_image": result.previous_base_image,
        "new_base_image": result.new_base_image,
        "updated": result.updated,
        "dry_run": result.dry_run,
        "files_modified": result.files_modified,
        "message": result.message,
    }
    print_json(output)


@connector_app.command(name="bump-cdk")
def bump_cdk_cmd(
    name: Annotated[
        str,
        Parameter(help="Connector technical name (e.g., source-github)."),
    ],
    repo_path: Annotated[
        str,
        Parameter(help="Absolute path to the Airbyte monorepo."),
    ],
    force_latest: Annotated[
        bool,
        Parameter(
            help=(
                "Rewrite the CDK constraint to >=LATEST,<NEXT_MAJOR and refresh "
                "the lock file. Without this flag only the lock file is refreshed "
                "(constraint unchanged)."
            )
        ),
    ] = False,
    dry_run: Annotated[
        bool,
        Parameter(help="Show what would be changed without modifying files."),
    ] = False,
) -> None:
    """Bump a connector's CDK dependency.

    Two modes:

    * Default: refresh the lock file so it resolves the newest CDK that
      satisfies the existing constraint. The constraint is NOT changed.
    * --force-latest: rewrite the constraint to >=LATEST,<NEXT_MAJOR and refresh
      the lock file. Extras (e.g. file-based) are preserved.

    For Java connectors, updates `build.gradle` to the latest CDK version.

    Examples:
        airbyte-ops local connector bump-cdk --name source-github --repo-path /path/to/airbyte
        airbyte-ops local connector bump-cdk --name source-github --repo-path /path/to/airbyte --force-latest
    """
    try:
        result = bump_cdk(
            repo_path=repo_path,
            connector_name=name,
            force_latest=force_latest,
            dry_run=dry_run,
        )
    except (ConnectorNotFoundError, CdkBumpError, ValueError) as e:
        exit_with_error(str(e))

    output = {
        "connector": result.connector,
        "language": result.language,
        "previous_version": result.previous_version,
        "new_version": result.new_version,
        "updated": result.updated,
        "dry_run": result.dry_run,
        "files_modified": result.files_modified,
        "message": result.message,
    }
    print_json(output)


@connector_app.command(name="bump-deps")
def bump_deps_cmd(
    name: Annotated[
        str,
        Parameter(help="Connector technical name (e.g., source-github)."),
    ],
    repo_path: Annotated[
        str,
        Parameter(help="Absolute path to the Airbyte monorepo."),
    ],
    dry_run: Annotated[
        bool,
        Parameter(help="Show what would be changed without modifying files."),
    ] = False,
) -> None:
    """Update a connector's dependencies.

    For Python / low-code connectors using Poetry, this runs
    `poetry update --lock` to refresh the lock file with the latest
    versions allowed by existing constraints.

    For connectors that do not use Poetry (manifest-only, Java, etc.),
    this is a no-op.

    Examples:
        airbyte-ops local connector bump-deps --name source-github --repo-path /path/to/airbyte
        airbyte-ops local connector bump-deps --name source-github --repo-path /path/to/airbyte --dry-run
    """
    try:
        result = bump_deps(
            repo_path=repo_path,
            connector_name=name,
            dry_run=dry_run,
        )
    except (ConnectorNotFoundError, DepsError, ValueError) as e:
        exit_with_error(str(e))

    output = {
        "connector": result.connector,
        "language": result.language,
        "updated": result.updated,
        "dry_run": result.dry_run,
        "files_modified": result.files_modified,
        "outdated_packages": result.outdated_packages,
        "message": result.message,
    }
    print_json(output)


@connector_app.command(name="qa")
def run_qa_checks(
    name: Annotated[
        list[str] | None,
        Parameter(
            help="Connector technical name(s) (e.g., source-github). Can be specified multiple times."
        ),
    ] = None,
    connector_directory: Annotated[
        str | None,
        Parameter(
            help="Directory of connectors; checks run on every connector found inside it."
        ),
    ] = None,
    check: Annotated[
        list[str] | None,
        Parameter(help="Specific check(s) to run. Can be specified multiple times."),
    ] = None,
    report_path: Annotated[
        str | None,
        Parameter(help="Path to write the JSON report file."),
    ] = None,
) -> None:
    """Run QA checks on connector(s).

    Validates connector metadata, documentation, packaging, security, and versioning.
    Exit code is non-zero if any checks fail.
    """
    # Determine which checks to run
    checks_to_run = list(ENABLED_CHECKS)
    if check:
        check_names = set(check)
        checks_to_run = [c for c in ENABLED_CHECKS if type(c).__name__ in check_names]
        if not checks_to_run:
            exit_with_error(
                f"No matching checks found. Available checks: {[type(c).__name__ for c in ENABLED_CHECKS]}"
            )

    # Collect connectors to check
    connectors: list[Connector] = []
    if name:
        connectors.extend(Connector(remove_strict_encrypt_suffix(n)) for n in name)
    if connector_directory:
        connectors.extend(get_all_connectors_in_directory(Path(connector_directory)))

    if not connectors:
        exit_with_error("No connectors specified. Use --name or --connector-directory.")

    connectors = sorted(connectors, key=lambda c: c.technical_name)

    # Run checks synchronously (simpler than async for a CLI)
    all_results = []
    for connector in connectors:
        for qa_check in checks_to_run:
            result = qa_check.run(connector)
            if result.status == CheckStatus.PASSED:
                status_icon = "[green]✅ PASS[/green]"
            elif result.status == CheckStatus.SKIPPED:
                status_icon = "[yellow]🔶 SKIP[/yellow]"
            else:
                status_icon = "[red]❌ FAIL[/red]"
            console.print(
                f"{status_icon} {connector.technical_name}: {result.check.name}"
            )
            if result.message:
                console.print(f"    {result.message}")
            all_results.append(result)

    # Write report if requested
    if report_path:
        Report(check_results=all_results).write(Path(report_path))
        console.print(f"Report written to {report_path}")

    # Exit with error if any checks failed
    failed = [r for r in all_results if r.status == CheckStatus.FAILED]
    if failed:
        exit_with_error(f"{len(failed)} check(s) failed")


@connector_app.command(name="qa-docs-generate")
def generate_qa_docs(
    output_file: Annotated[
        str,
        Parameter(help="Path to write the generated documentation file."),
    ],
) -> None:
    """Generate documentation for QA checks.

    Creates a markdown file documenting all available QA checks organized by category.
    """
    checks_by_category: dict[CheckCategory, list[Check]] = {}
    for qa_check in ENABLED_CHECKS:
        checks_by_category.setdefault(qa_check.category, []).append(qa_check)

    jinja_env = Environment(
        loader=PackageLoader("airbyte_ops_mcp.connector_qa", "templates"),
        autoescape=select_autoescape(),
        trim_blocks=False,
        lstrip_blocks=True,
    )
    template = jinja_env.get_template(CONNECTORS_QA_DOC_TEMPLATE_NAME)
    documentation = template.render(checks_by_category=checks_by_category)

    output_path = Path(output_file)
    output_path.write_text(documentation)
    console.print(f"Documentation written to {output_file}")


# Create the changelog sub-app under connector
changelog_app = App(name="changelog", help="Changelog operations for connectors.")
connector_app.command(changelog_app)


@changelog_app.command(name="check")
def changelog_check(
    connector_name: Annotated[
        str | None,
        Parameter(help="Connector technical name (e.g., source-github)."),
    ] = None,
    all_connectors: Annotated[
        bool,
        Parameter("--all", help="Check all connectors in the repository."),
    ] = False,
    repo_path: Annotated[
        str | None,
        Parameter(help="Path to the Airbyte monorepo. Can be inferred from context."),
    ] = None,
    lookback_days: Annotated[
        int | None,
        Parameter(help="Only check entries with dates within this many days."),
    ] = None,
    strict: Annotated[
        bool,
        Parameter(help="Exit with error code if any issues are found."),
    ] = False,
) -> None:
    """Check changelog entries for issues.

    Validates changelog dates match PR merge dates and checks for PR number mismatches.
    """
    if not connector_name and not all_connectors:
        exit_with_error("Either --connector-name or --all must be specified.")

    if connector_name and all_connectors:
        exit_with_error("Cannot specify both --connector-name and --all.")

    if repo_path is None:
        cwd = Path.cwd()
        for parent in [cwd, *cwd.parents]:
            if (parent / CONNECTOR_PATH_PREFIX).exists():
                repo_path = str(parent)
                break
        if repo_path is None:
            exit_with_error(
                "Could not infer repo path. Please provide --repo-path or run from within the Airbyte monorepo."
            )

    total_issues = 0

    if all_connectors:
        results = check_all_changelogs(repo_path=repo_path, lookback_days=lookback_days)
        for result in results:
            if result.has_issues or result.errors:
                _print_check_result(result)
                total_issues += result.issue_count
    else:
        result = check_changelog(
            repo_path=repo_path,
            connector_name=connector_name,
            lookback_days=lookback_days,
        )
        _print_check_result(result)
        total_issues = result.issue_count

    if total_issues > 0:
        console.print(f"\n[bold]Total issues found: {total_issues}[/bold]")
        if strict:
            exit_with_error(f"Found {total_issues} issue(s) in changelog(s).")
    else:
        console.print("[green]No issues found.[/green]")


def _print_check_result(result: ChangelogCheckResult) -> None:
    """Print a changelog check result."""
    if not result.has_issues and not result.errors:
        return

    console.print(f"\n[bold]{result.connector}[/bold]")

    for warning in result.pr_mismatch_warnings:
        console.print(
            f"  [yellow]WARNING[/yellow] Line {warning.line_number} (v{warning.version}): {warning.message}"
        )

    for fix in result.date_issues:
        if fix.changed:
            console.print(
                f"  [red]DATE MISMATCH[/red] Line {fix.line_number} (v{fix.version}): "
                f"changelog has {fix.old_date}, PR merged on {fix.new_date}"
            )

    for error in result.errors:
        console.print(f"  [red]ERROR[/red] {error}")


@changelog_app.command(name="fix")
def changelog_fix(
    connector_name: Annotated[
        str | None,
        Parameter(help="Connector technical name (e.g., source-github)."),
    ] = None,
    all_connectors: Annotated[
        bool,
        Parameter("--all", help="Fix all connectors in the repository."),
    ] = False,
    repo_path: Annotated[
        str | None,
        Parameter(help="Path to the Airbyte monorepo. Can be inferred from context."),
    ] = None,
    lookback_days: Annotated[
        int | None,
        Parameter(help="Only fix entries with dates within this many days."),
    ] = None,
    dry_run: Annotated[
        bool,
        Parameter(help="Print changes without modifying files."),
    ] = False,
) -> None:
    """Fix changelog entry dates to match PR merge dates.

    Looks up the actual merge date for each PR referenced in the changelog
    and updates the date column to match.
    """
    if not connector_name and not all_connectors:
        exit_with_error("Either --connector-name or --all must be specified.")

    if connector_name and all_connectors:
        exit_with_error("Cannot specify both --connector-name and --all.")

    if repo_path is None:
        cwd = Path.cwd()
        for parent in [cwd, *cwd.parents]:
            if (parent / CONNECTOR_PATH_PREFIX).exists():
                repo_path = str(parent)
                break
        if repo_path is None:
            exit_with_error(
                "Could not infer repo path. Please provide --repo-path or run from within the Airbyte monorepo."
1336            )
1337
1338    total_fixed = 0
1339    total_warnings = 0
1340
1341    if all_connectors:
1342        results = fix_all_changelog_dates(
1343            repo_path=repo_path, dry_run=dry_run, lookback_days=lookback_days
1344        )
1345        for result in results:
1346            if result.has_changes or result.warnings or result.errors:
1347                _print_fix_result(result)
1348                total_fixed += result.changed_count
1349                total_warnings += len(result.warnings)
1350    else:
1351        result = fix_changelog_dates(
1352            repo_path=repo_path,
1353            connector_name=connector_name,
1354            dry_run=dry_run,
1355            lookback_days=lookback_days,
1356        )
1357        _print_fix_result(result)
1358        total_fixed = result.changed_count
1359        total_warnings = len(result.warnings)
1360
1361    action = "Would fix" if dry_run else "Fixed"
1362    console.print(f"\n[bold]{action} {total_fixed} date(s).[/bold]")
1363    if total_warnings > 0:
1364        console.print(
1365            f"[yellow]{total_warnings} warning(s) about PR number mismatches.[/yellow]"
1366        )
1367
1368
1369def _print_fix_result(result: ChangelogFixResult) -> None:
1370    """Print a changelog fix result."""
1371    if not result.has_changes and not result.warnings and not result.errors:
1372        return
1373
1374    console.print(f"\n[bold]{result.connector}[/bold]")
1375
1376    for warning in result.warnings:
1377        console.print(
1378            f"  [yellow]WARNING[/yellow] Line {warning.line_number} (v{warning.version}): {warning.message}"
1379        )
1380
1381    for fix in result.fixes:
1382        if fix.changed:
1383            action = "Would fix" if result.dry_run else "Fixed"
1384            console.print(
1385                f"  [green]{action}[/green] Line {fix.line_number} (v{fix.version}): "
1386                f"{fix.old_date} -> {fix.new_date}"
1387            )
1388
1389    for error in result.errors:
1390        console.print(f"  [red]ERROR[/red] {error}")
1391
1392
1393@changelog_app.command(name="add")
1394def changelog_add(
1395    connector_name: Annotated[
1396        str,
1397        Parameter(help="Connector technical name (e.g., source-github)."),
1398    ],
1399    pr_number: Annotated[
1400        int,
1401        Parameter(help="PR number for the changelog entry."),
1402    ],
1403    message: Annotated[
1404        str,
1405        Parameter(help="Changelog entry message."),
1406    ],
1407    repo_path: Annotated[
1408        str | None,
1409        Parameter(help="Path to the Airbyte monorepo. Can be inferred from context."),
1410    ] = None,
1411    dry_run: Annotated[
1412        bool,
1413        Parameter(help="Print changes without modifying files."),
1414    ] = False,
1415) -> None:
1416    """Add a changelog entry for a connector using its current version.
1417
    Reads the version from metadata.yaml and writes a single changelog
    entry to the connector's documentation file. Does not modify any
    version files.
1421    """
1422    if repo_path is None:
1423        cwd = Path.cwd()
1424        for parent in [cwd, *cwd.parents]:
1425            if (parent / CONNECTOR_PATH_PREFIX).exists():
1426                repo_path = str(parent)
1427                break
1428        if repo_path is None:
1429            exit_with_error(
1430                "Could not infer repo path. Please provide --repo-path or run from within the Airbyte monorepo."
1431            )
1432
1433    try:
1434        connector_path = get_connector_path(Path(repo_path), connector_name)
1435        version = get_current_version(connector_path)
1436    except (ConnectorNotFoundError, VersionNotFoundError) as e:
1437        exit_with_error(str(e))
1438
1439    doc_path = get_connector_doc_path(Path(repo_path), connector_name)
1440    if doc_path is None or not doc_path.exists():
1441        exit_with_error(f"Documentation file not found for {connector_name}.")
1442
1443    modified = update_changelog(
1444        doc_path=doc_path,
1445        new_version=version,
1446        changelog_message=message,
1447        pr_number=pr_number,
1448        dry_run=dry_run,
1449    )
1450
1451    action = "Would add" if dry_run else "Added"
1452    if modified:
1453        console.print(
1454            f"[green]{action} changelog entry for {connector_name} v{version}[/green]"
1455        )
1456    else:
1457        console.print(
1458            f"[yellow]No changes needed for {connector_name} v{version}[/yellow]"
1459        )
1460
1461
1462# Create the marketing-stub sub-app under connector
1463marketing_stub_app = App(
1464    name="marketing-stub",
1465    help="Marketing connector stub operations (local file validation and updates).",
1466)
1467connector_app.command(marketing_stub_app)
1468
1469# Path to connectors in the airbyte-enterprise repo
1470ENTERPRISE_CONNECTOR_PATH_PREFIX = "airbyte-integrations/connectors"
1471
1472
1473def _build_stub_from_metadata(
1474    connector_name: str,
1475    metadata: dict,
1476    existing_stub: dict | None = None,
1477) -> dict:
1478    """Build a connector stub from metadata.yaml.
1479
1480    Args:
1481        connector_name: The connector name (e.g., 'source-oracle-enterprise').
1482        metadata: The parsed metadata.yaml content.
1483        existing_stub: Optional existing stub to preserve extra fields from.
1484
1485    Returns:
1486        A connector stub dictionary.
1487    """
1488    data = metadata.get("data", {})
1489
1490    # Determine connector type for the stub
1491    connector_type = data.get("connectorType", "source")
1492    stub_type = f"enterprise_{connector_type}"
1493
1494    # Preserve existing stub ID if available, otherwise use connector name
1495    stub_id = (existing_stub.get("id") if existing_stub else None) or connector_name
1496
1497    # Get the icon URL - construct from icon filename if available
1498    icon_filename = data.get("icon", "")
1499    if icon_filename and not icon_filename.startswith("http"):
1500        # Construct icon URL from the standard GCS path
1501        icon_url = f"https://storage.googleapis.com/prod-airbyte-cloud-connector-metadata-service/resources/connector_stubs/v0/icons/{icon_filename}"
1502    else:
1503        icon_url = icon_filename or ""
1504
1505    # Build the stub
1506    stub: dict = {
1507        "id": stub_id,
1508        "name": data.get("name", connector_name.replace("-", " ").title()),
1509        "label": "enterprise",
1510        "icon": icon_url,
1511        "url": data.get("documentationUrl", ""),
1512        "type": stub_type,
1513    }
1514
1515    # Add definitionId if available
1516    definition_id = data.get("definitionId")
1517    if definition_id:
1518        stub["definitionId"] = definition_id
1519
1520    # Preserve extra fields from existing stub (like codename)
1521    if existing_stub:
1522        for key in existing_stub:
1523            if key not in stub:
1524                stub[key] = existing_stub[key]
1525
1526    return stub
1527
1528
1529@marketing_stub_app.command(name="check")
1530def marketing_stub_check(
1531    connector: Annotated[
1532        str | None,
1533        Parameter(help="Connector name to check (e.g., 'source-oracle-enterprise')."),
1534    ] = None,
1535    all_connectors: Annotated[
1536        bool,
1537        Parameter("--all", help="Check all stubs in the file."),
1538    ] = False,
1539    repo_root: Annotated[
1540        Path | None,
1541        Parameter(
1542            help="Path to the airbyte-enterprise repository root. Defaults to current directory."
1543        ),
1544    ] = None,
1545) -> None:
1546    """Validate marketing connector stub entries.
1547
    Checks that stub entries have valid required fields (id, name, url, icon)
    and, when a local connector directory exists, validates that the stub
    matches the connector's metadata.yaml.
1550
1551    Exit codes:
1552        0: All checks passed
1553        1: Validation errors found
1554
1555    Output:
1556        STDOUT: JSON validation result
1557        STDERR: Informational messages
1558
1559    Example:
1560        airbyte-ops local connector marketing-stub check --connector source-oracle-enterprise --repo-root /path/to/airbyte-enterprise
1561        airbyte-ops local connector marketing-stub check --all --repo-root /path/to/airbyte-enterprise
1562    """
1563    if not connector and not all_connectors:
1564        exit_with_error("Either --connector or --all must be specified.")
1565
1566    if connector and all_connectors:
1567        exit_with_error("Cannot specify both --connector and --all.")
1568
1569    if repo_root is None:
1570        repo_root = Path.cwd()
1571
1572    # Load local stubs
    try:
        stubs = load_local_stubs(repo_root)
    except (FileNotFoundError, ValueError) as e:
        exit_with_error(str(e))
1579
1580    stubs_to_check = stubs if all_connectors else []
1581    if connector:
1582        stub = find_stub_by_connector(stubs, connector)
1583        if stub is None:
1584            exit_with_error(
1585                f"Connector stub '{connector}' not found in {CONNECTOR_STUBS_FILE}"
1586            )
1587        stubs_to_check = [stub]
1588
1589    errors: list[dict] = []
1590    warnings: list[dict] = []
1591    placeholders: list[dict] = []
1592
1593    for stub in stubs_to_check:
1594        stub_id = stub.get("id", "<unknown>")
1595        stub_name = stub.get("name", stub_id)
1596
1597        # Check required fields
1598        required_fields = ["id", "name", "url", "icon"]
1599        for field in required_fields:
1600            if not stub.get(field):
1601                errors.append(
1602                    {"stub_id": stub_id, "error": f"Missing required field: {field}"}
1603                )
1604
1605        # Check if corresponding connector exists and validate against metadata
1606        connector_dir = repo_root / ENTERPRISE_CONNECTOR_PATH_PREFIX / stub_id
1607        metadata_file = connector_dir / METADATA_FILE_NAME
1608
1609        if metadata_file.exists():
            # An empty metadata.yaml parses to None, so fall back to an empty dict
            metadata = yaml.safe_load(metadata_file.read_text()) or {}
1611            data = metadata.get("data", {})
1612
1613            # Check if definitionId matches
1614            metadata_def_id = data.get("definitionId")
1615            stub_def_id = stub.get("definitionId")
1616            if metadata_def_id and stub_def_id and metadata_def_id != stub_def_id:
1617                errors.append(
1618                    {
1619                        "stub_id": stub_id,
1620                        "error": f"definitionId mismatch: stub has '{stub_def_id}', metadata has '{metadata_def_id}'",
1621                    }
1622                )
1623
1624            # Check if name matches
1625            metadata_name = data.get("name")
1626            if metadata_name and stub_name and metadata_name != stub_name:
1627                warnings.append(
1628                    {
1629                        "stub_id": stub_id,
1630                        "warning": f"name mismatch: stub has '{stub_name}', metadata has '{metadata_name}'",
1631                    }
1632                )
1633        else:
1634            # No connector directory - this is a registry placeholder for a future connector
1635            placeholders.append(
1636                {
1637                    "stub_id": stub_id,
1638                    "name": stub_name,
1639                }
1640            )
1641
1642    result = {
1643        "checked_count": len(stubs_to_check),
1644        "error_count": len(errors),
1645        "warning_count": len(warnings),
1646        "placeholder_count": len(placeholders),
1647        "valid": len(errors) == 0,
1648        "errors": errors,
1649        "warnings": warnings,
1650        "placeholders": placeholders,
1651    }
1652
    # Print placeholders as info (not warnings - these are valid registry placeholders)
    if placeholders:
        error_console.print(
            f"[blue]Found {len(placeholders)} registry placeholder(s) (no local directory):[/blue]"
        )
        for placeholder in placeholders:
            error_console.print(f"  {placeholder['stub_id']}: {placeholder['name']}")
1662
1663    if errors:
1664        error_console.print(f"[red]Found {len(errors)} error(s):[/red]")
1665        for err in errors:
1666            error_console.print(f"  {err['stub_id']}: {err['error']}")
1667
1668    if warnings:
1669        error_console.print(f"[yellow]Found {len(warnings)} warning(s):[/yellow]")
1670        for warn in warnings:
1671            error_console.print(f"  {warn['stub_id']}: {warn['warning']}")
1672
1673    if not errors and not warnings:
1674        error_console.print(
1675            f"[green]All {len(stubs_to_check)} stub(s) passed validation[/green]"
1676        )
1677
1678    print_json(result)
1679
1680    if errors:
1681        exit_with_error("Validation failed", code=1)
1682
1683
1684@marketing_stub_app.command(name="sync")
1685def marketing_stub_sync(
1686    connector: Annotated[
1687        str | None,
1688        Parameter(help="Connector name to sync (e.g., 'source-oracle-enterprise')."),
1689    ] = None,
1690    all_connectors: Annotated[
1691        bool,
1692        Parameter("--all", help="Sync all connectors that have metadata.yaml files."),
1693    ] = False,
1694    repo_root: Annotated[
1695        Path | None,
1696        Parameter(
1697            help="Path to the airbyte-enterprise repository root. Defaults to current directory."
1698        ),
1699    ] = None,
1700    dry_run: Annotated[
1701        bool,
1702        Parameter(help="Show what would be synced without making changes."),
1703    ] = False,
1704) -> None:
1705    """Sync connector stub(s) from connector metadata.yaml file(s).
1706
1707    Reads the connector's metadata.yaml file and updates the corresponding
1708    entry in connector_stubs.json with the current values.
1709
1710    Exit codes:
1711        0: Sync successful (or dry-run completed)
1712        1: Error (connector not found, no metadata, etc.)
1713
1714    Output:
1715        STDOUT: JSON representation of the synced stub(s)
1716        STDERR: Informational messages
1717
1718    Example:
1719        airbyte-ops local connector marketing-stub sync --connector source-oracle-enterprise --repo-root /path/to/airbyte-enterprise
1720        airbyte-ops local connector marketing-stub sync --all --repo-root /path/to/airbyte-enterprise
1721        airbyte-ops local connector marketing-stub sync --connector source-oracle-enterprise --dry-run
1722    """
1723    if not connector and not all_connectors:
1724        exit_with_error("Either --connector or --all must be specified.")
1725
1726    if connector and all_connectors:
1727        exit_with_error("Cannot specify both --connector and --all.")
1728
1729    if repo_root is None:
1730        repo_root = Path.cwd()
1731
1732    # Load existing stubs
1733    try:
1734        stubs = load_local_stubs(repo_root)
1735    except FileNotFoundError:
1736        stubs = []
1737    except ValueError as e:
1738        exit_with_error(str(e))
1739
1740    # Determine which connectors to sync
1741    connectors_to_sync: list[str] = []
1742    if connector:
1743        connectors_to_sync = [connector]
1744    else:
1745        # Find all connectors with metadata.yaml in the enterprise connectors directory
1746        connectors_dir = repo_root / ENTERPRISE_CONNECTOR_PATH_PREFIX
1747        if connectors_dir.exists():
1748            for item in connectors_dir.iterdir():
1749                if item.is_dir() and (item / METADATA_FILE_NAME).exists():
1750                    connectors_to_sync.append(item.name)
1751        connectors_to_sync.sort()
1752
1753    if not connectors_to_sync:
1754        exit_with_error("No connectors found to sync.")
1755
1756    synced_stubs: list[dict] = []
1757    updated_count = 0
1758    added_count = 0
1759
1760    for conn_name in connectors_to_sync:
1761        connector_dir = repo_root / ENTERPRISE_CONNECTOR_PATH_PREFIX / conn_name
1762        metadata_file = connector_dir / METADATA_FILE_NAME
1763
1764        if not connector_dir.exists():
1765            if connector:
1766                exit_with_error(f"Connector directory not found: {connector_dir}")
1767            continue
1768
1769        if not metadata_file.exists():
1770            if connector:
1771                exit_with_error(f"Metadata file not found: {metadata_file}")
1772            continue
1773
        # Load metadata (an empty metadata.yaml parses to None)
        metadata = yaml.safe_load(metadata_file.read_text()) or {}
1776
1777        # Find existing stub if any
1778        existing_stub = find_stub_by_connector(stubs, conn_name)
1779
1780        # Build new stub from metadata
1781        new_stub = _build_stub_from_metadata(conn_name, metadata, existing_stub)
1782
        # Validate the new stub (the model constructor raises on invalid or missing fields)
        ConnectorStub(**new_stub)
1785
1786        if dry_run:
1787            action = "update" if existing_stub else "create"
1788            error_console.print(f"[DRY RUN] Would {action} stub for '{conn_name}'")
1789            synced_stubs.append(new_stub)
1790            continue
1791
1792        # Update or add the stub
1793        if existing_stub:
1794            # Find and replace
1795            for i, stub in enumerate(stubs):
1796                if stub.get("id") == existing_stub.get("id"):
1797                    stubs[i] = new_stub
1798                    break
1799            updated_count += 1
1800        else:
1801            stubs.append(new_stub)
1802            added_count += 1
1803
1804        synced_stubs.append(new_stub)
1805
1806    if not dry_run:
1807        # Save the updated stubs
1808        save_local_stubs(repo_root, stubs)
1809        error_console.print(
1810            f"[green]Synced {len(synced_stubs)} stub(s) to {CONNECTOR_STUBS_FILE} "
1811            f"({added_count} added, {updated_count} updated)[/green]"
1812        )
1813    else:
1814        error_console.print(
1815            f"[DRY RUN] Would sync {len(synced_stubs)} stub(s) to {CONNECTOR_STUBS_FILE}"
1816        )
1817
    if all_connectors:
        print_json(synced_stubs)
    else:
        print_json(synced_stubs[0] if synced_stubs else {})