Conversation

@sundarshankar89 sundarshankar89 (Collaborator) commented Dec 11, 2025

Changes

What does this PR do?

Adds a Profiler Test Connection command to the CLI:

databricks labs lakebridge test-profiler-connection
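
For reviewers unfamiliar with the feature, here is a minimal sketch of what a profiler connection test conceptually does. It is not the implementation in this PR (that lives in cli.py, connections/database_manager.py and connections/synapse_helpers.py per the coverage report below); the function name, connection URL, and the SQLAlchemy-based probe are illustrative assumptions.

```python
# Conceptual sketch only: the real command is wired up in src/databricks/labs/lakebridge/cli.py
# and uses the connections/database_manager.py helpers; the names below are hypothetical.
from sqlalchemy import create_engine, text


def test_profiler_connection(connection_url: str) -> bool:
    """Run a trivial round-trip query to confirm the profiler's source database is reachable."""
    engine = create_engine(connection_url)
    try:
        with engine.connect() as conn:
            conn.execute(text("SELECT 1"))
        print("Profiler connection OK")
        return True
    except Exception as exc:  # report the reason back to the CLI user instead of a stack trace
        print(f"Profiler connection failed: {exc}")
        return False


if __name__ == "__main__":
    # Hypothetical Synapse dedicated SQL pool URL; requires pyodbc and an installed ODBC driver.
    url = (
        "mssql+pyodbc://user:password@myworkspace.sql.azuresynapse.net/mydb"
        "?driver=ODBC+Driver+18+for+SQL+Server"
    )
    test_profiler_connection(url)
```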

Relevant implementation details

Caveats/things to watch out for when reviewing:

Linked issues

Resolves #..

Functionality

  • added relevant user documentation
  • added new CLI command
  • modified existing command: databricks labs lakebridge ...

Tests

  • manually tested
  • added unit tests
  • added integration tests

@sundarshankar89 sundarshankar89 self-assigned this Dec 11, 2025
@sundarshankar89 sundarshankar89 added the feat/profiler (Issues related to profilers) and feat/cli (actions that are visible to the user) labels Dec 11, 2025
codecov bot commented Dec 11, 2025

Codecov Report

❌ Patch coverage is 13.20755% with 92 lines in your changes missing coverage. Please review.
✅ Project coverage is 63.36%. Comparing base (7ac560e) to head (5915661).

Files with missing lines | Patch % | Missing lines
...cks/labs/lakebridge/connections/synapse_helpers.py | 10.41% | 43 ⚠️
src/databricks/labs/lakebridge/cli.py | 14.58% | 41 ⚠️
...ks/labs/lakebridge/connections/database_manager.py | 22.22% | 7 ⚠️
.../resources/assessments/synapse/common/connector.py | 0.00% | 1 ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2186      +/-   ##
==========================================
- Coverage   63.95%   63.36%   -0.59%     
==========================================
  Files          99      100       +1     
  Lines        8644     8745     +101     
  Branches      890      902      +12     
==========================================
+ Hits         5528     5541      +13     
- Misses       2944     3032      +88     
  Partials      172      172              

☔ View full report in Codecov by Sentry.

github-actions bot commented Dec 11, 2025

❌ 134/135 passed, 7 flaky, 1 failed, 5 skipped, 30m4s total

❌ test_recon_databricks_job_succeeds: TimeoutError: timed out after 0:20:00: (20m34.639s)
TimeoutError: timed out after 0:20:00:
07:38 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy_coknkq7lp catalog: https://DATABRICKS_HOST/#explore/data/dummy_coknkq7lp
07:38 DEBUG [databricks.labs.pytester.fixtures.baseline] added catalog fixture: CatalogInfo(browse_only=False, catalog_type=<CatalogType.MANAGED_CATALOG: 'MANAGED_CATALOG'>, comment=None, connection_name=None, created_at=1769585908719, created_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', effective_predictive_optimization_flag=EffectivePredictiveOptimizationFlag(value=<EnablePredictiveOptimization.DISABLE: 'DISABLE'>, inherited_from_name='primary', inherited_from_type=None), enable_predictive_optimization=<EnablePredictiveOptimization.INHERIT: 'INHERIT'>, full_name='dummy_coknkq7lp', isolation_mode=<CatalogIsolationMode.OPEN: 'OPEN'>, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='dummy_coknkq7lp', options=None, owner='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', properties={'RemoveAfter': '2026012809'}, provider_name=None, provisioning_info=None, securable_type=<SecurableType.CATALOG: 'CATALOG'>, share_name=None, storage_location=None, storage_root=None, updated_at=1769585908719, updated_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e')
07:38 INFO [tests.integration.reconcile.conftest] Created catalog dummy_coknkq7lp for recon tests
07:38 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy_coknkq7lp.dummy_s3puv9y2o schema: https://DATABRICKS_HOST/#explore/data/dummy_coknkq7lp/dummy_s3puv9y2o
07:38 DEBUG [databricks.labs.pytester.fixtures.baseline] added schema fixture: SchemaInfo(browse_only=None, catalog_name='dummy_coknkq7lp', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='dummy_coknkq7lp.dummy_s3puv9y2o', metastore_id=None, name='dummy_s3puv9y2o', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
07:38 INFO [tests.integration.reconcile.conftest] Created schema dummy_s3puv9y2o in catalog dummy_coknkq7lp for recon tests
07:38 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy_s3puv9y2o volume: https://DATABRICKS_HOST/#explore/data/dummy_coknkq7lp/dummy_s3puv9y2o/dummy_s3puv9y2o
07:38 DEBUG [databricks.labs.pytester.fixtures.baseline] added volume fixture: VolumeInfo(access_point=None, browse_only=None, catalog_name='dummy_coknkq7lp', comment=None, created_at=1769585910679, created_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', encryption_details=None, full_name='dummy_coknkq7lp.dummy_s3puv9y2o.dummy_s3puv9y2o', metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='dummy_s3puv9y2o', owner='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', schema_name='dummy_s3puv9y2o', storage_location='abfss://labs-CLOUD_ENV-TEST_CATALOG-container@databrickslabsstorage.dfs.core.windows.net/8952c1e3-b265-4adf-98c3-6f755e2e1453/volumes/f1cdd103-0846-406f-a7f1-7b91ea03d9de', updated_at=1769585910679, updated_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', volume_id='f1cdd103-0846-406f-a7f1-7b91ea03d9de', volume_type=<VolumeType.MANAGED: 'MANAGED'>)
07:38 INFO [tests.integration.reconcile.test_recon_databricks] Using recon job overrides: ReconcileJobConfig(existing_cluster_id='DATABRICKS_CLUSTER_ID', tags={'RemoveAfter': '2026012809'})
07:38 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy_coknkq7lp.dummy_s3puv9y2o.dummy_tllkpbobl schema: https://DATABRICKS_HOST/#explore/data/dummy_coknkq7lp/dummy_s3puv9y2o/dummy_tllkpbobl
07:38 DEBUG [databricks.labs.pytester.fixtures.baseline] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='dummy_coknkq7lp', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=<DataSourceFormat.DELTA: 'DELTA'>, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='dummy_coknkq7lp.dummy_s3puv9y2o.dummy_tllkpbobl', metastore_id=None, name='dummy_tllkpbobl', owner=None, pipeline_id=None, properties={'RemoveAfter': '2026012809'}, row_filter=None, schema_name='dummy_s3puv9y2o', securable_kind_manifest=None, sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/dummy_s3puv9y2o/dummy_tllkpbobl', table_constraints=None, table_id=None, table_type=<TableType.MANAGED: 'MANAGED'>, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
07:38 INFO [databricks.labs.pytester.fixtures.baseline] Created dummy_coknkq7lp.dummy_s3puv9y2o.dummy_tpfhvjjck schema: https://DATABRICKS_HOST/#explore/data/dummy_coknkq7lp/dummy_s3puv9y2o/dummy_tpfhvjjck
07:38 DEBUG [databricks.labs.pytester.fixtures.baseline] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='dummy_coknkq7lp', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=<DataSourceFormat.DELTA: 'DELTA'>, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='dummy_coknkq7lp.dummy_s3puv9y2o.dummy_tpfhvjjck', metastore_id=None, name='dummy_tpfhvjjck', owner=None, pipeline_id=None, properties={'RemoveAfter': '2026012809'}, row_filter=None, schema_name='dummy_s3puv9y2o', securable_kind_manifest=None, sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/dummy_s3puv9y2o/dummy_tpfhvjjck', table_constraints=None, table_id=None, table_type=<TableType.MANAGED: 'MANAGED'>, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
07:38 INFO [tests.integration.reconcile.conftest] Created recon tables dummy_tllkpbobl, dummy_tpfhvjjck in schema dummy_s3puv9y2o
07:38 INFO [tests.integration.reconcile.conftest] Inserted data into table dummy_tllkpbobl and got response StatementStatus(error=None, state=<StatementState.SUCCEEDED: 'SUCCEEDED'>)
07:38 INFO [tests.integration.reconcile.conftest] Inserted data into table dummy_tpfhvjjck and got response StatementStatus(error=None, state=<StatementState.SUCCEEDED: 'SUCCEEDED'>)
07:38 INFO [tests.integration.reconcile.test_recon_databricks] Setting up application context for recon tests
07:38 INFO [tests.integration.reconcile.test_recon_databricks] Installing app and recon configuration into workspace
07:38 DEBUG [databricks.labs.lakebridge.install] Upgrades applied successfully.
07:38 INFO [databricks.labs.lakebridge.install] Installing Lakebridge reconcile Metadata components.
07:38 INFO [databricks.labs.lakebridge.deployment.recon] Installing reconcile components.
07:38 INFO [databricks.labs.lakebridge.deployment.recon] Deploying reconciliation metadata tables.
07:38 INFO [databricks.labs.lakebridge.deployment.table] Deploying table main in dummy_coknkq7lp.dummy_s3puv9y2o
07:38 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
07:38 INFO [databricks.labs.lakebridge.deployment.table] Deploying table metric in dummy_coknkq7lp.dummy_s3puv9y2o
07:38 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
07:38 INFO [databricks.labs.lakebridge.deployment.table] Deploying table detai in dummy_coknkq7lp.dummy_s3puv9y2o
07:38 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
07:38 INFO [databricks.labs.lakebridge.deployment.table] Deploying table aggregate_metric in dummy_coknkq7lp.dummy_s3puv9y2o
07:38 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
07:38 INFO [databricks.labs.lakebridge.deployment.table] Deploying table aggregate_detai in dummy_coknkq7lp.dummy_s3puv9y2o
07:38 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
07:38 INFO [databricks.labs.lakebridge.deployment.table] Deploying table aggregate_rule in dummy_coknkq7lp.dummy_s3puv9y2o
07:38 INFO [databricks.labs.lakebridge.deployment.table] SQL Backend used for deploying table: StatementExecutionBackend
07:38 INFO [databricks.labs.lakebridge.deployment.recon] Deploying reconciliation dashboards.
07:38 INFO [databricks.labs.lakebridge.deployment.dashboard] Deploying dashboards from base folder /home/runner/work/lakebridge/lakebridge/src/databricks/labs/lakebridge/resources/reconcile/dashboards
07:38 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
07:38 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
07:38 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
07:38 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
07:38 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
07:38 INFO [databricks.labs.lakebridge.deployment.dashboard] Dashboard deployed with URL: https://DATABRICKS_HOST/sql/dashboardsv3/01f0fc1c5d0311988fe2670ab080c78f
07:38 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
07:38 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
07:38 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
07:38 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
07:38 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
07:38 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
07:38 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
07:38 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
07:38 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
07:38 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
07:38 WARNING [databricks.labs.lsql.dashboards] Parsing : No expression was parsed from ''
07:38 INFO [databricks.labs.lakebridge.deployment.dashboard] Dashboard deployed with URL: https://DATABRICKS_HOST/sql/dashboardsv3/01f0fc1c5e7413f6b6cc1d2c50806e03
07:38 INFO [databricks.labs.lakebridge.deployment.recon] Deploying reconciliation jobs.
07:38 INFO [databricks.labs.lakebridge.deployment.job] Deploying reconciliation job.
07:38 DEBUG [databricks.labs.lakebridge.deployment.job] Applying deployment overrides: ReconcileJobConfig(existing_cluster_id='DATABRICKS_CLUSTER_ID', tags={'RemoveAfter': '2026012809'})
07:38 DEBUG [databricks.labs.lakebridge.deployment.job] Reconciliation job task cluster: existing: DATABRICKS_CLUSTER_ID or name: None
07:38 INFO [databricks.labs.lakebridge.deployment.job] Updating configuration for job `Reconciliation Runner`, job_id=449269971857771
07:38 INFO [databricks.labs.lakebridge.deployment.job] Reconciliation job deployed with job_id=449269971857771
07:38 INFO [databricks.labs.lakebridge.deployment.job] Job URL: https://DATABRICKS_HOST#job/449269971857771
07:38 INFO [databricks.labs.lakebridge.deployment.recon] Installation of reconcile components completed successfully.
07:38 INFO [tests.integration.reconcile.test_recon_databricks] Application context setup complete for recon tests
[gw8] linux -- Python 3.10.19 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
07:38 DEBUG [databricks.labs.lakebridge.reconcile.runner] Reconcile job id found in the install state.
07:38 INFO [databricks.labs.lakebridge.reconcile.runner] Triggering the reconcile job with job_id: `449269971857771`
07:38 INFO [databricks.labs.lakebridge.reconcile.runner] 'RECONCILE' job started. Please check the job_url `https://DATABRICKS_HOST/jobs/449269971857771/runs/634510219795170` for the current status.
07:58 INFO [tests.integration.reconcile.test_recon_databricks] Reconcile job run had 1 tasks
07:58 INFO [tests.integration.reconcile.test_recon_databricks] Task run_reconciliation has error message: No output is available until the task begins.
07:59 INFO [tests.integration.reconcile.test_recon_databricks] Tearing down application context for recon tests
07:59 INFO [databricks.labs.lakebridge.install] Uninstalling Lakebridge from https://DATABRICKS_HOST.
07:59 ERROR [databricks.labs.lakebridge.install] Check if /Users/3fe685a1-96cc-4fec-8cdb-6944f5c9787e/.lakebridge is present. Aborting uninstallation.
07:59 INFO [tests.integration.reconcile.test_recon_databricks] Application context teardown complete for recon tests
07:59 DEBUG [databricks.labs.pytester.fixtures.baseline] clearing 2 table fixtures
07:59 DEBUG [databricks.labs.pytester.fixtures.baseline] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='dummy_coknkq7lp', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=<DataSourceFormat.DELTA: 'DELTA'>, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='dummy_coknkq7lp.dummy_s3puv9y2o.dummy_tllkpbobl', metastore_id=None, name='dummy_tllkpbobl', owner=None, pipeline_id=None, properties={'RemoveAfter': '2026012809'}, row_filter=None, schema_name='dummy_s3puv9y2o', securable_kind_manifest=None, sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/dummy_s3puv9y2o/dummy_tllkpbobl', table_constraints=None, table_id=None, table_type=<TableType.MANAGED: 'MANAGED'>, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
07:59 DEBUG [databricks.labs.pytester.fixtures.baseline] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='dummy_coknkq7lp', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=<DataSourceFormat.DELTA: 'DELTA'>, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='dummy_coknkq7lp.dummy_s3puv9y2o.dummy_tpfhvjjck', metastore_id=None, name='dummy_tpfhvjjck', owner=None, pipeline_id=None, properties={'RemoveAfter': '2026012809'}, row_filter=None, schema_name='dummy_s3puv9y2o', securable_kind_manifest=None, sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/dummy_s3puv9y2o/dummy_tpfhvjjck', table_constraints=None, table_id=None, table_type=<TableType.MANAGED: 'MANAGED'>, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
07:59 DEBUG [databricks.labs.pytester.fixtures.baseline] clearing 1 volume fixtures
07:59 DEBUG [databricks.labs.pytester.fixtures.baseline] removing volume fixture: VolumeInfo(access_point=None, browse_only=None, catalog_name='dummy_coknkq7lp', comment=None, created_at=1769585910679, created_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', encryption_details=None, full_name='dummy_coknkq7lp.dummy_s3puv9y2o.dummy_s3puv9y2o', metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='dummy_s3puv9y2o', owner='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', schema_name='dummy_s3puv9y2o', storage_location='abfss://labs-CLOUD_ENV-TEST_CATALOG-container@databrickslabsstorage.dfs.core.windows.net/8952c1e3-b265-4adf-98c3-6f755e2e1453/volumes/f1cdd103-0846-406f-a7f1-7b91ea03d9de', updated_at=1769585910679, updated_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', volume_id='f1cdd103-0846-406f-a7f1-7b91ea03d9de', volume_type=<VolumeType.MANAGED: 'MANAGED'>)
07:59 DEBUG [databricks.labs.pytester.fixtures.baseline] clearing 1 schema fixtures
07:59 DEBUG [databricks.labs.pytester.fixtures.baseline] removing schema fixture: SchemaInfo(browse_only=None, catalog_name='dummy_coknkq7lp', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='dummy_coknkq7lp.dummy_s3puv9y2o', metastore_id=None, name='dummy_s3puv9y2o', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
07:59 DEBUG [databricks.labs.pytester.fixtures.baseline] clearing 1 catalog fixtures
07:59 DEBUG [databricks.labs.pytester.fixtures.baseline] removing catalog fixture: CatalogInfo(browse_only=False, catalog_type=<CatalogType.MANAGED_CATALOG: 'MANAGED_CATALOG'>, comment=None, connection_name=None, created_at=1769585908719, created_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', effective_predictive_optimization_flag=EffectivePredictiveOptimizationFlag(value=<EnablePredictiveOptimization.DISABLE: 'DISABLE'>, inherited_from_name='primary', inherited_from_type=None), enable_predictive_optimization=<EnablePredictiveOptimization.INHERIT: 'INHERIT'>, full_name='dummy_coknkq7lp', isolation_mode=<CatalogIsolationMode.OPEN: 'OPEN'>, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='dummy_coknkq7lp', options=None, owner='3fe685a1-96cc-4fec-8cdb-6944f5c9787e', properties={'RemoveAfter': '2026012809'}, provider_name=None, provisioning_info=None, securable_type=<SecurableType.CATALOG: 'CATALOG'>, share_name=None, storage_location=None, storage_root=None, updated_at=1769585908719, updated_by='3fe685a1-96cc-4fec-8cdb-6944f5c9787e')
[gw8] linux -- Python 3.10.19 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
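
For context on the failure mode, the 20-minute timeout above is characteristic of a bounded wait on the triggered job run. Below is a minimal sketch of that pattern, assuming the Databricks SDK's run-and-wait helper; this is illustrative wiring, not the test's actual code.

```python
# Illustrative sketch of a bounded wait on a triggered job run; not the test's code.
from datetime import timedelta

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # credentials resolved from the environment
# Trigger the deployed reconciliation job (job id taken from the log above).
waiter = w.jobs.run_now(job_id=449269971857771)
# result() polls until the run reaches a terminal state; if that does not happen
# within the timeout it raises TimeoutError("timed out after 0:20:00: ..."),
# which is the error surfaced by test_recon_databricks_job_succeeds.
run = waiter.result(timeout=timedelta(minutes=20))
print(run.state)
```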

Flaky tests:

  • 🤪 test_installs_and_runs_local_bladebridge (20.84s)
  • 🤪 test_installs_and_runs_pypi_bladebridge (24.064s)
  • 🤪 test_transpiles_informatica_to_sparksql_non_interactive[True] (18.062s)
  • 🤪 test_transpiles_informatica_to_sparksql (18.81s)
  • 🤪 test_transpile_teradata_sql_non_interactive[False] (5.997s)
  • 🤪 test_transpile_teradata_sql (7.052s)
  • 🤪 test_transpile_teradata_sql_non_interactive[True] (5.655s)

Running from acceptance #3533

@sundarshankar89 sundarshankar89 marked this pull request as ready for review December 30, 2025 10:03
@sundarshankar89 sundarshankar89 requested a review from a team as a code owner December 30, 2025 10:03
@gueniai gueniai (Collaborator) left a comment


LGTM
