28 commits
ad1df65
prompt for recon secrets
m-abulazm Nov 20, 2025
abb7459
add one more test
m-abulazm Nov 20, 2025
0d8b166
remove untested code and spec for snowflake pem
m-abulazm Nov 20, 2025
eee23bf
remove empty line
m-abulazm Nov 20, 2025
d375e4f
handle optional snowflake prompts correctly
m-abulazm Nov 20, 2025
412c748
add docs
m-abulazm Nov 21, 2025
f97e06a
add more examples
m-abulazm Nov 21, 2025
9f41a68
update doc
m-abulazm Nov 24, 2025
f0f43d2
Merge branch 'refactor/creds-manager' into cli/reconcile_secrets
m-abulazm Nov 24, 2025
fac9298
mege conflicts
m-abulazm Nov 24, 2025
6513ff5
fix tests after merge conflicts
m-abulazm Nov 24, 2025
653bcd4
Merge branch 'refactor/creds-manager' into cli/reconcile_secrets
m-abulazm Nov 24, 2025
6e0a7a9
Merge branch 'refactor/creds-manager' into cli/reconcile_secrets
m-abulazm Nov 24, 2025
2efd18d
Merge branch 'refactor/creds-manager' into cli/reconcile_secrets
gueniai Nov 26, 2025
b79b760
Merge branch 'refactor/creds-manager' into cli/reconcile_secrets
m-abulazm Nov 27, 2025
34efe1a
Merge branch 'refactor/creds-manager' into cli/reconcile_secrets
m-abulazm Dec 4, 2025
6baf386
minor fixes after merge
m-abulazm Dec 4, 2025
2be43f2
Merge branch 'refactor/creds-manager' into cli/reconcile_secrets
m-abulazm Dec 8, 2025
4656bb8
support databricks vault only
m-abulazm Dec 8, 2025
b20fb8f
rename class and property
m-abulazm Dec 15, 2025
b88c395
Merge branch 'refactor/creds-manager' into cli/reconcile_secrets
m-abulazm Dec 15, 2025
a1729ec
update notebook
m-abulazm Dec 15, 2025
8074d17
Merge branch 'refactor/creds-manager' into cli/reconcile_secrets
m-abulazm Dec 24, 2025
a51fa94
fix tests and minor refactor
m-abulazm Dec 24, 2025
1e9785b
fix one more test
m-abulazm Dec 24, 2025
6303e25
Merge branch 'refactor/creds-manager' into cli/reconcile_secrets
m-abulazm Dec 24, 2025
db2349d
Merge branch 'refactor/creds-manager' into cli/reconcile_secrets
m-abulazm Dec 25, 2025
44ea015
fix type of creds property in docs
m-abulazm Dec 25, 2025
80 changes: 41 additions & 39 deletions docs/lakebridge/docs/reconcile/index.mdx
@@ -38,36 +38,38 @@ Refer to [Reconcile Configuration Guide](reconcile_configuration) for detailed i

> 2. Set up the connection properties

Lakebridge-Reconcile manages connection properties by utilizing secrets stored in the Databricks workspace.
Below is the default secret naming convention for managing connection properties.

**Note: When both the source and target are Databricks, a secret scope is not required.**

**Default Secret Scope:** lakebridge_data_source

| source | scope |
|---------------|-----------------------|
| snowflake | lakebridge_snowflake |
| oracle | lakebridge_oracle |
| databricks | lakebridge_databricks |
| mssql | lakebridge_mssql |
| synapse | lakebridge_synapse |

Below are the connection properties required for each source:

Reconcile connection properties are configured through a dynamic mapping from connection property to value. The values can be loaded from Databricks secrets, environment variables, or used directly, depending on the `creds` configuration in `reconcile.yml`:
```yaml
...
creds:
  vault_type: local
  vault_secret_names:
    <mappings of connection properties to values>
```
Alternatively, to load the values from Databricks secrets, set `vault_type: databricks`; each value must then take the form `<scope_name>/<secret_key>`:
```yaml
...
creds:
  vault_type: databricks
  vault_secret_names:
    some_property: <scope_name>/<secret_key>
    ...
```
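The `env` vault type works analogously; a minimal sketch, assuming the mapped values name environment variables (the exact naming convention is an assumption, not confirmed here):

```yaml
...
creds:
  vault_type: env
  vault_secret_names:
    sfUser: SNOWFLAKE_USER        # read from $SNOWFLAKE_USER
    sfPassword: SNOWFLAKE_PASSWORD
```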
The expected connection properties under `vault_secret_names` per data source are:
<Tabs>
<TabItem value="snowflake" label="Snowflake">
```yaml
sfUrl = https://[account_name].snowflakecomputing.com
account = [account_name]
sfUser = [user]
sfPassword = [password]
sfDatabase = [database]
sfSchema = [schema]
sfWarehouse = [warehouse_name]
sfRole = [role_name]
pem_private_key = [pkcs8_pem_private_key]
pem_private_key_password = [pkcs8_pem_private_key_password]
sfUrl = [local_or_databricks_mapping]
account = [local_or_databricks_mapping]
sfUser = [local_or_databricks_mapping]
sfPassword = [local_or_databricks_mapping]
sfDatabase = [local_or_databricks_mapping]
sfSchema = [local_or_databricks_mapping]
sfWarehouse = [local_or_databricks_mapping]
sfRole = [local_or_databricks_mapping]
pem_private_key = [local_or_databricks_mapping]
pem_private_key_password = [local_or_databricks_mapping]
```

:::note
@@ -81,22 +83,22 @@ Below are the connection properties required for each source:
</TabItem>
<TabItem value="oracle" label="Oracle">
```yaml
user = [user]
password = [password]
host = [host]
port = [port]
database = [database/SID]
user = [local_or_databricks_mapping]
password = [local_or_databricks_mapping]
host = [local_or_databricks_mapping]
port = [local_or_databricks_mapping]
database = [local_or_databricks_mapping]
```
</TabItem>
<TabItem value="mssql" label="MS SQL Server (incl. Synapse)">
```yaml
user = [user]
password = [password]
host = [host]
port = [port]
database = [database/SID]
encrypt = [true/false]
trustServerCertificate = [true/false]
user = [local_or_databricks_mapping]
password = [local_or_databricks_mapping]
host = [local_or_databricks_mapping]
port = [local_or_databricks_mapping]
database = [local_or_databricks_mapping]
encrypt = [local_or_databricks_mapping]
trustServerCertificate = [local_or_databricks_mapping]
```
</TabItem>
</Tabs>
23 changes: 18 additions & 5 deletions docs/lakebridge/docs/reconcile/recon_notebook.mdx
Original file line number Diff line number Diff line change
@@ -69,15 +69,14 @@ We use the class `ReconcileConfig` to configure the properties required for reco
class ReconcileConfig:
data_source: str
report_type: str
secret_scope: str
database_config: DatabaseConfig
metadata_config: ReconcileMetadataConfig
creds: ReconcileCredentialsConfig | None = None
```
Parameters:

- `data_source`: The data source to be reconciled. Supported values: `snowflake`, `teradata`, `oracle`, `mssql`, `synapse`, `databricks`.
- `report_type`: The type of report to be generated. Available report types are `schema`, `row`, `data` or `all`. For details check [here](./dataflow_example.mdx).
- `secret_scope`: The secret scope name used to store the connection credentials for the source database system.
- `database_config`: The database configuration for connecting to the source database. Expects a `DatabaseConfig` object.
- `source_schema`: The source schema name.
- `target_catalog`: The target catalog name.
@@ -104,20 +103,29 @@ class ReconcileMetadataConfig:
```
If not set, the default values will be used to store the metadata. The default resources are created during the installation of Lakebridge.
- `creds`: The credentials used to connect to the data source.
  - `vault_type`: One of `local` (use the values directly), `env` (load them from environment variables), or `databricks` (load them from Databricks secrets).
  - `vault_secret_names`: A mapping from reconcile credential keys to values, resolved according to the vault type.
```python
@dataclass
class ReconcileCredentialsConfig:
    vault_type: str
    vault_secret_names: dict[str, str]
```
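To make the resolution behaviour concrete, here is a standalone, illustrative sketch — not the Lakebridge implementation — of how a `vault_secret_names` mapping could be resolved for each `vault_type`. `resolve_creds` is a hypothetical helper, and the `secrets` dictionary stands in for the Databricks secrets API:

```python
import os

def resolve_creds(vault_type, vault_secret_names, env=None, secrets=None):
    """Hypothetical resolver mirroring the three documented vault types."""
    env = os.environ if env is None else env
    secrets = {} if secrets is None else secrets
    if vault_type == "local":
        # Values are used verbatim.
        return dict(vault_secret_names)
    if vault_type == "env":
        # Values name environment variables (assumed convention).
        return {prop: env[name] for prop, name in vault_secret_names.items()}
    if vault_type == "databricks":
        # Values are "<scope_name>/<secret_key>" references; a real
        # implementation would call the Databricks secrets API here.
        resolved = {}
        for prop, ref in vault_secret_names.items():
            scope, key = ref.split("/", 1)
            resolved[prop] = secrets[(scope, key)]
        return resolved
    raise ValueError(f"unknown vault_type: {vault_type!r}")
```

For example, `resolve_creds("databricks", {"sfUser": "scope/user_key"}, secrets={("scope", "user_key"): "alice"})` would yield `{"sfUser": "alice"}`.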

An example of configuring the Reconcile properties:

```python
from databricks.labs.lakebridge.config import (
DatabaseConfig,
ReconcileConfig,
ReconcileMetadataConfig
ReconcileMetadataConfig,
ReconcileCredentialsConfig
)

reconcile_config = ReconcileConfig(
data_source = "snowflake",
report_type = "all",
secret_scope = "snowflake-credential",
database_config= DatabaseConfig(source_catalog="source_sf_catalog",
source_schema="source_sf_schema",
target_catalog="target_databricks_catalog",
@@ -126,9 +134,14 @@ reconcile_config = ReconcileConfig(
metadata_config = ReconcileMetadataConfig(
catalog = "lakebridge_metadata",
schema= "reconcile"
)
),
creds=ReconcileCredentialsConfig(
vault_type="databricks",
vault_secret_names={"sfUrl": "some_secret_scope/some_key", "sfUser": "another_secret_scope/user_key", "sfPassword": "scope/key", "sfRole": "scope/key"}
)
)
```
All of the credentials expected by the chosen data source must be configured.

## Configure Table Properties

2 changes: 1 addition & 1 deletion docs/lakebridge/docs/reconcile/reconcile_automation.mdx
Original file line number Diff line number Diff line change
@@ -116,7 +116,7 @@ To run the utility, the following parameters must be set:
- `remorph_catalog`: The catalog configured through CLI.
- `remorph_schema`: The schema configured through CLI.
- `remorph_config_table`: The table configs created as a part of the pre-requisites.
- `secret_scope`: The Databricks secret scope for accessing the source system. Refer to the Lakebridge documentation for the specific keys required to be configured as per the source system.
- `secret_scope`: (Deprecated) The Databricks secret scope for accessing the source system. Refer to the Lakebridge documentation for the specific keys required to be configured as per the source system.
- `source_system`: The source system against which reconciliation is performed.
- `table_recon_summary`: The target summary table created as a part of the pre-requisites.



Binary file modified docs/lakebridge/static/lakebridge_reconciliation.dbc
13 changes: 0 additions & 13 deletions src/databricks/labs/lakebridge/cli.py
Original file line number Diff line number Diff line change
@@ -27,7 +27,6 @@
from databricks.labs.lakebridge.config import TranspileConfig, LSPConfigOptionV1
from databricks.labs.lakebridge.contexts.application import ApplicationContext
from databricks.labs.lakebridge.connections.credential_manager import cred_file
from databricks.labs.lakebridge.helpers.recon_config_utils import ReconConfigPrompts
from databricks.labs.lakebridge.helpers.telemetry_utils import make_alphanum_or_semver
from databricks.labs.lakebridge.install import installer
from databricks.labs.lakebridge.reconcile.runner import ReconcileRunner
@@ -699,18 +698,6 @@ def generate_lineage(
lineage_generator(engine, source_dialect, input_source, output_folder)


@lakebridge.command
def configure_secrets(*, w: WorkspaceClient) -> None:
"""Setup reconciliation connection profile details as Secrets on Databricks Workspace"""
recon_conf = ReconConfigPrompts(w)

# Prompt for source
source = recon_conf.prompt_source()

logger.info(f"Setting up Scope, Secrets for `{source}` reconciliation")
recon_conf.prompt_and_save_connection_details()


@lakebridge.command
def configure_database_profiler(w: WorkspaceClient) -> None:
"""[Experimental] Installs and runs the Lakebridge Assessment package for database profiling"""