Merged
@@ -0,0 +1,7 @@
bundle:
  name: my_project

resources:
  jobs:
    foo:
      name: My Job
@@ -0,0 +1,12 @@

>>> musterr [CLI] bundle deployment bind foo [NEW_JOB_ID]
Error: terraform import: exit status 1

Error: Resource already managed by Terraform

Terraform is already managing a remote object for databricks_job.foo. To
import to this address you must first remove the existing object from the
state.




@@ -0,0 +1,8 @@

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/my_project/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!

>>> [CLI] jobs create --json {"name": "My Job"}
@@ -0,0 +1,7 @@
# Bind job that is already bound to another ID
trace $CLI bundle deploy

new_job_id=$(trace $CLI jobs create --json '{"name": "My Job"}' | jq -r '.job_id')
add_repl.py $new_job_id NEW_JOB_ID

trace musterr $CLI bundle deployment bind foo $new_job_id &> out.bind.$DATABRICKS_BUNDLE_ENGINE.txt
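
The `musterr` prefix marks a command that is expected to fail; the harness errors out if the command succeeds. A minimal Python sketch of that convention (the wrapper below is illustrative, not the actual test harness):

```python
import subprocess
import sys

def musterr(*argv):
    """Run a command and fail loudly if it unexpectedly succeeds.

    Mirrors the acceptance-test convention: a musterr-prefixed CLI call
    must exit non-zero (here, binding a job that is already bound).
    """
    result = subprocess.run(argv, capture_output=True, text=True)
    if result.returncode == 0:
        sys.exit(f"expected failure, but {argv!r} exited 0")
    return result.stdout + result.stderr

# 'false' always exits non-zero, so this passes and returns its (empty) output.
output = musterr("false")
```

The captured output is what the test redirects into `out.bind.$DATABRICKS_BUNDLE_ENGINE.txt` for comparison against the expected file.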
@@ -0,0 +1 @@
Cloud = false
@@ -0,0 +1,7 @@
bundle:
name: my_project

resources:
jobs:
foo:
name: My Job
@@ -0,0 +1,12 @@

>>> musterr [CLI] bundle deployment bind foo [FOO_ID]
Error: terraform import: exit status 1

Error: Resource already managed by Terraform

Terraform is already managing a remote object for databricks_job.foo. To
import to this address you must first remove the existing object from the
state.




@@ -0,0 +1,9 @@

>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/my_project/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!

>>> [CLI] bundle plan
Plan: 0 to add, 0 to change, 0 to delete, 1 unchanged
@@ -0,0 +1,8 @@
# Bind job that is already bound to the same ID.
# This is a no-op, but terraform complains anyway.
trace $CLI bundle deploy

job_id=$(read_id.py foo)

trace musterr $CLI bundle deployment bind foo $job_id &> out.bind.$DATABRICKS_BUNDLE_ENGINE.txt
trace $CLI bundle plan | contains.py "1 unchanged"
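
The script pipes the `bundle plan` output through `contains.py` to assert the deploy was a no-op. A sketch of what such a pass/fail filter could look like (hypothetical implementation; the real helper reads stdin and may differ in detail):

```python
import sys

def contains(text, needle):
    """Pass text through unchanged, failing the test if the substring is missing."""
    if needle not in text:
        sys.exit(f"expected {needle!r} in output")
    return text

plan = "Plan: 0 to add, 0 to change, 0 to delete, 1 unchanged"
print(contains(plan, "1 unchanged"))
```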
@@ -0,0 +1 @@
Cloud = false
@@ -0,0 +1,11 @@
bundle:
name: test-pipeline-recreate

resources:
pipelines:
foo:
name: test-pipeline
libraries:
- notebook:
path: ./nb.sql
catalog: main
2 changes: 2 additions & 0 deletions acceptance/bundle/deployment/bind/pipelines/recreate/nb.sql
@@ -0,0 +1,2 @@
-- Databricks notebook source
select 1
@@ -0,0 +1,59 @@

>>> musterr [CLI] bundle deployment bind foo [NEW_PIPELINE_ID]
databricks_pipeline.foo: Refreshing state... [id=[NEW_PIPELINE_ID]]

Terraform used the selected providers to generate the following execution
plan. Resource actions are indicated with the following symbols:
-/+ destroy and then create replacement

Terraform will perform the following actions:

  # databricks_pipeline.foo must be replaced
-/+ resource "databricks_pipeline" "foo" {
      - allow_duplicate_names  = false -> null
      ~ catalog                = "old_catalog" -> "main" # forces replacement
      + cause                  = (known after apply)
      + channel                = "CURRENT"
      + cluster_id             = (known after apply)
      - continuous             = false -> null
      ~ creator_user_name      = "[USERNAME]" -> (known after apply)
      - development            = false -> null
      + edition                = "ADVANCED"
      - expected_last_modified = 0 -> null
      + health                 = (known after apply)
      ~ id                     = "[NEW_PIPELINE_ID]" -> (known after apply)
      ~ last_modified          = [UNIX_TIME_MILLIS] -> (known after apply)
      ~ name                   = "lakeflow-pipeline" -> "test-pipeline"
      - photon                 = false -> null
      - root_path              = "/Workspace/Users/[USERNAME]/lakeflow_pipeline" -> null
      ~ run_as_user_name       = "[USERNAME]" -> (known after apply)
      - serverless             = false -> null
      ~ state                  = "IDLE" -> (known after apply)
      - storage                = "old_storage" -> null # forces replacement
      ~ url                    = "[DATABRICKS_URL]/#joblist/pipelines/[NEW_PIPELINE_ID]" -> (known after apply)

      + deployment {
          + kind               = "BUNDLE"
          + metadata_file_path = "/Workspace/Users/[USERNAME]/.bundle/test-pipeline-recreate/default/state/metadata.json"
        }

      ~ library {
          - glob {
              - include = "/Workspace/Users/[USERNAME]/lakeflow_pipeline/transformations/**" -> null
            }
          + notebook {
              + path = "/Workspace/Users/[USERNAME]/.bundle/test-pipeline-recreate/default/files/nb"
            }
        }
      - library {
          - glob {
              - include = "/Workspace/Users/[email protected]/another/**" -> null
            }
        }
    }

Plan: 1 to add, 0 to change, 1 to destroy.


Error: This bind operation requires user confirmation, but the current console does not support prompting. Please specify --auto-approve if you would like to skip prompts and proceed.

@@ -0,0 +1,5 @@

>>> [CLI] bundle deployment bind foo [NEW_PIPELINE_ID] --auto-approve
Updating deployment state...
Successfully bound pipeline with an id '[NEW_PIPELINE_ID]'
Run 'bundle deploy' to deploy changes to your workspace
@@ -0,0 +1,32 @@
{
  "method": "POST",
  "path": "/api/2.0/workspace/mkdirs",
  "body": {
    "path": "/Workspace/Users/[USERNAME]/.bundle/test-pipeline-recreate/default/artifacts/.internal"
  }
}
{
  "method": "DELETE",
  "path": "/api/2.0/pipelines/[NEW_PIPELINE_ID]"
}
{
  "method": "POST",
  "path": "/api/2.0/pipelines",
  "body": {
    "catalog": "main",
    "channel": "CURRENT",
    "deployment": {
      "kind": "BUNDLE",
      "metadata_file_path": "/Workspace/Users/[USERNAME]/.bundle/test-pipeline-recreate/default/state/metadata.json"
    },
    "edition": "ADVANCED",
    "libraries": [
      {
        "notebook": {
          "path": "/Workspace/Users/[USERNAME]/.bundle/test-pipeline-recreate/default/files/nb"
        }
      }
    ],
    "name": "test-pipeline"
  }
}
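
The recorded requests above show the recreate as a DELETE of the old pipeline followed by a POST creating its replacement. A small sketch that checks this ordering (the request log and helper here are illustrative, not part of the harness):

```python
# Simplified request log; the real one uses placeholders like [NEW_PIPELINE_ID].
log = [
    {"method": "POST", "path": "/api/2.0/workspace/mkdirs"},
    {"method": "DELETE", "path": "/api/2.0/pipelines/1234"},
    {"method": "POST", "path": "/api/2.0/pipelines"},
]

def is_recreate(requests, api_prefix):
    """A recreate shows up as DELETE-then-POST against the same API."""
    ops = [r["method"] for r in requests if r["path"].startswith(api_prefix)]
    return ops == ["DELETE", "POST"]

print(is_recreate(log, "/api/2.0/pipelines"))
```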
@@ -0,0 +1,51 @@
{
  "bundle": {
    "environment": "default",
    "git": {
      "bundle_root_path": "."
    },
    "name": "test-pipeline-recreate",
    "target": "default"
  },
  "resources": {
    "pipelines": {
      "foo": {
        "catalog": "main",
        "channel": "CURRENT",
        "deployment": {
          "kind": "BUNDLE",
          "metadata_file_path": "/Workspace/Users/[USERNAME]/.bundle/test-pipeline-recreate/default/state/metadata.json"
        },
        "edition": "ADVANCED",
        "id": "[NEW_PIPELINE_ID]",
        "libraries": [
          {
            "notebook": {
              "path": "/Workspace/Users/[USERNAME]/.bundle/test-pipeline-recreate/default/files/nb"
            }
          }
        ],
        "name": "test-pipeline",
        "url": "[DATABRICKS_URL]/pipelines/[NEW_PIPELINE_ID]?o=[NUMID]"
      }
    }
  },
  "sync": {
    "paths": [
      "."
    ]
  },
  "workspace": {
    "artifact_path": "/Workspace/Users/[USERNAME]/.bundle/test-pipeline-recreate/default/artifacts",
    "current_user": {
      "domain_friendly_name": "[USERNAME]",
      "id": "[USERID]",
      "short_name": "[USERNAME]",
      "userName": "[USERNAME]"
    },
    "file_path": "/Workspace/Users/[USERNAME]/.bundle/test-pipeline-recreate/default/files",
    "resource_path": "/Workspace/Users/[USERNAME]/.bundle/test-pipeline-recreate/default/resources",
    "root_path": "/Workspace/Users/[USERNAME]/.bundle/test-pipeline-recreate/default",
    "state_path": "/Workspace/Users/[USERNAME]/.bundle/test-pipeline-recreate/default/state"
  }
}


32 changes: 32 additions & 0 deletions acceptance/bundle/deployment/bind/pipelines/recreate/output.txt
@@ -0,0 +1,32 @@

>>> [CLI] bundle summary -o json

>>> [CLI] bundle plan
recreate pipelines.foo

Plan: 1 to add, 0 to change, 1 to delete, 0 unchanged

>>> musterr [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-pipeline-recreate/default/files...

This action will result in the deletion or recreation of the following Lakeflow Declarative Pipelines along with the
Streaming Tables (STs) and Materialized Views (MVs) managed by them. Recreating the pipelines will
restore the defined STs and MVs through full refresh. Note that recreation is necessary when pipeline
properties such as the 'catalog' or 'storage' are changed:
recreate resources.pipelines.foo
Error: the deployment requires destructive actions, but current console does not support prompting. Please specify --auto-approve if you would like to skip prompts and proceed


>>> [CLI] bundle deploy --auto-approve
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-pipeline-recreate/default/files...

This action will result in the deletion or recreation of the following Lakeflow Declarative Pipelines along with the
Streaming Tables (STs) and Materialized Views (MVs) managed by them. Recreating the pipelines will
restore the defined STs and MVs through full refresh. Note that recreation is necessary when pipeline
properties such as the 'catalog' or 'storage' are changed:
recreate resources.pipelines.foo
Deploying resources...
Updating deployment state...
Deployment complete!

>>> print_requests.py ^//import-file/ ^//workspace/delete ^//telemetry-ext
@@ -0,0 +1,18 @@
{
  "name": "lakeflow-pipeline",
  "catalog": "old_catalog",
  "storage": "old_storage",
  "libraries": [
    {
      "glob": {
        "include": "/Workspace/Users/[email protected]/lakeflow_pipeline/transformations/**"
      }
    },
    {
      "glob": {
        "include": "/Workspace/Users/[email protected]/another/**"
      }
    }
  ],
  "root_path": "/Workspace/Users/[email protected]/lakeflow_pipeline"
}
19 changes: 19 additions & 0 deletions acceptance/bundle/deployment/bind/pipelines/recreate/script
@@ -0,0 +1,19 @@
NEW_PIPELINE_ID=$($CLI pipelines create --json @pipeline.json | jq -r .pipeline_id)
add_repl.py $NEW_PIPELINE_ID NEW_PIPELINE_ID

rm -f out.requests.txt
trace musterr $CLI bundle deployment bind foo $NEW_PIPELINE_ID &> out.bind-fail.$DATABRICKS_BUNDLE_ENGINE.txt
print_requests.py '^//import-file/' '^//workspace/delete'

rm -f out.requests.txt
trace $CLI bundle deployment bind foo $NEW_PIPELINE_ID --auto-approve &> out.bind-success.$DATABRICKS_BUNDLE_ENGINE.txt
print_requests.py '^//import-file/' '^//workspace/delete'

trace $CLI bundle summary -o json > out.summary.$DATABRICKS_BUNDLE_ENGINE.json
trace $CLI bundle plan

trace musterr $CLI bundle deploy

rm -f out.requests.txt
trace $CLI bundle deploy --auto-approve
trace print_requests.py '^//import-file/' '^//workspace/delete' '^//telemetry-ext' > out.deploy.requests.$DATABRICKS_BUNDLE_ENGINE.json
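
`print_requests.py` filters the recorded requests before they are compared against the expected output. A rough sketch of that filtering, assuming exclusion-by-regex semantics (the real helper's `^//…` argument syntax is not reproduced here):

```python
import json
import re

def filter_requests(lines, exclude_patterns):
    """Keep only recorded requests whose path matches none of the patterns.

    Sketch of a print_requests.py-style helper: out.requests.txt holds one
    JSON object per request, and noisy endpoints are dropped by regex.
    """
    excludes = [re.compile(p) for p in exclude_patterns]
    kept = []
    for line in lines:
        req = json.loads(line)
        if any(rx.search(req["path"]) for rx in excludes):
            continue
        kept.append(req)
    return kept

log = [
    '{"method": "DELETE", "path": "/api/2.0/pipelines/1234"}',
    '{"method": "POST", "path": "/telemetry-ext"}',
]
for req in filter_requests(log, ["telemetry-ext"]):
    print(json.dumps(req))
```

Filtering keeps the expected files stable across environments that emit extra bookkeeping calls such as telemetry.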
@@ -0,0 +1,2 @@
RecordRequests = true
Ignore = [".databricks"]
1 change: 1 addition & 0 deletions acceptance/bundle/deployment/bind/pipelines/test.toml
@@ -0,0 +1 @@
Cloud = false
@@ -0,0 +1,11 @@
bundle:
name: test-pipeline-recreate

resources:
pipelines:
foo:
name: test-pipeline
libraries:
- notebook:
path: ./nb.sql
catalog: main
2 changes: 2 additions & 0 deletions acceptance/bundle/deployment/bind/pipelines/update/nb.sql
@@ -0,0 +1,2 @@
-- Databricks notebook source
select 1