If you use Terraform 0.13 or newer, please follow the installation instructions on the Terraform Registry page. If you use an older version of Terraform or want to build the provider from source, please refer to the contributing guidelines page.
terraform {
  required_providers {
    databricks = {
      source  = "databricks/databricks"
      version = "1.6.5"
    }
  }
}
Then create a small sample file named main.tf with approximately the following contents. Replace <your PAT token> with a newly created PAT token.
provider "databricks" {
host = "https://abc-defg-024.cloud.databricks.com/"
token = "<your PAT token>"
}
data "databricks_current_user" "me" {}
data "databricks_spark_version" "latest" {}
data "databricks_node_type" "smallest" {
local_disk = true
}
resource "databricks_notebook" "this" {
path = "${data.databricks_current_user.me.home}/Terraform"
language = "PYTHON"
content_base64 = base64encode(<<-EOT
# created from ${abspath(path.module)}
display(spark.range(10))
EOT
)
}
resource "databricks_job" "this" {
name = "Terraform Demo (${data.databricks_current_user.me.alphanumeric})"
new_cluster {
num_workers = 1
spark_version = data.databricks_spark_version.latest.id
node_type_id = data.databricks_node_type.smallest.id
}
notebook_task {
notebook_path = databricks_notebook.this.path
}
}
output "notebook_url" {
value = databricks_notebook.this.url
}
output "job_url" {
value = databricks_job.this.url
}
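Hard-coding the PAT token in main.tf is fine for a quick start, but the provider can also read its connection settings from the environment. A minimal sketch of that variant, assuming the standard DATABRICKS_HOST and DATABRICKS_TOKEN environment variables are exported in your shell:

provider "databricks" {
  # host and token are picked up from the DATABRICKS_HOST
  # and DATABRICKS_TOKEN environment variables
}

See the Authentication page for the full list of supported configuration options.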
Then run terraform init followed by terraform apply to apply the HCL code to your Databricks workspace.
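Putting those steps together, a terminal session for this quick start might look like the following sketch; terraform output is the standard command for reading the values declared in the output blocks:

# download the provider and initialize the working directory
terraform init

# review the plan and create the notebook and job
terraform apply

# print a declared output, e.g. the notebook URL
terraform output notebook_url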
To make the Databricks Terraform Provider generally available, we've moved it from https://github.com/databrickslabs to https://github.com/databricks. We've worked closely with the Terraform Registry team at HashiCorp to ensure a smooth migration. Existing Terraform deployments continue to work as expected without any action from your side. We ask you to replace databrickslabs/databricks with databricks/databricks in all your .tf files.
You should have a .terraform.lock.hcl file in your state directory that is checked into source control. terraform init will give you the following warning:
Warning: Additional provider information from registry
The remote registry returned warnings for registry.terraform.io/databrickslabs/databricks:
- For users on Terraform 0.13 or greater, this provider has moved to databricks/databricks. Please update your source in required_providers.
After you replace databrickslabs/databricks with databricks/databricks in the required_providers block, the warning will disappear. Do a global "search and replace" in *.tf files. Alternatively, you can run python3 -c "$(curl -Ls https://dbricks.co/updtfns)" from the command line, which will do all the boring work for you.
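If you'd rather not run a script fetched from the network, a plain find/sed one-liner achieves the same replacement. This is an illustrative sketch using GNU sed, not the official migration script:

# rewrite the provider source address in every .tf file under the current directory
find . -name '*.tf' -exec sed -i 's|databrickslabs/databricks|databricks/databricks|g' {} +

On macOS (BSD sed), use sed -i '' instead of sed -i.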
If you didn't check .terraform.lock.hcl into source control, you may see a Failed to install provider error. Please follow the simple steps described in the troubleshooting guide.
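For context, the recovery described there typically amounts to pointing existing state at the new provider address and re-initializing; a sketch of the commands involved (verify the exact steps against the troubleshooting guide):

# migrate existing state to the new provider source address
terraform state replace-provider databrickslabs/databricks databricks/databricks

# re-download the provider and record it in .terraform.lock.hcl
terraform init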