terraform-provider-databricks

command module
v1.77.0
Published: May 8, 2025 License: Apache-2.0 Imports: 9 Imported by: 0

README

Databricks Terraform Provider

Resources

AWS tutorial | Azure tutorial | End-to-end tutorial | Authentication | Troubleshooting Guide | Changelog | Contributing and Development Guidelines


The Databricks Terraform provider works with Terraform 1.1.5 or newer. To use it, follow the instructions on the registry page:

terraform {
  required_providers {
    databricks = {
      source = "databricks/databricks"
    }
  }
}

If you want to build it from source, refer to the contributing guidelines.
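If you want to pin the provider to a particular release rather than always taking the newest one, you can add a version constraint to the same required_providers block. This is a minimal sketch; the constraint shown is only an example, so pick the release you have actually tested against:

```hcl
terraform {
  required_providers {
    databricks = {
      source = "databricks/databricks"
      # Example pessimistic constraint: allows 1.x upgrades but not 2.0.
      version = "~> 1.0"
    }
  }
}
```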

Then create a small sample file named main.tf with approximately the following contents. Replace <your PAT token> with a newly created personal access token (PAT).

provider "databricks" {
  host  = "https://abc-defg-024.cloud.databricks.com/"
  token = "<your PAT token>"
}

data "databricks_current_user" "me" {}
data "databricks_spark_version" "latest" {}
data "databricks_node_type" "smallest" {
  local_disk = true
}

resource "databricks_notebook" "this" {
  path     = "${data.databricks_current_user.me.home}/Terraform"
  language = "PYTHON"
  content_base64 = base64encode(<<-EOT
    # created from ${abspath(path.module)}
    display(spark.range(10))
    EOT
  )
}

resource "databricks_job" "this" {
  name = "Terraform Demo (${data.databricks_current_user.me.alphanumeric})"

  new_cluster {
    num_workers   = 1
    spark_version = data.databricks_spark_version.latest.id
    node_type_id  = data.databricks_node_type.smallest.id
  }

  notebook_task {
    notebook_path = databricks_notebook.this.path
  }
}

output "notebook_url" {
  value = databricks_notebook.this.url
}

output "job_url" {
  value = databricks_job.this.url
}

Then run terraform init followed by terraform apply to apply the HCL code to your Databricks workspace.
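Hardcoding a PAT in main.tf is fine for a first experiment, but the provider can also read credentials from the environment (see the Authentication guide), which keeps secrets out of your configuration. A minimal sketch using the DATABRICKS_HOST and DATABRICKS_TOKEN environment variables:

```hcl
# Credentials are supplied via environment variables instead of being
# committed to the configuration:
#   export DATABRICKS_HOST="https://abc-defg-024.cloud.databricks.com/"
#   export DATABRICKS_TOKEN="<your PAT token>"
provider "databricks" {}
```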

OpenTofu Support

OpenTofu is an open-source fork of Terraform with the MPL 2.0 license. The Databricks Terraform provider should be compatible with OpenTofu, but this integration is not actively tested and should be considered experimental. Please raise a GitHub issue if you find any incompatibility.

Switching from databrickslabs to databricks namespace

To make the Databricks Terraform Provider generally available, we've moved it from https://github.com/databrickslabs to https://github.com/databricks. We've worked closely with the Terraform Registry team at HashiCorp to ensure a smooth migration. Existing Terraform deployments continue to work as expected without any action from your side. We ask you to replace databrickslabs/databricks with databricks/databricks in all your .tf files.

You should have a .terraform.lock.hcl file in your state directory that is checked into source control. terraform init will give you the following warning:

Warning: Additional provider information from registry 

The remote registry returned warnings for registry.terraform.io/databrickslabs/databricks:
- For users on Terraform 0.13 or greater, this provider has moved to databricks/databricks. Please update your source in required_providers.

After you replace databrickslabs/databricks with databricks/databricks in the required_providers block, the warning will disappear. You can do a global search-and-replace in your *.tf files, or run python3 -c "$(curl -Ls https://dbricks.co/updtfns)" from the command line to do the boring work for you.
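Concretely, the migration is a one-line edit to the source attribute in your required_providers block:

```hcl
terraform {
  required_providers {
    databricks = {
      # Before the migration:
      #   source = "databrickslabs/databricks"
      # After the migration:
      source = "databricks/databricks"
    }
  }
}
```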

If you didn't check .terraform.lock.hcl into source control, you may see a Failed to install provider error. Please follow the simple steps described in the troubleshooting guide.

Use of Terraform exporter

The exporter functionality is experimental and provided as-is. Its interface is still evolving and may change or be removed in future versions of the provider.


Directories

Path Synopsis
Code generated from OpenAPI specs by Databricks SDK Generator.
internal
providers
Package providers contains the changes for both SDKv2 and Plugin Framework, which are defined in their respective sub-packages.
providers/common
Code generated from OpenAPI specs by Databricks SDK Generator.
providers/pluginfw
Package pluginfw contains the changes specific to the Plugin Framework.
providers/pluginfw/autogen
Code generated from OpenAPI specs by Databricks SDK Generator.
providers/pluginfw/common
Code generated from OpenAPI specs by Databricks SDK Generator.
providers/pluginfw/context
Code generated from OpenAPI specs by Databricks SDK Generator.
providers/pluginfw/converters
Code generated from OpenAPI specs by Databricks SDK Generator.
providers/pluginfw/tfschema
Code generated from OpenAPI specs by Databricks SDK Generator.
providers/sdkv2
Package sdkv2 contains the changes specific to the SDKv2.
tfreflect
Code generated from OpenAPI specs by Databricks SDK Generator.
sql
api
Package api contains API structures for use with the Databricks SQL API.
