subcategory: "Databricks SQL"
-> **Public Preview** This feature is in Public Preview.
This resource configures the security policy, databricks_instance_profile, and data access properties for all databricks_sql_endpoint of a workspace. Please note that changing parameters of this resource will restart all running databricks_sql_endpoint. To use this resource you need to be an administrator.
```hcl
resource "databricks_sql_global_config" "this" {
  security_policy      = "DATA_ACCESS_CONTROL"
  instance_profile_arn = "arn:...."
  data_access_config = {
    "spark.sql.session.timeZone" : "UTC"
  }
}
```
The following arguments are supported (see documentation for more details):

* `security_policy` - (Optional, String) The policy for controlling access to datasets. Default value: `DATA_ACCESS_CONTROL`; consult the documentation for the list of possible values.
* `data_access_config` - (Optional, Map) Data access configuration for databricks_sql_endpoint, such as configuration for an external Hive metastore, Hadoop Filesystem configuration, etc. Please note that the list of supported configuration properties is limited, so refer to the documentation for the full list. Apply will fail if you specify a configuration property that is not permitted.
* `instance_profile_arn` - (Optional, String) databricks_instance_profile used to access storage from databricks_sql_endpoint. Please note that this parameter is only for AWS, and will generate an error if used on other clouds.
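As a sketch of the external Hive metastore case mentioned above, the `data_access_config` map could carry `spark.hadoop.*` properties. The JDBC URL, username, and driver values below are illustrative assumptions, not values from this page; check the documentation for which property names are actually permitted.

```hcl
resource "databricks_sql_global_config" "this" {
  security_policy = "DATA_ACCESS_CONTROL"
  data_access_config = {
    # Hypothetical external Hive metastore settings (placeholder host and
    # credentials); apply fails if a property is not on the permitted list.
    "spark.hadoop.javax.jdo.option.ConnectionURL" : "jdbc:mysql://metastore-host:3306/metastore",
    "spark.hadoop.javax.jdo.option.ConnectionUserName" : "hive",
    "spark.hadoop.javax.jdo.option.ConnectionDriverName" : "org.mariadb.jdbc.Driver"
  }
}
```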
You can import a `databricks_sql_global_config` resource with a command like the following (you need to use `global` as the ID):

```bash
terraform import databricks_sql_global_config.this global
```