You are viewing documentation for Immuta version 2023.4.

Enable Unity Catalog Support for an Existing Databricks Spark Integration

Existing Data Sources

Once Unity Catalog is enabled, existing data sources will reference the default catalog, hive_metastore. New data sources, however, will not use this default catalog.

If you already have an Immuta Databricks Spark integration configured, follow the steps below to enable Unity Catalog support in Immuta.

  1. Enable Unity Catalog support on the App Settings page.
  2. Re-push cluster policies to your Databricks cluster. You must set IMMUTA_SPARK_DATABRICKS_ALLOW_NON_IMMUTA_READS and IMMUTA_SPARK_DATABRICKS_ALLOW_NON_IMMUTA_WRITES to false, either manually in your cluster policies or by selecting Protected until made available by policy in the Databricks integration section of the App Settings page. See the Databricks Spark integration with Unity Catalog support limitations for details.
  3. Restart your Databricks cluster with the new cluster policy applied.
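
The environment variable requirement in step 2 can be sketched as a Databricks cluster policy fragment. This is an illustrative example only, assuming the standard Databricks policy definition format (`spark_env_vars.*` attribute paths with a `fixed` policy type); consult your generated Immuta cluster policy for the authoritative settings.

```json
{
  "spark_env_vars.IMMUTA_SPARK_DATABRICKS_ALLOW_NON_IMMUTA_READS": {
    "type": "fixed",
    "value": "false"
  },
  "spark_env_vars.IMMUTA_SPARK_DATABRICKS_ALLOW_NON_IMMUTA_WRITES": {
    "type": "fixed",
    "value": "false"
  }
}
```

With `"type": "fixed"`, users cannot override these values when creating or editing clusters governed by the policy, which keeps non-Immuta reads and writes disabled.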