  1. Is there a way to use parameters in Databricks in SQL with …

    Sep 29, 2024 · There is a lot of confusion w.r.t. the use of parameters in SQL, but I see Databricks has started harmonizing heavily (for example, 3 months back, IDENTIFIER() didn't work with …
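
    As a hedged sketch of what the question is getting at (table and column names here are made up, and it assumes Databricks Runtime 13.3+ / Spark 3.4+, where spark.sql() accepts an args dict and IDENTIFIER() accepts a parameter marker):

      # Named parameter markers resolved by spark.sql(); `spark` is predefined
      # in a Databricks notebook.
      table_name = "main.default.sales"  # hypothetical table

      df = spark.sql(
          "SELECT * FROM IDENTIFIER(:tbl) WHERE amount > :min_amount",
          args={"tbl": table_name, "min_amount": 100},
      )
      df.show()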

  2. Printing secret value in Databricks - Stack Overflow

    Nov 11, 2021 · Building on @camo's answer, since you're looking to use the secret value outside Databricks, you can use the Databricks Python SDK to fetch the bytes representation …
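
    A minimal sketch of the approach the answer describes, assuming the databricks-sdk package is installed and authentication is configured (for example via DATABRICKS_HOST and DATABRICKS_TOKEN); the scope and key names are placeholders:

      import base64
      from databricks.sdk import WorkspaceClient

      w = WorkspaceClient()
      # GET /api/2.0/secrets/get returns the secret value base64-encoded
      resp = w.secrets.get_secret(scope="my-scope", key="my-key")
      secret_bytes = base64.b64decode(resp.value)
      print(secret_bytes.decode("utf-8"))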

  3. databricks - How to create Storage Credential using Service …

    Sep 24, 2024 · An Azure Databricks access connector is a first-party Azure resource that lets you connect managed identities to an Azure Databricks account. You must have the Contributor …
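
    One way to script this, shown here only as a sketch with placeholder IDs, is the Unity Catalog REST API for storage credentials; the workspace URL, token, and access connector resource ID below are assumptions to replace with your own:

      import requests

      HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
      TOKEN = "<personal-access-token>"

      payload = {
          "name": "my_storage_credential",
          "azure_managed_identity": {
              "access_connector_id": (
                  "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
                  "Microsoft.Databricks/accessConnectors/<connector-name>"
              )
          },
      }

      resp = requests.post(
          f"{HOST}/api/2.1/unity-catalog/storage-credentials",
          headers={"Authorization": f"Bearer {TOKEN}"},
          json=payload,
      )
      resp.raise_for_status()
      print(resp.json())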

  4. Databricks: managed tables vs. external tables - Stack Overflow

    Jun 21, 2024 · While Databricks manages the metadata for external tables, the actual data remains in the specified external location, providing flexibility and control over the data storage …
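
    To make the distinction concrete, a small sketch (table names and the abfss:// path are placeholders; assumes a notebook where spark is defined):

      # Managed table: Databricks owns both the metadata and the data files.
      spark.sql("""
          CREATE TABLE IF NOT EXISTS main.default.managed_sales (id INT, amount DOUBLE)
      """)

      # External table: metadata lives in the metastore, but the data stays at
      # the external LOCATION; dropping the table leaves the files in place.
      spark.sql("""
          CREATE TABLE IF NOT EXISTS main.default.external_sales (id INT, amount DOUBLE)
          LOCATION 'abfss://container@account.dfs.core.windows.net/sales/'
      """)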

  5. Converting SQL stored procedure into a Databricks Notebook: …

    Dec 5, 2023 · I'm trying to convert a SQL stored procedure into a Databricks notebook. One stored procedure has multiple IF statements combined with BEGIN/END statements. Based …
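
    A common translation pattern (sketched with made-up table names, not the asker's actual procedure) is to turn the T-SQL IF ... BEGIN ... END blocks into plain Python control flow around spark.sql() calls:

      row_count = spark.sql(
          "SELECT COUNT(*) AS c FROM main.default.staging_orders"
      ).first()["c"]

      if row_count > 0:
          # Equivalent of the BEGIN ... END body in the stored procedure
          spark.sql("""
              INSERT INTO main.default.orders
              SELECT * FROM main.default.staging_orders
          """)
          spark.sql("TRUNCATE TABLE main.default.staging_orders")
      else:
          print("No new rows to process")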

  6. REST API to query Databricks table - Stack Overflow

    Jul 24, 2022 · Is Databricks designed for such use cases, or is a better approach to copy this table (gold layer) into an operational database such as Azure SQL DB after the transformations are done …
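
    For the "query over REST" part, one option is the SQL Statement Execution API; this is a sketch with a placeholder host, token, warehouse ID, and table name, and it assumes a running SQL warehouse:

      import requests

      HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
      TOKEN = "<personal-access-token>"

      resp = requests.post(
          f"{HOST}/api/2.0/sql/statements",
          headers={"Authorization": f"Bearer {TOKEN}"},
          json={
              "warehouse_id": "<warehouse-id>",
              "statement": "SELECT * FROM main.gold.customer_summary LIMIT 10",
              "wait_timeout": "30s",
          },
      )
      resp.raise_for_status()
      body = resp.json()
      print(body["status"], body.get("result", {}).get("data_array"))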

  7. Databricks Permissions Required to Create a Cluster

    Nov 9, 2023 · In Azure Databricks, if you want to create a cluster, you need to have the "Can Manage" permission. This permission basically lets you handle everything related to clusters, …

  8. Create Azure KeyVault backed Databricks secret scope in CLI

    Apr 22, 2025 · I am trying to create an Azure KeyVault-backed secret scope within Databricks. I can create the AKV-backed secret scope in the UI without any trouble; however, I want to create it in the …
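
    The CLI wraps the secrets REST API, so one scripted alternative is the call below, sketched with placeholder names; note that creating an Azure Key Vault-backed scope generally needs an Azure AD token rather than a plain PAT, which you should verify for your setup:

      import requests

      HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
      AAD_TOKEN = "<azure-ad-access-token>"

      payload = {
          "scope": "my-akv-scope",
          "scope_backend_type": "AZURE_KEYVAULT",
          "backend_azure_keyvault": {
              "resource_id": (
                  "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
                  "Microsoft.KeyVault/vaults/<vault-name>"
              ),
              "dns_name": "https://<vault-name>.vault.azure.net/",
          },
      }

      resp = requests.post(
          f"{HOST}/api/2.0/secrets/scopes/create",
          headers={"Authorization": f"Bearer {AAD_TOKEN}"},
          json=payload,
      )
      resp.raise_for_status()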

  9. Databricks - Download a dbfs:/FileStore file to my Local Machine

    Method 3: Using a third-party tool named DBFS Explorer. DBFS Explorer was created as a quick way to upload and download files to the Databricks filesystem (DBFS). This will work with both …
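
    An alternative that needs no third-party tool is the DBFS read endpoint, sketched here with a placeholder host, token, and file path (the API returns at most 1 MB of base64-encoded data per call, hence the loop):

      import base64
      import requests

      HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
      TOKEN = "<personal-access-token>"
      SRC = "/FileStore/exports/result.csv"  # hypothetical DBFS path
      CHUNK = 1024 * 1024

      offset = 0
      with open("result.csv", "wb") as out:
          while True:
              resp = requests.get(
                  f"{HOST}/api/2.0/dbfs/read",
                  headers={"Authorization": f"Bearer {TOKEN}"},
                  params={"path": SRC, "offset": offset, "length": CHUNK},
              )
              resp.raise_for_status()
              body = resp.json()
              if body["bytes_read"] == 0:
                  break
              out.write(base64.b64decode(body["data"]))
              offset += body["bytes_read"]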

  10. Databricks: How do I get path of current notebook?

    Databricks is smart and all, but how do you identify the path of your current notebook? The guide on the website does not help. It suggests: %scala dbutils.notebook.getContext.notebookPath …
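
    A commonly cited Python equivalent of that %scala snippet, sketched on the assumption that it runs inside a Databricks notebook where dbutils is available:

      notebook_path = (
          dbutils.notebook.entry_point.getDbutils()
          .notebook()
          .getContext()
          .notebookPath()
          .get()
      )
      print(notebook_path)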