← Back to all products

Azure Synapse-Databricks Integration Kit

$49

Integration patterns for organizations running both Synapse and Databricks. Serverless views, Power BI setup, and cost comparison tools.

📁 13 files · 🏷 v1.0.0
Python · Terraform · JSON · Markdown · SQL · Azure · Databricks · Spark · Delta Lake · Redis

📁 File Structure (13 files)

azure-synapse-integration-kit/
├── README.md
├── guides/
│   ├── architecture_decision_guide.md
│   ├── migration_synapse_to_databricks.md
│   └── security_cross_service.md
├── notebooks/
│   ├── data_sharing_patterns.py
│   └── power_bi_directquery_setup.py
├── sql/
│   ├── dedicated_pool_integration.sql
│   └── serverless_views_over_delta.sql
├── terraform/
│   └── synapse-databricks/
│       ├── main.tf
│       ├── outputs.tf
│       └── variables.tf
└── tools/
    └── cost_comparison_calculator.py

📖 Documentation Preview README excerpt

Azure Synapse-Databricks Integration Kit

Product ID: azure-synapse-integration-kit

Version: 1.0.0

Author: [Datanest Digital](https://datanest.dev)

Price: $49 USD

---

Overview

The Azure Synapse-Databricks Integration Kit provides production-ready templates, guides, and automation for organizations running both Azure Synapse Analytics and Azure Databricks. Rather than treating these services as competitors, this kit enables you to leverage the strengths of each platform in a unified architecture.

Most Azure data platforms eventually adopt both services. Synapse excels at T-SQL workloads, serverless exploration, and Power BI integration. Databricks excels at large-scale Spark processing, ML workflows, and Delta Lake management. This kit eliminates the guesswork in making them work together.
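To make the "which service for which workload" split concrete, here is a minimal illustrative sketch of a workload-routing decision in Python. The function name and the rules are hypothetical simplifications for this page; they are not the actual decision tree shipped in guides/architecture_decision_guide.md.

```python
# Illustrative only: a simplified workload-routing decision, loosely
# mirroring the kind of decision tree the architecture guide covers.
# The rules and function name are hypothetical, not the kit's logic.
def recommend_platform(workload: str, needs_tsql: bool = False,
                       uses_ml: bool = False) -> str:
    """Return a rough platform recommendation for a workload type."""
    if needs_tsql or workload in {"power_bi", "ad_hoc_sql"}:
        return "synapse"      # T-SQL, serverless exploration, Power BI
    if uses_ml or workload in {"spark_etl", "delta_maintenance"}:
        return "databricks"   # large-scale Spark, ML, Delta Lake management
    return "both"             # shared Delta tables on ADLS Gen2

print(recommend_platform("power_bi"))   # synapse
print(recommend_platform("spark_etl"))  # databricks
```

The real guide weighs more dimensions (team skills, existing commitments, cost), but the shape is the same: route each workload to the service that is strongest at it, defaulting to a shared Delta Lake layer.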

What's Included

Architecture & Migration Guides

| Guide | Description |
|-------|-------------|
| guides/architecture_decision_guide.md | Decision tree for when to use Synapse, Databricks, or both |
| guides/migration_synapse_to_databricks.md | Step-by-step migration from Synapse Spark pools to Databricks |
| guides/security_cross_service.md | Cross-service authentication with Managed Identity and service principals |

SQL Templates

| File | Description |
|------|-------------|
| sql/serverless_views_over_delta.sql | 10+ Synapse serverless SQL view patterns over Delta Lake tables |
| sql/dedicated_pool_integration.sql | Synapse dedicated SQL pool integration queries and external table patterns |
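The kit's SQL templates are not reproduced on this page, but the general shape of a Synapse serverless view over a Delta table is straightforward: OPENROWSET with FORMAT = 'DELTA' pointed at the table's ADLS Gen2 path. A small Python sketch that builds such a view definition (storage account, container, and view names below are placeholders, not from the kit):

```python
# Sketch of the general shape of a Synapse serverless view over a Delta
# table in ADLS Gen2. All names here are placeholders; the kit's actual
# templates in sql/serverless_views_over_delta.sql are not shown.
def serverless_delta_view(view_name: str, account: str,
                          container: str, table_path: str) -> str:
    """Build CREATE VIEW DDL that reads a Delta table via OPENROWSET."""
    url = f"https://{account}.dfs.core.windows.net/{container}/{table_path}/"
    return (
        f"CREATE OR ALTER VIEW dbo.{view_name} AS\n"
        f"SELECT * FROM OPENROWSET(\n"
        f"    BULK '{url}',\n"
        f"    FORMAT = 'DELTA'\n"
        f") AS rows;"
    )

print(serverless_delta_view("v_orders_gold", "mystorageacct",
                            "gold", "orders"))
```

Because Synapse serverless reads the Delta files in place, views like this expose Databricks-managed tables to T-SQL and Power BI users with no data copy.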

Databricks Notebooks

| Notebook | Description |
|----------|-------------|
| notebooks/power_bi_directquery_setup.py | Configure Power BI DirectQuery over Databricks SQL endpoints |
| notebooks/data_sharing_patterns.py | Data sharing between Synapse and Unity Catalog |

Infrastructure as Code

| File | Description |
|------|-------------|
| terraform/synapse-databricks/main.tf | Joint Synapse + Databricks Terraform deployment |
| terraform/synapse-databricks/variables.tf | Configurable variables for the deployment |
| terraform/synapse-databricks/outputs.tf | Output values for downstream consumption |

Tools

| Tool | Description |
|------|-------------|
| tools/cost_comparison_calculator.py | CLI tool comparing Synapse vs Databricks costs for common workloads |
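As a rough illustration of what such a calculator computes, the sketch below compares a Synapse Spark pool billed per vCore-hour against a Databricks cluster billed as DBUs plus VM time. The rates are placeholders, not real Azure pricing, and the function names are hypothetical; the kit's actual tool and its inputs are not shown here.

```python
# Illustrative sketch only. Hourly rates are placeholders, NOT real
# Azure pricing; consult the Azure pricing pages for current rates.
SYNAPSE_SPARK_RATE_PER_VCORE_HOUR = 0.15   # placeholder
DATABRICKS_DBU_RATE = 0.30                 # placeholder, jobs compute
DATABRICKS_VM_RATE_PER_HOUR = 0.50         # placeholder, per node

def synapse_spark_cost(vcores: int, hours: float) -> float:
    """Synapse Spark pools bill per vCore-hour while the pool runs."""
    return vcores * hours * SYNAPSE_SPARK_RATE_PER_VCORE_HOUR

def databricks_cost(dbus_per_hour: float, nodes: int, hours: float) -> float:
    """Databricks bills DBUs (per workload tier) plus the underlying VMs."""
    return hours * (dbus_per_hour * DATABRICKS_DBU_RATE
                    + nodes * DATABRICKS_VM_RATE_PER_HOUR)

# Example: 32 vCores on Synapse vs a 4-node Databricks cluster, 10 hours
print(f"Synapse:    ${synapse_spark_cost(32, 10):.2f}")
print(f"Databricks: ${databricks_cost(8.0, 4, 10):.2f}")
```

The point of a tool like this is less the absolute numbers than the two billing models: Synapse Spark is a single per-vCore meter, while Databricks splits cost between DBUs and VM infrastructure, so the cheaper option depends on workload shape and duration.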

Prerequisites

  • Azure subscription with Contributor access
  • Terraform >= 1.5.0 (for infrastructure deployment)
  • Python >= 3.9 (for CLI tools)
  • Azure CLI >= 2.50.0
  • Familiarity with Azure Synapse Analytics and Azure Databricks

... continues with setup instructions, usage examples, and more.

📄 Code Sample (.py preview)

notebooks/data_sharing_patterns.py

# Databricks notebook source
# MAGIC %md
# MAGIC # Data Sharing Patterns: Synapse and Unity Catalog
# MAGIC
# MAGIC **Datanest Digital** | [datanest.dev](https://datanest.dev)
# MAGIC
# MAGIC This notebook demonstrates patterns for sharing data between Azure Synapse Analytics
# MAGIC and Databricks Unity Catalog. The shared storage layer is ADLS Gen2 with Delta Lake.
# MAGIC
# MAGIC ## Patterns Covered
# MAGIC 1. Unity Catalog tables accessible from Synapse serverless
# MAGIC 2. Synapse dedicated pool data accessible from Databricks
# MAGIC 3. Delta Sharing for cross-environment access
# MAGIC 4. Metadata synchronization between catalogs

# COMMAND ----------

# MAGIC %md
# MAGIC ## Configuration

# COMMAND ----------

# Unity Catalog settings
UC_CATALOG = "production"
UC_SCHEMA_GOLD = "gold"
UC_SCHEMA_SHARED = "synapse_shared"

# ADLS Gen2 settings
STORAGE_ACCOUNT = "<storage_account>"
GOLD_CONTAINER = "gold"
SHARED_CONTAINER = "synapse-shared"

# Synapse settings (for JDBC connectivity)
SYNAPSE_ENDPOINT = "<synapse_workspace>.sql.azuresynapse.net"
SYNAPSE_DATABASE = "dedicated_pool"

# COMMAND ----------

# MAGIC %md
# MAGIC ## Pattern 1: Publish Unity Catalog Tables for Synapse Consumption
# MAGIC
# MAGIC Unity Catalog external tables store data in ADLS Gen2. Synapse serverless can
# MAGIC read these Delta files directly — no data copy required.

# COMMAND ----------

def publish_table_for_synapse(
    source_table: str,
    target_table: str,
    storage_path: str,

# ... 422 more lines ...