Overview
Azure Stream Analytics integrates with Azure Synapse Analytics to enable low‑latency, high‑throughput stream processing. This tutorial guides you through creating a Stream Analytics job that ingests data from an Event Hub, transforms it with SQL‑like queries, and writes the results to a Synapse dedicated SQL pool.
Prerequisites
- Azure subscription with the Contributor role.
- Azure Event Hubs namespace with an event hub named sensor-data.
- Synapse workspace with a dedicated SQL pool.
- Basic knowledge of T‑SQL.
Setup
- In the Azure portal, create an Azure Stream Analytics job, ideally in the same region as your Event Hubs namespace and Synapse workspace.
- In the job's Inputs tab, add a new input:
  - Source: Event Hub
  - Event Hub namespace: YourNamespace
  - Event Hub name: sensor-data
  - Consumer group: $Default
- In the Outputs tab, add an output:
  - Sink: Azure Synapse Analytics
  - Database: SynapseSQLPool
  - Table: dbo.SensorReadings
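For reference, the job assumes each event carries a flat JSON payload whose field names match those used in the query. A hypothetical example event:

```json
{
  "deviceId": "sensor-042",
  "temperature": 22.5,
  "humidity": 48.1
}
```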
Code Sample
The following query reads JSON payloads from Event Hub, references the payload fields directly (payload properties are exposed as columns on the input alias), and writes matching rows into the Synapse table.
SELECT
    event.deviceId AS DeviceId,
    CAST(event.temperature AS float) AS Temperature,
    CAST(event.humidity AS float) AS Humidity,
    System.Timestamp() AS EventTime
INTO
    [dbo].[SensorReadings]
FROM
    [EventHubInput] AS event
WHERE
    event.temperature IS NOT NULL
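The output table must already exist in the dedicated SQL pool before the job starts. A minimal definition consistent with the query's column names and casts might look like this (the types and distribution choice are assumptions; adjust them to your workload):

```sql
-- Hypothetical target table for the streaming output above.
CREATE TABLE dbo.SensorReadings
(
    DeviceId    nvarchar(64) NOT NULL,  -- device identifier from the JSON payload
    Temperature float        NOT NULL,  -- matches CAST(... AS float) in the query
    Humidity    float        NULL,
    EventTime   datetime2    NOT NULL   -- populated from System.Timestamp()
)
WITH (DISTRIBUTION = ROUND_ROBIN, HEAP);
```

ROUND_ROBIN distribution and a heap are reasonable defaults for an append-only streaming target; switch to a hash distribution if downstream queries join heavily on DeviceId.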
Deploy
Start the Stream Analytics job from the Azure portal, choosing when output should begin (now or a custom time). The job will begin processing incoming events and populating the Synapse table.
Monitor
Use the built‑in Metrics blade to view throughput, latency, and error rates. Optionally, enable diagnostic logs to Azure Monitor for deeper insights.
Conclusion
You have created a real‑time streaming pipeline that ingests sensor data, transforms it using Stream Analytics, and stores it in Synapse Analytics for downstream analytics. Explore features like windowed aggregations, custom functions, and integration with Power BI for live dashboards.
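As a starting point for the windowed aggregations mentioned above, here is a sketch of a five‑minute tumbling‑window average per device, reusing the input and payload names from this tutorial (which are assumptions specific to this walkthrough):

```sql
-- Average temperature per device over non-overlapping 5-minute windows.
SELECT
    event.deviceId AS DeviceId,
    AVG(CAST(event.temperature AS float)) AS AvgTemperature,
    System.Timestamp() AS WindowEnd  -- end time of each tumbling window
FROM [EventHubInput] AS event
GROUP BY event.deviceId, TumblingWindow(minute, 5)
```

Point this at a second output (for example, a Power BI dataset) to drive a live dashboard.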