There are static JSON files in an Azure Event Hub and I'm trying to ingest them into a KQL Database using Eventstream. I can connect to the event hub from the eventstream, but I'm not able to preview the data or publish it, and I get this error: Data preview "ES_STREAM": ["While sampling data, no data was received from '4' partitions."]. Is this happening because the data in the event hub is not real-time, dynamic data? Are there any known workarounds or fixes for this issue? It would help to understand whether this is a limitation or a bug in the eventstream setup, or something related to the event hub. Looking forward to your insights and suggestions!
Hi @Nalapriya
This is the default behaviour of Event Hubs.
The issue you're seeing when ingesting static JSON files from Azure Event Hubs into a KQL Database using Eventstream is most likely down to the nature of the data in your event hub. Eventstream is built for real-time, streaming data: the data preview samples events as they arrive, so if your JSON files are static and nothing is actively sending events to the event hub, there is nothing for the preview to sample from the partitions, which is exactly what the error message reports.
Event Hubs also has a limited data retention period. If the static files were sent to the event hub some time ago, those events may already have expired and been removed, which produces the same result.
Potential solutions:
Simulate Real-time Data: Instead of using static files, try simulating real-time data by periodically sending the JSON content to your event hub. This can be done with a script or an Azure Function that reads the static files and sends them as events (see the first sketch after this list).
Use a Batch Ingestion Path: For static data, consider other ingestion methods supported by KQL Database, such as direct ingestion from blob storage or Azure Data Factory / Fabric pipelines (see the second sketch after this list).
Use Event Hub Data Generator: If you're just testing the setup, you can use the Event Hub Data Generator feature in the Azure Portal to send sample data to your event hub and verify the eventstream connection.
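Here is a minimal sketch of the "simulate real-time data" approach using the azure-eventhub Python SDK. The connection string, event hub name, and folder of JSON files are placeholders you would replace with your own values.

```python
# Minimal sketch: replay static JSON files into an event hub so the
# eventstream preview has live events to sample.
# Assumes: pip install azure-eventhub; placeholder connection string,
# event hub name, and file paths.
import json
import time
from pathlib import Path

from azure.eventhub import EventHubProducerClient, EventData

CONNECTION_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<policy>;SharedAccessKey=<key>"
EVENTHUB_NAME = "<your-event-hub>"

producer = EventHubProducerClient.from_connection_string(
    conn_str=CONNECTION_STR, eventhub_name=EVENTHUB_NAME
)

with producer:
    for path in Path("./static_json").glob("*.json"):
        records = json.loads(path.read_text())
        if isinstance(records, dict):
            records = [records]  # a single JSON object becomes one event

        batch = producer.create_batch()
        for record in records:
            batch.add(EventData(json.dumps(record)))
        producer.send_batch(batch)

        print(f"Sent {len(records)} events from {path.name}")
        time.sleep(5)  # small delay so events arrive as a steady trickle
```

Once events are actively flowing, the eventstream data preview should be able to sample them from the partitions.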
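And here is a sketch of the batch-ingestion alternative, using the azure-kusto-ingest Python package to queue a static JSON blob for ingestion (import paths may vary slightly between package versions). The cluster URI, database, table, and blob SAS URL are hypothetical placeholders; depending on your table schema you may also need an ingestion mapping.

```python
# Minimal sketch: queue a static JSON blob for ingestion into a KQL/Kusto table.
# Assumes: pip install azure-kusto-data azure-kusto-ingest; placeholder
# cluster URI, database, table, and blob SAS URL.
from azure.kusto.data import KustoConnectionStringBuilder
from azure.kusto.data.data_format import DataFormat
from azure.kusto.ingest import BlobDescriptor, IngestionProperties, QueuedIngestClient

# Note the "ingest-" prefix: queued ingestion goes to the ingestion endpoint.
kcsb = KustoConnectionStringBuilder.with_aad_device_authentication(
    "https://ingest-<cluster>.<region>.kusto.windows.net"
)
client = QueuedIngestClient(kcsb)

props = IngestionProperties(
    database="<your-database>",
    table="<your-table>",
    data_format=DataFormat.MULTIJSON,  # use JSON/MULTIJSON to match your files
    # ingestion_mapping_reference="<your-json-mapping>",  # if column names differ
)

# SAS URL of the static JSON file sitting in blob storage (placeholder).
blob = BlobDescriptor(
    "https://<account>.blob.core.windows.net/<container>/data.json?<sas-token>",
    size=10 * 1024,  # approximate size in bytes helps the service batch efficiently
)
client.ingest_from_blob(blob, ingestion_properties=props)
print("Ingestion queued; check the target table shortly.")
```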
If this is helpful, please accept the solution and give kudos.
Thanks
Hi @Nalapriya,
Please check that you meet the following prerequisites and that the setup steps are correct.
Get data from Azure Event Hubs:
Before you can create a connection to your Event Hubs data, you need to set a shared access policy on the event hub and collect some information (the namespace, event hub name, policy name, and key) to use later when setting up the connection. For more information on authorizing access to Event Hubs resources, see Shared Access Signatures.
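As a quick sanity check that the shared access policy and connection details you collected are valid, a short script like the one below (using the azure-eventhub Python SDK; the connection string and event hub name are placeholders) can confirm the event hub is reachable before you wire it into the eventstream.

```python
# Minimal sketch: verify the shared access policy / connection details by
# listing the event hub's partitions. Placeholder connection string and name.
from azure.eventhub import EventHubConsumerClient

CONNECTION_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<policy>;SharedAccessKey=<key>"
EVENTHUB_NAME = "<your-event-hub>"

client = EventHubConsumerClient.from_connection_string(
    CONNECTION_STR, consumer_group="$Default", eventhub_name=EVENTHUB_NAME
)
with client:
    print("Partitions:", client.get_partition_ids())
```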
Add Azure Event Hubs source to an eventstream:
Best Regards,
Neeko Tang
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.