The API I am retrieving data from only delivers one year of data in total, meaning today I can get data from 2024-06-04 to 2025-06-04. Tomorrow it will be 2024-06-05 to 2025-06-05, and so on.
So tomorrow I will be missing the data from 2024-06-04. In a DB I can insert, update, and delete. How do I handle the insert in this example?
Hi Bokazoit,
Thank you for your valuable feedback, and kindly accept our apologies for any inconvenience caused.
Although the API enforces a strict 365-day rolling window, retaining data older than one year is still possible; however, it requires an approach that decouples your storage from the API’s limitations.
Please find below an approach that helps resolve this issue:
If you find our response helpful, kindly mark it as the accepted solution and provide kudos. This will assist other community members facing similar queries. Should you have any further queries, please feel free to contact the Microsoft Fabric community.
Thank you.
Hi Bokazoit,
We have not yet received a response from you regarding this query and are following up to check whether you have found a resolution. If you have identified a solution, we kindly request you to share it with the community, as it may be helpful to others facing a similar issue.
If you find the response helpful, please mark it as the accepted solution and provide kudos, as this will help other members with similar queries.
Thank you.
It is not helpful, nor a solution. If all the solutions in the world were "not possible", the world would have stood still, and the same goes for evolution.
Hi @Bokazoit
In your scenario, the API only provides a rolling one-year window of data that shifts daily: each day you gain the newest day’s data and lose the oldest day’s. Because the window keeps moving forward, you cannot rely on the API alone to hold the full dataset; anything older than a year falls outside what it can serve.

The way to handle inserts is an incremental load process. Each day, fetch the current window from the API, insert records for the newly available dates, and update any existing records whose values have changed. Meanwhile, your database keeps everything it has already collected, including dates the API no longer returns; the pipeline must never delete rows merely because they dropped out of the API’s response.

In other words, treat the API as a moving snapshot and maintain your own persistent storage that accumulates data over time. Even as the API drops older dates, your database retains the full historical timeline, because you only ever insert and update new or changed records, and never delete older data unless explicitly required.
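The pattern above can be sketched in a few lines. This is a minimal illustration, not a drop-in solution: SQLite stands in for your real database, and `fetch_api_window` is a hypothetical stand-in for the actual API call. The key point is the upsert (`INSERT ... ON CONFLICT DO UPDATE`) keyed by date, with no deletes, so rows that fall out of the API window survive in the table.

```python
import sqlite3
from datetime import date, timedelta

def fetch_api_window(as_of: date) -> dict[str, float]:
    """Hypothetical stand-in for the real API call: returns one value
    per day for the rolling 365-day window ending at `as_of`."""
    return {
        (as_of - timedelta(days=n)).isoformat(): float(n)
        for n in range(365)
    }

def daily_load(conn: sqlite3.Connection, as_of: date) -> None:
    """Upsert today's API window; rows outside the window are never deleted."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS measurements ("
        " day TEXT PRIMARY KEY, value REAL)"
    )
    conn.executemany(
        "INSERT INTO measurements(day, value) VALUES (?, ?) "
        "ON CONFLICT(day) DO UPDATE SET value = excluded.value",
        fetch_api_window(as_of).items(),
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
daily_load(conn, date(2025, 6, 4))  # window 2024-06-05 .. 2025-06-04
daily_load(conn, date(2025, 6, 5))  # 2024-06-05 has now left the API window
rows = conn.execute("SELECT COUNT(*) FROM measurements").fetchone()[0]
print(rows)  # 366 — the dropped date survives alongside the new one
```

After the second run the API can no longer serve 2024-06-05, but the table still holds it: 365 days from the first load plus one new day from the second.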
Hi @Bokazoit,
We sincerely appreciate your inquiry posted on the Microsoft Fabric Community Forum.
To the best of my understanding, since the API provides data only for a rolling one-year window, the most effective way to prevent the loss of older data is to implement a daily incremental data ingestion process. This process should store each day’s retrieved data into a persistent storage medium such as a Microsoft Fabric Lakehouse, Warehouse, or SQL database.
Please consider the following approach, which may help resolve the issue:
If you find this response helpful, kindly mark it as the accepted solution and provide kudos. This will assist other community members who face similar issues.
Should you have any further queries, please feel free to contact the Microsoft Fabric Community.
Thank you.