Make sure the changes are deployed correctly in the CI Tinybird Branch. Optionally, you can add automated tests or verify the changes from the `tmp_ci_*` Branch created as part of the CI pipeline.
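For example, you can switch to the temporary Branch with the CLI and spot-check the deployed resources (a quick sketch; the `tmp_ci_*` Branch name is illustrative and comes from your CI run):

```
# List the available Branches, including the temporary one created by CI
tb branch ls

# Switch to the temporary CI Branch and inspect its resources
tb branch use tmp_ci_1234
tb pipe ls
```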
## 3: Splitting the Data into Chunks for Backfilling (for large datasets)
If your data source is large, you may run into a memory error like this:
```
error: "There was a problem while copying data: [Error] Memory limit (for query) exceeded. Make sure the query just process the required data. Contact us at support@tinybird.co for help or read this SQL tip: https://tinybird.co/docs/guides/best-practices-for-faster-sql.html#memory-limit-reached-title"
```
To avoid memory issues, break the backfill operation into smaller, manageable chunks. This approach reduces the memory load per query by processing only a subset of the data at a time. You can use the **data source's sorting key** to define each chunk.
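For example, if the sorting key starts with a `timestamp` column, you can run one populate job per time range using `--sql-condition`. A minimal sketch, assuming hypothetical pipe and node names (`analytics_pages`, `analytics_pages_1`) and a `timestamp` sorting key column; adjust them to your project:

```
# Backfill in two smaller chunks instead of one large populate job
tb pipe populate analytics_pages --node analytics_pages_1 \
  --sql-condition "timestamp >= '2024-01-01' AND timestamp < '2024-01-16'" --wait

tb pipe populate analytics_pages --node analytics_pages_1 \
  --sql-condition "timestamp >= '2024-01-16' AND timestamp < '2024-02-01'" --wait
```

Because each condition filters on the sorting key, every chunk reads a contiguous slice of the data source, keeping the memory used by each query bounded.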
Refer to [this guide](https://www.tinybird.co/docs/work-with-data/strategies/backfill-strategies#scenario-3-streaming-ingestion-with-incremental-timestamp-column) for more details.
## 4: Backfilling
Wait for the first event to be ingested into `analytics_pages_mv_1`, then proceed with the backfill.
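To confirm that events are arriving, you can query the data source directly or check the `tinybird.datasources_ops_log` Service Data Source (a sketch using the data source name from this guide):

```
# Check that at least one event has landed in the Materialized View
tb sql "SELECT count() FROM analytics_pages_mv_1"

# Inspect the latest operations recorded for that data source
tb sql "SELECT timestamp, event_type FROM tinybird.datasources_ops_log WHERE datasource_name = 'analytics_pages_mv_1' ORDER BY timestamp DESC LIMIT 5"
```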