Help Needed: Task Becomes Dormant While Processing Large Datasets

Hi Xano Community,

I’ve encountered a strange issue while processing a large dataset in Xano. Here’s a brief summary of the problem:

We’re working with over 56,000 records, and for each record the task needs to do the following (a rough sketch is included after the list):

  1. Restructuring.

  2. Calling an external API to fetch additional data if needed.

  3. Adding the processed data back to three different tables.
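
To make that concrete, here’s the per-record logic in Python-style pseudocode. The real implementation is a Xano function stack; the table names, field names, API URL, and helper functions below are just illustrative placeholders:

```python
import requests

API_URL = "https://api.example.com/lookup"  # placeholder for the external API


def restructure(record: dict) -> dict:
    # Placeholder for step 1: reshape the raw record into the format
    # the three target tables expect.
    item = dict(record)
    item["needs_enrichment"] = record.get("extra") is None
    return item


def insert_into(table: str, item: dict) -> None:
    # Placeholder for step 3: one "add record" call per target table.
    print(f"insert into {table}: {item['id']}")


def process_records(records: list[dict]) -> None:
    """Per-record logic of the background task (illustrative only)."""
    for record in records:
        item = restructure(record)  # step 1: restructure

        # Step 2: call the external API only when the record is missing data.
        if item["needs_enrichment"]:
            resp = requests.get(API_URL, params={"id": item["id"]}, timeout=30)
            resp.raise_for_status()
            item.update(resp.json())

        # Step 3: write the processed record to three different tables.
        for table in ("table_a", "table_b", "table_c"):
            insert_into(table, item)
```

(A single run works through the records in one long loop, which is why the statement counts mentioned below run into the millions.)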

I’ve set this operation to run as a background task, and it takes roughly 60 minutes to process 1,000 records. However, the task often becomes dormant after executing a certain number of statements.

By "dormant," I mean that the task status remains “Processing,” but there’s no visible CPU or memory activity. I’ve confirmed that the database isn’t locked or blocking the process, so it appears the task itself has stopped functioning correctly.

Here are some examples of the statement counts at which the task became dormant:

  • At 1,415,642 statements.

  • At 1,131,221 statements.

  • At 1,148,185 statements.

Each time this happens, I have to manually rerun the task to process the remaining data, which is both time-consuming and inefficient.
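
For clarity, the manual rerun amounts to restarting the same task against whatever hasn’t been written yet, roughly like this (the `processed` flag is just a placeholder for however completion is tracked):

```python
def rerun_remaining(all_records: list[dict]) -> None:
    # Illustrative only: skip records that were already written on the
    # previous run, then hand the rest to the same per-record logic
    # sketched above.
    remaining = [r for r in all_records if not r.get("processed")]
    process_records(remaining)
```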

Additionally, I haven’t found any logs or visual indicators that would help me identify what’s causing the task to become dormant.

My questions to the community are:

  1. Are there any hard limits in Xano on the amount of data or the number of statements a background task can process in one go?

  2. Has anyone experienced similar issues, and if so, how did you address them?

  3. Are there any best practices for optimizing tasks that handle large datasets in Xano?

Any advice, suggestions, or insights would be greatly appreciated. Thanks in advance for your help!
