Large Datatable Streamer

Processes large datasets in streaming chunks to avoid PAD memory issues with DataTables over ~50K rows. Instead of loading the entire file into memory, it reads N rows at a time from Excel, processes each chunk, appends the results to a CSV output file, and frees memory before reading the next chunk.
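The read-process-append-clear loop can be sketched in Python. This is a minimal approximation, not the PAD script itself: the row generator and output path are hypothetical stand-ins for the Excel read actions, and the comments map each step back to the PAD actions it mirrors.

```python
import csv
import itertools
import os
import tempfile

CHUNK_SIZE = 5_000  # conservative default; see chunk-size tuning note

def stream_to_csv(row_iter, out_path, chunk_size=CHUNK_SIZE):
    """Read chunk_size rows at a time and append each window to a CSV file."""
    row_iter = iter(row_iter)
    first = True
    while True:
        # One read window, analogous to reading a cell range via ReadCells.
        chunk = list(itertools.islice(row_iter, chunk_size))
        if not chunk:
            break
        # Fresh file on the first window, then append
        # (analogue of File.WriteToCSVFile with IfFileExists.Append).
        with open(out_path, "w" if first else "a", newline="") as f:
            csv.writer(f).writerows(chunk)
        first = False
        del chunk  # free the window before the next read (ClearDataTable analogue)

# Demo with a synthetic 12,500-row source (stands in for the Excel worksheet).
out = os.path.join(tempfile.gettempdir(), "streamed_demo.csv")
stream_to_csv(([i, f"row-{i}"] for i in range(12_500)), out)
```

Because each window is written out and dropped before the next read, peak memory stays proportional to the chunk size rather than the full row count.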


Problem this solves

PAD slows drastically or crashes with DataTables over ~50K rows, and users regularly report memory issues with large Excel files.

Usage Notes

  1. Memory management is the point. The key action is Variables.ClearDataTable after each chunk write.
  2. The LABEL/GOTO loop pattern is used instead of LOOP WHILE because LOOP WHILE is not in the verified production reference.
  3. CSV append mode: File.WriteToCSVFile with IfFileExists.Append adds rows without overwriting.
  4. Excel stays open for the duration of streaming — opened once and read in windows via ReadCells.
  5. Chunk size tuning: 5,000 rows is conservative. For narrow datasets, increase to 10,000-20,000.
  6. Safe stop (Pattern 6): insert at the top of the chunk loop for attended runs.
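The safe-stop check in item 6 amounts to testing for a stop signal at the top of each chunk iteration so an attended run can be halted cleanly between chunks. A hedged Python sketch, where the flag-file path and helper names are hypothetical:

```python
import os

STOP_FLAG = r"C:\Temp\stop.flag"  # hypothetical path an operator can create to request a stop

def run_chunks(total_chunks, stop_requested=lambda: os.path.exists(STOP_FLAG)):
    """Process chunks until done, or until the stop check fires at the top of the loop."""
    done = 0
    for _ in range(total_chunks):
        if stop_requested():  # safe stop: checked before each chunk, never mid-write
            break
        done += 1             # ... read window, append to CSV, clear the table ...
    return done
```

Checking before the chunk work (rather than after) guarantees the loop never exits with a half-written window.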