With the limited testing I've done, it does seem like it might be limited to a single function/process space. But I've noticed, at least with FG Classic, that if I create a bunch of nodes, say 100 or so, the memory is not released. It will balloon up to 2.5 GB, and when the process completes, it stays in use. In Unity, it never rose anywhere near as high, and it dropped right back down once the process completed. So I agree with MoonWizard that it may be a stack overflow somewhere.
It might be possible to chunk it. The easiest way I can think of is to load all the data into memory without creating the nodes, and store it in a global variable somewhere. Then, in a processing window, have a button that says "Process the next 100 records" or something, perhaps with a countdown of the number of lines left to process. This would require the user to click it multiple times, but that's better than having it crash or hang.
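Here's a rough Lua sketch of the batching logic I mean. Note that createNodeForRecord() and updateRemainingLabel() are just hypothetical placeholders for however the importer actually builds each node and however the window shows the countdown; the point is only the chunked loop and the global state.

```lua
-- Minimal sketch of chunked processing, assuming the file has already been
-- parsed into a plain Lua table of records (no nodes created yet).

local BATCH_SIZE = 100

-- Global state: all parsed records, and the index of the next one to process.
importRecords = {}
importIndex = 1

-- Called once after the file is read and parsed.
function loadRecords(tRecords)
	importRecords = tRecords
	importIndex = 1
end

-- Handler for the "Process the next 100 records" button.
function onProcessNextBatch()
	local nLast = math.min(importIndex + BATCH_SIZE - 1, #importRecords)
	for i = importIndex, nLast do
		createNodeForRecord(importRecords[i])  -- hypothetical: create one DB node per record
	end
	importIndex = nLast + 1

	-- Hypothetical UI update: show how many lines are left, so the user
	-- knows how many more clicks are needed.
	updateRemainingLabel(#importRecords - importIndex + 1)
end
```

Since each click only touches BATCH_SIZE records, whatever memory or stack pressure builds up during node creation gets a chance to settle between batches instead of all piling up in one giant call.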