As I mentioned, splitting the output into multiple files didn't help. I tried that workaround too, after I ran into the memory dependency when generating one big file.
The strange thing is that even with the split-up option selected, the memory requirement is the same as when generating a single file. Apparently, splitting the output does not avoid the Java heap space problem.
That is my experience.
I have the same problem, Jojo. Were you able to split your XML file in the end?
A simpler question, then:
What exactly does the generation mode "Slow with no memory consumed" do (what is the technical background)?
Just to keep this topic alive! ;-)
Thanks for your understanding!
I designed and exported a job that should generate a big XML file using tAdvancedFileOutputXML with generation mode "Slow with no memory consumed".
The problem is that file generation depends on the amount of memory allocated to the Java VM in the exported job's batch file, and it therefore ends with an out-of-memory exception whenever the allocated heap is smaller than the file size. Shouldn't the generation mode mentioned above prevent exactly this dependency?
Splitting the output file didn't help either; same problem.
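As a stopgap, the heap limit can be raised by editing the launcher script that ships with the exported job. A minimal sketch, assuming a typical exported-job shell script; the script name, classpath, and `myproject.myjob_0_1.MyJob` class are placeholders, not the actual names from this job:

```shell
# Exported Talend jobs include a launcher (e.g. MyJob_run.sh / MyJob_run.bat)
# with a java invocation similar to this. The -Xmx flag caps the Java heap.
# Raising it (here to 2 GB) postpones the OutOfMemoryError, but it does not
# remove the dependency on file size described above.
java -Xms256M -Xmx2048M -cp "$ROOT_PATH:$ROOT_PATH/classpath.jar" \
    myproject.myjob_0_1.MyJob "$@"
```

This only buys headroom up to the machine's available RAM, so a true streaming generation mode would still be the proper fix.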