Anyway, I took a scenario that runs fine under the genetic optimizer and switched it to the default optimizer. I knew it would run for hours to work through all the permutations, but I didn't expect any problems. I have tried this repeatedly under 8.0.0.9 and 8.0.0.10, and the memory utilization is enormous; it can reach the point where the machine runs out of RAM and starts to page, effectively killing everything.
Why does the default optimizer use so much more memory? The only theory I have is that the Log and Output entries are somehow to blame. With the genetic optimizer, even with 10 generations and a generation size of 100, that's only 1,000 total runs generating log and output entries. The default optimizer, however, runs every parameter permutation, so in the cases where the system dies I'm talking about tens of millions of combinations.
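To put some rough numbers on the scale gap: the genetic optimizer's run count is fixed by generations × generation size, while exhaustive optimization multiplies the number of values swept for every parameter. The parameter counts below are purely hypothetical, just to show how quickly the product reaches tens of millions:

```python
from math import prod

# Hypothetical sweep: 6 strategy parameters, each stepped over 20 values.
# (Illustrative numbers, not taken from the actual scenario.)
values_per_parameter = [20, 20, 20, 20, 20, 20]

# Default optimizer: every permutation of every parameter value.
exhaustive_runs = prod(values_per_parameter)

# Genetic optimizer: 10 generations with a generation size of 100.
genetic_runs = 10 * 100

print(f"genetic:    {genetic_runs:,} runs")      # genetic:    1,000 runs
print(f"exhaustive: {exhaustive_runs:,} runs")   # exhaustive: 64,000,000 runs
```

If each run's log and output entries are retained in memory for the life of the optimization, memory use would scale with the run count, which would be consistent with the 64,000× difference sketched here.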
The alternative is that the default optimizer has a memory leak. Either way, I'd like to know whether you've run massive numbers of combinations through the default optimizer and seen a similar result.
I'm running NT8 64-bit on a machine with 32 GB of RAM.
-Jason