When I backtest my pairs-trading strategy with a single set of parameters, the backtest takes a reasonable 10 seconds and uses about 100 MB of RAM. When I run an optimization that requires 10 backtests (10 possible parameter combinations), NT uses about 1 GB of RAM and takes about 100 seconds.
A problem arises when I try to run optimizations that require 50 backtests or more. The optimization does not scale linearly: it takes about an hour instead of the expected 500 seconds. NT always follows the same pattern, regardless of which instruments I run the strategy on:
1. CPU usage immediately spikes to 70%
2. RAM usage slowly crawls up to about 11GB
3. When RAM usage hits 11GB, CPU usage decreases to about 15%
I assume CPU usage drops because NT has exhausted physical RAM and is now paging to the hard drive, which introduces a fair amount of latency into the operation.
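To make the scaling problem concrete, here is a back-of-the-envelope check (a sketch using only the numbers reported above; the 11 GB ceiling is presumably the usable RAM on my machine). A simple linear extrapolation from the single-backtest cost predicts far less time and memory than I actually observe at 50 runs:

```python
# Numbers taken from the observations above.
PER_RUN_RAM_GB = 0.1   # single backtest: ~100 MB
PER_RUN_TIME_S = 10    # single backtest: ~10 s
RAM_CEILING_GB = 11    # point where CPU usage drops off

def expected(n_runs):
    """Linear extrapolation: what the optimization *should* cost
    if each backtest were independent and memory were released."""
    return n_runs * PER_RUN_TIME_S, n_runs * PER_RUN_RAM_GB

for n in (1, 10, 50):
    t, ram = expected(n)
    fits = ram <= RAM_CEILING_GB
    print(f"{n:3d} runs: ~{t} s, ~{ram:.1f} GB, fits in RAM: {fits}")
```

The linear model says 50 runs should need only about 5 GB and 500 seconds, yet observed RAM climbs to the 11 GB ceiling and the run takes an hour, so memory usage grows faster than linearly, which is consistent with results or data not being released between backtests.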
Has anyone else had this problem, and is there some way to fix it? I have a feeling that I might be doing something wrong.