NinjaTrader
Out of system memory when importing tick data
-
Most editors I've come across load the entire file into memory rather than scanning to the current area. Maybe UltraEdit can do it. Notepad++ easily dies when loading large files, and it's usually on par with TextPad.
I should get the extra memory tomorrow, so hopefully the import will be sorted with that alone.
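For what it's worth, viewing a window of a huge file without loading it all only takes a seek. A minimal Python sketch (the path and offsets are hypothetical) of the constant-memory access an editor would need:

```python
def read_window(path, offset, size=64 * 1024):
    """Read a small window from an arbitrary offset of a huge file
    without loading the rest into memory."""
    with open(path, "rb") as f:
        f.seek(offset)          # jump straight to the region of interest
        chunk = f.read(size)    # read only `size` bytes
    # Drop the likely-partial first line so the window starts cleanly.
    return chunk.split(b"\n", 1)[-1]
```

This is the "scan to current area" behavior described above: memory use depends only on the window size, never on the file size.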
@Matthew
Does NT7 support loading and doing historical testing on a data feed from a SQL database? It would truly be awesome if it could iterate through a simple select statement as a data source.
Last edited by filson; 11-29-2012, 05:11 PM.
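Leaving the NT7 question itself to support, the idea being asked for is easy to sketch in isolation: iterate a select statement and yield ticks one row at a time, so the full result set never sits in memory. A minimal Python/sqlite3 sketch (the table and column names are made up for illustration):

```python
import sqlite3

def ticks_from_db(db_path, instrument):
    """Stream (ts, price, volume) rows for one instrument, one at a time."""
    conn = sqlite3.connect(db_path)
    try:
        cur = conn.execute(
            "SELECT ts, price, volume FROM ticks "
            "WHERE instrument = ? ORDER BY ts",
            (instrument,),
        )
        # Iterating the cursor fetches rows lazily; memory stays flat.
        for row in cur:
            yield row
    finally:
        conn.close()
```

A backtester consuming this generator would see an ordered tick stream exactly as if it were replaying a file.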
-
There appear to be some products out there.

Originally posted by filson:
"Fair enough. I much appreciate you taking this into account for future releases. If you would like the sample data file for testing purposes, it's about 524MB zipped. I'd be happy to upload it somewhere accessible to you. Filip"
Good luck!
-
Try TextPad, at textpad.com.

Originally posted by filson:
"Don't really think I can find an editor that will accommodate the file size. I've already ordered more memory, hoping 32GB will fit it. Any ideas about the chances of that? It could be good to take a different approach to importing data, should you want to revisit the import code."

You might need to set up a RAM drive and do all sorts of stuff to load the file into memory. Alternatively, you might be able to write a small Java/C#/C++ program to truncate it, or use some Unix command-line trickery.
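A small program along those lines is straightforward. A minimal Python sketch (chunk size and naming scheme are arbitrary) that splits a huge tick file into pieces while only ever holding one line in memory:

```python
def split_file(src, lines_per_chunk=10_000_000):
    """Split src into src.part000, src.part001, ... streaming line by line."""
    part, out, count = 0, None, 0
    with open(src, "rb") as f:
        for line in f:
            if out is None:
                out = open(f"{src}.part{part:03d}", "wb")
            out.write(line)
            count += 1
            if count >= lines_per_chunk:
                out.close()
                out, count, part = None, 0, part + 1
    if out is not None:
        out.close()
```

The Unix command-line equivalent is `split -l 10000000 ticks.txt`, which does the same thing without any code.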
-
Fair enough.
I much appreciate you taking this into account for future releases.
If you would like the sample data file for testing purposes, it's about 524MB zipped.
I'd be happy to upload it somewhere accessible to you.
Filip
-
Can't say for sure; I'm not sure I've ever seen this much data imported at once.
We are looking into improving the import methods in the next major release, but I do not have an ETA on this at this time.
-
I don't really think I can find an editor that will accommodate the file size.
I've already ordered more memory, hoping 32GB will fit it. Any ideas about the chances of that?
It could be good to take a different approach to importing data, should you want to revisit the import code.
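A rough back-of-envelope check on the 32GB question. Only the 24.8GB file size comes from the thread; the bytes-per-line and bytes-per-tick figures below are assumptions for illustration, not measured NT7 numbers:

```python
file_gb = 24.8                 # size of the text file, from the post
bytes_per_line = 40            # assumed average length of one tick line
ticks = file_gb * 1024**3 / bytes_per_line

in_memory_bytes_per_tick = 48  # assumed: timestamp + price + volume + object overhead
ram_gb = ticks * in_memory_bytes_per_tick / 1024**3

print(f"~{ticks / 1e6:.0f}M ticks, ~{ram_gb:.0f} GB if all held in memory")
```

Under these (generous) assumptions the import would need on the order of 30GB resident, so 32GB might just barely fit; any extra per-object overhead would tip it over.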
-
Hello,
With this amount of data, I'd recommend separating the data into smaller files and re-importing them one at a time.
I can't recommend a specific file size as the amount you can import at one time will depend on your system resources.
-
Dear NinjaTrader,
I've begun testing NT7 with historical tick data.
For the EURUSD cross I've got 10 years or about 24.8GB of data.
When I tried to import this, NT7 ate all my system memory, reaching a full 7.4GB of used RAM, then stalled completely. Regrettably, that was about 6 months before the end of the data.
Do you have any tricks for optimizing memory usage in NT7?
It seems like all data is kept in memory before flushing the final product as historical data.
Filip
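The buffering suspicion above is the crux: an importer that flushes as it goes needs only constant memory. A minimal Python sketch of the streaming alternative (the semicolon input format and comma output format are made up; this is illustration, not NT7 code):

```python
def import_streaming(src, dst):
    """Convert a tick file record by record, flushing as it goes,
    so memory use stays flat regardless of input size."""
    with open(src) as fin, open(dst, "w") as fout:
        for line in fin:
            ts, price, volume = line.rstrip("\n").split(";")
            # Write each converted record immediately instead of
            # accumulating the whole data set before flushing.
            fout.write(f"{ts},{price},{volume}\n")
```

With this shape, a 24.8GB input costs no more RAM than a 24.8MB one.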