Writing data to one text file


    Hi, I've been struggling with this long enough by looking at other threads and C# documentation, so even though this question has been asked many times, here goes...

    Is there a simple way to open a text file, append one line to the end of it, and then close the file so multiple strategies can all write to the same file? This would give me the same result as printing information to the output window without eating up all my computer's RAM. In my case, I have about 200 programs running and over the course of a 390-minute day will write about 3000 lines, or less than 10 lines a minute.

    #2
    Hi egan857,

    There's nothing we support as a solution here, but other users have reported success using a mutex for this. Here's one thread for reference where a solution is shared:


    You could also consider using Log() calls, but there is a greater performance hit there.
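    The mutex approach from the linked thread can be sketched roughly as follows. This is a minimal, unsupported sketch: the file path, mutex name, and method name are illustrative, not from the thread. A named, system-wide Mutex serializes the writes so many strategies (even in separate processes) can append to one file without colliding:

    ```csharp
    using System;
    using System.IO;
    using System.Threading;

    // One named mutex shared by every strategy that writes to the file.
    private static readonly Mutex logMutex = new Mutex(false, "MySharedLogMutex");

    private void AppendLogLine(string line)
    {
        logMutex.WaitOne();          // wait until no other writer holds the lock
        try
        {
            // Open, append one line, and close in a single call.
            File.AppendAllText(@"C:\temp\strategies.log", line + Environment.NewLine);
        }
        finally
        {
            logMutex.ReleaseMutex(); // always release, even if the write throws
        }
    }
    ```

    Wrapping ReleaseMutex() in a finally block guards against the mutex being abandoned if a write ever throws.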
    Ryan M., NinjaTrader Customer Service



      #3
      Hi RyanM,

      Thanks! That worked. I'm now sending all my logs to text files instead of the output screen. It worked all day long with 200 strategies running - no concurrent write failures.

      The reason for doing this was to reduce my memory usage, since anything printed to the output screen is kept in memory. Can't be sure because I restarted NT many times to tweak the strategies, but I think it only reduced the memory use slightly.

      The messages that were sent to the output log are now stored in a string and then written to a file. I then clear out the string ("outputLine = null") immediately after writing. Is it possible the space allocated for the string is still allocated? Can I dispose of just that string to release the memory? Theoretically it shouldn't really matter; assuming a maximum of 256 bytes per string, that means I'd have 200 x 256 = ~50KB allocated for the most recent message across all strategies.

      The log file (2.6MB) and trace file (11.1MB) are no smaller today than in the past. There's no reason they should be since they never included the output from my Print statements.

      There doesn't seem to be a logical reason why my memory usage should go from about 1GB to about 3.5GB in 90 minutes. It only placed about 200 trades in the whole day, probably 50 during those 90 minutes. Can you think of any other reason why memory keeps on growing now that I'm not sending anything to the output screen?



        #4
        Great to hear that's working for you. Your string usage should be OK; no reason to think that's contributing.

        Using System.IO calls that open and close the file on every write can be expensive. StreamWriter is more efficient, but you'll need to make sure you're disposing of it when you're done. This is typically done within OnTermination(). There is a complete StreamWriter sample here that covers usage and disposal.
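        The pattern Ryan describes might look like the following in a NinjaScript strategy. This is a sketch, not the referenced sample: the file path is an example, and each strategy instance writes to its own file here (a single shared file would still need the mutex discussed earlier in the thread):

        ```csharp
        using System.IO;

        private StreamWriter sw;   // reused across bars instead of reopening the file

        protected override void OnBarUpdate()
        {
            // Lazily open the writer in append mode on first use.
            if (sw == null)
                sw = new StreamWriter(@"C:\temp\myStrategy.log", true);

            sw.WriteLine(Time[0] + " " + Close[0]);  // buffered write; no per-line open/close
        }

        protected override void OnTermination()
        {
            // Dispose flushes the buffer and releases the file handle.
            if (sw != null)
            {
                sw.Dispose();
                sw = null;
            }
        }
        ```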
        Ryan M., NinjaTrader Customer Service



          #5
          Hi RyanM,

          I'm using a mutex instead of StreamWriter, so the disposing appears to be different. I do issue a ReleaseMutex() statement immediately after writing, so hopefully that disposes of the extra gunk.

          Besides, that wouldn't have caused this situation; in this case, I started NT about 90 minutes before the close and activated 200 programs. They quickly used up about 1GB of RAM and then over the course of 90 minutes grew to over 3.5GB. During that time I did not disable any strategies so I would not have encountered the OnTermination section.

          But while I have you here, what else could be contributing to this increase in memory usage? The example above is typical, and it's actually a little better than what I was experiencing before writing to log files.



            #6
            Originally posted by egan857:
            I'm using a mutex instead of StreamWriter ... what else could be contributing to this increase in memory usage?
            It begins to look to me as if it is your strategy code that is eating up the memory. Have you tried just running the strategy with absolutely no logging at all to see what happens?



              #7
              RAM usage will be higher the more strategies you run. If performance is not acceptable when running 200 strategies, then reduce this number. If performance is acceptable and you're just looking to reduce RAM usage, you would need to isolate to one running instance at a time, and begin commenting out code sections you feel are contributing to high memory usage.
              Ryan M., NinjaTrader Customer Service



                #8
                Hi Koganam and RyanM,

                Yes, I'm going to turn off all logging later today and run it for a while to test that theory. I suspect, however, that will not solve the problem since my logging is now writing to disk and not to the output window. The top three contenders for this problem are:

                1. The number of variables in the strategy. I have well over 300 variables and a few arrays. Fortunately I don't have many strings, but let's just say each variable takes up 16 bytes. That's 4800 bytes * 200 programs = just under 1MB. The only way the variables can be a problem is if NT is allocating new space every time a value changes, which probably is not happening.

                2. The accumulation of minute bars. I have four timeframes: minute bar, minute bar on another stock (DIA), daily bar, and 3-minute bar. When the program loads up, it loads 20 calendar days' worth of history, or about 4000-5000 minute bars. Let's assume 5000 bars for the stock + 5000 bars for DIA + 15 daily bars + 1666 3-minute bars. That's almost 12,000 bars. By the end of the day it has accumulated 390+390+1+130=911 more bars, or almost 13,000 bars total. The problem is that after loading 12,000 bars for 200 programs it takes up about 1GB of RAM, and within 4-5 hours it takes up over 6GB of RAM. The numbers don't add up: why would it take 500% more space when it only adds about 10% more bars?

                3. Looping. Every minute, I loop through the last few minute bars to look for trends. I only loop through ALL of the minute bars when the program starts up. Every minute, I just loop through the most recent minutes for various reasons, plus I do some calculations to find the highest highs and lowest lows for the last few bars. This is probably the most likely reason I'm eating memory, because it's an activity that could accumulate memory usage.
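                As a side note on item 3, the highest-high/lowest-low portion of that per-minute loop can often be replaced with NinjaScript's built-in MIN()/MAX() helpers; the 20-bar lookback here is just an example value:

                ```csharp
                // Highest high and lowest low over the last 20 bars, no hand-written loop.
                double highestHigh = MAX(High, 20)[0];
                double lowestLow   = MIN(Low, 20)[0];
                ```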

                Anyway, if either of you have any other ideas, I'll be happy to experiment. I'm trying to get to the point where I can double the number of symbols so I can find more opportunities, but I can't even run all day long with my current number of symbols. I could definitely reduce the number of programs and survive, but I want to go in the opposite direction.



                  #9
                  General tips here are: eliminate loops when possible, avoid logging, dispose custom resources when finished.

                  Performance will vary depending on exactly what your strategy is doing, and we unfortunately cannot provide in-depth code optimization here. If you are developing within NT only, the only approach to take is to simplify (1 instance at a time) and isolate code segments until you can identify what is contributing to higher memory. Manually calculating what you think memory should be (especially using 200 instances) will not likely tell you much.

                  If you want proper analysis here, you can consider developing in Visual Studio and using its profiling tools to get accurate metrics of your memory allocation.
                  Ryan M., NinjaTrader Customer Service



                    #10
                    Thanks guys for your help. RyanM, I will look into the Visual Studio profiling later; I've only been using C# since the start of this year and have never used Visual Studio, but it looks like something that deserves investigation. In the meantime, I'll stick with empirical testing.

                    I ran multiple one-hour tests yesterday. Removing logs brought no improvement, probably because I had already seen some improvement by writing to disk instead of the output window. Removing a bunch of variables had no effect (but it made for cleaner code, so that's a good thing). But I did see a noticeable improvement when I removed some of the loops that run once a minute. Instead, I go through the loop once on startup, and then every minute I only look at the current minute bar and the minute bar that just dropped off my window to keep rolling totals. I removed the biggest loops and the results were dramatic: with 200 programs running, the footprint used to grow steadily to 2.3GB after an hour; after the changes, it only grew to 1.7GB. Still room for improvement, but I still have a few more loops to remove.
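                    The rolling-total trick described above can be sketched like this; the 20-bar window and variable names are illustrative, not from the post:

                    ```csharp
                    private double rollingSum;       // running sum of the closes in the window
                    private const int Window = 20;   // example window length

                    protected override void OnBarUpdate()
                    {
                        if (CurrentBar < Window)
                        {
                            rollingSum += Close[0];  // still filling the initial window
                            return;
                        }
                        // O(1) per bar: add the bar that entered, subtract the bar that left.
                        rollingSum += Close[0] - Close[Window];
                        double rollingAverage = rollingSum / Window;
                    }
                    ```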



                      #11
                      Thanks for following up, egan857. Removing loops can definitely have a big impact on performance. Loops make certain items a lot easier to code, and adapting away from them can certainly be challenging at times. Best of luck getting performance where you need it!
                      Ryan M., NinjaTrader Customer Service

