Streamwrite Questions

    Hello,

    I'm trying to use StreamWriter to write a List of Times from a chart to a CSV file. I would like to record the time at which a security had a local maximum. The basic code used to identify the Times & create the List is working fine:

    Code:
    if (High[5] > Math.Max(MAX(High, 5)[6], MAX(High, 5)[0]))
    {
        maxtimes.Add(Time[5]);
    }

    And the times are all correct. My problems relate to formatting the CSV.


    1. My first problem is that StreamWriter is writing a lot of duplicate values, and I still haven't been able to figure out how to remove duplicates from a CSV in C#!

    Any help here would be very much appreciated.


    2. My second problem: I would like the file to be saved as: <Instrument name> + "_Maxes" as the CSV filename.

    However, when I use this statement, it fails:

    path = @"C:\Users\[blahblahblah]" + Bars.Instrument.MasterInstrument.Name.ToString() + "_Maxes.csv";

    That path unfortunately does not seem to work! I get an error message:

    "Error on calling 'OnStateChange' method: Object reference not set to an instance of an object.
    Error on getting/setting property 'PaintPriceMarkers' for NinjaScript '[blah blah]': Exception has been thrown by the target of an invocation."

    Furthermore, NT8 zaps this indicator from the list of available indicators on a chart so long as the path is defined to include the instrument name.

    Could you please point me in the correct direction here? Thank you!

    #2
    Your questions seem pretty simple.

    As for your first question regarding 'deleting duplicative values' in a CSV
    file, is it your own code writing those values to the file?

    Let's backtrack. What source do these values come from? Are
    you writing the 'maxtimes' list to a CSV file? If you don't want this list
    to have duplicate values, well, then don't add duplicate values to the list.

    Code:
    if (High[5] > Math.Max(MAX(High, 5)[6], MAX(High, 5)[0]))
    {
        if (!maxtimes.Contains(Time[5]))
            maxtimes.Add(Time[5]);
    }
    My point is: if 'maxtimes' has no duplicate values, then when 'maxtimes'
    is written to a CSV file, the CSV file won't have any duplicates either.

    Let's take a look at your second question.

    I suspect the solution to your filename question may have something
    to do with the surrounding code. I mean, you're accessing Bars from
    OnStateChange() -- but are you aware of the limitations?

    I suspect you are accessing Bars from an inappropriate state.

    From the documentation:
    "Warning: The Bars object and its member should NOT be
    accessed within the OnStateChange() method before the State
    has reached State.DataLoaded."
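
    For what it's worth, here is a minimal sketch of how that might look.
    The indicator name and the use of Globals.UserDataDir for the folder
    are my assumptions, not taken from your code:

    Code:
    protected override void OnStateChange()
    {
        if (State == State.SetDefaults)
        {
            Name = "MaxTimesExporter";  // hypothetical name
        }
        else if (State == State.DataLoaded)
        {
            // Bars is guaranteed to be loaded by this state,
            // so the instrument name is safe to read here.
            path = System.IO.Path.Combine(NinjaTrader.Core.Globals.UserDataDir,
                Bars.Instrument.MasterInstrument.Name + "_Maxes.csv");
        }
    }

    Building the path in State.DataLoaded (rather than in SetDefaults or
    Configure) may also explain the indicator disappearing from the list:
    an exception thrown during SetDefaults will do exactly that.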
    Last edited by bltdavid; 03-13-2021, 11:28 AM. Reason: fix typos



      #3
      Hi there,

      Thank you so much for your response.

      Unfortunately, the solution you proffered for question #1 does not seem to work. The CSV continues to have many duplicate values.

      Your solution to question #2, however, was perfect, which I truly appreciate.

      If there may be any more bright ideas on how to remove the duplicative values from the CSV I'd greatly appreciate the help!





        #4
        My friend, you are welcome.

        However, let's get real, you want me to 'divine' another answer?
        Heheh, sorry, only one divine answer per thread.

        But, seriously, you're hitting a pet peeve here.
        Why? Think about this. Can an automotive mechanic diagnose
        your car trouble without looking at your car?

        Some can, but most cannot, and even more won't even bother.

        My point is:
        For Pete's sake, throw us a bone here!
        Show us the code that builds your CSV file!


        Sure, I have plenty of ideas. The main one is, somewhere in
        your code, your logic is simply wrong.

        If you want a better answer, well, I need to pop open the hood
        and see your code. Make sense?



          #5
          Thank you for your response!

          I got it to work on my own. What I did was the following:

          1. I went with a HashSet instead of a List. A HashSet will not add a value that is already present.

          2. Then, by simply using "File.Create" each time, the code overwrote whatever was there before, leaving just the unique values of the HashSet.
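
          In case it helps anyone later, a minimal sketch of the two steps
          above. The variable names are illustrative, 'path' is assumed to
          be set elsewhere, and I use File.CreateText here, which likewise
          truncates an existing file:

          Code:
          HashSet<DateTime> maxtimes = new HashSet<DateTime>();

          // HashSet.Add() silently ignores a value that is already present.
          maxtimes.Add(Time[5]);

          // File.CreateText() truncates any existing file, so only the
          // current contents of the HashSet end up in the CSV.
          using (System.IO.StreamWriter sw = System.IO.File.CreateText(path))
          {
              foreach (DateTime t in maxtimes)
                  sw.WriteLine(t.ToString("yyyy-MM-dd HH:mm:ss"));
          }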




            #6
            Originally posted by catinabag View Post
            Thank you for your response!

            I got it to work on my own. What I did was the following:

            1. I went with a HashSet instead of a List. A HashSet will not add a value that is already present.

            2. Then, by simply using "File.Create" each time, the code overwrote whatever was there before, leaving just the unique values of the HashSet.
            Yep, without code details, you're on your own.
            Glad you fixed it!

