The basics of the indicator are as follows:
Parameters: MA length, Compensation, and Prediction length (Plength).
Take an MA (the period is fixed) of the input. (1)
Take the difference between the Output and input[Plength] (I'll call it the error term).
Take the MA of the error term (call it avgerr). (2)
Set the output to the original MA (1) + Compensation * avgerr (2).
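As a rough sketch, the steps above might look like this in Python. Note the assumptions: I've used a simple MA, measured the error term against the base MA (1) rather than the final output (the recursive reading is ambiguous), and reused the MA length for the avgerr smoothing:

```python
import numpy as np

def sma(x, length):
    """Simple moving average; NaN until a full window is available."""
    out = np.full(len(x), np.nan)
    for i in range(length - 1, len(x)):
        out[i] = x[i - length + 1 : i + 1].mean()
    return out

def compensated_ma(price, ma_len=10, compensation=1.0, plength=5):
    """Error-compensated MA (sketch of the steps described above)."""
    price = np.asarray(price, dtype=float)
    base = sma(price, ma_len)                          # (1) fixed-period MA of the input
    # error term: base MA vs. the input plength bars back (input[Plength])
    err = np.full(len(price), np.nan)
    err[plength:] = base[plength:] - price[:-plength]
    avgerr = sma(err, ma_len)                          # (2) MA of the error term
    return base + compensation * avgerr                # final output
```

On a straight linear trend the error term is constant, so the compensated output is just the base MA plus a fixed offset; the interesting behavior shows up when the input changes slope.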
I have the above working.
What I want to do is create an indicator that adjusts the Compensation:
basically the initial Compensation would be provided as a parameter, and it would later be adjusted to reduce the error term (kind of like a walk-forward test).
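One simple way to do that adjustment, sketched below as a hypothetical scheme (not necessarily the exact walk-forward logic you have in mind): every `refit_every` bars, re-pick the Compensation from a fixed grid of candidates so that it would have minimized the mean absolute error over the bars seen so far, then apply that value going forward. The grid, refit interval, and error-against-input[Plength] scoring are all my assumptions:

```python
import numpy as np

def sma(x, length):
    """Simple moving average; NaN until a full window is available."""
    out = np.full(len(x), np.nan)
    for i in range(length - 1, len(x)):
        out[i] = x[i - length + 1 : i + 1].mean()
    return out

def adaptive_compensation(price, ma_len=10, plength=5, comp0=1.0,
                          refit_every=50, grid=np.linspace(-2.0, 2.0, 41)):
    """Start at comp0; every refit_every bars, grid-search the Compensation
    that would have minimized mean |output - input[plength]| on past bars only
    (no lookahead), then use it for subsequent bars."""
    price = np.asarray(price, dtype=float)
    base = sma(price, ma_len)
    err = np.full(len(price), np.nan)
    err[plength:] = base[plength:] - price[:-plength]
    avgerr = sma(err, ma_len)

    comp = comp0
    out = np.full(len(price), np.nan)
    comps = np.full(len(price), comp0)
    for i in range(len(price)):
        if i > 0 and i % refit_every == 0:
            # score each candidate Compensation on bars 0..i-1 only
            scores = []
            for c in grid:
                cand = base[:i] + c * avgerr[:i]
                e = cand[plength:] - price[:i - plength]
                scores.append(np.nanmean(np.abs(e)))
            comp = grid[int(np.nanargmin(scores))]
        comps[i] = comp
        out[i] = base[i] + comp * avgerr[i]
    return out, comps
```

Restricting the refit to already-seen bars is what makes this walk-forward-ish: the Compensation used at bar i never depends on data after bar i. A grid search is crude but robust; if the error is roughly linear in Compensation you could instead solve for it in closed form over the trailing window.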
