The Frequency Effect Part Deux: Shifting Time at Frequency Zero For Better Trading Performance

Animation 1: The out-of-sample performance over 60 trading days of a signal built using an optimized time-shift (i2) criterion. With 4 of its 5 trades successful, the ROI is nearly 40 percent over 3 months.

What is an optimized time-shift, and is it important when building successful financial trading signals? While the theoretical aspects of frequency zero and the vanishing time-shift can be discussed in a very formal and mathematical manner, I hope to answer these questions in a simpler (and more applicable) way in this article. To do this, I will give an informative, illustrated real-world example in this unforeseen continuation of my article on the frequency effect from a few days ago. I discovered something quite interesting after I got an e-mail from Herr Doktor Marc (Wildi) that nudged me even further into my circus of investigations into carving out optimal frequency intervals for financial trading (see his blog for the exact e-mail and response). Soon after I sent my response to Marc, I began to question a few things further at 3 a.m. while sipping some Asian raspberry white tea (my sleeping patterns lately have been as erratic as fiscal cliff negotiations), and came up with an idea. Firstly, there has to be a way to include information about the zero frequency (this wasn’t included in my previous article on optimal frequency selection). Secondly, if I’m seeing promising results using a narrow band-pass approach after optimizing its location and width, is there any way to still incorporate the zero frequency and perhaps improve results even more with this additional frequency information?

Frequency zero is an important frequency in the world of nonstationary time series and model-based time series methodologies, as it deals with unit roots, integrated processes, and (for multivariate data) cointegration. Fortunately for you (and me), I don’t need to delve further into that mess of a topic that is cointegration, since the type of data we typically deal with in financial trading (log-returns) is close to stationary (namely, close to white noise; ehem, again, close, but not quite). Nonetheless, a typical sequence of log-return data over time is never zero-mean, and it is full of interesting turning points at certain frequency bands. In essence, we’d like to take advantage of that and perhaps better locate the local turning points intrinsic to the optimal trading frequency range we are dealing with.

The perfect way to do this is through the use of the time-shift value of the filter, defined by the derivative of the frequency response (or transfer) function at frequency zero. Suppose we have an optimal bandpass set at (\omega_0, \omega_1) \subset [0,\pi], where \omega_0 > 0. We can introduce a constraint on the filter coefficients so as to impose a vanishing time-shift at frequency zero. As Wildi says on page 24 of the Elements paper: “A vanishing time-shift is highly desirable because turning-points in the filtered series are concomitant with turning-points in the original data.” In fact, we can take this a step further and impose an arbitrary time-shift with value s at frequency zero, where s is any real number. In this case, the derivative of the frequency response function (transfer function) \hat{\Gamma}(\omega) at zero is s. As explained on page 25 of Elements, this is implemented as \frac{d}{d \omega}\big|_{\omega=0} \sum_{j=0}^{L-1} b_j \exp(-i j \omega) = s, which implies b_1 + 2b_2 + \cdots + (L-1) b_{L-1} = s.
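To make the constraint concrete, here is a minimal numerical sketch in Python/NumPy (not the iMetrica implementation, which imposes the constraint inside the MDFA optimization itself): the coefficient vectors satisfying b_1 + 2b_2 + \cdots + (L-1)b_{L-1} = s form a hyperplane, and the nearest constrained filter to a given unconstrained one is its orthogonal projection onto that hyperplane.

```python
import numpy as np

def impose_time_shift(b, s):
    """Orthogonally project filter coefficients b onto the hyperplane
    b_1 + 2*b_2 + ... + (L-1)*b_{L-1} = s, i.e. the set of filters
    whose frequency response has derivative s at frequency zero."""
    b = np.asarray(b, dtype=float)
    c = np.arange(len(b), dtype=float)       # c = (0, 1, ..., L-1)
    return b - ((c @ b - s) / (c @ c)) * c   # move along the normal c

# sanity check: the projected filter satisfies sum(j * b_j) = s
b0 = np.array([0.30, 0.25, 0.20, 0.15, 0.10])
b1 = impose_time_shift(b0, s=0.0)            # vanishing time-shift
print(np.dot(np.arange(5), b1))              # ~0.0 up to rounding
```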

This constraint can be integrated into the MDFA formulation, but it of course adds another parameter to an already full flight of parameters. Furthermore, the search for the optimal s with respect to a given financial trading criterion is tricky and takes some hefty computational assistance from a robust (highly nonlinear) optimization routine, but it can be done. In iMetrica I’ve implemented a time-shift turning-point optimizer, something that works well so far for my taste buds, but it takes a large amount of computational time to find the optimum.

To illustrate this methodology in a real financial trading application, I return to the same example I used in my previous article, namely using daily log-returns of GOOG and AAPL from 6-3-2011 to 12-31-2012 to build a trading signal. This time, to freshen things up a bit, I’m going to target and trade shares of Apple Inc. instead of Google. Before I begin, I will swiftly go through the basic steps of building trading signals. If you’re already familiar, feel free to skip down two paragraphs.
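For readers who want to reproduce the data setup outside of iMetrica, computing the daily log-returns is straightforward. A minimal sketch (file and column names here are hypothetical; iMetrica loads the data through its own interface):

```python
import numpy as np
import pandas as pd

# hypothetical CSV files of daily closing prices, one row per trading day
goog = pd.read_csv("goog_daily.csv", index_col="date", parse_dates=True)["close"]
aapl = pd.read_csv("aapl_daily.csv", index_col="date", parse_dates=True)["close"]

# daily log-returns: r_t = log(p_t) - log(p_{t-1})
goog_ret = np.log(goog).diff().dropna()
aapl_ret = np.log(aapl).diff().dropna()

# AAPL log-returns are the target series; GOOG's serve as explanatory data
target      = aapl_ret.loc["2011-06-04":"2012-09-25"]
explanatory = goog_ret.loc["2011-06-04":"2012-09-25"]
```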

As I’ve mentioned in the past, fundamentally the most important step in building a successful and robust trading signal is choosing an appropriate preliminary in-sample metric space in which the filter coefficients for the signal are computed. This preliminary in-sample metric space is built using the following ingredients:

  • The target and explanatory series (i.e. minute, hourly, daily log-returns of financial assets)
  • The time span of in-sample observations (i.e. 6 hours, 20 days, 168 days, 3 years, etc.)

Choosing the appropriate preliminary in-sample metric space is beyond the scope of this article, but it will certainly be discussed in a future article. Once this in-sample metric space has been chosen, one can proceed by choosing the optimal extractor (the frequency bandpass interval) for the metric space. While selecting the optimal extractor, one must concurrently begin warping and bending the preliminary metric space through the use of the various customization and regularization tools (see my previous Frequency Effect article, as well as Marc’s Elements paper, for an in-depth look at the mathematics of regularization and customization). These are the principal steps.

Now let’s look at an example. In the .gif animation at the top of this article, I featured a signal that I built using this time-shift optimizer and a frequency bandpass extractor heavily centered around the frequency \pi/12, which does not produce very frequent trading, but has its benefits, as we’ll see. The preliminary metric space was constructed from an in-sample period of daily log-returns of GOOG and AAPL, with AAPL as the target, from 6-4-2011 to 9-25-2012, nearly 16 months of data. Note that the in-sample period includes many important news events for Apple Inc., such as the announcements of the iPad mini and the iPhone 4S and 5, and the sad passing of Steve Jobs. I then proceeded to bend the preliminary metric space with a heavy dosage of regularization, but only a tablespoon of customization¹. Finally, I set the time-shift constraint and applied my optimization routine in iMetrica to find the value s that yields the best possible turning-point detector for the in-sample metric space. The result is shown in Figure 1 in the slide-show below. The in-sample signal from the last 12 months or so (no out-of-sample data yet applied) is plotted in green, and since I have future data available (more than 60 trading days’ worth, from 9-25 to present), I can also approximate the target symmetric filter (the theoretically optimal target signal) in order to compare the two (a quite useful option available at the click of a button in iMetrica, I might add). I do this to have a good barometer of over-fitting and of concurrent-filter robustness at the most recent in-sample observation. In Figure 1 of the slide-show below, the trading signal is in green, the AAPL log-return data in red, and the approximated target signal in gray (recall that if you can approximate this target signal arbitrarily well, you win, big).

Notice that at the very endpoint of the signal in Figure 1 (the most challenging point at which to achieve greatness), the filter does a very fine job of getting extremely close. In fact, since the theoretical target signal is only a Fourier approximation of order 60, my concurrent signal might even be closer to the ‘true’ value; who knows. Achieving exact replication of the target signal (gray) elsewhere is less critical in my experience. All that really matters is that the signal moves above and below zero in concert with the targeted intention (the symmetric filter) and stays close at the most recent in-sample observation. Figure 2 in the slide-show shows the signal without the time-shift constraint and optimization. You might be inclined to say there is no big difference; in fact, the signal with no time-shift constraint looks even better. It’s hard to draw such a conclusion in-sample, but here is where things get interesting.

We now apply both filters to the out-of-sample data, namely the 60 trading days. Figure 3 shows the out-of-sample performance over these past 60 trading days, roughly October, November, and December (12-31-2012 being the latest trading day), of the signal without the time-shift constraint. Compare that to Figure 4, which depicts the performance with the constraint and optimization. It is hard to tell a difference at first glance, so let’s look closer at the vertical lines, which can be easily plotted in iMetrica using the plot button below the canvas named Buy Indicators. A green line marks where a long position begins (we buy shares) and any short position is exited; a magenta line marks where the shares are sold and a short position is entered. These lines, in other words, are the turning-point detection lines: they determine where one enters a long or short position. Compare the two figures in the out-of-sample portion after the light cyan line (indicated in Figure 4 but not Figure 3, sorry).
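The rule behind those lines is simple: the sign of the signal dictates the position, and a sign change marks a turning point. Here is a sketch of how such buy/sell markers can be derived from any trading signal (this mimics the plotting logic just described, not iMetrica’s internal code):

```python
import numpy as np

def turning_points(signal):
    """Return the indices where a long position is entered (the signal
    crosses above zero: the 'green line') and where a short position
    is entered (the signal crosses below zero: the 'magenta line')."""
    pos = np.sign(signal)                 # +1 long, -1 short
    change = np.diff(pos)                 # nonzero wherever the sign flips
    buys  = np.where(change > 0)[0] + 1   # up-crossings: buy / exit short
    sells = np.where(change < 0)[0] + 1   # down-crossings: sell / enter short
    return buys, sells

signal = np.array([-0.4, -0.1, 0.2, 0.5, 0.3, -0.2, -0.6, 0.1])
print(turning_points(signal))             # (array([2, 7]), array([5]))
```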

Figure 3: Out-of-sample performance of the signal built without the time-shift constraint. The out-of-sample period begins at the light cyan line shown in Figure 4 below.

Figure 4: Out-of-sample performance of the signal built with the time-shift constraint, optimized for turning-point detection. The out-of-sample period begins at the light cyan line.

Notice how the optimized time-shift constraint in the trading signal of Figure 4 pinpoints the turning points almost perfectly (specifically at points 3, 4, and 5). The local minimum turning point was detected exactly at 3, and nearly exactly at 4 and 5. The only loss out of the 5 trades occurred at 2, but this was more the fault of the long, unexpected fall in the share price of Apple in October. Fortunately, we were able to make up for those losses (and then some) on the next trade, entered exactly at the moment a big turning point came (3). Compare this to the signal without the optimized time-shift constraint (Figure 3), where the second and third turning points are detected a bit too late and too early, respectively. And remember, this performance is all out-of-sample: no adjustments were made to the filter, nothing adaptive. To see even more clearly how the two signals compare, here are the gains and losses of the 5 actual trades performed out-of-sample (all numbers are percentages of the trading account value governed only by the signal; a positive number is a gain, a negative one a loss):

              Without Time-Shift Optimization     With Time-Shift Optimization

Trade 1:           29.1 -> 38.7 =  +9.6                14.1 -> 22.3 =  +8.2
Trade 2:           38.7 -> 32.0 =  -6.7                22.3 -> 17.1 =  -5.2
Trade 3:           32.0 -> 40.7 =  +8.7                17.1 -> 30.5 = +13.4
Trade 4:           40.7 -> 48.2 =  +7.5                30.5 -> 41.2 = +10.7
Trade 5:           48.2 -> 60.2 = +12.0                41.2 -> 53.2 = +12.0

The optimized time-shift signal is clearly better, with a total out-of-sample gain of 39.1 percent (nearly 40 percent ROI) in 3 months of trading, compared to 31.1 percent (roughly 30 percent ROI) for the unconstrained signal. I’ll take the optimized time-shift constrained signal any day; I can sleep at night with this type of trading signal. Notice, too, that this trading was applied over a period in which Apple Inc. lost nearly 20 percent of its share price.

Another nice aspect of the trading frequency interval I used is that trading costs aren’t much of an issue, since only 10 transactions (2 transactions per trade) were made in the span of 3 months, even though I set the cost at .01 percent per transaction.

To dig a bit deeper into plausible reasons why the optimization of the time-shift constraint matters (if only a little bit), let’s take a look at the plots of the coefficients of each respective filter. Figure 5 depicts the filter coefficients with the optimized time-shift constraint, and Figure 6 shows the coefficients without it. Notice how, in the filter applied to the AAPL log-return data (bluish-tinted line), the constrained filter privileges the latest observation much more, while modifying the others only gradually. In the non-optimized time-shift filter, the most recent observation carries much less importance and, in fact, a larger lag is privileged more. For timely turning-point detection, this is (probably) not a good thing. Another interesting observation is that the optimized time-shift filter completely disregards the latest observation in the log-return data of GOOG (purplish line) when determining the turning points. Maybe a “better” financial asset could be used for trading AAPL? Hmmm… well, in any case I’m quite ecstatic with these results so far. I just need to hack my way into writing a faster time-shift optimization routine; it’s a bit slow at this point. Until next time, happy extracting. And feel free to contact me with any questions.

Figure 5: The filter coefficients with time-shift optimization.

Figure 6: The filter coefficients without the time-shift optimization.

¹ I won’t disclose quite yet how I found these optimal parameters and frequency interval or reveal what they are as I need to keep some sort of competitive advantage as I presently look for consulting opportunities 😉 .


Hierarchy of Financial Trading Parameters

Figure 1: A trading signal produced in iMetrica for the daily price index of GOOG (Google), using the log-returns of GOOG and AAPL (Apple) as the explanatory data. The blue-pink line represents the account wealth over time, with an 89 percent return on investment in 16 months (GOOG recorded a 23 percent return during this time). The green line represents the trading signal built in the MDFA module using the hierarchy of parameters described in this article. The gray line is the log price of GOOG from June 6, 2011 to November 16, 2012.

In any computational method for constructing binary buy/sell signals for trading financial assets, a plethora of parameters is involved and must be taken into consideration when computing and testing the signals in-sample for their effectiveness and performance. As traders and trading institutions typically rely on different financial priorities for navigating their positions, such as risk/reward priorities, minimizing trading costs and trading frequency, or maximizing return on investment, a robust set of parameters for adjusting and meeting the criteria of any of these financial aims is needed. The parameters must make clear how and why adjusting them will steer the trading signal toward the goals in mind. It is my strong belief that any computational paradigm that fails to do so should not be considered a candidate for a transparent, robust, and complete method for trading financial assets.

In this article, we give an in-depth look at the hierarchy of financial trading parameters involved in building financial trading signals using the powerful and versatile real-time multivariate direct filtering approach (MDFA; Wildi 2006, 2008, 2012), the principal method used in the financial trading interface of iMetrica. Our aim is to clearly identify the characteristics of each parameter involved in constructing trading signals using the MDFA module in iMetrica, as well as what effects (if any) the parameter has on building trading signals and on their performance.

With the many different parameters at one’s disposal for computing a signal from virtually any type of financial data and under any financial priority profile, there naturally exists a hierarchy among these parameters, all of which have well-defined mathematical definitions and properties. We propose a categorization of these parameters into three levels according to the clarity of their effect on building robust trading signals. Below are the four main control panels used in the MDFA module for the Financial Trading Interface (shown in Figure 1). They will be referenced throughout the remainder of this article.

Figure 2: The interface for controlling many of the parameters involved in MDFA. Adjusting any of these parameters will automatically compute the new filter and signal output with the new set of parameters and plot the results on the MDFA module plotting canvases.

Figure 3: The main interface for building the target symmetric filter that is used for computing the real-time (nonsymmetric) filter and output signal. Many of the desired risk/reward properties are controlled in this interface. One can control every aspect of the target filter as well as spectral densities used to compute the optimal filter in the frequency domain.

Figure 4: The main interface for constructing Zero-Pole Combination filters, the original paradigm for real-time direct filtering. Here, one can control all the parameters involved in ZPC filtering, visualize the frequency domain characteristics of the filter, and inject the filter into the I-MDFA filter to create “hybrid” filters.

Figure 5: The basic trading regulation parameters currently offered in the Financial Trading Interface. This panel is accessed through the Financial Trading menu at the top of the software. Here, we have direct control over setting the trading frequency, the trading costs per transaction, and the risk-free rate for computing the Sharpe Ratio, all controlled by simply sliding the bars to the desired level. One can also set the option to short-sell during the trading period (provided that one is able to do so with the type of financial asset being traded).

The Primary Parameters:

  • Trading Frequency. As the name entails, the trading frequency governs how often buy/sell signals occur during the span of the trading horizon. Whether one uses minute, hourly, or daily data, the trading frequency regulates when trades are signaled and is also a key parameter when considering trading costs. The parameter that controls the trading frequency is the cutoff frequency in the target filter of the MDFA, and it is regulated in either the Target Filter Design interface (see Figure 3) or, if one is not accustomed to building target filters in MDFA, via a simpler parameter in the Trading Parameter panel (see Figure 5). In Figure 3, the pass-band and stop-band properties are controlled by any one of the sliding scrollbars. The design of the target filter is plotted in the Filter Design canvas (not shown).
  • Timeliness of signal. The timeliness of the signal controls the quality of the phase characteristics of the real-time filter that computes the trading signal. Namely, it controls how well turning points (momentum changes) are detected in the financial data while minimizing the phase error in the filter. Bad timeliness properties lead to a large delay in detecting up/downswings in momentum; good timeliness properties lead to anticipative detection of momentum in real-time. However, timeliness must be balanced against smoothness, as too much timeliness adds unwanted noise to the trading signal, leading to unnecessary trades. The timeliness of the filter is governed by the \lambda parameter, which controls the phase error in the MDFA optimization. This is done using the sliding scrollbar marked \lambda in the Real-Time Filter Design panel in Figure 2. One can also control the timeliness property of ZPC filters using the \lambda scrollbar in the ZPC Filter Design panel (Figure 4).
  • Smoothness of signal. The smoothness of the signal is related to how well the filter has suppressed the unwanted frequency information in the financial data, resulting in a smoother trading signal that corresponds more directly to the targeted signal and trading frequency. A signal that has been subjected to too much smoothing, however, will lose any important timeliness advantages, resulting in delayed trades or no trades at all. The smoothness of the filter is adjusted through the \alpha parameter, which controls the error in the stop-band between the targeted filter and the computed concurrent filter. The smoothness parameter is found on the Real-Time Filter Design interface in the sliding scrollbar marked W(\omega) (see Figure 2) and in the sliding scrollbar marked \alpha in the ZPC Filter Design panel (see Figure 4).
  • Quantization of information. Here, the quantization of information relates to how much past information is used to construct the trading signal. In MDFA, it is controlled by the length of the filter L and is found on the Real-Time Filter Design interface (see Figure 2). In theory, as the filter length L grows, more past information from the financial time series is used, resulting in a better approximation of the targeted filter. However, as the saying goes, there’s no such thing as a free lunch: increasing the filter length adds more degrees of freedom, which leads to the age-old problem of over-fitting. The result: increased nonsense at the most concurrent observation of the signal, and chaos out-of-sample. Fortunately, we can relieve the problem of over-fitting by using regularization (see the Secondary Parameters). The length of the filter is controlled by the sliding scrollbar marked Order-L in the Real-Time Filter Design panel (Figure 2). (A toy sketch of how these four primary parameters enter the filter criterion follows this list.)
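To fix ideas, here is a toy univariate frequency-domain criterion in Python/NumPy, written in the spirit of the customization described above. It is a simplified sketch under my own assumptions, not the actual MDFA criterion (see the Elements paper for that): lam inflates phase errors (timeliness), alpha boosts the stop-band weighting (smoothness), cutoff sets the trading frequency, and len(b) is the quantization of information.

```python
import numpy as np

def toy_mdfa_criterion(b, x, cutoff, lam=0.0, alpha=0.0):
    """Weighted frequency-domain distance between an ideal low-pass
    target and the concurrent filter b, computed on the data x.
    A sketch illustrating the primary parameters, NOT the MDFA
    criterion itself."""
    T = len(x)
    omegas = 2.0 * np.pi * np.fft.rfftfreq(T)        # grid on [0, pi]
    weight = np.abs(np.fft.rfft(x)) ** 2 / T         # periodogram of x
    js = np.arange(len(b))
    # frequency response of the concurrent filter at each omega
    gamma_hat = np.array([np.sum(b * np.exp(-1j * js * w)) for w in omegas])
    target = (omegas <= cutoff).astype(float)        # ideal low-pass
    err = target - gamma_hat
    # timeliness: penalize the imaginary (phase) part more as lam grows
    sq_err = err.real ** 2 + (1.0 + lam) * err.imag ** 2
    # smoothness: boost the weight of stop-band frequencies as alpha grows
    stop = omegas > cutoff
    weight[stop] *= (1.0 + omegas[stop] - cutoff) ** alpha
    return float(np.sum(weight * sq_err))
```

Minimizing such a criterion over b with lam = alpha = 0 gives a plain least-squares filter approximation; turning the two knobs up trades mean-square accuracy for timeliness or smoothness, which is exactly the tension described next.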

As you might have suspected, there exists a so-called “uncertainty principle” regarding the timeliness and smoothness of the signal. Namely, one cannot achieve a perfectly timely signal (zero phase error in the filter) while at the same time remaining certain that the timely signal estimate is free of unwanted “noise” (perfectly filtered data in the stop-band of the filter).   The greater the timeliness (better phase error), the lesser the smoothness (suppression of unwanted high-frequency noise). A happy combination of these two parameters is always desired, and thankfully there exists in iMetrica an interface to optimize these two parameters to achieve a perfect balance given one’s financial trading priorities. There has been much to say on this real-time direct filter “uncertainty” principle, and the interested reader can seek the gory mathematical details in an original paper by the inventor and good friend and colleague Professor Marc Wildi here.

The Secondary Parameters 

Regularization of filters is the act of projecting the filter space into a lower-dimensional space, reducing the effective number of degrees of freedom. Recently introduced by Wildi in 2012 (see the Elements paper), regularization has three different members to adjust according to the preferences of the signal extraction problem at hand and the data. The regularization parameters are classified as secondary parameters and are found in the Additional Filter Ingredients section in the lower portion of the Real-Time Filter Design interface (Figure 2). They are described as follows, with a sketch of the corresponding penalty terms after the list.

  • Regularization: smoothness. Not to be confused with the smoothness parameter found in the primary list, this regularization technique serves to project the filter coefficients of the trading signal into an approximation space satisfying a smoothness requirement, namely that the finite differences of the coefficients up to a certain order are kept relatively small. The effect is that the coefficients appear smoother as the smoothness parameter increases. Furthermore, as the approximation space becomes more “regularized” by the requirement that solutions be “smoother”, the effective degrees of freedom decrease and the chances of over-fitting decrease as well. The direct consequences of applying this type of regularization to the signal output are typically quite subtle and depend on how much smoothness is applied to the coefficients. Personally, I usually begin with this parameter for my regularization needs, to decrease the number of effective degrees of freedom and improve out-of-sample performance.
  • Regularization: decay. Employing the decay parameter ensures that the coefficients of the filter decay to zero at a certain rate as the lag of the filter increases. In effect, it is another form of information quantization, as the trading signal will tend to lessen the importance of past information as the decay increases. This rate is governed by two decay parameters: the first adjusts the strength of the decay, and the second adjusts how fast the coefficients decay to zero; the higher the values, the faster the coefficients decrease to zero. Usually, just a slight touch on the strength of the decay, followed by an adjustment of its speed, is the order in which to proceed. As with the smoothness regularization, the number of effective degrees of freedom will (in most cases) decrease as the decay parameters increase, which is a good thing (in most cases).
  • Regularization: cross correlation. Used only for building trading signals with multivariate data, this regularization groups the latitudinal (cross-sectional) structure of the multivariate time series more closely, resulting in an estimate of the target filter that weights the target data’s frequency information more heavily. As the cross regularization parameter increases, the filter coefficients of the individual time series converge toward each other. It should typically be used as a last resort to control over-fitting, and only if the financial time series are on the same scale and all highly correlated.
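For intuition, each of the three regularizations can be pictured as a quadratic penalty added to the filter criterion. Here is a minimal sketch of what such penalty matrices could look like (a simplification under my own assumptions; the actual scalings and parametrizations in MDFA differ, see the Elements paper):

```python
import numpy as np

def smoothness_penalty(L):
    """Quadratic form b' S b that is large when neighboring filter
    coefficients change erratically (discrete second differences)."""
    D = np.zeros((L - 2, L))
    for i in range(L - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]   # second difference at lag i+1
    return D.T @ D

def decay_penalty(L, rate):
    """Weighted ridge penalty: coefficients at larger lags are penalized
    more heavily, so they are shrunk toward zero faster as rate grows."""
    return np.diag((1.0 + rate) ** (2.0 * np.arange(L)))

def cross_penalty(L, n):
    """Penalty on the deviation of each of the n series' filters from
    their cross-sectional mean filter: drives the filters together."""
    Q = np.eye(n * L) - np.tile(np.eye(L), (n, n)) / n
    return Q.T @ Q

# schematically, the regularized criterion then reads
#   mdfa_criterion(b) + w1 * b'Sb + w2 * b'Db + w3 * B'CB,
# where B stacks the filters of all n series into one long vector
```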

The Tertiary Parameters

  • Phase-delay customization. The phase-delay of the filter at frequency zero, defined by the instantaneous rate of change of the filter’s phase at frequency zero, characterizes important information related to the timeliness of the filter. One can directly ensure that the phase delay of the filter at frequency zero is zero by adding constraints to the filter coefficients at computation time. This is done by clicking the i2 option in the Real-Time Filter Design interface. To go further, one can even set the phase delay to a fixed value other than zero using the i2 scrollbar in the Additional Filter Ingredients box. Setting this value (between -20 and 20 on the scrollbar) ensures that the phase delay of the filter at zero behaves as prescribed. Its use and benefit are still under investigation. In any case, one can seamlessly test how this constraint affects the trading signal output in one’s own trading strategies by visualizing its performance in-sample using the Financial Trading canvas.
  • Differencing weight. This option, found in the Real-Time Filter Design interface as the checkbox labeled “d” (Figure 2), multiplies the frequency information (periodogram or discrete Fourier transform (DFT)) of the financial data by the weighting function f(\omega) = 1/(1 - \exp(i \omega)), \omega \in (0,\pi), which is the reciprocal of the differencing operator in the frequency domain. Since the Financial Trading platform in iMetrica strictly uses log-return financial time series to build trading signals, the use of this weighting function is in a sense a frequency-based “de-differencing” of the differenced data. In many cases, using the differencing weight provides better timeliness properties for the filter and thus the trading signal. (A numerical sketch of both tertiary parameters follows this list.)
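Both tertiary parameters are easy to state numerically. A hedged sketch (the sign convention for the differencing weight follows the formula quoted above; the eps guard for the pole at frequency zero is my own addition):

```python
import numpy as np

def phase_delay_at_zero(b):
    """Phase delay of the filter at frequency zero; with the usual
    normalization sum(b_j) != 0 it equals sum(j*b_j)/sum(b_j).
    The i2 option pins this value (e.g. to zero)."""
    j = np.arange(len(b))
    return np.dot(j, b) / np.sum(b)

def dedifferencing_weight(omegas, eps=1e-8):
    """The weighting f(omega) = 1/(1 - exp(i*omega)), applied
    bin-by-bin to the DFTs of the (differenced) log-return data;
    eps guards the pole at omega = 0, where the weight diverges."""
    return 1.0 / (1.0 - np.exp(1j * np.maximum(omegas, eps)))

omegas = 2.0 * np.pi * np.fft.rfftfreq(250)   # ~ one year of daily data
w = dedifferencing_weight(omegas)             # multiplies each DFT bin
```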

In addition to these three levels of parameters used in building real-time trading signals, there is a collection of more exotic “parameterization” strategies in the iMetrica MDFA module for fine-tuning filters and boosting trading performance. These strategies require more time to develop, a bit of experimentation, and a keen eye for filtering. We will provide more information and tutorials on these advanced filtering techniques for constructing effective trading signals in iMetrica in future articles on this blog. For now, we just summarize their main ideas.

Advanced Filtering Parameters

  • Hybrid filtering. In hybrid filtering, the goal is to further improve a target signal by injecting its filter with another filter of a different type, constructed from the same data but with a different paradigm or set of parameters. One method of hybrid filtering readily available in the MDFA module entails constructing Zero-Pole Combination filters using the ZPC Filter Design interface (Figure 4) and injecting the result into the filter constructed in the Real-Time Filter Design interface (Figure 2) (see Wildi ZPC for more information). The combination (or hybrid) filter can then be accessed using one of the check box buttons in the filter interface, adjusted using all the various levels of parameters above, and used in the financial trading interface. The effect of this hybrid construction is essentially to improve either the smoothness or the timeliness of any computed trading signal, while not succumbing to the nasty side-effects of over-fitting.
  • Forecasting and Smoothing signals. Smoothing signals in time series, as the name implies, involves obtaining a smoother estimate of a signal in the past. Since the real-time estimate of a signal value in the past can use more recent values, the signal estimation becomes more symmetrical as past and future values around a point in the past are used to estimate the value of the signal. For example, if today is after market hours on Friday, we can obtain a better estimate of the targeted signal for Wednesday since we have information from Thursday and Friday. In the opposite manner, forecasting involves projecting a signal into the future; however, since the estimate becomes even more “anti-symmetric”, it becomes more polluted with noise. How these smoothed and forecasted signals can be used for constructing buy/sell trading signals in real-time is still purely experimental; with iMetrica, building and testing strategies that improve trading performance using smoothed or forecasted signals (or both) is available. To produce either a smoothed or forecasted signal, there is a lag scrollbar in the Real-Time Filter Design interface under Additional Filter Ingredients. Set the lag value k on the scrollbar to any integer between -10 and 10, and the signal with that lag applied is automatically computed: for negative lag values k, the method produces a |k|-step-ahead forecast estimate of the signal; for positive values, it produces a smoothed signal with a delay of k observations (see the sketch at the end of this article).
  • Customized spectral weighting functions. In the spirit of customizing a trading signal to fit one’s priorities in financial trading, one also has the option of customizing the spectral density estimate of the data generating process to any design one wishes. In the computation of the real-time filter, the periodogram (or the DFTs, in the multivariate case) is used as the default estimate of the spectral density weighting function. This weighting function is, in theory, supposed to serve as the spectrum of the underlying data generating process (DGP). However, since we have no possible idea about the underlying DGP of the price movement of publicly traded financial assets (other than that it’s supposed to be pretty darn close to a random walk, according to the Efficient Market Hypothesis), the periodogram is the closest thing to an unbiased estimate a mortal human can get, and it is the default option in the MDFA module of iMetrica. Customization of this weighting function is nonetheless possible through the Target Filter Design interface: not only can one design the target filter for the approximation of the concurrent filter, but the spectral density weighting function of the DGP can also be customized using some of the options readily available in the interface. We will discuss these features in a soon-to-come discussion and tutorial on advanced real-time filtering methods.
  • Adaptive filtering. As perhaps the most advanced feature of the MDFA module, adaptive filtering is an elegant way to build smarter filters based on previous filter realizations. Since the goal of adaptive filtering is to improve certain properties of the output signal at each iteration without paying for it in over-fitting, the adaptive process is of course highly nonlinear. In short, adaptive MDFA filtering is an iterative process in which one begins with a desired filter, computes the output signal, and then uses the output signal as explanatory data in the next filtering round. At each iteration step, one has the freedom to change any properties of the filter one desires, whether it be customization, regularization, adding negative lags, adding filter coefficient constraints, applying a ZPC filter, or even changing the pass-band in the target filter. The hope is to improve certain properties of the filter at each stage of the iterative process. An in-depth look at adaptive filtering, and at how to easily produce an adaptive filter using iMetrica, is soon to come later this week.
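As a quick illustration of the lag parameter described in the Forecasting and Smoothing item above, here is how a concurrent filter is applied and how the lag k changes what it targets. This is a sketch under my own simplifying assumptions; in iMetrica the lag scrollbar triggers a re-estimation of the filter coefficients themselves:

```python
import numpy as np

def concurrent_estimate(b, x, t):
    """Output of the length-L concurrent filter at time t, using only
    the observations x_t, x_{t-1}, ..., x_{t-L+1}."""
    L = len(b)
    return float(np.sum(b * x[t - np.arange(L)]))

# With the lag parameter k, the coefficients b_k are re-optimized so
# that concurrent_estimate(b_k, x, t) targets the symmetric signal at
# time t - k:  k > 0 yields a smoothed estimate delayed by k
# observations, while k < 0 yields a |k|-step-ahead forecast, which is
# noisier since the estimate becomes more "anti-symmetric".
```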