Comments on "Investment Idiocy: Correlations, Weights, Multipliers... (pysystemtrade)"

Rob Carver (2017-03-21 06:24):
To get the performance of a trading rule, you run through the position sizing method in the book, allocating 100% to a given trading rule.

1) No, that isn't how the system works at all. Read the rest of the book before asking any more questions.
2) Yes; again, this is discussed later in the book.
3) No, I use continuous positions. You need to read chapter 7 again, as you don't seem to have quite got the gist.

f*w - lambda*w*Sigma*w'

I don't think I've ever used this formula in my book, or on my blog, so I can't really explain it.

Gu langyu (2017-03-20 22:52):
Hi Rob,

Thank you so much for your book. It is very educational. I was trying to understand more about trading rule correlations in "Chapter 8: Combined Forecasts". You mentioned back-testing the performance of trading rules to get correlations.

Could you share a bit more insight into how you get the performance of trading rules, please?
(1) Do you set buy/sell thresholds at +/-10? Meaning that no position is held when the signal is in [-10, 10], one position is held when the signal is in [10, 20] or [-20, -10], and two positions are held when the signal is at -20/+20?
(2) Are trading costs considered? (I think the answer is yes.)
(3) You enter a buy trade, say at signal = 10. When do you exit the trade? When signal < 10, or when signal = 0?

Or do you use dynamic positions, meaning the position varies with the signal all the time?

Another question regarding optimisation. In the formula f*w - lambda*w*Sigma*w' used to estimate weights:
(1) Is f the rules' Sharpe ratio calculated from historical performance pooled across all instruments, or just the Sharpe ratio of the rule on the instrument we are looking at?
(2) How do you define lambda? Is it 0.0001? If so, is it always 0.0001?

Sorry if these two questions have been asked before.

Thanks,
Deano

Rob Carver (2017-03-13 16:17):
Yes. You should use a nearer month for carry if you can, and trade further out, but this isn't possible in bonds, equities or FX.
See appendix B.

Dolph (2017-03-13 16:10):
Rob, in your legacy.csv modules, some specific futures (Bund, US20, US10, etc.) have the "price contract" as the front month (the closest contract), while others (Wheat, gas, crude, etc.) have the "carry contract" as the front month. Is this by design?

Rob Carver (2017-03-13 06:38):
Well, 92% of the weight on the correlations will be coming from the last 3 years. So yes, you could speed this up by using a rolling out-of-sample window, although the results will be slightly different. 5 years would be better, as this gets you up to 99%.

Kris (2017-03-11 10:44):
OK, but in your simulations you work with an expanding window and do calculations yearly based on weekly data. If we use an EWM span of 125, it means the rolling correlations go back roughly 3 years (125*5 days). So if, for example, the total period is 1990-2016, is the last element of the last calculation (1990-2016) then a correct estimate of the correlation of the whole period, given that data before 2012 is effectively 'ignored'?

Maybe it would then be faster to work with a rolling out-of-sample frame for these calculations?

Or is my idea on this not correct?

Kris

Rob Carver (2017-03-10 06:15):
ewm.corr returns rolling correlations; each element in the list is already an exponentially weighted average of correlations. Since I'm doing the rolling-through-time process myself, I only need the last of these elements.

Kris (2017-03-09 21:40):
I see that the ewm.corr function returns a list of correlations for each date, and not a correlation matrix. The classic corr function returns a matrix of correlation coefficients.

In your code (https://github.com/robcarver17/pysystemtrade/blob/ba7fe7782837b0df0dea83631da19d98a1d8c84f/syscore/correlations.py#L173) I see you only take the latest value for each year from the ewm.corr function. I would have expected that we must take some kind of average of all the correlation values for a pair to calculate that pair's correlation coefficient. Can you clarify this? Thanks.

Kris
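The ewm.corr behaviour discussed in this exchange can be sketched briefly. This is an illustrative example, not pysystemtrade's actual code: the rule names and random data are invented, and the modern pandas `ewm(...).corr()` API is used in place of the older `pandas.ewmcorr`. The last two lines also sanity-check the "92% of the weight comes from the last 3 years" figure quoted above, assuming weekly observations.

```python
import numpy as np
import pandas as pd

# Invented weekly returns for three hypothetical trading rules
dates = pd.date_range("2010-01-08", periods=300, freq="W")
rng = np.random.default_rng(0)
returns = pd.DataFrame(rng.normal(size=(300, 3)), index=dates,
                       columns=["ewmac8", "ewmac32", "carry"])

# Exponentially weighted rolling correlations with span 125: the result
# holds one correlation matrix per date, not a single matrix.
rolling_corr = returns.ewm(span=125).corr()

# When the rolling-through-time process is handled separately, only the
# most recent matrix is needed.
latest_matrix = rolling_corr.loc[returns.index[-1]]
print(latest_matrix.round(2))

# Share of the exponential weight falling in the last N weekly
# observations, for span 125 (alpha = 2 / (span + 1)).
alpha = 2.0 / (125 + 1)
print(1 - (1 - alpha) ** (3 * 52))  # roughly 0.92
print(1 - (1 - alpha) ** (5 * 52))  # roughly 0.98, close to the 99% quoted
```
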
Rob Carver (2017-03-09 06:20):
Oh sorry, I misunderstood. You are using WEEKLY RETURNS to estimate instrument weights: that's fine. I thought you were actually redoing the bootstrapping every week.

AvantGarde (2017-03-08 20:39):
Thanks for the reply. I had applied the same method for the instrument weights as for the forecast weights. You'd mentioned above:
"Also the default is to use weekly returns for optimisation. This has two advantages; firstly it's faster. Secondly correlations of daily returns tend to be unrealistically low (because for example of different market closes when working across instruments)."
Why would the default for forecast weights be weekly but not for instrument weights?
Thanks!

Rob Carver (2017-03-08 12:10):
"I am using weekly bootstrapping to estimate instrument weights..." I think this is a little... well, if I'm being honest, I think it's insane.

Okay, so to answer the question: for backtesting I use one config, and then for my live trading system I use another config which has fixed weights. Personally, I run these separately, for different reasons: the backtest to get an idea of historical performance, and the 'as live' backtest with fixed weights to compare against what I'm currently doing and for actual trading.

There is no configurable way of mixing these, so you'd need to write some code that takes the estimated bootstrapped weights and then replaces them with fixed weights after a certain date.
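The code Rob suggests writing, which swaps estimated weights for fixed ones after a certain date, might look something like the following sketch. The helper name, instrument names, and numbers are all invented for illustration; this is not part of pysystemtrade.

```python
import pandas as pd

def pin_weights_after(weights_df, fixed_weights, cutoff):
    """Replace estimated (e.g. bootstrapped) weights with fixed ones
    from `cutoff` onwards, leaving the earlier history untouched."""
    out = weights_df.copy()
    for instrument, w in fixed_weights.items():
        out.loc[out.index >= cutoff, instrument] = w
    return out

# Toy example: daily estimated weights for two instruments
idx = pd.date_range("2016-01-01", periods=6, freq="D")
est = pd.DataFrame({"SP500": [0.5, 0.52, 0.48, 0.51, 0.49, 0.5],
                    "US10": [0.5, 0.48, 0.52, 0.49, 0.51, 0.5]}, index=idx)

pinned = pin_weights_after(est, {"SP500": 0.6, "US10": 0.4},
                           pd.Timestamp("2016-01-04"))
print(pinned)
```

The earlier history keeps the bootstrapped estimates, so historical performance is unchanged, while everything from the cutoff forwards uses the fixed weights.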
AvantGarde (2017-03-08 09:27):
Hi Rob,
I've not been clear. I understand how EWMA works and the process of smoothing. The problem I am having is that I am using weekly bootstrapping to estimate instrument weights. However, each day when I run pysystemtrade, the calculated instrument weights can vary significantly from day to day due to the nature of bootstrapping. This leads to situations where, for example, pysystemtrade would have generated a trade yesterday when I ran it (which I would have executed), but when I run it today the instrument weight estimates may have changed enough, due to the bootstrapping, that the trade generated and executed yesterday no longer shows up as a trade generated yesterday. This makes me less trusting of the backtested performance, as the majority of trades that were historically generated but excluded after resampling are losing trades.

I only sample the market once a day, generally (so repeated sampling of the market overwriting the current day's mark is not an issue).

I would like to use bootstrapping to estimate the weights ANNUALLY and apply the smooth to adjust between last year's calculated weight and today's. But if I am using fixed weights (after having estimated them via bootstrapping) by setting them as fixed in the config, there are no longer two data points to smooth between, as I have only one fixed estimate in the config.

How can I insert a historical weight for last year and a new fixed weight for this year (by fixing it in the config), and smooth between them?

AvantGarde

Kris (2017-02-28 19:59):
Thanks for this tip!
I always try to write my own code (I don't like depending on others' code), and I also don't see how I could use the pandas libraries from VB.NET.

But I've found the functions here:
https://github.com/pandas-dev/pandas/blob/master/pandas/window.pyx --> EWMCOV

and here:
https://github.com/pandas-dev/pandas/blob/v0.19.2/pandas/core/window.py#L1576-L1596 --> corr

So I can analyse how they do it and write it in VB.NET.

Kris

Rob Carver (2017-02-28 06:22):
Sorry, yes, I use exponential weighting a lot. With respect to the first parameter: yes, I calculate correlations using an exponential estimator: http://pandas.pydata.org/pandas-docs/version/0.17.0/generated/pandas.ewmcorr.html

Kris (2017-02-27 21:41):
OK, but in this article I found 2 different parameters referring to exponential weighting:

- under 'Forecast Diversification Multiplier' --> 'correlation', I found "using_exponent: True # use an exponentially weighted correlation, or all the values equally"

- under 'Forecast Diversification Multiplier' --> 'Smoothing again', I found "ewma_span: 125 ## smooth to apply"

I am a little confused about the two parameters. I understand that the second parameter (smoothing again) is to smooth the jump on the 1st of January each year.

But what about the first parameter (correlation)?
I thought that you used some kind of exponential weighting for calculating the correlations, but maybe I'm wrong? Sorry, but it is not so clear to me.

Kris

Rob Carver (2017-02-27 20:03):
No, on the actual multiplier. It's calculated from correlations, updated annually. Without a smooth, it would be jumpy on the 1st of January each year.

Kris (2017-02-27 19:13):
Hi Robert,
For the diversification multiplier you mention using exponential weighting. Where or how do you implement this? On the returns, or on the deviations of the returns from the expected returns (so just before the calculation of the covariances)? Or maybe somewhere else?

Can you give me some direction?

Thanks,
Kris

Chad B (2017-02-20 18:36):
(Goes to the blackboard to write "I will not overfit" 50 times.) Sorry, I've read your statements on overfitting more than once, but had a lapse in memory when this question popped into my thick skull. Thanks for your response.

Rob Carver (2017-02-20 06:16):
http://pandas.pydata.org/pandas-docs/version/0.17.0/generated/pandas.ewma.html

Rob Carver (2017-02-20 06:15):
No, this smacks of overfitting. Put such evil thoughts out of your head. The point of the smooth is to reduce turnover on the first of January each year, not to make money.

Chad B (2017-02-19 17:43):
Hello Rob,
Would you consider making the ewma_span period for smoothing your forecast weights a variable instead of a fixed value, perhaps with some additional logic to detect the different volatility 'regimes' seen in the market? Or maybe such a notion is fair, but this is the wrong place to apply it, and it should be applied at the individual instrument level or in strategy scripts?

AvantGarde (2017-02-18 19:29):
Hi Rob,
Can you provide some further details on how to use fixed weights (that I have estimated) while still applying a smooth to them? I've been unable to use 'instrument_weight_ewma_span' to fulfil this purpose... Thanks!

Rob Carver (2017-01-17 18:00):
Chapter 4 of my book.

lvymath (2017-01-17 16:14):
syscore/optimisation line 322
```python
import numpy as np
import pandas as pd

# factors. First element of tuple is SR difference, second is adjustment
adj_factors = ([-.5, -.4, -.3, -.25, -.2, -.15, -.1, -.05, 0.0,
                .05, .1, .15, .2, .25, .3, .4, .5],
               [.32, .42, .55, .6, .66, .77, .85, .94, 1.0,
                1.11, 1.19, 1.3, 1.37, 1.48, 1.56, 1.72, 1.83])


def apply_cost_weighting(raw_weight_df, ann_SR_costs):
    """
    Apply cost weighting to the raw optimisation results
    """

    # Work out average costs, in annualised Sharpe ratio terms
    # In sample for vol estimation, but shouldn't matter much since target vol
    # should be the same
    avg_cost = np.mean(ann_SR_costs)
    relative_SR_costs = [cost - avg_cost for cost in ann_SR_costs]

    # Find adjustment factors
    weight_adj = list(np.interp(relative_SR_costs,
                                adj_factors[0],
                                adj_factors[1]))
    weight_adj = np.array([list(weight_adj)] * len(raw_weight_df.index))
    weight_adj = pd.DataFrame(weight_adj,
                              index=raw_weight_df.index,
                              columns=raw_weight_df.columns)

    return raw_weight_df * weight_adj
```

Rob Carver (2017-01-17 15:35):
Please tell me which file you are looking at, and the line number.
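As a standalone illustration of how the adjustment table in lvymath's snippet is applied, the interpolation step can be exercised on some made-up annualised SR costs. The rule labels and cost numbers below are invented; only the adjustment table comes from the snippet.

```python
import numpy as np

# Adjustment table from the snippet above: relative SR cost -> weight multiplier
adj_factors = ([-.5, -.4, -.3, -.25, -.2, -.15, -.1, -.05, 0.0,
                .05, .1, .15, .2, .25, .3, .4, .5],
               [.32, .42, .55, .6, .66, .77, .85, .94, 1.0,
                1.11, 1.19, 1.3, 1.37, 1.48, 1.56, 1.72, 1.83])

# Hypothetical annualised SR costs for three rules: relative to the average,
# cheap rules get a multiplier below 1, expensive rules above 1... wait, the
# table gives multipliers below 1 to rules CHEAPER than average (negative
# relative cost), i.e. costs are defined as a drag to be subtracted.
ann_SR_costs = [0.01, 0.05, 0.15]
avg_cost = np.mean(ann_SR_costs)
relative = [c - avg_cost for c in ann_SR_costs]

# Linear interpolation into the table, as apply_cost_weighting does
weight_adj = np.interp(relative, adj_factors[0], adj_factors[1])
print(dict(zip(["cheap", "middling", "expensive"], weight_adj.round(2))))
```

Note that `np.interp` requires its x-coordinates to be strictly increasing, which is why the fourth table entry must read `-.25` rather than `-25`.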