Metadata-Version: 2.1
Name: EAP
Version: 1.1.11
Summary: This package is designed for empirical asset pricing
Home-page: https://whyecofiliter.github.io/EAP/
Author: whyecofiliter
Author-email: why_ecofiliter@126.com
License: MIT
Project-URL: Bug Tracker, https://github.com/whyecofiliter/EAP/issues
Project-URL: Documentation, https://whyecofiliter.github.io/EAP/Documentation.html
Project-URL: Source Code, https://github.com/whyecofiliter/EAP
Description: # Documentation
        
        ## Module
        
        ### portfolio_analysis
        
        This module performs portfolio analysis in four steps:
        
        1. select breakpoints
        2. distribute the assets into groups
        3. calculate the average and the difference of the group returns
        4. summarize the results
        
        #### class ptf_analysis()
        
        This class is for portfolio analysis and contains several functions.
        
        ##### def select_breakpoints(self, character, number, perc=*None*, percn=*None*):
        
        This function, corresponding to step 1, selects the breakpoints of the sample.
        
        **input :**
        
        *character (ndarray/Series):* The asset characteristic by which the assets are grouped.
        
        *number (int):* The number of breakpoints; the number of intervals is *number+1*. Once the number is given, the assets are grouped into *number+1* groups by equal partition of the asset characteristic.
        
        *perc (list/ndarray):* The percentile points of the characteristic. Once it is set, *number* is overwritten. e.g. perc = [0, 30, 70, 100] represents the percentiles at 0%, 30%, 70%, 100%.
        
        *percn (list/ndarray):* Pre-computed percentiles of the characteristic, e.g. when the characteristic is divided by NYSE breakpoints.
        
        **output :** 
        
        *breakpoint :* The percentiles of the asset characteristic, ranging from 0% to 100%, whose length is *number+2*.
        
        **Example**
        
        ```python
        from EAP.portfolio_analysis import Ptf_analysis as ptfa
        import matplotlib.pyplot as plt
        import numpy as np
        
        # generate characteristics
        character = np.random.normal(0, 100, 10000)
        # generate breakpoint
        breakpoint = ptfa().select_breakpoints(character=character, number=9)
        print('Generated breakpoint:', breakpoint)
        # compare with the true breakpoints
        for i in np.linspace(0, 100, 11):
            print('True breakpoint', i, '%:', np.percentile(character, i))
        
        ========================================================================
        Generated breakpoint: [-418.25352494 -127.85494153  -84.90868131  -53.27163604  -25.2394311   1.74938872   25.93867426   50.87047751   84.42711213  128.52009426  334.42181608]
        True breakpoint 0.0 %: -418.25352493659074
        True breakpoint 10.0 %: -127.85494153240666
        True breakpoint 20.0 %: -84.90868130631613
        True breakpoint 30.0 %: -53.271636036097135
        True breakpoint 40.0 %: -25.239431099072657
        True breakpoint 50.0 %: 1.7493887248228535
        True breakpoint 60.0 %: 25.938674261710755
        True breakpoint 70.0 %: 50.870477505977846
        True breakpoint 80.0 %: 84.42711212754239
        True breakpoint 90.0 %: 128.5200942602427
        True breakpoint 100.0 %: 334.42181607589504
        ========================================================================
        ```
        
        
        
        ##### def distribute(self, character, breakpoint):
        
        This function, corresponding to step 2, distributes the assets into groups according to the breakpoints of the characteristic.
        
        **input :**  
        
        *character (ndarray/Series):* The characteristic by which the assets are grouped.
        
        *breakpoint (list/array):*  The breakpoints of the characteristic.
        
        **output :**
        
        *label (ndarray):* An array containing the group number of each asset.
        
        **Example**
        
        ```python
        # continue the previous code
        # generate the group labels (one label per asset)
        labels = ptfa().distribute(character, breakpoint)
        print('The Label of unique value:\n', np.sort(np.unique(labels)))
        # plot the histogram of the labels
        # each group has the same number of samples
        plt.hist(labels)
        label = labels[:, 0]
        # print the labels
        print('Label:\n', labels)
        ========================================================================
        The Label of unique value:
         [0. 1. 2. 3. 4. 5. 6. 7. 8. 9.]
        Label:
         [[2.]
         [8.]
         [1.]
         ...
         [1.]
         [4.]
         [8.]]
        ```
        
        
        
        ##### def average(self, sample_return, label, cond='uni', weight=None):
        
        This function, corresponding to step 3, calculates the average return of each group.
        
        **input :**
        
        *sample_return (ndarray):* The return of each asset.
        
        *label (ndarray):* The group label of each asset.
        
        *cond (str):* If univariate analysis, then cond = 'uni'; if bivariate analysis, then cond = 'bi'.
        
        *weight (ndarray, optional):* The weights used to calculate the weighted average group return. The **DEFAULT** is None.
        
        **output :**
        
        *average_return (ndarray):* The average return of each group.
        
        **Example**
        
        ```python
        # continue the previous code
        # generate the future sample return
        sample_return = character/100 + np.random.normal()
        ave_ret = ptfa().average(sample_return, label)
        # print the groups return
        print('average return for groups:\n', ave_ret)
        ========================================================================
        average return for groups:
         [[-1.47855293]
         [-0.75343815]
         [-0.39540537]
         [-0.0971466 ]
         [ 0.17474793]
         [ 0.42303036]
         [ 0.68905747]
         [ 0.95962212]
         [ 1.33445632]
         [ 2.03728439]]
        ```
        
        
        
        ##### def statistics(self, variable, label, func, cond='uni'):
        
        This function is for summary statistics of groups.
        
        **input :**
        
        *variable (ndarray):* The variables of the groups.
        
        *label (ndarray):* The label of the groups for each stock.
        
        *func (function):* The operation applied to the variable within each group, e.g. numpy.mean, numpy.sum.
        
        *cond (str):*  If univariate analysis, then cond = 'uni'; if bivariate analysis, then cond = 'bi'.
        
        **output:**
        
        *average_statistics (ndarray):* The statistics of each group.
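        The per-group computation can be sketched in plain NumPy (this mimics the behavior described above and is not the library call itself; the data and labels here are hypothetical):
        
        ```python
        import numpy as np
        
        rng = np.random.default_rng(0)
        variable = rng.normal(0, 1, 1000)    # a hypothetical group variable
        label = rng.integers(0, 10, 1000)    # hypothetical group labels 0..9
        
        # equivalent of passing func=np.mean in a univariate sort:
        # apply the reducing function to the members of each label group
        group_means = np.array([np.mean(variable[label == g]) for g in np.unique(label)])
        ```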
        
        
        
        ##### def create_breakpoint(self, data, number, perc=None):
        
        This function creates breakpoints. Many studies require special breakpoints, most commonly NYSE breakpoints, and this function is designed for that purpose.
        
        **input :**
        
        *data (ndarray/DataFrame):* The characteristics used for the partition. The structure is:
        
        - first row to second-to-last row: characteristics.
        - last row: time index.
        
        *number (int):* The number of breakpoints.
        
        *perc (array/list):* The percentile points of breakpoints.
        
        **output :**
        
        *breakpoints array (list):* The breakpoints of characteristics.
        
        
        
        **Example :**
        
        ```python
        # generate time index
        year = np.ones((3000, 1), dtype=int) * 2020
        for i in range(19):
            year = np.append(year, (2019 - i) * np.ones((3000, 1), dtype=int))
        # generate characteristics
        character = np.random.normal(0, 1, 20*3000)
        # create sample containing the characteristic and the time index
        sample = np.array([character, year]).T
        
        breakpoints = ptfa().create_breakpoint(data=sample, number=4)
        print(breakpoints)
        ```
        
        
        
        
        
        #### class Univariate(ptf_analysis):
        
        This class is designed for univariate portfolio analysis.
        
        ##### def \__init__ (self, sample):
        
        The initialization function
        
        **input :**
        
        *sample (ndarray or DataFrame):* The sample to be analyzed. A sample usually contains the future return, the characteristic, and the time. The **DEFAULT** setting is that the *1st* column is the future return, the *2nd* column is the characteristic, and the *3rd* column, or the index (if the data type is DataFrame), is the time label.
        
        
        
        ##### def divide_by_time(self, sample):
        
        This function splits the sample into groups by time.
        
        **output :** 
        
        *groups_by_time (list):* The sample grouped by time.
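        Assuming the DEFAULT column layout (return, characteristic, time), the split can be sketched in plain NumPy (toy data, not the library call itself):
        
        ```python
        import numpy as np
        
        # a toy sample: rows are assets, last column is the time label
        sample = np.array([[ 0.10,  1.2, 2019],
                           [ 0.20, -0.3, 2019],
                           [ 0.05,  0.8, 2020],
                           [-0.10,  0.1, 2020]])
        
        # one sub-array per distinct time point
        groups_by_time = [sample[sample[:, -1] == t] for t in np.unique(sample[:, -1])]
        ```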
        
        
        
        ##### def average_by_time(self):
        
        This function takes the sample grouped by time from *divide_by_time*, further groups each time slice by the characteristic, and then calculates the average return of each group at every time point.
        
        **output :** 
        
        *average_group_time (matrix: N×T):* The average return of each characteristic-time group.
        
        
        
        **Example**
        
        ```python
        import numpy as np
        from portfolio_analysis import Univariate as uni
            
        # generate time 
        year=np.ones((3000,1),dtype=int)*2020
        for i in range(19):
            year=np.append(year,(2019-i)*np.ones((3000,1),dtype=int))
            
        # generate character
        character=np.random.normal(0,1,20*3000)
        # generate future return
        ret=character*-0.5+np.random.normal(0,1,20*3000)
        # create sample containing future return, character, time
        sample=np.array([ret,character,year]).T
        # initialize the univariate object
        exper = uni(sample, 9)
        # calculate the average group returns at each time point
        data = exper.average_by_time()
        print(data)
        ==========================================================================================================================
        [[ 0.82812256  0.87549215  0.81043114  0.77480366  0.85232487  0.85599445   0.90860961  0.76600211  0.91360546  0.85921985  0.96717798  0.77677131   0.88669273  0.86895145  0.97832435  0.88494486  0.82571951  0.84777939   0.89373487  0.95906454]
         [ 0.51155724  0.4963439   0.6100762   0.47351625  0.46844971  0.53303287   0.52087477  0.43934316  0.51169633  0.61918844  0.56254028  0.50949226   0.39033219  0.49685445  0.5844816   0.48723354  0.49861094  0.43197525   0.40040156  0.57529228]
         [ 0.41566251  0.3421546   0.27117215  0.35550346  0.28884636  0.43710998   0.33146264  0.27860032  0.35956881  0.34818479  0.35692361  0.42462374   0.16909231  0.33823117  0.31762348  0.44863438  0.42785283  0.20093775   0.29664738  0.31509963]
         [ 0.21972246  0.24685649  0.29933776  0.09880866  0.13564638  0.17673649   0.14251437  0.12188551  0.1567432   0.20428427  0.15009782  0.08488247   0.20489871  0.10598241  0.12591301  0.17287433  0.11180376  0.09941738   0.22635281  0.22828588]
         [ 0.01851548  0.05771421  0.0624163   0.05368921  0.15247324  0.05839522   0.05864669  0.01863668 -0.08367879  0.09273579  0.18374921  0.12331214   0.03635538  0.05804576  0.0116589  -0.04158565  0.11655945  0.09727234   0.14038867  0.13594649]
         [-0.07910789 -0.04670755  0.08732773 -0.07361966 -0.00232509 -0.08546681  -0.15020487 -0.05302521 -0.07922696 -0.1088824  -0.01700017 -0.06742183   0.00190131  0.00961174 -0.05953252 -0.09504501 -0.0958816  -0.00355493  -0.08553405 -0.05343558]
         [-0.13094033 -0.23888179 -0.11046595 -0.11176528 -0.14017103 -0.17184142  -0.26587781 -0.14426219 -0.15687278 -0.15962335 -0.18586504 -0.2367552  -0.26761165 -0.16169935 -0.26608677 -0.16202763 -0.24272797 -0.17049684  -0.21470737 -0.13520545]
         [-0.35621842 -0.28111488 -0.42057927 -0.37219582 -0.25449753 -0.36362452  -0.34165952 -0.28564624 -0.29936621 -0.32545156 -0.28208242 -0.36730096  -0.24269836 -0.31584032 -0.34207757 -0.35185102 -0.35515763 -0.32239715  -0.2803911  -0.36334961]
         [-0.58529295 -0.54329245 -0.52006031 -0.49856708 -0.44262707 -0.4464171  -0.58846501 -0.56725297 -0.35845646 -0.52923391 -0.42119445 -0.55659388  -0.47716067 -0.4574991  -0.52123094 -0.54767832 -0.50289813 -0.45529132  -0.58429513 -0.48110405]
         [-0.81992395 -0.95766159 -0.92069685 -0.92906348 -0.84891875 -0.81670916  -0.90281776 -0.84845902 -0.90479169 -0.86860559 -0.96790821 -0.9464988  -0.88176205 -0.96118242 -0.92402295 -0.81623283 -0.81560442 -0.85841478  -0.87337267 -0.8070857 ]]
        ```
        
        
        
        ##### def difference(self, average_group):
        
        This function calculates the difference of the group returns, i.e. the last group's average return minus the first group's average return.
        
        **input :**
        
        *average_group (ndarray):* The average return of groups by each characteristic-time pair.
        
        **output :**
        
        *result (ndarray):* The matrix added with the difference of average group return.
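        The high-minus-low construction can be sketched in plain NumPy (rows are groups, columns are time points; the numbers are illustrative):
        
        ```python
        import numpy as np
        
        average_group = np.array([[ 0.9,  0.8],
                                  [ 0.1,  0.0],
                                  [-0.8, -0.9]])   # 3 groups x 2 periods
        
        # last group minus first group, appended as an extra row
        diff = average_group[-1, :] - average_group[0, :]
        result = np.vstack([average_group, diff])
        ```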
        
        
        
        ##### def summary_and_test(self):
        
        This function summarizes the results and performs t-tests.
        
        **output :**
        
        *self.average (ndarray):* The average of the portfolio return across time.
        
        *self.ttest (ndarray):* The t-value of the portfolio return across time.
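        The summary step can be sketched in plain NumPy: average each group's return series over time and form a t-statistic. This sketch uses the plain (unadjusted) standard error; the library applies a Newey-West correction (see the *maxlag* parameter of *fit*).
        
        ```python
        import numpy as np
        
        rng = np.random.default_rng(1)
        returns = rng.normal(0.05, 0.1, (10, 20))   # 10 groups x 20 time points
        
        mean = returns.mean(axis=1)                 # average across time
        se = returns.std(axis=1, ddof=1) / np.sqrt(returns.shape[1])
        tstat = mean / se                           # one t-value per group
        ```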
        
        
        
        ##### def fit(self, number, perc=None, percn=None, maxlag=12, weight=False):
        
        This function fits the model.
        
        **input :**
        
        *number (int):*  The breakpoint number.
        
        *perc (list or array):*  The breakpoint percentile points.
        
        *percn (list or array):* The breakpoint percentiles.
        
        *maxlag (int):*  The maximum lag for Newey-West adjustment.
        
        *weight (boolean):* If the weighted return is chosen, then weight is True. The **DEFAULT** is False. 
        
        
        
        ##### def factor_adjustment(self, factor):
        
        This function calculates the group return adjusted by risk factors.
        
        **input :**
        
        *factor (ndarray or DataFrame):* The factor return series, one column per factor.
        
        **output :**
        
        *alpha (array):* The anomaly (alpha) of each group.
        
        *ttest (array):* The t-value of the anomaly.
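        The adjustment can be sketched as an OLS regression in plain NumPy: regress a group's return series on the factor returns; the intercept is the alpha. All names and numbers here are illustrative, not the library's internals.
        
        ```python
        import numpy as np
        
        rng = np.random.default_rng(2)
        factor = rng.normal(0, 1, (20, 1))            # 20 periods, 1 factor
        group_ret = 0.3 + 0.5 * factor[:, 0] + rng.normal(0, 0.05, 20)
        
        X = np.hstack([np.ones((20, 1)), factor])     # prepend a constant column
        beta, *_ = np.linalg.lstsq(X, group_ret, rcond=None)
        alpha = beta[0]                               # intercept = alpha, close to 0.3
        ```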
        
        
        
        ##### def extractor(self, pos):
        
        This function extracts a return series.
        
        **input :**
        
        *pos (int):* The position of the return series.
        
        **output:**
        
        *series_ex (Series):* The extracted Series.
        
        
        
        ##### def summary_statistics(self, variables=None, periodic=False):
        
        This function computes summary statistics and outputs both the group statistics and the variable statistics.
        
        **input :**
        
        *variables (ndarray/DataFrame):* The variables, other than the sort variable, to be analyzed.
        
        *periodic (boolean):* Whether to print periodic results. The **DEFAULT** is False.
        
        
        
        ##### def correlation(self, variables, periodic=False, export=False):
        
        This function calculates the correlation coefficients of the variables.
        
        **input :**
        
        *variables (ndarray/DataFrame):* The variables to be analyzed.
        
        *periodic (boolean):* Whether to print the periodic result. The **DEFAULT** is False.
        
        *export (boolean):* Whether to export the summary table. The **DEFAULT** is False.
        
        **output :**
        
        *df (DataFrame):* The summary table if export is True.
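        With the variables stored as columns, the computation can be sketched with numpy.corrcoef (illustrative data, not the library call itself):
        
        ```python
        import numpy as np
        
        rng = np.random.default_rng(3)
        x = rng.normal(0, 1, 500)
        y = 0.8 * x + rng.normal(0, 0.6, 500)    # correlated with x by construction
        variables = np.column_stack([x, y])
        
        corr = np.corrcoef(variables, rowvar=False)   # 2 x 2 correlation matrix
        ```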
        
        
        
        ##### def print_summary_by_time(self, export=False):
        
        This function prints the summary grouped by time.
        
        **input :**
        
        *export (boolean):* Whether to export the table as a DataFrame. The **DEFAULT** is False.
        
        **output :**
        
        *df (DataFrame):* The exported table, if export is True.
        
        
        
        ##### def print_summary(self, explicit=False, export=False, percentage=False):
        
        This function prints the summary grouped by characteristic and averaged over time.
        
        **input :**
        
        *explicit (boolean):* Whether to present the explicit result. The **DEFAULT** is False.
        
        *export (boolean):* Whether to export the table as a DataFrame. The **DEFAULT** is False.
        
        *percentage (boolean):* Whether to present the average return in percentage. The **DEFAULT** is False.
        
        **output :**
        
        *df (DataFrame):* The exported table, if export is True.
        
        
        
        **Example**
        
        ```python
        # continue the previous code
        exper.summary_and_test()
        exper.print_summary_by_time()
        ========================================================================================================
        +--------+-------+-------+-------+-------+--------+--------+--------+--------+--------+--------+--------+
        |  Time  |   1   |   2   |   3   |   4   |   5    |   6    |   7    |   8    |   9    |   10   |  diff  |
        +--------+-------+-------+-------+-------+--------+--------+--------+--------+--------+--------+--------+
        | 2001.0 | 0.828 | 0.512 | 0.416 |  0.22 | 0.019  | -0.079 | -0.131 | -0.356 | -0.585 | -0.82  | -1.648 |
        | 2002.0 | 0.875 | 0.496 | 0.342 | 0.247 | 0.058  | -0.047 | -0.239 | -0.281 | -0.543 | -0.958 | -1.833 |
        | 2003.0 |  0.81 |  0.61 | 0.271 | 0.299 | 0.062  | 0.087  | -0.11  | -0.421 | -0.52  | -0.921 | -1.731 |
        | 2004.0 | 0.775 | 0.474 | 0.356 | 0.099 | 0.054  | -0.074 | -0.112 | -0.372 | -0.499 | -0.929 | -1.704 |
        | 2005.0 | 0.852 | 0.468 | 0.289 | 0.136 | 0.152  | -0.002 | -0.14  | -0.254 | -0.443 | -0.849 | -1.701 |
        | 2006.0 | 0.856 | 0.533 | 0.437 | 0.177 | 0.058  | -0.085 | -0.172 | -0.364 | -0.446 | -0.817 | -1.673 |
        | 2007.0 | 0.909 | 0.521 | 0.331 | 0.143 | 0.059  | -0.15  | -0.266 | -0.342 | -0.588 | -0.903 | -1.811 |
        | 2008.0 | 0.766 | 0.439 | 0.279 | 0.122 | 0.019  | -0.053 | -0.144 | -0.286 | -0.567 | -0.848 | -1.614 |
        | 2009.0 | 0.914 | 0.512 |  0.36 | 0.157 | -0.084 | -0.079 | -0.157 | -0.299 | -0.358 | -0.905 | -1.818 |
        | 2010.0 | 0.859 | 0.619 | 0.348 | 0.204 | 0.093  | -0.109 | -0.16  | -0.325 | -0.529 | -0.869 | -1.728 |
        | 2011.0 | 0.967 | 0.563 | 0.357 |  0.15 | 0.184  | -0.017 | -0.186 | -0.282 | -0.421 | -0.968 | -1.935 |
        | 2012.0 | 0.777 | 0.509 | 0.425 | 0.085 | 0.123  | -0.067 | -0.237 | -0.367 | -0.557 | -0.946 | -1.723 |
        | 2013.0 | 0.887 |  0.39 | 0.169 | 0.205 | 0.036  | 0.002  | -0.268 | -0.243 | -0.477 | -0.882 | -1.768 |
        | 2014.0 | 0.869 | 0.497 | 0.338 | 0.106 | 0.058  |  0.01  | -0.162 | -0.316 | -0.457 | -0.961 | -1.83  |
        | 2015.0 | 0.978 | 0.584 | 0.318 | 0.126 | 0.012  | -0.06  | -0.266 | -0.342 | -0.521 | -0.924 | -1.902 |
        | 2016.0 | 0.885 | 0.487 | 0.449 | 0.173 | -0.042 | -0.095 | -0.162 | -0.352 | -0.548 | -0.816 | -1.701 |
        | 2017.0 | 0.826 | 0.499 | 0.428 | 0.112 | 0.117  | -0.096 | -0.243 | -0.355 | -0.503 | -0.816 | -1.641 |
        | 2018.0 | 0.848 | 0.432 | 0.201 | 0.099 | 0.097  | -0.004 | -0.17  | -0.322 | -0.455 | -0.858 | -1.706 |
        | 2019.0 | 0.894 |  0.4  | 0.297 | 0.226 |  0.14  | -0.086 | -0.215 | -0.28  | -0.584 | -0.873 | -1.767 |
        | 2020.0 | 0.959 | 0.575 | 0.315 | 0.228 | 0.136  | -0.053 | -0.135 | -0.363 | -0.481 | -0.807 | -1.766 |
        +--------+-------+-------+-------+-------+--------+--------+--------+--------+--------+--------+--------+
        
        exper.print_summary()
        ==================================================================================================================
        +---------+--------+--------+--------+--------+-------+--------+---------+---------+---------+---------+---------+
        |  Group  |   1    |   2    |   3    |   4    |   5   |   6    |    7    |    8    |    9    |    10   |   Diff  |
        +---------+--------+--------+--------+--------+-------+--------+---------+---------+---------+---------+---------+
        | Average | 0.867  | 0.506  | 0.336  | 0.166  | 0.068 | -0.053 |  -0.184 |  -0.326 |  -0.504 |  -0.883 |  -1.75  |
        |  T-Test | 63.706 | 35.707 | 20.199 | 12.637 |  4.6  | -4.463 | -15.616 | -32.172 | -36.497 | -73.162 | -92.152 |
        +---------+--------+--------+--------+--------+-------+--------+---------+---------+---------+---------+---------+
        
        # generate factor
        factor=np.random.normal(0,1.0,(20,1))
        exper=uni(sample,9,factor=factor,maxlag=12)
        # print(exper.summary_and_test()) # if needed
        exper.print_summary()
        ====================================================================================================================
        +---------+--------+--------+--------+--------+-------+---------+---------+---------+---------+---------+----------+
        |  Group  |   1    |   2    |   3    |   4    |   5   |    6    |    7    |    8    |    9    |    10   |   Diff   |
        +---------+--------+--------+--------+--------+-------+---------+---------+---------+---------+---------+----------+
        | Average | 0.867  | 0.506  | 0.336  | 0.166  | 0.068 |  -0.053 |  -0.184 |  -0.326 |  -0.504 |  -0.883 |  -1.75   |
        |  T-Test | 63.706 | 35.707 | 20.199 | 12.637 |  4.6  |  -4.463 | -15.616 | -32.172 | -36.497 | -73.162 | -92.152  |
        |  Alpha  | 0.869  | 0.507  | 0.336  | 0.164  | 0.067 |  -0.054 |  -0.184 |  -0.326 |  -0.503 |  -0.883 |  -1.752  |
        | Alpha-T | 62.39  | 69.24  | 43.377 | 16.673 | 8.204 | -12.372 | -16.679 | -65.704 | -87.618 | -93.223 | -139.881 |
        +---------+--------+--------+--------+--------+-------+---------+---------+---------+---------+---------+----------+
        
        # summary statistics
        exper.summary_statistics()
        exper.summary_statistics(periodic=True)
        exper.summary_statistics(variables=np.array([variable_1, variable_2]).T, periodic=True)
        =============================================================================================================================================
        Group Statistics
        +---------------+----------+----------+----------+----------+----------+---------+---------+---------+---------+---------+
        |    Variable   |    1     |    2     |    3     |    4     |    5     |    6    |    7    |    8    |    9    |    10   |
        +---------------+----------+----------+----------+----------+----------+---------+---------+---------+---------+---------+
        | Sort Variable | -1.75599 | -1.04855 | -0.68048 | -0.38582 | -0.13043 | 0.12162 | 0.38283 | 0.67123 | 1.04036 | 1.75617 |
        +---------------+----------+----------+----------+----------+----------+---------+---------+---------+---------+---------+
        
        Indicator Statistics
        +---------------+----------+---------+---------+---------+----------+----------+----------+----------+---------+---------+---------+--------+
        |    Variable   |   Mean   |    SD   |   Skew  |   Kurt  |   Min    |    P5    |   P25    |  Median  |   P75   |   P95   |   Max   |   n    |
        +---------------+----------+---------+---------+---------+----------+----------+----------+----------+---------+---------+---------+--------+
        | Sort Variable | -0.00291 | 1.00034 | 0.00655 | 0.00903 | -3.78607 | -1.64983 | -0.67779 | -0.00555 | 0.66878 | 1.64654 | 5.01253 | 3000.0 |
        +---------------+----------+---------+---------+---------+----------+----------+----------+----------+---------+---------+---------+--------+
        
        Group Statistics
        +--------+----------+----------+----------+----------+----------+---------+---------+---------+---------+---------+
        |  Time  |    1     |    2     |    3     |    4     |    5     |    6    |    7    |    8    |    9    |    10   |
        +--------+----------+----------+----------+----------+----------+---------+---------+---------+---------+---------+
        | 2001.0 | -1.6925  | -1.02089 | -0.66652 | -0.38477 | -0.1233  | 0.11802 | 0.37854 | 0.65636 | 1.01076 | 1.72971 |
        | 2002.0 | -1.87762 | -1.11717 | -0.73731 | -0.41848 | -0.13805 |  0.1111 | 0.36679 | 0.66682 | 1.03146 | 1.73504 |
        | 2003.0 | -1.78951 | -1.10062 | -0.72741 | -0.43769 | -0.16465 | 0.10609 | 0.38293 | 0.68745 | 1.07672 | 1.81225 |
        | 2004.0 | -1.73014 | -1.06276 | -0.71448 | -0.4134  | -0.15013 | 0.11498 |  0.3735 | 0.65222 | 1.04136 | 1.75216 |
        | 2005.0 | -1.76065 | -1.03176 | -0.64098 |  -0.361  | -0.11605 |  0.1288 | 0.37912 | 0.67395 | 1.05602 | 1.77653 |
        | 2006.0 | -1.74401 | -1.0027  | -0.66083 | -0.39449 | -0.14914 | 0.10178 |  0.3742 | 0.67059 | 1.04616 | 1.76814 |
        | 2007.0 | -1.80695 | -1.06589 | -0.68808 | -0.38284 | -0.12541 |  0.1307 |  0.3805 |  0.6842 | 1.07225 | 1.79131 |
        | 2008.0 | -1.74075 | -1.03649 | -0.66014 | -0.36012 | -0.08308 | 0.16978 | 0.40924 | 0.69059 | 1.07663 | 1.80214 |
        | 2009.0 | -1.72743 | -1.02872 | -0.66078 | -0.35962 | -0.10524 | 0.15432 | 0.41381 | 0.70732 | 1.05451 | 1.77899 |
        | 2010.0 | -1.78576 | -1.0724  | -0.70536 | -0.4206  | -0.15331 | 0.10784 | 0.36778 | 0.63268 | 0.97913 | 1.70654 |
        | 2011.0 | -1.74376 | -1.06618 | -0.66421 | -0.36223 | -0.11708 | 0.11619 | 0.37691 | 0.65584 | 1.04012 | 1.75902 |
        | 2012.0 | -1.74187 | -1.02805 | -0.66739 | -0.35736 | -0.11212 | 0.14089 | 0.40391 | 0.67798 | 1.02562 | 1.73806 |
        | 2013.0 | -1.76579 | -1.05636 | -0.71253 | -0.40731 | -0.15883 | 0.08952 | 0.34684 | 0.64058 |  1.0596 | 1.76204 |
        | 2014.0 | -1.74755 | -1.05756 | -0.66698 | -0.35469 | -0.08209 | 0.15193 | 0.41016 | 0.70804 | 1.05792 | 1.72348 |
        | 2015.0 | -1.71824 | -1.02469 | -0.68511 | -0.39623 | -0.12801 |  0.1155 |  0.3805 | 0.67353 | 1.03347 | 1.79041 |
        | 2016.0 | -1.76041 | -1.00856 | -0.65567 | -0.38146 | -0.14799 | 0.12255 | 0.39534 | 0.68455 | 1.03506 | 1.74972 |
        | 2017.0 | -1.76569 | -1.09121 | -0.6997  | -0.39767 | -0.16277 | 0.08815 | 0.36036 | 0.65853 | 1.02296 |  1.7052 |
        | 2018.0 | -1.7382  | -1.04642 | -0.69095 | -0.39814 | -0.14829 | 0.09936 | 0.35899 | 0.63287 | 1.00188 | 1.77708 |
        | 2019.0 | -1.79867 | -1.02312 | -0.63218 | -0.35047 | -0.10632 | 0.13742 | 0.38529 | 0.67535 | 1.05045 | 1.73754 |
        | 2020.0 | -1.68432 | -1.02939 | -0.6729  | -0.3778  | -0.13669 | 0.12742 | 0.41184 |  0.6951 | 1.03508 | 1.72796 |
        +--------+----------+----------+----------+----------+----------+---------+---------+---------+---------+---------+
        
        Indicator Statistics
        +--------+----------+---------+----------+----------+----------+----------+----------+----------+---------+---------+---------+--------+
        |  Time  |   Mean   |    SD   |   Skew   |   Kurt   |   Min    |    P5    |   P25    |  Median  |   P75   |   P95   |   Max   |   n    |
        +--------+----------+---------+----------+----------+----------+----------+----------+----------+---------+---------+---------+--------+
        | 2001.0 | 0.00054  | 0.97302 | 0.02247  | -0.10708 | -3.42651 | -1.6194  | -0.65605 | -0.00359 | 0.65144 | 1.64108 | 3.03045 | 3000.0 |
        | 2002.0 | -0.03774 | 1.02917 | -0.07101 | -0.04394 | -3.34554 | -1.75824 | -0.73412 | -0.00649 | 0.67196 | 1.60588 | 3.24348 | 3000.0 |
        | 2003.0 | -0.01544 |  1.031  | 0.08283  | -0.05399 | -3.70265 | -1.68766 | -0.72952 | -0.02717 | 0.67982 | 1.71738 | 5.01253 | 3000.0 |
        | 2004.0 | -0.01367 | 0.99949 | 0.05183  | 0.03698  | -3.78607 | -1.6032  | -0.7196  | -0.00878 | 0.64707 | 1.63855 | 3.94993 | 3000.0 |
        | 2005.0 |  0.0104  | 1.00123 | -0.00145 | 0.05439  | -3.45686 | -1.65593 | -0.64007 | 0.00818  | 0.66671 | 1.66286 | 3.25955 | 3000.0 |
        | 2006.0 | 0.00097  | 0.99657 |  0.0462  | 0.14851  | -3.66269 | -1.63244 | -0.65111 | -0.02304 | 0.66632 | 1.64847 | 3.80687 | 3000.0 |
        | 2007.0 | -0.00102 | 1.02414 | -0.02673 | 0.09203  | -3.6959  | -1.67876 | -0.68666 | -0.00039 | 0.68119 | 1.67692 | 3.46859 | 3000.0 |
        | 2008.0 | 0.02678  | 1.00841 | -0.01221 | -0.01983 | -3.23867 | -1.60362 | -0.66403 | 0.05001  | 0.68418 | 1.69399 | 3.12197 | 3000.0 |
        | 2009.0 | 0.02272  | 0.99936 | 0.00098  | -0.02585 | -3.47649 | -1.6255  | -0.66684 | 0.01566  | 0.70893 | 1.65815 | 3.52246 | 3000.0 |
        | 2010.0 | -0.03435 | 0.99275 | -0.00444 | 0.07042  | -3.76678 | -1.68815 | -0.70062 | -0.02015 | 0.62253 | 1.58344 | 4.38276 | 3000.0 |
        | 2011.0 | -0.00054 | 0.99636 | 0.02189  | 0.01578  | -3.27961 | -1.65198 | -0.66553 | -0.00789 | 0.64738 | 1.63708 | 3.82473 | 3000.0 |
        | 2012.0 | 0.00797  | 0.99176 | -0.02293 | 0.16201  | -3.71695 | -1.62543 | -0.66423 | 0.00611  | 0.67271 | 1.62142 | 4.53518 | 3000.0 |
        | 2013.0 | -0.02023 | 1.00519 | 0.06585  | -0.02879 | -2.94317 | -1.6765  | -0.71018 | -0.04464 | 0.63132 |  1.6228 | 3.69209 | 3000.0 |
        | 2014.0 | 0.01427  |  0.9944 | -0.06002 | -0.14694 | -3.24666 | -1.65555 | -0.66567 | 0.03211  | 0.70328 | 1.61143 | 3.76138 | 3000.0 |
        | 2015.0 | 0.00411  | 0.99681 | 0.05986  | -0.00795 | -3.23554 | -1.62117 | -0.67571 | -0.01158 | 0.67075 | 1.69081 |  3.2803 | 3000.0 |
        | 2016.0 | 0.00331  | 0.99567 | -0.01401 | 0.05045  | -3.32744 | -1.6719  | -0.65461 | -0.02452 | 0.68572 | 1.63524 | 3.64654 | 3000.0 |
        | 2017.0 | -0.02818 | 0.99394 | 0.03023  | -0.07768 | -2.89247 | -1.66422 | -0.69934 | -0.04211 |  0.6557 | 1.61093 | 4.08491 | 3000.0 |
        | 2018.0 | -0.01518 | 0.99426 | 0.06121  | 0.05183  | -3.6031  | -1.61522 | -0.68846 | -0.02645 | 0.63035 | 1.65972 | 3.17931 | 3000.0 |
        | 2019.0 | 0.00753  | 0.99978 | -0.09315 | 0.15681  | -3.66891 | -1.66554 | -0.62705 | 0.01924  | 0.67606 |  1.6475 | 3.63029 | 3000.0 |
        | 2020.0 | 0.00963  | 0.97843 | 0.00495  | -0.19773 | -3.52657 | -1.57132 | -0.67812 | -0.0006  | 0.69571 |  1.6329 | 2.96727 | 3000.0 |
        +--------+----------+---------+----------+----------+----------+----------+----------+----------+---------+---------+---------+--------+
        
        Group Statistics
        +-----------+----------+----------+----------+----------+----------+----------+----------+----------+----------+----------+
        |    Time   |    1     |    2     |    3     |    4     |    5     |    6     |    7     |    8     |    9     |    10    |
        +-----------+----------+----------+----------+----------+----------+----------+----------+----------+----------+----------+
        |   2001.0  | -1.6925  | -1.02089 | -0.66652 | -0.38477 | -0.1233  | 0.11802  | 0.37854  | 0.65636  | 1.01076  | 1.72971  |
        |   2002.0  | -1.87762 | -1.11717 | -0.73731 | -0.41848 | -0.13805 |  0.1111  | 0.36679  | 0.66682  | 1.03146  | 1.73504  |
        |   2003.0  | -1.78951 | -1.10062 | -0.72741 | -0.43769 | -0.16465 | 0.10609  | 0.38293  | 0.68745  | 1.07672  | 1.81225  |
        |   2004.0  | -1.73014 | -1.06276 | -0.71448 | -0.4134  | -0.15013 | 0.11498  |  0.3735  | 0.65222  | 1.04136  | 1.75216  |
        |   2005.0  | -1.76065 | -1.03176 | -0.64098 |  -0.361  | -0.11605 |  0.1288  | 0.37912  | 0.67395  | 1.05602  | 1.77653  |
        |   2006.0  | -1.74401 | -1.0027  | -0.66083 | -0.39449 | -0.14914 | 0.10178  |  0.3742  | 0.67059  | 1.04616  | 1.76814  |
        |   2007.0  | -1.80695 | -1.06589 | -0.68808 | -0.38284 | -0.12541 |  0.1307  |  0.3805  |  0.6842  | 1.07225  | 1.79131  |
        |   2008.0  | -1.74075 | -1.03649 | -0.66014 | -0.36012 | -0.08308 | 0.16978  | 0.40924  | 0.69059  | 1.07663  | 1.80214  |
        |   2009.0  | -1.72743 | -1.02872 | -0.66078 | -0.35962 | -0.10524 | 0.15432  | 0.41381  | 0.70732  | 1.05451  | 1.77899  |
        |   2010.0  | -1.78576 | -1.0724  | -0.70536 | -0.4206  | -0.15331 | 0.10784  | 0.36778  | 0.63268  | 0.97913  | 1.70654  |
        |   2011.0  | -1.74376 | -1.06618 | -0.66421 | -0.36223 | -0.11708 | 0.11619  | 0.37691  | 0.65584  | 1.04012  | 1.75902  |
        |   2012.0  | -1.74187 | -1.02805 | -0.66739 | -0.35736 | -0.11212 | 0.14089  | 0.40391  | 0.67798  | 1.02562  | 1.73806  |
        |   2013.0  | -1.76579 | -1.05636 | -0.71253 | -0.40731 | -0.15883 | 0.08952  | 0.34684  | 0.64058  |  1.0596  | 1.76204  |
        |   2014.0  | -1.74755 | -1.05756 | -0.66698 | -0.35469 | -0.08209 | 0.15193  | 0.41016  | 0.70804  | 1.05792  | 1.72348  |
        |   2015.0  | -1.71824 | -1.02469 | -0.68511 | -0.39623 | -0.12801 |  0.1155  |  0.3805  | 0.67353  | 1.03347  | 1.79041  |
        |   2016.0  | -1.76041 | -1.00856 | -0.65567 | -0.38146 | -0.14799 | 0.12255  | 0.39534  | 0.68455  | 1.03506  | 1.74972  |
        |   2017.0  | -1.76569 | -1.09121 | -0.6997  | -0.39767 | -0.16277 | 0.08815  | 0.36036  | 0.65853  | 1.02296  |  1.7052  |
        |   2018.0  | -1.7382  | -1.04642 | -0.69095 | -0.39814 | -0.14829 | 0.09936  | 0.35899  | 0.63287  | 1.00188  | 1.77708  |
        |   2019.0  | -1.79867 | -1.02312 | -0.63218 | -0.35047 | -0.10632 | 0.13742  | 0.38529  | 0.67535  | 1.05045  | 1.73754  |
        |   2020.0  | -1.68432 | -1.02939 | -0.6729  | -0.3778  | -0.13669 | 0.12742  | 0.41184  |  0.6951  | 1.03508  | 1.72796  |
        | Variable1 |          |          |          |          |          |          |          |          |          |          |
        |   2001.0  | -0.01522 | -0.04716 | -0.00418 | 0.04429  | 0.02995  | -0.03962 | 0.03099  | -0.00748 | -0.01993 | 0.07281  |
        |   2002.0  | -0.04164 | 0.04915  | 0.09105  | -0.01251 | 0.01903  | -0.06013 | 0.13446  | 0.00548  | 0.10778  |  0.0321  |
        |   2003.0  | -0.05489 | -0.05391 | 0.07348  | 0.01206  | -0.02717 | 0.09487  | 0.02613  | -0.08623 | -0.01783 | 0.03207  |
        |   2004.0  | -0.11632 | -0.05173 | 0.00346  | 0.05961  | 0.05607  |  0.0044  | -0.06302 | -0.0196  | 0.02273  | 0.00575  |
        |   2005.0  | -0.03024 | 0.07865  | -0.01689 | 0.01001  | 0.03753  | -0.07502 | -0.04625 | -0.03403 | -0.02387 | -0.0277  |
        |   2006.0  | -0.00906 | 0.06595  | -0.06769 | 0.04341  | -0.11204 | 0.02921  | -0.09275 | -0.03564 | 0.02151  | -0.01127 |
        |   2007.0  | 0.06778  | 0.04576  |  0.0608  | -0.04811 | 0.00333  | 0.00261  | -0.0016  | -0.07013 | -0.03491 | 0.05194  |
        |   2008.0  | -0.05786 | -0.15562 | 0.02984  |  0.0357  | -0.05811 | -0.00905 | -0.01723 | 0.01197  | -0.01656 | 0.05303  |
        |   2009.0  | 0.07899  | 0.04472  | 0.01793  | -0.01595 | -0.00243 | -0.01357 | -0.05197 | 0.01346  | -0.06112 | 0.01429  |
        |   2010.0  | 0.00449  | -0.01249 | 0.06374  | 0.04358  | -0.06673 | -0.03651 | 0.03834  |  0.108   | -0.01276 | -0.08404 |
        |   2011.0  | 0.04746  | -0.02221 | 0.03684  |  -0.036  | 0.07592  |  0.0378  | -0.07824 | 0.08281  | -0.06111 | -0.01362 |
        |   2012.0  | -0.00621 | -0.05096 | -0.1767  | 0.02314  | 0.04629  | -0.01675 | 0.05391  | 0.04671  | 0.11531  | 0.02835  |
        |   2013.0  |  -0.015  | 0.04923  | -0.06807 | -0.01484 | 0.03847  | 0.01953  | 0.08062  | -0.05869 | 0.03246  | 0.06554  |
        |   2014.0  | 0.04659  | -0.04936 | 0.03138  | -0.03342 | 0.05986  | 0.03902  | -0.02342 | 0.07528  | 0.08315  | 0.05208  |
        |   2015.0  | -0.05979 | 0.02031  | 0.02758  | -0.01191 | -0.10045 | 0.06854  |  0.0208  | 0.01457  | 0.02852  | -0.00798 |
        |   2016.0  | -0.05725 | 0.10226  | 0.01476  | 0.04893  | -0.06243 | 0.01586  | -0.03891 |  0.0497  | -0.06997 | 0.02027  |
        |   2017.0  | -0.08491 | -0.0204  | 0.06377  | -0.03237 | 0.04701  | -0.03702 | 0.03355  | 0.01081  | 0.06146  | -0.10583 |
        |   2018.0  | 0.17193  | 0.00903  | -0.09272 | -0.03469 | -0.0089  | 0.02757  |  0.1021  | -0.0237  |  0.0248  | 0.06012  |
        |   2019.0  | 0.01284  | -0.16395 | -0.02548 | -0.02048 | 0.02623  | -0.07111 | -0.01099 | -0.00777 | -0.08123 |  0.0173  |
        |   2020.0  |  0.0235  | 0.05426  | -0.01945 | -0.00294 | 0.00467  | -0.03483 | -0.05022 | -0.10885 | -0.02188 | 0.04856  |
        | Variable2 |          |          |          |          |          |          |          |          |          |          |
        |   2001.0  | -0.06124 | -0.04428 | -0.04644 | -0.01806 | 0.06192  | -0.04294 | 0.01533  | 0.06654  | 0.09389  | 0.10937  |
        |   2002.0  | -0.13372 | -0.07595 | -0.06099 | 0.12119  |  0.0084  | -0.03141 | 0.03979  | 0.01794  | -0.00021 | 0.01842  |
        |   2003.0  | 0.03537  | -0.00357 | 0.00429  | -0.01691 | 0.04931  | -0.15836 |  0.0193  | -0.0739  | -0.06891 |  0.0953  |
        |   2004.0  | 0.06502  | -0.04215 |  0.038   | -0.07539 | -0.02489 | -0.0284  | 0.00056  | -0.03221 | -0.00046 | 0.00694  |
        |   2005.0  | -0.08741 | -0.07291 | 0.14488  | 0.05242  | -0.04511 | -0.01714 |  0.0171  | -0.00856 | 0.00803  | -0.01396 |
        |   2006.0  | 0.03065  | -0.02519 | 0.07904  | 0.04322  | -0.01684 | 0.10146  | -0.06241 | 0.07308  | -0.04444 | 0.10448  |
        |   2007.0  | 0.02148  | -0.01106 | 0.00893  | 0.02162  | -0.00159 | -0.0297  | 0.04191  | 0.05094  |  0.0363  | -0.00436 |
        |   2008.0  |  0.0107  | -0.10686 | 0.04827  | 0.03952  | 0.07151  | -0.03169 | 0.03021  | -0.09669 | -0.0482  | -0.06362 |
        |   2009.0  | -0.01609 | -0.08597 | -0.03498 | -0.10286 | 0.01119  |  0.0722  | -0.00609 | -0.01957 | 0.05398  | -0.00213 |
        |   2010.0  | -0.07676 | 0.04038  | -0.07815 | -0.0874  | 0.01334  | -0.07987 | -0.0038  | -0.05676 |  0.1112  | 0.05337  |
        |   2011.0  | 0.10439  | 0.10231  | -0.01716 | 0.07714  | 0.03848  | -0.0531  | 0.00782  | 0.06249  | -0.01327 | -0.0104  |
        |   2012.0  | 0.06762  | -0.14694 | -0.0542  | -0.05754 | 0.04338  | -0.05577 | 0.02918  | 0.05796  | -0.01778 | 0.03046  |
        |   2013.0  | 0.01455  | 0.08443  | 0.01292  | -0.03034 | -0.02707 | 0.07382  | 0.02197  | 0.03101  | 0.00928  | -0.04489 |
        |   2014.0  | 0.02744  | 0.04831  | -0.04699 | -0.00752 | 0.01465  | 0.03622  | 0.06426  | 0.02387  | 0.04653  | 0.03347  |
        |   2015.0  |  0.1187  | 0.05819  | -0.11595 | 0.04011  | -0.0611  | 0.05873  | 0.04269  | 0.08874  | -0.02125 | 0.04671  |
        |   2016.0  | -0.04301 | 0.12979  | -0.06156 | 0.05615  | 0.07165  | -0.05736 | -0.03592 | 0.01267  | -0.09932 | 0.02544  |
        |   2017.0  | 0.03395  | 0.10936  | 0.02666  | -0.02994 | -0.07346 | -0.00728 | 0.03578  |  0.0431  | -0.03173 | -0.11917 |
        |   2018.0  | 0.00207  | -0.00248 | 0.10703  | -0.07196 | -0.0273  | -0.06974 | -0.07203 | 0.00266  | 0.02013  | -0.00583 |
        |   2019.0  | 0.03897  | -0.05036 | -0.00791 | 0.08408  | -0.02484 | 0.05819  | -0.00277 | -0.00087 | 0.05029  | -0.11294 |
        |   2020.0  | -0.03353 | -0.05889 | -0.04434 | -0.00597 | 0.05152  | -0.08162 | -0.01077 | 0.02026  | -0.02127 | 0.05415  |
        +-----------+----------+----------+----------+----------+----------+----------+----------+----------+----------+----------+
        
        Indicator Statistics
        +-----------+----------+---------+----------+----------+----------+----------+----------+----------+---------+---------+---------+--------+
        |    Time   |   Mean   |    SD   |   Skew   |   Kurt   |   Min    |    P5    |   P25    |  Median  |   P75   |   P95   |   Max   |   n    |
        +-----------+----------+---------+----------+----------+----------+----------+----------+----------+---------+---------+---------+--------+
        |   2001.0  | 0.00054  | 0.97302 | 0.02247  | -0.10708 | -3.42651 | -1.6194  | -0.65605 | -0.00359 | 0.65144 | 1.64108 | 3.03045 | 3000.0 |
        |   2002.0  | -0.03774 | 1.02917 | -0.07101 | -0.04394 | -3.34554 | -1.75824 | -0.73412 | -0.00649 | 0.67196 | 1.60588 | 3.24348 | 3000.0 |
        |   2003.0  | -0.01544 |  1.031  | 0.08283  | -0.05399 | -3.70265 | -1.68766 | -0.72952 | -0.02717 | 0.67982 | 1.71738 | 5.01253 | 3000.0 |
        |   2004.0  | -0.01367 | 0.99949 | 0.05183  | 0.03698  | -3.78607 | -1.6032  | -0.7196  | -0.00878 | 0.64707 | 1.63855 | 3.94993 | 3000.0 |
        |   2005.0  |  0.0104  | 1.00123 | -0.00145 | 0.05439  | -3.45686 | -1.65593 | -0.64007 | 0.00818  | 0.66671 | 1.66286 | 3.25955 | 3000.0 |
        |   2006.0  | 0.00097  | 0.99657 |  0.0462  | 0.14851  | -3.66269 | -1.63244 | -0.65111 | -0.02304 | 0.66632 | 1.64847 | 3.80687 | 3000.0 |
        |   2007.0  | -0.00102 | 1.02414 | -0.02673 | 0.09203  | -3.6959  | -1.67876 | -0.68666 | -0.00039 | 0.68119 | 1.67692 | 3.46859 | 3000.0 |
        |   2008.0  | 0.02678  | 1.00841 | -0.01221 | -0.01983 | -3.23867 | -1.60362 | -0.66403 | 0.05001  | 0.68418 | 1.69399 | 3.12197 | 3000.0 |
        |   2009.0  | 0.02272  | 0.99936 | 0.00098  | -0.02585 | -3.47649 | -1.6255  | -0.66684 | 0.01566  | 0.70893 | 1.65815 | 3.52246 | 3000.0 |
        |   2010.0  | -0.03435 | 0.99275 | -0.00444 | 0.07042  | -3.76678 | -1.68815 | -0.70062 | -0.02015 | 0.62253 | 1.58344 | 4.38276 | 3000.0 |
        |   2011.0  | -0.00054 | 0.99636 | 0.02189  | 0.01578  | -3.27961 | -1.65198 | -0.66553 | -0.00789 | 0.64738 | 1.63708 | 3.82473 | 3000.0 |
        |   2012.0  | 0.00797  | 0.99176 | -0.02293 | 0.16201  | -3.71695 | -1.62543 | -0.66423 | 0.00611  | 0.67271 | 1.62142 | 4.53518 | 3000.0 |
        |   2013.0  | -0.02023 | 1.00519 | 0.06585  | -0.02879 | -2.94317 | -1.6765  | -0.71018 | -0.04464 | 0.63132 |  1.6228 | 3.69209 | 3000.0 |
        |   2014.0  | 0.01427  |  0.9944 | -0.06002 | -0.14694 | -3.24666 | -1.65555 | -0.66567 | 0.03211  | 0.70328 | 1.61143 | 3.76138 | 3000.0 |
        |   2015.0  | 0.00411  | 0.99681 | 0.05986  | -0.00795 | -3.23554 | -1.62117 | -0.67571 | -0.01158 | 0.67075 | 1.69081 |  3.2803 | 3000.0 |
        |   2016.0  | 0.00331  | 0.99567 | -0.01401 | 0.05045  | -3.32744 | -1.6719  | -0.65461 | -0.02452 | 0.68572 | 1.63524 | 3.64654 | 3000.0 |
        |   2017.0  | -0.02818 | 0.99394 | 0.03023  | -0.07768 | -2.89247 | -1.66422 | -0.69934 | -0.04211 |  0.6557 | 1.61093 | 4.08491 | 3000.0 |
        |   2018.0  | -0.01518 | 0.99426 | 0.06121  | 0.05183  | -3.6031  | -1.61522 | -0.68846 | -0.02645 | 0.63035 | 1.65972 | 3.17931 | 3000.0 |
        |   2019.0  | 0.00753  | 0.99978 | -0.09315 | 0.15681  | -3.66891 | -1.66554 | -0.62705 | 0.01924  | 0.67606 |  1.6475 | 3.63029 | 3000.0 |
        |   2020.0  | 0.00963  | 0.97843 | 0.00495  | -0.19773 | -3.52657 | -1.57132 | -0.67812 | -0.0006  | 0.69571 |  1.6329 | 2.96727 | 3000.0 |
        | Variable1 |          |         |          |          |          |          |          |          |         |         |         |        |
        |   2001.0  | 0.00444  | 0.99937 | -0.00111 | 0.02784  | -4.01608 | -1.62687 | -0.66101 | 0.00461  | 0.68782 | 1.66745 | 3.74674 | 3000.0 |
        |   2002.0  | 0.03248  | 0.98333 | -0.03543 |  -0.12   | -2.95129 | -1.59552 | -0.64121 | 0.03724  | 0.71526 | 1.65638 | 2.94975 | 3000.0 |
        |   2003.0  | -0.00014 |  1.0129 | -0.01918 | -0.02553 | -3.53621 | -1.70547 | -0.68189 | 0.01289  | 0.66419 | 1.70014 | 3.46892 | 3000.0 |
        |   2004.0  | -0.00986 | 0.99839 | -0.04955 | -0.09378 | -3.64843 | -1.68933 | -0.66527 | 0.00177  | 0.67276 | 1.61928 | 3.39946 | 3000.0 |
        |   2005.0  | -0.01278 | 1.01301 | 0.04036  | -0.05986 | -3.94393 | -1.66642 | -0.72307 | -0.01037 | 0.66475 | 1.68997 | 4.01455 | 3000.0 |
        |   2006.0  | -0.01684 | 1.00801 | -0.02485 | 0.04404  | -3.42747 | -1.69649 | -0.69847 | 0.00137  | 0.64377 | 1.63762 | 3.71065 | 3000.0 |
        |   2007.0  | 0.00775  | 0.99226 | 0.02647  | -0.09826 | -3.35856 | -1.63332 | -0.6701  | 0.01552  | 0.68772 | 1.63534 | 3.38473 | 3000.0 |
        |   2008.0  | -0.01839 | 1.00207 | 0.07064  | -0.04444 | -3.49316 | -1.62796 | -0.70664 | -0.04696 | 0.67611 | 1.64585 | 3.41926 | 3000.0 |
        |   2009.0  | 0.00244  | 1.00653 | 0.00118  | 0.03134  | -3.97288 | -1.62437 | -0.69253 | 0.00254  | 0.69752 | 1.66172 | 3.59204 | 3000.0 |
        |   2010.0  | 0.00456  | 0.98586 | 0.00081  | 0.00748  | -3.77567 | -1.65978 | -0.65396 | -0.00024 | 0.66656 | 1.63131 | 4.09801 | 3000.0 |
        |   2011.0  | 0.00697  | 1.00025 | 0.03471  | 0.02968  | -3.36289 | -1.63179 | -0.65539 | -0.00692 | 0.67358 | 1.68125 | 3.31398 | 3000.0 |
        |   2012.0  | 0.00631  | 1.01953 | 0.08786  | -0.01507 | -3.65272 | -1.66321 | -0.69096 | 0.01699  | 0.69087 | 1.68358 | 4.29577 | 3000.0 |
        |   2013.0  | 0.01293  |  1.0022 | 0.03341  | 0.06975  | -3.4732  | -1.6283  | -0.65712 | -0.00773 |  0.6928 | 1.64534 | 3.63433 | 3000.0 |
        |   2014.0  | 0.02812  | 1.01169 | 0.00673  | -0.06633 | -3.29828 | -1.6315  | -0.65895 | 0.02223  |  0.7063 | 1.68416 | 3.34373 | 3000.0 |
        |   2015.0  |  2e-05   | 1.00701 | -0.04947 | -0.04389 | -3.62209 | -1.66903 | -0.67316 | 0.00995  |  0.6839 | 1.65849 | 3.37435 | 3000.0 |
        |   2016.0  | 0.00232  |  0.9964 | 0.00089  | -0.11472 | -3.41436 | -1.60652 | -0.68629 | -0.02141 |  0.6936 | 1.64288 | 3.24371 | 3000.0 |
        |   2017.0  | -0.00639 | 0.99143 | -0.09349 | 0.14564  | -3.7956  | -1.67824 | -0.65161 | 0.01273  | 0.64207 | 1.57087 |  3.3055 | 3000.0 |
        |   2018.0  | 0.02355  | 1.00525 |  0.0735  | 0.03266  | -3.24816 | -1.59484 | -0.65466 | -0.00647 | 0.67029 | 1.72254 | 4.04177 | 3000.0 |
        |   2019.0  | -0.03246 | 1.01253 | -0.08108 | -0.09494 | -3.68037 | -1.75283 | -0.69156 | -0.01116 | 0.65456 | 1.59883 | 2.96434 | 3000.0 |
        |   2020.0  | -0.01072 |  1.0106 | 0.02844  | 0.03357  | -3.57483 | -1.66278 | -0.6799  | -0.04445 |  0.6553 | 1.69042 | 3.88206 | 3000.0 |
        | Variable2 |          |         |          |          |          |          |          |          |         |         |         |        |
        |   2001.0  | 0.01341  | 0.99215 | -0.04582 | -0.06823 | -3.6463  | -1.66845 | -0.64103 | 0.03084  | 0.67477 | 1.61286 | 3.38016 | 3000.0 |
        |   2002.0  | -0.00965 | 1.00207 | -0.01972 | -0.0418  | -3.68085 | -1.6906  | -0.68189 | 0.02149  | 0.65511 | 1.60544 | 3.67893 | 3000.0 |
        |   2003.0  | -0.01181 | 0.98368 |   0.03   | 0.07221  | -3.52554 | -1.6329  | -0.6623  | -0.01934 | 0.62873 | 1.56947 | 3.29198 | 3000.0 |
        |   2004.0  | -0.0093  | 1.01012 | -0.06468 | 0.05186  | -3.82134 | -1.68905 | -0.68096 | -0.01715 | 0.66304 | 1.60242 | 3.43242 | 3000.0 |
        |   2005.0  | -0.00226 | 1.00171 | 0.07204  | 0.24676  | -3.90934 | -1.6036  | -0.66317 | -0.03034 | 0.66551 | 1.65203 | 3.29004 | 3000.0 |
        |   2006.0  | 0.02831  | 0.98831 | -0.01583 | -0.06904 | -3.49769 |  -1.597  | -0.6341  | 0.02602  | 0.70498 |  1.612  | 3.27316 | 3000.0 |
        |   2007.0  | 0.01345  | 0.98311 | -0.00127 | -0.03949 | -3.94062 | -1.56016 | -0.64564 |  0.0141  | 0.70418 | 1.62664 |  3.1278 | 3000.0 |
        |   2008.0  | -0.01468 | 0.98965 | -0.02328 | -0.13919 | -3.23034 | -1.64276 | -0.69535 | -0.00703 | 0.64543 | 1.67091 | 2.98688 | 3000.0 |
        |   2009.0  | -0.01303 | 0.99521 | -0.01693 | -0.11205 | -3.18737 | -1.65024 | -0.69354 | -0.0227  |  0.6594 | 1.62866 | 3.12058 | 3000.0 |
        |   2010.0  | -0.01644 | 1.02983 | 0.01627  | 0.09362  | -3.68702 | -1.71173 | -0.70591 | -0.01513 | 0.67615 | 1.65729 | 3.84489 | 3000.0 |
        |   2011.0  | 0.02987  | 0.97808 | -0.01661 | -0.00216 | -3.75328 | -1.58039 | -0.63462 | 0.03478  | 0.70848 | 1.61408 | 3.32421 | 3000.0 |
        |   2012.0  | -0.01036 | 1.02126 | -0.04517 | -0.06873 | -3.42255 | -1.71536 | -0.69263 |  -0.02   | 0.67138 | 1.68837 | 3.21492 | 3000.0 |
        |   2013.0  | 0.01457  | 0.99664 | -0.05389 | -0.01116 | -3.8404  | -1.59651 | -0.68504 |  0.0347  | 0.69874 | 1.62143 | 3.18552 | 3000.0 |
        |   2014.0  | 0.02402  | 1.00528 | 0.05727  | 0.23415  | -4.52543 | -1.62336 | -0.66718 |  0.0268  | 0.67981 | 1.70991 | 3.85701 | 3000.0 |
        |   2015.0  | 0.02556  | 1.04251 | 0.00448  | -0.2219  | -3.77315 | -1.68143 | -0.71505 | 0.03778  | 0.74572 | 1.72891 | 3.11878 | 3000.0 |
        |   2016.0  | -0.00015 | 1.01562 | 0.08258  |  0.0474  | -3.26366 | -1.64969 | -0.70016 | -0.01887 | 0.65377 | 1.69586 | 3.84588 | 3000.0 |
        |   2017.0  | -0.00127 | 0.97951 | -0.10032 | 0.00475  | -3.23879 | -1.63675 | -0.66305 | 0.02015  | 0.64732 | 1.57432 | 3.13642 | 3000.0 |
        |   2018.0  | -0.01174 | 1.01481 |  0.0055  |  0.077   | -4.15834 | -1.67378 | -0.72306 | 0.00013  |  0.6599 | 1.64383 | 3.77525 | 3000.0 |
        |   2019.0  | 0.00318  | 1.01367 | -0.04253 | 0.00838  | -3.90455 | -1.68076 | -0.66782 | -0.00943 | 0.70025 | 1.63818 | 3.43779 | 3000.0 |
        |   2020.0  | -0.01305 | 1.00361 | 0.00358  | -0.18842 | -3.11144 | -1.66498 | -0.69793 | -0.01765 | 0.69372 | 1.64435 | 3.36513 | 3000.0 |
        +-----------+----------+---------+----------+----------+----------+----------+----------+----------+---------+---------+---------+--------+
        ===========================================================================================================================================
        
        # correlation 
        variable_3 = np.random.normal(0, 1, 20*3000)
        variable_4 = np.random.normal(0, 1, 20*3000)
        print('-------------------------------------- Correlation ---------------------------------')
        exper.correlation(variables=np.array([variable_1, variable_2, variable_3, variable_4]).T, periodic=True)
        exper.correlation(variables=np.array([variable_1, variable_2, variable_3, variable_4]).T)
        =======================================================================================================================================================================
        Spearman Correlation
        +---------+-------------------------+-------------------------+-------------------------+-------------------------+-------------------------+-------------------------+
        |   Time  | Variable_0 & Variable_1 | Variable_0 & Variable_2 | Variable_0 & Variable_3 | Variable_1 & Variable_2 | Variable_1 & Variable_3 | Variable_2 & Variable_3 |
        +---------+-------------------------+-------------------------+-------------------------+-------------------------+-------------------------+-------------------------+
        | 2001.0  |         0.00863         |         0.00926         |         -0.00397        |         -0.00906        |         -0.00307        |         -0.0309         |
        | 2002.0  |         -0.01452        |         -0.04919        |         -0.01512        |         0.01882         |         -0.01007        |         -0.01103        |
        | 2003.0  |         -0.00284        |         -0.00438        |         0.02967         |         0.00591         |         -0.00215        |         -0.02898        |
        | 2004.0  |         0.00591         |         -0.00211        |          0.0029         |         0.01874         |         -0.0006         |         -0.01046        |
        | 2005.0  |         -0.03282        |         0.00415         |          0.0126         |         -0.00071        |         0.01993         |         -0.01941        |
        | 2006.0  |         -0.00586        |         -0.01359        |         0.03434         |         -0.00489        |         0.00569         |         0.00498         |
        | 2007.0  |         0.01482         |         -0.01671        |         -0.01547        |         0.01586         |         0.03723         |         0.00367         |
        | 2008.0  |         -0.02851        |         0.00128         |         0.00171         |         0.00211         |         -0.03949        |         -0.00039        |
        | 2009.0  |         -0.02377        |         0.01281         |         0.02534         |         0.01994         |         -0.00389        |         0.02761         |
        | 2010.0  |         0.02264         |         -0.0402         |         0.01084         |         -0.02077        |         -0.02531        |         -0.00131        |
        | 2011.0  |         0.00374         |         -0.0273         |           0.01          |         0.00546         |         0.00095         |          0.0209         |
        | 2012.0  |         -0.00557        |         0.01421         |         -0.00734        |         -0.02101        |         -0.00696        |         0.00732         |
        | 2013.0  |          -4e-05         |         -0.00498        |         -0.02152        |         0.01399         |         -0.04743        |         0.01188         |
        | 2014.0  |         -0.00228        |         0.01015         |         -0.01893        |         0.00524         |         -0.00558        |         0.01606         |
        | 2015.0  |         -0.00637        |         -0.00037        |         0.00886         |         0.01985         |         0.00351         |         -0.01251        |
        | 2016.0  |          0.0045         |         0.00385         |         -0.02201        |         -0.01904        |         -0.01806        |         -0.00567        |
        | 2017.0  |          0.0084         |         0.00174         |         -0.01042        |         0.03584         |         -0.00474        |         -0.01022        |
        | 2018.0  |          0.029          |         0.01707         |         0.01334         |         0.00304         |         -0.00772        |         -0.03887        |
        | 2019.0  |         -0.02843        |         0.04992         |         0.02054         |          0.0179         |         -0.00819        |         -0.00156        |
        | 2020.0  |         -0.00428        |          0.0306         |         -0.00014        |         -0.03424        |         0.00179         |         0.00909         |
        +---------+-------------------------+-------------------------+-------------------------+-------------------------+-------------------------+-------------------------+
        Fisher Correlation
        +---------+-------------------------+-------------------------+-------------------------+-------------------------+-------------------------+-------------------------+
        |   Time  | Variable_0 & Variable_1 | Variable_0 & Variable_2 | Variable_0 & Variable_3 | Variable_1 & Variable_2 | Variable_1 & Variable_3 | Variable_2 & Variable_3 |
        +---------+-------------------------+-------------------------+-------------------------+-------------------------+-------------------------+-------------------------+
        | 2001.0  |          0.0047         |         0.00928         |         -0.00062        |         -0.00261        |         -0.00212        |         -0.02891        |
        | 2002.0  |         -0.0136         |         -0.04798        |         -0.00929        |         0.01184         |         -0.00402        |         -0.02674        |
        | 2003.0  |         0.00528         |         -0.00703        |         0.03384         |         0.00351         |         -0.0062         |         -0.02955        |
        | 2004.0  |         0.00026         |         -0.00738        |         -0.00191        |         0.01757         |         -0.00905        |         -0.0198         |
        | 2005.0  |         -0.02188        |         0.00508         |         0.01595         |         -0.00102        |         0.02017         |         -0.02134        |
        | 2006.0  |         0.00079         |         -0.01184        |         0.04108         |         -0.00318        |         0.01416         |         0.00016         |
        | 2007.0  |         0.00898         |         -0.00828        |         -0.02333        |         0.01113         |         0.04626         |         -0.00328        |
        | 2008.0  |         -0.03725        |          0.0059         |         -0.00461        |         0.00724         |         -0.0367         |         -0.00523        |
        | 2009.0  |         -0.02321        |         0.01845         |         0.02499         |         0.02819         |         -0.01009        |         0.02698         |
        | 2010.0  |         0.01829         |         -0.03991        |         0.00954         |         -0.00806        |         -0.02117        |         -0.00337        |
        | 2011.0  |         -0.00429        |         -0.02689        |         0.00147         |         0.01042         |         0.00555         |         0.02417         |
        | 2012.0  |         -0.00896        |         0.02461         |         -0.00171        |         -0.02136        |         -0.00776        |          0.0049         |
        | 2013.0  |         0.00897         |         -0.00313        |         -0.01611        |         0.01796         |         -0.04195        |         0.01399         |
        | 2014.0  |         -0.00553        |         0.01175         |         -0.02001        |         0.01146         |         -0.00138        |         0.02431         |
        | 2015.0  |         -0.01301        |         0.00142         |         0.00421         |         0.01506         |         0.01047         |         -0.00587        |
        | 2016.0  |         0.00976         |         0.00267         |         -0.01945        |         -0.02886        |         -0.01569        |         -0.00137        |
        | 2017.0  |         0.01281         |         0.00114         |         -0.0059         |         0.03967         |         -0.00231        |         -0.02049        |
        | 2018.0  |         0.02602         |         0.02507         |         0.01181         |         0.00088         |         0.00019         |         -0.03295        |
        | 2019.0  |         -0.02941        |         0.04476         |         0.02165         |         0.00689         |         0.00426         |         -0.00762        |
        | 2020.0  |         0.00638         |         0.01985         |         -0.00544        |         -0.03257        |         -0.00213        |         0.00639         |
        +---------+-------------------------+-------------------------+-------------------------+-------------------------+-------------------------+-------------------------+
        +----------+-------------------------+-------------------------+-------------------------+-------------------------+-------------------------+-------------------------+
        | Variable | Variable_0 & Variable_1 | Variable_0 & Variable_2 | Variable_0 & Variable_3 | Variable_1 & Variable_2 | Variable_1 & Variable_3 | Variable_2 & Variable_3 |
        +----------+-------------------------+-------------------------+-------------------------+-------------------------+-------------------------+-------------------------+
        |  Fisher  |         -0.00274        |         0.00088         |         0.00281         |         0.00421         |         -0.00298        |         -0.00528        |
        | Spearman |         -0.00288        |         -0.00019        |         0.00276         |         0.00365         |         -0.00571        |         -0.00349        |
        +----------+-------------------------+-------------------------+-------------------------+-------------------------+-------------------------+-------------------------+
        ```
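        The periodic correlation tables above report, for each year, the cross-sectional Spearman and Fisher correlations of every variable pair, and the final table averages them over time. A minimal numpy-only sketch of this style of computation, under the common convention that "Fisher correlation" means Pearson correlations averaged in Fisher z-space (the package's exact definition may differ):
        
        ```python
        import numpy as np
        
        rng = np.random.default_rng(0)
        
        # hypothetical panel: 3 periods x 100 assets, two variables
        periods = np.repeat([2001, 2002, 2003], 100)
        x = rng.normal(size=300)
        y = rng.normal(size=300)
        
        spearman_by_period, fisher_z = [], []
        for t in np.unique(periods):
            m = periods == t
            # Spearman = Pearson correlation of ranks (no ties in continuous data)
            rx, ry = x[m].argsort().argsort(), y[m].argsort().argsort()
            spearman_by_period.append(np.corrcoef(rx, ry)[0, 1])
            # Pearson correlation, stored as its Fisher z-transform
            fisher_z.append(np.arctanh(np.corrcoef(x[m], y[m])[0, 1]))
        
        # time-series averages: Spearman averaged directly, Pearson averaged
        # in z-space and mapped back to a correlation
        avg_spearman = np.mean(spearman_by_period)
        avg_fisher = np.tanh(np.mean(fisher_z))
        ```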
        
         
        
        #### class Bivariate(ptf_analysis):
        
        ##### def \__init__(self, sample):
        
        This function initializes the class Bivariate.
        
        **input :**
        
        *sample (ndarray or DataFrame):* The sample to be analyzed. The sample usually contains the future return, the characteristics, and the time. The **DEFAULT** setting is: the *1st* column is the future return, the *2nd* column is the first characteristic, the *3rd* column is the second characteristic, and the *4th* column (or the index, if the data type is DataFrame) is the time label.
        
        
        
        ##### def divide_by_time(self):
        
        This function groups the sample by time.
        
        **output :** 
        
        *groups_by_time (list):* The samples grouped by time, one array per time point.
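        Conceptually, the grouping can be sketched as splitting the sample on its time column (a simplified illustration with hypothetical data, not the package's internal implementation):
        
        ```python
        import numpy as np
        
        # hypothetical sample: column 0 = future return, last column = time label
        sample = np.array([[ 0.1, 2001],
                           [ 0.2, 2001],
                           [-0.3, 2002],
                           [ 0.4, 2002]])
        
        # one sub-array per time point, in ascending time order
        groups_by_time = [sample[sample[:, -1] == t]
                          for t in np.unique(sample[:, -1])]
        ```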
        
        
        
        ##### def average_by_time(self, conditional=False):
        
        This function, using the sample grouped by time from *divide_by_time*, groups the sample by the two characteristics and then calculates the average return of each group at every time point.
        
        **input :**
        
        *conditional (boolean):* The sorting scheme. If **True**, a dependent (conditional) sort is performed; if **False**, an independent sort is performed. The **DEFAULT** setting is **False**.
        
        **output :** 
        
        *average_group_time (matrix: N×N×T):* The average return of the groups for each characteristic pair, indexed by time.
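        The distinction between the two sorting schemes can be sketched as follows (an illustrative 2×2 median sort on hypothetical data, not the class's exact breakpoint logic):
        
        ```python
        import numpy as np
        
        rng = np.random.default_rng(1)
        ret = rng.normal(size=1000)
        ch1 = rng.normal(size=1000)
        ch2 = rng.normal(size=1000)
        
        # independent sort: each characteristic is split at its own
        # full-sample median, so the two groupings are formed separately
        g1 = np.digitize(ch1, np.quantile(ch1, [0.5]))
        g2_indep = np.digitize(ch2, np.quantile(ch2, [0.5]))
        
        # dependent (conditional) sort: the second characteristic's median is
        # recomputed within each first-characteristic group
        g2_cond = np.empty_like(g1)
        for g in np.unique(g1):
            m = g1 == g
            g2_cond[m] = np.digitize(ch2[m], np.quantile(ch2[m], [0.5]))
        
        # average return in each (g1, g2) cell for the independent sort
        avg = np.array([[ret[(g1 == i) & (g2_indep == j)].mean()
                         for j in range(2)] for i in range(2)])
        ```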
        
        
        
        **Example**
        
        ```python
        import numpy as np
        from portfolio_analysis import Bivariate as bi
        
        # generate time labels: 20 years x 3000 assets
        year = np.ones((3000,1), dtype=int)*2020
        for i in range(19):
            year = np.append(year, (2019-i)*np.ones((3000,1), dtype=int))
        
        # generate characteristics
        character_1 = np.random.normal(0, 1, 20*3000)
        character_2 = np.random.normal(0, 1, 20*3000)
        
        # generate future return
        ret = character_1*-0.5 + character_2*0.5 + np.random.normal(0, 1, 20*3000)
        # create sample containing future return, characteristics, time
        sample = np.array([ret, character_1, character_2, year]).T
        print(sample)
        # create the Bivariate class
        exper = bi(sample, 9)
        # test function divide_by_time
        group_by_time = exper.divide_by_time()
        print(group_by_time)
        # test function average_by_time
        average_group_time = exper.average_by_time()
        print(average_group_time)
        print(np.shape(average_group_time))
        ========================================================================
        [[-1.248 -0.723 -0.526 2020.000]
         [0.739 0.058 1.388 2020.000]
         [1.711 -1.081 0.348 2020.000]
         ...
         [-0.906 1.128 -2.996 2001.000]
         [-1.240 0.781 -0.451 2001.000]
         [-0.216 0.907 -1.086 2001.000]]
        group by time: 
         [array([[0.522, 0.420, -1.185, 2001.000],
               [0.381, -0.954, -2.045, 2001.000],
               [1.086, -1.116, 1.017, 2001.000],
               ...,
               [-0.906, 1.128, -2.996, 2001.000],
               [-1.240, 0.781, -0.451, 2001.000],
               [-0.216, 0.907, -1.086, 2001.000]]), array([[-1.170, 1.635, 0.001, 2002.000],
               [0.852, -1.479, 0.878, 2002.000],
               [-0.080, 0.403, 0.547, 2002.000],
               ...,
               [0.308, -0.107, 0.261, 2002.000],
               [0.301, -2.221, 1.165, 2002.000],
               [-2.017, 0.576, 0.726, 2002.000]]), array([[-0.371, 1.792, 1.310, 2003.000],
               [0.815, -1.668, 0.022, 2003.000],
               [2.301, 0.324, 0.586, 2003.000],
               ...,
               [1.337, -1.082, -0.890, 2003.000],
               [-0.447, 1.330, 0.707, 2003.000],
               [2.666, -1.206, 0.232, 2003.000]]), array([[-0.430, 0.654, 0.543, 2004.000],
               [1.787, 0.772, 1.571, 2004.000],
               [2.241, -0.778, 1.403, 2004.000],
               ...,
               [0.290, -0.513, 0.026, 2004.000],
               [1.303, -0.173, 2.936, 2004.000],
               [-0.064, 0.986, 0.777, 2004.000]]), array([[-0.966, -0.162, -0.747, 2005.000],
               [1.395, 1.247, 0.257, 2005.000],
               [0.850, -2.563, -1.314, 2005.000],
               ...,
               [-1.148, -0.395, -1.396, 2005.000],
               [-2.013, 1.442, -2.360, 2005.000],
               [0.753, -0.533, -0.829, 2005.000]]), array([[-0.351, 0.771, 0.764, 2006.000],
               [-0.338, 0.545, 0.719, 2006.000],
               [-0.853, -0.271, 0.107, 2006.000],
               ...,
               [-1.302, 1.416, -0.656, 2006.000],
               [1.233, 0.318, 0.925, 2006.000],
               [0.169, 0.571, 0.703, 2006.000]]), array([[-1.335, 0.054, -1.674, 2007.000],
               [-1.760, -0.648, -1.320, 2007.000],
               [-0.783, -0.507, -0.527, 2007.000],
               ...,
               [0.703, -0.111, -0.153, 2007.000],
               [-1.252, 0.056, 1.051, 2007.000],
               [-1.914, 1.627, -2.030, 2007.000]]), array([[1.767, -0.743, 1.614, 2008.000],
               [0.364, -0.737, 0.141, 2008.000],
               [-0.691, -0.299, -1.860, 2008.000],
               ...,
               [-0.490, -0.202, -0.279, 2008.000],
               [0.982, -0.508, -0.415, 2008.000],
               [0.380, -0.796, -0.756, 2008.000]]), array([[-1.207, 1.881, -1.727, 2009.000],
               [0.848, -0.094, 0.782, 2009.000],
               [-1.822, 1.992, 0.239, 2009.000],
               ...,
               [1.484, -2.619, 0.840, 2009.000],
               [0.530, 0.929, 0.644, 2009.000],
               [-1.095, 2.209, -0.606, 2009.000]]), array([[-1.747, 0.447, -1.597, 2010.000],
               [-1.092, 0.321, 1.201, 2010.000],
               [-0.398, 0.282, 0.574, 2010.000],
               ...,
               [3.040, -0.083, 1.477, 2010.000],
               [1.221, 0.182, 2.531, 2010.000],
               [0.722, -0.342, -0.096, 2010.000]]), array([[0.134, 0.632, -0.437, 2011.000],
               [0.679, 0.649, 0.097, 2011.000],
               [-0.440, -0.798, -1.114, 2011.000],
               ...,
               [-2.137, 2.317, -1.015, 2011.000],
               [0.792, 0.309, 0.440, 2011.000],
               [1.685, -0.949, -0.092, 2011.000]]), array([[-0.727, 0.941, 1.412, 2012.000],
               [-0.052, 0.951, -1.023, 2012.000],
               [0.781, 0.434, 0.105, 2012.000],
               ...,
               [1.158, 1.810, 0.118, 2012.000],
               [1.382, -0.029, 0.234, 2012.000],
               [0.297, -1.016, -0.231, 2012.000]]), array([[-1.071, 0.400, 1.490, 2013.000],
               [-0.559, 1.989, 0.298, 2013.000],
               [0.324, -0.078, 0.512, 2013.000],
               ...,
               [-1.160, 0.799, -0.191, 2013.000],
               [1.542, -1.006, 0.835, 2013.000],
               [-0.160, -0.608, 1.218, 2013.000]]), array([[1.760, 1.303, 0.584, 2014.000],
               [0.308, 1.184, 0.710, 2014.000],
               [-0.916, -1.485, 0.717, 2014.000],
               ...,
               [-0.839, -1.420, -0.786, 2014.000],
               [0.625, -0.289, 0.914, 2014.000],
               [-1.393, -0.144, 0.389, 2014.000]]), array([[-0.371, 2.836, 1.156, 2015.000],
               [0.625, -0.812, 0.517, 2015.000],
               [1.683, -0.697, 0.316, 2015.000],
               ...,
               [-1.519, -0.195, -1.427, 2015.000],
               [1.141, -0.541, 1.434, 2015.000],
               [1.491, -0.605, 0.456, 2015.000]]), array([[0.396, 1.153, 1.451, 2016.000],
               [1.124, -0.064, -1.265, 2016.000],
               [0.047, 0.038, 1.536, 2016.000],
               ...,
               [-1.715, 1.981, 0.622, 2016.000],
               [-0.997, -0.048, -0.100, 2016.000],
               [4.086, -0.561, 1.430, 2016.000]]), array([[-2.620, 1.302, -0.624, 2017.000],
               [-3.629, 0.548, -0.361, 2017.000],
               [-1.450, 0.024, -1.406, 2017.000],
               ...,
               [0.035, -0.252, -1.975, 2017.000],
               [1.145, -1.471, -0.003, 2017.000],
               [-1.875, -0.794, -0.484, 2017.000]]), array([[-0.199, 0.571, -1.849, 2018.000],
               [-0.535, 0.408, -1.257, 2018.000],
               [-1.629, -0.315, 0.223, 2018.000],
               ...,
               [1.010, -0.382, 0.335, 2018.000],
               [-2.577, 0.425, -1.771, 2018.000],
               [0.316, -0.722, -1.370, 2018.000]]), array([[-0.323, -0.139, -0.376, 2019.000],
               [0.105, 0.439, -0.103, 2019.000],
               [0.394, 0.929, -1.189, 2019.000],
               ...,
               [0.617, 0.686, -0.222, 2019.000],
               [-2.545, 0.752, -1.271, 2019.000],
               [2.016, -0.663, 2.025, 2019.000]]), array([[-1.248, -0.723, -0.526, 2020.000],
               [0.739, 0.058, 1.388, 2020.000],
               [1.711, -1.081, 0.348, 2020.000],
               ...,
               [-2.098, -0.131, -1.182, 2020.000],
               [2.356, 0.722, -0.356, 2020.000],
               [0.703, -0.360, -0.386, 2020.000]])]
        average_group_time: 
         [[[-0.006 0.056 -0.214 ... 0.008 -0.440 0.230]
          [-0.025 0.134 0.298 ... 0.432 0.475 0.022]
          [0.416 0.685 0.884 ... 0.384 0.925 0.614]
          ...
          [1.206 1.389 1.050 ... 1.180 1.053 1.124]
          [1.458 1.319 1.620 ... 1.101 1.226 1.273]
          [1.558 1.582 1.733 ... 1.392 1.841 1.867]]
        
         [[-0.294 -0.542 -0.319 ... -0.468 -0.204 -0.196]
          [0.120 -0.160 0.077 ... 0.012 0.260 -0.132]
          [0.305 0.077 0.027 ... 0.307 0.102 0.246]
          ...
          [0.971 0.836 1.196 ... 0.867 0.775 0.851]
          [0.988 0.800 1.467 ... 0.985 1.328 1.197]
          [1.713 1.087 1.665 ... 1.551 1.262 1.628]]
        
         [[-0.686 -0.606 -0.273 ... -0.361 -0.532 -0.435]
          [-0.449 -0.263 -0.432 ... -0.168 -0.638 -0.397]
          [0.169 -0.096 0.029 ... -0.177 -0.119 -0.089]
          ...
          [0.568 0.623 0.594 ... 0.664 0.517 0.740]
          [0.671 0.831 1.049 ... 0.963 0.850 1.072]
          [1.214 1.414 1.283 ... 1.040 1.187 1.295]]
        
         ...
        
         [[-1.160 -1.274 -1.352 ... -1.244 -1.537 -0.719]
          [-0.918 -0.634 -0.714 ... -0.855 -0.798 -1.014]
          [-0.732 -0.565 -0.556 ... -0.769 -0.790 -1.063]
          ...
          [0.186 0.032 -0.331 ... 0.512 0.143 0.205]
          [0.029 0.107 0.098 ... 0.116 0.275 0.223]
          [0.927 0.506 0.767 ... 0.444 0.667 0.860]]
        
         [[-1.694 -1.812 -1.452 ... -1.498 -1.312 -1.729]
          [-0.973 -1.123 -0.965 ... -1.175 -1.003 -1.054]
          [-0.928 -0.770 -0.845 ... -0.727 -0.961 -0.891]
          ...
          [-0.051 0.025 -0.158 ... 0.210 0.221 0.048]
          [0.223 0.166 0.475 ... 0.266 -0.050 -0.149]
          [0.218 0.157 -0.182 ... 0.416 0.225 0.479]]
        
         [[-1.693 -1.962 -1.773 ... -1.600 -1.497 -1.749]
          [-1.580 -1.482 -1.580 ... -1.337 -1.629 -1.711]
          [-1.384 -0.927 -1.158 ... -1.390 -1.010 -1.213]
          ...
          [-0.747 -0.601 -0.499 ... -0.831 -0.348 -0.404]
          [-0.238 -0.269 -0.472 ... -0.031 -0.502 -0.356]
          [-0.322 0.136 0.020 ... -0.202 -0.066 -0.342]]]
        shape of average_group_time: 
         (10, 10, 20)
        ```
        
        
        
        ##### def difference(self, average_group):
        
        This function calculates the difference in group returns, i.e., the average return of the last group minus that of the first group.
        
         **input :** 
        
        *average_group (ndarray):* The average returns of the groups for each characteristic-time pair.
        
        **output :**
        
        *result (ndarray):* The matrix added with the difference of average group return.
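        The computation can be sketched with plain NumPy on made-up data (the names `average_group`, `diff`, and `result` are illustrative, not the package's internals):
        
        ```python
        import numpy as np
        
        # Hypothetical average group returns: 5 groups x 4 periods.
        average_group = np.arange(20, dtype=float).reshape(5, 4)
        
        # The difference sequence: the last group's average return minus the first's.
        diff = average_group[-1] - average_group[0]
        
        # Append the difference as an extra row, so 5 groups become 6 rows --
        # the same kind of shape change seen in the example below (10 -> 11).
        result = np.vstack([average_group, diff])
        print(result.shape)
        ```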
        
        
        
        ##### def factor_adjustment(self, factor):
        
        This function calculates the group return adjusted by risk factors.
        
        **input :**
        
        *factor (ndarray or DataFrame):* The factor return table used to adjust the group returns (including the difference sequence).
        
        **output :**
        
        *alpha (ndarray):* The anomaly, i.e., the intercept (risk-adjusted return) from the factor regression.
        
        *ttest (ndarray):* The t-value of the anomaly.
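        A minimal sketch of the idea on simulated data: regress a group's return series on the factor with an intercept; the intercept estimate is the alpha. This uses a plain least-squares fit with invented variable names, whereas the package also reports t-values (with Newey-West adjustment when *maxlag* is set):
        
        ```python
        import numpy as np
        
        rng = np.random.default_rng(0)
        T = 120                                     # number of periods
        factor = rng.normal(0.0, 1.0, (T, 1))       # one simulated risk factor
        # Simulated group returns: true alpha 0.02, factor loading 0.5.
        group_ret = 0.02 + 0.5 * factor[:, 0] + rng.normal(0.0, 0.1, T)
        
        # Regress on the factor with a constant; the intercept is the alpha.
        X = np.column_stack([np.ones(T), factor])
        coef, *_ = np.linalg.lstsq(X, group_ret, rcond=None)
        alpha = coef[0]                             # risk-adjusted return
        print(alpha)
        ```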
        
        
        
        ##### def summary_and_test(self, **kwargs):
        
        This function summarizes the results and performs t-tests.
        
        **input :** 
        
        *export (boolean):* Whether to export the table. The table is exported in the form of a DataFrame. The default setting is **False**.
        
        **output :**
        
        *self.average (array):* The average of the portfolio return across time.
        
        *self.ttest (array):* The t-value of the portfolio return across time.
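        The summary step can be sketched with simulated data; the shapes mirror those reported in the example output below (all names here are illustrative):
        
        ```python
        import numpy as np
        from scipy import stats
        
        # Simulated portfolio returns: 11 groups (incl. difference) x 20 periods.
        rng = np.random.default_rng(1)
        returns = rng.normal(0.5, 1.0, (11, 20))
        
        average = returns.mean(axis=1)                    # average across time
        ttest = stats.ttest_1samp(returns, 0.0, axis=1)   # t-test against zero
        print(average.shape, ttest.statistic.shape)
        ```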
        
        
        
        ##### def extractor(self, r_pos, c_pos):
        
        This function extracts the return series.
        
        **input :**
        
        *r_pos (int):* The row position of the return matrix.
        
        *c_pos (int):* The column position of the return matrix.
        
        **output :**
        
        *series_ex (Series):* The extracted Series.
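        Conceptually this is plain indexing on the first two axes of the return matrix; a sketch with a hypothetical 3-D array (illustrative data and names):
        
        ```python
        import numpy as np
        
        # Hypothetical (row group, column group, time) array of average returns.
        returns = np.arange(2 * 3 * 4, dtype=float).reshape(2, 3, 4)
        
        # Extracting the time series at row position 1, column position 2,
        # analogous to extractor(r_pos=1, c_pos=2).
        series_ex = returns[1, 2, :]
        print(series_ex)
        ```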
        
        
        
        ##### def fit(self, number, perc_row=None, perc_col=None, percn_row=None, percn_col=None, weight=False, maxlag=12, **kwargs):
        
        This function runs **summary_and_test()**.
        
        **input :**
        
        *number (int):*  The number of breakpoints.
        
        *perc_row (list or array):*  The breakpoint percentile points of row characteristics.
        
        *perc_col (list or array):* The breakpoint percentile points of column characteristics.
        
        *percn_row (list or array):* The breakpoints percentiles of row characteristics.
        
        *percn_col (list or array):* The breakpoints percentiles of column characteristics.
        
        *weight (boolean):* Whether to calculate the weighted average return.
        
        *maxlag (int):*  The maximum lag for Newey-West adjustment.
        
        *kwargs :* Additional keyword settings, such as *conditional*, etc.
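        The percentile-point style of breakpoints (*perc_row*/*perc_col*) can be sketched with NumPy on simulated characteristics; this is a sketch of the grouping idea, not the package's implementation:
        
        ```python
        import numpy as np
        
        rng = np.random.default_rng(2)
        character = rng.normal(0.0, 1.0, 1000)      # simulated characteristic
        
        # Percentile points [0, 30, 70, 100] replace an even number-based split.
        breakpoint_ = np.percentile(character, [0, 30, 70, 100])
        # Assign each asset a group label 0, 1, or 2 by its breakpoint interval.
        groups = np.digitize(character, breakpoint_[1:-1])
        print(np.bincount(groups))
        ```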
        
        
        
        ##### def print_summary_by_time(self, export=False):
        
        This function prints the summary grouped by time.
        
        **input :** 
        
        *export (boolean):* Whether to export the table. The table is exported in the form of a DataFrame. The default setting is **False**.
        
        **output :**
        
        *df (DataFrame):* The table exported in the form of a DataFrame.
        
        
        
        ##### def print_summary(self, explicit=False, export=False, percentage=False):
        
        This function prints the summary grouped by characteristic and averaged across time.
        
        **input :**
        
        *explicit (boolean):* Whether to present the explicit result. The default is **False**.
        
        *export (boolean):* Whether to export the table. The table is exported in the form of a DataFrame. The default setting is **False**.
        
        *percentage (boolean):* Whether to present returns in percentage form. The default is **False**.
        
        **output :**
        
        *df (DataFrame):* The table exported in the form of a DataFrame.
        
        
        
        **Example**
        
        ```python
        # Continue the above code
        # test function difference
        result = exper.difference(average_group_time)
        print('result :\n', result)
        print('difference matrix :\n', np.shape(result))
        # test function summary_and_test
        average, ttest = exper.summary_and_test()
        print('average :\n', average)
        print(' shape of average :', np.shape(average))
        print('ttest :\n', ttest)
        print('shape of ttest :', np.shape(ttest))
        # test function print_summary_by_time()
        exper.print_summary_by_time()
        # test function print_summary
        exper.print_summary()
            
        # generate factor
        factor=np.random.normal(0,1.0,(20,1))
        exper=bi(sample,9,factor=factor,maxlag=12)
        exper.fit()
        exper.print_summary()
        ========================================================================
        result :
         [[[-0.006 0.056 -0.214 ... 0.008 -0.440 0.230]
          [-0.025 0.134 0.298 ... 0.432 0.475 0.022]
          [0.416 0.685 0.884 ... 0.384 0.925 0.614]
          ...
          [1.458 1.319 1.620 ... 1.101 1.226 1.273]
          [1.558 1.582 1.733 ... 1.392 1.841 1.867]
          [1.564 1.526 1.947 ... 1.384 2.281 1.637]]
        
         [[-0.294 -0.542 -0.319 ... -0.468 -0.204 -0.196]
          [0.120 -0.160 0.077 ... 0.012 0.260 -0.132]
          [0.305 0.077 0.027 ... 0.307 0.102 0.246]
          ...
          [0.988 0.800 1.467 ... 0.985 1.328 1.197]
          [1.713 1.087 1.665 ... 1.551 1.262 1.628]
          [2.007 1.629 1.983 ... 2.019 1.466 1.824]]
        
         [[-0.686 -0.606 -0.273 ... -0.361 -0.532 -0.435]
          [-0.449 -0.263 -0.432 ... -0.168 -0.638 -0.397]
          [0.169 -0.096 0.029 ... -0.177 -0.119 -0.089]
          ...
          [0.671 0.831 1.049 ... 0.963 0.850 1.072]
          [1.214 1.414 1.283 ... 1.040 1.187 1.295]
          [1.900 2.020 1.556 ... 1.401 1.720 1.730]]
        
         ...
        
         [[-1.694 -1.812 -1.452 ... -1.498 -1.312 -1.729]
          [-0.973 -1.123 -0.965 ... -1.175 -1.003 -1.054]
          [-0.928 -0.770 -0.845 ... -0.727 -0.961 -0.891]
          ...
          [0.223 0.166 0.475 ... 0.266 -0.050 -0.149]
          [0.218 0.157 -0.182 ... 0.416 0.225 0.479]
          [1.912 1.969 1.270 ... 1.914 1.536 2.208]]
        
         [[-1.693 -1.962 -1.773 ... -1.600 -1.497 -1.749]
          [-1.580 -1.482 -1.580 ... -1.337 -1.629 -1.711]
          [-1.384 -0.927 -1.158 ... -1.390 -1.010 -1.213]
          ...
          [-0.238 -0.269 -0.472 ... -0.031 -0.502 -0.356]
          [-0.322 0.136 0.020 ... -0.202 -0.066 -0.342]
          [1.372 2.098 1.793 ... 1.398 1.431 1.407]]
        
         [[-1.687 -2.018 -1.559 ... -1.608 -1.057 -1.979]
          [-1.554 -1.615 -1.879 ... -1.769 -2.103 -1.733]
          [-1.799 -1.612 -2.043 ... -1.773 -1.934 -1.827]
          ...
          [-1.696 -1.588 -2.092 ... -1.132 -1.727 -1.629]
          [-1.879 -1.446 -1.713 ... -1.594 -1.907 -2.209]
          [-0.192 0.572 -0.153 ... 0.014 -0.850 -0.230]]]
        difference matrix :
         (11, 11, 20)
        average :
         [[-0.009 0.307 0.487 0.670 0.808 0.870 1.121 1.170 1.395 1.763 1.771]
         [-0.348 0.025 0.155 0.353 0.500 0.556 0.791 0.923 1.013 1.426 1.774]
         [-0.492 -0.264 -0.008 0.144 0.248 0.429 0.543 0.663 0.854 1.249 1.741]
         [-0.712 -0.379 -0.135 -0.027 0.132 0.207 0.361 0.552 0.753 1.011 1.723]
         [-0.864 -0.470 -0.229 -0.095 0.010 0.096 0.225 0.395 0.562 0.876 1.740]
         [-0.936 -0.588 -0.369 -0.266 -0.109 -0.043 0.104 0.292 0.524 0.769 1.706]
         [-1.166 -0.700 -0.531 -0.455 -0.250 -0.162 0.061 0.113 0.331 0.596 1.762]
         [-1.206 -0.863 -0.702 -0.530 -0.408 -0.195 -0.072 -0.033 0.139 0.587
          1.794]
         [-1.418 -1.080 -0.907 -0.827 -0.537 -0.401 -0.280 -0.108 0.087 0.384
          1.803]
         [-1.740 -1.428 -1.208 -1.037 -0.882 -0.800 -0.722 -0.563 -0.264 -0.020
          1.720]
         [-1.732 -1.735 -1.695 -1.707 -1.689 -1.670 -1.843 -1.733 -1.659 -1.782
          -0.051]]
         shape of average : (11, 11)
        ttest :
         Ttest_1sampResult(statistic=array([[-0.210, 9.363, 10.040, 14.328, 19.642, 19.275, 32.422, 31.651,
                32.548, 36.339, 30.879],
               [-8.735, 0.554, 4.106, 9.711, 15.652, 17.688, 18.663, 21.424,
                24.727, 25.303, 27.968],
               [-14.510, -5.950, -0.196, 3.603, 4.279, 10.713, 11.675, 15.120,
                28.222, 30.641, 33.960],
               [-13.954, -12.731, -3.512, -0.734, 4.065, 3.896, 10.159, 11.573,
                19.965, 33.017, 27.068],
               [-22.685, -10.826, -5.787, -2.676, 0.240, 3.617, 4.984, 8.858,
                12.281, 22.043, 35.963],
               [-17.923, -13.135, -7.587, -8.412, -3.183, -1.094, 2.233, 6.425,
                14.503, 19.564, 22.897],
               [-23.358, -18.494, -11.156, -12.058, -6.798, -3.088, 1.608, 2.929,
                9.236, 12.340, 25.862],
               [-29.779, -21.416, -16.812, -12.196, -9.915, -4.330, -1.904,
                -0.730, 4.491, 14.137, 32.595],
               [-28.227, -28.724, -24.521, -19.015, -10.798, -8.905, -6.847,
                -2.320, 2.612, 8.876, 32.366],
               [-58.707, -35.555, -33.894, -25.661, -22.006, -19.782, -17.041,
                -13.409, -6.740, -0.442, 30.899],
               [-29.798, -36.048, -35.280, -26.596, -33.752, -24.635, -42.514,
                -31.417, -26.573, -25.966, -0.585]]), pvalue=array([[0.836, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000,
                0.000, 0.000],
               [0.000, 0.586, 0.001, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000,
                0.000, 0.000],
               [0.000, 0.000, 0.847, 0.002, 0.000, 0.000, 0.000, 0.000, 0.000,
                0.000, 0.000],
               [0.000, 0.000, 0.002, 0.472, 0.001, 0.001, 0.000, 0.000, 0.000,
                0.000, 0.000],
               [0.000, 0.000, 0.000, 0.015, 0.813, 0.002, 0.000, 0.000, 0.000,
                0.000, 0.000],
               [0.000, 0.000, 0.000, 0.000, 0.005, 0.288, 0.038, 0.000, 0.000,
                0.000, 0.000],
               [0.000, 0.000, 0.000, 0.000, 0.000, 0.006, 0.124, 0.009, 0.000,
                0.000, 0.000],
               [0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.072, 0.474, 0.000,
                0.000, 0.000],
               [0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.032, 0.017,
                0.000, 0.000],
               [0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000,
                0.664, 0.000],
               [0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000,
                0.000, 0.565]]))
        shape of ttest : (2, 11, 11)
        +--------+-------+--------+--------+--------+--------+--------+--------+--------+--------+--------+--------+--------+
        |  Time  | Group |   1    |   2    |   3    |   4    |   5    |   6    |   7    |   8    |   9    |   10   |  Diff  |
        +--------+-------+--------+--------+--------+--------+--------+--------+--------+--------+--------+--------+--------+
        | 2001.0 |   1   | -0.006 | -0.025 | 0.416  | 0.536  | 0.896  | 0.852  | 1.127  | 1.206  | 1.458  | 1.558  | 1.564  |
        |        |   2   | -0.294 |  0.12  | 0.305  | 0.324  | 0.419  | 0.637  | 0.349  | 0.971  | 0.988  | 1.713  | 2.007  |
        |        |   3   | -0.686 | -0.449 | 0.169  | -0.136 | -0.057 |  0.23  | 0.503  | 0.568  | 0.671  | 1.214  |  1.9   |
        |        |   4   | -1.005 | -0.287 | -0.135 |  0.05  | 0.196  | 0.587  |  0.36  | 0.707  |  0.68  | 1.066  | 2.071  |
        |        |   5   | -0.669 | -0.713 | -0.364 | 0.075  | -0.345 | 0.085  | 0.265  | 0.364  | 0.569  | 0.922  | 1.591  |
        |        |   6   | -1.136 | -0.535 | -0.669 | -0.325 | -0.265 | -0.137 | 0.041  | 0.571  | 0.444  | 1.052  | 2.188  |
        |        |   7   | -1.34  | -0.766 | -0.515 | -0.439 | -0.201 | -0.342 | 0.211  | -0.107 | 0.369  | 0.231  | 1.571  |
        |        |   8   | -1.16  | -0.918 | -0.732 | -0.563 | -0.625 | -0.182 | -0.06  | 0.186  | 0.029  | 0.927  | 2.086  |
        |        |   9   | -1.694 | -0.973 | -0.928 | -0.63  | -0.557 | -0.391 | -0.531 | -0.051 | 0.223  | 0.218  | 1.912  |
        |        |   10  | -1.693 | -1.58  | -1.384 | -0.712 | -0.67  | -0.707 | -0.803 | -0.747 | -0.238 | -0.322 | 1.372  |
        |        |  Diff | -1.687 | -1.554 | -1.799 | -1.248 | -1.566 | -1.559 | -1.93  | -1.953 | -1.696 | -1.879 | -0.192 |
        | 2002.0 |   1   | 0.056  | 0.134  | 0.685  | 0.891  | 1.142  | 0.668  | 1.173  | 1.389  | 1.319  | 1.582  | 1.526  |
        |        |   2   | -0.542 | -0.16  | 0.077  | 0.046  | 0.603  | 0.746  | 0.779  | 0.836  |  0.8   | 1.087  | 1.629  |
        |        |   3   | -0.606 | -0.263 | -0.096 | 0.235  | -0.006 | 0.273  |  0.79  | 0.623  | 0.831  | 1.414  |  2.02  |
        |        |   4   | -0.634 | -0.419 | 0.107  | -0.265 | 0.193  | 0.035  |  0.15  | 0.514  | 0.694  | 0.999  | 1.633  |
        |        |   5   | -1.139 | -0.555 | -0.269 | 0.015  | -0.198 |  0.13  | -0.018 | 0.496  | 0.657  | 0.763  | 1.902  |
        |        |   6   | -0.587 | -0.192 | -0.27  | -0.348 | 0.291  | -0.125 | -0.091 | 0.103  | 0.489  | 0.949  | 1.536  |
        |        |   7   | -0.916 | -0.847 | -0.781 | -0.223 | -0.065 | -0.498 | -0.066 | 0.277  |  0.37  | 0.383  | 1.299  |
        |        |   8   | -1.274 | -0.634 | -0.565 | -0.773 | -0.335 | -0.161 | -0.002 | 0.032  | 0.107  | 0.506  | 1.779  |
        |        |   9   | -1.812 | -1.123 | -0.77  | -0.67  | -0.922 | -0.32  | -0.569 | 0.025  | 0.166  | 0.157  | 1.969  |
        |        |   10  | -1.962 | -1.482 | -0.927 | -1.269 | -0.649 | -0.576 | -0.869 | -0.601 | -0.269 | 0.136  | 2.098  |
        |        |  Diff | -2.018 | -1.615 | -1.612 | -2.16  | -1.791 | -1.243 | -2.042 | -1.99  | -1.588 | -1.446 | 0.572  |
        | 2003.0 |   1   | -0.214 | 0.298  | 0.884  | 0.701  | 0.773  | 0.705  |  1.12  |  1.05  |  1.62  | 1.733  | 1.947  |
        |        |   2   | -0.319 | 0.077  | 0.027  | 0.399  | 0.495  | 0.534  | 0.922  | 1.196  | 1.467  | 1.665  | 1.983  |
        |        |   3   | -0.273 | -0.432 | 0.029  |  0.3   | 0.726  | 0.243  | 0.655  | 0.594  | 1.049  | 1.283  | 1.556  |
        |        |   4   | -0.643 | -0.531 | -0.149 | 0.124  | 0.086  | -0.055 | 0.186  | 0.563  | 0.696  | 1.047  |  1.69  |
        |        |   5   | -0.922 |  -0.1  | -0.214 | -0.203 | 0.229  | 0.024  | -0.049 | 0.283  | 0.895  | 0.965  | 1.888  |
        |        |   6   | -0.492 | -0.802 | -0.234 | -0.392 | -0.198 | -0.148 | 0.083  | 0.348  |  0.61  | 0.702  | 1.194  |
        |        |   7   | -1.483 | -0.691 | -0.522 | -0.633 | -0.409 | -0.008 | 0.193  | 0.292  | -0.001 | 0.835  | 2.317  |
        |        |   8   | -1.352 | -0.714 | -0.556 | -0.788 | -0.063 | -0.393 | -0.047 | -0.331 | 0.098  | 0.767  | 2.119  |
        |        |   9   | -1.452 | -0.965 | -0.845 | -0.935 | -0.38  | -0.281 | -0.293 | -0.158 | 0.475  | -0.182 |  1.27  |
        |        |   10  | -1.773 | -1.58  | -1.158 | -1.133 | -0.702 | -1.006 | -0.465 | -0.499 | -0.472 |  0.02  | 1.793  |
        |        |  Diff | -1.559 | -1.879 | -2.043 | -1.833 | -1.475 | -1.711 | -1.585 | -1.548 | -2.092 | -1.713 | -0.153 |
        | 2004.0 |   1   | -0.176 | 0.258  | 0.289  | 0.622  | 0.792  | 1.224  | 1.096  | 1.166  | 1.604  | 1.758  | 1.934  |
        |        |   2   | -0.342 | 0.185  | 0.181  | 0.413  | 0.552  | 0.689  | 1.024  | 1.149  | 0.971  | 1.626  | 1.967  |
        |        |   3   | -0.363 |  0.12  | -0.258 | 0.258  | -0.219 | 0.244  | 0.817  | 0.795  | 0.942  | 1.121  | 1.485  |
        |        |   4   | -0.956 | -0.466 | -0.361 | -0.046 | 0.217  | 0.052  | 0.652  |  0.6   | 0.776  | 0.872  | 1.829  |
        |        |   5   | -1.072 | -0.247 | -0.348 | 0.101  | 0.185  | 0.061  | 0.476  | 0.288  | 0.292  | 0.871  | 1.943  |
        |        |   6   | -0.782 | -0.744 | -0.451 | -0.449 | -0.116 | -0.059 | 0.442  |  0.26  | 0.297  | 0.406  | 1.188  |
        |        |   7   | -1.011 | -0.523 | -0.174 | -0.264 | -0.209 | 0.018  | -0.09  | -0.152 | 0.406  | 0.771  | 1.782  |
        |        |   8   | -1.064 | -1.185 | -0.958 | -0.314 | -0.714 | -0.096 | -0.343 | -0.017 | 0.205  | 0.427  | 1.491  |
        |        |   9   | -1.272 | -1.341 | -0.932 | -0.839 | -0.425 | -0.452 | -0.498 | -0.107 | 0.153  | 0.259  | 1.532  |
        |        |   10  | -1.551 | -1.152 | -1.257 | -1.175 | -0.789 | -0.613 | -0.566 | -0.606 | 0.003  |  0.16  | 1.712  |
        |        |  Diff | -1.376 | -1.41  | -1.547 | -1.797 | -1.581 | -1.837 | -1.662 | -1.772 |  -1.6  | -1.598 | -0.222 |
        | 2005.0 |   1   | 0.292  | 0.185  | 0.793  | 0.676  | 0.866  | 0.658  | 0.898  | 1.312  | 1.711  | 1.957  | 1.664  |
        |        |   2   | -0.351 | -0.267 | 0.235  | 0.484  | 0.402  | 0.517  | 0.938  |  0.5   | 1.084  | 1.604  | 1.955  |
        |        |   3   | -0.492 | -0.213 |  0.28  | 0.022  | 0.338  | 0.221  | 0.903  | 0.862  | 0.848  | 1.144  | 1.635  |
        |        |   4   | -0.425 | -0.268 | -0.228 | 0.014  | 0.252  |  0.27  | 0.453  | 0.612  | 0.911  | 0.999  | 1.424  |
        |        |   5   | -0.673 | -0.538 | -0.462 | -0.144 | -0.027 | 0.154  | 0.409  | -0.003 | 0.414  | 0.774  | 1.447  |
        |        |   6   | -1.109 | -0.604 | -0.465 | -0.06  | -0.048 | 0.104  | -0.058 | 0.212  | 0.749  | 1.172  | 2.282  |
        |        |   7   | -1.106 | -0.633 | -0.221 | -0.54  | -0.558 | -0.036 | 0.076  | -0.037 |  0.14  | 0.895  | 2.001  |
        |        |   8   | -1.485 | -1.188 | -0.414 | -0.21  | -0.223 | -0.556 | 0.016  | 0.029  | 0.234  | 0.521  | 2.006  |
        |        |   9   | -1.199 | -1.172 | -1.14  | -0.85  | -0.706 | -0.571 | -0.217 | -0.066 | 0.055  |  0.49  | 1.689  |
        |        |   10  | -1.876 | -1.479 | -1.317 | -1.205 | -1.136 | -0.596 | -1.07  | -0.469 | -0.376 | 0.321  | 2.197  |
        |        |  Diff | -2.168 | -1.664 | -2.11  | -1.881 | -2.002 | -1.254 | -1.968 | -1.781 | -2.087 | -1.635 | 0.533  |
        | 2006.0 |   1   | -0.006 | 0.369  | 0.611  | 0.916  | 0.818  | 0.894  | 1.053  | 1.307  | 1.403  | 1.791  | 1.797  |
        |        |   2   | -0.597 | -0.006 | 0.287  | 0.339  | 0.519  | 0.422  | 0.971  | 0.978  | 0.995  | 1.383  |  1.98  |
        |        |   3   | -0.662 | -0.362 | -0.057 |  0.17  | 0.198  | 0.369  | 0.781  | 1.019  | 0.817  | 1.337  | 1.999  |
        |        |   4   | -1.05  | -0.16  | 0.006  | 0.179  | -0.003 | -0.07  | 0.039  | 0.602  | 0.701  | 1.133  | 2.184  |
        |        |   5   | -0.966 | -0.898 | 0.001  | 0.046  | 0.031  | 0.178  | 0.454  | 0.281  | 0.479  | 0.663  | 1.629  |
        |        |   6   | -0.757 |  -0.6  | -0.35  | -0.201 | -0.069 | -0.236 | 0.408  | 0.476  | 0.838  | 0.652  | 1.409  |
        |        |   7   | -1.058 | -0.532 | -0.504 | -0.384 | -0.417 | -0.004 | -0.082 | -0.06  | 0.311  | 0.522  |  1.58  |
        |        |   8   | -1.148 | -0.962 | -0.734 | -0.498 | -0.527 | 0.101  | -0.373 | -0.033 | 0.389  | 0.735  | 1.884  |
        |        |   9   | -1.473 | -0.956 | -0.508 | -0.85  | -0.689 | -0.417 | -0.198 | -0.285 | 0.005  | 0.284  | 1.757  |
        |        |   10  | -1.494 | -1.458 | -1.18  | -1.087 | -0.525 | -1.031 | -0.547 | -0.41  | -0.092 | 0.256  |  1.75  |
        |        |  Diff | -1.488 | -1.827 | -1.792 | -2.003 | -1.344 | -1.924 | -1.601 | -1.716 | -1.495 | -1.535 | -0.047 |
        | 2007.0 |   1   | -0.123 | 0.481  | 0.258  | 0.852  | 1.025  | 0.964  | 1.239  | 0.922  | 1.271  | 1.566  | 1.689  |
        |        |   2   | -0.425 | 0.048  | -0.015 | 0.031  | 0.277  |  0.5   | 0.746  | 0.831  | 1.023  | 1.435  | 1.861  |
        |        |   3   | -0.495 | 0.105  | 0.106  | -0.066 | 0.553  | 0.421  | 0.491  | 0.851  |  0.83  | 0.951  | 1.446  |
        |        |   4   | -0.595 | -0.274 | -0.412 | 0.034  | -0.059 | 0.189  | 0.594  | 0.278  | 1.072  | 0.971  | 1.566  |
        |        |   5   | -0.888 | -0.202 | -0.076 |  0.01  | -0.195 | 0.265  | 0.346  | 0.696  | 0.097  | 1.018  | 1.906  |
        |        |   6   | -1.027 | -0.787 | -0.022 | -0.213 | -0.181 | -0.109 | 0.147  | 0.205  | 0.118  | 0.798  | 1.825  |
        |        |   7   | -1.302 | -1.056 | -0.662 | -0.689 | -0.457 |  0.04  | 0.059  | 0.069  |  0.3   | 0.425  | 1.727  |
        |        |   8   | -1.333 | -0.748 | -0.862 | -0.387 | -0.526 | -0.571 | -0.221 | -0.04  |  0.0   | 0.551  | 1.884  |
        |        |   9   | -0.981 | -0.873 | -1.158 | -0.487 | -0.287 | -0.094 | -0.132 | -0.17  | 0.071  | 0.416  | 1.397  |
        |        |   10  | -1.842 | -1.518 | -1.186 | -0.903 | -0.762 | -0.558 | -0.743 | -0.43  | -0.455 | 0.242  | 2.084  |
        |        |  Diff | -1.719 | -1.999 | -1.444 | -1.754 | -1.787 | -1.521 | -1.982 | -1.351 | -1.726 | -1.325 | 0.395  |
        | 2008.0 |   1   | -0.098 | 0.303  | 0.088  | 0.747  | 0.614  | 0.964  | 1.201  | 1.296  | 1.374  | 1.807  | 1.905  |
        |        |   2   | -0.602 | -0.213 | 0.199  | 0.435  | 0.603  | 0.781  | 0.739  | 1.198  | 0.876  | 0.948  |  1.55  |
        |        |   3   | -0.607 | -0.182 | 0.041  | 0.394  |  0.28  | 0.617  | 0.472  | 0.661  | 1.057  | 1.099  | 1.706  |
        |        |   4   | -0.394 | -0.623 | 0.087  | -0.195 | 0.018  | -0.299 | 0.508  |  0.65  | 0.721  | 0.878  | 1.272  |
        |        |   5   | -0.837 | -0.421 | -0.008 | -0.168 | -0.132 | -0.049 | 0.384  | 0.486  | 0.573  | 0.728  | 1.565  |
        |        |   6   | -0.996 | -0.694 | -0.157 | -0.295 | 0.108  | 0.004  | 0.307  |  0.11  | 0.604  |  0.8   | 1.796  |
        |        |   7   | -0.842 | -0.59  | -0.517 | -0.619 | -0.429 | -0.233 | 0.302  | 0.106  | 0.414  | 0.541  | 1.383  |
        |        |   8   | -1.337 | -1.116 | -0.813 | -0.695 | -0.061 | -0.409 | -0.136 | -0.352 | 0.222  |  0.71  | 2.047  |
        |        |   9   | -1.193 | -1.169 | -0.754 | -0.77  | -0.284 | -0.354 | -0.131 | -0.069 | 0.118  | 0.631  | 1.825  |
        |        |   10  | -1.847 | -1.415 | -1.344 | -0.984 | -0.846 | -0.686 | -0.583 | -0.694 | -0.077 | 0.003  |  1.85  |
        |        |  Diff | -1.749 | -1.718 | -1.432 | -1.731 | -1.459 | -1.649 | -1.784 | -1.99  | -1.451 | -1.804 | -0.055 |
        | 2009.0 |   1   | 0.004  |  0.41  | 0.595  | 0.964  | 0.871  | 0.773  | 0.991  | 1.168  | 1.108  | 1.644  |  1.64  |
        |        |   2   | 0.028  | -0.482 | 0.104  | 0.088  | 0.597  | 0.706  | 0.974  | 0.825  |  0.79  | 1.275  | 1.247  |
        |        |   3   | -0.386 | -0.29  | -0.234 |  0.33  | 0.229  | 0.582  | 0.577  | 0.727  | 0.614  | 1.094  | 1.481  |
        |        |   4   | -0.637 | -0.508 | -0.199 | -0.332 | -0.074 | 0.255  | 0.284  | 0.096  | 0.439  | 1.158  | 1.794  |
        |        |   5   | -0.723 | -0.472 | 0.157  | -0.091 |  0.04  | 0.133  | 0.102  |  0.35  | 0.496  | 0.852  | 1.575  |
        |        |   6   | -1.057 | -0.643 | -0.585 | -0.612 | -0.204 | 0.115  |  0.32  |  0.33  | 0.603  | 0.667  | 1.723  |
        |        |   7   | -0.863 | -0.911 | -0.952 | -0.353 | -0.174 | -0.03  | -0.055 | 0.114  | 0.405  | 0.857  | 1.719  |
        |        |   8   | -1.121 | -0.646 | -0.564 | -0.685 | -0.477 | -0.235 | 0.039  | 0.048  | 0.211  | 0.168  |  1.29  |
        |        |   9   | -1.336 | -1.282 | -1.043 | -0.84  | -0.532 | -0.551 | -0.331 | -0.215 | 0.067  | 0.438  | 1.774  |
        |        |   10  | -1.727 | -1.131 | -1.202 | -0.873 | -1.011 | -0.627 |  -0.7  | -0.537 | -0.467 | 0.101  | 1.828  |
        |        |  Diff | -1.731 | -1.541 | -1.797 | -1.837 | -1.882 |  -1.4  | -1.691 | -1.705 | -1.575 | -1.543 | 0.188  |
        | 2010.0 |   1   | 0.398  | 0.232  | 0.344  | 0.271  | 0.476  | 0.836  | 1.238  |  1.36  | 1.652  | 1.797  | 1.399  |
        |        |   2   | -0.324 | 0.262  | 0.119  | 0.492  |  0.53  | 0.695  | 0.681  | 0.693  | 0.927  | 1.509  | 1.833  |
        |        |   3   | -0.438 | -0.564 | -0.147 | 0.335  | 0.688  | 0.283  | 0.359  | 0.804  | 0.797  | 1.339  | 1.777  |
        |        |   4   | -0.582 | -0.318 | -0.246 | -0.045 | 0.266  | 0.529  | 0.367  | 0.539  | 0.559  | 1.126  | 1.708  |
        |        |   5   | -1.221 | -0.754 | -0.498 |  0.03  | -0.185 | -0.02  | 0.324  | 0.695  | 0.711  | 0.862  | 2.084  |
        |        |   6   | -1.001 | -0.697 | -0.32  | -0.427 | -0.047 | 0.352  | 0.018  | 0.329  | 0.554  |  0.62  | 1.621  |
        |        |   7   | -0.948 | -0.484 | -0.402 | -0.555 |  -0.1  | -0.253 | -0.326 | 0.218  | 0.242  | 0.401  | 1.349  |
        |        |   8   | -1.214 | -0.57  | -0.531 | -0.58  | -0.383 | -0.304 | -0.024 | -0.187 | -0.011 | 0.663  | 1.877  |
        |        |   9   | -1.187 | -0.866 | -0.827 | -1.277 | -0.418 | -0.583 | -0.491 | 0.196  | -0.153 | 0.566  | 1.753  |
        |        |   10  | -1.815 | -1.309 | -1.377 | -1.038 | -0.939 | -0.781 | -0.335 | -0.757 | -0.224 | -0.121 | 1.694  |
        |        |  Diff | -2.213 | -1.541 | -1.721 | -1.309 | -1.415 | -1.617 | -1.573 | -2.117 | -1.876 | -1.919 | 0.295  |
        | 2011.0 |   1   | 0.118  | 0.372  | 0.283  | 0.764  | 0.693  | 0.581  | 1.095  | 1.004  | 1.381  |  2.34  | 2.222  |
        |        |   2   | -0.12  | -0.056 | 0.197  | 0.623  | 0.528  | 0.429  | 1.032  | 0.995  | 0.792  | 1.525  | 1.645  |
        |        |   3   | -0.727 | -0.498 | 0.392  | -0.075 | 0.101  |  0.42  | 0.886  | 0.547  | 0.744  | 1.446  | 2.172  |
        |        |   4   | -0.743 | -0.357 | -0.019 | -0.137 | 0.228  | 0.258  | 0.283  | 0.824  | 0.982  | 0.807  |  1.55  |
        |        |   5   | -0.762 | -0.51  | -0.224 | 0.031  | 0.244  | 0.032  | 0.252  | 0.406  | 0.877  | 1.207  | 1.969  |
        |        |   6   | -1.281 | -0.633 | -0.157 | -0.182 | 0.031  | -0.15  | -0.177 | 0.459  | 0.495  | 0.794  | 2.075  |
        |        |   7   | -1.463 | -0.75  | -0.557 | -0.396 | -0.311 | 0.082  | -0.033 | 0.155  | 0.569  | 0.505  | 1.968  |
        |        |   8   | -1.224 | -0.942 | -1.001 | -0.362 | -0.396 | -0.274 | -0.03  | -0.247 | 0.136  | 0.574  | 1.798  |
        |        |   9   | -1.545 | -0.878 | -0.998 | -0.706 | -0.379 | -0.32  | -0.212 | 0.023  | 0.162  | 0.527  | 2.072  |
        |        |   10  | -1.606 | -1.534 | -1.321 | -0.935 | -1.014 | -0.947 | -0.52  | -0.564 | -0.12  | -0.016 |  1.59  |
        |        |  Diff | -1.725 | -1.907 | -1.604 | -1.699 | -1.707 | -1.528 | -1.614 | -1.568 | -1.501 | -2.356 | -0.631 |
        | 2012.0 |   1   | 0.007  | 0.437  | 0.364  |  0.65  | 0.744  | 0.871  | 0.846  | 1.401  | 1.454  | 1.942  | 1.935  |
        |        |   2   | -0.211 | 0.347  | 0.334  | 0.474  | 0.388  | 0.553  | 0.609  |  1.23  | 1.096  | 1.132  | 1.344  |
        |        |   3   | -0.593 | -0.204 | 0.255  | 0.025  | 0.188  | 0.467  | 0.501  | 0.876  |  1.01  | 1.093  | 1.686  |
        |        |   4   |  -0.8  | -0.355 | -0.228 | -0.071 | 0.162  | 0.032  | 0.527  | 0.331  | 0.865  | 0.778  | 1.578  |
        |        |   5   | -0.686 | -0.649 | -0.231 | -0.307 | 0.045  | 0.356  | 0.535  | 0.464  | 0.571  | 1.283  | 1.969  |
        |        |   6   | -0.937 | -0.828 | -0.39  | -0.22  | -0.177 |  0.17  | 0.229  | 0.407  | 0.525  | 0.794  | 1.732  |
        |        |   7   | -1.389 | -0.866 | -0.569 | -0.473 | -0.265 | -0.039 | 0.233  | 0.029  | 0.113  | 0.725  | 2.114  |
        |        |   8   | -0.986 | -0.783 | -0.699 | -0.509 | -0.592 | -0.219 | 0.232  | -0.225 | 0.145  | 0.466  | 1.453  |
        |        |   9   | -1.321 | -1.27  | -0.977 | -0.913 | -0.673 | -0.394 | -0.46  | -0.266 | -0.082 | 0.417  | 1.738  |
        |        |   10  | -1.775 | -1.274 | -1.423 | -1.141 | -1.174 | -0.84  | -0.887 | -0.78  | -0.291 | -0.13  | 1.645  |
        |        |  Diff | -1.782 | -1.711 | -1.787 | -1.79  | -1.917 | -1.711 | -1.732 | -2.181 | -1.745 | -2.072 | -0.29  |
        | 2013.0 |   1   | -0.048 | 0.324  |  0.48  | 0.571  | 0.748  | 0.703  |  1.18  | 1.111  | 1.442  | 1.638  | 1.686  |
        |        |   2   | -0.562 | 0.103  | -0.179 | 0.233  | 0.691  | 0.473  | 0.932  | 1.174  | 0.961  | 1.348  |  1.91  |
        |        |   3   | -0.398 | -0.175 | -0.166 | 0.234  |  0.4   | 0.682  | 0.489  | 0.469  |  0.74  | 1.726  | 2.124  |
        |        |   4   | -0.738 | -0.619 | -0.418 | -0.073 | -0.108 |  0.24  | 0.276  | 0.373  | 0.468  | 1.071  | 1.809  |
        |        |   5   | -0.863 | -0.377 | -0.39  | -0.113 | -0.033 | 0.235  | 0.186  | 0.682  | 0.684  |  0.81  | 1.673  |
        |        |   6   | -1.077 | -0.916 | -0.246 | -0.251 | -0.21  | -0.143 |  0.18  | 0.216  | 0.254  | 0.814  | 1.891  |
        |        |   7   | -1.058 | -0.795 | -0.693 | -0.696 | -0.014 | -0.276 | 0.019  | 0.193  | 0.325  | 0.614  | 1.672  |
        |        |   8   | -1.311 | -1.051 | -0.623 | -0.405 | -0.395 | -0.427 | -0.012 | -0.002 | 0.087  | 0.423  | 1.734  |
        |        |   9   | -1.397 | -1.082 | -0.901 | -0.708 | -0.414 | -0.726 | -0.495 | -0.321 | 0.057  |  0.26  | 1.657  |
        |        |   10  | -1.667 | -1.489 | -1.047 | -1.229 | -1.01  | -0.663 | -0.617 | -0.362 | -0.299 | -0.169 | 1.498  |
        |        |  Diff | -1.619 | -1.813 | -1.527 | -1.799 | -1.758 | -1.366 | -1.796 | -1.473 | -1.741 | -1.807 | -0.187 |
        | 2014.0 |   1   | -0.034 | 0.241  | 0.419  | 0.757  | 0.858  | 0.525  | 1.198  | 1.045  |  1.02  | 1.699  | 1.733  |
        |        |   2   | -0.408 |  0.19  | -0.119 | 0.337  | 0.591  | 0.377  | 0.904  | 0.903  | 0.883  |  1.41  | 1.818  |
        |        |   3   | -0.409 | -0.099 | 0.102  | -0.019 | -0.073 | 0.459  | 0.414  | 0.252  | 0.845  | 1.126  | 1.535  |
        |        |   4   | -0.512 | -0.355 | 0.119  | 0.074  |  0.43  | 0.225  | 0.139  | 0.802  | 0.797  |  0.88  | 1.392  |
        |        |   5   | -1.035 | -0.427 | -0.468 | -0.243 | 0.106  | -0.02  | -0.197 | 0.573  | 0.423  | 0.553  | 1.588  |
        |        |   6   | -0.499 | -0.515 | -0.074 | -0.169 | -0.163 |  -0.3  | -0.116 | 0.761  | 0.447  | 0.751  | 1.249  |
        |        |   7   | -1.057 | -0.442 | -0.697 | -0.293 | -0.076 | -0.288 | -0.158 | -0.161 | 0.508  | 0.707  | 1.764  |
        |        |   8   | -0.999 | -0.741 | -0.487 | -0.661 | -0.527 | 0.079  | 0.098  | -0.015 |  0.3   | 0.619  | 1.618  |
        |        |   9   | -1.416 | -0.84  | -0.994 | -0.848 | -0.619 | -0.711 | -0.267 |  0.02  | -0.047 | 0.582  | 1.998  |
        |        |   10  | -1.844 | -1.085 | -1.056 | -0.792 | -1.005 | -0.656 | -0.896 | -0.854 | 0.081  | 0.034  | 1.878  |
        |        |  Diff | -1.81  | -1.326 | -1.475 | -1.549 | -1.863 | -1.181 | -2.094 | -1.899 | -0.939 | -1.665 | 0.145  |
        | 2015.0 |   1   | -0.102 | 0.472  | 0.353  |  0.39  | 0.892  | 1.007  | 0.944  |  1.42  | 1.429  | 1.755  | 1.857  |
        |        |   2   | -0.089 | 0.226  | 0.475  | 0.355  |  0.74  | 0.802  | 0.882  | 0.944  | 1.093  | 1.243  | 1.331  |
        |        |   3   | -0.556 | -0.277 | 0.071  | -0.042 | 0.272  | 0.659  | 0.171  | 0.743  | 0.942  | 1.302  | 1.858  |
        |        |   4   | -0.575 | -0.359 | -0.347 | 0.069  | 0.077  |  0.37  | 0.366  | 0.779  | 0.873  | 0.916  | 1.491  |
        |        |   5   | -0.694 | -0.451 | -0.051 | -0.235 | -0.243 | 0.179  | 0.247  | 0.142  | 0.391  | 0.725  | 1.418  |
        |        |   6   | -1.202 | -0.162 | -0.674 | -0.078 | -0.235 | -0.062 | 0.205  | 0.275  | 0.616  | 0.691  | 1.893  |
        |        |   7   | -1.446 | -0.757 | -0.59  | -0.406 | -0.359 | 0.124  | 0.106  | 0.449  | 0.247  | 0.399  | 1.845  |
        |        |   8   | -1.208 | -0.749 | -0.583 | -0.524 | -0.355 | -0.074 | -0.097 | -0.111 | -0.251 | 0.833  | 2.041  |
        |        |   9   | -1.29  | -1.283 | -0.671 | -0.799 | -0.689 | 0.015  | 0.018  | -0.311 | 0.217  | 0.559  | 1.849  |
        |        |   10  | -1.864 | -1.381 | -1.103 | -0.893 | -0.727 | -1.004 | -0.913 | -0.132 | -0.472 | 0.081  | 1.944  |
        |        |  Diff | -1.762 | -1.853 | -1.457 | -1.283 | -1.619 | -2.01  | -1.857 | -1.552 |  -1.9  | -1.674 | 0.087  |
        | 2016.0 |   1   | -0.049 | 0.267  | 0.437  | 0.574  | 1.052  | 0.797  | 1.475  | 1.057  | 1.621  | 2.096  | 2.144  |
        |        |   2   | -0.385 | -0.062 | -0.086 | 0.358  | 0.585  | 0.521  | 0.623  | 0.743  | 0.793  | 2.004  | 2.389  |
        |        |   3   | -0.141 | -0.174 | -0.067 | 0.258  | 0.383  | 0.591  |  0.5   | 0.695  | 0.816  | 1.503  | 1.644  |
        |        |   4   | -1.209 | -0.191 | -0.135 | -0.375 |  0.25  | 0.226  | 0.472  |  0.19  | 0.625  | 1.256  | 2.466  |
        |        |   5   | -0.779 |  -0.3  | -0.227 | -0.264 | 0.101  | 0.066  |  0.08  | 0.286  | 0.469  | 0.917  | 1.696  |
        |        |   6   | -0.917 |  -0.6  | -0.65  | -0.345 | -0.301 | 0.188  | 0.083  | -0.182 | 0.615  | 0.608  | 1.526  |
        |        |   7   | -1.478 | -0.925 | -0.501 | -0.582 | -0.114 | -0.603 | -0.066 | -0.055 | 0.614  | 0.535  | 2.013  |
        |        |   8   | -1.171 | -0.774 | -0.448 | -0.626 | -0.555 | -0.03  | -0.336 | -0.17  | 0.222  | 0.496  | 1.666  |
        |        |   9   | -1.861 | -0.947 | -0.966 | -0.679 | -0.868 | -0.535 | -0.165 | -0.605 | 0.078  |  0.37  | 2.232  |
        |        |   10  | -1.829 | -1.318 | -0.905 | -1.276 | -0.799 | -0.802 | -0.785 | -0.527 | -0.28  | -0.392 | 1.437  |
        |        |  Diff | -1.78  | -1.584 | -1.342 | -1.85  | -1.851 | -1.599 | -2.261 | -1.583 | -1.901 | -2.488 | -0.707 |
        | 2017.0 |   1   | 0.011  | 0.448  | 0.522  |  0.26  | 0.508  | 1.239  | 1.196  | 0.836  | 1.442  | 1.491  | 1.479  |
        |        |   2   | -0.55  | 0.057  | 0.314  | 0.301  | 0.169  |  0.32  | 0.844  | 0.798  |  1.21  | 1.167  | 1.716  |
        |        |   3   | -0.68  | -0.113 | -0.196 | -0.14  | 0.102  |  0.14  | 0.175  | 0.248  | 0.639  |  1.27  |  1.95  |
        |        |   4   | -0.374 | -0.539 | -0.147 | 0.189  | 0.187  | 0.574  | 0.384  | 0.713  | 0.945  | 1.278  | 1.652  |
        |        |   5   | -0.881 | -0.352 | -0.322 | -0.16  | 0.276  | -0.108 | 0.277  | 0.097  | 0.836  | 0.817  | 1.699  |
        |        |   6   | -0.874 | -0.436 | -0.232 | -0.06  | -0.278 | 0.051  | 0.179  | 0.216  | 0.568  | 0.901  | 1.775  |
        |        |   7   | -1.164 | -0.722 | -0.315 | -0.685 | -0.425 | -0.446 | 0.235  | 0.311  | 0.389  | 0.842  | 2.006  |
        |        |   8   | -1.241 | -0.877 | -0.857 | -0.97  | -0.498 | -0.112 | 0.059  | -0.083 |  0.05  | 0.389  |  1.63  |
        |        |   9   | -1.399 | -1.35  | -1.142 | -0.936 | -0.252 | -0.369 | -0.156 | -0.276 | 0.114  | 0.575  | 1.974  |
        |        |   10  | -1.788 |  -1.7  | -1.365 | -0.75  | -0.811 | -0.979 | -0.596 | -0.708 | -0.339 | 0.011  | 1.799  |
        |        |  Diff | -1.799 | -2.148 | -1.886 | -1.01  | -1.319 | -2.218 | -1.792 | -1.544 | -1.781 | -1.48  |  0.32  |
        | 2018.0 |   1   | 0.008  | 0.432  | 0.384  | 0.844  | 0.812  | 0.936  | 1.029  |  1.18  | 1.101  | 1.392  | 1.384  |
        |        |   2   | -0.468 | 0.012  | 0.307  | 0.623  | 0.365  |  0.51  | 0.678  | 0.867  | 0.985  | 1.551  | 2.019  |
        |        |   3   | -0.361 | -0.168 | -0.177 |  0.34  | 0.501  | 0.448  | 0.516  | 0.664  | 0.963  |  1.04  | 1.401  |
        |        |   4   | -0.615 | -0.416 | -0.068 | -0.069 | 0.191  | -0.086 | 0.502  | 0.559  | 0.686  | 0.978  | 1.593  |
        |        |   5   | -0.837 | -0.41  | -0.226 | -0.388 | 0.277  | 0.209  | -0.095 |  0.29  |  0.83  |  1.08  | 1.917  |
        |        |   6   | -1.293 | -0.31  | -0.482 | -0.24  | 0.118  | 0.133  | 0.139  | 0.027  | 0.563  | 0.959  | 2.252  |
        |        |   7   |  -1.1  | -0.543 | -0.849 | -0.144 | -0.267 | 0.051  | 0.122  |  0.21  | 0.074  | 0.187  | 1.286  |
        |        |   8   | -1.244 | -0.855 | -0.769 | -0.346 | -0.521 | 0.072  | -0.271 | 0.512  | 0.116  | 0.444  | 1.688  |
        |        |   9   | -1.498 | -1.175 | -0.727 | -1.168 | -0.162 | -0.464 |  0.05  |  0.21  | 0.266  | 0.416  | 1.914  |
        |        |   10  |  -1.6  | -1.337 | -1.39  | -0.975 | -1.115 | -0.983 | -0.877 | -0.831 | -0.031 | -0.202 | 1.398  |
        |        |  Diff | -1.608 | -1.769 | -1.773 | -1.819 | -1.927 | -1.92  | -1.906 | -2.011 | -1.132 | -1.594 | 0.014  |
        | 2019.0 |   1   | -0.44  | 0.475  | 0.925  |  0.48  | 1.048  | 1.155  | 1.371  | 1.053  | 1.226  | 1.841  | 2.281  |
        |        |   2   | -0.204 |  0.26  | 0.102  | 0.299  | 0.613  | 0.416  | 0.424  | 0.775  | 1.328  | 1.262  | 1.466  |
        |        |   3   | -0.532 | -0.638 | -0.119 | 0.308  | 0.423  | 0.775  | 0.427  | 0.517  |  0.85  | 1.187  |  1.72  |
        |        |   4   | -0.93  | -0.281 |  0.02  | 0.134  | 0.224  |  0.44  | 0.321  | 0.863  | 0.927  | 1.049  | 1.979  |
        |        |   5   | -1.019 | -0.393 |  -0.3  | 0.244  | 0.094  | -0.029 | 0.351  | 0.646  | 0.458  | 0.969  | 1.988  |
        |        |   6   | -0.765 | -0.487 | -0.181 | -0.323 | -0.04  | -0.275 | -0.429 | 0.413  | 0.534  |  0.63  | 1.395  |
        |        |   7   | -0.918 | -0.659 | -0.149 | -0.239 | -0.002 | 0.076  | 0.273  | 0.309  | 0.355  | 0.676  | 1.594  |
        |        |   8   | -1.537 | -0.798 | -0.79  | -0.386 | -0.164 | -0.058 | -0.127 | 0.143  | 0.275  | 0.667  | 2.205  |
        |        |   9   | -1.312 | -1.003 | -0.961 | -0.559 | -0.569 | -0.486 | -0.378 | 0.221  | -0.05  | 0.225  | 1.536  |
        |        |   10  | -1.497 | -1.629 | -1.01  | -1.305 | -0.99  | -0.801 | -0.715 | -0.348 | -0.502 | -0.066 | 1.431  |
        |        |  Diff | -1.057 | -2.103 | -1.934 | -1.785 | -2.038 | -1.957 | -2.086 | -1.401 | -1.727 | -1.907 | -0.85  |
        | 2020.0 |   1   |  0.23  | 0.022  | 0.614  | 0.927  | 0.523  | 1.044  | 0.959  | 1.124  | 1.273  | 1.867  | 1.637  |
        |        |   2   | -0.196 | -0.132 | 0.246  |  0.41  | 0.343  | 0.486  | 0.765  | 0.851  | 1.197  | 1.628  | 1.824  |
        |        |   3   | -0.435 | -0.397 | -0.089 | 0.152  | -0.056 |  0.45  | 0.429  |  0.74  | 1.072  | 1.295  |  1.73  |
        |        |   4   | -0.823 | -0.261 | 0.059  |  0.19  | -0.096 | 0.364  | 0.348  | 0.438  | 0.645  | 0.952  | 1.775  |
        |        |   5   | -0.614 | -0.623 | -0.055 | -0.146 | -0.07  |  0.04  | 0.166  | 0.375  | 0.511  | 0.746  |  1.36  |
        |        |   6   | -0.933 | -0.574 | -0.774 | -0.133 | -0.194 | -0.236 | 0.176  | 0.301  | 0.547  | 0.627  |  1.56  |
        |        |   7   | -1.388 | -0.506 | -0.445 | -0.479 | -0.142 | -0.575 |  0.27  |  0.09  | 0.466  | 0.863  | 2.251  |
        |        |   8   | -0.719 | -1.014 | -1.063 | -0.313 | -0.218 | -0.042 | 0.194  | 0.205  | 0.223  |  0.86  | 1.578  |
        |        |   9   | -1.729 | -1.054 | -0.891 | -1.075 | -0.912 | -0.011 | -0.14  | 0.048  | -0.149 | 0.479  | 2.208  |
        |        |   10  | -1.749 | -1.711 | -1.213 | -1.071 | -0.96  | -1.143 | -0.949 | -0.404 | -0.356 | -0.342 | 1.407  |
        |        |  Diff | -1.979 | -1.733 | -1.827 | -1.998 | -1.483 | -2.187 | -1.909 | -1.528 | -1.629 | -2.209 | -0.23  |
        +--------+-------+--------+--------+--------+--------+--------+--------+--------+--------+--------+--------+--------+
        
        +-------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+--------+
        | Group |    1    |    2    |    3    |    4    |    5    |    6    |    7    |    8    |    9    |    10   |  Diff  |
        +-------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+--------+
        |   1   |  0.001  |  0.404  |  0.582  |  0.692  |  0.813  |  0.926  |  0.991  |  1.186  |  1.436  |  1.735  | 1.734  |
        |       |  0.023  |  10.409 |  12.521 |  12.719 |  21.586 |  28.711 |  25.342 |  30.539 |  40.084 |  50.195 | 31.306 |
        |   2   |  -0.329 |  -0.065 |  0.156  |  0.261  |  0.429  |   0.51  |  0.709  |  0.908  |  1.065  |   1.39  | 1.719  |
        |       |  -5.829 |  -1.779 |  3.245  |  5.842  |  10.537 |  11.431 |  14.92  |  21.006 |  24.647 |  33.373 | 23.585 |
        |   3   |  -0.538 |  -0.193 |  -0.021 |  0.183  |   0.3   |  0.453  |  0.533  |  0.646  |  0.851  |  1.242  |  1.78  |
        |       | -13.214 |  -5.684 |  -0.454 |   6.47  |  12.997 |  11.485 |  13.978 |  17.107 |  27.073 |  25.137 | 21.695 |
        |   4   |  -0.691 |  -0.332 |  -0.065 |  0.055  |  0.142  |  0.313  |  0.369  |  0.503  |  0.736  |  1.034  | 1.725  |
        |       | -20.494 | -10.105 |  -1.874 |  1.247  |  3.242  |  9.129  |  9.121  |  12.057 |  15.833 |  24.654 | 26.702 |
        |   5   |  -0.824 |  -0.439 |  -0.292 |  -0.184 |  0.028  |  0.137  |  0.242  |  0.407  |  0.639  |   0.93  | 1.755  |
        |       | -19.074 |  -8.937 |  -8.189 |  -3.622 |  0.598  |  3.632  |  5.026  |   9.74  |  14.213 |  19.002 | 29.906 |
        |   6   |  -0.999 |  -0.557 |  -0.383 |  -0.256 |  -0.102 |  0.019  |  0.078  |  0.255  |  0.506  |  0.711  |  1.71  |
        |       |  -24.83 | -12.412 |  -9.966 |  -8.078 |  -2.96  |  0.627  |  1.588  |  4.753  |  13.01  |  16.397 | 24.858 |
        |   7   |  -1.043 |  -0.701 |  -0.57  |  -0.472 |  -0.349 |  -0.163 |  -0.009 |  0.234  |  0.408  |  0.674  | 1.717  |
        |       |  -22.39 | -18.651 | -15.173 |  -7.718 |  -8.648 |  -4.423 |  -0.261 |  5.529  |  12.285 |  13.125 | 28.054 |
        |   8   |  -1.168 |  -0.845 |  -0.703 |  -0.513 |  -0.445 |  -0.338 |  -0.19  |  -0.023 |  0.114  |  0.617  | 1.786  |
        |       | -37.234 | -18.681 | -14.622 | -11.528 | -11.526 |  -7.705 |  -4.063 |  -0.463 |  3.978  |  19.966 | 35.299 |
        |   9   |  -1.419 |  -1.109 |  -0.998 |  -0.789 |  -0.593 |  -0.429 |  -0.291 |  -0.197 |  0.015  |  0.423  | 1.843  |
        |       | -36.366 | -29.646 | -17.698 | -23.854 | -18.852 | -11.099 |  -7.294 |  -4.962 |  0.321  |  10.342 | 43.131 |
        |   10  |  -1.796 |  -1.401 |  -1.233 |  -1.072 |  -0.943 |  -0.807 |  -0.669 |  -0.51  |  -0.271 |  -0.083 | 1.714  |
        |       | -42.272 | -33.192 | -30.341 |  -35.35 | -22.034 | -17.648 | -14.833 | -10.624 |  -6.58  |  -1.658 | 26.054 |
        |  Diff |  -1.798 |  -1.804 |  -1.815 |  -1.764 |  -1.756 |  -1.734 |  -1.66  |  -1.696 |  -1.707 |  -1.818 | -0.02  |
        |       | -24.606 | -25.698 | -37.732 |  -26.27 | -33.109 | -28.342 | -30.244 | -24.987 | -26.689 | -30.782 | -0.234 |
        +-------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+--------+
        +-------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+--------+
        | Group |    1    |    2    |    3    |    4    |    5    |    6    |    7    |    8    |    9    |    10   |  Diff  |
        +-------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+--------+
        |   1   |  0.001  |  0.404  |  0.582  |  0.692  |  0.813  |  0.926  |  0.991  |  1.186  |  1.436  |  1.735  | 1.734  |
        |       |  0.023  |  10.409 |  12.521 |  12.719 |  21.586 |  28.711 |  25.342 |  30.539 |  40.084 |  50.195 | 31.306 |
        | alpha |  0.001  |  0.393  |  0.588  |  0.695  |  0.814  |  0.928  |  0.994  |  1.182  |  1.431  |   1.73  |  1.73  |
        |       |  0.023  |  15.172 |  16.757 |  18.333 |  25.684 |  52.337 |  53.463 |  33.971 |  32.144 | 120.675 | 64.353 |
        |   2   |  -0.329 |  -0.065 |  0.156  |  0.261  |  0.429  |   0.51  |  0.709  |  0.908  |  1.065  |   1.39  | 1.719  |
        |       |  -5.829 |  -1.779 |  3.245  |  5.842  |  10.537 |  11.431 |  14.92  |  21.006 |  24.647 |  33.373 | 23.585 |
        | alpha |  -0.328 |  -0.067 |  0.168  |  0.257  |  0.433  |  0.526  |  0.708  |   0.9   |  1.064  |  1.385  | 1.712  |
        |       |  -6.882 |  -4.699 |  5.144  |  5.719  |  9.258  |  16.709 |  46.722 |  30.124 |  48.315 |   54.9  | 33.901 |
        |   3   |  -0.538 |  -0.193 |  -0.021 |  0.183  |   0.3   |  0.453  |  0.533  |  0.646  |  0.851  |  1.242  |  1.78  |
        |       | -13.214 |  -5.684 |  -0.454 |   6.47  |  12.997 |  11.485 |  13.978 |  17.107 |  27.073 |  25.137 | 21.695 |
        | alpha |  -0.532 |  -0.187 |  -0.021 |  0.179  |  0.299  |   0.45  |  0.523  |  0.654  |  0.847  |  1.227  |  1.76  |
        |       | -15.226 | -10.095 |  -0.443 |  12.856 |  25.329 |  18.972 |  12.407 |  18.425 |  31.055 |  24.734 | 21.164 |
        |   4   |  -0.691 |  -0.332 |  -0.065 |  0.055  |  0.142  |  0.313  |  0.369  |  0.503  |  0.736  |  1.034  | 1.725  |
        |       | -20.494 | -10.105 |  -1.874 |  1.247  |  3.242  |  9.129  |  9.121  |  12.057 |  15.833 |  24.654 | 26.702 |
        | alpha |  -0.695 |  -0.335 |  -0.063 |  0.049  |  0.146  |  0.317  |  0.365  |  0.508  |  0.749  |  1.039  | 1.735  |
        |       | -34.692 |  -8.032 |  -2.253 |  1.587  |  4.982  |  8.723  |  19.073 |  18.714 |  10.766 |  59.53  | 53.763 |
        |   5   |  -0.824 |  -0.439 |  -0.292 |  -0.184 |  0.028  |  0.137  |  0.242  |  0.407  |  0.639  |   0.93  | 1.755  |
        |       | -19.074 |  -8.937 |  -8.189 |  -3.622 |  0.598  |  3.632  |  5.026  |   9.74  |  14.213 |  19.002 | 29.906 |
        | alpha |  -0.828 |  -0.441 |  -0.289 |  -0.196 |  0.034  |  0.141  |  0.242  |  0.402  |  0.633  |  0.924  | 1.752  |
        |       |  -50.38 |  -17.93 | -14.905 |  -6.066 |  1.608  |  5.277  |   6.61  |  16.518 |  12.557 |  21.612 | 44.153 |
        |   6   |  -0.999 |  -0.557 |  -0.383 |  -0.256 |  -0.102 |  0.019  |  0.078  |  0.255  |  0.506  |  0.711  |  1.71  |
        |       |  -24.83 | -12.412 |  -9.966 |  -8.078 |  -2.96  |  0.627  |  1.588  |  4.753  |  13.01  |  16.397 | 24.858 |
        | alpha |  -1.001 |  -0.553 |  -0.385 |  -0.264 |  -0.106 |  0.026  |  0.094  |  0.266  |  0.507  |  0.717  | 1.718  |
        |       | -69.664 | -17.632 |  -7.841 | -11.507 |  -4.864 |  1.416  |  2.265  |  8.531  |  13.376 |  32.65  | 60.333 |
        |   7   |  -1.043 |  -0.701 |  -0.57  |  -0.472 |  -0.349 |  -0.163 |  -0.009 |  0.234  |  0.408  |  0.674  | 1.717  |
        |       |  -22.39 | -18.651 | -15.173 |  -7.718 |  -8.648 |  -4.423 |  -0.261 |  5.529  |  12.285 |  13.125 | 28.054 |
        | alpha |  -1.036 |  -0.695 |  -0.58  |  -0.467 |  -0.348 |  -0.161 |  -0.013 |  0.236  |  0.413  |   0.68  | 1.716  |
        |       | -15.761 | -26.285 | -30.789 |  -9.662 | -21.064 |  -10.49 |  -0.54  |  11.513 |  13.003 |  16.793 | 26.971 |
        |   8   |  -1.168 |  -0.845 |  -0.703 |  -0.513 |  -0.445 |  -0.338 |  -0.19  |  -0.023 |  0.114  |  0.617  | 1.786  |
        |       | -37.234 | -18.681 | -14.622 | -11.528 | -11.526 |  -7.705 |  -4.063 |  -0.463 |  3.978  |  19.966 | 35.299 |
        | alpha |  -1.165 |  -0.866 |  -0.695 |  -0.521 |  -0.441 |  -0.334 |  -0.192 |  -0.014 |  0.117  |  0.612  | 1.777  |
        |       | -68.442 | -47.472 | -16.293 | -18.214 | -13.313 | -20.344 |  -6.396 |  -0.495 |  8.054  |  37.427 | 74.827 |
        |   9   |  -1.419 |  -1.109 |  -0.998 |  -0.789 |  -0.593 |  -0.429 |  -0.291 |  -0.197 |  0.015  |  0.423  | 1.843  |
        |       | -36.366 | -29.646 | -17.698 | -23.854 | -18.852 | -11.099 |  -7.294 |  -4.962 |  0.321  |  10.342 | 43.131 |
        | alpha |  -1.421 |  -1.101 |  -0.999 |  -0.796 |  -0.586 |  -0.43  |  -0.297 |  -0.208 |  0.008  |  0.422  | 1.843  |
        |       | -58.572 | -36.794 | -24.981 | -57.723 | -40.075 | -22.395 | -11.038 |  -8.076 |  0.188  |  12.12  | 79.95  |
        |   10  |  -1.796 |  -1.401 |  -1.233 |  -1.072 |  -0.943 |  -0.807 |  -0.669 |  -0.51  |  -0.271 |  -0.083 | 1.714  |
        |       | -42.272 | -33.192 | -30.341 |  -35.35 | -22.034 | -17.648 | -14.833 | -10.624 |  -6.58  |  -1.658 | 26.054 |
        | alpha |  -1.792 |  -1.396 |  -1.235 |  -1.075 |  -0.934 |  -0.802 |  -0.676 |  -0.504 |  -0.27  |  -0.078 | 1.714  |
        |       | -94.079 | -29.367 | -24.322 | -49.929 | -18.363 | -22.423 | -17.282 | -12.055 |  -6.679 |  -2.087 | 64.666 |
        |  Diff |  -1.798 |  -1.804 |  -1.815 |  -1.764 |  -1.756 |  -1.734 |  -1.66  |  -1.696 |  -1.707 |  -1.818 | -0.02  |
        |       | -24.606 | -25.698 | -37.732 |  -26.27 | -33.109 | -28.342 | -30.244 | -24.987 | -26.689 | -30.782 | -0.234 |
        | alpha |  -1.792 |  -1.789 |  -1.823 |  -1.77  |  -1.749 |  -1.73  |  -1.671 |  -1.686 |   -1.7  |  -1.808 | -0.016 |
        |       | -68.889 | -25.304 | -35.662 | -49.993 | -23.566 | -35.304 | -39.998 | -48.493 | -20.813 | -57.675 | -0.471 |
        +-------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+--------+
        
        ========================================================================
        # conditional (dependent-sort) bivariate portfolio
        # test function average_by_time with conditional sorting
        exper_con = bi(sample, 9, maxlag=12)
        average_group_time = exper_con.average_by_time(conditional=True)
        print('average_group_time: \n', average_group_time)
        print('shape of average_group_time: \n', np.shape(average_group_time))
        # test function difference
        result = exper_con.difference(average_group_time)
        print('result :\n', result)
        print('difference matrix :\n', np.shape(result))
        # test function summary_and_test
        average, ttest = exper_con.summary_and_test(conditional=True)
        print('average :\n', average)
        print(' shape of average :', np.shape(average))
        print('ttest :\n', ttest)
        print('shape of ttest :', np.shape(ttest))
        # test function print_summary_by_time
        exper_con.print_summary_by_time()
        # test function print_summary
        exper_con.print_summary()
        =========================================================================
        average_group_time: 
         [[[-0.038 -0.166 -0.106 ... 0.159 -0.269 -0.153]
          [0.646 0.607 0.207 ... 0.646 0.522 -0.120]
          [0.562 0.595 0.529 ... 0.522 0.629 0.842]
          ...
          [1.532 1.257 1.386 ... 1.063 1.323 1.445]
          [1.433 1.618 1.642 ... 1.232 1.724 1.240]
          [1.646 1.492 1.715 ... 2.065 1.609 1.576]]
        
         [[-0.433 -0.127 -0.365 ... -0.312 -0.494 -0.538]
          [0.192 0.154 -0.039 ... -0.105 -0.220 -0.270]
          [0.429 0.335 -0.084 ... -0.161 0.102 0.251]
          ...
          [1.122 0.987 0.814 ... 0.751 0.682 0.701]
          [0.960 0.856 1.060 ... 0.965 0.826 1.156]
          [1.539 1.672 1.229 ... 1.255 1.344 1.735]]
        
         [[-0.557 -0.607 -1.008 ... -0.587 -0.488 -0.685]
          [0.035 -0.231 0.071 ... -0.228 -0.036 -0.438]
          [-0.087 0.027 -0.119 ... 0.156 -0.255 -0.007]
          ...
          [0.653 0.694 0.489 ... 0.783 0.667 0.883]
          [0.840 0.749 0.810 ... 0.521 0.976 0.843]
          [1.274 1.040 1.282 ... 1.386 0.971 1.234]]
        
         ...
        
         [[-1.452 -1.150 -1.096 ... -1.098 -1.268 -1.428]
          [-0.720 -0.787 -0.790 ... -1.104 -1.268 -0.895]
          [-0.509 -0.782 -1.040 ... -0.461 -0.857 -0.520]
          ...
          [0.020 -0.202 -0.355 ... 0.117 0.083 -0.006]
          [0.088 0.343 0.038 ... 0.158 0.171 0.327]
          [0.339 0.452 0.345 ... 0.401 0.555 0.581]]
        
         [[-1.216 -1.196 -1.604 ... -1.820 -1.136 -1.514]
          [-1.210 -1.053 -0.910 ... -1.118 -0.663 -1.092]
          [-0.780 -0.610 -0.867 ... -0.967 -0.940 -1.062]
          ...
          [-0.344 -0.105 -0.444 ... -0.542 -0.119 -0.295]
          [0.138 0.346 -0.113 ... 0.010 -0.221 0.080]
          [0.415 0.361 0.335 ... 0.362 0.132 0.715]]
        
         [[-1.876 -1.826 -1.527 ... -1.676 -2.018 -1.795]
          [-1.439 -1.495 -1.771 ... -1.645 -1.366 -1.513]
          [-1.214 -1.419 -0.954 ... -1.269 -1.173 -1.191]
          ...
          [-0.186 -0.994 -0.262 ... -0.384 -0.375 -0.574]
          [-0.496 -0.515 -0.625 ... -0.037 -0.381 -0.155]
          [-0.384 -0.406 -0.220 ... -0.068 0.175 -0.261]]]
        shape of average_group_time: 
         (10, 10, 20)
        result :
         [[[-0.038 -0.166 -0.106 ... 0.159 -0.269 -0.153]
          [0.646 0.607 0.207 ... 0.646 0.522 -0.120]
          [0.562 0.595 0.529 ... 0.522 0.629 0.842]
          ...
          [1.433 1.618 1.642 ... 1.232 1.724 1.240]
          [1.646 1.492 1.715 ... 2.065 1.609 1.576]
          [1.685 1.658 1.821 ... 1.906 1.878 1.730]]
        
         [[-0.433 -0.127 -0.365 ... -0.312 -0.494 -0.538]
          [0.192 0.154 -0.039 ... -0.105 -0.220 -0.270]
          [0.429 0.335 -0.084 ... -0.161 0.102 0.251]
          ...
          [0.960 0.856 1.060 ... 0.965 0.826 1.156]
          [1.539 1.672 1.229 ... 1.255 1.344 1.735]
          [1.972 1.799 1.594 ... 1.567 1.838 2.273]]
        
         [[-0.557 -0.607 -1.008 ... -0.587 -0.488 -0.685]
          [0.035 -0.231 0.071 ... -0.228 -0.036 -0.438]
          [-0.087 0.027 -0.119 ... 0.156 -0.255 -0.007]
          ...
          [0.840 0.749 0.810 ... 0.521 0.976 0.843]
          [1.274 1.040 1.282 ... 1.386 0.971 1.234]
          [1.831 1.647 2.291 ... 1.973 1.458 1.919]]
        
         ...
        
         [[-1.216 -1.196 -1.604 ... -1.820 -1.136 -1.514]
          [-1.210 -1.053 -0.910 ... -1.118 -0.663 -1.092]
          [-0.780 -0.610 -0.867 ... -0.967 -0.940 -1.062]
          ...
          [0.138 0.346 -0.113 ... 0.010 -0.221 0.080]
          [0.415 0.361 0.335 ... 0.362 0.132 0.715]
          [1.631 1.557 1.938 ... 2.182 1.268 2.229]]
        
         [[-1.876 -1.826 -1.527 ... -1.676 -2.018 -1.795]
          [-1.439 -1.495 -1.771 ... -1.645 -1.366 -1.513]
          [-1.214 -1.419 -0.954 ... -1.269 -1.173 -1.191]
          ...
          [-0.496 -0.515 -0.625 ... -0.037 -0.381 -0.155]
          [-0.384 -0.406 -0.220 ... -0.068 0.175 -0.261]
          [1.493 1.420 1.307 ... 1.609 2.194 1.534]]
        
         [[-1.838 -1.660 -1.421 ... -1.835 -1.749 -1.642]
          [-2.085 -2.102 -1.978 ... -2.291 -1.888 -1.393]
          [-1.776 -2.014 -1.483 ... -1.791 -1.801 -2.033]
          ...
          [-1.929 -2.133 -2.268 ... -1.268 -2.105 -1.395]
          [-2.030 -1.898 -1.934 ... -2.133 -1.433 -1.837]
          [-0.192 -0.239 -0.514 ... -0.298 0.316 -0.196]]]
        difference matrix :
         (11, 11, 20)
        average :
         [[-0.038 0.370 0.505 0.619 0.796 0.939 1.107 1.248 1.409 1.706 1.744]
         [-0.350 -0.051 0.196 0.240 0.400 0.583 0.697 0.858 0.994 1.397 1.746]
         [-0.569 -0.187 -0.004 0.144 0.285 0.358 0.533 0.688 0.807 1.214 1.782]
         [-0.711 -0.364 -0.128 -0.024 0.173 0.294 0.403 0.515 0.741 1.070 1.782]
         [-0.792 -0.508 -0.339 -0.126 0.020 0.102 0.254 0.383 0.643 0.941 1.733]
         [-0.974 -0.573 -0.388 -0.224 -0.131 0.039 0.147 0.216 0.495 0.798 1.772]
         [-1.058 -0.738 -0.517 -0.375 -0.299 -0.152 0.071 0.168 0.293 0.728 1.786]
         [-1.190 -0.868 -0.695 -0.507 -0.386 -0.265 -0.122 0.044 0.177 0.541
          1.731]
         [-1.377 -1.038 -0.951 -0.718 -0.540 -0.506 -0.383 -0.218 -0.016 0.337
          1.714]
         [-1.776 -1.409 -1.193 -1.103 -0.996 -0.872 -0.639 -0.468 -0.369 -0.067
          1.709]
         [-1.738 -1.779 -1.698 -1.722 -1.791 -1.811 -1.747 -1.716 -1.778 -1.773
          -0.035]]
         shape of average : (11, 11)
        ttest :
         Ttest_1sampResult(statistic=array([[-0.766, 8.116, 11.277, 10.899, 22.006, 18.667, 24.400, 24.478,
                35.421, 30.592, 23.568],
               [-10.661, -1.511, 5.209, 5.210, 7.975, 12.688, 17.417, 20.672,
                27.177, 33.248, 31.875],
               [-12.848, -6.098, -0.127, 2.526, 7.683, 9.839, 11.904, 21.529,
                24.455, 28.413, 28.988],
               [-17.377, -8.674, -2.496, -0.586, 5.316, 7.416, 9.065, 10.686,
                15.126, 19.900, 25.131],
               [-14.475, -14.309, -8.058, -3.938, 0.536, 2.572, 6.850, 8.335,
                18.277, 21.266, 25.570],
               [-19.275, -11.824, -8.954, -4.821, -3.072, 0.851, 2.700, 6.193,
                9.897, 18.036, 27.111],
               [-24.154, -18.151, -10.755, -10.206, -6.406, -2.871, 1.819, 3.415,
                8.517, 15.138, 30.275],
               [-27.714, -20.058, -16.177, -11.078, -9.717, -8.066, -4.597,
                1.273, 4.322, 12.620, 28.072],
               [-25.313, -22.253, -27.550, -23.118, -12.510, -11.787, -9.905,
                -6.951, -0.301, 7.280, 20.847],
               [-38.837, -37.726, -28.888, -25.884, -23.686, -21.055, -13.418,
                -9.744, -6.285, -1.344, 25.915],
               [-26.896, -28.445, -25.380, -22.034, -32.129, -24.922, -26.006,
                -24.258, -21.094, -25.287, -0.396]]), pvalue=array([[0.453, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000,
                0.000, 0.000],
               [0.000, 0.147, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000,
                0.000, 0.000],
               [0.000, 0.000, 0.900, 0.021, 0.000, 0.000, 0.000, 0.000, 0.000,
                0.000, 0.000],
               [0.000, 0.000, 0.022, 0.565, 0.000, 0.000, 0.000, 0.000, 0.000,
                0.000, 0.000],
               [0.000, 0.000, 0.000, 0.001, 0.598, 0.019, 0.000, 0.000, 0.000,
                0.000, 0.000],
               [0.000, 0.000, 0.000, 0.000, 0.006, 0.405, 0.014, 0.000, 0.000,
                0.000, 0.000],
               [0.000, 0.000, 0.000, 0.000, 0.000, 0.010, 0.085, 0.003, 0.000,
                0.000, 0.000],
               [0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.218, 0.000,
                0.000, 0.000],
               [0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.767,
                0.000, 0.000],
               [0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000,
                0.195, 0.000],
               [0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000,
                0.000, 0.697]]))
        shape of ttest : (2, 11, 11)
        +--------+-------+--------+--------+--------+--------+--------+--------+--------+--------+--------+--------+--------+
        |  Time  | Group |   1    |   2    |   3    |   4    |   5    |   6    |   7    |   8    |   9    |   10   |  Diff  |
        +--------+-------+--------+--------+--------+--------+--------+--------+--------+--------+--------+--------+--------+
        | 2001.0 |   1   | -0.038 | 0.646  | 0.562  | 0.805  | 0.737  | 0.861  | 0.813  | 1.532  | 1.433  | 1.646  | 1.685  |
        |        |   2   | -0.433 | 0.192  | 0.429  | 0.363  | 0.545  |  0.76  | 0.735  | 1.122  |  0.96  | 1.539  | 1.972  |
        |        |   3   | -0.557 | 0.035  | -0.087 | 0.471  | 0.255  | 0.258  | 0.488  | 0.653  |  0.84  | 1.274  | 1.831  |
        |        |   4   | -0.773 | -0.504 | -0.37  | -0.107 | 0.381  | 0.296  | 0.252  | 0.391  | 0.331  | 1.199  | 1.972  |
        |        |   5   | -0.94  | -0.509 | -0.54  | -0.373 | -0.106 | 0.187  | 0.312  | 0.408  | 0.886  | 0.532  | 1.472  |
        |        |   6   | -1.166 | -0.593 | -0.435 | -0.41  | -0.297 | 0.119  | 0.174  | 0.384  | 0.908  | 0.862  | 2.028  |
        |        |   7   | -0.953 | -0.845 | -0.453 | -0.259 | -0.11  |  0.26  | 0.097  | 0.217  | 0.404  | 0.926  | 1.878  |
        |        |   8   | -1.452 | -0.72  | -0.509 |  -0.6  | -0.241 | -0.574 | 0.099  |  0.02  | 0.088  | 0.339  | 1.791  |
        |        |   9   | -1.216 | -1.21  | -0.78  | -0.765 | -0.574 | -0.683 | -0.192 | -0.344 | 0.138  | 0.415  | 1.631  |
        |        |   10  | -1.876 | -1.439 | -1.214 | -1.096 | -0.982 | -0.982 | -0.296 | -0.186 | -0.496 | -0.384 | 1.493  |
        |        |  Diff | -1.838 | -2.085 | -1.776 | -1.902 | -1.719 | -1.843 | -1.109 | -1.718 | -1.929 | -2.03  | -0.192 |
        | 2002.0 |   1   | -0.166 | 0.607  | 0.595  | 0.488  | 0.649  | 0.945  | 1.309  | 1.257  | 1.618  | 1.492  | 1.658  |
        |        |   2   | -0.127 | 0.154  | 0.335  | 0.274  | 0.444  | 0.536  | 0.887  | 0.987  | 0.856  | 1.672  | 1.799  |
        |        |   3   | -0.607 | -0.231 | 0.027  | -0.037 | 0.085  | 0.279  | 0.542  | 0.694  | 0.749  |  1.04  | 1.647  |
        |        |   4   | -0.583 | -0.323 | -0.351 | -0.073 | 0.325  | 0.441  | 0.627  | 0.539  | 0.799  | 1.103  | 1.686  |
        |        |   5   | -1.041 | -0.315 | -0.363 | -0.144 | 0.081  | -0.229 | -0.014 | -0.076 |  0.66  | 0.937  | 1.978  |
        |        |   6   | -1.179 | -0.545 | -0.258 | -0.622 | -0.436 | -0.147 | -0.092 | -0.067 | 0.511  | 0.857  | 2.036  |
        |        |   7   | -1.222 | -0.731 | -0.132 | -0.202 | -0.371 | 0.163  |  -0.1  | 0.172  | 0.172  | 0.779  | 2.001  |
        |        |   8   | -1.15  | -0.787 | -0.782 | -0.896 | -0.568 | -0.294 | -0.142 | -0.202 | 0.343  | 0.452  | 1.602  |
        |        |   9   | -1.196 | -1.053 | -0.61  | -0.898 | -0.76  | -0.351 | -0.138 | -0.105 | 0.346  | 0.361  | 1.557  |
        |        |   10  | -1.826 | -1.495 | -1.419 | -0.918 | -1.364 | -1.13  | -0.63  | -0.994 | -0.515 | -0.406 |  1.42  |
        |        |  Diff | -1.66  | -2.102 | -2.014 | -1.406 | -2.013 | -2.075 | -1.938 | -2.251 | -2.133 | -1.898 | -0.239 |
        | 2003.0 |   1   | -0.106 | 0.207  | 0.529  | 0.773  | 1.071  |  0.91  | 0.965  | 1.386  | 1.642  | 1.715  | 1.821  |
        |        |   2   | -0.365 | -0.039 | -0.084 | 0.438  | 0.084  | 0.837  |  0.75  | 0.814  |  1.06  | 1.229  | 1.594  |
        |        |   3   | -1.008 | 0.071  | -0.119 | 0.252  | 0.303  | 0.281  | 0.573  | 0.489  |  0.81  | 1.282  | 2.291  |
        |        |   4   | -0.908 | -0.547 | -0.297 | 0.107  | -0.01  | 0.362  |  0.63  | 0.133  | 0.675  | 1.423  | 2.332  |
        |        |   5   | -0.97  | -0.449 | -0.429 | -0.218 | -0.068 |  0.12  | 0.243  | 0.913  |  0.63  | 1.232  | 2.202  |
        |        |   6   | -0.947 | -0.529 | -0.589 | -0.075 | 0.061  |  0.0   | 0.534  | 0.405  | 0.301  | 0.987  | 1.934  |
        |        |   7   | -0.775 | -0.748 | -0.774 | -0.287 | 0.016  | -0.052 | 0.027  | 0.079  | 0.243  | 1.103  | 1.878  |
        |        |   8   | -1.096 | -0.79  | -1.04  | -0.376 | -0.392 | -0.151 | -0.048 | -0.355 | 0.038  | 0.345  |  1.44  |
        |        |   9   | -1.604 | -0.91  | -0.867 | -0.737 | -0.581 | -0.591 | -0.542 | -0.444 | -0.113 | 0.335  | 1.938  |
        |        |   10  | -1.527 | -1.771 | -0.954 | -1.305 | -1.352 | -1.052 | -0.693 | -0.262 | -0.625 | -0.22  | 1.307  |
        |        |  Diff | -1.421 | -1.978 | -1.483 | -2.079 | -2.423 | -1.962 | -1.658 | -1.648 | -2.268 | -1.934 | -0.514 |
        | 2004.0 |   1   | 0.181  | 0.325  | 0.266  | 0.885  | 0.838  | 0.695  | 1.184  | 1.553  | 1.317  | 1.913  | 1.732  |
        |        |   2   | -0.358 | -0.084 |  0.3   | 0.131  | 0.308  |  0.57  | 0.482  | 1.076  |  1.12  | 1.353  | 1.711  |
        |        |   3   | 0.018  | -0.177 | -0.033 | 0.195  | 0.195  | 0.307  |  0.33  | 0.651  |  0.92  | 1.266  | 1.249  |
        |        |   4   | -0.812 | -0.404 | -0.241 | 0.108  | 0.117  | 0.131  | 0.473  | 0.473  | 0.518  | 0.922  | 1.735  |
        |        |   5   | -0.702 | -0.413 | -0.283 | -0.142 | -0.011 | 0.146  | 0.254  | 0.377  | 0.796  | 1.189  | 1.891  |
        |        |   6   | -1.346 | -0.701 | -0.624 | 0.007  | -0.154 | 0.239  | 0.081  | 0.124  | 0.707  |  0.94  | 2.286  |
        |        |   7   | -1.105 | -0.888 | -0.324 | -0.554 | -0.286 | -0.105 | 0.094  | 0.134  | 0.363  | 0.496  | 1.601  |
        |        |   8   | -0.931 | -0.699 | -0.618 | -0.652 | -0.367 | -0.198 | -0.18  | 0.033  | -0.101 | 0.491  | 1.423  |
        |        |   9   | -0.853 | -1.157 | -0.916 | -0.823 | -0.481 | -0.457 | -0.566 | -0.298 | -0.16  |  0.08  | 0.933  |
        |        |   10  | -1.848 | -1.408 | -1.374 | -0.981 | -1.179 | -0.896 | -0.592 | -0.292 | -0.893 | -0.066 | 1.782  |
        |        |  Diff | -2.029 | -1.733 | -1.64  | -1.866 | -2.017 | -1.59  | -1.776 | -1.845 | -2.21  | -1.979 |  0.05  |
        | 2005.0 |   1   | 0.032  | 0.351  | 0.768  | 0.645  | 0.713  | 1.293  | 0.852  | 1.347  | 1.181  | 2.057  | 2.025  |
        |        |   2   | -0.504 | -0.064 | 0.462  | 0.064  | 0.514  | 0.422  | 0.922  | 0.621  | 0.827  | 1.471  | 1.975  |
        |        |   3   | -0.695 | -0.345 | 0.102  | -0.47  | -0.048 | 0.346  | 1.086  | 0.962  | 0.728  | 1.008  | 1.703  |
        |        |   4   | -0.739 | -0.257 | -0.078 |  0.15  | 0.026  | 0.098  | 0.079  | 0.214  | 0.662  |  1.12  | 1.858  |
        |        |   5   | -1.097 | -0.662 | -0.328 | -0.317 | 0.128  | 0.046  |  0.06  | 0.565  | 0.489  | 0.872  | 1.969  |
        |        |   6   | -0.977 | -0.693 | -0.68  | -0.002 | -0.103 | 0.106  | 0.092  | 0.192  |  0.34  | 0.682  |  1.66  |
        |        |   7   | -1.131 | -0.801 | -0.572 | -0.308 | -0.544 | -0.424 |  0.48  |  0.39  | 0.353  | 0.807  | 1.939  |
        |        |   8   | -1.239 | -0.631 | -0.835 | -0.218 | -0.708 | -0.426 | 0.036  | 0.124  | 0.278  | 0.728  | 1.967  |
        |        |   9   | -1.277 | -0.663 | -1.324 | -0.661 | -0.654 | -0.492 | -0.312 | -0.129 | -0.195 | 0.039  | 1.317  |
        |        |   10  | -1.646 | -1.199 | -1.326 | -0.89  | -0.902 | -0.982 | -0.745 | -0.401 | -0.051 | -0.114 | 1.532  |
        |        |  Diff | -1.678 | -1.55  | -2.094 | -1.535 | -1.615 | -2.275 | -1.596 | -1.748 | -1.232 | -2.171 | -0.493 |
        | 2006.0 |   1   | -0.134 | 0.377  |  0.07  | -0.093 | 0.606  | 0.689  | 0.923  | 1.025  | 1.334  | 2.048  | 2.182  |
        |        |   2   | -0.446 | -0.266 | 0.416  | 0.393  | 0.344  | 0.845  | 0.815  | 0.992  | 1.043  | 1.233  | 1.678  |
        |        |   3   | -0.611 | -0.323 | -0.094 |  0.31  | 0.399  | 0.249  | 0.263  | 0.596  | 0.959  | 1.317  | 1.929  |
        |        |   4   | -0.343 | -0.164 | 0.273  | 0.349  | 0.235  | 0.354  | 0.554  | 0.729  |  0.84  | 1.253  | 1.596  |
        |        |   5   | -0.868 | -0.333 | -0.533 | -0.061 | -0.084 | 0.049  | 0.155  | 0.313  | 0.708  | 1.205  | 2.073  |
        |        |   6   | -0.749 | -0.973 | -0.488 | -0.017 | -0.125 | -0.251 | 0.389  | 0.057  | 0.495  | 0.795  | 1.544  |
        |        |   7   | -0.947 | -0.846 | -0.559 | -0.168 | -0.291 | -0.522 | 0.079  | -0.31  | -0.002 | 0.446  | 1.393  |
        |        |   8   | -1.106 | -0.511 | -1.046 | -0.61  | -0.74  | -0.262 | -0.024 | 0.095  | -0.172 |  0.78  | 1.885  |
        |        |   9   | -1.605 | -1.381 | -0.907 | -0.943 | -0.206 | -0.585 | -0.281 | -0.156 | -0.068 |  0.29  | 1.895  |
        |        |   10  | -1.671 | -1.654 | -0.997 | -0.92  | -0.926 | -0.628 | -0.205 | -0.321 | -0.003 | 0.159  |  1.83  |
        |        |  Diff | -1.537 | -2.031 | -1.067 | -0.827 | -1.532 | -1.317 | -1.127 | -1.346 | -1.337 | -1.889 | -0.352 |
        | 2007.0 |   1   | -0.495 | 0.213  | 0.431  | 0.626  | 0.795  | 1.297  | 1.212  | 1.878  | 1.193  | 1.457  | 1.951  |
        |        |   2   | -0.356 | 0.053  |  0.13  | 0.327  | 0.377  | 0.542  | 0.566  | 0.784  | 0.993  | 1.336  | 1.692  |
        |        |   3   | -0.642 | -0.001 | 0.023  | 0.463  | 0.311  | 0.417  | 0.442  | 0.875  | 0.964  | 1.272  | 1.915  |
        |        |   4   | -0.51  | -0.271 | 0.064  | 0.196  | 0.343  |  0.29  |  0.19  | 0.545  | 0.814  | 0.929  | 1.439  |
        |        |   5   | -0.224 | -0.285 | -0.297 | -0.202 | -0.112 | 0.093  |  0.11  | 0.344  | 0.551  | 1.305  | 1.529  |
        |        |   6   | -1.104 | -0.488 | -0.523 | 0.019  | -0.342 | -0.141 | -0.291 | 0.105  | 0.235  | 0.184  | 1.288  |
        |        |   7   | -1.498 | -0.797 | -0.566 | -0.41  | -0.297 | 0.154  | 0.359  | 0.356  | 0.213  | 0.619  | 2.117  |
        |        |   8   | -1.16  | -1.045 | -0.428 | -0.242 | -0.286 | -0.107 | -0.001 | -0.116 | 0.355  | 0.847  | 2.007  |
        |        |   9   | -1.576 | -1.093 | -0.816 | -0.651 | -0.613 |  -0.2  | -0.097 | -0.304 | -0.125 |  0.48  | 2.055  |
        |        |   10  | -1.85  | -1.367 | -1.071 | -1.091 | -1.125 | -0.881 | -0.743 | -0.648 | 0.096  | -0.003 | 1.847  |
        |        |  Diff | -1.355 | -1.58  | -1.502 | -1.718 | -1.919 | -2.178 | -1.956 | -2.525 | -1.097 | -1.46  | -0.105 |
        | 2008.0 |   1   | -0.21  | 0.363  |  0.48  | 0.448  | 1.041  | 0.943  | 1.225  | 1.136  | 1.361  | 1.439  | 1.649  |
        |        |   2   | -0.224 | 0.209  | 0.133  |  0.57  | 0.456  | 0.803  | 0.852  | 0.808  | 1.366  | 1.571  | 1.795  |
        |        |   3   | -0.502 | -0.273 | -0.121 | 0.285  | 0.621  | 0.776  | 0.667  | 0.567  | 0.709  | 1.102  | 1.604  |
        |        |   4   | -0.813 | -0.665 | -0.066 | -0.257 | 0.073  | 0.313  |  0.58  | 0.851  | 0.829  | 1.078  | 1.891  |
        |        |   5   |  -1.0  | -0.37  | -0.261 | 0.031  | -0.28  |  0.14  |  0.13  | 0.174  | 0.523  | 0.833  | 1.833  |
        |        |   6   | -1.027 | -0.327 | -0.604 | -0.342 | -0.276 | -0.088 | 0.119  | 0.433  | 0.584  | 0.711  | 1.739  |
        |        |   7   | -0.72  | -0.711 | -0.378 | -0.511 | -0.597 | -0.164 | -0.11  | -0.087 | 0.102  | 0.512  | 1.231  |
        |        |   8   | -1.166 | -1.171 | -0.748 | -0.33  | -0.236 | -0.381 | -0.295 |  0.03  |  0.25  | 0.156  | 1.322  |
        |        |   9   | -1.768 | -1.064 | -1.086 | -0.894 | -0.434 | -0.112 | -0.657 | -0.159 | -0.311 |  0.36  | 2.128  |
        |        |   10  | -1.617 | -1.483 | -1.621 | -1.202 | -0.853 | -1.125 | -0.631 | -0.713 | -0.501 | 0.062  | 1.679  |
        |        |  Diff | -1.407 | -1.846 | -2.101 | -1.65  | -1.894 | -2.068 | -1.856 | -1.849 | -1.862 | -1.377 |  0.03  |
        | 2009.0 |   1   | -0.111 | 0.707  | 0.599  |  1.12  | 0.873  | 0.937  | 1.342  | 1.123  | 1.284  | 2.031  | 2.141  |
        |        |   2   | -0.592 | -0.082 |  0.15  | 0.161  | 0.049  | 0.326  | 0.758  | 1.114  | 0.969  | 1.592  | 2.185  |
        |        |   3   | -0.563 | -0.046 | 0.278  | 0.248  | 0.387  | 0.391  | 0.365  | 0.624  | 0.646  | 1.112  | 1.675  |
        |        |   4   | -0.996 | -0.383 | -0.586 | -0.109 | -0.013 |  0.22  | 0.172  | 0.708  | 0.792  | 1.143  | 2.139  |
        |        |   5   | -0.529 | -0.604 | -0.623 | -0.256 | 0.164  | -0.305 | 0.083  | 0.361  | 0.374  | 0.964  | 1.493  |
        |        |   6   | -0.875 | -0.351 | -0.57  | -0.257 | 0.067  | 0.111  | 0.277  | 0.136  | 0.388  | 0.824  | 1.699  |
        |        |   7   | -0.881 | -0.789 | -0.42  | -0.351 | -0.26  | -0.413 | 0.084  | 0.054  | 0.426  | 0.392  | 1.272  |
        |        |   8   | -0.91  |  -0.8  | -0.609 | -0.403 | -0.485 | -0.247 | -0.072 | 0.129  | 0.274  |  0.46  |  1.37  |
        |        |   9   | -1.107 | -1.13  | -1.032 | -0.551 | -0.553 | -0.323 | -0.213 | -0.054 | 0.326  | 0.711  | 1.818  |
        |        |   10  | -1.767 | -1.507 | -1.239 | -1.17  | -0.757 | -0.78  | -0.64  | -0.117 | -0.225 | 0.323  | 2.091  |
        |        |  Diff | -1.657 | -2.214 | -1.838 | -2.29  | -1.63  | -1.716 | -1.983 | -1.239 | -1.508 | -1.707 | -0.051 |
        | 2010.0 |   1   | -0.287 | 0.288  | 0.459  | 0.678  | 0.721  | 1.091  | 1.118  | 1.305  | 1.602  | 1.755  | 2.042  |
        |        |   2   | -0.274 | -0.082 |  0.23  | 0.045  | 0.821  | 0.435  | 0.578  | 1.064  | 1.204  | 1.665  | 1.939  |
        |        |   3   | -0.357 | -0.211 | -0.068 | -0.017 | 0.071  | 0.257  | 0.773  | 1.022  | 0.806  | 1.679  | 2.036  |
        |        |   4   | -0.693 | -0.069 | -0.266 | 0.084  |  0.29  | 0.164  | 0.393  | 0.402  | 0.729  | 0.536  | 1.229  |
        |        |   5   | -0.665 | -0.552 | -0.51  | 0.137  | -0.072 | 0.147  | 0.497  | 0.172  | 0.398  | 0.754  | 1.419  |
        |        |   6   | -0.844 | -0.569 | -0.068 | -0.586 | -0.24  | 0.305  | 0.372  | 0.315  | 0.225  | 0.874  | 1.719  |
        |        |   7   | -1.212 | -0.704 | -0.61  | -0.235 | -0.402 | -0.185 | 0.014  |  0.33  | 0.108  | 0.818  |  2.03  |
        |        |   8   | -1.018 | -0.706 | -0.717 | -0.999 | -0.414 | -0.463 | -0.096 | 0.066  | 0.021  | 0.656  | 1.674  |
        |        |   9   | -1.255 | -1.237 | -0.964 | -0.802 | -0.615 | -0.807 | -0.498 | -0.067 | 0.365  | 0.169  | 1.424  |
        |        |   10  | -1.85  | -1.389 | -1.081 | -1.109 | -0.865 | -0.738 | -0.608 | -0.459 | -0.261 | 0.215  | 2.065  |
        |        |  Diff | -1.563 | -1.677 | -1.54  | -1.786 | -1.587 | -1.829 | -1.726 | -1.764 | -1.863 | -1.54  | 0.023  |
        | 2011.0 |   1   | 0.169  | 0.438  | 0.218  | 0.275  | 0.676  | 0.923  | 0.878  | 1.186  | 1.132  | 1.475  | 1.305  |
        |        |   2   | -0.004 | -0.139 | 0.209  | 0.025  | 0.077  | 0.322  | 0.608  | 1.004  | 0.836  | 1.172  | 1.176  |
        |        |   3   | -0.568 | -0.327 | 0.012  | 0.499  | 0.414  | 0.282  | 0.396  | 0.571  | 1.102  | 1.131  |  1.7   |
        |        |   4   | -0.401 | 0.069  | -0.17  | -0.238 | 0.452  | 0.459  |  0.74  | 0.379  | 0.564  | 1.212  | 1.613  |
        |        |   5   | -0.604 | -0.313 | -0.367 | -0.21  | 0.213  | 0.095  | 0.297  | 0.456  | 0.843  | 0.799  | 1.403  |
        |        |   6   | -0.918 | -0.356 | -0.231 | -0.253 | -0.046 | 0.161  | 0.094  | 0.195  | 0.429  | 0.872  |  1.79  |
        |        |   7   | -1.323 | -0.612 | -1.067 | -0.479 | -0.403 | -0.344 | 0.117  | 0.182  | 0.486  | 0.597  | 1.921  |
        |        |   8   | -1.21  | -0.912 | -0.727 | -0.577 | -0.211 | -0.288 | -0.204 | -0.101 | 0.279  | 0.515  | 1.724  |
        |        |   9   | -1.526 | -1.13  | -0.857 | -0.898 | -0.619 | -0.409 | -0.703 | -0.27  | -0.444 |  0.39  | 1.916  |
        |        |   10  | -1.494 | -1.23  | -1.183 | -0.995 | -0.76  | -0.512 | -0.939 | -0.465 | -0.414 | 0.022  | 1.516  |
        |        |  Diff | -1.664 | -1.667 | -1.401 | -1.27  | -1.436 | -1.434 | -1.817 | -1.652 | -1.547 | -1.453 | 0.211  |
        | 2012.0 |   1   | 0.355  | 0.543  | 0.323  | 0.426  | 0.847  | 1.066  | 1.189  | 1.121  | 1.505  | 1.638  | 1.283  |
        |        |   2   | -0.463 | -0.002 | 0.173  | 0.558  | 0.259  | 0.773  | 0.613  | 0.583  | 0.821  | 1.137  | 1.601  |
        |        |   3   | -0.644 | -0.167 | -0.133 | -0.187 | 0.321  | 0.271  | 0.368  | 0.596  | 0.759  | 0.944  | 1.589  |
        |        |   4   | -0.513 | -0.345 | -0.163 | -0.318 | 0.175  | 0.123  | 0.474  | 0.572  | 0.767  | 1.217  |  1.73  |
        |        |   5   | -1.035 | -0.422 | -0.36  | 0.083  | -0.025 |  0.11  | 0.547  | 0.281  | 0.768  | 1.138  | 2.174  |
        |        |   6   | -1.151 | -1.138 | -0.028 | -0.234 | -0.407 | -0.02  | 0.337  | 0.157  | 0.133  | 1.024  | 2.175  |
        |        |   7   | -1.09  | -0.713 | -0.77  | -0.417 | -0.104 | -0.28  | -0.095 | -0.007 | 0.187  |  0.96  |  2.05  |
        |        |   8   | -1.284 | -0.778 | -0.875 | -0.532 | -0.216 | -0.157 | -0.153 | 0.223  | -0.082 | 0.395  | 1.679  |
        |        |   9   | -1.172 | -1.218 | -1.136 | -0.595 | -0.552 | -0.689 | -0.272 | -0.076 | -0.095 |  0.16  | 1.332  |
        |        |   10  | -1.855 | -1.282 | -0.878 | -1.277 | -0.92  | -0.561 | -0.258 | -0.569 | -0.608 | -0.357 | 1.498  |
        |        |  Diff | -2.21  | -1.826 |  -1.2  | -1.702 | -1.767 | -1.627 | -1.447 | -1.69  | -2.113 | -1.995 | 0.215  |
        | 2013.0 |   1   | 0.054  | 0.156  | 0.472  |  0.67  | 0.935  | 1.303  | 1.078  | 1.027  | 1.346  | 1.666  | 1.612  |
        |        |   2   | -0.462 | -0.084 | 0.156  | 0.184  | 0.089  | 0.825  | 0.211  | 0.815  | 1.177  | 1.115  | 1.577  |
        |        |   3   | -0.764 | -0.211 | -0.227 | -0.244 |  0.1   |  0.18  | 0.639  | 0.573  | 0.599  | 1.508  | 2.272  |
        |        |   4   | -0.76  | -0.582 | 0.163  | -0.183 | 0.098  | 0.348  | 0.479  | 0.177  | 0.826  | 1.184  | 1.944  |
        |        |   5   | -0.668 | -0.585 | -0.516 | 0.034  | 0.285  | -0.056 | 0.153  | 0.284  | 0.714  | 0.881  | 1.549  |
        |        |   6   | -0.672 | -0.426 | -0.322 | -0.306 | -0.337 | -0.117 | -0.384 | 0.128  | 0.474  | 0.782  | 1.455  |
        |        |   7   | -0.841 | -0.843 | -0.311 |  -0.6  | -0.675 | 0.014  | 0.073  | 0.102  | 0.466  | 0.795  | 1.635  |
        |        |   8   | -1.15  | -0.983 | -0.668 | -0.618 | -0.635 | -0.179 | -0.051 | 0.159  | 0.379  | 0.498  | 1.648  |
        |        |   9   | -1.261 | -1.046 | -1.009 | -0.75  | -0.799 | -0.685 | -0.386 | -0.202 |  0.33  | 0.259  | 1.519  |
        |        |   10  | -1.298 | -1.397 | -1.262 | -1.227 | -1.006 | -1.037 | -0.523 | -0.412 | -0.285 | -0.081 | 1.218  |
        |        |  Diff | -1.352 | -1.554 | -1.733 | -1.897 | -1.942 | -2.341 | -1.601 | -1.439 | -1.632 | -1.747 | -0.394 |
        | 2014.0 |   1   | 0.016  | 0.383  | 0.626  | 0.505  | 0.887  | 0.958  | 1.023  | 0.982  | 1.511  | 1.943  | 1.927  |
        |        |   2   | -0.291 | 0.181  | 0.061  | 0.299  | 0.518  | 0.548  | 0.868  | 0.911  | 0.754  | 1.463  | 1.754  |
        |        |   3   | -0.575 | -0.307 | 0.263  | 0.156  | 0.548  | 0.436  | 0.767  | 0.639  | 0.921  | 1.429  | 2.004  |
        |        |   4   | -0.621 | -0.382 | -0.244 | -0.112 |  0.19  | -0.121 | 0.205  | 0.872  | 0.934  |  0.82  | 1.442  |
        |        |   5   | -0.406 | -0.827 | 0.036  | -0.188 | 0.002  | 0.256  | 0.158  | 0.456  | 0.783  | 0.808  | 1.213  |
        |        |   6   | -0.407 | -0.516 | -0.294 | -0.361 | -0.231 | 0.267  | 0.032  | 0.355  | 0.419  |  1.04  | 1.446  |
        |        |   7   | -1.024 | -0.946 | -0.377 | -0.628 | -0.147 | -0.325 | 0.358  | 0.408  | 0.368  | 0.522  | 1.546  |
        |        |   8   | -1.069 | -0.909 | -0.882 | -0.282 | -0.32  | -0.296 | -0.218 | 0.101  | 0.151  | 0.831  |  1.9   |
        |        |   9   | -1.538 | -0.649 | -0.772 | -0.715 | -0.642 | -0.785 | -0.307 | 0.029  | -0.277 | 0.615  | 2.153  |
        |        |   10  | -2.103 | -1.438 | -0.928 | -1.108 | -1.004 | -0.691 | -0.857 | -0.283 | -0.819 | -0.427 | 1.676  |
        |        |  Diff | -2.119 | -1.82  | -1.555 | -1.613 | -1.891 | -1.649 | -1.88  | -1.266 | -2.33  | -2.37  | -0.251 |
        | 2015.0 |   1   |  0.39  | 0.143  | 0.267  | 0.742  | 0.841  | 0.992  | 1.043  |  1.02  | 1.634  | 1.141  | 0.751  |
        |        |   2   | -0.241 |  -0.1  | 0.003  | 0.123  | 0.481  | 0.551  | 0.563  | 0.491  | 1.024  |  1.3   | 1.542  |
        |        |   3   | -0.704 | -0.15  | -0.104 | 0.181  | 0.401  | 0.428  | 0.706  | 0.675  | 0.691  | 1.222  | 1.925  |
        |        |   4   | -0.995 | -0.283 | -0.148 | 0.022  | 0.281  |  0.12  | 0.627  | 0.663  | 0.874  | 1.606  | 2.601  |
        |        |   5   | -0.69  | -0.723 | 0.021  | -0.153 | -0.106 | 0.127  | 0.255  | 0.522  | 0.395  | 0.737  | 1.427  |
        |        |   6   | -0.931 | -0.666 | -0.443 | -0.241 | 0.035  | 0.115  | -0.022 | 0.021  | 0.541  | 0.672  | 1.603  |
        |        |   7   | -1.016 | -0.378 | -0.301 | -0.303 | -0.179 | 0.215  | -0.099 | 0.247  | 0.235  | 0.664  |  1.68  |
        |        |   8   | -1.768 | -1.027 | -0.635 | -0.421 | -0.454 | -0.263 | -0.282 | 0.239  | 0.032  | 0.646  | 2.414  |
        |        |   9   | -1.373 | -1.221 | -0.891 | -0.612 | -0.131 | -0.57  | -0.448 | -0.27  | -0.217 | 0.605  | 1.978  |
        |        |   10  | -2.049 | -1.09  | -1.228 | -0.896 | -0.662 | -0.876 | -0.907 | -0.507 | -0.358 | -0.057 | 1.992  |
        |        |  Diff | -2.439 | -1.233 | -1.496 | -1.639 | -1.503 | -1.867 | -1.95  | -1.526 | -1.992 | -1.198 | 1.241  |
        | 2016.0 |   1   | -0.195 | 0.343  | 0.774  | 0.671  | 0.829  | 1.082  | 0.755  | 1.169  | 1.342  | 1.639  | 1.833  |
        |        |   2   | -0.289 | -0.275 | 0.411  |  0.35  | 0.304  | 0.351  | 0.775  | 1.015  | 1.124  |  1.28  | 1.569  |
        |        |   3   | -0.458 | -0.253 | 0.198  | 0.232  | 0.257  | 0.176  | 0.548  | 0.593  |  0.66  |  1.04  | 1.498  |
        |        |   4   | -0.644 | -0.597 | -0.059 | 0.028  | -0.117 | 0.485  | 0.275  | 0.448  | 0.278  | 1.175  | 1.819  |
        |        |   5   | -0.846 | -0.632 | -0.195 | -0.071 | 0.201  | 0.338  | 0.448  | 0.376  | 0.698  | 0.807  | 1.653  |
        |        |   6   | -1.086 | -0.536 | -0.521 | -0.098 | -0.024 | 0.231  | -0.119 | 0.015  | 0.493  | 0.827  | 1.913  |
        |        |   7   | -0.995 | -0.601 | -0.363 | -0.179 | -0.258 | 0.047  | 0.175  | 0.197  | 0.511  |  0.99  | 1.986  |
        |        |   8   | -1.181 | -0.676 | -0.362 | -0.401 | -0.272 | -0.503 | -0.291 |  0.28  | 0.547  | 0.814  | 1.995  |
        |        |   9   | -1.269 | -0.765 | -0.995 | -0.507 | -0.523 | -0.395 | -0.462 | -0.305 | 0.046  | 0.081  |  1.35  |
        |        |   10  | -2.056 | -1.221 | -1.385 | -1.244 | -0.968 | -1.02  | -0.73  | -0.804 | -0.448 | 0.224  | 2.281  |
        |        |  Diff | -1.862 | -1.564 | -2.159 | -1.915 | -1.798 | -2.103 | -1.485 | -1.973 | -1.79  | -1.414 | 0.447  |
        | 2017.0 |   1   | 0.055  | 0.262  | 0.674  | 0.776  | 0.993  | 1.017  | 1.248  | 1.074  | 1.551  | 1.822  | 1.766  |
        |        |   2   | -0.219 | 0.008  | 0.209  | 0.479  | 0.637  | 0.712  | 0.599  | 0.827  | 0.806  |  1.47  | 1.689  |
        |        |   3   | -0.372 | -0.12  | 0.103  | 0.068  | 0.118  | 0.221  | 0.429  | 0.641  | 0.928  | 1.056  | 1.429  |
        |        |   4   | -0.741 | -0.525 | -0.302 | 0.249  |  0.16  | 0.452  | 0.081  | 0.844  | 1.107  | 0.955  | 1.696  |
        |        |   5   | -1.111 | -0.398 | -0.461 | -0.19  |  0.07  | 0.107  | 0.138  | 0.182  | 0.606  |  1.09  | 2.202  |
        |        |   6   | -1.016 | -0.812 | -0.264 | -0.154 | -0.107 | -0.261 | 0.286  | 0.315  | 0.691  | 0.964  |  1.98  |
        |        |   7   | -1.359 | -1.114 | -0.564 | -0.049 | -0.065 | -0.473 | -0.178 | 0.459  | 0.366  | 0.528  | 1.887  |
        |        |   8   | -1.117 | -0.955 | -0.577 | -0.658 | -0.047 | 0.035  | -0.206 | -0.049 | 0.195  |  0.34  | 1.456  |
        |        |   9   | -1.478 | -0.961 | -1.086 | -0.654 | -0.576 | -0.488 | -0.571 | -0.255 | 0.257  | 0.185  | 1.663  |
        |        |   10  | -1.691 | -1.288 | -1.057 | -1.037 | -1.187 | -1.082 | -0.992 | -0.597 | -0.407 | -0.08  |  1.61  |
        |        |  Diff | -1.746 | -1.55  | -1.731 | -1.814 | -2.18  | -2.099 | -2.239 | -1.671 | -1.958 | -1.902 | -0.156 |
        | 2018.0 |   1   | 0.159  | 0.646  | 0.522  | 0.586  | 0.452  | 0.658  | 1.525  | 1.063  | 1.232  | 2.065  | 1.906  |
        |        |   2   | -0.312 | -0.105 | -0.161 | 0.218  | 0.442  | 0.525  | 0.649  | 0.751  | 0.965  | 1.255  | 1.567  |
        |        |   3   | -0.587 | -0.228 | 0.156  | 0.055  | 0.231  | 0.673  | 0.583  | 0.783  | 0.521  | 1.386  | 1.973  |
        |        |   4   | -0.777 | -0.222 | 0.347  | -0.119 | 0.119  | 0.586  | 0.407  | 0.473  | 0.559  | 0.898  | 1.675  |
        |        |   5   | -0.675 | -0.63  | -0.45  | -0.164 |  0.34  | -0.08  | 0.501  | 0.672  | 0.632  | 0.877  | 1.552  |
        |        |   6   | -0.744 | -0.468 | -0.214 | -0.166 | -0.06  | 0.293  | 0.408  | 0.501  | 0.761  | 0.464  | 1.208  |
        |        |   7   | -1.09  | -0.811 | -0.777 | -0.528 | 0.051  | -0.11  | -0.111 | -0.054 | 0.519  | 0.637  | 1.727  |
        |        |   8   | -1.098 | -1.104 | -0.461 | -0.483 | -0.293 | -0.121 | -0.265 | 0.117  | 0.158  | 0.401  | 1.499  |
        |        |   9   | -1.82  | -1.118 | -0.967 | -0.78  | -0.115 | -0.431 | -0.391 | -0.542 |  0.01  | 0.362  | 2.182  |
        |        |   10  | -1.676 | -1.645 | -1.269 | -0.707 | -1.179 | -0.882 | -0.713 | -0.384 | -0.037 | -0.068 | 1.609  |
        |        |  Diff | -1.835 | -2.291 | -1.791 | -1.293 | -1.631 | -1.54  | -2.238 | -1.447 | -1.268 | -2.133 | -0.298 |
        | 2019.0 |   1   | -0.269 | 0.522  | 0.629  | 0.862  | 0.873  | 0.643  | 1.114  | 1.323  | 1.724  | 1.609  | 1.878  |
        |        |   2   | -0.494 | -0.22  | 0.102  | -0.258 | 0.415  |  0.16  | 0.982  | 0.682  | 0.826  | 1.344  | 1.838  |
        |        |   3   | -0.488 | -0.036 | -0.255 | 0.443  | 0.346  |  0.62  | 0.333  | 0.667  | 0.976  | 0.971  | 1.458  |
        |        |   4   | -0.955 | -0.298 | 0.098  | -0.072 | 0.208  | 0.546  | 0.529  | 0.383  | 0.754  | 0.801  | 1.756  |
        |        |   5   | -1.059 | -0.722 | -0.322 | 0.119  | -0.228 | 0.464  | 0.264  | 0.418  | 0.549  | 0.857  | 1.916  |
        |        |   6   | -0.988 | -0.521 | -0.472 | 0.128  | 0.272  | 0.223  | 0.274  | 0.264  | 0.298  | 0.852  | 1.839  |
        |        |   7   | -0.995 | -0.554 | -0.444 | -0.549 | -0.447 | -0.223 | 0.005  | -0.104 | 0.249  | 0.951  | 1.946  |
        |        |   8   | -1.268 | -1.268 | -0.857 | -0.312 | -0.375 | -0.225 | 0.001  | 0.083  | 0.171  | 0.555  | 1.822  |
        |        |   9   | -1.136 | -0.663 | -0.94  | -0.428 | -0.562 | -0.331 | -0.209 | -0.119 | -0.221 | 0.132  | 1.268  |
        |        |   10  | -2.018 | -1.366 | -1.173 | -1.45  | -0.915 | -0.876 | -0.574 | -0.375 | -0.381 | 0.175  | 2.194  |
        |        |  Diff | -1.749 | -1.888 | -1.801 | -2.312 | -1.788 | -1.518 | -1.688 | -1.698 | -2.105 | -1.433 | 0.316  |
        | 2020.0 |   1   | -0.153 | -0.12  | 0.842  | 0.497  | 0.534  |  0.48  | 1.354  | 1.445  |  1.24  | 1.576  |  1.73  |
        |        |   2   | -0.538 | -0.27  | 0.251  |  0.05  | 0.845  | 0.814  | 0.734  | 0.701  | 1.156  | 1.735  | 2.273  |
        |        |   3   | -0.685 | -0.438 | -0.007 | -0.016 | 0.384  | 0.316  | 0.357  | 0.883  | 0.843  | 1.234  | 1.919  |
        |        |   4   | -0.65  | -0.537 | -0.157 | -0.175 | 0.125  | 0.212  | 0.292  | 0.502  | 1.163  |  0.83  | 1.479  |
        |        |   5   | -0.716 | -0.422 |  -0.0  | -0.229 | 0.004  | 0.275  | 0.494  | 0.455  | 0.866  | 0.995  | 1.711  |
        |        |   6   | -1.345 | -0.243 | -0.137 | -0.502 | 0.134  | -0.371 | 0.372  | 0.285  | 0.966  | 0.751  | 2.096  |
        |        |   7   | -0.98  | -0.325 | -0.587 | -0.484 | -0.615 | -0.268 | 0.149  | 0.598  | 0.098  | 1.026  | 2.006  |
        |        |   8   | -1.428 | -0.895 | -0.52  | -0.526 | -0.461 | -0.205 | -0.041 | -0.006 | 0.327  | 0.581  | 2.009  |
        |        |   9   | -1.514 | -1.092 | -1.062 | -0.705 | -0.81  | -0.727 | -0.405 | -0.295 |  0.08  | 0.715  | 2.229  |
        |        |   10  | -1.795 | -1.513 | -1.191 | -1.427 | -1.005 | -0.712 | -0.507 | -0.574 | -0.155 | -0.261 | 1.534  |
        |        |  Diff | -1.642 | -1.393 | -2.033 | -1.924 | -1.539 | -1.192 | -1.861 | -2.019 | -1.395 | -1.837 | -0.196 |
        +--------+-------+--------+--------+--------+--------+--------+--------+--------+--------+--------+--------+--------+
        +-------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+--------+
        | Group |    1    |    2    |    3    |    4    |    5    |    6    |    7    |    8    |    9    |    10   |  Diff  |
        +-------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+--------+
        |   1   |  -0.038 |   0.37  |  0.505  |  0.619  |  0.796  |  0.939  |  1.107  |  1.248  |  1.409  |  1.706  | 1.744  |
        |       |  -0.766 |  8.116  |  11.277 |  10.899 |  22.006 |  18.667 |   24.4  |  24.478 |  35.421 |  30.592 | 23.568 |
        |   2   |  -0.35  |  -0.051 |  0.196  |   0.24  |   0.4   |  0.583  |  0.697  |  0.858  |  0.994  |  1.397  | 1.746  |
        |       | -10.661 |  -1.511 |  5.209  |   5.21  |  7.975  |  12.688 |  17.417 |  20.672 |  27.177 |  33.248 | 31.875 |
        |   3   |  -0.569 |  -0.187 |  -0.004 |  0.144  |  0.285  |  0.358  |  0.533  |  0.688  |  0.807  |  1.214  | 1.782  |
        |       | -12.848 |  -6.098 |  -0.127 |  2.526  |  7.683  |  9.839  |  11.904 |  21.529 |  24.455 |  28.413 | 28.988 |
        |   4   |  -0.711 |  -0.364 |  -0.128 |  -0.024 |  0.173  |  0.294  |  0.403  |  0.515  |  0.741  |   1.07  | 1.782  |
        |       | -17.377 |  -8.674 |  -2.496 |  -0.586 |  5.316  |  7.416  |  9.065  |  10.686 |  15.126 |   19.9  | 25.131 |
        |   5   |  -0.792 |  -0.508 |  -0.339 |  -0.126 |   0.02  |  0.102  |  0.254  |  0.383  |  0.643  |  0.941  | 1.733  |
        |       | -14.475 | -14.309 |  -8.058 |  -3.938 |  0.536  |  2.572  |   6.85  |  8.335  |  18.277 |  21.266 | 25.57  |
        |   6   |  -0.974 |  -0.573 |  -0.388 |  -0.224 |  -0.131 |  0.039  |  0.147  |  0.216  |  0.495  |  0.798  | 1.772  |
        |       | -19.275 | -11.824 |  -8.954 |  -4.821 |  -3.072 |  0.851  |   2.7   |  6.193  |  9.897  |  18.036 | 27.111 |
        |   7   |  -1.058 |  -0.738 |  -0.517 |  -0.375 |  -0.299 |  -0.152 |  0.071  |  0.168  |  0.293  |  0.728  | 1.786  |
        |       | -24.154 | -18.151 | -10.755 | -10.206 |  -6.406 |  -2.871 |  1.819  |  3.415  |  8.517  |  15.138 | 30.275 |
        |   8   |  -1.19  |  -0.868 |  -0.695 |  -0.507 |  -0.386 |  -0.265 |  -0.122 |  0.044  |  0.177  |  0.541  | 1.731  |
        |       | -27.714 | -20.058 | -16.177 | -11.078 |  -9.717 |  -8.066 |  -4.597 |  1.273  |  4.322  |  12.62  | 28.072 |
        |   9   |  -1.377 |  -1.038 |  -0.951 |  -0.718 |  -0.54  |  -0.506 |  -0.383 |  -0.218 |  -0.016 |  0.337  | 1.714  |
        |       | -25.313 | -22.253 |  -27.55 | -23.118 |  -12.51 | -11.787 |  -9.905 |  -6.951 |  -0.301 |   7.28  | 20.847 |
        |   10  |  -1.776 |  -1.409 |  -1.193 |  -1.103 |  -0.996 |  -0.872 |  -0.639 |  -0.468 |  -0.369 |  -0.067 | 1.709  |
        |       | -38.837 | -37.726 | -28.888 | -25.884 | -23.686 | -21.055 | -13.418 |  -9.744 |  -6.285 |  -1.344 | 25.915 |
        |  Diff |  -1.738 |  -1.779 |  -1.698 |  -1.722 |  -1.791 |  -1.811 |  -1.747 |  -1.716 |  -1.778 |  -1.773 | -0.035 |
        |       | -26.896 | -28.445 |  -25.38 | -22.034 | -32.129 | -24.922 | -26.006 | -24.258 | -21.094 | -25.287 | -0.396 |
        +-------+---------+---------+---------+---------+---------+---------+---------+---------+---------+---------+--------+
        ```
        
        
        
        #### class Persistence()
        
        ##### def \__init__(self, sample):
        
This function initializes the object.
        
        **input :** 
        
        *sample (DataFrame):* Data for analysis. The structure of the sample:
        
        ​                                     The first column : sample indicator
        
        ​                                     The second column : timestamp
        
        ​                                     The higher order columns: the variables.
        
        
        
        ##### def _shift(self, series, lag):
        
This private function shifts the time series by the given lag.
        
        **input :**
        
*series (Series):* The series to be shifted.
        
        *lag (int):* The lag order.
        
        
        
        ##### def fit(self, lags):
        
This function calculates the persistence at the given lags.
        
        **input :**
        
        *lags (list):* the lags that need to be analyzed.
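Conceptually, the persistence at a given lag can be computed as the cross-sectional correlation between a variable at time *t* and at time *t+lag*, averaged over periods. The following numpy sketch illustrates that idea on simulated AR(1) data; it is an illustration of the concept, not EAP's exact implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
T, N, rho = 20, 500, 0.7

# simulate a panel where each cross-sectional unit follows an AR(1) with coefficient rho
x = np.zeros((T, N))
x[0] = rng.normal(size=N)
for t in range(1, T):
    x[t] = rho * x[t - 1] + np.sqrt(1 - rho**2) * rng.normal(size=N)

def persistence(panel, lag):
    # cross-sectional correlation between period t and period t+lag, averaged over t
    corrs = [np.corrcoef(panel[t], panel[t + lag])[0, 1]
             for t in range(panel.shape[0] - lag)]
    return float(np.mean(corrs))

p1 = persistence(x, 1)  # should be close to rho
```

A highly persistent characteristic (persistence near 1) keeps the same cross-sectional ranking over time, which is what this diagnostic is meant to reveal.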
        
        
        
        ##### def summary(self, periodic=False, export=False):
        
This function prints the result summary and optionally exports the table. Both the Fisher coefficient and the Spearman coefficient are calculated.
        
        **input :**
        
*periodic (boolean):* whether to print the periodic results. The **DEFAULT** setting is False.

*export (boolean):* whether to export the summary table. The **DEFAULT** setting is False.
        
        **output :**
        
*df (DataFrame)*: If export is True, the summary table is returned.
        
        
        
        **Example**
        
        ```python
        import numpy as np
        import pandas as pd
        from portfolio_analysis import Persistence as pste
            
        # generate time 
        year = np.ones((3000,1), dtype=int)*2020
        id = np.linspace(1, 3000, 3000, dtype=int)
        for i in range(19):
            year = np.append(year, (2019-i)*np.ones((3000,1), dtype=int))
            id = np.append(id, np.linspace(1, 3000, 3000, dtype=int))
            
        # generate character
        character_1 = np.random.normal(0, 1, 20*3000)
        character_2 = np.random.normal(0, 1, 20*3000)
            
        # generate future return
        ret=character_1*-0.5 + character_2*0.5 + np.random.normal(0,1,20*3000)
        # create sample containing future return, character, time
        sample = np.array([id, year, ret, character_1, character_2]).T
        sample = pd.DataFrame(sample, columns=['id', 'year', 'ret', 'character_1', 'character_2'])
            
        exper = pste(sample)
        exper.fit(lags=[1, 2, 3])
        exper.summary(periodic=True)
        exper.summary()
        =======================================================================================================================
        +--------+----------+------------+-----------+----------+------------+-----------+----------+------------+-----------+
        |  Time  | id_lag_1 | year_lag_1 | ret_lag_1 | id_lag_2 | year_lag_2 | ret_lag_2 | id_lag_3 | year_lag_3 | ret_lag_3 |
        +--------+----------+------------+-----------+----------+------------+-----------+----------+------------+-----------+
        | 2001.0 | -0.01191 |  -0.00334  |  0.02361  | -0.01781 |  0.00886   |  -0.02894 | -0.00263 |  -0.0131   |  -0.00058 |
        | 2002.0 | -0.01199 |  -0.0176   |  -0.03603 | 0.00258  |  -0.01285  |  -0.00237 | -0.0294  |  0.00956   |  -0.00281 |
        | 2003.0 | -0.01074 |  -0.03747  |  0.00879  | 0.00161  |  0.00559   |   -0.009  | 0.00415  |  0.00091   |  -0.01175 |
        | 2004.0 | 0.01954  |  -0.01125  |  -0.00944 | 0.00546  |  0.00239   |  -0.02387 | -0.00457 |  -0.00225  |  -0.00341 |
        | 2005.0 | -0.00924 |  -0.01317  |  0.01217  | -0.01098 |  -0.02188  |  0.01794  | -0.00123 |   0.0042   |  0.02049  |
        | 2006.0 | -0.00811 |  0.02284   |  -0.00611 | 0.00417  |  0.01633   |  -0.00517 | -0.00661 |  -0.01671  |   0.0038  |
        | 2007.0 | -0.02182 |  0.00383   |  0.03477  | -0.01294 |  -0.01041  |  -0.01589 | -0.02979 |  0.00703   |  -0.00808 |
        | 2008.0 | -0.01375 |  -0.0061   |  -0.00717 |  0.0116  |  -0.00816  |  -0.02017 | 0.01595  |  -0.01796  |  0.01091  |
        | 2009.0 |  0.0303  |  0.01339   |  -0.05158 |  0.0105  |  0.00794   |  0.01445  | -0.02718 |  -0.03998  |  0.00076  |
        | 2010.0 | -0.00215 |  -0.0053   |  -0.03419 | -0.00931 |   0.0032   |  -0.01369 | -0.01174 |  0.00524   |  -0.00814 |
        | 2011.0 | 0.03012  |  0.03217   |   0.0206  | 0.00162  |  -0.00661  |  -0.0418  | -0.01987 |  0.00034   |  -0.01522 |
        | 2012.0 | 0.00614  |  0.02679   |  -0.00315 | -0.01214 |  -0.0222   |  -0.01302 | -0.02656 |  0.01336   |  0.00475  |
        | 2013.0 | 0.01331  |  0.01011   |  -0.02019 | 0.01789  |  0.02888   |  0.01926  | -0.01949 |  0.01759   |  -0.0042  |
        | 2014.0 | -0.02874 |  -0.00777  |  -0.00271 | 0.00561  |  0.01307   |  0.02344  | -0.01049 |  -0.00115  |  -0.00471 |
        | 2015.0 | -0.01492 |  0.00798   |  -0.00376 | 0.00942  |  -0.00938  |  -0.01007 | 0.00626  |  0.00371   |  -0.00552 |
        | 2016.0 | 0.00516  |  0.00175   |  0.01016  | 0.02137  |  -0.02389  |  0.00064  | 0.00778  |  -0.01682  |  0.00409  |
        | 2017.0 | 0.00593  |  -0.01336  |  -0.00072 | -0.01063 |  0.00522   |  0.01871  | -0.00298 |  -0.01108  |  -0.04003 |
        | 2018.0 | 0.02177  |  -0.00555  |  -0.0257  | -0.0033  |  -0.01959  |  -0.00221 |   nan    |    nan     |    nan    |
        | 2019.0 | -0.01153 |  -0.01477  |  -0.0089  |   nan    |    nan     |    nan    |   nan    |    nan     |    nan    |
        | 2020.0 |   nan    |    nan     |    nan    |   nan    |    nan     |    nan    |   nan    |    nan     |    nan    |
        +--------+----------+------------+-----------+----------+------------+-----------+----------+------------+-----------+
        +----------+----------+------------+-----------+----------+------------+-----------+----------+------------+-----------+
        | Variable | id_lag_1 | year_lag_1 | ret_lag_1 | id_lag_2 | year_lag_2 | ret_lag_2 | id_lag_3 | year_lag_3 | ret_lag_3 |
        +----------+----------+------------+-----------+----------+------------+-----------+----------+------------+-----------+
        | Average  | -0.00066 |  -0.00089  |  -0.00524 | 0.00082  |  -0.00242  |  -0.0051  | -0.00932 |  -0.00336  |  -0.00351 |
        +----------+----------+------------+-----------+----------+------------+-----------+----------+------------+-----------+
        ```
        
        
        
#### class Tangency_portfolio():

This class calculates the tangency portfolio given the assets' returns and the risk-free rate.
        
        ##### def \__init__(self, rf, mu, cov\_mat):
        
        **input :**
        
        *rf (float):* risk-free rate
        
*mu (array/Series):* the expected rates of return of the assets

*cov_mat (matrix):* the covariance matrix of the asset returns
        
        
        
        ##### def _portfolio_weight(self):
        
        **output :**
        
        *weight (array):* weight of assets in tangency portfolio
        
        
        
        ##### def _sharpe_ratio(self):
        
        **output:**
        
*sr (float):* the Sharpe ratio of the tangency portfolio
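In standard mean-variance theory both quantities have closed forms: the tangency weights are proportional to Σ⁻¹(μ − rf·1), normalized to sum to one, and the Sharpe ratio is (w′μ − rf)/√(w′Σw). A minimal numpy sketch under that assumption, using the data from the example below (it reproduces the example's weights, but the correspondence to EAP's internal code is an assumption):

```python
import numpy as np

mu = np.array([0.0427, 0.0015, 0.0285])        # expected returns (from the example below)
cov = np.array([[0.0100, 0.0018, 0.0011],
                [0.0018, 0.0109, 0.0026],
                [0.0011, 0.0026, 0.0199]])
rf = 0.005

w = np.linalg.solve(cov, mu - rf)              # w proportional to inv(cov) @ (mu - rf)
w = w / w.sum()                                # normalize weights to sum to one
sr = (w @ mu - rf) / np.sqrt(w @ cov @ w)      # Sharpe ratio of the tangency portfolio
```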
        
        
        
        ##### def fit(self):
        
This function runs _portfolio_weight and _sharpe_ratio and returns the weights and the Sharpe ratio.
        
        
        
        ##### def print(self):
        
This function prints the result.
        
        
        
        **Example**
        
        ```python
        def test_tangency_portfolio():
            '''
    This function is for testing Tangency_portfolio
            '''
            import numpy as np
            from portfolio_analysis import Tangency_portfolio as tanport
            
            # construct the sample data 1
            mu = np.array([0.0427, 0.0015, 0.0285])
            cov_mat = np.mat([[0.01, 0.0018, 0.0011], [0.0018, 0.0109, 0.0026], [0.0011, 0.0026, 0.0199]])
            rf = 0.005
            
            # calculate the weight and the sharpe ratio
            portfolio = tanport(rf, mu, cov_mat)
            print(portfolio._portfolio_weight())
            print(portfolio.fit())
            portfolio.print()
        
            # construct the sample data 2
            mu = np.array([0.0427, 0.0015, 0.0285, 0.0028])
            cov_mat = np.mat([[0.01, 0.0018, 0.0011, 0], [0.0018, 0.0109, 0.0026, 0], [0.0011, 0.0026, 0.0199, 0], [0, 0, 0, 0.1]])
            rf = 0.005
            
            # calculate the weight and the sharpe ratio
            portfolio = tanport(rf, mu, cov_mat)
            print(portfolio._portfolio_weight())
            print(portfolio.fit())
            portfolio.print()
        
        test_tangency_portfolio()
        
        ================================================================================================================================
        [[ 1.02682298]
         [-0.32625112]
         [ 0.29942815]]
        (matrix([[ 1.02682298],
                [-0.32625112],
                [ 0.29942815]]), 0.4202276695645767)
        +--------+---------+----------+---------+
        | Weight |  asset1 |  asset2  |  asset3 |
        +--------+---------+----------+---------+
        |        | 1.02682 | -0.32625 | 0.29943 |
        +--------+---------+----------+---------+
        [[ 1.03285649]
         [-0.32816815]
         [ 0.30118756]
         [-0.00587591]]
        (matrix([[ 1.03285649],
                [-0.32816815],
                [ 0.30118756],
                [-0.00587591]]), 0.42028525345017165)
        +--------+---------+----------+---------+----------+
        | Weight |  asset1 |  asset2  |  asset3 |  asset4  |
        +--------+---------+----------+---------+----------+
        |        | 1.03286 | -0.32817 | 0.30119 | -0.00588 |
        +--------+---------+----------+---------+----------+
        
        ```
        
        
        
        #### class Spanning_test():
        
This module is designed for the spanning test. It contains three asymptotic estimates and one small-sample estimate. The construction is based on
        
        R. Kan, G. Zhou, Test of Mean-Variance Spanning, Annals of Economics and Finance, 2012, 13-1, 145-193.
        
        ##### def \__init__(self, Rn, Rk):
        
        **input :**
        
        *Rn (ndarray/Series/DataFrame) :* The assets to be tested.
        
        *Rk (ndarray/Series/DataFrame) :* The fundamental assets. 
        
        
        
        ##### def _cov(self):
        
        This function calculates the covariance.
        
        
        
        ##### def _regress(self):
        
This function regresses Rn on Rk and returns the eigenvalues and the U statistic used to build the hypothesis-test estimates.
        
         **output :**
        
*eigen1 (float) :* the first eigenvalue

*eigen2 (float) :* the second eigenvalue

*U (float) :* the U statistic
        
        
        
        ##### def _build_statistics(self):
        
This function builds three asymptotic estimates and one small-sample estimate. The asymptotic estimates are the likelihood ratio (LR), the Wald test (W), and the Lagrange multiplier test (LM). The small-sample estimate is the F-test corresponding to the likelihood ratio. The asymptotic estimates follow a chi-square distribution with 2N degrees of freedom, where N is the number of test assets. The small-sample estimate follows an F-distribution with degrees of freedom (2N, 2(T-K-N)) for N>1, and (2, T-K-1) for N=1.
        
        **output :**
        
        *perc (array) :* the quantiles of chi-square distribution at 90%, 95%, 99%. 
        
        *perc_F (array) :* the quantiles of F distribution at 90%, 95%, 99%.
        
        *[LR, chi_LR] (float) :* The LR estimate and p-value of test. 
        
        *[W, chi_W] (float) :* The Wald estimate and p-value of test.
        
        *[LM, chi_LM] (float) :* The LM estimate and p-value of test.
        
        *[LR_F, f_LR] (float) :* The F estimate and p-value of test.
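The asymptotic statistics can be obtained from the regression eigenvalues in closed form. The sketch below uses the standard forms from Kan and Zhou (2012), LR = T·Σln(1+λᵢ), W = T·Σλᵢ, LM = T·Σλᵢ/(1+λᵢ); whether EAP's internals use exactly these expressions is an assumption, and the small-sample F-test is omitted here:

```python
import numpy as np
from scipy.stats import chi2

def spanning_stats(eigs, T, N):
    """Asymptotic spanning-test statistics from the regression eigenvalues.

    eigs : eigenvalues from the spanning regression (lambda_i >= 0)
    T    : number of time periods; N : number of test assets
    """
    eigs = np.asarray(eigs, dtype=float)
    LR = T * np.log1p(eigs).sum()            # likelihood ratio
    W = T * eigs.sum()                       # Wald
    LM = T * (eigs / (1.0 + eigs)).sum()     # Lagrange multiplier
    pvals = chi2.sf([LR, W, LM], df=2 * N)   # chi-square with 2N degrees of freedom
    return {"LR": LR, "W": W, "LM": LM}, pvals

stats, pvals = spanning_stats([0.05, 0.02], T=240, N=2)
```

For any nonnegative eigenvalues these statistics satisfy W ≥ LR ≥ LM, so the Wald test rejects most often and the LM test least often.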
        
        
        
        ##### def fit(self):
        
This function fits the model.
        
        **output :**
        
        *perc (array) :* the quantiles of chi-square distribution at 90%, 95%, 99%. 
        
        *perc_F (array) :* the quantiles of F distribution at 90%, 95%, 99%.
        
        *[LR, chi_LR] (float) :* The LR estimate and p-value of test. 
        
        *[W, chi_W] (float) :* The Wald estimate and p-value of test.
        
        *[LM, chi_LM] (float) :* The LM estimate and p-value of test.
        
        *[LR_F, f_LR] (float) :* The F estimate and p-value of test.
        
        
        
        ##### def summary(self):
        
This function prints the result.
        
        
        
        **Example**
        
        ```python
        # %% test Spanning test
        def test_spanning_test():
            '''
            This function is for testing spanning test
            '''
            import numpy as np
            from portfolio_analysis import Spanning_test as span
        
            factor1 = np.random.normal(loc=0.1, scale=1.0, size=(240, 1))
            factor2 = np.random.normal(loc=0.2, scale=2.0, size=(240, 1))
            factor3 = np.random.normal(loc=0.5, scale=4.5, size=(240, 1))
        
            factor4 = 0.1 * factor1 + 0.5 * factor2 + 0.4 * factor3
            factor5 = -0.2 * factor1 - 0.1 * factor2 + 1.3 * factor3
            factor6 = 1.0 * factor1 - 0.5 * factor2 + 0.5 * factor3
            factor7 = 0.2 * factor1 + 0.1 * factor2 + 0.7 * factor3
            factor8 = -0.1 * factor1 -0.1 * factor2 + 1.2 * factor3
            factor9 = -0.3 * factor1 - 0.2 * factor2 + 1.5 * factor3
            factor10 = 0.9 * factor1 - 0.5 * factor2 + 0.6 * factor3
            factor11 = 0.2 * factor1 - 0.1 * factor2 + 0.9 * factor3
            
            factornew1 = np.random.normal(loc=0.3, scale=2.0, size=(240, 1))
        
            factork = np.block([factor1, factor2, factor3, factor4, factor5, factor6, factor7, factor8, factor9])
            factorn = np.block([factor10, factor11])
        
            model1 = span(factorn, factork)
            model1._regress()
            model1._build_statistics()
            model1.fit()
            model1.summary()
        
            model2 = span(factornew1, factork)
            model2._regress()
            model2._build_statistics()
            model2.fit()
            model2.summary()
        
        test_spanning_test()
        
        ================================================================================================================================
        +--------+----------+---------+----------+---------+-----------+----------+------------+----------+-----------+----------+------------+-----+---+---+
        | asset  |  alpha   | p-value |  delta   |  F-test | p-value-F |    LR    | p-value-LR |    W     | p-value-W |    LM    | p-value-LM |  T  | N | K |
        +--------+----------+---------+----------+---------+-----------+----------+------------+----------+-----------+----------+------------+-----+---+---+
        | asset0 | -0.00000 | 0.13810 | -0.00000 | 3.10291 |  0.01543  | 12.83472 |  0.01211   | 13.14587 |  0.01058  | 12.53381 |  0.01379   | 240 | 2 | 9 |
        | asset1 | -0.00000 | 0.97109 | -0.00000 | 3.10291 |  0.01543  | 12.83472 |  0.01211   | 13.14587 |  0.01058  | 12.53381 |  0.01379   | 240 | 2 | 9 |
        +--------+----------+---------+----------+---------+-----------+----------+------------+----------+-----------+----------+------------+-----+---+---+
        +--------+---------+---------+---------+----------+-----------+----------+------------+----------+-----------+----------+------------+-----+---+---+
        | asset  |  alpha  | p-value |  delta  |  F-test  | p-value-F |    LR    | p-value-LR |    W     | p-value-W |    LM    | p-value-LM |  T  | N | K |
        +--------+---------+---------+---------+----------+-----------+----------+------------+----------+-----------+----------+------------+-----+---+---+
        | asset0 | 0.47970 | 0.00029 | 1.04366 | 10.56632 |  0.00004  | 21.09647 |  0.00003   | 22.05146 |  0.00002  | 20.19584 |  0.00004   | 240 | 1 | 9 |
        +--------+---------+---------+---------+----------+-----------+----------+------------+----------+-----------+----------+------------+-----+---+---+
        ```
        
        
        
        ### fama_macbeth
        
This module is designed for the Fama-MacBeth regression (1973).
        
        Fama-Macbeth Regression follows two steps:
        
1.  Specify the model and run the cross-sectional regression for each period.
2.  Take the time-series average of the regression coefficients.
        
For more details, please read *Empirical Asset Pricing: The Cross Section of Stock Returns* (Bali, Engle, and Murray, 2016).
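The two steps above can be sketched in a few lines of numpy on simulated data. This is a self-contained illustration of the procedure, not EAP's implementation; any standard-error adjustments (e.g. Newey-West) the package may apply are omitted:

```python
import numpy as np

rng = np.random.default_rng(2)
n_periods, n_assets = 20, 3000
true_gamma = np.array([-0.5, -1.0])            # true factor risk premiums

betas = []
for t in range(n_periods):
    x = rng.normal(size=(n_assets, 2))                  # characteristics at period t
    y = x @ true_gamma + rng.normal(size=n_assets)      # period-t returns
    b, *_ = np.linalg.lstsq(x, y, rcond=None)           # step 1: cross-sectional OLS
    betas.append(b)
betas = np.array(betas)

gamma = betas.mean(axis=0)                              # step 2: time-series average
tstat = gamma / (betas.std(axis=0, ddof=1) / np.sqrt(n_periods))
```

The time-series standard deviation of the period-by-period coefficients drives the t-statistics, which is the defining feature of the Fama-MacBeth approach.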
        
        
        
        #### class Fama_macbeth_regress():
        
        Package Needed: numpy, statsmodels, scipy, prettytable.
        
        ##### def \__init__(self, sample):
        
        **input :**
        
        *sample (ndarray/DataFrame):* data for analysis. The structure of the sample : 
        
        ​                The first column is dependent variable/ test portfolio return.
        
        ​                The second to the last-1 columns are independent variable/ factor loadings.
        
        ​                The last column is the time label.
        
        
        
        ##### def divide_by_time(self, sample):
        
This function groups the sample by time.
        
        **input :**
        
        *sample (ndarray/DataFrame):* The data for analysis in the \__init__ function.
        
        **output :**
        
        *groups_by_time (list):* The sample grouped by time.
        
        
        
        ##### def cross_sectional_regress(self, add_constant=True, normalization=True, **kwargs):
        
This function conducts the first step of the Fama-MacBeth regression: running the cross-sectional regression for each period.
        
        **input :** 
        
*add_constant (boolean):* whether to add an intercept in the cross-sectional regression.

*normalization (boolean):* whether to normalize the independent variables.
        
        **output :**
        
*parameters (ndarray):* The regression coefficients/factor risk premiums, whose rows are the coefficients for each period and whose columns are the regression variables.

*tvalue (ndarray):* The t-values of the coefficients.

*rsquare (list):* The r-square.

*adjrsq (list):* The adjusted r-square.

*n (list):* The number of observations in each group.
        
        
        
        ##### def time_series_average(self, **kwargs):
        
This function conducts the second step of the Fama-MacBeth regression: taking the time-series average of the cross-sectional regression coefficients.
        
        
        
        ##### def fit(self, **kwargs):
        
        This function fits the model by running the time_series_average function.
        
        **Example**
        
```python
        import numpy as np
        from fama_macbeth import Fama_macbeth_regress
            
        # construct sample
        year=np.ones((3000,1),dtype=int)*2020
        for i in range(19):
            year=np.append(year,(2019-i)*np.ones((3000,1),dtype=int))
        character=np.random.normal(0,1,(2,20*3000))
        # print('Character:',character)
        ret=np.array([-0.5,-1]).dot(character)+np.random.normal(0,1,20*3000)
        sample=np.array([ret,character[0],character[1],year]).T    
        # print('Sample:',sample)
        # print(sample.shape)
        
        model = Fama_macbeth_regress(sample)
        result = model.fit(add_constant=False)
        print(result)
        =========================================================================
        para_average: [-0.501 -1.003]
        tvalue: [-111.857 -202.247]
        R: 0.5579113793318332
        ADJ_R: 0.5576164569698131
        sample number N: 3000.0
        ```
        
        
        
        ##### def summary_by_time(self):
        
This function summarizes the cross-sectional regression results for each period.
        
        Package needed: prettytable.
        
        **Example**
        
```python
        # continue the previous code
        model.summary_by_time()
        ==========================================================================
        +--------+-----------------+----------+--------------+---------------+
        |  Year  |      Param      | R Square | Adj R Square | Sample Number |
        +--------+-----------------+----------+--------------+---------------+
        | 2001.0 | [-0.499 -0.990] |   0.53   |     0.53     |      3000     |
        | 2002.0 | [-0.524 -0.987] |   0.56   |     0.56     |      3000     |
        | 2003.0 | [-0.544 -1.015] |   0.58   |     0.58     |      3000     |
        | 2004.0 | [-0.474 -0.948] |   0.53   |     0.53     |      3000     |
        | 2005.0 | [-0.502 -1.007] |   0.57   |     0.57     |      3000     |
        | 2006.0 | [-0.497 -0.981] |   0.55   |     0.55     |      3000     |
        | 2007.0 | [-0.526 -1.020] |   0.57   |     0.57     |      3000     |
        | 2008.0 | [-0.476 -1.024] |   0.56   |     0.56     |      3000     |
        | 2009.0 | [-0.533 -1.011] |   0.57   |     0.57     |      3000     |
        | 2010.0 | [-0.493 -1.029] |   0.57   |     0.57     |      3000     |
        | 2011.0 | [-0.504 -0.975] |   0.55   |     0.55     |      3000     |
        | 2012.0 | [-0.508 -1.002] |   0.56   |     0.56     |      3000     |
        | 2013.0 | [-0.474 -1.015] |   0.56   |     0.56     |      3000     |
        | 2014.0 | [-0.503 -0.998] |   0.55   |     0.55     |      3000     |
        | 2015.0 | [-0.485 -1.034] |   0.55   |     0.55     |      3000     |
        | 2016.0 | [-0.514 -1.005] |   0.57   |     0.57     |      3000     |
        | 2017.0 | [-0.498 -1.016] |   0.58   |     0.58     |      3000     |
        | 2018.0 | [-0.475 -0.994] |   0.55   |     0.55     |      3000     |
        | 2019.0 | [-0.487 -0.974] |   0.54   |     0.54     |      3000     |
        | 2020.0 | [-0.511 -1.031] |   0.56   |     0.56     |      3000     |
        +--------+-----------------+----------+--------------+---------------+
        
        ```
        
        
        
        ##### def summary(self, charactername=None):
        
This function summarizes the final result.
        
        **input :**
        
*charactername :* The names of the factors in the cross-sectional regression model.
        
        **Example**
        
        ```python
        # continue the previous code
        model.summary()
        ================================================================================
        +-----------------+---------------------+-----------+---------------+-----------+
        |      Param      |     Param Tvalue    | Average R | Average adj R | Average n |
        +-----------------+---------------------+-----------+---------------+-----------+
        | [-0.501 -1.003] | [-111.857 -202.247] |   0.558   |     0.558     |   3000.0  |
        +-----------------+---------------------+-----------+---------------+-----------+
        ```
        
        
        
        #### class Factor_mimicking_portfolio
        
This module is designed for generating factor-mimicking portfolios following the Fama-French (1993) conventions and then calculating the factor risk premiums.

The Fama-French (1993) grouping conventions:
        
        
        
        | Size\Factor  | Low(<=30%) | Medium(30%< & <=70%) | High(>70%) |
        | ------------ | ---------- | -------------------- | ---------- |
        | Small(<=50%) | S/L        | S/M                  | S/H        |
        | Big(>50%)    | B/L        | B/M                  | B/H        |
        
        
        
1. Group stocks along two dimensions. One dimension is size, split at the median into 50% small and 50% big stocks. The other is the factor, split at the 30% and 70% percentiles into low, medium, and high groups.
        
2. Calculate the market-value-weighted portfolio returns and the factor risk premiums.
           
           
           $$
           SMB=1/3(S/L+S/M+S/H)-1/3(B/L+B/M+B/H)
           $$
           
           $$
           Factor=1/2(S/H+B/H)-1/2(S/L+B/L)
           $$
           
        
        
        
3. In Fama-French (1993), the factor is the book-to-market ratio, and subsequent literature follows the same approach to construct factor-mimicking portfolios. The return of each portfolio is the market-value-weighted portfolio return.
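A minimal pandas sketch of the 2×3 sort and the two premium formulas, using equal-weighted cell returns for brevity (the convention uses value weights, and the column names here are illustrative, not EAP's):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 6000
size = rng.normal(size=n)                       # size characteristic
bm = rng.normal(size=n)                         # second factor (e.g. book-to-market)
ret = 0.8 * bm + rng.normal(scale=0.5, size=n)  # returns load on the factor, not on size
df = pd.DataFrame({'ret': ret, 'size': size, 'bm': bm})

# independent 2x3 sort: size at the median, factor at the 30%/70% percentiles
df['size_grp'] = np.where(df['size'] <= df['size'].median(), 'S', 'B')
q30, q70 = df['bm'].quantile([0.3, 0.7])
df['bm_grp'] = np.select([df['bm'] <= q30, df['bm'] <= q70], ['L', 'M'], default='H')

cell = df.groupby(['size_grp', 'bm_grp'])['ret'].mean()  # the six portfolio returns

SMB = cell.loc['S'].mean() - cell.loc['B'].mean()        # 1/3(S/L+S/M+S/H) - 1/3(B/L+B/M+B/H)
Factor = (cell.loc[('S', 'H')] + cell.loc[('B', 'H')]) / 2 \
       - (cell.loc[('S', 'L')] + cell.loc[('B', 'L')]) / 2
```

Because the simulated returns load on the factor but not on size, the Factor premium comes out large and positive while SMB is close to zero.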
        
        
        
        ##### def \__init__(self, sample, perc_row=[0, 50, 100], perc_col=[0, 30, 70, 100],  percn_row=None, percn_col=None, weight=True):
        
        This function initializes the object.
        
        **input :**
        
        *sample (ndarray/DataFrame):* data for analysis. The structure of the sample : 
        
        ​                The first column is dependent variable/ test portfolio return.
        
        ​                The second is the first factor.
        
        ​                The third is the second factor.
        
        ​                The fourth is the timestamp.
        
        ​                The fifth is the weight.
        
        *perc_row (list/array):* The percentile points that divide stocks by the first factor. The **Default** percentile is [0, 50, 100].
        
        *perc_col (list/array):* The percentile points that divide stocks by the second factor. The **Default** percentile is [0, 30, 70, 100].
        
*percn_row (list/array):* The externally given percentiles (e.g. NYSE breakpoints) that divide stocks by the first factor.

*percn_col (list/array):* The externally given percentiles (e.g. NYSE breakpoints) that divide stocks by the second factor.

*weight (boolean):* whether to calculate value-weighted portfolio returns. The **Default** is True.
        
        
        
        ##### def portfolio_return_time(self):
        
This function constructs the portfolios and calculates the average return and difference matrix.
        
        **output :** 
        
        *diff (ndarray):* The differenced portfolio return matrix.
        
        
        
        **Example**
        
        ```python
        '''
        TEST Factor_mimicking_portfolio
        construct sample:
            1. 20 Periods
            2. 3000 Observations for each Period
            3. Character negative with return following the return=character*-0.5+sigma where sigma~N(0,1)
        '''
        import numpy as np
        from fama_macbeth import Factor_mimicking_portfolio
            
        # construct sample
        year=np.ones((3000,1),dtype=int)*2020
        for i in range(19):
            year=np.append(year,(2019-i)*np.ones((3000,1),dtype=int))
        character=np.random.normal(0, 1, (2, 20*3000))
        weight = np.random.uniform(0, 1, (1, 20*3000))
        #    print('Character:',character)
        ret=np.array([-0.5,-1]).dot(character)+np.random.normal(0,1,20*3000)
        sample=np.array([ret, character[0], character[1], year, weight[0]]).T    
        #    print('Sample:',sample)
        #    print(sample.shape)
        
        model = Factor_mimicking_portfolio(sample)
        portfolio_return_time = model.portfolio_return_time()
        print('portfolio_return_time:', portfolio_return_time)
        print('portfolio_return_time:', np.shape(portfolio_return_time))
        ========================================================================
        portfolio_return_time: 
         [[[ 1.6302854   1.5920872   1.54199455  1.47560967  1.60182404
            1.48860463  1.70067317  1.57084898  1.52938766  1.54919833
            1.58910675  1.44369383  1.6489323   1.57230951  1.64104889
            1.52816059  1.53197648  1.44067358  1.55358692  1.60347805]
          [ 0.27087317  0.37938255  0.46473997  0.43118946  0.36934624
            0.3215685   0.51716485  0.3740702   0.38663032  0.44333387
            0.38917603  0.31347239  0.30760233  0.41415477  0.45250083
            0.4439825   0.35821053  0.40601508  0.40495275  0.46236114]
          [-0.77286286 -0.67348462 -0.79216628 -0.76712914 -0.73108828
           -0.74141791 -0.85011447 -0.65743291 -0.57583562 -0.80490414
           -0.64311824 -0.79901273 -0.81325556 -0.84443278 -0.80362147
           -0.75246184 -0.61693674 -0.69909426 -0.68554981 -0.6564971 ]
          [-2.40314826 -2.26557182 -2.33416083 -2.24273881 -2.33291232
           -2.23002254 -2.55078764 -2.22828188 -2.10522328 -2.35410248
           -2.23222499 -2.24270656 -2.46218786 -2.41674229 -2.44467035
           -2.28062243 -2.14891322 -2.13976784 -2.23913673 -2.25997515]]
        
         [[ 0.73368397  0.87086931  0.69626265  0.83424514  0.75825827
            0.65550255  0.79076344  0.77316807  0.82031302  0.83320932
            0.76971902  0.81248391  0.74568233  0.7749383   0.69168886
            0.7511554   0.70757116  0.75320784  0.78223447  0.73699303]
          [-0.55494493 -0.4097451  -0.42006211 -0.36317491 -0.47791724
           -0.40042274 -0.36004971 -0.39127871 -0.4712118  -0.33368187
           -0.48957933 -0.41103346 -0.46612991 -0.39504384 -0.34021312
           -0.41049521 -0.33209925 -0.39212119 -0.42890556 -0.40389983]
          [-1.63941363 -1.52575966 -1.55711119 -1.45657452 -1.53325438
           -1.5163362  -1.4984305  -1.50929504 -1.52485715 -1.60331314
           -1.56591033 -1.60738805 -1.77875307 -1.4963315  -1.65246163
           -1.55526031 -1.40809313 -1.49853102 -1.56149388 -1.48409324]
          [-2.3730976  -2.39662897 -2.25337384 -2.29081966 -2.29151265
           -2.17183875 -2.28919394 -2.28246311 -2.34517017 -2.43652246
           -2.33562934 -2.41987196 -2.5244354  -2.27126981 -2.34415049
           -2.30641572 -2.11566429 -2.25173887 -2.34372835 -2.22108628]]
        
         [[-0.89660143 -0.72121789 -0.8457319  -0.64136454 -0.84356577
           -0.83310208 -0.90990973 -0.79768091 -0.70907464 -0.71598901
           -0.81938773 -0.63120993 -0.90324997 -0.7973712  -0.94936003
           -0.77700518 -0.82440531 -0.68746574 -0.77135245 -0.86648502]
          [-0.82581811 -0.78912765 -0.88480208 -0.79436437 -0.84726347
           -0.72199124 -0.87721456 -0.7653489  -0.85784212 -0.77701574
           -0.87875536 -0.72450585 -0.77373224 -0.80919861 -0.79271394
           -0.85447771 -0.69030979 -0.79813627 -0.83385831 -0.86626098]
          [-0.86655077 -0.85227505 -0.76494491 -0.68944538 -0.8021661
           -0.77491829 -0.64831603 -0.85186213 -0.94902153 -0.798409
           -0.92279209 -0.80837532 -0.96549751 -0.65189873 -0.84884016
           -0.80279847 -0.79115639 -0.79943676 -0.87594408 -0.82759615]
          [ 0.03005066 -0.13105715  0.08078699 -0.04808085  0.04139968
            0.05818379  0.26159369 -0.05418122 -0.23994689 -0.08241999
           -0.10340436 -0.17716539 -0.06224754  0.14547248  0.10051987
           -0.02579329  0.03324893 -0.11197102 -0.10459162  0.03888887]]]
        portfolio_return_time: 
         (3, 4, 20)
        ```
        
        
        
        ##### def portfolio_return(self):
        
This function constructs the factor risk premium returns from the grouped portfolios.
        
        **output :**
        
*return_row :* The first factor risk premium. The **default** is the size factor.

*return_col :* The second factor risk premium.
        
        
        
        ##### def portfolio_return_horizon(self, period, log):
        
This function constructs horizon pricing factors. For details, see *Horizon Pricing, JFQA, 2016, 51(6): 1769-1793*.
        
        **input :**
        
*period (int):* The horizon, in periods, over which the factor risk premium return is aggregated.

*log (boolean):* Whether to use log returns.
        
        **output :**
        
*return_row_multi :* The first factor risk premium over the horizon. The **default** is the size factor.

*return_col_multi :* The second factor risk premium over the horizon.
        
        
        
        **Example**
        
        ```python
        # Continue the previous code
        portfolio_return = model.portfolio_return()
        print('portfolio_return_row: \n', portfolio_return[0])
        print('portfolio_return_row:', np.shape(portfolio_return[0]))
        print('portfolio_return_col: \n', portfolio_return[1])
        print('portfolio_return_col:', np.shape(portfolio_return[1]))
        ==============================================================
        portfolio_return_row: 
         1970-01-01 00:00:00.000002001   -0.639730
        1970-01-01 00:00:00.000002002   -0.623419
        1970-01-01 00:00:00.000002003   -0.603673
        1970-01-01 00:00:00.000002004   -0.543314
        1970-01-01 00:00:00.000002005   -0.612899
        1970-01-01 00:00:00.000002006   -0.567957
        1970-01-01 00:00:00.000002007   -0.543462
        1970-01-01 00:00:00.000002008   -0.617268
        1970-01-01 00:00:00.000002009   -0.688971
        1970-01-01 00:00:00.000002010   -0.593458
        1970-01-01 00:00:00.000002011   -0.681085
        1970-01-01 00:00:00.000002012   -0.585314
        1970-01-01 00:00:00.000002013   -0.676182
        1970-01-01 00:00:00.000002014   -0.528249
        1970-01-01 00:00:00.000002015   -0.622599
        1970-01-01 00:00:00.000002016   -0.615019
        1970-01-01 00:00:00.000002017   -0.568156
        1970-01-01 00:00:00.000002018   -0.599252
        1970-01-01 00:00:00.000002019   -0.646437
        1970-01-01 00:00:00.000002020   -0.630363
        dtype: float64
        portfolio_return_row: (20,)
        portfolio_return_col: 
         1970-01-01 00:00:00.000002001   -1.582065
        1970-01-01 00:00:00.000002002   -1.597753
        1970-01-01 00:00:00.000002003   -1.502249
        1970-01-01 00:00:00.000002004   -1.527213
        1970-01-01 00:00:00.000002005   -1.527675
        1970-01-01 00:00:00.000002006   -1.447892
        1970-01-01 00:00:00.000002007   -1.526129
        1970-01-01 00:00:00.000002008   -1.521642
        1970-01-01 00:00:00.000002009   -1.563447
        1970-01-01 00:00:00.000002010   -1.624348
        1970-01-01 00:00:00.000002011   -1.557086
        1970-01-01 00:00:00.000002012   -1.613248
        1970-01-01 00:00:00.000002013   -1.682957
        1970-01-01 00:00:00.000002014   -1.514180
        1970-01-01 00:00:00.000002015   -1.562767
        1970-01-01 00:00:00.000002016   -1.537610
        1970-01-01 00:00:00.000002017   -1.410443
        1970-01-01 00:00:00.000002018   -1.501159
        1970-01-01 00:00:00.000002019   -1.562486
        1970-01-01 00:00:00.000002020   -1.480724
        dtype: float64
        portfolio_return_col: (20,)
        ```
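The multi-period aggregation behind portfolio_return_horizon can be illustrated with a minimal numpy sketch. The series and horizon below are made up for illustration; this is not the package's implementation:

```python
import numpy as np

# Hypothetical one-period factor premium series (simple returns)
rng = np.random.default_rng(0)
r = rng.normal(loc=0.01, scale=0.05, size=120)

period = 6  # aggregation horizon in periods

# Simple-return compounding over rolling windows of length `period`
r_multi = np.array([np.prod(1 + r[t - period + 1:t + 1]) - 1
                    for t in range(period - 1, len(r))])

# With log=True, a multi-period log return is the rolling sum of one-period log returns
log_r = np.log(1 + r)
r_multi_log = np.array([log_r[t - period + 1:t + 1].sum()
                        for t in range(period - 1, len(r))])

print(r_multi.shape)                                 # (115,)
print(np.allclose(np.log(1 + r_multi), r_multi_log)) # True
```

The two aggregation rules are equivalent up to the log transform, which is why the *log* flag only changes the scale, not the information content, of the horizon factor.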
        
        
        
### time_series_regress
        
        #### class TS_regress()
        
        This class is designed for time series regression, 
        
        
        $$
r_{i,t} = \beta_i f_t + \epsilon_{i,t}
        $$
        
        
        to obtain the beta for each asset.
        
        
        
        ##### def \__init__(self, list_y, factor):
        
        This function initializes the object.
        
        **input :**
        
        *list_y (list/DataFrame):* The return matrix with i rows and t columns.
        
        *factor (ndarray or DataFrame):* The factor risk premium return series.
        
        
        
        ##### def ts_regress(self, newey_west=*True*):
        
        This function is for conducting the time series regression.
        
        **input :**
        
*newey_west (boolean):* Whether to apply the Newey-West adjustment. The default is *True*.
        
        **output :**
        
        *self.alpha (list):* The regression alpha.
        
        *self.e_mat (ndarray):* The error matrix.
        
        
        
        **Example**
        
        ```python
import numpy as np
        from EAP.time_series_regress import TS_regress
        
        X = np.random.normal(loc=0.0, scale=1.0, size=(2000,10))
        y_list = []
        for i in range(10) :
            b = np.random.uniform(low=0.1, high=1.1, size=(10,1))
            e = np.random.normal(loc=0.0, scale=1.0, size=(2000,1))
            y = X.dot(b) + e 
            y_list.append(y)
        
        re = TS_regress(y_list, X)
        ```
        
        
        
        ##### def fit(self, **kwargs):
        
This function runs the regression in ts_regress.
        
        **Example**
        
        ```python
        # continue the previous code
        re.fit()
        ```
        
        
        
        ##### def summary(self):
        
This function summarizes the results, including the GRS test.
        
        **Example**
        
        ```python
        # continue the previous code
        re.summary()
        =============================================================================
        ----------------------------------- GRS Test -------------------------------- 
        
        GRS Statistics: 
         [[0.67160203]]
        GRS p_value: 
         [[0.75175471]]
        ------------------------------------------------------------------------------
        ```
        
        
        
        ##### def grs(self):
        
        This function conducts the GRS test.
        
        **output :**
        
        *grs_stats (list):* The GRS statistics.
        
        *p_value (list):* The p_value.
        
        
        
        **Example**
        
        ```python
        # continue the previous code
        print(re.grs())
        ==============================================
        (array([[0.67160203]]), array([[0.75175471]]))
        ```
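For reference, the GRS statistic can be sketched in a few lines of numpy/scipy. This is a minimal illustration of the published formula (using MLE covariance estimates), not the package's internal code, and the simulated data are made up:

```python
import numpy as np
from scipy import stats

def grs_test(alpha, resid, factors):
    """Sketch of the GRS statistic (Gibbons, Ross and Shanken, 1989)."""
    T, N = resid.shape
    K = factors.shape[1]
    mu = factors.mean(axis=0)                       # (K,) mean factor premia
    fd = factors - mu
    omega = fd.T @ fd / T                           # (K, K) factor covariance (MLE)
    sigma = resid.T @ resid / T                     # (N, N) residual covariance (MLE)
    quad_a = alpha @ np.linalg.solve(sigma, alpha)  # alpha' Sigma^{-1} alpha
    quad_f = mu @ np.linalg.solve(omega, mu)        # mu' Omega^{-1} mu
    grs = (T - N - K) / N * quad_a / (1.0 + quad_f)
    p_value = 1.0 - stats.f.cdf(grs, N, T - N - K)  # F(N, T-N-K) under the null
    return grs, p_value

# Simulated data with zero true alphas: the test should rarely reject
rng = np.random.default_rng(0)
T, N, K = 2000, 10, 3
f = rng.normal(0.0, 0.1, size=(T, K))
b = rng.uniform(0.1, 1.1, size=(N, K))
Y = f @ b.T + rng.normal(0.0, 0.5, size=(T, N))

Xc = np.column_stack([np.ones(T), f])               # time-series regressors
coef, *_ = np.linalg.lstsq(Xc, Y, rcond=None)
alpha = coef[0]                                     # (N,) intercepts
resid = Y - Xc @ coef                               # (T, N) residuals
stat, p = grs_test(alpha, resid, f)
print(stat, p)
```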
        
        
        
        
        
        ### cross_section_regress
        
        #### class CS_regress()
        
        ##### def \__init__(self, y_list, factor):
        
        This function initializes the class.
        
        **input :**
        
        *y_list (list/DataFrame):* The assets return series list.
        
        *factor (ndarray/DataFrame):* The factor risk premium series.
        
        
        
        ##### def ts_regress(self):
        
        This function conducts the time series regression.
        
        **output :**
        
*beta (ndarray[N, c]):* The betas for each asset.
        
        *err_mat (ndarray):* The error matrix.
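This first pass can be sketched with plain numpy on simulated data (illustrative only; the package's internal routine may differ):

```python
import numpy as np

# Simulate N assets driven by K factors, then recover betas by per-asset OLS
rng = np.random.default_rng(1)
T, N, K = 500, 4, 3
f = rng.normal(size=(T, K))                    # factor premium series
true_beta = rng.uniform(-1, 1, size=(N, K))
R = f @ true_beta.T + rng.normal(scale=0.1, size=(T, N))

X = np.column_stack([np.ones(T), f])           # regressors with a constant
coef, *_ = np.linalg.lstsq(X, R, rcond=None)   # (K+1, N) coefficient matrix
beta = coef[1:].T                              # (N, K) slope estimates
err_mat = R - X @ coef                         # (T, N) residual matrix

print(beta.shape, err_mat.shape)               # (4, 3) (500, 4)
```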
        
        
        
        ##### def ave(self):
        
This function averages each asset's return series over time.
        
        **output :**
        
*np.mean(self.y_list, axis=1) :* The time-series average of each asset's return series.
        
        
        
        ##### def cs_regress(self, beta, err_mat, constant=*True*, gls=*True*, **kwargs):
        
This function runs the cross-sectional regression.
        
        **input :**
        
        *beta (ndarray):* The betas from the output of the function ts_regress().
        
        *err_mat (ndarray) :* The error matrix from the output of the function ts_regress().
        
*constant (boolean):* Whether to add a constant. The default is *True*.

*gls (boolean):* Use GLS (*True*) or OLS (*False*) regression. The default is GLS.
        
        **output :**
        
*params (array):* The parameters of the cross-sectional regression model.

*resid (array):* The residuals of the cross-sectional regression model.
        
        **Example**
        
        ```python
        from EAP.cross_section_regress import CS_regress
        import numpy as np
            
        X = np.random.normal(loc=0, scale=0.1, size=(2000,3))
        y_list = []
        for i in range(100) :
            b = np.random.uniform(low=-1, high=1, size=(3,1))
            e = np.random.normal(loc=0.0, scale=0.5, size=(2000,1))
    y = X.dot(b) + e  # true alphas are zero by construction
            y_list.append(y)
        print(np.mean(X, axis= 0)) # average return of the factor risk premium 
        ========================================================================
        [-0.00496423  0.00146649 -0.0004722 ]
        ```
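The two-pass logic behind cs_regress can be sketched as follows. This is a self-contained simulation, and the OLS second pass shown here is a simplification of the package's GLS default:

```python
import numpy as np

rng = np.random.default_rng(0)
T, N, K = 2000, 100, 3
X = rng.normal(0, 0.1, size=(T, K))                  # factor premium series
b = rng.uniform(-1, 1, size=(N, K))
Y = X @ b.T + rng.normal(0, 0.5, size=(T, N))        # asset returns

# First pass: time-series OLS per asset gives the betas
Xc = np.column_stack([np.ones(T), X])
coef, *_ = np.linalg.lstsq(Xc, Y, rcond=None)
beta = coef[1:].T                                    # (N, K)

# Second pass: cross-sectional OLS of average returns on the betas
y_bar = Y.mean(axis=0)
B = np.column_stack([np.ones(N), beta])              # with a constant
params, *_ = np.linalg.lstsq(B, y_bar, rcond=None)   # [const, lambda_1..lambda_K]
resid = y_bar - B @ params                           # (N,) pricing errors

print(params.shape, resid.shape)                     # (4,) (100,)
```

The estimated lambdas are the factor risk premia; the residuals are the pricing errors tested by union_test below.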
        
        
        
        ##### def cov_mat(self, beta, err_mat, shanken=*True*, constant=*True*, gls=*True*, **kwargs):
        
        This function calculates the covariance matrix of the cross regression model. 
        
        **input :**
        
        *beta (ndarray):* The betas from the output of the function ts_regress().
        
        *err_mat (ndarray):* The error matrix from the output of the function ts_regress().
        
*shanken (boolean):* Whether to apply the Shanken adjustment. The default is *True*.

*constant (boolean):* Whether to add a constant.

*gls (boolean):* Use GLS (*True*) or OLS (*False*) regression. The default is GLS.
        
        **output :**
        
        *param_cov_mat (ndarray):* The covariance matrix of the parameters.
        
*resid_cov_mat (ndarray):* The covariance matrix of the residuals.
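The Shanken (1992) adjustment scales the sampling covariance of the estimated premia by the multiplier $1+\lambda'\Sigma_f^{-1}\lambda$. A minimal sketch of that multiplier, with illustrative simulated factors:

```python
import numpy as np

rng = np.random.default_rng(0)
f = rng.normal(0.0, 0.1, size=(2000, 3))      # factor premium series
lam = f.mean(axis=0)                          # estimated premia (sample means here)
sigma_f = np.cov(f, rowvar=False)             # (3, 3) factor covariance

# Shanken correction multiplier applied to the parameter covariance matrix
c = 1.0 + lam @ np.linalg.solve(sigma_f, lam)
print(c)   # >= 1; close to 1 when premia are small relative to factor volatility
```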
        
        
        
        ##### def t_test(self, param_cov_mat, params):
        
This function performs t-tests on the parameters.
        
        **input :**
        
        *param_cov_mat (ndarray):* The covariance matrix of the parameters from the function cov_mat.
        
        *params (ndarray):* The parameters from the function cs_regress.
        
        **output :**
        
        *t_value (ndarray):* The t-value for statistical inference.
        
        *p_value (ndarray):* The p-value for statistical inference.
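The computation can be sketched as follows. The numbers are illustrative, and the degrees of freedom are assumed to be T-1 here, which may differ from the package:

```python
import numpy as np
from scipy import stats

params = np.array([-0.0042, 0.0019, -0.0026])       # illustrative premia
param_cov_mat = np.diag([8.0e-6, 8.4e-6, 8.1e-6])   # illustrative covariance
T = 2000                                            # assumed sample length

t_value = params / np.sqrt(np.diag(param_cov_mat))  # parameter / standard error
p_value = 2 * (1 - stats.t.cdf(np.abs(t_value), df=T - 1))  # two-sided p-value

print(t_value)
print(p_value)
```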
        
        
        
        ##### def union_test(self, resid_cov_mat, resid):
        
This function performs a joint (chi-square) test on the residual pricing errors.
        
        **input :**
        
*resid_cov_mat (ndarray):* The covariance matrix of the residuals.

*resid (ndarray):* The residuals from the function cs_regress.
        
        **output :**
        
        *chi_square (list):* The chi-square statistics.
        
        *p_value (list):* The p-value corresponding to the chi-square.
        
        
        
        ##### def fit(self, **kwargs):
        
This function runs the cross-sectional regression and performs the statistical inference.
        
        
        
        ##### def summary(self):
        
This function prints the summary.
        
         **Example**
        
        ```python
        # continue the previous code
        print("\n---------------------GLS: Constant=True shanken=True------------------------\n")
        re = CS_regress(y_list, X)
        re.fit()
        re.summary()
        print("\n------------------------------------------------------------------------\n")
            
        print("\n---------------------GLS: Constant=False shanken=True------------------------\n")
        re = CS_regress(y_list, X)
        re.fit(constant=False)
        re.summary()
        print("\n------------------------------------------------------------------------\n")
            
        print("\n---------------------GLS: Constant=True shanken=False------------------------\n")
        re = CS_regress(y_list, X)
        re.fit(shanken=False)
        re.summary()
        print("\n------------------------------------------------------------------------\n")
            
        print("\n---------------------GLS: Constant=False shanken=False------------------------\n")
        re = CS_regress(y_list, X)
        re.fit(constant=False, shanken=False)
        re.summary()
        print("\n------------------------------------------------------------------------\n")
            
        print("\n---------------------OLS: Constant=True shanken=True------------------------\n")
        re = CS_regress(y_list, X)
        re.fit(gls=False)
        re.summary()
        print("\n------------------------------------------------------------------------\n")
            
        print("\n---------------------OLS: Constant=False shanken=True------------------------\n")
        re = CS_regress(y_list, X)
        re.fit(constant=False, gls=False)
        re.summary()
        print("\n------------------------------------------------------------------------\n")
            
        print("\n---------------------OLS: Constant=True shanken=False------------------------\n")
        re = CS_regress(y_list, X)
        re.fit(shanken=False, gls=False)
        re.summary()
        print("\n------------------------------------------------------------------------\n")
            
        print("\n---------------------OLS: Constant=False shanken=False------------------------\n")
        re = CS_regress(y_list, X)
        re.fit(constant=False, shanken=False, gls=False)
        re.summary()
        print("\n------------------------------------------------------------------------\n")
        ================================================================================================================================
        ---------------------GLS: Constant=True shanken=True------------------------
        
        
        -------------------------- Risk Premium ------------------------------
        
        +-----------------------+---------------+--------------+
        |         params        |    t_value    |   p_value    |
        +-----------------------+---------------+--------------+
        | -0.004239493082637248 | [-1.49931456] | [0.13394995] |
        | 0.0018754489425203834 |  [0.64798047] |  [0.517072]  |
        | -0.002623980021497935 | [-0.92194337] | [0.35666937] |
        +-----------------------+---------------+--------------+
        
        ----------------------------------------------------------------------
        
        
        ----------------------------- Alpha test -----------------------------
        
        +-----------------+----------------+
        |    chi-square   |    p_value     |
        +-----------------+----------------+
        | [[89.27084753]] | [[0.69922497]] |
        +-----------------+----------------+
        
        ----------------------------------------------------------------------
        
        
        ------------------------------------------------------------------------
        
        
        ---------------------GLS: Constant=False shanken=True------------------------
        
        
        -------------------------- Risk Premium ------------------------------
        
        +------------------------+---------------+--------------+
        |         params         |    t_value    |   p_value    |
        +------------------------+---------------+--------------+
        | -0.0044337140100835495 | [-1.56801135] | [0.11703675] |
        | 0.0014489988306153607  |  [0.50064261] | [0.61667778] |
        | -0.0025199902303684996 |  [-0.8854116] | [0.37604117] |
        +------------------------+---------------+--------------+
        
        ----------------------------------------------------------------------
        
        
        ----------------------------- Alpha test -----------------------------
        
        +------------------+----------------+
        |    chi-square    |    p_value     |
        +------------------+----------------+
        | [[123.73200185]] | [[0.03486519]] |
        +------------------+----------------+
        
        ----------------------------------------------------------------------
        
        
        ------------------------------------------------------------------------
        
        
        ---------------------GLS: Constant=True shanken=False------------------------
        
        
        -------------------------- Risk Premium ------------------------------
        
        +-----------------------+---------------+--------------+
        |         params        |    t_value    |   p_value    |
        +-----------------------+---------------+--------------+
        | -0.004239493082637248 | [-1.50013478] | [0.13373741] |
        | 0.0018754489425203834 |  [0.64838824] | [0.51680835] |
        | -0.002623980021497935 | [-0.92243422] | [0.35641345] |
        +-----------------------+---------------+--------------+
        
        ----------------------------------------------------------------------
        
        
        ----------------------------- Alpha test -----------------------------
        
        +-----------------+---------+
        |    chi-square   | p_value |
        +-----------------+---------+
        | [[32.53458229]] |  [[1.]] |
        +-----------------+---------+
        
        ----------------------------------------------------------------------
        
        
        ------------------------------------------------------------------------
        
        
        ---------------------GLS: Constant=False shanken=False------------------------
        
        
        -------------------------- Risk Premium ------------------------------
        
        +------------------------+---------------+--------------+
        |         params         |    t_value    |   p_value    |
        +------------------------+---------------+--------------+
        | -0.0044337140100835495 | [-1.56885941] | [0.11683897] |
        | 0.0014489988306153607  |  [0.50095408] | [0.6164586]  |
        | -0.0025199902303684996 | [-0.88587763] | [0.37579002] |
        +------------------------+---------------+--------------+
        
        ----------------------------------------------------------------------
        
        
        ----------------------------- Alpha test -----------------------------
        
        +------------------+---------+
        |    chi-square    | p_value |
        +------------------+---------+
        | [[-49.33885529]] |  [[1.]] |
        +------------------+---------+
        
        ----------------------------------------------------------------------
        
        
        ------------------------------------------------------------------------
        
        
        ---------------------OLS: Constant=True shanken=True------------------------
        
        
        -------------------------- Risk Premium ------------------------------
        
        +------------------------+---------------+--------------+
        |         params         |    t_value    |   p_value    |
        +------------------------+---------------+--------------+
        | -0.004191679829911098  | [-1.46410802] | [0.14332167] |
        | 0.0026957168215120215  |  [0.91917806] | [0.35811334] |
        | -0.0028054613152236852 |  [-0.9776489] | [0.3283663]  |
        +------------------------+---------------+--------------+
        
        ----------------------------------------------------------------------
        
        
        ----------------------------- Alpha test -----------------------------
        
        +------------------+----------------+
        |    chi-square    |    p_value     |
        +------------------+----------------+
        | [[105.04417306]] | [[0.27097431]] |
        +------------------+----------------+
        
        ----------------------------------------------------------------------
        
        
        ------------------------------------------------------------------------
        
        
        ---------------------OLS: Constant=False shanken=True------------------------
        
        
        -------------------------- Risk Premium ------------------------------
        
        +-----------------------+---------------+--------------+
        |         params        |    t_value    |   p_value    |
        +-----------------------+---------------+--------------+
        | -0.004314458572417905 | [-1.50700942] | [0.13196625] |
        |  0.002435573118655651 |  [0.83048514] | [0.40636372] |
        | -0.002772551881136359 | [-0.96619054] | [0.33406572] |
        +-----------------------+---------------+--------------+
        
        ----------------------------------------------------------------------
        
        
        ----------------------------- Alpha test -----------------------------
        
        +------------------+----------------+
        |    chi-square    |    p_value     |
        +------------------+----------------+
        | [[117.95438501]] | [[0.07282442]] |
        +------------------+----------------+
        
        ----------------------------------------------------------------------
        
        
        ------------------------------------------------------------------------
        
        
        ---------------------OLS: Constant=True shanken=False------------------------
        
        
        -------------------------- Risk Premium ------------------------------
        
        +------------------------+---------------+--------------+
        |         params         |    t_value    |   p_value    |
        +------------------------+---------------+--------------+
        | -0.004191679829911098  | [-1.46507223] | [0.14305847] |
        | 0.0026957168215120215  |  [0.91987006] | [0.35775165] |
        | -0.0028054613152236852 |  [-0.9782681] | [0.32806011] |
        +------------------------+---------------+--------------+
        
        ----------------------------------------------------------------------
        
        
        ----------------------------- Alpha test -----------------------------
        
        +------------------+---------------+
        |    chi-square    |    p_value    |
        +------------------+---------------+
        | [[144.80511192]] | [[0.0011985]] |
        +------------------+---------------+
        
        ----------------------------------------------------------------------
        
        
        ------------------------------------------------------------------------
        
        
        ---------------------OLS: Constant=False shanken=False------------------------
        
        
        -------------------------- Risk Premium ------------------------------
        
        +-----------------------+---------------+--------------+
        |         params        |    t_value    |   p_value    |
        +-----------------------+---------------+--------------+
        | -0.004314458572417905 | [-1.50798575] | [0.13171619] |
        |  0.002435573118655651 |  [0.8311002]  | [0.40601628] |
        | -0.002772551881136359 | [-0.96679254] | [0.3337647]  |
        +-----------------------+---------------+--------------+
        
        ----------------------------------------------------------------------
        
        
        ----------------------------- Alpha test -----------------------------
        
        +------------------+----------------+
        |    chi-square    |    p_value     |
        +------------------+----------------+
        | [[100.02211446]] | [[0.39645777]] |
        +------------------+----------------+
        
        ----------------------------------------------------------------------
        
        
        ------------------------------------------------------------------------
        ```
        
        
        
        ### adjust
        
This module consists of several common adjustment methods used in factor analysis.
        
        
        
        ##### def ols(y, x, constant=*True*):
        
This function performs an OLS regression, equivalent to the OLS module in the package *statsmodels*.
        
        **input :**
        
        *y (ndarray):* The dependent variable.
        
        *x (ndarray):* The explanatory variable.
        
        *constant (boolean):* add constant or not in OLS model. The default is *True*.
        
        **output :**
        
        *result (OLSRegressResult):* The result of the regression.
        
        ```python
        import numpy as np
from EAP.adjust import newey_west, newey_west_t, ols, white, white_t
        
        X = np.random.normal(loc=0.0, scale=1.0, size=(2000,10))
        b = np.random.uniform(low=0.1, high=1.1, size=(10,1))
        e = np.random.normal(loc=0.0, scale=1.0, size=(2000,1))
        y = X.dot(b) + e + 1.0
        
        re = ols(y, X, constant=True)
        print('\nTrue b : ', b)
        print('\nEstimated b : ', re.params)
        print('\nresidue : ', re.resid.shape)
        ====================================================
        True b :  [[0.59169091]
         [0.91342353]
         [0.19599503]
         [0.9112773 ]
         [0.70647024]
         [0.41873624]
         [0.64871071]
         [0.20685505]
         [0.13172035]
         [0.82358063]]
        
        Estimated b :  [1.00206596 0.57602159 0.91832825 0.2104454  0.935377   0.71526534
         0.39181771 0.65432445 0.22666925 0.13173488 0.83208811]
        
        residue :  (2000,)
        ```
        
        
        
        ##### def white(y, X, **kwargs):
        
This function computes the White heteroskedasticity-consistent estimate of the OLS variance.

Let X be the (T, c) regressor matrix, where T is the sample size and c the number of variables, and let $S_0$ be the long-run covariance matrix of the residuals. The variance estimate is
$$
V_{ols}=T(X'X)^{-1}S_0(X'X)^{-1}.
$$
The White estimate of $S_0$ is
$$
S_0=\frac{1}{T}X'RX,
$$
where $R$ is the diagonal matrix of the squared residuals.
        **input :**
        
        *y (ndarray):* The dependent variable.
        
        *X (ndarray):* The explanatory variable.
        
        **output :**
        
        *V_ols (ndarray):* The white variance estimate
        
        **Example**
        
        ```python
        # continue the previous code
        import numpy as np
        from statsmodels.api import add_constant
        
        X = np.random.normal(loc=0.0, scale=1.0, size=(2000,10))
        b = np.random.uniform(low=0.1, high=1.1, size=(10,1))
        e = np.random.normal(loc=0.0, scale=1.0, size=(2000,1))
        y = X.dot(b) + e + 1.0
        
        re = white(y, X, constant=True)
        np.set_printoptions(formatter={'float':'{:0.3f}'.format})
        print(re*100)
        X = add_constant(X)
        r, c = np.shape(X)
        print('\n', 1/r*X.T.dot(X).dot(re).dot(X.T.dot(X)))
        ============================================================================
        [[0.052 0.004 0.002 0.002 0.000 -0.002 0.002 -0.004 -0.001 0.000 0.003]
         [0.004 0.056 0.002 -0.002 -0.003 -0.004 -0.001 -0.001 -0.003 0.004 0.001]
         [0.002 0.002 0.049 -0.003 -0.003 0.003 0.000 -0.002 -0.002 -0.003 0.002]
         [0.002 -0.002 -0.003 0.051 -0.002 -0.002 0.004 -0.000 -0.000 -0.001 0.002]
         [0.000 -0.003 -0.003 -0.002 0.054 0.004 0.000 -0.001 0.000 -0.001 0.003]
         [-0.002 -0.004 0.003 -0.002 0.004 0.055 -0.001 0.000 0.000 -0.002 0.001]
         [0.002 -0.001 0.000 0.004 0.000 -0.001 0.053 -0.003 -0.005 -0.002 0.004]
         [-0.004 -0.001 -0.002 -0.000 -0.001 0.000 -0.003 0.057 -0.001 -0.001 -0.002]
         [-0.001 -0.003 -0.002 -0.000 0.000 0.000 -0.005 -0.001 0.051 0.003 0.000]
         [0.000 0.004 -0.003 -0.001 -0.001 -0.002 -0.002 -0.001 0.003 0.049 0.002]
         [0.003 0.001 0.002 0.002 0.003 0.001 0.004 -0.002 0.000 0.002 0.052]]
        
         [[1.036 0.029 -0.028 0.031 0.007 -0.067 -0.004 0.003 0.029 -0.022 -0.041]
         [0.029 1.001 0.020 -0.056 -0.066 -0.023 0.000 -0.010 -0.070 0.069 0.023]
         [-0.028 0.020 0.972 -0.057 -0.019 0.058 -0.048 -0.002 0.084 -0.005 0.045]
         [0.031 -0.056 -0.057 1.004 -0.004 0.017 0.017 -0.072 -0.008 -0.042 0.041]
         [0.007 -0.066 -0.019 -0.004 1.024 0.045 -0.035 0.017 0.003 0.006 0.029]
         [-0.067 -0.023 0.058 0.017 0.045 1.125 0.010 -0.014 0.037 0.053 -0.017]
         [-0.004 0.000 -0.048 0.017 -0.035 0.010 1.053 -0.022 -0.047 0.047 0.035]
         [0.003 -0.010 -0.002 -0.072 0.017 -0.014 -0.022 0.955 -0.007 -0.002 -0.019]
         [0.029 -0.070 0.084 -0.008 0.003 0.037 -0.047 -0.007 1.008 -0.025 0.025]
         [-0.022 0.069 -0.005 -0.042 0.006 0.053 0.047 -0.002 -0.025 1.010 0.026]
         [-0.041 0.023 0.045 0.041 0.029 -0.017 0.035 -0.019 0.025 0.026 1.058]]
        ```
        
        
        
        ##### def newey_west(y, X, J=*None*):
        
This function computes the Newey-West (HAC) adjustment of the OLS variance.

Let X be the (T, c) regressor matrix, where T is the sample size and c the number of variables, and let $S_0$ be the long-run covariance matrix of the residuals. The variance estimate is
$$
V_{ols}=T(X'X)^{-1}S_0(X'X)^{-1}.
$$
The Newey-West estimate of $S_0$ adds weighted autocovariance terms to the White estimate:
$$
S_0=\frac{1}{T}X'RX+\frac{1}{T}\sum_j\sum_t w_j e_t e_{t-j}\left(x_t x'_{t-j}+x_{t-j}x'_t\right),
$$
where $R$ is the diagonal matrix of the squared residuals and $w_j=1-\frac{j}{J+1}$ are the Bartlett kernel weights.
        **input :**
        
        *y (ndarray):* The dependent variable.
        
        *X (ndarray):* The explanatory variable.
        
*J (int):* The number of lags.
        
        **output :**
        
        *V_ols (ndarray):* The Newey-West variance estimate.
        
        **Example**
        
        ```python
        # continue the previous code
        import numpy as np
        from statsmodels.stats.sandwich_covariance import cov_hac
        
        X = np.random.normal(loc=0.0, scale=1.0, size=(10000,10))
        b = np.random.uniform(low=0.1, high=1.1, size=(10,1))
        e = np.random.normal(loc=0.0, scale=1.0, size=(10000,1))
        y = X.dot(b) + e + 1.0
        
        re = newey_west(y, X, constant=False)
        np.set_printoptions(formatter={'float':'{:0.2f}'.format})
        print(re)
        # X = add_constant(X)
        r, c = np.shape(X) 
        print('\n', 1/r*X.T.dot(X).dot(re).dot(X.T.dot(X)))
        result = ols(y, X, constant=False)
        print('\n', cov_hac(result))
        print('\n', 1/r*X.T.dot(X).dot(cov_hac(result)).dot(X.T.dot(X)))
        ========================================================================
        [[0.00 -0.00 -0.00 0.00 0.00 0.00 -0.00 0.00 -0.00 -0.00]
         [-0.00 0.00 -0.00 -0.00 -0.00 0.00 -0.00 -0.00 0.00 -0.00]
         [-0.00 -0.00 0.00 -0.00 -0.00 -0.00 -0.00 -0.00 -0.00 0.00]
         [0.00 -0.00 -0.00 0.00 -0.00 -0.00 -0.00 0.00 -0.00 -0.00]
         [0.00 -0.00 -0.00 -0.00 0.00 0.00 -0.00 -0.00 0.00 0.00]
         [0.00 0.00 -0.00 -0.00 0.00 0.00 -0.00 -0.00 -0.00 0.00]
         [-0.00 -0.00 -0.00 -0.00 -0.00 -0.00 0.00 0.00 0.00 0.00]
         [0.00 -0.00 -0.00 0.00 -0.00 -0.00 0.00 0.00 -0.00 -0.00]
         [-0.00 0.00 -0.00 -0.00 0.00 -0.00 0.00 -0.00 0.00 0.00]
         [-0.00 -0.00 0.00 -0.00 0.00 0.00 0.00 -0.00 0.00 0.00]]
        
         [[2.08 0.00 -0.09 -0.01 -0.03 -0.05 -0.02 0.00 -0.02 0.01]
         [0.00 2.01 0.04 -0.00 0.00 -0.03 -0.00 -0.01 -0.00 0.02]
         [-0.09 0.04 1.99 -0.02 0.01 0.00 -0.01 -0.08 0.03 0.02]
         [-0.01 -0.00 -0.02 1.96 -0.00 -0.05 -0.01 -0.04 -0.01 -0.02]
         [-0.03 0.00 0.01 -0.00 1.98 0.06 0.00 0.04 -0.01 -0.05]
         [-0.05 -0.03 0.00 -0.05 0.06 1.94 -0.01 -0.00 -0.01 -0.02]
         [-0.02 -0.00 -0.01 -0.01 0.00 -0.01 2.04 -0.00 0.01 -0.07]
         [0.00 -0.01 -0.08 -0.04 0.04 -0.00 -0.00 1.89 -0.05 -0.02]
         [-0.02 -0.00 0.03 -0.01 -0.01 -0.01 0.01 -0.05 1.97 -0.01]
         [0.01 0.02 0.02 -0.02 -0.05 -0.02 -0.07 -0.02 -0.01 1.93]]
        
         [[0.00 -0.00 0.00 0.00 0.00 0.00 -0.00 0.00 0.00 -0.00]
         [-0.00 0.00 0.00 -0.00 0.00 0.00 -0.00 0.00 -0.00 -0.00]
         [0.00 0.00 0.00 -0.00 -0.00 0.00 -0.00 -0.00 -0.00 0.00]
         [0.00 -0.00 -0.00 0.00 0.00 -0.00 -0.00 -0.00 -0.00 -0.00]
         [0.00 0.00 -0.00 0.00 0.00 0.00 0.00 -0.00 0.00 0.00]
         [0.00 0.00 0.00 -0.00 0.00 0.00 -0.00 -0.00 -0.00 -0.00]
         [-0.00 -0.00 -0.00 -0.00 0.00 -0.00 0.00 0.00 -0.00 0.00]
         [0.00 0.00 -0.00 -0.00 -0.00 -0.00 0.00 0.00 -0.00 0.00]
         [0.00 -0.00 -0.00 -0.00 0.00 -0.00 -0.00 -0.00 0.00 0.00]
         [-0.00 -0.00 0.00 -0.00 0.00 -0.00 0.00 0.00 0.00 0.00]]
        
         [[2.07 -0.05 -0.04 0.02 0.06 -0.11 -0.08 0.03 0.05 0.02]
         [-0.05 1.91 0.10 0.01 0.07 0.00 0.01 0.02 -0.09 0.03]
         [-0.04 0.10 2.05 -0.06 0.01 0.07 -0.06 -0.11 -0.15 0.04]
         [0.02 0.01 -0.06 1.87 0.04 -0.05 -0.02 -0.10 -0.01 -0.06]
         [0.06 0.07 0.01 0.04 2.00 0.11 0.05 0.03 -0.00 -0.04]
         [-0.11 0.00 0.07 -0.05 0.11 1.91 0.00 -0.05 -0.05 -0.05]
         [-0.08 0.01 -0.06 -0.02 0.05 0.00 1.99 0.03 -0.02 -0.10]
         [0.03 0.02 -0.11 -0.10 0.03 -0.05 0.03 2.01 -0.10 0.06]
         [0.05 -0.09 -0.15 -0.01 -0.00 -0.05 -0.02 -0.10 2.16 0.01]
         [0.02 0.03 0.04 -0.06 -0.04 -0.05 -0.10 0.06 0.01 1.97]]
        ```
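
        For reference, the quantity being estimated can be sketched in plain NumPy. The following is an illustration of the Bartlett-kernel (Newey-West) long-run covariance formula, not the package's exact implementation; the function and variable names are ours:
        
        ```python
        import numpy as np
        
        def newey_west_cov(e, X, J):
            """Bartlett-kernel (Newey-West) long-run covariance of the
            moment series x_t * e_t, truncated at lag J. A sketch only."""
            T = len(e)
            u = X * e.reshape(-1, 1)           # moment contributions x_t * e_t
            S = u.T @ u / T                    # lag-0 (White) term
            for j in range(1, J + 1):
                w = 1.0 - j / (J + 1.0)        # Bartlett weight
                G = u[j:].T @ u[:-j] / T       # lag-j autocovariance
                S += w * (G + G.T)             # add lag term and its transpose
            return S
        
        rng = np.random.default_rng(42)
        T, K = 2000, 3
        X = rng.normal(size=(T, K))
        e = rng.normal(size=T)
        S = newey_west_cov(e, X, J=5)
        print(S.shape)                         # (3, 3), symmetric by construction
        ```
        
        With serially uncorrelated errors, as here, the lag terms are small and the estimate stays close to the White (lag-0) term.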
        
        
        
        ##### def white_t(y, X, params=*None*, side='Two', **kwargs):
        
        This function constructs a t-test based on the White variance estimate.
        
        **input :**
        
        *y (ndarray):* The dependent variable.
        
        *X (ndarray):* The explanatory variable.
        
        *params (ndarray):* Pre-estimated regression parameters, if available; if *None*, they are estimated by OLS. The default is *None*.
        
        *side (str):* The side of the test: 'Two' for a two-sided test, 'One' for a one-sided test. The default is 'Two'.
        
        **output :**
        
        *t_value (ndarray):* The t-value of parameters.
        
        *p_value (ndarray):* The p-value of parameters.
        
        **Example**
        
        ```python
        import numpy as np
        
        X = np.random.normal(loc=0.0, scale=1.0, size=(10000,10))
        b = np.random.uniform(low=-0.5, high=0.5, size=(10,1))
        e = np.random.normal(loc=0.0, scale=1.0, size=(10000,1))
        y = X.dot(b) + e + 1.0
        
        re = white_t(y, X, constant=False)
        np.set_printoptions(formatter={'float':'{:0.2f}'.format})
        print('t_value : ', re[0], '\np_value : ', re[1])
        print('b :', b.T)
        =========================================================================
        t_value :  [2.50 34.58 13.23 6.56 -17.64 -34.69 -16.40 13.84 28.90 30.64] 
        p_value :  [0.01 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00]
        b : [[0.02 0.49 0.18 0.09 -0.25 -0.49 -0.22 0.19 0.44 0.43]]
        ```
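
        The computation behind the example can be sketched as follows: t-statistics with White (heteroskedasticity-robust, HC0) standard errors and normal-approximation two-sided p-values. The function name and the data are illustrative assumptions, not EAP's exact implementation:
        
        ```python
        import numpy as np
        from math import erf, sqrt
        
        def white_tstats(y, X):
            """t-statistics with White (HC0) standard errors; a sketch only."""
            b, *_ = np.linalg.lstsq(X, y, rcond=None)    # OLS estimate
            e = y - X @ b                                # residuals
            XtX_inv = np.linalg.inv(X.T @ X)
            meat = X.T @ (X * (e**2)[:, None])           # X' diag(e^2) X
            V = XtX_inv @ meat @ XtX_inv                 # HC0 sandwich variance
            t = b / np.sqrt(np.diag(V))
            # two-sided p-values from the normal approximation
            p = np.array([2.0 * (1.0 - 0.5 * (1.0 + erf(abs(ti) / sqrt(2.0))))
                          for ti in t])
            return t, p
        
        rng = np.random.default_rng(0)
        X = rng.normal(size=(1000, 3))
        y = X @ np.array([0.5, 0.0, -0.5]) + rng.normal(size=1000)
        t_value, p_value = white_tstats(y, X)
        print('t_value : ', np.round(t_value, 2), '\np_value : ', np.round(p_value, 3))
        ```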
        
        
        
        ##### def newey_west_t(y, X, params=*None*):
        
        This function constructs a t-test based on the Newey-West variance estimate.
        
        **input :**
        
        *y (ndarray):* The dependent variable.
        
        *X (ndarray):* The explanatory variable.
        
        *params (ndarray):* Pre-estimated regression parameters, if available; if *None*, they are estimated by OLS. The default is *None*.
        
        **output :**
        
        *t_value (ndarray):* The t-value of parameters.
        
        *p_value (ndarray):* The p-value of parameters.
        
        **Example**
        
        ```python
        import numpy as np
        
        X = np.random.normal(loc=0.0, scale=1.0, size=(10000,10))
        b = np.random.uniform(low=-0.5, high=0.5, size=(10,1))
        e = np.random.normal(loc=0.0, scale=1.0, size=(10000,1))
        y = X.dot(b) + e + 1.0
        
        re = newey_west_t(y, X, constant=True)
        np.set_printoptions(formatter={'float':'{:0.2f}'.format})
        print('t_value : ', re[0], '\np_value : ', re[1])
        print('b :', b.T)
        =================================================================================
        t_value :  [135.08 -17.19 13.80 34.95 14.61 21.31 48.56 -44.62 2.75 -28.71 51.63] 
        p_value :  [0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00]
        b : [[-0.14 0.10 0.25 0.11 0.16 0.40 -0.36 0.03 -0.21 0.40]]
        ```
        
        
        
        
        
        ### time_frequency
        
        This module is designed for time-frequency asset pricing models, including Fourier-transform-based and wavelet-based methods.
        
        
        
        #### class Wavelet():
        
        This class is designed for decomposing a series using wavelet methods, based on the package **pywt**.
        
        
        
        ##### def \__init__(self, series):
        
        **input:** 
        
        *series (array):* The series to be decomposed.
        
        
        
        ##### def _decompose(self, wave, mode, level=None):
        
        This function decomposes the series using the given wavelet family, mode, and level.
        
        **input :**
        
        *wave (str):* The chosen wavelet family, named as in **pywt**.
        
        *mode (str):* The decomposition scale: 'multi' for a multiscale decomposition, 'single' for a single-scale decomposition.
        
        *level (int):* The level of the multiscale decomposition. It must be set when *mode* is 'multi'.
        
        **output :**
        
        *wave_dec (list):* The decomposed coefficients, ordered as (cA_N, cD_N, cD_N-1, ..., cD_1).
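
        To illustrate what a decomposition produces, here is a single-level Haar transform written in plain NumPy. The class itself delegates to **pywt**, so this is only a conceptual sketch of the approximation/detail split:
        
        ```python
        import numpy as np
        
        def haar_dwt(x):
            """Single-level Haar transform: returns (approximation, detail)."""
            x = np.asarray(x, dtype=float)
            cA = (x[0::2] + x[1::2]) / np.sqrt(2.0)  # low-pass: approximation
            cD = (x[0::2] - x[1::2]) / np.sqrt(2.0)  # high-pass: detail
            return cA, cD
        
        def haar_idwt(cA, cD):
            """Inverse single-level Haar transform."""
            x = np.empty(2 * len(cA))
            x[0::2] = (cA + cD) / np.sqrt(2.0)
            x[1::2] = (cA - cD) / np.sqrt(2.0)
            return x
        
        series = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
        cA, cD = haar_dwt(series)
        rec = haar_idwt(cA, cD)
        print(np.allclose(rec, series))  # True: perfect reconstruction
        ```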
        
        
        
        ##### def _pick_details(self):
        
        This function picks the detail coefficients of the decomposed series at each level.
        
        **output:**
        
        *pick_series (list):* The detail series picked at each level. Each element of the list contains only the details at that level.
        
        
        
        ##### def _rebuild(self, mode='constant'):
        
        This function rebuilds the detail series at each level from the picked coefficients.
        
        **input:**
        
        *mode (str):* The reconstruction mode, as in **pywt**. The **DEFAULT** is 'constant'.
        
        **output:**
        
        *wave_rec (list):* The recomposed series from the details at each level.
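
        Conceptually, `_pick_details` and `_rebuild` keep the coefficients of one level, zero out the rest, and reconstruct. A sketch of that idea using **pywt** directly (the wavelet `'db4'`, level 3, and mode `'constant'` are illustrative choices, not the class's internals):
        
        ```python
        import numpy as np
        import pywt
        
        rng = np.random.default_rng(1)
        series = rng.normal(size=256)
        
        # decompose: coeffs = [cA3, cD3, cD2, cD1]
        coeffs = pywt.wavedec(series, 'db4', mode='constant', level=3)
        
        # pick one level at a time, zero the others, and reconstruct
        components = []
        for i in range(len(coeffs)):
            picked = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
            components.append(pywt.waverec(picked, 'db4', mode='constant')[:len(series)])
        
        # by linearity of the transform, the per-level components
        # sum back to the original series
        print(np.allclose(np.sum(components, axis=0), series))
        ```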
        
        
        
        ##### def fit(self, wave, mode, level, output=False):
        
        This function is designed for fitting the model.
        
        **input :**
        
        *wave (str):* The chosen wavelet family, named as in **pywt**.
        
        *mode (str):* The decomposition scale: 'multi' for a multiscale decomposition, 'single' for a single-scale decomposition.
        
        *level (int):* The level of the multiscale decomposition. It must be set when *mode* is 'multi'.
        
        *output (boolean):* Whether to return the result. The **DEFAULT** is False.
        
        **output :**
        
        *wave_rec (list):* The recomposed series from the details at each level, returned only if *output* is True.
        
        
        
        #### class wavelet_pricing():
        
        This class is designed for the wavelet pricing model.
        
        ##### def \__init__(self, rets, factors):
        
        **input :**
        
        *rets (ndarray/Series/DataFrame):* The returns, used as the dependent variable.
        
        *factors (ndarray/Series/DataFrame):* The factors, used as the independent variables.
        
        
        
        ##### def wavelet_dec_rec(self, **kwargs):
        
        This function decomposes the returns and factors with wavelets and reconstructs the detail series.
        
        **input :**
        
        *kwargs :* The kwargs include the wavelet family (*wave*), the *mode*, and the *level* of the wavelet decomposition.
        
        **output :**
        
        *rets_dec_s (list):* The recomposed detail series of returns (rets).
        
        *factors_dec_s (list):* The recomposed detail series of factors (factors).
        
        
        
        ##### def wavelet_regression(self, **kwargs):
        
        This function runs, at each level, an OLS regression of the return detail series on the factor detail series.
        
        **input :**
        
        ***kwargs :* The kwargs include 'constant': whether the regression includes a constant.
        
        **output :**
        
        *regress (list):* The OLS regression results from the package **statsmodels**.
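
        The per-level regression is an ordinary OLS of the reconstructed return details on the reconstructed factor details. A minimal NumPy sketch with hypothetical detail series (the class itself uses **statsmodels** OLS; the names here are ours):
        
        ```python
        import numpy as np
        
        rng = np.random.default_rng(0)
        T, K, n_levels = 500, 3, 4
        
        # hypothetical reconstructed detail series:
        # one (rets, factors) pair per level
        levels = [(rng.normal(size=T), rng.normal(size=(T, K)))
                  for _ in range(n_levels)]
        
        betas = []
        for rets_l, factors_l in levels:
            X = np.hstack([np.ones((T, 1)), factors_l])    # 'constant': True
            b, *_ = np.linalg.lstsq(X, rets_l, rcond=None)  # OLS at this level
            betas.append(b)
        
        print(len(betas), betas[0].shape)                   # 4 (4,)
        ```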
        
        
        
        ##### def fit(self, wave, mode, level, win=None, robust=False, constant=True):
        
        This function is designed for fitting the model.
        
        **input :**
        
        *wave (str):* The chosen wavelet family, named as in **pywt**.
        
        *mode (str):* The decomposition scale: 'multi' for a multiscale decomposition, 'single' for a single-scale decomposition.
        
        *level (int):* The level of the multiscale decomposition. It must be set when *mode* is 'multi'.
        
        *win (int):* The length of the rolling window, if rolling regression is used.
        
        *robust (boolean):* Whether to use a robust covariance matrix.
        
        *constant (boolean):* Whether to include a constant in the regression. The **DEFAULT** is True.
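
        When *win* is set, the regression at each level is re-estimated over a rolling window. A NumPy sketch of the general technique (the window convention here is our assumption, not EAP's exact code):
        
        ```python
        import numpy as np
        
        def rolling_ols(y, X, win):
            """OLS re-estimated over a rolling window of length win;
            row t holds the estimate from observations [t-win+1, t]."""
            T, K = X.shape
            out = np.full((T, K), np.nan)   # nan until a full window is available
            for t in range(win, T + 1):
                b, *_ = np.linalg.lstsq(X[t - win:t], y[t - win:t], rcond=None)
                out[t - 1] = b
            return out
        
        rng = np.random.default_rng(3)
        T, K, win = 200, 2, 60
        X = rng.normal(size=(T, K))
        y = X @ np.array([1.0, -1.0]) + 0.1 * rng.normal(size=T)
        betas = rolling_ols(y, X, win)
        print(betas.shape, np.round(betas[-1], 2))
        ```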
        
        
        
        ##### def summary(self, export=False):
        
        This function is designed for printing the summary.
        
        **input :**
        
        *export (boolean):* Whether to export the summary table. The **DEFAULT** is False.
        
        **output :**
        
        *df (DataFrame):* The summary table, returned if *export* is True.
        
        
        
        
Platform: UNKNOWN
Classifier: Programming Language :: Python :: 3
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Requires-Python: >=3
Description-Content-Type: text/markdown
