November 2020, 3(4): 301-308. doi: 10.3934/mfc.2020012

AIMS: Average information matrix splitting

1. Laboratory for Intelligent Computing and Financial Technology, Department of Mathematics, Xi'an Jiaotong-Liverpool University, Suzhou 215123, China

2. Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing 100088, China

* Corresponding author: Shengxin Zhu

Received: November 2019. Revised: February 2020. Early access: June 2020. Published: November 2020.

Fund Project: This research is supported by the Foundation of LCP (6142A05180501), the Jiangsu Science and Technology Basic Research Program (BK20171237), the Key Program Special Fund of XJTLU (KSF-E-21, KSF-P-02), and the Research Development Fund of XJTLU (RDF-2017-02-23), and is partially supported by NSFC (Nos. 11771002, 11571047, 11671049, 11671051, 6162003, and 11871339).

For linear mixed models whose covariance matrices do not depend linearly on the variance component parameters, we prove that the average of the observed information and the Fisher information can be split into two parts. The essential part enjoys a simple and computationally friendly formula, while the remaining part, which would otherwise require substantial computation, is a random zero matrix and is therefore negligible.
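To make this statement concrete, the following is a minimal worked sketch, not quoted from the paper, written in the standard REML notation of Gilmour, Thompson and Cullis [6] and of [26]: $y$ denotes the response with covariance matrix $H(\kappa)$, $\dot H_i = \partial H/\partial\kappa_i$ and $\ddot H_{ij} = \partial^2 H/\partial\kappa_i\partial\kappa_j$ are derivatives with respect to the variance parameters, and $P = H^{-1} - H^{-1}X(X^{\top}H^{-1}X)^{-1}X^{\top}H^{-1}$ is the usual REML projection matrix. Averaging the observed information $\mathcal{I}_O$ and the Fisher information $\mathcal{I}_E(\kappa_i,\kappa_j) = \tfrac{1}{2}\operatorname{tr}(P\dot H_i P\dot H_j)$ element-wise gives

\[
\mathcal{I}_A(\kappa_i,\kappa_j)
  = \tfrac{1}{2}\bigl(\mathcal{I}_O(\kappa_i,\kappa_j) + \mathcal{I}_E(\kappa_i,\kappa_j)\bigr)
  = \underbrace{\tfrac{1}{2}\, y^{\top} P \dot H_i P \dot H_j P y}_{\text{essential part}}
  \;+\;
  \underbrace{\tfrac{1}{4}\Bigl(\operatorname{tr}\bigl(P \ddot H_{ij}\bigr) - y^{\top} P \ddot H_{ij} P y\Bigr)}_{\text{zero-mean remainder}}.
\]

Since $\operatorname{E}\bigl[y^{\top} P \ddot H_{ij} P y\bigr] = \operatorname{tr}\bigl(P \ddot H_{ij}\bigr)$, the remainder has zero expectation, and it vanishes identically when $H$ is linear in $\kappa$; only the inexpensive quadratic form in the first term then needs to be assembled. The precise splitting, its proof, and the notation used by the authors are given in the paper and in [25, 26].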

Citation: Shengxin Zhu, Tongxiang Gu, Xingping Liu. AIMS: Average information matrix splitting. Mathematical Foundations of Computing, 2020, 3(4): 301-308. doi: 10.3934/mfc.2020012
References:
[1]

Z. Chen, S. Zhu, Q. Niu and X. Lu, Censorious young: Knowledge discovery from high-throughput movie rating data with LME4, in 2019 IEEE 4th International Conference on Big Data Analytics (ICBDA), 2019, 32–36. doi: 10.1109/ICBDA.2019.8713193.

[2]

Z. Chen, S. Zhu, Q. Niu and T. Zuo, Knowledge discovery and recommendation with linear mixed model, IEEE Access, 8 (2020), 38304-38317. doi: 10.1109/ACCESS.2020.2973170.

[3]

B. Efron and D. V. Hinkley, Assessing the accuracy of the maximum likelihood estimator: Observed versus expected Fisher information, Biometrika, 65 (1978), 457-483.  doi: 10.1093/biomet/65.3.457.

[4]

R. A. Fisher, The Genetical Theory of Natural Selection, Oxford University Press, Oxford, 1999.

[5]

B. Gao, G. Zhan, H. Wang, Y. Wang and S. Zhu, Learning with linear mixed model for group recommendation systems, in Proceedings of the 2019 11th International Conference on Machine Learning and Computing, ICMLC '19, Association for Computing Machinery, New York, NY, 2019, 81–85. doi: 10.1145/3318299.3318342.

[6]

A. R. Gilmour, R. Thompson and B. R. Cullis, Average information REML: An efficient algorithm for variance parameter estimation in linear mixed models, Biometrics, 51 (1995), 1440-1450. doi: 10.2307/2533274.

[7]

G. Givens and J. Hoeting, Computational Statistics, 2$^{nd}$ edition, Wiley Series in Computational Statistics, John Wiley & Sons, Inc., Hoboken, NJ, 2005.

[8]

F. N. Gumedze and T. T. Dunne, Parameter estimation and inference in the linear mixed model, Linear Algebra Appl., 435 (2011), 1920-1944.  doi: 10.1016/j.laa.2011.04.015.

[9]

A. Heavens, Generalised Fisher matrices, Entropy, 18 (2016), 8 pp. doi: 10.3390/e18060236.

[10]

W. Janke, D. Johnston and R. Kenna, Information geometry and phase transitions, Physica A: Statistical Mechanics and its Applications, 336 (2004), 181-186. doi: 10.1016/j.physa.2004.01.023.

[11]

R. I. Jennrich and P. F. Sampson, Newton-Raphson and related algorithms for maximum likelihood variance component estimation, Technometrics, 18 (1976), 11-17.  doi: 10.2307/1267911.

[12]

D. Johnson and R. Thompson, Restricted maximum likelihood estimation of variance components for univariate animal models using sparse matrix techniques and average information, Journal of Dairy Science, 78 (1995), 449-456.  doi: 10.3168/jds.S0022-0302(95)76654-1.

[13]

N. T. Longford, A fast scoring algorithm for maximum likelihood estimation in unbalanced mixed models with nested random effects, Biometrika, 74 (1987), 817-827.  doi: 10.1093/biomet/74.4.817.

[14]

K. Meyer, An average information restricted maximum likelihood algorithm for estimating reduced rank genetic covariance matrices or covariance functions for animal models with equal design matrices, Genetics Selection Evolution, 29 (1997), 97. doi: 10.1186/1297-9686-29-2-97.

[15]

J. I. Myung and D. J. Navarro, Information matrix, in Encyclopedia of Statistics in Behavioral Science, John Wiley & Sons, 2005. doi: 10.1002/0470013192.bsa302.

[16]

J. W. Pratt, F. Y. Edgeworth and R. A. Fisher on the efficiency of maximum likelihood estimation, Ann. Statist., 4 (1976), 501-514.  doi: 10.1214/aos/1176343457.

[17]

M. Prokopenko, J. T. Lizier, O. Obst and X. R. Wang, Relating Fisher information to order parameters, Phys. Rev. E, 84 (2011), 041116. doi: 10.1103/PhysRevE.84.041116.

[18]

S. R. Searle, G. Casella and C. E. McCulloch, Variance Components, Wiley Series in Probability and Statistics, Wiley-Interscience [John Wiley & Sons], Hoboken, NJ, 2006.

[19]

M. Vallisneri, A User Manual for the Fisher Information Matrix, California Institute of Technology, Jet Propulsion Laboratory, 2007.

[20]

R. S. Varga, Matrix Iterative Analysis, expanded edition, Springer Series in Computational Mathematics, 27, Springer-Verlag, Berlin, 2000. doi: 10.1007/978-3-642-05156-2.

[21]

Y. Wang, T. Wu, F. Ma and S. Zhu, Personalized recommender systems with multiple source data, in Computing Conference 2020.

[22]

S. Welham, S. Zhu and A. J. Wathen, Big Data, Fast Models: Faster Calculation of Models from High-Throughput Biological Data Sets, Knowledge Transfer Report IP12-0009, Smith Institute and The University of Oxford, Oxford, 2013.

[23]

R. Zamir, A Necessary and Sufficient Condition for Equality in the Matrix Fisher Information Inequality, Technical report, Tel Aviv University, 1997.

[24]

R. Zamir, A proof of the Fisher information inequality via a data processing argument, IEEE Transactions on Information Theory, 44 (1998), 1246-1250.  doi: 10.1109/18.669301.

[25]

S. Zhu, T. Gu and X. Liu, Information matrix splitting, preprint, arXiv: 1605.07646.

[26]

S. Zhu and A. J. Wathen, Essential formulae for restricted maximum likelihood and its derivatives associated with the linear mixed models, preprint, arXiv: 1805.05188.

[27]

S. Zhu and A. J. Wathen, Sparse inversion for derivative of log determinant, preprint, arXiv: 1911.00685.

[28]

T. Zuo, S. Zhu and J. Lu, A hybrid recommender system combining singular value decomposition and linear mixed model, in Computing Conference 2020, Advances in Intelligent Systems and Computing, Springer International Publishing, Cham, 2020.
