The Journey from Entropy to Generalized Maximum Entropy

  • Amjad D. Al-Nasser, Professor, Department of Statistics, Science Faculty, Yarmouk University, Irbid 21163, Jordan.
Keywords: Entropy, Generalized Maximum Entropy, Maximum Entropy (ME), Mathematical Programming Problem

Abstract

Currently, we are witnessing a revolution in huge data resources that must be analyzed carefully in order to make the right decisions about world problems. Such big data are statistically risky, since the data are a combination of (useful) signal and (useless) noise; that is, unorganized facts that need to be filtered and processed. Keeping only the signal and discarding the noise means that the data are restructured and reorganized into a useful form, which is then called information. So, for any data set, we need only the information. In the context of information theory, entropy is used as a statistical measure to quantify the maximum amount of information in a random event.
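To make the abstract's closing statement concrete, here is a minimal sketch of the quantities involved (the notation is assumed for illustration and is not taken from the text): for a discrete random variable taking values $x_1,\dots,x_K$ with probabilities $p_1,\dots,p_K$, Shannon's entropy is

$$H(p) = -\sum_{k=1}^{K} p_k \log p_k ,$$

and the classical maximum entropy (ME) principle selects the distribution that maximizes this measure subject to whatever moment information the data provide, i.e. the mathematical programming problem

$$\max_{p_1,\dots,p_K} \; H(p) \quad \text{subject to} \quad \sum_{k=1}^{K} p_k f_j(x_k) = \mu_j,\;\; j=1,\dots,J, \qquad \sum_{k=1}^{K} p_k = 1,\;\; p_k \ge 0 .$$

In broad terms, the generalized maximum entropy (GME) approach extends this program by rewriting unknown model parameters and disturbances as expectations over discrete support points, so that their probabilities are recovered from the same kind of constrained entropy maximization.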

Published
2019-03-13
How to Cite
Al-Nasser, A. D. (2019). The Journey from Entropy to Generalized Maximum Entropy. Journal of Quantitative Methods, 3(1), 1-7. https://doi.org/10.29145/2019/jqm/030101
Section
Editorial