Matches in SemOpenAlex for { <https://semopenalex.org/work/W4244332020> ?p ?o ?g. }
Showing items 1 to 92 of 92, with 100 items per page.
- W4244332020 endingPage "274" @default.
- W4244332020 startingPage "269" @default.
- W4244332020 abstract "References. Zhiyi Zhang, Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC, USA. First published: 14 October 2016. https://doi.org/10.1002/9781119237150.refs
Antos, A. and Kontoyiannis, I. (2001). Convergence properties of functional estimates for discrete distributions. Random Structures & Algorithms, 19, 163–193.
Beirlant, J., Dudewicz, E.J., Györfi, L., and van der Meulen, E.C. (2001). Nonparametric entropy estimation: an overview. International Journal of Mathematical and Statistical Sciences, 6, 17–39.
Billingsley, P. (1995). Probability and Measure, John Wiley & Sons, Inc., New York.
Blyth, C.R. (1959). Note on estimating information. Annals of Mathematical Statistics, 30, 71–79.
Bunge, J., Willis, A., and Walsh, F. (2014). Estimating the number of species in microbial diversity studies. Annual Review of Statistics and Its Application, 1, 427–445.
Chao, A. (1987). Estimating the population size for capture-recapture data with unequal catchability. Biometrics, 43, 783–791.
Chao, A. and Jost, L. (2012). Coverage-based rarefaction and extrapolation: standardizing samples by completeness rather than size. Ecology, 93, 2533–2547.
Chao, A. and Jost, L. (2015). Estimating diversity and entropy profiles via discovery rates of new species. Methods in Ecology and Evolution, 6, 873–882.
Chao, A. and Shen, T.J. (2003). Nonparametric estimation of Shannon's index of diversity when there are unseen species. Environmental and Ecological Statistics, 10, 429–443.
Chao, A., Chiu, C.-H., and Jost, L. (2010). Phylogenetic diversity measures based on Hill numbers. Philosophical Transactions of the Royal Society B: Biological Sciences, 365, 3599–3609.
Chao, A., Lee, S.-M., and Chen, T.-C. (1988). A generalized Good's nonparametric coverage estimator. Chinese Journal of Mathematics, 16(3), 189–199.
Cochran, W.G. (1952). The χ² test of goodness of fit. Annals of Mathematical Statistics, 23, 315–345.
Cover, T.M. and Thomas, J.A. (2006). Elements of Information Theory, 2nd ed., John Wiley & Sons, Inc., Hoboken, NJ.
David, F.N. (1950). Two combinatorial tests of whether a sample has come from a given population. Biometrika, 37, 97–110.
de Haan, L. and Ferreira, A. (2006). Extreme Value Theory: An Introduction, Springer Science+Business Media, LLC, New York.
Emlen, J.M. (1973). Ecology: An Evolutionary Approach, Addison-Wesley Publishing Co., Reading, MA.
Esty, W. (1983). A normal limit law for a nonparametric estimator of the coverage of a random sample. The Annals of Statistics, 11(3), 905–912.
Fisher, R.A. (1922). On the interpretation of χ² from contingency tables, and the calculation of P. Journal of the Royal Statistical Society, 85, 87–94.
Fisher, R.A. and Tippett, L.H.C. (1928). Limiting forms of the frequency-distribution of the largest or smallest member of a sample. Mathematical Proceedings of the Cambridge Philosophical Society, 24, 180.
Fisher, R.A., Corbet, A.S., and Williams, C.B. (1943). The relation between the number of species and the number of individuals in a random sample of an animal population. Journal of Animal Ecology, 12(1), 42–58.
Fréchet, M. (1927). Sur la loi de probabilité de l'écart maximum. Annales de la Société Polonaise de Mathématique, 6, 92.
Gini, C. (1912). Variabilità e mutabilità. Reprinted in Memorie di metodologica statistica (eds E. Pizetti and T. Salvemini), Libreria Eredi Virgilio Veschi, Rome (1955).
Gnedenko, B.V. (1943). Sur la distribution limite du terme maximum d'une série aléatoire. Annals of Mathematics, 44, 423–453.
Gnedenko, B.V. (1948). On a local limit theorem of the theory of probability. Uspekhi Matematicheskikh Nauk, 3(25), 187–194.
Good, I.J. (1953). The population frequencies of species and the estimation of population parameters. Biometrika, 40(3–4), 237–264.
Grabchak, M., Marcon, E., Lang, G., and Zhang, Z. (2016). The generalized Simpson's entropy is a measure of biodiversity. HAL preprint hal-01276738.
Haeusler, E. and Teugels, J.L. (1985). On asymptotic normality of Hill's estimator for the exponent of regular variation. The Annals of Statistics, 13(2), 743–756.
Hall, P. and Weissman, I. (1997). On the estimation of extreme tail probabilities. The Annals of Statistics, 25(3), 1311–1326.
Hall, P. and Welsh, A.H. (1985). Adaptive estimates of parameters of regular variation. The Annals of Statistics, 13(1), 331–341.
Harris, B. (1975). The statistical estimation of entropy in the non-parametric case. In Topics in Information Theory (ed. I. Csiszár), North-Holland, Amsterdam, 323–355.
Hausser, J. and Strimmer, K. (2009). Entropy inference and the James-Stein estimator, with application to nonlinear gene association networks. Journal of Machine Learning Research, 10, 1469–1484.
Heip, C.H.R., Herman, P.M.J., and Soetaert, K. (1998). Indices of diversity and evenness. Océanis, 24(4), 61–87.
Hill, M.O. (1973). Diversity and evenness: a unifying notation and its consequences. Ecology, 54, 427–431.
Hill, B.M. (1975). A simple general approach to inference about the tail of a distribution. The Annals of Statistics, 3(5), 1163–1174.
Hoeffding, W. (1948). A class of statistics with asymptotically normal distributions. The Annals of Mathematical Statistics, 19(3), 293–325.
Hoeffding, W. (1963). Probability inequalities for sums of bounded random variables. Journal of the American Statistical Association, 58, 13–30.
Holste, D., Große, I., and Herzel, H. (1998). Bayes' estimators of generalized entropies. Journal of Physics A: Mathematical and General, 31, 2551–2566.
Johnson, N.L. and Kotz, S. (1977). Urn Models and Their Applications, John Wiley & Sons, Inc., New York.
Jost, L. (2006). Entropy and diversity. Oikos, 113, 363–375.
Kolchin, V.F., Sevastyanov, B.A., and Chistyakov, V.P. (1978). Random Allocations, V.H. Winston & Sons, Washington, DC.
Krebs, C.J. (1999). Ecological Methodology, 2nd ed., Addison-Wesley Educational Publishers, Menlo Park, CA.
Krichevsky, R.E. and Trofimov, V.K. (1981). The performance of universal encoding. IEEE Transactions on Information Theory, 27, 199–207.
Kullback, S. and Leibler, R.A. (1951). On information and sufficiency. The Annals of Mathematical Statistics, 22(1), 79–86.
Kvalseth, T.O. (1987). Entropy and correlation: some comments. IEEE Transactions on Systems, Man, and Cybernetics, 17(3), 517–519.
Lee, A.J. (1990). U-Statistics: Theory and Practice, Marcel Dekker, New York.
MacArthur, R.H. (1955). Fluctuations of animal populations, and a measure of community stability. Ecology, 36, 533–536.
Magurran, A.E. (2004). Measuring Biological Diversity, Blackwell Publishing, Ltd., Malden, MA.
Mann, H.B. and Wald, A. (1943). On stochastic limit and order relationships. The Annals of Mathematical Statistics, 14(3), 217–226.
Marcon, E. (2014). Mesures de la Biodiversité. Technical Report, Ecologie des forêts de Guyane.
Margalef, R. (1958). Temporal succession and spatial heterogeneity in phytoplankton. In Perspectives in Marine Biology (ed. A.A. Buzzati-Traverso), University of California Press, Berkeley, CA, 323–347.
Miller, G.A. (1955). Note on the bias of information estimates. In Information Theory in Psychology: Problems and Methods, II-B, 95–100.
Miller, G.A. and Madow, W.G. (1954). On the maximum likelihood estimate of the Shannon-Weaver measure of information. Technical Report AFCRC-TR-54-75, Air Force Cambridge Research Center, Operational Applications Laboratory, Air Research and Development Command, New York.
Montgomery-Smith, S. and Schürmann, T. (2007). Unbiased estimators for entropy and class number. Unpublished preprint, Department of Mathematics, University of Missouri, Columbia, MO.
Mood, A.M., Graybill, F.A., and Boes, D.C. (1974). Introduction to the Theory of Statistics, McGraw-Hill, Inc., New York.
Nemenman, I., Shafee, F., and Bialek, W. (2002). Entropy and inference, revisited. In Advances in Neural Information Processing Systems, Vol. 14, MIT Press, Cambridge, MA, 471–478.
Ohannessian, M.I. and Dahleh, M.A. (2012). Rare probability estimation under regularly varying heavy tails. Journal of Machine Learning Research, Proceedings Track, 23, 21.1–21.24.
Paninski, L. (2003). Estimation of entropy and mutual information. Neural Computation, 15, 1191–1253.
Patil, G.P. and Taillie, C. (1982). Diversity as a concept and its measurement. Journal of the American Statistical Association, 77, 548–567.
Pearson, K. (1900). On the criterion that a given system of deviations from the probable in the case of a correlated system of variables is such that it can be reasonably supposed to have arisen from random sampling. Philosophical Magazine Series 5, 50, 157–175. (Reprinted 1948 in Karl Pearson's Early Statistical Papers (ed. E.S. Pearson), Cambridge University Press, Cambridge.)
Pearson, K. (1922). On the χ² test of goodness of fit. Biometrika, 14, 186–191.
Peet, R.K. (1974). The measurement of species diversity. Annual Review of Ecology and Systematics, 5, 285–307.
Pickands, J. (1975). Statistical inference using extreme order statistics. The Annals of Statistics, 3(1), 119–131.
Purvis, A. and Hector, A. (2000). Getting the measure of biodiversity. Nature, 405, 212–219.
Rényi, A. (1961). On measures of entropy and information. In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Vol. 1, 547–561.
Robbins, H.E. (1968). Estimating the total probability of the unobserved outcomes of an experiment. The Annals of Mathematical Statistics, 39(1), 256–257.
Schürmann, T. and Grassberger, P. (1996). Entropy estimation of symbol sequences. Chaos, 6, 414–427.
Serfling, R.J. (1980). Approximation Theorems of Mathematical Statistics, John Wiley & Sons, Inc., New York.
Shannon, C.E. (1948). A mathematical theory of communication. The Bell System Technical Journal, 27, 379–423 and 623–656.
Simpson, E.H. (1949). Measurement of diversity. Nature, 163, 688.
Smith, R.L. (1987). Estimating tails of probability distributions. The Annals of Statistics, 15(3), 1174–1207.
Stanley, R.P. (1997). Enumerative Combinatorics, Vol. 1, Cambridge University Press, New York.
Strehl, A. and Ghosh, J. (2002). Cluster ensembles: a knowledge reuse framework for combining multiple partitions. Journal of Machine Learning Research, 3, 583–617.
Strong, S.P., Koberle, R., de Ruyter van Steveninck, R.R., and Bialek, W. (1998). Entropy and information in neural spike trains. Physical Review Letters, 80, 197–200.
Taillie, C. (1979). Species equitability: a comparative approach. In Ecological Diversity in Theory and Practice (eds J.F. Grassle, G.P. Patil, W. Smith, and C. Taillie), International Cooperative Publishing House, Fairland, MD, 51–61.
Tanabe, K. and Sagae, M. (1992). An exact Cholesky decomposition and the generalized inverse of the variance-covariance matrix of the multinomial distribution, with applications. Journal of the Royal Statistical Society, Series B (Methodological), 54(1), 211–219.
Tsallis, C. (1988). Possible generalization of Boltzmann-Gibbs statistics. Journal of Statistical Physics, 52, 479–487.
Valiant, G. and Valiant, P. (2011). Estimating the unseen: an n/log(n)-sample estimator for entropy and support size, shown optimal via new CLTs. In Proceedings of the 43rd Annual ACM Symposium on Theory of Computing (STOC '11), San Jose, CA, June 6–8, 2011 (eds L. Fortnow and S.P. Vadhan), Association for Computing Machinery, New York, 685–694.
Vinh, N.X., Epps, J., and Bailey, J. (2010). Information theoretic measures for clusterings comparison: variants, properties, normalization and correction for chance. Journal of Machine Learning Research, 11, 2837–2854.
von Mises, R. (1936). La distribution de la plus grande de n valeurs. Reprinted (1954) in Selected Papers, Vol. II, 271–294, American Mathematical Society, Providence, RI.
Vu, V.Q., Yu, B., and Kass, R.E. (2007). Coverage-adjusted entropy estimation. Statistics in Medicine, 26(21), 4039–4060.
Wang, S.C. and Dodson, P. (2006). Estimating the diversity of dinosaurs. Proceedings of the National Academy of Sciences of the United States of America, 103(37), 13601–13605.
Yao, Y.Y. (2003). Information-theoretic measures for knowledge discovery and data mining. In Entropy Measures, Maximum Entropy Principle and Emerging Applications (ed. Karmeshu), 1st ed., Springer, Berlin, 115–136.
Zahl, S. (1977). Jackknifing an index of diversity. Ecology, 58, 907–913.
Zhang, Z. (2012). Entropy estimation in Turing's perspective. Neural Computation, 24(5), 1368–1389.
Zhang, Z. (2013a). A multivariate normal law for Turing's formulae. Sankhyā: The Indian Journal of Statistics, 75-A(1), 51–73.
Zhang, Z. (2013b). Asymptotic normality of an entropy estimator with exponentially decaying bias. IEEE Transactions on Information Theory, 59(1), 504–508.
Zhang, Z. (2017). Domains of attraction on countable alphabets. Bernoulli Journal (to appear).
Zhang, Z. and Grabchak, M. (2013). Bias adjustment for a nonparametric entropy estimator. Entropy, 15(6), 1999–2011.
Zhang, Z. and Grabchak, M. (2014). Nonparametric estimation of Kullback-Leibler divergence. Neural Computation, 26(11), 2570–2593.
Zhang, Z. and Grabchak, M. (2016). Entropic representation and estimation of diversity indices. Journal of Nonparametric Statistics, DOI: 10.1080/10485252.2016.1190357.
Zhang, Z. and Huang, H. (2008). A sufficient normality condition for Turing's formula. Journal of Nonparametric Statistics, 20(5), 431–446.
Zhang, Z. and Stewart, A.M. (2016). Estimation of standardized mutual information. Technical Report No. 7, University of North Carolina at Charlotte.
Zhang, C.-H. and Zhang, Z. (2009). Asymptotic normality of a nonparametric estimator of sample coverage. The Annals of Statistics, 37(5A), 2582–2595.
Zhang, Z. and Zhang, X. (2012). A normal law for the plug-in estimator of entropy. IEEE Transactions on Information Theory, 58(5), 2745–2747.
Zhang, Z. and Zheng, L. (2015). A mutual information estimator with exponentially decaying bias. Statistical Applications in Genetics and Molecular Biology, 14(3), 243–252.
Zhang, Z. and Zhou, J. (2010). Re-parameterization of multinomial distribution and diversity indices. Journal of Statistical Planning and Inference, 140(7), 1731–1738.
Zubkov, A.M. (1973). Limit distributions for a statistical estimate of the entropy. Teoriya Veroyatnostei i Ee Primeneniya, 18(3), 643–650.
Statistical Implications of Turing's Formula." @default.
- W4244332020 created "2022-05-12" @default.
- W4244332020 date "2016-10-14" @default.
- W4244332020 modified "2023-09-23" @default.
- W4244332020 title "References" @default.
- W4244332020 cites W121866894 @default.
- W4244332020 cites W1957617498 @default.
- W4244332020 cites W1965555277 @default.
- W4244332020 cites W1965716462 @default.
- W4244332020 cites W1971405816 @default.
- W4244332020 cites W1973288867 @default.
- W4244332020 cites W1980179247 @default.
- W4244332020 cites W1980494867 @default.
- W4244332020 cites W1983874169 @default.
- W4244332020 cites W1985747475 @default.
- W4244332020 cites W1988625484 @default.
- W4244332020 cites W1997644467 @default.
- W4244332020 cites W2009169570 @default.
- W4244332020 cites W2018891628 @default.
- W4244332020 cites W2018913979 @default.
- W4244332020 cites W2021531548 @default.
- W4244332020 cites W2022825068 @default.
- W4244332020 cites W2031648200 @default.
- W4244332020 cites W2032440662 @default.
- W4244332020 cites W2036983007 @default.
- W4244332020 cites W2039728913 @default.
- W4244332020 cites W2041457600 @default.
- W4244332020 cites W2047701135 @default.
- W4244332020 cites W2048649579 @default.
- W4244332020 cites W2055403763 @default.
- W4244332020 cites W2056701644 @default.
- W4244332020 cites W2062829699 @default.
- W4244332020 cites W2067754386 @default.
- W4244332020 cites W2082092506 @default.
- W4244332020 cites W2082221931 @default.
- W4244332020 cites W2087189381 @default.
- W4244332020 cites W2088346040 @default.
- W4244332020 cites W2097645701 @default.
- W4244332020 cites W2101985079 @default.
- W4244332020 cites W2103336148 @default.
- W4244332020 cites W2106651885 @default.
- W4244332020 cites W2108571308 @default.
- W4244332020 cites W2109584604 @default.
- W4244332020 cites W2114771311 @default.
- W4244332020 cites W2120474334 @default.
- W4244332020 cites W2122456939 @default.
- W4244332020 cites W2129645399 @default.
- W4244332020 cites W2141753389 @default.
- W4244332020 cites W2141846515 @default.
- W4244332020 cites W2146368895 @default.
- W4244332020 cites W2156232632 @default.
- W4244332020 cites W2165521059 @default.
- W4244332020 cites W2241103921 @default.
- W4244332020 cites W2294024199 @default.
- W4244332020 cites W2320432931 @default.
- W4244332020 cites W2324099123 @default.
- W4244332020 cites W2510474575 @default.
- W4244332020 cites W2797333853 @default.
- W4244332020 cites W2914659449 @default.
- W4244332020 cites W2993383518 @default.
- W4244332020 cites W3104209297 @default.
- W4244332020 cites W3134527637 @default.
- W4244332020 cites W4214712351 @default.
- W4244332020 cites W4233413206 @default.
- W4244332020 cites W4376453175 @default.
- W4244332020 cites W2183235533 @default.
- W4244332020 doi "https://doi.org/10.1002/9781119237150.refs" @default.
- W4244332020 hasPublicationYear "2016" @default.
- W4244332020 type Work @default.
- W4244332020 citedByCount "0" @default.
- W4244332020 crossrefType "other" @default.
- W4244332020 hasBestOaLocation W42443320201 @default.
- W4244332020 hasConcept C41008148 @default.
- W4244332020 hasConceptScore W4244332020C41008148 @default.
- W4244332020 hasLocation W42443320201 @default.
- W4244332020 hasOpenAccess W4244332020 @default.
- W4244332020 hasPrimaryLocation W42443320201 @default.
- W4244332020 hasRelatedWork W1596801655 @default.
- W4244332020 hasRelatedWork W2130043461 @default.
- W4244332020 hasRelatedWork W2350741829 @default.
- W4244332020 hasRelatedWork W2358668433 @default.
- W4244332020 hasRelatedWork W2376932109 @default.
- W4244332020 hasRelatedWork W2382290278 @default.
- W4244332020 hasRelatedWork W2390279801 @default.
- W4244332020 hasRelatedWork W2748952813 @default.
- W4244332020 hasRelatedWork W2899084033 @default.
- W4244332020 hasRelatedWork W2530322880 @default.
- W4244332020 isParatext "false" @default.
- W4244332020 isRetracted "false" @default.
- W4244332020 workType "other" @default.
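The triple pattern at the top of this listing can also be run programmatically. A minimal sketch in Python, assuming (not stated in the listing) that SemOpenAlex exposes a public SPARQL endpoint at https://semopenalex.org/sparql that accepts `application/sparql-query` POST bodies and returns SPARQL JSON results:

```python
# Sketch: fetch all predicate/object pairs for a SemOpenAlex work via SPARQL.
# ASSUMPTION: the endpoint URL and its accepted media types are not confirmed
# by this listing; adjust ENDPOINT if your deployment differs.
import json
import urllib.request

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL


def build_query(work_iri: str) -> str:
    """Build the same triple pattern shown in the listing header."""
    return f"SELECT ?p ?o WHERE {{ <{work_iri}> ?p ?o . }}"


def fetch_triples(work_iri: str) -> list[tuple[str, str]]:
    """POST the query and return (predicate, object) value pairs."""
    req = urllib.request.Request(
        ENDPOINT,
        data=build_query(work_iri).encode("utf-8"),
        headers={
            "Content-Type": "application/sparql-query",
            "Accept": "application/sparql-results+json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        results = json.load(resp)
    # SPARQL JSON results: bindings is a list of {var: {"value": ...}} dicts.
    return [
        (b["p"]["value"], b["o"]["value"])
        for b in results["results"]["bindings"]
    ]


if __name__ == "__main__":
    for p, o in fetch_triples("https://semopenalex.org/work/W4244332020"):
        print(p, o)
```

With a live endpoint, the loop should print one line per triple above (92 in total for this work); the network call is kept behind the `__main__` guard so the query builder can be reused offline.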