Model Selection and Multimodel Inference - A Practical Information-Theoretic Approach (Hardcover, 2nd ed. 2002. Corr. 3rd printing 2003)





Product Description

The second edition of this book is unique in that it focuses on methods for making formal statistical inference from all the models in an a priori set (multimodel inference). A philosophy is presented for model-based data analysis, and a general strategy is outlined for the analysis of empirical data. The book calls for increased attention to a priori scientific hypotheses and modeling. Kullback-Leibler information represents a fundamental quantity in science and is Hirotugu Akaike's basis for model selection. The maximized log-likelihood function can be bias-corrected to serve as an estimator of expected, relative Kullback-Leibler information. This leads to Akaike's Information Criterion (AIC) and various extensions. These methods are relatively simple and easy to use in practice, yet they rest on deep statistical theory. The information-theoretic approaches provide a unified and rigorous framework, an extension of likelihood theory, and an important application of information theory, and they are objective and practical to employ across a very wide class of empirical problems. The book presents several new ways to incorporate model-selection uncertainty into parameter estimates and estimates of precision. An array of challenging examples illustrates various technical issues. This is an applied book written primarily for biologists and statisticians who want to make inferences from multiple models, and it is suitable as a graduate text or as a reference for professional analysts.
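For orientation, the criterion at the heart of the book takes a standard form. The following is the textbook statement of AIC and of the Akaike weights used for multimodel inference, given here from general knowledge of the method rather than quoted from this book's text:

```latex
% AIC for a model with K estimated parameters and
% maximized likelihood L(theta-hat | data)
\mathrm{AIC} = -2 \ln \mathcal{L}(\hat{\theta} \mid \mathrm{data}) + 2K

% Akaike weights for multimodel inference over R candidate models,
% where \Delta_i = \mathrm{AIC}_i - \mathrm{AIC}_{\min}
w_i = \frac{\exp(-\Delta_i / 2)}{\sum_{r=1}^{R} \exp(-\Delta_r / 2)}
```

The weight w_i is interpreted as the relative support for model i, and it is these weights that allow estimates and their precision to be averaged across the full model set rather than conditioned on a single selected model.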


Product Details

General

Imprint: Springer-Verlag New York
Country of origin: United States
Release date: December 2003
First published: February 2004
Authors: Kenneth P. Burnham, David R. Anderson
Dimensions: 235 x 155 x 34 mm (L x W x T)
Format: Hardcover
Pages: 488
Edition: 2nd ed. 2002. Corr. 3rd printing 2003
ISBN-13: 978-0-387-95364-9
Barcode: 9780387953649
LSN: 0-387-95364-7


