Sold by Sack Fachmedien

Milton / Arnold

Introduction to Probability and Statistics: Principles and Applications for Engineering and the Computing Sciences

Medium: Book
ISBN: 978-0-07-119859-2
Publisher: MCGRAW-HILL Higher Education
Publication date: 30.11.2002
Delivery time: up to 10 days

This well-respected text is designed for the first course in probability and statistics taken by students majoring in Engineering and the Computing Sciences. The prerequisite is one year of calculus. The text offers a balanced presentation of applications and theory. The authors take care to develop the theoretical foundations for the statistical methods at a level accessible to students with only a calculus background. They explore the practical implications of the formal results for problem solving, so students gain an understanding of the logic behind the techniques as well as practice in using them. The examples, exercises, and applications were chosen specifically for students in engineering and computer science and include opportunities for real data analysis.


Product details


  • Item number: 9780071198592
  • Medium: Book
  • ISBN: 978-0-07-119859-2
  • Publisher: MCGRAW-HILL Higher Education
  • Publication date: 30.11.2002
  • Language(s): English
  • Edition: 4th edition, 2002
  • Format: Paperback
  • Weight: 1216 g
  • Pages: 816
  • Dimensions (W x H x D): 169 x 245 x 34 mm
Authors/Editors

Authors

J. Susan Milton is Professor Emeritus of Statistics at Radford University. Dr. Milton received the B.S. degree from Western Carolina University, the M.A. degree from the University of North Carolina at Chapel Hill, and the Ph.D. degree in Statistics from Virginia Polytechnic Institute and State University. She is a Danforth Associate and a recipient of the Radford University Foundation Award for Excellence in Teaching. Dr. Milton is the author of Statistical Methods in the Biological and Health Sciences as well as Introduction to Statistics, Probability with the Essential Analysis, and A First Course in the Theory of Linear Statistical Models.

Jesse C. Arnold is a Professor of Statistics at Virginia Polytechnic Institute and State University. Dr. Arnold received the B.S. degree from Southeastern State University and the M.A. and Ph.D. degrees in Statistics from Florida State University. He served as head of the Statistics Department for ten years, is a Fellow of the American Statistical Association, and is an elected member of the International Statistical Institute. He has served as President of the International Biometric Society (Eastern North American Region) and as Chairman of the Statistical Education Section of the American Statistical Association.

Chapter 1 - Introduction to Probability and Counting
1.1 Interpreting Probabilities
1.2 Sample Spaces and Events
1.3 Permutations and Combinations
Chapter Summary / Exercises / Review Exercises

Chapter 2 - Some Probability Laws
2.1 Axioms of Probability
2.2 Conditional Probability
2.3 Independence and the Multiplication Rule
2.4 Bayes' Theorem
Chapter Summary / Exercises / Review Exercises

Chapter 3 - Discrete Distributions
3.1 Random Variables
3.2 Discrete Probability Densities
3.3 Expectation and Distribution Parameters
3.4 Geometric Distribution and the Moment Generating Function
3.5 Binomial Distribution
3.6 Negative Binomial Distribution
3.7 Hypergeometric Distribution
3.8 Poisson Distribution
Chapter Summary / Exercises / Review Exercises

Chapter 4 - Continuous Distributions
4.1 Continuous Densities
4.2 Expectation and Distribution Parameters
4.3 Gamma, Exponential, and Chi-Squared Distributions
4.4 Normal Distribution
4.5 Normal Probability Rule and Chebyshev's Inequality
4.6 Normal Approximation to the Binomial Distribution
4.7 Weibull Distribution and Reliability
4.8 Transformation of Variables
4.9 Simulating a Continuous Distribution
Chapter Summary / Exercises / Review Exercises

Chapter 5 - Joint Distributions
5.1 Joint Densities and Independence
5.2 Expectation and Covariance
5.3 Correlation
5.4 Conditional Densities and Regression
5.5 Transformation of Variables
Chapter Summary / Exercises / Review Exercises

Chapter 6 - Descriptive Statistics
6.1 Random Sampling
6.2 Picturing the Distribution
6.3 Sample Statistics
6.4 Boxplots
Chapter Summary / Exercises / Review Exercises

Chapter 7 - Estimation
7.1 Point Estimation
7.2 The Method of Moments and Maximum Likelihood
7.3 Functions of Random Variables--Distribution of X
7.4 Interval Estimation and the Central Limit Theorem
Chapter Summary / Exercises / Review Exercises

Chapter 8 - Inferences on the Mean and Variance of a Distribution
8.1 Interval Estimation of Variability
8.2 Estimating the Mean and the Student-t Distribution
8.3 Hypothesis Testing
8.4 Significance Testing
8.5 Hypothesis and Significance Tests on the Mean
8.6 Hypothesis Test on the Variance
8.7 Alternative Nonparametric Methods
Chapter Summary / Exercises / Review Exercises

Chapter 9 - Inferences on Proportions
9.1 Estimating Proportions
9.2 Testing Hypotheses on a Proportion
9.3 Comparing Two Proportions: Estimation
9.4 Comparing Two Proportions: Hypothesis Testing
Chapter Summary / Exercises / Review Exercises

Chapter 10 - Comparing Two Means and Two Variances
10.1 Point Estimation: Independent Samples
10.2 Comparing Variances: The F Distribution
10.3 Comparing Means: Variances Equal (Pooled Test)
10.4 Comparing Means: Variances Unequal
10.5 Comparing Means: Paired Data
10.6 Alternative Nonparametric Methods
10.7 A Note on Technology
Chapter Summary / Exercises / Review Exercises

Chapter 11 - Simple Linear Regression and Correlation
11.1 Model and Parameter Estimation
11.2 Properties of Least-Squares Estimators
11.3 Confidence Interval Estimation and Hypothesis Testing
11.4 Repeated Measurements and Lack of Fit
11.5 Residual Analysis
11.6 Correlation
Chapter Summary / Exercises / Review Exercises

Chapter 12 - Multiple Linear Regression Models
12.1 Least-Squares Procedures for Model Fitting
12.2 A Matrix Approach to Least Squares
12.3 Properties of the Least-Squares Estimators
12.4 Interval Estimation
12.5 Testing Hypotheses about Model Parameters
12.6 Use of Indicator or