Statistics for Engineers and Scientists, 6th Edition by William Navidi – Ebook PDF Instant Download/Delivery. ISBN: 1265877815, 9781265877811
Full download of Statistics for Engineers and Scientists, 6th Edition is available after payment.
Product details:
ISBN-10 : 1265877815
ISBN-13 : 9781265877811
Author: William Navidi
Statistics for Engineers and Scientists stands out for its clear presentation of applied statistics. The book takes a practical approach to the methods of statistical modeling and data analysis most often used in scientific work. This edition features an engaging writing style that explains difficult concepts clearly, along with contemporary real-world data sets that motivate students and show direct connections to industry and research. While focusing on practical applications of statistics, the text makes extensive use of examples to motivate fundamental concepts and to develop intuition.
Statistics for Engineers and Scientists, 6th Edition – Table of Contents:
Chapter 1: Sampling and Descriptive Statistics
Chapter 1 Introduction
Introduction
The Basic Idea
1.1 Sampling
Independence
Other Sampling Methods
Types of Experiments
Types of Data
Controlled Experiments and Observational Studies
Exercises for Section 1.1
1.2 Summary Statistics
The Sample Mean
The Standard Deviation
The Sample Median
The Trimmed Mean
Outliers
Resistance to Outliers
The Mode and the Range
Quartiles
Percentiles
Summary Statistics for Categorical Data
Sample Statistics and Population Parameters
Exercises for Section 1.2
1.3 Graphical Summaries
Stem-and-Leaf Plots
Dotplots
Histograms
Unequal Class Widths
Symmetry and Skewness
Unimodal and Bimodal Histograms
Boxplots
Comparative Boxplots
Multivariate Data
Exercises for Section 1.3
Supplementary Exercises for Chapter 1
Chapter 2: Probability
Chapter 2 Introduction
Introduction
2.1 Basic Ideas
Combining Events
Mutually Exclusive Events
Probabilities
Axioms of Probability
Sample Spaces with Equally Likely Outcomes
The Addition Rule
Exercises for Section 2.1
2.2 Counting Methods
Permutations
Combinations
Exercises for Section 2.2
2.3 Conditional Probability and Independence
Independent Events
The Multiplication Rule
The Law of Total Probability
Bayes’ Rule
Application to Reliability Analysis
Exercises for Section 2.3
2.4 Random Variables
Random Variables and Populations
Discrete Random Variables
The Cumulative Distribution Function of a Discrete Random Variable
Mean and Variance for Discrete Random Variables
The Probability Histogram
Continuous Random Variables
Computing Probabilities with the Probability Density Function
The Cumulative Distribution Function of a Continuous Random Variable
Mean and Variance for Continuous Random Variables
The Population Median and Percentiles
Chebyshev’s Inequality
Exercises for Section 2.4
2.5 Linear Functions of Random Variables
Adding a Constant
Multiplying by a Constant
Means of Linear Combinations of Random Variables
Independent Random Variables
Variances of Linear Combinations of Independent Random Variables
Independence and Simple Random Samples
The Mean and Variance of a Sample Mean
Exercises for Section 2.5
2.6 Jointly Distributed Random Variables
Jointly Discrete Random Variables
Jointly Continuous Random Variables
More than Two Random Variables
Means of Functions of Random Variables
Conditional Distributions
Conditional Expectation
Independent Random Variables
Covariance
Correlation
Covariance, Correlation, and Independence
Linear Combinations of Random Variables
The Mean and Variance of a Sample Mean
Application to Portfolio Management
Exercises for Section 2.6
Supplementary Exercises for Chapter 2
Chapter 3: Propagation of Error
Chapter 3 Introduction
Introduction
3.1 Measurement Error
Exercises for Section 3.1
3.2 Linear Combinations of Measurements
Repeated Measurements
Repeated Measurements with Differing Uncertainties
Linear Combinations of Dependent Measurements
Exercises for Section 3.2
3.3 Uncertainties for Functions of One Measurement
Propagation of Error Uncertainties Are Only Approximate
Nonlinear Functions Are Biased
Relative Uncertainties for Functions of One Measurement
Exercises for Section 3.3
3.4 Uncertainties for Functions of Several Measurements
Uncertainties for Functions of Dependent Measurements
Relative Uncertainties for Functions of Several Measurements
Exercises for Section 3.4
Supplementary Exercises for Chapter 3
Chapter 4: Commonly Used Distributions
Chapter 4 Introduction
Introduction
4.1 The Bernoulli Distribution
Mean and Variance of a Bernoulli Random Variable
Exercises for Section 4.1
4.2 The Binomial Distribution
Probability Mass Function of a Binomial Random Variable
A Binomial Random Variable Is a Sum of Bernoulli Random Variables
The Mean and Variance of a Binomial Random Variable
Using a Sample Proportion to Estimate a Success Probability
Uncertainty in the Sample Proportion
Exercises for Section 4.2
4.3 The Poisson Distribution
The Mean and Variance of a Poisson Random Variable
Using the Poisson Distribution to Estimate a Rate
Uncertainty in the Estimated Rate
Exercises for Section 4.3
4.4 Some Other Discrete Distributions
The Hypergeometric Distribution
Mean and Variance of the Hypergeometric Distribution
Comparison with the Binomial Distribution
The Geometric Distribution
The Mean and Variance of a Geometric Distribution
The Negative Binomial Distribution
A Negative Binomial Random Variable Is a Sum of Geometric Random Variables
The Mean and Variance of the Negative Binomial Distribution
The Multinomial Distribution
Exercises for Section 4.4
4.5 The Normal Distribution
Estimating the Parameters of a Normal Distribution
Linear Functions of Normal Random Variables
Linear Combinations of Independent Normal Random Variables
How Can I Tell Whether My Data Come from a Normal Population?
Exercises for Section 4.5
4.6 The Lognormal Distribution
Estimating the Parameters of a Lognormal Distribution
How Can I Tell Whether My Data Come from a Lognormal Population?
Exercises for Section 4.6
4.7 The Exponential Distribution
The Exponential Distribution and the Poisson Process
Memoryless Property
Using the Exponential Distribution to Estimate a Rate
Correcting the Bias
Exercises for Section 4.7
4.8 Some Other Continuous Distributions
The Uniform Distribution
The Gamma Distribution
The Weibull Distribution
Exercises for Section 4.8
4.9 Some Principles of Point Estimation
Measuring the Goodness of an Estimator
The Method of Maximum Likelihood
Desirable Properties of Maximum Likelihood Estimators
Exercises for Section 4.9
4.10 Probability Plots
Interpreting Probability Plots
Exercises for Section 4.10
4.11 The Central Limit Theorem
Normal Approximation to the Binomial
The Continuity Correction
Accuracy of the Continuity Correction
Normal Approximation to the Poisson
Continuity Correction for the Poisson Distribution
Exercises for Section 4.11
4.12 Simulation
Using Simulation to Estimate a Probability
Estimating Means and Variances
Using Simulation to Determine Whether a Population Is Approximately Normal
Using Simulation in Reliability Analysis
Using Simulation to Estimate Bias
The Bootstrap
Parametric and Nonparametric Bootstrap
Comparison of Simulation with Propagation of Error
Exercises for Section 4.12
Supplementary Exercises for Chapter 4
Chapter 5: Confidence Intervals
Chapter 5 Introduction
Introduction
5.1 Confidence Intervals for a Population Mean, Variance Known
More About Confidence Levels
Probability versus Confidence
One-Sided Confidence Intervals
Confidence Intervals Must Be Based on Random Samples
Exercises for Section 5.1
5.2 Confidence Intervals for a Population Mean, Variance Unknown
The Student’s t Distribution
Confidence Intervals Using the Student’s t Distribution
How Do I Determine Whether the Student’s t Distribution Is Appropriate?
Approximating the Sample Size Needed for a Confidence Interval of a Specified Width
The Student’s t Distribution Is Close to Normal When the Sample Size Is Large
Exercises for Section 5.2
5.3 Confidence Intervals for Proportions
The Traditional Method
Exercises for Section 5.3
5.4 Confidence Intervals for the Difference Between Two Means
Constructing Confidence Intervals When Population Variances Are Known
Constructing Confidence Intervals When Population Variances Are Unknown
When the Populations Have Equal Variances
Don’t Assume the Population Variances Are Equal Just Because the Sample Variances Are Close
Exercises for Section 5.4
5.5 Confidence Intervals for the Difference Between Two Proportions
The Traditional Method
Exercises for Section 5.5
5.6 Confidence Intervals with Paired Data
Exercises for Section 5.6
5.7 Confidence Intervals for the Variance and Standard Deviation of a Normal Population
The Chi-Square Distribution
Confidence Intervals for the Variance of a Normal Population
Confidence Intervals for the Variance Are Sensitive to Departures from Normality
Exercises for Section 5.7
5.8 Prediction Intervals and Tolerance Intervals
Prediction Intervals
Comparison of Prediction Intervals and Confidence Intervals
One-Sided Prediction Intervals
Prediction Intervals Are Sensitive to Departures from Normality
Tolerance Intervals for a Normal Population
Exercises for Section 5.8
5.9 Using Simulation to Construct Confidence Intervals
Confidence Intervals Using the Bootstrap
Using Simulation to Evaluate Confidence Intervals
Exercises for Section 5.9
Supplementary Exercises for Chapter 5
Chapter 6: Hypothesis Testing
Chapter 6 Introduction
Introduction
6.1 Tests for a Population Mean, Variance Known
Another Way to Express H0
Exercises for Section 6.1
6.2 Drawing Conclusions from the Results of Hypothesis Tests
Statistical Significance
The P-value Is Not the Probability That H0 Is True
Choose H0 to Answer the Right Question
Statistical Significance Is Not the Same as Practical Significance
The Relationship Between Hypothesis Tests and Confidence Intervals
Exercises for Section 6.2
6.3 Tests for a Population Mean, Variance Unknown
Exercises for Section 6.3
6.4 Tests for a Population Proportion
The Sample Size Must Be Large
Relationship with Confidence Intervals for a Proportion
Exercises for Section 6.4
6.5 Tests for the Difference Between Two Means
Tests When Population Variances Are Known
Tests When Population Variances Are Unknown
When the Populations Have Equal Variances
Don’t Assume the Population Variances Are Equal Just Because the Sample Variances Are Close
Exercises for Section 6.5
6.6 Tests for the Difference Between Two Proportions
Exercises for Section 6.6
6.7 Tests with Paired Data
Exercises for Section 6.7
6.8 Distribution-Free Tests
The Wilcoxon Signed-Rank Test
Ties
Differences of Zero
Large-Sample Approximation
The Wilcoxon Rank-Sum Test
Large-Sample Approximation
Distribution-Free Methods Are Not Assumption-Free
Exercises for Section 6.8
6.9 Tests with Categorical Data
The Chi-Square Test for Homogeneity
The Chi-Square Test for Independence
Exercises for Section 6.9
6.10 Tests for Variances of Normal Populations
Testing the Variance of a Normal Population
The F Test for Equality of Variance
The F Distribution
The F Statistic for Testing Equality of Variance
The F Test Is Sensitive to Departures from Normality
The F Test Cannot Prove That Two Variances Are Equal
Exercises for Section 6.10
6.11 Fixed-Level Testing
Critical Points and Rejection Regions
Type I and Type II Errors
Exercises for Section 6.11
6.12 Power
Using a Computer to Calculate Power
Exercises for Section 6.12
6.13 Multiple Tests
The Bonferroni Method
Exercises for Section 6.13
6.14 Using Simulation to Perform Hypothesis Tests
Testing Hypotheses with Bootstrap Confidence Intervals
Randomization Tests
Using Simulation to Estimate Power
Exercises for Section 6.14
Supplementary Exercises for Chapter 6
Chapter 7: Correlation and Simple Linear Regression
Chapter 7 Introduction
Introduction
7.1 Correlation
How the Correlation Coefficient Works
The Correlation Coefficient Is Unitless
The Correlation Coefficient Measures Only Linear Association
The Correlation Coefficient Can Be Misleading When Outliers Are Present
Correlation Is Not Causation
Controlled Experiments Reduce the Risk of Confounding
Inference on the Population Correlation
Exercises for Section 7.1
7.2 The Least-Squares Line
Computing the Equation of the Least-Squares Line
Computing Formulas
Interpreting the Slope of the Least-Squares Line
The Estimates Are Not the Same as the True Values
The Residuals Are Not the Same as the Errors
Don’t Extrapolate Outside the Range of the Data
Interpreting the y-Intercept of the Least-Squares Line
Don’t Use the Least-Squares Line When the Data Aren’t Linear
Another Look at the Least-Squares Line
Measuring Goodness-of-Fit
Exercises for Section 7.2
7.3 Uncertainties in the Least-Squares Coefficients
The More Spread in the x Values, the Better (Within Reason)
Inferences on the Slope and Intercept
Inferences on the Mean Response
Prediction Intervals for Future Observations
Interpreting Computer Output
Exercises for Section 7.3
7.4 Checking Assumptions and Transforming Data
The Plot of Residuals versus Fitted Values
Transforming the Variables
Determining Which Transformation to Apply
Transformations Don’t Always Work
Residual Plots with Only a Few Points Can Be Hard to Interpret
Outliers and Influential Points
Methods Other Than Transforming Variables
Checking Independence and Normality
Empirical Models and Physical Laws
Exercises for Section 7.4
Supplementary Exercises for Chapter 7
Chapter 8: Multiple Regression
Chapter 8 Introduction
Introduction
8.1 The Multiple Regression Model
Estimating the Coefficients
Sums of Squares
The Statistics s^2, R^2, and F
An Example
Checking Assumptions in Multiple Regression
Exercises for Section 8.1
8.2 Confounding and Collinearity
Collinearity
Exercises for Section 8.2
8.3 Model Selection
Determining Whether Variables Can Be Dropped from a Model
Best Subsets Regression
Stepwise Regression
Model Selection Procedures Sometimes Find Models When They Shouldn’t
Exercises for Section 8.3
Supplementary Exercises for Chapter 8
Chapter 9: Factorial Experiments
Chapter 9 Introduction
Introduction
9.1 One-Factor Experiments
Completely Randomized Experiments
One-Way Analysis of Variance
Confidence Intervals for the Treatment Means
The ANOVA Table
Checking the Assumptions
Balanced versus Unbalanced Designs
The Analysis of Variance Identity
An Alternative Parameterization
Power
Random Effects Models
Exercises for Section 9.1
9.2 Pairwise Comparisons in One-Factor Experiments
Fisher’s Least Significant Difference (LSD) Method
The Bonferroni Method of Multiple Comparisons
The Tukey-Kramer Method of Multiple Comparisons
Exercises for Section 9.2
9.3 Two-Factor Experiments
Parameterization for Two-Way Analysis of Variance
Using Two-Way ANOVA to Test Hypotheses
Checking the Assumptions
Don’t Interpret the Main Effects When the Additive Model Doesn’t Hold
A Two-Way ANOVA Is Not the Same as Two One-Way ANOVAs
Interaction Plots
Multiple Comparisons in Two-Way ANOVA
Two-Way ANOVA When K = 1
Random Factors
Unbalanced Designs
Exercises for Section 9.3
9.4 Randomized Complete Block Designs
Multiple Comparisons in Randomized Complete Block Designs
Exercises for Section 9.4
9.5 2^p Factorial Experiments
Notation for 2^3 Factorial Experiments
Estimating Effects in a 2^3 Factorial Experiment
Interpreting Computer Output
Estimating Effects in a 2^p Factorial Experiment
Factorial Experiments without Replication
Using Probability Plots to Detect Large Effects
Fractional Factorial Experiments
Exercises for Section 9.5
Supplementary Exercises for Chapter 9
Chapter 10: Statistical Quality Control
Chapter 10 Introduction
Introduction
10.1 Basic Ideas
Collecting Data–Rational Subgroups
Control versus Capability
Process Control Must Be Done Continually
Similarities Between Control Charts and Hypothesis Tests
Exercises for Section 10.1
10.2 Control Charts for Variables
Control Chart Performance
The Western Electric Rules
The S Chart
Which Is Better, the S Chart or the R Chart?
Samples of Size 1
Exercises for Section 10.2
10.3 Control Charts for Attributes
The p Chart
Interpreting Out-of-Control Signals in Attribute Charts
The c Chart
Exercises for Section 10.3
10.4 The CUSUM Chart
Exercises for Section 10.4
10.5 Process Capability
Estimating the Proportion of Nonconforming Units from Process Capability
Six-Sigma Quality
One-Sided Tolerances
Exercises for Section 10.5
Supplementary Exercises for Chapter 10