Statistics for the Sciences

Charles Peters

Contents

1 Background
  1.1 Populations, Samples and Variables
  1.2 Types of Variables
  1.3 Random Experiments and Sample Spaces
  1.4 Computing in Statistics
  1.5 Exercises

2 Descriptive and Graphical Statistics
  2.1 Location Measures
    2.1.1 The Mean
    2.1.2 The Median and Other Quantiles
    2.1.3 Trimmed Means
    2.1.4 Grouped Data
    2.1.5 Histograms
    2.1.6 Robustness
    2.1.7 The Five Number Summary
    2.1.8 The Mode
    2.1.9 Exercises
  2.2 Measures of Variability or Scale
    2.2.1 The Variance and Standard Deviation
    2.2.2 The Coefficient of Variation
    2.2.3 The Mean and Median Absolute Deviation
    2.2.4 The Interquartile Range
    2.2.5 Boxplots
    2.2.6 Exercises
  2.3 Jointly Distributed Variables
    2.3.1 Side by Side Boxplots
    2.3.2 Scatterplots
    2.3.3 Covariance and Correlation
    2.3.4 Exercises

3 Probability
  3.1 Basic Definitions. Equally Likely Outcomes
  3.2 Combinations of Events
    3.2.1 Exercises
  3.3 Rules for Probability Measures
  3.4 Counting Outcomes. Sampling with and without Replacement
    3.4.1 Exercises
  3.5 Conditional Probability
    3.5.1 Relating Conditional and Unconditional Probabilities
    3.5.2 Bayes' Rule
  3.6 Independent Events
    3.6.1 Exercises
  3.7 Replications of a Random Experiment

4 Discrete Distributions
  4.1 Random Variables
  4.2 Discrete Random Variables
  4.3 Expected Values
    4.3.1 Exercises
  4.4 Bernoulli Random Variables
    4.4.1 The Mean and Variance of a Bernoulli Variable
  4.5 Binomial Random Variables
    4.5.1 The Mean and Variance of a Binomial Distribution
    4.5.2 Exercises
  4.6 Hypergeometric Distributions
    4.6.1 The Mean and Variance of a Hypergeometric Distribution
  4.7 Poisson Distributions
    4.7.1 The Mean and Variance of a Poisson Distribution
    4.7.2 Exercises
  4.8 Jointly Distributed Variables
    4.8.1 Covariance and Correlation
  4.9 Multinomial Distributions
    4.9.1 Exercises

5 Continuous Distributions
  5.1 Density Functions
  5.2 Expected Values and Quantiles for Continuous Distributions
    5.2.1 Expected Values
    5.2.2 Quantiles
    5.2.3 Exercises
  5.3 Uniform Distributions
  5.4 Exponential Distributions and Their Relatives
    5.4.1 Exponential Distributions
    5.4.2 Gamma Distributions
    5.4.3 Weibull Distributions
    5.4.4 Exercises
  5.5 Normal Distributions
    5.5.1 Tables of the Standard Normal Distribution
    5.5.2 Other Normal Distributions
    5.5.3 The Normal Approximation to the Binomial Distribution
    5.5.4 Exercises

6 Joint Distributions and Sampling Distributions
  6.1 Introduction
  6.2 Jointly Distributed Continuous Variables
    6.2.1 Mixed Joint Distributions
    6.2.2 Covariance and Correlation
    6.2.3 Bivariate Normal Distributions
  6.3 Independent Random Variables
    6.3.1 Exercises
  6.4 Sums of Random Variables
    6.4.1 Simulating Random Samples
  6.5 Sample Sums and the Central Limit Theorem
    6.5.1 Exercises
  6.6 Other Distributions Associated with Normal Sampling
    6.6.1 Chi Square Distributions
    6.6.2 Student t Distributions
    6.6.3 The Joint Distribution of the Sample Mean and Variance
    6.6.4 Exercises

7 Statistical Inference for a Single Population
  7.1 Introduction
  7.2 Estimation of Parameters
    7.2.1 Estimators
    7.2.2 Desirable Properties of Estimators
  7.3 Estimating a Population Mean
    7.3.1 Confidence Intervals
    7.3.2 Small Sample Confidence Intervals for a Normal Mean
    7.3.3 Exercises
  7.4 Estimating a Population Proportion
    7.4.1 Choosing the Sample Size
    7.4.2 Confidence Intervals for p
    7.4.3 Exercises
  7.5 Estimating Quantiles
    7.5.1 Exercises
  7.6 Estimating the Variance and Standard Deviation
  7.7 Hypothesis Testing
    7.7.1 Test Statistics, Type 1 and Type 2 Errors
  7.8 Hypotheses About a Population Mean
    7.8.1 Tests for the mean when the variance is unknown
  7.9 p-values
    7.9.1 Exercises
  7.10 Hypotheses About a Population Proportion
    7.10.1 Exercises

8 Regression and Correlation
  8.1 Examples of Linear Regression Problems
  8.2 Least Squares Estimates
    8.2.1 The "lm" Function in R
    8.2.2 Exercises
  8.3 Distributions of the Least Squares Estimators
    8.3.1 Exercises
  8.4 Inference for the Regression Parameters
    8.4.1 Confidence Intervals for the Parameters
    8.4.2 Hypothesis Tests for the Parameters
    8.4.3 Exercises
  8.5 Correlation
    8.5.1 Confidence intervals for ρ
    8.5.2 Exercises

9 Inference from Multiple Samples
  9.1 Comparison of Two Population Means
    9.1.1 Large Samples
    9.1.2 Comparing Two Population Proportions
    9.1.3 Samples from Normal Distributions
    9.1.4 Exercises
  9.2 Paired Observations
    9.2.1 Crossover Studies
    9.2.2 Exercises
  9.3 More than Two Independent Samples: Single Factor Analysis of Variance
    9.3.1 Example Using R
    9.3.2 Multiple Comparisons
    9.3.3 Exercises
  9.4 Two-Way Analysis of Variance
    9.4.1 Interactions Between the Factors
    9.4.2 Exercises

10 Analysis of Categorical Data
  10.1 Multinomial Distributions
    10.1.1 Estimators and Hypothesis Tests for the Parameters
    10.1.2 Multinomial Probabilities That Are Functions of Other Parameters
    10.1.3 Exercises
  10.2 Testing Equality of Multinomial Probabilities
  10.3 Independence of Attributes: Contingency Tables
    10.3.1 Exercises

11 Miscellaneous Topics
  11.1 Multiple Linear Regression
    11.1.1 Inferences Based on Normality
    11.1.2 Using R's "lm" Function for Multiple Regression
    11.1.3 Factor Variables as Predictors
    11.1.4 Exercises
  11.2 Nonparametric Methods
    11.2.1 The Signed Rank Test
    11.2.2 The Wilcoxon Rank Sum Test
    11.2.3 Exercises
  11.3 Bootstrap Confidence Intervals
    11.3.1 Exercises


Chapter 1

Background

Statistics is the art of summarizing data, depicting data, and extracting information from it. Statistics and the theory of probability are distinct subjects, although statistics depends on probability to quantify the strength of its inferences. The probability used in this course will be developed in Chapter 3 and throughout the text as needed. We begin by introducing some basic ideas and terminology.

1.1 Populations, Samples and Variables

A population is a set of individual elements whose collective properties are the subject of investigation. Usually, populations are large collections whose individual members cannot all be examined in detail. In statistical inference a manageable subset of the population is selected according to certain sampling procedures and properties of the subset are generalized to the entire population. These generalizations are accompanied by statements quantifying their accuracy and reliability. The selected subset is called a sample from the population.

Examples:

(a) the population of registered voters in a congressional district,
(b) the population of U.S. adult males,
(c) the population of currently enrolled students at a certain large urban university,
(d) the population of all transactions in the U.S. stock market for the past month,
(e) the population of all peak temperatures at points on the Earth's surface over a given time interval.

Some samples from these populations might be:

(a) the voters contacted in a pre-election telephone poll,
(b) adult males interviewed by a TV reporter,
(c) the dean's list,
(d) transactions recorded on the books of Smith Barney,
(e) peak temperatures recorded at several weather stations.

Clearly, for these particular samples, some generalizations from sample to population would be highly questionable.


A population variable is an attribute that has a value for each individual in the population. In other words, it is a function from the population to some set of possible values. It may be helpful to imagine a population as a spreadsheet with one row or record for each individual member. Along the ith row, the values of a number of attributes of the ith individual are recorded in different columns. The column headings of the spreadsheet can be thought of as the population variables. For example, if the population is the set of currently enrolled students at the urban university, some of the variables are academic classification, number of hours currently enrolled, total hours taken, grade point average, gender, ethnic classification, major, and so on. Variables, such as these, that are defined for the same population are said to be jointly observed or jointly distributed.
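Since this text uses R for computation, the spreadsheet picture can be sketched as a small data frame. The student records below are invented purely for illustration; only the idea of one row per individual and one column per variable matters.

```r
# A toy "spreadsheet" for three students: one row per individual,
# one column per jointly distributed population variable.
# All values are hypothetical.
students <- data.frame(
  hours.enrolled = c(12, 9, 15),                       # numeric variable
  gpa            = c(3.10, 2.75, 3.80),                # numeric variable
  major          = c("Biology", "History", "Biology")  # factor (categorical) variable
)
str(students)  # displays each column (variable) and its type
```

Each column heading of `students` corresponds to a population variable, and the variables are jointly observed because they are recorded for the same individuals.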

1.2 Types of Variables

Variables are classified according to the kinds of values they have. The three basic types are numeric variables, factor variables, and ordered factor variables. Numeric variables are those for which arithmetic operations such as addition and subtraction make sense. Numeric variables are often related to a scale of measurement and expressed in units, such as meters, seconds, or dollars. Factor variables are those whose values are mere names, to which arithmetic operations do not apply. Factors usually have a small number of possible values. These values might be designated by numbers. If they are, the numbers that represent distinct values are chosen merely for convenience. The values of factors might also be letters, words, or pictorial symbols. Factor variables are sometimes called nominal variables or categorical variables. Ordered factor variables are factors whose values are ordered in some natural and important way. Ordered factors are also called ordinal variables. Some textbooks have a more elaborate classification of variables, with various subtypes. The three types above are enough for our purposes.

Examples: Consider the population of students currently enrolled at a large university. Each student has a residency status, either resident or nonresident. Residency status is an unordered factor variable. Academic classification is an ordered factor with values “freshman”, “sophomore”, “junior”, “senior”, “post-baccalaureate” and “graduate student”. The number of hours enrolled is a numeric variable with integer values. The distance a student travels from home to campus is a numeric variable expressed in miles or kilometers. Home area code is an unordered factor variable whose values are designated by numbers.
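The three types correspond directly to R's numeric vectors, factors, and ordered factors. A minimal sketch, with invented values:

```r
# Unordered factor: residency status; the values are mere labels.
residency <- factor(c("resident", "nonresident", "resident"))

# Ordered factor: academic classification, with its natural ordering
# supplied through levels = ... and ordered = TRUE.
classification <- factor(
  c("junior", "freshman", "senior"),
  levels = c("freshman", "sophomore", "junior", "senior",
             "post-baccalaureate", "graduate student"),
  ordered = TRUE
)
classification[2] < classification[1]  # TRUE: "freshman" precedes "junior"

# Numeric variable: hours enrolled, so arithmetic makes sense.
hours <- c(12, 9, 15)
mean(hours)  # 12
```

Comparisons like `<` are meaningful for ordered factors and numeric vectors, but R refuses them (with a warning) for unordered factors, mirroring the distinction drawn above.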

1.3 Random Experiments and Sample Spaces

An experiment can be something as simple as flipping a coin or as complex as conducting a public opinion poll. A random experiment is one with the following two characteristics:

(1) The experiment can be replicated an indefinite number of times under essentially the same experimental conditions.

(2) There is a degree of uncertainty in the outcome of the experiment. The outcome may vary from replication to replication even though experimental conditions are the same.


When we say that an experiment can be replicated under the same conditions, we mean that controllable or observable conditions that we think might affect the outcome are the same. There may be hidden conditions that affect the outcome, but we cannot account for them. Implicit in (1) is the idea that replications of a random experiment are independent, that is, the outcomes of some replications do not affect the outcomes of others. Obviously, a random experiment is an idealization of a real experiment. Some simple experiments, such as tossing a coin, approach this ideal closely while more complicated experiments may not.

The sample space of a random experiment is the set of all its possible outcomes. We use the Greek capital letter Ω (omega) to denote the sample space. There is some degree of arbitrariness in the description of Ω. It depends on how the outcomes of the experiment are represented symbolically.

Examples:

(a) Toss a coin. Ω = {H,T}, where “H” denotes a head and “T” a tail. Another way of representing the outcome is to let the number 1 denote a head and 0 a tail (or vice-versa). If we do this, then Ω = {0, 1}. In the latter representation the outcome of the experiment is just the number of heads.

(b) Toss a coin 5 times, i.e., replicate the experiment in (a) 5 times. An outcome of this experiment is a 5-term sequence of heads and tails. A typical outcome might be indicated by (H,T,T,H,H), or by (1,0,0,1,1). Even for this little experiment it is cumbersome to list all the outcomes, so we use a shorter notation:

Ω = {(x1, x2, x3, x4, x5) | xi = 0 or xi = 1 for each i} .
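For an experiment this small, the sample space can be generated exhaustively in R. The variable name `omega` below is ours; `expand.grid` is a standard R function that forms all combinations of its arguments.

```r
# All outcomes of five coin tosses, coded 1 = head, 0 = tail.
# expand.grid builds one row for every combination of its arguments,
# so the rows of omega are exactly the 5-term sequences in the set above.
omega <- expand.grid(x1 = 0:1, x2 = 0:1, x3 = 0:1, x4 = 0:1, x5 = 0:1)
nrow(omega)  # 2^5 = 32 outcomes
```

Listing the sample space this way makes it easy to check claims about it, e.g. counting how many outcomes have exactly three heads.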

(c) Select a student randomly from the population of all currently enrolled students. The sample space is the same as the population. The word “randomly” is vague. We will define it later.

(d) Repeat the Michelson-Morley experiment to measure the speed of the Earth relative to the ether (which doesn’t exist, as we now know). The outcome of the experiment could conceivably be any nonnegative number, so we take Ω = [0,∞) = {x | x is a real number and x ≥ 0}. Uncertainty arises from the fact that this is a very delicate experiment with several sources of unpredictable error.
