maths made easy by ashish pandey

PDF generated using the open source mwlib toolkit. See http://code.pediapress.com/ for more information. PDF generated at: Mon, 27 Sep 2010 12:14:40 UTC

Description: A collection of the most frequently used maths topics.


Contents

Articles
ARITHMETIC MEAN
Arithmetic mean
Statistics
Mathematics
Median
Mean
Statistical population
Sampling (statistics)
Probability theory
Normal distribution
Standard deviation
Random variable
Probability distribution
Real number
Variance
Probability density function
Cumulative distribution function
Expected value
Discrete probability distribution
Continuous probability distribution
Probability mass function
Continuous function
Measure (mathematics)
Bias of an estimator
Probability
Pierre-Simon Laplace
Integral
Function (mathematics)
Calculus
Average

References
Article Sources and Contributors
Image Sources, Licenses and Contributors

Article Licenses
License


ARITHMETIC MEAN

Arithmetic mean

In mathematics and statistics, the arithmetic mean, often referred to simply as the mean or average when the context is clear, is a method of deriving the central tendency of a set of data. The term "arithmetic mean" is preferred in mathematics and statistics because it helps distinguish it from other averages such as the geometric and harmonic means. In addition to mathematics and statistics, the arithmetic mean is used frequently in fields such as economics, sociology, and history, and it is used to some extent in almost every academic field. For example, per capita GDP gives an approximation of the arithmetic average income of a nation's population. While the arithmetic mean is often used to report central tendencies, it is not a robust statistic, meaning that it is greatly influenced by outliers. Notably, for skewed distributions the arithmetic mean may not accord with one's notion of "middle", and robust statistics such as the median may give a better description of central tendency.

Definition

Suppose we have a data set $\{x_1, x_2, \ldots, x_n\}$. Then the arithmetic mean $\bar{x}$ is defined via the equation

$$\bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i.$$

If the list is a statistical population, then the mean of that population is called a population mean. If the list is a statistical sample, we call the resulting statistic a sample mean.
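The definition above can be sketched in a few lines of Python; the function name and the sample data are illustrative, not from the source:

```python
# Arithmetic mean: the sum of the values divided by the number of values.
def arithmetic_mean(values):
    """Return the arithmetic mean of a non-empty sequence of numbers."""
    if not values:
        raise ValueError("mean of an empty data set is undefined")
    return sum(values) / len(values)

print(arithmetic_mean([1, 2, 3, 4]))  # 2.5
```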

Motivating properties

The arithmetic mean has several properties that make it useful, especially as a measure of central tendency. These include:

- If numbers $x_1, \ldots, x_n$ have mean $\bar{x}$, then $(x_1 - \bar{x}) + (x_2 - \bar{x}) + \cdots + (x_n - \bar{x}) = 0$. Since $x_i - \bar{x}$ is the distance from a given number to the mean, one way to interpret this property is that the numbers to the left of the mean are balanced by the numbers to the right of the mean. The mean is the only single number for which the residuals defined this way sum to zero.
- If it is required to use a single number $X$ as an estimate for the value of numbers $x_1, \ldots, x_n$, then the arithmetic mean does this best, in the sense of minimizing the sum of squares $\sum_i (x_i - X)^2$ of the residuals. (It follows that the mean is also the best single predictor in the sense of having the lowest root mean squared error.)
- For a normal distribution, the arithmetic mean is equal to both the median and the mode, other measures of central tendency.
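The first two properties can be checked numerically; this is a minimal sketch with made-up data:

```python
# Residuals about the mean sum to zero, and the mean minimizes
# the sum of squared residuals.
def mean(values):
    return sum(values) / len(values)

def sum_sq_residuals(values, x):
    """Sum of squared residuals of `values` about a candidate centre x."""
    return sum((v - x) ** 2 for v in values)

data = [2, 4, 4, 4, 5, 5, 7, 9]
m = mean(data)  # 5.0

# The residuals balance out:
print(sum(v - m for v in data))  # 0.0 (up to floating-point rounding)

# Any other candidate centre gives a larger sum of squares:
for candidate in (m - 1, m - 0.1, m + 0.1, m + 1):
    assert sum_sq_residuals(data, candidate) > sum_sq_residuals(data, m)
```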

Problems

The arithmetic mean may be misinterpreted as the median, implying that most values are higher or lower than is actually the case. If the elements in the sample, when placed in some order, increase arithmetically, then the median and arithmetic average are equal. For example, consider the sample {1, 2, 3, 4}: the average is 2.5, as is the median. However, when we consider a sample that cannot be arranged into an arithmetic progression, such as {1, 2, 4, 8, 16}, the median and arithmetic average can differ significantly: in this case the arithmetic average is 6.2 and the median is 4. When one looks at the arithmetic average of a sample, one must note that the average value can vary significantly from most values in the sample. There are applications of this phenomenon in fields such as economics. For example, since the 1980s in the United States, median income has increased more slowly than the arithmetic average of income. Ben Bernanke has speculated that the difference can be accounted for through technology, and less so via the decline in labour unions and other factors.[1]
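Python's standard `statistics` module reproduces the two examples above:

```python
import statistics

sample = [1, 2, 4, 8, 16]              # not an arithmetic progression
print(statistics.mean(sample))         # 6.2
print(statistics.median(sample))       # 4

progression = [1, 2, 3, 4]             # an arithmetic progression
print(statistics.mean(progression))    # 2.5
print(statistics.median(progression))  # 2.5 (mean and median coincide)
```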


Angles

Particular care must be taken when using cyclic data such as phases or angles. Naïvely taking the arithmetic mean of 1° and 359° yields a result of 180°. This is incorrect for two reasons: Firstly, angle measurements are only defined up to an additive multiple of 360° (or 2π, if measuring in radians). Thus one could as easily call these 1° and −1°, or 1° and 719°, each pair giving a different average. Secondly, in this situation, 0° (equivalently, 360°) is geometrically a better average value: there is lower dispersion about it (the points are both 1° from it, and 179° from 180°, the putative average). In general application, such an oversight will lead to the average value artificially moving towards the middle of the numerical range. A solution to this problem is to use the optimization formulation (viz., define the mean as the central point, the point about which one has the lowest dispersion) and redefine the difference as a modular distance (i.e., the distance on the circle, so that the modular distance between 1° and 359° is 2°, not 358°).
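One standard way to implement this fix is to average the angles as unit vectors and take the direction of the resulting sum; the function below is an illustrative sketch of that approach:

```python
import math

def circular_mean_deg(angles):
    """Mean direction of angles (in degrees) via unit-vector averaging."""
    s = sum(math.sin(math.radians(a)) for a in angles)
    c = sum(math.cos(math.radians(a)) for a in angles)
    return math.degrees(math.atan2(s, c)) % 360

print((1 + 359) / 2)                # 180.0: the naive arithmetic mean
print(circular_mean_deg([1, 359]))  # approximately 0: the geometric answer
```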

See also

Assumed mean
Average
Central tendency
Empirical measure
Fréchet mean
Generalized mean
Geometric mean
Inequality of arithmetic and geometric means
Mean
Median
Mode
Muirhead's inequality
Sample mean and covariance
Sample size
Standard deviation
Summary statistics
Variance

Further reading

Darrell Huff, How to Lie with Statistics, Victor Gollancz, 1954 (ISBN 0-393-31072-8).

External links

Calculations and comparisons between arithmetic and geometric mean of two numbers [2]
Mean or Average [3]

References

[1] Ben S. Bernanke. "The Level and Distribution of Economic Well-Being" (http://www.federalreserve.gov/newsevents/speech/bernanke20070206a.htm). Retrieved 23 July 2010.
[2] http://www.sengpielaudio.com/calculator-geommean.htm
[3] http://people.revoledu.com/kardi/tutorial/BasicMath/Average/index.html

Statistics


Statistics is the science of the collection, organization, and interpretation of data.[1][2] It deals with all aspects of this, including the planning of data collection in terms of the design of surveys and experiments.[1] A statistician is someone who is particularly well versed in the ways of thinking necessary for the successful application of statistical analysis. Such people have often gained this experience through working in any of a wide number of fields. There is also a discipline called mathematical statistics, which is concerned with the theoretical basis of the subject. The word statistics can be either singular or plural.[3] When it refers to the discipline, "statistics" is singular, as in "Statistics is an art." When it refers to quantities (such as mean and median) calculated from a set of data,[4] statistics is plural, as in "These statistics are misleading."

Scope

Statistics is considered by some to be a mathematical science pertaining to the collection, analysis, interpretation or explanation, and presentation of data,[5] while others consider it a branch of mathematics[6] concerned with collecting and interpreting data.[7] Because of its empirical roots and its focus on applications, statistics is usually considered a distinct mathematical science rather than a branch of mathematics.[8][9] Statisticians improve the quality of data with the design of experiments and survey sampling. Statistics also provides tools for prediction and forecasting using data and statistical models.

[Figure: More probability density is found the closer one gets to the expected (mean) value in a normal distribution. Statistics used in standardized testing assessment are shown; the scales include standard deviations, cumulative percentages, percentile equivalents, Z-scores, T-scores, standard nines, and percentages in standard nines.]

Statistics is applicable to a wide variety of academic disciplines, including natural and social sciences, government, and business. Statistical methods can be used to summarize or describe a collection of data; this is called descriptive statistics. This is useful in research, when communicating the results of experiments. In addition, patterns in the data may be modeled in a way that accounts for randomness and uncertainty in the observations, and then used to draw inferences about the process or population being studied; this is called inferential statistics. Inference is a vital element of scientific advance, since it provides a prediction (based on data) for where a theory logically leads. To further prove the guiding theory, these predictions are tested as well, as part of the scientific method. If the inference holds true, then the descriptive statistics of the new data increase the soundness of that hypothesis.

Descriptive statistics and inferential statistics (a.k.a. predictive statistics) together comprise applied statistics.[10]



History

Some scholars pinpoint the origin of statistics to 1663, with the publication of Natural and Political Observations upon the Bills of Mortality by John Graunt.[11] Early applications of statistical thinking revolved around the needs of states to base policy on demographic and economic data, hence its stat- etymology. The scope of the discipline of statistics broadened in the early 19th century to include the collection and analysis of data in general. Today, statistics is widely employed in government, business, and the natural and social sciences. Its mathematical foundations were laid in the 17th century with the development of probability theory by Blaise Pascal and Pierre de Fermat. Probability theory arose from the study of games of chance. The method of least squares was first described by Carl Friedrich Gauss around 1794. The use of modern computers has expedited large-scale statistical computation, and has also made possible new methods that are impractical to perform manually.

Overview

In applying statistics to a scientific, industrial, or societal problem, it is necessary to begin with a population or process to be studied. Populations can be diverse topics, such as "all persons living in a country" or "every atom composing a crystal". A population can also be composed of observations of a process at various times, with the data from each observation serving as a different member of the overall group. Data collected about this kind of "population" constitutes what is called a time series.

For practical reasons, a chosen subset of the population, called a sample, is studied, as opposed to compiling data about the entire group (an operation called a census). Once a sample that is representative of the population is determined, data is collected for the sample members in an observational or experimental setting. This data can then be subjected to statistical analysis, serving two related purposes: description and inference.

Descriptive statistics summarize the population data by describing what was observed in the sample numerically or graphically. Numerical descriptors include mean and standard deviation for continuous data types (like heights or weights), while frequency and percentage are more useful for describing categorical data (like race).

Inferential statistics uses patterns in the sample data to draw inferences about the population represented, accounting for randomness. These inferences may take the form of answering yes/no questions about the data (hypothesis testing), estimating numerical characteristics of the data (estimation), describing associations within the data (correlation), modeling relationships within the data (regression), extrapolation, interpolation, or other modeling techniques like ANOVA, time series, and data mining.

"... it is only the manipulation of uncertainty that interests us. We are not concerned with the matter that is uncertain. Thus we do not study the mechanism of rain; only whether it will rain."
Dennis Lindley, "The Philosophy of Statistics", The Statistician (2000).
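The descriptive step mentioned above (mean and standard deviation for continuous data, frequencies for categorical data) can be sketched with Python's standard library; the data values here are made up for illustration:

```python
import statistics
from collections import Counter

heights = [160, 165, 170, 175, 180]          # continuous data (hypothetical)
print(statistics.mean(heights))              # 170
print(round(statistics.stdev(heights), 3))   # 7.906 (sample standard deviation)

colors = ["red", "blue", "red", "green"]     # categorical data (hypothetical)
counts = Counter(colors)
print(counts["red"] / len(colors))           # relative frequency of "red": 0.5
```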

The concept of correlation is particularly noteworthy for the potential confusion it can cause. Statistical analysis of a data set often reveals that two variables (properties) of the population under consideration tend to vary together, as if they were connected. For example, a study of annual income that also looks at age of death might find that poor people tend to have shorter lives than affluent people. The two variables are said to be correlated; however, they may or may not be the cause of one another. The correlation could be caused by a third, previously unconsidered phenomenon, called a lurking variable or confounding variable. For this reason, there is no way to immediately infer the existence of a causal relationship between the two variables. (See Correlation does not imply causation.) For a sample to be used as a guide to an entire population, it is important that it is truly representative of that overall population. Representative sampling assures that the inferences and conclusions can be safely extended from the sample to the population as a whole. A major problem lies in determining the extent to which the sample chosen is actually representative. Statistics offers methods to estimate and correct for any random trending within the sample

and data collection procedures. There are also methods for designing experiments that can lessen these issues at the outset of a study, strengthening its capability to discern truths about the population. Statisticians describe stronger methods as more "robust". (See experimental design.) Randomness is studied using the mathematical discipline of probability theory. Probability is used in "mathematical statistics" (alternatively, "statistical theory") to study the sampling distributions of sample statistics and, more generally, the properties of statistical procedures. The use of any statistical method is valid when the system or population under consideration satisfies the assumptions of the method. Misuse of statistics can produce subtle but serious errors in description and interpretation: subtle in the sense that even experienced professionals make such errors, and serious in the sense that they can lead to devastating decision errors. For instance, social policy, medical practice, and the reliability of structures like bridges all rely on the proper use of statistics. Even when statistics are correctly applied, the results can be difficult to interpret for those lacking expertise. The statistical significance of a trend in the data, which measures the extent to which a trend could be caused by random variation in the sample, may or may not agree with an intuitive sense of its significance. The set of basic statistical skills (and skepticism) that people need to deal properly with information in their everyday lives is referred to as statistical literacy.


Statistical methods

Experimental and observational studies

A common goal for a statistical research project is to investigate causality, and in particular to draw a conclusion about the effect of changes in the values of predictors or independent variables on dependent variables or responses. There are two major types of causal statistical studies: experimental studies and observational studies. In both types of studies, the effect of differences of an independent variable (or variables) on the behavior of the dependent variable is observed. The difference between the two types lies in how the study is actually conducted. Each can be very effective. An experimental study involves taking measurements of the system under study, manipulating the system, and then taking additional measurements using the same procedure to determine whether the manipulation has modified the values of the measurements. In contrast, an observational study does not involve experimental manipulation. Instead, data are gathered and correlations between predictors and response are investigated.

Experiments

The basic steps of a statistical experiment are:

1. Planning the research, including finding the number of replicates of the study, using the following information: preliminary estimates regarding the size of treatment effects, alternative hypotheses, and the estimated experimental variability. Consideration of the selection of experimental subjects and the ethics of research is necessary. Statisticians recommend that experiments compare (at least) one new treatment with a standard treatment or control, to allow an unbiased estimate of the difference in treatment effects.
2. Design of experiments, using blocking to reduce the influence of confounding variables, and randomized assignment of treatments to subjects to allow unbiased estimates of treatment effects and experimental error.
At this stage, the experimenters and statisticians write the experimental protocol that will guide the performance of the experiment and that specifies the primary analysis of the experimental data.
3. Performing the experiment following the experimental protocol, and analyzing the data following the experimental protocol.
4. Further examining the data set in secondary analyses, to suggest new hypotheses for future study.
5. Documenting and presenting the results of the study.

Experiments on human behavior have special concerns. The famous Hawthorne study examined changes to the working environment at the Hawthorne plant of the Western Electric Company. The researchers were interested in

determining whether increased illumination would increase the productivity of the assembly line workers. The researchers first measured the productivity in the plant, then modified the illumination in an area of the plant, and checked whether the changes in illumination affected productivity. It turned out that productivity indeed improved (under the experimental conditions). However, the study is heavily criticized today for errors in experimental procedures, specifically for the lack of a control group and blindness. The Hawthorne effect refers to the finding that an outcome (in this case, worker productivity) changed due to observation itself. Those in the Hawthorne study became more productive not because the lighting was changed but because they were being observed.

Observational study

An example of an observational study is one that explores the correlation between smoking and lung cancer. This type of study typically uses a survey to collect observations about the area of interest and then performs statistical analysis. In this case, the researchers would collect observations of both smokers and non-smokers, perhaps through a case-control study, and then look for the number of cases of lung cancer in each group.


Levels of measurement

There are four main levels of measurement used in statistics: nominal, ordinal, interval, and ratio. They have different degrees of usefulness in statistical research. Ratio measurements have both a meaningful zero value and the distances between different measurements defined; they provide the greatest flexibility in statistical methods that can be used for analyzing the data. Interval measurements have meaningful distances between measurements defined, but the zero value is arbitrary (as in the case of longitude and temperature measurements in Celsius or Fahrenheit). Ordinal measurements have imprecise differences between consecutive values, but have a meaningful order to those values. Nominal measurements have no meaningful rank order among values. Because variables conforming only to nominal or ordinal measurements cannot be reasonably measured numerically, they are sometimes grouped together as categorical variables, whereas ratio and interval measurements are grouped together as quantitative or continuous variables due to their numerical nature.

Key terms used in statistics

Null hypothesis

Interpretation of statistical information can often involve the development of a null hypothesis: the assumption is that whatever is proposed as a cause has no effect on the variable being measured. The best illustration for a novice is the predicament encountered by a jury trial. The null hypothesis, H0, asserts that the defendant is innocent, whereas the alternative hypothesis, H1, asserts that the defendant is guilty. The indictment comes because of suspicion of guilt. The H0 (status quo) stands in opposition to H1 and is maintained unless H1 is supported by evidence beyond a reasonable doubt. However, failure to reject H0 in this case does not imply innocence, but merely that the evidence was insufficient to convict. So the jury does not necessarily accept H0 but fails to reject H0. While to the casual observer the difference appears moot, misunderstanding the difference is one of the most common and arguably most serious errors made by non-statisticians. Failure to reject H0 does NOT prove that H0 is true, as any crook with a good lawyer who gets off because of insufficient evidence can attest. While one cannot prove a null hypothesis, one can test how close it is to being true with a power test, which tests for type II errors.
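As a toy illustration of this logic (all numbers here are hypothetical), one can simulate how often the null hypothesis of a fair coin would produce a result at least as extreme as an observed one:

```python
# H0: the coin is fair (p = 0.5). We estimate, by simulation, the chance
# that a fair coin yields at least the observed number of heads
# (a simulated one-sided p-value).
import random

random.seed(0)

observed_heads = 62   # hypothetical outcome of 100 tosses
n_tosses = 100
trials = 10_000

extreme = sum(
    1
    for _ in range(trials)
    if sum(random.random() < 0.5 for _ in range(n_tosses)) >= observed_heads
)
p_value = extreme / trials
print(p_value)  # small: evidence against H0, but not proof of H1
```

A small p-value here is grounds to reject H0, not proof that H1 is true, which mirrors the jury analogy above.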

Error

Working from a null hypothesis, two basic forms of error are recognised: Type I errors, where the null hypothesis is falsely rejected, giving a "false positive"; and Type II errors, where the null hypothesis fails to be rejected and an actual difference between populations is missed. Error also refers to the extent to which individual observations in a sample differ from a central value, such as the sample or population mean. Many statistical methods seek to minimize the mean squared error, and these are called "methods of least squares". Measurement processes that generate statistical data are also subject to error. Many of these errors are classified as random (noise) or systematic (bias), but other types of errors (e.g., blunders, such as when an analyst reports incorrect units) can also be important.

Confidence intervals

Most studies sample only part of a population, and the result is then used to interpret the null hypothesis in the context of the whole population. Any estimates obtained from the sample only approximate the population value. Confidence intervals allow statisticians to express how closely the sample estimate matches the true value in the whole population. Often they are expressed as 95% confidence intervals. Formally, a 95% confidence interval of a procedure is a range where, if the sampling and analysis were repeated under the same conditions, the interval would include the true (population) value 95% of the time. This does not imply that the probability that the true value is in the confidence interval is 95%. One quantity that is a probability for an estimated value is the credible interval from Bayesian statistics.

Significance

Statistics rarely gives a simple yes/no answer to the question asked. Interpretation often comes down to the level of statistical significance applied to the numbers, which often refers to the probability of a value accurately rejecting the null hypothesis (sometimes referred to as the p-value). Referring to statistical significance does not necessarily mean that the overall result is significant in real-world terms. For example, in a large study of a drug it may be shown that the drug has a statistically significant but very small beneficial effect, such that the drug is unlikely to help the patient in a noticeable way.
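A minimal sketch of a 95% confidence interval for a mean, using the normal approximation (z ≈ 1.96) and a small hypothetical sample:

```python
import math
import statistics

sample = [5.1, 4.9, 5.0, 5.2, 4.8, 5.0, 5.1, 4.9]  # hypothetical measurements
n = len(sample)
m = statistics.mean(sample)                 # point estimate of the mean
se = statistics.stdev(sample) / math.sqrt(n)  # standard error of the mean

low, high = m - 1.96 * se, m + 1.96 * se
print(f"95% CI: ({low:.3f}, {high:.3f})")
```

Note that for a sample this small a t-based interval would be more appropriate; the z value is used here only to keep the sketch short.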


Examples

Some well-known statistical tests and procedures are:

Analysis of variance (ANOVA)
Chi-square test
Correlation
Factor analysis
Mann-Whitney U test
Mean square weighted deviation (MSWD)
Pearson product-moment correlation coefficient
Regression analysis
Spearman's rank correlation coefficient
Student's t-test
Time series analysis



Specialized disciplines

Some fields of inquiry use applied statistics so extensively that they have specialized terminology. These disciplines include:

Actuarial science
Applied information economics
Biostatistics
Business statistics
Chemometrics (for analysis of data from chemistry)
Data mining (applying statistics and pattern recognition to discover knowledge from data)
Demography
Econometrics
Energy statistics
Engineering statistics
Epidemiology
Geography and Geographic Information Systems, specifically in Spatial analysis
Image processing
Psychological statistics
Reliability engineering
Social statistics

In addition, there are particular types of statistical analysis that have also developed their own specialised terminology and methodology:

Bootstrap and jackknife resampling
Statistical classification
Statistical surveys
Structured data analysis (statistics)
Survival analysis
Statistics in various sports, particularly baseball and cricket

Statistics is a key tool in business and manufacturing as well. It is used to understand measurement systems variability, to control processes (as in statistical process control, or SPC), to summarize data, and to make data-driven decisions. In these roles, it is a key tool, and perhaps the only reliable tool.



Statistical computing

The rapid and sustained increases in computing power starting from the second half of the 20th century have had a substantial impact on the practice of statistical science. Early statistical models were almost always from the class of linear models, but powerful computers, coupled with suitable numerical algorithms, caused an increased interest in nonlinear models (such as neural networks) as well as the creation of new types, such as generalized linear models and multilevel models. Increased computing power has also led to the growing popularity of computationally intensive methods based on resampling, such as permutation tests and the bootstrap, while techniques such as Gibbs sampling have made the use of Bayesian models more feasible.

[Figure: gretl, an example of an open source statistical package.]

The computer revolution has implications for the future of statistics, with a new emphasis on "experimental" and "empirical" statistics. A large number of both general- and special-purpose statistical software packages are now available.
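The bootstrap mentioned above is a good example of such a computationally intensive method: resample the data with replacement many times and look at the spread of the recomputed statistic. A minimal sketch with made-up data:

```python
# Bootstrap estimate of the standard error of the sample mean:
# resample the data with replacement and recompute the mean each time.
import random
import statistics

random.seed(42)
data = [2, 4, 4, 4, 5, 5, 7, 9]  # hypothetical sample

boot_means = [
    statistics.mean(random.choices(data, k=len(data)))
    for _ in range(5000)
]

# The spread of the bootstrap means approximates the standard error
# of the sample mean (roughly sd/sqrt(n), about 0.7 for these data).
print(statistics.stdev(boot_means))
```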

Misuse

There is a general perception that statistical knowledge is all too frequently intentionally misused, by finding ways to interpret only the data that are favorable to the presenter. The famous saying "There are three kinds of lies: lies, damned lies, and statistics",[12] which was popularized in the USA by Samuel Clemens and incorrectly attributed by him to Disraeli (1804-1881), has come to represent the general mistrust (and misunderstanding) of statistical science. Harvard President Lawrence Lowell wrote in 1909 that statistics, "...like veal pies, are good if you know the person that made them, and are sure of the ingredients." If various studies appear to contradict one another, the public may come to distrust such studies. For example, one study may suggest that a given diet or activity raises blood pressure, while another may suggest that it lowers blood pressure. The discrepancy can arise from subtle variations in experimental design, such as differences in the patient groups or research protocols, which are not easily understood by the non-expert. (Media reports usually omit this vital contextual information entirely, because of its complexity.) By choosing (or rejecting, or modifying) a certain sample, results can be manipulated. Such manipulations need not be malicious or devious; they can arise from unintentional biases of the researcher. The graphs used to summarize data can also be misleading. Deeper criticisms come from the fact that the hypothesis-testing approach, widely used and in many cases required by law or regulation, forces one hypothesis (the null hypothesis) to be "favored", and can also seem to exaggerate the importance of minor differences in large studies. A difference that is highly statistically significant can still be of no practical significance. (See criticism of hypothesis testing and controversy over the null hypothesis.)
One response is to give greater emphasis to the p-value than simply reporting whether a hypothesis is rejected at a given level of significance. The p-value, however, does not indicate the size of the effect. Another increasingly common approach is to report confidence intervals. Although these are produced from the same calculations as those of hypothesis tests or p-values, they describe both the size of the effect and the uncertainty surrounding it.
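The point about significance versus effect size can be sketched numerically (all figures hypothetical): as the sample grows, a tiny effect acquires a tight confidence interval that excludes zero, becoming "significant" while remaining small.

```python
# A tiny hypothetical mean difference with standard deviation 1.0:
# the 95% interval narrows as n grows, but the effect stays tiny.
import math

effect = 0.02   # hypothetical effect size
sd = 1.0
for n in (100, 10_000, 1_000_000):
    se = sd / math.sqrt(n)
    print(n, f"{effect - 1.96 * se:+.4f} .. {effect + 1.96 * se:+.4f}")
```

At n = 100 the interval includes zero (not significant); at n = 10,000 it already excludes zero, even though an effect of 0.02 standard deviations is of no practical importance.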



Statistics applied to mathematics or the arts

Traditionally, statistics was concerned with drawing inferences using a semi-standardized methodology that was "required learning" in most sciences. This has changed with the use of statistics in non-inferential contexts. What was once considered a dry subject, taken in many fields as a degree requirement, is now viewed enthusiastically. Initially derided by some mathematical purists, it is now considered essential methodology in certain areas. In number theory, scatter plots of data generated by a distribution function may be transformed with familiar tools used in statistics to reveal underlying patterns, which may then lead to hypotheses. Methods of statistics, including predictive methods in forecasting, are combined with chaos theory and fractal geometry to create video works that are considered to have great beauty. The process art of Jackson Pollock relied on artistic experiments whereby underlying distributions in nature were artistically revealed. With the advent of computers, statistical methods were applied to formalize such distribution-driven natural processes in order to make and analyze moving video art. Methods of statistics may be used predictively in performance art, as in a card trick based on a Markov process that only works some of the time, the occasion of which can be predicted using statistical methodology. Statistics is also used predictively to create art, as in the applications of statistical mechanics to the statistical or stochastic music invented by Iannis Xenakis, where the music is performance-specific. Though this type of artistry does not always come out as expected, it does behave within a range predictable using statistics.

See also

Glossary of probability and statistics
Index of statistics articles
List of academic statistical associations
List of important publications in statistics
List of national and international statistical services
List of statistical packages (software)
Notation in probability and statistics
Forecasting
Foundations of statistics
Multivariate statistics
Official statistics
Regression analysis
Statistical consultants
Statistical literacy
Statistical modeling
Statistician, List of statisticians
Structural equation modeling

Related disciplines

Biostatistics
Computational biology
Computational sociology
Network biology
Social science
Sociology
Positivism
Social research



References

Best, Joel (2001). Damned Lies and Statistics: Untangling Numbers from the Media, Politicians, and Activists. University of California Press. ISBN 0-520-21978-3.
Desrosières, Alain (2004). The Politics of Large Numbers: A History of Statistical Reasoning. Trans. Camille Naish. Harvard University Press. ISBN 0-674-68932-1.
Hacking, Ian (1990). The Taming of Chance. Cambridge University Press. ISBN 0-521-38884-8.
Lindley, D.V. (1985). Making Decisions (2nd ed.). John Wiley & Sons. ISBN 0-471-90808-8.
Tijms, Henk (2004). Understanding Probability: Chance Rules in Everyday Life. Cambridge University Press. ISBN 0-521-83329-9.

External links

Online non-commercial textbooks
"A New View of Statistics" [13], by Will G. Hopkins, AUT University
"NIST/SEMATECH e-Handbook of Statistical Methods" [14], by U.S. National Institute of Standards and Technology and SEMATECH
"Online Statistics: An Interactive Multimedia Course of Study" [15], by David Lane, Joan Lu, Camille Peres, Emily Zitek, et al.
"The Little Handbook of Statistical Practice" [16], by Gerard E. Dallal [17], Tufts University
"StatSoft Electronic Textbook" [18], by StatSoft [19]

Other non-commercial resources
Statistics [20] (OECD)
Probability Web [21] (Carleton College)
Free online statistics course with interactive practice exercises [22] (Carnegie Mellon University)
Resources for Teaching and Learning about Probability and Statistics [23] (ERIC)
Rice Virtual Lab in Statistics [24] (Rice University)
Statistical Science Web [25] (University of Melbourne)
Applied statistics applets [26]
Statlib: data and software archives [27]
StatProb [28], a peer-reviewed statistics and probability encyclopedia, sponsored by a collaborative of statistics and probability societies [29]

References
[1] Dodge, Y. (2003). The Oxford Dictionary of Statistical Terms, OUP. ISBN 0-19-920613-9
[2] The Free Online Dictionary (http://www.thefreedictionary.com/dict.asp?Word=statistics)
[3] "Statistics" (http://www.merriam-webster.com/dictionary/statistics). Merriam-Webster Online Dictionary.
[4] "Statistic" (http://www.merriam-webster.com/dictionary/statistic). Merriam-Webster Online Dictionary.
[5] Moses, Lincoln E. Think and Explain with Statistics, pp. 1–3. Addison-Wesley, 1986.
[6] Hays, William Lee, Statistics for the Social Sciences, Holt, Rinehart and Winston, 1973, p. xii, ISBN 978-0-03-077945-9
[7] Statistics at Encyclopedia of Mathematics (http://us.oocities.com/mathfair2002/school/plans.htm)
[8] Moore, David (1992). "Teaching Statistics as a Respectable Subject". Statistics for the Twenty-First Century. Washington, DC: The Mathematical Association of America. pp. 14–25.
[9] Chance, Beth L.; Rossman, Allan J. (2005). "Preface" (http://www.rossmanchance.com/iscam/preface.pdf). Investigating Statistical Concepts, Applications, and Methods. Duxbury Press. ISBN 978-0495050643.
[10] Anderson, D.R.; Sweeney, D.J.; Williams, T.A. Statistics: Concepts and Applications, pp. 5–9. West Publishing Company, 1986.
[11] Willcox, Walter (1938). The Founder of Statistics. (http://www.jstor.org/stable/1400906) Review of the International Statistical Institute 5(4): 321–328.

[12] Leonard H. Courtney (1832–1918) in a speech at Saratoga Springs, New York, August 1895, in which this sentence appeared: "After all, facts are facts, and although we may quote one to another with a chuckle the words of the Wise Statesman, 'Lies, damned lies, and statistics', still there are some easy figures the simplest must understand, and the astutest cannot wriggle out of." This is the earliest documented use of the exact phrase.
[13] http://sportsci.org/resource/stats/
[14] http://www.itl.nist.gov/div898/handbook/
[15] http://onlinestatbook.com/index.html
[16] http://www.StatisticalPractice.com
[17] http://www.tufts.edu/~gdallal/
[18] http://www.statsoft.com/textbook/stathome.html
[19] http://www.statsoft.com/index.htm
[20] http://stats.oecd.org/Index.aspx
[21] http://www.mathcs.carleton.edu/probweb/probweb.html
[22] http://oli.web.cmu.edu/openlearning/forstudents/freecourses/statistics
[23] http://www.ericdigests.org/2000-2/resources.htm
[24] http://www.onlinestatbook.com/rvls.html
[25] http://www.statsci.org
[26] http://www.mbhs.edu/~steind00/statistics.html
[27] http://lib.stat.cmu.edu/
[28] http://statprob.com/encyclopedia
[29] http://statprob.com/?op=about


Mathematics

Mathematics is the study of quantity, structure, space, and change. Mathematicians seek out patterns,[2] [3] formulate new conjectures, and establish truth by rigorous deduction from appropriately chosen axioms and definitions.[4] There is debate over whether mathematical objects such as numbers and points exist naturally or are human creations. The mathematician Benjamin Peirce called mathematics "the science that draws necessary conclusions".[5] Albert Einstein, on the other hand, stated that "as far as the laws of mathematics refer to reality, they are not certain; and as far as they are certain, they do not refer to reality."[6]

Through the use of abstraction and logical reasoning, mathematics evolved from counting, calculation, measurement, and the systematic study of the shapes and motions of physical objects. Practical mathematics has been a human activity for as far back as written records exist. Rigorous arguments first appeared in Greek mathematics, most notably in Euclid's Elements. Mathematics continued to develop, for example in China in 300 BC, in India in AD 100, and in the Muslim world in AD 800, until the Renaissance, when mathematical innovations interacting with new scientific discoveries led to a rapid increase in the rate of mathematical discovery that continues to the present day.[7] Mathematics is used throughout the world as an essential tool in many fields, including natural science, engineering, medicine, and the social sciences. Applied mathematics, the branch of mathematics concerned with application of mathematical knowledge to other fields, inspires and makes use of new mathematical discoveries and sometimes leads to the development of entirely new mathematical disciplines, such as statistics and game theory. Mathematicians also engage in pure mathematics, or mathematics for its own sake, without having any application in mind, although practical applications for what began as pure mathematics are often discovered.[8]

Euclid, Greek mathematician, 3rd century BC, as imagined by Raphael in this detail from The School of Athens.[1]


Etymology

The word "mathematics" comes from the Greek μάθημα (máthēma), which means learning, study, science, and additionally came to have the narrower and more technical meaning "mathematical study", even in Classical times.[9] Its adjective is μαθηματικός (mathēmatikós), related to learning, or studious, which likewise further came to mean mathematical. In particular, μαθηματικὴ τέχνη (mathēmatikḗ tékhnē), Latin: ars mathematica, meant the mathematical art. The apparent plural form in English, like the French plural form les mathématiques (and the less commonly used singular derivative la mathématique), goes back to the Latin neuter plural mathematica (Cicero), based on the Greek plural τὰ μαθηματικά (ta mathēmatiká), used by Aristotle, and meaning roughly "all things mathematical"; although it is plausible that English borrowed only the adjective mathematic(al) and formed the noun mathematics anew, after the pattern of physics and metaphysics, which were inherited from the Greek.[10] In English, the noun mathematics takes singular verb forms. It is often shortened to maths or, in English-speaking North America, math.

History

The evolution of mathematics might be seen as an ever-increasing series of abstractions, or alternatively an expansion of subject matter. The first abstraction, which is shared by many animals,[11] was probably that of numbers: the realization that a collection of two apples and a collection of two oranges (for example) have something in common, namely the quantity of their members. In addition to recognizing how to count physical objects, prehistoric peoples also recognized how to count abstract quantities, like time: days, seasons, years.[12] Elementary arithmetic (addition, subtraction, multiplication and division) naturally followed. Since numeracy pre-dated writing, further steps were needed for recording numbers, such as tallies or the knotted strings called quipu used by the Inca to store numerical data. Numeral systems have been many and diverse, with the first known written numerals created by Egyptians in Middle Kingdom texts such as the Rhind Mathematical Papyrus.

Pythagoras (c. 570 – c. 495 BC) has commonly been given credit for discovering the Pythagorean theorem. Well-known figures in Greek mathematics also include Euclid, Archimedes, and Thales.


The earliest uses of mathematics were in trading, land measurement, painting and weaving patterns and the recording of time. More complex mathematics did not appear until around 3000 BC, when the Babylonians and Egyptians began using arithmetic, algebra and geometry for taxation and other financial calculations, for building and construction, and for astronomy.[13] The systematic study of mathematics in its own right began with the Ancient Greeks between 600 and 300 BC.[14] Mathematics has since been greatly extended, and there has been a fruitful interaction between mathematics and science, to the benefit of both. Mathematical discoveries continue to be made today. According to Mikhail B. Sevryuk, in the January 2006 issue of the Bulletin of the American Mathematical Society, "The number of papers and books included in the Mathematical Reviews database since 1940 (the first year of operation of MR) is now more than 1.9 million, and more than 75 thousand items are added to the database each year. The overwhelming majority of works in this ocean contain new mathematical theorems and their proofs."[15]

Mayan numerals

Inspiration, pure and applied mathematics, and aesthetics

Mathematics arises from many different kinds of problems. At first these were found in commerce, land measurement, architecture and later astronomy; nowadays, all sciences suggest problems studied by mathematicians, and many problems arise within mathematics itself. For example, the physicist Richard Feynman invented the path integral formulation of quantum mechanics using a combination of mathematical reasoning and physical insight, and today's string theory, a still-developing scientific theory which attempts to unify the four fundamental forces of nature, continues to inspire new mathematics.[16] Some mathematics is only relevant in the area that inspired it, and is applied to solve further problems in that area. But often mathematics inspired by one area proves useful in many areas, and joins the general stock of mathematical concepts. A distinction is often made between pure mathematics and applied mathematics. However, pure mathematics topics often turn out to have applications, e.g. number theory in cryptography. This remarkable fact, that even the "purest" mathematics often turns out to have practical applications, is what Eugene Wigner has called "the unreasonable effectiveness of mathematics".[17] As in most areas of study, the explosion of knowledge in the scientific age has led to specialization: there are now hundreds of specialized areas in mathematics and the latest Mathematics Subject Classification runs to 46 pages.[18] Several areas of applied mathematics have merged with related traditions outside of mathematics and become disciplines in their own right, including statistics, operations research, and computer science.

Sir Isaac Newton (1643–1727), an inventor of infinitesimal calculus.

For those who are mathematically inclined, there is often a definite aesthetic aspect to much of mathematics.
Many mathematicians talk about the elegance of mathematics, its intrinsic aesthetics and inner beauty. Simplicity and generality are valued. There is beauty in a simple and elegant proof, such as Euclid's proof that there are infinitely many prime numbers, and in an elegant numerical method that speeds calculation, such as the fast Fourier transform. G. H. Hardy in A Mathematician's Apology expressed the belief that these aesthetic considerations are, in themselves, sufficient to justify the study of pure mathematics. He identified criteria such as significance, unexpectedness, inevitability, and economy as factors that contribute to a mathematical aesthetic.[19] Mathematicians often strive to find proofs of theorems that are particularly elegant, a quest Paul Erdős often referred to as finding proofs from "The Book" in which God had written down his favorite proofs.[20] [21] The popularity of recreational mathematics is another sign of the pleasure many find in solving mathematical questions.
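Euclid's proof of the infinitude of primes, mentioned above as an example of an elegant proof, can be sketched in modern notation (our paraphrase, not the article's wording):

```latex
% Suppose, for contradiction, that p_1, p_2, \ldots, p_n were all the primes.
% Consider the number
N = p_1 p_2 \cdots p_n + 1 .
% Each p_i divides the product p_1 p_2 \cdots p_n, so dividing N by any p_i
% leaves remainder 1; hence no p_i divides N. But every integer N > 1 has at
% least one prime factor, which must therefore lie outside the list,
% contradicting the assumption that the list was complete. So no finite
% list contains all the primes.
```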


Notation, language, and rigor

Most of the mathematical notation in use today was not invented until the 16th century.[22] Before that, mathematics was written out in words, a painstaking process that limited mathematical discovery.[23] Euler (1707–1783) was responsible for many of the notations in use today. Modern notation makes mathematics much easier for the professional, but beginners often find it daunting. It is extremely compressed: a few symbols contain a great deal of information. Like musical notation, modern mathematical notation has a strict syntax (which to a limited extent varies from author to author and from discipline to discipline) and encodes information that would be difficult to write in any other way.

Leonhard Euler, who created and popularized much of the mathematical notation used today.

Mathematical language can also be hard for beginners. Words such as or and only have more precise meanings than in everyday speech. Moreover, words such as open and field have been given specialized mathematical meanings. Mathematical jargon includes technical terms such as homeomorphism and integrable. But there is a reason for special notation and technical jargon: mathematics requires more precision than everyday speech. Mathematicians refer to this precision of language and logic as "rigor".

Mathematical proof is fundamentally a matter of rigor. Mathematicians want their theorems to follow from axioms by means of systematic reasoning. This is to avoid mistaken "theorems", based on fallible intuitions, of which many instances have occurred in the history of the subject.[24] The level of rigor expected in mathematics has varied over time: the Greeks expected detailed arguments, but at the time of Isaac Newton the methods employed were less rigorous. Problems inherent in the definitions used by Newton would lead to a resurgence of careful analysis and formal proof in the 19th century. Misunderstanding of rigor is a cause of some of the common misconceptions of mathematics. Today, mathematicians continue to argue among themselves about computer-assisted proofs. Since large computations are hard to verify, such proofs may not be sufficiently rigorous.[25]

The infinity symbol in several typefaces.

Axioms in traditional thought were "self-evident truths", but that conception is problematic. At a formal level, an axiom is just a string of symbols, which has an intrinsic meaning only in the context of all derivable formulas of an axiomatic system. It was the goal of Hilbert's program to put all of mathematics on a firm axiomatic basis, but according to Gödel's incompleteness theorem every (sufficiently powerful) axiomatic system has undecidable formulas; and so a final axiomatization of mathematics is impossible.

Nonetheless mathematics is often imagined to be (as far as its formal content) nothing but set theory in some axiomatization, in the sense that every mathematical statement or proof could be cast into formulas within set theory.[26]

Mathematics as science

Carl Friedrich Gauss referred to mathematics as "the Queen of the Sciences".[28] In the original Latin Regina Scientiarum, as well as in German Königin der Wissenschaften, the word corresponding to science means (field of) knowledge. Indeed, this is also the original meaning in English, and there is no doubt that mathematics is in this sense a science. The specialization restricting the meaning to natural science is of later date. If one considers science to be strictly about the physical world, then mathematics, or at least pure mathematics, is not a science. Albert Einstein stated that "as far as the laws of mathematics refer to reality, they are not certain; and as far as they are certain, they do not refer to reality."[6] Many philosophers believe that mathematics is not experimentally falsifiable, and thus not a science according to the definition of Karl Popper.[29] However, in the 1930s important work in mathematical logic convinced many mathematicians that mathematics cannot be reduced to logic alone, and Karl Popper concluded that "most mathematical theories are, like those of physics and biology, hypothetico-deductive: pure mathematics therefore turns out to be much closer to the natural sciences whose hypotheses are conjectures, than it seemed even recently."[30] Other thinkers, notably Imre Lakatos, have applied a version of falsificationism to mathematics itself.

Carl Friedrich Gauss, himself known as the "prince of mathematicians",[27] referred to mathematics as "the Queen of the Sciences".

An alternative view is that certain scientific fields (such as theoretical physics) are mathematics with axioms that are intended to correspond to reality. In fact, the theoretical physicist J. M. Ziman proposed that science is public knowledge and thus includes mathematics.[31] In any case, mathematics shares much in common with many fields in the physical sciences, notably the exploration of the logical consequences of assumptions.
Intuition and experimentation also play a role in the formulation of conjectures in both mathematics and the (other) sciences. Experimental mathematics continues to grow in importance within mathematics, and computation and simulation are playing an increasing role in both the sciences and mathematics, weakening the objection that mathematics does not use the scientific method. In his 2002 book A New Kind of Science, Stephen Wolfram argues that computational mathematics deserves to be explored empirically as a scientific field in its own right. The opinions of mathematicians on this matter are varied. Many mathematicians feel that to call their area a science is to downplay the importance of its aesthetic side, and its history in the traditional seven liberal arts; others feel that to ignore its connection to the sciences is to turn a blind eye to the fact that the interface between mathematics and its applications in science and engineering has driven much development in mathematics. One way this difference of viewpoint plays out is in the philosophical debate as to whether mathematics is created (as in art) or discovered (as in science). It is common to see universities divided into sections that include a division of Science and Mathematics, indicating that the fields are seen as being allied but that they do not coincide. In practice, mathematicians are typically grouped with scientists at the gross level but separated at finer levels. This is one of many issues considered in the philosophy of mathematics.

Mathematical awards are generally kept separate from their equivalents in science. The most prestigious award in mathematics is the Fields Medal,[32] [33] established in 1936 and now awarded every 4 years. It is often considered the equivalent of science's Nobel Prizes. The Wolf Prize in Mathematics, instituted in 1978, recognizes lifetime achievement, and another major international award, the Abel Prize, was introduced in 2003. These are awarded for a particular body of work, which may be innovation, or resolution of an outstanding problem in an established field. A famous list of 23 such open problems, called "Hilbert's problems", was compiled in 1900 by German mathematician David Hilbert. This list achieved great celebrity among mathematicians, and at least nine of the problems have now been solved. A new list of seven important problems, titled the "Millennium Prize Problems", was published in 2000. Solution of each of these problems carries a $1 million reward, and only one (the Riemann hypothesis) is duplicated in Hilbert's problems.


Fields of mathematics

Mathematics can, broadly speaking, be subdivided into the study of quantity, structure, space, and change (i.e. arithmetic, algebra, geometry, and analysis). In addition to these main concerns, there are also subdivisions dedicated to exploring links from the heart of mathematics to other fields: to logic, to set theory (foundations), to the empirical mathematics of the various sciences (applied mathematics), and more recently to the rigorous study of uncertainty.

Quantity

An abacus, a simple calculating tool used since ancient times.

The study of quantity starts with numbers, first the familiar natural numbers and integers ("whole numbers") and arithmetical operations on them, which are characterized in arithmetic. The deeper properties of integers are studied in number theory, from which come such popular results as Fermat's Last Theorem. Number theory also holds two problems widely considered to be unsolved: the twin prime conjecture and Goldbach's conjecture. As the number system is further developed, the integers are recognized as a subset of the rational numbers ("fractions"). These, in turn, are contained within the real numbers, which are used to represent continuous quantities. Real numbers are generalized to complex numbers. These are the first steps of a hierarchy of numbers that goes on to include quaternions and octonions. Consideration of the natural numbers also leads to the transfinite numbers, which formalize the concept of "infinity". Another area of study is size, which leads to the cardinal numbers and then to another conception of infinity: the aleph numbers, which allow meaningful comparison of the size of infinitely large sets.

Natural numbers · Integers · Rational numbers · Real numbers · Complex numbers
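To give a concrete taste of one of the open problems mentioned above, here is a small illustrative script (ours, not the article's) that checks Goldbach's conjecture, the claim that every even number greater than 2 is the sum of two primes, for small cases. No counterexample is known, but the general statement remains unproven.

```python
# Hedged illustration: verifying Goldbach's conjecture for small even numbers.
# A finite check like this is evidence, not a proof.

def is_prime(n: int) -> bool:
    """Trial-division primality test, adequate for small n."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def goldbach_pair(n: int):
    """Return a pair of primes summing to the even number n, or None."""
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return (p, n - p)
    return None

# Every even number from 4 to 1000 has a Goldbach pair.
assert all(goldbach_pair(n) is not None for n in range(4, 1001, 2))
print(goldbach_pair(100))  # (3, 97)
```

The search returns the pair with the smallest prime first, so 100 decomposes as 3 + 97.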

StructureMany mathematical objects, such as sets of numbers and functions, exhibit internal structure as a consequence of operations or relations that are defined on the set. Mathematics then studies properties of those sets that can be expressed in terms of that structure; for instance number theory studies properties of the set of integers that can be expressed in terms of arithmetic operations. Moreover, it frequently happens that different such structured sets (or structures) exhibit similar properties, which makes it possible, by a further step of abstraction, to state axioms for a class of structures, and then study at once the whole class of structures satisfying these axioms. Thus one can study groups, rings, fields and other abstract systems; together such studies (for structures defined by algebraic operations) constitute the domain of abstract algebra. By its great generality, abstract algebra can often be applied to seemingly unrelated problems; for instance a number of ancient problems concerning compass and straightedge constructions

were finally solved using Galois theory, which involves field theory and group theory. Another example of an algebraic theory is linear algebra, which is the general study of vector spaces, whose elements called vectors have both quantity and direction, and can be used to model (relations between) points in space. This is one example of the phenomenon that the originally unrelated areas of geometry and algebra have very strong interactions in modern mathematics. Combinatorics studies ways of enumerating the number of objects that fit a given structure.


Combinatorics · Number theory · Group theory · Graph theory · Order theory
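As a concrete illustration (ours, not the article's) of the axioms such structured sets satisfy, a brute-force check shows that the integers modulo 5 form a group under addition but not under multiplication, since 0 has no multiplicative inverse:

```python
# Hedged sketch: checking the four group axioms by exhaustive search.
# Practical only for small finite sets.

def is_group(elements, op):
    """Check closure, associativity, identity, and inverses for (elements, op)."""
    # Closure: op must stay inside the set.
    if any(op(a, b) not in elements for a in elements for b in elements):
        return False
    # Associativity: (a*b)*c == a*(b*c) for all triples.
    if any(op(op(a, b), c) != op(a, op(b, c))
           for a in elements for b in elements for c in elements):
        return False
    # Identity: some e with e*a == a == a*e for all a.
    identity = next((e for e in elements
                     if all(op(e, a) == a == op(a, e) for a in elements)), None)
    if identity is None:
        return False
    # Inverses: every a has some b with a*b == identity.
    return all(any(op(a, b) == identity for b in elements) for a in elements)

n = 5
Zn = set(range(n))
add_mod = lambda a, b: (a + b) % n
mul_mod = lambda a, b: (a * b) % n

print(is_group(Zn, add_mod))  # True: (Z_5, +) is a group
print(is_group(Zn, mul_mod))  # False: 0 has no multiplicative inverse
```

The same checker works for any small finite set with a binary operation, which is the point of the abstraction: the axioms, not the particular elements, are what matter.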

Space

The study of space originates with geometry, in particular Euclidean geometry. Trigonometry is the branch of mathematics that deals with relationships between the sides and the angles of triangles and with the trigonometric functions; it combines space and numbers, and encompasses the well-known Pythagorean theorem. The modern study of space generalizes these ideas to include higher-dimensional geometry, non-Euclidean geometries (which play a central role in general relativity) and topology. Quantity and space both play a role in analytic geometry, differential geometry, and algebraic geometry. Within differential geometry are the concepts of fiber bundles and calculus on manifolds, in particular, vector and tensor calculus. Within algebraic geometry is the description of geometric objects as solution sets of polynomial equations, combining the concepts of quantity and space, and also the study of topological groups, which combine structure and space. Lie groups are used to study space, structure, and change. Topology in all its many ramifications may have been the greatest growth area in 20th-century mathematics; it includes point-set topology, set-theoretic topology, algebraic topology and differential topology. In particular, instances of modern-day topology are metrizability theory, axiomatic set theory, homotopy theory, and Morse theory. Topology also includes the now solved Poincaré conjecture and the controversial four color theorem, whose only proof, by computer, has never been verified by a human.

Geometry · Trigonometry · Differential geometry · Topology · Fractal geometry · Measure theory
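A small numeric sketch (our illustration, not from the article) of the Pythagorean theorem and its trigonometric counterpart, using only the standard library:

```python
# In a right triangle with legs a and b, the hypotenuse c satisfies
# a**2 + b**2 == c**2; the classic 3-4-5 triangle is an integer instance.
import math

a, b = 3.0, 4.0
c = math.hypot(a, b)          # sqrt(a**2 + b**2)
print(c)                      # 5.0

# The same relation, scaled by the hypotenuse, gives the identity
# sin(t)**2 + cos(t)**2 == 1; here t is the angle opposite the side a.
t = math.atan2(a, b)
print(math.isclose(math.sin(t) ** 2 + math.cos(t) ** 2, 1.0))  # True
```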

Change

Understanding and describing change is a common theme in the natural sciences, and calculus was developed as a powerful tool to investigate it. Functions arise here, as a central concept describing a changing quantity. The rigorous study of real numbers and functions of a real variable is known as real analysis, with complex analysis the equivalent field for the complex numbers. Functional analysis focuses attention on (typically infinite-dimensional) spaces of functions. One of many applications of functional analysis is quantum mechanics. Many problems lead naturally to relationships between a quantity and its rate of change, and these are studied as differential equations. Many phenomena in nature can be described by dynamical systems; chaos theory makes precise the ways in which many of these systems exhibit unpredictable yet still deterministic behavior.


Calculus · Vector calculus · Differential equations · Dynamical systems · Chaos theory · Complex analysis
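The "unpredictable yet deterministic" behavior described above can be seen in a few lines (our example, not the article's): the logistic map x → r·x·(1−x) is completely deterministic, yet for r = 4 two trajectories started a hair apart eventually separate completely, the sensitive dependence on initial conditions that chaos theory makes precise.

```python
# Hedged sketch: sensitive dependence on initial conditions in the
# logistic map, a standard example of a chaotic dynamical system.

def logistic_orbit(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-7)     # same rule, start perturbed by 1e-7
sep = [abs(x - y) for x, y in zip(a, b)]

print(sep[5])     # still tiny: short-term behavior is predictable
print(max(sep))   # grows to order one: long-term prediction fails
```

The early iterates agree closely, but the initial 10⁻⁷ gap roughly doubles each step until the two orbits are effectively uncorrelated, even though every step of the computation is deterministic.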

Foundations and philosophy

In order to clarify the foundations of mathematics, the fields of mathematical logic and set theory were developed. Mathematical logic includes the mathematical study of logic and the applications of formal logic to other areas of mathematics; set theory is the branch of mathematics that studies sets or collections of objects. Category theory, which deals in an abstract way with mathematical structures and relationships between them, is still in development. The phrase "crisis of foundations" describes the search for a rigorous foundation for mathematics that took place from approximately 1900 to 1930.[34] Some disagreement about the foundations of mathematics continues to the present day. The crisis of foundations was stimulated by a number of controversies at the time, including the controversy over Cantor's set theory and the Brouwer–Hilbert controversy. Mathematical logic is concerned with setting mathematics within a rigorous axiomatic framework, and studying the implications of such a framework. As such, it is home to Gödel's incompleteness theorems which (informally) imply that any formal system that contains basic arithmetic, if sound (meaning that all theorems that can be proven are true), is necessarily incomplete (meaning that there are true theorems which cannot be proved in that system). Whatever finite collection of number-theoretical axioms is taken as a foundation, Gödel showed how to construct a formal statement that is a true number-theoretical fact, but which does not follow from those axioms. Therefore no formal system is a complete axiomatization of full number theory. Modern logic is divided into recursion theory, model theory, and proof theory, and is closely linked to theoretical computer science.

Mathematical logic · Set theory · Category theory

Theoretical computer science

Theoretical computer science includes computability theory, computational complexity theory, and information theory. Computability theory examines the limitations of various theoretical models of the computer, including the most powerful known model, the Turing machine. Complexity theory is the study of tractability by computer; some problems, although theoretically solvable by computer, are so expensive in terms of time or space that solving them is likely to remain practically unfeasible, even with the rapid advance of computer hardware. A famous problem is the "P = NP?" problem, one of the Millennium Prize Problems.[35] Finally, information theory is concerned with the amount of data that can be stored on a given medium, and hence deals with concepts such as compression and entropy.

Theory of computation · Cryptography
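The entropy mentioned above can be made concrete in a short sketch (our illustration, not from the article): Shannon entropy, H = −Σ pᵢ·log₂(pᵢ), measures the average number of bits per symbol needed to encode a source, and so bounds how far its output can be compressed.

```python
# Hedged example: empirical Shannon entropy of a string, in bits per symbol.
import math
from collections import Counter

def shannon_entropy(data: str) -> float:
    """Entropy of the empirical symbol distribution of `data`."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

print(shannon_entropy("abab"))  # 1.0 -- two equiprobable symbols: 1 bit each
print(shannon_entropy("abcd"))  # 2.0 -- four equiprobable symbols: 2 bits each
```

A string of one repeated symbol has entropy 0 (perfectly predictable, maximally compressible), while a uniform distribution over the alphabet maximizes entropy and resists compression.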


Applied mathematics

Applied mathematics considers the use of abstract mathematical tools in solving concrete problems in the sciences, business, and other areas. Applied mathematics has significant overlap with the discipline of statistics, whose theory is formulated mathematically, especially with probability theory. Statisticians (working as part of a research project) "create data that makes sense" with random sampling and with randomized experiments; the design of a statistical sample or experiment specifies the analysis of the data (before the data become available). When reconsidering data from experiments and samples or when analyzing data from observational studies, statisticians "make sense of the data" using the art of modelling and the theory of inference, with model selection and estimation; the estimated models and consequential predictions should be tested on new data.[36]

Computational mathematics proposes and studies methods for solving mathematical problems that are typically too large for human numerical capacity. Numerical analysis studies methods for problems in analysis using ideas of functional analysis and techniques of approximation theory; numerical analysis includes the study of approximation and discretization broadly, with special concern for rounding errors. Other areas of computational mathematics include computer algebra and symbolic computation.

Mathematical physics · Fluid dynamics · Numerical analysis · Optimization · Probability theory · Statistics · Financial mathematics · Game theory · Mathematical biology · Mathematical chemistry · Mathematical economics · Control theory
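The rounding errors that numerical analysis studies are easy to exhibit (a hypothetical illustration, not from the article): 0.1 has no exact binary floating-point representation, so summing it ten times does not give exactly 1.0, while an error-compensated summation does.

```python
# Hedged sketch: accumulation of rounding error in naive floating-point
# summation, versus the compensated summation provided by math.fsum.
import math

naive = sum([0.1] * 10)
print(naive == 1.0)                   # False: rounding error accumulated
print(abs(naive - 1.0) < 1e-15)       # True: but the error is tiny
print(math.fsum([0.1] * 10) == 1.0)   # True: compensated summation recovers it
```

Controlling how such tiny per-operation errors grow (or cancel) over millions of operations is one of the central concerns of the field.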


See also
Definitions of mathematics
Dyscalculia
Iatromathematicians
Logics
Mathematical anxiety
Mathematical game
Mathematical model
Mathematical problem
Mathematical structure
Mathematics and art
Mathematics competitions
Mathematics education
Mathematics portal
Pattern
Philosophy of mathematics
Pseudomathematics

References
Benson, Donald C., The Moment of Proof: Mathematical Epiphanies, Oxford University Press, USA; New Ed edition (December 14, 2000). ISBN 0-19-513919-4.
Boyer, Carl B., A History of Mathematics, Wiley; 2nd edition (March 6, 1991). ISBN 0-471-54397-7. A concise history of mathematics from the concept of number to contemporary mathematics.
Courant, R. and H. Robbins, What Is Mathematics?: An Elementary Approach to Ideas and Methods, Oxford University Press, USA; 2nd edition (July 18, 1996). ISBN 0-19-510519-2.
Davis, Philip J. and Hersh, Reuben, The Mathematical Experience. Mariner Books; Reprint edition (January 14, 1999). ISBN 0-395-92968-7. A gentle introduction to the world of mathematics.
Einstein, Albert (1923). Sidelights on Relativity (Geometry and Experience). P. Dutton., Co.
Eves, Howard, An Introduction to the History of Mathematics, Sixth Edition, Saunders, 1990. ISBN 0-03-029558-0.
Gullberg, Jan, Mathematics: From the Birth of Numbers. W. W. Norton & Company; 1st edition (October 1997). ISBN 0-393-04002-X. An encyclopedic overview of mathematics presented in clear, simple language.
Hazewinkel, Michiel (ed.), Encyclopaedia of Mathematics. Kluwer Academic Publishers, 2000. A translated and expanded version of a Soviet mathematics encyclopedia, in ten (expensive) volumes, the most complete and authoritative work available. Also in paperback and on CD-ROM, and online [37].
Jourdain, Philip E. B., The Nature of Mathematics, in The World of Mathematics, James R. Newman, editor, Dover Publications, 2003. ISBN 0-486-43268-8.
Kline, Morris, Mathematical Thought from Ancient to Modern Times, Oxford University Press, USA; Paperback edition (March 1, 1990). ISBN 0-19-506135-7.
Monastyrsky, Michael (2001) (PDF). Some Trends in Modern Mathematics and the Fields Medal [38]. Canadian Mathematical Society. Retrieved 2006-07-28.
Oxford English Dictionary, second edition, ed. John Simpson and Edmund Weiner, Clarendon Press, 1989. ISBN 0-19-861186-2.
The Oxford Dictionary of English Etymology, 1983 reprint. ISBN 0-19-861112-9.
Pappas, Theoni, The Joy of Mathematics, Wide World Publishing; Revised edition (June 1989). ISBN 0-933174-65-9.
Peirce, Benjamin (1882). "Linear Associative Algebra" [39]. American Journal of Mathematics 4 (1/4) (1881).
Peterson, Ivars, Mathematical Tourist, New and Updated Snapshots of Modern Mathematics, Owl Books, 2001. ISBN 0-8050-7159-8.
Paulos, John Allen (1996). A Mathematician Reads the Newspaper. Anchor. ISBN 0-385-48254-X.
Popper, Karl R. (1995). "On knowledge". In Search of a Better World: Lectures and Essays from Thirty Years. Routledge. ISBN 0-415-13548-6.
Riehm, Carl (August 2002). "The Early History of the Fields Medal" [40] (PDF). Notices of the AMS (AMS) 49 (7): 778–782.
Sevryuk, Mikhail B. (January 2006). "Book Reviews" [41] (PDF). Bulletin of the American Mathematical Society 43 (1): 101–109. doi:10.1090/S0273-0979-05-01069-4. Retrieved 2006-06-24.
Waltershausen, Wolfgang Sartorius von (1856, repr. 1965). Gauss zum Gedächtniss [42]. Sändig Reprint Verlag H. R. Wohlwend. ISBN 3-253-01702-8.
Ziman, J. M., F.R.S. (1968). Public Knowledge: An essay concerning the social dimension of science [43].


External links
Free Mathematics books [44], a free mathematics books collection.
Encyclopaedia of Mathematics, an online encyclopaedia from Springer [45]. Graduate-level reference work with over 8,000 entries, illuminating nearly 50,000 notions in mathematics.
HyperMath site at Georgia State University [46].
FreeScience Library [47], the mathematics section of the FreeScience library.
Rusin, Dave: The Mathematical Atlas [48]. A guided tour through the various branches of modern mathematics. (Can also be found at NIU.edu [49].)
Polyanin, Andrei: EqWorld: The World of Mathematical Equations [50]. An online resource focusing on algebraic, ordinary differential, partial differential (mathematical physics), integral, and other mathematical equations.
Cain, George: Online Mathematics Textbooks [51], available free online.
Tricki [52], a wiki-style site that is intended to develop into a large store of useful mathematical problem-solving techniques.
Mathematical Structures [53], lists information about classes of mathematical structures.
Math & Logic: The history of formal mathematical, logical, linguistic and methodological ideas [54]. In The Dictionary of the History of Ideas.
Mathematician Biographies [55]. The MacTutor History of Mathematics archive, extensive history and quotes from all famous mathematicians.
Metamath [56]. A site and a language that formalize mathematics from its foundations.
Nrich [57], a prize-winning site for students from age five, from Cambridge University.
Open Problem Garden [58], a wiki of open problems in mathematics.
Planet Math [59]. An online mathematics encyclopedia under construction, focusing on modern mathematics. Uses the Attribution-ShareAlike license, allowing article exchange with Wikipedia. Uses TeX markup.
Some mathematics applets, at MIT [60].
Weisstein, Eric et al.: MathWorld: World of Mathematics [61]. An online encyclopedia of mathematics.
Patrick Jones' Video Tutorials [62] on Mathematics.
Citizendium: Theory (mathematics) [63].



References
[1] No likeness or description of Euclid's physical appearance made during his lifetime survived antiquity. Therefore, Euclid's depiction in works of art depends on the artist's imagination (see Euclid).
[2] Steen, L.A. (April 29, 1988). The Science of Patterns. Science, 240: 611-616. Summarized at the Association for Supervision and Curriculum Development (http://www.ascd.org/portal/site/ascd/template.chapter/menuitem.1889bf0176da7573127855b3e3108a0c/?chapterMgmtId=f97433df69abb010VgnVCM1000003d01a8c0RCRD), ascd.org
[3] Devlin, Keith, Mathematics: The Science of Patterns: The Search for Order in Life, Mind and the Universe (Scientific American Paperback Library) 1996. ISBN 978-0-7167-5047-5
[4] Jourdain.
[5] Peirce, p. 97.
[6] Einstein, p. 28. The quote is Einstein's answer to the question: "how can it be that mathematics, being after all a product of human thought which is independent of experience, is so admirably appropriate to the objects of reality?" He, too, is concerned with The Unreasonable Effectiveness of Mathematics in the Natural Sciences.
[7] Eves
[8] Peterson
[9] Both senses can be found in Plato. Liddell and Scott, sub voce
[10] The Oxford Dictionary of English Etymology, Oxford English Dictionary, sub "mathematics", "mathematic", "mathematics"
[11] S. Dehaene; G. Dehaene-Lambertz; L. Cohen (Aug 1998). "Abstract representations of numbers in the animal and human brain". Trends in Neuroscience 21 (8): 355-361. doi:10.1016/S0166-2236(98)01263-6.
[12] See, for example, Raymond L. Wilder, Evolution of Mathematical Concepts; an Elementary Study, passim
[13] Kline 1990, Chapter 1.
[14] "A History of Greek Mathematics: From Thales to Euclid (http://books.google.com/books?id=drnY3Vjix3kC&pg=PA1&dq&hl=en#v=onepage&q=&f=false)". Thomas Little Heath (1981). ISBN 0-486-24073-8
[15] Sevryuk
[16] Johnson, Gerald W.; Lapidus, Michel L. (2002). The Feynman Integral and Feynman's Operational Calculus. Oxford University Press. ISBN 0821824139.
[17] Eugene Wigner, 1960, "The Unreasonable Effectiveness of Mathematics in the Natural Sciences (http://www.dartmouth.edu/~matc/MathDrama/reading/Wigner.html)". Communications on Pure and Applied Mathematics 13 (1): 1-14.
[18] Mathematics Subject Classification 2010 (http://www.ams.org/mathscinet/msc/pdfs/classification2010.pdf)
[19] Hardy, G. H. (1940). A Mathematician's Apology. Cambridge University Press. ISBN 0521427061.
[20] Gold, Bonnie; Simons, Rogers A. (2008). Proof and Other Dilemmas: Mathematics and Philosophy. MAA.
[21] Aigner, Martin; Ziegler, Günter M. (2001). Proofs from the Book. Springer. ISBN 3540404600.
[22] Earliest Uses of Various Mathematical Symbols (http://jeff560.tripod.com/mathsym.html) (Contains many further references).
[23] Kline, p. 140, on Diophantus; p. 261, on Vieta.
[24] See false proof for simple examples of what can go wrong in a formal proof. The history of the Four Color Theorem contains examples of false proofs accidentally accepted by other mathematicians at the time.
[25] Ivars Peterson, The Mathematical Tourist, Freeman, 1988. ISBN 0-7167-1953-3. p. 4: "A few complain that the computer program can't be verified properly" (in reference to the Appel-Haken proof of the Four Color Theorem).
[26] Patrick Suppes, Axiomatic Set Theory, Dover, 1972. ISBN 0-486-61630-4. p. 1: "Among the many branches of modern mathematics set theory occupies a unique place: with a few rare exceptions the entities which are studied and analyzed in mathematics may be regarded as certain particular sets or classes of objects."
[27] Zeidler, Eberhard (2004). Oxford User's Guide to Mathematics. Oxford, UK: Oxford University Press. p. 1188. ISBN 0198507631.
[28] Waltershausen
[29] Shasha, Dennis Elliot; Lazere, Cathy A. (1998). Out of Their Minds: The Lives and Discoveries of 15 Great Computer Scientists. Springer. p. 228.
[30] Popper 1995, p. 56
[31] Ziman
[32] "The Fields Medal is now indisputably the best known and most influential award in mathematics." Monastyrsky
[33] Riehm
[34] Luke Howard Hodgkin & Luke Hodgkin, A History of Mathematics, Oxford University Press, 2005.
[35] Clay Mathematics Institute (http://www.claymath.org/millennium/P_vs_NP/), P=NP, claymath.org
[36] Like other mathematical sciences such as physics and computer science, statistics is an autonomous discipline rather than a branch of applied mathematics. Like research physicists and computer scientists, research statisticians are mathematical scientists. Many statisticians have a degree in mathematics, and some statisticians are also mathematicians.
[37] http://eom.springer.de/default.htm
[38] http://www.fields.utoronto.ca/aboutus/FieldsMedal_Monastyrsky.pdf
[39] http://books.google.com/?id=De0GAAAAYAAJ&pg=PA1&dq=Peirce+Benjamin+Linear+Associative+Algebra+&q=
[40] http://www.ams.org/notices/200207/comm-riehm.pdf

[41] http://www.ams.org/bull/2006-43-01/S0273-0979-05-01069-4/S0273-0979-05-01069-4.pdf
[42] http://www.amazon.de/Gauss-Ged%e4chtnis-Wolfgang-Sartorius-Waltershausen/dp/3253017028
[43] http://info.med.yale.edu/therarad/summers/ziman.htm
[44] http://freebookcentre.net/SpecialCat/Free-Mathematics-Books-Download.html
[45] http://eom.springer.de
[46] http://hyperphysics.phy-astr.gsu.edu/Hbase/hmat.html
[47] http://www.freescience.info/mathematics.php
[48] http://www.math-atlas.org/
[49] http://www.math.niu.edu/~rusin/known-math/index/index.html
[50] http://eqworld.ipmnet.ru/
[51] http://www.math.gatech.edu/~cain/textbooks/onlinebooks.html
[52] http://www.tricki.org/
[53] http://math.chapman.edu/cgi-bin/structures?HomePage
[54] http://etext.lib.virginia.edu/DicHist/analytic/anaVII.html
[55] http://www-history.mcs.st-and.ac.uk/~history/
[56] http://metamath.org/
[57] http://www.nrich.maths.org/public/index.php
[58] http://garden.irmacs.sfu.ca
[59] http://planetmath.org/
[60] http://www-math.mit.edu/daimp
[61] http://www.mathworld.com/
[62] http://www.youtube.com/user/patrickJMT


[63] http://en.citizendium.org/wiki/Theory_(mathematics)

Median

In probability theory and statistics, a median is described as the numeric value separating the higher half of a sample, a population, or a probability distribution from the lower half. The median of a finite list of numbers can be found by arranging all the observations from lowest value to highest value and picking the middle one. If there is an even number of observations, then there is no single middle value; the median is then usually defined to be the mean of the two middle values.[1] [2] In a sample of data, or a finite population, there may be no member of the sample whose value is identical to the median (in the case of an even sample size) and, if there is such a member, there may be more than one, so that the median may not uniquely identify a sample member. Nonetheless the value of the median is uniquely determined with the usual definition. A related concept, in which the outcome is forced to correspond to a member of the sample, is the medoid. At most half the population have values less than the median and at most half have values greater than the median. If both groups contain less than half the population, then some of the population is exactly equal to the median.

Normal distribution

About 68% of values drawn from a normal distribution are within one standard deviation σ > 0 away from the mean; about 95% of the values are within two standard deviations and about 99.7% lie within three standard deviations. This is known as the 68-95-99.7 rule, or the empirical rule, or the 3-sigma rule. To be more precise, the area under the bell curve between μ − nσ and μ + nσ in terms of the cumulative normal distribution function F is given by

F(μ + nσ) − F(μ − nσ) = Φ(n) − Φ(−n) = erf(n / √2),

Dark blue is less than one standard deviation from the mean. For the normal distribution, this accounts for about 68% of the set (dark blue), while two standard deviations from the mean (medium and dark blue) account for about 95%, and three standard deviations (light, medium, and dark blue) account for about 99.7%.

where erf is the error function. To 12 decimal places, the values for the 1- through 6-sigma points are:

n    area between −nσ and +nσ    i.e. 1 minus ...    or 1 in ...
1    0.682689492137              0.317310507863      3.15148718753
2    0.954499736104              0.045500263896      21.9778945081
3    0.997300203937              0.002699796063      370.398347380
4    0.999936657516              0.000063342484      15,787.192684
5    0.999999426697              0.000000573303      1,744,278.331
6    0.999999998027              0.000000001973      506,842,375.7
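These tabulated areas follow directly from the error-function expression above. As a quick cross-check, the following sketch (in Python, our choice of language; the text itself contains no code) reproduces the table from the standard library:

```python
import math

# P(|X - mu| <= n*sigma) for a normal X equals erf(n / sqrt(2)).
def sigma_coverage(n: float) -> float:
    return math.erf(n / math.sqrt(2))

for n in range(1, 7):
    p = sigma_coverage(n)
    print(f"{n} sigma: {p:.12f}, i.e. 1 in {1 / (1 - p):,.1f}")
```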

The next table gives the reverse relation: the sigma multiples corresponding to a few often-used values for the area under the bell curve. These values are useful to determine (asymptotic) confidence intervals of the specified levels based on normally distributed (or asymptotically normal) estimators:

area       n
0.80       1.281551565545
0.90       1.644853626951
0.95       1.959963984540
0.98       2.326347874041
0.99       2.575829303549
0.995      2.807033768344
0.998      3.090232306168
0.999      3.290526731492
0.9999     3.890591886413
0.99999    4.417173413469

where the value on the left of the table is the proportion of values that will fall within a given interval and n is a multiple of the standard deviation that specifies the width of the interval.
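These multiples are quantiles of the standard normal distribution, so they can be recomputed from its inverse CDF. A minimal sketch in Python (the helper name `z_multiple` is illustrative, not a library function):

```python
from statistics import NormalDist

# For a central interval of probability `level`, the multiple n solves
# Phi(n) - Phi(-n) = level, i.e. n = Phi^{-1}((1 + level) / 2).
def z_multiple(level: float) -> float:
    return NormalDist().inv_cdf((1 + level) / 2)

for level in (0.80, 0.90, 0.95, 0.99, 0.999):
    print(f"{level}: {z_multiple(level):.12f}")
```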


Central limit theorem

The theorem states that under certain, fairly common conditions, the sum of a large number of random variables will have an approximately normal distribution. For example, if (x1, …, xn) is a sequence of i.i.d. random variables, each having mean μ and variance σ² (but otherwise the distributions of the xi can be arbitrary), then the central limit theorem states that

√n ( (1/n) Σ xi − μ ) → N(0, σ²)  in distribution, as n → ∞.

The theorem will hold even if the summands xi are not i.i.d., although some constraints on the degree of dependence and the growth rate of moments still have to be imposed. The importance of the central limit theorem cannot be overemphasized. A great number of test statistics, scores, and estimators encountered in practice contain sums of certain random variables in them, and even more estimators can be represented as sums of random variables through the use of influence functions; all of these quantities are governed by the central limit theorem and will have an asymptotically normal distribution as a result. Another practical consequence of the central limit theorem is that certain other distributions can be approximated by the normal distribution, for example:
The binomial distribution B(n, p) is approximately normal N(np, np(1−p)) for large n and for p not too close to zero or one.
The Poisson(λ) distribution is approximately normal N(λ, λ) for large values of λ.
The chi-squared distribution χ²(k) is approximately normal N(k, 2k) for large k.
The Student's t-distribution t(ν) is approximately normal N(0, 1) when ν is large.

As the number of discrete events increases, the function begins to resemble a normal distribution

Whether these approximations are sufficiently accurate depends on the purpose for which they are needed, and on the rate of convergence to the normal distribution. It is typically the case that such approximations are less accurate in the tails of the distribution. A general upper bound for the approximation error in the central limit theorem is given by the Berry-Esseen theorem; improvements of the approximation are given by the Edgeworth expansions.
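The quality of the binomial approximation can be seen directly by comparing an exact tail probability with its normal counterpart. A sketch in Python (the parameter choices n = 100, p = 0.5, k = 55 are ours, purely for illustration):

```python
from math import comb, erf, sqrt

def binom_cdf(k: int, n: int, p: float) -> float:
    # Exact P(X <= k) for X ~ B(n, p).
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def normal_cdf(x: float, mu: float, sigma: float) -> float:
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

n, p, k = 100, 0.5, 55
exact = binom_cdf(k, n, p)
# Normal approximation N(np, np(1-p)), with a continuity correction.
approx = normal_cdf(k + 0.5, n * p, sqrt(n * p * (1 - p)))
print(exact, approx)
```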

Miscellaneous

1. The family of normal distributions is closed under linear transformations. That is, if X is normally distributed with mean μ and variance σ², then a linear transform aX + b (for some real numbers a and b) is also normally distributed:

aX + b ~ N(aμ + b, a²σ²).

Also, if X1, X2 are two independent normal random variables, with means μ1, μ2 and standard deviations σ1, σ2, then their linear combination will also be normally distributed: [proof]

aX1 + bX2 ~ N(aμ1 + bμ2, a²σ1² + b²σ2²).

2. The converse of (1) is also true: if X1 and X2 are independent and their sum X1 + X2 is distributed normally, then both X1 and X2 must also be normal. This is known as Cramér's theorem. The interpretation of this property is that a normal distribution is only divisible by other normal distributions.
3. It is a common fallacy that if two normal random variables are uncorrelated then they are also independent. This is false.[proof] The correct statement is that if the two random variables are jointly normal and uncorrelated, only then are they independent.
4. The normal distribution is infinitely divisible: for a normally distributed X with mean μ and variance σ² we can find n independent random variables X1, …, Xn, each distributed normally with mean μ/n and variance σ²/n, such that X1 + X2 + … + Xn has the same distribution as X.
5. The normal distribution is stable (with exponent α = 2): if X1, X2 are two independent N(μ, σ²) random variables and a, b are arbitrary real numbers, then aX1 + bX2 ~ cX3 + d, where X3 is also N(μ, σ²), c = √(a² + b²), and d = μ(a + b − c). This relationship directly follows from property (1).
6. The Kullback-Leibler divergence between two normal distributions X1 ~ N(μ1, σ1²) and X2 ~ N(μ2, σ2²) is given by:[11]

D_KL(X1 ‖ X2) = (1/2) ( σ1²/σ2² + (μ2 − μ1)²/σ2² − 1 + ln(σ2²/σ1²) ).


The Hellinger distance between the same distributions satisfies

H²(X1, X2) = 1 − √(2σ1σ2 / (σ1² + σ2²)) · exp( −(μ1 − μ2)² / (4(σ1² + σ2²)) ).

7. The Fisher information matrix for a normal distribution is diagonal and takes the form

I(μ, σ²) = diag( 1/σ², 1/(2σ⁴) ).

8. Normal distributions belong to an exponential family with natural parameters η1 = μ/σ² and η2 = −1/(2σ²), and natural statistics x and x². The dual, expectation parameters for the normal distribution are μ1 = μ and μ2 = μ² + σ².
9. Of all probability distributions over the reals with mean μ and variance σ², the normal distribution N(μ, σ²) is the one with the maximum entropy.
10. The family of normal distributions forms a manifold with constant curvature −1. The same family is flat with respect to the (±1)-connections ∇(e) and ∇(m).[12]
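The closed form for the Kullback-Leibler divergence in property (6) can be cross-checked by numerically integrating its definition, ∫ p(x) ln(p(x)/q(x)) dx. A sketch (the integration limits and step count are arbitrary choices adequate for these parameters):

```python
from math import exp, log, pi, sqrt

def norm_pdf(x: float, mu: float, s: float) -> float:
    return exp(-(x - mu) ** 2 / (2 * s * s)) / (s * sqrt(2 * pi))

def kl_closed(mu1, s1, mu2, s2):
    # Closed-form KL(N(mu1, s1^2) || N(mu2, s2^2)).
    return 0.5 * (s1 * s1 / (s2 * s2) + (mu2 - mu1) ** 2 / (s2 * s2)
                  - 1 + log(s2 * s2 / (s1 * s1)))

def kl_numeric(mu1, s1, mu2, s2, lo=-20.0, hi=20.0, steps=200_000):
    # Midpoint-rule approximation of the integral of p(x) log(p(x)/q(x)).
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps):
        x = lo + (i + 0.5) * h
        p = norm_pdf(x, mu1, s1)
        total += p * log(p / norm_pdf(x, mu2, s2)) * h
    return total

print(kl_closed(0, 1, 1, 2), kl_numeric(0, 1, 1, 2))
```

As expected, the divergence vanishes when the two distributions coincide and the numerical integral matches the closed form.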

Related distributions

If X is distributed normally with mean μ and variance σ², then:
The exponential of X is distributed log-normally: e^X ~ lnN(μ, σ²).
The absolute value of X has a folded normal distribution: |X| ~ N_f(μ, σ²). If μ = 0 this is known as the half-normal distribution.
The square of X/σ has the noncentral chi-square distribution with one degree of freedom: X²/σ² ~ χ²₁(μ²/σ²). If μ = 0, the distribution is called simply chi-square.
The variable X restricted to an interval [a, b] is called the truncated normal distribution.
(X − μ)⁻² has a Lévy distribution with location 0 and scale σ⁻².
If X1 and X2 are two independent standard normal random variables, then:
Their sum and difference are distributed normally with mean zero and variance two: X1 ± X2 ~ N(0, 2).
Their product Z = X1·X2 follows the product-normal distribution[13] with density function fZ(z) = (1/π) K0(|z|), where K0 is the modified Bessel function of the second kind. This distribution is symmetric around zero, unbounded at z = 0, and has the characteristic function φZ(t) = (1 + t²)^(−1/2).
Their ratio follows the standard Cauchy distribution: X1/X2 ~ Cauchy(0, 1).

Their Euclidean norm √(X1² + X2²) has the Rayleigh distribution, also known as the chi distribution with 2 degrees of freedom.
If X1, X2, …, Xn are independent standard normal random variables, then the sum of their squares has the chi-square distribution with n degrees of freedom: X1² + … + Xn² ~ χ²(n).
If X1, X2, …, Xn are independent normally distributed random variables with mean μ and variance σ², then their sample mean is independent from the sample standard deviation, which can be demonstrated using Basu's theorem or Cochran's theorem. The ratio of these two quantities will have the Student's t-distribution with n − 1 degrees of freedom: √n (X̄ − μ) / s ~ t(n − 1).
If X1, …, Xn, Y1, …, Ym are independent standard normal random variables, then the ratio of their normalized sums of squares will have the F-distribution with (n, m) degrees of freedom: ((X1² + … + Xn²)/n) / ((Y1² + … + Ym²)/m) ~ F(n, m).
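The chi-square relationship in this list is easy to verify by simulation: sums of squares of standard normals should have mean k and variance 2k. A sketch (the sample size, degrees of freedom, and seed are arbitrary choices):

```python
import random
from statistics import fmean

random.seed(1)

# Sum of squares of k independent standard normals follows the
# chi-square distribution with k degrees of freedom (mean k, variance 2k).
k, trials = 5, 100_000
samples = [sum(random.gauss(0, 1) ** 2 for _ in range(k)) for _ in range(trials)]
m = fmean(samples)
v = fmean((s - m) ** 2 for s in samples)
print(m, v)   # should land near k = 5 and 2k = 10
```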

Extensions

The notion of the normal distribution, being one of the most important distributions in probability theory, has been extended far beyond the standard framework of the univariate (that is, one-dimensional) case. All these extensions are also called normal or Gaussian laws, so a certain ambiguity in names exists.
The multivariate normal distribution describes the Gaussian law in the k-dimensional Euclidean space. A vector X ∈ R^k is multivariate-normally distributed if any linear combination of its components has a (univariate) normal distribution. The variance of X is a k×k symmetric positive-definite matrix V.
The complex normal distribution deals with complex normal vectors. A complex vector X ∈ C^k is said to be normal if both its real and imaginary components jointly possess a 2k-dimensional multivariate normal distribution. The variance-covariance structure of X is described by two matrices: the variance matrix, and the relation matrix C.
The matrix normal distribution describes the case of normally distributed matrices.
Gaussian processes are normally distributed stochastic processes. These can be viewed as elements of some infinite-dimensional Hilbert space H, and thus are the analogues of multivariate normal vectors for the case k = ∞. A random element h ∈ H is said to be normal if for any constant a ∈ H the scalar product (a, h) has a (univariate) normal distribution. The variance structure of such a Gaussian random element can be described in terms of the linear covariance operator K: H → H. Several Gaussian processes became popular enough to have their own names: Brownian motion, the Brownian bridge, the Ornstein-Uhlenbeck process.
The Gaussian q-distribution is an abstract mathematical construction which represents a q-analogue of the normal distribution.
One of the main practical uses of the Gaussian law is to model the empirical distributions of many different random variables encountered in practice. In such cases a possible extension would be a richer family of distributions, having more than two parameters and therefore being able to fit the empirical distribution more accurately. Examples of such extensions include the Pearson distribution, a four-parameter family of probability distributions that extends the normal law to include different skewness and kurtosis values.
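The defining property of the multivariate normal (every linear combination of components is univariate normal) can be illustrated by simulation. A sketch in which correlated coordinates are built from independent standard normals via a Cholesky-style factor; all the numeric parameters here (sx, sy, rho, a, b, sample size, seed) are illustrative choices, not values from the text:

```python
import random
from math import sqrt
from statistics import fmean

random.seed(4)

# Build correlated (X, Y) with sd's sx, sy and correlation rho from
# independent standard normals, then examine aX + bY.
sx, sy, rho = 2.0, 1.0, 0.6
a, b = 1.0, -3.0
combo = []
for _ in range(100_000):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    x = sx * z1
    y = sy * (rho * z1 + sqrt(1 - rho * rho) * z2)
    combo.append(a * x + b * y)

mean = fmean(combo)
var = fmean((c - mean) ** 2 for c in combo)
# Theory: E[aX + bY] = 0 and
# Var(aX + bY) = a^2 sx^2 + b^2 sy^2 + 2ab rho sx sy = 5.8 here.
print(mean, var)
```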



Normality tests

Normality tests assess the likelihood that a given data set {x1, …, xn} comes from a normal distribution. Typically the null hypothesis H0 is that the observations are distributed normally with unspecified mean μ and variance σ², versus the alternative Ha that the distribution is arbitrary. A great number of tests (over 40) have been devised for this problem; the more prominent of them are outlined below.
Visual tests are more intuitively appealing but subjective at the same time, as they rely on informal human judgement to accept or reject the null hypothesis.
The Q-Q plot is a plot of the sorted values from the data set against the expected values of the corresponding quantiles from the standard normal distribution. That is, it is a plot of points of the form (Φ⁻¹(pk), x(k)), where the plotting points pk are equal to pk = (k − α)/(n + 1 − 2α) and α is an adjustment constant which can be anything between 0 and 1. If the null hypothesis is true, the plotted points should approximately lie on a straight line.
The P-P plot is similar to the Q-Q plot, but used much less frequently. This method consists of plotting the points (Φ(z(k)), pk), where z(k) = (x(k) − μ̂)/σ̂. For normally distributed data this plot should lie on a 45° line between (0, 0) and (1, 1).
The Shapiro-Wilk test employs the fact that the line in the Q-Q plot has slope σ. The test compares the least squares estimate of that slope with the value of the sample variance, and rejects the null hypothesis if these two quantities differ significantly.
Normal probability plot (rankit plot).
Moment tests: D'Agostino's K-squared test; Jarque-Bera test.
Empirical distribution function tests: Kolmogorov-Smirnov test; Lilliefors test; Anderson-Darling test.
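The Q-Q plotting points are simple to compute. The sketch below builds the coordinate pairs for a normal sample and summarizes their straightness with a correlation coefficient (the choice α = 0.5, the sample parameters, and the seed are ours, for illustration):

```python
import random
from math import sqrt
from statistics import NormalDist, fmean

random.seed(2)

# Q-Q coordinates: sorted data x_(k) against Phi^{-1}(p_k),
# using plotting points p_k = (k - 0.5) / n (alpha = 0.5).
data = sorted(random.gauss(10, 2) for _ in range(500))
n = len(data)
theo = [NormalDist().inv_cdf((k - 0.5) / n) for k in range(1, n + 1)]

def pearson_r(xs, ys):
    mx, my = fmean(xs), fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sqrt(sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys))
    return num / den

print(pearson_r(theo, data))   # close to 1 when the data are normal
```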

Estimation of parameters

It is often the case that we don't know the parameters of the normal distribution but instead want to estimate them. That is, having a sample (x1, …, xn) from a normal N(μ, σ²) population, we would like to learn the approximate values of the parameters μ and σ². The standard approach to this problem is the maximum likelihood method, which requires maximization of the log-likelihood function:

ln L(μ, σ²) = −(n/2) ln(2π) − (n/2) ln σ² − (1/(2σ²)) Σ (xi − μ)².

Taking derivatives with respect to μ and σ² and solving the resulting system of first-order conditions yields the maximum likelihood estimates:

μ̂ = x̄ = (1/n) Σ xi,   σ̂² = (1/n) Σ (xi − x̄)².

The estimator μ̂ is called the sample mean, since it is the arithmetic mean of all observations. The statistic x̄ is complete and sufficient for μ, and therefore by the Lehmann-Scheffé theorem, μ̂ is the uniformly minimum variance unbiased (UMVU) estimator. In finite samples it is distributed normally:

μ̂ ~ N(μ, σ²/n).

The variance of this estimator is equal to the μμ-element of the inverse Fisher information matrix. This implies that the estimator is finite-sample efficient. Of practical importance is the fact that the standard error of μ̂ is proportional to 1/√n; that is, if one wishes to decrease the standard error by a factor of 10, one must increase the number of points in the sample by a factor of 100. This fact is widely used in determining sample sizes for opinion polls and the number of trials in Monte Carlo simulations.
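The maximum likelihood estimates are one-liners to compute. A sketch (the helper name `mle_normal` and the sample parameters are illustrative; note the 1/n divisor in the variance estimate, as in the derivation above, rather than the unbiased 1/(n − 1)):

```python
import random

random.seed(3)

def mle_normal(xs):
    # Maximum likelihood estimates for a normal sample:
    # mu_hat = sample mean, sigma2_hat = average squared deviation (1/n).
    n = len(xs)
    mu = sum(xs) / n
    s2 = sum((x - mu) ** 2 for x in xs) / n
    return mu, s2

sample = [random.gauss(5, 2) for _ in range(10_000)]
mu_hat, s2_hat = mle_normal(sample)
print(mu_hat, s2_hat)   # should land near mu = 5 and sigma^2 = 4
```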

From the standpoint of the asymptotic theory, μ̂ is consistent, that is, it converges in probability to μ as n → ∞. The estimator is also as