
History of Whole Numbers

In mathematics, the natural numbers are the ordinary whole numbers used for counting ("there are 6 coins on the table") and ordering ("this is the 3rd largest city in the country"). These purposes are related to the linguistic notions of cardinal and ordinal numbers, respectively (see English numerals). A later notion is that of a nominal number, which is used only for naming.

Properties of the natural numbers related to divisibility, such as the distribution of prime numbers, are studied in number theory. Problems concerning counting and ordering, such as partition enumeration, are studied in combinatorics.

There is no universal agreement about whether to include zero in the set of natural numbers: some define the natural numbers to be the positive integers {1, 2, 3, ...}, while for others the term designates the non-negative integers {0, 1, 2, 3, ...}. The former definition is the traditional one, with the latter first appearing in the 19th century. Some authors use the term "natural number" to exclude zero and "whole number" to include it; others use "whole number" in a way that excludes zero, or in a way that includes both zero and the negative integers.

The natural numbers had their origins in the words used to count things, beginning with the number 1. The first major advance in abstraction was the use of numerals to represent numbers, which allowed systems to be developed for recording large numbers. The ancient Egyptians developed a powerful system of numerals with distinct hieroglyphs for 1, 10, and all the powers of 10 up to over one million. A stone carving from Karnak, dating from around 1500 BC and now at the Louvre in Paris, depicts 276 as 2 hundreds, 7 tens, and 6 ones, and similarly for the number 4,622. The Babylonians had a place-value system based essentially on the numerals for 1 and 10.

A much later advance was the development of the idea that zero can be considered a number, with its own numeral. The Babylonians used a zero digit in place-value notation (within other numbers) as early as 700 BC, but they omitted such a digit when it would have been the last symbol in the number.[1] The Olmec and Maya civilizations used zero as a separate number as early as the 1st century BC, but this usage did not spread beyond Mesoamerica. The use of a numeral zero in modern times originated with the Indian mathematician Brahmagupta in 628. However, zero had been used as a number in the medieval computus (the calculation of the date of Easter), beginning with Dionysius Exiguus in 525, without being denoted by a numeral (standard Roman numerals have no symbol for zero).

The first systematic study of numbers as abstractions (that is, as abstract entities) is usually credited to the Greek philosophers Pythagoras and Archimedes. Note that many Greek mathematicians did not consider 1 to be "a number", so to them 2 was the smallest number.[3] Independent studies also occurred at around the same time in India, China, and Mesoamerica.

Several set-theoretical definitions of natural numbers were developed in the 19th century. With these definitions it was convenient to include 0 (corresponding to the empty set) as a natural number. Including 0 is now the common convention among set theorists, logicians, and computer scientists.
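To make the set-theoretic idea concrete, here is the standard von Neumann construction, given only as an illustration (it is not spelled out in the text above). Each natural number is identified with the set of all smaller natural numbers:

0 = \emptyset, \qquad n + 1 = n \cup \{ n \}

so that 1 = \{0\}, 2 = \{0, 1\}, 3 = \{0, 1, 2\}, and in general the number n is a set with exactly n elements. Under this construction 0 corresponds to the empty set, which is one reason set theorists, logicians, and computer scientists find it natural to include 0 among the natural numbers.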
