History of computer science
The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks, such as the abacus, have existed since antiquity. Wilhelm Schickard designed the first mechanical calculator in 1623, but did not complete its construction.[2] Blaise Pascal designed and constructed the first working mechanical calculator, the Pascaline, in 1642. In 1694 Gottfried Wilhelm Leibniz completed the Step Reckoner, the first calculator that could perform all four arithmetic operations. Charles Babbage designed a difference engine and then a general-purpose Analytical Engine in Victorian times,[3] for which Ada Lovelace wrote a manual. Because of this work she is regarded today as the world's first programmer.[4] Around 1900, punched card machines were introduced.
During the 1940s, as new and more powerful computing machines were developed, the term computer came to refer to the machines rather than their human predecessors.[5] As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. Computer science began to be established as a distinct academic discipline in the 1950s and early 1960s.[6][7] The world's first computer science degree program, the Cambridge Diploma in Computer Science, began at the University of Cambridge Computer Laboratory in 1953. The first computer science degree program in the United States was formed at Purdue University in 1962.[8] Since practical computers became available, many applications of computing have become distinct areas of study in their own right.
Although many initially believed it was impossible for computers themselves to constitute a genuine field of scientific study, the idea gradually gained acceptance among the wider academic population in the late 1950s.[9] The now well-known IBM brand formed part of the computer science revolution during this time. IBM (short for International Business Machines) released the IBM 704 and later the IBM 709 computers, which were widely used during the exploration period of such devices. "Still, working with the IBM [computer] was frustrating...if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again".[9] During the late 1950s, the computer science discipline was very much in its developmental stages, and such issues were commonplace.
Over time, there have been significant improvements in the usability and effectiveness of computing technology. Modern society has seen a significant shift in the users of computer technology, from usage only by experts and professionals to a near-ubiquitous user base. Initially, computers were quite costly, and some degree of human assistance was needed for efficient use, in part from professional computer operators. As computer adoption became more widespread and affordable, less human assistance was needed for common usage.
Major achievements
Despite its short history as a formal academic discipline, computer science has made a number of fundamental contributions to science and society - in fact, along with electronics, it is a founding science of the current epoch of human history called the Information Age and a driver of the Information Revolution, seen as the third major leap in human technological progress after the Industrial Revolution (1750-1850 CE) and the Agricultural Revolution (8000-5000 BCE).
These contributions include:
- The start of the "digital revolution," which includes the current Information Age and the Internet.[11]
- A formal definition of computation and computability, and proof that there are computationally unsolvable and intractable problems.[12]
- The concept of a programming language, a tool for the precise expression of methodological information at various levels of abstraction.[13]
- In cryptography, breaking the Enigma code was an important factor contributing to the Allied victory in World War II.[10]
- Scientific computing enabled practical evaluation of processes and situations of great complexity, as well as experimentation entirely by software. It also enabled advanced study of the mind, and mapping of the human genome became possible with the Human Genome Project.[11] Distributed computing projects such as Folding@home explore protein folding.
- Algorithmic trading has increased the efficiency and liquidity of financial markets by using artificial intelligence, machine learning, and other statistical and numerical techniques on a large scale.[14] High-frequency algorithmic trading can also exacerbate volatility.[15]
- Image synthesis, including the synthesis of video by computing individual frames.[citation needed]
- Simulation of various processes, including computational fluid dynamics; physical, electrical, and electronic systems and circuits; and societies and social situations (notably war games) along with their habitats, among many others. Modern computers enable optimization of such designs as complete aircraft. Notable in electrical and electronic circuit design are SPICE and software for the physical realization of new (or modified) designs. The latter includes essential design software for integrated circuits.[citation needed]