Tuesday, 27 November 2012

History of Computers

Do you still not know the history of the computer you use every day? There are certainly some among us who do not, so this time I will post about the history of computers. For the details, read this article.
This history covers computer hardware and architecture, and their effect on software.
Understanding computers

Computers are tools used to process data according to instructions that have been formulated. The word "computer" originally described people whose job was to perform arithmetic calculations, with or without mechanical aids, but the meaning of the word was later transferred to the machine itself. Originally, information processing was almost exclusively related to arithmetical problems, but modern computers are used for many tasks unrelated to mathematics.
Broadly, a computer can be defined as electronic equipment consisting of several components that work together to run programs based on the information and data available. These components include the monitor, CPU, keyboard, mouse, and printer (as a complement). Without a printer a computer can still do its job as a data processor, but its output can only be seen on the monitor, not in printed form (on paper).
Under this broad definition, devices such as the slide rule, mechanical calculators of every type from the abacus onward, and all contemporary electronic computers qualify. A term better suited to this broad sense of "computer" is "information processor" or "information-processing system."
Today, computers are becoming ever more sophisticated, but they were not always as small, powerful, and light as they are now. The history of computers is commonly divided into five generations.
Computer Generation
The first generation
With the onset of the Second World War, the countries involved sought to develop computers in order to exploit their strategic potential. This increased funding for computer development and accelerated technical progress. In 1941, Konrad Zuse, a German engineer, built a computer, the Z3, to design airplanes and missiles.
The Allies also made progress in developing computing power. In 1943, the British completed a secret code-breaking computer called Colossus to crack the codes used by Germany. Colossus did not greatly affect the development of the computer industry, for two reasons. First, Colossus was not a general-purpose computer; it was designed only to decode secret messages. Second, the machine's existence was kept secret until decades after the war ended.
Work done in America at that time produced a broader achievement. Howard H. Aiken (1900-1973), a Harvard engineer working with IBM, succeeded in producing an electronic calculator for the U.S. Navy. The calculator was half the length of a football field and contained some 500 miles of wiring. The Harvard-IBM Automatic Sequence Controlled Calculator, or Mark I, was a relay computer: it used electromagnetic signals to move mechanical components. The machine was slow (taking 3-5 seconds per calculation) and inflexible (the order of calculations could not be changed), but it could perform basic arithmetic as well as more complex equations.
Another wartime development was the Electronic Numerical Integrator and Computer (ENIAC), created through cooperation between the U.S. government and the University of Pennsylvania. Consisting of 18,000 vacuum tubes, 70,000 resistors, and 5 million soldered joints, it was an enormous machine that consumed 160 kW of power. Designed by John Presper Eckert (1919-1995) and John W. Mauchly (1907-1980), ENIAC was a general-purpose computer that worked 1,000 times faster than the Mark I.
In the mid-1940s, John von Neumann (1903-1957) joined the University of Pennsylvania team and initiated concepts in computer design that remained in use in computer engineering for the next 40 years. In 1945, von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC), with a memory that held both programs and data. This technique allows a computer to stop at some point and then resume its work later. The key to the von Neumann architecture is the central processing unit (CPU), which allows all computer functions to be coordinated through a single source. In 1951, the UNIVAC I (Universal Automatic Computer I), built by Remington Rand, became the first commercial computer to use the von Neumann architecture.
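The stored-program idea can be made concrete with a small sketch. The machine below is invented purely for illustration (it is not EDVAC or UNIVAC, and the instruction format is hypothetical), but it shows the essence of the von Neumann design: a single memory holds both the program and its data, and one CPU loop fetches and executes instructions from that memory.

```python
# Minimal sketch of a von Neumann-style machine (hypothetical design):
# one memory list holds both instructions and data; a single CPU loop
# fetches, decodes, and executes until it reaches HALT.

def run(memory):
    """Execute the program stored in `memory`.

    Instructions are (opcode, operand_address) pairs; data cells are
    plain integers living in the very same list.
    """
    acc = 0   # accumulator register
    pc = 0    # program counter
    while True:
        op, addr = memory[pc]   # fetch and decode the next instruction
        pc += 1
        if op == "LOAD":
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            return acc

# Program occupies cells 0-3; its data lives in cells 4-6 of the same memory.
memory = [
    ("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0),  # program
    2, 3, 0,                                             # data
]
print(run(memory))  # prints 5 (2 + 3), also stored into cell 6
```

Because program and data share one memory, a program could in principle even rewrite its own instructions, which is exactly the flexibility the stored-program concept introduced.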
Both the U.S. Census Bureau and General Electric owned UNIVACs. One of UNIVAC's impressive achievements was successfully predicting Dwight D. Eisenhower's victory in the 1952 presidential election.
First-generation computers were characterized by operating instructions made specifically for a particular task. Each computer had a different binary-code program called a "machine language," which made these computers difficult to program and limited their speed. Other hallmarks of the first generation were the use of vacuum tubes (which made the computers of that era very large) and of magnetic drums for data storage.
The second generation
In 1948, the invention of the transistor greatly influenced the development of computers. The transistor replaced the vacuum tube in televisions, radios, and computers, and as a result the size of electronic machines shrank drastically.
Transistors began to be used in computers in 1956. Together with another advance, magnetic-core memory, they made second-generation computers smaller, faster, more reliable, and more energy-efficient than their predecessors. The first machines to take advantage of this new technology were supercomputers: IBM built one named Stretch, and Sperry-Rand built one named LARC. These computers, both developed for atomic-energy laboratories, could handle large amounts of data, a capability much in demand among atomic scientists. But the machines were very expensive and tended to be too complex for business computing needs, which limited their popularity: only two LARCs were ever installed and used, one at the Lawrence Radiation Labs in Livermore, California, and the other at the U.S. Navy Research and Development Center in Washington, D.C.

Second-generation computers also replaced machine language with assembly language, a language that uses short abbreviations in place of binary code.
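As a toy illustration of what "abbreviations in place of binary code" means, the few lines below sketch an assembler. The mnemonics and numeric opcodes are invented for this example; they do not belong to any real second-generation machine.

```python
# Toy assembler: translates human-readable mnemonics into numeric machine
# words. The opcode table is hypothetical, invented for illustration.
OPCODES = {"HALT": 0b0000, "LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011}

def assemble(lines):
    """Turn lines like 'ADD 7' into (opcode, operand) machine words."""
    words = []
    for line in lines:
        parts = line.split()
        mnemonic = parts[0]
        operand = int(parts[1]) if len(parts) > 1 else 0
        words.append((OPCODES[mnemonic], operand))
    return words

program = ["LOAD 4", "ADD 5", "STORE 6", "HALT"]
print(assemble(program))  # [(1, 4), (2, 5), (3, 6), (0, 0)]
```

The programmer writes `ADD 5` instead of memorizing the raw bit pattern; the assembler does the mechanical translation, which is exactly the convenience that assembly language brought to second-generation machines.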
By the early 1960s, fully transistorized second-generation computers were appearing successfully in business, in universities, and in government. They also had the components we associate with computers today: printers, tape and disk storage, memory, operating systems, and stored programs.
One important example was the IBM 1401, which was widely accepted in industry. By 1965, almost all large businesses were using second-generation computers to process financial information.
The program stored inside the computer, and the programming language that went with it, gave computers flexibility. This flexibility increased performance at a reasonable price for business use: with this concept, a computer could print customer invoices and, minutes later, design products or calculate paychecks. Several programming languages began to appear at that time, and Common Business-Oriented Language (COBOL) and Formula Translator (FORTRAN) came into common use. These languages replaced cryptic binary machine code with words, sentences, and mathematical formulas that are much easier for humans to understand, making it far easier to program a computer. A wide range of new careers emerged (programmer, systems analyst, computer-systems expert), and the software industry also began to appear and grow during this second generation of computers.
The third generation
Although transistors were an improvement on the vacuum tube in many respects, they generated substantial heat, which could damage a computer's internal parts. Quartz rock eliminated this problem. Jack Kilby, an engineer at Texas Instruments, developed the integrated circuit (IC) in 1958. The IC combined three electronic components onto a small silicon disc made from quartz. Scientists later managed to fit more components onto a single chip, called a semiconductor, and as a result computers became ever smaller as more components were squeezed onto each chip. Another third-generation development was the operating system, which allowed a machine to run many different programs at once, with a central program monitoring and coordinating the computer's memory.
The fourth generation
After the IC, the only direction left was down: shrinking the size of circuits and electrical components. Large Scale Integration (LSI) could fit hundreds of components onto one chip; by the 1980s, Very Large Scale Integration (VLSI) put thousands of components on a single chip.
Ultra-Large Scale Integration (ULSI) increased that number into the millions. The ability to fit so many components onto a chip half the size of a coin drove down the price and size of computers while increasing their power, efficiency, and reliability. The Intel 4004 chip, made in 1971, brought a further advance in ICs by putting all the components of a computer (the central processing unit, memory, and input/output control) onto one very small chip. Where previously ICs had been made to perform one specific task, a microprocessor could now be manufactured and then programmed to meet any demand. Soon, everyday household devices such as microwave ovens, televisions, and cars with electronic fuel injection (EFI) were equipped with microprocessors.
These developments allowed ordinary people to use computers; the computer was no longer the preserve of large corporations or government agencies. By the mid-1970s, computer assemblers were offering their products to the general public. These computers, called minicomputers, were sold with software packages easy enough for lay users; the most popular software at the time consisted of word-processing and spreadsheet programs. In the early 1980s, video games such as the Atari 2600 sparked consumer interest in more sophisticated, programmable home computers.
In 1981, IBM introduced the Personal Computer (PC) for use in homes, offices, and schools. The number of PCs in use jumped from 2 million units in 1981 to 5.5 million units in 1982; ten years later, 65 million PCs were in use. Computers continued their trend toward ever smaller sizes, from the desktop computer to the laptop that fits in a bag, and even the handheld palmtop.
The IBM PC competed with Apple's Macintosh line, introduced in 1984. The Macintosh became famous for popularizing graphical computing while its rivals were still text-based, and it also popularized the use of the mouse.
Today we know the succession of IBM-compatible CPUs: the IBM PC/486, Pentium, Pentium II, Pentium III, and Pentium 4 (a series of CPUs made by Intel), as well as AMD's K6, Athlon, and others. All of these belong to the class of fourth-generation computers.
As computer use proliferated in the workplace, new ways to harness its potential were developed. As small computers grew more powerful, they could be linked together in networks to share memory, software, and information, and to communicate with one another. Networked computers can collaborate electronically to complete a processing task. Using direct cabling (a Local Area Network, or LAN) or telephone lines, such a network can become very large.

 
The fifth generation
Defining the fifth generation of computers is quite difficult because the field is still very young. An imaginative, fictional example of a fifth-generation computer is the HAL 9000 from Arthur C. Clarke's novel 2001: A Space Odyssey. HAL displays all the functions desired of a fifth-generation computer: with artificial intelligence (AI), HAL could reason well enough to hold conversations with humans, use visual input, and learn from its own experience.
Although a real HAL 9000 is still far from reality, many of its functions have already been built. Some computers can accept spoken instructions and imitate human reasoning, and automatic translation of foreign languages has become possible. These facilities appear deceptively simple, but they turned out to be far more complicated than expected once programmers realized that human understanding relies heavily on context and meaning rather than on translating words one by one.
Many advances in computer design and technology are making fifth-generation computers increasingly feasible. One such advance is parallel processing, which replaces the von Neumann model's single CPU with a system able to coordinate many CPUs working as one. Another is superconductor technology, which allows electricity to flow without resistance and thereby speeds up the flow of information.
Japan is well known for promoting the fifth-generation idea and its fifth-generation computer project, and it set up ICOT (the Institute for New Generation Computer Technology) to make it happen. Much of the news says the project failed, but other reports suggest that a successful fifth-generation computer project would bring a new paradigm shift to the world of computing.
