Wednesday, April 30, 2008

One of the greatest inventions...

Here we will focus only on the evolution of computers: their hardware and the latest software...

1930s–1960s: desktop calculators

The first all-electronic desktop calculator was the British ANITA Mk.VII, which used a Nixie tube display and 177 subminiature thyratron tubes. In June 1963, Friden introduced the four-function EC-130: an all-transistor design with a 13-digit capacity on a 5-inch CRT, it introduced reverse Polish notation (RPN) to the calculator market at a price of $2,200. The model EC-132 added square root and reciprocal functions. In 1965, Wang Laboratories produced the LOCI-2, a 10-digit transistorized desktop calculator that used a Nixie tube display and could compute logarithms. With the development of integrated circuits and microprocessors, these expensive, large calculators were replaced by smaller electronic devices.
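A brief aside on how RPN actually works: operators come after their operands ("3 4 +" instead of "3 + 4"), so an expression can be evaluated with a simple stack and no parentheses at all, which is part of what made it attractive for calculator hardware. Here is a minimal sketch in Python; the eval_rpn function and its space-separated token format are illustrative inventions for this post, not a description of the EC-130's internals.

# A minimal RPN evaluator: a sketch for illustration only, not how the
# Friden EC-130 actually worked internally.
def eval_rpn(expression):
    """Evaluate a space-separated RPN string, e.g. "3 4 + 2 *" -> 14.0."""
    ops = {
        "+": lambda a, b: a + b,
        "-": lambda a, b: a - b,
        "*": lambda a, b: a * b,
        "/": lambda a, b: a / b,
    }
    stack = []
    for token in expression.split():
        if token in ops:
            b = stack.pop()  # the right-hand operand is on top of the stack
            a = stack.pop()
            stack.append(ops[token](a, b))
        else:
            stack.append(float(token))  # anything else is treated as a number
    return stack.pop()

print(eval_rpn("3 4 + 2 *"))  # (3 + 4) * 2 = 14.0

The stack is the whole trick: each operator consumes the two most recent results, so intermediate values never need parentheses or names.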
Advanced analog computers

The art of analog computing reached its zenith with the differential analyzer, invented in 1876 by James Thomson and built by H. W. Nieman and Vannevar Bush at MIT starting in 1927. Fewer than a dozen of these devices were ever built; the most powerful was constructed at the University of Pennsylvania's Moore School of Electrical Engineering, where the ENIAC was built. Digital electronic computers like the ENIAC spelled the end for most analog computing machines, but hybrid analog computers, controlled by digital electronics, remained in substantial use into the 1950s and 1960s, and later in some specialized applications.

Early digital computers

The era of modern computing began with a flurry of development before and during World War II, as electronic circuits, relays, capacitors, and vacuum tubes replaced mechanical equivalents and digital calculation replaced analog calculation. Machines such as the Atanasoff–Berry Computer, the Z3, the Colossus, and ENIAC were built by hand using circuits containing relays or valves (vacuum tubes), and often used punched cards or punched paper tape for input and as the main (non-volatile) storage medium.

Konrad Zuse's Z-series: the first program-controlled computers

Working in isolation in Germany, Konrad Zuse started construction in 1936 of his first Z-series calculators, featuring memory and (initially limited) programmability. Zuse's purely mechanical but already binary Z1, finished in 1938, never worked reliably due to problems with the precision of its parts. His subsequent machine, the Z3, was finished in 1941. It was based on telephone relays and did work satisfactorily, making it the first functional program-controlled, all-purpose digital computer. In many ways it was quite similar to modern machines, pioneering numerous advances such as floating-point numbers. Replacing the hard-to-implement decimal system (used in Charles Babbage's earlier design) with the simpler binary system meant that Zuse's machines were easier to build and potentially more reliable, given the technologies available at the time. This is sometimes viewed as the main reason Zuse succeeded where Babbage failed.

Second generation: transistors

At first, even the experts reckoned that only four or five big American companies could have any real use for these machines. In 1951 the first series-produced computer appeared, and development accelerated thanks to the introduction of new techniques, new hardware units, and new programming methods. By 1953 there were about 100 such machines in the whole world; by 1958 the United States alone had about 2,500.

In Italy, the first computer was installed in 1954 at the Politecnico di Milano, and only in 1957 was one adopted by a business. By 1958 about ten machines had been installed in Italy, supported by roughly 700 mechanographic (punched-card) staff. By the close of the first generation, at the end of the fifties, electronic computers had won their users' trust. At first they had been regarded more as calculating instruments for university research than as machines able to process information for the operational needs of corporations and firms.

Post-1960: third generation and beyond

The explosion in the use of computers began with 'third generation' computers. These relied on Jack St. Clair Kilby's and Robert Noyce's independent inventions of the integrated circuit (or microchip), which later led to the invention of the microprocessor by Ted Hoff and Federico Faggin at Intel. During the 1960s there was considerable overlap between second- and third-generation technologies. As late as 1975, Sperry Univac continued to manufacture second-generation machines such as the UNIVAC 494. The microprocessor led to the development of the microcomputer: small, low-cost computers that could be owned by individuals and small businesses. Microcomputers, the first of which appeared in the 1970s, became ubiquitous in the 1980s and beyond. Steve Wozniak, co-founder of Apple Computer, is credited with developing the first mass-market home computers. However, his first machine, the Apple I, came out some time after the KIM-1 and the Altair 8800, and the first Apple computer with graphics and sound capabilities came out well after the Commodore PET. Computing has since evolved around microcomputer architectures, with features from their larger brethren added over time; they are now dominant in most market segments.
