The ENIAC Machine of John W. Mauchly and J. Presper Eckert
The first large-scale computer was the giant ENIAC machine of John W. Mauchly and J. Presper Eckert of the University of Pennsylvania. ENIAC (Electronic Numerical Integrator and Computer) used a word of 10 decimal digits rather than the binary digits of some earlier automated calculators. ENIAC was also the first machine to use vacuum tubes on a massive scale, employing almost 18,000 of them. Housing all those tubes, along with the machinery needed to keep them cool, took up more than 167 square meters (1,800 square feet) of floor space. It had punched-card input and output and, arithmetically, one multiplier, one combined divider/square-rooter, and 20 adders built from decimal "ring counters", which served both as adders and as quick-access read-write register storage.

The executable instructions making up a program were embedded in the separate units of ENIAC, which were plugged together to form a route through the machine for the flow of computation. These connections had to be redone for each different problem, together with resetting the function tables and switches. This "wire-your-own" instruction technique was inconvenient, and only with some license could ENIAC be considered programmable; it was, however, efficient in handling the particular programs for which it had been designed. ENIAC is generally acknowledged as the first successful high-speed electronic digital computer (EDC) and was used productively from 1946 to 1955.

A controversy developed in 1971, however, over the patentability of ENIAC's basic digital concepts: it was asserted that another American physicist, John V. Atanasoff, had already used the same ideas in a simpler vacuum-tube device he built in the 1930s while at Iowa State College. In 1973, the court ruled in favor of the company defending Atanasoff's claim, and Atanasoff received the recognition he rightly deserved.
In the 1950s, two devices were invented that would improve the computing field and help spark the computer revolution. The first was the transistor. Invented in 1947 by William Shockley, John Bardeen, and Walter Brattain of Bell Labs, the transistor was destined to replace the vacuum tube in computers, radios, and other electronic devices. Transistors, however, still had to be wired together by hand into circuits. In 1958, Jack St. Clair Kilby of Texas Instruments solved this problem when he made the first integrated circuit, or chip. A chip is really a collection of tiny transistors that are connected together when the transistors are manufactured. The need to solder large numbers of transistors together was thus virtually eliminated; now only connections to other electronic components were needed. Besides saving space, the speed of the machine increased, since the distance the electrons had to travel was reduced.

The 1960s saw large mainframe computers become much more common in large industries and in the US military and space program. IBM became the undisputed market leader in selling these large, expensive, error-prone, and very difficult-to-use machines. A veritable explosion of personal computers followed in the late 1970s, beginning with Steve Jobs and Steve Wozniak exhibiting the first Apple II at the first West Coast Computer Faire in San Francisco in 1977.