History of computing hardware (1960s-present)


The history of computing hardware from 1960 onward is marked by the conversion from vacuum tubes to solid-state devices such as transistors and, later, integrated circuits. By 1959, discrete transistors were considered sufficiently reliable and economical that they made further vacuum-tube computers uncompetitive. Computer main memory slowly moved away from magnetic-core memory devices to solid-state static and dynamic semiconductor memory, which greatly reduced the cost, size, and power consumption of computers.

The mass increase in the use of computers accelerated with "third-generation" computers, which generally relied on Jack Kilby's invention of the integrated circuit (or microchip), starting around 1965. However, the IBM System/360 used hybrid circuits, which were solid-state devices interconnected on a substrate with discrete wires.

The first integrated circuit was produced in September 1958, but computers using integrated circuits did not begin to appear until 1963. Some of their early uses were in embedded systems, notably by NASA in the Apollo Guidance Computer, by the military in the LGM-30 Minuteman intercontinental ballistic missile and the Honeywell ALERT airborne computer, and in the Central Air Data Computer used for flight control in the US Navy's F-14A Tomcat fighter jet.

