The evolution of technology – The history of computers

While computers are now an important part of human life, there was a time when computers didn’t exist. Knowing the history of computers and how much progress has been made can help you appreciate how complicated and innovative their creation really is.

Unlike most devices, the computer is one of the few inventions that does not have a single inventor. Throughout its development, many people contributed inventions that were needed for the computer to work. Some of these were new types of computers themselves, and others were components that allowed computers to develop further.

The beginning

Perhaps the most significant date in the history of computers is the year 1936. It was in this year that the first “computer” was developed: the Z1, created by Konrad Zuse. It is considered the first because it was the first freely programmable machine. There were calculating devices before it, but none was programmable, which is what distinguishes a computer from other machines.

It wasn’t until 1942 that the next major step was taken, with the Atanasoff–Berry Computer (ABC), built by John Atanasoff and Clifford Berry. Two years later, the Harvard Mark I computer was completed, kick-starting modern computer science.

Over the following years, inventors around the world began to research computers and how to improve them. The next ten years saw the introduction of the transistor, which would become a vital part of the inner workings of the computer, as well as the ENIAC computer and many other systems. ENIAC is perhaps one of the most interesting, as it required nearly 18,000 vacuum tubes to run. It was a huge machine, and it started the push toward smaller and faster computers.

The computer age was forever altered by the entry of International Business Machines, or IBM, into the computer industry in 1953. Throughout the history of computing, the company has been a major player in the development of new systems and servers for public and private use. Its arrival sparked the first real signs of competition in computing, helping to drive faster and better development of computers. IBM’s first contribution was the IBM 701 EDPM computer.

A programming language evolves

A year later, the first successful high-level programming language was created. This was a programming language not written in assembly or binary, which are considered very low-level languages. FORTRAN was designed so that more people could easily start programming computers.

The year 1955 saw the Bank of America, together with the Stanford Research Institute and General Electric, create the first computers for use in banks. MICR, or Magnetic Ink Character Recognition, along with the accompanying computer, ERMA, was a breakthrough for the banking industry. It was not until 1959 that the two systems were put into use in actual banks.

During 1958, one of the most important advances in the history of computing took place: the creation of the integrated circuit. This device, also known as a chip, is one of the basic requirements for modern computer systems. On every motherboard and card within a computer system there are many chips that determine what those boards and cards do. Without these chips, systems as we know them today could not function.

Games, mice and the Internet

For many computer users today, games are a vital part of the computing experience. The year 1962 saw the creation of the first computer game, Spacewar!, created by Steve Russell at MIT.

The mouse, one of the most basic components of modern computers, was created in 1964 by Douglas Engelbart. It got its name from the “tail” (the cord) that came out of the device.

One of the most important aspects of computing today was invented in 1969. The ARPANET was the precursor of the Internet, laying the foundation for the Internet we know today. This development would lead to the evolution of knowledge and business across the planet.

It wasn’t until 1970 that Intel came onto the scene with the first commercially available dynamic RAM chip, setting off an explosion of computing innovation.

The RAM chip was quickly followed by the first microprocessor, also designed by Intel. These two components, together with the integrated circuit developed in 1958, would become core components of modern computers.

A year later, the floppy disk was created, taking its name from the flexibility of the storage medium. This was the first step toward letting ordinary people transfer data between computers that were not connected to each other.

The first network card was created in 1973, allowing data to be transferred between connected computers. This is similar in spirit to the Internet, but it lets computers exchange data over a local network without going through the Internet.

Rise of home PCs

The next three years were very important for computers. It was then that companies began to develop systems for the average consumer. The Scelbi, Mark-8, Altair, IBM 5100, Apple I and II, TRS-80, and Commodore PET were the forerunners in this area. Although expensive, these machines started the trend of computers in ordinary homes.

One of the biggest advances in computer software came in 1978 with the development of the VisiCalc spreadsheet program, released the following year. Its development costs were recouped within two weeks of release, making it one of the most successful programs in computing history.

1979 was perhaps one of the most important years for the home computer user. This was the year WordStar, one of the first commercially successful word processing programs, was released to the public. It drastically changed the usefulness of computers for the everyday user.

The IBM Personal Computer quickly helped revolutionize the consumer market in 1981, as it was affordable for home users and mainstream consumers. 1981 also saw Microsoft enter the scene with the MS-DOS operating system. This operating system changed computing forever, as it was relatively easy for everyone to learn.

The competition begins: Apple vs. Microsoft

Computers saw another vital change in 1983. The Apple Lisa was one of the first personal computers with a graphical user interface, or GUI. Most modern programs have a GUI, which makes them easy to use and pleasing to the eye. This marked the beginning of the end for most text-only programs.

Beyond this point in the history of computing, many changes and developments have occurred, from the rivalry between Apple and Microsoft, to the development of microcomputers, to a variety of computing advances that have become an accepted part of our daily lives. Without those first initial steps in the history of computing, none of this would have been possible.
