History of Computers

When we study the numerous aspects of computing and computers, it's important to know about the history of computers. It helps us understand the growth and progress of technology through the times. It's also important content for competitive and banking examinations.

What's a Computer? 

A computer is an electronic machine that collects information, stores it, processes it according to user instructions, and then returns the result. A computer is a programmable electronic device that performs arithmetic and logical operations automatically using a set of instructions provided by the user.
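
To make that definition concrete, here is a minimal Python sketch of the collect, store, process, and return cycle described above. The sample data and the summing instruction are invented purely for illustration.

```python
# A minimal sketch of the input -> store -> process -> output cycle.
# The data and the "instruction" (summing) are invented for illustration.
def run(program, data):
    stored = list(data)     # the machine stores the collected input
    return program(stored)  # ...then processes it per the instructions

result = run(sum, [2, 3, 5])  # the result is returned to the user
print(result)                 # 10
```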

Early Computing Devices

People used sticks, stones, and bones as counting tools before computers were invented. More computing devices were produced as technology advanced and the human intellect improved over time. Let us look at a few of the early-age computing devices used by humanity.

1. Abacus - The abacus was invented by the Chinese around 4,000 years ago. It's a wooden rack with metal rods that have beads attached to them. The abacus operator moves the beads according to certain guidelines to perform arithmetic calculations.

2. Napier's Bones - John Napier devised Napier's Bones, a manually operated calculating device. For calculating, this instrument used 9 separate ivory strips (bones) marked with numbers to multiply and divide. It was also the first machine to calculate using the decimal point system.

3. Pascaline - The Pascaline was invented in 1642 by Blaise Pascal, a French mathematician and philosopher. It's believed to be the first mechanical and automated calculator. It was a wooden box with gears and wheels inside.

4. Stepped Reckoner, or the Leibniz wheel - In 1673, a German mathematician-philosopher named Gottfried Wilhelm Leibniz improved on Pascal's invention to produce this device. It was a digital mechanical calculator known as the stepped reckoner because it used fluted drums rather than gears.

5. Difference Engine - In the early 1820s, Charles Babbage created the Difference Engine. It was a mechanical computer that could do basic calculations. It was a steam-powered calculating machine used to solve numerical tables such as logarithmic tables.

6. Analytical Engine - Charles Babbage created another calculating machine, the Analytical Engine, in 1830. It was a mechanical computer that took input from punch cards. It was capable of solving any mathematical problem and storing data in its own memory.

7. Tabulating Machine - An American statistician, Herman Hollerith, invented this machine in the year 1890. The Tabulating Machine was a punch card-based mechanical tabulator. It could compute statistics and record or sort data or information. Hollerith began manufacturing these machines in his company, which eventually became International Business Machines (IBM) in 1924.

8. Differential Analyzer - Vannevar Bush introduced the first electrical computer, the Differential Analyzer, in 1930. This machine was made up of vacuum tubes that switched electrical impulses in order to do calculations. It was capable of performing 25 calculations in a matter of minutes.

9. Mark I - Howard Aiken planned to build a machine in 1937 that could perform massive calculations involving enormous numbers. The Mark I computer was built in 1944 as a collaboration between IBM and Harvard.

History of Computer Generations


The word 'computer' has a very interesting origin. It was first used in the 16th century for a person who used to compute, i.e. do calculations. The word was used in the same sense as a noun until the 20th century. Women were hired as human computers to carry out all forms of calculations and computations. By the last part of the 19th century, the word was also used to describe machines that did calculations. The modern-day use of the word generally describes programmable digital devices that run on electricity.

Early History of Computers

Since the evolution of humans, devices have been used for calculations for thousands of years. One of the earliest and most well-known devices was the abacus. Then in 1822, the father of computers, Charles Babbage, began developing what would be the first mechanical computer. And then in 1833 he actually designed an Analytical Engine, which was a general-purpose computer. It contained an ALU, some basic flowchart principles, and the concept of integrated memory.

Then, more than a century later in the history of computers, we got our first electronic computer for general purposes. It was the ENIAC, which stands for Electronic Numerical Integrator and Computer. The inventors of this computer were John W. Mauchly and J. Presper Eckert. And with time the technology developed, the computers got smaller, and the processing got faster. We got our first laptop in 1981; it was introduced by Adam Osborne and EPSON.

 Generations of Computers  

In the history of computers, we often refer to the advancements of modern computers as the generations of computers. We are currently on the fifth generation of computers. So let us look at the important features of these five generations of computers.

1st Generation

This was from the period 1940 to 1955. This was when machine language was developed for the use of computers. They used vacuum tubes for the circuitry. For the purpose of memory, they used magnetic drums. These machines were complicated, large, and expensive. They were mostly reliant on batch operating systems and punch cards. Magnetic tape and paper tape were implemented as output and input devices. For example: ENIAC, UNIVAC-1, EDVAC, and so on.

2nd Generation 

The years 1957-1963 are referred to as the "second generation of computers." In second-generation computers, COBOL and FORTRAN were employed as programming languages. Here they advanced from vacuum tubes to transistors. This made the computers smaller, faster, and more energy-efficient. And they advanced from binary to assembly languages. For instance: IBM 1620, IBM 7094, CDC 1604, CDC 3600, and so forth.

3rd Generation 

The hallmark of this period (1964-1971) was the development of the integrated circuit. A single integrated circuit (IC) is made up of numerous transistors, which increases the power of a computer while simultaneously lowering its cost. These computers were faster, smaller, more reliable, and less expensive than their predecessors. High-level programming languages such as FORTRAN-II to IV, COBOL, PASCAL, and PL/1 were employed. For example: the IBM-360 series, the Honeywell-6000 series, and the IBM-370/168.

4th Generation 

The invention of the microprocessor brought along the fourth generation of computers. The years 1971-1980 were dominated by fourth-generation computers. C, C++, and Java were the programming languages employed in this generation of computers. For instance: the STAR 1000, PDP 11, CRAY-1, CRAY-X-MP, and Apple II. This was when we started producing computers for home use.

 5th Generation 

These computers have been employed since 1980 and continue to be used now. This is the present and the future of the computer world. The defining aspect of this generation is artificial intelligence. The use of parallel processing and superconductors is making this a reality and provides a lot of scope for the future. Fifth-generation computers use ULSI (Ultra Large Scale Integration) technology. These are the most recent and sophisticated computers. C, C++, Java, .NET, and more programming languages are used. For instance: IBM, Pentium, Desktop, Laptop, Tablet, Ultrabook, and so on.

Brief History of Computers

The naive understanding of calculation had to be overcome before the true power of computing could be realized. The inventors who worked tirelessly to bring the computer into the world had to realize that what they were creating was more than just a number cruncher or a calculator. They had to address all of the difficulties associated with conceiving such a machine, implementing the design, and actually building the thing. The history of the computer is the history of these difficulties being solved.

 19th Century  

1801 – Joseph Marie Jacquard, a weaver and businessman from France, devised a loom that employed punched wooden cards to automatically weave cloth designs.

1822 – Charles Babbage, a mathematician, invented the steam-powered calculating machine capable of computing number tables. The "Difference Engine" idea failed owing to a lack of technology at the time.

1848 – The world's first computer program was written by Ada Lovelace, an English mathematician. Lovelace also included a step-by-step tutorial on how to compute Bernoulli numbers using Babbage's machine (a short code sketch of the same computation appears at the end of this subsection).

1890 – Herman Hollerith, an inventor, creates the punch card technique used to calculate the 1880 U.S. census. He would go on to start the company that would become IBM.
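
As a modern aside to the 1848 entry above, the short Python sketch below computes Bernoulli numbers with the standard recurrence. It is only an illustration of the same computation; Lovelace's actual program was written for Babbage's Analytical Engine, and this code is not based on it.

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return B_0 .. B_n using the recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0."""
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        # Solve the recurrence for B_m given B_0 .. B_{m-1}.
        B[m] = -sum(comb(m + 1, k) * B[k] for k in range(m)) / (m + 1)
    return B

# B_0 .. B_6 = 1, -1/2, 1/6, 0, -1/30, 0, 1/42
print(bernoulli(6))
```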

 Early 20th Century 

1930 – The Differential Analyzer, the first large-scale automatic general-purpose mechanical analog computer, was designed and built by Vannevar Bush.

1936 – Alan Turing had an idea for a universal machine, which he called the Turing machine, that could compute anything that is computable.

1939 – Hewlett-Packard was founded in a garage in Palo Alto, California by Bill Hewlett and David Packard.

1941 – Konrad Zuse, a German inventor and engineer, completed his Z3 machine, the world's first digital computer. However, the machine was destroyed during a World War II bombing strike on Berlin.

1941 – J.V. Atanasoff and graduate student Clifford Berry designed a computer capable of solving 29 equations at the same time. It was the first time a computer could store data in its primary memory.

1945 – University of Pennsylvania academics John Mauchly and J. Presper Eckert built the Electronic Numerical Integrator and Calculator (ENIAC). It was Turing-complete and capable of solving "a vast class of numerical problems" by reprogramming, earning it the title of "grandfather of computers."

1946 – The UNIVAC I (Universal Automatic Computer) was the first general-purpose electronic digital computer designed in the United States for commercial applications.

1949 – The Electronic Delay Storage Automatic Calculator (EDSAC), developed by a team at the University of Cambridge, was the "first practical stored-program computer."

1950 – The Standards Eastern Automatic Computer (SEAC) was built in Washington, DC, and it was the first stored-program computer completed in the United States.

 Late 20th Century 

1953 – Grace Hopper, a computer scientist, creates the first computer language, which becomes known as COBOL, an acronym for COmmon Business-Oriented Language. It allowed a computer user to give the computer instructions in English-like words rather than numbers.

1954 – John Backus and a team of IBM programmers created the FORTRAN programming language, an acronym for FORmula TRANslation. In addition, IBM developed the 650.

1958 – The integrated circuit, sometimes known as the computer chip, was created by Jack Kilby and Robert Noyce.

1962 – Atlas, the computer, makes its appearance. It was the fastest computer in the world at the time, and it pioneered the concept of "virtual memory."

1964 – Douglas Engelbart proposes a modern computer prototype that combines a mouse and a graphical user interface (GUI).

1969 – Bell Labs developers, led by Ken Thompson and Dennis Ritchie, revealed UNIX, an operating system written in the C programming language that addressed program compatibility difficulties.

1970 – The Intel 1103, the first Dynamic Random-Access Memory (DRAM) chip, is unveiled by Intel.

1971 – The floppy disk was invented by Alan Shugart and a team of IBM engineers. At the same time, Xerox developed the first laser printer, which not only generated billions of dollars but also heralded the beginning of a new age in computer printing.

1973 – Robert Metcalfe, a member of Xerox's research department, created Ethernet, which is used to connect multiple computers and other hardware.

1974 – Personal computers were introduced to the market. The first were the Altair, Scelbi & Mark-8, IBM 5100, and Radio Shack's TRS-80.

1975 – Popular Electronics magazine touted the Altair 8800 as the world's first minicomputer kit in January. Paul Allen and Bill Gates offered to write software for the Altair in the BASIC language.

1976 – Apple Computers is founded by Steve Jobs and Steve Wozniak, who introduce the world to the Apple I, the first computer with a single-circuit board.

1977 – At the first West Coast Computer Faire, Jobs and Wozniak present the Apple II. It offers color graphics and incorporates an audio cassette drive for storage.

1978 – The first computerized spreadsheet program, VisiCalc, is introduced.

1979 – WordStar, a word processing tool from MicroPro International, is released. 

1981 – IBM unveils the Acorn, its first personal computer, which has an Intel CPU, two floppy drives, and a color display. The Acorn uses the MS-DOS operating system from Microsoft.

1983 – The CD-ROM, which could carry 550 megabytes of pre-recorded data, hit the market. This year also saw the release of the Gavilan SC, the first portable computer with a flip-form design and the first to be marketed as a "laptop."

1984 – Apple launched the Macintosh during a Super Bowl XVIII commercial. It was priced at $2,500.

1985 – Microsoft introduces Windows, which enables multitasking via a graphical user interface. In addition, the programming language C++ was released.

1990 – Tim Berners-Lee, an English programmer and scientist, creates HyperText Markup Language, widely known as HTML. He also coined the term "WorldWideWeb." It includes the first browser, a server, HTML, and URLs.

1993 – The Pentium CPU improves the use of graphics and music on personal computers.

1995 – Microsoft's Windows 95 operating system was released. A $300 million promotional campaign was launched to get the news out. Sun Microsystems introduced Java 1.0, followed by Netscape Communications' JavaScript.

1996 – At Stanford University, Sergey Brin and Larry Page created the Google search engine.

1998 – Apple introduces the iMac, an all-in-one Macintosh desktop computer. These PCs cost $1,300 and came with a 4 GB hard drive, 32 MB RAM, a CD-ROM, and a 15-inch monitor.

1999 – Wi-Fi, an abbreviation for "wireless fidelity," is created, initially covering a range of up to 300 feet.

21st Century  

2000 – The USB flash drive is first introduced in 2000. Used for data storage, flash drives were speedier and had more storage space than other storage media options.

2001 – Apple releases Mac OS X, later renamed OS X and ultimately simply macOS, as the successor to its conventional Mac Operating System.

2003 – Customers could buy AMD's Athlon 64, the first 64-bit CPU for consumer computers.

2004 – Facebook began as a social networking website.  

2005 – Google acquires Android, a Linux-based mobile phone operating system.

2006 – Apple's MacBook Pro became available. The Pro was the company's first dual-core, Intel-based mobile computer. Amazon Web Services, including Amazon Elastic Compute Cloud (EC2) and Amazon Simple Storage Service (S3), was also launched.

2007 – The first iPhone was produced by Apple, bringing many computer applications into the palm of our hands. Amazon also released the Kindle, one of the first electronic reading systems, in 2007.

2009 – Microsoft released Windows 7.   

2011 – Google introduces the Chromebook, which runs Google Chrome OS.   

2014 – The University of Michigan Micro Mote (M3), the world's smallest computer, was created.

2015 – Apple introduces the Apple Watch. Windows 10 was also released by Microsoft.   

2016 – The world's first reprogrammable quantum computer is built.

  Types of Computers    


1. Analog Computers – Analog computers are built with various mechanical components such as gears and levers, with no electrical components. One advantage of analog computation is that designing and building an analog computer to tackle a specific problem can be relatively straightforward.

2. Digital Computers – Information in digital computers is represented in discrete form, typically as sequences of 0s and 1s (binary digits, or bits); a short sketch after this list shows what that representation looks like in practice. A digital computer is a system or device that can process any type of information in a matter of seconds. Digital computers are classified into many different types. They are as follows:

a. Mainframe computers – A mainframe is a computer that is generally employed by large enterprises for mission-critical activities such as massive data processing. Mainframe computers were distinguished by massive storage capacities, quick components, and powerful computational capabilities. Because they were complicated systems, they were managed by a team of systems programmers who had sole access to the computer. These machines are now referred to as servers rather than mainframes.

b. Supercomputers – The most powerful computers to date are generally referred to as supercomputers. Supercomputers are enormous systems that are purpose-built to solve complicated scientific and industrial problems. Quantum mechanics, weather forecasting, oil and gas exploration, molecular modeling, physical simulations, aerodynamics, nuclear fusion research, and cryptanalysis are all done on supercomputers.

c. Minicomputers – A minicomputer is a type of computer that has many of the same features and capabilities as a larger computer but is smaller in size. Minicomputers, which were fairly small and affordable, were often employed in a single department of an organization and were often devoted to a specific task or shared by a small group.

d. Microcomputers – A microcomputer is a small computer that is based on a microprocessor integrated circuit, commonly known as a chip. A microcomputer is a system that incorporates at a minimum a microprocessor, program memory, data memory, and an input-output system (I/O). A microcomputer is now generally referred to as a personal computer (PC).

e. Embedded processors – These are miniature computers that control electrical and mechanical processes with basic microprocessors. Embedded processors are often simple in design, have limited processing capability and I/O capabilities, and need little power. Ordinary microprocessors and microcontrollers are the two primary types of embedded processors. Embedded processors are employed in systems that don't require the computing capability of traditional devices such as desktop computers, laptop computers, or workstations.
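
As noted in the digital computers entry above, here is a minimal Python sketch of the binary representation those machines use; the number 42 is an arbitrary example.

```python
# The decimal number 42 as a sequence of binary digits (bits).
n = 42
print(bin(n))            # '0b101010' -- six bits
print(int("101010", 2))  # back to decimal: 42
# Every piece of data in a digital computer -- numbers, text, images --
# is ultimately stored as sequences of bits like these.
```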

  FAQs on the History of Computers  

Q. The principle of modern computers was proposed by:

A.  Steve Jobs

B.  Adam Osborne  

C.  Alan Turing  

D.  Charles Babbage 

Ans: The correct answer is C.

Q. Who introduced the first computer for home use in 1981?

A.  IBM  

B.  Apple 

C.  Microsoft 

D.  Sun Technology

Ans: The correct answer is A. IBM made the first home-use personal computer.

Q. Third-generation computers used which programming language?

A.  Java  

B.  Machine language 

C.  FORTRAN 

D. C and C++

Ans: The correct option is C.

What Is Computer Technology?

At one time, it would have been unheard of, or even impossible, for a household to have a computer. A few decades later, the computer was a high-ticket item. Today, the average household owns more than one computer. How did such humongous devices, often used by the military, become nearly ubiquitous in every aspect of life, from work to play? Computing technology has come a long way in the past several decades, and it continues to have an even greater impact on our daily lives.

History of Computing Technology 

ENIAC was an early computer built during World War II. It was by no means the first computer, but it was the first programmable computer. That was a significant step toward the devices we know today because people could program ENIAC to carry out different tasks. Having an unprogrammable computer would be like having a machine that can only produce and save spreadsheets. In stark contrast, a programmable computer can work on spreadsheets and be programmed to act as a word processor.

It took a 1,500-square-foot room to store ENIAC, and the humongous machine was made of several large panels. The computer performed complex calculations about ordnance. Unlike its predecessors, it was programmable, so the military could use it for other purposes after the war. By the time the bulky machine was built, World War II had formally ended. Still, it was reprogrammed to help produce a hydrogen bomb.

ENIAC took up a room bigger than many houses at the time. It cost nearly half a million dollars, and it took literal help from the military to build and operate. At their inception, computers were industrial machines, reserved for only the most important (not to mention expensive) uses, but that has changed.

  How the Function of Computers Has Changed 

Today, computers are still used in industrial settings, but much smaller and far more powerful computers are used in homes for everything from work and school to chatting and watching videos. Significant changes in computer components are a big part of why computers have become smaller and more powerful. The earliest computers used vacuum tubes. These were about the size of a finger, and they could only store one bit of information on each tube. By the '60s, most computers used transistors to store information. They could still only hold a bit each, but they were much smaller than cumbersome vacuum tubes. Next came integrated circuits.

These were even better than vacuum tubes, as they could hold thousands of bits on a single circuit. Eventually, the computer chips we know today became the most common way to store information on a computer. Chips are tiny, and they can store millions and even billions of bits of information on a single chip. Computer chips made it possible to shrink the room-sized mega machines of yore into the often portable personal devices called computers today. Since it's possible to store billions of bits of information in a chip small enough that several fit in the palm of your hand, tiny computers can store a huge amount of data.
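
For a sense of the scale involved, here is a rough back-of-the-envelope calculation in Python; the 64-gigabit chip capacity is an assumed figure for illustration, not one taken from the text.

```python
# 8 bits make 1 byte, so a chip holding 64 gigabits stores 8 gigabytes.
bits_on_chip = 64 * 10**9           # 64 gigabits (assumed capacity)
bytes_on_chip = bits_on_chip // 8
print(bytes_on_chip / 10**9, "GB")  # 8.0 GB
```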

In the '70s, Apple and Tandy Radio Shack brought personal computers to the market. Their models cost over $1,000, which was quite expensive for the average family. While these computers could fit on a desk, they were still somewhat cumbersome compared to the average modern computer. They had large fans to keep them cool, and they had both a monitor and a computer tower. The tower housed most of the computer's parts. The monitor provided a visual user interface that allowed an untrained user to activate computer functions by clicking buttons and typing commands.

How Computers Became More Powerful

As computer companies continue to produce more sophisticated chips, faster, smaller computers have become the norm. Modern computers have more processing power than some of the room-sized machines of the past. It could take a large computer of that era mere minutes to process a single equation. Today's computers can download a movie, send an email, and save a spreadsheet simultaneously. In the past, it took a floppy disk or CD to use any program that the computer didn't come with. Today, the process of accessing further capabilities of your computer is even simpler.

Web-based applications allow the same computer to become a hub for graphic design, architecture, engineering, or any other user need. As computers have grown more powerful and more portable, they have become far more widespread. Most homes have multiple computers. If there are no computers in a home, there is almost certainly a smartphone or tablet. It used to require a trained specialist to run a computer, but computers today are far more user-friendly. Without much help, anyone from a baby boomer to a young child can learn how to use a computer.

Modern Use of Computers

As widespread as the use of computing technology is, some people either don't know how to use computers or prefer not to use them. Even if you don't personally use a computer, computers are part of your everyday life. When your supermarket is running low on your favorite cereal brand, inventory applications on computers typically alert workers to order more. When you check out at almost any store, the cashier uses a computer. Computing technology is essential to business. During the COVID-19 pandemic, many companies could (almost) seamlessly transition to remote work because so many business functions already involve using computers and the Internet.

Technology has also had a significant impact on healthcare. Doctors and patients can easily access and share medical records because it's possible to view them securely on the Internet. Even in the realm of entertainment, computers allow individuals to create and share content with people worldwide. Individuals can access step-by-step instructions to learn new skills using a computer. You can even get a formal education using a computer. Everything from primary schools to graduate schools is available entirely online. As computers have changed over time, they've quite literally changed the world.


