Technology has turned out to be the foundation of daily life. People have become increasingly dependent on the new gadgets and devices available in the market, acquiring them to make life smoother and more productive. One of the most widespread of these devices is the computer. Computers play an essential role not only in the lives of students but also in the lives of adults. Even the most reputed companies and firms use computers to store important business information.
The term computer generation is used to describe successive advancements in computer technology. Each generation of computers is marked by a major technological development that fundamentally changed the way computers operate, resulting in smaller, cheaper, more powerful, and more reliable devices.
First Generation of Computers (1942-1955)
The first generation of computers began with machines such as UNIVAC, which used vacuum tubes. These computers were huge, often occupying entire rooms, and were very costly to operate.
- Could process data in milliseconds.
- Vacuum tube technology made it possible to build electronic digital computers.
- Vacuum tubes were the only electronic component available during those days.
- Utilized punch cards for input.
- Very time-consuming processing.
- Large in size.
- Used a large amount of energy.
- Heated very quickly due to thousands of vacuum tubes.
- They were not very reliable.
- Constant maintenance was required.
- Used machine language only.
- Limited commercial use.
- Limited programming capabilities
- Air conditioning was required.
Second Generation Computers (1955-1964)
During the second generation, the transistor was invented. Transistors replaced vacuum tubes and became the key component in all digital circuits, including computers. Even today's microprocessors trace back to this development: a modern processor contains tens of millions of microscopic transistors on a single silicon chip.
- The 2nd generation Computers were more reliable
- Ample commercial use
- Better portability as compared to the first generation computers.
- High speed: could process data in microseconds.
- Used faster peripheral devices, such as tape drives, magnetic disks, and printers.
- Used Assembly language.
- Accuracy improved.
- Cooling system was compulsory
- Regular maintenance was required
- Only used for specific purposes
- Costly and not versatile
Third Generation Computers (1964-1975)
Third generation computers used integrated circuits (ICs). Jack Kilby developed the concept of the integrated circuit and built the first working IC in 1958. It was an important breakthrough in the computer field, and ICs came into commercial use in the early 1960s.
- Less expensive
- More reliable.
- Smaller in size as compared to previous generations.
- Better accuracy
- Used less energy
- Produced less heat as compared to the previous two generations of computers.
- Better speed and could compute data in nanoseconds.
- Maintenance cost was low because hardware failures were rare.
- Supported high-level programming languages.
- Commercial production increased.
- Air conditioning was required.
- Highly sophisticated technology was required for the manufacturing of IC chips.
Fourth Generation Computers (1975 onwards)
The fourth generation of computers started with the invention of the microprocessor, which packs the equivalent of thousands of integrated circuits onto a single chip. Ted Hoff produced the first microprocessor for Intel in 1971, known as the Intel 4004. Integrated circuit technology then improved rapidly: Large Scale Integration (LSI) and Very Large Scale Integration (VLSI) circuits were designed, greatly reducing the size of computers. A modern microprocessor is usually about one square inch and can contain millions of electronic circuits. Examples of fourth generation computers are the Apple Macintosh and the IBM PC.
- More powerful and reliable than earlier generations.
- Small in size
- Fast processing power with less power consumption
- No air conditioning required.
- Totally general purpose
- Commercial production
- Cheapest among all generations
- All types of high-level languages can be used.
- The latest technology is required for manufacturing of Microprocessors.
The next generation of computers will probably be based on artificial intelligence (AI), which is still under development. AI is a branch of computer science concerned with making computers perform like humans. Scientists have also begun to use the term nanotechnology, though a great deal of progress is yet to be made.
Artificial intelligence is the branch of computer science concerned with making computers behave like humans. It includes:
- Games Playing: programming computers to play games such as chess and checkers.
- Expert Systems: programming computers to make decisions in real-life situations.
- Natural Language: programming computers to understand natural human languages.
- Neural Networks: systems that simulate intelligence by attempting to replicate the types of physical connections that occur in animal brains.
- Robotics: programming computers to see, hear, and react to other sensory stimuli.
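The game-playing idea in the list above can be sketched with minimax search, the classic technique behind chess and checkers programs. The game used here, a simple variant of Nim, and all names and parameters are illustrative assumptions, not taken from the text:

```python
# A minimal sketch of game playing via minimax search, using a simple
# Nim variant: players alternately remove 1 or 2 sticks from a pile,
# and the player who takes the last stick wins. Chess programs apply
# the same idea to a vastly larger game tree.

def best_move(sticks, maximizing=True):
    """Return (score, move): score is +1 from the first player's
    perspective if the player to move can force a win, else -1."""
    if sticks == 0:
        # The previous player took the last stick and won.
        return (-1 if maximizing else 1), None
    best = None
    for take in (1, 2):
        if take <= sticks:
            score, _ = best_move(sticks - take, not maximizing)
            if best is None or (maximizing and score > best[0]) \
                    or (not maximizing and score < best[0]):
                best = (score, take)
    return best

score, move = best_move(5)
print(score, move)  # take 2, leaving 3 sticks (a losing pile for the opponent)
```

With 5 sticks the search finds a forced win: taking 2 leaves a multiple of 3, from which the opponent always loses.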
Substantial research is still required to develop the software behind these capabilities and their robotic applications.
- Currently, no computers exhibit full artificial intelligence, that is, the ability to completely simulate human behavior. The greatest advances have occurred in the field of game playing. In May 1997, an IBM supercomputer called Deep Blue defeated world chess champion Garry Kasparov in a chess match.
- In the area of robotics, computers are now widely used in assembly plants, but they are capable of only very limited tasks. Robots have great difficulty identifying objects based on appearance or feel, and they still move and handle objects clumsily.
- Natural-language processing offers the greatest potential rewards because it would allow people to interact with computers without needing any specialized knowledge: you could simply walk up to a computer and talk to it. Voice recognition systems can already convert spoken sounds into written words, but they do not understand what they are writing.
- Today, the newest area of artificial intelligence is neural networks, which are proving successful in a number of disciplines such as voice recognition and natural-language processing.
- Several programming languages are known as AI languages because they are used almost exclusively for AI applications. The two most common are Lisp and Prolog.
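The neural-network approach mentioned above can be illustrated with the simplest possible unit: a single perceptron, an artificial neuron trained here to compute logical AND. The function name, learning rate, and epoch count are illustrative assumptions; real neural networks connect many such units in layers:

```python
# A minimal sketch of the neural-network idea: one perceptron learns
# weights and a bias with the classic perceptron update rule.

def train_perceptron(samples, epochs=20, lr=0.1):
    """Train a single perceptron on (inputs, target) pairs."""
    w = [0.0, 0.0]  # one weight per input
    b = 0.0         # bias term
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out          # 0 when the prediction is right
            w[0] += lr * err * x1       # nudge weights toward the target
            w[1] += lr * err * x2
            b += lr * err
    return w, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND)
for (x1, x2), target in AND:
    print((x1, x2), 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0)
```

Because AND is linearly separable, the perceptron converges after a few epochs; tasks like voice recognition require large networks of such units rather than one.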