ICT is an acronym that stands for Information and Communications Technology.

ICT is the integration of information processing, computing, and communication technologies. It is changing the way we learn, work, and live, and is often discussed in a particular context, such as education, health care, or libraries. A good way to think about ICT is to consider all the uses of digital technology that already exist to help individuals, businesses, and organizations use information. ICT covers any product that stores, retrieves, manipulates, transmits, or receives information electronically in digital form. Importantly, it is also concerned with the ways these different products can work with one another. Examples include personal computers, digital television, email, and robots.

A look at what we use at home, in the office, at school, or at any business or social function finds many devices equipped with computer chips. They include access cards, mobile phones, point-of-sale scanners, medical instruments, TV remote controls, microwave ovens, DVD players, digital cameras, PDAs, and more.



IT stands for Information Technology and covers the study, design, development, implementation, support, and administration of computer-based information systems, particularly software applications and computer hardware. Information technology deals with the use of electronic computers and computer software to convert, protect, process, store, transmit, and retrieve information.
Information technology has expanded to cover many aspects of computing and technology, and the term is more familiar than ever before. The field can be quite large, encompassing many areas of specialization. IT professionals perform a range of duties, from installing applications to designing complex computer networks.

IT professionals' responsibilities include data management, networking, database and software design, computer hardware, and the management and administration of entire systems. The term combines computing and communications and is sometimes shortened to "InfoTech". Broadly, information technology describes any technology that helps to produce, manipulate, store, communicate, or disseminate information.

More recently it has become popular to broaden the term to include electronic communication explicitly, which is why people increasingly use the abbreviation ICT (Information and Communications Technology).

The term "information technology" evolved in the 1970s. Its basic concept, however, can be traced to the World War II alliance of the military and industry in the development of electronics, computers, and information theory. After the 1940s, the military remained the major source of research and development funding for the expansion of automation to replace manpower with machine power.
Since the 1950s, four generations of computers have evolved. Each generation brought hardware of decreased size but increased capability. The first generation used vacuum tubes, the second transistors, the third integrated circuits, and the fourth integrated circuits on a single chip. The fifth generation, still experimental, is characterized by advances in artificial intelligence intended to minimize the need for complex programming.

The first commercial computer was the UNIVAC I, developed by J. Presper Eckert and John W. Mauchly and delivered in 1951. The first unit went to the U.S. Census Bureau, and a UNIVAC famously predicted the outcome of the 1952 presidential election on live television. For the next twenty-five years, mainframe computers were used in large corporations to do calculations and manipulate large amounts of information stored in databases. Supercomputers were used in science and engineering, for designing aircraft and nuclear reactors, and for predicting worldwide weather patterns. Minicomputers, which came on the scene in the 1960s and 1970s, brought computing to small businesses, manufacturing plants, and factories.

In 1975, MITS introduced the Altair 8800, one of the first microcomputers. Tandy Corporation's Radio Shack TRS-80 followed in 1977, the same year Apple introduced the Apple II. The market for microcomputers increased dramatically when IBM introduced its first personal computer in the fall of 1981. Because of dramatic improvements in computer components and manufacturing, personal computers today do more than the largest computers of the mid-1960s at about a thousandth of the cost.
Computers today are divided into four categories by size, cost, and processing ability. They are supercomputer, mainframe, minicomputer, and microcomputer, more commonly known as a personal computer. Personal computer categories include desktop, network, laptop, and handheld.


Current developments

Every day, people use computers in new ways. Computers are increasingly affordable; they continue to be more powerful as information-processing tools as well as easier to use.

Computers in Business:
One of the first and largest applications of computers is keeping and managing business and financial records. Most large companies keep the employment records of all their workers in large databases that are managed by computer programs. Similar programs and databases are used in such business functions as billing customers; tracking payments received and payments to be made; and tracking supplies needed and items produced, stored, shipped, and sold. In fact, practically all the information companies need to do business involves the use of computers and information technology.
On a smaller scale, many businesses have replaced cash registers with point-of-sale (POS) terminals. These POS terminals not only print a sales receipt for the customer but also send information to a computer database when each item is sold to maintain an inventory of items on hand and items to be ordered. Computers have also become very important in modern factories. Computer-controlled robots now do tasks that are hot, heavy, or hazardous. Robots are also used to do routine, repetitive tasks in which boredom or fatigue can lead to poor quality work.
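The POS data flow described above can be sketched in a few lines. This is a minimal illustration, not a real POS system; all names (`inventory`, `record_sale`, `REORDER_THRESHOLD`) are hypothetical:

```python
# Minimal sketch of a point-of-sale inventory update (hypothetical names).
# Each sale decrements stock on hand; items falling below a threshold
# are flagged for reordering, mirroring the flow a POS terminal sends
# to the inventory database.

inventory = {"apple": 10, "bread": 3}  # items currently on hand
REORDER_THRESHOLD = 5                  # reorder when stock drops below this

def record_sale(item, quantity=1):
    """Record a sale, update stock on hand, and report whether to reorder."""
    inventory[item] -= quantity
    needs_reorder = inventory[item] < REORDER_THRESHOLD
    return inventory[item], needs_reorder

stock, reorder = record_sale("bread")
print(stock, reorder)  # bread drops from 3 to 2, below threshold, so reorder
```

In a real deployment the dictionary would be a shared database and the sale event would also drive the printed receipt, but the bookkeeping step is essentially this simple.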

Computers in Medicine:
Information technology plays an important role in medicine. For example, a scanner takes a series of pictures of the body by means of computerized axial tomography (CAT) or magnetic resonance imaging (MRI). A computer then combines the pictures to produce detailed three-dimensional images of the body's organs. In addition, the MRI produces images that show changes in body chemistry and blood flow.

Computers in Science and Engineering:
Using supercomputers, meteorologists predict future weather by using a combination of observations of weather conditions from many sources, a mathematical representation of the behavior of the atmosphere, and geographic data.
Computer-aided design and computer-aided manufacturing programs, often called CAD/CAM, have led to improved products in many fields, especially where designs tend to be very detailed. Computer programs make it possible for engineers to analyze designs of complex structures such as power plants and space stations.

Integrated Information Systems:
With today's sophisticated hardware, software, and communications technologies, it is often difficult to classify a system as belonging uniquely to one specific application. Organizations increasingly consolidate their information needs into a single, integrated information system. One example is SAP, a German software package that runs on mainframe computers and provides an enterprise-wide solution for information technology. Built around a central database, it enables companies to organize all their data in one place and then choose only the program modules or tables they need; these freestanding modules are customized to fit each customer's requirements.




