Computer monitor
A computer monitor is an output device that displays information in pictorial or textual form. A discrete monitor comprises a visual display, support electronics, power supply, housing, electrical connectors, and external user controls.

The display in modern monitors is typically an LCD with an LED backlight, which by the 2010s had replaced CCFL-backlit LCDs. Before the mid-2000s, most monitors used a cathode-ray tube (CRT). Monitors are connected to the computer via DisplayPort, HDMI, USB-C, DVI, VGA, or other connectors and signals.

Originally, computer monitors were used for data processing while television sets were used for video. From the 1980s onward, computers (and their monitors) have been used for both data processing and video, while televisions have implemented some computer functionality. In the 2000s, the typical display aspect ratio of both televisions and computer monitors changed from 4:3 to 16:9.
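
The geometric consequence of this change follows from the Pythagorean theorem: for a fixed diagonal, a wider aspect ratio yields a shorter, wider screen. The following Python sketch (illustrative only; the function name and values are not from the source) shows how to derive width and height from a diagonal size and aspect ratio.

```python
import math

def screen_dimensions(diagonal_in, ratio_w, ratio_h):
    """Return (width, height) in inches for a given diagonal and aspect ratio."""
    # diagonal^2 = width^2 + height^2, with width/height = ratio_w/ratio_h,
    # so scale the ratio pair until its hypotenuse matches the diagonal.
    scale = diagonal_in / math.hypot(ratio_w, ratio_h)
    return ratio_w * scale, ratio_h * scale

# A 20-inch 4:3 screen is taller and narrower than a 20-inch 16:9 one:
print(screen_dimensions(20, 4, 3))    # (16.0, 12.0)
print(screen_dimensions(20, 16, 9))   # approximately (17.43, 9.81)
```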

Modern computer monitors are mostly interchangeable with television sets and vice versa. As most computer monitors do not include integrated speakers, TV tuners, or remote controls, external components such as a DTA box may be needed to use a computer monitor as a TV set.

History

Early electronic computer front panels were fitted with an array of light bulbs where the state of each particular bulb would indicate the on/off state of a particular register bit inside the computer. This allowed the engineers operating the computer to monitor the internal state of the machine, so this panel of lights came to be known as the 'monitor'. As early monitors were only capable of displaying a very limited amount of information and were very transient, they were rarely considered for program output. Instead, a line printer was the primary output device, while the monitor was limited to keeping track of the program's operation.

Computer monitors were formerly known as visual display units (VDUs), particularly in British English. This term mostly fell out of use by the 1990s.

Technologies

Multiple technologies have been used for computer monitors. Until the 21st century, most used cathode-ray tubes, but they have since been largely superseded by LCD monitors.

Cathode-ray tube

The first computer monitors used cathode-ray tubes (CRTs). Prior to the advent of home computers in the late 1970s, it was common for a video display terminal (VDT) using a CRT to be physically integrated with a keyboard and other components of the workstation in a single large chassis, typically limiting them to emulation of a paper teletypewriter, hence the early epithet 'glass TTY'. The display was monochromatic and far less sharp and detailed than on a modern monitor, necessitating the use of relatively large text and severely limiting the amount of information that could be displayed at one time. High-resolution CRT displays were developed for specialized military, industrial, and scientific applications, but they were far too costly for general use; wider commercial use became possible after the release of the slow but affordable Tektronix 4010 terminal in 1972.

Some of the earliest home computers (such as the TRS-80 and Commodore PET) were limited to monochrome CRT displays, but color display capability was already available on a few MOS 6500 series-based machines (such as the Apple II computer, introduced in 1977, or the Atari 2600 console), and color output was a specialty of the more graphically sophisticated Atari 800 computer, introduced in 1979. These computers could be connected to the antenna terminals of an ordinary color TV set or used with a purpose-made CRT color monitor for optimum resolution and color quality.