Computing
In computing, most central processing units (CPUs) are labeled in terms of their clock rate expressed in megahertz (10^6 hertz) or gigahertz (10^9 hertz). This number refers to the frequency of the CPU's master clock signal ("clock speed"), an electrical voltage that switches between low and high at regular intervals. Hertz has become the primary unit of measurement used by the general populace to judge the speed of a CPU, but many experts have criticized this approach, which they claim is an easily manipulable benchmark.[4] For home-based personal computers, CPU clock rates have ranged from approximately 1 megahertz in the late 1970s (Atari, Commodore, and Apple computers) to nearly 4 GHz in the present. The clock rate can be raised beyond the manufacturer's rating by increasing the frequency of the CPU in the BIOS or other software (overclocking); likewise, it can be decreased (underclocking).
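The relationship between a clock frequency and the duration of one clock cycle is simply period = 1/frequency. A minimal sketch (the function name and units are illustrative, not from the source):

```python
def clock_period_ns(frequency_hz: float) -> float:
    """Return the duration of one clock cycle in nanoseconds (period = 1/f)."""
    return 1.0 / frequency_hz * 1e9

# A ~1 MHz late-1970s home computer: one cycle lasts about 1000 ns (1 microsecond).
print(clock_period_ns(1e6))
# A modern 4 GHz CPU: one cycle lasts about 0.25 ns.
print(clock_period_ns(4e9))
```

This makes the scale of the change concrete: raising the clock from 1 MHz to 4 GHz shortens each cycle by a factor of 4,000.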
Various computer buses, such as the memory buses connecting the CPU and system random access memory (RAM), also transfer data using clock signals, typically operating at frequencies in the megahertz range in modern products.
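A bus's clock frequency determines its peak theoretical transfer rate together with its width and the number of transfers per clock cycle. A rough sketch, using a hypothetical 64-bit memory bus as an example (the figures are illustrative assumptions, not from the source):

```python
def peak_bandwidth_bytes_per_s(clock_hz: float, transfers_per_cycle: int,
                               bus_width_bits: int) -> float:
    """Peak theoretical transfer rate of a bus driven by the given clock."""
    return clock_hz * transfers_per_cycle * bus_width_bits / 8

# Hypothetical 64-bit memory bus clocked at 200 MHz with two transfers
# per clock cycle (double data rate, as in DDR SDRAM):
print(peak_bandwidth_bytes_per_s(200e6, 2, 64) / 1e9)  # peak rate in GB/s
```

Note that "double data rate" designs transfer data on both edges of the clock signal, which is why the effective data rate of such a bus is twice its clock frequency in hertz.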