Hertz and bytes?

I was just thinking randomly about something I always took for granted: if a gigabyte is a measure of the magnitude of data, why is a gigahertz only a measure of speed in cycles per second? Or am I missing something? Does it make sense to say "My CPU processes information at 2.8 gigahertz"? Shouldn't it be measured in units of data per time? Something more like "My CPU processes information at ____ bytes per ____ nanosecond." I don't understand how we can measure it in just cycles per second; I feel like it should be cycles of *something* per second. Please help?

2 Answers

  • Zarn
    Lv 7
    6 years ago
    Favorite Answer

    Hertz is a unit of frequency, and is just how many clock cycles per second the oscillator (the computer's clock) is providing. You're right that you could measure I/O (input/output) in data units per second - but that won't necessarily tell you anything about how much *work* your computer is getting through in a given amount of time.

    Different processors have different pipelines, then there's GPU offloading of processing, and stuff like that. CPUs do quote a "memory bandwidth", which is a straight-up "data units per second" figure. That's important for, say, serving a website, where there's not much calculation involved (compared to, say, a fractal simulation or something).
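To make that concrete, here's a rough back-of-the-envelope sketch of how a memory-bandwidth figure really is just "data units per second". The numbers below are purely illustrative, not from any particular CPU:

```python
# Hypothetical memory-bandwidth calculation: bytes moved per second.
bus_width_bytes = 8          # a 64-bit memory bus moves 8 bytes per transfer
transfers_per_sec = 1.6e9    # hypothetical 1600 MT/s memory
channels = 2                 # hypothetical dual-channel configuration

bandwidth = bus_width_bytes * transfers_per_sec * channels
print(f"{bandwidth / 1e9:.1f} GB/s")  # prints "25.6 GB/s"
```

Note this says nothing about how much *computation* happens on those bytes - it's purely a data-movement rate.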

    One way of comparing processors is to use a benchmark, which gives a numerical rating of how much work a given CPU can do on a synthetic workload. That'll give you some idea of how processors compare with a straight-up number (second link).

    Or, you could go for the FLOPS rating. FLOPS stands for FLoating-point OPerations per Second; it's a measure of how much work your CPU is doing per unit of time, and it can also give you some comparison between CPU architectures (if your workload is heavy on floating-point operations, of course).
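As a sketch of how a theoretical peak-FLOPS figure is often estimated (cores times clock times floating-point operations per cycle - all numbers here are hypothetical, just to show the arithmetic):

```python
# Hypothetical peak-FLOPS estimate: cores x clock x FLOPs per cycle.
cores = 4                 # hypothetical quad-core CPU
clock_hz = 2.8e9          # the 2.8 GHz from the question
flops_per_cycle = 8       # hypothetical: e.g. SIMD units retiring 8 FLOPs/cycle

peak_flops = cores * clock_hz * flops_per_cycle
print(f"Theoretical peak: {peak_flops / 1e9:.1f} GFLOPS")  # prints "Theoretical peak: 89.6 GFLOPS"
```

Real sustained throughput is usually well below this peak, which is exactly why benchmarks on realistic workloads are more telling than the clock number alone.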

  • Bob B
    Lv 7
    6 years ago

    Giga just means 10 to the 9th power whether it's referring to a quantity or a speed.

    Bytes are a quantity of data. One gigabyte is 10^9 bytes of data.

    The cycles of "something" are the cycles of *the CPU clock*. One gigahertz is 10^9 clock cycles per second. How much data actually gets processed in each of those cycles depends on the particular CPU.
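That connects directly to the "per nanosecond" framing in the question - a quick sketch:

```python
# Relating gigahertz to the asker's "per nanosecond" intuition.
clock_hz = 2.8e9                  # 2.8 GHz = 2.8e9 clock cycles per second
cycles_per_ns = clock_hz * 1e-9   # one nanosecond is 1e-9 seconds
print(cycles_per_ns)              # 2.8 clock cycles every nanosecond
```

So a 2.8 GHz clock ticks 2.8 times per nanosecond; how many bytes get handled per tick is a separate question about the CPU's design.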
