Meaning – The term MIPS, short for million instructions per second, refers to a measure of computer processing performance equal to one million instructions executed per second.
It is a method of measuring the raw speed of a computer’s processor. Since the MIPS measurement doesn’t account for other factors such as the computer’s I/O speed or processor architecture, it isn’t always a fair way to compare the performance of different computers.
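As a rough sketch of how the rating is derived, a MIPS figure can be computed from an instruction count and a measured execution time. The function and figures below are hypothetical illustrations, not a standard benchmark.

```python
# Minimal sketch: deriving a MIPS rating from an instruction count
# and a measured execution time. All numbers are made up for illustration.

def mips_rating(instructions_executed: int, seconds: float) -> float:
    """MIPS = instructions executed / (execution time * 10^6)."""
    return instructions_executed / (seconds * 1_000_000)

# e.g. a program that executes 900 million instructions in 4.5 seconds
print(mips_rating(900_000_000, 4.5))  # 200.0 MIPS
```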
Computer manufacturers such as IBM have used the MIPS measurement to gauge the cost of computing, expressing the value of a machine in MIPS per dollar.
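To make the MIPS-per-dollar idea concrete, here is a small worked example. The prices and ratings are invented for illustration and do not describe real hardware.

```python
# Minimal sketch of the MIPS-per-dollar value metric mentioned above.
# Prices and MIPS ratings are hypothetical illustration values.

def mips_per_dollar(mips: float, price_usd: float) -> float:
    return mips / price_usd

# A cheaper, slower machine can still deliver more value per dollar.
print(mips_per_dollar(mips=120, price_usd=3000))  # 0.04 MIPS per dollar
print(mips_per_dollar(mips=100, price_usd=2000))  # 0.05 MIPS per dollar
```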
MIPS can still be useful when comparing the performance of processors built on similar architectures.
Example of usage – “A computer rated at 100 MIPS may be able to compute certain functions faster than another computer rated at 120 MIPS.”
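The quoted example can hold because MIPS counts instructions executed, not useful work done. If one machine’s architecture needs fewer instructions to complete the same task, it can finish first despite a lower MIPS rating. The sketch below illustrates this with hypothetical instruction counts.

```python
# Minimal sketch of why a 100 MIPS machine can beat a 120 MIPS machine:
# if its architecture needs fewer instructions for the same task,
# it finishes sooner. All numbers are hypothetical.

def task_seconds(instructions_for_task: int, mips: float) -> float:
    return instructions_for_task / (mips * 1_000_000)

# Same task, different instruction counts due to architecture differences.
print(task_seconds(400_000_000, mips=100))  # 4.0 s on the 100 MIPS machine
print(task_seconds(600_000_000, mips=120))  # 5.0 s on the 120 MIPS machine
```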