Parallel Computing Questions (Medium)
Gustafson's law, also known as the Gustafson–Barsis law, is a principle in parallel computing that focuses on the scalability of parallel systems. It was presented by John L. Gustafson in 1988, who credited his colleague Edwin H. Barsis, as an alternative to Amdahl's law.
Amdahl's law, formulated by Gene Amdahl in 1967, states that for a fixed problem size, the speedup of a parallel program is limited by the fraction of the program that cannot be parallelized. According to Amdahl's law, even with an infinite number of processors, the sequential portion of the program imposes a hard ceiling on the achievable speedup.
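To make that ceiling concrete, here is a minimal sketch of Amdahl's formula in Python; the names amdahl_speedup, p, and n are illustrative choices, not notation from the original paper:

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Speedup predicted by Amdahl's law.

    p: fraction of the program that can be parallelized (0 <= p <= 1)
    n: number of processors

    The serial fraction (1 - p) always runs at single-processor speed,
    so the speedup can never exceed 1 / (1 - p), no matter how large n is.
    """
    return 1.0 / ((1.0 - p) + p / n)

# Even with 95% of the work parallelized, 1024 processors yield only
# about 19.6x, approaching but never reaching the 20x ceiling:
print(amdahl_speedup(0.95, 1024))
```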
In contrast, Gustafson's law challenges the assumptions behind Amdahl's law. It observes that problem sizes tend to grow with the computing power available, and that the extra work falls largely in the parallelizable portion of the program. Gustafson's law therefore suggests that by scaling up the problem size, the relative cost of the sequential portion can be minimized, allowing for greater speedup with more processors.
Gustafson's law frames this as "scaled speedup": the goal of parallel computing is to solve larger problems in the same amount of time, rather than to speed up a fixed problem size. If the problem size grows proportionally with the number of processors, the execution time can remain roughly constant, and the resulting scaled speedup grows nearly linearly with the processor count.
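Gustafson's scaled-speedup formula is just as compact. In the sketch below, s stands for the serial fraction of the execution time as measured on the parallel system; the names are again illustrative:

```python
def gustafson_speedup(s: float, n: int) -> float:
    """Scaled speedup predicted by Gustafson's law.

    s: serial fraction of the run time measured on the parallel machine
    n: number of processors

    S(n) = s + (1 - s) * n, equivalently n - s * (n - 1): because the
    parallel part of the workload grows with n, the speedup grows
    almost linearly instead of saturating.
    """
    return s + (1.0 - s) * n

# With a 5% serial fraction, 1024 processors give roughly a 973x
# scaled speedup on the correspondingly larger problem:
print(gustafson_speedup(0.05, 1024))
```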
In summary, Gustafson's law differs from Amdahl's law by focusing on the scalability of parallel systems: rather than treating the sequential portion as a hard limit on speedup, it argues that growing the problem size along with the processor count keeps that portion's relative cost small, so parallel systems can achieve greater performance improvement and solve larger problems efficiently.
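The contrast is easy to see side by side. A quick comparison using the two sketches above, with the same illustrative 5% serial fraction in both models:

```python
for n in (16, 256, 1024):
    print(f"n={n:5d}"
          f"  Amdahl: {amdahl_speedup(0.95, n):6.1f}x"
          f"  Gustafson: {gustafson_speedup(0.05, n):6.1f}x")

# Approximate output:
#   n=16:   Amdahl ~9.1x,   Gustafson ~15.3x
#   n=256:  Amdahl ~18.6x,  Gustafson ~243x
#   n=1024: Amdahl ~19.6x,  Gustafson ~973x
```

Amdahl's curve flattens against its 20x ceiling while Gustafson's keeps climbing, which is exactly the difference in assumptions: a fixed workload versus a workload that scales with the machine.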