Outline:
  1. General speedup formula
  2. Amdahl's Law
  3. Gustafson-Barsis's Law
  4. Karp-Flatt metric
  5. Isoefficiency metric
General speedup formula:
          ※ Inherently sequential computations: δ(n)
          ※ Potentially parallel computations: φ(n)
          ※ Communication operations: k(n,p)
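Putting these together gives the general speedup bound (a sketch using the symbols above; ψ(n,p) denotes the speedup on p processors, a notation assumed here):
          ψ(n,p) ≤ (δ(n) + φ(n)) / (δ(n) + φ(n)/p + k(n,p))
The numerator is the best sequential time; the denominator charges the sequential part in full, splits the parallelizable part evenly across p processors, and adds the communication cost.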
Amdahl's Law:
Let f denote the inherently sequential (serial) fraction of the computation and p the number of processors.
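The law then bounds the speedup as follows (standard form; the numbers in the example are chosen only for illustration):
          ψ ≤ 1 / (f + (1 - f)/p)
For instance, f = 0.1 and p = 8 give ψ ≤ 1 / (0.1 + 0.9/8) ≈ 4.7.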
The limitation of Amdahl's Law: it ignores the overhead associated with introducing parallelism.
Amdahl Effect: for a fixed number of processors, speedup is usually an increasing function of the problem size.
Amdahl's Law assumes that minimizing execution time is the primary goal of parallel computing.
Gustafson-Barsis's Law
(Amdahl's Law starts from a serial program and predicts the speedup by estimating how fast that computation would run on multiple processors. Gustafson-Barsis's Law works in the opposite direction: it starts from a parallel program and estimates how much faster the parallel computation is than executing the same computation on a single processor.)
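In this formulation the bound on scaled speedup is (with s denoting the fraction of time the parallel program spends in serial code, a notation assumed here):
          ψ ≤ p + (1 - p)s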
Karp-Flatt metric
Experimentally determined serial fraction
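Given a measured speedup ψ on p processors, the experimentally determined serial fraction is (standard form of the metric):
          e = (1/ψ - 1/p) / (1 - 1/p)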
* Takes into account parallel overhead
* Detects other sources of overhead or inefficiency ignored in the speedup model:
    * Process startup time
    * Process synchronization time
    * Imbalanced workload
    * Architectural overhead
(If the experimentally determined serial fraction does not increase with the number of processors, the main reason for poor performance is limited parallelism: most of the computation is inherently serial.)
(If the experimentally determined serial fraction increases steadily with the number of processors, the main reason for poor speedup is parallel overhead: time spent on process startup, communication, or synchronization, or limitations of the system architecture.)
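A minimal sketch of computing the metric from measured speedups (the speedup values below are hypothetical, chosen only to illustrate the two interpretations above):

```python
# Karp-Flatt metric: experimentally determined serial fraction
# e = (1/speedup - 1/p) / (1 - 1/p)

def karp_flatt(speedup: float, p: int) -> float:
    """Serial fraction implied by a measured speedup on p processors."""
    return (1.0 / speedup - 1.0 / p) / (1.0 - 1.0 / p)

# Hypothetical measurements: {number of processors: measured speedup}
measured = {2: 1.82, 4: 3.08, 8: 4.71, 16: 6.30}

for p, psi in sorted(measured.items()):
    print(f"p = {p:2d}  speedup = {psi:5.2f}  e = {karp_flatt(psi, p):.3f}")

# Roughly constant e as p grows -> limited (inherent) parallelism dominates.
# Steadily growing e as p grows -> parallel overhead (startup, communication,
# synchronization, architectural limits) dominates.
```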
Isoefficiency metric
* Parallel system: parallel program executing on a parallel computer
* Scalability of a parallel system: measure of its ability to increase performance as number of processors increases
* A scalable system maintains efficiency as processors are added
* Isoefficiency: way to measure scalability
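For reference, efficiency (used in the derivation below) is speedup per processor; here T(n,1) denotes sequential execution time and T(n,p) parallel execution time on p processors, symbols assumed for illustration:
          ε(n,p) = ψ(n,p) / p = T(n,1) / (p · T(n,p))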
Isoefficiency Derivation Steps
1. Begin with the speedup formula
2. Compute the total amount of overhead
3. Assume efficiency remains constant
4. Determine the relation between sequential execution time and overhead
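Carrying out these steps with the symbols above (T₀(n,p) denotes total overhead and C is a constant determined by the target efficiency ε; a sketch under those assumptions):
          T(n,1) = δ(n) + φ(n)
          T₀(n,p) = p·T(n,p) - T(n,1) = (p - 1)·δ(n) + p·k(n,p)
          ε(n,p) = T(n,1) / (T(n,1) + T₀(n,p))
Holding ε constant yields the isoefficiency relation:
          T(n,1) ≥ C·T₀(n,p),  where C = ε / (1 - ε)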
Scalability Function
Suppose isoefficiency relation is n >= f(p)
Let M(n) denote memory required for problem of size n
M(f(p))/p shows how memory usage per processor must increase to maintain same efficiency
We call M(f(p))/p the scalability function
Meaning of Scalability Function
To maintain efficiency when increasing p, we must increase n
Maximum problem size is limited by available memory, which grows linearly in p
Scalability function shows how memory usage per processor must grow to maintain efficiency
If the scalability function is a constant, the parallel system is perfectly scalable
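Illustrative cases (these isoefficiency relations are typical textbook examples, assumed here for the sake of illustration, with M(n) = n):
          n ≥ C·p        →  M(C·p)/p = C              memory per processor stays constant: perfectly scalable
          n ≥ C·p·log p  →  M(C·p·log p)/p = C·log p  memory per processor grows slowly with p
          n ≥ C·p²       →  M(C·p²)/p = C·p           memory per processor grows linearly with p: poor scalability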
KEY TERMS
Amdahl effect
Amdahl's Law
efficiency
experimentally determined serial fraction
Gustafson-Barsis's Law
isoefficiency relation
Karp-Flatt metric
parallel system
scalability
scalability function
scaled speedup
speedup