SuperCache: A mechanism to minimize the front end latency

Allan, Zhang and Helal, Sumi (2007) SuperCache: A mechanism to minimize the front end latency. In: Fourth International Conference on Information Technology (ITNG '07), 2007. IEEE, pp. 908-914. ISBN 0769527760

Full text not available from this repository.

Abstract

A modern CPU's pipeline stages can be roughly classified into front-end and back-end stages. The front end supplies ready (decoded, renamed) instructions and dispatches them to reservation stations, where the back end issues, executes, and retires them. The lengthy front-end stages, including instruction fetching, decoding, renaming, and dispatching, play a key role in overall performance: only an adequate supply of ready instructions allows the back-end stages to fully exploit instruction-level parallelism (ILP). Reducing front-end latency is especially critical for recent deeply pipelined architectures, where the front end is particularly long: an instruction cache access may take more than one cycle even on a cache hit, let alone a miss. On a branch misprediction, the supply/demand equilibrium between the front end and the back end is suddenly disrupted, and the back end often under-utilizes available resources during the long wait until the front end can deliver the instructions of the new branch path to the reservation stations. In this paper, we introduce and evaluate a new mechanism (called SuperCache) that aims to reduce front-end latency by enhancing the traditional reservation pool into a SuperCache and recycling retired reservation stations. With the proposed mechanism, our simulations show a significant performance improvement of up to 15%, and in some cases up to 30%. © 2007 IEEE.
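
The abstract gives no implementation details, so the following is only a minimal, illustrative Python sketch of the idea as described above, not the authors' simulator or hardware design: instead of freeing reservation-station entries at retirement, they are recycled into a larger pool (the "SuperCache") so that already decoded and renamed instructions can be re-dispatched at a short latency rather than paying the full front-end cost again (for example, after a branch misprediction or when a loop body re-executes). All names, latencies, and the LRU replacement policy below are assumptions made for illustration.

    # Minimal, illustrative sketch only; class/field names, latencies, and the
    # LRU policy are hypothetical assumptions, not taken from the paper.
    from collections import OrderedDict
    from dataclasses import dataclass

    FRONT_END_LATENCY = 7   # assumed cycles: fetch + decode + rename + dispatch
    SUPERCACHE_LATENCY = 1  # assumed cycles to re-dispatch a recycled entry

    @dataclass
    class DecodedInstr:
        pc: int
        uops: tuple          # already decoded/renamed micro-ops (opaque here)

    class SuperCache:
        """Pool of recycled (retired) reservation-station entries, indexed by PC."""
        def __init__(self, capacity=256):
            self.capacity = capacity
            self.entries = OrderedDict()      # pc -> DecodedInstr, LRU order

        def lookup(self, pc):
            entry = self.entries.get(pc)
            if entry is not None:
                self.entries.move_to_end(pc)  # refresh LRU position on a hit
            return entry

        def recycle(self, instr):
            # Called at retirement: keep the decoded entry instead of freeing it.
            self.entries[instr.pc] = instr
            self.entries.move_to_end(instr.pc)
            if len(self.entries) > self.capacity:
                self.entries.popitem(last=False)   # evict least recently used

    def supply_instruction(pc, cache, decode_fn):
        """Return (decoded instruction, cycles charged to the front end)."""
        hit = cache.lookup(pc)
        if hit is not None:
            return hit, SUPERCACHE_LATENCY         # bypass fetch/decode/rename
        instr = DecodedInstr(pc, decode_fn(pc))    # pay the full front-end cost
        return instr, FRONT_END_LATENCY

    if __name__ == "__main__":
        # Usage: a 3-instruction loop body executed 3 times. After the first
        # iteration every instruction hits the SuperCache, so the front end
        # charges 3*7 + 6*1 = 27 cycles instead of 9*7 = 63 without recycling.
        cache = SuperCache(capacity=8)
        total = 0
        for _ in range(3):
            for pc in (0x100, 0x104, 0x108):
                instr, cycles = supply_instruction(pc, cache, lambda p: ("uop", p))
                cache.recycle(instr)               # retirement recycles the entry
                total += cycles
        print("front-end cycles:", total)          # prints 27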

Item Type:
Contribution in Book/Report/Proceedings
Subjects:
Front end; Instruction level parallelism (ILP); Latency reduction; Pipeline; Superscalar; Classification (of information); Computer simulation; Decoding; Buffer storage
ID Code:
89998
Deposited By:
Deposited On:
29 Jan 2018 11:06
Refereed?:
Yes
Published?:
Published
Last Modified:
16 Sep 2023 03:12