
Superscalar Processor Models Using Statistical Learning

Processor architectures are becoming increasingly complex, so architects have to evaluate a large design space consisting of several parameters, each with a number of potential settings. To help guide design decisions, we develop simple and accurate models of the superscalar processor design space using a detailed and validated superscalar processor simulator.
First, we obtain precise estimates of all significant micro-architectural parameters and their interactions by building linear regression models from simulation-based experiments. We obtain good approximate models at low simulation cost using an iterative process in which Akaike's Information Criterion is used to extract a good linear model from a small set of simulations, and limited further simulation is guided by the model using D-optimal experimental designs. The iterative process is repeated until the desired error bounds are achieved. We use this procedure for model construction and show that it provides a cost-effective scheme for experimenting with all relevant parameters.
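A minimal sketch of this kind of iterative loop, under simplifying assumptions: the simulate() responses, the greedy forward AIC selection, and the greedy D-optimal augmentation below are illustrative stand-ins for the thesis's actual procedure, and the synthetic CPI data at the end is purely hypothetical.

import numpy as np

def aic(y, X):
    # AIC for an ordinary least-squares fit: n*ln(RSS/n) + 2k.
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    n, k = X.shape
    return n * np.log(rss / n) + 2 * k

def select_terms_by_aic(y, candidate_columns):
    # Greedy forward selection: keep adding the column that lowers AIC the most.
    chosen = [np.ones((len(y), 1))]                 # intercept term
    remaining = list(candidate_columns)
    best = aic(y, np.hstack(chosen))
    improved = True
    while improved and remaining:
        improved = False
        scores = [(aic(y, np.hstack(chosen + [c])), i) for i, c in enumerate(remaining)]
        score, idx = min(scores)
        if score < best:
            best = score
            chosen.append(remaining.pop(idx))
            improved = True
    return np.hstack(chosen), best

def d_optimal_augment(X_current, candidates, n_new):
    # Greedily pick candidate design points that maximize det(X^T X) (D-optimality);
    # the selected points would then be simulated and the model refit.
    X = X_current.copy()
    pool = list(range(len(candidates)))
    picked = []
    for _ in range(n_new):
        dets = [np.linalg.det(np.vstack([X, candidates[i:i + 1]]).T
                              @ np.vstack([X, candidates[i:i + 1]])) for i in pool]
        best_idx = pool[int(np.argmax(dets))]
        picked.append(best_idx)
        X = np.vstack([X, candidates[best_idx:best_idx + 1]])
        pool.remove(best_idx)
    return picked

# Example: two parameters plus their interaction, with a synthetic response
# standing in for simulated CPI (assumption).
rng = np.random.default_rng(0)
settings = rng.uniform(-1, 1, size=(20, 2))
cpi = 1.0 + 0.8 * settings[:, 0] - 0.3 * settings[:, 0] * settings[:, 1] \
      + rng.normal(scale=0.05, size=20)
columns = [settings[:, [0]], settings[:, [1]], (settings[:, 0] * settings[:, 1])[:, None]]
X_model, model_aic = select_terms_by_aic(cpi, columns)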
We also obtain accurate predictors of the processor's performance response across the entire design space by constructing radial basis function networks from sampled simulation experiments. We construct these models by simulating a limited number of design points selected by Latin hypercube sampling, and then deriving the radial basis function networks from the results. We show that these predictors provide accurate approximations to the simulator's performance response, and hence offer a cheap alternative to simulation while searching for optimal processor design points.
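A minimal sketch of building such a surrogate predictor, assuming a hypothetical simulate() that returns IPC for a normalized parameter vector; SciPy's Latin hypercube sampler and RBFInterpolator stand in here for the thesis's sampling scheme and radial basis function networks.

import numpy as np
from scipy.stats import qmc
from scipy.interpolate import RBFInterpolator

def simulate(point):
    # Placeholder for a detailed, validated simulator run (assumption).
    return float(np.sin(point).sum())

n_params, n_samples = 6, 200
sampler = qmc.LatinHypercube(d=n_params, seed=0)
train_points = sampler.random(n_samples)             # design points in the unit hypercube
train_ipc = np.array([simulate(p) for p in train_points])

# Radial basis function surrogate of the simulator's performance response.
predictor = RBFInterpolator(train_points, train_ipc, kernel='thin_plate_spline')

# Search the design space cheaply by querying the surrogate instead of simulating.
query = sampler.random(10000)
best = query[np.argmax(predictor(query))]
print("estimated best design point:", best)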

  1. http://hdl.handle.net/2005/537
Identifier: oai:union.ndltd.org:IISc/oai:etd.ncsi.iisc.ernet.in:2005/537
Date: 04 1900
Creators: Joseph, P J
Contributors: Jacob, T Matthew
Source Sets: Indian Institute of Science
Language: en_US
Detected Language: English
Type: Thesis
Relation: G20338
