cs533 Project 3
Due date: Friday, April 2nd, by 11:59pm
This project is about evaluating two or more computer systems you have
available to you with an established benchmark and comparing the
benchmark results with actual application performance. A primary goal
is to become familiar with one benchmark, understanding how to install
it, run it, and interpret its performance results. Another major goal
is to characterize an application workload that can be compared against
the benchmark results. The last goal is to generate a report that
reflects your understanding of the benchmarks you have run and
your interpretation of your results.
To achieve these goals, you will undertake the following tasks:
- Select benchmark. Choose a well-known computer system
benchmark. The term "well-known" is, of course, loosely defined, but
the benchmark should have been described in at least one peer-reviewed
forum and should have published results for computer systems against
which you can compare your own results.
- Install benchmark. Download and compile the benchmark in an
appropriate fashion, as determined by the benchmark guidelines and
documentation. This may include compiler settings or other
configuration details.
- Run benchmark. Run the benchmark as instructed by the
documentation on at least two different computer systems. The
differences between the computers should be significant enough that
the benchmark performance measurements returned are
different. Note, the documentation may require multiple iterations,
system configuration settings, or other details.
- Create application-level benchmark. Choose an
application or set of applications that reflect some "typical"
workload. For example, you may compile a large set of programs, or
run a large simulation. You should consider appropriate parameters
for your workload and appropriate measures of performance.
- Report results. Write up your benchmarking results, including:
- Background on the well-known benchmark, including who
invented it, where, and when; where it was published; the
intent of the benchmark; its measures of performance; and some
published performance results.
- Descriptions of the systems you run your benchmarks on,
including CPU type and speed, memory size, cache levels and
sizes, operating system, and any other pertinent system
details.
- Characterization and details on your application level
benchmark, including type of workload, length of run, number
of iterations, etc.
- Results and analysis, depicting
your results clearly using a series of
tables and graphs, with appropriate statistical techniques and data
presentation.
- Conclusions on the meaning of your data, both in terms of data
that supports your conclusions and in terms of subjective opinions
you may have.
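As a rough sketch, the mechanics of the tasks above (run a workload for
several iterations, record a measurement per run, and summarize with
simple statistics) might look like the shell script below. The
`workload` function here is a made-up placeholder; substitute the
actual command for your benchmark or application, and note that real
runs should use finer-grained timing than whole seconds if the workload
is short.

```shell
#!/bin/sh
# Sketch: run a workload several times, record the elapsed wall-clock
# time of each run, then report the mean and sample standard deviation.
# The workload below is a trivial placeholder; substitute your
# benchmark or application command.
workload() { seq 1 10000 | awk '{s+=$1} END {print s}' > /dev/null; }

ITERATIONS=5
: > times.txt                      # empty the results file
i=1
while [ "$i" -le "$ITERATIONS" ]; do
    start=$(date +%s)
    workload
    end=$(date +%s)
    echo $((end - start)) >> times.txt
    i=$((i + 1))
done

# Summarize the recorded times: mean and sample standard deviation.
awk '{ sum += $1; sumsq += $1 * $1; n++ }
     END {
         mean = sum / n
         sd = (n > 1) ? sqrt((sumsq - n * mean * mean) / (n - 1)) : 0
         printf "runs=%d mean=%.2fs sd=%.2fs\n", n, mean, sd
     }' times.txt
```

Reporting a mean together with a spread (standard deviation or a
confidence interval) over several iterations is far more convincing in
the write-up than a single measurement.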
You might first see what computers are available to you before
you choose your benchmarks. This is because not all benchmarks
can run (easily) on all platforms. Note, this is not typically the course
of action for performance evaluation (the system is usually chosen)
but this project is a special case.
There are numerous benchmarking possibilities, but below is
a head-start. Do note that I have not actually tried most
of the benchmarks below, so your mileage may vary.
- The WebStone
benchmark for Web servers.
- One of the many SPEC benchmarks.
- Various Linux benchmarks
- Others, without links but should be easy to find:
PERFECT, Livermore Loops, WhetStone, Dhrystone, LINPACK,
BYTEmark, Bonnie, XBench
The Quake 3 demo benchmarks
are fun for evaluating the performance of graphics cards. To run them:
- Download the Quake 3 demo.
- Adjust settings as desired ("fastest" and "high" are good).
- Start the console by hitting ~.
- Type "disconnect", "timedemo 1", "demo demo001".
- When done, enter the console (~) and record the frame rate.
If you are using Windows, MS Excel has good support for
drawing graphs. You might try this tutorial
to get started.
If you are using Unix, gnuplot has good support for drawing
graphs. You might see http://www.gnuplot.info/ for more
information.
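As a minimal sketch of plotting benchmark results with gnuplot, the
script below writes a made-up data file and a gnuplot script that
compares two systems; the file names, system labels, and all data
values are illustrative placeholders for your own measurements.

```shell
#!/bin/sh
# Sketch: generate a data file and a gnuplot script comparing
# benchmark scores on two systems. All values are made up for
# illustration; replace them with your measurements, then run:
#   gnuplot plot.gp
cat > results.dat <<'EOF'
# benchmark  system-A  system-B
1  42.0  61.5
2  37.2  55.0
3  48.9  70.3
EOF

cat > plot.gp <<'EOF'
set terminal png
set output "results.png"
set xlabel "Benchmark number"
set ylabel "Score (higher is better)"
plot "results.dat" using 1:2 with linespoints title "System A", \
     "results.dat" using 1:3 with linespoints title "System B"
EOF
echo "wrote results.dat and plot.gp"
```

Line plots suit trends over a parameter; for comparing discrete
systems, gnuplot's `with boxes` (bar-style) plots may read better.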
You might look at the slides for this project (ppt).
Hand-in a report (your report should be in pdf or postscript or
text but not MS Word or any other native document format).
Tar up (with gzip) your files, for example:
mkdir login-name
cp * login-name        # copy all the files you want to submit into the directory
tar -czf login-name.tgz login-name
then attach login-name.tgz to an email with "cs533_proj3" as the
subject, or use:
elm -s cs533_proj3 < login-name.tgz
to send it, if that is easier.
Send all questions to (claypool at cs.wpi.edu).