Multimedia Networking Project 1b

Evaluation of Speech Detection

Due date: February 14 in class


Overview

You are to conduct experiments to evaluate the performance of your Speech Detection algorithm implementation from Project 1. The focus of this project is not only on how the algorithm (or sound system) performs, but on the formulation of hypotheses; the design, implementation and analysis of experiments to test the hypotheses; and a writeup of it all.

You are allowed (and encouraged) to work in groups of 2 for this project. If you do work in a group, you may evaluate the implementation from either group member (or both!).


Details

Design experiments

In evaluating your algorithm, there are two measures of performance you need to consider:

   - User perception: how well the algorithm appears to work from a user's point of view
   - System impact: how much overhead the algorithm adds to the system
You will decide how each is to be measured. For example, you might collect a user's opinion on a 1-10 scale to measure user perception, and use processing time to measure system overhead (see the sketch below). You should consider the appropriateness of each measure, along with its accuracy and possible sources of error.
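
As one illustration of the processing-time measure, here is a minimal Python sketch of timing a detector over repeated runs. The detect_speech() function and the sample buffer are placeholders, not part of the assignment; substitute your own Project 1 implementation.

    import time

    def detect_speech(samples):
        """Placeholder: substitute your Project 1 speech detector here."""
        pass

    samples = [0.0] * 8000  # placeholder: one second of 8 kHz audio

    # Repeat the measurement to expose run-to-run variation.
    times = []
    for _ in range(10):
        start = time.perf_counter()
        detect_speech(samples)
        times.append(time.perf_counter() - start)

    print("mean %.6f s, min %.6f s, max %.6f s"
          % (sum(times) / len(times), min(times), max(times)))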

Then, you will manipulate each independent variable and determine its impact on the algorithm through your chosen performance measures. You must pick at least two independent variables. Some possibilities are:

In addition, you must choose one algorithm modification and evaluate it. Possibilities include:

You should formulate hypotheses about how a change in the independent variables affects your measures of performance. For example, you might hypothesize that increasing background noise lowers detection accuracy while leaving processing time roughly unchanged.

Results and Analysis

You must provide details on both the results and the analysis. The results are the numeric measures recorded in the experiments, presented in charts or tables. The analysis involves manipulating the data to understand relationships and interpreting the results, often in the form of graphs. The analysis should consider whether the data support or refute each hypothesis.
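
As a minimal illustration of such manipulation, the Python sketch below summarizes repeated trials with a mean and an approximate 95% confidence interval. The numbers are placeholders, not real results; with only a few trials, a t value would be more appropriate than the normal-approximation 1.96.

    import math
    import statistics

    trials = [0.91, 0.88, 0.93, 0.90, 0.89]  # placeholder: accuracy per trial

    mean = statistics.mean(trials)
    stdev = statistics.stdev(trials)
    # 1.96 assumes a normal approximation; use a t value for few trials.
    half_width = 1.96 * stdev / math.sqrt(len(trials))

    print("mean = %.3f, 95%% CI = [%.3f, %.3f]"
          % (mean, mean - half_width, mean + half_width))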

Summarize Findings

The main deliverable for this project is a report describing:

   - the experiments you designed, including your hypotheses and how each measure was taken
   - the results and your analysis of them
   - the conclusions you draw from the analysis

Note that the above bullets break naturally into sections of the report!


Hints

Good experimental writeups provide sufficient detail for a knowledgeable reader to reproduce the results obtained. Keep this in mind when doing your writeup. In particular, for system measures, you should record details about the hardware and software used. For user perception measures, you should record background information on the subjects: their familiarity with the topic and the type of software, how they were sampled, etc. For measurements, you should record details on the tools used, the process of gathering data, etc.
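
Here is a minimal Python sketch of recording basic hardware and software details alongside your data; extend it with anything else relevant to your setup (sound card, microphone, compiler flags, and so on).

    import platform
    import sys

    # Record the environment so a reader can reproduce the measurements.
    print("OS:       ", platform.platform())
    print("Machine:  ", platform.machine())
    print("Processor:", platform.processor())
    print("Python:   ", sys.version.split()[0])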

Visualizations, such as graphs or charts, even simple ones, are typically much better representations of data than just tables of numbers. All graphs should include:

   - a descriptive title
   - labeled axes, with units
   - a legend, when more than one data series is shown

(A sketch of a graph with these elements follows this list.)
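
For instance, here is a minimal sketch using Python's matplotlib (one alternative to the Excel and gnuplot tools suggested below) with all three elements in place. The data points are hypothetical placeholders, not real results.

    import matplotlib.pyplot as plt

    noise_db = [0, 10, 20, 30]           # hypothetical independent variable
    accuracy = [0.95, 0.91, 0.84, 0.70]  # hypothetical measured values

    plt.plot(noise_db, accuracy, marker="o", label="unmodified algorithm")
    plt.title("Detection accuracy vs. background noise")  # descriptive title
    plt.xlabel("Background noise level (dB)")             # labeled x axis, with units
    plt.ylabel("Detection accuracy (fraction correct)")   # labeled y axis
    plt.legend()                                          # legend for the data series
    plt.savefig("accuracy-vs-noise.png")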
Graph Tips

If you are using Windows, Microsoft Excel has good support for drawing graphs. You might try this tutorial http://www.ncsu.edu/labwrite/res/gt/gt-menu.html to get started.

If you are using Unix, gnuplot has good support for drawing graphs. You might see http://www.gnuplot.info/ for more information.

You might look at the slides for this project (pptx) and the slides for Experiments in Computer Science (pptx, pdf).


Hand In

You must turn in a hard copy of your project report. Please include a cover page with a title, abstract, and group members. Unless otherwise specified, the hard copy must be given to me at the beginning of class, or delivered to FL B24b before class on the day it is due.

Be sure to include names of both group members on the report, as appropriate.

Use email to turn in any testing code/scripts you used or modified, and any Makefile/project file (if you have them).

When ready, create a directory for your project named after your last name (e.g., jones) and pack (with zip or tar) your files, for example:

    mkdir lastname-proj1b                         # create the submission directory
    cp * lastname-proj1b                          # copy the files you want to submit
    tar czvf proj1b-lastname.tgz lastname-proj1b  # package and compress


Grading

The grading breakdown is as follows:

Experiments

   10% Measure of user perception

   10% Measure of system impact

   10% Independent variable 1

   10% Independent variable 2

   10% Algorithm modification

Report

     5% Abstract

     5% Introduction

           Background (as needed)

   10% Experiment design

   15% Results and analysis

     5% Conclusions

   10% (Other)

Below is a general grading rubric:

100-90. The project clearly exceeds requirements. The experiments are carefully and thoroughly done. Measures of user perception and system impact are meaningful. Both independent variables are varied over a good range in studying their effect on performance. The [RS75] algorithm is clearly modified and the effect of the modification on performance thoroughly studied.

89-80. The project meets requirements. The experiments are complete. Measures of user perception and system impact are in place. Both independent variables are varied in studying their effect on performance. The [RS75] algorithm is modified and the effect of the modification on performance studied.

79-70. The project barely meets requirements. The experiments are in place but may not be complete. Measures of user perception and system impact are in place, but may not be clear. An independent variable may not have been clearly chosen or varied in studying the effect on performance. The [RS75] algorithm may not be clearly modified or its effect on performance studied.

69-60. The project fails to meet requirements in some places. The experiments may not be complete. Measures of user perception and system impact may be missing or unclear. One or both independent variables may not have been clearly chosen or varied in studying the effect on performance. The [RS75] algorithm may not have been modified or its effect on performance studied.

59-0. The project does not meet requirements. The experiments are incomplete. Measures of user perception and system impact are missing or unclear. One or both independent variables have not been clearly chosen or varied. The [RS75] algorithm has not been adequately modified or studied.



Send all questions to the staff mailing list.