The project is divided into two phases. A working OS implementation is due at the end of each phase. All code must be well documented and should be accompanied by a report that follows the guidelines below.
You have the following options for receiving the source file(s) provided for the project:
If you receive a single file, then you must break it into the individual source files. The boundaries are clearly delimited. (Look for the strings "START OF" and "END OF".) These files are named:
global.h
syscalls.h
z502.h
protos.h
test.c
base.c
sample.c
scheduler_printer.c
z502.c
The first four files are include files. The declarations in the includes are separated, to some extent, according to whether they should be visible from the OS, from the Z502 hardware, and so on.
test.c contains the various user test programs which you are to run to verify that your implementation of the operating system is working correctly.
base.c is a skeleton of the operating system. It has a few of the hooks necessary to enable the program to link together, and will provide a bare framework on which you can hang your development.
sample.c contains simple examples of calls to the hardware routines. You may find these useful as examples of how your operating system can call the hardware; a sketch of such a call appears after these file descriptions.
scheduler_printer.c is a routine you should use to print out information on how your scheduler is working. It's convenient in that it provides a simple, easy-to-read output format. The interface is a bit awkward, since it's designed to work with just about any scheduler you might write. Details on using this interface are given in Appendix D. Later in this student manual is an overview of what your deliverables should look like; scheduler_printer will help you meet those goals.
z502.c is the code for the hardware simulation. Please don't mess with this code just for jollies; if you modify it, you own it. On the other hand, many curious minds enjoy discovering how it's built. This simulation provides the appearance of a CPU, memory, disk, etc., with which your operating system is expected to interface.
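To give you a flavor of what sample.c illustrates, here is a tiny, self-contained sketch of an OS routine exercising the "hardware". The routine names (read_hardware_clock, start_hardware_timer) are stand-ins invented for this sketch only; the real entry points and their calling conventions are declared in z502.h and protos.h and demonstrated in sample.c.

    /* Hypothetical, self-contained sketch: these "hardware" routines are fakes
     * standing in for the real Z502 entry points declared in z502.h/protos.h
     * and demonstrated in sample.c. */
    #include <stdio.h>

    static long fake_clock = 131676;                /* pretend simulated time */

    static long read_hardware_clock(void)           /* stand-in for the clock */
    {
        return fake_clock;
    }

    static void start_hardware_timer(long ticks)    /* stand-in for the timer */
    {
        printf("timer armed: interrupt expected in %ld time units\n", ticks);
    }

    /* An OS routine might time-stamp a request and arm the timer like this. */
    static void os_sleep_request(long sleep_units)
    {
        long now = read_hardware_clock();
        printf("sleep requested at time %ld\n", now);
        start_hardware_timer(sleep_units);
    }

    int main(void)
    {
        os_sleep_request(100);
        return 0;
    }

The point is simply that your OS code calls into the simulation the same way it would poke real hardware; sample.c shows the genuine calls.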
After separating all the individual files using your favorite editor, it should be simple to build the executable. No makefile is provided, since systems vary so greatly. I've found that a simple line like
"cc -g test.c base.c sample.c scheduler_printer.c z502.c -lm"
works pretty well. Note that the last "-lm" is to include the math library at link time. Executing the resultant program "a.out" gave me this output:
This is Release 2.0 of the Z502 Hardware.
os_switch_context_complete called before going to user code.
Program called with 1 arguments: a.out
Calling with argument 'sample' executes the sample program.
os_switch_context_complete called before going to user code.
This is Release 2.0: Test 1a
SVC handler: get_time 131676 0 0 0 0 0
SVC handler: sleep 100 0 0 0 0 0
SVC handler: get_time 131680 0 0 0 0 0
sleep time= 100, elapsed time= 0
SVC handler: term_proc -1 131708 0 0 0 0
ERROR: Test should be terminated but isn't.
User program did a simple return; use proper system calls only.
PANIC: Because OS502 used hardware wrong.
Note that you may get different numbers. Note also that the output looks so strange (and so erroneous) because there's no operating system to handle the system calls that were generated when the code tried to execute test1a. This is because YOU haven't yet written the OS502 that will do the work required by the tests.
Now to exercise the "sample" program, type "a.out sample". This code simply tries some of the hardware calls. The result I got looked like this:
% a.out sample
This is Release 2.0 of the Z502 Hardware.
os_switch_context_complete called before going to user code.
Program called with 2 arguments: a.out sample
Calling with argument 'sample' executes the sample program.
os_switch_context_complete called before going to user code.
Interrupt_handler: Found vector type 5 with value 0
Interrupt_handler: Found vector type 5 with value 0
The disk data written is: 123456789abcdef
The disk data read is:    123456789abcdef
The following test may take a few seconds to execute.
The following test may take a few seconds to execute.
Interrupt_handler: Found vector type 5 with value 0
Interrupt_handler: Found vector type 5 with value 0
Interrupt_handler: Found vector type 5 with value 0
Interrupt_handler: Found vector type 5 with value 0
Interrupt_handler: Found vector type 5 with value 0
Interrupt_handler: Found vector type 5 with value 0
Interrupt_handler: Found vector type 5 with value 0
If you got here without the hardware complaining, then you have
approximately enough memory for the disk data. This does NOT account
for other memory uses such as code and other data storage you may do
during this project.
The system says the time is 317681
 Time  Target  Action  Run  New  Done  State Populations
99999    99    CREATE   99   50   80
               READY  : 0 1 2 3 4 5 6 7 8 9
               WAITING: 20 21 22 23 24 25 26 27 28 29
               SUSPEND: 40 41 42 43 44 45 46 47 48 49
               SWAPPED: 60 61 62 63 64 65 66 67 68 69
The Z502 stopped execution because of a halt command.
The project deliverables are an Architectural Document, your Source Code, and a Test Document.
Excruciating detail on the contents of these deliverables is provided below. Copies of these documents must be submitted at the end of both Project 1 and Project 2.
In general, there's a VERY LARGE amount of work to do. If you are compulsive, then you need to understand right now that you may NEVER finish the entire project. The project is purposely left open-ended so that you can go to whatever limit you are willing and able to reach; but that doesn't mean you should spend every waking hour from now to the end of the semester on this project.
In all documents, strive to be concise. The purpose of the documents is to indicate the decisions you made and how effective they were. Your goal should be to communicate that information clearly and succinctly. Assume that I am the audience; as such, there is no need to provide detail on general operating system topics - for instance, you can assume that your reader understands the various aspects of schedulers. In essence, this is not a project you must sell to management and colleagues; thus prettiness is NOT as important as succinctness.
The Architectural Document should consist of an architectural and policy overview. You may add any additional information you think is useful. The real goal here is "how does everything hang together?" But please, if this entire Architectural Document is more than 5 - 10 pages long, something is wrong.
Your Architectural Document should include EACH of the following subsections:
1) A list of WHAT is included in your design.
2) Your high-level design.
3) Justification of the high-level design.
4) Additional features you have implemented.
5) The anomalies and bugs you found.
You must submit a copy of all YOUR operating system code, but you should NOT give me back a copy of the code I've handed out to you. Source code should contain module descriptions and should be appropriately commented so that an average human can understand it.
The module description should include at least the following: A brief paragraph explaining WHAT the code does and HOW it does it. This is the "little picture" and describes each routine; the big picture was given in the Architectural Document.
The SVC, interrupt, and fault handler routines are a key to your design. These routines will be looked at in detail. Please highlight where these routines can be found in your code. (A separate section is easiest.)
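By way of illustration only, here is the general shape such a routine and its module description might take. The call identifiers and the argument layout below are invented for this sketch; the real ones come from syscalls.h and from your own design.

    #include <stdio.h>
    #include <stdlib.h>

    /* Invented call identifiers - the real ones belong in syscalls.h. */
    enum svc_type { SVC_GET_TIME, SVC_SLEEP, SVC_TERMINATE };

    /* dispatch_svc -- WHAT: routes a trapped system call to the OS502 routine
     *                 that services it.
     *                 HOW:  switches on the call type left behind by the SVC
     *                 trap and hands the argument block to the matching
     *                 service code. */
    static void dispatch_svc(enum svc_type call_type, long *args)
    {
        switch (call_type) {
        case SVC_GET_TIME:
            printf("SVC handler: get_time\n");  /* args[0] would receive the time */
            break;
        case SVC_SLEEP:
            printf("SVC handler: sleep %ld\n", args[0]);
            break;
        case SVC_TERMINATE:
            printf("SVC handler: term_proc %ld\n", args[0]);
            exit(0);
        default:
            printf("SVC handler: unknown call %d\n", (int)call_type);
            break;
        }
    }

    int main(void)
    {
        long args[6] = {0};

        dispatch_svc(SVC_GET_TIME, args);
        args[0] = 100;
        dispatch_svc(SVC_SLEEP, args);
        args[0] = -1;
        dispatch_svc(SVC_TERMINATE, args);
        return 0;
    }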
The Test Document should contain a number of sections, one for each test.
Here is a MAJOR issue; ignore it at your peril. One of your very tough jobs will be to present output that represents what you've done. It's a very narrow path you must follow: present too little data and it's impossible to see the extent of the work you've done; present too much and no one will be able to carry all the paper. You must choose some middle ground - you must work hard at producing a concise, legible, complete list of actions.
In general, there are four levels of output; some will be appropriate for some tests, but not for others. NOTE: These descriptions apply to the TEST OUTPUT Table given below.
Here's a recommended format for the output - it's a dump of the frame table.
A 0000000000111111111122222222223333333333444444444455555555556666
B 0123456789012345678901234567890123456789012345678901234567890123
C 0000000001111
D 0000000110000
E 0000111000000
F 0000000220000
G 0123567230123
H 7773333117777

The rows mean the following:
A - B: The frame number. Note how the first column is "00" and the last column is "63".
C: The Process ID of the process having its virtual page in that frame.
D - G: The virtual page number of the process that's mapped to that frame. Again the number (from 0 to 1023 possible) is written vertically.
H: The state of the page. Valid = 1, referenced = 2, modified = 4. These are OR'd together. So the page mapped to frame number 0 is valid + ref'd + modified.
Example: The page in frame 6 is virtual page 107 in process 0. That page has been made valid and has been referenced.
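If it helps, here is one self-contained way to generate that vertical layout. The frame_entry structure and its field names are invented for this sketch; they are not part of the supplied sources.

    #include <stdio.h>

    #define NUM_FRAMES 64

    struct frame_entry {
        int pid;      /* owning process id (one digit in this format) */
        int page;     /* virtual page number, 0..1023                 */
        int state;    /* OR of valid=1, referenced=2, modified=4      */
    };

    static struct frame_entry frame_table[NUM_FRAMES];

    static void print_frame_table(void)
    {
        int f, row, divisor;

        printf("A ");                             /* frame number, tens digit  */
        for (f = 0; f < NUM_FRAMES; f++) printf("%d", (f / 10) % 10);
        printf("\nB ");                           /* frame number, units digit */
        for (f = 0; f < NUM_FRAMES; f++) printf("%d", f % 10);

        printf("\nC ");                           /* process id                */
        for (f = 0; f < NUM_FRAMES; f++) printf("%d", frame_table[f].pid % 10);

        /* rows D-G: the four digits of the virtual page number, top digit first */
        for (row = 0, divisor = 1000; row < 4; row++, divisor /= 10) {
            printf("\n%c ", 'D' + row);
            for (f = 0; f < NUM_FRAMES; f++)
                printf("%d", (frame_table[f].page / divisor) % 10);
        }

        printf("\nH ");                           /* page state bits           */
        for (f = 0; f < NUM_FRAMES; f++) printf("%d", frame_table[f].state % 8);
        printf("\n");
    }

    int main(void)
    {
        frame_table[6].pid = 0;           /* the example from the text:          */
        frame_table[6].page = 107;        /* frame 6 holds page 107 of process 0 */
        frame_table[6].state = 1 | 2;     /* and is valid + referenced           */
        print_frame_table();
        return 0;
    }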
Here's a Table indicating the type of output that should be provided with each test.
Test Output

Test Name | Test Output | SVC/Fault/Interrupt | Scheduler | Memory
---|---|---|---|---
test1a | full | full | No | No
test1b | full | full | No | No
test1c | full | limited | full | No
test1d | full | limited | full | No
test1e | full | full | No | No
test1f | full | limited | full | No
test1g | full | full | No | No
test1h | full | limited | full | No
test1i | full | full | No | No
test1j | full | limited | full | No
test1k | full | full | No | No
test2a | full | full | No | full
test2b | full | full | No | full
test2c | full | limited | No | No
test2d | full | limited | limited | No
test2e | full | limited | limited | limited
test2f | full | limited | No | limited
test2g | full | No | No | No
In the Table, "full" means the complete output offered by that option. "limited" means give me the first, say, 10 occurrences of the printout and then do NOT give me the remaining thousands that might occur. "no" means just that - don't give me that kind of output.
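One way to honor "limited" without scattering special cases through your OS is a small counting wrapper around your trace printing, one counter per kind of output. The sketch below is only a suggestion, and every name in it is invented.

    #include <stdarg.h>
    #include <stdio.h>

    #define LIMIT 10    /* print this many occurrences, then go quiet */

    static int scheduler_trace_count = 0;

    static void scheduler_trace(const char *format, ...)
    {
        va_list args;

        if (scheduler_trace_count >= LIMIT) {
            if (scheduler_trace_count == LIMIT)   /* announce the cutoff once */
                printf("... scheduler trace suppressed after %d lines ...\n", LIMIT);
            scheduler_trace_count++;
            return;
        }
        scheduler_trace_count++;

        va_start(args, format);
        vprintf(format, args);
        va_end(args);
    }

    int main(void)
    {
        int i;

        for (i = 0; i < 25; i++)
            scheduler_trace("dispatching process %d\n", i % 4);
        return 0;
    }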
Remember, points are gained by PROVING your feature works - this output is the best way of gaining that proof.
For each test run, include the data just described. This raw output needs to be annotated. There are lots of projects to be marked. We need help in locating what YOU consider to be important. The TURNIN rules given below contain exactly how to do this.
What can you trust about this project? Numerous classes have gone before you and have suffered through numerous anomalies and bugs. But perfection is elusive.
I've written the z502.c code. I also wrote a hardware diagnostic which runs all the hardware code. There is 100% code coverage for the disk and memory facilities. To the best of my knowledge, I've completely exercised the fault and interrupt mechanism. I built enough of an operating system so that I could run all the tests; I could at least determine that the user tests sequenced correctly, though my partial system didn't return correct system call data.
This assures that the hardware is able to successfully do context switching both for memory requests and for other types of system calls. I am convinced there are still bugs in the hardware, even though I wrote more diagnostic code than I did hardware code. However, several classes have used the hardware without uncovering any new bugs. There are, of course, always problems with newly written code.
This code was successfully run using GNU C on a SUN. It has also compiled and run on a Stratus system and on a PC.
Throughout the project you will be faced with choices of "beautiful, efficient" algorithms that you have designed versus "easy, trivial (ha, ha)" algorithms that you are sure you can implement. I offer the following advice for your general design: Choose the most complex design that you are CERTAIN you can implement. The emphasis is on getting each phase of the project working. A very simple design that works will score substantially higher than a wonderfully efficient design that does not. On the other hand, the grading does reflect the complexity of the design implemented. In this regard, MAKE SURE I know, by TELLING ME in your writeup, what features you have implemented.
You will be marked independently on each of the two projects. Your mark for the second project is independent of the first, except that numerous features from Project 1 show up again in Project 2. There's no way I can possibly read all your code; I'm faced with either superficially looking over your entire project, or reading pieces in some detail. I'm likely to choose some of each (the algorithm keeps changing). You could be lucky - the piece I pick could be one on which you did a stupendous job; then again, you could be less fortunate.
The ultimate purpose of this, believe it or not, is to have fun. You will have a great sense of satisfaction in completing this piece of work. Good luck to you.
This is the mechanism and these are the rules to be used for turning in projects. The rules are meant to be simple and general enough that you won't have any trouble using them, while the naming conventions make it easy for me to manipulate multiple files from each student.
NAMING CONVENTION:
I am capable of reading the following formats:
.txt  - Simple ASCII text.
.doc  - A document in Microsoft Word format.
.pdf  - A document that can be read using Adobe Acrobat.
.html - A document I can read with a browser.
When you turn in a file, please make sure that it has a suffix matching the type of file it is. This will ensure I don't need to rename your file. It is NOT necessary that all the files you turn in be of the same type. From this we define
Suffix = { txt | doc | pdf | html }
In order to reduce the number of filename collisions, I'm asking you to begin the filenames with your initials, last name first. So my initials would be "bj".
Initials = 1st letter of your last name + 1st letter of your first name.
E-MAIL CONVENTION:
THE PLEDGE DOCUMENT:
The purpose of the pledge document is to remind you that all your work must be done independently. The text of this document should be as given below.
I pledge that I have not collaborated with anyone else in writing the code for this project. I understand that it is permissible to discuss aspects of the project and talk over strategies with others. However, ALL programming that I am handing in was done by me and only me. I understand I will get a grade of 0 on the project if it is discovered that my code bears ANY resemblance to that of another current or former student.
The pledge document should be included in the e-mail and should be named
Initials_pledge.Suffix.
Example: bj_pledge.doc
THE ARCHITECTURAL DOCUMENT:
The architectural document should be included in the e-mail and should be named
Initials_arch.Suffix.
Example: bj_arch.doc
THE SOURCE DOCUMENT:
All source files that you have modified should be concatenated together. For instance, you will have modified my base.c (and perhaps renamed it); please include that modified source as well as any new sources. Also make sure you include any modified include files, again all concatenated into one file. If you can't fit all the code/includes in one file, then please limit the number to two or three.
The source code file should be included in the e-mail and should be named
Initials_source{ | 1 | 2 | 3 }.Suffix.
Example: bj_source.txt or bj_source1.txt
THE TEST DOCUMENT:
The test document contains the concatenated results of all the tests. At the beginning of this file is a section called "NAVIGATION". It explains how to find things in the file. For instance, a statement such as the following might be appropriate:
NAVIGATION: In this test document file, search for characters "XXX" where you will find text commenting the relevant portions of the output.
Note: Don't ask me to search for a "highlight". I want to be able to do a "find" on particular textual characters. Comments highlighted in red can't be searched for and require me to do a lot of hunting.
Note: Each test should have 5 - 10 "XXX" comments. A possible comment might be:
XXX Look at that!! Did you see how that crazy priority scheduler caused process 4 to go on the ready queue ahead of process 2? See - this PROVES priority scheduling is working!!
Other Tips: Do NOT include your OS debugging statements. They are generally helpful only to you. Output guidelines for each test are given in the Student Manual.
The test document should be included in the e-mail and should be named
Initials_test_document.Suffix.
Example: bj_test_document.pdf
CS502 - Operating Systems
Evaluation for Project I                        Name _________________________

_ 1. Architectural Document. Each of the following should be addressed;
     see pages 4 and 5 of the Student Manual for more detail.
     ___ [5]   1) A list of WHAT is included in your design.
     ___ [10]  2) High Level Design.
     ___ [10]  3) Justification of High Level Design.
     ___ [5]   4) Additional Features.
     ___       5) What anomalies and bugs did you find?

_ 2. Source Code. See page 5 of the Student Manual for more detail.
     ___ [5]   a) Code contains description of WHAT and HOW.
     ___ [5]   b) The SVC, interrupt, and fault handler routines.

_ 3. Test Results
     ___ [6]   a) Used output guidelines discussed in student manual.
     ___ [4]   b) Test program 1a runs and gives expected output.
     ___ [4]   c) Test program 1b runs and gives expected output.
     ___ [4]   d) Test program 1c runs and gives expected output.
     ___ [4]   e) Test program 1d runs and gives expected output.
     ___ [4]   f) Test program 1e runs and gives expected output.
     ___ [4]   g) Test program 1f runs and gives expected output.
     ___ [4]   h) Test program 1g runs and gives expected output.
     ___ [4]   i) Test program 1h runs and gives expected output.
     ___ [4]   j) Test program 1i runs and gives expected output.
     ___ [4]   k) Test program 1j runs and gives expected output.
     ___ [4]   l) Test program 1k runs and gives expected output.
     ___ [10]  m) Demonstration of extra features.
CS502 - Operating Systems
Evaluation for Project II                       Name _________________________

_ 1. Architectural Document. Each of the following should be addressed;
     see pages 4 and 5 of the Student Manual for more detail.
     ___ [5]   1) A list of WHAT is included in your design.
     ___ [10]  2) High Level Design.
     ___ [10]  3) Justification of High Level Design.
     ___ [5]   4) Additional Features.
     ___       5) What anomalies and bugs did you find?

_ 2. Source Code. See page 5 of the Student Manual for more detail.
     ___ [5]   a) Code contains description of WHAT and HOW.
     ___ [5]   b) The SVC, interrupt, and fault handler routines.

_ 3. Test Results
     ___ [7]   a) Used output guidelines discussed in student manual.
     ___ [7]   b) Test program 2a runs and gives expected output.
     ___ [6]   c) Test program 2b runs and gives expected output.
     ___ [6]   d) Test program 2c runs and gives expected output.
     ___ [6]   e) Test program 2d runs and gives expected output.
     ___ [6]   f) Test program 2e runs and gives expected output.
     ___ [6]   g) Test program 2f runs and gives expected output.
     ___ [6]   h) Test program 2g runs and gives expected output.
     ___ [10]  i) Demonstration of extra features.