OS Survey F99


Submission Criteria

For our OSSurveyF99 conference, you, as researchers, must submit your paper electronically to the program chair (me). Use MIME to send me a PDF or PostScript version of your file. If you use PostScript, it must be portable PostScript -- Unix programs that generate PostScript (e.g., dvips with LaTeX, or FrameMaker's output) are okay; with Windows, you should verify that your printer driver properties have the "maximize portability" selection, and make sure that the result will print on any printer. Typically, if I'm forced to use Windows, I install the Apple LaserWriter driver and select "maximize portability", and that seems to work okay.
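
For instance, on Unix with LaTeX, something like the following should yield portable output (the file name mypaper is just a placeholder):

    latex mypaper
    dvips -t letter -o mypaper.ps mypaper.dvi

The -t letter option merely fixes the paper size; dvips output does not otherwise depend on any particular printer.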

The page limit is 10 pages. This is firm. The paper must be in 12 point font, with at least 1 inch of margin on all edges.

Program Committee

After you've submitted your papers, you will play the part of program committee members. This means reviewing the papers for the correctness / soundness of the claims or observations, possibly tracking down their references to verify statements, etc.

Review Format

For our OSSurveyF99 conference, you, as members of the program committee, should get the on-line submissions from this web page. The papers should be evaluated using the following method (this is taken from a real conference's program committee review instructions and revised slightly, since survey papers will not necessarily contain new research -- though if there are new research ideas, that's even better):
Papers will be evaluated in two parts: a set of numeric scores and a written commentary. Scores will be on a 1 to 7 scale (with 4 as "average"; in all cases, 1 is bad and 7 is best). Do not assign a zero score. Also, please use integer values.

Scoring will be in four categories:

Import -- is the work (both its area and results) important to the OS community? Low scores might be given for evaluations of marketing literature and papers on inappropriate or dead topics. High scores are for papers that nearly all attendees will want to read and understand carefully. (This score is sometimes more a measure of how ``hot'' the area is and not a measure of the paper's quality; it will not affect the grade.)

Novelty -- are the observations novel / germane? Low scores should be given for papers that re-hash obvious results or known observations about works in the topic area. High scores are for papers that point out new research areas (portions of design space not explored that ought to be), new fields, or demonstrate new ways to view / attack a problem.

Quality -- are the observations / criticisms sound? A low score might go to a paper whose observations are incorrect or whose critiques are biased or not well supported in your opinion. High scores are for papers with enough justification to convince you that the opinions are correct and viable.

Overall -- should we accept this paper or not? This is by far the most important number. It need not be an average of the other numbers, but it should reflect them. This number can also reflect issues in addition to those described above (e.g., poor presentation or lack of knowledge of related work).

Note that the conference evaluation contains criteria for novelty and importance to the OS community; when I grade the papers, these criteria will be less important -- I will pay more attention to the quality of the reasoning and the soundness of the observations / criticisms, so having selected what's currently ``hot'' (or what's not) won't matter much.

The review should be in the following format (it's best if you cut-and-paste or use the review template). The reviews will be machine parsed to generate statistics; a sketch of what that parsing looks like follows the template.

Paper 99
-------- 8< --------  scores  -------- 8< --------  scores  --------
Import		Novelty		Quality		Overall
7		1		5		4
-------- 8< -------- comments -------- 8< -------- comments --------
Your comments on the paper.  These are public comments that the authors
of the papers will see.  Provide feedback to improve their paper, etc.
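
Since the reviews really are machine parsed, keep the score block in exactly this shape: the four category names on one line, the four integers on the next. As a rough illustration only -- this is a hypothetical sketch in Python, not the actual parsing script -- the processing amounts to something like:

    import statistics
    import sys

    CATEGORIES = ["Import", "Novelty", "Quality", "Overall"]

    def parse_review(text):
        # Find the header line naming the four categories; the line
        # directly below it carries the four integer scores.
        lines = text.splitlines()
        for i, line in enumerate(lines):
            if line.split() == CATEGORIES:
                scores = [int(tok) for tok in lines[i + 1].split()]
                if len(scores) != 4 or any(s < 1 or s > 7 for s in scores):
                    raise ValueError("need four integer scores, each 1 to 7")
                return dict(zip(CATEGORIES, scores))
        raise ValueError("no score line found")

    # Usage: python tally.py review1.txt review2.txt ...
    # Prints the per-category mean across all the reviews given.
    reviews = [parse_review(open(name).read()) for name in sys.argv[1:]]
    for cat in CATEGORIES:
        print(cat, statistics.mean(r[cat] for r in reviews))

A parser of this sort chokes on extra text in the score block or on non-integer scores, so stick to the template.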

Submissions

  1. [AFW] abbott,fei,williams.a_look_at_modern_file_systems.pdf
  2. [AGHW] anderson,gillen,hickson,wurster.a_survey_of_real-time_operating_systems.ps
  3. [BCH] bellardo,copenhafer,hamerly.microkernels_as_foundations_for_distributed_systems.ps
  4. [BCGL] boyer,chen,gray,lee.operating_systems_for_the_www.pdf
  5. [GJG] ghose,jain,gopal.characterizing_QoS-awareness_in_multimedia_operating_systems.pdf
  6. [GKLL] guan,kuhl,li,liu.survey_of_distributed_file_systems.ps
  7. [KNSS] koletsou,nightingale,sair,sinanoglu.performance_of_software_distributed_shared_memory_systems.ps
  8. [SS] schneider,savla.survey_of_security_in_mobile_code_systems.ps
  9. [TKML] tower,kondo,mysore,leong.resource_scheduling_in_real-time_systems.ps

Review Assignments / Results

Note that normally authors don't know who on the program committee reviewed their paper. Of course, if one of the authors is on the program committee, then they'd probably know. You should view these comments (and, hopefully, you wrote your own comments) in a constructive light -- i.e., take the comments into account when writing up your slide presentation.

Sorted by Paper
Paper    Reviewers                                               Anonymized comments
[AFW]    copenhafer guan savla gillen lee ghose sair             here
[AGHW]   abbott hamerly kuhl tower bellardo jain sinanoglu       here
[BCH]    fei boyer li kondo hickson gopal schneider              here
[BCGL]   williams ghose liu mysore wurster guan savla            here
[GJG]    anderson chen koletsou leong copenhafer kuhl tower      here
[GKLL]   gillen gray nightingale abbott hamerly koletsou kondo   here
[KNSS]   hickson lee schneider fei boyer li mysore               here
[SS]     wurster jain sair williams chen liu leong               here
[TKML]   bellardo gopal sinanoglu anderson gray nightingale      here
Sorted by Reviewer
Reviewer     Papers
abbott       AGHW GKLL
fei          BCH KNSS
williams     BCGL SS
anderson     GJG TKML
gillen       GKLL AFW
hickson      KNSS BCH
wurster      SS BCGL
bellardo     TKML AGHW
copenhafer   AFW GJG
hamerly      AGHW GKLL
boyer        BCH KNSS
chen         GJG SS
gray         GKLL TKML
lee          KNSS AFW
ghose        BCGL AFW
jain         SS AGHW
gopal        TKML BCH
guan         AFW BCGL
kuhl         AGHW GJG
li           BCH KNSS
liu          BCGL SS
koletsou     GJG GKLL
nightingale  GKLL TKML
sair         SS AFW
sinanoglu    TKML AGHW
schneider    KNSS BCH
savla        AFW BCGL
tower        AGHW GJG
kondo        BCH GKLL
mysore       BCGL KNSS
leong        GJG SS

You should write up your reviews by Dec 2, and email me each review in its entirety. The authors will get the comments. I will sum up / average the numeric scores and provide those to the authors.

Conference Presentation

All the groups should prepare a presentation. On the evening of Dec 2, I'll announce the top four papers on this web page; their authors will actually make presentations. All of the groups (including those that make the actual presentations) should print out their slides (on paper) or generate portable PostScript/PDF and send that to me for grading/review. This should be done by the day of the presentation.

On Dec 7 and Dec 9, the four ``accepted'' papers' groups will give an oral presentation in front of the entire class. The conference will be held in AP&M 4882. Each group will have about 1/2 hour total, so you should plan on 20-25 minutes for the presentation and 5-10 minutes for questions and answers from the audience.

What To Expect At The Conference

For our OSSurveyF99 conference, you, as attendees, will also help evaluate the presentation of the papers. The top four papers will be presented by the authors. An overhead projector will be available for the oral presentations. The presentations should give an overview of the topic and results; their main purpose is to motivate the audience to read the full paper published in the proceedings. As attendees, your main evaluation criterion is this: do the presenters convey the key ideas clearly? Does the presentation make you want to look at the details in the paper? (Making you look because the presentation was confusing doesn't count!)

bsy+cse221.f99@cs.ucsd.edu, last updated Fri Dec 3 17:00:49 PST 1999. Copyright 1999 Bennet Yee.