Middle States Assessment Report
Computer Science Department
Report of Assessments Scheduled for Spring, 2006
The Middle States Evaluators will be visiting the University of Maryland in March 2007. The evaluations scheduled for Spring 2006 have been completed.
Small committees were appointed by the department chair for each of the scheduled assessments, and reports have been submitted.
The committees were created with several criteria in mind. Committee members could not be associated that semester with the course being
evaluated, but were chosen from people with more intimate knowledge of the course, such as faculty who had taught the course in previous
semesters, faculty who taught the course or courses that follow it in the sequence, and/or faculty who teach similar material at a different level. The
teams were also created to have a mixture of faculty from different levels. These criteria were intended to maximize the range of viewpoints
represented during the evaluation and to maximize the involvement of faculty members in the process.
Each report examines a sample of student work to determine whether the specified learning outcome is being reached to the level of the goal
specified in the assessment proposal. Each report submitted by a committee is 1-3 pages summarizing: what was reviewed, how the review was conducted
(sampling, etc.), the criteria used for the review, the results, and suggestions for improvement.
Bachelor’s - Learning Outcome #1

Student Learning Outcome:
1. Graduates will be able to create, augment, debug and test computer software. These skills will be built progressively through the courses in the introductory sequence.

Assessment Measures and Criteria:
Projects will be identified from the introductory sequence of courses (CMSC 131, 132 & 212) that emphasize each of these skills. These project descriptions and sample student implementations will be evaluated to determine the extent to which each of these skills and the expectations build through the sequence. Sample student implementations will be reviewed to determine the extent to which students are meeting the expectations of each skill at each level. This evaluation will be done by the “Introductory programming oversight committee,” which is already a standing committee of people with experience teaching at this level. The projects should have expectations that build in a manner appropriate to a three-course sequence, and at least 75% of the sampled student projects should be deemed as working to the level of that expectation.

Assessment Schedule (initial year and subsequent cycle):
Starting in the spring of 2006, these project descriptions and sample solutions will be reviewed for coding skills. In spring of 2008 they will be reviewed for debugging and testing. These two will then proceed on an every-other-year cycle.
For this learning outcome, we had a two-person sub-committee of the introductory program committee review one or two project descriptions and
implementations done by a sample of students who passed the course. Each time this evaluation is done, it will be performed on a different course in the
introductory sequence. For this first time through, CMSC 131 will be evaluated by people who are not directly involved with CMSC 131, but who receive the
students who pass 131 by teaching 132. Nelson Padua-Perez and Chau-Wen Tseng were selected as viable candidates for this evaluation.
============================================================
Middle States Evaluation of CMSC 131, Spring 2006
Reviewers
Nelson Padua-Perez, Chau-Wen Tseng
Reviewed
The reviewers reviewed CMSC 131, Object-Oriented Programming I. It was taught in Spring 2006 by Fawzi Emad. The goal of the course is to help
students develop skills such as program design and testing, as well as the implementation of programs using a graphical IDE. All programming is
performed in Java.
Review methodology
The reviewers obtained examples of student code for two programming projects, “Company Database” and “Shape Decorator.” The reviewers
examined the project descriptions, student code, and other course materials. Six student implementations were selected at random from all projects
submitted for each of those two project assignments.
Review criteria
The reviewers examined student code submissions to determine the students’ coding skills. In particular, they attempted to establish whether students were
proficient in the following areas:
Programming Knowledge
• fundamental (primitive) Java data types
• conditional constructs
• looping constructs
• one & two dimensional arrays
• definition of classes, constructors
• inheritance, interfaces
Programming Skills
• ability to understand & implement small programming solutions
• ability to implement code involving classes w/ inheritance relationships provided
• significant ability to use one- and two-dimensional arrays
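The knowledge and skill areas above can be illustrated with a short Java sketch of the kind of code a CMSC 131 student might be expected to write; the Shape/Circle/Rectangle names are invented for illustration and are not drawn from the reviewed projects.

```java
// Hypothetical sketch of the assessed skills: primitive types, conditionals,
// loops, one-dimensional arrays, classes with constructors, and inheritance.
class Shape {
    private final String name;          // field initialized via a constructor
    Shape(String name) { this.name = name; }
    double area() { return 0.0; }       // overridden by subclasses
    String getName() { return name; }
}

class Circle extends Shape {            // inheritance
    private final double radius;
    Circle(double radius) { super("circle"); this.radius = radius; }
    @Override double area() { return Math.PI * radius * radius; }
}

class Rectangle extends Shape {
    private final double w, h;
    Rectangle(double w, double h) { super("rectangle"); this.w = w; this.h = h; }
    @Override double area() { return w * h; }
}

public class SkillsDemo {
    // Sum the areas in a one-dimensional array using a loop and a conditional.
    static double totalArea(Shape[] shapes) {
        double total = 0.0;
        for (int i = 0; i < shapes.length; i++) {   // looping construct
            if (shapes[i] != null) {                // conditional construct
                total += shapes[i].area();          // dynamic dispatch
            }
        }
        return total;
    }

    public static void main(String[] args) {
        Shape[] shapes = { new Circle(1.0), new Rectangle(2.0, 3.0) };
        System.out.println(totalArea(shapes));
    }
}
```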
Results
The reviewers found that students demonstrated their programming knowledge and programming ability for basic object-oriented programs.
Students were proficient in using primitive Java data types, control flow (conditional & looping constructs), 1D arrays, and classes (constructors &
methods). In addition, students were able to design classes and methods from scratch. The programming projects were of small to moderate size
(150-200 lines of code, including comments). The projects examined did not provide examples of students’ ability to use 2D arrays and inheritance.
Suggestions for improvement
Student projects were in general well designed, both educational and effective at capturing student interest. The reviewers suggest less emphasis on string
input/output (which requires attention to formatting details) where possible, and more opportunities for students to design open-ended projects (choosing
their own classes and methods).
Bachelor’s - Learning Outcome #2

Student Learning Outcome:
2. Graduates of this program will develop mathematical and analytical reasoning skills.

Assessment Measures and Criteria:
Since this program requires courses from the Math department and CMSC 250, students will develop mathematical and analytical reasoning skills. Their ability to perform mathematically rigorous proofs is important and a good way to measure their level of analytical reasoning. A small committee will review selected answers from a single final exam question from CMSC 250 to determine how well the students are able to rigorously prove a mathematical concept. The sample will be selected from those who were deemed C or better on the exam as a whole. At least 80% of the sampled students should receive a rating of very good or excellent on this one exam question.

Assessment Schedule (initial year and subsequent cycle):
Starting in the spring of 2006 and every three years after, a small committee will review a sample of these as described.
For this learning outcome, we had a two-person committee of people who are not currently teaching CMSC 250 review one exam question that is deemed to be
an exemplar of the goals of CMSC 250. The committee will look at the answers of students who received a C or better on the exam as a whole and determine how
well they understand the topic tested by that one question. Bill Gasarch and Evan Golub were selected as viable candidates for this evaluation.
==============================================================
Evaluating Mathematical and Analytical Skills
Of Computer Science Majors
Using the Middle States Criteria
By Evan Golub and Bill Gasarch
The second item in the Middle States Evaluation Criteria states: “Graduates of this program will develop mathematical and analytical reasoning skills.” The
evaluation method mandated was that we pick a sample of the students who got a C or better on the exam as a whole, pick a question on the final that we feel
tests the students’ ability to prove statements rigorously, and grade it ourselves.
We picked question 3b.
Question 3 states: For each of the following, either give a complete formal proof being sure to use good notation and reasons for all steps OR give a
counter example with justification to show that it is a valid counter example.
Question 3b states: Give a proof or a counter example for the statement:

∀n ∈ Z^{≥0},  ∑_{i=0}^{n} i·2^i = 2^{n+1}(n − 1) + 2
This question was chosen since it required a rigorous proof by induction, but was not overly cumbersome in terms of algebra. Hence, if a student got it
wrong, either they lacked the analytical reasoning skills or they lacked basic mathematical skills.
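The induction argument the graders were looking for can be sketched as follows:

```latex
\textbf{Base case } (n=0):\quad \sum_{i=0}^{0} i\cdot 2^{i} = 0 = 2^{1}(0-1)+2.

\textbf{Inductive step:} assume $\sum_{i=0}^{n} i\cdot 2^{i} = 2^{n+1}(n-1)+2$. Then
\begin{align*}
\sum_{i=0}^{n+1} i\cdot 2^{i}
  &= 2^{n+1}(n-1) + 2 + (n+1)\,2^{n+1} \\
  &= 2^{n+1}\bigl((n-1)+(n+1)\bigr) + 2
   = 2^{n+2}\,n + 2 \\
  &= 2^{(n+1)+1}\bigl((n+1)-1\bigr) + 2,
\end{align*}
which is the claimed formula with $n$ replaced by $n+1$.
```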
We picked 20 students who got a C or better on the exam. We chose the number of A’s, B’s and C’s in rough proportion to how many students got A’s,
B’s and C’s. We graded this one problem, giving 2 points for the Base Case, 6 for the Induction Hypothesis, and 10 for the induction step.
Here are the final results:
1. 15 were Excellent or Very Good. (14 points or higher)
2. 1 was Moderate (9-13 points)
3. 4 were Poor (8 points or less)
Hence, 75% were Excellent or Very Good, slightly below the 80% goal stated in the assessment criteria.
Bachelor’s - Learning Outcome #3

Student Learning Outcome:
3. Graduates will experience design and implementation of programming projects that are more similar to those that would be seen in a real world environment.

Assessment Measures and Criteria:
Several 400 level courses contain large scale programming projects each semester. For evaluation, these courses will be identified, and the project base of a sample of those identified courses and the expectations of students will be reviewed by a small committee of faculty to determine the appropriateness of the level of the project as well as the student understanding. In order to be sure that enough students are taking these identified courses so that the programming skills are emphasized in a high percentage of the graduates, data will also be collected to determine the percentage of students electing to take these courses. Some examples of courses that currently meet the criteria are CMSC 412, 414, 417, 420, 430 and 433. At least 85% of each graduating class should be electing to take 2 or more of these courses that are determined to be “project based.” The project descriptions and sample implementations will be reviewed by the committee in order to determine the quality of the code being designed and written. At least 75% of the running projects should be determined to have very good or excellent programming characteristics. The evaluations of the projects will help to guide project design decisions for later semesters to make sure students are being adequately trained for actual applications of programming in the real world, and the numbers will be used to ensure that enough students are able to take the “project based” courses.

Assessment Schedule (initial year and subsequent cycle):
Starting in the spring of 2006 and every three years after, a small committee will review a sample of these as described. The statistics about the number of majors graduating who elected to take these courses will be collected for every graduating class.
For this learning outcome, we had a two-person sub-committee review one project description and implementations done by a sample of students who passed
the course. Each time this evaluation is done, it will be performed on a different course among those that require a programming project as a significant
portion of the assessment. For this first time through, CMSC 412 will be evaluated by people who are not directly involved with CMSC 412, but who have
been involved with the course before. Udaya Shankar and Pete Keleher were selected as viable candidates for this evaluation.
===============================================================
CMSC 412 Evaluation (Keleher, Shankar)
Expectations
Students who successfully complete the course are expected to know an operating system inside out. They should be able to build an operating
system from scratch, including all the major subsystems such as process scheduling and management, virtual memory, file systems, threading, and
inter-process communication. They should also be able to dive into an existing operating system to make modifications and to add components.
Course project
The main thrust of the course is to implement an operating system, starting with a skeleton operating system (GeekOS) and adding various
subsystems to it over a series of six projects. GeekOS can run on an actual x86 machine or an emulator; the course uses an emulator (Bochs) that runs
on a Linux cluster dedicated to instructional use.
The GeekOS that the students start with implements an elementary process scheduler and device drivers for floppy and screen. The students extend
GeekOS over the course of the semester as follows:
• Project 0: Get familiar with the GeekOS development environment and the Bochs x86 simulator, and write a simple kernel-mode thread that
does some simple keyboard and terminal IO.
• Project 1: Augment GeekOS process services to include background processes, asynchronous killing of a process from another process, and
printing the status (process table) of running processes. Involves address space protection, system calls, user versus kernel memory, and
segmentation.
• Project 2: Augment GeekOS with code to handle signals, implement system calls to setup and call signals, and handle orphaned background
processes.
• Project 3: Implement multilevel feedback process scheduling in place of the default priority-based, preemptive round robin, and evaluate the
two policies. Implement semaphores and provide user programs with semaphore-manipulating system calls.
• Project 4: Augment GeekOS with virtual memory. This is done in steps: kernel memory paging, user memory paging, demand paging, and
finally virtual memory.
• Project 5: Augment GeekOS with a new file system, including support for long file names, creating directories, and mounting. This file
system resides on the second IDE disk drive in the Bochs emulator, with the first IDE drive having the default PFAT file system. A virtual
file system (VFS) layer is placed above the two file systems.
• Project 6: Augment GeekOS with inter-process communication via message passing.
For each project, students are given a base version of GeekOS that they can build on. So a student who has not successfully completed the previous
project is not penalized.
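Project 3’s multilevel feedback policy can be sketched compactly. GeekOS itself is written in C; the user-space Java model below, and every name in it, is invented purely to illustrate the scheduling idea, not taken from the course code.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

// Illustrative sketch of multilevel feedback scheduling (the policy named in
// Project 3), modeled in user space with cooperative time units.
public class MlfqSketch {
    static class Task {
        final String name;
        int remaining;   // time units of work left
        int level = 0;   // current queue level; 0 is the highest priority
        Task(String name, int remaining) { this.name = name; this.remaining = remaining; }
    }

    private final Deque<Task>[] queues;
    private final int quantum;

    @SuppressWarnings("unchecked")
    MlfqSketch(int levels, int quantum) {
        queues = new Deque[levels];
        for (int i = 0; i < levels; i++) queues[i] = new ArrayDeque<>();
        this.quantum = quantum;
    }

    void add(Task t) { queues[0].addLast(t); }

    // Run to completion; returns task names in the order they finish.
    List<String> run() {
        List<String> finished = new ArrayList<>();
        while (true) {
            Task t = null;
            for (Deque<Task> q : queues) {              // highest non-empty level wins
                if (!q.isEmpty()) { t = q.pollFirst(); break; }
            }
            if (t == null) return finished;             // nothing left to schedule
            t.remaining -= Math.min(quantum, t.remaining);
            if (t.remaining == 0) {
                finished.add(t.name);
            } else {
                // Task used its whole quantum: demote it (the "feedback" step),
                // so CPU-bound tasks sink while short tasks stay responsive.
                t.level = Math.min(t.level + 1, queues.length - 1);
                queues[t.level].addLast(t);
            }
        }
    }
}
```

With a quantum of 2, a 2-unit task added behind a 6-unit task still finishes first, because the long task is demoted after its first quantum.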
Evaluation
We examined several projects and decided to use the fifth project as our sample. Project 5 has the students implement a complete file system for a
simple operating system. The students were given a scaffolding that implements the rest of the operating system, though some of this is derived from
the students' own versions of prior projects. Students were also given a complete description of the interface between their code and the rest of the
system, which effectively defines most of the relevant types and gives prototypes for all of the main routines.
Students usually wrote about 1200 lines of C source code for this project, plus a variety of small changes elsewhere in order to get the new code to
mesh with the rest of the scaffolding correctly. We looked at three samples of running code and inspected the code for three characteristics:
• clear and well-documented code
• well-designed functions without redundancy, and
• evidence of good debugging practice (error conditions checked, assertions used).
Students excelling in all three criteria should be well prepared to design and build large-scale projects after graduation.
Two of the three projects we examined did very well on all three criteria. The functional structure was laid out well, code used more than once was
invariably hoisted into common subroutines, and the students used good debugging practice by checking all or most error conditions, as well as
defining invariants using kernel assertions. The third project was less clearly laid out, and the student did not document the project as thoroughly
as the others. The student used less sophisticated methods of debugging (inserting print statements) that would not scale well to larger or more
time-sensitive projects. Functionally, however, the three projects were equivalent: they passed all tests and detected relevant error conditions.
Suggestions for Assessment Improvement
More than three projects will need to be evaluated in the future, since the three chosen may not accurately represent the 40 people taking the class that
semester. We should review whether more emphasis should be placed on the grading of programming style (reuse of subroutines, reliability and readability)
in this course or in its prerequisite courses.
Bachelor’s - Learning Outcome #4b

Student Learning Outcome:
4. In addition to those skills mentioned above needed by all computer science graduates, students must also gain skills and knowledge in one or more of the following areas of computer science.

Assessment Measures and Criteria:
All students will have one or more of these areas of specific skill development. The statistics about the courses chosen by the graduates should be reviewed after each graduation to determine that students are developing skills for application to graduate school or the real world in more than one of these areas. All students are required to have one based on the graduation requirements; at least 90% should have gained the skills described in more than one of these areas.

Assessment Schedule (initial year and subsequent cycle):
These graduation statistics should be collected and reviewed for each graduating class to be sure the correct courses are being offered so that this goal can be met.

Student Learning Outcome:
b) Graduates will be exposed to some form of academic research or real world programming/business environments. This exposure makes their transition to either graduate school or the work environment smoother.

Assessment Measures and Criteria:
There are several courses that address this skill set: 498A (independent study), 390 (honors research), independent work with a faculty member, the software engineering corporate add-on, and the corporate scholars program all provide this exposure. The quality of the individual research done at the undergraduate level within the department will be maintained by a process of review of selected papers submitted for the honors program. This review will not be done by the student’s research supervisor, but rather by the faculty committee responsible for graduation with departmental honors. At least 90% of the papers reviewed should receive an excellent rating by this external review committee.

Assessment Schedule (initial year and subsequent cycle):
Starting in the Spring of 2006 and every third year after, the faculty committee responsible for graduation with departmental honors will review papers submitted for that purpose.
For this learning outcome, we will have the committee that oversees the Honors Program review a random sample of the projects of students who graduate with
departmental honors. This is a standing committee currently chaired by Dr. Gasarch.
Middle States Evaluation of Honors Theses
By William Gasarch and Don Perlis
Nine students graduated with honors in Spring 2006. We looked at six honors theses and independently gave them grades in the range
{0,1,2,3} for originality (ORIG), significance (SIG) and presentation (PRES). The grades are interpreted as follows:
0 = weak, 1 = good, 2 = very good, 3 = excellent
After we evaluated them separately, we added our scores together. Hence, for each of ORIG, SIG and PRES, a thesis has a score that can range
from a minimum of 0 to a maximum of 6.
1. Graphical Interface for JSHOP2 by John Shin.
ORIG: 5, SIG: 5, PRES:3
2. Magnetic Swipe Card System Security by Daniel Ramsbrock.
ORIG: 6, SIG: 5, PRES:6
3. A Survey of some recent results in Computer Graphics by Robert Patro.
ORIG: 3, SIG: 5, PRES:4
4. XMT Applications Programming: Image Registration and Computer Graphics by Bryant Lee.
ORIG: 5, SIG: 5, PRES:5
5. Annoflow- Handwritten Annotation and Proof-reading on Dynamic Digital Documents by Phil Cosby.
ORIG: 5, SIG: 5, PRES:6
6. A Computer Program to Play Minesweeper.
ORIG: 6, SIG: 4, PRES:4
(Note: Bill Gasarch was the mentor of this project, so the ratings are based on Don Perlis’s scores doubled.)
We regard a student’s honors thesis as excellent if it scores at least 5 in one category, at least 4 in another, and at least 3 in the third. Using this
measure, all of the theses were excellent. This may sound rather optimistic; however, only nine students did honors theses, and they
were some of our best students.
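The “excellent” rule above can be stated compactly in code; this small helper is only an illustration of the criterion as described, not a tool the committee used.

```java
import java.util.Arrays;

// Encodes the rating rule: sorted from highest to lowest, a thesis's three
// scores must be at least 5, 4, and 3 respectively (illustrative helper).
public class ThesisRating {
    static boolean isExcellent(int orig, int sig, int pres) {
        int[] s = { orig, sig, pres };
        Arrays.sort(s);                  // ascending: s[2] >= s[1] >= s[0]
        return s[2] >= 5 && s[1] >= 4 && s[0] >= 3;
    }
}
```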
Master’s – Learning Outcome #1
Doctorate – Learning Outcome #1

Student Learning Outcome:
1. Master’s and Ph.D. graduates will learn to design and implement computer software.

Assessment Measures and Criteria:
Students taking the 600-800 level project-based courses will have developed projects appropriate to the level for assessment. A committee of three faculty members will review project descriptions and sample student implementations of a project for a course that is identified as being heavily based on the project. This committee will determine if the level of the project is appropriate and if the students are able to complete the project with appropriate programming characteristics. At least 75% of the sample of students who have passed this graduate level course will have the project implementation rated as very good or excellent in a variety of programming characteristics.

Assessment Schedule (initial year and subsequent cycle):
Starting in the Spring of 2006 and every third year afterward, this committee will review a course that was offered in the previous semester.
For this learning outcome, we will have a single person who has taught a similar course in the past review a project and implementations. For this year, Jim
Reggia will review a project in the course being taught by Lise Getoor.
Evaluation of Learning Outcome #1 for MS/PhD Degree
Part of the Middle States Evaluation Process
J. Reggia, Professor, Computer Science Dept.
5/15/06
Learning Outcome #1: The goal of this Learning Outcome is that MS/PhD students will learn to design and implement computer software.
Assessment Measures and Criteria: Students taking the 600-800 level project-based courses will have developed projects appropriate to their level.
Project descriptions and sample student implementations of a course that is identified as being heavily based on a project will be reviewed. It will be
determined whether the level of the project is appropriate and if the students are able to complete the project with appropriate programming
characteristics. At least 75% of the sample of students who pass this course will have the project implementation rated as very good to excellent.
Course Selected for Review: CMSC 726, Machine Learning, was selected for review this semester. This course covers a broad range of machine
learning methods at the graduate level. It requires a major project of all students, involving independent research.
Methodology: All projects turned in by students taking the course were reviewed for quality independently of the course instructor. The projects
were provided by the instructor during the last week of the semester.
Evaluation criteria: Each project was reviewed and rated based on originality, content, implementation effort, and report quality.
Results:
1. Project goal: (excerpts from the instructor’s project assignment) The goal of this project is to give students experience in defining, refining and
developing an interesting research idea, hands-on experience implementing the proposed research, practice evaluating research results, and
experience writing up and explaining the research results. Ideally, the project should be of a quality such that students can submit it to a workshop or
conference. Projects can be either applications oriented, applying machine learning techniques to an interesting or novel task, or algorithmic oriented,
comparing several models or developing a new algorithm. In either case the results should be validated empirically. A 1 to 2 page proposal must be
turned in, describing the problem, the hypothesis to be tested, the experiments that will be performed, and how the results will be evaluated. The
project can be done alone or with 1 to 2 (maximum) other students. (The instructor also suggested ideas for potential projects as well as providing
pointers to past projects and to conference proceedings that students could review for ideas.)
2. Projects received: A total of 13 projects were received, reflecting reports from all 20 of the students in the class. Several projects involved two or
three students working together, but most were done by individual students working alone. All projects that were turned in were examined as part of
this evaluation.
3. Evaluation of projects: The student projects spanned a broad range of creative topics and applications. While some of the projects were
straightforward applications of supervised learning, even these often took on creative tasks. Multiple projects were done on timely topics such as
learning to detect spam in email messages, clustering of information in databases, and applications in bioinformatics. Several projects were of a
caliber that they would be appropriate to submit to workshops/conferences for possible publication.
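For a sense of scale, a “straightforward application of supervised learning” can rest on a core as small as the classic perceptron sketched below; this is generic illustrative Java and is not drawn from any student project.

```java
// Minimal perceptron classifier: the simplest supervised-learning core a
// project (e.g., spam detection over feature vectors) might build on.
// Illustrative only; not taken from any CMSC 726 student project.
public class Perceptron {
    private final double[] w;   // weights; the last entry is the bias term
    Perceptron(int features) { w = new double[features + 1]; }

    // Classify a feature vector as +1 or -1.
    int predict(double[] x) {
        double s = w[w.length - 1];                 // bias
        for (int i = 0; i < x.length; i++) s += w[i] * x[i];
        return s >= 0 ? 1 : -1;
    }

    // Standard perceptron update rule over labelled examples.
    void train(double[][] xs, int[] ys, int epochs, double lr) {
        for (int e = 0; e < epochs; e++)
            for (int n = 0; n < xs.length; n++)
                if (predict(xs[n]) != ys[n]) {      // update only on mistakes
                    for (int i = 0; i < xs[n].length; i++)
                        w[i] += lr * ys[n] * xs[n][i];
                    w[w.length - 1] += lr * ys[n];
                }
    }
}
```

For linearly separable data the update rule is guaranteed to converge, which is why it suits a toy task like AND-style labels.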
Comments and Conclusions
1. The level of this project was appropriate to the level of the students, and it clearly met the goal of giving the graduate students in this course
experience with implementing substantial advanced software. The students had learned a lot about machine learning that they could bring to bear in
the software development.
2. Independent review of these projects ranks them all as at least very good, with several being clearly excellent in terms of creativity, technical
execution, and innovation. Several were judged as potentially publishable research results.
3. Not only was learning objective #1 met, but it was clearly exceeded in that in addition to obtaining software development experience, the students
also gained experience formulating and executing a research study, analyzing the results, and writing up their conclusions. This is a very valuable
exercise for this level of students.
Doctorate – Learning Outcomes #3 and #4
Student Learning Outcome:
3. Ph.D. graduates will complete extensive independent research and have at least one paper accepted by a conference or published in a refereed journal.

Assessment Measures and Criteria:
The research component of the Ph.D. requirement will be evaluated at two different periods of time – both after 3 years of study and at graduation. At least 75% of the students graduating in a given year will have published at least one paper. This information can be gathered during the graduation survey and interview already conducted by the graduate office and the associate chair of the CMSC graduate school. At least 75% of the students completing their third year of study toward a Ph.D. will have submitted at least one paper to a conference or refereed journal. This information will be gathered on the form used for the graduate student review day, which is held during the spring semester.

Assessment Schedule (initial year and subsequent cycle):
Starting in the spring of 2006 and every year after, the CMSC graduate office and the associate chair of graduate studies will compile and present these statistics to the chair for review.

Student Learning Outcome:
4. Ph.D. graduates will have their research accepted for presentation at subject area conferences and will have presented that research themselves.

Assessment Measures and Criteria:
At least half of the students graduating in a given year will have presented their research at at least one conference by the time of their graduation. This information can be gathered during the graduation survey and interview already conducted by the CMSC graduate office and the associate chair of the CMSC graduate school.

Assessment Schedule (initial year and subsequent cycle):
Starting in the spring of 2006 and every year after, the CMSC graduate office and the associate chair of graduate studies will compile and present these statistics to the chair for review.
These are both statistics that will need to be collected by the graduate chair, Samir Khuller, from the graduate office survey and from the exit
interviews.
Report of the Ph.D. Learning Outcomes #3 and #4
For the graduation statistics, the data was collected both on a survey given to all students as they apply for graduation and from an exit
interview requested of all people who would be graduating in the next year. Since neither the surveys nor the interviews are required in any way,
several students did neither. The graduate program coordinator then attempted to fill in the data for the missing students by using their web pages
and/or vitae when she had access to the information there. This method of data collection proved not to be as accurate as we would like, so next
year this information will be gathered through additional questions on the form students must submit to the department in order to apply for
graduation. That method of data collection should be significantly more reliable.
The information in the following paragraphs is based on the data we were able to collect by the first method. For the 2005-2006 school
year, there were a total of 34 Ph.D. graduates from the computer science department. They were divided into 11 for the summer, 10 for the winter
and 13 for the spring graduation ceremonies.
For published journal articles, there were a total of 26 students who had already had one or more articles published in a journal before the
time of their graduation. These were divided as 8 for the summer graduates, 9 for the winter and 9 for the spring. This gives an overall percentage of
76.5%, slightly above the Middle States goal of 75% as stated above. The winter graduates were well over the goal rate at 90%, the summer
graduates were slightly under at 72.7%, and the spring graduates were under at 69.2%.
For presentation at conferences, there were a total of 28 graduates who had already presented at one or more conferences. These were divided
as 10 for the summer graduates, 10 for the winter graduates and 8 for the spring graduates. This gives an overall percentage of 82.35%, which is well
over the goal of at least half as stated above. Each of the three groups was over the goal, with the summer at 90.91%, the winter at 100% and the
spring at 61.54%.
For the mid-program evaluation, the data was collected as part of the graduate student review process done each April. All current graduate
students are evaluated by a committee to determine if they are making adequate progress toward graduation. The first step of this process is a web
survey that all Computer Science graduate students must complete. If the students do not appear to be making progress toward graduation from their
answers on that survey, they are then interviewed further to determine the proper corrective measures. The data is then entered into a database, and
we were able to query for just the information about the students who met our criteria. The goal was to have publications in peer-reviewed journals
by at least 75% of those who had completed their third year of graduate study in the computer science department. We found that there were 29
students who had completed their third year and that 20 of them already had articles accepted at peer-reviewed journals, a rate of 69%. We did not
have data to determine how many had submitted to a peer-reviewed journal. The assessment should probably be changed to something that is more
easily measured, such as having an article accepted for publication in a peer-reviewed journal.
Suggestions for Improvement
The method of collecting data from people who are graduating will be changed. The method used was not on any required form, and so
there was quite a bit of missing data that was difficult to recover. The new method will be to add the questions to the form that students must fill
out to apply for graduation. Since all students submit this form and are unlikely to leave these small questions blank, the data collection will be
more accurate when this evaluation is repeated for 2007.