Middle States Assessment Report
                                                    Computer Science Department
                                          Report of Assessments Scheduled for Spring, 2006
        The Middle States evaluators will be visiting the University of Maryland in March 2007. The evaluations scheduled for Spring 2006 have been completed.
Small committees were appointed by the department chair for each of the assessments scheduled for completion, and reports have been submitted.
        The committees were created with several criteria in mind. The members of a committee could not be associated that semester with the course being
evaluated, but they were chosen from people who did have more intimate knowledge of the course, such as faculty who had taught that course in previous
semesters, faculty who taught the course or courses that follow it in the sequence, and/or faculty who teach similar material at a different level. The teams
were also created to have a mixture of faculty from different levels. These criteria were followed to maximize the range of viewpoints represented during the
evaluation and to maximize the involvement of faculty members in this process.
        Each report looks at a sample of student work in order to determine whether the specified learning outcome is being reached at the level of the goal
set in the assessment proposal. Each report submitted by a committee is 1-3 pages summarizing: what was being reviewed, how the review was conducted
(sampling, etc.), the criteria used for the review, the results, and suggestions for improvement.
Bachelor’s - Learning Outcome #1


Student Learning Outcomes (list the three-to-five most important):

1. Graduates will be able to create, augment, debug and test computer software. These skills will be built progressively through the courses in the
introductory sequence.

Assessment Measures and Criteria (describe one or more measures for each outcome and criteria for success):

Projects will be identified from the introductory sequence of courses (CMSC 131, 132 & 212) that emphasize each of these skills. These project descriptions
and sample student implementations will be evaluated to determine the extent to which each of these skills and the expectations build through the sequence.
Sample student implementations will be reviewed to determine the extent to which students are meeting the expectations of each skill at each level. This
evaluation will be done by the "Introductory programming oversight committee," which is already a standing committee of people with experience teaching at
this level. The projects should have expectations that build in a manner appropriate to a three-course sequence, and at least 75% of the sampled students'
projects should be deemed to be working at the level of that expectation.

Assessment Schedule (initial year, and subsequent cycle):

Starting in the spring of 2006, these project descriptions and sample solutions will be reviewed for coding skills. In spring of 2008 they will be reviewed
for debugging and testing. These two reviews will then proceed on an every-other-year cycle.

For this learning outcome, we had a two-person sub-committee of the introductory programming committee review one or two project descriptions and
implementations done by a sample of students who passed the course. Each time this evaluation is done, it will be performed on a different course in the
introductory sequence. For this first time through, CMSC 131 will be evaluated by people who are not directly involved with CMSC 131, but who receive the
students who pass 131 by teaching 132. Nelson Padua-Perez and Chau-Wen Tseng were selected as viable candidates for this evaluation.

============================================================
Middle States Evaluation of CMSC 131, Spring 2006

Reviewers
Nelson Padua-Perez, Chau-Wen Tseng

Reviewed
The reviewers reviewed CMSC 131, Object-Oriented Programming I. It was taught in Spring 2006 by Fawzi Emad. The goal of the course is to help
students develop skills such as program design and testing, as well as the implementation of programs using a graphical IDE. All programming is
performed in Java.

Review methodology
The reviewers obtained examples of student code for two programming projects, “Company Database” and “Shape Decorator.” The reviewers
examined the project descriptions, student code, and other course materials. Six student implementations were selected at random from all projects
submitted for each of those two project assignments.

Review criteria
The reviewers examined student code submissions to determine their coding skills. In particular, they attempted to establish whether students were
proficient in the following areas:

Programming Knowledge

   •   fundamental (primitive) Java data types
   •   conditional constructs
   •   looping constructs
   •   one & two dimensional arrays
   •   definition of classes, constructors
   •   inheritance, interfaces

Programming Skills

   •   ability to understand & implement small programming solutions
   •   ability to implement code involving classes with inheritance relationships provided
   •   significant ability to use one- and two-dimensional arrays
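
As a point of reference for these criteria, the kind of Java code the checklist targets can be sketched as follows. This is a hypothetical illustration
written for this report, not actual CMSC 131 student code, and the class names are invented; it touches primitives, conditionals, loops, 1D arrays,
constructors, inheritance, and interfaces.

```java
// Interfaces and inheritance: Square "is a" Rectangle, and both are Shapes.
interface Shape {
    double area();
}

class Rectangle implements Shape {
    private double width, height;   // primitive fields set by a constructor

    Rectangle(double width, double height) {
        this.width = width;
        this.height = height;
    }

    public double area() {
        return width * height;
    }
}

class Square extends Rectangle {    // inheritance: reuse the Rectangle constructor
    Square(double side) {
        super(side, side);
    }
}

public class ShapeDemo {
    // Sum the areas in a one-dimensional array using a loop and a conditional.
    static double totalArea(Shape[] shapes) {
        double total = 0.0;
        for (int i = 0; i < shapes.length; i++) {
            if (shapes[i] != null) {
                total += shapes[i].area();
            }
        }
        return total;
    }

    public static void main(String[] args) {
        Shape[] shapes = { new Rectangle(2, 3), new Square(4), null };
        System.out.println(totalArea(shapes)); // 6 + 16 = 22.0
    }
}
```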
Results
The reviewers found that students demonstrated their programming knowledge and programming ability for basic object-oriented programs.
Students were proficient in using primitive Java data types, control flow (conditional & looping constructs), 1D arrays, and classes (constructors &
methods). In addition, students were able to design classes and methods from scratch. The programming projects were of small to moderate size
(150-200 lines of code, including comments). The projects examined did not provide examples of students' ability to use 2D arrays and inheritance.

Suggestions for improvement
Student projects were in general well designed, both to be educational and to capture student interest. The reviewers suggest less emphasis on string
input/output (which requires concern about formatting details) where possible, and more opportunities for students to design open-ended projects (choosing
their own classes and methods).
Bachelor’s - Learning Outcome #2


Student Learning Outcome:

2. Graduates of this program will develop mathematical and analytical reasoning skills.

Assessment Measures and Criteria:

Since this program requires courses from the Math department and CMSC 250, students will develop mathematical and analytic reasoning skills. Their ability
to perform mathematically rigorous proofs is important and a good way to measure their level of analytical reasoning. A small committee will review selected
answers from a single final exam question from CMSC 250 to determine how well the students are able to rigorously prove a mathematical concept. The sample
will be selected from those who were deemed as C or better on the exam as a whole. At least 80% of the students in the course should receive a rating of
very good or excellent on this one exam question.

Assessment Schedule:

Starting in the spring of 2006 and every three years after, a small committee will review a sample of these as described.

For this learning outcome, we had a two-person committee of people who are not currently teaching CMSC 250 review one exam question that is deemed an
exemplar of the goals of CMSC 250. The committee will look at the answers of people who received a C or better on the exam as a whole and determine how
well they understand the topic tested by that one question. Bill Gasarch and Evan Golub were selected as viable candidates for this evaluation.

==============================================================
Evaluating Mathematical and Analytical Skills
Of Computer Science Majors
Using the Middle States Criteria
By Evan Golub and Bill Gasarch

         The second item in the Middle States Evaluation Criteria states: "Graduates of this program will develop mathematical and analytical reasoning skills."
The evaluation method mandated was that we pick a sample of the students who got a C or better on the exam as a whole, pick a question on the final that we
feel tests the students' ability to prove statements rigorously, and grade it ourselves.
         We picked question 3b.
         Question 3 states: For each of the following, either give a complete formal proof, being sure to use good notation and reasons for all steps, OR give
a counterexample with justification to show that it is a valid counterexample.
         Question 3b states: Give a proof or a counterexample for the statement:

             ∀n ∈ Z≥0:  Σ_{i=0}^{n} i·2^i = 2^{n+1}(n − 1) + 2


        This question was chosen since it required a rigorous proof by induction, but was not overly cumbersome in terms of algebra. Hence, if a student got it
wrong, either they lacked the analytical reasoning skills or they lacked basic mathematical skills.
        We picked 20 students who got a C or better on the exam. We chose the number of A's, B's, and C's in rough proportion to how many students got A's,
B's, and C's. We graded this one problem, giving 2 points for the Base Case, 6 for the Induction Hypothesis, and 10 for the Induction Step.
        Here are the final results:
            1. 15 were Excellent or Very Good. (14 points or higher)
            2. 1 was Moderate (9-13 points)
            3. 4 were Poor (8 points or less)
        Hence, 75% were Excellent or Very Good.
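
For reference, the induction argument the question calls for can be written out as follows. This is a standard proof of the identity, reconstructed here for
the record rather than drawn from any student's exam script.

```latex
\textbf{Claim.} $\forall n \in \mathbb{Z}_{\ge 0}:\ \sum_{i=0}^{n} i \cdot 2^{i} = 2^{n+1}(n-1) + 2$.

\textbf{Base case} ($n = 0$): the left side is $0 \cdot 2^{0} = 0$, and the right side is
$2^{1}(0-1) + 2 = -2 + 2 = 0$.

\textbf{Induction hypothesis:} assume $\sum_{i=0}^{k} i \cdot 2^{i} = 2^{k+1}(k-1) + 2$ for some $k \ge 0$.

\textbf{Induction step:}
\begin{align*}
\sum_{i=0}^{k+1} i \cdot 2^{i}
  &= \sum_{i=0}^{k} i \cdot 2^{i} + (k+1)2^{k+1} \\
  &= 2^{k+1}(k-1) + 2 + (k+1)2^{k+1} \\
  &= 2^{k+1}\bigl((k-1) + (k+1)\bigr) + 2 \\
  &= 2^{k+2}\,k + 2,
\end{align*}
which equals $2^{(k+1)+1}\bigl((k+1)-1\bigr) + 2$, as required. $\blacksquare$
```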
Bachelor’s - Learning Outcome #3


Student Learning Outcome:

3. Graduates will experience design and implementation of programming projects that are more similar to those that would be seen in a real-world
environment.

Assessment Measures and Criteria:

Several 400-level courses contain large-scale programming projects each semester. For evaluation, these courses will be identified, and the project base of
a sample of those identified courses and the expectations of students will be reviewed by a small committee of faculty to determine the appropriateness of
the level of the project as well as the student understanding. In order to be sure that enough students are taking these identified courses so that the
programming skills are emphasized in a high percentage of the graduates, data will also be collected to determine the percentage of students electing to
take these courses. Some examples of courses that currently meet the criteria are CMSC 412, 414, 417, 420, 430 and 433. At least 85% of each graduating
class should be electing to take 2 or more of these courses that are determined to be "project based." The project descriptions and sample implementations
will be reviewed by the committee in order to determine the quality of the code being designed and written. At least 75% of the running projects should be
determined to have very good or excellent programming characteristics. The evaluations of the projects will help to guide project design decisions for later
semesters to make sure students are being adequately trained for actual applications of programming in the real world, and the numbers will be used to
ensure that enough students are able to take the "project based" courses.

Assessment Schedule:

Starting in the spring of 2006 and every three years after, a small committee will review a sample of these as described. The statistics about the number of
majors graduating who elected to take these courses will be collected for every graduating class.

For this learning outcome, we had a two-person sub-committee review one project description and the implementations done by a sample of students who passed
the course. Each time this evaluation is done, it will be performed on a different course in the sequence that requires a programming project as a significant
portion of the assessment. For this first time through, CMSC 412 will be evaluated by people who are not directly involved with CMSC 412, but who have been
involved with the course before. Udaya Shankar and Pete Keleher were selected as viable candidates for this evaluation.

===============================================================

CMSC 412 Evaluation (Keleher, Shankar)

Expectations
Students who successfully complete the course are expected to know an operating system inside out. They should be able to build an operating
system from scratch, including all the major subsystems such as process scheduling and management, virtual memory, file systems, threading, and
inter-process communication. They should also be able to dive into an existing operating system to make modifications and to add components.
Course project
The main thrust of the course is to implement an operating system, starting with a skeleton operating system (GeekOS) and adding various
subsystems to it over a series of six projects. GeekOS can run on an actual x86 machine or an emulator; the course uses an emulator (Bochs) that runs
on a Linux cluster dedicated to instructional use.

The GeekOS that the students start with implements an elementary process scheduler and device drivers for floppy and screen. The students extend
GeekOS over the course of the semester as follows:

   •   Project 0: Get familiar with the GeekOS development environment and the Bochs x86 simulator, and write a simple kernel-mode thread that
       does some simple keyboard and terminal I/O.
   •   Project 1: Augment GeekOS process services to include background processes, asynchronous killing of a process from another process, and
       printing the status (process table) of running processes. Involves address space protection, system calls, user versus kernel memory, and
       segmentation.
   •   Project 2: Augment GeekOS with code to handle signals, implement system calls to setup and call signals, and handle orphaned background
       processes.
   •   Project 3: Implement multilevel feedback process scheduling in place of the default priority-based, preemptive round robin, and evaluate the
       two policies. Implement semaphores and provide user programs with semaphore-manipulating system calls.
   •   Project 4: Augment GeekOS with virtual memory. This is done in steps: kernel memory paging, user memory paging, demand paging, and
       finally virtual memory.
   •   Project 5: Augment GeekOS with a new file system, including support for long file names, creating directories, and mounting. This file
       system resides on the second IDE disk drive in the Bochs emulator, with the first IDE drive having the default PFAT file system. A virtual
       file system (VFS) layer is placed above the two file systems.
   •   Project 6: Augment GeekOS with inter-process communication via message passing.

For each project, students are given a base version of GeekOS that they can build on. So a student who has not successfully completed the previous
project is not penalized.

Evaluation
We examined several projects and decided to use the fifth project as our sample. Project 5 has the students implement a complete file system for a
simple operating system. The students were given a scaffolding that implements the rest of the operating system, though some of this is derived from
the students' own versions of prior projects. Students were also given a complete description of the interface between their code and the rest of the
system, which effectively defines most of the relevant types and gives prototypes for all of the main routines.

Students usually wrote about 1200 lines of C source code for this project, plus a variety of small changes elsewhere in order to get the new code to
mesh with the rest of the scaffolding correctly. We looked at three samples of running code and inspected the code for three characteristics:
   • clear and well-documented code
   • well-designed functions without redundancy, and
   • evidence of good debugging practice (error conditions checked, assertions used).
Students excelling in all three criteria should be well-prepared to design and build large-scale projects after graduation.
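
The debugging practices named in the third criterion are language-independent; a minimal sketch of what "error conditions checked, assertions used" looks
like in code is given below. It is written in Java for brevity (the GeekOS projects themselves are in C), and the class and method names are entirely
hypothetical, not taken from any student submission.

```java
// Hypothetical illustration of the reviewed debugging practices:
// validate error conditions eagerly and assert invariants instead of
// relying on scattered print statements.
public class BlockCache {
    private final byte[][] blocks;

    public BlockCache(int numBlocks, int blockSize) {
        // Error condition checked up front rather than failing later.
        if (numBlocks <= 0 || blockSize <= 0) {
            throw new IllegalArgumentException("sizes must be positive");
        }
        blocks = new byte[numBlocks][blockSize];
    }

    // Read one cached block; callers receive a defensive copy.
    public byte[] read(int blockNum) {
        if (blockNum < 0 || blockNum >= blocks.length) {
            throw new IndexOutOfBoundsException("bad block number: " + blockNum);
        }
        byte[] copy = blocks[blockNum].clone();
        // Invariant stated as an assertion (run with -ea to enable).
        assert copy.length == blocks[blockNum].length : "copy lost data";
        return copy;
    }
}
```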

Two of the three projects we examined did very well on all three criteria. The functional structure was laid out well, code used more than once was
invariably hoisted into common subroutines, and the students used good debugging practice by checking all or most error conditions, as well as
defining invariants using kernel assertions. The third project was less clearly laid out, and the student did not document the project as thoroughly
as the others. That student also used less sophisticated methods of debugging (inserting print statements) that would not scale well to larger or more
time-sensitive projects. Functionally, however, the three projects were equivalent: they passed all tests and detected relevant error conditions.

Suggestions for Assessment Improvement
More than three projects will need to be evaluated in the future, since the three chosen may not accurately represent the 40 people taking the class that
semester. We should review whether more emphasis should be placed on the grading of programming style (reuse of subroutines, reliability, and readability)
in this course or in the courses that are prerequisites to this course.
Bachelor’s - Learning Outcome #4b


Student Learning Outcome:

4. In addition to those skills mentioned above needed by all computer science graduates, the students must also gain skills and knowledge in one or more
of the following areas of computer science.

Assessment Measures and Criteria:

All students will have one or more of these areas of specific skill development. The statistics about the courses chosen by the graduates should be
reviewed after each graduation to determine that students are developing skills for application to graduate school or the real world in more than one of
these areas. All students are required to have one based on the graduation requirements; there should be at least 90% that have gained the skills described
in more than one of these areas.

Assessment Schedule:

These graduation statistics should be collected and reviewed for each graduating class to be sure the correct courses are being offered so that this goal
can be met.

Student Learning Outcome:

b) Graduates will be exposed to some form of academic research or real-world programming/business environments. This exposure makes their transition to
either graduate school or the work environment smoother.

Assessment Measures and Criteria:

There are several courses that address this skill set. Some examples are 498A (independent study), 390 (honors research), independent work with a faculty
member, the software engineering corporate add-on, and the corporate scholars program; all provide this exposure. The quality of the individual research
done at the undergraduate level within the department will be maintained by a process of review of selected papers submitted for the honors program. This
review will not be done by the student's research supervisor, but rather by the faculty committee responsible for graduation with departmental honors. At
least 90% of the papers reviewed should receive an excellent rating by this external review committee.

Assessment Schedule:

Starting in the Spring of 2006 and for every third year after, the faculty committee responsible for graduation with departmental honors will review papers
submitted for that purpose.

For this learning outcome, we will have the committee that oversees the Honors Program review a random sample of the projects of students who graduate with
departmental honors. This is a standing committee currently chaired by Dr. Gasarch.

Middle States Evaluation of Honors Thesis
By William Gasarch and Don Perlis

        Nine students graduated with honors in Spring 2006. We looked at six honors theses and independently gave them grades in the range {0,1,2,3}
for originality (ORIG), significance (SIG), and presentation (PRES). The grades are interpreted as follows: 0 = weak, 1 = good, 2 = very good, 3 = excellent.
        After we evaluated them separately, we added our scores together. Hence, for each of ORIG, SIG, and PRES, a thesis has a score that can range
from 0 to 6.

   1. Graphical Interface for JSHOP2 by John Shin.
      ORIG: 5, SIG: 5, PRES:3

   2. Magnetic Swipe Card System Security by Daniel Ramsbrock.
      ORIG: 6, SIG: 5, PRES:6

   3. A Survey of some recent results in Computer Graphics by Robert Patro.
      ORIG: 3, SIG: 5, PRES:4

   4. XMT Applications Programming: Image Registration and Computer Graphics by Bryant Lee.
      ORIG: 5, SIG: 5, PRES:5

   5. Annoflow- Handwritten Annotation and Proof-reading on Dynamic Digital Documents by Phil Cosby.
      ORIG: 5, SIG: 5, PRES:6

   6. A Computer Program to Play Minesweeper.
      ORIG: 6, SIG: 4, PRES:4
      (Note: Bill Gasarch was the mentor for this project, so the ratings are Don Perlis's scores doubled.)

      We regard a student's honors thesis as excellent if it scores at least 5 in one category, at least 4 in another, and at least 3 in the third. Using
 this measure, all of the theses were excellent. This may sound rather optimistic; however, only nine students did honors theses, and they were some of our
 best students.
Master’s – Learning Outcome #1
Doctorate – Learning Outcome #1

Student Learning Outcome:

1. Master's and Ph.D. graduates will learn to design and implement computer software.

Assessment Measures and Criteria:

Students taking the 600-800 level project-based courses will have developed projects appropriate to the level for assessment. A committee of three faculty
members will review project descriptions and sample student implementations of a project for a course that is identified as being heavily based on the
project. This committee will determine if the level of the project is appropriate and if the students are able to complete the project with appropriate
programming characteristics. At least 75% of the sample of students who have passed this graduate-level course will have the project implementation rated
as very good or excellent in a variety of programming characteristics.

Assessment Schedule:

Starting in the Spring of 2006 and every third year afterward, this committee will review a course that was offered in the previous semester.

For this learning outcome, we will have a single person who has taught a similar course in the past review a project and its implementations. For this year,
Jim Reggia will review a project in the course being taught by Lise Getoor.


                                           Evaluation of Learning Outcome #1 for MS/PhD Degree
                                                         Part of the Middle States Evaluation Process

                                                        J. Reggia, Professor, Computer Science Dept.

                                                                             5/15/06

Learning Outcome #1: The goal of this Learning Outcome is that MS/PhD students will learn to design and implement computer software.

Assessment Measures and Criteria: Students taking the 600-800 level project-based courses will have developed projects appropriate to their level.
Project descriptions and sample student implementations of a course that is identified as being heavily based on a project will be reviewed. It will be
determined whether the level of the project is appropriate and if the students are able to complete the project with appropriate programming
characteristics. At least 75% of the sample of students who pass this course will have the project implementation rated as very good to excellent.

Course Selected for Review: CMSC 726, Machine Learning, was selected for review this semester. This course covers a broad range of machine learning
methods at a graduate level. It typically requires of all students a major project involving independent research.

Methodology: All projects turned in by students taking the course were reviewed here for quality, independent of the course instructor. These projects
were provided by the instructor during the last week of the semester.

Evaluation criteria: Each project was reviewed and rated based on originality, content, implementation effort, and report quality.

Results:

1. Project goal: (excerpts from the instructor’s project assignment) The goal of this project is to give students experience in defining, refining and
developing an interesting research idea, hands-on experience implementing the proposed research, practice evaluating research results, and
experience writing up and explaining the research results. Ideally, the project should be of a quality such that students can submit it to a workshop or
conference. Projects can be either applications oriented, applying machine learning techniques to an interesting or novel task, or algorithmic oriented,
comparing several models or developing a new algorithm. In either case the results should be validated empirically. A 1 to 2 page proposal must be
turned in, describing the problem, the hypothesis to be tested, the experiments that will be performed, and how the results will be evaluated. The
project can be done alone or with 1 to 2 (maximum) other students. (The instructor also suggested ideas for potential projects as well as providing
pointers to past projects and to conference proceedings that students could review for ideas.)

2. Projects received: A total of 13 projects were received, reflecting reports from all 20 of the students in the class. Several projects involved two or
three students working together, but most were done by individual students working alone. All projects that were turned in were examined as part of
this evaluation.

3. Evaluation of projects: The student projects spanned a broad range of creative topics and applications. While some of the projects were
straightforward applications of supervised learning, even these often took on creative tasks. Multiple projects were done on timely topics such as
learning to detect spam in email messages, clustering of information in databases, and applications in bioinformatics. Several projects were of a
caliber that they would be appropriate to submit to workshops/conferences for possible publication.

Comments and Conclusions

1. The level of this project was appropriate to the level of the students, and it clearly met the goal of giving the graduate students in this course
experience with implementing substantial advanced software. The students had learned a lot about machine learning that they could bring to bear in
the software development.

2. Independent review of these projects ranks them all as at least very good, with several being clearly excellent in terms of creativity, technical
execution, and innovation. Several were judged as potentially publishable research results.

3. Not only was learning objective #1 met, but it was clearly exceeded in that in addition to obtaining software development experience, the students
also gained experience formulating and executing a research study, analyzing the results, and writing up their conclusions. This is a very valuable
exercise for this level of students.
Doctorate – Learning Outcomes #3 and #4

Student Learning Outcomes                    Assessment Measures and Criteria                                          Assessment Schedule
(list the three-to-five most important)      (describe one or more measures for each outcome and criteria for          (initial year, and
                                             success)                                                                  subsequent cycle)
3. Ph.D. graduates will complete             The research component of the Ph.D. requirement will be evaluated at      Starting in the spring of
extensive independent research and have      two different periods of time – both after 3 years of study and at        2006 and every year after
at least one paper accepted by a             graduation. At least 75% of the students graduating in a given year       the CMSC graduate office
conference or published in a refereed        will have published at least one paper. This information can be           and the associate chair of
journal.                                     gathered during the graduation survey and interview already                    graduate studies will
                                             conducted by the graduate office and the associate chair of the CMSC      compile and present these
                                             graduate school. At least 75% of the students completing their third      statistics to the chair for
                                             year of study toward a Ph.D. will have submitted at least one paper to    review.
                                             a conference or refereed journal. This information will be gathered on
                                             the form used for the graduate student review day which is held during
                                             the spring semester.

4. Ph.D. graduates will have their           At least half of the students graduating in a given year will have        Starting in the spring of
research accepted for presentation at        presented their research during at least one conference by the time of    2006 and every year after
subject area conferences and the Ph.D.       their graduation. This information can be gathered during the             the CMSC graduate office
graduate will have made the presentation     graduation survey and interview already conducted by the CMSC             and the associate chair of
of their research material.                  graduate office and the associate chair of the CMSC graduate school.      graduate studies will
                                                                                                                       compile and present these
                                                                                                                       statistics to the chair for
                                                                                                                       review.




Both of these statistics will need to be collected by the graduate chair, Samir Khuller, from the graduate office survey and from the exit
interviews.

Report of the Ph.D. Learning Outcomes #3 and #4

        For the graduation statistics, the data was collected both from a survey given to all students as they apply for graduation and from an exit
interview requested of everyone expecting to graduate in the next year. Since neither the survey nor the interview is required in any way, several
students completed neither. The graduate program coordinator then attempted to fill in the data for the missing students from their web pages and/
or vitae when she had access to that information. This method of data collection proved less accurate than we would like, so next year these items
will be added as questions on the form students must submit to the department in order to apply for graduation. That method of data collection
should be significantly more reliable.

        The information in the following paragraphs is based on the data we were able to collect using the first method. For the 2005-2006 school
year, there were a total of 34 Ph.D. graduates from the computer science department: 11 in the summer, 10 in the winter and 13 in the spring
graduation ceremonies.

        For published journal articles, a total of 26 students had one or more articles published in a journal by the time of their graduation: 8 of
the summer graduates, 9 of the winter and 9 of the spring. This gives an overall rate of 76.5%, slightly above the Middle States goal of 75% as
stated above. The winter graduates were well over the goal at 90%, the summer graduates were slightly under at 72.7%, and the spring graduates
were under at 69.2%.

        For presentation at conferences, a total of 28 graduates had already presented at one or more conferences: 10 of the summer graduates, 10
of the winter and 8 of the spring. This gives an overall rate of 82.4%, which is well over the goal of at least half as stated above. Each of the
three groups exceeded the goal, with the summer at 90.9%, the winter at 100% and the spring at 61.5%.
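As a cross-check, the graduation-rate percentages can be recomputed directly from the cohort counts reported above. The counts come from the report; the small `rate` helper is only illustrative:

```python
# Recompute the 2005-2006 Ph.D. graduation statistics from the reported
# counts. This is a verification sketch, not part of the assessment itself.

def rate(met: int, total: int) -> float:
    """Percentage of a cohort meeting a criterion, rounded to one decimal."""
    return round(100 * met / total, 1)

# (students meeting the criterion, cohort size), per graduation ceremony
journal_pubs = {"summer": (8, 11), "winter": (9, 10), "spring": (9, 13)}
conf_talks = {"summer": (10, 11), "winter": (10, 10), "spring": (8, 13)}

for label, cohorts, goal in [("journal publication", journal_pubs, 75.0),
                             ("conference presentation", conf_talks, 50.0)]:
    met = sum(m for m, _ in cohorts.values())
    total = sum(t for _, t in cohorts.values())
    print(f"{label}: {met}/{total} = {rate(met, total)}% (goal {goal}%)")
    for season, (m, t) in cohorts.items():
        print(f"  {season}: {rate(m, t)}%")
```

Run as-is, this gives an overall journal-publication rate of 76.5% (26/34) and a presentation rate of 82.4% (28/34); where these differ from percentages quoted elsewhere in this report, the quoted figures appear to contain transcription errors.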

        For the mid-program evaluation, the data was collected as part of the graduate student review process conducted each April. All current
graduate students are evaluated by a committee to determine whether they are making adequate progress toward graduation. The first step of this
process is a web survey that all Computer Science graduate students must complete. Students whose answers suggest they are not making adequate
progress are then interviewed further to determine the proper corrective measures. The data is entered into a database, which we queried for just
the students who met our criteria. The goal was for at least 75% of those who had completed their third year of graduate study in the computer
science department to have submitted at least one paper to a conference or refereed journal. We found 29 students who had completed their third
year, of whom 20 already had articles accepted at peer-reviewed journals, a publication rate of 69%. We did not have data to determine how many
had submitted to a peer-reviewed journal. The assessment should probably be changed to something more easily measured, such as having an
article accepted for publication in a peer-reviewed journal.
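The mid-program tally described above amounts to a simple threshold check over the review-day database. A sketch of that check follows; the record layout is hypothetical, and the 20-of-29 figures are those reported:

```python
# Sketch of the mid-program check: tally how many students past their
# third year have an accepted peer-reviewed article, and compare the
# rate against the 75% goal. The record format here is hypothetical;
# the counts (20 of 29) are taken from the report.

third_year = [("s%02d" % i, i < 20) for i in range(29)]  # (id, accepted?)

accepted = sum(1 for _, ok in third_year if ok)
total = len(third_year)
pct = 100 * accepted / total
print(f"{accepted}/{total} = {pct:.0f}% (goal: 75%)")  # prints "20/29 = 69% (goal: 75%)"
assert pct < 75  # this cohort fell short of the stated goal
```

In practice the real input would be the query result from the review-day database rather than a hard-coded list, but the comparison against the goal is the same.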

Suggestions for Improvement
        The method of collecting data from students who are graduating will be changed. The data was previously gathered through an optional
survey and interview rather than any required form, so there was quite a bit of missing data that was difficult to recover. The new method will be
to add these questions to the form that students must fill out to apply for graduation. Since all students submit this form and are unlikely to
leave these short questions blank, the data collection should be more accurate when this evaluation is repeated for 2007.

MSWord

  • 1. Middle States Assessment Report Computer Science Department Report of Assessments Scheduled for Spring, 2006 The Middle States Evaluators will be visiting the University of Maryland in March 2007. The evaluations scheduled for Spring 2006 have been completed. Small committees were appointed by the department chair to each of the assessments scheduled for completion, and reports have been submitted. The committees were created with several criteria in mind. The members of the committee could not be associated that semester with the course being evaluated. The committee members were chosen from people who did have more intimate knowledge of the course such as faculty who had taught that course in previous semesters, faculty who taught the course or courses that follow it in the sequence and/or faculty who teach similar material on a different level. The teams were also created to have a mixture of faculty from different levels. These criteria were followed to maximize differences of faculty so that all viewpoints would be represented during the evaluation and to maximize the involvement of faculty members in this process. Each report looks at a sample of student work in order to determine if the learning outcome specified is being reached to the level of the goal also specified in the assessment proposal. Each report submitted by a committee is 1-3 pages summarizing: what was being reviewed, how the review was conducted (sampling etc), the criteria used for the review, the results, and suggestions for improvement.
  • 2.
  • 3. Bachelor’s - Learning Outcome #1 Student Learning Outcomes Assessment Measures and Criteria Assessment Schedule (list the three-to-five most important) (describe one or more measures for each outcome and criteria for (initial year, and success) subsequent cycle) 1. Graduates will be able to create, augment, Projects will be identified from the introductory sequence of courses Starting in the spring of debug and test computer software. These skills (CMSC 131, 132 & 212) that emphasize each of these skills. These 2006 these project will be built progressively through the courses in project descriptions and sample student implementations will be descriptions and sample the introductory sequence of courses. evaluated to determine the extent to which each of these skills and solutions will be reviewed the expectations build through the sequence. Sample student for coding skills. In spring implementations will be reviewed to determine the extent to which of 2008 they will be students are meeting the expectations of each skill at each level. reviewed for debugging This evaluation will be done by the “Introductory programming and testing. These two oversight committee” which is already a standing committee of will then proceed on an people with experience teaching at this level. The projects should every other year cycle. have expectations that are building in a method appropriate to a three course sequence, and at least 75% of the sampled students projects should be deemed as working to the level of that expectation. For this learning outcome – we had a two person sub-committee of the introductory program committee review one or two project descriptions and implementations done by a sample of students who passed the course. Each time this evaluation is done, it will be performed on a different course in the introductory sequence. 
For this first time through, CMSC 131 will be evaluated by people who are not directly involved with CMSC131, but are receiving the students who pass 131 by teaching 132. Nelson Padua-Perez and Chau-Wen Tseng were selected as viable candidates for this evaluation. ============================================================
  • 4. Middle States Evaluation of CMSC 131, Spring 2006 Reviewers Nelson Padua-Perez, Chau-Wen Tseng Reviewed The reviewers reviewed CMSC 131, Object-Oriented Programming I. It was taught in Spring 2006 by Fawzi Emad. The goal of the course is to help students develop skills such as program design and testing, as well as the implementation of programs using a graphical IDE. All programming is performed in Java. Review methodology The reviewers obtained examples of student code for two programming projects, “Company Database” and “Shape Decorator.” The reviewers examined the project descriptions, student code, and other course materials. Six student implementations were selected at random from all projects submitted for each of those two project assignments. Review criteria The reviewers examined student code submissions to determine their coding skills. In particular, we attempted to establish whether students were proficient in the following areas: Programming Knowledge • fundamental (primitive) Java data types • conditional constructs • looping constructs • one & two dimensional arrays • definition of classes, constructors • inheritance, interfaces Programming Skills • ability to understand & implement small programming solutions • ability to implement code involving classes w/ inheritance relationships provided • significant ability to use one/two-dimensional arrays .
  • 5. Results The reviewers found that students demonstrated their programming knowledge and programming ability for basic objected-oriented programs. Students were proficient in using primitive Java data types, control flow (conditional & looping constructs), 1D arrays, classes (constructors & methods). In addition students were able to design classes and methods from scratch. The programming projects were of small to moderate size (150-200 lines of code, including comments). Projects examined did not provide examples of students’ ability to use 2D arrays and inheritance. Suggestions for improvement Student projects were in general well-designed to be both educational and capture student interest. The reviewers suggest less emphasis on string input/output (requiring concern about formatting details) if possible, and more opportunities for students to design open-ended projects (choosing their own classes, methods).
  • 6.
  • 7. Bachelor’s - Learning Outcome #2 Student Learning Outcomes Assessment Measures and Criteria Assessment Schedule (list the three-to-five most important) (describe one or more measures for each outcome and criteria for (initial year, and success) subsequent cycle) 2. Graduates of this program will develop Since this program requires courses from the Math department and Starting in the spring of mathematical and analytical reasoning skills. CMSC 250, they will develop mathematical and analytic reasoning 2006 and every three skills. Their ability to perform mathematically rigorous proofs is years after, a small important and a good way to measure their level of analytical committee will review a reasoning. A small committee will review selected answers from a sample of these as single final exam question from CMSC 250 to determine how well described. the students are able to rigorously prove a mathematical concept. The sample will be selected from those who were deemed as C or better on the exam as a whole. At least 80% of the students in the course program should receive a rating of very good or excellent on this one exam question. For this learning outcome – we had a two person committee of people who are not currently teaching CMSC 250 review one exam question that is deemed to be an exemplar of the goals of CMSC 250. The committee will look at the answers of people who received a C or better on the exam as a whole and determine how well they understand the topic tested by that one question. Bill Gasarch and Evan Golub were selected as viable candidates for this evaluation. ============================================================== Evaluating Mathematical and Analytical Skills Of Computer Science Majors Using the Middle States Criteria By Evan Golub and Bill Gasarch The second item in the Middle States Evaluation Criteria states Graduates of this program will develop mathematical and analytical reasoning skills. 
The evaluation method mandated was that we pick a sample of the students who got a C or better on the exam as a whole, pick a question on the final that we felt tested the students' ability to prove statements rigorously, and grade it ourselves. We picked question 3b. Question 3 states: For each of the following, either give a complete formal proof, being sure to use good notation and reasons for all steps, OR give a counterexample with justification to show that it is a valid counterexample.
  • 8. Question 3b states: Give a proof or a counterexample for the statement:

∀n ∈ Z^{≥0},  Σ_{i=0}^{n} i·2^i = 2^{n+1}(n − 1) + 2

This question was chosen since it required a rigorous proof by induction but was not overly cumbersome in terms of algebra. Hence, if a student got it wrong, either they lacked the analytical reasoning skills or they lacked basic mathematical skills. We picked 20 students who got a C or better on the exam. We chose the number of A's, B's, and C's in rough proportion to how many students got A's, B's, and C's. We graded this one problem giving 2 points for the Base Case, 6 for the Induction Hypothesis, and 10 for the Induction Step (18 points total). Here are the final results:

1. 15 were Excellent or Very Good (14 points or higher).
2. 1 was Moderate (9-13 points).
3. 4 were Poor (8 points or less).

Hence, 75% were Excellent or Very Good.
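For reference, the kind of argument a full-credit answer requires can be sketched as follows (our reconstruction of a standard induction proof, not an actual student solution):

```latex
\textbf{Claim.} For all $n \in \mathbb{Z}^{\ge 0}$,
$\sum_{i=0}^{n} i \cdot 2^i = 2^{n+1}(n-1) + 2$.

\textbf{Base case} ($n = 0$): the left side is $0 \cdot 2^0 = 0$ and the
right side is $2^{1}(0-1) + 2 = 0$, so the claim holds.

\textbf{Induction hypothesis:} assume
$\sum_{i=0}^{k} i \cdot 2^i = 2^{k+1}(k-1) + 2$ for some $k \ge 0$.

\textbf{Induction step:}
\begin{align*}
\sum_{i=0}^{k+1} i \cdot 2^i
  &= \sum_{i=0}^{k} i \cdot 2^i + (k+1)\,2^{k+1} \\
  &= 2^{k+1}(k-1) + 2 + (k+1)\,2^{k+1}
     && \text{(by the hypothesis)} \\
  &= 2^{k+1}\bigl((k-1) + (k+1)\bigr) + 2 \\
  &= 2^{k+1}(2k) + 2 \;=\; 2^{k+2}\,k + 2 \\
  &= 2^{(k+1)+1}\bigl((k+1)-1\bigr) + 2,
\end{align*}
which is the claim for $n = k+1$. $\blacksquare$
```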
  • 9. Bachelor's - Learning Outcome #3

Student Learning Outcome: 3. Graduates will experience design and implementation of programming projects that are more similar to those that would be seen in a real world environment.

Assessment Measures and Criteria: Several 400 level courses contain large scale programming projects each semester. For evaluation, these courses will be identified, and the project base of a sample of those identified courses and the expectations of students will be reviewed by a small committee of faculty to determine the appropriateness of the level of the project as well as the student understanding. In order to be sure that enough students are taking these identified courses so that the programming skills are emphasized in a high percentage of the graduates, data will also be collected to determine the percentage of students electing to take these courses. Some examples of courses that currently meet the criteria are CMSC 412, 414, 417, 420, 430 and 433. At least 85% of each graduating class should be electing to take 2 or more of these courses that are determined to be "project based." The project descriptions and sample implementations will be reviewed by the committee in order to determine the quality of the code being designed and written. At least 75% of the running projects should be determined to have very good or excellent programming characteristics.

Assessment Schedule: Starting in the spring of 2006 and every three years after, a small committee will review a sample of these as described. The statistics about the number of majors graduating who elected to take these courses will be collected for every graduating class.
The evaluations of the projects will help to guide project design decisions for later semesters, to make sure students are being adequately trained for actual applications of programming in the real world, and the numbers will be used to ensure that enough students are able to take the "project based" courses. For this learning outcome, we had a two-person subcommittee review the project descriptions and implementations done by a sample of students who passed the course. Each time this evaluation is done, it will be performed on a different course in the sequence that requires a programming project as a significant portion of the assessment. For this first time through, CMSC 412 was evaluated by people who are not directly involved with CMSC 412 but who have been involved with the course before. Udaya Shankar and Pete Keleher were selected as viable candidates for this evaluation.

===============================================================

CMSC 412 Evaluation (Keleher, Shankar)

Expectations

Students who successfully complete the course are expected to know an operating system inside out. They should be able to build an operating system from scratch, including all the major subsystems such as process scheduling and management, virtual memory, file systems, threading, and inter-process communication. They should also be able to dive into an existing operating system to make modifications and to add components.
  • 10. Course Project

The main thrust of the course is to implement an operating system, starting with a skeleton operating system (GeekOS) and adding various subsystems to it over a series of six projects. GeekOS can run on an actual x86 machine or an emulator; the course uses an emulator (Bochs) that runs on a Linux cluster dedicated to instructional use. The GeekOS that the students start with implements an elementary process scheduler and device drivers for the floppy and screen. The students extend GeekOS over the course of the semester as follows:

• Project 0: Get familiar with the GeekOS development environment and the Bochs x86 simulator, and write a simple kernel-mode thread that does some simple keyboard and terminal I/O.
• Project 1: Augment GeekOS process services to include background processes, asynchronous killing of a process from another process, and printing the status (process table) of running processes. Involves address space protection, system calls, user versus kernel memory, and segmentation.
• Project 2: Augment GeekOS with code to handle signals, implement system calls to set up and call signals, and handle orphaned background processes.
• Project 3: Implement multilevel feedback process scheduling in place of the default priority-based, preemptive round robin, and evaluate the two policies. Implement semaphores and provide user programs with semaphore-manipulating system calls.
• Project 4: Augment GeekOS with virtual memory. This is done in steps: kernel memory paging, user memory paging, demand paging, and finally virtual memory.
• Project 5: Augment GeekOS with a new file system, including support for long file names, creating directories, and mounting. This file system resides on the second IDE disk drive in the Bochs emulator, with the first IDE drive having the default PFAT file system. A virtual file system (VFS) layer is placed above the two file systems.
• Project 6: Augment GeekOS with inter-process communication via message passing.

For each project, students are given a base version of GeekOS that they can build on, so a student who has not successfully completed the previous project is not penalized.

Evaluation

We examined several projects and decided to use the fifth project as our sample. Project 5 has the students implement a complete file system for a simple operating system. The students were given a scaffolding that implements the rest of the operating system, though some of this is derived from the students' own versions of prior projects. Students were also given a complete description of the interface between their code and the rest of the system, which effectively defines most of the relevant types and gives prototypes for all of the main routines. Students usually wrote about 1200 lines of C source code for this project, plus a variety of small changes elsewhere in order to get the new code to mesh with the rest of the scaffolding correctly. We looked at three samples of running code and inspected the code for three characteristics:

• clear and well-documented code,
• well-designed functions without redundancy, and
• evidence of good debugging practice (error conditions checked, assertions used).
  • 11. Students excelling in all three criteria should be well prepared to design and build large-scale projects after graduation.

Two of the three projects we examined did very well on all three criteria. The functional structure was laid out well, code used more than once was invariably hoisted into common subroutines, and the students used good debugging practice by checking all or most error conditions, as well as defining invariants using kernel assertions. The third project was less clearly laid out, and the student did not document the project as the others did. The student used less sophisticated methods of debugging (inserting print statements) that would not scale well to larger or more time-sensitive projects. Functionally, however, the three projects were equivalent: they passed all tests and detected relevant error conditions.

Suggestions for Assessment Improvement

More than three projects will need to be evaluated in the future, since the three chosen may not accurately represent the 40 people taking the class that semester. We should review whether more emphasis should be placed on the grading of programming style (reuse of subroutines, reliability, and readability) in this course or in the courses that are prerequisites to this course.
  • 13. Bachelor's - Learning Outcome #4b

Student Learning Outcome: 4. In addition to those skills mentioned above needed by all computer science graduates, the students must also gain skills and knowledge in one or more of the following areas of computer science. b) Graduates will be exposed to some form of academic research or real world programming/business environments. This exposure makes their transition to either graduate school or the work environment smoother.

Assessment Measures and Criteria: All students will have one or more of these areas of specific skill development. The statistics about the courses chosen by the graduates should be reviewed after each graduation to determine that students are developing skills for application to graduate school or the real world in more than one of these areas. All students are required to have one based on the graduation requirements; at least 90% should have gained the skills described in more than one of these areas. There are several courses that address this skill set. Some examples are: 498A (independent study), 390 (honors research), independent work with a faculty member, the software engineering corporate add-on, and the corporate scholars program; all provide this exposure. The quality of the individual research done at the undergraduate level within the department will be maintained by a process of review of selected papers submitted for the honors program. This review will not be done by the student's research supervisor, but rather by the faculty committee responsible for graduation with departmental honors. At least 90% of the papers reviewed should receive an excellent rating by this external review committee.

Assessment Schedule: These graduation statistics should be collected and reviewed for each graduating class to be sure the correct courses are being offered so that this goal can be met. Starting in the Spring of 2006 and every third year after, the faculty committee responsible for graduation with departmental honors will review papers submitted for that purpose.

For this learning outcome, we will have the committee that oversees the Honors Program review a random sample of the projects of students who graduate with departmental honors. This is a standing committee currently chaired by Dr. Gasarch.
  • 14. Middle States Evaluation of Honors Theses
By William Gasarch and Don Perlis

Nine students graduated with honors in Spring 2006. We looked at six honors theses and independently gave them grades in the range {0, 1, 2, 3} for originality (ORIG), significance (SIG), and presentation (PRES). The grades are interpreted as follows: 0 = weak, 1 = good, 2 = very good, 3 = excellent. After we evaluated them separately, we added our scores together. Hence we have, for ORIG, SIG, and PRES, a score that can range from a minimum of 0 to a maximum of 6.

1. Graphical Interface for JSHOP2 by John Shin. ORIG: 5, SIG: 5, PRES: 3
2. Magnetic Swipe Card System Security by Daniel Ramsbrock. ORIG: 6, SIG: 5, PRES: 6
3. A Survey of Some Recent Results in Computer Graphics by Robert Patro. ORIG: 3, SIG: 5, PRES: 4
4. XMT Applications Programming: Image Registration and Computer Graphics by Bryant Lee. ORIG: 5, SIG: 5, PRES: 5
5. Annoflow - Handwritten Annotation and Proof-reading on Dynamic Digital Documents by Phil Cosby. ORIG: 5, SIG: 5, PRES: 6
6. A Computer Program to Play Minesweeper. ORIG: 6, SIG: 4, PRES: 4 (Note: Bill Gasarch was the mentor of this project, so the ratings are based on Don Perlis's scores doubled.)

We regard a student's honors thesis as excellent if it scores at least 5 in one category, at least 4 in another, and at least 3 in the third. Using this measure, all of the theses were excellent. This may sound rather optimistic; however, realize that only nine students did honors theses, and they were some of our best students.
  • 15. Master's - Learning Outcome #1 / Doctorate - Learning Outcome #1

Student Learning Outcome: 1. Master's and Ph.D. graduates will learn to design and implement computer software.

Assessment Measures and Criteria: Students taking the 600-800 level project-based courses will have developed projects appropriate to the level for assessment. A committee of three faculty members will review project descriptions and sample student implementations of a project for a course that is identified as being heavily based on the project. This committee will determine if the level of the project is appropriate and if the students are able to complete the project with appropriate programming characteristics. At least 75% of the sample of students who have passed this graduate level course will have the project implementation rated as very good or excellent in a variety of programming characteristics.

Assessment Schedule: Starting in the Spring of 2006 and every third year afterward, this committee will review a course that was offered in the previous semester.

For this learning outcome, we will have a single person who has taught a similar course in the past review a project and implementations. For this year, Jim Reggia will review a project in the course being taught by Lise Getoor.

Evaluation of Learning Outcome #1 for MS/PhD Degree
Part of the Middle States Evaluation Process
J. Reggia, Professor, Computer Science Dept., 5/15/06

Learning Outcome #1: The goal of this Learning Outcome is that MS/PhD students will learn to design and implement computer software.

Assessment Measures and Criteria: Students taking the 600-800 level project-based courses will have developed projects appropriate to their level.
Project descriptions and sample student implementations from a course that is identified as being heavily based on a project will be reviewed. It will be determined whether the level of the project is appropriate and whether the students are able to complete the project with appropriate programming characteristics. At least 75% of the sample of students who pass this course will have the project implementation rated as very good to excellent.

Course Selected for Review: CMSC 726, Machine Learning, was selected for review this semester. This course covers a broad range of machine learning methods at a graduate level. The course typically requires a major independent research project of all students.
  • 16. Methodology: All projects turned in by students taking the course were reviewed here for quality, independently of the course instructor. The projects were provided by the instructor during the last week of the semester.

Evaluation Criteria: Each project was reviewed and rated based on originality, content, implementation effort, and report quality.

Results:

1. Project goal (excerpts from the instructor's project assignment): The goal of this project is to give students experience in defining, refining, and developing an interesting research idea, hands-on experience implementing the proposed research, practice evaluating research results, and experience writing up and explaining the research results. Ideally, the project should be of a quality such that students can submit it to a workshop or conference. Projects can be either applications oriented, applying machine learning techniques to an interesting or novel task, or algorithmically oriented, comparing several models or developing a new algorithm. In either case the results should be validated empirically. A 1 to 2 page proposal must be turned in, describing the problem, the hypothesis to be tested, the experiments that will be performed, and how the results will be evaluated. The project can be done alone or with 1 to 2 (maximum) other students. (The instructor also suggested ideas for potential projects as well as providing pointers to past projects and to conference proceedings that students could review for ideas.)

2. Projects received: A total of 13 projects were received, reflecting reports from all 20 of the students in the class. Several projects involved two or three students working together, but most were done by individual students working alone. All projects that were turned in were examined as part of this evaluation.

3. Evaluation of projects: The student projects spanned a broad range of creative topics and applications.
While some of the projects were straightforward applications of supervised learning, even these often took on creative tasks. Multiple projects were done on timely topics such as learning to detect spam in email messages, clustering of information in databases, and applications in bioinformatics. Several projects were of a caliber appropriate to submit to workshops or conferences for possible publication.

Comments and Conclusions

1. The level of this project was appropriate to the level of the students, and it clearly met the goal of giving the graduate students in this course experience implementing substantial advanced software. The students had learned a great deal about machine learning that they could bring to bear in the software development.

2. Independent review of these projects ranks them all as at least very good, with several being clearly excellent in terms of creativity, technical execution, and innovation. Several were judged as potentially publishable research results.

3. Not only was Learning Outcome #1 met, but it was clearly exceeded: in addition to obtaining software development experience, the students also gained experience formulating and executing a research study, analyzing the results, and writing up their conclusions. This is a very valuable exercise for students at this level.
  • 17. Doctorate - Learning Outcomes #3 and #4

Student Learning Outcome: 3. Ph.D. graduates will complete extensive independent research and have at least one paper accepted by a conference or published in a refereed journal.

Assessment Measures and Criteria: The research component of the Ph.D. requirement will be evaluated at two different points in time: after 3 years of study and at graduation. At least 75% of the students graduating in a given year will have published at least one paper. This information can be gathered during the graduation survey and interview already conducted by the graduate office and the associate chair of the CMSC graduate school. At least 75% of the students completing their third year of study toward a Ph.D. will have submitted at least one paper to a conference or refereed journal. This information will be gathered on the form used for the graduate student review day, which is held during the spring semester.

Assessment Schedule: Starting in the spring of 2006 and every year after, the CMSC graduate office and the associate chair of graduate studies will compile and present these statistics to the chair for review.

Student Learning Outcome: 4. Ph.D. graduates will have their research accepted for presentation at subject area conferences, and the Ph.D. graduate will have made the presentation of their research material.

Assessment Measures and Criteria: At least half of the students graduating in a given year will have presented their research at one or more conferences by the time of their graduation. This information can be gathered during the graduation survey and interview already conducted by the CMSC graduate office and the associate chair of the CMSC graduate school.

Assessment Schedule: Starting in the spring of 2006 and every year after, the CMSC graduate office and the associate chair of graduate studies will compile and present these statistics to the chair for review.

These are both statistics that will need to be collected by the graduate chair, Samir Khuller, from the graduate office survey and from the exit interviews.

Report of the Ph.D. Learning Outcomes #3 and #4

For the graduation statistics, the data was collected both on a survey given to all students as they apply for graduation and from an exit interview requested of all people who would be graduating in the next year. Since neither the surveys nor the interviews are required in any way, several students did neither. The graduate program coordinator then attempted to fill in the data for the missing students by using their web pages and/or vitae if she had access to the information there.

  • 18. This method of data collection proved not to be as accurate as we would like, so next year this information will appear as additional questions on the form students must submit to the department in order to apply for graduation. That method of data collection should be significantly more reliable. The information in the following paragraphs is based on the data we were able to collect with the first method.

For the 2005-2006 school year, there were a total of 34 Ph.D. graduates from the computer science department, divided into 11 for the summer, 10 for the winter, and 13 for the spring graduation ceremonies.

For published journal articles, a total of 26 students had already had one or more articles published in a journal before the time of their graduation: 8 of the summer graduates, 9 of the winter, and 9 of the spring. This gave an overall percentage of 75.9%, slightly above the Middle States goal of 75% as stated above. The winter graduates were well over the goal rate at 90%, the summer graduates were slightly under at 73.73%, and the spring graduates were slightly over at 76.85%.

For presentation at conferences, a total of 28 graduates had already presented at one or more conferences: 10 of the summer graduates, 10 of the winter graduates, and 8 of the spring graduates. This gave an overall percentage of 82.35%, which is well over the goal of at least half as stated above. Each of the three groups was over the goal, with the summer at 90.91%, the winter at 100%, and the spring at 61.54%.

For the mid-program evaluation, the data was collected as part of the graduate student review process done each April. All current graduate students are evaluated by a committee to determine if they are making adequate progress toward graduation.
The first step of this process is a web survey that all Computer Science graduate students must complete. If the students do not appear to be making progress toward graduation from their answers on that survey, they are then interviewed further to determine the proper corrective measures. The data is then entered into a database, and we were able to query for just the information about the students who met our criteria. The goal was for at least 75% of those who had completed their third year of graduate study in the computer science department to have submitted papers to peer-reviewed venues. We found that there were 29 students who had completed their third year and that 20 of them already had articles accepted at peer-reviewed journals, a rate of 69%. We did not have data to determine how many had submitted to a peer-reviewed journal. The assessment should probably be changed to something that is more easily measured, such as having an article accepted for publication in a peer-reviewed journal.

Suggestions for Improvement

The method of data collection for the statistics needed from people who are graduating will be changed. The method used was not on any required form, and so there was quite a bit of missing data that was difficult to recover. The new method will be to add these items to the form that students must fill out to apply for graduation. Since all students submit this form and are unlikely to leave these small questions blank, the data collection will be more accurate when this evaluation is repeated for 2007.