The Assessment Report

Software Engineering Group

09/12/2008

 

1.      Student Learning Outcome (SLO)

Be able to apply the principles and practices for software design and development in the construction of software systems of varying complexity. (SLO11)

 

2.     Assessment Plan

The First Iteration

·        Develop performance criteria – by 31 March 2008

·        Verify performance criteria for the assessment – by 16 May 2008

·        Assess artifacts – by 15 June 2008

·        Analyze the results of the assessment – by 15 July 2008

·        Create an assessment report – by 15 August 2008

 

The Second Iteration

·        Modify class objectives and/or the assessment method based on the result of the first iteration – by 15 October 2008

·        Update performance criteria, if necessary – by 31 October 2008

·        Assess artifacts – by 19 December 2008

·        Analyze the results of the assessment – by 9 January 2009

·        Create an assessment report – by 15 January 2009

 

3.     Method(s) of Assessment

Performance criteria were used to assess artifacts already produced by COMP380/L and COMP480/L students as part of their coursework. The criteria are as follows:

 

The scale for satisfying each criterion: 4 (Exceptional or Grade A), 3 (Good or Grade B), 2 (Satisfactory or Grade C), 1 (Poor or Grade D), 0 (Unacceptable or Grade F).

 

A.          Performance Criteria for COMP380/L

1)      Assessment criteria for analysis (use case descriptions and diagram)

CR-1 (Identify system responsibilities): Use cases need to cover the requirements and specifications.

CR-2 (Develop system responsibilities): Each use case is developed using the recommended format, which includes the use case properties, a primary scenario, and alternative scenarios. Each use case should be complete and correct.
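
For reference, a use case developed in the recommended format might be outlined as follows. The example is hypothetical and is not taken from any assessed project.

    Use case:             Withdraw Cash
    Primary actor:        Customer
    Preconditions:        The customer holds a valid card and PIN.
    Primary scenario:     1. The customer inserts the card and enters the PIN.
                          2. The system validates the PIN with the bank.
                          3. The customer selects an amount, and the system dispenses the cash.
    Alternative scenario: 2a. The PIN is invalid; the system rejects the request.
    Postconditions:       The account balance is reduced by the amount dispensed.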

 


2)     Assessment criteria for design (class diagram)

CR-3 (Identify classes): Classes and associations are populated from the use case descriptions. The classes chosen should have high cohesion and low coupling and should be sufficient, complete, and no more complex than necessary.

CR-4 (Identify relationships between classes): Class relationships should be chosen appropriately from the different types of connections, such as regular association, generalization, aggregation, composition, and dependency (see the sketch following this list).

CR-5 (Identify and allocate attributes): The details of a class are represented with attributes. Attributes can be simple primitive types (integers, floating-point numbers, etc.) or relationships to other objects. Each attribute should be described in detail.

CR-6 (Identify and allocate operations): Operations specify how to invoke a particular behavior. Each operation should be described in detail.
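
To make the relationship types in CR-4 (and the attribute and operation concerns in CR-5 and CR-6) concrete, the following minimal Java sketch shows how each kind of connection typically appears in code. All class and member names are hypothetical and are not drawn from any assessed project.

    // Hypothetical Java sketch of the class relationships named in CR-4.
    import java.util.ArrayList;
    import java.util.List;

    // Generalization: SavingsAccount is a kind of Account.
    class Account {
        // Attributes (CR-5): simple primitive types, each described in detail.
        protected String accountId;   // unique identifier for the account
        protected double balance;     // current balance in dollars

        // Operation (CR-6): specifies how to invoke a particular behavior.
        void deposit(double amount) {
            balance += amount;
        }
    }

    class SavingsAccount extends Account {
        private double interestRate;  // annual rate, e.g., 0.03 for 3%
    }

    // Regular association: a Customer holds a lasting reference to an Account.
    class Customer {
        private Account primaryAccount;
    }

    // Aggregation: a Branch groups Accounts, but the Accounts are created
    // elsewhere and can outlive the Branch.
    class Branch {
        private final List<Account> accounts = new ArrayList<>();
        void add(Account account) { accounts.add(account); }
    }

    // Composition: a Statement owns its Lines; the Lines are created inside
    // the Statement and do not exist independently of it.
    class Statement {
        private final List<Line> lines = new ArrayList<>();
        void addLine(String text) { lines.add(new Line(text)); }

        private static class Line {
            final String text;
            Line(String text) { this.text = text; }
        }
    }

    // Dependency: AuditService merely uses Account as a parameter type and
    // keeps no lasting reference to it.
    class AuditService {
        boolean verify(Account account) {
            return account != null && account.balance >= 0;
        }
    }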

3)     Assessment criteria for design (sequence diagram)

CR-7 (Identify structural elements): A sequence diagram can model interaction at the use case level. When the interaction models a use case, the starting point is the set of classes that participate in the use case realization.

CR-8 (Draw sequence diagrams): Draw sequence diagrams for the use case descriptions. Each sequence diagram should be presented with a frame, lifelines, messages, constraints, conditions, etc. In particular, the proper message types (method calls, sending a signal, creating an instance, destroying an object, confirmation, etc.) should be used depending on the situation. If necessary, the proper fragments ("alt," "opt," "break," "critical," etc.) should be used.
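
As a companion to CR-7 and CR-8, the minimal Java sketch below shows the kind of interaction a sequence diagram for a hypothetical "withdraw cash" use case realization would capture: an instance creation, synchronous messages (method calls) between the participating objects, and an alternative path corresponding to an "alt" fragment. The names and the authorization policy are illustrative only.

    // Hypothetical "withdraw cash" use case realization. Each call below
    // would appear as a message on the corresponding sequence diagram.
    class Atm {
        private final Bank bank = new Bank();    // creating an instance

        void withdraw(String accountId, double amount) {
            // Message: Atm -> Bank : authorize(accountId, amount)
            boolean approved = bank.authorize(accountId, amount);

            if (approved) {                      // first branch of an "alt" fragment
                // Message: Atm -> CashDispenser : dispense(amount)
                new CashDispenser().dispense(amount);
            } else {                             // second branch of the "alt" fragment
                System.out.println("Withdrawal declined");
            }
        }
    }

    class Bank {
        boolean authorize(String accountId, double amount) {
            return amount <= 500.0;              // placeholder policy for illustration
        }
    }

    class CashDispenser {
        void dispense(double amount) {
            System.out.println("Dispensing $" + amount);
        }
    }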

 

B.          Performance Criteria for COMP480/L

1)      Assessment criteria for the requirements and specification document

CR-1 (Feasible): The requirement is doable and can be accomplished within budget and schedule.

CR-2 (Concise): The requirement is stated simply.

CR-3 (Unambiguous): The requirement can be interpreted in only one way.

CR-4 (Consistent): The requirement does not conflict with other requirements.

CR-5 (Verifiable): Implementation of the requirement in the system can be demonstrated.

CR-6 (Allocated): The requirement is assigned to a component of the desired system.

CR-7 (Assigned a unique identifier): Each requirement shall have a unique identifying number.

CR-8 (Devoid of escape clauses): The language should not be speculative or general (i.e., avoid wording such as "usually," "generally," "often," "normally," and "typically").
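
For illustration only, a requirement written to these criteria might read: "REQ-012: The system shall display the account balance within two seconds of a balance inquiry." It carries a unique identifier, is concise and unambiguous, can be verified by measurement, and contains no escape clauses. (The identifier and wording are hypothetical, not taken from a student project.)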

 


2)     Assessment criteria for design

CR-9 (Feasible and implementable): The design can be implemented successfully.

CR-10 (Consistency with requirements): The design should be consistent with the requirements document.

CR-11 (Enough information): The design should include enough information to implement the system.

 

4.     Scoring Rubrics

 

Before the assessment was conducted, scoring rubrics were established as follows:

 

 

5.     Results of the Assessment

 

5.1  COMP380/L

 

Among 10 group projects, three projects (30%) fell into “Good,” five projects (50%) into “Fair,” and two projects (20%) into “Poor.” Table 1 shows the details.

 

Table 1.  The assessment results of COMP380/L based on teams.

 

CR-1 and CR-2 turned out to be "Good"; CR-3, CR-5, CR-6, and CR-7 were "Fair"; and CR-4 and CR-8 were "Poor," as shown in Table 2.

 

Table 2.  The assessment results of COMP380/L based on the evaluation criteria.

 

5.2 COMP480/L

 

Among 4 group projects, two fell into “Fair” and two into “Poor.” Table 3 shows the details.

 

Table 3.  The assessment results of COMP480/L based on teams.

 

CR-7 turned out to be "Good," CR-7 and CR-8 were "Fair," and the rest were "Poor," as shown in Table 4.

 

Table 4.  The assessment results of COMP480/L based on the evaluation criteria.

 

6.     Analysis of the Assessment Results

 

Reflecting the 2007 assessment results and actions, the rubrics for this assessment were specified in greater detail, and design activities were stressed in the preceding classes. See the appendix for details.

 

6.1 COMP380/L

 

As described in Section 3, the assessment criteria for COMP380 fall into three sub-groups: developing use case descriptions, developing class diagrams, and developing sequence diagrams. Each sub-group is refined into a set of criteria.

 

The first sub-group of criteria evaluates the students' ability to identify and develop system responsibilities, particularly functional requirements. The deliverables are use case diagrams and their associated use case descriptions. Overall, the results for the analysis criteria are good.

 

The second sub-group of criteria assesses the students' ability to develop class diagrams that describe the structure of the system. The results were relatively fair. Students were able to identify classes and develop their attributes and methods. However, students seemed to have some difficulty developing class relationships fully and fluently using the different relationship types, such as generalization, aggregation, composition, and dependency.

 

The third sub-group of criteria reviews the students' ability to develop sequence diagrams, which help software developers understand system behavior. A sequence diagram consists of a set of objects and the interactions between those objects. Students were good at identifying the relevant objects, but they needed to specify the interactions in more detail, using the proper message types and fragments.

 

In summary, compared to the 2007 assessment results, the 2008 results show some improvement, but students' design skills still need strengthening.

 

6.2 COMP480/L

 

<<Bob, kindly take care of this section.>>

 

7.     Recommendations for Actions/Changes

 

7.1.    Should this assessment activity be repeated?  If so, when?

 

Yes, we need to repeat this assessment in Fall 2008 to see if there has been improvement in the results.

 

7.2.   Should changes be made in the way this assessment was done?  If so, describe the changes.

 

Our current assessment method is a formal and indirect approach: we, the software engineering group, carefully planned and carried out the assessment activities, and we assessed the student learning outcome indirectly by evaluating the students' project deliverables.

 

We are now considering a formal and direct approach. Although indirect approaches are efficient and effective for evaluating certain student learning outcomes, we believe this SLO can be assessed more accurately in a direct manner. Therefore, we plan to create an assessment test so that we can see whether or not students understand and can apply the principles of software design.

 

 

7.3.   Should there be any changes in curriculum?  If so, describe recommended changes.

 

No.

 

7.4.   Should any other changes be made?

 

No.


Appendix

 

Summary Assessment Report – SLO11

Software Engineering Group

9/7/2007

 

Student Learning Outcome: Be able to apply the principles and practices for software design and development to real world problems.

All three performance criteria below were taught in COMP 380/L and COMP 480/L and were assessed in those same courses by faculty evaluation using a rubric, with data collected in Spring 2007. The assessment coordinator was George Wang; the persons responsible for evaluating the results were George Wang, Diane Schwartz, and Robert Lingard.

Performance Criteria:

1. Develop a software requirements specification (SRS) document that addresses a reasonable set of requirements in sufficient detail.

2. Develop a software design document (SDD) that includes appropriate designs and properly reflects the SRS document.

3. Construct code that follows coding conventions such as naming, comments, and layout.

Results 5/22/2007: There is a mismatch between the design documents and the implementation. Design documents tend not to be updated to reflect the implementation; one reason is that the design documents were created before implementation started and were not updated when implementation was finished.

Actions 5/22/2007: A formal assessment will be done. We will organize a panel in Spring 2008 whose members are former COMP380/L or COMP480/L instructors. The rubrics will be made more specific, and the project deliverables of all COMP380/L sections will be reviewed. Meanwhile, design activities will be stressed in the COMP380/L course objectives, and basic design activities will also be addressed in lower-level courses such as COMP110, COMP182, and COMP282.

Second-Cycle Results: None so far.