Introduction

Frema (Framework Reference Model for Assessment) is one of five e-Learning Framework (ELF) Reference Model projects in the JISC e-Learning Programme. The Programme aims to identify how e-learning can facilitate learning and to advise on how such e-learning systems might be implemented. A core part of this second aim is the e-Learning Framework (ELF), a service-oriented view of the core system modules required to support e-learning applications.

For this to occur it is necessary to develop ELF Reference Models that describe how areas of the e-learning domain map to the ELF and which can act as a driver for implementation and evaluation. Once complete, such reference models will ease the development of further services and promote the re-use of existing ones.


Figure 1: e-Learning Framework

This project will develop the reference model for the Assessment domain area. The JISC Circular 10/04 identifies six domain areas, of which five are the subject of the circular: assessment, learning content, enterprise, personal development planning, and personal learning environments. The learning content domain area reference model involves the specification of services for the design, construction and execution of learning activity that can be used and shared by multiple institutions and the lifelong learner.

The Assessment services will focus upon the creation, execution and recording of electronic assessments which are accessible across institutions and to the lifelong learner.

Frema is a one-year collaboration between the Universities of Southampton, Strathclyde and Hull, funded by a JISC grant of £148,095.

Project description

Assessment is a large and complex portion of the e-Learning Framework, as shown in Figure 2: it interacts with VLEs, Portals and Marking Tools at the User Agent layer; with multiple Learning Domain Services, including Sequencing, Grading, Marking, Reporting, Competency and Tracking; and it relies on most of the common services.


Figure 2: A Summary of the Assessment Domain (ELF Conference, 11/04; http://www.elearning.ac.uk/resources/Assessment.ppt)

The project is a collaboration between the Universities of Southampton, Strathclyde and Hull, and the team includes many of the key players in this domain.

Members of the Frema project team have been working on defining and implementing the components of the assessment framework for some years through a number of prior initiatives.

The Frema team is in regular communication with other active teams, such as the TIP project (Authentication and Authorisation, University of Oxford, http://www.jisc.ac.uk/delettip.html) and Serving Maths (University of York, http://www.jisc.ac.uk/deletsm.html).

The ultimate aim of ELF is for each service to reference an open specification or standard and for open-source implementations to be available. The stated intention of the ELF effort is to facilitate the integration of commercial, home-grown and open source components by agreeing common service definitions, data models and protocols.


The e-Learning Framework Reference Model for Assessment will be composed of several distinct parts, outlined below.

The development of an ELF Reference model for Assessment will also be expected to contribute to the ELF itself, in terms of new service definitions and implementations.

Outline of the proposed reference model

Wilson, Blinco and Rehak (2004) describe a reference model as "... a selection of Services defined in one or more Frameworks together with rules or constraints on how those Services should be combined to realize a particular functional, or organisational goal. A Reference Model constrains the number of unique organisational infrastructures".

The Assessment reference model will include:
  1. A domain definition that will describe the scope of the Assessment domain, including evidence that gives an overview of current practices, processes and systems.
  2. A set of use cases that describe common solution patterns in the Assessment domain area.
  3. Service profile definitions for both existing services and those that need to be developed for the Assessment domain area, their scope, behaviour, and data.
  4. Assessment reference model prototype implementations.

Specifications to be used

The Assessment reference model project will build upon existing specifications and standards from JISC, IMS, and other projects. In particular, it is expected to reference: agreed standards such as SOAP and WSDL of the W3C; Learning Design, Content Packaging, Simple Sequencing, and Question and Test Interoperability of IMS; the OSIDs from OKI and the current SAKAI initiative of the US Universities consortium; and the current ELF and associated specifications from JISC and CETIS.
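To make the role of IMS Question and Test Interoperability (QTI) concrete, the sketch below builds a minimal QTI 2.x-style multiple-choice item with Python's standard library. The element and attribute names follow the QTI 2.x item structure, but this is an illustrative assumption only and should be checked against the specification before use.

```python
# Illustrative sketch: a minimal QTI 2.x-style multiple-choice item,
# built with the standard library. Element and attribute names are
# assumptions modelled on the IMS QTI 2.x specification.
import xml.etree.ElementTree as ET

def build_choice_item(identifier, title, prompt, choices, correct):
    """Return a minimal QTI-style assessmentItem as an XML string."""
    item = ET.Element("assessmentItem", identifier=identifier, title=title)

    # Declare the expected response and its correct value.
    decl = ET.SubElement(item, "responseDeclaration",
                         identifier="RESPONSE", cardinality="single",
                         baseType="identifier")
    correct_el = ET.SubElement(decl, "correctResponse")
    ET.SubElement(correct_el, "value").text = correct

    # The item body holds the question prompt and the interaction.
    body = ET.SubElement(item, "itemBody")
    interaction = ET.SubElement(body, "choiceInteraction",
                                responseIdentifier="RESPONSE", maxChoices="1")
    ET.SubElement(interaction, "prompt").text = prompt
    for ident, text in choices:
        ET.SubElement(interaction, "simpleChoice", identifier=ident).text = text

    return ET.tostring(item, encoding="unicode")

xml = build_choice_item("q1", "Capitals", "What is the capital of France?",
                        [("A", "Paris"), ("B", "Lyon")], correct="A")
print(xml)
```

An item of this shape is what an assessment delivery service would exchange with authoring and marking tools, which is why a shared specification matters for interoperability across institutions.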

The Assessment reference model project is likely to identify additional work, and may identify new developments, that could be usefully introduced into these standards. Such material will be discussed with CETIS and the relevant SIGs.

Approach of the project

Throughout the project, efforts will be made to work with other groups and to relate the Reference Model to other bodies of work.

Workpackages

Workpackage 1: Project initiation & reference model specification

The project will begin by providing a more detailed and in-depth analysis and specification of the Assessment reference model requirements, and by specifying the standards for the Assessment reference model.

To do this, a domain definition will be produced that describes the scope of the Assessment domain, and a portfolio will be assembled as evidence giving an overview of current practices, processes and systems. Standards relevant to the Assessment domain area will be reviewed (eg IMS QTI), as will the relevant DeL regional pilot projects, relevant JISC projects (eg TOIA, ASSIS, APIS) and other projects.

In the light of these reviews a support plan for the relevant DeL regional pilot projects will be developed.

Workpackage 2: Iteration 1 for Assessment reference model

Workpackage 2 represents a first iteration of the design of the reference model and implementation of a prototype. In this first iteration, a particular portion of the Assessment domain area will be addressed (expected to be around half of the domain) and use cases and scenarios will be identified and documented. The particular portion is expected to be the more straightforward aspects of the Assessment domain area, where relevant Assessment services can be readily identified and related to the reference model.

Use cases will be developed that describe common solution patterns in the Assessment domain area. These will be coupled with narrative descriptions of the elements/actors and processes involved in the domain area and the ways in which they interact.

Based on these use cases a gap analysis will be undertaken. The current state and status of the e-Learning Framework with respect to the Assessment domain area will be established and existing ELF services relevant to the Assessment domain area will be identified. This analysis will provide an initial table of service gaps in the e-Learning Framework, as well as an identification of the Assessment domain area services to be implemented.

A Service Profile will be produced for the Assessment domain area that describes the existing and new services in context and gives details of each service in terms of its functional scope, data and behaviour, and API. If new services are required for the Assessment domain area then they will be scoped, and their behaviour and data defined.
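As an illustration of what "functional scope, data and behaviour" means for a single service, the sketch below pins down a hypothetical grading service as a data type plus a behavioural contract, before any binding to SOAP/WSDL. All names here are illustrative assumptions, not definitions from the reference model.

```python
# Hypothetical sketch of one Assessment service profile: the data it
# exchanges, its behavioural contract, and a toy implementation for
# prototyping. Names are illustrative, not from the reference model.
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class ItemResult:
    """Data the service exchanges: one candidate's result for one item."""
    candidate_id: str
    item_id: str
    score: float
    max_score: float

class GradingService(ABC):
    """Behavioural contract for a hypothetical grading service."""

    @abstractmethod
    def record_result(self, result: ItemResult) -> None:
        """Store a marked result for later aggregation."""

    @abstractmethod
    def aggregate(self, candidate_id: str) -> float:
        """Return the candidate's total score across recorded items."""

class InMemoryGradingService(GradingService):
    """Toy in-memory implementation, for proof-of-concept use only."""
    def __init__(self):
        self._results = []

    def record_result(self, result):
        self._results.append(result)

    def aggregate(self, candidate_id):
        return sum(r.score for r in self._results
                   if r.candidate_id == candidate_id)

svc = InMemoryGradingService()
svc.record_result(ItemResult("cand-1", "q1", 3.0, 5.0))
svc.record_result(ItemResult("cand-1", "q2", 2.0, 5.0))
print(svc.aggregate("cand-1"))  # 5.0
```

Separating the contract from the implementation in this way is what allows a profile to describe both existing services and new ones: the contract is fixed by the profile, while implementations can be swapped between commercial, home-grown and open source components.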

A proof of concept prototype will be designed and implemented based around the Assessment domain area use cases and the Service Profile.

Workpackage 3: Iteration 2 for Assessment reference model, & project report

In work package 3, the other portion of the Assessment domain area will be identified (expected to be the remaining half of the domain) and use cases and scenarios will be identified and documented. This particular portion is expected to be the more awkward aspects of the Assessment domain area, which will benefit from the work of iteration 1.

Experience gained from the first iteration of the reference model design will be used to revise the use cases described in WP2 and to inform the development of the remaining use cases planned for iteration 2.

The analysis of the use cases will augment the table of service gaps in the e-Learning Framework. The gap analysis from iteration 1 will be revised and extended to cover the new portion of the Assessment domain area where necessary.

The Service Profile will be updated to describe the existing and new services in context, and the service descriptions will be updated or augmented. If new services are required for the second iteration of Assessment then they will be scoped, and their behaviour and data defined.

The proof of concept prototype from iteration 1 will be extended to support the new use cases and service profile.

At the end of iteration 2 the proof of concept prototypes will be evaluated and the Assessment reference model and service definitions further revised.

Workpackage 4: Management, collaboration, and dissemination

Workpackage 4 is a substantial project activity which runs throughout the project. Its components include: project dissemination (web site construction and maintenance, developing and running workshops, providing conference posters, presenting conference papers, and authoring and publishing journal articles); support for DeL regional pilot projects in the Assessment domain area; collaboration and coordination with the other e-Learning Framework reference model projects; collaboration and validation activities with CETIS and the CETIS Assessment SIG; and collaboration with other relevant Assessment domain area bodies and authorities (eg IMS).