Welcome to the XCRI @ Greenwich blog. Here we aim to keep you updated on all developments in the implementation of XCRI-CAP and related data initiatives.

Tuesday 20 December 2011

JISC grant

I've just heard that we have been successful in our bid for a JISC grant for the XCRI project. That is a nice Christmas present, isn't it?

Monday 21 November 2011

I have had some problems getting the posts to appear on this blog and it is likely that I'll set up a new blog address after Christmas when I set up the University of Greenwich XCRI project website. Anyway, this is the blog for now and I just want to report that I have submitted the XCRI stage 2 bid (with a whole 50 minutes to spare).

Wednesday 26 October 2011

XCRI-CAP 'quick and dirty' feed

I spent a couple of hours yesterday together with a colleague in Student Record Systems working through the XCRI-CAP XML spec, working out what information we have on our Banner system and where it resides.
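For readers unfamiliar with what such a feed looks like, the sketch below builds a minimal XCRI-CAP-style catalogue in Python. The element names and namespaces follow the XCRI-CAP 1.2 catalog profile, but the provider name and course data are invented for illustration; they are not actual Banner mappings, and a real export would carry many more fields (presentations, venues, qualifications, and so on).

```python
# Illustrative sketch of a minimal XCRI-CAP-style catalogue feed.
# Element names follow the XCRI-CAP 1.2 catalog profile; the provider
# and course data below are invented examples, not a real Banner extract.
import xml.etree.ElementTree as ET

XCRI = "http://xcri.org/profiles/1.2/catalog"
DC = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("", XCRI)
ET.register_namespace("dc", DC)


def build_catalog(provider_name, courses):
    """Build an ElementTree from a provider name and (title, description) pairs."""
    catalog = ET.Element(f"{{{XCRI}}}catalog")
    provider = ET.SubElement(catalog, f"{{{XCRI}}}provider")
    ET.SubElement(provider, f"{{{DC}}}title").text = provider_name
    for title, description in courses:
        course = ET.SubElement(provider, f"{{{XCRI}}}course")
        ET.SubElement(course, f"{{{DC}}}title").text = title
        ET.SubElement(course, f"{{{DC}}}description").text = description
    return ET.ElementTree(catalog)


tree = build_catalog(
    "Example University",
    [("BSc Computing", "A three-year undergraduate degree.")],
)
xml_text = ET.tostring(tree.getroot(), encoding="utf-8").decode("utf-8")
print(xml_text)
```

Even a skeleton like this is enough to expose where course data actually lives in the student record system, and which fields have no authoritative source at all.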

Wednesday 19 October 2011

The first meeting of the XCRI Project Board was held today (19 Oct). We discussed the University of Greenwich's approach to the project and completed the JISC Self-Assessment framework.

Thursday 6 October 2011

The KIS Advisory Board met this morning and heard a report about the XCRI project, over which it has a general overview.

Thursday 29 September 2011

XCRI Project Board

The first meeting of the XCRI Project Board will be held on Wed 19 Oct at 11am in QA084.

Thursday 22 September 2011

Just to let you know that an XCRI project brief has been submitted to the University of Greenwich IT Project Board for approval.

Wednesday 21 September 2011

Hello! I just wanted to introduce myself, as I'm now listed as the contact person for this blog. I'm Katarina Thomson in Planning and Statistics.

Thursday 8 September 2011

We have just heard that the University of Greenwich has secured the £10,000 funding for Stage 1 of the JISC ‘Course Data: Making the most of Course Information’ Capital Programme, starting immediately.

Report on University of Greenwich XCRI Self Assessment Framework Field Test

University of Greenwich XCRI Self Assessment Framework Field Test Final Report

18 July 2011

Contents
1. Approach
2. Success factors
3. Lessons learnt
4. Recommendations
5. Appendices

1. Approach: Commentary on how useable and useful the XCRI Self-Assessment Framework and supporting XCRI Knowledge Base was
Feedback from the participants involved in University of Greenwich’s field test of the XCRI Self Assessment Framework has been largely positive in respect of the usability and usefulness of the questions in the toolkit and of the links to resources in the Knowledge Base. The reaction to the content and organization of the Knowledge Base was more muted; in trying to do so much, the Knowledge Base arguably leaves its purpose somewhat confused.
It is noted that, as part of the Field Test, the XCRI Support Team recommended a two-phase implementation of XCRI-CAP, with Phase 1 being a pilot of the XCRI-CAP process, followed by Phase 2, essentially the completion of the Self-Assessment Framework.
The recommended approach appears to presuppose that a decision to implement XCRI-CAP has already been taken by an institution and that resource has been identified to undertake Phases 1 and 2.
The approach taken by Greenwich did not (indeed could not) follow this recommendation: when the field test commenced, the institution had not yet taken a decision to embark on the implementation of XCRI-CAP, nor was sufficient resource available to undertake all of the recommended steps.

Greenwich’s approach to field testing the XCRI-CAP Self Assessment Framework was as follows:
1. Identify key personnel;
2. Organise a workshop to raise awareness and develop understanding of XCRI and XCRI-CAP;
3. Organise a further workshop to look at usage by other institutions, compare Greenwich’s current approach to managing course information, identify the extent to which XCRI-CAP might meet institutional priorities, and agree actions required and next steps;
4. During that workshop, consider the questions in the XCRI-CAP Self Assessment toolkit, come to a collective agreement on the most appropriate answers, and consult the contents of the XCRI Knowledge Base.

Prior to the field test, stakeholders in Corporate Information Systems and Development & Communications had recognized a need to enhance Greenwich’s capacity to satisfy internal and external demand for authoritative course information. Some, but not all, of the stakeholders were aware of the XCRI standard and XCRI-CAP, and the field test was an opportunity to consider the scope and possibilities of implementation at Greenwich in the context of institutional business need, notably in relation to compliance with the forthcoming Key Information Sets (KIS) required by HEFCE from September 2012.

The workshops that took place at Greenwich (particularly the first workshop) had a wide scope, and this approach was found to be appropriate. The spacing of the workshops gave key personnel a chance to take information back to their staff and to reflect on their particular service priorities and on what XCRI could deliver.

In reflecting on the Self Assessment Framework, participants at the workshop agreed that the questions posed helped provide a focus to the broader preliminary discussions. One participant observed “it is quite thorough” and “made us think around the issues more carefully”.

It was noted by participants that while the Self Assessment Framework appeared primarily to be a tool to assess institutional readiness to implement XCRI, the questions had helped to inform thinking and debate on:

• where institutional need for improved course-related information sat in relation to their service priorities
• how Greenwich might engage with the XCRI standard and XCRI-CAP

It was noted very late in the field test that the XCRI Support Team had recommended, in Phase 1 of the field test, the rapid development of an XCRI-CAP export. Independently, discussion at the Greenwich workshop on 14 June came to a similar conclusion, but identified a different rationale for doing so.

The specific suggestion made during the Greenwich workshop related to the possibility of producing a “quick and dirty” version of XCRI-CAP with the specific purpose of exposing in a clear and unequivocal way the weakness in the current processes for the capture and use of course marketing information.

This development suggests that the XCRI Support Team’s recommended approach to the implementation of XCRI-CAP is appropriate. From Greenwich’s perspective, however, it is not proven that development of an XCRI-CAP feed needs to precede scrutiny and completion of the Self-Assessment toolkit questions; it can equally follow afterwards, or run in conjunction.

During the workshop, participants looked at some of the content of the Knowledge Base, accessing the data via the summary results of the Self Assessment. The initial reaction from workshop participants was that the documents provided some highly detailed information. It was observed that this level of detail was of interest primarily to a developer working on XCRI-CAP implementation, rather than to managers addressing strategic considerations.

A usage scenario of the toolkit at Greenwich is envisaged whereby different staff consult it at different times and for different purposes; some mechanism for documented version control of the results of the self-assessment using the XCRI toolkit would therefore be useful.

As part of the field test the XCRI Knowledge Base was subjected to more detailed scrutiny in the weeks after the workshop, and a series of observations recorded as follows:

• Enormous care has been taken to frame information for different audiences, yet the need for this level of differentiation may not be justified. The detailed technical data, and the efforts made to explain to non-techies how XCRI can be of benefit, are appreciated, but further distinctions (e.g. between Policy Makers and Managers) assume a level of ‘self service’ that may be too high. At Greenwich, senior managers would be unlikely to consult the Knowledge Base themselves to any level of detail and would instead rely on a third party to interpret and distill the information.

• Scrutiny of the Knowledge Base from the www.xcri.co.uk home page was a less satisfying experience than accessing specific documents via the toolkit questions and the link at http://www.xcri.co.uk/getting-started.html simply because of the vast amount of information available. Trying to distinguish between the documents in terms of usefulness and ability to cover specific areas was time consuming and did not instill confidence.

• Elements of the Knowledge Base function as a repository of all documentation relating to XCRI. It is accepted that this was created in response to previous demand, but might it now be time to archive some of the documents?

• Since developments in XCRI have been so rapid over the period the content of some of the resources may essentially have become a historical record.

• In the A-Z list of resources, dates are given but they are not reader-friendly at all.

• Could the completed projects be put in date order?

• Is there any scope for showing how XCRI developers have responded to all of the feedback, along the lines of “you said – we did”? Alternatively, or in addition, is there a synthesis document?

• The case studies are very interesting – and potentially a good tool for raising awareness about XCRI to colleagues and also to sell the case for XCRI-CAP implementation in an institution. As the number of case studies increases some standardized format may be advisable and would help with synthesis.

• One of the links in the Technical Implementation section of the Knowledge Base includes some postings that are three years old; a BOXCRIp link is broken; and SRC is listed as a completed XCRI project, but the link takes you to a live project.

2. Success factors: A description of how the institution intends to define improvements in relation to the Course Management processes and data flows

Participation in the XCRI Self Assessment Framework field test has resulted in a series of statements in relation to the management of course information at Greenwich. These have been agreed by key stakeholders who took part in the field test.
1. There is value to Greenwich and the sector in general in having a data interoperability standard that is common across the sector.
2. XCRI-CAP will not in itself deliver any new information on courses over and above that already present on the University of Greenwich website; what it will do is allow course information data to be reproduced for multiple – and increasingly diverse – audiences. This is a necessary development in order to make best use of resources, to ensure contractual compliance, and to improve Greenwich’s capacity to be agile and responsive.
3. Currently Greenwich’s management of course information data flow is inefficient and results in two completely separate sets of centrally held course information data. There is clear potential for XCRI to function as a change agent for improving the way Greenwich manages course information, and it is recommended that this change start with an institutional review of current programme/course validation and review processes.
4. There is good practice to draw on from other institutions.
5. Greenwich is receptive to exploring opportunities to link the KIS to XCRI-CAP development, although it is accepted that at the current time XCRI-CAP is not in a position to drive the University’s compliance with the new KIS requirements.

It was noted that JISC are about to initiate a funding call to support XCRI-CAP implementation (due July 2011) and participants in the field test are recommending that Greenwich submit an expression of interest.

3. Lessons learnt: Practical ‘hints and tips’ picked up by the institution when using the XCRI Self-Assessment Framework and the supporting XCRI Knowledge Base, which will add context to the guidance provided in the XCRI Self-Assessment Framework and which will be of benefit to other institutions seeking to adopt its advice

• Ascertain where your institution is in relation to XCRI and XCRI-CAP on a spectrum from “no awareness” through to “intention to implement”. The nearer you are to “no awareness”, the greater the flexibility in how you use the Self-Assessment Toolkit/Framework;
• Identify dedicated resource to make the most of the resources on the XCRI Knowledge Base and to select/summarise/highlight appropriate information. (Self service by senior managers is great in principle - but in practice?);
• Line up your key personnel and complete the self assessment questions collectively – agree actions arising and assign tasks; agree a date to repeat the self assessment; date and circulate the self assessment summary;

4. Recommendations: Suggestions for how JISC could amend the XCRI Self-Assessment Framework and the XCRI Knowledge Base to improve their usability and usefulness, in the light of the institution’s active use of them.

1. An approach which always requires Phase 1 to be a pilot of the XCRI-CAP process and Phase 2 to be a review of course marketing information management may not be appropriate in all cases. On the contrary, Greenwich’s experience indicates that a blending of the two phases may be a better approach in some situations. In the case of Greenwich, this appears to be because the most significant issue to resolve in relation to XCRI-CAP implementation did not relate to the application of the technology, or even to understanding how this would operate in the context of current systems and processes. Instead, the key issue was whether there was a sufficient business case to justify allocating resource to the task of developing an XCRI-CAP feed and dealing with the issues that arose from this action. Starting with a workshop to scrutinize the XCRI-CAP self-assessment questions can be very useful for ascertaining priorities and requirements and reaching agreement on next steps.

2. While key personnel involved in the field test did not need to go through the process of creating an XCRI-CAP feed in order to identify strengths and weaknesses in relevant systems and processes, they did recognize that it was a useful activity to undertake. In Greenwich’s case, this was in order to raise awareness with other stakeholders about the need for changes to processes linked to course marketing information.

3. Could additional functionality make the “self-assessment results” reports more user friendly?
For example:

• Ability to print reports in a usable version;
• Ability to date reports (to facilitate version control, measuring progress etc.)
• Even better to have a report where areas of development could be put into a business case template?
4. Could some of the questions have a comments box so that an institution can add details – for example, timescales to achieve an action?
5. Ensure resources are up to date.
6. Combine XCRI resources onto one website (rather than the existing practice of splitting them across xcri.org and xcri.co.uk).

5. Appendices



Appendix 1

Key Personnel involved in field test:

• Head of Development & Communications
• Head of Marketing
• Head of Corporate Information Systems
• Technical Services Manager
• Web Services Manager
• CIS Project Manager
• Head of Planning & Statistics
• Head of Student Information Systems

Appendix 2

Extracts from Report of XCRI Workshop held Tuesday 14 June 2011

In considering Greenwich’s current capacity to satisfy internal and external demands for authoritative course information, it was noted that the benefits flowing from interoperability of course information data had already been recognised and previously attempted at Greenwich. The attempt led by the Development & Communications Office to achieve this through the Eagle project had stalled due to the unsuitability of course information for marketing purposes. It was noted that currently, the course information stored in the Student Records System (Banner) is taken directly from Programme Approval & Review documentation generated from the University’s Quality Assurance processes (overseen by the Learning & Quality Unit).

Course information stored in Banner is passed on to the Marketing Team, where it is rewritten by professional writers for marketing purposes. This action is required because the original content is written for validation and review purposes and is of little value in marketing terms. Once rewritten, this data is stored and managed within the Development & Communications Office in a separate content management system. The result is two completely separate sets of “centrally held” data on course information. (Additionally, there are instances where versions of course documentation and course information are also held locally by schools/academics.)
Contributors looked at a series of examples of the course information stored in Greenwich’s student records system and at the information on Greenwich’s website and compared this with the course information offered on the websites of providers who use XCRI-CAP. A consensus emerged that the course information on the Open University website was an example of good practice and one that Greenwich should aspire to.

The pros and cons of Greenwich’s current approach to managing course information were discussed at length and the following comments/questions were noted:

• Greenwich has no clear formalised processes for the management of course information. There are no clear processes for managing the impact in marketing terms of changes to courses made through the validation and review process.

• There are no clear data management principles underpinning the content management system managed by Development and Communications. It is not possible to replicate the information there for different audiences, and so data is prepared separately for different external bodies. This approach is resource intensive.

• From a resourcing perspective it is inefficient to store data twice (or more) unless there is a clear identified need. It was not clear to contributors whether there was a clear need for two completely separate versions of course information, one for course approval/review (held in Banner) and another for marketing purposes.

• Further questions were raised about the purpose of storing course information in the student records system. The value of storing this information was unclear and it was questioned whether staff use it.

• The quality of course information generated through the quality assurance process and stored in the student records system was considered to be poor; the content viewed varied significantly, and there were many examples where it was written in inaccessible language (e.g. poor grammar, spelling errors, or esoteric terminology). Further, the issue of how data is stored in Banner (particularly the format of narrative text) was noted.

• Contributors asked whether it would make sense to consider the course information requirements of potential applicants at the course validation/review stage, and whether it was necessary to use professional writers to write course information for marketing purposes. It was agreed that a necessary action would be to find out how other institutions deal with this issue: specifically, do they have separate marketing copy, and where/when is it generated and stored?

• Towards the end of the workshop, a quick brainstorm (below) indicated that XCRI-CAP could deliver data interoperability at Greenwich: