Why?

“Students must be effective self-assessors; to be anything less is to be dangerously ill-prepared to cope with change” (Boud 2000, p. 160).

As part of the development of both sound academic practice and lifelong learning skills, the ability to make self-evaluative assessments of competencies, attributes and learning preferences/needs is vital. Students therefore need a clear understanding of what these are, how to acquire and/or develop them, and the confidence to apply them. To do this, students need opportunities, beyond formal assessment and feedback, to reflect on the competencies they require and to recognise their attributes, strengths, weaknesses and preferences in order to identify and plan for their areas of development.

Development of these skills is key to independent enquiry and self-regulation.

Structured self-assessments built with the diagnostic tool allow students to reflect upon their own levels of confidence in key competencies, attributes and preferences; direct them to a variety of targeted self-access resources, including support, relevant to their levels of confidence; and create space for them to identify their own development plans.

A wider benefit is that the student response data can also be used to inform future course/resource content development, including any specific student requirements around preferences/needs.

What is it?

The confidence-based self-assessment diagnostic tool provides a mechanism for staff to create bespoke diagnostics, allowing students to gauge their levels of confidence in the skills, needs, preferences and competencies that they will need on their course (and/or placement or future employment) and to (self) determine a plan of action to develop these skills and competencies alongside their studies using linked resources.

Students' responses, considered either individually or collectively, can be a useful stimulus for a formative exchange between students and tutors.

Diagnostics can be built to accommodate between one and six skills categories assigned/created by you, with each category containing eight confidence-based questions ("I can…") assigned/created by you. For each question, students choose a response indicating high confidence, reasonable confidence, low confidence, or don't know/no experience. All questions are randomised and category information is not provided until the response page. Based upon the responses given, students are presented with feedback created by you, linked to practical suggestions via URL links (to workshops, practicums, online resources, drop-ins & courses, et cetera) chosen by you. Students then set their own developmental goals based upon these recommendations.
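To make this structure concrete, below is a minimal sketch in Python (not the tool's actual data model; all class and field names are illustrative assumptions) of a diagnostic built to these limits: up to six categories, eight "I can…" questions per category, four confidence responses, and author-written feedback linked to resource URLs.

    from dataclasses import dataclass, field
    from typing import Dict, List

    # The four confidence responses offered for every question.
    CONFIDENCE_LEVELS = [
        "high confidence",
        "reasonable confidence",
        "low confidence",
        "don't know / no experience",
    ]

    @dataclass
    class Question:
        text: str                                               # an "I can..." statement written by the author
        feedback: Dict[str, str] = field(default_factory=dict)  # confidence level -> feedback text
        resources: List[str] = field(default_factory=list)      # URL links to workshops, drop-ins, courses, etc.

    @dataclass
    class Category:
        name: str
        questions: List[Question]

    @dataclass
    class Diagnostic:
        title: str
        categories: List[Category]

        def validate(self) -> None:
            # Enforce the structural limits described above.
            assert 1 <= len(self.categories) <= 6, "a diagnostic holds between one and six categories"
            for cat in self.categories:
                assert len(cat.questions) == 8, f"{cat.name}: each category holds eight questions"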

See Create a Diagnostic for further information on the customisable fields.

Student responses can be set to be anonymised or onymised; in both cases, unique users can be identified.

Diagnostics can be deployed to UoB students (or staff) via SSO, or to external users via a registration page, for example for pre-arrival or Widening Participation (WP) activities.

The student response data (question responses and areas for development) is available via an inbuilt Response Dashboard, with the raw data available for download in CSV format.

The Response Dashboard provides inbuilt analysis of student data, including summary statistics: confidence response levels by question (per Category); confidence response levels by category (per Space); overall weighting by question (per Category); overall weighting by category (per Space); and checked responses by Category (per Space).
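As an illustration, the exported CSV can also be analysed outside the dashboard. The sketch below (Python with pandas) reproduces the first of the summary statistics above; the file name and the column names ("category", "confidence") are assumptions - check them against the headers of your actual export.

    import pandas as pd

    # Load the raw response data downloaded from the Response Dashboard.
    responses = pd.read_csv("diagnostic_responses.csv")

    # Confidence response levels by category: a count of each confidence response
    # within each category, mirroring the dashboard's summary statistic.
    by_category = (
        responses.groupby(["category", "confidence"])
        .size()
        .unstack(fill_value=0)
    )
    print(by_category)

    # Share of low-confidence or no-experience responses per category, which can
    # help prioritise which support and resources to signpost first.
    low = responses["confidence"].isin(["low confidence", "don't know / no experience"])
    print(responses[low].groupby("category").size() / responses.groupby("category").size())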

How might I use it?

Students need to have a clear understanding of skills required for their studies, research and/or future employment. By building self-assessments linked to these skills, we can help students to identify their own tailored and unique areas for development and to link these to specific support and resources.

Examples of areas where a diagnostic might be developed
  • supporting students at year-level to 'signpost' the skills and attributes that they will need and linking them to support and resources
  • supporting students using labs to 'signpost' support and resources in areas such as equipment calibration, health and safety, rules and regulations
  • supporting students with non-academic needs to 'signpost' support and resources
  • supporting students and staff to 'signpost' resources and support around digital capabilities
  • supporting students pre-arrival

What are the pros & cons?

Pros
  • built with accessibility in mind
  • built to be user-friendly
  • no text editor required to create content
  • simple Content Management Template design
  • device-agnostic design - makes it possible to view content on different devices (such as tablets and smartphones)
  • category and question banks are stored openly, for collaboration or sharing
  • URL based content means it can be deployed via any web-based tool - such as Moodle, email, social media
  • diagnostics can be reused
  • co-author assignment to collaboratively build and analyse response data
  • access via SSO (Single Sign-on) for UoB students/staff or via secure registration for external students/staff
  • trackable student data which can be used for research purposes or course evaluations
  • response data available for analysis using both an inbuilt dashboard and an exportable CSV file
Cons
  • adding more sophisticated text editing requires knowledge/confidence to use Hyper Text Markup Language (HTML), though this is not necessary to develop and deploy a diagnostic

Case study

EBM-ITM Diagnostic

Steve Cayzer and Peter Mott, Department of Mechanical Engineering, and Vaggelis Giannikas, School of Management, were among the early testers and used the Diagnostic Tool on two MSc Courses, Engineering Business Management and Innovation and Technology Management, both of which are centred around team-based learning.

The principal purposes of the diagnostic were to:

  • signpost the skills and attributes required on the Courses
  • link student self-assessment to Practicum training sessions on the Course
  • provide insights into student levels of confidence on core elements of the Courses
Response data from the category Digital Learning and Development, showing individual student responses for each question.

Commenting on the Diagnostic Tool, Steve said: “Adopting this tool to support the team-based learning approach I take on these two courses has enabled me to raise the awareness levels of my students and encourage them to take greater ownership of the skills development they need to meet the demands of their courses. So far it has proved very valuable and I am looking forward to using it further in the coming year.”

Further reading

Stankov, L., Kleitman, S. & Jackson, S. (2014) Measures of the trait of confidence. DOI: 10.1016/B978-0-12-386915-9.00007-3

Boud, D. (2000) Sustainable assessment: rethinking assessment for the learning society. Studies in Continuing Education, 22(2), 151-167. DOI: 10.1080/713695728

European Parliament (2008). Skills and competences are further defined in Appendix D.

Evers, F. T. & O'Hara, S. (1996) Educational outcome measures of knowledge, skills, and values by Canadian colleges and universities. Educational Quarterly Review, 3(1), 43–56.

OECD (2012) AHELO Feasibility Study Report: Volume 1: Design and Implementation, 18. Paris: OECD.

Higher Education Funding Council for England (HEFCE) (2014) Review of the National Student Survey, 28. London: HEFCE.

Nicol, D. (2009) Assessment for learner self-regulation: enhancing achievement in the first year using learning technologies. Assessment & Evaluation in Higher Education, 34(3), 335-352. DOI: 10.1080/02602930802255139

Themes

  • confidence-based-assessment
  • confidence-gain
  • learning-gain
  • objective setting
  • self-diagnostic
  • self-efficacy

Guidance

Creating a Diagnostic

Diagnostic Demo

Diagnostic CMS

Bath Blend Baseline

UK Professional Skills Framework

Contacts

For technical or pedagogical advice, or simply to talk through ideas or ask questions about using the Diagnostic Tool and CMS, contact Kevin Renfrew.