Monday, 13 July 2015

Welcome to Middlesex ALTC 2015: Revisiting Assessment


The day was opened by our new Vice Chancellor, Professor Tim Blackman, who thanked all Middlesex staff for the warm welcome to his new role.

The Vice Chancellor shared his ambition for the future of the University and described 'students first' as his mantra. He looks forward to seeing investment in 21st century teaching and dissemination of best practice, with ambition and innovation being at the heart of what we do.

Professor Blackman suggested that those working in higher education should move away from wishing for higher quality students and step towards a healthier appreciation for the diversity and creativity of the students we have, inspiring them to identify their existing assets and achieve their highest potential.

This year’s Annual Learning and Teaching Conference (ALTC) focused on assessment – proposing and sharing models of practice that demonstrate shared assessment literacies within academic communities of staff and students.

Our keynote speaker, Professor Margaret Price, National Teaching Fellow and Professor of Teaching and Learning at Oxford Brookes University, explored the importance of applying these shared literacies, along with other tactics, in formulating effective feedback.

The morning continued with colleagues from within and beyond the University demonstrating effective practice in applying assessment literacy, exploring creative assessment and considering the role of technologies in enabling alternative constructions of the assessment process.

Professor Paul Haywood, Deputy Dean of the School of Art and Design at Middlesex, opened the afternoon’s activities with reflection on the capacity of reconsidering assessment as the co-creation of content through collaborative processes – thus providing learners with alternative levers of control and influence.

The day closed with a drinks reception and the award of this year’s PGCHE prizes.


Keynote I - What Makes Good Feedback Good?

Professor Margaret Price, Oxford Brookes University

The keynote speaker addressed student feedback in the context of a higher education environment that is dynamic and subject to a large number of external and internal pressures. Within that context, there appears to be general consensus that feedback is a crucial component of learning and assessment, and that assessment itself is a key driver of learning. Professor Price pointed out that the discourse of assessment is relatively simple for such a complex subject and suggested a need for new and more descriptive terminology than exists at present. She also noted the added pressure of higher student expectations and a louder student voice from ‘customers’ paying high fees.

Oxford Brookes and Cardiff have collaborated on a new piece of research, examining what makes feedback good in the perceptions of the students and what domains influence these perceptions of ‘good’ and ‘bad’ in this context. The project used student researchers and a cross-discipline approach. Students taking part were asked to bring one piece of ‘good’ feedback and one ‘bad’. They were interviewed around these pieces and then the feedback was analysed. The domains used in the analysis covered quality, context, student development and expectations and were further divided into areas such as technical factors, particularity (i.e. personal/impersonal feedback), recognition of effort, assessment design (crucial), student resilience (can they accept criticism?), student desires (to learn or to achieve a high grade?) etc. The full report is available at http://www.brookes.ac.uk/aske/.

The research found that the domains overlapped and compensated for each other so that feedback that was poor in one domain might be good in another, and vice versa. Three important messages for those giving feedback that emerged were:

  • Give it plenty of time
  • Train, develop and support staff in giving feedback
  • Limit anonymous feedback (personalised feedback scored highly)

Professor Price suggested that students need to develop their assessment literacy if they are to gain the most from assessment and feedback. This can be done by developing their technical understanding of marking and grading, through self and peer assessment, and through an appreciation of what grading criteria actually mean. She pointed out that academics see hundreds of pieces of work and have a tacit understanding of what a 2:1 looks like, for example, but how are students to know this? The answer: give them good examples, and put them through stages of self-assessment, peer review, drafting, re-drafting and perhaps peer-assisted learning, where more experienced students support beginners and help them develop their assessment literacy.

An overarching message resulting from the research was that ‘You don’t need to get it right all the time’: students are very forgiving of delayed feedback or low grades when they can see that there has been a real effort to engage with their individual piece of work. Examples of good practice are available at https://www.plymouth.ac.uk/whats-on/inclusive-assessment-in-practice-conference.

Finally, she concluded by saying that this should go beyond university: students should leave having developed these valuable self-evaluation skills to take with them into the future.





Report by Celia Cozens, e-Learning Content Manager, Centre for Academic Practice Enhancement (CAPE)

Keynote II: Could do Better: Assessing the Assessors

Professor Paul Haywood, Middlesex University, School of Art and Design


If Professor Haywood had been giving the keynote address at a conference on student engagement, his presentation would have got full marks for walking the talk - the whole audience was totally engaged from start to finish. Highly entertaining delivery coupled with the offer of free apples for participants (which, I have to say, were not too forthcoming) proved a winning combination and the keynote sped by. A bit too fast for me, actually, as I had been charged with reporting on the session, but my frantic note-taking could not keep up with the speed and range of information delivered, so I will summarise a few key points and recommend you watch the video for the full, highly worthwhile experience.

Professor Haywood whisked us through a quick history of the modern university, from Bologna to Middlesex. The first universities were, he claimed, set up as propaganda machines to oppress difference. Now we claim to celebrate difference and, to paraphrase V. I. Lenin, see education as the core of our so-called democracy. Indeed, mass education defines us as a democracy, with all aspiring to equal rights of access leading to the widening participation agenda.

Paul Haywood addressed the issue of social mobility – people leaving behind communities and people that they love and understand to go to far-removed ‘institutions of learning’ – and highlighted the alternative offer: outreach education that invests in communities, growing them rather than asset-stripping them. A model of total engagement, where education resources link to community activism, practice environments and life-wide learning, led us to an outline of Paul's major project. Rooted in Salford, whose grim statistics belie the experience, knowledge and creativity within, the project aims to break down institutional barriers to learning and, rather than importing culture to the area, to use and develop the rich diversity of knowledge, experience and creativity that already exists.

One of the key aims of the project (see video for full outline) is the development of the E.L.L.I.E. (Experiential Learning Live and Immediate Evidence) App tool for mobile devices linked to student blogs and on-line learning journals. This is a tool for learners through which they can capture, store, profile and disseminate evidence of learning from experience or practical engagement. It allows the student to start a process of reflection without breaking their flow of activity. The technology is a means of populating and structuring the student’s digital learning journal with raw content to support the process of reflection and, later, portfolio development.

Informal/formal education, institutional learning/social learning, cooperative learners’ action networks (CLANs), onions, apples, all human (and vegetable) life is here. But don't take my word for it - watch the video and let Professor Paul Haywood - the human dynamo - explain this all to you so much better than I can.





Report by Celia Cozens, e-Learning Content Manager, Centre for Academic Practice Enhancement (CAPE)

Track A1 - Learning About Assessment Literacy: A Case Study from the Assessment Literacy Project

Nicky Spawls and Clare O'Donoghue, Middlesex University, School of Health and Education

How do you go about reviewing assessment literacy in programmes that have diverse cohorts, widely different prior experiences of assessment and diverging expectations? This was the opening of the presentation by Nicky Spawls and Clare O’Donoghue from the School of Health and Education.

Nicky and Clare decided to review their programmes’ assessment practice and to use this as an opportunity to take a broader, programme-level approach to assessment. They discussed how they questioned consistency of practice across modules, how they identified the need to clarify standards with students, and how they went on to look at ways of explaining the purpose of formative assessment to students.

Their review went back to basics: reminding ourselves of how daunting assessment can be; asking ourselves whether an essay is indeed the best way to assess certain modules; or how to personalise assessment in large cohorts of 150-200 students. Also interesting were questions addressing the need to discuss assessment amongst staff, recognising that assessment is not only about students.






Report by Pascale Colonna, Senior Academic Developer, Centre for Academic Practice Enhancement (CAPE)

Track A2 - Can Assessment Literacy be Enhanced and does it Lead to Improved Student Performance? A Case Study of Year One Business and Management Students at Middlesex University Business School


Simon Roberts, Middlesex University, Business School
Ana Marinica, Centre for Academic Practice Enhancement (CAPE)
Karim Qizalbash, Learner Development Unit (LDU)

Simon Roberts started off the presentation by providing an overview of assessment literacy in the context of a project carried out in the Business School. The 12-week enhanced programme and MBS0111 Preparing for Business were evaluated through an assessment literacy lens, and findings were discussed as part of the session. Students enrolled on the enhanced programme had failed to meet their conditional offer with regard to required UCAS points and were offered a 12-week course aimed at preparing them for university. The programme was designed in collaboration with the Learner Development Unit. Interestingly, it consisted of 12 hours of teaching per week, 6 of which were delivered by the LDU in order to develop students’ academic writing skills.

Assessment literacy can be defined as students’ ability to translate and appreciate the relationship between their learning and assessment, as well as assessment criteria, feedback practices and the level of assessment they are presented with. Ana also highlighted the importance of assessment in a student’s learning journey. Assessment was looked at with regard to what students find important, how they spend their time and, ultimately, how they see themselves as students and later on as graduates. The key issue identified in this project is that students often don’t understand what a good piece of coursework is and what is expected of them, especially with regard to assessment criteria.

Karim Qizalbash presented on how the team employed a model by O’Donovan et al. (2008) - Approaches to Developing Student Understanding of Assessment Standards - when evaluating the 12-week programme. The techniques Karim employed, based on the feedback received from students at the beginning of the course, were to make all the learning materials relevant to the assessment, which included an essay and presentations. Formal one-to-one tutorials, general teacher-led instruction in seminars, individual/group tasks and peer review exercises were part of the 12-week programme.

The evaluation of the project consisted of both quantitative and qualitative data collection, and two cohorts of students (enhanced and January start) were compared. The initial findings showed that the grades of the enhanced students were lower across all the assessments (with the exception of HRM1004); however, the enhanced students perceived higher clarity in the requirements of each assessment and were slightly more confident than their January start counterparts.

Most of the discussion during the workshop revolved around the implications and possible issues of managing the enhanced students, and how the findings of this project can be disseminated and put into practice by the Programme Team. For further details, please have a look at the video and PowerPoint slides from this session.





Report by Natalia Czuba, Educational Technologist, Centre for Academic Practice Enhancement (CAPE)

Track A3 - The Use of MCQs within Team Based Learning: Choosing the Right Approach to Foster Student Learning

Dr Venetia Brown, Dr Kevin Corbett, Icram Serroukh
Middlesex University, School of Health and Education

How do you sell the importance of research skills to nursing students? Nursing students know they are not going to be researchers. They can’t quite work out the relevance research has to nursing. So they hate it.

Venetia’s interest in team based learning started a couple of years ago after seeing Jenny Morris from Plymouth University present at a Nurse Education Today conference. Venetia was so impressed with the approach that she visited Plymouth and saw TBL in action. On returning to Middlesex she consulted the BSc Nursing programme teaching team.

Icram, a graduate teaching assistant, has been working alongside Venetia, Kevin and the research methods teaching team to implement the TBL approach with the September 2015 nursing student cohort. Icram says TBL is a small-group facilitation process that can be used with large cohorts (200 or more), where small permanent groups of 5 to 7 students work together. The permanency allows group cohesion to develop: ‘research has shown that after time, students become more concerned about the group’s performance than their own individual performance’. Although physical space is needed, the approach can be used in lecture theatres.

Is TBL a structured method of flipping the classroom? Does it embody the ideals behind ‘assessment for learning’? The first phase of TBL asks students to go through learning materials before the session; this phase is timetabled into the module and all learning materials are made available electronically through the module’s online space. Interestingly, each resource has time recommendations and explicitly flagged objectives, so students understand what is required of them.

The second phase, ‘readiness assurance’, consists of five sessions. The first is an introductory session, used to familiarise students with the process of TBL; the remaining sessions take up to 1½ hours each. The first part of each session starts with an Individual Readiness Assurance Test (iRAT), an MCQ test with 15 to 20 questions that contributes 70% of the module’s summative assessment grade. The second part is the Team Readiness Assurance Test (tRAT), in which exactly the same test is taken by the group. Immediate Feedback Assessment Technique (IF-AT) cards frame the team’s discussion of the iRAT questions.

This approach generates student excitement around learning by introducing a game-based, or gambling-based, approach to team learning. IF-AT cards are similar to lottery scratch cards: the team must discuss and agree on an answer before scratching. A correct response reveals a star and awards the maximum 4 points; negative marking comes into play for incorrect responses. The discussion around the answers, in addition to providing feedback on the iRAT test, allows students to delve more deeply and critically into each question and share their individual understanding and knowledge with other team members. The tRAT team score contributes 30% of the module’s summative grade.
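To make the weighting concrete, the scoring described above can be sketched in a few lines of Python. The 70%/30% iRAT/tRAT split is as reported in the session; the exact IF-AT point deductions were not specified, so the one-point-per-extra-scratch scheme and the function names here are illustrative assumptions only.

```python
def ifat_points(scratches_used):
    """Points for one IF-AT question: a correct answer on the first scratch
    earns the maximum 4 points; each additional scratch loses a point
    (a simple form of the negative marking mentioned in the session)."""
    return max(4 - (scratches_used - 1), 0)

def module_grade(irat_percent, trat_percent):
    """Combine individual (iRAT) and team (tRAT) percentages using the
    70%/30% weighting reported for this module."""
    return 0.7 * irat_percent + 0.3 * trat_percent

print(ifat_points(1))         # correct on first scratch -> 4
print(ifat_points(3))         # correct on third scratch -> 2
print(module_grade(80, 90))   # -> 83.0
```

Under this sketch, a team that debates its way to the right answer quickly keeps most of the points, which is exactly the incentive the scratch-card format is designed to create.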

The final stage, the application of course concepts, can last between two and four hours. In this module this phase is formatively assessed and involves the group members working collaboratively on a case study; the same case study is used across all groups.

Watch the recording of this presentation to find out more …






Report by Asanka Dayananda, Senior Academic Developer, Centre for Academic Practice Enhancement (CAPE)

Track A4 - Assessment Literacy in Student Midwives

Jo Killingley, Middlesex University, School of Health and Education


As part of the revisiting assessment theme of the Learning and Teaching Conference 2015, Jo’s talk centred on providing prompt feedback to students immediately after one of their exams, the Objective Structured Clinical Examination (OSCE), which she carried out as a research project. Jo identified that:

  • Feedback has to promote critical thinking and self-judgement
  • Feedback has to be behaviour focused
  • Clarity and clear articulation – students want to be in control of their learning and to be motivated through clear feedback; feedback helps them take that control
  • Timeliness – feedback should be provided close to the time of the event in order for it to be effective
  • Coping with learning – feedback can provoke high levels of anxiety
  • Feedback should be constructive – destructive feedback can lie more in the recipient’s perception than in the intended feedback content

Jo described in detail how much dedication is required to support and mentor students through their learning progress on their Midwifery course and the value quality feedback played in this process.




Report by John Koushappas, Senior Educational Technologist, Centre for Academic Practice Enhancement (CAPE)

Track A5 - 'I Came Here to be Taught the Law by You …’ Designing Assessment so Students Want to Find out for Themselves

Dr Maureen Spencer, Middlesex University, School of Law
 

At the start of a very interesting and thought-provoking presentation, Dr Spencer posed a philosophical puzzle called ‘The Flute Game’, originally used by the eminent Professor and Nobel Laureate, Amartya Sen, to illustrate the multi-dimensional idea of justice. There is one flute available and three children with ‘claims’ upon it. Anne says the flute should be given to her because she is the only one who knows how to play it. Bob says the flute should be handed to him as he is so poor he has no toys to play with. Carla says the flute is hers because it is the fruit of her own labour. How do we decide between these three legitimate claims?

A great way to start a debate, and one immediately ensued, but it was not such a great idea in the view of one of Dr Spencer’s students, whose indignation gave us the title of this session: ‘I came here to be taught the law by you’ (and, by implication, not to take part in pointless philosophical debates).

Of course, what Maureen Spencer was trying to illustrate for her students was the fact that questions of value don’t have just one ‘right’ answer, and that learning and assessment should be designed to challenge student conformity and question the ‘answers’.

Dr Spencer suggests that “academic staff should consider adopting more vigorously the Boyer Commission recommendation that inquiry-based learning (IBL) should be our pervasive pedagogical approach”. At a time when students are treated as consumers of higher education, she advocates moving away from approaches based on the passive exposition of content.

The presentation outlined an innovation whereby delivery and assessment of a third-year Law module, ‘Criminal Evidence’, was transformed from information-based lectures to an emphasis on student inquiry. The pedagogic objectives of this inquiry-based learning (IBL) approach included: learning to learn; research capabilities; analysis; cognitive skills; communication and domain knowledge. As Dr Spencer put it, the main aim of using IBL for this module was to add to the individual’s knowledge base and to develop critical thinking.

Students were given a range of inquiry tasks to pursue and presented their findings, aided by extensive online resources and the core textbook. Assessment was by viva voce and a seen exam. The law exam is traditionally a test of memorising large amounts of information, and the seen exam represents a move away from this – reducing some of the enormous stress and moving into the information age, where memorising large tracts is not as relevant or necessary as it once was.

The approach is a work in progress, and the outcome was only partly successful in this first year of IBL. Dr Spencer reported that on the one hand the oral questioning assessment encouraged the expression of independent thinking but on the other hand there was limited participation in the overseas campus in the weekly blogs. To quote from Dr Spencer’s abstract of this session: “To improve performance more attention needs to be given to involving students in the delivery of the module, for example by making the podcasts multi-voiced including students and staff. Overall the experiment confirmed the potential for IBL to enable students to contribute to constructing rather than simply reflecting the world they will join as professionals.”






Report by Celia Cozens, e-Learning Content Manager, Centre for Academic Practice Enhancement (CAPE)

Track B1 - 'Readiness for Direct Practice': Using Video as a Tool to Assess Social Work Students

Helen Hingley-Jones, Middlesex University, School of Health and Education
Angus Macdonald, Centre for Academic Practice Enhancement (CAPE)


This session explored the ways in which video could be used as an assessment tool in evaluating the skills of social work students. We began by watching a video in which Helen Hingley-Jones described the context within which the project fits. Social work education has been under scrutiny and reform in recent years, leading to the development of the Professional Capabilities Framework.

This represents nine different domains of practice for social workers, at all stages of their career, from pre-training through to continuing professional development in advanced roles. Students undergoing training are required to reach a basic level of competence in these domains in a short, intensive period of time.

The need to evaluate students’ communication skills and demonstrate basic social work knowledge led to the development of a video interview process as part of the assessment of their readiness for direct practice. Students interact with actors based on a particular scenario and are assessed on different criteria using a rating scale. The criteria include building a rapport with the actor, evidencing good listening skills and acknowledging any potential risks raised during the interview.

Angus then talked to us about the use of video in e-learning, mentioning that Kay (2012) identifies the main areas in which it is often used, including lecture capture, supplementary materials, and worked examples - crucially, video for assessment was not identified. Angus argues that video can be used effectively for student presentations or recorded submissions, digital storytelling, and providing feedback. With regard to the video interview process, Cartney (2006) says that “Recording and playing back interviews with students has the potential to generate powerful learning experiences”, and suggests that video feedback is the most effective method for improving oral communication skills. The method was compared to the OSCEs (Objective Structured Clinical Examinations) which have been used since the 1970s and involve a similar kind of interview process; it is argued that the addition of recorded video makes this even more effective.

The interview day itself was described as very intense for both students and staff, and it was difficult to manage the sheer numbers of students. Emphasis was placed on the importance of supporting and communicating with students both inside and outside the room, and making sure that video equipment was set up correctly to ensure the process went as smoothly as possible, with as little distraction for the students as possible. That said, some students found being in front of a camera perturbing, and it was suggested that students be given more opportunities to practise in this kind of environment before the event. Ultimately the session was a success: only four students failed on the day, and three of these passed at re-sit.

In the tutors’ opinion the exercise was challenging but a realistic form of preparation for social work practice and there was evidence of students’ deep learning. They concluded that video is a meaningful and highly effective tool for teaching and assessment of practice skills.





Report by Paul Smith, Senior Educational Technologist, Centre for Academic Practice Enhancement (CAPE)

Track B2 - Open Badges: Capturing and Rewarding Hidden Learning

Dr Simon Best, Middlesex University, Business School
John Parkinson, Centre for Academic Practice Enhancement (CAPE)

John Parkinson, Senior Academic Developer (CAPE), and Dr Simon Best, Senior Lecturer in the Business School, teamed up to tell us about digital badges: what they are, the results of a pilot implementation on a module (MGT3193 Business Start Up), and how they might be used going forward.

We were introduced to the idea that digital badges share many of the attributes of traditional badges earned in organisations like the Scouts and Guides, identifying and promoting achievements developed through some type of process. They can be seen as a type of “credentialing system” arising out of research into gamification of learning.

A main driver for the use of digital badges has been the Mozilla Foundation. After working with major organisations, such as NASA, it launched the OBI (Open Badge Infrastructure) in 2011, which provides a common system for the creation, issuance and verification of digital badges across a variety of platforms. Then in 2013 Mozilla created the Badge Backpack, an online space where badge recipients can store and present their badges to others.

The badge itself is a 250 x 250 pixel graphic, which can be encoded with metadata such as the awarding institution and information about the skill achieved. The challenge is to create a meaningful visual within this size restriction. If schools want to use the Middlesex logo going forward, the institution would need to consider what is going out with that endorsement. An awarded badge is added to the user’s individual online ‘backpack’, which can link to social media such as LinkedIn.
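For readers curious what ‘encoded with metadata’ means in practice, the sketch below shows the rough shape of the JSON that the Mozilla OBI associates with a badge: a badge class describing the award, and an assertion recording who earned it. All names, URLs and values here are invented placeholders rather than real Middlesex badges, and the field set is a simplified reading of the OBI specification.

```python
import json

# Hypothetical badge class: describes the award itself (placeholder values).
badge_class = {
    "name": "Research",
    "description": "Awarded for demonstrating research skills on MGT3193.",
    "image": "https://example.org/badges/research.png",  # the 250 x 250 px graphic
    "criteria": "https://example.org/badges/research-criteria",
    "issuer": "https://example.org/issuer",
}

# Hypothetical assertion: records that one recipient earned the badge above.
assertion = {
    "uid": "mgt3193-research-0001",
    "recipient": {"type": "email", "hashed": False,
                  "identity": "student@example.org"},
    "badge": "https://example.org/badges/research.json",
    "issuedOn": "2015-07-13",
    "verify": {"type": "hosted",
               "url": "https://example.org/assertions/0001.json"},
}

print(json.dumps(assertion, indent=2))
```

It is this small bundle of verifiable data, rather than the image alone, that lets a backpack or a site like LinkedIn check who issued a badge and why.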

“So what about the pedagogy?” asks John. After reviewing around 50 papers, he found little literature so far on the use of badges in education, but there is some work emerging around self-efficacy via motivation and assessment. Possibly badges can be considered signposts for learning, sitting alongside the curriculum.

After a step-by-step demonstration of how easy it is to create and administer badges in My Learning, John handed over to Simon Best for a run-through of how the pilot was applied to MGT3193. This module is taught using a disruptive learning methodology, whereby students are not provided with texts or reading lists but spend the first few weeks learning what is expected of them, in order to work through four linked, progressive assessments: a reflective essay, a group progress presentation, a business plan and a final presentation involving contributions from each member of the group. The working groups are carefully constructed based on a combination of Belbin’s roles, what the students are majoring in and the spread of skills and knowledge. Their assessment is not based on right or wrong answers but on whether they can acquire and apply tools and skills to support their arguments. Badges were awarded for leadership, research and communication; if students acquired all three, they automatically received the fourth (group leadership).

Initially, Simon and John began the pilot thinking that badges would be useful for employability, but at this stage of the project evidence is still being collated on the affordances of badges for professional credentialing. However, they do feel there is something around cognitive ability and supporting assessment to be reviewed and explored in the future.

Simon reflected that “we don’t assess or reward an enormous amount of skills that the students pick up”, which prompted a question about using badges to capture the learning between modules, for example, being a student voice representative or giving a presentation.

To find out more about this innovative pilot, please watch the video of the session or review the slides via the links.







Report by Louise Merlin, eLearning Content Manager, Centre for Academic Practice Enhancement (CAPE)

Track B3 - Using My Learning in the Assessment Cycle: How Hard Can it Be? Challenges and Benefits of Using VLEs for Assessment

Jas Ahmad, Middlesex University, Business School
Luiza Dantas, Centre for Academic Practice Enhancement (CAPE)


Jas and Luiza provided an energetic presentation on the benefits and versatility of integrating My Learning (Moodle) tools into the assessment cycle.

Luiza opened by describing the various stages of the assessment cycle: 1) specifying requirements; 2) setting assessment criteria and expectations; 3) supporting students through the assessment process; 4) submission of assignments; 5) marking and feedback; 6) returning marks and feedback; and 7) reflection. She identified that more focus and engagement is needed in the first four stages than is currently given.

Jas then walked us through his approach to these first four stages of the assessment cycle. Jas advocates the use of Breeze presentations. Breeze is freely available online and allows the lecturer to upload PowerPoint slides coupled with a short audio narration for each slide. This allows Jas’ students to gain the basic information regarding assessment design at home, allowing for richer engagement and more informed questions during his face-to-face lectures.

Jas emphasised the need to ‘frontload’ your course with information and support. Jas sets out his expectations early on, providing recorded lectures and online announcements, and engaging with online discussions on how students should prepare for seminars, lectures and assignments. In his experience, students routinely need the same guidance at the start of every term, so providing this information in a pre-recorded format saves valuable time and avoids confusion in the long run.

Challenges to providing online guidance and support can include resistance from students who see more value in receiving the information face to face. Gaining the students’ ‘buy-in’ early on, and continuously throughout the course, is important for the success of online support. It is also important to consider necessary adjustments for disabled students: if these students are not considered at the preparation stage of teaching, this can create a lot of backtracking and extra work later.

Jas empathised with colleagues who may be reluctant to try out technologies they are not familiar with, due to fear of failure and the potential for being caught on camera making a mistake. He recommends speaking to the camera as though you are speaking to a student in the room; if you trip over a sentence or need to correct yourself it is not a big deal, and with time and practice confidence will grow.

Luiza closed the session by reminding the audience that the Centre for Academic Practice Enhancement is on hand to support any members of staff who would like to branch out and explore the potential of using My Learning in their assessment strategies.




Report by Jessica Isaacs, Online Learning Content Developer, Centre for Academic Practice Enhancement (CAPE)

Track B5 - Demo session: eMarking & eFeedback

Paul Smith and Kirsteen MacDonald
Middlesex University, Centre for Academic Practice Enhancement (CAPE)


This session included demos of a number of online tools to help with e-feedback and e-marking:
  • How to use the Turnitin App for iPads for marking, and giving typed and audio feedback (which syncs to your Turnitin Assignments on My Learning)
  • How to use the Grademark feature in Turnitin on My Learning
  • How to annotate PDFs submitted to the Assignment tool to give feedback
  • How to mark-up and add comments to Word documents submitted to the Assignment tool when giving feedback
  • How to give video feedback through My Learning
  • How to quickly and easily re-purpose and re-use quizzes for individuals and multiple groups
The Centre for Academic Practice Enhancement (CAPE) provides staff support for use of My Learning - queries can be sent to elearning@mdx.ac.uk

For help with all aspects of My Learning please visit My Learning Essentials - Staff

The Centre for Academic Practice Enhancement (CAPE) offers staff development across a range of areas relating to academic practice and learning technologies; please visit our pages in the 2015-2016 MDX Staff Development Brochure*, in the chapter called Developing Academic Practice, which starts on page 19.

(*This link directs you to a page on the Mdx Staff Intranet; you will need access to the University's internal server to view it.)




Track C1 - Digital Stories - Everyone Has a Story to Tell… ‘Assessing Students Using Digital Stories and Bringing Out Their Creativity'

Alex Chapman and Luiza Dantas
Middlesex University, Centre for Academic Practice Enhancement (CAPE)

The theme of the session was digital storytelling and how this can be used as an assessment tool. Luiza Dantas introduced the presentation by explaining that stories are universal, used in all cultures and passed down from one generation to the next. We all know how to tell a story, so how can we take this capability and use it for reflection and to present our work for assessment?

Using digital technology to add sound, colour and images to a story can bring it to life and make it engaging both for the storyteller and the intended audience. Luiza Dantas showed an example from YouTube of the story of Australian Aboriginal culture – images of the environment, the people and their art and work, backed with Aboriginal music, combined to give a rich and informative overview in a very short time. But though this was interesting and entertaining, we had to get on to the serious business of assessment – how could this be deep and rigorous enough for assessing our students? There were a few in the room who were definitely waiting to be convinced and some doubt was in the air.

The presenters put forward a very convincing argument for the benefits of using this tool. Luiza Dantas pointed out that – on the whole – students hate reflective writing. However, many of them are asked to produce a written reflection for assessment. Why not use a digital story, they asked. It is much more enjoyable for everyone and focuses the student fully on being concise, to the point, engaging, creative and reflective. What about presentations? How many students, particularly international ones, suffer the undue stress of standing up and talking, accompanied by ubiquitous PowerPoint slides? Why not get them to create a digital story instead? The same could be used to replace an Executive Summary; how much more creative and engaging that would be. Another idea was a digital CV. In this digital age, how impressive it would be to receive a digital story about your prospective employee – and you would know that they were equipped with the technological skills that are so important in 21st-century employment. The CV can be part of a person’s LinkedIn profile – the possibilities are endless … (you can tell I enjoyed this session).

Alex Chapman presented some excellent examples of students’ work, from the Business School and from a Mental Health programme. He pointed out (to the doubters in the room) that these assessments generally carry a small percentage of marks and are normally accompanied by a storyboard, and a written reflection as well as the story itself. He indicated the range of areas that could be assessed, including project planning, editing, research, critical analysis, originality, reflection, creativity and more.

Students were asked to feed back on the experience, and while most started out unsure or negative, they afterwards reported that it was exciting, enjoyable and skill enhancing, and that they would like to see it used in more modules. A few questions arose, mostly to do with copyright issues, and it was agreed that clear guidance should be given at the start of the process as this – like referencing – is all part of developing students’ academic integrity. Luiza Dantas concluded by pointing out that as well as developing all the core skills, digital stories also develop technical, visual and digital literacies – so important for employability. What’s not to like?


Report by Celia Cozens, eLearning Content Manager, Centre for Academic Practice Enhancement (CAPE)

Track B4 - Drop-in Session: Hands-on Session Following on from Track C1


Alex Chapman, Middlesex University, Centre for Academic Practice Enhancement (CAPE)

The drop-in session was a follow-up to the morning session - Digital Stories – Everyone Has a Story to Tell by Luiza Dantas and Alex Chapman. This workshop focused on the four stages of creating a digital story: pre-production, production, presentation, and assessment and reflection. These stages were discussed in detail and covered:
* gathering and researching multimedia in compliance with copyright laws
* writing a script
* planning the stories using digital storyboard tools (online and paper based)
* using technologies and various software that are free and accessible for students
* presentation and assessment of digital stories

Resources from the session: 
 
Guides and websites:


Kathy Schrock's Guide to digital stories
Digital Storytelling Tools for Educators by Silvia Rosenthal Tolisano
Digital storytelling site for students and educators

Software:
 Apple iMovie (Macintosh OS X)
 Audacity (Macintosh OS X and Windows)
 Microsoft Photo Story 3 (Windows)
 Windows Moviemaker 2.1 (Windows)
 Google story builder
 21 Free Digital Storytelling Tools For Teachers and Students

Free pictures:
Flickr: Advanced Search for Creative Commons only.
http://www.flickr.com 
Compfight: Search Creative Commons Flickr Images
http://compfight.com/
Pixabay
http://pixabay.com/
Wikimedia Commons
http://commons.wikimedia.org/wiki/Main_Page 

FreeImages
http://www.freeimages.com/

Royalty-Free Music and Sounds:
http://digitalstorytelling.coe.uh.edu/page.cfm?id=23&cid=23&sublinkid=95
http://freesound.org
http://soundbible.com

The Centre for Academic Practice Enhancement (CAPE) provides staff support for use of Digital Stories - queries can be sent to elearning@mdx.ac.uk.


Track C2: Developing quality feedback and assessment literacy in dance

 Julia K. Gleich
 Head of Choreography (London Studio Centre)

London Studio Centre shared the development of their institutional Learning & Teaching Strategy 2013-18, focussing on assessment and feedback. The nature of practice at LSC makes use of intense, continuous, day-to-day formative feedback from both tutors and peers, which accounts for an unusual learning experience in HE. Addressing the high frequency of assessment and feedback is at the centre of LSC’s culture of quality enhancement, and their review of assessment strategies has led to significant successes in the National Student Survey.
The institution has reviewed strategies for developing effective feedback and assessment, including ways of creatively assessing creativity, and has identified points to consider in constructing effective feedback.

Identified issues related to creative assessment:

·         How to “prepare for assessment” in a context of creativity
·         Prior learning
·         Tick-boxing / Goal-oriented-ness
·         Subjectivity
·         House styles
·         New ideas and risk-taking
                                                        
The staff at LSC also recognise that assessment literacy is a double-edged sword: if students are too focused on assessment, they will be unwilling to take risks and discover their own voices in a creative environment. In addition, assessment tasks and criteria must be designed to reward even “inappropriate” choices as students re-invent themselves, develop new movement vocabularies, explore compositional strategies and form their artistic identities.

Constructing feedback
At LSC feedback is provided taking into consideration the points below:
·         Specificity
·         Constructive comments
·         Appropriate language
·         Transparency – comments such as “the student needs to be more analytical” are not clear unless the starting point is made explicit
·         Enabling and encouraging/Positive but honest
·         Personal

The presentation also included the seven principles of good feedback practice from David Nicol and Debra Macfarlane-Dick:

“good feedback practice should facilitate the development of self-assessment (reflection) in learning, encourage dialogue, clarify what good performance is, deliver quality information about the student's learning, provide information to teachers to shape learning, and should be motivational for students and create positive self-esteem.”


Report by Betty Sinyinza, Educational Technologist, Centre for Academic Practice Enhancement (CAPE)