
Monday, 13 July 2015

Keynote I - What Makes Good Feedback Good?

 Professor Margaret Price, Oxford Brookes University

The keynote speaker addressed student feedback in the context of a higher education environment that is dynamic and subject to a large number of external and internal pressures. Within that context, there appears to be general consensus that feedback is a crucial component of learning and assessment, and that assessment itself is a key driver of learning. Professor Price pointed out that the discourse of assessment is relatively simple for such a complex subject and suggested a need for new and more descriptive terminology than exists at present. She also noted the added pressure of higher student expectations and a louder student voice from ‘customers’ paying high fees.

Oxford Brookes and Cardiff have collaborated on a new piece of research, examining what makes feedback good in the perceptions of the students and what domains influence these perceptions of ‘good’ and ‘bad’ in this context. The project used student researchers and a cross-discipline approach. Students taking part were asked to bring one piece of ‘good’ feedback and one ‘bad’. They were interviewed around these pieces and then the feedback was analysed. The domains used in the analysis covered quality, context, student development and expectations and were further divided into areas such as technical factors, particularity (i.e. personal/impersonal feedback), recognition of effort, assessment design (crucial), student resilience (can they accept criticism?), student desires (to learn or to achieve a high grade?) etc. The full report is available at http://www.brookes.ac.uk/aske/.

The research found that the domains overlapped and compensated for each other, so that feedback that was poor in one domain might be good in another, and vice versa. Three important messages emerged for those giving feedback:

Give it plenty of time
Train, develop and support staff in giving feedback
Limit anonymous feedback (personalised feedback scored highly)

Professor Price suggested that students need to develop their assessment literacy if they are to gain the most from assessment and feedback. This can be done through developing their technical understanding of marking and grading, through self- and peer assessment, and through an appreciation of what grading criteria actually mean. She pointed out that academics see hundreds of pieces of work and have a tacit understanding of what, for example, a 2:1 looks like; but how are students to know this? The answer was to give them good examples and put them through stages of self-assessment, peer review, drafting and re-drafting, and perhaps peer-assisted learning, where more experienced students support beginners and help them to develop their assessment literacy.

An overarching message resulting from the research was that ‘You don’t need to get it right all the time’: students are very forgiving of delayed feedback or low grades when they can see that there has been a real effort to engage with their individual piece of work. Examples of good practice are available at https://www.plymouth.ac.uk/whats-on/inclusive-assessment-in-practice-conference.

Finally, she concluded by saying that this should go beyond university: students should leave having developed these valuable self-evaluation skills to take with them into the future.





Report by Celia Cozens, e-Learning Content Manager, Centre for Academic Practice Enhancement (CAPE)

Keynote II: Could do Better: Assessing the Assessors

Professor Paul Haywood, Middlesex University, School of Art and Design


If Professor Haywood had been giving the keynote address at a conference on student engagement his presentation would have got full marks for walking the talk - the whole audience was totally engaged from start to finish. Highly entertaining delivery coupled with the offer of free apples for participants (which, I have to say, were not too forthcoming) proved a winning combination and the keynote sped by. A bit too fast for me, actually, as I had been charged with reporting on the session, but my frantic note-taking could not keep up with the speed and range of information delivered, so I will summarise a few key points and recommend you watch the video for the full, highly worthwhile experience.

Professor Haywood whisked us through a quick history of the modern university, from Bologna to Middlesex. The first universities were, he claimed, set up as propaganda machines to oppress difference. Now we claim to celebrate difference and, to paraphrase V. I. Lenin, see education as the core of our so-called democracy. Indeed, mass education defines us as a democracy, with all aspiring to equal rights of access leading to the widening participation agenda.

Paul Haywood addressed the issue of social mobility – people leaving behind communities and people that they love and understand to go to far-removed ‘institutions of learning’ – and highlighted the alternative offer: outreach education that invests in communities, growing them rather than asset-stripping them. A model of total engagement, where education resources link to community activism, practice environments and life-wide learning, led us to an outline of Paul's major project. Rooted in Salford, whose grim statistics belie the experience, knowledge and creativity within, the project aims to break down institutional barriers to learning and, rather than importing culture to the area, to use and develop the rich diversity of knowledge, experience and creativity that already exists.

One of the key aims of the project (see video for full outline) is the development of the E.L.L.I.E. (Experiential Learning Live and Immediate Evidence) App tool for mobile devices linked to student blogs and on-line learning journals. This is a tool for learners through which they can capture, store, profile and disseminate evidence of learning from experience or practical engagement. It allows the student to start a process of reflection without breaking their flow of activity. The technology is a means of populating and structuring the student’s digital learning journal with raw content to support the process of reflection and, later, portfolio development.

Informal/formal education, institutional learning/social learning, cooperative learners’ action networks (CLANs), onions, apples, all human (and vegetable) life is here. But don't take my word for it - watch the video and let Professor Paul Haywood - the human dynamo - explain this all to you so much better than I can.





Report by Celia Cozens, e-Learning Content Manager, Centre for Academic Practice Enhancement (CAPE)

Track A1 - Learning About Assessment Literacy: A Case Study from the Assessment Literacy Project

 Nicky Spawls and Clare O'Donoghue, Middlesex University, School of Health and Education

How do you go about reviewing assessment literacy in programmes that have diverse cohorts, widely different prior experiences of assessment and diverging expectations? This was the opening of the presentation by Nicky Spawls and Clare O’Donoghue from the School of Health and Education.

Nicky and Clare decided to review their programmes’ assessment practice and use this as an opportunity to take a broader, programme-level approach to assessment. They discussed how they questioned consistency of practice across modules, identified the need to clarify standards with students, and went on to look at ways to explain the purpose of formative assessment to students.

Their review went back to basics: reminding ourselves of how daunting assessment can be; asking ourselves whether an essay is indeed the best way to assess certain modules; and considering how to personalise assessment in large cohorts of 150-200 students. Other interesting questions addressed the need to discuss assessment amongst staff, recognising that assessment is not only about students.






Report by Pascale Colonna, Senior Academic Developer, Centre for Academic Practice Enhancement (CAPE)

Track A2 - Can Assessment Literacy be Enhanced and does it Lead to Improved Student Performance? A Case Study of Year One Business and Management Students at Middlesex University Business School


Simon Roberts, Middlesex University, Business School
Ana Marinica, Centre for Academic Practice Enhancement (CAPE)
Karim Qizalbash, Learner Development Unit (LDU)

Simon Roberts started off the presentation by providing an overview of assessment literacy in the context of a project carried out in the Business School. The 12-week enhanced programme and MBS0111 Preparing for Business have been evaluated through an assessment literacy lens, and findings were discussed as part of the session. Students enrolled on the enhanced programme had failed to meet their conditional offer with regard to required UCAS points and were offered a 12-week course aimed at preparing them for university. The programme was designed in collaboration with the Learner Development Unit. Interestingly, it consisted of 12 hours of teaching a week, 6 of which were delivered by the LDU in order to develop students’ academic writing skills.

Assessment literacy can be defined as students’ ability to translate and appreciate the relationship between their learning and assessment, as well as assessment criteria, feedback practices and the level of assessment they are presented with. Ana also highlighted the importance of assessment in a student’s learning journey. Assessment was looked at with regard to what students find important, how they spend their time and, ultimately, how they see themselves as students and later on as graduates. The key issue identified in this project is that students often don’t understand what a good piece of coursework is and what is expected of them, especially with regard to assessment criteria.

Karim Qizalbash presented on how the team employed a model by O’Donovan et al. (2008) - Approaches to Developing Student Understanding of Assessment Standards - when evaluating the 12-week programme. Based on the feedback received from students at the beginning of the course, the techniques Karim employed were to make all the learning materials relevant to the assessment, which included an essay and presentations. Formal 1-2-1 tutorials, teacher-led instruction in seminars, individual/group tasks and peer review exercises were part of the 12-week programme.

The evaluation of the project consisted of both quantitative and qualitative data collection, and two cohorts of students (enhanced and January-start) were compared. The initial findings showed that the grades of the enhanced students were lower across all the assessments (with the exception of HRM1004); however, the enhanced students perceived the requirements of each assessment as clearer and were slightly more confident than their January-start counterparts.

Most of the discussions which took place during the workshop revolved around the implications and possible issues of managing the enhanced students, and how the findings of this project can be disseminated and put into practice by the Programme Team. For further details, please have a look at the video and PowerPoint slides from this session.





Report by Natalia Czuba, Educational Technologist, Centre for Academic Practice Enhancement (CAPE)

Track A3 - The Use of MCQs within Team Based Learning: Choosing the Right Approach to Foster Student Learning

Dr Venetia Brown, Dr Kevin Corbett, Icram Serroukh
Middlesex University, School of Health and Education

How do you sell the importance of research skills to nursing students? Nursing students know they are not going to be researchers. They can’t quite work out the relevance research has to nursing. So they hate it.

Venetia’s interest in team-based learning (TBL) started a couple of years ago after seeing Jenny Morris from Plymouth University present at a Nurse Education Today conference. Venetia was so impressed with the approach that she visited Plymouth and saw TBL in action. On returning to Middlesex she consulted the BSc Nursing programme teaching team.

Icram, a graduate teaching assistant, has been working alongside Venetia, Kevin and the research methods teaching team to implement the TBL approach with the September 2015 nursing student cohort. Icram explained that TBL is a small-group facilitation process that can be used with large cohorts (200 or more), where small permanent groups of 5 to 7 students work together. The permanency allows group cohesion to develop: ‘research has shown that after time, students become more concerned about the group’s performance than their own individual performance’. Although physical space is needed, the approach can be used in lecture theatres.

Is TBL a structured method of flipping the classroom? Does it embody the ideals behind ‘assessment for learning’? The first phase of TBL asks students to go through learning materials before the session; this phase is timetabled into the module, and all learning materials are made available electronically through the module’s online space. Interestingly, each resource has time recommendations and explicitly flagged objectives so students understand what is required of them.

The second phase, ‘readiness assurance’, consists of 5 sessions. The first is an introductory session, used to familiarise students with the process of TBL; the remaining sessions take up to 1½ hours each. The first part of each session starts off with an Individual Readiness Assurance Test (iRAT), an MCQ test with 15 to 20 questions that contributes 70% of the module’s summative assessment grade. The second part is the Team Readiness Assurance Test (tRAT), in which exactly the same test is taken by the group. The use of Immediate Feedback Assessment Technique (IF-AT) cards structures the team’s discussion of the iRAT questions.

This approach generates student excitement around learning by introducing a game-based, even gambling-based, approach to team learning. IF-AT cards are similar to lottery scratch cards: the team must discuss and agree on a correct response before scratching. A correct response reveals a star and awards the maximum 4 points; however, negative marking comes into play for incorrect responses. As well as providing feedback on the iRAT, the discussion around each answer allows students to delve more deeply and critically into the question and share their individual understanding and knowledge with other team members. The tRAT team score contributes 30% of the module’s summative grade.
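To make the weighting concrete, here is a minimal sketch of how such a marking scheme might be computed. The 70/30 iRAT/tRAT split comes from the session; the per-question IF-AT point schedule (4 points on the first scratch, one point lost per extra scratch, floor of zero) is an illustrative assumption, not the module's actual scheme.

```python
def ifat_points(scratches_used: int, max_points: int = 4) -> int:
    """Points for one tRAT question: full marks on the first scratch,
    with 'negative marking' for each incorrect attempt before it
    (hypothetical schedule: lose 1 point per extra scratch, floor 0)."""
    if scratches_used < 1:
        raise ValueError("at least one scratch is needed to reveal the answer")
    return max(max_points - (scratches_used - 1), 0)


def module_grade(irat_pct: float, trat_pct: float) -> float:
    """Combine the individual (iRAT, 70%) and team (tRAT, 30%) scores."""
    return 0.7 * irat_pct + 0.3 * trat_pct


# A team that found 3 answers on the first scratch and one on the third:
team_raw = sum(ifat_points(s) for s in [1, 1, 1, 3])   # 4 + 4 + 4 + 2 = 14
trat_pct = 100 * team_raw / (4 * 4)                    # out of 16 -> 87.5%
print(module_grade(irat_pct=60.0, trat_pct=trat_pct))  # 0.7*60 + 0.3*87.5 = 68.25
```

The point of the diminishing schedule is visible in the numbers: a question answered on the third scratch still earns something, so the team is rewarded for persisting with the discussion rather than guessing once and moving on.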

The final stage, the application of course concepts, can last between 2 and 4 hours. In this module this phase is formatively assessed and involves the group members working collaboratively on a case study; the same case study is used across all groups.

Watch the recording of this presentation to find out more …






Report by Asanka Dayananda, Senior Academic Developer, Centre for Academic Practice Enhancement (CAPE)

Track A5 - 'I Came Here to be Taught the Law by You …’ Designing Assessment so Students Want to Find out for Themselves

Dr Maureen Spencer, Middlesex University, School of Law
 

At the start of a very interesting and thought-provoking presentation, Dr Spencer posed a philosophical puzzle called ‘The Flute Game’, originally used by the eminent Professor and Nobel Laureate, Amartya Sen, to illustrate the multi-dimensional idea of justice. There is one flute available and three children with ‘claims’ upon it. Anne says the flute should be given to her because she is the only one who knows how to play it. Bob says the flute should be handed to him as he is so poor he has no toys to play with. Carla says the flute is hers because it is the fruit of her own labour. How do we decide between these three legitimate claims?

A great way to start a debate, and one immediately ensued; but it was not such a great idea in the view of one of Dr Spencer’s students, whose indignation gave us the title of this session: ‘I came here to be taught the law by you’ (and, by implication, not to take part in pointless philosophical debates).
Of course, what Maureen Spencer was trying to illustrate for her students was the fact that questions of value don’t have just one ‘right’ answer, and learning and assessment should be designed to challenge student conformity and question the ‘answers’.

Dr Spencer suggests that “academic staff should consider adopting more vigorously the Boyer Commission recommendation that inquiry-based learning (IBL) should be our pervasive pedagogical approach”.  At a time when students are treated as consumers of higher education she advocates moving away from those approaches based on the passive exposition of content.

The presentation outlined an innovation whereby delivery and assessment of a third-year Law module, ‘Criminal Evidence’, was transformed from information-based lectures to an emphasis on student inquiry. The pedagogic objectives of this inquiry-based learning (IBL) approach included: learning to learn; research capabilities; analysis; cognitive skills; communication and domain knowledge. As Dr Spencer put it, the main aim of using IBL for this module was to add to the individual’s knowledge base and to develop critical thinking.

Students were given a range of inquiry tasks to pursue and presented their findings, aided by extensive online resources and the core textbook. Assessment was by viva voce and a seen exam. The law exam is traditionally a test of memorising large amounts of information and the seen exam represents a move away from this – reducing some of the enormous stress and moving into the information age, where memorising large tracts is not quite as relevant or necessary as it was.

The approach is a work in progress, and the outcome was only partly successful in this first year of IBL. Dr Spencer reported that on the one hand the oral questioning assessment encouraged the expression of independent thinking but on the other hand there was limited participation in the overseas campus in the weekly blogs. To quote from Dr Spencer’s abstract of this session: “To improve performance more attention needs to be given to involving students in the delivery of the module, for example by making the podcasts multi-voiced including students and staff. Overall the experiment confirmed the potential for IBL to enable students to contribute to constructing rather than simply reflecting the world they will join as professionals.”






Report by Celia Cozens, e-Learning Content Manager, Centre for Academic Practice Enhancement (CAPE)

Track B2 - Open Badges: Capturing and Rewarding Hidden Learning

Dr Simon Best, Middlesex University, Business School
John Parkinson, Centre for Academic Practice Enhancement (CAPE)

John Parkinson, Senior Academic Developer (CAPE), and Dr Simon Best, Senior Lecturer in the Business School, teamed up to tell us about digital badges: what they are, the results of a pilot implementation on a module (MGT3193 Business Start Up), and how they might be used going forward.

We were introduced to the idea that digital badges share many of the attributes of traditional badges earned in organisations like the Scouts and Guides, to identify and promote achievements that can be developed through some type of process. They can be seen as a type of “credentialing system” arising out of research into gamification of learning.

A main driver for the use of digital badges has been the Mozilla Foundation. After working with major organisations such as NASA, it launched the Open Badge Infrastructure (OBI) in 2011, which provides a common system for the creation, issuance and verification of digital badges across a variety of platforms. Then in 2013 Mozilla created the Badge Backpack, an online space where badge recipients can store and present their badges to others.

The badge itself is a 250 x 250 pixel graphic, which can be encoded with metadata such as the awarding institution and information about the skill achieved. The challenge is to create a meaningful visual within this size restriction. If schools want to consider using the Middlesex logo going forward, the institution would need to consider what is going out with that endorsement. An awarded badge gets added to the user’s individual online ‘backpack’, which can link to social media such as LinkedIn.
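To illustrate what "encoded with metadata" means in practice, here is a sketch of the kind of badge assertion involved. The field names follow the Open Badges 1.x assertion format, but every value and URL below is a placeholder invented for illustration, not a real Middlesex badge.

```python
import json

# Hypothetical badge assertion: who earned what, issued by whom, and
# where a consuming platform can verify it. All values are placeholders.
assertion = {
    "uid": "mgt3193-leadership-0001",
    "recipient": {
        "type": "email",
        "hashed": False,
        "identity": "student@example.ac.uk",
    },
    "badge": "https://example.ac.uk/badges/leadership.json",  # badge class
    "verify": {
        "type": "hosted",
        "url": "https://example.ac.uk/assertions/0001.json",
    },
    "issuedOn": "2015-07-13",
}

# This JSON is what gets 'baked' into the 250x250 PNG, so any platform
# reading the image can check who issued the badge and for what skill.
print(json.dumps(assertion, indent=2))
```

Because the metadata travels inside the image itself, a badge displayed on a blog or a LinkedIn profile remains verifiable back to the issuing institution.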

“So what about the pedagogy?” poses John; after reviewing around 50 papers, he found little literature so far on their use in education, but there is some work emerging around self-efficacy via motivation and assessment. Possibly badges can be considered as signposts for learning, sitting alongside the curriculum.

After a step-by-step demonstration of how easy it is to create and administer badges in My Learning, John handed over to Simon Best for a run-through of how the pilot was applied to MGT3193. This module is taught using a disruptive learning methodology, whereby students are not provided with texts or reading lists but spend the first few weeks learning what is expected of them, in order to work through four linked, progressive assessments: a reflective essay, a group progress presentation, a business plan and a final presentation that involves contributions from each member of the group. The working groups are carefully constructed based on a combination of Belbin’s roles, what the students are majoring in and the spread of skills and knowledge. Their assessment is not based on right or wrong answers but on whether they can acquire and apply tools and skills to support their arguments. Badges were awarded for leadership, research and communication; if students acquired all three, they would automatically receive the fourth (group leadership).

Initially Simon and John began the pilot thinking that badges would be useful for employability but at this stage of the project evidence is still being collated on the affordances of badges for professional credentialing. However, they do feel that there is something around cognitive ability and supporting assessment to be reviewed and explored in the future.

Simon reflected that “we don’t assess or reward an enormous amount of skills that the students pick up”, which prompted a question about using badges to capture the learning between modules, for example, being a student voice representative or giving a presentation.

To find out more about this innovative pilot, please watch the video of the session or review the slides via the links.







Report by Louise Merlin, eLearning Content Manager, Centre for Academic Practice Enhancement (CAPE)

Track B3 - Using My Learning in the Assessment Cycle: How Hard Can it Be? Challenges and Benefits of Using VLEs for Assessment

Jas Ahmad, Middlesex University, Business School
Luiza Dantas, Centre for Academic Practice Enhancement (CAPE)


Jas and Luiza provided an energetic presentation on the benefits and versatility of integrating My Learning (Moodle) tools into the assessment cycle.

Luiza opened by describing the various stages of the assessment cycle: 1) specifying requirements, 2) setting assessment criteria and expectations, 3) supporting students through the assessment process, 4) submission of assignments, 5) marking and feedback, 6) returning marks and feedback, and 7) reflection. She argued that the first four stages need more focus and engagement than they currently receive.

Jas then walked us through his approach to these first four stages of the assessment cycle. Jas advocates the use of Breeze presentations. Breeze is available freely online and allows the lecturer to upload PowerPoint slides coupled with short audio narration for each slide. This allows Jas’ students to gain the basic information regarding assessment design at home, allowing for richer engagement and more informed questions during his face-to-face lectures.

Jas emphasised the need to ‘frontload’ your course with information and support. He sets out his expectations early on, providing recorded lectures and online announcements, and engaging in online discussions on how students should prepare for seminars, lectures and assignments. In his experience, students routinely need the same guidance at the start of every term, so providing this information in a pre-recorded format saves valuable time and confusion in the long run.

Challenges to providing online guidance and support can include resistance from students who see more value in receiving the information face to face. Gaining the students’ ‘buy-in’ early on, and continuously throughout the course, is important for the success of online support. It is also important to consider necessary adjustments for disabled students; if these students are not considered at the preparation stage of teaching, this can create a lot of back-tracking and extra work later.

Jas empathised with colleagues who may be reluctant to try out technologies they are not familiar with, due to fear of failure and the potential for being caught on camera making a mistake. He recommends speaking to the camera as though you are speaking to a student in the room; if you trip over a sentence or need to correct yourself, it is not a big deal, and with time and practice confidence will grow.

Luiza closed the session by reminding the audience that the Centre for Academic Practice Enhancement is on hand to support any members of staff who would like to branch out and explore the potential of using My Learning in their assessment strategies.




Report by Jessica Isaacs, Online Learning Content Developer, Centre for Academic Practice Enhancement (CAPE)

Track B5 - Demo session: eMarking & eFeedback

Paul Smith and Kirsteen MacDonald
Middlesex University, Centre for Academic Practice Enhancement (CAPE)


This session included demos of a number of online tools to help with e-feedback and e-marking:
  • How to use the Turnitin App for iPads for marking, and giving typed and audio feedback (which synchs to your Turnitin Assignments on My Learning)
  • How to use the Grademark feature in Turnitin on My Learning
  • How to annotate PDFs submitted to the Assignment tool to give feedback
  • How to mark-up and add comments to Word documents submitted to the Assignment tool when giving feedback
  • How to give video feedback through My Learning
  • How to quickly and easily re-purpose and re-use quizzes for individuals and multiple groups
The Centre for Academic Practice Enhancement (CAPE) provides staff support for use of My Learning - queries can be sent to elearning@mdx.ac.uk

For help with all aspects of My Learning please visit My Learning Essentials - Staff

The Centre for Academic Practice Enhancement (CAPE) offers staff development across a range of areas relating to academic practice and learning technologies; please visit our pages in the 2015-2016 MDX Staff Development Brochure* - the chapter called Developing Academic Practice, which starts on page 19.

(*This link directs you to a page on the Mdx Staff Intranet; you will need access to the University's internal server to view it.)




Track C1 - Digital Stories - Everyone Has a Story to Tell… ‘Assessing Students Using Digital Stories and Bringing Out Their Creativity'

Alex Chapman and Luiza Dantas
Middlesex University, Centre for Academic Practice Enhancement (CAPE)

The theme of the session was digital storytelling and how this can be used as an assessment tool. Luiza Dantas introduced the presentation by explaining that stories are universal, used in all cultures and passed down from one generation to the next. We all know how to tell a story so how can we take this capability and use it for reflection and to present our work for assessment?

Using digital technology to add sound, colour and images to a story can bring it to life and make it engaging both for the storyteller and the intended audience. Luiza Dantas showed an example from YouTube of the story of Australian Aboriginal culture – images of the environment, the people and their art and work, backed with Aboriginal music, combined to give a rich and informative overview in a very short time. But though this was interesting and entertaining, we had to get on to the serious business of assessment – how could this be deep and rigorous enough for assessing our students? There were a few in the room who were definitely waiting to be convinced and some doubt was in the air.

The presenters put forward a very convincing argument for the benefits of using this tool. Luiza Dantas pointed out that – on the whole – students hate reflective writing. However, many of them are asked to produce a written reflection for assessment. Why not use a digital story, they asked. It is much more enjoyable for everyone and focuses the student fully on being concise, to the point, engaging, creative and reflective. What about presentations? How many students, particularly international ones, suffer the undue stress of standing up and talking, accompanied by ubiquitous PowerPoint slides? Why not get them to create a digital story instead? The same could be used to replace an Executive Summary; how much more creative and engaging that would be. Another idea was a digital CV. In this digital age, how impressive it would be to receive a digital story about your prospective employee – and you would know that they were equipped with the technological skills that are so important in 21st-century employment. The CV can be part of a person’s LinkedIn profile – the possibilities are endless … (you can tell I enjoyed this session).

Alex Chapman presented some excellent examples of students’ work, from the Business School and from a Mental Health programme. He pointed out (to the doubters in the room) that these assessments generally carry a small percentage of marks and are normally accompanied by a storyboard, and a written reflection as well as the story itself. He indicated the range of areas that could be assessed, including project planning, editing, research, critical analysis, originality, reflection, creativity and more.

Students were asked to feed back on the experience and, while most started out unsure or negative, afterwards they reported that it was exciting, enjoyable and skill-enhancing, and they would like to see it used in more modules. A few questions arose, mostly to do with copyright issues, and it was agreed that clear guidance should be given at the start of the process as this – like referencing – is all part of developing students’ academic integrity. Luiza Dantas concluded by pointing out that as well as developing all the core skills, digital stories also develop technical, visual and digital literacies – so important for employability. What’s not to like?


Report by Celia Cozens, eLearning Content Manager, Centre for Academic Practice Enhancement (CAPE)

Plenary/Panel Session

Jacqui Boddington (Chair), Prof. Paul Haywood, Phil Barter, Dr. Carol L Davies & Prof. Margaret Price


Watch this video for some innovative and nuanced responses to the following questions:
  • Are we moving fast enough with assessment design or are we just slightly improving on old techniques?
  • How can we deal with academic dishonesty in the form of  'essay mills'?
  • What are your views on top-down and bottom-up approaches to assessment design?
  • In the context of an OSCE, for example, how can you grade a student's level of 'compassion' or 'empathy'?

Hear the latest news and opportunities for Learning and Teaching research and development at Middlesex University (18:30)

See our PGCertHE award winners! (22:20)
 

Workshop 1- Changing Perceptions of Ourselves: How Graduate Academic Assistants are Altering the Learning Experience of Education Studies Students Through Assessment Literacy

Dr. Carole L Davis, Sandip K Gill and Kristína Repová
Middlesex University, School of Health and Education

Carole, along with Sandip and Kristína, facilitated a lively interactive session, with activities around the clarity of guidance and assessment criteria, the importance of the quality of feedback, and peer and self-assessment. Using specific examples from the GAAs’ sessions with Education Studies students to prompt thinking and discussion, the 50 or so participants were tasked with discussing and feeding back suggestions for how they might tackle the specific issues raised by students. These issues included self-doubt in a high-achieving student and another student’s consistent inability to understand and respond to assessment tasks. The session also outlined some of the interesting approaches the department and GAAs have developed that students were appreciative of, such as a book club for students to discuss readings together.

The video clip provides a snapshot of the liveliness of the session and wealth of discussion that arose.

Workshop 2- Appartunity or Appain?

Kate Brown, Sheila Cunningham, Jodie Ward, Kate Wilkinson and Jo Wilson, Middlesex University, School of Health and Education

Are apps in the classroom an opportunity or a pain? This was the question posed by this workshop, with each group exploring answers for themselves. We were promptly organised into groups of 6-8 participants per table, all seated around board-style guided questions designed to let us explore the use of (mobile) applications at Middlesex. The workshop turned out to be as exciting as its title and setup hinted, and was prepared with a good dose of creativity and carefully organised questions.

I’ll describe the experience at our table...

Participants were from different areas, with some qualifying themselves as ‘technology dinosaurs’ and others quietly having used apps in the classroom for some time. All of us seemed very keen to define what is meant by ‘apps’ (mobile apps only, or others?), and the conversation quickly moved to practical considerations (‘but does this work on a Samsung tablet?’, ‘are all apps free?’ etc.) and then on to accessibility (‘would the library consider putting iPads on loan for students who do not have one?’). We also discussed the use of technology as a possible distraction in the classroom and how we can deal with this.

I particularly enjoyed how one academic talked about using 3D visualisations of difficult concepts in his teaching, as a way to break down a dry two-day course and to make some of his presentations more vividly understandable (‘and your work is half done!’). We eventually agreed that we needed to know where to ask for support and guidance (both CAPE Senior Academic Developers and School Librarians were mentioned at that point) in order to be able to focus on pedagogical considerations, and we talked about looking at apps in exactly the same way as we look at other new initiatives: with the same critical approach and as part of an evaluative process.

We also touched on the potential of apps in an assessment context, which we found more difficult. We wondered whether they could help with developing certain aspects such as memorising skills. We also identified areas where apps may currently struggle to support assessment practice, such as assessing creativity. One group wondered whether apps might help with minimising elements of subjectivity in the future, and another concluded that we should develop our own apps, as a way to tailor them to our needs.

I was really impressed with the quality of the conversation in such a short space of time and with as few as 8 people from a range of areas. Definitely the highlight of the day for me.

Here are links to the articles and BBC item used during the session:


Report by Pascale Colonna, Senior Academic Developer, Centre for Academic Practice Enhancement (CAPE)