Data Science and AI Educators' Programme: Cohort Call Six notes
===
###### tags: `cohort-call-6` `DS-AI-Educators'-Programme` `asynchronous`
## CC6: Continuous assessment and feedback: self-guided exercises
There are **three** activities, each taking around 10-15 minutes to complete.
### Is YOUR feedback fit for purpose?
Feedback may sometimes feel like nothing more than an expected ritual within the cycle of assessment processes that are part of academic life. However, it can be much more than this, performing multiple functions for us as educators:
- correct student errors;
- develop understanding through educator explanations;
- generate more learning by suggesting further specific study tasks;
- promote the development of generic skills by focusing on the evidence of the use of the skills rather than the content;
- promote meta-cognition by encouraging students' reflection and awareness of learning processes involved in the assignment;
- encourage students to continue studying.
Effective feedback can have multiple benefits:
- promote an individual's learning journey and encourage greater achievement;
- enable students to reflect on their strengths and weaknesses and develop accordingly;
- foster greater levels of self-esteem and motivation;
- enable educators to realign their teaching content and methods in response to learners' needs;
- encourage a more effective dialogue between educators and students;
- develop core skills around assessment and peer-to-peer evaluation.
:::info
### ACTIVITY 1:
Think of a recent time when you gave feedback to a student. This could have been verbal or written feedback. Consider the following three questions, noting down your answers in the HackMD below.
1. What were your overall aims for the piece of feedback?
2. Were different parts of the feedback trying to do different things?
3. Do you think your aims will have been clear to the student: what will they have taken away from your feedback?
Now, consider how you might **improve** your feedback:
4. Identify any ways in which you could change, re-word or add to the feedback to further achieve those purposes, and draft those changes, adding to your notes below.
:::
### ACTIVITY 1 REFLECTIONS:
**1. What were your overall aims for the piece of feedback?**
* [name=Ayesha] my overall aims were...
* [name=Samantha] my aims were to help their understanding of Python basics, including how to interpret error messages
* [name=Amina] my aim was to inform them about the ethical use of text-generating AI tools in their assignments
* [name=Edo] my aim was to give them an informed opinion on a project they were doing, trying to guide them in improving their final work.
* [name=Paul] my aim was to help them understand the issues with using ChatGPT within their studies
* [name=Saman] my aim was to help them improve the quality of their work for the final submission of the Data Science course assessment (this was given on a formative assessment)
* [name=Tom] My aim was to provide verbal feedback on answers to a question asked about why certain steps are important in using version control
*
*
**2. Were different parts of the feedback trying to do different things?**
* [name=Ayesha] my feedback was...
* [name=Samantha] potentially: part was about understanding the error message, the other about identifying typing errors or inconsistencies.
* [name=Amina] one part was directly addressing the school policy on the use of AI tools, and the second part was specific to the content of their work, like the strength of the argument
* [name=Edo] One part was mainly focused on pointing out which were the strengths and weaknesses of the project, while the other included suggestions on how to improve the final project.
* [name=Paul] the feedback was mainly focused on looking at the output from ChatGPT, especially validating example references provided and their accuracy/correctness.
* [name=Saman] I pointed out areas that needed improvement, with suggested improvements, and areas that were well presented
* [name=Tom] Part of it was to confirm correct responses and explain why those were correct, part was to synthesise different responses which together resulted in a more accurate answer, and part was to acknowledge people who'd given an incorrect or incomplete answer, explaining the misconceptions and reassuring them that these were common.
*
*
*
**3. Do you think your aims will have been clear to the student: what will they have taken away from your feedback?**
* [name=Ayesha] my aims were...
* [name=Samantha] my aims would have been clear within the context of the workshop, and hopefully useful going forward
* [name=Amina] my aims were clear to students since they realised that even small tasks are being checked for authentic completion
* [name=Edo] I think the aims were quite clear to the students; they had been expecting formative feedback from me since the beginning of their project, and they were informed about when and how it would be provided.
* [name=Paul] my aims were to help the student understand the limitations of using Generative AI within their studies, as well as discuss the ethics of such tools.
* [name=Saman] yes, as they only asked questions about a couple of things that were not related to the quality of the work but to the document style/layout
* [name=Tom] difficult to say as we do a lot of virtual teaching and can't sense body language or demeanour. I try to be very explicit about why I'm giving that feedback or that it's common for people new to the subject to make the same mistakes, possibly more so than if I was teaching in person.
*
*
*
*
**4. Drafted changes to my feedback:**
* [name=Ayesha] my changes were...
* [name=Samantha] I could change my feedback by asking some additional questions: have you seen this error before? What have been common errors so far?
* [name=Amina] my change was to provide more evidence of their use of AI tools for the assignment to back up my feedback.
* [name=Saman] considering the questions, perhaps I could provide a sample report rather than just guidelines
* [name=Tom] I'd prepare a list of questions/responses to address misconceptions ahead of time and ask them as multiple choice/poll questions in Teams.
*
*
*
*
*
---
:::success
### Feedback: efficiency for us, learning payoff for students.
### ACTIVITY 2:
**Consider the list of feedback methods, below.**
- live feedback in class
- individual written feedback
- ad hoc verbal feedback e.g. in a seminar
- written feedback, unreadable or too short
- peer group discussion
- exam marks, no comment
- peer assessment, assuming fairness
- generic written report for all students
- recorded audio feedback for individuals
- self feedback/assessment
- talking to small groups about common problems
- face-to-face feedback one-to-one
- recorded generic audio feedback to a whole group
- criteria sheets - rubrics
- email feedback
- track changes
- hand-written feedback on end-of-semester major assignments
**Pick 6-10 of the methods and decide where they sit on the following matrix. Make sure to include methods that you currently use, and please add any additional methods to the list. Click [here](https://miro.com/app/board/uXjVM_wE65c=/?share_link_id=189380200232) to add your methods to the shared matrix. Responses can be anonymous, and you may have differing opinions, so please do not worry about what has already been entered into the matrix. Use the menu on the left-hand side to select post-it notes and add your ideas.**

### Questions for reflection:
1. Where do your current feedback methods sit? What does this suggest about continuing to use these methods or changing to other methods?
2. Are you able to find methods which could work in your practice and which are both highly efficient for educators and highly beneficial for students?
3. Are there ways in which some of the methods can be adapted slightly to bring them closer to the top-left quadrant?
:::
### ACTIVITY 2 REFLECTIONS:
**1. Where do your current feedback methods sit? What does this suggest about continuing to use these methods or changing to other methods?**
* _[name=enter name here]_ ...
* [name=Samantha] we use a combination of methods: in workshops we do small group work and pair discussions, individual feedback and group feedback. Some are efficient for both us and the students, some more efficient for us, others more for the students. On the whole these are effective. We also seek regular feedback from students about the workshop as it progresses.
* [name=Amina] in my case we use both in-person feedback after seminars and lectures and digital feedback in the form of comments on submitted work and emails.
* [name=Edo] Since classes mainly happen in lab/tutorial settings, I usually give the majority of feedback live during classes. I believe that this method is quite efficient for us and, if done rigorously, has a high learning payoff for students.
* [name=Saman] I give individual feedback on a submission, which I will continue to give, but giving generic feedback to the whole group would benefit the entire group
* [name=Tom] As we're teaching virtually and not formally assessing, I tend to give a lot of live feedback in session and ad-hoc verbal feedback. Occasionally we run hackathon-style sessions where participants work in groups to tackle a common problem. For our graduate programme we do offer some projects for which written feedback is given on code and outputs.
*
*
*
*
*
*
**2. Are you able to find methods which could work in your practice and which are both highly efficient for educators and highly beneficial for students?**
* _[name=enter name here]_ ...
* [name=Samantha] yes, and we do use some already. Peer discussions, feedback and group work. We also make use of rubrics in assessed courses.
* [name=Amina] yes, we use the majority of those methods to give feedback. For now, at least for the big projects, one-to-one in-person feedback is the most effective one.
* [name=Edo] peer feedback, talking to small groups, live feedback.
* [name=Saman] generic group report feedback
* [name=Tom] It's difficult to incorporate more 'academic' formal assessment into day-to-day training; however, it's something that's always being considered. I'll certainly try to incorporate more formal questioning into sessions that are planned ahead of time.
*
*
*
**3. Are there ways in which some of the methods can be adapted slightly to bring them closer to the top-left quadrant?**
* _[name=enter name here]_ ...
* [name=Samantha] for 'exam marks, no comment', the assessment could be autograded, with automatic feedback set up in advance for each question (see the sketch after this list); and 'recorded generic audio feedback to a whole group' works better if it is also accompanied by specific feedback for the individual. Key considerations with feedback are usefulness and timing: is it specific information that could be actioned, and can it feed into upcoming tasks or assessments?
* [name=Amina] I think that the generic written report could be made more specific to each student and their progress within the courses. It would be great to have a complete record of each student's strengths and weaknesses that educators can access to understand the student's academic situation and then give feedback accordingly.
* [name=Saman] get student feedback on which method of feedback they appreciate and use that
*
*
*
*
*
*
*
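
Picking up Samantha's autograding idea above, here is a minimal sketch of what 'autograded with automatic feedback set up in advance' could look like in plain Python. The names (`student_mean`, `autograde`) and the feedback messages are illustrative assumptions, not tied to any particular grading platform.

```python
# Minimal autograding sketch: each check pairs a test case with the
# feedback message a student receives if that check fails.
# (Illustrative only -- names and messages are assumptions.)

def student_mean(values):
    """A hypothetical student submission: compute the mean of a list."""
    return sum(values) / len(values)

# (arguments, expected result, feedback if wrong) triples written in advance
CHECKS = [
    (([1, 2, 3],), 2.0, "The mean is the sum divided by the count."),
    (([5],), 5.0, "A single-element list should return that element."),
    (([0, 0, 4],), 4 / 3, "Zeros still count towards the length."),
]

def autograde(func, checks):
    """Run each check; return a score and the pre-written feedback."""
    score, feedback = 0, []
    for args, expected, message in checks:
        try:
            if abs(func(*args) - expected) < 1e-9:
                score += 1
            else:
                feedback.append(message)
        except Exception as exc:  # broken submissions also get feedback
            feedback.append(f"Your code raised {type(exc).__name__}. {message}")
    return score, feedback

score, feedback = autograde(student_mean, CHECKS)
print(f"Score: {score}/{len(CHECKS)}")
for line in feedback:
    print("-", line)
```

The design point is that the feedback is written once, up front, so the marginal cost per student is near zero while each student still receives an actionable comment rather than a bare mark.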
---
:::danger
### Feedback to students would work much better for me if only I...
Below are responses from educators on things which would make feedback better from their own perspectives. This activity is designed to help you reflect on challenges that you yourself face with respect to feedback, the source of the challenge, the factors which are perhaps within your control and the areas where you would like to explore new strategies.
### ACTIVITY 3:
1. Read through the statements and, in the first column, 'true for me', tick the statements that you feel apply to you. Please do not put your responses directly into the table; instead, use the section further below to jot down your responses.
2. Consider the responses you ticked as 'true for you' and, in the 'source of challenge' column, jot down the source of the challenge(s) for you. You might find the following categories helpful but, if not, use some of your own: 'the institutional system', 'student attitude', 'my attitude', 'time', 'my skills or abilities' etc.
3. Pick three or four of these challenges which feel the most pressing and explore potential strategies to counteract these. You may want to make use of the Slack channel here to get advice or strategies from your fellow educators.
| Staff responses... <br><br> _Feedback to students would work much better for me if only I..._ | True for me | Source of challenge |
| ------------------------ | ---- | --------- |
| Thought they'd read and digest it. | | |
| Could give them back their essays to keep. | | |
| Thought it would make a difference. | | |
| Was able to do it more quickly. | | |
| Could be in the right frame of mind when meeting them face to face. | | |
| Could be sure they would understand what I'm trying to tell them. | | |
| Could get them to turn up to receive feedback. | | |
| Had some expectations that they would use it to improve their next grade/mark. | | |
| Cared less about the judgements I make and how to articulate these in writing. | | |
| Could get more of them to attend sessions and engage in the learning activities. | | |
| Knew the students better. | | |
| Could discuss this with my students. | | |
| Knew them personally, and their profiles. | | |
| Knew how to phrase it in terms that students understand. | | |
| Saw evidence of improvement as a result of that feedback. | | |
| Knew what to say to them. | | |
| Could be clearer about why they got the grade/mark that they did. | | |
| Made more time for giving effective feedback. | | |
| Knew that they would read the comments and not just look at the grade/mark. | | |
| Could identify better what the student needs to do to improve the next piece of work and feedforward. | | |
| Could do this as a dialogue. | | |
| Didn't take so long over it. | | |
| Had the chance to discuss it and explain it. | | |
:::
### ACTIVITY 3 REFLECTIONS:
**1. The statements that resonate with me are...**
* _[name=enter name here]_ ...
* [name=Samantha] Knew the students better.
* [name=Amina] Had the chance to discuss it and explain it
* [name=Edo] Could identify better what the student needs to do to improve the next piece of work and feedforward; Could discuss this with my students; Was able to do it more quickly.
* [name=Saman] Saw evidence of improvement as a result of that feedback.
* [name=Tom] Knew students better, could meet them and get to know their needs and challenges at work. Again, as we teach mostly working professionals many of these statements are not applicable but some of the concepts are useful.
*
*
*
*
**2. For me, the challenge(s) associated with these statements is/are...**
* _[name=enter name here]_ ...
* [name=Samantha] knowing enough about the students. I teach short one-off workshops; I no longer teach programmes.
* [name=Amina] usually students care more about the mark itself, and it is hard to explain the specific feedback for each student if they get lower-than-expected scores. For instance, they start to compare each other's feedback and you need more time to explain individual feedback.
* [name=Edo] giving good feedback and reflecting on it is quite a time-consuming task both for educators and for students. Having the time to give live feedback during classes is only really feasible when the class size and structure permit it. Discussion is also very important, but it can still be challenging to find the time during classes to do so.
* [name=Saman] giving feedback and understanding it is time-consuming for staff and students, as is checking and submitting drafts so that they improve
* [name=Tom] Just being able to form connections with our participants. No matter how much you get to know a cohort, never meeting them in person and their time being so limited makes training often feel 'do or die' for them rather than quality time exploring concepts and answering questions.
*
*
*
*
**3. Potential strategies for the challenges I encounter are...**
* _[name=enter name here]_ ...
* [name=Samantha] encourage more students to complete the pre-course survey, but also spend more time on intro activities at the start of the workshop
* [name=Amina] have more specific rubrics, and take more time on each piece of written feedback that accompanies the score.
* [name=Saman] I am not sure what would be a good strategy. I would like to get some input from colleagues here
*
*
*
*
*
*
### Useful Links/references:
- **Systematic review of automated assessment**
- Paiva, José Carlos, José Paulo Leal, and Álvaro Figueira. "Automated Assessment in Computer Science Education: A State-of-the-Art Review." ACM Transactions on Computing Education (TOCE) 22.3 (2022): 1-40. [[link]](https://dl.acm.org/doi/abs/10.1145/3513140)
- **Automated assessment in Jupyter Notebooks**
- There is a growing literature exploring approaches to automated grading in Jupyter notebooks, which are commonly used in data science teaching. Here are a couple of interesting starter links (a minimal sketch of the typical autograded-cell pattern follows the list):
- Manzoor, H., Naik, A., Shaffer, C. A., North, C., & Edwards, S. H. (2020, February). Auto-grading jupyter notebooks. In Proceedings of the *51st ACM Technical Symposium on Computer Science Education* (pp. 1139-1144). [[link]](https://dl.acm.org/doi/abs/10.1145/3328778.3366947)
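
For a flavour of the pattern these papers discuss: autograded notebooks typically pair a student answer cell with a locked test cell whose assertions double as instant feedback. The sketch below only illustrates that general pattern; the function name and messages are assumptions, not taken from the papers above.

```python
# Illustrative autograded-notebook pattern (the function name and the
# messages are assumptions, not taken from the papers above).

# --- Student answer cell ---
def normalise(xs):
    """Scale a list of numbers to the range [0, 1]."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

# --- Locked test cell: assertion messages double as instant feedback ---
assert normalise([0, 5, 10]) == [0.0, 0.5, 1.0], \
    "Did you subtract the minimum before dividing by the range?"
assert normalise([2, 2, 4])[0] == 0.0, \
    "The smallest value should map to 0."
print("All tests passed.")
```

Tools such as nbgrader build a full workflow around this idea (locked test cells, partial credit, feedback release), though the snippet here is deliberately tool-agnostic.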