
Evaluation

For teachers only

Below is the text of the NBIS Short Term Feedback questions for reference.

We are considering using it in this Issue.

[Richel] My comments can be found in code blocks like this :-)
> [Richel] What I don't like is that the results are not public

NBIS Short Term Feedback (STF)

Core question set information

The intention of the STF survey is to find out how participants have used the skills and knowledge they gained through participating in the NBIS course.

[Richel] This seems false to me. Could this be a copy-paste mistake from the Long Term Feedback survey?

A goal I suggest: The intention of the STF survey is to find out how the course can be improved.

The STF survey aims to provide data back to NBIS from course participants.

[Richel] Suggest: 'The results of this survey are used by NBIS'.

I wish it would say:

The results of this survey will be published online, to make sure each voice is heard.

The survey should preferably be given by the course leader to the participants on the last day of the course. Some of the questions below are CORE questions and need to always be included in the survey. There is also room for ADDITIONAL questions that can be modified for each course.

[Richel] Sounds like the core questions are the only ones worth bothering learners with. The additional questions already feel too much like 'nice to have'.

  • Contents
  • Important Information
  • Core Question Set
  • Demographic Information
  • Quality Metrics
  • Additional Questions - Training content/information
  • Additional Questions - Training logistics

Important Information

Below are the core questions for NBIS short term feedback (STF), which are required to be captured for all NBIS training events from August 2018 onwards, most typically in an end-of-training-event feedback survey (i.e. exit survey). The information and Core questions are extracted from the ELIXIR and ELIXIR-EXCELERATE courses. Additional questions are free to be modified to suit the course needs. The format for collecting the data is up to each training provider, although results should be exportable to Excel format. The core questions may be divided into two categories and will by and large be analysed separately - both categories are required to be captured:

  • Demographic information
  • Quality metrics

For the demographic information questions specifically, these may be captured either in the exit survey OR in the registration form. The exit survey should be administered as close as possible to the end of the training event, preferably on the last day of the course. Please add the result of the survey to the course folder in Google Drive (NBIS Course Catalogue).

[Richel] In teaching, I don't care about where the learners come from. I guess this is just to make pretty pictures?

The core question set is followed by a set of Additional (suggested) questions that training organisers might also like to ask. Please note: while the core question set is compulsory, Course leader(s) are encouraged to ask any additional questions for their own collection and data analysis, should they wish.

Data formatting: Preferred column headers for each core metric are given in ‘red’ (in this text version they appear after each question, e.g. advertised). It would be very helpful for analysing the data if everyone used these column headings when exporting the results. Please note: these descriptors are case sensitive (e.g. use advertised, not Advertised). Also, the underscores are important! (e.g. career_stage is NOT the same as career stage).

If possible, please name the dataset file as follows to assist with data handling: YYYY-MM-DD_L/STF_Location_CourseName, e.g. 2018-06-11_STF_Visby_RaukR
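
To make these conventions concrete, here is a minimal, hypothetical sketch of exporting results with the preferred core column headers and the suggested file name. It is not part of the NBIS instructions; it assumes Python with pandas and openpyxl available, and the example responses are invented.

```python
from datetime import date

import pandas as pd  # assumed available, together with openpyxl for .xlsx output

# Preferred core column headers: case sensitive, and the underscores matter.
CORE_COLUMNS = [
    "advertised", "career_stage", "host_university", "gender",
    "have_used_resources_before", "will_use_resources_future",
    "would_recommend_course", "overall_satisfaction", "contact_future",
]

# Invented example responses, one row per participant.
responses = pd.DataFrame(
    [
        ["NBIS website", "PhD candidate", "Uppsala University", "Female",
         "Occasionally", "Yes", "Yes", 5, "No"],
        ["Colleague", "Postdoctoral researcher", "Lund University", "Male",
         "Never - Unaware of them", "Maybe", "Yes", 4, "Yes"],
    ],
    columns=CORE_COLUMNS,
)

# File name convention: YYYY-MM-DD_L/STF_Location_CourseName
course_date = date(2018, 6, 11)
responses.to_excel(f"{course_date.isoformat()}_STF_Visby_RaukR.xlsx", index=False)
```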

Core Question Set

Section 1 - Template: NBIS Short Term Feedback (STF) survey COURSE NAME, LOCATION, YYYY-MM-DD

Thank you for filling in the questionnaire. It is really important to us in order to continually improve the course and the materials we deliver. When filling it in, please keep in mind that your comments, which are not mandatory, are especially valuable. We may share anonymised information with course presenters and developers, as well as for wider quality/impact analyses.

[Richel] I suggest using 'The results will be posted online as-is, to make sure all voices get heard'.

  • Required

Section 2 - Demographic Information

[Richel]: I don't feel it is worth bothering our learners with question 1 below: I don't care. I do see its use in an intake form, but even there, I feel we should respect our learners' time.

  1. Where did you see the course advertised? advertised

     a. NBIS website
     b. SciLifeLab website
     c. Social Media (e.g. NBIS twitter)
     d. Host Institute website
     e. Colleague
     f. TeSS
     g. Email
     h. Internet search
     i. Other (comments)

[Richel]: I don't feel it is worth bothering our learners with question 2 below: it will have no effect on my preparation.

  2. What is your career stage? career_stage

     a. PhD candidate
     b. Postdoctoral researcher
     c. Senior researcher/Principal investigator
     d. Staff scientist
     e. Industry scientist
     f. Other (comments)

[Richel]: I don't feel it is worth bothering our learners with question 3 below: it will have zero effect on my preparation.

  3. What is your host university? host_university

[Richel]: I don't feel it is worth bothering our learners with question 4 below: it will have zero effect on my preparation.

  4. Gender gender

     a. Male
     b. Female
     c. Prefer not to say
     d. Other (please specify)

Section 3 - Quality Metrics

[Richel] I don't see the use of question 5 below; I see no reason why this would change the course or my teaching in any way.

  5. Have you used the tools/resource(s) covered in the course before? have_used_resources_before

     i. Never - Unaware of them
     ii. Never - Used other service
     iii. Occasionally
     iv. Frequently

[Richel] I don't see the use of question 6 below, as I feel question 7 gives me similar information.

  6. Will you use the tools/resource(s) covered in the course again? will_use_resources_future

     i. Yes
     ii. No
     iii. Maybe

  7. Would you recommend the course? would_recommend_course

     i. Yes
     ii. No
     iii. Maybe

  8. What is your overall rating for the course?* overall_satisfaction

     a. Poor (1)
     b. Satisfactory (2)
     c. Good (3)
     d. Very Good (4)
     e. Excellent (5)

(*please include both numeric and categorical scale for this question.)

[Richel] I don't like to bother our learners with this. Instead, I suggest linking to a long-term feedback form on this page.

  9. A. May we contact you by email in the future for more feedback? contact_future

     i. Yes
     ii. No

  9. B. If you answered ‘yes’ to the above question, please enter your email address below. email (Information for question 9B must be collected and stored by each Node/Institution, but should NOT be shared with the Q&I subtask or any other third party due to GDPR considerations.)
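
To make that GDPR constraint concrete, here is a minimal, hypothetical sketch of separating the question 9B email addresses from the dataset that is shared for wider analysis. It is not part of the survey instructions; it assumes Python with pandas and openpyxl available, and the file names are invented.

```python
import pandas as pd  # assumed available, together with openpyxl for .xlsx files

# Hypothetical raw export that still contains the question 9B email column.
raw = pd.read_excel("2018-06-11_STF_Visby_RaukR_raw.xlsx")

# Keep the contact details only in a file stored locally by the Node/Institution ...
raw[["contact_future", "email"]].to_excel("contact_details_local_only.xlsx", index=False)

# ... and drop the email column from the dataset that is shared more widely.
raw.drop(columns=["email"]).to_excel("2018-06-11_STF_Visby_RaukR.xlsx", index=False)
```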

Additional Questions - Training content/information

These are suggested questions that may be of interest (not compulsory):

[Richel] I don't see how question 1 below would change my teaching.

  1. What part of the training did you enjoy the most? enjoy

[Richel]: I don't feel it is worth bothering our learners with question 2 below.

  2. What part of the training did you enjoy the least? to_improve

[Richel]: I don't feel it is worth bothering our learners with question 3 below.

  3. The balance of theoretical and practical content was theoretical_practical

     a. Too practical
     b. About right
     c. Too theoretical

[Richel]: I don't feel it is worth bothering our learners with question 4 below.

  4. How do you rate the pre-course information given? pre_course_information

     Linear scale 1-5: 1 = Very unsatisfactory/Not useful, 5 = Very good/Very useful

[Richel]: I don't feel it is worth bothering our learners with question 5 below.

  5. What other topics would you like to see covered in the future? future_topics

  6. Any other comments? Comments

[Richel]: I don't feel it is worth bothering our learners with question 7 below, for these reasons:

  • SETs ('Student Evaluations of Teaching') encourage poor teaching [Stroebe, 2020]
  • I do not care about 'satisfaction'; instead, I care about the learning outcomes. Sure, they may correlate, but I prefer to ask about the thing I care about
  • Asking for satisfaction needlessly hurts teachers' feelings, as it allows bullying. I've seen two reasonable teachers (one of them me) get a '1' here. I hypothesize that in both cases this was because the learners simply did not like the teacher. We don't need to take such crap

  7. PLEASE RATE EACH SESSION OF THE COURSE satisfaction_per_session_YYYY_MM_DD_am/pm

     a. Did not attend
     b. Poor (1)
     c. Satisfactory (2)
     d. Good (3)
     e. Very Good (4)
     f. Excellent (5)

[Richel]: I don't feel it is worth bothering our learners with question 8 below, as we already have question 5.

  8. Comments on teaching staff teaching_staff (Help our teaching staff to improve by providing constructive feedback. Paragraph text answer.)

[Richel]: I don't feel it is worth bothering our learners with question 9 below.

  9. Was the course held at a teaching level matching your training? teaching_training_level

[Richel]: I don't feel it is worth bothering our learners with question 10 below.

  10. STATEMENTS REGARDING WHAT PARTICIPANTS COULD DO before TRAINING (customised to a specific training) skills_before

[Richel]: I don't feel it is worth bothering our learners with question 11 below.

  11. STATEMENTS REGARDING WHAT PARTICIPANTS CAN DO after TRAINING (customised to a specific training) skills_after

[Richel]: I don't feel it is worth bothering our learners with question 12 below.

  12. What other topics would you like to see covered in the future? future_topics

  13. Any other comments? Comments_1

Additional Questions - Training logistics

These are suggested questions that may be of interest (not compulsory):

[Richel]: I don't feel it is worth bothering our learners with question 1 below.

  1. What would be the preferred length of the course? preferred_length

     Linear scale 1-5 days

[Richel]: I don't feel it is worth bothering our learners with question 2 below.

  2. How did you like the facilities/localities of the course (rooms and surroundings)? course_localities

     Linear scale 1-5: 1 = Not at all, 5 = Very much

[Richel]: I don't feel it is worth bothering our learners with question 3 below.

  3. How did you like the lunch(es) and “fika(s)”? lunch_fikas

     Linear scale 1-5: 1 = Not at all, 5 = Very much

  4. Any other comments? Comments_2


References

  • [Stroebe, 2020] Stroebe, W. (2020). Student Evaluations of Teaching Encourages Poor Teaching and Contributes to Grade Inflation: A Theoretical and Empirical Analysis. Basic and Applied Social Psychology, 42(4), 276–294. https://doi.org/10.1080/01973533.2020.1756817
  • [Sitzmann & Johnson, 2014] Sitzmann, T., & Johnson, S. (2014). The paradox of seduction by irrelevant details: How irrelevant information helps and hinders self-regulated learning. Learning and Individual Differences, 1-11.