Why it matters

In Kineo Courses, we design for results.  

We care about making sure our courses work for your learners and that organisations see real change in behaviour and knowledge. To know they work, we need our assessments to be a true test of the knowledge and skills we’re providing.  

Our experience in learning design means our approaches are engaging, unique, and creative, built on a foundation of learning theory, research, and best practice. Our approach to assessment is no different. This Manifesto has been inspired by some of the most widely recognised and respected sources in eLearning.

Kineo is part of the City & Guilds Group, which has been designing and delivering vocational training and assessment across the globe for over 140 years. This puts us in a unique position to tap into that experience and educational expertise.

Our Learnforce delivery platform has been deliberately designed to gather valuable insights from a huge data set of learning analytics, one which grows daily and covers all our past and present courses. This means we can evaluate, improve, and act upon feedback from thousands of learners who tell us what they want and need from our assessments.

All this knowledge, experience, expertise, feedback, and data has helped to inform this Manifesto which we believe will help us to deliver fair, accurate, and accessible assessments for everyone at all levels of the workforce.

Designing assessments

We will use clear, simple language

We won’t use words or phrases, such as localisms or colloquialisms, that may not be understood by those for whom English is not a first language. We won’t write lengthy, complex question stems or answer options.

We will:

  • use Global English
  • write in short, clear sentences and paragraphs
  • avoid using unnecessarily complex words and phrases
  • use list formatting as appropriate, and
  • only include relevant information.

We will give clear instructions

We won’t be ambiguous about how to answer the question correctly. We won’t risk testing the learner’s ability to understand complex question structures, rather than their understanding of the topic.

We will:

  • clearly introduce each assessment and explain what is needed to pass
  • make it clear when there is more than one correct answer
  • provide clear instructions about how to answer the question, and
  • be clear about the question format (e.g., true or false).

We will test comprehension

Our aim is to determine whether the learner can demonstrate their understanding of the topic.  

We won’t write cryptic or misleading questions, as they might undermine the intention of the training. They may also give the learner an incorrect impression of a situation or requirement.

We won’t make answers easy to guess, for example by offering ‘all of the above’ or by writing questions where only one option relates to the topic.

We won’t trivialise comprehension of the content by using joke questions or answer options.

We won’t write questions that test the learner’s reading comprehension, or understanding of grammar, rather than their understanding of the course material. For example, a question where all four answer options are the same except for the phrasing.

We will:

  • avoid double negatives, e.g., ‘Which of the following things should you not do?’
  • ensure Subject Matter Experts review all questions to confirm their validity and their ability to test comprehension, and
  • ensure all questions directly relate to content presented in the course or topic being assessed (we won’t test what hasn’t been taught).

We will test critical thinking

We will focus on behaviour change. We may challenge learners to apply something they have just learned to a new situation by framing questions as scenarios and asking learners to choose what they, or the character, would do next. We may ask learners to interpret or evaluate facts or predict results. This kind of question requires more critical thinking and can lead to deeper learning.

Where appropriate to the topic and learning outcomes, we will:

  • ask learners to interpret facts (why is this true?)
  • ask learners to explain cause and effect (how does one thing affect another?)
  • base case-study and scenario-based questions on realistic, plausible situations, and
  • put learning into context by drawing on situations the learner is likely to have faced, or may face in the future.

Example fact recall question, written as a scenario

We will challenge learners, but allow them to succeed

We won’t make assessments too difficult, because answering questions incorrectly may strengthen memories for misinformation. We won’t trick learners by writing questions that could be interpreted in more than one way or where the differences between options are too subtle, and we won’t give clues about which answer is correct.

We will:

  • create assessments that are hard enough to reveal how well learners know the topic, and
  • create questions that are easy enough that 80% of learners should pass.

We will write effective question stems

We won’t include irrelevant material in a question stem, as this can decrease the validity of a test score.

We will ensure that each stem:

  • contains more words than each answer option
  • is meaningful by itself
  • is a question or partial statement
  • presents a clear problem, and
  • relates to a learning outcome.

We will write effective answer options

All incorrect answer options (distractors) should be meaningful and possible. We will avoid using ‘all of the above’ or ‘none of the above’, as these options invite guesswork or allow learners to arrive at an answer with only partial knowledge.

Our distractors will:

  • be realistic rather than contrived
  • be 100% incorrect
  • be similar in length, and
  • be written in the same style, so they don’t give themselves away.

Example types of distractors we might use to cement learning:

  • options that expose misconceptions, errors, or misunderstandings, and
  • true statements that do not correctly answer this particular question.

An example of distractors that might address misconceptions and are 100% incorrect

We will use effective formats

The majority of our questions will have one correct answer, to maximise the effectiveness of our assessments (Costello, Holland and Kirwan, 2018).

Creating pre-assessments

We will account for prior knowledge

We will allow for prior learning and knowledge, and account for training that is regularly repeated, by using Adaptics technology where appropriate.

This approach will use a pre-assessment which asks the learner a variety of questions before they start a topic. Each pre-assessment will include different question types and use a bank of questions to present variety in each attempt.

This pre-assessment will eliminate a topic when the learner achieves a 100% score on its questions.
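
To make this flow concrete, here is a minimal sketch in Python of sampling varied questions from a bank per topic and then eliminating any topic the learner answers perfectly. The Question structure and function names are illustrative assumptions, not the actual Adaptics implementation.

```python
import random
from dataclasses import dataclass

@dataclass
class Question:
    topic: str                 # topic this question assesses
    stem: str                  # the question text
    options: list[str]         # answer options
    correct_index: int         # index of the correct option

def build_pre_assessment(bank, questions_per_topic=3):
    """Sample a varied pre-assessment from the question bank, per topic."""
    pre_assessment = []
    for topic in {q.topic for q in bank}:
        topic_questions = [q for q in bank if q.topic == topic]
        pre_assessment.extend(
            random.sample(topic_questions,
                          min(questions_per_topic, len(topic_questions)))
        )
    return pre_assessment

def topics_to_skip(responses):
    """A topic is eliminated only if every pre-assessment answer for it was correct.

    responses: iterable of (Question, chosen_index) pairs.
    """
    results_by_topic = {}
    for question, chosen_index in responses:
        results_by_topic.setdefault(question.topic, []).append(
            chosen_index == question.correct_index
        )
    return {t for t, results in results_by_topic.items() if all(results)}
```

Because random.sample draws a fresh subset on each attempt, repeat learners see variety rather than the same fixed questions.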

Continually improving assessments

We will use design thinking to continually review the effectiveness of our assessments and how well we are living up to the intentions in this Manifesto. We are in a unique position: we have access to huge amounts of learning analytics data for all our courses, and we will use this data to inform, research, and improve our practices.

We will stay informed about the latest design theory and industry trends and work in collaboration with our clients to innovate and refine our assessment methods.

We will:

  • regularly review and act upon learner feedback
  • work with our subject matter experts to regularly review and update our assessments to ensure they are current and in line with best practice, and
  • utilise our learning analytics data to analyse learner behaviour and identify challenges and successes, continually improving and building on what works (see the sketch below).
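
As one example of how analytics data can feed this review, a classical item-difficulty analysis flags questions that most learners get wrong (possibly ambiguous or trick items) or that nearly everyone gets right (possibly too easy to be informative). This is a minimal sketch, assuming responses are available as (question_id, is_correct) pairs; it is not Learnforce’s actual reporting pipeline, and the thresholds are illustrative.

```python
from collections import defaultdict

def item_difficulty(responses):
    """Classical item analysis: proportion of correct answers per question.

    responses: iterable of (question_id, is_correct) pairs from learner attempts.
    """
    totals = defaultdict(int)
    corrects = defaultdict(int)
    for question_id, is_correct in responses:
        totals[question_id] += 1
        corrects[question_id] += int(is_correct)
    return {qid: corrects[qid] / totals[qid] for qid in totals}

def flag_for_review(difficulties, too_hard=0.30, too_easy=0.95):
    """Flag questions outside the target difficulty band for SME review."""
    return {
        qid: ("too hard" if p < too_hard else "too easy")
        for qid, p in difficulties.items()
        if p < too_hard or p > too_easy
    }
```

In practice, the thresholds would be tuned against our target that roughly 80% of learners should pass, and flagged items would go to Subject Matter Experts for rewriting rather than being removed automatically.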

----------------------------------------------------------------------------------------

Inspiration, references, and further resources

Brame, C. (2013). Writing good multiple choice test questions. https://cft.vanderbilt.edu/guides-sub-pages/writing-good-multiple-choice-test-questions/

Chetia, B. (2019). All about Using Scenario-based Assessments in Online Learning. https://blog.commlabindia.com/elearning-design/scenario-based-assessments

Costello, E., Holland, J. and Kirwan, C. (2018). The future of online testing and assessment: question quality in MOOCs. International Journal of Educational Technology in Higher Education, 15(1), p.42.

Malamed, C. (2010). 10 rules for writing multiple choice questions. https://theelearningcoach.com/elearning_design/rules-for-multiple-choice-questions/

Moore, C. (2012). How to rewrite a quiz question as scenario-based training. https://blog.cathy-moore.com/2012/05/scenarios-what-are-they-good-for/

Web Content Accessibility Guidelines (WCAG). Writing for Web Accessibility. https://www.w3.org/TR/WCAG/

Kineo (2024). The Transformative Power of Plain Language in Organisational Learning. https://kineo.com/resources/the-transformative-power-of-plain-language-in-organisational-learning
