Humanising Language Teaching
AN OLD EXERCISE

Types of English language Tests and Their Application in Creating Examination Papers for Technology Students at the University of Food Technologies in Bulgaria

Ivanka Marinova, Bulgaria

Ivanka Marinova is a senior language lecturer at the University of Food Technologies in Plovdiv, Bulgaria. She teaches specialized English to students from the Technological faculty and to PhD candidates in Engineering. Her interests are linguistics and translation. E-mail: ivanka_vassileva@yahoo.com

Menu

Abstract
Introduction
Discussion
Application of existing test items at the UFT – Bulgaria
Students’ opinion
Conclusion
References

Abstract

This paper summarizes existing techniques for testing students’ knowledge of English. Examples from mock examination papers for technology students show how the test items are applied at the UFT – Plovdiv, Bulgaria. Finally, a survey conducted among the students shows how they feel about each examination technique.

Introduction

Language tests are primarily instruments for measuring each student’s performance in comparison with that of other students or against established norms. When creating a test, the teacher or lecturer should be acquainted with the existing test items [1] or techniques [2] in order to choose the most appropriate one for the skill he or she wishes to evaluate (reading, speaking, grammar, vocabulary, etc.).

Discussion

In terms of scoring there are two main types of tests: subjective and objective [1; 25-30]. Subjective tests measure the students’ ability to communicate in the target language. These usually include the writing of stories, essays, reports and letters, or oral descriptions of a picture, discussions of a certain problem, etc. The scoring of such tests can be difficult because students express themselves differently: they use different grammatical constructions and vocabulary and may take an individual, unique approach to the task at hand. As a result, there can be a number of possible correct answers. In addition, the teacher or lecturer marks the test according to his or her own standards and requirements, which makes the scoring of the item all the more difficult.

Objective tests measure the students’ knowledge of the grammatical constructions and vocabulary of the target language rather than the ability to use the language in a certain situation (they check whether the students can recognize certain grammatical forms or idiomatic expressions as correct or whether they can use them correctly in a sentence). The writing of such objective tests requires careful consideration. It is up to the lecturer to decide what type of knowledge he or she wishes to check, which problem areas to include, which test item to use and how to score each correct answer. But once a scoring system has been established, the student’s mark will be the same irrespective of which examiner grades the paper. That is why objective tests are very popular among examiners responsible for a large number of students.
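
The fixed-scoring property described above can be sketched in a few lines of Python: once an answer key and a points-per-item value are agreed, any examiner (or a script) arrives at the same mark. The key, the paper and the point value below are hypothetical, purely for illustration.

```python
def score_objective(answer_key, student_answers, points_per_item=1):
    """Score an objective test against a fixed answer key.

    Because the key and the point value are fixed in advance, the mark
    is the same irrespective of who (or what) grades the paper.
    """
    correct = sum(1 for item, right in answer_key.items()
                  if student_answers.get(item) == right)
    return correct * points_per_item

# Hypothetical multiple-choice key and one student's answers
key = {1: "C", 2: "B", 3: "A"}
paper = {1: "C", 2: "D", 3: "A"}
score = score_objective(key, paper)  # 2 of 3 items correct
```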

The existing test items which are most commonly used are as follows:

  • multiple-choice – these items draw on the work of structural linguistics, in particular contrastive analysis and the need to identify and measure the learners’ mastery of the separate elements of the target language (only grammar or only vocabulary) [1; 15]. Multiple-choice items are used in exams such as FCE, CAE, CPE, SAT and TOEFL [5, 6, 7, 8]. They check grammar, vocabulary, reading and listening.
  • error recognition – these items are a variation of multiple-choice. In a given sentence four grammatical constructions are underlined and the testee has to choose which of them is wrong. These items check grammar and are used in public exams such as SAT and TOEFL [7, 8].
  • rearrangement items – here the given elements have to be arranged in the correct order to form a phrase or a sentence. These items check grammar and reading.
  • completion items – these measure the students’ ability to produce suitable and acceptable grammatical forms by filling in gaps. The missing words can be prepositions, articles, auxiliaries, etc. Most often these items take the form of a whole text from which the aforementioned grammatical categories have been erased. They check grammar, vocabulary and reading and are used in FCE, CAE, CPE and SAT [5, 6, 7].
  • transformation items – the task here is to rewrite a given sentence. These items check grammar and are used in FCE, CAE and CPE.
  • changing of words – these items test the students’ ability to use verb forms and tenses correctly. A verb is usually given in the infinitive and the task is to put it into the correct tense. These items check grammar and are used mainly in grammar books and workbooks.
  • broken sentences – these items consist of sets of phrases which have to be combined into a sentence by adding the necessary prepositions, articles, etc. They check grammar but are not used in official British and American public tests.
  • cloze – these items look much like completion items, but whereas in the latter the missing words are erased subjectively by the teacher (only grammatical forms or only vocabulary), in cloze the erasing is systematic: every nth word is erased, usually every 5th, 6th or 7th, irrespective of its function in the sentence. These items check reading and listening. A variation called open cloze is used in FCE, CAE and CPE [5, 6].
  • pairing and matching – the aim in these items is to choose two words or phrases out of a whole set and match them according to similar grammatical features, meaning, etc. These items check grammar and vocabulary; although matching is a good technique, it is not that common in official public tests.
  • true/false – these items are a variation of multiple-choice. The testee reads or listens to a text and, based on it, decides whether certain statements are true or false.
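
The systematic deletion that defines cloze (every nth word erased, whatever its function) can be sketched as below. The passage is adapted from the grammar exercise later in this paper, and the interval n=5 is just one of the usual choices.

```python
def make_cloze(text, n=7):
    """Return (cloze_text, answers): every nth word replaced by a blank,
    irrespective of its grammatical function in the sentence."""
    words = text.split()
    answers = []
    for i in range(n - 1, len(words), n):
        answers.append(words[i])
        words[i] = "_____"
    return " ".join(words), answers

passage = ("Molecules are formed by the chemical combination of two "
           "or more atoms and elements are divided into metals and non-metals")
cloze, key = make_cloze(passage, n=5)
# key holds the erased words, so the same passage also yields its answer sheet
```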

Application of existing test items at the UFT – Bulgaria

Technology students at the UFT – Plovdiv, Bulgaria have vocabulary-oriented textbooks [3, 4] which emphasize the acquisition of special terms related to food technology and engineering. Therefore the exam papers based on these textbooks and created by the lecturers focus mainly on specific vocabulary and partly on grammar. Consequently, the objective techniques used check those particular language skills. The techniques used by the lecturers to create the exams are matching, gap filling (a variation of completion, but with vocabulary items), changing of words, broken sentences and multiple-choice. Here are some examples:

matching – this is one of the most commonly used techniques because it can easily give the examiner an impression of whether or not the student has learned a certain term and its definition. Ex. Match a word on the left to its definition on the right:

1) homocentric       a) cutting off
2) immersed          b) between molecules
3) particulate       c) having the same centre
4) churn             d) taking in heat
5) heterothermic     e) a chemical combination of two or more different elements
6) intramolecular    f) giving out heat
7) heterogeneous     g) with body temperature that changes
8) intermolecular    h) covered completely in a liquid
9) trough            i) formation of crystals
10) endothermic      j) consisting of separate particles
11) batch            k) to shake or stir violently to make butter
12) exothermic       l) in the molecule
13) shearing         m) a narrow open container
14) compound         n) having different parts or elements
15) crystallization  o) an amount produced at the same time

15 points

gap filling – this is also a widely used test item because it, too, gives information about the student’s knowledge of certain terms from Chemistry, Physics, Mathematics or Engineering. Ex. Provide the missing information:

  1. A hexagon has ……… sides and ……… vertices.
  2. An octagon has ………angles.
  3. A nonagon has ……… vertices.
  4. A heptagon has ……… sides.
  5. A quadrilateral has ……… angles.
  6. A tripod is a …………… – legged object.
  7. A pentagram is a ………… - pointed figure.
  8. A quadrangle has ………………..sides.
  9. A polygon has ……… angles.
  10. An equilateral has ……… sides.

10 points

broken sentences – this technique is not that common because it is somewhat difficult to score: how many points should be allocated to each correct sentence? However, it is a good way of checking whether or not the students can construct a grammatically correct sentence. Ex. Write questions with the following words:

  1. food / should/ I /eat / overweight?
  2. food / mustn’t/ you / eat / celiac?
  3. food / mustn’t / you / offer / vegetarians?
  4. food / can/ vegans / eat?
  5. food / should / people / avoid / allergic / peanuts?
  6. food / can / lactovegetarians / eat?
  7. things / must / you/ avoid / high cholesterol?
  8. products / can / you / offer/ ovovegetarians?

8 points

changing of words – this is a good technique for checking students’ grammar and so is commonly used when constructing tests for technology students at the UFT. Ex. Fill in the blanks with the correct Passive form of the verb in brackets:

  1. Molecules (form) ………………………… by the chemical combination of two or more atoms.
  2. Elements (divide) ………………………… into metals and non-metals.
  3. A few elements (refer) ………………………… to as semi-metals or metalloids.
  4. A chemical combination of two or more different elements (call) ………………………… a compound.
  5. The atoms (link) ………………………… together by chemical bonds.
  6. Inorganic compounds (obtain) ……………………… from non-living sources.
  7. A mixture (make) ………………………… up of at least two substances.

7 points

multiple-choice – with this technique the teacher can check both grammar and vocabulary, and it is therefore a preferred one in constructing tests. Ex. Choose the correct answer:

  1. I don’t like his style. I think he writes ….
     A the worst   B badly   C good   D bad
  2. If something breaks or cracks easily, it is ….
     A coarse   B fine   C brittle   D stiff

Students’ opinion

A survey was conducted among one hundred students to find out how they feel about each item with which they have been tested. The students were asked to assess the different types of test items by marking them as easy, difficult, confusing, objective, or a good way of checking knowledge of English. They were also asked to give their opinion on whether they find it easy or difficult to understand the task in each test item. The results from the survey are presented in Table 1.

Table 1

(% is the percentage of students)   Matching  Gap filling  Changing of words  Broken sentences  Multiple-choice
Difficult                               5%        52%            57%               69%                2%
Easy                                   59%        38%            22%               19%               79%
Confusing                               0%         9%            26%               54%                0%
A good way of checking English         80%        63%            54%               18%               71%
Easy to understand the task            70%        59%            41%               19%               73%
Difficult to understand the task        2%         7%            10%               27%                0%
Objective                              77%        60%            51%               20%               79%
Preferable                             65%        35%            27%               17%               78%

The table shows that more than half of the students consider gap filling, changing of words and broken sentences difficult, while very few have any difficulty with matching and multiple-choice items. Accordingly, the majority of students find matching and multiple-choice easy, whereas relatively few consider gap filling (38%), changing of words (22%) and broken sentences (19%) an easy task. The same tendency is observed with the next criterion: none of the students find matching or multiple-choice confusing, while more than half (54%) consider broken sentences so; changing of words (26%) and gap filling (9%) can also cause confusion. The majority of students assess matching (80%) and multiple-choice (71%) as a good way of checking knowledge of English, and although they find gap filling and changing of words difficult and confusing, more than half still regard them as a good way of testing their English. The survey further shows that, as a whole, the students find it easy to understand the instructions in each test item; the only ambiguous item is broken sentences, which 27% find difficult to understand. The majority of students also consider the test items objective, with the exception of broken sentences (20%). Finally, the most preferred test items are matching (65%) and multiple-choice (78%).

Two tendencies have been observed based on this survey:

  • The majority of the technology students at the UFT, Bulgaria find matching and multiple-choice easy, clear, objective, and a good and preferable way of testing their knowledge of English.
  • The rest of the items (gap filling, changing of words and broken sentences) tend to be difficult and confusing for a considerable number of students, which makes them less preferred.

The general conclusion is that the students in this survey find easy, and prefer, those test items in which the correct answer is already given among the other possible options. In matching they have to choose two words out of a given set and match them according to similar meaning, grammatical features, etc. In multiple-choice the students have four options from which to choose the correct one. Perhaps subconsciously they feel that even if they don’t know the correct answer, seeing it somewhere in the item could trigger their memory or some part of their linguistic competence and help them identify the correct option. Conversely, when the students have to think of a word or produce a correct grammatical construction on their own, they find the task more difficult, and those test items are therefore less preferable to them (as is the case with gap filling, changing of words and broken sentences).

The teachers, of course, want to check every level of the students’ linguistic competence and will, therefore, continue to apply all of the techniques discussed above.

Conclusion

Constructing English language tests is not an easy task. Teachers have to take a lot into consideration in order to create the exact test item they need. The tests at the UFT are as objective as possible in order to avoid any partiality on the part of the lecturer and to make the scoring quick and easy. This works well for an institution which has to examine a large number of students every semester. The survey also shows that the people who study at the UFT find the techniques objective and, therefore, fair. And although they find some of the test items difficult to do, most of them agree that they are a good way of testing their English language skills.

References

[1] Heaton, J. B., Writing English Language Tests, Longman, 1990

[2] Hughes, Arthur, Testing for Language Teachers, Cambridge University Press, 1988

[3] Luizova-Horeva, Tsveta, English for Food Technology and Engineering, An Introduction, UFT Academic Publishing House, Plovdiv, 2010

[4] Luizova-Horeva, Tsveta, English for Food Technology and Engineering, UFT Academic Publishing House, Plovdiv, 2011

[5] Newbrook, Jacky; Wilson, Judith; Acklam, Richard, FCE Gold Plus – Course book, Longman, 2004

[6] www.flo-joe.co.uk/cae/students/index.htm

[7] www.majortests.com/sat/grammar.php

[8] www.examenglish.com/TOEFL
