
Training material on designing English assessments


Assessment
Barbara Thompson


Teaching Oral English
Communicatively



A Holistic Approach to Assessing Levels
of Spoken English



Purpose of Oral Assessment Tests
2 Types:

• Diagnostic Tests
• Placement Tests


Placement Tests
Purpose:
• For program entry/exit or placement within a
course of study
• Proficiency threshold or cutoff - students can
pass or fail

• Tests can be quantifiable or holistic




Administrative Challenges
• Time consuming to design and administer
• Scheduling constraints

• Expensive
• Can be subjective
• Rater training and turnover


Designing Assessment Tools
Must have:
• Purpose: Oral assessment must be tailor-made
for a particular purpose and population

• Reliability: Accuracy and consistency of
measurement
• Validity: The degree to which test results justify the
interpretations and decisions based on them


Designing Assessment Tools (cont.)
Must have:
Instructional authenticity: Engagement of test
taker’s communicative language ability

Practicality: Constraints imposed by time, money,
personnel, equipment, institutional or
governmental policies

Face validity: The test appears, to students and
teachers, to assess what was taught


Rating of Performance Tests
Rating scales generally have 2 features:
1. Each aspect of language to be rated is listed
(oral grammar, phrasal stress, etc.)
2. Definition of each level in terms of mastery
of language features
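The two features above can be sketched as a small data structure. This is a hypothetical illustration, not an actual rubric from the talk: the aspects, level descriptors, and the simple-mean scoring rule are all assumptions.

```python
# Hypothetical rating scale: each language aspect to be rated is listed,
# and each level is defined in terms of mastery of that feature
# (descriptors heavily abbreviated here).
rubric = {
    "oral grammar": {
        1: "frequent errors obscure meaning",
        2: "errors noticeable but meaning is clear",
        3: "consistent control of common structures",
    },
    "phrasal stress": {
        1: "stress placement often impedes comprehension",
        2: "occasional misplaced stress",
        3: "natural stress in connected speech",
    },
}

def overall_score(ratings):
    """Combine per-aspect level ratings into one score (simple mean)."""
    return sum(ratings.values()) / len(ratings)

print(overall_score({"oral grammar": 3, "phrasal stress": 2}))  # → 2.5
```

A real holistic rubric would weight aspects and band the result into named levels; the mean here only shows how listed aspects and defined levels combine into a single rating.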


Designing an Assessment
What tasks will the test takers perform?
• Sentence repetition or completion: lower
levels

• Oral interview: face-to-face, telephone, or Skype
• Role play: situations relevant to test takers


Designing an Assessment (cont.)
What tasks will the test takers perform?
• Small group discussions
• Answer questions or give opinions
• Describe a picture or graph, or tell a story
• Oral presentation – more formal


Developing Test Questions
• List your objectives or content area


• Develop items that meet these objectives
• Pilot items with groups of students to see
whether they elicit the expected language and
are scorable


Developing Test Questions (cont.)
• Decide on the number of test items

• Avoid yes/no questions
• To raise the difficulty level, create questions
that test several language features at once


Standardize Test Administration
• Will the test be live, taped, or on computer?
(test should be administered the same way
each time)
• Is repetition or rephrasing by the tester
allowed?


Standardize Test Administration (cont.)
• What should the tester do if the student asks
a question or doesn’t respond?
• Is there a time limit for each item?
• Can students read the questions during the
test?


Standardize Rater Training
Testers must administer tests in a standard way,
and be able to score reliably
• Training must always cover the same points
• Trainers and trainees must have a thorough
understanding of the rubric


Test Security
• Test items are never practiced in class
• Do not discuss items with students before,
during, or after the test
• If possible, have several versions of the test
(pilot to standardize versions)
• Video- or audio-record each test


Standardize Rater Training (cont.)
• Trainees should do many practice exercises

• Trainees should do certification exercises
• Inter-rater reliability
• Periodic re-norming

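Inter-rater reliability can be checked with a chance-corrected agreement statistic. Below is a minimal sketch of Cohen's kappa for two raters (the slides do not name a specific statistic, and the scores are invented for illustration):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for
    the agreement expected by chance alone."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of tests rated identically.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement, from each rater's marginal rating distribution.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_expected = sum(
        (counts_a[level] / n) * (counts_b[level] / n)
        for level in counts_a.keys() | counts_b.keys()
    )
    return (p_observed - p_expected) / (1 - p_expected)

# Two trained raters scoring ten oral interviews on a 1-4 scale
# (made-up data): they agree on 8 of 10, giving kappa ≈ 0.71.
a = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
b = [3, 2, 4, 2, 1, 2, 3, 4, 3, 3]
print(round(cohens_kappa(a, b), 2))  # → 0.71
```

Certification exercises and periodic re-norming can then use a kappa threshold (values above roughly 0.6 to 0.8 are conventionally read as substantial agreement) to decide whether a rater scores reliably enough.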

