VIETNAM NATIONAL UNIVERSITY, HANOI
University of Languages and International Studies
FACULTY OF ENGLISH LANGUAGE TEACHER EDUCATION
GRADUATION PAPER
AN INVESTIGATION INTO THE AUTHENTICITY OF
THE WRITING PORTFOLIOS FOR THE 2ND-YEAR
FAST-TRACK STUDENTS IN THE SECOND
SEMESTER AT FELTE, ULIS, VNU
Supervisor: Duong Thu Mai, PhD.
Student: Nguyen Dieu Hong
Course: QH2010.F1.E2
Ha Noi, May 2014
VIETNAM NATIONAL UNIVERSITY, HANOI
UNIVERSITY OF LANGUAGES AND INTERNATIONAL STUDIES
FACULTY OF ENGLISH LANGUAGE TEACHER EDUCATION
GRADUATION PAPER
AN INVESTIGATION INTO THE AUTHENTICITY OF THE WRITING
PORTFOLIOS FOR THE 2ND-YEAR FAST-TRACK STUDENTS AT THE
FACULTY OF ENGLISH LANGUAGE TEACHER EDUCATION, UNIVERSITY OF
LANGUAGES AND INTERNATIONAL STUDIES, VIETNAM NATIONAL UNIVERSITY, HANOI
Supervisor: Dr. Dương Thu Mai
Student: Nguyễn Diệu Hồng
Course: QH2010.F1.E2
HANOI, MAY 2014
ACCEPTANCE
I hereby state that I, Nguyen Dieu Hong, QH2010.E2, being a candidate for
the degree of Bachelor of Arts (TEFL), accept the requirements of the College
relating to the retention and use of Bachelor's Graduation Papers deposited
in the library.
In terms of these conditions, I agree that the original of my paper deposited in
the library should be accessible for the purposes of study and research, in
accordance with the normal conditions established by the librarian for the
care, loan or reproduction of the paper.
Signature
May, 2014
ACKNOWLEDGEMENTS
To complete this graduation paper, I would like to express my gratitude to:
• My supervisor, Dr. Duong Thu Mai, to whom I have always been and will always be
thankful. I highly appreciate her constant, unfailing support during my study.
While providing me with critical, constructive comments, she respected my
autonomy in the way I chose the topic. Although her words of encouragement were
rare, she has always been there for me at the worst moments of my research journey.
• Mrs. Dinh Hai Yen, who provided me with endless support during my data
collection process.
• Mr. Nguyen Chi Duc for his helpfulness, kindness and inspiration.
• All of my participants, who have been the decisive factors in the completion of
this paper.
• My parents for their unconditional love and support.
• My younger brother and my best friends, who have always stood by my side,
believed in me and raised me up whenever I lost faith in myself.
Without all their support and cooperation, this thesis would not have been possible.
ABSTRACT
Authenticity is one of the important qualities that make portfolios a powerful
method of writing assessment. As one of the pioneers in foreign language
education, the Fast-Track division at FELTE, ULIS (University of Languages and
International Studies) has made an attempt to raise writing assessment standards by
including portfolios in its writing courses. For this effort to meet its goal,
authenticity therefore needs to go hand in hand with the method. Nevertheless, how
this quality is to be evaluated remains unspecified. To tackle this problem, the
present study aims at building a framework for authentic writing portfolios and
then applying that framework to investigate the authenticity of the writing
portfolios for the second-year fast-track students at ULIS, VNU. Four phases with
three qualitative methods were employed for well-rounded results. The findings
indicated that the framework for authentic writing portfolios should include five
main dimensions: tasks, goals, context, assessment criteria, and roles of assessors.
With regard to the writing portfolios for the second-year fast-track students at
ULIS, VNU, authenticity was demonstrated to a certain extent, yet some
improvements were still needed. The study will hopefully benefit researchers
and teachers who are interested in the same topic.
Key Words: Language assessment, authenticity, writing competence, and writing
portfolios.
TABLE OF CONTENTS
CHAPTER 1: INTRODUCTION 1
CHAPTER 2: LANGUAGE ASSESSMENT, AUTHENTICITY AND WRITING
PORTFOLIO 7
CHAPTER 3: RESEARCH METHODOLOGY 35
CHAPTER 4: FINDINGS 49
CHAPTER 5: DISCUSSION 76
CHAPTER 6: CONCLUSION 85
APPENDICES 87
REFERENCES 126
LISTS OF TABLES, FIGURES AND ABBREVIATIONS
TABLES                                                                         PAGE
Table 1: Task characteristic (Bachman & Palmer, 1996)                           88
Table 2: Authenticity in writing portfolios                                     51
FIGURES                                                                        PAGE
Figure 1: Measurement, tests and evaluation (Bachman, 1990)                      9
Figure 2: Authenticity (Bachman & Palmer, 1996)                                 12
Figure 3: Five-dimensional framework for authentic assessment (Gulikers, Bastiaens & Kirschner, 2004)   89
Figure 4: Research procedure                                                    35
Figure 5: Classification of authentic tasks (Brown & Menasche, 2005)            56
Figure 6: A framework for analyzing communicative tasks (Nunan, 1989)           76
Figure 7: Authenticity in Writing Portfolios                                    77
ABBREVIATIONS
EFL : English as a Foreign Language
ESL : English as a Second Language
FELTE : Faculty of English Language Teacher Education
VNU : Vietnam National University
ULIS : University of Languages and International Studies.
CHAPTER 1: INTRODUCTION
1.1. STATEMENT OF RESEARCH PROBLEM AND RATIONALE FOR
THE STUDY
With numerous conflicting theories concerning writing, writing assessment
has undergone immense changes over the centuries. In the past, writing was
taught based on the product-oriented approach, in which "students are encouraged to
mimic a model text, which is usually presented and analyzed at an early stage"
(Gabrielatos, 2002, p. 5). Consequently, timed testing seemed to be an appropriate
method to assess students’ writing capacity. Then, the notion that writing is a
process was introduced.
Kroll (2001) has defined this view as follows:
The “process approach” serves today as an umbrella term for many types of
writing courses. What the term captures is the fact that student writers
engage in their writing tasks through a cyclical approach rather than a
single-shot approach. They are not expected to produce and submit complete
and polished responses to their writing assignments without going through
stages of drafting and receiving feedback on their draft, be it from peers
and/or from the teacher, followed by revision of their evolving texts. (p.
220)
In other words, the processes of brainstorming, drafting, reviewing and
editing are essential in writing. Consequently, it is more difficult to assess EFL
students' writing abilities than native speakers' in timed writing assessment (John,
1991). In line with this point, Song and August (2002) point out that the set time
of a writing test does affect test takers' ability to attend simultaneously to the
skills needed for L2 writing and to culturally related issues during the writing process. From
this perspective, traditional timed testing seemed to be inadequate in writing
assessment.
The need for a supplementary assessment method led to the birth of
writing portfolios. While standardized tests are incapable of reflecting a complete
picture of students' needs and learning, writing portfolios have proved to be very
useful. Hamp-Lyons and Condon (2000) hold that "portfolios provide a broader
measure of what students can do, and because they replace the timed writing
context, which has long been claimed to be particularly discriminatory against non-
native writers" (p. 61). In addition, by applying portfolios, the student-centered
concept of teaching takes priority over the traditional one (Lee, 2000).
Due to the crucial role of portfolios in teaching and learning writing, it is
hardly surprising that many studies have been conducted on this assessment
method. Some concentrate on the relationship between writing portfolios and
learners. "The Effects of Portfolio Assessment on Writing of EFL Students"
(Nezakatgoo, 2010) is a salient example; its findings determine the effect of
portfolio assessment on the final examination scores of EFL students with respect
to writing skills. Others, for instance "A Qualitative Research on Portfolio Keeping
in English as a Foreign Language Writing" (Aydin, 2010), tend to delve into the
contributions of portfolios to the language development of a specific group, such
as pre-service teachers or Korean ESL students. The qualities of portfolios also
receive much attention. To illustrate, Hamp-Lyons and Condon (2000) believe that
in writing portfolios both reliability and validity are necessary and must be taken
into consideration. Therefore, substantial research has been carried out on this
aspect, for example by Joan, Maryl and Eva (2010).
As mentioned above, a wide variety of topics regarding writing portfolios
have been studied, but there are still some gaps. It is held by some researchers that
portfolio assessment is one form of authentic assessment (Hart, 1994). However,
Arter and Spandel (1992) point out that one of the problems arising when portfolios
are used as assessment devices is a possible lack of authenticity. Although
there are controversial opinions surrounding this quality, little attention has been
paid to it.
In the Vietnamese context, the "authenticity" of language assessment
in general and of portfolios in particular is still a new concept to a large number of
teachers and learners. Even at ULIS, VNU (Vietnam National University, University of
Languages and International Studies), where EFL students need to build writing
portfolios in English every year, readers can hardly find any official studies on
the authenticity of this assessment method. It is, hence, necessary to produce more
research in this field. Additionally, in the current semester (semester 2, 2013-
2014), the 2nd-year fast-track students at FELTE, ULIS, VNU have experienced a
new type of writing portfolio, of which the course designers have high expectations.
Owing to the fact that authenticity is of the essence in writing portfolios (Estrem,
2004), evaluating the authenticity of those portfolios might contribute to the
development of the writing course. All of the above-mentioned reasons have led to
the researcher's decision to conduct a study entitled: "An investigation into the
authenticity of the writing portfolios for the 2nd-year Fast-track students in
the second semester at FELTE, ULIS, VNU".
1.2. AIMS AND OBJECTIVES OF THE STUDY.
This study aimed at building a framework to evaluate authenticity in writing
portfolios and then investigating, based on that framework, the authenticity of the
writing portfolios for the second-year fast-track students at ULIS, VNU. Thus, the
research focuses on two questions:
- What aspects of authenticity need to be demonstrated in writing
portfolios?
- From the teachers' perspective, which aspects of authenticity are
demonstrated in the writing portfolios for the 2nd-year Fast-Track students
in the second semester at FELTE, ULIS, VNU?
1.3. SCOPE OF THE STUDY.
First, according to the course syllabus (Appendix 3, p. 102), five genres of
academic writing were included in the writing portfolios for the 2nd-year
Fast-Track students in the second semester at FELTE, ULIS, VNU. However,
under time constraints, this research focused on only three main entries: the
argumentative essay, the informative synthesis and the argumentative synthesis.
Second, owing to the time limit, the samples of the third phase (observations)
were two second-year fast-track classes whose students majored in English
language education, rather than all fast-track classes.
Last but not least, there are numerous experts in the field of language assessment,
yet only experts at ULIS, VNU were invited to join the two semi-structured
interview phases, owing to favorable conditions as well as their willingness to help
with the data collection.
1.4. SIGNIFICANCE
Overall, being the first study to develop a framework for authentic writing
portfolios in Vietnam, this research could be considerably helpful for Vietnamese
teachers as well as researchers working on related studies.
With regard to the ULIS context, this study, once completed, would hopefully
provide the teachers and the course designers of the Fast-track program with
valuable information about the authenticity of the implemented writing portfolios,
so that adjustments might be made to improve the quality of this assessment
method.
Besides, second or foreign language researchers who have a special interest in
the qualities of assessment, especially authenticity, could certainly draw on this
research to find reliable and useful materials for related studies in the future.
1.5. METHODS OF THE STUDY
A combination of four phases with three qualitative methods was employed
in the process of data collection. Specifically, in the first phase, document analysis
was utilized to build the first version of the framework for authenticity in
writing portfolios. The second phase, semi-structured interview 1, aimed at
verifying the drafted framework with experts in language assessment.
Observation was then applied to examine the quality of the final framework as well as to
collect data for the next phase. Finally, semi-structured interview 2 enabled the researcher
to answer the second research question. It is important to note that these phases
were carried out in this order.
The language used in the interviews was both Vietnamese and English so that
the respondents felt most comfortable expressing their ideas. All the interviews
were recorded with the interviewees' consent, and notes were taken as well.
During the observations, the researcher recorded and took notes of all
needed information from the writing lessons (see, for example, Appendix 3, p. 95).
1.6. AN OVERVIEW OF THE REST OF THE PAPER
The rest of the paper includes five chapters as follows:
Chapter 2 (Language Assessment, Authenticity and Writing Portfolios)
provides the background of the study, including key concepts and discussions of
related studies. This also serves as the instrument of the first research method,
document analysis.
Chapter 3 (Research Methodology) describes the methods and instruments
of the study, as well as the procedure employed to carry out the research.
Chapter 4 (Findings) presents all collected data.
Chapter 5 (Discussion and Implications) analyzes and summarizes the major
findings. The author's suggestions for resolving remaining problems related to
authenticity in the writing portfolios for the Fast-track students are given as well.
Chapter 6 (Conclusion) summarizes the main issues discussed in the paper,
the limitations of the research as well as some suggestions for further studies.
Following this chapter are the References and Appendices.
In this chapter, the researcher has elaborated on the following points:
(1) Statement and rationale of the study
(2) Aims of the study
(3) Scope of the study
(4) Methods of the study
(5) An overview of the rest of the paper
Summary
This chapter has provided the rationale for the study by stressing the role of
authenticity in writing portfolios as well as disclosing the research gap. The
framework of the paper has also been explained through the two research
questions and the clearly defined scope. These elaborations not only justify the
major contents and structure of the study but will also work as guidelines for
the rest of the paper.
CHAPTER 2: LANGUAGE ASSESSMENT, AUTHENTICITY
AND WRITING PORTFOLIO
Shedding light on the literature underpinning the study, the second chapter includes
four main parts. First, a brief review of the key concepts in language assessment is
provided. In the next two parts, a detailed presentation of the theoretical
background concerning writing portfolios and authenticity in language assessment
is given. Finally, an in-depth analysis working toward a framework to assess
authenticity in writing portfolios lays a solid foundation for the aims of this
research paper.
2.1. LANGUAGE ASSESSMENT
2.1.1. Key Concepts
The terms "assessment", "measurement", "test" and "evaluation" are often
used interchangeably by the general public (Nitko, 2001). To illustrate, when talking
about assessing students' language proficiency, people normally refer to a 'test'.
Each term, in fact, has distinctive characteristics, and it is necessary to understand
these terms properly before further studies regarding language assessment are conducted.
2.1.1.1 Assessment
According to Nitko (2001), assessment is:
A broad term defined as a process for obtaining information that is used for
making decisions about students, curricula and programs. […] When we
say we are "assessing a student's competence", we mean we are collecting
information to help us decide the degree to which the student has achieved
the learning targets. (p. 4)
He likewise emphasizes that there are numerous assessment techniques
serving the purpose of collecting such above-mentioned information, for example
paper-and-pencil tests, oral questioning, students’ homework performance and so
on. Besides, assessment provides teachers with data about their own teaching and
assists students in understanding their own learning progress (Duncan & Dunn,
1989).
2.1.1.2. Test
The notion of a test is quite similar to that of assessment. However, they
are not synonymous terms. Tests "occur at identifiable times in a curriculum when
learners muster all their faculties to offer peak performance, knowing that their
responses are being measured and evaluated, while assessment is an ongoing
process that encompasses a much wider domain" (Brown, 2010, p. 4). In other
words, all tests are assessments, but not all assessments are tests.
As stated by Carroll (1968), “a psychological or educational test is a
procedure designed to elicit certain behavior from which one can make inferences
about certain characteristics of an individual” (p. 6). Based on this definition,
Bachman (1990) considers a test a measurement instrument designed to elicit a
specific sample of an individual's behavior. Similarly,
Nitko (2001) defines a test "as an instrument or systematic procedure for observing
and describing one or more characteristics of a student using either a numerical
scale or classification scheme" (p. 5). In this study, for the sake of recency,
Nitko's definition of a test is adopted.
2.1.1.3 Measurement
The term "measurement" refers to a "process of quantifying the
characteristics of a person according to explicit procedures and rules" (Bachman,
1990, p. 7). It is a procedure in which a specified attribute or characteristic of a
person is assigned numbers (scores); such numbers show the degree to which he or
she possesses the attribute. From the preceding definitions, it can be seen that not all
types of assessment yield measurement. A procedure describing students by
qualitative labels or categories is a salient example. "Measurement" is thus a narrower
term than "assessment".
2.1.1.4. Evaluation
Weiss (1972) (as cited in Bachman, 1990) regarded evaluation as “the
systematic gathering of information for the purpose of making decisions” (p. 22).
To make it clearer, Bachman (1990) has represented the relationship among
measurement, tests, and evaluation in Figure 1 below:
Figure 1: Measurement, tests and evaluation (Bachman, 1990)
According to the figure, besides testing, there exist many other methods for
evaluation. Teachers can still evaluate learners’ competency without using tests.
This fact is clearly demonstrated throughout educational assessment history and
especially in the modern language assessment trends presented in the next section.
2.1.2 A Brief Sketch of Language Assessment History and Modern Language
Assessment Trends.
As stated by Brown (2007), language assessment trends and practices have
followed the shifting sands of teaching methodology. During the first half of the
twentieth century, assessment was mainly based on the behaviorist views of cognitive
development (Hancock, 1994). In that behaviorist era, "special attention was paid
to contrastive analysis, and testing focused on specific language elements such as
phonological, grammatical, and lexical contrasts between two languages" (Brown,
2007, p. 8). For instance, to assess students' written proficiency, teachers might
ask students to translate a paragraph from their mother tongue into the target
language. From the 1970s, communicative theories brought a new view to language
assessment: “The whole of the communicative event was considerably greater than
the sum of its linguistic elements” (Clark, 1983, p. 432). Teachers and educators
started to pose questions concerning validity and the degree to which an assessment
instrument simulates real-world interaction. Archbald and Newmann (1988)
declare that "traditional tests have been criticized for neglecting the kind of competence
needed for dealing successfully with various situations beyond school"
(p. 21). Bachman (1990) also stressed that the relationship between the
language required in test tasks and the one used in everyday life is a matter of
concern. Thus, although paper-and-pencil tests are easy to grade, it cannot be
denied that they provide little evidence of what a language learner can actually do
with the language.
In the early 1990s, in a culture of rebellion against the notion that traditional
tests were versatile tools to measure all people and all skills, a novel concept
labeled "alternative assessment" emerged. It assembles "additional measures of
students - portfolios, journals, observations, self-assessments, peer-assessments, and
the like - in an effort to triangulate data about students" (Brown, 2010, p. 251).
Nowadays, in language courses and programs throughout the world, educators are
dealing with a more student-centered agenda (Alderson, 2002). Consequently,
alternative assessments have become more and more popular and hold “ethical
potential” (Lynch, 2001, p. 228) in the promotion of fairness and the balance of
power relationships in the classroom. Nevertheless, some issues related to different
types of alternative assessments have arisen. According to Brown (2010),
the practicality and reliability of alternative assessments tend to be lower. To illustrate,
one of the main problems of writing portfolios (a method of alternative
assessment) is "low inter-rater reliability, consistency of scores because teachers
are not used to the concept of assessment" (Nezakatgoo, 2010, p. 32). In addition,
though alternative assessment is strongly believed to provide a wider place for
“authenticity”, it is interesting to note that authenticity is also a matter of concern.
Back in the early 1990s, Bachman and Palmer (1996) included
"authenticity" among the qualities of a good test for two reasons. First, it connects
the test task to the domain of generalization; and second, it may enable test takers
to perform at their best. However, the authentic label is often placed on different
types of alternative assessments such as performance-based assessment or
portfolios without regard to whether the tasks are similar to those valued outside
the classroom (Frey, 2012). From the aforementioned analysis, there is little
doubt that in this day and age, authenticity is one of the special considerations in
language assessment. Nevertheless, what is "authenticity"? And what criteria are
used to evaluate this quality? Further discussion of this concept will be
held in the next part.
2.2. AUTHENTICITY IN LANGUAGE ASSESSMENT
2.2.1. Authenticity in Language Education
Some authors and researchers use the term “authenticity” in language
education as a synonym of reality; however, the notion, in fact, encompasses much
more than that.
Initially, authenticity is regarded as a quality helping classroom texts
connect students to the real world (Joy, 2011). Not only do such texts reflect reality
but they also enhance students’ motivation to communicate without self-
consciousness (Gatbonton & Gu, 1994). The opponents of this view argue that the
concern of a teaching-learning situation is not merely getting students to use
original texts but also enabling students to use language in real contexts.
Moreover, it is probably inaccurate to assume that the nature of a text does not
change when it is taken out of its original context. Wallace (1992) declares that
when real-life materials are "brought into classrooms for pedagogic purposes they
have, arguably, lost authenticity" (p. 79). Therefore, authentic texts cannot ensure
real language use by themselves. Clark (1989) also claims: "the notion of
authenticity has become increasingly related to specific learner needs and less
concerned with the authentic nature of the input materials themselves” (p. 73).
To conclude, it is obvious that although authenticity is a term frequently
used in language education, defining it still causes controversy. As reported by
Joy (2011), "the attempts made to define authenticity, on the one hand, have
deepened its complexity, and have widened its scope, on the other hand" (p. 10).
Additionally, language education is a broad field; therefore, up to now, no official
definition of the term "authenticity" has been presented. Even in a narrower
area, language assessment, "authenticity" is likewise a contentious issue.
2.2.2. Authenticity in Language Assessment.
Since the early 1990s, authenticity in language assessment has captured the
attention of numerous teachers, educators, theorists and researchers throughout the
world. There have been hundreds of books as well as journal articles written about
this complex topic. The literature on how authenticity should be understood and
demonstrated in language assessment is, hence, extensive too. Within the limited
scope of this research, only three prominent views are expounded.
Concerning the domain of testing, Bachman and Palmer (1996) consider
authenticity "the degree of correspondence of the characteristics of a given
language test task to the features of a TLU (target language use) task" (p. 23)
(Figure 2).
Figure 2: Authenticity (Bachman & Palmer, 1996) - the correspondence between
the characteristics of the TLU task and the characteristics of the test task
Moreover, they develop a framework (Table 1-Appendix 1) as the basis for
test designers to design an authentic test task. Their framework embodies a set of
five aspects of test authenticity: setting, test rubric, input, expected response, and
relationship between input and response.
First of all, "setting" consists of the physical circumstances "under which
either language use or testing takes place" (Bachman & Palmer, 1996, p. 48); its
three characteristics are the physical setting, the participants, and the time of the task.
Second, with regard to the test rubric, it contains "those characteristics of the test
that provide the structure for particular test tasks and that indicate how test takers
are to proceed in accomplishing the tasks" (p. 50). The characteristics of rubrics include: the
structure of the test, instructions, the duration of the test and how the language will
be evaluated or scored. Third, input refers to material involved in a given test task
or TLU task, “which test takers or language users are expected to process in some
ways and to which they are expected to respond” (p. 52). Finally, in a test, “the
expected response consists of language use or, perhaps, the physical response we
are attempting to elicit by the way the instructions have been written, the task
designed and by the kind of input provided” (p. 53).
There is no denying that Bachman and Palmer (1996) have built a detailed
framework. In explaining it, the two authors draw not only on
their extensive pedagogic experience but also on opinions from
prestigious researchers such as Carroll (1993) to support their assertions. This is
probably the reason why many testing experts, for example Purpura (2004), have
utilized that framework as a solid ground for their studies. Nevertheless, Bachman
and Palmer focus particularly on tests, a form of summative assessment, while
the features of tests, for instance time or purpose, are not always the same as those of
other assessment methods. Consequently, that framework may not be suitable for
language assessment tasks in general.
The next view is stated by Nitko (2001). According to Nitko, "realistic" and
"meaningful" are the two terms often used when talking about authentic assessment.
Furthermore, to craft authenticity, assessment designers should take the following
features into account:
1. Emphasize applications: assessing whether a student can use his
knowledge in addition to assessing what the student knows
2. Focus on direct assessment: Assess the stated learning target directly in
contrast to indirect assessment.
3. Use realistic problems: Frame the tasks in a highly realistic way so that
the students can recognize them as a part of everyday life.
4. Encourage open-ended thinking: frame the tasks to encourage more than
one correct answer, more than one way of expressing the answer, groups of
students working together, and taking a relatively long time to complete (e.g.,
several days, weeks, months). (p. 245)
Compared with Bachman and Palmer's (1996), Nitko's opinion about
"authenticity" seems to be broader: the author pays attention not only to tests
but also to language assessment in general. Nonetheless, his conception is merely
stated without justification or further supporting details. Thus, from the researcher's
perspective, Nitko's arguments about authenticity seem insufficiently persuasive.
Gulikers, Bastiaens and Kirschner (2004) also propose another definition of
authentic assessment. To them, an assessment is authentic if it requires task takers
to “use the same competences, or combinations of knowledge, skills, and attitudes,
that they need to apply in the criterion situation in professional life" (p. 69). Based
on this notion, the researchers build a five-dimensional framework for authentic
assessment (Figure 3, Appendix 1, p. 86). As its name implies, the framework
comprises five aspects.
The first one is the "task", which "resembles the criterion task with respect to the
integration of knowledge, skills, and attitudes, its complexity and its ownership" (p.
71); moreover, an authentic task should be seen as relevant and meaningful by
different people. Besides, the physical context is mentioned as one dimension, with
fidelity, the kind and amount of available resources, and time listed as its three branches.
In consonance with Resnick's view of the relationship between the social system and
out-of-school learning activities, Gulikers, Bastiaens and Kirschner (2004)
include "social context" in their framework. The fourth dimension is the
"assessment result", and the "criteria" used to assess that result constitute the fifth
and final dimension.
While formulating their argumentation, Gulikers, Bastiaens and Kirschner
(2004) conducted a systematic literature review. Different viewpoints are discussed
critically before the conclusion is stated. Additionally, the three authors carried out
empirical research with students and teachers from a nursing college in order to
examine whether the dimensions are complete. An electronic group support
system (GSS) at the Open University of the Netherlands was used as the research tool.
According to the collected results, their framework is adequate. However, the
framework was initially developed to shed light on the concept of authenticity in
general; there is no evidence that it works appropriately in the field of language
education in particular.
In a nutshell, different viewpoints of authenticity exist, but there are still
some common features. With regard to the definition of authenticity in language
assessment, all three viewpoints refer to the characteristics of being "realistic" and
"meaningful". In this research, the notion that authenticity is the quality of resembling
real-life situations and addressing learners' needs in their further professional life is
adopted. However, the real professional context, which is highly performance
based, is a complicated mixture of ill-defined problems and multiple uncertain and
unexpected outcomes (Herrington & Herrington, 1998; Kirschner, 2002; Wiggins,
1993). It is, hence, fairly difficult to create that kind of context in the classroom
environment. So, the word "resembling" here means making the assessment close
to real life. That leads to two further concerns: "how close is appropriate?" and
"which framework is the most suitable one to evaluate the authenticity of a language
assessment?"
Although some of the given criteria are adequately justified, no one can say that
criteria perfectly suited to the authenticity of one assessment method are totally
suitable for others, since each method possesses distinctive features. In the next
part, entitled "Toward a Framework to Assess Authenticity in Writing Portfolio
Practice", theories for building a framework specifically intended to evaluate the
authenticity of writing portfolios will be presented, based on the frameworks above.
Before that, the reasons why such a framework needs to be created, as well as a
closer look at the features of writing and writing portfolios, will be offered.
2.3. WRITING PORTFOLIOS
2.3.1. Writing
2.3.1.1 Definition of writing and writing competence
Gnanadesikan (as cited in Syahid, 2010, p. 17) opens her book by
highlighting the importance of writing:
Imagine a world without writing. Obviously there would be no books, no
novels, no encyclopedias, no cookbooks, no textbooks, no travel guides.
There would be no ball-points, no typewriters, no word processors, no
Internet […]. But such lists of objects also miss the point. The world we live
in has been indelibly marked by the written word, shaped by the technology
of writing over thousands of years.
In her opinion, writing is "a miracle", yet the question is: how is that "miracle"
conceptualized? Lannon (1989) considers writing "the process of transforming
the material discovered by research, inspiration, accident, trial and error, or
whatever into a message with a definite meaning - writing is a process of deliberate
decision" (p. 9). Besides, Hamp-Lyons and Kroll (1997) describe writing as "an
act that takes place within a context, that accomplishes a particular purpose, and that is
appropriately shaped for its intended audience" (p. 8). In a similar vein, Sperling
(1996) declares: “writing, like language in general, is a meaning making activity
that is socially and culturally shaped and individually and socially purposeful”
(p.55). Another definition is stated by Syahid (2010):
Writing can be defined as a mental and physical process of expressing
thought and feelings by forming words into a sequence of arranged
sentences leading to the creation of meaning and the information. The
writing itself is influenced both by the personal attitudes and social
experiences that the writer brings to writing and the impacts of the particular
political and institutional contexts. It is also a process that what is written is
influenced by the constraints of genre. (p. 20)
From those perspectives, it is obvious that writing involves an intentional,
interactive, creative and complicated cognitive process.
The term “competence” is generally defined as “the ability to do something
well, measured against standard, especially ability acquired through experience or
training and linguistically, knowledge of a language that enables somebody to
speak and understand it" (Microsoft Encarta, 2009). Besides, Richard and Smith
(2002) add an entry for competencies related to competency-based teaching.
According to them, competencies are "descriptions of the essential skills,
knowledge and behaviors required for the effective performance of a real-world
task or activity" (p. 94).
In this research, writing competence can be understood as skills, knowledge,
and behaviors of writing that enable a person to express his/her thoughts and
feelings in a satisfactory arrangement of sentences.
From this perspective, there is no denying that writing is one of the most
difficult skills for learners to master. “Learning to write involves much more than
simply learning the grammar and vocabulary of the language” (Weigle, 2002, p.
20). "Students have to pay much attention to higher-level skills (macro-level skills)
such as planning and organizing, as well as lower-level skills (micro-level skills) such
as spelling, punctuation, word choice, and so on" (Nezakatgoo, 2010, p. 231).
Numerous teaching methods for writing have thus been developed, which makes
writing assessment methods diverse too.
2.3.1.2 Writing assessment methods
As stated in chapter 1, writing assessment methods have undergone
significant changes. According to Yancey (1999), the history of writing assessment
can be divided into three "waves". The first wave (1950-1970) put a premium on
objective, non-essay testing that prioritized efficiency and reliability. At that time,
assessment procedures were designed to "produce reliable numerical scores of
individual student papers from independent judges basing on classical test theory"
(Huot, 1996). In the second wave (1970-1986), holistic scoring of essays based on
rubrics and scoring guides was first developed. Around the mid-1980s (the third
wave), the decline in educational standards in the United States became a matter of
concern. It was also the period when portfolio-based assessment sparked the interest
of numerous specialists. With reference to that period, Camp and Levine (1991) said:
Working with researchers from Educational Testing Service, we sought a
model of assessment that would build on students’ strengths rather than
highlight their deficits. Portfolio assessment seemed a most appropriate form
of evaluation for our purposes, a test that would be conceptualized. (p. 201)