A Brief History of WAC Assessment at Howard University
Method: Starting in Spring 1993, students responded to
anonymous surveys measuring their attitudes toward the writing component of their WAC classes.
Students answered nine questions about reading, thinking, organizing,
and writing, using an attitude scale. They also wrote answers to two
questions about the strengths and weaknesses of the WAC component.
Results: The combined data from 1993-2014 showed that students felt that WAC pushed them most to think critically, while also helping them organize their thoughts more clearly and present them in a comprehensible format.
Method: Starting in Spring 1993, teachers responded to
questionnaires assessing their attitudes toward the writing component of their
WAC classes. Teachers answered the same nine questions that students were asked
and wrote responses to questions about teaching and learning in their classes.
Results: Response rates were too low for statistical
analysis. Teachers who responded, however, generally gave their courses ratings that were
similar to the averages of the students’ responses. Their written responses
(and, in some cases, attached writing samples) confirmed that they were using
WAC techniques for prewriting, rewriting, and discussing assignments.
Their suggestions for improvement varied.
Method: In Spring 1994, the WAC Director led and taped discussions with the students and
teachers in five WAC classes (biology, math, political science, history, and
modern languages). Students were asked questions about WAC
courses vs. Freshman English, WAC courses vs. other courses in the discipline,
writing to learn, and WAC’s effects on their reading, thinking, and writing.
Results: Students in all of the classes felt that writing had enhanced their learning of
the subject matter. Many of the students claimed that the WAC course had
dramatically improved their thinking, mainly because the requirement to write
about texts had improved their reading. Fewer students claimed to be better
writers as a result of the WAC courses: The most positive remarks about writing
improvement came from the social sciences classes.
Method: In June 2001, the WAC Director organized the first portfolio
assessment of the Writing Across the Curriculum (WAC) Program in the College of
Arts & Sciences. The purpose of this assessment was (1) to determine what
type of writing instruction was taking place and (2) to measure WAC students’
writing progress and proficiency. The assessment consisted of two types of
evaluation, both focusing on the student “portfolio”—a folder presenting
examples of a student’s writing: First, the Director reviewed 179
portfolios consisting of all of the writing (e.g., journals, outlines, papers,
essay exams) that the students had submitted during the Fall 2000 or Spring 2001
semester. Then, on June 4 and 5, pairs of WAC and English teachers rated 140
portfolios, each containing a representative set of papers that a WAC student had submitted during the semester.
Results: The following conclusions emerged from the Director’s review of
student work and from the raters’ scores:
Writing Instruction. WAC teachers are providing substantive writing
instruction: Although the genres of writing vary from course to course,
WAC teachers are assigning a great deal of writing, including frequent
prewriting but only occasional rewriting. Most WAC teachers are also
supplying sufficient feedback on formal papers, but a few need to respond more
consistently to grammatical problems.
Writing Progress and Proficiency. While taking a WAC class, most students
progressed in one or more areas of writing (i.e., content, arrangement, or
style). However, by the end of the term, most WAC students had achieved
only an average level of writing proficiency in the discipline.
Use of Results:
These findings led the WAC Director (1) to circulate "WAC Tips for Teaching"
that describe efficient ways to incorporate more revising, (2) to organize more
WAC-English rating sessions to develop a grading rubric, and (3) to encourage
the faculty to collaborate more closely with the Writing Center.
Method: In Spring 2008, CETLA collected portfolios from 141 students in 11 of the
12 WAC classes in COAS (response rate = …).
These classes were taught by 11 different faculty in the Humanities,
Social Sciences, and Natural Sciences divisions.
Pairs of WAC and English faculty rated portfolios with at least three
papers (53) for progress and those with at least two papers (76) for
proficiency, using a descriptive
rubric with seven subcategories and a five-point rating scale (5 = Exemplary, 1 = Deficient).
Results: The following conclusions emerged from a
statistical analysis of the raters’ scores:
Derived from two raters, the mean proficiency score for all 76 portfolios was
22.86, which fell in the “Proficient” range according to the rubric.
Although the mean score in each subcategory was “Average,” with one
exception, at least half of the students earned “Exemplary” or “Proficient” in
each of the following subcategories: (1) Task fulfillment, (2) Critical
thinking, (3) Supporting evidence, (4) Coherence, (5) Correctness, (6) Fluency,
(7) Disciplinary conventions. Only
one portfolio was rated “Deficient” in any subcategory.
Students were most skilled in the use of Standard English (i.e.,
Correctness). However, they were
least skilled in the use of disciplinary conventions.
Progress. The portfolio assessment revealed that barely
half of the students (49%) had progressed in one or more areas of writing (i.e.,
content, arrangement, or style).
Most of the remaining students needed to improve, for only one third of them
produced proficient or exemplary writing.
To sum up, compared to the
portfolios rated during the last assessment in Spring 2001, the Spring 2008
portfolios demonstrated more proficiency but less progress.¹
As a result of the last portfolio assessment, CETLA’s Director sought to
improve the WAC faculty’s response to grammatical problems in students’ essays
by emphasizing rubrics, revision, and collaboration with the Writing Center.
It is gratifying, then, to find that the majority of the students
demonstrated proficiency in their use of Standard English (i.e., “Correctness”).
As for the rate of progress, that may reflect insufficient support from the
Writing Center. At the time of data collection, the Writing Center had only five
tutors and was open sporadically because it shared a room with graduate
seminars. This state of affairs may
have discouraged WAC faculty from referring students to the Writing Center and
WAC students from visiting the Center on their own.
Use of Results:
Having analyzed the data during the summer of 2009, the WAC Director took
the following steps to improve proficiency in the disciplines and progress:
She emailed the Fall 2009 WAC faculty links to writing guides in their disciplines.
With the assistance of the Writing Center director, she more frequently reminded the Fall 2009
WAC faculty to refer students to the Center.
¹ Perhaps sampling differences account for
the discrepancy between the 2001 and 2008 findings:
In 2001, raters evaluated 111
portfolios for progress because they scored all portfolios with two or more
papers. However, in 2008, raters
evaluated only portfolios with at least three papers (53).
Method: In Spring
2015, CETLA received portfolios from 119 students (71%) in all 13 WAC classes
in the College of Arts & Sciences. These classes were taught by 11
different faculty in the Humanities, Social Sciences, and Natural Sciences
divisions. Of the 119 portfolios, only
82 contained enough assigned papers for raters to evaluate papers of similar
genres within a class; moreover, some portfolios did not have enough papers or
drafts to compare across time. Therefore,
to assess students’ progress, pairs of WAC and English faculty rated only 73 portfolios—those
that contained at least three papers of a similar genre or two drafts. Then, to measure the students’ proficiency,
the raters evaluated all 82 portfolios.
The raters used the WAC-English Descriptive
Rubric, including all subcategories except “Appearance”: (1) Task
fulfillment, (2) Critical thinking, (3) Supporting evidence, (4) Coherence, (5)
Correctness, (6) Fluency, and (7) Disciplinary conventions. The raters also adopted the following grading
scale (29-35 = Exemplary, 22-28 = Proficient, 15-21 = Average, 8-14 = Minimal,
1-7 = Deficient).
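The arithmetic behind this scale is worth noting: with seven subcategories each scored 1-5, composite scores range from 7 to 35, and the bands above partition that range into five seven-point intervals. A minimal sketch (illustrative code, not part of the assessment materials) of the banding:

```python
# Illustration only: banding composite WAC-English rubric scores as
# described in the report. Seven subcategories, each scored 1-5, give
# composite scores from 7 (all lowest) to 35 (all highest).

def proficiency_band(composite: float) -> str:
    """Map a composite score (7-35) to its rubric band."""
    if composite >= 29:
        return "Exemplary"   # 29-35
    if composite >= 22:
        return "Proficient"  # 22-28
    if composite >= 15:
        return "Average"     # 15-21
    if composite >= 8:
        return "Minimal"     # 8-14
    return "Deficient"       # 1-7

# The Spring 2015 mean of 24.23 falls in the "Proficient" band:
print(proficiency_band(24.23))  # Proficient
```

The same mapping places the Spring 2008 mean of 22.86 in the “Proficient” band, consistent with the earlier results.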
Results: The following conclusions emerged from an analysis of the raters’ scores:
Derived from two raters, the mean proficiency score for all 82 portfolios was
24.23, which fell in the “Proficient” range, according to the rubric. Of the 82 portfolios, 14 earned “Exemplary”
composite scores, and 29 earned “Proficient” composite scores. Thus, more than half of the
portfolios (43 of 82) qualified as proficient or better. Only one portfolio was rated “Minimal,”
and none “Deficient.”
Progress. The portfolio assessment revealed that nearly two thirds (65%) of the students had
progressed in one or more areas of writing (i.e., content, arrangement, or
style). To sum up, compared to the portfolios rated during the last
assessment in Spring 2008, the Spring 2015 portfolios demonstrated greater
proficiency and progress.
Paired Classes Study
Method: After the Fall 1993 semester, the WAC Director and a WAC Committee
member from the English Department took advantage of the rare opportunity to
compare the final exam grades in WAC and non-WAC sections taught by the same
teacher. These included two sections taught by a history professor and two
sections taught by a classics professor. The WAC team also compared the term
papers in the WAC and non-WAC history sections,
using blind and holistic rating procedures to score the papers. Essays were rated on a scale of 1-5
on criteria including idea development, support, organization, and diction/sentence variety.
Results: Both WAC classes earned more points on the final exam than their
non-WAC counterparts did. Moreover, both the history teacher and the WAC team
gave the WAC term papers higher grades than they gave the non-WAC term papers.
Because of the small sample sizes, the team could not determine whether the
differences were statistically significant, but pre-existing ability differences
did not account for the WAC advantage.
Method: In Fall 1994, teachers in five WAC classes (history, biology,
physical education, modern languages, philosophy) assigned a WAC diagnostic
essay ("What do you already know about the subject matter of this course?") and a
WAC final essay ("What do you know about the subject matter of this course?").
Because of the number of time references embedded in the WAC final essays, the
essays could not be scored blindly, as planned. Nevertheless, the WAC Director
scored the essays holistically for development, organization, and language on a
scale of 1-4.
Results: The study revealed that most students knew how to organize and
develop a personal expository essay when they enrolled in the WAC classes. What
the study did not reveal was how well they could organize and develop the type
of disciplinary writing taught in their particular WAC class. Nor did the study
indicate how well students could command the style of the discipline. As for
general editing skills, there was no consistent improvement.
Write to Succeed Study
Method: In Fall 1998, a chemistry professor and theater
professor participated in a pilot called the "Write
to Succeed Program." The professors in the pilot gave a paper an Incomplete grade
(e.g., "IC" for a paper with "B" content) if it contained a noteworthy writing problem.
Then they referred the student to the Writing Center, where a tutor helped the
student edit the paper. Once the student had earned a Success Report from the
tutor for successful editing, the teacher raised the paper grade one letter
grade. To implement the pilot, the WAC Director received money from the Fund for
Academic Excellence to hire and train six additional tutors.
Results: An analysis of the data revealed that the program (1) ensured that
every student referred to the Writing Center earned a Success Report and/or a
higher grade, (2) phased out "I" referrals for students who had earned Success
Reports—and any others who had received "I" grades previously, and (3) decreased
by 80% or more the percentage of a class receiving "I" for a writing assignment.
21st Century Skills Study
In 1998, an Allied Health professor investigated the impact of WAC strategies
as well as active learning and computer-assisted instruction in her
first-semester professional-level course, Clinical Immunology. After completing
her WAC training, she redesigned the course “to include writing components such as
pre-writing strategies (journal notebooks), laboratory notebooks, and a formal
case-report on a specific immunological disorder. Included in these
exercises are various stages of drafting and review (including peer-review).
Writing assignments are evaluated based on organization, clarity of expression,
grammar, and spelling, as well as the demonstration of a clear understanding of
the immunological concepts involved.” From student questionnaires
and pre-post comparisons of student work, the professor concluded that the course
had fulfilled WAC objectives:
After some initial resistance, the students seemed to accept
the writing components of the class favorably. They were delighted
with the more creative writing assignments, in particular those that
involved group participation. Students also expressed their satisfaction
with the case-report activities, especially the peer review. Students felt
that their writing skills were better and, with few exceptions, the final
case-reports were excellent and the overall quality of student writing had
improved. From the instructor’s perspective, the WAC component was also the
most useful and rewarding. Commonly occurring errors as well as specific
misconceptions were more easily discerned and corrected before an examination,
thereby allowing the students a greater possibility of improving their grades.
The professor presented the study in a paper at the “Interinstitutional Symposium:
Curricula for the 21st Century.”
Writing Competence Study
In Fall 1995, the coordinator of the Health Ethics Program
incorporated WAC strategies in an interdisciplinary course for 251 health
science students, including 57 medical students. With the help of her
team-teachers, she introduced the class “to writing interventions to help
students master writing techniques as well as reinforce the analytical synthesis
of information through observing, listening, and critically thinking.” An
end-of-the-semester survey revealed that 56% of the students found the WAC
strategies helpful, especially while preparing the assigned research paper.
The Health Ethics faculty also noted that the students’ performance on the
preliminary WAC assignments correlated with their grades on the final paper.
At the end of the Fall 2006 semester, the coordinator of
all fifteen Biology 101 sections administered an anonymous survey to assess
students’ attitudes toward lab reports, since they had been using LabWrite, a web-based tutorial for
writing lab reports. Altogether, 172 of the 372 students responded to the
survey. However, students in some sections reported using LabWrite more than others.
For instance, there were three sections (35 respondents) where 90% or more of
the students used LabWrite for four or more labs. On the other hand, there
were two sections (16 respondents) where only 40-45% of the students used
LabWrite for four or more labs. Because of the uneven implementation, low
response rates, and small group sizes, it was impossible to conduct a meaningful
statistical analysis of the student responses.
However, the Biology 101 coordinator also surveyed his nine Teaching Assistants,
eight of whom responded to the anonymous survey. They were asked specifically to
compare the students’ writing and their teaching productivity with and without
LabWrite. Two thirds (67%) of the TAs agreed that their students were better
prepared for the lab and understood the scientific concepts of the lab better
when they had completed the LabWrite PreLab, while two disagreed. Moreover,
three quarters (75%) agreed that the students composed better lab reports when
they completed the LabWrite PostLab; this time only one disagreed.
In addition, the TAs reported gains in their productivity. When they used the
LabWrite PreLab, PostLab, or Rubric, at least two thirds of the TAs found that
the LabWrite PreLab helped them prepare students for the lab, the LabWrite Post
Lab helped them teach students how to write a lab report, and the LabWrite
Rubric helped them evaluate students’ lab reports. They only regretted that the
LabWrite materials and links had been posted inside Blackboard: Because of login
and performance problems with Blackboard, accessing and submitting LabWrite
assignments often became time-consuming.
In Fall 2006, an Allied Health professor investigated whether the
11 students in her WAC course learned more when they (1) wrote a summary of a
text for a lay audience, (2) wrote a summary of a text for the teacher, or (3)
only read the text. Although each student was supposed to rotate through all
three conditions, some students assigned to the “Read Only” condition repeatedly
wrote summaries. Thus, it was possible to evaluate only the impact of writing a
summary (regardless of audience) vs. merely reading the text.
Unfortunately, the sample was far too small to warrant statistical
analysis. Moreover, only 7 of the 11 students responded to the anonymous survey,
and a few did not complete all of the assignments. However, a few striking
differences emerged. When summarizing the text, all of the respondents indicated
that they had read the assigned text more than once (regardless of the
audience). In contrast, only two of the students read the text twice when a
summary was not required. On the other hand, 40-50% of the students claimed that
they remembered, comprehended, analyzed, synthesized, and evaluated ideas from
the text best when they merely read the text. This claim could not be
substantiated because several students did not submit all of the summaries and
quizzes. Also worth noting is that, despite their claim,
85% of the students recommended that the professor continue to assign summaries.