
The Analytic Assessment of Online Portfolios in Undergraduate Technical Communication: A Model

In a ten-year collaborative process, faculty at New Jersey Institute of Technology developed an analytic online portfolio assessment for our technical communication service course. Our assessment model encourages analysis of the portfolios through attention to the variables of technical writing: writing and editing, substance and content, audience awareness, document design, and textual attribution. In essence, we seek validity by redefining the elements of the course to incorporate communication in the digital age, creating new criteria for evaluation related to digitally mediated communication. Employing a sampling plan designed to yield a 90% confidence interval, we achieved inter-reader reliability by developing a community of readers dedicated to pursuing a robust model of technical communication. Historically, validity in writing assessment has often been sacrificed to reliability; we were able to overcome this limitation.
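
For readers who want to see the arithmetic behind such a sampling plan, the sketch below applies Cochran's sample-size formula with a finite-population correction. The 300-portfolio cohort, the 10% margin of error, the proportion-based estimate, and the function name are illustrative assumptions; the article does not report these parameters.

import math

def sample_size(population, z=1.645, margin=0.10, p=0.5):
    # Cochran's formula for an infinite population, then a finite-population
    # correction; z = 1.645 corresponds to a 90% confidence level.
    n0 = (z ** 2) * p * (1 - p) / (margin ** 2)
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# Illustrative call: roughly 56 portfolios from a hypothetical 300-student cohort.
print(sample_size(population=300))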

After using the model successfully for three semesters, we can see increased consistency in teaching across sections and semesters, more communication among instructors, and the beginnings of a database with which we can test further change. From our data collection, we have been able to track patterns and test theories. This model fosters communication among all stakeholders (faculty, instructors, students, and administration) and is embedded within the lifecycle of the semester.

Methodologically, at the end of the semester, students create a portfolio, hosted on the university servers, containing their work for the semester. After the semester, technical writing instructors spend a half-day reading and assessing a random sample of the portfolios. We start with a calibration session, scoring sample portfolios, which brings us into close accord. Then we score the random sample on each of the separate variables and assign a holistic (overall) score. The results are distributed to the instructors before the beginning of the next semester and form the basis for any changes in our methods and goals.
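
The following is a minimal sketch of how the per-variable and holistic scores from such a reading session might be tabulated before distribution to instructors. The two-reader layout, the 1-6 scale, and the data structure are assumptions for illustration; the article does not specify a data format.

from statistics import mean

VARIABLES = ["writing_editing", "substance_content", "audience_awareness",
             "document_design", "textual_attribution", "holistic"]

# scores[portfolio_id][variable] = list of reader scores (placeholder values)
scores = {
    "P01": {"writing_editing": [4, 5], "substance_content": [4, 4],
            "audience_awareness": [3, 4], "document_design": [5, 5],
            "textual_attribution": [4, 4], "holistic": [4, 4]},
    "P02": {"writing_editing": [3, 3], "substance_content": [2, 3],
            "audience_awareness": [3, 3], "document_design": [4, 3],
            "textual_attribution": [3, 2], "holistic": [3, 3]},
}

for var in VARIABLES:
    per_portfolio = [mean(readers[var]) for readers in scores.values()]
    # Share of portfolios where the two readers agree within one point.
    agreement = sum(max(readers[var]) - min(readers[var]) <= 1
                    for readers in scores.values()) / len(scores)
    print(f"{var:22s} mean={mean(per_portfolio):.2f} adjacent-agreement={agreement:.0%}")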

The theory and method behind this model can be applied to other disciplines as well. To be authentic, teachers must be involved in the development and process of the assessment. Each program in each discipline could theorize and implement a homegrown assessment of its own, and each would likely take its own form. By building assessment directly into the academic cycle on an ongoing basis, programs can render third-party, external accountability testing unnecessary. However, to make program assessment sustainable, all parties must be prepared for extra work.

Creating this model has produced demonstrable outcomes for our instructors, our program, our institution, and our students. For instance, one of our concerns was that grades in technical communication had, for the past several years, been higher than those in other Humanities courses: there was a disjuncture between the assessment scores and the course grades. In the fall of 2004, there was no significant correlation between the course grade and the overall portfolio score. Over the next three semesters, however, the correlation between course grade and portfolio score increased to a significant level. This finding suggests that our instructors are now evaluating student work with greater attention to our articulated goals.
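
The correlation check described here can be reproduced with a standard Pearson test; the sketch below shows the form of that calculation. The grade and score lists are placeholder values, not data from the study, and scipy is an assumed dependency.

from scipy.stats import pearsonr

def grade_score_correlation(course_grades, portfolio_scores, alpha=0.05):
    # Pearson's r with a two-tailed p-value; "significant" means p < alpha.
    r, p = pearsonr(course_grades, portfolio_scores)
    return r, p, p < alpha

# Placeholder values on a 4-point grade scale and a 1-6 holistic portfolio scale.
grades = [3.7, 3.3, 4.0, 2.7, 3.0, 3.7, 2.3, 3.3]
holistic = [5, 4, 6, 3, 4, 5, 2, 4]
r, p, significant = grade_score_correlation(grades, holistic)
print(f"r={r:.2f}, p={p:.3f}, significant at alpha=0.05: {significant}")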

As instructors, we continue to reach consensus on our methods and goals. We have time set aside each semester to get together to study and discuss student work. We have a process in place that helps us to update the course and monitor change. This collaborative process makes the program stronger as a whole and makes its goals clearer to the university community. Thus, we can easily document our outcomes for accreditation agencies such as ABET without creating new work, since the process is embedded in our program. Our students benefit because they are learning to communicate in the medium they use most: electronic communication on the internet. They will leave the program with state-of-the-art skills and can update their portfolios continuously. This collaborative endeavor has benefited all involved and made our outcomes both more consistent and more clearly defined.

Author 1: Carol Siri Johnson cjohnson@njit.edu

Article Link: www.asee.org


2007 Winter Issue, Vol. 3, No. 1

