{{{ #!html

Using Writing Assignments to Improve Self-Assessment and Communication Skills in an Engineering Statics Course

Context for the Study

A student handed a problem in engineering statics must draw on knowledge of math, physics, and engineering to solve it.  Unfortunately, this scenario often demonstrates less a student’s knowledge of those subject areas than an ability to plug numbers into an equation or formula that may or may not suit the task at hand, a situation commonly referred to as “plug-and-chug.”  The authors’ goal was to get students to evaluate whether they really understood the material, so they sought an approach that would promote metacognitive behavior.  Metacognition, a concept introduced by Brown (1975) and Flavell (1976), can be described as the sequence of steps a person follows to monitor and improve his or her own cognitive performance in an area.  To promote metacognitive behavior, the authors used writing assignments, an approach that builds on the Writing Across the Curriculum (WAC) and Writing in the Disciplines (WID) strategies (Russell, 1994; McLeod et al., 2001).

Research Questions

The key research question in this study is whether the “explain-a-problem” type of writing assignment in an engineering course can help students achieve the following self-assessment and communication learning objectives:

1.  Students identify what they do and do not know. 

2.  Students recognize the difference between understanding how to solve a problem and blindly plugging numbers into formulas.

3.  Students communicate the solution process with sufficient detail that another person can reproduce the solution to the problem.  

4.  Students develop a habit of annotating calculations in all courses. 

An additional research question addressed potential bias in the assignment:  Does performance on an “explain-a-problem” writing assignment depend on learning style preference and/or Myers-Briggs temperament?

Methodology

The students in this study were freshmen and sophomores in the engineering statics course at a four-year, private school that emphasizes math, science, and engineering.  As part of a homework assignment, students were asked to provide a half-page written description of a homework problem chosen by the instructor.  This model is a variation of the “Documented Problem Solutions” classroom assessment technique (Angelo and Cross, 1993).  The students were provided with a rubric indicating that their written descriptions would be evaluated against the following criteria:

1.  Has the student provided sufficient detail that the instructor could reproduce the approach to the solution? 

2.  Has the student demonstrated an understanding of what is being done in the solution process? 

3.  Is the description written such that the instructor can understand what the student means? 

4.  Is the description focused on the approach to the solution of this problem, not the specific numbers of the solution? 

The authors recorded a score on each criterion for each assignment for each student.  They tracked the class-average score on each criterion over the course of the term, across four terms.  They also evaluated the correlation between each student’s average writing assignment score and the intensity of that student’s Myers-Briggs temperament or Index of Learning Styles preference.
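As a concrete illustration only, the Python sketch below shows how such an analysis might be set up.  It is not from the paper, which names neither the statistic used nor the data layout (the correlation analyses were performed by the institutional research office); Pearson’s r is an assumption here, and all column names and data values are hypothetical.

{{{ #!python
# Sketch of the analyses described above, under stated assumptions:
# Pearson's r (the abstract does not name the statistic), hypothetical
# column names, and illustrative data values.
import pandas as pd
from scipy.stats import pearsonr

# One row per (student, assignment): rubric scores for the four criteria.
scores = pd.DataFrame({
    "student":    ["s1", "s1", "s2", "s2", "s3", "s3"],
    "assignment": [1, 2, 1, 2, 1, 2],
    "criterion1": [40, 55, 35, 50, 60, 45],  # Provides Sufficient Detail
    "criterion2": [70, 75, 65, 70, 80, 78],  # Demonstrates Understanding
    "criterion3": [60, 85, 70, 90, 75, 95],  # Written So Can Be Understood
    "criterion4": [50, 90, 55, 85, 65, 92],  # Focused on Approach
})
criteria = ["criterion1", "criterion2", "criterion3", "criterion4"]

# Trend analysis: class-average score on each criterion per assignment.
trend = scores.groupby("assignment")[criteria].mean()
print(trend)

# Bias check: correlate each student's overall average writing score
# with the intensity of one preference dimension (hypothetical values
# for the Myers-Briggs judging/perceiving intensity).
per_student = scores.groupby("student")[criteria].mean().mean(axis=1)
jp_intensity = pd.Series({"s1": 12, "s2": 33, "s3": 5})
r, p = pearsonr(per_student, jp_intensity)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")  # |r| near 0: no evident bias
}}}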

Major Findings

The average score on Criterion 4: Focused on Approach, tended to rise to a high level after the first few assignments and remain there for the rest of the term.  The average score on Criterion 2: Demonstrates Understanding, tended to stay level or drop only modestly even as the students had to describe increasingly difficult problems.  The “explain-a-problem” type of assignment therefore does appear to help students achieve the self-assessment objectives: students discover what they do and do not know, and they recognize the difference between understanding how to solve a problem and blindly plugging numbers into formulas.  Achievement of the self-assessment objectives did not, however, appear to improve overall student performance.

Performance on Criterion 1: Provides Sufficient Detail, fluctuated throughout each term but remained consistently mediocre (30-60%).  It appears that as the problems became more difficult, the students continued to struggle to communicate in enough detail that someone else could reproduce their work.  Performance on Criterion 3: Written So Can Be Understood, by contrast, increased over the term and plateaued at a high level (85-95%).  Students therefore do improve in their ability to write so that they can be understood.

The authors also evaluated the correlation between each student’s average writing assignment score and the intensity of Myers-Briggs temperament (judging/perceiving only) or Index of Learning Styles preference (all dimensions).  No correlation was found; the “explain-a-problem” type of assignment therefore does not appear to be biased along any of these dimensions.

Implications for Engineering Education

The “explain-a-problem” type of writing assignment is a viable tool for promoting self-assessment in an engineering statics course.  However, there are areas for further study of this tool. 

The most common feedback from students was that they could not grasp what was expected in the assignment.  This impression was likely linked to the consistently mediocre scores on Criterion 1: Provides Sufficient Detail.  The persistent difficulty probably stems from students’ struggle to decide what information to include while staying within the half-page limit.  Such decisions require high-level cognitive processes, e.g., Evaluation in Bloom’s taxonomy (Bloom et al., 1956).  Students might therefore require explicit training in critical evaluation to improve their decisions about which details to include in the limited space.

In this pilot study, the authors used scores on the writing assignments as a measure of the achievement of the self-assessment objectives.  In future studies, instruments such as a metacognitive questionnaire (Swanson, 1990) or metacognitive interviews (Paris and Jacobs, 1984) might provide more direct assessment of the impact of “explain-a-problem” assignments in achieving the self-assessment objectives. 

Acknowledgments

The authors would like to thank Mr. Timothy Chow from the Office of Institutional Research, Planning & Assessment at Rose-Hulman for performing the correlation analyses for this study. 

References

Angelo, T. A. and K. P. Cross. 1993. Classroom Assessment Techniques: A Handbook for College Teachers. San Francisco, CA: Jossey-Bass.

Bloom, B. S., M. D. Engelhart, E. J. Furst, W. H. Hill, and D. R. Krathwohl. 1956. Taxonomy of Educational Objectives: The Classification of Educational Goals. Handbook I: Cognitive Domain. New York: David McKay.

Brown, A. L. 1975. The Development of Memory: Knowing, Knowing About Knowing, and Knowing How to Know. Advances in Child Development and Behavior. H. W. Reese (ed.), Vol. 10. New York: Academic Press.

Flavell, J. H. 1976. Metacognitive Aspects of Problem Solving. The Nature of Intelligence. L. Resnick (ed.), Hillsdale, NJ: Lawrence Erlbaum Associates. 231-235.

McLeod, S. H., E. Miraglia, M. Soven, and C. Thaiss, eds. 2001. WAC for the New Millennium: Strategies for Continuing Writing-Across-the-Curriculum Programs. Urbana, IL: NCTE.

Paris, S. G. and J. E. Jacobs. 1984. The Benefits of Informed Instruction for Children’s Reading Awareness and Comprehension Skills. Child Development, 55: 2083-2093.

Russell, D. R. 1994. American Origins of the Writing-across-the-Curriculum Movement. In C. Bazerman and D. R. Russell, eds., Landmark Essays in Writing Across the Curriculum. Davis, CA: Hermagoras Press, 3-22.

Swanson, H. L. 1990. Influence of Metacognitive Knowledge and Aptitude on Problem Solving. Journal of Educational Psychology, 82 (2): 306-314. 

Author:
Article Link: www.asee.org/jee

}}} [https://stemedhub.org/groups/cleerhub/wiki/issue:1326 : Back to 2009 Winter Issue, Vol. 4, No. 2] [[Include(issues:footer)]]