{{{ #!html
Computer-aided process design tools are increasingly recognized as an essential component of any engineer's toolkit because their use reduces computation time and enables more aggressive parametric analyses. However, many such tools have steep learning curves, resulting in low adoption or poor implementation in education. When they are used in educational settings, there is a tendency to give little consideration to the process by which students learn to use these tools effectively, as compared to learning engineering science concepts and applying those concepts in design situations. Such oversight hinders efforts to address ABET's Engineering Program Criterion 3 (c) and (k). These criteria state that "Engineering programs must demonstrate that their graduates have: […] (c) an ability to design a system, component, or process to meet desired needs [and] (k) an ability to use the techniques, skills, and modern engineering tools necessary for engineering practice."
A computer-aided process design tool for food process engineers, FOODS-LIB (Food Operations Oriented Design System Library) [1,2], and associated learning modules had been used in a senior-level food process engineering (FPE) course for three years prior to this work. The tool aids students in performing process conceptualization, steady-state mass and energy balances across operations, and continuous food operations design. The design goal for the modules was to afford self-paced learning of FOODS-LIB; that is, students were intended to teach themselves how to use FOODS-LIB to perform design work with little instructor intervention. The issue raised by these early implementations was the difficulty students had in using FOODS-LIB to complete their process design work. To guide improvements to the learning modules, the process design tool, and/or their implementation, a thorough evaluation was warranted.
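For readers unfamiliar with the kind of computation involved, the steady-state mass balance across a single operation can be illustrated with a minimal sketch. This is generic Python, not FOODS-LIB code, and the stream values are hypothetical; it only shows the principle (mass in = mass out, component in = component out) that such a tool applies across every unit in a flowsheet:

```python
# Illustrative steady-state mass balance for a mixing operation.
# NOT FOODS-LIB code; a generic sketch of the underlying principle.

def mix_streams(streams):
    """Combine inlet streams at steady state.

    Each stream is a (mass_flow_kg_per_h, solids_mass_fraction) pair.
    Returns (total_mass_flow, mixed_solids_fraction), since at steady
    state total mass and total solids entering must equal those leaving.
    """
    total_mass = sum(m for m, _ in streams)
    total_solids = sum(m * x for m, x in streams)
    return total_mass, total_solids / total_mass

# Hypothetical example: blending a concentrated stream (40% solids)
# with a dilute stream (12% solids).
total, x = mix_streams([(500.0, 0.40), (1500.0, 0.12)])
print(total, round(x, 3))  # → 2000.0 0.19
```

A full design tool chains many such balances (plus energy balances and equipment sizing) across connected operations, which is what makes hand calculation tedious and tool support valuable.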
The driving question was: "Can learning modules and implementation protocols be developed to effectively facilitate students' learning to use features of the tool and transferring their knowledge to actual engineering design situations?" A literature review revealed that surprisingly little research has been done on the evaluation of computer simulation tools, their implementation, and the nature and quality of their subsequent use in design work.
This study focused on the development of a Kirkpatrick Level 1 (reaction) evaluation protocol [3,4] to collect baseline data about student perceptions of the learning modules, FOODS-LIB, and the implementation, for the purpose of identifying instructional issues. Quantitative and qualitative analyses of students' performance on assignments associated with learning the design tool and completing a process design were conducted to triangulate the formative evaluation results. The emphasis was on skills learned by the students while completing the learning modules and on the transfer of knowledge to their design work. While these analyses approach Kirkpatrick Level 2 (learning) and Level 3 (behavior) evaluations, they lack the formality of control groups and pre-post assessment strategies recommended for those levels of evaluation.
Over an 11-week period in Fall 2001, twelve FPE students enrolled in the senior-level course concurrently completed the learning modules and used FOODS-LIB to design an ice cream manufacturing process, sizing and detailing the operation of each step in the process. The design problem was divided into eight homework assignments. In assignments 1 through 5, students completed the indicated learning modules and then designed specific pieces of the process. These assignments were carefully constructed so that there was a match between the FOODS-LIB skills gained through the modules and the design work assigned. The remaining three homework assignments focused on completion of the process design. As is typical in this course, students concurrently worked on the design project while continuing to complete laboratory assignments and textbook problems corresponding to the theory discussed in lecture.
Students voluntarily completed six online survey instruments, starting with a collection of demographic data (Instrument 1). Instruments 2 through 4 were used to gather information about student perceptions of each learning module, including the clarity, organization, and length of the module, and the degree to which the learning objectives stated at the start of each learning module were met. Instruments 2 through 5 focused on student perceptions of their ability to transfer their learning of FOODS-LIB concepts and features to their design work. Instrument 6 was used to capture student perceptions of their FOODS-LIB skills at the end of the design project period. Open-ended questions included in each survey asked students to qualify their responses in terms of strengths, weaknesses, and suggested improvements. To validate student perceptions and provide further insight into the implementation, the author graded and coded student errors (i.e., tracked the types of errors students made) on the homework assignments.
Students successfully completed the learning modules, and their assessment of the presentation of the learning module content was favorable. What came across very clearly in the analysis of students' perceptions of their achievement of the learning module objectives was that they felt they met the objectives focused on learning to use existing FOODS-LIB features. However, students did not feel they met the objectives focused on creating new or modifying existing FOODS-LIB components, particularly when computer coding was involved. Further, students did not perceive improvement in their understanding of, and ability to use, FOODS-LIB as they used the tool to perform design work; that is, their understanding of and skills with the tool were not reinforced or advanced through its use. While students reported high levels of frustration with the coding required to complete their design work, emphasizing their lack of experience with the computer language used by FOODS-LIB, the author's analysis of their design work revealed that the difficulties students experienced came from an additional source: poor process conceptualization prior to use of the FOODS-LIB tool.
These results and the open-ended student comments provided evidence to direct changes to the instructional materials and their implementation in Fall 2002. The benefit of this type of evaluation is that it clearly identifies problems and uses student input to generate solutions.
Support for this work was provided by a 2000 USDA Higher Education Challenge Grant (http://www.csrees.usda.gov/funding/rfas/hep_challenge.html). This was a collaborative research effort between two faculty and two graduate students in the College of Engineering and College of Education. The lead author was a faculty member in the Department of Engineering Education with a joint appointment in Agricultural and Biological Engineering (home department of the Food Process Engineering curriculum). This work resulted in not only the Journal of Engineering Education article but also two M.S. project reports.
References:
[1] Diefes, H.A., Okos, M.R., and Morgan, M.T., "Computer-aided Process Design Using Food Operations Oriented Design System Block Library," Journal of Food Engineering, Vol. 46, No. 2, 2000, pp. 99-108.
[2] Diefes-Dux, H.A., "FOODS-LIB: Food Operations Oriented Design System Block Library," <http://pasture.ecn.purdue.edu/~foodlib/>, accessed March 30, 2004.
[3] Kirkpatrick, D.L., Evaluating Training Programs: The Four Levels, San Francisco, CA: Berrett-Koehler Publishers, Inc., 1998.
[4] Kirkpatrick, D.L., "Great Ideas Revisited. Techniques for Evaluating Training Programs. Revisiting Kirkpatrick's Four-Level Model," Training & Development, Vol. 50, No. 1, 1996, pp. 54-59.