Engineering educators have long emphasized laboratory development and teaching [1], but only recently has increasing attention been paid to understanding and assessing the laboratory learning experience [2], [3]. Newstetter and Turns published a study in 2003 relating the laboratory learning experience to the problem-based learning experience in the classroom [4]. Peterson and Feisel [5] reported on concerns raised by Graham [6] over twenty years ago. Graham referred to the need for a better understanding of the learning process in the laboratory, focusing not on the what and how (experiments, setup, etc.) but on the why; in other words, on an understanding of the student’s learning experience. This is particularly relevant in computer engineering education, because new technologies, tools, and methods continue to emerge with the rapid advancement of the field. Courses and labs must provide the educational context and learning experiences that lead to an enduring understanding [7] of the subject and discipline. The achievement of depth of understanding through increasing levels of cognition was first introduced in Bloom’s taxonomy [8], [9]. This is the underpinning for the two main areas of our work: course design and assessment of cognitive learning in the laboratory. A course, CPRE 488, that is part of a curriculum on embedded computer systems was developed using a model called 3C5I that incorporates both Bloom’s levels and elements of problem-based learning [10]. The 3C5I model creates an educational context based on Concepts within Courses within a Curriculum (3C), and in each, progressing along the five learning steps (I’s) of Introduction, Illustration, Instruction, Investigation, and Implementation. Lecture-lab integration and advancing student learning were goals during course design; targets were set for the level of learning to be accomplished in lecture versus laboratory and across labs. Following two years of experience with the course, we used the study by Ulmer and Torres [11] as a framework to guide assessment of the level of cognitive behavior observed in the laboratory assignments completed by students in the class. This assessment is based on the Florida Taxonomy of Cognitive Behavior (FTCB) [12], [13], a modified form of Bloom’s taxonomy.
In the design of the course, the instructional team used the 3C5I learning model as a structured approach to organizing classroom and laboratory learning. The team met weekly over several months to brainstorm and plan the learning experiences that would support the course outcomes. A collaborative approach was used, with work in progress recorded on large self-stick wall sheets. These wall sheets remained posted for reference during team-based development of the course materials. Each sheet was labeled with a week of the semester, topics were listed for the lecture and/or laboratory, and the expected level of learning (Introduction, Illustration, Instruction, Investigation, Implementation) was identified in each case. The team thus created a plan that progressed through the learning steps for each topic and its related concepts over one or more lessons and throughout the course. For example, suppose that the learning goal for a concept is at the level of investigation. Team members then determined when and how to provide each of introduction, illustration, instruction, and investigation in the lectures and labs related to that concept. In this way, the course plan was driven by goals for student learning rather than simply by coverage of content. Course development also benefited from the composition of the instructional team, which included specialists responsible for particular roles and processes, a form of team teaching described by Bess [14].
Some specific background about the laboratory is needed before turning to our study of the laboratory learning experience. CPRE 488 is organized as three lecture hours and one three-hour lab session weekly. During the semester, there are nine labs and a lab project, all supervised by a trained teaching assistant, as well as regular homework assignments and exams. The nine lab sessions fall into three sets: Set 1 (Labs 1-3) – Introductory, Set 2 (Labs 4-6) – Digital Camera, and Set 3 (Labs 7-9) – MP3 Player. Sets 2 and 3 are motivated by two realistic applications. These labs were developed with a progression of learning in mind. At the end of the first set of labs, students are well acquainted with the lab environment. In Lab 3, the last lab in the introductory set, system performance is analyzed and concepts central to the next set of labs are introduced. In Lab 6, the last lab in the second set, students must apply the techniques from Lab 3 to meet system performance requirements in a complex application. The last set of labs also revisits earlier material. In Lab 7, a new component is added to the environment, and the same simple application from Lab 1 is used again so that students can see the new capabilities. Labs 8 and 9 use the new component in different complex applications, with Lab 8 drawing on various skills from previous labs.
The design of the course emphasized laboratory learning and providing students with opportunities for higher-order thinking, based on the educational premise that student learning is improved as students progress through higher cognitive levels. Thus, a key question of interest to the instructional team was whether the lab activities provided learning opportunities at higher cognition levels. Drawing on the expertise of a team member, we identified an FTCB-based methodology to assess cognitive behavior. In the FTCB, there are seven levels of cognitive behavior: (1) knowledge of specifics, of ways and means of dealing with specifics, and of universals and abstracts; (2) translation; (3) interpretation; (4) application; (5) analysis; (6) synthesis; and (7) evaluation. These are further refined into a list of fifty-five specific behaviors used for observation. In the Ulmer and Torres study [11], the FTCB was used to categorize teachers’ cognitive behaviors by observing and recording the behaviors of teachers and students in the classroom during six-minute intervals. We adapted this approach to categorize the cognitive behaviors observed in laboratory assignments over ten-point intervals. As a behavior was observed, a box was marked within the corresponding cognitive level. A behavior was recorded only once per ten-point interval, regardless of the number of times it occurred. For example, if a lab activity had a student perform a task (any task) three times within one interval, the box was marked only once; however, if a student performed a task during three different intervals, a box was marked in each of the three intervals. Each level of cognitive behavior was recorded as it was seen by each of three observers. This was done for every lab assignment. The observations were tallied for each of the seven cognitive levels and reported as percentages.
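To make the tallying procedure concrete, the following is a minimal sketch in Python of how observed behaviors could be recorded at most once per interval per cognitive level and then reported as percentages. The data structures and function names are hypothetical illustrations, not the instrument or software used in the study.

```python
# Sketch of the adapted FTCB tallying procedure (illustrative only).
from collections import defaultdict

FTCB_LEVELS = [
    "knowledge", "translation", "interpretation",
    "application", "analysis", "synthesis", "evaluation",
]

def tally_observations(observations):
    """observations: list of (interval_id, level) pairs noted by one observer
    while reviewing a lab assignment. A level is counted at most once per
    interval, no matter how many times it occurs within that interval."""
    marked = set()              # (interval_id, level) boxes already marked
    counts = defaultdict(int)   # total marks per cognitive level
    for interval_id, level in observations:
        if (interval_id, level) not in marked:
            marked.add((interval_id, level))
            counts[level] += 1
    total = sum(counts.values())
    if total == 0:
        return {}
    # Report each level as a percentage of all marked boxes for this assignment.
    return {lvl: 100.0 * counts[lvl] / total for lvl in FTCB_LEVELS}

# Example: "analysis" occurs twice in interval 1 but is marked only once there,
# and once more in interval 2, so it contributes two marks in total.
obs = [(1, "analysis"), (1, "analysis"), (1, "application"), (2, "analysis")]
print(tally_observations(obs))
```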
Averaging across the three observers’ ratings of the cognitive level of the nine labs, 62.9% of all behaviors observed were higher-order behaviors (defined as levels 4-7). Of the higher-order cognitive behaviors, the most common was analysis (23.7%) and the least common was evaluation (4.5%). Of the lower-order cognitive behaviors, the most common was interpretation (19.6%) and the least common was knowledge (8.3%). Data were analyzed using various methods, including analysis of variance and Pearson product-moment correlations. There is a statistically significant difference in the ratings provided by the three observers; such observer differences are also evident in the literature [11], [15], [16], [17]. An analysis of the mean cognitive level for each lab shows that the differences between labs are statistically significant, including a significantly higher mean for Lab 6 (4.95) compared to Lab 4 (3.44). The general trend is toward increased sophistication, measured by a higher mean cognitive level, over time. The mean increases from Lab 1 to Lab 2 to Lab 3. Lab 4 has the lowest mean. The mean then increases for Lab 5, and again for Lab 6, which has the highest mean. The means for Labs 7 and 8 are approximately the same, and Lab 9 has the second-highest mean. The differences between labs are consistent with the design of the labs. Labs 1-3 form an introductory set, with more advanced skills introduced in each lab. Labs 4-6 are based on the digital camera application, with increasing complexity; because this set is based on a realistic application, it should reach a higher level of cognition than the first set of labs. Labs 7-9 form a third set, with Lab 7 being somewhat independent; Labs 8 and 9 then build from Lab 7 on another realistic application (the MP3 player). One might have expected Lab 8 to be rated higher than Lab 7. However, the results reflect that students must make a bigger jump in the level of tasks performed from start to finish in Labs 4-6 than in Labs 7-9. The pattern in the progression of cognitive skills is notable: students progress from lower-order thinking skills to higher-order skills within each set of labs, and start each set at a relatively low skill level. While a detailed study of student performance was not conducted, the correlation between the mean percentage student score on each lab and the lab’s mean cognitive level is statistically significant.
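As an illustration of the kind of statistical analysis described above (not the authors’ actual scripts or data), the sketch below shows how per-lab observer ratings could be compared with a one-way analysis of variance and how the lab-level relationship between mean cognitive level and mean student score could be checked with a Pearson product-moment correlation, using SciPy. All numbers are made-up placeholders.

```python
# Illustrative analysis sketch using SciPy; the ratings and scores below are
# hypothetical placeholders, not the data reported in the study.
from scipy import stats

# Mean cognitive-level rating of each of the nine labs, one list per observer.
observer_ratings = [
    [3.2, 3.6, 4.1, 3.4, 4.0, 4.9, 4.2, 4.3, 4.6],   # observer A (hypothetical)
    [3.0, 3.5, 3.9, 3.5, 4.2, 5.0, 4.1, 4.2, 4.7],   # observer B (hypothetical)
    [3.4, 3.8, 4.3, 3.4, 4.1, 4.9, 4.3, 4.3, 4.8],   # observer C (hypothetical)
]

# One-way ANOVA: do the three observers differ systematically in their ratings?
f_stat, p_obs = stats.f_oneway(*observer_ratings)
print(f"Observer effect: F = {f_stat:.2f}, p = {p_obs:.3f}")

# Mean cognitive level per lab (averaged over observers) versus mean student score.
mean_level = [sum(col) / len(col) for col in zip(*observer_ratings)]
mean_score = [92, 90, 88, 93, 89, 84, 90, 88, 86]   # hypothetical percentages
r, p_corr = stats.pearsonr(mean_level, mean_score)
print(f"Cognitive level vs. student score: r = {r:.2f}, p = {p_corr:.3f}")
```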
Finally, a commonly used measure with FTCB data is the total cognitive weighted score [17]. This is calculated as the weighted average of the percentage distribution of observed behaviors, where the weights are .10 for knowledge, .20 for translation, .25 for interpretation, .30 for application, .40 for analysis, and .50 for both synthesis and evaluation. The lab assignments were found to have a total cognitive weighted score of 33.87, indicating an average cognitive level for laboratory activities above the application level. As with the results for mean cognitive level, there is a statistically significant negative correlation between the total cognitive weighted scores and mean student performance on the labs. This finding supports the conclusion that assignments requiring more sophisticated thinking from students make the labs better instruments for discriminating between students who are “getting it” at a higher cognitive level and those who are not.
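The weighted-score calculation can be written out directly. The sketch below applies the stated weights to a percentage distribution of observed behaviors; the example distribution is hypothetical and does not reproduce the study’s data. Because the application weight is .30, any score above 30 corresponds to an average cognitive level above application.

```python
# Total cognitive weighted score: weighted average of the percentage
# distribution of observed behaviors, using the weights stated in the text.
WEIGHTS = {
    "knowledge": 0.10, "translation": 0.20, "interpretation": 0.25,
    "application": 0.30, "analysis": 0.40, "synthesis": 0.50, "evaluation": 0.50,
}

def total_cognitive_weighted_score(percentages):
    """percentages: dict mapping each FTCB level to its percentage of all
    observed behaviors (values should sum to roughly 100)."""
    return sum(WEIGHTS[level] * pct for level, pct in percentages.items())

# Hypothetical distribution for illustration only (not the study's data):
example = {"knowledge": 10, "translation": 10, "interpretation": 20,
           "application": 20, "analysis": 25, "synthesis": 10, "evaluation": 5}
print(total_cognitive_weighted_score(example))   # 31.5 for this made-up example
```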
The assessment of cognitive levels using FTCB not only provided insights into CPRE 488 and the cognitive depth of teaching and learning, but also led to greater reflection by the instructional team. For example, what patterns of lower-order and higher-order learning should be achieved in a lab? In a course? In a curriculum? Ulmer makes an interesting observation on the difference between lower-order thinking and higher-order thinking, as follows: “… the difference is influenced by prior knowledge held by the learner. What may require higher-order thinking by one learner may require lower-order thinking by another learner. Arguably, what may require higher-order thinking by a learner today may not require the same level of thinking tomorrow.” [15] This observation seems particularly relevant to a dynamic field such as computer engineering, and reinforces the need for learner-centered approaches informed by cognitive models. Several specific activities that would build on the results of this study and enhance teaching and learning in the course include: analysis of classroom learning and comparison with lab learning; measurement of student performance on learning outcomes in relation to statistical results of this study; and selected refinements to lab exercises to engage students in higher order thinking.
This work was partially supported under NSF grant No. 0431924 and a GAANN grant from the U.S. Department of Education to the Information Infrastructure Institute at Iowa State University, and also through support from the Rockwell Collins Foundation and Xilinx.
References
[1] P. C. Wankat and F. S. Oreovicz, Teaching Engineering, New York, NY: McGraw-Hill, 1993.
[2] L. D. Feisel and G. D. Peterson, "A Colloquy On Learning Objectives for Engineering Education Laboratories," in 2002 Proc. American Society for Engineering Education Annual Conference & Exposition, 2002.
[3] L. D. Feisel, G. D. Peterson, O. Arnas, L. Carter, A. Rosa and W. Worek, “Learning Objectives for Engineering Education Laboratories,” in 2002 Proc. ASEE/IEEE Frontiers in Education Conf., 2002.
[4] W. C. Newstetter and J. Turns, “Looking for Convergence: Laboratory Learning and Classroom Learning,” in 2003 Proc. ASEE/IEEE Frontiers in Education Conf., 2003.
[5] G. D. Peterson and L. D. Feisel, “e-Learning: The Challenge for Engineering Education,” Proc. of 2002 eTEE Conference, vol. P1, article 25, 2002.
[6] R. Graham, “Needed: A Theory of Laboratory Instruction”, The Undergraduate Engineering Laboratory, Engineering Foundation Publication, 1983.
[7] G. Wiggins and J. McTighe, Understanding by Design, 2nd ed., New York, NY: Prentice Hall, 1998.
[8] B. S. Bloom and D. Krathwohl, Taxonomy of educational objectives: The classification of educational goals, Handbook I: Cognitive domain, New York, NY: Longmans Press, 1956.
[9] L. Anderson and D. Krathwohl, A taxonomy for learning, teaching and assessing: A revision of Bloom’s taxonomy of educational objectives, 2nd ed., New York, NY: Allyn & Bacon, 2001.
[10] A. Striegel and D. Rover, “Problem-based learning in an introductory computer engineering course,” Proc. 2002 IEEE/ASEE Frontiers in Education Conf., 2002.
[11] J. D. Ulmer and R. M. Torres, “A Classroom Assessment of Agriculture Teachers’ Cognitive Behaviors,” in 2007 Proc. AAAE Research Conf., vol. 34, pp. 123-137.
[12] J. N. Webb, The Florida taxonomy of cognitive behavior: A working manual, University of Alabama, Tuscaloosa, AL, 1968.
[13] B. B. Brown, R. Ober, R. Soar, and J. N. Webb, Florida taxonomy of cognitive behavior: Directions, unpublished manuscript, Gainesville, FL: University of Florida.
[14] J. L. Bess, Teaching Alone/Teaching Together: Transforming the Structure of Teams for Teaching, San Francisco, CA: Jossey-Bass, 2000.
[15] J. D. Ulmer, “An Assessment of the Cognitive Behavior Exhibited by Secondary Agriculture Teachers,” Ph.D. dissertation, Agricultural Education Dept., University of Missouri-Columbia, Columbia, MO, 2005.
[16] A. B. Smith, G. Ward, and J. S. Rosenshein, “Improving Instruction by Measuring Teacher Discussion Skills,” American Journal of Physics, vol. 45, no. 1, pp. 83-87, Jan. 1977.
[17] M. S. Whittington and L. H. Newcomb, “Aspired Cognitive Level of Instruction, Assessed Cognitive Level of Instruction and Attitude Toward Teaching at Higher Cognitive Levels,” Journal of Agricultural Education, vol. 34, no. 2, pp. 55-62, 1993.