Problem-based learning in photonics technology education: assessing student learning

Nicholas Massa, Michele Dischino, Judith Donnelly, and Fenna Hanes

Proceedings Volume 9666, 11th Education and Training in Optics and Photonics Conference; 96661K (5 June 2009) https://doi.org/10.1117/12.2208043
Event: Eleventh International Topical Meeting on Education and Training in Optics and Photonics, 2009, St. Asaph, United Kingdom
Abstract
Problem-based learning (PBL) is an instructional approach whereby students learn course content by collaboratively solving complex real-world problems and reflecting on their experience. Research shows that PBL improves student knowledge and retention, motivation, problem-solving skills, and the ability to skillfully apply knowledge in new situations. One of the challenges with PBL, however, is that real-world problems are typically open-ended with more than one possible solution, which complicates the assessment of student performance. In this paper, we describe an approach to assessing student performance in PBL developed by the PHOTON PBL Project, a three-year National Science Foundation Advanced Technological Education (NSF-ATE) project in which eight interdisciplinary multimedia PBL “Challenges” were created in collaboration with photonics industry and university partners for use in high school and college math, science and technology courses. Assessment included measures of content knowledge, conceptual knowledge, problem-solving skills, motivation, self-efficacy, and metacognitive ability. Results from pilot testing at four community college photonics technology programs are presented.

1.

INTRODUCTION

The new global innovation economy demands creative, teamwork-oriented problem solvers capable of adapting to the ever-changing needs of business and industry. This is especially true in the field of photonics, in which rapid advances in technology require engineers and technicians to apply their knowledge and skills to problems in new and emerging situations. But what does it mean to be a good problem solver? Problem solving has been described as “knowing what to do when you don’t know what to do.”1 Researchers generally agree that a good problem solver is someone who can approach any problem and systematically dissect, analyze, and formulate a coherent and viable strategy for solving it. Good problem solvers are patient and methodical, carefully considering all options before moving toward a solution. They break complex problems down into smaller, more manageable steps, making reasoned decisions about how to approach each step. They use metacognitive strategies to manage the problem-solving process, planning, monitoring, and evaluating their progress and strategies and adjusting their approach when necessary. They persist in the face of difficulty and have the confidence and motivation to seek alternative solutions2,3,4,5. Unfortunately, traditional instructor-centered approaches to education do not provide adequate opportunities for students to develop the knowledge, skills, and attitudes necessary to become good problem solvers. One instructional method that has been shown to be effective in helping students develop these skills is PBL.

PBL is a learner-centered instructional method in which students learn by solving authentic real-world problems, actively and collaboratively. In PBL, the instructor serves as a facilitator or consultant, guiding students through the problem-solving process and providing instruction on an “as needed” basis. Research shows that PBL results in “deep” rather than “surface” learning and improves critical thinking and problem-solving skills, motivation for learning, and students’ ability to skillfully apply knowledge in new and novel situations – skills deemed critical for lifelong learning6,7,8,9. In PBL, students actively participate in their own learning by solving real-world problems whose parameters are ill-defined and ambiguous. Unlike traditional instruction, in which students attend lectures, solve well-defined end-of-chapter homework problems, and engage in highly structured “cookbook” laboratory activities, PBL is open-ended and contextualized, and student learning is driven by the problem itself. With PBL, students learn the process of learning in addition to course content by engaging in a systematic and recursive process that begins with problem analysis: carefully and methodically dissecting and framing a problem by reflecting on prior knowledge to identify knowledge gaps, situational constraints, and other pertinent problem features required to formulate a solution. Once the problem has been properly framed, students engage in self-directed learning to acquire the knowledge and skills needed to solve the problem, then brainstorm possible solutions with peers, and finally engage in solution testing, developing viable strategies to test and validate their solutions.

One of the challenges of implementing PBL in photonics technology education, however, has been the lack of resources and training available to educators to help them transition to PBL. To address this challenge, the New England Board of Higher Education received funding for project PHOTON PBL from the National Science Foundation Advanced Technological Education (NSF-ATE) program to (1) create eight multimedia PBL “Challenges” in partnership with photonics industry and university research labs, (2) provide professional development to high school teachers and college faculty in the principles and applications of PBL using the PBL Challenges, and (3) conduct research into the efficacy of PBL in photonics technology education.

2.

THE PHOTON PBL CHALLENGES

The PHOTON PBL Challenges are self-contained multimedia instructional modules designed to develop students’ problem-solving ability and understanding of photonics concepts and applications. The Challenges present students with authentic real-world photonics technology problems in a multimedia format designed to emulate the real-world context in which the problems were encountered and solved. Each PBL Challenge contains five main sections: (1) Introduction - An overview of the particular photonics topic to be explored; (2) Company/University Overview - An overview of the organization that solved the problem, to set the context; (3) Problem Statement - A re-enactment of an authentic real-world photonics problem as originally presented to the organization’s technical team; (4) Problem Discussion - A re-enactment of the brainstorming session engaged in by the organization’s technical team; and (5) Problem Solution - A detailed description of the organization’s solution to the problem. The Problem Discussion and Problem Solution sections are password protected, allowing instructors to control the flow of information and the pace of instruction. Each of the five main sections contains additional information and resources (i.e., scripts, websites, spec sheets, etc.) designed to guide the student through the problem-solving process. Designed to be implemented at three levels of structure ranging from highly structured (instructor led) to guided (instructor guided) to open-ended (instructor as consultant), the PHOTON PBL Challenges provide the scaffolds needed to develop students’ problem-solving skills along a developmental continuum.

One unique feature of the PHOTON PBL Challenges is the “Problem Solvers Toolbox.” The Problem Solvers Toolbox helps students develop a systematic approach to problem solving through a feature called “The Whiteboards.” Four Whiteboards guide students through a four-phase problem solving process:

  • Problem Analysis – Identifying what is known, what needs to be learned, and any problem constraints to properly frame the problem.

  • Self-Directed Learning – Setting specific learning goals, identifying necessary resources, and developing a timeline for achieving those goals.

  • Brainstorming – Collaboratively generating and evaluating ideas and alternative solutions best suited for addressing the task at hand.

  • Solution Testing – Developing a plan to validate the solution based on specific performance criteria.

The Whiteboards help students systematically capture and document their thoughts, ideas, and learning strategies during each stage of the problem-solving process. Eight PBL Challenges have been developed to date in collaboration with photonics industry and university partners and are available online at http://vilenski.org/pub. An Implementation Guide for Teachers and several related conference publications and resources providing a complete description of the PBL Challenges are available online at www.photonprojects.org.
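To make the four-phase Whiteboard structure concrete, the following minimal Python sketch shows one hypothetical way a team’s Whiteboard entries could be captured digitally; the field names are illustrative assumptions, not part of the published PHOTON PBL materials.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Whiteboards:
    # Phase 1: Problem Analysis - knowns, unknowns, and constraints
    knowns: List[str] = field(default_factory=list)
    unknowns: List[str] = field(default_factory=list)
    constraints: List[str] = field(default_factory=list)
    # Phase 2: Self-Directed Learning - goals, resources, and timeline
    learning_goals: List[str] = field(default_factory=list)
    resources: List[str] = field(default_factory=list)
    timeline: List[str] = field(default_factory=list)
    # Phase 3: Brainstorming - candidate solutions generated by the team
    candidate_solutions: List[str] = field(default_factory=list)
    # Phase 4: Solution Testing - plan for validating the chosen solution
    test_plan: List[str] = field(default_factory=list)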

3.

ASSESSING STUDENT LEARNING

Educational assessment usually involves measuring students’ knowledge, skills, and attitudes. Assessing student learning in PBL, however, presents a unique challenge for educators accustomed to traditional assessment methods. Researchers agree that while traditional assessment methods such as multiple choice questions, true/false tests, and well-defined “end-of-chapter” problems are a convenient and effective way to measure students’ factual knowledge and recall, they do not adequately capture higher-order thinking skills10,11. To accurately assess student performance in PBL, measures must capture not only factual and conceptual knowledge within a specific content area but, more importantly, problem-solving ability: the ability to skillfully apply factual and conceptual knowledge to solve problems in new situations. Whereas factual and conceptual knowledge can be assessed using traditional methods such as quizzes, tests, and concept maps, assessing problem-solving ability is more difficult because it involves capturing the process by which students solve problems.

3.1

PHOTON PBL Assessment Model

The PHOTON PBL approach for assessing student learning in PBL was informed by research conducted by the Vanderbilt-Northwestern-Texas-Harvard-MIT (VaNTH) Research Center for Bioengineering Educational Technologies on assessing adaptive expertise12. Research on the development of expertise shows that, in contrast to novices, experts rely on a readily accessible foundation of factual knowledge organized into a conceptual schema centered on core principles or concepts. Adaptive experts are able to apply this knowledge to problems in a variety of situations and contexts by recognizing similar features and underlying principles. Based on this research, the VaNTH model for assessing adaptive expertise consists of three weighted measures: content knowledge, conceptual knowledge, and transfer (the ability to apply factual and conceptual knowledge in new and novel situations)12,13,14. Similarly, the PHOTON PBL assessment model shown in Figure 1 includes three measures: content knowledge, conceptual knowledge, and problem-solving ability, with specific weights assigned by the instructor depending on the course format.

Figure 1 – Student Assessment in PBL
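As a minimal sketch of how the weighted model in Figure 1 might be applied, the Python fragment below combines the three measures into a composite score; the weights and scores are invented for illustration, since the paper leaves the weighting to the instructor.

def composite_score(content, conceptual, problem_solving,
                    weights=(0.4, 0.3, 0.3)):
    """Weighted composite of three 0-100 measures; weights must sum to 1."""
    w_content, w_conceptual, w_ps = weights
    assert abs(w_content + w_conceptual + w_ps - 1.0) < 1e-9
    return w_content * content + w_conceptual * conceptual + w_ps * problem_solving

# Example: a lecture-heavy course might weight content knowledge more heavily.
print(composite_score(85, 70, 78))  # -> 78.4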

3.1.1

Content Knowledge

Content knowledge refers to a student’s understanding of key facts and principles within a specific domain of knowledge. Each PBL Challenge includes a test bank consisting of multiple-choice questions, closed-ended problems, and higher-level thought-provoking questions centered on the specific technical content associated with the Challenge. To enhance adaptability, the PBL Challenges were designed to be implemented either as a supplemental activity in a traditional course or as a stand-alone instructional method, allowing instructors the flexibility to assess content knowledge in several different ways, including traditional textbook assignments, lab reports, quizzes, and tests, and to assign an appropriate weight to the measure. We recommend pre-post testing for each PBL Challenge introduced to provide a measure of the improvement in content knowledge associated with each Challenge.
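The paper recommends pre-post testing but does not prescribe a gain metric. As one common choice, the sketch below computes the normalized gain (Hake’s g), the fraction of the available improvement a student actually achieved; the scores shown are invented.

def normalized_gain(pre, post, max_score=100.0):
    """Return (post - pre) / (max_score - pre), the fraction of possible gain."""
    if max_score - pre <= 0:
        raise ValueError("pre-test score is already at the maximum")
    return (post - pre) / (max_score - pre)

print(normalized_gain(55, 82))  # -> 0.6, i.e., 60% of the possible gain realized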

3.1.2

Conceptual Knowledge

Conceptual knowledge refers to a student’s understanding of the relationships between key concepts within a particular domain of knowledge. Research on expertise shows that, compared to novices, experts have a deep foundation of factual knowledge, understand facts and ideas in the context of a conceptual framework centered on core concepts and principles, and organize knowledge in ways that facilitate retrieval and application. Experts’ rich, interrelated framework of concepts and principles allows them to understand and give meaning to new information by seeing patterns and relationships that are not apparent to novices, and to access relevant information more efficiently. As expertise in a particular domain increases through experience and practice, an individual’s conceptual framework becomes more complex and interrelated, improving their ability to transfer learning to new situations and domains12,13,14.

One method for assessing conceptual knowledge is concept mapping. Originally developed in 1972 by Joseph Novak at Cornell University, concept maps typically consist of groupings of circles labeled with key concepts, connected by lines and arrows that are labeled with words describing the relationship between the concepts15. Each pair of linked concepts produces a proposition whose validity represents a measure of a student’s understanding of the relationship between the two concepts. The overall number of connections between concepts, together with the validity of the propositions formed, represents a measure of a student’s conceptual knowledge in a particular domain. While a number of different methods for scoring concept maps exist in the literature, scoring is usually based on the number of connections formed and the quality and validity of the propositions generated.
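A minimal sketch of proposition-based scoring as just described: each (concept, linking phrase, concept) triple is a proposition, an expert assigns each a validity judgment, and the score reflects both the number of connections and their quality. The scoring details and example propositions are illustrative assumptions, not the PHOTON PBL rubric itself.

from typing import List, Tuple

Proposition = Tuple[str, str, str]  # (concept A, linking phrase, concept B)

def score_concept_map(propositions: List[Proposition],
                      validity: List[float]) -> float:
    """Sum each proposition's expert-judged validity (each in [0, 1])."""
    if len(propositions) != len(validity):
        raise ValueError("one validity judgment is required per proposition")
    return sum(validity)

student_map = [("laser", "emits", "coherent light"),
               ("coherent light", "enables", "interference")]
print(score_concept_map(student_map, [1.0, 0.5]))  # -> 1.5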

Each PBL Challenge contains a list of main concepts related to the topic being explored, a reference or “expert” concept map for instructors, detailed instructions for students on how to construct a concept map, and a concept map scoring rubric. Instructors are encouraged to introduce concept mapping using a simple topic to ensure that students understand the concept mapping process and how their maps will be scored before assigning a concept mapping exercise for the PBL Challenge.

3.1.3

Problem Solving Ability

In the VaNTH model for assessing adaptive expertise, researchers defined transfer as “the extent to which students recognized the relationship between what had been taught and the new situation presented in the problem”12. In the PHOTON PBL Challenges, we depart somewhat from the VaNTH model in how we define the transfer variable. We believe that problem-solving ability is the cornerstone of transfer and is a measure of metacognitive ability, which includes reflecting on prior knowledge, identifying key problem parameters and features, developing and planning a coherent strategy for solving a problem, setting specific learning goals to acquire the knowledge and skills needed to solve the problem, monitoring progress while problem solving, and evaluating the effectiveness of problem-solving strategies after a solution has been developed. Research has shown that metacognitive ability is a key factor linked to students’ ability to transfer knowledge and skills to new situations13,16,17.

Measuring problem solving ability involves both formative and summative assessments. Formative or in-process assessment is accomplished via the Whiteboards. As students collaboratively engage a problem by completing the four Whiteboards, they reflect upon and elucidate their current state of understanding, their thought process, and problem solving strategies. Research shows that verbalizing the thought process while engaging in problem solving improves metacognition, which is essential for effective problem solving18. Summative or post-process assessment is accomplished using the Final Challenge Report, a reflective journal. Upon completion of a PBL Challenge, students reflect upon and provide a detailed summary of each stage of the problem-solving process in which they have engaged. The Final Challenge Report represents a synthesis of the knowledge, skills, and strategies employed in solving the PBL challenge. Researchers maintain that this final reflective exercise is essential in the development of effective problem-solving skills4. A scoring rubric is used to grade the Final Challenge Report.

3.2

Motivation and Self-Efficacy

The attitudes that students bring to learning situations are an important factor in their overall performance in problem-solving situations. Motivation affects the amount of effort a student is willing to commit to a particular activity and can vary depending on the value that the student places on the activity. Students who engage in a learning activity out of personal interest in the topic, or learning for learning’s sake, are said to be intrinsically motivated or mastery oriented. In contrast, students who engage in a learning activity for external rewards such as a good grade or a promotion are said to be extrinsically motivated or goal oriented. Research shows that while both motivational orientations are important for achieving learning objectives, students who are intrinsically motivated are more likely to engage in “deep learning” and to persist in the face of difficulty19. Self-efficacy refers to a student’s confidence that he or she will be successful in a particular learning endeavor. Research shows that self-efficacy is an important factor in positive learning outcomes and can moderate the amount of effort learners put forth in achieving specific learning objectives20. In this study, intrinsic motivation, extrinsic motivation, and self-efficacy were measured using selected subscales from the Motivated Strategies for Learning Questionnaire (MSLQ)21. The MSLQ is a widely used and validated 81-item Likert-scaled instrument designed to assess motivation and the use of learning strategies by college students.

4.

METHODS

This pilot study was conducted during the 2008-2009 academic year as an observational case study. Based on the literature22,23,24, this approach was chosen to obtain a rich description of how engagement with the PHOTON PBL Challenges influences student learning, self-efficacy, motivation, and metacognitive ability. Quantitative and qualitative measures were applied to answer the following research questions:

  1. How and to what extent does PBL affect student learning outcomes?

  2. How and to what extent does PBL affect student motivation?

  3. How and to what extent does PBL affect student self-efficacy?

  4. How and to what extent does PBL affect student metacognitive ability?

In this study, student learning outcomes are defined in terms of content knowledge, conceptual knowledge, and problem-solving ability as described previously. Motivation is defined using two constructs: (1) intrinsic goal orientation – the extent to which students are intrinsically motivated to engage in the particular problem-solving task; and (2) extrinsic goal orientation – the extent to which students are motivated to engage in the particular problem-solving task for external rewards (i.e., a grade). Self-efficacy is defined as a student’s confidence in his or her ability to solve real-world open-ended problems. Metacognitive ability is defined as a continuous and integrated process that uses reflection skills and metacognitive knowledge to plan, monitor, and evaluate one’s learning.

4.1

Data Collection

Participants in the study were students and instructors from four community college photonics technology classes. Prior to the study, instructors participated in one of two weeklong professional development workshops conducted in summer 2007 and summer 2008 in which they learned the principles and applications of PBL through engagement with the PHOTON PBL Challenges. Instructors participating in the study were required to (1) complete at least two PBL Challenges over the course of the fall 2008 and/or spring 2009 semesters with their students, and (2) provide samples of student concept maps, Whiteboard data, and Final Challenge Reports for each PBL Challenge completed. Students were invited to participate in the study by volunteering to complete a pre-post online survey (MSLQ) at the beginning and end of the fall 2008 and/or spring 2009 semesters, and participate in a personal interview at the end of the spring 2009 semester. To encourage student participation, a cash-prize raffle was held for those students who completed both pre- and post online surveys. Researchers were available to respond to any questions or concerns via email, BlackBoard®, and telephone.

Four instructors (3 male, 1 female) and 21 students (15 male, 6 female) from four community colleges participated in the pilot study. Six of the 21 students (3 male, 3 female) who completed the pre-post online survey (MSLQ) participated in personal interviews.

4.2

Data Analysis and Results

4.2.1

Content knowledge

Content knowledge was to be assessed using student performance on the pre-post content knowledge tests included with the PBL Challenges. Unfortunately, due to variations in curricula and instructor implementation methods, there were insufficient data available for analysis of this measure. However, because the PBL Challenges were implemented as supplemental activities in traditional lecture-based courses, instructors were able to compare the grade averages of PBL students on traditional measures with the historical grade averages of non-PBL students in past classes; instructors reported that grade performance for PBL students was “comparable to better” than historical student grade performance.

Additional information regarding the effect on students’ content knowledge was acquired through thematic analysis of student interviews. When asked, “Did your experience with the PBL Challenges improve your understanding of the technical content (optics and photonics) in your course?” all six students interviewed replied positively. Sample student responses included:

  • “Yes, absolutely. You had to think outside the box…think for yourself, instead of having the teacher tell you what to do or think…”

  • “I do believe so, because it’s easier to learn something if you have to do it yourself as opposed to just lecturing because…someone can lecture to you for 10 minutes and you won’t retain as much as if you had to do it yourself.”

  • “Yes, it helps to have an actual problem to know where to research, so it kind of creates a focus point…you get set areas to focus on to do research on, so that focus does help…and the practicality, knowing what you’re working on has applications in the real world makes you want to learn more.”

While individual pre-post scores for content knowledge were not available for this pilot study, results from student and instructor interviews suggest that student content knowledge in classes supplemented with PBL was, at minimum, comparable to that resulting from traditional lecture-based instruction. Results from student interviews also revealed that the problems presented in the PBL Challenges helped link course content to real-world applications, resulting in a more contextualized and meaningful learning experience, which has been linked to improved student learning outcomes16.

4.2.2

Conceptual Knowledge

Conceptual knowledge was assessed through scoring of students’ concept maps. For each PBL Challenge completed, students were provided with a list of 10–20 main concepts and specific instructions for creating a concept map. Instructors were given detailed instructions on how to guide students in developing their concept maps as well as a reference concept map developed by the PHOTON PBL principal investigators. Students completed their concept maps as a group activity after completing each Challenge. A scoring rubric was used to evaluate the quality and validity of the concept maps. The rubric consisted of two main components, proposition validity and presentation, on which student performance was scored on a scale from 1 to 4 (1=poor, 2=fair, 3=good, and 4=excellent). The proposition validity criterion included attributes relating to accuracy, depth of knowledge reflected, and number of propositions generated. The presentation criterion included attributes relating to strength of organization, legibility, and clarity. Development of the rubric and scoring of the concept maps were performed by two experts to ensure face validity and interrater reliability. Additional information regarding students’ conceptual knowledge was acquired through thematic analysis of student interview transcripts.
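The paper states that two experts scored the maps to ensure interrater reliability but does not name the statistic used. As one simple option, the sketch below computes exact and within-one-level percent agreement on 1-4 rubric scores; the scores are invented for illustration.

def percent_agreement(rater_a, rater_b, tolerance=0):
    """Fraction of items on which two raters agree within `tolerance` levels."""
    assert len(rater_a) == len(rater_b)
    agree = sum(1 for a, b in zip(rater_a, rater_b) if abs(a - b) <= tolerance)
    return agree / len(rater_a)

expert_1 = [3, 3, 4, 2, 3, 4, 3, 2, 3, 4]
expert_2 = [3, 2, 4, 2, 3, 4, 4, 2, 3, 3]
print(percent_agreement(expert_1, expert_2))     # exact agreement: 0.7
print(percent_agreement(expert_1, expert_2, 1))  # within one level: 1.0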

Comparisons were made between initial and subsequent concept maps to investigate whether, and to what degree, students’ conceptual knowledge had changed after completing two or more PBL Challenges. Samples of 10 initial and 10 subsequent concept maps were evaluated. Because students worked in small teams to develop their concept maps, individual student scores were not available. Two experts scored each concept map independently and mean scores were calculated for each. The mean score for the 10 initial concept maps was 2.86/4.0. The mean score for the 10 subsequent concept maps was 3.42/4.0, an improvement of (3.42 − 2.86)/2.86 ≈ 20%.

To gain further insight into whether these results were due to an increase in conceptual understanding, an improved ability to construct concept maps, or both, student interviews were conducted in which interviewees were asked, “Did you find the concept mapping exercise valuable in helping you to understand the relationships between the concepts presented?” Sample student responses included:

  • “Yes, in the end…at first it was complicated to grasp them all together (concepts)…but when it’s finally completed, being able to see the sentence go together with the two concepts and the relationships with all the others… all those focus points… it brings it all together.”

  • “I found them valuable… you can interconnect ideas that you never think would be connected in any way. I found it valuable also because it lets you hear other people’s ideas which could be better than yours…and it just allows you to put them all together.”

  • “I like the concept maps… I now use them in writing my English papers a lot. It helps you to organize your thoughts…how, like, every idea connects with something else.”

These results suggest that as students progressed through the PBL Challenges, their conceptual knowledge as measured by their concept maps, as well as their skill in creating concept maps, improved, reflecting a deeper understanding of the interrelationships between the concepts presented. In addition, student interview results revealed that students viewed the concept maps as a valuable tool for visualizing and understanding relationships between concepts of which they otherwise may not have been aware. Moreover, while some students found concept maps somewhat tedious to construct, they also found them to be a valuable tool for engaging in collaborative learning that provided a unique opportunity to gauge their understanding through feedback from, and exchange of ideas with, others.

4.2.3

Problem Solving Ability

Problem-solving ability was assessed through analysis and scoring of students’ Final Challenge Reports, which provided a reflective summary of the problem-solving process engaged in by the students. In the Final Challenge Report, students responded to five probing questions that required them to reflect on their problem-solving experience as captured in the Whiteboards. Responses to the five questions provided a measure of the knowledge, skills, and strategies students employed in solving the PBL Challenge, and of the reflective judgment used in assessing their problem solution as compared with the PBL Challenge solution. The scoring rubric used to evaluate students’ Final Challenge Reports consisted of performance criteria relating to the five questions, on which student performance was scored on a scale from 1 to 4 (1=poor, 2=fair, 3=good, and 4=excellent). Performance criteria included: clearly and precisely defining problem parameters and constraints, identifying the knowledge and skills needed to solve the problem, setting specific and appropriate learning goals, collaborative brainstorming of solutions, development of a viable test plan, solution quality and effectiveness, and comparing and contrasting with the PBL Challenge solution. Development of the scoring rubric and independent scoring of the Final Challenge Reports were performed by two experts to ensure face validity and interrater reliability. Additional information regarding students’ problem-solving ability was acquired through thematic analysis of student interview transcripts.

A sample of 10 Final Challenge Reports was evaluated to explore how and to what extent students’ experience with the PBL Challenges affected their problem-solving ability. Individual student scores were not available because of the team-based approach used by instructors. Two experts scored each Final Challenge Report independently, and a mean score of 3.34/4.0 was calculated for the 10 reports. Analysis of the Final Challenge Reports revealed that students used the Problem-Solver’s Toolbox effectively to guide them in developing their problem solutions. The majority of students provided clear and detailed information regarding prior knowledge, setting learning goals, generating alternative solutions, and developing methods for testing their solutions. While instructors did report some angst among certain students with regard to their willingness to carefully document their problem-solving process using the Whiteboards (e.g., “Some students just want to jump right to the solution without taking the time to carefully examine all aspects of the problem.”), the majority of students did appear to work through the problems in a systematic fashion and were able to converge on solutions that in most cases were very similar to the organization’s solution.

To gain further insight into the problem-solving ability students developed through engagement with the PBL Challenges, semi-structured interviews were conducted with six volunteer students at the end of the spring 2009 semester. Of the six students, three had completed two Challenges and three had completed four or more Challenges. Student selection was based on convenience and availability. Each of the six students interviewed was presented with a hypothetical problem and asked to comment on: (1) the process by which they would solve the problem; (2) whether their experience with the PBL Challenges had helped them in their ability to solve the problem; and (3) how they would have solved the problem prior to their experience with the PBL Challenges. Responses were recorded, transcribed, and analyzed for evidence of problem-solving ability consistent with the methods prescribed in the Problem-Solver’s Toolbox.

Problem Statement: A telecommunications company would like to build a free-space optical communications system to transmit data from the roof of a 40-story high-rise building in lower Manhattan to the roof of another 40-story high-rise building in upper Manhattan, two miles away. They want the system to be eye-safe to avoid any possible hazards to low-flying aircraft or birds that may cross the beam’s path.

Interview Question 1: How would you solve this problem? Explain your steps.

Analysis of students’ responses revealed variations in problem-solving ability depending on the number of PBL Challenges completed. Students who had completed four or more PBL Challenges were much clearer in articulating the problem-solving process they engaged in, and specifically identified each of the four phases prescribed in the Problem-Solver’s Toolbox. One sample response was:

“Well…the first thing you would have to do is list all the things you know and then list all the things you need to know…and then go research. From there you can take what you’ve learned and brainstorm with your team…come up with different ideas…and rate them on a scale from 1 to 10 or 1 to 5 or however many steps you have… Uhhmm…from there you would decide on which ones would work out the best… and then test out each one of your solutions to see which one works the best for your particular situation.”

In contrast, students who had completed just two Challenges appeared to be more concerned with the surface features of the problem and with finding an immediate solution, rather than taking the time to fully understand and properly frame the problem as prescribed in the Problem-Solver’s Toolbox. While these students did describe certain elements of the problem-solving process, they applied more of a “shotgun” approach, characteristic of novice problem solvers. A sample response is provided below:

“I’d have to do some research because I don’t know how high planes fly over Manhattan …uhhmm…We’d have to make sure it’s safe, …pause…birds…I’d have to check what birds are in the area, where they are…uhhmm.., and…would this be possible? It could be…I can’t really answer that question because I don’t know enough about certain things in that situation…but… uhhmm…You’d have to check what the dangers are with the…uhhmm… I’d definitely have to figure out what are we transmitting… How far away? Two Miles? Would that be cost effective? I was thinking if you enclosed it but that would block air space… You’d have to figure out some way of testing it out before you put it in place.”

Interview Question 2: Did your experience with the Photon PBL Challenges help you in your ability to solve this problem? If so, how?

Students’ responses were unanimously positive regardless of the amount of experience with the PBL Challenges. Overall, students felt that the PBL Challenges provided them with a systematic method for solving real-world problems that taught them how to break the task down into smaller, more manageable steps. In addition, three of the six students commented on the value of being able to work collaboratively in a way that provided critical feedback against which to gauge their own understanding and capitalized on the collective knowledge of the team in developing a problem solution. Sample responses included:

  • “Yes – It definitely helps you to stay on target with what you need to do and… I would have looked at that and said “where do I go with this?” instead of saying “OK – research problem issues – safety, all those things you need to think about and how to tie them together”

  • “Yes…because I filled out enough whiteboards to know…ha-ha…The way you have to break the problem down into such basic steps…uhhmm…Lets you break down any problem into basic steps…Really lets you look at a real-world scenario…and just break it down into its most basic form so that you know what you need to know and how it can be done…”

  • “Yes- definitely. It helped us organize what we did know and didn’t know and kept us on track…and then…uhhmm…it allows you to collect ideas together which normally you might have one or two….but this lets you put many together…and decide which one is the best…I mean…you might be able to sit there and hear somebody else’s ideas and make it better.”

Interview Question 3: How would you have solved this problem prior to your experience with the Photon PBL Challenges? Explain.

Each of the six students reported that prior to their experience with the PBL Challenges, they would have had a much more difficult time solving the problem, in particular not knowing where to begin. Students indicated that they would have immediately attempted a solution without first analyzing the problem to identify and understand important features and parameters and without developing a plan or strategy for attacking the problem. Student responses were consistent with the research literature on how “poor problem solvers” approach problem situations3,4. Sample responses include:

  • “I would have looked at it and went (Throw hands up)…I have no idea.”

  • “I have no idea…but after we did the challenges it helped a lot…helped a great deal. Before that I wouldn’t really know where to begin…I’d probably start with like, this is my idea and then go check it…instead of actually going through your knowledge and going through the steps…I would have probably wasted a lot of time.”

  • “No…I probably would have run in circles for quite some time…Asking myself a lot of questions but not really writing them down…Uhhmm…taking those questions as they popped into my head and then research, …but then more questions would come in as I’m researching…And then they’d get thrown out as I’m doing the research…so…uhhmm…there’d be a lot more confusion I’d have to say”

4.2.4

Motivation

Motivation, self-efficacy, and metacognitive ability were assessed using selected subscales of the MSLQ. Cronbach’s alpha values for the subscales are reported as intrinsic motivation (4 items; α=.74), extrinsic motivation (6 items; α=.62), self-efficacy (8 items; α=.93), and metacognitive self-regulation (12 items; α=.79)21. Mean values were computed for each variable and data were screened for outliers and normality. Paired t-tests were conducted to measure changes in mean scores for each variable. Additional information regarding students’ self-efficacy, motivation, and metacognitive ability was acquired through thematic analysis of student interview transcripts.
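A minimal sketch of the core analysis: a paired t-test on pre/post subscale means plus Cohen’s d. The raw scores below are invented (the paper does not publish them), and the exact d formula the authors used is not stated; the mean difference divided by the standard deviation of the differences is shown as one common choice for paired samples.

import numpy as np
from scipy import stats

pre = np.array([4.1, 3.8, 4.5, 3.9, 4.2, 4.0, 3.7, 4.4])   # hypothetical pre scores
post = np.array([4.6, 4.2, 4.9, 4.5, 4.4, 4.7, 4.1, 4.8])  # hypothetical post scores

t_stat, p_value = stats.ttest_rel(post, pre)  # paired (dependent samples) t-test
diff = post - pre
cohens_d = diff.mean() / diff.std(ddof=1)     # d for paired samples

print(f"t = {t_stat:.2f}, p = {p_value:.3f}, d = {cohens_d:.2f}")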

Results of paired t-tests performed on the MSLQ motivation subscale data (n = 16) showed a statistically significant increase (t = 4.09, p = .001; Cohen’s d = 2.11) for intrinsic motivation, representing a medium to large effect size. Results for extrinsic motivation and task value were not significant. Additional analyses were conducted to examine whether the number of PBL Challenges completed by students had an effect on motivation. Of the 21 students who completed the pre-post MSLQ survey, 12 had completed two PBL Challenges and 9 had completed four or more PBL Challenges. Data were screened for normality and two outliers were removed. Results of paired t-tests for students completing two PBL Challenges (n=10) showed a statistically significant increase for intrinsic motivation (t = 3.58, p = .006; Cohen’s d = 2.39), representing a medium to large effect size. While not statistically significant, a medium effect size (Cohen’s d = 1.28) was also found for extrinsic motivation. Results showed a slight increase for task value, but no statistically significant difference and a small effect size (Cohen’s d = .21). Results of paired t-tests for students who had completed four or more PBL Challenges (n=9) showed a statistically significant increase for intrinsic motivation (t = 2.866, p = .021; Cohen’s d = 2.03), representing a medium to large effect size, but a statistically significant decrease in extrinsic motivation (t = 2.344, p = .047; Cohen’s d = 1.66), representing a medium effect size. Results for task value showed no significant difference and a small positive effect size.

Paired t-test results were corroborated through analysis of student interview data, which showed that overall, students were intrinsically motivated to learn by the real-world problems posed by the PBL Challenges and by the opportunity for collaborative learning. Supporting comments included:

  • “Yes, I found them (PBL Challenges) really interesting…once we were doing the challenges it was pretty cool because you got to see how it works out…it really is interesting to see what processes it really takes to solve a problem in the real world…it helps you to think more accurately.”

  • “Yes…whenever the whole group gets together… to share what they have learned in their research… when you get together to concept map it you learn an extra large amount and you all get excited about it…you’re all excited to share what you’ve all researched… the motivation for working together… definitely.”

These results show that overall, engagement with the PBL Challenges improved students’ intrinsic motivation, which research has shown to promote deep, high-quality learning. Moreover, the results showed that increased experience with the PBL Challenges resulted in not only an increase in intrinsic motivation but also a decrease in extrinsic motivation. This result suggests an internalization of external motivation: a shift from a more goal-oriented approach to learning toward a mastery orientation. Research has shown that compared to goal-oriented learners, learners with a mastery orientation are more flexible in their learning approach, are more likely to use metacognitive strategies, and are more likely to persist in the face of difficulty, all positive attributes of good problem solvers19.

4.2.5

Self-Efficacy

Results of paired t-tests on the self-efficacy subscale (n=16) of the MSLQ showed a statistically significant increase (t = 2.81, p = .013; Cohen’s d = 1.45) with a medium effect size. Additional analyses were conducted to examine whether the number of PBL Challenges completed by students had an effect on students’ self-efficacy. Results of paired t-tests for students completing two PBL Challenges (n=10) were not statistically significant, but did show a moderate increase in self-efficacy (Cohen’s d = 0.65), representing a medium effect size. For students completing four or more PBL Challenges (n=9), however, results were significant (t = 3.04, p = .016; Cohen’s d = 2.15), representing a medium to large effect size.

Paired t-test results were corroborated through analysis of student interview data in which students were asked, “Do you believe your experience with the PBL Challenges has made you more confident in your ability to solve real-world problems?” Sample responses include:

  • “At first I was a little nervous about it because it was like… the first time, and I didn’t know how to explain or what I thought, but then afterwards, slowly, as we continued more challenges, I had more confidence to say, OK, I think this or this would work…”

  • “Yes, because there were real-world issues there…the whole breaking down of the problem…what needs to be done…there’s this I need to do…it definitely helped because I know how to problem solve now.”

These results showed that overall, students were more confident in their ability to solve real-world problems as a result of completing the PBL Challenges because of the specific skills developed in learning how to approach a problem. Moreover, student self-efficacy improved significantly with more experience with the PBL Challenges.

4.2.6

Metacognitive Ability

Results of paired t-tests using the metacognitive self-regulation subscale (n=16) of the MSLQ showed a statistically significant increase (t = 3.45, p = .004; Cohen’s d = 1.77), representing a medium effect size. Additional analyses were conducted to examine whether the number of PBL Challenges completed by students had an effect on students’ metacognitive self-regulation. Results of paired t-tests for students completing two PBL Challenges (n=10) were not statistically significant, but did show a modest increase in metacognitive self-regulation (Cohen’s d = 0.55), representing a small to medium effect size. For students completing four or more PBL Challenges (n=9), however, results were significant (t = 4.95, p = .001; Cohen’s d = 3.50), representing a large effect size. These results were corroborated through analysis of student interview data, which showed that students who had completed four or more PBL Challenges were more articulate in describing the process by which they would solve a problem. This included reflecting on their current understanding of the problem and its parameters, identifying knowledge gaps, and articulating the need to plan a strategy for implementing and testing their solution – all key attributes of metacognitive ability.

These results suggest that overall, students’ metacognitive ability improved as a result of completing the PBL Challenges. As in the case of self-efficacy, metacognitive ability improved significantly with more experience with the PBL Challenges, suggesting an internalization of the problem solving process.

5.

CONCLUSION

In this paper, we presented the results of a pilot test conducted to evaluate the efficacy of the PHOTON PBL Challenges and associated assessment strategies in photonics technology education. The study included 21 photonics technology students and four photonics technology instructors from four community colleges. In the study, we examined how and to what extent engagement with the PHOTON PBL Challenges affected students’ problem-solving skills, motivation, self-efficacy, and metacognitive ability. Students in the pilot study completed at least two PBL Challenges over the course of the fall 2008 and/or spring 2009 semesters, and a pre- and post online survey (MSLQ) at the beginning and end of each semester. Samples of student work (concept maps, Whiteboards, and Final Challenge Reports) completed for each PBL Challenge were obtained from the instructors and analyzed. Six student volunteers participated in semi-structured interviews to provide additional information regarding their experience with the PBL Challenges.

Results of the pilot test revealed that with increased experience with the PBL Challenges, students’ conceptual knowledge and problem-solving ability improved markedly. While pre-post measures of student content knowledge were not available for the study, instructor observations and aggregate comparisons of PBL students’ performance on traditional measures (homework, quizzes, and exams) with that of past non-PBL students showed that PBL students performed at least as well as non-PBL students. Results also revealed statistically significant increases in intrinsic motivation, self-efficacy, and metacognitive self-regulation – the latter a key factor linked to students’ ability to transfer knowledge and skills to new situations. Of particular interest was a decrease in extrinsic motivation with increased experience with the PBL Challenges, suggesting an internalization of external motivation: a shift from a more goal-oriented approach to learning toward a mastery orientation. While the results are encouraging, given the small sample size, self-report instruments, lack of a control group, possible bias, and other threats to internal and external validity, generalizability is limited to the sample within the study. Future studies should include a larger sample size and an experimental or quasi-experimental design to improve internal validity and generalizability.

6.

ACKNOWLEDGEMENTS

Funded in part by the Advanced Technological Education program of the National Science Foundation (ATE #0603143). Principal Investigator: Fenna Hanes (Project Manager), New England Board of Higher Education. Co-Principal Investigators: Judith Donnelly, Three Rivers Community College; Nicholas Massa, Springfield Technical Community College; Richard Audet, Roger Williams University. Website: http://www.photonprojects.org.

7.

REFERENCES

[1] Johnson, K., Herr, T., and Kysh, J., “Crossing the River with Dogs,” Key College Publishing, California (2004).

[2] Schoenfeld, A. H., “Learning to think mathematically: problem solving, metacognition, and sense making in mathematics,” in Handbook of Research on Mathematics Teaching and Learning: A Project of the National Council of Teachers of Mathematics, MacMillan, New York (1992).

[3] Krulik, S. and Rudnick, J. A., “Problem Solving: A Handbook for Teachers,” Allyn and Bacon, Boston (1980).

[4] Polya, G., “How to Solve It: A New Aspect of Mathematical Method,” 2nd ed., Princeton University Press, Princeton, NJ (1957).

[5] Lochhead, J. and Whimbey, A., “Problem Solving and Comprehension,” The Franklin Institute Press, Philadelphia (1980).

[6] Savery, J. R. and Duffy, T. M., “Problem based learning: An instructional model and its constructivist framework,” in Constructivist Learning Environments: Case Studies in Instructional Design, Educational Technology Publications, Englewood Cliffs, NJ (1996).

[7] Barrows, H. S., “A taxonomy of problem-based learning methods,” Medical Education 20, 481–486 (1986).

[8] Massa, N. M., Dischino, M., Donnelly, J., and Hanes, F., “Problem-Based Learning in Photonics Technology Education,” in International Society for Optical Engineering (SPIE) Annual Conference (2008).

[9] Hmelo-Silver, C. E., “Problem-Based Learning: What and How Do Students Learn?,” Educational Psychology Review 16(3) (2004).

[10] McKenna, A., Walsh, J., Parsek, M., and Birol, G., “Assessing Challenge Based Instruction in Biomedical Engineering,” in Proceedings of the American Society for Engineering Education (CD-ROM), DEStech Publications.

[11] Zubaidah, S., “Problem-Based Learning: Literature Review,” Singapore Nursing Journal 32(4), 50–54 (2005).

[12] Pandy, M. G., Petrosino, A. J., Austin, B., and Barr, R., “Assessing Adaptive Expertise in Undergraduate Biomechanics,” Journal of Engineering Education 93, 211–222 (2004).

[13] Bransford, J. D., Brown, A. L., and Cocking, R. R., eds., “How People Learn,” National Academy Press, Washington, DC (1999).

[14] Vanderbilt-Northwestern-Texas-Harvard/MIT (VaNTH) Engineering Research Center, http://www.vanth.org/index.htm

[15] Novak, J. D. and Cañas, A. J., “The Theory Underlying Concept Maps and How to Construct Them,” (2006). http://cmap.ihmc.us/Publications/ResearchPapers/TheoryCmaps/TheoryUnderlyingConceptMaps.htm

[16] Schraw, G., “On the development of metacognition,” in Adult Learning and Development: Perspectives from Educational Psychology, 89–106, Erlbaum, Mahwah, NJ (1998).

[17] Zimmerman, B. J., “Dimensions of academic self-regulation: A conceptual framework for education,” in Self-Regulation of Learning and Performance: Issues and Educational Applications, 3–24, Lawrence Erlbaum, Hillsdale, NJ (1994).

[18] Dominowski, R. L., “Verbalization and problem solving,” in Metacognition in Educational Theory and Practice, 25–35, Erlbaum, Mahwah, NJ (1998).

[19] Deci, E. L. and Ryan, R. M., “Intrinsic Motivation and Self-Determination in Human Behavior,” Plenum, New York (1985). https://doi.org/10.1007/978-1-4899-2271-7

[20] Bandura, A., “Self-Efficacy: The Exercise of Control,” Freeman, New York (1997).

[21] Pintrich, P., Smith, D., Garcia, T., and McKeachie, W., “A Manual for the Use of the Motivated Strategies for Learning Questionnaire (MSLQ),” (1991).

[22] Rossman, G. and Rallis, S., “Learning in the Field: An Introduction to Qualitative Research,” Sage, Thousand Oaks, CA (2003).

[23] Borg, W. R. and Gall, M. D., “Educational Research: An Introduction,” 5th ed., Longman, New York (1989).

[24] Creswell, J. W., “Research Design: Qualitative, Quantitative and Mixed Methods Approaches,” Sage, Thousand Oaks, CA (2003).