
Preventing Systems Engineering Failures with Crowdsourcing: Instructor Recommendations and Student Feedback in Project-Based Learning

Most engineering curricula in the United States include some form of major design project experience for students, such as capstone courses or design-build-fly projects. Such courses are examples of project-based learning (PBL). Part of PBL is to prepare students (and thus future engineers) to deal with and prevent common project failures such as missed requirements, overspending, and schedule delays. <i>But how well are students performing once they join the workforce?</i> Unfortunately, despite our best efforts to prepare future engineers, the frequency of failures in complex projects shows no signs of decreasing. As reported by the Project Management Institute, in 2020 only 53% of projects finished on time, 59% stayed within budget, and 69% met their goals. If we want to improve success rates in industry projects, a viable starting point may be helping students get the most out of their PBL experience so that they are better prepared to deal with project failures before they join the workforce. <br><br>The overarching goal of this dissertation is to identify and suggest improvements to areas where PBL falls short in preparing students for failure, to investigate student behaviors that lead to project failures, and to improve these behaviors by providing helpful feedback to students. <br><br>To investigate the actions and behaviors that lead to events that cause failures in student projects, I introduced “crowd signals”: data collected directly from the students who are part of a project team. In total, I developed 49 survey questions that collect these crowd signals. For the first part of the dissertation, I conducted an experiment with 28 student teams and their instructors in two aerospace engineering PBL courses at Purdue University. The student teams were working on aircraft designs or low-gravity experiments.<br><br><i>Does PBL provide sufficient opportunities for students to fail safely and learn from the experience?
How can we improve?</i> To identify areas where PBL may fall short, I compared the occurrence rates of failure causes in industry with similar rates from student teams in PBL courses, and then provided recommendations to PBL instructors. Failure causes, identified from the literature, are events that frequently preceded budget, schedule, or requirements failures in industry. Through this analysis, I found that PBL does not sufficiently prepare students for situations where the failure cause <i>missing a design aspect</i> occurs. This failure cause is fundamentally linked to proper systems engineering: it represents a scenario where, for example, students failed to consider an important requirement during system development, or did not detect a design flaw or a component incompatibility. I provided four recommendations to instructors who want to give their students more opportunities to learn from this failure cause, so they are better prepared to tackle it as engineers. <br><br><i>Is crowdsourced information from project team members a good indicator of future failure occurrences in student projects?</i> I developed models that predict the occurrence of future budget, schedule, or requirements failures, using crowd signals and other information as inputs, and interpreted those models to gain insight into which student actions are likely to lead to project failures. The final models correctly predict, on average, 73.11±6.92% of budget outcomes, 75.27±9.21% of schedule outcomes, and 76.71±6.90% of technical requirements outcomes. The previous status of the project is the only input variable that appeared to be important in all three final predictive models. Overall, crowdsourced information is a useful source of knowledge for assessing the likelihood of future failures in student projects.
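The prediction approach described above can be illustrated with a minimal sketch. The feature names, the synthetic data, and the choice of a plain logistic-regression classifier are assumptions for illustration only; the dissertation's actual models, inputs, and data are not reproduced here:

```python
import math

def train_logistic(X, y, lr=0.1, epochs=2000):
    """Plain-Python logistic regression trained by stochastic gradient descent."""
    w = [0.0] * (len(X[0]) + 1)              # bias + one weight per feature
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))   # predicted failure probability
            err = p - yi
            w[0] -= lr * err
            for j, xj in enumerate(xi):
                w[j + 1] -= lr * err * xj
    return w

def predict(w, xi):
    """1 = a failure is predicted for the next reporting period."""
    return 1 if w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)) > 0 else 0

# Toy synthetic crowd signals per team and reporting period (hypothetical):
# [previous_status (1 = already failing), team confidence (0-1),
#  fraction of tasks behind schedule (0-1)].
X = [[1, 0.2, 0.5], [1, 0.4, 0.3], [0, 0.9, 0.0], [0, 0.8, 0.1],
     [1, 0.3, 0.4], [0, 0.7, 0.0], [1, 0.5, 0.2], [0, 0.6, 0.1]]
y = [1, 1, 0, 0, 1, 0, 1, 0]                 # 1 = schedule failure occurred

w = train_logistic(X, y)
accuracy = sum(predict(w, xi) == yi for xi, yi in zip(X, y)) / len(X)
```

In this toy data the previous project status dominates the outcome, so the learned weight on that feature comes out large and positive, echoing the finding that previous status was the one consistently important input; a real analysis would of course evaluate on held-out teams rather than the training set.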
<br><br><i>Does targeted feedback that addresses the failure causes help reduce failures in student projects?</i> To improve student behaviors that lead to project failures, I used correlations between failure measures and the crowd signals as a guide to generate 35 feedback statements. To evaluate whether the feedback statements help reduce project failures in the student teams, I conducted a second experiment at Purdue University with 14 student teams and their instructors. The student teams were enrolled in aircraft design, satellite design, or propulsion design-build-test (DBT) courses. The teams were split into two treatment groups: teams that received targeted feedback (i.e., feedback aimed at the failure causes the specific team is most prone to) and teams that received non-targeted feedback (i.e., feedback that is positive but does not necessarily address those failure causes). Through my analysis, I found that my targeted feedback did not reduce failure occurrences on any metric, compared to the non-targeted feedback. However, qualitative evaluations indicated that student teams who received targeted feedback made more changes to their behaviors and found the feedback more helpful than teams who received non-targeted feedback.
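The targeting step can be sketched as ranking the failure causes a team appears most prone to and selecting the matching feedback statements. The cause names, risk scores, and feedback wording below are hypothetical stand-ins, not the dissertation's actual 35 statements or scoring method:

```python
# Hypothetical mapping from failure cause to a feedback statement.
FEEDBACK = {
    "missing_design_aspect": "Walk through the requirements as a team and "
                             "check every design decision against them.",
    "schedule_slip": "Re-baseline the schedule and flag at-risk tasks early.",
    "overspending": "Review the budget against actual spending at each meeting.",
}

def targeted_feedback(signal_scores, top_n=2):
    """Return feedback statements for the top-N failure causes, highest risk first."""
    ranked = sorted(signal_scores, key=signal_scores.get, reverse=True)
    return [FEEDBACK[cause] for cause in ranked[:top_n]]

# Illustrative per-team risk scores derived from crowd-signal responses.
team_scores = {"missing_design_aspect": 0.8,
               "schedule_slip": 0.3,
               "overspending": 0.6}
messages = targeted_feedback(team_scores)
```

Here the team's two highest-risk causes (missing a design aspect, then overspending) are selected, mirroring the idea that targeted feedback addresses the causes a specific team is most prone to, while non-targeted feedback would draw from the same pool without this ranking.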

DOI: 10.25394/pgs.14829423.v1
Identifier: oai:union.ndltd.org:purdue.edu/oai:figshare.com:article/14829423
Date: 23 July 2021
Creators: Georgios Georgalis (11013966)
Source Sets: Purdue University
Detected Language: English
Type: Text, Thesis
Rights: CC BY 4.0
Relation: https://figshare.com/articles/thesis/Preventing_Systems_Engineering_Failures_with_Crowdsourcing_Instructor_Recommendations_and_Student_Feedback_in_Project-Based_Learning/14829423
