Evaluating Assessment Practices in Team-Based Computing Capstone Projects

dc.contributor.author: Hooshangi, Sara
dc.contributor.author: Shakil, Asma
dc.contributor.author: Riddle, Steve
dc.contributor.author: Aydin, Ilknur
dc.contributor.author: Nasir, Nayla
dc.contributor.author: Parupudi, Tejasvi
dc.contributor.author: Rehman, Attiqa
dc.contributor.author: Scott, Michael James
dc.contributor.author: Vahrenhold, Jan
dc.contributor.author: Weerasinghe, Amali
dc.contributor.author: Wu, Xi
dc.date.accessioned: 2026-03-03T14:02:49Z
dc.date.available: 2026-03-03T14:02:49Z
dc.date.issued: 2025-06-27
dc.date.updated: 2026-03-01T08:45:28Z
dc.description.abstract: Team-based capstone projects are vital in preparing computer science students for real-world work by developing teamwork, communication, and industry-relevant technical skills. Their assessment, however, is challenging, requiring alignment between academic criteria and external stakeholder expectations, fair evaluation of individual contributions, recognition of diverse skills, and clarity on external partners' involvement in the evaluation process. The high stakes of these projects further demand transparent and equitable assessment methods that are perceived as fair by all involved. Our working group (WG) addresses the challenges of capstone project assessment by examining the perspectives of instructors, students, and external stakeholders to support fair and effective evaluation. Building on insights from our previous WG and a comprehensive review of the literature, we used a mixed-methods approach combining online surveys (quantitative) and in-depth interviews (qualitative) with instructors, students, and external stakeholders. In total, we collected 66 survey responses and conducted 30 interviews across multiple countries and institutions, capturing a diverse range of global perspectives on capstone course assessments. Insights from instructors and students revealed several commonalities, for example, in the types of assessed components and the challenges of identifying and addressing non-contributing group members. Our findings also revealed clear variation between instructor and student perspectives on how contributions are measured and weighted. Instructors were reluctant to rely heavily on peer or self-evaluation due to concerns about reliability, preferring scaffolded assessments and early-warning systems to gather contribution data and moderate team dynamics. They viewed contribution-based grading as positive but resource-intensive. Students, in contrast, emphasized the need for more transparency, formative feedback, and accurate recognition of individual contributions. They also expressed concerns about the lack of recognition for hidden labor (e.g., project management, team coordination), assessor inconsistency, and a reluctance to critique peers. Instructors treated peer input as supplementary evidence, whereas students perceived it as high-stakes and socially risky. Stakeholder involvement in assessment was generally limited to providing formative feedback and participating in final showcase events. We also identified generative AI as a rapidly evolving challenge, with both students and instructors seeking guidance on acceptable use and exploring opportunities to automate aspects of assessment. Our results offer actionable, evidence-based guidance for designing transparent and equitable assessment practices in team-based computing capstones.
dc.description.version: Published version
dc.format.mimetype: application/pdf
dc.identifier.doi: https://doi.org/10.1145/3760545.3783974
dc.identifier.uri: https://hdl.handle.net/10919/141645
dc.language.iso: en
dc.publisher: ACM
dc.rights: Creative Commons Attribution 4.0 International
dc.rights.holder: The author(s)
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.title: Evaluating Assessment Practices in Team-Based Computing Capstone Projects
dc.type: Article - Refereed
dc.type.dcmitype: Text

Files

Original bundle
Name: 3760545.3783974.pdf
Size: 1.57 MB
Format: Adobe Portable Document Format
Description: Published version

License bundle
Name: license.txt
Size: 1.5 KB
Description: Item-specific license agreed upon at submission