iGEM Judging Rubric
Judging is a complex task and can seem mysterious to iGEMers at times. We aim to help teams understand how they are evaluated and to provide more information ahead of time. While the individual decisions judges make about teams must remain confidential until after the Jamboree, the systems they use do not.
The main mechanism through which iGEM teams are evaluated is called the rubric. Scoring rubrics are a common tool for ensuring consistent evaluation. The rubric questions are posted publicly so both teams and judges can see them, which keeps expectations clear to everyone. The rubric is composed of three main sections:
- Medals Section
- Project Section
- Special Prizes Section
Each section is called a category. Within each category, there are questions that we call aspects (shown below). Each aspect has 6 language choices that cover the range of quality the evaluating judge might see in the work. Unlike the aspects, these language choices will not be shown. We want iGEMers to know how they are being evaluated, but we don't want to "teach to the test".
The language choices correspond roughly to the following (see the illustrative sketch after this list):
- Amazing!
- Great
- Good
- Present
- Bad
- Absent
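As a purely illustrative sketch (not how iGEM actually scores teams), the structure described above, categories containing aspects that are each rated on a six-point language scale, could be represented and aggregated as below. iGEM does not publish a numeric mapping for the language choices, so the scores and the simple averaging in `category_score` are assumptions for illustration only.

```python
# Hypothetical mapping from a judge's language choice to a numeric score
# (best to worst). The numbers are illustrative, not iGEM's actual scheme.
LANGUAGE_CHOICES = {
    "Amazing!": 5,
    "Great": 4,
    "Good": 3,
    "Present": 2,
    "Bad": 1,
    "Absent": 0,
}

def aspect_score(choice: str) -> int:
    """Return the hypothetical numeric score for one aspect's language choice."""
    return LANGUAGE_CHOICES[choice]

def category_score(choices: list[str]) -> float:
    """Average the aspect scores within one category (e.g. the Project Section)."""
    return sum(aspect_score(c) for c in choices) / len(choices)

# Example: one judge's hypothetical ratings for the ten Project aspects.
project_ratings = ["Great", "Good", "Good", "Present", "Great",
                   "Good", "Great", "Good", "Present", "Good"]
print(f"Project Section average: {category_score(project_ratings):.2f}")
```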
Medals Section
In the Medals section, judges evaluate whether a team's work has convinced them that the team achieved specific medal criteria.
These criteria can be found on the Medals Page and won't be reiterated here.
Project Section
The Project Section has ten aspects. The scores from these aspects determine which teams win their Track Awards, and they also determine the Finalist teams for the Grand Prize in each team section (undergraduate, overgraduate, and high school).
This category is arguably the most important part of the evaluation for an iGEM team.
Special Prizes Section
The final section of the judging rubric determines Special Prizes. This part of the evaluation integrates with the Pages for Awards system.
To be eligible for a Special Prize, teams need to complete the corresponding page on the wiki and fill out a 150-word description on the Judging Form.
This rubric is the result of more than five years of development, hundreds of hours of discussion, dozens and dozens of meetings, and thousands of emails between some of the most experienced advisers in iGEM. We are continuously improving and tweaking the rubric, but the system we have is extremely effective at selecting the winning teams that best represent the values of iGEM.
Project Section

Number | Category | Aspects |
---|---|---|
1 | Project | How much did the team accomplish (addressed a real world problem, produced BioBricks, carried out Human Practices, created a wiki, presentation, etc.)? |
2 | Project | How impressive is this project? |
3 | Project | Did the project work or is it likely to work? |
4 | Project | Is the project likely to have an impact? |
5 | Project | How well were engineering principles used (e.g., design-build-test cycle, use of standards, modularity, etc.)? |
6 | Project | How thoughtful and thorough was the team's consideration of human practices? |
7 | Project | How much of the work did the team do themselves and how much was done by others? |
8 | Project | Did the team design a project based on synthetic biology and standard components (BioBricks, software, etc.)? |
9 | Project | Are the project components well documented on the team's wiki/Registry pages (parts should be documented in the Registry)? |
10 | Project | How competent were the team members at answering questions? |

Special Prizes Section

Number | Category | Aspects |
---|---|---|
1 | Education | How well did their work promote mutual learning and/or a dialogue? |
2 | Education | Is it documented in a way that others can build upon? |
3 | Education | Was it thoughtfully implemented? |
4 | Education | Did the team convince you that their activities would enable more people to shape, contribute to, and/or participate in synthetic biology? |
1 | Hardware | Does the hardware address a need or problem in synthetic biology? |
2 | Hardware | Did the team conduct user testing and learn from user feedback? |
3 | Hardware | Did the team demonstrate utility and functionality in their hardware proof of concept? |
4 | Hardware | Is the documentation of the hardware system sufficient to enable reproduction by other teams? |
1 | Inclusivity Award | How well did the work investigate barriers to participation in synthetic biology and/or science more broadly? |
2 | Inclusivity Award | How well did the work expand access to synthetic biology and/or science more broadly? |
3 | Inclusivity Award | Was the work thoughtful about inclusivity and local public values in its implementation? |
4 | Inclusivity Award | Is the work documented in a way that other teams or external entities can build upon? |
1 | Integrated Human Practices | How well was their Human Practices work integrated throughout the project? |
2 | Integrated Human Practices | How inspiring an example is it to others? |
3 | Integrated Human Practices | To what extent is the Human Practices work documented so that others can build upon it? |
4 | Integrated Human Practices | How thoughtfully was it implemented? How well did they explain the context, rationale, and prior work? |
5 | Integrated Human Practices | How well did it incorporate different stakeholder views? |
6 | Integrated Human Practices | To what extent did they convince you that their Human Practices activities helped create a project that is responsible and good for the world? |
1 | Measurement | Could the measurement(s) be repeated by other iGEM teams? |
2 | Measurement | Is the protocol well described? |
3 | Measurement | Is it useful to other projects? |
4 | Measurement | Did the team appropriately use controls to validate the measurement process and calibrate units? |
1 | Model | How impressive is the modeling? |
2 | Model | Did the model help the team understand a part, device, or system? |
3 | Model | Did the team use measurements of a part, device, or system to develop the model? |
4 | Model | Does the modeling approach provide a good example for others? |
1 | New Basic Part | How does the documentation compare to BBa_K863006 and BBa_K863001? |
2 | New Basic Part | How new/innovative is it? |
3 | New Basic Part | Did the team show the part works as expected (modeling data can be acceptable)? |
4 | New Basic Part | Is it useful to the community? |
5 | New Basic Part | How well characterized (experimentally measured or modeled) is this Basic Part when tested in a device? |
1 | New Composite Part | How does the documentation compare to BBa_K404122 and BBa_K863005? |
2 | New Composite Part | How new/innovative is it? |
3 | New Composite Part | Did the team show the part works as expected (modeling data can be acceptable)? |
4 | New Composite Part | Is it useful to the community? |
5 | New Composite Part | How well characterized (experimentally measured or modeled) is this Composite Part? |
1 | Part Collection | Is this collection a coherent group of parts meant to be used as a collection, or just a list of all the parts the team made? |
2 | Part Collection | How does the documentation compare to the BBa_K747000-095 collection? |
3 | Part Collection | Is the collection fully documented on the Registry so any user could use the parts correctly? |
4 | Part Collection | Did the team finish building a functional system using this collection? |
5 | Part Collection | Is it useful to the community? |
1 | Plant Synthetic Biology | How successful was the team in engineering a plant or algal cell? |
2 | Plant Synthetic Biology | Does their work address a need or problem in plant synthetic biology? |
3 | Plant Synthetic Biology | How well did the team use the special attributes of the plant chassis? |
4 | Plant Synthetic Biology | Are the parts/tools/protocols for plants made during this project useful to other teams? |
1 | Presentation | How well does the presentation communicate the team's project and their goals? |
2 | Presentation | Do the presentation design elements effectively communicate the technical content? |
3 | Presentation | Did you find the presentation engaging? |
4 | Presentation | Were reference material and data acknowledged appropriately? |
1 | Software Tool | How well is the software using and supporting existing synthetic biology standards and platforms? |
2 | Software Tool | Was this software validated by experimental work? |
3 | Software Tool | Is it useful to other projects? |
4 | Software Tool | Does the team demonstrate that their software can interface with and be embedded in new workflows? |
5 | Software Tool | Is the software user-friendly and well documented? |
1 | Supporting Entrepreneurship | Has the team discovered their first potential customers and identified any unmet needs not yet covered by other existing solutions? |
2 | Supporting Entrepreneurship | Has the team shown that their solution is possible, scalable, and inventive? |
3 | Supporting Entrepreneurship | Has the team presented logical product development plans with realistic milestones, timelines, resources, and risks? |
4 | Supporting Entrepreneurship | Has the team outlined the skills, capabilities, and stakeholders required to be credible in developing their solution further? |
5 | Supporting Entrepreneurship | Has the team considered the positive and negative long-term impacts of their fully developed solution? |
1 | Sustainable Development Impact | Did the team incorporate feedback from relevant SDG stakeholders into their work? |
2 | Sustainable Development Impact | Did the team address potential long-term social, environmental, and economic impacts of their work in the context of the SDG(s) they have chosen? |
3 | Sustainable Development Impact | How well has the team considered the positive and/or negative interactions of their work with other SDGs? |
4 | Sustainable Development Impact | Has the team documented their work against their chosen SDG(s) so that other teams can build upon their work? |
5 | Sustainable Development Impact | Has their work measurably and significantly addressed one or more SDGs? |
1 | Wiki | How well does the wiki communicate the team's project and their goals? |
2 | Wiki | Did the team clearly document their project and support their results with convincing evidence? |
3 | Wiki | Is the wiki well designed, functional, and easy to navigate? |
4 | Wiki | Will the wiki be a compelling record of the team's project for future teams? |