The University of Texas at Austin Invitational 2020
-
- Member
- Posts: 34
- Joined: Fri Dec 14, 2018 9:14 am
- Division: Grad
- State: FL
- Has thanked: 0
- Been thanked: 12 times
Re: The University of Texas at Austin Invitational 2020
Hello all! I was the supervisor, along with Shayren Patel, for Water Quality B/C. I know there were some technical difficulties, so I want to congratulate each team for their perseverance and hard work. I hope each and every one of you enjoyed your testing experience and I wish you all good luck in future competitions as well as a safe and healthy rest of the year.
Now, to the part you all want to know:
Common Errors
Teams neglected to read the instructions for the questions and missed out on a lot of points as a result.
One of my favorite recurring themes was seeing teams discuss aragonite levels in water in a particular question, mentioning a specific value, and then, fewer than 10 questions later, get that very value wrong when a question asked for it. UT Austin did not penalize for leaving your browser, but please try to keep the integrity of the tournament.
Results
Teams did very well answering the basic questions and short answers but fell short on a lot of the longer questions. Teams also struggled with multiple-answer questions. Overall, the emphasis on harder questions and fewer multiple choice/select than my normal exams made our score distribution lower than normal, but my goal of having the top teams separated from the pack did occur. Each team, especially Beckendorff A and Bayard Rustin Blue, our Division B and C Champions, respectively, should be proud of the hard work they've done. I'm excited to see each and every competitor succeed in future tournaments and I look forward to seeing some of the same teams at other tournaments this year!
- These users thanked the author will926ok3645 for the post (total 2):
- Krish2007 (Mon Oct 26, 2020 2:39 pm) • Adi1008 (Mon Oct 26, 2020 5:42 pm)
-
- Member
- Posts: 47
- Joined: Sat Feb 17, 2018 7:19 am
- Division: Grad
- State: TX
- Pronouns: He/Him/His
- Has thanked: 17 times
- Been thanked: 45 times
Re: The University of Texas at Austin Invitational 2020
Codebusters B/C
First off, I would like to congratulate every single team that competed in the UT Invitational last weekend for their hard work! Although there were some bumps along the way, overall the tournament was a big success, especially since it was many teams' first time competing in a virtual format. I wrote the Codebusters B/C test along with Sid, who is also a member of ATX Scioly.
Division C
Scoring Distribution
The 8th-10th place teams definitely showed that they were good at cracking ciphers and deserved their medals, but even they couldn't get close to the top 2 teams, with a whopping 1,000-point difference between 2nd and 3rd and a mere 3 points between 2nd and 1st. Congratulations to Syosset High School for prevailing over Clements High School to become the event champion at this tournament. What's ridiculous is that Clements would have won if they hadn't gotten a partial score on one of the xenos, but that's part of Codebusters!
We decided to make the test longer than a usual Codebusters test because, when we were writing it, it was not possible to have a timed question on Scilympiad (but now it is!). We also decided to use national rules for C in hopes of spreading the scores out a bit. We included four "joke" ciphers to provide some easy points, but many teams ignored them in favor of the ciphers worth a lot more.
Division B
Scoring Distribution
We had most of the questions from C transfer to B while changing the national questions to regional ones. Since Codebusters B was a trial event, and a relatively new one at that, we didn't expect a lot of teams to score particularly well, but as in Division C, the top 4 teams scored particularly high, with the top 2 scoring pretty close to one another. Congratulations to Kennedy Middle School for scoring 47 points higher than Sierra Vista Middle School, securing the gold medals.
We tried to make the test for B easier by employing the easier ciphers that were specific to B (atbash and railfence), but there were still a lot of scores on the lower side. Perhaps next time we can make the quotes for B somewhat easier.
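For anyone new to the two Division B ciphers mentioned above, here is a minimal Python sketch (illustrative only, not code from the test) of atbash and a rail fence encoder:

```python
def atbash(text: str) -> str:
    """Atbash swaps each letter with its mirror (A<->Z, B<->Y, ...)."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr(base + 25 - (ord(ch) - base)))
        else:
            out.append(ch)  # punctuation and spaces pass through unchanged
    return ''.join(out)

def rail_fence_encrypt(text: str, rails: int) -> str:
    """Write the text in a zigzag across `rails` rows, then read row by row."""
    if rails == 1:
        return text
    rows = [[] for _ in range(rails)]
    row, step = 0, 1
    for ch in text:
        rows[row].append(ch)
        if row == 0:
            step = 1
        elif row == rails - 1:
            step = -1
        row += step
    return ''.join(''.join(r) for r in rows)

print(atbash("HELLO"))                           # SVOOL
print(rail_fence_encrypt("WEAREDISCOVERED", 3))  # WECRERDSOEEAIVD
```

Note that atbash is its own inverse, so the same function both encrypts and decrypts.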
Reflections
Almost none of the teams attempted the morbit/pollux and baconian ciphers. That made sense, because you would have to write the ciphertext down on paper and then solve it manually, which would waste a lot of time. Unfortunately, there is no better option for these types of ciphers at the moment, but hopefully there will be a solution by the time regionals and state roll around.
If we had implemented a timed question, it would have been the first question. I intentionally rearranged "I before E" into "E after I" so that a lot of teams would take some time on it, so expect some surprise twists in the future if you see me as the event supervisor! Overall, the scoring distribution was pretty decent, with the top team getting about 50-60% of the available points. Shout out to Camas High School Team Graphite for coming closest on the ultimate tiebreaker question with 240.5 WPM; the actual answer was 241.9 WPM!
I would also like to give a shoutout to Thuan, the developer of Scilympiad, for supporting these cryptography-type questions, which makes life easier for both competitors and ESes!
Div B Stats
Max: 3,272.00
Mean: 624.06
Median: 387.50
Q1: 100.01
Q3: 750.01
SD: 783.04
Div C Stats
Max: 5,478.00
Mean: 1,828.37
Median: 1,651.00
Q1: 700.01
Q3: 2,806.50
SD: 1,363.23
Last edited by Longivitis on Mon Oct 26, 2020 4:50 pm, edited 1 time in total.
- These users thanked the author Longivitis for the post (total 3):
- Name (Mon Oct 26, 2020 5:06 pm) • Umaroth (Mon Oct 26, 2020 5:10 pm) • Adi1008 (Mon Oct 26, 2020 5:42 pm)
University of Texas at Austin '23
Cypress Lakes High School '19
Chemistry Lab, Codebusters, Game On, Science Word, Towers, We've Got Your Number
-
- Staff Emeritus
- Posts: 1367
- Joined: Sun Apr 19, 2015 6:37 pm
- Division: Grad
- State: FL
- Has thanked: 2 times
- Been thanked: 30 times
Re: The University of Texas at Austin Invitational 2020
Heredity B / Designer Genes C
Congrats to everyone for competing this past weekend! We assembled a pretty large team of experienced ESes to contribute questions, and all in all, I think things turned out pretty well. Huge thanks to the UT Austin team for doing all the Scilympiad interfacing of the exam; they did a phenomenal job. One thing to mention is that the Heredity B test was cut a bit short because we hadn't realized the lack of overlap between B and C questions. That, combined with the immense number of teams, resulted in a pretty tight distribution.
- These users thanked the author windu34 for the post (total 4):
- Umaroth (Mon Oct 26, 2020 9:37 pm) • Adi1008 (Tue Oct 27, 2020 12:39 am) • azboy1910 (Tue Oct 27, 2020 12:53 pm) • sciolyperson1 (Tue Oct 27, 2020 11:26 pm)
Boca Raton Community High School Alumni
Florida Science Olympiad Board of Directors
National Physical Sciences Rules Committee Member
kevin@floridascienceolympiad.org || windu34's Userpage
-
- Administrator
- Posts: 2405
- Joined: Sun Jan 05, 2014 3:12 pm
- Division: Grad
- State: WA
- Pronouns: He/Him/His
- Has thanked: 165 times
- Been thanked: 727 times
Re: The University of Texas at Austin Invitational 2020
Event Supervisor Review: Detector Building
Congrats to everyone who competed in the Detector Building event! I was very pleased with the distribution of scores and especially many of the responses to the short answer questions which demonstrated exceptional understanding of the event topics.
Test Format: When writing for Detector Building, I tend to focus on what I think makes it a unique event from Circuit Lab: engineering hardware devices to gather data for developing models. I test for familiarity with components used for constructing devices, understanding of the science that allows for light-emission or temperature-sensing with these components, and knowing how to work with data effectively. As the mini SO version of this event excludes the build portion, it was even more important to emulate these parts of the event through the Written Test. As such, questions involved deciding which components to use (questions 1 and 33), understanding device limitations (questions 25-26, 37-39), interpreting code (question 40), comprehending datasheets (question 36), and working practically with data (questions 41, 46-48).
Impressions: I was pleased to see a large number of teams answer the code question (40) correctly. However, I was surprised by the number of teams that answered the first question incorrectly, which is a calculation every team should have completed in making their device. It seems many teams were tripped by the device resolution questions (37-39) and some answer choices of the datasheet comprehension question (36). If you struggled with the B parameter equation (question 24) and voltage divider (question 51) calculations, I recommend reviewing them in preparation for future competitions.
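For readers brushing up on those two calculations, here is a quick sketch in Python; the component values (a 10 kΩ NTC thermistor with B = 3950 and a 5 V supply) are illustrative assumptions, not the ones from the test:

```python
import math

def thermistor_temp_c(r_ohms: float, r0: float = 10_000.0,
                      t0_c: float = 25.0, b: float = 3950.0) -> float:
    """B parameter equation: 1/T = 1/T0 + (1/B) * ln(R/R0), with T in kelvin."""
    t0_k = t0_c + 273.15
    inv_t = 1.0 / t0_k + math.log(r_ohms / r0) / b
    return 1.0 / inv_t - 273.15

def divider_vout(vin: float, r_top: float, r_bottom: float) -> float:
    """Voltage measured across r_bottom in a two-resistor series divider."""
    return vin * r_bottom / (r_top + r_bottom)

# At R = R0 the thermistor reads its reference temperature:
print(round(thermistor_temp_c(10_000.0), 2))  # 25.0
# Equal resistors split the supply in half:
print(divider_vout(5.0, 10_000.0, 10_000.0))  # 2.5
```

For an NTC thermistor, resistance falls as temperature rises, which is why a measured resistance below R0 maps to a temperature above T0.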
A surprising number of teams left many of the short answer questions blank, even question 41, which asked about general steps for improving data accuracy. A common mistake on the fill-in-the-blank questions was not rounding to the precision requested by the question, or including units (which teams were asked to omit from their responses).
I’ve uploaded the test, datasheet referenced by the test, answer key with detailed explanations, and results summary with histograms for overall score as well as by question to the Test Exchange. I hope you’ll find the detailed explanations I’ve written for every question and answer choice helpful for your preparations.
- These users thanked the author bernard for the post (total 10):
- Adi1008 (Tue Oct 27, 2020 8:14 pm) • builderguy135 (Tue Oct 27, 2020 9:13 pm) • sciolyperson1 (Tue Oct 27, 2020 9:14 pm) • Rossyspsce (Tue Oct 27, 2020 9:41 pm) • Giantpants (Tue Oct 27, 2020 9:44 pm) • RobertYL (Tue Oct 27, 2020 11:21 pm) • imaditi (Wed Oct 28, 2020 3:15 am) • EwwPhysics (Wed Oct 28, 2020 5:56 am) • IOnlyShoot3s (Wed Oct 28, 2020 7:07 am) • Godspeed (Wed Oct 28, 2020 8:51 pm)
"One of the ways that I believe people express their appreciation to the rest of humanity is to make something wonderful and put it out there." – Steve Jobs
-
- Member
- Posts: 11
- Joined: Sun Dec 15, 2019 1:15 pm
- Division: Grad
- State: CA
- Pronouns: He/Him/His
- Has thanked: 6 times
- Been thanked: 14 times
Re: The University of Texas at Austin Invitational 2020
Botany B/C
Preface:
Hello, my name is Mayur, the Botany B/C event supervisor for the 2020 University of Texas Invitational. First, congratulations to all the UT Invitational competitors for any and all work that you put in on my difficult examination. An exam of roughly 85 multiple choice, several fill-in-the-blank, and many free response questions in 50 minutes is no joke. I wholeheartedly thank all of the teams who were patient, attentive, and appreciative. I loved all of the positive responses given on my SciOly Feedback Survey! Shoutout to Thuan for creating and continuing to improve Scilympiad!
Learning concepts to reflect upon after taking my exam:
- Proper analysis and comprehension
- Time Management
- Team Communication
- Plant Anatomy and Physiology
- Horticulture*
- Genetic Engineering
- Pharmacognosy*
- Phytopathology
- Biochemistry*
Common Mistakes:
- Teams neglected to read instructions (e.g. assume the answer is singular unless otherwise instructed). For the most part, that should not have been an issue, since Scilympiad accepts plural or singular answers.
- Congratulations to Jeffrey Trail (JTMS Blue) (67.5 points), the runner-up team, Beckendorff (Beckendorff A) (59 points), followed by Solon (SMS SO) (47 points) on your first, second, and third team placements, respectively!
- This test had to differ from Division C, so I added the "Plant Evolution" and "Plant Diversity" free-response questions. For some odd reason, many teams assumed the plant life cycles had angiosperm parts like a stigma and ovary. This is not the case! The first cycle is a moss, the second a fern.
- Multiple Choice Questions were obviously rough on all teams.
- Was impressed with team responses on the “C4 and CAM Pathways” FRQ. Loved seeing 12/12 points!
- Lots of lost opportunities to place higher by not describing your answer(s).
- Congratulations to William P. Clements (Clements B) (107 points), the runner-up team, North Hollywood (NoHo) (101.5 points), followed by Troy (Troy Black) (95 points) on your first, second, and third team placements, respectively!
- I admit that this exam was quite notorious, especially the “Plant Biochemistry” FRQ was quite difficult. Thank you to all the teams who answered any and all questions pertaining to it.
- You clever competitors were quick to identify two foods containing phenylalanine (mostly protein-rich foods): an easy two points off the bat!
- Several teams did not put much thought into defining drug adulteration: the practice of making crude drugs of poorer quality, whether through deliberate or unintentional means (e.g. inferiority, spoilage, admixture, lack of knowledge).
- Be sure to study up on plant horticulture (e.g. the three-cut method, crown-raising, branch aspect ratio). As far as I know, no team knew exactly what branch aspect ratio was. Logic and Figure 1 of the "Tree Essentials: Figure 1" FRQ helped teams obtain that one point (50%, 1:2, or 1:3). Something like 1:7 is invalid because there were only 3 co-dominant leading structures.
- Many teams moved up in placing based on their performance in multiple choice questions, free-response questions, or a mixture of both.
- Many teams answered the "Stem Characteristic" (8 points) and "C3 and C4 Graphs" (7 points) FRQs with ease, and it really helped with their team placement. This also outweighed the concern of teams writing "Serrated" instead of "Serrate" on the auto-graded fill-in-the-blank question (1 point). I even gave away an answer to the first FRQ, yet many teams did not type it in; it was a free point. Same goes for the "Agrobacterium" FRQ.
- Grading was quite tedious as many teams gave similar answers and I made impromptu exceptions noted on the Answer Key.
- Looking at the scoreboard, it’s crazy to see how scoring 10 extra points could move you up to 20 ranks.
Once again, thank you for sitting down to take my exam. Although there are not many botany exams released (I’m only aware of the MIT one), I hope this exam acts as a study guide for many competitors as they partake in other tournaments. P.S. You might run into another one of my Botany exams .
Attached are the exams and answer keys, followed by statistical distribution of the overall performance:
Test: Division B | Division C
Answer Key: Division B | Division C
Comprehensive Free Response & Tie-breaker Criteria: Division B | Division C
Score v. Placement Statistics: Division B | Division C
- These users thanked the author Mayur917 for the post (total 5):
- Krish2007 (Wed Oct 28, 2020 4:09 pm) • Booknerd (Wed Oct 28, 2020 4:17 pm) • Adi1008 (Wed Oct 28, 2020 7:05 pm) • IOnlyShoot3s (Thu Oct 29, 2020 2:18 pm) • stenopushispidus (Fri Nov 13, 2020 12:46 pm)
Etiwanda High School '19
University of California, Riverside, B.S., Biochemistry '23
Southern California Science Olympiad Alumni
Mathematics Connoisseur
https://scioly.org/wiki/index.php/User:Mayur917
-
- Exalted Member
- Posts: 1067
- Joined: Mon Apr 23, 2018 7:13 pm
- Division: C
- State: NJ
- Pronouns: He/Him/His
- Has thanked: 526 times
- Been thanked: 597 times
Re: The University of Texas at Austin Invitational 2020
Results are out!
Division B:
https://drive.google.com/file/d/1pTWGL_ ... 0FKVL/view
Division C:
https://drive.google.com/file/d/16SVWvg ... faSdX/view
- These users thanked the author sciolyperson1 for the post (total 3):
- Umaroth (Thu Oct 29, 2020 10:35 pm) • Adi1008 (Thu Oct 29, 2020 11:23 pm) • IOnlyShoot3s (Fri Oct 30, 2020 7:01 am)
WW-P HSN '22, Community MS '18, BirdSO Tournament Director
Event Supervisor + Volunteer
'22: Bridge/Code/ExDesi/GV/PPP/Traj/WIDI/WICI
Sciolyperson1's Userpage
-
- Exalted Member
- Posts: 1067
- Joined: Mon Apr 23, 2018 7:13 pm
- Division: C
- State: NJ
- Pronouns: He/Him/His
- Has thanked: 526 times
- Been thanked: 597 times
Re: The University of Texas at Austin Invitational 2020
Superscoring live: https://docs.google.com/spreadsheets/d/ ... sp=sharing
- These users thanked the author sciolyperson1 for the post (total 3):
- Godspeed (Thu Oct 29, 2020 9:58 pm) • Umaroth (Thu Oct 29, 2020 10:35 pm) • Adi1008 (Thu Oct 29, 2020 11:23 pm)
WW-P HSN '22, Community MS '18, BirdSO Tournament Director
Event Supervisor + Volunteer
'22: Bridge/Code/ExDesi/GV/PPP/Traj/WIDI/WICI
Sciolyperson1's Userpage
-
- Exalted Member
- Posts: 1067
- Joined: Mon Apr 23, 2018 7:13 pm
- Division: C
- State: NJ
- Pronouns: He/Him/His
- Has thanked: 526 times
- Been thanked: 597 times
Re: The University of Texas at Austin Invitational 2020
Div C w/ Drops:
1 - Troy High School (CA) 87 pts.
2 - Seven Lakes High School 94 pts.
3 - Mason High School 102 pts.
4 - Mission San Jose High School 203 pts.
5 - William P. Clements High School 216 pts.
6 - Iolani School 232 pts.
Div C w/o Drops:
1 - Troy High School (CA), 136 pts.
2 - Seven Lakes High School, 164 pts.
3 - Mason High School, 204 pts.
4 - William P. Clements High School, 326 pts.
5 - Montgomery High School, 366 pts.
6 - American High School, 378 pts.
Div B w/ Drops:
1 - Kennedy, 29 pts.
2 - Jeffrey Trail, 35 pts.
3 - Beckendorff JH, 54 pts.
4 - Winston Churchill, 86 pts.
5 - Solon, 126 pts.
6 - Fulton Science Academy, 132 pts.
Div B w/o Drops:
1 - Jeffrey Trail, 66 pts.
2 - Beckendorff JH, 106 pts.
3 - Kennedy, 115 pts.
4 - Winston Churchill, 169 pts.
5 - Solon, 188 pts.
6 - Sierra Vista, 206 pts.
- These users thanked the author sciolyperson1 for the post (total 3):
- Umaroth (Thu Oct 29, 2020 10:35 pm) • Adi1008 (Thu Oct 29, 2020 11:23 pm) • IOnlyShoot3s (Fri Oct 30, 2020 7:02 am)
WW-P HSN '22, Community MS '18, BirdSO Tournament Director
Event Supervisor + Volunteer
'22: Bridge/Code/ExDesi/GV/PPP/Traj/WIDI/WICI
Sciolyperson1's Userpage
-
- Member
- Posts: 47
- Joined: Sat Feb 17, 2018 7:19 am
- Division: Grad
- State: TX
- Pronouns: He/Him/His
- Has thanked: 17 times
- Been thanked: 45 times
Re: The University of Texas at Austin Invitational 2020
We've Got Your Number B/C
We weren't initially going to run WGYN, since it would have been a lot of work for graders to whip out a calculator and punch in hundreds of expressions, but I managed to code some tooling that made everything a lot easier. The GitHub repo for it is here if anyone wants to know how the expressions are calculated. Any feedback is appreciated, and if other tournaments want to run WGYN with Google Forms (although I wouldn't recommend adding it to the final score), feel free to use it!
Scores
Division B Scoring Distribution
Division C Scoring Distribution
Overall, both divisions did very well at inputting their expressions correctly into the Google Form. The extra 10-minute buffer that we allowed gave teams ample time to copy/paste their answers from their document into the form. I tiered teams that took over an hour to submit, since the extra time gives them an inherent advantage.
I gave Division C an easy set for the first one and an okay set for the second one. The max potential score was 183, but Clements High School exceeded expectations and could've gotten 184 if they hadn't missed 22 by accident on the second set. Division B had a different set of numbers and a disadvantage, since they can't use factorials or logs, but Kennedy Middle School still got a score of 148, which is very impressive.
Mistakes
The most common formatting mistakes teams made were submitting two separate Google Forms that I had to merge together, or using implicit multiplication and brackets, which the calculator does not like. There were also the standard mistakes of not closing parentheses and not using the correct numbers in a couple of expressions.
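As a rough illustration of the general approach (a sketch only, not the code in the repo linked above), a grader can evaluate submitted expressions safely by walking a whitelisted AST instead of calling eval():

```python
import ast
import operator

# Only plain arithmetic is allowed; names, calls, etc. are rejected.
OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
    ast.Pow: operator.pow, ast.USub: operator.neg,
}

def safe_eval(expr: str) -> float:
    """Evaluate an arithmetic expression, raising on disallowed syntax.

    Unbalanced parentheses surface as a SyntaxError from ast.parse,
    which catches the "not closing parentheses" mistake automatically.
    """
    def walk(node: ast.AST) -> float:
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.operand))
        raise ValueError(f"disallowed syntax: {type(node).__name__}")
    return walk(ast.parse(expr, mode="eval"))

print(safe_eval("2*(3+4)"))  # 14
```

Implicit multiplication like "2(3+4)" is rejected here too, since Python parses it as a function call, mirroring the calculator behavior described above.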
Div B Stats
Max 148.00
Mean 68.16
Median 69.00
Q1 35.00
Q3 105.00
SD 44.60
Div C Stats
Max 182.00
Mean 90.55
Median 95.10
Q1 51.00
Q3 133.00
SD 50.41
- These users thanked the author Longivitis for the post (total 3):
- sciolyperson1 (Fri Oct 30, 2020 12:03 pm) • Adi1008 (Fri Oct 30, 2020 12:19 pm) • IOnlyShoot3s (Sun Nov 08, 2020 6:39 am)
University of Texas at Austin '23
Cypress Lakes High School '19
Chemistry Lab, Codebusters, Game On, Science Word, Towers, We've Got Your Number
-
- Member
- Posts: 1
- Joined: Wed Sep 30, 2020 12:36 pm
- Division: B
- State: CA
- Has thanked: 0
- Been thanked: 0
Re: The University of Texas at Austin Invitational 2020
Hi,
It would be great if we could get some insight on a few questions from Crime Busters.
1) Which is more acidic, lemon juice or vinegar? How would one determine the correct answer? Based on our readings, lemon juice has a pH of about 2 and vinegar about 2.5, but the exam expected the answer to be vinegar. It would be great if you could provide a resource that clarifies this.
2) Which metal produces bubbles with a 20-second delay in HCl, Zn or Al? From YouTube experiments, it looks like Zn reacts violently with HCl, releasing H2 and causing bubbles, whereas Al displays a delayed reaction. Any pointers here would be really helpful.
3) Identifying salt vs. sugar: from the pictures, it's hard to distinguish the cubic structure (salt) from the rounded hexagonal shape (sugar). Is there anything else that can help identify one vs. the other?
This is our first year doing science olympiad. Any guidance is really helpful. Thanks a lot for your time!
Thanks!