daydreamer0023 wrote:
Anyone have thoughts on the Forensics test? I know that I couldn't light the match... then I got the event helpers to try to help me... they got the match lit but couldn't light the candle... long story short, the entire box of matches was spent and I was forced to sight-identify the fibers without a candle. The team across from me had an incident where the team before them spilled ink over the shared test packet (not including the answer sheet) at the station, so they couldn't read it. The event supervisor did not seem very sympathetic to their problem; one of them seemed very distraught as they relayed the mishap by phone (to a coach?) after the test.

Kyanite wrote:
I think the ink incident was my team; luckily, I think they placed OK.

No, it was Columbia HS. The ink spill happened in Block 1 and I was in Block 2. I'm honestly surprised they didn't fix it during Block 4.
Nationals Event Discussion
Re: Nationals Event Discussion
"I am among those who think that science has great beauty. A scientist in his laboratory is not only a technician: he is also a child placed before natural phenomena which impress him like a fairy tale." - Marie Curie
Enloe '19 || UNC Chapel Hill '23
See resources I helped create here!
Re: Nationals Event Discussion
Wow, that is really rough then. My team must have been in Block 3.
LSU Class of 2022, Geaux Tigers
https://scioly.org/wiki/index.php/Camas_High_School
Re: Nationals Event Discussion
This is kinda late but I still wanted to share.
Dynamic Planet (12th): 6.5/10. This test was very unbalanced and a little disappointing. I enjoyed the fact that there were questions on figures, but far too many questions were based on them. The test leaned on logic rather than testing the topics in the rules, and even when it did cover those topics, it stayed at very basic plate-tectonics knowledge. That was disappointing, because I barely looked at my cheat sheet throughout the test; the only times I used it were to check a couple of definitions to write a better answer. I felt much more could have been tested. They did section the test by topic, which made it easier to split the test and to see what was going to be covered.
Optics (5th): 7/10. Test: overall, I thought the test had a nice balance between the topics, though I was disappointed that there weren't more problems on lenses, mirrors, and red/blue shift. Other than that, the test covered various topics and did a good job of testing our knowledge. It was also a good length, and my partner and I were able to complete it. Box: even though the test was good, I have very different opinions about the box. I didn't like that the mirrors didn't touch the base and were elevated; this made alignment hard, because we had to align the wood attached to the back of each mirror instead of the mirror itself. I also did not appreciate the covers, which weren't thin and added to the difficulty of aligning the mirrors. Finally, I'm disappointed with the barrier placement, because it didn't challenge us enough. There are so many things you can do with the barrier, and the barrier is pretty important since it's worth a lot of points; I would've liked to see them challenge us on how well we could guide the laser. Even with all this criticism, the proctors were extremely nice and really patient. We had a lot of templates for different scenarios, and they were willing to wait for us to organize them as we prepared to set up.
"Who's Fettywap?"
Re: Nationals Event Discussion
Yeah... sometimes things don't get fixed even if you ask for it (trust me on this; I live in the same state as the nats Forensics ES). There have been quite a few times when I've asked for a new candle and not necessarily gotten a better one upon request. I remember when I competed at nats in 2016, my partner and I went in basically thinking, "OK, everything's gonna be mega contaminated, RIP us," since we went in Block 6 of the day... plus, you can only imagine how cranky she was at the end of the day. I still get shivers thinking about it.
Carmel HS (IN) '16
Purdue BioE '21? reevaluating my life choices
Nationals 2016 ~ 4th place Forensics
"It is important to draw wisdom from different places. If you take it from only one place, it becomes rigid and stale." -Uncle Iroh
About me || Rate my tests!
Opinions expressed on this site are not official; the only place for official rules changes and FAQs is soinc.org.
MY CABBAGES!
Re: Nationals Event Discussion
Disease Detectives (10): Supervisors gave us only 40 minutes to complete the test, so it was a time crunch until the very end. It was completely different from any prior Nationals test: instead of two distinct case scenarios, there was one general knowledge section and one case study. The questions were not very difficult, and they provided a significant amount of filler information. They also included a strange patient information chart that we needed to fill out based on background information given to us. The test was very strange, and while it didn't appear very difficult, there were indeed some tricky and abstract questions. We pushed ourselves to finish the test and were hoping to medal, but alas, the field was strong (as always) and we came up short.
Remote Sensing (10): This test was very disappointing. They asked optics and physics questions unrelated to the nature of the event. I thought last year's test was relatively fair (albeit very difficult), but this year's test was outright off-topic in several sections. They included Snell's Law problems, Doppler speed questions (not too egregious), and other unrelated physics jargon, while the test was very lacking in climate information and relevant physics questions. They sprinkled in a few good questions here and there, but I expected so much more from a Nationals test.
Forensics (14): The test was long and hard. The powders were varied and well chosen, and the test was a trademark Nationals test. There were so many things to do in 50 minutes, and we needed every second of it. We didn't have enough time to finish (I'm not sure anyone did) but plowed through a lot of it, although we messed up on chromatography and on getting proper measurements. I was told Forensics at Nationals would be crazy, and it really was; no amount of preparation could've equipped me for the time crunch. That said, the test itself and the frantic atmosphere were pretty exhilarating and thrilling. Good test, and I had a lot of fun taking it (even if my hands were shaking the entire time).
Towers (12): Very nice supervisors. I heard the stand was not level, but aside from that, it was smoothly run.
New Trier Scioly
Marie Murphy 2012-2015
2017 Events || Disease Detectives, Remote Sensing, Forensics, Towers
Re: Nationals Event Discussion
I agree with your opinion on the Remote Sensing test. Unfortunately, I am not strong in physics (at all).
2019 Interests: Anatomy, Disease Detectives, Fossils, Experimental Design, Geologic Mapping, Designer Genes
Anatomy/Disease/Experimental/Fossils/Circuit Lab:
MIT: 12/25/13/22
Regionals: 1/1/x/x/1
State: 1/1/2/1/x
Nationals:
Re: Nationals Event Discussion
I apologize for being so late to this forum, but I thought it would be best as a graduating senior to first take the time to look back at my entire Science Olympiad career and the experiences I've had. I've enjoyed every minute of competition over the years, and I plan to continue being involved in SO in any way I can. With that said, I hope that those involved with SO and the National Tournament will seriously consider the thoughts that I and many others on this forum put out so that SO can continue to improve at the highest level. I apologize in advance for the length!
I'll start with my events!
Anatomy & Physiology (21) - I'll be completely honest: this was not good at all. The test failed to represent the rules manual, as many topics in each system were completely left out. For the most part, the stations were far too easy, but there were also questions that were either too random or too irrelevant to the heart of the event (e.g., pink puffers and blue bloaters). Although the case study was a good idea, it just wasn't executed well enough, and the patient gave too many contradictory responses.
One of the beautiful parts of an event like Anatomy & Physiology is the multitude of ways in which an event supervisor can test the competitors' knowledge. I am always impressed by invitational tests like MIT's that are able to cover all of the rule book and really separate teams based on their knowledge of A&P. This test was truly embarrassing for a national-level tournament in its inability to accomplish that. I thought last year's A&P test was better: it had parts that were easy, medium, and hard, as well as sections that forced competitors to dig into the knowledge of A&P they had built over many months. You could tell that last year's test was written by someone with a doctor's perspective and a solid grasp of the rules. I cannot say the same about Ms. Palmietto's exams from 2015 and 2018. It was just so frustrating to end on such a sour note, for someone like me who made this event my first true love over the years and was fortunate enough to attend the SfN Conference through SO last year. However, I know I would've had the same complaints even if I had placed in the top 6. Congrats to Mason on a phenomenal job in this event and as a team all year.
Disease Detectives (4) - The CDC does a good job with this event on a yearly basis. The exams are always long and challenging, and they test the competitors' ability to adapt to different scenarios and use critical thinking. All in all, I thought this year's event was done well, but probably not as well as in previous years. The general knowledge section at the beginning of the test was an interesting twist (especially the bias and confounding part), but it was a little too straightforward for a national test. The second section, on plague, had some new questions that I enjoyed answering and thinking through, but it still felt too easy overall (e.g., the Bradford-Hill criteria listing). Although I think topics like occupational health and plague are interesting inclusions in Nationals case studies, it's still strange that foodborne disease was hardly ever addressed.
The test was definitely doable in the 40 minutes given, but the national supervisors need to do a much better job with checking teams in and handing out booklets so that all teams have 50 minutes to work, not 40 or fewer. This test was OK overall, but I'd point to the tests from the last three years as much better examples of how a nationals DD test should be structured. Congrats to TJHSST for winning this event.
Materials Science (25) - Just like how an NBA game can be a "tale of two halves," this MatSci exam was a tale of two parts. I'll start with the test. For the most part, I thought it was quite good for a Nationals test. It had varied multiple-choice questions that covered most, if not all, of the rule book. For example, I really liked the questions on the materials testing equipment, because that's something mentioned in the rules that takes a good amount of research to fully understand. The matching section could've been written and spaced out better, and there was probably too much on certain topics like IR. Like Unome said earlier, there was also probably too much focus on individual polymers rather than on how polymers behave as a class.
The lab, on the other hand, was quite flawed. It asked competitors to melt some cheese and then extrude it through a syringe to form a string that had to be carried to a station for judging. Next, it asked us to make a hashtag from the extruded cheese that also had to be carried. Seriously? I understand that challenging labs can be expensive to create and difficult to complete in the allotted time, but I've seen many interesting and intuitive labs at tournaments all year that blew these out of the water. I was definitely expecting better from the event supervisor, because my partner had a blast doing the metals-based labs at Wright State Nationals.
Also, the scoring system for the cheese lengths was just bad. I don't remember the numbers exactly, but the scoring ranges were unreasonably large; for example, a team with a 21 cm string would get the same number of points as a team with a 99 cm string, a significant difference considering how much cheese we were initially given. It just didn't capture the essence of the event's focus on polymers and seemed rather silly in the grand scheme of things. Frankly, it seemed like the event supervisor felt the need to force cheese into the event in some way because he's from Wisconsin. The rest of the lab section dealt with opacity and general polymer stuff, both of which were OK. This event was decent, but it had the potential to be great overall with a better lab component. Hats off to Troy for winning this event and the overall tournament.
Again, I don't want to give the impression that I'm complaining or suffering from sour grapes because I didn't do well as I had hoped in all my events. I've thought long and hard about each one, and I feel that those events and others should be analyzed in an unbiased manner that accounts for all the factors involved.
With that said, I thought the overall tournament was run pretty well. The event staff and volunteers were great as usual, and the builders on my team didn't seem to have any glaring problems with how events were run. Colorado State had a great campus, and I didn't really have any issues with traveling between events. I will say that there seemed to be very few things competitors could do on Friday and Saturday other than walk around and/or compete. The Noosa was great, however, and I haven't eaten yogurt since then because other brands are just a letdown in quality. I'd rank this tournament as better than Nebraska (for obvious reasons) and around the same as Wright State and UW-Stout. In my opinion, the UCF Nationals from 2014 and 2012 were just awesome and unparalleled in quality.
I'll premise my final part by first saying that Science Olympiad does a LOT of things right. I've always loved picking up a variety of events that have allowed me to grow in so many ways. SO is also incredibly rewarding, and its benefits obviously go far beyond a few medals at States and Nationals. However, the National Tournament and SO have some pressing issues that need to be tackled. Nationals should reflect the highest standards that SO has to offer. I think a lot of people on these forums believe that the quality of Nationals, especially in the bio events, has been declining a bit over the years. It's far too common to see tests at invitationals like MIT, GGSO, and SOUP in January and February that are significantly better than Nationals tests. That shouldn't happen, plain and simple. I understand that the National event supervisors have responsibilities outside of SO, but it's simply unacceptable when a college student or professor writes a test that demonstrates a greater understanding of the event and all it entails than the National supervisor does. This is especially concerning since the event supervisors have almost an entire year to prepare an exam that they know students have been diligently preparing for. There are too many exams like Ecology last year and Herpetology and Anatomy & Physiology this year that leave competitors asking for much, much more. The National supervisor committee as a whole should honestly reconsider their approach, especially since there's a solid argument that tests from invites like MIT are better nearly across the board.
I've been truly honored to attend 7 national tournaments, and as a result, it feels like SO has become ingrained in my DNA. I'm a little sad that my team and I were not able to do as well as we had wished in my final competition ever. In the end, however, I'll remember all the incredible memories I've created from SO. I love my teammates to death, and I know they'll be back next year with a vengeance. We all have a great deal of respect for teams like Troy, Solon, and Harriton that consistently churn out winning teams that capture what SO is all about. Another shoutout goes to all the people on these forums for their awesome performances and dedication to Science Olympiad. You guys rock.
I'd love to hear what others think. Feel free to reply or PM me if you'd like to discuss.
Northville '18, UMich '22
Vice Executive Director - University of Michigan Science Olympiad Invitational (UMSO)
Website: www.umichscioly.org
Instagram: @umichscioly
Twitter: @umichscioly
-
- Staff Emeritus
- Posts: 1382
- Joined: Sun Apr 19, 2015 6:37 pm
- Division: Grad
- State: FL
- Has thanked: 2 times
- Been thanked: 37 times
Re: Nationals Event Discussion
I find this quite interesting in particular. I would also be interested in hearing people's perspectives and speculation on this.jaguarhunter wrote: I'll premise my final part by first saying that Science Olympiad does a LOT of things right. I've always loved picking up a variety of events that have allowed me to grow in so many ways. SO is also incredibly rewarding, and its benefits obviously go far beyond a few medals at States and Nationals. However, the National Tournament and SO have some pressing issues that need to be tackled. Nationals should reflect the highest standards that SO has to offer. I think a lot of people on these forums believe that the quality of Nationals, especially in the bio events, has been declining a bit over the years. It's far too common to see tests at invitationals like MIT, GGSO, and SOUP in January and February that are significantly better than Nationals tests. That shouldn't happen, plain and simple. I understand that the National event supervisors have responsibilities outside of SO, but it's simply unacceptable when a college student or professor writes a test that demonstrates a greater understanding of the event and all it entails than the National supervisor does. This is especially concerning since the event supervisors have almost an entire year to prepare an exam that they know students have been diligently preparing for. There are too many exams like Ecology last year and Herpetology and Anatomy & Physiology this year that leave competitors asking for much, much more. The National supervisor committee as a whole should honestly reconsider their approach, especially since there's a solid argument that tests from invites like MIT are better nearly across the board.
Boca Raton Community High School Alumni
University of Florida Science Olympiad Co-Founder
Florida Science Olympiad Board of Directors
kevin@floridascienceolympiad.org || windu34's Userpage
-
- Member
- Posts: 2107
- Joined: Fri Jan 09, 2009 7:30 pm
- Division: Grad
- State: OH
- Has thanked: 1 time
- Been thanked: 56 times
Re: Nationals Event Discussion
There is a nuance here that many people don't seem to be aware of. Part of the general 'deal' with being a host site for the National Tournament is that you get to designate a certain number of local people as National Event Supervisors for the tournament. As such, every year there are many events run by people who haven't had experience running a national event before. I'm not implying there is a direct correlation with the concerns some of you have about some national events, but it's likely a significant factor, even though we do a variety of things to try to help them prepare.windu34 wrote:I find this quite interesting in particular. I would also be interested in hearing people's perspectives and speculation on thisjaguarhunter wrote: I'll premise my final part by first saying that Science Olympiad does a LOT of things right. I've always loved picking up a variety of events that have allowed me to grow in so many ways. SO is also incredibly rewarding, and its benefits obviously go far beyond a few medals at States and Nationals. However, the National Tournament and SO have some pressing issues that need to be tackled. Nationals should reflect the highest standards that SO has to offer. I think a lot of people on these forums believe that the quality of Nationals, especially in the bio events, has been declining a bit over the years. It's far too common to see tests at invitationals like MIT, GGSO, and SOUP in January and February that are significantly better than Nationals tests. That shouldn't happen, plain and simple. I understand that the National event supervisors have responsibilities outside of SO, but it's simply unacceptable when a college student or professor writes a test that demonstrates a greater understanding of the event and all it entails than the National supervisor does. This is especially concerning since the event supervisors have almost an entire year to prepare an exam that they know students have been diligently preparing for. 
There are too many exams like Ecology last year and Herpetology and Anatomy & Physiology this year that leave competitors asking for much, much more. The National supervisor committee as a whole should honestly reconsider their approach, especially since there's a solid argument that tests from invites like MIT are better nearly across the board.
Student Alumni
National Event Supervisor
National Physical Sciences Rules Committee Chair
-
- Member
- Posts: 128
- Joined: Sun Apr 30, 2017 12:27 pm
- Division: Grad
- State: MI
- Has thanked: 0
- Been thanked: 0
Re: Nationals Event Discussion
I do wish that all tournaments, not just Nationals, involved younger student alumni (e.g., college students) more frequently. I feel that students can sometimes write some of the best tests because they are the ones who have actually competed in the events dozens of times and know what works and what doesn't. Obviously I'm not saying adults/professors don't do a good job, but I wish college students were more a part of the process at state and national tournaments. I think having student alumni help with the test writing/supervising could increase the quality of the tests substantially.
University of Michigan Science Olympiad Executive Board