
MIT Science Olympiad Invitational 2024

Posted: July 20th, 2023, 4:38 pm
by SciOly at MIT
We are pleased to announce that the 10th annual MIT Science Olympiad Invitational Tournament will take place in-person on Saturday, January 20th, 2024 at the MIT campus. We will be running all 23 Division C national events, including lab and engineering events. All events will be written, reviewed, and run by Science Olympiad alumni currently at MIT, national tournament event supervisors, and/or former Science Olympiad competitors from the highest level of competition. This means that coaches will not be required to write tests or otherwise volunteer at the invitational, leaving them free to focus solely on coaching their own teams.

Registration will open on Friday, September 15th, 2023 at 6 PM ET and will close the following Monday, September 25th at 6 PM ET. More details about registration will be available on our website shortly. Teams will be notified of their acceptance on Monday, October 2nd, and a public list of participating teams will be announced on Monday, October 9th.

We will guarantee acceptance of schools that meet either of the following qualifications:

- Schools that have placed Top 10 at the MIT Invitational the previous year
- Schools that have placed Top 10 at the National Tournament the previous year

All teams that do not meet the qualifications for guaranteed acceptance will have their acceptance determined by a lottery. As we continue to grow our tournament, we aim to maintain a diverse pool of competitors while also upholding the rigor of our tournament. As such, we still strongly encourage all teams to apply, especially Massachusetts teams, regardless of their reputation or experience level.

Schools that would like to bring two teams can fill out an additional question on the registration form explaining why they need a second team; the answers will be reviewed so that as many schools as possible are accommodated.

This announcement can also be viewed on our website scioly.mit.edu. Please do not hesitate to contact us at scioly@mit.edu if you have any questions. We look forward to the upcoming season!
[Attachment: scioly 2024.png]

Re: MIT Science Olympiad Invitational 2024

Posted: September 7th, 2023, 2:36 pm
by SciOly at MIT
We're excited to announce the opening of our event supervisor application for the 2024 MIT Science Olympiad Invitational on January 20th, 2024.

The application for first-time event supervisors (i.e. you haven’t supervised for MIT before) can be found at:
https://forms.gle/6NVPkSceXUgjx8SM8

The application for returning supervisors can be found at:
https://forms.gle/yPzbme1TV5V2rHkB9

Applications will close at 11:59 PM ET on September 29th, 2023.
Hope to see you apply!

Re: MIT Science Olympiad Invitational 2024

Posted: January 21st, 2024, 1:41 pm
by windu34
MIT 2024 Scrambler - ES
I was the Scrambler ES. Overall, I was impressed by the quality of the devices that we judged. In addition to the graph of scores I have attached below, I wanted to provide a few comments to help teams put their experiences in perspective and identify areas to improve. We saw one top 3 nationals score, which I feel is typical for MIT builds over the past 8 years.

Competition Setup
We were ready for a busy impound with 8 stations and 4 staff ready to check devices in. Students should not expect this at regionals/states. I always advocate for impounding as early as you can, even arriving 5-10 minutes before the stated impound time, to ensure you don't get stuck in an impound line before your first event.
We chose not to mark the centerline. This is always my preference because I believe teams should have to spend time developing a method for aiming their device. In retrospect, we probably should not have made the track parallel to the floor tiles; that would have made aiming more difficult and disadvantaged the so-called "accurate eye-ballers".
We started timing the 8 mins immediately once teams had gathered their materials from impound. Far too many teams are not rehearsed and struggle to get off two runs in 8 minutes. I highly encourage teams to simplify their setup and rehearse their runs while being timed.

Violations
There are many construction and competition violations that could have been handed out. As this was an invitational, we refrained from doing so and notified teams of issues when possible. However, this courtesy should not be expected. Here are a few potential violations that I frequently saw:
- Following the device after launch
- Bottom of the dowels not between 5-10 cm from the floor
- Flag not centered in 17cm
- Height of device not <1m
- Front and rear 1 cm of the egg covered by tape (these must remain uncovered)
- Point of egg not over the start point in ready-to-run config
- Launcher not behind the 0.5m line when egg is over target point in ready-to-run config
- Car fails to travel 7m

Take-home tips
Time is 2 pts per second. Accuracy is 1 pt per 1 cm. Accuracy is way harder to master. Only 2 teams broke 20 points. A slow (4-5 second) but fairly accurate (<10 cm) device would have placed 3rd. Yet I saw so many devices with <3 second times that were off the target by 30+ cm. In my opinion, once you have <3 second times, don't speed up the device until you have <10 cm accuracy repeatably.
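To make the tradeoff above concrete, here is a minimal Python sketch of the run score as described (2 pts per second plus 1 pt per cm, lower is better; the function name is mine, and tiebreakers and bonuses are ignored):

```python
def run_score(time_s: float, distance_cm: float) -> float:
    """Scrambler run score (lower is better): 2 points per second
    of run time plus 1 point per centimeter from the target."""
    return 2.0 * time_s + 1.0 * distance_cm

# A slow-but-accurate run beats a fast-but-wild one:
print(run_score(4.5, 10))   # 19.0  (4.5 s, 10 cm off)
print(run_score(2.8, 30))   # 35.6  (2.8 s, 30 cm off)
```

Shaving a second off the run saves 2 points, but shaving 20 cm off the miss distance saves 20, which is why accuracy should come first.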

Know what the worst-case-scenario skid on your device is and use that to determine the first-run distance. I saw so many teams that should have been more aggressive with their first run (aiming to be 40 cm short when they could have reasonably aimed for 10 cm short). Many of these teams then proceeded to break their egg on the 2nd run.

Teams need a methodical way to aim. Use a scope or some similar device.

Rail systems that guide the vehicle during launch drastically improve repeatability. See aircraft carriers.

Planned Improvements to How the Event Was Run
- We didn't end up running very behind, but we were worried. 2 tracks for 72 teams was not ideal. In the future, I think 1 track per 4 teams in a time block is ideal (8 teams per time block = 2 tracks, 12 teams per time block = 3 tracks).
- Photogate....one day we will get this.
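The 1-track-per-4-teams guideline above is just division rounded up; a tiny sketch (the function name is mine):

```python
import math

def tracks_needed(teams_per_block: int, teams_per_track: int = 4) -> int:
    """Tracks required so that no more than `teams_per_track`
    teams share a track within one time block."""
    return math.ceil(teams_per_block / teams_per_track)

print(tracks_needed(8))   # 2 tracks for 8 teams per block
print(tracks_needed(12))  # 3 tracks for 12 teams per block
```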

Would love to hear any suggestions from competitors/coaches! Hope to be back for Electric Vehicle in 2025.
[Attachment: MIT 2024 Scrambler Scores.png]

Re: MIT Science Olympiad Invitational 2024

Posted: January 22nd, 2024, 3:23 pm
by Unome
I was the event supervisor for Geologic Mapping this year.

As a whole, scores were higher than I expected this year, especially at the lower end. Very few teams scored under 10%, and considering that the test turned out harder than I had intended for it to be when I started writing it, I'm satisfied with the high scores - I think the upper half of teams showed fairly strong understanding of the material compared to the previous rotation of the event.
[Attachment: MIT 2024 Geomaps Scores.png]
Retrospective on specific sections (only open the spoiler tags if you have already taken the test):
  • For Section A, it was nice to see that many teams were able to conceptually understand the uniform slope dip problem (therefore correctly answering question 5 and the first part of question 6). Only a few teams had a strong enough grasp of the math to find the specific solutions, and no teams had it down well enough to answer question 7 correctly, which was approximately what I expected, though I always hope for a little stronger mathematical skills than I end up seeing.
  • Scores for Section B were generally high, with a few teams scoring nearly full points. I probably would change the precise grading scheme for questions 13 and 14 if I had to do that over again, but once I chose a scheme I had to stick to it for reasons of fairness.
  • The vast majority of teams did not read the full intro paragraph for Section C, and so missed out on a point by not including letter N. In general though, most teams ranking in the upper half to upper two-thirds scored well on this section. Also, this Section's total should actually be 26 points - this doesn't substantively affect anything, but I'll have the files corrected for when the test is uploaded by MIT.
  • A few teams were able to partially or completely answer question 35 by interpreting the topographic differences between the regions. For your reference, this map is centered around the region of Camp Douglas in Wisconsin, near the boundary between the Driftless area to the southwest, and the region covered by the Laurentide ice sheet and associated lakes to the northeast.
  • I probably should have made the particular outcropping of Section E less annoying to handle. This is part of how the test turned out harder than I had wanted.
  • For Section G, it seems like many teams had studied depositional sequences to some extent, but lacked a rigorous understanding of the topic, and so weren't able to explain the relationship between the data and associated phenomena. Not really surprising in my opinion, as I think it's one of the most conceptually challenging topics in the event.
  • As it turns out, no one has really mapped this area (a northeastern region of Glacier National Park) since the 80s, when this map was made. I'd have liked for the image quality to be better for sure, and I think I'll try to be more strict with my standards for that in the future. I ended up using it anyway because, on my visual inspection of the prints, the questions were still answerable, and I think this bears out in the scores. That said, I'm a little surprised that no one got the cross-section subsurface geology anywhere near right, since it's fairly clear in my opinion (considering the labeled syncline and copious strike and dip markers).
Note - if anyone has any feedback on the rules this year, I'd love to hear it. We tried to increase the comprehensibility and guidance aspects of the rules this year while not making that many changes to the core content, so feedback from people who aren't that familiar with the event would be extra nice to hear, since the rules need to communicate the event topics to people at all levels of familiarity with the subject matter.

Re: MIT Science Olympiad Invitational 2024

Posted: February 5th, 2024, 12:39 pm
by jaspattack
Hi there! I helped write Forensics for MIT this year. I wasn't super involved with grading (I mostly focused on actually proctoring the event) so I don't have as much to say about the scores as I normally would. I still put together a score distribution for the event which can be seen below:
[Attachment: image.png]
[Attachment: 2.png]

I feel like just from the results here (the high score was a 68%) we could have made the test a little harder, but I'm not unhappy with it. The test was a good bit shorter than MIT's tests have been in previous years, but writing a long test just for the sake of writing a long test is a little silly. I feel like we put together an exam that was fun and did a good job of distinguishing between teams, and that's what matters to me.

We did have a couple issues with the room itself (namely access to reliable gas outlets and microscopes) but I hope it didn't impact your experience too much. I had a great time helping out this year and I hope I can do it again next year.