sinfoniahorn

Members
  • Posts: 5
  • Joined
  • Last visited

  1. I don't have the score sheet in front of me, and it has been a long time since I have seen the breakdown. It may have changed in the several years since I was a participant, but there are many specific points that the judges look for. Since I am talking about two music judges, we can throw out the marching completely. I don't know of any judges who don't watch the show at all, but they are evaluating the music performance, and they are both using the exact same score sheet, so they are evaluating the same criteria. If they both agree on 1st place, what is it that compels them, using the same criteria, to give the same band both a 2nd place and a 21st place? It just doesn't work that way. Both judges were within 2 spots of each other on 10 bands, within 5 spots of each other on 7 bands, and within 10 spots of each other on another 9 bands. They were more than 10 spots apart on 3 bands. Again, 8 of the top 10 were the same for both judges, and both agreed on 8 of 11 bands placing 20th or lower, excluding of course Akins and Langham Creek as the obvious differences. So we can see that they consistently agree on which bands are in the bottom third of the contest and which are in the top third. In addition, for 11th-13th place, Judges 2 and 3 are within 1 spot of each other. To me this all serves as a pretty good litmus test that they saw things very similarly regarding the top, middle, and bottom of the field. Why, then, do they differ so drastically on the two bands mentioned at the top, where there are differences of 19 and 17 positions respectively in their rankings? I just find it odd that they can agree so much in a general sense, yet have these dramatic differences. When judging is based on a formula, there should not be this much of a discrepancy when everything else matches up more or less under the exact same criteria. Fast-forward to Finals, and the results are different.
Marcus was undisputed as the best, and the judges just about all agreed that Akins was last, while the middle of the field was more up in the air. However, Akins received all but one 10th-place vote. Where did the other 10th place go? Judge 1 gave it to Duncanville. This to me is striking as well. All but one judge gave Marcus first; all gave Akins 10th except the one judge who gave Duncanville 10th. To me, this discrepancy doesn't make sense either. Judge 4 gave Duncanville 8th and Akins 10th, but the 3 other judges gave Duncanville 3rd or 2nd and Akins 10th. So Judges 1 and 4 really didn't like Duncanville, but they agreed on Marcus being the best, Reagan being second, and LD Bell being 3rd, which is fairly consistent with the other 3 judges. So why not the same consistency with Duncanville? I'm just saying, look at the official results and it's easy to see that there are spikes in the judging that don't make any sense when you compare the results across the board. http://www.uil.utexas.edu/music/smbc_results2006.html
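The spot-by-spot tally in the post above can be expressed as a few lines of Python. The band names and rankings below are illustrative placeholders, not the actual 2006 UIL results; the point is only to show how counting per-band rank differences separates broad agreement from the handful of drastic splits:

```python
# Hypothetical rankings from two judges (placeholders, NOT real UIL data)
judge2 = {"Band A": 1, "Band B": 2, "Band C": 6, "Band D": 10, "Band E": 21}
judge3 = {"Band A": 1, "Band B": 21, "Band C": 23, "Band D": 9, "Band E": 2}

# Absolute difference in placement for each band
diffs = {band: abs(judge2[band] - judge3[band]) for band in judge2}

within_2 = [b for b, d in diffs.items() if d <= 2]   # near-agreement
outliers = [b for b, d in diffs.items() if d > 10]   # drastic disagreement

print(within_2)
print(outliers)
```

Running this against the real published placements (instead of the made-up ones here) would make the 17-to-19-spot outliers jump out immediately from an otherwise consistent field.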
  2. Actually, I disagree with this statement, Serardian. There is a difference. You may not be able to tell the difference at the state level, but I assure you there are differences. All of these bands are to be commended for the work they have put in, and nothing should ever be taken away from the students. I used to be one of them and know what it is like to have won a state championship myself. However, the difference between 2nd place and 20th place is not as fine as you may think. Judge 2 gave 7th place to Bowie, whereas Judge 3 gave 3rd place to Bowie, so they were fairly close in their estimations of 1st and 3rd place. But 2nd place sat on the opposite end of the field for each. The same can be said for Langham Creek: Judge 2 thought Langham was 6th, whereas Judge 3 thought they were 23rd. As a matter of fact, both judges placed the top 8 bands in prelims in their top 10, which is very consistent with each other. The only 2 bands where this wasn't the case are Akins and Langham Creek. I am curious what was so different about these two bands that bucked the trend of the way they judged the other 8.
  3. All of the judges who are hired have a track record of being successful directors in their own right at the high school or college level. Both judges agree one band is the best, but they are complete opposites on who is second best. It makes me wonder how they could agree on 1st place when one of them thinks the other's 2nd-place band is one of the worst. There are a few conclusions one may draw from that...
  4. I'm not sure that having only one judge would be best. I think having multiple judges avoids a singular bias, but there needs to be accountability. Yes, perusing the results, many of the bands I mentioned were not in contention, but I think there has to be accountability when there are such wide discrepancies. The system can be improved, I think, if judges know they must be responsible for their choices. And if they are judging bands by different standards, then the issue needs to be raised. Again, how can two judges vote one band 1st, while one of them gives 2nd place to a band the other deems worthy of only 21st? Something is wrong there.
  5. Let me preface this post by saying that I have witnessed several State Marching contests, participated in them, and won a state championship as a member of a band during the 1990s. I am a degreed musician, I have seen several of the bands that competed this year within the last 3 to 4 years, and the results of this contest are very surprising. I was hoping to see some in-depth analysis of the competition this year. Reviewing the judges' scores from the UIL website, it seems as if they were all over the place. Akins received a 10, a 21, and a 2 during prelims for their marching. That's an extremely wide margin... Langham Creek receiving a 6 and a 23 for their music? Basically, Judge 2 loved Langham and hated Akins, but Judge 3 hated Langham and loved Akins, giving them 2nd place for music. Surely this cannot be a matter of opinion concerning musical taste or style. Klein had a 6 and a 23 as well for music, while Austin High was in the 20s from every judge except one. As a point of the ridiculousness of it all, Marching Judge 4 gave Austin 9th, while Judge 5 gave Austin 29th. ??? What kind of marching are these judges watching? Both judges agree that Bell is the best, but they completely differ on Austin? How can they both love Bell's technique yet have such a completely different interpretation of Austin? If it's the overall musical effect, I have the same question: what standard is being applied where they both love Bell's show but totally disagree on Austin? It's almost the same with their judging of Bowie. Judge 4 gives Bowie 21st, while Judge 5 gives Bowie 9th. Most other bands are fairly close in their assessments. Judge 1 hated Reagan (23rd), but everyone else loved them (9, 4, 3, 2). LD Bell was the overwhelming choice during prelims, and Duncanville was loved, as they usually are every year. Places 1 through 5, and then 7 and 8, were for the most part in the top 10 of every judge, save one.
Overall, looking at the scores, it is disheartening to see such disparity. You have 3 music judges, and in many cases they place the same band on totally opposite ends of the spectrum. What basis is there for a band to have one judge give them 21st place while another judge gives them 2nd? Are the scoring sheets kept and reviewed? What is the accountability? Surely the scoring sheet has no criterion for whether the judge likes the music selection or not. We are talking about high school bands performing at a high level, and the judging should be done on the merit of the band's performance, not the music choice. I fail to understand how two competent music evaluators can take the same band and we are told at once that it is the 2nd-best musical performance and that it ranks 21st. I would like a competent director to give some feedback. It is obvious that, at the very least, the judges are not evaluating these bands to the same standards.
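One way to put a single number on "how consistently did two judges order the field" is Spearman's rank correlation. Below is a minimal, stdlib-only sketch with made-up rankings (not the real UIL results); it assumes both judges rank the same bands with no ties. Note how just one pair of drastic splits drags the correlation down even when the other eight placements agree:

```python
def spearman(r1, r2):
    # Spearman rank correlation for two tie-free rankings of the same n items:
    # rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1))
    n = len(r1)
    d2 = sum((a - b) ** 2 for a, b in zip(r1, r2))
    return 1 - 6 * d2 / (n * (n * n - 1))

judge_a = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
judge_b = [1, 3, 2, 4, 5, 6, 7, 8, 10, 9]   # minor swaps only
judge_c = [1, 10, 3, 4, 5, 6, 7, 8, 9, 2]   # two bands on opposite ends

print(round(spearman(judge_a, judge_b), 3))  # near-perfect agreement
print(round(spearman(judge_a, judge_c), 3))  # agreement collapses
```

Published results like the linked UIL page could be fed through a function like this to make the "agree overall, wildly split on two bands" pattern quantitative rather than anecdotal.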