NETexasBandFan Posted October 20, 2018 If you ever take a look at UIL Area and even State spreadsheets, you'll often see a LOT of very ridiculous spreads. I know that judges can see different things depending on where they sit and what they focus on, and I don't want to bash any judges, but large spreads in ordinals seem to be an increasingly common issue. Attached are judges' sheets from prelims of 6A Area H, 5A Area E, 4A Area C, and 3A Area C, each from its most recent state year. The 3A Area C marching judges just could not agree; it could be different views on judging, but the spreads got bad in some places. 4A Area C had INSANE spreads, like 4 23 7 2 12, and 19 24 6 10 6. The 5A Area E music judges were all over the place; judge 2 ranked several bands that made finals really low in prelims. For 6A I could definitely find a better example, but it has its spreads too, albeit much more reasonable and common ones. https://imgur.com/a/5rewfrf If I could offer any thought as to why this is common, I feel it's a lack of clarity on the judging sheets. Most categories are rather vague and subjective to the judge, and the few ideas of GE and difficulty the sheets present aren't written out very well. Take "General Appearance" and "Visual Reinforcement of Music": I would read those as the general look of the group at a glance and the drill fitting the music, but those ideas can be defined very differently from judge to judge. Quote
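For anyone crunching these sheets themselves, the "spread" being complained about here is just the gap between a band's best and worst judge ordinal. A minimal sketch in Python, using the two 4A Area C rows quoted above (the function name and everything else are illustrative, not anything UIL publishes):

```python
def ordinal_spread(ranks):
    """Difference between a band's best and worst judge ordinal."""
    return max(ranks) - min(ranks)

# The two 4A Area C spreads called out above:
print(ordinal_spread([4, 23, 7, 2, 12]))   # 21
print(ordinal_spread([19, 24, 6, 10, 6]))  # 18
```

A spread near zero means the panel basically agreed; anything in the double digits means at least one judge saw a very different show.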
Rubisco Posted October 20, 2018 Just in time for UIL Region and Area, as usual! The short answer is yes. But as others will be quick to note, you can find ridiculous discrepancies in most other circuits if you look hard enough. BOA usually fares a little better, but part of that is because BOA has only 2 judges scoring the same thing on any one panel (Music General Effect). I do find it odd when individual performance scores are vastly different from ensemble performance scores, which happens pretty regularly in BOA. How to fix it? Better training is probably the most obvious solution, but, if I'm letting my hair down, I think being more selective about who is allowed to judge is a better one. Unfortunately, to my knowledge, UIL does not appear to have a s***list. They should hit me up! Quote
CTJBandPops Posted October 20, 2018 Just in time for UIL Region and Area, as usual! The short answer is yes. But as others will be quick to note, you can find ridiculous discrepancies in most other circuits if you look hard enough. BOA usually fares a little better, but part of that is because BOA has only 2 judges scoring the same thing on any one panel (Music General Effect). I do find it odd when individual performance scores are vastly different from ensemble performance scores, which happens pretty regularly in BOA. How to fix it? Better training is probably the most obvious solution, but, if I'm letting my hair down, I think being more selective about who is allowed to judge is a better one. Unfortunately, to my knowledge, UIL does not appear to have a s***list. They should hit me up! do you think the more consistent scoring in BOA has anything to do with having a set of judges that they rely on for their events - many of whom are also BOA competing directors? Quote
LostChoirGuy Posted October 20, 2018 I was recently looking back at the prelims results for San Antonio last year and noticed that a large part of the reason Keller jumped from 14th to 9th was their inconsistent GE scoring. In prelims they were ranked 23rd by one music GE judge and 4th by the other. They ended up 15th overall in GE and only barely made finals because they were 10th and 11th in music and visual. Quote
Mash Posted October 20, 2018 Judging always seems to have a judge or two who are on a wildly different page from the others. I am not sure how they can fix it without doing something like gymnastics does, where programs start with a max top score and are marked down for each issue noticed, with a standard for each issue. I personally think that would be a detriment to the programs, as they would all start to look very similar and we would lose variety and innovation. It is frustrating when bands lose places in finals because of "that" judge. We have all seen it, and I think there will always be a judge who views the shows differently than the others. Heck, look at baseball umpires and their strike zones. They have a very controlled thing to watch for and they still can't get balls and strikes right. Quote
Nny14 Posted October 20, 2018 Judging always seems to have a judge or two who are on a wildly different page from the others. I am not sure how they can fix it without doing something like gymnastics does, where programs start with a max top score and are marked down for each issue noticed, with a standard for each issue. I personally think that would be a detriment to the programs, as they would all start to look very similar and we would lose variety and innovation. It is frustrating when bands lose places in finals because of "that" judge. We have all seen it, and I think there will always be a judge who views the shows differently than the others. Heck, look at baseball umpires and their strike zones. They have a very controlled thing to watch for and they still can't get balls and strikes right. DCI used to do that. It was called the "tick" system. I'm sure some old marching band circuits did it that way too, but BOA never has, as far as I know. Quote
Jeffrey L. Gorman Posted October 20, 2018 DCI used to do that. It was called the "tick" system. I'm sure some old marching band circuits did it that way too, but BOA never has, as far as I know. The Cavalcade of Bands in the Mid-Atlantic states and the New York Field Band Conference used to use this format. The problem was that many band directors observed that if you tried something new or challenging, you were punished for things that were not in vogue during that period. The fact is that judging is a difficult thing. Across the country there are different styles of judging. In California, BOA is not as well thought of as it is here, because they have the SCSBOA, NCBA, MBOS, WBA, and several other circuits that each have their own judging standards and judges. Out there it is like a smorgasbord, and often bands that win in one circuit are judged lower in another. We can complain all we want, but each human has different likes and dislikes. Quote
b_radon_inSA Posted October 25, 2018 If you ever take a look at UIL Area and even State spreadsheets, you'll often see a LOT of very ridiculous spreads. I know that judges can see different things depending on where they sit and what they focus on, and I don't want to bash any judges, but large spreads in ordinals seem to be an increasingly common issue. Attached are judges' sheets from prelims of 6A Area H, 5A Area E, 4A Area C, and 3A Area C, each from its most recent state year. The 3A Area C marching judges just could not agree; it could be different views on judging, but the spreads got bad in some places. 4A Area C had INSANE spreads, like 4 23 7 2 12, and 19 24 6 10 6. The 5A Area E music judges were all over the place; judge 2 ranked several bands that made finals really low in prelims. For 6A I could definitely find a better example, but it has its spreads too, albeit much more reasonable and common ones. https://imgur.com/a/5rewfrf If I could offer any thought as to why this is common, I feel it's a lack of clarity on the judging sheets. Most categories are rather vague and subjective to the judge, and the few ideas of GE and difficulty the sheets present aren't written out very well. Take "General Appearance" and "Visual Reinforcement of Music": I would read those as the general look of the group at a glance and the drill fitting the music, but those ideas can be defined very differently from judge to judge. Where can one find the archived UIL spreadsheets? I have been looking for days with no luck. I am a number cruncher and find their scoring intriguing and erratic at the same time. Quote
Rubisco Posted October 26, 2018 do you think the more consistent scoring in BOA has anything to do with having a set of judges that they rely on for their events - many of whom are also BOA competing directors? First of all, I want to say that there are high-quality people judging both BOA and UIL events. That said, BOA does seem to attract people whose only real job in the marching arts is judging bands, drum corps, and winter guards. Sure, you've got your Jay Webbs and Jarrett Lipmans, but a lot of these people are not band directors, or haven't been band directors in a long time. It's not uncommon for Dan Potter or Chuck Henson to announce a judge like: "He has been in the marching arts for 35 years. He has judged for BOA, DCI, and WGI world championships. He is the head judge coordinator for the New Zealand Winter Guard Association. He is an accountant. Judging visual performance - ensemble, from New Brunswick, New Jersey, Mr. Michael McFluff!" I think one of the good things about having career judges is that they judge A LOT, and they do so in circuits that are similar to BOA in terms of judging criteria, like DCI and WGI. So there's a lot of reinforcement. UIL judges, on the other hand, are typically Texas directors who don't do much judging. They'll put in their ten hours or whatever to judge a UIL Area contest, and that's that. I'm generalizing, obviously, but this honestly describes a lot of them. I think the result is that personal criteria seep into the scoring a bit more than in BOA. How else do you explain results like SFA's UIL State result in 2004? SFA ended up 22nd out of 31 bands after one of the music judges had them dead last. I think the judge who had SFA last probably placed a lot more emphasis on tone quality than the one music judge who had them in finals. SFA put on a very rhythmically accurate, passionate performance, but they also blew past the point of good tone quality, producing a pretty consistently strident sound. And that one UIL judge hit them HARD for it. Meanwhile, BOA music judges loved SFA all year. They ended up 4th at BOA Grand Nationals and won the Music Performance caption. I didn't agree with that, but at least BOA was pretty consistent! Quote
4boysmom Posted October 26, 2018 How in the world did that "one judge" give Vista Ridge a 7 in music? That's crazy. Quote
LeanderMomma Posted October 26, 2018 First of all, I want to say that there are high-quality people judging both BOA and UIL events. That said, BOA does seem to attract people whose only real job in the marching arts is judging bands, drum corps, and winter guards. Sure, you've got your Jay Webbs and Jarrett Lipmans, but a lot of these people are not band directors, or haven't been band directors in a long time. It's not uncommon for Dan Potter or Chuck Henson to announce a judge like: "He has been in the marching arts for 35 years. He has judged for BOA, DCI, and WGI world championships. He is the head judge coordinator for the New Zealand Winter Guard Association. He is an accountant. Judging visual performance - ensemble, from New Brunswick, New Jersey, Mr. Michael McFluff!" I think one of the good things about having career judges is that they judge A LOT, and they do so in circuits that are similar to BOA in terms of judging criteria, like DCI and WGI. So there's a lot of reinforcement. UIL judges, on the other hand, are typically Texas directors who don't do much judging. They'll put in their ten hours or whatever to judge a UIL Area contest, and that's that. I'm generalizing, obviously, but this honestly describes a lot of them. I think the result is that personal criteria seep into the scoring a bit more than in BOA. How else do you explain results like SFA's UIL State result in 2004? SFA ended up 22nd out of 31 bands after one of the music judges had them dead last. I think the judge who had SFA last probably placed a lot more emphasis on tone quality than the one music judge who had them in finals. SFA put on a very rhythmically accurate, passionate performance, but they also blew past the point of good tone quality, producing a pretty consistently strident sound. And that one UIL judge hit them HARD for it. Meanwhile, BOA music judges loved SFA all year. They ended up 4th at BOA Grand Nationals and won the Music Performance caption. I didn't agree with that, but at least BOA was pretty consistent! Interesting! But also a bit alarming, for UIL anyway. Also, do they pretty much use the same judges every year, or do they get new ones? I’m not sure which scenario would be better. Quote
principalagent Posted October 26, 2018 UIL allows regions and areas to pick their own judges, so some judges judge the same competitions for YEARS on end. UIL would do much better to randomize assignment from the top level. Quote
LeanderMomma Posted October 26, 2018 UIL allows regions and areas to pick their own judges, so some judges judge the same competitions for YEARS on end. UIL would do much better to randomize assignment from the top level. Oh my goodness...yes, fresh blood please!!! Quote
SpartanBandAlum Posted October 27, 2018 I think part of it is that a lot of UIL judges will latch onto one thing they don’t like about a show and refuse to let go. For instance, I know for a fact that the music judge who had Porter in 19th put us there because the beginning of our opener was horribly out of tune. Quote
ChristopherRoden Posted October 28, 2018 Judging seemed strange at Area E finals yesterday. Clements placed second with ordinals 1, 1, 1, 1, 9, and Pearland won with ordinals 2, 3, 3, 3, 1. Quote
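A note on why one outlier ordinal can flip a result like this: if finals placement comes down to summing the judges' ordinals, with the lower total winning (an assumption that is consistent with the result quoted above, not a statement of UIL's exact tabulation rules), then a single 9th from one judge outweighs four firsts:

```python
def rank_by_ordinal_sum(bands):
    """Order bands by the sum of their judge ordinals, lowest total first.
    This assumes a plain sum-of-ordinals tabulation; UIL's actual
    procedure may differ in detail."""
    return sorted(bands.items(), key=lambda kv: sum(kv[1]))

finals = {
    "Clements": [1, 1, 1, 1, 9],  # total 13
    "Pearland": [2, 3, 3, 3, 1],  # total 12
}
for place, (band, ords) in enumerate(rank_by_ordinal_sum(finals), start=1):
    print(place, band, sum(ords))  # Pearland edges Clements, 12 to 13
```

Under that assumption, a band can take first from four of five judges and still lose, which is exactly the kind of outcome that makes outlier judges so consequential.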
Samuel Culper Posted October 28, 2018 Judging seemed strange at Area E finals yesterday. Clements placed second with ordinals 1, 1, 1, 1, 9, and Pearland won with ordinals 2, 3, 3, 3, 1. The East German judge strikes again! Quote
LeanderMomma Posted October 28, 2018 The East German judge strikes again! No, it's the Russians meddling in our business again. Quote
bandgeekofthecentury Posted October 28, 2018 In our prelims run at Area, our music scores were 1, 5, and 10. The judge who gave us the 10 was the same judge who gave us low scores in 2014 and 2016, which resulted in us missing state. Just something to add to the table. Quote
5 te 6 Posted October 28, 2018 I viewed this thread with only mild interest and a healthy dose of skepticism until I reviewed the score sheets from yesterday's 6A Area competitions. Now I'm a firm believer. Quote
Astros Posted October 28, 2018 In our prelims run at Area, our music scores were 1, 5, and 10. The judge who gave us the 10 was the same judge who gave us low scores in 2014 and 2016, which resulted in us missing state. Just something to add to the table. Feel fortunate; the same judge had CyFalls 18th in music while another judge had Falls 4th in music. Quote
SpartanBandAlum Posted October 29, 2018 I’d like to point out something about 5A Area E that OP didn't include, and that's the ridiculous outlier in Lumberton's scores at finals: 5, 1, 10 in music; 3, 5 in visual. What was Music Judge 3 hearing? Because I know for a fact they sounded great that night. Quote
LeanderMomma Posted October 29, 2018 In our prelims run at Area, our music scores were 1, 5, and 10. The judge who gave us the 10 was the same judge who gave us low scores in 2014 and 2016, which resulted in us missing state. Just something to add to the table. So they DO use the same judges over and over. Interesting... Quote
NETexasBandFan Posted October 29, 2018 Author I like how this thread blew up after Area. Just saying, we should file some suggestions with UIL and TMAA. Quote
Rudedog34 Posted October 29, 2018 I don't normally do this, and keep in mind that our band advanced in Area D and I do not particularly disagree with the results of either prelims or finals. But with the interest in UIL judging protocol popping up all over the threads, I'm going to give my 2 cents. I performed an evaluation of the results (scores - both ordinals and ranks) from prelims of all bands in Area D. Ordinals and ranks all over the place. Yes, multiple worksheets in a spreadsheet, yada yada. My background is engineering, not music, let alone the marching arts, but there are correlations between the two fields that can be used in comparison and analysis. It is obvious from reading almost all, OK - ALL, of the posts in the TxBands forums that most of the concern about UIL scoring in our community is not so much that a certain band was not placed where people wanted; it is more that our community is trying to educate itself about the scoring process. As we educate ourselves, we find that the scoring we thought we would see is inconsistent, anomalous, and pie in the sky at times. I thought nothing of it until I had the bright idea to review Area D's scores. Again, keep in mind I have no issue with the placements in Area D; it's the process and the numbers that baffle me. If scoring were more consistent judge to judge, there would have been placement changes, but all bands that advanced would still have advanced. So here is my soapbox moment: My analysis involved evaluating the ordinals and ranks of all judges and checking them for anomalous scoring by comparing the scores from each judge for each band (e.g. 4, 11, 5 - Henny's prelims ordinals for Music). Where did the "11" come from? Soooo, that made me look deeper. I concluded that Henny wasn't the only band affected. What I found was that out of the 32 bands participating in prelims, the share of scores falling into an anomalous category was 19% for J1, 22% for J2, and 6% for J3.
6%, while still high by my standards, I could live with, but 20% on average from J1 and J2? I feel that is not satisfactory at this level of adjudication. Not to question a judge's experience or education, but what holds judges accountable when they are placed on these panels? Keep in mind that these individuals are NOT gods. This is part of the job description they have chosen to pursue, and THEY should be held accountable for their profession. It is one thing to have an advanced music education or an extensive amount of experience in the marching arts, and another to perform the tasks of an adjudicator in a fair and consistent manner, as is obvious from the scoring in Area D. I haven't looked at any of the other Areas, and from what I read, I don't want to. The percentage of anomalous scores from 2 of the judges is not acceptable in my opinion. These inconsistencies should not exist at this level of professionalism. If it's not the system, then there should be a system of accountability in place that UIL judges have to adhere to. Should an adjudicator's scoring deviate from the standards that are in place (whatever those are - UIL, educate us!), for any reason, they should be evaluated to determine whether they are truly qualified to adjudicate at this level. When I say for any reason: how often are music adjudicators tested for hearing loss? How often are marching adjudicators' eyes tested? They may actually be scoring based on what they hear or see, and their sight or hearing may not be what it once was (a lot of these judges are not youngsters). Have they been evaluated for favoritism? Are they consistently falling into the "anomalous category"? Maybe there are procedures already in place for this from UIL. How would we know as a community? The way I feel about it is that the general marching arts community puts these individuals up on a pedestal and bows to their every whim, and we shouldn't.
They should prove to the community that they are truly qualified to perform these tasks in all aspects of experience, aptitude, and physical ability. We as a community should be educated more by UIL in the intricacies of the judging process, procedures, and accountability, and not rely on hearsay. UIL should hold these adjudicators accountable for their scoring (because that's their job). If an adjudicator's scoring is inconsistent, beyond some percentage, with the rest of the judges on a panel, then they should be evaluated to determine whether they are truly qualified to be on a UIL panel. If deemed not within the standards, they should be sent down to AA (sorry, baseball analogy) and work their way back into form. If they are deemed to show favoritism, they should be dismissed immediately and never used again. If they are just bad at this, then they should be mentored and reevaluated before being allowed to judge again at this level. If they are consistent, fair, and always on their game, then reward them. UIL and the adjudicators of these events should be continually reminded that what they put down on those score sheets is not reviewed by just the directors and staffs of the bands. These documents are reviewed by thousands of boosters, parents, and participants across this great state. Their consistency and professionalism have an effect on the continued success of the marching arts in Texas. If the community starts to have doubts about the adjudicators' abilities, they may have doubts about participation. I know kids who not only quit band but gave up music because of issues with adjudication. This is something that just should not happen. Last note: if UIL has determined that all the adjudicators they assign to panels are truly qualified for the job at hand, then they need to evaluate and correct their scoring system, because it's flawed. I am sure there are several members of the TxBands forums who would be glad to offer suggestions for systems that actually work.
I could think of a few. ...steps off and goes back into his doghouse... Quote
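For anyone wanting to replicate the kind of check Rudedog34 describes, here is one simple way to flag anomalous ordinals: compare each judge's ordinal for a band against the panel median for that band. The threshold value and the Henny row are illustrative assumptions on my part, not necessarily the criterion used in the analysis above:

```python
from statistics import median

def flag_anomalies(ordinals, threshold=5):
    """Flag any judge ordinal sitting more than `threshold` places from
    the panel median for that band. Threshold is an illustrative choice,
    not an official standard."""
    flags = []
    for band, ranks in ordinals.items():
        m = median(ranks)
        for judge, r in enumerate(ranks, start=1):
            if abs(r - m) > threshold:
                flags.append((band, judge, r, m))
    return flags

# The Henny music ordinals quoted above: 4, 11, 5. Judge 2's 11 sits
# six places off the panel median of 5, so it gets flagged.
print(flag_anomalies({"Henny": [4, 11, 5]}))  # [('Henny', 2, 11, 5)]
```

Run over a full Area worksheet, the fraction of flagged rows per judge would give exactly the per-judge anomaly percentages the post describes.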
takigan Posted October 29, 2018 UIL is designed to encompass a large swath of marching band cultures throughout this state, cultural aspects most of us aren't aware of. I think many of our forum readers would be shocked to learn that well over a hundred schools in this state march 7th graders at halftime and at UIL marching competitions (not that that in itself has anything to do with the scores; it's just an illustration). But in the end there is one predominant marching culture that has been trying to assert itself on the system: the DCI system. It views itself as superior to all the other cultures, and it has slowly worked its way into nearly every circuit. It is an evolving system, which means that, in its eyes, any system that prides its culture on tradition is crap that has to go. As long as this culture continues to exert itself on the UIL judging system, there is going to be conflict and confusion of vision and philosophy that skews the scores. UIL cannot be allowed to cater only to the DCI style purely because DCI views itself as an evolving system. There's also the whole thing with it bankrupting bands. No, you don't need a new uniform every year in order to provide "unique characterization of each marching member" as a way to give your show a competitive edge. And no, you don't need to pay some unemployed hipster $10,000 to design something on an app that really requires no formal training or education. You don't need to shell out huge licensing fees to play some turtleneck's esoteric crap when there are plenty of dead guys who actually knew how to write, with free music for the taking. They won't mind if you play it with sousaphones or an iPad. They're dead. And props? Honestly, this is one area where bands go completely nuts with spending that isn't really necessary. Band fees really have gotten completely out of control.
A lot of it is spent on instruction and travel; bands are traveling more and bringing in more part-time help than ever before, which is great, but so much of it goes unnecessarily into design. Quote