When Steel Talks

Everything Related to the Steelpan Instrument and Music

A more scientific approach to obtaining better results in Pan competitions

It grieves me to see the truth in plain sight and people not seeing it. When they brought out the People's Choice, Fonclaire won it one year, in 1971; in those days they added up all the marks of all the judges. Sometime in the late '70s or eighties somebody proposed the idea to drop the highest and lowest marks, "as they do in some other contest somewhere." I have seen numerous score sheets from different bands in Panorama and the festival, and it is all the same thing: the bias and damage seem to come from the lower marks, not the top. Most bands score an average of between 70 and 97 in the prelims, a difference of 27 points; in the semis between 83 and 97, a difference of 14 points; and in the finals between 85 and 98, a difference of 13 points on the high scores from the majority of the judges. On the low end, however, most times it is one or two judges who stray from the general consensus, and usually for certain popular bands. These judges always seem to score some hot favourites as low as an average of between 63 and 71, a difference of 22 points from the lowest high score and 36 points from the highest high marking.

           I want to suggest that they switch to a popularity rating, i.e. use the average of the high marks, which shows less bias: drop off the two lowest scores and keep the rest. If a band is popular with the majority of the judges, the results will all be around the same high mark. For example, I have seen bands get 98, 97, 95, 98 and 96 from 5 judges, a total of 484, while two judges score that band 63 and 67 (a clear bias, and no reflection of the general consensus of the judges). When you drop the highest, the band loses only 1 point at the top; but when you add one of those low marks at the bottom, the total comes to 458, a difference of 26 points and a poor reflection of the band's popularity with the judging panel.

             On the other hand, a less popular band gets fair marks from the same top-scoring judges: for example 95, 94, 93, 91 and 95, total 468, and our two low-scoring judges score that band 98 and 97. Take off the highest and lowest and their score would be 474; drop the two lowest and the total is 479. Even with the clearly biased judges, the more popular band will still be ahead, and the unbiased judges are hardly likely to stray too far. I hope people can see this. Spread the word, please, please.
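To make the arithmetic in the two examples above concrete, here is a minimal sketch in Python (the score lists are the ones quoted in the post; the function names are my own, not any official Panorama terminology) comparing the current "drop the highest and lowest" rule with the proposed "drop the two lowest" rule:

```python
def drop_high_low(scores):
    """Current rule: discard the single highest and single lowest mark."""
    s = sorted(scores)
    return sum(s[1:-1])

def drop_two_lowest(scores):
    """Proposed rule: discard only the two lowest marks."""
    s = sorted(scores)
    return sum(s[2:])

# Seven judges each; marks taken from the examples in the post above.
popular_band      = [98, 97, 95, 98, 96, 63, 67]  # two judges stray far below consensus
less_popular_band = [95, 94, 93, 91, 95, 98, 97]  # the same two judges score it high

for name, scores in [("popular", popular_band), ("less popular", less_popular_band)]:
    print(name, drop_high_low(scores), drop_two_lowest(scores))
```

Under the current rule the two low outliers flip the result (the popular band totals 453 against 474); under the proposed rule the consensus favourite stays ahead (484 against 479), which is the point the post is making.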


Replies to This Discussion

That is a decent observation, Eustace.

This suggestion sounds very reasonable. It would obviously help with the apparent bias, which appears in various competitions. We should give this method a try.

A very interesting analysis, Eustace ... it's worth a shot.

I agree with you, Eustace. I think judges should also have to explain when their marks fall way outside (whether higher or lower) the range established by the panel.

BTW Michael,

I just wanna say "Nuff Respect" to you, as you are one of the few people in leadership roles for a band from T&T to show their face on this forum and give input in the discussions. I also hope that we are still on the "Get Panned in T&T" issue ... Respect

Salah

Ah true Renegade.


I agree with you, Salah.

Michael is one of the few steelband leaders to express a position on this forum.

Most of us on this forum can only express our opinions, create discussions and hope that our ideas are at least considered by those with influence in the pan world.

Michael is one of the few leaders who at least contributes his thoughts and ideas.

Nuff respect, indeed.

This would be cool IF the PAN HONCHOS were bout transparency and fairness. Let's see.

No kidding, Salah. Michael knows first hand 'bout this. I don't want to start no trouble, but a certain year I spoke personally with more than one judge, 4 to be exact, and they told me that one of the hot favourites was the winner on their score sheets. As a matter of fact, only two judges liked the winner as a winner, so 5 out of seven judges had another choice, yet the poor judges get in the firing line for the results, which is the total of the marks after it was dissected. That same year I saw two bands' score sheets, and those marks I post were actual marks. That is questionable, but the arrangers and band management left it alone, and those two judges been there for decades. I will tell you this much: it was not Merle Albino. Pan Trinbago know this. We need to spread the word and seriously petition them to change the format of the results. The judging is generally fair, but the results is not, and that is where there is clear bias. If judges cannot come up to a close average they should be removed; if most judges score in the 90's, a judge needs to explain how they come up with 64, etc.

Thanks, Eustace, for your observation. This biased behavior seems to be the norm for judges, and it seems as though this type of behavior is now being accepted not only in Trinidad but right here in the United States.

St. Claire, I am an expert at bias; I have had a lion's share of it, direct and indirect, in pan and in music in general, in and out of T&T. I was once in a bar where they had a karaoke finals. I was seated near the judges, and when the MC asked the audience "did you like the results?", 3 of the 4 judges said no. I'll say no more.


Hey Apples, the results will still be "scientifically" unreliable. A more reliable result would be to have the judges make their decisions without knowing whom they were judging (a "blind test"). Or have every band play the same tune. Or have a larger pool of judges who have been vetted to show they have no affiliations (bias) with any of the participants. In any event, because there is no controlled environment (i.e. a lab), no manipulation and observation of variables, and no hypothesis testing involved in judging such a competition, you can never have a "more scientific approach". As long as bias influences scores, whether high or low, you will always end up with unreliable results. In reality, unless the entire method changes, there will always be suspicion. Two recent moves have also made the results more unreliable: the judging of participants at their panyards, and the allowing of old tunes, or tunes not related to the traditional rhythms of Carnival, to be included. Because of those two moves, the competition became more unequal for the participants, and the criteria became more muddled and undefined. (What EXACTLY is a "Panorama tune"?) In my opinion, dropping undesirable scores and keeping the others is like putting a band-aid on a cancer sore. The flawed method is still there, like a tumor. The whole Panorama judging thing has been a farce from the "rip" (as the young rappers would say). Too much celebrity in the steelbands and Panorama judging, and not enough honesty. From Captain to Cook. Make that, from Chief (Diaz) to Judge. Ghost.

© 2020   Created by When Steel Talks.