
Skill rankings on BBO


661_Pete


I know this may be a 'touchy' subject and has probably been debated ad nauseam before now!

 

I know BBO is - for me at any rate - a site for merely playing bridge 'for fun' - but nevertheless I find it a bit of an irritation, sitting opposite a partner who has self-rated themselves 'advanced' or 'expert', only to find that they're no better than a beginner...

 

Any solution?

 

My (EBU-based) NGS ranking (derived from live bridge, of course) currently stands at 53%, and I rank myself on BBO as "intermediate", which I think is fair and reasonable. I just wish others would do likewise...

 

Perhaps players' long-term IMP or MP scores (as visible via the "myhands" utility) could be shown on their profiles when you're thinking of joining a table? I know this calculation is not as sophisticated as NGS, which takes into account one's partners' rankings - but it would be a start...
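For illustration only, a minimal sketch of how such a profile figure might be computed from a hypothetical per-board export of the "myhands" records; the file layout and the column names ('scoring', 'imps', 'mp_pct') are assumptions for the example, not BBO's actual format:

import csv
from statistics import mean

def long_term_averages(path, recent=1000):
    # Long-run IMP and matchpoint averages from a hypothetical
    # per-board CSV export of the 'myhands' records.
    imps, mp_pcts = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # 'scoring', 'imps' and 'mp_pct' are assumed column names.
            if row["scoring"] == "IMPs" and row["imps"]:
                imps.append(float(row["imps"]))
            elif row["scoring"] == "MPs" and row["mp_pct"]:
                mp_pcts.append(float(row["mp_pct"]))
    imps, mp_pcts = imps[-recent:], mp_pcts[-recent:]
    return {
        "avg_imps_per_board": round(mean(imps), 2) if imps else None,
        "avg_mp_percentage": round(mean(mp_pcts), 1) if mp_pcts else None,
        "boards_counted": len(imps) + len(mp_pcts),
    }

# long_term_averages("myhands_export.csv") might return something like
# {'avg_imps_per_board': 0.12, 'avg_mp_percentage': 51.3, 'boards_counted': 1000}

A single number like the 51.3% above, shown next to the self-rating, would at least be grounded in actual results.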


 

I know BBO is - for me at any rate - a site for merely playing bridge 'for fun' - but nevertheless I find it a bit of an irritation, sitting opposite a partner who has self-rated themselves 'advanced' or 'expert', only to find that they're no better than a beginner...

 

Any solution?

 

 

Option 1: If you don't find the self ratings useful, don't look at them

 

Option 2: I suppose that BBO could add an option to suppress displaying this information

 

Option 3: I have long been critical of implementing a formal rating system. However, I do see some value in having a "permanent floating Indy" where players are matched with one another for "short" 1-3 board rounds and then remixed. In theory, long-term ladder rank in this event might serve as a useful proxy for a ratings system.


My (EBU-based) NGS ranking (derived from live bridge, of course) currently stands at 53%, and I rank myself on BBO as "intermediate", which I think is fair and reasonable. I just wish others would do likewise...

 

NGS looks like a very good idea, but unfortunately most countries don't have anything similar; otherwise BBO could use that.

As an aside, I noted that there seems to be only 1 woman in the EBU top 50 - quite a surprise to me.


Option 1: If you don't find the self ratings useful, don't look at them
I try not to. But I do often want to look at someone's CC - and I can't do that without catching sight of the rating.

 

Option 2: I suppose that BBO could add an option to suppress displaying this information
I agree. BBO might be a 'happier' place if this were so... :rolleyes:

As an aside, I noted that there seems to be only 1 woman in the EBU top 50 - quite a surprise to me.
Sad - especially seeing as two of the three top players in our club are women.

 

To get in the 'top 50' you really have to be a wizard! I'm somewhere in the lower reaches of the "top 10,000". Suits me. :)

 

Incidentally, what comparable schemes are in place in other countries? USA, for instance?


Sad - especially seeing as two of the three top players in our club are women.

That's what surprises me: usually more than half the club population is female, and they hold their own.

Women are clearly under-represented in international bridge, but it's easy to imagine socio-economic explanations for some of that imbalance, and it is not as extreme as 2%. And below that I would expect the curve to tend rapidly to 50%.

 

Incidentally, what comparable schemes are in place in other countries?

In Italy, none. There is a pseudo-ranking in (currently) 12 categories based upon the equivalent of masterpoints, which says more about recent activity than current skill, although points do decay and one can be relegated by one category per year. There are also points for successes in major tournaments, which are permanent and do reflect skill, but these are only relevant to the top players - most players will never accumulate such a point in their lives.


What I find even worse than the deluded self-assessed skill ratings on BBO is the fact that the EBU, the ACBL and BBO itself still contrive to base ratings on how much bridge you have played, as opposed to how much you know and/or how many tournaments, and at what level, you have won or participated in.

 

Yes, in theory, the more bridge you play the better you should get. In practice, as everyone knows, that is not always the case. I've said it before, and I'll say it again: the world of chess, with its Elo rating system, has a far more accurate measure. Bridge has an equivalent: Bibo Zahlen.
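For concreteness, a minimal sketch of how a chess-style Elo update could be adapted to a pairs session scored as a matchpoint percentage; the K-factor and the use of the session percentage as the Elo 'score' are assumptions for illustration, not necessarily how Bibo Zahlen does it:

def expected_score(rating, opp_rating):
    # Standard Elo expectation: a probability-like score in [0, 1].
    return 1.0 / (1.0 + 10 ** ((opp_rating - rating) / 400.0))

def update_rating(rating, field_rating, session_pct, k=20):
    # session_pct is the matchpoint percentage for the session (0-100),
    # used here as the Elo 'score'; field_rating is the average rating
    # of the opposing field. Both choices are assumptions.
    actual = session_pct / 100.0
    expected = expected_score(rating, field_rating)
    return rating + k * (actual - expected)

# A 1500-rated player scoring 55% against a 1500-average field gains one
# point: update_rating(1500, 1500, 55) -> 1501.0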

 

https://www.bridgebase.com/forums/topic/61717-bridge-rating-for-bbo-wwwbibo-zahlende/

 

"Lost in the long grass" is an expression that comes to mind...


I've said it before, and I'll say it again: the world of chess, with its Elo rating system, has a far more accurate measure. Bridge has an equivalent: Bibo Zahlen.

 

https://www.bridgebase.com/forums/topic/61717-bridge-rating-for-bbo-wwwbibo-zahlende/

 

"Lost in the long grass" is an expression that comes to mind...

 

I had a look at Bibo Zahlen and the grass looks very long indeed B-)

bboskill.com was more immediate, but BBO stopped that site (and probably all others) from crawling the 'my hands' archive.

BBO have said repeatedly that they don't want a ranking, so I guess we just have to accept that.

I agree with you that there should be a ranking outside of BBO, though - the WBF would gain credibility if it took on the challenge rather than leaving it to the initiative (or lack thereof) of individual RAs.


BBO have said repeatedly that they don't want a ranking, so I guess we just have to accept that.

I agree with you that there should be a ranking outside of BBO, though - the WBF would gain credibility if it took on the challenge rather than leaving it to the initiative (or lack thereof) of individual RAs.

 

FWIW, I have long been critical of suggestions that BBO implement a rating system.

 

In my mind, trying to come up with a system that

 

  1. Is accurate
  2. Is simple enough for end users to understand
  3. Won't cause a socio-political ***** storm

 

is too tough a row to hoe.

 

I personally think that this problem is even more difficult for national bridge organizations, let alone the WBF.

 

BBO has the advantage of perfect record keeping. It sees / records every single bid that you make and board that you play. As such, they have - by far - the best data set to develop a good rating system. The higher you get in the food chain, the worse the record keeping and the smaller the number of boards that get played. As such, organizations like the WBF are in remarkably bad positions to implement these types of systems.

 

(Periodically, I see claims that the WBF or the USBF or whoever wants to improve their seeding procedure or maybe even implement a ratings scheme. To which I inevitably reply, "Are you willing to record results on a board-by-board basis rather than match by match or tournament by tournament?" And the folks who claim that they want a better seeding system suddenly decide that they don't actually care about this if it means implementing a process for improving their data collection.)

 

With all this said, I do think that the "permanent floating Indy" I alluded to earlier might scratch many of the same itches while avoiding those pitfalls...

 

Imagine a system in which all BBO players have the option to enter an Indy style event that is running 24x7.

 

  • The first time you play, you are assigned a position in the middle of the ladder.
  • You are matched with three other players who are close to your own level for a small number of hands (somewhere between one and three seems reasonable).
  • Your score gets compared to those of other players who play the hands at (approximately) the same time.
  • If you do well, you move up the ladder. If you do poorly, you move down the ladder.
  • The ladder rank can serve as a proxy for your skill.

 

I originally proposed this scheme a few years back...

 

BBO currently is running a variant of it in the form of day long tournaments...

(I'd prefer to see it as a ladder, but such is life)
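As a toy illustration of how such a ladder could pair people up and move them after each short round, here is a minimal sketch; the group size of four, the reseeding rule and the data structures are all assumptions made up for the example:

class Ladder:
    # Toy 'permanent floating individual' ladder (illustrative only).

    def __init__(self):
        self.ranking = []                     # player names, best first

    def join(self, player):
        # New entrants start in the middle of the ladder.
        if player not in self.ranking:
            self.ranking.insert(len(self.ranking) // 2, player)

    def next_table(self, waiting, size=4):
        # Seat the 'size' waiting players whose ladder positions are
        # closest together.
        waiting = sorted(waiting, key=self.ranking.index)
        if len(waiting) < size:
            return None
        best = min(range(len(waiting) - size + 1),
                   key=lambda i: self.ranking.index(waiting[i + size - 1])
                               - self.ranking.index(waiting[i]))
        return waiting[best:best + size]

    def record_result(self, table_scores):
        # table_scores maps player -> comparison score for the round.
        # The players are reseeded within their current slots, so a good
        # score moves you up and a poor one moves you down.
        slots = sorted(self.ranking.index(p) for p in table_scores)
        by_score = sorted(table_scores, key=table_scores.get, reverse=True)
        for slot, player in zip(slots, by_score):
            self.ranking[slot] = player

Long-term ladder position, rather than any single result, would then be the proxy for skill.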


Perhaps the main problem with the current system of self-rating is that it puts good players off playing on BBO. When I’ve asked others at my club if they use BBO, they answer no because “everyone thinks they are an expert”. Whilst I don’t think they object to playing with inexperienced players, they don’t wish to play with deluded ones. Also, because there are so many “experts” around, it is likely that those who put down more realistic assessments of their skill, say “intermediate” or “advanced”, are assumed by many to be complete beginners.

 

If using a “BBO Skill” type rating system is not an option, maybe the best solution would be for players to state objective facts about their experience rather than the current subjective system. So, for example, instead of a skill rating your profile could show how many years you have been playing, or what level you have played at (e.g. “I have represented my club/region/country”). Of course this wouldn’t stop players from lying, but somehow I think they are less likely to lie about something factual than about their own opinion.


I think it's quite different now for those on BBO who play "social" bridge with human partners and opponents and the people who prefer Daylong tournaments with robots. For the latter category the ranking is much more important, and something could be done to improve the existing system of BBO masterpoints, because when you play against robots, self-improvement is the main "hook" and you need a better way to measure it.

As regards "live" bridge with human partners - I think you just have to invest some time and form a couple of friendships/partnerships. If you have 5-10 people whom you can trust, then you will probably always have a partner when you want to play.


Perhaps the main problem with the current system of self-rating is that it puts good players off playing on BBO. When I’ve asked others at my club if they use BBO, they answer no because “everyone thinks they are an expert”.

 

I am quite sure that there are plenty of people who find the current system of self rating annoying (people find LOTS of stuff annoying)

 

However:

 

1. I sincerely doubt that many people don't play at BBO because of self ratings

2. Anyone who is so sensitive that they might not play because of self ratings would doubtless find some other reason to quit in a day or two's time

 

So, if this is all we have to complain about, I don't much care

 

[Personally, I think that unsolicited lessons and gratuitous complaints from idiots are far more annoying]


 

If using a “BBO Skill” type system of rating is not an option maybe the best solution would be for players to state objective facts about their experience rather than the current subjective system. So, for example, instead of skill rating your profile could show how many years you have been playing, or what level you have played at (e.g. “I have represented my club/region/country”). Of course this wouldn’t stop players from lying but somehow I think they are less likely to lie about something factual rather than their own opinion.

 

Once upon a time there was supposed to be a linkage between self ratings and real world accomplishments


The lack of ability to accurately self-analyze and self-assess is rampant throughout humanity. Why should it surprise anyone that it shows up at the bridge table?

Do we really think that debating solutions to a problem that is unsolvable, because it is in the natural order of things, is anything other than a waste of time? Better to adjust your attitude and accept reality.


Once upon a time there was supposed to be a linkage between self ratings and real world accomplishments

I think there still is; at least there is guidance, such as “Expert” meaning “Have had success at national level” or something similar. The problem with this is that I doubt many look at or take any notice of this guidance. (The fact that I can’t currently find it shows how little importance BBO places on it.) Also, to make things worse, BBO actually says (unless it has changed) “others lie about their skill level, you can too”. My guess is that if your profile stated explicitly something like what level of tournaments you have won (e.g. club/regional/national/international), or some other objective measure, players would be more likely to give accurate answers.


I have now found the guidance, which is:

 

“Novice - Someone who recently learned to play bridge

Beginner - Someone who has played bridge for less than one year

Intermediate - Someone who is comparable in skill to most other members of BBO

Advanced - Someone who has been consistently successful in clubs or minor tournaments

Expert - Someone who has enjoyed success in major national tournaments

World Class - Someone who has represented their country in World Championships”

 

All I’m suggesting is that this, or something similar, be shown explicitly in the profile so that players can clearly see what “Expert” is supposed to mean. Most seem to think that all it requires is that you have heard of lots of conventions, not that you can actually play a decent game of bridge.


FWIW, I have long been critical of suggestions that BBO implement a rating system.

 

In my mind, trying to come up with a system that

 

  1. Is accurate
  2. Is simple enough for end users to understand
  3. Won't cause a socio-political ***** storm

 

is too tough a row to hoe.

I guess that boils down to a socio-political storm, as the first two seem feasible. It's true that on some other gaming sites the existence of a ranking has resulted in people doing bizarre or reprehensible things to maintain or increase their ranking. It's also possible that some paying members might disappear, which is probably what worries BBO.

 

 

With all this said, I do think that the "permanent floating Indy" I alluded to earlier might scratch many of the same itches while avoiding those pitfalls...

 

Imagine a system in which all BBO players have the option to enter an Indy style event that is running 24x7.

 

  • The first time you play, you are assigned a position in the middle of the ladder.
  • You are matched with three other players who are close to your own level for a small number of hands (somewhere between one and three seems reasonable).
  • Your score gets compared to those of other players who play the hands at (approximately) the same time.
  • If you do well, you move up the ladder. If you do poorly, you move down the ladder.
  • The ladder rank can serve as a proxy for your skill.

Sounds workable and fun to me.

 

 

I personally think that this problem is even more difficult for national bridge organizations, let alone the WBF.

 

BBO has the advantage of perfect record keeping. It sees / records every single bid that you make and board that you play. As such, they have - by far - the best data set to develop a good rating system. The higher you get in the food chain, the worse the record keeping and the smaller the number of boards that get played. As such, organizations like the WBF are in remarkably bad positions to implement these types of systems.

 

(Periodically, I see claims that the WBF or the USBF or whoever wants to improve their seeding procedure or maybe even implement a ratings scheme. To which I inevitably reply, "Are you willing to record results on a board-by-board basis rather than match by match or tournament by tournament?" And the folks who claim that they want a better seeding system suddenly decide that they don't actually care about this if it means implementing a process for improving their data collection.)

The EBU seems to have demonstrated that it is feasible for national/zonal organisations to automatically maintain an accurate skill ranking. Such organisations have in their databases the final local result of almost every tournament played. Their NGS system looks sufficiently complete and versatile to adapt to most if not all other organisations. In any case the WBF doesn't need to impose a standard system (although this would be ideal) but could just collate the ranking from each organisation into a single WBF ranking. The results of such a collation might be too arbitrary at the very top level to fully substitute for other seeding criteria, but it would surely be better than what happens now for purposes such as evaluating multinational players in ACBL tournaments, and it would be valid for open internet platforms.
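To make the collation idea concrete, a minimal sketch of merging separate national gradings into one list by converting each player's grade into a percentile within their own organisation; the data layout and the percentile approach are assumptions for illustration, not an actual WBF proposal:

def collate_rankings(national_lists):
    # national_lists: {organisation: {player: grade}}, where a higher
    # grade is better within each organisation (an assumption).
    # Returns (player, organisation, percentile) rows, best first.
    merged = []
    for org, grades in national_lists.items():
        ordered = sorted(grades.items(), key=lambda kv: kv[1])
        n = len(ordered)
        for i, (player, _) in enumerate(ordered):
            percentile = 100.0 * (i + 1) / n     # 100 = top of that org
            merged.append((player, org, percentile))
    return sorted(merged, key=lambda row: row[2], reverse=True)

# Players at the top of their own organisation's list come out level,
# whatever units each organisation happens to grade in.

Percentiles are crude at the very top, which is why such a collation would suit general comparison better than seeding world championships.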


 

The EBU seems to have demonstrated that it is feasible for National/Zonal organisations to automatically maintain an accurate skill ranking.

 

I know that the EBU has implemented a dynamic rating system.

 

I have not seen much convincing evidence regarding its validity.


I know that the EBU has implemented a dynamic rating system.

 

I have not seen much convincing evidence regarding its validity.

 

This document goes into detail and seems forthcoming and objective about known or potential limitations. The EBU clearly put in a lot of thought and work, although I don't see independent validation. There are several EBU players active on the forum; maybe they could comment on any known issues.

 

I think it would work in Italy pretty much off the shelf. No obvious problems of inaccessible data or unusual tournament types, though maybe less mobility between clubs would lead to longer times for the ranking to stabilise.


This document goes into detail and seems forthcoming and objective about known or potential limitations. The EBU clearly put in a lot of thought and work, although I don't see independent validation. There are several EBU players active on the forum; maybe they could comment on any known issues.

 

From my perspective, the proof is in the pudding, by which I mean I am less interested in exposition trying to justify the system than in its ability to be used for prediction.

 

  • How accurate are the predictions made by this system?
  • How do these results compare to other plausible designs?

I can't help but believe that this type of simple linear model would be significantly outperformed by a machine learning model able to crunch results on a board-by-board basis. (This looks like another classic example where the organization is unwilling to do the basic data collection that would allow it to significantly improve the results.)
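As an illustration of what judging a system by its predictions could look like, a minimal backtest sketch that scores a rating scheme by how well the pre-session grade gap predicts the actual result; the linear mapping, the slope and the error metric are assumptions chosen purely for the example:

def predicted_pct(grade_gap, slope=0.5):
    # Map a grade gap (our grade minus the field's, in percentage
    # points) to a predicted session percentage. The linear form and
    # the slope are assumptions.
    return max(0.0, min(100.0, 50.0 + slope * grade_gap))

def backtest(sessions):
    # sessions: iterable of (our_grade, field_grade, actual_pct).
    # Returns the mean absolute prediction error; lower means the
    # grades carry more predictive information.
    errors = [abs(predicted_pct(ours - field) - actual)
              for ours, field, actual in sessions]
    return sum(errors) / len(errors)

# Competing designs (NGS-style grades, Elo variants, a model trained
# board by board) could then be compared on the same held-out sessions
# simply by which yields the lower error.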


This document goes into detail and seems forthcoming and objective about known or potential limitations. The EBU clearly put in a lot of thought and work, although I don't see independent validation. There are several EBU players active on the forum; maybe they could comment on any known issues. I think it would work in Italy pretty much off the shelf. No obvious problems of inaccessible data or unusual tournament types, though maybe less mobility between clubs would lead to longer times for the ranking to stabilise.

The EBU (English Bridge Union) NGS (National Grading System) seems to work well.


From my perspective, the proof is in the pudding, by which I mean I am less interested in exposition trying to justify the system than in its ability to be used for prediction.

 

  • How accurate are the predictions made by this system?
  • How do these results compare to other plausible designs?

I can't help but believe that this type of simple linear model would be significantly outperformed by a machine learning model able to crunch results on a board-by-board basis. (This looks like another classic example where the organization is unwilling to do the basic data collection that would allow it to significantly improve the results.)

 

It would be nice to see an independent comparison and evaluation of the various designs, I agree. A review of the methods by statisticians (which the EBU say they did to their own satisfaction) might be even more telling.

 

I may be missing something, but I don't see anything terribly wrong with using final tournament results rather than board-by-board results, which would require a uniformity and integration of software that is not always there yet in traditional bridge. Yes, I might not have played against the strongest or weakest pairs in the other line, or penalties might impact slightly differently, but I did play them all with the same partner, and in time it should all even out. The EBU does use club-level results rather than national-level results in simultaneous tournaments in order to eliminate bias due to the different strengths of NS pairs vs EW pairs, which is something that can skew results quite significantly.

