nige1 Posted February 23, 2017
Google DeepMind's AlphaGo has won over 60 games against top professional Go players.
Demis Hassabis's Theory of Everything
Will DeepMind's general-purpose learning machine, based on neural networks, easily learn to play Bridge and beat world-class players?
kenberg Posted February 23, 2017
I made my guess. Really a guess; I just want to brag about casting the first vote! I read a bit of Tom Friedman's new book, Thank You For Being Late, but the library wanted it back. The times they are a-changin'.
barmar Posted February 24, 2017
I voted later for 2 reasons:
1. Bridge is a game of imperfect information, inference, cooperation, and psychology. These aspects don't exist in games like chess and go, which basically require searching and pattern matching.
2. No one seems to be seriously working on trying to apply advanced AI techniques like neural networks to bridge.
y66 Posted February 24, 2017
Five years after IBM or Google or someone with their resources decides to pursue this.
nige1 Posted February 24, 2017
Five years after IBM or Google or someone with their resources decides to pursue this.
Conventional AI programs like Jack and WBridge5 are almost there. Before AlphaGo, a Go professional would annihilate any Go program. The general-intelligence neural-net approach, pioneered by DeepMind, allows a robot to accept a fresh challenge and become expert with minimal human intervention. The robot might need some help interpreting opponents' system notes, but it can induce its own heuristics and evaluation methods and refine them by playing against itself. These programs are not just learning to conquer new challenges; they also seem to be learning how to learn. Hence, once DeepMind or whoever has risen to the intriguing challenge of Bridge, progress is likely to be electrifyingly fast.
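The "refine them by playing against itself" idea can be pictured with a toy sketch. This is nothing like DeepMind's deep networks: the policy here is a single bid-game threshold, the payoff numbers and the 25-point "game makes" rule of thumb are invented for illustration, and the learning rule is plain hill-climbing in which a candidate replaces the incumbent only if it outscores it on the same random deals.

```python
# Toy self-play refinement -- purely illustrative, not DeepMind's method.
# The "policy" is one number: bid game when combined strength reaches the threshold.
import random

def score(threshold, strength):
    """Crude payoff: bidding a making game is good, going down or missing game is bad."""
    bid_game = strength >= threshold
    makes = strength >= 25          # assumed rule of thumb: game makes with 25+ points
    if bid_game:
        return 10 if makes else -6
    return 5 if not makes else -5   # passing is only right when game would have failed

def match(candidate, incumbent, deals):
    """Head-to-head: both thresholds are scored on the same deals."""
    return sum(score(candidate, s) - score(incumbent, s) for s in deals)

def self_play_refine(threshold=20.0, generations=200, noise=1.0, deals_per_match=400):
    for _ in range(generations):
        candidate = threshold + random.gauss(0, noise)
        deals = [random.uniform(0, 40) for _ in range(deals_per_match)]
        if match(candidate, threshold, deals) > 0:
            threshold = candidate   # keep the candidate only if it beats the incumbent
    return threshold

if __name__ == "__main__":
    # With these invented payoffs the threshold should drift toward roughly 25.
    print("refined game-bidding threshold:", round(self_play_refine(), 1))
```

A real system would replace the single threshold with a large network and the invented payoffs with actual bridge scoring, but the loop of generating a variant, letting it play the current version, and keeping whichever wins is the self-improvement nige1 is describing.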
mycroft Posted February 24, 2017
One of the issues (besides the incomplete information of play state) that makes bridge difficult is that the language of auction and play is not static. Getting a great bridge player, given an explanation of the auction in a way it can understand, isn't that hard. Getting a great bridge bidder, given an explanation of the auction in a way it can understand, is somewhat more difficult. Note that if we apply neural net tools to develop a bidding system, it will likely a) be totally unrecognizable to humans (viz Chthonic) and as a result b) be unable to be played with a random partner (even another neural net that wasn't trained the same way). There's a reason that the "world computer bridge championships" have a *very* restrictive convention chart.
yunling Posted February 26, 2017
From jackbridge.com, published in IMP-bridge magazine:
To be able to gauge Jack's playing strength objectively, the Jack team organized several matches against human opponents. In April and May of 2005, Jack played 28-board matches against seven strong pairs. All but the first of the following pairs play in the highest echelons of Dutch bridge, some having represented the Netherlands in European and World Championships.
Jack - Bart Nab & Gert-Jan Paulissen: 26 - 90
Jack - Paul Felten & Eric van Valen: 43 - 60
Jack - Erik Janssen & Jeroen Top: 43 - 51
Jack - Vincent Ramondt & Berry Westra: 45 - 53
Jack - Jan van Cleeff & Vincent Kroes: 61 - 46
Jack - Hanneke Kreijns & Just vd Kam: 74 - 53
Jack - Ton Bakkeren & Huub Bertens: 67 - 32
"So using imagination and judgement in ways that will not confuse your partner gives you a big advantage against Jack. That is, if you don't tell Jack what might be going on and his programmers have not anticipated it. Importantly, we have learned from these matches that disclosure about partnership methods when playing against human opponents is different from disclosure when playing against other computers. If Jack is to play matches against humans again, some clear procedures for more thorough disclosure must be provided."
These problems have not been solved in the years since, and I believe computer bridge has made little progress since then. The programs depend too much on double-dummy simulations and are not designed to put enough pressure on opponents. We can probably learn a lot from Texas hold'em, in which a computer beat top human players just this year. I voted for 2021, but it may take longer. Technically this is not that hard; the game just isn't attractive enough to top developers, I think.
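To make the double-dummy point concrete, here is a rough outline of how the simulation-based programs yunling is criticising typically choose a card: sample layouts of the unseen cards, ask a double-dummy solver how each legal card would fare on each layout, and play the card with the best average. Everything below is a simplification -- sample_hidden_hands ignores the inferences available from the bidding and play, and double_dummy_tricks is a stub standing in for a real solver -- so treat it as a sketch of the shape of the algorithm, not of any particular program.

```python
# Sketch of Monte Carlo card selection over double-dummy samples (placeholders throughout).
import random
from collections import defaultdict

RANKS = "23456789TJQKA"
SUITS = "SHDC"
DECK = [r + s for s in SUITS for r in RANKS]

def sample_hidden_hands(known_cards, seed=None):
    """Deal the unseen cards at random between the two hidden hands.
    A real program would also weight or filter deals against the auction and play so far."""
    rng = random.Random(seed)
    unseen = [c for c in DECK if c not in known_cards]
    rng.shuffle(unseen)
    return unseen[:13], unseen[13:26]

def double_dummy_tricks(candidate_card, layout):
    """Placeholder: real engines call an exhaustive double-dummy solver here."""
    return random.randint(0, 13)

def choose_card(legal_cards, known_cards, samples=100):
    """Monte Carlo: average each card's double-dummy result over sampled layouts."""
    totals = defaultdict(float)
    for i in range(samples):
        layout = sample_hidden_hands(known_cards, seed=i)
        for card in legal_cards:
            totals[card] += double_dummy_tricks(card, layout)
    return max(legal_cards, key=lambda c: totals[c] / samples)

if __name__ == "__main__":
    known = DECK[:26]   # toy example: declarer sees dummy plus own hand
    print(choose_card(legal_cards=known[:5], known_cards=known))
```

Nothing in this loop models why an opponent chose a particular call or card, which is one way to read the Jack team's observation that imaginative play that doesn't confuse partner gains against the program.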
1eyedjack Posted February 26, 2017
Jack - Bart Nab & Gert-Jan Paulissen: 26 - 90
Jack - Paul Felten & Eric van Valen: 43 - 60
Jack - Erik Janssen & Jeroen Top: 43 - 51
Jack - Vincent Ramondt & Berry Westra: 45 - 53
Jack - Jan van Cleeff & Vincent Kroes: 61 - 46
Jack - Hanneke Kreijns & Just vd Kam: 74 - 53
Jack - Ton Bakkeren & Huub Bertens: 67 - 32
"So using imagination and judgement in ways that will not confuse your partner gives you a big advantage against Jack. That is, if you don't tell Jack what might be going on and his programmers have not anticipated it."
Am I missing something, or is it impossible to tell from the above who won? Either way it looks like Jack won some and lost some. Impressive.
sakuragi Posted March 4, 2017
Voted later. Not that I don't believe a machine could become a strong bridge player; it's just that bridge doesn't generate enough interest for them to work on it. :ph34r: :ph34r:
gixxer1000 Posted March 20, 2017
I think that while Bridge programs will continue to get better over time, I doubt they will ever be able to consistently beat top bridge players. Because Bridge is a very psychological game and little information is open (only the dummy and your own hand are visible, whereas chess and go are games of open information), it will be very hard to create a program that can beat top human players. There is also a lack of interest in Bridge within computer science.