Shawn Presser has released an intriguing chess engine based on a deep-learning language model (GPT-2). The model was trained on the Kingbase dataset (3.5 million chess games in PGN notation) in 24 hours using 146 TPUs (ouch!). The engine is purely text prediction, with no built-in concept of chess. Though GPT-2 has already delivered promising, even stunning, results for text generation, one can be skeptical and wonder whether it actually works for chess.| blog.mathieuacher.com
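To make the "chess as text prediction" idea concrete, here is a minimal sketch of prompting a GPT-2 model with a PGN move prefix and sampling the continuation. The stock `gpt2` checkpoint is only a placeholder: Presser's engine relies on weights fine-tuned on the Kingbase PGN corpus, which this example does not reproduce.

```python
# Sketch: treat move generation as plain text completion over PGN notation.
# Assumes the Hugging Face `transformers` library; "gpt2" stands in for the
# actual chess-fine-tuned checkpoint, whose name is not given here.
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# A game prefix in PGN notation; the "engine" simply continues the text.
prefix = "1. e4 e5 2. Nf3 Nc6 3. Bb5 a6 4."

inputs = tokenizer(prefix, return_tensors="pt")
output = model.generate(
    **inputs,
    max_new_tokens=8,          # roughly one or two half-moves of PGN text
    do_sample=True,
    top_k=50,
    pad_token_id=tokenizer.eos_token_id,
)
continuation = tokenizer.decode(output[0][inputs["input_ids"].shape[1]:])
print(prefix, continuation)
# Nothing guarantees the sampled text is a legal move: the model never sees a
# board, which is exactly the source of the skepticism in the blog post.
```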
Scientist, author and entrepreneur, known as a leading voice in AI. Six books including The Algebraic Mind, Rebooting AI, and Taming Silicon Valley; NYU Professor Emeritus.| substack.com