Evaluating the transformer architecture for the game of chess

Organisation name
Institute of Computer Science
Abstract
Transformer-based language models have been successfully applied to tasks well beyond natural language, such as protein structure prediction.
This project will use a language transformer to learn the rules of the game of chess and to evaluate whether similarity between the contextual embeddings it produces corresponds to similarity between chess positions. Although earlier research has trained GPT-2 on PGN games to learn the rules of chess, we will go beyond that work in two ways:
1. Instead of feeding PGN (Portable Game Notation) records directly to the transformer, we will convert each PGN game into a sequence of FEN (Forsyth–Edwards Notation) positions. The intuition is that a FEN string describes the complete board state, unlike a single PGN move (see the conversion sketch after this list).
2. We will examine the generated contextual embeddings to see whether similarity in the transformer sense translates into similarity in the chess sense, i.e., whether positions with similar embeddings are indeed similar chess positions (see the similarity sketch below).
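As a concrete illustration of the first point, the sketch below converts every game in a PGN file into the sequence of FEN strings of the positions it passes through. It is only a minimal sketch: it assumes the python-chess library, and the file path and function name are placeholders, not part of the project description.

```python
import chess.pgn

def pgn_to_fen_sequences(pgn_path: str):
    """Yield, for each game in a PGN file, the list of FEN strings of
    every position reached in the game (starting position included)."""
    with open(pgn_path, encoding="utf-8") as handle:
        while True:
            game = chess.pgn.read_game(handle)
            if game is None:  # end of file
                break
            board = game.board()
            fens = [board.fen()]
            for move in game.mainline_moves():
                board.push(move)
                fens.append(board.fen())
            yield fens
```

Each yielded list can then be serialised (for example, FEN strings joined by a separator token) to form one input sequence for the transformer; the exact serialisation is left open here.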
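For the second point, the following sketch shows one way the embedding comparison could look. It assumes a Hugging Face GPT-2-style checkpoint; the stock "gpt2" weights only stand in for the chess-trained model, and mean pooling over the final hidden states is just one possible pooling choice.

```python
import torch
from transformers import GPT2Model, GPT2TokenizerFast

# "gpt2" is a placeholder for the chess-trained checkpoint this project would produce.
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2Model.from_pretrained("gpt2")
model.eval()

def embed_fen(fen: str) -> torch.Tensor:
    """Mean-pool the final hidden states of a FEN string into one vector."""
    inputs = tokenizer(fen, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # shape: (1, seq_len, dim)
    return hidden.mean(dim=1).squeeze(0)

def embedding_similarity(fen_a: str, fen_b: str) -> float:
    """Cosine similarity between the embeddings of two chess positions."""
    a, b = embed_fen(fen_a), embed_fen(fen_b)
    return torch.nn.functional.cosine_similarity(a, b, dim=0).item()
```

Pairs of positions scored as similar by this measure would then be inspected to judge whether they are also similar in chess terms.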
We will use the interface of the free site lichess.org to download a collection of chess games. The games can be filtered by several criteria (the players' strength, measured as Glicko-2 ratings, the number of moves, etc.).
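A sketch of the filtering step, assuming the games arrive as a PGN dump in the usual lichess export format, where the WhiteElo and BlackElo headers carry the players' Glicko-2 ratings; the thresholds and the header handling are illustrative only.

```python
import chess.pgn

def filter_games(pgn_path: str, min_rating: int = 2000, min_full_moves: int = 20):
    """Yield games where both players meet a minimum rating and the game
    is long enough (mainline_moves counts half-moves, hence the factor 2)."""
    with open(pgn_path, encoding="utf-8") as handle:
        while True:
            game = chess.pgn.read_game(handle)
            if game is None:
                break
            try:
                white = int(game.headers.get("WhiteElo", 0))
                black = int(game.headers.get("BlackElo", 0))
            except ValueError:  # missing or "?" ratings
                continue
            plies = sum(1 for _ in game.mainline_moves())
            if white >= min_rating and black >= min_rating and plies >= 2 * min_full_moves:
                yield game
```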
Requirements: the student should be familiar with the transformer architecture and the game of chess. The project is suitable for a master's student or an ambitious bachelor's student.
Year of thesis defence
2022-2023
Supervisor
Eduard Barbu
Language(s) of communication
English
Requirements for the candidate
Level
Bachelor's, Master's
Keywords
#transformer, #language, #chess

Application contact

Name
Eduard Barbu
Tel
E-mail
eduard.barbu@ut.ee