Forums

ChatGPT has broken chess

dendrohypostasis103700

Yes, you know the title, ChatGPT has broken chess

Why it's crazy for me:
1. It feels illegal
2. How is it possible
3. Just, why would you do it?
4. Why would you kill your own pieces


Reddit post: https://www.reddit.com/r/AnarchyChess/comments/10ydnbb/i_placed_stockfish_white_against_chatgpt_black/

GothamChess video: https://www.youtube.com/watch?v=rSCNW1OCk_M 

u123451515

It summoned pieces right?

dendrohypostasis103700

Yeah, it also summoned pieces

 

Inshaal078

I love how it plays by the rules

Tomi

ChatGPT just doesn't get stuck in the matrix. It has escaped and does what it wants. No rules of some game will control it.

 

toxic_internet

Chess has been broken for a while now.

aserew12

ChatGPT is Dreamer from Roblox

konstantcheckov

Personally, I found ChatGPT to be a bit of an inspiration. During one game, playing with the black pieces, I remember thinking: if ChatGPT can teleport pieces, then it should be possible for my rook on a8 to reach h1, moving like a bishop. I admit it took several moves before the rook fully traversed the long diagonal, right through the centre of the battle. Let's just say the rook was moving like a drunken bishop, avoiding obstacles along the way. My opponent would have had no clue what I was up to.

AACChessPro

It cheats
RVSP16

https://www.chess.com/forum/view/general/want-to-improve-your-chess-join-this-forum

Join this thread and I will answer your chess queries.

Quasimorphy

ChatGPT has alerted me to this amazing game:

"Yes, there was a famous chess game between Miguel Najdorf and André Guinard played in 1952 that featured a remarkable position with 10 knights on the board. This unusual situation arose in a very complex endgame. It’s often cited as an interesting example of how knights can dominate in certain positions. If you’d like, I can provide more details about the game or its significance!"

The game was played in either 1934, 1937, 1947, or 1952, depending on which of ChatGPT's answers you believe. ChatGPT produces some bizarre answers sometimes.