That is also my experience. I use ChatGPT to help me iterate on a Godot game project, and it doesn't take more than a handful of prompts for it to forget or hallucinate about something we previously established. I need to constantly remind it about code it suggested a while ago or things I asked for in the past, or it completely ignores the context and focuses just on the latest ask.
It is incredibly powerful for getting things started, but as soon as you have a sketch of a complex system going, it loses its grasp on the full picture and doesn't account for the state outside the small asks you make. This is even more evident when you need to correct it about something or request a change after a large prompt. It just throws all the other stuff out the window and hyperfocuses on the one piece of code that needs changing.
This has been the case since GPT-3, and even their most recent model (forgot the name, the reasoning one) has this issue.