>>513511942
>>513511827
>>513511395
>>513511704
All LLMs have issues when given or generating huge chunks of information, including code. It comes down to how transformers abstract context: a token's position in the sequence is literally just an integer fed through a positional encoding. There are almost no 1,500-line webpages containing entire multi-file coding projects, except where the writer is retarded and their code is shit, so the training data barely covers that regime. Claude and GPT can write O(log n) solutions to any problem where one exists, but they suck at entire projects and self-debugging because they lack the infrastructure to properly surface what's relevant.
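To be concrete about the "position is just an integer" part: in the original transformer, that integer only enters the model through a fixed sinusoidal encoding. Minimal sketch of that scheme (newer models use learned embeddings or RoPE instead, this is just the textbook version for illustration):

```python
import math

def sinusoidal_positional_encoding(seq_len, d_model):
    # PE[pos, 2i]   = sin(pos / 10000^(2i/d_model))
    # PE[pos, 2i+1] = cos(pos / 10000^(2i/d_model))
    # The token's position 'pos' is the only positional signal the model gets.
    pe = [[0.0] * d_model for _ in range(seq_len)]
    for pos in range(seq_len):
        for i in range(0, d_model, 2):
            angle = pos / (10000 ** (i / d_model))
            pe[pos][i] = math.sin(angle)
            if i + 1 < d_model:
                pe[pos][i + 1] = math.cos(angle)
    return pe

pe = sinusoidal_positional_encoding(2048, 64)
# position 0 is all sin(0)/cos(0) pairs no matter what the token is
print(pe[0][:4])
```

Nothing in that matrix knows anything about file boundaries or project structure, which is exactly why long multi-file contexts degrade.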