>>508822264
If you want it to connect to the internet and fetch webpages for you, it gets a whole lot more complicated. Most of the LLMs people play around with on their home PCs are just static models. You'd have to do a fair amount of coding yourself, and I doubt it would work very well anyway. Local models only have a few pages' worth of context, limited by your VRAM, so that's all the model could read in one go.
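Rough idea of what that coding looks like. This is just a sketch, assuming llama-cpp-python with some local GGUF file (./model.gguf is a made-up path) plus requests and BeautifulSoup for pulling the page text. The point is you fetch and strip the page yourself, then cram whatever fits into the model's small context window:

```python
import requests
from bs4 import BeautifulSoup
from llama_cpp import Llama

# Load a local model; n_ctx is the context window, which is what VRAM limits.
llm = Llama(model_path="./model.gguf", n_ctx=4096)

# Fetch the page and strip it down to plain text.
html = requests.get("https://example.com", timeout=10).text
text = BeautifulSoup(html, "html.parser").get_text(separator=" ", strip=True)

# Crude truncation so the page plus the question still fits in the context.
# A real setup would count tokens instead of characters.
page = text[:8000]

prompt = f"Here is a webpage:\n{page}\n\nSummarize it in three sentences."
out = llm(prompt, max_tokens=256)
print(out["choices"][0]["text"])
```

And that's the easy version. Anything longer than a few pages has to be chunked or summarized in passes, which is where it stops working very well.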
>>508822458
That's not going to work very well either. LLMs don't memorize your database word for word; if you ask them to retrieve something from their training data, they hallucinate heavily. They'll generate "something" that resembles what was in the training data, but half of it will be made up or have parts swapped out.