Anonymous
6/21/2025, 6:08:25 AM
No.149091018
>>149090700
Each year the VRAM requirements for the image generation (and thereby animation generation) models get lower. A consumer 4090 can gen a 5 second animation "kinda" quickly. Interestingly, generating images and animations (which are just a series of images) is a lot easier than generating text, which to my knowledge still requires a fair amount of VRAM to store all the context needed. Think about how much more powerful your phone is compared to a consumer grade desktop computer from 20 years ago. Now imagine how much more powerful a consumer grade desktop computer will be 20 years from now. If the AI models continue to get refined, you'll be able to gen stuff at a "reasonable" pace for an individual.
I don't think we'll ever reach a "make a million dollar movie" button. But I do think people will start generating some custom media to consume, even if it's only porn. Frankly, I want the language models to improve so I can have a virtual DM for DnD.