>>718940015 (OP)
SSDs are too slow, but RAM is viable to use as VRAM; in fact, that's how iGPUs without dedicated memory do it. It's still slower and higher-latency than real VRAM, so it's a performance bottleneck for graphics, but it can still be viable for AI.
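Rough back-of-envelope numbers on why: for LLM inference every token forces a full read of the weights, so tokens/s is capped by bandwidth / model size. The bandwidth figures below are ballpark assumptions (PCIe 4.0 NVMe, dual-channel DDR5, high-end GDDR6X), not measurements:

```python
# Back-of-envelope: memory bandwidth caps LLM inference speed.
# All bandwidth numbers are rough ballpark assumptions.
model_gb = 70  # e.g. a ~70B-param model at ~1 byte/param (8-bit quant)

bandwidth_gbps = {
    "NVMe SSD (PCIe 4.0)": 7,
    "dual-channel DDR5": 64,
    "GDDR6X (high-end GPU)": 1000,
}

for name, bw in bandwidth_gbps.items():
    # Each generated token reads the full weights once,
    # so tokens/s cannot exceed bandwidth / model size.
    print(f"{name}: <= {bw / model_gb:.1f} tokens/s")
```

Same math is why SSD offloading is hopeless (sub-1 token/s) while system RAM is merely slow rather than unusable.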
It's kind of true though, isn't it? Just not the part about SSDs. VRAM isn't that expensive; they don't put more on consumer cards so they can keep selling ridiculously expensive enterprise cards instead.