Nvidia has recently released a research paper on a new technique called Neural Texture Compression (NTC), which promises to reduce VRAM usage by up to 85%, reportedly without any loss of quality.
This stems from Nvidia acknowledging that VRAM usage has gotten out of control, a trend it attributes to consumers demanding photorealistic graphics.
While fairly technical, the paper explores encoding textures rather than storing them at full resolution: a machine-learning approach in which a neural network later reconstructs the image. This also shrinks textures considerably; in the most extreme example Nvidia revealed, the compressed texture was 1/24th of its original size.
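To get a feel for what a 1/24th reduction means in practice, here is a rough back-of-the-envelope sketch. The storage layout below (a small grid of latent features plus a tiny decoder network) is purely illustrative; the grid size, bytes per latent, and decoder size are assumptions, not figures from Nvidia's paper.

```python
# Toy arithmetic only: store compact latent features + small decoder weights
# instead of full-resolution texels. All sizes below are made up for illustration.

full_bytes = 4096 * 4096 * 4            # uncompressed 4K RGBA8 texture: 64 MiB

latent_bytes = 512 * 512 * 10           # assumed 512x512 grid of 10-byte latents
decoder_bytes = 100 * 1024              # assumed ~100 KiB of decoder weights
compact_bytes = latent_bytes + decoder_bytes

ratio = full_bytes / compact_bytes      # roughly the 1/24th ballpark Nvidia cites
print(f"{full_bytes / 2**20:.0f} MiB -> {compact_bytes / 2**20:.2f} MiB "
      f"(~1/{ratio:.0f} of the original size)")
```

With these invented numbers the compact form comes out to roughly 2.6 MiB, in the same ballpark as the 1/24th ratio from the paper's most extreme example.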
One of the paper's main points is that the method does not use any sort of generative algorithm, but is fully deterministic, which is a nice way of saying that no random elements are involved and that the same input will always produce the same output. Because the encoding and neural processing happen in the Matrix Engine, which is driven by the Tensor Cores, the performance of the regular CUDA cores will not be affected. This also means that modern-day RTX 50 series cards should, in theory, be able to support it as soon as game developers start implementing it.
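The determinism claim can be sketched in a few lines. The function below is a stand-in, not Nvidia's decoder: the point is only that decoding is a pure function of the stored weights and the texel coordinate, with no sampling step, so repeated calls are bit-identical.

```python
# Illustrative sketch of the "deterministic" property: a decoder that is a
# pure function of (weights, coordinates), with no randomness anywhere.

def decode_texel(weights, u, v):
    # Stand-in for a tiny neural decoder; fixed arithmetic, no sampling.
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w * ((u * (i + 1) + v * (i + 2)) % 7)
    return acc

weights = [0.25, -0.5, 0.125, 0.75]   # hypothetical stored network weights
a = decode_texel(weights, 0.3, 0.6)
b = decode_texel(weights, 0.3, 0.6)
assert a == b  # same input, same output: unlike a generative model's sampling
```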