Hello! I have another question about loading, via an external table, a flat file that is hosted on a network share mounted on the Linux box our Oracle server runs on.
Uncompressed, the file is 8 GB; zipped up, it's ~350 MB. If I proactively unzip it and have Oracle read it, Oracle has to pull all 8 GB over the network. However, I understand I can leave it zipped and let Oracle's PREPROCESSOR option do the unzipping for me (I've sketched the DDL I have in mind at the bottom of this post). What I'm not sure about is whether Oracle would:
A) unzip the file in place on the network share, then read all 8 GB across the network anyway; or
B) transfer only the ~350 MB compressed file across the network into some kind of temporary cache, memory, or local disk space, then decompress and read it there.
If Oracle does option A, I don't really see any benefit, since the whole 8 GB still has to be transferred. If it does option B, though, there could be a real performance win from the reduced network overhead (with the added bonus that I wouldn't have 8 GB files lying around on the filesystem). It's probably expecting a lot for Oracle to just know what I want and do B automatically, but I figured I'd ask anyway.
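In case it helps frame the question, here's roughly the setup I'm describing. All directory, table, column, and file names below are placeholders, and I'm assuming zcat as the preprocessor program:

```sql
-- Placeholder paths: DATA_DIR points at the network-share mount,
-- EXEC_DIR at wherever the decompression binary lives. Oracle needs
-- READ on DATA_DIR and EXECUTE on EXEC_DIR.
CREATE DIRECTORY data_dir AS '/mnt/netshare/loads';
CREATE DIRECTORY exec_dir AS '/usr/bin';

CREATE TABLE my_flat_file_ext (
  col1  VARCHAR2(100),
  col2  NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    -- Oracle passes the data file's path as the single argument to the
    -- preprocessor program and reads whatever it writes to stdout.
    PREPROCESSOR exec_dir:'zcat'
    FIELDS TERMINATED BY ','
  )
  LOCATION ('bigfile.csv.gz')
)
REJECT LIMIT UNLIMITED;
```

So the question boils down to: when that preprocessor runs, which side of the network does the decompression happen on, and does the 8 GB stream ever cross the wire or get staged on disk?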
Thanks!