Hi all,
I have a site built with APEX on Autonomous Database that will include a few public pages, but I have an issue with the robots.txt file, which is set to "User-agent: * Disallow" (by default, if I'm not mistaken).
Is there a way to rewrite or override that file, so that the public pages can be indexed by robots?
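For clarity, what the site currently serves is essentially the following (I'm quoting the default from memory, so the exact wording may differ slightly):

    User-agent: *
    Disallow: /

whereas for the public pages I'd like it to serve something along the lines of:

    User-agent: *
    Allow: /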
Moreover, I've read several articles mentioning an issue with Googlebot constantly re-indexing the pages because of the session ID in the URL (e.g. https://jeffkemponoracle.com/2011/10/googlebot-apex-session-ids-and-cookies/). Does this issue exist in a cloud configuration as well?
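To illustrate that second point, by "session ID" I mean the numeric session segment in classic APEX URLs of this form (hostname and values are made up):

    https://<my-instance>.adb.<region>.oraclecloudapps.com/ords/f?p=100:1:9215776028417

so, as I understand the article, each crawl can see a different URL for the same public page.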
Many thanks
Nicolas