Idea Summary
Include the ability to dynamically generate the robots.txt and sitemap files.
Use Case
I run public-facing consumer sites. Google ranking is not a luxury; it's a requirement. Vanity URLs and friendly URLs are great, but Google still has to crawl them. I can use Insum's solution for dynamically creating a sitemap, but I'm stuck at present (I posted a forum note on it today too) because the robots.txt served by ORDS disallows crawling of everything by default. Anyone who publishes an APEX site on OCI needs access to that file in the short term, or search engines cannot index and serve up the app. In the long term, though, it makes complete sense for both robots.txt and the sitemap to be generated dynamically.
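For reference, a robots.txt that blocks all crawling (the behaviour described above) is just:

    User-agent: *
    Disallow: /

whereas one that permits crawling and points search engines at the sitemap would look roughly like this (the host name and sitemap URL here are placeholders, not the actual OCI paths):

    User-agent: *
    Disallow:
    Sitemap: https://www.example.com/ords/myschema/seo/sitemap.xml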
Preferred Solution (Optional)
Include a Shared Components section for SEO that covers friendly URLs, sitemap settings with a link to be used for the REST configuration, and a dynamic rather than static robots.txt file (a rough sketch of the dynamic approach follows below).
In the short term, though, we at least need to be able to edit that robots.txt file so the friendly URL and SEO improvements actually pay off; otherwise search engines simply won't index the site.
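To illustrate the dynamic approach, here is a minimal sketch of how the robots.txt content could be generated from an ORDS RESTful service handler. The module name, schema alias, and sitemap URL are placeholders I've made up; note that a handler like this is only reachable under the module's base path today, not at the site root /robots.txt where crawlers look, which is exactly why native support from ORDS/APEX is needed.

    -- Hypothetical sketch only: serve a generated robots.txt from an ORDS handler.
    BEGIN
      ORDS.define_module(
        p_module_name => 'seo',
        p_base_path   => '/seo/',
        p_status      => 'PUBLISHED');

      ORDS.define_template(
        p_module_name => 'seo',
        p_pattern     => 'robots.txt');

      ORDS.define_handler(
        p_module_name => 'seo',
        p_pattern     => 'robots.txt',
        p_method      => 'GET',
        p_source_type => ORDS.source_type_plsql,
        p_source      => q'[
          BEGIN
            -- emit plain text, allow all crawling, and advertise the sitemap
            owa_util.mime_header('text/plain', TRUE);
            htp.p('User-agent: *');
            htp.p('Disallow:');
            -- placeholder URL; point at the dynamically built sitemap
            htp.p('Sitemap: https://www.example.com/ords/myschema/seo/sitemap.xml');
          END;]');

      COMMIT;
    END;
    /

The same pattern could drive the sitemap itself, building the URL list from the application's friendly URL definitions rather than a hand-maintained static file.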