We also had this issue on a multilingual site and managed to fix it:
Create a new "Robots.txt" content class, with at least one field dedicated to storing the content of your robots.txt file
Instantiate your new content class in your content tree, using translations to alter the content of your robots.txt
Add a URL wildcard rule to map robots.txt to your content node
In your Apache Virtual Host / .htaccess file, comment out the robots.txt line
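For step 4, the line to comment out is the rewrite rule that serves the physical robots.txt file directly, bypassing eZ Publish. In a stock eZ Publish .htaccess / vhost it typically looks something like the rule below (the exact pattern may differ in your installation, so check your own file):

```apache
# Comment out this rule so the request for robots.txt falls through
# to index.php, where your URL rule can route it to the content node:
# RewriteRule ^/robots\.txt - [L]
```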
Another approach would be to build a module instead of using translations (useful when your websites are totally different); you could then store your content in a foreign table...
We run a multisite installation, so every extension has a different URL. We are trying to specify a different robots.txt file for each extension/site, so we are trying to follow your instructions.
Could you be more specific in your explanation? Do we have to set up a different layout? Does the physical file "robots.txt" have to exist in the root of eZ Publish? Ideally, could you give a practical example?
As I said, you'll need to create a dedicated content class:
Create a new "Robots.txt" content class, with at least one field dedicated to storing the content of your robots.txt file
Instantiate your new content class in your content tree, using translations to alter the content of your robots.txt (yes, you will have to override the pagelayout for this content, as it must be a blank layout, not HTML)
Add a URL wildcard rule to map robots.txt to your content node
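The blank pagelayout mentioned in step 2 can be done with a template override. A minimal sketch, assuming a hypothetical node ID of 123 and a siteaccess named "ezwebin_site" (replace both with your own values) — in settings/siteaccess/ezwebin_site/override.ini.append.php:

```ini
# Route the robots.txt node to a dedicated, blank pagelayout template
[robots_pagelayout]
Source=pagelayout.tpl
MatchFile=pagelayout_robots.tpl
Subdir=templates
Match[node]=123
```

Then, in your design's templates/pagelayout_robots.tpl, output only the module result with no HTML wrapper:

```
{$module_result.content}
```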
To add a URL wildcard, go to Setup / URL Wildcards:
New URL wildcard: robots.txt
Destination: <url_of_your_content_node>
Leave Redirecting URL unchecked
Finally, check that the robots.txt line in your Apache VHost (or .htaccess file) is commented out:
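In eZ Publish's sample virtual-host configuration this is usually the rewrite rule that serves robots.txt as a static file; it typically looks like the line below, though your file may differ:

```apache
# Must be commented out so eZ Publish handles the robots.txt request:
# RewriteRule ^/robots\.txt - [L]
```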