(Cross post from my Google+ profile)
If you haven't yet, install the URL Rewrite module from http://www.iis.net/download/URLRewrite. If you don't have access to the server, ask your host to install it. (Honestly, it should be a default on all Windows servers.)
Once installed, add the following to the <system.webServer> section of the web.config file:
<rewrite>
  <rules>
    <rule name="robots">
      <!-- Match requests for /robots.txt. -->
      <match url="^robots\.txt" />
      <conditions>
        <!-- Capture the host name; matches all domains. -->
        <add input="{HTTP_HOST}" pattern="(.*)" />
      </conditions>
      <!-- Rewrite to the domain-specific file. -->
      <action type="Rewrite" url="robots.{C:0}.txt" appendQueryString="false" />
    </rule>
  </rules>
</rewrite>
The rewrite engine watches for requests to /robots.txt, grabs the domain from the condition, and rewrites the request to a file named after the pattern: robots.www.mydomain.com.txt
I chose this pattern because it groups the files together in the file browser. Since this is a rewrite rule (not a redirect), the search engine spiders can't tell the difference. As far as they know, they are being served a separate, individual robots.txt file for each domain.
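For example, assuming the site also answers on a dev subdomain (dev.mydomain.com is a hypothetical host for illustration), requests map to files in the site root like this:

http://www.mydomain.com/robots.txt -> robots.www.mydomain.com.txt
http://dev.mydomain.com/robots.txt -> robots.dev.mydomain.com.txt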
If you don’t want to create a file for each portal, then change the condition to:
<add input="{HTTP_HOST}" pattern="^([a-zA-Z0-9]*\.)?(domain1\.org|domain2\.com)" />
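Putting it together, the rule would look something like this (the domain names are placeholders, as above):

<rule name="robots">
  <match url="^robots\.txt" />
  <conditions>
    <!-- Only fire for the listed domains, with or without a subdomain prefix. -->
    <add input="{HTTP_HOST}" pattern="^([a-zA-Z0-9]*\.)?(domain1\.org|domain2\.com)" />
  </conditions>
  <action type="Rewrite" url="robots.{C:0}.txt" appendQueryString="false" />
</rule>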
Now you can post different files for:
robots.txt (the generic or "not-matched" file; when the host doesn't match the condition, the rule never fires and IIS serves this physical file as-is)
robots.domain1.org.txt
robots.dev.domain1.org.txt
robots.www.domain2.com.txt
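For instance, the dev portal's file might block all crawling while the production domains stay open (the contents below are purely illustrative):

robots.dev.domain1.org.txt:
User-agent: *
Disallow: /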