
Disable web.config rewrite rules for local requests

I have a page scraper that is being used to grab content from a subdirectory of the site, and my rewrite rules are interfering with the content grabbing. For example, the scraper grabs the content from the old version of the site:

/catalog/catalog.asp?page=23&section=14

It then uses that to populate the new version:

/PartsBook/Catalog.aspx?page=23&section=14

In addition to prepopulating the new site with this content, I'm redirecting the old URLs to the new URLs for people who have them bookmarked. The problem is that this causes the scraper to read the new page rather than the old one. Is there any way to use a rule condition to limit the rule so that it only affects non-local requests?
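
For reference, the redirect rule in my web.config is something along these lines (simplified; the rule name is just illustrative):

<rewrite>
    <rules>
        <rule name="Old catalog to PartsBook" stopProcessing="true">
            <match url="^catalog/catalog\.asp$" />
            <!-- the query string (page, section) is appended to the target by default -->
            <action type="Redirect" url="/PartsBook/Catalog.aspx" redirectType="Permanent" />
        </rule>
    </rules>
</rewrite>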

Maybe you can use a snippet like this for URL Rewrite. You can filter by comparing {REMOTE_ADDR} against the scraper's IP, and place this rule before your redirect rule so that matching requests skip it.

<rule name="Block SomeRobot" stopProcessing="true">
    <match url="^folder1/folder2" />
        <conditions logicalGrouping="MatchAny">
            <add input="{REMOTE_ADDR}" pattern="XXX\.XXX\.XXX\.[0-5]" />
        </conditions>
        <action type="redirect" url=""/>
</rule>
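
If the scraper runs on the web server itself, you could also add negated conditions for the loopback addresses directly to the redirect rule, so local requests are never redirected while everyone else still is. Roughly (using the URLs from the question):

<rule name="Old catalog to PartsBook" stopProcessing="true">
    <match url="^catalog/catalog\.asp$" />
    <conditions logicalGrouping="MatchAll">
        <!-- skip the redirect when the request comes from the local machine (IPv4/IPv6 loopback) -->
        <add input="{REMOTE_ADDR}" pattern="^127\.0\.0\.1$" negate="true" />
        <add input="{REMOTE_ADDR}" pattern="^::1$" negate="true" />
    </conditions>
    <action type="Redirect" url="/PartsBook/Catalog.aspx" redirectType="Permanent" />
</rule>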
