
Prevent CloudFront geographic restrictions from blocking robots

I have a CloudFront distribution with a geographic restriction enabled to allow only Brazil. It works fine, but recently I found an issue:

When someone pastes a link to a page of the website served by this distribution into WhatsApp or Facebook, for example, the crawler hits the restriction and the link preview doesn't get the meta title and meta description of the page.

I'm asking whether there's any way to work around this, like an "allow for robots" option, or whether there's a published set of IPs from Facebook and similar companies that I could allow so bots can reach the correct webpage.

Unfortunately, there is no way around this with geo-restriction alone.

CloudFront's geo-restriction capability is a blanket restriction and is normally used for blocking countries that should legally be prohibited from accessing your content, e.g. due to streaming regulations or sanctions.

For your use case, I would use AWS WAF and set up a geographic match rule for Brazil. You can then also allow the crawler IP ranges that Google, Twitter, Facebook, etc. publish, which will solve your issue.
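As a rough sketch of that approach, the rule below shows the WAFv2 `Rules` shape for allowing requests that come either from Brazil or from an IP set you would populate with published crawler ranges. The IP set ARN and rule names are placeholders; you would create the IP set and web ACL separately (e.g. with `boto3`'s `wafv2` client) and set the web ACL's default action to block.

```python
def build_allow_rule(crawler_ip_set_arn):
    """Build a WAFv2 rule that allows requests originating in Brazil
    OR from an IP set of known crawler ranges (Facebook, Google, etc.).

    crawler_ip_set_arn is a placeholder: the ARN of an IPSet you create
    yourself and keep updated with the crawlers' published IP ranges.
    """
    return {
        "Name": "allow-brazil-or-crawlers",  # hypothetical rule name
        "Priority": 0,
        "Statement": {
            "OrStatement": {
                "Statements": [
                    # Match viewers located in Brazil.
                    {"GeoMatchStatement": {"CountryCodes": ["BR"]}},
                    # Match source IPs listed in the crawler IP set.
                    {"IPSetReferenceStatement": {"ARN": crawler_ip_set_arn}},
                ]
            }
        },
        "Action": {"Allow": {}},
        "VisibilityConfig": {
            "SampledRequestsEnabled": True,
            "CloudWatchMetricsEnabled": True,
            "MetricName": "AllowBrazilOrCrawlers",
        },
    }
```

With the web ACL's default action set to Block and the ACL associated with the CloudFront distribution, everything outside Brazil is rejected except traffic from the crawler IP set.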

An alternative solution, if CloudFront is serving a front-end application from S3 for example, is to do the geo-restriction at the application level.
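A minimal sketch of that application-level check, assuming you configure CloudFront to forward the `CloudFront-Viewer-Country` header to your origin and fall back to a User-Agent check for the known link-preview crawlers (the crawler token list here is illustrative, not exhaustive):

```python
# Substrings that identify common link-preview crawlers (illustrative list).
CRAWLER_TOKENS = ("facebookexternalhit", "twitterbot", "whatsapp", "googlebot")

def is_allowed(headers):
    """Allow Brazilian viewers, plus known crawlers by User-Agent.

    Assumes CloudFront forwards the CloudFront-Viewer-Country header
    (enabled via an origin request policy) so the origin can see the
    viewer's country code.
    """
    if headers.get("CloudFront-Viewer-Country") == "BR":
        return True
    user_agent = headers.get("User-Agent", "").lower()
    return any(token in user_agent for token in CRAWLER_TOKENS)
```

Note that User-Agent strings are trivially spoofed, so this only keeps the geo-restriction honest for ordinary browsers, not determined clients; verifying crawler source IPs (as in the WAF approach) is stricter.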

From a pure AWS CloudFront perspective, this is not (yet?) doable.
