Channel: Using robots.txt to deny access to MediaWiki special pages using substring matching - Webmasters Stack Exchange

Answer by Stephen Ostermiller for Using robots.txt to deny access to MediaWiki special pages using substring matching

robots.txt disallow rules are all "starts with" rules, not substring rules. MediaWiki suggests using this in robots.txt for a case like yours:

User-agent: *
Disallow: /index.php?
Disallow: ...
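
By way of illustration, here is a minimal robots.txt sketch of how that prefix-match ("starts with") behaviour can be applied to MediaWiki special pages. The paths /index.php? and /wiki/Special: are assumptions about a typical short-URL MediaWiki install (the question only mentions someurl.com/wiki/), not something confirmed by the excerpts above:

# Each Disallow value is matched against the beginning of the URL path,
# so every line below blocks all URLs that start with the given prefix.
User-agent: *
# Assumed: raw script URLs are served from /index.php?...
Disallow: /index.php?
# Assumed: special pages are exposed as /wiki/Special:PageName
Disallow: /wiki/Special:

Because matching is by prefix, the last rule covers every auto-generated page under /wiki/Special: without any need for substring matching.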




Using robots.txt to deny access to MediaWiki special pages using substring matching

I am running a MediaWiki at the domain someurl.com/wiki/. Unfortunately it generates a bunch of automatically generated special pages which are mainly of low quality but nevertheless are massively...
