Name: perl-WWW-RobotRules
Version: 6.20.0
Release: 10.mga8
Group: Development/Perl
Size: 26931
Packager: umeabot <umeabot>
Url: http://search.cpan.org/dist/WWW-RobotRules
Summary: Parse /robots.txt files
Distribution: Mageia
Vendor: Mageia.Org
Build date: Wed Feb 12 08:37:25 2020
Build host: localhost
Source RPM: perl-WWW-RobotRules-6.20.0-10.mga8.src.rpm
This module parses _/robots.txt_ files as specified in "A Standard for Robot Exclusion" (<http://www.robotstxt.org/wc/norobots.html>). Webmasters can use the _/robots.txt_ file to forbid conforming robots from accessing parts of their web site. The parsed files are kept in a WWW::RobotRules object, and this object provides methods to check whether access to a given URL is prohibited. The same WWW::RobotRules object can be used for one or more parsed _/robots.txt_ files on any number of hosts.
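A minimal sketch of the workflow described above, based on the module's documented interface (new, parse, allowed). The user-agent name, the robots.txt URL, and the rule content are made-up illustration values; in real use the robots.txt text would be fetched over HTTP (e.g. with LWP::Simple) rather than embedded inline.

```perl
use strict;
use warnings;
use WWW::RobotRules;

# Hypothetical user-agent name, for illustration only
my $rules = WWW::RobotRules->new('ExampleBot/1.0');

# Parse an in-memory robots.txt instead of fetching one over HTTP;
# the first argument is the URL the rules notionally came from
my $robots_txt = <<'EOT';
User-agent: *
Disallow: /private/
EOT
$rules->parse('http://example.com/robots.txt', $robots_txt);

# Check URLs on that host against the parsed rules
print $rules->allowed('http://example.com/index.html')
    ? "index: allowed\n" : "index: denied\n";
print $rules->allowed('http://example.com/private/secret.html')
    ? "private: allowed\n" : "private: denied\n";
```

The same $rules object could then be fed additional _/robots.txt_ files from other hosts with further parse() calls, and allowed() will match each URL against the rules recorded for its host.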
License: GPL+ or Artistic
* Wed Feb 12 2020 umeabot <umeabot> 6.20.0-10.mga8
  + Revision: 1498314
  - Mageia 8 Mass Rebuild
  + wally <wally>
  - replace deprecated %makeinstall_std

* Wed Sep 19 2018 umeabot <umeabot> 6.20.0-9.mga7
  + Revision: 1272724
  - Mageia 7 Mass Rebuild
/usr/share/doc/perl-WWW-RobotRules
/usr/share/doc/perl-WWW-RobotRules/Changes
/usr/share/doc/perl-WWW-RobotRules/META.yml
/usr/share/doc/perl-WWW-RobotRules/MYMETA.yml
/usr/share/doc/perl-WWW-RobotRules/README
/usr/share/man/man3/WWW::RobotRules.3pm.xz
/usr/share/man/man3/WWW::RobotRules::AnyDBM_File.3pm.xz
/usr/share/perl5/vendor_perl/WWW
/usr/share/perl5/vendor_perl/WWW/RobotRules
/usr/share/perl5/vendor_perl/WWW/RobotRules.pm
/usr/share/perl5/vendor_perl/WWW/RobotRules/AnyDBM_File.pm
Generated by rpm2html 1.8.1
Fabrice Bellet, Sun Oct 13 10:18:22 2024