
RobotsValidator

The robotsvalidator script allows you to check whether URLs are allowed or disallowed by a robots.txt file.

Features

  • Load the robots.txt file from a local file
  • Load the robots.txt file from a URL
  • Verbose mode, showing every rule with its result
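Loading rules from either a local file or a URL can be sketched with Python's standard `urllib.robotparser`; this is an illustrative helper under assumed names, not the script's actual API:

```python
from urllib.robotparser import RobotFileParser

def load_robots(source: str) -> RobotFileParser:
    """Load robots.txt rules from either an http(s) URL or a local file path.

    (Illustrative helper; the robotsvalidator internals may differ.)
    """
    parser = RobotFileParser()
    if source.startswith(("http://", "https://")):
        parser.set_url(source)
        parser.read()  # fetch the robots.txt file over HTTP(S)
    else:
        with open(source, encoding="utf-8") as f:
            parser.parse(f.read().splitlines())
    return parser

# Example: load_robots("robots.txt").can_fetch("*", "https://example.com/admin/")
```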

Verbose mode

A verbose mode, enabled with the --debug option, prints every rule along with its result.
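The per-rule check behind such a debug mode can be sketched as follows. This is a simplified assumption of how matching might work (longest matching prefix wins, as in RFC 9309; wildcard and `$` handling are omitted), not the script's actual implementation:

```python
def check_path(rules, path, debug=False):
    """Check `path` against a list of ("Allow"|"Disallow", prefix) rules,
    optionally printing every rule with its result, debug-style.

    Longest matching prefix wins; unmatched paths are allowed by default.
    (Simplified sketch: real robots.txt matching also supports '*' wildcards
    and '$' anchors, which are omitted here.)
    """
    allowed, best_len = True, -1
    for directive, pattern in rules:
        matched = bool(pattern) and path.startswith(pattern)
        if debug:
            status = "match" if matched else "no match"
            print(f"[{status:>8}] {directive}: {pattern}")
        if matched and len(pattern) > best_len:
            allowed, best_len = (directive == "Allow"), len(pattern)
    return allowed
```

For example, with the rules `Disallow: /admin/` and `Allow: /admin/public/`, a path under `/admin/public/` is allowed because the Allow prefix is the longer match.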

Contributing

Pull requests are welcome. Feel free to open an issue if you want to add other features.