Provides functions to download and parse 'robots.txt' files. Ultimately the package makes it easy to check if bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
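
A minimal usage sketch is shown below. It assumes the package's exported helpers get_robotstxt() and paths_allowed(); the domain and paths used are purely illustrative.

```r
library(robotstxt)

# download and parse a domain's robots.txt file (illustrative domain)
rtxt <- get_robotstxt(domain = "example.com")

# check whether bots are allowed to access specific resources on that domain
paths_allowed(paths = c("/", "/images/"), domain = "example.com")
```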
| Version: | 0.7.13 |
| Depends: | R (≥ 3.0.0) |
| Imports: | stringr (≥ 1.0.0), httr (≥ 1.0.0), spiderbar (≥ 0.2.0), future (≥ 1.6.2), future.apply (≥ 1.0.0), magrittr, utils |
| Suggests: | knitr, rmarkdown, dplyr, testthat, covr |
| Published: | 2020-09-03 |
| Author: | Peter Meissner [aut, cre], Kun Ren [aut, cph] (author and copyright holder of list_merge.R), Oliver Keys [ctb] (original release code review), Rich Fitz John [ctb] (original release code review) |
| Maintainer: | Peter Meissner <retep.meissner at gmail.com> |
| BugReports: | https://github.com/ropensci/robotstxt/issues |
| License: | MIT + file LICENSE |
| URL: | https://docs.ropensci.org/robotstxt/, https://github.com/ropensci/robotstxt |
| NeedsCompilation: | no |
| Materials: | README, NEWS |
| In views: | WebTechnologies |
| CRAN checks: | robotstxt results |