
robots's Introduction

A simple Ruby library to parse robots.txt.

Usage:

	require "robots"

	robots = Robots.new("Some User Agent")
	assert robots.allowed?("http://www.yelp.com/foo")
	assert !robots.allowed?("http://www.yelp.com/mail?foo=bar")
	robots.other_values("http://foo.com") # gets misc. key/values (i.e. sitemaps)
	
If you want caching, you're on your own.  I suggest marshalling an instance of the parser.  
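A minimal sketch of the marshalling approach, using a plain Hash as a stand-in for the parser's state (the real Robots instance's internals aren't shown here, so this assumes they are Marshal-friendly):

```ruby
require "tempfile"

# Stand-in for a parsed robots.txt cache; a Robots instance would be
# marshalled the same way, assuming its state is Marshal-friendly.
parsed = { "http://example.com" => { "User-agent" => "*", "Disallow" => ["/private"] } }

cache = Tempfile.new("robots-cache")
cache.write(Marshal.dump(parsed))   # serialize the parsed rules to disk
cache.rewind

restored = Marshal.load(cache.read) # later: load instead of re-fetching
cache.close!

restored == parsed  # true
```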

Copyright (c) 2008 Kyle Maxwell, contributors

Permission is hereby granted, free of charge, to any person
obtaining a copy of this software and associated documentation
files (the "Software"), to deal in the Software without
restriction, including without limitation the rights to use,
copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the
Software is furnished to do so, subject to the following
conditions:

The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
OTHER DEALINGS IN THE SOFTWARE.

robots's People

Contributors

fizx, hayato1980, rb2k


robots's Issues

Error in Parser?

This is what I do:
	ruby-head > bla = Robots.new("test")
	=> #<Robots:0x000001008743c0 @user_agent="test", @parsed={}>
	ruby-head > bla.allowed?("http://lacostarecords.net/")
	=> false

This is the robots.txt:

	# Block a bot that was causing issues by ignoring Disallow lines below
	User-Agent: OmniExplorer_Bot
	Disallow: /

	# Block hotlinking of music files by projectplaylist.com due to perceived user bandwidth theft
	User-agent: projectplaylist-directlink
	Disallow: /

	# Block all bots from the core Homestead site
	User-agent: *
	Disallow: /~site/Scripts_ElementMailer
	Disallow: /~site/Scripts_ExternalRedirect
	Disallow: /~site/Scripts_ForSale
	Disallow: /~site/Scripts_HitCounter
	Disallow: /~site/Scripts_NewGuest
	Disallow: /~site/Scripts_RealTracker
	Disallow: /~site/Scripts_Track
	Disallow: /~site/Scripts_WebPoll

It should return true, shouldn't it?
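For reference, the conventional behavior is that a bot matching none of the named `User-agent` groups falls back to the `*` group, so `Disallow: /` under `OmniExplorer_Bot` should not affect a "test" agent. A minimal sketch of that group selection (helper name and simplified parsing are illustrative, not the gem's code):

```ruby
ROBOTS_TXT = <<~TXT
  User-Agent: OmniExplorer_Bot
  Disallow: /

  User-agent: *
  Disallow: /~site/Scripts_WebPoll
TXT

# Split the file into user-agent groups, then return the Disallow rules
# of the group whose agent token matches (case-insensitively),
# falling back to the "*" group.
def disallows_for(txt, agent)
  groups = {}
  current = nil
  txt.each_line do |line|
    line = line.sub(/#.*/, "").strip
    next if line.empty?
    key, value = line.split(":", 2).map(&:strip)
    case key.downcase
    when "user-agent"
      current = groups[value.downcase] ||= []
    when "disallow"
      current << value if current && !value.empty?
    end
  end
  groups[agent.downcase] || groups["*"] || []
end

disallows_for(ROBOTS_TXT, "test")              # ["/~site/Scripts_WebPoll"]
disallows_for(ROBOTS_TXT, "OmniExplorer_Bot")  # ["/"]
```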

Parsing Error

This doesn't seem to work:

	User-agent: *
	Disallow:

(found here: http://www.gelsenclan.de/robots.txt)
A robots.txt like this results in NO part of the site being allowed. The expected behavior is that everything is allowed.
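For context, an empty `Disallow:` value disallows nothing, so every path should be allowed. A tiny sketch of that rule (hypothetical helper, not the gem's implementation):

```ruby
# An empty Disallow value blocks nothing; non-empty values match by
# path prefix. So a file containing only "Disallow:" allows everything.
def allowed?(path, disallow_values)
  disallow_values.none? { |rule| !rule.empty? && path.start_with?(rule) }
end

allowed?("/anything", [""])   # true  (empty rule blocks nothing)
allowed?("/mail", ["/mail"])  # false
```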

No such file to load -- robots

Getting 'LoadError: no such file to load -- robots' when I try to require 'robots' in irb. Have robots 0.10.1 installed.

Rakefile

Hello,
would you please consider including the patch to the Rakefile suggested by Mamoru Tasaka?
https://bugzilla.redhat.com/show_bug.cgi?id=632912

	$ diff -u Rakefile.debug Rakefile
	--- Rakefile.debug	1970-01-01 09:00:00.000000000 +0900
	+++ Rakefile	2010-09-13 02:32:52.000000000 +0900
	@@ -21,7 +21,7 @@
	 require 'rake/testtask'
	 Rake::TestTask.new(:test) do |test|
	   test.libs << 'lib' << 'test'
	-  test.pattern = 'test/**/*_test.rb'
	+  test.pattern = 'test/**/test_*.rb'
	   test.verbose = true
	 end
	
	@@ -38,7 +38,7 @@
	 end
	 end
	
	-task :test => :check_dependencies
	+#task :test => :check_dependencies
	
	 task :default => :test

add URI to timeout message

It would be nice if the "robots.txt request timed out" message said which URI actually failed.
This is the method in question:

  def self.get_robots_txt(uri, user_agent)
    begin
      Timeout::timeout(Robots.timeout) do
        io = URI.join(uri.to_s, "/robots.txt").open("User-Agent" => user_agent) rescue nil
      end 
    rescue Timeout::Error
      STDERR.puts "robots.txt request timed out"
    end
  end
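A possible variant along those lines (names are illustrative, not the gem's actual code; the fetch is simulated with a sleep so a slow server can be reproduced locally):

```ruby
require "timeout"

# Include the URI in the timeout message so the failing host can be
# identified in the logs. The block stands in for the actual fetch.
def fetch_with_timeout(uri, seconds)
  Timeout.timeout(seconds) { yield }
rescue Timeout::Error
  warn "robots.txt request for #{uri} timed out"
  nil
end

result = fetch_with_timeout("http://example.com/robots.txt", 0.05) { sleep 1 }
result  # nil, after printing the URI-bearing warning to stderr
```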

Please don't use open-uri

I noticed you request /robots.txt using open-uri. Since open-uri overrides Kernel#open, it also allows for possible command injection (ex: open("| ls")). I think using Net::HTTP.get would be Good Enough (tm).
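A sketch of that suggestion (hypothetical helper names; the header-hash form of `Net::HTTP.get_response` assumes Ruby 3.0+):

```ruby
require "net/http"
require "uri"

# Build the /robots.txt URI for a site without going through
# Kernel#open, so there is no command-injection surface.
def robots_txt_uri(url)
  URI.join(url, "/robots.txt")
end

def fetch_robots_txt(url, user_agent)
  uri = robots_txt_uri(url)
  Net::HTTP.get_response(uri, { "User-Agent" => user_agent }).body
rescue StandardError
  nil
end

robots_txt_uri("http://example.com/some/page").to_s  # "http://example.com/robots.txt"
```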

robots non functional at the moment?

System: ruby-head and 1.9.1, using the newest gem version.
I came across the error with http://danielwebb.us/robots.txt:

	User-agent: *
	Disallow: /bot-trap
	Disallow: /about/contact
	Disallow: /about/resume/daniel_webb-resume.pdf
	Disallow: /projects/pd_tech_books/the_boy_electrician.pdf

But still:

	ruby-head > robots.allowed?("http://danielwebb.us/bot-trap/index.php")
	=> true
	ruby-head > robots.allowed?("http://danielwebb.us/bot-trap/")
	=> true
	ruby-head > robots.allowed?("http://danielwebb.us/bot-trap")
	=> true

What is even more disturbing:

	ruby-head > robots.allowed?("http://www.google.de/search")
	=> true

Did something break in one of the recent updates?
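For reference, `Disallow` rules match by path prefix, so all three `/bot-trap` URLs above should be blocked. A small sketch of the expected matching (hypothetical helper, not the gem's code):

```ruby
require "uri"

DISALLOWED = ["/bot-trap", "/about/contact"]

# A URL is allowed only if its path starts with none of the
# Disallow rules; "/bot-trap", "/bot-trap/", and
# "/bot-trap/index.php" all share the "/bot-trap" prefix.
def allowed?(url, rules)
  path = URI(url).path
  path = "/" if path.empty?
  rules.none? { |rule| path.start_with?(rule) }
end

allowed?("http://danielwebb.us/bot-trap/index.php", DISALLOWED)  # false
allowed?("http://danielwebb.us/bot-trap", DISALLOWED)            # false
allowed?("http://danielwebb.us/", DISALLOWED)                    # true
```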
