I'm wondering if it's worth optimizing memory usage a bit by parsing just a single ruleset for a given user agent, so the signature might be one of:

```go
rules, err := robotstxt.ForAgent(buf, "mybot")
rules, err := robotstxt.ParseAgent(buf, "mybot")
```

The parser would skip all non-matching user agents (except for `*`). If a ruleset for `mybot` was found, it would return that ruleset; otherwise it would return the default ruleset `*`.

The method could also accept multiple user agents, so, for example, a search engine crawler might do:

```go
rules, err := robotstxt.Parse(buf, "Searchbot", "Googlebot")
// rules is Searchbot
// fallback to Googlebot
// fallback to *
```

LMK if you would consider a PR that implements this feature.
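To illustrate the fallback behavior I have in mind, here is a minimal sketch of single-pass group selection. This is not the library's actual parser; `Group` and `parseForAgents` are hypothetical names, rule lines are kept as raw strings for brevity, and real robots.txt matching (prefix matching, case rules, etc.) is simplified:

```go
package main

import (
	"fmt"
	"strings"
)

// Group holds the lines collected under one User-agent heading.
type Group struct {
	Agent string
	Rules []string // raw Allow/Disallow lines, kept as-is for brevity
}

// parseForAgents scans robots.txt content and returns the group for the
// first agent (in priority order) that has a ruleset, falling back to "*".
func parseForAgents(content string, agents ...string) *Group {
	groups := map[string]*Group{}
	var current *Group
	for _, line := range strings.Split(content, "\n") {
		line = strings.TrimSpace(line)
		switch {
		case strings.HasPrefix(strings.ToLower(line), "user-agent:"):
			name := strings.TrimSpace(line[len("user-agent:"):])
			current = &Group{Agent: name}
			groups[strings.ToLower(name)] = current
		case current != nil && line != "":
			current.Rules = append(current.Rules, line)
		}
	}
	for _, a := range agents { // first match wins
		if g, ok := groups[strings.ToLower(a)]; ok {
			return g
		}
	}
	return groups["*"] // default ruleset
}

func main() {
	robots := "User-agent: Googlebot\nDisallow: /private\n\nUser-agent: *\nDisallow: /tmp\n"
	// No "Searchbot" group exists, so this falls back to "Googlebot".
	g := parseForAgents(robots, "Searchbot", "Googlebot")
	fmt.Println(g.Agent, g.Rules)
}
```

The priority list works exactly like the comments in the `Parse` example above: each agent is tried in order, and `*` is only consulted when none of the named agents has a group.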