Commit message | Author | Age | Files | Lines
---|---|---|---|---
... | | | |
unpaywall ingest follow-up | Bryan Newbold | 2020-09-02 | 1 | -0/+115
more bad SHA1 PDF | Bryan Newbold | 2020-09-02 | 1 | -0/+2
another bad PDF sha1 | Bryan Newbold | 2020-09-01 | 1 | -0/+1
another bad PDF sha1 | Bryan Newbold | 2020-08-24 | 1 | -0/+1
html: handle embed with mangled 'src' attribute | Bryan Newbold | 2020-08-24 | 1 | -1/+1
WIP weekly re-ingest script | Bryan Newbold | 2020-08-17 | 2 | -0/+97
another bad PDF sha1 | Bryan Newbold | 2020-08-17 | 1 | -0/+1
another bad PDF sha1 | Bryan Newbold | 2020-08-15 | 1 | -0/+1
more bad sha1 | Bryan Newbold | 2020-08-14 | 1 | -0/+1
yet more bad PDF sha1 | Bryan Newbold | 2020-08-14 | 1 | -0/+2
more bad SHA1 | Bryan Newbold | 2020-08-13 | 1 | -0/+2
yet another PDF sha1 | Bryan Newbold | 2020-08-12 | 1 | -0/+1
another bad sha1; maybe the last for this batch? | Bryan Newbold | 2020-08-12 | 1 | -0/+1
more bad sha1 | Bryan Newbold | 2020-08-11 | 1 | -0/+2
additional loginwall patterns | Bryan Newbold | 2020-08-11 | 1 | -0/+2
more SHA1 | Bryan Newbold | 2020-08-11 | 1 | -0/+2
Revert "ingest: reduce CDX retry_sleep to 3.0 sec (after SPN)" | Bryan Newbold | 2020-08-11 | 1 | -1/+1
*This reverts commit 92bf9bc28ac0eacab2e06fa3b25b52f0882804c2. In practice, in prod, this resulted in much larger spn2-cdx-lookup-failure error rates.* | | | |
ingest: reduce CDX retry_sleep to 3.0 sec (after SPN) | Bryan Newbold | 2020-08-11 | 1 | -1/+1
*As we are moving towards just retrying entire ingest requests, we should probably just make this zero. But until then we should give SPN CDX a small chance to sync before giving up. This change is expected to improve overall throughput.* | | | |
ingest: actually use force_get flag with SPN | Bryan Newbold | 2020-08-11 | 1 | -0/+13
*The code path was there, but wasn't actually flagging our most popular daily domains yet. Hopefully this will make a big difference in SPN throughput.* | | | |
check for simple URL patterns that are usually paywalls or loginwalls | Bryan Newbold | 2020-08-11 | 2 | -0/+29
ingest: check for URL blocklist and cookie URL patterns on every hop | Bryan Newbold | 2020-08-11 | 1 | -0/+13
refactor: force_get -> force_simple_get | Bryan Newbold | 2020-08-11 | 2 | -8/+8
*For clarity. The SPNv2 API hasn't changed; just changing the variable/parameter name.* | | | |
html: extract eprints PDF url (eg, ub.uni-heidelberg.de) | Bryan Newbold | 2020-08-11 | 1 | -0/+2
extract PDF urls for e-periodica.ch | Bryan Newbold | 2020-08-10 | 1 | -0/+6
more bad sha1 | Bryan Newbold | 2020-08-10 | 1 | -0/+2
another bad PDF sha1 | Bryan Newbold | 2020-08-10 | 1 | -0/+1
add hkvalidate.perfdrive.com to domain blocklist | Bryan Newbold | 2020-08-08 | 1 | -0/+3
fix tests passing str as HTML | Bryan Newbold | 2020-08-08 | 1 | -3/+3
add more HTML extraction tricks | Bryan Newbold | 2020-08-08 | 1 | -2/+29
rwth-aachen.de HTML extract, and a generic URL guess method | Bryan Newbold | 2020-08-08 | 1 | -0/+15
another PDF hash to skip | Bryan Newbold | 2020-08-08 | 1 | -0/+1
another sha1 | Bryan Newbold | 2020-08-07 | 1 | -0/+1
another sha1 | Bryan Newbold | 2020-08-06 | 1 | -0/+1
and more bad sha1 | Bryan Newbold | 2020-08-06 | 1 | -0/+3
more pdfextract skip sha1hex | Bryan Newbold | 2020-08-06 | 1 | -9/+12
grobid+pdftext missing catch-up commands | Bryan Newbold | 2020-08-05 | 5 | -10/+150
commit stats from a couple weeks back | Bryan Newbold | 2020-08-05 | 1 | -0/+347
sql stats commands updates | Bryan Newbold | 2020-08-05 | 1 | -2/+2
MAG ingest follow-up notes | Bryan Newbold | 2020-08-05 | 1 | -0/+194
more bad PDF sha1; print sha1 before poppler extract | Bryan Newbold | 2020-08-05 | 1 | -0/+7
spn2: skip js behavior (experiment) | Bryan Newbold | 2020-08-05 | 1 | -0/+1
*Hoping this will increase crawling throughput with little-to-no impact on fidelity.* | | | |
SPN2: ensure not fetching outlinks | Bryan Newbold | 2020-08-05 | 1 | -0/+1
another bad PDF sha1 | Bryan Newbold | 2020-08-04 | 1 | -0/+1
another PDF sha1hex | Bryan Newbold | 2020-07-27 | 1 | -0/+1
yet another 'bad' PDF sha1hex | Bryan Newbold | 2020-07-27 | 1 | -0/+1
use new SPNv2 'skip_first_archive' param | Bryan Newbold | 2020-07-22 | 1 | -0/+1
*For speed and efficiency.* | | | |
MAG 2020-07 ingest notes | Bryan Newbold | 2020-07-08 | 1 | -0/+159
add more slow PDF hashes | Bryan Newbold | 2020-07-05 | 1 | -0/+2
add another bad PDF sha1hex | Bryan Newbold | 2020-07-02 | 1 | -0/+1
seaweedfs proposal: fix typos and wording | Martin Czygan | 2020-07-01 | 1 | -9/+11