Commit message | Author | Age | Files | Lines | ||
---|---|---|---|---|---|---|
... | ||||||
* | ingest: log every URL (from ia code side) | Bryan Newbold | 2020-03-18 | 1 | -0/+1 | |
| | ||||||
* | implement (unused) force_get flag for SPN2 | Bryan Newbold | 2020-03-18 | 2 | -4/+19 | |
| | | | | | | | | | I hoped this feature would make it possible to crawl journals.lww.com PDFs, because the token URLs work with `wget`, but it still doesn't seem to work. Maybe because of user agent? Anyways, this feature might be useful for crawling efficiency, so adding to master. | |||||
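A minimal sketch of how a `force_get` flag might be passed through to the SPN2 save endpoint (`https://web.archive.org/save`); the helper name and credential handling here are illustrative, not the actual sandcrawler client:

```python
import requests

SPN2_ENDPOINT = "https://web.archive.org/save"  # Save Page Now 2 API

def spn2_save(url, access_key, secret_key, force_get=False):
    # hypothetical helper; sandcrawler's real SPN2 client differs
    data = {"url": url, "capture_all": 1}
    if force_get:
        # ask SPN2 for a plain HTTP GET capture instead of a
        # headless-browser capture
        data["force_get"] = 1
    resp = requests.post(
        SPN2_ENDPOINT,
        data=data,
        headers={
            "Accept": "application/json",
            "Authorization": "LOW {}:{}".format(access_key, secret_key),
        },
    )
    resp.raise_for_status()
    return resp.json()
```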
* | work around local redirect (resource.location) | Bryan Newbold | 2020-03-17 | 1 | -1/+6 | |
| | | | | | Some redirects are host-local. This patch crudely detects this (full-path redirects starting with "/" only) and joins the redirect path back onto the host of the original URL. | |||||
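A rough illustration of that workaround (not the actual `resource.location` handling code); `urllib.parse.urljoin` would cover relative redirects more generally, but the commit deliberately limits itself to full-path redirects:

```python
from urllib.parse import urlparse

def resolve_local_redirect(original_url, location):
    """Join a host-local redirect path back onto the original URL's host."""
    if location.startswith("/"):
        parsed = urlparse(original_url)
        return "{}://{}{}".format(parsed.scheme, parsed.netloc, location)
    return location
```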
* | Merge branch 'martin-abstract-class-process' into 'master' | bnewbold | 2020-03-12 | 1 | -0/+6 | |
|\ | | | | | | | | | workers: add explicit process to base class See merge request webgroup/sandcrawler!25 | |||||
| * | workers: add explicit process to base class | Martin Czygan | 2020-03-12 | 1 | -0/+6 | |
| | | | | | | | | | | | | | | | | As per https://docs.python.org/3/library/exceptions.html#NotImplementedError > In user defined base classes, abstract methods should raise this exception when they require derived classes to override the method [...]. | |||||
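The pattern being referenced, sketched with illustrative class names (the real base class lives in the sandcrawler workers module):

```python
class SandcrawlerWorker:
    """Base class; subclasses are expected to override process()."""

    def process(self, record):
        raise NotImplementedError("subclasses must implement process()")

class ExampleWorker(SandcrawlerWorker):
    def process(self, record):
        # per-record work happens here in the derived class
        return {"status": "ok", "record": record}
```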
* | | pipenv: work around zipp issue | Bryan Newbold | 2020-03-10 | 2 | -4/+16 | |
| | | ||||||
* | | pipenv: add urlcanon; update Pipfile.lock | Bryan Newbold | 2020-03-10 | 2 | -209/+221 |
| | | ||||||
* | | use local env in python scripts | Bryan Newbold | 2020-03-10 | 3 | -3/+3 | |
| | | | | | | | | | | Without this correct/canonical shebang invocation, virtualenvs (pipenv) don't work. | |||||
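The "correct/canonical" shebang here is presumably the env-based form, which resolves the interpreter via PATH so a pipenv/virtualenv Python is picked up:

```python
#!/usr/bin/env python3
# Resolves python3 from the active environment's PATH; a hard-coded
# path like /usr/bin/python3 would bypass the pipenv virtualenv.
```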
* | | url cleaning (canonicalization) for ingest base_url | Bryan Newbold | 2020-03-10 | 4 | -4/+21 | |
| | | | | | | | | | | | | | | | | | | | | | As mentioned in comment, this first version does not re-write the URL in the `base_url` field. If we did so, then ingest_request rows would not SQL JOIN to ingest_file_result rows, which we wouldn't want. In the future, behaviour should maybe be to refuse to process URLs that aren't clean (eg, if base_url != clean_url(base_url)) and return a 'bad-url' status or something. Then we would only accept clean URLs in both tables, and clear out all old/bad URLs with a cleanup script. | |||||
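A sketch of what the cleaning step could look like with the `urlcanon` library added in the pipenv commit above; the `clean_url()` name is illustrative, not necessarily what sandcrawler uses:

```python
import urlcanon

def clean_url(url):
    parsed = urlcanon.parse_url(url.strip())
    urlcanon.whatwg(parsed)  # canonicalizes the parsed URL in place
    return str(parsed)

# The stricter future behaviour described above would be roughly:
#   if request["base_url"] != clean_url(request["base_url"]):
#       return {"status": "bad-url"}
```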
* | | ingest_file: --no-spn2 flag for single command | Bryan Newbold | 2020-03-10 | 1 | -1/+6 | |
|/ | ||||||
* | ingestrequest_row2json: skip on unicode errors | Bryan Newbold | 2020-03-05 | 1 | -1/+4 | |
| | ||||||
* | fixes to ingest-request persist | Bryan Newbold | 2020-03-05 | 2 | -4/+2 | |
| | ||||||
* | persist: ingest_request tool (with no ingest_file_result) | Bryan Newbold | 2020-03-05 | 3 | -1/+48 | |
| | ||||||
* | ingest_tool: force-recrawl arg | Bryan Newbold | 2020-03-05 | 1 | -0/+5 | |
| | ||||||
* | ia: catch wayback ChunkedEncodingError | Bryan Newbold | 2020-03-05 | 1 | -0/+3 | |
| | ||||||
* | ingest: make content-decoding more robust | Bryan Newbold | 2020-03-03 | 1 | -1/+2 | |
| | ||||||
* | make gzip content-encoding path more robust | Bryan Newbold | 2020-03-03 | 1 | -1/+10 | |
| | ||||||
* | ingest: crude content-encoding support | Bryan Newbold | 2020-03-02 | 1 | -1/+19 | |
| | | | | | | This perhaps should be handled in IA wrapper tool directly, instead of in ingest code. Or really, possibly a bug in wayback python library or SPN? | |||||
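Roughly the kind of Content-Encoding handling being described, as an illustrative sketch rather than the actual ingest code:

```python
import gzip
import zlib

def decode_body(body, content_encoding):
    """Best-effort decode of a raw HTTP body based on Content-Encoding."""
    encoding = (content_encoding or "identity").lower()
    if encoding in ("identity", ""):
        return body
    if encoding == "gzip":
        try:
            return gzip.decompress(body)
        except (OSError, EOFError):
            # some records claim gzip but the body is already decoded
            return body
    if encoding == "deflate":
        return zlib.decompress(body)
    raise ValueError("unsupported content-encoding: {}".format(encoding))
```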
* | ingest: add force_recrawl flag to skip historical wayback lookup | Bryan Newbold | 2020-03-02 | 1 | -3/+5 | |
| | ||||||
* | remove protocols.io octet-stream hack | Bryan Newbold | 2020-03-02 | 1 | -6/+2 | |
| | ||||||
* | more mime normalization | Bryan Newbold | 2020-02-27 | 1 | -1/+18 | |
| | ||||||
* | ingest: narrow xhtml filter | Bryan Newbold | 2020-02-25 | 1 | -1/+1 | |
| | ||||||
* | pdftrio: tweaks to avoid connection errors | Bryan Newbold | 2020-02-24 | 1 | -1/+9 | |
| | ||||||
* | fix warc_offset -> offset | Bryan Newbold | 2020-02-24 | 1 | -1/+1 | |
| | ||||||
* | ingest: handle broken revisit records | Bryan Newbold | 2020-02-24 | 1 | -1/+4 | |
| | ||||||
* | ingest: handle missing chemrxiv tag | Bryan Newbold | 2020-02-24 | 1 | -1/+1 |
| | ||||||
* | ingest: treat CDX lookup error as a wayback-error | Bryan Newbold | 2020-02-24 | 1 | -1/+4 | |
| | ||||||
* | ingest: more direct americanarchivist PDF url guess | Bryan Newbold | 2020-02-24 | 1 | -0/+4 | |
| | ||||||
* | ingest: make ehp.niehs.nih.gov rule more robust | Bryan Newbold | 2020-02-24 | 1 | -2/+3 | |
| | ||||||
* | small tweak to americanarchivist.org URL extraction | Bryan Newbold | 2020-02-24 | 1 | -1/+1 | |
| | ||||||
* | fetch_petabox_body: allow non-200 status code fetches | Bryan Newbold | 2020-02-24 | 1 | -2/+10 | |
| | | | | | | But only if it matches what the revisit record indicated. This is mostly to enable better revisit fetching. | |||||
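The status-check relaxation, sketched in simplified form (the real `fetch_petabox_body()` works against petabox/WARC records and has a different signature):

```python
def status_acceptable(fetched_status, revisit_status):
    """Accept a non-200 fetch only when it matches the status that the
    revisit/CDX record reported for the original capture."""
    if fetched_status == 200:
        return True
    return revisit_status is not None and fetched_status == revisit_status
```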
* | allow fuzzy revisit matches | Bryan Newbold | 2020-02-24 | 1 | -1/+26 | |
| | ||||||
* | ingest: more revisit fixes | Bryan Newbold | 2020-02-22 | 1 | -4/+4 | |
| | ||||||
* | html: more publisher-specific fulltext extraction tricks | Bryan Newbold | 2020-02-22 | 1 | -0/+47 | |
| | ||||||
* | ia: improve warc/revisit implementation | Bryan Newbold | 2020-02-22 | 1 | -26/+46 | |
| | | | | A lot of the terminal-bad-status results seem to have been due to not handling revisits correctly; revisit records have status_code = '-' or None. | |||||
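A small sketch of normalizing those revisit status codes before comparison (illustrative, not the actual wayback-wrapper logic):

```python
def parse_cdx_status(raw_status):
    # revisit rows report "-" (or nothing) instead of a numeric status;
    # the real status has to be resolved from the original capture
    if raw_status in (None, "", "-"):
        return None
    return int(raw_status)
```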
* | html: degruyter extraction; disabled journals.lww.com | Bryan Newbold | 2020-02-22 | 1 | -0/+19 | |
| | ||||||
* | ingest: include better terminal URL/status_code/dt | Bryan Newbold | 2020-02-22 | 1 | -0/+8 | |
| | | | | Was getting a lot of "last hit" metadata for these columns. | |||||
* | ingest: skip more non-pdf, non-paper domains | Bryan Newbold | 2020-02-22 | 1 | -0/+9 | |
| | ||||||
* | cdx: handle empty/null CDX response | Bryan Newbold | 2020-02-22 | 1 | -0/+2 | |
| | | | Sometimes we seem to get an empty string back instead of an empty JSON list. | |||||
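A sketch of tolerating those empty responses when parsing CDX API output (which, with `output=json`, is normally a JSON list-of-lists whose first row is the field names):

```python
import json

def parse_cdx_response(resp_text):
    if not resp_text or not resp_text.strip():
        return []  # treat an empty body the same as an empty JSON list
    rows = json.loads(resp_text)
    return rows[1:] if rows else []  # skip the header row
```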
* | html: handle TypeError during bs4 parse | Bryan Newbold | 2020-02-22 | 1 | -1/+7 | |
| | ||||||
* | filter out CDX rows missing WARC playback fields | Bryan Newbold | 2020-02-19 | 1 | -0/+4 | |
| | ||||||
* | pdf_trio persist fixes from prod | Bryan Newbold | 2020-02-19 | 2 | -5/+9 | |
| | ||||||
* | allow <meta property=citation_pdf_url> | Bryan Newbold | 2020-02-18 | 1 | -0/+3 | |
| | | | | at least researchgate does this (!) | |||||
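A sketch of accepting both the `name=` and `property=` forms of the tag during HTML metadata extraction (illustrative; the real extraction code checks many more tags):

```python
from bs4 import BeautifulSoup

def find_citation_pdf_url(html):
    soup = BeautifulSoup(html, "html.parser")
    meta = (soup.find("meta", attrs={"name": "citation_pdf_url"})
            or soup.find("meta", attrs={"property": "citation_pdf_url"}))
    if meta and meta.get("content"):
        return meta["content"]
    return None
```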
* | X-Archive-Src more robust than X-Archive-Redirect-Reason | Bryan Newbold | 2020-02-18 | 1 | -2/+3 | |
| | ||||||
* | wayback: on bad redirects, log instead of assert | Bryan Newbold | 2020-02-18 | 1 | -2/+13 | |
| | | | | This is a different form of mangled redirect. | |||||
* | attempt to work around corrupt ARC files from alexa issue | Bryan Newbold | 2020-02-18 | 1 | -0/+5 | |
| | ||||||
* | unpaywall2ingestrequest transform script | Bryan Newbold | 2020-02-18 | 2 | -1/+104 | |
| | ||||||
* | pdftrio: mode controlled by CLI arg | Bryan Newbold | 2020-02-18 | 2 | -10/+14 | |
| | ||||||
* | pdftrio: fix error nesting in pdftrio key | Bryan Newbold | 2020-02-18 | 1 | -12/+20 | |
| | ||||||
* | include rel and oa_status in ingest request 'extra' | Bryan Newbold | 2020-02-18 | 2 | -2/+2 | |
| |