Commit message  [Author, Date, Files, Lines changed]
...
* | | persist grobid: add option to skip S3 upload  [Bryan Newbold, 2020-03-19, 2 files, -7/+14]
| | |     Motivation: the current S3 target (minio) is overloaded, with too many
| | |     files on a single partition (80 million+). Going to look into seaweedfs
| | |     and other options, but for now stopping minio persist. The data is all
| | |     stored in kafka anyway.
* | | ingest: log every URL (from ia code side)  [Bryan Newbold, 2020-03-18, 1 file, -0/+1]
| | |
* | | implement (unused) force_get flag for SPN2  [Bryan Newbold, 2020-03-18, 2 files, -4/+19]
| | |     I hoped this feature would make it possible to crawl journals.lww.com
| | |     PDFs, because the token URLs work with `wget`, but it still doesn't
| | |     seem to work. Maybe because of the user agent? Regardless, this feature
| | |     might be useful for crawl efficiency, so adding it to master.
* | | unpaywall large ingest notes  [Bryan Newbold, 2020-03-17, 1 file, -0/+10]
| | |
* | | make monitoring commands ingest_request local, not ingest_file_result  [Bryan Newbold, 2020-03-17, 1 file, -2/+2]
| | |
* | | work around local redirect (resource.location)  [Bryan Newbold, 2020-03-17, 1 file, -1/+6]
| | |     Some redirects are host-local. This patch crudely detects that case
| | |     (only full-path redirects starting with "/") and joins the redirect
| | |     path onto the host of the original URL.
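The host-local redirect workaround described above could be sketched roughly like this (an illustrative stdlib-only sketch; the function name is hypothetical, not the actual sandcrawler code):

```python
from urllib.parse import urlparse

def resolve_location(original_url, location):
    """Resolve a redirect Location that may be host-local (a bare "/path")."""
    if location.startswith("/"):
        # host-local redirect: prepend the scheme and host of the original URL
        parsed = urlparse(original_url)
        return "{}://{}{}".format(parsed.scheme, parsed.netloc, location)
    return location
```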
* | | Merge branch 'martin-abstract-class-process' into 'master'  [bnewbold, 2020-03-12, 1 file, -0/+6]
|\ \ \
| | | |   workers: add explicit process to base class
| | | |
| | | |   See merge request webgroup/sandcrawler!25
| * | | workers: add explicit process to base class  [Martin Czygan, 2020-03-12, 1 file, -0/+6]
| |/ /
| | |     As per
| | |     https://docs.python.org/3/library/exceptions.html#NotImplementedError
| | |
| | |     > In user defined base classes, abstract methods should raise this
| | |     > exception when they require derived classes to override the method
| | |     > [...].
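The pattern the commit message quotes, as a minimal sketch (class names here are illustrative, not the actual sandcrawler worker classes):

```python
class BaseWorker:
    """Abstract base class for pipeline workers; subclasses override process()."""

    def process(self, record):
        # Explicit abstract method, per the Python docs on NotImplementedError
        raise NotImplementedError("derived classes must override process()")


class UppercaseWorker(BaseWorker):
    def process(self, record):
        return record.upper()
```

Calling `process()` on the base class raises immediately, surfacing a missing override early rather than failing in some confusing way downstream.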
* | | pipenv: work around zipp issue  [Bryan Newbold, 2020-03-10, 2 files, -4/+16]
| | |
* | | pipenv: add urlcanon; update pipefile.lock  [Bryan Newbold, 2020-03-10, 2 files, -209/+221]
| | |
* | | DOI prefix example queries (SQL)  [Bryan Newbold, 2020-03-10, 1 file, -3/+17]
| | |
* | | use local env in python scripts  [Bryan Newbold, 2020-03-10, 3 files, -3/+3]
| | |     Without the correct/canonical `#!/usr/bin/env` shebang invocation,
| | |     virtualenvs (pipenv) don't work.
* | | url cleaning (canonicalization) for ingest base_url  [Bryan Newbold, 2020-03-10, 4 files, -4/+21]
| | |     As mentioned in a comment, this first version does not re-write the URL
| | |     in the `base_url` field. If we did, ingest_request rows would no longer
| | |     SQL JOIN to ingest_file_result rows, which we don't want.
| | |
| | |     In the future, the behaviour should maybe be to refuse to process URLs
| | |     that aren't clean (eg, if base_url != clean_url(base_url)) and return a
| | |     'bad-url' status or something. Then we would only accept clean URLs in
| | |     both tables, and could clear out all old/bad URLs with a cleanup
| | |     script.
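A crude stdlib-only sketch of the `base_url != clean_url(base_url)` check mentioned above (the repo actually pulls in the `urlcanon` library for canonicalization; this sketch is only illustrative and the helper names are hypothetical):

```python
from urllib.parse import urlsplit, urlunsplit

def clean_url(url):
    """Crude URL canonicalization: lowercase scheme/host, drop the fragment."""
    parts = urlsplit(url.strip())
    return urlunsplit((
        parts.scheme.lower(),
        parts.netloc.lower(),
        parts.path or "/",
        parts.query,
        "",  # fragments never reach the server, so drop them
    ))

def check_request(base_url):
    """Refuse to process URLs that aren't already clean."""
    return "ok" if base_url == clean_url(base_url) else "bad-url"
```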
* | | ingest_file: --no-spn2 flag for single command  [Bryan Newbold, 2020-03-10, 1 file, -1/+6]
| | |
* | | helpful daily/weekly monitoring SQL queries  [Bryan Newbold, 2020-03-10, 1 file, -0/+94]
| | |
* | | sandcrawler schema: add MD5 index  [Bryan Newbold, 2020-03-05, 1 file, -0/+1]
|/ /
* | more unpaywall ingest notes  [Bryan Newbold, 2020-03-05, 1 file, -0/+416]
| |
* | ingestrequest_row2json: skip on unicode errors  [Bryan Newbold, 2020-03-05, 1 file, -1/+4]
| |
* | fixes to ingest-request persist  [Bryan Newbold, 2020-03-05, 2 files, -4/+2]
| |
* | persist: ingest_request tool (with no ingest_file_result)  [Bryan Newbold, 2020-03-05, 3 files, -1/+48]
| |
* | ingest_tool: force-recrawl arg  [Bryan Newbold, 2020-03-05, 1 file, -0/+5]
| |
* | ia: catch wayback ChunkedEncodingError  [Bryan Newbold, 2020-03-05, 1 file, -0/+3]
| |
* | ingest: make content-decoding more robust  [Bryan Newbold, 2020-03-03, 1 file, -1/+2]
| |
* | update (and move) ingest notes  [Bryan Newbold, 2020-03-03, 6 files, -0/+480]
| |
* | make gzip content-encoding path more robust  [Bryan Newbold, 2020-03-03, 1 file, -1/+10]
| |
* | ingest: crude content-encoding support  [Bryan Newbold, 2020-03-02, 1 file, -1/+19]
| |     This should perhaps be handled in the IA wrapper tool directly, instead
| |     of in the ingest code. Or it may really be a bug in the wayback python
| |     library or SPN?
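Crude content-encoding handling of the kind these commits describe could look something like this sketch (illustrative only, assuming a gzip Content-Encoding header; not the actual ingest code):

```python
import gzip

def decode_body(body, content_encoding):
    """Transparently gunzip a response body when Content-Encoding says gzip."""
    if content_encoding and content_encoding.lower() == "gzip":
        try:
            return gzip.decompress(body)
        except OSError:
            # be robust to mislabeled bodies that are already decoded
            return body
    return body
```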
* | ingest: add force_recrawl flag to skip historical wayback lookup  [Bryan Newbold, 2020-03-02, 2 files, -3/+6]
| |
* | more SQL queries  [Bryan Newbold, 2020-03-02, 1 file, -0/+57]
| |
* | remove protocols.io octet-stream hack  [Bryan Newbold, 2020-03-02, 1 file, -6/+2]
| |
* | more mime normalization  [Bryan Newbold, 2020-02-27, 1 file, -1/+18]
| |
* | ingest: narrow xhtml filter  [Bryan Newbold, 2020-02-25, 1 file, -1/+1]
| |
* | pdftrio: tweaks to avoid connection errors  [Bryan Newbold, 2020-02-24, 1 file, -1/+9]
| |
* | fix warc_offset -> offset  [Bryan Newbold, 2020-02-24, 1 file, -1/+1]
| |
* | ingest: handle broken revisit records  [Bryan Newbold, 2020-02-24, 1 file, -1/+4]
| |
* | recent sandcrawler-db / ingest stats (interesting)  [Bryan Newbold, 2020-02-24, 2 files, -0/+488]
| |
* | ingest: handle missing chemrxvi tag  [Bryan Newbold, 2020-02-24, 1 file, -1/+1]
| |
* | ingest: treat CDX lookup error as a wayback-error  [Bryan Newbold, 2020-02-24, 1 file, -1/+4]
| |
* | ingest: more direct americanarchivist PDF url guess  [Bryan Newbold, 2020-02-24, 1 file, -0/+4]
| |
* | ingest backfill notes  [Bryan Newbold, 2020-02-24, 3 files, -0/+150]
| |
* | ingest: make ehp.niehs.nih.gov rule more robust  [Bryan Newbold, 2020-02-24, 1 file, -2/+3]
| |
* | small tweak to americanarchivist.org URL extraction  [Bryan Newbold, 2020-02-24, 1 file, -1/+1]
| |
* | fetch_petabox_body: allow non-200 status code fetches  [Bryan Newbold, 2020-02-24, 1 file, -2/+10]
| |     But only if the status matches what the revisit record indicated. This
| |     is mostly to enable better revisit fetching.
* | allow fuzzy revisit matches  [Bryan Newbold, 2020-02-24, 1 file, -1/+26]
| |
* | ingest: more revisit fixes  [Bryan Newbold, 2020-02-22, 1 file, -4/+4]
| |
* | html: more publisher-specific fulltext extraction tricks  [Bryan Newbold, 2020-02-22, 1 file, -0/+47]
| |
* | ia: improve warc/revisit implementation  [Bryan Newbold, 2020-02-22, 1 file, -26/+46]
| |     A lot of the terminal-bad-status results seem to have been due to not
| |     handling revisits correctly: they have status_code = '-' or None.
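Normalizing those revisit status codes might look like this tiny sketch (a hypothetical helper, not the actual code):

```python
def normalize_status_code(raw):
    """CDX revisit records can carry status_code '-' or None; map both to unknown."""
    if raw in (None, "-", ""):
        return None
    return int(raw)
```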
* | html: degruyter extraction; disabled journals.lww.com  [Bryan Newbold, 2020-02-22, 1 file, -0/+19]
| |
* | ingest: include better terminal URL/status_code/dt  [Bryan Newbold, 2020-02-22, 1 file, -0/+8]
| |     Previously we were getting a lot of "last hit" metadata in these
| |     columns.
* | ingest: skip more non-pdf, non-paper domains  [Bryan Newbold, 2020-02-22, 1 file, -0/+9]
| |
* | cdx: handle empty/null CDX response  [Bryan Newbold, 2020-02-22, 1 file, -0/+2]
| |     We sometimes seem to get an empty string instead of an empty JSON list.
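Handling that empty-string case could be sketched as follows (illustrative only; the function name is hypothetical, not the actual cdx client code):

```python
import json

def parse_cdx_json(body):
    """The CDX API sometimes returns an empty string instead of an empty list."""
    if not body or not body.strip():
        return []
    return json.loads(body)
```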