| Commit message | Author | Date | Files | Lines |
|---|---|---|---|---|
| ... | | | | |
| unpaywall ingest notes update | Bryan Newbold | 2020-03-30 | 1 | -0/+138 |
| ia: set User-Agent for replay fetch from wayback<br>Did this for all the other "client" helpers, but forgot to for wayback replay. Was starting to get "445" errors from wayback. | Bryan Newbold | 2020-03-29 | 1 | -0/+5 |
| ingest: block another large domain (and DOI prefix) | Bryan Newbold | 2020-03-27 | 1 | -0/+2 |
| ingest: better spn2 pending error code | Bryan Newbold | 2020-03-27 | 1 | -0/+2 |
| ingest: eurosurveillance PDF parser | Bryan Newbold | 2020-03-25 | 1 | -0/+11 |
| ia: more conservative use of clean_url()<br>Fixes `AttributeError: 'NoneType' object has no attribute 'strip'`, seen in production on the lookup_resource code path. | Bryan Newbold | 2020-03-24 | 1 | -3/+5 |
| ingest: clean_url() in more places<br>Some 'cdx-error' results were due to URLs with ':' after the hostname or trailing newline ("\n") characters in the URL. This attempts to work around this category of error. | Bryan Newbold | 2020-03-23 | 3 | -1/+6 |
| Merge branch 'martin-pubmed-ftp-topic-docs' into 'master'<br>topics: add pubmed ftp topic. See merge request webgroup/sandcrawler!26 | bnewbold | 2020-03-20 | 1 | -1/+9 |
| topics: add pubmed ftp topic<br>PubmedFTPWorker replaced OAI recently. This documents the new topic. | Martin Czygan | 2020-03-12 | 1 | -1/+9 |
| skip-db option also for worker | Bryan Newbold | 2020-03-19 | 1 | -0/+4 |
| persist grobid: add option to skip S3 upload<br>Motivation for this is that the current S3 target (minio) is overloaded, with too many files on a single partition (80 million+). Going to look into seaweedfs and other options, but for now stopping minio persist. Data is all stored in kafka anyways. | Bryan Newbold | 2020-03-19 | 2 | -7/+14 |
| ingest: log every URL (from ia code side) | Bryan Newbold | 2020-03-18 | 1 | -0/+1 |
| implement (unused) force_get flag for SPN2<br>I hoped this feature would make it possible to crawl journals.lww.com PDFs, because the token URLs work with `wget`, but it still doesn't seem to work. Maybe because of user agent? Anyways, this feature might be useful for crawling efficiency, so adding to master. | Bryan Newbold | 2020-03-18 | 2 | -4/+19 |
| unpaywall large ingest notes | Bryan Newbold | 2020-03-17 | 1 | -0/+10 |
| make monitoring commands ingest_request local, not ingest_file_result | Bryan Newbold | 2020-03-17 | 1 | -2/+2 |
| work around local redirect (resource.location)<br>Some redirects are host-local. This patch crudely detects this (full-path redirects starting with "/" only), and appends the URL to the host of the original URL. | Bryan Newbold | 2020-03-17 | 1 | -1/+6 |
| Merge branch 'martin-abstract-class-process' into 'master'<br>workers: add explicit process to base class. See merge request webgroup/sandcrawler!25 | bnewbold | 2020-03-12 | 1 | -0/+6 |
| workers: add explicit process to base class<br>As per https://docs.python.org/3/library/exceptions.html#NotImplementedError: "In user defined base classes, abstract methods should raise this exception when they require derived classes to override the method [...]" | Martin Czygan | 2020-03-12 | 1 | -0/+6 |
| pipenv: work around zipp issue | Bryan Newbold | 2020-03-10 | 2 | -4/+16 |
| pipenv: add urlcanon; update Pipfile.lock | Bryan Newbold | 2020-03-10 | 2 | -209/+221 |
| DOI prefix example queries (SQL) | Bryan Newbold | 2020-03-10 | 1 | -3/+17 |
| use local env in python scripts<br>Without this correct/canonical shebang invocation, virtualenvs (pipenv) don't work. | Bryan Newbold | 2020-03-10 | 3 | -3/+3 |
| url cleaning (canonicalization) for ingest base_url<br>As mentioned in comment, this first version does not re-write the URL in the `base_url` field. If we did so, then ingest_request rows would not SQL JOIN to ingest_file_result rows, which we wouldn't want. In the future, behaviour should maybe be to refuse to process URLs that aren't clean (eg, if base_url != clean_url(base_url)) and return a 'bad-url' status or something. Then we would only accept clean URLs in both tables, and clear out all old/bad URLs with a cleanup script. | Bryan Newbold | 2020-03-10 | 4 | -4/+21 |
| ingest_file: --no-spn2 flag for single command | Bryan Newbold | 2020-03-10 | 1 | -1/+6 |
| helpful daily/weekly monitoring SQL queries | Bryan Newbold | 2020-03-10 | 1 | -0/+94 |
| sandcrawler schema: add MD5 index | Bryan Newbold | 2020-03-05 | 1 | -0/+1 |
| more unpaywall ingest notes | Bryan Newbold | 2020-03-05 | 1 | -0/+416 |
| ingestrequest_row2json: skip on unicode errors | Bryan Newbold | 2020-03-05 | 1 | -1/+4 |
| fixes to ingest-request persist | Bryan Newbold | 2020-03-05 | 2 | -4/+2 |
| persist: ingest_request tool (with no ingest_file_result) | Bryan Newbold | 2020-03-05 | 3 | -1/+48 |
| ingest_tool: force-recrawl arg | Bryan Newbold | 2020-03-05 | 1 | -0/+5 |
| ia: catch wayback ChunkedEncodingError | Bryan Newbold | 2020-03-05 | 1 | -0/+3 |
| ingest: make content-decoding more robust | Bryan Newbold | 2020-03-03 | 1 | -1/+2 |
| update (and move) ingest notes | Bryan Newbold | 2020-03-03 | 6 | -0/+480 |
| make gzip content-encoding path more robust | Bryan Newbold | 2020-03-03 | 1 | -1/+10 |
| ingest: crude content-encoding support<br>This perhaps should be handled in IA wrapper tool directly, instead of in ingest code. Or really, possibly a bug in wayback python library or SPN? | Bryan Newbold | 2020-03-02 | 1 | -1/+19 |
| ingest: add force_recrawl flag to skip historical wayback lookup | Bryan Newbold | 2020-03-02 | 2 | -3/+6 |
| more SQL queries | Bryan Newbold | 2020-03-02 | 1 | -0/+57 |
| remove protocols.io octet-stream hack | Bryan Newbold | 2020-03-02 | 1 | -6/+2 |
| more mime normalization | Bryan Newbold | 2020-02-27 | 1 | -1/+18 |
| ingest: narrow xhtml filter | Bryan Newbold | 2020-02-25 | 1 | -1/+1 |
| pdftrio: tweaks to avoid connection errors | Bryan Newbold | 2020-02-24 | 1 | -1/+9 |
| fix warc_offset -> offset | Bryan Newbold | 2020-02-24 | 1 | -1/+1 |
| ingest: handle broken revisit records | Bryan Newbold | 2020-02-24 | 1 | -1/+4 |
| recent sandcrawler-db / ingest stats (interesting) | Bryan Newbold | 2020-02-24 | 2 | -0/+488 |
| ingest: handle missing chemrxvi tag | Bryan Newbold | 2020-02-24 | 1 | -1/+1 |
| ingest: treat CDX lookup error as a wayback-error | Bryan Newbold | 2020-02-24 | 1 | -1/+4 |
| ingest: more direct americanarchivist PDF url guess | Bryan Newbold | 2020-02-24 | 1 | -0/+4 |
| ingest backfill notes | Bryan Newbold | 2020-02-24 | 3 | -0/+150 |
| ingest: make ehp.niehs.nih.gov rule more robust | Bryan Newbold | 2020-02-24 | 1 | -2/+3 |
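Several commits in this log revolve around URL cleaning: the "clean_url() in more places" commit notes that 'cdx-error' results were caused by URLs with a stray ':' after the hostname or trailing newline characters. A minimal stdlib-only sketch of that kind of normalization (a hypothetical helper for illustration, not the repository's actual `clean_url()`, which pulls in the `urlcanon` dependency added above):

```python
from urllib.parse import urlsplit, urlunsplit

def clean_url(raw: str) -> str:
    # Strip surrounding whitespace, including trailing "\n" characters
    # that were producing 'cdx-error' results.
    url = raw.strip()
    parts = urlsplit(url)
    # Drop an empty port, e.g. "http://example.com:/path", which some
    # CDX tooling rejects.
    netloc = parts.netloc
    if netloc.endswith(":"):
        netloc = netloc[:-1]
    return urlunsplit((parts.scheme, netloc, parts.path, parts.query, parts.fragment))
```

As the "url cleaning (canonicalization) for ingest base_url" commit message suggests, a stricter design would refuse requests where `base_url != clean_url(base_url)` and return a 'bad-url' status, so that only clean URLs ever reach the ingest_request and ingest_file_result tables.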