Commit message | Author | Age | Files | Lines
...
* blobs: start documenting seaweedfs backfill (Bryan Newbold, 2020-05-28; 1 file, -0/+53)
|
* move minio directory to 'blobs' (Bryan Newbold, 2020-05-28; 2 files, -0/+0)
|     Part of the migration from minio to seaweedfs; should be agnostic about what our actual blobstore (S3 API) is.
* remove deprecated kafka_grobid.py worker (Bryan Newbold, 2020-05-26; 1 file, -331/+0)
|     All use of pykafka was refactored to use the confluent library some time ago, and all kafka workers have been using the newer sandcrawler-style worker for some time.
* ingest notes (Bryan Newbold, 2020-05-26; 2 files, -6/+76)
|
* pipenv: remove old python3.5 cruft; add mypy (Bryan Newbold, 2020-05-26; 2 files, -185/+196)
|
*   Merge branch 'martin-seaweed-s3' into 'master' (bnewbold, 2020-05-26; 1 file, -0/+424)
|\
| |     notes on seaweedfs (s3 backend)
| |     See merge request webgroup/sandcrawler!28
| * notes on seaweedfs (s3 backend) (Martin Czygan, 2020-05-20; 1 file, -0/+424)
| |     Notes gathered during seaweedfs setup and test runs.
* | potential future backfill ingests (Bryan Newbold, 2020-05-26; 1 file, -0/+52)
| |
* | ingests: normalize file names; commit updates (Bryan Newbold, 2020-05-26; 10 files, -63/+279)
| |
* | start a python Makefile (Bryan Newbold, 2020-05-19; 1 file, -0/+15)
| |
* | handle UnboundLocalError in HTML parsing (Bryan Newbold, 2020-05-19; 1 file, -1/+4)
| |
* | first iteration of oai2ingestrequest script (Bryan Newbold, 2020-05-05; 1 file, -0/+137)
| |
* | summarize datacite and MAG 2020 crawls (Bryan Newbold, 2020-05-05; 2 files, -0/+200)
| |
* | update sandcrawler stats for early may (Bryan Newbold, 2020-05-04; 1 file, -0/+418)
| |
* | hotfix for html meta extract codepath (Bryan Newbold, 2020-05-03; 1 file, -1/+1)
| |     Didn't test last commit before pushing; bad Bryan!
* | ingest: handle partial citation_pdf_url tag (Bryan Newbold, 2020-05-03; 1 file, -0/+3)
| |     E.g. https://www.cureus.com/articles/29935-a-nomogram-for-the-rapid-prediction-of-hematocrit-following-blood-loss-and-fluid-shifts-in-neonates-infants-and-adults has: <meta name="citation_pdf_url"/>
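The fix for such a bare `<meta name="citation_pdf_url"/>` tag amounts to checking that the tag carries a non-empty `content` attribute before treating it as a PDF URL. A minimal sketch (class and function names are hypothetical, not sandcrawler's actual extractor, which reportedly uses other HTML tooling):

```python
from html.parser import HTMLParser

class CitationPdfUrlParser(HTMLParser):
    """Find the first citation_pdf_url meta tag with a non-empty content."""

    def __init__(self):
        super().__init__()
        self.pdf_url = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta" and self.pdf_url is None:
            d = dict(attrs)
            # guard: a bare <meta name="citation_pdf_url"/> has no content
            # attribute at all, and an empty string is also useless
            if d.get("name") == "citation_pdf_url" and d.get("content"):
                self.pdf_url = d["content"]

def find_citation_pdf_url(html_body):
    parser = CitationPdfUrlParser()
    parser.feed(html_body)
    return parser.pdf_url
```

Without the `d.get("content")` guard, indexing `d["content"]` on the partial tag would raise a `KeyError`.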
* | workers: add missing want() dataflow path (Bryan Newbold, 2020-04-30; 1 file, -0/+9)
| |
* | ingest: don't 'want' non-PDF ingest (Bryan Newbold, 2020-04-30; 1 file, -0/+5)
| |
*   | Merge branch 'bnewbold-worker-timeout' into 'master' (bnewbold, 2020-04-29; 3 files, -2/+58)
|\ \
| | |     sandcrawler worker timeouts
| | |     See merge request webgroup/sandcrawler!27
| * | timeouts: don't push through None error messages (Bryan Newbold, 2020-04-29; 1 file, -2/+2)
| | |
| * | timeout message implementation for GROBID and ingest workers (Bryan Newbold, 2020-04-27; 2 files, -0/+18)
| | |
| * | worker timeout wrapper, and use for kafka (Bryan Newbold, 2020-04-27; 1 file, -2/+40)
| | |
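A worker timeout wrapper of the kind this branch describes could plausibly be built on `SIGALRM`. This is an illustrative reconstruction under that assumption, not the actual sandcrawler code; all names here are made up:

```python
import signal

class WorkerTimeoutError(Exception):
    pass

def run_with_timeout(func, *args, timeout=300):
    # Interrupt the worker call if it runs longer than `timeout` seconds.
    # Note: SIGALRM only works in the main thread of the main interpreter,
    # and timeout must be an integer number of seconds.
    def _handler(signum, frame):
        raise WorkerTimeoutError(f"worker exceeded {timeout} seconds")

    old_handler = signal.signal(signal.SIGALRM, _handler)
    signal.alarm(timeout)
    try:
        return func(*args)
    finally:
        signal.alarm(0)  # always cancel the pending alarm
        signal.signal(signal.SIGALRM, old_handler)
```

A kafka consumer loop would wrap each message-processing call this way, so one stuck record fails with a timeout message instead of hanging the whole worker.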
* | | NSQ for job task manager/scheduler (Bryan Newbold, 2020-04-28; 1 file, -0/+79)
| | |
* | | update MAG crawl notes (Bryan Newbold, 2020-04-28; 1 file, -0/+71)
|/ /
* | kafka: more rebalance notes (Bryan Newbold, 2020-04-24; 1 file, -1/+14)
| |
* | CI: add missing libsnappy-dev and libsodium-dev system packages (Bryan Newbold, 2020-04-24; 1 file, -1/+1)
| |     Whack-a-mole here...
* | kafka: how to rebalance partitions between brokers (Bryan Newbold, 2020-04-24; 1 file, -0/+29)
|/
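The standard mechanism for moving partitions between Kafka brokers is the `kafka-reassign-partitions.sh` tool driven by a JSON reassignment plan. A minimal plan might look like the following; the topic name and broker IDs are placeholders, not values from this repo's notes:

```json
{
  "version": 1,
  "partitions": [
    {"topic": "example-topic", "partition": 0, "replicas": [1, 2]},
    {"topic": "example-topic", "partition": 1, "replicas": [2, 3]}
  ]
}
```

Such a plan is applied with `kafka-reassign-partitions.sh --reassignment-json-file plan.json --execute` and later checked with `--verify` (Kafka versions of this era address the cluster via a `--zookeeper` flag).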
* CI: add deadsnakes and python3.7 (Bryan Newbold, 2020-04-21; 1 file, -2/+3)
|
* fix KeyError in HTML PDF URL extraction (Bryan Newbold, 2020-04-17; 1 file, -1/+1)
|
* persist: only GROBID updates file_meta, not file-result (Bryan Newbold, 2020-04-16; 1 file, -1/+1)
|     The hope here is to reduce deadlocks in production (on aitio). As context: we are only doing "updates" until the entire file_meta table is filled in with full metadata anyway; updates are wasteful of resources, and for most inserts we have seen the file before, so we should be doing "DO NOTHING" if the SHA1 is already in the table.
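The "DO NOTHING if the SHA1 is already in the table" behavior corresponds to PostgreSQL's `ON CONFLICT` clause. A hedged sketch of what such an insert could look like; the column names are illustrative guesses, not the actual file_meta schema:

```python
# Illustrative only: real file_meta columns may differ.
FILE_META_INSERT_SQL = """
    INSERT INTO file_meta (sha1hex, sha256hex, md5hex, size_bytes, mimetype)
    VALUES (%s, %s, %s, %s, %s)
    ON CONFLICT (sha1hex) DO NOTHING;
"""

def insert_file_meta(cursor, rows):
    # Rows whose SHA-1 is already present are skipped entirely, so no row
    # lock is taken for an update; under concurrent workers this avoids the
    # lock ordering that produces deadlocks.
    cursor.executemany(FILE_META_INSERT_SQL, rows)
```

The trade-off named in the commit: an insert-or-skip never refreshes existing rows, which is acceptable while the table is still being filled in.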
* batch/multiprocess for ZipfilePusher (Bryan Newbold, 2020-04-16; 2 files, -5/+26)
|
* update README for python3.7 (Bryan Newbold, 2020-04-15; 1 file, -1/+1)
|
* pipenv: update to python3.7 (Bryan Newbold, 2020-04-15; 2 files, -197/+202)
|
* COVID-19 chinese paper ingest (Bryan Newbold, 2020-04-15; 2 files, -0/+156)
|
* 2020-04 unpaywall ingest (in progress) (Bryan Newbold, 2020-04-15; 1 file, -0/+63)
|
* 2020-04 datacite ingest (in progress) (Bryan Newbold, 2020-04-15; 1 file, -0/+18)
|
* partial notes on S2 crawl ingest (Bryan Newbold, 2020-04-15; 1 file, -0/+35)
|
* ingest: quick hack to capture CNKI outlinks (Bryan Newbold, 2020-04-13; 1 file, -2/+9)
|
* html: attempt at CNKI href extraction (Bryan Newbold, 2020-04-13; 1 file, -0/+11)
|
* MAG import notes (Bryan Newbold, 2020-04-13; 1 file, -0/+13)
|
* unpaywall2ingestrequest: canonicalize URL (Bryan Newbold, 2020-04-07; 1 file, -1/+9)
|
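URL canonicalization, as in the unpaywall2ingestrequest commit above, typically means normalizing trivially-different forms of the same URL so duplicates collapse to one ingest request. The commit gives no details, so this is only a generic sketch with a hypothetical function name:

```python
from urllib.parse import urlsplit, urlunsplit

def canon_url(raw_url):
    # Generic canonicalization sketch: strip surrounding whitespace,
    # lowercase the scheme and host, and drop the fragment. The path and
    # query are left untouched since they can be case-sensitive.
    parts = urlsplit(raw_url.strip())
    return urlunsplit(
        (parts.scheme.lower(), parts.netloc.lower(), parts.path, parts.query, "")
    )
```

Two requests for `HTTP://Example.COM/a.pdf` and `http://example.com/a.pdf#page=2` would then map to the same canonical URL.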
* MAG 2020-03-04 ingest notes to date (Bryan Newbold, 2020-04-06; 1 file, -0/+395)
|
* more monitoring queries (Bryan Newbold, 2020-03-30; 1 file, -5/+29)
|
* unpaywall ingest notes update (Bryan Newbold, 2020-03-30; 1 file, -0/+138)
|
* ia: set User-Agent for replay fetch from wayback (Bryan Newbold, 2020-03-29; 1 file, -0/+5)
|     Did this for all the other "client" helpers, but forgot to do so for wayback replay; was starting to get "445" errors from wayback.
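Setting an explicit User-Agent on a fetch is a one-line change; a minimal sketch with stdlib urllib (the function name and UA string are assumptions, and sandcrawler's helpers may use a different HTTP client):

```python
import urllib.request

def replay_request(url, user_agent="sandcrawler-bot/0.1"):
    # Build the wayback replay request with an explicit User-Agent header,
    # matching what the other "client" helpers already send; without it,
    # wayback may reject the fetch.
    return urllib.request.Request(url, headers={"User-Agent": user_agent})
```

The fix is purely in the request headers; the response handling is unchanged.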
* ingest: block another large domain (and DOI prefix) (Bryan Newbold, 2020-03-27; 1 file, -0/+2)
|
* ingest: better spn2 pending error code (Bryan Newbold, 2020-03-27; 1 file, -0/+2)
|
* ingest: eurosurveillance PDF parser (Bryan Newbold, 2020-03-25; 1 file, -0/+11)
|
* ia: more conservative use of clean_url() (Bryan Newbold, 2020-03-24; 1 file, -3/+5)
|     Fixes "AttributeError: 'NoneType' object has no attribute 'strip'", seen in production on the lookup_resource code path.
* ingest: clean_url() in more places (Bryan Newbold, 2020-03-23; 3 files, -1/+6)
|     Some 'cdx-error' results were due to URLs with ':' after the hostname or trailing newline ("\n") characters in the URL. This attempts to work around this category of error.
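Both malformations named in this commit (a bare ':' after the hostname, and trailing newlines) can be handled by a small normalization pass. A hypothetical reconstruction of what a clean_url() could look like; sandcrawler's real implementation may use a dedicated canonicalization library instead:

```python
from urllib.parse import urlsplit, urlunsplit

def clean_url(raw_url):
    # Strip whitespace/newline characters, then re-serialize the URL so
    # an empty port ("http://host:/path") loses the stray ':'.
    parts = urlsplit(raw_url.strip())
    netloc = parts.netloc.rstrip(":")  # drop ':' with no port after it
    return urlunsplit((parts.scheme, netloc, parts.path, parts.query, parts.fragment))
```

Applying this before CDX lookups keeps the malformed originals from surfacing as 'cdx-error' results downstream.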