Commit log (most recent first). Each entry: commit message (author, date; files changed, -lines/+lines). Commits from merged branches are indented under their merge commit.

* add new pdf workers/persisters (Bryan Newbold, 2020-06-17; 4 files, -2/+214)
* pdf: mypy and typo fixes (Bryan Newbold, 2020-06-17; 2 files, -15/+22)
* workers: refactor to pass key to process() (Bryan Newbold, 2020-06-17; 6 files, -20/+28)
* pipenv: correct poppler; update lockfile (Bryan Newbold, 2020-06-16; 2 files, -76/+255)
* pipenv: flake8, pytype, black (Bryan Newbold, 2020-06-16; 1 file, -0/+7)
* pipenv: pillow and poppler (for PDF extraction) (Bryan Newbold, 2020-06-16; 1 file, -0/+2)
* initial work on PDF extraction worker (Bryan Newbold, 2020-06-16; 2 files, -1/+158)
  This worker fetches full PDFs, then extracts thumbnails, raw text, and PDF metadata. Similar to the GROBID worker.
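The worker above relies on pillow and poppler (per the pipenv commits) for the real extraction. As a rough, stdlib-only illustration of the metadata half, here is a toy sketch that pulls the version header and /Title string out of raw PDF bytes; the function name is made up, and regexes like these only work on trivial, uncompressed PDFs:

```python
import re

def extract_pdf_metadata(blob: bytes) -> dict:
    """Toy metadata extraction from raw PDF bytes.

    A real worker would use poppler bindings; these regexes only
    handle uncompressed, unencrypted Info dictionaries.
    """
    meta = {}
    header = re.match(rb"%PDF-(\d\.\d)", blob)
    if header:
        meta["pdf_version"] = header.group(1).decode()
    title = re.search(rb"/Title\s*\((.*?)\)", blob)
    if title:
        meta["title"] = title.group(1).decode("latin-1")
    return meta
```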
* pdf_thumbnail script: demonstrate PDF thumbnail generation (Bryan Newbold, 2020-06-16; 1 file, -0/+35)
* refactor worker fetch code into wrapper class (Bryan Newbold, 2020-06-16; 3 files, -141/+111)
* rename KafkaGrobidSink -> KafkaCompressSink (Bryan Newbold, 2020-06-16; 3 files, -3/+3)
* ingest: OAI-PMH count table (Bryan Newbold, 2020-05-28; 1 file, -0/+24)
* blobs: start documenting seaweedfs backfill (Bryan Newbold, 2020-05-28; 1 file, -0/+53)
* move minio directory to 'blobs' (Bryan Newbold, 2020-05-28; 2 files, -0/+0)
  Part of the migration from minio to seaweedfs; should be agnostic about what our actual blobstore (S3 API) is.
* remove deprecated kafka_grobid.py worker (Bryan Newbold, 2020-05-26; 1 file, -331/+0)
  All use of pykafka was refactored to use the confluent library some time ago, and all Kafka workers have been using the newer sandcrawler-style worker for some time.
* ingest notes (Bryan Newbold, 2020-05-26; 2 files, -6/+76)
* pipenv: remove old python3.5 cruft; add mypy (Bryan Newbold, 2020-05-26; 2 files, -185/+196)
* Merge branch 'martin-seaweed-s3' into 'master' (bnewbold, 2020-05-26; 1 file, -0/+424)
  notes on seaweedfs (s3 backend). See merge request webgroup/sandcrawler!28
  * notes on seaweedfs (s3 backend) (Martin Czygan, 2020-05-20; 1 file, -0/+424)
    Notes gathered during seaweedfs setup and test runs.
* potential future backfill ingests (Bryan Newbold, 2020-05-26; 1 file, -0/+52)
* ingests: normalize file names; commit updates (Bryan Newbold, 2020-05-26; 10 files, -63/+279)
* start a python Makefile (Bryan Newbold, 2020-05-19; 1 file, -0/+15)
* handle UnboundLocalError in HTML parsing (Bryan Newbold, 2020-05-19; 1 file, -1/+4)
* first iteration of oai2ingestrequest script (Bryan Newbold, 2020-05-05; 1 file, -0/+137)
* summarize datacite and MAG 2020 crawls (Bryan Newbold, 2020-05-05; 2 files, -0/+200)
* update sandcrawler stats for early May (Bryan Newbold, 2020-05-04; 1 file, -0/+418)
* hotfix for html meta extract codepath (Bryan Newbold, 2020-05-03; 1 file, -1/+1)
  Didn't test the last commit before pushing; bad Bryan!
* ingest: handle partial citation_pdf_url tag (Bryan Newbold, 2020-05-03; 1 file, -0/+3)
  E.g. https://www.cureus.com/articles/29935-a-nomogram-for-the-rapid-prediction-of-hematocrit-following-blood-loss-and-fluid-shifts-in-neonates-infants-and-adults has: <meta name="citation_pdf_url"/>
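The fix presumably guards against a citation_pdf_url meta tag that carries no content attribute, like the Cureus example. A minimal stdlib sketch of that guard (sandcrawler's actual HTML extraction code differs; the class and function names here are illustrative):

```python
from html.parser import HTMLParser

class CitationMetaParser(HTMLParser):
    """Record citation_pdf_url only when the tag carries a real value."""

    def __init__(self):
        super().__init__()
        self.pdf_url = None

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        d = dict(attrs)
        # A partial tag like <meta name="citation_pdf_url"/> has no
        # content attribute; skip it rather than recording None or "".
        if d.get("name") == "citation_pdf_url" and d.get("content"):
            self.pdf_url = d["content"]

def find_citation_pdf_url(html: str):
    parser = CitationMetaParser()
    parser.feed(html)
    return parser.pdf_url
```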
* workers: add missing want() dataflow path (Bryan Newbold, 2020-04-30; 1 file, -0/+9)
* ingest: don't 'want' non-PDF ingest (Bryan Newbold, 2020-04-30; 1 file, -0/+5)
* Merge branch 'bnewbold-worker-timeout' into 'master' (bnewbold, 2020-04-29; 3 files, -2/+58)
  sandcrawler worker timeouts. See merge request webgroup/sandcrawler!27
  * timeouts: don't push through None error messages (Bryan Newbold, 2020-04-29; 1 file, -2/+2)
  * timeout message implementation for GROBID and ingest workers (Bryan Newbold, 2020-04-27; 2 files, -0/+18)
  * worker timeout wrapper, and use for kafka (Bryan Newbold, 2020-04-27; 1 file, -2/+40)
* NSQ for job task manager/scheduler (Bryan Newbold, 2020-04-28; 1 file, -0/+79)
* update MAG crawl notes (Bryan Newbold, 2020-04-28; 1 file, -0/+71)
* kafka: more rebalance notes (Bryan Newbold, 2020-04-24; 1 file, -1/+14)
* CI: add missing libsnappy-dev and libsodium-dev system packages (Bryan Newbold, 2020-04-24; 1 file, -1/+1)
  Whack-a-mole here...
* kafka: how to rebalance partitions between brokers (Bryan Newbold, 2020-04-24; 1 file, -0/+29)
* CI: add deadsnakes and python3.7 (Bryan Newbold, 2020-04-21; 1 file, -2/+3)
* fix KeyError in HTML PDF URL extraction (Bryan Newbold, 2020-04-17; 1 file, -1/+1)
* persist: only GROBID updates file_meta, not file-result (Bryan Newbold, 2020-04-16; 1 file, -1/+1)
  The hope here is to reduce deadlocks in production (on aitio). As context: we are only doing "updates" until the entire file_meta table is filled in with full metadata anyway. Updates are wasteful of resources, and for most inserts we have seen the file before, so we should be doing "DO NOTHING" if the SHA1 is already in the table.
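The "DO NOTHING if the SHA1 is already in the table" idea can be sketched with stdlib sqlite3, which supports the same ON CONFLICT ... DO NOTHING clause as the PostgreSQL that sandcrawler actually runs; the size column and helper name are illustrative, not the real schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE file_meta (sha1 TEXT PRIMARY KEY, size INTEGER)")

def insert_file_meta(sha1: str, size: int) -> int:
    """Insert a row, skipping silently if the SHA1 is already known.

    Returns 1 if a row was inserted, 0 if it already existed.
    Skipping (rather than updating) avoids redundant writes and the
    lock contention that comes with them.
    """
    cur = conn.execute(
        "INSERT INTO file_meta (sha1, size) VALUES (?, ?) "
        "ON CONFLICT(sha1) DO NOTHING",
        (sha1, size),
    )
    return cur.rowcount
```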
* batch/multiprocess for ZipfilePusher (Bryan Newbold, 2020-04-16; 2 files, -5/+26)
* update README for python3.7 (Bryan Newbold, 2020-04-15; 1 file, -1/+1)
* pipenv: update to python3.7 (Bryan Newbold, 2020-04-15; 2 files, -197/+202)
* COVID-19 Chinese paper ingest (Bryan Newbold, 2020-04-15; 2 files, -0/+156)
* 2020-04 unpaywall ingest (in progress) (Bryan Newbold, 2020-04-15; 1 file, -0/+63)
* 2020-04 datacite ingest (in progress) (Bryan Newbold, 2020-04-15; 1 file, -0/+18)
* partial notes on S2 crawl ingest (Bryan Newbold, 2020-04-15; 1 file, -0/+35)
* ingest: quick hack to capture CNKI outlinks (Bryan Newbold, 2020-04-13; 1 file, -2/+9)
* html: attempt at CNKI href extraction (Bryan Newbold, 2020-04-13; 1 file, -0/+11)