Commit log for the sandcrawler repository (message, author, date; diffstat
given as files changed, lines removed/added). Indented entries are commits
brought in by the preceding merge commit.

...
* partial test coverage of pdf extract worker (Bryan Newbold, 2020-06-17; 2 files, -6/+70)
* fix coverage command (Bryan Newbold, 2020-06-17; 2 files, -2/+4)
* update grobid2json with type annotations (Bryan Newbold, 2020-06-17; 1 file, -94/+110)
* remove unused common.py (Bryan Newbold, 2020-06-17; 2 files, -139/+0)
* better DeprecationWarning filters (Bryan Newbold, 2020-06-17; 1 file, -3/+4)
* update Makefile from fatcat-scholar tweaks/commands (Bryan Newbold, 2020-06-17; 1 file, -3/+21)
* WIP on pdf_tool.py (Bryan Newbold, 2020-06-17; 1 file, -0/+137)
* add new pdf workers/persisters (Bryan Newbold, 2020-06-17; 4 files, -2/+214)
* pdf: mypy and typo fixes (Bryan Newbold, 2020-06-17; 2 files, -15/+22)
* workers: refactor to pass key to process() (Bryan Newbold, 2020-06-17; 6 files, -20/+28)
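
  A hypothetical sketch of what this signature change might look like; the
  class and method names here are illustrative, not the actual sandcrawler
  API:

  ```python
  from typing import Any, Optional

  class SandcrawlerWorker:
      """Hypothetical base class: process() now receives the Kafka message
      key alongside the record, instead of each subclass re-deriving it."""

      def process(self, record: Any, key: Optional[str] = None) -> dict:
          raise NotImplementedError

  class ExampleWorker(SandcrawlerWorker):
      def process(self, record: Any, key: Optional[str] = None) -> dict:
          # The key (for example a SHA-1 hex identifier) can now flow
          # straight through to sinks and persisters.
          return {"key": key, "status": "success"}
  ```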
* pipenv: correct poppler; update lockfile (Bryan Newbold, 2020-06-16; 2 files, -76/+255)
* pipenv: flake8, pytype, black (Bryan Newbold, 2020-06-16; 1 file, -0/+7)
* pipenv: pillow and poppler (for PDF extraction) (Bryan Newbold, 2020-06-16; 1 file, -0/+2)
* initial work on PDF extraction worker (Bryan Newbold, 2020-06-16; 2 files, -1/+158)
  This worker fetches full PDFs, then extracts thumbnails, raw text, and PDF
  metadata; similar to the GROBID worker.
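
  A minimal sketch of the extraction step such a worker might perform,
  assuming the python-poppler and pillow libraries added in the pipenv
  commits above; the function and result-field names are illustrative:

  ```python
  from io import BytesIO

  import poppler
  from PIL import Image

  def process_pdf_blob(blob: bytes) -> dict:
      pdf = poppler.load_from_data(blob)
      page = pdf.create_page(0)

      # Raw text of the first page; a real worker would iterate all pages
      text = page.text()

      # Render the first page and convert to a PIL thumbnail
      renderer = poppler.PageRenderer()
      img = renderer.render_page(page)
      pil_img = Image.frombuffer(
          "RGBA", (img.width, img.height), img.data, "raw", "BGRA", 0, 1
      )
      pil_img = pil_img.convert("RGB")  # JPEG has no alpha channel
      pil_img.thumbnail((180, 300))
      buf = BytesIO()
      pil_img.save(buf, format="JPEG")

      return {
          "status": "success",
          "text": text,
          "page0_thumbnail": buf.getvalue(),
          "pdf_info": {"page_count": pdf.pages},
      }
  ```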
* pdf_thumbnail script: demonstrate PDF thumbnail generation (Bryan Newbold, 2020-06-16; 1 file, -0/+35)
* refactor worker fetch code into wrapper class (Bryan Newbold, 2020-06-16; 3 files, -141/+111)
* rename KafkaGrobidSink -> KafkaCompressSink (Bryan Newbold, 2020-06-16; 3 files, -3/+3)
* ingest: OAI-PMH count table (Bryan Newbold, 2020-05-28; 1 file, -0/+24)
* blobs: start documenting seaweedfs backfill (Bryan Newbold, 2020-05-28; 1 file, -0/+53)
* move minio directory to 'blobs' (Bryan Newbold, 2020-05-28; 2 files, -0/+0)
  Part of the migration from minio to seaweedfs; the naming should be
  agnostic about what our actual blobstore (S3 API) is.
* remove deprecated kafka_grobid.py worker (Bryan Newbold, 2020-05-26; 1 file, -331/+0)
  All use of pykafka was refactored to use the confluent library some time
  ago, and all Kafka workers have been using the newer sandcrawler-style
  worker for some time.
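
  For reference, the confluent-kafka consume loop that replaced the pykafka
  pattern looks roughly like this; broker, topic, and group names are
  placeholders, not the actual sandcrawler configuration:

  ```python
  from confluent_kafka import Consumer, KafkaException

  consumer = Consumer({
      "bootstrap.servers": "localhost:9092",  # placeholder broker
      "group.id": "example-worker-group",     # placeholder group
      "enable.auto.commit": False,            # commit only after processing
  })
  consumer.subscribe(["example-topic"])

  try:
      while True:
          msg = consumer.poll(timeout=1.0)
          if msg is None:
              continue
          if msg.error():
              raise KafkaException(msg.error())
          # ... process msg.key() and msg.value() here ...
          consumer.commit(message=msg)
  finally:
      consumer.close()
  ```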
* ingest notes (Bryan Newbold, 2020-05-26; 2 files, -6/+76)
* pipenv: remove old python3.5 cruft; add mypy (Bryan Newbold, 2020-05-26; 2 files, -185/+196)
* Merge branch 'martin-seaweed-s3' into 'master' (bnewbold, 2020-05-26; 1 file, -0/+424)
  Notes on seaweedfs (s3 backend). See merge request webgroup/sandcrawler!28
  * notes on seaweedfs (s3 backend) (Martin Czygan, 2020-05-20; 1 file, -0/+424)
    Notes gathered during seaweedfs setup and test runs.
* potential future backfill ingests (Bryan Newbold, 2020-05-26; 1 file, -0/+52)
* ingests: normalize file names; commit updates (Bryan Newbold, 2020-05-26; 10 files, -63/+279)
* start a python Makefile (Bryan Newbold, 2020-05-19; 1 file, -0/+15)
* handle UnboundLocalError in HTML parsing (Bryan Newbold, 2020-05-19; 1 file, -1/+4)
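
  The commit message doesn't name the variable involved, but the usual shape
  of this bug class, and its fix, is sketched below (illustrative code, not
  the actual parsing function):

  ```python
  from typing import Optional

  def find_pdf_link(html: str) -> Optional[str]:
      # Bug shape: `result` used to be assigned only inside the branch
      # below, so inputs matching nothing raised UnboundLocalError at the
      # `return`. The fix: initialize it unconditionally.
      result: Optional[str] = None
      for line in html.splitlines():
          if ".pdf" in line.lower():
              result = line.strip()
              break
      return result
  ```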
* first iteration of oai2ingestrequest script (Bryan Newbold, 2020-05-05; 1 file, -0/+137)
* summarize datacite and MAG 2020 crawls (Bryan Newbold, 2020-05-05; 2 files, -0/+200)
* update sandcrawler stats for early May (Bryan Newbold, 2020-05-04; 1 file, -0/+418)
* hotfix for html meta extract codepath (Bryan Newbold, 2020-05-03; 1 file, -1/+1)
  Didn't test the last commit before pushing; bad Bryan!
* ingest: handle partial citation_pdf_url tag (Bryan Newbold, 2020-05-03; 1 file, -0/+3)
  E.g., https://www.cureus.com/articles/29935-a-nomogram-for-the-rapid-prediction-of-hematocrit-following-blood-loss-and-fluid-shifts-in-neonates-infants-and-adults
  has: <meta name="citation_pdf_url"/>
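
  A sketch of the defensive check, assuming BeautifulSoup-style parsing (the
  actual extraction code may differ): a bare tag with no content attribute
  should yield no URL rather than a crash or an empty string.

  ```python
  from typing import Optional

  from bs4 import BeautifulSoup

  def citation_pdf_url(html: bytes) -> Optional[str]:
      soup = BeautifulSoup(html, "html.parser")
      meta = soup.find("meta", attrs={"name": "citation_pdf_url"})
      # Some pages (like the cureus.com example above) ship the tag with
      # no content attribute at all: <meta name="citation_pdf_url"/>
      if not meta or not meta.get("content"):
          return None
      return meta["content"]
  ```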
* workers: add missing want() dataflow path (Bryan Newbold, 2020-04-30; 1 file, -0/+9)
* ingest: don't 'want' non-PDF ingest (Bryan Newbold, 2020-04-30; 1 file, -0/+5)
* Merge branch 'bnewbold-worker-timeout' into 'master' (bnewbold, 2020-04-29; 3 files, -2/+58)
  Sandcrawler worker timeouts. See merge request webgroup/sandcrawler!27
  * timeouts: don't push through None error messages (Bryan Newbold, 2020-04-29; 1 file, -2/+2)
  * timeout message implementation for GROBID and ingest workers (Bryan Newbold, 2020-04-27; 2 files, -0/+18)
  * worker timeout wrapper, and use for kafka (Bryan Newbold, 2020-04-27; 1 file, -2/+40)
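
    One common shape for such a wrapper is SIGALRM-based; a sketch under the
    assumption that workers run in the consumer's main thread on Unix (the
    only place SIGALRM works), with illustrative names:

    ```python
    import signal

    class WorkerTimeoutError(Exception):
        pass

    def run_with_timeout(func, *args, timeout: int = 300):
        """Run func(*args); raise WorkerTimeoutError if it exceeds
        `timeout` seconds."""

        def handler(signum, frame):
            raise WorkerTimeoutError(f"timed out after {timeout} seconds")

        old_handler = signal.signal(signal.SIGALRM, handler)
        signal.alarm(timeout)
        try:
            return func(*args)
        finally:
            signal.alarm(0)  # cancel any pending alarm
            signal.signal(signal.SIGALRM, old_handler)
    ```

    A Kafka worker loop can then catch WorkerTimeoutError and publish an
    error message to the output topic instead of hanging the consumer, which
    would otherwise eventually trigger a consumer-group rebalance.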
* NSQ for job task manager/scheduler (Bryan Newbold, 2020-04-28; 1 file, -0/+79)
* update MAG crawl notes (Bryan Newbold, 2020-04-28; 1 file, -0/+71)
* kafka: more rebalance notes (Bryan Newbold, 2020-04-24; 1 file, -1/+14)
* CI: add missing libsnappy-dev and libsodium-dev system packages (Bryan Newbold, 2020-04-24; 1 file, -1/+1)
  Whack-a-mole here...
* kafka: how to rebalance partitions between brokers (Bryan Newbold, 2020-04-24; 1 file, -0/+29)
* CI: add deadsnakes and python3.7 (Bryan Newbold, 2020-04-21; 1 file, -2/+3)
* fix KeyError in HTML PDF URL extraction (Bryan Newbold, 2020-04-17; 1 file, -1/+1)
* persist: only GROBID updates file_meta, not file-result (Bryan Newbold, 2020-04-16; 1 file, -1/+1)
  The hope here is to reduce deadlocks in production (on aitio). As context,
  we are only doing "updates" until the entire file_meta table is filled in
  with full metadata anyway; updates are wasteful of resources, and for most
  inserts we have already seen the file, so we should be doing "DO NOTHING"
  if the SHA1 is already in the table.
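
  The insert behavior described above maps onto Postgres upsert syntax; a
  sketch using psycopg2, with the column list beyond the SHA-1 key assumed
  for illustration:

  ```python
  import psycopg2

  # Example row; in the persist worker these values come from file
  # processing. "deadbeef" * 5 is a placeholder 40-char SHA-1 hex.
  row = ("deadbeef" * 5, None, None, 12345, "application/pdf")

  conn = psycopg2.connect("dbname=sandcrawler")  # placeholder DSN
  with conn, conn.cursor() as cur:
      cur.execute(
          """
          INSERT INTO file_meta (sha1hex, sha256hex, md5hex, size_bytes, mimetype)
          VALUES (%s, %s, %s, %s, %s)
          ON CONFLICT (sha1hex) DO NOTHING
          """,
          row,
      )
  ```

  Per the commit title, only the GROBID persist path keeps update semantics
  (ON CONFLICT ... DO UPDATE) for file_meta.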
* batch/multiprocess for ZipfilePusher (Bryan Newbold, 2020-04-16; 2 files, -5/+26)
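
  A generic sketch of batching zipfile entries through a multiprocessing
  pool (illustrative names; not the actual ZipfilePusher code):

  ```python
  import multiprocessing
  import zipfile

  def process_blob(blob: bytes) -> dict:
      # Stand-in for the real per-file worker logic
      return {"size": len(blob)}

  def push_zipfile(path: str, batch_size: int = 25, workers: int = 4) -> None:
      with zipfile.ZipFile(path) as zf:
          names = [n for n in zf.namelist() if n.lower().endswith(".pdf")]
          blobs = (zf.read(n) for n in names)
          with multiprocessing.Pool(workers) as pool:
              # chunksize batches items to cut inter-process overhead
              for result in pool.imap_unordered(
                  process_blob, blobs, chunksize=batch_size
              ):
                  print(result)

  if __name__ == "__main__":
      push_zipfile("example.zip")  # placeholder path
  ```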
* update README for python3.7 (Bryan Newbold, 2020-04-15; 1 file, -1/+1)