path: root/python/fatcat_tools
Commit message    Author    Age    Files    Lines
* pubmed: use standard identifier cleaners    Bryan Newbold    2019-12-23    1    -17/+14
|
* pubmed: remove unused extid mapping code    Bryan Newbold    2019-12-23    1    -29/+0
|
* pubmed: do reference lookups by default    Bryan Newbold    2019-12-23    1    -1/+1
|
* normalizers: clean_pmid(), and handle nulls in all other cleaners    Bryan Newbold    2019-12-23    1    -0/+31
|
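A minimal sketch of what such a cleaner might look like (illustrative only, not the actual fatcat_tools normalizer code):

    def clean_pmid(raw):
        # Hedged sketch: normalize a PubMed ID to a plain digit string,
        # returning None for null or malformed input (matching how the
        # other cleaners are described as handling nulls).
        if not raw:
            return None
        raw = raw.strip()
        return raw if raw.isdigit() else None

    assert clean_pmid(" 25074098 ") == "25074098"
    assert clean_pmid(None) is None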
* pubmed: null doi parsing check    Bryan Newbold    2019-12-23    1    -1/+1
|
* add basic MedlineDate year parsing    Bryan Newbold    2019-12-23    1    -0/+11
|
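MedlineDate is a free-form PubMed field (for example "1998 Dec-1999 Jan" or "2000 Spring"), so "basic year parsing" presumably means extracting a plausible four-digit year. A hedged sketch, not the actual importer code:

    import re

    def parse_medline_year(medline_date):
        # Pull the first plausible 4-digit year out of a free-form
        # MedlineDate string; return None if nothing matches.
        if not medline_date:
            return None
        m = re.search(r"\b(1[5-9]\d\d|20\d\d)\b", medline_date)
        return int(m.group(1)) if m else None

    assert parse_medline_year("1998 Dec-1999 Jan") == 1998
    assert parse_medline_year("2000 Spring") == 2000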
* fix spn/ingest importer duplication check    Bryan Newbold    2019-12-22    1    -6/+8
|     Check was happening after the `return True` by mistake, allowing duplicates in SPN editgroups, and potentially in ingest request editgroups as well.
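The bug class is simple: any statement placed after an unconditional return never runs. A hedged before/after sketch with illustrative names, not the actual importer code:

    seen_urls = set()

    def want_broken(url):
        return True
        # unreachable: the duplicate check below never executed
        if url in seen_urls:
            return False
        seen_urls.add(url)

    def want_fixed(url):
        # check for duplicates *before* accepting the record
        if url in seen_urls:
            return False
        seen_urls.add(url)
        return True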
* write diagnostic messages to stderr    Martin Czygan    2019-12-16    1    -2/+2
|     During debugging, it can be helpful to keep stdout (e.g. processing results) and diagnostic messages separate.
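This is the standard Python idiom of routing diagnostics to stderr so that stdout stays clean for results; a minimal sketch:

    import sys

    def process(record):
        # diagnostics go to stderr, where they can be redirected or silenced
        print("processing one record", file=sys.stderr)
        # results go to stdout (e.g. for piping into the next tool)
        print(record)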
* Merge branch 'martin-importers-common-doc-fix' into 'master'    Martin Czygan    2019-12-14    1    -13/+10
|\
| |     Update EntityImporter docstring. See merge request webgroup/fatcat!9
| * complete parse_record docstring    Martin Czygan    2019-12-14    1    -0/+6
| |
| * Update EntityImporter docstring.    Martin Czygan    2019-12-13    1    -13/+4
| |     I believe the required method is `parse_record`, not `parse`.
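For context, a hedged sketch of the subclass contract the docstring describes; the class below is a stand-in, not the real fatcat_tools.importers base class:

    class EntityImporterSketch:
        # simplified stand-in for the real EntityImporter base class
        def push_record(self, raw_record):
            entity = self.parse_record(raw_record)
            if entity:
                print("would create/update entity:", entity)

        def parse_record(self, raw_record):
            # subclasses must implement parse_record() (not parse())
            raise NotImplementedError

    class DemoImporter(EntityImporterSketch):
        def parse_record(self, raw_record):
            return {"title": raw_record.get("title")}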
* | add ingest import file collision protection    Bryan Newbold    2019-12-13    1    -0/+6
| |     The common case is the same URL being submitted repeatedly during testing. This is only within-editgroup, and per importer (eg, won't work across spn importer "submitted" editgroups), but is better than nothing.
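A hedged sketch of the within-editgroup protection described above; attribute and method names are illustrative:

    class IngestImporterSketch:
        def __init__(self):
            # URLs already added to the *current* editgroup only
            self._urls_in_editgroup = set()

        def start_new_editgroup(self):
            # per-editgroup and per-importer, so duplicates across
            # editgroups (eg, SPN "submitted" editgroups) remain possible
            self._urls_in_editgroup = set()

        def want(self, request):
            url = request.get("base_url")
            if url in self._urls_in_editgroup:
                return False
            self._urls_in_editgroup.add(url)
            return True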
* | update ingest request schema    Bryan Newbold    2019-12-13    3    -8/+30
| |     This is mostly changing ingest_type from 'file' to 'pdf', and adding 'link_source'/'link_source_id', plus some small cleanups.
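A hedged example of what an ingest request might look like after this change; the field values are purely illustrative:

    ingest_request = {
        "ingest_type": "pdf",            # was "file" before this change
        "base_url": "https://example.com/article.pdf",
        "link_source": "doi",            # new: where the link came from
        "link_source_id": "10.123/example",
        "ingest_request_source": "fatcat-changelog",
    }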
* | remove default mimetype from ingest-file importer    Bryan Newbold    2019-12-13    1    -2/+1
| |     We really should just use the file_meta result or nothing.
* | revert accidentally committed test timing    Bryan Newbold    2019-12-13    1    -2/+2
| |     Also fix a spurious typo.
* | ensure importer description arg isn't clobbered    Bryan Newbold    2019-12-12    3    -5/+5
| |
* | tweaks to ingest-file transform    Bryan Newbold    2019-12-12    1    -13/+7
| |
* | savepapernow result importer    Bryan Newbold    2019-12-12    2    -4/+65
| |     Based on the ingest-file-results importer.
* | flush importer editgroups every few minutes    Bryan Newbold    2019-12-12    1    -5/+20
| |
* | EntityImporter: submit (not accept) mode    Bryan Newbold    2019-12-12    1    -2/+14
| |     For use with bots that don't have admin privileges, or where human follow-up review is desired.
|/
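A hedged sketch of the control flow; `api` is a hypothetical client object, not the real fatcat_openapi_client surface:

    def finish_editgroup(api, editgroup_id, submit_mode):
        if submit_mode:
            # leave the editgroup pending, for human review or for a
            # privileged bot to accept later
            api.submit_editgroup(editgroup_id)
        else:
            # accept (merge) immediately; requires admin privileges
            api.accept_editgroup(editgroup_id)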
* factor out some basic kafka helpers    Bryan Newbold    2019-12-10    2    -0/+23
|
* add another ingest request source to whitelist    Bryan Newbold    2019-12-10    1    -2/+5
|
* refactor kafka producer in crossref harvester    Bryan Newbold    2019-12-06    1    -21/+26
|     Producer creation/configuration should happen at __init__() time, not in the 'daily' call. This specific refactor was motivated by mocking out the producer in unit tests.
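The shape of the refactor, sketched with confluent-kafka; the class, topic, and config values are illustrative stand-ins for the real harvester:

    from confluent_kafka import Producer

    class CrossrefHarvesterSketch:
        def __init__(self, kafka_hosts, produce_topic):
            # create and configure the producer once, at construction
            # time; unit tests can then swap self.producer for a mock
            self.produce_topic = produce_topic
            self.producer = Producer({"bootstrap.servers": kafka_hosts})

        def daily(self, records):
            # the periodic harvest loop reuses the long-lived producer
            for record in records:
                self.producer.produce(self.produce_topic, record)
            self.producer.flush()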
* tweaks to file ingest importer    Bryan Newbold    2019-12-03    1    -3/+4
|     - allow overriding source filter whitelist (common case for CLI use)
|     - fix editgroup description env variable pass-through
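A hedged sketch of both tweaks above; the parameter names, default whitelist, and environment variable are illustrative, not necessarily what the importer actually uses:

    import os

    class FileIngestImporterSketch:
        def __init__(self, source_whitelist=None, editgroup_description=None):
            # CLI callers can pass their own whitelist (or an empty one
            # to disable filtering) instead of the built-in default
            self.source_whitelist = source_whitelist or ["fatcat-changelog"]
            # fall back to an environment variable when no description
            # is passed explicitly
            self.editgroup_description = editgroup_description or \
                os.environ.get("FATCAT_EDITGROUP_DESCRIPTION")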
* crossref is_update isn't what I thought    Bryan Newbold    2019-12-03    1    -6/+2
|     I thought this would filter for metadata updates to an existing DOI, but actually "updates" are a type of DOI (eg, a retraction).
|     TODO: handle 'updates' field. Should both do a lookup and set work_ident appropriately, and store in crossref-specific metadata.
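For context, a hedged and simplified example of the Crossref fields in play: an "update" is itself a new DOI (eg, a retraction notice) whose 'update-to' list points at the DOI being updated:

    # illustrative record shape, not real data
    crossref_record = {
        "DOI": "10.123/retraction-notice",
        "type": "journal-article",
        "update-to": [
            {"DOI": "10.123/original-article", "type": "retraction"},
        ],
    }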
* re-order ingest want() for better stats    Bryan Newbold    2019-11-15    1    -7/+10
|
* project -> ingest_request_source    Bryan Newbold    2019-11-15    3    -9/+9
|
* fix release.pmcid typo    Bryan Newbold    2019-11-15    1    -2/+2
|
* ingest importer fixes    Bryan Newbold    2019-11-15    1    -3/+4
|
* more ingest importer comments and counts    Bryan Newbold    2019-11-15    2    -2/+29
|
* crude support for 'sandcrawler' kafka topics    Bryan Newbold    2019-11-15    1    -2/+3
|
* ingest file result importer    Bryan Newbold    2019-11-15    2    -2/+135
|
* add ingest request feature to entity_updates worker    Bryan Newbold    2019-11-15    1    -4/+20
|     Initially I was going to create a new worker to consume from the release update channel, but couldn't get the edit context ("is this a new release, or an update to an existing one?") from that channel.
|     Currently there is a flag in the source code to control whether we only do OA releases or all releases. Starting with OA only to start slow, but it should probably default to all, and become a config flag. There should probably also be a config flag to control this entire feature.
|     Tested locally in dev.
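A hedged sketch of the gating described above; the flag and field names are illustrative, and the flag lives in source code rather than configuration:

    OA_ONLY = True  # start slow: only emit ingest requests for OA releases

    def maybe_emit_ingest_request(release, emit):
        # `release` is a dict-like release entity; `emit` sends the
        # request to the ingest request topic/queue
        if OA_ONLY and not release.get("is_oa"):
            return
        if not release.get("doi"):
            return
        emit({
            "ingest_type": "pdf",
            "base_url": "https://doi.org/" + release["doi"],
            "ingest_request_source": "fatcat-changelog",
        })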
* add ingest request transform (and test)    Bryan Newbold    2019-11-15    2    -0/+67
|
* crossref: accurate blank title counts    Bryan Newbold    2019-11-05    1    -0/+1
|
* crossref: component type    Bryan Newbold    2019-11-04    1    -1/+3
|
* crossref: count why skip happened    Bryan Newbold    2019-11-04    1    -1/+7
|     Might skip based on release type (eg, a container rather than a paper/release), a missing title, or other reasons. Over 7 million DOIs are getting skipped; curious why.
* crossref: don't skip on short/null subtitle    Bryan Newbold    2019-11-04    1    -1/+1
|     This was a bug. Should only set the subtitle blank, not skip the import.
* file cleanup tweaks to actually run    Bryan Newbold    2019-10-08    2    -5/+4
|
* refactor duplicated b32_hex function in importers    Bryan Newbold    2019-10-08    3    -21/+11
|
* dict wrapper for entity_from_json()    Bryan Newbold    2019-10-08    2    -3/+7
|
* new cleanup python tool/framework    Bryan Newbold    2019-10-08    4    -0/+241
|
* review/fix all confluent-kafka produce code    Bryan Newbold    2019-09-20    6    -27/+75
|
* small fixes to confluent-kafka importers/workers    Bryan Newbold    2019-09-20    6    -24/+67
|     - decrease default changelog pipeline to 5.0 sec
|     - fix missing KafkaException harvester imports
|     - more confluent-kafka tweaks
|     - updates to kafka consumer configs
|     - bump elastic updates consumergroup (again)
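A hedged example of the kind of confluent-kafka consumer configuration these tweaks touch; the group id and values here are illustrative:

    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        # bumping the consumer group forces the elasticsearch updates
        # worker to start over from the configured offset policy
        "group.id": "fatcat-elasticsearch-updates-v2",
        "auto.offset.reset": "latest",
        "enable.auto.commit": False,
    })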
* convert pipeline workers from pykafka to confluent-kafka    Bryan Newbold    2019-09-20    3    -125/+230
|
* small kafka tweaks for robustness    Bryan Newbold    2019-09-20    2    -0/+5
|
* convert importers to confluent-kafka library    Bryan Newbold    2019-09-20    1    -19/+71
|
* bump max message size to ~20 MBytes    Bryan Newbold    2019-09-20    2    -0/+2
|
* fixes to confluent-kafka harvesters    Bryan Newbold    2019-09-20    3    -20/+21
|
* first draft harvesters using confluent-kafka    Bryan Newbold    2019-09-20    3    -48/+104
|