path: root/python
Commit message | Author | Date | Files | Lines

* datacite: limit abstract length | Martin Czygan | 2019-12-28 | 1 | -0/+6
* datacite: use iso 639-1 codes | Martin Czygan | 2019-12-28 | 1 | -7/+4
* datacite: use specific auth var | Martin Czygan | 2019-12-28 | 1 | -1/+1
* datacite: add missing --extid-map-file flag | Martin Czygan | 2019-12-28 | 1 | -0/+4
* address first round of MR14 comments | Martin Czygan | 2019-12-28 | 4 | -150/+503

  * add missing langdetect
  * use entity_to_dict for json debug output
  * factor out code for fields in function and add table driven tests
  * update citeproc types
  * add author as default role
  * add raw_affiliation
  * include relations from datacite
  * remove url (covered by doi already)

  Using yapf for python formatting.

* datacite: move common date patterns out of the loop | Martin Czygan | 2019-12-28 | 1 | -3/+4

  Additionally, try the unspecific (%Y) pattern last.

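A minimal sketch of the ordering idea above, with the pattern list defined once
at module level and the bare-year pattern tried last (pattern names and the
function are illustrative, not the actual importer code):

```python
import datetime

# More specific formats first; the unspecific "%Y" last, so a bare year
# like "1986" is only used when nothing richer matches.
DATE_PATTERNS = ['%Y-%m-%dT%H:%M:%SZ', '%Y-%m-%d', '%Y-%m', '%Y']

def parse_date_example(raw):
    """Try each pattern in order; return a datetime.date or None."""
    if not raw:
        return None
    for pattern in DATE_PATTERNS:
        try:
            return datetime.datetime.strptime(raw, pattern).date()
        except ValueError:
            continue
    return None
```
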
* improve datacite field mapping and import | Martin Czygan | 2019-12-28 | 5 | -59/+245

  Current version succeeded to import a random sample of 100,000 records
  (0.5%) from datacite.

  The --debug (write JSON to stdout) and --insert-log-file (log batch before
  committing to db) flags are temporarily added to help debugging.

  Add a few unit tests.

  Some edge cases:

  a) Existing keys without a value require a slightly awkward:

  ```
  titles = attributes.get('titles', []) or []
  ```

  b) There can be 0, 1, or more titles (the first one wins).

  c) Date handling is probably not ideal. Datacite has a potentially
     fine-grained list of dates. The test case
     (tests/files/datacite_sample.jsonl) refers to
     https://ssl.fao.org/glis/doi/10.18730/8DYM9, which has the date (main
     descriptor) 1986. The datacite record contains: 2017 (publicationYear,
     probably the year of record creation with the reference system),
     1978-06-03 (collected, e.g. experimental sample), and 1986 ("Accepted").
     The online version of the resource knows even one more date
     (2019-06-05 10:14:43 by WIEWS update).

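The `or []` in (a) is needed because dict.get() only falls back to its default
when the key is absent, not when the key exists with a None value; a tiny
illustration:

```python
attributes = {"titles": None}        # key present, value is null/None

attributes.get("titles", [])         # -> None: the default is NOT used
attributes.get("titles", []) or []   # -> []: safe to iterate over
```
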
* datacite: add missing mappings and notes | Martin Czygan | 2019-12-28 | 1 | -266/+175
* datacite: basic field mappings | Martin Czygan | 2019-12-28 | 1 | -41/+181

  Currently using two external libraries:

  * dateparser
  * langcodes

  Note: This commit includes lots of WIP docs and field stats in comments,
  which should be removed.

* datacite: importer skeleton | Martin Czygan | 2019-12-28 | 4 | -0/+514

  * contributors, title, date, publisher, container, license

  Field and value analysis via https://github.com/miku/indigo.

* orcid: skip non-person ORCID records | Bryan Newbold | 2019-12-26 | 1 | -0/+4
* datacite: fix harvest test | Martin Czygan | 2019-12-27 | 1 | -1/+1

  Produced messages should match:

    jq '.data|length' tests/files/datacite_api.json

* datacite: add simple test and fixture for datacite api interaction | Martin Czygan | 2019-12-27 | 2 | -0/+46
* datacite: extend range search query | Martin Czygan | 2019-12-27 | 1 | -1/+1

  The bracket syntax is inclusive. See also:
  https://www.elastic.co/guide/en/elasticsearch/reference/7.5/query-dsl-query-string-query.html#_ranges

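For illustration of the syntax only (the field name and dates are made up, not
the harvester's actual query): in Elasticsearch/Lucene query-string ranges,
square brackets include both endpoints while curly braces exclude them.

```python
# Inclusive range: documents dated on the 26th and the 27th both match.
inclusive_query = "updated:[2019-12-26 TO 2019-12-27]"

# Exclusive range: neither endpoint matches.
exclusive_query = "updated:{2019-12-26 TO 2019-12-27}"
```
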
* avoid usage of short links | Martin Czygan | 2019-12-27 | 1 | -2/+2
* Datacite API v2 throws 400s we currently cannot recover from | Martin Czygan | 2019-12-27 | 1 | -0/+4

  As a first iteration, just mark the daily batch complete and continue.

  The occasional HTTP 400 issue has been reported as
  https://github.com/datacite/datacite/issues/897.

  A possible improvement would be to shrink the window, so losses will be
  smaller.

* datacite: update documentation, add links to issues | Martin Czygan | 2019-12-27 | 1 | -10/+5
* datacite: use v2 of the API (flaky) | Martin Czygan | 2019-12-27 | 1 | -5/+28

  Update parameters for datacite API v2. Works fine, but there are occasional
  HTTP 400 responses when using the cursor API (daily updates can exceed the
  10000 record limit for search queries).

  The HTTP 400 issue is not solved yet, but has been reported to datacite as
  https://github.com/datacite/datacite/issues/897.

* transform ingests via pmc/pmcid, not pubmed/pmid | Bryan Newbold | 2019-12-24 | 1 | -4/+4
* allow arabesque backfill ingests for some source types | Bryan Newbold | 2019-12-24 | 1 | -0/+5
* make chocula URL updates more conservative | Bryan Newbold | 2019-12-24 | 1 | -5/+5
* pubmed: if doing update, also do subtitle schema update | Bryan Newbold | 2019-12-23 | 1 | -1/+9
* doi parsing fixes | Bryan Newbold | 2019-12-23 | 1 | -0/+7

  Replace emdash with regular dash. Replace double slash after partner ID with
  single slash. This conversion seems to be done by crossref automatically on
  lookup. I tried several examples, using doi.org resolver and Crossref API
  lookup.

  Note that there are a number of fatcat entities with '//' in the DOI.

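A minimal sketch of the two replacements described above (the function name and
exact behavior are illustrative, not the actual fatcat DOI cleaner):

```python
def normalize_doi_example(raw):
    """Illustrative only: swap an em-dash for a plain dash and collapse a
    double slash right after the '10.XXXX' partner/prefix part."""
    if not raw:
        return None
    doi = raw.replace('\u2014', '-')       # em-dash -> regular dash
    prefix, _, suffix = doi.partition('/')
    if suffix.startswith('/'):             # '10.1234//abc' -> '10.1234/abc'
        doi = prefix + '/' + suffix.lstrip('/')
    return doi
```
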
* pubmed: improve warning and stderr formatting | Bryan Newbold | 2019-12-23 | 1 | -5/+6
* pubmed: use standard identifier cleaners | Bryan Newbold | 2019-12-23 | 1 | -17/+14
* pubmed: remove unused extid mapping code | Bryan Newbold | 2019-12-23 | 1 | -29/+0
* pubmed: do reference lookups by default | Bryan Newbold | 2019-12-23 | 1 | -1/+1
* normalizers: clean_pmid(), and handle nulls in all other cleaners | Bryan Newbold | 2019-12-23 | 1 | -0/+31
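
A sketch of the null-handling pattern this commit describes, using a
hypothetical PMID cleaner (the real clean_pmid() may differ in detail):

```python
def clean_pmid(raw):
    """Hypothetical sketch: return a plausible PMID string, or None.
    PMIDs are plain digit strings; anything else is rejected."""
    if raw is None:        # every cleaner should tolerate null input
        return None
    raw = raw.strip()
    if raw.isdigit():
        return raw
    return None
```
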
* pubmed: null doi parsing check | Bryan Newbold | 2019-12-23 | 1 | -1/+1
* add basic MedlineDate year parsing | Bryan Newbold | 2019-12-23 | 1 | -0/+11
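
MedlineDate values are free-form ranges such as "1998 Dec-1999 Jan", so a
"basic" year parse just has to pull out the first plausible four-digit year. A
sketch under that assumption (not necessarily the importer's exact logic):

```python
import re

def medline_year_example(medline_date):
    """Return the first plausible 4-digit year in a MedlineDate string, or None."""
    if not medline_date:
        return None
    match = re.search(r'\b(1[5-9]\d\d|20\d\d)\b', medline_date)
    return int(match.group(1)) if match else None

medline_year_example("1998 Dec-1999 Jan")   # -> 1998
medline_year_example("2000 Spring")         # -> 2000
```
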
* add regression test for medlinedate -> year parsing | Bryan Newbold | 2019-12-23 | 2 | -0/+102
* fix spn/ingest importer duplication check | Bryan Newbold | 2019-12-22 | 1 | -6/+8

  Check was happening after the `return True` by mistake, allowing duplicates
  in SPN editgroups, and potentially in ingest request editgroups as well.

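The bug pattern here is a duplicate check placed after an early `return True`,
which makes it dead code. A schematic with hypothetical method and field names
(not the actual importer class):

```python
class IngestImporterSketch:
    def __init__(self):
        self._seen_urls = set()   # per-editgroup de-duplication state

    def want_buggy(self, request):
        return True
        # dead code below: never reached, so duplicates slip through
        if request["base_url"] in self._seen_urls:
            return False

    def want_fixed(self, request):
        # run the duplicate check *before* accepting the record
        if request["base_url"] in self._seen_urls:
            return False
        self._seen_urls.add(request["base_url"])
        return True
```
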
* datacite release links and metadata expansion | Bryan Newbold | 2019-12-20 | 2 | -9/+13

  Small ergonomic changes for datacite releases:

  - add a link to live/current datacite metadata (like we do for Crossref)
  - expand "extra" metadata fields under 'datacite' dict in metadata view

* spn: include link_source/link_source_id in ingest request | Bryan Newbold | 2019-12-20 | 1 | -0/+2
* pipenv: update deps | Bryan Newbold | 2019-12-17 | 2 | -11/+55

  loginpass patches got accepted upstream a while back, so don't need to pin
  to a git version.

  ipython 7.10 seems to have problems installing, so restricting to earlier
  6.x versions.

* pipenv: restrict pytest<5.0.0 | Bryan Newbold | 2019-12-17 | 2 | -5/+13

  This prevents a test exception that presents like:

    tests/transform_csl.py:46:
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    fatcat_tools/transforms/csl.py:204: in citeproc_csl
        style_path = get_style_filepath(style)
    .venv/lib/python3.5/site-packages/citeproc_styles/__init__.py:74: in get_style_filepath
        if resource_exists(__name__, independent_style):
    .venv/lib/python3.5/site-packages/pkg_resources/__init__.py:1134: in resource_exists
        return get_provider(package_or_requirement).has_resource(resource_name)
    .venv/lib/python3.5/site-packages/pkg_resources/__init__.py:1404: in has_resource
        return self._has(self._fn(self.module_path, resource_name))
    _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    self = <pkg_resources.NullProvider object at 0x7f4f38c0bb00>
    path = '/home/bnewbold/code/fatcat/python/.venv/lib/python3.5/site-packages/citeproc_styles/styles/bibtex.csl'

        def _has(self, path):
            raise NotImplementedError(
    >           "Can't perform this operation for unregistered loader type"
            )
    E   NotImplementedError: Can't perform this operation for unregistered loader type

* pipenv: update Pipfile and Pipfile.lock | Bryan Newbold | 2019-12-17 | 2 | -286/+318

  This is still manually tweaked. I believe I've narrowed the source of the
  CSL/citeproc_styles import error down to the upgrade of the 'pytest' module.
  This commit upgrades all packages except pytest.

* pipfile: add langcodes and dateparser dependencies | Bryan Newbold | 2019-12-17 | 2 | -1/+44
* write diagnostic messages to stderr | Martin Czygan | 2019-12-16 | 1 | -2/+2

  During debugging, it can be helpful to keep stdout (e.g. processing results)
  and diagnostic messages separate.

* Merge branch 'martin-importers-common-doc-fix' into 'master' | Martin Czygan | 2019-12-14 | 1 | -13/+10

  Update EntityImporter docstring.

  See merge request webgroup/fatcat!9

* complete parse_record docstring | Martin Czygan | 2019-12-14 | 1 | -0/+6
* Update EntityImporter docstring. | Martin Czygan | 2019-12-13 | 1 | -13/+4

  I believe the required method is `parse_record`, not `parse`.

* add ingest import file collision protection | Bryan Newbold | 2019-12-13 | 1 | -0/+6

  The common case is the same URL being submitted repeatedly during testing.
  This is only within-editgroup, and per importer (eg, won't work across spn
  importer "submitted" editgroups), but is better than nothing.

* fix spn kafka topic env var | Bryan Newbold | 2019-12-13 | 1 | -1/+1
* update ingest request schema | Bryan Newbold | 2019-12-13 | 5 | -16/+44

  This is mostly changing ingest_type from 'file' to 'pdf', and adding
  'link_source'/'link_source_id', plus some small cleanups.

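For orientation, an illustrative ingest request after this change, limited to
the fields named in these commits; the values and any other fields are
assumptions, not the definitive schema:

```python
ingest_request = {
    "ingest_type": "pdf",             # previously "file"
    "link_source": "spn",             # hypothetical source label
    "link_source_id": "example-123",  # hypothetical identifier in that source
}
```
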
* remove default mimetype from ingest-file importer | Bryan Newbold | 2019-12-13 | 1 | -2/+1

  We really should just use the file_meta result or nothing.

* revert accidentally committed test timing | Bryan Newbold | 2019-12-13 | 1 | -2/+2

  Also fix a spurious typo.

* ensure importer description arg isn't clobbered | Bryan Newbold | 2019-12-12 | 3 | -5/+5
* tweaks to ingest-file transform | Bryan Newbold | 2019-12-12 | 1 | -13/+7
* initial 'Save Paper Now' web form | Bryan Newbold | 2019-12-12 | 7 | -2/+228