Commit log: message (author, date; files changed, lines -removed/+added)
* bump required rust to 1.36 (Bryan Newbold, 2019-12-03; 2 files, -2/+2)

    This isn't a fatcat rust requirement, but instead a diesel
    requirement, via rust-smallvec, which in v1.0 uses the alloc crate:
    https://github.com/servo/rust-smallvec/issues/73

    I think the reason this came up now is that diesel-cli is an
    application and doesn't have a Cargo.lock file, and the build was
    updated. Using some binary mechanism to install these dependencies
    would be more robust, but feels like a yak shave right now.
* update gitlab-ci to rust 1.34 (Bryan Newbold, 2019-12-03; 1 file, -1/+1)

    Apparently the rust:1.34-stretch image is gone from Docker Hub, and
    this was causing CI errors.
* make file edit form hash values case-insensitive (Bryan Newbold, 2019-12-02; 1 file, -0/+3)

    Test in previous commit. This fixes a user-reported 500 error when
    creating a file with SHA-1/SHA-256/MD5 hashes in upper-case.
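A minimal sketch of the kind of normalization this fix describes (invented names, not the actual fatcat form code): lower-case hex-digest form fields up front so upper-case SHA-1/SHA-256/MD5 values are accepted instead of causing a server error downstream.

```python
import re

# digest lengths (hex chars) we expect from the form; an assumption for
# this sketch, matching MD5 / SHA-1 / SHA-256
HEX_LENGTHS = {32: "md5", 40: "sha1", 64: "sha256"}

def clean_hash_field(raw):
    """Return a lower-case hex digest, or None for blank input."""
    if not raw:
        return None
    value = raw.strip().lower()
    if not re.fullmatch(r"[0-9a-f]+", value):
        raise ValueError(f"not a hex digest: {raw!r}")
    if len(value) not in HEX_LENGTHS:
        raise ValueError(f"unexpected digest length: {len(value)}")
    return value
```

Normalizing at the form boundary means downstream lookups and deduplication never see mixed-case digests.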
* add regression test for upper-case SHA-1 form submit (Bryan Newbold, 2019-12-02; 1 file, -0/+10)
* re-order ingest want() for better stats (Bryan Newbold, 2019-11-15; 1 file, -7/+10)

* project -> ingest_request_source (Bryan Newbold, 2019-11-15; 3 files, -9/+9)

* have ingest-file-results importer operate as crawl-bot (Bryan Newbold, 2019-11-15; 1 file, -1/+1)

    As opposed to sandcrawler-bot.
* fix release.pmcid typo (Bryan Newbold, 2019-11-15; 1 file, -2/+2)

* better ingest-file-results import name (Bryan Newbold, 2019-11-15; 1 file, -1/+1)

* ingest importer fixes (Bryan Newbold, 2019-11-15; 1 file, -3/+4)

* more ingest importer comments and counts (Bryan Newbold, 2019-11-15; 2 files, -2/+29)

* crude support for 'sandcrawler' kafka topics (Bryan Newbold, 2019-11-15; 1 file, -2/+3)

* ingest file result importer (Bryan Newbold, 2019-11-15; 5 files, -2/+228)

* test for ingest transform (Bryan Newbold, 2019-11-15; 1 file, -0/+57)
* add ingest request feature to entity_updates worker (Bryan Newbold, 2019-11-15; 2 files, -4/+22)

    Initially was going to create a new worker to consume from the
    release update channel, but couldn't get the edit context ("is this a
    new release, or update to an existing?") from that context.

    Currently there is a flag in source code to control whether we only
    do OA releases or all releases. Starting with OA only to start slow,
    but should probably default to all, and make this a config flag.
    Should probably also have a config flag to control this entire
    feature.

    Tested locally in dev.
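The OA-only gating described above could look something like this sketch (all names are invented for illustration, not the actual fatcat worker code):

```python
# Source-code flag while the feature ramps up; the commit notes this
# should eventually become a config flag defaulting to all releases.
INGEST_OA_ONLY = True

def want_ingest(release, oa_only=INGEST_OA_ONLY):
    """Decide whether a release update should produce an ingest request."""
    if oa_only and not release.get("is_oa"):
        return False
    # only emit a request when there is an identifier to crawl from
    return bool(release.get("ext_ids", {}).get("doi"))
```

A check like this sits in the update worker's loop, so flipping the flag widens ingest coverage without touching the transform itself.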
* add ingest request transform (and test) (Bryan Newbold, 2019-11-15; 3 files, -1/+68)

* update next schema tweaks proposal doc (Bryan Newbold, 2019-11-15; 1 file, -0/+1)
* Merge branch 'martin-search-results-pagination' into 'master' (Martin Czygan, 2019-11-15; 6 files, -20/+82)

    Add basic pagination to search results

    See merge request webgroup/fatcat!4
  * address test issue (Martin Czygan, 2019-11-15; 1 file, -2/+3)

  * adjust search test case for new wording (Martin Czygan, 2019-11-14; 1 file, -2/+2)

        "Showing top " -> "Showing first "
  * gray out inactive navigation links (Martin Czygan, 2019-11-14; 1 file, -2/+2)

        As per [this issue](https://github.com/Semantic-Org/Semantic-UI/issues/1885#issuecomment-77619519),
        text colors are not supported in Semantic UI. To avoid shifting
        the text around, gray out inactive links instead.
  * move pagination into macros (Martin Czygan, 2019-11-14; 3 files, -43/+51)

        Two new macros:

        * top_results(found)
        * bottom_results(found)

        wip: move pagination into macro
  * Add basic pagination to search results (Martin Czygan, 2019-11-08; 4 files, -14/+67)

        The "deep paging problem" imposes some limit, which currently is
        a hardcoded default value, `deep_page_limit=2000` in `do_search`.
        Elasticsearch can be configured, too:

        > Note that from + size can not be more than the
        > index.max_result_window index setting, which defaults to
        > 10,000.

        -- https://www.elastic.co/guide/en/elasticsearch/reference/current/search-request-body.html#request-body-search-from-size
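The deep-paging guard described above can be sketched as follows (function and parameter names are assumptions, not the actual `do_search` code): clamp the offset so `from + size` never exceeds the hardcoded limit, which itself stays well under Elasticsearch's `index.max_result_window` default of 10,000.

```python
DEEP_PAGE_LIMIT = 2000  # hardcoded default, per the commit message

def clamp_offset(page, per_page, deep_page_limit=DEEP_PAGE_LIMIT):
    """Return the Elasticsearch 'from' offset for a 1-indexed page,
    clamped so from + size stays within the deep-page limit."""
    offset = (page - 1) * per_page
    if offset + per_page > deep_page_limit:
        offset = max(deep_page_limit - per_page, 0)
    return offset
```

Clamping (rather than erroring) means absurdly deep page requests quietly return the last reachable page instead of a 500 from Elasticsearch.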
* web: catch MacaroonInitException (Bryan Newbold, 2019-11-12; 1 file, -0/+4)

    Caught one of these in sentry. Probably due to a crawler, or someone
    typing gibberish into the token form.
* design notes for a larger database (Bryan Newbold, 2019-11-12; 1 file, -0/+81)

* old proposals for 'next' schema update (Bryan Newbold, 2019-11-12; 1 file, -0/+38)

* crossref patch bulk import (Bryan Newbold, 2019-11-12; 2 files, -0/+63)
* Merge branch 'martin-python-readme-es-note' into 'master' (bnewbold, 2019-11-08; 1 file, -0/+5)

    mention elasticsearch empty index setup

    See merge request webgroup/fatcat!3

  * mention elasticsearch empty index setup (Martin Czygan, 2019-11-08; 1 file, -0/+5)

        When setting up with the defaults, everything works fine, except
        that the web search will try to access a local elasticsearch.
        Mention in the README how to create empty indices.
* crossref: accurate blank title counts (Bryan Newbold, 2019-11-05; 1 file, -0/+1)

* fix crossref component test (Bryan Newbold, 2019-11-04; 1 file, -1/+1)

* TODO idea: 'first seen' (Bryan Newbold, 2019-11-04; 1 file, -0/+1)

* crossref: component type (Bryan Newbold, 2019-11-04; 1 file, -1/+3)

* add 'component' as a release_type (Bryan Newbold, 2019-11-04; 2 files, -0/+3)
* crossref: count why skip happened (Bryan Newbold, 2019-11-04; 1 file, -1/+7)

    Might skip based on release type (eg container, not a paper/release),
    or missing title, or other reasons. Over 7 million DOIs are getting
    skipped; curious why.
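Per-reason skip counters like the ones this commit adds can be sketched as below (names and the specific skip reasons are invented for illustration, not the actual crossref importer):

```python
from collections import Counter

skip_counts = Counter()

def skip(reason):
    """Record why a record was skipped; returning None drops it."""
    skip_counts["skip-" + reason] += 1
    return None

def parse_record(record):
    # count *why* each record is skipped, not just that it was skipped,
    # so millions of skips can be broken down by cause
    if record.get("type") in ("journal", "proceedings", "component"):
        return skip("release-type")
    if not record.get("title"):
        return skip("blank-title")
    return record
```

Dumping `skip_counts` at the end of a bulk run turns "7 million DOIs skipped" into a per-cause breakdown.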
* crossref: don't skip on short/null subtitle (Bryan Newbold, 2019-11-04; 1 file, -1/+1)

    This was a bug. Should only set the subtitle blank, not skip the
    import.
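The fix amounts to nulling the field instead of rejecting the record. A minimal sketch (the function name and the "too short" threshold are assumptions, not the actual importer code):

```python
def clean_subtitle(subtitle):
    """Return a cleaned subtitle, or None when it is missing or too
    short to be meaningful; the caller keeps importing either way."""
    if subtitle and len(subtitle.strip()) > 1:
        return subtitle.strip()
    return None  # blank the field instead of skipping the record
```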
* note file fixup pushed in prod (Bryan Newbold, 2019-10-09; 2 files, -1/+64)

* move corpus changes to 'notes/bulk_edits' (Bryan Newbold, 2019-10-08; 3 files, -0/+285)

* commit file cleaner tests (Bryan Newbold, 2019-10-08; 1 file, -0/+58)

* file cleanup tweaks to actually run (Bryan Newbold, 2019-10-08; 2 files, -5/+4)
* refactor duplicated b32_hex function in importers (Bryan Newbold, 2019-10-08; 3 files, -21/+11)
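A shared b32_hex helper of the kind this refactor consolidates might look like this sketch (the real fatcat implementation may differ in details such as prefix handling): convert a base32-encoded SHA-1 digest to lower-case hex, passing through values that are already hex.

```python
import base64
import binascii

def b32_hex(s):
    """Convert a 'sha1:XXXX...'-style base32 digest (32 chars for a
    20-byte SHA-1) to lower-case hex; pass through 40-char hex values."""
    s = s.strip().split(":")[-1].upper()
    if len(s) != 32:
        if len(s) == 40:  # already hex-encoded SHA-1
            return s.lower()
        raise ValueError(f"unexpected digest length: {len(s)}")
    return binascii.hexlify(base64.b32decode(s)).decode("ascii").lower()
```

Pulling one copy into a shared module (instead of duplicating it per importer) is exactly the -21/+11 shape of this diff.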
* dict wrapper for entity_from_json() (Bryan Newbold, 2019-10-08; 2 files, -3/+7)

* new cleanup python tool/framework (Bryan Newbold, 2019-10-08; 5 files, -0/+300)
* CHANGELOG entry for previous commit (Bryan Newbold, 2019-10-03; 1 file, -0/+6)

* redirect direct entity underscore links (Bryan Newbold, 2019-10-03; 2 files, -0/+30)
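One plausible shape for an underscore-link redirect, sketched with stdlib only (the path patterns and entity-type list are assumptions, not the actual fatcat routes): map an underscore-style entity link to its canonical path, returning None when no redirect applies.

```python
import re

# hypothetical pattern for '/<type>_<ident>'-style links
UNDERSCORE_PATH = re.compile(
    r"^/(release|container|creator|file|work)_([0-9a-z]+)$")

def underscore_redirect(path):
    """Return the canonical '/<type>/<ident>' path for an underscore
    link, or None if the path doesn't match."""
    m = UNDERSCORE_PATH.match(path)
    if not m:
        return None
    return f"/{m.group(1)}/{m.group(2)}"
```

A web handler would issue a redirect response when this returns a path, and fall through to normal routing otherwise.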
* export raw affiliation strings for analysis (Bryan Newbold, 2019-10-03; 1 file, -0/+17)

* update rust README with fatcat_test db creation note (Bryan Newbold, 2019-10-03; 1 file, -1/+4)
* update rust README re: openssl (Bryan Newbold, 2019-10-01; 1 file, -17/+1)

    I believe an openssl library is still required locally, but with the
    SSL/TLS removal it no longer matters whether it is OpenSSL 1.0 or
    1.1.

    This is only a temporary work-around. When we update rust code
    generation, we will need to revisit these changes. The current
    version of swagger-rs still depends on HTTPS and OpenSSL 1.0 (via
    dependencies).
* entirely remove unused https flag to fatcatd (Bryan Newbold, 2019-09-29; 1 file, -15/+6)

* cargo update fatcat rust after openssl removal (Bryan Newbold, 2019-09-29; 1 file, -76/+32)