Commit message  (Author, Date, Files changed, Lines -removed/+added)
* Merge branch 'spertus-packages' into 'master'  (bnewbold, 2018-06-07, 4 files, -16/+13)
|\
| |    Made package names match directory names. Cleaned up imports.
| |    See merge request webgroup/sandcrawler!3
| * Made package names match directory names. Cleaned up imports.  (Ellen Spertus, 2018-06-05, 4 files, -16/+13)
|/
* Merge branch 'refactoring' into 'master'  (bnewbold, 2018-06-04, 4 files, -20/+101)
|\
| |    Refactoring to add, use, and test class HBaseBuilder to eliminate duplicated
| |    code and facilitate HBaseSource creation
| |    See merge request webgroup/sandcrawler!1
| * Made changes suggested in merge request review.  (Ellen Spertus, 2018-06-04, 3 files, -15/+10)
| |    - Changed inverseSchema from Map to List, eliminating incorrect comment.
| |    - Changed format of argument to HBaseBuilder.build from String to List[String].
| * Changed interface to HBaseBuilder.parseColSpec.  (Ellen Spertus, 2018-06-03, 3 files, -8/+12)
| |
| * Added HBaseBuilder.build() and had HBaseRowCountJob call it.  (Ellen Spertus, 2018-06-03, 2 files, -11/+5)
| |
| * Added HBaseBuilder.parseColSpecs and tests, which pass.  (Ellen Spertus, 2018-06-03, 2 files, -0/+92)
| |
| * Factored common code out of HBaseRowCountJob and its test into a new companion object.  (Ellen Spertus, 2018-06-01, 2 files, -16/+12)
* | Merge branch 'bnewbold-scala-build-fixes' into 'master'  (bnewbold, 2018-06-04, 3 files, -21/+19)
|\ \
| | |    scala build fixes
| | |    See merge request webgroup/sandcrawler!2
| * | try to run scala tests in gitlab CI  (Bryan Newbold, 2018-06-04, 1 file, -2/+12)
| | |
| * | fetch SpyGlass jar from archive.org (not local)  (Bryan Newbold, 2018-06-04, 2 files, -19/+7)
| |/
* / bnewbold-dev > wbgrp-svc263  (Bryan Newbold, 2018-06-04, 1 file, -4/+4)
|/
|    This is a new production VM running an HBase-Thrift gateway
* Provided full path to cascading jar in command line.  (Ellen Spertus, 2018-05-31, 1 file, -1/+1)
|
* Added tip on OutOfMemoryError.  (Ellen Spertus, 2018-05-31, 1 file, -1/+5)
|
* Added debugging info for cascading.tuple.Fields.  (Ellen Spertus, 2018-05-31, 1 file, -1/+23)
|
* switch HBaseRowCountJob to SCAN_ALL  (Bryan Newbold, 2018-05-29, 2 files, -5/+11)
|
* HBaseRowCountJob actually counts rows  (Bryan Newbold, 2018-05-29, 2 files, -13/+8)
|
* update version and project name  (Bryan Newbold, 2018-05-24, 3 files, -4/+6)
|
* cleanup scalding notes/README  (Bryan Newbold, 2018-05-24, 3 files, -37/+162)
|
* assemblyMergeStrategy deprecation warning  (Bryan Newbold, 2018-05-24, 1 file, -2/+2)
|
* rename jvm/scalding directories  (Bryan Newbold, 2018-05-24, 14 files, -71/+0)
|
* fix up HBaseRowCountTest  (Bryan Newbold, 2018-05-24, 2 files, -7/+15)
|
|    Again, seems like test fixture must match *exactly* or very obscure errors crop up.
|
* get quorum fields to match, fixing test  (Bryan Newbold, 2018-05-24, 1 file, -1/+1)
|
|    Writing this commit message in anger: It seems that the HBaseSource must match
|    exactly between the instantiated Job class and the JobTest. The error when this
|    isn't the case is very obscure: a `None.get()` exception deep in SpyGlass
|    internals. Blech. This may or may not explain other test failure issues.
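The pattern the companion-object commit above settles on is to construct the HBaseSource in exactly one place and have both the job and its JobTest use that shared definition, so the test fixture compares equal to the job's source. A minimal sketch of that pattern, assuming a SpyGlass HBaseSource built from a table name, zookeeper quorum, key field, and source mode; the table and quorum values below are placeholders, not the project's real configuration:

    package sandcrawler  // placeholder namespace

    import cascading.tuple.Fields
    import com.twitter.scalding._
    import parallelai.spyglass.hbase.HBaseSource
    import parallelai.spyglass.hbase.HBaseConstants.SourceMode

    object HBaseRowCountJob {
      // Single shared source definition: a JobTest fixture built from a
      // separately constructed HBaseSource that differs in any field will not
      // match the job's source and fails with an opaque None.get() in SpyGlass.
      def getHBaseSource: HBaseSource = new HBaseSource(
        "example-table",          // table name (placeholder)
        "zk-example-host:2181",   // zookeeper quorum (placeholder)
        new Fields("key"),
        sourceMode = SourceMode.SCAN_ALL)
    }

    class HBaseRowCountJob(args: Args) extends Job(args) {
      // Count every row scanned from the shared source and write a single tally.
      HBaseRowCountJob.getHBaseSource
        .read
        .groupAll { _.size('count) }
        .write(Tsv(args("output")))
    }

A JobTest would then pass the same HBaseRowCountJob.getHBaseSource to its .source(...) call rather than re-declaring an HBaseSource of its own.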
* Added repository to find com.hadoop.gplcompression#hadoop-lzo;0.4.16.  (Ellen Spertus, 2018-05-22, 1 file, -0/+1)
|
* more tests (failing)  (Bryan Newbold, 2018-05-22, 2 files, -1/+56)
|
* update README with invocations  (Bryan Newbold, 2018-05-21, 1 file, -0/+13)
|
* point SimpleHBaseSourceExample to actual zookeeper quorum host  (Bryan Newbold, 2018-05-21, 1 file, -1/+2)
|
* another attempt at a simple job variation  (Bryan Newbold, 2018-05-21, 1 file, -3/+16)
|
* update HBaseRowCountJob based on Simple example  (Bryan Newbold, 2018-05-21, 1 file, -10/+11)
|
* spyglass/hbase test examples (from upstream)  (Bryan Newbold, 2018-05-21, 2 files, -0/+93)
|
* deps updates: cdh libs, hbase, custom spyglass  (Bryan Newbold, 2018-05-21, 2 files, -3/+7)
|
* docs of how to munge around custom spyglass jars  (Bryan Newbold, 2018-05-21, 1 file, -0/+19)
|
* add dependencyTree helper plugin  (Bryan Newbold, 2018-05-21, 2 files, -1/+2)
|
* building (but nullpointer) spyglass integration  (Bryan Newbold, 2018-05-21, 2 files, -3/+27)
|
* more deps locations  (Bryan Newbold, 2018-05-21, 1 file, -0/+8)
|
* gitignore for scalding directory  (Bryan Newbold, 2018-05-21, 1 file, -0/+3)
|
* fix WordCountJob package; tests; hadoop version  (Bryan Newbold, 2018-05-21, 3 files, -2/+43)
|
|    When copying from upstream scalding, forgot to change the path/namespace of the
|    WordCountJob. Production IA cluster is actually running Hadoop 2.5, not 2.6
|    (I keep forgetting). Pull in more dependencies so test runs (copied from scalding
|    repo, only changed the namespace of the job)
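For reference, the upstream scalding word-count job being adapted here looks roughly like the following (fields API); the package line is exactly the detail this commit fixes, and the namespace shown is a placeholder:

    package sandcrawler  // placeholder: must match the local directory layout

    import com.twitter.scalding._

    // Classic scalding word count: split lines into words, group, and count.
    class WordCountJob(args: Args) extends Job(args) {
      TextLine(args("input"))
        .flatMap('line -> 'word) { line: String => tokenize(line) }
        .groupBy('word) { _.size }
        .write(Tsv(args("output")))

      // Lowercase the text and strip punctuation before splitting on whitespace.
      def tokenize(text: String): Array[String] =
        text.toLowerCase.replaceAll("[^a-zA-Z0-9\\s]", "").split("\\s+")
    }

On the cluster it gets launched through com.twitter.scalding.Tool, e.g. hadoop jar <assembly jar> com.twitter.scalding.Tool sandcrawler.WordCountJob --hdfs --input ... --output ... (jar name and namespace here are illustrative).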
* WordCount -> WordCountJob  (Bryan Newbold, 2018-05-21, 3 files, -13/+13)
|
|    Also use the exact file from scalding repo
|
* success running with com.twitter.scalding.Tool  (Bryan Newbold, 2018-05-21, 2 files, -4/+11)
|
* remove main function; class name same as file  (Bryan Newbold, 2018-05-21, 1 file, -12/+1)
|
* copy in jvm ecosystem notes  (Bryan Newbold, 2018-05-21, 1 file, -0/+46)
|
* copy in scalding learning example  (Bryan Newbold, 2018-05-21, 6 files, -0/+93)
|
* jvm/scala/scalding setup notes  (Bryan Newbold, 2018-05-17, 1 file, -0/+16)
|
* fix tests post-DISTINCT  (Bryan Newbold, 2018-05-08, 5 files, -25/+30)
|
|    Confirms it's working!
|
* distinct on SHA1 in cdx scripts  (Bryan Newbold, 2018-05-08, 2 files, -6/+18)
|
* pig cdx join improvements  (Bryan Newbold, 2018-05-08, 1 file, -1/+1)
|
* how to run pig in production  (Bryan Newbold, 2018-05-08, 1 file, -0/+5)
|
* WIP on filter-cdx-join-urls.pig  (Bryan Newbold, 2018-05-07, 1 file, -0/+37)
|
* Merge branch 'master' of git.archive.org:webgroup/sandcrawler  (Bryan Newbold, 2018-05-08, 8 files, -3/+139)
|\
| * stale TODO  (Bryan Newbold, 2018-05-07, 1 file, -0/+1)
| |