x write basic example from TSV file
x pass-through basic thing in pipeline
x pretty printer (using column writing, term color)
- convert
to/from TSV
to/from JSON
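A minimal sketch of the JSON converter pair, assuming AFT data is header-plus-tab-separated rows (the real format details are still TBD, so the functions and the assumed layout here are hypothetical):

```python
import csv
import io
import json

def tsv_to_json(tsv_text):
    """Convert header+TSV text to a JSON array of objects.
    Assumes the first line names the columns (hypothetical AFT layout)."""
    reader = csv.DictReader(io.StringIO(tsv_text), delimiter="\t")
    return json.dumps(list(reader))

def json_to_tsv(json_text):
    """Inverse: a JSON array of flat objects back to header+TSV."""
    rows = json.loads(json_text)
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=list(rows[0].keys()),
                            delimiter="\t", lineterminator="\n")
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()
```

Typed columns, nested values, and streaming (rather than whole-file) conversion would all change this shape once the format is nailed down.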
- example apps
ls
stat
df, mount, something like that
trivial web server (or other thing that logs)
- example datasets (eg, for benchmarking): compare TSV, AFT, JSON
million-line CDX
log file
- manpage (?)
- build .deb and make it installable
- helper library
=> R/W trails/wrappers
=> header struct
=> iterate rows (from input)
=> pretty-print output based on tty status
=> validation/check modes
=> stream mode/helper for subprocesses
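The helper-library pieces above (header struct, row iteration, plain writer) could start out something like this sketch; every name is a placeholder, and it assumes a header line followed by tab-separated rows:

```python
import sys
from dataclasses import dataclass, field
from typing import Iterator, TextIO

@dataclass
class Header:
    """Hypothetical header struct: column names only for now; types and
    other metadata would live here once the format pins them down."""
    fields: list = field(default_factory=list)

def read_header(stream: TextIO) -> Header:
    # Assumes the first line names the columns, tab-separated.
    return Header(fields=stream.readline().rstrip("\n").split("\t"))

def iter_rows(stream: TextIO, header: Header) -> Iterator[dict]:
    # Yield each data row as a field-name -> value dict.
    for line in stream:
        values = line.rstrip("\n").split("\t")
        yield dict(zip(header.fields, values))

def write_rows(rows, header: Header, out: TextIO = sys.stdout) -> None:
    # Plain pass-through writer; a fancier version would check
    # out.isatty() and pretty-print with columns/colors instead.
    out.write("\t".join(header.fields) + "\n")
    for row in rows:
        out.write("\t".join(row[f] for f in header.fields) + "\n")
```

With this shape, most of the commands below reduce to: read header, transform the row iterator, write rows.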
- tests
- compare with xsv command (?)
- reimplement basic commands
cut (accept field names)
cat (combining files with compatible headers)
head, tail
wc (count rows, records)
format (accept field names)
grep/match/filter by column value?
paste
uniq (by column)
sort (by column)
join
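As one concrete example of the reimplemented commands, cut-by-field-name might look like this sketch (again assuming the hypothetical header-then-tab-separated layout; `cut_fields` is a made-up name):

```python
def cut_fields(stream, wanted):
    """cut, but selecting columns by header name rather than index,
    preserving the requested order. Raises ValueError on unknown names."""
    header = stream.readline().rstrip("\n").split("\t")
    idx = [header.index(name) for name in wanted]
    yield "\t".join(wanted)
    for line in stream:
        cols = line.rstrip("\n").split("\t")
        yield "\t".join(cols[i] for i in idx)
```

Accepting names instead of positions is the main win over classic `cut`: pipelines keep working when upstream tools add or reorder columns.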
- extended commands
parallel (with column names)
shuf
comm
expand/unexpand
nl
seq
ideas:
- python stuff
- C stuff
- log format integration
- rust serde integration
- aft-header: pretty-prints header as rows
- aft-single: pretty-print single row (first?) as rows
- aft-format (or printf?) "this {col1} to that {col2}" "some other column"
- aft2json, json2aft
- aft2html
- aft-stats: sum, mean, stddev, min, max
- aft-sql