x write basic example from TSV file
x pass-through basic thing in pipeline
x pretty printer (using column writing, term color)
- convert to/from TSV and JSON
- example apps
    ls
    stat
    df, mount, something like that
    trivial web server (or other thing that logs)
- example datasets (eg, for benchmarking):
    compare TSV, AFT, JSON
    million-line CDX log file
- manpage (?)
- build a .deb and make it installable
- helper library
    => R/W traits/wrappers
    => header struct
    => iterate rows (from input)
    => pretty-print output based on tty status
    => validation/check modes
    => stream mode/helper for subprocesses
- tests
- compare with the xsv command (?)
- reimplement basic commands
    cut (accept field names)
    cat (combine files with compatible headers)
    head, tail
    wc (count rows, records)
    format (accept field names)
    grep/match/filter by column value?
    paste
    uniq (by column)
    sort (by column)
    join
- extended commands
    parallel (with column names)
    shuf
    comm
    expand/unexpand
    nl
    seq

ideas:
- python stuff
- C stuff
- log format integration
- rust serde integration
- aft-header: pretty-print the header as rows
- aft-single: pretty-print a single row (the first?) as rows
- aft-format (or printf?): "this {col1} to that {col2}" "some other column"
- aft2json, json2aft
- aft2html
- aft-stats: sum, mean, stddev, min, max
- aft-sql
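
rough sketch of the helper-library row-iteration + tty-aware pretty-print idea
(Python, assuming AFT behaves like TSV with a header line; read_rows and
pretty_print are placeholder names, not a real API):

    # Placeholder sketch, not the actual helper library: treats input as
    # header-line-plus-TSV and pretty-prints only when stdout is a terminal.
    import csv
    import sys

    def read_rows(stream):
        """Yield each data row as a dict keyed by the header fields."""
        reader = csv.reader(stream, delimiter="\t")
        header = next(reader)
        for record in reader:
            yield dict(zip(header, record))

    def pretty_print(rows, out=sys.stdout):
        """Aligned columns on a tty, plain TSV pass-through otherwise."""
        rows = list(rows)
        if not rows:
            return
        header = list(rows[0].keys())
        if out.isatty():
            widths = [max(len(h), *(len(str(r[h])) for r in rows)) for h in header]
            lines = [header] + [[str(r[h]) for h in header] for r in rows]
            for line in lines:
                out.write("  ".join(v.ljust(w) for v, w in zip(line, widths)) + "\n")
        else:
            out.write("\t".join(header) + "\n")
            for r in rows:
                out.write("\t".join(str(r[h]) for h in header) + "\n")

    if __name__ == "__main__":
        pretty_print(read_rows(sys.stdin))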
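
sketch of the aft-format idea: expand a "{column}" template once per row,
e.g. "this {col1} to that {col2}" (again Python with placeholder names,
AFT treated as TSV-with-header; only an illustration, not a fixed interface):

    # Placeholder sketch: apply the template on the command line to each row.
    import csv
    import sys

    def format_rows(stream, template):
        reader = csv.reader(stream, delimiter="\t")
        header = next(reader)
        for record in reader:
            print(template.format_map(dict(zip(header, record))))

    if __name__ == "__main__":
        # usage (hypothetical): some-producer | python aft_format.py "this {col1} to that {col2}"
        format_rows(sys.stdin, sys.argv[1])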