spark.fish

▁▂▃▅▂▇ in your fish shell.

Spark is a sparkline generator for fish. It's an unofficial port of spark.sh with options for adjusting the minimum and maximum values of the input and all-around better performance.

Installation

Install with Fisher (recommended):

fisher add jorgebucaran/spark.fish
Not using a package manager? Copy spark.fish to any directory on your function path:

curl https://git.io/spark.fish --create-dirs -sLo ~/.config/fish/functions/spark.fish
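
Any directory listed in fish's $fish_function_path variable works; to see which directories that is on your setup:

printf '%s\n' $fish_function_path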

Quickstart

You have a set of numbers delimited by commas, spaces, newlines, or tabs. What's a simple way to visualize this data in the terminal? Sparklines!

$ spark 0 1 2 3
▁▃▅█
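
The delimiters are interchangeable, so the same numbers separated by commas draw the same sparkline:

$ spark 0,1,2,3
▁▃▅█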

Spark can read from standard input as well. Here is a random sequence of numbers.

$ seq 80 | sort -R | spark
▁▅▄▄▂▁▂▅▂▇▄▅▄▃▃▄▁▂▃▁▁▅▄▇▃▆▆▂▄▄▂▆▆▆▇▃▆▇▁▄▃▄▆▅▄█▅▁▃▆▁▁▁▂▆▁▅▅▇▇▅▇▅▇▃▆▄▂▇▃▃▅▂▁▇▆▂▇▂▃
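
Anything that prints numbers will do. For example, a small sketch charting the length of each line in a file (data.txt is a stand-in for your own data):

$ awk '{ print length }' data.txt | spark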

Spark calculates the smallest and largest numbers in your dataset to calibrate the height of the sparklines. To force either value to something else, use --min= and --max=.

$ spark 10 20 30 40 50
▁▂▄▆█
$ spark --max=100 -- 10 20 30 40 50
▁▂▃▃▄
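
Pinning both ends is useful when successive sparklines should share a scale. As a sanity check, forcing --min and --max to the values Spark would compute anyway (10 and 50 here) leaves the output unchanged:

$ spark --min=10 --max=50 -- 10 20 30 40 50
▁▂▄▆█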

Want to see what else Spark can do? Check out Wicked Cool Usage and prepare to be amazed!

Performance

Spark is up to 400x faster (that's not a typo!) than the original spark.sh, crunching relatively large datasets in a fraction of a second.

$ time fish -c "seq 10000 | sort -R | spark" >/dev/null
       0.19 real         0.19 user         0.01 sys

$ time fish -c "seq 10000 | sort -R | spark.sh" >/dev/null
      86.15 real        84.44 user         0.53 sys

License

MIT
