[Begin nostalgic segue]
[End segue, begin Meat]
I've administered my fair share of boxen, segfaulted in Production, all that kind of shit, and I have come to know a few languages. Recently Node has really been tickling me. Until, that is, I noticed this.
$ node
> (742846440039055165409247126739172 + 10) - 742846440039055165409247126739172
0
>
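For the curious, here's what's actually going on, as a small sketch: JavaScript Numbers are IEEE-754 doubles, so integers are only exact up to 2^53 - 1, and anything smaller than one ulp just evaporates. (The BigInt bit assumes a Node with BigInt support, 10.4 or later.)

```javascript
// JavaScript Numbers are IEEE-754 doubles: integers are only exact up to 2^53 - 1.
const big = 742846440039055165409247126739172;  // ~7.4e32, way past 2^53
console.log(Number.isSafeInteger(big));         // false
console.log((big + 10) - big);                  // 0 — the +10 is smaller than one ulp

// BigInt literals (the trailing "n") keep full precision:
const exact = 742846440039055165409247126739172n;
console.log(exact + 10n - exact);               // 10n
```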
In comparison, check this shit out:
$ echo -n "print -300000 + " | tee bignum.py && cat /dev/urandom | openssl enc -base64 | sed 's/[^0-9]//g' | tr -d '\n' | head -c 1000000 | tee -a bignum.py && python bignum.py
[a whole shitload of digits later...]
70040492113752317698
$ tail -c 20 bignum.py
70040492113752617698

That right there works! See the three and the six? Slow as molasses, but I have just done math with a, you got that right folks, a million-digit number.

And the truth is, that's what we need. The world revolves around secrets and data visualization, and, today, secrets and visualization revolve around math. People don't realize that that has been the case for centuries, but it's true. And developers who write systems need the most expressive languages possible(!!), but they need the math in there too. I am the last person you will find who is anti-spec. I love the specs. RFC. IETF. OMFG. But, in the end, any programming language that doesn't natively get the maths right is kind of a non-starter for me. So, unless I get into the nitty-gritty of some serious code here tonight, I really hope somebody steps up and gives this community something worth a coming-of-age party.
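For what it's worth, here's a sketch of roughly the same million-digit trick in Node itself, assuming a version new enough to ship BigInt (10.4+); the random digits and the round-trip check are my own stand-ins for the urandom/openssl pipeline above, not anything from the original transcript.

```javascript
// Build a million random decimal digits, the same order of magnitude as bignum.py.
const digits = Array.from({ length: 1000000 },
                          () => Math.floor(Math.random() * 10)).join('');
const n = BigInt('1' + digits);   // leading 1 so the literal can't start with 0
const result = n - 300000n;       // exact, arbitrary-precision subtraction

// Round-trip check: nothing got rounded away.
console.log(n - result === 300000n);  // true
```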