On JavaScript: They grow up so fast! (Or do they?)

I have been using JavaScript for a while now, and I mean, it deserves huge props!
[Begin nostalgic segue]
Back then, Windows 95 was not super easy to program on. Certainly not graphically! At the very least you had to install some fancy Visual-this or Borland-that. You might be able to install QBasic, but then if you wanted to do graphics, you were stuck in some crazy little emulated DOS window, or you were just taking over the whole darn screen! With JavaScript, on the other hand, you could do a relatively large variety of things. You could change colors, make things move left or right, you name it. Sure, you were constrained to the browser window, but so what? And even that looked (*ahem* relatively) professional (for the time, *ahem*). You were programming, things were happening, and it didn't cost you anything extra! No compiling!

So JavaScript was my first real non-textual programming, and (oddly) mIRC's crazy scripting language was my first real socket programming. IIRC, mIRC also had some canvases you could paint on, graphical stuff you could do. But the allure of JavaScript was that you could jump right in. So I installed MySQL, Apache, PHP, and PEAR (first ORM?) on the family Win95 box, and bam! You had the full deal with JavaScript in the browser. Persistence. Form-posting. Cookies, all that!

Anyway, jump to the present day.
[End segue, begin Meat]
I've administered my fair share of boxen, segfaulted in production, all that kind of shit, and I've come to know a few languages along the way. Recently, Node has really been tickling me. Until, that is, I noticed this:

$ node
  > (742846440039055165409247126739172 + 10) - 742846440039055165409247126739172
  0
  >

Wat
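
For the record, that's not Node being flaky; it's the spec. Every Number in JavaScript is an IEEE 754 double, with a 53-bit significand, so integers are only exact up to 2^53 - 1 (9007199254740991). The 33-digit number up there needs around 110 bits, so the + 10 falls below the rounding granularity and simply evaporates. Same REPL, smoking gun:

$ node
  > Math.pow(2, 53) === Math.pow(2, 53) + 1
  true
  > 742846440039055165409247126739172 + 10 === 742846440039055165409247126739172
  true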

OK, so I could see this being reasonable in a browser at some point. But in a server-side language? With the multi-processor, multi-core computing power we have today? Get off your holier-than-thou ECMA-loving asses and give us real math!
In comparison, check this shit out:
$ echo -n "print -300000 + " |tee  bignum.py && cat /dev/urandom |openssl enc -base64 
  sed 's/[^0-9]//g' | tr -d '\n'  | head -c 1000000 |tee -a bignum.py && python bignum.py
  [a whole shitload of digits later...]
  70040492113752317698
  $ tail -c 20 bignum.py
  70040492113752617698
That right there works! See the 3 and the 6? The file ends in ...617698 and the program prints ...317698: the 300000 came off exactly, carried cleanly through the middle of a number a million digits long. Slow as molasses, but I have just done math with a, you got that right folks, a million-digit number.

And the truth is, that's what we need. The world revolves around secrets and data visualization, and, today, secrets and visualization revolve around math. People don't realize that this has been the case for centuries, but it's true. Developers who write systems need the most expressive languages possible(!!), but they need the math in there too.

I am the last person you will find who is anti-spec. I love the specs. RFC. IETF. OMFG. But, in the end, any programming language that doesn't natively get the math right is kind of a non-starter for me. So, unless I get into the nitty-gritty of some serious code here tonight, I really hope somebody steps up and gives this community something worth a coming-of-age party.
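
For the curious, the nitty-gritty isn't even that gritty. The trick underneath every bignum library is ancient: schoolbook arithmetic over digit arrays. Here's a toy sketch of mine, illustrative only; real libraries use binary limbs and much smarter algorithms, but the principle is exactly this:

// Schoolbook addition over decimal digit strings: exact at any length.
// Illustrative toy only; a and b are non-negative integers as strings.
function bigAdd(a, b) {
  var out = '', carry = 0;
  var i = a.length - 1, j = b.length - 1;
  while (i >= 0 || j >= 0 || carry) {
    var sum = carry + (i >= 0 ? +a[i--] : 0) + (j >= 0 ? +b[j--] : 0);
    out = (sum % 10) + out;      // keep the low digit
    carry = sum >= 10 ? 1 : 0;   // carry the high digit
  }
  return out;
}

console.log(bigAdd('742846440039055165409247126739172', '10'));
// '742846440039055165409247126739182' -- the answer the REPL up top couldn't give

That's essentially what Python is doing for its longs, just with binary limbs instead of decimal characters, which is why it can grind through a million digits and still land the 300000 on the right column.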