I've always been amazed at how much faster things become if you manage to rewrite code that loops through your ndarray element by element into numpy functions that operate on the whole array at once. I'm looking for something similar in node. So far I have turned up a few things, none of which look promising:
- scikit-node, runs scikit-learn in Python, and interfaces with node. I haven't tried it, it may not give me the speed that I would like.
- There are various JavaScript matrix libraries, both older and newer (sylvester, gl-matrix, ...). Besides not being sure they handle matrices larger than 4x4 well (which is mainly what 3D rendering needs), they appear to be pure JavaScript, and some (I'm not sure which) use WebGL acceleration. Great in the browser, not so much on node.
I'm not asking for "what is the best package to do xyz". I'm just wondering whether there is a technical reason there is no such package on node, a social reason, or no reason at all and I've simply missed a package. Maybe, to avoid overly opinionated criticism, a concrete version of the question:
I have about 10000 matrices that are 100 x 100 each. What's a reasonably fast way to add them together?
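For reference, a minimal sketch of the straightforward approach in plain Node, assuming each matrix is stored as a flat `Float64Array` (row-major). The function name `sumMatrices` is my own; tight loops over typed arrays are generally handled well by V8's JIT, though this is still not SIMD/BLAS-level speed:

```javascript
// Sum many matrices, each stored as a flat Float64Array of length rows * cols.
function sumMatrices(matrices, rows, cols) {
  const size = rows * cols;
  const acc = new Float64Array(size); // zero-initialized accumulator
  for (const m of matrices) {
    for (let i = 0; i < size; i++) {
      acc[i] += m[i];
    }
  }
  return acc;
}

// Example: 10000 matrices of 100 x 100, each filled with ones.
const matrices = Array.from({ length: 10000 }, () =>
  new Float64Array(100 * 100).fill(1)
);
const total = sumMatrices(matrices, 100, 100);
console.log(total[0]); // each entry of the sum is 10000
```

Using one contiguous typed array per matrix (rather than nested arrays of arrays) is what keeps the inner loop monomorphic and cache-friendly.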
After some more googling for "node.js scientific computing", I found links to some very interesting notes:
- https://cs.stackexchange.com/questions/ ... ould-i-kee
- http://www.quora.com/Can-Node-js-handle ... -Julia-can
- Javascript and Scientific Processing?
Source: https://stackoverflow.com/questions/314 ... ot-why-not