Machine learning is at the heart of many computer applications in this generation. The research community has spent years developing these methods and proving them mathematically. For many purposes, such as rapid prototyping, it makes sense to use an existing package of functions that gives you readily available solutions: for example, computing the derivative of a mathematical function. In other words, such tasks can be treated as a "black box" in service of something bigger. This is the need that libraries like Torch fill.
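As a concrete illustration of this "black box" idea, a minimal sketch using PyTorch (the Python port of Torch, discussed below): the library computes a derivative automatically, with no hand-derived formula. The specific values here are arbitrary.

```python
import torch

# Compute dy/dx for y = x^2 at x = 3, without deriving 2x by hand.
x = torch.tensor(3.0, requires_grad=True)
y = x ** 2
y.backward()  # autograd fills in x.grad with dy/dx = 2x

print(x.grad)  # tensor(6.)
```

The caller never sees the differentiation machinery; it simply asks for the gradient and gets it.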
What is Torch and What Does It Offer?
Torch is a scientific computing library that was initially written in and offered for Lua (a high-level programming language). Its standout feature is its ability to offload work to GPUs (Graphics Processing Units), which handle the large linear algebra workloads common in machine learning far more efficiently than CPUs.
Torch takes a GPU-first approach to fast matrix manipulation. It also ships with numeric optimization routines and supports porting to lower-power mobile devices.
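A minimal sketch of this GPU-first matrix manipulation, again in PyTorch terms, falling back to the CPU when no GPU is present (the matrix sizes are arbitrary):

```python
import torch

# Use the GPU when one is available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Two random matrices allocated directly on the chosen device.
a = torch.randn(256, 512, device=device)
b = torch.randn(512, 128, device=device)

# The same matrix-multiplication code runs on either device.
c = a @ b
print(c.shape)  # torch.Size([256, 128])
```

The key design point is that the code is identical on CPU and GPU; only the device selection changes.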
Torch has delivered consistently, and its core has therefore been ported to multiple languages, most notably Python (PyTorch) and C/C++.
Documentation and Community Support
Since Torch is open source, there is extensive community support on its own GitHub repositories as well as on sites like Stack Overflow, where questions about bugs, issues, updates, and so on are answered.