Earlier this year, Microsoft made its Computational Network Toolkit (CNTK), a tool used to speed up advances in artificial intelligence, available as open source on GitHub. With CNTK 1.5, Microsoft has added significant language enhancements, an expanded toolbox of features, and improved readers for text and speech.
One of CNTK’s advantages is its ability to scale efficiently across multiple GPUs and machines. CNTK 1.5 introduces a new parallel-training technique known as Block Momentum that significantly improves training scalability while preserving model accuracy.
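To give a sense of the idea behind Block Momentum: workers each train a copy of the model on a block of data, the copies are averaged, and the resulting block-level update is smoothed with a momentum term before being applied to the global model. The sketch below is a simplified, hypothetical illustration of that synchronization step; the function name, parameters, and plain-list representation are our own, not CNTK’s API.

```python
# Hypothetical sketch of one Block Momentum synchronization step.
# All names and defaults are illustrative; the real CNTK
# implementation differs in detail.

def block_momentum_step(global_w, delta, worker_ws,
                        block_momentum=0.9, block_lr=1.0):
    """Combine per-worker models after each worker trained on its data block.

    global_w  : list of floats, current global model parameters
    delta     : list of floats, previous block-level update (momentum state)
    worker_ws : list of per-worker parameter lists after local training
    """
    n = len(worker_ws)
    # 1. Average the locally trained models.
    avg_w = [sum(ws[i] for ws in worker_ws) / n
             for i in range(len(global_w))]
    # 2. Block-level update direction: how far the averaged model
    #    moved away from the current global model.
    g = [a - w for a, w in zip(avg_w, global_w)]
    # 3. Smooth that update with block-level momentum, then apply it.
    new_delta = [block_momentum * d + block_lr * gi
                 for d, gi in zip(delta, g)]
    new_w = [w + d for w, d in zip(global_w, new_delta)]
    return new_w, new_delta
```

Note that with `block_momentum=0` the step reduces to plain model averaging; the momentum term is what lets large blocks be used without the accuracy loss naive averaging can cause.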
“Expressing very deep nets, beam decoding, and other complex structures is greatly simplified with BrainScript, which supports infix operators, nested variables and function definitions, recursive function calls, arrays, and even lambdas,” said Frank Seide, principal researcher and one of the architects of CNTK.
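To illustrate the kind of expressiveness Seide describes, here is a small BrainScript-style fragment showing infix operators, a nested definition, and a recursive function call. It is an illustrative sketch only and has not been validated against the CNTK 1.5 grammar:

```
# Illustrative BrainScript-style sketch; syntax not verified against CNTK 1.5.

# Recursive function call with an if/then/else expression:
Factorial (n) = if n <= 1 then 1 else n * Factorial (n - 1)

# Nested variables inside a function definition, using infix operators:
Layer (x, W, b) = {
    z   = W * x + b
    out = Sigmoid (z)
}.out
```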
Additionally, CNTK 1.5 includes a revamped I/O architecture with more flexible readers for text and speech, making it easier to feed popular data formats into the toolkit for deep-learning training and saving users from writing their own parsing code. Microsoft has also included a growing library of standard components, such as Sequence-to-Sequence with Attention and the state-of-the-art Deep Residual Nets for image recognition. Features like these expand the toolbox available to CNTK users, giving developers advanced recipes they can use out of the box.
Since CNTK’s initial release on GitHub, the Microsoft Research team has received an incredible amount of feedback from the community. Many of the improvements in CNTK 1.5 stem directly from community requests and contributions. The team will continue working with the community to advance CNTK, including adding support for more popular programming languages such as Python.
The bottom line for developers: With CNTK 1.5, you now have access to efficient and easy-to-use tools to add artificial intelligence capabilities, like speech and image recognition, to your applications.
Source: Microsoft Research Blog