{Submarine}: Running deep learning workloads on Apache Hadoop

(This blog post is co-authored by Xun Liu and Quan Zhou from Netease.)

Introduction

Hadoop is the most popular open-source framework for the distributed processing of large enterprise data sets. It is heavily used in both on-premises and cloud environments. Deep learning is useful for enterprise tasks in the fields of speech recognition, image classification, […]

The post {Submarine}: Running deep learning workloads on Apache Hadoop appeared first on Hortonworks.
