Sunday, December 30, 2018

Submarine: Running Deep Learning Workloads on Apache Hadoop

Introduction

Hadoop is the most popular open source framework for distributed processing of large enterprise data sets. It is heavily used in both on-premises and cloud environments.

Deep learning is useful for enterprise tasks in fields such as speech recognition, image classification, AI chatbots, and machine translation, to name a few. To train deep learning/machine learning models, frameworks such as TensorFlow, MXNet, PyTorch, Caffe, and XGBoost can be leveraged, and sometimes several of these frameworks are used together to solve different parts of a problem.
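For concreteness, below is a minimal sketch of the kind of TensorFlow training script these frameworks are used for. The data is synthetic and the model architecture is arbitrary, chosen only for illustration; neither comes from the original post.

import numpy as np
import tensorflow as tf

# Synthetic stand-in data; a real job would read from HDFS or object storage.
x_train = np.random.rand(1000, 32).astype("float32")
y_train = np.random.randint(0, 10, size=(1000,)).astype("int32")

# A small classifier; the layer sizes here are purely illustrative.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(32,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Train for a few epochs; scheduling scripts like this on the cluster,
# potentially distributed across many workers with GPUs, is what
# Submarine is designed to handle.
model.fit(x_train, y_train, epochs=3, batch_size=32)

In practice such a script would be packaged (for example in a container image) and submitted to the Hadoop cluster, which is where Submarine comes in.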

from DZone.com Feed http://bit.ly/2BN7fEq
