
Speaker "Bhairav Mehta" Details Back

 

Topic

Deep learning with Python using the TensorFlow, Theano and Keras libraries.

Abstract

Deep learning is the most interesting and powerful machine learning technique right now.

Top deep learning libraries such as Theano and TensorFlow are available in the Python ecosystem. You can tap into their power in a few lines of code using Keras, the best-of-breed applied deep learning library.
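
To make the "few lines of code" claim concrete, here is a minimal sketch (not taken from the session materials) of a small Keras network trained on made-up data; it assumes the Keras 2 API, where older 1.x releases spell some arguments differently (e.g. nb_epoch instead of epochs):

    import numpy as np
    from keras.models import Sequential
    from keras.layers import Dense

    # Hypothetical toy data: 100 samples, 8 input features, binary labels.
    X = np.random.rand(100, 8)
    y = np.random.randint(2, size=100)

    # Define, compile, fit and evaluate a tiny fully connected network.
    model = Sequential()
    model.add(Dense(12, input_dim=8, activation='relu'))
    model.add(Dense(1, activation='sigmoid'))
    model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
    model.fit(X, y, epochs=10, batch_size=10, verbose=0)
    print(model.evaluate(X, y, verbose=0))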

Deep learning techniques are so powerful because they learn the best way to represent the problem while learning how to solve the problem.

This is called representation learning.

Representation learning is perhaps the biggest difference between deep learning models and classical machine learning algorithms.

It is the power of representation learning that is spurring such great creativity in the way the techniques are being used. 

 For example:

                    Deep learning models are being applied to very difficult problems and making progress, such as colorizing images and videos based on the context in the scene.

                    Deep learning models are being used in bold new ways, such as cutting the head off a network trained on one problem and tuning it for a completely different problem, and getting impressive results.

                    Combinations of deep learning models are being used to identify objects in photographs and then generate textual descriptions of those objects, a complex multimedia problem that was previously thought to require large artificial intelligence systems.

Deep learning is hot, it is delivering results, and now is the time to get involved. But where do you start?

Develop and evaluate deep learning models in Python.

The platform for getting started in applied deep learning is Python.

Python is a fully featured, general-purpose programming language, unlike R and MATLAB. It is also quick and easy to write and understand, unlike C++ and Java.

The SciPy stack in Python is a mature and quickly expanding platform for scientific and numerical computing. The platform hosts libraries such as scikit-learn, the general-purpose machine learning library that can be used alongside your deep learning models.
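
As an illustration of that interoperability, the sketch below (an illustrative assumption about the workflow, not session code) wraps a Keras model with the keras.wrappers.scikit_learn.KerasClassifier adapter so that scikit-learn's cross-validation can drive it; the data and layer sizes are placeholders:

    import numpy as np
    from keras.models import Sequential
    from keras.layers import Dense
    from keras.wrappers.scikit_learn import KerasClassifier
    from sklearn.model_selection import cross_val_score

    def build_model():
        # Tiny network for 8 placeholder features and a binary label.
        model = Sequential()
        model.add(Dense(12, input_dim=8, activation='relu'))
        model.add(Dense(1, activation='sigmoid'))
        model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
        return model

    # Placeholder data; any (n_samples, 8) array with 0/1 labels would do.
    X = np.random.rand(100, 8)
    y = np.random.randint(2, size=100)

    # The wrapped model behaves like any scikit-learn estimator.
    clf = KerasClassifier(build_fn=build_model, epochs=10, batch_size=10, verbose=0)
    print(cross_val_score(clf, X, y, cv=10).mean())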

It is because of these benefits of the Python ecosystem that two top numerical libraries for deep learning were developed for Python: Theano, and the newer TensorFlow library released by Google (and recently adopted by the Google DeepMind research group).

Theano and TensorFlow are two top numerical libraries for developing deep learning models, but are too technical and complex for the average practitioner. They are intended more for research and development teams and academics interested in developing wholly new deep learning algorithms.
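
For a sense of why they feel low-level, here is a minimal Theano sketch (written for illustration, not drawn from the session): even a single logistic unit is expressed as symbolic variables and an expression graph that must be compiled into a callable function before it can run; TensorFlow's graph-and-session style is similarly explicit.

    import theano
    import theano.tensor as T

    # Symbolic inputs: a feature vector, a weight vector and a bias scalar.
    x = T.dvector('x')
    w = T.dvector('w')
    b = T.dscalar('b')

    # Build the expression graph, then compile it into a callable function.
    output = T.nnet.sigmoid(T.dot(x, w) + b)
    predict = theano.function(inputs=[x, w, b], outputs=output)

    print(predict([1.0, 2.0], [0.5, -0.3], 0.1))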

The saving grace is the Keras library for deep learning. Keras is written in pure Python, wraps Theano and TensorFlow behind a consistent, backend-agnostic interface, and is aimed at machine learning practitioners interested in creating and evaluating deep learning models.
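
That backend-agnostic design means the same Keras script can run on either library; which one is used is controlled by the "backend" field in the ~/.keras/keras.json configuration file or the KERAS_BACKEND environment variable, as in this small sketch (the variable must be set before Keras is first imported):

    import os

    # Select the backend before Keras is imported; 'tensorflow' also works.
    os.environ['KERAS_BACKEND'] = 'theano'

    from keras import backend as K
    print(K.backend())   # reports which backend Keras actually loaded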

Keras is a little over one year old and is clearly the best-of-breed library for getting started with deep learning, because of both the speed at which you can develop models and the numerical power it is built upon.

This fast-paced session is split into two parts (2 days), Lessons and Projects:

             Lessons: Learn how the sub-tasks of applied deep learning map onto the Keras Python library and the best practice way of working through each task.

             Projects: Tie together all of the knowledge from the lessons by working through case study predictive modeling problems.

 

1. Lessons

Here is an overview of the step-by-step lessons you will complete:

                    Lesson 01: Introduction to the Theano library.

                    Lesson 02: Introduction to the TensorFlow library.

                    Lesson 03: Introduction to the Keras library.

                    Lesson 04: Crash Course in Multi-Layer Perceptrons.

                    Lesson 05: Develop Your First Neural Network With Keras.

                    Lesson 06: Evaluate the Performance Of Deep Learning Models.

                    Lesson 07: Use Keras Models With scikit-learn.

                    Lesson 08: Save Your Models For Later With Serialization (see the short sketch after this list).

                    Lesson 09: Keep The Best Models During Training.

                    Lesson 10: Understand Model Behavior During Training.

                    Lesson 11: Reduce Overfitting With Dropout Regularization.

                    Lesson 12: Lift Performance With Learning Rate Schedules.

                    Lesson 13: Crash Course in Convolutional Neural Networks.

                    Lesson 14: Improve Model Performance With Image Augmentation.

                    Lesson 15: Crash Course in Recurrent Neural Networks.

                    Lesson 16: Time Series Prediction with Multilayer Perceptrons.

                    Lesson 17: Time Series Prediction with LSTM Networks.

                    Lesson 18: Understanding Stateful LSTM Recurrent Neural Networks.
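
As a taste of the serialization step in Lesson 08, the sketch below (illustrative only, not the lesson's exact code; the file names are arbitrary) saves a Keras model's architecture to JSON and its weights to HDF5, then restores both. It assumes the h5py package is installed.

    from keras.models import Sequential, model_from_json
    from keras.layers import Dense

    # A tiny placeholder model standing in for any trained network.
    model = Sequential()
    model.add(Dense(1, input_dim=8, activation='sigmoid'))
    model.compile(loss='binary_crossentropy', optimizer='adam')

    # Save the architecture as JSON and the weights as HDF5.
    with open('model.json', 'w') as f:
        f.write(model.to_json())
    model.save_weights('model.h5', overwrite=True)

    # Later: rebuild the model from JSON and reload the weights.
    with open('model.json') as f:
        restored = model_from_json(f.read())
    restored.load_weights('model.h5')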

 

2. Projects

Here is an overview of the 9 end-to-end projects you will complete:

                    Project 01: Develop Large Models on GPUs Cheaply in the Cloud.

                    Project 02: Multiclass Classification of Flower Species (sketched briefly after this list).

                    Project 03: Binary Classification of Sonar Returns.

                    Project 04: Regression of Boston House Prices.

                    Project 05: Handwritten Digit Recognition.

                    Project 06: Object Recognition in Photographs.

                    Project 07: Predict Sentiment From Movie Reviews.

                    Project 08: Sequence Classification with LSTMs for Movie Reviews.

                    Project 09: Text Generation With Alice in Wonderland.
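
To give a feel for the shape of these projects, here is a minimal sketch in the spirit of Project 02: the classic iris flower dataset, one-hot encoded species labels, and a small Keras network with a softmax output. Loading the data through scikit-learn and the layer sizes shown are assumptions; the session may approach the project differently.

    from sklearn.datasets import load_iris
    from keras.models import Sequential
    from keras.layers import Dense
    from keras.utils import np_utils

    iris = load_iris()
    X = iris.data                               # 4 numeric features per flower
    y = np_utils.to_categorical(iris.target)    # one-hot encode the 3 species

    # Small network: one hidden layer, softmax over the three classes.
    model = Sequential()
    model.add(Dense(8, input_dim=4, activation='relu'))
    model.add(Dense(3, activation='softmax'))
    model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
    model.fit(X, y, epochs=50, batch_size=5, verbose=0)
    print(model.evaluate(X, y, verbose=0))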

Profile

Experienced engineer, entrepreneur, and seasoned statistician/programmer with 14 years of combined experience across product, service, channel and warranty management in the consumer electronics industry at Apple Inc. (4 years), yield engineering in semiconductor manufacturing at Qualcomm (6 years), and quality engineering in the automotive industry at Ford Motor Company (OEM and Tier 2 suppliers) (3 years). Holds an MBA from the Johnson Graduate School of Management at Cornell University and three Master of Science (MS) degrees in diverse engineering and mathematical sciences, including Industrial Systems Engineering (Rochester '02) and Applied Statistics (Cornell '04). Has developed skills in Big Data, Hadoop, NoSQL and Spark for scalability and the implementation of big data solutions for enterprises. Cloudera Certified Hadoop Developer, Cloudera Certified Data Scientist, and Amazon AWS Certified Solutions Architect - Associate. Industry-renowned ASQ certifications: Certified Quality Engineer (CQE '06), Certified Reliability Engineer (CRE '07), and Six Sigma Master Black Belt (CSSBB '09). Skills span hardware and software platforms. Naturalized US citizen.