Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization
Description
When you enroll in courses through Coursera, you can choose between a paid plan and a free plan.
- Free plan: Audit only, with no certification. You will have access to all course materials except graded items.
- Paid plan: Commit to earning a Certificate—it's a trusted, shareable way to showcase your new skills.
About this course: This course will teach you the "magic" of getting deep learning to work well. Rather than treating the deep learning process as a black box, you will understand what drives performance and be able to get good results more systematically. You will also learn TensorFlow. After 3 weeks, you will:
- Understand industry best practices for building deep learning applications.
- Be able to effectively use common neural network "tricks", including initialization, L2 and dropout regularization, batch normalization, and gradient checking.
- Be able to implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, momentum, RMSprop, and Adam, and check for their convergence.
- Understand new best practices for the deep learning era: how to set up train/dev/test sets and analyze bias/variance.
- Be able to implement a neural network in TensorFlow.
This is the second course of the Deep Learning Specialization.
Frequently asked questions
There are no frequently asked questions yet. If you have any more questions or need help, contact our customer service.
Who is this class for: This class is for:
- Learners who took the first course of the specialization, "Neural Networks and Deep Learning".
- Anyone who already understands fully-connected neural networks and wants to learn the practical aspects of making them work well.
Created by: deeplearning.ai
Taught by: Andrew Ng, Co-founder, Coursera; Adjunct Professor, Stanford University; formerly head of Baidu AI Group/Google Brain
Each course is like an interactive textbook, with pre-recorded videos, quizzes, and projects.
Help from your peers: Connect with thousands of other learners, discuss ideas and course materials, and get help mastering the concepts.
Certificates: Earn official recognition for your work, and share your success with friends, colleagues, and employers.
deeplearning.ai
deeplearning.ai is Andrew Ng's new venture, which, among other things, strives to provide comprehensive AI education beyond borders.
Syllabus
WEEK 1
Practical aspects of Deep Learning
15 videos
- Video: Train / Dev / Test sets
- Video: Bias / Variance
- Video: Basic Recipe for Machine Learning
- Video: Regularization
- Video: Why regularization reduces overfitting?
- Video: Dropout Regularization
- Video: Understanding Dropout
- Video: Other regularization methods
- Video: Normalizing inputs
- Video: Vanishing / Exploding gradients
- Video: Weight Initialization for Deep Networks
- Video: Numerical approximation of gradients
- Video: Gradient checking
- Video: Gradient Checking Implementation Notes
- Notebook: Initialization
- Notebook: Regularization
- Notebook: Gradient Checking
- Video: Yoshua Bengio interview
Graded: Practical aspects of deep learning
Graded: Initialization
Graded: Regularization
Graded: Gradient Checking
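Gradient checking, covered in the videos and notebook above, verifies an analytic gradient against a numerical approximation. The following is a minimal illustrative sketch (not the course's assignment code; the function and names are hypothetical), comparing a hand-derived derivative with a centered finite difference:

```python
# Hypothetical sketch of gradient checking on a scalar function.
def f(x):
    return x ** 3  # example function

def analytic_grad(x):
    return 3 * x ** 2  # derivative derived by hand

def numerical_grad(x, h=1e-5):
    # centered difference: (f(x+h) - f(x-h)) / (2h)
    return (f(x + h) - f(x - h)) / (2 * h)

x = 2.0
diff = abs(analytic_grad(x) - numerical_grad(x))
print(diff < 1e-6)  # the two gradients should agree closely
```

In practice the course applies the same idea to every parameter of a network and reports a relative difference between the two gradient vectors.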
WEEK 2
Optimization algorithms
11 videos
- Video: Mini-batch gradient descent
- Video: Understanding mini-batch gradient descent
- Video: Exponentially weighted averages
- Video: Understanding exponentially weighted averages
- Video: Bias correction in exponentially weighted averages
- Video: Gradient descent with momentum
- Video: RMSprop
- Video: Adam optimization algorithm
- Video: Learning rate decay
- Video: The problem of local optima
- Notebook: Optimization
- Video: Yuanqing Lin interview
Graded: Optimization algorithms
Graded: Optimization
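The Adam algorithm taught in this week combines momentum (an exponentially weighted average of gradients) with RMSprop (an average of squared gradients), plus the bias correction covered in its own video. A minimal sketch of one parameter update, with illustrative names rather than the course's assignment code:

```python
import math

# Hypothetical sketch of a single Adam update for one scalar parameter.
def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    m = beta1 * m + (1 - beta1) * grad        # momentum: weighted average of gradients
    v = beta2 * v + (1 - beta2) * grad ** 2   # RMSprop: weighted average of squared gradients
    m_hat = m / (1 - beta1 ** t)              # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

# one step on theta = 1.0 with gradient 0.5
theta, m, v = adam_step(1.0, 0.5, 0.0, 0.0, t=1)
print(theta)
```

Setting beta2 to 0 recovers plain gradient descent with momentum, which is why the course introduces momentum and RMSprop before Adam.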
WEEK 3
Hyperparameter tuning, Batch Normalization and Programming Frameworks
11 videos
- Video: Tuning process
- Video: Using an appropriate scale to pick hyperparameters
- Video: Hyperparameters tuning in practice: Pandas vs. Caviar
- Video: Normalizing activations in a network
- Video: Fitting Batch Norm into a neural network
- Video: Why does Batch Norm work?
- Video: Batch Norm at test time
- Video: Softmax Regression
- Video: Training a softmax classifier
- Video: Deep learning frameworks
- Video: TensorFlow
- Notebook: TensorFlow
Graded: Hyperparameter tuning, Batch Normalization, Programming Frameworks
Graded: TensorFlow
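Batch normalization, the core topic of this week's "Normalizing activations in a network" videos, standardizes a layer's activations to zero mean and unit variance, then applies a learned scale and shift. A minimal forward-pass sketch in plain Python (names and defaults are illustrative, not the course's assignment code):

```python
# Hypothetical sketch of the batch-norm forward pass over one batch of activations.
def batch_norm(z, gamma=1.0, beta=0.0, eps=1e-8):
    mu = sum(z) / len(z)                           # batch mean
    var = sum((x - mu) ** 2 for x in z) / len(z)   # batch variance
    z_norm = [(x - mu) / (var + eps) ** 0.5 for x in z]
    return [gamma * x + beta for x in z_norm]      # learned scale and shift

out = batch_norm([1.0, 2.0, 3.0])
print(out)  # normalized activations: mean ~0, variance ~1
```

At test time, as the "Batch Norm at test time" video explains, the batch statistics `mu` and `var` are replaced by running averages accumulated during training.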