
🚀 Deep Learning Specialization

This repository contains all materials, including labs, for the Deep Learning Specialization offered by DeepLearning.AI on Coursera.

About Course 1: Neural Networks and Deep Learning

In the first course of the Deep Learning Specialization, you will study the foundational concepts of neural networks and deep learning.

By the end, you will be familiar with the significant technological trends driving the rise of deep learning; build, train, and apply fully connected deep neural networks; implement efficient (vectorized) neural networks; identify key parameters in a neural networkโ€™s architecture; and apply deep learning to your own applications.

Syllabus:

Week 1: Introduction to Deep Learning

Analyze the major trends driving the rise of deep learning, and give examples of where and how it is applied today.

Week 2: Neural Networks Basics

Set up a machine learning problem with a neural network mindset and use vectorization to speed up your models.
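
For example, vectorization in NumPy replaces an explicit loop over training examples with a single matrix operation. The sketch below compares a looped and a vectorized logistic-regression forward pass; the variable names and sizes are illustrative, not taken from the labs.

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Toy data: n_x features, m training examples stored as columns of X
n_x, m = 4, 1000
X = np.random.randn(n_x, m)
w = np.random.randn(n_x, 1)
b = 0.0

# Loop version: one example at a time
a_loop = np.zeros((1, m))
for i in range(m):
    z_i = np.dot(w[:, 0], X[:, i]) + b      # scalar pre-activation for one example
    a_loop[0, i] = sigmoid(z_i)

# Vectorized version: all examples in a single matrix product
a_vec = sigmoid(w.T @ X + b)

assert np.allclose(a_loop, a_vec)
```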

Week 3: Shallow Neural Networks

Build a neural network with one hidden layer, using forward propagation and backpropagation.
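
A minimal NumPy sketch of one forward and backward pass for a network with a single hidden layer (tanh hidden activation, sigmoid output, cross-entropy loss); the layer sizes, toy data, and variable names below are illustrative rather than the lab's exact code.

```python
import numpy as np

np.random.seed(0)
n_x, n_h, n_y, m = 2, 4, 1, 200                  # illustrative layer sizes and batch size
X = np.random.randn(n_x, m)                      # toy inputs
Y = (np.random.rand(1, m) > 0.5).astype(float)   # toy binary labels

W1 = np.random.randn(n_h, n_x) * 0.01
b1 = np.zeros((n_h, 1))
W2 = np.random.randn(n_y, n_h) * 0.01
b2 = np.zeros((n_y, 1))

# Forward propagation
Z1 = W1 @ X + b1
A1 = np.tanh(Z1)
Z2 = W2 @ A1 + b2
A2 = 1 / (1 + np.exp(-Z2))                       # sigmoid output

# Backpropagation (cross-entropy loss with sigmoid output)
dZ2 = A2 - Y
dW2 = (dZ2 @ A1.T) / m
db2 = dZ2.sum(axis=1, keepdims=True) / m
dZ1 = (W2.T @ dZ2) * (1 - A1 ** 2)               # derivative of tanh is 1 - A1^2
dW1 = (dZ1 @ X.T) / m
db1 = dZ1.sum(axis=1, keepdims=True) / m

# One gradient-descent step
lr = 0.1
W1 -= lr * dW1; b1 -= lr * db1
W2 -= lr * dW2; b2 -= lr * db2
```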

Week 4: Deep Neural Networks

Analyze the key computations underlying deep learning, then use them to build and train deep neural networks for computer vision tasks.

About Course 2: Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization

In the second course of the Deep Learning Specialization, you will open the deep learning black box to understand the processes that drive performance and generate good results systematically.

By the end, you will learn best practices for setting up train/dev/test sets and analyzing bias/variance when building deep learning applications; be able to use standard neural network techniques such as initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check their convergence; and implement a neural network in TensorFlow.

Syllabus:

Week 1: Practical Aspects of Deep Learning

Discover and experiment with a variety of different initialization methods, apply L2 regularization and dropout to avoid model overfitting, then apply gradient checking to identify errors in a fraud detection model.
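
As a rough illustration of these two regularization techniques (not the lab implementation), the snippet below adds an L2 penalty term to the cost and applies inverted dropout to one layer's activations; all array names and hyperparameter values are made up for the example.

```python
import numpy as np

np.random.seed(1)
m = 64                                   # illustrative mini-batch size
W1 = np.random.randn(5, 3)               # toy weight matrices
W2 = np.random.randn(1, 5)
A1 = np.random.randn(5, m)               # activations of a hidden layer

# L2 regularization: add (lambda / 2m) * sum of squared weights to the cost
lambd = 0.7
l2_cost_term = (lambd / (2 * m)) * (np.sum(np.square(W1)) + np.sum(np.square(W2)))

# Inverted dropout: randomly silence units, then rescale to keep the expected activation
keep_prob = 0.8
D1 = np.random.rand(*A1.shape) < keep_prob   # dropout mask
A1 = (A1 * D1) / keep_prob
```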

Week 2: Optimization Algorithms

Develop your deep learning toolbox by adding more advanced optimizations, random minibatching, and learning rate decay scheduling to speed up your models.
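
A hedged sketch of a single Adam update, which combines a Momentum-style moving average of gradients with an RMSprop-style moving average of squared gradients; the function name and the toy mini-batch loop are illustrative only.

```python
import numpy as np

def adam_step(theta, grad, v, s, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single parameter array (illustrative, not the lab's API)."""
    v = beta1 * v + (1 - beta1) * grad            # Momentum-style average of gradients
    s = beta2 * s + (1 - beta2) * grad ** 2       # RMSprop-style average of squared gradients
    v_hat = v / (1 - beta1 ** t)                  # bias correction
    s_hat = s / (1 - beta2 ** t)
    theta = theta - lr * v_hat / (np.sqrt(s_hat) + eps)
    return theta, v, s

# Toy usage: ten fake mini-batch gradients
theta = np.zeros(3)
v, s = np.zeros_like(theta), np.zeros_like(theta)
for t in range(1, 11):
    grad = np.random.randn(3)                     # stand-in for a mini-batch gradient
    theta, v, s = adam_step(theta, grad, v, s, t)
```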

Week 3: Hyperparameter Tuning, Batch Normalization and Programming Frameworks

Explore TensorFlow, a deep learning framework that allows you to build neural networks quickly and easily, then train a neural network on a TensorFlow dataset.
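
For instance, a small fully connected network can be defined and trained with the tf.keras API in a few lines; the data and layer sizes here are placeholders, not the week's assignment.

```python
import numpy as np
import tensorflow as tf

# Placeholder data: 100 samples, 20 features, binary labels
X = np.random.randn(100, 20).astype("float32")
y = (np.random.rand(100) > 0.5).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(25, activation="relu", input_shape=(20,)),
    tf.keras.layers.Dense(12, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```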

About Course 3: Structuring Machine Learning Projects

In the third course of the Deep Learning Specialization, you will learn how to build a successful machine learning project and get to practice decision-making as a machine learning project leader.

By the end, you will be able to diagnose errors in a machine learning system; prioritize strategies for reducing errors; understand complex ML settings, such as mismatched training/test sets and comparing to or surpassing human-level performance; and apply end-to-end learning, transfer learning, and multi-task learning.

Syllabus:

Week 1: ML Strategy (1)

Streamline and optimize your ML production workflow by implementing strategic guidelines for goal-setting and applying human-level performance to help define key priorities.

Week 2: ML Strategy (2)

Develop time-saving error analysis procedures to evaluate the most worthwhile options to pursue and gain intuition for how to split your data and when to use multi-task, transfer, and end-to-end deep learning.

About Course 4: Convolutional Neural Networks

In the fourth course of the Deep Learning Specialization, you will understand how computer vision has evolved and become familiar with its exciting applications such as autonomous driving, face recognition, reading radiology images, and more.

By the end, you will be able to build a convolutional neural network, including recent variations such as residual networks; apply convolutional networks to visual detection and recognition tasks; and use neural style transfer to generate art and apply these algorithms to a variety of image, video, and other 2D or 3D data.

Syllabus:

Week 1: Foundations of Convolutional Neural Networks

Implement the foundational layers of CNNs (pooling, convolutions) and stack them properly in a deep network to solve multi-class image classification problems.
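
A minimal tf.keras sketch of such a stack of convolution and pooling layers feeding a softmax classifier; the filter counts, input size, and number of classes are illustrative assumptions.

```python
import tensorflow as tf

# Illustrative CNN for multi-class image classification (64x64 RGB inputs, 6 classes)
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, (4, 4), padding="same", activation="relu",
                           input_shape=(64, 64, 3)),
    tf.keras.layers.MaxPool2D(pool_size=(8, 8)),
    tf.keras.layers.Conv2D(16, (2, 2), padding="same", activation="relu"),
    tf.keras.layers.MaxPool2D(pool_size=(4, 4)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(6, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```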

Week 2: Deep Convolutional Models: Case Studies

Discover some powerful practical tricks and methods used in deep CNNs, straight from the research papers, then apply transfer learning to your own deep CNN.
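
As an illustration of transfer learning (the pretrained backbone and image size below are assumptions, not necessarily the ones used in the lab), one can freeze a pretrained convolutional base and train only a new classification head.

```python
import tensorflow as tf

# Freeze a pretrained convolutional base (MobileNetV2 is an illustrative choice;
# the ImageNet weights are downloaded on first use) and train only a new head.
base = tf.keras.applications.MobileNetV2(input_shape=(160, 160, 3),
                                         include_top=False,
                                         weights="imagenet")
base.trainable = False

inputs = tf.keras.Input(shape=(160, 160, 3))
x = tf.keras.applications.mobilenet_v2.preprocess_input(inputs)
x = base(x, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)   # e.g. a binary task head
model = tf.keras.Model(inputs, outputs)

model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="binary_crossentropy",
              metrics=["accuracy"])
```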

Week 3: Object Detection

Apply your new knowledge of CNNs to one of the hottest (and most challenging!) fields in computer vision: object detection.
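
Object detectors typically score predicted boxes against ground truth with intersection over union (IoU); here is a minimal sketch using corner-coordinate boxes (the coordinate convention and example boxes are assumptions for the illustration).

```python
def iou(box1, box2):
    """Intersection over union of two boxes given as (x1, y1, x2, y2) corners."""
    xi1, yi1 = max(box1[0], box2[0]), max(box1[1], box2[1])
    xi2, yi2 = min(box1[2], box2[2]), min(box1[3], box2[3])
    inter = max(0, xi2 - xi1) * max(0, yi2 - yi1)        # overlap area (0 if disjoint)
    area1 = (box1[2] - box1[0]) * (box1[3] - box1[1])
    area2 = (box2[2] - box2[0]) * (box2[3] - box2[1])
    return inter / (area1 + area2 - inter)

print(iou((2, 1, 4, 3), (1, 2, 3, 4)))   # overlapping boxes -> 1/7, about 0.143
```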

Week 4: Special Applications: Face recognition & Neural Style Transfer

Explore how CNNs can be applied to multiple fields, including art generation and face recognition, then implement your own algorithm to generate art and recognize faces.
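
Face recognition systems of this kind are commonly trained with a triplet loss that pulls embeddings of the same identity together and pushes different identities apart; a hedged TensorFlow sketch follows, with the batch size, embedding dimension, and margin chosen only for illustration.

```python
import tensorflow as tf

def triplet_loss(anchor, positive, negative, alpha=0.2):
    """Triplet loss over batches of embeddings with shape (batch, embed_dim)."""
    pos_dist = tf.reduce_sum(tf.square(anchor - positive), axis=-1)   # same identity
    neg_dist = tf.reduce_sum(tf.square(anchor - negative), axis=-1)   # different identity
    return tf.reduce_sum(tf.maximum(pos_dist - neg_dist + alpha, 0.0))

# Toy embeddings: batch of 4 faces with 128-dimensional encodings
a, p, n = (tf.random.normal((4, 128)) for _ in range(3))
loss = triplet_loss(a, p, n)
```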

About Course 5: Sequence Models

In the fifth course of the Deep Learning Specialization, you will become familiar with sequence models and their exciting applications such as speech recognition, music synthesis, chatbots, machine translation, natural language processing (NLP), and more.

By the end, you will be able to build and train Recurrent Neural Networks (RNNs) and commonly-used variants such as GRUs and LSTMs; apply RNNs to Character-level Language Modeling; gain experience with natural language processing and Word Embeddings; and use HuggingFace tokenizers and transformer models to solve different NLP tasks such as NER and Question Answering.

Syllabus:

Week 1: Recurrent Neural Networks

Discover recurrent neural networks, a type of model that performs extremely well on temporal data, and several of its variants, including LSTMs, GRUs and Bidirectional RNNs.
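
A minimal NumPy sketch of a single time step of a basic RNN cell (weight names and dimensions are illustrative), showing how the hidden state is updated from the previous state and the current input.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def rnn_cell_forward(xt, a_prev, Wax, Waa, Wya, ba, by):
    """One time step of a basic RNN cell (illustrative names and shapes)."""
    a_next = np.tanh(Waa @ a_prev + Wax @ xt + ba)   # new hidden state
    yt = softmax(Wya @ a_next + by)                  # prediction at this time step
    return a_next, yt

# Toy dimensions: n_x inputs, n_a hidden units, n_y outputs, m examples
n_x, n_a, n_y, m = 3, 5, 2, 10
xt, a_prev = np.random.randn(n_x, m), np.random.randn(n_a, m)
Wax, Waa, Wya = np.random.randn(n_a, n_x), np.random.randn(n_a, n_a), np.random.randn(n_y, n_a)
ba, by = np.zeros((n_a, 1)), np.zeros((n_y, 1))
a_next, yt = rnn_cell_forward(xt, a_prev, Wax, Waa, Wya, ba, by)
```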

Week 2: Natural Language Processing & Word Embeddings

Natural language processing with deep learning is a powerful combination. Using word vector representations and embedding layers, train recurrent neural networks with outstanding performance across a wide variety of applications, including sentiment analysis, named entity recognition and neural machine translation.
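
As a rough example (the vocabulary size, network depth, and number of classes are assumptions), an embedding layer can feed stacked LSTMs for a task such as sentiment classification.

```python
import tensorflow as tf

vocab_size, embed_dim, num_classes = 10000, 50, 5   # illustrative sizes

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embed_dim),   # word indices -> dense vectors
    tf.keras.layers.LSTM(128, return_sequences=True),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.LSTM(128),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```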

Week 3: Sequence Models & Attention Mechanism

Augment your sequence models using an attention mechanism, an algorithm that helps your model decide where to focus its attention given a sequence of inputs. Then, explore speech recognition and how to deal with audio data.
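
One widely used formulation is scaled dot-product attention, softmax(Q K^T / sqrt(d_k)) V, which also underlies the transformer of Week 4; below is a small NumPy sketch with illustrative shapes.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D query/key/value matrices."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                        # similarity of queries to keys
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)         # softmax over the keys
    return weights @ V                                     # weighted sum of values

# Toy example: 4 query positions, 6 key/value positions, dimension 8
Q, K, V = np.random.randn(4, 8), np.random.randn(6, 8), np.random.randn(6, 8)
out = scaled_dot_product_attention(Q, K, V)                # shape (4, 8)
```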

Week 4: Transformer Network

The transformer is an architecture that has completely taken the NLP world by storm, and many of the most effective models for NLP today are based on the transformer architecture.
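
Since the course description above mentions using Hugging Face tokenizers and transformer models for tasks such as NER and question answering, here is a hedged sketch of the transformers pipeline API (it assumes the transformers package is installed; pretrained models are downloaded on first use, and the input strings are invented examples).

```python
from transformers import pipeline

# Named entity recognition with a default pretrained model
ner = pipeline("ner", aggregation_strategy="simple")
print(ner("DeepLearning.AI offers the Deep Learning Specialization on Coursera."))

# Extractive question answering
qa = pipeline("question-answering")
print(qa(question="How many courses are in the specialization?",
         context="The Deep Learning Specialization consists of five courses."))
```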
