---
layout: default
title: 모두를 위한 딥러닝 시즌 2 - TensorFlow
description: This is the TensorFlow page.
---

# Deep Learning Zero To All : TensorFlow

You can start learning from the links below. If you want to use Docker, please refer to the Docker guide document on GitHub!


## Table of Contents

### PART 1: Basic Machine Learning

- Lec 01: Basic Machine Learning terms and concepts
- Lec 02: Simple Linear Regression
- Lab 02: Implementing Simple Linear Regression in TensorFlow (see the sketch after this list)
- Lec 03: Linear Regression and How to minimize cost
- Lab 03: Implementing Linear Regression and cost minimization in TensorFlow
- Lec 04: Multi-variable Linear Regression
- Lab 04: Implementing Multi-variable Linear Regression in TensorFlow
- Lec 05-1: Introduction to Logistic Regression/Classification
- Lec 05-2: Cost function and minimization for Logistic Regression/Classification
- Lab 05-3: Implementing Logistic Regression/Classification in TensorFlow
- Lec 06-1: Softmax Regression: basic concepts
- Lec 06-2: Cost function of the Softmax Classifier
- Lab 06-1: Implementing a Softmax classifier in TensorFlow
- Lab 06-2: Implementing a Fancy Softmax classifier in TensorFlow
- Lab 07-1: Application & Tips: Learning Rate and Data Preprocessing
- Lab 07-2-1: Application & Tips: Overfitting & Solutions
- Lab 07-2-2: Application & Tips: Learning rate, preprocessing, and overfitting in TensorFlow
- Lab 07-3-1: Application & Tips: Data & Learning
- Lab 07-3-2: Application & Tips: Practice with various Datasets
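
As a quick taste of what the labs in this part implement, here is a minimal TensorFlow 2 sketch of simple linear regression trained with gradient descent and `tf.GradientTape`. The data, learning rate, and step count are illustrative assumptions, not the course's notebook code.

```python
# Minimal sketch (not the course notebook): simple linear regression with tf.GradientTape.
# The data and hyperparameters below are illustrative assumptions.
import tensorflow as tf

x_data = tf.constant([1.0, 2.0, 3.0, 4.0, 5.0])
y_data = tf.constant([2.0, 4.0, 6.0, 8.0, 10.0])   # y = 2x, so W should move toward 2 and b toward 0

W = tf.Variable(0.1)
b = tf.Variable(0.1)
learning_rate = 0.01

for step in range(1000):
    with tf.GradientTape() as tape:
        hypothesis = W * x_data + b                             # linear model
        cost = tf.reduce_mean(tf.square(hypothesis - y_data))   # mean squared error
    dW, db = tape.gradient(cost, [W, b])
    W.assign_sub(learning_rate * dW)                            # plain gradient descent update
    b.assign_sub(learning_rate * db)

print(W.numpy(), b.numpy(), cost.numpy())
```

The later labs in this part broadly keep this tape-and-update loop while swapping in a weight matrix for multi-variable regression and a cross-entropy cost for logistic/softmax classification.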

### PART 2: Basic Deep Learning

- Lec 08-1: Basic concepts of deep learning: the beginning and the XOR problem
- Lec 08-2: Basic concepts of deep learning 2: Back-propagation and the 2006/2007 emergence of 'deep'
- Lec 09-1: Solving the XOR problem with deep learning
- Lec 09-2: Training a deep network (backpropagation)
- Lab 09-1: Neural Net for XOR (see the sketch after this list)
- Lab 09-2: TensorBoard (Neural Net for XOR)
- Lab 10-1: ReLU is better than Sigmoid
- Lab 10-2: Initializing Weights properly
- Lab 10-3: Dropout
- Lab 10-4: Batch Normalization
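
The labs in this part build a small neural network that learns XOR. Below is a minimal `tf.keras` sketch in that spirit; the layer sizes, optimizer, and epoch count are illustrative assumptions rather than the course's exact notebook settings.

```python
# Minimal sketch (not the course notebook): a two-layer network that learns XOR with tf.keras.
# Layer sizes, optimizer, and epoch count are illustrative assumptions.
import numpy as np
import tensorflow as tf

x = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=np.float32)
y = np.array([[0], [1], [1], [0]], dtype=np.float32)   # XOR labels

model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation="relu", input_shape=(2,)),   # hidden layer (ReLU, cf. Lab 10-1)
    tf.keras.layers.Dense(1, activation="sigmoid"),                   # binary output
])
model.compile(optimizer=tf.keras.optimizers.Adam(0.01),
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(x, y, epochs=500, verbose=0)

print(model.predict(x).round().flatten())   # should approach [0, 1, 1, 0]
```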

### PART 3: Convolutional Neural Network

- Lec 11-1: Building the Conv layers of a ConvNet
- Lec 11-2: ConvNet Max pooling and the Full Network
- Lec 11-3: ConvNet application examples
- Lab 11-0-1: CNN Basic: Convolution
- Lab 11-0-2: CNN Basic: Pooling
- Lab 11-1: mnist cnn keras sequential eager (see the sketch after this list)
- Lab 11-2: mnist cnn keras functional eager
- Lab 11-3: mnist cnn keras subclassing eager
- Lab 11-4: mnist cnn keras ensemble eager
- Lab 11-5: mnist cnn best keras eager
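
For the MNIST CNN labs, a minimal Keras Sequential sketch is shown below. The architecture and hyperparameters are illustrative assumptions; the course notebooks also cover functional, subclassing, and ensemble variants of the same idea.

```python
# Minimal sketch (not the course notebook): an MNIST CNN in the Keras Sequential style.
# The architecture and hyperparameters are illustrative assumptions.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0   # (N, 28, 28, 1), scaled to [0, 1]
x_test = x_test[..., None] / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),  # convolution (Lab 11-0-1)
    tf.keras.layers.MaxPool2D(),                                                # pooling (Lab 11-0-2)
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPool2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dropout(0.4),                                               # regularization (cf. Lab 10-3)
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, batch_size=128)
print(model.evaluate(x_test, y_test))
```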

### PART 4: Recurrent Neural Network

- Lec 12: The story of RNN, the flower of NNs
- Lab 12-0: rnn basics
- Lab 12-1: many to one (word sentiment classification) (see the sketch after this list)
- Lab 12-2: many to one stacked (sentence classification, stacked)
- Lab 12-3: many to many (simple pos-tagger training)
- Lab 12-4: many to many bidirectional (simple pos-tagger training, bidirectional)
- Lab 12-5: seq to seq (simple neural machine translation)
- Lab 12-6: seq to seq with attention (simple neural machine translation, attention)
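
For the many-to-one labs, below is a minimal sketch of word sentiment classification with a character embedding and a `SimpleRNN`. The toy vocabulary, labels, and model sizes are illustrative assumptions, not the course's dataset or notebook code.

```python
# Minimal sketch (not the course notebook): many-to-one word sentiment classification.
# The toy vocabulary, labels, and model sizes are illustrative assumptions.
import tensorflow as tf

words = ["good", "bad", "great", "awful"]
labels = [1, 0, 1, 0]                                         # 1 = positive, 0 = negative

char_set = sorted(set("".join(words)))
char2idx = {c: i + 1 for i, c in enumerate(char_set)}         # 0 is reserved for padding
max_len = max(len(w) for w in words)

x = tf.keras.preprocessing.sequence.pad_sequences(
    [[char2idx[c] for c in w] for w in words], maxlen=max_len, padding="post")
y = tf.constant(labels, dtype=tf.float32)[:, tf.newaxis]      # shape (4, 1) to match the sigmoid output

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(len(char2idx) + 1, 8, mask_zero=True),  # character embedding, padding masked
    tf.keras.layers.SimpleRNN(16),                                    # many-to-one: only the last state is used
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(0.01),
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(x, y, epochs=200, verbose=0)

print(model.predict(x).round().flatten())   # should approach [1, 0, 1, 0]
```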

back