
ML (14)

[Tensorflow 2.x] Basics

Tensors are roughly equivalent to NumPy arrays. Typical axis order: tf.zeros([batch, width, height, features]).

Variables:

tensor = tf.constant([[1.0, 2.0], [3.0, 4.0]])
variable = tf.Variable(tensor, name='my_tensor')
variable.assign([[1, 2], [3, 4]])       # same values as the initial variable (dtype is kept)
variable.assign_add([[1, 1], [1, 1]])   # [[2., 3.], [4., 5.]]
print(tf.Variable(tensor + 1) == tf.Variable(tensor))  # [[False, False], [False, False]]..

2021. 8. 13.
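For reference, a minimal runnable sketch of the snippet above, assuming TensorFlow 2.x is installed:

import tensorflow as tf

tensor = tf.constant([[1.0, 2.0], [3.0, 4.0]])
variable = tf.Variable(tensor, name='my_tensor')

# assign keeps the variable's float dtype, so the int literals are cast to floats
variable.assign([[1, 2], [3, 4]])

# element-wise in-place addition: the variable becomes [[2., 3.], [4., 5.]]
variable.assign_add([[1, 1], [1, 1]])

# == compares element-wise and returns a boolean tensor
print(tf.Variable(tensor + 1) == tf.Variable(tensor))  # [[False False] [False False]]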
[CNN] DenseNet Implementation (Keras)

The overall structure of DenseNet.

Dense Connectivity

$$ x_{l} = f([x_{0}, x_{1}, \dots, x_{l-1}]) $$

Unlike the Add used in ResNet, DenseNet uses Concatenate.

Implementation

Architecture: implemented following the structure above (growth rate = 32).

Util Functions

def get_full_props(properties):
    assert properties.get('filters') is not None
    assert properties.get('kernel_size') is not None
    assert properties.get('padding') in ['valid', 'same', None]
    filters = properties...

2021. 8. 10.
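The util functions above are cut off in this preview. As a rough illustration of the dense-connectivity idea only, here is a minimal dense-block sketch built on Concatenate. The number of layers, the input shape, and the omission of DenseNet's bottleneck and transition layers are simplifications chosen for illustration, not taken from the post.

from tensorflow.keras.layers import (
    Activation, BatchNormalization, Concatenate, Conv2D, Input
)
from tensorflow.keras.models import Model

def dense_block(x, num_layers=4, growth_rate=32):
    # Each new layer sees the concatenation of all previous feature maps:
    # x_l = f([x_0, x_1, ..., x_{l-1}])
    for _ in range(num_layers):
        y = BatchNormalization()(x)
        y = Activation('relu')(y)
        y = Conv2D(growth_rate, (3, 3), padding='same')(y)
        x = Concatenate()([x, y])
    return x

inputs = Input((32, 32, 64))
outputs = dense_block(inputs)           # channels grow by growth_rate per layer
model = Model(inputs, outputs)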
[CNN] ResNet50 Implementation (Keras)

Implementing the ResNet architecture.

import tensorflow as tf
from tensorflow.keras.models import Model, Sequential
from tensorflow.keras.layers import (
    Activation, BatchNormalization, Dense, Flatten, Add, Input,
    Conv2D, MaxPool2D, GlobalAvgPool2D, Concatenate
)
from tensorflow.keras.regularizers import l2

Defining the building blocks to use:

def _bn_relu(inputs):
    x = BatchNormalization()(inputs)
    x = Activation('relu')(x)
    return x

def _conv(inpu..

2021. 8. 9.
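The _conv helper above is cut off in this preview. As a sketch of the residual idea only (not the post's actual helpers), here is a self-contained bottleneck block built on _bn_relu; the filter sizes and the projection-shortcut rule follow common ResNet50 practice and are assumptions here.

from tensorflow.keras.layers import (
    Activation, Add, BatchNormalization, Conv2D, Input
)
from tensorflow.keras.models import Model

def _bn_relu(inputs):
    x = BatchNormalization()(inputs)
    x = Activation('relu')(x)
    return x

def bottleneck_block(inputs, filters, strides=1):
    # 1x1 -> 3x3 -> 1x1 convolutions, then a shortcut connection
    x = Conv2D(filters, (1, 1), strides=strides)(inputs)
    x = _bn_relu(x)
    x = Conv2D(filters, (3, 3), padding='same')(x)
    x = _bn_relu(x)
    x = Conv2D(filters * 4, (1, 1))(x)
    x = BatchNormalization()(x)

    # Project the shortcut with a 1x1 conv when the shapes differ
    if strides != 1 or inputs.shape[-1] != filters * 4:
        shortcut = Conv2D(filters * 4, (1, 1), strides=strides)(inputs)
        shortcut = BatchNormalization()(shortcut)
    else:
        shortcut = inputs

    return Activation('relu')(Add()([x, shortcut]))

inputs = Input((56, 56, 64))
outputs = bottleneck_block(inputs, 64)
model = Model(inputs, outputs)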
[CNN] GoogLeNet Implementation (Keras)

GoogLeNet: https://arxiv.org/pdf/1409.4842.pdf

Implemented in Keras following the paper above. The paper's unit and num_class are 1024 and 1000, respectively.

Auxiliary Classifier

def AuxiliaryClassifier(inputs, name=None):
    x = AveragePooling2D((5, 5), strides=3, padding='valid')(inputs)
    x = Conv2D(128, (1, 1), strides=1, padding='same', activation='relu',
               kernel_initializer=kernel_initializer,
               bias_initializer=bias_initializer)(x)
    x = Flatten()(..

2021. 8. 6.
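The snippet above is truncated at Flatten. For reference, a self-contained sketch of an auxiliary classifier head following the paper's description (5x5 average pooling with stride 3, a 1x1 conv with 128 filters, a fully connected layer, dropout of 0.7, and a softmax classifier). The initializers are omitted and the input shape is chosen arbitrarily, so this is an assumed sketch rather than the post's code.

from tensorflow.keras.layers import (
    AveragePooling2D, Conv2D, Dense, Dropout, Flatten, Input
)
from tensorflow.keras.models import Model

def auxiliary_classifier(inputs, units=1024, num_class=1000, name=None):
    # 5x5 average pooling with stride 3, as described in the paper
    x = AveragePooling2D((5, 5), strides=3, padding='valid')(inputs)
    x = Conv2D(128, (1, 1), strides=1, padding='same', activation='relu')(x)
    x = Flatten()(x)
    x = Dense(units, activation='relu')(x)
    x = Dropout(0.7)(x)  # dropout ratio from the paper
    return Dense(num_class, activation='softmax', name=name)(x)

inputs = Input((14, 14, 512))            # example intermediate feature map
outputs = auxiliary_classifier(inputs)
model = Model(inputs, outputs)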