Image-to-Image Translation with Conditional Adversarial Networks
Conditional GAN
A conditional GAN is a GAN that generates images according to a condition supplied to the model.
For image translation, an image is fed to both the generator and the discriminator as that condition during training;
the generator then learns to produce an output image that corresponds to the given input image.
This makes it possible to solve tasks such as colorizing a grayscale image or drawing an object from a given outline.
The translation is carried out pixel by pixel.
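Below is a minimal sketch, in Keras, of the objective this conditioning implies: the discriminator scores (condition, image) pairs, and the generator loss combines the adversarial term with an L1 term weighted by LAMBDA. The weight 100 and the helper names (generator_loss, discriminator_loss) are illustrative assumptions, not the paper's exact training code.

import tensorflow as tf

bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)
LAMBDA = 100  # illustrative weight on the L1 term

def generator_loss(disc_fake_output, gen_output, target):
    # Adversarial term: push D toward calling the generated image real.
    adv = bce(tf.ones_like(disc_fake_output), disc_fake_output)
    # L1 term: keep the generated image close to the ground-truth image.
    l1 = tf.reduce_mean(tf.abs(target - gen_output))
    return adv + LAMBDA * l1

def discriminator_loss(disc_real_output, disc_fake_output):
    # D sees (condition, image) pairs: real pairs should score 1, generated pairs 0.
    real = bce(tf.ones_like(disc_real_output), disc_real_output)
    fake = bce(tf.zeros_like(disc_fake_output), disc_fake_output)
    return real + fake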
Generator
The generator is an encoder-decoder with skip connections (a U-Net), a structure that is effective at preserving the context of the input image, as sketched below.
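A minimal sketch of such a context-preserving generator, assuming a U-Net-style encoder-decoder where each decoder stage concatenates the matching encoder feature map; the depth and filter counts are illustrative, not the paper's exact configuration.

import tensorflow as tf
from tensorflow.keras import layers

def tiny_unet_generator(img_size=256):
    # Encoder: downsample while keeping references for skip connections.
    inp = layers.Input(shape=(img_size, img_size, 3))
    e1 = layers.Conv2D(64, 4, strides=2, padding='same', activation=tf.nn.leaky_relu)(inp)   # 128x128
    e2 = layers.Conv2D(128, 4, strides=2, padding='same', activation=tf.nn.leaky_relu)(e1)   # 64x64
    e3 = layers.Conv2D(256, 4, strides=2, padding='same', activation=tf.nn.leaky_relu)(e2)   # 32x32

    # Decoder: upsample and concatenate the matching encoder output,
    # so low-level context (edges, layout) flows straight to the output.
    d1 = layers.Conv2DTranspose(128, 4, strides=2, padding='same', activation='relu')(e3)    # 64x64
    d1 = layers.Concatenate()([d1, e2])
    d2 = layers.Conv2DTranspose(64, 4, strides=2, padding='same', activation='relu')(d1)     # 128x128
    d2 = layers.Concatenate()([d2, e1])
    out = layers.Conv2DTranspose(3, 4, strides=2, padding='same', activation='tanh')(d2)     # 256x256
    return tf.keras.Model(inp, out)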
Discriminator
PatchGAN splits the image into small patches and classifies each patch as real or fake.
In addition to the global structure encouraged by the L1 loss, this helps the image retain finer details.
A side benefit is that, because it operates on regions smaller than the whole image, it has fewer parameters and runs faster.
It can also be generalized and applied to larger images.
PatchGAN discriminator
A 70x70 window is slid over the original image, and the operation produces a 30x30 output map.
Each element of this 30x30 map predicts real/fake for its corresponding region of the input.
Because a convolution only checks whether a feature is present locally and does not account for correlations with the surrounding data,
a single prediction made from the whole image at once is not accurate.
Predictions are therefore made region by region, as in the sketch below.
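A minimal sketch of this patch-wise discriminator, assuming 256x256 inputs: strided convolutions shrink the (condition, target) pair down to a 30x30x1 map of per-patch logits, each with roughly a 70x70 receptive field. Filter counts follow common pix2pix implementations and are assumptions here.

import tensorflow as tf
from tensorflow.keras import layers

def patchgan_discriminator(img_size=256):
    # The condition image and the (real or generated) image are concatenated
    # channel-wise, so D judges the pair rather than the image alone.
    inp = layers.Input(shape=(img_size, img_size, 3), name='condition')
    tar = layers.Input(shape=(img_size, img_size, 3), name='target')
    x = layers.Concatenate()([inp, tar])                        # 256x256x6

    x = layers.Conv2D(64, 4, strides=2, padding='same')(x)      # 128x128
    x = layers.LeakyReLU(0.2)(x)
    x = layers.Conv2D(128, 4, strides=2, padding='same')(x)     # 64x64
    x = layers.LeakyReLU(0.2)(x)
    x = layers.Conv2D(256, 4, strides=2, padding='same')(x)     # 32x32
    x = layers.LeakyReLU(0.2)(x)

    x = layers.ZeroPadding2D()(x)                               # 34x34
    x = layers.Conv2D(512, 4, strides=1)(x)                     # 31x31
    x = layers.LeakyReLU(0.2)(x)
    x = layers.ZeroPadding2D()(x)                               # 33x33
    out = layers.Conv2D(1, 4, strides=1)(x)                     # 30x30x1 per-patch logits
    return tf.keras.Model([inp, tar], out)

# Each of the 30x30 outputs is trained with the adversarial loss separately,
# so real/fake is decided region by region rather than once for the whole image.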
import tensorflow as tf
resnet = tf.keras.applications.ResNet152V2()
resnet.summary()  # the Add layers carry the earlier result forward (residual connections)
Model: "resnet152v2"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_2 (InputLayer) [(None, 224, 224, 3) 0
__________________________________________________________________________________________________
conv1_pad (ZeroPadding2D) (None, 230, 230, 3) 0 input_2[0][0]
__________________________________________________________________________________________________
conv1_conv (Conv2D) (None, 112, 112, 64) 9472 conv1_pad[0][0]
__________________________________________________________________________________________________
pool1_pad (ZeroPadding2D) (None, 114, 114, 64) 0 conv1_conv[0][0]
__________________________________________________________________________________________________
pool1_pool (MaxPooling2D) (None, 56, 56, 64) 0 pool1_pad[0][0]
__________________________________________________________________________________________________
conv2_block1_preact_bn (BatchNo (None, 56, 56, 64) 256 pool1_pool[0][0]
__________________________________________________________________________________________________
conv2_block1_preact_relu (Activ (None, 56, 56, 64) 0 conv2_block1_preact_bn[0][0]
__________________________________________________________________________________________________
conv2_block1_1_conv (Conv2D) (None, 56, 56, 64) 4096 conv2_block1_preact_relu[0][0]
__________________________________________________________________________________________________
conv2_block1_1_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block1_1_conv[0][0]
__________________________________________________________________________________________________
conv2_block1_1_relu (Activation (None, 56, 56, 64) 0 conv2_block1_1_bn[0][0]
__________________________________________________________________________________________________
conv2_block1_2_pad (ZeroPadding (None, 58, 58, 64) 0 conv2_block1_1_relu[0][0]
__________________________________________________________________________________________________
conv2_block1_2_conv (Conv2D) (None, 56, 56, 64) 36864 conv2_block1_2_pad[0][0]
__________________________________________________________________________________________________
conv2_block1_2_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block1_2_conv[0][0]
__________________________________________________________________________________________________
conv2_block1_2_relu (Activation (None, 56, 56, 64) 0 conv2_block1_2_bn[0][0]
__________________________________________________________________________________________________
conv2_block1_0_conv (Conv2D) (None, 56, 56, 256) 16640 conv2_block1_preact_relu[0][0]
__________________________________________________________________________________________________
conv2_block1_3_conv (Conv2D) (None, 56, 56, 256) 16640 conv2_block1_2_relu[0][0]
__________________________________________________________________________________________________
conv2_block1_out (Add) (None, 56, 56, 256) 0 conv2_block1_0_conv[0][0]
conv2_block1_3_conv[0][0]
__________________________________________________________________________________________________
conv2_block2_preact_bn (BatchNo (None, 56, 56, 256) 1024 conv2_block1_out[0][0]
__________________________________________________________________________________________________
conv2_block2_preact_relu (Activ (None, 56, 56, 256) 0 conv2_block2_preact_bn[0][0]
__________________________________________________________________________________________________
conv2_block2_1_conv (Conv2D) (None, 56, 56, 64) 16384 conv2_block2_preact_relu[0][0]
__________________________________________________________________________________________________
conv2_block2_1_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block2_1_conv[0][0]
__________________________________________________________________________________________________
conv2_block2_1_relu (Activation (None, 56, 56, 64) 0 conv2_block2_1_bn[0][0]
__________________________________________________________________________________________________
conv2_block2_2_pad (ZeroPadding (None, 58, 58, 64) 0 conv2_block2_1_relu[0][0]
__________________________________________________________________________________________________
conv2_block2_2_conv (Conv2D) (None, 56, 56, 64) 36864 conv2_block2_2_pad[0][0]
__________________________________________________________________________________________________
conv2_block2_2_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block2_2_conv[0][0]
__________________________________________________________________________________________________
conv2_block2_2_relu (Activation (None, 56, 56, 64) 0 conv2_block2_2_bn[0][0]
__________________________________________________________________________________________________
conv2_block2_3_conv (Conv2D) (None, 56, 56, 256) 16640 conv2_block2_2_relu[0][0]
__________________________________________________________________________________________________
conv2_block2_out (Add) (None, 56, 56, 256) 0 conv2_block1_out[0][0]
conv2_block2_3_conv[0][0]
__________________________________________________________________________________________________
conv2_block3_preact_bn (BatchNo (None, 56, 56, 256) 1024 conv2_block2_out[0][0]
__________________________________________________________________________________________________
conv2_block3_preact_relu (Activ (None, 56, 56, 256) 0 conv2_block3_preact_bn[0][0]
__________________________________________________________________________________________________
conv2_block3_1_conv (Conv2D) (None, 56, 56, 64) 16384 conv2_block3_preact_relu[0][0]
__________________________________________________________________________________________________
conv2_block3_1_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block3_1_conv[0][0]
__________________________________________________________________________________________________
conv2_block3_1_relu (Activation (None, 56, 56, 64) 0 conv2_block3_1_bn[0][0]
__________________________________________________________________________________________________
conv2_block3_2_pad (ZeroPadding (None, 58, 58, 64) 0 conv2_block3_1_relu[0][0]
__________________________________________________________________________________________________
conv2_block3_2_conv (Conv2D) (None, 28, 28, 64) 36864 conv2_block3_2_pad[0][0]
__________________________________________________________________________________________________
conv2_block3_2_bn (BatchNormali (None, 28, 28, 64) 256 conv2_block3_2_conv[0][0]
__________________________________________________________________________________________________
conv2_block3_2_relu (Activation (None, 28, 28, 64) 0 conv2_block3_2_bn[0][0]
__________________________________________________________________________________________________
max_pooling2d_3 (MaxPooling2D) (None, 28, 28, 256) 0 conv2_block2_out[0][0]
__________________________________________________________________________________________________
conv2_block3_3_conv (Conv2D) (None, 28, 28, 256) 16640 conv2_block3_2_relu[0][0]
__________________________________________________________________________________________________
conv2_block3_out (Add) (None, 28, 28, 256) 0 max_pooling2d_3[0][0]
conv2_block3_3_conv[0][0]
__________________________________________________________________________________________________
conv3_block1_preact_bn (BatchNo (None, 28, 28, 256) 1024 conv2_block3_out[0][0]
__________________________________________________________________________________________________
conv3_block1_preact_relu (Activ (None, 28, 28, 256) 0 conv3_block1_preact_bn[0][0]
__________________________________________________________________________________________________
conv3_block1_1_conv (Conv2D) (None, 28, 28, 128) 32768 conv3_block1_preact_relu[0][0]
__________________________________________________________________________________________________
conv3_block1_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block1_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block1_1_relu (Activation (None, 28, 28, 128) 0 conv3_block1_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block1_2_pad (ZeroPadding (None, 30, 30, 128) 0 conv3_block1_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block1_2_conv (Conv2D) (None, 28, 28, 128) 147456 conv3_block1_2_pad[0][0]
__________________________________________________________________________________________________
conv3_block1_2_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block1_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block1_2_relu (Activation (None, 28, 28, 128) 0 conv3_block1_2_bn[0][0]
__________________________________________________________________________________________________
conv3_block1_0_conv (Conv2D) (None, 28, 28, 512) 131584 conv3_block1_preact_relu[0][0]
__________________________________________________________________________________________________
conv3_block1_3_conv (Conv2D) (None, 28, 28, 512) 66048 conv3_block1_2_relu[0][0]
__________________________________________________________________________________________________
conv3_block1_out (Add) (None, 28, 28, 512) 0 conv3_block1_0_conv[0][0]
conv3_block1_3_conv[0][0]
__________________________________________________________________________________________________
conv3_block2_preact_bn (BatchNo (None, 28, 28, 512) 2048 conv3_block1_out[0][0]
__________________________________________________________________________________________________
conv3_block2_preact_relu (Activ (None, 28, 28, 512) 0 conv3_block2_preact_bn[0][0]
__________________________________________________________________________________________________
conv3_block2_1_conv (Conv2D) (None, 28, 28, 128) 65536 conv3_block2_preact_relu[0][0]
__________________________________________________________________________________________________
conv3_block2_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block2_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block2_1_relu (Activation (None, 28, 28, 128) 0 conv3_block2_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block2_2_pad (ZeroPadding (None, 30, 30, 128) 0 conv3_block2_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block2_2_conv (Conv2D) (None, 28, 28, 128) 147456 conv3_block2_2_pad[0][0]
__________________________________________________________________________________________________
conv3_block2_2_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block2_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block2_2_relu (Activation (None, 28, 28, 128) 0 conv3_block2_2_bn[0][0]
__________________________________________________________________________________________________
conv3_block2_3_conv (Conv2D) (None, 28, 28, 512) 66048 conv3_block2_2_relu[0][0]
__________________________________________________________________________________________________
conv3_block2_out (Add) (None, 28, 28, 512) 0 conv3_block1_out[0][0]
conv3_block2_3_conv[0][0]
__________________________________________________________________________________________________
conv3_block3_preact_bn (BatchNo (None, 28, 28, 512) 2048 conv3_block2_out[0][0]
__________________________________________________________________________________________________
conv3_block3_preact_relu (Activ (None, 28, 28, 512) 0 conv3_block3_preact_bn[0][0]
__________________________________________________________________________________________________
conv3_block3_1_conv (Conv2D) (None, 28, 28, 128) 65536 conv3_block3_preact_relu[0][0]
__________________________________________________________________________________________________
conv3_block3_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block3_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block3_1_relu (Activation (None, 28, 28, 128) 0 conv3_block3_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block3_2_pad (ZeroPadding (None, 30, 30, 128) 0 conv3_block3_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block3_2_conv (Conv2D) (None, 28, 28, 128) 147456 conv3_block3_2_pad[0][0]
__________________________________________________________________________________________________
conv3_block3_2_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block3_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block3_2_relu (Activation (None, 28, 28, 128) 0 conv3_block3_2_bn[0][0]
__________________________________________________________________________________________________
conv3_block3_3_conv (Conv2D) (None, 28, 28, 512) 66048 conv3_block3_2_relu[0][0]
__________________________________________________________________________________________________
conv3_block3_out (Add) (None, 28, 28, 512) 0 conv3_block2_out[0][0]
conv3_block3_3_conv[0][0]
__________________________________________________________________________________________________
conv3_block4_preact_bn (BatchNo (None, 28, 28, 512) 2048 conv3_block3_out[0][0]
__________________________________________________________________________________________________
conv3_block4_preact_relu (Activ (None, 28, 28, 512) 0 conv3_block4_preact_bn[0][0]
__________________________________________________________________________________________________
conv3_block4_1_conv (Conv2D) (None, 28, 28, 128) 65536 conv3_block4_preact_relu[0][0]
__________________________________________________________________________________________________
conv3_block4_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block4_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block4_1_relu (Activation (None, 28, 28, 128) 0 conv3_block4_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block4_2_pad (ZeroPadding (None, 30, 30, 128) 0 conv3_block4_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block4_2_conv (Conv2D) (None, 28, 28, 128) 147456 conv3_block4_2_pad[0][0]
__________________________________________________________________________________________________
conv3_block4_2_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block4_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block4_2_relu (Activation (None, 28, 28, 128) 0 conv3_block4_2_bn[0][0]
__________________________________________________________________________________________________
conv3_block4_3_conv (Conv2D) (None, 28, 28, 512) 66048 conv3_block4_2_relu[0][0]
__________________________________________________________________________________________________
conv3_block4_out (Add) (None, 28, 28, 512) 0 conv3_block3_out[0][0]
conv3_block4_3_conv[0][0]
__________________________________________________________________________________________________
conv3_block5_preact_bn (BatchNo (None, 28, 28, 512) 2048 conv3_block4_out[0][0]
__________________________________________________________________________________________________
conv3_block5_preact_relu (Activ (None, 28, 28, 512) 0 conv3_block5_preact_bn[0][0]
__________________________________________________________________________________________________
conv3_block5_1_conv (Conv2D) (None, 28, 28, 128) 65536 conv3_block5_preact_relu[0][0]
__________________________________________________________________________________________________
conv3_block5_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block5_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block5_1_relu (Activation (None, 28, 28, 128) 0 conv3_block5_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block5_2_pad (ZeroPadding (None, 30, 30, 128) 0 conv3_block5_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block5_2_conv (Conv2D) (None, 28, 28, 128) 147456 conv3_block5_2_pad[0][0]
__________________________________________________________________________________________________
conv3_block5_2_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block5_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block5_2_relu (Activation (None, 28, 28, 128) 0 conv3_block5_2_bn[0][0]
__________________________________________________________________________________________________
conv3_block5_3_conv (Conv2D) (None, 28, 28, 512) 66048 conv3_block5_2_relu[0][0]
__________________________________________________________________________________________________
conv3_block5_out (Add) (None, 28, 28, 512) 0 conv3_block4_out[0][0]
conv3_block5_3_conv[0][0]
__________________________________________________________________________________________________
conv3_block6_preact_bn (BatchNo (None, 28, 28, 512) 2048 conv3_block5_out[0][0]
__________________________________________________________________________________________________
conv3_block6_preact_relu (Activ (None, 28, 28, 512) 0 conv3_block6_preact_bn[0][0]
__________________________________________________________________________________________________
conv3_block6_1_conv (Conv2D) (None, 28, 28, 128) 65536 conv3_block6_preact_relu[0][0]
__________________________________________________________________________________________________
conv3_block6_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block6_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block6_1_relu (Activation (None, 28, 28, 128) 0 conv3_block6_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block6_2_pad (ZeroPadding (None, 30, 30, 128) 0 conv3_block6_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block6_2_conv (Conv2D) (None, 28, 28, 128) 147456 conv3_block6_2_pad[0][0]
__________________________________________________________________________________________________
conv3_block6_2_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block6_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block6_2_relu (Activation (None, 28, 28, 128) 0 conv3_block6_2_bn[0][0]
__________________________________________________________________________________________________
conv3_block6_3_conv (Conv2D) (None, 28, 28, 512) 66048 conv3_block6_2_relu[0][0]
__________________________________________________________________________________________________
conv3_block6_out (Add) (None, 28, 28, 512) 0 conv3_block5_out[0][0]
conv3_block6_3_conv[0][0]
__________________________________________________________________________________________________
conv3_block7_preact_bn (BatchNo (None, 28, 28, 512) 2048 conv3_block6_out[0][0]
__________________________________________________________________________________________________
conv3_block7_preact_relu (Activ (None, 28, 28, 512) 0 conv3_block7_preact_bn[0][0]
__________________________________________________________________________________________________
conv3_block7_1_conv (Conv2D) (None, 28, 28, 128) 65536 conv3_block7_preact_relu[0][0]
__________________________________________________________________________________________________
conv3_block7_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block7_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block7_1_relu (Activation (None, 28, 28, 128) 0 conv3_block7_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block7_2_pad (ZeroPadding (None, 30, 30, 128) 0 conv3_block7_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block7_2_conv (Conv2D) (None, 28, 28, 128) 147456 conv3_block7_2_pad[0][0]
__________________________________________________________________________________________________
conv3_block7_2_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block7_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block7_2_relu (Activation (None, 28, 28, 128) 0 conv3_block7_2_bn[0][0]
__________________________________________________________________________________________________
conv3_block7_3_conv (Conv2D) (None, 28, 28, 512) 66048 conv3_block7_2_relu[0][0]
__________________________________________________________________________________________________
conv3_block7_out (Add) (None, 28, 28, 512) 0 conv3_block6_out[0][0]
conv3_block7_3_conv[0][0]
__________________________________________________________________________________________________
conv3_block8_preact_bn (BatchNo (None, 28, 28, 512) 2048 conv3_block7_out[0][0]
__________________________________________________________________________________________________
conv3_block8_preact_relu (Activ (None, 28, 28, 512) 0 conv3_block8_preact_bn[0][0]
__________________________________________________________________________________________________
conv3_block8_1_conv (Conv2D) (None, 28, 28, 128) 65536 conv3_block8_preact_relu[0][0]
__________________________________________________________________________________________________
conv3_block8_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block8_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block8_1_relu (Activation (None, 28, 28, 128) 0 conv3_block8_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block8_2_pad (ZeroPadding (None, 30, 30, 128) 0 conv3_block8_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block8_2_conv (Conv2D) (None, 14, 14, 128) 147456 conv3_block8_2_pad[0][0]
__________________________________________________________________________________________________
conv3_block8_2_bn (BatchNormali (None, 14, 14, 128) 512 conv3_block8_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block8_2_relu (Activation (None, 14, 14, 128) 0 conv3_block8_2_bn[0][0]
__________________________________________________________________________________________________
max_pooling2d_4 (MaxPooling2D) (None, 14, 14, 512) 0 conv3_block7_out[0][0]
__________________________________________________________________________________________________
conv3_block8_3_conv (Conv2D) (None, 14, 14, 512) 66048 conv3_block8_2_relu[0][0]
__________________________________________________________________________________________________
conv3_block8_out (Add) (None, 14, 14, 512) 0 max_pooling2d_4[0][0]
conv3_block8_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block1_preact_bn (BatchNo (None, 14, 14, 512) 2048 conv3_block8_out[0][0]
__________________________________________________________________________________________________
conv4_block1_preact_relu (Activ (None, 14, 14, 512) 0 conv4_block1_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block1_1_conv (Conv2D) (None, 14, 14, 256) 131072 conv4_block1_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block1_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block1_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block1_1_relu (Activation (None, 14, 14, 256) 0 conv4_block1_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block1_2_pad (ZeroPadding (None, 16, 16, 256) 0 conv4_block1_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block1_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block1_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block1_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block1_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block1_2_relu (Activation (None, 14, 14, 256) 0 conv4_block1_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block1_0_conv (Conv2D) (None, 14, 14, 1024) 525312 conv4_block1_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block1_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block1_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block1_out (Add) (None, 14, 14, 1024) 0 conv4_block1_0_conv[0][0]
conv4_block1_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block2_preact_bn (BatchNo (None, 14, 14, 1024) 4096 conv4_block1_out[0][0]
__________________________________________________________________________________________________
conv4_block2_preact_relu (Activ (None, 14, 14, 1024) 0 conv4_block2_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block2_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block2_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block2_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block2_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block2_1_relu (Activation (None, 14, 14, 256) 0 conv4_block2_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block2_2_pad (ZeroPadding (None, 16, 16, 256) 0 conv4_block2_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block2_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block2_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block2_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block2_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block2_2_relu (Activation (None, 14, 14, 256) 0 conv4_block2_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block2_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block2_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block2_out (Add) (None, 14, 14, 1024) 0 conv4_block1_out[0][0]
conv4_block2_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block3_preact_bn (BatchNo (None, 14, 14, 1024) 4096 conv4_block2_out[0][0]
__________________________________________________________________________________________________
conv4_block3_preact_relu (Activ (None, 14, 14, 1024) 0 conv4_block3_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block3_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block3_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block3_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block3_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block3_1_relu (Activation (None, 14, 14, 256) 0 conv4_block3_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block3_2_pad (ZeroPadding (None, 16, 16, 256) 0 conv4_block3_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block3_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block3_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block3_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block3_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block3_2_relu (Activation (None, 14, 14, 256) 0 conv4_block3_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block3_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block3_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block3_out (Add) (None, 14, 14, 1024) 0 conv4_block2_out[0][0]
conv4_block3_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block4_preact_bn (BatchNo (None, 14, 14, 1024) 4096 conv4_block3_out[0][0]
__________________________________________________________________________________________________
conv4_block4_preact_relu (Activ (None, 14, 14, 1024) 0 conv4_block4_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block4_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block4_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block4_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block4_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block4_1_relu (Activation (None, 14, 14, 256) 0 conv4_block4_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block4_2_pad (ZeroPadding (None, 16, 16, 256) 0 conv4_block4_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block4_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block4_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block4_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block4_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block4_2_relu (Activation (None, 14, 14, 256) 0 conv4_block4_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block4_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block4_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block4_out (Add) (None, 14, 14, 1024) 0 conv4_block3_out[0][0]
conv4_block4_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block5_preact_bn (BatchNo (None, 14, 14, 1024) 4096 conv4_block4_out[0][0]
__________________________________________________________________________________________________
conv4_block5_preact_relu (Activ (None, 14, 14, 1024) 0 conv4_block5_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block5_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block5_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block5_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block5_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block5_1_relu (Activation (None, 14, 14, 256) 0 conv4_block5_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block5_2_pad (ZeroPadding (None, 16, 16, 256) 0 conv4_block5_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block5_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block5_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block5_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block5_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block5_2_relu (Activation (None, 14, 14, 256) 0 conv4_block5_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block5_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block5_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block5_out (Add) (None, 14, 14, 1024) 0 conv4_block4_out[0][0]
conv4_block5_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block6_preact_bn (BatchNo (None, 14, 14, 1024) 4096 conv4_block5_out[0][0]
__________________________________________________________________________________________________
conv4_block6_preact_relu (Activ (None, 14, 14, 1024) 0 conv4_block6_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block6_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block6_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block6_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block6_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block6_1_relu (Activation (None, 14, 14, 256) 0 conv4_block6_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block6_2_pad (ZeroPadding (None, 16, 16, 256) 0 conv4_block6_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block6_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block6_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block6_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block6_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block6_2_relu (Activation (None, 14, 14, 256) 0 conv4_block6_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block6_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block6_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block6_out (Add) (None, 14, 14, 1024) 0 conv4_block5_out[0][0]
conv4_block6_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block7_preact_bn (BatchNo (None, 14, 14, 1024) 4096 conv4_block6_out[0][0]
__________________________________________________________________________________________________
conv4_block7_preact_relu (Activ (None, 14, 14, 1024) 0 conv4_block7_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block7_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block7_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block7_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block7_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block7_1_relu (Activation (None, 14, 14, 256) 0 conv4_block7_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block7_2_pad (ZeroPadding (None, 16, 16, 256) 0 conv4_block7_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block7_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block7_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block7_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block7_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block7_2_relu (Activation (None, 14, 14, 256) 0 conv4_block7_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block7_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block7_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block7_out (Add) (None, 14, 14, 1024) 0 conv4_block6_out[0][0]
conv4_block7_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block8_preact_bn (BatchNo (None, 14, 14, 1024) 4096 conv4_block7_out[0][0]
__________________________________________________________________________________________________
conv4_block8_preact_relu (Activ (None, 14, 14, 1024) 0 conv4_block8_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block8_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block8_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block8_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block8_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block8_1_relu (Activation (None, 14, 14, 256) 0 conv4_block8_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block8_2_pad (ZeroPadding (None, 16, 16, 256) 0 conv4_block8_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block8_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block8_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block8_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block8_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block8_2_relu (Activation (None, 14, 14, 256) 0 conv4_block8_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block8_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block8_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block8_out (Add) (None, 14, 14, 1024) 0 conv4_block7_out[0][0]
conv4_block8_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block9_preact_bn (BatchNo (None, 14, 14, 1024) 4096 conv4_block8_out[0][0]
__________________________________________________________________________________________________
conv4_block9_preact_relu (Activ (None, 14, 14, 1024) 0 conv4_block9_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block9_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block9_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block9_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block9_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block9_1_relu (Activation (None, 14, 14, 256) 0 conv4_block9_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block9_2_pad (ZeroPadding (None, 16, 16, 256) 0 conv4_block9_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block9_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block9_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block9_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block9_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block9_2_relu (Activation (None, 14, 14, 256) 0 conv4_block9_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block9_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block9_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block9_out (Add) (None, 14, 14, 1024) 0 conv4_block8_out[0][0]
conv4_block9_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block10_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block9_out[0][0]
__________________________________________________________________________________________________
conv4_block10_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block10_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block10_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block10_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block10_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block10_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block10_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block10_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block10_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block10_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block10_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block10_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block10_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block10_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block10_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block10_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block10_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block10_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block10_out (Add) (None, 14, 14, 1024) 0 conv4_block9_out[0][0]
conv4_block10_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block11_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block10_out[0][0]
__________________________________________________________________________________________________
conv4_block11_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block11_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block11_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block11_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block11_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block11_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block11_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block11_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block11_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block11_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block11_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block11_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block11_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block11_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block11_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block11_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block11_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block11_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block11_out (Add) (None, 14, 14, 1024) 0 conv4_block10_out[0][0]
conv4_block11_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block12_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block11_out[0][0]
__________________________________________________________________________________________________
conv4_block12_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block12_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block12_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block12_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block12_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block12_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block12_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block12_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block12_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block12_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block12_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block12_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block12_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block12_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block12_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block12_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block12_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block12_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block12_out (Add) (None, 14, 14, 1024) 0 conv4_block11_out[0][0]
conv4_block12_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block13_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block12_out[0][0]
__________________________________________________________________________________________________
conv4_block13_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block13_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block13_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block13_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block13_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block13_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block13_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block13_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block13_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block13_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block13_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block13_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block13_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block13_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block13_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block13_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block13_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block13_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block13_out (Add) (None, 14, 14, 1024) 0 conv4_block12_out[0][0]
conv4_block13_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block14_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block13_out[0][0]
__________________________________________________________________________________________________
conv4_block14_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block14_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block14_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block14_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block14_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block14_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block14_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block14_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block14_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block14_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block14_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block14_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block14_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block14_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block14_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block14_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block14_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block14_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block14_out (Add) (None, 14, 14, 1024) 0 conv4_block13_out[0][0]
conv4_block14_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block15_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block14_out[0][0]
__________________________________________________________________________________________________
conv4_block15_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block15_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block15_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block15_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block15_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block15_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block15_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block15_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block15_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block15_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block15_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block15_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block15_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block15_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block15_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block15_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block15_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block15_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block15_out (Add) (None, 14, 14, 1024) 0 conv4_block14_out[0][0]
conv4_block15_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block16_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block15_out[0][0]
__________________________________________________________________________________________________
conv4_block16_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block16_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block16_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block16_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block16_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block16_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block16_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block16_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block16_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block16_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block16_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block16_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block16_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block16_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block16_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block16_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block16_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block16_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block16_out (Add) (None, 14, 14, 1024) 0 conv4_block15_out[0][0]
conv4_block16_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block17_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block16_out[0][0]
__________________________________________________________________________________________________
conv4_block17_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block17_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block17_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block17_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block17_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block17_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block17_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block17_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block17_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block17_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block17_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block17_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block17_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block17_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block17_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block17_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block17_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block17_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block17_out (Add) (None, 14, 14, 1024) 0 conv4_block16_out[0][0]
conv4_block17_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block18_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block17_out[0][0]
__________________________________________________________________________________________________
conv4_block18_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block18_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block18_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block18_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block18_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block18_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block18_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block18_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block18_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block18_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block18_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block18_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block18_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block18_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block18_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block18_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block18_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block18_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block18_out (Add) (None, 14, 14, 1024) 0 conv4_block17_out[0][0]
conv4_block18_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block19_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block18_out[0][0]
__________________________________________________________________________________________________
conv4_block19_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block19_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block19_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block19_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block19_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block19_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block19_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block19_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block19_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block19_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block19_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block19_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block19_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block19_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block19_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block19_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block19_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block19_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block19_out (Add) (None, 14, 14, 1024) 0 conv4_block18_out[0][0]
conv4_block19_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block20_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block19_out[0][0]
__________________________________________________________________________________________________
conv4_block20_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block20_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block20_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block20_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block20_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block20_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block20_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block20_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block20_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block20_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block20_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block20_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block20_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block20_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block20_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block20_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block20_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block20_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block20_out (Add) (None, 14, 14, 1024) 0 conv4_block19_out[0][0]
conv4_block20_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block21_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block20_out[0][0]
__________________________________________________________________________________________________
conv4_block21_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block21_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block21_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block21_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block21_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block21_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block21_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block21_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block21_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block21_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block21_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block21_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block21_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block21_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block21_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block21_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block21_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block21_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block21_out (Add) (None, 14, 14, 1024) 0 conv4_block20_out[0][0]
conv4_block21_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block22_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block21_out[0][0]
__________________________________________________________________________________________________
conv4_block22_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block22_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block22_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block22_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block22_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block22_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block22_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block22_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block22_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block22_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block22_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block22_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block22_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block22_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block22_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block22_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block22_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block22_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block22_out (Add) (None, 14, 14, 1024) 0 conv4_block21_out[0][0]
conv4_block22_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block23_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block22_out[0][0]
__________________________________________________________________________________________________
conv4_block23_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block23_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block23_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block23_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block23_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block23_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block23_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block23_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block23_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block23_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block23_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block23_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block23_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block23_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block23_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block23_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block23_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block23_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block23_out (Add) (None, 14, 14, 1024) 0 conv4_block22_out[0][0]
conv4_block23_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block24_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block23_out[0][0]
__________________________________________________________________________________________________
conv4_block24_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block24_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block24_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block24_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block24_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block24_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block24_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block24_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block24_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block24_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block24_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block24_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block24_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block24_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block24_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block24_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block24_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block24_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block24_out (Add) (None, 14, 14, 1024) 0 conv4_block23_out[0][0]
conv4_block24_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block25_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block24_out[0][0]
__________________________________________________________________________________________________
conv4_block25_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block25_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block25_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block25_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block25_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block25_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block25_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block25_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block25_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block25_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block25_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block25_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block25_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block25_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block25_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block25_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block25_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block25_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block25_out (Add) (None, 14, 14, 1024) 0 conv4_block24_out[0][0]
conv4_block25_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block26_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block25_out[0][0]
__________________________________________________________________________________________________
conv4_block26_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block26_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block26_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block26_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block26_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block26_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block26_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block26_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block26_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block26_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block26_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block26_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block26_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block26_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block26_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block26_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block26_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block26_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block26_out (Add) (None, 14, 14, 1024) 0 conv4_block25_out[0][0]
conv4_block26_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block27_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block26_out[0][0]
__________________________________________________________________________________________________
conv4_block27_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block27_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block27_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block27_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block27_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block27_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block27_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block27_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block27_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block27_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block27_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block27_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block27_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block27_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block27_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block27_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block27_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block27_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block27_out (Add) (None, 14, 14, 1024) 0 conv4_block26_out[0][0]
conv4_block27_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block28_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block27_out[0][0]
__________________________________________________________________________________________________
conv4_block28_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block28_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block28_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block28_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block28_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block28_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block28_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block28_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block28_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block28_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block28_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block28_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block28_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block28_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block28_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block28_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block28_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block28_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block28_out (Add) (None, 14, 14, 1024) 0 conv4_block27_out[0][0]
conv4_block28_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block29_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block28_out[0][0]
__________________________________________________________________________________________________
conv4_block29_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block29_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block29_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block29_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block29_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block29_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block29_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block29_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block29_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block29_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block29_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block29_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block29_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block29_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block29_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block29_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block29_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block29_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block29_out (Add) (None, 14, 14, 1024) 0 conv4_block28_out[0][0]
conv4_block29_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block30_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block29_out[0][0]
__________________________________________________________________________________________________
conv4_block30_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block30_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block30_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block30_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block30_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block30_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block30_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block30_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block30_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block30_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block30_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block30_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block30_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block30_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block30_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block30_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block30_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block30_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block30_out (Add) (None, 14, 14, 1024) 0 conv4_block29_out[0][0]
conv4_block30_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block31_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block30_out[0][0]
__________________________________________________________________________________________________
conv4_block31_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block31_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block31_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block31_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block31_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block31_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block31_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block31_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block31_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block31_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block31_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block31_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block31_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block31_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block31_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block31_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block31_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block31_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block31_out (Add) (None, 14, 14, 1024) 0 conv4_block30_out[0][0]
conv4_block31_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block32_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block31_out[0][0]
__________________________________________________________________________________________________
conv4_block32_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block32_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block32_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block32_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block32_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block32_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block32_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block32_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block32_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block32_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block32_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block32_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block32_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block32_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block32_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block32_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block32_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block32_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block32_out (Add) (None, 14, 14, 1024) 0 conv4_block31_out[0][0]
conv4_block32_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block33_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block32_out[0][0]
__________________________________________________________________________________________________
conv4_block33_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block33_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block33_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block33_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block33_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block33_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block33_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block33_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block33_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block33_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block33_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block33_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block33_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block33_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block33_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block33_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block33_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block33_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block33_out (Add) (None, 14, 14, 1024) 0 conv4_block32_out[0][0]
conv4_block33_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block34_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block33_out[0][0]
__________________________________________________________________________________________________
conv4_block34_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block34_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block34_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block34_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block34_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block34_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block34_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block34_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block34_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block34_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block34_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block34_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block34_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block34_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block34_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block34_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block34_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block34_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block34_out (Add) (None, 14, 14, 1024) 0 conv4_block33_out[0][0]
conv4_block34_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block35_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block34_out[0][0]
__________________________________________________________________________________________________
conv4_block35_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block35_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block35_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block35_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block35_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block35_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block35_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block35_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block35_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block35_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block35_2_conv (Conv2D) (None, 14, 14, 256) 589824 conv4_block35_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block35_2_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block35_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block35_2_relu (Activatio (None, 14, 14, 256) 0 conv4_block35_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block35_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block35_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block35_out (Add) (None, 14, 14, 1024) 0 conv4_block34_out[0][0]
conv4_block35_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block36_preact_bn (BatchN (None, 14, 14, 1024) 4096 conv4_block35_out[0][0]
__________________________________________________________________________________________________
conv4_block36_preact_relu (Acti (None, 14, 14, 1024) 0 conv4_block36_preact_bn[0][0]
__________________________________________________________________________________________________
conv4_block36_1_conv (Conv2D) (None, 14, 14, 256) 262144 conv4_block36_preact_relu[0][0]
__________________________________________________________________________________________________
conv4_block36_1_bn (BatchNormal (None, 14, 14, 256) 1024 conv4_block36_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block36_1_relu (Activatio (None, 14, 14, 256) 0 conv4_block36_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block36_2_pad (ZeroPaddin (None, 16, 16, 256) 0 conv4_block36_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block36_2_conv (Conv2D) (None, 7, 7, 256) 589824 conv4_block36_2_pad[0][0]
__________________________________________________________________________________________________
conv4_block36_2_bn (BatchNormal (None, 7, 7, 256) 1024 conv4_block36_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block36_2_relu (Activatio (None, 7, 7, 256) 0 conv4_block36_2_bn[0][0]
__________________________________________________________________________________________________
max_pooling2d_5 (MaxPooling2D) (None, 7, 7, 1024) 0 conv4_block35_out[0][0]
__________________________________________________________________________________________________
conv4_block36_3_conv (Conv2D) (None, 7, 7, 1024) 263168 conv4_block36_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block36_out (Add) (None, 7, 7, 1024) 0 max_pooling2d_5[0][0]
conv4_block36_3_conv[0][0]
__________________________________________________________________________________________________
conv5_block1_preact_bn (BatchNo (None, 7, 7, 1024) 4096 conv4_block36_out[0][0]
__________________________________________________________________________________________________
conv5_block1_preact_relu (Activ (None, 7, 7, 1024) 0 conv5_block1_preact_bn[0][0]
__________________________________________________________________________________________________
conv5_block1_1_conv (Conv2D) (None, 7, 7, 512) 524288 conv5_block1_preact_relu[0][0]
__________________________________________________________________________________________________
conv5_block1_1_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block1_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block1_1_relu (Activation (None, 7, 7, 512) 0 conv5_block1_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block1_2_pad (ZeroPadding (None, 9, 9, 512) 0 conv5_block1_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block1_2_conv (Conv2D) (None, 7, 7, 512) 2359296 conv5_block1_2_pad[0][0]
__________________________________________________________________________________________________
conv5_block1_2_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block1_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block1_2_relu (Activation (None, 7, 7, 512) 0 conv5_block1_2_bn[0][0]
__________________________________________________________________________________________________
conv5_block1_0_conv (Conv2D) (None, 7, 7, 2048) 2099200 conv5_block1_preact_relu[0][0]
__________________________________________________________________________________________________
conv5_block1_3_conv (Conv2D) (None, 7, 7, 2048) 1050624 conv5_block1_2_relu[0][0]
__________________________________________________________________________________________________
conv5_block1_out (Add) (None, 7, 7, 2048) 0 conv5_block1_0_conv[0][0]
conv5_block1_3_conv[0][0]
__________________________________________________________________________________________________
conv5_block2_preact_bn (BatchNo (None, 7, 7, 2048) 8192 conv5_block1_out[0][0]
__________________________________________________________________________________________________
conv5_block2_preact_relu (Activ (None, 7, 7, 2048) 0 conv5_block2_preact_bn[0][0]
__________________________________________________________________________________________________
conv5_block2_1_conv (Conv2D) (None, 7, 7, 512) 1048576 conv5_block2_preact_relu[0][0]
__________________________________________________________________________________________________
conv5_block2_1_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block2_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block2_1_relu (Activation (None, 7, 7, 512) 0 conv5_block2_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block2_2_pad (ZeroPadding (None, 9, 9, 512) 0 conv5_block2_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block2_2_conv (Conv2D) (None, 7, 7, 512) 2359296 conv5_block2_2_pad[0][0]
__________________________________________________________________________________________________
conv5_block2_2_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block2_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block2_2_relu (Activation (None, 7, 7, 512) 0 conv5_block2_2_bn[0][0]
__________________________________________________________________________________________________
conv5_block2_3_conv (Conv2D) (None, 7, 7, 2048) 1050624 conv5_block2_2_relu[0][0]
__________________________________________________________________________________________________
conv5_block2_out (Add) (None, 7, 7, 2048) 0 conv5_block1_out[0][0]
conv5_block2_3_conv[0][0]
__________________________________________________________________________________________________
conv5_block3_preact_bn (BatchNo (None, 7, 7, 2048) 8192 conv5_block2_out[0][0]
__________________________________________________________________________________________________
conv5_block3_preact_relu (Activ (None, 7, 7, 2048) 0 conv5_block3_preact_bn[0][0]
__________________________________________________________________________________________________
conv5_block3_1_conv (Conv2D) (None, 7, 7, 512) 1048576 conv5_block3_preact_relu[0][0]
__________________________________________________________________________________________________
conv5_block3_1_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block3_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block3_1_relu (Activation (None, 7, 7, 512) 0 conv5_block3_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block3_2_pad (ZeroPadding (None, 9, 9, 512) 0 conv5_block3_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block3_2_conv (Conv2D) (None, 7, 7, 512) 2359296 conv5_block3_2_pad[0][0]
__________________________________________________________________________________________________
conv5_block3_2_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block3_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block3_2_relu (Activation (None, 7, 7, 512) 0 conv5_block3_2_bn[0][0]
__________________________________________________________________________________________________
conv5_block3_3_conv (Conv2D) (None, 7, 7, 2048) 1050624 conv5_block3_2_relu[0][0]
__________________________________________________________________________________________________
conv5_block3_out (Add) (None, 7, 7, 2048) 0 conv5_block2_out[0][0]
conv5_block3_3_conv[0][0]
__________________________________________________________________________________________________
post_bn (BatchNormalization) (None, 7, 7, 2048) 8192 conv5_block3_out[0][0]
__________________________________________________________________________________________________
post_relu (Activation) (None, 7, 7, 2048) 0 post_bn[0][0]
__________________________________________________________________________________________________
avg_pool (GlobalAveragePooling2 (None, 2048) 0 post_relu[0][0]
__________________________________________________________________________________________________
predictions (Dense) (None, 1000) 2049000 avg_pool[0][0]
==================================================================================================
Total params: 60,380,648
Trainable params: 60,236,904
Non-trainable params: 143,744
__________________________________________________________________________________________________
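The summary above repeats one pattern over and over: preact_bn → preact_relu → 1x1 conv → 3x3 conv → 1x1 conv, with an Add layer joining the block input back onto the result, which is how the residual ("결과를 유지") path works. Below is a minimal sketch of that pre-activation bottleneck block, reconstructed only from the layer ordering and shapes visible in the summary; details such as epsilon, initializers, strides and layer names are simplified and may differ from the actual keras.applications implementation.

import tensorflow as tf
from tensorflow.keras import layers

def preact_bottleneck_block(x, filters=256):
    """Sketch of one ResNetV2-style block as it appears in the summary above."""
    shortcut = x
    y = layers.BatchNormalization()(x)                 # convN_blockM_preact_bn
    y = layers.Activation("relu")(y)                   # convN_blockM_preact_relu
    y = layers.Conv2D(filters, 1, use_bias=False)(y)   # _1_conv: 1x1, channel reduction
    y = layers.BatchNormalization()(y)
    y = layers.Activation("relu")(y)
    y = layers.ZeroPadding2D(1)(y)                     # _2_pad
    y = layers.Conv2D(filters, 3, use_bias=False)(y)   # _2_conv: 3x3
    y = layers.BatchNormalization()(y)
    y = layers.Activation("relu")(y)
    y = layers.Conv2D(4 * filters, 1)(y)               # _3_conv: 1x1, channel expansion
    return layers.Add()([shortcut, y])                 # _out: input is added back (skip connection)

inputs = tf.keras.Input(shape=(14, 14, 1024))
tf.keras.Model(inputs, preact_bottleneck_block(inputs)).summary()

With these choices the parameter counts come out the same as in the printed summary (262,144 for the 1x1 reduction, 589,824 for the 3x3, 263,168 for the 1x1 expansion, 4,096 for the preact BatchNormalization), so the sketch matches the structure being shown even if it is not the library code itself.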
inception = tf.keras.applications.InceptionV3()
inception.summary() # parallel branches are merged with Concatenate layers (mixed0, mixed1, ...)
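In the summary that follows, several convolution and pooling branches all take the same input and are then joined by a Concatenate layer (mixed0, mixed1, ...), i.e. stacked along the channel axis rather than added as in ResNet. Here is a minimal sketch of such an Inception-style module, with branch widths copied from the mixed0 block below; it folds the activation into Conv2D for brevity, whereas the real keras.applications model uses bias-free Conv2D → BatchNormalization → Activation.

import tensorflow as tf
from tensorflow.keras import layers

def inception_module(x):
    """Sketch of an Inception-style module merged with Concatenate (cf. mixed0)."""
    b1 = layers.Conv2D(64, 1, padding="same", activation="relu")(x)    # 1x1 branch
    b2 = layers.Conv2D(48, 1, padding="same", activation="relu")(x)    # 1x1 -> 5x5 branch
    b2 = layers.Conv2D(64, 5, padding="same", activation="relu")(b2)
    b3 = layers.Conv2D(64, 1, padding="same", activation="relu")(x)    # 1x1 -> 3x3 -> 3x3 branch
    b3 = layers.Conv2D(96, 3, padding="same", activation="relu")(b3)
    b3 = layers.Conv2D(96, 3, padding="same", activation="relu")(b3)
    b4 = layers.AveragePooling2D(3, strides=1, padding="same")(x)      # pooling branch
    b4 = layers.Conv2D(32, 1, padding="same", activation="relu")(b4)
    # Channel-wise stacking: output width is the sum of branch widths (64+64+96+32 = 256)
    return layers.Concatenate()([b1, b2, b3, b4])

inputs = tf.keras.Input(shape=(35, 35, 192))
tf.keras.Model(inputs, inception_module(inputs)).summary()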
Model: "inception_v3"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_3 (InputLayer) [(None, 299, 299, 3) 0
__________________________________________________________________________________________________
conv2d (Conv2D) (None, 149, 149, 32) 864 input_3[0][0]
__________________________________________________________________________________________________
batch_normalization (BatchNorma (None, 149, 149, 32) 96 conv2d[0][0]
__________________________________________________________________________________________________
activation (Activation) (None, 149, 149, 32) 0 batch_normalization[0][0]
__________________________________________________________________________________________________
conv2d_1 (Conv2D) (None, 147, 147, 32) 9216 activation[0][0]
__________________________________________________________________________________________________
batch_normalization_1 (BatchNor (None, 147, 147, 32) 96 conv2d_1[0][0]
__________________________________________________________________________________________________
activation_1 (Activation) (None, 147, 147, 32) 0 batch_normalization_1[0][0]
__________________________________________________________________________________________________
conv2d_2 (Conv2D) (None, 147, 147, 64) 18432 activation_1[0][0]
__________________________________________________________________________________________________
batch_normalization_2 (BatchNor (None, 147, 147, 64) 192 conv2d_2[0][0]
__________________________________________________________________________________________________
activation_2 (Activation) (None, 147, 147, 64) 0 batch_normalization_2[0][0]
__________________________________________________________________________________________________
max_pooling2d_6 (MaxPooling2D) (None, 73, 73, 64) 0 activation_2[0][0]
__________________________________________________________________________________________________
conv2d_3 (Conv2D) (None, 73, 73, 80) 5120 max_pooling2d_6[0][0]
__________________________________________________________________________________________________
batch_normalization_3 (BatchNor (None, 73, 73, 80) 240 conv2d_3[0][0]
__________________________________________________________________________________________________
activation_3 (Activation) (None, 73, 73, 80) 0 batch_normalization_3[0][0]
__________________________________________________________________________________________________
conv2d_4 (Conv2D) (None, 71, 71, 192) 138240 activation_3[0][0]
__________________________________________________________________________________________________
batch_normalization_4 (BatchNor (None, 71, 71, 192) 576 conv2d_4[0][0]
__________________________________________________________________________________________________
activation_4 (Activation) (None, 71, 71, 192) 0 batch_normalization_4[0][0]
__________________________________________________________________________________________________
max_pooling2d_7 (MaxPooling2D) (None, 35, 35, 192) 0 activation_4[0][0]
__________________________________________________________________________________________________
conv2d_8 (Conv2D) (None, 35, 35, 64) 12288 max_pooling2d_7[0][0]
__________________________________________________________________________________________________
batch_normalization_8 (BatchNor (None, 35, 35, 64) 192 conv2d_8[0][0]
__________________________________________________________________________________________________
activation_8 (Activation) (None, 35, 35, 64) 0 batch_normalization_8[0][0]
__________________________________________________________________________________________________
conv2d_6 (Conv2D) (None, 35, 35, 48) 9216 max_pooling2d_7[0][0]
__________________________________________________________________________________________________
conv2d_9 (Conv2D) (None, 35, 35, 96) 55296 activation_8[0][0]
__________________________________________________________________________________________________
batch_normalization_6 (BatchNor (None, 35, 35, 48) 144 conv2d_6[0][0]
__________________________________________________________________________________________________
batch_normalization_9 (BatchNor (None, 35, 35, 96) 288 conv2d_9[0][0]
__________________________________________________________________________________________________
activation_6 (Activation) (None, 35, 35, 48) 0 batch_normalization_6[0][0]
__________________________________________________________________________________________________
activation_9 (Activation) (None, 35, 35, 96) 0 batch_normalization_9[0][0]
__________________________________________________________________________________________________
average_pooling2d (AveragePooli (None, 35, 35, 192) 0 max_pooling2d_7[0][0]
__________________________________________________________________________________________________
conv2d_5 (Conv2D) (None, 35, 35, 64) 12288 max_pooling2d_7[0][0]
__________________________________________________________________________________________________
conv2d_7 (Conv2D) (None, 35, 35, 64) 76800 activation_6[0][0]
__________________________________________________________________________________________________
conv2d_10 (Conv2D) (None, 35, 35, 96) 82944 activation_9[0][0]
__________________________________________________________________________________________________
conv2d_11 (Conv2D) (None, 35, 35, 32) 6144 average_pooling2d[0][0]
__________________________________________________________________________________________________
batch_normalization_5 (BatchNor (None, 35, 35, 64) 192 conv2d_5[0][0]
__________________________________________________________________________________________________
batch_normalization_7 (BatchNor (None, 35, 35, 64) 192 conv2d_7[0][0]
__________________________________________________________________________________________________
batch_normalization_10 (BatchNo (None, 35, 35, 96) 288 conv2d_10[0][0]
__________________________________________________________________________________________________
batch_normalization_11 (BatchNo (None, 35, 35, 32) 96 conv2d_11[0][0]
__________________________________________________________________________________________________
activation_5 (Activation) (None, 35, 35, 64) 0 batch_normalization_5[0][0]
__________________________________________________________________________________________________
activation_7 (Activation) (None, 35, 35, 64) 0 batch_normalization_7[0][0]
__________________________________________________________________________________________________
activation_10 (Activation) (None, 35, 35, 96) 0 batch_normalization_10[0][0]
__________________________________________________________________________________________________
activation_11 (Activation) (None, 35, 35, 32) 0 batch_normalization_11[0][0]
__________________________________________________________________________________________________
mixed0 (Concatenate) (None, 35, 35, 256) 0 activation_5[0][0]
activation_7[0][0]
activation_10[0][0]
activation_11[0][0]
__________________________________________________________________________________________________
conv2d_15 (Conv2D) (None, 35, 35, 64) 16384 mixed0[0][0]
__________________________________________________________________________________________________
batch_normalization_15 (BatchNo (None, 35, 35, 64) 192 conv2d_15[0][0]
__________________________________________________________________________________________________
activation_15 (Activation) (None, 35, 35, 64) 0 batch_normalization_15[0][0]
__________________________________________________________________________________________________
conv2d_13 (Conv2D) (None, 35, 35, 48) 12288 mixed0[0][0]
__________________________________________________________________________________________________
conv2d_16 (Conv2D) (None, 35, 35, 96) 55296 activation_15[0][0]
__________________________________________________________________________________________________
batch_normalization_13 (BatchNo (None, 35, 35, 48) 144 conv2d_13[0][0]
__________________________________________________________________________________________________
batch_normalization_16 (BatchNo (None, 35, 35, 96) 288 conv2d_16[0][0]
__________________________________________________________________________________________________
activation_13 (Activation) (None, 35, 35, 48) 0 batch_normalization_13[0][0]
__________________________________________________________________________________________________
activation_16 (Activation) (None, 35, 35, 96) 0 batch_normalization_16[0][0]
__________________________________________________________________________________________________
average_pooling2d_1 (AveragePoo (None, 35, 35, 256) 0 mixed0[0][0]
__________________________________________________________________________________________________
conv2d_12 (Conv2D) (None, 35, 35, 64) 16384 mixed0[0][0]
__________________________________________________________________________________________________
conv2d_14 (Conv2D) (None, 35, 35, 64) 76800 activation_13[0][0]
__________________________________________________________________________________________________
conv2d_17 (Conv2D) (None, 35, 35, 96) 82944 activation_16[0][0]
__________________________________________________________________________________________________
conv2d_18 (Conv2D) (None, 35, 35, 64) 16384 average_pooling2d_1[0][0]
__________________________________________________________________________________________________
batch_normalization_12 (BatchNo (None, 35, 35, 64) 192 conv2d_12[0][0]
__________________________________________________________________________________________________
batch_normalization_14 (BatchNo (None, 35, 35, 64) 192 conv2d_14[0][0]
__________________________________________________________________________________________________
batch_normalization_17 (BatchNo (None, 35, 35, 96) 288 conv2d_17[0][0]
__________________________________________________________________________________________________
batch_normalization_18 (BatchNo (None, 35, 35, 64) 192 conv2d_18[0][0]
__________________________________________________________________________________________________
activation_12 (Activation) (None, 35, 35, 64) 0 batch_normalization_12[0][0]
__________________________________________________________________________________________________
activation_14 (Activation) (None, 35, 35, 64) 0 batch_normalization_14[0][0]
__________________________________________________________________________________________________
activation_17 (Activation) (None, 35, 35, 96) 0 batch_normalization_17[0][0]
__________________________________________________________________________________________________
activation_18 (Activation) (None, 35, 35, 64) 0 batch_normalization_18[0][0]
__________________________________________________________________________________________________
mixed1 (Concatenate) (None, 35, 35, 288) 0 activation_12[0][0]
activation_14[0][0]
activation_17[0][0]
activation_18[0][0]
__________________________________________________________________________________________________
conv2d_22 (Conv2D) (None, 35, 35, 64) 18432 mixed1[0][0]
__________________________________________________________________________________________________
batch_normalization_22 (BatchNo (None, 35, 35, 64) 192 conv2d_22[0][0]
__________________________________________________________________________________________________
activation_22 (Activation) (None, 35, 35, 64) 0 batch_normalization_22[0][0]
__________________________________________________________________________________________________
conv2d_20 (Conv2D) (None, 35, 35, 48) 13824 mixed1[0][0]
__________________________________________________________________________________________________
conv2d_23 (Conv2D) (None, 35, 35, 96) 55296 activation_22[0][0]
__________________________________________________________________________________________________
batch_normalization_20 (BatchNo (None, 35, 35, 48) 144 conv2d_20[0][0]
__________________________________________________________________________________________________
batch_normalization_23 (BatchNo (None, 35, 35, 96) 288 conv2d_23[0][0]
__________________________________________________________________________________________________
activation_20 (Activation) (None, 35, 35, 48) 0 batch_normalization_20[0][0]
__________________________________________________________________________________________________
activation_23 (Activation) (None, 35, 35, 96) 0 batch_normalization_23[0][0]
__________________________________________________________________________________________________
average_pooling2d_2 (AveragePoo (None, 35, 35, 288) 0 mixed1[0][0]
__________________________________________________________________________________________________
conv2d_19 (Conv2D) (None, 35, 35, 64) 18432 mixed1[0][0]
__________________________________________________________________________________________________
conv2d_21 (Conv2D) (None, 35, 35, 64) 76800 activation_20[0][0]
__________________________________________________________________________________________________
conv2d_24 (Conv2D) (None, 35, 35, 96) 82944 activation_23[0][0]
__________________________________________________________________________________________________
conv2d_25 (Conv2D) (None, 35, 35, 64) 18432 average_pooling2d_2[0][0]
__________________________________________________________________________________________________
batch_normalization_19 (BatchNo (None, 35, 35, 64) 192 conv2d_19[0][0]
__________________________________________________________________________________________________
batch_normalization_21 (BatchNo (None, 35, 35, 64) 192 conv2d_21[0][0]
__________________________________________________________________________________________________
batch_normalization_24 (BatchNo (None, 35, 35, 96) 288 conv2d_24[0][0]
__________________________________________________________________________________________________
batch_normalization_25 (BatchNo (None, 35, 35, 64) 192 conv2d_25[0][0]
__________________________________________________________________________________________________
activation_19 (Activation) (None, 35, 35, 64) 0 batch_normalization_19[0][0]
__________________________________________________________________________________________________
activation_21 (Activation) (None, 35, 35, 64) 0 batch_normalization_21[0][0]
__________________________________________________________________________________________________
activation_24 (Activation) (None, 35, 35, 96) 0 batch_normalization_24[0][0]
__________________________________________________________________________________________________
activation_25 (Activation) (None, 35, 35, 64) 0 batch_normalization_25[0][0]
__________________________________________________________________________________________________
mixed2 (Concatenate) (None, 35, 35, 288) 0 activation_19[0][0]
activation_21[0][0]
activation_24[0][0]
activation_25[0][0]
__________________________________________________________________________________________________
conv2d_27 (Conv2D) (None, 35, 35, 64) 18432 mixed2[0][0]
__________________________________________________________________________________________________
batch_normalization_27 (BatchNo (None, 35, 35, 64) 192 conv2d_27[0][0]
__________________________________________________________________________________________________
activation_27 (Activation) (None, 35, 35, 64) 0 batch_normalization_27[0][0]
__________________________________________________________________________________________________
conv2d_28 (Conv2D) (None, 35, 35, 96) 55296 activation_27[0][0]
__________________________________________________________________________________________________
batch_normalization_28 (BatchNo (None, 35, 35, 96) 288 conv2d_28[0][0]
__________________________________________________________________________________________________
activation_28 (Activation) (None, 35, 35, 96) 0 batch_normalization_28[0][0]
__________________________________________________________________________________________________
conv2d_26 (Conv2D) (None, 17, 17, 384) 995328 mixed2[0][0]
__________________________________________________________________________________________________
conv2d_29 (Conv2D) (None, 17, 17, 96) 82944 activation_28[0][0]
__________________________________________________________________________________________________
batch_normalization_26 (BatchNo (None, 17, 17, 384) 1152 conv2d_26[0][0]
__________________________________________________________________________________________________
batch_normalization_29 (BatchNo (None, 17, 17, 96) 288 conv2d_29[0][0]
__________________________________________________________________________________________________
activation_26 (Activation) (None, 17, 17, 384) 0 batch_normalization_26[0][0]
__________________________________________________________________________________________________
activation_29 (Activation) (None, 17, 17, 96) 0 batch_normalization_29[0][0]
__________________________________________________________________________________________________
max_pooling2d_8 (MaxPooling2D) (None, 17, 17, 288) 0 mixed2[0][0]
__________________________________________________________________________________________________
mixed3 (Concatenate) (None, 17, 17, 768) 0 activation_26[0][0]
activation_29[0][0]
max_pooling2d_8[0][0]
__________________________________________________________________________________________________
conv2d_34 (Conv2D) (None, 17, 17, 128) 98304 mixed3[0][0]
__________________________________________________________________________________________________
batch_normalization_34 (BatchNo (None, 17, 17, 128) 384 conv2d_34[0][0]
__________________________________________________________________________________________________
activation_34 (Activation) (None, 17, 17, 128) 0 batch_normalization_34[0][0]
__________________________________________________________________________________________________
conv2d_35 (Conv2D) (None, 17, 17, 128) 114688 activation_34[0][0]
__________________________________________________________________________________________________
batch_normalization_35 (BatchNo (None, 17, 17, 128) 384 conv2d_35[0][0]
__________________________________________________________________________________________________
activation_35 (Activation) (None, 17, 17, 128) 0 batch_normalization_35[0][0]
__________________________________________________________________________________________________
conv2d_31 (Conv2D) (None, 17, 17, 128) 98304 mixed3[0][0]
__________________________________________________________________________________________________
conv2d_36 (Conv2D) (None, 17, 17, 128) 114688 activation_35[0][0]
__________________________________________________________________________________________________
batch_normalization_31 (BatchNo (None, 17, 17, 128) 384 conv2d_31[0][0]
__________________________________________________________________________________________________
batch_normalization_36 (BatchNo (None, 17, 17, 128) 384 conv2d_36[0][0]
__________________________________________________________________________________________________
activation_31 (Activation) (None, 17, 17, 128) 0 batch_normalization_31[0][0]
__________________________________________________________________________________________________
activation_36 (Activation) (None, 17, 17, 128) 0 batch_normalization_36[0][0]
__________________________________________________________________________________________________
conv2d_32 (Conv2D) (None, 17, 17, 128) 114688 activation_31[0][0]
__________________________________________________________________________________________________
conv2d_37 (Conv2D) (None, 17, 17, 128) 114688 activation_36[0][0]
__________________________________________________________________________________________________
batch_normalization_32 (BatchNo (None, 17, 17, 128) 384 conv2d_32[0][0]
__________________________________________________________________________________________________
batch_normalization_37 (BatchNo (None, 17, 17, 128) 384 conv2d_37[0][0]
__________________________________________________________________________________________________
activation_32 (Activation) (None, 17, 17, 128) 0 batch_normalization_32[0][0]
__________________________________________________________________________________________________
activation_37 (Activation) (None, 17, 17, 128) 0 batch_normalization_37[0][0]
__________________________________________________________________________________________________
average_pooling2d_3 (AveragePoo (None, 17, 17, 768) 0 mixed3[0][0]
__________________________________________________________________________________________________
conv2d_30 (Conv2D) (None, 17, 17, 192) 147456 mixed3[0][0]
__________________________________________________________________________________________________
conv2d_33 (Conv2D) (None, 17, 17, 192) 172032 activation_32[0][0]
__________________________________________________________________________________________________
conv2d_38 (Conv2D) (None, 17, 17, 192) 172032 activation_37[0][0]
__________________________________________________________________________________________________
conv2d_39 (Conv2D) (None, 17, 17, 192) 147456 average_pooling2d_3[0][0]
__________________________________________________________________________________________________
batch_normalization_30 (BatchNo (None, 17, 17, 192) 576 conv2d_30[0][0]
__________________________________________________________________________________________________
batch_normalization_33 (BatchNo (None, 17, 17, 192) 576 conv2d_33[0][0]
__________________________________________________________________________________________________
batch_normalization_38 (BatchNo (None, 17, 17, 192) 576 conv2d_38[0][0]
__________________________________________________________________________________________________
batch_normalization_39 (BatchNo (None, 17, 17, 192) 576 conv2d_39[0][0]
__________________________________________________________________________________________________
activation_30 (Activation) (None, 17, 17, 192) 0 batch_normalization_30[0][0]
__________________________________________________________________________________________________
activation_33 (Activation) (None, 17, 17, 192) 0 batch_normalization_33[0][0]
__________________________________________________________________________________________________
activation_38 (Activation) (None, 17, 17, 192) 0 batch_normalization_38[0][0]
__________________________________________________________________________________________________
activation_39 (Activation) (None, 17, 17, 192) 0 batch_normalization_39[0][0]
__________________________________________________________________________________________________
mixed4 (Concatenate) (None, 17, 17, 768) 0 activation_30[0][0]
activation_33[0][0]
activation_38[0][0]
activation_39[0][0]
__________________________________________________________________________________________________
conv2d_44 (Conv2D) (None, 17, 17, 160) 122880 mixed4[0][0]
__________________________________________________________________________________________________
batch_normalization_44 (BatchNo (None, 17, 17, 160) 480 conv2d_44[0][0]
__________________________________________________________________________________________________
activation_44 (Activation) (None, 17, 17, 160) 0 batch_normalization_44[0][0]
__________________________________________________________________________________________________
conv2d_45 (Conv2D) (None, 17, 17, 160) 179200 activation_44[0][0]
__________________________________________________________________________________________________
batch_normalization_45 (BatchNo (None, 17, 17, 160) 480 conv2d_45[0][0]
__________________________________________________________________________________________________
activation_45 (Activation) (None, 17, 17, 160) 0 batch_normalization_45[0][0]
__________________________________________________________________________________________________
conv2d_41 (Conv2D) (None, 17, 17, 160) 122880 mixed4[0][0]
__________________________________________________________________________________________________
conv2d_46 (Conv2D) (None, 17, 17, 160) 179200 activation_45[0][0]
__________________________________________________________________________________________________
batch_normalization_41 (BatchNo (None, 17, 17, 160) 480 conv2d_41[0][0]
__________________________________________________________________________________________________
batch_normalization_46 (BatchNo (None, 17, 17, 160) 480 conv2d_46[0][0]
__________________________________________________________________________________________________
activation_41 (Activation) (None, 17, 17, 160) 0 batch_normalization_41[0][0]
__________________________________________________________________________________________________
activation_46 (Activation) (None, 17, 17, 160) 0 batch_normalization_46[0][0]
__________________________________________________________________________________________________
conv2d_42 (Conv2D) (None, 17, 17, 160) 179200 activation_41[0][0]
__________________________________________________________________________________________________
conv2d_47 (Conv2D) (None, 17, 17, 160) 179200 activation_46[0][0]
__________________________________________________________________________________________________
batch_normalization_42 (BatchNo (None, 17, 17, 160) 480 conv2d_42[0][0]
__________________________________________________________________________________________________
batch_normalization_47 (BatchNo (None, 17, 17, 160) 480 conv2d_47[0][0]
__________________________________________________________________________________________________
activation_42 (Activation) (None, 17, 17, 160) 0 batch_normalization_42[0][0]
__________________________________________________________________________________________________
activation_47 (Activation) (None, 17, 17, 160) 0 batch_normalization_47[0][0]
__________________________________________________________________________________________________
average_pooling2d_4 (AveragePoo (None, 17, 17, 768) 0 mixed4[0][0]
__________________________________________________________________________________________________
conv2d_40 (Conv2D) (None, 17, 17, 192) 147456 mixed4[0][0]
__________________________________________________________________________________________________
conv2d_43 (Conv2D) (None, 17, 17, 192) 215040 activation_42[0][0]
__________________________________________________________________________________________________
conv2d_48 (Conv2D) (None, 17, 17, 192) 215040 activation_47[0][0]
__________________________________________________________________________________________________
conv2d_49 (Conv2D) (None, 17, 17, 192) 147456 average_pooling2d_4[0][0]
__________________________________________________________________________________________________
batch_normalization_40 (BatchNo (None, 17, 17, 192) 576 conv2d_40[0][0]
__________________________________________________________________________________________________
batch_normalization_43 (BatchNo (None, 17, 17, 192) 576 conv2d_43[0][0]
__________________________________________________________________________________________________
batch_normalization_48 (BatchNo (None, 17, 17, 192) 576 conv2d_48[0][0]
__________________________________________________________________________________________________
batch_normalization_49 (BatchNo (None, 17, 17, 192) 576 conv2d_49[0][0]
__________________________________________________________________________________________________
activation_40 (Activation) (None, 17, 17, 192) 0 batch_normalization_40[0][0]
__________________________________________________________________________________________________
activation_43 (Activation) (None, 17, 17, 192) 0 batch_normalization_43[0][0]
__________________________________________________________________________________________________
activation_48 (Activation) (None, 17, 17, 192) 0 batch_normalization_48[0][0]
__________________________________________________________________________________________________
activation_49 (Activation) (None, 17, 17, 192) 0 batch_normalization_49[0][0]
__________________________________________________________________________________________________
mixed5 (Concatenate) (None, 17, 17, 768) 0 activation_40[0][0]
activation_43[0][0]
activation_48[0][0]
activation_49[0][0]
__________________________________________________________________________________________________
conv2d_54 (Conv2D) (None, 17, 17, 160) 122880 mixed5[0][0]
__________________________________________________________________________________________________
batch_normalization_54 (BatchNo (None, 17, 17, 160) 480 conv2d_54[0][0]
__________________________________________________________________________________________________
activation_54 (Activation) (None, 17, 17, 160) 0 batch_normalization_54[0][0]
__________________________________________________________________________________________________
conv2d_55 (Conv2D) (None, 17, 17, 160) 179200 activation_54[0][0]
__________________________________________________________________________________________________
batch_normalization_55 (BatchNo (None, 17, 17, 160) 480 conv2d_55[0][0]
__________________________________________________________________________________________________
activation_55 (Activation) (None, 17, 17, 160) 0 batch_normalization_55[0][0]
__________________________________________________________________________________________________
conv2d_51 (Conv2D) (None, 17, 17, 160) 122880 mixed5[0][0]
__________________________________________________________________________________________________
conv2d_56 (Conv2D) (None, 17, 17, 160) 179200 activation_55[0][0]
__________________________________________________________________________________________________
batch_normalization_51 (BatchNo (None, 17, 17, 160) 480 conv2d_51[0][0]
__________________________________________________________________________________________________
batch_normalization_56 (BatchNo (None, 17, 17, 160) 480 conv2d_56[0][0]
__________________________________________________________________________________________________
activation_51 (Activation) (None, 17, 17, 160) 0 batch_normalization_51[0][0]
__________________________________________________________________________________________________
activation_56 (Activation) (None, 17, 17, 160) 0 batch_normalization_56[0][0]
__________________________________________________________________________________________________
conv2d_52 (Conv2D) (None, 17, 17, 160) 179200 activation_51[0][0]
__________________________________________________________________________________________________
conv2d_57 (Conv2D) (None, 17, 17, 160) 179200 activation_56[0][0]
__________________________________________________________________________________________________
batch_normalization_52 (BatchNo (None, 17, 17, 160) 480 conv2d_52[0][0]
__________________________________________________________________________________________________
batch_normalization_57 (BatchNo (None, 17, 17, 160) 480 conv2d_57[0][0]
__________________________________________________________________________________________________
activation_52 (Activation) (None, 17, 17, 160) 0 batch_normalization_52[0][0]
__________________________________________________________________________________________________
activation_57 (Activation) (None, 17, 17, 160) 0 batch_normalization_57[0][0]
__________________________________________________________________________________________________
average_pooling2d_5 (AveragePoo (None, 17, 17, 768) 0 mixed5[0][0]
__________________________________________________________________________________________________
conv2d_50 (Conv2D) (None, 17, 17, 192) 147456 mixed5[0][0]
__________________________________________________________________________________________________
conv2d_53 (Conv2D) (None, 17, 17, 192) 215040 activation_52[0][0]
__________________________________________________________________________________________________
conv2d_58 (Conv2D) (None, 17, 17, 192) 215040 activation_57[0][0]
__________________________________________________________________________________________________
conv2d_59 (Conv2D) (None, 17, 17, 192) 147456 average_pooling2d_5[0][0]
__________________________________________________________________________________________________
batch_normalization_50 (BatchNo (None, 17, 17, 192) 576 conv2d_50[0][0]
__________________________________________________________________________________________________
batch_normalization_53 (BatchNo (None, 17, 17, 192) 576 conv2d_53[0][0]
__________________________________________________________________________________________________
batch_normalization_58 (BatchNo (None, 17, 17, 192) 576 conv2d_58[0][0]
__________________________________________________________________________________________________
batch_normalization_59 (BatchNo (None, 17, 17, 192) 576 conv2d_59[0][0]
__________________________________________________________________________________________________
activation_50 (Activation) (None, 17, 17, 192) 0 batch_normalization_50[0][0]
__________________________________________________________________________________________________
activation_53 (Activation) (None, 17, 17, 192) 0 batch_normalization_53[0][0]
__________________________________________________________________________________________________
activation_58 (Activation) (None, 17, 17, 192) 0 batch_normalization_58[0][0]
__________________________________________________________________________________________________
activation_59 (Activation) (None, 17, 17, 192) 0 batch_normalization_59[0][0]
__________________________________________________________________________________________________
mixed6 (Concatenate) (None, 17, 17, 768) 0 activation_50[0][0]
activation_53[0][0]
activation_58[0][0]
activation_59[0][0]
__________________________________________________________________________________________________
conv2d_64 (Conv2D) (None, 17, 17, 192) 147456 mixed6[0][0]
__________________________________________________________________________________________________
batch_normalization_64 (BatchNo (None, 17, 17, 192) 576 conv2d_64[0][0]
__________________________________________________________________________________________________
activation_64 (Activation) (None, 17, 17, 192) 0 batch_normalization_64[0][0]
__________________________________________________________________________________________________
conv2d_65 (Conv2D) (None, 17, 17, 192) 258048 activation_64[0][0]
__________________________________________________________________________________________________
batch_normalization_65 (BatchNo (None, 17, 17, 192) 576 conv2d_65[0][0]
__________________________________________________________________________________________________
activation_65 (Activation) (None, 17, 17, 192) 0 batch_normalization_65[0][0]
__________________________________________________________________________________________________
conv2d_61 (Conv2D) (None, 17, 17, 192) 147456 mixed6[0][0]
__________________________________________________________________________________________________
conv2d_66 (Conv2D) (None, 17, 17, 192) 258048 activation_65[0][0]
__________________________________________________________________________________________________
batch_normalization_61 (BatchNo (None, 17, 17, 192) 576 conv2d_61[0][0]
__________________________________________________________________________________________________
batch_normalization_66 (BatchNo (None, 17, 17, 192) 576 conv2d_66[0][0]
__________________________________________________________________________________________________
activation_61 (Activation) (None, 17, 17, 192) 0 batch_normalization_61[0][0]
__________________________________________________________________________________________________
activation_66 (Activation) (None, 17, 17, 192) 0 batch_normalization_66[0][0]
__________________________________________________________________________________________________
conv2d_62 (Conv2D) (None, 17, 17, 192) 258048 activation_61[0][0]
__________________________________________________________________________________________________
conv2d_67 (Conv2D) (None, 17, 17, 192) 258048 activation_66[0][0]
__________________________________________________________________________________________________
batch_normalization_62 (BatchNo (None, 17, 17, 192) 576 conv2d_62[0][0]
__________________________________________________________________________________________________
batch_normalization_67 (BatchNo (None, 17, 17, 192) 576 conv2d_67[0][0]
__________________________________________________________________________________________________
activation_62 (Activation) (None, 17, 17, 192) 0 batch_normalization_62[0][0]
__________________________________________________________________________________________________
activation_67 (Activation) (None, 17, 17, 192) 0 batch_normalization_67[0][0]
__________________________________________________________________________________________________
average_pooling2d_6 (AveragePoo (None, 17, 17, 768) 0 mixed6[0][0]
__________________________________________________________________________________________________
conv2d_60 (Conv2D) (None, 17, 17, 192) 147456 mixed6[0][0]
__________________________________________________________________________________________________
conv2d_63 (Conv2D) (None, 17, 17, 192) 258048 activation_62[0][0]
__________________________________________________________________________________________________
conv2d_68 (Conv2D) (None, 17, 17, 192) 258048 activation_67[0][0]
__________________________________________________________________________________________________
conv2d_69 (Conv2D) (None, 17, 17, 192) 147456 average_pooling2d_6[0][0]
__________________________________________________________________________________________________
batch_normalization_60 (BatchNo (None, 17, 17, 192) 576 conv2d_60[0][0]
__________________________________________________________________________________________________
batch_normalization_63 (BatchNo (None, 17, 17, 192) 576 conv2d_63[0][0]
__________________________________________________________________________________________________
batch_normalization_68 (BatchNo (None, 17, 17, 192) 576 conv2d_68[0][0]
__________________________________________________________________________________________________
batch_normalization_69 (BatchNo (None, 17, 17, 192) 576 conv2d_69[0][0]
__________________________________________________________________________________________________
activation_60 (Activation) (None, 17, 17, 192) 0 batch_normalization_60[0][0]
__________________________________________________________________________________________________
activation_63 (Activation) (None, 17, 17, 192) 0 batch_normalization_63[0][0]
__________________________________________________________________________________________________
activation_68 (Activation) (None, 17, 17, 192) 0 batch_normalization_68[0][0]
__________________________________________________________________________________________________
activation_69 (Activation) (None, 17, 17, 192) 0 batch_normalization_69[0][0]
__________________________________________________________________________________________________
mixed7 (Concatenate) (None, 17, 17, 768) 0 activation_60[0][0]
activation_63[0][0]
activation_68[0][0]
activation_69[0][0]
__________________________________________________________________________________________________
conv2d_72 (Conv2D) (None, 17, 17, 192) 147456 mixed7[0][0]
__________________________________________________________________________________________________
batch_normalization_72 (BatchNo (None, 17, 17, 192) 576 conv2d_72[0][0]
__________________________________________________________________________________________________
activation_72 (Activation) (None, 17, 17, 192) 0 batch_normalization_72[0][0]
__________________________________________________________________________________________________
conv2d_73 (Conv2D) (None, 17, 17, 192) 258048 activation_72[0][0]
__________________________________________________________________________________________________
batch_normalization_73 (BatchNo (None, 17, 17, 192) 576 conv2d_73[0][0]
__________________________________________________________________________________________________
activation_73 (Activation) (None, 17, 17, 192) 0 batch_normalization_73[0][0]
__________________________________________________________________________________________________
conv2d_70 (Conv2D) (None, 17, 17, 192) 147456 mixed7[0][0]
__________________________________________________________________________________________________
conv2d_74 (Conv2D) (None, 17, 17, 192) 258048 activation_73[0][0]
__________________________________________________________________________________________________
batch_normalization_70 (BatchNo (None, 17, 17, 192) 576 conv2d_70[0][0]
__________________________________________________________________________________________________
batch_normalization_74 (BatchNo (None, 17, 17, 192) 576 conv2d_74[0][0]
__________________________________________________________________________________________________
activation_70 (Activation) (None, 17, 17, 192) 0 batch_normalization_70[0][0]
__________________________________________________________________________________________________
activation_74 (Activation) (None, 17, 17, 192) 0 batch_normalization_74[0][0]
__________________________________________________________________________________________________
conv2d_71 (Conv2D) (None, 8, 8, 320) 552960 activation_70[0][0]
__________________________________________________________________________________________________
conv2d_75 (Conv2D) (None, 8, 8, 192) 331776 activation_74[0][0]
__________________________________________________________________________________________________
batch_normalization_71 (BatchNo (None, 8, 8, 320) 960 conv2d_71[0][0]
__________________________________________________________________________________________________
batch_normalization_75 (BatchNo (None, 8, 8, 192) 576 conv2d_75[0][0]
__________________________________________________________________________________________________
activation_71 (Activation) (None, 8, 8, 320) 0 batch_normalization_71[0][0]
__________________________________________________________________________________________________
activation_75 (Activation) (None, 8, 8, 192) 0 batch_normalization_75[0][0]
__________________________________________________________________________________________________
max_pooling2d_9 (MaxPooling2D) (None, 8, 8, 768) 0 mixed7[0][0]
__________________________________________________________________________________________________
mixed8 (Concatenate) (None, 8, 8, 1280) 0 activation_71[0][0]
activation_75[0][0]
max_pooling2d_9[0][0]
__________________________________________________________________________________________________
conv2d_80 (Conv2D) (None, 8, 8, 448) 573440 mixed8[0][0]
__________________________________________________________________________________________________
batch_normalization_80 (BatchNo (None, 8, 8, 448) 1344 conv2d_80[0][0]
__________________________________________________________________________________________________
activation_80 (Activation) (None, 8, 8, 448) 0 batch_normalization_80[0][0]
__________________________________________________________________________________________________
conv2d_77 (Conv2D) (None, 8, 8, 384) 491520 mixed8[0][0]
__________________________________________________________________________________________________
conv2d_81 (Conv2D) (None, 8, 8, 384) 1548288 activation_80[0][0]
__________________________________________________________________________________________________
batch_normalization_77 (BatchNo (None, 8, 8, 384) 1152 conv2d_77[0][0]
__________________________________________________________________________________________________
batch_normalization_81 (BatchNo (None, 8, 8, 384) 1152 conv2d_81[0][0]
__________________________________________________________________________________________________
activation_77 (Activation) (None, 8, 8, 384) 0 batch_normalization_77[0][0]
__________________________________________________________________________________________________
activation_81 (Activation) (None, 8, 8, 384) 0 batch_normalization_81[0][0]
__________________________________________________________________________________________________
conv2d_78 (Conv2D) (None, 8, 8, 384) 442368 activation_77[0][0]
__________________________________________________________________________________________________
conv2d_79 (Conv2D) (None, 8, 8, 384) 442368 activation_77[0][0]
__________________________________________________________________________________________________
conv2d_82 (Conv2D) (None, 8, 8, 384) 442368 activation_81[0][0]
__________________________________________________________________________________________________
conv2d_83 (Conv2D) (None, 8, 8, 384) 442368 activation_81[0][0]
__________________________________________________________________________________________________
average_pooling2d_7 (AveragePoo (None, 8, 8, 1280) 0 mixed8[0][0]
__________________________________________________________________________________________________
conv2d_76 (Conv2D) (None, 8, 8, 320) 409600 mixed8[0][0]
__________________________________________________________________________________________________
batch_normalization_78 (BatchNo (None, 8, 8, 384) 1152 conv2d_78[0][0]
__________________________________________________________________________________________________
batch_normalization_79 (BatchNo (None, 8, 8, 384) 1152 conv2d_79[0][0]
__________________________________________________________________________________________________
batch_normalization_82 (BatchNo (None, 8, 8, 384) 1152 conv2d_82[0][0]
__________________________________________________________________________________________________
batch_normalization_83 (BatchNo (None, 8, 8, 384) 1152 conv2d_83[0][0]
__________________________________________________________________________________________________
conv2d_84 (Conv2D) (None, 8, 8, 192) 245760 average_pooling2d_7[0][0]
__________________________________________________________________________________________________
batch_normalization_76 (BatchNo (None, 8, 8, 320) 960 conv2d_76[0][0]
__________________________________________________________________________________________________
activation_78 (Activation) (None, 8, 8, 384) 0 batch_normalization_78[0][0]
__________________________________________________________________________________________________
activation_79 (Activation) (None, 8, 8, 384) 0 batch_normalization_79[0][0]
__________________________________________________________________________________________________
activation_82 (Activation) (None, 8, 8, 384) 0 batch_normalization_82[0][0]
__________________________________________________________________________________________________
activation_83 (Activation) (None, 8, 8, 384) 0 batch_normalization_83[0][0]
__________________________________________________________________________________________________
batch_normalization_84 (BatchNo (None, 8, 8, 192) 576 conv2d_84[0][0]
__________________________________________________________________________________________________
activation_76 (Activation) (None, 8, 8, 320) 0 batch_normalization_76[0][0]
__________________________________________________________________________________________________
mixed9_0 (Concatenate) (None, 8, 8, 768) 0 activation_78[0][0]
activation_79[0][0]
__________________________________________________________________________________________________
concatenate (Concatenate) (None, 8, 8, 768) 0 activation_82[0][0]
activation_83[0][0]
__________________________________________________________________________________________________
activation_84 (Activation) (None, 8, 8, 192) 0 batch_normalization_84[0][0]
__________________________________________________________________________________________________
mixed9 (Concatenate) (None, 8, 8, 2048) 0 activation_76[0][0]
mixed9_0[0][0]
concatenate[0][0]
activation_84[0][0]
__________________________________________________________________________________________________
conv2d_89 (Conv2D) (None, 8, 8, 448) 917504 mixed9[0][0]
__________________________________________________________________________________________________
batch_normalization_89 (BatchNo (None, 8, 8, 448) 1344 conv2d_89[0][0]
__________________________________________________________________________________________________
activation_89 (Activation) (None, 8, 8, 448) 0 batch_normalization_89[0][0]
__________________________________________________________________________________________________
conv2d_86 (Conv2D) (None, 8, 8, 384) 786432 mixed9[0][0]
__________________________________________________________________________________________________
conv2d_90 (Conv2D) (None, 8, 8, 384) 1548288 activation_89[0][0]
__________________________________________________________________________________________________
batch_normalization_86 (BatchNo (None, 8, 8, 384) 1152 conv2d_86[0][0]
__________________________________________________________________________________________________
batch_normalization_90 (BatchNo (None, 8, 8, 384) 1152 conv2d_90[0][0]
__________________________________________________________________________________________________
activation_86 (Activation) (None, 8, 8, 384) 0 batch_normalization_86[0][0]
__________________________________________________________________________________________________
activation_90 (Activation) (None, 8, 8, 384) 0 batch_normalization_90[0][0]
__________________________________________________________________________________________________
conv2d_87 (Conv2D) (None, 8, 8, 384) 442368 activation_86[0][0]
__________________________________________________________________________________________________
conv2d_88 (Conv2D) (None, 8, 8, 384) 442368 activation_86[0][0]
__________________________________________________________________________________________________
conv2d_91 (Conv2D) (None, 8, 8, 384) 442368 activation_90[0][0]
__________________________________________________________________________________________________
conv2d_92 (Conv2D) (None, 8, 8, 384) 442368 activation_90[0][0]
__________________________________________________________________________________________________
average_pooling2d_8 (AveragePoo (None, 8, 8, 2048) 0 mixed9[0][0]
__________________________________________________________________________________________________
conv2d_85 (Conv2D) (None, 8, 8, 320) 655360 mixed9[0][0]
__________________________________________________________________________________________________
batch_normalization_87 (BatchNo (None, 8, 8, 384) 1152 conv2d_87[0][0]
__________________________________________________________________________________________________
batch_normalization_88 (BatchNo (None, 8, 8, 384) 1152 conv2d_88[0][0]
__________________________________________________________________________________________________
batch_normalization_91 (BatchNo (None, 8, 8, 384) 1152 conv2d_91[0][0]
__________________________________________________________________________________________________
batch_normalization_92 (BatchNo (None, 8, 8, 384) 1152 conv2d_92[0][0]
__________________________________________________________________________________________________
conv2d_93 (Conv2D) (None, 8, 8, 192) 393216 average_pooling2d_8[0][0]
__________________________________________________________________________________________________
batch_normalization_85 (BatchNo (None, 8, 8, 320) 960 conv2d_85[0][0]
__________________________________________________________________________________________________
activation_87 (Activation) (None, 8, 8, 384) 0 batch_normalization_87[0][0]
__________________________________________________________________________________________________
activation_88 (Activation) (None, 8, 8, 384) 0 batch_normalization_88[0][0]
__________________________________________________________________________________________________
activation_91 (Activation) (None, 8, 8, 384) 0 batch_normalization_91[0][0]
__________________________________________________________________________________________________
activation_92 (Activation) (None, 8, 8, 384) 0 batch_normalization_92[0][0]
__________________________________________________________________________________________________
batch_normalization_93 (BatchNo (None, 8, 8, 192) 576 conv2d_93[0][0]
__________________________________________________________________________________________________
activation_85 (Activation) (None, 8, 8, 320) 0 batch_normalization_85[0][0]
__________________________________________________________________________________________________
mixed9_1 (Concatenate) (None, 8, 8, 768) 0 activation_87[0][0]
activation_88[0][0]
__________________________________________________________________________________________________
concatenate_1 (Concatenate) (None, 8, 8, 768) 0 activation_91[0][0]
activation_92[0][0]
__________________________________________________________________________________________________
activation_93 (Activation) (None, 8, 8, 192) 0 batch_normalization_93[0][0]
__________________________________________________________________________________________________
mixed10 (Concatenate) (None, 8, 8, 2048) 0 activation_85[0][0]
mixed9_1[0][0]
concatenate_1[0][0]
activation_93[0][0]
__________________________________________________________________________________________________
avg_pool (GlobalAveragePooling2 (None, 2048) 0 mixed10[0][0]
__________________________________________________________________________________________________
predictions (Dense) (None, 1000) 2049000 avg_pool[0][0]
==================================================================================================
Total params: 23,851,784
Trainable params: 23,817,352
Non-trainable params: 34,432
__________________________________________________________________________________________________
concatenate is used when you want to keep the original values as they are (it joins the structures side by side)
add merges the inputs into a single result
When combining an original image with a condition, concatenate is generally the more useful choice,
because even when an add is needed, it can still be done after concatenating (see the numpy and Keras sketches below)
import numpy as np
a = np.array([[1,2,3]])
b = np.array([[1,2,3]])
a+b
array([[2, 4, 6]])
c = np.concatenate((a,b))
c
# array([[1, 2, 3],
# [1, 2, 3]])
c.sum(axis=0)
# array([2, 4, 6])
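The same contrast with Keras layers, as a small sketch on made-up tensors: Concatenate keeps both inputs side by side along the last axis, while Add collapses them into a single tensor of the original shape.
x = tf.constant([[1., 2., 3.]])
y = tf.constant([[1., 2., 3.]])
print(tf.keras.layers.Add()([x, y]))          # [[2. 4. 6.]] -> shape (1, 3)
print(tf.keras.layers.Concatenate()([x, y]))  # [[1. 2. 3. 1. 2. 3.]] -> shape (1, 6)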
The goal of cGAN
Through this process, G learns to map the input image x and the noise z to the output image y
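For reference, the objective that the code below optimizes is the one from the pix2pix paper (the λ weight corresponds to LAMBDA = 100 further down):
\mathcal{L}_{cGAN}(G, D) = \mathbb{E}_{x,y}[\log D(x, y)] + \mathbb{E}_{x,z}[\log(1 - D(x, G(x, z)))]
\mathcal{L}_{L1}(G) = \mathbb{E}_{x,y,z}[\lVert y - G(x, z) \rVert_1]
G^{*} = \arg\min_{G}\max_{D} \mathcal{L}_{cGAN}(G, D) + \lambda \mathcal{L}_{L1}(G)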
Pix2Pix implementation
import tensorflow as tf
import os
import pathlib
import time
import datetime
from matplotlib import pyplot as plt
from IPython import display
dataset_name = "facades"
_URL = f'http://efrosgans.eecs.berkeley.edu/pix2pix/datasets/{dataset_name}.tar.gz'
path_to_zip = tf.keras.utils.get_file(
    fname=f"{dataset_name}.tar.gz",
    origin=_URL,
    extract=True)
path_to_zip = pathlib.Path(path_to_zip)
PATH = path_to_zip.parent/dataset_name
list(PATH.parent.iterdir())
[PosixPath('/Users/jihyeokjeong/.keras/datasets/imdb_word_index.json'),
PosixPath('/Users/jihyeokjeong/.keras/datasets/mnist.npz'),
PosixPath('/Users/jihyeokjeong/.keras/datasets/fashion-mnist'),
PosixPath('/Users/jihyeokjeong/.keras/datasets/imdb.npz'),
PosixPath('/Users/jihyeokjeong/.keras/datasets/facades'),
PosixPath('/Users/jihyeokjeong/.keras/datasets/facades.tar.gz')]
sample_image = tf.io.read_file(str(PATH / 'train/1.jpg'))
sample_image = tf.io.decode_jpeg(sample_image)
print(sample_image.shape)
# (256, 512, 3)
plt.figure()
plt.imshow(sample_image)
def load(image_file):
    image = tf.io.read_file(image_file)
    image = tf.image.decode_jpeg(image)

    # Each file holds two 256x256 images side by side:
    # the right half is used as the input, the left half as the real (target) image
    w = tf.shape(image)[1]
    w = w // 2
    input_image = image[:, w:, :]
    real_image = image[:, :w, :]

    input_image = tf.cast(input_image, tf.float32)
    real_image = tf.cast(real_image, tf.float32)
    return input_image, real_image
inp, re = load(str(PATH / 'train/100.jpg'))
plt.figure()
plt.imshow(inp / 255.0)
plt.figure()
plt.imshow(re / 255.0)
BUFFER_SIZE = 400
BATCH_SIZE = 1
IMG_WIDTH = 256
IMG_HEIGHT = 256
def resize(input_image, real_image, height, width):
    input_image = tf.image.resize(input_image, [height, width],
                                  method=tf.image.ResizeMethod.NEAREST_NEIGHBOR)
    real_image = tf.image.resize(real_image, [height, width],
                                 method=tf.image.ResizeMethod.NEAREST_NEIGHBOR)
    return input_image, real_image

def random_crop(input_image, real_image):
    stacked_image = tf.stack([input_image, real_image], axis=0)
    cropped_image = tf.image.random_crop(
        stacked_image, size=[2, IMG_HEIGHT, IMG_WIDTH, 3])
    return cropped_image[0], cropped_image[1]

def normalize(input_image, real_image):
    # Map pixel values from [0, 255] to [-1, 1], matching the generator's tanh output range
    input_image = (input_image / 127.5) - 1
    real_image = (real_image / 127.5) - 1
    return input_image, real_image

@tf.function()
def random_jitter(input_image, real_image):
    # Resize to 286x286, randomly crop back to 256x256, then randomly mirror
    input_image, real_image = resize(input_image, real_image, 286, 286)
    input_image, real_image = random_crop(input_image, real_image)
    if tf.random.uniform(()) > 0.5:
        input_image = tf.image.flip_left_right(input_image)
        real_image = tf.image.flip_left_right(real_image)
    return input_image, real_image
plt.figure(figsize=(6, 6))
for i in range(4):
    rj_inp, rj_re = random_jitter(inp, re)
    plt.subplot(2, 2, i + 1)
    plt.imshow(rj_inp / 255.0)
    plt.axis('off')
plt.show()

def load_image_train(image_file):
    input_image, real_image = load(image_file)
    input_image, real_image = random_jitter(input_image, real_image)
    input_image, real_image = normalize(input_image, real_image)
    return input_image, real_image

def load_image_test(image_file):
    input_image, real_image = load(image_file)
    input_image, real_image = resize(input_image, real_image,
                                     IMG_HEIGHT, IMG_WIDTH)
    input_image, real_image = normalize(input_image, real_image)
    return input_image, real_image
train_dataset = tf.data.Dataset.list_files(str(PATH / 'train/*.jpg'))
train_dataset = train_dataset.map(load_image_train,
                                  num_parallel_calls=tf.data.AUTOTUNE)
train_dataset = train_dataset.shuffle(BUFFER_SIZE)
train_dataset = train_dataset.batch(BATCH_SIZE)
# Fall back to the 'val' split if the dataset has no 'test' directory
try:
    test_dataset = tf.data.Dataset.list_files(str(PATH / 'test/*.jpg'))
except tf.errors.InvalidArgumentError:
    test_dataset = tf.data.Dataset.list_files(str(PATH / 'val/*.jpg'))
test_dataset = test_dataset.map(load_image_test)
test_dataset = test_dataset.batch(BATCH_SIZE)
OUTPUT_CHANNELS = 3
def downsample(filters, size, apply_batchnorm=True):
    initializer = tf.random_normal_initializer(0., 0.02)
    result = tf.keras.Sequential()
    result.add(
        tf.keras.layers.Conv2D(filters, size, strides=2, padding='same',
                               kernel_initializer=initializer, use_bias=False))  # bias is disabled here
    if apply_batchnorm:
        result.add(tf.keras.layers.BatchNormalization())  # BN is optional
    result.add(tf.keras.layers.LeakyReLU())
    return result
down_model = downsample(3, 4)
down_result = down_model(tf.expand_dims(inp, 0))
print (down_result.shape)
# (1, 128, 128, 3)
def upsample(filters, size, apply_dropout=False):
    initializer = tf.random_normal_initializer(0., 0.02)
    result = tf.keras.Sequential()
    result.add(
        tf.keras.layers.Conv2DTranspose(filters, size, strides=2,
                                        padding='same',
                                        kernel_initializer=initializer,
                                        use_bias=False))
    # Why no bias: GANs are inherently hard to train, so the blocks are kept
    # as simple as possible and the bias term is dropped
    result.add(tf.keras.layers.BatchNormalization())
    # Dropout is not always needed when there are many layers,
    # so it is exposed as a flag to keep the block reusable
    if apply_dropout:
        result.add(tf.keras.layers.Dropout(0.5))
    result.add(tf.keras.layers.ReLU())
    return result
up_model = upsample(3, 4)
up_result = up_model(down_result)
print (up_result.shape)
# (1, 256, 256, 3)
# Build a U-Net inside Generator()
# The U-Net is what produces the fake (translated) images
def Generator():
    inputs = tf.keras.layers.Input(shape=[256, 256, 3])

    down_stack = [
        # Track the feature-map sizes: they must mirror the up stack for the skip connections
        downsample(64, 4, apply_batchnorm=False),  # (batch_size, 128, 128, 64)
        downsample(128, 4),  # (batch_size, 64, 64, 128)
        downsample(256, 4),  # (batch_size, 32, 32, 256)
        downsample(512, 4),  # (batch_size, 16, 16, 512)
        downsample(512, 4),  # (batch_size, 8, 8, 512)
        downsample(512, 4),  # (batch_size, 4, 4, 512)
        downsample(512, 4),  # (batch_size, 2, 2, 512)
        downsample(512, 4),  # (batch_size, 1, 1, 512)
    ]

    up_stack = [
        upsample(512, 4, apply_dropout=True),  # (batch_size, 2, 2, 1024)
        upsample(512, 4, apply_dropout=True),  # (batch_size, 4, 4, 1024)
        upsample(512, 4, apply_dropout=True),  # (batch_size, 8, 8, 1024)
        upsample(512, 4),  # (batch_size, 16, 16, 1024)
        upsample(256, 4),  # (batch_size, 32, 32, 512)
        upsample(128, 4),  # (batch_size, 64, 64, 256)
        upsample(64, 4),  # (batch_size, 128, 128, 128)
    ]

    initializer = tf.random_normal_initializer(0., 0.02)
    last = tf.keras.layers.Conv2DTranspose(OUTPUT_CHANNELS, 4,
                                           strides=2,
                                           padding='same',
                                           kernel_initializer=initializer,
                                           activation='tanh')  # (batch_size, 256, 256, 3)
    # tanh is used instead of sigmoid: sigmoid outputs are not zero-centered,
    # and without zero-centering even small changes can swing the values widely, hurting training

    x = inputs

    # Downsampling, keeping each intermediate output for the skip connections
    skips = []
    for down in down_stack:
        x = down(x)
        skips.append(x)
    skips = reversed(skips[:-1])

    # Upsampling and concatenating with the matching skip connection
    for up, skip in zip(up_stack, skips):
        x = up(x)
        x = tf.keras.layers.Concatenate()([x, skip])

    x = last(x)
    return tf.keras.Model(inputs=inputs, outputs=x)
The purpose of an AutoEncoder or U-Net architecture can change depending on the choice of loss and training data
The same U-Net structure can be used for segmentation, for restoring the original image, or for recoloring the original image (a minimal sketch of such reuse follows below)
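A minimal sketch of that reuse, not part of the original pipeline: the same U-Net can be trained as a plain supervised image-to-image model just by compiling it with an L1 loss, with the (input, target) pairs in the dataset defining the task.
# Hypothetical reuse of the U-Net above without a discriminator
unet = Generator()
unet.compile(optimizer=tf.keras.optimizers.Adam(2e-4), loss='mae')
# unet.fit(train_dataset, epochs=10)  # uncomment to actually train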
generator = Generator()
gen_output = generator(inp[tf.newaxis, ...], training=False)
plt.imshow(gen_output[0, ...])
LAMBDA = 100
# BinaryCrossentropy(from_logits=True) is used because the discriminator's last layer outputs raw logits (no sigmoid)
loss_object = tf.keras.losses.BinaryCrossentropy(from_logits=True)
# One-sided label smoothing: setting the target for real data to a value slightly below 1
# It is not used here (a sketch of that variant follows discriminator_loss below)
def generator_loss(disc_generated_output, gen_output, target):
    # Adversarial term: the generator wants the discriminator to call its output real (1)
    gan_loss = loss_object(tf.ones_like(disc_generated_output), disc_generated_output)
    # L1 term: mean absolute error between the generated image and the target
    l1_loss = tf.reduce_mean(tf.abs(target - gen_output))
    total_gen_loss = gan_loss + (LAMBDA * l1_loss)
    return total_gen_loss, gan_loss, l1_loss
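A quick sanity check on made-up tensors (hypothetical values, only to see the scale of each term): with images in [-1, 1] the raw L1 term is small, so LAMBDA = 100 is what keeps it comparable to the adversarial term.
fake = tf.zeros([1, 256, 256, 3])          # all-zero "generated" image
target = tf.ones([1, 256, 256, 3]) * 0.5   # constant 0.5 target
disc_logits = tf.zeros([1, 30, 30, 1])     # zero logits from the 30x30 PatchGAN output
total, gan, l1 = generator_loss(disc_logits, fake, target)
print(float(gan), float(l1), float(total))  # about 0.693, 0.5, 50.693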
def Discriminator():
    initializer = tf.random_normal_initializer(0., 0.02)

    inp = tf.keras.layers.Input(shape=[256, 256, 3], name='input_image')
    tar = tf.keras.layers.Input(shape=[256, 256, 3], name='target_image')

    # Concatenate the input (condition) image with the image being judged
    x = tf.keras.layers.concatenate([inp, tar])  # (batch_size, 256, 256, channels*2)

    down1 = downsample(64, 4, False)(x)  # (batch_size, 128, 128, 64)
    down2 = downsample(128, 4)(down1)  # (batch_size, 64, 64, 128)
    down3 = downsample(256, 4)(down2)  # (batch_size, 32, 32, 256)

    # Each cell of the 30x30 output judges a 70x70 patch of the input
    # (the image is judged piece by piece; the patch size was found experimentally)
    # Zero-padding is used to line the sizes up
    zero_pad1 = tf.keras.layers.ZeroPadding2D()(down3)  # (batch_size, 34, 34, 256)
    conv = tf.keras.layers.Conv2D(512, 4, strides=1,
                                  kernel_initializer=initializer,
                                  use_bias=False)(zero_pad1)  # (batch_size, 31, 31, 512)
    batchnorm1 = tf.keras.layers.BatchNormalization()(conv)
    leaky_relu = tf.keras.layers.LeakyReLU()(batchnorm1)

    zero_pad2 = tf.keras.layers.ZeroPadding2D()(leaky_relu)  # (batch_size, 33, 33, 512)
    last = tf.keras.layers.Conv2D(1, 4, strides=1,
                                  kernel_initializer=initializer)(zero_pad2)  # (batch_size, 30, 30, 1)

    return tf.keras.Model(inputs=[inp, tar], outputs=last)
discriminator = Discriminator()
disc_out = discriminator([inp[tf.newaxis, ...], gen_output], training=False)
plt.imshow(disc_out[0, ..., -1], vmin=-20, vmax=20, cmap='RdBu_r')
plt.colorbar()
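The 70x70 figure mentioned in the comment inside Discriminator() can be checked with a small back-of-the-envelope helper (not part of the original code): walking the conv stack from input to output, each cell of the 30x30 map sees a 70x70 region of the input.
# Receptive field of one output cell, given (kernel_size, stride) per conv layer
def receptive_field(layers):
    rf, jump = 1, 1
    for k, s in layers:
        rf += (k - 1) * jump
        jump *= s
    return rf

# three stride-2 downsample convs, then two stride-1 4x4 convs
print(receptive_field([(4, 2), (4, 2), (4, 2), (4, 1), (4, 1)]))  # 70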
def discriminator_loss(disc_real_output, disc_generated_output):
    # Real images should be judged real (target 1)
    real_loss = loss_object(tf.ones_like(disc_real_output), disc_real_output)
    # Generated images should be judged fake (target 0)
    generated_loss = loss_object(tf.zeros_like(disc_generated_output), disc_generated_output)
    total_disc_loss = real_loss + generated_loss
    return total_disc_loss
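As noted above, one-sided label smoothing is not used in this post. If it were, only the targets for real images would be softened; a hedged sketch of that variant (hypothetical function name):
def discriminator_loss_smoothed(disc_real_output, disc_generated_output, smooth=0.9):
    # Soften only the "real" targets; fake targets stay at 0
    real_loss = loss_object(tf.ones_like(disc_real_output) * smooth, disc_real_output)
    generated_loss = loss_object(tf.zeros_like(disc_generated_output), disc_generated_output)
    return real_loss + generated_loss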
generator_optimizer = tf.keras.optimizers.Adam(2e-4, beta_1=0.5)
discriminator_optimizer = tf.keras.optimizers.Adam(2e-4, beta_1=0.5)
checkpoint_dir = './training_checkpoints'
checkpoint_prefix = os.path.join(checkpoint_dir, "ckpt")
checkpoint = tf.train.Checkpoint(generator_optimizer=generator_optimizer,
                                 discriminator_optimizer=discriminator_optimizer,
                                 generator=generator,
                                 discriminator=discriminator)
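Standard tf.train.Checkpoint usage also allows resuming from the most recent saved state, for example:
# Restore the latest checkpoint in checkpoint_dir, if one exists
checkpoint.restore(tf.train.latest_checkpoint(checkpoint_dir))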
def generate_images(model, test_input, tar):
    prediction = model(test_input, training=True)
    plt.figure(figsize=(15, 15))

    display_list = [test_input[0], tar[0], prediction[0]]
    title = ['Input Image', 'Ground Truth', 'Predicted Image']

    for i in range(3):
        plt.subplot(1, 3, i+1)
        plt.title(title[i])
        # Rescale from [-1, 1] back to [0, 1] for plotting
        plt.imshow(display_list[i] * 0.5 + 0.5)
        plt.axis('off')
    plt.show()

for example_input, example_target in test_dataset.take(1):
    generate_images(generator, example_input, example_target)
log_dir="logs/"
summary_writer = tf.summary.create_file_writer(
    log_dir + "fit/" + datetime.datetime.now().strftime("%Y%m%d-%H%M%S"))
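If this is run in a notebook, the logged scalars can be viewed with the TensorBoard extension (standard IPython magics, assuming a notebook environment):
# In a notebook cell:
# %load_ext tensorboard
# %tensorboard --logdir logs/fit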
@tf.function
def train_step(input_image, target, step):
    # Two GradientTapes record the same forward pass so that the generator and the
    # discriminator each get their own gradients; both models are updated once per step,
    # so in effect generator and discriminator training alternate step after step
    with tf.GradientTape() as gen_tape, tf.GradientTape() as disc_tape:
        # Everything below is recorded for differentiation
        gen_output = generator(input_image, training=True)

        disc_real_output = discriminator([input_image, target], training=True)
        disc_generated_output = discriminator([input_image, gen_output], training=True)

        gen_total_loss, gen_gan_loss, gen_l1_loss = generator_loss(disc_generated_output, gen_output, target)
        disc_loss = discriminator_loss(disc_real_output, disc_generated_output)

    generator_gradients = gen_tape.gradient(gen_total_loss,
                                            generator.trainable_variables)
    discriminator_gradients = disc_tape.gradient(disc_loss,
                                                 discriminator.trainable_variables)

    generator_optimizer.apply_gradients(zip(generator_gradients,
                                            generator.trainable_variables))
    discriminator_optimizer.apply_gradients(zip(discriminator_gradients,
                                                discriminator.trainable_variables))

    with summary_writer.as_default():
        tf.summary.scalar('gen_total_loss', gen_total_loss, step=step//1000)
        tf.summary.scalar('gen_gan_loss', gen_gan_loss, step=step//1000)
        tf.summary.scalar('gen_l1_loss', gen_l1_loss, step=step//1000)
        tf.summary.scalar('disc_loss', disc_loss, step=step//1000)
As an aside, this is the order in which nested with blocks are entered and exited:

with A:
    with B:
        X()
    Y()

1. A __enter__
2. B __enter__
3. X()
4. B __exit__
5. Y()
6. A __exit__
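A runnable version of that ordering, using a throwaway context manager (purely illustrative, not part of the pix2pix code):
from contextlib import contextmanager

@contextmanager
def scope(name):
    print(name, '__enter__')
    yield
    print(name, '__exit__')

with scope('A'):
    with scope('B'):
        print('X()')
    print('Y()')
# prints: A __enter__, B __enter__, X(), B __exit__, Y(), A __exit__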
def fit(train_ds, test_ds, steps):
    example_input, example_target = next(iter(test_ds.take(1)))
    start = time.time()

    for step, (input_image, target) in train_ds.repeat().take(steps).enumerate():
        if (step) % 1000 == 0:
            display.clear_output(wait=True)
            if step != 0:
                print(f'Time taken for 1000 steps: {time.time()-start:.2f} sec\n')
            start = time.time()
            generate_images(generator, example_input, example_target)
            print(f"Step: {step//1000}k")

        train_step(input_image, target, step)

        if (step+1) % 10 == 0:
            print('.', end='', flush=True)

        if (step + 1) % 5000 == 0:
            checkpoint.save(file_prefix=checkpoint_prefix)
# Roughly 40,000 weight updates in total (GANs are hard to train, so the step count has to be set high)
fit(train_dataset, test_dataset, steps=40000)
# Time taken for 1000 steps: 159.76 sec
for inp, tar in test_dataset.take(5):
generate_images(generator, inp, tar)
for inp, tar in test_dataset.take(6):
generate_images(generator, inp, tar)