
How do I get an intermediate layer's output in TF2?


I wrote a model using model subclassing:

'''
class Block(tf.keras.Model):

    def __init__(self,index,is_train_bn,channel_axis):
        super().__init__()
        prefix = 'block' + str(index + 5)
        self.is_train_bn=is_train_bn
        self.sepconv1_act = layers.Activation('relu',name=prefix + '_sepconv1_act')
        self.sepconv1 = layers.SeparableConv2D(728,(3,3),padding='same',use_bias=False,name=prefix + '_sepconv1')
        self.sepconv1_bn = layers.BatchNormalization(axis=channel_axis,name=prefix + '_sepconv1_bn')
        self.sepconv2_act = layers.Activation('relu',name=prefix + '_sepconv2_act')
        self.sepconv2 = layers.SeparableConv2D(728,(3,3),padding='same',use_bias=False,name=prefix + '_sepconv2')
        self.sepconv2_bn = layers.BatchNormalization(axis=channel_axis,name=prefix + '_sepconv2_bn')
        self.sepconv3_act = layers.Activation('relu',name=prefix + '_sepconv3_act')
        self.sepconv3 = layers.SeparableConv2D(728,(3,3),padding='same',use_bias=False,name=prefix + '_sepconv3')
        self.sepconv3_bn = layers.BatchNormalization(axis=channel_axis,name=prefix + '_sepconv3_bn')

    def call(self,x,training=False):
        residual = x
        x=self.sepconv1_act(x)
        x=self.sepconv1(x)
        x=self.sepconv1_bn(x,training=self.is_train_bn)
        x=self.sepconv2_act(x)
        x=self.sepconv2(x)
        x=self.sepconv2_bn(x,training=self.is_train_bn)
        x=self.sepconv3_act(x)
        x=self.sepconv3(x)
        x=self.sepconv3_bn(x,training=self.is_train_bn)
        return x+residual

'''

When I try to print x, I get this error:

'Cannot convert a symbolic Tensor (block1_conv1_act_1/Relu:0) to a numpy array'.

Solution

To print out "x" from the middle of the model, you can apply the approach illustrated below (the code is adapted from your example). Once you build that kind of "monitoring model", you can easily get hold of "x_to_probe":


...in this example, an arbitrary input shape is used just for illustration.

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

channel_axis=1
prefix='hmmm...'

sepconv1_act = layers.Activation('relu',name=prefix + '_sepconv1_act')
sepconv1 = layers.SeparableConv2D(728,(3,3),padding='same',use_bias=False,name=prefix + '_sepconv1')
sepconv1_bn = layers.BatchNormalization(axis=channel_axis,name=prefix + '_sepconv1_bn')
sepconv2_act = layers.Activation('relu',name=prefix + '_sepconv2_act')
sepconv2 = layers.SeparableConv2D(728,(3,3),padding='same',use_bias=False,name=prefix + '_sepconv2')
sepconv2_bn = layers.BatchNormalization(axis=channel_axis,name=prefix + '_sepconv2_bn')
sepconv3_act = layers.Activation('relu',name=prefix + '_sepconv3_act')
sepconv3 = layers.SeparableConv2D(728,(3,3),padding='same',use_bias=False,name=prefix + '_sepconv3')
sepconv3_bn = layers.BatchNormalization(axis=channel_axis,name=prefix + '_sepconv3_bn')

#Unlike in the subclassed block, x is not a function argument here; it comes from the Input tensor below,
#so the residual line from the original block is omitted.
#residual = x

is_train_bn=True

#x=self.sepconv1_act(x)
inputs=keras.Input(shape=(1,16,16))
x=sepconv1_act(inputs)
x=sepconv1(x)
x=sepconv1_bn(x,training=is_train_bn)
x=sepconv2_act(x)
x=sepconv2(x)
x=sepconv2_bn(x,training=is_train_bn)
x=sepconv3_act(x)
x_to_probe=sepconv3(x)   #this is the intermediate tensor we want to inspect
x=sepconv3_bn(x_to_probe,training=is_train_bn)

model=keras.Model(inputs=inputs,outputs=x,name="example for Wayne")
model.summary()

#Let's take x out..
model_for_monitoring_selected_x=keras.Model(inputs=inputs,outputs=x_to_probe,name="example for Wayne to print x")
model_for_monitoring_selected_x.summary()
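
To actually print x, call the monitoring model on some data: the call runs eagerly, so the result is an EagerTensor on which .numpy() works. A minimal sketch, assuming a random batch of one with the example input shape (1,16,16) used above (the name dummy_batch is made up for illustration):

#Feed a random batch through the probe model and print the intermediate activation
dummy_batch = tf.random.uniform((1,1,16,16))   #batch of 1, same shape as keras.Input above
x_value = model_for_monitoring_selected_x(dummy_batch)   #eager call -> EagerTensor
print(x_value.shape)
print(x_value.numpy())   #works here, unlike on the symbolic tensor inside the subclassed call()

Since model_for_monitoring_selected_x is built from the same layer objects as model, the two models share weights, so probing x does not duplicate or retrain anything.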
