小言_互联网's blog

Fixing DAGAN errors


Error:

    net_h8 = ElementwiseLayer(layer=[net_h7, net], combine_fn=tf.add, name='h8/add')
TypeError: __init__() got an unexpected keyword argument 'layer'

Fix: the keyword is now prev_layer:

net_h8 = ElementwiseLayer(prev_layer=[net_h7, net], combine_fn=tf.add, name='h8/add')
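What ElementwiseLayer does with combine_fn can be sketched in plain Python, with lists standing in for tensors and operator.add standing in for tf.add (a toy stand-in, not the TensorLayer implementation):

```python
from functools import reduce
import operator

def elementwise_combine(prev_layer, combine_fn):
    # Apply combine_fn elementwise across all inputs in prev_layer,
    # mimicking ElementwiseLayer(prev_layer=[...], combine_fn=tf.add).
    def combine_pair(a, b):
        return [combine_fn(x, y) for x, y in zip(a, b)]
    return reduce(combine_pair, prev_layer)

net_h7 = [1.0, 2.0, 3.0]   # toy "tensor"
net = [0.5, 0.5, 0.5]
print(elementwise_combine([net_h7, net], operator.add))  # [1.5, 2.5, 3.5]
```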

Error:

__init__() got an unexpected keyword argument 'out_size'

Fix: DeConv2d no longer takes out_size; drop it and pass strides:

DeConv2d(conv8, 512, (4, 4), strides=(2, 2), padding='SAME',
         act=None, W_init=w_init, b_init=b_init, name='deconv7')
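The reason out_size could be dropped: with padding='SAME', a transposed convolution's spatial output size is determined by the input size and stride alone, so the layer can infer it. A hypothetical helper (not part of TensorLayer) showing that arithmetic:

```python
def deconv_same_out_size(in_size, stride):
    # For a transposed conv with padding='SAME',
    # output size = input size * stride.
    return in_size * stride

print(deconv_same_out_size(16, 2))  # 32: strides=(2, 2) doubles each spatial dim
```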

Warning:

This API is deprecated. Please use as `tf.nn.leaky_relu`

Fix: pass tf.nn.leaky_relu via a lambda so the negative slope can be set:

BatchNormLayer(conv5, act=lambda x: tf.nn.leaky_relu(x, 0.2),
               is_train=is_train, gamma_init=gamma_init, name='bn5')
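For reference, tf.nn.leaky_relu(x, 0.2) computes max(x, 0.2 * x); a scalar pure-Python sketch of the same function:

```python
def leaky_relu(x, alpha=0.2):
    # Identity for positive inputs, small linear slope for negative ones.
    return x if x > 0 else alpha * x

print(leaky_relu(3.0))   # 3.0
print(leaky_relu(-1.0))  # -0.2
```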

Error:

    idex = tl.utils.get_random_int(min=0, max=len(X_test) - 1, number=sample_size, seed=config.TRAIN.seed)
TypeError: get_random_int() got an unexpected keyword argument 'min'

Fix: the keywords are now min_v and max_v:

idex = tl.utils.get_random_int(min_v=0, max_v=len(X_test) - 1, number=sample_size, seed=config.TRAIN.seed)
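A minimal stand-in for tl.utils.get_random_int with the renamed min_v/max_v keywords (behavior assumed from its docs: return a list of `number` random integers in [min_v, max_v], reproducible via seed):

```python
import random

def get_random_int(min_v=0, max_v=10, number=5, seed=None):
    # Seeded local RNG so the same seed yields the same index list.
    rnd = random.Random(seed)
    return [rnd.randint(min_v, max_v) for _ in range(number)]

idx = get_random_int(min_v=0, max_v=99, number=4, seed=42)
print(idx)  # four integers between 0 and 99
```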

Error:

    x = zoom(x, zoom_range=[0.90, 1.10], is_random=True, fill_mode='constant')
TypeError: zoom() got an unexpected keyword argument 'is_random'

Fix: is_random has been removed. Per the docs, "zoom_range: float or tuple of 2 floats", so randomness is now implied by the form of zoom_range:

x = zoom(x, fill_mode='constant')
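A sketch of the newer contract, under the assumption that a 2-float zoom_range means "sample a factor uniformly from that interval" while a single float means a fixed factor; only the factor-sampling step is shown, the image warp itself is omitted:

```python
import random

def sample_zoom_factor(zoom_range, seed=None):
    # Tuple/list of 2 floats: pick a random factor in [lo, hi].
    if isinstance(zoom_range, (tuple, list)):
        lo, hi = zoom_range
        return random.Random(seed).uniform(lo, hi)
    # Single float: fixed zoom factor, no randomness.
    return float(zoom_range)

f = sample_zoom_factor((0.90, 1.10), seed=0)
print(0.90 <= f <= 1.10)  # True
```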

Solving the memory error: see https://blog.csdn.net/qq_41780295/article/details/89677453 (mainly step 3).


After the first loop finished, saving the checkpoint failed. The GPU memory allocation mode needs to be set (reference: https://blog.csdn.net/chen13017535518/article/details/84318499):

# sess = tf.Session(config=tf.ConfigProto(allow_soft_placement=True))
config_ses = tf.ConfigProto()
config_ses.gpu_options.allow_growth = True
sess = tf.Session(config=config_ses)
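For anyone on TensorFlow 2.x, where tf.Session and tf.ConfigProto are gone, the equivalent allow-growth setting is done through tf.config (a sketch, assuming TF 2.x and at least one visible GPU):

```python
import tensorflow as tf

# Ask TF to allocate GPU memory incrementally instead of grabbing
# it all at once, matching gpu_options.allow_growth = True in TF 1.x.
for gpu in tf.config.list_physical_devices('GPU'):
    tf.config.experimental.set_memory_growth(gpu, True)
```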

Reposted from: https://blog.csdn.net/qq_42724402/article/details/101846384