
ReLU and LeakyReLU

https://blog.csdn.net/weixin_37724529/article/details/109623344

PyTorch in depth (Part 3)

Notes on convolutions: add a ReLU after a convolution. A convolution can be viewed as a linear layer, so the activation is what introduces non-linearity. Sigmoid is prone to vanishing gradients, so plain ReLU works well, and LeakyReLU is another option.
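A minimal PyTorch sketch of the conv -> activation pattern described above (the channel counts and the negative_slope=0.1 value are arbitrary choices for illustration, not taken from the article):

import torch
import torch.nn as nn

# A convolution alone is linear; the activation after it supplies the non-linearity.
relu_block = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=1),
    nn.ReLU(inplace=True),
)

# LeakyReLU keeps a small gradient for negative inputs instead of zeroing them,
# which helps avoid "dead" units.
leaky_block = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=1),
    nn.LeakyReLU(negative_slope=0.1, inplace=True),
)

x = torch.randn(1, 3, 64, 64)
print(relu_block(x).shape, leaky_block(x).shape)  # both: torch.Size([1, 32, 64, 64])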

GAN: Generative Adversarial Networks

Source code:

# -*- coding: utf-8 -*-
# @Time : 2021/7/23
# @Author : pistachio
# @File : p26.py
# @Software : PyCharm
# GAN generator network
import keras
from keras import layers
import numpy as np
import os
from keras.preprocessing import image

latent_dim =
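The preview cuts off at latent_dim =. A minimal sketch of how such a Keras generator is commonly wired up, with LeakyReLU as the hidden activation; the value latent_dim = 32, the 32x32 RGB output shape, and all layer sizes are assumptions for illustration, not taken from the original p26.py:

import keras
from keras import layers

latent_dim = 32                       # assumed size of the random noise vector
height, width, channels = 32, 32, 3   # assumed output image shape

# Map the latent vector to a small feature map, then upsample it to an image.
generator_input = keras.Input(shape=(latent_dim,))
x = layers.Dense(128 * 16 * 16)(generator_input)
x = layers.LeakyReLU()(x)             # LeakyReLU is the usual choice in GAN generators
x = layers.Reshape((16, 16, 128))(x)
x = layers.Conv2DTranspose(256, 4, strides=2, padding="same")(x)  # 16x16 -> 32x32
x = layers.LeakyReLU()(x)
x = layers.Conv2D(channels, 7, activation="tanh", padding="same")(x)
generator = keras.models.Model(generator_input, x)
generator.summary()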

YOLOv3

1. Darknet-53 backbone
Input: 416x416x3
3x3 Conv2d, stride=1, padding=1 -> BatchNorm2d -> LeakyReLU : [1, 32, 416, 416]
Residual stages repeated [1, 2, 8, 8, 4] times, i.e. five scales with 32x total downsampling
layer1: repeated 1 time
3x3 Conv2d, stride=2, padding=1 -> BatchNorm2d -> LeakyReLU : [1, 64, 208, 208]
downsampling, ch
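A minimal PyTorch sketch of the Conv-BN-LeakyReLU unit the listing walks through; the 0.1 negative slope is the value commonly used in Darknet-53 implementations, so treat it as an assumption rather than a detail from this article:

import torch
import torch.nn as nn

def conv_bn_leaky(in_ch, out_ch, stride):
    # 3x3 Conv2d -> BatchNorm2d -> LeakyReLU, as in the Darknet-53 stem.
    # bias=False because the following BatchNorm2d has its own affine shift.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, stride=stride, padding=1, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.LeakyReLU(0.1, inplace=True),
    )

x = torch.randn(1, 3, 416, 416)
x = conv_bn_leaky(3, 32, stride=1)(x)    # [1, 32, 416, 416]
x = conv_bn_leaky(32, 64, stride=2)(x)   # [1, 64, 208, 208], first downsampling step
print(x.shape)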