
OP here:

https://towardsdatascience.com/types-of-convolutions-in-deep-learning-717013397f4d This has helped me understand how a deconvolutional layer (i.e. transposed convolution) works! Note to self: don't google "deconvolution" or "upconvolution"; google "transposed convolution" instead.

Update 2: http://deeplearning.net/software/theano/tutorial/conv_arithmetic.html

It turns out "deconvolution" is just convolution with different arithmetic. You can take the transpose of the convolution matrix, or add enough padding, so that:

  1. You upsample instead of downsampling.

  2. The connectivity pattern from the forward convolution is preserved, i.e. each value in the upsampled output is still connected to the same region of the smaller input that produced it.
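To convince myself, a minimal NumPy sketch (my own toy example, not from the linked tutorial): write a 1-D valid convolution as a matrix multiply with a matrix `C`, then multiply by `C.T` to go back up. The transpose restores the input's shape and the connectivity pattern, but it is not a true inverse of the convolution.

```python
import numpy as np

# 1-D convolution (kernel size 3, stride 1, no padding) as a matrix multiply.
kernel = np.array([1.0, 2.0, 3.0])
n_in = 5
n_out = n_in - len(kernel) + 1  # valid convolution: 5 -> 3

# Build the (n_out x n_in) convolution matrix C.
C = np.zeros((n_out, n_in))
for i in range(n_out):
    C[i, i:i + len(kernel)] = kernel

x = np.arange(1.0, n_in + 1)  # input: [1, 2, 3, 4, 5]
y = C @ x                     # downsample: length 5 -> length 3

# "Transposed convolution": C.T maps length 3 back up to length 5.
# Each entry of x_up only receives contributions from the y-entries
# it influenced in the forward pass -- the connectivity is preserved.
x_up = C.T @ y

print(y.shape, x_up.shape)  # (3,) (5,)
```

Same idea in 2-D: frameworks flatten the image, build the (much bigger, sparse) convolution matrix, and a transposed-convolution layer is literally multiplication by its transpose.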
