OP here:
https://towardsdatascience.com/types-of-convolutions-in-deep-learning-717013397f4d This has helped me understand how a deconvolutional layer (transposed convolution) works! Note to self: don't google "deconvolution" or "upconvolution", google "transposed convolution" instead.
Update 2: http://deeplearning.net/software/theano/tutorial/conv_arithmetic.html
It turns out "deconvolution" is just convolution with different arithmetic: you can take the transpose of the convolution matrix, or add enough zero padding, so that you upsample instead of downsampling.
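To make "take the transpose" concrete, here's a minimal NumPy sketch (my own toy example, not from the linked articles): a stride-1, 1-D convolution written as a matrix multiply, where multiplying by the transpose of that same matrix maps the small output back up to the input size.

```python
import numpy as np

# Toy example: express a 1-D "valid" convolution (cross-correlation, as in
# deep learning) as a matrix multiply, then reuse the transpose of that
# matrix to go from the small output back to the large input size.

x = np.array([1., 2., 3., 4.])   # length-4 input
k = np.array([1., 0., -1.])      # length-3 kernel

n_in, n_k = len(x), len(k)
n_out = n_in - n_k + 1           # 4 - 3 + 1 = 2

# Build the (2 x 4) convolution matrix C so that C @ x is the convolution.
C = np.zeros((n_out, n_in))
for i in range(n_out):
    C[i, i:i + n_k] = k

y = C @ x        # downsampling direction: length 4 -> length 2
x_up = C.T @ y   # "transposed convolution": length 2 -> length 4

print(y.shape, x_up.shape)   # (2,) (4,)
```

Same idea in 2-D, just with a bigger sparse matrix; the padding trick from the Theano tutorial gives the same shapes by running an ordinary convolution over a zero-padded (and zero-interleaved, for stride > 1) input.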
The relation from the link above still has to hold: what I mean is that each element of the upsampled output needs to keep its connection to the corresponding patch of the smaller input, i.e. the same connectivity pattern as the forward convolution, just run in reverse.
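A quick way to check that reversed connectivity (hedged sketch, assuming PyTorch's nn.Conv2d / nn.ConvTranspose2d, which aren't mentioned in the articles): give both layers the same kernel size, stride, and padding, and the transposed conv maps the small feature map back to the shape of the original input.

```python
import torch
import torch.nn as nn

# Forward conv downsamples 8x8 -> 4x4; the transposed conv with the same
# hyperparameters (plus output_padding to resolve the rounding ambiguity)
# upsamples 4x4 -> 8x8, reversing the forward conv's shape arithmetic.
conv = nn.Conv2d(1, 1, kernel_size=3, stride=2, padding=1, bias=False)
deconv = nn.ConvTranspose2d(1, 1, kernel_size=3, stride=2, padding=1,
                            output_padding=1, bias=False)

big = torch.randn(1, 1, 8, 8)
small = conv(big)    # 8x8 -> 4x4
up = deconv(small)   # 4x4 -> 8x8

print(small.shape, up.shape)   # [1, 1, 4, 4] and [1, 1, 8, 8]
```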