Questions tagged [autoencoder]
Autoencoders are a type of neural network that learns a useful encoding for data in an unsupervised manner.
141 questions with no upvoted or accepted answers
6 votes
0 answers
153 views
Unable to transform a (well-performing) Autoencoder into a Variational Autoencoder
Following the procedure described in this SO question, I am trying to transform my (well-performing) convolutional Autoencoder into a Variational version of the same Autoencoder. As explained in ...
4 votes
1 answer
886 views
Why KL Divergence instead of Cross-entropy in VAE
I understand how KL divergence provides us with a measure of how one probability distribution is different from a second, reference probability distribution. But why are they particularly used (...
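For reference, the KL term that appears in the standard Gaussian VAE objective has a closed form, KL(N(μ, σ²) || N(0, 1)) = ½ Σ (σ² + μ² − 1 − log σ²). A minimal NumPy sketch of that term (values are illustrative, not from any particular question):

```python
import numpy as np

def kl_to_standard_normal(mu, logvar):
    """Closed-form KL(N(mu, sigma^2) || N(0, 1)), summed over latent dims.

    Uses the identity 0.5 * sum(sigma^2 + mu^2 - 1 - log(sigma^2)),
    with the encoder assumed to output log-variance.
    """
    return 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar)

# KL is zero exactly when the approximate posterior equals the prior N(0, 1)
mu = np.zeros(4)
logvar = np.zeros(4)
print(kl_to_standard_normal(mu, logvar))       # 0.0

# ...and grows as the posterior moves away from the prior
print(kl_to_standard_normal(np.ones(4), logvar))  # 2.0
```

The cross-entropy, by contrast, typically shows up as the reconstruction part of the VAE loss rather than as the term matching the approximate posterior to the prior.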
4 votes
2 answers
4k views
Autoencoders for the compression of time series
I am trying to use autoencoders (simple, convolutional, LSTM) to compress time series. Here are the models I tried. Simple autoencoder: ...
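A useful baseline for this kind of compression task is a linear autoencoder, whose optimum is spanned by the top principal components, so the encoder can be read off an SVD directly. A minimal NumPy sketch (the toy signal, window length, and code size are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy time series: a slow sine plus noise, cut into windows of 32 samples
t = np.arange(4096)
series = np.sin(2 * np.pi * t / 128) + 0.05 * rng.standard_normal(t.size)
windows = series.reshape(-1, 32)          # (128, 32)

# A linear autoencoder's optimum lies in the span of the top principal
# components, so encoder/decoder can be taken straight from an SVD.
mean = windows.mean(axis=0)
X = windows - mean
_, _, Vt = np.linalg.svd(X, full_matrices=False)
code_dim = 4                               # 32 -> 4: an 8x compression
W = Vt[:code_dim].T                        # (32, 4) encoder; decoder is W.T

codes = X @ W                              # compressed representation
recon = codes @ W.T + mean                 # reconstruction

mse = np.mean((windows - recon) ** 2)
print(f"compression 32 -> {code_dim}, reconstruction MSE = {mse:.4f}")
```

Any nonlinear architecture (convolutional, LSTM) would then have to beat this baseline's reconstruction error to justify its extra cost.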
3 votes
0 answers
560 views
Autoencoder: Size of out_backprop doesn't match computed
This question was asked before and none of the answers worked for me. I have the code ...
3 votes
0 answers
386 views
Chess deep learning siamese network overfitting when shouldn't in theory
TLDR: My network is training with pairs, so instead of 10^6 samples it has 10^12 samples (the number of samples squared). With that large a dataset it shouldn't overfit, but it does after very few ...
3 votes
0 answers
654 views
What is the difference between KL-divergence, JS-divergence, Wasserstein distance and MMD?
I was reading about different distribution distances, and came across the Kullback-Leibler divergence, Jensen-Shannon divergence, Wasserstein distance, and Maximum mean discrepancy (MMD). The book was too ...
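The first two of these are cheap to compute for discrete distributions, which makes their basic properties easy to check: KL is asymmetric, while JS is symmetric and bounded by log 2 (Wasserstein and MMD need more machinery, e.g. an optimal-transport solver or a kernel). A small sketch with made-up distributions:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0                           # 0 * log(0/q) is taken as 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js(p, q):
    """Jensen-Shannon divergence: symmetrised KL against the mixture."""
    m = 0.5 * (np.asarray(p, float) + np.asarray(q, float))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.7, 0.2, 0.1]
q = [0.1, 0.3, 0.6]

print(kl(p, q), kl(q, p))   # asymmetric: the two directions differ
print(js(p, q), js(q, p))   # symmetric by construction, and <= log(2)
```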
3 votes
0 answers
162 views
What is an intuitive explanation for the Importance Weighted Autoencoder?
I have been reading the paper by Burda et al. on Importance Weighted Autoencoders (IWAE), but I can't quite grasp what they mean by sampling the terms h1...hk. Do they mean you have separate models from ...
3 votes
0 answers
116 views
Encoder-Decoder Sequence-to-Sequence Model for Translations in Both Directions
Is it possible to use a pre-trained sequence-to-sequence encoder-decoder model, which translates an input text in a source language to an output in a target language, to do the inverse translation? That is, ...
3 votes
0 answers
752 views
Autoencoder behavior with All White/Black MNIST
I am using a stock auto-encoder anomaly detector from Deeplearning4j. I was getting unexpected results from my own variant of the auto-encoder, which looks for anomalies in my own (non-image) data, ...
3 votes
0 answers
725 views
Using an autoencoder to mimic independent component analysis?
I'm trying to use autoencoders in Keras to create a linear transformation similar to independent component analysis (ICA) (using this to denoise electroencephalographic data, time series of 64x100000 ...
2 votes
1 answer
243 views
Why does the VAE Encoder output log variance and not standard deviation?
When talking about VAE (and viewing VAE implementations), the Encoder outputs: μ, log(variance) when we train the model (the ...
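One common rationale: predicting log σ² leaves the network output unconstrained (any real number maps to a strictly positive variance via exp), and both the reparameterisation trick and the closed-form KL are then simple to write down. A minimal sketch (the values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)

# The encoder may output any real number as log(variance)...
logvar = np.array([-3.0, 0.0, 2.5])

# ...and the implied standard deviation is always positive:
sigma = np.exp(0.5 * logvar)       # sqrt(exp(logvar))
assert np.all(sigma > 0)

# Reparameterisation trick: z = mu + sigma * eps, with eps ~ N(0, I),
# so gradients flow through mu and logvar but not the random draw.
mu = np.array([0.1, -0.2, 0.3])
eps = rng.standard_normal(3)
z = mu + sigma * eps
print(sigma)
print(z)
```

Had the encoder predicted σ directly, the network would need an explicit positivity constraint (e.g. a softplus), which is one reason the log-variance convention is widespread.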
2 votes
0 answers
57 views
How can I use autoencoders for noise detection and removal
How can I use autoencoders for noise detection and removal in a dataset with only 2 features and no labels? What should my architecture look like - for example 2 1 1 1 2, or something else? And does the output of ...
2 votes
0 answers
136 views
Convolutional autoencoder - why is the Keras example an asymmetric model?
I'm looking at the Keras convolutional autoencoder example, and am confused by the model structure: ...
2 votes
0 answers
27 views
Deep Continuous Clustering algorithm - just one output cluster
I use the DCC algorithm to cluster some data. The whole algorithm is available here, but in short it is: construct the mkNN graph of the data points (its connected components are the clusters). ...
2 votes
1 answer
1k views
How to Save Model that has a TensorFlow Probability Regularizer?
Consider the following minimal VAE: ...