Ever since it was proposed that GANs be trained with the Wasserstein distance as the metric, GANs have commonly been viewed as a transportation problem. As mentioned in a previous post, viewing a GAN as a transportation problem lets some of the computation be simplified by relating a kernel in the discriminator to the generator.
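For a quick bit of intuition about the Wasserstein distance itself, here is a tiny sketch, assuming SciPy is available. The samples and distribution parameters are made up purely for illustration; the distance is simply the minimal cost of transporting one empirical distribution onto the other.

```python
import numpy as np
from scipy.stats import wasserstein_distance

# Two 1-D empirical distributions (samples), standing in for "real" and "generated" data.
rng = np.random.default_rng(0)
real = rng.normal(loc=0.0, scale=1.0, size=5000)
fake = rng.normal(loc=2.0, scale=1.0, size=5000)

# Wasserstein-1 distance: the minimal "mass transport" cost between the two sample sets.
# For these two Gaussians it should come out close to the mean shift of 2.
print(wasserstein_distance(real, fake))
```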
GANs can be applied to the word translation problem too. In a recent preprint on arXiv (arXiv:1710.04087), adversarial training in the spirit of Wasserstein GAN was used to learn a translation mapping between the word embeddings of two languages, given no parallel data between them. The translation mapping plays the role of the generator, and its quality can be described in terms of the distance between the mapped source embeddings and the target embeddings. Cross-domain similarity local scaling (CSLS) is used as the criterion for retrieving translations. Their experiments include English-Russian and English-Chinese mappings.
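Below is a minimal sketch of the adversarial-mapping idea, assuming PyTorch. The random embeddings, network sizes, and hyperparameters are placeholders, and the loss shown is a plain GAN cross-entropy loss rather than the paper's exact objective; the point is only that the generator is a single linear map from one embedding space to the other, and the discriminator tries to tell mapped source vectors from genuine target vectors.

```python
import torch
import torch.nn as nn

# Toy stand-ins for pretrained monolingual word embeddings;
# in the real setting these would be loaded from disk (e.g. fastText vectors).
dim, n_src, n_tgt = 50, 1000, 1000
src_emb = torch.randn(n_src, dim)
tgt_emb = torch.randn(n_tgt, dim)

# Generator: a single linear mapping W from the source space to the target space.
mapping = nn.Linear(dim, dim, bias=False)

# Discriminator: classifies whether a vector is a mapped source vector or a target vector.
discriminator = nn.Sequential(
    nn.Linear(dim, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1), nn.Sigmoid(),
)

bce = nn.BCELoss()
opt_d = torch.optim.SGD(discriminator.parameters(), lr=0.1)
opt_g = torch.optim.SGD(mapping.parameters(), lr=0.1)
batch = 32

for step in range(1000):
    src_batch = src_emb[torch.randint(n_src, (batch,))]
    tgt_batch = tgt_emb[torch.randint(n_tgt, (batch,))]

    # Discriminator step: mapped source vectors are "fake", target vectors are "real".
    with torch.no_grad():
        fake = mapping(src_batch)
    d_loss = bce(discriminator(fake), torch.zeros(batch, 1)) + \
             bce(discriminator(tgt_batch), torch.ones(batch, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: update the mapping so the discriminator labels mapped vectors "real".
    g_loss = bce(discriminator(mapping(src_batch)), torch.ones(batch, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

After training a mapping like this, translations are retrieved by nearest-neighbor search between the mapped source embeddings and the target embeddings, which is where a criterion such as CSLS comes in.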
It seems to work. Given that GANs sometimes fail to train for unclear reasons, it is exciting that this approach works.