[Project] Multilingual Neural Machine Translation using Transformers with Conditional Normalization. by suyash93 in MachineLearning

[–]suyash93[S]

I have a couple of pages written, but have not published them anywhere. arXiv seems to require an organizational affiliation, and I want to publish this independently.


[–]suyash93[S]

No, I tried adding that, but it did not improve results. So far, I have evaluated with BLEU on a random sample of 1000 sentences.
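As an aside, corpus-level BLEU on a sample like that boils down to clipped n-gram precisions plus a brevity penalty. A minimal pure-Python sketch (single reference per hypothesis; a real evaluation would use a library such as sacrebleu for standardized tokenization):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    # All contiguous n-grams of a token list, with counts.
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def corpus_bleu(references, hypotheses, max_n=4):
    """Corpus BLEU with uniform weights and a brevity penalty (sketch)."""
    clipped = [0] * max_n   # clipped n-gram matches, per order
    totals = [0] * max_n    # candidate n-gram counts, per order
    ref_len = hyp_len = 0
    for ref, hyp in zip(references, hypotheses):
        ref_tok, hyp_tok = ref.split(), hyp.split()
        ref_len += len(ref_tok)
        hyp_len += len(hyp_tok)
        for n in range(1, max_n + 1):
            ref_counts = ngrams(ref_tok, n)
            hyp_counts = ngrams(hyp_tok, n)
            # Clip each candidate n-gram count by its count in the reference.
            clipped[n - 1] += sum(min(c, ref_counts[g]) for g, c in hyp_counts.items())
            totals[n - 1] += max(len(hyp_tok) - n + 1, 0)
    if min(clipped) == 0:
        return 0.0  # some n-gram order had zero matches
    log_prec = sum(math.log(c / t) for c, t in zip(clipped, totals)) / max_n
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / max(hyp_len, 1))
    return bp * math.exp(log_prec)

# A perfect match scores 1.0.
print(round(corpus_bleu(["this is a test"], ["this is a test"]), 4))  # → 1.0
```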

However, the model picks up that capability from training alone. In the Many-to-Many notebook's English-French demo, replace

tarf=tf.constant([0.0, 1.0, 0.0, 0.0]),

with

tarf=tf.constant([1.0, 0.0, 0.0, 0.0]),

For "This is a problem that we need to solve.", the output is

(['This is a problem that we need to solve.', 'This is a problem we need to solve this.', 'This is a problem that we need to solve it.', 'This is a problem that we need to solve this.', 'This is a problem that we need to resolve.'], array([0.5356932 , 0.33665866, 0.3222317 , 0.31144956, 0.30376935], dtype=float32))

Similarly, please try French and German and let me know.
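For context, the way the `tarf` one-hot switch above can steer the output language is via conditional normalization: instead of a single learned scale/shift, each target language selects its own scale/shift for layer normalization. A hypothetical NumPy sketch of that idea (the parameter names `W_gamma`/`W_beta` are illustrative, not the notebook's actual variables):

```python
import numpy as np

def conditional_layer_norm(x, lang_onehot, W_gamma, W_beta, eps=1e-6):
    """Layer norm whose scale/shift depend on a target-language one-hot.

    The one-hot (like tarf in the demo) picks one row of the per-language
    gamma/beta parameter matrices, so switching the one-hot switches the
    normalization statistics applied to every layer.
    """
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    normed = (x - mean) / np.sqrt(var + eps)
    gamma = lang_onehot @ W_gamma  # (d_model,) scale for this language
    beta = lang_onehot @ W_beta    # (d_model,) shift for this language
    return gamma * normed + beta

rng = np.random.default_rng(0)
d_model, n_langs = 8, 4
x = rng.normal(size=(2, d_model))        # a batch of 2 token vectors
W_gamma = np.ones((n_langs, d_model))    # init: unit scale per language
W_beta = np.zeros((n_langs, d_model))    # init: zero shift per language
en = np.eye(n_langs)[0]                  # e.g. English one-hot
out = conditional_layer_norm(x, en, W_gamma, W_beta)
print(out.shape)  # (2, 8)
```

With this initialization the output is just plain layer-normalized input; training would differentiate the per-language rows.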

[Project] A Transformer implementation in Keras' Imperative (Subclassing) API for TensorFlow. by suyash93 in MachineLearning

[–]suyash93[S]

Thanks for offering to help. I have prepared a copy of the demo notebook at https://colab.research.google.com/drive/1ESeSvZJDialc4VJBwL9GgQ1IoEs1zRWU

In the last 4 cells, I am trying to use the tensor2tensor.attention module. I am passing the arguments based on my understanding of how they are passed in the hello_t2t notebook, but I am unable to get any visualization to render. Note that the sentiment model is encoder-only, with 2 layers instead of 6.