Asking for the source code of Transformer/Multi. BERT/XLM #5
Comments
Could you please also share the settings for reproducing the reported results? This is what we get using your settings in the README file: Goal ACC: 0.4840, Joint ACC: 0.1306, Request ACC: 0.8092, Avg ACC: 0.4699.
Hi, in our experiments, using multilingual word embeddings can achieve comparable or sometimes better results than using Multi. BERT/XLM. If you want to use Multi. BERT/XLM, you can simply replace the word embeddings with the Multi. BERT/XLM embeddings. As for reproducing the results, we have provided the scripts in the README (in the "How to run" section). Thank you!
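For anyone wondering what that swap might look like in practice, here is a minimal sketch (not taken from this repo) of replacing a static word-embedding lookup with frozen multilingual BERT features via the HuggingFace `transformers` library. The model name, the `encode_batch` helper, and the use of the last hidden layer are illustrative assumptions, not the authors' actual setup:

```python
# Hypothetical sketch: use frozen Multi. BERT features in place of
# static multilingual word embeddings. Model choice and pooling are
# assumptions, not the authors' configuration.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-multilingual-cased")
bert = BertModel.from_pretrained("bert-base-multilingual-cased")
bert.eval()  # treated as a frozen feature extractor

@torch.no_grad()
def encode_batch(sentences):
    """Return one contextual vector per subword token (last hidden layer)."""
    batch = tokenizer(sentences, padding=True, truncation=True,
                      return_tensors="pt")
    outputs = bert(**batch)
    # Shape (batch, seq_len, 768): feed these in place of the output of
    # the word-embedding lookup; downstream encoder layers stay unchanged.
    return outputs.last_hidden_state, batch["attention_mask"]
```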
Thanks for your kind reply!

> As for reproducing the results, we have provided the scripts in the README (in the "How to run" section).

As for the model, I do not think it is only a matter of the embeddings. Could you please provide more details about the model and settings needed to reproduce the reported results?
Hi, the script you run should be able to reproduce the results of the MUSE model. The hyper-parameter settings are in the config.py file. Can you check whether you are using the correct embeddings we have provided in the data folder? Thanks. As for the code for Multi. BERT/XLM + Transformer, we are sorry that we did not provide it, since it is a bit messy in our codebase. If you need it, we can try to wrap up the corresponding code and upload it in the next few days. Thank you!
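As a quick sanity check on the embedding files, the sketch below loads a MUSE-style `.vec` file and verifies its dimensions. It assumes the provided embeddings use the standard word2vec text format (a header line with vocabulary size and dimension, then one word per line followed by its vector); the file path is hypothetical:

```python
# Hypothetical check that a MUSE-style .vec file loads cleanly.
# The path "data/wiki.multi.en.vec" is an assumption for illustration.
import numpy as np

def load_muse_vectors(path, max_words=50000):
    """Load a word2vec-format text file: header line, then 'word v1 ... vd'."""
    words, vectors = [], []
    with open(path, encoding="utf-8") as f:
        n, dim = map(int, f.readline().split())  # header: vocab size, dim
        for _ in range(min(n, max_words)):
            parts = f.readline().rstrip().split(" ")
            words.append(parts[0])
            vectors.append(np.asarray(parts[1:], dtype=np.float32))
    emb = np.stack(vectors)
    assert emb.shape[1] == dim, "dimension mismatch -- wrong file?"
    return words, emb

words, emb = load_muse_vectors("data/wiki.multi.en.vec")
print(len(words), emb.shape)  # e.g. 50000 (50000, 300)
```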
> Can you check whether you are using the correct embeddings we have provided in the data folder? Thanks.

> If you need it, we can try to wrap up the corresponding code and upload it in the next few days.
@zliucr |
Hi Zihan,
Thanks for contributing the source code of the paper https://arxiv.org/pdf/1911.09273.pdf.
In this source code, I could not find the Transformer/Multi. BERT/XLM models, even though they are the state-of-the-art models reported in the paper.
Could you please share these models or let us know how to reproduce them?
Regards,
Jiahuan