
MMLU COT Giving less accuracy #2770

Open
Rajshree-Sahu opened this issue Mar 7, 2025 · 0 comments

Rajshree-Sahu commented Mar 7, 2025

I was trying to evaluate the MMLU CoT 0-shot configuration. Since no configuration with that exact name is available, I went ahead with mmlu_flan_cot_zeroshot. However, this gives much lower accuracy, ranging between 28–33%, whereas the figure published by Hugging Face for Llama 3.1 8B is 73.0%. I tried updating the prompts, but there was no improvement. Can anyone please help with the MMLU CoT configuration?
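One common cause of a gap like this (a sketch of the failure mode, not the harness's actual scoring code) is answer extraction: CoT tasks score the model by parsing a final answer letter out of free-form reasoning text, and if the model's phrasing does not match the expected pattern, a correct chain of thought is marked wrong. The hypothetical `extract_answer` below illustrates how a strict pattern silently drops valid answers and deflates accuracy:

```python
import re

def extract_answer(text: str):
    """Illustrative parser: looks only for the phrase 'answer is (X)',
    the style many flan-CoT prompts ask the model to produce."""
    m = re.search(r"answer is \(?([ABCD])\)?", text)
    return m.group(1) if m else None

outputs = [
    "Let's think step by step. ... The answer is (B).",        # matched
    "Step-by-step reasoning ... So the correct choice is B.",  # missed
]
# The second completion is arguably correct but scores as a miss.
print([extract_answer(o) for o in outputs])  # ['B', None]
```

It may be worth inspecting a few raw model outputs from the evaluation run to check whether the model is actually answering in the format the task's filter expects.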
