
Make the converter additionally support the deepseek-coder dense HF model #167

Open · wants to merge 17 commits into base: main

Conversation

leo-liuzy

Changes made:

  • Add support for converting the deepseek-coder dense model back and forth (HF ↔ OLMo-core), building on an existing PR.
  • Restructure RoPEScalingConfig to support linear scaling; previously the object only contained counterparts of Hugging Face's Llama 3 RoPE scaling fields, whereas deepseek-coder uses linear scaling (see the sketch below).
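
For context, a minimal sketch of what the restructured config could look like. The field names mirror Hugging Face's `rope_scaling` dict and are illustrative; they are not the repo's actual API:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RoPEScalingConfig:
    """Illustrative sketch: one config covering both scaling variants."""

    rope_type: str = "llama3"  # "llama3" or "linear"
    factor: float = 8.0
    # llama3-style fields; irrelevant when rope_type == "linear".
    low_freq_factor: Optional[float] = None
    high_freq_factor: Optional[float] = None
    original_max_position_embeddings: Optional[int] = None

    def scaled_position(self, pos: int) -> float:
        # Linear scaling simply divides each position index by the factor.
        if self.rope_type == "linear":
            return pos / self.factor
        raise NotImplementedError("llama3-style scaling is frequency-dependent")
```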

@dirkgr

```diff
 MODEL_CONFIG: TransformerConfig
-if HF_MODEL == "meta-llama/Llama-3.2-1B":
+if "Llama-3.2-1B" in HF_MODEL:
```
Member

Do we also need this for bigger Llama models?

Author

I changed it so that it works with a local cache dir. We are encountering an additional weird bug related to conversion and need some help from you; we will submit a separate issue about it. Would you prefer that we fold the fix for that issue into this PR, or should I keep it separate?
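
To illustrate the local-cache point with a hypothetical path (not one from the PR): an exact comparison fails as soon as HF_MODEL is a filesystem path, while the substring check still matches.

```python
# Hypothetical local snapshot path; it still contains the model name.
HF_MODEL = "/data/hf_cache/meta-llama/Llama-3.2-1B"

print(HF_MODEL == "meta-llama/Llama-3.2-1B")  # False: exact match breaks for local paths
print("Llama-3.2-1B" in HF_MODEL)             # True: substring match still works
```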

Member

Many small PRs are always better, if possible.

# Conflicts:
#	src/examples/huggingface/convert_checkpoint_to_hf.py
dirkgr (Member) commented Feb 19, 2025

@lingchensanwen, I merged current main into this. There were a fair number of conflicts, and I'm not sure I did it right. Can you check?

@lingchensanwen

> @lingchensanwen, I merged current main into this. There were a fair number of conflicts, and I'm not sure I did it right. Can you check?

@dirkgr, I think I've fixed this. Could you take a look at the new one?

Comment on lines 27 to 28
```python
SAVE_PATH = f"{os.environ['SHARE_RES_DIR']}/models/deepseek/olmo/deepseek-coder-1.3b-base"
SAVE_OVERWRITE = True
```
Member

This is hardcoded to deepseek, and the SAVE_OVERWRITE was probably a leftover statement? It should default to the safe setting, so people don't overwrite their stuff by accident.

Author

Sure, will fix.
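
One way to get the safe default, sketched here with a hypothetical SAVE_DIR environment variable and an explicit opt-in for overwriting; only HF_MODEL comes from the actual script:

```python
import os

HF_MODEL = "deepseek-ai/deepseek-coder-1.3b-base"  # set earlier in the real script

# Default to NOT overwriting; require an explicit opt-in via the environment.
SAVE_OVERWRITE = os.environ.get("SAVE_OVERWRITE", "0") == "1"

# Derive the save path from the model name instead of hardcoding deepseek.
SAVE_PATH = os.path.join(
    os.environ.get("SAVE_DIR", "./converted"),
    os.path.basename(HF_MODEL.rstrip("/")),
)

if os.path.exists(SAVE_PATH) and not SAVE_OVERWRITE:
    raise FileExistsError(f"{SAVE_PATH} exists; set SAVE_OVERWRITE=1 to replace it.")
```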

```python
)

elif "deepseek-coder-1.3b-base" in HF_MODEL:
```
Member

Does this have to happen for all DeepSeek models?

Author

Yes, we defined a class for this DeepSeek version following the OLMo-core setup; for other versions, users would need to define their own class in the same way. Also, I can change this to `elif HF_MODEL.startswith("deepseek-coder-1.3b-base"):` to follow the same format the repo already uses (a sketch of that dispatch is below).
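
A minimal, self-contained sketch of that dispatch shape; `llama_config()` and `deepseek_coder_config()` are hypothetical stand-ins for whatever TransformerConfig builders the script actually defines:

```python
# Hypothetical stand-ins for the script's actual TransformerConfig builders.
def llama_config():
    return {"family": "llama", "rope": "llama3"}

def deepseek_coder_config():
    return {"family": "deepseek-coder", "rope": "linear"}

HF_MODEL = "deepseek-coder-1.3b-base"  # or a local path / HF repo id

if "Llama-3.2-1B" in HF_MODEL:
    MODEL_CONFIG = llama_config()
elif HF_MODEL.startswith("deepseek-coder-1.3b-base"):
    MODEL_CONFIG = deepseek_coder_config()
else:
    raise ValueError(f"No converter config registered for {HF_MODEL!r}")
```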

```python
)
else:
    # Extract RoPE scaling config
    # from pdb import set_trace; set_trace()
```
Member

leftover debug code?

Author

Will remove in the next commit.


```python
with open(config_path, "r") as f:
    olmo_config_dict = json.load(f)["model"]
# import pdb; pdb.set_trace()
```
Member

leftover?

Author

Will remove in the next commit.

Author

Resolved all of the above in a new commit, @dirkgr.
