
Is there a problem with sample loading? #308

@lkq111

Description

    model = None
    if args.hf_model is not None:
        model = create_wrapped_hf_model(args)
    else:
        # Optional: Use meta device
        with torch.device("meta" if args.experimental_meta_device and args.fsdp else args.device):
            model = create_model(args)

    args.vocab_size = model.vocab_size
    args.seq_len = model.seq_len
    if args.train_num_samples is not None:
        print(f"Training num samples (tokens): {args.train_num_samples} / seq_len: {args.seq_len} = {args.train_num_samples // args.seq_len} sequences")
        args.train_num_samples //= args.seq_len
    if args.val_num_samples is not None:
        if args.val_num_samples // args.seq_len == 0:
            raise ValueError(
                f"number of requested evaluation val_num_samples (tokens): {args.val_num_samples} is less than seq_len: {args.seq_len}"
            )
        args.val_num_samples //= args.seq_len

Why is args.train_num_samples //= args.seq_len done here? I don't understand this step, could you please explain?
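
For reference, my reading (not confirmed by the maintainers) is that train_num_samples is given as a token budget on the command line, while the data loader counts samples as fixed-length sequences of seq_len tokens, so the floor division converts a token count into a sequence count. A minimal sketch of that interpretation, with made-up numbers:

    # Sketch only: assumes train_num_samples is a token budget and each
    # training sample is one sequence of seq_len tokens (not confirmed here).
    train_num_samples = 1_000_000_000   # hypothetical: 1B tokens requested
    seq_len = 2048                      # hypothetical context length

    num_sequences = train_num_samples // seq_len
    print(f"{train_num_samples} tokens / {seq_len} tokens per sequence "
          f"= {num_sequences} sequences")   # -> 488281 sequences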
