add configurable prefix #2625


Closed
wants to merge 1 commit into from

Conversation

coconutruben
Contributor

Summary:

# Why

Make experiments easier to find.

# What

- Add a dynamo config option to provide a prefix.
- Use the prefix when sending data to Scuba through the `self.id_` field.
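The prefixing behavior described above can be sketched as follows. This is a minimal illustration, not the actual PyTorch implementation: the config name `benchmark_run_id_prefix` and the helper `make_run_id` are hypothetical stand-ins for whatever the dynamo config field and id-construction code are really called.

```python
import uuid
from typing import Optional

# Hypothetical config value; in the real change this would come from
# dynamo's config module, not a module-level variable.
benchmark_run_id_prefix: Optional[str] = "coconutruben-02"

def make_run_id(prefix: Optional[str]) -> str:
    """Build the id sent to Scuba, prepending the configured prefix if set."""
    run_id = str(uuid.uuid4())
    return f"{prefix}-{run_id}" if prefix else run_id
```

With the prefix set, ids take the form `<prefix>-<uuid>`, which matches the `run_id` values shown in the test plan and makes a batch of experiments easy to filter for.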

Differential Revision: D77837550

@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D77837550

pytorch-bot bot pushed a commit to pytorch/pytorch that referenced this pull request Jul 17, 2025
Summary:
X-link: pytorch/benchmark#2625


# Why

Make experiments easier to find.

# What

- Add a dynamo config option to provide a prefix.
- Use the prefix when sending data to Scuba through the `self.id_` field.

Test Plan:
```
# code edited to set the prefix as `coconutruben-02`
buck2 run mode/opt scripts/coconutruben/torchmm:experiment 2>&1 | tee /tmp/epx040
```

On Scuba:

```
| autotune_dtypes | autotune_offset | autotune_shape | autotune_strides | event | run_id |
| -----| -----| -----| -----| -----| ----- |
| "torch.float16, torch.float16" | "0, 0" | "4096x3008, 3008x2048" | "[3008, 1], [2048, 1]" | "mm_template_autotuning" | "coconutruben-02-e6bdccc5-6dcf-4d68-9a04-b34f2c6d94fd" |
| "torch.float16, torch.float16" | "0, 0" | "4096x3008, 3008x2048" | "[3008, 1], [2048, 1]" | "mm_template_autotuning" | "coconutruben-02-14165153-5842-4eaa-9e6c-3b0cbc016375" |

```

Rollback Plan:

Differential Revision: D77837550
coconutruben added a commit to coconutruben/benchmark that referenced this pull request Jul 17, 2025

coconutruben added a commit to coconutruben/pytorch that referenced this pull request Jul 18, 2025
Reviewed By: stashuk-olek
coconutruben added a commit to coconutruben/benchmark that referenced this pull request Jul 18, 2025
coconutruben added a commit to coconutruben/benchmark that referenced this pull request Jul 21, 2025
pytorch-bot bot pushed a commit to pytorch/pytorch that referenced this pull request Jul 21, 2025

coconutruben added a commit to coconutruben/pytorch that referenced this pull request Jul 21, 2025
@facebook-github-bot
Contributor

This pull request has been merged in 7ddd618.
