MoE-Visualizer is a tool designed to visualize the selection of experts in Mixture-of-Experts (MoE) models.

MoE-Visualizer

(demo)

Introduction

This project is a visualizer for Mixture of Experts (MoE) models. It provides a visual tool that helps users understand how experts are used in MoE models.

We designed a hook that can be mounted on a specific layer of an MoE model to record which experts are selected for each sample during inference. Aggregating these records lets us count how often each expert is used.
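The idea above can be sketched with a PyTorch forward hook. The toy gating module and names below (`ToyGate`, `record_experts`) are illustrative stand-ins, not the actual Qwen1.5-MoE internals:

```python
import torch
from collections import Counter

class ToyGate(torch.nn.Module):
    """Stand-in for an MoE router: scores each token against num_experts
    and returns the top-k expert indices, as a real gate would."""
    def __init__(self, hidden=8, num_experts=4, top_k=2):
        super().__init__()
        self.proj = torch.nn.Linear(hidden, num_experts)
        self.top_k = top_k

    def forward(self, x):
        logits = self.proj(x)                        # (tokens, num_experts)
        weights, experts = logits.topk(self.top_k, dim=-1)
        return weights, experts

expert_counts = Counter()

def record_experts(module, inputs, output):
    # output[1] holds the selected expert indices for every token
    _, experts = output
    expert_counts.update(experts.flatten().tolist())

gate = ToyGate()
# Mount the hook on the gating layer; it fires on every forward pass
handle = gate.register_forward_hook(record_experts)

tokens = torch.randn(16, 8)   # 16 tokens, hidden size 8
gate(tokens)
handle.remove()

print(dict(expert_counts))    # per-expert selection counts
```

With top-2 routing over 16 tokens, the counts always sum to 32; the same pattern transfers to a real model by registering the hook on its router module.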

This makes it a plug-and-play module that can be used with any MoE model; Qwen1.5-MoE-A2.7B is provided as an example.

What we have done

  • Visualize the usage of experts in the prefill and generation phases
  • Support batch processing
  • Support downloading data
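To illustrate the "downloading data" feature, the recorded counts can be serialized per phase. The phase and field names below (`prefill`, `generate`, `expert_0`, …) are assumptions for the sketch, not necessarily the tool's actual export format:

```python
import json

# Hypothetical per-phase usage counts collected by the hook
usage = {"prefill": {0: 12, 1: 3}, "generate": {0: 5, 1: 9}}

payload = json.dumps(
    {phase: {f"expert_{e}": n for e, n in counts.items()}
     for phase, counts in usage.items()},
    indent=2,
)
print(payload)
```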

Models we support

  • Qwen1.5-MoE-A2.7B

How to use

Step 1: Install the package

pip install -r requirements.txt

Step 2: Run the demo

python qwen1_5_moe.py

If this project helps you, please give us a star. 🌟
