* Update index.rst
updated nav w/ Slack and Discourse link
minor formatting tweaks to bulleted lists and markdown
typo fix
* Update CONTRIBUTING.md
added Slack and Discourse links
* Update README.md
added Discourse and Slack links
CONTRIBUTING.md (+1/-3)

@@ -77,9 +77,7 @@ For documentation edits, include:

 ## Question or Problem

-Go to: [GitHub Discussions](https://github.com/neuralmagic/sparsezoo/discussions/)
-
-Post all other questions including support or how to contribute. Don’t forget to search through existing discussions to avoid duplication! Thanks!
+Sign up or log in: **Deep Sparse Community** [Discourse Forum](https://discuss.neuralmagic.com/) and/or [Slack](https://discuss-neuralmagic.slack.com/). We are growing the community member by member and happy to see you there. Post all other questions including support or how to contribute. Don’t forget to search through existing discussions to avoid duplication! Thanks!
README.md (+4/-4)

@@ -68,8 +68,8 @@ Techniques for sparsification are all encompassing including everything from ind
 When implemented correctly, these techniques result in significantly more performant and smaller models with limited to no effect on the baseline metrics.
 For example, pruning plus quantization can give noticeable improvements in performance while recovering to nearly the same baseline accuracy.

-The Deep Sparse product suite builds on top of sparsification enabling you to easily apply the techniques to your datasets and models using recipe-driven approaches.
-Recipes encode the directions for how to sparsify a model into a simple, easily editable format.
+The Deep Sparse product suite builds on top of sparsification enabling you to easily apply the techniques to your datasets and models using recipe-driven approaches. Recipes encode the directions for how to sparsify a model into a simple, easily editable format.
+
 - Download a sparsification recipe and sparsified model from the [SparseZoo](https://github.com/neuralmagic/sparsezoo).
 - Alternatively, create a recipe for your model using [Sparsify](https://github.com/neuralmagic/sparsify).
 - Apply your recipe with only a few lines of code using [SparseML](https://github.com/neuralmagic/sparseml).
@@ -248,7 +248,7 @@ We appreciate contributions to the code, examples, and documentation as well as

 ## Join the Community

-For user help or questions about SparseZoo, use our [GitHub Discussions](https://www.github.com/neuralmagic/sparsezoo/discussions/). Everyone is welcome!
+For user help or questions about SparseZoo, sign up or log in: **Deep Sparse Community** [Discourse Forum](https://discuss.neuralmagic.com/) and/or [Slack](https://discuss-neuralmagic.slack.com/). We are growing the community member by member and happy to see you there.

 You can get the latest news, webinar and event invites, research papers, and other ML Performance tidbits by [subscribing](https://neuralmagic.com/subscribe/) to the Neural Magic community.

@@ -260,7 +260,7 @@ The project is licensed under the [Apache License Version 2.0](https://github.co
docs/index.rst (+7/-3)

@@ -66,6 +66,7 @@ For example, pruning plus quantization can give noticeable improvements in perfo
 The Deep Sparse product suite builds on top of sparsification enabling you to easily apply the techniques to your datasets and models using recipe-driven approaches.
 Recipes encode the directions for how to sparsify a model into a simple, easily editable format.
+
 - Download a sparsification recipe and sparsified model from the `SparseZoo <https://github.com/neuralmagic/sparsezoo>`_.
 - Alternatively, create a recipe for your model using `Sparsify <https://github.com/neuralmagic/sparsify>`_.
 - Apply your recipe with only a few lines of code using `SparseML <https://github.com/neuralmagic/sparseml>`_.