README.md (+1 −1)
@@ -29,7 +29,7 @@ see the [full list](#citation) of our papers below.
 ## Example Use Cases

 This section lists projects that leverage hivemind for decentralized training.
-If you have succesfully trained a model or created a downstream repository with the help of our library,
+If you have successfully trained a model or created a downstream repository with the help of our library,
 feel free to submit a pull request that adds your project to this list.

 * **Petals** ([webpage](https://petals.ml), [code](https://github.com/bigscience-workshop/petals)) — a decentralized platform for inference and fine-tuning of 100B+ language models.