Commit 13a0db0

Aligns repo links with new name
Points contribution guide links at flash-sparse-attention to avoid outdated references.
1 parent 9f5d48d

1 file changed: +8 -8 lines changed

CONTRIBUTING.md

Lines changed: 8 additions & 8 deletions
@@ -4,7 +4,7 @@ Everyone is welcome to contribute, and we value everybody's contribution. Code c

It also helps us if you spread the word! Reference the library in blog posts about the awesome projects it made possible, shout out on Twitter every time it has helped you, or simply ⭐️ the repository to say thank you.

-However you choose to contribute, please be mindful and respect our [code of conduct](https://github.com/SmallDoges/flash-dmattn/blob/main/CODE_OF_CONDUCT.md).
+However you choose to contribute, please be mindful and respect our [code of conduct](https://github.com/SmallDoges/flash-sparse-attention/blob/main/CODE_OF_CONDUCT.md).

## Ways to contribute

@@ -16,7 +16,7 @@ There are several ways you can contribute to Flash-DMA:
* Contribute to the examples, benchmarks, or documentation.
* Improve CUDA kernel performance.

-If you don't know where to start, there is a special [Good First Issue](https://github.com/SmallDoges/flash-dmattn/contribute) listing. It will give you a list of open issues that are beginner-friendly and help you start contributing to open-source.
+If you don't know where to start, there is a special [Good First Issue](https://github.com/SmallDoges/flash-sparse-attention/contribute) listing. It will give you a list of open issues that are beginner-friendly and help you start contributing to open-source.

> All contributions are equally valuable to the community. 🥰

@@ -81,14 +81,14 @@ You will need basic `git` proficiency to contribute to Flash-DMA. You'll need **

### Development Setup

-1. Fork the [repository](https://github.com/SmallDoges/flash-dmattn) by clicking on the **Fork** button.
+1. Fork the [repository](https://github.com/SmallDoges/flash-sparse-attention) by clicking on the **Fork** button.

2. Clone your fork to your local disk, and add the base repository as a remote:

```bash
-git clone https://github.com/<your Github handle>/flash-dmattn.git
-cd flash-dmattn
-git remote add upstream https://github.com/SmallDoges/flash-dmattn.git
+git clone https://github.com/<your Github handle>/flash-sparse-attention.git
+cd flash-sparse-attention
+git remote add upstream https://github.com/SmallDoges/flash-sparse-attention.git
```

3. Create a new branch to hold your development changes:
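The hunk ends at step 3, so the guide's actual branch command is not shown here. As an illustration only, branch creation for a change like this one typically looks as follows (the branch name is hypothetical):

```bash
# Create and switch to a feature branch off your fork's default branch (example name only)
git checkout -b align-repo-links
```
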
@@ -157,7 +157,7 @@ You will need basic `git` proficiency to contribute to Flash-DMA. You'll need **

### Tests

-An extensive test suite is included to test the library behavior and performance. Tests can be found in the [tests](https://github.com/SmallDoges/flash-dmattn/tree/main/tests) folder and benchmarks in the [benchmarks](https://github.com/SmallDoges/flash-dmattn/tree/main/benchmarks) folder.
+An extensive test suite is included to test the library behavior and performance. Tests can be found in the [tests](https://github.com/SmallDoges/flash-sparse-attention/tree/main/tests) folder and benchmarks in the [benchmarks](https://github.com/SmallDoges/flash-sparse-attention/tree/main/benchmarks) folder.

We use `pytest` for testing. From the root of the repository, run:
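The pytest invocation itself falls outside this hunk. As a minimal sketch, assuming the tests live in the linked tests/ folder, running the suite from the repository root would look like:

```bash
# Run the test suite from the repository root (illustrative; the guide may specify extra flags)
pytest tests/
```
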

@@ -200,6 +200,6 @@ If you discover a security vulnerability, please send an e-mail to the maintaine

## Questions?

-If you have questions about contributing, feel free to ask in the [GitHub Discussions](https://github.com/SmallDoges/flash-dmattn/discussions) or open an issue.
+If you have questions about contributing, feel free to ask in the [GitHub Discussions](https://github.com/SmallDoges/flash-sparse-attention/discussions) or open an issue.

Thank you for contributing to Flash Dynamic Mask Attention! 🚀
