
Commit 5aa140b

Update repository links in CITATION.cff and CONTRIBUTING.md to point to the correct GitHub repository

1 parent 9b41255 · commit 5aa140b

2 files changed: +7 −7 lines changed


CITATION.cff

Lines changed: 1 addition & 1 deletion

@@ -2,7 +2,7 @@ cff-version: "1.2.0"
 date-released: 2025-06
 message: "If you use this software, please cite it using these metadata."
 title: "Flash Sparse Attention: Trainable Dynamic Mask Sparse Attention"
-url: "https://github.com/SmallDoges/flash-sparse-attention"
+url: "https://github.com/flash-algo/flash-sparse-attention"
 authors:
   - family-names: Shi
     given-names: Jingze

CONTRIBUTING.md

Lines changed: 6 additions & 6 deletions

@@ -4,7 +4,7 @@ Everyone is welcome to contribute, and we value everybody's contribution. Code c
 
 It also helps us if you spread the word! Reference the library in blog posts about the awesome projects it made possible, shout out on Twitter every time it has helped you, or simply ⭐️ the repository to say thank you.
 
-However you choose to contribute, please be mindful and respect our [code of conduct](https://github.com/SmallDoges/flash-sparse-attention/blob/main/CODE_OF_CONDUCT.md).
+However you choose to contribute, please be mindful and respect our [code of conduct](https://github.com/flash-algo/flash-sparse-attention/blob/main/CODE_OF_CONDUCT.md).
 
 ## Ways to contribute
 
@@ -16,7 +16,7 @@ There are several ways you can contribute to Flash-DMA:
 * Contribute to the examples, benchmarks, or documentation.
 * Improve CUDA kernel performance.
 
-If you don't know where to start, there is a special [Good First Issue](https://github.com/SmallDoges/flash-sparse-attention/contribute) listing. It will give you a list of open issues that are beginner-friendly and help you start contributing to open-source.
+If you don't know where to start, there is a special [Good First Issue](https://github.com/flash-algo/flash-sparse-attention/contribute) listing. It will give you a list of open issues that are beginner-friendly and help you start contributing to open-source.
 
 > All contributions are equally valuable to the community. 🥰
 
@@ -81,14 +81,14 @@ You will need basic `git` proficiency to contribute to Flash-DMA. You'll need **
 
 ### Development Setup
 
-1. Fork the [repository](https://github.com/SmallDoges/flash-sparse-attention) by clicking on the **Fork** button.
+1. Fork the [repository](https://github.com/flash-algo/flash-sparse-attention) by clicking on the **Fork** button.
 
 2. Clone your fork to your local disk, and add the base repository as a remote:
 
 ```bash
 git clone https://github.com/<your Github handle>/flash-sparse-attention.git
 cd flash-sparse-attention
-git remote add upstream https://github.com/SmallDoges/flash-sparse-attention.git
+git remote add upstream https://github.com/flash-algo/flash-sparse-attention.git
 ```
 
 3. Create a new branch to hold your development changes:
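The branch-creation command itself falls outside this hunk. As a hedged sketch of what that step typically involves in a fork-and-branch workflow (an assumption, not part of this commit's diff), it might look like:

```bash
# Create and switch to a feature branch for your changes
# (the branch name here is hypothetical)
git checkout -b update-repo-links

# Keep the branch based on the latest upstream main
git fetch upstream
git rebase upstream/main
```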
@@ -157,7 +157,7 @@ You will need basic `git` proficiency to contribute to Flash-DMA. You'll need **
 
 ### Tests
 
-An extensive test suite is included to test the library behavior and performance. Tests can be found in the [tests](https://github.com/SmallDoges/flash-sparse-attention/tree/main/tests) folder and benchmarks in the [benchmarks](https://github.com/SmallDoges/flash-sparse-attention/tree/main/benchmarks) folder.
+An extensive test suite is included to test the library behavior and performance. Tests can be found in the [tests](https://github.com/flash-algo/flash-sparse-attention/tree/main/tests) folder and benchmarks in the [benchmarks](https://github.com/flash-algo/flash-sparse-attention/tree/main/benchmarks) folder.
 
 We use `pytest` for testing. From the root of the repository, run:
 
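The exact command is cut off at this hunk boundary. A typical `pytest` invocation for a layout like this (an assumption, not shown in the diff) would be:

```bash
# Run the whole test suite from the repository root
pytest tests/

# Or run a single file with verbose output (file name is hypothetical)
pytest tests/test_flash_sparse_attention.py -v
```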
@@ -200,6 +200,6 @@ If you discover a security vulnerability, please send an e-mail to the maintaine
 
 ## Questions?
 
-If you have questions about contributing, feel free to ask in the [GitHub Discussions](https://github.com/SmallDoges/flash-sparse-attention/discussions) or open an issue.
+If you have questions about contributing, feel free to ask in the [GitHub Discussions](https://github.com/flash-algo/flash-sparse-attention/discussions) or open an issue.
 
 Thank you for contributing to Flash Sparse Attention! 🚀
