
Commit 612b85c

Aligns security docs with FSA naming
Clarifies security instructions under the Flash Sparse Attention brand so users follow the right guidance for install, reporting, and support
1 parent 6a29931 commit 612b85c

File tree

1 file changed (+10, -10 lines)


SECURITY.md

Lines changed: 10 additions & 10 deletions
@@ -13,7 +13,7 @@ We actively maintain and provide security updates for the following versions:

 ### CUDA Code Execution

-Flash Dynamic Mask Attention includes CUDA kernels and C++ extensions that execute on your GPU. When using this library:
+Flash Sparse Attention includes CUDA kernels and C++ extensions that execute on your GPU. When using this library:

 - Only install from trusted sources (official PyPI releases or verified builds)
 - Be cautious when building from source with modifications
@@ -46,11 +46,11 @@ If you discover a security vulnerability, please report it responsibly:

 **For security issues:**
 - Email: losercheems@gmail.com
-- Subject: [SECURITY] Flash-DMA Vulnerability Report
+- Subject: [SECURITY] FSA Vulnerability Report
 - Include: Detailed description, reproduction steps, and potential impact

 **For general bugs:**
-- Use our [GitHub Issues](https://github.com/SmallDoges/flash-dmattn/issues)
+- Use our [GitHub Issues](https://github.com/SmallDoges/flash-sparse-attention/issues)
 - Follow our [contributing guidelines](CONTRIBUTING.md)

 ## Response Timeline
@@ -63,21 +63,21 @@ Critical security issues will be prioritized and may result in emergency release

 ## Security Best Practices

-When using Flash Dynamic Mask Attention:
+When using Flash Sparse Attention:

 1. **Environment Isolation**
    ```bash
    # Use virtual environments
-   python -m venv flash_dma_env
-   source flash_dma_env/bin/activate  # Linux/Mac
+   python -m venv fsa_env
+   source fsa_env/bin/activate  # Linux/Mac
    # or
-   flash_dma_env\Scripts\activate  # Windows
+   fsa_env\Scripts\activate  # Windows
    ```

 2. **Dependency Management**
    ```bash
    # Keep dependencies updated
-   pip install --upgrade torch flash-dmattn
+   pip install --upgrade torch flash_sparse_attn
    ```

 3. **Input Validation**
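The "Input Validation" practice above appears only as a heading in this hunk. A minimal sketch of what such a pre-flight check might look like before handing tensors to a CUDA kernel; the helper name and the shape rules below are illustrative assumptions, not Flash Sparse Attention's actual API:

```python
def validate_attention_inputs(q_shape, k_shape, v_shape):
    """Hypothetical pre-flight check before dispatching to a CUDA attention kernel.

    Shapes are (batch, heads, seq_len, head_dim) tuples. The rules are
    illustrative assumptions, not the library's documented contract.
    """
    for name, shape in (("q", q_shape), ("k", k_shape), ("v", v_shape)):
        if len(shape) != 4:
            raise ValueError(f"{name} must be 4-D (batch, heads, seq, head_dim), got {shape}")
        if any(d <= 0 for d in shape):
            raise ValueError(f"{name} has a non-positive dimension: {shape}")
    # Batch sizes must agree across q, k, v.
    if not (q_shape[0] == k_shape[0] == v_shape[0]):
        raise ValueError("batch sizes must match")
    # k and v must cover the same sequence positions.
    if k_shape[2] != v_shape[2]:
        raise ValueError("k and v must share the same sequence length")
    # q and k must share the head dimension used for the dot product.
    if q_shape[3] != k_shape[3]:
        raise ValueError("q and k must share the same head dimension")
    return True
```

Rejecting malformed shapes in Python, before the C++/CUDA extension runs, keeps errors as clean exceptions rather than device-side faults.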
@@ -108,5 +108,5 @@ For security-related questions or concerns:
 - Project maintainers: See [AUTHORS](AUTHORS) file

 For general support:
-- GitHub Issues: https://github.com/SmallDoges/flash-dmattn/issues
-- Documentation: https://github.com/SmallDoges/flash-dmattn/tree/main/docs/
+- GitHub Issues: https://github.com/SmallDoges/flash-sparse-attention/issues
+- Documentation: https://github.com/SmallDoges/flash-sparse-attention/tree/main/docs/
