@@ -13,7 +13,7 @@ We actively maintain and provide security updates for the following versions:
 
 ### CUDA Code Execution
 
-Flash Dynamic Mask Attention includes CUDA kernels and C++ extensions that execute on your GPU. When using this library:
+Flash Sparse Attention includes CUDA kernels and C++ extensions that execute on your GPU. When using this library:
 
 - Only install from trusted sources (official PyPI releases or verified builds)
 - Be cautious when building from source with modifications
@@ -46,11 +46,11 @@ If you discover a security vulnerability, please report it responsibly:
 
 **For security issues:**
 - Email: losercheems@gmail.com
-- Subject: [SECURITY] Flash-DMA Vulnerability Report
+- Subject: [SECURITY] FSA Vulnerability Report
 - Include: Detailed description, reproduction steps, and potential impact
 
 **For general bugs:**
-- Use our [GitHub Issues](https://github.com/SmallDoges/flash-dmattn/issues)
+- Use our [GitHub Issues](https://github.com/SmallDoges/flash-sparse-attention/issues)
 - Follow our [contributing guidelines](CONTRIBUTING.md)
 
 ## Response Timeline
@@ -63,21 +63,21 @@ Critical security issues will be prioritized and may result in emergency release
 
 ## Security Best Practices
 
-When using Flash Dynamic Mask Attention:
+When using Flash Sparse Attention:
 
 1. **Environment Isolation**
    ```bash
    # Use virtual environments
-   python -m venv flash_dma_env
-   source flash_dma_env/bin/activate  # Linux/Mac
+   python -m venv fsa_env
+   source fsa_env/bin/activate  # Linux/Mac
    # or
-   flash_dma_env\Scripts\activate  # Windows
+   fsa_env\Scripts\activate  # Windows
    ```
 
 2. **Dependency Management**
    ```bash
    # Keep dependencies updated
-   pip install --upgrade torch flash-dmattn
+   pip install --upgrade torch flash_sparse_attn
    ```
 
 3. **Input Validation**
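
The Input Validation item is cut off at this hunk boundary, so its example is not part of the diff. As a hedged sketch of the kind of pre-call checks it refers to (the wrapper name `validated_attention` and the `attn_func` parameter are placeholders, not the library's actual API), validation might look like:

```python
# Hedged sketch only: the attention callable is passed in as a parameter
# because the library's real entry point is not shown in this diff.
import torch

def validated_attention(q, k, v, attn_func):
    """Run basic sanity checks before handing tensors to a CUDA extension."""
    for name, t in (("q", q), ("k", k), ("v", v)):
        if not isinstance(t, torch.Tensor):
            raise TypeError(f"{name} must be a torch.Tensor, got {type(t).__name__}")
        if not t.is_cuda:
            raise ValueError(f"{name} must be on a CUDA device")
        if t.dtype not in (torch.float16, torch.bfloat16):
            raise ValueError(f"{name} should be fp16 or bf16, got {t.dtype}")
    if not (q.shape[0] == k.shape[0] == v.shape[0]):
        raise ValueError("q, k, v must share the same batch size")
    return attn_func(q, k, v)
```

Checking device, dtype, and batch consistency in Python keeps malformed inputs from reaching the compiled kernel, where failures would otherwise surface as opaque CUDA errors rather than readable exceptions.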
@@ -108,5 +108,5 @@ For security-related questions or concerns:
 - Project maintainers: See [AUTHORS](AUTHORS) file
 
 For general support:
-- GitHub Issues: https://github.com/SmallDoges/flash-dmattn/issues
-- Documentation: https://github.com/SmallDoges/flash-dmattn/tree/main/docs/
+- GitHub Issues: https://github.com/SmallDoges/flash-sparse-attention/issues
+- Documentation: https://github.com/SmallDoges/flash-sparse-attention/tree/main/docs/