1 min read · from Machine Learning

TritonSigmoid: A fast, padding-aware sigmoid attention kernel for GPUs [R]


We are open-sourcing TritonSigmoid — a fast, padding-aware sigmoid attention kernel for GPUs.

We built this for single-cell foundation models, where every cell is represented as a sequence of genes. A single gene can be regulated by multiple transcription factors at once. Softmax forces them to compete for attention, but sigmoid lets the model attend strongly to many genes (tokens) simultaneously. Because cells express anywhere from 200 to 16,000+ genes (tokens), the kernel handles variable-length padding natively so you're not wasting compute on empty positions.
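To make the softmax-vs-sigmoid distinction concrete, here is a minimal NumPy sketch of both attention variants, with a key mask standing in for the kernel's padding awareness. This is an illustrative reference implementation, not the actual Triton kernel; the `-log(n)` bias follows the sigmoid-attention literature, and the function names and mask convention are our own for this example.

```python
import numpy as np

def softmax_attention(q, k, v):
    """Standard softmax attention: weights are normalized across keys,
    so genes compete for a fixed attention budget."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v

def sigmoid_attention(q, k, v, key_mask=None, bias=None):
    """Sigmoid attention: each key gets an independent weight in (0, 1),
    so many genes can receive strong attention simultaneously. Padded
    key positions (key_mask == 0) are forced to zero weight, mimicking
    the kernel's native padding handling."""
    n = k.shape[0]
    if bias is None:
        # A -log(n) bias keeps output magnitude stable as sequence
        # length grows (per the sigmoid-attention literature).
        bias = -np.log(n)
    scores = q @ k.T / np.sqrt(q.shape[-1]) + bias
    w = 1.0 / (1.0 + np.exp(-scores))       # elementwise, no competition
    if key_mask is not None:
        w = w * key_mask[None, :]           # zero out padded genes
    return w @ v
```

Because sigmoid weights are independent per key, masking a padded position changes nothing else, whereas under softmax every weight shifts when the normalizer changes; the real kernel exploits this to skip padded tiles entirely instead of computing and discarding them.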

What we found during our experiments:
• Hardware: Up to 515 TFLOPS on H100 (vs. FlashAttention-2 at 361, FlashSigmoid at 440)
• Accuracy: Lower validation loss than softmax attention across 6 held-out datasets
• Representation: 25% better cell-type separation
• Stability: Stable training where softmax catastrophically diverges

We would welcome any discussion or feedback.

Links to our work:
Paper: https://arxiv.org/abs/2604.27124
Code: https://github.com/MSDLLCpapers/triton-sigmoid

submitted by /u/vjysd


Tagged with

#TritonSigmoid #sigmoid attention #GPU #single-cell foundation models #genes #transcription factors #softmax #variable-length padding #TFLOPS #validation loss #cell-type separation #stability #compute #H100 #FlashAttention-2