Xiuying Wei
Google Scholar / GitHub / Email
xiuying.wei [at] epfl.ch, weixiuying966 [at] gmail.com

Hi, I'm Xiuying. I'm a first-year PhD student at EPFL, supervised by Prof. Caglar Gulcehre. Before that, I was a master's student at Beihang University, supervised by Prof. Xianglong Liu, and an intern at SenseTime, supervised by Ruihao Gong.

My research interests are efficient machine learning and natural language processing.

If you have any questions or want to collaborate, feel free to send me an email! I am always excited to learn more by talking with people.

Publications

Outlier Suppression+: Accurate quantization of large language models by equivalent and effective shifting and scaling
Xiuying Wei, Yunchen Zhang, Yuhang Li, Xianguo Zhang, Ruihao Gong, Jinyang Guo, Xianglong Liu.
Empirical Methods in Natural Language Processing (EMNLP), 2023
[Paper] [Code]
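
As a rough illustration of the "equivalent shifting and scaling" idea named in the title, here is a minimal sketch (my own, not the paper's released code; 'z' and 's' stand for per-channel shift and scale vectors, and the following layer is assumed to be a standard torch.nn.Linear with a bias):

    import torch

    @torch.no_grad()
    def migrate_outliers(x, linear, z, s):
        """Shift/scale activations per channel and fold the inverse into the
        next linear layer, so the output stays mathematically unchanged:
        y = W x + b = W' ((x - z) / s) + b'  with  W' = W * diag(s), b' = b + W z."""
        x_t = (x - z) / s                          # easier-to-quantize activations
        linear.weight.mul_(s)                      # scale each input channel of W
        linear.bias.add_(linear.weight @ (z / s))  # W' (z / s) = W z absorbs the shift
        return x_t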

Lossy and Lossless (L2) Post-training Model Size Compression
Yumeng Xue, Shihao Bai, Xiuying Wei, Ruihao Gong, Jianlei Yang
International Conference on Computer Vision (ICCV), 2023

Integrate lossless and lossy compression techniques in a post-training setting.

[Paper]

Outlier Suppression: Pushing the Limit of Low-bit Transformer Language Models
Xiuying Wei, Yunchen Zhang, Xiangguo Zhang, Ruihao Gong, Shanghang Zhang, Qi Zhang, Fengwei Yu, and Xianglong Liu
Neural Information Processing Systems (NeurIPS), 2022 (Spotlight)

Identify outlier phenomena (channel concentration and token discrepancy) that hinder the quantization of transformer language models, and propose a framework to suppress these outliers.

[Paper] [Code]

QDrop: Randomly Dropping Quantization for Extremely Low-bit Post-training Quantization
Xiuying Wei, Ruihao Gong, Yuhang Li, Xianglong Liu, and Fengwei Yu
International Conference on Learning Representations (ICLR), 2022, with reviewer scores of 8, 6, 8, 8

Investigate how activation quantization affects weight tuning, relate activation quantization to the flatness of the quantized weights, and propose randomly dropping activation quantization to reach flatter optimized weights.

[Paper] [Code]
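
A minimal sketch of the drop mechanism (illustrative only; 'fake_quant' is a placeholder for any simulated-quantization function, and the drop probability is an assumption rather than the paper's setting):

    import torch

    def qdrop(x, fake_quant, p_drop=0.5):
        """QDrop-style activation forward for calibration: each element keeps
        its full-precision value with probability p_drop and is (fake-)quantized
        otherwise."""
        x_q = fake_quant(x)                    # simulated low-bit quantization
        keep_fp = torch.rand_like(x) < p_drop  # element-wise random drop mask
        return torch.where(keep_fp, x, x_q)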


Honors and Awards
Misc
I love climbing and reading.