Assistant Professor
ACA Lab, SEIEE
Shanghai Jiao Tong University
Email: liufangxin@sjtu.edu.cn
Wechat: lfx920701
Office: SEIEE 3-125, SJTU
800 DongChuan Road, SEIEE Building #03-125, Minhang District, Shanghai
My Friends:
Zongwu Wang (汪宗武): My junior fellow, a PhD student advised by Prof. Jiang
Haomin Li (李皓民): My junior fellow, a PhD student advised by Prof. Jiang
Fangxin Liu is an Assistant Professor at Shanghai Jiao Tong University (SJTU), specializing in neural network acceleration (e.g., neural network compression and SW/HW co-design), in-memory computing, and brain-inspired neuromorphic computing.
He obtained his Ph.D. degree in Computer Science and Technology from Shanghai Jiao Tong University in 2023 under the supervision of Prof. Li Jiang. You can find more information about Prof. Jiang here.
Papers and related resources will be shared on my GitHub in the near future.
My current research interests focus on:
Our paper "HyperAttack" has been accepted by DAC 2023!
Feb./24/2023
Our paper "SIMSnn" has been accepted by DATE 2023!
Nov./18/2022
I was awarded the National Scholarship again!
Sep./27/2022
Our paper "Randomize and Match" has been accepted by ICCD 2022!
Aug./22/2022
I was awarded the Spark Award from HUAWEI!
Jul./29/2022
Fangxin Liu was invited to deliver a talk on SNNs at AI Time!
May/25/2022
Our paper "SoBS-X" has been accepted by IEEE TCAD!
Apr./17/2022
Two papers have been accepted by ICMR 2022!
Apr./06/2022
Our paper "L3E-HD" has been accepted by SIGIR 2022!
Apr./01/2022
Our paper "IVQ" has been accepted by IEEE TCAD!
Feb./23/2022
Three papers have been accepted by DAC 2022!
Feb./22/2022
Our paper has been selected by DATE 2022 as Best Paper in the T Track!
Feb./04/2022
Our paper "DynSNN" has been accepted by ICASSP 2022!
Jan./21/2022
Our paper "SpikeConverter" has been accepted by AAAI 2022!
Dec./01/2021
Two papers have been accepted by DATE 2022!
Nov./11/2021
Our paper "SSTDP" has been accepted by Frontiers in Neuroscience!
Oct./01/2021
I was awarded the National Scholarship!
Sep./23/2021
Our paper "HAWIS" has been accepted by ASP-DAC 2022!
Sep./12/2021
Our paper "SME: ReRAM-based Sparse-Multiplication-Engine to Squeeze-Out Bit Sparsity of Neural Network" has been accepted by ICCD'21!
Aug./20/2021
Our paper "Improving Neural Network Efficiency via Post-training Quantization with Adaptive Floating-Point" has been accepted by ICCV'21!
Jul./23/2021
Our paper "Bit-Transformer: Transforming Bit-level Sparsity into Higher Performance in ReRAM-based Accelerator" has been accepted by ICCAD'21!
Jul./13/2021
Our paper "IM3A: Boosting Deep Neural Network Efficiency via In-Memory Addressing-Assisted Acceleration" has been accepted by GLSVLSI'21!
Apr./12/2021
Our paper "PIMGCN: A ReRAM-based PIM Design for Graph Convolutional Network Acceleration" has been accepted by DAC'21!
Feb./20/2021
- Compression and acceleration on Neuromorphic Computing
- Compression and acceleration on Large-Scale Models
- Compression and acceleration on Databases
Thanks to Vasilios Mavroudis for the template!