Shida Wang
Title
Cited by
Year
A brief survey on the approximation theory for sequence modelling
H Jiang, Q Li, Z Li, S Wang
Journal of Machine Learning (JML) 2 (1), 1-30, 2023
Cited by 6 · 2023
State-space models with layer-wise nonlinearity are universal approximators with exponential decaying memory
S Wang, B Xue
Advances in Neural Information Processing Systems 36, 2024
Cited by 5 · 2024
Inverse approximation theory for nonlinear recurrent neural networks
S Wang, Z Li, Q Li
The 12th International Conference on Learning Representations (Spotlight …, 2024
Cited by 3 · 2024
Efficient hyperdimensional computing
Z Yan, S Wang, K Tang, WF Wong
Joint European Conference on Machine Learning and Knowledge Discovery in …, 2023
Cited by 3 · 2023
StableSSM: Alleviating the Curse of Memory in State-space Models through Stable Reparameterization
S Wang, Q Li
arXiv preprint arXiv:2311.14495, 2023
Cited by 1 · 2023
HyperSNN: A new efficient and robust deep learning model for resource constrained control applications
Z Yan, S Wang, K Tang, WF Wong
arXiv preprint arXiv:2308.08222, 2023
Cited by 1 · 2023
The Effects of Nonlinearity on Approximation Capacity of Recurrent Neural Networks
S Wang, Z Li, Q Li
Cited by 1 · 2022
Integrating Deep Learning and Synthetic Biology: A Co-Design Approach for Enhancing Gene Expression via N-terminal Coding Sequences
Z Yan, W Chu, Y Sheng, K Tang, S Wang, Y Liu, WF Wong
arXiv preprint arXiv:2402.13297, 2024
2024
Improve Long-term Memory Learning Through Rescaling the Error Temporally
S Wang, Z Yan
arXiv preprint arXiv:2307.11462, 2023
2023
Articles 1–9