Yuling Jiao (焦雨领)
Professor
Gender: Male
Education: Ph.D. graduate
Employment status: In service
Affiliation: School of Artificial Intelligence
Contact: +86 15871394253
Email: yulingjiaomath@whu.edu.cn
Publications
[1]. A Primal Dual Active Set Algorithm with Continuation in Compressed Sensing. IEEE Transactions on Signal Processing, 62 (33), 6276-6285, 2014. MATLAB code: http://xllv.whu.edu.cn/pdascl1.zip.
[2]. A Primal Dual Active Set with Continuation Algorithm for the l0-Regularized Optimization Problem. Applied and Computational Harmonic Analysis, 39 (3), 400-426, 2015. MATLAB code: http://www0.cs.ucl.ac.uk/staff/b.jin/software/pdascl0.zip.
[3]. Alternating Direction Method of Multipliers for Linear Inverse Problems. SIAM Journal on Numerical Analysis, 54 (4), 2114-2137, 2016. MATLAB code: http://xllv.whu.edu.cn/admm_lin_inv.rar.
[4]. Preasymptotic Convergence of Randomized Kaczmarz Method. Inverse Problems, 33 (12), 125012, 2017.
[5]. Group Sparse Recovery via the l0(l2) Penalty: Theory and Algorithm. IEEE Transactions on Signal Processing, 65 (4), 998-1012, 2017. MATLAB code: http://www0.cs.ucl.ac.uk/staff/b.jin/software/gpdasc.zip.
[6]. Preconditioned Alternating Direction Method of Multipliers for Solving Inverse Problems with Constraints. Inverse Problems, 33 (2), 025004, 2017.
[7]. Iterative Soft/Hard Thresholding Homotopy Algorithm for Sparse Recovery. IEEE Signal Processing Letters, 24 (6), 784-788, 2017. MATLAB code: http://www0.cs.ucl.ac.uk/staff/b.jin/software/ishtc.zip.
[8]. Robust Decoding from 1-bit Compressive Sampling by Ordinary and Regularized Least Squares. SIAM Journal on Scientific Computing, 40 (4), A2062-A2086, 2018. MATLAB code: see attachment.
[9]. A Constructive Approach to L0 Penalized Regression. Journal of Machine Learning Research, 19 (10), 1-37, 2018. MATLAB code: see attachment.
[10]. Deep Generative Learning via Variational Gradient Flow. ICML, 97, 2093-2101, 2019. PyTorch code: https://github.com/xjtuygao/VGrow.
[11]. Fitting Sparse Linear Models under the Sufficient and Necessary Condition for Model Identification. Statistics & Probability Letters, 168, 108925, 2020.
[12]. A Tissue-specific Collaborative Mixed Model for Jointly Analyzing Multiple Tissues in Transcriptome-wide Association Studies. Nucleic Acids Research, 48 (19), e109, 2020.
[13]. A Unified Primal Dual Active Set Algorithm for Nonconvex Sparse Recovery. Statistical Science, 36 (2), 215-238, 2021. MATLAB code: http://www0.cs.ucl.ac.uk/staff/b.jin/software/updasc.zip.
[14]. Generative Learning With Euler Particle Transport. MSML, 145, 1-33, 2021. PyTorch code: https://github.com/xjtuygao/EPT.
[15]. Deep Generative Learning via Schrödinger Bridge. ICML, 10794-10804, 2021. Demo PyTorch code: see attachment.
[16]. REMI: Regression with Marginal Information and Its Application in Genome-wide Association Studies. Statistica Sinica, 31, 1985-2004, 2021. R code: https://github.com/gordonliu810822/REMI.
[17]. Non-asymptotic Error Bounds for Bidirectional GANs. NeurIPS, 2021.
[18]. An Error Analysis of Generative Adversarial Networks for Learning Distributions. Journal of Machine Learning Research, 23 (116), 1-43, 2022.
[19]. Convergence Rate Analysis for Deep Ritz Method. Communications in Computational Physics, 31 (4), 1020-1048, 2022.
[20]. Approximation with Deep Convolutional Networks in Sobolev Space: with Applications to Classification. NeurIPS (Oral), 2022.
[21]. Sample-Efficient Sparse Phase Retrieval via Stochastic Alternating Minimization. IEEE Transactions on Signal Processing, 70, 4951-4966, 2022.
[22]. Probabilistic Embedding and Clustering with Alignment for Spatial Transcriptomics Data Integration with PRECAST. Nature Communications, 14 (1), 296, 2023.
[23]. Deep Neural Networks with ReLU-Sine-Exponential Activations Break Curse of Dimensionality in Approximation on Hölder Class. SIAM Journal on Mathematical Analysis, 55 (4), 3635-3649, 2023.
[24]. Improved Analysis of PINNs: Alleviate the CoD for Compositional Solutions. Annals of Applied Mathematics, 39 (3), 239-263, 2023.
[25]. Integrative Analysis for High-dimensional Stratified Models. Statistica Sinica, 33, 1533-1553, 2023.
[26]. Approximation Bounds for Norm Constrained Neural Networks with Applications to Regression and GANs. Applied and Computational Harmonic Analysis, 65, 249-278, 2023.
[27]. Fast Excess Risk Rate via Offset Rademacher Complexity. ICML, 2023.
[28]. Just Least Squares: Binary Compressive Sampling with Low Generative Intrinsic Dimension. Journal of Scientific Computing, 95 (1), 28, 2023.
[29]. Deep Nonparametric Regression on Approximately Low-dimensional Manifolds: Non-Asymptotic Error Bounds with Polynomial Prefactors. Annals of Statistics, 51 (2), 691-716, 2023.
[30]. A Rate of Convergence of Weak Adversarial Neural Networks for the Second Order Parabolic PDEs. Communications in Computational Physics, 34 (3), 814-836, 2023.
[31]. Global Optimization via Schrödinger-Föllmer Diffusion. SIAM Journal on Control and Optimization, 61 (5), 2953-2980, 2023.
[32]. A Deep Generative Approach to Conditional Sampling. Journal of the American Statistical Association, 118 (543), 1837-1848, 2023. Code: https://github.com/Jason-Xingyu/A-Deep-Generative-Approach-to-Conditional-Sampling.
[33]. Over-parameterized Deep Nonparametric Regression for Dependent Data with Its Applications to Reinforcement Learning. Journal of Machine Learning Research, 24 (383), 1-40, 2023.
[34]. Deep Ritz Method for Elliptical Multiple Eigenvalue Problems. Journal of Scientific Computing, 98 (2), 48, 2024.
[35]. Error Estimation to the Direct Sampling Method for the Inverse Acoustic Source Problem with Multi-frequency Data. Inverse Problems and Imaging, 18 (2), 517-540, 2024.
[36]. Efficient and Practical Quantum Compiler Towards Multi-qubit Systems with Deep Reinforcement Learning. Quantum Science and Technology, 9 (4), 045002, 2024.
[37]. Neural Network Approximation for Pessimistic Offline Reinforcement Learning. AAAI, 2024.
[38]. Error Analysis of Deep Ritz Methods for Elliptic Equations. Analysis and Applications, 22 (1), 57-87, 2024.
[39]. Nonparametric Estimation of Non-Crossing Quantile Regression Process with Deep Neural Networks. Journal of Machine Learning Research, 25 (88), 1-75, 2024.
[40]. Deep Dimension Reduction for Supervised Representation Learning. IEEE Transactions on Information Theory, 70 (5), 3583-3598, 2024. PyTorch code: https://github.com/Liao-Xu/DDR.
[41]. A Gaussian Mixture Distribution-based Adaptive Sampling Method for Physics-informed Neural Networks. Engineering Applications of Artificial Intelligence, 135, 108770, 2024.
[42]. Recovering the Source Term in Elliptic Equation via Deep Learning: Method and Convergence Analysis. East Asian Journal on Applied Mathematics, 19, 460-489, 2024.
[43]. Deep Nonlinear Sufficient Dimension Reduction. Annals of Statistics, 52 (3), 1201-1226, 2024.
[44]. Convergence Analysis for Over-Parameterized Deep Learning. Communications in Computational Physics, 36 (1), 71-103, 2024.
[45]. Gaussian Interpolation Flows. Journal of Machine Learning Research, 25 (253), 1-52, 2024.
[46]. Non-asymptotic Approximation Error Bounds of Parameterized Quantum Circuits. NeurIPS (Spotlight), 2024.
[47]. Non-Asymptotic Bounds for Adversarial Excess Risk under Misspecified Models. SIAM Journal on Mathematics of Data Science, 6 (4), 847-868, 2024.
[48]. A Stabilized Physics Informed Neural Networks Method for Wave Equations. Numerical Mathematics: Theory, Methods and Applications, 17 (4), 1100-1127, 2024.
[49]. Deep Nonparametric Quantile Regression under Covariate Shift. Journal of Machine Learning Research, 25 (385), 1-50, 2024.
[50]. Multivariate Stochastic Modeling for Transcriptional Dynamics with Cell-specific Latent Time Using SDEvelo. Nature Communications, 15, 10849, 2024.
[51]. Convergence Analysis of Deep Ritz Method with Over-parameterizations. Neural Networks, 184, 101110, 2025.
[52]. Semi-Supervised Deep Sobolev Regression: Estimation and Variable Selection by ReQU Neural Network. IEEE Transactions on Information Theory, 71 (4), 2955-2981, 2025.
[53]. Schrödinger-Föllmer Sampler. IEEE Transactions on Information Theory, 71 (2), 1283-1299, 2025. R code and Python code: https://github.com/Liao-Xu/SFS_R and https://github.com/Liao-Xu/SFS_py.
[54]. Relative Entropy Gradient Sampler for Unnormalized Distributions. Journal of Computational and Graphical Statistics, 34 (1), 211-221, 2025.
[55]. DRM Revisited: A Complete Error Analysis. Journal of Machine Learning Research, 26 (115), 1-76, 2025.
[56]. Deep Approximate Policy Iteration. Annals of Statistics, 53 (2), 802-821, 2025.
[57]. Error Analysis of Three-Layer Neural Network Trained with PGD for Deep Ritz Method. IEEE Transactions on Information Theory, 71 (7), 5512-5538, 2025.
[58]. Model Free Prediction with Uncertainty Assessment. IEEE Transactions on Information Theory, 71 (9), 7229-7253, 2025.
[59]. Robust Decoding from Binary Measurements with Cardinality Constraint Least Squares. Communications in Computational Physics, 2025.
[60]. Deep Nonlinear Sufficient Dimension Reduction. NeurIPS, 2025.
[61]. Adversarial Self-Supervised Representation Learning with Theoretical Guarantees. NeurIPS, 2025.
[62]. Transformers Can Overcome the Curse of Dimensionality: A Theoretical Study from an Approximation Perspective. Journal of Machine Learning Research (accepted), arXiv:2504.13558, 2026.
[63]. Approximation Bounds for Transformer Networks with Application to Regression. arXiv:2504.12175.
[64]. Characteristic Learning for Provable One Step Generation. arXiv:2405.05512.
[65]. Approximation Bounds for Recurrent Neural Networks with Application to Regression. arXiv:2409.05577.
[66]. Convergence Analysis of Schrödinger-Föllmer Sampler without Convexity. arXiv:2107.04766.
[67]. Convergence of Continuous Normalizing Flows for Learning Probability Distributions. arXiv:2404.00551.
[68]. Distribution Matching for Self-Supervised Transfer Learning. arXiv:2502.14424.
[69]. Sampling via Föllmer Flow. arXiv:2311.03660.
[70]. Convergence Analysis of Flow Matching in Latent Space with Transformers. arXiv:2404.02538.
[71]. Nonlinear Assimilation with Score-based Sequential Langevin Sampling. arXiv:2411.13443.
[72]. Deep Conditional Generative Learning: Model and Error Analysis. arXiv:2402.01460.
[73]. Latent Schrödinger Bridge Diffusion Model for Generative Learning. arXiv:2404.13309.