Paul E Chang
Dual parameterization of sparse variational Gaussian processes
V Adam, P Chang, MEE Khan, A Solin
Advances in Neural Information Processing Systems 34, 11474-11486, 2021
Fast variational learning in state-space Gaussian process models
PE Chang, WJ Wilkinson, ME Khan, A Solin
2020 IEEE 30th International Workshop on Machine Learning for Signal …, 2020
State space expectation propagation: Efficient inference schemes for temporal Gaussian processes
W Wilkinson, P Chang, M Andersen, A Solin
International Conference on Machine Learning, 10270-10281, 2020
Fantasizing with dual GPs in Bayesian optimization and active learning
PE Chang, P Verma, ST John, V Picheny, H Moss, A Solin
arXiv preprint arXiv:2211.01053, 2022
Sparse Function-space Representation of Neural Networks
A Scannell, R Mereu, P Chang, E Tamir, J Pajarinen, A Solin
arXiv preprint arXiv:2309.02195, 2023
Memory-based dual Gaussian processes for sequential learning
PE Chang, P Verma, ST John, A Solin, ME Khan
International Conference on Machine Learning, 4035-4054, 2023
Global Approximate Inference via Local Linearisation for Temporal Gaussian Processes
WJ Wilkinson, PE Chang, MR Andersen, A Solin
Second Symposium on Advances in Approximate Bayesian Inference, 2019
Function-space Parameterization of Neural Networks for Sequential Learning
A Scannell, R Mereu, PE Chang, E Tamir, J Pajarinen, A Solin
The Twelfth International Conference on Learning Representations, 2024
Sequential Learning in GPs with Memory and Bayesian Leverage Score
P Verma, PE Chang, A Solin, ME Khan
Continual Lifelong Learning Workshop at ACML 2022, 2022