Imposition of non-fixed convex constraints on Neural Networks
This project aims to answer the open question of how to guarantee, in a computationally efficient way, hard convex constraints on the output of a neural network when the parameters that define those constraints change.
Recent work [1-5] has shown the potential of imposing constraints on the output of neural networks. The approach of [1] is able to guarantee **zero violation of convex constraints for any input and/or weight of the network, with extremely low computation times**. However, a current limitation is that the parameters that define the constraints (e.g., A and b in a constraint Ax<=b) **need to be fixed and known beforehand**. In many applications, however, these parameters depend on the input of the network (or on the output of its previous layers), and therefore change in every forward pass. The question of how to impose convex constraints with changing parameters in a **computationally efficient way** therefore remains unanswered. This project aims to answer that question by designing a novel strategy to impose, in a computationally efficient way, hard non-fixed convex constraints on the output of the network.
[1] Tordesillas, J., How, J.P. and Hutter, M., 2023. RAYEN: Imposition of Hard Convex Constraints on Neural Networks. arXiv preprint arXiv:2307.08336.
[2] Donti, P.L., Rolnick, D. and Kolter, J.Z., 2021. DC3: A learning method for optimization with hard constraints. arXiv preprint arXiv:2104.12225.
[3] Frerix, T., Nießner, M. and Cremers, D., 2020. Homogeneous linear inequality constraints for neural network activations. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (pp. 748-749).
[4] Agrawal, A., Amos, B., Barratt, S., Boyd, S., Diamond, S. and Kolter, J.Z., 2019. Differentiable convex optimization layers. Advances in neural information processing systems, 32.
[5] Amos, B. and Kolter, J.Z., 2017, July. Optnet: Differentiable optimization as a layer in neural networks. In International Conference on Machine Learning (pp. 136-145). PMLR.
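For context, reference [4] (the cvxpylayers library) already provides one way to handle input-dependent constraints: solve a small convex projection problem inside the network on every forward pass. The sketch below illustrates that baseline, not the method to be developed in this project; the dimensions and the heads `head_A`/`head_b` that generate the constraint parameters from the input are hypothetical, and `softplus` is used only to keep b positive so that y = 0 is always feasible.

```python
# Minimal sketch (PyTorch + cvxpylayers [4]): project the raw network output
# onto the input-dependent feasible set {y : A(x) y <= b(x)}.
import cvxpy as cp
import torch
import torch.nn.functional as F
from cvxpylayers.torch import CvxpyLayer

d, n, m = 10, 3, 5  # illustrative sizes: input dim, output dim, # constraints

# Projection problem: closest feasible point to the unconstrained prediction.
y = cp.Variable(n)
y_hat = cp.Parameter(n)   # raw prediction from the network
A = cp.Parameter((m, n))  # constraint data, new values in every forward pass
b = cp.Parameter(m)
problem = cp.Problem(cp.Minimize(cp.sum_squares(y - y_hat)), [A @ y <= b])
assert problem.is_dpp()
project = CvxpyLayer(problem, parameters=[y_hat, A, b], variables=[y])

class ConstrainedNet(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = torch.nn.Sequential(
            torch.nn.Linear(d, 64), torch.nn.ReLU(), torch.nn.Linear(64, n))
        # Hypothetical heads producing A and b from the input; in other
        # applications they would come directly from the problem data.
        self.head_A = torch.nn.Linear(d, m * n)
        self.head_b = torch.nn.Linear(d, m)

    def forward(self, x):
        y_hat = self.backbone(x)
        A = self.head_A(x).view(-1, m, n)
        b = F.softplus(self.head_b(x))  # b > 0, so y = 0 is always feasible
        y, = project(y_hat, A, b)       # differentiable w.r.t. all three
        return y

net = ConstrainedNet()
out = net(torch.randn(4, d))  # batched forward pass with per-sample A and b
print(out.shape)              # torch.Size([4, 3])
```

Because this baseline solves a convex program per sample in every forward pass, it is typically far slower than the fixed-parameter guarantees of [1]; closing that gap for non-fixed A and b is precisely the goal of this project.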
- Literature review
- Mathematical design of the proposed algorithm
- Software implementation of the proposed algorithm (in PyTorch)
- Comparison with state-of-the-art approaches
- Excellent knowledge of convex optimization
- Great Python programming skills (PyTorch)
Please send the following to jtordesillas@ethz.ch:
- Link to your GitHub repository
- CV
- A short paragraph explaining why you are interested in this project