Neural Architecture Search
Building the AI that builds AI for various tasks

Artificial Intelligence (AI) is becoming an extension of our brains, much as cars are an extension of our legs, and it has already become an indispensable part of our lives. Every day, AI navigates us to places, answers our queries, and recommends restaurants and movies. It amplifies what we do: augmenting our memory, giving us instant access to knowledge, and letting us concentrate on the things that are properly human. Designing new AI models, however, is still reserved for experts. The goal of this research is to democratize AI, making it accessible to everybody, so that any person, regardless of prior experience, and any company, regardless of size, can deploy sophisticated AI solutions with only a few clicks.
In this project, we are improving the sample efficiency of Monte Carlo Tree Search (MCTS) for designing AI models, so that better results can be obtained in less time or within a limited budget. We are building a new theoretical framework around state abstraction for MDPs under tree search to improve sample efficiency, and materializing it into an efficient and scalable black-box optimization framework; a rough sketch of the core idea follows below.
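To make the idea concrete, the sketch below partitions a continuous search space with an MCTS-style tree: each node owns a region, a UCB rule decides which region to sample from next, and well-visited regions are split into finer ones. This is only a minimal illustration under our own assumptions; the class names, the fixed midpoint split, and all hyperparameters (e.g. `split_after`, the exploration constant) are illustrative, not the published algorithms.

```python
import numpy as np

class Node:
    """A node of the partition tree; it owns an axis-aligned region [lo, hi]."""
    def __init__(self, lo, hi):
        self.lo = np.asarray(lo, dtype=float)
        self.hi = np.asarray(hi, dtype=float)
        self.values = []      # objective values observed inside this region
        self.children = []    # two children after an axis-aligned split

    def ucb(self, total, c=0.5):
        # Mean reward of the region plus an exploration bonus (maximization).
        if not self.values:
            return float("inf")
        return np.mean(self.values) + c * np.sqrt(np.log(total + 1) / len(self.values))

    def split(self):
        # Partition along the widest dimension at its midpoint.
        d = int(np.argmax(self.hi - self.lo))
        mid = 0.5 * (self.lo[d] + self.hi[d])
        left_hi, right_lo = self.hi.copy(), self.lo.copy()
        left_hi[d], right_lo[d] = mid, mid
        self.children = [Node(self.lo, left_hi), Node(right_lo, self.hi)]


def mcts_optimize(f, lo, hi, budget=200, split_after=15, seed=0):
    rng = np.random.default_rng(seed)
    root = Node(lo, hi)
    best_x, best_v = None, -np.inf
    for t in range(budget):
        # Selection: walk down the tree, picking the child with the highest UCB.
        path, node = [root], root
        while node.children:
            node = max(node.children, key=lambda c: c.ucb(t))
            path.append(node)
        # Sampling: draw a candidate uniformly from the selected leaf region.
        x = rng.uniform(node.lo, node.hi)
        v = f(x)
        if v > best_v:
            best_x, best_v = x, v
        # Backpropagation: record the outcome along the whole selection path.
        for n in path:
            n.values.append(v)
        # Expansion: split a leaf once it has seen enough samples.
        if len(node.values) >= split_after:
            node.split()
    return best_x, best_v


if __name__ == "__main__":
    # Toy objective: maximize the negative squared distance to a target point.
    target = np.array([0.3, -0.7])
    x, v = mcts_optimize(lambda x: -np.sum((x - target) ** 2), lo=[-1, -1], hi=[1, 1])
    print(x, v)
```

The papers listed under Publications learn the partition of the search space from the observed samples rather than splitting at a fixed midpoint; the fixed split above is a simplification to keep the sketch short.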
People
Publications
- Neural Architecture Search using Deep Neural Networks and Monte Carlo Tree Search. In Proceedings of the AAAI Conference on Artificial Intelligence (AAAI), 2020. [Bibtex]
- Learning Search Space Partition for Black-box Optimization using Monte Carlo Tree Search. In Advances in Neural Information Processing Systems (NeurIPS), 2020. [Bibtex]