In this talk, I will survey work on quantum algorithms relevant to both the NISQ and fault-tolerant eras. I will begin by discussing how one can leverage the architectural constraints of current-day devices to optimize the implementation of algorithms for quantum simulation and combinatorial optimization. Next, I will describe how ideas from classical reinforcement learning can be exploited to obtain optimally short gate sequences for state preparation and gate compilation. Lastly, I will discuss ongoing work that employs state-of-the-art techniques for Hamiltonian simulation and applies them to the problem of simulating quantum field theories.
M. Sohaib Alam is a research scientist working on quantum computing at the NASA Quantum Artificial Intelligence Laboratory (QuAIL) at Ames Research Center. Prior to joining NASA QuAIL, he worked at Rigetti Computing on quantum algorithms and their implementation on present-day noisy devices. Sohaib holds a Ph.D. in theoretical high-energy physics from the (late) Steven Weinberg Theory Group at the University of Texas at Austin, where he focused on aspects of string theory and quantum gravity. He has also worked in industry, leading the data science and machine learning efforts of an analytics startup from its early days through its successful acquisition.