Summary: | We present the problem of approximating the time-evolution operator e^{-iHt} to error ε, where the Hamiltonian H = (⟨G|⊗I) U (|G⟩⊗I) is the projection of a unitary oracle U onto the state |G⟩ created by another unitary oracle. Our algorithm solves this with a query complexity O(t + log(1/ε)) to both oracles that is optimal with respect to all parameters in both the asymptotic and non-asymptotic regimes, and also with low overhead, using at most two additional ancilla qubits. This approach to Hamiltonian simulation subsumes important prior art on Hamiltonians that are d-sparse or a linear combination of unitaries, leading to significant improvements in space and gate complexity, such as a quadratic speed-up for precision simulations. It also motivates useful new instances, such as where H is a density matrix. A key technical result is 'qubitization', which uses the controlled versions of these oracles to embed any H in an invariant SU(2) subspace. A large class of operator functions of H can then be computed with optimal query complexity, of which e^{-iHt} is a special case.
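As a minimal numerical sketch of the encoding H = (⟨G|⊗I) U (|G⟩⊗I), the following assumes the linear-combination-of-unitaries instance mentioned above: a one-qubit ancilla state |G⟩ encodes square-root amplitudes, U is a hypothetical SELECT unitary applying each term controlled on the ancilla, and projecting the ancilla onto |G⟩ recovers H as the weighted sum of the terms. All names and the specific two-term Hamiltonian are illustrative, not from the paper.

```python
import numpy as np

# Illustrative target: H = a0*Z + a1*X, a linear combination of unitaries.
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

a = np.array([0.6, 0.4])      # nonnegative weights summing to 1
unitaries = [Z, X]

# |G> on the one-qubit ancilla carries sqrt(a_j) amplitudes.
G = np.sqrt(a)

# U = SELECT: applies unitaries[j] to the system when the ancilla is |j>.
U = np.zeros((4, 4), dtype=complex)
for j, Uj in enumerate(unitaries):
    P = np.zeros((2, 2))
    P[j, j] = 1.0             # projector |j><j| on the ancilla
    U += np.kron(P, Uj)

# H = (<G| (x) I) U (|G> (x) I): project the ancilla register onto |G>.
bra = np.kron(G.conj(), I2)           # shape (2, 4)
ket = np.kron(G.reshape(-1, 1), I2)   # shape (4, 2)
H = bra @ U @ ket

expected = a[0] * Z + a[1] * X
print(np.allclose(H, expected))  # True
```

The extracted block H equals the intended combination Σ_j a_j U_j, illustrating how a unitary oracle pair (state preparation plus SELECT) encodes a non-unitary Hamiltonian in a larger unitary.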