Speaker:
Sitan Chen, Harvard John A. Paulson School of Engineering and Applied Sciences
Time: 2024-06-05, 21:00–22:00
Venue: https://caltech.zoom.us/j/81461358567?pwd=ZHkxS21xK0dMUW1UUkFtTnpTWno0QT09 (ps: BB1984)
Abstract:
We revisit the problem of Pauli shadow tomography: given copies of an unknown 𝑛-qubit quantum state 𝜌, estimate tr(𝑃𝜌) for some set of Pauli operators 𝑃 to within additive error 𝜀. This has been a popular testbed for exploring the advantage of protocols with quantum memory over those without: with enough memory to measure two copies at a time, one can use Bell sampling to estimate |tr(𝑃𝜌)| for all 𝑃 using 𝑂(𝑛/𝜀^4) copies, but with only 𝑘 ≤ 𝑛 qubits of memory, Ω(2^((𝑛−𝑘)/3)) copies are needed. These results leave open several natural questions. How does this picture change in the physically relevant setting where one only needs to estimate a certain subset of Paulis? What is the optimal dependence on 𝜀? What is the optimal tradeoff between quantum memory and sample complexity? We answer all of these questions:
- For any subset 𝐴 of Paulis and any family of measurement strategies, we completely characterize the optimal sample complexity, up to log |𝐴| factors.
- We show that any protocol that makes poly(𝑛)-copy measurements must make Ω(1/𝜀^4) measurements.
- For any protocol that makes poly(𝑛)-copy measurements and only has 𝑘 < 𝑛 qubits of memory, we show that Θ(min{2^𝑛/𝜀^2, 2^(𝑛−𝑘)/𝜀^4}) copies are necessary and sufficient.
The protocols we propose can also estimate the actual values tr(𝑃𝜌), rather than just their absolute values as in prior work. Additionally, as a byproduct of our techniques, we establish tight bounds for the task of purity testing and show that it exhibits an intriguing phase transition not present in the memory-sample tradeoff for Pauli shadow tomography.
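The Bell-sampling primitive mentioned in the abstract rests on a standard identity: the Bell basis on two copies diagonalizes every operator 𝑃⊗𝑃 with eigenvalues ±1, so averaging that ±1 sign over Bell-measurement outcomes on 𝜌⊗𝜌 gives an unbiased estimate of tr((𝑃⊗𝑃)(𝜌⊗𝜌)) = tr(𝑃𝜌)^2. A minimal numerical sketch of this identity (a toy NumPy simulation, not code from the paper; the state, the target Pauli 𝑋⊗𝑍, and all function names are illustrative choices):

```python
import numpy as np
from itertools import product

# Single-qubit Paulis
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
PAULIS = [I2, X, Y, Z]

def kron_all(mats):
    out = np.array([[1.0 + 0j]])
    for m in mats:
        out = np.kron(out, m)
    return out

def bell_basis(n):
    """All 4^n Bell-basis vectors on 2n qubits: (P_b ⊗ I)|Φ+⟩, where
    |Φ+⟩ is the maximally entangled state between the two n-qubit copies."""
    d = 2 ** n
    phi = np.eye(d).reshape(d * d) / np.sqrt(d)  # |Φ+⟩ = (1/√d) Σ_i |i⟩|i⟩
    return [np.kron(kron_all([PAULIS[i] for i in b]), np.eye(d)) @ phi
            for b in product(range(4), repeat=n)]

def random_density_matrix(n, rng):
    d = 2 ** n
    G = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
    rho = G @ G.conj().T
    return rho / np.trace(rho)

n = 2
rng = np.random.default_rng(0)
rho = random_density_matrix(n, rng)
rho2 = np.kron(rho, rho)  # two copies, measured jointly

basis = bell_basis(n)
# Outcome distribution of a Bell-basis measurement on ρ⊗ρ
probs = np.array([np.real(v.conj() @ rho2 @ v) for v in basis])

# Target Pauli P = X⊗Z. Each Bell state is a ±1 eigenstate of P⊗P, so
# Σ_b p_b · sign_b(P) = tr((P⊗P)(ρ⊗ρ)) = tr(Pρ)^2.
P = kron_all([X, Z])
PP = np.kron(P, P)
signs = np.array([np.real(v.conj() @ PP @ v) for v in basis])  # each ±1

estimate = probs @ signs          # expectation of the ±1 sign over outcomes
exact = np.real(np.trace(P @ rho)) ** 2
print(estimate, exact)
```

In an actual protocol one draws outcomes from `probs` (one per pair of copies) and averages the observed signs; standard concentration over 𝑂(𝑛/𝜀^4) copies then controls the error simultaneously for all 4^n Paulis, which is where the sample-complexity bound in the abstract comes from. Note this only recovers tr(𝑃𝜌)^2, i.e. |tr(𝑃𝜌)|, matching the abstract's remark that recovering the sign requires the new protocols.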
Ref: Sitan Chen, Weiyuan Gong, Qi Ye. arXiv:2404.19105
Biography:
Sitan Chen is an Assistant Professor of Computer Science at the Harvard John A. Paulson School of Engineering and Applied Sciences, where he is a member of the Theory of Computation group, the ML Foundations group, and the Harvard Quantum Initiative. He works on designing algorithms with provable guarantees for fundamental problems in data science, especially in the context of generative modeling, deep learning, and quantum information.