On Quantum Data Structures

Daan Schoneveld

Abstract: Data structures serve as fundamental building blocks in classical computing, providing efficient ways of organising, storing and manipulating data. To develop certain time-efficient quantum algorithms, classical data structures must be translated to the quantum setting. Several challenges emerge in this translation, however, such as uniqueness of representation, uniqueness of memory and the worst-case limitation problem. In this thesis we explore the field of quantum data structures and their limitations in the context of element distinctness. Our results are presented in what we call the Quantum Word Random Access Machine (QWRAM) model, which extends that of [BLPS22] by assigning a cost γ to both classical and quantum elementary word operations.

The main theorem is a black-box quantum algorithm for element distinctness with time complexity O(N^(2/3)(γ + T_L + T_ID + T_C) + T_init), for any quantum data structure that can look up elements in time T_L, insert and delete elements in time T_ID, check for collisions in time T_C, and be initialised in time T_init. This theorem eliminates the need for an explicit diffusion operator, which was previously thought to be required of the quantum data structure on top of the other operations. As a result, designing quantum data structures becomes considerably simpler. For the proof we introduce a new formal definition of a quantum data structure and a new kind of graph, which we call the permuted Johnson graph.

Using our theorem, we can slightly simplify two existing quantum data structures and reanalyse their time complexity in the QWRAM model: Ambainis' quantum skip list and the quantum radix tree. With the former we obtain a time complexity of O(N^(2/3) γ log^3 N) and with the latter a time complexity of O(N^(2/3) γ log M). Next, we introduce a quantum analogue of a hash table, which achieves an optimal O(N^(2/3) γ) time complexity for element distinctness. The algorithm relies on a more general version of our main theorem that replaces both the lookup cost and the insert-and-delete cost with an average cost. The proof of this theorem exploits a very recent result that allows us to take an average-case complexity over subroutines. The space complexity of this quantum hash table, however, is Õ(N^(4/3)). We therefore also introduce a space-efficient version of the quantum hash table, which achieves a time complexity of O(N^(2/3) γ log N), at the cost of a logarithmic factor in time.

Finally, we suggest a method for constructing a quantum binary search tree, a data structure previously deemed impossible to implement in the quantum setting. For this, we use an extension of the technique introduced in [BJLM13], maintaining a superposition over all possible tree structures. Based on our findings, we conclude that the limitations of using classical data structures in the quantum setting are much less restrictive than previously thought.
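The interface assumed by the main theorem can be illustrated with a minimal sketch. The class and method names below are hypothetical placeholders, not taken from the thesis; they merely mirror the four operation costs T_L, T_ID, T_C and T_init and the word-operation cost γ referred to above, and the final function restates the main time bound with constants suppressed.

    from abc import ABC, abstractmethod

    class QuantumSetDataStructure(ABC):
        """Hypothetical interface a quantum data structure must expose to be
        plugged into the main theorem (names are illustrative only)."""

        @abstractmethod
        def initialise(self, capacity: int) -> None:
            """Prepare the empty data structure; cost T_init."""

        @abstractmethod
        def lookup(self, element: int) -> bool:
            """Decide whether `element` is stored; cost T_L."""

        @abstractmethod
        def insert(self, element: int) -> None:
            """Insert `element`; together with delete, cost T_ID."""

        @abstractmethod
        def delete(self, element: int) -> None:
            """Delete `element`; together with insert, cost T_ID."""

        @abstractmethod
        def has_collision(self) -> bool:
            """Report whether two equal elements are stored; cost T_C."""

    def element_distinctness_time(n: int, gamma: float, t_l: float,
                                  t_id: float, t_c: float, t_init: float) -> float:
        """Leading-order time bound of the main theorem, with constants
        suppressed: O(N^(2/3) (γ + T_L + T_ID + T_C) + T_init)."""
        return n ** (2 / 3) * (gamma + t_l + t_id + t_c) + t_init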