I have a really succinct implementation of a persistent map with $O(\log N)$ querying and (hopefully) $O(\log N)$ amortized average-case insertion. Is this correct, or is the insertion actually $O(\log^2 N)$?
↵
I know there's a pathological case when the total size of the dicts is a power of two and an insertion ends up copying the whole map, but let's say that's not the average case.
↵
Also, for anyone unfamiliar: Python dictionaries are hash maps, so insertion and lookup are amortized $O(1)$.
↵
```python3↵
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class LinkedList:
    parent: LinkedList | None
    curr_dict: dict[str, int]


def update(parent: LinkedList, key: str, value: int) -> LinkedList:
    curr_dict = {key: value}
    # Merge upward while the parent's dict is no larger than ours, so
    # dict sizes roughly double as we walk toward the root.
    while parent.parent is not None and len(parent.curr_dict) <= len(curr_dict):
        # Older (parent) entries go underneath so newer values win.
        curr_dict = parent.curr_dict | curr_dict
        parent = parent.parent
    return LinkedList(parent=parent, curr_dict=curr_dict)


def query(node: LinkedList | None, key: str) -> int | None:
    while node is not None:
        # Check `is not None` so a stored value of 0 is still found.
        if (value := node.curr_dict.get(key)) is not None:
            return value
        node = node.parent
    return None
```
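To show the persistence behavior I'm after, here's a quick usage sketch. It assumes an empty root sentinel `LinkedList(parent=None, curr_dict={})` (which `update` never merges past), and repeats the definitions so the snippet runs on its own:

```python3
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class LinkedList:
    parent: LinkedList | None
    curr_dict: dict[str, int]


def update(parent: LinkedList, key: str, value: int) -> LinkedList:
    curr_dict = {key: value}
    while parent.parent is not None and len(parent.curr_dict) <= len(curr_dict):
        curr_dict = parent.curr_dict | curr_dict  # newer entries win
        parent = parent.parent
    return LinkedList(parent=parent, curr_dict=curr_dict)


def query(node: LinkedList | None, key: str) -> int | None:
    while node is not None:
        if (value := node.curr_dict.get(key)) is not None:
            return value
        node = node.parent
    return None


# Each update returns a new version; old versions are untouched.
root = LinkedList(parent=None, curr_dict={})
v1 = update(root, "a", 1)
v2 = update(v1, "a", 2)
v3 = update(v2, "b", 3)

print(query(v1, "a"))  # 1 -- the old version still sees the old value
print(query(v3, "a"))  # 2
print(query(v2, "b"))  # None -- "b" didn't exist yet in v2
```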
↵