
> That means the number of elements scanned is proportional to the number of elements inserted.

And the number of elements inserted is in the worst case N, hence the proposed solution has a worst case complexity of O(N)?




> hence the proposed solution has a worst case complexity of O(N)?

Worst case yes, but the question is concerned with the average case.


It's not specified in the article?

Do that for rate limiting and it'll be super easy to DoS you - which just shows how fundamentally you can't brute-force your way out of algorithmic questions, despite the article's suggestion (or how interviews are fundamentally flawed, but that's another debate ;)


> It's not specified in the article?

From the article:

> The function should have expected O(1) performance.

> Do that for rate limiting and it'll be super easy to DoS you

How so? The rate limiter has the same performance as, for example, a hash table. Operations are usually O(1), but are periodically O(n). It's not like every service that uses a hash-table is DoS-able.


Genuinely curious: does "expected" in English necessarily mean average? My understanding was that "expected" depends on the application ___domain, so it can mean best, worst, or average case.

If you have an implementation that is O(N) in the worst case, it's (theoretically) DoSable, since an attacker would deliberately hit that case every time - so the expected complexity under attack is O(N).

A trivial solution that is O(60) = O(1) even in the worst case is to store the number of calls in each of the last 60 seconds in a fixed-size array and loop over it. With some arithmetic you can even avoid looping over it.
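A minimal sketch of that fixed-size-array idea (all names and the per-minute limit API are my own assumptions, not from the article): one bucket per second of the minute, with a parallel array recording which absolute second each bucket last counted, so stale buckets can be skipped. Every call does at most a 60-step loop, regardless of traffic.

```python
import time

class RateLimiter:
    """Sliding-window rate limiter with O(60) = O(1) worst-case cost.

    Hypothetical sketch: counts[i] holds the calls seen during the
    second whose timestamp satisfies ts % 60 == i, and last_seen[i]
    records which absolute second that bucket currently represents.
    """

    def __init__(self, max_per_minute):
        self.max_per_minute = max_per_minute
        self.counts = [0] * 60
        self.last_seen = [0] * 60  # absolute second each bucket was last written

    def allow(self, now=None):
        now = int(now if now is not None else time.time())
        i = now % 60
        # Reset the bucket if it still holds data from an older minute.
        if self.last_seen[i] != now:
            self.counts[i] = 0
            self.last_seen[i] = now
        # Fixed 60-step loop: sum calls from the trailing 60 seconds only.
        total = sum(self.counts[j] for j in range(60)
                    if now - self.last_seen[j] < 60)
        if total >= self.max_per_minute:
            return False
        self.counts[i] += 1
        return True
```

Because the loop bound is the constant 60, an attacker gains nothing by flooding the limiter - every call costs the same, which is the point being made about worst-case versus expected complexity.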


"Expected value" or "expectation" is a precise mathematical term in probability theory that corresponds to "average-case", always.



