
I've never understood why people get so mad that the database gets hit on every request. It's going to get hit anyway on every request for something else. I've seen how easy it is to break into systems that don't re-auth the user on each request. I guess I just feel like people spend an inordinate amount of time worrying about database performance. A single SELECT from a single table shouldn't be a performance bottleneck; a good database will have that query cached in memory. It may sound "icky", but if you look at the timings, I bet your Rails app spends way more time in the renderer than it does on the "select users.* from users" it uses to populate current_user.

I'm happy to be proven wrong, but I wrote a lot of apps before I moved to Rails, and we used to avoid re-authing / looking up the user on each request, and it never made a bit of difference to us.
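For what it's worth, the usual Rails idiom only hits the database once per request anyway, by memoizing the lookup. A minimal sketch (class and method names are hypothetical stand-ins, not real Rails internals; the fake table just counts queries to make the memoization visible):

```ruby
class FakeUsers
  # Hypothetical stand-in for the users table; counts queries so the
  # effect of memoization is observable.
  attr_reader :query_count

  def initialize
    @rows = { 1 => { id: 1, name: "alice" } }
    @query_count = 0
  end

  def find_by_id(id)
    @query_count += 1 # one "select users.* from users where id = ?"
    @rows[id]
  end
end

class RequestContext
  def initialize(session, users)
    @session = session
    @users = users
  end

  # Memoized per request: the SELECT runs at most once, and every later
  # call to current_user during the same request reuses the cached row.
  def current_user
    @current_user ||= @users.find_by_id(@session[:user_id])
  end
end

users = FakeUsers.new
request = RequestContext.new({ user_id: 1 }, users)
3.times { request.current_user }
puts users.query_count # => 1
```

So the cost being argued about is one indexed primary-key lookup per request, not one per call site.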




Query execution time is only part of the equation. But even that gets ugly fast when you have locking going on. E.g., try adding an index to a non-trivially sized table in MySQL and see how fast that (unnecessary) SELECT performs.

We're a Postgres shop, and the problem we've run into lately is just the sheer number of open connections to the DB. Something like PgBouncer helps tremendously, but it's still contingent on there being idle connections. Read replicas could certainly be thrown at the problem too, but that's mounting complexity for something that isn't all that necessary. *shrug*
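For readers who haven't used it: PgBouncer sits between the app and Postgres and multiplexes many client connections onto a small server-side pool. A minimal sketch of a `pgbouncer.ini` (the host, paths, and sizes are illustrative assumptions, not tuned recommendations):

```ini
[databases]
; clients connect to pgbouncer on port 6432; it forwards to the real server
appdb = host=127.0.0.1 port=5432 dbname=appdb

[pgbouncer]
listen_addr = 127.0.0.1
listen_port = 6432
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt

; transaction pooling releases a server connection back to the pool as soon
; as each transaction ends, so thousands of client connections can share a
; few dozen real Postgres connections
pool_mode = transaction
max_client_conn = 2000
default_pool_size = 20
```

The catch the commenter alludes to: if every server connection in the pool is busy, new queries queue until one frees up, so pooling helps with connection count but not with raw query throughput.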



