
Yeah I have the 3070 and I hate that it has the same amount of ram as a 1080 from 2016 while costing more than a 1080 did at the time.



That sort of feels like complaining that your Ferrari has the same size gas tank as a Civic. It's not why the car was made.


The size of a car's gas tank has no effect on performance. I think the RAM size here does, so your analogy is not applicable.


For gaming (which is what these cards are sold and marketed for), it doesn't make a difference today.


It depends on the games you play. I often run out of VRAM when playing VR games, so I have to reduce resolution. Even with an 8 GB card.


How do you know that's a memory issue? I'd expect that to be less of a concern in VR games, where you're rendering the same resources from two slightly different perspectives, which should stress not memory but pure GPU throughput.


No, it is - in terms of gaming performance, the thing the 3070 is primarily made for, it's much faster than the 1080 despite having the same amount of RAM. It's probably much faster in machine learning workloads as well.


>The size of a car's gas tank has no effect on performance

It has a dramatic effect on performance when you run out of gas -- a little bit like the brick wall you hit when you run out of memory. The GP's analogy isn't very accurate, but it's not totally wrong.


I guess the analogy works if you could start ripping out random components from the car, putting them in the gas tank, and burning them as fuel.


I was posting in the context of deep learning specifically, per the top-level comment.


Well yeah, the comment still applies: GeForce is made for gaming, so the bill of materials goes toward what improves gaming performance.


It should still handle deep learning workloads much faster than the 1080 though, shouldn't it?


Not relevant if your model doesn't fit in GPU memory.
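For a ballpark check on whether a model even fits, something like this is usually enough (a rough sketch assuming PyTorch/torchvision; the ResNet-152 is just a stand-in for whatever you're training):

    import torchvision

    # Hypothetical example: ResNet-152 stands in for "your model".
    model = torchvision.models.resnet152()
    n_params = sum(p.numel() for p in model.parameters())

    # fp32 weights are 4 bytes each. Training roughly quadruples that
    # (gradients plus Adam's two extra state tensors per parameter),
    # and activations come on top, scaling with batch size.
    weights_gb = n_params * 4 / 1e9
    print(f"{n_params / 1e6:.0f}M params, ~{weights_gb:.2f} GB for weights alone")

Once gradients, optimizer state, and activations are counted, an 8 GB card hits the wall much sooner than the raw parameter count suggests.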


That's fair, but the 3070 isn't being marketed primarily as a deep learning machine. I don't know why it's the fault of a midrange gaming card that it doesn't meet a deep learning requirement. The 3090, which is much more expensive, mind you, is the one you'd want with its 24 GB of VRAM.



