|
Post by aspiring3dmod on Apr 3, 2021 20:51:32 GMT
Just curious, does anybody know anything about game development? Like, how many polygons per scene are acceptable?
I can't find any info anywhere on how many polygons Ampere graphics cards can process per frame. Is 4-6 billion too much?
|
|
|
Post by Rancor022 on Apr 3, 2021 23:09:36 GMT
I'm not sure if this'll help answer your question: computergraphics.stackexchange.com/questions/1793/how-many-polygons-in-a-scene-can-modern-hardware-reach-while-maintaining-realtim. Another, more recent, quote I found: "Today's graphics APIs don't really care how many triangles you use (triangle data are stored in buffers on the GPU and the API only sends a buffer id to the GPU; it doesn't matter if it is 1 triangle or 1 billion triangles from the API's point of view). (The APIs care about things like draw calls, state changes, etc.)
A modern GPU can handle several billion triangles per second in theory. In practice a few hundred million per second shouldn't be a problem in the normal case."
|
|
|
Post by aspiring3dmod on Apr 4, 2021 1:26:07 GMT
Yeah, I tried reading that; there doesn't seem to be a consensus. The thing I don't understand is whether you have to multiply the triangles in the scene by 30 or 60, depending on the game's refresh rate? :/
|
|
|
Post by aspiring3dmod on Apr 6, 2021 23:24:24 GMT
Anybody know if 12 billion triangles per second is too much?? (total on screen × fps)
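For what it's worth, the arithmetic behind that figure is just per-frame count times frame rate, which you can then compare against whatever per-second throughput you believe your GPU sustains. A minimal sketch (the scene size and the 10-billion/s budget below are illustrative assumptions, not measured hardware specs):

```python
# Triangle throughput demanded of the GPU = triangles drawn per frame
# multiplied by the target frame rate.

def triangles_per_second(per_frame: int, fps: int) -> int:
    """Total triangles the GPU must process each second."""
    return per_frame * fps

# e.g. a hypothetical 200-million-triangle scene rendered at 60 fps:
demand = triangles_per_second(200_000_000, 60)  # 12 billion per second

# Assumed sustained budget for a modern card (placeholder number):
budget = 10_000_000_000

print(demand, demand <= budget)
```

So yes, under this framing you do multiply the on-screen count by 30 or 60; whether the result is "too much" depends on the throughput number you trust for your card.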
|
|
|
Post by Rancor022 on Apr 7, 2021 0:14:02 GMT
Not sure if there's any useful info in here for you:
https://www.reddit.com/r/gamedev/comments/61zrap/what_polycount_is_considered_standard_in_video/
|
|