|
Post by Blade died for our sins on Dec 9, 2015 22:21:56 GMT
I'm almost certain the reason times are quicker on PC is that there are more frames to play with: a dropped frame on PC costs you 1/60th of a second, whereas on console it costs 1/30th. Given there are always going to be dropped frames, their smaller impact on PC allows lap times to be quicker. The super kerb boosting on PC is probably related to this in some way, like being at fractionally higher revs in every 1/30th-of-a-second interval between the frames that would normally be there on console.

Quote: "Very interesting if true. People did tell me they were going a lot faster at 144fps. Someone with an expensive monitor should test this."

You don't need to have a 144Hz monitor to achieve 144fps.
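As a rough sketch of the arithmetic behind this claim (the "10 dropped frames per lap" figure is purely illustrative, not from anyone's testing): the same number of dropped frames represents less real time at a higher target framerate.

```python
# Illustrative arithmetic only: each dropped frame represents one
# frame-interval of lost time, so the same number of drops costs
# less wall-clock time at a higher target framerate.

def time_lost(dropped_frames, target_fps):
    """Wall-clock seconds represented by the dropped frames."""
    return dropped_frames / target_fps

# Assume (hypothetically) ten dropped frames over a lap:
console = time_lost(10, 30)   # ~0.333 s
pc_60   = time_lost(10, 60)   # ~0.167 s
pc_144  = time_lost(10, 144)  # ~0.069 s

print(f"30fps: {console:.3f}s  60fps: {pc_60:.3f}s  144fps: {pc_144:.3f}s")
```

So under this simple model, the cost of a drop halves going from console to 60fps, and shrinks further still at 144fps.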
|
|
Hertz
Member
Winner of Chiliad-Sandy Triathlon
Posts: 580
Registered on: December 2015
|
Post by Hertz on Dec 9, 2015 22:23:40 GMT
Quote: "Very interesting if true. People did tell me they were going a lot faster at 144fps. Someone with an expensive monitor should test this. You don't need to have a 144Hz monitor to achieve 144fps."

But would it still act the same way though?
|
|
|
Post by Blade died for our sins on Dec 9, 2015 22:41:32 GMT
Quote: "You don't need to have a 144Hz monitor to achieve 144fps. But would it still act the same way though?"

Yes. What matters is the framerate the game is running at, not the framerate you see.
|
|
|
Post by CHILLI on Dec 9, 2015 22:55:50 GMT
tags: Broughy1322 method0ne Oh_Darn Grumples_Plox (grimreaper977) cameronman1329 Hertz DeadKelly

Broughy is on the right track when talking about the physics timesteps matching the framerate. From what I've been able to tell, in every GTA since GTA 3 the physics steps are partially tied to the frame times. Around 25fps or so and below, the physics will start to slow down (fixed timestep), but at 30fps and above the timestep will decrease to match the framerate (variable timestep). I don't know if there's a limit to how short a timestep can be. For all we know it could be beyond 128fps, 256fps, etc.

This ties in to what DeadKelly was questioning. Some old games like NFS HP2 ran the physics on a separate thread, so if the game locked up or stuttered, the main game logic would still be running without you seeing it. I know for a fact that GTA V runs its inputs per-frame with a delay of about 2-3 frames (EDIT: The delay isn't actually that bad now. Playing at 60/30fps feels quite solid). When locked to 60fps (the standard Vsync setting for the game) there's a small amount of input latency. When locked to 30fps it feels just as bad as old-gen (and possibly current-gen?), with a terrible amount of input latency. The last game I saw that ran at least somewhat independent physics/game logic was DiRT 2. It's really a shame that the majority of games aren't programmed with gameplay in mind anymore.

As a side note, what Vsync does is force the graphics thread to wait before displaying a new result. So to keep it at 60fps it will check how long the update took and add on the remaining amount of time to wait before updating the screen. Regarding G-sync, according to the info I've found it's all done by the monitor instead of the CPU, so it won't have any negative effects on whatever the computer is doing.

Also, I've noticed that the collision detection works more or less the same way as on old-gen, where it takes one or two physics steps to complete an "impact" cycle. By this I mean there's a short pause in movement between hitting a breakable object, the object breaking loose, and forces being applied to the vehicle and said object. This can be noticed fairly easily by running down the small signs in the city or the wooden poles around the Sandy Shores Airfield sand paths. This is probably a solution to keep the framerate high while sacrificing impact precision. Whether this holds true for other calculations, I don't know, because it's quite difficult to test for differences in smooth movement changes. Though it might explain why hitting a car from the outside of a turn can sometimes "glue" the two together, essentially keeping the speed of the car on the outside.

And lastly, regarding the first question about inputs: because of the higher framerate on PC (unless you run on Vsync Half or experience a low framerate), the steering smoothing will be able to change in smaller steps. For example, at 30fps the steering might go from 0 degrees to 10 degrees between two frames. On PC, assuming it's running at 60fps, it will go 0 -> 5 -> 10 in the same amount of time, and the physics will get to react accordingly. The smoother you steer, the less risk there is of getting oversteer. So at 120fps and beyond it will become even harder to sort of throw the car into corners with the same steering input, assuming you're trying to steer less than 100%. But even at 100% it will still be a bit smoother, so having a crazy high framerate will give you the ultimate steering precision while limiting the extreme kinds of aggressive steering like keyboard or flicking the stick.

TL;DR: A higher framerate = more physics updates and input polling. Also, a framerate below 25-30fps will cause the game to play in slow-mo.
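The fixed-below-threshold, variable-above behaviour CHILLI describes can be sketched like this. This is a hypothetical model built only from the post, not Rockstar's actual code; the 25fps threshold and the function name are assumptions.

```python
# Hypothetical sketch (NOT the game's real code) of the timestep
# behaviour described above: the physics step matches the real frame
# time, but is clamped at a maximum, so frames slower than ~25fps
# make the simulation fall behind real time (slow motion).

MIN_FPS = 25.0            # assumed threshold from the post
MAX_STEP = 1.0 / MIN_FPS  # longest timestep the physics will take

def physics_step(frame_time):
    """Timestep (seconds of simulated time) advanced this frame."""
    return min(frame_time, MAX_STEP)  # variable timestep, clamped

# At 60fps the simulation tracks real time exactly:
print(physics_step(1/60))            # one full frame of sim time

# At 15fps the step is clamped, so one real second only advances
# the simulation by 15 * MAX_STEP = 0.6 simulated seconds:
print(15 * physics_step(1/15))
```

Under this model, anything above the threshold simply gets more, smaller physics updates per real second, matching the TL;DR.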
|
|
|
Post by Broughy1322 on Dec 9, 2015 23:07:11 GMT
|
|
Hertz
Member
Winner of Chiliad-Sandy Triathlon
Posts: 580
Registered on: December 2015
|
Post by Hertz on Dec 10, 2015 0:20:35 GMT
Quote: "This ties in to what DeadKelly was questioning. Some old games like NFS HP2 ran the physics on a separate thread, so if the game locked up or stuttered, the main game logic would still be running without you seeing it. I know for a fact that GTA V runs its inputs per-frame with a delay of about 2-3 frames. When locked to 60fps (the standard Vsync setting for the game) there's a small amount of input latency. When locked to 30fps it feels just as bad as old-gen (and possibly current-gen?), with a terrible amount of input latency. The last game I saw that ran at least somewhat independent physics/game logic was DiRT 2. It's really a shame that the majority of games aren't programmed with gameplay in mind anymore. As a side note, what Vsync does is force the graphics thread to wait before displaying a new result. So to keep it at 60fps it will check how long the update took and add on the remaining amount of time to wait before updating the screen. Regarding G-sync, according to the info I've found it's all done by the monitor instead of the CPU, so it won't have any negative effects on whatever the computer is doing."

So this is interesting stuff, because I was under the naive notion that game logic runs independent of whatever the physics engine chews out. Is there any reason why this is being done? Is it easier to program that way? You would certainly think that running them on separate threads would result in "cleaner" physics.

Edit: Furthermore, I have tested the 30fps condition. It certainly requires less stick flicking to turn a mild corner at full speed. At 60fps, it seems you need to flick the stick almost twice as fast to get similar steering.
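Hertz's observation lines up with the steering-smoothing idea from earlier in the thread. A minimal sketch of that model (purely illustrative; the function and the linear-smoothing assumption are mine, not the game's): if the input smooths toward the target over a fixed amount of real time, a higher framerate splits the same change into more, smaller per-frame steps.

```python
# Assumed linear smoothing model, not the game's actual steering code:
# the steering angle moves toward the target over a fixed real-time
# window, one increment per rendered frame.

def steering_steps(start_deg, target_deg, duration_s, fps):
    """Per-frame steering angles while smoothing toward the target."""
    frames = round(duration_s * fps)
    step = (target_deg - start_deg) / frames
    return [start_deg + step * i for i in range(1, frames + 1)]

# Going from 0 to 10 degrees over one 30fps frame's worth of time:
print(steering_steps(0, 10, 1/30, 30))  # one big jump
print(steering_steps(0, 10, 1/30, 60))  # two smaller steps
```

The physics reacts to each intermediate angle, which would explain why the same flick of the stick produces a gentler response at 60fps than at 30fps.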
|
|
|
Post by CHILLI on Dec 10, 2015 6:21:42 GMT
Quote: "So this is interesting stuff, because I was under the naive notion that game logic runs independent of whatever the physics engine chews out. Is there any reason why this is being done? Is it easier to program that way? You would certainly think that running them on separate threads would result in 'cleaner' physics. Edit: Furthermore, I have tested the 30fps condition. It certainly requires less stick flicking to turn a mild corner at full speed. At 60fps, it seems you need to flick the stick almost twice as fast to get similar steering."

Coding for more than one thread becomes more difficult the more threads you have. Because of the nature of threads there's no way to have them perfectly in sync, so you can't tell two (or more) threads to do something and expect them to finish at exactly the same time. So the code needs to be written in such a way that any thread that isn't the main one returns its results in a safe fashion, so that the cycle doesn't get thrown out of whack. It's for reasons like this that you'll hear that running some games on Intel processors will grant you a higher and/or more stable framerate. For anyone that doesn't know, Intel usually scores higher in single-core performance benchmarks. One can assume what's going on behind the scenes in some games... That said, please don't rush to the stores and get an Intel processor. AMD should serve you perfectly fine unless you need that extra power for something else. Always buy according to your needs.

I also went back and checked capping the framerate at 60fps as well as 30fps, to verify that what I'm saying still applies. It turns out they have greatly reduced the input latency when using Vsync. 30fps actually feels quite solid now compared to just a month or two ago.

And to clarify the relation between the screen/monitor frequency (Hz) and frames per second: they're only related one way. As Blade died for our sins said earlier, there's a disconnect between the two. How it works is that the computer generates an image and sends it off to the screen. The screen then draws this image line by line until it has refreshed it all. At that point it keeps displaying the last retrieved image until a new one comes in, and then begins refreshing to the new one. If the time between the last and new image is shorter than the time required to finish drawing a full screen, you'll get what's known as screen tearing: the screen starts drawing the first image, but halfway through the data gets updated. The screen doesn't care, so it just keeps drawing the new image from the next line it's on, creating a sudden cut in the resulting image. The point of G-sync is to force the screen to keep drawing the last retrieved image from start to end before starting on the current one.

Keep in mind that during this entire process the application has not cared about the refresh rate at all. It just does what it's been told, which is to run its cycles as fast as it possibly can. In short: once you start seeing the image being cut into some number of segments, the framerate of the application is higher than the refresh rate of the screen. That's screen tearing.

Fun fact: it's for this reason you'd see old fat TVs or CRT monitors flicker. Turn down the refresh rate and the flickering will get even more annoying. This is also why you'll see banding when recording screens or around certain lights. For instance, look closely at the wall behind Broughy1322 in some of his videos and you should notice some dark bands kinda "creeping" along the wall. The camera capture rate is a tiny bit out of sync with the frequency of the lights, leaving some more or less noticeable lines in the image as a result. Broughy, I don't think your camera likes the lights in your room too much.
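The point about non-main threads needing to "return their results in a safe fashion" can be sketched with a thread-safe queue. This is a generic illustration of the technique, not anything from GTA V or NFS HP2; the worker function and its fake integration step are invented for the example.

```python
# Minimal sketch of safe cross-thread hand-off: the worker never
# touches shared game state directly; it pushes results into a
# thread-safe queue that the main loop drains when it's ready.
import threading
import queue

results = queue.Queue()

def physics_worker(steps):
    # Stand-in "physics": integrate a position over fixed steps.
    pos = 0.0
    for step in range(steps):
        pos += 1.0                 # placeholder for a real update
        results.put((step, pos))   # safe hand-off to the main thread

worker = threading.Thread(target=physics_worker, args=(3,))
worker.start()
worker.join()  # a real game loop would poll each frame instead

# Main thread consumes results in order, with no shared-state races:
while not results.empty():
    step, pos = results.get()
    print(f"step {step}: pos={pos}")
```

The cost is exactly what CHILLI describes: the main loop has to be written around this hand-off, and it can never assume the worker finished at the same instant it did.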
|
|
Hertz
Member
Winner of Chiliad-Sandy Triathlon
Posts: 580
Registered on: December 2015
|
Post by Hertz on Dec 10, 2015 7:02:47 GMT
Thanks, that cleared a lot of things up for me. I think I wasn't too clear on the fact that the application tries to provide maximum frames regardless of the output. On the other hand...

Quote: "It's for this reason you'd see old fat TVs or CRT monitors flicker. Turn down the refresh rate and the flickering will get even more annoying. This is also why you'll see banding when recording screens or around certain lights. [...] Broughy, I don't think your camera likes the lights in your room too much."

Well, at least you don't live in the hell that is NTSC-land. Oh dear god.
|
|
jsantospt
Member
¯\_(ツ)_/¯
Posts: 893
Registered on: January 2015
PSN ID: JSantosPT
Steam: 76561198123809208
Social Club: JSantosPT
Discord: JSantosPT#9246
|
Post by jsantospt on Dec 10, 2015 7:44:21 GMT
Quote: "EDIT: This is also why back on old gen you'd sometimes get the ultimate tryhards dropping down to 480p quality to set lap records (less frame drops). I imagine the way to be quickest for sure on PC is to turn down all the graphics settings and make sure you're experiencing 60fps for the maximum amount of time possible. But I'd prefer to have it look good personally XD"

I think consoles always render at the same resolution. Just like forcing 1080p isn't going to make the console render at 1080p (it upscales from 720p), forcing 480p would still make the console render at 720p and downscale it to 480p, possibly even adding input lag, since the image has to be downscaled after being rendered. I'm not even sure downscaling wouldn't put more stress on the system, since it would still run at the same resolution but be forced to downscale on top of that. Maybe I'm wrong, but I think that's how it works, at least on consoles. Some games do offer the possibility of being played at either 720p or 1080p, but I think no console game is coded to render at 480p.
|
|
|
Post by vxwk on Dec 10, 2015 8:03:01 GMT
Everyone was waiting for that CHILLI post. What a guy.
|
|
|
Post by Dnl_Jackson on Dec 10, 2015 12:19:57 GMT
The question that came to my mind after reading all of CHILLI's posts is: is it better to turn Vsync off for GTA Online racing or not?
|
|
bladecruiser
Member
Posts: 1,287
Registered on: June 2015
Social Club: BladeCruiser
|
Post by bladecruiser on Dec 10, 2015 12:24:23 GMT
Quote: "The question that came to my mind after reading all of CHILLI's posts is: is it better to turn Vsync off for GTA Online racing or not?"

If you don't have any problems with your framerate dipping drastically, then having it off is good. If you do, it's better to set it to full or half, whichever gives you no severe drops.
|
|
|
Post by Dnl_Jackson on Dec 10, 2015 12:30:53 GMT
Quote: "If you don't have any problems with your framerate dipping drastically, then having it off is good. If you do, it's better to set it to full or half, whichever gives you no severe drops."

I actually always have it on. I think I'd get around 80-90fps without it, but I guess it would drop harder or more often. With Vsync on it doesn't drop, or when it does, not as much. Maybe I'll test the game without Vsync and report my experience.
|
|
|
Post by Broughy1322 on Dec 10, 2015 12:41:45 GMT
Quote: "Fun fact: it's for this reason you'd see old fat TVs or CRT monitors flicker. [...] Broughy, I don't think your camera likes the lights in your room too much."

Yeah, I changed the refresh rate and it seemed to fix it. Won't be a problem at all soon though. Stupid webcam.
|
|
|
Post by Oh_Darn on Dec 10, 2015 12:44:00 GMT
Quote: "I actually always have it on. I think I'd get around 80-90fps without it, but I guess it would drop harder or more often. With Vsync on it doesn't drop, or when it does, not as much. Maybe I'll test the game without Vsync and report my experience."

This sounds like an interesting thing to test; I will do the same and report my experience as well. I will also run some tests without Vsync and all settings at their lowest, and see what fps I can get that way.
|
|