Hello!
I was having some genuine performance issues with my game (which have since gone away after some refactoring, and after realising that streaming in OBS with window capture while Windows Aero is enabled is a baaaaaad idea). Investigating them highlighted something odd with how I am tracking frames per second.
I have written two functions for tracking fps - one using the dt reported by the engine, and one using differences in os.clock(). The dt function reports 53 fps as the maximum framerate. The os.clock() function reports 60 fps as the maximum. The difference of ~7 fps seems consistent, as they both drop together if I do something intense. I have variable_dt enabled.
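As a quick sanity check on the numbers themselves, the two readings imply noticeably different average frame times (just arithmetic on the figures above, nothing engine-specific):

```lua
-- What the two FPS readings imply per frame:
local dt_ms = 1 / 53 * 1000   -- engine-dt reading  -> ~18.9 ms/frame
local os_ms = 1 / 60 * 1000   -- os.clock() reading -> ~16.7 ms/frame
print(string.format("dt: %.1f ms, os: %.1f ms, gap: %.1f ms",
                    dt_ms, os_ms, dt_ms - os_ms))
-- prints "dt: 18.9 ms, os: 16.7 ms, gap: 2.2 ms"
```

So the engine-reported dt is carrying roughly 2 ms more per frame than the wall-clock measurement, consistently.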
Any ideas on why this might be? It could very well be an issue with my functions, so I will attach them below just in case.
dt function:
```lua
local dt_table = {}
local dt_sum = 0
local frames_per_second = 0
local seconds_to_track = 2
local expected_fps = 60

local function show_fps(dt)
    dt_sum = dt_sum + dt
    table.insert(dt_table, dt)
    local len = #dt_table
    frames_per_second = len / dt_sum
    -- Keep a sliding window of roughly seconds_to_track seconds of samples
    if len >= seconds_to_track * expected_fps then
        dt_sum = dt_sum - dt_table[1]
        table.remove(dt_table, 1)
    end
    local n = gui.get_node("fps")
    gui.set_text(n, "FPS (dt): " .. math.floor(frames_per_second + 0.5))
end
```
os.clock() function:
```lua
local last_time = os.clock()
local os_dt_table = {}
local os_dt_sum = 0
local os_frames_per_second = 0
local os_seconds_to_track = 2
local os_expected_fps = 60

local function show_os_fps()
    -- Measure the time elapsed since the previous call
    local dt = os.clock() - last_time
    os_dt_sum = os_dt_sum + dt
    table.insert(os_dt_table, dt)
    local len = #os_dt_table
    os_frames_per_second = len / os_dt_sum
    -- Keep a sliding window of roughly os_seconds_to_track seconds of samples
    if len >= os_seconds_to_track * os_expected_fps then
        os_dt_sum = os_dt_sum - os_dt_table[1]
        table.remove(os_dt_table, 1)
    end
    last_time = os.clock()
    local n = gui.get_node("fps_os")
    gui.set_text(n, "FPS (OS): " .. math.floor(os_frames_per_second + 0.5))
end
```
Any insight would be greatly appreciated!