Cyclone V: Timing a run in Microseconds

Hey,
I’ve been trying to time how long it takes my board to run a function 1000 times. I’ve checked my loops and made sure I’m not exiting them immediately, but alt_gpt_time_microsecs_get(ALT_GPT_CPU_GLOBAL_TMR) still returns zero. I’ve also verified that the timer actually starts when I start it, tried the other read calls (e.g. the millisecond and second variants), and increased the amount of computation being timed, but none of that changed the value I received. Could someone give me a basic example of initializing and reading ALT_GPT_CPU_GLOBAL_TMR? I haven’t been able to find any documentation on it.
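
For reference, the sequence I’m using looks roughly like this (names are from alt_globaltmr.h and alt_generalpurpose_timer.h in the SoC EDS HWLIB, and my_function_under_test() is just a stand-in for the code I’m timing):

#include <stdint.h>
#include <stdio.h>
#include "alt_globaltmr.h"
#include "alt_generalpurpose_timer.h"

extern void my_function_under_test(void);   /* placeholder for the function being timed */

void time_1000_calls(void)
{
    /* Bring up the CPU global timer and make sure it is counting. */
    alt_globaltmr_init();
    alt_globaltmr_start();

    uint32_t start_us = alt_gpt_time_microsecs_get(ALT_GPT_CPU_GLOBAL_TMR);

    for (int i = 0; i < 1000; i++) {
        my_function_under_test();
    }

    uint32_t end_us = alt_gpt_time_microsecs_get(ALT_GPT_CPU_GLOBAL_TMR);

    /* Both reads come back as zero for me, so the difference is always zero. */
    printf("elapsed: %lu us\n", (unsigned long)(end_us - start_us));
}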

Thanks in advance!

I figured out a workaround, though I’m still not sure why the time-read calls aren’t working correctly. I read the timer’s raw counter together with its frequency and converted the count difference to microseconds myself, which gives the elapsed time to within about an eighth of a microsecond.
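
In case it helps anyone else, the workaround looks roughly like this. It assumes alt_gpt_counter_get() and alt_gpt_freq_get() return the raw count and the frequency in Hz as described in alt_generalpurpose_timer.h, and that the CPU global timer counts up and has already been started:

#include <stdint.h>
#include "alt_generalpurpose_timer.h"

/* Convert the difference between two raw counter reads into microseconds,
 * using the frequency reported for the timer itself. */
static uint32_t counts_to_microsecs(uint32_t start_count, uint32_t end_count)
{
    uint32_t freq_hz = alt_gpt_freq_get(ALT_GPT_CPU_GLOBAL_TMR);   /* ticks per second */
    uint64_t ticks   = (uint64_t)(end_count - start_count);        /* global timer counts up */

    /* us = ticks * 1e6 / freq_hz; multiply in 64 bits to avoid overflow. */
    return (uint32_t)((ticks * 1000000ULL) / freq_hz);
}

Around the loop being timed it is used like this:

    uint32_t t0 = alt_gpt_counter_get(ALT_GPT_CPU_GLOBAL_TMR);
    /* ... run the function 1000 times ... */
    uint32_t t1 = alt_gpt_counter_get(ALT_GPT_CPU_GLOBAL_TMR);
    uint32_t elapsed_us = counts_to_microsecs(t0, t1);

The resolution is one timer tick (1/freq_hz seconds), which on my board works out to roughly an eighth of a microsecond.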