Inconsistent NetworkTime.time Sync Between Client and Server During Debugging
Description:
I'm experiencing intermittent desynchronization of NetworkTime.time between the client and server while debugging the server in a Mirror Networking project. I understand that debugging pauses the server and prevents it from sending data, which can affect time synchronization. However, it would be helpful if the server could use UTC or another wall-clock mechanism to account for the time elapsed during debugging and stay in sync with the client. (Even a single debugger pause is enough to push it out of sync.)
Steps to Reproduce:
1. Create a simple Mirror Networking project in Unity.
2. Add Debug.Log statements that print NetworkTime.time on both the client and the server.
3. Launch and debug the server using a preferred debugger.
4. Launch the client and connect to the server.
5. Compare the logged NetworkTime.time values.
Expected Behavior:
Client and server NetworkTime.time values should be synced with minimal latency difference, even during debugging sessions.
Actual Behavior:
NetworkTime.time values intermittently desynchronize during debugging, with larger discrepancies than expected.
Suggested Solution:
Implement a mechanism, such as using UTC, to accurately account for elapsed time during server debugging and maintain proper synchronization with the client.
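To make the suggestion concrete, here is a minimal sketch (Python for illustration only; the class and names are hypothetical, not Mirror's actual implementation) of deriving elapsed time from a monotonic wall clock instead of accumulated frame deltas, so that time spent paused in a debugger is still counted:

```python
import time

class WallClockTime:
    """Illustrative sketch: measure elapsed time against a monotonic
    clock instead of summing per-frame deltas, so pauses still count."""

    def __init__(self):
        self.start = time.monotonic()

    @property
    def elapsed(self):
        # Reflects real elapsed time, even if the game loop
        # (or a debugger) stalled between frames.
        return time.monotonic() - self.start

clock = WallClockTime()
time.sleep(0.2)  # simulate a debugger pause / hiccup
assert clock.elapsed >= 0.2
```

The same idea applies whether the reference clock is UTC or a process-local monotonic timer; the key property is that it keeps advancing while the frame loop does not.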
Additional Information:
Unity Version: 2021.3.18
Mirror Networking Version: 71.0.0
Operating System: Windows 11
Debugger Used: Visual Studio 2022
I've come here with a similar issue, though in my case it happens even without a debugger attached. My clients will, seemingly at random, run their network time fast compared to the server. I've seen the server running in a Docker container print its network time every 2 seconds as expected, but over the same interval the network time the client prints increases by 3 or 4 seconds, so the gap keeps growing.
I'm baffled as to what might be causing this, but it sounds similar to your issue, @jienma.

thanks guys. we'll take a look asap.
@j-c-levin @jienma could you try with the latest Mirror from GitHub? We actually introduced a 'timeline clamp' about 1-2 weeks ago, which guarantees that the timeline will never get too far behind/ahead. Basically, there's a tolerance area where it will slow down / catch up, and beyond that tolerance it will hard clamp.
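As a rough illustration of the behavior described above (a sketch only; the function name, thresholds, and adjustment factors here are made up, not Mirror's actual code), a timeline clamp can be expressed as:

```python
def adjust_timeline(local_time, target_time,
                    tolerance=0.5, hard_limit=2.0, dt=0.02):
    """Sketch of a 'timeline clamp':
    - within `tolerance` of the target: advance normally
    - between `tolerance` and `hard_limit`: run ~10% fast/slow to converge
    - beyond `hard_limit`: hard-snap to the target."""
    drift = target_time - local_time
    if abs(drift) > hard_limit:
        return target_time                            # hard clamp
    if abs(drift) > tolerance:
        # catch up (behind, drift > 0) or slow down (ahead, drift < 0)
        return local_time + dt * (1.1 if drift > 0 else 0.9)
    return local_time + dt                            # within tolerance

# Far behind the target: hard clamp snaps straight to it.
assert adjust_timeline(0.0, 5.0) == 5.0
```

Within the tolerance the timeline runs at normal speed; in the band between `tolerance` and `hard_limit` it speeds up or slows down to converge; beyond `hard_limit` it snaps straight to the target, which is what bounds how far behind/ahead the timeline can ever get.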
I experimented with version 78.1.2, and it indeed performed better. However, I believe the improvement resulted from the reversion to the StopWatch implementation specifically in that version. I haven't had the opportunity to test a more recent release yet.
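The reason a Stopwatch-based implementation behaves differently under pauses can be sketched as follows (Python purely for illustration): accumulating per-frame deltas stops counting while the process is stalled, whereas a stopwatch-style monotonic clock keeps running through the stall, so the two sources drift apart:

```python
import time

# Frame-delta accumulation vs. a stopwatch (monotonic clock):
frame_time = 0.0
stopwatch_start = time.monotonic()

for i in range(5):
    frame_time += 0.02           # a fixed 20 ms "frame delta"
    if i == 2:
        time.sleep(0.3)          # simulate a debugger pause mid-loop

stopwatch_time = time.monotonic() - stopwatch_start
# frame_time stays at ~0.1 s, but the stopwatch includes the 0.3 s
# stall, so the two time sources diverge.
assert stopwatch_time - frame_time > 0.2
```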
ok. can we close this issue?
It looks like the changes from version 78.1.2 got accidentally removed in later versions. This means the issue might still happen in versions after 78.1.2, so it's probably not a good idea to close this just yet.
I'm working on a big project so I don't update core packages too frequently.
What I realized was that in an older Mirror version (71.0.0), if you paused the server or host for a bit, the client's NetworkTime.time would go out of sync. I expected the client to slow down or the server to jump forward, but neither happened. This also meant that if the server had a hiccup or was loading something, the client would go out of sync. I thought it was intentional, and a lot of my networking code was written to account for this limitation.
Fast forward to earlier this year: I checked out the latest Mirror releases. I don't know exactly which commit did it, but I'm happy to see it fixed in the latest version. I'm planning to upgrade, but this will significantly affect how I handle time-related calculations.