How

Open wuzeww opened this issue 4 years ago • 29 comments

Are GIGA, VGN, and GIGA-Aff trained for the same number of epochs? Are they all 20 epochs?

wuzeww avatar Oct 25 '21 06:10 wuzeww

Yeah, I think so.

Steve-Tod avatar Oct 26 '21 01:10 Steve-Tod

But the result of GIGA-Aff is higher than in the paper.

wuzeww avatar Oct 26 '21 01:10 wuzeww

I am currently working on my graduation project based on your paper, so I am eager to know the specific number of epochs used in your work. You can also reply to me by email: [email protected]. Thank you very much!

wuzeww avatar Oct 26 '21 01:10 wuzeww

> But the result of GIGA-Aff is higher than in the paper.

That's possible; different devices and different random seeds can give different results. I suggest running more tests with different random seeds. By the way, how much higher?
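
A minimal sketch of what multi-seed evaluation looks like (the `evaluate` function here is a hypothetical stand-in for a single simulated grasping run, not the repo's actual test script):

```python
import random

import numpy as np


def set_seed(seed: int) -> None:
    # Fix the RNGs so one evaluation run is reproducible; for the real
    # models you would also call torch.manual_seed(seed) here.
    random.seed(seed)
    np.random.seed(seed)


def evaluate(seed: int) -> float:
    # Stand-in for one simulated grasping run: returns a success rate
    # in [0, 1]. Replace with the repo's actual evaluation script.
    set_seed(seed)
    return float(np.random.uniform(0.80, 0.90))


# Average the success rate over several seeds instead of trusting one run.
seeds = [0, 1, 2, 3, 4]
rates = [evaluate(s) for s in seeds]
print(f"mean={np.mean(rates):.3f}, std={np.std(rates):.3f}")
```

Reporting the mean and standard deviation over the seeds makes single-run fluctuations visible, which is why a 5-point gap from one run is not necessarily meaningful.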

Steve-Tod avatar Oct 26 '21 02:10 Steve-Tod

5 or 6 percentage points

wuzeww avatar Oct 26 '21 02:10 wuzeww

Is that the average result from multiple random seeds?

Steve-Tod avatar Oct 26 '21 03:10 Steve-Tod

Yes, the random seeds are [0, 1, 2, 3, 4], and the number of epochs is 20. I tested twice, and both results were higher than in the paper.

wuzeww avatar Oct 26 '21 03:10 wuzeww

Hmmm, how about the result of GIGA? Is it better than GIGA-Aff?

Steve-Tod avatar Oct 26 '21 22:10 Steve-Tod

Hmm, the results of GIGA-Aff are sometimes better than GIGA's, by about 1 percentage point.

wuzeww avatar Oct 27 '21 03:10 wuzeww

Hmmm, that's weird. Have you checked the loss curves and made sure they both converged?

Steve-Tod avatar Oct 27 '21 14:10 Steve-Tod

This is the loss graph of GIGA-Aff trained for 20 epochs:

(screenshot 2021-10-28 09-59-19: loss curve, 20 epochs)

This is for 10 epochs:

(screenshot 2021-10-28 09-59-26: loss curve, 10 epochs)

wuzeww avatar Oct 28 '21 02:10 wuzeww

Hi,

I trained GIGA-Aff for 10 epochs, and the result is 5 percentage points higher than in the paper. I wonder if there is a problem with the code.

wuzeww avatar Oct 29 '21 05:10 wuzeww

How about the training curve of GIGA? Have you trained GIGA?

Steve-Tod avatar Oct 29 '21 17:10 Steve-Tod

This is the loss curve of GIGA:

(screenshot 2021-10-30 20-51-43: GIGA loss curve)

wuzeww avatar Oct 30 '21 12:10 wuzeww

So GIGA trained with the same number of epochs performs worse than GIGA-Aff?

Steve-Tod avatar Oct 30 '21 21:10 Steve-Tod

Yes.

wuzeww avatar Oct 31 '21 02:10 wuzeww

Hmmm, that's weird. What scenario are you using? Packed or pile?

Steve-Tod avatar Oct 31 '21 19:10 Steve-Tod

Packed. Could the cause be training instability?

wuzeww avatar Nov 01 '21 02:11 wuzeww

Not sure about that. GIGA should perform better than GIGA-Aff, especially in packed scenarios.

Steve-Tod avatar Nov 01 '21 05:11 Steve-Tod

Sorry for the late reply.

It's weird. I retrained GIGA-Aff, and the result was lower than before, even lower than in the paper.

wuzeww avatar Nov 03 '21 01:11 wuzeww

Did you retrain with the same settings?

Steve-Tod avatar Nov 03 '21 15:11 Steve-Tod

On a different computer.

wuzeww avatar Nov 03 '21 15:11 wuzeww

The absolute value may vary because the test scenes can be different, but the relative performance between GIGA and GIGA-Aff should stay the same. However, you said GIGA is worse than GIGA-Aff previously, which is very weird. Did you train GIGA and GIGA-Aff on the same computer?

Steve-Tod avatar Nov 03 '21 16:11 Steve-Tod

> The absolute value may vary because the test scenes can be different, but the relative performance between GIGA and GIGA-Aff should stay the same. However, you said GIGA is worse than GIGA-Aff previously, which is very weird. Did you train GIGA and GIGA-Aff on the same computer?

Yes, I trained them on the same computer. Even when training the same model on the same computer, the resulting loss curves are different.

wuzeww avatar Nov 05 '21 03:11 wuzeww

The loss curve can be different; training on different computers is OK. The important thing is testing on the same computer, so that after fixing the random seed, the generated scenes will be the same. (I should have asked whether you tested them on the same computer; that was a typo.)
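
To illustrate the point, a minimal sketch (`generate_scene` is a hypothetical scene sampler, not the repo's actual generator): once the seed is fixed, the sampled scene is identical across runs, so two models evaluated on the same computer with the same seeds see exactly the same test scenes.

```python
import numpy as np


def generate_scene(seed: int, num_objects: int = 5) -> np.ndarray:
    # Hypothetical scene sampler: object positions drawn from a
    # seeded generator, so the same seed yields the same scene.
    rng = np.random.default_rng(seed)
    return rng.uniform(0.0, 0.3, size=(num_objects, 3))


scene_a = generate_scene(42)
scene_b = generate_scene(42)
print(np.array_equal(scene_a, scene_b))  # same seed, same scene
```

This is why differing training machines are harmless but differing test machines (or test seeds) make the absolute success rates incomparable.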

Steve-Tod avatar Nov 05 '21 15:11 Steve-Tod

I tested them on the same computer.

wuzeww avatar Nov 05 '21 15:11 wuzeww

Not sure why this happens. I'll look back into this and re-train it myself later.

Steve-Tod avatar Nov 05 '21 15:11 Steve-Tod

Thank you very much! 🙆 Greatly appreciated!

wuzeww avatar Nov 05 '21 15:11 wuzeww