Run integration tests on Windows
Fix running integration tests on Windows
At least they now work when using `go run cmd/integration_test/main.go cli` or `go run cmd/integration_test/main.go tui`.
I haven't been able to get them to run headless (i.e. using `go test pkg/integration/clients/*.go`), which prevents us from running them on CI.
Coverage summary from Codacy
See diff coverage on Codacy
| Coverage variation | Diff coverage |
|---|---|
| :white_check_mark: 0.00% (target: -2.00%) | :white_check_mark: 64.71% |
Coverage variation details
| Commit | Coverable lines | Covered lines | Coverage |
|---|---|---|---|
| Common ancestor (221ebdc1fbfbaa02401d0218749430603038d801) | 46428 | 38374 | 82.65% |
| Head (8311933de43b890868482686940b72506d45ff4f) | 46437 (+9) | 38379 (+5) | 82.65% (0.00%) |
Coverage variation is the difference between the coverage for the head and common ancestor commits of the pull request branch: <coverage of head commit> - <coverage of common ancestor commit>
Diff coverage details
| Change | Coverable lines | Covered lines | Diff coverage |
|---|---|---|---|
| Pull request (#3236) | 17 | 11 | 64.71% |
Diff coverage is the percentage of lines that are covered by tests out of the coverable lines that the pull request added or modified: <covered lines added or modified>/<coverable lines added or modified> * 100%
Nice work. Running integration tests on CI is dependent on https://github.com/creack/pty/pull/155, which has been around for ages but may actually be close to merging.
Hi Stefan, I'm seeing a consistent failure on your branch (without the cherry-pick from the rev-parse branch) with the message:

> unexpected number of lines in view 'files'. Expected equals 1, got 0

via:

`go run cmd/integration_test/main.go cli pkg/integration/tests/custom_commands/form_prompts.go`

This is run from a cmd shell, with current Go for Windows and Git for Windows. Are you able to see this failure? If not, is there anything I should know about the Windows repro environment?
> Nice work. Running integration tests on CI is dependent on creack/pty#155 which has been around for ages but may actually be close to merging
Ah, I see. Too bad. Judging from how long this issue has already been going on, I wouldn't be so sure that it's close to merging. I wonder if it's worth switching to photostorm's fork in the meantime, which seems to have the fix already.
> Hi Stefan, I'm seeing a consistent failure on your branch
Yeah, sorry, I lied when I said they succeed for me. :-/ I thought I had seen a successful run locally, but I must have dreamed that. I'll look into it in the afternoon; we might just have to skip that test too.
Unfortunately, using `cli` to run all tests stops at the first one that fails, so we don't know how many more there are...
I'm giving up.
I disabled a few more tests on Windows in an attempt to make things green, but I can't get a complete run of `go run cmd/integration_test/main.go cli`. Some tests either fail inconsistently or hang (this happened several times: a prompt was waiting for text input and never continued from there). This is too frustrating.
I also tried moving from creack/pty to photostorm's fork, as mentioned above, hoping that this would make it possible to run the integration tests in `go test` mode. Using photostorm's fork directly wasn't possible because its module declaration is wrong, so I had to make my own fork. It didn't help, though; the code builds now, but running the tests still didn't work (they seemed to hang without doing anything).
My time budget is used up for this. If somebody else feels like picking this up, please do!
Proposal: let's just disable EVERY remaining Windows test failure from the suite. Get the whole thing, even at reduced capacity, reliably passing via manual runs. That's a lot better than what we have now, and then it becomes possible to work incrementally towards a better future.
Yes, that's what I tried to do. But I reached a point where tests were either failing or hanging inconsistently, i.e. different ones with every run. Feel free to give it a try too, maybe you have better luck.
I was hoping to catch one of those intermittent failure/hang cases in the debugger, because that really smells like an integration test harness bug. But it's looking like that's just not going to happen on my Windows ARM VM: the VS Code Go debugger refuses to attach. I already had to swap the windows/arm64 Go toolchain for the Intel build, since one of the VS Code debugging dependencies (dlv) explicitly fails to install as unsupported on ARM. Switching to the amd64 Go build fixes that, but even so the debugger refuses to attach.
I'll try again on my Windows Intel VM, but it'll take a bit to set it up for dev work.
As of this morning, I finally have my Intel Windows VM set up for lazygit work. I can successfully reproduce failures. 😅
I just had a go at attaching a debugger, and it nominally worked, except that lazygit's `isDebuggerAttached()` logic is broken on Windows and waits forever, because the parent executable name is not "debugserver" but rather "main.exe". I need to dig deeper to understand what's going on, and therefore identify what a fix might look like.
OK, confirmed. The debug workflow described in the comment on `isDebuggerAttached()` is completely wrong for Windows: the test_lazygit executable remains a child of the integration test runner's "main.exe" even after the debugger has attached. Some research is required to figure out the appropriate code here.
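For illustration, here's a minimal sketch of the kind of parent-process-name check being discussed, and why it's fragile: the answer depends entirely on how the process was launched. This reads `/proc`, so it only works on Linux; a Windows equivalent would need something like OpenProcess + QueryFullProcessImageNameW. The function names here are hypothetical, not lazygit's actual code.

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// parentExecutable returns the base name of the parent process's
// executable, read from /proc/<ppid>/exe (Linux only). A heuristic
// like `parentExecutable() == "debugserver"` breaks whenever the
// launcher differs, e.g. when the parent is a test runner's main.exe.
func parentExecutable() (string, error) {
	path, err := os.Readlink(fmt.Sprintf("/proc/%d/exe", os.Getppid()))
	if err != nil {
		return "", err
	}
	return filepath.Base(path), nil
}

func main() {
	name, err := parentExecutable()
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	fmt.Println(name)
}
```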
Ah, this SO answer suggests that the solution is to scrape all of the "dlv.exe" processes and check their arguments for a PID matching the waiting program's PID. But looking at an attached debug session with procmon, it's not clear that the PID is actually included in the arguments to dlv. :-ppppp
Relevant Windows API: https://learn.microsoft.com/en-us/windows/win32/api/debugapi/nf-debugapi-isdebuggerpresent
Edit: See also this blog post on calling Win APIs from Go
Closing this for now in order to clean up the bookkeeping of open PRs. I don't intend to work on it any time soon. We can reopen it if/when we pick this up again.