fill match arms assist put the code in the wrong place
rust-analyzer version: v0.4.1157
rustc version: rustc 1.64.0-nightly (9067d5277 2022-07-28)
https://user-images.githubusercontent.com/13777628/182836330-27603145-5ff2-4755-bbd0-2e003719c92f.mp4
Reloading the window doesn't solve the problem.
Minimal reproduction:

```rust
// lib.rs, with tokio
use tokio;

#[tokio::test]
async fn f() {
    enum E {
        A,
        B,
    }
    let e = E::A;
    match e$0 {
    }
}
```
I'll try to figure out what's wrong by myself. ~~Cannot reproduce on current master 😢. Turns out I was using an old master.~~ Reproduced.
Are macros or attributes involved here?

> Are macros or attributes involved here?

It's inside a `#[tokio::test]`.
Hmm, looks like we fail to map the `match` out of the macro expansion, and so we fall back to the attribute's range (as you can see, the attribute invocation is being replaced here).
So one thing we should change in the assist is this line:

https://github.com/rust-lang/rust-analyzer/blob/4904b2bdf8797f14fff3b585d18207161126acce/crates/ide-assists/src/handlers/add_missing_match_arms.rs#L201

It should be moved out of the `|builder| {` closure and replaced with `let old_range = ctx.sema.original_range_opt(match_arm_list.syntax())?.range;`. With that, the assist won't trigger in this scenario anymore instead of doing the wrong thing.
The actual issue of us not up-mapping properly requires some more debugging though; we might be getting a span wrong, which causes us to fail here.
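The fix suggested above boils down to a general pattern: hoist the fallible up-mapping out of the edit callback and bail with `?`, so the assist simply isn't offered when the range can't be mapped. A minimal self-contained sketch of that pattern (the names `original_range_opt`, `assist_buggy`, and `assist_fixed` are hypothetical stand-ins for the assist internals, not the real rust-analyzer APIs):

```rust
// Toy model of the assist flow: `original_range_opt` stands in for
// rust-analyzer's up-mapping, which can fail inside macro expansions.
fn original_range_opt(mapped: bool) -> Option<std::ops::Range<usize>> {
    if mapped { Some(10..20) } else { None }
}

// Buggy shape: the fallible lookup happens inside the edit callback,
// so a failure falls back to a wrong range instead of aborting.
fn assist_buggy(mapped: bool) -> String {
    let mut edit = String::new();
    let builder = |edit: &mut String| {
        let range = original_range_opt(mapped).unwrap_or(0..5); // wrong fallback!
        edit.push_str(&format!("replace {:?}", range));
    };
    builder(&mut edit);
    edit
}

// Fixed shape: hoist the lookup out and bail early with `?`,
// so the assist is not offered at all when up-mapping fails.
fn assist_fixed(mapped: bool) -> Option<String> {
    let range = original_range_opt(mapped)?; // bail before building the edit
    let mut edit = String::new();
    edit.push_str(&format!("replace {:?}", range));
    Some(edit)
}

fn main() {
    assert_eq!(assist_buggy(false), "replace 0..5"); // silently edits the wrong place
    assert_eq!(assist_fixed(false), None); // assist simply doesn't trigger
    assert_eq!(assist_fixed(true), Some("replace 10..20".to_string()));
    println!("ok");
}
```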
Any updates to this issue? I am hitting the same bug with rust-analyzer v0.4.1303.
Any updates to this issue? I am hitting the same bug with rust-analyzer v0.4.1303.
I'm pretty sure this problem is related to how rust-analyzer expands proc macros.

I added a `dbg!` on `db.parse_macro_expansion(macro_file).value` and found that all `TokenId`s for `Delimiter`s are `TokenId::unspecified()` when using the minimal reproduction code.
buggy `dbg!` output:

```
[crates/hir-expand/src/lib.rs:249] db.parse_macro_expansion(macro_file).value? = (
    Parse {
        green: GreenNode { kind: SyntaxKind(247), text_len: 36, n_children: 1 },
        errors: [],
        _ty: PhantomData,
    },
    TokenMap {
        entries: [
            (TokenId(0), Token(0..2)),
            (TokenId(1), Token(2..3)),
            (TokenId(4294967295), Delimiter(3..5)),
            (TokenId(4), Token(6..10)),
            (TokenId(5), Token(10..11)),
            (TokenId(7), Token(12..13)),
            (TokenId(8), Token(13..14)),
            (TokenId(9), Token(14..15)),
            (TokenId(10), Token(15..16)),
            (TokenId(4294967295), Delimiter(11..17)),
            (TokenId(11), Token(17..20)),
            (TokenId(12), Token(20..21)),
            (TokenId(13), Token(21..22)),
            (TokenId(14), Token(22..23)),
            (TokenId(15), Token(23..24)),
            (TokenId(16), Token(24..25)),
            (TokenId(17), Token(25..26)),
            (TokenId(18), Token(26..27)),
            (TokenId(19), Token(27..32)),
            (TokenId(20), Token(32..33)),
            (TokenId(4294967295), Delimiter(33..35)),
            (TokenId(4294967295), Delimiter(33..36)),
        ],
        synthetic_entries: [],
    },
)
```
`dbg!` output for the "unbuggy" implementation of `untouched`:

```
[crates/hir-expand/src/lib.rs:249] db.parse_macro_expansion(macro_file).value? = (
    Parse {
        green: GreenNode { kind: SyntaxKind(247), text_len: 36, n_children: 1 },
        errors: [],
        _ty: PhantomData,
    },
    TokenMap {
        entries: [
            (TokenId(0), Token(0..2)),
            (TokenId(1), Token(2..3)),
            (TokenId(2), Delimiter(3..5)),
            (TokenId(4), Token(6..10)),
            (TokenId(5), Token(10..11)),
            (TokenId(7), Token(12..13)),
            (TokenId(8), Token(13..14)),
            (TokenId(9), Token(14..15)),
            (TokenId(10), Token(15..16)),
            (TokenId(6), Delimiter(11..17)),
            (TokenId(11), Token(17..20)),
            (TokenId(12), Token(20..21)),
            (TokenId(13), Token(21..22)),
            (TokenId(14), Token(22..23)),
            (TokenId(15), Token(23..24)),
            (TokenId(16), Token(24..25)),
            (TokenId(17), Token(25..26)),
            (TokenId(18), Token(26..27)),
            (TokenId(19), Token(27..32)),
            (TokenId(20), Token(32..33)),
            (TokenId(21), Delimiter(33..35)),
            (TokenId(3), Delimiter(5..36)),
        ],
        synthetic_entries: [],
    },
)
```
However, I'm not familiar with how rust-analyzer deals with proc macros...
I am currently looking into rewriting our `TokenMap`, so ideally this would get fixed by that rewrite.
I have this issue too, but another problem is that the fill-match-arms action used to work inside `{}`, and now it does not (I may be wrong here, though).
I'm having this same issue. The assist deletes about 30 lines of code inside the macro.
Before using the code action: (screenshot)

After using the code action: (screenshot)