LinYang

17 comments by LinYang

Hi, @asimokby , I'm not sure what you mean by the "answer". But according to the discussion with the authors of LogAnomaly, they did not open-source their implementation due to several...

Hi @KiteFlyKid , Thank you. My email address is linyang[AT]tju.edu.cn. You can email me as much as you want; I'm happy to discuss things with other researchers! :D

I'm facing a similar issue; all I need is the start and end line numbers of a method. I can get the start line by accessing the `.position`...
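
For anyone hitting the same thing, below is a minimal sketch of a workaround, assuming the parser in question is javalang (the library is not named above, so the import and the end-line heuristic are my assumptions, not an official API): the start line comes from `.position`, and the end line is approximated by the largest line number found anywhere in the method's subtree.

```python
# Minimal sketch, assuming javalang (not named above). The end line is only an
# approximation: it is the largest line number carried by any node in the
# method's subtree, so a trailing closing brace may not be counted.
import javalang

source = """
public class Demo {
    public int add(int a, int b) {
        int c = a + b;
        return c;
    }
}
"""

tree = javalang.parse.parse(source)
for _, method in tree.filter(javalang.tree.MethodDeclaration):
    start_line = method.position.line          # start line via `.position`
    end_line = start_line
    for _, node in method:                     # walk the method's subtree
        pos = getattr(node, "position", None)
        if pos is not None:
            end_line = max(end_line, pos.line)
    print(method.name, start_line, end_line)
```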

Thanks for asking. The `label.txt` file was generated from the original `anomalous_label.csv` file from loghub, where 0 means a normal block and 1 otherwise. Here is the label.txt file we...
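
If it helps, here is a rough sketch of that conversion (not our exact preprocessing script; it assumes the loghub CSV has the usual `BlockId,Label` columns with `Normal`/`Anomaly` values, and the block ordering may need to match your sequence file):

```python
# Illustrative sketch only: write 0 for a normal block and 1 for an anomalous one.
# Column names and label values are assumptions about the loghub CSV layout.
import csv

with open("anomalous_label.csv") as src, open("label.txt", "w") as dst:
    for row in csv.DictReader(src):
        dst.write("0\n" if row["Label"] == "Normal" else "1\n")
```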

Can you check which one is 200d and which one is 100d in `hiddens * sent_probs`? This could help clarify the issue.

Sorry, I failed to reproduce this error. However, here's a tip that may help: the shape of `sent_prob` should be **batch_size * seq_len** after the attention mechanism, and it should become **batch_size...

> ![Image 1](https://user-images.githubusercontent.com/50793022/250437323-d79be436-625d-4605-9f8e-fb43941b4055.png) What are the shapes after the view operation?

So it seems fine? `sent_probs` can be regarded as the attention score of the hidden states for each log event in the log sequence. The multiplication between `hidden_states` and `sent_probs`...
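
For reference, here is a small PyTorch sketch of the shapes I would expect around that multiplication (the numbers and variable names are illustrative, not the exact code in the repository):

```python
# Illustrative shapes for the attention-weighted aggregation discussed above.
# hiddens:    [batch_size, seq_len, hidden_dim]  e.g. [100, 38, 200]
# sent_probs: [batch_size, seq_len]              attention score per log event
import torch

batch_size, seq_len, hidden_dim = 100, 38, 200
hiddens = torch.randn(batch_size, seq_len, hidden_dim)
scores = torch.randn(batch_size, seq_len)

sent_probs = torch.softmax(scores, dim=1)             # [100, 38]
sent_probs = sent_probs.view(batch_size, seq_len, 1)  # [100, 38, 1] for broadcasting

weighted = hiddens * sent_probs        # broadcasts to [100, 38, 200]
context = weighted.sum(dim=1)          # [100, 200] sequence representation
print(context.shape)                   # torch.Size([100, 200])
```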

Hi @superzeroT , may I ask if the issue is still unresolved? And, as mentioned in your screenshot ("corresponding modifications were made here"), what exactly were those modifications?

Mine were: hiddens: [100, 38, 200], sent_probs: [100, 38, 1]. Your first two shapes seem fine, but the sequence length of the last two shapes is only 1?