[Bug] Historical logs sometimes cannot be viewed
Search before asking
- [X] I searched the issues and found no similar issues.
Streamis Component
streamis-server
What happened + What you expected to happen
- Run a task
- At first, the task's logs can be viewed
- After some time, the historical task logs may become unviewable
Relevant platform
linux
Reproduction script
```sql
CREATE TABLE kafka_source_car_speed_info (
    carLicense STRING,
    areaId STRING,
    roadId STRING,
    monitorId STRING,
    cameraId STRING,
    actionTime BIGINT,
    speed FLOAT,
    rt AS TO_TIMESTAMP(FROM_UNIXTIME(actionTime / 1000, 'yyyy-MM-dd HH:mm:ss')),  -- define the event time
    WATERMARK FOR rt AS rt - INTERVAL '5' SECOND
) WITH (
    'connector' = 'kafka',
    'properties.bootstrap.servers' = 'localhost:19092',
    'topic' = 'ttt1',
    'format' = 'json'
)
```
```sql
SELECT
    CONCAT(areaId, '', roadId, '', monitorId) ckey,
    COUNT(1) cars,
    SUM(speed) sumed,
    SUM(speed) / COUNT(1),
    TUMBLE_START(rt, INTERVAL '30' SECOND),
    TUMBLE_END(rt, INTERVAL '30' SECOND)
FROM kafka_source_car_speed_info
GROUP BY
    CONCAT(areaId, '', roadId, '', monitorId),
    TUMBLE(rt, INTERVAL '30' SECOND)
```
```sql
SELECT
    CONCAT(areaId, '', roadId, '', monitorId) ckey,
    COUNT(1) cars,
    SUM(speed) sumed,
    SUM(speed) / COUNT(1),
    HOP_START(rt, INTERVAL '30' SECOND, INTERVAL '1' MINUTE),
    HOP_END(rt, INTERVAL '30' SECOND, INTERVAL '1' MINUTE)
FROM kafka_source_car_speed_info
GROUP BY
    CONCAT(areaId, '', roadId, '', monitorId),
    HOP(rt, INTERVAL '30' SECOND, INTERVAL '1' MINUTE)
```
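For reference, a minimal sketch of one way to feed test rows into the `ttt1` topic so the queries above produce output; the sink table name `kafka_sink_car_speed_info` and the values are made up for illustration and are not part of the original report:

```sql
-- Illustrative only: a Kafka sink table mirroring the physical columns of the source table above,
-- plus an INSERT ... VALUES to push a single test record (all values are made up).
CREATE TABLE kafka_sink_car_speed_info (
    carLicense STRING,
    areaId STRING,
    roadId STRING,
    monitorId STRING,
    cameraId STRING,
    actionTime BIGINT,
    speed FLOAT
) WITH (
    'connector' = 'kafka',
    'properties.bootstrap.servers' = 'localhost:19092',
    'topic' = 'ttt1',
    'format' = 'json'
);

INSERT INTO kafka_sink_car_speed_info
VALUES ('ABC-1234', 'A01', 'R02', 'M03', 'C04', CAST(1690000000000 AS BIGINT), CAST(72.5 AS FLOAT));
```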
Anything else
No response
Are you willing to submit a PR?
- [ ] Yes I am willing to submit a PR!
Is it the task log or the YARN log that cannot be viewed? What are your Streamis and Linkis versions?
It is the task log. Streamis version 0.2.0, Linkis 1.3.2.
Linkis 1.3.2 has a bug that can prevent the local EC log from being retrieved; it has been fixed in 1.4.0.
Is https://github.com/apache/linkis/issues/4686 the fix for this?