clickhouse-java

At least one SQL statement is required for execution

Open xlvchao opened this issue 1 year ago • 20 comments

JDBC version: 0.4.6, ClickHouse version: 21.8.10.19

When I do batch insert, sometimes I get this error:

java.sql.BatchUpdateException: At least one SQL statement is required for execution, server ClickHouseNode [uri=http://10.109.152.2:8123/default, options={jdbcCompliant=false,collect_timeout=3600,max_execution_time=3600,socket_timeout=4800000,max_queued_buffers=0}]@-232571340
    at com.clickhouse.jdbc.SqlExceptionUtils.batchUpdateError(SqlExceptionUtils.java:107)
    at com.clickhouse.jdbc.internal.SqlBasedPreparedStatement.executeAny(SqlBasedPreparedStatement.java:194)
    at com.clickhouse.jdbc.internal.AbstractPreparedStatement.executeLargeBatch(AbstractPreparedStatement.java:85)
    at com.clickhouse.jdbc.internal.ClickHouseStatementImpl.executeBatch(ClickHouseStatementImpl.java:754)
    at com.zaxxer.hikari.pool.ProxyStatement.executeBatch(ProxyStatement.java:127)
    at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.executeBatch(HikariProxyPreparedStatement.java)
    at com.hihonor.aiops.clickhouse.component.SinkWriter.writeToClickhouse(SinkWriter.java:80)
    at com.hihonor.aiops.databus.task.ClickHouseTask.doWriteToCk(ClickHouseTask.java:190)
    at com.hihonor.aiops.databus.task.ClickHouseTask.writeToCk(ClickHouseTask.java:122)
    at com.hihonor.aiops.databus.task.ClickHouseTask.tryWriteToCk(ClickHouseTask.java:105)
    at com.hihonor.aiops.databus.task.ClickHouseTask.run(ClickHouseTask.java:96)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

There is no problem with the data, so what could be the reason for this error?

xlvchao avatar Sep 19 '24 09:09 xlvchao

@chernser Can you help me?

xlvchao avatar Sep 19 '24 09:09 xlvchao

Hi @xlvchao! It seems that the statement was called with an empty batch. Do you use the JDBC driver directly or inside some framework (like Spark)?
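
To make that hypothesis concrete, here is a minimal sketch (hypothetical code, not from the reporter) of how an "empty batch" can reach the driver: if no row ever makes it to addBatch(), executeBatch() is invoked with zero queued statements.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

// Hypothetical reproduction of the "empty batch" scenario: executeBatch()
// is called even though addBatch() was never reached, so the driver has no
// statement to send. URL and table name are placeholders.
public class EmptyBatchSketch {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:clickhouse://localhost:8123/default";
        try (Connection conn = DriverManager.getConnection(url);
             PreparedStatement ps = conn.prepareStatement("INSERT INTO t (a) VALUES (?)")) {
            // Imagine every row failed validation, so this loop body never runs:
            // for (Row r : rows) { ps.setObject(1, r.a); ps.addBatch(); }
            ps.executeBatch(); // zero statements queued
        }
    }
}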

chernser avatar Sep 19 '24 15:09 chernser

@chernser Yes, I use the JDBC driver directly in a Spring Boot program!

xlvchao avatar Sep 19 '24 16:09 xlvchao

@xlvchao Is it possible that the batch can be empty? Would you be able to give some sample code showing how JDBC is used? Thanks!

chernser avatar Sep 19 '24 18:09 chernser

@zhicwu Bro, do you have any ideas?

xlvchao avatar Sep 21 '24 08:09 xlvchao

Hi @xlvchao! Have you tried upgrading to a newer version?

chernser avatar Oct 29 '24 18:10 chernser

@chernser Yes, I upgraded from 0.4.6 to 0.6.4, but the problem still exists!

When I reduced the batch size from 8000 to 1000, the problem disappeared! This is very strange!

Could this problem be related to the length of the SQL? When the batch size is 8000, the SQL size exceeds 30 MB!
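
Since shrinking the batch made the error go away, one pragmatic workaround is to cap the number of rows queued per flush. A minimal sketch, assuming rows arrive as pre-ordered parameter arrays (the helper and its names are hypothetical):

import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

// Hypothetical workaround: flush in fixed-size chunks so a single
// executeBatch() never expands into a ~30 MB SQL statement.
final class ChunkedBatchWriter {
    private static final int CHUNK_SIZE = 1000; // the size that worked for the reporter

    static void writeChunked(PreparedStatement ps, List<Object[]> rows) throws SQLException {
        int queued = 0;
        for (Object[] row : rows) {
            for (int i = 0; i < row.length; i++) {
                ps.setObject(i + 1, row[i]);
            }
            ps.addBatch();
            if (++queued == CHUNK_SIZE) {
                ps.executeBatch(); // flush this chunk
                queued = 0;
            }
        }
        if (queued > 0) {
            ps.executeBatch(); // flush the remainder
        }
    }
}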

xlvchao avatar Nov 17 '24 02:11 xlvchao

@xlvchao, thank you for the information! I will look into it today. I think it may be an incorrect concatenation of statements, or maybe something with compression. By the way, do you have any complex types in the inserts? I've found a bug in the reading code for AggregateFunction - there may be a similar one for another type on the write path. Usually such bugs are hard to reproduce and have varying effects.

chernser avatar Nov 18 '24 16:11 chernser

Found a similar problem: https://github.com/ClickHouse/clickhouse-java/issues/1259#issuecomment-1514603581 - for some reason an invalid statement resulted in an empty statement list.

The problem is in the client: https://github.com/ClickHouse/clickhouse-java/blob/b704b9523592e473522eb416c241fc55f0e5f443/clickhouse-http-client/src/main/java/com/clickhouse/client/http/ClickHouseHttpClient.java#L170

List<String> stmts = sealedRequest.getStatements(false);
int size = stmts.size();
String sql;
if (size == 0) {
    throw new IllegalArgumentException("At least one SQL statement is required for execution");
} else if (size > 1) {
    throw new IllegalArgumentException("Expect one SQL statement to execute but we got " + size);
} else {
    sql = stmts.get(0);
}

chernser avatar Nov 19 '24 06:11 chernser

@xlvchao

I've investigated the code and found that this can happen only under a narrow set of conditions: when the internal statement is SQL-based (data is sent via a series of INSERT statements) and the statement has no parameters. I think there is some invalid parsing of the initial query. Would you please share an example, as close to the original as possible, of this statement and how you send the data?

Thanks!

chernser avatar Nov 19 '24 07:11 chernser

@chernser

Here is the table script:

create table aiops_local_prd.quantile_aiops_collect_93
(
    product     String,
    domain      String,
    interface   String,
    returnCode  String,
    requestType String,
    az          String,
    count       Float64,
    fail Nullable(Int64),
    success Nullable(Int64),
    request_time_quantile AggregateFunction(quantilesTDigest(0.95), Float64),
    upstream_response_time_quantile AggregateFunction(quantilesTDigest(0.95), Float64),
    body_bytes_sent_quantile AggregateFunction(quantilesTDigest(0.95), Float64),
    time        DateTime,
    createTime  DateTime
)
    engine = ReplicatedMergeTree('/clickhouse/tables/{layer}-{shard}/aiops_local_prd/quantile_aiops_collect_93',
             '{replica}')
        PARTITION BY toYYYYMMDD(time)
        ORDER BY time
        TTL time + toIntervalMonth(3)
        SETTINGS index_granularity = 8192;

Here is the insert template:

INSERT INTO aiops_local_prd.quantile_aiops_collect_93 (product,domain,interface,returnCode,requestType,az,count,fail,success,request_time_quantile,upstream_response_time_quantile,body_bytes_sent_quantile,time,createTime) VALUES (?,?,?,?,?,?,?,?,?,unhex(?),unhex(?),unhex(?),?,?)

Here is how JDBC is used:

public FlushResponse writeToClickhouse(String clickHouseNode, Long collectId, List<JSONObject> dataList) {
        String nodeTableKey = TableUtil.getTableMapKey(clickHouseNode,collectId);
        if (!tableInfoMap.containsKey(nodeTableKey)) {
            logger.warn("Failed while flush data to ClickHouse, because of table not exist, database = {}, collectId =  {}", database, collectId);
            failureHandler.handle(database, collectId, dataList, FailEnum.TABLE_NOT_EXIST, FailEnum.TABLE_NOT_EXIST.getMsg());
            return FlushResponse.tableNotExist();
        }
        SqlTemplate sqlTmpObject = tableInfoMap.get(nodeTableKey);
        String sqlTemplate = sqlTmpObject.getSqlTemplate();
        HikariDataSource dataSource = Globals.HIKARI_DATA_SOURCES.get(clickHouseNode);

        try (Connection conn = dataSource.getConnection();
             PreparedStatement prepareStatement = conn.prepareStatement(sqlTemplate)) {
            List<String> fields = sqlTmpObject.getFields();
            Map<String, FieldInfo> fieldInfoMap = sqlTmpObject.getFieldInfoMap();

            List<JSONObject> validDataList = new ArrayList<>();
            checkParameters(collectId, fields, fieldInfoMap, dataList, validDataList);
            assembleParameters(prepareStatement, fields, fieldInfoMap, validDataList);

            long s = System.currentTimeMillis();
            prepareStatement.executeBatch();
            long e = System.currentTimeMillis();
            successHandler.handle(database, collectId, dataList, (e - s));
            logger.info("SDK Successful flush data to ClickHouse, database = {}, collectId =  {},  elapsed = {}ms, valid data size = {}, invalid data size = {}", database, collectId, (e - s), validDataList.size(), dataList.size() - validDataList.size());
            return FlushResponse.flushSuccess();

        } catch (DataAssembleException e) {
            logger.error("SDK Failed while flush data to ClickHouse, because of data parse exception, database = {}, collectId =  {}", database, collectId, e);
            failureHandler.handle(database, collectId, dataList, FailEnum.DATA_PARSE_FAILED, FailEnum.DATA_PARSE_FAILED.getMsg());
            return FlushResponse.dataParseFailed();

        } catch (Exception e) {
            String message = e.getMessage();
            Exception newE = e;
            FailEnum failEnum = FailEnum.FLUSH_DATA_FAILED;
            FlushResponse flushResponse = FlushResponse.flushDataFailed();
            if (StringUtils.isNotBlank(message)) {
                if (message.contains(Globals.ZK_SESSION_EXPIRED)) {
                    newE = new ZkSessionExpiredException(message);
                    failEnum = FailEnum.ZK_SESSION_EXPIRED;
                    flushResponse = FlushResponse.zkSessionExpired();
                } else if (message.contains(Globals.ZK_TABLE_READ_ONLY)) {
                    newE = new ZkTableReadonlyException(message);
                    failEnum = FailEnum.ZK_TABLE_READ_ONLY;
                    flushResponse = FlushResponse.zkTableReadOnly();
                } else if (message.contains(Globals.ZK_ALLOCATE_FAILED)) {
                    newE = new ZkAllocateFailedException(message);
                    failEnum = FailEnum.ZK_ALLOCATE_FAILED;
                    flushResponse = FlushResponse.zkAllocateFailed();
                } else if (message.contains(Globals.CK_TOO_MANY_PARTS)) {
                    newE = new CkTooManyPartsException(message);
                    failEnum = FailEnum.CK_TOO_MANY_PARTS;
                    flushResponse = FlushResponse.ckTooManyParts();
                } else if (message.contains(Globals.CK_MATERIAL_VIEW_ERROR)) {
                    newE = new CkMaterialViewException(message);
                    failEnum = FailEnum.CK_MATERIAL_VIEW_ERROR;
                    flushResponse = FlushResponse.ckMaterialViewError();
                } else if (message.contains(Globals.CK_TABLE_COLUMNS_ERROR)) {
                    newE = new CkTableColumnsException(message);
                    failEnum = FailEnum.CK_TABLE_COLUMNS_ERROR;
                    flushResponse = FlushResponse.ckTableColumnsError();
                } else if (message.contains(Globals.CK_WRITING_REQUEST_ERROR)) {
                    newE = new CkWritingRequestException(message);
                    failEnum = FailEnum.CK_WRITING_REQUEST_ERROR;
                    flushResponse = FlushResponse.ckWritingRequestError();
                } else if (message.contains(Globals.CK_CONNECTION_ERROR)) {
                    newE = new CkConnectionException(message);
                    failEnum = FailEnum.CK_CONNECTION_ERROR;
                    flushResponse = FlushResponse.ckConnectionError();
                }
            }
            logger.error("SDK Failed while flush data to ClickHouse, because of clickhouse exception, ck node:{}, database = {}, collectId =  {}!", clickHouseNode, database, collectId, newE);
            failureHandler.handle(database, collectId, dataList, failEnum, failEnum.getMsg());
            return flushResponse;
        }
    }

private void checkParameters(Long collectId, List<String> fields, Map<String, FieldInfo> fieldInfoMap, List<JSONObject> dataList, List<JSONObject> validDataList) throws Exception {
        for (int i = 0; i <= dataList.size() - 1; i++) {
            JSONObject data = dataList.get(i);
            String field = "";
            Object fieldValue = null;
            String fieldType = "";
            try {
                for (int j = 0; j < fields.size(); j++) {
                    field = fields.get(j);
                    fieldValue = data.get(field);
                    fieldType = fieldInfoMap.get(field).getType();
                    if (null != fieldValue) {
                        if ("String".equals(fieldType) || "Nullable(String)".equals(fieldType)) {
                            if (!(fieldValue instanceof String)) throw new Exception();
                        } else if ("Int32".equals(fieldType) || "Nullable(Int32)".equals(fieldType)) {
                            Integer.parseInt(fieldValue.toString());
                        } else if ("Int64".equals(fieldType) || "Nullable(Int64)".equals(fieldType)) {
                            Long.parseLong(fieldValue.toString());
                        } else if ("Float32".equals(fieldType) || "Nullable(Float32)".equals(fieldType)) {
                            Float.parseFloat(fieldValue.toString());
                        } else if ("Float64".equals(fieldType) || "Nullable(Float64)".equals(fieldType)) {
                            Double.parseDouble(fieldValue.toString());
                        } else if ("DateTime".equals(fieldType) || "Nullable(DateTime)".equals(fieldType)) {
                            if (DateTimeUtil.convertToTimestamp(fieldValue.toString()) < 0) throw new Exception();
                        }
                    }
                }
                validDataList.add(data);
            } catch (Exception e) {
                Exception newE = new DataCheckException(String.format("CollectId:%s, fieldName:%s, fieldType:%s, fieldValue:%s", collectId, field, fieldType, fieldValue));
                logger.error("Find invalid parameters when check!", newE);
            }
        }
    }

private void assembleParameters(PreparedStatement prepareStatement, List<String> fields, Map<String, FieldInfo> fieldInfoMap, List<JSONObject> dataList) throws Exception {
        for (int i = 0; i < dataList.size(); i++) {
            JSONObject data = dataList.get(i);
            try {
                int idx = 1;
                for (String field : fields) {
                    Object value = data.get(field);
                    String type = fieldInfoMap.get(field).getType();
                    if (null == value) {
                        if ("String".equals(type)) {
                            prepareStatement.setString(idx++, "");
                        } else if ("Nullable(String)".equals(type)) {
                            prepareStatement.setNull(idx++, Types.VARCHAR);
                        } else if ("Int32".equals(type)) {
                            prepareStatement.setInt(idx++, 0);
                        } else if ("Nullable(Int32)".equals(type)) {
                            prepareStatement.setNull(idx++, Types.INTEGER);
                        } else if ("Int64".equals(type)) {
                            prepareStatement.setLong(idx++, 0L);
                        } else if ("Nullable(Int64)".equals(type)) {
                            prepareStatement.setNull(idx++, Types.BIGINT);
                        } else if ("Float32".equals(type)) {
                            prepareStatement.setFloat(idx++, 0F);
                        } else if ("Nullable(Float32)".equals(type)) {
                            prepareStatement.setNull(idx++, Types.FLOAT);
                        } else if ("Float64".equals(type)) {
                            prepareStatement.setDouble(idx++, 0D);
                        } else if ("Nullable(Float64)".equals(type)) {
                            prepareStatement.setNull(idx++, Types.DOUBLE);
                        } else if ("DateTime".equals(type)) {
                            prepareStatement.setString(idx++, DateTimeUtil.getCurrentUtcDatetime());
                        } else if ("Nullable(DateTime)".equals(type)) {
                            prepareStatement.setNull(idx++, Types.VARCHAR);
                        } else if (type.contains("AggregateFunction")) {
                            prepareStatement.setNull(idx++, Types.VARCHAR);
                        }
                    } else {
                        prepareStatement.setObject(idx++, value);
                    }
                }
                prepareStatement.addBatch();
            } catch (Exception e) {
                Exception newE = new DataAssembleException(data.toJSONString(), e);
                logger.error("Find invalid parameters when assemble!", newE);
                throw newE;
            }
        }
    }
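
One path worth ruling out in the code above: if checkParameters rejects every row, validDataList is empty, addBatch() is never called, and executeBatch() runs on an empty batch - exactly the condition the error message describes. A minimal guard, sketched with the names from the snippet above (placement would be right after the assembleParameters call in writeToClickhouse):

// Hypothetical guard: skip the flush entirely when validation filtered out
// every row, so executeBatch() is never invoked on an empty batch.
if (validDataList.isEmpty()) {
    logger.warn("Nothing to flush: all {} rows failed validation, collectId = {}",
            dataList.size(), collectId);
    return FlushResponse.flushSuccess(); // or a dedicated "empty flush" response
}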

Here is one of the records from a batch write. Tip: I just picked one of them; it may not be the one that causes the insert exception.

{"request_time_quantile":"91010000c0400000803f0000e0400000803f0000e0400000803f0000e0400000803f0000e0400000803f0000e0400000803f000000410000803f000000410000803f000000410000803f000000410000803f000000410000803f000000410000803f000000410000803f000000410000803f000000410000803f000000410000803f000000410000803f000000410000803f0000004100000040000000410000004000000841000000400000104100000040000010410000004000001041000000400000104100000040000010410000004000001041000000400000184100000040000020410000004000002041000000400000204100000040000020410000404000002041000040400000204100004040000020410000404000002041000040400000204100004040000020410000404000002041000040405555254100004040000030410000404000003041000040400000304100004040000030410000404000003041000040400000304100004040000030410000804000003041000080400000304100008040000030410000804000003041000080400000344100008040000040410000804000004041000080400000404100008040000040410000804000004041000080400000404100008040000040410000804000004041000080400000404100008040000050410000804000005041000080400000504100008040000050410000804000005041000080400000504100008040000050410000804000005041000080400000604100008040000060410000804000006041000080400000604100008040000060410000804000006041000080400000604100008040000060410000804000006c41000080400000704100008040000070410000804000007041000080400000704100008040000070410000804000007c410000804000008041000080400000804100008040000080410000804000008041000080400000804100008040000080410000804000008041000080400000804100008040000084410000804000008841000080400000884100008040000088410000804000008841000080400000884100008040abaa8a4100004040000090410000404000009041000040400000904100004040000090410000404055559541000040400000984100004040000098410000404000009841000040400000984100004040abaa9a41000040400000a041000040400000a041000040400000a041000040400000a041000040400000a041000000400000a841000000400000a841000000400000a841000000400000a841000000400000ac41000000400000b041000000400000b841000000400000c041000000400000c841000000400000d041000000400000fc410000004000000c420000004000002c4200000040000084420000803f000086420000803f00008c420000803f000098420000803f00009c420000803f00009e420000803f0000a0420000803f0000a0420000803f0000a8420000803f0000d0420000803f0000dc420000803f000002430000803f000008430000803f000013430000803f000017430000803f000025430000803f00002d430000803f000049430000803f","product":"AdPlatform","code":"200","requestType":"POST","__offset":3742997,"interface":"/api/ad-bizlog/v1/media/workflow","__topic":"quantile_aiops_collect_93","returnCode":"388","fail":"0","__partition":26,"createTime":"2024-09-18 
15:49:16","success":"388","domain":"ads-adnet-drcn.cloud.honor.com","az":"None","body_bytes_sent_quantile":"91010000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000080400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000040400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f041000000400000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f0000f0410000803f","time":"2024-09-14 
12:46:00","upstream_response_time_quantile":"91016f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803f6f12033b0000803fa69b443b00000040a69b443b00000040a69b443b00000040a69b443b00000040a69b443b00000040a69b443b00000040a69b443b00000040a69b443b00000040a69b443b00000040a69b443b00000040a69b443b00000040a69b443b00000040a69b443b00000040a69b443b000040406f12833b000040406f12833b000040406f12833b000040406f12833b000040406f12833b000040406f12833b000040406f12833b000040406f12833b000040406f12833b000040406f12833b000040406f12833b000040406f12833b000040400ad7a33b000040400ad7a33b000040400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080400ad7a33b000080403108ac3b00008040a69bc43b00008040a69bc43b00008040a69bc43b00008040a69bc43b00008040a69bc43b00008040a69bc43b00008040a69bc43b00008040a69bc43b00008040f4fdd43b000080404260e53b000080404260e53b000080404260e53b000080404260e53b000080404260e53b000080404260e53b00008040b6f3fd3b000080406f12033c000080406f12033c000080406f12033c000080406f12033c000080406f12033c00008040bc74133c00008040bc74133c00008040bc74133c00008040bc74133c00008040bc74133c00008040bc74133c00008040f0601e3c000040400ad7233c000040400ad7233c000040400ad7233c000040400ad7233c000040400ad7233c000040400ad7233c000040400ad7233c000040405839343c000040405839343c000040405839343c000040405839343c000040405839343c000040405839343c0000404072af393c00004040a69b443c00000040a69b443c00000040a69b443c00000040a69b443c00000040a69b443c00000040a69b443c00000040a69b443c00000040cdcc4c3c00000040f4fd543c000000401b2f5d3c000000404260653c0000004068916d3c00000040b6f37d3c000000401283c03c000000408fc2f53c0000803f31082c3d0000803ffca9713d0000803f022b873d0000803f4c37893d0000803f295c8f3d0000803fbc74933d0000803f2db29d3d0000803fc1caa13d0000803f5c8fc23d0000803f60e5d03d0000803fd9cef73d0000803f9318043e0000803f295c0f3e0000803f7368113e0000803f9cc4203e0000803f7b142e3e0000803f8195433e0000803f"}

xlvchao avatar Nov 23 '24 09:11 xlvchao

@xlvchao thank you so much for the information! I will start with AggregateFunction and then try to replicate the issue.

chernser avatar Nov 25 '24 18:11 chernser

@chernser How's it going, buddy?

xlvchao avatar Nov 26 '24 15:11 xlvchao

Good day, @xlvchao !

I was looking into it yesterday.

  1. Since you are using unhex() in the insert template, JDBC will not use the RowBinary format and will just compile one big SQL statement (see the sketch after this list). I'm not sure whether this can be bypassed to avoid unhex().
  2. When the batch is executed, the SQL is built into a String twice and from different places (I will fix it), but that should not be the root cause.
  3. I've tested different scenarios such as an empty batch, an incomplete batch, and all NULLs - none of them causes the error.
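
A rough illustration of point 1 (illustrative only; the real logic lives in SqlBasedPreparedStatement): because unhex(?) is a function call, the driver cannot stream rows in RowBinary and instead splices every bound value into the literal VALUES text of one statement. With several kilobytes of hex per quantile column, 8000 rows multiply into tens of megabytes of SQL.

import java.util.List;

// Illustrative sketch of SQL-based batching: each row's already-escaped
// literals are concatenated into a single INSERT, so statement size grows
// linearly with batch size and per-row payload.
final class SqlExpansionSketch {
    static String expand(String insertPrefix, List<List<String>> rowLiterals) {
        StringBuilder sql = new StringBuilder(insertPrefix).append(" VALUES ");
        for (int r = 0; r < rowLiterals.size(); r++) {
            if (r > 0) sql.append(',');
            sql.append('(').append(String.join(",", rowLiterals.get(r))).append(')');
        }
        return sql.toString(); // one statement whose length scales with the batch
    }
}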

I will continue next week, starting Monday.

chernser avatar Nov 27 '24 23:11 chernser

Good day, @chernser!

There is a key point I forgot to mention!

When this error occurs (At least one SQL statement...), retrying several times with the same batch data eventually succeeds!
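
For completeness, a sketch of the retry behavior described here (a hypothetical wrapper; the thread does not show the actual retry code). Note that JDBC clears a statement's batch after executeBatch(), so each attempt must re-prepare and re-bind all rows:

import java.sql.BatchUpdateException;
import java.sql.SQLException;

// Hypothetical retry wrapper matching the observed behavior: resubmitting
// the same batch data unchanged eventually succeeds after a few attempts.
final class RetryingFlush {
    @FunctionalInterface
    interface BatchWriter {
        void write() throws SQLException; // prepare, bind all rows, executeBatch()
    }

    static void flushWithRetry(BatchWriter writer, int maxAttempts) throws SQLException {
        for (int attempt = 1; ; attempt++) {
            try {
                writer.write();
                return;
            } catch (BatchUpdateException e) {
                if (attempt >= maxAttempts) {
                    throw e; // give up after the final attempt
                }
            }
        }
    }
}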

xlvchao avatar Nov 28 '24 01:11 xlvchao

Thank you for the additional information! It is strange that several retries are needed, because this is a client-side error. I will look into it.

chernser avatar Dec 02 '24 15:12 chernser

@chernser Hi, is everything ok? Have you found the root cause?

xlvchao avatar Dec 11 '24 02:12 xlvchao

Good day, @xlvchao! Sorry - I'm working on other issues, but I keep reminding myself about this one. Currently I have no ideas :-(

chernser avatar Dec 12 '24 05:12 chernser

@chernser hello, buddy 😂

xlvchao avatar Dec 20 '24 07:12 xlvchao

Hi @xlvchao! How are you?

We have fixed many issues around the parser that seem relevant here. Would you please update to the latest version? Thanks!

chernser avatar Nov 14 '25 23:11 chernser