Roy Bellingan
Thank you for the response. I have tried to read N3980, but it is way beyond my knowledge. From what I have seen, if you always use the same...
A description is in the code, starting at https://github.com/bombela/backward-cpp/blob/84ae4f5e80381aca765a0810d4c811acae3cd7c7/backward.hpp#L118 In short, libdw and libdwarf are the best, BUT libdwarf also requires libelf.
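For reference, a minimal sketch of getting a resolved trace out of backward-cpp once a backend is picked; the BACKWARD_HAS_DW define and the StackTrace/Printer calls follow the backward-cpp README, while the frame count here is just an arbitrary choice:

    // Minimal sketch, assuming backward-cpp with the libdw backend
    // (link with -ldw). The backend is selected with a preprocessor
    // define, as described in the backward-cpp README.
    #define BACKWARD_HAS_DW 1
    #include "backward.hpp"

    #include <cstdio>

    void dump_current_stack() {
        backward::StackTrace st;
        st.load_here(32);           // capture up to 32 frames starting here
        backward::Printer printer;
        printer.print(st, stderr);  // libdw resolves file and line information
    }

    int main() {
        dump_current_stack();
    }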
Just discovered that even if I initialize the file to 1 byte, if the vector grows over 1024 the same error will occur... I do not know after which threshold and...
Thank you for the consideration, @grisumbras. I think it is a nice trick, but it will incur quite a lot of overhead. If you check, you already have almost all that...
Hmm, the streaming case, yes, is quite a bit more complex, but there you will only have the offset of the last parsed block. I will check...
So I finally had a bit of time and ran a small benchmark, and sorry, the proposed solution is, as feared, ~30x slower... Parsing https://raw.githubusercontent.com/RichardHightower/json-parsers-benchmark/master/data/citm_catalog.json The results are: Compiled as...
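For context, the one-shot side of such a comparison can be sketched roughly like this; the file path and iteration count are placeholders, not the exact setup I used:

    // Rough sketch of timing a plain one-shot boost::json::parse of the
    // benchmark file. Path and iteration count are placeholders.
    #include <boost/json.hpp>
    #include <chrono>
    #include <fstream>
    #include <iostream>
    #include <sstream>
    #include <string>

    int main() {
        std::ifstream in("citm_catalog.json");      // placeholder path
        std::stringstream buf;
        buf << in.rdbuf();
        const std::string raw = buf.str();

        constexpr int iterations = 100;             // placeholder count
        auto start = std::chrono::steady_clock::now();
        for (int i = 0; i < iterations; ++i) {
            boost::json::value jv = boost::json::parse(raw);
            (void)jv;                               // result discarded after each iteration
        }
        auto end = std::chrono::steady_clock::now();

        std::cout << iterations << " parses took "
                  << std::chrono::duration_cast<std::chrono::milliseconds>(end - start).count()
                  << " ms\n";
    }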
To find at which character the error happened is what I think consumed += p.write_some(next_input(), ec); is supposed to do. I will now try a bigger batch, even if...
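A sketch of what I mean by accumulating the consumed count: next_input() above is whatever feeds the next chunk, so here a fixed-size window over the buffer plays that role, and the running total is the offset where the parser stopped when ec is set:

    // Sketch: feed the document in chunks through stream_parser::write_some
    // and keep a running total of consumed characters, so that on error the
    // total is the offset of the offending character.
    #include <boost/json.hpp>
    #include <algorithm>
    #include <cstddef>
    #include <iostream>
    #include <string>

    std::size_t find_error_offset(const std::string& raw, std::size_t chunk_size) {
        boost::json::stream_parser p;
        boost::system::error_code ec;
        std::size_t consumed = 0;

        while (consumed < raw.size() && !p.done() && !ec) {
            // the next slice of input, playing the role of next_input()
            std::size_t len = std::min(chunk_size, raw.size() - consumed);
            consumed += p.write_some(raw.data() + consumed, len, ec);
        }
        if (!ec)
            p.finish(ec);

        if (ec)
            std::cerr << "parse error near offset " << consumed
                      << ": " << ec.message() << "\n";
        return consumed;
    }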
Ok, now I understand. Sorry, I totally missed the whole point of the streaming parser -.-
Ok, monotonic_resource mr; stream_parser p(&mr); auto start = chrono::steady_clock::now(); int pos = p.write(raw, ec); auto end = chrono::steady_clock::now(); is perfect, and more than excellent for any real case. Simple Method...
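Filled out into a self-contained form, that measurement looks roughly like this; the input file name is a placeholder and the rest follows the snippet above:

    // The timing snippet above made self-contained. mr is passed to the
    // stream_parser exactly as in the snippet; write() consumes the whole
    // buffer in one call and only that call is timed.
    #include <boost/json.hpp>
    #include <chrono>
    #include <fstream>
    #include <iostream>
    #include <sstream>
    #include <string>

    int main() {
        std::ifstream in("citm_catalog.json");      // placeholder input file
        std::stringstream buf;
        buf << in.rdbuf();
        const std::string raw = buf.str();

        boost::json::monotonic_resource mr;
        boost::json::stream_parser p(&mr);
        boost::system::error_code ec;

        auto start = std::chrono::steady_clock::now();
        std::size_t pos = p.write(raw.data(), raw.size(), ec);
        auto end = std::chrono::steady_clock::now();

        if (!ec)
            p.finish(ec);
        if (ec) {
            std::cerr << "error near offset " << pos << ": " << ec.message() << "\n";
            return 1;
        }

        boost::json::value jv = p.release();        // the parsed document
        std::cout << "parsed " << pos << " characters in "
                  << std::chrono::duration_cast<std::chrono::microseconds>(end - start).count()
                  << " us\n";
    }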
Hi @Jahrenski, thank you for the information. I was already using this method to access it. Would you accept a merge request if I provide a (hopefully) compliant cookie parser...