Kevin Burns

67 comments of Kevin Burns

I saw the same behavior while implementing https://github.com/valyala/fastjson/pull/68. I recommend enforcing a maximum depth of 500 (as a global variable) for both parsing and validation.
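The depth cap suggested above could be sketched as a plain scan over the raw bytes before parsing. This is an illustrative stdlib-only sketch, not fastjson code; `maxDepth` and `depthOK` are hypothetical names:

```go
package main

import "fmt"

// maxDepth is the hypothetical global limit suggested above.
const maxDepth = 500

// depthOK scans raw JSON and rejects nesting deeper than
// maxDepth. String contents are skipped so brackets inside
// them are not counted.
func depthOK(s string) bool {
	depth := 0
	inString := false
	for i := 0; i < len(s); i++ {
		c := s[i]
		if inString {
			switch c {
			case '\\':
				i++ // skip the escaped character
			case '"':
				inString = false
			}
			continue
		}
		switch c {
		case '"':
			inString = true
		case '{', '[':
			depth++
			if depth > maxDepth {
				return false
			}
		case '}', ']':
			depth--
		}
	}
	return true
}

func main() {
	deep := ""
	for i := 0; i < 600; i++ {
		deep += "[" // 600 unclosed arrays: past the cap
	}
	fmt.Println(depthOK(`{"a":[1,2,{"b":3}]}`)) // true
	fmt.Println(depthOK(deep))                  // false
}
```

A pre-scan like this costs one extra pass; doing the same check inside the recursive-descent parser itself would avoid that, which is part of why a combined validating parser (discussed below) is attractive.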

Confirmed by looking at the code: https://github.com/valyala/fastjson/blob/master/parser.go#L427. `parseRawNumber` allows multiple instances of `.`, `e`, `E`, `-`, etc. in a number.
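For contrast, the full JSON number grammar from RFC 8259 fits in a single regular expression. This is an illustrative stdlib-only check (not fastjson code) showing the kinds of inputs a lenient number scanner would let through:

```go
package main

import (
	"fmt"
	"regexp"
)

// jsonNumber matches the RFC 8259 number grammar: an
// optional minus, an integer part with no leading zeros,
// an optional fraction, and an optional exponent.
var jsonNumber = regexp.MustCompile(
	`^-?(0|[1-9][0-9]*)(\.[0-9]+)?([eE][+-]?[0-9]+)?$`)

func main() {
	for _, s := range []string{"-1.5e10", "1.2.3", "1e2e3", "--1"} {
		fmt.Printf("%s %v\n", s, jsonNumber.MatchString(s))
	}
	// -1.5e10 true
	// 1.2.3 false
	// 1e2e3 false
	// --1 false
}
```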

@alexus1024 If your JSON requires validation, you should validate it before parsing. I assume that these operations have remained separate in order to maximize parser efficiency while still providing...

After further testing, I think the best solution is a validating parser. Running `Validate` and `Parse` separately is not very efficient; better would be to combine the validator and...

```go
s := "{\"data\":\"sampleData\", \"id\":\"data_12344\", \"debug\":10000, \"temp\":123213}"

var p fastjson.Parser
v, err := p.Parse(s)
if err != nil {
	log.Fatalf("cannot parse json: %s", err)
}
fmt.Println(string(v.GetStringBytes("id")))
fmt.Println(v.GetInt("debug"))
fmt.Println(v.GetInt("temp"))
// data_12344...
```

This code is a lot more complicated than it needs to be, and it contains no comments indicating what it is trying to do, so let me summarize...

```
// input...
```

Well, I cleaned it up but it didn't help performance any. https://play.golang.org/p/Mb78je9twYj

```
$ go test -bench=.
BenchmarkBefore-12        157874              7434 ns/op
BenchmarkAfter-12         155823              7651 ns/op
PASS
```

Every message requires 5...

I replaced that setData Parse with Arena.

```go
var a = &fastjson.Arena{}
val.Set("Data", a.NewStringBytes(data.MarshalTo(nil)))
```

It didn't really make a difference. `fastjson.Parse` is very well optimized for use with single...

`parserPool1` is for the entire message, and `parserPool2` is for individual data items. I felt that having a separate pool for each message type would probably be more efficient. As...

> ParserPool may be used for pooling Parsers for similarly typed JSONs.

https://godoc.org/github.com/valyala/fastjson#ParserPool

What are the disadvantages of reusing a parser pool for infinitely dissimilar JSON? Boundless memory consumption?
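One way the memory question can play out: a pooled object that grows to fit one huge input keeps that allocation alive for every later small input. A minimal `sync.Pool` sketch, with a hypothetical `buffer` type standing in for a parser's internal state:

```go
package main

import (
	"fmt"
	"sync"
)

// buffer stands in for a Parser: it retains its internal
// allocation between uses, which is what makes pooling pay
// off for similarly sized ("similarly typed") inputs.
type buffer struct{ buf []byte }

var pool = sync.Pool{New: func() interface{} { return new(buffer) }}

// handle processes one message with a pooled buffer and
// reports the buffer's capacity after use.
func handle(msg string) int {
	b := pool.Get().(*buffer)
	defer pool.Put(b)
	b.buf = append(b.buf[:0], msg...)
	return cap(b.buf)
}

func main() {
	fmt.Println(handle("small") < 1024) // small message, small buffer: true

	big := string(make([]byte, 1<<20))
	fmt.Println(handle(big) >= 1<<20) // one 1 MiB message grows it: true

	// The pooled buffer usually still holds the 1 MiB
	// allocation here, even for a tiny message (not
	// guaranteed: the runtime may clear the pool at GC).
	fmt.Println(handle("small"))
}
```

So dissimilar inputs don't make a shared pool incorrect, only potentially wasteful: the pool's high-water memory tracks the largest inputs, and the runtime reclaims pooled objects only at GC. Per-type pools keep each pool's retained size close to what its message type actually needs.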