rust-data-analysis
Bump polars from 0.35.4 to 0.38.0
Bumps polars from 0.35.4 to 0.38.0.
Release notes
Sourced from polars's releases.
Rust Polars 0.38.0
🏆 Highlights
💥 Breaking changes
- Infer `values` columns in `DataFrame.pivot` when `values` is None (#14477)
- Mark `DataFrame::new_no_checks` and `DataFrame::new_no_length_checks` unsafe (#14443)
- Remove `DatetimeChunked::convert_time_zone` (#14046)
- Rename `LiteralValue::to_anyvalue` to `LiteralValue::to_any_value` (#14033)
🚀 Performance improvements
- auto-tune concurrency budget (#14753)
- Don't materialize for broadcasting `fill_null` value and default value of `replace` (#14736)
- Improve performance of boolean filters 1-100x (#14746)
- fix accidental quadratic utf8 validation in parquet (#14705)
- fast path for COUNT(*) queries (#14574)
- Elide the total order wrapper for non-(float/option) types (#14648)
- add utf8-validation fast paths for utf8view (#14644)
- don't reassign chunks back to df owner (#14633)
- If there are many small chunks in write_parquet(), convert to a single chunk (#14484) (#14487)
- Polars thread pool was not used properly in various functions (#14583)
- use owned arithmetic in horizontal_sum (#14525)
- Combine small chunks in sinks for streaming pipelines (#14346)
- reduce heap allocs in expression/logical-plan iteration (#14440)
- simplify and speed up cum_sum and cum_prod (#14409)
- simplify negated predicates to improve row groups skipping (#14370)
- prune parquet row groups when `is_not_null` is used (#14260)
- use `is_between` to skip parquet row groups (#14244)
- Use a compression API that is designed for this use case (#11699) (#14194)
- Use `UnitVec` in polars-plan traversal (#14199)
- use `UnitVec` in streaming joins (#14197)
- improve `ChunkId` (#14175)
- improve iteration performance (#14126)
- elide unneeded work in window? (#14108)
- run window functions more in parallel (#14095)
- improve skip row group using statistics condition (#14056)
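The "total order wrapper" item (#14648) refers to a general Rust issue: floats implement only `PartialOrd` (NaN compares unequal to everything), so sorting them as keys requires a wrapper that imposes a total order, while integer keys need no such indirection. As a minimal, self-contained sketch of what such a wrapper looks like (using the standard library's `f64::total_cmp`; the `TotalOrdF64` type here is illustrative, not a polars internal):

```rust
use std::cmp::Ordering;

// Illustrative wrapper giving f64 a total order so it can be used as a
// sort key. f64::total_cmp follows IEEE 754 totalOrder:
// -NaN < -inf < ... < -0.0 < +0.0 < ... < +inf < +NaN.
#[derive(Debug, Clone, Copy)]
struct TotalOrdF64(f64);

impl PartialEq for TotalOrdF64 {
    fn eq(&self, other: &Self) -> bool {
        self.0.total_cmp(&other.0) == Ordering::Equal
    }
}
impl Eq for TotalOrdF64 {}
impl PartialOrd for TotalOrdF64 {
    fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
        Some(self.cmp(other))
    }
}
impl Ord for TotalOrdF64 {
    fn cmp(&self, other: &Self) -> Ordering {
        self.0.total_cmp(&other.0)
    }
}

fn main() {
    // A plain Vec<f64> cannot be sorted with sort(), which requires Ord;
    // the wrapper makes it possible, at the cost of extra indirection
    // that integer keys never pay.
    let mut v = vec![
        TotalOrdF64(f64::NAN),
        TotalOrdF64(1.0),
        TotalOrdF64(-0.0),
        TotalOrdF64(0.0),
    ];
    v.sort();
    assert!(v[0].0.is_sign_negative() && v[0].0 == 0.0); // -0.0 first
    assert!(v[3].0.is_nan()); // NaN last
}
```

Eliding this wrapper for types that already implement `Ord` avoids the extra comparison machinery on the hot sorting path.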
✨ Enhancements
- Change default for maximum number of Series items printed to 10 to match DataFrame (#14703)
- Infer `values` columns in `DataFrame.pivot` when `values` is None (#14477)
- fast path for COUNT(*) queries (#14574)
- let `rolling` accept `index_column` of type UInt32 or UInt64 (#14669)
- Treat float -0.0 == 0.0 and -NaN == NaN in group-by, joins and unique (#14617)
- Properly cache object-stores (#14598)
- Mark `DataFrame::new_no_checks` and `DataFrame::new_no_length_checks` unsafe (#14443)
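For callers of this crate, the `new_no_checks` change (#14443) is the one most likely to break compilation: the unchecked constructors are now `unsafe fn`s, so each call site must be wrapped in an `unsafe` block, with the caller taking responsibility for the invariants the checks used to enforce. A minimal sketch of the migration pattern, using a hypothetical `Frame` type rather than the real polars API:

```rust
// Hypothetical stand-in for a frame type; this illustrates the
// safe-to-unsafe constructor migration pattern, not polars itself.
struct Frame {
    cols: Vec<Vec<i64>>,
}

impl Frame {
    /// Checked constructor: verifies all columns have equal length.
    fn new(cols: Vec<Vec<i64>>) -> Result<Self, String> {
        if let Some(first) = cols.first() {
            if cols.iter().any(|c| c.len() != first.len()) {
                return Err("column length mismatch".into());
            }
        }
        Ok(Frame { cols })
    }

    /// Unchecked constructor, now `unsafe`: the caller must guarantee
    /// that all columns have the same length.
    unsafe fn new_no_checks(cols: Vec<Vec<i64>>) -> Self {
        Frame { cols }
    }
}

fn main() {
    let cols = vec![vec![1, 2, 3], vec![4, 5, 6]];
    // Before 0.38 this was an ordinary call; after the bump the call
    // site needs an `unsafe` block acknowledging the invariant.
    let fast = unsafe { Frame::new_no_checks(cols.clone()) };
    let checked = Frame::new(cols).expect("lengths match");
    assert_eq!(fast.cols.len(), checked.cols.len());
}
```

Code that cannot justify the invariant should switch to the checked constructor instead of adding an `unsafe` block.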
... (truncated)
Commits
- See full diff in compare view
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)