log4rs-1.3.0/CHANGELOG.md

# Change Log

## [1.3.0]

### New

* Add debug and release formatters
* Documentation on configuring the tool
* Code Coverage CI
* CVE Audit CI
* EditorConfig CI
* Code Owners
* NO_COLOR, CLICOLOR, CLICOLOR_FORCE controls
* Example of inline configuration with file rotation
* Time Based Trigger

### Changed

* Update minimum supported rust to 1.69 for CVE-2020-26235
* Update `arc-swap` to `1.6`
* Update `log` to `0.4.20`
* Update `humantime` to `2.1`
* Update `serde_yaml` to `0.9`
* Update `toml` to `0.8`
* Update `derivative` to `2.2`
* Update `tempfile` to `3.8`
* Moved `level` field before `message` in json format
* Legacy test moved to examples

### Fixed

* README typo regarding building for dev on windows
* Apply editorconfig
* Swap rustfmt configuration to `imports_granularity="Crate"` over deprecated `merge_imports = true`

## [1.2.0]

### Changed

* Update minimum supported rust to 1.56 for `edition 2021`

### Fixed

* Typemap fix: [#282](https://github.com/estk/log4rs/pull/282)

## [1.1.1]

### Added

### Changed

* Removed palaver
* Update `parking_lot` to `0.11`
* Update minimum supported rust to 1.49 for `parking_lot`

### Fixed

* #253

## [1.1.0]

### Added

* Example of compile-time config
* `gettid` for `PatternEncoder`
* Better rotation benchmark statistics
* `tty_only` option to `ConsoleAppender`

### Changed

* Update `arc_swap` to `1.2`
* Update `thread_id` to `4`
* Update docs for `FixedWindow::build`
* Drop `Regex` dependency

### Fixed

* Hide {} in error message from formatting machinery
* Fix link in examples

## [1.0.0]

### Added

* Custom error handling
* Allow parsing of config from string
* Expand env vars in file path of file and RollingFile appenders PR#155
* Console appender can be configured to only write output when it's a TTY

### Changed

* Colors changed to match `env_logger`
* Drop XML config support
* Rename feature `file` to `config_parsing`
* Use `thiserror`/`anyhow` for errors

### Fixed

## [0.13.0]

### Added

### Changed

* Update `serde-xml-rs` to `0.4`
* Update `parking_lot` to `0.11`

### Fixed

* Fix bug where both `pattern_encoder` and `json_encoder` features need to be active to use either

## [0.12.0]

### Added

* Derived `Clone` for `Handle`

### Changed

### Fixed

* Build warnings
* Docs typos

## [0.11.0]

A performance issue was discovered with gzip and rolling logs. The `background_rotation` feature was added to mitigate this by spawning a background thread to perform the rotation in. Shout out to @yakov-bakhmatov for the PR!

### Added

* `background_rotation` feature which rotates and compresses log archives in a background thread

### Changed

* Deprecate xml feature in preparation for removal
* Simplify and increase visibility of docs
* Swap some synchronization primitives to use `parking_lot` implementations

### Fixed

## [0.10.0]

This is a big release as we're moving to rust 2018 edition!
### Added

* More badges in the readme

### Changed

* Use rust 2018 edition
* Minimum rust version is 1.38.0
* Update `arcswap`, `serde-value` and `serde-xml-rs`

### Fixed

* Deprecate len method on rolling_file
* Windows build issue after 2018 edition

## [0.9.0]

### Added

* `Logger` is now public
* `PatternEncoder` now has the pid
* Many config structs are now `Clone` and `Debug` for convenience
* JSON logger example added
* File logging example added

### Fixed

* Hierarchical Changelog
* No longer looking for maintainer

## [0.8.3] - 2019-04-02

### Fixed

* Fixed Cargo.toml badge

## [0.8.2] - 2019-04-02

### Changed

* Switched from crossbeam's `ArcCell` to arc-swap's `ArcSwap` internally
* Upgraded toml to 0.5

## [0.8.1] - 2018-10-17

### Added

* Support thread IDs in both JSON and pattern encoders

### Changed

* Upgraded to serde_yaml 0.8

## [0.8.0] - 2017-12-25

### Added

* XML-formatted config files are now supported
* `Append::flush` method

### Changed

* Upgraded to log 0.4

## [0.7.0] - 2017-04-26

### Added

### Changed

* Update to serde 1.0

## [0.6.3] - 2017-04-05

### Added

### Changed

* Fix console appender to actually log to stdout when requested

## [0.6.2] - 2017-03-01

### Added

### Changed

* Fix handling of non-0 bases in rolling file appender

## [0.6.1] - 2017-02-11

### Added

* Add TOML support back in

### Changed

## [0.6.0] - 2017-02-10

### Added

* Enable most features by default. This increases compile times a bit, but is way less confusing for people since components aren't randomly missing
* Restructure config deserialization. A log4rs config can now be embedded in other config structures and deserialized by downstream users

### Changed

* Update to serde 0.9
* Use serde_derive instead of manual codegen
* Drop TOML support. The toml crate hasn't yet been released with support for serde 0.9, but we'll add support back when that lands

## [0.5.2] - 2016-11-25

### Added

* Make Deserializers Clone

### Changed

## [0.5.1] - 2016-11-20

### Added

### Changed

* Update serde_yaml
* Fix file modification time checks in config reloader

log4rs-1.3.0/Cargo.lock

# This file is automatically @generated by Cargo.
# It is not intended for manual editing.
version = 3 [[package]] name = "adler" version = "1.0.2" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "f26201604c87b1e01bd3d98f8d5d9a8fcbb815e8cedb41ffccbeb4bf593a35fe" [[package]] name = "android-tzdata" version = "0.1.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "e999941b234f3131b00bc13c22d06e8c5ff726d1b6318ac7eb276997bbb4fef0" [[package]] name = "android_system_properties" version = "0.1.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "819e7219dbd41043ac279b19830f2efc897156490d7fd6ea916720117ee66311" dependencies = [ "libc", ] [[package]] name = "anyhow" version = "1.0.79" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "080e9890a082662b09c1ad45f567faeeb47f22b5fb23895fbe1e651e718e25ca" [[package]] name = "arc-swap" version = "1.6.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "bddcadddf5e9015d310179a59bb28c4d4b9920ad0f11e8e14dbadf654890c9a6" [[package]] name = "autocfg" version = "1.1.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "d468802bab17cbc0cc575e9b053f41e72aa36bfa6b7f55e3529ffa43161b97fa" [[package]] name = "bitflags" version = "1.3.2" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "bef38d45163c2f1dde094a7dfd33ccf595c92905c8f8f4fdc18d06fb1037718a" [[package]] name = "bitflags" version = "2.4.2" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "ed570934406eb16438a4e976b1b4500774099c13b8cb96eec99f620f05090ddf" [[package]] name = "bumpalo" version = "3.14.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "7f30e7476521f6f8af1a1c4c0b8cc94f0bee37d91763d0ca2665f299b6cd8aec" [[package]] name = "cc" version = "1.0.83" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "f1174fb0b6ec23863f8b971027804a42614e347eafb0a95bf0b12cdae21fc4d0" dependencies = [ "libc", ] [[package]] name = "cfg-if" version = "1.0.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "baf1de4339761588bc0619e3cbc0120ee582ebb74b53b4efbf79117bd2da40fd" [[package]] name = "chrono" version = "0.4.33" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "9f13690e35a5e4ace198e7beea2895d29f3a9cc55015fcebe6336bd2010af9eb" dependencies = [ "android-tzdata", "iana-time-zone", "num-traits", "windows-targets 0.52.0", ] [[package]] name = "core-foundation-sys" version = "0.8.6" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "06ea2b9bc92be3c2baa9334a323ebca2d6f074ff852cd1d7b11064035cd3868f" [[package]] name = "crc32fast" version = "1.3.2" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "b540bd8bc810d3885c6ea91e2018302f68baba2129ab3e88f32389ee9370880d" dependencies = [ "cfg-if", ] [[package]] name = "derivative" version = "2.2.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "fcc3dd5e9e9c0b295d6e1e4d811fb6f157d5ffd784b8d202fc62eac8035a770b" dependencies = [ "proc-macro2", "quote", "syn 1.0.109", ] [[package]] name = "destructure_traitobject" version = "0.2.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "3c877555693c14d2f84191cfd3ad8582790fc52b5e2274b40b59cf5f5cea25c7" [[package]] name = "equivalent" version = "1.0.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "5443807d6dff69373d433ab9ef5378ad8df50ca6298caf15de6e52e24aaf54d5" [[package]] name = 
"errno" version = "0.3.8" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "a258e46cdc063eb8519c00b9fc845fc47bcfca4130e2f08e88665ceda8474245" dependencies = [ "libc", "windows-sys", ] [[package]] name = "fastrand" version = "2.0.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "25cbce373ec4653f1a01a31e8a5e5ec0c622dc27ff9c4e6606eefef5cbbed4a5" [[package]] name = "flate2" version = "1.0.28" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "46303f565772937ffe1d394a4fac6f411c6013172fadde9dcdb1e147a086940e" dependencies = [ "crc32fast", "miniz_oxide", ] [[package]] name = "fnv" version = "1.0.7" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "3f9eec918d3f24069decb9af1554cad7c880e2da24a9afd88aca000531ab82c1" [[package]] name = "getrandom" version = "0.2.12" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "190092ea657667030ac6a35e305e62fc4dd69fd98ac98631e5d3a2b1575a12b5" dependencies = [ "cfg-if", "libc", "wasi", ] [[package]] name = "hashbrown" version = "0.14.3" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "290f1a1d9242c78d09ce40a5e87e7554ee637af1351968159f4952f028f75604" [[package]] name = "humantime" version = "2.1.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "9a3a5bfb195931eeb336b2a7b4d761daec841b97f947d34394601737a7bba5e4" [[package]] name = "iana-time-zone" version = "0.1.60" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "e7ffbb5a1b541ea2561f8c41c087286cc091e21e556a4f09a8f6cbf17b69b141" dependencies = [ "android_system_properties", "core-foundation-sys", "iana-time-zone-haiku", "js-sys", "wasm-bindgen", "windows-core", ] [[package]] name = "iana-time-zone-haiku" version = "0.1.2" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "f31827a206f56af32e590ba56d5d2d085f558508192593743f16b2306495269f" dependencies = [ "cc", ] [[package]] name = "indexmap" version = "2.2.2" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "824b2ae422412366ba479e8111fd301f7b5faece8149317bb81925979a53f520" dependencies = [ "equivalent", "hashbrown", ] [[package]] name = "itoa" version = "1.0.10" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "b1a46d1a171d865aa5f83f92695765caa047a9b4cbae2cbf37dbd613a793fd4c" [[package]] name = "js-sys" version = "0.3.68" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "406cda4b368d531c842222cf9d2600a9a4acce8d29423695379c6868a143a9ee" dependencies = [ "wasm-bindgen", ] [[package]] name = "lazy_static" version = "1.4.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "e2abad23fbc42b3700f2f279844dc832adb2b2eb069b2df918f455c4e18cc646" [[package]] name = "libc" version = "0.2.153" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "9c198f91728a82281a64e1f4f9eeb25d82cb32a5de251c6bd1b5154d63a8e7bd" [[package]] name = "linux-raw-sys" version = "0.4.13" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "01cda141df6706de531b6c46c3a33ecca755538219bd484262fa09410c13539c" [[package]] name = "lock_api" version = "0.4.11" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "3c168f8615b12bc01f9c17e2eb0cc07dcae1940121185446edc3744920e8ef45" dependencies = [ "autocfg", "scopeguard", ] [[package]] name = "log" version = "0.4.20" source = 
"registry+https://github.com/rust-lang/crates.io-index" checksum = "b5e6163cb8c49088c2c36f57875e58ccd8c87c7427f7fbd50ea6710b2f3f2e8f" dependencies = [ "serde", ] [[package]] name = "log-mdc" version = "0.1.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "a94d21414c1f4a51209ad204c1776a3d0765002c76c6abcb602a6f09f1e881c7" [[package]] name = "log4rs" version = "1.3.0" dependencies = [ "anyhow", "arc-swap", "chrono", "derivative", "flate2", "fnv", "humantime", "lazy_static", "libc", "log", "log-mdc", "mock_instant", "once_cell", "parking_lot", "rand", "serde", "serde-value", "serde_json", "serde_yaml", "streaming-stats", "tempfile", "thiserror", "thread-id", "toml", "typemap-ors", "winapi", ] [[package]] name = "memchr" version = "2.7.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "523dc4f511e55ab87b694dc30d0f820d60906ef06413f93d4d7a1385599cc149" [[package]] name = "miniz_oxide" version = "0.7.2" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "9d811f3e15f28568be3407c8e7fdb6514c1cda3cb30683f15b6a1a1dc4ea14a7" dependencies = [ "adler", ] [[package]] name = "mock_instant" version = "0.3.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "6c1a54de846c4006b88b1516731cc1f6026eb5dc4bcb186aa071ef66d40524ec" [[package]] name = "num-traits" version = "0.2.18" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "da0df0e5185db44f69b44f26786fe401b6c293d1907744beaa7fa62b2e5a517a" dependencies = [ "autocfg", ] [[package]] name = "once_cell" version = "1.19.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "3fdb12b2476b595f9358c5161aa467c2438859caa136dec86c26fdd2efe17b92" [[package]] name = "ordered-float" version = "2.10.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "68f19d67e5a2795c94e73e0bb1cc1a7edeb2e28efd39e2e1c9b7a40c1108b11c" dependencies = [ "num-traits", ] [[package]] name = "parking_lot" version = "0.12.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "3742b2c103b9f06bc9fff0a37ff4912935851bee6d36f3c02bcc755bcfec228f" dependencies = [ "lock_api", "parking_lot_core", ] [[package]] name = "parking_lot_core" version = "0.9.9" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "4c42a9226546d68acdd9c0a280d17ce19bfe27a46bf68784e4066115788d008e" dependencies = [ "cfg-if", "libc", "redox_syscall", "smallvec", "windows-targets 0.48.5", ] [[package]] name = "ppv-lite86" version = "0.2.17" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "5b40af805b3121feab8a3c29f04d8ad262fa8e0561883e7653e024ae4479e6de" [[package]] name = "proc-macro2" version = "1.0.78" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "e2422ad645d89c99f8f3e6b88a9fdeca7fabeac836b1002371c4367c8f984aae" dependencies = [ "unicode-ident", ] [[package]] name = "quote" version = "1.0.35" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "291ec9ab5efd934aaf503a6466c5d5251535d108ee747472c3977cc5acc868ef" dependencies = [ "proc-macro2", ] [[package]] name = "rand" version = "0.8.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "34af8d1a0e25924bc5b7c43c079c942339d8f0a8b57c39049bef581b46327404" dependencies = [ "libc", "rand_chacha", "rand_core", ] [[package]] name = "rand_chacha" version = "0.3.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = 
"e6c10a63a0fa32252be49d21e7709d4d4baf8d231c2dbce1eaa8141b9b127d88" dependencies = [ "ppv-lite86", "rand_core", ] [[package]] name = "rand_core" version = "0.6.4" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "ec0be4795e2f6a28069bec0b5ff3e2ac9bafc99e6a9a7dc3547996c5c816922c" dependencies = [ "getrandom", ] [[package]] name = "redox_syscall" version = "0.4.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "4722d768eff46b75989dd134e5c353f0d6296e5aaa3132e776cbdb56be7731aa" dependencies = [ "bitflags 1.3.2", ] [[package]] name = "rustix" version = "0.38.31" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "6ea3e1a662af26cd7a3ba09c0297a31af215563ecf42817c98df621387f4e949" dependencies = [ "bitflags 2.4.2", "errno", "libc", "linux-raw-sys", "windows-sys", ] [[package]] name = "ryu" version = "1.0.16" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "f98d2aa92eebf49b69786be48e4477826b256916e84a57ff2a4f21923b48eb4c" [[package]] name = "scopeguard" version = "1.2.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "94143f37725109f92c262ed2cf5e59bce7498c01bcc1502d7b9afe439a4e9f49" [[package]] name = "serde" version = "1.0.196" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "870026e60fa08c69f064aa766c10f10b1d62db9ccd4d0abb206472bee0ce3b32" dependencies = [ "serde_derive", ] [[package]] name = "serde-value" version = "0.7.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "f3a1a3341211875ef120e117ea7fd5228530ae7e7036a779fdc9117be6b3282c" dependencies = [ "ordered-float", "serde", ] [[package]] name = "serde_derive" version = "1.0.196" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "33c85360c95e7d137454dc81d9a4ed2b8efd8fbe19cee57357b32b9771fccb67" dependencies = [ "proc-macro2", "quote", "syn 2.0.48", ] [[package]] name = "serde_json" version = "1.0.113" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "69801b70b1c3dac963ecb03a364ba0ceda9cf60c71cfe475e99864759c8b8a79" dependencies = [ "itoa", "ryu", "serde", ] [[package]] name = "serde_spanned" version = "0.6.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "eb3622f419d1296904700073ea6cc23ad690adbd66f13ea683df73298736f0c1" dependencies = [ "serde", ] [[package]] name = "serde_yaml" version = "0.9.31" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "adf8a49373e98a4c5f0ceb5d05aa7c648d75f63774981ed95b7c7443bbd50c6e" dependencies = [ "indexmap", "itoa", "ryu", "serde", "unsafe-libyaml", ] [[package]] name = "smallvec" version = "1.13.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "e6ecd384b10a64542d77071bd64bd7b231f4ed5940fba55e98c3de13824cf3d7" [[package]] name = "streaming-stats" version = "0.2.3" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "b0d670ce4e348a2081843569e0f79b21c99c91bb9028b3b3ecb0f050306de547" dependencies = [ "num-traits", ] [[package]] name = "syn" version = "1.0.109" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "72b64191b275b66ffe2469e8af2c1cfe3bafa67b529ead792a6d0160888b4237" dependencies = [ "proc-macro2", "quote", "unicode-ident", ] [[package]] name = "syn" version = "2.0.48" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "0f3531638e407dfc0814761abb7c00a5b54992b849452a0646b7f65c9f770f3f" dependencies = [ 
"proc-macro2", "quote", "unicode-ident", ] [[package]] name = "tempfile" version = "3.10.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "a365e8cd18e44762ef95d87f284f4b5cd04107fec2ff3052bd6a3e6069669e67" dependencies = [ "cfg-if", "fastrand", "rustix", "windows-sys", ] [[package]] name = "thiserror" version = "1.0.56" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "d54378c645627613241d077a3a79db965db602882668f9136ac42af9ecb730ad" dependencies = [ "thiserror-impl", ] [[package]] name = "thiserror-impl" version = "1.0.56" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "fa0faa943b50f3db30a20aa7e265dbc66076993efed8463e8de414e5d06d3471" dependencies = [ "proc-macro2", "quote", "syn 2.0.48", ] [[package]] name = "thread-id" version = "4.2.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "f0ec81c46e9eb50deaa257be2f148adf052d1fb7701cfd55ccfab2525280b70b" dependencies = [ "libc", "winapi", ] [[package]] name = "toml" version = "0.8.10" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "9a9aad4a3066010876e8dcf5a8a06e70a558751117a145c6ce2b82c2e2054290" dependencies = [ "serde", "serde_spanned", "toml_datetime", "toml_edit", ] [[package]] name = "toml_datetime" version = "0.6.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "3550f4e9685620ac18a50ed434eb3aec30db8ba93b0287467bca5826ea25baf1" dependencies = [ "serde", ] [[package]] name = "toml_edit" version = "0.22.4" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "0c9ffdf896f8daaabf9b66ba8e77ea1ed5ed0f72821b398aba62352e95062951" dependencies = [ "indexmap", "serde", "serde_spanned", "toml_datetime", "winnow", ] [[package]] name = "typemap-ors" version = "1.0.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "a68c24b707f02dd18f1e4ccceb9d49f2058c2fb86384ef9972592904d7a28867" dependencies = [ "unsafe-any-ors", ] [[package]] name = "unicode-ident" version = "1.0.12" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "3354b9ac3fae1ff6755cb6db53683adb661634f67557942dea4facebec0fee4b" [[package]] name = "unsafe-any-ors" version = "1.0.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "e0a303d30665362d9680d7d91d78b23f5f899504d4f08b3c4cf08d055d87c0ad" dependencies = [ "destructure_traitobject", ] [[package]] name = "unsafe-libyaml" version = "0.2.10" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "ab4c90930b95a82d00dc9e9ac071b4991924390d46cbd0dfe566148667605e4b" [[package]] name = "wasi" version = "0.11.0+wasi-snapshot-preview1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "9c8d87e72b64a3b4db28d11ce29237c246188f4f51057d65a7eab63b7987e423" [[package]] name = "wasm-bindgen" version = "0.2.91" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "c1e124130aee3fb58c5bdd6b639a0509486b0338acaaae0c84a5124b0f588b7f" dependencies = [ "cfg-if", "wasm-bindgen-macro", ] [[package]] name = "wasm-bindgen-backend" version = "0.2.91" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "c9e7e1900c352b609c8488ad12639a311045f40a35491fb69ba8c12f758af70b" dependencies = [ "bumpalo", "log", "once_cell", "proc-macro2", "quote", "syn 2.0.48", "wasm-bindgen-shared", ] [[package]] name = "wasm-bindgen-macro" version = "0.2.91" source = "registry+https://github.com/rust-lang/crates.io-index" checksum 
= "b30af9e2d358182b5c7449424f017eba305ed32a7010509ede96cdc4696c46ed" dependencies = [ "quote", "wasm-bindgen-macro-support", ] [[package]] name = "wasm-bindgen-macro-support" version = "0.2.91" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "642f325be6301eb8107a83d12a8ac6c1e1c54345a7ef1a9261962dfefda09e66" dependencies = [ "proc-macro2", "quote", "syn 2.0.48", "wasm-bindgen-backend", "wasm-bindgen-shared", ] [[package]] name = "wasm-bindgen-shared" version = "0.2.91" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "4f186bd2dcf04330886ce82d6f33dd75a7bfcf69ecf5763b89fcde53b6ac9838" [[package]] name = "winapi" version = "0.3.9" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "5c839a674fcd7a98952e593242ea400abe93992746761e38641405d28b00f419" dependencies = [ "winapi-i686-pc-windows-gnu", "winapi-x86_64-pc-windows-gnu", ] [[package]] name = "winapi-i686-pc-windows-gnu" version = "0.4.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "ac3b87c63620426dd9b991e5ce0329eff545bccbbb34f3be09ff6fb6ab51b7b6" [[package]] name = "winapi-x86_64-pc-windows-gnu" version = "0.4.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "712e227841d057c1ee1cd2fb22fa7e5a5461ae8e48fa2ca79ec42cfc1931183f" [[package]] name = "windows-core" version = "0.52.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "33ab640c8d7e35bf8ba19b884ba838ceb4fba93a4e8c65a9059d08afcfc683d9" dependencies = [ "windows-targets 0.52.0", ] [[package]] name = "windows-sys" version = "0.52.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "282be5f36a8ce781fad8c8ae18fa3f9beff57ec1b52cb3de0789201425d9a33d" dependencies = [ "windows-targets 0.52.0", ] [[package]] name = "windows-targets" version = "0.48.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "9a2fa6e2155d7247be68c096456083145c183cbbbc2764150dda45a87197940c" dependencies = [ "windows_aarch64_gnullvm 0.48.5", "windows_aarch64_msvc 0.48.5", "windows_i686_gnu 0.48.5", "windows_i686_msvc 0.48.5", "windows_x86_64_gnu 0.48.5", "windows_x86_64_gnullvm 0.48.5", "windows_x86_64_msvc 0.48.5", ] [[package]] name = "windows-targets" version = "0.52.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "8a18201040b24831fbb9e4eb208f8892e1f50a37feb53cc7ff887feb8f50e7cd" dependencies = [ "windows_aarch64_gnullvm 0.52.0", "windows_aarch64_msvc 0.52.0", "windows_i686_gnu 0.52.0", "windows_i686_msvc 0.52.0", "windows_x86_64_gnu 0.52.0", "windows_x86_64_gnullvm 0.52.0", "windows_x86_64_msvc 0.52.0", ] [[package]] name = "windows_aarch64_gnullvm" version = "0.48.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "2b38e32f0abccf9987a4e3079dfb67dcd799fb61361e53e2882c3cbaf0d905d8" [[package]] name = "windows_aarch64_gnullvm" version = "0.52.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "cb7764e35d4db8a7921e09562a0304bf2f93e0a51bfccee0bd0bb0b666b015ea" [[package]] name = "windows_aarch64_msvc" version = "0.48.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "dc35310971f3b2dbbf3f0690a219f40e2d9afcf64f9ab7cc1be722937c26b4bc" [[package]] name = "windows_aarch64_msvc" version = "0.52.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "bbaa0368d4f1d2aaefc55b6fcfee13f41544ddf36801e793edbbfd7d7df075ef" [[package]] name = "windows_i686_gnu" version = "0.48.5" 
source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "a75915e7def60c94dcef72200b9a8e58e5091744960da64ec734a6c6e9b3743e" [[package]] name = "windows_i686_gnu" version = "0.52.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "a28637cb1fa3560a16915793afb20081aba2c92ee8af57b4d5f28e4b3e7df313" [[package]] name = "windows_i686_msvc" version = "0.48.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "8f55c233f70c4b27f66c523580f78f1004e8b5a8b659e05a4eb49d4166cca406" [[package]] name = "windows_i686_msvc" version = "0.52.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "ffe5e8e31046ce6230cc7215707b816e339ff4d4d67c65dffa206fd0f7aa7b9a" [[package]] name = "windows_x86_64_gnu" version = "0.48.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "53d40abd2583d23e4718fddf1ebec84dbff8381c07cae67ff7768bbf19c6718e" [[package]] name = "windows_x86_64_gnu" version = "0.52.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "3d6fa32db2bc4a2f5abeacf2b69f7992cd09dca97498da74a151a3132c26befd" [[package]] name = "windows_x86_64_gnullvm" version = "0.48.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "0b7b52767868a23d5bab768e390dc5f5c55825b6d30b86c844ff2dc7414044cc" [[package]] name = "windows_x86_64_gnullvm" version = "0.52.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "1a657e1e9d3f514745a572a6846d3c7aa7dbe1658c056ed9c3344c4109a6949e" [[package]] name = "windows_x86_64_msvc" version = "0.48.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "ed94fce61571a4006852b7389a063ab983c02eb1bb37b47f8272ce92d06d9538" [[package]] name = "windows_x86_64_msvc" version = "0.52.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "dff9641d1cd4be8d1a070daf9e3773c5f67e78b4d9d42263020c057706765c04" [[package]] name = "winnow" version = "0.5.39" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "5389a154b01683d28c77f8f68f49dea75f0a4da32557a58f68ee51ebba472d29" dependencies = [ "memchr", ] log4rs-1.3.0/Cargo.toml0000644000000105120000000000100102440ustar # THIS FILE IS AUTOMATICALLY GENERATED BY CARGO # # When uploading crates to the registry Cargo will automatically # "normalize" Cargo.toml files for maximal compatibility # with all versions of Cargo and also rewrite `path` dependencies # to registry (e.g., crates.io) dependencies. # # If you are reading this file be aware that the original Cargo.toml # will likely look very different (and much more reasonable). # See Cargo.toml.orig for the original contents. 
[package]
edition = "2018"
rust-version = "1.69"
name = "log4rs"
version = "1.3.0"
authors = [
    "Steven Fackler ",
    "Evan Simmons ",
]
description = "A highly configurable multi-output logging implementation for the `log` facade"
readme = "README.md"
keywords = [
    "log",
    "logger",
    "logging",
    "log4",
]
license = "MIT OR Apache-2.0"
repository = "https://github.com/estk/log4rs"

[[example]]
name = "json_logger"
required-features = [
    "json_encoder",
    "console_appender",
]

[[example]]
name = "log_to_file"
required-features = [
    "console_appender",
    "file_appender",
    "rolling_file_appender",
]

[[example]]
name = "compile_time_config"
required-features = [
    "yaml_format",
    "config_parsing",
]

[[example]]
name = "log_to_file_with_rolling"
required-features = [
    "file_appender",
    "rolling_file_appender",
    "size_trigger",
]

[[example]]
name = "multi_logger_config"
required-features = [
    "yaml_format",
    "config_parsing",
]

[[bench]]
name = "rotation"
harness = false

[dependencies.anyhow]
version = "1.0.28"

[dependencies.arc-swap]
version = "1.6"

[dependencies.chrono]
version = "0.4.23"
features = ["clock"]
optional = true
default-features = false

[dependencies.derivative]
version = "2.2"

[dependencies.flate2]
version = "1.0"
optional = true

[dependencies.fnv]
version = "1.0"

[dependencies.humantime]
version = "2.1"
optional = true

[dependencies.log]
version = "0.4.20"
features = ["std"]

[dependencies.log-mdc]
version = "0.1"
optional = true

[dependencies.once_cell]
version = "1.17.1"

[dependencies.parking_lot]
version = "0.12.0"
optional = true

[dependencies.rand]
version = "0.8"
optional = true

[dependencies.serde]
version = "1.0"
features = ["derive"]
optional = true

[dependencies.serde-value]
version = "0.7"
optional = true

[dependencies.serde_json]
version = "1.0"
optional = true

[dependencies.serde_yaml]
version = "0.9"
optional = true

[dependencies.thiserror]
version = "1.0.15"

[dependencies.thread-id]
version = "4"
optional = true

[dependencies.toml]
version = "0.8"
optional = true

[dependencies.typemap-ors]
version = "1.0.0"
optional = true

[dev-dependencies.humantime]
version = "2.1"

[dev-dependencies.lazy_static]
version = "1.4"

[dev-dependencies.mock_instant]
version = "0.3"

[dev-dependencies.streaming-stats]
version = "0.2.3"

[dev-dependencies.tempfile]
version = "3.8"

[features]
all_components = [
    "console_appender",
    "file_appender",
    "rolling_file_appender",
    "compound_policy",
    "delete_roller",
    "fixed_window_roller",
    "size_trigger",
    "time_trigger",
    "json_encoder",
    "pattern_encoder",
    "threshold_filter",
]
ansi_writer = []
background_rotation = []
compound_policy = []
config_parsing = [
    "humantime",
    "serde",
    "serde-value",
    "typemap-ors",
    "log/serde",
]
console_appender = [
    "console_writer",
    "simple_writer",
    "pattern_encoder",
]
console_writer = [
    "ansi_writer",
    "libc",
    "winapi",
]
default = [
    "all_components",
    "config_parsing",
    "yaml_format",
]
delete_roller = []
file_appender = [
    "parking_lot",
    "simple_writer",
    "pattern_encoder",
]
fixed_window_roller = []
gzip = ["flate2"]
json_encoder = [
    "serde",
    "serde_json",
    "chrono",
    "log-mdc",
    "log/serde",
    "thread-id",
]
json_format = ["serde_json"]
pattern_encoder = [
    "chrono",
    "log-mdc",
    "thread-id",
]
rolling_file_appender = [
    "parking_lot",
    "simple_writer",
    "pattern_encoder",
]
simple_writer = []
size_trigger = []
threshold_filter = []
time_trigger = ["rand"]
toml_format = ["toml"]
yaml_format = ["serde_yaml"]

[target."cfg(not(windows))".dependencies.libc]
version = "0.2"
optional = true

[target."cfg(windows)".dependencies.winapi]
version = "0.3"
features = [
"handleapi", "minwindef", "processenv", "winbase", "wincon", ] optional = true log4rs-1.3.0/Cargo.toml.orig000064400000000000000000000064251046102023000137350ustar 00000000000000[package] name = "log4rs" version = "1.3.0" authors = ["Steven Fackler ", "Evan Simmons "] description = "A highly configurable multi-output logging implementation for the `log` facade" license = "MIT OR Apache-2.0" repository = "https://github.com/estk/log4rs" readme = "README.md" keywords = ["log", "logger", "logging", "log4"] edition = "2018" rust-version = "1.69" [features] default = ["all_components", "config_parsing", "yaml_format"] config_parsing = ["humantime", "serde", "serde-value", "typemap-ors", "log/serde"] yaml_format = ["serde_yaml"] json_format = ["serde_json"] toml_format = ["toml"] console_appender = ["console_writer", "simple_writer", "pattern_encoder"] file_appender = ["parking_lot", "simple_writer", "pattern_encoder"] rolling_file_appender = ["parking_lot", "simple_writer", "pattern_encoder"] compound_policy = [] delete_roller = [] fixed_window_roller = [] size_trigger = [] time_trigger = ["rand"] json_encoder = ["serde", "serde_json", "chrono", "log-mdc", "log/serde", "thread-id"] pattern_encoder = ["chrono", "log-mdc", "thread-id"] ansi_writer = [] console_writer = ["ansi_writer", "libc", "winapi"] simple_writer = [] threshold_filter = [] background_rotation = [] all_components = [ "console_appender", "file_appender", "rolling_file_appender", "compound_policy", "delete_roller", "fixed_window_roller", "size_trigger", "time_trigger", "json_encoder", "pattern_encoder", "threshold_filter" ] gzip = ["flate2"] [[bench]] name = "rotation" harness = false [dependencies] arc-swap = "1.6" chrono = { version = "0.4.23", optional = true, features = ["clock"], default-features = false } flate2 = { version = "1.0", optional = true } fnv = "1.0" humantime = { version = "2.1", optional = true } log = { version = "0.4.20", features = ["std"] } log-mdc = { version = "0.1", optional = true } serde = { version = "1.0", optional = true, features = ["derive"] } serde-value = { version = "0.7", optional = true } thread-id = { version = "4", optional = true } typemap-ors = { version = "1.0.0", optional = true } serde_json = { version = "1.0", optional = true } serde_yaml = { version = "0.9", optional = true } toml = { version = "0.8", optional = true } parking_lot = { version = "0.12.0", optional = true } rand = { version = "0.8", optional = true} thiserror = "1.0.15" anyhow = "1.0.28" derivative = "2.2" once_cell = "1.17.1" [target.'cfg(windows)'.dependencies] winapi = { version = "0.3", optional = true, features = ["handleapi", "minwindef", "processenv", "winbase", "wincon"] } [target.'cfg(not(windows))'.dependencies] libc = { version = "0.2", optional = true } [dev-dependencies] lazy_static = "1.4" streaming-stats = "0.2.3" humantime = "2.1" tempfile = "3.8" mock_instant = "0.3" [[example]] name = "json_logger" required-features = ["json_encoder", "console_appender"] [[example]] name = "log_to_file" required-features = ["console_appender", "file_appender", "rolling_file_appender"] [[example]] name = "compile_time_config" required-features = ["yaml_format", "config_parsing"] [[example]] name = "log_to_file_with_rolling" required-features = ["file_appender", "rolling_file_appender", "size_trigger"] [[example]] name = "multi_logger_config" required-features = ["yaml_format", "config_parsing"] log4rs-1.3.0/LICENSE-APACHE000064400000000000000000000251371046102023000127730ustar 00000000000000 Apache License Version 
2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. 
Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: (a) You must give any other recipients of the Work or Derivative Works a copy of this License; and (b) You must cause any modified files to carry prominent notices stating that You changed the files; and (c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and (d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. 
This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONS APPENDIX: How to apply the Apache License to your work. To apply the Apache License to your work, attach the following boilerplate notice, with the fields enclosed by brackets "[]" replaced with your own identifying information. (Don't include the brackets!) The text should be enclosed in the appropriate comment syntax for the file format. We also recommend that a file or class name and description of purpose be included on the same "printed page" as the copyright notice for easier identification within third-party archives. Copyright [yyyy] [name of copyright owner] Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. 
log4rs-1.3.0/LICENSE-MIT

The MIT License (MIT)

Copyright (c) 2015-2016 Steven Fackler

Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies
of the Software, and to permit persons to whom the Software is furnished to do
so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

log4rs-1.3.0/README.md

# log4rs

[![docs](https://docs.rs/log4rs/badge.svg)](https://docs.rs/log4rs)
[![crates.io](https://img.shields.io/crates/v/log4rs.svg)](https://crates.io/crates/log4rs)
[![License: MIT OR Apache-2.0](https://img.shields.io/crates/l/clippy.svg)](#license)
![CI](https://github.com/estk/log4rs/workflows/CI/badge.svg)
[![Minimum rustc version](https://img.shields.io/badge/rustc-1.69+-green.svg)](https://github.com/estk/log4rs#rust-version-requirements)

log4rs is a highly configurable logging framework modeled after Java's Logback and log4j libraries.

## Warning

If you are using file rotation in your configuration, there is a known, substantial performance issue, so listen up! By default the `gzip` feature is enabled, and when rolling files it will zip log archives automatically. This is a problem when the log archives are large, as the zip happens on the main thread and halts the process until the compression is complete. Be advised that the `gzip` feature will be removed from default features as of `1.0`.

The methods to mitigate this are as follows.

1. Use the `background_rotation` feature which spawns an os thread to do the compression.
1. Disable the `gzip` feature with `--no-default-features`.
1. Ensure the archives are small enough that the compression time is acceptable.

For more information see the PR that added [`background_rotation`](https://github.com/estk/log4rs/pull/117).

## Quick Start

log4rs.yaml:

```yaml
refresh_rate: 30 seconds

appenders:
  stdout:
    kind: console

  requests:
    kind: file
    path: "log/requests.log"
    encoder:
      pattern: "{d} - {m}{n}"

root:
  level: warn
  appenders:
    - stdout

loggers:
  app::backend::db:
    level: info

  app::requests:
    level: info
    appenders:
      - requests
    additive: false
```

lib.rs:

```rust
use log::{error, info, warn};
use log4rs;

fn main() {
    log4rs::init_file("config/log4rs.yaml", Default::default()).unwrap();

    info!("booting up");

    // ...
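    // The imported `warn!` and `error!` macros work the same way. Under the
    // sample config above, warn- and error-level records pass the root
    // logger's `warn` threshold and are written to stdout. (Illustrative
    // calls, not part of the original example.)
    warn!("this goes to stdout via the root logger");
    error!("so does this");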
}
```

## Rust Version Requirements

1.69

## Building for Dev

* Run the tests: `cargo test --all-features`
* Run the tests for windows with [cross](https://github.com/rust-embedded/cross): `cross test --target x86_64-pc-windows-gnu`
* Run the tests for all individual features: `./test.sh`
* Run the tests for all individual features for windows with [cross](https://github.com/rust-embedded/cross): `./test.sh win`

## License

Licensed under either of

* Apache License, Version 2.0 ([LICENSE-APACHE](LICENSE-APACHE) or <http://www.apache.org/licenses/LICENSE-2.0>)
* MIT license ([LICENSE-MIT](LICENSE-MIT) or <http://opensource.org/licenses/MIT>)

at your option.

### Contribution

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you shall be dual licensed as above, without any additional terms or conditions.

log4rs-1.3.0/benches/rotation.rs

use std::{
    thread,
    time::{Duration, Instant},
};

use lazy_static::lazy_static;
use tempfile::{tempdir, TempDir};

const K: u64 = 1024;
const M: u64 = K * K;
const FILE_SIZE: u64 = 100 * M;
const FILE_COUNT: u32 = 10;

lazy_static! {
    static ref LOGDIR: TempDir = tempdir().unwrap();
    static ref HANDLE: log4rs::Handle =
        log4rs::init_config(mk_config(FILE_SIZE, FILE_COUNT)).unwrap();
    static ref MSG: String = "0".repeat(M as usize);
}

fn main() {
    bench_find_anomalies();
}

// This has been tuned on the assumption that the application will not log faster than we can gzip the files.
// This should fail with just the gzip feature enabled, and succeed with features = gzip,background_rotation enabled.
fn bench_find_anomalies() {
    lazy_static::initialize(&HANDLE);
    let iters = 1000;

    // Time each log call individually so rotation stalls show up as outliers.
    let mut measurements = vec![];
    for _ in 1..iters {
        thread::sleep(Duration::from_millis(5));
        let now = Instant::now();
        a::write_log();
        let dur = now.elapsed();
        measurements.push(dur);
    }
    let stats = Stats::new(&mut measurements);
    let anomalies = stats.anomalies(&measurements);
    println!("{:#?}", stats);
    if !anomalies.is_empty() {
        println!("anomalies: {:?}", anomalies);
    }
    assert!(
        anomalies.is_empty(),
        "There should be no log anomalies: {:?}",
        anomalies
    );
}

mod a {
    pub fn write_log() {
        log::info!("{}", *super::MSG);
    }
}

fn mk_config(file_size: u64, file_count: u32) -> log4rs::config::Config {
    let log_path = LOGDIR.path();
    let log_pattern = log_path.join("log.log");
    #[cfg(feature = "gzip")]
    let roll_pattern = format!("{}/{}", log_path.to_string_lossy(), "log.{}.gz");
    #[cfg(not(feature = "gzip"))]
    let roll_pattern = format!("{}/{}", log_path.to_string_lossy(), "log.{}");

    use log::LevelFilter;
    use log4rs::{
        append::rolling_file::{policy, RollingFileAppender},
        config::{Appender, Config, Logger, Root},
        encode::pattern::PatternEncoder,
    };

    let trigger = policy::compound::trigger::size::SizeTrigger::new(file_size);
    let roller = policy::compound::roll::fixed_window::FixedWindowRoller::builder()
        .build(&roll_pattern, file_count)
        .unwrap();
    let policy = policy::compound::CompoundPolicy::new(Box::new(trigger), Box::new(roller));

    let file = RollingFileAppender::builder()
        .encoder(Box::new(PatternEncoder::new(
            "{d(%Y-%m-%d %H:%M:%S.%3f %Z)} {l} [{t} - {T}] {m}{n}",
        )))
        .build(&log_pattern, Box::new(policy))
        .unwrap();

    Config::builder()
        .appender(Appender::builder().build("file", Box::new(file)))
        .logger(
            Logger::builder()
                .appender("file")
                .additive(false)
                .build("log4rs_benchmark::a", LevelFilter::Info),
        )
        .build(Root::builder().appender("file").build(LevelFilter::Info))
        .unwrap()
}

#[derive(Debug)]
struct Stats {
    min: Duration,
    max: Duration,
    median: Duration,
    mean_nanos: u128,
    variance_nanos: f64,
    stddev_nanos: f64,
}

impl Stats {
    fn new(measurements: &mut [Duration]) -> Self {
        measurements.sort();
        // Running mean and variance, computed in a single pass over the sorted samples.
        let (mean_nanos, variance_nanos) =
            measurements
                .iter()
                .fold((0, 0_f64), |(old_mean, old_variance), x| {
                    let nanos = x.as_nanos();
                    let size = measurements.len();
                    let mean = old_mean + (nanos - old_mean) / (size as u128);
                    let prevq = old_variance * (size as f64);
                    let variance = (prevq + ((nanos - old_mean) as f64) * (nanos - mean) as f64)
                        / (size as f64);
                    (mean, variance)
                });
        Self {
            min: measurements.first().unwrap().to_owned(),
            max: measurements.last().unwrap().to_owned(),
            median: measurements[measurements.len() / 2],
            mean_nanos,
            variance_nanos,
            stddev_nanos: variance_nanos.sqrt(),
        }
    }

    fn anomalies(&self, measurements: &[Duration]) -> Vec<Duration> {
        // Flag any sample more than 50 standard deviations above the mean.
        let mut anomalies = vec![];
        let thresh = self.mean_nanos + ((self.stddev_nanos * 50.0).round() as u128);
        for dur in measurements {
            if dur.as_nanos() > thresh {
                anomalies.push(*dur);
            }
        }
        anomalies
    }
}

log4rs-1.3.0/docs/Configuration.md

# Configuration

log4rs can be configured programmatically by using the builders in the `config` module to construct a log4rs `Config` object, which can be passed to the `init_config` function.

The more common configuration method, however, is via a separate config file. The `init_file` function takes the path to a config file as well as a `Deserializers` object which is responsible for instantiating the various objects specified by the config file. The following section covers the exact configuration syntax. Examples of both the programmatic and config-file approaches can be found in the [examples directory](https://github.com/estk/log4rs/tree/main/examples).

## Common Fields

### LevelFilters

- Off
- Error
- Warn
- Info
- Debug
- Trace

### Filters

The only accepted `filter` is of kind threshold with a level. The level must be a [LevelFilter](#levelfilters). One to many filters are allowed. i.e.

```yml
filters:
  - kind: threshold
    level: info
```

### Encoder

An `encoder` consists of a kind: either pattern (the default) or json. If pattern is defined, the default pattern `{d} {l} {t} - {m}{n}` is used unless overridden. Refer to [this documentation](https://docs.rs/log4rs/latest/log4rs/encode/pattern/index.html#formatters) for details regarding valid patterns.

> Note that the json encoder does not have any additional controls such as the
> pattern field.

i.e.

```yml
encoder:
  kind: pattern
  pattern: "{h({d(%+)(utc)} [{f}:{L}] {l:<6} {M}:{m})}{n}"
```

## Loggers

A map of logger configurations.

### Logger Configuration

The _name_ of the logger is the yml tag.

The _level_ of the logger is optional and defaults to the parent's log level. The level must be a [LevelFilter](#levelfilters).

The _appenders_ field is an optional list of [appenders](#appenders) attached to the logger.

The _additive_ field is an optional boolean determining whether the logger's parent will also be attached to this logger. The default is true.

i.e.

```yml
loggers:
  my_logger:
    level: info
    appenders:
      - my_appender
    additive: true
```

## The Root Logger

Root is the required logger. It is the parent to all children loggers. To configure the Root, refer to [the logger section](#logger-configuration).

> Note: The root logger has no parent, and therefore the _additive_ field does not apply to it.

```yml
root:
  level: info
  appenders:
    - my_appender
```

## Appenders

All appenders require a unique identifying string for each [appender configuration](#appender-config).
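For instance, a top-level `appenders` map might declare two appenders with arbitrary, user-chosen names (a minimal sketch; the names and path below are illustrative, and each kind is configured as described in the sections that follow):

```yml
appenders:
  my_console_appender:
    kind: console
  my_file_appender:
    kind: file
    path: log/app.log
```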
### Appender Config

Each Appender Kind has its own configuration. However, all accept [filters](#filters).

The `kind` field is required in an appender configuration.

#### The Console Appender

The _target_ field is optional and accepts `stdout` or `stderr`. Its default value is stdout.

The _tty_only_ field is an optional boolean and dictates that the appender must only write when the target is a TTY. Its default value is false.

The _encoder_ field is optional and can consist of multiple fields. Refer to the [encoder](#encoder) documentation.

```yml
my_console_appender:
  kind: console
  target: stdout
  tty_only: false
```

#### The File Appender

The _path_ field is required and accepts environment variables of the form `$ENV{name_here}`. The path can be relative or absolute.

The _encoder_ field is optional and can consist of multiple fields. Refer to the [encoder](#encoder) documentation.

The _append_ field is an optional boolean and defaults to `true`. True will append to the log file if it exists, false will truncate the existing file.

```yml
my_file_appender:
  kind: file
  path: $ENV{PWD}/log/test.log
  append: true
```

#### The Rolling File Appender

The rolling file configuration is by far the most complex. Like the [file appender](#the-file-appender), the path to the log file is required, with the _append_ and _encoder_ fields optional. i.e.

```yml
my_rolling_appender:
  kind: rolling_file
  path: "logs/test.log"
  policy:
    kind: compound
    trigger:
      kind: size
      limit: 1mb
    roller:
      kind: fixed_window
      base: 1
      count: 5
      pattern: "logs/test.{}.log"
```

The new component is the _policy_ field. A policy must have the _kind_ field like most other components; the default (and only supported) policy is `kind: compound`.

The _trigger_ field is used to dictate when the log file should be rolled. It supports two types: `size` and `time`.

For `size`, it requires a _limit_ field. The _limit_ field is a string which defines the maximum file size prior to a rolling of the file. The limit field requires one of the following byte units; case does not matter:

- b
- kb/kib
- mb/mib
- gb/gib
- tb/tib

i.e.

```yml
trigger:
  kind: size
  limit: 10 mb
```

For `time`, it has three fields: _interval_, _modulate_, and _max_random_delay_.

The _interval_ field is a string which defines the time to roll the file. The interval field supports the following units (seconds are used if no unit is specified); case does not matter:

- second[s]
- minute[s]
- hour[s]
- day[s]
- week[s]
- month[s]
- year[s]

> Note: `log4j` treats `Sunday` as the first day of the week, but `log4rs` treats
> `Monday` as the first day of the week, which follows the `chrono` crate
> and the `ISO 8601` standard. So when using `week`, the log file will be rolled
> on `Monday` instead of `Sunday`.

The _modulate_ field is an optional boolean. It indicates whether the interval should be adjusted to cause the next rollover to occur on the interval boundary. For example, if the interval is 4 hours and the current hour is 3 am, when true, the first rollover will occur at 4 am and the next ones will occur at 8 am, noon, 4 pm, etc. The default value is false.

The _max_random_delay_ field is an optional integer. It indicates the maximum number of seconds to randomly delay a rollover. By default, this is 0, which indicates no delay. This setting is useful on servers where multiple applications are configured to roll over log files at the same time, as it can spread the load of doing so across time. i.e.
```yml
trigger:
  kind: time
  interval: 1 day
  modulate: false
  max_random_delay: 0
```

The _roller_ field supports two types: `delete` and `fixed_window`. The delete roller does not take any other configuration fields. The fixed_window roller supports three fields: pattern, base, and count. The most current log file will always have the _base_ index.

The _pattern_ field is used to rename files. The pattern must contain at least one instance of the curly-brace pair `{}`, for example `archive/foo.{}.log`. Each instance of `{}` will be replaced with the index number of the archived log file. Note that if the file extension of the pattern is `.gz` and the `gzip` Cargo feature is enabled, the archive files will be gzip-compressed.

> Note: This pattern field is only used for archived files. The `path` field
> of the higher level `rolling_file` will be used for the active log file.

The _base_ field is the starting index used to name rolling files.

The _count_ field is the exclusive maximum index used to name rolling files. However, be warned that the roller renames every file when a log rolls over. Having a large count value can negatively impact performance.

> Note: If you use `trigger: time`, the log file will be rolled before it
> gets written, which ensures that the logs are rolled in the correct position
> instead of leaving a single line of logs in the previous log file. However,
> this may cause a substantial slowdown if the `background_rotation` feature
> is not enabled.

e.g.

```yml
roller:
  kind: fixed_window
  base: 1
  count: 5
  pattern: "archive/journey-service.{}.log"
```

or

```yml
roller:
  kind: delete
```

## Refresh Rate

The _refresh_rate_ accepts a u64 value in seconds. The field is used to determine how often log4rs will scan the configuration file for changes. If a change is discovered, the logger will reconfigure automatically. e.g.
```yml
refresh_rate: 30 seconds
```
log4rs-1.3.0/examples/compile_time_config.rs000064400000000000000000000006231046102023000172170ustar 00000000000000use log::{error, info, trace};
use log4rs;
use serde_yaml;

fn main() {
    let config_str = include_str!("sample_config.yml");
    let config = serde_yaml::from_str(config_str).unwrap();
    log4rs::init_raw_config(config).unwrap();

    info!("Goes to console, file and rolling file");
    error!("Goes to console, file and rolling file");
    trace!("Doesn't go to console as it is filtered out");
}
log4rs-1.3.0/examples/json_logger.rs000064400000000000000000000012251046102023000155330ustar 00000000000000use log::{error, info, warn, LevelFilter};
use log4rs::{
    append::console::ConsoleAppender,
    config::{Appender, Root},
    encode::json::JsonEncoder,
};

fn main() {
    let stdout: ConsoleAppender = ConsoleAppender::builder()
        .encoder(Box::new(JsonEncoder::new()))
        .build();
    let log_config = log4rs::config::Config::builder()
        .appender(Appender::builder().build("stdout", Box::new(stdout)))
        .build(Root::builder().appender("stdout").build(LevelFilter::Info))
        .unwrap();
    log4rs::init_config(log_config).unwrap();

    info!("Info log!");
    warn!("Warn log with value {}", "test");
    error!("ERROR!");
}
log4rs-1.3.0/examples/log_to_file.rs000064400000000000000000000034761046102023000155160ustar 00000000000000use log::{debug, error, info, trace, warn, LevelFilter, SetLoggerError};
use log4rs::{
    append::{
        console::{ConsoleAppender, Target},
        file::FileAppender,
    },
    config::{Appender, Config, Root},
    encode::pattern::PatternEncoder,
    filter::threshold::ThresholdFilter,
};

fn main() -> Result<(), SetLoggerError> {
    let level = log::LevelFilter::Info;
    let file_path = "/tmp/foo.log";

    // Build a stderr logger.
    let stderr = ConsoleAppender::builder().target(Target::Stderr).build();

    // Logging to log file.
    let logfile = FileAppender::builder()
        // Pattern: https://docs.rs/log4rs/*/log4rs/encode/pattern/index.html
        .encoder(Box::new(PatternEncoder::new("{l} - {m}\n")))
        .build(file_path)
        .unwrap();

    // Log Trace level output to file where trace is the default level
    // and the programmatically specified level to stderr.
    let config = Config::builder()
        .appender(Appender::builder().build("logfile", Box::new(logfile)))
        .appender(
            Appender::builder()
                .filter(Box::new(ThresholdFilter::new(level)))
                .build("stderr", Box::new(stderr)),
        )
        .build(
            Root::builder()
                .appender("logfile")
                .appender("stderr")
                .build(LevelFilter::Trace),
        )
        .unwrap();

    // Use this to change log levels at runtime.
    // This means you can change the default log level to trace if you are
    // trying to debug an issue and need more logs, and then turn it off once
    // you are done.
    let _handle = log4rs::init_config(config)?;

    error!("Goes to stderr and file");
    warn!("Goes to stderr and file");
    info!("Goes to stderr and file");
    debug!("Goes to file only");
    trace!("Goes to file only");

    Ok(())
}
log4rs-1.3.0/examples/log_to_file_with_rolling.rs000064400000000000000000000076671046102023000202770ustar 00000000000000//! Example showing how to use logging with a rolling trigger based on size
//!
//! NB: The size used in the example is intentionally small so that multiple
//! files will be created in the 2 seconds that the example is set to run; it
//! is not intended for practical usage

/// This is the size at which a new file should be created.
/// For the demo it is set to 2KB, which is very small and only for demo purposes.
const TRIGGER_FILE_SIZE: u64 = 2 * 1024;

/// Delay between log messages for demo purposes
const TIME_BETWEEN_LOG_MESSAGES: Duration = Duration::from_millis(10);

/// Number of archive log files to keep
const LOG_FILE_COUNT: u32 = 3;

/// Time demo is set to run for (set to be long enough for multiple files to be created)
const RUN_TIME: Duration = Duration::from_secs(2);

/// Location where logs will be written to
const FILE_PATH: &str = "/tmp/foo.log";

/// Location where log archives will be moved to
/// For pattern info see:
/// https://docs.rs/log4rs/*/log4rs/append/rolling_file/policy/compound/roll/fixed_window/struct.FixedWindowRollerBuilder.html#method.build
const ARCHIVE_PATTERN: &str = "/tmp/archive/foo.{}.log";

use std::{
    thread::sleep,
    time::{Duration, Instant},
};

use log::{debug, error, info, trace, warn, LevelFilter, SetLoggerError};
use log4rs::{
    append::{
        console::{ConsoleAppender, Target},
        rolling_file::policy::compound::{
            roll::fixed_window::FixedWindowRoller, trigger::size::SizeTrigger, CompoundPolicy,
        },
    },
    config::{Appender, Config, Root},
    encode::pattern::PatternEncoder,
    filter::threshold::ThresholdFilter,
};

fn main() -> Result<(), SetLoggerError> {
    let level = log::LevelFilter::Info;

    // Build a stderr logger.
    let stderr = ConsoleAppender::builder().target(Target::Stderr).build();

    // Create a policy to use with the file logging
    let trigger = SizeTrigger::new(TRIGGER_FILE_SIZE);
    let roller = FixedWindowRoller::builder()
        .base(0) // Default value; this line is only needed to change the base from 0 (kept here for demo purposes)
        .build(ARCHIVE_PATTERN, LOG_FILE_COUNT) // Roll based on pattern and max 3 archive files
        .unwrap();
    let policy = CompoundPolicy::new(Box::new(trigger), Box::new(roller));

    // Logging to log file. (with rolling)
    let logfile = log4rs::append::rolling_file::RollingFileAppender::builder()
        // Pattern: https://docs.rs/log4rs/*/log4rs/encode/pattern/index.html
        .encoder(Box::new(PatternEncoder::new("{l} - {m}\n")))
        .build(FILE_PATH, Box::new(policy))
        .unwrap();

    // Log Trace level output to file where trace is the default level
    // and the programmatically specified level to stderr.
    let config = Config::builder()
        .appender(Appender::builder().build("logfile", Box::new(logfile)))
        .appender(
            Appender::builder()
                .filter(Box::new(ThresholdFilter::new(level)))
                .build("stderr", Box::new(stderr)),
        )
        .build(
            Root::builder()
                .appender("logfile")
                .appender("stderr")
                .build(LevelFilter::Trace),
        )
        .unwrap();

    // Use this to change log levels at runtime.
    // This means you can change the default log level to trace if you are
    // trying to debug an issue and need more logs, and then turn it off once
    // you are done.
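    //
    // For example (an editorial sketch, not part of the original example;
    // it assumes a second `Config` value named `new_config`, built the same
    // way as `config` above):
    //
    //     let handle = log4rs::init_config(config)?;
    //     // ... later, at runtime:
    //     handle.set_config(new_config);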
    let _handle = log4rs::init_config(config)?;

    error!("Goes to stderr and file");
    warn!("Goes to stderr and file");
    info!("Goes to stderr and file");
    debug!("Goes to file only");
    trace!("Goes to file only");

    // Generate some log messages to trigger rolling
    let instant = Instant::now();
    while instant.elapsed() < RUN_TIME {
        info!("Running for {:?}", instant.elapsed());
        sleep(TIME_BETWEEN_LOG_MESSAGES);
    }

    info!(
        "See '{}' for log and '{}' for archived logs",
        FILE_PATH, ARCHIVE_PATTERN
    );

    Ok(())
}
log4rs-1.3.0/examples/multi_logger.yml000064400000000000000000000006441046102023000160750ustar 00000000000000refresh_rate: 5 seconds

appenders:
  console:
    kind: console
    encoder:
      pattern: "{d(%+)(local)} [{t}] {h({l})} {M}:{m}{n}"
    filters:
      - kind: threshold
        level: error
  file:
    kind: file
    path: info.log
    encoder:
      pattern: "{d} [{t}] {l} {M}:{m}{n}"

root:
  appenders:
    - console

loggers:
  multi_logger_config::a:
    level: info
    appenders:
      - file
    additive: true
log4rs-1.3.0/examples/multi_logger_config.rs000064400000000000000000000006251046102023000172440ustar 00000000000000use std::{default::Default, thread, time::Duration};

use log::{error, warn};
use log4rs;

fn main() {
    log4rs::init_file("examples/multi_logger.yml", Default::default()).unwrap();

    loop {
        thread::sleep(Duration::from_secs(1));
        warn!("main");
        error!("error main");
        a::foo();
    }
}

mod a {
    use log::info;

    pub fn foo() {
        info!("module a");
    }
}
log4rs-1.3.0/examples/sample_config.yml000064400000000000000000000015551046102023000162140ustar 00000000000000appenders:
  stdout:
    kind: console
    encoder:
      pattern: "{d(%+)(utc)} [{f}:{L}] {h({l})} {M}:{m}{n}"
    filters:
      - kind: threshold
        level: info
  file:
    kind: file
    path: "log/file.log"
    encoder:
      pattern: "[{d(%Y-%m-%dT%H:%M:%S%.6f)} {h({l}):<5.5} {M}] {m}{n}"
  rollingfile:
    kind: rolling_file
    path: "log/rolling_file.log"
    encoder:
      pattern: "[{d(%Y-%m-%dT%H:%M:%S%.6f)} {h({l}):<5.5} {M}] {m}{n}"
    policy:
      trigger:
        kind: time
        interval: 1 minute
      roller:
        kind: fixed_window
        pattern: "log/old-rolling_file-{}.log"
        base: 0
        count: 2

root:
  level: info
  appenders:
    - stdout
    - file
    - rollingfile
log4rs-1.3.0/src/append/console.rs000064400000000000000000000156321046102023000151140ustar 00000000000000//! The console appender.
//!
//! Requires the `console_appender` feature.

use derivative::Derivative;
use log::Record;
use std::{
    fmt,
    io::{self, Write},
};

#[cfg(feature = "config_parsing")]
use crate::config::{Deserialize, Deserializers};
#[cfg(feature = "config_parsing")]
use crate::encode::EncoderConfig;

use crate::{
    append::Append,
    encode::{
        self,
        pattern::PatternEncoder,
        writer::{
            console::{ConsoleWriter, ConsoleWriterLock},
            simple::SimpleWriter,
        },
        Encode, Style,
    },
    priv_io::{StdWriter, StdWriterLock},
};

/// The console appender's configuration.
#[cfg(feature = "config_parsing")] #[derive(Debug, serde::Deserialize)] #[serde(deny_unknown_fields)] pub struct ConsoleAppenderConfig { target: Option, encoder: Option, tty_only: Option, } #[cfg(feature = "config_parsing")] #[derive(Debug, serde::Deserialize)] enum ConfigTarget { #[serde(rename = "stdout")] Stdout, #[serde(rename = "stderr")] Stderr, } enum Writer { Tty(ConsoleWriter), Raw(StdWriter), } impl Writer { fn lock(&self) -> WriterLock { match *self { Writer::Tty(ref w) => WriterLock::Tty(w.lock()), Writer::Raw(ref w) => WriterLock::Raw(SimpleWriter(w.lock())), } } fn is_tty(&self) -> bool { // 1.40 compat #[allow(clippy::match_like_matches_macro)] match self { Self::Tty(_) => true, _ => false, } } } enum WriterLock<'a> { Tty(ConsoleWriterLock<'a>), Raw(SimpleWriter>), } impl<'a> io::Write for WriterLock<'a> { fn write(&mut self, buf: &[u8]) -> io::Result { match *self { WriterLock::Tty(ref mut w) => w.write(buf), WriterLock::Raw(ref mut w) => w.write(buf), } } fn flush(&mut self) -> io::Result<()> { match *self { WriterLock::Tty(ref mut w) => w.flush(), WriterLock::Raw(ref mut w) => w.flush(), } } fn write_all(&mut self, buf: &[u8]) -> io::Result<()> { match *self { WriterLock::Tty(ref mut w) => w.write_all(buf), WriterLock::Raw(ref mut w) => w.write_all(buf), } } fn write_fmt(&mut self, fmt: fmt::Arguments) -> io::Result<()> { match *self { WriterLock::Tty(ref mut w) => w.write_fmt(fmt), WriterLock::Raw(ref mut w) => w.write_fmt(fmt), } } } impl<'a> encode::Write for WriterLock<'a> { fn set_style(&mut self, style: &Style) -> io::Result<()> { match *self { WriterLock::Tty(ref mut w) => w.set_style(style), WriterLock::Raw(ref mut w) => w.set_style(style), } } } /// An appender which logs to standard out. /// /// It supports output styling if standard out is a console buffer on Windows /// or is a TTY on Unix. #[derive(Derivative)] #[derivative(Debug)] pub struct ConsoleAppender { #[derivative(Debug = "ignore")] writer: Writer, encoder: Box, do_write: bool, } impl Append for ConsoleAppender { fn append(&self, record: &Record) -> anyhow::Result<()> { if self.do_write { let mut writer = self.writer.lock(); self.encoder.encode(&mut writer, record)?; writer.flush()?; } Ok(()) } fn flush(&self) {} } impl ConsoleAppender { /// Creates a new `ConsoleAppender` builder. pub fn builder() -> ConsoleAppenderBuilder { ConsoleAppenderBuilder { encoder: None, target: Target::Stdout, tty_only: false, } } } /// A builder for `ConsoleAppender`s. pub struct ConsoleAppenderBuilder { encoder: Option>, target: Target, tty_only: bool, } impl ConsoleAppenderBuilder { /// Sets the output encoder for the `ConsoleAppender`. pub fn encoder(mut self, encoder: Box) -> ConsoleAppenderBuilder { self.encoder = Some(encoder); self } /// Sets the output stream to log to. /// /// Defaults to `Target::Stdout`. pub fn target(mut self, target: Target) -> ConsoleAppenderBuilder { self.target = target; self } /// Sets the output to log only when it's a TTY. /// /// Defaults to `false`. pub fn tty_only(mut self, tty_only: bool) -> ConsoleAppenderBuilder { self.tty_only = tty_only; self } /// Consumes the `ConsoleAppenderBuilder`, producing a `ConsoleAppender`. 
pub fn build(self) -> ConsoleAppender { let writer = match self.target { Target::Stderr => match ConsoleWriter::stderr() { Some(writer) => Writer::Tty(writer), None => Writer::Raw(StdWriter::stderr()), }, Target::Stdout => match ConsoleWriter::stdout() { Some(writer) => Writer::Tty(writer), None => Writer::Raw(StdWriter::stdout()), }, }; let do_write = writer.is_tty() || !self.tty_only; ConsoleAppender { writer, encoder: self .encoder .unwrap_or_else(|| Box::::default()), do_write, } } } /// The stream to log to. #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug)] pub enum Target { /// Standard output. Stdout, /// Standard error. Stderr, } /// A deserializer for the `ConsoleAppender`. /// /// # Configuration /// /// ```yaml /// kind: console /// /// # The output to write to. One of `stdout` or `stderr`. Defaults to `stdout`. /// target: stdout /// /// # Set this boolean when the console appender must only write when the target is a TTY. /// tty_only: false /// /// # The encoder to use to format output. Defaults to `kind: pattern`. /// encoder: /// kind: pattern /// ``` #[cfg(feature = "config_parsing")] #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug, Default)] pub struct ConsoleAppenderDeserializer; #[cfg(feature = "config_parsing")] impl Deserialize for ConsoleAppenderDeserializer { type Trait = dyn Append; type Config = ConsoleAppenderConfig; fn deserialize( &self, config: ConsoleAppenderConfig, deserializers: &Deserializers, ) -> anyhow::Result> { let mut appender = ConsoleAppender::builder(); if let Some(target) = config.target { let target = match target { ConfigTarget::Stdout => Target::Stdout, ConfigTarget::Stderr => Target::Stderr, }; appender = appender.target(target); } if let Some(tty_only) = config.tty_only { appender = appender.tty_only(tty_only); } if let Some(encoder) = config.encoder { appender = appender.encoder(deserializers.deserialize(&encoder.kind, encoder.config)?); } Ok(Box::new(appender.build())) } } log4rs-1.3.0/src/append/file.rs000064400000000000000000000122301046102023000143600ustar 00000000000000//! The file appender. //! //! Requires the `file_appender` feature. use derivative::Derivative; use log::Record; use parking_lot::Mutex; use std::{ fs::{self, File, OpenOptions}, io::{self, BufWriter, Write}, path::{Path, PathBuf}, }; #[cfg(feature = "config_parsing")] use crate::config::{Deserialize, Deserializers}; #[cfg(feature = "config_parsing")] use crate::encode::EncoderConfig; use crate::{ append::{env_util::expand_env_vars, Append}, encode::{pattern::PatternEncoder, writer::simple::SimpleWriter, Encode}, }; /// The file appender's configuration. #[cfg(feature = "config_parsing")] #[derive(Clone, Eq, PartialEq, Hash, Debug, Default, serde::Deserialize)] #[serde(deny_unknown_fields)] pub struct FileAppenderConfig { path: String, encoder: Option, append: Option, } /// An appender which logs to a file. #[derive(Derivative)] #[derivative(Debug)] pub struct FileAppender { path: PathBuf, #[derivative(Debug = "ignore")] file: Mutex>>, encoder: Box, } impl Append for FileAppender { fn append(&self, record: &Record) -> anyhow::Result<()> { let mut file = self.file.lock(); self.encoder.encode(&mut *file, record)?; file.flush()?; Ok(()) } fn flush(&self) {} } impl FileAppender { /// Creates a new `FileAppender` builder. pub fn builder() -> FileAppenderBuilder { FileAppenderBuilder { encoder: None, append: true, } } } /// A builder for `FileAppender`s. 
pub struct FileAppenderBuilder { encoder: Option>, append: bool, } impl FileAppenderBuilder { /// Sets the output encoder for the `FileAppender`. pub fn encoder(mut self, encoder: Box) -> FileAppenderBuilder { self.encoder = Some(encoder); self } /// Determines if the appender will append to or truncate the output file. /// /// Defaults to `true`. pub fn append(mut self, append: bool) -> FileAppenderBuilder { self.append = append; self } /// Consumes the `FileAppenderBuilder`, producing a `FileAppender`. /// The path argument can contain environment variables of the form $ENV{name_here}, /// where 'name_here' will be the name of the environment variable that /// will be resolved. Note that if the variable fails to resolve, /// $ENV{name_here} will NOT be replaced in the path. pub fn build>(self, path: P) -> io::Result { let path_cow = path.as_ref().to_string_lossy(); let path: PathBuf = expand_env_vars(path_cow).as_ref().into(); if let Some(parent) = path.parent() { fs::create_dir_all(parent)?; } let file = OpenOptions::new() .write(true) .append(self.append) .truncate(!self.append) .create(true) .open(&path)?; Ok(FileAppender { path, file: Mutex::new(SimpleWriter(BufWriter::with_capacity(1024, file))), encoder: self .encoder .unwrap_or_else(|| Box::::default()), }) } } /// A deserializer for the `FileAppender`. /// /// # Configuration /// /// ```yaml /// kind: file /// /// # The path of the log file. Required. /// # The path can contain environment variables of the form $ENV{name_here}, /// # where 'name_here' will be the name of the environment variable that /// # will be resolved. Note that if the variable fails to resolve, /// # $ENV{name_here} will NOT be replaced in the path. /// path: log/foo.log /// /// # Specifies if the appender should append to or truncate the log file if it /// # already exists. Defaults to `true`. /// append: true /// /// # The encoder to use to format output. Defaults to `kind: pattern`. /// encoder: /// kind: pattern /// ``` #[cfg(feature = "config_parsing")] #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug, Default)] pub struct FileAppenderDeserializer; #[cfg(feature = "config_parsing")] impl Deserialize for FileAppenderDeserializer { type Trait = dyn Append; type Config = FileAppenderConfig; fn deserialize( &self, config: FileAppenderConfig, deserializers: &Deserializers, ) -> anyhow::Result> { let mut appender = FileAppender::builder(); if let Some(append) = config.append { appender = appender.append(append); } if let Some(encoder) = config.encoder { appender = appender.encoder(deserializers.deserialize(&encoder.kind, encoder.config)?); } Ok(Box::new(appender.build(&config.path)?)) } } #[cfg(test)] mod test { use super::*; #[test] fn create_directories() { let tempdir = tempfile::tempdir().unwrap(); FileAppender::builder() .build(tempdir.path().join("foo").join("bar").join("baz.log")) .unwrap(); } #[test] fn append_false() { let tempdir = tempfile::tempdir().unwrap(); FileAppender::builder() .append(false) .build(tempdir.path().join("foo.log")) .unwrap(); } } log4rs-1.3.0/src/append/mod.rs000064400000000000000000000202311046102023000142200ustar 00000000000000//! 
Appenders use log::{Log, Record}; #[cfg(feature = "config_parsing")] use serde::{de, Deserialize, Deserializer}; #[cfg(feature = "config_parsing")] use serde_value::Value; #[cfg(feature = "config_parsing")] use std::collections::BTreeMap; use std::fmt; #[cfg(feature = "config_parsing")] use crate::config::Deserializable; #[cfg(feature = "config_parsing")] use crate::filter::FilterConfig; #[cfg(feature = "console_appender")] pub mod console; #[cfg(feature = "file_appender")] pub mod file; #[cfg(feature = "rolling_file_appender")] pub mod rolling_file; #[cfg(any(feature = "file_appender", feature = "rolling_file_appender"))] mod env_util { use std::borrow::Cow; const ENV_PREFIX: &str = "$ENV{"; const ENV_PREFIX_LEN: usize = ENV_PREFIX.len(); const ENV_SUFFIX: char = '}'; const ENV_SUFFIX_LEN: usize = 1; fn is_env_var_start(c: char) -> bool { // Close replacement for old [\w] // Note that \w implied \d and '_' and non-ASCII letters/digits. c.is_alphanumeric() || c == '_' } fn is_env_var_part(c: char) -> bool { // Close replacement for old [\w\d_.] c.is_alphanumeric() || c == '_' || c == '.' } pub fn expand_env_vars<'str, Str>(path: Str) -> Cow<'str, str> where Str: Into>, { let mut outpath: Cow = path.into(); let path = outpath.clone(); for (match_start, _) in path.match_indices(ENV_PREFIX) { let env_name_start = match_start + ENV_PREFIX_LEN; let (_, tail) = path.split_at(env_name_start); let mut cs = tail.chars(); // Check first character. if let Some(ch) = cs.next() { if is_env_var_start(ch) { let mut env_name = String::new(); env_name.push(ch); // Consume following characters. let valid = loop { match cs.next() { Some(ch) if is_env_var_part(ch) => env_name.push(ch), Some(ENV_SUFFIX) => break true, _ => break false, } }; // Try replacing properly terminated env var. if valid { if let Ok(env_value) = std::env::var(&env_name) { let match_end = env_name_start + env_name.len() + ENV_SUFFIX_LEN; // This simply rewrites the entire outpath with all instances // of this var replaced. Could be done more efficiently by building // `outpath` as we go when processing `path`. Not critical. outpath = outpath .replace(&path[match_start..match_end], &env_value) .into(); } } } } } outpath } } /// A trait implemented by log4rs appenders. /// /// Appenders take a log record and processes them, for example, by writing it /// to a file or the console. pub trait Append: fmt::Debug + Send + Sync + 'static { /// Processes the provided `Record`. fn append(&self, record: &Record) -> anyhow::Result<()>; /// Flushes all in-flight records. fn flush(&self); } #[cfg(feature = "config_parsing")] impl Deserializable for dyn Append { fn name() -> &'static str { "appender" } } impl Append for T { fn append(&self, record: &Record) -> anyhow::Result<()> { self.log(record); Ok(()) } fn flush(&self) { Log::flush(self) } } /// Configuration for an appender. #[cfg(feature = "config_parsing")] #[derive(Clone, Eq, PartialEq, Hash, Debug)] pub struct AppenderConfig { /// The appender kind. pub kind: String, /// The filters attached to the appender. pub filters: Vec, /// The appender configuration. 
pub config: Value, } #[cfg(feature = "config_parsing")] impl<'de> Deserialize<'de> for AppenderConfig { fn deserialize(d: D) -> Result where D: Deserializer<'de>, { let mut map = BTreeMap::::deserialize(d)?; let kind = match map.remove(&Value::String("kind".to_owned())) { Some(kind) => kind.deserialize_into().map_err(|e| e.into_error())?, None => return Err(de::Error::missing_field("kind")), }; let filters = match map.remove(&Value::String("filters".to_owned())) { Some(filters) => filters.deserialize_into().map_err(|e| e.into_error())?, None => vec![], }; Ok(AppenderConfig { kind, filters, config: Value::Map(map), }) } } #[cfg(test)] mod test { #[cfg(any(feature = "file_appender", feature = "rolling_file_appender"))] use std::env::{set_var, var}; #[test] #[cfg(any(feature = "file_appender", feature = "rolling_file_appender"))] fn expand_env_vars_tests() { set_var("HELLO_WORLD", "GOOD BYE"); #[cfg(not(target_os = "windows"))] let test_cases = vec![ ("$ENV{HOME}", var("HOME").unwrap()), ("$ENV{HELLO_WORLD}", var("HELLO_WORLD").unwrap()), ("$ENV{HOME}/test", format!("{}/test", var("HOME").unwrap())), ( "/test/$ENV{HOME}", format!("/test/{}", var("HOME").unwrap()), ), ( "/test/$ENV{HOME}/test", format!("/test/{}/test", var("HOME").unwrap()), ), ( "/test$ENV{HOME}/test", format!("/test{}/test", var("HOME").unwrap()), ), ( "test/$ENV{HOME}/test", format!("test/{}/test", var("HOME").unwrap()), ), ( "/$ENV{HOME}/test/$ENV{USER}", format!("/{}/test/{}", var("HOME").unwrap(), var("USER").unwrap()), ), ( "$ENV{SHOULD_NOT_EXIST}", "$ENV{SHOULD_NOT_EXIST}".to_string(), ), ( "/$ENV{HOME}/test/$ENV{SHOULD_NOT_EXIST}", format!("/{}/test/$ENV{{SHOULD_NOT_EXIST}}", var("HOME").unwrap()), ), ( "/unterminated/$ENV{USER", "/unterminated/$ENV{USER".to_string(), ), ]; #[cfg(target_os = "windows")] let test_cases = vec![ ("$ENV{HOMEPATH}", var("HOMEPATH").unwrap()), ("$ENV{HELLO_WORLD}", var("HELLO_WORLD").unwrap()), ( "$ENV{HOMEPATH}/test", format!("{}/test", var("HOMEPATH").unwrap()), ), ( "/test/$ENV{USERNAME}", format!("/test/{}", var("USERNAME").unwrap()), ), ( "/test/$ENV{USERNAME}/test", format!("/test/{}/test", var("USERNAME").unwrap()), ), ( "/test$ENV{USERNAME}/test", format!("/test{}/test", var("USERNAME").unwrap()), ), ( "test/$ENV{USERNAME}/test", format!("test/{}/test", var("USERNAME").unwrap()), ), ( "$ENV{HOMEPATH}/test/$ENV{USERNAME}", format!( "{}/test/{}", var("HOMEPATH").unwrap(), var("USERNAME").unwrap() ), ), ( "$ENV{SHOULD_NOT_EXIST}", "$ENV{SHOULD_NOT_EXIST}".to_string(), ), ( "$ENV{HOMEPATH}/test/$ENV{SHOULD_NOT_EXIST}", format!("{}/test/$ENV{{SHOULD_NOT_EXIST}}", var("HOMEPATH").unwrap()), ), ( "/unterminated/$ENV{USERNAME", "/unterminated/$ENV{USERNAME".to_string(), ), ]; for (input, expected) in test_cases { let res = super::env_util::expand_env_vars(input); assert_eq!(res, expected) } } } log4rs-1.3.0/src/append/rolling_file/mod.rs000064400000000000000000000330561046102023000166760ustar 00000000000000//! A rolling file appender. //! //! Logging directly to a file can be a dangerous proposition for long running //! processes. You wouldn't want to start a server up and find out a couple //! weeks later that the disk is filled with hundreds of gigabytes of logs! A //! rolling file appender alleviates these issues by limiting the amount of log //! data that's preserved. //! //! Like a normal file appender, a rolling file appender is configured with the //! location of its log file and the encoder which formats log events written //! to it. 
In addition, it holds a "policy" object which controls when a log //! file is rolled over and how the old files are archived. //! //! For example, you may configure an appender to roll the log over once it //! reaches 50 megabytes, and to preserve the last 10 log files. //! //! Requires the `rolling_file_appender` feature. use derivative::Derivative; use log::Record; use parking_lot::Mutex; use std::{ fs::{self, File, OpenOptions}, io::{self, BufWriter, Write}, path::{Path, PathBuf}, }; #[cfg(feature = "config_parsing")] use serde_value::Value; #[cfg(feature = "config_parsing")] use std::collections::BTreeMap; use crate::{ append::Append, encode::{self, pattern::PatternEncoder, Encode}, }; #[cfg(feature = "config_parsing")] use crate::config::{Deserialize, Deserializers}; #[cfg(feature = "config_parsing")] use crate::encode::EncoderConfig; pub mod policy; /// Configuration for the rolling file appender. #[cfg(feature = "config_parsing")] #[derive(Clone, Eq, PartialEq, Hash, Debug, serde::Deserialize)] #[serde(deny_unknown_fields)] pub struct RollingFileAppenderConfig { path: String, append: Option, encoder: Option, policy: Policy, } #[cfg(feature = "config_parsing")] #[derive(Clone, Eq, PartialEq, Hash, Debug)] struct Policy { kind: String, config: Value, } #[cfg(feature = "config_parsing")] impl<'de> serde::Deserialize<'de> for Policy { fn deserialize(d: D) -> Result where D: serde::Deserializer<'de>, { let mut map = BTreeMap::::deserialize(d)?; let kind = match map.remove(&Value::String("kind".to_owned())) { Some(kind) => kind.deserialize_into().map_err(|e| e.to_error())?, None => "compound".to_owned(), }; Ok(Policy { kind, config: Value::Map(map), }) } } #[derive(Debug)] struct LogWriter { file: BufWriter, len: u64, } impl io::Write for LogWriter { fn write(&mut self, buf: &[u8]) -> io::Result { self.file.write(buf).map(|n| { self.len += n as u64; n }) } fn flush(&mut self) -> io::Result<()> { self.file.flush() } } impl encode::Write for LogWriter {} /// Information about the active log file. #[derive(Debug)] pub struct LogFile<'a> { writer: &'a mut Option, path: &'a Path, len: u64, } #[allow(clippy::len_without_is_empty)] impl<'a> LogFile<'a> { /// Returns the path to the log file. pub fn path(&self) -> &Path { self.path } /// Returns an estimate of the log file's current size. /// /// This is calculated by taking the size of the log file when it is opened /// and adding the number of bytes written. It may be inaccurate if any /// writes have failed or if another process has modified the file /// concurrently. #[deprecated(since = "0.9.1", note = "Please use the len_estimate function instead")] pub fn len(&self) -> u64 { self.len } /// Returns an estimate of the log file's current size. /// /// This is calculated by taking the size of the log file when it is opened /// and adding the number of bytes written. It may be inaccurate if any /// writes have failed or if another process has modified the file /// concurrently. pub fn len_estimate(&self) -> u64 { self.len } /// Triggers the log file to roll over. /// /// A policy must call this method when it wishes to roll the log. The /// appender's handle to the file will be closed, which is necessary to /// move or delete the file on Windows. /// /// If this method is called, the log file must no longer be present on /// disk when the policy returns. pub fn roll(&mut self) { *self.writer = None; } } /// An appender which archives log files in a configurable strategy. 
#[derive(Derivative)] #[derivative(Debug)] pub struct RollingFileAppender { #[derivative(Debug = "ignore")] writer: Mutex>, path: PathBuf, append: bool, encoder: Box, policy: Box, } impl Append for RollingFileAppender { fn append(&self, record: &Record) -> anyhow::Result<()> { // TODO(eas): Perhaps this is better as a concurrent queue? let mut writer = self.writer.lock(); let is_pre_process = self.policy.is_pre_process(); let log_writer = self.get_writer(&mut writer)?; if is_pre_process { let len = log_writer.len; let mut file = LogFile { writer: &mut writer, path: &self.path, len, }; // TODO(eas): Idea: make this optionally return a future, and if so, we initialize a queue for // data that comes in while we are processing the file rotation. self.policy.process(&mut file)?; let log_writer_new = self.get_writer(&mut writer)?; self.encoder.encode(log_writer_new, record)?; log_writer_new.flush()?; } else { self.encoder.encode(log_writer, record)?; log_writer.flush()?; let len = log_writer.len; let mut file = LogFile { writer: &mut writer, path: &self.path, len, }; self.policy.process(&mut file)?; } Ok(()) } fn flush(&self) {} } impl RollingFileAppender { /// Creates a new `RollingFileAppenderBuilder`. pub fn builder() -> RollingFileAppenderBuilder { RollingFileAppenderBuilder { append: true, encoder: None, } } fn get_writer<'a>(&self, writer: &'a mut Option) -> io::Result<&'a mut LogWriter> { if writer.is_none() { let file = OpenOptions::new() .write(true) .append(self.append) .truncate(!self.append) .create(true) .open(&self.path)?; let len = if self.append { file.metadata()?.len() } else { 0 }; *writer = Some(LogWriter { file: BufWriter::with_capacity(1024, file), len, }); } // :( unwrap Ok(writer.as_mut().unwrap()) } } /// A builder for the `RollingFileAppender`. pub struct RollingFileAppenderBuilder { append: bool, encoder: Option>, } impl RollingFileAppenderBuilder { /// Determines if the appender will append to or truncate the log file. /// /// Defaults to `true`. pub fn append(mut self, append: bool) -> RollingFileAppenderBuilder { self.append = append; self } /// Sets the encoder used by the appender. /// /// Defaults to a `PatternEncoder` with the default pattern. pub fn encoder(mut self, encoder: Box) -> RollingFileAppenderBuilder { self.encoder = Some(encoder); self } /// Constructs a `RollingFileAppender`. /// The path argument can contain environment variables of the form $ENV{name_here}, /// where 'name_here' will be the name of the environment variable that /// will be resolved. Note that if the variable fails to resolve, /// $ENV{name_here} will NOT be replaced in the path. pub fn build
<P>
( self, path: P, policy: Box, ) -> io::Result where P: AsRef, { let path = super::env_util::expand_env_vars(path.as_ref().to_string_lossy()); let appender = RollingFileAppender { writer: Mutex::new(None), path: path.as_ref().into(), append: self.append, encoder: self .encoder .unwrap_or_else(|| Box::::default()), policy, }; if let Some(parent) = appender.path.parent() { fs::create_dir_all(parent)?; } // open the log file immediately appender.get_writer(&mut appender.writer.lock())?; Ok(appender) } } /// A deserializer for the `RollingFileAppender`. /// /// # Configuration /// /// ```yaml /// kind: rolling_file /// /// # The path of the log file. Required. /// # The path can contain environment variables of the form $ENV{name_here}, /// # where 'name_here' will be the name of the environment variable that /// # will be resolved. Note that if the variable fails to resolve, /// # $ENV{name_here} will NOT be replaced in the path. /// path: log/foo.log /// /// # Specifies if the appender should append to or truncate the log file if it /// # already exists. Defaults to `true`. /// append: true /// /// # The encoder to use to format output. Defaults to `kind: pattern`. /// encoder: /// kind: pattern /// /// # The policy which handles rotation of the log file. Required. /// policy: /// # Identifies which policy is to be used. If no kind is specified, it will /// # default to "compound". /// kind: compound /// /// # The remainder of the configuration is passed along to the policy's /// # deserializer, and will vary based on the kind of policy. /// trigger: /// kind: size /// limit: 10 mb /// /// roller: /// kind: delete /// ``` #[cfg(feature = "config_parsing")] #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug, Default)] pub struct RollingFileAppenderDeserializer; #[cfg(feature = "config_parsing")] impl Deserialize for RollingFileAppenderDeserializer { type Trait = dyn Append; type Config = RollingFileAppenderConfig; fn deserialize( &self, config: RollingFileAppenderConfig, deserializers: &Deserializers, ) -> anyhow::Result> { let mut builder = RollingFileAppender::builder(); if let Some(append) = config.append { builder = builder.append(append); } if let Some(encoder) = config.encoder { let encoder = deserializers.deserialize(&encoder.kind, encoder.config)?; builder = builder.encoder(encoder); } let policy = deserializers.deserialize(&config.policy.kind, config.policy.config)?; let appender = builder.build(config.path, policy)?; Ok(Box::new(appender)) } } #[cfg(test)] mod test { use std::{ fs::File, io::{Read, Write}, }; use super::*; use crate::append::rolling_file::policy::Policy; #[test] #[cfg(feature = "yaml_format")] fn deserialize() { use crate::config::{Deserializers, RawConfig}; let dir = tempfile::tempdir().unwrap(); let config = format!( " appenders: foo: kind: rolling_file path: {0}/foo.log policy: trigger: kind: time interval: 2 minutes roller: kind: delete bar: kind: rolling_file path: {0}/foo.log policy: kind: compound trigger: kind: size limit: 5 mb roller: kind: fixed_window pattern: '{0}/foo.log.{{}}' base: 1 count: 5 ", dir.path().display() ); let config = ::serde_yaml::from_str::(&config).unwrap(); let errors = config.appenders_lossy(&Deserializers::new()).1; println!("{:?}", errors); assert!(errors.is_empty()); } #[derive(Debug)] struct NopPolicy; impl Policy for NopPolicy { fn process(&self, _: &mut LogFile) -> anyhow::Result<()> { Ok(()) } fn is_pre_process(&self) -> bool { false } } #[test] fn append() { let dir = tempfile::tempdir().unwrap(); let path = 
dir.path().join("append.log"); RollingFileAppender::builder() .append(true) .build(&path, Box::new(NopPolicy)) .unwrap(); assert!(path.exists()); File::create(&path).unwrap().write_all(b"hello").unwrap(); RollingFileAppender::builder() .append(true) .build(&path, Box::new(NopPolicy)) .unwrap(); let mut contents = vec![]; File::open(&path) .unwrap() .read_to_end(&mut contents) .unwrap(); assert_eq!(contents, b"hello"); } #[test] fn truncate() { let dir = tempfile::tempdir().unwrap(); let path = dir.path().join("truncate.log"); RollingFileAppender::builder() .append(false) .build(&path, Box::new(NopPolicy)) .unwrap(); assert!(path.exists()); File::create(&path).unwrap().write_all(b"hello").unwrap(); RollingFileAppender::builder() .append(false) .build(&path, Box::new(NopPolicy)) .unwrap(); let mut contents = vec![]; File::open(&path) .unwrap() .read_to_end(&mut contents) .unwrap(); assert_eq!(contents, b""); } } log4rs-1.3.0/src/append/rolling_file/policy/compound/mod.rs000064400000000000000000000107121046102023000220130ustar 00000000000000//! The compound rolling policy. //! //! Requires the `compound_policy` feature. #[cfg(feature = "config_parsing")] use serde::{self, de}; #[cfg(feature = "config_parsing")] use serde_value::Value; #[cfg(feature = "config_parsing")] use std::collections::BTreeMap; use crate::append::rolling_file::{ policy::{compound::roll::Roll, Policy}, LogFile, }; #[cfg(feature = "config_parsing")] use crate::config::{Deserialize, Deserializers}; pub mod roll; pub mod trigger; /// Configuration for the compound policy. #[cfg(feature = "config_parsing")] #[derive(Clone, Eq, PartialEq, Hash, Debug, serde::Deserialize)] #[serde(deny_unknown_fields)] pub struct CompoundPolicyConfig { trigger: Trigger, roller: Roller, } #[cfg(feature = "config_parsing")] #[derive(Clone, Eq, PartialEq, Hash, Debug)] struct Trigger { kind: String, config: Value, } #[cfg(feature = "config_parsing")] impl<'de> serde::Deserialize<'de> for Trigger { fn deserialize(d: D) -> Result where D: serde::Deserializer<'de>, { let mut map = BTreeMap::::deserialize(d)?; let kind = match map.remove(&Value::String("kind".to_owned())) { Some(kind) => kind.deserialize_into().map_err(|e| e.to_error())?, None => return Err(de::Error::missing_field("kind")), }; Ok(Trigger { kind, config: Value::Map(map), }) } } #[cfg(feature = "config_parsing")] #[derive(Clone, Eq, PartialEq, Hash, Debug)] struct Roller { kind: String, config: Value, } #[cfg(feature = "config_parsing")] impl<'de> serde::Deserialize<'de> for Roller { fn deserialize(d: D) -> Result where D: serde::Deserializer<'de>, { let mut map = BTreeMap::::deserialize(d)?; let kind = match map.remove(&Value::String("kind".to_owned())) { Some(kind) => kind.deserialize_into().map_err(|e| e.to_error())?, None => return Err(de::Error::missing_field("kind")), }; Ok(Roller { kind, config: Value::Map(map), }) } } /// A rolling policy which delegates to a "trigger" and "roller". /// /// The trigger determines if the log file should roll, for example, by checking /// the size of the file. The roller processes the old log file, for example, /// by compressing it and moving it to a different location. #[derive(Debug)] pub struct CompoundPolicy { trigger: Box, roller: Box, } impl CompoundPolicy { /// Creates a new `CompoundPolicy`. pub fn new(trigger: Box, roller: Box) -> CompoundPolicy { CompoundPolicy { trigger, roller } } } impl Policy for CompoundPolicy { fn process(&self, log: &mut LogFile) -> anyhow::Result<()> { if self.trigger.trigger(log)? 
{ log.roll(); self.roller.roll(log.path())?; } Ok(()) } fn is_pre_process(&self) -> bool { self.trigger.is_pre_process() } } /// A deserializer for the `CompoundPolicyDeserializer`. /// /// # Configuration /// /// ```yaml /// kind: compound /// /// # The trigger, which determines when the log will roll over. Required. /// trigger: /// /// # Identifies which trigger is to be used. Required. /// kind: size /// /// # The remainder of the configuration is passed to the trigger's /// # deserializer, and will vary based on the kind of trigger. /// limit: 10 mb /// /// # The roller, which processes the old log file. Required. /// roller: /// /// # Identifies which roller is to be used. Required. /// kind: delete /// /// # The remainder of the configuration is passed to the roller's /// # deserializer, and will vary based on the kind of roller. /// ``` #[cfg(feature = "config_parsing")] #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug, Default)] pub struct CompoundPolicyDeserializer; #[cfg(feature = "config_parsing")] impl Deserialize for CompoundPolicyDeserializer { type Trait = dyn Policy; type Config = CompoundPolicyConfig; fn deserialize( &self, config: CompoundPolicyConfig, deserializers: &Deserializers, ) -> anyhow::Result> { let trigger = deserializers.deserialize(&config.trigger.kind, config.trigger.config)?; let roller = deserializers.deserialize(&config.roller.kind, config.roller.config)?; Ok(Box::new(CompoundPolicy::new(trigger, roller))) } } log4rs-1.3.0/src/append/rolling_file/policy/compound/roll/delete.rs000064400000000000000000000027111046102023000234460ustar 00000000000000//! The delete roller. //! //! Requires the `delete_roller` feature. use std::{fs, path::Path}; use crate::append::rolling_file::policy::compound::roll::Roll; #[cfg(feature = "config_parsing")] use crate::config::{Deserialize, Deserializers}; /// Configuration for the delete roller. #[cfg(feature = "config_parsing")] #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug, Default, serde::Deserialize)] #[serde(deny_unknown_fields)] pub struct DeleteRollerConfig { #[serde(skip_deserializing)] _p: (), } /// A roller which deletes the log file. #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug, Default)] pub struct DeleteRoller(()); impl Roll for DeleteRoller { fn roll(&self, file: &Path) -> anyhow::Result<()> { fs::remove_file(file).map_err(Into::into) } } impl DeleteRoller { /// Returns a new `DeleteRoller`. pub fn new() -> Self { Self::default() } } /// A deserializer for the `DeleteRoller`. /// /// # Configuration /// /// ```yaml /// kind: delete /// ``` #[cfg(feature = "config_parsing")] #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug, Default)] pub struct DeleteRollerDeserializer; #[cfg(feature = "config_parsing")] impl Deserialize for DeleteRollerDeserializer { type Trait = dyn Roll; type Config = DeleteRollerConfig; fn deserialize( &self, _: DeleteRollerConfig, _: &Deserializers, ) -> anyhow::Result> { Ok(Box::::default()) } } log4rs-1.3.0/src/append/rolling_file/policy/compound/roll/fixed_window.rs000064400000000000000000000445101046102023000246750ustar 00000000000000//! The fixed-window roller. //! //! Requires the `fixed_window_roller` feature. 
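// Editorial usage sketch (not part of the original source): building a
// `FixedWindowRoller` and rolling a log file by hand. The paths below are
// illustrative only, and since both calls are fallible, this assumes a caller
// that returns `anyhow::Result<()>`.
//
//     use std::path::Path;
//     use log4rs::append::rolling_file::policy::compound::roll::{
//         fixed_window::FixedWindowRoller, Roll,
//     };
//
//     let roller = FixedWindowRoller::builder()
//         .base(0)
//         .build("/tmp/archive/foo.{}.log", 5)?;
//     roller.roll(Path::new("/tmp/foo.log"))?;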
use anyhow::bail; #[cfg(feature = "background_rotation")] use parking_lot::{Condvar, Mutex}; #[cfg(feature = "background_rotation")] use std::sync::Arc; use std::{ fs, io, path::{Path, PathBuf}, }; use crate::append::{env_util::expand_env_vars, rolling_file::policy::compound::roll::Roll}; #[cfg(feature = "config_parsing")] use crate::config::{Deserialize, Deserializers}; /// Configuration for the fixed window roller. #[cfg(feature = "config_parsing")] #[derive(Clone, Eq, PartialEq, Hash, Debug, Default, serde::Deserialize)] #[serde(deny_unknown_fields)] pub struct FixedWindowRollerConfig { pattern: String, base: Option, count: u32, } #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug)] enum Compression { None, #[cfg(feature = "gzip")] Gzip, } impl Compression { fn compress(&self, src: &Path, dst: &str) -> io::Result<()> { match *self { Compression::None => move_file(src, dst), #[cfg(feature = "gzip")] Compression::Gzip => { #[cfg(feature = "flate2")] use flate2::write::GzEncoder; use std::fs::File; let mut i = File::open(src)?; let o = File::create(dst)?; let mut o = GzEncoder::new(o, flate2::Compression::default()); io::copy(&mut i, &mut o)?; drop(o.finish()?); drop(i); // needs to happen before remove_file call on Windows fs::remove_file(src) } } } } /// A roller which maintains a fixed window of archived log files. /// /// A `FixedWindowRoller` is configured with a filename pattern, a base index, /// and a maximum file count. Each archived log file is associated with a numeric /// index ordering it by age, starting at the base index. Archived log files are /// named by substituting all instances of `{}` with the file's index in the /// filename pattern. /// /// For example, if the filename pattern is `archive/foo.{}.log`, the base index /// is 0 and the count is 2, the first log file will be archived as /// `archive/foo.0.log`. When the next log file is archived, `archive/foo.0.log` /// will be renamed to `archive/foo.1.log` and the new log file will be named /// `archive/foo.0.log`. When the third log file is archived, /// `archive/foo.1.log` will be deleted, `archive/foo.0.log` will be renamed to /// `archive/foo.1.log`, and the new log file will be renamed to /// `archive/foo.0.log`. /// /// If the file extension of the pattern is `.gz` and the `gzip` Cargo feature /// is enabled, the archive files will be gzip-compressed. /// /// Note that this roller will have to rename every archived file every time the /// log rolls over. Performance may be negatively impacted by specifying a large /// count. #[derive(Clone, Debug)] pub struct FixedWindowRoller { pattern: String, compression: Compression, base: u32, count: u32, #[cfg(feature = "background_rotation")] cond_pair: Arc<(Mutex, Condvar)>, } impl FixedWindowRoller { /// Returns a new builder for the `FixedWindowRoller`. 
pub fn builder() -> FixedWindowRollerBuilder { FixedWindowRollerBuilder { base: 0 } } } impl Roll for FixedWindowRoller { #[cfg(not(feature = "background_rotation"))] fn roll(&self, file: &Path) -> anyhow::Result<()> { if self.count == 0 { return fs::remove_file(file).map_err(Into::into); } rotate( self.pattern.clone(), self.compression, self.base, self.count, file.to_path_buf(), )?; Ok(()) } #[cfg(feature = "background_rotation")] fn roll(&self, file: &Path) -> anyhow::Result<()> { if self.count == 0 { return fs::remove_file(file).map_err(Into::into); } // rename the file let temp = make_temp_file_name(file); move_file(file, &temp)?; // Wait for the state to be ready to roll let (lock, cvar) = &*self.cond_pair.clone(); let mut ready = lock.lock(); if !*ready { cvar.wait(&mut ready); } *ready = false; drop(ready); let pattern = self.pattern.clone(); let compression = self.compression; let base = self.base; let count = self.count; let cond_pair = self.cond_pair.clone(); // rotate in the separate thread std::thread::spawn(move || { let (lock, cvar) = &*cond_pair; let mut ready = lock.lock(); if let Err(e) = rotate(pattern, compression, base, count, temp) { use std::io::Write; let _ = writeln!(io::stderr(), "log4rs, error rotating: {}", e); } *ready = true; cvar.notify_one(); }); Ok(()) } } fn move_file(src: P, dst: Q) -> io::Result<()> where P: AsRef, Q: AsRef, { // first try a rename match fs::rename(src.as_ref(), dst.as_ref()) { Ok(()) => return Ok(()), Err(ref e) if e.kind() == io::ErrorKind::NotFound => return Ok(()), Err(_) => {} } // fall back to a copy and delete if src and dst are on different mounts fs::copy(src.as_ref(), dst.as_ref()).and_then(|_| fs::remove_file(src.as_ref())) } #[cfg(feature = "background_rotation")] fn make_temp_file_name
<P>
(file: P) -> PathBuf where P: AsRef, { let mut n = std::time::SystemTime::now() .duration_since(std::time::SystemTime::UNIX_EPOCH) .unwrap_or_else(|_| std::time::Duration::from_secs(0)) .as_secs(); let mut temp = file.as_ref().to_path_buf(); temp.set_extension(format!("{}", n)); while temp.exists() { n += 1; temp.set_extension(format!("{}", n)); } temp } // TODO(eas): compress to tmp file then move into place once prev task is done fn rotate( pattern: String, compression: Compression, base: u32, count: u32, file: PathBuf, ) -> io::Result<()> { let dst_0 = expand_env_vars(pattern.replace("{}", &base.to_string())); if let Some(parent) = Path::new(dst_0.as_ref()).parent() { fs::create_dir_all(parent)?; } // In the common case, all of the archived files will be in the same // directory, so avoid extra filesystem calls in that case. let parent_varies = match ( Path::new(dst_0.as_ref()).parent(), Path::new(expand_env_vars(&pattern).as_ref()).parent(), ) { (Some(a), Some(b)) => a != b, _ => false, // Only case that can actually happen is (None, None) }; for i in (base..base + count - 1).rev() { let src = expand_env_vars(pattern.replace("{}", &i.to_string())); let dst = expand_env_vars(pattern.replace("{}", &(i + 1).to_string())); if parent_varies { if let Some(parent) = Path::new(dst.as_ref()).parent() { fs::create_dir_all(parent)?; } } move_file(src.as_ref(), dst.as_ref())?; } compression.compress(&file, &dst_0).map_err(|e| { println!("err compressing: {:?}, dst: {:?}", file, dst_0); e })?; Ok(()) } /// A builder for the `FixedWindowRoller`. #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug, Default)] pub struct FixedWindowRollerBuilder { base: u32, } impl FixedWindowRollerBuilder { /// Sets the base index for archived log files. /// /// Defaults to 0. pub fn base(mut self, base: u32) -> FixedWindowRollerBuilder { self.base = base; self } /// Constructs a new `FixedWindowRoller`. /// /// `pattern` is either an absolute path or lacking a leading `/`, relative /// to the `cwd` of your application. The pattern must contain at least one /// instance of `{}`, all of which will be replaced with an archived log file's index. /// /// If the file extension of the pattern is `.gz` and the `gzip` Cargo /// feature is enabled, the archive files will be gzip-compressed. /// If the extension is `.gz` and the `gzip` feature is *not* enabled, an error will be returned. /// /// `count` is the maximum number of archived logs to maintain. pub fn build(self, pattern: &str, count: u32) -> anyhow::Result { if !pattern.contains("{}") { // Hide {} in this error message from the formatting machinery in bail macro let msg = "pattern does not contain `{}`"; bail!(msg); } let compression = match Path::new(pattern).extension() { #[cfg(feature = "gzip")] Some(e) if e == "gz" => Compression::Gzip, #[cfg(not(feature = "gzip"))] Some(e) if e == "gz" => { bail!("gzip compression requires the `gzip` feature"); } _ => Compression::None, }; Ok(FixedWindowRoller { pattern: pattern.to_owned(), compression, base: self.base, count, #[cfg(feature = "background_rotation")] cond_pair: Arc::new((Mutex::new(true), Condvar::new())), }) } } /// A deserializer for the `FixedWindowRoller`. /// /// # Configuration /// /// ```yaml /// kind: fixed_window /// /// # The filename pattern for archived logs. This is either an absolute path or if lacking a leading `/`, /// # relative to the `cwd` of your application. The pattern must contain at least one /// # instance of `{}`, all of which will be replaced with an archived log file's index. 
/// # If the file extension of the pattern is `.gz` and the `gzip` Cargo feature /// # is enabled, the archive files will be gzip-compressed. /// # Required. /// pattern: archive/foo.{}.log /// /// # The maximum number of archived logs to maintain. Required. /// count: 5 /// /// # The base value for archived log indices. Defaults to 0. /// base: 1 /// ``` #[cfg(feature = "config_parsing")] #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug, Default)] pub struct FixedWindowRollerDeserializer; #[cfg(feature = "config_parsing")] impl Deserialize for FixedWindowRollerDeserializer { type Trait = dyn Roll; type Config = FixedWindowRollerConfig; fn deserialize( &self, config: FixedWindowRollerConfig, _: &Deserializers, ) -> anyhow::Result> { let mut builder = FixedWindowRoller::builder(); if let Some(base) = config.base { builder = builder.base(base); } Ok(Box::new(builder.build(&config.pattern, config.count)?)) } } #[cfg(test)] mod test { use std::{ fs::File, io::{Read, Write}, }; use super::*; use crate::append::rolling_file::policy::compound::roll::Roll; #[cfg(feature = "background_rotation")] fn wait_for_roller(roller: &FixedWindowRoller) { std::thread::sleep(std::time::Duration::from_millis(100)); let _lock = roller.cond_pair.0.lock(); } #[cfg(not(feature = "background_rotation"))] fn wait_for_roller(_roller: &FixedWindowRoller) {} #[test] fn rotation() { let dir = tempfile::tempdir().unwrap(); let base = dir.path().to_str().unwrap(); let roller = FixedWindowRoller::builder() .build(&format!("{}/foo.log.{{}}", base), 2) .unwrap(); let file = dir.path().join("foo.log"); File::create(&file).unwrap().write_all(b"file1").unwrap(); roller.roll(&file).unwrap(); wait_for_roller(&roller); assert!(!file.exists()); let mut contents = vec![]; File::open(dir.path().join("foo.log.0")) .unwrap() .read_to_end(&mut contents) .unwrap(); assert_eq!(contents, b"file1"); File::create(&file).unwrap().write_all(b"file2").unwrap(); roller.roll(&file).unwrap(); wait_for_roller(&roller); assert!(!file.exists()); contents.clear(); File::open(dir.path().join("foo.log.1")) .unwrap() .read_to_end(&mut contents) .unwrap(); assert_eq!(contents, b"file1"); contents.clear(); File::open(dir.path().join("foo.log.0")) .unwrap() .read_to_end(&mut contents) .unwrap(); assert_eq!(contents, b"file2"); File::create(&file).unwrap().write_all(b"file3").unwrap(); roller.roll(&file).unwrap(); wait_for_roller(&roller); assert!(!file.exists()); contents.clear(); assert!(!dir.path().join("foo.log.2").exists()); File::open(dir.path().join("foo.log.1")) .unwrap() .read_to_end(&mut contents) .unwrap(); assert_eq!(contents, b"file2"); contents.clear(); File::open(dir.path().join("foo.log.0")) .unwrap() .read_to_end(&mut contents) .unwrap(); assert_eq!(contents, b"file3"); } #[test] fn rotation_no_trivial_base() { let dir = tempfile::tempdir().unwrap(); let base = 3; let fname = "foo.log"; let fcontent = b"something"; let expected_fist_roll = format!("{}.{}", fname, base); let base_dir = dir.path().to_str().unwrap(); let roller = FixedWindowRoller::builder() .base(base) .build(&format!("{}/{}.{{}}", base_dir, fname), 2) .unwrap(); let file = dir.path().join(fname); File::create(&file).unwrap().write_all(fcontent).unwrap(); roller.roll(&file).unwrap(); wait_for_roller(&roller); assert!(!file.exists()); let mut contents = vec![]; let first_roll = dir.path().join(&expected_fist_roll); assert!(first_roll.as_path().exists()); File::open(first_roll) .unwrap() .read_to_end(&mut contents) .unwrap(); assert_eq!(contents, fcontent); // Sanity check 
general behaviour roller.roll(&file).unwrap(); wait_for_roller(&roller); assert!(!file.exists()); contents.clear(); File::open(dir.path().join(&format!("{}.{}", fname, base + 1))) .unwrap() .read_to_end(&mut contents) .unwrap(); assert_eq!(contents, b"something"); } #[test] fn create_archive_unvaried() { let dir = tempfile::tempdir().unwrap(); let base = dir.path().join("log").join("archive"); let pattern = base.join("foo.{}.log"); let roller = FixedWindowRoller::builder() .build(pattern.to_str().unwrap(), 2) .unwrap(); let file = dir.path().join("foo.log"); File::create(&file).unwrap().write_all(b"file").unwrap(); roller.roll(&file).unwrap(); wait_for_roller(&roller); assert!(base.join("foo.0.log").exists()); let file = dir.path().join("foo.log"); File::create(&file).unwrap().write_all(b"file2").unwrap(); roller.roll(&file).unwrap(); wait_for_roller(&roller); assert!(base.join("foo.0.log").exists()); assert!(base.join("foo.1.log").exists()); } #[test] fn create_archive_varied() { let dir = tempfile::tempdir().unwrap(); let base = dir.path().join("log").join("archive"); let pattern = base.join("{}").join("foo.log"); let roller = FixedWindowRoller::builder() .build(pattern.to_str().unwrap(), 2) .unwrap(); let file = dir.path().join("foo.log"); File::create(&file).unwrap().write_all(b"file").unwrap(); roller.roll(&file).unwrap(); wait_for_roller(&roller); assert!(base.join("0").join("foo.log").exists()); let file = dir.path().join("foo.log"); File::create(&file).unwrap().write_all(b"file2").unwrap(); roller.roll(&file).unwrap(); wait_for_roller(&roller); assert!(base.join("0").join("foo.log").exists()); assert!(base.join("1").join("foo.log").exists()); } #[test] #[cfg_attr(feature = "gzip", ignore)] fn unsupported_gzip() { let dir = tempfile::tempdir().unwrap(); let pattern = dir.path().join("{}.gz"); assert!(FixedWindowRoller::builder() .build(pattern.to_str().unwrap(), 2) .is_err()); } #[test] #[cfg_attr(not(feature = "gzip"), ignore)] // or should we force windows user to install gunzip #[cfg(not(windows))] fn supported_gzip() { use std::process::Command; let dir = tempfile::tempdir().unwrap(); let pattern = dir.path().join("{}.gz"); let roller = FixedWindowRoller::builder() .build(pattern.to_str().unwrap(), 2) .unwrap(); let contents = (0..10000).map(|i| i as u8).collect::>(); let file = dir.path().join("foo.log"); File::create(&file).unwrap().write_all(&contents).unwrap(); roller.roll(&file).unwrap(); wait_for_roller(&roller); assert!(Command::new("gunzip") .arg(dir.path().join("0.gz")) .status() .unwrap() .success()); let mut file = File::open(dir.path().join("0")).unwrap(); let mut actual = vec![]; file.read_to_end(&mut actual).unwrap(); assert_eq!(contents, actual); } #[test] fn roll_with_env_var() { std::env::set_var("LOG_DIR", "test_log_dir"); let fcontent = b"file1"; let dir = tempfile::tempdir().unwrap(); let base = dir.path().to_str().unwrap(); let roller = FixedWindowRoller::builder() .build(&format!("{}/$ENV{{LOG_DIR}}/foo.log.{{}}", base), 2) .unwrap(); let file = dir.path().join("foo.log"); File::create(&file).unwrap().write_all(fcontent).unwrap(); //Check file exists before roll is called assert!(file.exists()); roller.roll(&file).unwrap(); wait_for_roller(&roller); //Check file does not exists after roll is called assert!(!file.exists()); let rolled_file = dir.path().join("test_log_dir").join("foo.log.0"); //Check the new rolled file exists assert!(rolled_file.exists()); let mut contents = vec![]; File::open(rolled_file) .unwrap() .read_to_end(&mut contents) .unwrap(); 
// Check the new rolled file has the same contents as the old one assert_eq!(contents, fcontent); } } log4rs-1.3.0/src/append/rolling_file/policy/compound/roll/mod.rs000064400000000000000000000014351046102023000227650ustar 00000000000000//! Rollers use std::{fmt, path::Path}; #[cfg(feature = "config_parsing")] use crate::config::Deserializable; #[cfg(feature = "delete_roller")] pub mod delete; #[cfg(feature = "fixed_window_roller")] pub mod fixed_window; /// A trait which processes log files after they have been rolled over. pub trait Roll: fmt::Debug + Send + Sync + 'static { /// Processes the log file. /// /// At the time that this method has been called, the log file has already /// been closed. /// /// If this method returns successfully, there *must* no longer be a file /// at the specified location. fn roll(&self, file: &Path) -> anyhow::Result<()>; } #[cfg(feature = "config_parsing")] impl Deserializable for dyn Roll { fn name() -> &'static str { "roller" } } log4rs-1.3.0/src/append/rolling_file/policy/compound/trigger/mod.rs000064400000000000000000000014411046102023000234550ustar 00000000000000//! Triggers use std::fmt; use crate::append::rolling_file::LogFile; #[cfg(feature = "config_parsing")] use crate::config::Deserializable; #[cfg(feature = "size_trigger")] pub mod size; #[cfg(feature = "time_trigger")] pub mod time; /// A trait which identifies if the active log file should be rolled over. pub trait Trigger: fmt::Debug + Send + Sync + 'static { /// Determines if the active log file should be rolled over. fn trigger(&self, file: &LogFile) -> anyhow::Result<bool>; /// Returns the `is_pre_process` flag for log files. /// /// Defaults to true for time triggers and false for size triggers. fn is_pre_process(&self) -> bool; } #[cfg(feature = "config_parsing")] impl Deserializable for dyn Trigger { fn name() -> &'static str { "trigger" } } log4rs-1.3.0/src/append/rolling_file/policy/compound/trigger/size.rs000064400000000000000000000110451046102023000236510ustar 00000000000000//! The size trigger. //! //! Requires the `size_trigger` feature. #[cfg(feature = "config_parsing")] use serde::de; #[cfg(feature = "config_parsing")] use std::fmt; use crate::append::rolling_file::{policy::compound::trigger::Trigger, LogFile}; #[cfg(feature = "config_parsing")] use crate::config::{Deserialize, Deserializers}; /// Configuration for the size trigger.
#[cfg(feature = "config_parsing")] #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug, Default, serde::Deserialize)] #[serde(deny_unknown_fields)] pub struct SizeTriggerConfig { #[serde(deserialize_with = "deserialize_limit")] limit: u64, } #[cfg(feature = "config_parsing")] fn deserialize_limit<'de, D>(d: D) -> Result<u64, D::Error> where D: de::Deserializer<'de>, { struct V; impl<'de2> de::Visitor<'de2> for V { type Value = u64; fn expecting(&self, fmt: &mut fmt::Formatter) -> fmt::Result { fmt.write_str("a size") } fn visit_u64<E>(self, v: u64) -> Result<u64, E> where E: de::Error, { Ok(v) } fn visit_i64<E>(self, v: i64) -> Result<u64, E> where E: de::Error, { if v < 0 { return Err(E::invalid_value( de::Unexpected::Signed(v), &"a non-negative number", )); } Ok(v as u64) } fn visit_str<E>(self, v: &str) -> Result<u64, E> where E: de::Error, { let (number, unit) = match v.find(|c: char| !c.is_ascii_digit()) { Some(n) => (v[..n].trim(), Some(v[n..].trim())), None => (v.trim(), None), }; let number = match number.parse::<u64>() { Ok(n) => n, Err(_) => return Err(E::invalid_value(de::Unexpected::Str(number), &"a number")), }; let unit = match unit { Some(u) => u, None => return Ok(number), }; let number = if unit.eq_ignore_ascii_case("b") { Some(number) } else if unit.eq_ignore_ascii_case("kb") || unit.eq_ignore_ascii_case("kib") { number.checked_mul(1024) } else if unit.eq_ignore_ascii_case("mb") || unit.eq_ignore_ascii_case("mib") { number.checked_mul(1024 * 1024) } else if unit.eq_ignore_ascii_case("gb") || unit.eq_ignore_ascii_case("gib") { number.checked_mul(1024 * 1024 * 1024) } else if unit.eq_ignore_ascii_case("tb") || unit.eq_ignore_ascii_case("tib") { number.checked_mul(1024 * 1024 * 1024 * 1024) } else { return Err(E::invalid_value(de::Unexpected::Str(unit), &"a valid unit")); }; match number { Some(n) => Ok(n), None => Err(E::invalid_value(de::Unexpected::Str(v), &"a byte size")), } } } d.deserialize_any(V) } /// A trigger which rolls the log once it has passed a certain size. #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug, Default)] pub struct SizeTrigger { limit: u64, } impl SizeTrigger { /// Returns a new trigger which rolls the log once it has passed the /// specified size in bytes. pub fn new(limit: u64) -> SizeTrigger { SizeTrigger { limit } } } impl Trigger for SizeTrigger { fn trigger(&self, file: &LogFile) -> anyhow::Result<bool> { Ok(file.len_estimate() > self.limit) } fn is_pre_process(&self) -> bool { false } } /// A deserializer for the `SizeTrigger`. /// /// # Configuration /// /// ```yaml /// kind: size /// /// # The size limit in bytes. The following units are supported (case insensitive): /// # "b", "kb", "kib", "mb", "mib", "gb", "gib", "tb", "tib". The unit defaults to /// # bytes if not specified. Required. /// limit: 10 mb /// ``` #[cfg(feature = "config_parsing")] #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug, Default)] pub struct SizeTriggerDeserializer; #[cfg(feature = "config_parsing")] impl Deserialize for SizeTriggerDeserializer { type Trait = dyn Trigger; type Config = SizeTriggerConfig; fn deserialize( &self, config: SizeTriggerConfig, _: &Deserializers, ) -> anyhow::Result<Box<dyn Trigger>> { Ok(Box::new(SizeTrigger::new(config.limit))) } } #[cfg(test)] mod test { use super::*; #[test] fn pre_process() { let trigger = SizeTrigger::new(2048); assert!(!trigger.is_pre_process()); } } log4rs-1.3.0/src/append/rolling_file/policy/compound/trigger/time.rs000064400000000000000000000422361046102023000236430ustar 00000000000000//! The time trigger. //! //! Requires the `time_trigger` feature.
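//!
//! A sketch of a YAML configuration for this trigger (assuming the
//! `config_parsing` feature is enabled; the field names mirror
//! `TimeTriggerConfig` below, and both `modulate` and `max_random_delay`
//! are optional, defaulting to `false` and `0`):
//!
//! ```yaml
//! kind: time
//! # Roll once per day; see `TimeTriggerInterval` for the accepted units.
//! interval: 1 day
//! # When true, align roll times to interval boundaries.
//! modulate: false
//! # Delay each roll by up to this many seconds, chosen at random.
//! max_random_delay: 0
//! ```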
#[cfg(test)] use chrono::NaiveDateTime; use chrono::{DateTime, Datelike, Duration, Local, TimeZone, Timelike}; #[cfg(test)] use mock_instant::{SystemTime, UNIX_EPOCH}; use rand::Rng; #[cfg(feature = "config_parsing")] use serde::de; #[cfg(feature = "config_parsing")] use std::fmt; use std::sync::RwLock; use crate::append::rolling_file::{policy::compound::trigger::Trigger, LogFile}; #[cfg(feature = "config_parsing")] use crate::config::{Deserialize, Deserializers}; #[cfg(feature = "config_parsing")] /// Configuration for the time trigger. #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug, Default, serde::Deserialize)] #[serde(deny_unknown_fields)] pub struct TimeTriggerConfig { interval: TimeTriggerInterval, #[serde(default)] modulate: bool, #[serde(default)] max_random_delay: u64, } #[cfg(not(feature = "config_parsing"))] /// Configuration for the time trigger. #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug, Default)] pub struct TimeTriggerConfig { interval: TimeTriggerInterval, modulate: bool, max_random_delay: u64, } /// A trigger which rolls the log once it has passed a certain time. #[derive(Debug)] pub struct TimeTrigger { config: TimeTriggerConfig, next_roll_time: RwLock<DateTime<Local>>, } /// The TimeTrigger supports the following units (case insensitive): /// "second", "seconds", "minute", "minutes", "hour", "hours", "day", "days", "week", "weeks", "month", "months", "year", "years". The unit defaults to /// second if not specified. #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug)] pub enum TimeTriggerInterval { /// TimeTrigger in second(s). Second(i64), /// TimeTrigger in minute(s). Minute(i64), /// TimeTrigger in hour(s). Hour(i64), /// TimeTrigger in day(s). Day(i64), /// TimeTrigger in week(s). Week(i64), /// TimeTrigger in month(s). Month(i64), /// TimeTrigger in year(s).
Year(i64), } impl Default for TimeTriggerInterval { fn default() -> Self { TimeTriggerInterval::Second(1) } } #[cfg(feature = "config_parsing")] impl<'de> serde::Deserialize<'de> for TimeTriggerInterval { fn deserialize<D>(d: D) -> Result<Self, D::Error> where D: de::Deserializer<'de>, { struct V; impl<'de2> de::Visitor<'de2> for V { type Value = TimeTriggerInterval; fn expecting(&self, fmt: &mut fmt::Formatter) -> fmt::Result { fmt.write_str("a time") } fn visit_u64<E>(self, v: u64) -> Result<Self::Value, E> where E: de::Error, { Ok(TimeTriggerInterval::Second(v as i64)) } fn visit_i64<E>(self, v: i64) -> Result<Self::Value, E> where E: de::Error, { if v < 0 { return Err(E::invalid_value( de::Unexpected::Signed(v), &"a non-negative number", )); } Ok(TimeTriggerInterval::Second(v)) } fn visit_str<E>(self, v: &str) -> Result<Self::Value, E> where E: de::Error, { let (number, unit) = match v.find(|c: char| !c.is_ascii_digit()) { Some(n) => (v[..n].trim(), Some(v[n..].trim())), None => (v.trim(), None), }; let number = match number.parse::<i64>() { Ok(n) => { if n < 0 { return Err(E::invalid_value( de::Unexpected::Signed(n), &"a non-negative number", )); } n } Err(_) => { return Err(E::invalid_value(de::Unexpected::Str(number), &"a number")) } }; let unit = match unit { Some(u) => u, None => return Ok(TimeTriggerInterval::Second(number)), }; let result = if unit.eq_ignore_ascii_case("second") || unit.eq_ignore_ascii_case("seconds") { Some(TimeTriggerInterval::Second(number)) } else if unit.eq_ignore_ascii_case("minute") || unit.eq_ignore_ascii_case("minutes") { Some(TimeTriggerInterval::Minute(number)) } else if unit.eq_ignore_ascii_case("hour") || unit.eq_ignore_ascii_case("hours") { Some(TimeTriggerInterval::Hour(number)) } else if unit.eq_ignore_ascii_case("day") || unit.eq_ignore_ascii_case("days") { Some(TimeTriggerInterval::Day(number)) } else if unit.eq_ignore_ascii_case("week") || unit.eq_ignore_ascii_case("weeks") { Some(TimeTriggerInterval::Week(number)) } else if unit.eq_ignore_ascii_case("month") || unit.eq_ignore_ascii_case("months") { Some(TimeTriggerInterval::Month(number)) } else if unit.eq_ignore_ascii_case("year") || unit.eq_ignore_ascii_case("years") { Some(TimeTriggerInterval::Year(number)) } else { return Err(E::invalid_value(de::Unexpected::Str(unit), &"a valid unit")); }; match result { Some(n) => Ok(n), None => Err(E::invalid_value(de::Unexpected::Str(v), &"a time")), } } } d.deserialize_any(V) } } impl TimeTrigger { /// Returns a new trigger which rolls the log once it has passed the /// specified time.
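// A worked example of the scheduling math in `get_next_time` below
// (illustrative only): with `TimeTriggerInterval::Hour(3)` and a current
// time of 14:22, `modulate: false` truncates to 14:00 and adds the full
// three hours, so the next roll is at 17:00. With `modulate: true` the
// increment is `3 - 14 % 3 = 1` hour, so the roll lands on the aligned
// boundary at 15:00 (boundaries at 00:00, 03:00, ..., 21:00).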
pub fn new(config: TimeTriggerConfig) -> TimeTrigger { #[cfg(test)] let current = { let now: std::time::Duration = SystemTime::now() .duration_since(UNIX_EPOCH) .expect("system time before Unix epoch"); NaiveDateTime::from_timestamp_opt(now.as_secs() as i64, now.subsec_nanos()) .unwrap() .and_local_timezone(Local) .unwrap() }; #[cfg(not(test))] let current = Local::now(); let next_time = TimeTrigger::get_next_time(current, config.interval, config.modulate); let next_roll_time = if config.max_random_delay > 0 { let random_delay = rand::thread_rng().gen_range(0..config.max_random_delay); next_time + Duration::seconds(random_delay as i64) } else { next_time }; TimeTrigger { config, next_roll_time: RwLock::new(next_roll_time), } } fn get_next_time( current: DateTime<Local>, interval: TimeTriggerInterval, modulate: bool, ) -> DateTime<Local> { let year = current.year(); if let TimeTriggerInterval::Year(n) = interval { let n = n as i32; let increment = if modulate { n - year % n } else { n }; let year_new = year + increment; return Local.with_ymd_and_hms(year_new, 1, 1, 0, 0, 0).unwrap(); } if let TimeTriggerInterval::Month(n) = interval { let month0 = current.month0(); let n = n as u32; let increment = if modulate { n - month0 % n } else { n }; let num_months = (year as u32) * 12 + month0; let num_months_new = num_months + increment; let year_new = (num_months_new / 12) as i32; let month_new = (num_months_new) % 12 + 1; return Local .with_ymd_and_hms(year_new, month_new, 1, 0, 0, 0) .unwrap(); } let month = current.month(); let day = current.day(); if let TimeTriggerInterval::Week(n) = interval { let week0 = current.iso_week().week0() as i64; let weekday = current.weekday().num_days_from_monday() as i64; // Monday is the first day of the week let time = Local.with_ymd_and_hms(year, month, day, 0, 0, 0).unwrap(); let increment = if modulate { n - week0 % n } else { n }; return time + Duration::weeks(increment) - Duration::days(weekday); } if let TimeTriggerInterval::Day(n) = interval { let ordinal0 = current.ordinal0() as i64; let time = Local.with_ymd_and_hms(year, month, day, 0, 0, 0).unwrap(); let increment = if modulate { n - ordinal0 % n } else { n }; return time + Duration::days(increment); } let hour = current.hour(); if let TimeTriggerInterval::Hour(n) = interval { let time = Local .with_ymd_and_hms(year, month, day, hour, 0, 0) .unwrap(); let increment = if modulate { n - (hour as i64) % n } else { n }; return time + Duration::hours(increment); } let min = current.minute(); if let TimeTriggerInterval::Minute(n) = interval { let time = Local .with_ymd_and_hms(year, month, day, hour, min, 0) .unwrap(); let increment = if modulate { n - (min as i64) % n } else { n }; return time + Duration::minutes(increment); } let sec = current.second(); if let TimeTriggerInterval::Second(n) = interval { let time = Local .with_ymd_and_hms(year, month, day, hour, min, sec) .unwrap(); let increment = if modulate { n - (sec as i64) % n } else { n }; return time + Duration::seconds(increment); } panic!("Should not reach here!"); } } impl Trigger for TimeTrigger { fn trigger(&self, _file: &LogFile) -> anyhow::Result<bool> { #[cfg(test)] let current = { let now = SystemTime::now() .duration_since(UNIX_EPOCH) .expect("system time before Unix epoch"); NaiveDateTime::from_timestamp_opt(now.as_secs() as i64, now.subsec_nanos()) .unwrap() .and_local_timezone(Local) .unwrap() }; #[cfg(not(test))] let current: DateTime<Local> = Local::now(); let mut next_roll_time = self.next_roll_time.write().unwrap(); let is_trigger = current >=
*next_roll_time; if is_trigger { let tmp = TimeTrigger::new(self.config); let time_new = tmp.next_roll_time.read().unwrap(); *next_roll_time = *time_new; } Ok(is_trigger) } fn is_pre_process(&self) -> bool { true } } /// A deserializer for the `TimeTrigger`. /// /// # Configuration /// /// ```yaml /// kind: time /// /// # The time interval. The following units are supported (case insensitive): /// # "second(s)", "minute(s)", "hour(s)", "day(s)", "week(s)", "month(s)", "year(s)". The unit defaults to /// # second if not specified. /// interval: 7 day /// ``` #[cfg(feature = "config_parsing")] #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug, Default)] pub(crate) struct TimeTriggerDeserializer; #[cfg(feature = "config_parsing")] impl Deserialize for TimeTriggerDeserializer { type Trait = dyn Trigger; type Config = TimeTriggerConfig; fn deserialize( &self, config: TimeTriggerConfig, _: &Deserializers, ) -> anyhow::Result> { Ok(Box::new(TimeTrigger::new(config))) } } #[cfg(test)] mod test { use super::*; use mock_instant::MockClock; use std::time::Duration; fn trigger_with_time_and_modulate( interval: TimeTriggerInterval, modulate: bool, millis: u64, ) -> (bool, bool) { let file = tempfile::tempdir().unwrap(); let logfile = LogFile { writer: &mut None, path: file.path(), len: 0, }; let config = TimeTriggerConfig { interval, modulate, max_random_delay: 0, }; let trigger = TimeTrigger::new(config); MockClock::advance_system_time(Duration::from_millis(millis / 2)); let result1 = trigger.trigger(&logfile).unwrap(); MockClock::advance_system_time(Duration::from_millis(millis / 2)); let result2 = trigger.trigger(&logfile).unwrap(); (result1, result2) } #[test] fn trigger() { let second_in_milli = 1000; let minute_in_milli = second_in_milli * 60; let hour_in_milli = minute_in_milli * 60; let day_in_milli = hour_in_milli * 24; let week_in_milli = day_in_milli * 7; let month_in_milli = day_in_milli * 31; let year_in_milli = day_in_milli * 365; let test_list = vec![ (TimeTriggerInterval::Second(1), second_in_milli), (TimeTriggerInterval::Minute(1), minute_in_milli), (TimeTriggerInterval::Hour(1), hour_in_milli), (TimeTriggerInterval::Day(1), day_in_milli), (TimeTriggerInterval::Week(1), week_in_milli), (TimeTriggerInterval::Month(1), month_in_milli), (TimeTriggerInterval::Year(1), year_in_milli), ]; let modulate = false; for (time_trigger_interval, time_in_milli) in test_list.iter() { MockClock::set_system_time(Duration::from_millis(4 * day_in_milli)); // 1970/1/5 00:00:00 Monday assert_eq!( trigger_with_time_and_modulate(*time_trigger_interval, modulate, *time_in_milli), (false, true) ); // trigger will be aligned with units. 
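// Why `(false, true)` above: the mocked clock starts exactly on an
// interval boundary, so the first check (half an interval later) has not
// yet reached the next boundary, while the second (a full interval later)
// has. The `(true, false)` cases below start the trigger mid-interval:
// the first check already crosses the aligned boundary, and rolling
// re-arms the trigger for the following interval.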
MockClock::set_system_time( Duration::from_millis(4 * day_in_milli) + Duration::from_millis(time_in_milli / 2), ); assert_eq!( trigger_with_time_and_modulate(*time_trigger_interval, modulate, *time_in_milli), (true, false) ); } let test_list = vec![ (TimeTriggerInterval::Second(3), 3 * second_in_milli), (TimeTriggerInterval::Minute(3), 3 * minute_in_milli), (TimeTriggerInterval::Hour(3), 3 * hour_in_milli), (TimeTriggerInterval::Day(3), 3 * day_in_milli), (TimeTriggerInterval::Week(3), 3 * week_in_milli), (TimeTriggerInterval::Month(3), 3 * month_in_milli), (TimeTriggerInterval::Year(3), 3 * year_in_milli), ]; let modulate = true; for (time_trigger_interval, time_in_milli) in test_list.iter() { MockClock::set_system_time(Duration::from_millis( 59 * day_in_milli + 2 * hour_in_milli + 2 * minute_in_milli + 2 * second_in_milli, )); // 1970/3/1 02:02:02 Sunday assert_eq!( trigger_with_time_and_modulate(*time_trigger_interval, modulate, *time_in_milli), (true, false) ); } } #[test] #[cfg(feature = "yaml_format")] fn test_serde() { let test_error = vec![ "abc", // str "", // none "5 das", // bad unit "-1", // negative integer "2.0", // float ]; for interval in test_error.iter() { let error = ::serde_yaml::from_str::<TimeTriggerInterval>(&interval); assert!(error.is_err()); } let test_ok = vec![ // u64 ("1", TimeTriggerInterval::Second(1)), // str second ("1 second", TimeTriggerInterval::Second(1)), ("1 seconds", TimeTriggerInterval::Second(1)), // str minute ("1 minute", TimeTriggerInterval::Minute(1)), ("1 minutes", TimeTriggerInterval::Minute(1)), // str hour ("1 hour", TimeTriggerInterval::Hour(1)), ("1 hours", TimeTriggerInterval::Hour(1)), // str day ("1 day", TimeTriggerInterval::Day(1)), ("1 days", TimeTriggerInterval::Day(1)), // str week ("1 week", TimeTriggerInterval::Week(1)), ("1 weeks", TimeTriggerInterval::Week(1)), // str month ("1 month", TimeTriggerInterval::Month(1)), ("1 months", TimeTriggerInterval::Month(1)), // str year ("1 year", TimeTriggerInterval::Year(1)), ("1 years", TimeTriggerInterval::Year(1)), ]; for (interval, expected) in test_ok.iter() { let interval = format!("{}", interval); let interval = ::serde_yaml::from_str::<TimeTriggerInterval>(&interval).unwrap(); assert_eq!(interval, *expected); } } #[test] fn test_time_trigger_limit_default() { let interval = TimeTriggerInterval::default(); assert_eq!(interval, TimeTriggerInterval::Second(1)); } #[test] fn pre_process() { let config = TimeTriggerConfig { interval: TimeTriggerInterval::Minute(2), modulate: true, max_random_delay: 0, }; let trigger = TimeTrigger::new(config); assert!(trigger.is_pre_process()); } } log4rs-1.3.0/src/append/rolling_file/policy/mod.rs000064400000000000000000000014211046102023000201640ustar 00000000000000//! Policies. use std::fmt; use crate::append::rolling_file::LogFile; #[cfg(feature = "config_parsing")] use crate::config::Deserializable; #[cfg(feature = "compound_policy")] pub mod compound; /// A trait implementing a rolling policy for a `RollingFileAppender`. pub trait Policy: Sync + Send + 'static + fmt::Debug { /// Rolls the current log file, if necessary. /// /// This method is called after each log event. It is provided a reference /// to the current log file.
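// A minimal sketch of a custom `Policy` implementation (`NeverRoll` is a
// hypothetical type shown for illustration; real configurations normally
// use `CompoundPolicy` from the `compound` module):
//
// #[derive(Debug)]
// struct NeverRoll;
//
// impl Policy for NeverRoll {
//     fn process(&self, _log: &mut LogFile) -> anyhow::Result<()> {
//         // Inspect the file and roll it here when needed; this sketch
//         // always leaves the log file in place.
//         Ok(())
//     }
//     fn is_pre_process(&self) -> bool {
//         false
//     }
// }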
fn process(&self, log: &mut LogFile) -> anyhow::Result<()>; /// Return the config `Trigger.is_pre_process` value fn is_pre_process(&self) -> bool; } #[cfg(feature = "config_parsing")] impl Deserializable for dyn Policy { fn name() -> &'static str { "policy" } } log4rs-1.3.0/src/config/file.rs000064400000000000000000000152021046102023000143600ustar 00000000000000use std::{ fs, path::{Path, PathBuf}, thread, time::{Duration, SystemTime}, }; use thiserror::Error; use super::{init_config, Config, Deserializers, Handle, RawConfig}; use crate::handle_error; /// Initializes the global logger as a log4rs logger configured via a file. /// /// Configuration is read from a file located at the provided path on the /// filesystem and components are created from the provided `Deserializers`. /// /// Any nonfatal errors encountered when processing the configuration are /// reported to stderr. /// /// Requires the `file` feature (enabled by default). pub fn init_file
<P>
(path: P, deserializers: Deserializers) -> anyhow::Result<()> where P: AsRef<Path>, { let path = path.as_ref().to_path_buf(); let format = Format::from_path(&path)?; let source = read_config(&path)?; // An Err here could come because mtime isn't available, so don't bail let modified = fs::metadata(&path).and_then(|m| m.modified()).ok(); let config = format.parse(&source)?; let refresh_rate = config.refresh_rate(); let config = deserialize(&config, &deserializers); match init_config(config) { Ok(handle) => { if let Some(refresh_rate) = refresh_rate { ConfigReloader::start( path, format, refresh_rate, source, modified, deserializers, handle, ); } Ok(()) } Err(e) => Err(e.into()), } } /// Loads a log4rs logger configuration from a file. /// /// Unlike `init_file`, this function does not initialize the logger; it only /// loads the `Config` and returns it. pub fn load_config_file
<P>
(path: P, deserializers: Deserializers) -> anyhow::Result<Config> where P: AsRef<Path>, { let path = path.as_ref(); let format = Format::from_path(path)?; let source = read_config(path)?; let config = format.parse(&source)?; Ok(deserialize(&config, &deserializers)) } /// The various types of formatting errors that can be generated. #[derive(Debug, Error)] pub enum FormatError { /// The YAML feature flag was missing. #[error("the `yaml_format` feature is required for YAML support")] YamlFeatureFlagRequired, /// The JSON feature flag was missing. #[error("the `json_format` feature is required for JSON support")] JsonFeatureFlagRequired, /// The TOML feature flag was missing. #[error("the `toml_format` feature is required for TOML support")] TomlFeatureFlagRequired, /// An unsupported format was specified. #[error("unsupported file format `{0}`")] UnsupportedFormat(String), /// Log4rs could not determine the file format. #[error("unable to determine the file format")] UnknownFormat, } #[derive(Debug)] enum Format { #[cfg(feature = "yaml_format")] Yaml, #[cfg(feature = "json_format")] Json, #[cfg(feature = "toml_format")] Toml, } impl Format { fn from_path(path: &Path) -> anyhow::Result<Format> { match path.extension().and_then(|s| s.to_str()) { #[cfg(feature = "yaml_format")] Some("yaml") | Some("yml") => Ok(Format::Yaml), #[cfg(not(feature = "yaml_format"))] Some("yaml") | Some("yml") => Err(FormatError::YamlFeatureFlagRequired.into()), #[cfg(feature = "json_format")] Some("json") => Ok(Format::Json), #[cfg(not(feature = "json_format"))] Some("json") => Err(FormatError::JsonFeatureFlagRequired.into()), #[cfg(feature = "toml_format")] Some("toml") => Ok(Format::Toml), #[cfg(not(feature = "toml_format"))] Some("toml") => Err(FormatError::TomlFeatureFlagRequired.into()), Some(f) => Err(FormatError::UnsupportedFormat(f.to_string()).into()), None => Err(FormatError::UnknownFormat.into()), } } #[allow(unused_variables)] fn parse(&self, source: &str) -> anyhow::Result<RawConfig> { match *self { #[cfg(feature = "yaml_format")] Format::Yaml => ::serde_yaml::from_str(source).map_err(Into::into), #[cfg(feature = "json_format")] Format::Json => ::serde_json::from_str(source).map_err(Into::into), #[cfg(feature = "toml_format")] Format::Toml => ::toml::from_str(source).map_err(Into::into), } } } fn read_config(path: &Path) -> anyhow::Result<String> { let s = fs::read_to_string(path)?; Ok(s) } fn deserialize(config: &RawConfig, deserializers: &Deserializers) -> Config { let (appenders, mut errors) = config.appenders_lossy(deserializers); errors.handle(); let (config, mut errors) = Config::builder() .appenders(appenders) .loggers(config.loggers()) .build_lossy(config.root()); errors.handle(); config } struct ConfigReloader { path: PathBuf, format: Format, source: String, modified: Option<SystemTime>, deserializers: Deserializers, handle: Handle, } impl ConfigReloader { fn start( path: PathBuf, format: Format, rate: Duration, source: String, modified: Option<SystemTime>, deserializers: Deserializers, handle: Handle, ) { let mut reloader = ConfigReloader { path, format, source, modified, deserializers, handle, }; thread::Builder::new() .name("log4rs refresh".to_owned()) .spawn(move || reloader.run(rate)) .unwrap(); } fn run(&mut self, mut rate: Duration) { loop { thread::sleep(rate); match self.run_once(rate) { Ok(Some(r)) => rate = r, Ok(None) => break, Err(e) => handle_error(&e), } } } fn run_once(&mut self, rate: Duration) -> anyhow::Result<Option<Duration>> { if let Some(last_modified) = self.modified { let modified = fs::metadata(&self.path).and_then(|m| m.modified())?; if
last_modified == modified { return Ok(Some(rate)); } self.modified = Some(modified); } let source = read_config(&self.path)?; if source == self.source { return Ok(Some(rate)); } self.source = source; let config = self.format.parse(&self.source)?; let rate = config.refresh_rate(); let config = deserialize(&config, &self.deserializers); self.handle.set_config(config); Ok(rate) } } log4rs-1.3.0/src/config/mod.rs000064400000000000000000000063001046102023000142170ustar 00000000000000//! All things pertaining to log4rs config. #![doc = include_str!("../../docs/Configuration.md")] use log::SetLoggerError; use thiserror::Error; use crate::Handle; pub mod runtime; #[cfg(feature = "config_parsing")] mod file; #[cfg(feature = "config_parsing")] mod raw; pub use runtime::{Appender, Config, Logger, Root}; #[cfg(feature = "config_parsing")] pub use self::file::{init_file, load_config_file, FormatError}; #[cfg(feature = "config_parsing")] pub use self::raw::{Deserializable, Deserialize, Deserializers, RawConfig}; /// Initializes the global logger as a log4rs logger with the provided config. /// /// A `Handle` object is returned which can be used to adjust the logging /// configuration. pub fn init_config(config: runtime::Config) -> Result<Handle, SetLoggerError> { let logger = crate::Logger::new(config); log::set_max_level(logger.max_log_level()); let handle = Handle { shared: logger.0.clone(), }; log::set_boxed_logger(Box::new(logger)).map(|()| handle) } /// Initializes the global logger as a log4rs logger with the provided config and error handler. /// /// A `Handle` object is returned which can be used to adjust the logging /// configuration. pub fn init_config_with_err_handler( config: runtime::Config, err_handler: Box<dyn Fn(&anyhow::Error) + Send + Sync>, ) -> Result<Handle, SetLoggerError> { let logger = crate::Logger::new_with_err_handler(config, err_handler); log::set_max_level(logger.max_log_level()); let handle = Handle { shared: logger.0.clone(), }; log::set_boxed_logger(Box::new(logger)).map(|()| handle) } /// Create a log4rs logger using the provided raw config. /// /// This will return errors if the appenders configuration is malformed. #[cfg(feature = "config_parsing")] pub fn create_raw_config(config: RawConfig) -> Result<crate::Logger, InitError> { let (appenders, errors) = config.appenders_lossy(&Deserializers::default()); if !errors.is_empty() { return Err(InitError::Deserializing(errors)); } let config = Config::builder() .appenders(appenders) .loggers(config.loggers()) .build(config.root())?; Ok(crate::Logger::new(config)) } /// Initializes the global logger as a log4rs logger using the provided raw config. /// /// This will return errors if the appenders configuration is malformed or if we fail to set the global logger. #[cfg(feature = "config_parsing")] pub fn init_raw_config(config: RawConfig) -> Result<(), InitError> { let logger = create_raw_config(config)?; log::set_max_level(logger.max_log_level()); log::set_boxed_logger(Box::new(logger))?; Ok(()) } /// Errors found when initializing. #[derive(Debug, Error)] pub enum InitError { /// There was an error deserializing. #[error("Errors found when deserializing the config: {0:#?}")] #[cfg(feature = "config_parsing")] Deserializing(#[from] raw::AppenderErrors), /// There was an error building the handle. #[error("Config building errors: {0:#?}")] BuildConfig(#[from] runtime::ConfigErrors), /// There was an error setting the global logger. #[error("Error setting the logger: {0:#?}")] SetLogger(#[from] log::SetLoggerError), } log4rs-1.3.0/src/config/raw.rs000064400000000000000000000373361046102023000142440ustar 00000000000000//!
Support for log4rs configuration from files. //! //! Multiple file formats are supported, each requiring a Cargo feature to be //! enabled. YAML support requires the `yaml_format` feature, JSON support requires //! the `json_format` feature, and TOML support requires the `toml_format` feature. //! //! # Syntax //! //! All file formats currently share the same structure. The example below is //! of the YAML format. //! //! ```yaml //! # If set, log4rs will scan the file at the specified rate for changes and //! # automatically reconfigure the logger. The input string is parsed by the //! # humantime crate. //! refresh_rate: 30 seconds //! //! # The "appenders" map contains the set of appenders, indexed by their names. //! appenders: //! //! foo: //! //! # All appenders must specify a "kind", which will be used to look up the //! # logic to construct the appender in the `Deserializers` passed to the //! # deserialization function. //! kind: console //! //! # Filters attached to an appender are specified inside the "filters" //! # array. //! filters: //! //! - //! # Like appenders, filters are identified by their "kind". //! kind: threshold //! //! # The remainder of the configuration is passed along to the //! # filter's builder, and will vary based on the kind of filter. //! level: error //! //! # The remainder of the configuration is passed along to the appender's //! # builder, and will vary based on the kind of appender. //! # Appenders will commonly be associated with an encoder. //! encoder: //! //! # Like appenders, encoders are identified by their "kind". //! # //! # Default: pattern //! kind: pattern //! //! # The remainder of the configuration is passed along to the //! # encoder's builder, and will vary based on the kind of encoder. //! pattern: "{d} [{t}] {m}{n}" //! //! # The root logger is configured by the "root" map. //! root: //! //! # The maximum log level for the root logger. //! # //! # Default: warn //! level: warn //! //! # The list of appenders attached to the root logger. //! # //! # Default: empty list //! appenders: //! - foo //! //! # The "loggers" map contains the set of configured loggers, indexed by their //! # names. //! loggers: //! //! foo::bar::baz: //! //! # The maximum log level. //! # //! # Default: parent logger's level //! level: trace //! //! # The list of appenders attached to the logger. //! # //! # Default: empty list //! appenders: //! - foo //! //! # The additivity of the logger. If true, appenders attached to the logger's //! # parent will also be attached to this logger. //! # Default: true //! additive: false //! ``` #![allow(deprecated)] use std::{ borrow::ToOwned, collections::HashMap, fmt, marker::PhantomData, sync::Arc, time::Duration, }; use anyhow::anyhow; use derivative::Derivative; use log::LevelFilter; use serde::de::{self, Deserialize as SerdeDeserialize, DeserializeOwned}; use serde_value::Value; use thiserror::Error; use typemap_ors::{Key, ShareCloneMap}; use crate::{append::AppenderConfig, config}; #[allow(unused_imports)] use crate::append; #[cfg(any(feature = "json_encoder", feature = "pattern_encoder"))] use crate::encode; #[cfg(feature = "threshold_filter")] use crate::filter; /// A trait implemented by traits which are deserializable. pub trait Deserializable: 'static { /// Returns a name for objects implementing the trait suitable for display in error messages. /// /// For example, the `Deserializable` implementation for the `Append` trait returns "appender". 
fn name() -> &'static str; } /// A trait for objects that can deserialize log4rs components out of a config. pub trait Deserialize: Send + Sync + 'static { /// The trait that this deserializer will create. type Trait: ?Sized + Deserializable; /// This deserializer's configuration. type Config: DeserializeOwned; /// Create a new trait object based on the provided config. fn deserialize( &self, config: Self::Config, deserializers: &Deserializers, ) -> anyhow::Result<Box<Self::Trait>>; } trait ErasedDeserialize: Send + Sync + 'static { type Trait: ?Sized; fn deserialize( &self, config: Value, deserializers: &Deserializers, ) -> anyhow::Result<Box<Self::Trait>>; } struct DeserializeEraser<T>(T); impl<T> ErasedDeserialize for DeserializeEraser<T> where T: Deserialize, { type Trait = T::Trait; fn deserialize( &self, config: Value, deserializers: &Deserializers, ) -> anyhow::Result<Box<Self::Trait>> { let config = config.deserialize_into()?; self.0.deserialize(config, deserializers) } } struct KeyAdaptor<T: ?Sized>(PhantomData<T>); impl<T: ?Sized + 'static> Key for KeyAdaptor<T> { type Value = HashMap<String, Arc<dyn ErasedDeserialize<Trait = T>>>; } /// A container of `Deserialize`rs. #[derive(Clone)] pub struct Deserializers(ShareCloneMap); impl Default for Deserializers { fn default() -> Deserializers { #[allow(unused_mut)] let mut d = Deserializers::empty(); #[cfg(feature = "console_appender")] d.insert("console", append::console::ConsoleAppenderDeserializer); #[cfg(feature = "file_appender")] d.insert("file", append::file::FileAppenderDeserializer); #[cfg(feature = "rolling_file_appender")] d.insert( "rolling_file", append::rolling_file::RollingFileAppenderDeserializer, ); #[cfg(feature = "compound_policy")] d.insert( "compound", append::rolling_file::policy::compound::CompoundPolicyDeserializer, ); #[cfg(feature = "delete_roller")] d.insert( "delete", append::rolling_file::policy::compound::roll::delete::DeleteRollerDeserializer, ); #[cfg(feature = "fixed_window_roller")] d.insert( "fixed_window", append::rolling_file::policy::compound::roll::fixed_window::FixedWindowRollerDeserializer, ); #[cfg(feature = "size_trigger")] d.insert( "size", append::rolling_file::policy::compound::trigger::size::SizeTriggerDeserializer, ); #[cfg(feature = "time_trigger")] d.insert( "time", append::rolling_file::policy::compound::trigger::time::TimeTriggerDeserializer, ); #[cfg(feature = "json_encoder")] d.insert("json", encode::json::JsonEncoderDeserializer); #[cfg(feature = "pattern_encoder")] d.insert("pattern", encode::pattern::PatternEncoderDeserializer); #[cfg(feature = "threshold_filter")] d.insert("threshold", filter::threshold::ThresholdFilterDeserializer); d } } impl Deserializers { /// Creates a `Deserializers` with default mappings. /// /// All are enabled by default. /// /// * Appenders /// * "console" -> `ConsoleAppenderDeserializer` /// * Requires the `console_appender` feature. /// * "file" -> `FileAppenderDeserializer` /// * Requires the `file_appender` feature. /// * "rolling_file" -> `RollingFileAppenderDeserializer` /// * Requires the `rolling_file_appender` feature. /// * Encoders /// * "pattern" -> `PatternEncoderDeserializer` /// * Requires the `pattern_encoder` feature. /// * "json" -> `JsonEncoderDeserializer` /// * Requires the `json_encoder` feature. /// * Filters /// * "threshold" -> `ThresholdFilterDeserializer` /// * Requires the `threshold_filter` feature. /// * Policies /// * "compound" -> `CompoundPolicyDeserializer` /// * Requires the `compound_policy` feature. /// * Rollers /// * "delete" -> `DeleteRollerDeserializer` /// * Requires the `delete_roller` feature.
/// * "fixed_window" -> `FixedWindowRollerDeserializer` /// * Requires the `fixed_window_roller` feature. /// * Triggers /// * "size" -> `SizeTriggerDeserializer` /// * Requires the `size_trigger` feature. /// * "time" -> `TimeTriggerDeserializer` /// * Requires the `time_trigger` feature. pub fn new() -> Deserializers { Deserializers::default() } /// Creates a new `Deserializers` with no mappings. pub fn empty() -> Deserializers { Deserializers(ShareCloneMap::custom()) } /// Adds a mapping from the specified `kind` to a deserializer. pub fn insert<T>(&mut self, kind: &str, deserializer: T) where T: Deserialize, { self.0 .entry::<KeyAdaptor<T::Trait>>() .or_insert_with(HashMap::new) .insert(kind.to_owned(), Arc::new(DeserializeEraser(deserializer))); } /// Deserializes a value of a specific type and kind. pub fn deserialize<T: ?Sized>(&self, kind: &str, config: Value) -> anyhow::Result<Box<T>> where T: Deserializable, { match self.0.get::<KeyAdaptor<T>>().and_then(|m| m.get(kind)) { Some(b) => b.deserialize(config, self), None => Err(anyhow!( "no {} deserializer for kind `{}` registered", T::name(), kind )), } } } #[derive(Debug, Error)] pub enum DeserializingConfigError { #[error("error deserializing appender {0}: {1}")] Appender(String, anyhow::Error), #[error("error deserializing filter attached to appender {0}: {1}")] Filter(String, anyhow::Error), } /// A raw deserializable log4rs configuration. #[derive(Clone, Debug, Default, serde::Deserialize)] #[serde(deny_unknown_fields)] pub struct RawConfig { #[serde(deserialize_with = "de_duration", default)] refresh_rate: Option<Duration>, #[serde(default)] root: Root, #[serde(default)] appenders: HashMap<String, AppenderConfig>, #[serde(default)] loggers: HashMap<String, Logger>, } #[derive(Debug, Error)] #[error("errors deserializing appenders {0:#?}")] pub struct AppenderErrors(Vec<DeserializingConfigError>); impl AppenderErrors { pub fn is_empty(&self) -> bool { self.0.is_empty() } pub fn handle(&mut self) { for error in self.0.drain(..) { crate::handle_error(&error.into()); } } } impl RawConfig { /// Returns the root. pub fn root(&self) -> config::Root { config::Root::builder() .appenders(self.root.appenders.clone()) .build(self.root.level) } /// Returns the loggers. pub fn loggers(&self) -> Vec<config::Logger> { self.loggers .iter() .map(|(name, logger)| { config::Logger::builder() .appenders(logger.appenders.clone()) .additive(logger.additive) .build(name.clone(), logger.level) }) .collect() } /// Returns the appenders. /// /// Any components which fail to be deserialized will be ignored. pub fn appenders_lossy( &self, deserializers: &Deserializers, ) -> (Vec<config::Appender>, AppenderErrors) { let mut appenders = vec![]; let mut errors = vec![]; for (name, appender) in &self.appenders { let mut builder = config::Appender::builder(); for filter in &appender.filters { match deserializers.deserialize(&filter.kind, filter.config.clone()) { Ok(filter) => builder = builder.filter(filter), Err(e) => errors.push(DeserializingConfigError::Filter(name.clone(), e)), } } match deserializers.deserialize(&appender.kind, appender.config.clone()) { Ok(appender) => appenders.push(builder.build(name.clone(), appender)), Err(e) => errors.push(DeserializingConfigError::Appender(name.clone(), e)), } } (appenders, AppenderErrors(errors)) } /// Returns the requested refresh rate.
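// A sketch of parsing a `RawConfig` by hand (assuming the `yaml_format`
// feature so `serde_yaml` is available):
//
// let raw: RawConfig = serde_yaml::from_str(
//     "refresh_rate: 30 seconds\nroot:\n  level: info",
// )?;
// assert_eq!(raw.refresh_rate(), Some(std::time::Duration::from_secs(30)));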
pub fn refresh_rate(&self) -> Option<Duration> { self.refresh_rate } } fn de_duration<'de, D>(d: D) -> Result<Option<Duration>, D::Error> where D: de::Deserializer<'de>, { struct S(Duration); impl<'de2> de::Deserialize<'de2> for S { fn deserialize<D>(d: D) -> Result<S, D::Error> where D: de::Deserializer<'de2>, { struct V; impl<'de3> de::Visitor<'de3> for V { type Value = S; fn expecting(&self, fmt: &mut fmt::Formatter) -> fmt::Result { fmt.write_str("a duration") } fn visit_str<E>(self, v: &str) -> Result<S, E> where E: de::Error, { humantime::parse_duration(v).map(S).map_err(E::custom) } } d.deserialize_any(V) } } Option::<S>::deserialize(d).map(|r| r.map(|s| s.0)) } #[derive(Clone, Debug, Derivative, serde::Deserialize)] #[derivative(Default)] #[serde(deny_unknown_fields)] struct Root { #[serde(default = "root_level_default")] #[derivative(Default(value = "root_level_default()"))] level: LevelFilter, #[serde(default)] appenders: Vec<String>, } fn root_level_default() -> LevelFilter { LevelFilter::Debug } #[derive(serde::Deserialize, Debug, Clone)] #[serde(deny_unknown_fields)] struct Logger { level: LevelFilter, #[serde(default)] appenders: Vec<String>, #[serde(default = "logger_additive_default")] additive: bool, } fn logger_additive_default() -> bool { true } #[cfg(test)] #[allow(unused_imports)] mod test { use std::fs; use super::*; #[test] #[cfg(all(feature = "yaml_format", feature = "threshold_filter"))] fn full_deserialize() { let cfg = r#" refresh_rate: 60 seconds appenders: console: kind: console filters: - kind: threshold level: debug baz: kind: file path: /tmp/baz.log encoder: pattern: "%m" root: appenders: - console level: info loggers: foo::bar::baz: level: warn appenders: - baz additive: false "#; let config = ::serde_yaml::from_str::<RawConfig>(cfg).unwrap(); let errors = config.appenders_lossy(&Deserializers::new()).1; println!("{:?}", errors); assert!(errors.is_empty()); } #[test] #[cfg(feature = "yaml_format")] fn empty() { ::serde_yaml::from_str::<RawConfig>("{}").unwrap(); } #[cfg(windows)] #[allow(dead_code)] const LINE_ENDING: &'static str = "\r\n"; #[cfg(not(windows))] #[allow(dead_code)] const LINE_ENDING: &'static str = "\n"; #[test] #[cfg(feature = "yaml_format")] fn readme_sample_file_is_ok() { let readme = fs::read_to_string("./README.md").expect("README file exists"); let sample_file = &readme[readme .find("log4rs.yaml:") .expect("Sample file exists and is called log4rs.yaml")..]; let config_start_string = format!("{}```yaml{}", LINE_ENDING, LINE_ENDING); let config_end_string = format!("{}```{}", LINE_ENDING, LINE_ENDING); let config_start = sample_file.find(&config_start_string).unwrap() + config_start_string.len(); let config_end = sample_file.find(&config_end_string).unwrap(); let config_str = sample_file[config_start..config_end].trim(); let config = ::serde_yaml::from_str::<RawConfig>(config_str); assert!(config.is_ok()); assert!(config::create_raw_config(config.unwrap()).is_ok()); } } log4rs-1.3.0/src/config/runtime.rs000064400000000000000000000304641046102023000151330ustar 00000000000000//! log4rs configuration use log::LevelFilter; use std::{collections::HashSet, iter::IntoIterator}; use thiserror::Error; use crate::{append::Append, filter::Filter}; /// A log4rs configuration. #[derive(Debug)] pub struct Config { appenders: Vec<Appender>, root: Root, loggers: Vec<Logger>, } impl Config { /// Creates a new `ConfigBuilder`. pub fn builder() -> ConfigBuilder { ConfigBuilder { appenders: vec![], loggers: vec![], } } /// Returns the `Appender`s associated with the `Config`.
pub fn appenders(&self) -> &[Appender] { &self.appenders } /// Returns the `Root` associated with the `Config`. pub fn root(&self) -> &Root { &self.root } /// Returns a mutable handle for the `Root` associated with the `Config`. pub fn root_mut(&mut self) -> &mut Root { &mut self.root } /// Returns the `Logger`s associated with the `Config`. pub fn loggers(&self) -> &[Logger] { &self.loggers } pub(crate) fn unpack(self) -> (Vec<Appender>, Root, Vec<Logger>) { let Config { appenders, root, loggers, } = self; (appenders, root, loggers) } } /// A builder for `Config`s. #[derive(Debug, Default)] pub struct ConfigBuilder { appenders: Vec<Appender>, loggers: Vec<Logger>, } impl ConfigBuilder { /// Adds an appender. pub fn appender(mut self, appender: Appender) -> ConfigBuilder { self.appenders.push(appender); self } /// Adds appenders. pub fn appenders<I>(mut self, appenders: I) -> ConfigBuilder where I: IntoIterator<Item = Appender>, { self.appenders.extend(appenders); self } /// Adds a logger. pub fn logger(mut self, logger: Logger) -> ConfigBuilder { self.loggers.push(logger); self } /// Adds loggers. pub fn loggers<I>(mut self, loggers: I) -> ConfigBuilder where I: IntoIterator<Item = Logger>, { self.loggers.extend(loggers); self } /// Consumes the `ConfigBuilder`, returning the `Config`. /// /// Unlike `build`, this method will always return a `Config` by stripping /// portions of the configuration that are incorrect. pub fn build_lossy(self, mut root: Root) -> (Config, ConfigErrors) { let mut errors: Vec<ConfigError> = vec![]; let ConfigBuilder { appenders, loggers } = self; let mut ok_appenders = vec![]; let mut appender_names = HashSet::new(); for appender in appenders { if appender_names.insert(appender.name.clone()) { ok_appenders.push(appender); } else { errors.push(ConfigError::DuplicateAppenderName(appender.name)); } } let mut ok_root_appenders = vec![]; for appender in root.appenders { if appender_names.contains(&appender) { ok_root_appenders.push(appender); } else { errors.push(ConfigError::NonexistentAppender(appender)); } } root.appenders = ok_root_appenders; let mut ok_loggers = vec![]; let mut logger_names = HashSet::new(); for mut logger in loggers { if !logger_names.insert(logger.name.clone()) { errors.push(ConfigError::DuplicateLoggerName(logger.name)); continue; } if let Err(err) = check_logger_name(&logger.name) { errors.push(err); continue; } let mut ok_logger_appenders = vec![]; for appender in logger.appenders { if appender_names.contains(&appender) { ok_logger_appenders.push(appender); } else { errors.push(ConfigError::NonexistentAppender(appender)); } } logger.appenders = ok_logger_appenders; ok_loggers.push(logger); } let config = Config { appenders: ok_appenders, root, loggers: ok_loggers, }; (config, ConfigErrors(errors)) } /// Consumes the `ConfigBuilder`, returning the `Config`. pub fn build(self, root: Root) -> Result<Config, ConfigErrors> { let (config, errors) = self.build_lossy(root); if errors.is_empty() { Ok(config) } else { Err(errors) } } } /// Configuration for the root logger. #[derive(Debug)] pub struct Root { level: LevelFilter, appenders: Vec<String>, } impl Root { /// Creates a new `RootBuilder` with no appenders. pub fn builder() -> RootBuilder { RootBuilder { appenders: vec![] } } /// Returns the minimum level of log messages that the root logger will accept. pub fn level(&self) -> LevelFilter { self.level } /// Returns the list of names of appenders that will be attached to the root logger. pub fn appenders(&self) -> &[String] { &self.appenders } /// Sets the minimum level of log messages that the root logger will accept.
pub fn set_level(&mut self, level: LevelFilter) { self.level = level; } } /// A builder for `Root`s. #[derive(Clone, Eq, PartialEq, Hash, Debug)] pub struct RootBuilder { appenders: Vec<String>, } impl RootBuilder { /// Adds an appender. pub fn appender<T>(mut self, appender: T) -> RootBuilder where T: Into<String>, { self.appenders.push(appender.into()); self } /// Adds appenders. pub fn appenders<I>(mut self, appenders: I) -> RootBuilder where I: IntoIterator, I::Item: Into<String>, { self.appenders.extend(appenders.into_iter().map(Into::into)); self } /// Consumes the `RootBuilder`, returning the `Root`. pub fn build(self, level: LevelFilter) -> Root { Root { level, appenders: self.appenders, } } } /// Configuration for an appender. #[derive(Debug)] pub struct Appender { name: String, appender: Box<dyn Append>, filters: Vec<Box<dyn Filter>>, } impl Appender { /// Creates a new `AppenderBuilder`; the name and `Append` trait object are supplied to `build`. pub fn builder() -> AppenderBuilder { AppenderBuilder { filters: vec![] } } /// Returns the name of the appender. pub fn name(&self) -> &str { &self.name } /// Returns the appender. pub fn appender(&self) -> &dyn Append { &*self.appender } /// Returns the filters attached to the appender. pub fn filters(&self) -> &[Box<dyn Filter>] { &self.filters } pub(crate) fn unpack(self) -> (String, Box<dyn Append>, Vec<Box<dyn Filter>>) { let Appender { name, appender, filters, } = self; (name, appender, filters) } } /// A builder for `Appender`s. #[derive(Debug)] pub struct AppenderBuilder { filters: Vec<Box<dyn Filter>>, } impl AppenderBuilder { /// Adds a filter. pub fn filter(mut self, filter: Box<dyn Filter>) -> AppenderBuilder { self.filters.push(filter); self } /// Adds filters. pub fn filters<I>(mut self, filters: I) -> AppenderBuilder where I: IntoIterator<Item = Box<dyn Filter>>, { self.filters.extend(filters); self } /// Consumes the `AppenderBuilder`, returning the `Appender`. pub fn build<T>(self, name: T, appender: Box<dyn Append>) -> Appender where T: Into<String>, { Appender { name: name.into(), appender, filters: self.filters, } } } /// Configuration for a logger. #[derive(Clone, Eq, PartialEq, Hash, Debug)] pub struct Logger { name: String, level: LevelFilter, appenders: Vec<String>, additive: bool, } impl Logger { /// Creates a new `LoggerBuilder`; the name and level are supplied to `build`. /// /// There are initially no appenders attached and `additive` is `true`. pub fn builder() -> LoggerBuilder { LoggerBuilder { appenders: vec![], additive: true, } } /// Returns the name of the logger. pub fn name(&self) -> &str { &self.name } /// Returns the minimum level of log messages that the logger will accept. pub fn level(&self) -> LevelFilter { self.level } /// Returns the list of names of appenders that will be attached to the logger. pub fn appenders(&self) -> &[String] { &self.appenders } /// Determines if appenders of parent loggers will also be attached to this logger. pub fn additive(&self) -> bool { self.additive } } /// A builder for `Logger`s. #[derive(Clone, Eq, PartialEq, Hash, Debug, Default)] pub struct LoggerBuilder { appenders: Vec<String>, additive: bool, } impl LoggerBuilder { /// Adds an appender. pub fn appender<T>(mut self, appender: T) -> LoggerBuilder where T: Into<String>, { self.appenders.push(appender.into()); self } /// Adds appenders. pub fn appenders<I>(mut self, appenders: I) -> LoggerBuilder where I: IntoIterator, I::Item: Into<String>, { self.appenders.extend(appenders.into_iter().map(Into::into)); self } /// Sets the additivity of the logger. pub fn additive(mut self, additive: bool) -> LoggerBuilder { self.additive = additive; self } /// Consumes the `LoggerBuilder`, returning the `Logger`.
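// A sketch of assembling a `Config` with these builders (assumes the
// `console_appender` feature for `ConsoleAppender`):
//
// use log::LevelFilter;
// use log4rs::append::console::ConsoleAppender;
// use log4rs::config::{Appender, Config, Logger, Root};
//
// let stdout = ConsoleAppender::builder().build();
// let config = Config::builder()
//     .appender(Appender::builder().build("stdout", Box::new(stdout)))
//     .logger(
//         Logger::builder()
//             .appender("stdout")
//             .additive(false)
//             .build("app::db", LevelFilter::Warn),
//     )
//     .build(Root::builder().appender("stdout").build(LevelFilter::Info))?;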
pub fn build(self, name: T, level: LevelFilter) -> Logger where T: Into, { Logger { name: name.into(), level, appenders: self.appenders, additive: self.additive, } } } fn check_logger_name(name: &str) -> Result<(), ConfigError> { if name.is_empty() { return Err(ConfigError::InvalidLoggerName(name.to_owned())); } let mut streak = 0; for ch in name.chars() { if ch == ':' { streak += 1; if streak > 2 { return Err(ConfigError::InvalidLoggerName(name.to_owned())); } } else { if streak > 0 && streak != 2 { return Err(ConfigError::InvalidLoggerName(name.to_owned())); } streak = 0; } } if streak > 0 { Err(ConfigError::InvalidLoggerName(name.to_owned())) } else { Ok(()) } } /// Errors encountered when validating a log4rs `Config`. #[derive(Debug, Error)] #[error("Configuration errors: {0:#?}")] pub struct ConfigErrors(Vec); impl ConfigErrors { /// There were no config errors. pub fn is_empty(&self) -> bool { self.0.is_empty() } /// Returns a slice of `Error`s. pub fn errors(&self) -> &[ConfigError] { &self.0 } /// Handle non-fatal errors (by logging them to stderr.) pub fn handle(&mut self) { for e in self.0.drain(..) { crate::handle_error(&e.into()); } } } /// An error validating a log4rs `Config`. #[derive(Debug, Error)] pub enum ConfigError { /// Multiple appenders were registered with the same name. #[error("Duplicate appender name `{0}`")] DuplicateAppenderName(String), /// A reference to a nonexistant appender. #[error("Reference to nonexistent appender: `{0}`")] NonexistentAppender(String), /// Multiple loggers were registered with the same name. #[error("Duplicate logger name `{0}`")] DuplicateLoggerName(String), /// A logger name was invalid. #[error("Invalid logger name `{0}`")] InvalidLoggerName(String), #[doc(hidden)] #[error("Reserved for future use")] __Extensible, } #[cfg(test)] mod test { #[test] fn check_logger_name() { let tests = [ ("", false), ("asdf", true), ("asdf::jkl", true), ("::", false), ("asdf::jkl::", false), ("asdf:jkl", false), ("asdf:::jkl", false), ("asdf::jkl::", false), ]; for &(ref name, expected) in &tests { assert!( expected == super::check_logger_name(name).is_ok(), "{}", name ); } } } log4rs-1.3.0/src/encode/json.rs000064400000000000000000000135761046102023000144160ustar 00000000000000//! An encoder which writes a JSON object. //! //! Each log event will be written as a JSON object on its own line. //! //! Requires the `json_encoder` feature. //! //! # Contents //! //! An example object (note that real output will not be pretty-printed): //! //! ```json //! { //! "time": "2016-03-20T14:22:20.644420340-08:00", //! "message": "the log message", //! "module_path": "foo::bar", //! "file": "foo/bar/mod.rs", //! "line": 100, //! "level": "INFO", //! "target": "foo::bar", //! "thread": "main", //! "thread_id": 123, //! "mdc": { //! "request_id": "123e4567-e89b-12d3-a456-426655440000" //! } //! } //! ``` use chrono::{ format::{DelayedFormat, Fixed, Item}, DateTime, Local, }; use log::{Level, Record}; use serde::ser::{self, Serialize, SerializeMap}; use std::{fmt, option, thread}; #[cfg(feature = "config_parsing")] use crate::config::{Deserialize, Deserializers}; use crate::encode::{Encode, Write, NEWLINE}; /// The JSON encoder's configuration #[cfg(feature = "config_parsing")] #[derive(Clone, Eq, PartialEq, Hash, Debug, Default, serde::Deserialize)] #[serde(deny_unknown_fields)] pub struct JsonEncoderConfig { #[serde(skip_deserializing)] _p: (), } /// An `Encode`r which writes a JSON object. 
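// A sketch of attaching this encoder to an appender (assumes the
// `file_appender` feature for `FileAppender`):
//
// use log4rs::append::file::FileAppender;
// use log4rs::encode::json::JsonEncoder;
//
// let appender = FileAppender::builder()
//     .encoder(Box::new(JsonEncoder::new()))
//     .build("log/app.jsonl")?;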
#[derive(Copy, Clone, Eq, PartialEq, Hash, Debug, Default)] pub struct JsonEncoder(()); impl JsonEncoder { /// Returns a new `JsonEncoder` with a default configuration. pub fn new() -> Self { Self::default() } } impl JsonEncoder { fn encode_inner( &self, w: &mut dyn Write, time: DateTime<Local>, record: &Record, ) -> anyhow::Result<()> { let thread = thread::current(); let message = Message { time: time.format_with_items(Some(Item::Fixed(Fixed::RFC3339)).into_iter()), level: record.level(), message: record.args(), module_path: record.module_path(), file: record.file(), line: record.line(), target: record.target(), thread: thread.name(), thread_id: thread_id::get(), mdc: Mdc, }; message.serialize(&mut serde_json::Serializer::new(&mut *w))?; w.write_all(NEWLINE.as_bytes())?; Ok(()) } } impl Encode for JsonEncoder { fn encode(&self, w: &mut dyn Write, record: &Record) -> anyhow::Result<()> { self.encode_inner(w, Local::now(), record) } } #[derive(serde::Serialize)] struct Message<'a> { #[serde(serialize_with = "ser_display")] time: DelayedFormat<option::IntoIter<Item<'a>>>, level: Level, #[serde(serialize_with = "ser_display")] message: &'a fmt::Arguments<'a>, #[serde(skip_serializing_if = "Option::is_none")] module_path: Option<&'a str>, #[serde(skip_serializing_if = "Option::is_none")] file: Option<&'a str>, #[serde(skip_serializing_if = "Option::is_none")] line: Option<u32>, target: &'a str, thread: Option<&'a str>, thread_id: usize, mdc: Mdc, } fn ser_display<T, S>(v: &T, s: S) -> Result<S::Ok, S::Error> where T: fmt::Display, S: ser::Serializer, { s.collect_str(v) } struct Mdc; impl ser::Serialize for Mdc { fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error> where S: ser::Serializer, { let mut map = serializer.serialize_map(None)?; let mut err = Ok(()); log_mdc::iter(|k, v| { if let Ok(()) = err { err = map.serialize_key(k).and_then(|()| map.serialize_value(v)); } }); err?; map.end() } } /// A deserializer for the `JsonEncoder`.
/// /// # Configuration /// /// ```yaml /// kind: json /// ``` #[cfg(feature = "config_parsing")] #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug, Default)] pub struct JsonEncoderDeserializer; #[cfg(feature = "config_parsing")] impl Deserialize for JsonEncoderDeserializer { type Trait = dyn Encode; type Config = JsonEncoderConfig; fn deserialize( &self, _: JsonEncoderConfig, _: &Deserializers, ) -> anyhow::Result> { Ok(Box::::default()) } } #[cfg(test)] #[cfg(feature = "simple_writer")] mod test { #[cfg(feature = "chrono")] use chrono::{DateTime, Local}; use log::Level; use super::*; use crate::encode::writer::simple::SimpleWriter; #[test] fn default() { let time = DateTime::parse_from_rfc3339("2016-03-20T14:22:20.644420340-08:00") .unwrap() .with_timezone(&Local); let level = Level::Debug; let target = "target"; let module_path = "module_path"; let file = "file"; let line = 100; let message = "message"; let thread = "encode::json::test::default"; log_mdc::insert("foo", "bar"); let encoder = JsonEncoder::new(); let mut buf = vec![]; encoder .encode_inner( &mut SimpleWriter(&mut buf), time, &Record::builder() .level(level) .target(target) .module_path(Some(module_path)) .file(Some(file)) .line(Some(line)) .args(format_args!("{}", message)) .build(), ) .unwrap(); let expected = format!( "{{\"time\":\"{}\",\"level\":\"{}\",\"message\":\"{}\",\"module_path\":\"{}\",\ \"file\":\"{}\",\"line\":{},\"target\":\"{}\",\ \"thread\":\"{}\",\"thread_id\":{},\"mdc\":{{\"foo\":\"bar\"}}}}", time.to_rfc3339(), level, message, module_path, file, line, target, thread, thread_id::get(), ); assert_eq!(expected, String::from_utf8(buf).unwrap().trim()); } } log4rs-1.3.0/src/encode/mod.rs000064400000000000000000000075731046102023000142240ustar 00000000000000//! Encoders use derivative::Derivative; use log::Record; use std::{fmt, io}; #[cfg(feature = "config_parsing")] use serde::de; #[cfg(feature = "config_parsing")] use serde_value::Value; #[cfg(feature = "config_parsing")] use std::collections::BTreeMap; #[cfg(feature = "config_parsing")] use crate::config::Deserializable; #[cfg(feature = "json_encoder")] pub mod json; #[cfg(feature = "pattern_encoder")] pub mod pattern; pub mod writer; #[allow(dead_code)] #[cfg(windows)] const NEWLINE: &'static str = "\r\n"; #[allow(dead_code)] #[cfg(not(windows))] const NEWLINE: &str = "\n"; /// A trait implemented by types that can serialize a `Record` into a /// `Write`r. /// /// `Encode`rs are commonly used by `Append`ers to format a log record for /// output. pub trait Encode: fmt::Debug + Send + Sync + 'static { /// Encodes the `Record` into bytes and writes them. fn encode(&self, w: &mut dyn Write, record: &Record) -> anyhow::Result<()>; } #[cfg(feature = "config_parsing")] impl Deserializable for dyn Encode { fn name() -> &'static str { "encoder" } } /// Configuration for an encoder. #[cfg(feature = "config_parsing")] #[derive(Clone, Eq, PartialEq, Hash, Debug)] pub struct EncoderConfig { /// The encoder's kind. pub kind: String, /// The encoder's configuration. pub config: Value, } #[cfg(feature = "config_parsing")] impl<'de> de::Deserialize<'de> for EncoderConfig { fn deserialize(d: D) -> Result where D: de::Deserializer<'de>, { let mut map = BTreeMap::::deserialize(d)?; let kind = match map.remove(&Value::String("kind".to_owned())) { Some(kind) => kind.deserialize_into().map_err(|e| e.to_error())?, None => "pattern".to_owned(), }; Ok(EncoderConfig { kind, config: Value::Map(map), }) } } /// A text or background color. 
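// A sketch of how `Color` is used together with `Style` (defined below):
// an encoder builds a style and hands it to the writer, which applies the
// parts it supports and ignores the rest.
//
// let mut style = Style::new();
// style.text(Color::Red).intense(true);
// writer.set_style(&style)?;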
#[allow(missing_docs)] #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug)] pub enum Color { Black, Red, Green, Yellow, Blue, Magenta, Cyan, White, } /// The style applied to text output. /// /// Any fields set to `None` will be set to their default format, as defined /// by the `Write`r. #[derive(Derivative)] #[derivative(Debug)] #[derive(Clone, Eq, PartialEq, Hash, Default)] pub struct Style { /// The text (or foreground) color. pub text: Option, /// The background color. pub background: Option, /// True if the text should have increased intensity. pub intense: Option, #[derivative(Debug = "ignore")] _p: (), } impl Style { /// Returns a `Style` with all fields set to their defaults. pub fn new() -> Style { Style::default() } /// Sets the text color. pub fn text(&mut self, text: Color) -> &mut Style { self.text = Some(text); self } /// Sets the background color. pub fn background(&mut self, background: Color) -> &mut Style { self.background = Some(background); self } /// Sets the text intensity. pub fn intense(&mut self, intense: bool) -> &mut Style { self.intense = Some(intense); self } } /// A trait for types that an `Encode`r will write to. /// /// It extends `std::io::Write` and adds some extra functionality. pub trait Write: io::Write { /// Sets the output text style, if supported. /// /// `Write`rs should ignore any parts of the `Style` they do not support. /// /// The default implementation returns `Ok(())`. Implementations that do /// not support styling should do this as well. #[allow(unused_variables)] fn set_style(&mut self, style: &Style) -> io::Result<()> { Ok(()) } } impl<'a, W: Write + ?Sized> Write for &'a mut W { fn set_style(&mut self, style: &Style) -> io::Result<()> { ::set_style(*self, style) } } log4rs-1.3.0/src/encode/pattern/mod.rs000064400000000000000000001055161046102023000156750ustar 00000000000000//! A simple pattern-based encoder. //! //! Requires the `pattern_encoder` feature. //! //! The pattern syntax is similar to Rust's string formatting syntax. It //! consists of raw text interspersed with format arguments. The grammar is: //! //! ```not_rust //! format_string := [ format ] * //! format := '{' formatter [ ':' format_spec ] '}' //! formatter := [ name ] [ '(' argument ')' ] * //! name := identifier //! argument := format_string //! //! format_spec := [ [ fill ] align ] [ min_width ] [ '.' max_width ] //! fill := character //! align := '<' | '>' //! min_width := number //! max_width := number //! ``` //! //! # Special characters //! //! The `{`, `}`, `(`, `)`, and `\` characters are part of the pattern syntax; //! they must be escaped to appear in output. Like with Rust's string //! formatting syntax, type the character twice to escape it. That is, `{{` //! will be rendered as `{` in output and `))` will be rendered as `)`. //! //! In addition, these characters may also be escaped by prefixing them with a //! `\` character. That is, `\{` will be rendered as `{`. //! //! # Formatters //! //! A formatter inserts a dynamic portion of text into the pattern. It may be //! text derived from a log event or from some other context like the current //! time. Formatters may be passed arguments consisting of parenthesized format //! strings. //! //! The following formatters are currently supported. Unless otherwise stated, //! a formatter does not accept any argument. //! //! * `d`, `date` - The current time. By default, the ISO 8601 format is used. //! A custom format may be provided in the syntax accepted by `chrono`. //! 
//!   The timezone defaults to local, but can be specified explicitly by
//!   passing a second argument of `utc` for UTC or `local` for local time.
//!     * `{d}` - `2016-03-20T14:22:20.644420340-08:00`
//!     * `{d(%Y-%m-%d %H:%M:%S)}` - `2016-03-20 14:22:20`
//!     * `{d(%Y-%m-%d %H:%M:%S %Z)(utc)}` - `2016-03-20 22:22:20 UTC`
//! * `f`, `file` - The source file that the log message came from, or `???` if
//!   not provided.
//! * `h`, `highlight` - Styles its argument according to the log level. The
//!   style is intense red for errors, yellow for warnings, green for info,
//!   cyan for trace, and the default style for all other levels.
//!     * `{h(the level is {l})}` - `the level is ERROR`, rendered in the
//!       style for the record's level
//! * `D`, `debug` - Outputs its arguments ONLY in debug build.
//! * `R`, `release` - Outputs its arguments ONLY in release build.
//! * `l`, `level` - The log level.
//! * `L`, `line` - The line that the log message came from, or `???` if not
//!   provided.
//! * `m`, `message` - The log message.
//! * `M`, `module` - The module that the log message came from, or `???` if not
//!   provided.
//! * `P`, `pid` - The current process id.
//! * `i`, `tid` - The current system-wide unique thread ID.
//! * `n` - A platform-specific newline.
//! * `t`, `target` - The target of the log message.
//! * `T`, `thread` - The name of the current thread.
//! * `I`, `thread_id` - The pthread ID of the current thread.
//! * `X`, `mdc` - A value from the [MDC][MDC]. The first argument specifies
//!   the key, and the second argument specifies the default value if the
//!   key is not present in the MDC. The second argument is optional, and
//!   defaults to the empty string.
//!     * `{X(user_id)}` - `123e4567-e89b-12d3-a456-426655440000`
//!     * `{X(nonexistent_key)(no mapping)}` - `no mapping`
//! * An "unnamed" formatter simply formats its argument, applying the format
//!   specification.
//!     * `{({l} {m})}` - `INFO hello`
//!
//! # Format Specification
//!
//! The format specification determines how the output of a formatter is
//! adjusted before being returned.
//!
//! ## Fill/Alignment
//!
//! The fill and alignment values are used in conjunction with a minimum width
//! value (see below) to control the behavior when a formatter's output is less
//! than the minimum. While the default behavior is to pad the output to the
//! right with space characters (i.e. left align it), the fill value specifies
//! the character used, and the alignment value is one of:
//!
//! * `<` - Left align by appending the fill character to the formatter output
//! * `>` - Right align by prepending the fill character to the formatter
//!   output.
//!
//! ## Width
//!
//! By default, the full contents of a formatter's output will be inserted into
//! the pattern output, but both the minimum and maximum lengths can be
//! configured. Any output over the maximum length will be truncated, and
//! output under the minimum length will be padded (see above).
//!
//! # Examples
//!
//! The default pattern is `{d} {l} {t} - {m}{n}` which produces output like
//! `2016-03-20T22:22:20.644420340+00:00 INFO module::path - this is a log
//! message`.
//!
//! The pattern `{m:>10.15}` will right-align the log message to a minimum of
//! 10 bytes, filling in with space characters, and truncate output after 15
//! bytes. The message `hello` will therefore be displayed as
//! `     hello`, while the message `hello there, world!` will be
//! displayed as `hello there, wo`.
//!
//! The pattern `{({l} {m}):15.15}` will output the log level and message
limited to exactly 15 bytes, padding with space characters on the right if //! necessary. The message `hello` and log level `INFO` will be displayed as //! INFO hello , while the message `hello, world!` and log //! level `DEBUG` will be truncated to `DEBUG hello, wo`. //! //! [MDC]: https://crates.io/crates/log-mdc use chrono::{Local, Utc}; use derivative::Derivative; use log::{Level, Record}; use std::{default::Default, io, process, thread}; use crate::encode::{ self, pattern::parser::{Alignment, Parameters, Parser, Piece}, Color, Encode, Style, NEWLINE, }; #[cfg(feature = "config_parsing")] use crate::config::{Deserialize, Deserializers}; mod parser; thread_local!( /// Thread-locally cached thread ID. static TID: usize = thread_id::get() ); /// The pattern encoder's configuration. #[cfg(feature = "config_parsing")] #[derive(Clone, Eq, PartialEq, Hash, Debug, Default, serde::Deserialize)] #[serde(deny_unknown_fields)] pub struct PatternEncoderConfig { pattern: Option, } fn is_char_boundary(b: u8) -> bool { b as i8 >= -0x40 } fn char_starts(buf: &[u8]) -> usize { buf.iter().filter(|&&b| is_char_boundary(b)).count() } struct MaxWidthWriter<'a> { remaining: usize, w: &'a mut dyn encode::Write, } impl<'a> io::Write for MaxWidthWriter<'a> { fn write(&mut self, buf: &[u8]) -> io::Result { let mut remaining = self.remaining; let mut end = buf.len(); for (idx, _) in buf .iter() .enumerate() .filter(|&(_, &b)| is_char_boundary(b)) { if remaining == 0 { end = idx; break; } remaining -= 1; } // we don't want to report EOF, so just act as a sink past this point if end == 0 { return Ok(buf.len()); } let buf = &buf[..end]; match self.w.write(buf) { Ok(len) => { if len == end { self.remaining = remaining; } else { self.remaining -= char_starts(&buf[..len]); } Ok(len) } Err(e) => Err(e), } } fn flush(&mut self) -> io::Result<()> { self.w.flush() } } impl<'a> encode::Write for MaxWidthWriter<'a> { fn set_style(&mut self, style: &Style) -> io::Result<()> { self.w.set_style(style) } } struct LeftAlignWriter { to_fill: usize, fill: char, w: W, } impl LeftAlignWriter { fn finish(mut self) -> io::Result<()> { for _ in 0..self.to_fill { write!(self.w, "{}", self.fill)?; } Ok(()) } } impl io::Write for LeftAlignWriter { fn write(&mut self, buf: &[u8]) -> io::Result { match self.w.write(buf) { Ok(len) => { self.to_fill = self.to_fill.saturating_sub(char_starts(&buf[..len])); Ok(len) } Err(e) => Err(e), } } fn flush(&mut self) -> io::Result<()> { self.w.flush() } } impl encode::Write for LeftAlignWriter { fn set_style(&mut self, style: &Style) -> io::Result<()> { self.w.set_style(style) } } enum BufferedOutput { Data(Vec), Style(Style), } struct RightAlignWriter { to_fill: usize, fill: char, w: W, buf: Vec, } impl RightAlignWriter { fn finish(mut self) -> io::Result<()> { for _ in 0..self.to_fill { write!(self.w, "{}", self.fill)?; } for out in self.buf { match out { BufferedOutput::Data(ref buf) => self.w.write_all(buf)?, BufferedOutput::Style(ref style) => self.w.set_style(style)?, } } Ok(()) } } impl io::Write for RightAlignWriter { fn write(&mut self, buf: &[u8]) -> io::Result { self.to_fill = self.to_fill.saturating_sub(char_starts(buf)); let mut pushed = false; if let Some(&mut BufferedOutput::Data(ref mut data)) = self.buf.last_mut() { data.extend_from_slice(buf); pushed = true; }; if !pushed { self.buf.push(BufferedOutput::Data(buf.to_owned())); } Ok(buf.len()) } fn flush(&mut self) -> io::Result<()> { Ok(()) } } impl encode::Write for RightAlignWriter { fn set_style(&mut self, style: &Style) -> 
io::Result<()> { self.buf.push(BufferedOutput::Style(style.clone())); Ok(()) } } #[derive(Clone, Eq, PartialEq, Hash, Debug)] enum Chunk { Text(String), Formatted { chunk: FormattedChunk, params: Parameters, }, Error(String), } impl Chunk { fn encode(&self, w: &mut dyn encode::Write, record: &Record) -> io::Result<()> { match *self { Chunk::Text(ref s) => w.write_all(s.as_bytes()), Chunk::Formatted { ref chunk, ref params, } => match (params.min_width, params.max_width, params.align) { (None, None, _) => chunk.encode(w, record), (None, Some(max_width), _) => { let mut w = MaxWidthWriter { remaining: max_width, w, }; chunk.encode(&mut w, record) } (Some(min_width), None, Alignment::Left) => { let mut w = LeftAlignWriter { to_fill: min_width, fill: params.fill, w, }; chunk.encode(&mut w, record)?; w.finish() } (Some(min_width), None, Alignment::Right) => { let mut w = RightAlignWriter { to_fill: min_width, fill: params.fill, w, buf: vec![], }; chunk.encode(&mut w, record)?; w.finish() } (Some(min_width), Some(max_width), Alignment::Left) => { let mut w = LeftAlignWriter { to_fill: min_width, fill: params.fill, w: MaxWidthWriter { remaining: max_width, w, }, }; chunk.encode(&mut w, record)?; w.finish() } (Some(min_width), Some(max_width), Alignment::Right) => { let mut w = RightAlignWriter { to_fill: min_width, fill: params.fill, w: MaxWidthWriter { remaining: max_width, w, }, buf: vec![], }; chunk.encode(&mut w, record)?; w.finish() } }, Chunk::Error(ref s) => write!(w, "{{ERROR: {}}}", s), } } } impl<'a> From> for Chunk { fn from(piece: Piece<'a>) -> Chunk { match piece { Piece::Text(text) => Chunk::Text(text.to_owned()), Piece::Argument { mut formatter, parameters, } => match formatter.name { "d" | "date" => { if formatter.args.len() > 2 { return Chunk::Error("expected at most two arguments".to_owned()); } let format = match formatter.args.first() { Some(arg) => { let mut format = String::new(); for piece in arg { match *piece { Piece::Text(text) => format.push_str(text), Piece::Argument { .. 
} => { format.push_str("{ERROR: unexpected formatter}"); } Piece::Error(ref err) => { format.push_str("{ERROR: "); format.push_str(err); format.push('}'); } } } format } None => "%+".to_owned(), }; let timezone = match formatter.args.get(1) { Some(arg) => { if let Some(arg) = arg.first() { match *arg { Piece::Text("utc") => Timezone::Utc, Piece::Text("local") => Timezone::Local, Piece::Text(z) => { return Chunk::Error(format!("invalid timezone `{}`", z)); } _ => return Chunk::Error("invalid timezone".to_owned()), } } else { return Chunk::Error("invalid timezone".to_owned()); } } None => Timezone::Local, }; Chunk::Formatted { chunk: FormattedChunk::Time(format, timezone), params: parameters, } } "h" | "highlight" => { if formatter.args.len() != 1 { return Chunk::Error("expected exactly one argument".to_owned()); } let chunks = formatter .args .pop() .unwrap() .into_iter() .map(From::from) .collect(); Chunk::Formatted { chunk: FormattedChunk::Highlight(chunks), params: parameters, } } "D" | "debug" => { if formatter.args.len() != 1 { return Chunk::Error("expected exactly one argument".to_owned()); } let chunks = formatter .args .pop() .unwrap() .into_iter() .map(From::from) .collect(); Chunk::Formatted { chunk: FormattedChunk::Debug(chunks), params: parameters, } } "R" | "release" => { if formatter.args.len() != 1 { return Chunk::Error("expected exactly one argument".to_owned()); } let chunks = formatter .args .pop() .unwrap() .into_iter() .map(From::from) .collect(); Chunk::Formatted { chunk: FormattedChunk::Release(chunks), params: parameters, } } "l" | "level" => no_args(&formatter.args, parameters, FormattedChunk::Level), "m" | "message" => no_args(&formatter.args, parameters, FormattedChunk::Message), "M" | "module" => no_args(&formatter.args, parameters, FormattedChunk::Module), "n" => no_args(&formatter.args, parameters, FormattedChunk::Newline), "f" | "file" => no_args(&formatter.args, parameters, FormattedChunk::File), "L" | "line" => no_args(&formatter.args, parameters, FormattedChunk::Line), "T" | "thread" => no_args(&formatter.args, parameters, FormattedChunk::Thread), "I" | "thread_id" => no_args(&formatter.args, parameters, FormattedChunk::ThreadId), "P" | "pid" => no_args(&formatter.args, parameters, FormattedChunk::ProcessId), "i" | "tid" => no_args(&formatter.args, parameters, FormattedChunk::SystemThreadId), "t" | "target" => no_args(&formatter.args, parameters, FormattedChunk::Target), "X" | "mdc" => { if formatter.args.len() > 2 { return Chunk::Error("expected at most two arguments".to_owned()); } let key = match formatter.args.first() { Some(arg) => { if let Some(arg) = arg.first() { match arg { Piece::Text(key) => key.to_owned(), Piece::Error(ref e) => return Chunk::Error(e.clone()), _ => return Chunk::Error("invalid MDC key".to_owned()), } } else { return Chunk::Error("invalid MDC key".to_owned()); } } None => return Chunk::Error("missing MDC key".to_owned()), }; let default = match formatter.args.get(1) { Some(arg) => { if let Some(arg) = arg.first() { match arg { Piece::Text(key) => key.to_owned(), Piece::Error(ref e) => return Chunk::Error(e.clone()), _ => return Chunk::Error("invalid MDC default".to_owned()), } } else { return Chunk::Error("invalid MDC default".to_owned()); } } None => "", }; Chunk::Formatted { chunk: FormattedChunk::Mdc(key.into(), default.into()), params: parameters, } } "" => { if formatter.args.len() != 1 { return Chunk::Error("expected exactly one argument".to_owned()); } let chunks = formatter .args .pop() .unwrap() .into_iter() 
.map(From::from) .collect(); Chunk::Formatted { chunk: FormattedChunk::Align(chunks), params: parameters, } } name => Chunk::Error(format!("unknown formatter `{}`", name)), }, Piece::Error(err) => Chunk::Error(err), } } } fn no_args(arg: &[Vec], params: Parameters, chunk: FormattedChunk) -> Chunk { if arg.is_empty() { Chunk::Formatted { chunk, params } } else { Chunk::Error("unexpected arguments".to_owned()) } } #[derive(Clone, Eq, PartialEq, Hash, Debug)] enum Timezone { Utc, Local, } #[derive(Clone, Eq, PartialEq, Hash, Debug)] enum FormattedChunk { Time(String, Timezone), Level, Message, Module, File, Line, Thread, ThreadId, ProcessId, SystemThreadId, Target, Newline, Align(Vec), Highlight(Vec), Debug(Vec), Release(Vec), Mdc(String, String), } impl FormattedChunk { fn encode(&self, w: &mut dyn encode::Write, record: &Record) -> io::Result<()> { match *self { FormattedChunk::Time(ref fmt, Timezone::Utc) => write!(w, "{}", Utc::now().format(fmt)), FormattedChunk::Time(ref fmt, Timezone::Local) => { write!(w, "{}", Local::now().format(fmt)) } FormattedChunk::Level => write!(w, "{}", record.level()), FormattedChunk::Message => w.write_fmt(*record.args()), FormattedChunk::Module => w.write_all(record.module_path().unwrap_or("???").as_bytes()), FormattedChunk::File => w.write_all(record.file().unwrap_or("???").as_bytes()), FormattedChunk::Line => match record.line() { Some(line) => write!(w, "{}", line), None => w.write_all(b"???"), }, FormattedChunk::Thread => { w.write_all(thread::current().name().unwrap_or("unnamed").as_bytes()) } FormattedChunk::ThreadId => w.write_all(thread_id::get().to_string().as_bytes()), FormattedChunk::ProcessId => w.write_all(process::id().to_string().as_bytes()), FormattedChunk::SystemThreadId => { TID.with(|tid| w.write_all(tid.to_string().as_bytes())) } FormattedChunk::Target => w.write_all(record.target().as_bytes()), FormattedChunk::Newline => w.write_all(NEWLINE.as_bytes()), FormattedChunk::Align(ref chunks) => { for chunk in chunks { chunk.encode(w, record)?; } Ok(()) } FormattedChunk::Highlight(ref chunks) => { match record.level() { Level::Error => { w.set_style(Style::new().text(Color::Red).intense(true))?; } Level::Warn => w.set_style(Style::new().text(Color::Yellow))?, Level::Info => w.set_style(Style::new().text(Color::Green))?, Level::Trace => w.set_style(Style::new().text(Color::Cyan))?, _ => {} } for chunk in chunks { chunk.encode(w, record)?; } match record.level() { Level::Error | Level::Warn | Level::Info | Level::Trace => { w.set_style(&Style::new())? } _ => {} } Ok(()) } FormattedChunk::Debug(ref chunks) => { if cfg!(debug_assertions) { for chunk in chunks { chunk.encode(w, record)?; } } Ok(()) } FormattedChunk::Release(ref chunks) => { if !cfg!(debug_assertions) { for chunk in chunks { chunk.encode(w, record)?; } } Ok(()) } FormattedChunk::Mdc(ref key, ref default) => { log_mdc::get(key, |v| write!(w, "{}", v.unwrap_or(default))) } } } } /// An `Encode`r configured via a format string. #[derive(Derivative)] #[derivative(Debug)] #[derive(Clone, Eq, PartialEq, Hash)] pub struct PatternEncoder { #[derivative(Debug = "ignore")] chunks: Vec, pattern: String, } /// Returns a `PatternEncoder` using the default pattern of `{d} {l} {t} - {m}{n}`. 
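///
/// A sketch of the equivalent explicit construction:
///
/// ```
/// use log4rs::encode::pattern::PatternEncoder;
///
/// let encoder = PatternEncoder::new("{d} {l} {t} - {m}{n}");
/// ```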
impl Default for PatternEncoder { fn default() -> PatternEncoder { PatternEncoder::new("{d} {l} {t} - {m}{n}") } } impl Encode for PatternEncoder { fn encode(&self, w: &mut dyn encode::Write, record: &Record) -> anyhow::Result<()> { for chunk in &self.chunks { chunk.encode(w, record)?; } Ok(()) } } impl PatternEncoder { /// Creates a `PatternEncoder` from a pattern string. /// /// The pattern string syntax is documented in the `pattern` module. pub fn new(pattern: &str) -> PatternEncoder { PatternEncoder { chunks: Parser::new(pattern).map(From::from).collect(), pattern: pattern.to_owned(), } } } /// A deserializer for the `PatternEncoder`. /// /// # Configuration /// /// ```yaml /// kind: pattern /// /// # The pattern to follow when formatting logs. Defaults to /// # "{d} {l} {t} - {m}{n}". /// pattern: "{d} {l} {t} - {m}{n}" /// ``` #[cfg(feature = "config_parsing")] #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug)] pub struct PatternEncoderDeserializer; #[cfg(feature = "config_parsing")] impl Deserialize for PatternEncoderDeserializer { type Trait = dyn Encode; type Config = PatternEncoderConfig; fn deserialize( &self, config: PatternEncoderConfig, _: &Deserializers, ) -> anyhow::Result> { let encoder = match config.pattern { Some(pattern) => PatternEncoder::new(&pattern), None => PatternEncoder::default(), }; Ok(Box::new(encoder)) } } #[cfg(test)] mod tests { #[cfg(feature = "simple_writer")] use log::{Level, Record}; #[cfg(feature = "simple_writer")] use std::process; #[cfg(feature = "simple_writer")] use std::thread; use super::{Chunk, PatternEncoder}; #[cfg(feature = "simple_writer")] use crate::encode::writer::simple::SimpleWriter; #[cfg(feature = "simple_writer")] use crate::encode::Encode; fn error_free(encoder: &PatternEncoder) -> bool { encoder.chunks.iter().all(|c| match *c { Chunk::Error(_) => false, _ => true, }) } #[test] fn invalid_formatter() { assert!(!error_free(&PatternEncoder::new("{x}"))); } #[test] fn unclosed_delimiter() { assert!(!error_free(&PatternEncoder::new("{d(%Y-%m-%d)"))); } #[test] #[cfg(feature = "simple_writer")] fn log() { let pw = PatternEncoder::new("{l} {m} at {M} in {f}:{L}"); let mut buf = vec![]; pw.encode( &mut SimpleWriter(&mut buf), &Record::builder() .level(Level::Debug) .args(format_args!("the message")) .module_path(Some("path")) .file(Some("file")) .line(Some(132)) .build(), ) .unwrap(); assert_eq!(buf, &b"DEBUG the message at path in file:132"[..]); } #[test] #[cfg(feature = "simple_writer")] fn unnamed_thread() { thread::spawn(|| { let pw = PatternEncoder::new("{T}"); let mut buf = vec![]; pw.encode(&mut SimpleWriter(&mut buf), &Record::builder().build()) .unwrap(); assert_eq!(buf, b"unnamed"); }) .join() .unwrap(); } #[test] #[cfg(feature = "simple_writer")] fn named_thread() { thread::Builder::new() .name("foobar".to_string()) .spawn(|| { let pw = PatternEncoder::new("{T}"); let mut buf = vec![]; pw.encode(&mut SimpleWriter(&mut buf), &Record::builder().build()) .unwrap(); assert_eq!(buf, b"foobar"); }) .unwrap() .join() .unwrap(); } #[test] #[cfg(feature = "simple_writer")] fn thread_id_field() { thread::spawn(|| { let pw = PatternEncoder::new("{I}"); let mut buf = vec![]; pw.encode(&mut SimpleWriter(&mut buf), &Record::builder().build()) .unwrap(); assert_eq!(buf, thread_id::get().to_string().as_bytes()); }) .join() .unwrap(); } #[test] #[cfg(feature = "simple_writer")] fn process_id() { let pw = PatternEncoder::new("{P}"); let mut buf = vec![]; pw.encode(&mut SimpleWriter(&mut buf), &Record::builder().build()) .unwrap(); 
assert_eq!(buf, process::id().to_string().as_bytes()); } #[test] #[cfg(feature = "simple_writer")] fn system_thread_id() { let pw = PatternEncoder::new("{i}"); let mut buf = vec![]; pw.encode(&mut SimpleWriter(&mut buf), &Record::builder().build()) .unwrap(); assert_eq!(buf, thread_id::get().to_string().as_bytes()); } #[test] #[cfg(feature = "simple_writer")] fn default_okay() { assert!(error_free(&PatternEncoder::default())); } #[test] #[cfg(feature = "simple_writer")] fn left_align() { let pw = PatternEncoder::new("{m:~<5.6}"); let mut buf = vec![]; pw.encode( &mut SimpleWriter(&mut buf), &Record::builder().args(format_args!("foo")).build(), ) .unwrap(); assert_eq!(buf, b"foo~~"); buf.clear(); pw.encode( &mut SimpleWriter(&mut buf), &Record::builder().args(format_args!("foobar!")).build(), ) .unwrap(); assert_eq!(buf, b"foobar"); } #[test] #[cfg(feature = "simple_writer")] fn right_align() { let pw = PatternEncoder::new("{m:~>5.6}"); let mut buf = vec![]; pw.encode( &mut SimpleWriter(&mut buf), &Record::builder().args(format_args!("foo")).build(), ) .unwrap(); assert_eq!(buf, b"~~foo"); buf.clear(); pw.encode( &mut SimpleWriter(&mut buf), &Record::builder().args(format_args!("foobar!")).build(), ) .unwrap(); assert_eq!(buf, b"foobar"); } #[test] #[cfg(feature = "simple_writer")] fn left_align_formatter() { let pw = PatternEncoder::new("{({l} {m}):15}"); let mut buf = vec![]; pw.encode( &mut SimpleWriter(&mut buf), &Record::builder() .level(Level::Info) .args(format_args!("foobar!")) .build(), ) .unwrap(); assert_eq!(buf, b"INFO foobar! "); } #[test] #[cfg(feature = "simple_writer")] fn right_align_formatter() { let pw = PatternEncoder::new("{({l} {m}):>15}"); let mut buf = vec![]; pw.encode( &mut SimpleWriter(&mut buf), &Record::builder() .level(Level::Info) .args(format_args!("foobar!")) .build(), ) .unwrap(); assert_eq!(buf, b" INFO foobar!"); } #[test] fn custom_date_format() { assert!(error_free(&PatternEncoder::new( "{d(%Y-%m-%d %H:%M:%S)} {m}{n}" ))); } #[test] fn timezones() { assert!(error_free(&PatternEncoder::new("{d(%+)(utc)}"))); assert!(error_free(&PatternEncoder::new("{d(%+)(local)}"))); assert!(!error_free(&PatternEncoder::new("{d(%+)(foo)}"))); } #[test] fn unescaped_parens() { assert!(!error_free(&PatternEncoder::new("(hi)"))); } #[test] #[cfg(feature = "simple_writer")] fn escaped_chars() { let pw = PatternEncoder::new("{{{m}(())}}"); let mut buf = vec![]; pw.encode( &mut SimpleWriter(&mut buf), &Record::builder().args(format_args!("foobar!")).build(), ) .unwrap(); assert_eq!(buf, b"{foobar!()}"); } #[test] #[cfg(feature = "simple_writer")] fn quote_braces_with_backslash() { let pw = PatternEncoder::new(r"\{\({l}\)\}\\"); let mut buf = vec![]; pw.encode( &mut SimpleWriter(&mut buf), &Record::builder().level(Level::Info).build(), ) .unwrap(); assert_eq!(buf, br"{(INFO)}\"); } #[test] #[cfg(feature = "simple_writer")] fn mdc() { let pw = PatternEncoder::new("{X(user_id)}"); log_mdc::insert("user_id", "mdc value"); let mut buf = vec![]; pw.encode(&mut SimpleWriter(&mut buf), &Record::builder().build()) .unwrap(); assert_eq!(buf, b"mdc value"); } #[test] #[cfg(feature = "simple_writer")] fn mdc_missing_default() { let pw = PatternEncoder::new("{X(user_id)}"); let mut buf = vec![]; pw.encode(&mut SimpleWriter(&mut buf), &Record::builder().build()) .unwrap(); assert_eq!(buf, b""); } #[test] #[cfg(feature = "simple_writer")] fn mdc_missing_custom() { let pw = PatternEncoder::new("{X(user_id)(missing value)}"); let mut buf = vec![]; pw.encode(&mut SimpleWriter(&mut buf), 
&Record::builder().build()) .unwrap(); assert_eq!(buf, b"missing value"); } #[test] #[cfg(feature = "simple_writer")] fn debug_release() { let debug_pat = "{D({l})}"; let release_pat = "{R({l})}"; let debug_encoder = PatternEncoder::new(debug_pat); let release_encoder = PatternEncoder::new(release_pat); let mut debug_buf = vec![]; let mut release_buf = vec![]; debug_encoder .encode( &mut SimpleWriter(&mut debug_buf), &Record::builder().level(Level::Info).build(), ) .unwrap(); release_encoder .encode( &mut SimpleWriter(&mut release_buf), &Record::builder().level(Level::Info).build(), ) .unwrap(); if cfg!(debug_assertions) { assert_eq!(debug_buf, b"INFO"); assert!(release_buf.is_empty()); } else { assert_eq!(release_buf, b"INFO"); assert!(debug_buf.is_empty()); } } } log4rs-1.3.0/src/encode/pattern/parser.rs000064400000000000000000000162461046102023000164130ustar 00000000000000// cribbed to a large extent from libfmt_macros use std::{iter::Peekable, str::CharIndices}; #[derive(Clone, Eq, PartialEq, Hash, Debug)] pub enum Piece<'a> { Text(&'a str), Argument { formatter: Formatter<'a>, parameters: Parameters, }, Error(String), } #[derive(Clone, Eq, PartialEq, Hash, Debug)] pub struct Formatter<'a> { pub name: &'a str, pub args: Vec>>, } #[derive(Clone, Eq, PartialEq, Hash, Debug)] pub struct Parameters { pub fill: char, pub align: Alignment, pub min_width: Option, pub max_width: Option, } #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug)] pub enum Alignment { Left, Right, } #[derive(Clone, Debug)] pub struct Parser<'a> { pattern: &'a str, it: Peekable>, } impl<'a> Parser<'a> { pub fn new(pattern: &'a str) -> Parser<'a> { Parser { pattern, it: pattern.char_indices().peekable(), } } fn consume(&mut self, ch: char) -> bool { match self.it.peek() { Some(&(_, c)) if c == ch => { self.it.next(); true } _ => false, } } fn argument(&mut self) -> Piece<'a> { let formatter = match self.formatter() { Ok(formatter) => formatter, Err(err) => return Piece::Error(err), }; Piece::Argument { formatter, parameters: self.parameters(), } } fn formatter(&mut self) -> Result, String> { Ok(Formatter { name: self.name(), args: self.args()?, }) } fn name(&mut self) -> &'a str { let start = match self.it.peek() { Some(&(pos, ch)) if ch.is_alphabetic() => { self.it.next(); pos } _ => return "", }; loop { match self.it.peek() { Some(&(_, ch)) if ch.is_alphanumeric() => { self.it.next(); } Some(&(end, _)) => return &self.pattern[start..end], None => return &self.pattern[start..], } } } fn args(&mut self) -> Result>>, String> { let mut args = vec![]; while let Some(&(_, '(')) = self.it.peek() { args.push(self.arg()?); } Ok(args) } fn arg(&mut self) -> Result>, String> { if !self.consume('(') { return Ok(vec![]); } let mut arg = vec![]; loop { if self.consume(')') { return Ok(arg); } else { match self.next() { Some(piece) => arg.push(piece), None => return Err("unclosed '('".to_owned()), } } } } fn parameters(&mut self) -> Parameters { let mut params = Parameters { fill: ' ', align: Alignment::Left, min_width: None, max_width: None, }; if !self.consume(':') { return params; } if let Some(&(_, ch)) = self.it.peek() { match self.it.clone().nth(1) { Some((_, '<')) | Some((_, '>')) => { self.it.next(); params.fill = ch; } _ => {} } } if self.consume('<') { params.align = Alignment::Left; } else if self.consume('>') { params.align = Alignment::Right; } if let Some(min_width) = self.integer() { params.min_width = Some(min_width); } if self.consume('.') { if let Some(max_width) = self.integer() { params.max_width = 
Some(max_width); } } params } fn integer(&mut self) -> Option { let mut cur = 0; let mut found = false; while let Some(&(_, ch)) = self.it.peek() { if let Some(digit) = ch.to_digit(10) { cur = cur * 10 + digit as usize; found = true; self.it.next(); } else { break; } } if found { Some(cur) } else { None } } fn text(&mut self, start: usize) -> Piece<'a> { while let Some(&(pos, ch)) = self.it.peek() { match ch { '{' | '}' | '(' | ')' | '\\' => return Piece::Text(&self.pattern[start..pos]), _ => { self.it.next(); } } } Piece::Text(&self.pattern[start..]) } } impl<'a> Iterator for Parser<'a> { type Item = Piece<'a>; fn next(&mut self) -> Option> { match self.it.peek() { Some(&(_, '{')) => { self.it.next(); if self.consume('{') { Some(Piece::Text("{")) } else { let piece = self.argument(); if self.consume('}') { Some(piece) } else { for _ in &mut self.it {} Some(Piece::Error("expected '}'".to_owned())) } } } Some(&(_, '}')) => { self.it.next(); if self.consume('}') { Some(Piece::Text("}")) } else { Some(Piece::Error("unmatched '}'".to_owned())) } } Some(&(_, '(')) => { self.it.next(); if self.consume('(') { Some(Piece::Text("(")) } else { Some(Piece::Error("unexpected '('".to_owned())) } } Some(&(_, ')')) => { self.it.next(); if self.consume(')') { Some(Piece::Text(")")) } else { Some(Piece::Error("unexpected ')'".to_owned())) } } Some(&(_, '\\')) => { self.it.next(); match self.it.peek() { Some(&(_, '{')) => { self.it.next(); Some(Piece::Text("{")) } Some(&(_, '}')) => { self.it.next(); Some(Piece::Text("}")) } Some(&(_, '(')) => { self.it.next(); Some(Piece::Text("(")) } Some(&(_, ')')) => { self.it.next(); Some(Piece::Text(")")) } Some(&(_, '\\')) => { self.it.next(); Some(Piece::Text("\\")) } _ => Some(Piece::Error("unexpected '\\'".to_owned())), } } Some(&(pos, _)) => Some(self.text(pos)), None => None, } } } log4rs-1.3.0/src/encode/writer/ansi.rs000064400000000000000000000053241046102023000157030ustar 00000000000000//! The ANSI writer. //! //! Requires the `ansi_writer` feature. use crate::encode::{self, Color, Style}; use std::{fmt, io}; /// An `encode::Write`r that wraps an `io::Write`r, emitting ANSI escape codes /// for text style. 
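///
/// A sketch mirroring the unit test below (requires the `ansi_writer`
/// feature):
///
/// ```
/// use std::io::Write;
///
/// use log4rs::encode::writer::ansi::AnsiWriter;
/// use log4rs::encode::{Color, Style, Write as EncodeWrite};
///
/// let mut w = AnsiWriter(std::io::stdout());
/// w.set_style(Style::new().text(Color::Red).intense(true)).unwrap();
/// w.write_all(b"styled").unwrap();
/// w.set_style(&Style::new()).unwrap();
/// w.write_all(b" normal\n").unwrap();
/// ```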
#[derive(Clone, Eq, PartialEq, Hash, Debug)] pub struct AnsiWriter(pub W); impl io::Write for AnsiWriter { fn write(&mut self, buf: &[u8]) -> io::Result { self.0.write(buf) } fn flush(&mut self) -> io::Result<()> { self.0.flush() } fn write_all(&mut self, buf: &[u8]) -> io::Result<()> { self.0.write_all(buf) } fn write_fmt(&mut self, fmt: fmt::Arguments) -> io::Result<()> { self.0.write_fmt(fmt) } } impl encode::Write for AnsiWriter { fn set_style(&mut self, style: &Style) -> io::Result<()> { let mut buf = [0; 12]; buf[0] = b'\x1b'; buf[1] = b'['; buf[2] = b'0'; let mut idx = 3; if let Some(text) = style.text { buf[idx] = b';'; buf[idx + 1] = b'3'; buf[idx + 2] = color_byte(text); idx += 3; } if let Some(background) = style.background { buf[idx] = b';'; buf[idx + 1] = b'4'; buf[idx + 2] = color_byte(background); idx += 3; } if let Some(intense) = style.intense { buf[idx] = b';'; if intense { buf[idx + 1] = b'1'; idx += 2; } else { buf[idx + 1] = b'2'; buf[idx + 2] = b'2'; idx += 3; } } buf[idx] = b'm'; self.0.write_all(&buf[..=idx]) } } fn color_byte(c: Color) -> u8 { match c { Color::Black => b'0', Color::Red => b'1', Color::Green => b'2', Color::Yellow => b'3', Color::Blue => b'4', Color::Magenta => b'5', Color::Cyan => b'6', Color::White => b'7', } } #[cfg(test)] mod test { use std::io::{self, Write}; use super::*; use crate::encode::{Color, Style, Write as EncodeWrite}; #[test] fn basic() { let stdout = io::stdout(); let mut w = AnsiWriter(stdout.lock()); w.write_all(b"normal ").unwrap(); w.set_style( Style::new() .text(Color::Red) .background(Color::Blue) .intense(true), ) .unwrap(); w.write_all(b"styled").unwrap(); w.set_style(Style::new().text(Color::Green)).unwrap(); w.write_all(b" styled2").unwrap(); w.set_style(&Style::new()).unwrap(); w.write_all(b" normal\n").unwrap(); w.flush().unwrap(); } } log4rs-1.3.0/src/encode/writer/console.rs000064400000000000000000000322411046102023000164110ustar 00000000000000//! The console writer //! //! Requires the `console_writer` feature. use std::{fmt, io}; use crate::encode::{self, Style}; use once_cell::sync::Lazy; static COLOR_MODE: Lazy = Lazy::new(|| { let no_color = std::env::var("NO_COLOR") .map(|var| var != "0") .unwrap_or(false); let clicolor_force = std::env::var("CLICOLOR_FORCE") .map(|var| var != "0") .unwrap_or(false); if no_color { ColorMode::Never } else if clicolor_force { ColorMode::Always } else { let clicolor = std::env::var("CLICOLOR") .map(|var| var != "0") .unwrap_or(true); if clicolor { ColorMode::Auto } else { ColorMode::Never } } }); /// The color output mode for a `ConsoleAppender` #[derive(Clone, Copy, Default)] pub enum ColorMode { /// Print color only if the output is recognized as a console #[default] Auto, /// Force color output Always, /// Never print color Never, } /// An `encode::Write`r that outputs to a console. pub struct ConsoleWriter(imp::Writer); impl ConsoleWriter { /// Returns a new `ConsoleWriter` that will write to standard out. /// /// Returns `None` if standard out is not a console buffer on Windows, and /// if it is not a TTY on Unix. pub fn stdout() -> Option { imp::Writer::stdout().map(ConsoleWriter) } /// Returns a new `ConsoleWriter` that will write to standard error. /// /// Returns `None` if standard error is not a console buffer on Windows, and /// if it is not a TTY on Unix. pub fn stderr() -> Option { imp::Writer::stderr().map(ConsoleWriter) } /// Locks the console, preventing other threads from writing concurrently. 
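///
/// A sketch of locking stdout so that a multi-part write stays contiguous:
///
/// ```no_run
/// use std::io::Write;
///
/// use log4rs::encode::writer::console::ConsoleWriter;
///
/// if let Some(w) = ConsoleWriter::stdout() {
///     let mut w = w.lock();
///     w.write_all(b"written ").unwrap();
///     w.write_all(b"atomically\n").unwrap();
/// }
/// ```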
pub fn lock(&self) -> ConsoleWriterLock { ConsoleWriterLock(self.0.lock()) } } impl io::Write for ConsoleWriter { fn write(&mut self, buf: &[u8]) -> io::Result { self.0.write(buf) } fn flush(&mut self) -> io::Result<()> { self.0.flush() } fn write_all(&mut self, buf: &[u8]) -> io::Result<()> { self.0.write_all(buf) } fn write_fmt(&mut self, fmt: fmt::Arguments) -> io::Result<()> { self.0.write_fmt(fmt) } } impl encode::Write for ConsoleWriter { fn set_style(&mut self, style: &Style) -> io::Result<()> { self.0.set_style(style) } } /// An RAII lock over a console. pub struct ConsoleWriterLock<'a>(imp::WriterLock<'a>); impl<'a> io::Write for ConsoleWriterLock<'a> { fn write(&mut self, buf: &[u8]) -> io::Result { self.0.write(buf) } fn flush(&mut self) -> io::Result<()> { self.0.flush() } fn write_all(&mut self, buf: &[u8]) -> io::Result<()> { self.0.write_all(buf) } fn write_fmt(&mut self, fmt: fmt::Arguments) -> io::Result<()> { self.0.write_fmt(fmt) } } impl<'a> encode::Write for ConsoleWriterLock<'a> { fn set_style(&mut self, style: &Style) -> io::Result<()> { self.0.set_style(style) } } #[cfg(unix)] mod imp { use std::{fmt, io}; use crate::{ encode::{ self, writer::{ ansi::AnsiWriter, console::{ColorMode, COLOR_MODE}, }, Style, }, priv_io::{StdWriter, StdWriterLock}, }; pub struct Writer(AnsiWriter); impl Writer { pub fn stdout() -> Option { let writer = || Writer(AnsiWriter(StdWriter::stdout())); match *COLOR_MODE { ColorMode::Auto => { if unsafe { libc::isatty(libc::STDOUT_FILENO) } != 1 { None } else { Some(writer()) } } ColorMode::Always => Some(writer()), ColorMode::Never => None, } } pub fn stderr() -> Option { let writer = || Writer(AnsiWriter(StdWriter::stderr())); match *COLOR_MODE { ColorMode::Auto => { if unsafe { libc::isatty(libc::STDERR_FILENO) } != 1 { None } else { Some(writer()) } } ColorMode::Always => Some(writer()), ColorMode::Never => None, } } pub fn lock(&self) -> WriterLock { WriterLock(AnsiWriter((self.0).0.lock())) } } impl io::Write for Writer { fn write(&mut self, buf: &[u8]) -> io::Result { self.0.write(buf) } fn flush(&mut self) -> io::Result<()> { self.0.flush() } fn write_all(&mut self, buf: &[u8]) -> io::Result<()> { self.0.write_all(buf) } fn write_fmt(&mut self, fmt: fmt::Arguments) -> io::Result<()> { self.0.write_fmt(fmt) } } impl encode::Write for Writer { fn set_style(&mut self, style: &Style) -> io::Result<()> { self.0.set_style(style) } } pub struct WriterLock<'a>(AnsiWriter>); impl<'a> io::Write for WriterLock<'a> { fn write(&mut self, buf: &[u8]) -> io::Result { self.0.write(buf) } fn flush(&mut self) -> io::Result<()> { self.0.flush() } fn write_all(&mut self, buf: &[u8]) -> io::Result<()> { self.0.write_all(buf) } fn write_fmt(&mut self, fmt: fmt::Arguments) -> io::Result<()> { self.0.write_fmt(fmt) } } impl<'a> encode::Write for WriterLock<'a> { fn set_style(&mut self, style: &Style) -> io::Result<()> { self.0.set_style(style) } } } #[cfg(windows)] mod imp { use std::{ fmt, io::{self, Write}, mem, }; use winapi::{ shared::minwindef, um::{handleapi, processenv, winbase, wincon, winnt}, }; use crate::{ encode::{ self, writer::console::{ColorMode, COLOR_MODE}, Color, Style, }, priv_io::{StdWriter, StdWriterLock}, }; struct RawConsole { handle: winnt::HANDLE, defaults: minwindef::WORD, } unsafe impl Sync for RawConsole {} unsafe impl Send for RawConsole {} impl RawConsole { fn set_style(&self, style: &Style) -> io::Result<()> { let mut attrs = self.defaults; if let Some(text) = style.text { attrs &= !((wincon::FOREGROUND_RED | 
wincon::FOREGROUND_GREEN | wincon::FOREGROUND_BLUE) as minwindef::WORD); attrs |= match text { Color::Black => 0, Color::Red => wincon::FOREGROUND_RED, Color::Green => wincon::FOREGROUND_GREEN, Color::Yellow => wincon::FOREGROUND_RED | wincon::FOREGROUND_GREEN, Color::Blue => wincon::FOREGROUND_BLUE, Color::Magenta => wincon::FOREGROUND_RED | wincon::FOREGROUND_BLUE, Color::Cyan => wincon::FOREGROUND_GREEN | wincon::FOREGROUND_BLUE, Color::White => { wincon::FOREGROUND_RED | wincon::FOREGROUND_GREEN | wincon::FOREGROUND_BLUE } } as minwindef::WORD; } if let Some(background) = style.background { attrs &= !((wincon::BACKGROUND_RED | wincon::BACKGROUND_GREEN | wincon::BACKGROUND_BLUE) as minwindef::WORD); attrs |= match background { Color::Black => 0, Color::Red => wincon::BACKGROUND_RED, Color::Green => wincon::BACKGROUND_GREEN, Color::Yellow => wincon::BACKGROUND_RED | wincon::BACKGROUND_GREEN, Color::Blue => wincon::BACKGROUND_BLUE, Color::Magenta => wincon::BACKGROUND_RED | wincon::BACKGROUND_BLUE, Color::Cyan => wincon::BACKGROUND_GREEN | wincon::BACKGROUND_BLUE, Color::White => { wincon::BACKGROUND_RED | wincon::BACKGROUND_GREEN | wincon::BACKGROUND_BLUE } } as minwindef::WORD; } if let Some(intense) = style.intense { if intense { attrs |= wincon::FOREGROUND_INTENSITY as minwindef::WORD; } else { attrs &= !(wincon::FOREGROUND_INTENSITY as minwindef::WORD); } } if unsafe { wincon::SetConsoleTextAttribute(self.handle, attrs) } == 0 { Err(io::Error::last_os_error()) } else { Ok(()) } } } pub struct Writer { console: RawConsole, inner: StdWriter, } impl Writer { pub fn stdout() -> Option { unsafe { let handle = processenv::GetStdHandle(winbase::STD_OUTPUT_HANDLE); if handle.is_null() || handle == handleapi::INVALID_HANDLE_VALUE { return None; } let mut info = mem::zeroed(); if wincon::GetConsoleScreenBufferInfo(handle, &mut info) == 0 { return None; } let writer = Writer { console: RawConsole { handle, defaults: info.wAttributes, }, inner: StdWriter::stdout(), }; match *COLOR_MODE { ColorMode::Auto | ColorMode::Always => Some(writer), ColorMode::Never => None, } } } pub fn stderr() -> Option { unsafe { let handle = processenv::GetStdHandle(winbase::STD_ERROR_HANDLE); if handle.is_null() || handle == handleapi::INVALID_HANDLE_VALUE { return None; } let mut info = mem::zeroed(); if wincon::GetConsoleScreenBufferInfo(handle, &mut info) == 0 { return None; } let writer = Writer { console: RawConsole { handle, defaults: info.wAttributes, }, inner: StdWriter::stdout(), }; match *COLOR_MODE { ColorMode::Auto | ColorMode::Always => Some(writer), ColorMode::Never => None, } } } pub fn lock<'a>(&'a self) -> WriterLock<'a> { WriterLock { console: &self.console, inner: self.inner.lock(), } } } impl io::Write for Writer { fn write(&mut self, buf: &[u8]) -> io::Result { self.inner.write(buf) } fn flush(&mut self) -> io::Result<()> { self.inner.flush() } fn write_all(&mut self, buf: &[u8]) -> io::Result<()> { self.inner.write_all(buf) } fn write_fmt(&mut self, fmt: fmt::Arguments) -> io::Result<()> { self.inner.write_fmt(fmt) } } impl encode::Write for Writer { fn set_style(&mut self, style: &Style) -> io::Result<()> { self.inner.flush()?; self.console.set_style(style) } } pub struct WriterLock<'a> { console: &'a RawConsole, inner: StdWriterLock<'a>, } impl<'a> io::Write for WriterLock<'a> { fn write(&mut self, buf: &[u8]) -> io::Result { self.inner.write(buf) } fn flush(&mut self) -> io::Result<()> { self.inner.flush() } fn write_all(&mut self, buf: &[u8]) -> io::Result<()> { self.inner.write_all(buf) } 
fn write_fmt(&mut self, fmt: fmt::Arguments) -> io::Result<()> { self.inner.write_fmt(fmt) } } impl<'a> encode::Write for WriterLock<'a> { fn set_style(&mut self, style: &Style) -> io::Result<()> { self.inner.flush()?; self.console.set_style(style) } } } #[cfg(test)] mod test { use std::io::Write; use super::*; use crate::encode::{Color, Style, Write as EncodeWrite}; #[test] fn basic() { let w = match ConsoleWriter::stdout() { Some(w) => w, None => return, }; let mut w = w.lock(); w.write_all(b"normal ").unwrap(); w.set_style( Style::new() .text(Color::Red) .background(Color::Blue) .intense(true), ) .unwrap(); w.write_all(b"styled").unwrap(); w.set_style(Style::new().text(Color::Green)).unwrap(); w.write_all(b" styled2").unwrap(); w.set_style(&Style::new()).unwrap(); w.write_all(b" normal\n").unwrap(); w.flush().unwrap(); } } log4rs-1.3.0/src/encode/writer/mod.rs000064400000000000000000000003071046102023000155240ustar 00000000000000//! Implementations of the `encode::Write` trait. #[cfg(feature = "ansi_writer")] pub mod ansi; #[cfg(feature = "console_writer")] pub mod console; #[cfg(feature = "simple_writer")] pub mod simple; log4rs-1.3.0/src/encode/writer/simple.rs000064400000000000000000000014501046102023000162360ustar 00000000000000//! The simple writer //! //! Requires the `simple_writer` feature. use crate::encode; use std::{fmt, io}; /// An `encode::Write`r that simply delegates to an `io::Write`r and relies /// on the default implementations of `encode::Write`r methods. #[derive(Clone, Eq, PartialEq, Hash, Debug)] pub struct SimpleWriter(pub W); impl io::Write for SimpleWriter { fn write(&mut self, buf: &[u8]) -> io::Result { self.0.write(buf) } fn flush(&mut self) -> io::Result<()> { self.0.flush() } fn write_all(&mut self, buf: &[u8]) -> io::Result<()> { self.0.write_all(buf) } fn write_fmt(&mut self, fmt: fmt::Arguments) -> io::Result<()> { self.0.write_fmt(fmt) } } impl encode::Write for SimpleWriter {} log4rs-1.3.0/src/filter/mod.rs000064400000000000000000000037651046102023000142530ustar 00000000000000//! Filters use log::Record; #[cfg(feature = "config_parsing")] use serde::de; #[cfg(feature = "config_parsing")] use serde_value::Value; #[cfg(feature = "config_parsing")] use std::collections::BTreeMap; use std::fmt; #[cfg(feature = "config_parsing")] use crate::config::Deserializable; #[cfg(feature = "threshold_filter")] pub mod threshold; /// The trait implemented by log4rs filters. /// /// Filters are associated with appenders and limit the log events that will be /// sent to that appender. pub trait Filter: fmt::Debug + Send + Sync + 'static { /// Filters a log event. fn filter(&self, record: &Record) -> Response; } #[cfg(feature = "config_parsing")] impl Deserializable for dyn Filter { fn name() -> &'static str { "filter" } } /// The response returned by a filter. pub enum Response { /// Accept the log event. /// /// It will be immediately passed to the appender, bypassing any remaining /// filters. Accept, /// Take no action on the log event. /// /// It will continue on to remaining filters or pass on to the appender if /// there are none remaining. Neutral, /// Reject the log event. Reject, } /// Configuration for a filter. #[cfg(feature = "config_parsing")] #[derive(Clone, Eq, PartialEq, Hash, Debug)] pub struct FilterConfig { /// The filter kind. pub kind: String, /// The filter configuration. 
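    ///
    /// This holds every key other than `kind`; for `kind: threshold`, for
    /// example, it would contain the remaining `level: warn` entry.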
pub config: Value, } #[cfg(feature = "config_parsing")] impl<'de> de::Deserialize<'de> for FilterConfig { fn deserialize(d: D) -> Result where D: de::Deserializer<'de>, { let mut map = BTreeMap::::deserialize(d)?; let kind = match map.remove(&Value::String("kind".to_owned())) { Some(kind) => kind.deserialize_into().map_err(|e| e.to_error())?, None => return Err(de::Error::missing_field("kind")), }; Ok(FilterConfig { kind, config: Value::Map(map), }) } } log4rs-1.3.0/src/filter/threshold.rs000064400000000000000000000032711046102023000154600ustar 00000000000000//! The threshold filter. //! //! Requires the `threshold_filter` feature. use log::{LevelFilter, Record}; #[cfg(feature = "config_parsing")] use crate::config::{Deserialize, Deserializers}; use crate::filter::{Filter, Response}; /// The threshold filter's configuration. #[cfg(feature = "config_parsing")] #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug, serde::Deserialize)] pub struct ThresholdFilterConfig { level: LevelFilter, } /// A filter that rejects all events at a level below a provided threshold. #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug)] pub struct ThresholdFilter { level: LevelFilter, } impl ThresholdFilter { /// Creates a new `ThresholdFilter` with the specified threshold. pub fn new(level: LevelFilter) -> ThresholdFilter { ThresholdFilter { level } } } impl Filter for ThresholdFilter { fn filter(&self, record: &Record) -> Response { if record.level() > self.level { Response::Reject } else { Response::Neutral } } } /// A deserializer for the `ThresholdFilter`. /// /// # Configuration /// /// ```yaml /// kind: threshold /// /// # The threshold log level to filter at. Required /// level: warn /// ``` #[cfg(feature = "config_parsing")] #[derive(Copy, Clone, Eq, PartialEq, Hash, Debug)] pub struct ThresholdFilterDeserializer; #[cfg(feature = "config_parsing")] impl Deserialize for ThresholdFilterDeserializer { type Trait = dyn Filter; type Config = ThresholdFilterConfig; fn deserialize( &self, config: ThresholdFilterConfig, _: &Deserializers, ) -> anyhow::Result> { Ok(Box::new(ThresholdFilter::new(config.level))) } } log4rs-1.3.0/src/lib.rs000064400000000000000000000445021046102023000127470ustar 00000000000000//! log4rs is a highly configurable logging framework modeled after Java's //! Logback and log4j libraries. //! //! # Architecture //! //! The basic units of configuration are *appenders*, *encoders*, *filters*, and //! *loggers*. //! //! ## Appenders //! //! An appender takes a log record and logs it somewhere, for example, to a //! file, the console, or the syslog. //! //! Implementations: //! - [console](append/console/struct.ConsoleAppenderDeserializer.html#configuration): requires the `console_appender` feature. //! - [file](append/file/struct.FileAppenderDeserializer.html#configuration): requires the `file_appender` feature. //! - [rolling_file](append/rolling_file/struct.RollingFileAppenderDeserializer.html#configuration): requires the `rolling_file_appender` feature and can be configured with the `compound_policy`. //! - [compound](append/rolling_file/policy/compound/struct.CompoundPolicyDeserializer.html#configuration): requires the `compound_policy` feature //! - Rollers //! - [delete](append/rolling_file/policy/compound/roll/delete/struct.DeleteRollerDeserializer.html#configuration): requires the `delete_roller` feature //! - [fixed_window](append/rolling_file/policy/compound/roll/fixed_window/struct.FixedWindowRollerDeserializer.html#configuration): requires the `fixed_window_roller` feature //! 
- Triggers
//!     - [size](append/rolling_file/policy/compound/trigger/size/struct.SizeTriggerDeserializer.html#configuration): requires the `size_trigger` feature
//!     - [time](append/rolling_file/policy/compound/trigger/time/struct.TimeTriggerDeserializer.html#configuration): requires the `time_trigger` feature
//!
//! ## Encoders
//!
//! An encoder is responsible for taking a log record, transforming it into the
//! appropriate output format, and writing it out. An appender will normally
//! use an encoder internally.
//!
//! Implementations:
//! - [pattern](encode/pattern/struct.PatternEncoderDeserializer.html#configuration): requires the `pattern_encoder` feature
//! - [json](encode/json/struct.JsonEncoderDeserializer.html#configuration): requires the `json_encoder` feature
//!
//! ## Filters
//!
//! Filters are associated with appenders and, like the name would suggest,
//! filter log events coming into that appender.
//!
//! Implementations:
//! - [threshold](filter/threshold/struct.ThresholdFilterDeserializer.html#configuration): requires the `threshold_filter` feature
//!
//! ## Loggers
//!
//! A log event is targeted at a specific logger; loggers are identified by
//! string names. The logging macros built in to the `log` crate set the logger
//! of a log event to the one identified by the module containing the
//! invocation location.
//!
//! Loggers form a hierarchy: logger names are divided into components by "::".
//! One logger is the ancestor of another if the first logger's component list
//! is a prefix of the second logger's component list.
//!
//! Loggers are associated with a maximum log level. Log events for that logger
//! with a level above the maximum will be ignored. The maximum log level for
//! any logger can be configured manually; if it is not, the level will be
//! inherited from the logger's parent.
//!
//! Loggers are also associated with a set of appenders. Appenders can be
//! associated directly with a logger. In addition, the appenders of the
//! logger's parent will be associated with the logger unless the logger has
//! its *additive* set to `false`. Log events sent to the logger that are not
//! filtered out by the logger's maximum log level will be sent to all
//! associated appenders.
//!
//! The "root" logger is the ancestor of all other loggers. Since it has no
//! ancestors, its additivity cannot be configured.
//!
//! # Configuration
//!
//! For a detailed breakdown on configuration, refer to the
//! [config module](config/index.html#configuration).
//!
//! log4rs makes heavy use of Cargo features to enable consumers to pick the
//! functionality they wish to use. File-based configuration requires the `file`
//! feature, and each file format requires its own feature as well. In addition,
//! each component has its own feature. For example, YAML support requires the
//! `yaml_format` feature and the console appender requires the
//! `console_appender` feature.
//!
//! By default, the `all_components`, `gzip`, `file`, and `yaml_format` features
//! are enabled.
//!
//! As a convenience, the `all_components` feature activates all logger components.
//!
//! # Examples
//!
//! ## Configuration via a YAML file
//!
//! ```yaml
//! # Scan this file for changes every 30 seconds
//! refresh_rate: 30 seconds
//!
//! appenders:
//!   # An appender named "stdout" that writes to stdout
//!   stdout:
//!     kind: console
//!
//!   # An appender named "requests" that writes to a file with a custom pattern encoder
//!   requests:
//!     kind: file
//!     path: "log/requests.log"
//!     encoder:
//!       pattern: "{d} - {m}{n}"
//!
//! # Set the default logging level to "warn" and attach the "stdout" appender to the root
//! root:
//!   level: warn
//!   appenders:
//!     - stdout
//!
//! loggers:
//!   # Raise the maximum log level for events sent to the "app::backend::db" logger to "info"
//!   app::backend::db:
//!     level: info
//!
//!   # Route log events sent to the "app::requests" logger to the "requests" appender,
//!   # and *not* the normal appenders installed at the root
//!   app::requests:
//!     level: info
//!     appenders:
//!       - requests
//!     additive: false
//! ```
//!
//! Add the following in your application initialization.
//!
//! ```no_run
//! # #[cfg(feature = "config_parsing")]
//! # fn f() {
//! log4rs::init_file("log4rs.yml", Default::default()).unwrap();
//! # }
//! ```
//!
//! ## Programmatically constructing a configuration:
//!
//! ```no_run
//! # #[cfg(all(feature = "console_appender",
//! #           feature = "file_appender",
//! #           feature = "pattern_encoder"))]
//! # fn f() {
//! use log::LevelFilter;
//! use log4rs::append::console::ConsoleAppender;
//! use log4rs::append::file::FileAppender;
//! use log4rs::encode::pattern::PatternEncoder;
//! use log4rs::config::{Appender, Config, Logger, Root};
//!
//! fn main() {
//!     let stdout = ConsoleAppender::builder().build();
//!
//!     let requests = FileAppender::builder()
//!         .encoder(Box::new(PatternEncoder::new("{d} - {m}{n}")))
//!         .build("log/requests.log")
//!         .unwrap();
//!
//!     let config = Config::builder()
//!         .appender(Appender::builder().build("stdout", Box::new(stdout)))
//!         .appender(Appender::builder().build("requests", Box::new(requests)))
//!         .logger(Logger::builder().build("app::backend::db", LevelFilter::Info))
//!         .logger(Logger::builder()
//!             .appender("requests")
//!             .additive(false)
//!             .build("app::requests", LevelFilter::Info))
//!         .build(Root::builder().appender("stdout").build(LevelFilter::Warn))
//!         .unwrap();
//!
//!     let handle = log4rs::init_config(config).unwrap();
//!
//!     // use handle to change logger configuration at runtime
//! }
//! # }
//! # fn main() {}
//! ```
//!
//! For more examples see the [examples](https://github.com/estk/log4rs/tree/main/examples).
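//!
//! ## Controlling console color output
//!
//! When the `console_writer` feature is active, styling honors the
//! `NO_COLOR`, `CLICOLOR`, and `CLICOLOR_FORCE` environment variables:
//! setting `NO_COLOR` (to anything but `"0"`) disables styling, setting
//! `CLICOLOR_FORCE` (to anything but `"0"`) forces styling even when the
//! output is not a terminal, and `CLICOLOR=0` disables it otherwise. The
//! mode is read once, lazily, so as a sketch it can also be pinned from
//! within the program by setting the variable before logging starts:
//!
//! ```no_run
//! // Force color output regardless of TTY detection; equivalent to
//! // launching the process with CLICOLOR_FORCE=1. Must run before the
//! // first console write, since the color mode is cached on first use.
//! std::env::set_var("CLICOLOR_FORCE", "1");
//! ```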
#![allow(where_clauses_object_safety, clippy::manual_non_exhaustive)] #![warn(missing_docs)] use std::{ cmp, collections::HashMap, fmt, hash::BuildHasherDefault, io, io::prelude::*, sync::Arc, }; use arc_swap::ArcSwap; use fnv::FnvHasher; use log::{Level, LevelFilter, Metadata, Record}; pub mod append; pub mod config; pub mod encode; pub mod filter; #[cfg(feature = "console_writer")] mod priv_io; pub use config::{init_config, Config}; #[cfg(feature = "config_parsing")] pub use config::{init_file, init_raw_config}; use self::{append::Append, filter::Filter}; type FnvHashMap = HashMap>; #[derive(Debug)] struct ConfiguredLogger { level: LevelFilter, appenders: Vec, children: FnvHashMap, } impl ConfiguredLogger { fn add(&mut self, path: &str, mut appenders: Vec, additive: bool, level: LevelFilter) { let (part, rest) = match path.find("::") { Some(idx) => (&path[..idx], &path[idx + 2..]), None => (path, ""), }; if let Some(child) = self.children.get_mut(part) { child.add(rest, appenders, additive, level); return; } let child = if rest.is_empty() { if additive { appenders.extend(self.appenders.iter().cloned()); } ConfiguredLogger { level, appenders, children: FnvHashMap::default(), } } else { let mut child = ConfiguredLogger { level: self.level, appenders: self.appenders.clone(), children: FnvHashMap::default(), }; child.add(rest, appenders, additive, level); child }; self.children.insert(part.to_owned(), child); } fn max_log_level(&self) -> LevelFilter { let mut max = self.level; for child in self.children.values() { max = cmp::max(max, child.max_log_level()); } max } fn find(&self, path: &str) -> &ConfiguredLogger { let mut node = self; for part in path.split("::") { match node.children.get(part) { Some(child) => node = child, None => break, } } node } fn enabled(&self, level: Level) -> bool { self.level >= level } fn log(&self, record: &log::Record, appenders: &[Appender]) -> Result<(), Vec> { let mut errors = vec![]; if self.enabled(record.level()) { for &idx in &self.appenders { if let Err(err) = appenders[idx].append(record) { errors.push(err); } } } if errors.is_empty() { Ok(()) } else { Err(errors) } } } #[derive(Debug)] struct Appender { appender: Box, filters: Vec>, } impl Appender { fn append(&self, record: &Record) -> anyhow::Result<()> { for filter in &self.filters { match filter.filter(record) { filter::Response::Accept => break, filter::Response::Neutral => {} filter::Response::Reject => return Ok(()), } } self.appender.append(record) } fn flush(&self) { self.appender.flush(); } } struct SharedLogger { root: ConfiguredLogger, appenders: Vec, err_handler: Box, } impl fmt::Debug for SharedLogger { fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result { f.debug_struct("SharedLogger") .field("root", &self.root) .field("appenders", &self.appenders) .finish() } } impl SharedLogger { fn new(config: config::Config) -> SharedLogger { Self::new_with_err_handler( config, Box::new(|e: &anyhow::Error| { let _ = writeln!(io::stderr(), "log4rs: {}", e); }), ) } fn new_with_err_handler( config: config::Config, err_handler: Box, ) -> SharedLogger { let (appenders, root, mut loggers) = config.unpack(); let root = { let appender_map = appenders .iter() .enumerate() .map(|(i, appender)| (appender.name(), i)) .collect::>(); let mut root = ConfiguredLogger { level: root.level(), appenders: root .appenders() .iter() .map(|appender| appender_map[&**appender]) .collect(), children: FnvHashMap::default(), }; // sort loggers by name length to ensure that we initialize them top to bottom 
            loggers.sort_by_key(|l| l.name().len());

            for logger in loggers {
                let appenders = logger
                    .appenders()
                    .iter()
                    .map(|appender| appender_map[&**appender])
                    .collect();
                root.add(logger.name(), appenders, logger.additive(), logger.level());
            }

            root
        };

        let appenders = appenders
            .into_iter()
            .map(|appender| {
                let (_, appender, filters) = appender.unpack();
                Appender { appender, filters }
            })
            .collect();

        SharedLogger {
            root,
            appenders,
            err_handler,
        }
    }
}

/// The fully configured log4rs Logger which is appropriate
/// to use with the `log::set_boxed_logger` function.
#[derive(Debug)]
pub struct Logger(Arc<ArcSwap<SharedLogger>>);

impl Logger {
    /// Create a new `Logger` given a configuration.
    pub fn new(config: config::Config) -> Logger {
        Logger(Arc::new(ArcSwap::new(Arc::new(SharedLogger::new(config)))))
    }

    /// Create a new `Logger` given a configuration and err handler.
    pub fn new_with_err_handler(
        config: config::Config,
        err_handler: Box<dyn Fn(&anyhow::Error) + Send + Sync>,
    ) -> Logger {
        Logger(Arc::new(ArcSwap::new(Arc::new(
            SharedLogger::new_with_err_handler(config, err_handler),
        ))))
    }

    /// Returns the maximum log level of the configuration; records above this
    /// level are filtered out.
    pub fn max_log_level(&self) -> LevelFilter {
        self.0.load().root.max_log_level()
    }
}

impl log::Log for Logger {
    fn enabled(&self, metadata: &Metadata) -> bool {
        self.0
            .load()
            .root
            .find(metadata.target())
            .enabled(metadata.level())
    }

    fn log(&self, record: &log::Record) {
        let shared = self.0.load();
        if let Err(errs) = shared
            .root
            .find(record.target())
            .log(record, &shared.appenders)
        {
            for e in errs {
                (shared.err_handler)(&e)
            }
        }
    }

    fn flush(&self) {
        for appender in &self.0.load().appenders {
            appender.flush();
        }
    }
}

pub(crate) fn handle_error(e: &anyhow::Error) {
    let _ = writeln!(io::stderr(), "log4rs: {}", e);
}

/// A handle to the active logger.
#[derive(Clone, Debug)]
pub struct Handle {
    shared: Arc<ArcSwap<SharedLogger>>,
}

impl Handle {
    /// Sets the logging configuration.
    pub fn set_config(&self, config: Config) {
        let shared = SharedLogger::new(config);
        log::set_max_level(shared.root.max_log_level());
        self.shared.store(Arc::new(shared));
    }
}

#[cfg(test)]
mod test {
    use log::{Level, LevelFilter, Log};

    use super::*;

    #[test]
    #[cfg(all(feature = "config_parsing", feature = "json_format"))]
    fn init_from_raw_config() {
        let dir = tempfile::tempdir().unwrap();
        let path = dir.path().join("append.log");

        let cfg = serde_json::json!({
            "refresh_rate": "60 seconds",
            "root": {
                "appenders": ["baz"],
                "level": "info",
            },
            "appenders": {
                "baz": {
                    "kind": "file",
                    "path": path,
                    "encoder": {
                        "pattern": "{m}"
                    }
                }
            },
        });
        let config = serde_json::from_str::<config::RawConfig>(&cfg.to_string()).unwrap();
        if let Err(e) = init_raw_config(config) {
            panic!("{}", e);
        }
        assert!(path.exists());
        log::info!("init_from_raw_config");

        let mut contents = String::new();
        std::fs::File::open(&path)
            .unwrap()
            .read_to_string(&mut contents)
            .unwrap();
        assert_eq!(contents, "init_from_raw_config");
    }

    #[test]
    fn enabled() {
        let root = config::Root::builder().build(LevelFilter::Debug);
        let mut config = config::Config::builder();
        let logger = config::Logger::builder().build("foo::bar", LevelFilter::Trace);
        config = config.logger(logger);
        let logger = config::Logger::builder().build("foo::bar::baz", LevelFilter::Off);
        config = config.logger(logger);
        let logger = config::Logger::builder().build("foo::baz::buz", LevelFilter::Error);
        config = config.logger(logger);
        let config = config.build(root).unwrap();

        let logger = super::Logger::new(config);

        assert!(logger.enabled(&Metadata::builder().level(Level::Warn).target("bar").build()));
        assert!(!logger.enabled(
            &Metadata::builder()
                .level(Level::Trace)
                .target("bar")
                .build()
        ));
        assert!(logger.enabled(
            &Metadata::builder()
                .level(Level::Debug)
                .target("foo")
                .build()
        ));
        assert!(logger.enabled(
            &Metadata::builder()
                .level(Level::Trace)
                .target("foo::bar")
                .build()
        ));
        assert!(!logger.enabled(
            &Metadata::builder()
                .level(Level::Error)
                .target("foo::bar::baz")
                .build()
        ));
        assert!(logger.enabled(
            &Metadata::builder()
                .level(Level::Debug)
                .target("foo::bar::bazbuz")
                .build()
        ));
        assert!(!logger.enabled(
            &Metadata::builder()
                .level(Level::Error)
                .target("foo::bar::baz::buz")
                .build()
        ));
        assert!(!logger.enabled(
            &Metadata::builder()
                .level(Level::Warn)
                .target("foo::baz::buz")
                .build()
        ));
        assert!(!logger.enabled(
            &Metadata::builder()
                .level(Level::Warn)
                .target("foo::baz::buz::bar")
                .build()
        ));
        assert!(logger.enabled(
            &Metadata::builder()
                .level(Level::Error)
                .target("foo::baz::buz::bar")
                .build()
        ));
    }
}

log4rs-1.3.0/src/priv_io.rs

use std::{
    fmt,
    io::{self, Stderr, StderrLock, Stdout, StdoutLock},
};

pub enum StdWriter {
    Stdout(Stdout),
    Stderr(Stderr),
}

impl StdWriter {
    pub fn stdout() -> StdWriter {
        StdWriter::Stdout(io::stdout())
    }

    pub fn stderr() -> StdWriter {
        StdWriter::Stderr(io::stderr())
    }

    pub fn lock(&self) -> StdWriterLock {
        match *self {
            StdWriter::Stdout(ref w) => StdWriterLock::Stdout(w.lock()),
            StdWriter::Stderr(ref w) => StdWriterLock::Stderr(w.lock()),
        }
    }
}

impl io::Write for StdWriter {
    fn write(&mut self, buf: &[u8]) -> io::Result<usize> {
        match *self {
            StdWriter::Stdout(ref mut w) => w.write(buf),
            StdWriter::Stderr(ref mut w) => w.write(buf),
        }
    }

    fn flush(&mut self) -> io::Result<()> {
        match *self {
            StdWriter::Stdout(ref mut w) => w.flush(),
            StdWriter::Stderr(ref mut w) => w.flush(),
        }
    }

    fn write_all(&mut self, buf: &[u8]) -> io::Result<()> {
        match *self {
            StdWriter::Stdout(ref mut w) => w.write_all(buf),
            StdWriter::Stderr(ref mut w) => w.write_all(buf),
        }
    }

    fn write_fmt(&mut self, fmt: fmt::Arguments) -> io::Result<()> {
        match *self {
            StdWriter::Stdout(ref mut w) => w.write_fmt(fmt),
            StdWriter::Stderr(ref mut w) => w.write_fmt(fmt),
        }
    }
}

pub enum StdWriterLock<'a> {
    Stdout(StdoutLock<'a>),
    Stderr(StderrLock<'a>),
}

impl<'a> io::Write for StdWriterLock<'a> {
    fn write(&mut self, buf: &[u8]) -> io::Result<usize> {
        match *self {
            StdWriterLock::Stdout(ref mut w) => w.write(buf),
            StdWriterLock::Stderr(ref mut w) => w.write(buf),
        }
    }

    fn flush(&mut self) -> io::Result<()> {
        match *self {
            StdWriterLock::Stdout(ref mut w) => w.flush(),
            StdWriterLock::Stderr(ref mut w) => w.flush(),
        }
    }

    fn write_all(&mut self, buf: &[u8]) -> io::Result<()> {
        match *self {
            StdWriterLock::Stdout(ref mut w) => w.write_all(buf),
            StdWriterLock::Stderr(ref mut w) => w.write_all(buf),
        }
    }

    fn write_fmt(&mut self, fmt: fmt::Arguments) -> io::Result<()> {
        match *self {
            StdWriterLock::Stdout(ref mut w) => w.write_fmt(fmt),
            StdWriterLock::Stderr(ref mut w) => w.write_fmt(fmt),
        }
    }
}

log4rs-1.3.0/test.sh

#!/bin/bash -e
# Test each feature individually; pass "win" to cross-test against the
# x86_64-pc-windows-gnu target.

function main {
    if [ "$1" == "win" ]; then
        target_arg='--target x86_64-pc-windows-gnu'
    else
        target_arg=''
    fi

    for feature in $(cargo read-manifest | jq -r '.features|keys|join("\n")'); do
        echo building with feature "$feature"
        echo cross test $target_arg --no-default-features --features "$feature"
        cross test $target_arg --no-default-features --features "$feature"
    done
}

main "$@"

log4rs-1.3.0/tests/color_control.rs

use std::process::Command;

fn execute_test(env_key: &str, env_val: &str) {
    let mut child_proc = Command::new("cargo")
        .args(&["run", "--example", "compile_time_config"])
        .env(env_key, env_val)
        .spawn()
        .expect("Cargo command failed to start");

    let ecode = child_proc.wait().expect("failed to wait on child");
    assert!(ecode.success());
}

// Maintained as a single test to avoid concurrent invocations blocking on the
// package cache
#[test]
fn test_no_color() {
    let keys = vec!["NO_COLOR", "CLICOLOR_FORCE", "CLICOLOR"];

    for key in keys {
        execute_test(key, "1");
        execute_test(key, "0");
    }
}
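// A hedged sketch (an addition, not part of the shipped suite): the same
// helper can exercise a single control in isolation. It is `#[ignore]`d so the
// default run stays serial; each call spawns `cargo run` and would otherwise
// contend for the package cache with `test_no_color` above.
#[test]
#[ignore]
fn test_clicolor_force_only() {
    execute_test("CLICOLOR_FORCE", "1");
}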