# Changelog
All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/)
and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html),
specifically the [variant used by Rust](http://doc.crates.io/manifest.html#the-version-field).

## [0.23.5] - 2025-02-17

### Fixed
- Properly validate `gst::IntRange::with_step()` step size
- Fix `gst::Buffer` serde serialization
- Forward gap events by default in `gst_utils::StreamProducer`.
- Correctly account for alternate interlace mode in `gst_video::VideoMeta::add_full()`.
- Return `Result`s instead of `bool`s in new `gst_play` API.

### Added
- Support for `TracerImpl::USE_STRUCTURE_PARAMS` with GStreamer < 1.26.
- Bindings for `gst_analytics::ODMtd`.
- Bindings for `TopSurroundRight` and `TopSurroundLeft` audio channels.
- Bindings for AV1 and H266 codec helpers API.
- Bindings for `gst_audio::reorder_channels_with_reorder_map()`.

### Changed
- Updated GStreamer gir files for latest 1.26 APIs.
- Documentation links URL was updated.

## [0.23.4] - 2024-12-21

### Fixed
- `gst_video::VideoFrame::plane_data()` does not return a truncated buffer for the alpha plane in A420 and similar formats anymore.
- `FnMut` closures are now correctly passed via a mutable reference to FFI code.
- Order of arguments in `gst_video::VideoFormat::from_mask()` was corrected.

### Added
- Bindings for 1.26 `gst_analytics::Tensor` API.
- `gst::DebugCategory::as_ptr()` and `Hash` impl, `gst::DebugMessage::as_ptr()`.
- Support for hex-dumping `&mut [u8]` in addition to `&[u8]`, `gst::Buffer`, etc.
- Functions to work with meta `glib::Type`s.

### Changed
- Updated GStreamer gir files for latest 1.26 APIs.

## [0.23.3] - 2024-11-01

### Fixed
- Bind `gst::Pad::proxy_query_caps()` to the correct C function.
- Update `gst_utils::StreamProducer` appsrc latency upon appsink latency event.
- Fix type of `gst_app::AppSinkBuilder::processing_deadline()`.

### Added
- Various new `gst::Iterator` constructors for convenience.

### Changed
- Updated GStreamer gir files for latest 1.26 APIs.

## [0.23.2] - 2024-09-28

### Fixed
- Lifetime of `gst::TagList::index()` return value is correctly bound to `&self` now.
- Don't assume `gst::Structure` name / field names have `'static` lifetime.
- Set pad probe data to `NULL` if `HANDLED` is returned and the item is an event, buffer or buffer list.
- Don't unnecessarily add `#[link]` attribute to the `extern "C"` sections to allow linking against gstreamer-full and make static linking easier.

### Changed
- Add `#[must_use]` to `gst_video::VideoTimeCode::add_interval()`.

### Added
- Add API to take events/buffers from a `gst::PadProbeInfo` (see the probe sketch below).
- Add `gst::EventViewMut` and `gst::Event::view_mut()`, and a few setters for event fields.
- Add `gst::MessageViewMut` and `gst::Message::view_mut()`, and a few setters for message fields.

## [0.23.1] - 2024-08-27

### Fixed
- Support `gst_utils::StreamProducer` API on platforms without 64 bit atomics.
- Fix off-by-one in `gst::BufferList::remove()` range end handling.
- Pass an empty tag array instead of NULL in `gst::CustomMeta::register_simple()`.
- Fix various new clippy warnings.
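
The 0.23.2 entry above about taking events/buffers from a `gst::PadProbeInfo` builds on the regular pad probe API. A minimal sketch of a buffer probe that only inspects the flowing data (returning `Handled` instead of `Ok` would clear the probe data as described above); the pad is whatever pad the application chooses:

```rust
use gstreamer as gst;
use gst::prelude::*;

fn install_buffer_probe(pad: &gst::Pad) {
    // Inspect every buffer flowing over this pad without modifying or consuming it.
    let _probe_id = pad.add_probe(gst::PadProbeType::BUFFER, |_pad, info| {
        if let Some(gst::PadProbeData::Buffer(ref buffer)) = info.data {
            println!("buffer: {} bytes, pts {:?}", buffer.size(), buffer.pts());
        }
        gst::PadProbeReturn::Ok
    });
}
```
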
### Added
- Add getters for `gst::format::Percent`.

## [0.23.0] - 2024-07-11

### Changed
- Compatible with gtk-rs-core 0.20 / gtk4-rs 0.9.
- Update GStreamer gir files to latest (upcoming) 1.26 APIs.
- Minimum supported Rust version is updated from 1.70 to 1.71.1.
- Move `gst::Meta` tags into separate modules and improve API around them.
- Improve `gst::Meta` transform functions and make the API more generic, and as part of that add support for the video meta transform.
- Pass an immutable instead of mutable output buffer reference to `gst_rtp::RtpHeaderExtension::write()` function.
- Make `gst_net::PtpClock::new()` constructor fallible.
- Change `gst_rtsp_server::RTSPToken` API to be consistent with `gst::Structure`, specifically add a builder.
- Change `from_glib_ptr_borrow()` functions to work with references instead of raw pointers for improved safety.
- Improve code generation when building with `panic=abort`.
- Change `gst::BufferList` APIs to work with ranges instead of index+length.
- Use various `usize` instead of `u32` for various indices in `gst::Buffer`, `gst::Caps`, `gst::Structure` and related APIs.
- `gst::Clock` calibration-related API uses plain `u64` instead of `gst::ClockTime` for the clock rate.
- `gst::debug!` and related macros use `obj = x` instead of `obj: x` for specifying the target object now (see the sketch below). Similar for `imp` and `id`. The old syntax is still supported but deprecated. The new syntax works better with tooling, especially rustfmt.

### Added
- Mutable access to the `gst_webrtc::WebRTCSessionDescription` fields.
- `gst::StructureBuilder::field_if_some()` and the same for related builders to only set a value if `Some(x)` is provided.
- `gst::StructureBuilder::field_from_iter()` and `field_if_not_empty()` for various builders.
- `gst::PadBuilder` API for selecting an automatically generated name.
- Adapter for the `log` crate around the GStreamer debug log system. This allows the macros from the `log` crate to be output via the GStreamer debug log system.
- Bindings for the double click `gst_video::Navigation` event.
- Bindings for `gst_pbutils` missing/install plugins API.
- Setters for `gst_editing_services::FrameCompositionMeta`.
- `ges::UriClipAsset::new()`.

## [0.22.6] - 2024-06-19

### Fixed
- Logging with an id and a formatted log message no longer panics.
- A couple of clippy warnings.

## [0.22.5] - 2024-05-23

### Fixed
- A couple of clippy warnings and compiler warnings about unused imports with latest rustc.
- Memory leak in builder for the `SelectStreams` event.
- Add parameter validity assertions to various `BufferList` and `Caps` APIs where these assertions were missing to avoid assertions in C.

### Added
- `StreamProducer::set_forward_preroll()` API to configure whether the preroll buffer should be directly forwarded or not yet.

### Changed
- Remove nonsensical gstreamer-video test that fails with latest GStreamer main.
- Update to itertools 0.13.

## [0.22.4] - 2024-04-08

### Added
- Implement `From` / `ToValue` for `gst_audio::AudioConverterConfig` and `gst_video::VideoConverterConfig`.

### Changed
- Fixed various 1.77 clippy warnings.

## [0.22.3] - 2024-03-19

### Changed
- Change `ges::CompositionMeta` position fields to `f64`s in correspondence with the C API.
- Change `gst_analytics::AnalyticsMtdRef::obj_type()` to an `Option` in correspondence with the C API.

### Added
- `gst::Fraction::new_raw()` and `from_integer()` const constructors.

## [0.22.2] - 2024-02-26

### Changed
- Update GStreamer gir files and add more new 1.24 API.
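
A minimal sketch of the 0.23.0 logging-macro syntax change noted above; the category name and message are arbitrary examples:

```rust
use gstreamer as gst;
use gst::prelude::*;

fn log_for(element: &gst::Element) {
    let cat = gst::DebugCategory::new(
        "example-cat",
        gst::DebugColorFlags::empty(),
        Some("example category"),
    );
    // New syntax (0.23): the target object is given as `obj = ...`.
    gst::debug!(cat, obj = element, "handling element {}", element.name());
    // The old `obj: element` form still compiles but is deprecated.
}
```
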
### Fixed
- Add `gst::Object` as parent class for various `gst_rtp` types.
- Handle all already queued messages in `gst::BusStream` instead of just new messages.

### Added
- Add `gst::CustomMeta::is_registered()`.

## [0.22.1] - 2024-02-13

### Changed
- Update GStreamer gir files and add more new 1.24 API.

### Fixed
- Make `AnalyticsODLocation` struct fields public.
- `MetaRefMut::upcast_mut()` returns a mutable reference now.

## [0.22.0] - 2024-02-08

### Changed
- Compatible with gtk-rs-core 0.19 / gtk4-rs 0.8.
- Update GStreamer gir files to latest (upcoming) 1.24 APIs.
- Various standalone functions were moved to separate modules or methods.
- `gst::Rank` is not implemented as an enum but as a struct with associated constants now.
- Optimized `gst::Buffer::from_slice()` and `Memory::from_slice()` implementations that have one heap allocation fewer.
- Various `gst::Buffer` and `gst::Memory` functions take ranges now instead of offset/size parameters.

### Added
- Bindings for `gst_gl::GLContext::thread_add()`, `GLFrameBuffer::draw_to_texture()`.
- New `gst_gl::GLVideoFrame` type that replaces `gst_video::VideoFrame` for GL-specific API, and comes with mostly the same interface.
- Basic gstreamer-tag bindings.
- `gst::Buffer::dump()` and `dump_range()` together with the same API on `gst::Memory` for hex-dumping the whole buffer/memory content (see the sketch below).
- Implement `Clone` on `gst::MetaRef`.
- Bindings for `gst::Buffer::map_range_readable()` and its writable variant.
- Array-based accessor for `gst_video::VideoFrame` and `gst_audio::AudioBuffer` plane data.
- Support for handling custom authentication in `gstreamer-rtsp-server`.
- Accessors for various base class struct fields.
- Owned buffer getter for `AudioBuffer` / `VideoFrame`.
- `gst_rtp::RTPSourceMeta` bindings.
- `gst::macos_main()` bindings.
- gstreamer-analytics bindings.

### Fixed
- API typo in owned `gst::ReferenceTimestampMeta` reference getter.
- Allow variable expansion in `gst::loggable_error!` macro.
- `gstreamer-gl-*` crates can build docs again on stable.

### Removed
- `gst::Pad::caps()` property getter. Use `current_caps()` instead which does the same thing.
- Various APIs that were deprecated in previous releases.
- Getter for a mutable buffer reference from `AudioBuffer` / `VideoFrame` as that allowed invalidating the buffer map.

## [0.21.3] - 2023-12-18

### Added
- Update GStreamer gir files to latest (upcoming) 1.24 APIs.
- Add an example for writing subclasses with virtual methods.
- Add `gst::ClockTime::absdiff()` and same for similar types.

### Fixed
- In `Play` example, set bus to flushing before dropping `Play` instance.
- Add missing `docsrs` configuration for correct documentation generation.
- Make `gst_pbutils::element_properties` module public.
- Add missing `gst_audio::AudioFilterImpl::parent_allowed_caps()`.
- Fix assertions in `gst::Memory` copy/share/resize functions.

### Changed
- Update to itertools 0.12, pretty-hex 0.4.

## [0.21.2] - 2023-11-11

### Changed
- Update GStreamer gir files to latest (upcoming) 1.24 APIs.
- Update to latest gir code generator from the gtk-rs 0.18 branch.

### Fixed
- Big endian video format order is correct again.
- `gst::MetaRef::has_tags()` and `tags()` API now actually works and is based on the tags of the concrete meta instance.
- `gst::MetaRef::tags()` returns strings with arbitrary lifetimes now because they're statically stored anyway.
- Fix another potential deadlock in `gst_utils::StreamProducer` when sending force-keyunit events.
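
To illustrate the 0.22.0 `gst::Buffer::from_slice()` and hex-dump additions above, a small sketch; it assumes the value returned by `dump()` is printable via its `Debug` output:

```rust
use gstreamer as gst;

fn dump_buffer() {
    gst::init().unwrap();
    // Wraps the data without copying; since 0.22 this needs one heap allocation fewer.
    let buffer = gst::Buffer::from_slice([0u8, 1, 2, 3, 0xff]);
    // Hex-dump the whole buffer content; `dump_range()` limits the output to a range.
    println!("{:?}", buffer.dump());
}
```
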
### Added
- Bindings for `gst_video::VBIEncoder` and `VBIParser`.
- Accessors for the different `gst::PadProbeData` types on `PadProbeInfo`.
- `Default` impl for `gst::AllocationParams`.
- `From` / `TryFrom` implementations between formatted types (e.g. `gst::Bytes`) and `usize`.
- `gst::MetaRef::copy()` to copy metas from one buffer to another.
- `gst::ElementImpl::catch_panic_future()` to wrap a `Future` in such a way that panics are converted to GStreamer error messages and the element is marked as unusable.
- `gst_gl::GLDisplay::handle()` to get a raw display handle.

## [0.21.1] - 2023-10-04

### Changed
- Update GStreamer gir files to latest (upcoming) 1.24 APIs.

### Fixed
- Use correct media links in the tutorials code.
- Fix a couple of new 1.72/1.73 clippy warnings.
- Fix description of gstreamer-validate crate.
- Copyright/license files for the gstreamer-gl crates were added.
- Ordering of raw video formats follows the rules of latest libgstvideo now.
- Fix potential deadlock in `gst_utils::StreamProducer` when sending force-keyunit events.

### Added
- `max-time` / `max-bytes` setters to `gst_app::AppSink` builder.
- `gst::CustomMeta::register_simple()`.

## [0.21.0] - 2023-08-08

### Changed
- Minimum supported Rust version is updated to 1.70.0.
- Compatible with gtk-rs-core 0.18.
- `gst::Bin::add_many()`, `remove_many()` and `gst::Element::link_many()`, `unlink_many()` are more generic now.
- `gst_base::Aggregator::src_pad()` returns an `AggregatorPad`.
- `gst::Bus::add_watch()` now returns a guard value that automatically removes the watch when it goes out of scope.
- `gst::Bin`, `Pipeline` and `Pad` constructors don't take the optional name parameter anymore but it can instead be provided via the builder API (see the sketch below).
- `gst::Pad` and `GhostPad` builders inherit name from the pad template (or target) if possible and no other name is provided explicitly.
- The preroll samples and selected sticky events are forwarded to `StreamProducer` consumers.

### Added
- Support for the upcoming GStreamer 1.24 APIs.
- Support for inline variable names in format strings for error/warning/info messages.
- Methods for converting between floating point seconds and `gst::ClockTime`.
- Various additions to the gst-validate bindings.
- `Display` implementations for error/warning/info messages.
- More useful `Debug` implementations for messages, events and queries and `gst_pbutils::DiscovererInfo` related structs.
- API for listing/checking `gst::Meta` tags.

## [0.20.7] - 2023-07-05

### Fixed
- Fix `wait-for-eos` property name string in `appsink`.
- Fix various memory leaks in `BaseTransform` subclassing bindings.
- Mark some GES APIs as `Send+Sync`.

### Added
- Implement `DiscovererInfo::debug()` and the same on related structs.
- Add subclassing bindings for `GESFormatter`.

## [0.20.6] - 2023-06-06

### Added
- Getter for the `gst_rtsp_server::RTSPContext` URI field.

### Fixed
- `gst_pbutils::DiscovererStreamInfo::stream_id()` can return `NULL`. This is mapped to the empty string for this release to keep backwards compatibility.
- `gst_pbutils::DiscovererStreamInfo` iterator methods can be called on any subclass directly now without casting.
- Debug logs use the actual function name again instead of the name of a closure generated by the log macros.

### Changed
- Minor performance improvements to debug logging.

## [0.20.5] - 2023-04-22

### Added
- `glib::HasParamSpec` impl for miniobjects to allow using them with the properties derive macro.
- `Default` impl for `gst_player::Player`.
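
The 0.21.0 entries above move element and pipeline naming to builder APIs and make `add_many()` / `link_many()` generic. A minimal sketch, using `videotestsrc` and `fakesink` purely as example factory names:

```rust
use gstreamer as gst;
use gst::prelude::*;

fn build_pipeline() -> Result<gst::Pipeline, Box<dyn std::error::Error>> {
    gst::init()?;
    // 0.21: names go through the builder instead of an Option<&str> constructor argument.
    let pipeline = gst::Pipeline::builder().name("example-pipeline").build();
    let src = gst::ElementFactory::make("videotestsrc").build()?;
    let sink = gst::ElementFactory::make("fakesink").build()?;
    // add_many()/link_many() accept any iterator of element references since 0.21.
    pipeline.add_many([&src, &sink])?;
    gst::Element::link_many([&src, &sink])?;
    Ok(pipeline)
}
```
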
## [0.20.4] - 2023-04-07 ### Fixed - Work around `gst_webrtc::WebRTCICE::add_candidate()` API breakage in 1.24. ### Changed - Reduce size of `gst_audio::AudioBuffer` and `gst_video::VideoFrame` by a factor of two by not storing an unnecessary copy of the audio/video info. ## [0.20.3] - 2023-03-14 ### Fixed - `gst::ParamSpecArray` uses the correct `glib::Type` now. - Work around accidental ABI breakage in 1.18 gst-rtsp-server `GstRTSPClient`. ### Added - Document `gst_utils::StreamProducer::forward_eos()` default value. ## [0.20.2] - 2023-02-21 ### Added - `glib::HasParamSpec` impl for `gst::ClockTime` - `Default` impl for `gst_play::Play` - Constructors for non-raw `gst_audio::AudioCapsBuilder` / `gst_video::VideoCapsBuilder` ## [0.20.1] - 2023-02-13 ### Fixed - Fix memory leaks when converting a `gst_audio::AudioBuffer` or `gst_video::VideoFrame` to a `gst::Buffer` or FFI type. ## [0.20.0] - 2023-02-10 ### Fixed - Make `gst_gL::GLDisplay::create_context()` `other_context` parameter optional. - Make allocation query caps optional. ### Added - Conversions between `gst::Signed` and `T` and signed integer types. - Bindings for the object lock via `gst::Object::lock()`. - Various `FromIterator`, `Extend` and `From` impls for creating `Caps`, `Structure`, `Buffer`, `BufferList`, `CapsFeatures` and other types. - `PartialEq` impls between owned/borrowed miniobjects/structures. - API for appending items to `gst::Array` and `gst::List`. ### Changed - Compatible with the 0.17 gtk-rs release. - Updated minimum supported Rust version to 1.64. - Require GStreamer 1.22.0 or newer when enabling the `v1_22` feature. - Require the object lock to be taken for various `gst_gl::GLDisplay` methods. - Renamed `gst::TagSetter::add()` to `add_tags()` to avoid name conflict with `Bin::add()`. - Mark various un-extendable enums as exhaustive. - Make use of `glib::GStr` and related API in caps, structure, tags and logging API to reduce temporary string allocations. - Various code optimizations to reduce generated code size and allow more optimal code to be generated. - Reduce size of various types, including reduction of `gst_audio::AudioInfo` from 832 to 320 bytes. - Use actual function name instead of module name in log output. - Change `gst_utils::StreamProducer` API to forward buffers by default and allow temporarily discarding via new `set_discard()` function. ## [0.19.8] - 2023-02-09 ### Changed - Update GStreamer .gir files to 1.22.0 release. ### Fixed - Marked `gst::MessageType` as non-exhaustive. ### Added - Added bindings for `gst::Message::structure_mut()`. - Added subclassing support for `gst_allocators::FdAllocator` and `DmabufAllocator`. ## [0.19.7] - 2023-01-19 ### Fixed - Work around the possibility that the caps in the allocation query can be `NULL` by returning any caps for now. This will be handled properly with a minimal API change in the 0.20 release. ## [0.19.6] - 2023-01-18 ### Fixed - The `AppSrc` and `AppSink` builders now assert that GStreamer is initialized before creating an instance. ## [0.19.5] - 2022-12-27 ### Fixed - Clear video frame values when mapping as GL texture to avoid out of bounds reads when trying to access the GL texture as raw video frame. - Allow returning `Handled` from `BufferList` pad probes. ### Changed - Update GStreamer .gir files to latest 1.21 git. ## [0.19.4] - 2022-12-16 ### Added - Subclassing bindings for `gst_audio::AudioFilter`. ### Fixed - Various new clippy warnings. ### Changed - Update GStreamer .gir files to 1.21.3 release. 
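
As a sketch of the 0.20.0 `FromIterator` additions mentioned above, assuming the `FromIterator<gst::Structure>` impl on `gst::Caps` (field names and values are arbitrary):

```rust
use gstreamer as gst;

fn caps_from_structures() -> gst::Caps {
    gst::init().unwrap();
    let rgba = gst::Structure::builder("video/x-raw")
        .field("format", "RGBA")
        .field("width", 320i32)
        .field("height", 240i32)
        .build();
    let i420 = gst::Structure::builder("video/x-raw")
        .field("format", "I420")
        .build();
    // Collect the structures directly into caps via the FromIterator impl.
    [rgba, i420].into_iter().collect::<gst::Caps>()
}
```
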
## [0.19.3] - 2022-11-28 ### Added - `FromIterator` and `Extend` for `Caps`. - `PartialEq` impls between owned/borrowed miniobjects/structures. ### Fixed - Sticky event ordering for instant-rate-change. ### Changed - Updated GStreamer .gir files to post 1.22.2 release. ## [0.19.2] - 2022-11-13 ### Added - Subclassing support for `gst::Allocator`. - `gst_gl::GLBaseMemory::context()` to retrieve the GL context used by the memory. ### Changed - Updated GStreamer .gir files to 1.22.2 release. ### Fixed - `gst::Allocator::register()` does not cause use-after free with GStreamer < 1.20.5 anymore. - Don't generate version constants in `gstreamer-editing-services-sys` as they are useless and change with every update. ### Changed - Fixed various new clippy warnings. ## [0.19.1] - 2022-10-24 ### Changed - Make it possible to use objects not just as reference in the logging macros. ## [0.19.0] - 2022-10-22 ### Added - Builders for element construction. `gst::ElementFactory::make()` returns a builder now that allows to easily set the name or any other property at construction time. The old API is available as `make_with_name()`. - Builders for `Bin` and `Pipeline` as well as a `Default` trait implementation to simplify object construction. - Builders for `appsrc` and `appsink`, which allow type-safe construction of both elements while also allowing to easily set all their properties at construction time. - Builders for the GStreamer-specific fraction/array param/property specs. - Infrastructure for casting between `gst::Memory` subtypes/supertypes, and make use of it for GL memory. - Bindings for the `gstreamer-allocator` library with support for file descriptor-based and DMABUF memory. - Complete bindings for `gst_video` `Navigation` events. - Constructors for error/warning/info messages with a pre-built `glib::Error`. This also leads to some minor simplification of the existing API. - Accessors for static pads of various base classes for making accessing them cheaper and less error-prone than getting them by name. - Builder for pad templates. - Static PTP clock API for statistics, initialization and deinitialization. - New `gstreamer-utils` crate that currently contains only a `StreamProducer` API. This allows building 1:N bridges between live pipelines via `appsink` / `appsrc` elements. - Bindings for the new `gstreamer-play` library that was added in 1.20. - `gst::Caps::new_empty_simple()` to create caps without fields and just a name. - `gst_audio::AudioCapsBuilder` and `gst_video::VideoCapsBuilder` for building (possibly) unfixed raw audio/videos caps with typed setters for the various fields. This makes it impossible to mix up types and e.g. use an `u32` instead of an `i32` for the width of video caps. - `gst::Buffer::ptr_eq()` to compare buffers by pointer instead of doing a deep comparison, and also `ptr_eq()` on all other miniobject types. - Accessors for `gst_webrtc::WebRTCICECandidateStats` fields. - Bindings for the `gstreamer-validate` API. - Subclassing bindings for `gst_audio::AudioVisualizer` base class for easily writing audio visualization elements. - `gst_pbutils::EncodingProfile` API for element properties. - Support for returning buffer lists from `BaseSrc` / `PushSrc` subclasses. - Support for implementing `gst::Bin::do_latency()`. - Minimal bindings for the `gstreamer-mpegts` library. ### Fixed - Signature for `gst_base::Aggregator::connect_samples_selected()` to remove unnecessary generic parameter and make it straightforward to use. 
- Various APIs had optional parameters/return types corrected to match the C API more closely. - Logging does not evaluate its arguments anymore if the debug category is not enabled or below the current threshold. - Registering custom metas is now possible without transform function. - `gst::subclass::ElementImpl::request_new_pad()` signature uses a `&str` instead of an owned `String` now. ### Removed - `fragile` dependency and instead use the same functionality from `glib`. - `gst_audio::AudioAggregator` `ignore_inactive_pads` property, which was duplicated from the `Aggregator` base class. ### Changed - Compatible with the 0.16 gtk-rs release. - Updated minimum supported GStreamer version from 1.8 to 1.14. - Updated to the latest GStreamer 1.22 APIs while still supporting up to GStreamer 1.14. Any new 1.22 APIs might still change until the stable 1.22 release. - Updated minimum supported Rust version to 1.63. - In `EventView` / `QueryView`, getters that return references now return references that can outlive the view and are only bound by the lifetime of the corresponding event/query. - In addition `Query`, `Event` and `Message` views are implemented more consistently now, which makes them easier to use and as a side effect allows to pass e.g. more strongly typed queries to functions that only accept a single query type. - Various improvements to `gst::ClockTime`, `gst::format::Bytes`, `gst::format::Signed` and related types and their usage in the API, which should make its use from applications easier and less error-prone. Check the `gst::format` module-level documentation for details. - `gst::StreamsSelected` event builder takes the selected streams as iterator instead of slice. - For consistency with other macros the `gst` prefix of the logging macros was also removed. - Various iterator implementations were added and the existing ones were optimized by implementing more specialized traits and custom implementations for a couple of iterator functions. - GStreamer initialization safety checks were optimized. - `gst::Bus::post()` takes ownership of the passed messages like the C API. - Better and easier to read `Debug` impls for `Caps`, `TagList`, `Structure` and `Promise`. - `ser_de` feature was renamed to `serde`. - `gst::Tracer` implementations get result enums passed as `Result`s now instead of single enums. - `gst::Pad`, `ProxyPad`, `GhostPad` default functions are all associated functions instead of methods now to avoid conflicts between multiple types with the same method. - `Pad` tasks are catching panics from the task function and if the parent of the pad is an element then the panic is converted into an error message and the task is simply stopped. Otherwise the panic is rethrown. ## [0.18.8] - 2022-04-26 ### Added - Bindings for `RTPBasePayload` and `RTPBaseDepayload`. - Accessors for `RTPBuffer` buffer. - Bindings for `RTPBuffer` length calculation API. - More complete `gst::Task` bindings. ### Fixed - Export `gst::subclass::TaskPoolFunction`. ## [0.18.7] - 2022-04-04 ### Added - Bindings for `VideoAggregator` and the `VideoAggregatorPad`s. - Bindings for `AudioAggregator` and the `AudioAggregatorPad`s. - Bindings for `TaskPool`. - Various helper functions for `VideoFormatInfo`, `VideoInfo` and `VideoFrame`. ## [0.18.6] - 2022-03-08 ### Fixed - Require `Send` and not `Sync` for the values of an `gst::Array` / `gst::List`. 
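
A small sketch of the `gst::ClockTime` conventions referred to in the 0.19.0 notes above, where a missing time is expressed as `Option<ClockTime>` rather than a sentinel value (the values are arbitrary):

```rust
use gstreamer as gst;

fn clock_time_math() {
    // A gst::ClockTime is always a defined time; "no time" is an Option.
    let a = gst::ClockTime::from_seconds(2);
    let b = gst::ClockTime::from_mseconds(500);

    let sum = a + b;                               // panics on overflow
    let diff = a.checked_sub(b);                   // Option<ClockTime>, None on underflow
    let none: Option<gst::ClockTime> = gst::ClockTime::NONE;

    println!("{sum} {diff:?} {none:?}");
}
```
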
### Changed - Simplify and speed up log message string construction ## [0.18.5] - 2022-02-20 ### Changed - Require GStreamer 1.20.0 at least when building with `v1_20`. Earlier versions were already going to fail due to API mismatches before. ### Added - `gst::BufferPool` subclassing support. - `Debug` impl for `gst::MiniObject`. - `gst_rtsp_server::RTSPOnvifServer` and related API, including subclassing support. ### Fixed - Handle empty slices correctly at the FFI layer. - `MiniObjectRef::downcast_ref()` and similar functions return the correct type now. While this is an API change, the previous API would've never worked. ## [0.18.4] - 2022-02-04 ### Changed - Update gir files to GStreamer 1.20.0 release. ### Added - `gst_video::VideoCodecFrame::input_buffer_owned()` for getting an owned reference. ### Fixed - All documentation links in the `README.md`s are correct again. ## [0.18.3] - 2022-01-31 ### Added - `Default` implementation for `gst_video::VideoOverlayComposition` when targeting GStreamer 1.20. - `gst_video::VideoOverlayComposition::add_rectangle()` in addition to the addition of all rectangles via an iterator during construction. - Subclassing support for `gst_rtp::RTPHeaderExtension`. - `gst_webrtc::WebRTCError` for programmatically handling WebRTC errors. ### Fixed - `gst_rtp::RTPHeaderExtension` has `gst::Element` set as parent class now. - Global functions are re-exported from the `gst_rtp` crate root. ### Changed - GIO-style async operations in GES no longer need `Send`-able closures. ### Removed - `fragile` is no longer a dependency and instead the corresponding GLib API is used. ## [0.18.2] - 2022-01-24 ### Added - `glib::FromValue` for mini object references. - Bindings for `gst::DebugCategory::get_line()`. ## [0.18.1] - 2022-01-18 ### Fixed - `Message::view()` also handles the redirect message now. - `Message` and `Query` view variants that return references now borrow again from the underlying query and not the view enum, allowing to use them in a wider scope. ### Changed - All miniobjects, `VideoTimeCode`, `Structure` and `CapsFeatures` are marked as `#[repr(transparent)]` now to ensure that their memory representation is exactly the underlying raw pointer. ## [0.18.0] - 2022-01-16 ### Added - `gst_rtp::RtpHeaderExtension::read()` and `write()`. - `gst::ElementMetadata` has a `const` constructor now. - `gst_rtp::RtpBuffer` API works on buffer references instead of plain buffers for statically enforcing writability and usage in more places. - `gst_video::VideoCodecAlphaMeta` and `gst::CustomMeta`. - `gst::MiniObject` for generically passing around mini objects instead of their concrete types. - `gst_app::AppSink` `new-event` callback and `pull_object()` function. - `gst_pbutils::PbUtilsCapsDescriptionFlags` and `pb_utils_get_caps_description_flags()`. - `gst_rtp::RtpBuffer::remove_extension_data()`. - `gst_video::VideoDecoder` subframe API. - `gst_webrtc::WebRTCSCTPTransport`. - `gst::ElementFactory` `create_with_properties()` / `make_with_properties()`. - `gst_video::VideoContentLightLevel` and `VideoMasteringDisplayInfo` for HDR signalling. - Lots of missing `GES` API. - `gst::AllocationParams` and support in the allocation query. - `propose_allocation()` and `decide_allocation()` support in the various base classes. - `Iterator` implementation for `gst_video::VideoOverlayComposition`. - `Extend`, `IntoIterator` and `FromIterator` implementations for `Buffer`, `Caps`, `BufferList`, `CapsFeatures`, `StreamCollection` and `Structure` for more natural Rust APIs. 
- `instant-rate-change` events/messages bindings. - Support for arithmetic operations on `Option` and related types. - `gst_video::ColorBalance`. - `gst::MetaFlags`. - `gst_base::Aggregator::set_position()`. - Convenience getters for `gst::ElementFactory` and `gst::DeviceProviderFactory` metadata. - `gst_rtp::RtpBuffer::set_padding()`, `get_padding()` and `payload_mut()`. - `#[must_use]` to many types and functions. - `gst::Event`, `gst::Message` and `gst::Structure` `has_name()`. - `gst_video::Navigation` subclassing support and API improvements. - `gst::Structure` and `gst::Caps` `foreach()`, `map_in_place()` and `filter_map_in_place()`. - `gst_gl::GLBufferPool` and various GL constants and functions. - `gst_pbutils` codec utils APIs. ### Fixed - `gst_base::BaseTransform::prepare_output_buffer()` correctly reflects buffer writability. ### Changed - Compatible with the 0.15 gtk-rs release. - Updated to the latest GStreamer 1.20 APIs while still supporting up to GStreamer 1.8. Any new 1.20 APIs might still change until the stable 1.20 release. - Update all code to the Rust 2021 edition. This causes no user-facing changes. - `gst::Sample::segment()` returns a reference instead of a copy of the segment. - `gst::Object::set_property_from_str()` returns a `Result` now instead of silently failing like the C version. - Allow handling passed in buffers in `gst_base::PushSrc::create`. - Allow passing in `None` in `gst_player::Player::set_uri()`. - Use `[[f32; 4]; 4]` instead of `[f32; 16]` for affine transformation matrix. - `gst::Pad::sticky_event()` statically gets the event of the requested type instead of requiring to match on it afterwards. - Clean up `gst_pbutils` `EncodingProfile` API to be harder to misuse and less confusing. - Various `gst::Array`, `gst::List`, `gst::IntRange` and `gst::Fraction` API improvements that should reduce some friction. - Directly generate `NUL`-terminated C strings in debug log API instead of having multiple allocations per message. - Various functions return `glib::SList` and `glib::List` now to avoid copying into a `Vec` if only iteration is needed. - `gst::ChildProxy` API is more consistent with object property API. - Improved `gst::Buffer::foreach()`, `gst::Pad::sticky_events_foreach()` and `gst::BufferList::foreach()` APIs. - Don't post error messages from `propose_allocation()` and `decide_allocation()`. ## [0.17.4] - 2021-09-13 ### Added - Add constructor for device provider metadata. ## [0.17.3] - 2021-08-23 ### Fixed - `gst::Value::deserialize()` takes the target type as parameter now. This is technically an API change but the function would've never worked previously. ### Added - The release date-time parameter to `gst::plugin_define!` is optional now like in the C version of the macro. - Bindings to `gst::Tracer` and `gst::TracerFactory` for allowing to implement custom tracers in Rust. - Bindings for the new `gst::Value::deserialize_with_psec()` function from GStreamer 1.20. - serde `Serialize`/`Deserialize` impls for `gst::PadDirection`, `gst::PadPresence`, `gst::URIType` and `gst::Rank`. ## [0.17.2] - 2021-08-05 ### Fixed - Various new clippy warnings. - Compilation of `gstreamer-audio` on big-endian platforms. ### Added - Support for 1.20 `Gap` event `GapFlags`. - Support for 1.20 `Structure::serialize()` / `Caps::serialize()`. ## [0.17.1] - 2021-07-13 ### Fixed - Store 1.19 as plugin version when building plugins with `v1_20`. Otherwise plugins fail to load with GStreamer versions below 1.20.0. 
- Fix documentation for `gst::Element::request_pad_simple()` to actually show up. ## [0.17.0] - 2021-06-28 ### Fixed - Use `#[repr(transparent)]` where it is more correct and remove unneeded `#[repr(C)]` annotations. - Don't provide direct access to the logged object in logging functions as the object might currently be finalized and might be unsafe to access. - Moved X11/EGL/Wayland-specific GL APIs into their own crates instead of having them inside gstreamer-gl and behind feature flags. This simplifies conditional usage of them in applications. - Various nullability issues: parameters and return values that should've been or shouldn't have been nullable were fixed. - Print source object correctly in `gst::Message` `Debug` impl. - `gst_rtsp_server::RTSPServer::attach()` is fallible. - `gst::ElementFactoryListType` is a proper bitflags type now instead of generic `u64`. - `gst::PluginFeature::load()` returns the same type as the one passed in. - Value returned by `gst::PromiseFuture` can no longer be freed while still in scope. - Only assign to `GError**`s in subclassing code if they're not `NULL`. ### Added - Bindings for the GStreamer Controller library and the corresponding core API. - Subclassing support for `gst_player::PlayerVideoRenderer`. - `gst::PARAM_FLAG_CONTROLLABLE` and related bindings. - `gst_video::VideoOrientation` and `VideoOrientationMethod` bindings. - Support for removing pad probes from inside the pad probe callback. - `gst_check::Harness::pull_until_eos()` bindings. - `ges::TransitionClip` and `OperationClip`. - Bindings for `gst_gl::GLMemory` and related APIs. - Subclassing support for `gst_gl::GLFilter` and `gst_gl::BaseSrc`. - `gst::TagList::remove()`. - `gst::CapsFeatures` and `gst::Structure` API based on `glib::Quark`s instead of strings. - Subclassing support for `gst_video::VideoFilter`. - Bindings for various new 1.20 APIs: `gst_app::LeakyType`, `gst_video::VideoDecoderRequestSyncPointFlags`, `gst_rtp::RTPHeaderExtension`, `gst_audio::AudioLevelMeta`, `gst_webrtc::WebRTCKind` and various other new flags/enum types. - Subclassing support for `gst_rtsp_server::RTSPMountPoints`. ### Removed - Deprecated APIs in 0.16. - Don't declare that `gst_app::AppSink` and `AppSrc` inherit from `gst_base::BaseSink` and `BaseSrc` to avoid exposing API that is meant for subclasses to applications. - `gst_app::AppSrc` and `AppSink` signals that are also covered by the callbacks. The callbacks are more flexible and have lower overhead. - Duplicated getters/setters for `gst_base::BaseSink` and `BaseTransform` properties. ### Changed - Compatible with the 0.14 gtk-rs release. - Updated to the new GStreamer 1.20 APIs while still supporting up to GStreamer 1.8. Any new 1.20 APIs might still change until the stable 1.20 release. - FFI and safe high-level bindings are in the same repository now and use the same version numbers. - The .gir files are shared with gtk-rs and the GStreamer-specific ones are in a separate git submodule. - Update all code to the Rust 2018 edition. As part of this, most macros lost their `gst_` prefix. - Re-export dependency crates from the different preludes. - Getter functions don't have a `get_` prefix anymore and GObject property accessors don't include the `_property_` part in the middle of their function names anymore. Applications developers should use [`fix-getters-calls`](https://crates.io/crates/fix-getters-calls) to ease migration of their applications. 
Use [`fix-getters-def`](https://crates.io/crates/fix-getters-def) if you also want your `get` functions definition to comply with the API standards applied in this release. - Lots of changes to the subclassing API. Check the various elements in [gst-plugins-rs](https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs) for examples. - Major improvements to the documentation infrastructure and generated documentation. - `gst::ClockID` bindings are refactored to use different types for single-shot and periodic clock ids, which makes misuse harder. - `gst::ProxyPad` extension trait uses trait functions instead of associated functions now for usability reasons. - Use `Result` for overriding flow returns from pad probes. - `gst_video::VideoInfo::align()` returns a `Result` instead of a `bool`. - Use actual error types instead of `()` in `gst_sdp` APIs. - `Display` impl for `gst::ClockTime` provides better human-readable strings. - `gst::Element::link_filtered()` and `link_pads_filtered()` takes a non-optional caps now. That's easier to use and for not providing caps the non-filtered variants of the functions exist. - Replace various manual bindings with auto-generated ones. - `gst::Element::get_request_pad()` is replaced by `request_pad_simple()` as a simpler version of `request_pad()` and in accordance with the deprecation in GStreamer 1.20. - `gst::ClockTime` and APIs working on it were changed to make possibility of using `GST_CLOCK_TIME_NONE` expressed in the type system. `Option` can be `None` while `gst::ClockTime` is always a valid time. ## [0.16.7] - 2021-02-13 ### Fixed - Usage of the logging system with a GStreamer library with the logging system compiled out does not crash any longer. - Double-free in `gst_video::VideoTimeCode` API when converting between validated and unvalidated timecodes. ### Added - `gst::Element::get_current_state()` and `get_pending_state()` convenience APIs. - `gst_audio::AudioConverterConfig` for setting the configuration on e.g. the `audiomixer` element. The low-level `AudioConverter` API is still not included in the bindings. ## [0.16.6] - 2020-12-20 ### Fixed - `VideoTimeCodeInterval`'s `Ord` and `PartialEq` implementations compare against the correct fields now. - `SDPMessage::medias_mut()` iterator does not crash any longer. ### Added - `PartialEq` and `Eq` implementations on `VideoAlignment`. - Alignment API for `VideoMeta` and `get_plane_height()` / `get_plane_size()`. - `VideoInfo::align_full()`. ## [0.16.5] - 2020-11-23 ### Fixed - Make sure to use `$crate` in more macros to allow them to work without anything special in scope already. - Update documentation location. - Don't panic if C code stores invalid seqnums in events and the seqnum is used directly or via the `Display` impl. - Fix docs build for some crates on docs.rs. - Fix `Debug` impl for `gst_video::VideoTimeCode` to print the correct type name. - Fix plugin version to be 1.18 instead of 1.17 when compiling a plugin with `v1_18`. ### Added - Event handling support in pad probes, that is returning `PadProbeReturn::Handled` for events. - `EventRef::get_structure_mut()` getter that allows changing the events' structures. ### Changed - Remove unnecessary `PhantomData` markers and use `repr(transparent)` instead of `repr(C)` where it is more correct. ## [0.16.4] - 2020-10-09 ### Fixed - Correctly implement `ExactSizeIterator` on the `AudioFormat` and `VideoFormat` iterators. Previously they returned the overall size instead of the remaining size, and they didn't implement `Iterator::size_hint()`. 
- Don't implement `ExactSizeIterator` on the buffer `gst::Meta` iterator. The overall length is not known easily and the implementation would've simply panicked in the past. ### Added - `gst::ClockID::wait_async_stream()` for async integration for clock waiting. - `From` / `TryFrom` impls for converting between `gst::ClockTime` and `std::time::Duration`. ## [0.16.3] - 2020-09-08 ### Fixed - Reset vfuncs if calling `BaseTransformClass::configure()` multiple times. - Fix `gst::debug_remove_default_log_function()` to actually remove the default log function. ### Added - Some more new APIs added in 1.18. - API for getting an owned buffer from a readable `gst_video::VideoFrame` / `VideoFrameRef`. ### Changed - Updated bindings to 1.18.0. This stabilized GStreamer 1.18 support and any API behind the "v1_18" feature is considered stable now. - Factor out some common code from `gst::Pad::ProbeInfo` code. This reduces the code generated for each pad probe considerably. - Update paste dependency to 1.0 and pretty-hex to 0.2. ## [0.16.2] - 2020-07-27 ### Fixed - Use correct pointer for the plane data in `gst_audio::AudioBuffer`. ### Added - Add `gst::GhostPad` convenience constructors that take a target pad, similar to the ones that existed in 0.15 and before. - Add `gst::parse_bin_from_description_with_name` that allows setting a name for the created bin without having to use unsafe code in application code. ## [0.16.1] - 2020-07-10 ### Fixed - Allow calling `gst::DebugCategory::new()` before `gst::init()` again. ## [0.16.0] - 2020-07-06 ### Added - Updated bindings to 1.17.2, adding experimental 1.18 support. This can be opted-in via the "v1_18" feature flag but there might still be API changes in the newly added API. - `gst::MemoryRef::dump()` for dumping contents of a memory. - `gst::Bus::stream()` instead of a custom constructor on the `BusStream`. - Use more accurate types for `Seqnum`, `GroupId` and `MetaSeqnum`. These are now proper wrapper types instead of plain integers, which makes misuse harder. - Provide `TryFrom` impls for conversion between `glib::DateTime` and `gst::DateTime`. - Add `get_allocator()` functions to `gst_base::{Aggregator, BaseTransform, BaseSrc}`, and allow overriding `BaseSrc::alloc()`. - Add subclassing bindings for `gst_base::PushSrc`. - Add new `gst::BufferCursor` API that allows to handle a buffer as `Read`, `Write` and `Seek` and accesses the underlying memories of the buffer individually without mapping them all together. - Add `gst::Plugin::get_plugin_name()`. - Support for `gst_video::VideoAFDMeta` and `VideoBarMeta`. - API for getting all / iterating over all `gst_audio::AudioFormat` and `gst_video::VideoFormat`. - Bindings and subclassing bindings for `gst_video::VideoSink`. - `gst::Pad` can be constructed via the builder pattern and `gst::PadBuilder` now, which allows to safely set the pad functions and various other fields during construction. The `PadBuilder` works on any `gst::Pad` subclass and also has special support for `GhostPad`s by allowing to set pad functions of the proxy pad. - `gst::Message`, `gst::Event` and `gst::Query` type constructors are now on the specific target type instead of various `new_XXX()` functions on the basic type. E.g. `gst::message::Eos::new()`. - Support for overriding `gst_audio::AudioSrc/Sink::reset()`. - Support for overriding `gst_base::BaseParse::stop()`. - Support for overriding `gst::Element::post_message()`. - Added bindings for `gst::BufferList::foreach()` and `foreach_mut()`. 
- Added bindings for `gst::Buffer::foreach_meta()` and `foreach_meta_mut()`. ### Fixed - Allow using any `glib::Object` as target object for logging instead of just `gst::Object`. - Remove restriction API from `gst_pbutils::EncodingContainerProfile`. They are supposed to be used only with the other encoding profiles. - Return `&'static str` for various `gst::StructureRef` functions where the string is backed by a `glib::Quark`. - Fix various `gst::DateTime` functions to actually return `Option`s. - Add support for filling in a buffer passed to the `gst::Pad` getrange function, allow passing one in into `get_range()` and `pull_range()` and provide the corresponding API on `gst_base::BaseSrc` too. - Allocator in audio/video `Decoder` base classes is optional and can return `None`. - `gst_video::ValidVideoTimeCode::add_interval()` always returns a valid timecode again. - Allow resolving a `gst::Promise` with `None` and also handle that correctly in the callback. This is allowed by the API. - Allow calling various debugging related functions before `gst::init()`. - Various enum/function versions were fixed to only show up if the corresponding version feature is enabled. - `gst::Pad` function setters are marked unsafe now as changing the functions is not thread-safe. - Remove `gst::Object::set_name()` as changing the name after construction generally causes problems and is potentially unsafe. - Remove `gst::Pad::set_pad_template()` as changing the pad template after construction is generally unsafe. - `gst::Pad::stream_lock()` borrows the pad now instead of taking a new reference. - Unimplemented `Jitter` and `Buffer` queries were removed from the bindings. These are not implemented in C and only have a type registered. - Various `LAST`, `NONE` variants of enums and flags were removed as these only make sense in C. - Call the parent impl of various vfuncs that were omitted before to not require further subclasses of them to implement them but automatically call the parent ones. ### Changed - Use `NonZeroU64/U32` for various ID types to allow further optimizations. - Use `thiserror` crate for deriving error types. - Switch from `lazy_static` to `once_cell`. - Change various miniobject functions like `gst::Caps::append()` from taking the object by value to modifying it internally. This makes them easier to use and only applies to functions that are defined on the non-reference type and take ownership of the values passed in. - Use `mem::ManuallyDrop` instead of `mem::forget()` everywhere. - Replace most `mem::transmute()` calls with safer alternatives. - `gst:StreamCollection` API was changed to the builder pattern for construction as the collection must not be changed after construction. - `gst::ProxyPad` default functions are plain functions on `ProxyPad` now instead of trait functions to allow easier usage of them. - Use proper error types in various `TryFrom` impls. - `gst_video::VideoMeta::add()` returns a `Result` now instead of panicking. - Various constructors were renamed from `new_with_XXX()` and `new_from_XXX()` to the more idiomatic `with_XXX()` and `from_XXX()`. - Miniobject bindings are simplified now and there is no `gst::GstRc` type anymore, instead everything is directly implemented on the concrete types. As part of this the `gst::MiniObject` trait was also removed as it was unneeded now. ## [0.15.7] - 2020-06-08 ### Fixed - Allow multiple filter types per process with `gst::Iterator::filter()`. - Check that `VideoInfo` is valid when creating a `VideoFrame`. 
- Don't potentially dereference a `NULL` pointer when getting the format from an invalid `VideoInfo` or `AudioInfo`. - Don't unmap borrowed `VideoFrameRef`s. ### Added - `gst::ProtectionMeta`, `gst_video::VideoAffineTransformationMeta`, `VideoCropMeta` and `VideoRegionOfInterestMeta` bindings. - Various new `gst_rtp::RTPBuffer` methods. - `gst_audio::audio_buffer_truncate()`, `AudioMeta` and `AudioBuffer` bindings. ## [0.15.6] - 2020-05-28 ### Fixed - Assert that the data passed to `VideoCaptionMeta::add()` is not empty. - Don't store strong references to the object in the bus, appsink and appsrc futures `Stream` / `Sink` adapters. This would keep them alive unnecessarily and would prevent the `Stream` / `Sink` to ever "finish" on its own. - Handle receiving a `None` reply in the change function of `gst::Promise`. This is apparently valid. For backwards compatibility reasons this is currently replaced with an empty structure but in 0.16 the API will explicitly handle `None`. ### Added - `gst::Stream::debug()` and `gst::StreamCollection::debug()` for converting into a structured string with the actual contents of each. - `gst::Structure::from_iter()` and `gst::Caps::from_iter()` to create structures/caps from iterators. - `gst::Event` support for getting/setting the `gst::Stream` in the `StreamStart` event. - `gst_video::calculate_display_ratio()` and `::guess_framerate()`. - Various video related `gst::CapsFeatures` in `gst_video`. - `TryFrom`/`From` impls for converting between `gst::Structure` and `gst_video::VideoConverterConfig`. - Various `glib::Value` trait impls for `SDPMessage`, `StructureRef`, `CapsFeatureRef` and all borrowed variants of miniobjects to be able to work with the borrowed, non-owned variants when handling `glib::Value`s. ## [0.15.5] - 2020-05-03 ### Fixed - Revert: Allow logging any `glib::Object` and not just `gst::Object`. This broke API in subtle ways and needs to wait until 0.16 - Replace `%` in log output with `%%` to prevent accidental C formatting - Add missing manual traits to the documentation ### Added - `BufferRef::peek_memory_mut()` to give a mutable reference to a given memory - Different iterators for iterating over the memories of a buffer - Support for `gst_audio::AudioClippingMeta` - `gst::Plugin::get_plugin_name()` was added - `gst::Element::get_current_clock_time()` and `gst::Element::get_current_running_time() helper functions - `gst::State` and `StateChange` API for calculating next/previous state and convert from/to the components of a state change ### Changed - Use `mem::ManuallyDrop` instead of `mem::forget` everywhere ## [0.15.4] - 2020-03-09 ### Fixed - Allow logging any `glib::Object` and not just `gst::Object` - Fix floating reference handling in `RTSPMedia::take_pipeline()` - Hold `GMutex` guards for the remainder of the function and warn if they're directly dropped - Work around empty/any caps handling bugs in `Caps::fixate()` ### Added - Add `BaseTransform::prepare_output_buffer()` subclassing support - `RTSPServer`, `RTSPClient`, `RTSPMedia` and `RTSPMediaFactory` subclassing support - Handle panicking in `appsrc`/`appsink` callbacks by posting an error message instead of killing the process ## [0.15.3] - 2020-02-15 ### Fixed - `UniqueFlowCombiner::clear()` should take a mutable reference. - `AudioStreamAlign` doesn't require mutable references for getters anymore. - Don't use bool return value of `gst_video_info_set_format()` and `gst_video_info_align()` with GStreamer < 1.11.1 as it returned void back then. 
  We'd otherwise use some random value.
- Make `VideoInfo::align()` available since 1.8.
- Fix changing/clearing of `AppSrc`, `AppSink` callbacks and `Bus` sync handler. Before 1.16.3 this was not thread-safe and caused crashes. When running with older versions changing them causes a panic now and unsetting the bus sync handler has no effect. With newer versions it works correctly.

### Added
- Add `Clone` impls for `BufferPoolConfig` and `PlayerConfig`.
- Add `VideoConverter` bindings.
- Add `Future`s variant for `gst::Promise` constructor.
- Add `Future`s variant for `gst_video::convert_sample_async()`.
- Add `submit_input_buffer()`, `generate_output()`, `before_transform()`, `copy_metadata()` and `transform_meta()` virtual method support for `BaseTransform`.
- Add `AppSink` `Stream` adapter and `AppSrc` `Sink` adapter for integrating both into Rust async contexts.

### Changed
- More generic implementations of `VideoFrame` / `VideoFrameRef` functions to allow usage in more generic contexts.

## [0.15.2] - 2020-01-30

### Fixed
- Fix another race condition in the `gst::Bus` `Stream` that could cause it to not wake up although a message is available.

## [0.15.1] - 2020-01-23

### Added
- Use static inner lifetime for `VideoCodecState` so that it can be stored safely on the heap.
- Getters/setters for `BinFlags` on `gst::Bin`.
- `gst::Caps::builder_full()` for building caps with multiple structures conveniently.
- `gst::Element::call_async_future()` for asynchronously spawning a closure and returning a `Future` for awaiting its return value.

### Fixed
- Various clippy warnings.
- Getters/setters for `PadFlags` on `gst::Pad` now provide the correct behaviour.
- Take mutex before popping messages in the `gst::Bus` `Stream` to close a small race condition that could cause it to not be woken up.
- `gst::ChildProxy` implementers do not have to provide `child_added()` and `child_removed()` functions anymore; these are optional now.
- Manually implement `Debug` impls for various generic types where the `Debug` impl should not depend on their type parameters also implementing `Debug`.

## [0.15.0] - 2019-12-18

### Added
- `StructureRef::get_optional()` for returning `None` if the field does not exist instead of `Err`
- Bindings for `gstreamer-rtp` library, mostly `RTPBuffer`
- Support for writing `Preset`, `TagSetter`, `Clock`, `SystemClock` subclasses
- Bindings for `Typefind::get_length()`
- Bindings for `BaseSrcImpl::get_times()`
- Bindings (incl. subclassing) for `AudioSink` and `AudioSrc`
- Missing `Send`/`Sync` impl for various types

### Fixed
- Cleanup of cargo features/dependencies to improve build times
- Serde serialization with optional values. Attention: This changes the format of the serialization!
- `VideoEncoder`/`VideoDecoder` `proxy_getcaps()` can't return `None`
- Use non-panicking UTF8 conversion in log handler.
  We don't want to panic just because some C code printed a non-UTF8 string
- Re-export all traits from the crate level and also ensure that all traits are actually included in the preludes
- Actually export `is_video_overlay_prepare_window_handle_message()` function
- Use `FnMut` for the `appsink` callbacks instead of `Fn`
- `Promise` change function returns the actual reply to the promise now instead of just passing the promise itself
- Memory leak in `Iterator::filter()`
- `BinImpl::add()` takes ownership of floating references
- `DeviceImpl::create_element()` preserves floating flag
- `BinImpl::remove()` takes a strong reference of the element now as the last reference might be owned by the bin and otherwise we would potentially have a use-after-free afterwards
- `BaseParseFrame` and `VideoCodecFrame` take a `&mut self` now for various functions that actually change the frame

### Changed
- Minimum supported Rust version is 1.39
- Allow passing `None` to `VideoEncoder::finish_frame()`
- Various `to_string()` methods were moved into the `Display` trait impl and for some types `to_str()` was added to return a `&'static str`
- .gir files were updated to 1.16.2 release
- `Sample` constructor uses the builder pattern now
- `VideoMeta::add_full()` is simplified and requires parameters
- `BaseTransformImpl::set_caps()` returns a `Result` instead of `bool`
- SDP data type getters for strings return an `Option` now as these can be `None` in practice although not allowed by the SDP spec
- Various functions returning `Option`s were changed to return `Result`s if `None` actually signalled an error instead of just a missing value

### Removed
- "subclassing" and "futures" cargo features. These are enabled by default now

## [0.14.5] - 2019-09-17

### Added
- Support subclassing of `gst::Device`, `gst::DeviceProvider`, `gst_audio::AudioDecoder` and `::AudioEncoder`
- Support for `Element::set_clock` and `::provide_clock` virtual methods
- `ElementClass::add_metadata` was added
- `gst_video::VideoDecoder` and `::VideoEncoder` got support for `get_caps`, `negotiate`, `src/sink_query/event` and the `drain` virtual methods
- `Element::num_pads`, `::num_src_pads` and `::num_sink_pads` functions
- `gst_video::VideoDecoder` and `::VideoEncoder` got `get_allocator` bindings
- `gst::Iterator` implements `IntoIterator` now for providing a `std::iter::Iterator` adapter
- Error macros for audio/video decoder subclasses to handle decoding errors more gracefully and only actually error out after many consecutive errors

### Fixed
- Macros now also work in Rust 2018 edition without `#[macro_use]` but with explicit imports
- The log handler unit test runs reliably in parallel with other tests
- Manually implement `Debug` for `gst::Iterator` to allow it for any `T` instead of `T: Debug`
- `Device::create_element` has correct reference count handling now
- Return `NotNegotiated` in the video codec base classes if setting the output state fails instead of `Error`

## [0.14.4] - 2019-08-14

### Added
- Bindings for adding/removing custom log functions
- Bindings for `calculate_linear_regression()`
- Constants for base class custom flow returns

### Fixed
- Ownership of pad in `Element::release_pad()` virtual method implementations

## [0.14.3] - 2019-07-16

### Added
- `Buffer::unset_flags()` for unsetting specific buffer flags
- `VideoBufferFlags` flags type and `VideoBufferExt::set_video_flags()`, `unset_video_flags()` and `get_video_flags()` for working with video buffer flags from safe code (see the example below).
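
A minimal sketch of the 0.14.3 buffer-flag helpers listed just above; the chosen flags are arbitrary and the `VideoBufferExt` methods are assumed to come in via the gstreamer-video prelude:

```rust
use gstreamer as gst;
use gstreamer_video as gst_video;
use gst_video::prelude::*;

fn mark_interlaced(buffer: &mut gst::BufferRef) {
    // Core flags go through the plain setters; only the named bits are touched.
    buffer.set_flags(gst::BufferFlags::DISCONT);
    // Video-specific extension flags use the VideoBufferExt helpers.
    buffer.set_video_flags(gst_video::VideoBufferFlags::INTERLACED);
    buffer.unset_video_flags(gst_video::VideoBufferFlags::RFF);
}
```
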
### Fixed - Setting buffer flags does not override arbitrary other flags anymore but only sets the flags in question. This is necessary to not override extension buffer flags like `gst_video::VideoBufferFlags`. ## [0.14.2] - 2019-07-15 ### Added - Support for `ReferenceTimestampMeta` ## [0.14.1] - 2019-07-06 ### Added - Various new WebRTC enum types from 1.14.1/1.16.0 ### Fixed - Correctly generate interlaced `VideoInfo` by using `gst_video_info_set_interlaced_format()` instead of the generic function. - serde serialization unit tests for `gst::format` succeed again now. ### Changed - `Debug` impls for `VideoFormatInfo` and `AudioFormatInfo` now print all the details of the format instead of only the name, and the `Debug` impls for `VideoInfo` and `AudioInfo` also print the format now. ## [0.14.0] - 2019-06-24 ### Added - Bindings for `GLSyncMeta`. - Bindings for setting/getting `TagScope` on a `TagList` - Bindings for `GLDisplayWayland` and `GLDisplayX11` in addition to the already existing `GLDisplayEGL` - Bindings for `Bus::pop_filtered()` and related functions - Bindings for getting/setting `Object`, `Element`, `Bin`, `Pipeline` and `Plugin` flags - Bindings for `VideoCaptionMeta` - `Debug` impl of `Buffer` now also shows the metas of the buffers - Expose flow return in `PadProbeInfo` for overriding the return value - Bindings for `VideoDecoder` and `VideoEncoder`, including subclassing support - Bindings for `Memory`, `Allocator` and `VideoBufferPool` - Bindings for `VideoFormatInfo::pack` and `::unpack` for format conversion - Bindings for `BaseParse`, including subclassing support - Various new arithmetic operation impls for fractions, formatted values and `ClockTime` - Bindings for `VideoInfo::align()` ### Changed - The `SDPMessage` and `SDPMedia` bindings were completely rewritten as they were broken before and caused crashes in various usages. As part of this there's also some more convenience API available on these types, like iterators for example, and API to modify the `SDPMedia` contained in a `SDPMessage`. - Update to GStreamer 1.16. - Regenerate with latest gir. - Run all autogenerated code through rustfmt after generation too. - Updated to latest versions of GLib/GIO/etc crates. - Updated to futures 0.3 / `std::future` - `ProxyPad` default functions moved to an extension trait instead of plain functions on `ProxyPad`, making them more in sync with the default `Pad` functions - GStreamer plugins are now exporting the new 1.14+ plugin symbols if they were configured for GStreamer 1.14+ - Arithmetic operations on formatted values and `ClockTime` do overflow checks now and replace the result with the `NONE` value on overflow - `TryFrom`/`TryInto` traits are used in various places now instead of the previous ad-hoc implementations of them. - Registering element/typefind/device monitor factories requires passing a value of `gst::Rank` now instead of an arbitrary `u32` ### Fixed - Use correct type for destroying pad task closure data. This was previously using the wrong type, causing crashes at runtime. - `DeviceAdded`/`DeviceRemoved` message getters are transfer full so we don't need to take an additional reference that would be leaked. - `AppSink` callbacks are correctly marked as `Send` instead of `Send+Sync`, allowing a wider range of closures to be used for them. - Handle `PadProbeReturn::Handled` return values from pad probes more correctly. - `ToOwned::to_owned()` on miniobjects has to create copies instead of only increasing the reference count. 
Otherwise it was possible to create multiple mutable and immutable references to the same object at the same time. - Various functions take references to owned miniobjects instead of borrowed references as it was otherwise possible to create multiple mutable or immutable references to the same object at the same time. - `URIHandler::set_uri` does not accept `None` anymore as this is not allowed by the C function. - Comparisons and addition of `TypeFindProbability` and `Rank` work correctly now - Various `Display` implementations were fixed to not cause a stack overflow due to infinite recursion anymore - Various `::to_string()` functions don't take ownership of C strings that they do not own anymore, which caused double frees before ### Removed - MIKEY related bindings from the SDP library. The bindings were broken and until someone needs them these are not available anymore. ## [0.13.0] - 2019-02-22 ### Added - Subclassing infrastructure was moved directly into the bindings, making the `gst-plugin` crate deprecated. This involves many API changes but generally cleans up code and makes it more flexible. Take a look at the `gst-plugins-rs` crate for various examples. - Bindings for GStreamer GL library - Bindings for `CapsFeatures` and `Meta` - Bindings for `ParentBufferMeta`, `VideoMeta` and `VideoOverlayCompositionMeta` - Bindings for `VideoOverlayComposition` and `VideoOverlayRectangle` - Bindings for `VideoTimeCode` - Bindings for `NetAddressMeta` - Bindings for registering custom tags - `UniqueFlowCombiner` and `UniqueAdapter` wrappers that make use of the Rust compile-time mutability checks and expose more API in a safe way, and as a side-effect implement `Sync` and `Send` now - `Bus::add_watch_local()` and `gst_video::convert_frame_async_local()` that allow using a closure that does not implement `Send` but can only be called from the thread owning the main context. - More complete bindings for `Allocation` `Query` - `pbutils` functions for codec descriptions - `TagList::iter()` for iterating over all tags while getting a single value per tag. The old `::iter_tag_list()` function was renamed to `::iter_generic()` and still provides access to each value for a tag - `Bus::iter()` and `Bus::iter_timed()` iterators around the corresponding `::pop*()` functions - Getters for `VideoColorimetry` to access its fields - `Debug` impls for various missing types. - serde serialization of `Value` can also handle `Buffer` now - Extensive comments to all examples with explanations - Transmuxing example showing how to use `typefind`, `multiqueue` and dynamic pads - basic-tutorial-12 was ported and added ### Changed - Rust 1.31 is the minimum supported Rust version now - Update to latest gir code generator and glib bindings - Functions returning e.g. `gst::FlowReturn` or other "combined" enums were changed to return split enums like `Result` to allow usage of the standard Rust error handling. - Various functions and callbacks returning `bool` or `Option<_>` were changed to return a `Result<_, glib::BoolError>` or `Result<_, gst::LoggableError>` or `Result<_, gst::ErrorMessage>` for better integration with Rust's error handling infrastructure. - Some infallible functions returning `bool` were changed to return `()`. - `MiniObject` subclasses are now newtype wrappers around the underlying `GstRc` wrapper.
This does not change the API in any breaking way for the current usages, but allows `MiniObject`s to also be implemented in other crates and makes sure `rustdoc` places the documentation in the right places. - `BinExt` extension trait was renamed to `GstBinExt` to prevent conflicts with `gtk::Bin` if both are imported - `Buffer::from_slice()` can't possibly return `None` ### Fixed - `gst::tag::Album` is the album tag now instead of artist sortname - Return `0` for the channel mask corresponding to negative `AudioChannelPosition`s. - `PartialOrd` and related traits are implemented via pointer equality on `ClockId` instead of using the compare function. Two clock ids with the same timestamp are not necessarily the same. - Various functions that are actually fallible are now returning an `Option<_>`. - Various `clippy` warnings ## [0.12.2] - 2018-11-26 ### Fixed - PTP clock constructor actually creates a PTP instead of NTP clock ### Added - Bindings for GStreamer Editing Services - Bindings for GStreamer Check testing library - Bindings for the encoding profile API (encodebin) - VideoFrame, VideoInfo, AudioInfo, StructureRef implement Send and Sync now - VideoFrame has a function to get the raw FFI pointer - From impls from the Error/Success enums to the combined enums like FlowReturn - Bin-to-dot file functions were added to the Bin trait - gst_base::Adapter implements SendUnique now ### Changed - All references were updated from GitHub to freedesktop.org GitLab - Fix various links in the README.md - Link to the correct location for the documentation - Remove GitLab badge as that only works with gitlab.com currently ## [0.12.1] - 2018-09-21 ### Added - More complete bindings for the gst_video::VideoOverlay interface, especially gst_video::is_video_overlay_prepare_window_handle_message() ## [0.12.0] - 2018-09-08 ### Added - Bindings for the GStreamer SDP and WebRTC libraries - Generic API for working with tags that is based on string tag names and glib::Value for the tag values - Bindings for Aggregator and AggregatorPad - Bindings for BaseTransform/BaseSrc::get_buffer_pool() - Optional serde implementations for the basic GStreamer data flow and metadata types ### Changed - Use ptr::NonNull in various places - Updated to muldiv 0.2, num-rational 0.2 - Bus::create_watch() can't return None - Remove CallbackGuard as unwinding across FFI boundaries is not undefined behaviour anymore but will directly cause a panic - Changed from the futures to the futures-preview crate as an optional dependency - Various Caps operations take a &CapsRef instead of &Caps - "deep-notify" signal takes the whole ParamSpec as parameter instead of only the signal name - Some structs were changed from empty structs to empty enums - Pad probe code does not take an additional reference to the data anymore, potentially passing writable events/buffers into the probe - ValueExt::compare() is implemented around std::cmp::Ordering now instead of a custom enum that was basically the same ### Fixed - Pad::add_probe() can return None if an IDLE probe was already called and removed in the meantime - Various compiler and clippy warnings ### Removed - std::Iterator impl for gst::Iterator.
It was awkward to use because the gst::Iterator could fail at each iteration ## [0.11.6] - 2018-08-27 ### Fixed - Build with NLL/two-phase borrows - Explicitly define [bin] section for discoverer example to fix a cargo warning ### Added - Add unsafe gst::deinit() function - Ord/PartialOrd impls on gst::Seqnum - Getter for current pad mode - gst::Pad::sticky_events_foreach() for iterating over all sticky events in a thread-safe way ## [0.11.5] - 2018-07-24 ### Fixed - `gst::Bus`'s sync handler must unref every message if `gst::BusSyncReply::Drop` is returned, otherwise they are all leaked ## [0.11.4] - 2018-07-19 ### Fixed - `gst::Caps::subtract()` does not leak its arguments anymore - `gst::Caps::get_structure()` gracefully returns `None` if the index is out of bounds instead of a `g_return_val_if_fail()` - `gst::Structure::new()` has to give away ownership of the info structure but didn't. For 0.11 we internally copy, in 0.12 it will take the info structure by value - Typefind tests don't fail anymore if the system has typefind factories without caps ### Added - An additional assertion that ensures that miniobjects are actually writable before creating a mutable reference ## [0.11.3] - 2018-06-08 ### Added - `gst::Bus::remove_watch()` is now available to remove a bus watch again - `fmt::Debug` impls for `AudioInfo` and `VideoInfo` were added - `fmt::Debug` impls for mini objects also print the pointer value now to make it easier to track them in debug logs - `PlayerVisualization` has accessors for the name and description fields now, without which there is no sensible way to use them or to set a player visualization ## [0.11.2] - 2018-05-09 ### Fixed - Work-around various floating reference handling changes between 1.12 and 1.14 to be able to run with both versions without memory leaks or other reference count problems. This affects NetTimeProvider, BufferPool, DeviceMonitor, Stream, StreamCollection, and Player, NetClientClock, NetClock, PtpClock which were already previously fixed. ### Changed - Change the appsrc need-data and all appsink callbacks to not require the Sync bound anymore and change from Fn to FnMut. They can only be called from a single thread at a time. This change is only done for the corresponding callbacks, not the signals. ## [0.11.1] - 2018-04-07 ### Fixed - Fix Structure::to_string() to not run into an infinite recursion but call the method on the contained StructureRef instead of on itself ## [0.11.0] - 2018-03-20 ### Changed - Updated everything to GStreamer 1.14.0 - Event, Message and Query types were refactored to improve usability. Especially, newly constructed queries allow the type-specific functions to be used directly without first creating a view - VideoFrameRef::copy_to_ref() and ::copy_plane_to_ref() are gone now and the original functions work with refs instead of full frames - PadProbeId and NotifyIds are not Copy/Clone anymore and are taken by value - GstPlayer has GstObject as parent class now ### Added - GstPbutils, GstSdp, GstRtsp and GstRtspServer bindings - GstPromise, GstAudioStreamAlign and various other 1.14 API - GstVideoFilter and GstBufferPool bindings - Element::call_async() - Debug impl for Toc and TocEntry - Various new examples (RTP FEC, RTSP server, tag usage, ...)
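As an illustration of the new Element::call_async(), a minimal sketch (assuming the usual `gst` crate alias for `gstreamer`; constructor and method names follow the current bindings):

```rust
use gst::prelude::*;

fn main() {
    gst::init().unwrap();

    let pipeline = gst::Pipeline::new();

    // The closure runs on a thread from GStreamer's thread pool and receives a
    // reference to the element, so potentially blocking work (like a state
    // change) does not stall the calling thread.
    pipeline.call_async(|pipeline| {
        let _ = pipeline.set_state(gst::State::Playing);
    });
}
```

A real application would keep the pipeline (and typically a main loop) alive until the asynchronous call has actually run.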
### Fixed - Memory leak in gst_video::convert_sample_async() ## [0.10.2] - 2018-02-18 ### Fixed - Fix building of messages with custom fields for types that don't have a GstStructure ### Added - VideoFrameRef::copy_to_ref() and ::copy_plane_to_ref(), which work with VideoFrameRefs instead of full VideoFrames - Getters for the BaseSrc/Sink/Transform configured segment - Document the gstreamer-player-1.0 dependency in the README.md ## [0.10.1] - 2018-01-03 ### Fixed - Don't require &mut self for TagSetterExtManual::add() ### Added - A TagSetter example application - Bindings for gst_video::convert_sample() and ::convert_sample_async() - Bindings for gst_video::VideoRectangle - Debug impl for Sample and ::with_buffer_list() constructor - A borrowing version of VideoFrame: VideoFrameRef - Bindings for GstVideoFilter ### Changed - Deprecated Sample::get_info() in favour of ::get_structure() - Player has gst::Object as another parent class now ## [0.10.0] - 2017-12-22 ### Fixed - Various clippy warnings - Memory leak of the tag list in Toc::merge_tags() - Property getters use Values of the correct type - Event::get_structure(), Message::get_structure() and Query::get_structure() can return None for the structure - Various other nullability fixes all over the API, changing functions to accept Option<> or returning Option<>, or only plain types - Functions taking paths/filenames now actually take Paths instead of &strs - Element::remove_pad() is not giving away a new reference to the pad anymore, which caused a memory leak of all pads ever removed - Precision handling in ClockTime's Display impl - Video/AudioInfo are only Send, not Sync ### Added - Various enums now also derive useful traits like Copy, Clone and Hash in addition to PartialEq, Eq and Debug - TagList::merge() and insert() for combining tag lists - EventType gained many useful functions to work with event types and a PartialOrd impl to check expected event order of event types where it matters - MessageRef/EventRef/QueryRef implement ToOwned - Bindings for Registry and PluginFeature - Event::set_running_time_offset() for adjusting the offset while events pass through the pipeline - Event/Message GroupIds and Seqnums now have a newtype wrapper around u32 instead of the plain value, making usage of them slightly more typesafe. Also add an "invalid" value for both, as exists in latest GStreamer now. - FormattedValue, GenericFormattedValue and related types were implemented now, which allows more convenient and type-safe usage of formatted values (time, bytes, etc) - Bindings for force-keyunit and still-frame events were added - MappedBuffer/BufferMap now implement various other useful traits, including AsRef<[u8]>, AsMut, Deref, DerefMut, Debug, PartialEq and Eq - Add VideoMultiviewFramePacking enum, and use it in Player - Bindings for the GStreamer Net library, including PTP/NTP/network client clocks and the GStreamer NetClock provider for network synchronization of pipelines - IteratorError implements std::error::Error - Plugin::add_dependency() and ::add_dependency_simple() were added - Rank and TypeFindProbability implement PartialOrd/Ord now - Bindings for TypeFind, TypeFindFactory and the typefind helpers - StreamCollection::iter() for iterating over all contained streams - ErrorMessage type that can be used e.g.
in a Result for passing an error message from somewhere to upper layers to then be posted on an element the same way gst_element_error!() would've done ### Changed - Sample::new(), TagList::add(), Structure::set() and similar functions take the values (ToSendValue impls) by reference instead of value. They were not consumed by the function before. - The Debug impls of various types, including Event/Buffer/Message/Query/Structure, were improved to print all the fields, similar to what GST_PTR_FORMAT would do in C - Switched to lazy_static 1.0 - Gap event and Duration tag are using ClockTimes now, as well as various Player signals - Segment is now based on a generic type FormattedSegment that can take any format (time, bytes, etc) or a GenericFormattedValue for more type-safety and convenience. Also functions for "casting" between a generic segment and a segment with a specific format exist on this now - AppSrc and AppSink now have a builder for the callbacks, making it unnecessary to always provide all callbacks even if only one is actually needed - Various functions that returned bool for errors are now returning a Result - Player configuration is now a custom type with more convenient API - Player VideoInfo uses a Fraction instead of (u32,u32) for the framerate and pixel-aspect-ratio - VideoFrame has a more consistent API between the writable and read-only variants - Buffer::copy_into() was added, and ::copy_region() now takes a BufferCopyFlags parameter instead of always using the default flags - ChildProxy::set_child_property() takes a &ToValue now to follow the API of Object::set_property() and improve usability - Proxy/GhostPad default pad functions use the correct specific pad type now instead of a generic Pad - Bus::add_signal_watch_full() takes a Priority for the priority instead of u32 - Clock::(un)adjust_with_calibration() takes no clock parameter anymore ### Removed - FormatValue was removed in favour of GenericFormattedValue and the connected traits and specific format impls ## [0.9.1] - 2017-11-26 ### Fixed - Export `FlowError`/`FlowSuccess`, `ClockError`/`ClockSuccess`, `PadLinkError`/`PadLinkSuccess` too ## [0.9.0] - 2017-11-26 ### Added - Bindings for (outputting to) the GStreamer logging system - Bindings for the GStreamer base library - Bindings for all the `Pad` functions to override pad behaviour, and pad task functions - Bindings for `StaticCaps` and `StaticPadTemplate` - Bindings for `deep-notify` signal on `Object` - Support for directly creating `Error`/`Warning`/`Info` `Messages` and posting them from an element with context information (file, line, module, etc.)
similar to the C `GST_ELEMENT_ERROR` macro - Support for setting custom fields in `Messages`/`Events` during construction - Support for creating Buffers out of anything that is `AsRef<[u8]>` or `AsMut<[u8]>` - Support for using the `Read` trait on `Adapter` - Functions for getting all sink/src/all pads of an `Element`, and all children of a `Bin` - Builder for `Caps` and `Structures` in addition to the existing functions - `AppSrc`/`AppSink` implement `BaseSrc`/`BaseSink` and `URIHandler` - Rust ports of the basic tutorials 1 to 8 from https://gstreamer.freedesktop.org/documentation/tutorials/ - "Getting started" and "Installation" sections to the README.md - "dox" feature for generating documentation for all available configurations ### Fixed - `StackTraceFlags` are only available since 1.12 - Worked around macOS requiring a `NSRunLoop` running on the main thread in all examples and tutorials, to be able to show a window or anything else ### Changed - `ClockTime` is now a wrapper around `Option` to handle the `CLOCK_TIME_NONE` case better. This wrapper implements all the arithmetic and other traits as needed and ensures that no accidental calculations with `CLOCK_TIME_NONE` can happen - "Values with format", like in `Duration`/`Position`/`Convert` queries or `Seek` events now return a `FormatValue` type. This contains the actual `Format` together with the value and does any required conversions. This also makes it harder to accidentally mix e.g. values in bytes and time - `PadProbeId` does not implement `Clone`/`Copy` anymore - Property notify watches return a custom type instead of ulong - `Error`/`Warning`/`Info` `Messages` can only be created with specific kinds of `glib::Error` now. Using arbitrary ones does not work - `Iterator` bindings were completely rewritten and provide the item type as a generic type parameter now, greatly simplifying its usage - All `glib::Values` are now `glib::SendValue` instead, e.g. in `Caps` and `Structures`, as their content must be possible to send to different threads safely - `Message::get_src()` can return `None` - Allow `None` as `Caps` in `AppSrc`/`AppSink` - Allow everything implementing `Into` to be used as a pad name - Moved `copy()` from `GstRc` directly to `MiniObject` - Success/Error enums (like `FlowReturn`, `PadLinkReturn`, `StateChangeReturn`) now implement an `into_result()` function that splits them into a `Result` with the good and bad cases. Also mark them as `#[must_use]` to make it harder to accidentally ignore errors. - Error enums implement the `Error` trait - Many examples use the `failure` crate for error handling now, cleaning up the error handling code quite a bit - Lots of other code cleanup, compiler/clippy warning cleanup, etc. ## [0.8.2] - 2017-11-11 ### Fixed - Implement StaticType of BufferRef instead of Buffer. Buffer aka GstRc already implements StaticType if BufferRef does, and without this it was not possible to use Buffers in GValues. - Free memory of the appsink/appsrc callbacks with the correct type. It was crashing because of using the wrong type before. - Fix documentation URLs in Cargo.toml. ### Added - Installation instructions and links to documentation for getting started to README.md. ## [0.8.1] - 2017-09-15 ### Added - Implement Send+Sync for Query, Message and Event, and their corresponding Ref types. ### Fixed - Constructor for gst_player::Player now works properly with GStreamer 1.12 when passing a video renderer or signal dispatcher. There was a reference counting bug.
- Instead of returning &'static references from functions, return references with a generic, unbound lifetime instead. See https://github.com/rust-lang/rust/pull/42417#issue-233404573 - Various "unused external crate" warnings and clippy warnings everywhere. ### Changed - Remove Cargo.lock from GIT, it's not very useful for library crates. - Run everything through latest rustfmt-nightly. - Use while-let (instead of loop and if-let) and CLOCK_TIME_NONE (instead of u64::MAX) in the examples. ## 0.8.0 - 2017-08-31 - Initial release of the autogenerated GStreamer bindings. Older versions (< 0.8.0) of the bindings can be found [here](https://github.com/arturoc/gstreamer1.0-rs). The API of the two is incompatible. [Unreleased]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.23.5...HEAD [0.23.5]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.23.4...0.23.5 [0.23.4]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.23.3...0.23.4 [0.23.3]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.23.2...0.23.3 [0.23.2]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.23.1...0.23.2 [0.23.1]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.23.0...0.23.1 [0.23.0]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.22.6...0.23.0 [0.22.6]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.22.5...0.22.6 [0.22.5]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.22.4...0.22.5 [0.22.4]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.22.3...0.22.4 [0.22.3]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.22.2...0.22.3 [0.22.2]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.22.1...0.22.2 [0.22.1]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.22.0...0.22.1 [0.22.0]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.21.3...0.22.0 [0.21.3]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.21.2...0.21.3 [0.21.2]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.21.1...0.21.2 [0.21.1]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.21.0...0.21.1 [0.21.0]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.20.7...0.21.0 [0.20.7]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.20.6...0.20.7 [0.20.6]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.20.5...0.20.6 [0.20.5]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.20.4...0.20.5 [0.20.4]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.20.3...0.20.4 [0.20.3]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.20.2...0.20.3 [0.20.2]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.20.1...0.20.2 [0.20.1]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.20.0...0.20.1 [0.20.0]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.19.8...0.20.0 [0.19.8]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.19.7...0.19.8 [0.19.7]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.19.6...0.19.7 [0.19.6]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.19.5...0.19.6 [0.19.5]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.19.4...0.19.5 [0.19.4]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.19.3...0.19.4 [0.19.3]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.19.2...0.19.3 [0.19.2]: 
https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.19.1...0.19.2 [0.19.1]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.19.0...0.19.1 [0.19.0]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.18.8...0.19.0 [0.18.8]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.18.7...0.18.8 [0.18.7]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.18.6...0.18.7 [0.18.6]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.18.5...0.18.6 [0.18.5]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.18.4...0.18.5 [0.18.4]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.18.3...0.18.4 [0.18.3]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.18.2...0.18.3 [0.18.2]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.18.1...0.18.2 [0.18.1]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.18.0...0.18.1 [0.18.0]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.17.4...0.18.0 [0.17.4]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.17.3...0.17.4 [0.17.3]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.17.2...0.17.3 [0.17.2]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.17.1...0.17.2 [0.17.1]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.17.0...0.17.1 [0.17.0]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.16.7...0.17.0 [0.16.7]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.16.6...0.16.7 [0.16.6]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.16.5...0.16.6 [0.16.5]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.16.4...0.16.5 [0.16.4]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.16.3...0.16.4 [0.16.3]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.16.2...0.16.3 [0.16.2]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.16.1...0.16.2 [0.16.1]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.16.0...0.16.1 [0.16.0]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.15.7...0.16.0 [0.15.7]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.15.6...0.15.7 [0.15.6]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.15.5...0.15.6 [0.15.5]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.15.4...0.15.5 [0.15.4]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.15.3...0.15.4 [0.15.3]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.15.2...0.15.3 [0.15.2]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.15.1...0.15.2 [0.15.1]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.15.0...0.15.1 [0.15.0]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.14.2...0.15.0 [0.14.2]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.14.1...0.14.2 [0.14.1]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.14.0...0.14.1 [0.14.0]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.13.0...0.14.0 [0.13.0]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.12.2...0.13.0 [0.12.2]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.12.1...0.12.2 [0.12.1]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.12.0...0.12.1 [0.12.0]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.11.6...0.12.0 [0.11.6]: 
https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.11.5...0.11.6 [0.11.5]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.11.4...0.11.5 [0.11.4]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.11.3...0.11.4 [0.11.3]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.11.2...0.11.3 [0.11.2]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.11.1...0.11.2 [0.11.1]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.11.0...0.11.1 [0.11.0]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.10.2...0.11.0 [0.10.2]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.10.1...0.10.2 [0.10.1]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.10.0...0.10.1 [0.10.0]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.9.1...0.10.0 [0.9.1]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.9.0...0.9.1 [0.9.0]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.8.1...0.9.0 [0.8.2]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.8.1...0.8.2 [0.8.1]: https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/compare/0.8.0...0.8.1 gstreamer-video-0.23.5/COPYRIGHT000064400000000000000000000022331046102023000143030ustar 00000000000000The gstreamer-rs project is dual-licensed under Apache 2.0 and MIT terms, with the exception of the sys crates which are licensed only under the terms of the MIT license. Copyrights in the gstreamer-rs project are retained by their contributors. No copyright assignment is required to contribute to the gstreamer-rs project. Some files include explicit copyright notices and/or license notices. For full authorship information, see the version control history. Except as otherwise noted (below and/or in individual files), gstreamer-rs is licensed under the Apache License, Version 2.0 or the MIT license, at your option. All the sys crates (e.g. gstreamer/sys and gstreamer-base/sys) are licensed only under the terms of the MIT license. This project provides interoperability with various GStreamer libraries but doesn't distribute any parts of them. Distributing compiled libraries and executables that link to those libraries may be subject to terms of the GNU LGPL or other licenses. For more information check the license of each GStreamer library. gstreamer-video-0.23.5/Cargo.lock0000644000000431650000000000100122040ustar # This file is automatically @generated by Cargo. # It is not intended for manual editing.
version = 3 [[package]] name = "atomic_refcell" version = "0.1.13" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "41e67cd8309bbd06cd603a9e693a784ac2e5d1e955f11286e355089fcab3047c" [[package]] name = "autocfg" version = "1.4.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "ace50bade8e6234aa140d9a2f552bbee1db4d353f69b8217bc503490fc1a9f26" [[package]] name = "bitflags" version = "2.6.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "b048fb63fd8b5923fc5aa7b340d8e156aec7ec02f0c78fa8a6ddc2613f6f71de" [[package]] name = "cfg-expr" version = "0.17.2" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "8d4ba6e40bd1184518716a6e1a781bf9160e286d219ccdb8ab2612e74cfe4789" dependencies = [ "smallvec", "target-lexicon", ] [[package]] name = "cfg-if" version = "1.0.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "baf1de4339761588bc0619e3cbc0120ee582ebb74b53b4efbf79117bd2da40fd" [[package]] name = "either" version = "1.13.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "60b1af1c220855b6ceac025d3f6ecdd2b7c4894bfe9cd9bda4fbb4bc7c0d4cf0" [[package]] name = "equivalent" version = "1.0.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "5443807d6dff69373d433ab9ef5378ad8df50ca6298caf15de6e52e24aaf54d5" [[package]] name = "futures-channel" version = "0.3.31" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "2dff15bf788c671c1934e366d07e30c1814a8ef514e1af724a602e8a2fbe1b10" dependencies = [ "futures-core", ] [[package]] name = "futures-core" version = "0.3.31" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "05f29059c0c2090612e8d742178b0580d2dc940c837851ad723096f87af6663e" [[package]] name = "futures-executor" version = "0.3.31" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "1e28d1d997f585e54aebc3f97d39e72338912123a67330d723fdbb564d646c9f" dependencies = [ "futures-core", "futures-task", "futures-util", ] [[package]] name = "futures-macro" version = "0.3.31" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "162ee34ebcb7c64a8abebc059ce0fee27c2262618d7b60ed8faf72fef13c3650" dependencies = [ "proc-macro2", "quote", "syn", ] [[package]] name = "futures-task" version = "0.3.31" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "f90f7dce0722e95104fcb095585910c0977252f286e354b5e3bd38902cd99988" [[package]] name = "futures-util" version = "0.3.31" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "9fa08315bb612088cc391249efdc3bc77536f16c91f6cf495e6fbe85b20a4a81" dependencies = [ "futures-core", "futures-macro", "futures-task", "pin-project-lite", "pin-utils", "slab", ] [[package]] name = "gio-sys" version = "0.20.9" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "160eb5250a26998c3e1b54e6a3d4ea15c6c7762a6062a19a7b63eff6e2b33f9e" dependencies = [ "glib-sys", "gobject-sys", "libc", "system-deps", "windows-sys", ] [[package]] name = "gir-format-check" version = "0.1.3" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "3a5da913a8586ce748f1164c890e1ebe75a7bbc472668f57b7f9fb893d7ac416" [[package]] name = "glib" version = "0.20.9" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "707b819af8059ee5395a2de9f2317d87a53dbad8846a2f089f0bb44703f37686" dependencies = [ "bitflags", 
"futures-channel", "futures-core", "futures-executor", "futures-task", "futures-util", "gio-sys", "glib-macros", "glib-sys", "gobject-sys", "libc", "memchr", "smallvec", ] [[package]] name = "glib-macros" version = "0.20.7" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "715601f8f02e71baef9c1f94a657a9a77c192aea6097cf9ae7e5e177cd8cde68" dependencies = [ "heck", "proc-macro-crate", "proc-macro2", "quote", "syn", ] [[package]] name = "glib-sys" version = "0.20.9" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "a8928869a44cfdd1fccb17d6746e4ff82c8f82e41ce705aa026a52ca8dc3aefb" dependencies = [ "libc", "system-deps", ] [[package]] name = "gobject-sys" version = "0.20.9" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "c773a3cb38a419ad9c26c81d177d96b4b08980e8bdbbf32dace883e96e96e7e3" dependencies = [ "glib-sys", "libc", "system-deps", ] [[package]] name = "gstreamer" version = "0.23.4" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "700cb1b2e86dda424f85eb728102a111602317e40b4dd71cf1c0dc04e0cc5d95" dependencies = [ "cfg-if", "futures-channel", "futures-core", "futures-util", "glib", "gstreamer-sys", "itertools", "libc", "muldiv", "num-integer", "num-rational", "once_cell", "option-operations", "paste", "pin-project-lite", "serde", "serde_bytes", "smallvec", "thiserror", ] [[package]] name = "gstreamer-base" version = "0.23.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "ad33dd444db0d215ac363164f900f800ffb93361ad8a60840e95e14b7de985e8" dependencies = [ "atomic_refcell", "cfg-if", "glib", "gstreamer", "gstreamer-base-sys", "libc", ] [[package]] name = "gstreamer-base-sys" version = "0.23.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "114b2a704f19a70f20c54b00e54f5d5376bbf78bd2791e6beb0776c997d8bf24" dependencies = [ "glib-sys", "gobject-sys", "gstreamer-sys", "libc", "system-deps", ] [[package]] name = "gstreamer-check" version = "0.23.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "60ee88620d2ef3d1ca218dedb40f39e759650a12704cf5976a95fa048a3d4f3e" dependencies = [ "glib", "gstreamer", "gstreamer-check-sys", ] [[package]] name = "gstreamer-check-sys" version = "0.23.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "2500c1e8af0bd17125bf2397b4b954f7f8fcc273f1b9545e1852fd5b8cc2bfeb" dependencies = [ "glib-sys", "gobject-sys", "gstreamer-sys", "libc", "system-deps", ] [[package]] name = "gstreamer-sys" version = "0.23.4" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "16cf1ae0a869aa7066ce3c685b76053b4b4f48f364a5b18c4b1f36ef57469719" dependencies = [ "glib-sys", "gobject-sys", "libc", "system-deps", ] [[package]] name = "gstreamer-video" version = "0.23.5" dependencies = [ "cfg-if", "futures-channel", "gir-format-check", "glib", "gstreamer", "gstreamer-base", "gstreamer-check", "gstreamer-video-sys", "itertools", "libc", "once_cell", "serde", "serde_json", "thiserror", ] [[package]] name = "gstreamer-video-sys" version = "0.23.4" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "31dc0f49c117f4867b0f98c712aa55ebf25580151d794be8f9179ec2d877fd14" dependencies = [ "glib-sys", "gobject-sys", "gstreamer-base-sys", "gstreamer-sys", "libc", "system-deps", ] [[package]] name = "hashbrown" version = "0.15.2" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = 
"bf151400ff0baff5465007dd2f3e717f3fe502074ca563069ce3a6629d07b289" [[package]] name = "heck" version = "0.5.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "2304e00983f87ffb38b55b444b5e3b60a884b5d30c0fca7d82fe33449bbe55ea" [[package]] name = "indexmap" version = "2.7.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "62f822373a4fe84d4bb149bf54e584a7f4abec90e072ed49cda0edea5b95471f" dependencies = [ "equivalent", "hashbrown", ] [[package]] name = "itertools" version = "0.13.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "413ee7dfc52ee1a4949ceeb7dbc8a33f2d6c088194d9f922fb8318faf1f01186" dependencies = [ "either", ] [[package]] name = "itoa" version = "1.0.14" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "d75a2a4b1b190afb6f5425f10f6a8f959d2ea0b9c2b1d79553551850539e4674" [[package]] name = "libc" version = "0.2.169" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "b5aba8db14291edd000dfcc4d620c7ebfb122c613afb886ca8803fa4e128a20a" [[package]] name = "memchr" version = "2.7.4" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "78ca9ab1a0babb1e7d5695e3530886289c18cf2f87ec19a575a0abdce112e3a3" [[package]] name = "muldiv" version = "1.0.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "956787520e75e9bd233246045d19f42fb73242759cc57fba9611d940ae96d4b0" [[package]] name = "num-integer" version = "0.1.46" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "7969661fd2958a5cb096e56c8e1ad0444ac2bbcd0061bd28660485a44879858f" dependencies = [ "num-traits", ] [[package]] name = "num-rational" version = "0.4.2" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "f83d14da390562dca69fc84082e73e548e1ad308d24accdedd2720017cb37824" dependencies = [ "num-integer", "num-traits", "serde", ] [[package]] name = "num-traits" version = "0.2.19" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "071dfc062690e90b734c0b2273ce72ad0ffa95f0c74596bc250dcfd960262841" dependencies = [ "autocfg", ] [[package]] name = "once_cell" version = "1.20.2" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "1261fe7e33c73b354eab43b1273a57c8f967d0391e80353e51f764ac02cf6775" [[package]] name = "option-operations" version = "0.5.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "7c26d27bb1aeab65138e4bf7666045169d1717febcc9ff870166be8348b223d0" dependencies = [ "paste", ] [[package]] name = "paste" version = "1.0.15" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "57c0d7b74b563b49d38dae00a0c37d4d6de9b432382b2892f0574ddcae73fd0a" [[package]] name = "pin-project-lite" version = "0.2.15" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "915a1e146535de9163f3987b8944ed8cf49a18bb0056bcebcdcece385cece4ff" [[package]] name = "pin-utils" version = "0.1.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "8b870d8c151b6f2fb93e84a13146138f05d02ed11c7e7c54f8826aaaf7c9f184" [[package]] name = "pkg-config" version = "0.3.31" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "953ec861398dccce10c670dfeaf3ec4911ca479e9c02154b3a215178c5f566f2" [[package]] name = "proc-macro-crate" version = "3.2.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = 
"8ecf48c7ca261d60b74ab1a7b20da18bede46776b2e55535cb958eb595c5fa7b" dependencies = [ "toml_edit", ] [[package]] name = "proc-macro2" version = "1.0.92" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "37d3544b3f2748c54e147655edb5025752e2303145b5aefb3c3ea2c78b973bb0" dependencies = [ "unicode-ident", ] [[package]] name = "quote" version = "1.0.37" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "b5b9d34b8991d19d98081b46eacdd8eb58c6f2b201139f7c5f643cc155a633af" dependencies = [ "proc-macro2", ] [[package]] name = "ryu" version = "1.0.18" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "f3cb5ba0dc43242ce17de99c180e96db90b235b8a9fdc9543c96d2209116bd9f" [[package]] name = "serde" version = "1.0.216" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "0b9781016e935a97e8beecf0c933758c97a5520d32930e460142b4cd80c6338e" dependencies = [ "serde_derive", ] [[package]] name = "serde_bytes" version = "0.11.15" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "387cc504cb06bb40a96c8e04e951fe01854cf6bc921053c954e4a606d9675c6a" dependencies = [ "serde", ] [[package]] name = "serde_derive" version = "1.0.216" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "46f859dbbf73865c6627ed570e78961cd3ac92407a2d117204c49232485da55e" dependencies = [ "proc-macro2", "quote", "syn", ] [[package]] name = "serde_json" version = "1.0.133" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "c7fceb2473b9166b2294ef05efcb65a3db80803f0b03ef86a5fc88a2b85ee377" dependencies = [ "itoa", "memchr", "ryu", "serde", ] [[package]] name = "serde_spanned" version = "0.6.8" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "87607cb1398ed59d48732e575a4c28a7a8ebf2454b964fe3f224f2afc07909e1" dependencies = [ "serde", ] [[package]] name = "slab" version = "0.4.9" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "8f92a496fb766b417c996b9c5e57daf2f7ad3b0bebe1ccfca4856390e3d3bb67" dependencies = [ "autocfg", ] [[package]] name = "smallvec" version = "1.13.2" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "3c5e1a9a646d36c3599cd173a41282daf47c44583ad367b8e6837255952e5c67" [[package]] name = "syn" version = "2.0.90" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "919d3b74a5dd0ccd15aeb8f93e7006bd9e14c295087c9896a110f490752bcf31" dependencies = [ "proc-macro2", "quote", "unicode-ident", ] [[package]] name = "system-deps" version = "7.0.3" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "66d23aaf9f331227789a99e8de4c91bf46703add012bdfd45fdecdfb2975a005" dependencies = [ "cfg-expr", "heck", "pkg-config", "toml", "version-compare", ] [[package]] name = "target-lexicon" version = "0.12.16" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "61c41af27dd6d1e27b1b16b489db798443478cef1f06a660c96db617ba5de3b1" [[package]] name = "thiserror" version = "2.0.8" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "08f5383f3e0071702bf93ab5ee99b52d26936be9dedd9413067cbdcddcb6141a" dependencies = [ "thiserror-impl", ] [[package]] name = "thiserror-impl" version = "2.0.8" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "f2f357fcec90b3caef6623a099691be676d033b40a058ac95d2a6ade6fa0c943" dependencies = [ "proc-macro2", "quote", "syn", ] [[package]] name = "toml" version = 
"0.8.19" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "a1ed1f98e3fdc28d6d910e6737ae6ab1a93bf1985935a1193e68f93eeb68d24e" dependencies = [ "serde", "serde_spanned", "toml_datetime", "toml_edit", ] [[package]] name = "toml_datetime" version = "0.6.8" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "0dd7358ecb8fc2f8d014bf86f6f638ce72ba252a2c3a2572f2a795f1d23efb41" dependencies = [ "serde", ] [[package]] name = "toml_edit" version = "0.22.22" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "4ae48d6208a266e853d946088ed816055e556cc6028c5e8e2b84d9fa5dd7c7f5" dependencies = [ "indexmap", "serde", "serde_spanned", "toml_datetime", "winnow", ] [[package]] name = "unicode-ident" version = "1.0.14" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "adb9e6ca4f869e1180728b7950e35922a7fc6397f7b641499e8f3ef06e50dc83" [[package]] name = "version-compare" version = "0.2.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "852e951cb7832cb45cb1169900d19760cfa39b82bc0ea9c0e5a14ae88411c98b" [[package]] name = "windows-sys" version = "0.59.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "1e38bc4d79ed67fd075bcc251a1c39b32a1776bbe92e5bef1f0bf1f8c531853b" dependencies = [ "windows-targets", ] [[package]] name = "windows-targets" version = "0.52.6" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "9b724f72796e036ab90c1021d4780d4d3d648aca59e491e6b98e725b84e99973" dependencies = [ "windows_aarch64_gnullvm", "windows_aarch64_msvc", "windows_i686_gnu", "windows_i686_gnullvm", "windows_i686_msvc", "windows_x86_64_gnu", "windows_x86_64_gnullvm", "windows_x86_64_msvc", ] [[package]] name = "windows_aarch64_gnullvm" version = "0.52.6" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "32a4622180e7a0ec044bb555404c800bc9fd9ec262ec147edd5989ccd0c02cd3" [[package]] name = "windows_aarch64_msvc" version = "0.52.6" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "09ec2a7bb152e2252b53fa7803150007879548bc709c039df7627cabbd05d469" [[package]] name = "windows_i686_gnu" version = "0.52.6" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "8e9b5ad5ab802e97eb8e295ac6720e509ee4c243f69d781394014ebfe8bbfa0b" [[package]] name = "windows_i686_gnullvm" version = "0.52.6" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "0eee52d38c090b3caa76c563b86c3a4bd71ef1a819287c19d586d7334ae8ed66" [[package]] name = "windows_i686_msvc" version = "0.52.6" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "240948bc05c5e7c6dabba28bf89d89ffce3e303022809e73deaefe4f6ec56c66" [[package]] name = "windows_x86_64_gnu" version = "0.52.6" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "147a5c80aabfbf0c7d901cb5895d1de30ef2907eb21fbbab29ca94c5b08b1a78" [[package]] name = "windows_x86_64_gnullvm" version = "0.52.6" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "24d5b23dc417412679681396f2b49f3de8c1473deb516bd34410872eff51ed0d" [[package]] name = "windows_x86_64_msvc" version = "0.52.6" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "589f6da84c646204747d1270a2a5661ea66ed1cced2631d546fdfb155959f9ec" [[package]] name = "winnow" version = "0.6.20" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = 
"36c1fec1a2bb5866f07c25f68c26e565c4c200aebb96d7e55710c19d3e8ac49b" dependencies = [ "memchr", ] gstreamer-video-0.23.5/Cargo.toml0000644000000054400000000000100122210ustar # THIS FILE IS AUTOMATICALLY GENERATED BY CARGO # # When uploading crates to the registry Cargo will automatically # "normalize" Cargo.toml files for maximal compatibility # with all versions of Cargo and also rewrite `path` dependencies # to registry (e.g., crates.io) dependencies. # # If you are reading this file be aware that the original Cargo.toml # will likely look very different (and much more reasonable). # See Cargo.toml.orig for the original contents. [package] edition = "2021" rust-version = "1.71.1" name = "gstreamer-video" version = "0.23.5" authors = ["Sebastian Dröge "] build = false autolib = false autobins = false autoexamples = false autotests = false autobenches = false description = "Rust bindings for GStreamer Video library" homepage = "https://gstreamer.freedesktop.org" documentation = "https://gstreamer.freedesktop.org/documentation/rust/stable/latest/docs/gstreamer_video/" readme = "README.md" keywords = [ "gstreamer", "multimedia", "audio", "video", "gnome", ] categories = [ "api-bindings", "multimedia", ] license = "MIT OR Apache-2.0" repository = "https://gitlab.freedesktop.org/gstreamer/gstreamer-rs" [package.metadata.docs.rs] all-features = true rustc-args = [ "--cfg", "docsrs", ] rustdoc-args = [ "--cfg", "docsrs", "--generate-link-to-definition", ] [lib] name = "gstreamer_video" path = "src/lib.rs" [[test]] name = "check_gir" path = "tests/check_gir.rs" [dependencies.cfg-if] version = "1.0" [dependencies.futures-channel] version = "0.3" [dependencies.glib] version = "0.20" [dependencies.gst] version = "0.23" package = "gstreamer" [dependencies.gst-base] version = "0.23" package = "gstreamer-base" [dependencies.gstreamer-video-sys] version = "0.23" [dependencies.libc] version = "0.2" [dependencies.once_cell] version = "1" [dependencies.serde] version = "1.0" features = ["derive"] optional = true [dependencies.thiserror] version = "2" [dev-dependencies.gir-format-check] version = "0.1" [dev-dependencies.gst-check] version = "0.23" package = "gstreamer-check" [dev-dependencies.itertools] version = "0.13" [dev-dependencies.serde_json] version = "1.0" [features] default = [] serde = [ "dep:serde", "gst/serde", ] v1_16 = [ "gst/v1_16", "gst-base/v1_16", "gstreamer-video-sys/v1_16", ] v1_18 = [ "gst/v1_18", "gst-base/v1_18", "gstreamer-video-sys/v1_18", "v1_16", ] v1_20 = [ "gst/v1_20", "gst-base/v1_20", "gstreamer-video-sys/v1_20", "v1_18", ] v1_22 = [ "gst/v1_22", "gst-base/v1_22", "gstreamer-video-sys/v1_22", "v1_20", ] v1_24 = [ "gst/v1_24", "gst-base/v1_24", "gstreamer-video-sys/v1_24", "v1_22", ] v1_26 = [ "gst/v1_26", "gst-base/v1_26", "gstreamer-video-sys/v1_26", "v1_24", ] gstreamer-video-0.23.5/Cargo.toml.orig000064400000000000000000000030731046102023000157020ustar 00000000000000[package] name = "gstreamer-video" authors = ["Sebastian Dröge "] description = "Rust bindings for GStreamer Video library" license = "MIT OR Apache-2.0" readme = "README.md" documentation = "https://gstreamer.freedesktop.org/documentation/rust/stable/latest/docs/gstreamer_video/" keywords = ["gstreamer", "multimedia", "audio", "video", "gnome"] version.workspace = true categories.workspace = true repository.workspace = true homepage.workspace = true edition.workspace = true rust-version.workspace = true [dependencies] libc = "0.2" cfg-if = "1.0" gstreamer-video-sys.workspace = true glib.workspace = true 
gst.workspace = true gst-base.workspace = true futures-channel = "0.3" serde = { version = "1.0", optional = true, features = ["derive"] } thiserror = "2" once_cell = "1" [dev-dependencies] itertools = "0.13" gst-check.workspace = true serde_json = "1.0" gir-format-check = "0.1" [features] default = [] v1_16 = ["gst/v1_16", "gst-base/v1_16", "gstreamer-video-sys/v1_16"] v1_18 = ["gst/v1_18", "gst-base/v1_18", "gstreamer-video-sys/v1_18", "v1_16"] v1_20 = ["gst/v1_20", "gst-base/v1_20", "gstreamer-video-sys/v1_20", "v1_18"] v1_22 = ["gst/v1_22", "gst-base/v1_22", "gstreamer-video-sys/v1_22", "v1_20"] v1_24 = ["gst/v1_24", "gst-base/v1_24", "gstreamer-video-sys/v1_24", "v1_22"] v1_26 = ["gst/v1_26", "gst-base/v1_26", "gstreamer-video-sys/v1_26", "v1_24"] serde = ["dep:serde", "gst/serde"] [package.metadata.docs.rs] all-features = true rustc-args = ["--cfg", "docsrs"] rustdoc-args = ["--cfg", "docsrs", "--generate-link-to-definition"] gstreamer-video-0.23.5/Gir.toml000064400000000000000000000404071046102023000144330ustar 00000000000000[options] girs_directories = ["../gir-files", "../gst-gir-files"] library = "GstVideo" version = "1.0" min_cfg_version = "1.14" work_mode = "normal" concurrency = "send+sync" generate_safety_asserts = true single_version_file = true generate_display_trait = false trust_return_value_nullability = true external_libraries = [ "GLib", "GObject", "Gst", "GstBase", ] generate = [ "GstVideo.AncillaryMetaField", "GstVideo.ColorBalance", "GstVideo.ColorBalanceChannel", "GstVideo.ColorBalanceType", "GstVideo.NavigationMessageType", "GstVideo.NavigationQueryType", "GstVideo.VideoAFDSpec", "GstVideo.VideoAFDValue", "GstVideo.VideoAggregatorParallelConvertPad", "GstVideo.VideoAlphaMode", "GstVideo.VideoAncillaryDID", "GstVideo.VideoAncillaryDID16", "GstVideo.VideoBufferPool", "GstVideo.VideoChromaMode", "GstVideo.VideoDecoderRequestSyncPointFlags", "GstVideo.VideoDitherMethod", "GstVideo.VideoFilter", "GstVideo.VideoFormatFlags", "GstVideo.VideoGammaMode", "GstVideo.VideoMatrixMode", "GstVideo.VideoMultiviewFramePacking", "GstVideo.VideoMultiviewMode", "GstVideo.VideoOrientationMethod", "GstVideo.VideoPrimariesMode", "GstVideo.VideoResamplerMethod", "GstVideo.VideoTileMode", ] manual = [ "GLib.DateTime", "GObject.Object", "GObject.Value", "Gst.AllocationParams", "Gst.Allocator", "Gst.Buffer", "Gst.BufferPool", "Gst.BufferPoolAcquireParams", "Gst.ClockTimeDiff", "Gst.Element", "Gst.Format", "Gst.Memory", "Gst.Message", "Gst.Object", "Gst.Pad", "Gst.Pipeline", "Gst.State", "Gst.TagList", "Gst.TagMergeMode", "GstBase.Aggregator", "GstBase.AggregatorPad", "GstBase.BaseSink", "GstBase.BaseTransform", "GstVideo.VideoAncillary", "GstVideo.VideoCodecFrame", "GstVideo.VideoCodecState", "GstVideo.VideoColorimetry", "GstVideo.VideoColorRange", "GstVideo.VideoFormatInfo", "GstVideo.VideoInfo", "GstVideo.VideoInfoDmaDrm", "GstVideo.VideoMeta", "GstVideo.VideoTimeCode", "GstVideo.VideoTimeCodeInterval", "GstVideo.VideoVBIEncoder", "GstVideo.VideoVBIParser", ] [[object]] name = "Gst.Caps" status = "manual" ref_mode = "ref" [[object]] name = "Gst.ClockTime" status = "manual" conversion_type = "Option" [[object]] name = "Gst.Event" status = "manual" ref_mode = "ref" [[object]] name = "Gst.FlowReturn" status = "manual" must_use = true [object.conversion_type] variant = "Result" ok_type = "gst::FlowSuccess" err_type = "gst::FlowError" [[object]] name = "Gst.Query" status = "manual" ref_mode = "ref" [[object]] name = "Gst.Structure" status = "manual" ref_mode = "ref" [[object]] name = 
"GstVideo.Navigation" status = "generate" [[object.function]] name = "event_parse_key_event" manual = true [[object.function]] name = "event_parse_mouse_button_event" manual = true [[object.function]] name = "event_parse_mouse_move_event" manual = true [[object.function]] name = "event_parse_command" manual = true [[object.function]] name = "event_parse_touch_event" manual = true [[object.function]] name = "event_parse_touch_up_event" manual = true [[object.function]] name = "event_parse_mouse_scroll_event" manual = true [[object.function]] name = "event_parse_modifier_state" manual = true [[object.function]] name = "event_get_type" manual = true [[object.function]] name = "event_new_command" manual = true [[object.function]] name = "event_new_key_press" manual = true [[object.function]] name = "event_new_key_release" manual = true [[object.function]] name = "event_new_mouse_button_press" manual = true [[object.function]] name = "event_new_mouse_button_release" manual = true [[object.function]] name = "event_new_mouse_move" manual = true [[object.function]] name = "event_new_mouse_scroll" manual = true [[object.function]] name = "event_new_touch_down" manual = true [[object.function]] name = "event_new_touch_motion" manual = true [[object.function]] name = "event_new_touch_up" manual = true [[object.function]] name = "event_new_touch_frame" manual = true [[object.function]] name = "event_new_mouse_double_click" manual = true [[object.function]] name = "event_new_touch_cancel" manual = true [[object.function]] name = "event_get_coordinates" manual = true [[object.function]] name = "event_set_coordinates" manual = true [[object.function]] name = "send_event" [[object.function.parameter]] name = "structure" move = true [[object.function]] name = "send_key_event" [[object.function]] name = "send_mouse_event" [[object.function]] name = "send_mouse_scroll_event" [[object.function]] name = "send_command" [[object.function]] name = "message_parse_event" manual = true [[object.function]] name = "message_get_type" manual = true [[object.function]] name = "message_new_event" manual = true [[object.function]] name = "message_new_commands_changed" ignore = true [[object.function]] name = "message_parse_angles_changed" ignore = true [[object.function]] name = "message_new_mouse_over" ignore = true [[object.function]] name = "message_parse_mouse_over" ignore = true [[object.function]] name = "message_new_angles_changed" ignore = true [[object.function]] name = "query_get_type" ignore = true [[object.function]] name = "query_new_angles" ignore = true [[object.function]] name = "query_new_commands" ignore = true [[object.function]] name = "query_parse_angles" ignore = true [[object.function]] name = "query_parse_commands_length" ignore = true [[object.function]] name = "query_parse_commands_nth" ignore = true [[object.function]] name = "query_set_angles" ignore = true [[object]] name = "GstVideo.NavigationCommand" status = "generate" [[object.derive]] name = "serde::Serialize, serde::Deserialize" cfg_condition = "feature = \"serde\"" [[object.derive]] name = "Debug, Eq, PartialEq, Ord, PartialOrd, Hash" [[object]] name = "GstVideo.NavigationEventType" status = "generate" [[object.member]] name = "mouse_scroll" version = "1.18" [[object.member]] name = "touch_down" version = "1.22" [[object.member]] name = "touch_motion" version = "1.22" [[object.member]] name = "touch_up" version = "1.22" [[object.member]] name = "touch_frame" version = "1.22" [[object.member]] name = "touch_cancel" version = "1.22" 
[[object]] name = "GstVideo.NavigationModifierType" status = "generate" [[object.member]] name = "none" ignore = true [[object.member]] name = "mask" ignore = true [[object]] name = "GstVideo.VideoAggregator" status = "generate" [[object.property]] name = "force-live" # getter/setter exists in base class ignore = true [[object]] name = "GstVideo.VideoAggregatorConvertPad" status = "generate" [[object.property]] name = "converter-config" # wrong type manual = true [[object]] name = "GstVideo.VideoAggregatorPad" status = "generate" [[object.function]] name = "get_current_buffer" # needs special considerations manual = true [[object.function]] name = "get_prepared_frame" # needs special considerations manual = true [[object.function]] name = "has_current_buffer" # needs special considerations manual = true [[object]] name = "GstVideo.VideoBufferFlags" status = "generate" [[object.member]] name = "top_field" version = "1.16" [[object.member]] name = "bottom_field" version = "1.16" [[object.member]] name = "marker" version = "1.18" [[object.member]] name = "last" ignore = true [[object]] name = "GstVideo.VideoCaptionType" status = "generate" [[object.function]] name = "from_caps" # Use &CapsRef manual = true [[object]] name = "GstVideo.VideoChromaSite" status = "generate" [[object.member]] name = "unknown" ignore = true [[object.function]] name = "to_string" # Manual function for < v1_20: manual = true # Always generate the trait, without version constraint: version = "1.8" [object.function.return] nullable = false [[object]] name = "GstVideo.VideoCodecFrameFlags" status = "generate" [[object.function]] name = "get_type" version = "1.20" [[object]] name = "GstVideo.VideoColorMatrix" status = "generate" [[object.function]] name = "get_Kr_Kb" # Function and parameter name capitalization is wrong ignore = true [[object]] name = "GstVideo.VideoColorPrimaries" status = "generate" [[object.member]] name = "smptest428" version = "1.16" [[object.member]] name = "smpterp431" version = "1.16" [[object.member]] name = "smpteeg432" version = "1.16" [[object.member]] name = "ebu3213" version = "1.16" [[object]] name = "GstVideo.VideoDecoder" status = "generate" manual_traits = ["VideoDecoderExtManual"] [[object.function]] name = "allocate_output_frame" manual = true [[object.function]] name = "allocate_output_frame_with_params" ignore = true [[object.function]] name = "get_processed_subframe_index" manual = true [[object.function]] name = "get_input_subframe_index" manual = true [[object.function]] name = "set_latency" manual = true [[object.function]] name = "get_latency" manual = true [[object.function]] name = "get_frame" manual = true [[object.function]] name = "get_frames" manual = true [[object.function]] name = "get_oldest_frame" manual = true [[object.function]] name = "get_output_state" manual = true [[object.function]] name = "set_output_state" manual = true [[object.function]] name = "set_interlaced_output_state" manual = true [[object.function]] name = "negotiate" manual = true [[object.function]] name = "get_allocator" manual = true [[object.function]] name = "allocate_output_buffer" [object.function.return] nullable_return_is_error = "Failed to allocate output buffer" [[object]] name = "GstVideo.VideoEncoder" status = "generate" manual_traits = ["VideoEncoderExtManual"] [[object.function]] name = "allocate_output_frame" manual = true [[object.function]] name = "allocate_output_frame_with_params" ignore = true [[object.function]] name = "finish_subframe" manual = true [[object.function]] name = 
"set_latency" manual = true [[object.function]] name = "get_latency" manual = true [[object.function]] name = "get_frame" manual = true [[object.function]] name = "get_frames" manual = true [[object.function]] name = "get_oldest_frame" manual = true [[object.function]] name = "get_output_state" manual = true [[object.function]] name = "set_output_state" manual = true [[object.function]] name = "negotiate" manual = true [[object.function]] name = "get_allocator" manual = true [[object.function]] name = "set_headers" manual = true [[object]] name = "GstVideo.VideoFieldOrder" status = "generate" [[object.function]] name = "to_string" # This has an Unknown field that may return NULL or "UNKNOWN" manual = true [[object]] name = "GstVideo.VideoFlags" status = "generate" [[object.member]] name = "none" ignore = true [[object]] name = "GstVideo.VideoFormat" status = "generate" [[object.derive]] name = "Debug, Eq, PartialEq, Hash" [[object.member]] name = "nv12_10le40" version = "1.16" [[object.member]] name = "y210" version = "1.16" [[object.member]] name = "y410" version = "1.16" [[object.member]] name = "vuya" version = "1.16" [[object.member]] name = "bgr10a2_le" version = "1.16" [[object.member]] name = "rgb10a2_le" version = "1.18" [[object.member]] name = "y444_16be" version = "1.18" [[object.member]] name = "y444_16le" version = "1.18" [[object.member]] name = "p016_be" version = "1.18" [[object.member]] name = "p016_le" version = "1.18" [[object.member]] name = "p012_be" version = "1.18" [[object.member]] name = "p012_le" version = "1.18" [[object.member]] name = "y212_be" version = "1.18" [[object.member]] name = "y212_le" version = "1.18" [[object.member]] name = "y412_be" version = "1.18" [[object.member]] name = "y412_le" version = "1.18" [[object.member]] name = "last" ignore = true [[object.function]] name = "to_string" # This has an Unknown field that may return NULL or "UNKNOWN" manual = true [[object.function]] name = "get_info" # Result is not nullable, function does effectively the same # as VideoFormatInfo::from_format() ignore = true [[object.function]] name = "from_masks" # Use custom VideoEndianness enum manual = true [[object]] name = "GstVideo.VideoFrame" status = "manual" [[object.function]] name = "map" # Readable and writable variant dealing with mutability rename = "from_buffer_readable" [[object]] name = "GstVideo.VideoFrameFlags" status = "generate" [[object.member]] name = "top_field" version = "1.16" [[object.member]] name = "bottom_field" version = "1.16" [[object.member]] name = "none" ignore = true [[object]] name = "GstVideo.VideoInterlaceMode" status = "generate" [[object.member]] name = "alternate" version = "1.16" [[object]] name = "GstVideo.VideoMultiviewFlags" status = "generate" [[object.member]] name = "none" ignore = true [[object]] name = "GstVideo.VideoOrientation" status = "generate" [[object.function]] name = "set_hcenter" [object.function.return] bool_return_is_error = "Failed to set horizontal centering" [[object.function]] name = "set_hflip" [object.function.return] bool_return_is_error = "Failed to set horizontal flipping" [[object.function]] name = "set_vcenter" [object.function.return] bool_return_is_error = "Failed to set vertical centering" [[object.function]] name = "set_vflip" [object.function.return] bool_return_is_error = "Failed to set vertical flipping" [[object.function]] name = "from_tag" # Use &TagListRef and move to the enum manual = true [[object]] name = "GstVideo.VideoOverlay" status = "generate" manual_traits = 
["VideoOverlayExtManual"] [[object.function]] name = "set_property" # Only for implementors of GstVideoOverlay ignore = true [[object.function]] name = "set_window_handle" # Pointer argument manual = true [[object.function]] name = "set_render_rectangle" [object.function.return] bool_return_is_error = "Failed to set render rectangle" [[object]] name = "GstVideo.VideoOverlayFormatFlags" status = "generate" [[object.function]] name = "get_type" version = "1.16" [[object.member]] name = "none" ignore = true [[object]] name = "GstVideo.VideoPackFlags" status = "generate" [[object.member]] name = "none" ignore = true [[object]] name = "GstVideo.VideoSink" status = "generate" [[object.function]] name = "center_rect" # Implemented in video_rectangle manual = true [[object]] name = "GstVideo.VideoTimeCodeFlags" status = "generate" [[object.function]] name = "get_type" version = "1.18" [[object.member]] name = "none" ignore = true [[object]] name = "GstVideo.VideoTransferFunction" status = "generate" [[object.member]] name = "bt2020_10" version = "1.18" [[object.member]] name = "smpte2084" version = "1.18" [[object.member]] name = "arib_std_b67" version = "1.18" gstreamer-video-0.23.5/LICENSE-APACHE000064400000000000000000000251371046102023000147440ustar 00000000000000 Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. 
"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. 
You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: (a) You must give any other recipients of the Work or Derivative Works a copy of this License; and (b) You must cause any modified files to carry prominent notices stating that You changed the files; and (c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and (d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. 
In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONS APPENDIX: How to apply the Apache License to your work. To apply the Apache License to your work, attach the following boilerplate notice, with the fields enclosed by brackets "[]" replaced with your own identifying information. (Don't include the brackets!) The text should be enclosed in the appropriate comment syntax for the file format. We also recommend that a file or class name and description of purpose be included on the same "printed page" as the copyright notice for easier identification within third-party archives. Copyright [yyyy] [name of copyright owner] Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. gstreamer-video-0.23.5/LICENSE-MIT000064400000000000000000000017771046102023000144600ustar 00000000000000Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. gstreamer-video-0.23.5/README.md000064400000000000000000000176721046102023000143040ustar 00000000000000# gstreamer-rs [![crates.io](https://img.shields.io/crates/v/gstreamer-video.svg)](https://crates.io/crates/gstreamer-video) [![pipeline status](https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/badges/main/pipeline.svg)](https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/commits/main) [GStreamer](https://gstreamer.freedesktop.org/) (Video library) bindings for Rust. Documentation can be found [here](https://gstreamer.freedesktop.org/documentation/rust/stable/latest/docs/gstreamer_video/). These bindings are providing a safe API that can be used to interface with GStreamer, e.g. for writing GStreamer-based applications and GStreamer plugins. The bindings are mostly autogenerated with [gir](https://github.com/gtk-rs/gir/) based on the [GObject-Introspection](https://wiki.gnome.org/Projects/GObjectIntrospection/) API metadata provided by the GStreamer project. ## Table of Contents 1. [Installation](#installation) 1. [Linux/BSDs](#installation-linux) 1. [macOS](#installation-macos) 1. [Windows](#installation-windows) 1. [Getting Started](#getting-started) 1. [License](#license) 1. [Contribution](#contribution) ## Installation To build the GStreamer bindings or anything depending on them, you need to have at least GStreamer 1.14 and gst-plugins-base 1.14 installed. In addition, some of the examples/tutorials require various GStreamer plugins to be available, which can be found in gst-plugins-base, gst-plugins-good, gst-plugins-bad, gst-plugins-ugly and/or gst-libav. ### Linux/BSDs You need to install the above mentioned packages with your distributions package manager, or build them from source. On Debian/Ubuntu they can be installed with ```console $ apt-get install libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev \ gstreamer1.0-plugins-base gstreamer1.0-plugins-good \ gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly \ gstreamer1.0-libav libgstrtspserver-1.0-dev libges-1.0-dev ``` The minimum required version of the above libraries is >= 1.14. If you build the gstreamer-player sub-crate, or any of the examples that depend on gstreamer-player, you must ensure that in addition to the above packages, `libgstreamer-plugins-bad1.0-dev` is installed. See the `Cargo.toml` files for the full details, ```console $ apt-get install libgstreamer-plugins-bad1.0-dev ``` Package names on other distributions should be similar. Please submit a pull request with instructions for yours. ### macOS You can install GStreamer and the plugins via [Homebrew](https://brew.sh/) or by installing the [binaries](https://gstreamer.freedesktop.org/data/pkg/osx/) provided by the GStreamer project. We recommend using the official GStreamer binaries over Homebrew, especially as GStreamer in Homebrew is [currently broken](https://github.com/orgs/Homebrew/discussions/3740#discussioncomment-3804964). #### GStreamer Binaries You need to download the *two* `.pkg` files from the GStreamer website and install them, e.g. `gstreamer-1.0-1.20.4-universal.pkg` and `gstreamer-1.0-devel-1.20.4-universal.pkg`. 
After installation, you also need to set the `PATH` environment variable as follows: ```console $ export PATH="/Library/Frameworks/GStreamer.framework/Versions/1.0/bin${PATH:+:$PATH}" ``` Also note that the `pkg-config` from GStreamer should be the first one in the `PATH` as other versions have all kinds of quirks that will cause problems. #### Homebrew Homebrew only installs various plugins if explicitly enabled, so some extra `--with-*` flags may be required. ```console $ brew install gstreamer gst-plugins-base gst-plugins-good \ gst-plugins-bad gst-plugins-ugly gst-libav gst-rtsp-server \ gst-editing-services --with-orc --with-libogg --with-opus \ --with-pango --with-theora --with-libvorbis --with-libvpx \ --enable-gtk3 ``` Make sure the version of these libraries is >= 1.14. ### Windows You can install GStreamer and the plugins via [MSYS2](http://www.msys2.org/) with `pacman` or by installing the [binaries](https://gstreamer.freedesktop.org/data/pkg/windows/) provided by the GStreamer project. We recommend using the official GStreamer binaries over MSYS2. #### GStreamer Binaries You need to download the *two* `.msi` files for your platform from the GStreamer website and install them, e.g. `gstreamer-1.0-x86_64-1.20.4.msi` and `gstreamer-1.0-devel-x86_64-1.20.4.msi`. Make sure to select the version that matches your Rust toolchain, i.e. MinGW or MSVC. After installation, set the `PATH` environment variable as follows: ```console # For a UNIX-style shell: $ export PATH="c:/gstreamer/1.0/msvc_x86_64/bin${PATH:+:$PATH}" # For cmd.exe: $ set PATH=C:\gstreamer\1.0\msvc_x86_64\bin;%PATH% ``` Make sure to update the path to where you have actually installed GStreamer and for the corresponding toolchain. Also note that the `pkg-config.exe` from GStreamer should be the first one in the `PATH` as other versions have all kinds of quirks that will cause problems. #### MSYS2 / pacman ```console $ pacman -S glib2-devel pkg-config \ mingw-w64-x86_64-gstreamer mingw-w64-x86_64-gst-plugins-base \ mingw-w64-x86_64-gst-plugins-good mingw-w64-x86_64-gst-plugins-bad \ mingw-w64-x86_64-gst-plugins-ugly mingw-w64-x86_64-gst-libav \ mingw-w64-x86_64-gst-rtsp-server ``` Make sure the version of these libraries is >= 1.14. Note that the version of `pkg-config` included in `MSYS2` is [known to have problems](https://github.com/rust-lang/pkg-config-rs/issues/51#issuecomment-346300858) compiling GStreamer, so you may need to install another version. One option would be [`pkg-config-lite`](https://sourceforge.net/projects/pkgconfiglite/). ## Getting Started The API reference can be found [here](https://gstreamer.freedesktop.org/documentation/rust/stable/latest/docs/gstreamer/); however, it is only the Rust API reference and does not explain any of the concepts. For getting started with GStreamer development, the best approach is to follow the [documentation](https://gstreamer.freedesktop.org/documentation/) on the GStreamer website, especially the [Application Development Manual](https://gstreamer.freedesktop.org/documentation/application-development/). While being C-centric, it explains all the fundamental concepts of GStreamer and the code examples should be relatively easily translatable to Rust. The API is basically the same, function/struct names are the same and everything is only more convenient (hopefully) and safer. In addition there are [tutorials](https://gstreamer.freedesktop.org/documentation/tutorials/) on the GStreamer website.
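To make the "translatable to Rust" point concrete, here is a minimal sketch of the usual C hello-world pipeline written against the `gstreamer` crate (imported as `gst`); it assumes the `videotestsrc` and `autovideosink` elements from the plugin packages installed above are available:

```rust
use gstreamer as gst;
use gst::prelude::*;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Corresponds to gst_init() in C.
    gst::init()?;

    // gst_parse_launch() in C; returns a gst::Element describing the pipeline.
    let pipeline = gst::parse::launch("videotestsrc num-buffers=100 ! autovideosink")?;
    pipeline.set_state(gst::State::Playing)?;

    // Block on the bus until EOS or an error, as the C examples do.
    let bus = pipeline.bus().expect("pipeline without a bus");
    for msg in bus.iter_timed(gst::ClockTime::NONE) {
        match msg.view() {
            gst::MessageView::Eos(..) => break,
            gst::MessageView::Error(err) => {
                eprintln!("Error: {}", err.error());
                break;
            }
            _ => (),
        }
    }

    pipeline.set_state(gst::State::Null)?;
    Ok(())
}
```

The GStreamer tutorials walk through much more complete versions of this pattern.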
Many of them were ported to Rust already and the code can be found in the [tutorials](https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/tree/main/tutorials) directory. Some further examples for various aspects of GStreamer and how to use it from Rust can be found in the [examples](https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/tree/main/examples) directory. Various GStreamer plugins written in Rust can be found in the [gst-plugins-rs](https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs) repository. ## LICENSE gstreamer-rs and all crates contained in here are licensed under either of * Apache License, Version 2.0, ([LICENSE-APACHE](LICENSE-APACHE) or http://www.apache.org/licenses/LICENSE-2.0) * MIT license ([LICENSE-MIT](LICENSE-MIT) or http://opensource.org/licenses/MIT) at your option. GStreamer itself is licensed under the Lesser General Public License version 2.1 or (at your option) any later version: https://www.gnu.org/licenses/lgpl-2.1.html ## Contribution Any kinds of contributions are welcome as a pull request. Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in gstreamer-rs by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions. gstreamer-video-0.23.5/src/auto/color_balance.rs000064400000000000000000000074441046102023000177110ustar 00000000000000// This file was generated by gir (https://github.com/gtk-rs/gir) // from gir-files (https://github.com/gtk-rs/gir-files) // from gst-gir-files (https://gitlab.freedesktop.org/gstreamer/gir-files-rs.git) // DO NOT EDIT use crate::{ffi, ColorBalanceChannel, ColorBalanceType}; use glib::{ object::ObjectType as _, prelude::*, signal::{connect_raw, SignalHandlerId}, translate::*, }; use std::boxed::Box as Box_; glib::wrapper! 
{ #[doc(alias = "GstColorBalance")] pub struct ColorBalance(Interface); match fn { type_ => || ffi::gst_color_balance_get_type(), } } impl ColorBalance { pub const NONE: Option<&'static ColorBalance> = None; } unsafe impl Send for ColorBalance {} unsafe impl Sync for ColorBalance {} mod sealed { pub trait Sealed {} impl> Sealed for T {} } pub trait ColorBalanceExt: IsA + sealed::Sealed + 'static { #[doc(alias = "gst_color_balance_get_balance_type")] #[doc(alias = "get_balance_type")] fn balance_type(&self) -> ColorBalanceType { unsafe { from_glib(ffi::gst_color_balance_get_balance_type( self.as_ref().to_glib_none().0, )) } } #[doc(alias = "gst_color_balance_get_value")] #[doc(alias = "get_value")] fn value(&self, channel: &impl IsA) -> i32 { unsafe { ffi::gst_color_balance_get_value( self.as_ref().to_glib_none().0, channel.as_ref().to_glib_none().0, ) } } #[doc(alias = "gst_color_balance_list_channels")] fn list_channels(&self) -> Vec { unsafe { FromGlibPtrContainer::from_glib_none(ffi::gst_color_balance_list_channels( self.as_ref().to_glib_none().0, )) } } #[doc(alias = "gst_color_balance_set_value")] fn set_value(&self, channel: &impl IsA, value: i32) { unsafe { ffi::gst_color_balance_set_value( self.as_ref().to_glib_none().0, channel.as_ref().to_glib_none().0, value, ); } } #[doc(alias = "gst_color_balance_value_changed")] fn value_changed(&self, channel: &impl IsA, value: i32) { unsafe { ffi::gst_color_balance_value_changed( self.as_ref().to_glib_none().0, channel.as_ref().to_glib_none().0, value, ); } } #[doc(alias = "value-changed")] fn connect_value_changed( &self, f: F, ) -> SignalHandlerId { unsafe extern "C" fn value_changed_trampoline< P: IsA, F: Fn(&P, &ColorBalanceChannel, i32) + Send + Sync + 'static, >( this: *mut ffi::GstColorBalance, channel: *mut ffi::GstColorBalanceChannel, value: std::ffi::c_int, f: glib::ffi::gpointer, ) { let f: &F = &*(f as *const F); f( ColorBalance::from_glib_borrow(this).unsafe_cast_ref(), &from_glib_borrow(channel), value, ) } unsafe { let f: Box_ = Box_::new(f); connect_raw( self.as_ptr() as *mut _, b"value-changed\0".as_ptr() as *const _, Some(std::mem::transmute::<*const (), unsafe extern "C" fn()>( value_changed_trampoline:: as *const (), )), Box_::into_raw(f), ) } } } impl> ColorBalanceExt for O {} gstreamer-video-0.23.5/src/auto/color_balance_channel.rs000064400000000000000000000041531046102023000213730ustar 00000000000000// This file was generated by gir (https://github.com/gtk-rs/gir) // from gir-files (https://github.com/gtk-rs/gir-files) // from gst-gir-files (https://gitlab.freedesktop.org/gstreamer/gir-files-rs.git) // DO NOT EDIT use crate::ffi; use glib::{ object::ObjectType as _, prelude::*, signal::{connect_raw, SignalHandlerId}, translate::*, }; use std::boxed::Box as Box_; glib::wrapper! 
{ #[doc(alias = "GstColorBalanceChannel")] pub struct ColorBalanceChannel(Object); match fn { type_ => || ffi::gst_color_balance_channel_get_type(), } } impl ColorBalanceChannel { pub const NONE: Option<&'static ColorBalanceChannel> = None; } unsafe impl Send for ColorBalanceChannel {} unsafe impl Sync for ColorBalanceChannel {} mod sealed { pub trait Sealed {} impl> Sealed for T {} } pub trait ColorBalanceChannelExt: IsA + sealed::Sealed + 'static { #[doc(alias = "value-changed")] fn connect_value_changed( &self, f: F, ) -> SignalHandlerId { unsafe extern "C" fn value_changed_trampoline< P: IsA, F: Fn(&P, i32) + Send + Sync + 'static, >( this: *mut ffi::GstColorBalanceChannel, value: std::ffi::c_int, f: glib::ffi::gpointer, ) { let f: &F = &*(f as *const F); f( ColorBalanceChannel::from_glib_borrow(this).unsafe_cast_ref(), value, ) } unsafe { let f: Box_ = Box_::new(f); connect_raw( self.as_ptr() as *mut _, b"value-changed\0".as_ptr() as *const _, Some(std::mem::transmute::<*const (), unsafe extern "C" fn()>( value_changed_trampoline:: as *const (), )), Box_::into_raw(f), ) } } } impl> ColorBalanceChannelExt for O {} gstreamer-video-0.23.5/src/auto/enums.rs000064400000000000000000004443511046102023000162570ustar 00000000000000// This file was generated by gir (https://github.com/gtk-rs/gir) // from gir-files (https://github.com/gtk-rs/gir-files) // from gst-gir-files (https://gitlab.freedesktop.org/gstreamer/gir-files-rs.git) // DO NOT EDIT use crate::ffi; use glib::{prelude::*, translate::*, GStr}; #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstAncillaryMetaField")] pub enum AncillaryMetaField { #[doc(alias = "GST_ANCILLARY_META_FIELD_PROGRESSIVE")] Progressive, #[doc(alias = "GST_ANCILLARY_META_FIELD_INTERLACED_FIRST")] InterlacedFirst, #[doc(alias = "GST_ANCILLARY_META_FIELD_INTERLACED_SECOND")] InterlacedSecond, #[doc(hidden)] __Unknown(i32), } #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(hidden)] impl IntoGlib for AncillaryMetaField { type GlibType = ffi::GstAncillaryMetaField; #[inline] fn into_glib(self) -> ffi::GstAncillaryMetaField { match self { Self::Progressive => ffi::GST_ANCILLARY_META_FIELD_PROGRESSIVE, Self::InterlacedFirst => ffi::GST_ANCILLARY_META_FIELD_INTERLACED_FIRST, Self::InterlacedSecond => ffi::GST_ANCILLARY_META_FIELD_INTERLACED_SECOND, Self::__Unknown(value) => value, } } } #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(hidden)] impl FromGlib for AncillaryMetaField { #[inline] unsafe fn from_glib(value: ffi::GstAncillaryMetaField) -> Self { skip_assert_initialized!(); match value { ffi::GST_ANCILLARY_META_FIELD_PROGRESSIVE => Self::Progressive, ffi::GST_ANCILLARY_META_FIELD_INTERLACED_FIRST => Self::InterlacedFirst, ffi::GST_ANCILLARY_META_FIELD_INTERLACED_SECOND => Self::InterlacedSecond, value => Self::__Unknown(value), } } } #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] impl StaticType for AncillaryMetaField { #[inline] #[doc(alias = "gst_ancillary_meta_field_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_ancillary_meta_field_get_type()) } } } #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] impl glib::HasParamSpec for AncillaryMetaField { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn 
param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] impl glib::value::ValueType for AncillaryMetaField { type Type = Self; } #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] unsafe impl<'a> glib::value::FromValue<'a> for AncillaryMetaField { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] impl ToValue for AncillaryMetaField { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] impl From for glib::Value { #[inline] fn from(v: AncillaryMetaField) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstColorBalanceType")] pub enum ColorBalanceType { #[doc(alias = "GST_COLOR_BALANCE_HARDWARE")] Hardware, #[doc(alias = "GST_COLOR_BALANCE_SOFTWARE")] Software, #[doc(hidden)] __Unknown(i32), } #[doc(hidden)] impl IntoGlib for ColorBalanceType { type GlibType = ffi::GstColorBalanceType; #[inline] fn into_glib(self) -> ffi::GstColorBalanceType { match self { Self::Hardware => ffi::GST_COLOR_BALANCE_HARDWARE, Self::Software => ffi::GST_COLOR_BALANCE_SOFTWARE, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for ColorBalanceType { #[inline] unsafe fn from_glib(value: ffi::GstColorBalanceType) -> Self { skip_assert_initialized!(); match value { ffi::GST_COLOR_BALANCE_HARDWARE => Self::Hardware, ffi::GST_COLOR_BALANCE_SOFTWARE => Self::Software, value => Self::__Unknown(value), } } } impl StaticType for ColorBalanceType { #[inline] #[doc(alias = "gst_color_balance_type_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_color_balance_type_get_type()) } } } impl glib::HasParamSpec for ColorBalanceType { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for ColorBalanceType { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for ColorBalanceType { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for ColorBalanceType { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: ColorBalanceType) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[cfg_attr(feature = "serde", derive(serde::Serialize, serde::Deserialize))] #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstNavigationCommand")] pub enum 
NavigationCommand { #[doc(alias = "GST_NAVIGATION_COMMAND_INVALID")] Invalid, #[doc(alias = "GST_NAVIGATION_COMMAND_MENU1")] Menu1, #[doc(alias = "GST_NAVIGATION_COMMAND_MENU2")] Menu2, #[doc(alias = "GST_NAVIGATION_COMMAND_MENU3")] Menu3, #[doc(alias = "GST_NAVIGATION_COMMAND_MENU4")] Menu4, #[doc(alias = "GST_NAVIGATION_COMMAND_MENU5")] Menu5, #[doc(alias = "GST_NAVIGATION_COMMAND_MENU6")] Menu6, #[doc(alias = "GST_NAVIGATION_COMMAND_MENU7")] Menu7, #[doc(alias = "GST_NAVIGATION_COMMAND_LEFT")] Left, #[doc(alias = "GST_NAVIGATION_COMMAND_RIGHT")] Right, #[doc(alias = "GST_NAVIGATION_COMMAND_UP")] Up, #[doc(alias = "GST_NAVIGATION_COMMAND_DOWN")] Down, #[doc(alias = "GST_NAVIGATION_COMMAND_ACTIVATE")] Activate, #[doc(alias = "GST_NAVIGATION_COMMAND_PREV_ANGLE")] PrevAngle, #[doc(alias = "GST_NAVIGATION_COMMAND_NEXT_ANGLE")] NextAngle, #[doc(hidden)] __Unknown(i32), } #[doc(hidden)] impl IntoGlib for NavigationCommand { type GlibType = ffi::GstNavigationCommand; fn into_glib(self) -> ffi::GstNavigationCommand { match self { Self::Invalid => ffi::GST_NAVIGATION_COMMAND_INVALID, Self::Menu1 => ffi::GST_NAVIGATION_COMMAND_MENU1, Self::Menu2 => ffi::GST_NAVIGATION_COMMAND_MENU2, Self::Menu3 => ffi::GST_NAVIGATION_COMMAND_MENU3, Self::Menu4 => ffi::GST_NAVIGATION_COMMAND_MENU4, Self::Menu5 => ffi::GST_NAVIGATION_COMMAND_MENU5, Self::Menu6 => ffi::GST_NAVIGATION_COMMAND_MENU6, Self::Menu7 => ffi::GST_NAVIGATION_COMMAND_MENU7, Self::Left => ffi::GST_NAVIGATION_COMMAND_LEFT, Self::Right => ffi::GST_NAVIGATION_COMMAND_RIGHT, Self::Up => ffi::GST_NAVIGATION_COMMAND_UP, Self::Down => ffi::GST_NAVIGATION_COMMAND_DOWN, Self::Activate => ffi::GST_NAVIGATION_COMMAND_ACTIVATE, Self::PrevAngle => ffi::GST_NAVIGATION_COMMAND_PREV_ANGLE, Self::NextAngle => ffi::GST_NAVIGATION_COMMAND_NEXT_ANGLE, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for NavigationCommand { unsafe fn from_glib(value: ffi::GstNavigationCommand) -> Self { skip_assert_initialized!(); match value { ffi::GST_NAVIGATION_COMMAND_INVALID => Self::Invalid, ffi::GST_NAVIGATION_COMMAND_MENU1 => Self::Menu1, ffi::GST_NAVIGATION_COMMAND_MENU2 => Self::Menu2, ffi::GST_NAVIGATION_COMMAND_MENU3 => Self::Menu3, ffi::GST_NAVIGATION_COMMAND_MENU4 => Self::Menu4, ffi::GST_NAVIGATION_COMMAND_MENU5 => Self::Menu5, ffi::GST_NAVIGATION_COMMAND_MENU6 => Self::Menu6, ffi::GST_NAVIGATION_COMMAND_MENU7 => Self::Menu7, ffi::GST_NAVIGATION_COMMAND_LEFT => Self::Left, ffi::GST_NAVIGATION_COMMAND_RIGHT => Self::Right, ffi::GST_NAVIGATION_COMMAND_UP => Self::Up, ffi::GST_NAVIGATION_COMMAND_DOWN => Self::Down, ffi::GST_NAVIGATION_COMMAND_ACTIVATE => Self::Activate, ffi::GST_NAVIGATION_COMMAND_PREV_ANGLE => Self::PrevAngle, ffi::GST_NAVIGATION_COMMAND_NEXT_ANGLE => Self::NextAngle, value => Self::__Unknown(value), } } } impl StaticType for NavigationCommand { #[inline] #[doc(alias = "gst_navigation_command_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_navigation_command_get_type()) } } } impl glib::HasParamSpec for NavigationCommand { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for NavigationCommand { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for NavigationCommand { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); 
from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for NavigationCommand { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: NavigationCommand) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstNavigationEventType")] pub enum NavigationEventType { #[doc(alias = "GST_NAVIGATION_EVENT_INVALID")] Invalid, #[doc(alias = "GST_NAVIGATION_EVENT_KEY_PRESS")] KeyPress, #[doc(alias = "GST_NAVIGATION_EVENT_KEY_RELEASE")] KeyRelease, #[doc(alias = "GST_NAVIGATION_EVENT_MOUSE_BUTTON_PRESS")] MouseButtonPress, #[doc(alias = "GST_NAVIGATION_EVENT_MOUSE_BUTTON_RELEASE")] MouseButtonRelease, #[doc(alias = "GST_NAVIGATION_EVENT_MOUSE_MOVE")] MouseMove, #[doc(alias = "GST_NAVIGATION_EVENT_COMMAND")] Command, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "GST_NAVIGATION_EVENT_MOUSE_SCROLL")] MouseScroll, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[doc(alias = "GST_NAVIGATION_EVENT_TOUCH_DOWN")] TouchDown, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[doc(alias = "GST_NAVIGATION_EVENT_TOUCH_MOTION")] TouchMotion, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[doc(alias = "GST_NAVIGATION_EVENT_TOUCH_UP")] TouchUp, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[doc(alias = "GST_NAVIGATION_EVENT_TOUCH_FRAME")] TouchFrame, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[doc(alias = "GST_NAVIGATION_EVENT_TOUCH_CANCEL")] TouchCancel, #[cfg(feature = "v1_26")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_26")))] #[doc(alias = "GST_NAVIGATION_EVENT_MOUSE_DOUBLE_CLICK")] MouseDoubleClick, #[doc(hidden)] __Unknown(i32), } #[doc(hidden)] impl IntoGlib for NavigationEventType { type GlibType = ffi::GstNavigationEventType; fn into_glib(self) -> ffi::GstNavigationEventType { match self { Self::Invalid => ffi::GST_NAVIGATION_EVENT_INVALID, Self::KeyPress => ffi::GST_NAVIGATION_EVENT_KEY_PRESS, Self::KeyRelease => ffi::GST_NAVIGATION_EVENT_KEY_RELEASE, Self::MouseButtonPress => ffi::GST_NAVIGATION_EVENT_MOUSE_BUTTON_PRESS, Self::MouseButtonRelease => ffi::GST_NAVIGATION_EVENT_MOUSE_BUTTON_RELEASE, Self::MouseMove => ffi::GST_NAVIGATION_EVENT_MOUSE_MOVE, Self::Command => ffi::GST_NAVIGATION_EVENT_COMMAND, #[cfg(feature = "v1_18")] Self::MouseScroll => ffi::GST_NAVIGATION_EVENT_MOUSE_SCROLL, #[cfg(feature = "v1_22")] Self::TouchDown => ffi::GST_NAVIGATION_EVENT_TOUCH_DOWN, #[cfg(feature = "v1_22")] Self::TouchMotion => ffi::GST_NAVIGATION_EVENT_TOUCH_MOTION, #[cfg(feature = "v1_22")] Self::TouchUp => ffi::GST_NAVIGATION_EVENT_TOUCH_UP, #[cfg(feature = "v1_22")] Self::TouchFrame => ffi::GST_NAVIGATION_EVENT_TOUCH_FRAME, #[cfg(feature = "v1_22")] Self::TouchCancel => ffi::GST_NAVIGATION_EVENT_TOUCH_CANCEL, #[cfg(feature = "v1_26")] Self::MouseDoubleClick => ffi::GST_NAVIGATION_EVENT_MOUSE_DOUBLE_CLICK, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for NavigationEventType { unsafe fn from_glib(value: ffi::GstNavigationEventType) -> Self { skip_assert_initialized!(); match value { 
ffi::GST_NAVIGATION_EVENT_INVALID => Self::Invalid, ffi::GST_NAVIGATION_EVENT_KEY_PRESS => Self::KeyPress, ffi::GST_NAVIGATION_EVENT_KEY_RELEASE => Self::KeyRelease, ffi::GST_NAVIGATION_EVENT_MOUSE_BUTTON_PRESS => Self::MouseButtonPress, ffi::GST_NAVIGATION_EVENT_MOUSE_BUTTON_RELEASE => Self::MouseButtonRelease, ffi::GST_NAVIGATION_EVENT_MOUSE_MOVE => Self::MouseMove, ffi::GST_NAVIGATION_EVENT_COMMAND => Self::Command, #[cfg(feature = "v1_18")] ffi::GST_NAVIGATION_EVENT_MOUSE_SCROLL => Self::MouseScroll, #[cfg(feature = "v1_22")] ffi::GST_NAVIGATION_EVENT_TOUCH_DOWN => Self::TouchDown, #[cfg(feature = "v1_22")] ffi::GST_NAVIGATION_EVENT_TOUCH_MOTION => Self::TouchMotion, #[cfg(feature = "v1_22")] ffi::GST_NAVIGATION_EVENT_TOUCH_UP => Self::TouchUp, #[cfg(feature = "v1_22")] ffi::GST_NAVIGATION_EVENT_TOUCH_FRAME => Self::TouchFrame, #[cfg(feature = "v1_22")] ffi::GST_NAVIGATION_EVENT_TOUCH_CANCEL => Self::TouchCancel, #[cfg(feature = "v1_26")] ffi::GST_NAVIGATION_EVENT_MOUSE_DOUBLE_CLICK => Self::MouseDoubleClick, value => Self::__Unknown(value), } } } impl StaticType for NavigationEventType { #[inline] #[doc(alias = "gst_navigation_event_type_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_navigation_event_type_get_type()) } } } impl glib::HasParamSpec for NavigationEventType { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for NavigationEventType { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for NavigationEventType { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for NavigationEventType { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: NavigationEventType) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstNavigationMessageType")] pub enum NavigationMessageType { #[doc(alias = "GST_NAVIGATION_MESSAGE_INVALID")] Invalid, #[doc(alias = "GST_NAVIGATION_MESSAGE_MOUSE_OVER")] MouseOver, #[doc(alias = "GST_NAVIGATION_MESSAGE_COMMANDS_CHANGED")] CommandsChanged, #[doc(alias = "GST_NAVIGATION_MESSAGE_ANGLES_CHANGED")] AnglesChanged, #[doc(alias = "GST_NAVIGATION_MESSAGE_EVENT")] Event, #[doc(hidden)] __Unknown(i32), } #[doc(hidden)] impl IntoGlib for NavigationMessageType { type GlibType = ffi::GstNavigationMessageType; #[inline] fn into_glib(self) -> ffi::GstNavigationMessageType { match self { Self::Invalid => ffi::GST_NAVIGATION_MESSAGE_INVALID, Self::MouseOver => ffi::GST_NAVIGATION_MESSAGE_MOUSE_OVER, Self::CommandsChanged => ffi::GST_NAVIGATION_MESSAGE_COMMANDS_CHANGED, Self::AnglesChanged => ffi::GST_NAVIGATION_MESSAGE_ANGLES_CHANGED, Self::Event => ffi::GST_NAVIGATION_MESSAGE_EVENT, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for NavigationMessageType { #[inline] unsafe fn from_glib(value: ffi::GstNavigationMessageType) -> Self { skip_assert_initialized!(); match value { 
ffi::GST_NAVIGATION_MESSAGE_INVALID => Self::Invalid, ffi::GST_NAVIGATION_MESSAGE_MOUSE_OVER => Self::MouseOver, ffi::GST_NAVIGATION_MESSAGE_COMMANDS_CHANGED => Self::CommandsChanged, ffi::GST_NAVIGATION_MESSAGE_ANGLES_CHANGED => Self::AnglesChanged, ffi::GST_NAVIGATION_MESSAGE_EVENT => Self::Event, value => Self::__Unknown(value), } } } impl StaticType for NavigationMessageType { #[inline] #[doc(alias = "gst_navigation_message_type_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_navigation_message_type_get_type()) } } } impl glib::HasParamSpec for NavigationMessageType { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for NavigationMessageType { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for NavigationMessageType { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for NavigationMessageType { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: NavigationMessageType) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstNavigationQueryType")] pub enum NavigationQueryType { #[doc(alias = "GST_NAVIGATION_QUERY_INVALID")] Invalid, #[doc(alias = "GST_NAVIGATION_QUERY_COMMANDS")] Commands, #[doc(alias = "GST_NAVIGATION_QUERY_ANGLES")] Angles, #[doc(hidden)] __Unknown(i32), } #[doc(hidden)] impl IntoGlib for NavigationQueryType { type GlibType = ffi::GstNavigationQueryType; #[inline] fn into_glib(self) -> ffi::GstNavigationQueryType { match self { Self::Invalid => ffi::GST_NAVIGATION_QUERY_INVALID, Self::Commands => ffi::GST_NAVIGATION_QUERY_COMMANDS, Self::Angles => ffi::GST_NAVIGATION_QUERY_ANGLES, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for NavigationQueryType { #[inline] unsafe fn from_glib(value: ffi::GstNavigationQueryType) -> Self { skip_assert_initialized!(); match value { ffi::GST_NAVIGATION_QUERY_INVALID => Self::Invalid, ffi::GST_NAVIGATION_QUERY_COMMANDS => Self::Commands, ffi::GST_NAVIGATION_QUERY_ANGLES => Self::Angles, value => Self::__Unknown(value), } } } impl StaticType for NavigationQueryType { #[inline] #[doc(alias = "gst_navigation_query_type_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_navigation_query_type_get_type()) } } } impl glib::HasParamSpec for NavigationQueryType { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for NavigationQueryType { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for NavigationQueryType { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } 
impl ToValue for NavigationQueryType { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: NavigationQueryType) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoAFDSpec")] pub enum VideoAFDSpec { #[doc(alias = "GST_VIDEO_AFD_SPEC_DVB_ETSI")] DvbEtsi, #[doc(alias = "GST_VIDEO_AFD_SPEC_ATSC_A53")] AtscA53, #[doc(alias = "GST_VIDEO_AFD_SPEC_SMPTE_ST2016_1")] SmpteSt20161, #[doc(hidden)] __Unknown(i32), } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(hidden)] impl IntoGlib for VideoAFDSpec { type GlibType = ffi::GstVideoAFDSpec; #[inline] fn into_glib(self) -> ffi::GstVideoAFDSpec { match self { Self::DvbEtsi => ffi::GST_VIDEO_AFD_SPEC_DVB_ETSI, Self::AtscA53 => ffi::GST_VIDEO_AFD_SPEC_ATSC_A53, Self::SmpteSt20161 => ffi::GST_VIDEO_AFD_SPEC_SMPTE_ST2016_1, Self::__Unknown(value) => value, } } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(hidden)] impl FromGlib for VideoAFDSpec { #[inline] unsafe fn from_glib(value: ffi::GstVideoAFDSpec) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_AFD_SPEC_DVB_ETSI => Self::DvbEtsi, ffi::GST_VIDEO_AFD_SPEC_ATSC_A53 => Self::AtscA53, ffi::GST_VIDEO_AFD_SPEC_SMPTE_ST2016_1 => Self::SmpteSt20161, value => Self::__Unknown(value), } } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] impl StaticType for VideoAFDSpec { #[inline] #[doc(alias = "gst_video_afd_spec_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_afd_spec_get_type()) } } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] impl glib::HasParamSpec for VideoAFDSpec { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] impl glib::value::ValueType for VideoAFDSpec { type Type = Self; } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] unsafe impl<'a> glib::value::FromValue<'a> for VideoAFDSpec { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] impl ToValue for VideoAFDSpec { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] impl From for glib::Value { #[inline] fn from(v: VideoAFDSpec) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] 
#[non_exhaustive] #[doc(alias = "GstVideoAFDValue")] pub enum VideoAFDValue { #[doc(alias = "GST_VIDEO_AFD_UNAVAILABLE")] Unavailable, #[doc(alias = "GST_VIDEO_AFD_16_9_TOP_ALIGNED")] _169TopAligned, #[doc(alias = "GST_VIDEO_AFD_14_9_TOP_ALIGNED")] _149TopAligned, #[doc(alias = "GST_VIDEO_AFD_GREATER_THAN_16_9")] GreaterThan169, #[doc(alias = "GST_VIDEO_AFD_4_3_FULL_16_9_FULL")] _43Full169Full, #[doc(alias = "GST_VIDEO_AFD_4_3_FULL_4_3_PILLAR")] _43Full43Pillar, #[doc(alias = "GST_VIDEO_AFD_16_9_LETTER_16_9_FULL")] _169Letter169Full, #[doc(alias = "GST_VIDEO_AFD_14_9_LETTER_14_9_PILLAR")] _149Letter149Pillar, #[doc(alias = "GST_VIDEO_AFD_4_3_FULL_14_9_CENTER")] _43Full149Center, #[doc(alias = "GST_VIDEO_AFD_16_9_LETTER_14_9_CENTER")] _169Letter149Center, #[doc(alias = "GST_VIDEO_AFD_16_9_LETTER_4_3_CENTER")] _169Letter43Center, #[doc(hidden)] __Unknown(i32), } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(hidden)] impl IntoGlib for VideoAFDValue { type GlibType = ffi::GstVideoAFDValue; #[inline] fn into_glib(self) -> ffi::GstVideoAFDValue { match self { Self::Unavailable => ffi::GST_VIDEO_AFD_UNAVAILABLE, Self::_169TopAligned => ffi::GST_VIDEO_AFD_16_9_TOP_ALIGNED, Self::_149TopAligned => ffi::GST_VIDEO_AFD_14_9_TOP_ALIGNED, Self::GreaterThan169 => ffi::GST_VIDEO_AFD_GREATER_THAN_16_9, Self::_43Full169Full => ffi::GST_VIDEO_AFD_4_3_FULL_16_9_FULL, Self::_43Full43Pillar => ffi::GST_VIDEO_AFD_4_3_FULL_4_3_PILLAR, Self::_169Letter169Full => ffi::GST_VIDEO_AFD_16_9_LETTER_16_9_FULL, Self::_149Letter149Pillar => ffi::GST_VIDEO_AFD_14_9_LETTER_14_9_PILLAR, Self::_43Full149Center => ffi::GST_VIDEO_AFD_4_3_FULL_14_9_CENTER, Self::_169Letter149Center => ffi::GST_VIDEO_AFD_16_9_LETTER_14_9_CENTER, Self::_169Letter43Center => ffi::GST_VIDEO_AFD_16_9_LETTER_4_3_CENTER, Self::__Unknown(value) => value, } } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(hidden)] impl FromGlib for VideoAFDValue { #[inline] unsafe fn from_glib(value: ffi::GstVideoAFDValue) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_AFD_UNAVAILABLE => Self::Unavailable, ffi::GST_VIDEO_AFD_16_9_TOP_ALIGNED => Self::_169TopAligned, ffi::GST_VIDEO_AFD_14_9_TOP_ALIGNED => Self::_149TopAligned, ffi::GST_VIDEO_AFD_GREATER_THAN_16_9 => Self::GreaterThan169, ffi::GST_VIDEO_AFD_4_3_FULL_16_9_FULL => Self::_43Full169Full, ffi::GST_VIDEO_AFD_4_3_FULL_4_3_PILLAR => Self::_43Full43Pillar, ffi::GST_VIDEO_AFD_16_9_LETTER_16_9_FULL => Self::_169Letter169Full, ffi::GST_VIDEO_AFD_14_9_LETTER_14_9_PILLAR => Self::_149Letter149Pillar, ffi::GST_VIDEO_AFD_4_3_FULL_14_9_CENTER => Self::_43Full149Center, ffi::GST_VIDEO_AFD_16_9_LETTER_14_9_CENTER => Self::_169Letter149Center, ffi::GST_VIDEO_AFD_16_9_LETTER_4_3_CENTER => Self::_169Letter43Center, value => Self::__Unknown(value), } } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] impl StaticType for VideoAFDValue { #[inline] #[doc(alias = "gst_video_afd_value_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_afd_value_get_type()) } } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] impl glib::HasParamSpec for VideoAFDValue { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] impl glib::value::ValueType for 
VideoAFDValue { type Type = Self; } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] unsafe impl<'a> glib::value::FromValue<'a> for VideoAFDValue { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] impl ToValue for VideoAFDValue { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] impl From for glib::Value { #[inline] fn from(v: VideoAFDValue) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoAlphaMode")] pub enum VideoAlphaMode { #[doc(alias = "GST_VIDEO_ALPHA_MODE_COPY")] Copy, #[doc(alias = "GST_VIDEO_ALPHA_MODE_SET")] Set, #[doc(alias = "GST_VIDEO_ALPHA_MODE_MULT")] Mult, #[doc(hidden)] __Unknown(i32), } #[doc(hidden)] impl IntoGlib for VideoAlphaMode { type GlibType = ffi::GstVideoAlphaMode; #[inline] fn into_glib(self) -> ffi::GstVideoAlphaMode { match self { Self::Copy => ffi::GST_VIDEO_ALPHA_MODE_COPY, Self::Set => ffi::GST_VIDEO_ALPHA_MODE_SET, Self::Mult => ffi::GST_VIDEO_ALPHA_MODE_MULT, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for VideoAlphaMode { #[inline] unsafe fn from_glib(value: ffi::GstVideoAlphaMode) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_ALPHA_MODE_COPY => Self::Copy, ffi::GST_VIDEO_ALPHA_MODE_SET => Self::Set, ffi::GST_VIDEO_ALPHA_MODE_MULT => Self::Mult, value => Self::__Unknown(value), } } } impl StaticType for VideoAlphaMode { #[inline] #[doc(alias = "gst_video_alpha_mode_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_alpha_mode_get_type()) } } } impl glib::HasParamSpec for VideoAlphaMode { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for VideoAlphaMode { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoAlphaMode { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for VideoAlphaMode { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoAlphaMode) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoAncillaryDID")] pub enum VideoAncillaryDID { #[doc(alias = "GST_VIDEO_ANCILLARY_DID_UNDEFINED")] Undefined, #[doc(alias = "GST_VIDEO_ANCILLARY_DID_DELETION")] 
Deletion, #[doc(alias = "GST_VIDEO_ANCILLARY_DID_HANC_3G_AUDIO_DATA_FIRST")] Hanc3gAudioDataFirst, #[doc(alias = "GST_VIDEO_ANCILLARY_DID_HANC_3G_AUDIO_DATA_LAST")] Hanc3gAudioDataLast, #[doc(alias = "GST_VIDEO_ANCILLARY_DID_HANC_HDTV_AUDIO_DATA_FIRST")] HancHdtvAudioDataFirst, #[doc(alias = "GST_VIDEO_ANCILLARY_DID_HANC_HDTV_AUDIO_DATA_LAST")] HancHdtvAudioDataLast, #[doc(alias = "GST_VIDEO_ANCILLARY_DID_HANC_SDTV_AUDIO_DATA_1_FIRST")] HancSdtvAudioData1First, #[doc(alias = "GST_VIDEO_ANCILLARY_DID_HANC_SDTV_AUDIO_DATA_1_LAST")] HancSdtvAudioData1Last, #[doc(alias = "GST_VIDEO_ANCILLARY_DID_CAMERA_POSITION")] CameraPosition, #[doc(alias = "GST_VIDEO_ANCILLARY_DID_HANC_ERROR_DETECTION")] HancErrorDetection, #[doc(alias = "GST_VIDEO_ANCILLARY_DID_HANC_SDTV_AUDIO_DATA_2_FIRST")] HancSdtvAudioData2First, #[doc(alias = "GST_VIDEO_ANCILLARY_DID_HANC_SDTV_AUDIO_DATA_2_LAST")] HancSdtvAudioData2Last, #[doc(hidden)] __Unknown(i32), } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(hidden)] impl IntoGlib for VideoAncillaryDID { type GlibType = ffi::GstVideoAncillaryDID; #[inline] fn into_glib(self) -> ffi::GstVideoAncillaryDID { match self { Self::Undefined => ffi::GST_VIDEO_ANCILLARY_DID_UNDEFINED, Self::Deletion => ffi::GST_VIDEO_ANCILLARY_DID_DELETION, Self::Hanc3gAudioDataFirst => ffi::GST_VIDEO_ANCILLARY_DID_HANC_3G_AUDIO_DATA_FIRST, Self::Hanc3gAudioDataLast => ffi::GST_VIDEO_ANCILLARY_DID_HANC_3G_AUDIO_DATA_LAST, Self::HancHdtvAudioDataFirst => ffi::GST_VIDEO_ANCILLARY_DID_HANC_HDTV_AUDIO_DATA_FIRST, Self::HancHdtvAudioDataLast => ffi::GST_VIDEO_ANCILLARY_DID_HANC_HDTV_AUDIO_DATA_LAST, Self::HancSdtvAudioData1First => { ffi::GST_VIDEO_ANCILLARY_DID_HANC_SDTV_AUDIO_DATA_1_FIRST } Self::HancSdtvAudioData1Last => { ffi::GST_VIDEO_ANCILLARY_DID_HANC_SDTV_AUDIO_DATA_1_LAST } Self::CameraPosition => ffi::GST_VIDEO_ANCILLARY_DID_CAMERA_POSITION, Self::HancErrorDetection => ffi::GST_VIDEO_ANCILLARY_DID_HANC_ERROR_DETECTION, Self::HancSdtvAudioData2First => { ffi::GST_VIDEO_ANCILLARY_DID_HANC_SDTV_AUDIO_DATA_2_FIRST } Self::HancSdtvAudioData2Last => { ffi::GST_VIDEO_ANCILLARY_DID_HANC_SDTV_AUDIO_DATA_2_LAST } Self::__Unknown(value) => value, } } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(hidden)] impl FromGlib for VideoAncillaryDID { #[inline] unsafe fn from_glib(value: ffi::GstVideoAncillaryDID) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_ANCILLARY_DID_UNDEFINED => Self::Undefined, ffi::GST_VIDEO_ANCILLARY_DID_DELETION => Self::Deletion, ffi::GST_VIDEO_ANCILLARY_DID_HANC_3G_AUDIO_DATA_FIRST => Self::Hanc3gAudioDataFirst, ffi::GST_VIDEO_ANCILLARY_DID_HANC_3G_AUDIO_DATA_LAST => Self::Hanc3gAudioDataLast, ffi::GST_VIDEO_ANCILLARY_DID_HANC_HDTV_AUDIO_DATA_FIRST => Self::HancHdtvAudioDataFirst, ffi::GST_VIDEO_ANCILLARY_DID_HANC_HDTV_AUDIO_DATA_LAST => Self::HancHdtvAudioDataLast, ffi::GST_VIDEO_ANCILLARY_DID_HANC_SDTV_AUDIO_DATA_1_FIRST => { Self::HancSdtvAudioData1First } ffi::GST_VIDEO_ANCILLARY_DID_HANC_SDTV_AUDIO_DATA_1_LAST => { Self::HancSdtvAudioData1Last } ffi::GST_VIDEO_ANCILLARY_DID_CAMERA_POSITION => Self::CameraPosition, ffi::GST_VIDEO_ANCILLARY_DID_HANC_ERROR_DETECTION => Self::HancErrorDetection, ffi::GST_VIDEO_ANCILLARY_DID_HANC_SDTV_AUDIO_DATA_2_FIRST => { Self::HancSdtvAudioData2First } ffi::GST_VIDEO_ANCILLARY_DID_HANC_SDTV_AUDIO_DATA_2_LAST => { Self::HancSdtvAudioData2Last } value => Self::__Unknown(value), } } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] 
impl StaticType for VideoAncillaryDID { #[inline] #[doc(alias = "gst_video_ancillary_did_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_ancillary_did_get_type()) } } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl glib::HasParamSpec for VideoAncillaryDID { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl glib::value::ValueType for VideoAncillaryDID { type Type = Self; } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] unsafe impl<'a> glib::value::FromValue<'a> for VideoAncillaryDID { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl ToValue for VideoAncillaryDID { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl From for glib::Value { #[inline] fn from(v: VideoAncillaryDID) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoAncillaryDID16")] pub enum VideoAncillaryDID16 { #[doc(alias = "GST_VIDEO_ANCILLARY_DID16_S334_EIA_708")] S334Eia708, #[doc(alias = "GST_VIDEO_ANCILLARY_DID16_S334_EIA_608")] S334Eia608, #[doc(alias = "GST_VIDEO_ANCILLARY_DID16_S2016_3_AFD_BAR")] S20163AfdBar, #[doc(hidden)] __Unknown(i32), } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(hidden)] impl IntoGlib for VideoAncillaryDID16 { type GlibType = ffi::GstVideoAncillaryDID16; #[inline] fn into_glib(self) -> ffi::GstVideoAncillaryDID16 { match self { Self::S334Eia708 => ffi::GST_VIDEO_ANCILLARY_DID16_S334_EIA_708, Self::S334Eia608 => ffi::GST_VIDEO_ANCILLARY_DID16_S334_EIA_608, Self::S20163AfdBar => ffi::GST_VIDEO_ANCILLARY_DID16_S2016_3_AFD_BAR, Self::__Unknown(value) => value, } } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(hidden)] impl FromGlib for VideoAncillaryDID16 { #[inline] unsafe fn from_glib(value: ffi::GstVideoAncillaryDID16) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_ANCILLARY_DID16_S334_EIA_708 => Self::S334Eia708, ffi::GST_VIDEO_ANCILLARY_DID16_S334_EIA_608 => Self::S334Eia608, ffi::GST_VIDEO_ANCILLARY_DID16_S2016_3_AFD_BAR => Self::S20163AfdBar, value => Self::__Unknown(value), } } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl StaticType for VideoAncillaryDID16 { #[inline] #[doc(alias = "gst_video_ancillary_di_d16_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_ancillary_di_d16_get_type()) } } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl glib::HasParamSpec for VideoAncillaryDID16 { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; 
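    // The StaticType / ValueType / FromValue / ToValue impls generated for these
    // enums allow round-tripping them through a `glib::Value`, e.g. when reading
    // or writing GObject properties. A minimal sketch (assumes `gst::init()` has
    // succeeded and the `v1_16` feature is enabled):
    //
    //     use gstreamer_video as gst_video;
    //
    //     let value = glib::Value::from(gst_video::VideoAncillaryDID16::S334Eia708);
    //     let did16 = value.get::<gst_video::VideoAncillaryDID16>().unwrap();
    //     assert_eq!(did16, gst_video::VideoAncillaryDID16::S334Eia708);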
type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl glib::value::ValueType for VideoAncillaryDID16 { type Type = Self; } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] unsafe impl<'a> glib::value::FromValue<'a> for VideoAncillaryDID16 { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl ToValue for VideoAncillaryDID16 { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl From for glib::Value { #[inline] fn from(v: VideoAncillaryDID16) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoCaptionType")] pub enum VideoCaptionType { #[doc(alias = "GST_VIDEO_CAPTION_TYPE_UNKNOWN")] Unknown, #[doc(alias = "GST_VIDEO_CAPTION_TYPE_CEA608_RAW")] Cea608Raw, #[doc(alias = "GST_VIDEO_CAPTION_TYPE_CEA608_S334_1A")] Cea608S3341a, #[doc(alias = "GST_VIDEO_CAPTION_TYPE_CEA708_RAW")] Cea708Raw, #[doc(alias = "GST_VIDEO_CAPTION_TYPE_CEA708_CDP")] Cea708Cdp, #[doc(hidden)] __Unknown(i32), } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl VideoCaptionType { #[doc(alias = "gst_video_caption_type_to_caps")] pub fn to_caps(self) -> gst::Caps { assert_initialized_main_thread!(); unsafe { from_glib_full(ffi::gst_video_caption_type_to_caps(self.into_glib())) } } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(hidden)] impl IntoGlib for VideoCaptionType { type GlibType = ffi::GstVideoCaptionType; #[inline] fn into_glib(self) -> ffi::GstVideoCaptionType { match self { Self::Unknown => ffi::GST_VIDEO_CAPTION_TYPE_UNKNOWN, Self::Cea608Raw => ffi::GST_VIDEO_CAPTION_TYPE_CEA608_RAW, Self::Cea608S3341a => ffi::GST_VIDEO_CAPTION_TYPE_CEA608_S334_1A, Self::Cea708Raw => ffi::GST_VIDEO_CAPTION_TYPE_CEA708_RAW, Self::Cea708Cdp => ffi::GST_VIDEO_CAPTION_TYPE_CEA708_CDP, Self::__Unknown(value) => value, } } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(hidden)] impl FromGlib for VideoCaptionType { #[inline] unsafe fn from_glib(value: ffi::GstVideoCaptionType) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_CAPTION_TYPE_UNKNOWN => Self::Unknown, ffi::GST_VIDEO_CAPTION_TYPE_CEA608_RAW => Self::Cea608Raw, ffi::GST_VIDEO_CAPTION_TYPE_CEA608_S334_1A => Self::Cea608S3341a, ffi::GST_VIDEO_CAPTION_TYPE_CEA708_RAW => Self::Cea708Raw, ffi::GST_VIDEO_CAPTION_TYPE_CEA708_CDP => Self::Cea708Cdp, value => Self::__Unknown(value), } } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl StaticType for VideoCaptionType { #[inline] #[doc(alias = "gst_video_caption_type_get_type")] fn static_type() -> glib::Type { unsafe { 
from_glib(ffi::gst_video_caption_type_get_type()) } } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl glib::HasParamSpec for VideoCaptionType { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl glib::value::ValueType for VideoCaptionType { type Type = Self; } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] unsafe impl<'a> glib::value::FromValue<'a> for VideoCaptionType { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl ToValue for VideoCaptionType { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl From for glib::Value { #[inline] fn from(v: VideoCaptionType) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoChromaMode")] pub enum VideoChromaMode { #[doc(alias = "GST_VIDEO_CHROMA_MODE_FULL")] Full, #[doc(alias = "GST_VIDEO_CHROMA_MODE_UPSAMPLE_ONLY")] UpsampleOnly, #[doc(alias = "GST_VIDEO_CHROMA_MODE_DOWNSAMPLE_ONLY")] DownsampleOnly, #[doc(alias = "GST_VIDEO_CHROMA_MODE_NONE")] None, #[doc(hidden)] __Unknown(i32), } #[doc(hidden)] impl IntoGlib for VideoChromaMode { type GlibType = ffi::GstVideoChromaMode; #[inline] fn into_glib(self) -> ffi::GstVideoChromaMode { match self { Self::Full => ffi::GST_VIDEO_CHROMA_MODE_FULL, Self::UpsampleOnly => ffi::GST_VIDEO_CHROMA_MODE_UPSAMPLE_ONLY, Self::DownsampleOnly => ffi::GST_VIDEO_CHROMA_MODE_DOWNSAMPLE_ONLY, Self::None => ffi::GST_VIDEO_CHROMA_MODE_NONE, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for VideoChromaMode { #[inline] unsafe fn from_glib(value: ffi::GstVideoChromaMode) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_CHROMA_MODE_FULL => Self::Full, ffi::GST_VIDEO_CHROMA_MODE_UPSAMPLE_ONLY => Self::UpsampleOnly, ffi::GST_VIDEO_CHROMA_MODE_DOWNSAMPLE_ONLY => Self::DownsampleOnly, ffi::GST_VIDEO_CHROMA_MODE_NONE => Self::None, value => Self::__Unknown(value), } } } impl StaticType for VideoChromaMode { #[inline] #[doc(alias = "gst_video_chroma_mode_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_chroma_mode_get_type()) } } } impl glib::HasParamSpec for VideoChromaMode { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for VideoChromaMode { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoChromaMode { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); 
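        // `VideoCaptionType::to_caps()` above maps a closed-caption type to the
        // corresponding `gst::Caps`, which is useful when setting up or negotiating
        // caption pads. A minimal sketch (assumes `gst::init()` has succeeded and
        // the `v1_16` feature is enabled):
        //
        //     let caps = gst_video::VideoCaptionType::Cea708Cdp.to_caps();
        //     println!("{caps}");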
from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for VideoChromaMode { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoChromaMode) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoColorMatrix")] pub enum VideoColorMatrix { #[doc(alias = "GST_VIDEO_COLOR_MATRIX_UNKNOWN")] Unknown, #[doc(alias = "GST_VIDEO_COLOR_MATRIX_RGB")] Rgb, #[doc(alias = "GST_VIDEO_COLOR_MATRIX_FCC")] Fcc, #[doc(alias = "GST_VIDEO_COLOR_MATRIX_BT709")] Bt709, #[doc(alias = "GST_VIDEO_COLOR_MATRIX_BT601")] Bt601, #[doc(alias = "GST_VIDEO_COLOR_MATRIX_SMPTE240M")] Smpte240m, #[doc(alias = "GST_VIDEO_COLOR_MATRIX_BT2020")] Bt2020, #[doc(hidden)] __Unknown(i32), } impl VideoColorMatrix { #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "gst_video_color_matrix_from_iso")] pub fn from_iso(value: u32) -> VideoColorMatrix { assert_initialized_main_thread!(); unsafe { from_glib(ffi::gst_video_color_matrix_from_iso(value)) } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "gst_video_color_matrix_to_iso")] pub fn to_iso(self) -> u32 { assert_initialized_main_thread!(); unsafe { ffi::gst_video_color_matrix_to_iso(self.into_glib()) } } } #[doc(hidden)] impl IntoGlib for VideoColorMatrix { type GlibType = ffi::GstVideoColorMatrix; #[inline] fn into_glib(self) -> ffi::GstVideoColorMatrix { match self { Self::Unknown => ffi::GST_VIDEO_COLOR_MATRIX_UNKNOWN, Self::Rgb => ffi::GST_VIDEO_COLOR_MATRIX_RGB, Self::Fcc => ffi::GST_VIDEO_COLOR_MATRIX_FCC, Self::Bt709 => ffi::GST_VIDEO_COLOR_MATRIX_BT709, Self::Bt601 => ffi::GST_VIDEO_COLOR_MATRIX_BT601, Self::Smpte240m => ffi::GST_VIDEO_COLOR_MATRIX_SMPTE240M, Self::Bt2020 => ffi::GST_VIDEO_COLOR_MATRIX_BT2020, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for VideoColorMatrix { #[inline] unsafe fn from_glib(value: ffi::GstVideoColorMatrix) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_COLOR_MATRIX_UNKNOWN => Self::Unknown, ffi::GST_VIDEO_COLOR_MATRIX_RGB => Self::Rgb, ffi::GST_VIDEO_COLOR_MATRIX_FCC => Self::Fcc, ffi::GST_VIDEO_COLOR_MATRIX_BT709 => Self::Bt709, ffi::GST_VIDEO_COLOR_MATRIX_BT601 => Self::Bt601, ffi::GST_VIDEO_COLOR_MATRIX_SMPTE240M => Self::Smpte240m, ffi::GST_VIDEO_COLOR_MATRIX_BT2020 => Self::Bt2020, value => Self::__Unknown(value), } } } impl StaticType for VideoColorMatrix { #[inline] #[doc(alias = "gst_video_color_matrix_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_color_matrix_get_type()) } } } impl glib::HasParamSpec for VideoColorMatrix { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for VideoColorMatrix { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoColorMatrix { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); 
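        // `VideoColorMatrix::from_iso()` / `to_iso()` convert between this enum and
        // the ISO/IEC 23001-8 (H.273) matrix-coefficients code points. A minimal
        // sketch (assumes the `v1_18` feature and an initialized GStreamer; the
        // concrete code points are defined by the spec, e.g. BT.709 is expected
        // to map to 1):
        //
        //     let matrix = gst_video::VideoColorMatrix::from_iso(1);
        //     assert_eq!(matrix, gst_video::VideoColorMatrix::Bt709);
        //     assert_eq!(matrix.to_iso(), 1);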
from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for VideoColorMatrix { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoColorMatrix) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoColorPrimaries")] pub enum VideoColorPrimaries { #[doc(alias = "GST_VIDEO_COLOR_PRIMARIES_UNKNOWN")] Unknown, #[doc(alias = "GST_VIDEO_COLOR_PRIMARIES_BT709")] Bt709, #[doc(alias = "GST_VIDEO_COLOR_PRIMARIES_BT470M")] Bt470m, #[doc(alias = "GST_VIDEO_COLOR_PRIMARIES_BT470BG")] Bt470bg, #[doc(alias = "GST_VIDEO_COLOR_PRIMARIES_SMPTE170M")] Smpte170m, #[doc(alias = "GST_VIDEO_COLOR_PRIMARIES_SMPTE240M")] Smpte240m, #[doc(alias = "GST_VIDEO_COLOR_PRIMARIES_FILM")] Film, #[doc(alias = "GST_VIDEO_COLOR_PRIMARIES_BT2020")] Bt2020, #[doc(alias = "GST_VIDEO_COLOR_PRIMARIES_ADOBERGB")] Adobergb, #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(alias = "GST_VIDEO_COLOR_PRIMARIES_SMPTEST428")] Smptest428, #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(alias = "GST_VIDEO_COLOR_PRIMARIES_SMPTERP431")] Smpterp431, #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(alias = "GST_VIDEO_COLOR_PRIMARIES_SMPTEEG432")] Smpteeg432, #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(alias = "GST_VIDEO_COLOR_PRIMARIES_EBU3213")] Ebu3213, #[doc(hidden)] __Unknown(i32), } impl VideoColorPrimaries { #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "gst_video_color_primaries_from_iso")] pub fn from_iso(value: u32) -> VideoColorPrimaries { assert_initialized_main_thread!(); unsafe { from_glib(ffi::gst_video_color_primaries_from_iso(value)) } } //#[doc(alias = "gst_video_color_primaries_get_info")] //#[doc(alias = "get_info")] //pub fn info(self) -> /*Ignored*/VideoColorPrimariesInfo { // unsafe { TODO: call ffi:gst_video_color_primaries_get_info() } //} #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[doc(alias = "gst_video_color_primaries_is_equivalent")] pub fn is_equivalent(self, other: VideoColorPrimaries) -> bool { assert_initialized_main_thread!(); unsafe { from_glib(ffi::gst_video_color_primaries_is_equivalent( self.into_glib(), other.into_glib(), )) } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "gst_video_color_primaries_to_iso")] pub fn to_iso(self) -> u32 { assert_initialized_main_thread!(); unsafe { ffi::gst_video_color_primaries_to_iso(self.into_glib()) } } } #[doc(hidden)] impl IntoGlib for VideoColorPrimaries { type GlibType = ffi::GstVideoColorPrimaries; fn into_glib(self) -> ffi::GstVideoColorPrimaries { match self { Self::Unknown => ffi::GST_VIDEO_COLOR_PRIMARIES_UNKNOWN, Self::Bt709 => ffi::GST_VIDEO_COLOR_PRIMARIES_BT709, Self::Bt470m => ffi::GST_VIDEO_COLOR_PRIMARIES_BT470M, Self::Bt470bg => ffi::GST_VIDEO_COLOR_PRIMARIES_BT470BG, Self::Smpte170m => ffi::GST_VIDEO_COLOR_PRIMARIES_SMPTE170M, Self::Smpte240m => ffi::GST_VIDEO_COLOR_PRIMARIES_SMPTE240M, Self::Film => ffi::GST_VIDEO_COLOR_PRIMARIES_FILM, Self::Bt2020 => ffi::GST_VIDEO_COLOR_PRIMARIES_BT2020, 
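            // `VideoColorPrimaries::is_equivalent()` above compares two primaries
            // values for functional equivalence, i.e. whether their chromaticities
            // match even when the enum values differ. A minimal sketch (assumes the
            // `v1_22` feature is enabled):
            //
            //     let primaries = gst_video::VideoColorPrimaries::Bt709;
            //     assert!(primaries.is_equivalent(gst_video::VideoColorPrimaries::Bt709));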
Self::Adobergb => ffi::GST_VIDEO_COLOR_PRIMARIES_ADOBERGB, #[cfg(feature = "v1_16")] Self::Smptest428 => ffi::GST_VIDEO_COLOR_PRIMARIES_SMPTEST428, #[cfg(feature = "v1_16")] Self::Smpterp431 => ffi::GST_VIDEO_COLOR_PRIMARIES_SMPTERP431, #[cfg(feature = "v1_16")] Self::Smpteeg432 => ffi::GST_VIDEO_COLOR_PRIMARIES_SMPTEEG432, #[cfg(feature = "v1_16")] Self::Ebu3213 => ffi::GST_VIDEO_COLOR_PRIMARIES_EBU3213, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for VideoColorPrimaries { unsafe fn from_glib(value: ffi::GstVideoColorPrimaries) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_COLOR_PRIMARIES_UNKNOWN => Self::Unknown, ffi::GST_VIDEO_COLOR_PRIMARIES_BT709 => Self::Bt709, ffi::GST_VIDEO_COLOR_PRIMARIES_BT470M => Self::Bt470m, ffi::GST_VIDEO_COLOR_PRIMARIES_BT470BG => Self::Bt470bg, ffi::GST_VIDEO_COLOR_PRIMARIES_SMPTE170M => Self::Smpte170m, ffi::GST_VIDEO_COLOR_PRIMARIES_SMPTE240M => Self::Smpte240m, ffi::GST_VIDEO_COLOR_PRIMARIES_FILM => Self::Film, ffi::GST_VIDEO_COLOR_PRIMARIES_BT2020 => Self::Bt2020, ffi::GST_VIDEO_COLOR_PRIMARIES_ADOBERGB => Self::Adobergb, #[cfg(feature = "v1_16")] ffi::GST_VIDEO_COLOR_PRIMARIES_SMPTEST428 => Self::Smptest428, #[cfg(feature = "v1_16")] ffi::GST_VIDEO_COLOR_PRIMARIES_SMPTERP431 => Self::Smpterp431, #[cfg(feature = "v1_16")] ffi::GST_VIDEO_COLOR_PRIMARIES_SMPTEEG432 => Self::Smpteeg432, #[cfg(feature = "v1_16")] ffi::GST_VIDEO_COLOR_PRIMARIES_EBU3213 => Self::Ebu3213, value => Self::__Unknown(value), } } } impl StaticType for VideoColorPrimaries { #[inline] #[doc(alias = "gst_video_color_primaries_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_color_primaries_get_type()) } } } impl glib::HasParamSpec for VideoColorPrimaries { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for VideoColorPrimaries { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoColorPrimaries { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for VideoColorPrimaries { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoColorPrimaries) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoDitherMethod")] pub enum VideoDitherMethod { #[doc(alias = "GST_VIDEO_DITHER_NONE")] None, #[doc(alias = "GST_VIDEO_DITHER_VERTERR")] Verterr, #[doc(alias = "GST_VIDEO_DITHER_FLOYD_STEINBERG")] FloydSteinberg, #[doc(alias = "GST_VIDEO_DITHER_SIERRA_LITE")] SierraLite, #[doc(alias = "GST_VIDEO_DITHER_BAYER")] Bayer, #[doc(hidden)] __Unknown(i32), } #[doc(hidden)] impl IntoGlib for VideoDitherMethod { type GlibType = ffi::GstVideoDitherMethod; #[inline] fn into_glib(self) -> ffi::GstVideoDitherMethod { match self { Self::None => ffi::GST_VIDEO_DITHER_NONE, Self::Verterr => ffi::GST_VIDEO_DITHER_VERTERR, Self::FloydSteinberg => 
ffi::GST_VIDEO_DITHER_FLOYD_STEINBERG, Self::SierraLite => ffi::GST_VIDEO_DITHER_SIERRA_LITE, Self::Bayer => ffi::GST_VIDEO_DITHER_BAYER, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for VideoDitherMethod { #[inline] unsafe fn from_glib(value: ffi::GstVideoDitherMethod) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_DITHER_NONE => Self::None, ffi::GST_VIDEO_DITHER_VERTERR => Self::Verterr, ffi::GST_VIDEO_DITHER_FLOYD_STEINBERG => Self::FloydSteinberg, ffi::GST_VIDEO_DITHER_SIERRA_LITE => Self::SierraLite, ffi::GST_VIDEO_DITHER_BAYER => Self::Bayer, value => Self::__Unknown(value), } } } impl StaticType for VideoDitherMethod { #[inline] #[doc(alias = "gst_video_dither_method_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_dither_method_get_type()) } } } impl glib::HasParamSpec for VideoDitherMethod { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for VideoDitherMethod { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoDitherMethod { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for VideoDitherMethod { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoDitherMethod) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoFieldOrder")] pub enum VideoFieldOrder { #[doc(alias = "GST_VIDEO_FIELD_ORDER_UNKNOWN")] Unknown, #[doc(alias = "GST_VIDEO_FIELD_ORDER_TOP_FIELD_FIRST")] TopFieldFirst, #[doc(alias = "GST_VIDEO_FIELD_ORDER_BOTTOM_FIELD_FIRST")] BottomFieldFirst, #[doc(hidden)] __Unknown(i32), } impl VideoFieldOrder { #[doc(alias = "gst_video_field_order_from_string")] pub fn from_string(order: &str) -> VideoFieldOrder { assert_initialized_main_thread!(); unsafe { from_glib(ffi::gst_video_field_order_from_string( order.to_glib_none().0, )) } } } impl std::fmt::Display for VideoFieldOrder { #[inline] fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { f.write_str(&self.to_str()) } } #[doc(hidden)] impl IntoGlib for VideoFieldOrder { type GlibType = ffi::GstVideoFieldOrder; #[inline] fn into_glib(self) -> ffi::GstVideoFieldOrder { match self { Self::Unknown => ffi::GST_VIDEO_FIELD_ORDER_UNKNOWN, Self::TopFieldFirst => ffi::GST_VIDEO_FIELD_ORDER_TOP_FIELD_FIRST, Self::BottomFieldFirst => ffi::GST_VIDEO_FIELD_ORDER_BOTTOM_FIELD_FIRST, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for VideoFieldOrder { #[inline] unsafe fn from_glib(value: ffi::GstVideoFieldOrder) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_FIELD_ORDER_UNKNOWN => Self::Unknown, ffi::GST_VIDEO_FIELD_ORDER_TOP_FIELD_FIRST => Self::TopFieldFirst, ffi::GST_VIDEO_FIELD_ORDER_BOTTOM_FIELD_FIRST => Self::BottomFieldFirst, value => Self::__Unknown(value), } } } impl StaticType for VideoFieldOrder { #[inline] #[doc(alias = 
"gst_video_field_order_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_field_order_get_type()) } } } impl glib::HasParamSpec for VideoFieldOrder { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for VideoFieldOrder { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoFieldOrder { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for VideoFieldOrder { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoFieldOrder) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[derive(Debug, Eq, PartialEq, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoFormat")] pub enum VideoFormat { #[doc(alias = "GST_VIDEO_FORMAT_UNKNOWN")] Unknown, #[doc(alias = "GST_VIDEO_FORMAT_ENCODED")] Encoded, #[doc(alias = "GST_VIDEO_FORMAT_I420")] I420, #[doc(alias = "GST_VIDEO_FORMAT_YV12")] Yv12, #[doc(alias = "GST_VIDEO_FORMAT_YUY2")] Yuy2, #[doc(alias = "GST_VIDEO_FORMAT_UYVY")] Uyvy, #[doc(alias = "GST_VIDEO_FORMAT_AYUV")] Ayuv, #[doc(alias = "GST_VIDEO_FORMAT_RGBx")] Rgbx, #[doc(alias = "GST_VIDEO_FORMAT_BGRx")] Bgrx, #[doc(alias = "GST_VIDEO_FORMAT_xRGB")] Xrgb, #[doc(alias = "GST_VIDEO_FORMAT_xBGR")] Xbgr, #[doc(alias = "GST_VIDEO_FORMAT_RGBA")] Rgba, #[doc(alias = "GST_VIDEO_FORMAT_BGRA")] Bgra, #[doc(alias = "GST_VIDEO_FORMAT_ARGB")] Argb, #[doc(alias = "GST_VIDEO_FORMAT_ABGR")] Abgr, #[doc(alias = "GST_VIDEO_FORMAT_RGB")] Rgb, #[doc(alias = "GST_VIDEO_FORMAT_BGR")] Bgr, #[doc(alias = "GST_VIDEO_FORMAT_Y41B")] Y41b, #[doc(alias = "GST_VIDEO_FORMAT_Y42B")] Y42b, #[doc(alias = "GST_VIDEO_FORMAT_YVYU")] Yvyu, #[doc(alias = "GST_VIDEO_FORMAT_Y444")] Y444, #[doc(alias = "GST_VIDEO_FORMAT_v210")] V210, #[doc(alias = "GST_VIDEO_FORMAT_v216")] V216, #[doc(alias = "GST_VIDEO_FORMAT_NV12")] Nv12, #[doc(alias = "GST_VIDEO_FORMAT_NV21")] Nv21, #[doc(alias = "GST_VIDEO_FORMAT_GRAY8")] Gray8, #[doc(alias = "GST_VIDEO_FORMAT_GRAY16_BE")] Gray16Be, #[doc(alias = "GST_VIDEO_FORMAT_GRAY16_LE")] Gray16Le, #[doc(alias = "GST_VIDEO_FORMAT_v308")] V308, #[doc(alias = "GST_VIDEO_FORMAT_RGB16")] Rgb16, #[doc(alias = "GST_VIDEO_FORMAT_BGR16")] Bgr16, #[doc(alias = "GST_VIDEO_FORMAT_RGB15")] Rgb15, #[doc(alias = "GST_VIDEO_FORMAT_BGR15")] Bgr15, #[doc(alias = "GST_VIDEO_FORMAT_UYVP")] Uyvp, #[doc(alias = "GST_VIDEO_FORMAT_A420")] A420, #[doc(alias = "GST_VIDEO_FORMAT_RGB8P")] Rgb8p, #[doc(alias = "GST_VIDEO_FORMAT_YUV9")] Yuv9, #[doc(alias = "GST_VIDEO_FORMAT_YVU9")] Yvu9, #[doc(alias = "GST_VIDEO_FORMAT_IYU1")] Iyu1, #[doc(alias = "GST_VIDEO_FORMAT_ARGB64")] Argb64, #[doc(alias = "GST_VIDEO_FORMAT_AYUV64")] Ayuv64, #[doc(alias = "GST_VIDEO_FORMAT_r210")] R210, #[doc(alias = "GST_VIDEO_FORMAT_I420_10BE")] I42010be, #[doc(alias = "GST_VIDEO_FORMAT_I420_10LE")] I42010le, #[doc(alias = "GST_VIDEO_FORMAT_I422_10BE")] I42210be, #[doc(alias = "GST_VIDEO_FORMAT_I422_10LE")] I42210le, #[doc(alias = "GST_VIDEO_FORMAT_Y444_10BE")] Y44410be, 
#[doc(alias = "GST_VIDEO_FORMAT_Y444_10LE")] Y44410le, #[doc(alias = "GST_VIDEO_FORMAT_GBR")] Gbr, #[doc(alias = "GST_VIDEO_FORMAT_GBR_10BE")] Gbr10be, #[doc(alias = "GST_VIDEO_FORMAT_GBR_10LE")] Gbr10le, #[doc(alias = "GST_VIDEO_FORMAT_NV16")] Nv16, #[doc(alias = "GST_VIDEO_FORMAT_NV24")] Nv24, #[doc(alias = "GST_VIDEO_FORMAT_NV12_64Z32")] Nv1264z32, #[doc(alias = "GST_VIDEO_FORMAT_A420_10BE")] A42010be, #[doc(alias = "GST_VIDEO_FORMAT_A420_10LE")] A42010le, #[doc(alias = "GST_VIDEO_FORMAT_A422_10BE")] A42210be, #[doc(alias = "GST_VIDEO_FORMAT_A422_10LE")] A42210le, #[doc(alias = "GST_VIDEO_FORMAT_A444_10BE")] A44410be, #[doc(alias = "GST_VIDEO_FORMAT_A444_10LE")] A44410le, #[doc(alias = "GST_VIDEO_FORMAT_NV61")] Nv61, #[doc(alias = "GST_VIDEO_FORMAT_P010_10BE")] P01010be, #[doc(alias = "GST_VIDEO_FORMAT_P010_10LE")] P01010le, #[doc(alias = "GST_VIDEO_FORMAT_IYU2")] Iyu2, #[doc(alias = "GST_VIDEO_FORMAT_VYUY")] Vyuy, #[doc(alias = "GST_VIDEO_FORMAT_GBRA")] Gbra, #[doc(alias = "GST_VIDEO_FORMAT_GBRA_10BE")] Gbra10be, #[doc(alias = "GST_VIDEO_FORMAT_GBRA_10LE")] Gbra10le, #[doc(alias = "GST_VIDEO_FORMAT_GBR_12BE")] Gbr12be, #[doc(alias = "GST_VIDEO_FORMAT_GBR_12LE")] Gbr12le, #[doc(alias = "GST_VIDEO_FORMAT_GBRA_12BE")] Gbra12be, #[doc(alias = "GST_VIDEO_FORMAT_GBRA_12LE")] Gbra12le, #[doc(alias = "GST_VIDEO_FORMAT_I420_12BE")] I42012be, #[doc(alias = "GST_VIDEO_FORMAT_I420_12LE")] I42012le, #[doc(alias = "GST_VIDEO_FORMAT_I422_12BE")] I42212be, #[doc(alias = "GST_VIDEO_FORMAT_I422_12LE")] I42212le, #[doc(alias = "GST_VIDEO_FORMAT_Y444_12BE")] Y44412be, #[doc(alias = "GST_VIDEO_FORMAT_Y444_12LE")] Y44412le, #[doc(alias = "GST_VIDEO_FORMAT_GRAY10_LE32")] Gray10Le32, #[doc(alias = "GST_VIDEO_FORMAT_NV12_10LE32")] Nv1210le32, #[doc(alias = "GST_VIDEO_FORMAT_NV16_10LE32")] Nv1610le32, #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(alias = "GST_VIDEO_FORMAT_NV12_10LE40")] Nv1210le40, #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(alias = "GST_VIDEO_FORMAT_Y210")] Y210, #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(alias = "GST_VIDEO_FORMAT_Y410")] Y410, #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(alias = "GST_VIDEO_FORMAT_VUYA")] Vuya, #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(alias = "GST_VIDEO_FORMAT_BGR10A2_LE")] Bgr10a2Le, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "GST_VIDEO_FORMAT_RGB10A2_LE")] Rgb10a2Le, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "GST_VIDEO_FORMAT_Y444_16BE")] Y44416be, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "GST_VIDEO_FORMAT_Y444_16LE")] Y44416le, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "GST_VIDEO_FORMAT_P016_BE")] P016Be, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "GST_VIDEO_FORMAT_P016_LE")] P016Le, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "GST_VIDEO_FORMAT_P012_BE")] P012Be, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "GST_VIDEO_FORMAT_P012_LE")] P012Le, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "GST_VIDEO_FORMAT_Y212_BE")] Y212Be, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = 
"GST_VIDEO_FORMAT_Y212_LE")] Y212Le, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "GST_VIDEO_FORMAT_Y412_BE")] Y412Be, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "GST_VIDEO_FORMAT_Y412_LE")] Y412Le, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "GST_VIDEO_FORMAT_NV12_4L4")] Nv124l4, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "GST_VIDEO_FORMAT_NV12_32L32")] Nv1232l32, #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "GST_VIDEO_FORMAT_RGBP")] Rgbp, #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "GST_VIDEO_FORMAT_BGRP")] Bgrp, #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "GST_VIDEO_FORMAT_AV12")] Av12, #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "GST_VIDEO_FORMAT_ARGB64_LE")] Argb64Le, #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "GST_VIDEO_FORMAT_ARGB64_BE")] Argb64Be, #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "GST_VIDEO_FORMAT_RGBA64_LE")] Rgba64Le, #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "GST_VIDEO_FORMAT_RGBA64_BE")] Rgba64Be, #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "GST_VIDEO_FORMAT_BGRA64_LE")] Bgra64Le, #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "GST_VIDEO_FORMAT_BGRA64_BE")] Bgra64Be, #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "GST_VIDEO_FORMAT_ABGR64_LE")] Abgr64Le, #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "GST_VIDEO_FORMAT_ABGR64_BE")] Abgr64Be, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[doc(alias = "GST_VIDEO_FORMAT_NV12_16L32S")] Nv1216l32s, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[doc(alias = "GST_VIDEO_FORMAT_NV12_8L128")] Nv128l128, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[doc(alias = "GST_VIDEO_FORMAT_NV12_10BE_8L128")] Nv1210be8l128, #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "GST_VIDEO_FORMAT_NV12_10LE40_4L4")] Nv1210le404l4, #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "GST_VIDEO_FORMAT_DMA_DRM")] DmaDrm, #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "GST_VIDEO_FORMAT_MT2110T")] Mt2110t, #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "GST_VIDEO_FORMAT_MT2110R")] Mt2110r, #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "GST_VIDEO_FORMAT_A422")] A422, #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "GST_VIDEO_FORMAT_A444")] A444, #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "GST_VIDEO_FORMAT_A444_12LE")] A44412le, #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "GST_VIDEO_FORMAT_A444_12BE")] A44412be, #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "GST_VIDEO_FORMAT_A422_12LE")] A42212le, #[cfg(feature = "v1_24")] 
#[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "GST_VIDEO_FORMAT_A422_12BE")] A42212be, #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "GST_VIDEO_FORMAT_A420_12LE")] A42012le, #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "GST_VIDEO_FORMAT_A420_12BE")] A42012be, #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "GST_VIDEO_FORMAT_A444_16LE")] A44416le, #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "GST_VIDEO_FORMAT_A444_16BE")] A44416be, #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "GST_VIDEO_FORMAT_A422_16LE")] A42216le, #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "GST_VIDEO_FORMAT_A422_16BE")] A42216be, #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "GST_VIDEO_FORMAT_A420_16LE")] A42016le, #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "GST_VIDEO_FORMAT_A420_16BE")] A42016be, #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "GST_VIDEO_FORMAT_GBR_16LE")] Gbr16le, #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "GST_VIDEO_FORMAT_GBR_16BE")] Gbr16be, #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "GST_VIDEO_FORMAT_RBGA")] Rbga, #[cfg(feature = "v1_26")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_26")))] #[doc(alias = "GST_VIDEO_FORMAT_Y216_LE")] Y216Le, #[cfg(feature = "v1_26")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_26")))] #[doc(alias = "GST_VIDEO_FORMAT_Y216_BE")] Y216Be, #[cfg(feature = "v1_26")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_26")))] #[doc(alias = "GST_VIDEO_FORMAT_Y416_LE")] Y416Le, #[cfg(feature = "v1_26")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_26")))] #[doc(alias = "GST_VIDEO_FORMAT_Y416_BE")] Y416Be, #[cfg(feature = "v1_26")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_26")))] #[doc(alias = "GST_VIDEO_FORMAT_GRAY10_LE16")] Gray10Le16, #[doc(hidden)] __Unknown(i32), } impl VideoFormat { #[doc(alias = "gst_video_format_from_fourcc")] pub fn from_fourcc(fourcc: u32) -> VideoFormat { assert_initialized_main_thread!(); unsafe { from_glib(ffi::gst_video_format_from_fourcc(fourcc)) } } #[doc(alias = "gst_video_format_from_string")] pub fn from_string(format: &str) -> VideoFormat { assert_initialized_main_thread!(); unsafe { from_glib(ffi::gst_video_format_from_string(format.to_glib_none().0)) } } //#[doc(alias = "gst_video_format_get_palette")] //#[doc(alias = "get_palette")] //pub fn palette(self) -> (/*Unimplemented*/Option, usize) { // unsafe { TODO: call ffi:gst_video_format_get_palette() } //} #[doc(alias = "gst_video_format_to_fourcc")] pub fn to_fourcc(self) -> u32 { assert_initialized_main_thread!(); unsafe { ffi::gst_video_format_to_fourcc(self.into_glib()) } } } impl std::fmt::Display for VideoFormat { #[inline] fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { f.write_str(&self.to_str()) } } #[doc(hidden)] impl IntoGlib for VideoFormat { type GlibType = ffi::GstVideoFormat; fn into_glib(self) -> ffi::GstVideoFormat { match self { Self::Unknown => ffi::GST_VIDEO_FORMAT_UNKNOWN, Self::Encoded => ffi::GST_VIDEO_FORMAT_ENCODED, Self::I420 => ffi::GST_VIDEO_FORMAT_I420, Self::Yv12 => ffi::GST_VIDEO_FORMAT_YV12, Self::Yuy2 => ffi::GST_VIDEO_FORMAT_YUY2, Self::Uyvy => 
ffi::GST_VIDEO_FORMAT_UYVY, Self::Ayuv => ffi::GST_VIDEO_FORMAT_AYUV, Self::Rgbx => ffi::GST_VIDEO_FORMAT_RGBx, Self::Bgrx => ffi::GST_VIDEO_FORMAT_BGRx, Self::Xrgb => ffi::GST_VIDEO_FORMAT_xRGB, Self::Xbgr => ffi::GST_VIDEO_FORMAT_xBGR, Self::Rgba => ffi::GST_VIDEO_FORMAT_RGBA, Self::Bgra => ffi::GST_VIDEO_FORMAT_BGRA, Self::Argb => ffi::GST_VIDEO_FORMAT_ARGB, Self::Abgr => ffi::GST_VIDEO_FORMAT_ABGR, Self::Rgb => ffi::GST_VIDEO_FORMAT_RGB, Self::Bgr => ffi::GST_VIDEO_FORMAT_BGR, Self::Y41b => ffi::GST_VIDEO_FORMAT_Y41B, Self::Y42b => ffi::GST_VIDEO_FORMAT_Y42B, Self::Yvyu => ffi::GST_VIDEO_FORMAT_YVYU, Self::Y444 => ffi::GST_VIDEO_FORMAT_Y444, Self::V210 => ffi::GST_VIDEO_FORMAT_v210, Self::V216 => ffi::GST_VIDEO_FORMAT_v216, Self::Nv12 => ffi::GST_VIDEO_FORMAT_NV12, Self::Nv21 => ffi::GST_VIDEO_FORMAT_NV21, Self::Gray8 => ffi::GST_VIDEO_FORMAT_GRAY8, Self::Gray16Be => ffi::GST_VIDEO_FORMAT_GRAY16_BE, Self::Gray16Le => ffi::GST_VIDEO_FORMAT_GRAY16_LE, Self::V308 => ffi::GST_VIDEO_FORMAT_v308, Self::Rgb16 => ffi::GST_VIDEO_FORMAT_RGB16, Self::Bgr16 => ffi::GST_VIDEO_FORMAT_BGR16, Self::Rgb15 => ffi::GST_VIDEO_FORMAT_RGB15, Self::Bgr15 => ffi::GST_VIDEO_FORMAT_BGR15, Self::Uyvp => ffi::GST_VIDEO_FORMAT_UYVP, Self::A420 => ffi::GST_VIDEO_FORMAT_A420, Self::Rgb8p => ffi::GST_VIDEO_FORMAT_RGB8P, Self::Yuv9 => ffi::GST_VIDEO_FORMAT_YUV9, Self::Yvu9 => ffi::GST_VIDEO_FORMAT_YVU9, Self::Iyu1 => ffi::GST_VIDEO_FORMAT_IYU1, Self::Argb64 => ffi::GST_VIDEO_FORMAT_ARGB64, Self::Ayuv64 => ffi::GST_VIDEO_FORMAT_AYUV64, Self::R210 => ffi::GST_VIDEO_FORMAT_r210, Self::I42010be => ffi::GST_VIDEO_FORMAT_I420_10BE, Self::I42010le => ffi::GST_VIDEO_FORMAT_I420_10LE, Self::I42210be => ffi::GST_VIDEO_FORMAT_I422_10BE, Self::I42210le => ffi::GST_VIDEO_FORMAT_I422_10LE, Self::Y44410be => ffi::GST_VIDEO_FORMAT_Y444_10BE, Self::Y44410le => ffi::GST_VIDEO_FORMAT_Y444_10LE, Self::Gbr => ffi::GST_VIDEO_FORMAT_GBR, Self::Gbr10be => ffi::GST_VIDEO_FORMAT_GBR_10BE, Self::Gbr10le => ffi::GST_VIDEO_FORMAT_GBR_10LE, Self::Nv16 => ffi::GST_VIDEO_FORMAT_NV16, Self::Nv24 => ffi::GST_VIDEO_FORMAT_NV24, Self::Nv1264z32 => ffi::GST_VIDEO_FORMAT_NV12_64Z32, Self::A42010be => ffi::GST_VIDEO_FORMAT_A420_10BE, Self::A42010le => ffi::GST_VIDEO_FORMAT_A420_10LE, Self::A42210be => ffi::GST_VIDEO_FORMAT_A422_10BE, Self::A42210le => ffi::GST_VIDEO_FORMAT_A422_10LE, Self::A44410be => ffi::GST_VIDEO_FORMAT_A444_10BE, Self::A44410le => ffi::GST_VIDEO_FORMAT_A444_10LE, Self::Nv61 => ffi::GST_VIDEO_FORMAT_NV61, Self::P01010be => ffi::GST_VIDEO_FORMAT_P010_10BE, Self::P01010le => ffi::GST_VIDEO_FORMAT_P010_10LE, Self::Iyu2 => ffi::GST_VIDEO_FORMAT_IYU2, Self::Vyuy => ffi::GST_VIDEO_FORMAT_VYUY, Self::Gbra => ffi::GST_VIDEO_FORMAT_GBRA, Self::Gbra10be => ffi::GST_VIDEO_FORMAT_GBRA_10BE, Self::Gbra10le => ffi::GST_VIDEO_FORMAT_GBRA_10LE, Self::Gbr12be => ffi::GST_VIDEO_FORMAT_GBR_12BE, Self::Gbr12le => ffi::GST_VIDEO_FORMAT_GBR_12LE, Self::Gbra12be => ffi::GST_VIDEO_FORMAT_GBRA_12BE, Self::Gbra12le => ffi::GST_VIDEO_FORMAT_GBRA_12LE, Self::I42012be => ffi::GST_VIDEO_FORMAT_I420_12BE, Self::I42012le => ffi::GST_VIDEO_FORMAT_I420_12LE, Self::I42212be => ffi::GST_VIDEO_FORMAT_I422_12BE, Self::I42212le => ffi::GST_VIDEO_FORMAT_I422_12LE, Self::Y44412be => ffi::GST_VIDEO_FORMAT_Y444_12BE, Self::Y44412le => ffi::GST_VIDEO_FORMAT_Y444_12LE, Self::Gray10Le32 => ffi::GST_VIDEO_FORMAT_GRAY10_LE32, Self::Nv1210le32 => ffi::GST_VIDEO_FORMAT_NV12_10LE32, Self::Nv1610le32 => ffi::GST_VIDEO_FORMAT_NV16_10LE32, #[cfg(feature = "v1_16")] Self::Nv1210le40 => 
ffi::GST_VIDEO_FORMAT_NV12_10LE40, #[cfg(feature = "v1_16")] Self::Y210 => ffi::GST_VIDEO_FORMAT_Y210, #[cfg(feature = "v1_16")] Self::Y410 => ffi::GST_VIDEO_FORMAT_Y410, #[cfg(feature = "v1_16")] Self::Vuya => ffi::GST_VIDEO_FORMAT_VUYA, #[cfg(feature = "v1_16")] Self::Bgr10a2Le => ffi::GST_VIDEO_FORMAT_BGR10A2_LE, #[cfg(feature = "v1_18")] Self::Rgb10a2Le => ffi::GST_VIDEO_FORMAT_RGB10A2_LE, #[cfg(feature = "v1_18")] Self::Y44416be => ffi::GST_VIDEO_FORMAT_Y444_16BE, #[cfg(feature = "v1_18")] Self::Y44416le => ffi::GST_VIDEO_FORMAT_Y444_16LE, #[cfg(feature = "v1_18")] Self::P016Be => ffi::GST_VIDEO_FORMAT_P016_BE, #[cfg(feature = "v1_18")] Self::P016Le => ffi::GST_VIDEO_FORMAT_P016_LE, #[cfg(feature = "v1_18")] Self::P012Be => ffi::GST_VIDEO_FORMAT_P012_BE, #[cfg(feature = "v1_18")] Self::P012Le => ffi::GST_VIDEO_FORMAT_P012_LE, #[cfg(feature = "v1_18")] Self::Y212Be => ffi::GST_VIDEO_FORMAT_Y212_BE, #[cfg(feature = "v1_18")] Self::Y212Le => ffi::GST_VIDEO_FORMAT_Y212_LE, #[cfg(feature = "v1_18")] Self::Y412Be => ffi::GST_VIDEO_FORMAT_Y412_BE, #[cfg(feature = "v1_18")] Self::Y412Le => ffi::GST_VIDEO_FORMAT_Y412_LE, #[cfg(feature = "v1_18")] Self::Nv124l4 => ffi::GST_VIDEO_FORMAT_NV12_4L4, #[cfg(feature = "v1_18")] Self::Nv1232l32 => ffi::GST_VIDEO_FORMAT_NV12_32L32, #[cfg(feature = "v1_20")] Self::Rgbp => ffi::GST_VIDEO_FORMAT_RGBP, #[cfg(feature = "v1_20")] Self::Bgrp => ffi::GST_VIDEO_FORMAT_BGRP, #[cfg(feature = "v1_20")] Self::Av12 => ffi::GST_VIDEO_FORMAT_AV12, #[cfg(feature = "v1_20")] Self::Argb64Le => ffi::GST_VIDEO_FORMAT_ARGB64_LE, #[cfg(feature = "v1_20")] Self::Argb64Be => ffi::GST_VIDEO_FORMAT_ARGB64_BE, #[cfg(feature = "v1_20")] Self::Rgba64Le => ffi::GST_VIDEO_FORMAT_RGBA64_LE, #[cfg(feature = "v1_20")] Self::Rgba64Be => ffi::GST_VIDEO_FORMAT_RGBA64_BE, #[cfg(feature = "v1_20")] Self::Bgra64Le => ffi::GST_VIDEO_FORMAT_BGRA64_LE, #[cfg(feature = "v1_20")] Self::Bgra64Be => ffi::GST_VIDEO_FORMAT_BGRA64_BE, #[cfg(feature = "v1_20")] Self::Abgr64Le => ffi::GST_VIDEO_FORMAT_ABGR64_LE, #[cfg(feature = "v1_20")] Self::Abgr64Be => ffi::GST_VIDEO_FORMAT_ABGR64_BE, #[cfg(feature = "v1_22")] Self::Nv1216l32s => ffi::GST_VIDEO_FORMAT_NV12_16L32S, #[cfg(feature = "v1_22")] Self::Nv128l128 => ffi::GST_VIDEO_FORMAT_NV12_8L128, #[cfg(feature = "v1_22")] Self::Nv1210be8l128 => ffi::GST_VIDEO_FORMAT_NV12_10BE_8L128, #[cfg(feature = "v1_24")] Self::Nv1210le404l4 => ffi::GST_VIDEO_FORMAT_NV12_10LE40_4L4, #[cfg(feature = "v1_24")] Self::DmaDrm => ffi::GST_VIDEO_FORMAT_DMA_DRM, #[cfg(feature = "v1_24")] Self::Mt2110t => ffi::GST_VIDEO_FORMAT_MT2110T, #[cfg(feature = "v1_24")] Self::Mt2110r => ffi::GST_VIDEO_FORMAT_MT2110R, #[cfg(feature = "v1_24")] Self::A422 => ffi::GST_VIDEO_FORMAT_A422, #[cfg(feature = "v1_24")] Self::A444 => ffi::GST_VIDEO_FORMAT_A444, #[cfg(feature = "v1_24")] Self::A44412le => ffi::GST_VIDEO_FORMAT_A444_12LE, #[cfg(feature = "v1_24")] Self::A44412be => ffi::GST_VIDEO_FORMAT_A444_12BE, #[cfg(feature = "v1_24")] Self::A42212le => ffi::GST_VIDEO_FORMAT_A422_12LE, #[cfg(feature = "v1_24")] Self::A42212be => ffi::GST_VIDEO_FORMAT_A422_12BE, #[cfg(feature = "v1_24")] Self::A42012le => ffi::GST_VIDEO_FORMAT_A420_12LE, #[cfg(feature = "v1_24")] Self::A42012be => ffi::GST_VIDEO_FORMAT_A420_12BE, #[cfg(feature = "v1_24")] Self::A44416le => ffi::GST_VIDEO_FORMAT_A444_16LE, #[cfg(feature = "v1_24")] Self::A44416be => ffi::GST_VIDEO_FORMAT_A444_16BE, #[cfg(feature = "v1_24")] Self::A42216le => ffi::GST_VIDEO_FORMAT_A422_16LE, #[cfg(feature = "v1_24")] Self::A42216be => 
ffi::GST_VIDEO_FORMAT_A422_16BE, #[cfg(feature = "v1_24")] Self::A42016le => ffi::GST_VIDEO_FORMAT_A420_16LE, #[cfg(feature = "v1_24")] Self::A42016be => ffi::GST_VIDEO_FORMAT_A420_16BE, #[cfg(feature = "v1_24")] Self::Gbr16le => ffi::GST_VIDEO_FORMAT_GBR_16LE, #[cfg(feature = "v1_24")] Self::Gbr16be => ffi::GST_VIDEO_FORMAT_GBR_16BE, #[cfg(feature = "v1_24")] Self::Rbga => ffi::GST_VIDEO_FORMAT_RBGA, #[cfg(feature = "v1_26")] Self::Y216Le => ffi::GST_VIDEO_FORMAT_Y216_LE, #[cfg(feature = "v1_26")] Self::Y216Be => ffi::GST_VIDEO_FORMAT_Y216_BE, #[cfg(feature = "v1_26")] Self::Y416Le => ffi::GST_VIDEO_FORMAT_Y416_LE, #[cfg(feature = "v1_26")] Self::Y416Be => ffi::GST_VIDEO_FORMAT_Y416_BE, #[cfg(feature = "v1_26")] Self::Gray10Le16 => ffi::GST_VIDEO_FORMAT_GRAY10_LE16, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for VideoFormat { unsafe fn from_glib(value: ffi::GstVideoFormat) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_FORMAT_UNKNOWN => Self::Unknown, ffi::GST_VIDEO_FORMAT_ENCODED => Self::Encoded, ffi::GST_VIDEO_FORMAT_I420 => Self::I420, ffi::GST_VIDEO_FORMAT_YV12 => Self::Yv12, ffi::GST_VIDEO_FORMAT_YUY2 => Self::Yuy2, ffi::GST_VIDEO_FORMAT_UYVY => Self::Uyvy, ffi::GST_VIDEO_FORMAT_AYUV => Self::Ayuv, ffi::GST_VIDEO_FORMAT_RGBx => Self::Rgbx, ffi::GST_VIDEO_FORMAT_BGRx => Self::Bgrx, ffi::GST_VIDEO_FORMAT_xRGB => Self::Xrgb, ffi::GST_VIDEO_FORMAT_xBGR => Self::Xbgr, ffi::GST_VIDEO_FORMAT_RGBA => Self::Rgba, ffi::GST_VIDEO_FORMAT_BGRA => Self::Bgra, ffi::GST_VIDEO_FORMAT_ARGB => Self::Argb, ffi::GST_VIDEO_FORMAT_ABGR => Self::Abgr, ffi::GST_VIDEO_FORMAT_RGB => Self::Rgb, ffi::GST_VIDEO_FORMAT_BGR => Self::Bgr, ffi::GST_VIDEO_FORMAT_Y41B => Self::Y41b, ffi::GST_VIDEO_FORMAT_Y42B => Self::Y42b, ffi::GST_VIDEO_FORMAT_YVYU => Self::Yvyu, ffi::GST_VIDEO_FORMAT_Y444 => Self::Y444, ffi::GST_VIDEO_FORMAT_v210 => Self::V210, ffi::GST_VIDEO_FORMAT_v216 => Self::V216, ffi::GST_VIDEO_FORMAT_NV12 => Self::Nv12, ffi::GST_VIDEO_FORMAT_NV21 => Self::Nv21, ffi::GST_VIDEO_FORMAT_GRAY8 => Self::Gray8, ffi::GST_VIDEO_FORMAT_GRAY16_BE => Self::Gray16Be, ffi::GST_VIDEO_FORMAT_GRAY16_LE => Self::Gray16Le, ffi::GST_VIDEO_FORMAT_v308 => Self::V308, ffi::GST_VIDEO_FORMAT_RGB16 => Self::Rgb16, ffi::GST_VIDEO_FORMAT_BGR16 => Self::Bgr16, ffi::GST_VIDEO_FORMAT_RGB15 => Self::Rgb15, ffi::GST_VIDEO_FORMAT_BGR15 => Self::Bgr15, ffi::GST_VIDEO_FORMAT_UYVP => Self::Uyvp, ffi::GST_VIDEO_FORMAT_A420 => Self::A420, ffi::GST_VIDEO_FORMAT_RGB8P => Self::Rgb8p, ffi::GST_VIDEO_FORMAT_YUV9 => Self::Yuv9, ffi::GST_VIDEO_FORMAT_YVU9 => Self::Yvu9, ffi::GST_VIDEO_FORMAT_IYU1 => Self::Iyu1, ffi::GST_VIDEO_FORMAT_ARGB64 => Self::Argb64, ffi::GST_VIDEO_FORMAT_AYUV64 => Self::Ayuv64, ffi::GST_VIDEO_FORMAT_r210 => Self::R210, ffi::GST_VIDEO_FORMAT_I420_10BE => Self::I42010be, ffi::GST_VIDEO_FORMAT_I420_10LE => Self::I42010le, ffi::GST_VIDEO_FORMAT_I422_10BE => Self::I42210be, ffi::GST_VIDEO_FORMAT_I422_10LE => Self::I42210le, ffi::GST_VIDEO_FORMAT_Y444_10BE => Self::Y44410be, ffi::GST_VIDEO_FORMAT_Y444_10LE => Self::Y44410le, ffi::GST_VIDEO_FORMAT_GBR => Self::Gbr, ffi::GST_VIDEO_FORMAT_GBR_10BE => Self::Gbr10be, ffi::GST_VIDEO_FORMAT_GBR_10LE => Self::Gbr10le, ffi::GST_VIDEO_FORMAT_NV16 => Self::Nv16, ffi::GST_VIDEO_FORMAT_NV24 => Self::Nv24, ffi::GST_VIDEO_FORMAT_NV12_64Z32 => Self::Nv1264z32, ffi::GST_VIDEO_FORMAT_A420_10BE => Self::A42010be, ffi::GST_VIDEO_FORMAT_A420_10LE => Self::A42010le, ffi::GST_VIDEO_FORMAT_A422_10BE => Self::A42210be, ffi::GST_VIDEO_FORMAT_A422_10LE => 
Self::A42210le, ffi::GST_VIDEO_FORMAT_A444_10BE => Self::A44410be, ffi::GST_VIDEO_FORMAT_A444_10LE => Self::A44410le, ffi::GST_VIDEO_FORMAT_NV61 => Self::Nv61, ffi::GST_VIDEO_FORMAT_P010_10BE => Self::P01010be, ffi::GST_VIDEO_FORMAT_P010_10LE => Self::P01010le, ffi::GST_VIDEO_FORMAT_IYU2 => Self::Iyu2, ffi::GST_VIDEO_FORMAT_VYUY => Self::Vyuy, ffi::GST_VIDEO_FORMAT_GBRA => Self::Gbra, ffi::GST_VIDEO_FORMAT_GBRA_10BE => Self::Gbra10be, ffi::GST_VIDEO_FORMAT_GBRA_10LE => Self::Gbra10le, ffi::GST_VIDEO_FORMAT_GBR_12BE => Self::Gbr12be, ffi::GST_VIDEO_FORMAT_GBR_12LE => Self::Gbr12le, ffi::GST_VIDEO_FORMAT_GBRA_12BE => Self::Gbra12be, ffi::GST_VIDEO_FORMAT_GBRA_12LE => Self::Gbra12le, ffi::GST_VIDEO_FORMAT_I420_12BE => Self::I42012be, ffi::GST_VIDEO_FORMAT_I420_12LE => Self::I42012le, ffi::GST_VIDEO_FORMAT_I422_12BE => Self::I42212be, ffi::GST_VIDEO_FORMAT_I422_12LE => Self::I42212le, ffi::GST_VIDEO_FORMAT_Y444_12BE => Self::Y44412be, ffi::GST_VIDEO_FORMAT_Y444_12LE => Self::Y44412le, ffi::GST_VIDEO_FORMAT_GRAY10_LE32 => Self::Gray10Le32, ffi::GST_VIDEO_FORMAT_NV12_10LE32 => Self::Nv1210le32, ffi::GST_VIDEO_FORMAT_NV16_10LE32 => Self::Nv1610le32, #[cfg(feature = "v1_16")] ffi::GST_VIDEO_FORMAT_NV12_10LE40 => Self::Nv1210le40, #[cfg(feature = "v1_16")] ffi::GST_VIDEO_FORMAT_Y210 => Self::Y210, #[cfg(feature = "v1_16")] ffi::GST_VIDEO_FORMAT_Y410 => Self::Y410, #[cfg(feature = "v1_16")] ffi::GST_VIDEO_FORMAT_VUYA => Self::Vuya, #[cfg(feature = "v1_16")] ffi::GST_VIDEO_FORMAT_BGR10A2_LE => Self::Bgr10a2Le, #[cfg(feature = "v1_18")] ffi::GST_VIDEO_FORMAT_RGB10A2_LE => Self::Rgb10a2Le, #[cfg(feature = "v1_18")] ffi::GST_VIDEO_FORMAT_Y444_16BE => Self::Y44416be, #[cfg(feature = "v1_18")] ffi::GST_VIDEO_FORMAT_Y444_16LE => Self::Y44416le, #[cfg(feature = "v1_18")] ffi::GST_VIDEO_FORMAT_P016_BE => Self::P016Be, #[cfg(feature = "v1_18")] ffi::GST_VIDEO_FORMAT_P016_LE => Self::P016Le, #[cfg(feature = "v1_18")] ffi::GST_VIDEO_FORMAT_P012_BE => Self::P012Be, #[cfg(feature = "v1_18")] ffi::GST_VIDEO_FORMAT_P012_LE => Self::P012Le, #[cfg(feature = "v1_18")] ffi::GST_VIDEO_FORMAT_Y212_BE => Self::Y212Be, #[cfg(feature = "v1_18")] ffi::GST_VIDEO_FORMAT_Y212_LE => Self::Y212Le, #[cfg(feature = "v1_18")] ffi::GST_VIDEO_FORMAT_Y412_BE => Self::Y412Be, #[cfg(feature = "v1_18")] ffi::GST_VIDEO_FORMAT_Y412_LE => Self::Y412Le, #[cfg(feature = "v1_18")] ffi::GST_VIDEO_FORMAT_NV12_4L4 => Self::Nv124l4, #[cfg(feature = "v1_18")] ffi::GST_VIDEO_FORMAT_NV12_32L32 => Self::Nv1232l32, #[cfg(feature = "v1_20")] ffi::GST_VIDEO_FORMAT_RGBP => Self::Rgbp, #[cfg(feature = "v1_20")] ffi::GST_VIDEO_FORMAT_BGRP => Self::Bgrp, #[cfg(feature = "v1_20")] ffi::GST_VIDEO_FORMAT_AV12 => Self::Av12, #[cfg(feature = "v1_20")] ffi::GST_VIDEO_FORMAT_ARGB64_LE => Self::Argb64Le, #[cfg(feature = "v1_20")] ffi::GST_VIDEO_FORMAT_ARGB64_BE => Self::Argb64Be, #[cfg(feature = "v1_20")] ffi::GST_VIDEO_FORMAT_RGBA64_LE => Self::Rgba64Le, #[cfg(feature = "v1_20")] ffi::GST_VIDEO_FORMAT_RGBA64_BE => Self::Rgba64Be, #[cfg(feature = "v1_20")] ffi::GST_VIDEO_FORMAT_BGRA64_LE => Self::Bgra64Le, #[cfg(feature = "v1_20")] ffi::GST_VIDEO_FORMAT_BGRA64_BE => Self::Bgra64Be, #[cfg(feature = "v1_20")] ffi::GST_VIDEO_FORMAT_ABGR64_LE => Self::Abgr64Le, #[cfg(feature = "v1_20")] ffi::GST_VIDEO_FORMAT_ABGR64_BE => Self::Abgr64Be, #[cfg(feature = "v1_22")] ffi::GST_VIDEO_FORMAT_NV12_16L32S => Self::Nv1216l32s, #[cfg(feature = "v1_22")] ffi::GST_VIDEO_FORMAT_NV12_8L128 => Self::Nv128l128, #[cfg(feature = "v1_22")] ffi::GST_VIDEO_FORMAT_NV12_10BE_8L128 => 
Self::Nv1210be8l128, #[cfg(feature = "v1_24")] ffi::GST_VIDEO_FORMAT_NV12_10LE40_4L4 => Self::Nv1210le404l4, #[cfg(feature = "v1_24")] ffi::GST_VIDEO_FORMAT_DMA_DRM => Self::DmaDrm, #[cfg(feature = "v1_24")] ffi::GST_VIDEO_FORMAT_MT2110T => Self::Mt2110t, #[cfg(feature = "v1_24")] ffi::GST_VIDEO_FORMAT_MT2110R => Self::Mt2110r, #[cfg(feature = "v1_24")] ffi::GST_VIDEO_FORMAT_A422 => Self::A422, #[cfg(feature = "v1_24")] ffi::GST_VIDEO_FORMAT_A444 => Self::A444, #[cfg(feature = "v1_24")] ffi::GST_VIDEO_FORMAT_A444_12LE => Self::A44412le, #[cfg(feature = "v1_24")] ffi::GST_VIDEO_FORMAT_A444_12BE => Self::A44412be, #[cfg(feature = "v1_24")] ffi::GST_VIDEO_FORMAT_A422_12LE => Self::A42212le, #[cfg(feature = "v1_24")] ffi::GST_VIDEO_FORMAT_A422_12BE => Self::A42212be, #[cfg(feature = "v1_24")] ffi::GST_VIDEO_FORMAT_A420_12LE => Self::A42012le, #[cfg(feature = "v1_24")] ffi::GST_VIDEO_FORMAT_A420_12BE => Self::A42012be, #[cfg(feature = "v1_24")] ffi::GST_VIDEO_FORMAT_A444_16LE => Self::A44416le, #[cfg(feature = "v1_24")] ffi::GST_VIDEO_FORMAT_A444_16BE => Self::A44416be, #[cfg(feature = "v1_24")] ffi::GST_VIDEO_FORMAT_A422_16LE => Self::A42216le, #[cfg(feature = "v1_24")] ffi::GST_VIDEO_FORMAT_A422_16BE => Self::A42216be, #[cfg(feature = "v1_24")] ffi::GST_VIDEO_FORMAT_A420_16LE => Self::A42016le, #[cfg(feature = "v1_24")] ffi::GST_VIDEO_FORMAT_A420_16BE => Self::A42016be, #[cfg(feature = "v1_24")] ffi::GST_VIDEO_FORMAT_GBR_16LE => Self::Gbr16le, #[cfg(feature = "v1_24")] ffi::GST_VIDEO_FORMAT_GBR_16BE => Self::Gbr16be, #[cfg(feature = "v1_24")] ffi::GST_VIDEO_FORMAT_RBGA => Self::Rbga, #[cfg(feature = "v1_26")] ffi::GST_VIDEO_FORMAT_Y216_LE => Self::Y216Le, #[cfg(feature = "v1_26")] ffi::GST_VIDEO_FORMAT_Y216_BE => Self::Y216Be, #[cfg(feature = "v1_26")] ffi::GST_VIDEO_FORMAT_Y416_LE => Self::Y416Le, #[cfg(feature = "v1_26")] ffi::GST_VIDEO_FORMAT_Y416_BE => Self::Y416Be, #[cfg(feature = "v1_26")] ffi::GST_VIDEO_FORMAT_GRAY10_LE16 => Self::Gray10Le16, value => Self::__Unknown(value), } } } impl StaticType for VideoFormat { #[inline] #[doc(alias = "gst_video_format_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_format_get_type()) } } } impl glib::HasParamSpec for VideoFormat { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for VideoFormat { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoFormat { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for VideoFormat { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoFormat) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoGammaMode")] pub enum VideoGammaMode { #[doc(alias = "GST_VIDEO_GAMMA_MODE_NONE")] None, #[doc(alias = "GST_VIDEO_GAMMA_MODE_REMAP")] Remap, #[doc(hidden)] __Unknown(i32), } #[doc(hidden)] impl IntoGlib for 
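// A minimal usage sketch of the conversions above: `VideoFormat` round-trips
// through `glib::Value` via the `ToValue`/`FromValue` impls (assuming GStreamer
// was initialized with `gst::init()` and the glib traits are in scope via
// `gstreamer::prelude::*`):
//
//     use gstreamer::prelude::*;
//     use gstreamer_video::VideoFormat;
//
//     let fmt = VideoFormat::Nv12;
//     let value = fmt.to_value();                      // stored as a GEnum value
//     let back = value.get::<VideoFormat>().unwrap();  // checked extraction
//     assert_eq!(fmt, back);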
VideoGammaMode { type GlibType = ffi::GstVideoGammaMode; #[inline] fn into_glib(self) -> ffi::GstVideoGammaMode { match self { Self::None => ffi::GST_VIDEO_GAMMA_MODE_NONE, Self::Remap => ffi::GST_VIDEO_GAMMA_MODE_REMAP, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for VideoGammaMode { #[inline] unsafe fn from_glib(value: ffi::GstVideoGammaMode) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_GAMMA_MODE_NONE => Self::None, ffi::GST_VIDEO_GAMMA_MODE_REMAP => Self::Remap, value => Self::__Unknown(value), } } } impl StaticType for VideoGammaMode { #[inline] #[doc(alias = "gst_video_gamma_mode_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_gamma_mode_get_type()) } } } impl glib::HasParamSpec for VideoGammaMode { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for VideoGammaMode { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoGammaMode { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for VideoGammaMode { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoGammaMode) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoInterlaceMode")] pub enum VideoInterlaceMode { #[doc(alias = "GST_VIDEO_INTERLACE_MODE_PROGRESSIVE")] Progressive, #[doc(alias = "GST_VIDEO_INTERLACE_MODE_INTERLEAVED")] Interleaved, #[doc(alias = "GST_VIDEO_INTERLACE_MODE_MIXED")] Mixed, #[doc(alias = "GST_VIDEO_INTERLACE_MODE_FIELDS")] Fields, #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(alias = "GST_VIDEO_INTERLACE_MODE_ALTERNATE")] Alternate, #[doc(hidden)] __Unknown(i32), } impl VideoInterlaceMode { #[doc(alias = "gst_video_interlace_mode_from_string")] pub fn from_string(mode: &str) -> VideoInterlaceMode { assert_initialized_main_thread!(); unsafe { from_glib(ffi::gst_video_interlace_mode_from_string( mode.to_glib_none().0, )) } } pub fn to_str<'a>(self) -> &'a GStr { unsafe { GStr::from_ptr( ffi::gst_video_interlace_mode_to_string(self.into_glib()) .as_ref() .expect("gst_video_interlace_mode_to_string returned NULL"), ) } } } impl std::fmt::Display for VideoInterlaceMode { #[inline] fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { f.write_str(&self.to_str()) } } #[doc(hidden)] impl IntoGlib for VideoInterlaceMode { type GlibType = ffi::GstVideoInterlaceMode; #[inline] fn into_glib(self) -> ffi::GstVideoInterlaceMode { match self { Self::Progressive => ffi::GST_VIDEO_INTERLACE_MODE_PROGRESSIVE, Self::Interleaved => ffi::GST_VIDEO_INTERLACE_MODE_INTERLEAVED, Self::Mixed => ffi::GST_VIDEO_INTERLACE_MODE_MIXED, Self::Fields => ffi::GST_VIDEO_INTERLACE_MODE_FIELDS, #[cfg(feature = "v1_16")] Self::Alternate => ffi::GST_VIDEO_INTERLACE_MODE_ALTERNATE, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for 
VideoInterlaceMode { #[inline] unsafe fn from_glib(value: ffi::GstVideoInterlaceMode) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_INTERLACE_MODE_PROGRESSIVE => Self::Progressive, ffi::GST_VIDEO_INTERLACE_MODE_INTERLEAVED => Self::Interleaved, ffi::GST_VIDEO_INTERLACE_MODE_MIXED => Self::Mixed, ffi::GST_VIDEO_INTERLACE_MODE_FIELDS => Self::Fields, #[cfg(feature = "v1_16")] ffi::GST_VIDEO_INTERLACE_MODE_ALTERNATE => Self::Alternate, value => Self::__Unknown(value), } } } impl StaticType for VideoInterlaceMode { #[inline] #[doc(alias = "gst_video_interlace_mode_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_interlace_mode_get_type()) } } } impl glib::HasParamSpec for VideoInterlaceMode { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for VideoInterlaceMode { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoInterlaceMode { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for VideoInterlaceMode { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoInterlaceMode) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoMatrixMode")] pub enum VideoMatrixMode { #[doc(alias = "GST_VIDEO_MATRIX_MODE_FULL")] Full, #[doc(alias = "GST_VIDEO_MATRIX_MODE_INPUT_ONLY")] InputOnly, #[doc(alias = "GST_VIDEO_MATRIX_MODE_OUTPUT_ONLY")] OutputOnly, #[doc(alias = "GST_VIDEO_MATRIX_MODE_NONE")] None, #[doc(hidden)] __Unknown(i32), } #[doc(hidden)] impl IntoGlib for VideoMatrixMode { type GlibType = ffi::GstVideoMatrixMode; #[inline] fn into_glib(self) -> ffi::GstVideoMatrixMode { match self { Self::Full => ffi::GST_VIDEO_MATRIX_MODE_FULL, Self::InputOnly => ffi::GST_VIDEO_MATRIX_MODE_INPUT_ONLY, Self::OutputOnly => ffi::GST_VIDEO_MATRIX_MODE_OUTPUT_ONLY, Self::None => ffi::GST_VIDEO_MATRIX_MODE_NONE, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for VideoMatrixMode { #[inline] unsafe fn from_glib(value: ffi::GstVideoMatrixMode) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_MATRIX_MODE_FULL => Self::Full, ffi::GST_VIDEO_MATRIX_MODE_INPUT_ONLY => Self::InputOnly, ffi::GST_VIDEO_MATRIX_MODE_OUTPUT_ONLY => Self::OutputOnly, ffi::GST_VIDEO_MATRIX_MODE_NONE => Self::None, value => Self::__Unknown(value), } } } impl StaticType for VideoMatrixMode { #[inline] #[doc(alias = "gst_video_matrix_mode_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_matrix_mode_get_type()) } } } impl glib::HasParamSpec for VideoMatrixMode { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for VideoMatrixMode { type Type = Self; } unsafe impl<'a> 
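// A minimal usage sketch of `VideoInterlaceMode::from_string()` / `to_str()`
// above (the `Display` impl goes through `to_str()`); "interleaved" is assumed
// to be one of the mode strings accepted by the C API, and `gst::init()` is
// assumed to have been called:
//
//     use gstreamer_video::VideoInterlaceMode;
//
//     let mode = VideoInterlaceMode::from_string("interleaved");
//     assert_eq!(mode, VideoInterlaceMode::Interleaved);
//     println!("interlace-mode: {}", mode);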
glib::value::FromValue<'a> for VideoMatrixMode { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for VideoMatrixMode { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoMatrixMode) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoMultiviewFramePacking")] pub enum VideoMultiviewFramePacking { #[doc(alias = "GST_VIDEO_MULTIVIEW_FRAME_PACKING_NONE")] None, #[doc(alias = "GST_VIDEO_MULTIVIEW_FRAME_PACKING_MONO")] Mono, #[doc(alias = "GST_VIDEO_MULTIVIEW_FRAME_PACKING_LEFT")] Left, #[doc(alias = "GST_VIDEO_MULTIVIEW_FRAME_PACKING_RIGHT")] Right, #[doc(alias = "GST_VIDEO_MULTIVIEW_FRAME_PACKING_SIDE_BY_SIDE")] SideBySide, #[doc(alias = "GST_VIDEO_MULTIVIEW_FRAME_PACKING_SIDE_BY_SIDE_QUINCUNX")] SideBySideQuincunx, #[doc(alias = "GST_VIDEO_MULTIVIEW_FRAME_PACKING_COLUMN_INTERLEAVED")] ColumnInterleaved, #[doc(alias = "GST_VIDEO_MULTIVIEW_FRAME_PACKING_ROW_INTERLEAVED")] RowInterleaved, #[doc(alias = "GST_VIDEO_MULTIVIEW_FRAME_PACKING_TOP_BOTTOM")] TopBottom, #[doc(alias = "GST_VIDEO_MULTIVIEW_FRAME_PACKING_CHECKERBOARD")] Checkerboard, #[doc(hidden)] __Unknown(i32), } #[doc(hidden)] impl IntoGlib for VideoMultiviewFramePacking { type GlibType = ffi::GstVideoMultiviewFramePacking; #[inline] fn into_glib(self) -> ffi::GstVideoMultiviewFramePacking { match self { Self::None => ffi::GST_VIDEO_MULTIVIEW_FRAME_PACKING_NONE, Self::Mono => ffi::GST_VIDEO_MULTIVIEW_FRAME_PACKING_MONO, Self::Left => ffi::GST_VIDEO_MULTIVIEW_FRAME_PACKING_LEFT, Self::Right => ffi::GST_VIDEO_MULTIVIEW_FRAME_PACKING_RIGHT, Self::SideBySide => ffi::GST_VIDEO_MULTIVIEW_FRAME_PACKING_SIDE_BY_SIDE, Self::SideBySideQuincunx => { ffi::GST_VIDEO_MULTIVIEW_FRAME_PACKING_SIDE_BY_SIDE_QUINCUNX } Self::ColumnInterleaved => ffi::GST_VIDEO_MULTIVIEW_FRAME_PACKING_COLUMN_INTERLEAVED, Self::RowInterleaved => ffi::GST_VIDEO_MULTIVIEW_FRAME_PACKING_ROW_INTERLEAVED, Self::TopBottom => ffi::GST_VIDEO_MULTIVIEW_FRAME_PACKING_TOP_BOTTOM, Self::Checkerboard => ffi::GST_VIDEO_MULTIVIEW_FRAME_PACKING_CHECKERBOARD, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for VideoMultiviewFramePacking { #[inline] unsafe fn from_glib(value: ffi::GstVideoMultiviewFramePacking) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_MULTIVIEW_FRAME_PACKING_NONE => Self::None, ffi::GST_VIDEO_MULTIVIEW_FRAME_PACKING_MONO => Self::Mono, ffi::GST_VIDEO_MULTIVIEW_FRAME_PACKING_LEFT => Self::Left, ffi::GST_VIDEO_MULTIVIEW_FRAME_PACKING_RIGHT => Self::Right, ffi::GST_VIDEO_MULTIVIEW_FRAME_PACKING_SIDE_BY_SIDE => Self::SideBySide, ffi::GST_VIDEO_MULTIVIEW_FRAME_PACKING_SIDE_BY_SIDE_QUINCUNX => { Self::SideBySideQuincunx } ffi::GST_VIDEO_MULTIVIEW_FRAME_PACKING_COLUMN_INTERLEAVED => Self::ColumnInterleaved, ffi::GST_VIDEO_MULTIVIEW_FRAME_PACKING_ROW_INTERLEAVED => Self::RowInterleaved, ffi::GST_VIDEO_MULTIVIEW_FRAME_PACKING_TOP_BOTTOM => Self::TopBottom, ffi::GST_VIDEO_MULTIVIEW_FRAME_PACKING_CHECKERBOARD => Self::Checkerboard, value => Self::__Unknown(value), } } } 
impl StaticType for VideoMultiviewFramePacking { #[inline] #[doc(alias = "gst_video_multiview_frame_packing_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_multiview_frame_packing_get_type()) } } } impl glib::HasParamSpec for VideoMultiviewFramePacking { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for VideoMultiviewFramePacking { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoMultiviewFramePacking { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for VideoMultiviewFramePacking { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoMultiviewFramePacking) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoMultiviewMode")] pub enum VideoMultiviewMode { #[doc(alias = "GST_VIDEO_MULTIVIEW_MODE_NONE")] None, #[doc(alias = "GST_VIDEO_MULTIVIEW_MODE_MONO")] Mono, #[doc(alias = "GST_VIDEO_MULTIVIEW_MODE_LEFT")] Left, #[doc(alias = "GST_VIDEO_MULTIVIEW_MODE_RIGHT")] Right, #[doc(alias = "GST_VIDEO_MULTIVIEW_MODE_SIDE_BY_SIDE")] SideBySide, #[doc(alias = "GST_VIDEO_MULTIVIEW_MODE_SIDE_BY_SIDE_QUINCUNX")] SideBySideQuincunx, #[doc(alias = "GST_VIDEO_MULTIVIEW_MODE_COLUMN_INTERLEAVED")] ColumnInterleaved, #[doc(alias = "GST_VIDEO_MULTIVIEW_MODE_ROW_INTERLEAVED")] RowInterleaved, #[doc(alias = "GST_VIDEO_MULTIVIEW_MODE_TOP_BOTTOM")] TopBottom, #[doc(alias = "GST_VIDEO_MULTIVIEW_MODE_CHECKERBOARD")] Checkerboard, #[doc(alias = "GST_VIDEO_MULTIVIEW_MODE_FRAME_BY_FRAME")] FrameByFrame, #[doc(alias = "GST_VIDEO_MULTIVIEW_MODE_MULTIVIEW_FRAME_BY_FRAME")] MultiviewFrameByFrame, #[doc(alias = "GST_VIDEO_MULTIVIEW_MODE_SEPARATED")] Separated, #[doc(hidden)] __Unknown(i32), } impl VideoMultiviewMode { #[doc(alias = "gst_video_multiview_mode_from_caps_string")] pub fn from_caps_string(caps_mview_mode: &str) -> VideoMultiviewMode { assert_initialized_main_thread!(); unsafe { from_glib(ffi::gst_video_multiview_mode_from_caps_string( caps_mview_mode.to_glib_none().0, )) } } #[doc(alias = "gst_video_multiview_mode_to_caps_string")] pub fn to_caps_string(self) -> Option { assert_initialized_main_thread!(); unsafe { from_glib_none(ffi::gst_video_multiview_mode_to_caps_string( self.into_glib(), )) } } } #[doc(hidden)] impl IntoGlib for VideoMultiviewMode { type GlibType = ffi::GstVideoMultiviewMode; fn into_glib(self) -> ffi::GstVideoMultiviewMode { match self { Self::None => ffi::GST_VIDEO_MULTIVIEW_MODE_NONE, Self::Mono => ffi::GST_VIDEO_MULTIVIEW_MODE_MONO, Self::Left => ffi::GST_VIDEO_MULTIVIEW_MODE_LEFT, Self::Right => ffi::GST_VIDEO_MULTIVIEW_MODE_RIGHT, Self::SideBySide => ffi::GST_VIDEO_MULTIVIEW_MODE_SIDE_BY_SIDE, Self::SideBySideQuincunx => ffi::GST_VIDEO_MULTIVIEW_MODE_SIDE_BY_SIDE_QUINCUNX, Self::ColumnInterleaved => ffi::GST_VIDEO_MULTIVIEW_MODE_COLUMN_INTERLEAVED, Self::RowInterleaved => 
ffi::GST_VIDEO_MULTIVIEW_MODE_ROW_INTERLEAVED, Self::TopBottom => ffi::GST_VIDEO_MULTIVIEW_MODE_TOP_BOTTOM, Self::Checkerboard => ffi::GST_VIDEO_MULTIVIEW_MODE_CHECKERBOARD, Self::FrameByFrame => ffi::GST_VIDEO_MULTIVIEW_MODE_FRAME_BY_FRAME, Self::MultiviewFrameByFrame => ffi::GST_VIDEO_MULTIVIEW_MODE_MULTIVIEW_FRAME_BY_FRAME, Self::Separated => ffi::GST_VIDEO_MULTIVIEW_MODE_SEPARATED, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for VideoMultiviewMode { unsafe fn from_glib(value: ffi::GstVideoMultiviewMode) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_MULTIVIEW_MODE_NONE => Self::None, ffi::GST_VIDEO_MULTIVIEW_MODE_MONO => Self::Mono, ffi::GST_VIDEO_MULTIVIEW_MODE_LEFT => Self::Left, ffi::GST_VIDEO_MULTIVIEW_MODE_RIGHT => Self::Right, ffi::GST_VIDEO_MULTIVIEW_MODE_SIDE_BY_SIDE => Self::SideBySide, ffi::GST_VIDEO_MULTIVIEW_MODE_SIDE_BY_SIDE_QUINCUNX => Self::SideBySideQuincunx, ffi::GST_VIDEO_MULTIVIEW_MODE_COLUMN_INTERLEAVED => Self::ColumnInterleaved, ffi::GST_VIDEO_MULTIVIEW_MODE_ROW_INTERLEAVED => Self::RowInterleaved, ffi::GST_VIDEO_MULTIVIEW_MODE_TOP_BOTTOM => Self::TopBottom, ffi::GST_VIDEO_MULTIVIEW_MODE_CHECKERBOARD => Self::Checkerboard, ffi::GST_VIDEO_MULTIVIEW_MODE_FRAME_BY_FRAME => Self::FrameByFrame, ffi::GST_VIDEO_MULTIVIEW_MODE_MULTIVIEW_FRAME_BY_FRAME => Self::MultiviewFrameByFrame, ffi::GST_VIDEO_MULTIVIEW_MODE_SEPARATED => Self::Separated, value => Self::__Unknown(value), } } } impl StaticType for VideoMultiviewMode { #[inline] #[doc(alias = "gst_video_multiview_mode_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_multiview_mode_get_type()) } } } impl glib::HasParamSpec for VideoMultiviewMode { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for VideoMultiviewMode { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoMultiviewMode { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for VideoMultiviewMode { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoMultiviewMode) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoOrientationMethod")] pub enum VideoOrientationMethod { #[doc(alias = "GST_VIDEO_ORIENTATION_IDENTITY")] Identity, #[doc(alias = "GST_VIDEO_ORIENTATION_90R")] _90r, #[doc(alias = "GST_VIDEO_ORIENTATION_180")] _180, #[doc(alias = "GST_VIDEO_ORIENTATION_90L")] _90l, #[doc(alias = "GST_VIDEO_ORIENTATION_HORIZ")] Horiz, #[doc(alias = "GST_VIDEO_ORIENTATION_VERT")] Vert, #[doc(alias = "GST_VIDEO_ORIENTATION_UL_LR")] UlLr, #[doc(alias = "GST_VIDEO_ORIENTATION_UR_LL")] UrLl, #[doc(alias = "GST_VIDEO_ORIENTATION_AUTO")] Auto, #[doc(alias = "GST_VIDEO_ORIENTATION_CUSTOM")] Custom, #[doc(hidden)] __Unknown(i32), } #[doc(hidden)] impl IntoGlib for VideoOrientationMethod { type GlibType = ffi::GstVideoOrientationMethod; 
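// A minimal usage sketch of the `VideoMultiviewMode` caps-string helpers above;
// "side-by-side" is assumed to be the caps string for `SideBySide`, and
// `to_caps_string()` returns `None` for modes without a caps representation:
//
//     use gstreamer_video::VideoMultiviewMode;
//
//     let mode = VideoMultiviewMode::from_caps_string("side-by-side");
//     assert_eq!(mode, VideoMultiviewMode::SideBySide);
//     assert_eq!(mode.to_caps_string().as_deref(), Some("side-by-side"));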
#[inline] fn into_glib(self) -> ffi::GstVideoOrientationMethod { match self { Self::Identity => ffi::GST_VIDEO_ORIENTATION_IDENTITY, Self::_90r => ffi::GST_VIDEO_ORIENTATION_90R, Self::_180 => ffi::GST_VIDEO_ORIENTATION_180, Self::_90l => ffi::GST_VIDEO_ORIENTATION_90L, Self::Horiz => ffi::GST_VIDEO_ORIENTATION_HORIZ, Self::Vert => ffi::GST_VIDEO_ORIENTATION_VERT, Self::UlLr => ffi::GST_VIDEO_ORIENTATION_UL_LR, Self::UrLl => ffi::GST_VIDEO_ORIENTATION_UR_LL, Self::Auto => ffi::GST_VIDEO_ORIENTATION_AUTO, Self::Custom => ffi::GST_VIDEO_ORIENTATION_CUSTOM, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for VideoOrientationMethod { #[inline] unsafe fn from_glib(value: ffi::GstVideoOrientationMethod) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_ORIENTATION_IDENTITY => Self::Identity, ffi::GST_VIDEO_ORIENTATION_90R => Self::_90r, ffi::GST_VIDEO_ORIENTATION_180 => Self::_180, ffi::GST_VIDEO_ORIENTATION_90L => Self::_90l, ffi::GST_VIDEO_ORIENTATION_HORIZ => Self::Horiz, ffi::GST_VIDEO_ORIENTATION_VERT => Self::Vert, ffi::GST_VIDEO_ORIENTATION_UL_LR => Self::UlLr, ffi::GST_VIDEO_ORIENTATION_UR_LL => Self::UrLl, ffi::GST_VIDEO_ORIENTATION_AUTO => Self::Auto, ffi::GST_VIDEO_ORIENTATION_CUSTOM => Self::Custom, value => Self::__Unknown(value), } } } impl StaticType for VideoOrientationMethod { #[inline] #[doc(alias = "gst_video_orientation_method_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_orientation_method_get_type()) } } } impl glib::HasParamSpec for VideoOrientationMethod { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for VideoOrientationMethod { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoOrientationMethod { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for VideoOrientationMethod { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoOrientationMethod) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoPrimariesMode")] pub enum VideoPrimariesMode { #[doc(alias = "GST_VIDEO_PRIMARIES_MODE_NONE")] None, #[doc(alias = "GST_VIDEO_PRIMARIES_MODE_MERGE_ONLY")] MergeOnly, #[doc(alias = "GST_VIDEO_PRIMARIES_MODE_FAST")] Fast, #[doc(hidden)] __Unknown(i32), } #[doc(hidden)] impl IntoGlib for VideoPrimariesMode { type GlibType = ffi::GstVideoPrimariesMode; #[inline] fn into_glib(self) -> ffi::GstVideoPrimariesMode { match self { Self::None => ffi::GST_VIDEO_PRIMARIES_MODE_NONE, Self::MergeOnly => ffi::GST_VIDEO_PRIMARIES_MODE_MERGE_ONLY, Self::Fast => ffi::GST_VIDEO_PRIMARIES_MODE_FAST, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for VideoPrimariesMode { #[inline] unsafe fn from_glib(value: ffi::GstVideoPrimariesMode) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_PRIMARIES_MODE_NONE => Self::None, 
ffi::GST_VIDEO_PRIMARIES_MODE_MERGE_ONLY => Self::MergeOnly, ffi::GST_VIDEO_PRIMARIES_MODE_FAST => Self::Fast, value => Self::__Unknown(value), } } } impl StaticType for VideoPrimariesMode { #[inline] #[doc(alias = "gst_video_primaries_mode_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_primaries_mode_get_type()) } } } impl glib::HasParamSpec for VideoPrimariesMode { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for VideoPrimariesMode { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoPrimariesMode { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for VideoPrimariesMode { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoPrimariesMode) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoResamplerMethod")] pub enum VideoResamplerMethod { #[doc(alias = "GST_VIDEO_RESAMPLER_METHOD_NEAREST")] Nearest, #[doc(alias = "GST_VIDEO_RESAMPLER_METHOD_LINEAR")] Linear, #[doc(alias = "GST_VIDEO_RESAMPLER_METHOD_CUBIC")] Cubic, #[doc(alias = "GST_VIDEO_RESAMPLER_METHOD_SINC")] Sinc, #[doc(alias = "GST_VIDEO_RESAMPLER_METHOD_LANCZOS")] Lanczos, #[doc(hidden)] __Unknown(i32), } #[doc(hidden)] impl IntoGlib for VideoResamplerMethod { type GlibType = ffi::GstVideoResamplerMethod; #[inline] fn into_glib(self) -> ffi::GstVideoResamplerMethod { match self { Self::Nearest => ffi::GST_VIDEO_RESAMPLER_METHOD_NEAREST, Self::Linear => ffi::GST_VIDEO_RESAMPLER_METHOD_LINEAR, Self::Cubic => ffi::GST_VIDEO_RESAMPLER_METHOD_CUBIC, Self::Sinc => ffi::GST_VIDEO_RESAMPLER_METHOD_SINC, Self::Lanczos => ffi::GST_VIDEO_RESAMPLER_METHOD_LANCZOS, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for VideoResamplerMethod { #[inline] unsafe fn from_glib(value: ffi::GstVideoResamplerMethod) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_RESAMPLER_METHOD_NEAREST => Self::Nearest, ffi::GST_VIDEO_RESAMPLER_METHOD_LINEAR => Self::Linear, ffi::GST_VIDEO_RESAMPLER_METHOD_CUBIC => Self::Cubic, ffi::GST_VIDEO_RESAMPLER_METHOD_SINC => Self::Sinc, ffi::GST_VIDEO_RESAMPLER_METHOD_LANCZOS => Self::Lanczos, value => Self::__Unknown(value), } } } impl StaticType for VideoResamplerMethod { #[inline] #[doc(alias = "gst_video_resampler_method_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_resampler_method_get_type()) } } } impl glib::HasParamSpec for VideoResamplerMethod { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for VideoResamplerMethod { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoResamplerMethod { type Checker = 
glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for VideoResamplerMethod { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoResamplerMethod) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoTileMode")] pub enum VideoTileMode { #[doc(alias = "GST_VIDEO_TILE_MODE_UNKNOWN")] Unknown, #[doc(alias = "GST_VIDEO_TILE_MODE_ZFLIPZ_2X2")] Zflipz2x2, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "GST_VIDEO_TILE_MODE_LINEAR")] Linear, #[doc(hidden)] __Unknown(i32), } #[doc(hidden)] impl IntoGlib for VideoTileMode { type GlibType = ffi::GstVideoTileMode; #[inline] fn into_glib(self) -> ffi::GstVideoTileMode { match self { Self::Unknown => ffi::GST_VIDEO_TILE_MODE_UNKNOWN, Self::Zflipz2x2 => ffi::GST_VIDEO_TILE_MODE_ZFLIPZ_2X2, #[cfg(feature = "v1_18")] Self::Linear => ffi::GST_VIDEO_TILE_MODE_LINEAR, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for VideoTileMode { #[inline] unsafe fn from_glib(value: ffi::GstVideoTileMode) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_TILE_MODE_UNKNOWN => Self::Unknown, ffi::GST_VIDEO_TILE_MODE_ZFLIPZ_2X2 => Self::Zflipz2x2, #[cfg(feature = "v1_18")] ffi::GST_VIDEO_TILE_MODE_LINEAR => Self::Linear, value => Self::__Unknown(value), } } } impl StaticType for VideoTileMode { #[inline] #[doc(alias = "gst_video_tile_mode_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_tile_mode_get_type()) } } } impl glib::HasParamSpec for VideoTileMode { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for VideoTileMode { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoTileMode { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for VideoTileMode { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoTileMode) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoTransferFunction")] pub enum VideoTransferFunction { #[doc(alias = "GST_VIDEO_TRANSFER_UNKNOWN")] Unknown, #[doc(alias = "GST_VIDEO_TRANSFER_GAMMA10")] Gamma10, #[doc(alias = "GST_VIDEO_TRANSFER_GAMMA18")] Gamma18, #[doc(alias = "GST_VIDEO_TRANSFER_GAMMA20")] Gamma20, #[doc(alias = "GST_VIDEO_TRANSFER_GAMMA22")] Gamma22, #[doc(alias = "GST_VIDEO_TRANSFER_BT709")] Bt709, #[doc(alias = 
"GST_VIDEO_TRANSFER_SMPTE240M")] Smpte240m, #[doc(alias = "GST_VIDEO_TRANSFER_SRGB")] Srgb, #[doc(alias = "GST_VIDEO_TRANSFER_GAMMA28")] Gamma28, #[doc(alias = "GST_VIDEO_TRANSFER_LOG100")] Log100, #[doc(alias = "GST_VIDEO_TRANSFER_LOG316")] Log316, #[doc(alias = "GST_VIDEO_TRANSFER_BT2020_12")] Bt202012, #[doc(alias = "GST_VIDEO_TRANSFER_ADOBERGB")] Adobergb, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "GST_VIDEO_TRANSFER_BT2020_10")] Bt202010, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "GST_VIDEO_TRANSFER_SMPTE2084")] Smpte2084, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "GST_VIDEO_TRANSFER_ARIB_STD_B67")] AribStdB67, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "GST_VIDEO_TRANSFER_BT601")] Bt601, #[doc(hidden)] __Unknown(i32), } impl VideoTransferFunction { #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "gst_video_transfer_function_decode")] pub fn decode(self, val: f64) -> f64 { assert_initialized_main_thread!(); unsafe { ffi::gst_video_transfer_function_decode(self.into_glib(), val) } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "gst_video_transfer_function_encode")] pub fn encode(self, val: f64) -> f64 { assert_initialized_main_thread!(); unsafe { ffi::gst_video_transfer_function_encode(self.into_glib(), val) } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "gst_video_transfer_function_from_iso")] pub fn from_iso(value: u32) -> VideoTransferFunction { assert_initialized_main_thread!(); unsafe { from_glib(ffi::gst_video_transfer_function_from_iso(value)) } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "gst_video_transfer_function_is_equivalent")] pub fn is_equivalent(self, from_bpp: u32, to_func: VideoTransferFunction, to_bpp: u32) -> bool { assert_initialized_main_thread!(); unsafe { from_glib(ffi::gst_video_transfer_function_is_equivalent( self.into_glib(), from_bpp, to_func.into_glib(), to_bpp, )) } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "gst_video_transfer_function_to_iso")] pub fn to_iso(self) -> u32 { assert_initialized_main_thread!(); unsafe { ffi::gst_video_transfer_function_to_iso(self.into_glib()) } } } #[doc(hidden)] impl IntoGlib for VideoTransferFunction { type GlibType = ffi::GstVideoTransferFunction; fn into_glib(self) -> ffi::GstVideoTransferFunction { match self { Self::Unknown => ffi::GST_VIDEO_TRANSFER_UNKNOWN, Self::Gamma10 => ffi::GST_VIDEO_TRANSFER_GAMMA10, Self::Gamma18 => ffi::GST_VIDEO_TRANSFER_GAMMA18, Self::Gamma20 => ffi::GST_VIDEO_TRANSFER_GAMMA20, Self::Gamma22 => ffi::GST_VIDEO_TRANSFER_GAMMA22, Self::Bt709 => ffi::GST_VIDEO_TRANSFER_BT709, Self::Smpte240m => ffi::GST_VIDEO_TRANSFER_SMPTE240M, Self::Srgb => ffi::GST_VIDEO_TRANSFER_SRGB, Self::Gamma28 => ffi::GST_VIDEO_TRANSFER_GAMMA28, Self::Log100 => ffi::GST_VIDEO_TRANSFER_LOG100, Self::Log316 => ffi::GST_VIDEO_TRANSFER_LOG316, Self::Bt202012 => ffi::GST_VIDEO_TRANSFER_BT2020_12, Self::Adobergb => ffi::GST_VIDEO_TRANSFER_ADOBERGB, #[cfg(feature = "v1_18")] Self::Bt202010 => ffi::GST_VIDEO_TRANSFER_BT2020_10, #[cfg(feature = "v1_18")] Self::Smpte2084 => ffi::GST_VIDEO_TRANSFER_SMPTE2084, #[cfg(feature = "v1_18")] Self::AribStdB67 => ffi::GST_VIDEO_TRANSFER_ARIB_STD_B67, #[cfg(feature = "v1_18")] 
Self::Bt601 => ffi::GST_VIDEO_TRANSFER_BT601, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for VideoTransferFunction { unsafe fn from_glib(value: ffi::GstVideoTransferFunction) -> Self { skip_assert_initialized!(); match value { ffi::GST_VIDEO_TRANSFER_UNKNOWN => Self::Unknown, ffi::GST_VIDEO_TRANSFER_GAMMA10 => Self::Gamma10, ffi::GST_VIDEO_TRANSFER_GAMMA18 => Self::Gamma18, ffi::GST_VIDEO_TRANSFER_GAMMA20 => Self::Gamma20, ffi::GST_VIDEO_TRANSFER_GAMMA22 => Self::Gamma22, ffi::GST_VIDEO_TRANSFER_BT709 => Self::Bt709, ffi::GST_VIDEO_TRANSFER_SMPTE240M => Self::Smpte240m, ffi::GST_VIDEO_TRANSFER_SRGB => Self::Srgb, ffi::GST_VIDEO_TRANSFER_GAMMA28 => Self::Gamma28, ffi::GST_VIDEO_TRANSFER_LOG100 => Self::Log100, ffi::GST_VIDEO_TRANSFER_LOG316 => Self::Log316, ffi::GST_VIDEO_TRANSFER_BT2020_12 => Self::Bt202012, ffi::GST_VIDEO_TRANSFER_ADOBERGB => Self::Adobergb, #[cfg(feature = "v1_18")] ffi::GST_VIDEO_TRANSFER_BT2020_10 => Self::Bt202010, #[cfg(feature = "v1_18")] ffi::GST_VIDEO_TRANSFER_SMPTE2084 => Self::Smpte2084, #[cfg(feature = "v1_18")] ffi::GST_VIDEO_TRANSFER_ARIB_STD_B67 => Self::AribStdB67, #[cfg(feature = "v1_18")] ffi::GST_VIDEO_TRANSFER_BT601 => Self::Bt601, value => Self::__Unknown(value), } } } impl StaticType for VideoTransferFunction { #[inline] #[doc(alias = "gst_video_transfer_function_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_transfer_function_get_type()) } } } impl glib::HasParamSpec for VideoTransferFunction { type ParamSpec = glib::ParamSpecEnum; type SetValue = Self; type BuilderFn = fn(&str, Self) -> glib::ParamSpecEnumBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder_with_default } } impl glib::value::ValueType for VideoTransferFunction { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoTransferFunction { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for VideoTransferFunction { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoTransferFunction) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } gstreamer-video-0.23.5/src/auto/flags.rs000064400000000000000000001142501046102023000162140ustar 00000000000000// This file was generated by gir (https://github.com/gtk-rs/gir) // from gir-files (https://github.com/gtk-rs/gir-files) // from gst-gir-files (https://gitlab.freedesktop.org/gstreamer/gir-files-rs.git) // DO NOT EDIT use crate::ffi; use glib::{bitflags::bitflags, prelude::*, translate::*}; #[cfg(feature = "v1_22")] bitflags! 
{ #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)] #[doc(alias = "GstNavigationModifierType")] pub struct NavigationModifierType: u32 { #[doc(alias = "GST_NAVIGATION_MODIFIER_SHIFT_MASK")] const SHIFT_MASK = ffi::GST_NAVIGATION_MODIFIER_SHIFT_MASK as _; #[doc(alias = "GST_NAVIGATION_MODIFIER_LOCK_MASK")] const LOCK_MASK = ffi::GST_NAVIGATION_MODIFIER_LOCK_MASK as _; #[doc(alias = "GST_NAVIGATION_MODIFIER_CONTROL_MASK")] const CONTROL_MASK = ffi::GST_NAVIGATION_MODIFIER_CONTROL_MASK as _; #[doc(alias = "GST_NAVIGATION_MODIFIER_MOD1_MASK")] const MOD1_MASK = ffi::GST_NAVIGATION_MODIFIER_MOD1_MASK as _; #[doc(alias = "GST_NAVIGATION_MODIFIER_MOD2_MASK")] const MOD2_MASK = ffi::GST_NAVIGATION_MODIFIER_MOD2_MASK as _; #[doc(alias = "GST_NAVIGATION_MODIFIER_MOD3_MASK")] const MOD3_MASK = ffi::GST_NAVIGATION_MODIFIER_MOD3_MASK as _; #[doc(alias = "GST_NAVIGATION_MODIFIER_MOD4_MASK")] const MOD4_MASK = ffi::GST_NAVIGATION_MODIFIER_MOD4_MASK as _; #[doc(alias = "GST_NAVIGATION_MODIFIER_MOD5_MASK")] const MOD5_MASK = ffi::GST_NAVIGATION_MODIFIER_MOD5_MASK as _; #[doc(alias = "GST_NAVIGATION_MODIFIER_BUTTON1_MASK")] const BUTTON1_MASK = ffi::GST_NAVIGATION_MODIFIER_BUTTON1_MASK as _; #[doc(alias = "GST_NAVIGATION_MODIFIER_BUTTON2_MASK")] const BUTTON2_MASK = ffi::GST_NAVIGATION_MODIFIER_BUTTON2_MASK as _; #[doc(alias = "GST_NAVIGATION_MODIFIER_BUTTON3_MASK")] const BUTTON3_MASK = ffi::GST_NAVIGATION_MODIFIER_BUTTON3_MASK as _; #[doc(alias = "GST_NAVIGATION_MODIFIER_BUTTON4_MASK")] const BUTTON4_MASK = ffi::GST_NAVIGATION_MODIFIER_BUTTON4_MASK as _; #[doc(alias = "GST_NAVIGATION_MODIFIER_BUTTON5_MASK")] const BUTTON5_MASK = ffi::GST_NAVIGATION_MODIFIER_BUTTON5_MASK as _; #[doc(alias = "GST_NAVIGATION_MODIFIER_SUPER_MASK")] const SUPER_MASK = ffi::GST_NAVIGATION_MODIFIER_SUPER_MASK as _; #[doc(alias = "GST_NAVIGATION_MODIFIER_HYPER_MASK")] const HYPER_MASK = ffi::GST_NAVIGATION_MODIFIER_HYPER_MASK as _; #[doc(alias = "GST_NAVIGATION_MODIFIER_META_MASK")] const META_MASK = ffi::GST_NAVIGATION_MODIFIER_META_MASK as _; } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[doc(hidden)] impl IntoGlib for NavigationModifierType { type GlibType = ffi::GstNavigationModifierType; #[inline] fn into_glib(self) -> ffi::GstNavigationModifierType { self.bits() } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[doc(hidden)] impl FromGlib for NavigationModifierType { #[inline] unsafe fn from_glib(value: ffi::GstNavigationModifierType) -> Self { skip_assert_initialized!(); Self::from_bits_truncate(value) } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] impl StaticType for NavigationModifierType { #[inline] #[doc(alias = "gst_navigation_modifier_type_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_navigation_modifier_type_get_type()) } } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] impl glib::HasParamSpec for NavigationModifierType { type ParamSpec = glib::ParamSpecFlags; type SetValue = Self; type BuilderFn = fn(&str) -> glib::ParamSpecFlagsBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] impl glib::value::ValueType for NavigationModifierType { type Type = Self; } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] unsafe impl<'a> glib::value::FromValue<'a> for NavigationModifierType { type Checker 
= glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_flags(value.to_glib_none().0)) } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] impl ToValue for NavigationModifierType { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_flags(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] impl From for glib::Value { #[inline] fn from(v: NavigationModifierType) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } bitflags! { #[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)] #[doc(alias = "GstVideoBufferFlags")] pub struct VideoBufferFlags: u32 { #[doc(alias = "GST_VIDEO_BUFFER_FLAG_INTERLACED")] const INTERLACED = ffi::GST_VIDEO_BUFFER_FLAG_INTERLACED as _; #[doc(alias = "GST_VIDEO_BUFFER_FLAG_TFF")] const TFF = ffi::GST_VIDEO_BUFFER_FLAG_TFF as _; #[doc(alias = "GST_VIDEO_BUFFER_FLAG_RFF")] const RFF = ffi::GST_VIDEO_BUFFER_FLAG_RFF as _; #[doc(alias = "GST_VIDEO_BUFFER_FLAG_ONEFIELD")] const ONEFIELD = ffi::GST_VIDEO_BUFFER_FLAG_ONEFIELD as _; #[doc(alias = "GST_VIDEO_BUFFER_FLAG_MULTIPLE_VIEW")] const MULTIPLE_VIEW = ffi::GST_VIDEO_BUFFER_FLAG_MULTIPLE_VIEW as _; #[doc(alias = "GST_VIDEO_BUFFER_FLAG_FIRST_IN_BUNDLE")] const FIRST_IN_BUNDLE = ffi::GST_VIDEO_BUFFER_FLAG_FIRST_IN_BUNDLE as _; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(alias = "GST_VIDEO_BUFFER_FLAG_TOP_FIELD")] const TOP_FIELD = ffi::GST_VIDEO_BUFFER_FLAG_TOP_FIELD as _; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(alias = "GST_VIDEO_BUFFER_FLAG_BOTTOM_FIELD")] const BOTTOM_FIELD = ffi::GST_VIDEO_BUFFER_FLAG_BOTTOM_FIELD as _; #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "GST_VIDEO_BUFFER_FLAG_MARKER")] const MARKER = ffi::GST_VIDEO_BUFFER_FLAG_MARKER as _; } } #[doc(hidden)] impl IntoGlib for VideoBufferFlags { type GlibType = ffi::GstVideoBufferFlags; #[inline] fn into_glib(self) -> ffi::GstVideoBufferFlags { self.bits() } } #[doc(hidden)] impl FromGlib for VideoBufferFlags { #[inline] unsafe fn from_glib(value: ffi::GstVideoBufferFlags) -> Self { skip_assert_initialized!(); Self::from_bits_truncate(value) } } impl StaticType for VideoBufferFlags { #[inline] #[doc(alias = "gst_video_buffer_flags_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_buffer_flags_get_type()) } } } impl glib::HasParamSpec for VideoBufferFlags { type ParamSpec = glib::ParamSpecFlags; type SetValue = Self; type BuilderFn = fn(&str) -> glib::ParamSpecFlagsBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder } } impl glib::value::ValueType for VideoBufferFlags { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoBufferFlags { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_flags(value.to_glib_none().0)) } } impl ToValue for VideoBufferFlags { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_flags(value.to_glib_none_mut().0, self.into_glib()); } value } 
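// A minimal usage sketch: the flags types in this file are `bitflags` sets, so
// the usual set operations apply, e.g. for `VideoBufferFlags` marking an
// interlaced, top-field-first buffer:
//
//     use gstreamer_video::VideoBufferFlags;
//
//     let flags = VideoBufferFlags::INTERLACED | VideoBufferFlags::TFF;
//     assert!(flags.contains(VideoBufferFlags::INTERLACED));
//     assert!(!flags.contains(VideoBufferFlags::ONEFIELD));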
#[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoBufferFlags) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } bitflags! { #[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)] #[doc(alias = "GstVideoChromaSite")] pub struct VideoChromaSite: u32 { #[doc(alias = "GST_VIDEO_CHROMA_SITE_NONE")] const NONE = ffi::GST_VIDEO_CHROMA_SITE_NONE as _; #[doc(alias = "GST_VIDEO_CHROMA_SITE_H_COSITED")] const H_COSITED = ffi::GST_VIDEO_CHROMA_SITE_H_COSITED as _; #[doc(alias = "GST_VIDEO_CHROMA_SITE_V_COSITED")] const V_COSITED = ffi::GST_VIDEO_CHROMA_SITE_V_COSITED as _; #[doc(alias = "GST_VIDEO_CHROMA_SITE_ALT_LINE")] const ALT_LINE = ffi::GST_VIDEO_CHROMA_SITE_ALT_LINE as _; #[doc(alias = "GST_VIDEO_CHROMA_SITE_COSITED")] const COSITED = ffi::GST_VIDEO_CHROMA_SITE_COSITED as _; #[doc(alias = "GST_VIDEO_CHROMA_SITE_JPEG")] const JPEG = ffi::GST_VIDEO_CHROMA_SITE_JPEG as _; #[doc(alias = "GST_VIDEO_CHROMA_SITE_MPEG2")] const MPEG2 = ffi::GST_VIDEO_CHROMA_SITE_MPEG2 as _; #[doc(alias = "GST_VIDEO_CHROMA_SITE_DV")] const DV = ffi::GST_VIDEO_CHROMA_SITE_DV as _; } } impl VideoChromaSite { #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "gst_video_chroma_site_from_string")] pub fn from_string(s: &str) -> VideoChromaSite { assert_initialized_main_thread!(); unsafe { from_glib(ffi::gst_video_chroma_site_from_string(s.to_glib_none().0)) } } } impl std::fmt::Display for VideoChromaSite { #[inline] fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { f.write_str(&self.to_str()) } } #[doc(hidden)] impl IntoGlib for VideoChromaSite { type GlibType = ffi::GstVideoChromaSite; #[inline] fn into_glib(self) -> ffi::GstVideoChromaSite { self.bits() } } #[doc(hidden)] impl FromGlib for VideoChromaSite { #[inline] unsafe fn from_glib(value: ffi::GstVideoChromaSite) -> Self { skip_assert_initialized!(); Self::from_bits_truncate(value) } } impl StaticType for VideoChromaSite { #[inline] #[doc(alias = "gst_video_chroma_site_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_chroma_site_get_type()) } } } impl glib::HasParamSpec for VideoChromaSite { type ParamSpec = glib::ParamSpecFlags; type SetValue = Self; type BuilderFn = fn(&str) -> glib::ParamSpecFlagsBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder } } impl glib::value::ValueType for VideoChromaSite { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoChromaSite { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_flags(value.to_glib_none().0)) } } impl ToValue for VideoChromaSite { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_flags(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoChromaSite) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } bitflags! 
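// A minimal usage sketch of `VideoChromaSite` above: `from_string()` (behind the
// "v1_20" feature) parses the caps form and `Display` renders it; "jpeg" is
// assumed to be a recognized chroma-site string:
//
//     use gstreamer_video::VideoChromaSite;
//
//     let site = VideoChromaSite::from_string("jpeg");
//     assert_eq!(site, VideoChromaSite::JPEG);
//     println!("chroma-site: {}", site);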
{ #[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)] #[doc(alias = "GstVideoCodecFrameFlags")] pub struct VideoCodecFrameFlags: u32 { #[doc(alias = "GST_VIDEO_CODEC_FRAME_FLAG_DECODE_ONLY")] const DECODE_ONLY = ffi::GST_VIDEO_CODEC_FRAME_FLAG_DECODE_ONLY as _; #[doc(alias = "GST_VIDEO_CODEC_FRAME_FLAG_SYNC_POINT")] const SYNC_POINT = ffi::GST_VIDEO_CODEC_FRAME_FLAG_SYNC_POINT as _; #[doc(alias = "GST_VIDEO_CODEC_FRAME_FLAG_FORCE_KEYFRAME")] const FORCE_KEYFRAME = ffi::GST_VIDEO_CODEC_FRAME_FLAG_FORCE_KEYFRAME as _; #[doc(alias = "GST_VIDEO_CODEC_FRAME_FLAG_FORCE_KEYFRAME_HEADERS")] const FORCE_KEYFRAME_HEADERS = ffi::GST_VIDEO_CODEC_FRAME_FLAG_FORCE_KEYFRAME_HEADERS as _; #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "GST_VIDEO_CODEC_FRAME_FLAG_CORRUPTED")] const CORRUPTED = ffi::GST_VIDEO_CODEC_FRAME_FLAG_CORRUPTED as _; } } #[doc(hidden)] impl IntoGlib for VideoCodecFrameFlags { type GlibType = ffi::GstVideoCodecFrameFlags; #[inline] fn into_glib(self) -> ffi::GstVideoCodecFrameFlags { self.bits() } } #[doc(hidden)] impl FromGlib for VideoCodecFrameFlags { #[inline] unsafe fn from_glib(value: ffi::GstVideoCodecFrameFlags) -> Self { skip_assert_initialized!(); Self::from_bits_truncate(value) } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] impl StaticType for VideoCodecFrameFlags { #[inline] #[doc(alias = "gst_video_codec_frame_flags_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_codec_frame_flags_get_type()) } } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] impl glib::HasParamSpec for VideoCodecFrameFlags { type ParamSpec = glib::ParamSpecFlags; type SetValue = Self; type BuilderFn = fn(&str) -> glib::ParamSpecFlagsBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] impl glib::value::ValueType for VideoCodecFrameFlags { type Type = Self; } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] unsafe impl<'a> glib::value::FromValue<'a> for VideoCodecFrameFlags { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_flags(value.to_glib_none().0)) } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] impl ToValue for VideoCodecFrameFlags { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_flags(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] impl From for glib::Value { #[inline] fn from(v: VideoCodecFrameFlags) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } #[cfg(feature = "v1_20")] bitflags! 
{ #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)] #[doc(alias = "GstVideoDecoderRequestSyncPointFlags")] pub struct VideoDecoderRequestSyncPointFlags: u32 { #[doc(alias = "GST_VIDEO_DECODER_REQUEST_SYNC_POINT_DISCARD_INPUT")] const DISCARD_INPUT = ffi::GST_VIDEO_DECODER_REQUEST_SYNC_POINT_DISCARD_INPUT as _; #[doc(alias = "GST_VIDEO_DECODER_REQUEST_SYNC_POINT_CORRUPT_OUTPUT")] const CORRUPT_OUTPUT = ffi::GST_VIDEO_DECODER_REQUEST_SYNC_POINT_CORRUPT_OUTPUT as _; } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(hidden)] impl IntoGlib for VideoDecoderRequestSyncPointFlags { type GlibType = ffi::GstVideoDecoderRequestSyncPointFlags; #[inline] fn into_glib(self) -> ffi::GstVideoDecoderRequestSyncPointFlags { self.bits() } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(hidden)] impl FromGlib for VideoDecoderRequestSyncPointFlags { #[inline] unsafe fn from_glib(value: ffi::GstVideoDecoderRequestSyncPointFlags) -> Self { skip_assert_initialized!(); Self::from_bits_truncate(value) } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] impl StaticType for VideoDecoderRequestSyncPointFlags { #[inline] #[doc(alias = "gst_video_decoder_request_sync_point_flags_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_decoder_request_sync_point_flags_get_type()) } } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] impl glib::HasParamSpec for VideoDecoderRequestSyncPointFlags { type ParamSpec = glib::ParamSpecFlags; type SetValue = Self; type BuilderFn = fn(&str) -> glib::ParamSpecFlagsBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] impl glib::value::ValueType for VideoDecoderRequestSyncPointFlags { type Type = Self; } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] unsafe impl<'a> glib::value::FromValue<'a> for VideoDecoderRequestSyncPointFlags { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_flags(value.to_glib_none().0)) } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] impl ToValue for VideoDecoderRequestSyncPointFlags { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_flags(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] impl From for glib::Value { #[inline] fn from(v: VideoDecoderRequestSyncPointFlags) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } bitflags! 
{ #[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)] #[doc(alias = "GstVideoFlags")] pub struct VideoFlags: u32 { #[doc(alias = "GST_VIDEO_FLAG_VARIABLE_FPS")] const VARIABLE_FPS = ffi::GST_VIDEO_FLAG_VARIABLE_FPS as _; #[doc(alias = "GST_VIDEO_FLAG_PREMULTIPLIED_ALPHA")] const PREMULTIPLIED_ALPHA = ffi::GST_VIDEO_FLAG_PREMULTIPLIED_ALPHA as _; } } #[doc(hidden)] impl IntoGlib for VideoFlags { type GlibType = ffi::GstVideoFlags; #[inline] fn into_glib(self) -> ffi::GstVideoFlags { self.bits() } } #[doc(hidden)] impl FromGlib for VideoFlags { #[inline] unsafe fn from_glib(value: ffi::GstVideoFlags) -> Self { skip_assert_initialized!(); Self::from_bits_truncate(value) } } impl StaticType for VideoFlags { #[inline] #[doc(alias = "gst_video_flags_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_flags_get_type()) } } } impl glib::HasParamSpec for VideoFlags { type ParamSpec = glib::ParamSpecFlags; type SetValue = Self; type BuilderFn = fn(&str) -> glib::ParamSpecFlagsBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder } } impl glib::value::ValueType for VideoFlags { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoFlags { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_flags(value.to_glib_none().0)) } } impl ToValue for VideoFlags { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_flags(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoFlags) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } bitflags! 
{ #[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)] #[doc(alias = "GstVideoFormatFlags")] pub struct VideoFormatFlags: u32 { #[doc(alias = "GST_VIDEO_FORMAT_FLAG_YUV")] const YUV = ffi::GST_VIDEO_FORMAT_FLAG_YUV as _; #[doc(alias = "GST_VIDEO_FORMAT_FLAG_RGB")] const RGB = ffi::GST_VIDEO_FORMAT_FLAG_RGB as _; #[doc(alias = "GST_VIDEO_FORMAT_FLAG_GRAY")] const GRAY = ffi::GST_VIDEO_FORMAT_FLAG_GRAY as _; #[doc(alias = "GST_VIDEO_FORMAT_FLAG_ALPHA")] const ALPHA = ffi::GST_VIDEO_FORMAT_FLAG_ALPHA as _; #[doc(alias = "GST_VIDEO_FORMAT_FLAG_LE")] const LE = ffi::GST_VIDEO_FORMAT_FLAG_LE as _; #[doc(alias = "GST_VIDEO_FORMAT_FLAG_PALETTE")] const PALETTE = ffi::GST_VIDEO_FORMAT_FLAG_PALETTE as _; #[doc(alias = "GST_VIDEO_FORMAT_FLAG_COMPLEX")] const COMPLEX = ffi::GST_VIDEO_FORMAT_FLAG_COMPLEX as _; #[doc(alias = "GST_VIDEO_FORMAT_FLAG_UNPACK")] const UNPACK = ffi::GST_VIDEO_FORMAT_FLAG_UNPACK as _; #[doc(alias = "GST_VIDEO_FORMAT_FLAG_TILED")] const TILED = ffi::GST_VIDEO_FORMAT_FLAG_TILED as _; #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[doc(alias = "GST_VIDEO_FORMAT_FLAG_SUBTILES")] const SUBTILES = ffi::GST_VIDEO_FORMAT_FLAG_SUBTILES as _; } } #[doc(hidden)] impl IntoGlib for VideoFormatFlags { type GlibType = ffi::GstVideoFormatFlags; #[inline] fn into_glib(self) -> ffi::GstVideoFormatFlags { self.bits() } } #[doc(hidden)] impl FromGlib for VideoFormatFlags { #[inline] unsafe fn from_glib(value: ffi::GstVideoFormatFlags) -> Self { skip_assert_initialized!(); Self::from_bits_truncate(value) } } impl StaticType for VideoFormatFlags { #[inline] #[doc(alias = "gst_video_format_flags_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_format_flags_get_type()) } } } impl glib::HasParamSpec for VideoFormatFlags { type ParamSpec = glib::ParamSpecFlags; type SetValue = Self; type BuilderFn = fn(&str) -> glib::ParamSpecFlagsBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder } } impl glib::value::ValueType for VideoFormatFlags { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoFormatFlags { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_flags(value.to_glib_none().0)) } } impl ToValue for VideoFormatFlags { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_flags(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoFormatFlags) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } bitflags! 
{ #[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)] #[doc(alias = "GstVideoFrameFlags")] pub struct VideoFrameFlags: u32 { #[doc(alias = "GST_VIDEO_FRAME_FLAG_INTERLACED")] const INTERLACED = ffi::GST_VIDEO_FRAME_FLAG_INTERLACED as _; #[doc(alias = "GST_VIDEO_FRAME_FLAG_TFF")] const TFF = ffi::GST_VIDEO_FRAME_FLAG_TFF as _; #[doc(alias = "GST_VIDEO_FRAME_FLAG_RFF")] const RFF = ffi::GST_VIDEO_FRAME_FLAG_RFF as _; #[doc(alias = "GST_VIDEO_FRAME_FLAG_ONEFIELD")] const ONEFIELD = ffi::GST_VIDEO_FRAME_FLAG_ONEFIELD as _; #[doc(alias = "GST_VIDEO_FRAME_FLAG_MULTIPLE_VIEW")] const MULTIPLE_VIEW = ffi::GST_VIDEO_FRAME_FLAG_MULTIPLE_VIEW as _; #[doc(alias = "GST_VIDEO_FRAME_FLAG_FIRST_IN_BUNDLE")] const FIRST_IN_BUNDLE = ffi::GST_VIDEO_FRAME_FLAG_FIRST_IN_BUNDLE as _; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(alias = "GST_VIDEO_FRAME_FLAG_TOP_FIELD")] const TOP_FIELD = ffi::GST_VIDEO_FRAME_FLAG_TOP_FIELD as _; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(alias = "GST_VIDEO_FRAME_FLAG_BOTTOM_FIELD")] const BOTTOM_FIELD = ffi::GST_VIDEO_FRAME_FLAG_BOTTOM_FIELD as _; } } #[doc(hidden)] impl IntoGlib for VideoFrameFlags { type GlibType = ffi::GstVideoFrameFlags; #[inline] fn into_glib(self) -> ffi::GstVideoFrameFlags { self.bits() } } #[doc(hidden)] impl FromGlib for VideoFrameFlags { #[inline] unsafe fn from_glib(value: ffi::GstVideoFrameFlags) -> Self { skip_assert_initialized!(); Self::from_bits_truncate(value) } } impl StaticType for VideoFrameFlags { #[inline] #[doc(alias = "gst_video_frame_flags_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_frame_flags_get_type()) } } } impl glib::HasParamSpec for VideoFrameFlags { type ParamSpec = glib::ParamSpecFlags; type SetValue = Self; type BuilderFn = fn(&str) -> glib::ParamSpecFlagsBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder } } impl glib::value::ValueType for VideoFrameFlags { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoFrameFlags { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_flags(value.to_glib_none().0)) } } impl ToValue for VideoFrameFlags { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_flags(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoFrameFlags) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } bitflags! 
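// Illustrative sketch for the interlacing-related `VideoFrameFlags` defined above (not part
// of the generated bindings; shown only to demonstrate typical flag tests):
//
//     let flags = VideoFrameFlags::INTERLACED | VideoFrameFlags::TFF;
//     if flags.contains(VideoFrameFlags::INTERLACED) {
//         // Top-field-first when TFF is also set on an interlaced frame.
//         let _top_field_first = flags.contains(VideoFrameFlags::TFF);
//     }
//
//     // `from_bits_truncate()` silently drops unknown bits coming from C, exactly as the
//     // `FromGlib` impl above does.
//     assert_eq!(VideoFrameFlags::from_bits_truncate(flags.bits()), flags);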
{ #[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)] #[doc(alias = "GstVideoMultiviewFlags")] pub struct VideoMultiviewFlags: u32 { #[doc(alias = "GST_VIDEO_MULTIVIEW_FLAGS_RIGHT_VIEW_FIRST")] const RIGHT_VIEW_FIRST = ffi::GST_VIDEO_MULTIVIEW_FLAGS_RIGHT_VIEW_FIRST as _; #[doc(alias = "GST_VIDEO_MULTIVIEW_FLAGS_LEFT_FLIPPED")] const LEFT_FLIPPED = ffi::GST_VIDEO_MULTIVIEW_FLAGS_LEFT_FLIPPED as _; #[doc(alias = "GST_VIDEO_MULTIVIEW_FLAGS_LEFT_FLOPPED")] const LEFT_FLOPPED = ffi::GST_VIDEO_MULTIVIEW_FLAGS_LEFT_FLOPPED as _; #[doc(alias = "GST_VIDEO_MULTIVIEW_FLAGS_RIGHT_FLIPPED")] const RIGHT_FLIPPED = ffi::GST_VIDEO_MULTIVIEW_FLAGS_RIGHT_FLIPPED as _; #[doc(alias = "GST_VIDEO_MULTIVIEW_FLAGS_RIGHT_FLOPPED")] const RIGHT_FLOPPED = ffi::GST_VIDEO_MULTIVIEW_FLAGS_RIGHT_FLOPPED as _; #[doc(alias = "GST_VIDEO_MULTIVIEW_FLAGS_HALF_ASPECT")] const HALF_ASPECT = ffi::GST_VIDEO_MULTIVIEW_FLAGS_HALF_ASPECT as _; #[doc(alias = "GST_VIDEO_MULTIVIEW_FLAGS_MIXED_MONO")] const MIXED_MONO = ffi::GST_VIDEO_MULTIVIEW_FLAGS_MIXED_MONO as _; } } #[doc(hidden)] impl IntoGlib for VideoMultiviewFlags { type GlibType = ffi::GstVideoMultiviewFlags; #[inline] fn into_glib(self) -> ffi::GstVideoMultiviewFlags { self.bits() } } #[doc(hidden)] impl FromGlib for VideoMultiviewFlags { #[inline] unsafe fn from_glib(value: ffi::GstVideoMultiviewFlags) -> Self { skip_assert_initialized!(); Self::from_bits_truncate(value) } } impl StaticType for VideoMultiviewFlags { #[inline] #[doc(alias = "gst_video_multiview_flags_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_multiview_flags_get_type()) } } } impl glib::HasParamSpec for VideoMultiviewFlags { type ParamSpec = glib::ParamSpecFlags; type SetValue = Self; type BuilderFn = fn(&str) -> glib::ParamSpecFlagsBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder } } impl glib::value::ValueType for VideoMultiviewFlags { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoMultiviewFlags { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_flags(value.to_glib_none().0)) } } impl ToValue for VideoMultiviewFlags { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_flags(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoMultiviewFlags) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } bitflags! 
{ #[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)] #[doc(alias = "GstVideoOverlayFormatFlags")] pub struct VideoOverlayFormatFlags: u32 { #[doc(alias = "GST_VIDEO_OVERLAY_FORMAT_FLAG_PREMULTIPLIED_ALPHA")] const PREMULTIPLIED_ALPHA = ffi::GST_VIDEO_OVERLAY_FORMAT_FLAG_PREMULTIPLIED_ALPHA as _; #[doc(alias = "GST_VIDEO_OVERLAY_FORMAT_FLAG_GLOBAL_ALPHA")] const GLOBAL_ALPHA = ffi::GST_VIDEO_OVERLAY_FORMAT_FLAG_GLOBAL_ALPHA as _; } } #[doc(hidden)] impl IntoGlib for VideoOverlayFormatFlags { type GlibType = ffi::GstVideoOverlayFormatFlags; #[inline] fn into_glib(self) -> ffi::GstVideoOverlayFormatFlags { self.bits() } } #[doc(hidden)] impl FromGlib for VideoOverlayFormatFlags { #[inline] unsafe fn from_glib(value: ffi::GstVideoOverlayFormatFlags) -> Self { skip_assert_initialized!(); Self::from_bits_truncate(value) } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl StaticType for VideoOverlayFormatFlags { #[inline] #[doc(alias = "gst_video_overlay_format_flags_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_overlay_format_flags_get_type()) } } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl glib::HasParamSpec for VideoOverlayFormatFlags { type ParamSpec = glib::ParamSpecFlags; type SetValue = Self; type BuilderFn = fn(&str) -> glib::ParamSpecFlagsBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl glib::value::ValueType for VideoOverlayFormatFlags { type Type = Self; } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] unsafe impl<'a> glib::value::FromValue<'a> for VideoOverlayFormatFlags { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_flags(value.to_glib_none().0)) } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl ToValue for VideoOverlayFormatFlags { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_flags(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl From for glib::Value { #[inline] fn from(v: VideoOverlayFormatFlags) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } bitflags! 
{ #[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)] #[doc(alias = "GstVideoPackFlags")] pub struct VideoPackFlags: u32 { #[doc(alias = "GST_VIDEO_PACK_FLAG_TRUNCATE_RANGE")] const TRUNCATE_RANGE = ffi::GST_VIDEO_PACK_FLAG_TRUNCATE_RANGE as _; #[doc(alias = "GST_VIDEO_PACK_FLAG_INTERLACED")] const INTERLACED = ffi::GST_VIDEO_PACK_FLAG_INTERLACED as _; } } #[doc(hidden)] impl IntoGlib for VideoPackFlags { type GlibType = ffi::GstVideoPackFlags; #[inline] fn into_glib(self) -> ffi::GstVideoPackFlags { self.bits() } } #[doc(hidden)] impl FromGlib for VideoPackFlags { #[inline] unsafe fn from_glib(value: ffi::GstVideoPackFlags) -> Self { skip_assert_initialized!(); Self::from_bits_truncate(value) } } impl StaticType for VideoPackFlags { #[inline] #[doc(alias = "gst_video_pack_flags_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_pack_flags_get_type()) } } } impl glib::HasParamSpec for VideoPackFlags { type ParamSpec = glib::ParamSpecFlags; type SetValue = Self; type BuilderFn = fn(&str) -> glib::ParamSpecFlagsBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder } } impl glib::value::ValueType for VideoPackFlags { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoPackFlags { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_flags(value.to_glib_none().0)) } } impl ToValue for VideoPackFlags { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_flags(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { #[inline] fn from(v: VideoPackFlags) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } bitflags! 
{ #[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)] #[doc(alias = "GstVideoTimeCodeFlags")] pub struct VideoTimeCodeFlags: u32 { #[doc(alias = "GST_VIDEO_TIME_CODE_FLAGS_DROP_FRAME")] const DROP_FRAME = ffi::GST_VIDEO_TIME_CODE_FLAGS_DROP_FRAME as _; #[doc(alias = "GST_VIDEO_TIME_CODE_FLAGS_INTERLACED")] const INTERLACED = ffi::GST_VIDEO_TIME_CODE_FLAGS_INTERLACED as _; } } #[doc(hidden)] impl IntoGlib for VideoTimeCodeFlags { type GlibType = ffi::GstVideoTimeCodeFlags; #[inline] fn into_glib(self) -> ffi::GstVideoTimeCodeFlags { self.bits() } } #[doc(hidden)] impl FromGlib for VideoTimeCodeFlags { #[inline] unsafe fn from_glib(value: ffi::GstVideoTimeCodeFlags) -> Self { skip_assert_initialized!(); Self::from_bits_truncate(value) } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] impl StaticType for VideoTimeCodeFlags { #[inline] #[doc(alias = "gst_video_time_code_flags_get_type")] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_time_code_flags_get_type()) } } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] impl glib::HasParamSpec for VideoTimeCodeFlags { type ParamSpec = glib::ParamSpecFlags; type SetValue = Self; type BuilderFn = fn(&str) -> glib::ParamSpecFlagsBuilder; fn param_spec_builder() -> Self::BuilderFn { Self::ParamSpec::builder } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] impl glib::value::ValueType for VideoTimeCodeFlags { type Type = Self; } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] unsafe impl<'a> glib::value::FromValue<'a> for VideoTimeCodeFlags { type Checker = glib::value::GenericValueTypeChecker; #[inline] unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_flags(value.to_glib_none().0)) } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] impl ToValue for VideoTimeCodeFlags { #[inline] fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_flags(value.to_glib_none_mut().0, self.into_glib()); } value } #[inline] fn value_type(&self) -> glib::Type { Self::static_type() } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] impl From for glib::Value { #[inline] fn from(v: VideoTimeCodeFlags) -> Self { skip_assert_initialized!(); ToValue::to_value(&v) } } gstreamer-video-0.23.5/src/auto/mod.rs000064400000000000000000000116201046102023000156740ustar 00000000000000// This file was generated by gir (https://github.com/gtk-rs/gir) // from gir-files (https://github.com/gtk-rs/gir-files) // from gst-gir-files (https://gitlab.freedesktop.org/gstreamer/gir-files-rs.git) // DO NOT EDIT mod color_balance; pub use self::color_balance::ColorBalance; mod color_balance_channel; pub use self::color_balance_channel::ColorBalanceChannel; mod navigation; pub use self::navigation::Navigation; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] mod video_aggregator; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub use self::video_aggregator::VideoAggregator; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] mod video_aggregator_convert_pad; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub use self::video_aggregator_convert_pad::VideoAggregatorConvertPad; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] mod video_aggregator_pad; 
#[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub use self::video_aggregator_pad::VideoAggregatorPad; #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] mod video_aggregator_parallel_convert_pad; #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] pub use self::video_aggregator_parallel_convert_pad::VideoAggregatorParallelConvertPad; mod video_buffer_pool; pub use self::video_buffer_pool::VideoBufferPool; mod video_decoder; pub use self::video_decoder::VideoDecoder; mod video_encoder; pub use self::video_encoder::VideoEncoder; mod video_filter; pub use self::video_filter::VideoFilter; mod video_orientation; pub use self::video_orientation::VideoOrientation; mod video_overlay; pub use self::video_overlay::VideoOverlay; mod video_sink; pub use self::video_sink::VideoSink; mod enums; #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] pub use self::enums::AncillaryMetaField; pub use self::enums::ColorBalanceType; pub use self::enums::NavigationCommand; pub use self::enums::NavigationEventType; pub use self::enums::NavigationMessageType; pub use self::enums::NavigationQueryType; #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] pub use self::enums::VideoAFDSpec; #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] pub use self::enums::VideoAFDValue; pub use self::enums::VideoAlphaMode; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub use self::enums::VideoAncillaryDID; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub use self::enums::VideoAncillaryDID16; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub use self::enums::VideoCaptionType; pub use self::enums::VideoChromaMode; pub use self::enums::VideoColorMatrix; pub use self::enums::VideoColorPrimaries; pub use self::enums::VideoDitherMethod; pub use self::enums::VideoFieldOrder; pub use self::enums::VideoFormat; pub use self::enums::VideoGammaMode; pub use self::enums::VideoInterlaceMode; pub use self::enums::VideoMatrixMode; pub use self::enums::VideoMultiviewFramePacking; pub use self::enums::VideoMultiviewMode; pub use self::enums::VideoOrientationMethod; pub use self::enums::VideoPrimariesMode; pub use self::enums::VideoResamplerMethod; pub use self::enums::VideoTileMode; pub use self::enums::VideoTransferFunction; mod flags; #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] pub use self::flags::NavigationModifierType; pub use self::flags::VideoBufferFlags; pub use self::flags::VideoChromaSite; pub use self::flags::VideoCodecFrameFlags; #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] pub use self::flags::VideoDecoderRequestSyncPointFlags; pub use self::flags::VideoFlags; pub use self::flags::VideoFormatFlags; pub use self::flags::VideoFrameFlags; pub use self::flags::VideoMultiviewFlags; pub use self::flags::VideoOverlayFormatFlags; pub use self::flags::VideoPackFlags; pub use self::flags::VideoTimeCodeFlags; pub(crate) mod traits { pub use super::color_balance::ColorBalanceExt; pub use super::color_balance_channel::ColorBalanceChannelExt; pub use super::navigation::NavigationExt; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub use super::video_aggregator::VideoAggregatorExt; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub use super::video_aggregator_convert_pad::VideoAggregatorConvertPadExt; 
#[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub use super::video_aggregator_pad::VideoAggregatorPadExt; pub use super::video_decoder::VideoDecoderExt; pub use super::video_encoder::VideoEncoderExt; pub use super::video_orientation::VideoOrientationExt; pub use super::video_overlay::VideoOverlayExt; pub use super::video_sink::VideoSinkExt; } gstreamer-video-0.23.5/src/auto/navigation.rs000064400000000000000000000066301046102023000172610ustar 00000000000000// This file was generated by gir (https://github.com/gtk-rs/gir) // from gir-files (https://github.com/gtk-rs/gir-files) // from gst-gir-files (https://gitlab.freedesktop.org/gstreamer/gir-files-rs.git) // DO NOT EDIT use crate::{ffi, NavigationCommand}; use glib::{prelude::*, translate::*}; glib::wrapper! { #[doc(alias = "GstNavigation")] pub struct Navigation(Interface); match fn { type_ => || ffi::gst_navigation_get_type(), } } impl Navigation { pub const NONE: Option<&'static Navigation> = None; //#[doc(alias = "gst_navigation_query_set_commands")] //pub fn query_set_commands(query: &gst::Query, n_cmds: i32, : /*Unknown conversion*//*Unimplemented*/Basic: VarArgs) { // unsafe { TODO: call ffi:gst_navigation_query_set_commands() } //} //#[doc(alias = "gst_navigation_query_set_commandsv")] //pub fn query_set_commandsv(query: &gst::Query, cmds: /*Unimplemented*/&CArray TypeId { ns_id: 1, id: 8 }) { // unsafe { TODO: call ffi:gst_navigation_query_set_commandsv() } //} } unsafe impl Send for Navigation {} unsafe impl Sync for Navigation {} mod sealed { pub trait Sealed {} impl> Sealed for T {} } pub trait NavigationExt: IsA + sealed::Sealed + 'static { #[doc(alias = "gst_navigation_send_command")] fn send_command(&self, command: NavigationCommand) { unsafe { ffi::gst_navigation_send_command(self.as_ref().to_glib_none().0, command.into_glib()); } } #[doc(alias = "gst_navigation_send_event")] fn send_event(&self, structure: gst::Structure) { unsafe { ffi::gst_navigation_send_event( self.as_ref().to_glib_none().0, structure.into_glib_ptr(), ); } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[doc(alias = "gst_navigation_send_event_simple")] fn send_event_simple(&self, event: gst::Event) { unsafe { ffi::gst_navigation_send_event_simple( self.as_ref().to_glib_none().0, event.into_glib_ptr(), ); } } #[doc(alias = "gst_navigation_send_key_event")] fn send_key_event(&self, event: &str, key: &str) { unsafe { ffi::gst_navigation_send_key_event( self.as_ref().to_glib_none().0, event.to_glib_none().0, key.to_glib_none().0, ); } } #[doc(alias = "gst_navigation_send_mouse_event")] fn send_mouse_event(&self, event: &str, button: i32, x: f64, y: f64) { unsafe { ffi::gst_navigation_send_mouse_event( self.as_ref().to_glib_none().0, event.to_glib_none().0, button, x, y, ); } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "gst_navigation_send_mouse_scroll_event")] fn send_mouse_scroll_event(&self, x: f64, y: f64, delta_x: f64, delta_y: f64) { unsafe { ffi::gst_navigation_send_mouse_scroll_event( self.as_ref().to_glib_none().0, x, y, delta_x, delta_y, ); } } } impl> NavigationExt for O {} gstreamer-video-0.23.5/src/auto/versions.txt000064400000000000000000000003421046102023000171570ustar 00000000000000Generated by gir (https://github.com/gtk-rs/gir @ 2b05eaddce95) from gir-files (https://github.com/gtk-rs/gir-files @ 5089b7ff80cd) from gst-gir-files (https://gitlab.freedesktop.org/gstreamer/gir-files-rs.git @ 26898eacb093) 
gstreamer-video-0.23.5/src/auto/video_aggregator.rs000064400000000000000000000024561046102023000204340ustar 00000000000000// This file was generated by gir (https://github.com/gtk-rs/gir) // from gir-files (https://github.com/gtk-rs/gir-files) // from gst-gir-files (https://gitlab.freedesktop.org/gstreamer/gir-files-rs.git) // DO NOT EDIT use crate::ffi; use glib::prelude::*; glib::wrapper! { #[doc(alias = "GstVideoAggregator")] pub struct VideoAggregator(Object) @extends gst_base::Aggregator, gst::Element, gst::Object; match fn { type_ => || ffi::gst_video_aggregator_get_type(), } } impl VideoAggregator { pub const NONE: Option<&'static VideoAggregator> = None; } unsafe impl Send for VideoAggregator {} unsafe impl Sync for VideoAggregator {} mod sealed { pub trait Sealed {} impl> Sealed for T {} } pub trait VideoAggregatorExt: IsA + sealed::Sealed + 'static { //#[cfg(feature = "v1_20")] //#[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] //#[doc(alias = "gst_video_aggregator_get_execution_task_pool")] //#[doc(alias = "get_execution_task_pool")] //fn execution_task_pool(&self) -> /*Ignored*/gst::TaskPool { // unsafe { TODO: call ffi:gst_video_aggregator_get_execution_task_pool() } //} } impl> VideoAggregatorExt for O {} gstreamer-video-0.23.5/src/auto/video_aggregator_convert_pad.rs000064400000000000000000000026571046102023000230230ustar 00000000000000// This file was generated by gir (https://github.com/gtk-rs/gir) // from gir-files (https://github.com/gtk-rs/gir-files) // from gst-gir-files (https://gitlab.freedesktop.org/gstreamer/gir-files-rs.git) // DO NOT EDIT use crate::{ffi, VideoAggregatorPad}; use glib::{prelude::*, translate::*}; glib::wrapper! { #[doc(alias = "GstVideoAggregatorConvertPad")] pub struct VideoAggregatorConvertPad(Object) @extends VideoAggregatorPad, gst_base::AggregatorPad, gst::Pad, gst::Object; match fn { type_ => || ffi::gst_video_aggregator_convert_pad_get_type(), } } impl VideoAggregatorConvertPad { pub const NONE: Option<&'static VideoAggregatorConvertPad> = None; } unsafe impl Send for VideoAggregatorConvertPad {} unsafe impl Sync for VideoAggregatorConvertPad {} mod sealed { pub trait Sealed {} impl> Sealed for T {} } pub trait VideoAggregatorConvertPadExt: IsA + sealed::Sealed + 'static { #[doc(alias = "gst_video_aggregator_convert_pad_update_conversion_info")] fn update_conversion_info(&self) { unsafe { ffi::gst_video_aggregator_convert_pad_update_conversion_info( self.as_ref().to_glib_none().0, ); } } } impl> VideoAggregatorConvertPadExt for O {} gstreamer-video-0.23.5/src/auto/video_aggregator_pad.rs000064400000000000000000000125771046102023000212650ustar 00000000000000// This file was generated by gir (https://github.com/gtk-rs/gir) // from gir-files (https://github.com/gtk-rs/gir-files) // from gst-gir-files (https://gitlab.freedesktop.org/gstreamer/gir-files-rs.git) // DO NOT EDIT use crate::ffi; use glib::{ prelude::*, signal::{connect_raw, SignalHandlerId}, translate::*, }; use std::boxed::Box as Box_; glib::wrapper! 
{ #[doc(alias = "GstVideoAggregatorPad")] pub struct VideoAggregatorPad(Object) @extends gst_base::AggregatorPad, gst::Pad, gst::Object; match fn { type_ => || ffi::gst_video_aggregator_pad_get_type(), } } impl VideoAggregatorPad { pub const NONE: Option<&'static VideoAggregatorPad> = None; } unsafe impl Send for VideoAggregatorPad {} unsafe impl Sync for VideoAggregatorPad {} mod sealed { pub trait Sealed {} impl> Sealed for T {} } pub trait VideoAggregatorPadExt: IsA + sealed::Sealed + 'static { #[doc(alias = "gst_video_aggregator_pad_set_needs_alpha")] fn set_needs_alpha(&self, needs_alpha: bool) { unsafe { ffi::gst_video_aggregator_pad_set_needs_alpha( self.as_ref().to_glib_none().0, needs_alpha.into_glib(), ); } } #[doc(alias = "max-last-buffer-repeat")] fn max_last_buffer_repeat(&self) -> u64 { ObjectExt::property(self.as_ref(), "max-last-buffer-repeat") } #[doc(alias = "max-last-buffer-repeat")] fn set_max_last_buffer_repeat(&self, max_last_buffer_repeat: u64) { ObjectExt::set_property( self.as_ref(), "max-last-buffer-repeat", max_last_buffer_repeat, ) } #[doc(alias = "repeat-after-eos")] fn is_repeat_after_eos(&self) -> bool { ObjectExt::property(self.as_ref(), "repeat-after-eos") } #[doc(alias = "repeat-after-eos")] fn set_repeat_after_eos(&self, repeat_after_eos: bool) { ObjectExt::set_property(self.as_ref(), "repeat-after-eos", repeat_after_eos) } fn zorder(&self) -> u32 { ObjectExt::property(self.as_ref(), "zorder") } fn set_zorder(&self, zorder: u32) { ObjectExt::set_property(self.as_ref(), "zorder", zorder) } #[doc(alias = "max-last-buffer-repeat")] fn connect_max_last_buffer_repeat_notify( &self, f: F, ) -> SignalHandlerId { unsafe extern "C" fn notify_max_last_buffer_repeat_trampoline< P: IsA, F: Fn(&P) + Send + Sync + 'static, >( this: *mut ffi::GstVideoAggregatorPad, _param_spec: glib::ffi::gpointer, f: glib::ffi::gpointer, ) { let f: &F = &*(f as *const F); f(VideoAggregatorPad::from_glib_borrow(this).unsafe_cast_ref()) } unsafe { let f: Box_ = Box_::new(f); connect_raw( self.as_ptr() as *mut _, b"notify::max-last-buffer-repeat\0".as_ptr() as *const _, Some(std::mem::transmute::<*const (), unsafe extern "C" fn()>( notify_max_last_buffer_repeat_trampoline:: as *const (), )), Box_::into_raw(f), ) } } #[doc(alias = "repeat-after-eos")] fn connect_repeat_after_eos_notify( &self, f: F, ) -> SignalHandlerId { unsafe extern "C" fn notify_repeat_after_eos_trampoline< P: IsA, F: Fn(&P) + Send + Sync + 'static, >( this: *mut ffi::GstVideoAggregatorPad, _param_spec: glib::ffi::gpointer, f: glib::ffi::gpointer, ) { let f: &F = &*(f as *const F); f(VideoAggregatorPad::from_glib_borrow(this).unsafe_cast_ref()) } unsafe { let f: Box_ = Box_::new(f); connect_raw( self.as_ptr() as *mut _, b"notify::repeat-after-eos\0".as_ptr() as *const _, Some(std::mem::transmute::<*const (), unsafe extern "C" fn()>( notify_repeat_after_eos_trampoline:: as *const (), )), Box_::into_raw(f), ) } } #[doc(alias = "zorder")] fn connect_zorder_notify(&self, f: F) -> SignalHandlerId { unsafe extern "C" fn notify_zorder_trampoline< P: IsA, F: Fn(&P) + Send + Sync + 'static, >( this: *mut ffi::GstVideoAggregatorPad, _param_spec: glib::ffi::gpointer, f: glib::ffi::gpointer, ) { let f: &F = &*(f as *const F); f(VideoAggregatorPad::from_glib_borrow(this).unsafe_cast_ref()) } unsafe { let f: Box_ = Box_::new(f); connect_raw( self.as_ptr() as *mut _, b"notify::zorder\0".as_ptr() as *const _, Some(std::mem::transmute::<*const (), unsafe extern "C" fn()>( notify_zorder_trampoline:: as *const (), )), Box_::into_raw(f), 
) } } } impl> VideoAggregatorPadExt for O {} gstreamer-video-0.23.5/src/auto/video_aggregator_parallel_convert_pad.rs000064400000000000000000000016751046102023000246760ustar 00000000000000// This file was generated by gir (https://github.com/gtk-rs/gir) // from gir-files (https://github.com/gtk-rs/gir-files) // from gst-gir-files (https://gitlab.freedesktop.org/gstreamer/gir-files-rs.git) // DO NOT EDIT use crate::{ffi, VideoAggregatorConvertPad, VideoAggregatorPad}; glib::wrapper! { #[doc(alias = "GstVideoAggregatorParallelConvertPad")] pub struct VideoAggregatorParallelConvertPad(Object) @extends VideoAggregatorConvertPad, VideoAggregatorPad, gst_base::AggregatorPad, gst::Pad, gst::Object; match fn { type_ => || ffi::gst_video_aggregator_parallel_convert_pad_get_type(), } } impl VideoAggregatorParallelConvertPad { pub const NONE: Option<&'static VideoAggregatorParallelConvertPad> = None; } unsafe impl Send for VideoAggregatorParallelConvertPad {} unsafe impl Sync for VideoAggregatorParallelConvertPad {} gstreamer-video-0.23.5/src/auto/video_buffer_pool.rs000064400000000000000000000020231046102023000206020ustar 00000000000000// This file was generated by gir (https://github.com/gtk-rs/gir) // from gir-files (https://github.com/gtk-rs/gir-files) // from gst-gir-files (https://gitlab.freedesktop.org/gstreamer/gir-files-rs.git) // DO NOT EDIT use crate::ffi; use glib::{prelude::*, translate::*}; glib::wrapper! { #[doc(alias = "GstVideoBufferPool")] pub struct VideoBufferPool(Object) @extends gst::BufferPool, gst::Object; match fn { type_ => || ffi::gst_video_buffer_pool_get_type(), } } impl VideoBufferPool { pub const NONE: Option<&'static VideoBufferPool> = None; #[doc(alias = "gst_video_buffer_pool_new")] pub fn new() -> VideoBufferPool { assert_initialized_main_thread!(); unsafe { gst::BufferPool::from_glib_full(ffi::gst_video_buffer_pool_new()).unsafe_cast() } } } impl Default for VideoBufferPool { fn default() -> Self { Self::new() } } unsafe impl Send for VideoBufferPool {} unsafe impl Sync for VideoBufferPool {} gstreamer-video-0.23.5/src/auto/video_decoder.rs000064400000000000000000000506561046102023000177240ustar 00000000000000// This file was generated by gir (https://github.com/gtk-rs/gir) // from gir-files (https://github.com/gtk-rs/gir-files) // from gst-gir-files (https://gitlab.freedesktop.org/gstreamer/gir-files-rs.git) // DO NOT EDIT #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] use crate::VideoDecoderRequestSyncPointFlags; use crate::{ffi, VideoCodecFrame}; #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] use glib::signal::{connect_raw, SignalHandlerId}; use glib::{prelude::*, translate::*}; #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] use std::boxed::Box as Box_; glib::wrapper! 
{ #[doc(alias = "GstVideoDecoder")] pub struct VideoDecoder(Object) @extends gst::Element, gst::Object; match fn { type_ => || ffi::gst_video_decoder_get_type(), } } impl VideoDecoder { pub const NONE: Option<&'static VideoDecoder> = None; } unsafe impl Send for VideoDecoder {} unsafe impl Sync for VideoDecoder {} mod sealed { pub trait Sealed {} impl> Sealed for T {} } pub trait VideoDecoderExt: IsA + sealed::Sealed + 'static { #[doc(alias = "gst_video_decoder_add_to_frame")] fn add_to_frame(&self, n_bytes: i32) { unsafe { ffi::gst_video_decoder_add_to_frame(self.as_ref().to_glib_none().0, n_bytes); } } #[doc(alias = "gst_video_decoder_allocate_output_buffer")] fn allocate_output_buffer(&self) -> Result { unsafe { Option::<_>::from_glib_full(ffi::gst_video_decoder_allocate_output_buffer( self.as_ref().to_glib_none().0, )) .ok_or_else(|| glib::bool_error!("Failed to allocate output buffer")) } } #[doc(alias = "gst_video_decoder_drop_frame")] fn drop_frame(&self, frame: VideoCodecFrame) -> Result { unsafe { try_from_glib(ffi::gst_video_decoder_drop_frame( self.as_ref().to_glib_none().0, frame.into_glib_ptr(), )) } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "gst_video_decoder_drop_subframe")] fn drop_subframe(&self, frame: VideoCodecFrame) -> Result { unsafe { try_from_glib(ffi::gst_video_decoder_drop_subframe( self.as_ref().to_glib_none().0, frame.into_glib_ptr(), )) } } #[doc(alias = "gst_video_decoder_finish_frame")] fn finish_frame(&self, frame: VideoCodecFrame) -> Result { unsafe { try_from_glib(ffi::gst_video_decoder_finish_frame( self.as_ref().to_glib_none().0, frame.into_glib_ptr(), )) } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "gst_video_decoder_finish_subframe")] fn finish_subframe(&self, frame: VideoCodecFrame) -> Result { unsafe { try_from_glib(ffi::gst_video_decoder_finish_subframe( self.as_ref().to_glib_none().0, frame.into_glib_ptr(), )) } } #[doc(alias = "gst_video_decoder_get_buffer_pool")] #[doc(alias = "get_buffer_pool")] fn buffer_pool(&self) -> Option { unsafe { from_glib_full(ffi::gst_video_decoder_get_buffer_pool( self.as_ref().to_glib_none().0, )) } } #[doc(alias = "gst_video_decoder_get_estimate_rate")] #[doc(alias = "get_estimate_rate")] fn estimate_rate(&self) -> i32 { unsafe { ffi::gst_video_decoder_get_estimate_rate(self.as_ref().to_glib_none().0) } } #[doc(alias = "gst_video_decoder_get_max_decode_time")] #[doc(alias = "get_max_decode_time")] fn max_decode_time(&self, frame: &VideoCodecFrame) -> gst::ClockTimeDiff { unsafe { ffi::gst_video_decoder_get_max_decode_time( self.as_ref().to_glib_none().0, frame.to_glib_none().0, ) } } #[doc(alias = "gst_video_decoder_get_max_errors")] #[doc(alias = "get_max_errors")] #[doc(alias = "max-errors")] fn max_errors(&self) -> i32 { unsafe { ffi::gst_video_decoder_get_max_errors(self.as_ref().to_glib_none().0) } } #[doc(alias = "gst_video_decoder_get_needs_format")] #[doc(alias = "get_needs_format")] fn needs_format(&self) -> bool { unsafe { from_glib(ffi::gst_video_decoder_get_needs_format( self.as_ref().to_glib_none().0, )) } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "gst_video_decoder_get_needs_sync_point")] #[doc(alias = "get_needs_sync_point")] fn needs_sync_point(&self) -> bool { unsafe { from_glib(ffi::gst_video_decoder_get_needs_sync_point( self.as_ref().to_glib_none().0, )) } } #[doc(alias = "gst_video_decoder_get_packetized")] #[doc(alias = "get_packetized")] fn 
is_packetized(&self) -> bool { unsafe { from_glib(ffi::gst_video_decoder_get_packetized( self.as_ref().to_glib_none().0, )) } } #[doc(alias = "gst_video_decoder_get_pending_frame_size")] #[doc(alias = "get_pending_frame_size")] fn pending_frame_size(&self) -> usize { unsafe { ffi::gst_video_decoder_get_pending_frame_size(self.as_ref().to_glib_none().0) } } #[doc(alias = "gst_video_decoder_get_qos_proportion")] #[doc(alias = "get_qos_proportion")] fn qos_proportion(&self) -> f64 { unsafe { ffi::gst_video_decoder_get_qos_proportion(self.as_ref().to_glib_none().0) } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "gst_video_decoder_get_subframe_mode")] #[doc(alias = "get_subframe_mode")] fn is_subframe_mode(&self) -> bool { unsafe { from_glib(ffi::gst_video_decoder_get_subframe_mode( self.as_ref().to_glib_none().0, )) } } #[doc(alias = "gst_video_decoder_have_frame")] fn have_frame(&self) -> Result { unsafe { try_from_glib(ffi::gst_video_decoder_have_frame( self.as_ref().to_glib_none().0, )) } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "gst_video_decoder_have_last_subframe")] fn have_last_subframe( &self, frame: &VideoCodecFrame, ) -> Result { unsafe { try_from_glib(ffi::gst_video_decoder_have_last_subframe( self.as_ref().to_glib_none().0, frame.to_glib_none().0, )) } } #[doc(alias = "gst_video_decoder_merge_tags")] fn merge_tags(&self, tags: Option<&gst::TagList>, mode: gst::TagMergeMode) { unsafe { ffi::gst_video_decoder_merge_tags( self.as_ref().to_glib_none().0, tags.to_glib_none().0, mode.into_glib(), ); } } #[doc(alias = "gst_video_decoder_proxy_getcaps")] fn proxy_getcaps(&self, caps: Option<&gst::Caps>, filter: Option<&gst::Caps>) -> gst::Caps { unsafe { from_glib_full(ffi::gst_video_decoder_proxy_getcaps( self.as_ref().to_glib_none().0, caps.to_glib_none().0, filter.to_glib_none().0, )) } } #[doc(alias = "gst_video_decoder_release_frame")] fn release_frame(&self, frame: VideoCodecFrame) { unsafe { ffi::gst_video_decoder_release_frame( self.as_ref().to_glib_none().0, frame.into_glib_ptr(), ); } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "gst_video_decoder_request_sync_point")] fn request_sync_point( &self, frame: &VideoCodecFrame, flags: VideoDecoderRequestSyncPointFlags, ) { unsafe { ffi::gst_video_decoder_request_sync_point( self.as_ref().to_glib_none().0, frame.to_glib_none().0, flags.into_glib(), ); } } #[doc(alias = "gst_video_decoder_set_estimate_rate")] fn set_estimate_rate(&self, enabled: bool) { unsafe { ffi::gst_video_decoder_set_estimate_rate( self.as_ref().to_glib_none().0, enabled.into_glib(), ); } } #[doc(alias = "gst_video_decoder_set_max_errors")] #[doc(alias = "max-errors")] fn set_max_errors(&self, num: i32) { unsafe { ffi::gst_video_decoder_set_max_errors(self.as_ref().to_glib_none().0, num); } } #[doc(alias = "gst_video_decoder_set_needs_format")] fn set_needs_format(&self, enabled: bool) { unsafe { ffi::gst_video_decoder_set_needs_format( self.as_ref().to_glib_none().0, enabled.into_glib(), ); } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "gst_video_decoder_set_needs_sync_point")] fn set_needs_sync_point(&self, enabled: bool) { unsafe { ffi::gst_video_decoder_set_needs_sync_point( self.as_ref().to_glib_none().0, enabled.into_glib(), ); } } #[doc(alias = "gst_video_decoder_set_packetized")] fn set_packetized(&self, packetized: bool) { unsafe { ffi::gst_video_decoder_set_packetized( 
self.as_ref().to_glib_none().0, packetized.into_glib(), ); } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "gst_video_decoder_set_subframe_mode")] fn set_subframe_mode(&self, subframe_mode: bool) { unsafe { ffi::gst_video_decoder_set_subframe_mode( self.as_ref().to_glib_none().0, subframe_mode.into_glib(), ); } } #[doc(alias = "gst_video_decoder_set_use_default_pad_acceptcaps")] fn set_use_default_pad_acceptcaps(&self, use_: bool) { unsafe { ffi::gst_video_decoder_set_use_default_pad_acceptcaps( self.as_ref().to_glib_none().0, use_.into_glib(), ); } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "automatic-request-sync-point-flags")] fn automatic_request_sync_point_flags(&self) -> VideoDecoderRequestSyncPointFlags { ObjectExt::property(self.as_ref(), "automatic-request-sync-point-flags") } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "automatic-request-sync-point-flags")] fn set_automatic_request_sync_point_flags( &self, automatic_request_sync_point_flags: VideoDecoderRequestSyncPointFlags, ) { ObjectExt::set_property( self.as_ref(), "automatic-request-sync-point-flags", automatic_request_sync_point_flags, ) } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "automatic-request-sync-points")] fn is_automatic_request_sync_points(&self) -> bool { ObjectExt::property(self.as_ref(), "automatic-request-sync-points") } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "automatic-request-sync-points")] fn set_automatic_request_sync_points(&self, automatic_request_sync_points: bool) { ObjectExt::set_property( self.as_ref(), "automatic-request-sync-points", automatic_request_sync_points, ) } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "discard-corrupted-frames")] fn is_discard_corrupted_frames(&self) -> bool { ObjectExt::property(self.as_ref(), "discard-corrupted-frames") } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "discard-corrupted-frames")] fn set_discard_corrupted_frames(&self, discard_corrupted_frames: bool) { ObjectExt::set_property( self.as_ref(), "discard-corrupted-frames", discard_corrupted_frames, ) } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "min-force-key-unit-interval")] fn min_force_key_unit_interval(&self) -> u64 { ObjectExt::property(self.as_ref(), "min-force-key-unit-interval") } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "min-force-key-unit-interval")] fn set_min_force_key_unit_interval(&self, min_force_key_unit_interval: u64) { ObjectExt::set_property( self.as_ref(), "min-force-key-unit-interval", min_force_key_unit_interval, ) } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] fn is_qos(&self) -> bool { ObjectExt::property(self.as_ref(), "qos") } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] fn set_qos(&self, qos: bool) { ObjectExt::set_property(self.as_ref(), "qos", qos) } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "automatic-request-sync-point-flags")] fn connect_automatic_request_sync_point_flags_notify( &self, f: F, ) -> SignalHandlerId { unsafe extern "C" fn notify_automatic_request_sync_point_flags_trampoline< P: IsA, F: Fn(&P) + Send + Sync + 'static, >( this: *mut 
ffi::GstVideoDecoder, _param_spec: glib::ffi::gpointer, f: glib::ffi::gpointer, ) { let f: &F = &*(f as *const F); f(VideoDecoder::from_glib_borrow(this).unsafe_cast_ref()) } unsafe { let f: Box_ = Box_::new(f); connect_raw( self.as_ptr() as *mut _, b"notify::automatic-request-sync-point-flags\0".as_ptr() as *const _, Some(std::mem::transmute::<*const (), unsafe extern "C" fn()>( notify_automatic_request_sync_point_flags_trampoline:: as *const (), )), Box_::into_raw(f), ) } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "automatic-request-sync-points")] fn connect_automatic_request_sync_points_notify( &self, f: F, ) -> SignalHandlerId { unsafe extern "C" fn notify_automatic_request_sync_points_trampoline< P: IsA, F: Fn(&P) + Send + Sync + 'static, >( this: *mut ffi::GstVideoDecoder, _param_spec: glib::ffi::gpointer, f: glib::ffi::gpointer, ) { let f: &F = &*(f as *const F); f(VideoDecoder::from_glib_borrow(this).unsafe_cast_ref()) } unsafe { let f: Box_ = Box_::new(f); connect_raw( self.as_ptr() as *mut _, b"notify::automatic-request-sync-points\0".as_ptr() as *const _, Some(std::mem::transmute::<*const (), unsafe extern "C" fn()>( notify_automatic_request_sync_points_trampoline:: as *const (), )), Box_::into_raw(f), ) } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "discard-corrupted-frames")] fn connect_discard_corrupted_frames_notify( &self, f: F, ) -> SignalHandlerId { unsafe extern "C" fn notify_discard_corrupted_frames_trampoline< P: IsA, F: Fn(&P) + Send + Sync + 'static, >( this: *mut ffi::GstVideoDecoder, _param_spec: glib::ffi::gpointer, f: glib::ffi::gpointer, ) { let f: &F = &*(f as *const F); f(VideoDecoder::from_glib_borrow(this).unsafe_cast_ref()) } unsafe { let f: Box_ = Box_::new(f); connect_raw( self.as_ptr() as *mut _, b"notify::discard-corrupted-frames\0".as_ptr() as *const _, Some(std::mem::transmute::<*const (), unsafe extern "C" fn()>( notify_discard_corrupted_frames_trampoline:: as *const (), )), Box_::into_raw(f), ) } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "max-errors")] fn connect_max_errors_notify( &self, f: F, ) -> SignalHandlerId { unsafe extern "C" fn notify_max_errors_trampoline< P: IsA, F: Fn(&P) + Send + Sync + 'static, >( this: *mut ffi::GstVideoDecoder, _param_spec: glib::ffi::gpointer, f: glib::ffi::gpointer, ) { let f: &F = &*(f as *const F); f(VideoDecoder::from_glib_borrow(this).unsafe_cast_ref()) } unsafe { let f: Box_ = Box_::new(f); connect_raw( self.as_ptr() as *mut _, b"notify::max-errors\0".as_ptr() as *const _, Some(std::mem::transmute::<*const (), unsafe extern "C" fn()>( notify_max_errors_trampoline:: as *const (), )), Box_::into_raw(f), ) } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "min-force-key-unit-interval")] fn connect_min_force_key_unit_interval_notify( &self, f: F, ) -> SignalHandlerId { unsafe extern "C" fn notify_min_force_key_unit_interval_trampoline< P: IsA, F: Fn(&P) + Send + Sync + 'static, >( this: *mut ffi::GstVideoDecoder, _param_spec: glib::ffi::gpointer, f: glib::ffi::gpointer, ) { let f: &F = &*(f as *const F); f(VideoDecoder::from_glib_borrow(this).unsafe_cast_ref()) } unsafe { let f: Box_ = Box_::new(f); connect_raw( self.as_ptr() as *mut _, b"notify::min-force-key-unit-interval\0".as_ptr() as *const _, Some(std::mem::transmute::<*const (), unsafe extern "C" fn()>( notify_min_force_key_unit_interval_trampoline:: as *const (), )), 
Box_::into_raw(f), ) } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "qos")] fn connect_qos_notify(&self, f: F) -> SignalHandlerId { unsafe extern "C" fn notify_qos_trampoline< P: IsA, F: Fn(&P) + Send + Sync + 'static, >( this: *mut ffi::GstVideoDecoder, _param_spec: glib::ffi::gpointer, f: glib::ffi::gpointer, ) { let f: &F = &*(f as *const F); f(VideoDecoder::from_glib_borrow(this).unsafe_cast_ref()) } unsafe { let f: Box_ = Box_::new(f); connect_raw( self.as_ptr() as *mut _, b"notify::qos\0".as_ptr() as *const _, Some(std::mem::transmute::<*const (), unsafe extern "C" fn()>( notify_qos_trampoline:: as *const (), )), Box_::into_raw(f), ) } } } impl> VideoDecoderExt for O {} gstreamer-video-0.23.5/src/auto/video_encoder.rs000064400000000000000000000170461046102023000177320ustar 00000000000000// This file was generated by gir (https://github.com/gtk-rs/gir) // from gir-files (https://github.com/gtk-rs/gir-files) // from gst-gir-files (https://gitlab.freedesktop.org/gstreamer/gir-files-rs.git) // DO NOT EDIT use crate::{ffi, VideoCodecFrame}; use glib::{ prelude::*, signal::{connect_raw, SignalHandlerId}, translate::*, }; use std::boxed::Box as Box_; glib::wrapper! { #[doc(alias = "GstVideoEncoder")] pub struct VideoEncoder(Object) @extends gst::Element, gst::Object; match fn { type_ => || ffi::gst_video_encoder_get_type(), } } impl VideoEncoder { pub const NONE: Option<&'static VideoEncoder> = None; } unsafe impl Send for VideoEncoder {} unsafe impl Sync for VideoEncoder {} mod sealed { pub trait Sealed {} impl> Sealed for T {} } pub trait VideoEncoderExt: IsA + sealed::Sealed + 'static { #[doc(alias = "gst_video_encoder_allocate_output_buffer")] fn allocate_output_buffer(&self, size: usize) -> gst::Buffer { unsafe { from_glib_full(ffi::gst_video_encoder_allocate_output_buffer( self.as_ref().to_glib_none().0, size, )) } } #[cfg(feature = "v1_26")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_26")))] #[doc(alias = "gst_video_encoder_drop_frame")] fn drop_frame(&self, frame: VideoCodecFrame) { unsafe { ffi::gst_video_encoder_drop_frame( self.as_ref().to_glib_none().0, frame.into_glib_ptr(), ); } } #[doc(alias = "gst_video_encoder_finish_frame")] fn finish_frame(&self, frame: VideoCodecFrame) -> Result { unsafe { try_from_glib(ffi::gst_video_encoder_finish_frame( self.as_ref().to_glib_none().0, frame.into_glib_ptr(), )) } } #[doc(alias = "gst_video_encoder_get_max_encode_time")] #[doc(alias = "get_max_encode_time")] fn max_encode_time(&self, frame: &VideoCodecFrame) -> gst::ClockTimeDiff { unsafe { ffi::gst_video_encoder_get_max_encode_time( self.as_ref().to_glib_none().0, frame.to_glib_none().0, ) } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "gst_video_encoder_get_min_force_key_unit_interval")] #[doc(alias = "get_min_force_key_unit_interval")] #[doc(alias = "min-force-key-unit-interval")] fn min_force_key_unit_interval(&self) -> Option { unsafe { from_glib(ffi::gst_video_encoder_get_min_force_key_unit_interval( self.as_ref().to_glib_none().0, )) } } #[doc(alias = "gst_video_encoder_is_qos_enabled")] fn is_qos_enabled(&self) -> bool { unsafe { from_glib(ffi::gst_video_encoder_is_qos_enabled( self.as_ref().to_glib_none().0, )) } } #[doc(alias = "gst_video_encoder_merge_tags")] fn merge_tags(&self, tags: Option<&gst::TagList>, mode: gst::TagMergeMode) { unsafe { ffi::gst_video_encoder_merge_tags( self.as_ref().to_glib_none().0, tags.to_glib_none().0, mode.into_glib(), ); } } #[doc(alias = 
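// Illustrative configuration sketch for the `VideoDecoderExt` trait from
// src/auto/video_decoder.rs above (not part of the generated bindings; `decoder` stands for
// any object implementing `IsA<VideoDecoder>`, typically configured from a decoder subclass):
//
//     decoder.set_packetized(true);
//     decoder.set_needs_format(false);
//     decoder.set_max_errors(10);
//     let _pool = decoder.buffer_pool();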
"gst_video_encoder_proxy_getcaps")] fn proxy_getcaps(&self, caps: Option<&gst::Caps>, filter: Option<&gst::Caps>) -> gst::Caps { unsafe { from_glib_full(ffi::gst_video_encoder_proxy_getcaps( self.as_ref().to_glib_none().0, caps.to_glib_none().0, filter.to_glib_none().0, )) } } #[cfg(feature = "v1_26")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_26")))] #[doc(alias = "gst_video_encoder_release_frame")] fn release_frame(&self, frame: VideoCodecFrame) { unsafe { ffi::gst_video_encoder_release_frame( self.as_ref().to_glib_none().0, frame.into_glib_ptr(), ); } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "gst_video_encoder_set_min_force_key_unit_interval")] #[doc(alias = "min-force-key-unit-interval")] fn set_min_force_key_unit_interval(&self, interval: impl Into>) { unsafe { ffi::gst_video_encoder_set_min_force_key_unit_interval( self.as_ref().to_glib_none().0, interval.into().into_glib(), ); } } #[doc(alias = "gst_video_encoder_set_min_pts")] fn set_min_pts(&self, min_pts: impl Into>) { unsafe { ffi::gst_video_encoder_set_min_pts( self.as_ref().to_glib_none().0, min_pts.into().into_glib(), ); } } #[doc(alias = "gst_video_encoder_set_qos_enabled")] fn set_qos_enabled(&self, enabled: bool) { unsafe { ffi::gst_video_encoder_set_qos_enabled( self.as_ref().to_glib_none().0, enabled.into_glib(), ); } } fn is_qos(&self) -> bool { ObjectExt::property(self.as_ref(), "qos") } fn set_qos(&self, qos: bool) { ObjectExt::set_property(self.as_ref(), "qos", qos) } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "min-force-key-unit-interval")] fn connect_min_force_key_unit_interval_notify( &self, f: F, ) -> SignalHandlerId { unsafe extern "C" fn notify_min_force_key_unit_interval_trampoline< P: IsA, F: Fn(&P) + Send + Sync + 'static, >( this: *mut ffi::GstVideoEncoder, _param_spec: glib::ffi::gpointer, f: glib::ffi::gpointer, ) { let f: &F = &*(f as *const F); f(VideoEncoder::from_glib_borrow(this).unsafe_cast_ref()) } unsafe { let f: Box_ = Box_::new(f); connect_raw( self.as_ptr() as *mut _, b"notify::min-force-key-unit-interval\0".as_ptr() as *const _, Some(std::mem::transmute::<*const (), unsafe extern "C" fn()>( notify_min_force_key_unit_interval_trampoline:: as *const (), )), Box_::into_raw(f), ) } } #[doc(alias = "qos")] fn connect_qos_notify(&self, f: F) -> SignalHandlerId { unsafe extern "C" fn notify_qos_trampoline< P: IsA, F: Fn(&P) + Send + Sync + 'static, >( this: *mut ffi::GstVideoEncoder, _param_spec: glib::ffi::gpointer, f: glib::ffi::gpointer, ) { let f: &F = &*(f as *const F); f(VideoEncoder::from_glib_borrow(this).unsafe_cast_ref()) } unsafe { let f: Box_ = Box_::new(f); connect_raw( self.as_ptr() as *mut _, b"notify::qos\0".as_ptr() as *const _, Some(std::mem::transmute::<*const (), unsafe extern "C" fn()>( notify_qos_trampoline:: as *const (), )), Box_::into_raw(f), ) } } } impl> VideoEncoderExt for O {} gstreamer-video-0.23.5/src/auto/video_filter.rs000064400000000000000000000012301046102023000175640ustar 00000000000000// This file was generated by gir (https://github.com/gtk-rs/gir) // from gir-files (https://github.com/gtk-rs/gir-files) // from gst-gir-files (https://gitlab.freedesktop.org/gstreamer/gir-files-rs.git) // DO NOT EDIT use crate::ffi; glib::wrapper! 
{ #[doc(alias = "GstVideoFilter")] pub struct VideoFilter(Object) @extends gst_base::BaseTransform, gst::Element, gst::Object; match fn { type_ => || ffi::gst_video_filter_get_type(), } } impl VideoFilter { pub const NONE: Option<&'static VideoFilter> = None; } unsafe impl Send for VideoFilter {} unsafe impl Sync for VideoFilter {} gstreamer-video-0.23.5/src/auto/video_orientation.rs000064400000000000000000000110141046102023000206330ustar 00000000000000// This file was generated by gir (https://github.com/gtk-rs/gir) // from gir-files (https://github.com/gtk-rs/gir-files) // from gst-gir-files (https://gitlab.freedesktop.org/gstreamer/gir-files-rs.git) // DO NOT EDIT use crate::ffi; use glib::{prelude::*, translate::*}; glib::wrapper! { #[doc(alias = "GstVideoOrientation")] pub struct VideoOrientation(Interface); match fn { type_ => || ffi::gst_video_orientation_get_type(), } } impl VideoOrientation { pub const NONE: Option<&'static VideoOrientation> = None; } unsafe impl Send for VideoOrientation {} unsafe impl Sync for VideoOrientation {} mod sealed { pub trait Sealed {} impl> Sealed for T {} } pub trait VideoOrientationExt: IsA + sealed::Sealed + 'static { #[doc(alias = "gst_video_orientation_get_hcenter")] #[doc(alias = "get_hcenter")] fn hcenter(&self) -> Option { unsafe { let mut center = std::mem::MaybeUninit::uninit(); let ret = from_glib(ffi::gst_video_orientation_get_hcenter( self.as_ref().to_glib_none().0, center.as_mut_ptr(), )); if ret { Some(center.assume_init()) } else { None } } } #[doc(alias = "gst_video_orientation_get_hflip")] #[doc(alias = "get_hflip")] fn hflip(&self) -> Option { unsafe { let mut flip = std::mem::MaybeUninit::uninit(); let ret = from_glib(ffi::gst_video_orientation_get_hflip( self.as_ref().to_glib_none().0, flip.as_mut_ptr(), )); if ret { Some(from_glib(flip.assume_init())) } else { None } } } #[doc(alias = "gst_video_orientation_get_vcenter")] #[doc(alias = "get_vcenter")] fn vcenter(&self) -> Option { unsafe { let mut center = std::mem::MaybeUninit::uninit(); let ret = from_glib(ffi::gst_video_orientation_get_vcenter( self.as_ref().to_glib_none().0, center.as_mut_ptr(), )); if ret { Some(center.assume_init()) } else { None } } } #[doc(alias = "gst_video_orientation_get_vflip")] #[doc(alias = "get_vflip")] fn vflip(&self) -> Option { unsafe { let mut flip = std::mem::MaybeUninit::uninit(); let ret = from_glib(ffi::gst_video_orientation_get_vflip( self.as_ref().to_glib_none().0, flip.as_mut_ptr(), )); if ret { Some(from_glib(flip.assume_init())) } else { None } } } #[doc(alias = "gst_video_orientation_set_hcenter")] fn set_hcenter(&self, center: i32) -> Result<(), glib::error::BoolError> { unsafe { glib::result_from_gboolean!( ffi::gst_video_orientation_set_hcenter(self.as_ref().to_glib_none().0, center), "Failed to set horizontal centering" ) } } #[doc(alias = "gst_video_orientation_set_hflip")] fn set_hflip(&self, flip: bool) -> Result<(), glib::error::BoolError> { unsafe { glib::result_from_gboolean!( ffi::gst_video_orientation_set_hflip( self.as_ref().to_glib_none().0, flip.into_glib() ), "Failed to set horizontal flipping" ) } } #[doc(alias = "gst_video_orientation_set_vcenter")] fn set_vcenter(&self, center: i32) -> Result<(), glib::error::BoolError> { unsafe { glib::result_from_gboolean!( ffi::gst_video_orientation_set_vcenter(self.as_ref().to_glib_none().0, center), "Failed to set vertical centering" ) } } #[doc(alias = "gst_video_orientation_set_vflip")] fn set_vflip(&self, flip: bool) -> Result<(), glib::error::BoolError> { unsafe { 
glib::result_from_gboolean!( ffi::gst_video_orientation_set_vflip( self.as_ref().to_glib_none().0, flip.into_glib() ), "Failed to set vertical flipping" ) } } } impl> VideoOrientationExt for O {} gstreamer-video-0.23.5/src/auto/video_overlay.rs000064400000000000000000000051211046102023000177630ustar 00000000000000// This file was generated by gir (https://github.com/gtk-rs/gir) // from gir-files (https://github.com/gtk-rs/gir-files) // from gst-gir-files (https://gitlab.freedesktop.org/gstreamer/gir-files-rs.git) // DO NOT EDIT use crate::ffi; use glib::{prelude::*, translate::*}; glib::wrapper! { #[doc(alias = "GstVideoOverlay")] pub struct VideoOverlay(Interface); match fn { type_ => || ffi::gst_video_overlay_get_type(), } } impl VideoOverlay { pub const NONE: Option<&'static VideoOverlay> = None; //#[doc(alias = "gst_video_overlay_install_properties")] //pub fn install_properties(oclass: /*Ignored*/&mut glib::ObjectClass, last_prop_id: i32) { // unsafe { TODO: call ffi:gst_video_overlay_install_properties() } //} } unsafe impl Send for VideoOverlay {} unsafe impl Sync for VideoOverlay {} mod sealed { pub trait Sealed {} impl> Sealed for T {} } pub trait VideoOverlayExt: IsA + sealed::Sealed + 'static { #[doc(alias = "gst_video_overlay_expose")] fn expose(&self) { unsafe { ffi::gst_video_overlay_expose(self.as_ref().to_glib_none().0); } } //#[doc(alias = "gst_video_overlay_got_window_handle")] //fn got_window_handle(&self, handle: /*Unimplemented*/Basic: UIntPtr) { // unsafe { TODO: call ffi:gst_video_overlay_got_window_handle() } //} #[doc(alias = "gst_video_overlay_handle_events")] fn handle_events(&self, handle_events: bool) { unsafe { ffi::gst_video_overlay_handle_events( self.as_ref().to_glib_none().0, handle_events.into_glib(), ); } } #[doc(alias = "gst_video_overlay_prepare_window_handle")] fn prepare_window_handle(&self) { unsafe { ffi::gst_video_overlay_prepare_window_handle(self.as_ref().to_glib_none().0); } } #[doc(alias = "gst_video_overlay_set_render_rectangle")] fn set_render_rectangle( &self, x: i32, y: i32, width: i32, height: i32, ) -> Result<(), glib::error::BoolError> { unsafe { glib::result_from_gboolean!( ffi::gst_video_overlay_set_render_rectangle( self.as_ref().to_glib_none().0, x, y, width, height ), "Failed to set render rectangle" ) } } } impl> VideoOverlayExt for O {} gstreamer-video-0.23.5/src/auto/video_sink.rs000064400000000000000000000045271046102023000172570ustar 00000000000000// This file was generated by gir (https://github.com/gtk-rs/gir) // from gir-files (https://github.com/gtk-rs/gir-files) // from gst-gir-files (https://gitlab.freedesktop.org/gstreamer/gir-files-rs.git) // DO NOT EDIT #![allow(deprecated)] use crate::ffi; use glib::{ prelude::*, signal::{connect_raw, SignalHandlerId}, translate::*, }; use std::boxed::Box as Box_; glib::wrapper! 
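// Illustrative usage sketch for the `VideoOverlayExt` trait from src/auto/video_overlay.rs
// above (not part of the generated bindings; `overlay` stands for any object implementing
// `IsA<VideoOverlay>`, e.g. a video sink embedded into an application window):
//
//     overlay.handle_events(false);
//     overlay.set_render_rectangle(0, 0, 1280, 720).unwrap();
//     overlay.expose();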
{ #[doc(alias = "GstVideoSink")] pub struct VideoSink(Object) @extends gst_base::BaseSink, gst::Element, gst::Object; match fn { type_ => || ffi::gst_video_sink_get_type(), } } impl VideoSink { pub const NONE: Option<&'static VideoSink> = None; } unsafe impl Send for VideoSink {} unsafe impl Sync for VideoSink {} mod sealed { pub trait Sealed {} impl> Sealed for T {} } pub trait VideoSinkExt: IsA + sealed::Sealed + 'static { #[doc(alias = "show-preroll-frame")] fn shows_preroll_frame(&self) -> bool { ObjectExt::property(self.as_ref(), "show-preroll-frame") } #[doc(alias = "show-preroll-frame")] fn set_show_preroll_frame(&self, show_preroll_frame: bool) { ObjectExt::set_property(self.as_ref(), "show-preroll-frame", show_preroll_frame) } #[doc(alias = "show-preroll-frame")] fn connect_show_preroll_frame_notify( &self, f: F, ) -> SignalHandlerId { unsafe extern "C" fn notify_show_preroll_frame_trampoline< P: IsA, F: Fn(&P) + Send + Sync + 'static, >( this: *mut ffi::GstVideoSink, _param_spec: glib::ffi::gpointer, f: glib::ffi::gpointer, ) { let f: &F = &*(f as *const F); f(VideoSink::from_glib_borrow(this).unsafe_cast_ref()) } unsafe { let f: Box_ = Box_::new(f); connect_raw( self.as_ptr() as *mut _, b"notify::show-preroll-frame\0".as_ptr() as *const _, Some(std::mem::transmute::<*const (), unsafe extern "C" fn()>( notify_show_preroll_frame_trampoline:: as *const (), )), Box_::into_raw(f), ) } } } impl> VideoSinkExt for O {} gstreamer-video-0.23.5/src/caps.rs000064400000000000000000000456271046102023000151110ustar 00000000000000use std::ops::{Bound::*, RangeBounds}; use glib::translate::*; use gst::Caps; use crate::VideoFormat; pub struct VideoCapsBuilder { builder: gst::caps::Builder, } impl VideoCapsBuilder { // rustdoc-stripper-ignore-next /// Constructs an `VideoCapsBuilder` for the "video/x-raw" encoding. /// /// If left unchanged, the resulting `Caps` will be initialized with: /// - "video/x-raw" encoding. /// - all available formats. /// - maximum width range. /// - maximum height range. /// /// Use [`VideoCapsBuilder::for_encoding`] to specify another encoding. pub fn new() -> Self { assert_initialized_main_thread!(); let builder = Caps::builder(glib::gstr!("video/x-raw")); let builder = VideoCapsBuilder { builder }; builder .format_list(VideoFormat::iter_raw()) .width_range(..) .height_range(..) .framerate_range(..) } // rustdoc-stripper-ignore-next /// Constructs an `VideoCapsBuilder` for the specified encoding. /// /// The resulting `Caps` will use the `encoding` argument as name /// and will not contain any additional fields unless explicitly added. 
pub fn for_encoding(encoding: impl IntoGStr) -> Self { assert_initialized_main_thread!(); VideoCapsBuilder { builder: Caps::builder(encoding), } } pub fn any_features(self) -> VideoCapsBuilder { VideoCapsBuilder { builder: self.builder.any_features(), } } pub fn features( self, features: impl IntoIterator, ) -> VideoCapsBuilder { VideoCapsBuilder { builder: self.builder.features(features), } } } impl Default for VideoCapsBuilder { fn default() -> Self { Self::new() } } impl VideoCapsBuilder { pub fn format(self, format: VideoFormat) -> Self { Self { builder: self.builder.field(glib::gstr!("format"), format.to_str()), } } pub fn format_if(self, format: VideoFormat, predicate: bool) -> Self { if predicate { self.format(format) } else { self } } pub fn format_if_some(self, format: Option) -> Self { if let Some(format) = format { self.format(format) } else { self } } pub fn format_list(self, formats: impl IntoIterator) -> Self { Self { builder: self.builder.field( glib::gstr!("format"), gst::List::new(formats.into_iter().map(|f| f.to_str())), ), } } pub fn format_list_if( self, formats: impl IntoIterator, predicate: bool, ) -> Self { if predicate { self.format_list(formats) } else { self } } pub fn format_list_if_some( self, formats: Option>, ) -> Self { if let Some(formats) = formats { self.format_list(formats) } else { self } } pub fn format_list_if_not_empty(self, formats: impl IntoIterator) -> Self { let mut formats = formats.into_iter().peekable(); if formats.peek().is_some() { self.format_list(formats) } else { self } } pub fn width(self, width: i32) -> Self { Self { builder: self.builder.field(glib::gstr!("width"), width), } } pub fn width_if(self, width: i32, predicate: bool) -> Self { if predicate { self.width(width) } else { self } } pub fn width_if_some(self, width: Option) -> Self { if let Some(width) = width { self.width(width) } else { self } } pub fn width_range(self, widths: impl RangeBounds) -> Self { let (start, end) = range_bounds_i32_start_end(widths); let gst_widths: gst::IntRange = gst::IntRange::new(start, end); Self { builder: self.builder.field(glib::gstr!("width"), gst_widths), } } pub fn width_range_if_some(self, widths: Option>) -> Self { if let Some(widths) = widths { self.width_range(widths) } else { self } } pub fn width_range_if(self, widths: impl RangeBounds, predicate: bool) -> Self { if predicate { self.width_range(widths) } else { self } } pub fn width_list(self, widths: impl IntoIterator) -> Self { Self { builder: self .builder .field(glib::gstr!("width"), gst::List::new(widths)), } } pub fn width_list_if(self, widths: impl IntoIterator, predicate: bool) -> Self { if predicate { self.width_list(widths) } else { self } } pub fn width_list_if_some(self, widths: Option>) -> Self { if let Some(widths) = widths { self.width_list(widths) } else { self } } pub fn width_list_if_not_empty(self, widths: impl IntoIterator) -> Self { let mut widths = widths.into_iter().peekable(); if widths.peek().is_some() { self.width_list(widths) } else { self } } pub fn height(self, height: i32) -> Self { Self { builder: self.builder.field(glib::gstr!("height"), height), } } pub fn height_if(self, height: i32, predicate: bool) -> Self { if predicate { self.height(height) } else { self } } pub fn height_if_some(self, height: Option) -> Self { if let Some(height) = height { self.height(height) } else { self } } pub fn height_range(self, heights: impl RangeBounds) -> Self { let (start, end) = range_bounds_i32_start_end(heights); let gst_heights: gst::IntRange = 
gst::IntRange::new(start, end); Self { builder: self.builder.field(glib::gstr!("height"), gst_heights), } } pub fn height_range_if(self, heights: impl RangeBounds, predicate: bool) -> Self { if predicate { self.height_range(heights) } else { self } } pub fn height_range_if_some(self, heights: Option>) -> Self { if let Some(heights) = heights { self.height_range(heights) } else { self } } pub fn height_list(self, heights: impl IntoIterator) -> Self { Self { builder: self .builder .field(glib::gstr!("height"), gst::List::new(heights)), } } pub fn height_list_if(self, heights: impl IntoIterator, predicate: bool) -> Self { if predicate { self.height_list(heights) } else { self } } pub fn height_list_if_some(self, heights: Option>) -> Self { if let Some(heights) = heights { self.height_list(heights) } else { self } } pub fn height_list_if_not_empty(self, heights: impl IntoIterator) -> Self { let mut heights = heights.into_iter().peekable(); if heights.peek().is_some() { self.height_list(heights) } else { self } } pub fn framerate(self, framerate: gst::Fraction) -> Self { Self { builder: self.builder.field(glib::gstr!("framerate"), framerate), } } pub fn framerate_if(self, framerate: gst::Fraction, predicate: bool) -> Self { if predicate { self.framerate(framerate) } else { self } } pub fn framerate_if_some(self, framerate: Option) -> Self { if let Some(framerate) = framerate { self.framerate(framerate) } else { self } } pub fn framerate_range(self, framerates: impl RangeBounds) -> Self { let start = match framerates.start_bound() { Unbounded => gst::Fraction::new(0, 1), Excluded(n) => next_fraction(*n), Included(n) => { assert!(n.numer() >= 0); *n } }; let end = match framerates.end_bound() { Unbounded => gst::Fraction::new(i32::MAX, 1), Excluded(n) => previous_fraction(*n), Included(n) => { assert!(n.numer() >= 0); *n } }; assert!(start <= end); let framerates: gst::FractionRange = gst::FractionRange::new(start, end); Self { builder: self.builder.field(glib::gstr!("framerate"), framerates), } } pub fn framerate_range_if( self, framerates: impl RangeBounds, predicate: bool, ) -> Self { if predicate { self.framerate_range(framerates) } else { self } } pub fn framerate_range_if_some( self, framerates: Option>, ) -> Self { if let Some(framerates) = framerates { self.framerate_range(framerates) } else { self } } pub fn framerate_list(self, framerates: impl IntoIterator) -> Self { Self { builder: self .builder .field(glib::gstr!("framerate"), gst::List::new(framerates)), } } pub fn framerate_list_if( self, framerates: impl IntoIterator, predicate: bool, ) -> Self { if predicate { self.framerate_list(framerates) } else { self } } pub fn framerate_list_if_some( self, framerates: Option>, ) -> Self { if let Some(framerates) = framerates { self.framerate_list(framerates) } else { self } } pub fn framerate_list_if_not_empty( self, framerates: impl IntoIterator, ) -> Self { let mut framerates = framerates.into_iter().peekable(); if framerates.peek().is_some() { self.framerate_list(framerates) } else { self } } pub fn pixel_aspect_ratio(self, pixel_aspect_ratio: gst::Fraction) -> Self { Self { builder: self.builder.field("pixel-aspect-ratio", pixel_aspect_ratio), } } pub fn pixel_aspect_ratio_if(self, pixel_aspect_ratio: gst::Fraction, predicate: bool) -> Self { if predicate { self.pixel_aspect_ratio(pixel_aspect_ratio) } else { self } } pub fn pixel_aspect_ratio_if_some(self, pixel_aspect_ratio: Option) -> Self { if let Some(pixel_aspect_ratio) = pixel_aspect_ratio { 
self.pixel_aspect_ratio(pixel_aspect_ratio) } else { self } } pub fn pixel_aspect_ratio_range( self, pixel_aspect_ratios: impl RangeBounds, ) -> Self { let start = match pixel_aspect_ratios.start_bound() { Unbounded => gst::Fraction::new(1, i32::MAX), Excluded(n) => next_fraction(*n), Included(n) => { assert!(n.numer() >= 0); *n } }; let end = match pixel_aspect_ratios.end_bound() { Unbounded => gst::Fraction::new(i32::MAX, 1), Excluded(n) => previous_fraction(*n), Included(n) => { assert!(n.numer() >= 0); *n } }; assert!(start <= end); let pixel_aspect_ratios: gst::FractionRange = gst::FractionRange::new(start, end); Self { builder: self .builder .field("pixel-aspect-ratio", pixel_aspect_ratios), } } pub fn pixel_aspect_ratio_range_if( self, pixel_aspect_ratios: impl RangeBounds, predicate: bool, ) -> Self { if predicate { self.pixel_aspect_ratio_range(pixel_aspect_ratios) } else { self } } pub fn pixel_aspect_ratio_range_if_some( self, pixel_aspect_ratios: Option>, ) -> Self { if let Some(pixel_aspect_ratios) = pixel_aspect_ratios { self.pixel_aspect_ratio_range(pixel_aspect_ratios) } else { self } } pub fn pixel_aspect_ratio_list( self, pixel_aspect_ratios: impl IntoIterator, ) -> Self { Self { builder: self .builder .field("pixel-aspect-ratio", gst::List::new(pixel_aspect_ratios)), } } pub fn pixel_aspect_ratio_list_if( self, pixel_aspect_ratios: impl IntoIterator, predicate: bool, ) -> Self { if predicate { self.pixel_aspect_ratio_list(pixel_aspect_ratios) } else { self } } pub fn pixel_aspect_ratio_list_if_some( self, pixel_aspect_ratios: Option>, ) -> Self { if let Some(pixel_aspect_ratios) = pixel_aspect_ratios { self.pixel_aspect_ratio_list(pixel_aspect_ratios) } else { self } } pub fn pixel_aspect_ratio_list_if_not_empty( self, pixel_aspect_ratios: impl IntoIterator, ) -> Self { let mut pixel_aspect_ratios = pixel_aspect_ratios.into_iter().peekable(); if pixel_aspect_ratios.peek().is_some() { self.pixel_aspect_ratio_list(pixel_aspect_ratios) } else { self } } // rustdoc-stripper-ignore-next /// Sets field `name` to the given value `value`. /// /// Overrides any default or previously defined value for `name`. #[inline] pub fn field(self, name: impl IntoGStr, value: impl Into + Send) -> Self { Self { builder: self.builder.field(name, value), } } gst::impl_builder_gvalue_extra_setters!(field); #[must_use] pub fn build(self) -> gst::Caps { self.builder.build() } } fn range_bounds_i32_start_end(range: impl RangeBounds) -> (i32, i32) { skip_assert_initialized!(); let start = match range.start_bound() { Unbounded => 1, Excluded(n) => n + 1, Included(n) => *n, }; let end = match range.end_bound() { Unbounded => i32::MAX, Excluded(n) => n - 1, Included(n) => *n, }; (start, end) } // https://math.stackexchange.com/questions/39582/how-to-compute-next-previous-representable-rational-number/3798608#3798608 /* Extended Euclidean Algorithm: computes (g, x, y), * such that a*x + b*y = g = gcd(a, b) >= 0. */ fn xgcd(mut a: i64, mut b: i64) -> (i64, i64, i64) { skip_assert_initialized!(); let mut x0 = 0i64; let mut x1 = 1i64; let mut y0 = 1i64; let mut y1 = 0i64; while a != 0 { let q; (q, a, b) = (b / a, b % a, a); (y0, y1) = (y1, y0 - q * y1); (x0, x1) = (x1, x0 - q * x1); } if b >= 0 { (b, x0, y0) } else { (-b, -x0, -y0) } } /* Computes the neighbours of p/q in the Farey sequence of order n. 
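 *
 * Sketch of the identities the code below relies on (assuming p/q has already
 * been reduced to lowest terms): the left neighbour a/b and the right
 * neighbour c/d of p/q in a Farey sequence satisfy
 *
 *     p*b - q*a = 1    and    q*c - p*d = 1.
 *
 * With r the Bezout coefficient returned by xgcd() (so p*r = 1 mod q), b is
 * picked as the largest denominator <= n that is congruent to r mod q, and d
 * as the largest denominator <= n congruent to -r mod q, with n = i32::MAX.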
*/ fn farey_neighbours(p: i32, q: i32) -> (i32, i32, i32, i32) { skip_assert_initialized!(); let n = i32::MAX as i64; assert!(q != 0); let mut p = p as i64; let mut q = q as i64; if q < 0 { p = -p; q = -q; } let (g, r, _) = xgcd(p, q); p /= g; q /= g; let b = ((n - r) / q) * q + r; let a = (b * p - 1) / q; let d = ((n + r) / q) * q - r; let c = (d * p + 1) / q; (a as i32, b as i32, c as i32, d as i32) } fn previous_fraction(fraction: gst::Fraction) -> gst::Fraction { skip_assert_initialized!(); let num = fraction.numer(); let den = fraction.denom(); let (new_num, new_den); if num < den { (new_num, new_den, _, _) = farey_neighbours(num, den); } else { (_, _, new_den, new_num) = farey_neighbours(den, num); } gst::Fraction::new(new_num, new_den) } fn next_fraction(fraction: gst::Fraction) -> gst::Fraction { skip_assert_initialized!(); let num = fraction.numer(); let den = fraction.denom(); let (new_num, new_den); if num < den { (_, _, new_num, new_den) = farey_neighbours(num, den); } else { (new_den, new_num, _, _) = farey_neighbours(den, num); } gst::Fraction::new(new_num, new_den) } #[cfg(test)] mod tests { use super::{next_fraction, previous_fraction, VideoCapsBuilder}; #[test] fn default_encoding() { gst::init().unwrap(); let caps = VideoCapsBuilder::new().build(); assert_eq!(caps.structure(0).unwrap().name(), "video/x-raw"); } #[test] fn explicit_encoding() { gst::init().unwrap(); let caps = VideoCapsBuilder::for_encoding("video/mpeg").build(); assert_eq!(caps.structure(0).unwrap().name(), "video/mpeg"); } #[test] fn test_0_1_fraction() { gst::init().unwrap(); let zero_over_one = gst::Fraction::new(0, 1); let prev = previous_fraction(zero_over_one); assert_eq!(prev.numer(), -1); assert_eq!(prev.denom(), i32::MAX); let next = next_fraction(zero_over_one); assert_eq!(next.numer(), 1); assert_eq!(next.denom(), i32::MAX); } #[test] fn test_25_1() { gst::init().unwrap(); let twentyfive = gst::Fraction::new(25, 1); let next = next_fraction(twentyfive); //25.000000011641532 assert_eq!(next.numer(), 2147483626); assert_eq!(next.denom(), 85899345); let prev = previous_fraction(twentyfive); //24.999999988358468 assert_eq!(prev.numer(), 2147483624); assert_eq!(prev.denom(), 85899345); } #[test] fn test_1_25() { gst::init().unwrap(); let twentyfive = gst::Fraction::new(1, 25); let next = next_fraction(twentyfive); //0.040000000018626 assert_eq!(next.numer(), 85899345); assert_eq!(next.denom(), 2147483624); let prev = previous_fraction(twentyfive); //0.039999999981374 assert_eq!(prev.numer(), 85899345); assert_eq!(prev.denom(), 2147483626); } } gstreamer-video-0.23.5/src/caps_features.rs000064400000000000000000000040061046102023000167710ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. 
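// Usage sketch (hypothetical element requiring `GstVideoMeta` on its caps;
// `gst_video` is the conventional rename of this crate):
//
//     let caps = gst_video::VideoCapsBuilder::new()
//         .features([gst_video::CAPS_FEATURE_META_GST_VIDEO_META])
//         .build();
//
// The `CAPS_FEATURES_*` statics below hold the same feature names pre-wrapped
// in ready-to-use `gst::CapsFeatures` values.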
use crate::ffi; use gst::CapsFeatures; use once_cell::sync::Lazy; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub static CAPS_FEATURE_FORMAT_INTERLACED: &glib::GStr = unsafe { glib::GStr::from_utf8_with_nul_unchecked(ffi::GST_CAPS_FEATURE_FORMAT_INTERLACED) }; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub static CAPS_FEATURES_FORMAT_INTERLACED: Lazy = Lazy::new(|| CapsFeatures::new([CAPS_FEATURE_FORMAT_INTERLACED])); pub static CAPS_FEATURE_META_GST_VIDEO_AFFINE_TRANSFORMATION_META: &glib::GStr = unsafe { glib::GStr::from_utf8_with_nul_unchecked( ffi::GST_CAPS_FEATURE_META_GST_VIDEO_AFFINE_TRANSFORMATION_META, ) }; pub static CAPS_FEATURES_META_GST_VIDEO_AFFINE_TRANSFORMATION_META: Lazy = Lazy::new(|| CapsFeatures::new([CAPS_FEATURE_META_GST_VIDEO_AFFINE_TRANSFORMATION_META])); pub static CAPS_FEATURE_META_GST_VIDEO_GL_TEXTURE_UPLOAD_META: &glib::GStr = unsafe { glib::GStr::from_utf8_with_nul_unchecked( ffi::GST_CAPS_FEATURE_META_GST_VIDEO_GL_TEXTURE_UPLOAD_META, ) }; pub static CAPS_FEATURES_META_GST_VIDEO_GL_TEXTURE_UPLOAD_META: Lazy = Lazy::new(|| CapsFeatures::new([CAPS_FEATURE_META_GST_VIDEO_GL_TEXTURE_UPLOAD_META])); pub static CAPS_FEATURE_META_GST_VIDEO_META: &glib::GStr = unsafe { glib::GStr::from_utf8_with_nul_unchecked(ffi::GST_CAPS_FEATURE_META_GST_VIDEO_META) }; pub static CAPS_FEATURES_META_GST_VIDEO_META: Lazy = Lazy::new(|| CapsFeatures::new([CAPS_FEATURE_META_GST_VIDEO_META])); pub static CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION: &glib::GStr = unsafe { glib::GStr::from_utf8_with_nul_unchecked( ffi::GST_CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, ) }; pub static CAPS_FEATURES_META_GST_VIDEO_OVERLAY_COMPOSITION: Lazy = Lazy::new(|| CapsFeatures::new([CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION])); gstreamer-video-0.23.5/src/color_balance_channel.rs000064400000000000000000000007251046102023000204240ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. use glib::{prelude::*, translate::*}; use crate::ColorBalanceChannel; impl ColorBalanceChannel { pub fn label(&self) -> glib::GString { unsafe { from_glib_none((*self.as_ptr()).label) } } pub fn min_value(&self) -> i32 { unsafe { (*self.as_ptr()).min_value } } pub fn max_value(&self) -> i32 { unsafe { (*self.as_ptr()).max_value } } } gstreamer-video-0.23.5/src/enums.rs000064400000000000000000000030011046102023000152660ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. 
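// Usage sketch for the helpers in this file (`caps` and `tags` are hypothetical
// values obtained elsewhere; both helpers are feature-gated as marked below):
//
//     // Identify the closed caption variant described by some caps:
//     let caption_type = gst_video::VideoCaptionType::from_caps(&caps);
//
//     // Derive the orientation correction from an image/video tag list:
//     if let Some(method) = gst_video::VideoOrientationMethod::from_tag(&tags) {
//         // apply `method`, e.g. by configuring a videoflip element
//     }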
#[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] use glib::translate::*; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] use crate::VideoCaptionType; #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] use crate::VideoOrientationMethod; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl VideoCaptionType { #[doc(alias = "gst_video_caption_type_from_caps")] pub fn from_caps(caps: &gst::CapsRef) -> VideoCaptionType { skip_assert_initialized!(); unsafe { from_glib(crate::ffi::gst_video_caption_type_from_caps(caps.as_ptr())) } } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] impl VideoOrientationMethod { #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "gst_video_orientation_from_tag")] pub fn from_tag(taglist: &gst::TagListRef) -> Option { skip_assert_initialized!(); unsafe { use std::mem; let mut method = mem::MaybeUninit::uninit(); let ret = from_glib(crate::ffi::gst_video_orientation_from_tag( mut_override(taglist.as_ptr()), method.as_mut_ptr(), )); if ret { Some(from_glib(method.assume_init())) } else { None } } } } gstreamer-video-0.23.5/src/flag_serde.rs000064400000000000000000000202621046102023000162420ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. use glib::{ prelude::*, translate::{from_glib, ToGlibPtr}, FlagsClass, }; use gst::bitflags_serde_impl; bitflags_serde_impl!(crate::NavigationModifierType, "v1_22"); bitflags_serde_impl!(crate::VideoBufferFlags); bitflags_serde_impl!(crate::VideoChromaSite); bitflags_serde_impl!(crate::VideoCodecFrameFlags, "v1_20"); bitflags_serde_impl!(crate::VideoDecoderRequestSyncPointFlags, "v1_20"); bitflags_serde_impl!(crate::VideoFlags); bitflags_serde_impl!(crate::VideoFormatFlags); bitflags_serde_impl!(crate::VideoFrameFlags); bitflags_serde_impl!(crate::VideoMultiviewFlags); bitflags_serde_impl!(crate::VideoOverlayFormatFlags, "v1_16"); bitflags_serde_impl!(crate::VideoPackFlags); bitflags_serde_impl!(crate::VideoTimeCodeFlags, "v1_18"); #[cfg(test)] mod tests { macro_rules! check_serialize { ($flags:expr, $expected:expr) => { let actual = serde_json::to_string(&$flags).unwrap(); assert_eq!(actual, $expected); }; } macro_rules! check_deserialize { ($ty:ty, $expected:expr, $json:expr) => { let actual: $ty = serde_json::from_str(&$json).unwrap(); assert_eq!(actual, $expected); }; } macro_rules! 
check_roundtrip { ($ty:ty, $flags:expr) => { let json = serde_json::to_string(&$flags).unwrap(); let deserialized: $ty = serde_json::from_str(&json).unwrap(); assert_eq!(deserialized, $flags); }; } #[test] fn test_serialize() { gst::init().unwrap(); #[cfg(feature = "v1_22")] check_serialize!( crate::NavigationModifierType::all(), concat!( "\"shift-mask+lock-mask+control-mask+mod1-mask+mod2-mask+mod3-mask", "+mod4-mask+mod5-mask+button1-mask+button2-mask+button3-mask", "+button4-mask+button5-mask+super-mask+hyper-mask+meta-mask\"" ) ); #[cfg(feature = "v1_18")] check_serialize!( crate::VideoBufferFlags::all(), "\"interlaced+tff+rff+onefield+multiple-view+first-in-bundle+marker\"" ); check_serialize!( crate::VideoChromaSite::all(), "\"none+h-cosited+v-cosited+alt-line\"" ); #[cfg(feature = "v1_20")] check_serialize!( crate::VideoCodecFrameFlags::all(), "\"decode-only+sync-point+force-keyframe+force-keyframe-headers+corrupted\"" ); #[cfg(feature = "v1_20")] check_serialize!( crate::VideoDecoderRequestSyncPointFlags::all(), "\"discard-input+corrupt-output\"" ); check_serialize!( crate::VideoFlags::all(), "\"variable-fps+premultiplied-alpha\"" ); #[cfg(feature = "v1_22")] check_serialize!( crate::VideoFormatFlags::all(), "\"yuv+rgb+gray+alpha+le+palette+complex+unpack+tiled+subtiles\"" ); check_serialize!( crate::VideoFrameFlags::all(), "\"interlaced+tff+rff+onefield+multiple-view+first-in-bundle\"" ); check_serialize!( crate::VideoMultiviewFlags::all(), concat!( "\"right-view-first+left-flipped+left-flopped+right-flipped", "+right-flopped+half-aspect+mixed-mono\"" ) ); #[cfg(feature = "v1_16")] check_serialize!( crate::VideoOverlayFormatFlags::all(), "\"premultiplied-alpha+global-alpha\"" ); check_serialize!( crate::VideoPackFlags::all(), "\"truncate-range+interlaced\"" ); #[cfg(feature = "v1_18")] check_serialize!( crate::VideoTimeCodeFlags::all(), "\"drop-frame+interlaced\"" ); } #[test] fn test_deserialize() { gst::init().unwrap(); #[cfg(feature = "v1_22")] check_deserialize!( crate::NavigationModifierType, crate::NavigationModifierType::all(), concat!( "\"shift-mask+lock-mask+control-mask+mod1-mask+mod2-mask", "+mod3-mask+mod4-mask+mod5-mask+button1-mask", "+button2-mask+button3-mask+button4-mask+button5-mask", "+super-mask+hyper-mask+meta-mask\"" ) ); #[cfg(feature = "v1_18")] check_deserialize!( crate::VideoBufferFlags, crate::VideoBufferFlags::all(), "\"interlaced+tff+rff+onefield+multiple-view+first-in-bundle+marker\"" ); check_deserialize!( crate::VideoChromaSite, crate::VideoChromaSite::all(), "\"none+h-cosited+v-cosited+alt-line\"" ); #[cfg(feature = "v1_20")] check_deserialize!( crate::VideoCodecFrameFlags, crate::VideoCodecFrameFlags::all(), "\"decode-only+sync-point+force-keyframe+force-keyframe-headers+corrupted\"" ); #[cfg(feature = "v1_20")] check_deserialize!( crate::VideoDecoderRequestSyncPointFlags, crate::VideoDecoderRequestSyncPointFlags::all(), "\"discard-input+corrupt-output\"" ); check_deserialize!( crate::VideoFlags, crate::VideoFlags::all(), "\"variable-fps+premultiplied-alpha\"" ); #[cfg(feature = "v1_22")] check_deserialize!( crate::VideoFormatFlags, crate::VideoFormatFlags::all(), "\"yuv+rgb+gray+alpha+le+palette+complex+unpack+tiled+subtiles\"" ); check_deserialize!( crate::VideoFrameFlags, crate::VideoFrameFlags::all(), "\"interlaced+tff+rff+onefield+multiple-view+first-in-bundle\"" ); check_deserialize!( crate::VideoMultiviewFlags, crate::VideoMultiviewFlags::all(), concat!( "\"right-view-first+left-flipped+left-flopped+right-flipped", 
"+right-flopped+half-aspect+mixed-mono\"" ) ); #[cfg(feature = "v1_16")] check_deserialize!( crate::VideoOverlayFormatFlags, crate::VideoOverlayFormatFlags::all(), "\"premultiplied-alpha+global-alpha\"" ); check_deserialize!( crate::VideoPackFlags, crate::VideoPackFlags::all(), "\"truncate-range+interlaced\"" ); #[cfg(feature = "v1_18")] check_deserialize!( crate::VideoTimeCodeFlags, crate::VideoTimeCodeFlags::all(), "\"drop-frame+interlaced\"" ); } #[test] fn test_serde_roundtrip() { gst::init().unwrap(); #[cfg(feature = "v1_22")] check_roundtrip!( crate::NavigationModifierType, crate::NavigationModifierType::all() ); #[cfg(feature = "v1_18")] check_roundtrip!(crate::VideoBufferFlags, crate::VideoBufferFlags::all()); check_roundtrip!(crate::VideoChromaSite, crate::VideoChromaSite::all()); #[cfg(feature = "v1_20")] check_roundtrip!( crate::VideoCodecFrameFlags, crate::VideoCodecFrameFlags::all() ); #[cfg(feature = "v1_20")] check_roundtrip!( crate::VideoDecoderRequestSyncPointFlags, crate::VideoDecoderRequestSyncPointFlags::all() ); check_roundtrip!(crate::VideoFlags, crate::VideoFlags::all()); #[cfg(feature = "v1_22")] check_roundtrip!(crate::VideoFormatFlags, crate::VideoFormatFlags::all()); check_roundtrip!(crate::VideoFrameFlags, crate::VideoFrameFlags::all()); check_roundtrip!( crate::VideoMultiviewFlags, crate::VideoMultiviewFlags::all() ); #[cfg(feature = "v1_16")] check_roundtrip!( crate::VideoOverlayFormatFlags, crate::VideoOverlayFormatFlags::all() ); check_roundtrip!(crate::VideoPackFlags, crate::VideoPackFlags::all()); #[cfg(feature = "v1_18")] check_roundtrip!(crate::VideoTimeCodeFlags, crate::VideoTimeCodeFlags::all()); } } gstreamer-video-0.23.5/src/functions.rs000064400000000000000000000235021046102023000161570ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. 
use std::{mem, ptr}; use crate::ffi; use glib::translate::{from_glib, from_glib_full, IntoGlib, ToGlibPtr}; #[doc(alias = "gst_video_convert_sample")] pub fn convert_sample( sample: &gst::Sample, caps: &gst::Caps, timeout: gst::ClockTime, ) -> Result { skip_assert_initialized!(); unsafe { let mut error = ptr::null_mut(); let ret = ffi::gst_video_convert_sample( sample.to_glib_none().0, caps.to_glib_none().0, timeout.into_glib(), &mut error, ); if error.is_null() { Ok(from_glib_full(ret)) } else { Err(from_glib_full(error)) } } } pub fn convert_sample_async( sample: &gst::Sample, caps: &gst::Caps, timeout: Option, func: F, ) where F: FnOnce(Result) + Send + 'static, { skip_assert_initialized!(); unsafe { convert_sample_async_unsafe(sample, caps, timeout, func) } } pub fn convert_sample_async_local( sample: &gst::Sample, caps: &gst::Caps, timeout: Option, func: F, ) where F: FnOnce(Result) + 'static, { skip_assert_initialized!(); unsafe { let ctx = glib::MainContext::ref_thread_default(); let _acquire = ctx .acquire() .expect("thread default main context already acquired by another thread"); let func = glib::thread_guard::ThreadGuard::new(func); convert_sample_async_unsafe(sample, caps, timeout, move |res| (func.into_inner())(res)) } } unsafe fn convert_sample_async_unsafe( sample: &gst::Sample, caps: &gst::Caps, timeout: Option, func: F, ) where F: FnOnce(Result) + 'static, { unsafe extern "C" fn convert_sample_async_trampoline( sample: *mut gst::ffi::GstSample, error: *mut glib::ffi::GError, user_data: glib::ffi::gpointer, ) where F: FnOnce(Result) + 'static, { let callback: &mut Option = &mut *(user_data as *mut Option); let callback = callback.take().unwrap(); if error.is_null() { callback(Ok(from_glib_full(sample))) } else { callback(Err(from_glib_full(error))) } } unsafe extern "C" fn convert_sample_async_free(user_data: glib::ffi::gpointer) where F: FnOnce(Result) + 'static, { let _: Box> = Box::from_raw(user_data as *mut _); } let user_data: Box> = Box::new(Some(func)); ffi::gst_video_convert_sample_async( sample.to_glib_none().0, caps.to_glib_none().0, timeout.into_glib(), Some(convert_sample_async_trampoline::), Box::into_raw(user_data) as glib::ffi::gpointer, Some(convert_sample_async_free::), ); } pub fn convert_sample_future( sample: &gst::Sample, caps: &gst::Caps, timeout: Option, ) -> std::pin::Pin> + 'static>> { skip_assert_initialized!(); use futures_channel::oneshot; let (sender, receiver) = oneshot::channel(); let sample = sample.clone(); let caps = caps.clone(); let future = async move { assert!( glib::MainContext::ref_thread_default().is_owner(), "Spawning futures only allowed if the thread is owning the MainContext" ); convert_sample_async(&sample, &caps, timeout, move |res| { let _ = sender.send(res); }); receiver .await .expect("Sender dropped before callback was called") }; Box::pin(future) } #[doc(alias = "gst_video_calculate_display_ratio")] pub fn calculate_display_ratio( video_width: u32, video_height: u32, video_par: gst::Fraction, display_par: gst::Fraction, ) -> Option { skip_assert_initialized!(); unsafe { let mut dar_n = mem::MaybeUninit::uninit(); let mut dar_d = mem::MaybeUninit::uninit(); let res: bool = from_glib(ffi::gst_video_calculate_display_ratio( dar_n.as_mut_ptr(), dar_d.as_mut_ptr(), video_width, video_height, video_par.numer() as u32, video_par.denom() as u32, display_par.numer() as u32, display_par.denom() as u32, )); if res { Some(gst::Fraction::new( dar_n.assume_init() as i32, dar_d.assume_init() as i32, )) } else { None } } } #[doc(alias = 
"gst_video_guess_framerate")] pub fn guess_framerate(duration: gst::ClockTime) -> Option { skip_assert_initialized!(); unsafe { let mut dest_n = mem::MaybeUninit::uninit(); let mut dest_d = mem::MaybeUninit::uninit(); let res: bool = from_glib(ffi::gst_video_guess_framerate( duration.into_glib(), dest_n.as_mut_ptr(), dest_d.as_mut_ptr(), )); if res { Some(gst::Fraction::new( dest_n.assume_init(), dest_d.assume_init(), )) } else { None } } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[doc(alias = "gst_video_is_common_aspect_ratio")] pub fn is_common_aspect_ratio(width: u32, height: u32, par: gst::Fraction) -> bool { skip_assert_initialized!(); unsafe { from_glib(ffi::gst_video_is_common_aspect_ratio( width as i32, height as i32, par.numer(), par.denom(), )) } } pub fn video_make_raw_caps( formats: &[crate::VideoFormat], ) -> crate::VideoCapsBuilder { skip_assert_initialized!(); let formats = formats.iter().copied().map(|f| match f { crate::VideoFormat::Encoded => panic!("Invalid encoded format"), crate::VideoFormat::Unknown => panic!("Invalid unknown format"), _ => f, }); crate::VideoCapsBuilder::new().format_list(formats) } #[cfg(test)] mod tests { use std::sync::{Arc, Mutex}; use super::*; #[test] fn test_convert_sample_async() { gst::init().unwrap(); let l = glib::MainLoop::new(None, false); let mut in_buffer = gst::Buffer::with_size(320 * 240 * 4).unwrap(); { let buffer = in_buffer.get_mut().unwrap(); let mut data = buffer.map_writable().unwrap(); for p in data.as_mut_slice().chunks_mut(4) { p[0] = 63; p[1] = 127; p[2] = 191; p[3] = 255; } } let in_caps = crate::VideoInfo::builder(crate::VideoFormat::Rgba, 320, 240) .build() .unwrap() .to_caps() .unwrap(); let sample = gst::Sample::builder() .buffer(&in_buffer) .caps(&in_caps) .build(); let out_caps = crate::VideoInfo::builder(crate::VideoFormat::Abgr, 320, 240) .build() .unwrap() .to_caps() .unwrap(); let l_clone = l.clone(); let res_store = Arc::new(Mutex::new(None)); let res_store_clone = res_store.clone(); convert_sample_async(&sample, &out_caps, gst::ClockTime::NONE, move |res| { *res_store_clone.lock().unwrap() = Some(res); l_clone.quit(); }); l.run(); let res = res_store.lock().unwrap().take().unwrap(); let res = res.unwrap(); let converted_out_caps = res.caps().unwrap(); assert_eq!(out_caps.as_ref(), converted_out_caps); let out_buffer = res.buffer().unwrap(); { let data = out_buffer.map_readable().unwrap(); for p in data.as_slice().chunks(4) { assert_eq!(p, &[255, 191, 127, 63]); } } } #[test] fn video_caps() { gst::init().unwrap(); let caps = video_make_raw_caps(&[crate::VideoFormat::Nv12, crate::VideoFormat::Nv16]).build(); assert_eq!(caps.to_string(), "video/x-raw, format=(string){ NV12, NV16 }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]"); #[cfg(feature = "v1_18")] { /* video_make_raw_caps() is a re-implementation so ensure it returns the same caps as the C API */ let c_caps = unsafe { let formats: Vec = [crate::VideoFormat::Nv12, crate::VideoFormat::Nv16] .iter() .map(|f| f.into_glib()) .collect(); let caps = ffi::gst_video_make_raw_caps(formats.as_ptr(), formats.len() as u32); gst::Caps::from_glib_full(caps) }; assert_eq!(caps, c_caps); } let caps = video_make_raw_caps(&[crate::VideoFormat::Nv12, crate::VideoFormat::Nv16]) .width(800) .height(600) .framerate((30, 1).into()) .build(); assert_eq!(caps.to_string(), "video/x-raw, format=(string){ NV12, NV16 }, width=(int)800, height=(int)600, framerate=(fraction)30/1"); } #[test] 
#[should_panic(expected = "Invalid encoded format")] fn video_caps_encoded() { gst::init().unwrap(); let _caps = video_make_raw_caps(&[crate::VideoFormat::Encoded]); } #[test] #[should_panic(expected = "Invalid unknown format")] fn video_caps_unknown() { gst::init().unwrap(); let _caps = video_make_raw_caps(&[crate::VideoFormat::Unknown]); } } gstreamer-video-0.23.5/src/lib.rs000064400000000000000000000140441046102023000147160ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. #![cfg_attr(docsrs, feature(doc_cfg))] #![allow(clippy::missing_safety_doc)] #![allow(clippy::manual_c_str_literals)] #![doc = include_str!("../README.md")] pub use glib; pub use gst; pub use gst_base; pub use gstreamer_video_sys as ffi; macro_rules! assert_initialized_main_thread { () => { if !gst::INITIALIZED.load(std::sync::atomic::Ordering::SeqCst) { gst::assert_initialized(); } }; } macro_rules! skip_assert_initialized { () => {}; } #[allow(clippy::needless_borrow)] #[allow(unused_imports)] mod auto; pub use crate::auto::*; mod enums; #[cfg(feature = "serde")] mod flag_serde; mod caps; pub use crate::caps::VideoCapsBuilder; mod caps_features; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub use crate::caps_features::{CAPS_FEATURES_FORMAT_INTERLACED, CAPS_FEATURE_FORMAT_INTERLACED}; pub use crate::caps_features::{ CAPS_FEATURES_META_GST_VIDEO_AFFINE_TRANSFORMATION_META, CAPS_FEATURES_META_GST_VIDEO_GL_TEXTURE_UPLOAD_META, CAPS_FEATURES_META_GST_VIDEO_META, CAPS_FEATURES_META_GST_VIDEO_OVERLAY_COMPOSITION, CAPS_FEATURE_META_GST_VIDEO_AFFINE_TRANSFORMATION_META, CAPS_FEATURE_META_GST_VIDEO_GL_TEXTURE_UPLOAD_META, CAPS_FEATURE_META_GST_VIDEO_META, CAPS_FEATURE_META_GST_VIDEO_OVERLAY_COMPOSITION, }; mod video_color_matrix; mod video_format; pub use crate::video_format::*; mod video_format_info; pub use crate::video_format_info::*; mod video_info; pub use crate::video_info::*; #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] mod video_info_dma_drm; #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] pub use crate::video_info_dma_drm::*; pub mod video_frame; pub use crate::video_frame::{VideoFrame, VideoFrameExt, VideoFrameRef}; mod video_overlay; pub use crate::video_overlay::is_video_overlay_prepare_window_handle_message; pub mod video_event; pub use crate::video_event::{ DownstreamForceKeyUnitEvent, ForceKeyUnitEvent, NavigationEvent, StillFrameEvent, UpstreamForceKeyUnitEvent, }; pub mod video_message; pub use crate::video_message::{NavigationEventMessage, NavigationMessage}; mod functions; pub use crate::functions::*; mod video_rectangle; pub use crate::video_rectangle::*; pub mod video_overlay_composition; pub use crate::video_overlay_composition::{ VideoOverlayComposition, VideoOverlayCompositionRef, VideoOverlayRectangle, VideoOverlayRectangleRef, }; pub mod video_meta; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub use crate::video_meta::VideoCaptionMeta; #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] pub use crate::video_meta::{VideoAFDMeta, VideoBarMeta}; pub use crate::video_meta::{ VideoAffineTransformationMeta, VideoCropMeta, VideoMeta, VideoOverlayCompositionMeta, VideoRegionOfInterestMeta, }; mod video_time_code; pub use crate::video_time_code::{ValidVideoTimeCode, VideoTimeCode, VideoTimeCodeMeta}; mod video_time_code_interval; pub use crate::video_time_code_interval::VideoTimeCodeInterval; mod video_buffer_pool; pub 
use crate::video_buffer_pool::{ VideoAlignment, VideoBufferPoolConfig, BUFFER_POOL_OPTION_VIDEO_AFFINE_TRANSFORMATION_META, BUFFER_POOL_OPTION_VIDEO_ALIGNMENT, BUFFER_POOL_OPTION_VIDEO_GL_TEXTURE_UPLOAD_META, BUFFER_POOL_OPTION_VIDEO_META, }; pub mod video_converter; pub use crate::video_converter::{VideoConverter, VideoConverterConfig}; mod video_codec_frame; mod video_decoder; mod video_encoder; mod video_filter; pub use crate::video_codec_frame::VideoCodecFrame; pub mod video_codec_state; pub use crate::video_codec_state::{VideoCodecState, VideoCodecStateContext}; mod utils; #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] mod video_hdr; #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] pub use crate::video_hdr::*; mod color_balance_channel; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] mod video_aggregator; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] mod video_aggregator_convert_pad; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] mod video_aggregator_pad; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] mod video_vbi; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub use video_vbi::*; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] mod video_vbi_encoder; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub use video_vbi_encoder::*; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] mod video_vbi_parser; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub use video_vbi_parser::*; pub const VIDEO_ENCODER_FLOW_NEED_DATA: gst::FlowSuccess = gst::FlowSuccess::CustomSuccess; pub const VIDEO_DECODER_FLOW_NEED_DATA: gst::FlowSuccess = gst::FlowSuccess::CustomSuccess; // Re-export all the traits in a prelude module, so that applications // can always "use gst_video::prelude::*" without getting conflicts pub mod prelude { #[doc(hidden)] pub use gst_base::prelude::*; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub use crate::video_aggregator::VideoAggregatorExtManual; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub use crate::video_aggregator_convert_pad::VideoAggregatorConvertPadExtManual; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub use crate::video_aggregator_pad::VideoAggregatorPadExtManual; pub use crate::VideoFrameExt; pub use crate::{ auto::traits::*, video_buffer_pool::VideoBufferPoolConfig, video_decoder::VideoDecoderExtManual, video_encoder::VideoEncoderExtManual, video_filter::VideoFilterExtManual, video_format::VideoFormatIteratorExt, video_frame::VideoBufferExt, video_overlay::VideoOverlayExtManual, }; } pub mod subclass; gstreamer-video-0.23.5/src/subclass/mod.rs000064400000000000000000000030301046102023000165370ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. 
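// Usage sketch (hypothetical element type `MyElement`): bring the prelude into
// scope and implement the relevant `*Impl` trait on your `ObjectSubclass`
// type, for example `NavigationImpl` from `navigation.rs`:
//
//     use gst_video::subclass::prelude::*;
//
//     impl NavigationImpl for MyElement {
//         fn send_event(&self, structure: gst::Structure) {
//             // translate the navigation structure into events for the element
//         }
//     }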
#![allow(clippy::cast_ptr_alignment)] mod navigation; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] mod video_aggregator; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] mod video_aggregator_convert_pad; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] mod video_aggregator_pad; mod video_decoder; mod video_encoder; mod video_filter; mod video_sink; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub use video_aggregator::AggregateFramesToken; pub mod prelude { #[doc(hidden)] pub use gst_base::subclass::prelude::*; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub use super::video_aggregator::{VideoAggregatorImpl, VideoAggregatorImplExt}; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub use super::video_aggregator_convert_pad::VideoAggregatorConvertPadImpl; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] pub use super::video_aggregator_pad::{VideoAggregatorPadImpl, VideoAggregatorPadImplExt}; pub use super::{ navigation::{NavigationImpl, NavigationImplExt}, video_decoder::{VideoDecoderImpl, VideoDecoderImplExt}, video_encoder::{VideoEncoderImpl, VideoEncoderImplExt}, video_filter::{VideoFilterImpl, VideoFilterImplExt}, video_sink::{VideoSinkImpl, VideoSinkImplExt}, }; } gstreamer-video-0.23.5/src/subclass/navigation.rs000064400000000000000000000062001046102023000201210ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. use glib::{prelude::*, subclass::prelude::*, translate::*}; use crate::{ffi, Navigation}; pub trait NavigationImpl: ObjectImpl { fn send_event(&self, structure: gst::Structure); #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] fn send_event_simple(&self, event: gst::Event) { if let Some(structure) = event.structure() { self.send_event(structure.to_owned()); } } } mod sealed { pub trait Sealed {} impl Sealed for T {} } pub trait NavigationImplExt: sealed::Sealed + ObjectSubclass { fn parent_send_event(&self, structure: gst::Structure) { unsafe { let type_data = Self::type_data(); let parent_iface = type_data.as_ref().parent_interface::() as *const ffi::GstNavigationInterface; let func = match (*parent_iface).send_event { Some(func) => func, None => return, }; func( self.obj().unsafe_cast_ref::().to_glib_none().0, structure.into_glib_ptr(), ); } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] fn parent_send_event_simple(&self, event: gst::Event) { unsafe { let type_data = Self::type_data(); let parent_iface = type_data.as_ref().parent_interface::() as *const ffi::GstNavigationInterface; let func = match (*parent_iface).send_event_simple { Some(func) => func, None => return, }; func( self.obj().unsafe_cast_ref::().to_glib_none().0, event.into_glib_ptr(), ); } } } impl NavigationImplExt for T {} unsafe impl IsImplementable for Navigation { #[cfg(not(any(feature = "v1_22", docsrs)))] fn interface_init(iface: &mut glib::Interface) { let iface = iface.as_mut(); iface.send_event = Some(navigation_send_event::); } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] fn interface_init(iface: &mut glib::Interface) { let iface = iface.as_mut(); iface.send_event = Some(navigation_send_event::); iface.send_event_simple = Some(navigation_send_event_simple::); } } unsafe extern "C" fn navigation_send_event( nav: *mut ffi::GstNavigation, structure: *mut gst::ffi::GstStructure, ) 
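// C trampoline: recover the Rust implementation from the instance pointer and
// hand the structure to it with ownership transferred (`from_glib_full`).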
{ let instance = &*(nav as *mut T::Instance); let imp = instance.imp(); imp.send_event(from_glib_full(structure)); } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] unsafe extern "C" fn navigation_send_event_simple( nav: *mut ffi::GstNavigation, event: *mut gst::ffi::GstEvent, ) { let instance = &*(nav as *mut T::Instance); let imp = instance.imp(); imp.send_event_simple(from_glib_full(event)); } gstreamer-video-0.23.5/src/subclass/video_aggregator.rs000064400000000000000000000176051046102023000213050ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. use std::{mem, ptr}; use glib::translate::*; use gst_base::{prelude::*, subclass::prelude::*}; use crate::{ffi, VideoAggregator}; pub struct AggregateFramesToken<'a>(pub(crate) &'a VideoAggregator); pub trait VideoAggregatorImpl: VideoAggregatorImplExt + AggregatorImpl { fn update_caps(&self, caps: &gst::Caps) -> Result { self.parent_update_caps(caps) } fn aggregate_frames( &self, token: &AggregateFramesToken, outbuf: &mut gst::BufferRef, ) -> Result { self.parent_aggregate_frames(token, outbuf) } fn create_output_buffer(&self) -> Result, gst::FlowError> { self.parent_create_output_buffer() } fn find_best_format(&self, downstream_caps: &gst::Caps) -> Option<(crate::VideoInfo, bool)> { self.parent_find_best_format(downstream_caps) } } mod sealed { pub trait Sealed {} impl Sealed for T {} } pub trait VideoAggregatorImplExt: sealed::Sealed + ObjectSubclass { fn parent_update_caps(&self, caps: &gst::Caps) -> Result { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoAggregatorClass; let f = (*parent_class) .update_caps .expect("Missing parent function `update_caps`"); Option::<_>::from_glib_full(f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, caps.as_mut_ptr(), )) .ok_or_else(|| { gst::loggable_error!(gst::CAT_RUST, "Parent function `update_caps` failed") }) } } fn parent_aggregate_frames( &self, token: &AggregateFramesToken, outbuf: &mut gst::BufferRef, ) -> Result { assert_eq!( self.obj().as_ptr() as *mut ffi::GstVideoAggregator, token.0.as_ptr() ); unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoAggregatorClass; let f = (*parent_class) .aggregate_frames .expect("Missing parent function `aggregate_frames`"); try_from_glib(f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, // FIXME: Wrong pointer type outbuf.as_mut_ptr() as *mut *mut gst::ffi::GstBuffer, )) } } fn parent_create_output_buffer(&self) -> Result, gst::FlowError> { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoAggregatorClass; let f = (*parent_class) .create_output_buffer .expect("Missing parent function `create_output_buffer`"); let mut buffer = ptr::null_mut(); try_from_glib(f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, &mut buffer, )) .map(|_: gst::FlowSuccess| from_glib_full(buffer)) } } fn parent_find_best_format( &self, downstream_caps: &gst::Caps, ) -> Option<(crate::VideoInfo, bool)> { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoAggregatorClass; (*parent_class).find_best_format.and_then(|f| { let mut info = mem::MaybeUninit::uninit(); ffi::gst_video_info_init(info.as_mut_ptr()); let mut info = info.assume_init(); let mut at_least_one_alpha = glib::ffi::GFALSE; f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, downstream_caps.as_mut_ptr(), 
&mut info, &mut at_least_one_alpha, ); if info.finfo.is_null() { None } else { Some(( from_glib_none(&info as *const ffi::GstVideoInfo), from_glib(at_least_one_alpha), )) } }) } } } impl VideoAggregatorImplExt for T {} unsafe impl IsSubclassable for VideoAggregator { fn class_init(klass: &mut glib::Class) { Self::parent_class_init::(klass); let klass = klass.as_mut(); klass.update_caps = Some(video_aggregator_update_caps::); klass.aggregate_frames = Some(video_aggregator_aggregate_frames::); klass.create_output_buffer = Some(video_aggregator_create_output_buffer::); klass.find_best_format = Some(video_aggregator_find_best_format::); } } unsafe extern "C" fn video_aggregator_update_caps( ptr: *mut ffi::GstVideoAggregator, caps: *mut gst::ffi::GstCaps, ) -> *mut gst::ffi::GstCaps { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, ptr::null_mut(), { match imp.update_caps(&from_glib_borrow(caps)) { Ok(caps) => caps.into_glib_ptr(), Err(err) => { err.log_with_imp(imp); ptr::null_mut() } } }) } unsafe extern "C" fn video_aggregator_aggregate_frames( ptr: *mut ffi::GstVideoAggregator, outbuf: *mut *mut gst::ffi::GstBuffer, ) -> gst::ffi::GstFlowReturn { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, gst::FlowReturn::Error, { let instance = imp.obj(); let instance = instance.unsafe_cast_ref::(); let token = AggregateFramesToken(instance); imp.aggregate_frames( &token, gst::BufferRef::from_mut_ptr( // Wrong pointer type outbuf as *mut gst::ffi::GstBuffer, ), ) .into() }) .into_glib() } unsafe extern "C" fn video_aggregator_create_output_buffer( ptr: *mut ffi::GstVideoAggregator, outbuf: *mut *mut gst::ffi::GstBuffer, ) -> gst::ffi::GstFlowReturn { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, gst::FlowReturn::Error, { match imp.create_output_buffer() { Ok(buffer) => { *outbuf = buffer.map(|b| b.into_glib_ptr()).unwrap_or(ptr::null_mut()); Ok(gst::FlowSuccess::Ok) } Err(err) => { *outbuf = ptr::null_mut(); Err(err) } } .into() }) .into_glib() } unsafe extern "C" fn video_aggregator_find_best_format( ptr: *mut ffi::GstVideoAggregator, downstream_caps: *mut gst::ffi::GstCaps, best_info: *mut ffi::GstVideoInfo, at_least_one_alpha: *mut glib::ffi::gboolean, ) { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, (), { match imp.find_best_format(&from_glib_borrow(downstream_caps)) { None => (), Some((info, alpha)) => { *best_info = *info.to_glib_none().0; *at_least_one_alpha = alpha.into_glib(); } } }) } gstreamer-video-0.23.5/src/subclass/video_aggregator_convert_pad.rs000064400000000000000000000005561046102023000236660ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. use gst_base::subclass::prelude::*; use super::prelude::VideoAggregatorPadImpl; use crate::VideoAggregatorConvertPad; pub trait VideoAggregatorConvertPadImpl: VideoAggregatorPadImpl {} unsafe impl IsSubclassable for VideoAggregatorConvertPad {} gstreamer-video-0.23.5/src/subclass/video_aggregator_pad.rs000064400000000000000000000140201046102023000221150ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. 
use std::{mem, ptr}; use glib::translate::*; use gst_base::{prelude::*, subclass::prelude::*}; use crate::{ffi, subclass::AggregateFramesToken, VideoAggregator, VideoAggregatorPad}; pub trait VideoAggregatorPadImpl: VideoAggregatorPadImplExt + AggregatorPadImpl { fn update_conversion_info(&self) { self.parent_update_conversion_info() } fn prepare_frame( &self, aggregator: &crate::VideoAggregator, token: &AggregateFramesToken, buffer: &gst::Buffer, ) -> Option> { self.parent_prepare_frame(aggregator, token, buffer) } fn clean_frame( &self, aggregator: &crate::VideoAggregator, token: &AggregateFramesToken, frame: Option>, ) { self.parent_clean_frame(aggregator, token, frame) } } mod sealed { pub trait Sealed {} impl Sealed for T {} } pub trait VideoAggregatorPadImplExt: ObjectSubclass + sealed::Sealed { fn parent_update_conversion_info(&self) { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoAggregatorPadClass; if let Some(f) = (*parent_class).update_conversion_info { f(self .obj() .unsafe_cast_ref::() .to_glib_none() .0); } } } fn parent_prepare_frame( &self, aggregator: &crate::VideoAggregator, token: &AggregateFramesToken, buffer: &gst::Buffer, ) -> Option> { assert_eq!(aggregator.as_ptr(), token.0.as_ptr()); unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoAggregatorPadClass; if let Some(f) = (*parent_class).prepare_frame { let mut prepared_frame = mem::MaybeUninit::zeroed(); f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, aggregator.to_glib_none().0, buffer.as_mut_ptr(), prepared_frame.as_mut_ptr(), ); let prepared_frame = prepared_frame.assume_init(); if prepared_frame.buffer.is_null() { None } else { Some(crate::VideoFrame::from_glib_full(prepared_frame)) } } else { None } } } fn parent_clean_frame( &self, aggregator: &crate::VideoAggregator, token: &AggregateFramesToken, frame: Option>, ) { assert_eq!(aggregator.as_ptr(), token.0.as_ptr()); unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoAggregatorPadClass; if let Some(f) = (*parent_class).clean_frame { let mut prepared_frame = if let Some(frame) = frame { frame.into_raw() } else { mem::zeroed() }; f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, aggregator.to_glib_none().0, &mut prepared_frame, ); } } } } impl VideoAggregatorPadImplExt for T {} unsafe impl IsSubclassable for VideoAggregatorPad { fn class_init(klass: &mut glib::Class) { Self::parent_class_init::(klass); let klass = klass.as_mut(); klass.update_conversion_info = Some(video_aggregator_pad_update_conversion_info::); klass.prepare_frame = Some(video_aggregator_pad_prepare_frame::); klass.clean_frame = Some(video_aggregator_pad_clean_frame::); } } unsafe extern "C" fn video_aggregator_pad_update_conversion_info( ptr: *mut ffi::GstVideoAggregatorPad, ) { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); imp.update_conversion_info(); } unsafe extern "C" fn video_aggregator_pad_prepare_frame( ptr: *mut ffi::GstVideoAggregatorPad, aggregator: *mut ffi::GstVideoAggregator, buffer: *mut gst::ffi::GstBuffer, prepared_frame: *mut ffi::GstVideoFrame, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); let aggregator: Borrowed = from_glib_borrow(aggregator); let token = AggregateFramesToken(&aggregator); match imp.prepare_frame(&aggregator, &token, &from_glib_borrow(buffer)) { Some(frame) => { *prepared_frame = frame.into_raw(); 
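                // Ownership of the mapped frame now lives in the C-side
                // GstVideoFrame; `clean_frame()` reclaims and drops (unmaps) it.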
} None => { ptr::write(prepared_frame, mem::zeroed()); } } glib::ffi::GTRUE } unsafe extern "C" fn video_aggregator_pad_clean_frame( ptr: *mut ffi::GstVideoAggregatorPad, aggregator: *mut ffi::GstVideoAggregator, prepared_frame: *mut ffi::GstVideoFrame, ) { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); let aggregator: Borrowed = from_glib_borrow(aggregator); let token = AggregateFramesToken(&aggregator); let frame = if (*prepared_frame).buffer.is_null() { None } else { let frame = crate::VideoFrame::from_glib_full(*prepared_frame); ptr::write(prepared_frame, mem::zeroed()); Some(frame) }; imp.clean_frame(&aggregator, &token, frame); } gstreamer-video-0.23.5/src/subclass/video_decoder.rs000064400000000000000000000656511046102023000205740ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. use glib::translate::*; use gst::subclass::prelude::*; use crate::{ ffi, prelude::*, video_codec_state::{Readable, VideoCodecState}, VideoCodecFrame, VideoDecoder, }; pub trait VideoDecoderImpl: VideoDecoderImplExt + ElementImpl { fn open(&self) -> Result<(), gst::ErrorMessage> { self.parent_open() } fn close(&self) -> Result<(), gst::ErrorMessage> { self.parent_close() } fn start(&self) -> Result<(), gst::ErrorMessage> { self.parent_start() } fn stop(&self) -> Result<(), gst::ErrorMessage> { self.parent_stop() } fn finish(&self) -> Result { self.parent_finish() } fn drain(&self) -> Result { self.parent_drain() } fn set_format( &self, state: &VideoCodecState<'static, Readable>, ) -> Result<(), gst::LoggableError> { self.parent_set_format(state) } fn parse( &self, frame: &VideoCodecFrame, adapter: &gst_base::Adapter, at_eos: bool, ) -> Result { self.parent_parse(frame, adapter, at_eos) } fn handle_frame(&self, frame: VideoCodecFrame) -> Result { self.parent_handle_frame(frame) } fn flush(&self) -> bool { self.parent_flush() } fn negotiate(&self) -> Result<(), gst::LoggableError> { self.parent_negotiate() } fn caps(&self, filter: Option<&gst::Caps>) -> gst::Caps { self.parent_caps(filter) } fn sink_event(&self, event: gst::Event) -> bool { self.parent_sink_event(event) } fn sink_query(&self, query: &mut gst::QueryRef) -> bool { self.parent_sink_query(query) } fn src_event(&self, event: gst::Event) -> bool { self.parent_src_event(event) } fn src_query(&self, query: &mut gst::QueryRef) -> bool { self.parent_src_query(query) } fn propose_allocation( &self, query: &mut gst::query::Allocation, ) -> Result<(), gst::LoggableError> { self.parent_propose_allocation(query) } fn decide_allocation( &self, query: &mut gst::query::Allocation, ) -> Result<(), gst::LoggableError> { self.parent_decide_allocation(query) } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] fn handle_missing_data( &self, timestamp: gst::ClockTime, duration: Option, ) -> bool { self.parent_handle_missing_data(timestamp, duration) } } mod sealed { pub trait Sealed {} impl Sealed for T {} } pub trait VideoDecoderImplExt: sealed::Sealed + ObjectSubclass { fn parent_open(&self) -> Result<(), gst::ErrorMessage> { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoDecoderClass; (*parent_class) .open .map(|f| { if from_glib(f(self .obj() .unsafe_cast_ref::() .to_glib_none() .0)) { Ok(()) } else { Err(gst::error_msg!( gst::CoreError::StateChange, ["Parent function `open` failed"] )) } }) .unwrap_or(Ok(())) } } fn parent_close(&self) -> Result<(), gst::ErrorMessage> { unsafe { let data = 
Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoDecoderClass; (*parent_class) .close .map(|f| { if from_glib(f(self .obj() .unsafe_cast_ref::() .to_glib_none() .0)) { Ok(()) } else { Err(gst::error_msg!( gst::CoreError::StateChange, ["Parent function `close` failed"] )) } }) .unwrap_or(Ok(())) } } fn parent_start(&self) -> Result<(), gst::ErrorMessage> { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoDecoderClass; (*parent_class) .start .map(|f| { if from_glib(f(self .obj() .unsafe_cast_ref::() .to_glib_none() .0)) { Ok(()) } else { Err(gst::error_msg!( gst::CoreError::StateChange, ["Parent function `start` failed"] )) } }) .unwrap_or(Ok(())) } } fn parent_stop(&self) -> Result<(), gst::ErrorMessage> { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoDecoderClass; (*parent_class) .stop .map(|f| { if from_glib(f(self .obj() .unsafe_cast_ref::() .to_glib_none() .0)) { Ok(()) } else { Err(gst::error_msg!( gst::CoreError::StateChange, ["Parent function `stop` failed"] )) } }) .unwrap_or(Ok(())) } } fn parent_finish(&self) -> Result { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoDecoderClass; (*parent_class) .finish .map(|f| { try_from_glib(f(self .obj() .unsafe_cast_ref::() .to_glib_none() .0)) }) .unwrap_or(Ok(gst::FlowSuccess::Ok)) } } fn parent_drain(&self) -> Result { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoDecoderClass; (*parent_class) .drain .map(|f| { try_from_glib(f(self .obj() .unsafe_cast_ref::() .to_glib_none() .0)) }) .unwrap_or(Ok(gst::FlowSuccess::Ok)) } } fn parent_set_format( &self, state: &VideoCodecState<'static, Readable>, ) -> Result<(), gst::LoggableError> { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoDecoderClass; (*parent_class) .set_format .map(|f| { gst::result_from_gboolean!( f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, state.as_mut_ptr() ), gst::CAT_RUST, "parent function `set_format` failed" ) }) .unwrap_or(Ok(())) } } fn parent_parse( &self, frame: &VideoCodecFrame, adapter: &gst_base::Adapter, at_eos: bool, ) -> Result { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoDecoderClass; (*parent_class) .parse .map(|f| { try_from_glib(f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, frame.to_glib_none().0, adapter.to_glib_none().0, at_eos.into_glib(), )) }) .unwrap_or(Ok(gst::FlowSuccess::Ok)) } } fn parent_handle_frame( &self, frame: VideoCodecFrame, ) -> Result { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoDecoderClass; (*parent_class) .handle_frame .map(|f| { try_from_glib(f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, frame.to_glib_none().0, )) }) .unwrap_or(Err(gst::FlowError::Error)) } } fn parent_flush(&self) -> bool { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoDecoderClass; (*parent_class) .flush .map(|f| { from_glib(f(self .obj() .unsafe_cast_ref::() .to_glib_none() .0)) }) .unwrap_or(false) } } fn parent_negotiate(&self) -> Result<(), gst::LoggableError> { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoDecoderClass; (*parent_class) .negotiate 
.map(|f| { gst::result_from_gboolean!( f(self .obj() .unsafe_cast_ref::() .to_glib_none() .0), gst::CAT_RUST, "Parent function `negotiate` failed" ) }) .unwrap_or(Ok(())) } } fn parent_caps(&self, filter: Option<&gst::Caps>) -> gst::Caps { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoDecoderClass; (*parent_class) .getcaps .map(|f| { from_glib_full(f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, filter.to_glib_none().0, )) }) .unwrap_or_else(|| { self.obj() .unsafe_cast_ref::() .proxy_getcaps(None, filter) }) } } fn parent_sink_event(&self, event: gst::Event) -> bool { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoDecoderClass; let f = (*parent_class) .sink_event .expect("Missing parent function `sink_event`"); from_glib(f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, event.into_glib_ptr(), )) } } fn parent_sink_query(&self, query: &mut gst::QueryRef) -> bool { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoDecoderClass; let f = (*parent_class) .sink_query .expect("Missing parent function `sink_query`"); from_glib(f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, query.as_mut_ptr(), )) } } fn parent_src_event(&self, event: gst::Event) -> bool { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoDecoderClass; let f = (*parent_class) .src_event .expect("Missing parent function `src_event`"); from_glib(f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, event.into_glib_ptr(), )) } } fn parent_src_query(&self, query: &mut gst::QueryRef) -> bool { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoDecoderClass; let f = (*parent_class) .src_query .expect("Missing parent function `src_query`"); from_glib(f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, query.as_mut_ptr(), )) } } fn parent_propose_allocation( &self, query: &mut gst::query::Allocation, ) -> Result<(), gst::LoggableError> { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoDecoderClass; (*parent_class) .propose_allocation .map(|f| { gst::result_from_gboolean!( f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, query.as_mut_ptr(), ), gst::CAT_RUST, "Parent function `propose_allocation` failed", ) }) .unwrap_or(Ok(())) } } fn parent_decide_allocation( &self, query: &mut gst::query::Allocation, ) -> Result<(), gst::LoggableError> { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoDecoderClass; (*parent_class) .decide_allocation .map(|f| { gst::result_from_gboolean!( f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, query.as_mut_ptr(), ), gst::CAT_RUST, "Parent function `decide_allocation` failed", ) }) .unwrap_or(Ok(())) } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] fn parent_handle_missing_data( &self, timestamp: gst::ClockTime, duration: Option, ) -> bool { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoDecoderClass; (*parent_class) .handle_missing_data .map(|f| { from_glib(f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, timestamp.into_glib(), duration.into_glib(), )) }) .unwrap_or(true) } } } impl VideoDecoderImplExt for T {} unsafe impl IsSubclassable for VideoDecoder { fn 
class_init(klass: &mut glib::Class) { Self::parent_class_init::(klass); let klass = klass.as_mut(); klass.open = Some(video_decoder_open::); klass.close = Some(video_decoder_close::); klass.start = Some(video_decoder_start::); klass.stop = Some(video_decoder_stop::); klass.finish = Some(video_decoder_finish::); klass.drain = Some(video_decoder_drain::); klass.set_format = Some(video_decoder_set_format::); klass.parse = Some(video_decoder_parse::); klass.handle_frame = Some(video_decoder_handle_frame::); klass.flush = Some(video_decoder_flush::); klass.negotiate = Some(video_decoder_negotiate::); klass.getcaps = Some(video_decoder_getcaps::); klass.sink_event = Some(video_decoder_sink_event::); klass.src_event = Some(video_decoder_src_event::); klass.sink_query = Some(video_decoder_sink_query::); klass.src_query = Some(video_decoder_src_query::); klass.propose_allocation = Some(video_decoder_propose_allocation::); klass.decide_allocation = Some(video_decoder_decide_allocation::); #[cfg(feature = "v1_20")] { klass.handle_missing_data = Some(video_decoder_handle_missing_data::); } } } unsafe extern "C" fn video_decoder_open( ptr: *mut ffi::GstVideoDecoder, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, false, { match imp.open() { Ok(()) => true, Err(err) => { imp.post_error_message(err); false } } }) .into_glib() } unsafe extern "C" fn video_decoder_close( ptr: *mut ffi::GstVideoDecoder, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, false, { match imp.close() { Ok(()) => true, Err(err) => { imp.post_error_message(err); false } } }) .into_glib() } unsafe extern "C" fn video_decoder_start( ptr: *mut ffi::GstVideoDecoder, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, false, { match imp.start() { Ok(()) => true, Err(err) => { imp.post_error_message(err); false } } }) .into_glib() } unsafe extern "C" fn video_decoder_stop( ptr: *mut ffi::GstVideoDecoder, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, false, { match imp.stop() { Ok(()) => true, Err(err) => { imp.post_error_message(err); false } } }) .into_glib() } unsafe extern "C" fn video_decoder_finish( ptr: *mut ffi::GstVideoDecoder, ) -> gst::ffi::GstFlowReturn { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, gst::FlowReturn::Error, { imp.finish().into() }).into_glib() } unsafe extern "C" fn video_decoder_drain( ptr: *mut ffi::GstVideoDecoder, ) -> gst::ffi::GstFlowReturn { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, gst::FlowReturn::Error, { imp.drain().into() }).into_glib() } unsafe extern "C" fn video_decoder_set_format( ptr: *mut ffi::GstVideoDecoder, state: *mut ffi::GstVideoCodecState, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); ffi::gst_video_codec_state_ref(state); let wrap_state = VideoCodecState::::new(state); gst::panic_to_error!(imp, false, { match imp.set_format(&wrap_state) { Ok(()) => true, Err(err) => { err.log_with_imp(imp); false } } }) .into_glib() } unsafe extern "C" fn video_decoder_parse( ptr: *mut ffi::GstVideoDecoder, frame: *mut ffi::GstVideoCodecFrame, adapter: *mut gst_base::ffi::GstAdapter, at_eos: glib::ffi::gboolean, ) -> gst::ffi::GstFlowReturn { let instance = 
&*(ptr as *mut T::Instance); let imp = instance.imp(); ffi::gst_video_codec_frame_ref(frame); let instance = imp.obj(); let instance = instance.unsafe_cast_ref::(); let wrap_frame = VideoCodecFrame::new(frame, instance); let wrap_adapter: Borrowed = from_glib_borrow(adapter); let at_eos: bool = from_glib(at_eos); gst::panic_to_error!(imp, gst::FlowReturn::Error, { imp.parse(&wrap_frame, &wrap_adapter, at_eos).into() }) .into_glib() } unsafe extern "C" fn video_decoder_handle_frame( ptr: *mut ffi::GstVideoDecoder, frame: *mut ffi::GstVideoCodecFrame, ) -> gst::ffi::GstFlowReturn { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); let instance = imp.obj(); let instance = instance.unsafe_cast_ref::(); let wrap_frame = VideoCodecFrame::new(frame, instance); gst::panic_to_error!(imp, gst::FlowReturn::Error, { imp.handle_frame(wrap_frame).into() }) .into_glib() } unsafe extern "C" fn video_decoder_flush( ptr: *mut ffi::GstVideoDecoder, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, false, { VideoDecoderImpl::flush(imp) }).into_glib() } unsafe extern "C" fn video_decoder_negotiate( ptr: *mut ffi::GstVideoDecoder, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, false, { match imp.negotiate() { Ok(()) => true, Err(err) => { err.log_with_imp(imp); false } } }) .into_glib() } unsafe extern "C" fn video_decoder_getcaps( ptr: *mut ffi::GstVideoDecoder, filter: *mut gst::ffi::GstCaps, ) -> *mut gst::ffi::GstCaps { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, gst::Caps::new_empty(), { VideoDecoderImpl::caps( imp, Option::::from_glib_borrow(filter) .as_ref() .as_ref(), ) }) .into_glib_ptr() } unsafe extern "C" fn video_decoder_sink_event( ptr: *mut ffi::GstVideoDecoder, event: *mut gst::ffi::GstEvent, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, false, { imp.sink_event(from_glib_full(event)) }).into_glib() } unsafe extern "C" fn video_decoder_sink_query( ptr: *mut ffi::GstVideoDecoder, query: *mut gst::ffi::GstQuery, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, false, { imp.sink_query(gst::QueryRef::from_mut_ptr(query)) }) .into_glib() } unsafe extern "C" fn video_decoder_src_event( ptr: *mut ffi::GstVideoDecoder, event: *mut gst::ffi::GstEvent, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, false, { imp.src_event(from_glib_full(event)) }).into_glib() } unsafe extern "C" fn video_decoder_src_query( ptr: *mut ffi::GstVideoDecoder, query: *mut gst::ffi::GstQuery, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, false, { imp.src_query(gst::QueryRef::from_mut_ptr(query)) }) .into_glib() } unsafe extern "C" fn video_decoder_propose_allocation( ptr: *mut ffi::GstVideoDecoder, query: *mut gst::ffi::GstQuery, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); let query = match gst::QueryRef::from_mut_ptr(query).view_mut() { gst::QueryViewMut::Allocation(allocation) => allocation, _ => unreachable!(), }; gst::panic_to_error!(imp, false, { match imp.propose_allocation(query) { Ok(()) => true, Err(err) => { err.log_with_imp(imp); false } } }) 
.into_glib() } unsafe extern "C" fn video_decoder_decide_allocation( ptr: *mut ffi::GstVideoDecoder, query: *mut gst::ffi::GstQuery, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); let query = match gst::QueryRef::from_mut_ptr(query).view_mut() { gst::QueryViewMut::Allocation(allocation) => allocation, _ => unreachable!(), }; gst::panic_to_error!(imp, false, { match imp.decide_allocation(query) { Ok(()) => true, Err(err) => { err.log_with_imp(imp); false } } }) .into_glib() } #[cfg(feature = "v1_20")] unsafe extern "C" fn video_decoder_handle_missing_data( ptr: *mut ffi::GstVideoDecoder, timestamp: gst::ffi::GstClockTime, duration: gst::ffi::GstClockTime, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, true, { imp.handle_missing_data( Option::::from_glib(timestamp).unwrap(), from_glib(duration), ) }) .into_glib() } gstreamer-video-0.23.5/src/subclass/video_encoder.rs000064400000000000000000000543231046102023000206000ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. use glib::translate::*; use gst::subclass::prelude::*; use crate::{ ffi, prelude::*, video_codec_state::{Readable, VideoCodecState}, VideoCodecFrame, VideoEncoder, }; pub trait VideoEncoderImpl: VideoEncoderImplExt + ElementImpl { fn open(&self) -> Result<(), gst::ErrorMessage> { self.parent_open() } fn close(&self) -> Result<(), gst::ErrorMessage> { self.parent_close() } fn start(&self) -> Result<(), gst::ErrorMessage> { self.parent_start() } fn stop(&self) -> Result<(), gst::ErrorMessage> { self.parent_stop() } fn finish(&self) -> Result { self.parent_finish() } fn set_format( &self, state: &VideoCodecState<'static, Readable>, ) -> Result<(), gst::LoggableError> { self.parent_set_format(state) } fn handle_frame(&self, frame: VideoCodecFrame) -> Result { self.parent_handle_frame(frame) } fn flush(&self) -> bool { self.parent_flush() } fn negotiate(&self) -> Result<(), gst::LoggableError> { self.parent_negotiate() } fn caps(&self, filter: Option<&gst::Caps>) -> gst::Caps { self.parent_caps(filter) } fn sink_event(&self, event: gst::Event) -> bool { self.parent_sink_event(event) } fn sink_query(&self, query: &mut gst::QueryRef) -> bool { self.parent_sink_query(query) } fn src_event(&self, event: gst::Event) -> bool { self.parent_src_event(event) } fn src_query(&self, query: &mut gst::QueryRef) -> bool { self.parent_src_query(query) } fn propose_allocation( &self, query: &mut gst::query::Allocation, ) -> Result<(), gst::LoggableError> { self.parent_propose_allocation(query) } fn decide_allocation( &self, query: &mut gst::query::Allocation, ) -> Result<(), gst::LoggableError> { self.parent_decide_allocation(query) } } mod sealed { pub trait Sealed {} impl Sealed for T {} } pub trait VideoEncoderImplExt: sealed::Sealed + ObjectSubclass { fn parent_open(&self) -> Result<(), gst::ErrorMessage> { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoEncoderClass; (*parent_class) .open .map(|f| { if from_glib(f(self .obj() .unsafe_cast_ref::() .to_glib_none() .0)) { Ok(()) } else { Err(gst::error_msg!( gst::CoreError::StateChange, ["Parent function `open` failed"] )) } }) .unwrap_or(Ok(())) } } fn parent_close(&self) -> Result<(), gst::ErrorMessage> { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoEncoderClass; (*parent_class) .close .map(|f| { if 
from_glib(f(self .obj() .unsafe_cast_ref::() .to_glib_none() .0)) { Ok(()) } else { Err(gst::error_msg!( gst::CoreError::StateChange, ["Parent function `close` failed"] )) } }) .unwrap_or(Ok(())) } } fn parent_start(&self) -> Result<(), gst::ErrorMessage> { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoEncoderClass; (*parent_class) .start .map(|f| { if from_glib(f(self .obj() .unsafe_cast_ref::() .to_glib_none() .0)) { Ok(()) } else { Err(gst::error_msg!( gst::CoreError::StateChange, ["Parent function `start` failed"] )) } }) .unwrap_or(Ok(())) } } fn parent_stop(&self) -> Result<(), gst::ErrorMessage> { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoEncoderClass; (*parent_class) .stop .map(|f| { if from_glib(f(self .obj() .unsafe_cast_ref::() .to_glib_none() .0)) { Ok(()) } else { Err(gst::error_msg!( gst::CoreError::StateChange, ["Parent function `stop` failed"] )) } }) .unwrap_or(Ok(())) } } fn parent_finish(&self) -> Result { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoEncoderClass; (*parent_class) .finish .map(|f| { try_from_glib(f(self .obj() .unsafe_cast_ref::() .to_glib_none() .0)) }) .unwrap_or(Ok(gst::FlowSuccess::Ok)) } } fn parent_set_format( &self, state: &VideoCodecState<'static, Readable>, ) -> Result<(), gst::LoggableError> { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoEncoderClass; (*parent_class) .set_format .map(|f| { gst::result_from_gboolean!( f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, state.as_mut_ptr() ), gst::CAT_RUST, "parent function `set_format` failed" ) }) .unwrap_or(Ok(())) } } fn parent_handle_frame( &self, frame: VideoCodecFrame, ) -> Result { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoEncoderClass; (*parent_class) .handle_frame .map(|f| { try_from_glib(f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, frame.to_glib_none().0, )) }) .unwrap_or(Err(gst::FlowError::Error)) } } fn parent_flush(&self) -> bool { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoEncoderClass; (*parent_class) .flush .map(|f| { from_glib(f(self .obj() .unsafe_cast_ref::() .to_glib_none() .0)) }) .unwrap_or(false) } } fn parent_negotiate(&self) -> Result<(), gst::LoggableError> { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoEncoderClass; (*parent_class) .negotiate .map(|f| { gst::result_from_gboolean!( f(self .obj() .unsafe_cast_ref::() .to_glib_none() .0), gst::CAT_RUST, "Parent function `negotiate` failed" ) }) .unwrap_or(Ok(())) } } fn parent_caps(&self, filter: Option<&gst::Caps>) -> gst::Caps { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoEncoderClass; (*parent_class) .getcaps .map(|f| { from_glib_full(f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, filter.to_glib_none().0, )) }) .unwrap_or_else(|| { self.obj() .unsafe_cast_ref::() .proxy_getcaps(None, filter) }) } } fn parent_sink_event(&self, event: gst::Event) -> bool { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoEncoderClass; let f = (*parent_class) .sink_event .expect("Missing parent function `sink_event`"); from_glib(f( self.obj() 
.unsafe_cast_ref::() .to_glib_none() .0, event.into_glib_ptr(), )) } } fn parent_sink_query(&self, query: &mut gst::QueryRef) -> bool { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoEncoderClass; let f = (*parent_class) .sink_query .expect("Missing parent function `sink_query`"); from_glib(f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, query.as_mut_ptr(), )) } } fn parent_src_event(&self, event: gst::Event) -> bool { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoEncoderClass; let f = (*parent_class) .src_event .expect("Missing parent function `src_event`"); from_glib(f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, event.into_glib_ptr(), )) } } fn parent_src_query(&self, query: &mut gst::QueryRef) -> bool { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoEncoderClass; let f = (*parent_class) .src_query .expect("Missing parent function `src_query`"); from_glib(f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, query.as_mut_ptr(), )) } } fn parent_propose_allocation( &self, query: &mut gst::query::Allocation, ) -> Result<(), gst::LoggableError> { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoEncoderClass; (*parent_class) .propose_allocation .map(|f| { gst::result_from_gboolean!( f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, query.as_mut_ptr(), ), gst::CAT_RUST, "Parent function `propose_allocation` failed", ) }) .unwrap_or(Ok(())) } } fn parent_decide_allocation( &self, query: &mut gst::query::Allocation, ) -> Result<(), gst::LoggableError> { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoEncoderClass; (*parent_class) .decide_allocation .map(|f| { gst::result_from_gboolean!( f( self.obj() .unsafe_cast_ref::() .to_glib_none() .0, query.as_mut_ptr(), ), gst::CAT_RUST, "Parent function `decide_allocation` failed", ) }) .unwrap_or(Ok(())) } } } impl VideoEncoderImplExt for T {} unsafe impl IsSubclassable for VideoEncoder { fn class_init(klass: &mut glib::Class) { Self::parent_class_init::(klass); let klass = klass.as_mut(); klass.open = Some(video_encoder_open::); klass.close = Some(video_encoder_close::); klass.start = Some(video_encoder_start::); klass.stop = Some(video_encoder_stop::); klass.finish = Some(video_encoder_finish::); klass.set_format = Some(video_encoder_set_format::); klass.handle_frame = Some(video_encoder_handle_frame::); klass.flush = Some(video_encoder_flush::); klass.negotiate = Some(video_encoder_negotiate::); klass.getcaps = Some(video_encoder_getcaps::); klass.sink_event = Some(video_encoder_sink_event::); klass.src_event = Some(video_encoder_src_event::); klass.sink_query = Some(video_encoder_sink_query::); klass.src_query = Some(video_encoder_src_query::); klass.propose_allocation = Some(video_encoder_propose_allocation::); klass.decide_allocation = Some(video_encoder_decide_allocation::); } } unsafe extern "C" fn video_encoder_open( ptr: *mut ffi::GstVideoEncoder, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, false, { match imp.open() { Ok(()) => true, Err(err) => { imp.post_error_message(err); false } } }) .into_glib() } unsafe extern "C" fn video_encoder_close( ptr: *mut ffi::GstVideoEncoder, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); 
let imp = instance.imp(); gst::panic_to_error!(imp, false, { match imp.close() { Ok(()) => true, Err(err) => { imp.post_error_message(err); false } } }) .into_glib() } unsafe extern "C" fn video_encoder_start( ptr: *mut ffi::GstVideoEncoder, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, false, { match imp.start() { Ok(()) => true, Err(err) => { imp.post_error_message(err); false } } }) .into_glib() } unsafe extern "C" fn video_encoder_stop( ptr: *mut ffi::GstVideoEncoder, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, false, { match imp.stop() { Ok(()) => true, Err(err) => { imp.post_error_message(err); false } } }) .into_glib() } unsafe extern "C" fn video_encoder_finish( ptr: *mut ffi::GstVideoEncoder, ) -> gst::ffi::GstFlowReturn { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, gst::FlowReturn::Error, { imp.finish().into() }).into_glib() } unsafe extern "C" fn video_encoder_set_format( ptr: *mut ffi::GstVideoEncoder, state: *mut ffi::GstVideoCodecState, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); ffi::gst_video_codec_state_ref(state); let wrap_state = VideoCodecState::::new(state); gst::panic_to_error!(imp, false, { match imp.set_format(&wrap_state) { Ok(()) => true, Err(err) => { err.log_with_imp(imp); false } } }) .into_glib() } unsafe extern "C" fn video_encoder_handle_frame( ptr: *mut ffi::GstVideoEncoder, frame: *mut ffi::GstVideoCodecFrame, ) -> gst::ffi::GstFlowReturn { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); let instance = imp.obj(); let instance = instance.unsafe_cast_ref::(); let wrap_frame = VideoCodecFrame::new(frame, instance); gst::panic_to_error!(imp, gst::FlowReturn::Error, { imp.handle_frame(wrap_frame).into() }) .into_glib() } unsafe extern "C" fn video_encoder_flush( ptr: *mut ffi::GstVideoEncoder, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, false, { VideoEncoderImpl::flush(imp) }).into_glib() } unsafe extern "C" fn video_encoder_negotiate( ptr: *mut ffi::GstVideoEncoder, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, false, { match imp.negotiate() { Ok(()) => true, Err(err) => { err.log_with_imp(imp); false } } }) .into_glib() } unsafe extern "C" fn video_encoder_getcaps( ptr: *mut ffi::GstVideoEncoder, filter: *mut gst::ffi::GstCaps, ) -> *mut gst::ffi::GstCaps { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, gst::Caps::new_empty(), { VideoEncoderImpl::caps( imp, Option::::from_glib_borrow(filter) .as_ref() .as_ref(), ) }) .into_glib_ptr() } unsafe extern "C" fn video_encoder_sink_event( ptr: *mut ffi::GstVideoEncoder, event: *mut gst::ffi::GstEvent, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, false, { imp.sink_event(from_glib_full(event)) }).into_glib() } unsafe extern "C" fn video_encoder_sink_query( ptr: *mut ffi::GstVideoEncoder, query: *mut gst::ffi::GstQuery, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, false, { imp.sink_query(gst::QueryRef::from_mut_ptr(query)) }) .into_glib() } unsafe extern "C" fn video_encoder_src_event( 
ptr: *mut ffi::GstVideoEncoder, event: *mut gst::ffi::GstEvent, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, false, { imp.src_event(from_glib_full(event)) }).into_glib() } unsafe extern "C" fn video_encoder_src_query( ptr: *mut ffi::GstVideoEncoder, query: *mut gst::ffi::GstQuery, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, false, { imp.src_query(gst::QueryRef::from_mut_ptr(query)) }) .into_glib() } unsafe extern "C" fn video_encoder_propose_allocation( ptr: *mut ffi::GstVideoEncoder, query: *mut gst::ffi::GstQuery, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); let query = match gst::QueryRef::from_mut_ptr(query).view_mut() { gst::QueryViewMut::Allocation(allocation) => allocation, _ => unreachable!(), }; gst::panic_to_error!(imp, false, { match imp.propose_allocation(query) { Ok(()) => true, Err(err) => { err.log_with_imp(imp); false } } }) .into_glib() } unsafe extern "C" fn video_encoder_decide_allocation( ptr: *mut ffi::GstVideoEncoder, query: *mut gst::ffi::GstQuery, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); let query = match gst::QueryRef::from_mut_ptr(query).view_mut() { gst::QueryViewMut::Allocation(allocation) => allocation, _ => unreachable!(), }; gst::panic_to_error!(imp, false, { match imp.decide_allocation(query) { Ok(()) => true, Err(err) => { err.log_with_imp(imp); false } } }) .into_glib() } gstreamer-video-0.23.5/src/subclass/video_filter.rs000064400000000000000000000217531046102023000204470ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. 
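// Illustrative sketch (added for documentation; not part of the upstream
// gstreamer-video sources): a minimal, hypothetical subclass using the
// `VideoFilterImpl` trait defined below. `ExampleFilterPriv`, `ExampleFilter`
// and the "RsExampleVideoFilter" type name are invented for the example, and
// the usual element metadata / pad template boilerplate is omitted for brevity.
#[cfg(test)]
mod video_filter_usage_sketch {
    use gst::subclass::prelude::*;
    use gst_base::subclass::base_transform::BaseTransformMode;
    use gst_base::subclass::prelude::*;

    use super::*; // `VideoFilterImpl` from this file

    // Private implementation struct of the hypothetical element.
    #[derive(Default)]
    pub struct ExampleFilterPriv;

    #[glib::object_subclass]
    impl ObjectSubclass for ExampleFilterPriv {
        const NAME: &'static str = "RsExampleVideoFilter";
        type Type = ExampleFilter;
        type ParentType = crate::VideoFilter;
    }

    impl ObjectImpl for ExampleFilterPriv {}
    impl GstObjectImpl for ExampleFilterPriv {}
    impl ElementImpl for ExampleFilterPriv {}

    impl BaseTransformImpl for ExampleFilterPriv {
        // Operate in place so that `transform_frame_ip()` below is used.
        const MODE: BaseTransformMode = BaseTransformMode::AlwaysInPlace;
        const PASSTHROUGH_ON_SAME_CAPS: bool = false;
        const TRANSFORM_IP_ON_PASSTHROUGH: bool = false;
    }

    impl VideoFilterImpl for ExampleFilterPriv {
        fn transform_frame_ip(
            &self,
            _frame: &mut crate::VideoFrameRef<&mut gst::BufferRef>,
        ) -> Result<gst::FlowSuccess, gst::FlowError> {
            // A real filter would modify the mapped pixels here.
            Ok(gst::FlowSuccess::Ok)
        }
    }

    glib::wrapper! {
        pub struct ExampleFilter(ObjectSubclass<ExampleFilterPriv>)
            @extends crate::VideoFilter, gst_base::BaseTransform, gst::Element, gst::Object;
    }

    #[test]
    fn instantiate() {
        gst::init().unwrap();
        let _filter = glib::Object::new::<ExampleFilter>();
    }
}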
use glib::translate::*; use gst_base::{prelude::*, subclass::prelude::*}; use crate::{ffi, VideoFilter, VideoFrameExt, VideoFrameRef, VideoInfo}; pub trait VideoFilterImpl: VideoFilterImplExt + BaseTransformImpl { fn set_info( &self, incaps: &gst::Caps, in_info: &VideoInfo, outcaps: &gst::Caps, out_info: &VideoInfo, ) -> Result<(), gst::LoggableError> { self.parent_set_info(incaps, in_info, outcaps, out_info) } fn transform_frame( &self, inframe: &VideoFrameRef<&gst::BufferRef>, outframe: &mut VideoFrameRef<&mut gst::BufferRef>, ) -> Result { self.parent_transform_frame(inframe, outframe) } fn transform_frame_ip( &self, frame: &mut VideoFrameRef<&mut gst::BufferRef>, ) -> Result { self.parent_transform_frame_ip(frame) } fn transform_frame_ip_passthrough( &self, frame: &VideoFrameRef<&gst::BufferRef>, ) -> Result { self.parent_transform_frame_ip_passthrough(frame) } } mod sealed { pub trait Sealed {} impl Sealed for T {} } pub trait VideoFilterImplExt: sealed::Sealed + ObjectSubclass { fn parent_set_info( &self, incaps: &gst::Caps, in_info: &VideoInfo, outcaps: &gst::Caps, out_info: &VideoInfo, ) -> Result<(), gst::LoggableError> { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoFilterClass; (*parent_class) .set_info .map(|f| { gst::result_from_gboolean!( f( self.obj().unsafe_cast_ref::().to_glib_none().0, incaps.to_glib_none().0, mut_override(in_info.to_glib_none().0), outcaps.to_glib_none().0, mut_override(out_info.to_glib_none().0), ), gst::CAT_RUST, "Parent function `set_info` failed" ) }) .unwrap_or(Ok(())) } } fn parent_transform_frame( &self, inframe: &VideoFrameRef<&gst::BufferRef>, outframe: &mut VideoFrameRef<&mut gst::BufferRef>, ) -> Result { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoFilterClass; (*parent_class) .transform_frame .map(|f| { try_from_glib(f( self.obj().unsafe_cast_ref::().to_glib_none().0, mut_override(inframe.as_ptr()), outframe.as_mut_ptr(), )) }) .unwrap_or_else(|| { if !self .obj() .unsafe_cast_ref::() .is_in_place() { Err(gst::FlowError::NotSupported) } else { unreachable!(concat!( "parent `transform_frame` called while transform operates in-place" )); } }) } } fn parent_transform_frame_ip( &self, frame: &mut VideoFrameRef<&mut gst::BufferRef>, ) -> Result { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoFilterClass; let f = (*parent_class).transform_frame_ip.unwrap_or_else(|| { if self .obj() .unsafe_cast_ref::() .is_in_place() { panic!(concat!( "Missing parent function `transform_frame_ip`. Required because ", "transform operates in-place" )); } else { unreachable!(concat!( "parent `transform_frame` called while transform doesn't operate in-place" )); } }); try_from_glib(f( self.obj().unsafe_cast_ref::().to_glib_none().0, frame.as_mut_ptr(), )) } } fn parent_transform_frame_ip_passthrough( &self, frame: &VideoFrameRef<&gst::BufferRef>, ) -> Result { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoFilterClass; let f = (*parent_class).transform_frame_ip.unwrap_or_else(|| { if self .obj() .unsafe_cast_ref::() .is_in_place() { panic!(concat!( "Missing parent function `transform_frame_ip`. 
Required because ", "transform operates in-place (passthrough mode)" )); } else { unreachable!(concat!( "parent `transform_frame_ip` called ", "while transform doesn't operate in-place (passthrough mode)" )); } }); try_from_glib(f( self.obj().unsafe_cast_ref::().to_glib_none().0, mut_override(frame.as_ptr()), )) } } } impl VideoFilterImplExt for T {} unsafe impl IsSubclassable for VideoFilter { fn class_init(klass: &mut glib::Class) { use gst_base::subclass::base_transform::BaseTransformMode; Self::parent_class_init::(klass); let klass = klass.as_mut(); klass.set_info = Some(video_filter_set_info::); match T::MODE { BaseTransformMode::AlwaysInPlace => { klass.transform_frame = None; klass.transform_frame_ip = Some(video_filter_transform_frame_ip::); } BaseTransformMode::NeverInPlace => { klass.transform_frame = Some(video_filter_transform_frame::); klass.transform_frame_ip = None; } BaseTransformMode::Both => { klass.transform_frame = Some(video_filter_transform_frame::); klass.transform_frame_ip = Some(video_filter_transform_frame_ip::); } } } } unsafe extern "C" fn video_filter_set_info( ptr: *mut ffi::GstVideoFilter, incaps: *mut gst::ffi::GstCaps, in_info: *mut ffi::GstVideoInfo, outcaps: *mut gst::ffi::GstCaps, out_info: *mut ffi::GstVideoInfo, ) -> glib::ffi::gboolean { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, false, { match imp.set_info( &from_glib_borrow(incaps), &from_glib_none(in_info), &from_glib_borrow(outcaps), &from_glib_none(out_info), ) { Ok(()) => true, Err(err) => { err.log_with_imp(imp); false } } }) .into_glib() } unsafe extern "C" fn video_filter_transform_frame( ptr: *mut ffi::GstVideoFilter, inframe: *mut ffi::GstVideoFrame, outframe: *mut ffi::GstVideoFrame, ) -> gst::ffi::GstFlowReturn { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, gst::FlowReturn::Error, { imp.transform_frame( &VideoFrameRef::from_glib_borrow(inframe), &mut VideoFrameRef::from_glib_borrow_mut(outframe), ) .into() }) .into_glib() } unsafe extern "C" fn video_filter_transform_frame_ip( ptr: *mut ffi::GstVideoFilter, frame: *mut ffi::GstVideoFrame, ) -> gst::ffi::GstFlowReturn { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); gst::panic_to_error!(imp, gst::FlowReturn::Error, { if from_glib(gst_base::ffi::gst_base_transform_is_passthrough( ptr as *mut gst_base::ffi::GstBaseTransform, )) { imp.transform_frame_ip_passthrough(&VideoFrameRef::from_glib_borrow(frame)) .into() } else { imp.transform_frame_ip(&mut VideoFrameRef::from_glib_borrow_mut(frame)) .into() } }) .into_glib() } gstreamer-video-0.23.5/src/subclass/video_sink.rs000064400000000000000000000035521046102023000201230ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. 
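// Illustrative sketch (added for documentation; not part of the upstream
// gstreamer-video sources): a minimal, hypothetical subclass using the
// `VideoSinkImpl` trait defined below. `ExampleSinkPriv`, `ExampleSink` and the
// "RsExampleVideoSink" type name are invented for the example; element
// metadata and pad templates are omitted for brevity.
#[cfg(test)]
mod video_sink_usage_sketch {
    use gst::subclass::prelude::*;
    use gst_base::subclass::prelude::*;

    use super::*; // `VideoSinkImpl` from this file

    #[derive(Default)]
    pub struct ExampleSinkPriv;

    #[glib::object_subclass]
    impl ObjectSubclass for ExampleSinkPriv {
        const NAME: &'static str = "RsExampleVideoSink";
        type Type = ExampleSink;
        type ParentType = crate::VideoSink;
    }

    impl ObjectImpl for ExampleSinkPriv {}
    impl GstObjectImpl for ExampleSinkPriv {}
    impl ElementImpl for ExampleSinkPriv {}
    impl BaseSinkImpl for ExampleSinkPriv {}

    impl VideoSinkImpl for ExampleSinkPriv {
        fn show_frame(&self, _buffer: &gst::Buffer) -> Result<gst::FlowSuccess, gst::FlowError> {
            // A real sink would render or inspect the buffer here.
            Ok(gst::FlowSuccess::Ok)
        }
    }

    glib::wrapper! {
        pub struct ExampleSink(ObjectSubclass<ExampleSinkPriv>)
            @extends crate::VideoSink, gst_base::BaseSink, gst::Element, gst::Object;
    }

    #[test]
    fn instantiate() {
        gst::init().unwrap();
        let _sink = glib::Object::new::<ExampleSink>();
    }
}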
use glib::{prelude::*, translate::*}; use gst_base::subclass::prelude::*; use crate::{ffi, VideoSink}; pub trait VideoSinkImpl: VideoSinkImplExt + BaseSinkImpl + ElementImpl { fn show_frame(&self, buffer: &gst::Buffer) -> Result { self.parent_show_frame(buffer) } } mod sealed { pub trait Sealed {} impl Sealed for T {} } pub trait VideoSinkImplExt: sealed::Sealed + ObjectSubclass { fn parent_show_frame(&self, buffer: &gst::Buffer) -> Result { unsafe { let data = Self::type_data(); let parent_class = data.as_ref().parent_class() as *mut ffi::GstVideoSinkClass; (*parent_class) .show_frame .map(|f| { try_from_glib(f( self.obj().unsafe_cast_ref::().to_glib_none().0, buffer.to_glib_none().0, )) }) .unwrap_or(Err(gst::FlowError::Error)) } } } impl VideoSinkImplExt for T {} unsafe impl IsSubclassable for VideoSink { fn class_init(klass: &mut glib::Class) { Self::parent_class_init::(klass); let klass = klass.as_mut(); klass.show_frame = Some(video_sink_show_frame::); } } unsafe extern "C" fn video_sink_show_frame( ptr: *mut ffi::GstVideoSink, buffer: *mut gst::ffi::GstBuffer, ) -> gst::ffi::GstFlowReturn { let instance = &*(ptr as *mut T::Instance); let imp = instance.imp(); let buffer = from_glib_borrow(buffer); gst::panic_to_error!(imp, gst::FlowReturn::Error, { imp.show_frame(&buffer).into() }) .into_glib() } gstreamer-video-0.23.5/src/utils.rs000064400000000000000000000004611046102023000153060ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. pub trait HasStreamLock { #[doc(alias = "get_stream_lock")] fn stream_lock(&self) -> *mut glib::ffi::GRecMutex; #[doc(alias = "get_element_as_ptr")] fn element_as_ptr(&self) -> *const gst::ffi::GstElement; } gstreamer-video-0.23.5/src/video_aggregator.rs000064400000000000000000000014661046102023000174640ustar 00000000000000use glib::translate::*; use gst::prelude::*; use crate::{ffi, VideoAggregator}; mod sealed { pub trait Sealed {} impl> Sealed for T {} } pub trait VideoAggregatorExtManual: sealed::Sealed + IsA + 'static { fn video_info(&self) -> Option { unsafe { let ptr = self.as_ptr() as *mut ffi::GstVideoAggregator; let _guard = self.as_ref().object_lock(); let info = &(*ptr).info; if info.finfo.is_null() || info.width <= 0 || info.height <= 0 { return None; } Some(from_glib_none(mut_override( info as *const ffi::GstVideoInfo, ))) } } } impl> VideoAggregatorExtManual for O {} gstreamer-video-0.23.5/src/video_aggregator_convert_pad.rs000064400000000000000000000040331046102023000220410ustar 00000000000000use std::mem::transmute; use glib::{ prelude::*, signal::{connect_raw, SignalHandlerId}, translate::*, }; use crate::{ffi, VideoAggregatorConvertPad}; mod sealed { pub trait Sealed {} impl> Sealed for T {} } pub trait VideoAggregatorConvertPadExtManual: sealed::Sealed + IsA + 'static { #[doc(alias = "converter-config")] fn converter_config(&self) -> Option { ObjectExt::property::>(self.as_ref(), "converter-config") .map(|c| c.try_into().unwrap()) } #[doc(alias = "converter-config")] fn set_converter_config(&self, converter_config: Option<&crate::VideoConverterConfig>) { ObjectExt::set_property( self.as_ref(), "converter-config", converter_config.map(|s| s.as_ref()), ) } #[doc(alias = "converter-config")] fn connect_converter_config_notify( &self, f: F, ) -> SignalHandlerId { unsafe extern "C" fn notify_converter_config_trampoline< P: IsA, F: Fn(&P) + Send + Sync + 'static, >( this: *mut ffi::GstVideoAggregatorConvertPad, _param_spec: glib::ffi::gpointer, f: glib::ffi::gpointer, ) { let f: &F = &*(f as 
*const F); f(VideoAggregatorConvertPad::from_glib_borrow(this).unsafe_cast_ref()) } unsafe { let f: Box = Box::new(f); connect_raw( self.as_ptr() as *mut _, b"notify::converter-config\0".as_ptr() as *const _, Some(transmute::<*const (), unsafe extern "C" fn()>( notify_converter_config_trampoline:: as *const (), )), Box::into_raw(f), ) } } } impl> VideoAggregatorConvertPadExtManual for O {} gstreamer-video-0.23.5/src/video_aggregator_pad.rs000064400000000000000000000040071046102023000203020ustar 00000000000000use glib::{object::IsA, translate::*}; use gst::prelude::*; use crate::{ffi, subclass::AggregateFramesToken, VideoAggregatorPad}; mod sealed { pub trait Sealed {} impl> Sealed for T {} } pub trait VideoAggregatorPadExtManual: sealed::Sealed + IsA + 'static { #[doc(alias = "gst_video_aggregator_pad_has_current_buffer")] fn has_current_buffer(&self, _token: &AggregateFramesToken) -> bool { unsafe { from_glib(ffi::gst_video_aggregator_pad_has_current_buffer( self.as_ref().to_glib_none().0, )) } } #[doc(alias = "gst_video_aggregator_pad_get_current_buffer")] fn current_buffer(&self, _token: &AggregateFramesToken) -> Option { unsafe { from_glib_none(ffi::gst_video_aggregator_pad_get_current_buffer( self.as_ref().to_glib_none().0, )) } } #[doc(alias = "gst_video_aggregator_pad_get_prepared_frame")] fn prepared_frame<'a>( &self, _token: &'a AggregateFramesToken, ) -> Option> { unsafe { let ptr = ffi::gst_video_aggregator_pad_get_prepared_frame(self.as_ref().to_glib_none().0); if ptr.is_null() { None } else { Some(crate::VideoFrameRef::from_glib_borrow(ptr).into_inner()) } } } fn video_info(&self) -> Option { unsafe { let ptr = self.as_ptr() as *mut ffi::GstVideoAggregatorPad; let _guard = self.as_ref().object_lock(); let info = &(*ptr).info; if info.finfo.is_null() || info.width <= 0 || info.height <= 0 { return None; } Some(from_glib_none(mut_override( info as *const ffi::GstVideoInfo, ))) } } } impl> VideoAggregatorPadExtManual for O {} gstreamer-video-0.23.5/src/video_buffer_pool.rs000064400000000000000000000076771046102023000176560ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. 
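// Illustrative usage sketch (added for documentation; not part of the upstream
// gstreamer-video sources): configuring a buffer pool with video meta and
// custom stride alignment through the `VideoBufferPoolConfig` extension trait
// defined below. The caps, buffer size and the 16-byte alignment mask are
// arbitrary example values.
#[cfg(test)]
mod video_alignment_usage_sketch {
    use gst::prelude::*;

    use super::*; // `VideoAlignment`, `VideoBufferPoolConfig`, the pool options

    #[test]
    fn configure_pool_with_alignment() {
        gst::init().unwrap();

        let caps = crate::VideoInfo::builder(crate::VideoFormat::Rgba, 320, 240)
            .build()
            .unwrap()
            .to_caps()
            .unwrap();

        let pool = gst::BufferPool::new();
        let mut config = pool.config();

        // Ask the pool to attach GstVideoMeta and honour alignment requests.
        config.add_option(BUFFER_POOL_OPTION_VIDEO_META);
        config.add_option(BUFFER_POOL_OPTION_VIDEO_ALIGNMENT);

        // No pixel padding, but align every plane's stride to 16 bytes
        // (the values are bit masks, so 15 requests 16-byte alignment).
        let align = VideoAlignment::new(
            0,
            0,
            0,
            0,
            &[15; crate::ffi::GST_VIDEO_MAX_PLANES as usize],
        );
        config.set_video_alignment(&align);

        config.set_params(Some(&caps), 320 * 240 * 4, 0, 0);
        pool.set_config(config).unwrap();
    }
}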
use std::{marker::PhantomData, mem}; use crate::ffi; use glib::translate::*; pub static BUFFER_POOL_OPTION_VIDEO_AFFINE_TRANSFORMATION_META: &glib::GStr = unsafe { glib::GStr::from_utf8_with_nul_unchecked( ffi::GST_BUFFER_POOL_OPTION_VIDEO_AFFINE_TRANSFORMATION_META, ) }; pub static BUFFER_POOL_OPTION_VIDEO_ALIGNMENT: &glib::GStr = unsafe { glib::GStr::from_utf8_with_nul_unchecked(ffi::GST_BUFFER_POOL_OPTION_VIDEO_ALIGNMENT) }; pub static BUFFER_POOL_OPTION_VIDEO_GL_TEXTURE_UPLOAD_META: &glib::GStr = unsafe { glib::GStr::from_utf8_with_nul_unchecked( ffi::GST_BUFFER_POOL_OPTION_VIDEO_GL_TEXTURE_UPLOAD_META, ) }; pub static BUFFER_POOL_OPTION_VIDEO_META: &glib::GStr = unsafe { glib::GStr::from_utf8_with_nul_unchecked(ffi::GST_BUFFER_POOL_OPTION_VIDEO_META) }; #[derive(Debug, Clone)] #[doc(alias = "GstVideoAlignment")] pub struct VideoAlignment(pub(crate) ffi::GstVideoAlignment); impl VideoAlignment { #[doc(alias = "get_padding_top")] #[inline] pub fn padding_top(&self) -> u32 { self.0.padding_top } #[doc(alias = "get_padding_bottom")] #[inline] pub fn padding_bottom(&self) -> u32 { self.0.padding_bottom } #[doc(alias = "get_padding_left")] #[inline] pub fn padding_left(&self) -> u32 { self.0.padding_left } #[doc(alias = "get_padding_right")] #[inline] pub fn padding_right(&self) -> u32 { self.0.padding_right } #[doc(alias = "get_stride_align")] #[inline] pub fn stride_align(&self) -> &[u32; ffi::GST_VIDEO_MAX_PLANES as usize] { &self.0.stride_align } pub fn new( padding_top: u32, padding_bottom: u32, padding_left: u32, padding_right: u32, stride_align: &[u32; ffi::GST_VIDEO_MAX_PLANES as usize], ) -> Self { skip_assert_initialized!(); let videoalignment = ffi::GstVideoAlignment { padding_top, padding_bottom, padding_left, padding_right, stride_align: *stride_align, }; Self(videoalignment) } } impl PartialEq for VideoAlignment { #[inline] fn eq(&self, other: &Self) -> bool { self.padding_top() == other.padding_top() && self.padding_bottom() == other.padding_bottom() && self.padding_left() == other.padding_left() && self.padding_right() == other.padding_right() && self.stride_align() == other.stride_align() } } impl Eq for VideoAlignment {} #[doc(hidden)] impl<'a> ToGlibPtr<'a, *const ffi::GstVideoAlignment> for VideoAlignment { type Storage = PhantomData<&'a Self>; #[inline] fn to_glib_none(&'a self) -> Stash<'a, *const ffi::GstVideoAlignment, Self> { Stash(&self.0, PhantomData) } } pub trait VideoBufferPoolConfig { #[doc(alias = "get_video_alignment")] fn video_alignment(&self) -> Option; fn set_video_alignment(&mut self, align: &VideoAlignment); } impl VideoBufferPoolConfig for gst::BufferPoolConfigRef { #[doc(alias = "gst_buffer_pool_config_get_video_alignment")] fn video_alignment(&self) -> Option { unsafe { let mut alignment = mem::MaybeUninit::uninit(); let ret = from_glib(ffi::gst_buffer_pool_config_get_video_alignment( self.as_ref().as_mut_ptr(), alignment.as_mut_ptr(), )); if ret { Some(VideoAlignment(alignment.assume_init())) } else { None } } } #[doc(alias = "gst_buffer_pool_config_set_video_alignment")] fn set_video_alignment(&mut self, align: &VideoAlignment) { unsafe { ffi::gst_buffer_pool_config_set_video_alignment( self.as_mut().as_mut_ptr(), mut_override(&align.0), ) } } } gstreamer-video-0.23.5/src/video_codec_frame.rs000064400000000000000000000173141046102023000175700ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. 
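// Illustrative sketch (added for documentation; not part of the upstream
// gstreamer-video sources). A `VideoCodecFrame` is only handed out inside
// decoder/encoder callbacks, so these are hypothetical helpers showing typical
// field access; the function names are invented for the example.
#[cfg(test)]
#[allow(dead_code)]
mod video_codec_frame_usage_sketch {
    use super::*;

    // Whether the frame is a sync point (keyframe).
    fn is_sync_point(frame: &VideoCodecFrame<'_>) -> bool {
        frame
            .flags()
            .contains(crate::VideoCodecFrameFlags::SYNC_POINT)
    }

    // Attach decoded output to a frame while keeping its input PTS. The buffer
    // must be writable, as asserted by `set_output_buffer()`.
    fn attach_output(frame: &mut VideoCodecFrame<'_>, decoded: gst::Buffer) {
        let pts = frame.pts();
        frame.set_pts(pts);
        frame.set_output_buffer(decoded);
    }
}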
use std::{fmt, marker::PhantomData, mem}; use glib::translate::*; use crate::{ffi, utils::HasStreamLock, VideoCodecFrameFlags}; pub struct VideoCodecFrame<'a> { frame: *mut ffi::GstVideoCodecFrame, /* GstVideoCodecFrame API isn't safe so protect the frame using the * element (decoder or encoder) stream lock */ element: &'a dyn HasStreamLock, } #[doc(hidden)] impl<'a> ::glib::translate::ToGlibPtr<'a, *mut ffi::GstVideoCodecFrame> for VideoCodecFrame<'a> { type Storage = PhantomData<&'a Self>; #[inline] fn to_glib_none(&'a self) -> ::glib::translate::Stash<'a, *mut ffi::GstVideoCodecFrame, Self> { Stash(self.frame, PhantomData) } #[inline] fn to_glib_full(&self) -> *mut ffi::GstVideoCodecFrame { unsafe { ffi::gst_video_codec_frame_ref(self.frame) } } } impl fmt::Debug for VideoCodecFrame<'_> { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { let mut b = f.debug_struct("VideoCodecFrame"); b.field("flags", &self.flags()) .field("system_frame_number", &self.system_frame_number()) .field("decode_frame_number", &self.decode_frame_number()) .field( "presentation_frame_number", &self.presentation_frame_number(), ) .field("dts", &self.dts()) .field("pts", &self.pts()) .field("duration", &self.duration()) .field("distance_from_sync", &self.distance_from_sync()) .field("input_buffer", &self.input_buffer()) .field("output_buffer", &self.output_buffer()) .field("deadline", &self.deadline()); b.finish() } } impl<'a> VideoCodecFrame<'a> { // Take ownership of @frame pub(crate) unsafe fn new( frame: *mut ffi::GstVideoCodecFrame, element: &'a T, ) -> Self { skip_assert_initialized!(); let stream_lock = element.stream_lock(); glib::ffi::g_rec_mutex_lock(stream_lock); Self { frame, element } } #[doc(alias = "get_flags")] #[inline] pub fn flags(&self) -> VideoCodecFrameFlags { let flags = unsafe { (*self.to_glib_none().0).flags }; VideoCodecFrameFlags::from_bits_truncate(flags) } #[inline] pub fn set_flags(&mut self, flags: VideoCodecFrameFlags) { unsafe { (*self.to_glib_none().0).flags |= flags.bits() } } #[inline] pub fn unset_flags(&mut self, flags: VideoCodecFrameFlags) { unsafe { (*self.to_glib_none().0).flags &= !flags.bits() } } #[doc(alias = "get_system_frame_number")] #[inline] pub fn system_frame_number(&self) -> u32 { unsafe { (*self.to_glib_none().0).system_frame_number } } #[doc(alias = "get_decode_frame_number")] #[inline] pub fn decode_frame_number(&self) -> u32 { unsafe { (*self.to_glib_none().0).decode_frame_number } } #[doc(alias = "get_presentation_frame_number")] #[inline] pub fn presentation_frame_number(&self) -> u32 { unsafe { (*self.to_glib_none().0).presentation_frame_number } } #[doc(alias = "get_dts")] #[inline] pub fn dts(&self) -> Option { unsafe { from_glib((*self.to_glib_none().0).dts) } } #[inline] pub fn set_dts(&mut self, dts: impl Into>) { unsafe { (*self.to_glib_none().0).dts = dts.into().into_glib(); } } #[doc(alias = "get_pts")] #[inline] pub fn pts(&self) -> Option { unsafe { from_glib((*self.to_glib_none().0).pts) } } #[inline] pub fn set_pts(&mut self, pts: impl Into>) { unsafe { (*self.to_glib_none().0).pts = pts.into().into_glib(); } } #[doc(alias = "get_duration")] #[inline] pub fn duration(&self) -> Option { unsafe { from_glib((*self.to_glib_none().0).duration) } } #[inline] pub fn set_duration(&mut self, duration: impl Into>) { unsafe { (*self.to_glib_none().0).duration = duration.into().into_glib(); } } #[doc(alias = "get_distance_from_sync")] #[inline] pub fn distance_from_sync(&self) -> i32 { unsafe { (*self.to_glib_none().0).distance_from_sync } } 
#[doc(alias = "get_input_buffer")] #[inline] pub fn input_buffer(&self) -> Option<&gst::BufferRef> { unsafe { let ptr = (*self.to_glib_none().0).input_buffer; if ptr.is_null() { None } else { Some(gst::BufferRef::from_ptr(ptr)) } } } #[doc(alias = "get_input_buffer")] #[inline] pub fn input_buffer_owned(&self) -> Option { unsafe { let ptr = (*self.to_glib_none().0).input_buffer; if ptr.is_null() { None } else { Some(from_glib_none(ptr)) } } } #[doc(alias = "get_output_buffer")] #[inline] pub fn output_buffer(&self) -> Option<&gst::BufferRef> { unsafe { let ptr = (*self.to_glib_none().0).output_buffer; if ptr.is_null() { None } else { Some(gst::BufferRef::from_ptr(ptr)) } } } #[doc(alias = "get_output_buffer_mut")] pub fn output_buffer_mut(&mut self) -> Option<&mut gst::BufferRef> { unsafe { let ptr = (*self.to_glib_none().0).output_buffer; if ptr.is_null() { None } else { let writable: bool = from_glib(gst::ffi::gst_mini_object_is_writable( ptr as *const gst::ffi::GstMiniObject, )); debug_assert!(writable); Some(gst::BufferRef::from_mut_ptr(ptr)) } } } pub fn set_output_buffer(&mut self, output_buffer: gst::Buffer) { unsafe { assert!(output_buffer.is_writable()); let prev = (*self.to_glib_none().0).output_buffer; if !prev.is_null() { gst::ffi::gst_mini_object_unref(prev as *mut gst::ffi::GstMiniObject); } (*self.to_glib_none().0).output_buffer = output_buffer.into_glib_ptr(); } } #[doc(alias = "get_deadline")] #[inline] pub fn deadline(&self) -> Option { unsafe { from_glib((*self.to_glib_none().0).deadline) } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "gst_video_decoder_get_processed_subframe_index")] #[inline] pub fn subframes_processed(&self) -> u32 { unsafe { (*self.to_glib_none().0).abidata.ABI.subframes_processed } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[doc(alias = "gst_video_decoder_get_input_subframe_index")] #[inline] pub fn num_subframes(&self) -> u32 { unsafe { (*self.to_glib_none().0).abidata.ABI.num_subframes } } } impl IntoGlibPtr<*mut ffi::GstVideoCodecFrame> for VideoCodecFrame<'_> { #[inline] unsafe fn into_glib_ptr(self) -> *mut ffi::GstVideoCodecFrame { let stream_lock = self.element.stream_lock(); glib::ffi::g_rec_mutex_unlock(stream_lock); let s = mem::ManuallyDrop::new(self); s.to_glib_none().0 } } impl Drop for VideoCodecFrame<'_> { #[inline] fn drop(&mut self) { unsafe { let stream_lock = self.element.stream_lock(); glib::ffi::g_rec_mutex_unlock(stream_lock); ffi::gst_video_codec_frame_unref(self.frame); } } } gstreamer-video-0.23.5/src/video_codec_state.rs000064400000000000000000000154641046102023000176220ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. 
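// Illustrative sketch (added for documentation; not part of the upstream
// gstreamer-video sources): a hypothetical helper showing how a decoder's
// `set_format()` implementation might inspect the readable codec state it
// receives; the function name and the string format are invented.
#[cfg(test)]
#[allow(dead_code)]
mod video_codec_state_usage_sketch {
    use super::*;

    fn describe(state: &VideoCodecState<'_, Readable>) -> String {
        let info = state.info();
        format!(
            "{:?} {}x{}, codec_data present: {}",
            info.format(),
            info.width(),
            info.height(),
            state.codec_data().is_some()
        )
    }
}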
use std::{fmt, marker::PhantomData, ptr}; use glib::translate::*; use crate::{ffi, utils::HasStreamLock, video_info::VideoInfo}; pub trait VideoCodecStateContext<'a> { #[doc(alias = "get_element")] fn element(&self) -> Option<&'a dyn HasStreamLock>; #[doc(alias = "get_element_as_ptr")] fn element_as_ptr(&self) -> *const gst::ffi::GstElement; } pub struct InNegotiation<'a> { /* GstVideoCodecState API isn't safe so protect the state using the * element (decoder or encoder) stream lock */ element: &'a dyn HasStreamLock, } pub struct Readable {} impl<'a> VideoCodecStateContext<'a> for InNegotiation<'a> { #[inline] fn element(&self) -> Option<&'a dyn HasStreamLock> { Some(self.element) } #[inline] fn element_as_ptr(&self) -> *const gst::ffi::GstElement { self.element.element_as_ptr() } } impl<'a> VideoCodecStateContext<'a> for Readable { #[inline] fn element(&self) -> Option<&'a dyn HasStreamLock> { None } #[inline] fn element_as_ptr(&self) -> *const gst::ffi::GstElement { ptr::null() } } pub struct VideoCodecState<'a, T: VideoCodecStateContext<'a>> { state: *mut ffi::GstVideoCodecState, pub(crate) context: T, phantom: PhantomData<&'a T>, } impl<'a, T: VideoCodecStateContext<'a>> fmt::Debug for VideoCodecState<'a, T> { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoCodecState") .field("info", &self.info()) .field("caps", &self.caps()) .field("codec_data", &self.codec_data()) .field("allocation_caps", &self.allocation_caps()) .finish() } } impl VideoCodecState<'_, Readable> { // Take ownership of @state #[inline] pub(crate) unsafe fn new(state: *mut ffi::GstVideoCodecState) -> Self { skip_assert_initialized!(); Self { state, context: Readable {}, phantom: PhantomData, } } } impl<'a> VideoCodecState<'a, InNegotiation<'a>> { // Take ownership of @state #[inline] pub(crate) unsafe fn new( state: *mut ffi::GstVideoCodecState, element: &'a T, ) -> Self { skip_assert_initialized!(); let stream_lock = element.stream_lock(); glib::ffi::g_rec_mutex_lock(stream_lock); Self { state, context: InNegotiation { element }, phantom: PhantomData, } } } impl<'a, T: VideoCodecStateContext<'a>> VideoCodecState<'a, T> { #[doc(alias = "get_info")] #[inline] pub fn info(&self) -> VideoInfo { unsafe { VideoInfo::from_glib_none(&((*self.as_mut_ptr()).info) as *const ffi::GstVideoInfo) } } #[doc(alias = "get_caps")] #[inline] pub fn caps(&self) -> Option<&gst::CapsRef> { unsafe { let ptr = (*self.as_mut_ptr()).caps; if ptr.is_null() { None } else { Some(gst::CapsRef::from_ptr(ptr)) } } } #[doc(alias = "get_caps")] #[inline] pub fn caps_owned(&self) -> Option { unsafe { from_glib_none((*self.as_mut_ptr()).caps) } } #[doc(alias = "get_codec_data")] #[inline] pub fn codec_data(&self) -> Option<&gst::BufferRef> { unsafe { let ptr = (*self.as_mut_ptr()).codec_data; if ptr.is_null() { None } else { Some(gst::BufferRef::from_ptr(ptr)) } } } #[doc(alias = "get_codec_data")] #[inline] pub fn codec_data_owned(&self) -> Option { unsafe { from_glib_none((*self.as_mut_ptr()).codec_data) } } #[doc(alias = "get_allocation_caps")] #[inline] pub fn allocation_caps(&self) -> Option<&gst::CapsRef> { unsafe { let ptr = (*self.as_mut_ptr()).allocation_caps; if ptr.is_null() { None } else { Some(gst::CapsRef::from_ptr(ptr)) } } } #[doc(alias = "get_allocation_caps")] #[inline] pub fn allocation_caps_owned(&self) -> Option { unsafe { from_glib_none((*self.as_mut_ptr()).allocation_caps) } } #[doc(hidden)] #[inline] pub fn as_mut_ptr(&self) -> *mut ffi::GstVideoCodecState { self.state } } impl<'a, T: 
VideoCodecStateContext<'a>> Drop for VideoCodecState<'a, T> { #[inline] fn drop(&mut self) { unsafe { if let Some(element) = self.context.element() { let stream_lock = element.stream_lock(); glib::ffi::g_rec_mutex_unlock(stream_lock); } ffi::gst_video_codec_state_unref(self.state); } } } impl<'a> VideoCodecState<'a, InNegotiation<'a>> { #[inline] pub fn set_info(&mut self, info: VideoInfo) { unsafe { ptr::write(&mut (*self.as_mut_ptr()).info, *(info.to_glib_none().0)); } } #[inline] pub fn set_caps(&mut self, caps: &gst::Caps) { unsafe { let prev = (*self.as_mut_ptr()).caps; if !prev.is_null() { gst::ffi::gst_mini_object_unref(prev as *mut gst::ffi::GstMiniObject) } ptr::write( &mut (*self.as_mut_ptr()).caps, gst::ffi::gst_mini_object_ref(caps.as_mut_ptr() as *mut _) as *mut _, ); } } #[inline] pub fn set_codec_data(&mut self, codec_data: &gst::Buffer) { unsafe { let prev = (*self.as_mut_ptr()).codec_data; if !prev.is_null() { gst::ffi::gst_mini_object_unref(prev as *mut gst::ffi::GstMiniObject) } ptr::write( &mut (*self.as_mut_ptr()).codec_data, gst::ffi::gst_mini_object_ref(codec_data.as_mut_ptr() as *mut _) as *mut _, ); } } #[inline] pub fn set_allocation_caps(&mut self, allocation_caps: &gst::Caps) { unsafe { let prev = (*self.as_mut_ptr()).allocation_caps; if !prev.is_null() { gst::ffi::gst_mini_object_unref(prev as *mut gst::ffi::GstMiniObject) } ptr::write( &mut (*self.as_mut_ptr()).allocation_caps, gst::ffi::gst_mini_object_ref(allocation_caps.as_mut_ptr() as *mut _) as *mut _, ); } } } impl Clone for VideoCodecState<'_, Readable> { #[inline] fn clone(&self) -> Self { unsafe { let state = ffi::gst_video_codec_state_ref(self.state); Self::new(state) } } } unsafe impl Send for VideoCodecState<'_, Readable> {} unsafe impl Sync for VideoCodecState<'_, Readable> {} gstreamer-video-0.23.5/src/video_color_matrix.rs000064400000000000000000000015321046102023000200360ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. use std::mem; use glib::translate::IntoGlib; impl crate::VideoColorMatrix { #[doc(alias = "get_kr_kb")] pub fn kr_kb(&self) -> Result<(f64, f64), glib::BoolError> { assert_initialized_main_thread!(); unsafe { let mut kr = mem::MaybeUninit::uninit(); let mut kb = mem::MaybeUninit::uninit(); glib::result_from_gboolean!( crate::ffi::gst_video_color_matrix_get_Kr_Kb( self.into_glib(), kr.as_mut_ptr(), kb.as_mut_ptr(), ), "{:?} is not a YUV matrix", self )?; let kr = kr.assume_init(); let kb = kb.assume_init(); Ok((kr, kb)) } } } gstreamer-video-0.23.5/src/video_converter.rs000064400000000000000000000350141046102023000173450ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. 
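// Illustrative usage sketch (added for documentation; not part of the upstream
// gstreamer-video sources): converting a single RGBA frame to NV12 with the
// default converter configuration. The 64x64 resolution and the formats are
// arbitrary example values.
#[cfg(test)]
mod video_converter_usage_sketch {
    use super::*;

    #[test]
    fn convert_single_frame() {
        gst::init().unwrap();

        let in_info = crate::VideoInfo::builder(crate::VideoFormat::Rgba, 64, 64)
            .build()
            .unwrap();
        let out_info = crate::VideoInfo::builder(crate::VideoFormat::Nv12, 64, 64)
            .build()
            .unwrap();

        // `None` keeps the default `VideoConverterConfig`.
        let converter = VideoConverter::new(&in_info, &out_info, None).unwrap();

        let in_buffer = gst::Buffer::with_size(in_info.size()).unwrap();
        let out_buffer = gst::Buffer::with_size(out_info.size()).unwrap();

        let in_frame = crate::VideoFrame::from_buffer_readable(in_buffer, &in_info).unwrap();
        let mut out_frame =
            crate::VideoFrame::from_buffer_writable(out_buffer, &out_info).unwrap();

        converter.frame(&in_frame, &mut out_frame);
    }
}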
use std::{ops, ptr};

use crate::{ffi, prelude::*};
use glib::{prelude::*, translate::*};

#[derive(Debug)]
#[doc(alias = "GstVideoConverter")]
pub struct VideoConverter(ptr::NonNull<ffi::GstVideoConverter>);

impl Drop for VideoConverter {
    #[inline]
    fn drop(&mut self) {
        unsafe {
            ffi::gst_video_converter_free(self.0.as_ptr());
        }
    }
}

unsafe impl Send for VideoConverter {}
unsafe impl Sync for VideoConverter {}

impl VideoConverter {
    #[doc(alias = "gst_video_converter_new")]
    pub fn new(
        in_info: &crate::VideoInfo,
        out_info: &crate::VideoInfo,
        config: Option<VideoConverterConfig>,
    ) -> Result<Self, glib::BoolError> {
        skip_assert_initialized!();
        if in_info.fps() != out_info.fps() {
            return Err(glib::bool_error!("Can't do framerate conversion"));
        }
        if in_info.interlace_mode() != out_info.interlace_mode() {
            return Err(glib::bool_error!("Can't do interlacing conversion"));
        }

        unsafe {
            let ptr = ffi::gst_video_converter_new(
                in_info.to_glib_none().0 as *mut _,
                out_info.to_glib_none().0 as *mut _,
                config
                    .map(|s| s.0.into_glib_ptr())
                    .unwrap_or(ptr::null_mut()),
            );
            if ptr.is_null() {
                Err(glib::bool_error!("Failed to create video converter"))
            } else {
                Ok(Self(ptr::NonNull::new_unchecked(ptr)))
            }
        }
    }

    #[doc(alias = "get_config")]
    #[doc(alias = "gst_video_converter_get_config")]
    pub fn config(&self) -> VideoConverterConfig {
        unsafe {
            VideoConverterConfig(
                gst::StructureRef::from_glib_borrow(ffi::gst_video_converter_get_config(
                    self.0.as_ptr(),
                ))
                .to_owned(),
            )
        }
    }

    #[doc(alias = "gst_video_converter_set_config")]
    pub fn set_config(&mut self, config: VideoConverterConfig) {
        unsafe {
            ffi::gst_video_converter_set_config(self.0.as_ptr(), config.0.into_glib_ptr());
        }
    }

    #[cfg(feature = "v1_22")]
    #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))]
    #[doc(alias = "get_in_info")]
    #[doc(alias = "gst_video_converter_get_in_info")]
    pub fn in_info(&self) -> crate::VideoInfo {
        unsafe { from_glib_none(ffi::gst_video_converter_get_in_info(self.0.as_ptr())) }
    }

    #[cfg(feature = "v1_22")]
    #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))]
    #[doc(alias = "get_out_info")]
    #[doc(alias = "gst_video_converter_get_out_info")]
    pub fn out_info(&self) -> crate::VideoInfo {
        unsafe { from_glib_none(ffi::gst_video_converter_get_out_info(self.0.as_ptr())) }
    }

    #[doc(alias = "gst_video_converter_frame")]
    pub fn frame<T>(
        &self,
        src: &crate::VideoFrame<T>,
        dest: &mut crate::VideoFrame<crate::video_frame::Writable>,
    ) {
        unsafe {
            ffi::gst_video_converter_frame(self.0.as_ptr(), src.as_ptr(), dest.as_mut_ptr());
        }
    }

    pub fn frame_ref<T>(
        &self,
        src: &crate::VideoFrameRef<T>,
        dest: &mut crate::VideoFrameRef<&mut gst::BufferRef>,
    ) {
        unsafe {
            ffi::gst_video_converter_frame(self.0.as_ptr(), src.as_ptr(), dest.as_mut_ptr());
        }
    }
}

#[derive(Debug, Clone, PartialEq, Eq)]
pub struct VideoConverterConfig(gst::Structure);

impl ops::Deref for VideoConverterConfig {
    type Target = gst::StructureRef;

    #[inline]
    fn deref(&self) -> &gst::StructureRef {
        self.0.deref()
    }
}

impl ops::DerefMut for VideoConverterConfig {
    #[inline]
    fn deref_mut(&mut self) -> &mut gst::StructureRef {
        self.0.deref_mut()
    }
}

impl AsRef<gst::StructureRef> for VideoConverterConfig {
    #[inline]
    fn as_ref(&self) -> &gst::StructureRef {
        self.0.as_ref()
    }
}

impl AsMut<gst::StructureRef> for VideoConverterConfig {
    #[inline]
    fn as_mut(&mut self) -> &mut gst::StructureRef {
        self.0.as_mut()
    }
}

impl Default for VideoConverterConfig {
    fn default() -> Self {
        Self::new()
    }
}

impl TryFrom<gst::Structure> for VideoConverterConfig {
    type Error = glib::BoolError;

    fn try_from(v: gst::Structure) -> Result<Self, Self::Error> {
        skip_assert_initialized!();
        if v.name() == "GstVideoConverter" {
            Ok(Self(v))
        } else {
            Err(glib::bool_error!("Structure is no VideoConverterConfig"))
        }
    }
}

impl<'a>
TryFrom<&'a gst::StructureRef> for VideoConverterConfig { type Error = glib::BoolError; fn try_from(v: &'a gst::StructureRef) -> Result { skip_assert_initialized!(); Self::try_from(v.to_owned()) } } impl From for gst::Structure { fn from(v: VideoConverterConfig) -> Self { skip_assert_initialized!(); v.0 } } impl glib::value::ToValue for VideoConverterConfig { fn to_value(&self) -> glib::Value { self.0.to_value() } fn value_type(&self) -> glib::Type { self.0.value_type() } } impl glib::value::ToValueOptional for VideoConverterConfig { fn to_value_optional(s: Option<&Self>) -> glib::Value { skip_assert_initialized!(); s.map(|s| &s.0).to_value() } } impl From for glib::Value { fn from(s: VideoConverterConfig) -> glib::Value { skip_assert_initialized!(); s.0.into() } } impl VideoConverterConfig { pub fn new() -> Self { Self(gst::Structure::new_empty("GstVideoConverter")) } pub fn set_resampler_method(&mut self, v: crate::VideoResamplerMethod) { self.0 .set(glib::gstr!("GstVideoConverter.resampler-method"), v); } #[doc(alias = "get_resampler_method")] pub fn resampler_method(&self) -> crate::VideoResamplerMethod { self.0 .get_optional(glib::gstr!("GstVideoConverter.resampler-method")) .expect("Wrong type") .unwrap_or(crate::VideoResamplerMethod::Cubic) } pub fn set_chroma_resampler_method(&mut self, v: crate::VideoResamplerMethod) { self.0 .set(glib::gstr!("GstVideoConverter.chroma-resampler-method"), v); } #[doc(alias = "get_chroma_resampler_method")] pub fn chroma_resampler_method(&self) -> crate::VideoResamplerMethod { self.0 .get_optional(glib::gstr!("GstVideoConverter.chroma-resampler-method")) .expect("Wrong type") .unwrap_or(crate::VideoResamplerMethod::Linear) } pub fn set_resampler_taps(&mut self, v: u32) { self.0 .set(glib::gstr!("GstVideoConverter.resampler-taps"), v); } #[doc(alias = "get_resampler_taps")] pub fn resampler_taps(&self) -> u32 { self.0 .get_optional(glib::gstr!("GstVideoConverter.resampler-taps")) .expect("Wrong type") .unwrap_or(0) } pub fn set_dither_method(&mut self, v: crate::VideoDitherMethod) { self.0 .set(glib::gstr!("GstVideoConverter.dither-method"), v); } #[doc(alias = "get_dither_method")] pub fn dither_method(&self) -> crate::VideoDitherMethod { self.0 .get_optional(glib::gstr!("GstVideoConverter.dither-method")) .expect("Wrong type") .unwrap_or(crate::VideoDitherMethod::Bayer) } pub fn set_dither_quantization(&mut self, v: u32) { self.0 .set(glib::gstr!("GstVideoConverter.dither-quantization"), v); } #[doc(alias = "get_dither_quantization")] pub fn dither_quantization(&self) -> u32 { self.0 .get_optional(glib::gstr!("GstVideoConverter.dither-quantization")) .expect("Wrong type") .unwrap_or(1) } pub fn set_src_x(&mut self, v: i32) { self.0.set(glib::gstr!("GstVideoConverter.src-x"), v); } #[doc(alias = "get_src_x")] pub fn src_x(&self) -> i32 { self.0 .get_optional(glib::gstr!("GstVideoConverter.src-x")) .expect("Wrong type") .unwrap_or(0) } pub fn set_src_y(&mut self, v: i32) { self.0.set(glib::gstr!("GstVideoConverter.src-y"), v); } #[doc(alias = "get_src_y")] pub fn src_y(&self) -> i32 { self.0 .get_optional(glib::gstr!("GstVideoConverter.src-y")) .expect("Wrong type") .unwrap_or(0) } pub fn set_src_width(&mut self, v: Option) { if let Some(v) = v { self.0.set(glib::gstr!("GstVideoConverter.src-width"), v); } else { self.0 .remove_field(glib::gstr!("GstVideoConverter.src-width")); } } #[doc(alias = "get_src_width")] pub fn src_width(&self) -> Option { self.0 .get_optional(glib::gstr!("GstVideoConverter.src-width")) .expect("Wrong type") } pub fn 
set_src_height(&mut self, v: Option) { if let Some(v) = v { self.0.set(glib::gstr!("GstVideoConverter.src-height"), v); } else { self.0 .remove_field(glib::gstr!("GstVideoConverter.src-height")); } } #[doc(alias = "get_src_height")] pub fn src_height(&self) -> Option { self.0 .get_optional(glib::gstr!("GstVideoConverter.src-height")) .expect("Wrong type") } pub fn set_dest_x(&mut self, v: i32) { self.0.set(glib::gstr!("GstVideoConverter.dest-x"), v); } #[doc(alias = "get_dest_x")] pub fn dest_x(&self) -> i32 { self.0 .get_optional(glib::gstr!("GstVideoConverter.dest-x")) .expect("Wrong type") .unwrap_or(0) } pub fn set_dest_y(&mut self, v: i32) { self.0.set(glib::gstr!("GstVideoConverter.dest-y"), v); } #[doc(alias = "get_dest_y")] pub fn dest_y(&self) -> i32 { self.0 .get_optional(glib::gstr!("GstVideoConverter.dest-y")) .expect("Wrong type") .unwrap_or(0) } pub fn set_dest_width(&mut self, v: Option) { if let Some(v) = v { self.0.set(glib::gstr!("GstVideoConverter.dest-width"), v); } else { self.0 .remove_field(glib::gstr!("GstVideoConverter.dest-width")); } } #[doc(alias = "get_dest_width")] pub fn dest_width(&self) -> Option { self.0 .get_optional(glib::gstr!("GstVideoConverter.dest-width")) .expect("Wrong type") } pub fn set_dest_height(&mut self, v: Option) { if let Some(v) = v { self.0.set(glib::gstr!("GstVideoConverter.dest-height"), v); } else { self.0 .remove_field(glib::gstr!("GstVideoConverter.dest-height")); } } #[doc(alias = "get_dest_height")] pub fn dest_height(&self) -> Option { self.0 .get_optional(glib::gstr!("GstVideoConverter.dest-height")) .expect("Wrong type") } pub fn set_fill_border(&mut self, v: bool) { self.0.set(glib::gstr!("GstVideoConverter.fill-border"), v); } #[doc(alias = "get_fill_border")] pub fn fills_border(&self) -> bool { self.0 .get_optional(glib::gstr!("GstVideoConverter.fill-border")) .expect("Wrong type") .unwrap_or(true) } pub fn set_alpha_value(&mut self, v: f64) { self.0.set(glib::gstr!("GstVideoConverter.alpha-value"), v); } #[doc(alias = "get_alpha_value")] pub fn alpha_value(&self) -> f64 { self.0 .get_optional(glib::gstr!("GstVideoConverter.alpha-value")) .expect("Wrong type") .unwrap_or(1.0) } pub fn set_alpha_mode(&mut self, v: crate::VideoAlphaMode) { self.0.set(glib::gstr!("GstVideoConverter.alpha-mode"), v); } #[doc(alias = "get_alpha_mode")] pub fn alpha_mode(&self) -> crate::VideoAlphaMode { self.0 .get_optional(glib::gstr!("GstVideoConverter.alpha-mode")) .expect("Wrong type") .unwrap_or(crate::VideoAlphaMode::Copy) } pub fn set_border_argb(&mut self, v: u32) { self.0.set(glib::gstr!("GstVideoConverter.border-argb"), v); } #[doc(alias = "get_border_argb")] pub fn border_argb(&self) -> u32 { self.0 .get_optional(glib::gstr!("GstVideoConverter.border-argb")) .expect("Wrong type") .unwrap_or(0xff_00_00_00) } pub fn set_chroma_mode(&mut self, v: crate::VideoChromaMode) { self.0.set(glib::gstr!("GstVideoConverter.chroma-mode"), v); } #[doc(alias = "get_chroma_mode")] pub fn chroma_mode(&self) -> crate::VideoChromaMode { self.0 .get_optional(glib::gstr!("GstVideoConverter.chroma-mode")) .expect("Wrong type") .unwrap_or(crate::VideoChromaMode::Full) } pub fn set_matrix_mode(&mut self, v: crate::VideoMatrixMode) { self.0.set(glib::gstr!("GstVideoConverter.matrix-mode"), v); } #[doc(alias = "get_matrix_mode")] pub fn matrix_mode(&self) -> crate::VideoMatrixMode { self.0 .get_optional(glib::gstr!("GstVideoConverter.matrix-mode")) .expect("Wrong type") .unwrap_or(crate::VideoMatrixMode::Full) } pub fn set_gamma_mode(&mut self, v: 
crate::VideoGammaMode) { self.0.set(glib::gstr!("GstVideoConverter.gamma-mode"), v); } #[doc(alias = "get_gamma_mode")] pub fn gamma_mode(&self) -> crate::VideoGammaMode { self.0 .get_optional(glib::gstr!("GstVideoConverter.gamma-mode")) .expect("Wrong type") .unwrap_or(crate::VideoGammaMode::None) } pub fn set_primaries_mode(&mut self, v: crate::VideoPrimariesMode) { self.0 .set(glib::gstr!("GstVideoConverter.primaries-mode"), v); } #[doc(alias = "get_primaries_mode")] pub fn primaries_mode(&self) -> crate::VideoPrimariesMode { self.0 .get_optional(glib::gstr!("GstVideoConverter.primaries-mode")) .expect("Wrong type") .unwrap_or(crate::VideoPrimariesMode::None) } pub fn set_threads(&mut self, v: u32) { self.0.set(glib::gstr!("GstVideoConverter.threads"), v); } #[doc(alias = "get_threads")] pub fn threads(&self) -> u32 { self.0 .get_optional(glib::gstr!("GstVideoConverter.threads")) .expect("Wrong type") .unwrap_or(1) } } gstreamer-video-0.23.5/src/video_decoder.rs000064400000000000000000000262101046102023000167410ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. use std::{mem, ptr}; use glib::{prelude::*, translate::*}; #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] use crate::VideoInterlaceMode; use crate::{ ffi, utils::HasStreamLock, video_codec_state::{InNegotiation, Readable, VideoCodecState, VideoCodecStateContext}, VideoCodecFrame, VideoDecoder, VideoFormat, }; extern "C" { fn _gst_video_decoder_error( dec: *mut ffi::GstVideoDecoder, weight: i32, domain: glib::ffi::GQuark, code: i32, txt: *mut libc::c_char, debug: *mut libc::c_char, file: *const libc::c_char, function: *const libc::c_char, line: i32, ) -> gst::ffi::GstFlowReturn; } mod sealed { pub trait Sealed {} impl> Sealed for T {} } pub trait VideoDecoderExtManual: sealed::Sealed + IsA + 'static { #[doc(alias = "gst_video_decoder_allocate_output_frame")] fn allocate_output_frame( &self, frame: &mut VideoCodecFrame, params: Option<&gst::BufferPoolAcquireParams>, ) -> Result { unsafe { let params_ptr = params.to_glib_none().0 as *mut _; try_from_glib(ffi::gst_video_decoder_allocate_output_frame_with_params( self.as_ref().to_glib_none().0, frame.to_glib_none().0, params_ptr, )) } } #[doc(alias = "get_frame")] #[doc(alias = "gst_video_decoder_get_frame")] fn frame(&self, frame_number: i32) -> Option { let frame = unsafe { ffi::gst_video_decoder_get_frame(self.as_ref().to_glib_none().0, frame_number) }; if frame.is_null() { None } else { unsafe { Some(VideoCodecFrame::new(frame, self.as_ref())) } } } #[doc(alias = "get_frames")] #[doc(alias = "gst_video_decoder_get_frames")] fn frames(&self) -> Vec { unsafe { let frames = ffi::gst_video_decoder_get_frames(self.as_ref().to_glib_none().0); let mut iter: *const glib::ffi::GList = frames; let mut vec = Vec::new(); while !iter.is_null() { let frame_ptr = Ptr::from((*iter).data); /* transfer ownership of the frame */ let frame = VideoCodecFrame::new(frame_ptr, self.as_ref()); vec.push(frame); iter = (*iter).next; } glib::ffi::g_list_free(frames); vec } } #[doc(alias = "get_oldest_frame")] #[doc(alias = "gst_video_decoder_get_oldest_frame")] fn oldest_frame(&self) -> Option { let frame = unsafe { ffi::gst_video_decoder_get_oldest_frame(self.as_ref().to_glib_none().0) }; if frame.is_null() { None } else { unsafe { Some(VideoCodecFrame::new(frame, self.as_ref())) } } } #[doc(alias = "get_allocator")] #[doc(alias = "gst_video_decoder_get_allocator")] fn allocator(&self) -> (Option, gst::AllocationParams) { unsafe { 
let mut allocator = ptr::null_mut(); let mut params = mem::MaybeUninit::uninit(); ffi::gst_video_decoder_get_allocator( self.as_ref().to_glib_none().0, &mut allocator, params.as_mut_ptr(), ); (from_glib_full(allocator), params.assume_init().into()) } } #[doc(alias = "get_latency")] #[doc(alias = "gst_video_decoder_get_latency")] fn latency(&self) -> (gst::ClockTime, Option) { let mut min_latency = gst::ffi::GST_CLOCK_TIME_NONE; let mut max_latency = gst::ffi::GST_CLOCK_TIME_NONE; unsafe { ffi::gst_video_decoder_get_latency( self.as_ref().to_glib_none().0, &mut min_latency, &mut max_latency, ); ( try_from_glib(min_latency).expect("undefined min_latency"), from_glib(max_latency), ) } } #[doc(alias = "gst_video_decoder_set_latency")] fn set_latency( &self, min_latency: gst::ClockTime, max_latency: impl Into>, ) { unsafe { ffi::gst_video_decoder_set_latency( self.as_ref().to_glib_none().0, min_latency.into_glib(), max_latency.into().into_glib(), ); } } #[doc(alias = "get_output_state")] #[doc(alias = "gst_video_decoder_get_output_state")] fn output_state(&self) -> Option> { let state = unsafe { ffi::gst_video_decoder_get_output_state(self.as_ref().to_glib_none().0) }; if state.is_null() { None } else { unsafe { Some(VideoCodecState::::new(state)) } } } #[doc(alias = "gst_video_decoder_set_output_state")] fn set_output_state( &self, fmt: VideoFormat, width: u32, height: u32, reference: Option<&VideoCodecState>, ) -> Result, gst::FlowError> { let state = unsafe { let reference = match reference { Some(reference) => reference.as_mut_ptr(), None => ptr::null_mut(), }; ffi::gst_video_decoder_set_output_state( self.as_ref().to_glib_none().0, fmt.into_glib(), width, height, reference, ) }; if state.is_null() { Err(gst::FlowError::NotNegotiated) } else { unsafe { Ok(VideoCodecState::::new(state, self.as_ref())) } } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(alias = "gst_video_decoder_set_interlaced_output_state")] fn set_interlaced_output_state( &self, fmt: VideoFormat, mode: VideoInterlaceMode, width: u32, height: u32, reference: Option<&VideoCodecState>, ) -> Result, gst::FlowError> { let state = unsafe { let reference = match reference { Some(reference) => reference.as_mut_ptr(), None => ptr::null_mut(), }; ffi::gst_video_decoder_set_interlaced_output_state( self.as_ref().to_glib_none().0, fmt.into_glib(), mode.into_glib(), width, height, reference, ) }; if state.is_null() { Err(gst::FlowError::NotNegotiated) } else { unsafe { Ok(VideoCodecState::::new(state, self.as_ref())) } } } #[doc(alias = "gst_video_decoder_negotiate")] fn negotiate<'a>( &'a self, output_state: VideoCodecState<'a, InNegotiation<'a>>, ) -> Result<(), gst::FlowError> { // Consume output_state so user won't be able to modify it anymore let self_ptr = self.to_glib_none().0 as *const gst::ffi::GstElement; assert_eq!(output_state.context.element_as_ptr(), self_ptr); let ret = unsafe { from_glib(ffi::gst_video_decoder_negotiate( self.as_ref().to_glib_none().0, )) }; if ret { Ok(()) } else { Err(gst::FlowError::NotNegotiated) } } #[allow(clippy::too_many_arguments)] fn error( &self, weight: i32, code: T, message: Option<&str>, debug: Option<&str>, file: &str, function: &str, line: u32, ) -> Result { unsafe { try_from_glib(_gst_video_decoder_error( self.as_ref().to_glib_none().0, weight, T::domain().into_glib(), code.code(), message.to_glib_full(), debug.to_glib_full(), file.to_glib_none().0, function.to_glib_none().0, line as i32, )) } } fn sink_pad(&self) -> &gst::Pad { unsafe { let elt = 
&*(self.as_ptr() as *const ffi::GstVideoDecoder); &*(&elt.sinkpad as *const *mut gst::ffi::GstPad as *const gst::Pad) } } fn src_pad(&self) -> &gst::Pad { unsafe { let elt = &*(self.as_ptr() as *const ffi::GstVideoDecoder); &*(&elt.srcpad as *const *mut gst::ffi::GstPad as *const gst::Pad) } } fn input_segment(&self) -> gst::Segment { unsafe { let ptr: &ffi::GstVideoDecoder = &*(self.as_ptr() as *const _); glib::ffi::g_rec_mutex_lock(mut_override(&ptr.stream_lock)); let segment = ptr.input_segment; glib::ffi::g_rec_mutex_unlock(mut_override(&ptr.stream_lock)); from_glib_none(&segment as *const gst::ffi::GstSegment) } } fn output_segment(&self) -> gst::Segment { unsafe { let ptr: &ffi::GstVideoDecoder = &*(self.as_ptr() as *const _); glib::ffi::g_rec_mutex_lock(mut_override(&ptr.stream_lock)); let segment = ptr.output_segment; glib::ffi::g_rec_mutex_unlock(mut_override(&ptr.stream_lock)); from_glib_none(&segment as *const gst::ffi::GstSegment) } } } impl> VideoDecoderExtManual for O {} impl HasStreamLock for VideoDecoder { fn stream_lock(&self) -> *mut glib::ffi::GRecMutex { let decoder_sys: *const ffi::GstVideoDecoder = self.to_glib_none().0; unsafe { mut_override(&(*decoder_sys).stream_lock) } } fn element_as_ptr(&self) -> *const gst::ffi::GstElement { self.as_ptr() as *mut gst::ffi::GstElement } } #[macro_export] macro_rules! video_decoder_error( ($obj:expr, $weight:expr, $err:expr, ($($msg:tt)*), [$($debug:tt)*]) => { { use $crate::prelude::VideoDecoderExtManual; $obj.error( $weight, $err, Some(&format!($($msg)*)), Some(&format!($($debug)*)), file!(), $crate::glib::function_name!(), line!(), ) }}; ($obj:expr, $weight:expr, $err:expr, ($($msg:tt)*)) => { { use $crate::prelude::VideoDecoderExtManual; $obj.error( $weight, $err, Some(&format!($($msg)*)), None, file!(), $crate::glib::function_name!(), line!(), ) }}; ($obj:expr, $weight:expr, $err:expr, [$($debug:tt)*]) => { { use $crate::prelude::VideoDecoderExtManual; $obj.error( $weight, $err, None, Some(&format!($($debug)*)), file!(), $crate::glib::function_name!(), line!(), ) }}; ); gstreamer-video-0.23.5/src/video_encoder.rs000064400000000000000000000207141046102023000167560ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. 
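
// Editorial addition: a small, hedged sketch of the `VideoEncoderExtManual`
// convenience API implemented below. "vp8enc" is only an example element and may
// not be installed, in which case the test returns early instead of failing.
#[cfg(test)]
mod encoder_ext_sketch {
    use crate::prelude::*;

    #[test]
    fn query_encoder_properties() {
        gst::init().unwrap();

        // Look up an existing video encoder element; bail out if the plugin is absent.
        let Ok(element) = gst::ElementFactory::make("vp8enc").build() else {
            return;
        };
        let encoder = element.downcast::<crate::VideoEncoder>().unwrap();

        // Latency is reported as a (minimum, optional maximum) pair.
        let (min, max) = encoder.latency();
        assert!(max.map_or(true, |max| max >= min));

        // The always-present sink and source pads can be borrowed directly.
        assert_eq!(encoder.sink_pad().name(), "sink");
        assert_eq!(encoder.src_pad().name(), "src");
    }
}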
use std::{mem, ptr}; use glib::{prelude::*, translate::*}; use crate::{ ffi, utils::HasStreamLock, video_codec_state::{InNegotiation, Readable, VideoCodecState, VideoCodecStateContext}, VideoCodecFrame, VideoEncoder, }; mod sealed { pub trait Sealed {} impl> Sealed for T {} } pub trait VideoEncoderExtManual: sealed::Sealed + IsA + 'static { #[doc(alias = "gst_video_encoder_allocate_output_frame")] fn allocate_output_frame( &self, frame: &mut VideoCodecFrame, size: usize, ) -> Result { unsafe { try_from_glib(ffi::gst_video_encoder_allocate_output_frame( self.as_ref().to_glib_none().0, frame.to_glib_none().0, size, )) } } #[doc(alias = "get_frame")] #[doc(alias = "gst_video_encoder_get_frame")] fn frame(&self, frame_number: i32) -> Option { let frame = unsafe { ffi::gst_video_encoder_get_frame(self.as_ref().to_glib_none().0, frame_number) }; if frame.is_null() { None } else { unsafe { Some(VideoCodecFrame::new(frame, self.as_ref())) } } } #[doc(alias = "get_frames")] #[doc(alias = "gst_video_encoder_get_frames")] fn frames(&self) -> Vec { unsafe { let frames = ffi::gst_video_encoder_get_frames(self.as_ref().to_glib_none().0); let mut iter: *const glib::ffi::GList = frames; let mut vec = Vec::new(); while !iter.is_null() { let frame_ptr = Ptr::from((*iter).data); /* transfer ownership of the frame */ let frame = VideoCodecFrame::new(frame_ptr, self.as_ref()); vec.push(frame); iter = (*iter).next; } glib::ffi::g_list_free(frames); vec } } #[doc(alias = "get_oldest_frame")] #[doc(alias = "gst_video_encoder_get_oldest_frame")] fn oldest_frame(&self) -> Option { let frame = unsafe { ffi::gst_video_encoder_get_oldest_frame(self.as_ref().to_glib_none().0) }; if frame.is_null() { None } else { unsafe { Some(VideoCodecFrame::new(frame, self.as_ref())) } } } #[doc(alias = "get_allocator")] #[doc(alias = "gst_video_encoder_get_allocator")] fn allocator(&self) -> (Option, gst::AllocationParams) { unsafe { let mut allocator = ptr::null_mut(); let mut params = mem::MaybeUninit::uninit(); ffi::gst_video_encoder_get_allocator( self.as_ref().to_glib_none().0, &mut allocator, params.as_mut_ptr(), ); (from_glib_full(allocator), params.assume_init().into()) } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "gst_video_encoder_finish_subframe")] fn finish_subframe(&self, frame: &VideoCodecFrame) -> Result { unsafe { try_from_glib(ffi::gst_video_encoder_finish_subframe( self.as_ref().to_glib_none().0, frame.to_glib_none().0, )) } } #[doc(alias = "get_latency")] #[doc(alias = "gst_video_encoder_get_latency")] fn latency(&self) -> (gst::ClockTime, Option) { let mut min_latency = gst::ffi::GST_CLOCK_TIME_NONE; let mut max_latency = gst::ffi::GST_CLOCK_TIME_NONE; unsafe { ffi::gst_video_encoder_get_latency( self.as_ref().to_glib_none().0, &mut min_latency, &mut max_latency, ); ( try_from_glib(min_latency).expect("undefined min_latency"), from_glib(max_latency), ) } } #[doc(alias = "gst_video_encoder_set_latency")] fn set_latency( &self, min_latency: gst::ClockTime, max_latency: impl Into>, ) { unsafe { ffi::gst_video_encoder_set_latency( self.as_ref().to_glib_none().0, min_latency.into_glib(), max_latency.into().into_glib(), ); } } #[doc(alias = "get_output_state")] #[doc(alias = "gst_video_encoder_get_output_state")] fn output_state(&self) -> Option> { let state = unsafe { ffi::gst_video_encoder_get_output_state(self.as_ref().to_glib_none().0) }; if state.is_null() { None } else { unsafe { Some(VideoCodecState::::new(state)) } } } #[doc(alias = 
"gst_video_encoder_set_output_state")] fn set_output_state( &self, caps: gst::Caps, reference: Option<&VideoCodecState>, ) -> Result, gst::FlowError> { let state = unsafe { let reference = match reference { Some(reference) => reference.as_mut_ptr(), None => ptr::null_mut(), }; ffi::gst_video_encoder_set_output_state( self.as_ref().to_glib_none().0, caps.into_glib_ptr(), reference, ) }; if state.is_null() { Err(gst::FlowError::NotNegotiated) } else { unsafe { Ok(VideoCodecState::::new(state, self.as_ref())) } } } #[doc(alias = "gst_video_encoder_negotiate")] fn negotiate<'a>( &'a self, output_state: VideoCodecState<'a, InNegotiation<'a>>, ) -> Result<(), gst::FlowError> { // Consume output_state so user won't be able to modify it anymore let self_ptr = self.to_glib_none().0 as *const gst::ffi::GstElement; assert_eq!(output_state.context.element_as_ptr(), self_ptr); let ret = unsafe { from_glib(ffi::gst_video_encoder_negotiate( self.as_ref().to_glib_none().0, )) }; if ret { Ok(()) } else { Err(gst::FlowError::NotNegotiated) } } #[doc(alias = "gst_video_encoder_set_headers")] fn set_headers(&self, headers: impl IntoIterator) { unsafe { ffi::gst_video_encoder_set_headers( self.as_ref().to_glib_none().0, headers .into_iter() .collect::>() .into_glib_ptr(), ); } } fn sink_pad(&self) -> &gst::Pad { unsafe { let elt = &*(self.as_ptr() as *const ffi::GstVideoEncoder); &*(&elt.sinkpad as *const *mut gst::ffi::GstPad as *const gst::Pad) } } fn src_pad(&self) -> &gst::Pad { unsafe { let elt = &*(self.as_ptr() as *const ffi::GstVideoEncoder); &*(&elt.srcpad as *const *mut gst::ffi::GstPad as *const gst::Pad) } } fn input_segment(&self) -> gst::Segment { unsafe { let ptr: &ffi::GstVideoDecoder = &*(self.as_ptr() as *const _); glib::ffi::g_rec_mutex_lock(mut_override(&ptr.stream_lock)); let segment = ptr.input_segment; glib::ffi::g_rec_mutex_unlock(mut_override(&ptr.stream_lock)); from_glib_none(&segment as *const gst::ffi::GstSegment) } } fn output_segment(&self) -> gst::Segment { unsafe { let ptr: &ffi::GstVideoDecoder = &*(self.as_ptr() as *const _); glib::ffi::g_rec_mutex_lock(mut_override(&ptr.stream_lock)); let segment = ptr.output_segment; glib::ffi::g_rec_mutex_unlock(mut_override(&ptr.stream_lock)); from_glib_none(&segment as *const gst::ffi::GstSegment) } } } impl> VideoEncoderExtManual for O {} impl HasStreamLock for VideoEncoder { fn stream_lock(&self) -> *mut glib::ffi::GRecMutex { let encoder_sys: *const ffi::GstVideoEncoder = self.to_glib_none().0; unsafe { mut_override(&(*encoder_sys).stream_lock) } } fn element_as_ptr(&self) -> *const gst::ffi::GstElement { self.as_ptr() as *const gst::ffi::GstElement } } gstreamer-video-0.23.5/src/video_event.rs000064400000000000000000001612301046102023000164570ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. use std::mem; use glib::{prelude::*, translate::*}; use gst::EventType; #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] use crate::NavigationModifierType; use crate::{ffi, NavigationCommand, NavigationEventType}; // FIXME: Copy from gstreamer/src/event.rs macro_rules! 
event_builder_generic_impl { ($new_fn:expr) => { pub fn seqnum(self, seqnum: gst::Seqnum) -> Self { Self { seqnum: Some(seqnum), ..self } } pub fn seqnum_if(self, seqnum: gst::Seqnum, predicate: bool) -> Self { if predicate { self.seqnum(seqnum) } else { self } } pub fn seqnum_if_some(self, seqnum: Option) -> Self { if let Some(seqnum) = seqnum { self.seqnum(seqnum) } else { self } } pub fn running_time_offset(self, running_time_offset: i64) -> Self { Self { running_time_offset: Some(running_time_offset), ..self } } pub fn running_time_offset_if(self, running_time_offset: i64, predicate: bool) -> Self { if predicate { self.running_time_offset(running_time_offset) } else { self } } pub fn running_time_offset_if_some(self, running_time_offset: Option) -> Self { if let Some(running_time_offset) = running_time_offset { self.running_time_offset(running_time_offset) } else { self } } pub fn other_field(self, name: &'a str, value: impl ToSendValue) -> Self { let mut other_fields = self.other_fields; other_fields.push((name, value.to_send_value())); Self { other_fields, ..self } } gst::impl_builder_gvalue_extra_setters!(other_field); #[deprecated = "use build.other_field() instead"] pub fn other_fields( self, other_fields: &[(&'a str, &'a (dyn ToSendValue + Sync))], ) -> Self { let mut s = self; for (name, value) in other_fields { s = s.other_field(name, value.to_send_value()); } s } #[must_use = "Building the event without using it has no effect"] #[allow(clippy::redundant_closure_call)] pub fn build(mut self) -> gst::Event { skip_assert_initialized!(); unsafe { let event = $new_fn(&mut self); if let Some(seqnum) = self.seqnum { gst::ffi::gst_event_set_seqnum(event, seqnum.into_glib()); } if let Some(running_time_offset) = self.running_time_offset { gst::ffi::gst_event_set_running_time_offset(event, running_time_offset); } { let s = gst::StructureRef::from_glib_borrow_mut( gst::ffi::gst_event_writable_structure(event), ); for (k, v) in self.other_fields { s.set_value(k, v); } } from_glib_full(event) } } }; } #[must_use = "The builder must be built to be used"] pub struct DownstreamForceKeyUnitEventBuilder<'a> { seqnum: Option, running_time_offset: Option, other_fields: Vec<(&'a str, glib::SendValue)>, timestamp: Option, stream_time: Option, running_time: Option, all_headers: bool, count: u32, } impl<'a> DownstreamForceKeyUnitEventBuilder<'a> { fn new() -> Self { skip_assert_initialized!(); Self { seqnum: None, running_time_offset: None, other_fields: Vec::new(), timestamp: gst::ClockTime::NONE, stream_time: gst::ClockTime::NONE, running_time: gst::ClockTime::NONE, all_headers: true, count: 0, } } pub fn timestamp(self, timestamp: impl Into>) -> Self { Self { timestamp: timestamp.into(), ..self } } pub fn timestamp_if(self, timestamp: gst::ClockTime, predicate: bool) -> Self { if predicate { self.timestamp(timestamp) } else { self } } pub fn timestamp_if_some(self, timestamp: Option) -> Self { if let Some(timestamp) = timestamp { self.timestamp(timestamp) } else { self } } pub fn stream_time(self, stream_time: impl Into>) -> Self { Self { stream_time: stream_time.into(), ..self } } pub fn stream_time_if(self, stream_time: gst::ClockTime, predicate: bool) -> Self { if predicate { self.stream_time(stream_time) } else { self } } pub fn stream_time_if_some(self, stream_time: Option) -> Self { if let Some(stream_time) = stream_time { self.stream_time(stream_time) } else { self } } pub fn running_time(self, running_time: impl Into>) -> Self { Self { running_time: running_time.into(), ..self } } pub fn 
running_time_if(self, running_time: gst::ClockTime, predicate: bool) -> Self { if predicate { self.running_time(running_time) } else { self } } pub fn running_time_if_some(self, running_time: Option) -> Self { if let Some(running_time) = running_time { self.running_time(running_time) } else { self } } pub fn all_headers(self, all_headers: bool) -> Self { Self { all_headers, ..self } } pub fn all_headers_if_some(self, all_headers: Option) -> Self { if let Some(all_headers) = all_headers { self.all_headers(all_headers) } else { self } } pub fn count(self, count: u32) -> Self { Self { count, ..self } } pub fn count_if(self, count: u32, predicate: bool) -> Self { if predicate { self.count(count) } else { self } } pub fn count_if_some(self, count: Option) -> Self { if let Some(count) = count { self.count(count) } else { self } } event_builder_generic_impl!(|s: &mut Self| { ffi::gst_video_event_new_downstream_force_key_unit( s.timestamp.into_glib(), s.stream_time.into_glib(), s.running_time.into_glib(), s.all_headers.into_glib(), s.count, ) }); } #[derive(Clone, PartialEq, Eq, Debug)] pub struct DownstreamForceKeyUnitEvent { pub timestamp: Option, pub stream_time: Option, pub running_time: Option, pub all_headers: bool, pub count: u32, } impl DownstreamForceKeyUnitEvent { pub fn builder<'a>() -> DownstreamForceKeyUnitEventBuilder<'a> { assert_initialized_main_thread!(); DownstreamForceKeyUnitEventBuilder::new() } #[doc(alias = "gst_video_event_parse_downstream_force_key_unit")] pub fn parse(event: &gst::EventRef) -> Result { skip_assert_initialized!(); unsafe { let mut timestamp = mem::MaybeUninit::uninit(); let mut stream_time = mem::MaybeUninit::uninit(); let mut running_time = mem::MaybeUninit::uninit(); let mut all_headers = mem::MaybeUninit::uninit(); let mut count = mem::MaybeUninit::uninit(); let res: bool = from_glib(ffi::gst_video_event_parse_downstream_force_key_unit( event.as_mut_ptr(), timestamp.as_mut_ptr(), stream_time.as_mut_ptr(), running_time.as_mut_ptr(), all_headers.as_mut_ptr(), count.as_mut_ptr(), )); if res { Ok(Self { timestamp: from_glib(timestamp.assume_init()), stream_time: from_glib(stream_time.assume_init()), running_time: from_glib(running_time.assume_init()), all_headers: from_glib(all_headers.assume_init()), count: count.assume_init(), }) } else { Err(glib::bool_error!("Failed to parse GstEvent")) } } } } #[must_use = "The builder must be built to be used"] pub struct UpstreamForceKeyUnitEventBuilder<'a> { seqnum: Option, running_time_offset: Option, other_fields: Vec<(&'a str, glib::SendValue)>, running_time: Option, all_headers: bool, count: u32, } impl<'a> UpstreamForceKeyUnitEventBuilder<'a> { fn new() -> Self { skip_assert_initialized!(); Self { seqnum: None, running_time_offset: None, other_fields: Vec::new(), running_time: gst::ClockTime::NONE, all_headers: true, count: 0, } } pub fn running_time(self, running_time: impl Into>) -> Self { Self { running_time: running_time.into(), ..self } } pub fn running_time_if(self, running_time: gst::ClockTime, predicate: bool) -> Self { if predicate { self.running_time(running_time) } else { self } } pub fn running_time_if_some(self, running_time: Option) -> Self { if let Some(running_time) = running_time { self.running_time(running_time) } else { self } } pub fn all_headers(self, all_headers: bool) -> Self { Self { all_headers, ..self } } pub fn all_headers_if_some(self, all_headers: Option) -> Self { if let Some(all_headers) = all_headers { self.all_headers(all_headers) } else { self } } pub fn count(self, count: u32) 
-> Self { Self { count, ..self } } pub fn count_if(self, count: u32, predicate: bool) -> Self { if predicate { self.count(count) } else { self } } pub fn count_if_some(self, count: Option) -> Self { if let Some(count) = count { self.count(count) } else { self } } event_builder_generic_impl!(|s: &mut Self| { ffi::gst_video_event_new_upstream_force_key_unit( s.running_time.into_glib(), s.all_headers.into_glib(), s.count, ) }); } #[derive(Clone, PartialEq, Eq, Debug)] pub struct UpstreamForceKeyUnitEvent { pub running_time: Option, pub all_headers: bool, pub count: u32, } impl UpstreamForceKeyUnitEvent { pub fn builder<'a>() -> UpstreamForceKeyUnitEventBuilder<'a> { assert_initialized_main_thread!(); UpstreamForceKeyUnitEventBuilder::new() } #[doc(alias = "gst_video_event_parse_upstream_force_key_unit")] pub fn parse(event: &gst::EventRef) -> Result { skip_assert_initialized!(); unsafe { let mut running_time = mem::MaybeUninit::uninit(); let mut all_headers = mem::MaybeUninit::uninit(); let mut count = mem::MaybeUninit::uninit(); let res: bool = from_glib(ffi::gst_video_event_parse_upstream_force_key_unit( event.as_mut_ptr(), running_time.as_mut_ptr(), all_headers.as_mut_ptr(), count.as_mut_ptr(), )); if res { Ok(Self { running_time: from_glib(running_time.assume_init()), all_headers: from_glib(all_headers.assume_init()), count: count.assume_init(), }) } else { Err(glib::bool_error!("Failed to parse GstEvent")) } } } } #[derive(Clone, PartialEq, Eq, Debug)] pub enum ForceKeyUnitEvent { Downstream(DownstreamForceKeyUnitEvent), Upstream(UpstreamForceKeyUnitEvent), } impl ForceKeyUnitEvent { #[doc(alias = "gst_video_event_is_force_key_unit")] pub fn is(event: &gst::EventRef) -> bool { skip_assert_initialized!(); unsafe { from_glib(ffi::gst_video_event_is_force_key_unit(event.as_mut_ptr())) } } pub fn parse(event: &gst::EventRef) -> Result { skip_assert_initialized!(); if event.is_upstream() { UpstreamForceKeyUnitEvent::parse(event).map(Self::Upstream) } else { DownstreamForceKeyUnitEvent::parse(event).map(Self::Downstream) } } } #[must_use = "The builder must be built to be used"] pub struct StillFrameEventBuilder<'a> { seqnum: Option, running_time_offset: Option, other_fields: Vec<(&'a str, glib::SendValue)>, in_still: bool, } impl<'a> StillFrameEventBuilder<'a> { fn new(in_still: bool) -> Self { skip_assert_initialized!(); Self { seqnum: None, running_time_offset: None, other_fields: Vec::new(), in_still, } } event_builder_generic_impl!(|s: &mut Self| ffi::gst_video_event_new_still_frame( s.in_still.into_glib() )); } #[derive(Clone, PartialEq, Eq, Debug)] pub struct StillFrameEvent { pub in_still: bool, } impl StillFrameEvent { pub fn builder<'a>(in_still: bool) -> StillFrameEventBuilder<'a> { assert_initialized_main_thread!(); StillFrameEventBuilder::new(in_still) } #[doc(alias = "gst_video_event_parse_still_frame")] pub fn parse(event: &gst::EventRef) -> Result { skip_assert_initialized!(); unsafe { let mut in_still = mem::MaybeUninit::uninit(); let res: bool = from_glib(ffi::gst_video_event_parse_still_frame( event.as_mut_ptr(), in_still.as_mut_ptr(), )); if res { Ok(Self { in_still: from_glib(in_still.assume_init()), }) } else { Err(glib::bool_error!("Invalid still-frame event")) } } } } macro_rules! nav_event_builder { ($builder:ident, $($event_field:ident: $event_type:ty,)? 
[$( $field_names:ident : $field_types:ty),*], $new_fn: expr) => { #[must_use = "The builder must be built to be used"] pub struct $builder<'a> { seqnum: Option, running_time_offset: Option, other_fields: Vec<(&'a str, glib::SendValue)>, $($field_names: $field_types,)* #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: NavigationModifierType, $($event_field: $event_type,)? } impl<'a> $builder<'a> { fn new($($event_field: $event_type)?) -> Self { skip_assert_initialized!(); Self { seqnum: None, running_time_offset: None, other_fields: Vec::new(), $($field_names: <$field_types>::default(),)* #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: NavigationModifierType::empty(), $($event_field,)? } } $(pub fn $field_names(self, $field_names: $field_types) -> Self { Self { $field_names, ..self } })* #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] pub fn modifier_state(self, modifier_state: NavigationModifierType) -> Self { Self { modifier_state, ..self } } event_builder_generic_impl!($new_fn); } }; } pub enum KeyEventType<'a> { Press { key: &'a str }, Release { key: &'a str }, } nav_event_builder!( KeyEventBuilder, kind: KeyEventType<'a>, [], |s: &mut Self| { let event = match s.kind { KeyEventType::Press { key } => NavigationEvent::KeyPress { key: key.to_owned(), #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: s.modifier_state, }, KeyEventType::Release { key } => NavigationEvent::KeyRelease { key: key.to_owned(), #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: s.modifier_state, }, }; gst::ffi::gst_event_new_navigation(event.structure().into_glib_ptr()) } ); pub enum MouseEventType { Move, Press { button: i32, }, Release { button: i32, }, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] Scroll { delta_x: f64, delta_y: f64, }, #[cfg(feature = "v1_26")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_26")))] DoubleClick { button: i32, }, } nav_event_builder!( MouseEventBuilder, kind: MouseEventType, [x: f64, y: f64], |s: &mut Self| { let event = match s.kind { MouseEventType::Move => NavigationEvent::MouseMove { x: s.x, y: s.y, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: s.modifier_state, }, MouseEventType::Press { button } => NavigationEvent::MouseButtonPress { button, x: s.x, y: s.y, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: s.modifier_state, }, MouseEventType::Release { button } => NavigationEvent::MouseButtonRelease { button, x: s.x, y: s.y, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: s.modifier_state, }, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] MouseEventType::Scroll { delta_x, delta_y } => NavigationEvent::MouseScroll { x: s.x, y: s.y, delta_x, delta_y, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: s.modifier_state, }, #[cfg(feature = "v1_26")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_26")))] MouseEventType::DoubleClick { button } => NavigationEvent::MouseDoubleClick { button, x: s.x, y: s.y, modifier_state: s.modifier_state, }, }; gst::ffi::gst_event_new_navigation(event.structure().into_glib_ptr()) } ); #[must_use = "The builder must be built to be used"] pub struct CommandEventBuilder<'a> { seqnum: Option, running_time_offset: Option, other_fields: Vec<(&'a str, 
glib::SendValue)>, command: NavigationCommand, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: NavigationModifierType, } impl<'a> CommandEventBuilder<'a> { fn new(command: NavigationCommand) -> Self { skip_assert_initialized!(); Self { seqnum: None, running_time_offset: None, other_fields: Vec::new(), command, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: NavigationModifierType::empty(), } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] pub fn modifier_state(self, modifier_state: NavigationModifierType) -> Self { Self { modifier_state, ..self } } event_builder_generic_impl!(|s: &mut Self| { let event = NavigationEvent::Command { command: s.command, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: s.modifier_state, }; gst::ffi::gst_event_new_navigation(event.structure().into_glib_ptr()) }); } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] pub enum TouchEventType { Down { pressure: f64 }, Motion { pressure: f64 }, Up, } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] nav_event_builder!( TouchEventBuilder, kind: TouchEventType, [identifier: u32, x: f64, y: f64], |s: &mut Self| { let event = match s.kind { TouchEventType::Down { pressure } => NavigationEvent::TouchDown { identifier: s.identifier, x: s.x, y: s.y, modifier_state: s.modifier_state, pressure, }, TouchEventType::Motion { pressure } => NavigationEvent::TouchMotion { identifier: s.identifier, x: s.x, y: s.y, modifier_state: s.modifier_state, pressure, }, TouchEventType::Up => NavigationEvent::TouchUp { identifier: s.identifier, x: s.x, y: s.y, modifier_state: s.modifier_state, }, }; gst::ffi::gst_event_new_navigation(event.structure().into_glib_ptr()) } ); #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] pub enum TouchMetaEventType { Frame, Cancel, } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] nav_event_builder!( TouchMetaEventBuilder, kind: TouchMetaEventType, [], |s: &mut Self| { let event = match s.kind { TouchMetaEventType::Frame => NavigationEvent::TouchFrame { modifier_state: s.modifier_state, }, TouchMetaEventType::Cancel => NavigationEvent::TouchCancel { modifier_state: s.modifier_state, }, }; gst::ffi::gst_event_new_navigation(event.structure().into_glib_ptr()) } ); const NAVIGATION_EVENT_NAME: &str = "application/x-gst-navigation"; #[cfg_attr(feature = "serde", derive(serde::Serialize, serde::Deserialize))] #[cfg_attr(feature = "serde", serde(tag = "event"))] #[derive(Clone, PartialEq, Debug)] pub enum NavigationEvent { KeyPress { key: String, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: NavigationModifierType, }, KeyRelease { key: String, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: NavigationModifierType, }, MouseMove { x: f64, y: f64, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: NavigationModifierType, }, MouseButtonPress { button: i32, x: f64, y: f64, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: NavigationModifierType, }, MouseButtonRelease { button: i32, x: f64, y: f64, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: NavigationModifierType, }, Command { command: NavigationCommand, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, 
doc(cfg(feature = "v1_22")))] modifier_state: NavigationModifierType, }, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] MouseScroll { x: f64, y: f64, delta_x: f64, delta_y: f64, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: NavigationModifierType, }, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] TouchDown { identifier: u32, x: f64, y: f64, pressure: f64, modifier_state: NavigationModifierType, }, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] TouchMotion { identifier: u32, x: f64, y: f64, pressure: f64, modifier_state: NavigationModifierType, }, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] TouchUp { identifier: u32, x: f64, y: f64, modifier_state: NavigationModifierType, }, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] TouchFrame { modifier_state: NavigationModifierType, }, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] TouchCancel { modifier_state: NavigationModifierType, }, #[cfg(feature = "v1_26")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_26")))] MouseDoubleClick { button: i32, x: f64, y: f64, modifier_state: NavigationModifierType, }, } impl NavigationEvent { #[doc(alias = "gst_navigation_event_new_key_press")] pub fn new_key_press(key: &str) -> NavigationEvent { assert_initialized_main_thread!(); Self::KeyPress { key: key.to_string(), #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: NavigationModifierType::empty(), } } #[doc(alias = "gst_navigation_event_new_key_release")] pub fn new_key_release(key: &str) -> NavigationEvent { assert_initialized_main_thread!(); Self::KeyRelease { key: key.to_string(), #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: NavigationModifierType::empty(), } } #[doc(alias = "gst_navigation_event_new_mouse_move")] pub fn new_mouse_move(x: f64, y: f64) -> NavigationEvent { assert_initialized_main_thread!(); Self::MouseMove { x, y, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: NavigationModifierType::empty(), } } #[doc(alias = "gst_navigation_event_new_mouse_button_press")] pub fn new_mouse_button_press(button: i32, x: f64, y: f64) -> NavigationEvent { assert_initialized_main_thread!(); Self::MouseButtonPress { button, x, y, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: NavigationModifierType::empty(), } } #[doc(alias = "gst_navigation_event_new_mouse_button_release")] pub fn new_mouse_button_release(button: i32, x: f64, y: f64) -> NavigationEvent { assert_initialized_main_thread!(); Self::MouseButtonRelease { button, x, y, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: NavigationModifierType::empty(), } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "gst_navigation_event_new_mouse_scroll")] pub fn new_mouse_scroll(x: f64, y: f64, delta_x: f64, delta_y: f64) -> NavigationEvent { assert_initialized_main_thread!(); Self::MouseScroll { x, y, delta_x, delta_y, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: NavigationModifierType::empty(), } } #[doc(alias = "gst_navigation_event_new_command")] pub fn new_command(command: NavigationCommand) -> NavigationEvent { assert_initialized_main_thread!(); Self::Command { command, #[cfg(feature = "v1_22")] 
#[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state: NavigationModifierType::empty(), } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[doc(alias = "gst_navigation_event_new_touch_down")] pub fn new_touch_down(identifier: u32, x: f64, y: f64, pressure: f64) -> NavigationEvent { assert_initialized_main_thread!(); Self::TouchDown { identifier, x, y, pressure, modifier_state: NavigationModifierType::empty(), } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[doc(alias = "gst_navigation_event_new_touch_motion")] pub fn new_touch_motion(identifier: u32, x: f64, y: f64, pressure: f64) -> NavigationEvent { assert_initialized_main_thread!(); Self::TouchMotion { identifier, x, y, pressure, modifier_state: NavigationModifierType::empty(), } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[doc(alias = "gst_navigation_event_new_touch_up")] pub fn new_touch_up(identifier: u32, x: f64, y: f64) -> NavigationEvent { assert_initialized_main_thread!(); Self::TouchUp { identifier, x, y, modifier_state: NavigationModifierType::empty(), } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[doc(alias = "gst_navigation_event_new_touch_frame")] pub fn new_touch_frame() -> NavigationEvent { assert_initialized_main_thread!(); Self::TouchFrame { modifier_state: NavigationModifierType::empty(), } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[doc(alias = "gst_navigation_event_new_touch_cancel")] pub fn new_touch_cancel() -> NavigationEvent { assert_initialized_main_thread!(); Self::TouchCancel { modifier_state: NavigationModifierType::empty(), } } #[cfg(feature = "v1_26")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_26")))] #[doc(alias = "gst_navigation_event_new_mouse_double_click")] pub fn new_mouse_double_click(button: i32, x: f64, y: f64) -> NavigationEvent { assert_initialized_main_thread!(); Self::MouseDoubleClick { button, x, y, modifier_state: NavigationModifierType::empty(), } } pub fn key_press_builder(key: &str) -> KeyEventBuilder { assert_initialized_main_thread!(); KeyEventBuilder::new(KeyEventType::Press { key }) } pub fn key_release_builder(key: &str) -> KeyEventBuilder { assert_initialized_main_thread!(); KeyEventBuilder::new(KeyEventType::Release { key }) } pub fn mouse_move_builder(x: f64, y: f64) -> MouseEventBuilder<'static> { assert_initialized_main_thread!(); MouseEventBuilder::new(MouseEventType::Move {}).x(x).y(y) } pub fn mouse_button_press_builder(button: i32, x: f64, y: f64) -> MouseEventBuilder<'static> { assert_initialized_main_thread!(); MouseEventBuilder::new(MouseEventType::Press { button }) .x(x) .y(y) } pub fn mouse_button_release_builder(button: i32, x: f64, y: f64) -> MouseEventBuilder<'static> { assert_initialized_main_thread!(); MouseEventBuilder::new(MouseEventType::Press { button }) .x(x) .y(y) } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] pub fn mouse_scroll_builder( x: f64, y: f64, delta_x: f64, delta_y: f64, ) -> MouseEventBuilder<'static> { assert_initialized_main_thread!(); MouseEventBuilder::new(MouseEventType::Scroll { delta_x, delta_y }) .x(x) .y(y) } pub fn command_builder(command: NavigationCommand) -> CommandEventBuilder<'static> { assert_initialized_main_thread!(); CommandEventBuilder::new(command) } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] pub fn touch_down_builder( identifier: u32, x: f64, y: f64, pressure: f64, ) -> TouchEventBuilder<'static> 
{ assert_initialized_main_thread!(); TouchEventBuilder::new(TouchEventType::Down { pressure }) .identifier(identifier) .x(x) .y(y) } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] pub fn touch_motion_builder( identifier: u32, x: f64, y: f64, pressure: f64, ) -> TouchEventBuilder<'static> { assert_initialized_main_thread!(); TouchEventBuilder::new(TouchEventType::Motion { pressure }) .identifier(identifier) .x(x) .y(y) } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] pub fn touch_up_builder(identifier: u32, x: f64, y: f64) -> TouchEventBuilder<'static> { assert_initialized_main_thread!(); TouchEventBuilder::new(TouchEventType::Up) .identifier(identifier) .x(x) .y(y) } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] pub fn touch_frame_builder() -> TouchMetaEventBuilder<'static> { assert_initialized_main_thread!(); TouchMetaEventBuilder::new(TouchMetaEventType::Frame) } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] pub fn touch_cancel_builder() -> TouchMetaEventBuilder<'static> { assert_initialized_main_thread!(); TouchMetaEventBuilder::new(TouchMetaEventType::Cancel) } #[cfg(feature = "v1_26")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_26")))] pub fn mouse_double_click_builder(button: i32, x: f64, y: f64) -> MouseEventBuilder<'static> { assert_initialized_main_thread!(); MouseEventBuilder::new(MouseEventType::DoubleClick { button }) .x(x) .y(y) } #[doc(alias = "gst_navigation_event_get_type")] pub fn type_(event: &gst::EventRef) -> NavigationEventType { skip_assert_initialized!(); unsafe { from_glib(ffi::gst_navigation_event_get_type(event.as_mut_ptr())) } } #[doc(alias = "gst_navigation_event_parse_key_event")] #[doc(alias = "gst_navigation_event_parse_mouse_button_event")] #[doc(alias = "gst_navigation_event_parse_mouse_scroll_event")] #[doc(alias = "gst_navigation_event_parse_mouse_move_event")] #[doc(alias = "gst_navigation_event_parse_touch_event")] #[doc(alias = "gst_navigation_event_parse_touch_up_event")] #[doc(alias = "gst_navigation_event_parse_command")] pub fn parse(event: &gst::EventRef) -> Result { skip_assert_initialized!(); if event.type_() != EventType::Navigation { return Err(glib::bool_error!("Invalid navigation event")); } let structure = event .structure() .ok_or_else(|| glib::bool_error!("Invalid navigation event"))?; if structure.name() != NAVIGATION_EVENT_NAME { return Err(glib::bool_error!("Invalid navigation event")); } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] let modifier_state = structure .get("state") .unwrap_or(NavigationModifierType::empty()); let event = match Self::type_(event) { NavigationEventType::MouseMove => NavigationEvent::MouseMove { x: structure .get("pointer_x") .map_err(|_| glib::bool_error!("Invalid mouse event"))?, y: structure .get("pointer_y") .map_err(|_| glib::bool_error!("Invalid mouse event"))?, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state, }, NavigationEventType::MouseButtonPress => NavigationEvent::MouseButtonPress { button: structure .get("button") .map_err(|_| glib::bool_error!("Invalid mouse event"))?, x: structure .get("pointer_x") .map_err(|_| glib::bool_error!("Invalid mouse event"))?, y: structure .get("pointer_y") .map_err(|_| glib::bool_error!("Invalid mouse event"))?, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state, }, NavigationEventType::MouseButtonRelease => NavigationEvent::MouseButtonRelease { 
button: structure .get("button") .map_err(|_| glib::bool_error!("Invalid mouse event"))?, x: structure .get("pointer_x") .map_err(|_| glib::bool_error!("Invalid mouse event"))?, y: structure .get("pointer_y") .map_err(|_| glib::bool_error!("Invalid mouse event"))?, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state, }, #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] NavigationEventType::MouseScroll => NavigationEvent::MouseScroll { x: structure .get("pointer_x") .map_err(|_| glib::bool_error!("Invalid mouse event"))?, y: structure .get("pointer_y") .map_err(|_| glib::bool_error!("Invalid mouse event"))?, delta_x: structure .get("delta_pointer_x") .map_err(|_| glib::bool_error!("Invalid mouse event"))?, delta_y: structure .get("delta_pointer_y") .map_err(|_| glib::bool_error!("Invalid mouse event"))?, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state, }, NavigationEventType::KeyPress => NavigationEvent::KeyPress { key: structure .get("key") .map_err(|_| glib::bool_error!("Invalid key press event"))?, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state, }, NavigationEventType::KeyRelease => NavigationEvent::KeyRelease { key: structure .get("key") .map_err(|_| glib::bool_error!("Invalid key press event"))?, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state, }, NavigationEventType::Command => NavigationEvent::Command { command: structure .get("command-code") .map_err(|_| glib::bool_error!("Invalid key press event"))?, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] modifier_state, }, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] NavigationEventType::TouchDown => NavigationEvent::TouchDown { identifier: structure .get("identifier") .map_err(|_| glib::bool_error!("Invalid touch event"))?, x: structure .get("pointer_x") .map_err(|_| glib::bool_error!("Invalid touch event"))?, y: structure .get("pointer_y") .map_err(|_| glib::bool_error!("Invalid touch event"))?, pressure: structure .get("pressure") .map_err(|_| glib::bool_error!("Invalid touch event"))?, modifier_state, }, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] NavigationEventType::TouchMotion => NavigationEvent::TouchMotion { identifier: structure .get("identifier") .map_err(|_| glib::bool_error!("Invalid touch event"))?, x: structure .get("pointer_x") .map_err(|_| glib::bool_error!("Invalid touch event"))?, y: structure .get("pointer_y") .map_err(|_| glib::bool_error!("Invalid touch event"))?, pressure: structure .get("pressure") .map_err(|_| glib::bool_error!("Invalid touch event"))?, modifier_state, }, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] NavigationEventType::TouchUp => NavigationEvent::TouchUp { identifier: structure .get("identifier") .map_err(|_| glib::bool_error!("Invalid touch event"))?, x: structure .get("pointer_x") .map_err(|_| glib::bool_error!("Invalid touch event"))?, y: structure .get("pointer_y") .map_err(|_| glib::bool_error!("Invalid touch event"))?, modifier_state, }, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] NavigationEventType::TouchFrame => NavigationEvent::TouchFrame { modifier_state }, #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] NavigationEventType::TouchCancel => NavigationEvent::TouchCancel { modifier_state }, #[cfg(feature = "v1_26")] 
#[cfg_attr(docsrs, doc(cfg(feature = "v1_26")))] NavigationEventType::MouseDoubleClick => NavigationEvent::MouseDoubleClick { button: structure .get("button") .map_err(|_| glib::bool_error!("Invalid mouse event"))?, x: structure .get("pointer_x") .map_err(|_| glib::bool_error!("Invalid mouse event"))?, y: structure .get("pointer_y") .map_err(|_| glib::bool_error!("Invalid mouse event"))?, modifier_state, }, NavigationEventType::Invalid | NavigationEventType::__Unknown(_) => { return Err(glib::bool_error!("Invalid navigation event")) } }; Ok(event) } pub fn structure(&self) -> gst::Structure { skip_assert_initialized!(); #[allow(unused_mut)] let mut structure = match self { Self::MouseMove { x, y, .. } => gst::Structure::builder(NAVIGATION_EVENT_NAME) .field("event", "mouse-move") .field("pointer_x", x) .field("pointer_y", y), Self::MouseButtonPress { button, x, y, .. } => { gst::Structure::builder(NAVIGATION_EVENT_NAME) .field("event", "mouse-button-press") .field("button", button) .field("pointer_x", x) .field("pointer_y", y) } Self::MouseButtonRelease { button, x, y, .. } => { gst::Structure::builder(NAVIGATION_EVENT_NAME) .field("event", "mouse-button-release") .field("button", button) .field("pointer_x", x) .field("pointer_y", y) } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] Self::MouseScroll { x, y, delta_x, delta_y, .. } => gst::Structure::builder(NAVIGATION_EVENT_NAME) .field("event", "mouse-scroll") .field("pointer_x", x) .field("pointer_y", y) .field("delta_pointer_x", delta_x) .field("delta_pointer_y", delta_y), Self::KeyPress { key, .. } => gst::Structure::builder(NAVIGATION_EVENT_NAME) .field("event", "key-press") .field("key", key), Self::KeyRelease { key, .. } => gst::Structure::builder(NAVIGATION_EVENT_NAME) .field("event", "key-release") .field("key", key), Self::Command { command, .. } => gst::Structure::builder(NAVIGATION_EVENT_NAME) .field("event", "command") .field("command-code", command), #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] Self::TouchDown { identifier, x, y, pressure, .. } => gst::Structure::builder(NAVIGATION_EVENT_NAME) .field("event", "touch-down") .field("identifier", identifier) .field("pointer_x", x) .field("pointer_y", y) .field("pressure", pressure), #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] Self::TouchMotion { identifier, x, y, pressure, .. } => gst::Structure::builder(NAVIGATION_EVENT_NAME) .field("event", "touch-motion") .field("identifier", identifier) .field("pointer_x", x) .field("pointer_y", y) .field("pressure", pressure), #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] Self::TouchUp { identifier, x, y, .. } => gst::Structure::builder(NAVIGATION_EVENT_NAME) .field("event", "touch-up") .field("identifier", identifier) .field("pointer_x", x) .field("pointer_y", y), #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] Self::TouchFrame { .. } => { gst::Structure::builder(NAVIGATION_EVENT_NAME).field("event", "touch-frame") } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] Self::TouchCancel { .. } => { gst::Structure::builder(NAVIGATION_EVENT_NAME).field("event", "touch-cancel") } #[cfg(feature = "v1_26")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_26")))] Self::MouseDoubleClick { button, x, y, .. 
} => { gst::Structure::builder(NAVIGATION_EVENT_NAME) .field("event", "mouse-double-click") .field("button", button) .field("pointer_x", x) .field("pointer_y", y) } }; #[cfg(feature = "v1_22")] { structure = match self { Self::MouseMove { modifier_state, .. } => structure.field("state", modifier_state), Self::MouseButtonPress { modifier_state, .. } => { structure.field("state", modifier_state) } Self::MouseButtonRelease { modifier_state, .. } => { structure.field("state", modifier_state) } Self::MouseScroll { modifier_state, .. } => { structure.field("state", modifier_state) } Self::KeyPress { modifier_state, .. } => structure.field("state", modifier_state), Self::KeyRelease { modifier_state, .. } => structure.field("state", modifier_state), Self::Command { modifier_state, .. } => structure.field("state", modifier_state), Self::TouchDown { modifier_state, .. } => structure.field("state", modifier_state), Self::TouchMotion { modifier_state, .. } => { structure.field("state", modifier_state) } Self::TouchUp { modifier_state, .. } => structure.field("state", modifier_state), Self::TouchFrame { modifier_state, .. } => structure.field("state", modifier_state), Self::TouchCancel { modifier_state, .. } => { structure.field("state", modifier_state) } #[cfg(feature = "v1_26")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_26")))] Self::MouseDoubleClick { modifier_state, .. } => { structure.field("state", modifier_state) } }; } structure.build() } pub fn build(&self) -> gst::Event { skip_assert_initialized!(); gst::event::Navigation::new(self.structure()) } } #[cfg(test)] mod tests { #[test] #[cfg(feature = "serde")] #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] fn serialize_navigation_events() { use crate::{NavigationEvent, NavigationModifierType}; gst::init().unwrap(); let mods = NavigationModifierType::SHIFT_MASK | NavigationModifierType::CONTROL_MASK; let ev = NavigationEvent::mouse_scroll_builder(1.0, 2.0, 3.0, 4.0) .modifier_state(mods) .build(); let navigation_event = NavigationEvent::parse(&ev).unwrap(); match &navigation_event { NavigationEvent::MouseScroll { x, y, delta_x, delta_y, modifier_state, } => { assert!( *x == 1.0 && *y == 2.0 && *delta_x == 3.0 && *delta_y == 4.0 && *modifier_state == mods ); } _ => unreachable!(), } let json_event = serde_json::to_string(&navigation_event).unwrap(); assert_eq!( json_event, r#"{"event":"MouseScroll","x":1.0,"y":2.0,"delta_x":3.0,"delta_y":4.0,"modifier_state":"shift-mask+control-mask"}"# ); let navigation_event: NavigationEvent = serde_json::from_str(&json_event).unwrap(); match &navigation_event { NavigationEvent::MouseScroll { x, y, delta_x, delta_y, modifier_state, } => { assert!( *x == 1.0 && *y == 2.0 && *delta_x == 3.0 && *delta_y == 4.0 && *modifier_state == mods ); } _ => unreachable!(), } let ev = NavigationEvent::new_mouse_button_press(1, 1.0, 2.0).build(); let navigation_event = NavigationEvent::parse(&ev).unwrap(); match &navigation_event { NavigationEvent::MouseButtonPress { button, x, y, modifier_state, } => { assert!( *button == 1 && *x == 1.0 && *y == 2.0 && *modifier_state == NavigationModifierType::empty() ); } _ => unreachable!(), } let json_event = serde_json::to_string(&navigation_event).unwrap(); assert_eq!( json_event, r#"{"event":"MouseButtonPress","button":1,"x":1.0,"y":2.0,"modifier_state":""}"# ); let navigation_event: NavigationEvent = serde_json::from_str(&json_event).unwrap(); match &navigation_event { NavigationEvent::MouseButtonPress { button, x, y, modifier_state, } => { assert!( *button == 
1 && *x == 1.0 && *y == 2.0 && *modifier_state == NavigationModifierType::empty() ); } _ => unreachable!(), } let mods = NavigationModifierType::META_MASK; let ev = NavigationEvent::key_release_builder("a") .modifier_state(mods) .build(); let navigation_event = NavigationEvent::parse(&ev).unwrap(); match &navigation_event { NavigationEvent::KeyRelease { key, modifier_state, } => { assert!(*key == "a" && *modifier_state == mods); } _ => unreachable!(), } let json_event = serde_json::to_string(&navigation_event).unwrap(); assert_eq!( json_event, r#"{"event":"KeyRelease","key":"a","modifier_state":"meta-mask"}"# ); let navigation_event: NavigationEvent = serde_json::from_str(&json_event).unwrap(); match &navigation_event { NavigationEvent::KeyRelease { key, modifier_state, } => { assert!(*key == "a" && *modifier_state == mods); } _ => unreachable!(), } let ev = NavigationEvent::new_touch_motion(0, 1.0, 2.0, 0.5).build(); let navigation_event = NavigationEvent::parse(&ev).unwrap(); match &navigation_event { NavigationEvent::TouchMotion { identifier, x, y, pressure, modifier_state, } => { assert!( *identifier == 0 && *x == 1.0 && *y == 2.0 && *pressure == 0.5 && *modifier_state == NavigationModifierType::empty() ); } _ => unreachable!(), } let json_event = serde_json::to_string(&navigation_event).unwrap(); assert_eq!( json_event, r#"{"event":"TouchMotion","identifier":0,"x":1.0,"y":2.0,"pressure":0.5,"modifier_state":""}"# ); let navigation_event: NavigationEvent = serde_json::from_str(&json_event).unwrap(); match &navigation_event { NavigationEvent::TouchMotion { identifier, x, y, pressure, modifier_state, } => { assert!( *identifier == 0 && *x == 1.0 && *y == 2.0 && *pressure == 0.5 && *modifier_state == NavigationModifierType::empty() ); } _ => unreachable!(), } let ev = NavigationEvent::touch_cancel_builder().build(); let navigation_event = NavigationEvent::parse(&ev).unwrap(); match &navigation_event { NavigationEvent::TouchCancel { modifier_state } => { assert!(*modifier_state == NavigationModifierType::empty()); } _ => unreachable!(), } let json_event = serde_json::to_string(&navigation_event).unwrap(); assert_eq!(json_event, r#"{"event":"TouchCancel","modifier_state":""}"#); let navigation_event: NavigationEvent = serde_json::from_str(&json_event).unwrap(); match &navigation_event { NavigationEvent::TouchCancel { modifier_state } => { assert!(*modifier_state == NavigationModifierType::empty()); } _ => unreachable!(), } } } gstreamer-video-0.23.5/src/video_filter.rs000064400000000000000000000027431046102023000166260ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. 
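// A minimal usage sketch for the extension trait defined below (illustrative
// only, not part of the crate). It assumes the crate is consumed as `gst_video`,
// that `gst_video::prelude::*` brings `VideoFilterExtManual` into scope, and
// that `filter` is some element implementing `VideoFilter`:
//
//     use gst_video::prelude::*;
//
//     fn print_negotiated_resolution(filter: &gst_video::VideoFilter) {
//         // Both getters return `None` until caps have been negotiated on the
//         // corresponding pad, so a running pipeline is assumed here.
//         if let (Some(in_info), Some(out_info)) =
//             (filter.input_video_info(), filter.output_video_info())
//         {
//             println!("input:  {}x{}", in_info.width(), in_info.height());
//             println!("output: {}x{}", out_info.width(), out_info.height());
//         }
//     }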
use glib::{prelude::*, translate::*}; use gst::prelude::*; use gst_base::prelude::*; use crate::{ffi, VideoFilter}; mod sealed { pub trait Sealed {} impl> Sealed for T {} } pub trait VideoFilterExtManual: sealed::Sealed + IsA + 'static { fn input_video_info(&self) -> Option { unsafe { let ptr = self.as_ptr() as *mut ffi::GstVideoFilter; let sinkpad = self.as_ref().sink_pad(); let _guard = sinkpad.stream_lock(); let info = &(*ptr).in_info; if info.finfo.is_null() || info.width <= 0 || info.height <= 0 { return None; } Some(from_glib_none(mut_override( info as *const ffi::GstVideoInfo, ))) } } fn output_video_info(&self) -> Option { unsafe { let ptr = self.as_ptr() as *mut ffi::GstVideoFilter; let sinkpad = self.as_ref().sink_pad(); let _guard = sinkpad.stream_lock(); let info = &(*ptr).out_info; if info.finfo.is_null() || info.width <= 0 || info.height <= 0 { return None; } Some(from_glib_none(mut_override( info as *const ffi::GstVideoInfo, ))) } } } impl> VideoFilterExtManual for O {} gstreamer-video-0.23.5/src/video_format.rs000064400000000000000000000421411046102023000166250ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. use std::str; use crate::ffi; use glib::translate::{from_glib, FromGlib, IntoGlib}; use once_cell::sync::Lazy; #[cfg(feature = "v1_18")] pub static VIDEO_FORMATS_ALL: Lazy> = Lazy::new(|| unsafe { let mut len: u32 = 0; let mut res = Vec::with_capacity(len as usize); let formats = ffi::gst_video_formats_raw(&mut len); for i in 0..len { let format = formats.offset(i as isize); res.push(from_glib(*format)); } res.into_boxed_slice() }); #[cfg(not(feature = "v1_18"))] pub static VIDEO_FORMATS_ALL: Lazy> = Lazy::new(|| { #[cfg(target_endian = "little")] { Box::new([ crate::VideoFormat::Ayuv64, crate::VideoFormat::Argb64, crate::VideoFormat::Gbra12le, crate::VideoFormat::Gbra12be, crate::VideoFormat::A44410le, crate::VideoFormat::Gbra10le, crate::VideoFormat::A44410be, crate::VideoFormat::Gbra10be, crate::VideoFormat::A42210le, crate::VideoFormat::A42210be, crate::VideoFormat::A42010le, crate::VideoFormat::A42010be, #[cfg(feature = "v1_16")] crate::VideoFormat::Bgr10a2Le, #[cfg(feature = "v1_16")] crate::VideoFormat::Y410, crate::VideoFormat::Gbra, crate::VideoFormat::Ayuv, #[cfg(feature = "v1_16")] crate::VideoFormat::Vuya, crate::VideoFormat::Rgba, crate::VideoFormat::Argb, crate::VideoFormat::Bgra, crate::VideoFormat::Abgr, crate::VideoFormat::A420, crate::VideoFormat::V216, crate::VideoFormat::Y44412le, crate::VideoFormat::Gbr12le, crate::VideoFormat::Y44412be, crate::VideoFormat::Gbr12be, crate::VideoFormat::I42212le, crate::VideoFormat::I42212be, crate::VideoFormat::I42012le, crate::VideoFormat::I42012be, crate::VideoFormat::Y44410le, crate::VideoFormat::Gbr10le, crate::VideoFormat::Y44410be, crate::VideoFormat::Gbr10be, crate::VideoFormat::R210, crate::VideoFormat::I42210le, crate::VideoFormat::I42210be, crate::VideoFormat::Nv1610le32, #[cfg(feature = "v1_16")] crate::VideoFormat::Y210, crate::VideoFormat::Uyvp, crate::VideoFormat::V210, crate::VideoFormat::I42010le, crate::VideoFormat::I42010be, crate::VideoFormat::P01010le, #[cfg(feature = "v1_16")] crate::VideoFormat::Nv1210le40, crate::VideoFormat::Nv1210le32, crate::VideoFormat::P01010be, crate::VideoFormat::Y444, crate::VideoFormat::Gbr, crate::VideoFormat::Nv24, crate::VideoFormat::V308, crate::VideoFormat::Iyu2, crate::VideoFormat::Rgbx, crate::VideoFormat::Xrgb, crate::VideoFormat::Bgrx, crate::VideoFormat::Xbgr, crate::VideoFormat::Rgb, 
crate::VideoFormat::Bgr, crate::VideoFormat::Y42b, crate::VideoFormat::Nv16, crate::VideoFormat::Nv61, crate::VideoFormat::Yuy2, crate::VideoFormat::Yvyu, crate::VideoFormat::Uyvy, crate::VideoFormat::Vyuy, crate::VideoFormat::I420, crate::VideoFormat::Yv12, crate::VideoFormat::Nv12, crate::VideoFormat::Nv21, crate::VideoFormat::Nv1264z32, crate::VideoFormat::Y41b, crate::VideoFormat::Iyu1, crate::VideoFormat::Yuv9, crate::VideoFormat::Yvu9, crate::VideoFormat::Bgr16, crate::VideoFormat::Rgb16, crate::VideoFormat::Bgr15, crate::VideoFormat::Rgb15, crate::VideoFormat::Rgb8p, crate::VideoFormat::Gray16Le, crate::VideoFormat::Gray16Be, crate::VideoFormat::Gray10Le32, crate::VideoFormat::Gray8, ]) } #[cfg(target_endian = "big")] { Box::new([ crate::VideoFormat::Ayuv64, crate::VideoFormat::Argb64, crate::VideoFormat::Gbra12be, crate::VideoFormat::Gbra12le, crate::VideoFormat::A44410be, crate::VideoFormat::Gbra10be, crate::VideoFormat::A44410le, crate::VideoFormat::Gbra10le, crate::VideoFormat::A42210be, crate::VideoFormat::A42210le, crate::VideoFormat::A42010be, crate::VideoFormat::A42010le, #[cfg(feature = "v1_16")] crate::VideoFormat::Bgr10a2Le, #[cfg(feature = "v1_16")] crate::VideoFormat::Y410, crate::VideoFormat::Gbra, crate::VideoFormat::Ayuv, #[cfg(feature = "v1_16")] crate::VideoFormat::Vuya, crate::VideoFormat::Rgba, crate::VideoFormat::Argb, crate::VideoFormat::Bgra, crate::VideoFormat::Abgr, crate::VideoFormat::A420, crate::VideoFormat::V216, crate::VideoFormat::Y44412be, crate::VideoFormat::Gbr12be, crate::VideoFormat::Y44412le, crate::VideoFormat::Gbr12le, crate::VideoFormat::I42212be, crate::VideoFormat::I42212le, crate::VideoFormat::I42012be, crate::VideoFormat::I42012le, crate::VideoFormat::Y44410be, crate::VideoFormat::Gbr10be, crate::VideoFormat::Y44410le, crate::VideoFormat::Gbr10le, crate::VideoFormat::R210, crate::VideoFormat::I42210be, crate::VideoFormat::I42210le, crate::VideoFormat::Nv1610le32, #[cfg(feature = "v1_16")] crate::VideoFormat::Y210, crate::VideoFormat::Uyvp, crate::VideoFormat::V210, crate::VideoFormat::I42010be, crate::VideoFormat::I42010le, crate::VideoFormat::P01010be, crate::VideoFormat::P01010le, #[cfg(feature = "v1_16")] crate::VideoFormat::Nv1210le40, crate::VideoFormat::Nv1210le32, crate::VideoFormat::Y444, crate::VideoFormat::Gbr, crate::VideoFormat::Nv24, crate::VideoFormat::V308, crate::VideoFormat::Iyu2, crate::VideoFormat::Rgbx, crate::VideoFormat::Xrgb, crate::VideoFormat::Bgrx, crate::VideoFormat::Xbgr, crate::VideoFormat::Rgb, crate::VideoFormat::Bgr, crate::VideoFormat::Y42b, crate::VideoFormat::Nv16, crate::VideoFormat::Nv61, crate::VideoFormat::Yuy2, crate::VideoFormat::Yvyu, crate::VideoFormat::Uyvy, crate::VideoFormat::Vyuy, crate::VideoFormat::I420, crate::VideoFormat::Yv12, crate::VideoFormat::Nv12, crate::VideoFormat::Nv21, crate::VideoFormat::Nv1264z32, crate::VideoFormat::Y41b, crate::VideoFormat::Iyu1, crate::VideoFormat::Yuv9, crate::VideoFormat::Yvu9, crate::VideoFormat::Bgr16, crate::VideoFormat::Rgb16, crate::VideoFormat::Bgr15, crate::VideoFormat::Rgb15, crate::VideoFormat::Rgb8p, crate::VideoFormat::Gray16Be, crate::VideoFormat::Gray16Le, crate::VideoFormat::Gray10Le32, crate::VideoFormat::Gray8, ]) } }); #[cfg(feature = "v1_24")] pub static VIDEO_FORMATS_ANY: Lazy> = Lazy::new(|| unsafe { let mut len: u32 = 0; let mut res = Vec::with_capacity(len as usize); let formats = ffi::gst_video_formats_any(&mut len); for i in 0..len { let format = formats.offset(i as isize); res.push(from_glib(*format)); } res.into_boxed_slice() }); 
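// A minimal sketch of how the format tables above are typically consumed
// (illustrative only; `VideoFormat::iter_raw()` and the `VideoFormatIteratorExt`
// trait used here are defined further down in this file):
//
//     gst::init().unwrap();
//
//     // Build caps advertising every raw grayscale format, best quality first.
//     let caps = crate::VideoFormat::iter_raw()
//         .filter(|f| crate::VideoFormatInfo::from_format(*f).is_gray())
//         .into_video_caps()
//         .expect("at least one grayscale format")
//         .build();
//     assert!(caps.to_string().starts_with("video/x-raw"));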
#[derive(PartialEq, Eq, Copy, Clone, Debug, Hash)] pub enum VideoEndianness { Unknown, LittleEndian = 1234, BigEndian = 4321, } impl FromGlib for VideoEndianness { #[inline] unsafe fn from_glib(value: i32) -> Self { skip_assert_initialized!(); match value { 1234 => Self::LittleEndian, 4321 => Self::BigEndian, _ => Self::Unknown, } } } impl IntoGlib for VideoEndianness { type GlibType = i32; #[inline] fn into_glib(self) -> i32 { match self { Self::LittleEndian => 1234, Self::BigEndian => 4321, _ => 0, } } } impl crate::VideoFormat { #[doc(alias = "gst_video_format_from_masks")] pub fn from_masks( depth: u32, bpp: u32, endianness: crate::VideoEndianness, red_mask: u32, green_mask: u32, blue_mask: u32, alpha_mask: u32, ) -> Self { assert_initialized_main_thread!(); unsafe { from_glib(ffi::gst_video_format_from_masks( depth as i32, bpp as i32, endianness.into_glib(), red_mask, green_mask, blue_mask, alpha_mask, )) } } #[doc(alias = "gst_video_format_to_string")] pub fn to_str<'a>(self) -> &'a glib::GStr { if self == Self::Unknown { return glib::gstr!("UNKNOWN"); } unsafe { glib::GStr::from_ptr( ffi::gst_video_format_to_string(self.into_glib()) .as_ref() .expect("gst_video_format_to_string returned NULL"), ) } } pub fn iter_raw() -> VideoFormatIterator { VideoFormatIterator::default() } #[cfg(feature = "v1_24")] pub fn iter_any() -> impl Iterator { VIDEO_FORMATS_ANY.iter().copied() } } impl str::FromStr for crate::VideoFormat { type Err = glib::BoolError; fn from_str(s: &str) -> Result { skip_assert_initialized!(); let fmt = Self::from_string(s); if fmt == Self::Unknown { Err(glib::bool_error!( "Failed to parse video format from string" )) } else { Ok(fmt) } } } impl PartialOrd for crate::VideoFormat { #[inline] fn partial_cmp(&self, other: &Self) -> Option { Some(self.cmp(other)) } } impl Ord for crate::VideoFormat { #[inline] fn cmp(&self, other: &Self) -> std::cmp::Ordering { crate::VideoFormatInfo::from_format(*self).cmp(&crate::VideoFormatInfo::from_format(*other)) } } pub struct VideoFormatIterator { idx: usize, len: usize, } impl Default for VideoFormatIterator { fn default() -> Self { Self { idx: 0, len: VIDEO_FORMATS_ALL.len(), } } } impl Iterator for VideoFormatIterator { type Item = crate::VideoFormat; fn next(&mut self) -> Option { if self.idx >= self.len { None } else { let fmt = VIDEO_FORMATS_ALL[self.idx]; self.idx += 1; Some(fmt) } } fn size_hint(&self) -> (usize, Option) { if self.idx == self.len { return (0, Some(0)); } let remaining = self.len - self.idx; (remaining, Some(remaining)) } fn count(self) -> usize { self.len - self.idx } fn nth(&mut self, n: usize) -> Option { let (end, overflow) = self.idx.overflowing_add(n); if end >= self.len || overflow { self.idx = self.len; None } else { self.idx = end + 1; Some(VIDEO_FORMATS_ALL[end]) } } fn last(self) -> Option { if self.idx == self.len { None } else { Some(VIDEO_FORMATS_ALL[self.len - 1]) } } } impl ExactSizeIterator for VideoFormatIterator {} impl std::iter::FusedIterator for VideoFormatIterator {} impl DoubleEndedIterator for VideoFormatIterator { fn next_back(&mut self) -> Option { if self.idx >= self.len { None } else { let fmt = VIDEO_FORMATS_ALL[self.len - 1]; self.len -= 1; Some(fmt) } } fn nth_back(&mut self, n: usize) -> Option { let (end, overflow) = self.len.overflowing_sub(n); if end <= self.idx || overflow { self.idx = self.len; None } else { self.len = end - 1; let fmt = VIDEO_FORMATS_ALL[self.len]; Some(fmt) } } } pub trait VideoFormatIteratorExt { fn into_video_caps(self) -> Option>; } impl 
VideoFormatIteratorExt for T where T: Iterator, { fn into_video_caps(self) -> Option> { let formats: Vec = self.collect(); if !formats.is_empty() { Some(crate::functions::video_make_raw_caps(&formats)) } else { None } } } pub trait VideoFormatIteratorExtRef { fn into_video_caps(self) -> Option>; } impl<'a, T> VideoFormatIteratorExtRef for T where T: Iterator, { fn into_video_caps(self) -> Option> { let formats: Vec = self.copied().collect(); if !formats.is_empty() { Some(crate::functions::video_make_raw_caps(&formats)) } else { None } } } #[cfg(test)] mod tests { #[test] fn enum_to_string() { gst::init().unwrap(); assert_eq!(&format!("{}", crate::VideoFormat::Argb), "ARGB"); assert_eq!(&format!("{:?}", crate::VideoFormat::Argb), "Argb"); assert_eq!(crate::VideoFormat::Argb.to_str(), "ARGB"); assert_eq!(&format!("{}", crate::VideoFormat::Unknown), "UNKNOWN"); assert_eq!(&format!("{:?}", crate::VideoFormat::Unknown), "Unknown"); assert_eq!(crate::VideoFormat::Unknown.to_str(), "UNKNOWN"); assert_eq!( &format!("{:?}", crate::VideoFormat::__Unknown(-1)), "__Unknown(-1)" ); } #[test] fn test_display() { gst::init().unwrap(); assert_eq!(format!("{}", crate::VideoFormat::Nv16), "NV16"); assert_eq!(format!("{:?}", crate::VideoFormat::Nv16), "Nv16"); } #[test] fn iter() { use super::*; gst::init().unwrap(); assert!(crate::VideoFormat::iter_raw().count() > 0); assert_eq!( crate::VideoFormat::iter_raw().count(), crate::VideoFormat::iter_raw().len() ); let mut i = crate::VideoFormat::iter_raw(); let mut count = 0; loop { if i.next().is_none() { break; } count += 1; if i.next_back().is_none() { break; } count += 1; } assert_eq!(count, crate::VideoFormat::iter_raw().len()); assert!(crate::VideoFormat::iter_raw().any(|f| f == crate::VideoFormat::Nv12)); assert!(!crate::VideoFormat::iter_raw().any(|f| f == crate::VideoFormat::Encoded)); let caps = crate::VideoFormat::iter_raw().into_video_caps(); assert!(caps.is_some()); let caps = crate::VideoFormat::iter_raw() .filter(|f| crate::VideoFormatInfo::from_format(*f).is_gray()) .into_video_caps(); assert!(caps.is_some()); let caps = crate::VideoFormat::iter_raw().skip(1000).into_video_caps(); assert!(caps.is_none()); let caps = [crate::VideoFormat::Nv12, crate::VideoFormat::Nv16] .iter() .into_video_caps() .unwrap() .build(); assert_eq!(caps.to_string(), "video/x-raw, format=(string){ NV12, NV16 }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]"); } #[test] fn sort() { use itertools::Itertools; gst::init().unwrap(); assert!( crate::VideoFormatInfo::from_format(crate::VideoFormat::Nv16) > crate::VideoFormatInfo::from_format(crate::VideoFormat::Nv12) ); assert!(crate::VideoFormat::I420 > crate::VideoFormat::Yv12); assert!(crate::VideoFormat::Nv12 > crate::VideoFormat::Nv21); assert!(crate::VideoFormat::Xrgb > crate::VideoFormat::Rgb); let sorted: Vec = crate::VideoFormat::iter_raw().sorted().rev().collect(); // FIXME: use is_sorted_by() once API is in stable assert_eq!( sorted, crate::VideoFormat::iter_raw().collect::>() ); } } gstreamer-video-0.23.5/src/video_format_info.rs000064400000000000000000000477201046102023000176500ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. 
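// A minimal usage sketch for the `VideoFormatInfo` API defined below
// (illustrative only; the expected values follow from the I420 definition:
// three planes with 2x2 chroma subsampling):
//
//     gst::init().unwrap();
//
//     let info = crate::VideoFormatInfo::from_format(crate::VideoFormat::I420);
//     assert_eq!(info.n_planes(), 3);
//     // The luma plane keeps the full resolution...
//     assert_eq!(info.scale_width(0, 320), 320);
//     // ...while the chroma planes are halved in both directions.
//     assert_eq!(info.scale_width(1, 320), 160);
//     assert_eq!(info.scale_height(1, 240), 120);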
use std::{cmp::Ordering, fmt, marker::PhantomData, str}; use crate::ffi; use glib::translate::{from_glib, IntoGlib, ToGlibPtr}; #[doc(alias = "GstVideoFormatInfo")] #[derive(Copy, Clone)] pub struct VideoFormatInfo(&'static ffi::GstVideoFormatInfo); impl VideoFormatInfo { #[inline] pub unsafe fn from_ptr(format_info: *const ffi::GstVideoFormatInfo) -> Self { debug_assert!(!format_info.is_null()); Self(&*format_info) } #[inline] pub fn from_format(format: crate::VideoFormat) -> Self { assert_initialized_main_thread!(); unsafe { let info = ffi::gst_video_format_get_info(format.into_glib()); debug_assert!(!info.is_null()); Self(&*info) } } #[inline] pub fn format(&self) -> crate::VideoFormat { unsafe { from_glib(self.0.format) } } #[inline] pub fn name<'a>(&self) -> &'a glib::GStr { unsafe { glib::GStr::from_ptr(self.0.name) } } #[inline] pub fn description<'a>(&self) -> &'a glib::GStr { unsafe { glib::GStr::from_ptr(self.0.description) } } #[inline] pub fn flags(&self) -> crate::VideoFormatFlags { unsafe { from_glib(self.0.flags) } } #[inline] pub fn bits(&self) -> u32 { self.0.bits } #[inline] pub fn n_components(&self) -> u32 { self.0.n_components } #[inline] pub fn shift(&self) -> &[u32] { &self.0.shift[0..(self.0.n_components as usize)] } #[inline] pub fn depth(&self) -> &[u32] { &self.0.depth[0..(self.0.n_components as usize)] } #[inline] pub fn pixel_stride(&self) -> &[i32] { &self.0.pixel_stride[0..(self.0.n_components as usize)] } #[inline] pub fn n_planes(&self) -> u32 { self.0.n_planes } #[inline] pub fn plane(&self) -> &[u32] { &self.0.plane[0..(self.0.n_components as usize)] } #[inline] pub fn poffset(&self) -> &[u32] { &self.0.poffset[0..(self.0.n_components as usize)] } #[inline] pub fn w_sub(&self) -> &[u32] { &self.0.w_sub[0..(self.0.n_components as usize)] } #[inline] pub fn h_sub(&self) -> &[u32] { &self.0.h_sub[0..(self.0.n_components as usize)] } #[inline] pub fn tile_mode(&self) -> crate::VideoTileMode { unsafe { from_glib(self.0.tile_mode) } } #[cfg_attr(feature = "v1_22", deprecated = "Since 1.22")] #[inline] pub fn tile_ws(&self) -> u32 { self.0.tile_ws } #[cfg_attr(feature = "v1_22", deprecated = "Since 1.22")] #[inline] pub fn tile_hs(&self) -> u32 { self.0.tile_hs } #[inline] pub fn unpack_format(&self) -> crate::VideoFormat { unsafe { from_glib(self.0.unpack_format) } } #[inline] pub fn pack_lines(&self) -> i32 { self.0.pack_lines } #[inline] pub fn has_alpha(&self) -> bool { self.0.flags & ffi::GST_VIDEO_FORMAT_FLAG_ALPHA != 0 } #[inline] pub fn has_palette(&self) -> bool { self.0.flags & ffi::GST_VIDEO_FORMAT_FLAG_PALETTE != 0 } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[inline] pub fn has_subtiles(&self) -> bool { self.0.flags & ffi::GST_VIDEO_FORMAT_FLAG_SUBTILES != 0 } #[inline] pub fn is_complex(&self) -> bool { self.0.flags & ffi::GST_VIDEO_FORMAT_FLAG_COMPLEX != 0 } #[inline] pub fn is_gray(&self) -> bool { self.0.flags & ffi::GST_VIDEO_FORMAT_FLAG_GRAY != 0 } #[inline] pub fn is_le(&self) -> bool { self.0.flags & ffi::GST_VIDEO_FORMAT_FLAG_LE != 0 } #[inline] pub fn is_rgb(&self) -> bool { self.0.flags & ffi::GST_VIDEO_FORMAT_FLAG_RGB != 0 } #[inline] pub fn is_tiled(&self) -> bool { self.0.flags & ffi::GST_VIDEO_FORMAT_FLAG_TILED != 0 } #[inline] pub fn is_yuv(&self) -> bool { self.0.flags & ffi::GST_VIDEO_FORMAT_FLAG_YUV != 0 } #[inline] pub fn scale_width(&self, component: u8, width: u32) -> u32 { (-((-(i64::from(width))) >> self.w_sub()[component as usize])) as u32 } #[inline] pub fn scale_height(&self, component: u8, 
height: u32) -> u32 { (-((-(i64::from(height))) >> self.h_sub()[component as usize])) as u32 } #[allow(clippy::too_many_arguments)] pub fn unpack( &self, flags: crate::VideoPackFlags, dest: &mut [u8], src: &[&[u8]], stride: &[i32], x: i32, y: i32, width: i32, ) { let unpack_format = Self::from_format(self.unpack_format()); if unpack_format.pixel_stride()[0] == 0 || self.0.unpack_func.is_none() { panic!("No unpack format for {self:?}"); } if src.len() != self.n_planes() as usize { panic!( "Wrong number of planes provided for format: {} != {}", src.len(), self.n_planes() ); } if stride.len() != self.n_planes() as usize { panic!( "Wrong number of strides provided for format: {} != {}", stride.len(), self.n_planes() ); } if dest.len() < unpack_format.pixel_stride()[0] as usize * width as usize { panic!("Too small destination slice"); } for plane in 0..(self.n_planes()) { if stride[plane as usize] < self.scale_width(plane as u8, width as u32) as i32 * self.pixel_stride()[plane as usize] { panic!("Too small source stride for plane {plane}"); } let plane_size = y * stride[plane as usize] + self.scale_width(plane as u8, (x + width) as u32) as i32 * self.pixel_stride()[plane as usize]; if src[plane as usize].len() < plane_size as usize { panic!("Too small source plane size for plane {plane}"); } } unsafe { use std::ptr; let mut src_ptr = [ptr::null(); ffi::GST_VIDEO_MAX_PLANES as usize]; for plane in 0..(self.n_planes()) { src_ptr[plane as usize] = src[plane as usize].as_ptr(); } (self.0.unpack_func.as_ref().unwrap())( self.0, flags.into_glib(), dest.as_mut_ptr() as *mut _, src_ptr.as_ptr() as *const _, stride.as_ptr(), x, y, width, ); } } #[allow(clippy::too_many_arguments)] pub fn pack( &self, flags: crate::VideoPackFlags, src: &[u8], src_stride: i32, dest: &mut [&mut [u8]], dest_stride: &[i32], chroma_site: crate::VideoChromaSite, y: i32, width: i32, ) { let unpack_format = Self::from_format(self.unpack_format()); if unpack_format.pixel_stride()[0] == 0 || self.0.unpack_func.is_none() { panic!("No unpack format for {self:?}"); } if dest.len() != self.n_planes() as usize { panic!( "Wrong number of planes provided for format: {} != {}", dest.len(), self.n_planes() ); } if dest_stride.len() != self.n_planes() as usize { panic!( "Wrong number of strides provided for format: {} != {}", dest_stride.len(), self.n_planes() ); } if src.len() < unpack_format.pixel_stride()[0] as usize * width as usize { panic!("Too small source slice"); } for plane in 0..(self.n_planes()) { if dest_stride[plane as usize] < self.scale_width(plane as u8, width as u32) as i32 * self.pixel_stride()[plane as usize] { panic!("Too small destination stride for plane {plane}"); } let plane_size = y * dest_stride[plane as usize] + self.scale_width(plane as u8, width as u32) as i32 * self.pixel_stride()[plane as usize]; if dest[plane as usize].len() < plane_size as usize { panic!("Too small destination plane size for plane {plane}"); } } unsafe { use std::ptr; let mut dest_ptr = [ptr::null_mut(); ffi::GST_VIDEO_MAX_PLANES as usize]; for plane in 0..(self.n_planes()) { dest_ptr[plane as usize] = dest[plane as usize].as_mut_ptr(); } (self.0.pack_func.as_ref().unwrap())( self.0, flags.into_glib(), src.as_ptr() as *mut _, src_stride, dest_ptr.as_mut_ptr() as *mut _, dest_stride.as_ptr(), chroma_site.into_glib(), y, width, ); } } #[doc(alias = "gst_video_color_range_offsets")] pub fn range_offsets(&self, range: crate::VideoColorRange) -> ([i32; 4], [i32; 4]) { let mut offset = [0i32; 4]; let mut scale = [0i32; 4]; unsafe { 
ffi::gst_video_color_range_offsets( range.into_glib(), self.to_glib_none().0, &mut offset, &mut scale, ) } (offset, scale) } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[doc(alias = "gst_video_format_info_extrapolate_stride")] pub fn extrapolate_stride(&self, plane: u32, stride: u32) -> u32 { assert!(plane < self.n_planes()); unsafe { ffi::gst_video_format_info_extrapolate_stride( self.to_glib_none().0, plane as i32, stride as i32, ) as u32 } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] pub fn tile_info(&self, plane: u32) -> &VideoTileInfo { assert!(plane < self.n_planes()); unsafe { &*(&self.0.tile_info[plane as usize] as *const _ as *const VideoTileInfo) } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "gst_video_format_info_component")] pub fn component(&self, plane: u32) -> [i32; ffi::GST_VIDEO_MAX_COMPONENTS as usize] { assert!(plane < self.n_planes()); let mut comp = [-1i32; ffi::GST_VIDEO_MAX_COMPONENTS as usize]; unsafe { ffi::gst_video_format_info_component(self.to_glib_none().0, plane, comp.as_mut_ptr()); } comp } } unsafe impl Sync for VideoFormatInfo {} unsafe impl Send for VideoFormatInfo {} impl PartialEq for VideoFormatInfo { #[inline] fn eq(&self, other: &Self) -> bool { self.format() == other.format() } } impl Eq for VideoFormatInfo {} impl PartialOrd for VideoFormatInfo { #[inline] fn partial_cmp(&self, other: &Self) -> Option { Some(self.cmp(other)) } } impl Ord for VideoFormatInfo { // See GST_VIDEO_FORMATS_ALL for the sorting algorithm fn cmp(&self, other: &Self) -> Ordering { self.n_components() .cmp(&other.n_components()) .reverse() .then_with(|| self.depth().cmp(other.depth()).reverse()) .then_with(|| self.w_sub().cmp(other.w_sub())) .then_with(|| self.h_sub().cmp(other.h_sub())) .then_with(|| self.n_planes().cmp(&other.n_planes()).reverse()) .then_with(|| { // Format using native endianness is considered smaller let native_endianness = [crate::VideoFormat::Ayuv64, crate::VideoFormat::Argb64]; let want_le = cfg!(target_endian = "little"); match ( self.flags().contains(crate::VideoFormatFlags::LE) == want_le || native_endianness.contains(&self.format()), other.flags().contains(crate::VideoFormatFlags::LE) == want_le || native_endianness.contains(&other.format()), ) { (true, false) => Ordering::Less, (false, true) => Ordering::Greater, _ => Ordering::Equal, } }) .then_with(|| { // Prefer non-complex formats match ( self.flags().contains(crate::VideoFormatFlags::COMPLEX), other.flags().contains(crate::VideoFormatFlags::COMPLEX), ) { (true, false) => Ordering::Greater, (false, true) => Ordering::Less, _ => Ordering::Equal, } }) .then_with(|| { // Prefer RGB over YUV if self.flags().contains(crate::VideoFormatFlags::RGB) && other.flags().contains(crate::VideoFormatFlags::YUV) { Ordering::Greater } else if self.flags().contains(crate::VideoFormatFlags::YUV) && other.flags().contains(crate::VideoFormatFlags::RGB) { Ordering::Less } else { Ordering::Equal } }) .then_with(|| { // Prefer xRGB and permutations over RGB and permutations let xrgb = [ crate::VideoFormat::Xrgb, crate::VideoFormat::Xbgr, crate::VideoFormat::Rgbx, crate::VideoFormat::Bgrx, ]; let rgb = [crate::VideoFormat::Rgb, crate::VideoFormat::Bgr]; if xrgb.contains(&self.format()) && rgb.contains(&other.format()) { Ordering::Less } else if rgb.contains(&self.format()) && xrgb.contains(&other.format()) { Ordering::Greater } else { Ordering::Equal } }) .then_with(|| 
self.pixel_stride().cmp(other.pixel_stride())) .then_with(|| self.poffset().cmp(other.poffset())) .then_with(|| { // tie, sort by name self.name().cmp(other.name()) }) // and reverse the whole ordering so that "better quality" > "lower quality" .reverse() } } impl fmt::Debug for VideoFormatInfo { #[allow(deprecated)] fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { let mut fmt = f.debug_struct("VideoFormatInfo"); fmt.field("format", &self.format()) .field("name", &self.name()) .field("description", &self.description()) .field("flags", &self.flags()) .field("bits", &self.bits()) .field("n-components", &self.n_components()) .field("shift", &self.shift()) .field("depth", &self.depth()) .field("pixel-stride", &self.pixel_stride()) .field("n-planes", &self.n_planes()) .field("plane", &self.plane()) .field("poffset", &self.poffset()) .field("w-sub", &self.w_sub()) .field("h-sub", &self.h_sub()) .field("unpack-format", &self.unpack_format()) .field("pack-lines", &self.pack_lines()) .field("tile-mode", &self.tile_mode()) .field("tile-ws", &self.tile_ws()) .field("tile-hs", &self.tile_hs()); #[cfg(feature = "v1_22")] { fmt.field( "tile-info", &(0..self.n_planes()).map(|plane| self.tile_info(plane)), ); } fmt.finish() } } impl fmt::Display for VideoFormatInfo { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.write_str(self.name()) } } impl str::FromStr for crate::VideoFormatInfo { type Err = glib::BoolError; fn from_str(s: &str) -> Result { skip_assert_initialized!(); let format = s.parse()?; Ok(Self::from_format(format)) } } impl From for VideoFormatInfo { #[inline] fn from(f: crate::VideoFormat) -> Self { skip_assert_initialized!(); Self::from_format(f) } } #[doc(hidden)] impl glib::translate::GlibPtrDefault for VideoFormatInfo { type GlibType = *mut ffi::GstVideoFormatInfo; } #[doc(hidden)] unsafe impl glib::translate::TransparentPtrType for VideoFormatInfo {} #[doc(hidden)] impl<'a> glib::translate::ToGlibPtr<'a, *const ffi::GstVideoFormatInfo> for VideoFormatInfo { type Storage = PhantomData<&'a Self>; #[inline] fn to_glib_none(&'a self) -> glib::translate::Stash<'a, *const ffi::GstVideoFormatInfo, Self> { glib::translate::Stash(self.0, PhantomData) } fn to_glib_full(&self) -> *const ffi::GstVideoFormatInfo { unimplemented!() } } #[doc(hidden)] impl glib::translate::FromGlibPtrNone<*mut ffi::GstVideoFormatInfo> for VideoFormatInfo { #[inline] unsafe fn from_glib_none(ptr: *mut ffi::GstVideoFormatInfo) -> Self { Self(&*ptr) } } #[doc(hidden)] impl glib::translate::FromGlibPtrNone<*const ffi::GstVideoFormatInfo> for VideoFormatInfo { #[inline] unsafe fn from_glib_none(ptr: *const ffi::GstVideoFormatInfo) -> Self { Self(&*ptr) } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[repr(transparent)] #[doc(alias = "GstVideoTileInfo")] pub struct VideoTileInfo(ffi::GstVideoTileInfo); #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] impl fmt::Debug for VideoTileInfo { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoTileInfo") .field("width", &self.width()) .field("height", &self.height()) .field("stride", &self.stride()) .field("size", &self.size()) .finish() } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] impl VideoTileInfo { #[inline] pub fn width(&self) -> u32 { self.0.width } #[inline] pub fn height(&self) -> u32 { self.0.height } #[inline] pub fn stride(&self) -> u32 { self.0.stride } #[inline] pub fn size(&self) -> u32 { self.0.size } } #[cfg(test)] mod tests { use 
super::*; #[test] fn test_get() { gst::init().unwrap(); let info = VideoFormatInfo::from_format(crate::VideoFormat::I420); assert_eq!(info.name(), "I420"); let other_info = "I420".parse().unwrap(); assert_eq!(info, other_info); assert_eq!(info.scale_width(0, 128), 128); assert_eq!(info.scale_width(1, 128), 64); assert_eq!(info.scale_width(2, 128), 64); } #[test] fn test_unpack() { gst::init().unwrap(); // One line black 320 pixel I420 let input = &[&[0; 320][..], &[128; 160][..], &[128; 160][..]]; // One line of AYUV let intermediate = &mut [0; 320 * 4][..]; // One line of 320 pixel I420 let output = &mut [&mut [0; 320][..], &mut [0; 160][..], &mut [0; 160][..]]; let info = VideoFormatInfo::from_format(crate::VideoFormat::I420); assert_eq!(info.unpack_format(), crate::VideoFormat::Ayuv); info.unpack( crate::VideoPackFlags::empty(), intermediate, input, &[320, 160, 160][..], 0, 0, 320, ); for pixel in intermediate.chunks_exact(4) { assert_eq!(&[255, 0, 128, 128][..], pixel); } info.pack( crate::VideoPackFlags::empty(), &intermediate[..(4 * 320)], 4 * 320, output, &[320, 160, 160][..], crate::VideoChromaSite::NONE, 0, 320, ); assert_eq!(input, output); } } gstreamer-video-0.23.5/src/video_frame.rs000064400000000000000000000727711046102023000164430ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. use std::{fmt, marker::PhantomData, mem, ops, ptr, slice}; use crate::ffi; use glib::translate::{from_glib, from_glib_none, Borrowed, ToGlibPtr}; pub enum Readable {} pub enum Writable {} pub trait IsVideoFrame { fn as_raw(&self) -> &ffi::GstVideoFrame; } impl IsVideoFrame for VideoFrame { #[inline] fn as_raw(&self) -> &ffi::GstVideoFrame { &self.frame } } fn plane_buffer_info( frame: &T, plane: u32, ) -> Result<(usize, usize), glib::BoolError> { skip_assert_initialized!(); if plane >= frame.n_planes() { return Err(glib::bool_error!( "Plane index higher than number of planes" )); } let format_info = frame.format_info(); // Just get the palette if format_info.has_palette() && plane == 1 { return Ok((1, 256 * 4)); } let w = frame.plane_stride()[plane as usize] as u32; let h = frame.plane_height(plane); if w == 0 || h == 0 { return Ok((0, 0)); } Ok((plane as usize, (w * h) as usize)) } pub struct VideoFrame { frame: ffi::GstVideoFrame, buffer: gst::Buffer, phantom: PhantomData, } unsafe impl Send for VideoFrame {} unsafe impl Sync for VideoFrame {} impl fmt::Debug for VideoFrame { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoFrame") .field("flags", &self.flags()) .field("id", &self.id()) .field("buffer", &self.buffer()) .field("info", &self.info()) .finish() } } mod sealed { pub trait Sealed {} impl Sealed for T {} } pub trait VideoFrameExt: sealed::Sealed + IsVideoFrame { #[inline] fn as_ptr(&self) -> *const ffi::GstVideoFrame { self.as_raw() as _ } #[inline] fn info(&self) -> &crate::VideoInfo { unsafe { let frame = self.as_raw(); let info = &frame.info as *const ffi::GstVideoInfo as *const crate::VideoInfo; &*info } } #[inline] fn flags(&self) -> crate::VideoFrameFlags { unsafe { from_glib(self.as_raw().flags) } } #[inline] fn id(&self) -> i32 { self.as_raw().id } #[inline] fn buffer(&self) -> &gst::BufferRef { unsafe { gst::BufferRef::from_ptr(self.as_raw().buffer) } } #[inline] fn format(&self) -> crate::VideoFormat { self.info().format() } #[inline] fn format_info(&self) -> crate::VideoFormatInfo { self.info().format_info() } #[inline] fn width(&self) -> u32 { self.info().width() } #[inline] fn height(&self) -> u32 { 
self.info().height() } #[inline] fn size(&self) -> usize { self.info().size() } #[inline] fn is_interlaced(&self) -> bool { self.flags().contains(crate::VideoFrameFlags::INTERLACED) } #[inline] fn is_tff(&self) -> bool { self.flags().contains(crate::VideoFrameFlags::TFF) } #[inline] fn is_rff(&self) -> bool { self.flags().contains(crate::VideoFrameFlags::RFF) } #[inline] fn is_onefield(&self) -> bool { self.flags().contains(crate::VideoFrameFlags::ONEFIELD) } #[inline] fn is_bottom_field(&self) -> bool { self.flags().contains(crate::VideoFrameFlags::ONEFIELD) && !self.flags().contains(crate::VideoFrameFlags::TFF) } #[inline] fn is_top_field(&self) -> bool { self.flags().contains(crate::VideoFrameFlags::ONEFIELD) && self.flags().contains(crate::VideoFrameFlags::TFF) } #[inline] fn n_planes(&self) -> u32 { self.info().n_planes() } #[inline] fn n_components(&self) -> u32 { self.info().n_components() } #[inline] fn plane_stride(&self) -> &[i32] { self.info().stride() } #[inline] fn plane_offset(&self) -> &[usize] { self.info().offset() } #[inline] fn plane_height(&self, plane: u32) -> u32 { cfg_if::cfg_if! { if #[cfg(feature = "v1_18")] { let comp = self.format_info().component(plane)[0]; if comp == -1 { 0 } else { self.comp_height(comp as u32) } } else { // FIXME: This assumes that the horizontal subsampling of all // components in the plane is the same, which is probably safe // Legacy implementation that does not support video formats // where plane index and component index are not the same. // See #536 self.format_info().scale_height(plane as u8, self.height()) } } } #[inline] fn comp_depth(&self, component: u32) -> u32 { self.info().comp_depth(component as u8) } #[inline] fn comp_height(&self, component: u32) -> u32 { self.info().comp_height(component as u8) } #[inline] fn comp_width(&self, component: u32) -> u32 { self.info().comp_width(component as u8) } #[inline] fn comp_offset(&self, component: u32) -> usize { self.info().comp_offset(component as u8) } #[inline] fn comp_poffset(&self, component: u32) -> u32 { self.info().comp_poffset(component as u8) } #[inline] fn comp_pstride(&self, component: u32) -> i32 { self.info().comp_pstride(component as u8) } #[inline] fn comp_stride(&self, component: u32) -> i32 { self.info().comp_stride(component as u8) } #[inline] fn comp_plane(&self, component: u32) -> u32 { self.info().comp_plane(component as u8) } } impl VideoFrameExt for O {} impl VideoFrame { #[inline] pub fn into_buffer(self) -> gst::Buffer { unsafe { let mut s = mem::ManuallyDrop::new(self); let buffer = ptr::read(&s.buffer); ffi::gst_video_frame_unmap(&mut s.frame); buffer } } #[doc(alias = "gst_video_frame_copy")] pub fn copy(&self, dest: &mut VideoFrame) -> Result<(), glib::BoolError> { unsafe { let res: bool = from_glib(ffi::gst_video_frame_copy(&mut dest.frame, &self.frame)); if res { Ok(()) } else { Err(glib::bool_error!("Failed to copy video frame")) } } } #[doc(alias = "gst_video_frame_copy_plane")] pub fn copy_plane( &self, dest: &mut VideoFrame, plane: u32, ) -> Result<(), glib::BoolError> { skip_assert_initialized!(); unsafe { let res: bool = from_glib(ffi::gst_video_frame_copy_plane( &mut dest.frame, &self.frame, plane, )); if res { Ok(()) } else { Err(glib::bool_error!("Failed to copy video frame plane")) } } } #[inline] pub fn comp_data(&self, component: u32) -> Result<&[u8], glib::BoolError> { let poffset = self.info().comp_poffset(component as u8) as usize; Ok(&self.plane_data(self.format_info().plane()[component as usize])?[poffset..]) } #[inline] pub fn 
buffer(&self) -> &gst::BufferRef { unsafe { gst::BufferRef::from_ptr(self.frame.buffer) } } pub fn plane_data(&self, plane: u32) -> Result<&[u8], glib::BoolError> { match plane_buffer_info(self, plane) { Ok((plane, size)) => { if size == 0 { return Ok(&[]); } unsafe { Ok(slice::from_raw_parts( self.frame.data[plane] as *const u8, size, )) } } Err(err) => Err(err), } } pub fn planes_data(&self) -> [&[u8]; 4] { let mut planes = [[].as_slice(); 4]; for plane in 0..self.n_planes() { planes[plane as usize] = self.plane_data(plane).unwrap(); } planes } #[inline] pub unsafe fn from_glib_full(frame: ffi::GstVideoFrame) -> Self { let buffer = gst::Buffer::from_glib_none(frame.buffer); Self { frame, buffer, phantom: PhantomData, } } #[inline] pub fn as_video_frame_ref(&self) -> VideoFrameRef<&gst::BufferRef> { let frame = unsafe { ptr::read(&self.frame) }; VideoFrameRef { frame, unmap: false, phantom: PhantomData, } } #[inline] pub fn into_raw(self) -> ffi::GstVideoFrame { unsafe { let mut s = mem::ManuallyDrop::new(self); ptr::drop_in_place(&mut s.buffer); s.frame } } } impl Drop for VideoFrame { #[inline] fn drop(&mut self) { unsafe { ffi::gst_video_frame_unmap(&mut self.frame); } } } impl VideoFrame { #[inline] pub fn from_buffer_readable( buffer: gst::Buffer, info: &crate::VideoInfo, ) -> Result { skip_assert_initialized!(); assert!(info.is_valid()); unsafe { let mut frame = mem::MaybeUninit::uninit(); let res: bool = from_glib(ffi::gst_video_frame_map( frame.as_mut_ptr(), info.to_glib_none().0 as *mut _, buffer.to_glib_none().0, ffi::GST_VIDEO_FRAME_MAP_FLAG_NO_REF | gst::ffi::GST_MAP_READ, )); if !res { Err(buffer) } else { let frame = frame.assume_init(); Ok(Self { frame, buffer, phantom: PhantomData, }) } } } #[inline] pub fn from_buffer_id_readable( buffer: gst::Buffer, id: i32, info: &crate::VideoInfo, ) -> Result { skip_assert_initialized!(); assert!(info.is_valid()); unsafe { let mut frame = mem::MaybeUninit::uninit(); let res: bool = from_glib(ffi::gst_video_frame_map_id( frame.as_mut_ptr(), info.to_glib_none().0 as *mut _, buffer.to_glib_none().0, id, ffi::GST_VIDEO_FRAME_MAP_FLAG_NO_REF | gst::ffi::GST_MAP_READ, )); if !res { Err(buffer) } else { let frame = frame.assume_init(); Ok(Self { frame, buffer, phantom: PhantomData, }) } } } #[inline] pub fn buffer_owned(&self) -> gst::Buffer { unsafe { from_glib_none(self.frame.buffer) } } } impl VideoFrame { #[inline] pub fn from_buffer_writable( buffer: gst::Buffer, info: &crate::VideoInfo, ) -> Result { skip_assert_initialized!(); assert!(info.is_valid()); unsafe { let mut frame = mem::MaybeUninit::uninit(); let res: bool = from_glib(ffi::gst_video_frame_map( frame.as_mut_ptr(), info.to_glib_none().0 as *mut _, buffer.to_glib_none().0, ffi::GST_VIDEO_FRAME_MAP_FLAG_NO_REF | gst::ffi::GST_MAP_READ | gst::ffi::GST_MAP_WRITE, )); if !res { Err(buffer) } else { let frame = frame.assume_init(); Ok(Self { frame, buffer, phantom: PhantomData, }) } } } #[inline] pub fn from_buffer_id_writable( buffer: gst::Buffer, id: i32, info: &crate::VideoInfo, ) -> Result { skip_assert_initialized!(); assert!(info.is_valid()); unsafe { let mut frame = mem::MaybeUninit::uninit(); let res: bool = from_glib(ffi::gst_video_frame_map_id( frame.as_mut_ptr(), info.to_glib_none().0 as *mut _, buffer.to_glib_none().0, id, ffi::GST_VIDEO_FRAME_MAP_FLAG_NO_REF | gst::ffi::GST_MAP_READ | gst::ffi::GST_MAP_WRITE, )); if !res { Err(buffer) } else { let frame = frame.assume_init(); Ok(Self { frame, buffer, phantom: PhantomData, }) } } } pub fn comp_data_mut(&mut self, 
component: u32) -> Result<&mut [u8], glib::BoolError> { let poffset = self.info().comp_poffset(component as u8) as usize; Ok(&mut self.plane_data_mut(self.format_info().plane()[component as usize])?[poffset..]) } pub fn plane_data_mut(&mut self, plane: u32) -> Result<&mut [u8], glib::BoolError> { match plane_buffer_info(self, plane) { Ok((plane, size)) => { if size == 0 { return Ok(&mut []); } unsafe { Ok(slice::from_raw_parts_mut( self.frame.data[plane] as *mut u8, size, )) } } Err(err) => Err(err), } } pub fn planes_data_mut(&mut self) -> [&mut [u8]; 4] { unsafe { let mut planes = [ [].as_mut_slice(), [].as_mut_slice(), [].as_mut_slice(), [].as_mut_slice(), ]; for plane in 0..self.n_planes() { let slice = self.plane_data_mut(plane).unwrap(); planes[plane as usize] = slice::from_raw_parts_mut(slice.as_mut_ptr(), slice.len()); } planes } } #[inline] pub fn as_mut_video_frame_ref(&mut self) -> VideoFrameRef<&mut gst::BufferRef> { let frame = unsafe { ptr::read(&self.frame) }; VideoFrameRef { frame, unmap: false, phantom: PhantomData, } } #[inline] pub fn as_mut_ptr(&mut self) -> *mut ffi::GstVideoFrame { &mut self.frame } } pub struct VideoFrameRef { frame: ffi::GstVideoFrame, unmap: bool, phantom: PhantomData, } impl IsVideoFrame for VideoFrameRef { #[inline] fn as_raw(&self) -> &ffi::GstVideoFrame { &self.frame } } impl fmt::Debug for VideoFrameRef { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoFrameRef") .field("flags", &self.flags()) .field("id", &self.id()) .field("buffer", &unsafe { gst::BufferRef::from_ptr(self.frame.buffer) }) .field("info", &self.info()) .finish() } } impl VideoFrameRef { #[doc(alias = "gst_video_frame_copy")] pub fn copy( &self, dest: &mut VideoFrameRef<&mut gst::BufferRef>, ) -> Result<(), glib::BoolError> { unsafe { let res: bool = from_glib(ffi::gst_video_frame_copy(&mut dest.frame, &self.frame)); if res { Ok(()) } else { Err(glib::bool_error!("Failed to copy video frame")) } } } #[doc(alias = "gst_video_frame_copy_plane")] pub fn copy_plane( &self, dest: &mut VideoFrameRef<&mut gst::BufferRef>, plane: u32, ) -> Result<(), glib::BoolError> { skip_assert_initialized!(); unsafe { let res: bool = from_glib(ffi::gst_video_frame_copy_plane( &mut dest.frame, &self.frame, plane, )); if res { Ok(()) } else { Err(glib::bool_error!("Failed to copy video frame plane")) } } } pub fn comp_data(&self, component: u32) -> Result<&[u8], glib::BoolError> { let poffset = self.info().comp_poffset(component as u8) as usize; Ok(&self.plane_data(self.format_info().plane()[component as usize])?[poffset..]) } pub fn plane_data(&self, plane: u32) -> Result<&[u8], glib::BoolError> { match plane_buffer_info(self, plane) { Ok((plane, size)) => { if size == 0 { return Ok(&[]); } unsafe { Ok(slice::from_raw_parts( self.frame.data[plane] as *const u8, size, )) } } Err(err) => Err(err), } } pub fn planes_data(&self) -> [&[u8]; 4] { let mut planes = [[].as_slice(); 4]; for plane in 0..self.n_planes() { planes[plane as usize] = self.plane_data(plane).unwrap(); } planes } } impl<'a> VideoFrameRef<&'a gst::BufferRef> { #[inline] pub unsafe fn from_glib_borrow(frame: *const ffi::GstVideoFrame) -> Borrowed { debug_assert!(!frame.is_null()); let frame = ptr::read(frame); Borrowed::new(Self { frame, unmap: false, phantom: PhantomData, }) } #[inline] pub unsafe fn from_glib_full(frame: ffi::GstVideoFrame) -> Self { Self { frame, unmap: true, phantom: PhantomData, } } #[inline] pub fn from_buffer_ref_readable<'b>( buffer: &'a gst::BufferRef, info: &'b crate::VideoInfo, ) 
-> Result { skip_assert_initialized!(); assert!(info.is_valid()); unsafe { let mut frame = mem::MaybeUninit::uninit(); let res: bool = from_glib(ffi::gst_video_frame_map( frame.as_mut_ptr(), info.to_glib_none().0 as *mut _, buffer.as_mut_ptr(), ffi::GST_VIDEO_FRAME_MAP_FLAG_NO_REF | gst::ffi::GST_MAP_READ, )); if !res { Err(glib::bool_error!("Failed to map VideoFrame")) } else { let frame = frame.assume_init(); Ok(Self { frame, unmap: true, phantom: PhantomData, }) } } } #[inline] pub fn from_buffer_ref_id_readable<'b>( buffer: &'a gst::BufferRef, id: i32, info: &'b crate::VideoInfo, ) -> Result { skip_assert_initialized!(); assert!(info.is_valid()); unsafe { let mut frame = mem::MaybeUninit::uninit(); let res: bool = from_glib(ffi::gst_video_frame_map_id( frame.as_mut_ptr(), info.to_glib_none().0 as *mut _, buffer.as_mut_ptr(), id, ffi::GST_VIDEO_FRAME_MAP_FLAG_NO_REF | gst::ffi::GST_MAP_READ, )); if !res { Err(glib::bool_error!("Failed to map VideoFrame")) } else { let frame = frame.assume_init(); Ok(Self { frame, unmap: true, phantom: PhantomData, }) } } } } impl<'a> VideoFrameRef<&'a mut gst::BufferRef> { #[inline] pub unsafe fn from_glib_borrow_mut(frame: *mut ffi::GstVideoFrame) -> Self { debug_assert!(!frame.is_null()); let frame = ptr::read(frame); Self { frame, unmap: false, phantom: PhantomData, } } #[inline] pub unsafe fn from_glib_full_mut(frame: ffi::GstVideoFrame) -> Self { Self { frame, unmap: true, phantom: PhantomData, } } #[inline] pub fn from_buffer_ref_writable<'b>( buffer: &'a mut gst::BufferRef, info: &'b crate::VideoInfo, ) -> Result { skip_assert_initialized!(); assert!(info.is_valid()); unsafe { let mut frame = mem::MaybeUninit::uninit(); let res: bool = from_glib(ffi::gst_video_frame_map( frame.as_mut_ptr(), info.to_glib_none().0 as *mut _, buffer.as_mut_ptr(), ffi::GST_VIDEO_FRAME_MAP_FLAG_NO_REF | gst::ffi::GST_MAP_READ | gst::ffi::GST_MAP_WRITE, )); if !res { Err(glib::bool_error!("Failed to map VideoFrame")) } else { let frame = frame.assume_init(); Ok(Self { frame, unmap: true, phantom: PhantomData, }) } } } #[inline] pub fn from_buffer_ref_id_writable<'b>( buffer: &'a mut gst::BufferRef, id: i32, info: &'b crate::VideoInfo, ) -> Result { skip_assert_initialized!(); assert!(info.is_valid()); unsafe { let mut frame = mem::MaybeUninit::uninit(); let res: bool = from_glib(ffi::gst_video_frame_map_id( frame.as_mut_ptr(), info.to_glib_none().0 as *mut _, buffer.as_mut_ptr(), id, ffi::GST_VIDEO_FRAME_MAP_FLAG_NO_REF | gst::ffi::GST_MAP_READ | gst::ffi::GST_MAP_WRITE, )); if !res { Err(glib::bool_error!("Failed to map VideoFrame")) } else { let frame = frame.assume_init(); Ok(Self { frame, unmap: true, phantom: PhantomData, }) } } } pub fn comp_data_mut(&mut self, component: u32) -> Result<&mut [u8], glib::BoolError> { let poffset = self.info().comp_poffset(component as u8) as usize; Ok(&mut self.plane_data_mut(self.format_info().plane()[component as usize])?[poffset..]) } pub fn plane_data_mut(&mut self, plane: u32) -> Result<&mut [u8], glib::BoolError> { match plane_buffer_info(self, plane) { Ok((plane, size)) => { if size == 0 { return Ok(&mut []); } unsafe { Ok(slice::from_raw_parts_mut( self.frame.data[plane] as *mut u8, size, )) } } Err(err) => Err(err), } } pub fn planes_data_mut(&mut self) -> [&mut [u8]; 4] { unsafe { let mut planes = [ [].as_mut_slice(), [].as_mut_slice(), [].as_mut_slice(), [].as_mut_slice(), ]; for plane in 0..self.n_planes() { let slice = self.plane_data_mut(plane).unwrap(); planes[plane as usize] = 
slice::from_raw_parts_mut(slice.as_mut_ptr(), slice.len()); } planes } } #[inline] pub fn as_mut_ptr(&mut self) -> *mut ffi::GstVideoFrame { &mut self.frame } } impl<'a> ops::Deref for VideoFrameRef<&'a mut gst::BufferRef> { type Target = VideoFrameRef<&'a gst::BufferRef>; #[inline] fn deref(&self) -> &Self::Target { unsafe { &*(self as *const Self as *const Self::Target) } } } unsafe impl Send for VideoFrameRef {} unsafe impl Sync for VideoFrameRef {} impl Drop for VideoFrameRef { #[inline] fn drop(&mut self) { unsafe { if self.unmap { ffi::gst_video_frame_unmap(&mut self.frame); } } } } pub trait VideoBufferExt { #[doc(alias = "get_video_flags")] fn video_flags(&self) -> crate::VideoBufferFlags; fn set_video_flags(&mut self, flags: crate::VideoBufferFlags); fn unset_video_flags(&mut self, flags: crate::VideoBufferFlags); } impl VideoBufferExt for gst::BufferRef { #[inline] fn video_flags(&self) -> crate::VideoBufferFlags { unsafe { let ptr = self.as_mut_ptr(); crate::VideoBufferFlags::from_bits_truncate((*ptr).mini_object.flags) } } #[inline] fn set_video_flags(&mut self, flags: crate::VideoBufferFlags) { unsafe { let ptr = self.as_mut_ptr(); (*ptr).mini_object.flags |= flags.bits(); } } #[inline] fn unset_video_flags(&mut self, flags: crate::VideoBufferFlags) { unsafe { let ptr = self.as_mut_ptr(); (*ptr).mini_object.flags &= !flags.bits(); } } } #[cfg(test)] mod tests { use super::*; #[test] fn test_map_read() { gst::init().unwrap(); let info = crate::VideoInfo::builder(crate::VideoFormat::Gray8, 320, 240) .build() .unwrap(); let buffer = gst::Buffer::with_size(info.size()).unwrap(); let frame = VideoFrame::from_buffer_readable(buffer, &info).unwrap(); assert!(frame.plane_data(0).is_ok()); assert_eq!(frame.plane_data(0).unwrap().len(), 320 * 240); assert!(frame.plane_data(1).is_err()); assert!(frame.info() == &info); { let frame = frame.as_video_frame_ref(); assert!(frame.plane_data(0).is_ok()); assert_eq!(frame.plane_data(0).unwrap().len(), 320 * 240); assert!(frame.plane_data(1).is_err()); assert!(frame.info() == &info); } assert!(frame.plane_data(0).is_ok()); assert_eq!(frame.plane_data(0).unwrap().len(), 320 * 240); assert!(frame.plane_data(1).is_err()); assert!(frame.info() == &info); } #[test] fn test_map_write() { gst::init().unwrap(); let info = crate::VideoInfo::builder(crate::VideoFormat::Gray8, 320, 240) .build() .unwrap(); let buffer = gst::Buffer::with_size(info.size()).unwrap(); let mut frame = VideoFrame::from_buffer_writable(buffer, &info).unwrap(); assert!(frame.plane_data_mut(0).is_ok()); assert_eq!(frame.plane_data_mut(0).unwrap().len(), 320 * 240); assert!(frame.plane_data_mut(1).is_err()); assert!(frame.info() == &info); { let mut frame = frame.as_mut_video_frame_ref(); assert!(frame.plane_data_mut(0).is_ok()); assert_eq!(frame.plane_data_mut(0).unwrap().len(), 320 * 240); assert!(frame.plane_data_mut(1).is_err()); assert!(frame.info() == &info); } assert!(frame.plane_data_mut(0).is_ok()); assert_eq!(frame.plane_data_mut(0).unwrap().len(), 320 * 240); assert!(frame.plane_data_mut(1).is_err()); assert!(frame.info() == &info); } #[test] fn test_map_ref_read() { gst::init().unwrap(); let info = crate::VideoInfo::builder(crate::VideoFormat::Gray8, 320, 240) .build() .unwrap(); let buffer = gst::Buffer::with_size(info.size()).unwrap(); let frame = VideoFrameRef::from_buffer_ref_readable(&buffer, &info).unwrap(); assert!(frame.plane_data(0).is_ok()); assert_eq!(frame.plane_data(0).unwrap().len(), 320 * 240); assert!(frame.plane_data(1).is_err()); assert!(frame.info() == 
&info); } #[test] fn test_map_ref_write() { gst::init().unwrap(); let info = crate::VideoInfo::builder(crate::VideoFormat::Gray8, 320, 240) .build() .unwrap(); let mut buffer = gst::Buffer::with_size(info.size()).unwrap(); { let buffer = buffer.get_mut().unwrap(); let mut frame = VideoFrameRef::from_buffer_ref_writable(buffer, &info).unwrap(); assert!(frame.plane_data_mut(0).is_ok()); assert_eq!(frame.plane_data_mut(0).unwrap().len(), 320 * 240); assert!(frame.plane_data_mut(1).is_err()); assert!(frame.info() == &info); } } #[cfg(feature = "v1_20")] #[test] fn test_plane_data() { gst::init().unwrap(); let info = crate::VideoInfo::builder(crate::VideoFormat::Av12, 320, 240) .build() .unwrap(); let buffer = gst::Buffer::with_size(info.size()).unwrap(); let mut frame = VideoFrame::from_buffer_writable(buffer, &info).unwrap(); // Alpha plane { let mut frame = frame.as_mut_video_frame_ref(); let data = frame.plane_data_mut(2).unwrap(); assert_eq!(data.len(), 320 * 240); data[0] = 42; } // UV plane { let mut frame = frame.as_mut_video_frame_ref(); let data = frame.plane_data_mut(1).unwrap(); assert_eq!(data.len(), 320 * 120); data[0] = 42; } let frame = frame.into_buffer(); let frame = VideoFrame::from_buffer_readable(frame, &info).unwrap(); let alpha_data = frame.plane_data(2).unwrap(); assert_eq!(alpha_data.len(), 320 * 240); assert_eq!(alpha_data[0], 42); let uv_data = frame.plane_data(1).unwrap(); assert_eq!(uv_data.len(), 320 * 120); assert_eq!(uv_data[0], 42); } } gstreamer-video-0.23.5/src/video_hdr.rs000064400000000000000000000261721046102023000161200ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. use std::{fmt, mem, ptr, str}; use crate::ffi; use glib::translate::*; #[doc(alias = "GstVideoContentLightLevel")] #[derive(Copy, Clone)] pub struct VideoContentLightLevel(ffi::GstVideoContentLightLevel); impl VideoContentLightLevel { pub fn new(max_content_light_level: u16, max_frame_average_light_level: u16) -> Self { skip_assert_initialized!(); VideoContentLightLevel(ffi::GstVideoContentLightLevel { max_content_light_level, max_frame_average_light_level, _gst_reserved: [ptr::null_mut(); 4], }) } pub fn max_content_light_level(&self) -> u16 { self.0.max_content_light_level } pub fn set_max_content_light_level(&mut self, max_content_light_level: u16) { self.0.max_content_light_level = max_content_light_level; } pub fn max_frame_average_light_level(&self) -> u16 { self.0.max_frame_average_light_level } pub fn set_max_frame_average_light_level(&mut self, max_frame_average_light_level: u16) { self.0.max_frame_average_light_level = max_frame_average_light_level; } #[doc(alias = "gst_video_content_light_level_add_to_caps")] pub fn add_to_caps(&self, caps: &mut gst::CapsRef) { unsafe { ffi::gst_video_content_light_level_add_to_caps(&self.0, caps.as_mut_ptr()); } } #[doc(alias = "gst_video_content_light_level_from_caps")] pub fn from_caps(caps: &gst::CapsRef) -> Result { skip_assert_initialized!(); unsafe { let mut info = mem::MaybeUninit::uninit(); let res: bool = from_glib(ffi::gst_video_content_light_level_from_caps( info.as_mut_ptr(), caps.as_ptr(), )); if res { Ok(VideoContentLightLevel(info.assume_init())) } else { Err(glib::bool_error!( "Failed to parse VideoContentLightLevel from caps" )) } } } } impl<'a> TryFrom<&'a gst::CapsRef> for VideoContentLightLevel { type Error = glib::BoolError; fn try_from(value: &'a gst::CapsRef) -> Result { skip_assert_initialized!(); Self::from_caps(value) } } impl PartialEq for VideoContentLightLevel { 
#[doc(alias = "gst_video_content_light_level_is_equal")] fn eq(&self, other: &Self) -> bool { #[cfg(feature = "v1_20")] unsafe { from_glib(ffi::gst_video_content_light_level_is_equal( &self.0, &other.0, )) } #[cfg(not(feature = "v1_20"))] { self.0.max_content_light_level == other.0.max_content_light_level && self.0.max_frame_average_light_level == other.0.max_frame_average_light_level } } } impl Eq for VideoContentLightLevel {} impl str::FromStr for VideoContentLightLevel { type Err = glib::error::BoolError; #[doc(alias = "gst_video_content_light_level_from_string")] fn from_str(s: &str) -> Result { assert_initialized_main_thread!(); unsafe { let mut colorimetry = mem::MaybeUninit::uninit(); let valid: bool = from_glib(ffi::gst_video_content_light_level_from_string( colorimetry.as_mut_ptr(), s.to_glib_none().0, )); if valid { Ok(Self(colorimetry.assume_init())) } else { Err(glib::bool_error!("Invalid colorimetry info")) } } } } impl fmt::Debug for VideoContentLightLevel { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoContentLightLevel") .field("max_content_light_level", &self.0.max_content_light_level) .field( "max_frame_average_light_level", &self.0.max_frame_average_light_level, ) .finish() } } impl fmt::Display for VideoContentLightLevel { #[doc(alias = "gst_video_content_light_level_to_string")] fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { let s = unsafe { glib::GString::from_glib_full(ffi::gst_video_content_light_level_to_string(&self.0)) }; f.write_str(&s) } } #[doc(alias = "GstVideoMasteringDisplayInfo")] #[derive(Copy, Clone)] pub struct VideoMasteringDisplayInfo(ffi::GstVideoMasteringDisplayInfo); impl VideoMasteringDisplayInfo { pub fn new( display_primaries: [VideoMasteringDisplayInfoCoordinate; 3], white_point: VideoMasteringDisplayInfoCoordinate, max_display_mastering_luminance: u32, min_display_mastering_luminance: u32, ) -> Self { skip_assert_initialized!(); VideoMasteringDisplayInfo(ffi::GstVideoMasteringDisplayInfo { display_primaries: unsafe { mem::transmute::< [VideoMasteringDisplayInfoCoordinate; 3], [ffi::GstVideoMasteringDisplayInfoCoordinates; 3], >(display_primaries) }, white_point: unsafe { mem::transmute::< VideoMasteringDisplayInfoCoordinate, ffi::GstVideoMasteringDisplayInfoCoordinates, >(white_point) }, max_display_mastering_luminance, min_display_mastering_luminance, _gst_reserved: [ptr::null_mut(); 4], }) } pub fn display_primaries(&self) -> [VideoMasteringDisplayInfoCoordinate; 3] { unsafe { mem::transmute(self.0.display_primaries) } } pub fn set_display_primaries( &mut self, display_primaries: [VideoMasteringDisplayInfoCoordinate; 3], ) { self.0.display_primaries = unsafe { mem::transmute::< [VideoMasteringDisplayInfoCoordinate; 3], [ffi::GstVideoMasteringDisplayInfoCoordinates; 3], >(display_primaries) }; } pub fn white_point(&self) -> VideoMasteringDisplayInfoCoordinate { unsafe { mem::transmute(self.0.white_point) } } pub fn set_white_point(&mut self, white_point: VideoMasteringDisplayInfoCoordinate) { self.0.white_point = unsafe { mem::transmute::< VideoMasteringDisplayInfoCoordinate, ffi::GstVideoMasteringDisplayInfoCoordinates, >(white_point) }; } pub fn max_display_mastering_luminance(&self) -> u32 { self.0.max_display_mastering_luminance } pub fn set_max_display_mastering_luminance(&mut self, max_display_mastering_luminance: u32) { self.0.max_display_mastering_luminance = max_display_mastering_luminance; } pub fn min_display_mastering_luminance(&self) -> u32 { self.0.min_display_mastering_luminance } pub fn 
set_min_display_mastering_luminance(&mut self, min_display_mastering_luminance: u32) { self.0.min_display_mastering_luminance = min_display_mastering_luminance; } #[doc(alias = "gst_video_mastering_display_info_add_to_caps")] pub fn add_to_caps(&self, caps: &mut gst::CapsRef) { unsafe { ffi::gst_video_mastering_display_info_add_to_caps(&self.0, caps.as_mut_ptr()); } } #[doc(alias = "gst_video_mastering_display_info_from_caps")] pub fn from_caps(caps: &gst::CapsRef) -> Result { skip_assert_initialized!(); unsafe { let mut info = mem::MaybeUninit::uninit(); let res: bool = from_glib(ffi::gst_video_mastering_display_info_from_caps( info.as_mut_ptr(), caps.as_ptr(), )); if res { Ok(VideoMasteringDisplayInfo(info.assume_init())) } else { Err(glib::bool_error!( "Failed to parse VideoMasteringDisplayInfo from caps" )) } } } } impl<'a> TryFrom<&'a gst::CapsRef> for VideoMasteringDisplayInfo { type Error = glib::BoolError; fn try_from(value: &'a gst::CapsRef) -> Result { skip_assert_initialized!(); Self::from_caps(value) } } impl PartialEq for VideoMasteringDisplayInfo { #[doc(alias = "gst_video_mastering_display_info_is_equal")] fn eq(&self, other: &Self) -> bool { unsafe { from_glib(ffi::gst_video_mastering_display_info_is_equal( &self.0, &other.0, )) } } } impl Eq for VideoMasteringDisplayInfo {} impl str::FromStr for VideoMasteringDisplayInfo { type Err = glib::error::BoolError; #[doc(alias = "gst_video_mastering_display_info_from_string")] fn from_str(s: &str) -> Result { assert_initialized_main_thread!(); unsafe { let mut colorimetry = mem::MaybeUninit::uninit(); let valid: bool = from_glib(ffi::gst_video_mastering_display_info_from_string( colorimetry.as_mut_ptr(), s.to_glib_none().0, )); if valid { Ok(Self(colorimetry.assume_init())) } else { Err(glib::bool_error!("Invalid colorimetry info")) } } } } impl fmt::Debug for VideoMasteringDisplayInfo { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoMasteringDisplayInfo") .field("display_primaries", &self.display_primaries()) .field("white_point", &self.white_point()) .field( "max_display_mastering_luminance", &self.0.max_display_mastering_luminance, ) .field( "min_display_mastering_luminance", &self.0.min_display_mastering_luminance, ) .finish() } } impl fmt::Display for VideoMasteringDisplayInfo { #[doc(alias = "gst_video_mastering_display_info_to_string")] fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { let s = unsafe { glib::GString::from_glib_full(ffi::gst_video_mastering_display_info_to_string(&self.0)) }; f.write_str(&s) } } #[repr(C)] #[derive(Copy, Clone, PartialEq, Eq)] #[doc(alias = "GstVideoMasteringDisplayInfoCoordinates")] pub struct VideoMasteringDisplayInfoCoordinate { pub x: u16, pub y: u16, } impl VideoMasteringDisplayInfoCoordinate { pub fn new(x: f32, y: f32) -> Self { skip_assert_initialized!(); Self { x: (x * 50000.0) as u16, y: (y * 50000.0) as u16, } } pub fn x(&self) -> f32 { self.x as f32 / 50000.0 } pub fn y(&self) -> f32 { self.y as f32 / 50000.0 } pub fn set_x(&mut self, x: f32) { self.x = (x * 50000.0) as u16; } pub fn set_y(&mut self, y: f32) { self.y = (y * 50000.0) as u16; } } impl fmt::Debug for VideoMasteringDisplayInfoCoordinate { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoMasteringDisplayInfoCoordinate") .field("x", &self.x()) .field("y", &self.y()) .finish() } } gstreamer-video-0.23.5/src/video_info.rs000064400000000000000000001162321046102023000162730ustar 00000000000000// Take a look at the license at the top of the repository in the 
LICENSE file. use std::{fmt, marker::PhantomData, mem, ptr, str}; use crate::ffi; use glib::translate::*; use gst::prelude::*; #[doc(alias = "GST_VIDEO_MAX_PLANES")] pub const VIDEO_MAX_PLANES: usize = ffi::GST_VIDEO_MAX_PLANES as usize; #[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Copy)] #[non_exhaustive] #[doc(alias = "GstVideoColorRange")] pub enum VideoColorRange { #[doc(alias = "GST_VIDEO_COLOR_RANGE_UNKNOWN")] Unknown, #[doc(alias = "GST_VIDEO_COLOR_RANGE_0_255")] Range0_255, #[doc(alias = "GST_VIDEO_COLOR_RANGE_16_235")] Range16_235, #[doc(hidden)] __Unknown(i32), } #[doc(hidden)] impl IntoGlib for VideoColorRange { type GlibType = ffi::GstVideoColorRange; #[inline] fn into_glib(self) -> ffi::GstVideoColorRange { match self { Self::Unknown => ffi::GST_VIDEO_COLOR_RANGE_UNKNOWN, Self::Range0_255 => ffi::GST_VIDEO_COLOR_RANGE_0_255, Self::Range16_235 => ffi::GST_VIDEO_COLOR_RANGE_16_235, Self::__Unknown(value) => value, } } } #[doc(hidden)] impl FromGlib for VideoColorRange { #[inline] unsafe fn from_glib(value: ffi::GstVideoColorRange) -> Self { skip_assert_initialized!(); match value { 0 => Self::Unknown, 1 => Self::Range0_255, 2 => Self::Range16_235, value => Self::__Unknown(value), } } } impl StaticType for VideoColorRange { #[inline] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_color_range_get_type()) } } } impl glib::value::ValueType for VideoColorRange { type Type = Self; } unsafe impl<'a> glib::value::FromValue<'a> for VideoColorRange { type Checker = glib::value::GenericValueTypeChecker; unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib(glib::gobject_ffi::g_value_get_enum(value.to_glib_none().0)) } } impl ToValue for VideoColorRange { fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_enum(value.to_glib_none_mut().0, self.into_glib()) } value } fn value_type(&self) -> glib::Type { Self::static_type() } } impl From for glib::Value { fn from(v: VideoColorRange) -> glib::Value { skip_assert_initialized!(); glib::value::ToValue::to_value(&v) } } #[doc(alias = "GstVideoColorimetry")] #[derive(Copy, Clone)] #[repr(transparent)] pub struct VideoColorimetry(ffi::GstVideoColorimetry); impl VideoColorimetry { pub fn new( range: crate::VideoColorRange, matrix: crate::VideoColorMatrix, transfer: crate::VideoTransferFunction, primaries: crate::VideoColorPrimaries, ) -> Self { skip_assert_initialized!(); let colorimetry = ffi::GstVideoColorimetry { range: range.into_glib(), matrix: matrix.into_glib(), transfer: transfer.into_glib(), primaries: primaries.into_glib(), }; Self(colorimetry) } #[inline] pub fn range(&self) -> crate::VideoColorRange { unsafe { from_glib(self.0.range) } } #[inline] pub fn matrix(&self) -> crate::VideoColorMatrix { unsafe { from_glib(self.0.matrix) } } #[inline] pub fn transfer(&self) -> crate::VideoTransferFunction { unsafe { from_glib(self.0.transfer) } } #[inline] pub fn primaries(&self) -> crate::VideoColorPrimaries { unsafe { from_glib(self.0.primaries) } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[doc(alias = "gst_video_colorimetry_is_equivalent")] pub fn is_equivalent(&self, bitdepth: u32, other: &Self, other_bitdepth: u32) -> bool { unsafe { from_glib(ffi::gst_video_colorimetry_is_equivalent( &self.0, bitdepth, &other.0, other_bitdepth, )) } } } impl PartialEq for VideoColorimetry { #[doc(alias = "gst_video_colorimetry_is_equal")] fn eq(&self, other: &Self) -> bool { 
unsafe { from_glib(ffi::gst_video_colorimetry_is_equal(&self.0, &other.0)) } } } impl Eq for VideoColorimetry {} impl str::FromStr for crate::VideoColorimetry { type Err = glib::error::BoolError; #[doc(alias = "gst_video_colorimetry_from_string")] fn from_str(s: &str) -> Result { assert_initialized_main_thread!(); unsafe { let mut colorimetry = mem::MaybeUninit::uninit(); let valid: bool = from_glib(ffi::gst_video_colorimetry_from_string( colorimetry.as_mut_ptr(), s.to_glib_none().0, )); if valid { Ok(Self(colorimetry.assume_init())) } else { Err(glib::bool_error!("Invalid colorimetry info")) } } } } impl fmt::Debug for crate::VideoColorimetry { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoColorimetry") .field("range", &self.0.range) .field("matrix", &self.0.matrix) .field("transfer", &self.0.transfer) .field("primaries", &self.0.primaries) .finish() } } impl fmt::Display for crate::VideoColorimetry { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { let s = unsafe { glib::GString::from_glib_full(ffi::gst_video_colorimetry_to_string(&self.0)) }; f.write_str(&s) } } impl crate::VideoChromaSite { #[doc(alias = "gst_video_chroma_site_to_string")] #[doc(alias = "gst_video_chroma_to_string")] pub fn to_str(self) -> glib::GString { assert_initialized_main_thread!(); unsafe { cfg_if::cfg_if! { if #[cfg(feature = "v1_20")] { from_glib_full(ffi::gst_video_chroma_site_to_string(self.into_glib())) } else { from_glib_none(ffi::gst_video_chroma_to_string(self.into_glib())) } } } } } impl str::FromStr for crate::VideoChromaSite { type Err = glib::error::BoolError; #[doc(alias = "gst_video_chroma_from_string")] fn from_str(s: &str) -> Result { skip_assert_initialized!(); cfg_if::cfg_if! { if #[cfg(feature = "v1_20")] { let chroma_site = Self::from_string(s); } else { assert_initialized_main_thread!(); let chroma_site: Self = unsafe { from_glib(ffi::gst_video_chroma_from_string(s.to_glib_none().0)) }; } }; if chroma_site.is_empty() { Err(glib::bool_error!("Invalid chroma site")) } else { Ok(chroma_site) } } } impl From for crate::VideoMultiviewMode { #[inline] fn from(v: crate::VideoMultiviewFramePacking) -> Self { skip_assert_initialized!(); unsafe { from_glib(v.into_glib()) } } } impl TryFrom for crate::VideoMultiviewFramePacking { type Error = glib::BoolError; fn try_from(v: crate::VideoMultiviewMode) -> Result { skip_assert_initialized!(); let v2 = unsafe { from_glib(v.into_glib()) }; if let Self::__Unknown(_) = v2 { Err(glib::bool_error!("Invalid frame packing mode")) } else { Ok(v2) } } } #[doc(alias = "GstVideoInfo")] #[derive(Clone)] #[repr(transparent)] pub struct VideoInfo(pub(crate) ffi::GstVideoInfo); impl fmt::Debug for VideoInfo { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoInfo") .field("format", &self.format()) .field("format-info", &self.format_info()) .field("width", &self.width()) .field("height", &self.height()) .field("interlace_mode", &self.interlace_mode()) .field("flags", &self.flags()) .field("size", &self.size()) .field("views", &self.views()) .field("chroma_site", &self.chroma_site()) .field("colorimetry", &self.colorimetry()) .field("par", &self.par()) .field("fps", &self.fps()) .field("offset", &self.offset()) .field("stride", &self.stride()) .field("multiview_mode", &self.multiview_mode()) .field("multiview_flags", &self.multiview_flags()) .field("field_order", &self.field_order()) .finish() } } #[derive(Debug)] #[must_use = "The builder must be built to be used"] pub struct VideoInfoBuilder<'a> { format: 
crate::VideoFormat, width: u32, height: u32, interlace_mode: Option, flags: Option, size: Option, views: Option, chroma_site: Option, colorimetry: Option<&'a crate::VideoColorimetry>, par: Option, fps: Option, offset: Option<&'a [usize]>, stride: Option<&'a [i32]>, multiview_mode: Option, multiview_flags: Option, field_order: Option, } impl<'a> VideoInfoBuilder<'a> { pub fn build(self) -> Result { unsafe { let mut info = mem::MaybeUninit::uninit(); cfg_if::cfg_if! { if #[cfg(feature = "v1_16")] { let res: bool = { from_glib(if let Some(interlace_mode) = self.interlace_mode { ffi::gst_video_info_set_interlaced_format( info.as_mut_ptr(), self.format.into_glib(), interlace_mode.into_glib(), self.width, self.height, ) } else { ffi::gst_video_info_set_format( info.as_mut_ptr(), self.format.into_glib(), self.width, self.height, ) }) }; } else { let res: bool = { let res = from_glib(ffi::gst_video_info_set_format( info.as_mut_ptr(), self.format.into_glib(), self.width, self.height, )); if res { if let Some(interlace_mode) = self.interlace_mode { let info = info.as_mut_ptr(); (*info).interlace_mode = interlace_mode.into_glib(); } } res }; } } if !res { return Err(glib::bool_error!("Failed to build VideoInfo")); } let mut info = info.assume_init(); if info.finfo.is_null() || info.width <= 0 || info.height <= 0 { return Err(glib::bool_error!("Failed to build VideoInfo")); } if let Some(flags) = self.flags { info.flags = flags.into_glib(); } if let Some(size) = self.size { info.size = size; } if let Some(views) = self.views { info.views = views as i32; } if let Some(chroma_site) = self.chroma_site { info.chroma_site = chroma_site.into_glib(); } if let Some(colorimetry) = self.colorimetry { ptr::write(&mut info.colorimetry, ptr::read(&colorimetry.0)); } if let Some(par) = self.par { info.par_n = par.numer(); info.par_d = par.denom(); } if let Some(fps) = self.fps { info.fps_n = fps.numer(); info.fps_d = fps.denom(); } if let Some(offset) = self.offset { if offset.len() != ((*info.finfo).n_planes as usize) { return Err(glib::bool_error!("Failed to build VideoInfo")); } let n_planes = (*info.finfo).n_planes as usize; info.offset[..n_planes].copy_from_slice(&offset[..n_planes]); } if let Some(stride) = self.stride { if stride.len() != ((*info.finfo).n_planes as usize) { return Err(glib::bool_error!("Failed to build VideoInfo")); } let n_planes = (*info.finfo).n_planes as usize; info.stride[..n_planes].copy_from_slice(&stride[..n_planes]); } if let Some(multiview_mode) = self.multiview_mode { let ptr = &mut info.ABI._gst_reserved as *mut _ as *mut i32; ptr::write(ptr.offset(0), multiview_mode.into_glib()); } if let Some(multiview_flags) = self.multiview_flags { let ptr = &mut info.ABI._gst_reserved as *mut _ as *mut u32; ptr::write(ptr.offset(1), multiview_flags.into_glib()); } if let Some(field_order) = self.field_order { let ptr = &mut info.ABI._gst_reserved as *mut _ as *mut i32; ptr::write(ptr.offset(2), field_order.into_glib()); } Ok(VideoInfo(info)) } } pub fn interlace_mode(self, interlace_mode: crate::VideoInterlaceMode) -> VideoInfoBuilder<'a> { Self { interlace_mode: Some(interlace_mode), ..self } } pub fn interlace_mode_if( self, interlace_mode: crate::VideoInterlaceMode, predicate: bool, ) -> VideoInfoBuilder<'a> { if predicate { self.interlace_mode(interlace_mode) } else { self } } pub fn interlace_mode_if_some( self, interlace_mode: Option, ) -> VideoInfoBuilder<'a> { if let Some(interlace_mode) = interlace_mode { self.interlace_mode(interlace_mode) } else { self } } pub fn flags(self, 
flags: crate::VideoFlags) -> Self { Self { flags: Some(flags), ..self } } pub fn flags_if(self, flags: crate::VideoFlags, predicate: bool) -> Self { if predicate { self.flags(flags) } else { self } } pub fn flags_if_some(self, flags: Option) -> Self { if let Some(flags) = flags { self.flags(flags) } else { self } } pub fn size(self, size: usize) -> Self { Self { size: Some(size), ..self } } pub fn size_if(self, size: usize, predicate: bool) -> Self { if predicate { self.size(size) } else { self } } pub fn size_if_some(self, size: Option) -> Self { if let Some(size) = size { self.size(size) } else { self } } pub fn views(self, views: u32) -> Self { Self { views: Some(views), ..self } } pub fn views_if(self, views: u32, predicate: bool) -> Self { if predicate { self.views(views) } else { self } } pub fn views_if_some(self, views: Option) -> Self { if let Some(views) = views { self.views(views) } else { self } } pub fn chroma_site(self, chroma_site: crate::VideoChromaSite) -> Self { Self { chroma_site: Some(chroma_site), ..self } } pub fn chroma_site_if(self, chroma_site: crate::VideoChromaSite, predicate: bool) -> Self { if predicate { self.chroma_site(chroma_site) } else { self } } pub fn chroma_site_if_some(self, chroma_site: Option) -> Self { if let Some(chroma_site) = chroma_site { self.chroma_site(chroma_site) } else { self } } pub fn colorimetry(self, colorimetry: &'a crate::VideoColorimetry) -> VideoInfoBuilder<'a> { Self { colorimetry: Some(colorimetry), ..self } } pub fn colorimetry_if( self, colorimetry: &'a crate::VideoColorimetry, predicate: bool, ) -> VideoInfoBuilder<'a> { if predicate { self.colorimetry(colorimetry) } else { self } } pub fn colorimetry_if_some( self, colorimetry: Option<&'a crate::VideoColorimetry>, ) -> VideoInfoBuilder<'a> { if let Some(colorimetry) = colorimetry { self.colorimetry(colorimetry) } else { self } } pub fn par>(self, par: T) -> Self { Self { par: Some(par.into()), ..self } } pub fn par_if>(self, par: T, predicate: bool) -> Self { if predicate { self.par(par) } else { self } } pub fn par_if_some>(self, par: Option) -> Self { if let Some(par) = par { self.par(par) } else { self } } pub fn fps>(self, fps: T) -> Self { Self { fps: Some(fps.into()), ..self } } pub fn fps_if>(self, fps: T, predicate: bool) -> Self { if predicate { self.fps(fps) } else { self } } pub fn fps_if_some>(self, fps: Option) -> Self { if let Some(fps) = fps { self.fps(fps) } else { self } } pub fn offset(self, offset: &'a [usize]) -> VideoInfoBuilder<'a> { Self { offset: Some(offset), ..self } } pub fn offset_if(self, offset: &'a [usize], predicate: bool) -> VideoInfoBuilder<'a> { if predicate { self.offset(offset) } else { self } } pub fn offset_if_some(self, offset: Option<&'a [usize]>) -> VideoInfoBuilder<'a> { if let Some(offset) = offset { self.offset(offset) } else { self } } pub fn stride(self, stride: &'a [i32]) -> VideoInfoBuilder<'a> { Self { stride: Some(stride), ..self } } pub fn stride_if(self, stride: &'a [i32], predicate: bool) -> VideoInfoBuilder<'a> { if predicate { self.stride(stride) } else { self } } pub fn stride_if_some(self, stride: Option<&'a [i32]>) -> VideoInfoBuilder<'a> { if let Some(stride) = stride { self.stride(stride) } else { self } } pub fn multiview_mode(self, multiview_mode: crate::VideoMultiviewMode) -> Self { Self { multiview_mode: Some(multiview_mode), ..self } } pub fn multiview_mode_if( self, multiview_mode: crate::VideoMultiviewMode, predicate: bool, ) -> Self { if predicate { self.multiview_mode(multiview_mode) } else { self } } pub 
fn multiview_mode_if_some(self, multiview_mode: Option) -> Self { if let Some(multiview_mode) = multiview_mode { self.multiview_mode(multiview_mode) } else { self } } pub fn multiview_flags(self, multiview_flags: crate::VideoMultiviewFlags) -> Self { Self { multiview_flags: Some(multiview_flags), ..self } } pub fn multiview_flags_if( self, multiview_flags: crate::VideoMultiviewFlags, predicate: bool, ) -> Self { if predicate { self.multiview_flags(multiview_flags) } else { self } } pub fn multiview_flags_if_some( self, multiview_flags: Option, ) -> Self { if let Some(multiview_flags) = multiview_flags { self.multiview_flags(multiview_flags) } else { self } } pub fn field_order(self, field_order: crate::VideoFieldOrder) -> Self { Self { field_order: Some(field_order), ..self } } pub fn field_order_if(self, field_order: crate::VideoFieldOrder, predicate: bool) -> Self { if predicate { self.field_order(field_order) } else { self } } pub fn field_order_if_some(self, field_order: Option) -> Self { if let Some(field_order) = field_order { self.field_order(field_order) } else { self } } } impl VideoInfo { pub fn builder<'a>( format: crate::VideoFormat, width: u32, height: u32, ) -> VideoInfoBuilder<'a> { assert_initialized_main_thread!(); VideoInfoBuilder { format, width, height, interlace_mode: None, flags: None, size: None, views: None, chroma_site: None, colorimetry: None, par: None, fps: None, offset: None, stride: None, multiview_mode: None, multiview_flags: None, field_order: None, } } #[inline] pub fn is_valid(&self) -> bool { !self.0.finfo.is_null() && self.0.width > 0 && self.0.height > 0 && self.0.size > 0 } #[doc(alias = "gst_video_info_from_caps")] pub fn from_caps(caps: &gst::CapsRef) -> Result { skip_assert_initialized!(); unsafe { let mut info = mem::MaybeUninit::uninit(); if from_glib(ffi::gst_video_info_from_caps( info.as_mut_ptr(), caps.as_ptr(), )) { Ok(Self(info.assume_init())) } else { Err(glib::bool_error!("Failed to create VideoInfo from caps")) } } } #[doc(alias = "gst_video_info_to_caps")] pub fn to_caps(&self) -> Result { unsafe { let result = from_glib_full(ffi::gst_video_info_to_caps(mut_override(&self.0))); match result { Some(c) => Ok(c), None => Err(glib::bool_error!("Failed to create caps from VideoInfo")), } } } #[inline] pub fn format(&self) -> crate::VideoFormat { if self.0.finfo.is_null() { return crate::VideoFormat::Unknown; } self.format_info().format() } #[inline] pub fn format_info(&self) -> crate::VideoFormatInfo { unsafe { crate::VideoFormatInfo::from_ptr(self.0.finfo) } } #[inline] pub fn name<'a>(&self) -> &'a str { self.format_info().name() } #[inline] pub fn width(&self) -> u32 { self.0.width as u32 } #[inline] pub fn height(&self) -> u32 { self.0.height as u32 } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[inline] pub fn field_height(&self) -> u32 { if self.0.interlace_mode == ffi::GST_VIDEO_INTERLACE_MODE_ALTERNATE { (self.0.height as u32 + 1) / 2 } else { self.0.height as u32 } } #[inline] pub fn interlace_mode(&self) -> crate::VideoInterlaceMode { unsafe { from_glib(self.0.interlace_mode) } } #[inline] pub fn flags(&self) -> crate::VideoFlags { unsafe { from_glib(self.0.flags) } } #[inline] pub fn size(&self) -> usize { self.0.size } #[inline] pub fn views(&self) -> u32 { self.0.views as u32 } #[inline] pub fn chroma_site(&self) -> crate::VideoChromaSite { unsafe { from_glib(self.0.chroma_site) } } #[inline] pub fn colorimetry(&self) -> VideoColorimetry { unsafe { VideoColorimetry(ptr::read(&self.0.colorimetry)) } } 
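    // Editor's note (illustrative only, not upstream documentation): the `comp_*`
    // accessors below look up per-component information from the underlying
    // `VideoFormatInfo` tables. For a subsampled format such as I420 at 320x240,
    // the U component (index 1) is stored on plane 1 and is half the size in each
    // direction:
    //
    //     let info = crate::VideoInfo::builder(crate::VideoFormat::I420, 320, 240)
    //         .build()
    //         .unwrap();
    //     assert_eq!(info.comp_plane(1), 1);
    //     assert_eq!(info.comp_width(1), 160);
    //     assert_eq!(info.comp_height(1), 120);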
#[inline] pub fn comp_depth(&self, component: u8) -> u32 { self.format_info().depth()[component as usize] } #[inline] pub fn comp_height(&self, component: u8) -> u32 { self.format_info().scale_height(component, self.height()) } #[inline] pub fn comp_width(&self, component: u8) -> u32 { self.format_info().scale_width(component, self.width()) } #[inline] pub fn comp_offset(&self, component: u8) -> usize { self.offset()[self.format_info().plane()[component as usize] as usize] + self.format_info().poffset()[component as usize] as usize } #[inline] pub fn comp_plane(&self, component: u8) -> u32 { self.format_info().plane()[component as usize] } #[inline] pub fn comp_poffset(&self, component: u8) -> u32 { self.format_info().poffset()[component as usize] } #[inline] pub fn comp_pstride(&self, component: u8) -> i32 { self.format_info().pixel_stride()[component as usize] } #[inline] pub fn comp_stride(&self, component: u8) -> i32 { self.stride()[self.format_info().plane()[component as usize] as usize] } #[inline] pub fn par(&self) -> gst::Fraction { gst::Fraction::new(self.0.par_n, self.0.par_d) } #[inline] pub fn fps(&self) -> gst::Fraction { gst::Fraction::new(self.0.fps_n, self.0.fps_d) } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[inline] pub fn field_rate(&self) -> gst::Fraction { if self.interlace_mode() == crate::VideoInterlaceMode::Alternate { 2 * self.fps() } else { self.fps() } } #[inline] pub fn offset(&self) -> &[usize] { &self.0.offset[0..(self.format_info().n_planes() as usize)] } #[inline] pub fn stride(&self) -> &[i32] { &self.0.stride[0..(self.format_info().n_planes() as usize)] } #[inline] pub fn multiview_mode(&self) -> crate::VideoMultiviewMode { unsafe { let ptr = &self.0.ABI._gst_reserved as *const _ as *const i32; from_glib(ptr::read(ptr.offset(0))) } } #[inline] pub fn multiview_flags(&self) -> crate::VideoMultiviewFlags { unsafe { let ptr = &self.0.ABI._gst_reserved as *const _ as *const u32; from_glib(ptr::read(ptr.offset(1))) } } #[inline] pub fn field_order(&self) -> crate::VideoFieldOrder { unsafe { let ptr = &self.0.ABI._gst_reserved as *const _ as *const i32; from_glib(ptr::read(ptr.offset(2))) } } #[inline] pub fn has_alpha(&self) -> bool { self.format_info().has_alpha() } #[inline] pub fn is_gray(&self) -> bool { self.format_info().is_gray() } #[inline] pub fn is_rgb(&self) -> bool { self.format_info().is_rgb() } #[inline] pub fn is_yuv(&self) -> bool { self.format_info().is_yuv() } #[inline] pub fn is_interlaced(&self) -> bool { self.interlace_mode() != crate::VideoInterlaceMode::Progressive } #[inline] pub fn n_planes(&self) -> u32 { self.format_info().n_planes() } #[inline] pub fn n_components(&self) -> u32 { self.format_info().n_components() } #[doc(alias = "gst_video_info_convert")] pub fn convert( &self, src_val: impl gst::format::FormattedValue, ) -> Option { skip_assert_initialized!(); unsafe { let mut dest_val = mem::MaybeUninit::uninit(); if from_glib(ffi::gst_video_info_convert( mut_override(&self.0), src_val.format().into_glib(), src_val.into_raw_value(), U::default_format().into_glib(), dest_val.as_mut_ptr(), )) { Some(U::from_raw(U::default_format(), dest_val.assume_init())) } else { None } } } pub fn convert_generic( &self, src_val: impl gst::format::FormattedValue, dest_fmt: gst::Format, ) -> Option { skip_assert_initialized!(); unsafe { let mut dest_val = mem::MaybeUninit::uninit(); if from_glib(ffi::gst_video_info_convert( mut_override(&self.0), src_val.format().into_glib(), src_val.into_raw_value(), 
dest_fmt.into_glib(), dest_val.as_mut_ptr(), )) { Some(gst::GenericFormattedValue::new( dest_fmt, dest_val.assume_init(), )) } else { None } } } #[doc(alias = "gst_video_info_align")] pub fn align(&mut self, align: &mut crate::VideoAlignment) -> Result<(), glib::BoolError> { unsafe { glib::result_from_gboolean!( ffi::gst_video_info_align(&mut self.0, &mut align.0,), "Failed to align VideoInfo" ) } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "gst_video_info_align_full")] pub fn align_full( &mut self, align: &mut crate::VideoAlignment, ) -> Result<[usize; crate::VIDEO_MAX_PLANES], glib::BoolError> { let mut plane_size = [0; crate::VIDEO_MAX_PLANES]; unsafe { glib::result_from_gboolean!( ffi::gst_video_info_align_full(&mut self.0, &mut align.0, plane_size.as_mut_ptr()), "Failed to align VideoInfo" )?; } Ok(plane_size) } #[doc(alias = "gst_video_color_range_offsets")] #[inline] pub fn range_offsets(&self, range: crate::VideoColorRange) -> ([i32; 4], [i32; 4]) { self.format_info().range_offsets(range) } } impl PartialEq for VideoInfo { #[doc(alias = "gst_video_info_is_equal")] fn eq(&self, other: &Self) -> bool { unsafe { from_glib(ffi::gst_video_info_is_equal(&self.0, &other.0)) } } } impl Eq for VideoInfo {} unsafe impl Send for VideoInfo {} unsafe impl Sync for VideoInfo {} impl glib::types::StaticType for VideoInfo { #[inline] fn static_type() -> glib::types::Type { unsafe { glib::translate::from_glib(ffi::gst_video_info_get_type()) } } } impl glib::value::ValueType for VideoInfo { type Type = Self; } #[doc(hidden)] unsafe impl<'a> glib::value::FromValue<'a> for VideoInfo { type Checker = glib::value::GenericValueTypeOrNoneChecker; unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); from_glib_none( glib::gobject_ffi::g_value_get_boxed(value.to_glib_none().0) as *mut ffi::GstVideoInfo ) } } #[doc(hidden)] impl glib::value::ToValue for VideoInfo { fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_boxed( value.to_glib_none_mut().0, self.to_glib_none().0 as *mut _, ) } value } fn value_type(&self) -> glib::Type { Self::static_type() } } #[doc(hidden)] impl glib::value::ToValueOptional for VideoInfo { fn to_value_optional(s: Option<&Self>) -> glib::Value { skip_assert_initialized!(); let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_boxed( value.to_glib_none_mut().0, s.to_glib_none().0 as *mut _, ) } value } } #[doc(hidden)] impl From for glib::Value { fn from(v: VideoInfo) -> glib::Value { skip_assert_initialized!(); glib::value::ToValue::to_value(&v) } } #[doc(hidden)] impl glib::translate::Uninitialized for VideoInfo { #[inline] unsafe fn uninitialized() -> Self { mem::zeroed() } } #[doc(hidden)] impl glib::translate::GlibPtrDefault for VideoInfo { type GlibType = *mut ffi::GstVideoInfo; } #[doc(hidden)] impl<'a> glib::translate::ToGlibPtr<'a, *const ffi::GstVideoInfo> for VideoInfo { type Storage = PhantomData<&'a Self>; #[inline] fn to_glib_none(&'a self) -> glib::translate::Stash<'a, *const ffi::GstVideoInfo, Self> { glib::translate::Stash(&self.0, PhantomData) } fn to_glib_full(&self) -> *const ffi::GstVideoInfo { unimplemented!() } } #[doc(hidden)] impl glib::translate::FromGlibPtrNone<*const ffi::GstVideoInfo> for VideoInfo { #[inline] unsafe fn from_glib_none(ptr: *const ffi::GstVideoInfo) -> Self { Self(ptr::read(ptr)) } } #[doc(hidden)] impl glib::translate::FromGlibPtrNone<*mut 
ffi::GstVideoInfo> for VideoInfo { #[inline] unsafe fn from_glib_none(ptr: *mut ffi::GstVideoInfo) -> Self { Self(ptr::read(ptr)) } } #[doc(hidden)] impl glib::translate::FromGlibPtrFull<*mut ffi::GstVideoInfo> for VideoInfo { #[inline] unsafe fn from_glib_full(ptr: *mut ffi::GstVideoInfo) -> Self { let info = from_glib_none(ptr); glib::ffi::g_free(ptr as *mut _); info } } impl crate::VideoFieldOrder { #[doc(alias = "gst_video_field_order_to_string")] pub fn to_str<'a>(self) -> &'a str { use std::ffi::CStr; if self == Self::Unknown { return "UNKNOWN"; } unsafe { CStr::from_ptr( ffi::gst_video_field_order_to_string(self.into_glib()) .as_ref() .expect("gst_video_field_order_to_string returned NULL"), ) .to_str() .expect("gst_video_field_order_to_string returned an invalid string") } } } impl str::FromStr for crate::VideoFieldOrder { type Err = glib::error::BoolError; fn from_str(s: &str) -> Result { skip_assert_initialized!(); let fmt = Self::from_string(s); if fmt == Self::Unknown { Err(glib::bool_error!( "Failed to parse video field order from string" )) } else { Ok(fmt) } } } impl str::FromStr for crate::VideoInterlaceMode { type Err = glib::error::BoolError; fn from_str(s: &str) -> Result { skip_assert_initialized!(); let fmt = Self::from_string(s); Ok(fmt) } } #[cfg(test)] mod tests { use super::*; #[test] fn test_new() { gst::init().unwrap(); let info = VideoInfo::builder(crate::VideoFormat::I420, 320, 240) .build() .unwrap(); assert_eq!(info.format(), crate::VideoFormat::I420); assert_eq!(info.width(), 320); assert_eq!(info.height(), 240); assert_eq!(info.size(), 320 * 240 + 2 * 160 * 120); assert_eq!(info.multiview_mode(), crate::VideoMultiviewMode::None); assert_eq!(&info.offset(), &[0, 320 * 240, 320 * 240 + 160 * 120]); assert_eq!(&info.stride(), &[320, 160, 160]); let offsets = [0, 640 * 240 + 16, 640 * 240 + 16 + 320 * 120 + 16]; let strides = [640, 320, 320]; let info = VideoInfo::builder(crate::VideoFormat::I420, 320, 240) .offset(&offsets) .stride(&strides) .size(640 * 240 + 16 + 320 * 120 + 16 + 320 * 120 + 16) .multiview_mode(crate::VideoMultiviewMode::SideBySide) .build() .unwrap(); assert_eq!(info.format(), crate::VideoFormat::I420); assert_eq!(info.width(), 320); assert_eq!(info.height(), 240); assert_eq!( info.size(), 640 * 240 + 16 + 320 * 120 + 16 + 320 * 120 + 16 ); assert_eq!(info.multiview_mode(), crate::VideoMultiviewMode::SideBySide); assert_eq!( &info.offset(), &[0, 640 * 240 + 16, 640 * 240 + 16 + 320 * 120 + 16] ); assert_eq!(&info.stride(), &[640, 320, 320]); } #[test] fn test_from_to_caps() { gst::init().unwrap(); let caps = crate::VideoCapsBuilder::new() .format(crate::VideoFormat::I420) .width(320) .height(240) .framerate((30, 1).into()) .pixel_aspect_ratio((1, 1).into()) .field("interlace-mode", "progressive") .field("chroma-site", "mpeg2") .field("colorimetry", "bt709") .build(); let info = VideoInfo::from_caps(&caps).unwrap(); assert_eq!(info.format(), crate::VideoFormat::I420); assert_eq!(info.width(), 320); assert_eq!(info.height(), 240); assert_eq!(info.fps(), gst::Fraction::new(30, 1)); assert_eq!( info.interlace_mode(), crate::VideoInterlaceMode::Progressive ); assert_eq!(info.chroma_site(), crate::VideoChromaSite::MPEG2); assert_eq!(info.colorimetry(), "bt709".parse().unwrap()); let caps2 = info.to_caps().unwrap(); assert_eq!(caps, caps2); let info2 = VideoInfo::from_caps(&caps2).unwrap(); assert!(info == info2); } #[test] fn test_video_align() { gst::init().unwrap(); let mut info = crate::VideoInfo::builder(crate::VideoFormat::Nv16, 1920, 1080) 
.build() .expect("Failed to create VideoInfo"); assert_eq!(info.stride(), [1920, 1920]); assert_eq!(info.offset(), [0, 2_073_600]); let mut align = crate::VideoAlignment::new(0, 0, 0, 8, &[0; VIDEO_MAX_PLANES]); info.align(&mut align).unwrap(); assert_eq!(info.stride(), [1928, 1928]); assert_eq!(info.offset(), [0, 2_082_240]); #[cfg(feature = "v1_18")] { let mut info = crate::VideoInfo::builder(crate::VideoFormat::Nv16, 1920, 1080) .build() .expect("Failed to create VideoInfo"); let mut align = crate::VideoAlignment::new(0, 0, 0, 8, &[0; VIDEO_MAX_PLANES]); let plane_size = info.align_full(&mut align).unwrap(); assert_eq!(plane_size, [2082240, 2082240, 0, 0]); } } #[test] fn test_display() { gst::init().unwrap(); let _ = format!("{}", "sRGB".parse::().unwrap()); let _ = format!("{}", crate::VideoFieldOrder::TopFieldFirst); let _ = format!("{}", crate::VideoInterlaceMode::Progressive); } } gstreamer-video-0.23.5/src/video_info_dma_drm.rs000064400000000000000000000244521046102023000177600ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. use std::{fmt, marker::PhantomData, mem, ops, ptr, str}; use glib::translate::*; use gst::prelude::*; use crate::{ffi, VideoFormat, VideoInfo}; #[doc(alias = "gst_video_dma_drm_fourcc_from_format")] pub fn dma_drm_fourcc_from_format(v: VideoFormat) -> Result { skip_assert_initialized!(); unsafe { let res = ffi::gst_video_dma_drm_fourcc_from_format(v.into_glib()); if res == 0 { Err(glib::bool_error!("Unsupported video format")) } else { Ok(res) } } } #[doc(alias = "gst_video_dma_drm_fourcc_to_format")] pub fn dma_drm_fourcc_to_format(v: u32) -> Result { skip_assert_initialized!(); unsafe { let res = ffi::gst_video_dma_drm_fourcc_to_format(v); if res == ffi::GST_VIDEO_FORMAT_UNKNOWN { Err(glib::bool_error!("Unsupported fourcc")) } else { Ok(from_glib(res)) } } } #[doc(alias = "gst_video_dma_drm_fourcc_to_string")] pub fn dma_drm_fourcc_to_string(fourcc: u32, modifier: u64) -> glib::GString { skip_assert_initialized!(); unsafe { glib::GString::from_glib_full(ffi::gst_video_dma_drm_fourcc_to_string(fourcc, modifier)) } } #[doc(alias = "gst_video_dma_drm_fourcc_from_string")] pub fn dma_drm_fourcc_from_str(v: &str) -> Result<(u32, u64), glib::BoolError> { skip_assert_initialized!(); unsafe { let mut modifier = mem::MaybeUninit::uninit(); let res = ffi::gst_video_dma_drm_fourcc_from_string(v.to_glib_none().0, modifier.as_mut_ptr()); if res == 0 { Err(glib::bool_error!("Can't parse fourcc string")) } else { Ok((res, modifier.assume_init())) } } } #[cfg(feature = "v1_26")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "gst_video_dma_drm_format_from_gst_format")] pub fn dma_drm_format_from_gst_format(v: VideoFormat) -> Result<(u32, u64), glib::BoolError> { skip_assert_initialized!(); unsafe { let mut modifier = mem::MaybeUninit::uninit(); let res = ffi::gst_video_dma_drm_format_from_gst_format(v.into_glib(), modifier.as_mut_ptr()); if res == 0 { Err(glib::bool_error!("Unsupported video format")) } else { Ok((res, modifier.assume_init())) } } } #[cfg(feature = "v1_26")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[doc(alias = "gst_video_dma_drm_format_to_gst_format")] pub fn dma_drm_format_to_gst_format( fourcc: u32, modifier: u64, ) -> Result { skip_assert_initialized!(); unsafe { let res = ffi::gst_video_dma_drm_format_to_gst_format(fourcc, modifier); if res == ffi::GST_VIDEO_FORMAT_UNKNOWN { Err(glib::bool_error!("Unsupported fourcc format / modifier")) } else { Ok(from_glib(res)) } } } 
#[doc(alias = "GstVideoInfoDmaDrm")] #[derive(Clone)] #[repr(transparent)] pub struct VideoInfoDmaDrm(pub(crate) ffi::GstVideoInfoDmaDrm); impl ops::Deref for VideoInfoDmaDrm { type Target = VideoInfo; fn deref(&self) -> &Self::Target { unsafe { &*(&self.0.vinfo as *const ffi::GstVideoInfo as *const VideoInfo) } } } impl fmt::Debug for VideoInfoDmaDrm { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoInfoDmaDrm") .field("info", &**self) .field("drm_fourcc", &self.0.drm_fourcc) .field("drm_modifier", &self.0.drm_modifier) .finish() } } impl VideoInfoDmaDrm { pub fn new(info: VideoInfo, fourcc: u32, modifier: u64) -> VideoInfoDmaDrm { assert_initialized_main_thread!(); VideoInfoDmaDrm(ffi::GstVideoInfoDmaDrm { vinfo: info.0, drm_fourcc: fourcc, drm_modifier: modifier, _gst_reserved: [0; 20], }) } #[inline] pub fn is_valid(&self) -> bool { !self.0.vinfo.finfo.is_null() && self.0.vinfo.width > 0 && self.0.vinfo.height > 0 && self.0.vinfo.size > 0 } #[doc(alias = "gst_video_info_dma_drm_from_caps")] pub fn from_caps(caps: &gst::CapsRef) -> Result { skip_assert_initialized!(); unsafe { let mut info = mem::MaybeUninit::uninit(); if from_glib(ffi::gst_video_info_dma_drm_from_caps( info.as_mut_ptr(), caps.as_ptr(), )) { Ok(Self(info.assume_init())) } else { Err(glib::bool_error!( "Failed to create VideoInfoDmaDrm from caps" )) } } } #[doc(alias = "gst_video_info_dma_drm_to_caps")] pub fn to_caps(&self) -> Result { unsafe { let result = from_glib_full(ffi::gst_video_info_dma_drm_to_caps(mut_override(&self.0))); match result { Some(c) => Ok(c), None => Err(glib::bool_error!( "Failed to create caps from VideoInfoDmaDrm" )), } } } #[doc(alias = "gst_video_info_dma_drm_from_video_info")] pub fn from_video_info( video_info: &crate::VideoInfo, modifier: u64, ) -> Result { skip_assert_initialized!(); unsafe { let mut info = mem::MaybeUninit::uninit(); if from_glib(ffi::gst_video_info_dma_drm_from_video_info( info.as_mut_ptr(), video_info.to_glib_none().0, modifier, )) { Ok(Self(info.assume_init())) } else { Err(glib::bool_error!( "Failed to create VideoInfoDmaDrm from VideoInfo" )) } } } #[doc(alias = "gst_video_info_dma_drm_to_video_info")] pub fn to_video_info(&self) -> Result { unsafe { let mut video_info = mem::MaybeUninit::uninit(); if from_glib(ffi::gst_video_info_dma_drm_to_video_info( mut_override(&self.0), video_info.as_mut_ptr(), )) { Ok(crate::VideoInfo(video_info.assume_init())) } else { Err(glib::bool_error!( "Failed to create VideoInfo from VideoInfoDmaDrm" )) } } } #[inline] pub fn fourcc(&self) -> u32 { self.0.drm_fourcc } #[inline] pub fn modifier(&self) -> u64 { self.0.drm_modifier } } impl PartialEq for VideoInfoDmaDrm { #[doc(alias = "gst_video_info_is_equal")] fn eq(&self, other: &Self) -> bool { unsafe { from_glib(ffi::gst_video_info_is_equal(&self.0.vinfo, &other.0.vinfo)) && self.0.drm_fourcc == other.0.drm_fourcc && self.0.drm_modifier == other.0.drm_modifier } } } impl Eq for VideoInfoDmaDrm {} unsafe impl Send for VideoInfoDmaDrm {} unsafe impl Sync for VideoInfoDmaDrm {} impl glib::types::StaticType for VideoInfoDmaDrm { #[inline] fn static_type() -> glib::types::Type { unsafe { glib::translate::from_glib(ffi::gst_video_info_dma_drm_get_type()) } } } impl glib::value::ValueType for VideoInfoDmaDrm { type Type = Self; } #[doc(hidden)] unsafe impl<'a> glib::value::FromValue<'a> for VideoInfoDmaDrm { type Checker = glib::value::GenericValueTypeOrNoneChecker; unsafe fn from_value(value: &'a glib::Value) -> Self { skip_assert_initialized!(); 
from_glib_none(glib::gobject_ffi::g_value_get_boxed(value.to_glib_none().0) as *mut ffi::GstVideoInfoDmaDrm) } } #[doc(hidden)] impl glib::value::ToValue for VideoInfoDmaDrm { fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_boxed( value.to_glib_none_mut().0, self.to_glib_none().0 as *mut _, ) } value } fn value_type(&self) -> glib::Type { Self::static_type() } } #[doc(hidden)] impl glib::value::ToValueOptional for VideoInfoDmaDrm { fn to_value_optional(s: Option<&Self>) -> glib::Value { skip_assert_initialized!(); let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_boxed( value.to_glib_none_mut().0, s.to_glib_none().0 as *mut _, ) } value } } #[doc(hidden)] impl From for glib::Value { fn from(v: VideoInfoDmaDrm) -> glib::Value { skip_assert_initialized!(); glib::value::ToValue::to_value(&v) } } #[doc(hidden)] impl glib::translate::Uninitialized for VideoInfoDmaDrm { #[inline] unsafe fn uninitialized() -> Self { mem::zeroed() } } #[doc(hidden)] impl glib::translate::GlibPtrDefault for VideoInfoDmaDrm { type GlibType = *mut ffi::GstVideoInfoDmaDrm; } #[doc(hidden)] impl<'a> glib::translate::ToGlibPtr<'a, *const ffi::GstVideoInfoDmaDrm> for VideoInfoDmaDrm { type Storage = PhantomData<&'a Self>; #[inline] fn to_glib_none(&'a self) -> glib::translate::Stash<'a, *const ffi::GstVideoInfoDmaDrm, Self> { glib::translate::Stash(&self.0, PhantomData) } fn to_glib_full(&self) -> *const ffi::GstVideoInfoDmaDrm { unimplemented!() } } #[doc(hidden)] impl glib::translate::FromGlibPtrNone<*const ffi::GstVideoInfoDmaDrm> for VideoInfoDmaDrm { #[inline] unsafe fn from_glib_none(ptr: *const ffi::GstVideoInfoDmaDrm) -> Self { Self(ptr::read(ptr)) } } #[doc(hidden)] impl glib::translate::FromGlibPtrNone<*mut ffi::GstVideoInfoDmaDrm> for VideoInfoDmaDrm { #[inline] unsafe fn from_glib_none(ptr: *mut ffi::GstVideoInfoDmaDrm) -> Self { Self(ptr::read(ptr)) } } #[doc(hidden)] impl glib::translate::FromGlibPtrFull<*mut ffi::GstVideoInfoDmaDrm> for VideoInfoDmaDrm { #[inline] unsafe fn from_glib_full(ptr: *mut ffi::GstVideoInfoDmaDrm) -> Self { let info = from_glib_none(ptr); glib::ffi::g_free(ptr as *mut _); info } } gstreamer-video-0.23.5/src/video_message.rs000064400000000000000000000170041046102023000167610ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. use std::ptr; use glib::{ translate::{from_glib, from_glib_full, IntoGlib, ToGlibPtr}, value::ToSendValue, }; use gst::{ffi as gst_ffi, prelude::*, Message, Object, Seqnum}; use crate::{ffi, NavigationMessageType}; macro_rules! 
message_builder_generic_impl { ($new_fn:expr) => { #[allow(clippy::needless_update)] pub fn src + Cast + Clone>(self, src: &O) -> Self { Self { builder: self.builder.src(src), ..self } } #[allow(clippy::needless_update)] pub fn src_if + Cast + Clone>(self, src: &O, predicate: bool) -> Self { if predicate { self.src(src) } else { self } } #[allow(clippy::needless_update)] pub fn src_if_some + Cast + Clone>(self, src: Option<&O>) -> Self { if let Some(src) = src { self.src(src) } else { self } } #[doc(alias = "gst_message_set_seqnum")] #[allow(clippy::needless_update)] pub fn seqnum(self, seqnum: Seqnum) -> Self { Self { builder: self.builder.seqnum(seqnum), ..self } } #[doc(alias = "gst_message_set_seqnum")] #[allow(clippy::needless_update)] pub fn seqnum_if(self, seqnum: Seqnum, predicate: bool) -> Self { if predicate { self.seqnum(seqnum) } else { self } } #[doc(alias = "gst_message_set_seqnum")] #[allow(clippy::needless_update)] pub fn seqnum_if_some(self, seqnum: Option) -> Self { if let Some(seqnum) = seqnum { self.seqnum(seqnum) } else { self } } pub fn other_field(self, name: &'a str, value: impl ToSendValue) -> Self { Self { builder: self.builder.other_field(name, value), ..self } } gst::impl_builder_gvalue_extra_setters!(other_field); #[deprecated = "use builder.other_field() instead"] #[allow(clippy::needless_update)] pub fn other_fields( self, other_fields: &[(&'a str, &'a (dyn ToSendValue + Sync))], ) -> Self { Self { builder: self.builder.other_fields(other_fields), ..self } } #[must_use = "Building the message without using it has no effect"] #[allow(clippy::redundant_closure_call)] pub fn build(mut self) -> Message { skip_assert_initialized!(); unsafe { let src = self.builder.src.to_glib_none().0; let msg = $new_fn(&mut self, src); if let Some(seqnum) = self.builder.seqnum { gst_ffi::gst_message_set_seqnum(msg, seqnum.into_glib()); } if !self.builder.other_fields.is_empty() { let structure = gst_ffi::gst_message_writable_structure(msg); if !structure.is_null() { let structure = gst::StructureRef::from_glib_borrow_mut(structure as *mut _); for (k, v) in self.builder.other_fields { structure.set_value(k, v); } } } from_glib_full(msg) } } }; } struct MessageBuilder<'a> { src: Option, seqnum: Option, other_fields: Vec<(&'a str, glib::SendValue)>, } impl<'a> MessageBuilder<'a> { pub(crate) fn new() -> Self { skip_assert_initialized!(); Self { src: None, seqnum: None, other_fields: Vec::new(), } } pub fn src + Cast + Clone>(self, src: &O) -> Self { Self { src: Some(src.clone().upcast::()), ..self } } pub fn seqnum(self, seqnum: Seqnum) -> Self { Self { seqnum: Some(seqnum), ..self } } fn other_field(self, name: &'a str, value: impl ToSendValue) -> Self { let mut other_fields = self.other_fields; other_fields.push((name, value.to_send_value())); Self { other_fields, ..self } } fn other_fields(self, other_fields: &[(&'a str, &'a (dyn ToSendValue + Sync))]) -> Self { let mut s = self; for (name, value) in other_fields { s = s.other_field(name, value.to_send_value()); } s } } #[must_use = "The builder must be built to be used"] pub struct NavigationEventMessageBuilder<'a> { builder: MessageBuilder<'a>, event: gst::Event, } impl<'a> NavigationEventMessageBuilder<'a> { fn new(event: gst::Event) -> Self { skip_assert_initialized!(); Self { builder: MessageBuilder::new(), event, } } message_builder_generic_impl!(|s: &Self, src| ffi::gst_navigation_message_new_event( src, s.event.to_glib_none().0 )); } #[derive(Clone, Debug)] pub struct NavigationEventMessage { pub event: gst::Event, } impl 
PartialEq for NavigationEventMessage { fn eq(&self, other: &Self) -> bool { self.event.as_ptr() == other.event.as_ptr() } } impl Eq for NavigationEventMessage {} impl NavigationEventMessage { #[doc(alias = "gst_navigation_message_new_event")] #[allow(clippy::new_ret_no_self)] pub fn new(event: gst::Event) -> gst::Message { skip_assert_initialized!(); NavigationEventMessageBuilder::new(event).build() } pub fn builder<'a>(event: gst::Event) -> NavigationEventMessageBuilder<'a> { skip_assert_initialized!(); NavigationEventMessageBuilder::new(event) } #[doc(alias = "gst_navigation_message_parse_event")] pub fn parse(msg: &gst::MessageRef) -> Result { skip_assert_initialized!(); unsafe { let mut event = ptr::null_mut(); let ret = from_glib(ffi::gst_navigation_message_parse_event( msg.as_mut_ptr(), &mut event, )); if ret { Ok(Self { event: from_glib_full(event), }) } else { Err(glib::bool_error!("Invalid navigation event msg")) } } } } #[derive(Clone, PartialEq, Eq, Debug)] pub enum NavigationMessage { Event(NavigationEventMessage), } impl NavigationMessage { #[doc(alias = "gst_navigation_message_get_type")] pub fn type_(msg: &gst::MessageRef) -> NavigationMessageType { skip_assert_initialized!(); unsafe { from_glib(ffi::gst_navigation_message_get_type(msg.as_mut_ptr())) } } #[doc(alias = "gst_navigation_message_parse_event")] pub fn parse(msg: &gst::MessageRef) -> Result { skip_assert_initialized!(); let event_type: NavigationMessageType = Self::type_(msg); match event_type { NavigationMessageType::Event => NavigationEventMessage::parse(msg).map(Self::Event), _ => Err(glib::bool_error!( "Unsupported navigation msg {:?}", event_type )), } } } gstreamer-video-0.23.5/src/video_meta.rs000064400000000000000000001247041046102023000162710ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. 
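// Editor's illustrative sketch (not part of the upstream sources, placed here at the top
// of the file so it does not interrupt the meta implementations below): attaching the
// `VideoMeta` defined in this file to a buffer. GRAY8 at 320x240 is used because its
// single plane makes the expected sizes easy to follow.
#[cfg(test)]
mod editor_video_meta_example {
    use super::*;

    #[test]
    fn add_video_meta_to_gray8_buffer() {
        gst::init().unwrap();

        // One plane, 320 bytes per row, 240 rows.
        let mut buffer = gst::Buffer::with_size(320 * 240).unwrap();
        let meta = VideoMeta::add(
            buffer.get_mut().unwrap(),
            crate::VideoFrameFlags::empty(),
            crate::VideoFormat::Gray8,
            320,
            240,
        )
        .unwrap();

        assert_eq!(meta.format(), crate::VideoFormat::Gray8);
        assert_eq!(meta.width(), 320);
        assert_eq!(meta.height(), 240);
        assert_eq!(meta.n_planes(), 1);
    }
}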
use std::{fmt, ptr}; use crate::ffi; use glib::translate::*; use gst::prelude::*; #[repr(transparent)] #[doc(alias = "GstVideoMeta")] pub struct VideoMeta(ffi::GstVideoMeta); unsafe impl Send for VideoMeta {} unsafe impl Sync for VideoMeta {} impl VideoMeta { #[doc(alias = "gst_buffer_add_video_meta")] pub fn add( buffer: &mut gst::BufferRef, video_frame_flags: crate::VideoFrameFlags, format: crate::VideoFormat, width: u32, height: u32, ) -> Result, glib::BoolError> { skip_assert_initialized!(); if format == crate::VideoFormat::Unknown || format == crate::VideoFormat::Encoded { return Err(glib::bool_error!("Unsupported video format {}", format)); } let info = crate::VideoInfo::builder(format, width, height).build()?; if !info.is_valid() { return Err(glib::bool_error!("Invalid video info")); } if buffer.size() < info.size() { return Err(glib::bool_error!( "Buffer smaller than required frame size ({} < {})", buffer.size(), info.size() )); } unsafe { let meta = ffi::gst_buffer_add_video_meta( buffer.as_mut_ptr(), video_frame_flags.into_glib(), format.into_glib(), width, height, ); if meta.is_null() { return Err(glib::bool_error!("Failed to add video meta")); } Ok(Self::from_mut_ptr(buffer, meta)) } } pub fn add_full<'a>( buffer: &'a mut gst::BufferRef, video_frame_flags: crate::VideoFrameFlags, format: crate::VideoFormat, width: u32, height: u32, offset: &[usize], stride: &[i32], ) -> Result, glib::BoolError> { skip_assert_initialized!(); if format == crate::VideoFormat::Unknown || format == crate::VideoFormat::Encoded { return Err(glib::bool_error!("Unsupported video format {}", format)); } let n_planes = offset.len() as u32; let info_builder = crate::VideoInfo::builder(format, width, height) .offset(offset) .stride(stride); #[cfg(feature = "v1_16")] let info_builder = info_builder.interlace_mode_if( crate::VideoInterlaceMode::Alternate, video_frame_flags.contains(crate::VideoFrameFlags::ONEFIELD), ); let info = info_builder.build()?; if !info.is_valid() { return Err(glib::bool_error!("Invalid video info")); } if buffer.size() < info.size() { return Err(glib::bool_error!( "Buffer smaller than required frame size ({} < {})", buffer.size(), info.size() )); } unsafe { let meta = ffi::gst_buffer_add_video_meta_full( buffer.as_mut_ptr(), video_frame_flags.into_glib(), format.into_glib(), width, height, n_planes, offset.as_ptr() as *mut _, stride.as_ptr() as *mut _, ); if meta.is_null() { return Err(glib::bool_error!("Failed to add video meta")); } Ok(Self::from_mut_ptr(buffer, meta)) } } #[doc(alias = "get_flags")] #[inline] pub fn video_frame_flags(&self) -> crate::VideoFrameFlags { unsafe { from_glib(self.0.flags) } } #[doc(alias = "get_format")] #[inline] pub fn format(&self) -> crate::VideoFormat { unsafe { from_glib(self.0.format) } } #[doc(alias = "get_id")] #[inline] pub fn id(&self) -> i32 { self.0.id } #[doc(alias = "get_width")] #[inline] pub fn width(&self) -> u32 { self.0.width } #[doc(alias = "get_height")] #[inline] pub fn height(&self) -> u32 { self.0.height } #[doc(alias = "get_n_planes")] #[inline] pub fn n_planes(&self) -> u32 { self.0.n_planes } #[doc(alias = "get_offset")] #[inline] pub fn offset(&self) -> &[usize] { &self.0.offset[0..(self.0.n_planes as usize)] } #[doc(alias = "get_stride")] #[inline] pub fn stride(&self) -> &[i32] { &self.0.stride[0..(self.0.n_planes as usize)] } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "get_alignment")] #[inline] pub fn alignment(&self) -> crate::VideoAlignment { crate::VideoAlignment::new( 
self.0.alignment.padding_top, self.0.alignment.padding_bottom, self.0.alignment.padding_left, self.0.alignment.padding_right, &self.0.alignment.stride_align, ) } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "get_plane_size")] #[doc(alias = "gst_video_meta_get_plane_size")] pub fn plane_size(&self) -> Result<[usize; crate::VIDEO_MAX_PLANES], glib::BoolError> { let mut plane_size = [0; crate::VIDEO_MAX_PLANES]; unsafe { glib::result_from_gboolean!( ffi::gst_video_meta_get_plane_size(mut_override(&self.0), &mut plane_size,), "Failed to get plane size" )?; } Ok(plane_size) } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "get_plane_height")] #[doc(alias = "gst_video_meta_get_plane_height")] pub fn plane_height(&self) -> Result<[u32; crate::VIDEO_MAX_PLANES], glib::BoolError> { let mut plane_height = [0; crate::VIDEO_MAX_PLANES]; unsafe { glib::result_from_gboolean!( ffi::gst_video_meta_get_plane_height(mut_override(&self.0), &mut plane_height,), "Failed to get plane height" )?; } Ok(plane_height) } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[doc(alias = "gst_video_meta_set_alignment")] pub fn set_alignment( &mut self, alignment: &crate::VideoAlignment, ) -> Result<(), glib::BoolError> { unsafe { glib::result_from_gboolean!( ffi::gst_video_meta_set_alignment(&mut self.0, alignment.0), "Failed to set alignment on VideoMeta" ) } } } unsafe impl MetaAPI for VideoMeta { type GstType = ffi::GstVideoMeta; #[doc(alias = "gst_video_meta_api_get_type")] #[inline] fn meta_api() -> glib::Type { unsafe { from_glib(ffi::gst_video_meta_api_get_type()) } } } impl fmt::Debug for VideoMeta { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoMeta") .field("id", &self.id()) .field("video_frame_flags", &self.video_frame_flags()) .field("format", &self.format()) .field("width", &self.width()) .field("height", &self.height()) .field("n_planes", &self.n_planes()) .field("offset", &self.offset()) .field("stride", &self.stride()) .finish() } } #[repr(transparent)] #[doc(alias = "GstVideoCropMeta")] pub struct VideoCropMeta(ffi::GstVideoCropMeta); unsafe impl Send for VideoCropMeta {} unsafe impl Sync for VideoCropMeta {} impl VideoCropMeta { #[doc(alias = "gst_buffer_add_meta")] pub fn add( buffer: &mut gst::BufferRef, rect: (u32, u32, u32, u32), ) -> gst::MetaRefMut { skip_assert_initialized!(); unsafe { let meta = gst::ffi::gst_buffer_add_meta( buffer.as_mut_ptr(), ffi::gst_video_crop_meta_get_info(), ptr::null_mut(), ) as *mut ffi::GstVideoCropMeta; { let meta = &mut *meta; meta.x = rect.0; meta.y = rect.1; meta.width = rect.2; meta.height = rect.3; } Self::from_mut_ptr(buffer, meta) } } #[doc(alias = "get_rect")] #[inline] pub fn rect(&self) -> (u32, u32, u32, u32) { (self.0.x, self.0.y, self.0.width, self.0.height) } #[inline] pub fn set_rect(&mut self, rect: (u32, u32, u32, u32)) { self.0.x = rect.0; self.0.y = rect.1; self.0.width = rect.2; self.0.height = rect.3; } } unsafe impl MetaAPI for VideoCropMeta { type GstType = ffi::GstVideoCropMeta; #[doc(alias = "gst_video_crop_meta_api_get_type")] #[inline] fn meta_api() -> glib::Type { unsafe { from_glib(ffi::gst_video_crop_meta_api_get_type()) } } } impl fmt::Debug for VideoCropMeta { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoCropMeta") .field("rect", &self.rect()) .finish() } } #[repr(transparent)] #[doc(alias = "GstVideoRegionOfInterestMeta")] pub struct 
VideoRegionOfInterestMeta(ffi::GstVideoRegionOfInterestMeta); unsafe impl Send for VideoRegionOfInterestMeta {} unsafe impl Sync for VideoRegionOfInterestMeta {} impl VideoRegionOfInterestMeta { #[doc(alias = "gst_buffer_add_video_region_of_interest_meta")] pub fn add<'a>( buffer: &'a mut gst::BufferRef, roi_type: &str, rect: (u32, u32, u32, u32), ) -> gst::MetaRefMut<'a, Self, gst::meta::Standalone> { skip_assert_initialized!(); unsafe { let meta = ffi::gst_buffer_add_video_region_of_interest_meta( buffer.as_mut_ptr(), roi_type.to_glib_none().0, rect.0, rect.1, rect.2, rect.3, ); Self::from_mut_ptr(buffer, meta) } } #[doc(alias = "get_rect")] #[inline] pub fn rect(&self) -> (u32, u32, u32, u32) { (self.0.x, self.0.y, self.0.w, self.0.h) } #[doc(alias = "get_id")] #[inline] pub fn id(&self) -> i32 { self.0.id } #[doc(alias = "get_parent_id")] #[inline] pub fn parent_id(&self) -> i32 { self.0.parent_id } #[doc(alias = "get_roi_type")] #[inline] pub fn roi_type<'a>(&self) -> &'a str { unsafe { glib::Quark::from_glib(self.0.roi_type).as_str() } } #[doc(alias = "get_params")] pub fn params(&self) -> ParamsIter { ParamsIter { _meta: self, list: ptr::NonNull::new(self.0.params), } } #[doc(alias = "get_param")] #[inline] pub fn param<'b>(&'b self, name: &str) -> Option<&'b gst::StructureRef> { self.params().find(|s| s.name() == name) } #[inline] pub fn set_rect(&mut self, rect: (u32, u32, u32, u32)) { self.0.x = rect.0; self.0.y = rect.1; self.0.w = rect.2; self.0.h = rect.3; } #[inline] pub fn set_id(&mut self, id: i32) { self.0.id = id } #[inline] pub fn set_parent_id(&mut self, id: i32) { self.0.parent_id = id } #[doc(alias = "gst_video_region_of_interest_meta_add_param")] pub fn add_param(&mut self, s: gst::Structure) { unsafe { ffi::gst_video_region_of_interest_meta_add_param(&mut self.0, s.into_glib_ptr()); } } } pub struct ParamsIter<'a> { _meta: &'a VideoRegionOfInterestMeta, list: Option>, } impl<'a> Iterator for ParamsIter<'a> { type Item = &'a gst::StructureRef; fn next(&mut self) -> Option<&'a gst::StructureRef> { match self.list { None => None, Some(list) => unsafe { self.list = ptr::NonNull::new(list.as_ref().next); let data = list.as_ref().data; let s = gst::StructureRef::from_glib_borrow(data as *const gst::ffi::GstStructure); Some(s) }, } } } impl std::iter::FusedIterator for ParamsIter<'_> {} unsafe impl MetaAPI for VideoRegionOfInterestMeta { type GstType = ffi::GstVideoRegionOfInterestMeta; #[doc(alias = "gst_video_region_of_interest_meta_api_get_type")] #[inline] fn meta_api() -> glib::Type { unsafe { from_glib(ffi::gst_video_region_of_interest_meta_api_get_type()) } } } impl fmt::Debug for VideoRegionOfInterestMeta { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoRegionOfInterestMeta") .field("roi_type", &self.roi_type()) .field("rect", &self.rect()) .field("id", &self.id()) .field("parent_id", &self.parent_id()) .field("params", &self.params().collect::>()) .finish() } } #[repr(transparent)] #[doc(alias = "GstVideoAffineTransformationMeta")] pub struct VideoAffineTransformationMeta(ffi::GstVideoAffineTransformationMeta); unsafe impl Send for VideoAffineTransformationMeta {} unsafe impl Sync for VideoAffineTransformationMeta {} impl VideoAffineTransformationMeta { #[doc(alias = "gst_buffer_add_meta")] pub fn add<'a>( buffer: &'a mut gst::BufferRef, matrix: Option<&[[f32; 4]; 4]>, ) -> gst::MetaRefMut<'a, Self, gst::meta::Standalone> { skip_assert_initialized!(); unsafe { let meta = gst::ffi::gst_buffer_add_meta( buffer.as_mut_ptr(), 
ffi::gst_video_affine_transformation_meta_get_info(), ptr::null_mut(), ) as *mut ffi::GstVideoAffineTransformationMeta; if let Some(matrix) = matrix { let meta = &mut *meta; for (i, o) in Iterator::zip(matrix.iter().flatten(), meta.matrix.iter_mut()) { *o = *i; } } Self::from_mut_ptr(buffer, meta) } } #[doc(alias = "get_matrix")] #[inline] pub fn matrix(&self) -> &[[f32; 4]; 4] { unsafe { &*(&self.0.matrix as *const [f32; 16] as *const [[f32; 4]; 4]) } } #[inline] pub fn set_matrix(&mut self, matrix: &[[f32; 4]; 4]) { for (i, o) in Iterator::zip(matrix.iter().flatten(), self.0.matrix.iter_mut()) { *o = *i; } } #[doc(alias = "gst_video_affine_transformation_meta_apply_matrix")] pub fn apply_matrix(&mut self, matrix: &[[f32; 4]; 4]) { unsafe { ffi::gst_video_affine_transformation_meta_apply_matrix( &mut self.0, matrix as *const [[f32; 4]; 4] as *const [f32; 16], ); } } } unsafe impl MetaAPI for VideoAffineTransformationMeta { type GstType = ffi::GstVideoAffineTransformationMeta; #[doc(alias = "gst_video_affine_transformation_meta_api_get_type")] #[inline] fn meta_api() -> glib::Type { unsafe { from_glib(ffi::gst_video_affine_transformation_meta_api_get_type()) } } } impl fmt::Debug for VideoAffineTransformationMeta { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoAffineTransformationMeta") .field("matrix", &self.matrix()) .finish() } } #[repr(transparent)] #[doc(alias = "GstVideoOverlayCompositionMeta")] pub struct VideoOverlayCompositionMeta(ffi::GstVideoOverlayCompositionMeta); unsafe impl Send for VideoOverlayCompositionMeta {} unsafe impl Sync for VideoOverlayCompositionMeta {} impl VideoOverlayCompositionMeta { #[doc(alias = "gst_buffer_add_video_overlay_composition_meta")] pub fn add<'a>( buffer: &'a mut gst::BufferRef, overlay: &crate::VideoOverlayComposition, ) -> gst::MetaRefMut<'a, Self, gst::meta::Standalone> { skip_assert_initialized!(); unsafe { let meta = ffi::gst_buffer_add_video_overlay_composition_meta( buffer.as_mut_ptr(), overlay.as_mut_ptr(), ); Self::from_mut_ptr(buffer, meta) } } #[doc(alias = "get_overlay")] #[inline] pub fn overlay(&self) -> &crate::VideoOverlayCompositionRef { unsafe { crate::VideoOverlayCompositionRef::from_ptr(self.0.overlay) } } #[doc(alias = "get_overlay_owned")] #[inline] pub fn overlay_owned(&self) -> crate::VideoOverlayComposition { unsafe { from_glib_none(self.overlay().as_ptr()) } } #[inline] pub fn set_overlay(&mut self, overlay: &crate::VideoOverlayComposition) { #![allow(clippy::cast_ptr_alignment)] unsafe { gst::ffi::gst_mini_object_unref(self.0.overlay as *mut _); self.0.overlay = gst::ffi::gst_mini_object_ref(overlay.as_mut_ptr() as *mut _) as *mut _; } } } unsafe impl MetaAPI for VideoOverlayCompositionMeta { type GstType = ffi::GstVideoOverlayCompositionMeta; #[doc(alias = "gst_video_overlay_composition_meta_api_get_type")] #[inline] fn meta_api() -> glib::Type { unsafe { from_glib(ffi::gst_video_overlay_composition_meta_api_get_type()) } } } impl fmt::Debug for VideoOverlayCompositionMeta { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoOverlayCompositionMeta") .field("overlay", &self.overlay()) .finish() } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[repr(transparent)] #[doc(alias = "GstVideoCaptionMeta")] pub struct VideoCaptionMeta(ffi::GstVideoCaptionMeta); #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] unsafe impl Send for VideoCaptionMeta {} #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = 
"v1_16")))] unsafe impl Sync for VideoCaptionMeta {} #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl VideoCaptionMeta { #[doc(alias = "gst_buffer_add_video_caption_meta")] pub fn add<'a>( buffer: &'a mut gst::BufferRef, caption_type: crate::VideoCaptionType, data: &[u8], ) -> gst::MetaRefMut<'a, Self, gst::meta::Standalone> { skip_assert_initialized!(); assert!(!data.is_empty()); unsafe { let meta = ffi::gst_buffer_add_video_caption_meta( buffer.as_mut_ptr(), caption_type.into_glib(), data.as_ptr(), data.len(), ); Self::from_mut_ptr(buffer, meta) } } #[doc(alias = "get_caption_type")] #[inline] pub fn caption_type(&self) -> crate::VideoCaptionType { unsafe { from_glib(self.0.caption_type) } } #[doc(alias = "get_data")] #[inline] pub fn data(&self) -> &[u8] { if self.0.size == 0 { return &[]; } unsafe { use std::slice; slice::from_raw_parts(self.0.data, self.0.size) } } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] unsafe impl MetaAPI for VideoCaptionMeta { type GstType = ffi::GstVideoCaptionMeta; #[doc(alias = "gst_video_caption_meta_api_get_type")] #[inline] fn meta_api() -> glib::Type { unsafe { from_glib(ffi::gst_video_caption_meta_api_get_type()) } } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] impl fmt::Debug for VideoCaptionMeta { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoCaptionMeta") .field("caption_type", &self.caption_type()) .field("data", &self.data()) .finish() } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[repr(transparent)] #[doc(alias = "GstVideoAFDMeta")] pub struct VideoAFDMeta(ffi::GstVideoAFDMeta); #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] unsafe impl Send for VideoAFDMeta {} #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] unsafe impl Sync for VideoAFDMeta {} #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] impl VideoAFDMeta { #[doc(alias = "gst_buffer_add_video_afd_meta")] pub fn add( buffer: &mut gst::BufferRef, field: u8, spec: crate::VideoAFDSpec, afd: crate::VideoAFDValue, ) -> gst::MetaRefMut { skip_assert_initialized!(); unsafe { let meta = ffi::gst_buffer_add_video_afd_meta( buffer.as_mut_ptr(), field, spec.into_glib(), afd.into_glib(), ); Self::from_mut_ptr(buffer, meta) } } #[doc(alias = "get_field")] #[inline] pub fn field(&self) -> u8 { self.0.field } #[doc(alias = "get_spec")] #[inline] pub fn spec(&self) -> crate::VideoAFDSpec { unsafe { from_glib(self.0.spec) } } #[doc(alias = "get_afd")] #[inline] pub fn afd(&self) -> crate::VideoAFDValue { unsafe { from_glib(self.0.afd) } } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] unsafe impl MetaAPI for VideoAFDMeta { type GstType = ffi::GstVideoAFDMeta; #[doc(alias = "gst_video_afd_meta_api_get_type")] #[inline] fn meta_api() -> glib::Type { unsafe { from_glib(ffi::gst_video_afd_meta_api_get_type()) } } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] impl fmt::Debug for VideoAFDMeta { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoAFDMeta") .field("field", &self.field()) .field("spec", &self.spec()) .field("afd", &self.afd()) .finish() } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] #[repr(transparent)] #[doc(alias = "GstVideoBarMeta")] pub struct VideoBarMeta(ffi::GstVideoBarMeta); #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = 
"v1_18")))] unsafe impl Send for VideoBarMeta {} #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] unsafe impl Sync for VideoBarMeta {} #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] impl VideoBarMeta { #[doc(alias = "gst_buffer_add_video_bar_meta")] pub fn add( buffer: &mut gst::BufferRef, field: u8, is_letterbox: bool, bar_data1: u32, bar_data2: u32, ) -> gst::MetaRefMut { skip_assert_initialized!(); unsafe { let meta = ffi::gst_buffer_add_video_bar_meta( buffer.as_mut_ptr(), field, is_letterbox.into_glib(), bar_data1, bar_data2, ); Self::from_mut_ptr(buffer, meta) } } #[doc(alias = "get_field")] #[inline] pub fn field(&self) -> u8 { self.0.field } #[inline] pub fn is_letterbox(&self) -> bool { unsafe { from_glib(self.0.is_letterbox) } } #[doc(alias = "get_bar_data1")] #[inline] pub fn bar_data1(&self) -> u32 { self.0.bar_data1 } #[doc(alias = "get_bar_data2")] #[inline] pub fn bar_data2(&self) -> u32 { self.0.bar_data2 } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] unsafe impl MetaAPI for VideoBarMeta { type GstType = ffi::GstVideoBarMeta; #[doc(alias = "gst_video_bar_meta_api_get_type")] #[inline] fn meta_api() -> glib::Type { unsafe { from_glib(ffi::gst_video_bar_meta_api_get_type()) } } } #[cfg(feature = "v1_18")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_18")))] impl fmt::Debug for VideoBarMeta { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoBarMeta") .field("field", &self.field()) .field("is_letterbox", &self.is_letterbox()) .field("bar_data1", &self.bar_data1()) .field("bar_data2", &self.bar_data2()) .finish() } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] #[repr(transparent)] #[doc(alias = "GstVideoCodecAlphaMeta")] pub struct VideoCodecAlphaMeta(ffi::GstVideoCodecAlphaMeta); #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] unsafe impl Send for VideoCodecAlphaMeta {} #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] unsafe impl Sync for VideoCodecAlphaMeta {} #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] impl VideoCodecAlphaMeta { #[doc(alias = "gst_buffer_add_video_codec_alpha_meta")] pub fn add( buffer: &mut gst::BufferRef, alpha_buffer: gst::Buffer, ) -> gst::MetaRefMut { skip_assert_initialized!(); unsafe { let meta = ffi::gst_buffer_add_video_codec_alpha_meta( buffer.as_mut_ptr(), alpha_buffer.to_glib_none().0, ); Self::from_mut_ptr(buffer, meta) } } #[inline] pub fn alpha_buffer(&self) -> &gst::BufferRef { unsafe { gst::BufferRef::from_ptr(self.0.buffer) } } #[inline] pub fn alpha_buffer_owned(&self) -> gst::Buffer { unsafe { from_glib_none(self.0.buffer) } } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] unsafe impl MetaAPI for VideoCodecAlphaMeta { type GstType = ffi::GstVideoCodecAlphaMeta; #[doc(alias = "gst_video_codec_alpha_meta_api_get_type")] #[inline] fn meta_api() -> glib::Type { unsafe { from_glib(ffi::gst_video_codec_alpha_meta_api_get_type()) } } } #[cfg(feature = "v1_20")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_20")))] impl fmt::Debug for VideoCodecAlphaMeta { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoCodecAlphaMeta") .field("buffer", &self.alpha_buffer()) .finish() } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] #[repr(transparent)] #[doc(alias = "GstVideoSEIUserDataUnregisteredMeta")] pub struct 
VideoSeiUserDataUnregisteredMeta(ffi::GstVideoSEIUserDataUnregisteredMeta); #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] unsafe impl Send for VideoSeiUserDataUnregisteredMeta {} #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] unsafe impl Sync for VideoSeiUserDataUnregisteredMeta {} #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] impl VideoSeiUserDataUnregisteredMeta { #[doc(alias = "gst_buffer_add_video_sei_user_data_unregistered_meta")] pub fn add<'a>( buffer: &'a mut gst::BufferRef, uuid: &[u8; 16], data: &[u8], ) -> gst::MetaRefMut<'a, Self, gst::meta::Standalone> { skip_assert_initialized!(); assert!(!data.is_empty()); unsafe { let meta = ffi::gst_buffer_add_video_sei_user_data_unregistered_meta( buffer.as_mut_ptr(), mut_override(uuid.as_ptr()), mut_override(data.as_ptr()), data.len(), ); Self::from_mut_ptr(buffer, meta) } } #[doc(alias = "get_data")] #[inline] pub fn data(&self) -> &[u8] { if self.0.size == 0 { return &[]; } // SAFETY: In the C API we have a pointer data and a size variable // indicating the length of the data. Here we convert it to a size, // making sure we read the size specified in the C API. unsafe { use std::slice; slice::from_raw_parts(self.0.data, self.0.size) } } #[doc(alias = "get_uuid")] #[inline] pub fn uuid(&self) -> [u8; 16] { self.0.uuid } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] impl fmt::Debug for VideoSeiUserDataUnregisteredMeta { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoSeiUserDataUnregisteredMeta") .field( "uuid", &format!("0x{:032X}", u128::from_be_bytes(self.uuid())), ) .field("data", &self.data()) .finish() } } #[cfg(feature = "v1_22")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_22")))] unsafe impl MetaAPI for VideoSeiUserDataUnregisteredMeta { type GstType = ffi::GstVideoSEIUserDataUnregisteredMeta; #[doc(alias = "gst_video_sei_user_data_unregistered_meta_api_get_type")] fn meta_api() -> glib::Type { unsafe { glib::translate::from_glib(ffi::gst_video_sei_user_data_unregistered_meta_api_get_type()) } } } #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] #[repr(transparent)] #[doc(alias = "GstAncillaryMeta")] pub struct AncillaryMeta(ffi::GstAncillaryMeta); #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] unsafe impl Send for AncillaryMeta {} #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] unsafe impl Sync for AncillaryMeta {} #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] impl AncillaryMeta { #[doc(alias = "gst_buffer_add_ancillary_meta")] pub fn add(buffer: &mut gst::BufferRef) -> gst::MetaRefMut { skip_assert_initialized!(); unsafe { let meta = ffi::gst_buffer_add_ancillary_meta(buffer.as_mut_ptr()); Self::from_mut_ptr(buffer, meta) } } #[inline] pub fn field(&self) -> crate::AncillaryMetaField { unsafe { from_glib(self.0.field) } } #[inline] pub fn set_field(&mut self, field: crate::AncillaryMetaField) { self.0.field = field.into_glib(); } #[inline] pub fn c_not_y_channel(&self) -> bool { unsafe { from_glib(self.0.c_not_y_channel) } } #[inline] pub fn set_c_not_y_channel(&mut self, c_not_y_channel: bool) { self.0.c_not_y_channel = c_not_y_channel.into_glib(); } #[inline] pub fn line(&self) -> u16 { self.0.line } #[inline] pub fn set_line(&mut self, line: u16) { self.0.line = line; } #[inline] pub fn offset(&self) -> u16 { self.0.offset } #[inline] pub fn set_offset(&mut self, offset: u16) 
{ self.0.offset = offset; } #[inline] pub fn did(&self) -> u16 { self.0.DID } #[inline] pub fn set_did(&mut self, did: u16) { self.0.DID = did; } #[inline] pub fn sdid_block_number(&self) -> u16 { self.0.SDID_block_number } #[inline] pub fn set_sdid_block_number(&mut self, sdid_block_number: u16) { self.0.SDID_block_number = sdid_block_number; } #[inline] pub fn data_count(&self) -> u16 { self.0.data_count } #[inline] pub fn checksum(&self) -> u16 { self.0.checksum } #[inline] pub fn set_checksum(&mut self, checksum: u16) { self.0.checksum = checksum; } #[inline] pub fn data(&self) -> &[u16] { if self.0.data_count & 0xff == 0 { return &[]; } unsafe { use std::slice; slice::from_raw_parts(self.0.data, (self.0.data_count & 0xff) as usize) } } #[inline] pub fn data_mut(&mut self) -> &mut [u16] { if self.0.data_count & 0xff == 0 { return &mut []; } unsafe { use std::slice; slice::from_raw_parts_mut(self.0.data, (self.0.data_count & 0xff) as usize) } } #[inline] pub fn set_data(&mut self, data: glib::Slice) { unsafe { assert!(data.len() < 256); self.0.data_count = data.len() as u16; self.0.data = data.into_glib_ptr(); } } #[inline] pub fn set_data_count_upper_two_bits(&mut self, upper_two_bits: u8) { assert!(upper_two_bits & !0x03 == 0); self.0.data_count = ((upper_two_bits as u16) << 8) | self.0.data_count & 0xff; } } #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] unsafe impl MetaAPI for AncillaryMeta { type GstType = ffi::GstAncillaryMeta; #[doc(alias = "gst_ancillary_meta_api_get_type")] #[inline] fn meta_api() -> glib::Type { unsafe { from_glib(ffi::gst_ancillary_meta_api_get_type()) } } } #[cfg(feature = "v1_24")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_24")))] impl fmt::Debug for AncillaryMeta { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("AncillaryMeta") .field("field", &self.field()) .field("c_not_y_channel", &self.c_not_y_channel()) .field("line", &self.line()) .field("offset", &self.offset()) .field("did", &self.did()) .field("sdid_block_number", &self.sdid_block_number()) .field("data_count", &self.data_count()) .field("data", &self.data()) .field("checksum", &self.checksum()) .finish() } } pub mod tags { gst::impl_meta_tag!(Video, crate::ffi::GST_META_TAG_VIDEO_STR); gst::impl_meta_tag!(Size, crate::ffi::GST_META_TAG_VIDEO_SIZE_STR); gst::impl_meta_tag!(Orientation, crate::ffi::GST_META_TAG_VIDEO_ORIENTATION_STR); gst::impl_meta_tag!(Colorspace, crate::ffi::GST_META_TAG_VIDEO_COLORSPACE_STR); } #[derive(Debug, Clone, PartialEq, Eq)] pub struct VideoMetaTransformScale<'a> { in_info: &'a crate::VideoInfo, out_info: &'a crate::VideoInfo, } impl<'a> VideoMetaTransformScale<'a> { pub fn new(in_info: &'a crate::VideoInfo, out_info: &'a crate::VideoInfo) -> Self { skip_assert_initialized!(); VideoMetaTransformScale { in_info, out_info } } } unsafe impl<'a> gst::meta::MetaTransform<'a> for VideoMetaTransformScale<'a> { type GLibType = ffi::GstVideoMetaTransform; #[doc(alias = "gst_video_meta_transform_scale_get_quark")] fn quark() -> glib::Quark { unsafe { from_glib(ffi::gst_video_meta_transform_scale_get_quark()) } } fn to_raw( &self, _meta: &gst::MetaRef, ) -> Result { Ok(ffi::GstVideoMetaTransform { in_info: mut_override(self.in_info.to_glib_none().0), out_info: mut_override(self.out_info.to_glib_none().0), }) } } #[cfg(test)] mod tests { use super::*; #[test] fn test_add_get_meta() { gst::init().unwrap(); let mut buffer = gst::Buffer::with_size(320 * 240 * 4).unwrap(); { let meta = VideoMeta::add( buffer.get_mut().unwrap(), 
crate::VideoFrameFlags::empty(), crate::VideoFormat::Argb, 320, 240, ) .unwrap(); assert_eq!(meta.id(), 0); assert_eq!(meta.video_frame_flags(), crate::VideoFrameFlags::empty()); assert_eq!(meta.format(), crate::VideoFormat::Argb); assert_eq!(meta.width(), 320); assert_eq!(meta.height(), 240); assert_eq!(meta.n_planes(), 1); assert_eq!(meta.offset(), &[0]); assert_eq!(meta.stride(), &[320 * 4]); assert!(meta.has_tag::()); assert!(meta.has_tag::()); assert!(meta.has_tag::()); assert!(meta.has_tag::()); } { let meta = buffer.meta::().unwrap(); assert_eq!(meta.id(), 0); assert_eq!(meta.video_frame_flags(), crate::VideoFrameFlags::empty()); assert_eq!(meta.format(), crate::VideoFormat::Argb); assert_eq!(meta.width(), 320); assert_eq!(meta.height(), 240); assert_eq!(meta.n_planes(), 1); assert_eq!(meta.offset(), &[0]); assert_eq!(meta.stride(), &[320 * 4]); } } #[test] fn test_add_full_get_meta() { gst::init().unwrap(); let mut buffer = gst::Buffer::with_size(320 * 240 * 4).unwrap(); { let meta = VideoMeta::add_full( buffer.get_mut().unwrap(), crate::VideoFrameFlags::empty(), crate::VideoFormat::Argb, 320, 240, &[0], &[320 * 4], ) .unwrap(); assert_eq!(meta.id(), 0); assert_eq!(meta.video_frame_flags(), crate::VideoFrameFlags::empty()); assert_eq!(meta.format(), crate::VideoFormat::Argb); assert_eq!(meta.width(), 320); assert_eq!(meta.height(), 240); assert_eq!(meta.n_planes(), 1); assert_eq!(meta.offset(), &[0]); assert_eq!(meta.stride(), &[320 * 4]); } { let meta = buffer.meta::().unwrap(); assert_eq!(meta.id(), 0); assert_eq!(meta.video_frame_flags(), crate::VideoFrameFlags::empty()); assert_eq!(meta.format(), crate::VideoFormat::Argb); assert_eq!(meta.width(), 320); assert_eq!(meta.height(), 240); assert_eq!(meta.n_planes(), 1); assert_eq!(meta.offset(), &[0]); assert_eq!(meta.stride(), &[320 * 4]); } } #[test] #[cfg(feature = "v1_16")] fn test_add_full_alternate_interlacing() { gst::init().unwrap(); let mut buffer = gst::Buffer::with_size(320 * 120 * 4).unwrap(); VideoMeta::add_full( buffer.get_mut().unwrap(), crate::VideoFrameFlags::TOP_FIELD, crate::VideoFormat::Argb, 320, 240, &[0], &[320 * 4], ) .unwrap(); } #[test] #[cfg(feature = "v1_18")] fn test_video_meta_alignment() { gst::init().unwrap(); let mut buffer = gst::Buffer::with_size(115200).unwrap(); let meta = VideoMeta::add( buffer.get_mut().unwrap(), crate::VideoFrameFlags::empty(), crate::VideoFormat::Nv12, 320, 240, ) .unwrap(); let alig = meta.alignment(); assert_eq!(alig, crate::VideoAlignment::new(0, 0, 0, 0, &[0, 0, 0, 0])); assert_eq!(meta.plane_size().unwrap(), [76800, 38400, 0, 0]); assert_eq!(meta.plane_height().unwrap(), [240, 120, 0, 0]); /* horizontal padding */ let mut info = crate::VideoInfo::builder(crate::VideoFormat::Nv12, 320, 240) .build() .expect("Failed to create VideoInfo"); let mut alig = crate::VideoAlignment::new(0, 0, 2, 6, &[0, 0, 0, 0]); info.align(&mut alig).unwrap(); let mut meta = VideoMeta::add_full( buffer.get_mut().unwrap(), crate::VideoFrameFlags::empty(), crate::VideoFormat::Nv12, info.width(), info.height(), info.offset(), info.stride(), ) .unwrap(); meta.set_alignment(&alig).unwrap(); let alig = meta.alignment(); assert_eq!(alig, crate::VideoAlignment::new(0, 0, 2, 6, &[0, 0, 0, 0])); assert_eq!(meta.plane_size().unwrap(), [78720, 39360, 0, 0]); assert_eq!(meta.plane_height().unwrap(), [240, 120, 0, 0]); /* vertical alignment */ let mut info = crate::VideoInfo::builder(crate::VideoFormat::Nv12, 320, 240) .build() .expect("Failed to create VideoInfo"); let mut alig = 
crate::VideoAlignment::new(2, 6, 0, 0, &[0, 0, 0, 0]); info.align(&mut alig).unwrap(); let mut meta = VideoMeta::add_full( buffer.get_mut().unwrap(), crate::VideoFrameFlags::empty(), crate::VideoFormat::Nv12, info.width(), info.height(), info.offset(), info.stride(), ) .unwrap(); meta.set_alignment(&alig).unwrap(); let alig = meta.alignment(); assert_eq!(alig, crate::VideoAlignment::new(2, 6, 0, 0, &[0, 0, 0, 0])); assert_eq!(meta.plane_size().unwrap(), [79360, 39680, 0, 0]); assert_eq!(meta.plane_height().unwrap(), [248, 124, 0, 0]); } #[test] #[cfg(feature = "v1_22")] fn test_get_video_sei_user_data_unregistered_meta() { gst::init().unwrap(); const META_UUID: &[u8; 16] = &[ 0x4D, 0x49, 0x53, 0x50, 0x6D, 0x69, 0x63, 0x72, 0x6F, 0x73, 0x65, 0x63, 0x74, 0x69, 0x6D, 0x65, ]; const META_DATA: &[u8] = &[ 0x1f, 0x00, 0x05, 0xff, 0x21, 0x7e, 0xff, 0x29, 0xb5, 0xff, 0xdc, 0x13, ]; let buffer_data = &[ &[0x00, 0x00, 0x00, 0x20, 0x06, 0x05, 0x1c], META_UUID as &[u8], META_DATA, &[ 0x80, 0x00, 0x00, 0x00, 0x14, 0x65, 0x88, 0x84, 0x00, 0x10, 0xff, 0xfe, 0xf6, 0xf0, 0xfe, 0x05, 0x36, 0x56, 0x04, 0x50, 0x96, 0x7b, 0x3f, 0x53, 0xe1, ], ] .concat(); let mut harness = gst_check::Harness::new("h264parse"); harness.set_src_caps_str(r#" video/x-h264, stream-format=(string)avc, width=(int)1920, height=(int)1080, framerate=(fraction)25/1, bit-depth-chroma=(uint)8, parsed=(boolean)true, alignment=(string)au, profile=(string)high, level=(string)4, codec_data=(buffer)01640028ffe1001a67640028acb200f0044fcb080000030008000003019478c1924001000568ebccb22c "#); let buffer = gst::Buffer::from_slice(buffer_data.clone()); let buffer = harness.push_and_pull(buffer).unwrap(); let meta = buffer.meta::().unwrap(); assert_eq!(meta.uuid(), *META_UUID); assert_eq!(meta.data(), META_DATA); assert_eq!(meta.data().len(), META_DATA.len()); } #[test] fn test_meta_video_transform() { gst::init().unwrap(); let mut buffer = gst::Buffer::with_size(320 * 240 * 4).unwrap(); let meta = VideoCropMeta::add(buffer.get_mut().unwrap(), (10, 10, 20, 20)); let mut buffer2 = gst::Buffer::with_size(640 * 480 * 4).unwrap(); let in_video_info = crate::VideoInfo::builder(crate::VideoFormat::Rgba, 320, 240) .build() .unwrap(); let out_video_info = crate::VideoInfo::builder(crate::VideoFormat::Rgba, 640, 480) .build() .unwrap(); meta.transform( buffer2.get_mut().unwrap(), &VideoMetaTransformScale::new(&in_video_info, &out_video_info), ) .unwrap(); let meta2 = buffer2.meta::().unwrap(); assert_eq!(meta2.rect(), (20, 20, 40, 40)); } } gstreamer-video-0.23.5/src/video_overlay.rs000064400000000000000000000020351046102023000170140ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. 
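// ---------------------------------------------------------------------------
// Editor's note: illustrative sketch (not part of the upstream crate) of the
// typical way the `VideoOverlay` window-handle API below is used, together
// with `is_video_overlay_prepare_window_handle_message()`. The "glimagesink"
// element name and the `raw_window_handle` value are example assumptions; the
// handle must stay valid for as long as the sink uses it, which is why the
// call is `unsafe`.
//
// use gst::prelude::*;
// use gst_video::prelude::*;
//
// let sink = gst::ElementFactory::make("glimagesink").build().unwrap();
// let raw_window_handle: usize = 0; // placeholder: e.g. an X11 window ID or HWND from the toolkit
//
// // Either set the handle directly on a sink that implements `VideoOverlay`...
// if let Some(overlay) = sink.dynamic_cast_ref::<gst_video::VideoOverlay>() {
//     unsafe { overlay.set_window_handle(raw_window_handle) };
// }
//
// // ...or wait for the sink to ask for it via a synchronous bus message:
// // if gst_video::is_video_overlay_prepare_window_handle_message(msg) { ... }
// ---------------------------------------------------------------------------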
use glib::{prelude::*, translate::*}; use libc::uintptr_t; use crate::{ffi, VideoOverlay}; mod sealed { pub trait Sealed {} impl> Sealed for T {} } pub trait VideoOverlayExtManual: sealed::Sealed + IsA + 'static { unsafe fn set_window_handle(&self, handle: uintptr_t) { ffi::gst_video_overlay_set_window_handle(self.as_ref().to_glib_none().0, handle) } unsafe fn got_window_handle(&self, handle: uintptr_t) { ffi::gst_video_overlay_got_window_handle(self.as_ref().to_glib_none().0, handle) } } impl> VideoOverlayExtManual for O {} #[doc(alias = "gst_is_video_overlay_prepare_window_handle_message")] pub fn is_video_overlay_prepare_window_handle_message(msg: &gst::MessageRef) -> bool { skip_assert_initialized!(); unsafe { from_glib(ffi::gst_is_video_overlay_prepare_window_handle_message( msg.as_mut_ptr(), )) } } gstreamer-video-0.23.5/src/video_overlay_composition.rs000064400000000000000000000372061046102023000214470ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. use std::{fmt, mem}; use crate::ffi; use glib::translate::*; gst::mini_object_wrapper!( VideoOverlayRectangle, VideoOverlayRectangleRef, ffi::GstVideoOverlayRectangle, || ffi::gst_video_overlay_rectangle_get_type() ); impl fmt::Debug for VideoOverlayRectangle { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { VideoOverlayRectangleRef::fmt(self, f) } } impl fmt::Debug for VideoOverlayRectangleRef { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoOverlayRectangle") .field("flags", &self.flags()) .field("global_alpha", &self.global_alpha()) .field("render_rectangle", &self.render_rectangle()) .finish() } } impl VideoOverlayRectangle { #[doc(alias = "gst_video_overlay_rectangle_new_raw")] pub fn new_raw( buffer: &gst::Buffer, render_x: i32, render_y: i32, render_width: u32, render_height: u32, flags: crate::VideoOverlayFormatFlags, ) -> Self { skip_assert_initialized!(); assert!(buffer.meta::().is_some()); unsafe { from_glib_full(ffi::gst_video_overlay_rectangle_new_raw( buffer.to_glib_none().0, render_x, render_y, render_width, render_height, flags.into_glib(), )) } } } impl VideoOverlayRectangleRef { #[doc(alias = "get_flags")] #[doc(alias = "gst_video_overlay_rectangle_get_flags")] pub fn flags(&self) -> crate::VideoOverlayFormatFlags { unsafe { from_glib(ffi::gst_video_overlay_rectangle_get_flags( self.as_mut_ptr(), )) } } #[doc(alias = "get_global_alpha")] #[doc(alias = "gst_video_overlay_rectangle_get_global_alpha")] pub fn global_alpha(&self) -> f32 { unsafe { ffi::gst_video_overlay_rectangle_get_global_alpha(self.as_mut_ptr()) } } #[doc(alias = "gst_video_overlay_rectangle_set_global_alpha")] pub fn set_global_alpha(&mut self, alpha: f32) { unsafe { ffi::gst_video_overlay_rectangle_set_global_alpha(self.as_mut_ptr(), alpha) } } #[doc(alias = "get_seqnum")] #[doc(alias = "gst_video_overlay_rectangle_get_seqnum")] pub fn seqnum(&self) -> u32 { unsafe { ffi::gst_video_overlay_rectangle_get_seqnum(self.as_mut_ptr()) } } #[doc(alias = "get_render_rectangle")] #[doc(alias = "gst_video_overlay_rectangle_get_render_rectangle")] pub fn render_rectangle(&self) -> (i32, i32, u32, u32) { unsafe { let mut render_x = mem::MaybeUninit::uninit(); let mut render_y = mem::MaybeUninit::uninit(); let mut render_width = mem::MaybeUninit::uninit(); let mut render_height = mem::MaybeUninit::uninit(); ffi::gst_video_overlay_rectangle_get_render_rectangle( self.as_mut_ptr(), render_x.as_mut_ptr(), render_y.as_mut_ptr(), render_width.as_mut_ptr(), render_height.as_mut_ptr(), 
); ( render_x.assume_init(), render_y.assume_init(), render_width.assume_init(), render_height.assume_init(), ) } } #[doc(alias = "gst_video_overlay_rectangle_set_render_rectangle")] pub fn set_render_rectangle( &mut self, render_x: i32, render_y: i32, render_width: u32, render_height: u32, ) { unsafe { ffi::gst_video_overlay_rectangle_set_render_rectangle( self.as_mut_ptr(), render_x, render_y, render_width, render_height, ) } } #[doc(alias = "get_pixels_unscaled_raw")] #[doc(alias = "gst_video_overlay_rectangle_get_pixels_unscaled_raw")] pub fn pixels_unscaled_raw(&self, flags: crate::VideoOverlayFormatFlags) -> gst::Buffer { unsafe { from_glib_none(ffi::gst_video_overlay_rectangle_get_pixels_unscaled_raw( self.as_mut_ptr(), flags.into_glib(), )) } } #[doc(alias = "get_pixels_unscaled_ayuv")] #[doc(alias = "gst_video_overlay_rectangle_get_pixels_unscaled_ayuv")] pub fn pixels_unscaled_ayuv(&self, flags: crate::VideoOverlayFormatFlags) -> gst::Buffer { unsafe { from_glib_none(ffi::gst_video_overlay_rectangle_get_pixels_unscaled_ayuv( self.as_mut_ptr(), flags.into_glib(), )) } } #[doc(alias = "get_pixels_unscaled_argb")] #[doc(alias = "gst_video_overlay_rectangle_get_pixels_unscaled_argb")] pub fn pixels_unscaled_argb(&self, flags: crate::VideoOverlayFormatFlags) -> gst::Buffer { unsafe { from_glib_none(ffi::gst_video_overlay_rectangle_get_pixels_unscaled_argb( self.as_mut_ptr(), flags.into_glib(), )) } } #[doc(alias = "get_pixels_raw")] #[doc(alias = "gst_video_overlay_rectangle_get_pixels_raw")] pub fn pixels_raw(&self, flags: crate::VideoOverlayFormatFlags) -> gst::Buffer { unsafe { from_glib_none(ffi::gst_video_overlay_rectangle_get_pixels_raw( self.as_mut_ptr(), flags.into_glib(), )) } } #[doc(alias = "get_pixels_ayuv")] #[doc(alias = "gst_video_overlay_rectangle_get_pixels_ayuv")] pub fn pixels_ayuv(&self, flags: crate::VideoOverlayFormatFlags) -> gst::Buffer { unsafe { from_glib_none(ffi::gst_video_overlay_rectangle_get_pixels_ayuv( self.as_mut_ptr(), flags.into_glib(), )) } } #[doc(alias = "get_pixels_argb")] #[doc(alias = "gst_video_overlay_rectangle_get_pixels_argb")] pub fn pixels_argb(&self, flags: crate::VideoOverlayFormatFlags) -> gst::Buffer { unsafe { from_glib_none(ffi::gst_video_overlay_rectangle_get_pixels_argb( self.as_mut_ptr(), flags.into_glib(), )) } } } gst::mini_object_wrapper!( VideoOverlayComposition, VideoOverlayCompositionRef, ffi::GstVideoOverlayComposition, || ffi::gst_video_overlay_composition_get_type() ); impl fmt::Debug for VideoOverlayComposition { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { VideoOverlayCompositionRef::fmt(self, f) } } impl fmt::Debug for VideoOverlayCompositionRef { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoOverlayComposition").finish() } } impl VideoOverlayComposition { #[doc(alias = "gst_video_overlay_composition_new")] pub fn new<'a>( rects: impl IntoIterator, ) -> Result { assert_initialized_main_thread!(); #[cfg(feature = "v1_20")] unsafe { use std::ptr; let composition = Self::from_glib_full(ffi::gst_video_overlay_composition_new(ptr::null_mut())); rects.into_iter().for_each(|rect| { ffi::gst_video_overlay_composition_add_rectangle( composition.as_mut_ptr(), rect.as_mut_ptr(), ); }); Ok(composition) } #[cfg(not(feature = "v1_20"))] unsafe { let mut iter = rects.into_iter(); let first = match iter.next() { None => { return Err(glib::bool_error!( "Failed to create VideoOverlayComposition" )) } Some(first) => first, }; let composition = 
Self::from_glib_full(ffi::gst_video_overlay_composition_new(first.as_mut_ptr())); for rect in iter { ffi::gst_video_overlay_composition_add_rectangle( composition.as_mut_ptr(), rect.as_mut_ptr(), ); } Ok(composition) } } } #[cfg(feature = "v1_20")] impl Default for VideoOverlayComposition { fn default() -> Self { assert_initialized_main_thread!(); use std::ptr; unsafe { from_glib_full(ffi::gst_video_overlay_composition_new(ptr::null_mut())) } } } impl VideoOverlayCompositionRef { #[doc(alias = "gst_video_overlay_composition_n_rectangles")] pub fn n_rectangles(&self) -> u32 { unsafe { ffi::gst_video_overlay_composition_n_rectangles(self.as_mut_ptr()) } } #[doc(alias = "get_rectangle")] #[doc(alias = "gst_video_overlay_composition_get_rectangle")] pub fn rectangle(&self, idx: u32) -> Result { if idx >= self.n_rectangles() { return Err(glib::bool_error!("Invalid index")); } unsafe { match from_glib_none(ffi::gst_video_overlay_composition_get_rectangle( self.as_mut_ptr(), idx, )) { Some(r) => Ok(r), None => Err(glib::bool_error!("Failed to get rectangle")), } } } #[doc(alias = "gst_video_overlay_composition_add_rectangle")] pub fn add_rectangle(&mut self, rect: &VideoOverlayRectangleRef) { unsafe { ffi::gst_video_overlay_composition_add_rectangle(self.as_mut_ptr(), rect.as_mut_ptr()); } } #[doc(alias = "get_seqnum")] #[doc(alias = "gst_video_overlay_composition_get_seqnum")] pub fn seqnum(&self) -> u32 { unsafe { ffi::gst_video_overlay_composition_get_seqnum(self.as_mut_ptr()) } } #[doc(alias = "gst_video_overlay_composition_blend")] pub fn blend( &self, frame: &mut crate::VideoFrameRef<&mut gst::BufferRef>, ) -> Result<(), glib::BoolError> { unsafe { glib::result_from_gboolean!( ffi::gst_video_overlay_composition_blend(self.as_mut_ptr(), frame.as_mut_ptr()), "Failed to blend overlay composition", ) } } pub fn iter(&self) -> Iter { Iter { composition: self, idx: 0, len: self.n_rectangles() as usize, } } } impl<'a> IntoIterator for &'a VideoOverlayComposition { type IntoIter = Iter<'a>; type Item = VideoOverlayRectangle; fn into_iter(self) -> Self::IntoIter { self.iter() } } impl From for VideoOverlayComposition { fn from(value: VideoOverlayRectangle) -> Self { skip_assert_initialized!(); unsafe { Self::from_glib_full(ffi::gst_video_overlay_composition_new( value.into_glib_ptr(), )) } } } impl<'a> From<&'a VideoOverlayRectangle> for VideoOverlayComposition { fn from(value: &'a VideoOverlayRectangle) -> Self { skip_assert_initialized!(); unsafe { Self::from_glib_full(ffi::gst_video_overlay_composition_new(value.as_mut_ptr())) } } } #[cfg(feature = "v1_20")] impl From<[VideoOverlayRectangle; N]> for VideoOverlayComposition { fn from(value: [VideoOverlayRectangle; N]) -> Self { assert_initialized_main_thread!(); unsafe { use std::ptr; let composition = Self::from_glib_full(ffi::gst_video_overlay_composition_new(ptr::null_mut())); value.into_iter().for_each(|rect| { ffi::gst_video_overlay_composition_add_rectangle( composition.as_mut_ptr(), rect.into_glib_ptr(), ); }); composition } } } #[cfg(feature = "v1_20")] impl<'a, const N: usize> From<[&'a VideoOverlayRectangle; N]> for VideoOverlayComposition { fn from(value: [&'a VideoOverlayRectangle; N]) -> Self { assert_initialized_main_thread!(); unsafe { use std::ptr; let composition = Self::from_glib_full(ffi::gst_video_overlay_composition_new(ptr::null_mut())); value.into_iter().for_each(|rect| { ffi::gst_video_overlay_composition_add_rectangle( composition.as_mut_ptr(), rect.as_mut_ptr(), ); }); composition } } } #[cfg(feature = "v1_20")] impl 
std::iter::FromIterator for VideoOverlayComposition { fn from_iter>(iter: T) -> Self { assert_initialized_main_thread!(); unsafe { use std::ptr; let composition = Self::from_glib_full(ffi::gst_video_overlay_composition_new(ptr::null_mut())); iter.into_iter().for_each(|rect| { ffi::gst_video_overlay_composition_add_rectangle( composition.as_mut_ptr(), rect.into_glib_ptr(), ); }); composition } } } #[cfg(feature = "v1_20")] impl<'a> std::iter::FromIterator<&'a VideoOverlayRectangle> for VideoOverlayComposition { fn from_iter>(iter: T) -> Self { assert_initialized_main_thread!(); unsafe { use std::ptr; let composition = Self::from_glib_full(ffi::gst_video_overlay_composition_new(ptr::null_mut())); iter.into_iter().for_each(|rect| { ffi::gst_video_overlay_composition_add_rectangle( composition.as_mut_ptr(), rect.as_mut_ptr(), ); }); composition } } } pub struct Iter<'a> { composition: &'a VideoOverlayCompositionRef, idx: usize, len: usize, } impl Iterator for Iter<'_> { type Item = VideoOverlayRectangle; fn next(&mut self) -> Option { if self.idx >= self.len { return None; } let rect = self.composition.rectangle(self.idx as u32).unwrap(); self.idx += 1; Some(rect) } fn size_hint(&self) -> (usize, Option) { let remaining = self.len - self.idx; (remaining, Some(remaining)) } fn count(self) -> usize { self.len - self.idx } fn nth(&mut self, n: usize) -> Option { let (end, overflow) = self.idx.overflowing_add(n); if end >= self.len || overflow { self.idx = self.len; None } else { self.idx = end + 1; Some(self.composition.rectangle(end as u32).unwrap()) } } fn last(self) -> Option { if self.idx == self.len { None } else { Some(self.composition.rectangle(self.len as u32 - 1).unwrap()) } } } impl DoubleEndedIterator for Iter<'_> { fn next_back(&mut self) -> Option { if self.idx == self.len { return None; } self.len -= 1; Some(self.composition.rectangle(self.len as u32).unwrap()) } fn nth_back(&mut self, n: usize) -> Option { let (end, overflow) = self.len.overflowing_sub(n); if end <= self.idx || overflow { self.idx = self.len; None } else { self.len = end - 1; Some(self.composition.rectangle(self.len as u32).unwrap()) } } } impl ExactSizeIterator for Iter<'_> {} impl std::iter::FusedIterator for Iter<'_> {} gstreamer-video-0.23.5/src/video_rectangle.rs000064400000000000000000000032461046102023000173040ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. 
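// ---------------------------------------------------------------------------
// Editor's note: illustrative sketch (not part of the upstream crate) of the
// `center_video_rectangle()` helper defined in this file, which wraps
// `gst_video_sink_center_rect()`. The concrete sizes are example assumptions.
//
// // A 1280x720 source placed inside a 1920x1080 destination window.
// let src = gst_video::VideoRectangle::new(0, 0, 1280, 720);
// let dst = gst_video::VideoRectangle::new(0, 0, 1920, 1080);
//
// // With `scale = false` the source keeps its size and is only repositioned;
// // with `scale = true` it is also scaled to fit the destination while
// // preserving the aspect ratio.
// let centered = gst_video::center_video_rectangle(&src, &dst, false);
// println!(
//     "render at x={} y={} w={} h={}",
//     centered.x, centered.y, centered.w, centered.h
// );
// ---------------------------------------------------------------------------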
use std::{marker::PhantomData, mem}; use crate::ffi; use glib::translate::IntoGlib; #[repr(C)] #[derive(Clone, Debug, Eq, PartialEq, Hash)] pub struct VideoRectangle { pub x: i32, pub y: i32, pub w: i32, pub h: i32, } impl VideoRectangle { #[inline] pub fn new(x: i32, y: i32, w: i32, h: i32) -> Self { skip_assert_initialized!(); Self { x, y, w, h } } } pub fn center_video_rectangle( src: &VideoRectangle, dst: &VideoRectangle, scale: bool, ) -> VideoRectangle { skip_assert_initialized!(); let mut result = ffi::GstVideoRectangle { x: 0, y: 0, w: 0, h: 0, }; let src_rect = ffi::GstVideoRectangle { x: src.x, y: src.y, w: src.w, h: src.h, }; let dst_rect = ffi::GstVideoRectangle { x: dst.x, y: dst.y, w: dst.w, h: dst.h, }; unsafe { ffi::gst_video_sink_center_rect(src_rect, dst_rect, &mut result, scale.into_glib()); } VideoRectangle::new(result.x, result.y, result.w, result.h) } #[doc(hidden)] impl glib::translate::Uninitialized for VideoRectangle { #[inline] unsafe fn uninitialized() -> Self { mem::zeroed() } } #[doc(hidden)] impl<'a> glib::translate::ToGlibPtrMut<'a, *mut ffi::GstVideoRectangle> for VideoRectangle { type Storage = PhantomData<&'a mut Self>; #[inline] fn to_glib_none_mut( &'a mut self, ) -> glib::translate::StashMut<'a, *mut ffi::GstVideoRectangle, Self> { glib::translate::StashMut(self as *mut _ as *mut _, PhantomData) } } gstreamer-video-0.23.5/src/video_time_code.rs000064400000000000000000000432201046102023000172640ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. use std::{cmp, fmt, mem, str}; use glib::translate::*; use gst::prelude::*; use crate::{ffi, VideoTimeCodeFlags, VideoTimeCodeInterval}; glib::wrapper! { #[doc(alias = "GstVideoTimeCode")] pub struct VideoTimeCode(BoxedInline); match fn { copy => |ptr| ffi::gst_video_time_code_copy(ptr), free => |ptr| ffi::gst_video_time_code_free(ptr), init => |_ptr| (), copy_into => |dest, src| { *dest = *src; if !(*dest).config.latest_daily_jam.is_null() { glib::ffi::g_date_time_ref((*dest).config.latest_daily_jam); } }, clear => |ptr| { if !(*ptr).config.latest_daily_jam.is_null() { glib::ffi::g_date_time_unref((*ptr).config.latest_daily_jam); } }, type_ => || ffi::gst_video_time_code_get_type(), } } glib::wrapper! 
{ #[doc(alias = "GstVideoTimeCode")] pub struct ValidVideoTimeCode(BoxedInline); match fn { copy => |ptr| ffi::gst_video_time_code_copy(ptr), free => |ptr| ffi::gst_video_time_code_free(ptr), init => |_ptr| (), copy_into => |dest, src| { *dest = *src; if !(*dest).config.latest_daily_jam.is_null() { glib::ffi::g_date_time_ref((*dest).config.latest_daily_jam); } }, clear => |ptr| { if !(*ptr).config.latest_daily_jam.is_null() { glib::ffi::g_date_time_unref((*ptr).config.latest_daily_jam); } }, } } impl VideoTimeCode { pub fn new_empty() -> Self { assert_initialized_main_thread!(); unsafe { let mut v = mem::MaybeUninit::zeroed(); ffi::gst_video_time_code_clear(v.as_mut_ptr()); Self { inner: v.assume_init(), } } } #[allow(clippy::too_many_arguments)] pub fn new( fps: gst::Fraction, latest_daily_jam: Option<&glib::DateTime>, flags: VideoTimeCodeFlags, hours: u32, minutes: u32, seconds: u32, frames: u32, field_count: u32, ) -> Self { assert_initialized_main_thread!(); unsafe { let mut v = mem::MaybeUninit::uninit(); ffi::gst_video_time_code_init( v.as_mut_ptr(), fps.numer() as u32, fps.denom() as u32, latest_daily_jam.to_glib_none().0, flags.into_glib(), hours, minutes, seconds, frames, field_count, ); Self { inner: v.assume_init(), } } } #[cfg(feature = "v1_16")] #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] #[doc(alias = "gst_video_time_code_init_from_date_time_full")] pub fn from_date_time( fps: gst::Fraction, dt: &glib::DateTime, flags: VideoTimeCodeFlags, field_count: u32, ) -> Result { assert_initialized_main_thread!(); assert!(fps.denom() > 0); unsafe { let mut v = mem::MaybeUninit::zeroed(); let res = ffi::gst_video_time_code_init_from_date_time_full( v.as_mut_ptr(), fps.numer() as u32, fps.denom() as u32, dt.to_glib_none().0, flags.into_glib(), field_count, ); if res == glib::ffi::GFALSE { Err(glib::bool_error!("Failed to init video time code")) } else { Ok(Self { inner: v.assume_init(), }) } } } #[doc(alias = "gst_video_time_code_is_valid")] pub fn is_valid(&self) -> bool { unsafe { from_glib(ffi::gst_video_time_code_is_valid(self.to_glib_none().0)) } } #[inline] pub fn set_fps(&mut self, fps: gst::Fraction) { self.inner.config.fps_n = fps.numer() as u32; self.inner.config.fps_d = fps.denom() as u32; } #[inline] pub fn set_flags(&mut self, flags: VideoTimeCodeFlags) { self.inner.config.flags = flags.into_glib() } #[inline] pub fn set_hours(&mut self, hours: u32) { self.inner.hours = hours } #[inline] pub fn set_minutes(&mut self, minutes: u32) { assert!(minutes < 60); self.inner.minutes = minutes } #[inline] pub fn set_seconds(&mut self, seconds: u32) { assert!(seconds < 60); self.inner.seconds = seconds } #[inline] pub fn set_frames(&mut self, frames: u32) { self.inner.frames = frames } #[inline] pub fn set_field_count(&mut self, field_count: u32) { assert!(field_count <= 2); self.inner.field_count = field_count } } impl TryFrom for ValidVideoTimeCode { type Error = VideoTimeCode; fn try_from(v: VideoTimeCode) -> Result { skip_assert_initialized!(); if v.is_valid() { // Use ManuallyDrop here to prevent the Drop impl of VideoTimeCode // from running as we don't move v.0 out here but copy it. // GstVideoTimeCode implements Copy. 
let v = mem::ManuallyDrop::new(v); Ok(Self { inner: v.inner }) } else { Err(v) } } } impl ValidVideoTimeCode { #[allow(clippy::too_many_arguments)] pub fn new( fps: gst::Fraction, latest_daily_jam: Option<&glib::DateTime>, flags: VideoTimeCodeFlags, hours: u32, minutes: u32, seconds: u32, frames: u32, field_count: u32, ) -> Result { skip_assert_initialized!(); let tc = VideoTimeCode::new( fps, latest_daily_jam, flags, hours, minutes, seconds, frames, field_count, ); match tc.try_into() { Ok(v) => Ok(v), Err(_) => Err(glib::bool_error!("Failed to create new ValidVideoTimeCode")), } } // #[cfg_attr(docsrs, doc(cfg(feature = "v1_16")))] // pub fn from_date_time( // fps: gst::Fraction, // dt: &glib::DateTime, // flags: VideoTimeCodeFlags, // field_count: u32, // ) -> Option { // let tc = VideoTimeCode::from_date_time(fps, dt, flags, field_count); // tc.and_then(|tc| tc.try_into().ok()) // } #[doc(alias = "gst_video_time_code_add_frames")] pub fn add_frames(&mut self, frames: i64) { unsafe { ffi::gst_video_time_code_add_frames(self.to_glib_none_mut().0, frames); } } #[doc(alias = "gst_video_time_code_add_interval")] #[must_use = "this returns the result of the operation, without modifying the original"] pub fn add_interval( &self, tc_inter: &VideoTimeCodeInterval, ) -> Result { unsafe { match from_glib_full(ffi::gst_video_time_code_add_interval( self.to_glib_none().0, tc_inter.to_glib_none().0, )) { Some(i) => Ok(i), None => Err(glib::bool_error!("Failed to add interval")), } } } #[doc(alias = "gst_video_time_code_compare")] fn compare(&self, tc2: &Self) -> i32 { unsafe { ffi::gst_video_time_code_compare(self.to_glib_none().0, tc2.to_glib_none().0) } } #[doc(alias = "gst_video_time_code_frames_since_daily_jam")] pub fn frames_since_daily_jam(&self) -> u64 { unsafe { ffi::gst_video_time_code_frames_since_daily_jam(self.to_glib_none().0) } } #[doc(alias = "gst_video_time_code_increment_frame")] pub fn increment_frame(&mut self) { unsafe { ffi::gst_video_time_code_increment_frame(self.to_glib_none_mut().0); } } #[doc(alias = "gst_video_time_code_nsec_since_daily_jam")] #[doc(alias = "nsec_since_daily_jam")] pub fn time_since_daily_jam(&self) -> gst::ClockTime { gst::ClockTime::from_nseconds(unsafe { ffi::gst_video_time_code_nsec_since_daily_jam(self.to_glib_none().0) }) } #[doc(alias = "gst_video_time_code_to_date_time")] pub fn to_date_time(&self) -> Result { unsafe { match from_glib_full(ffi::gst_video_time_code_to_date_time(self.to_glib_none().0)) { Some(d) => Ok(d), None => Err(glib::bool_error!( "Failed to convert VideoTimeCode to date time" )), } } } } macro_rules! 
generic_impl { ($name:ident) => { impl $name { #[inline] pub fn hours(&self) -> u32 { self.inner.hours } #[inline] pub fn minutes(&self) -> u32 { self.inner.minutes } #[inline] pub fn seconds(&self) -> u32 { self.inner.seconds } #[inline] pub fn frames(&self) -> u32 { self.inner.frames } #[inline] pub fn field_count(&self) -> u32 { self.inner.field_count } #[inline] pub fn fps(&self) -> gst::Fraction { ( self.inner.config.fps_n as i32, self.inner.config.fps_d as i32, ) .into() } #[inline] pub fn flags(&self) -> VideoTimeCodeFlags { unsafe { from_glib(self.inner.config.flags) } } #[inline] pub fn latest_daily_jam(&self) -> Option { unsafe { from_glib_none(self.inner.config.latest_daily_jam) } } #[inline] pub fn set_latest_daily_jam(&mut self, latest_daily_jam: Option) { unsafe { if !self.inner.config.latest_daily_jam.is_null() { glib::ffi::g_date_time_unref(self.inner.config.latest_daily_jam); } self.inner.config.latest_daily_jam = latest_daily_jam.into_glib_ptr(); } } } impl fmt::Debug for $name { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct(stringify!($name)) .field("fps", &self.fps()) .field("flags", &self.flags()) .field("latest_daily_jam", &self.latest_daily_jam()) .field("hours", &self.hours()) .field("minutes", &self.minutes()) .field("seconds", &self.seconds()) .field("frames", &self.frames()) .field("field_count", &self.field_count()) .finish() } } impl fmt::Display for $name { #[inline] fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { let s = unsafe { glib::GString::from_glib_full(ffi::gst_video_time_code_to_string( self.to_glib_none().0, )) }; f.write_str(&s) } } unsafe impl Send for $name {} unsafe impl Sync for $name {} }; } generic_impl!(VideoTimeCode); generic_impl!(ValidVideoTimeCode); impl StaticType for ValidVideoTimeCode { #[inline] fn static_type() -> glib::Type { unsafe { from_glib(ffi::gst_video_time_code_get_type()) } } } #[doc(hidden)] impl glib::value::ToValue for ValidVideoTimeCode { fn to_value(&self) -> glib::Value { let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_boxed( value.to_glib_none_mut().0, self.to_glib_none().0 as *mut _, ) } value } fn value_type(&self) -> glib::Type { Self::static_type() } } #[doc(hidden)] impl glib::value::ToValueOptional for ValidVideoTimeCode { fn to_value_optional(s: Option<&Self>) -> glib::Value { skip_assert_initialized!(); let mut value = glib::Value::for_value_type::(); unsafe { glib::gobject_ffi::g_value_set_boxed( value.to_glib_none_mut().0, s.to_glib_none().0 as *mut _, ) } value } } #[doc(hidden)] impl From for glib::Value { fn from(v: ValidVideoTimeCode) -> glib::Value { skip_assert_initialized!(); glib::value::ToValue::to_value(&v) } } impl str::FromStr for VideoTimeCode { type Err = glib::error::BoolError; #[doc(alias = "gst_video_time_code_new_from_string")] fn from_str(s: &str) -> Result { assert_initialized_main_thread!(); unsafe { Option::::from_glib_full(ffi::gst_video_time_code_new_from_string( s.to_glib_none().0, )) .ok_or_else(|| glib::bool_error!("Failed to create VideoTimeCode from string")) } } } impl PartialEq for ValidVideoTimeCode { #[inline] fn eq(&self, other: &Self) -> bool { self.compare(other) == 0 } } impl Eq for ValidVideoTimeCode {} impl PartialOrd for ValidVideoTimeCode { #[inline] fn partial_cmp(&self, other: &Self) -> Option { Some(self.cmp(other)) } } impl Ord for ValidVideoTimeCode { #[inline] fn cmp(&self, other: &Self) -> cmp::Ordering { self.compare(other).cmp(&0) } } impl From for VideoTimeCode { #[inline] fn from(v: 
ValidVideoTimeCode) -> Self { skip_assert_initialized!(); // Use ManuallyDrop here to prevent the Drop impl of VideoTimeCode // from running as we don't move v.0 out here but copy it. // GstVideoTimeCode implements Copy. let v = mem::ManuallyDrop::new(v); Self { inner: v.inner } } } #[repr(transparent)] #[doc(alias = "GstVideoTimeCodeMeta")] pub struct VideoTimeCodeMeta(ffi::GstVideoTimeCodeMeta); unsafe impl Send for VideoTimeCodeMeta {} unsafe impl Sync for VideoTimeCodeMeta {} impl VideoTimeCodeMeta { #[doc(alias = "gst_buffer_add_video_time_code_meta")] pub fn add<'a>( buffer: &'a mut gst::BufferRef, tc: &ValidVideoTimeCode, ) -> gst::MetaRefMut<'a, Self, gst::meta::Standalone> { skip_assert_initialized!(); unsafe { let meta = ffi::gst_buffer_add_video_time_code_meta( buffer.as_mut_ptr(), tc.to_glib_none().0 as *mut _, ); Self::from_mut_ptr(buffer, meta) } } #[doc(alias = "get_tc")] #[inline] pub fn tc(&self) -> ValidVideoTimeCode { unsafe { ValidVideoTimeCode::from_glib_none(&self.0.tc as *const _) } } #[inline] pub fn set_tc(&mut self, tc: ValidVideoTimeCode) { #![allow(clippy::cast_ptr_alignment)] unsafe { ffi::gst_video_time_code_clear(&mut self.0.tc); // Use ManuallyDrop here to prevent the Drop impl of VideoTimeCode // from running as we don't move tc.0 out here but copy it. // GstVideoTimeCode implements Copy. let tc = mem::ManuallyDrop::new(tc); self.0.tc = tc.inner; } } } unsafe impl MetaAPI for VideoTimeCodeMeta { type GstType = ffi::GstVideoTimeCodeMeta; #[doc(alias = "gst_video_time_code_meta_api_get_type")] #[inline] fn meta_api() -> glib::Type { unsafe { from_glib(ffi::gst_video_time_code_meta_api_get_type()) } } } impl fmt::Debug for VideoTimeCodeMeta { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoTimeCodeMeta") .field("tc", &self.tc()) .finish() } } #[cfg(feature = "v1_16")] #[cfg(test)] mod tests { #[test] fn test_add_get_set_meta() { gst::init().unwrap(); let mut buffer = gst::Buffer::new(); { let datetime = glib::DateTime::from_utc(2021, 2, 4, 10, 53, 17.0).expect("can't create datetime"); let time_code = crate::VideoTimeCode::from_date_time( gst::Fraction::new(30, 1), &datetime, crate::VideoTimeCodeFlags::empty(), 0, ) .expect("can't create timecode"); drop(datetime); let mut meta = crate::VideoTimeCodeMeta::add( buffer.get_mut().unwrap(), &time_code.try_into().expect("invalid timecode"), ); let datetime = glib::DateTime::from_utc(2021, 2, 4, 10, 53, 17.0).expect("can't create datetime"); let mut time_code_2 = crate::ValidVideoTimeCode::try_from( crate::VideoTimeCode::from_date_time( gst::Fraction::new(30, 1), &datetime, crate::VideoTimeCodeFlags::empty(), 0, ) .expect("can't create timecode"), ) .expect("invalid timecode"); assert_eq!(meta.tc(), time_code_2); time_code_2.increment_frame(); assert_eq!(meta.tc().frames() + 1, time_code_2.frames()); meta.set_tc(time_code_2.clone()); assert_eq!(meta.tc(), time_code_2); } } } gstreamer-video-0.23.5/src/video_time_code_interval.rs000064400000000000000000000073631046102023000212000ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. use std::{cmp, fmt, mem, str}; use crate::ffi; use glib::translate::*; glib::wrapper! 
{ #[doc(alias = "GstVideoTimeCodeInterval")] pub struct VideoTimeCodeInterval(BoxedInline); match fn { type_ => || ffi::gst_video_time_code_interval_get_type(), } } impl VideoTimeCodeInterval { pub fn new(hours: u32, minutes: u32, seconds: u32, frames: u32) -> Self { assert_initialized_main_thread!(); unsafe { let mut v = mem::MaybeUninit::uninit(); ffi::gst_video_time_code_interval_init(v.as_mut_ptr(), hours, minutes, seconds, frames); Self { inner: v.assume_init(), } } } #[doc(alias = "get_hours")] pub fn hours(&self) -> u32 { self.inner.hours } pub fn set_hours(&mut self, hours: u32) { self.inner.hours = hours } #[doc(alias = "get_minutes")] pub fn minutes(&self) -> u32 { self.inner.minutes } pub fn set_minutes(&mut self, minutes: u32) { assert!(minutes < 60); self.inner.minutes = minutes } #[doc(alias = "get_seconds")] pub fn seconds(&self) -> u32 { self.inner.seconds } pub fn set_seconds(&mut self, seconds: u32) { assert!(seconds < 60); self.inner.seconds = seconds } #[doc(alias = "get_frames")] pub fn frames(&self) -> u32 { self.inner.frames } pub fn set_frames(&mut self, frames: u32) { self.inner.frames = frames } } unsafe impl Send for VideoTimeCodeInterval {} unsafe impl Sync for VideoTimeCodeInterval {} impl PartialEq for VideoTimeCodeInterval { fn eq(&self, other: &Self) -> bool { self.inner.hours == other.inner.hours && self.inner.minutes == other.inner.minutes && self.inner.seconds == other.inner.seconds && self.inner.frames == other.inner.frames } } impl Eq for VideoTimeCodeInterval {} impl PartialOrd for VideoTimeCodeInterval { #[inline] fn partial_cmp(&self, other: &Self) -> Option { Some(self.cmp(other)) } } impl Ord for VideoTimeCodeInterval { #[inline] fn cmp(&self, other: &Self) -> cmp::Ordering { self.inner .hours .cmp(&other.inner.hours) .then_with(|| self.inner.minutes.cmp(&other.inner.minutes)) .then_with(|| self.inner.seconds.cmp(&other.inner.seconds)) .then_with(|| self.inner.frames.cmp(&other.inner.frames)) } } impl fmt::Debug for VideoTimeCodeInterval { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoTimeCodeInterval") .field("hours", &self.inner.hours) .field("minutes", &self.inner.minutes) .field("seconds", &self.inner.seconds) .field("frames", &self.inner.frames) .finish() } } impl fmt::Display for VideoTimeCodeInterval { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { write!( f, "{:02}:{:02}:{:02}:{:02}", self.inner.hours, self.inner.minutes, self.inner.seconds, self.inner.frames ) } } impl str::FromStr for VideoTimeCodeInterval { type Err = glib::error::BoolError; #[doc(alias = "gst_video_time_code_interval_new_from_string")] fn from_str(s: &str) -> Result { assert_initialized_main_thread!(); unsafe { Option::::from_glib_full(ffi::gst_video_time_code_interval_new_from_string( s.to_glib_none().0, )) .ok_or_else(|| glib::bool_error!("Failed to create VideoTimeCodeInterval from string")) } } } gstreamer-video-0.23.5/src/video_vbi.rs000064400000000000000000000023331046102023000161140ustar 00000000000000use crate::VideoFormat; pub(super) const VBI_HD_MIN_PIXEL_WIDTH: u32 = 1280; // rustdoc-stripper-ignore-next /// Video Vertical Blanking Interval related Errors. #[derive(thiserror::Error, Clone, Copy, Debug, Eq, PartialEq)] pub enum VideoVBIError { #[error("Format and/or pixel_width is not supported")] Unsupported, #[error("Not enough space left in the current line")] NotEnoughSpace, #[error("Not enough data left in the current line")] NotEnoughData, #[error("Insufficient line buffer length {found}. 
Expected: {expected}")] InsufficientLineBufLen { found: usize, expected: usize }, } // rustdoc-stripper-ignore-next /// Returns the buffer length needed to store the line. pub(super) fn line_buffer_len(format: VideoFormat, width: u32) -> usize { skip_assert_initialized!(); // Taken from gst-plugins-base/gst-libs/gst/video/video-info.c:fill_planes match format { VideoFormat::V210 => ((width as usize + 47) / 48) * 128, VideoFormat::Uyvy => { // round up width to the next multiple of 4 // FIXME: {integer}::next_multiple_of was stabilised in rustc 1.73.0 ((width as usize * 2) + 3) & !3 } _ => unreachable!(), } } gstreamer-video-0.23.5/src/video_vbi_encoder.rs000064400000000000000000000516321046102023000176210ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. use crate::{ffi, VideoFormat}; use glib::translate::*; use crate::video_vbi::line_buffer_len; use crate::{VideoAncillaryDID, VideoAncillaryDID16, VideoVBIError, VBI_HD_MIN_PIXEL_WIDTH}; glib::wrapper! { #[derive(Debug, PartialEq, Eq, PartialOrd, Ord, Hash)] struct VideoVBIEncoderInner(Boxed); match fn { copy => |ptr| ffi::gst_video_vbi_encoder_copy(ptr), free => |ptr| ffi::gst_video_vbi_encoder_free(ptr), type_ => || ffi::gst_video_vbi_encoder_get_type(), } } #[derive(Debug, PartialEq, Eq, PartialOrd, Ord, Hash)] pub struct VideoVBIEncoder { inner: VideoVBIEncoderInner, format: VideoFormat, pixel_width: u32, line_buffer_len: usize, anc_len: usize, } unsafe impl Send for VideoVBIEncoder {} unsafe impl Sync for VideoVBIEncoder {} #[derive(Clone, Copy, Debug, Eq, PartialEq)] pub enum VideoAFDDescriptionMode { Composite, Component, } impl VideoAFDDescriptionMode { pub fn is_composite(&self) -> bool { matches!(self, VideoAFDDescriptionMode::Composite) } pub fn is_component(&self) -> bool { matches!(self, VideoAFDDescriptionMode::Component) } } impl VideoVBIEncoder { #[doc(alias = "gst_video_vbi_encoder_new")] pub fn try_new( format: VideoFormat, pixel_width: u32, ) -> Result { skip_assert_initialized!(); let res: Option = unsafe { from_glib_full(ffi::gst_video_vbi_encoder_new( format.into_glib(), pixel_width, )) }; Ok(VideoVBIEncoder { inner: res.ok_or(VideoVBIError::Unsupported)?, format, pixel_width, line_buffer_len: line_buffer_len(format, pixel_width), anc_len: 0, }) } // rustdoc-stripper-ignore-next /// Adds the provided ancillary data as a DID and block number AFD. pub fn add_did_ancillary( &mut self, adf_mode: VideoAFDDescriptionMode, did: VideoAncillaryDID, block_number: u8, data: &[u8], ) -> Result<(), VideoVBIError> { self.add_ancillary(adf_mode, did.into_glib() as u8, block_number, data) } // rustdoc-stripper-ignore-next /// Adds the provided ancillary data as a DID16 (DID & SDID) AFD. 
pub fn add_did16_ancillary( &mut self, adf_mode: VideoAFDDescriptionMode, did16: VideoAncillaryDID16, data: &[u8], ) -> Result<(), VideoVBIError> { let did16 = did16.into_glib(); self.add_ancillary( adf_mode, ((did16 & 0xff00) >> 8) as u8, (did16 & 0xff) as u8, data, ) } #[doc(alias = "gst_video_vbi_encoder_add_ancillary")] pub fn add_ancillary( &mut self, adf_mode: VideoAFDDescriptionMode, did: u8, sdid_block_number: u8, data: &[u8], ) -> Result<(), VideoVBIError> { let data_count = data.len() as _; let res: bool = unsafe { from_glib(ffi::gst_video_vbi_encoder_add_ancillary( self.inner.to_glib_none_mut().0, adf_mode.is_composite().into_glib(), did, sdid_block_number, data.to_glib_none().0, data_count, )) }; if !res { return Err(VideoVBIError::NotEnoughSpace); } // AFD: 1 byte (+2 if component) // DID + SDID_block_number + Data Count: 3 bytes // DATA: data_count bytes // Checksum: 1 byte let mut len = 1 + 3 + (data_count as usize) + 1; if adf_mode.is_component() { len += 2; } if matches!(self.format, VideoFormat::V210) { // 10bits payload on 16bits for now: will be packed when writing the line len *= 2; } self.anc_len += len; Ok(()) } // rustdoc-stripper-ignore-next /// Returns the buffer length needed to store the line. pub fn line_buffer_len(&self) -> usize { self.line_buffer_len } // rustdoc-stripper-ignore-next /// Writes the ancillaries encoded for VBI to the provided buffer. /// /// Use [`Self::line_buffer_len`] to get the expected buffer length. /// /// Resets the internal state, so this [`VideoVBIEncoder`] can be reused for /// subsequent VBI encodings. /// /// # Returns /// /// - `Ok` with the written length in bytes in the line buffer containing the encoded /// ancilliaries previously added using [`VideoVBIEncoder::add_ancillary`], /// [`VideoVBIEncoder::add_did_ancillary`] or [`VideoVBIEncoder::add_did16_ancillary`]. /// - `Err` if the ancillary could not be added. 
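///
/// # Example
///
/// A minimal sketch of draining the encoder into a line buffer sized with
/// [`Self::line_buffer_len`], assuming `encoder` already holds ancillaries added
/// via the methods above:
///
/// ```ignore
/// // Allocate a line buffer of the expected length, write the pending
/// // ancillaries into it, then keep only the written prefix.
/// let mut line = vec![0u8; encoder.line_buffer_len()];
/// let written = encoder.write_line(line.as_mut_slice())?;
/// let encoded = &line[..written];
/// ```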
#[doc(alias = "gst_video_vbi_encoder_write_line")] pub fn write_line(&mut self, data: &mut [u8]) -> Result { if data.len() < self.line_buffer_len { return Err(VideoVBIError::InsufficientLineBufLen { found: data.len(), expected: self.line_buffer_len, }); } unsafe { let dest = data.as_mut_ptr(); ffi::gst_video_vbi_encoder_write_line(self.inner.to_glib_none_mut().0, dest); } let mut anc_len = std::mem::take(&mut self.anc_len); match self.format { VideoFormat::V210 => { // Anc data consists in 10bits stored in 16bits word let word_count = anc_len / 2; if self.pixel_width < VBI_HD_MIN_PIXEL_WIDTH { // SD: Packs 12x 10bits data in 4x 32bits word anc_len = 4 * 4 * ((word_count / 12) + if word_count % 12 == 0 { 0 } else { 1 }); } else { // HD: Packs 3x 10bits data in 1x 32bits word interleaving UV and Y components // (where Y starts at buffer offset 0 and UV starts at buffer offset pixel_width) // so we get 6 (uv,y) pairs every 4x 32bits word in the resulting line // FIXME: {integer}::div_ceil was stabilised in rustc 1.73.0 let pair_count = usize::min(word_count, self.pixel_width as usize); anc_len = 4 * 4 * ((pair_count / 6) + if pair_count % 6 == 0 { 0 } else { 1 }); } } VideoFormat::Uyvy => { // Anc data stored as bytes if self.pixel_width < VBI_HD_MIN_PIXEL_WIDTH { // SD: Stores 4x bytes in 4x bytes let's keep 32 bits alignment anc_len = 4 * ((anc_len / 4) + if anc_len % 4 == 0 { 0 } else { 1 }); } else { // HD: Stores 4x bytes in 4x bytes interleaving UV and Y components // (where Y starts at buffer offset 0 and UV starts at buffer offset pixel_width) // so we get 2 (uv,y) pairs every 4x bytes in the resulting line // let's keep 32 bits alignment // FIXME: {integer}::div_ceil was stabilised in rustc 1.73.0 let pair_count = usize::min(anc_len, self.pixel_width as usize); anc_len = 4 * ((pair_count / 2) + if pair_count % 2 == 0 { 0 } else { 1 }); } } _ => unreachable!(), } assert!(anc_len < self.line_buffer_len); Ok(anc_len) } } impl<'a> TryFrom<&'a crate::VideoInfo> for VideoVBIEncoder { type Error = VideoVBIError; fn try_from(info: &'a crate::VideoInfo) -> Result { skip_assert_initialized!(); VideoVBIEncoder::try_new(info.format(), info.width()) } } #[cfg(test)] mod tests { use super::*; #[test] fn cea608_component() { let mut encoder = VideoVBIEncoder::try_new(VideoFormat::V210, VBI_HD_MIN_PIXEL_WIDTH).unwrap(); encoder .add_did16_ancillary( VideoAFDDescriptionMode::Component, VideoAncillaryDID16::S334Eia608, &[0x80, 0x94, 0x2c], ) .unwrap(); let mut buf = vec![0; encoder.line_buffer_len()]; let anc_len = encoder.write_line(buf.as_mut_slice()).unwrap(); assert_eq!(32, anc_len); assert_eq!( buf[0..anc_len], [ 0x00, 0x00, 0x00, 0x00, 0xff, 0x03, 0xf0, 0x3f, 0x00, 0x84, 0x05, 0x00, 0x02, 0x01, 0x30, 0x20, 0x00, 0x00, 0x06, 0x00, 0x94, 0x01, 0xc0, 0x12, 0x00, 0x98, 0x0a, 0x00, 0x00, 0x00, 0x00, 0x00 ] ); } #[test] fn cea608_component_sd() { let mut encoder = VideoVBIEncoder::try_new(VideoFormat::V210, 768).unwrap(); encoder .add_did16_ancillary( VideoAFDDescriptionMode::Component, VideoAncillaryDID16::S334Eia608, &[0x80, 0x94, 0x2c], ) .unwrap(); let mut buf = vec![0; encoder.line_buffer_len()]; let anc_len = encoder.write_line(buf.as_mut_slice()).unwrap(); assert_eq!(16, anc_len); assert_eq!( buf[0..anc_len], [ 0x00, 0xfc, 0xff, 0x3f, 0x61, 0x09, 0x34, 0x20, 0x80, 0x51, 0xc6, 0x12, 0xa6, 0x02, 0x00, 0x00 ] ); } #[test] fn cea608_composite() { let mut encoder = VideoVBIEncoder::try_new(VideoFormat::V210, VBI_HD_MIN_PIXEL_WIDTH).unwrap(); encoder .add_did16_ancillary( 
VideoAFDDescriptionMode::Composite, VideoAncillaryDID16::S334Eia608, &[0x15, 0x94, 0x2c], ) .unwrap(); let mut buf = vec![0; encoder.line_buffer_len()]; let anc_len = encoder.write_line(buf.as_mut_slice()).unwrap(); assert_eq!(32, anc_len); assert_eq!( buf[0..anc_len], [ 0x00, 0xf0, 0x0f, 0x00, 0x61, 0x01, 0x20, 0x10, 0x00, 0x0c, 0x08, 0x00, 0x15, 0x01, 0x40, 0x19, 0x00, 0xb0, 0x04, 0x00, 0x3b, 0x02, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 ] ); } #[test] fn cea608_composite_sd() { let mut encoder = VideoVBIEncoder::try_new(VideoFormat::V210, 768).unwrap(); encoder .add_did16_ancillary( VideoAFDDescriptionMode::Composite, VideoAncillaryDID16::S334Eia608, &[0x15, 0x94, 0x2c], ) .unwrap(); let mut buf = vec![0; encoder.line_buffer_len()]; let anc_len = encoder.write_line(buf.as_mut_slice()).unwrap(); assert_eq!(16, anc_len); assert_eq!( buf[0..anc_len], [ 0xfc, 0x87, 0x25, 0x10, 0x03, 0x56, 0x44, 0x19, 0x2c, 0xed, 0x08, 0x00, 0x00, 0x00, 0x00, 0x00 ] ); } #[test] fn cea608_component_uyvy() { let mut encoder = VideoVBIEncoder::try_new(VideoFormat::Uyvy, VBI_HD_MIN_PIXEL_WIDTH).unwrap(); encoder .add_did16_ancillary( VideoAFDDescriptionMode::Component, VideoAncillaryDID16::S334Eia608, &[0x80, 0x94, 0x2c], ) .unwrap(); let mut buf = vec![0; encoder.line_buffer_len()]; let anc_len = encoder.write_line(buf.as_mut_slice()).unwrap(); assert_eq!(20, anc_len); assert_eq!( buf[0..anc_len], [ 0x00, 0x00, 0x00, 0xff, 0x00, 0xff, 0x00, 0x61, 0x00, 0x02, 0x00, 0x03, 0x00, 0x80, 0x00, 0x94, 0x00, 0x2c, 0x00, 0xa6 ] ); } #[test] fn cea608_component_sd_uyvy() { let mut encoder = VideoVBIEncoder::try_new(VideoFormat::Uyvy, 768).unwrap(); encoder .add_did16_ancillary( VideoAFDDescriptionMode::Component, VideoAncillaryDID16::S334Eia608, &[0x80, 0x94, 0x2c], ) .unwrap(); let mut buf = vec![0; encoder.line_buffer_len()]; let anc_len = encoder.write_line(buf.as_mut_slice()).unwrap(); assert_eq!(12, anc_len); assert_eq!( buf[0..anc_len], [0x00, 0xff, 0xff, 0x61, 0x02, 0x03, 0x80, 0x94, 0x2c, 0xa6, 0x00, 0x00] ); } #[test] fn cea608_composite_uyvy() { let mut encoder = VideoVBIEncoder::try_new(VideoFormat::Uyvy, VBI_HD_MIN_PIXEL_WIDTH).unwrap(); encoder .add_did16_ancillary( VideoAFDDescriptionMode::Composite, VideoAncillaryDID16::S334Eia608, &[0x15, 0x94, 0x2c], ) .unwrap(); let mut buf = vec![0; encoder.line_buffer_len()]; let anc_len = encoder.write_line(buf.as_mut_slice()).unwrap(); assert_eq!(16, anc_len); assert_eq!( buf[0..anc_len], [ 0x00, 0xfc, 0x00, 0x61, 0x00, 0x02, 0x00, 0x03, 0x00, 0x15, 0x00, 0x94, 0x00, 0x2c, 0x00, 0x3b ] ); } #[test] fn cea608_composite_sd_uyvy() { let mut encoder = VideoVBIEncoder::try_new(VideoFormat::Uyvy, 768).unwrap(); encoder .add_did16_ancillary( VideoAFDDescriptionMode::Composite, VideoAncillaryDID16::S334Eia608, &[0x15, 0x94, 0x2c], ) .unwrap(); let mut buf = vec![0; encoder.line_buffer_len()]; let anc_len = encoder.write_line(buf.as_mut_slice()).unwrap(); assert_eq!(8, anc_len); assert_eq!( buf[0..anc_len], [0xfc, 0x61, 0x02, 0x03, 0x15, 0x94, 0x2c, 0x3b] ); } #[test] fn insufficient_line_buf_len() { let mut encoder = VideoVBIEncoder::try_new(VideoFormat::V210, VBI_HD_MIN_PIXEL_WIDTH).unwrap(); encoder .add_did16_ancillary( VideoAFDDescriptionMode::Component, VideoAncillaryDID16::S334Eia608, &[0x80, 0x94, 0x2c], ) .unwrap(); let mut buf = vec![0; 10]; assert_eq!( encoder.write_line(buf.as_mut_slice()).unwrap_err(), VideoVBIError::InsufficientLineBufLen { found: 10, expected: encoder.line_buffer_len() }, ); } #[test] fn cea708_component() { let mut encoder 
= VideoVBIEncoder::try_new(VideoFormat::V210, VBI_HD_MIN_PIXEL_WIDTH).unwrap(); encoder .add_did16_ancillary( VideoAFDDescriptionMode::Component, VideoAncillaryDID16::S334Eia708, &[ 0x96, 0x69, 0x55, 0x3f, 0x43, 0x00, 0x00, 0x72, 0xf8, 0xfc, 0x94, 0x2c, 0xf9, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0x74, 0x00, 0x00, 0x1b, ], ) .unwrap(); let mut buf = vec![0; encoder.line_buffer_len()]; let anc_len = encoder.write_line(buf.as_mut_slice()).unwrap(); assert_eq!(256, anc_len); assert_eq!( buf[0..anc_len], [ 0x00, 0x00, 0x00, 0x00, 0xff, 0x03, 0xf0, 0x3f, 0x00, 0x84, 0x05, 0x00, 0x01, 0x01, 0x50, 0x25, 0x00, 0x58, 0x0a, 0x00, 0x69, 0x02, 0x50, 0x25, 0x00, 0xfc, 0x08, 0x00, 0x43, 0x01, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0x72, 0x02, 0x80, 0x1f, 0x00, 0xf0, 0x0b, 0x00, 0x94, 0x01, 0xc0, 0x12, 0x00, 0xe4, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xd0, 0x09, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0x6c, 0x08, 0x00, 0xb7, 0x02, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 ] ); } #[test] fn cea608_and_cea708_component() { let mut encoder = VideoVBIEncoder::try_new(VideoFormat::V210, VBI_HD_MIN_PIXEL_WIDTH).unwrap(); encoder .add_did16_ancillary( VideoAFDDescriptionMode::Component, VideoAncillaryDID16::S334Eia608, &[0x80, 0x94, 0x2c], ) .unwrap(); encoder .add_did16_ancillary( VideoAFDDescriptionMode::Component, VideoAncillaryDID16::S334Eia708, &[ 0x96, 0x69, 0x55, 0x3f, 0x43, 0x00, 0x00, 0x72, 0xf8, 0xfc, 0x94, 0x2c, 0xf9, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0x74, 0x00, 0x00, 0x1b, ], ) .unwrap(); let mut buf = vec![0; encoder.line_buffer_len()]; let anc_len = encoder.write_line(buf.as_mut_slice()).unwrap(); assert_eq!(272, anc_len); assert_eq!( buf[0..anc_len], [ 0x00, 0x00, 0x00, 0x00, 0xff, 0x03, 0xf0, 0x3f, 0x00, 0x84, 
0x05, 0x00, 0x02, 0x01, 0x30, 0x20, 0x00, 0x00, 0x06, 0x00, 0x94, 0x01, 0xc0, 0x12, 0x00, 0x98, 0x0a, 0x00, 0x00, 0x00, 0xf0, 0x3f, 0x00, 0xfc, 0x0f, 0x00, 0x61, 0x01, 0x10, 0x10, 0x00, 0x54, 0x09, 0x00, 0x96, 0x02, 0x90, 0x26, 0x00, 0x54, 0x09, 0x00, 0x3f, 0x02, 0x30, 0x14, 0x00, 0x00, 0x08, 0x00, 0x00, 0x02, 0x20, 0x27, 0x00, 0xe0, 0x07, 0x00, 0xfc, 0x02, 0x40, 0x19, 0x00, 0xb0, 0x04, 0x00, 0xf9, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0x74, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0x1b, 0x02, 0x70, 0x2b ] ); } } gstreamer-video-0.23.5/src/video_vbi_parser.rs000064400000000000000000000334051046102023000174740ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. use crate::{ffi, VideoFormat}; use glib::translate::*; use std::fmt; use crate::video_vbi::line_buffer_len; use crate::{VideoAncillaryDID, VideoAncillaryDID16, VideoVBIError}; glib::wrapper! { #[doc(alias = "GstVideoAncillary")] pub struct VideoAncillary(BoxedInline); } impl VideoAncillary { pub fn did_u8(&self) -> u8 { self.inner.DID } pub fn did(&self) -> VideoAncillaryDID { unsafe { VideoAncillaryDID::from_glib(self.inner.DID as ffi::GstVideoAncillaryDID) } } pub fn sdid_block_number(&self) -> u8 { self.inner.SDID_block_number } pub fn did16(&self) -> VideoAncillaryDID16 { unsafe { VideoAncillaryDID16::from_glib( (((self.inner.DID as u16) << 8) + self.inner.SDID_block_number as u16) as ffi::GstVideoAncillaryDID16, ) } } pub fn len(&self) -> usize { self.inner.data_count as usize } pub fn is_empty(&self) -> bool { self.inner.data_count == 0 } pub fn data(&self) -> &[u8] { &self.inner.data[0..(self.inner.data_count as usize)] } } impl fmt::Debug for VideoAncillary { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { f.debug_struct("VideoAncillary") .field("did", &self.did()) .field("sdid_block_number", &self.sdid_block_number()) .field("did16", &self.did16()) .field("data_count", &self.inner.data_count) .finish() } } glib::wrapper! 
{
    #[derive(Debug, PartialEq, Eq, PartialOrd, Ord, Hash)]
    struct VideoVBIParserInner(Boxed<ffi::GstVideoVBIParser>);

    match fn {
        copy => |ptr| ffi::gst_video_vbi_parser_copy(ptr),
        free => |ptr| ffi::gst_video_vbi_parser_free(ptr),
        type_ => || ffi::gst_video_vbi_parser_get_type(),
    }
}

#[derive(Debug, PartialEq, Eq, PartialOrd, Ord, Hash)]
pub struct VideoVBIParser {
    inner: VideoVBIParserInner,
    line_buffer_len: usize,
}

impl VideoVBIParser {
    #[doc(alias = "gst_video_vbi_parser_new")]
    pub fn try_new(format: VideoFormat, pixel_width: u32) -> Result<VideoVBIParser, VideoVBIError> {
        skip_assert_initialized!();
        let res: Option<VideoVBIParserInner> = unsafe {
            from_glib_full(ffi::gst_video_vbi_parser_new(
                format.into_glib(),
                pixel_width,
            ))
        };

        Ok(VideoVBIParser {
            inner: res.ok_or(VideoVBIError::Unsupported)?,
            line_buffer_len: line_buffer_len(format, pixel_width),
        })
    }

    // rustdoc-stripper-ignore-next
    /// Returns the buffer length needed to store the line.
    pub fn line_buffer_len(&self) -> usize {
        self.line_buffer_len
    }

    #[doc(alias = "gst_video_vbi_parser_add_line")]
    pub fn add_line(&mut self, data: &[u8]) -> Result<(), VideoVBIError> {
        if data.len() < self.line_buffer_len {
            return Err(VideoVBIError::InsufficientLineBufLen {
                found: data.len(),
                expected: self.line_buffer_len,
            });
        }

        unsafe {
            let data = data.as_ptr();
            ffi::gst_video_vbi_parser_add_line(self.inner.to_glib_none_mut().0, data);
        }

        Ok(())
    }

    pub fn iter(&mut self) -> AncillaryIter {
        AncillaryIter { parser: self }
    }

    #[doc(alias = "gst_video_vbi_parser_get_ancillary")]
    pub fn next_ancillary(&mut self) -> Option<Result<VideoAncillary, VideoVBIError>> {
        unsafe {
            let mut video_anc = std::mem::MaybeUninit::uninit();
            let res = ffi::gst_video_vbi_parser_get_ancillary(
                self.inner.to_glib_none_mut().0,
                video_anc.as_mut_ptr(),
            );

            match res {
                ffi::GST_VIDEO_VBI_PARSER_RESULT_OK => Some(Ok(VideoAncillary {
                    inner: video_anc.assume_init(),
                })),
                ffi::GST_VIDEO_VBI_PARSER_RESULT_DONE => None,
                ffi::GST_VIDEO_VBI_PARSER_RESULT_ERROR => Some(Err(VideoVBIError::NotEnoughData)),
                _ => unreachable!(),
            }
        }
    }
}

unsafe impl Send for VideoVBIParser {}
unsafe impl Sync for VideoVBIParser {}

impl<'a> TryFrom<&'a crate::VideoInfo> for VideoVBIParser {
    type Error = VideoVBIError;

    fn try_from(info: &'a crate::VideoInfo) -> Result<Self, Self::Error> {
        skip_assert_initialized!();
        VideoVBIParser::try_new(info.format(), info.width())
    }
}

#[derive(Debug)]
pub struct AncillaryIter<'a> {
    parser: &'a mut VideoVBIParser,
}

impl Iterator for AncillaryIter<'_> {
    type Item = Result<VideoAncillary, VideoVBIError>;

    fn next(&mut self) -> Option<Self::Item> {
        self.parser.next_ancillary()
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use crate::VBI_HD_MIN_PIXEL_WIDTH;

    fn init_line_buf(parser: &VideoVBIParser, anc_buf: &[u8]) -> Vec<u8> {
        skip_assert_initialized!();
        let mut line_buf = vec![0; parser.line_buffer_len()];
        line_buf[0..anc_buf.len()].copy_from_slice(anc_buf);
        line_buf
    }

    #[test]
    fn cea608_component() {
        let mut parser =
            VideoVBIParser::try_new(VideoFormat::V210, VBI_HD_MIN_PIXEL_WIDTH).unwrap();
        let line_buf = init_line_buf(
            &parser,
            &[
                0x00, 0x00, 0x00, 0x00, 0xff, 0x03, 0xf0, 0x3f, 0x00, 0x84, 0x05, 0x00, 0x02,
                0x01, 0x30, 0x20, 0x00, 0x00, 0x06, 0x00, 0x94, 0x01, 0xc0, 0x12, 0x00, 0x98,
                0x0a, 0x00, 0x00, 0x00, 0x00, 0x00,
            ],
        );
        parser.add_line(&line_buf).unwrap();

        let video_anc = parser.next_ancillary().unwrap().unwrap();
        assert_eq!(video_anc.did16(), VideoAncillaryDID16::S334Eia608);
        assert_eq!(video_anc.data(), [0x80, 0x94, 0x2c]);

        assert!(parser.next_ancillary().is_none());
    }

    #[test]
    fn cea608_composite() {
        let mut parser =
            VideoVBIParser::try_new(VideoFormat::V210, VBI_HD_MIN_PIXEL_WIDTH).unwrap();
        let line_buf = init_line_buf(
            &parser,
            &[
                0x00, 0xf0, 0x0f,
0x00, 0x61, 0x01, 0x20, 0x10, 0x00, 0x0c, 0x08, 0x00, 0x15, 0x01, 0x40, 0x19, 0x00, 0xb0, 0x04, 0x00, 0x3b, 0x02, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, ], ); parser.add_line(&line_buf).unwrap(); let video_anc = parser.next_ancillary().unwrap().unwrap(); assert_eq!(video_anc.did16(), VideoAncillaryDID16::S334Eia608); assert_eq!(video_anc.data(), [0x15, 0x94, 0x2c]); assert!(parser.next_ancillary().is_none()); } #[test] fn cea608_can_not_parse() { let mut parser = VideoVBIParser::try_new(VideoFormat::V210, VBI_HD_MIN_PIXEL_WIDTH).unwrap(); let line_buf = init_line_buf(&parser, &[0x00, 0xf0, 0x0f, 0x00, 0x61, 0x01, 0x20, 0x10]); parser.add_line(&line_buf).unwrap(); assert!(parser.next_ancillary().is_none()); } #[test] fn cea608_insufficient_line_buf_len() { let mut parser = VideoVBIParser::try_new(VideoFormat::V210, VBI_HD_MIN_PIXEL_WIDTH).unwrap(); let line_buf = vec![0; 10]; assert_eq!( parser.add_line(&line_buf).unwrap_err(), VideoVBIError::InsufficientLineBufLen { found: 10, expected: parser.line_buffer_len() }, ); } #[test] fn cea708_component() { let mut parser = VideoVBIParser::try_new(VideoFormat::V210, VBI_HD_MIN_PIXEL_WIDTH).unwrap(); let line_buf = init_line_buf( &parser, &[ 0x00, 0x00, 0x00, 0x00, 0xff, 0x03, 0xf0, 0x3f, 0x00, 0x84, 0x05, 0x00, 0x01, 0x01, 0x50, 0x25, 0x00, 0x58, 0x0a, 0x00, 0x69, 0x02, 0x50, 0x25, 0x00, 0xfc, 0x08, 0x00, 0x43, 0x01, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0x72, 0x02, 0x80, 0x1f, 0x00, 0xf0, 0x0b, 0x00, 0x94, 0x01, 0xc0, 0x12, 0x00, 0xe4, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xe8, 0x0b, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0xd0, 0x09, 0x00, 0x00, 0x02, 0x00, 0x20, 0x00, 0x6c, 0x08, 0x00, 0xb7, 0x02, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, ], ); parser.add_line(&line_buf).unwrap(); let video_anc = parser.next_ancillary().unwrap().unwrap(); assert_eq!(video_anc.did16(), VideoAncillaryDID16::S334Eia708); assert_eq!( video_anc.data(), [ 0x96, 0x69, 0x55, 0x3f, 0x43, 0x00, 0x00, 0x72, 0xf8, 0xfc, 0x94, 0x2c, 0xf9, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0x74, 0x00, 0x00, 0x1b, ] ); assert!(parser.next_ancillary().is_none()); } #[test] fn 
cea608_and_cea708_component() { let mut parser = VideoVBIParser::try_new(VideoFormat::V210, VBI_HD_MIN_PIXEL_WIDTH).unwrap(); let mut line_buf = vec![0; parser.line_buffer_len()]; let anc_buf = [ 0x00, 0x00, 0x00, 0x00, 0xff, 0x03, 0xf0, 0x3f, 0x00, 0x84, 0x05, 0x00, 0x02, 0x01, 0x30, 0x20, 0x00, 0x00, 0x06, 0x00, 0x94, 0x01, 0xc0, 0x12, 0x00, 0x98, 0x0a, 0x00, 0x00, 0x00, 0xf0, 0x3f, 0x00, 0xfc, 0x0f, 0x00, 0x61, 0x01, 0x10, 0x10, 0x00, 0x54, 0x09, 0x00, 0x96, 0x02, 0x90, 0x26, 0x00, 0x54, 0x09, 0x00, 0x3f, 0x02, 0x30, 0x14, 0x00, 0x00, 0x08, 0x00, 0x00, 0x02, 0x20, 0x27, 0x00, 0xe0, 0x07, 0x00, 0xfc, 0x02, 0x40, 0x19, 0x00, 0xb0, 0x04, 0x00, 0xf9, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0xfa, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0x74, 0x02, 0x00, 0x20, 0x00, 0x00, 0x08, 0x00, 0x1b, 0x02, 0x70, 0x2b, ]; line_buf[0..anc_buf.len()].copy_from_slice(&anc_buf); parser.add_line(&line_buf).unwrap(); let mut anc_iter = parser.iter(); let video_anc = anc_iter.next().unwrap().unwrap(); assert_eq!(video_anc.did16(), VideoAncillaryDID16::S334Eia608); assert_eq!(video_anc.data(), [0x80, 0x94, 0x2c]); let video_anc = anc_iter.next().unwrap().unwrap(); assert_eq!(video_anc.did16(), VideoAncillaryDID16::S334Eia708); assert_eq!( video_anc.data(), [ 0x96, 0x69, 0x55, 0x3f, 0x43, 0x00, 0x00, 0x72, 0xf8, 0xfc, 0x94, 0x2c, 0xf9, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0xfa, 0x00, 0x00, 0x74, 0x00, 0x00, 0x1b, ] ); assert!(anc_iter.next().is_none()); } } gstreamer-video-0.23.5/tests/check_gir.rs000064400000000000000000000003461046102023000164410ustar 00000000000000// Take a look at the license at the top of the repository in the LICENSE file. #[test] fn check_gir_file() { let res = gir_format_check::check_gir_file("Gir.toml"); println!("{res}"); assert_eq!(res.nb_errors, 0); }