live/
live/BasicUsageEnvironment/
live/proxyServer/
live/mediaServer/
live/testProgs/
live/WindowsAudioInputDevice/
live/UsageEnvironment/
live/groupsock/
live/liveMedia/

==> live/configure <==
#!/bin/sh
echo "Whoa! This software distribution does NOT use the normal Unix \"configure\" mechanism for generating a Makefile. For instructions on how to build this software, see ."
echo "Also, please make sure that you're using the most up-to-date version of the source code - available from ."

==> live/config.aix <==
COMPILE_OPTS = $(INCLUDES) -I. -DBSD=1 -O -DTIME_BASE=int -DSOCKLEN_T=socklen_t
C = c
C_COMPILER = cc
C_FLAGS = $(COMPILE_OPTS)
CPP = cpp
CPLUSPLUS_COMPILER = c++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall -DAIX=1
OBJ = o
LINK = c++ -o
LINK_OPTS = -L.
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = ld -o
LIBRARY_LINK_OPTS = $(LINK_OPTS) -r
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =

==> live/COPYING <==
GNU LESSER GENERAL PUBLIC LICENSE
Version 2.1, February 1999
Copyright (C) 1991, 1999 Free Software Foundation, Inc.
59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
[This is the first released version of the Lesser GPL. It also counts
as the successor of the GNU Library Public License, version 2, hence
the version number 2.1.]
Preamble
The licenses for most software are designed to take away your
freedom to share and change it. By contrast, the GNU General Public
Licenses are intended to guarantee your freedom to share and change
free software--to make sure the software is free for all its users.
This license, the Lesser General Public License, applies to some
specially designated software packages--typically libraries--of the
Free Software Foundation and other authors who decide to use it. You
can use it too, but we suggest you first think carefully about whether
this license or the ordinary General Public License is the better
strategy to use in any particular case, based on the explanations below.
When we speak of free software, we are referring to freedom of use,
not price. Our General Public Licenses are designed to make sure that
you have the freedom to distribute copies of free software (and charge
for this service if you wish); that you receive source code or can get
it if you want it; that you can change the software and use pieces of
it in new free programs; and that you are informed that you can do
these things.
To protect your rights, we need to make restrictions that forbid
distributors to deny you these rights or to ask you to surrender these
rights. These restrictions translate to certain responsibilities for
you if you distribute copies of the library or if you modify it.
For example, if you distribute copies of the library, whether gratis
or for a fee, you must give the recipients all the rights that we gave
you. You must make sure that they, too, receive or can get the source
code. If you link other code with the library, you must provide
complete object files to the recipients, so that they can relink them
with the library after making changes to the library and recompiling
it. And you must show them these terms so they know their rights.
We protect your rights with a two-step method: (1) we copyright the
library, and (2) we offer you this license, which gives you legal
permission to copy, distribute and/or modify the library.
To protect each distributor, we want to make it very clear that
there is no warranty for the free library. Also, if the library is
modified by someone else and passed on, the recipients should know
that what they have is not the original version, so that the original
author's reputation will not be affected by problems that might be
introduced by others.
Finally, software patents pose a constant threat to the existence of
any free program. We wish to make sure that a company cannot
effectively restrict the users of a free program by obtaining a
restrictive license from a patent holder. Therefore, we insist that
any patent license obtained for a version of the library must be
consistent with the full freedom of use specified in this license.
Most GNU software, including some libraries, is covered by the
ordinary GNU General Public License. This license, the GNU Lesser
General Public License, applies to certain designated libraries, and
is quite different from the ordinary General Public License. We use
this license for certain libraries in order to permit linking those
libraries into non-free programs.
When a program is linked with a library, whether statically or using
a shared library, the combination of the two is legally speaking a
combined work, a derivative of the original library. The ordinary
General Public License therefore permits such linking only if the
entire combination fits its criteria of freedom. The Lesser General
Public License permits more lax criteria for linking other code with
the library.
We call this license the "Lesser" General Public License because it
does Less to protect the user's freedom than the ordinary General
Public License. It also provides other free software developers Less
of an advantage over competing non-free programs. These disadvantages
are the reason we use the ordinary General Public License for many
libraries. However, the Lesser license provides advantages in certain
special circumstances.
For example, on rare occasions, there may be a special need to
encourage the widest possible use of a certain library, so that it becomes
a de-facto standard. To achieve this, non-free programs must be
allowed to use the library. A more frequent case is that a free
library does the same job as widely used non-free libraries. In this
case, there is little to gain by limiting the free library to free
software only, so we use the Lesser General Public License.
In other cases, permission to use a particular library in non-free
programs enables a greater number of people to use a large body of
free software. For example, permission to use the GNU C Library in
non-free programs enables many more people to use the whole GNU
operating system, as well as its variant, the GNU/Linux operating
system.
Although the Lesser General Public License is Less protective of the
users' freedom, it does ensure that the user of a program that is
linked with the Library has the freedom and the wherewithal to run
that program using a modified version of the Library.
The precise terms and conditions for copying, distribution and
modification follow. Pay close attention to the difference between a
"work based on the library" and a "work that uses the library". The
former contains code derived from the library, whereas the latter must
be combined with the library in order to run.
GNU LESSER GENERAL PUBLIC LICENSE
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
0. This License Agreement applies to any software library or other
program which contains a notice placed by the copyright holder or
other authorized party saying it may be distributed under the terms of
this Lesser General Public License (also called "this License").
Each licensee is addressed as "you".
A "library" means a collection of software functions and/or data
prepared so as to be conveniently linked with application programs
(which use some of those functions and data) to form executables.
The "Library", below, refers to any such software library or work
which has been distributed under these terms. A "work based on the
Library" means either the Library or any derivative work under
copyright law: that is to say, a work containing the Library or a
portion of it, either verbatim or with modifications and/or translated
straightforwardly into another language. (Hereinafter, translation is
included without limitation in the term "modification".)
"Source code" for a work means the preferred form of the work for
making modifications to it. For a library, complete source code means
all the source code for all modules it contains, plus any associated
interface definition files, plus the scripts used to control compilation
and installation of the library.
Activities other than copying, distribution and modification are not
covered by this License; they are outside its scope. The act of
running a program using the Library is not restricted, and output from
such a program is covered only if its contents constitute a work based
on the Library (independent of the use of the Library in a tool for
writing it). Whether that is true depends on what the Library does
and what the program that uses the Library does.
1. You may copy and distribute verbatim copies of the Library's
complete source code as you receive it, in any medium, provided that
you conspicuously and appropriately publish on each copy an
appropriate copyright notice and disclaimer of warranty; keep intact
all the notices that refer to this License and to the absence of any
warranty; and distribute a copy of this License along with the
Library.
You may charge a fee for the physical act of transferring a copy,
and you may at your option offer warranty protection in exchange for a
fee.
2. You may modify your copy or copies of the Library or any portion
of it, thus forming a work based on the Library, and copy and
distribute such modifications or work under the terms of Section 1
above, provided that you also meet all of these conditions:
a) The modified work must itself be a software library.
b) You must cause the files modified to carry prominent notices
stating that you changed the files and the date of any change.
c) You must cause the whole of the work to be licensed at no
charge to all third parties under the terms of this License.
d) If a facility in the modified Library refers to a function or a
table of data to be supplied by an application program that uses
the facility, other than as an argument passed when the facility
is invoked, then you must make a good faith effort to ensure that,
in the event an application does not supply such function or
table, the facility still operates, and performs whatever part of
its purpose remains meaningful.
(For example, a function in a library to compute square roots has
a purpose that is entirely well-defined independent of the
application. Therefore, Subsection 2d requires that any
application-supplied function or table used by this function must
be optional: if the application does not supply it, the square
root function must still compute square roots.)
These requirements apply to the modified work as a whole. If
identifiable sections of that work are not derived from the Library,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works. But when you
distribute the same sections as part of a whole which is a work based
on the Library, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote
it.
Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Library.
In addition, mere aggregation of another work not based on the Library
with the Library (or with a work based on the Library) on a volume of
a storage or distribution medium does not bring the other work under
the scope of this License.
3. You may opt to apply the terms of the ordinary GNU General Public
License instead of this License to a given copy of the Library. To do
this, you must alter all the notices that refer to this License, so
that they refer to the ordinary GNU General Public License, version 2,
instead of to this License. (If a newer version than version 2 of the
ordinary GNU General Public License has appeared, then you can specify
that version instead if you wish.) Do not make any other change in
these notices.
Once this change is made in a given copy, it is irreversible for
that copy, so the ordinary GNU General Public License applies to all
subsequent copies and derivative works made from that copy.
This option is useful when you wish to copy part of the code of
the Library into a program that is not a library.
4. You may copy and distribute the Library (or a portion or
derivative of it, under Section 2) in object code or executable form
under the terms of Sections 1 and 2 above provided that you accompany
it with the complete corresponding machine-readable source code, which
must be distributed under the terms of Sections 1 and 2 above on a
medium customarily used for software interchange.
If distribution of object code is made by offering access to copy
from a designated place, then offering equivalent access to copy the
source code from the same place satisfies the requirement to
distribute the source code, even though third parties are not
compelled to copy the source along with the object code.
5. A program that contains no derivative of any portion of the
Library, but is designed to work with the Library by being compiled or
linked with it, is called a "work that uses the Library". Such a
work, in isolation, is not a derivative work of the Library, and
therefore falls outside the scope of this License.
However, linking a "work that uses the Library" with the Library
creates an executable that is a derivative of the Library (because it
contains portions of the Library), rather than a "work that uses the
library". The executable is therefore covered by this License.
Section 6 states terms for distribution of such executables.
When a "work that uses the Library" uses material from a header file
that is part of the Library, the object code for the work may be a
derivative work of the Library even though the source code is not.
Whether this is true is especially significant if the work can be
linked without the Library, or if the work is itself a library. The
threshold for this to be true is not precisely defined by law.
If such an object file uses only numerical parameters, data
structure layouts and accessors, and small macros and small inline
functions (ten lines or less in length), then the use of the object
file is unrestricted, regardless of whether it is legally a derivative
work. (Executables containing this object code plus portions of the
Library will still fall under Section 6.)
Otherwise, if the work is a derivative of the Library, you may
distribute the object code for the work under the terms of Section 6.
Any executables containing that work also fall under Section 6,
whether or not they are linked directly with the Library itself.
6. As an exception to the Sections above, you may also combine or
link a "work that uses the Library" with the Library to produce a
work containing portions of the Library, and distribute that work
under terms of your choice, provided that the terms permit
modification of the work for the customer's own use and reverse
engineering for debugging such modifications.
You must give prominent notice with each copy of the work that the
Library is used in it and that the Library and its use are covered by
this License. You must supply a copy of this License. If the work
during execution displays copyright notices, you must include the
copyright notice for the Library among them, as well as a reference
directing the user to the copy of this License. Also, you must do one
of these things:
a) Accompany the work with the complete corresponding
machine-readable source code for the Library including whatever
changes were used in the work (which must be distributed under
Sections 1 and 2 above); and, if the work is an executable linked
with the Library, with the complete machine-readable "work that
uses the Library", as object code and/or source code, so that the
user can modify the Library and then relink to produce a modified
executable containing the modified Library. (It is understood
that the user who changes the contents of definitions files in the
Library will not necessarily be able to recompile the application
to use the modified definitions.)
b) Use a suitable shared library mechanism for linking with the
Library. A suitable mechanism is one that (1) uses at run time a
copy of the library already present on the user's computer system,
rather than copying library functions into the executable, and (2)
will operate properly with a modified version of the library, if
the user installs one, as long as the modified version is
interface-compatible with the version that the work was made with.
c) Accompany the work with a written offer, valid for at
least three years, to give the same user the materials
specified in Subsection 6a, above, for a charge no more
than the cost of performing this distribution.
d) If distribution of the work is made by offering access to copy
from a designated place, offer equivalent access to copy the above
specified materials from the same place.
e) Verify that the user has already received a copy of these
materials or that you have already sent this user a copy.
For an executable, the required form of the "work that uses the
Library" must include any data and utility programs needed for
reproducing the executable from it. However, as a special exception,
the materials to be distributed need not include anything that is
normally distributed (in either source or binary form) with the major
components (compiler, kernel, and so on) of the operating system on
which the executable runs, unless that component itself accompanies
the executable.
It may happen that this requirement contradicts the license
restrictions of other proprietary libraries that do not normally
accompany the operating system. Such a contradiction means you cannot
use both them and the Library together in an executable that you
distribute.
7. You may place library facilities that are a work based on the
Library side-by-side in a single library together with other library
facilities not covered by this License, and distribute such a combined
library, provided that the separate distribution of the work based on
the Library and of the other library facilities is otherwise
permitted, and provided that you do these two things:
a) Accompany the combined library with a copy of the same work
based on the Library, uncombined with any other library
facilities. This must be distributed under the terms of the
Sections above.
b) Give prominent notice with the combined library of the fact
that part of it is a work based on the Library, and explaining
where to find the accompanying uncombined form of the same work.
8. You may not copy, modify, sublicense, link with, or distribute
the Library except as expressly provided under this License. Any
attempt otherwise to copy, modify, sublicense, link with, or
distribute the Library is void, and will automatically terminate your
rights under this License. However, parties who have received copies,
or rights, from you under this License will not have their licenses
terminated so long as such parties remain in full compliance.
9. You are not required to accept this License, since you have not
signed it. However, nothing else grants you permission to modify or
distribute the Library or its derivative works. These actions are
prohibited by law if you do not accept this License. Therefore, by
modifying or distributing the Library (or any work based on the
Library), you indicate your acceptance of this License to do so, and
all its terms and conditions for copying, distributing or modifying
the Library or works based on it.
10. Each time you redistribute the Library (or any work based on the
Library), the recipient automatically receives a license from the
original licensor to copy, distribute, link with or modify the Library
subject to these terms and conditions. You may not impose any further
restrictions on the recipients' exercise of the rights granted herein.
You are not responsible for enforcing compliance by third parties with
this License.
11. If, as a consequence of a court judgment or allegation of patent
infringement or for any other reason (not limited to patent issues),
conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot
distribute so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you
may not distribute the Library at all. For example, if a patent
license would not permit royalty-free redistribution of the Library by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Library.
If any portion of this section is held invalid or unenforceable under any
particular circumstance, the balance of the section is intended to apply,
and the section as a whole is intended to apply in other circumstances.
It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system which is
implemented by public license practices. Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.
This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.
12. If the distribution and/or use of the Library is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Library under this License may add
an explicit geographical distribution limitation excluding those countries,
so that distribution is permitted only in or among countries not thus
excluded. In such case, this License incorporates the limitation as if
written in the body of this License.
13. The Free Software Foundation may publish revised and/or new
versions of the Lesser General Public License from time to time.
Such new versions will be similar in spirit to the present version,
but may differ in detail to address new problems or concerns.
Each version is given a distinguishing version number. If the Library
specifies a version number of this License which applies to it and
"any later version", you have the option of following the terms and
conditions either of that version or of any later version published by
the Free Software Foundation. If the Library does not specify a
license version number, you may choose any version ever published by
the Free Software Foundation.
14. If you wish to incorporate parts of the Library into other free
programs whose distribution conditions are incompatible with these,
write to the author to ask for permission. For software which is
copyrighted by the Free Software Foundation, write to the Free
Software Foundation; we sometimes make exceptions for this. Our
decision will be guided by the two goals of preserving the free status
of all derivatives of our free software and of promoting the sharing
and reuse of software generally.
NO WARRANTY
15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO
WARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW.
EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR
OTHER PARTIES PROVIDE THE LIBRARY "AS IS" WITHOUT WARRANTY OF ANY
KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE
LIBRARY IS WITH YOU. SHOULD THE LIBRARY PROVE DEFECTIVE, YOU ASSUME
THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN
WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY
AND/OR REDISTRIBUTE THE LIBRARY AS PERMITTED ABOVE, BE LIABLE TO YOU
FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR
CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE
LIBRARY (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING
RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A
FAILURE OF THE LIBRARY TO OPERATE WITH ANY OTHER SOFTWARE), EVEN IF
SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH
DAMAGES.
END OF TERMS AND CONDITIONS

==> live/genMakefiles <==
#!/bin/sh
usage() {
echo "Usage: $0 <os-platform>"
exit 1
}
if [ $# -ne 1 ]
then
usage $*
fi
for subdir in liveMedia groupsock UsageEnvironment BasicUsageEnvironment testProgs mediaServer proxyServer
do
	cd $subdir
	/bin/rm -f Makefile
	cat Makefile.head ../config.$1 Makefile.tail > Makefile
	chmod a-w Makefile
	cd ..
done
/bin/rm -f Makefile
cat Makefile.head config.$1 Makefile.tail > Makefile
chmod a-w Makefile
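The script above builds every Makefile the same way: it concatenates a shared `Makefile.head`, the chosen `config.<platform>` fragment, and a `Makefile.tail`, then write-protects the result. A minimal self-contained sketch of that mechanism (the `config.demo` fragment and its contents are made up for illustration):

```shell
#!/bin/sh
# Demonstrate the head + config + tail concatenation that genMakefiles
# performs in each subdirectory. Fragment contents here are hypothetical.
dir=$(mktemp -d)
cd "$dir"
printf 'all: greet\n'                  > Makefile.head
printf 'GREETING = hello\n'            > config.demo
printf 'greet:\n\t@echo $(GREETING)\n' > Makefile.tail
cat Makefile.head config.demo Makefile.tail > Makefile
chmod a-w Makefile   # genMakefiles also write-protects the generated file
cat Makefile
```

Because the platform fragment sits in the middle, every variable it sets (compiler names, flags, `LIB_SUFFIX`, and so on) is visible to the rules defined in `Makefile.tail`.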

==> live/config.uClinux <==
CROSS_COMPILE= arc-linux-uclibc-
COMPILE_OPTS = $(INCLUDES) -I. -O2 -DSOCKLEN_T=socklen_t -D_LARGEFILE_SOURCE=1 -D_FILE_OFFSET_BITS=64
C = c
C_COMPILER = $(CROSS_COMPILE)gcc
CFLAGS += $(COMPILE_OPTS)
C_FLAGS = $(CFLAGS)
CPP = cpp
CPLUSPLUS_COMPILER = $(CROSS_COMPILE)g++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall -DBSD=1
CPLUSPLUS_FLAGS += $(CPPFLAGS) -fexceptions
OBJ = o
LINK = $(CROSS_COMPILE)g++ -o
LINK_OPTS = -L. $(LDFLAGS)
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = $(CROSS_COMPILE)ar cr
LIBRARY_LINK_OPTS =
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION = $(CXXLIBS)
LIBS_FOR_GUI_APPLICATION = $(LIBS_FOR_CONSOLE_APPLICATION)
EXE =

==> live/config.sunos <==
COMPILE_OPTS = $(INCLUDES) -I. -DBSD=1 -O
C = c
C_COMPILER = cc
C_FLAGS = $(COMPILE_OPTS)
CPP = cc
CPLUSPLUS_COMPILER = c++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall
OBJ = o
LINK = c++ -o
LINK_OPTS = -L.
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = ld -o
LIBRARY_LINK_OPTS = $(LINK_OPTS) -r -Bstatic
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =

==> live/config.solaris-64bit <==
COMPILE_OPTS = $(INCLUDES) -m64 -I. -O -DSOLARIS -DXLOCALE_NOT_USED -DSOCKLEN_T=socklen_t
C = c
C_COMPILER = cc
C_FLAGS = $(COMPILE_OPTS)
CPP = cpp
CPLUSPLUS_COMPILER = c++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall
OBJ = o
LINK = c++ -m64 -o
LINK_OPTS = -L.
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = ld -o
LIBRARY_LINK_OPTS = $(LINK_OPTS) -64 -r -dn
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION = -lsocket -lnsl
LIBS_FOR_GUI_APPLICATION = $(LIBS_FOR_CONSOLE_APPLICATION)
EXE =

==> live/config.solaris-32bit <==
COMPILE_OPTS = $(INCLUDES) -I. -O -DSOLARIS -DXLOCALE_NOT_USED -DSOCKLEN_T=socklen_t
C = c
C_COMPILER = cc
C_FLAGS = $(COMPILE_OPTS)
CPP = cpp
CPLUSPLUS_COMPILER = c++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall
OBJ = o
LINK = c++ -o
LINK_OPTS = -L.
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = ld -o
LIBRARY_LINK_OPTS = $(LINK_OPTS) -r -dn
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION = -lsocket -lnsl
LIBS_FOR_GUI_APPLICATION = $(LIBS_FOR_CONSOLE_APPLICATION)
EXE =

==> live/config.qnx4 <==
#
# Requires:
# QNX 4.25
# Watcom 10.6
# TCP/IP 5.0
#
COMPILE_OPTS = $(INCLUDES) -I. -D_QNX4 -DBSD -DSOCKLEN_T=uint32_t -I/usr/watcom/10.6/usr/include
C = c
C_COMPILER = cc32
C_FLAGS = $(COMPILE_OPTS)
CPP = cpp
CPLUSPLUS_COMPILER = cc32
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -WC,-xs
OBJ = o
LINK = cc32 -b -M -N30000 -o
LINK_OPTS = -l.
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = wlib -n -b -c
LIBRARY_LINK_OPTS = $(LINK_OPTS)
LIB_SUFFIX = lib
LIBS_FOR_CONSOLE_APPLICATION = -lsocket
LIBS_FOR_GUI_APPLICATION = $(LIBS_FOR_CONSOLE_APPLICATION)
EXE =

==> live/config.openbsd <==
.SUFFIXES: .cpp
COMPILE_OPTS = $(INCLUDES) -I. -DBSD=1 -O -DSOCKLEN_T=socklen_t
C = c
C_COMPILER = cc
C_FLAGS = $(COMPILE_OPTS)
CPP = cpp
CPLUSPLUS_COMPILER = c++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall
OBJ = o
LINK = c++ -o
LINK_OPTS = -L.
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = ld -o
LIBRARY_LINK_OPTS = $(LINK_OPTS) -r
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =

==> live/config.mingw <==
COMPILE_OPTS = $(INCLUDES) -I. -O -DSOCKLEN_T=int -DLOCALE_NOT_USED
C = c
C_COMPILER = $(CC)
C_FLAGS = $(COMPILE_OPTS) -DUSE_OUR_BZERO=1 -D__MINGW32__
CPP = cpp
CPLUSPLUS_COMPILER = $(CXX)
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -D__MINGW32__ -Wall -Wno-deprecated
OBJ = o
LINK = $(CXX) -o
LINK_OPTS = -L.
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = $(LD) -o
LIBRARY_LINK_OPTS = $(LINK_OPTS) -r -Bstatic
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION = -lws2_32
LIBS_FOR_GUI_APPLICATION = -lws2_32
EXE =

==> live/config.macosx-32bit <==
COMPILE_OPTS = -m32 $(INCLUDES) -I. $(EXTRA_LDFLAGS) -DBSD=1 -O -DSOCKLEN_T=socklen_t -DHAVE_SOCKADDR_LEN=1 -DTIME_BASE=int
C = c
C_COMPILER = cc
C_FLAGS = $(COMPILE_OPTS)
CPP = cpp
CPLUSPLUS_COMPILER = c++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall
OBJ = o
LINK = c++ -o
LINK_OPTS = -L. -m32
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = libtool -s -o
LIBRARY_LINK_OPTS =
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =

==> live/config.macosx-before-version-10.4 <==
COMPILE_OPTS = $(INCLUDES) -I. -DBSD=1 -O -DSOCKLEN_T=int -DTIME_BASE=int
C = c
C_COMPILER = cc
C_FLAGS = $(COMPILE_OPTS)
CPP = cpp
CPLUSPLUS_COMPILER = c++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall
OBJ = o
LINK = c++ -o
LINK_OPTS = -L.
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = ld -o
LIBRARY_LINK_OPTS = $(LINK_OPTS) -r
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =
live/config.macosx
COMPILE_OPTS = $(INCLUDES) -I. $(EXTRA_LDFLAGS) -DBSD=1 -O -DSOCKLEN_T=socklen_t -DHAVE_SOCKADDR_LEN=1 -DTIME_BASE=int
C = c
C_COMPILER = cc
C_FLAGS = $(COMPILE_OPTS)
CPP = cpp
CPLUSPLUS_COMPILER = c++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall
OBJ = o
LINK = c++ -o
LINK_OPTS = -L.
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = libtool -s -o
LIBRARY_LINK_OPTS =
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =
live/config.linux-with-shared-libraries
# 'CURRENT':'REVISION':'AGE' are updated - whenever a library changes - as follows:
# The library code changes, but without any changes to the API (i.e., interfaces) => increment REVISION
# At least one interface changes, or is removed => CURRENT += 1; REVISION = 0; AGE = 0
# One or more interfaces were added, but no existing interfaces were changed or removed => CURRENT += 1; REVISION = 0; AGE += 1
libliveMedia_VERSION_CURRENT=51
libliveMedia_VERSION_REVISION=8
libliveMedia_VERSION_AGE=1
libliveMedia_LIB_SUFFIX=so.$(shell expr $(libliveMedia_VERSION_CURRENT) - $(libliveMedia_VERSION_AGE)).$(libliveMedia_VERSION_AGE).$(libliveMedia_VERSION_REVISION)
libBasicUsageEnvironment_VERSION_CURRENT=1
libBasicUsageEnvironment_VERSION_REVISION=0
libBasicUsageEnvironment_VERSION_AGE=0
libBasicUsageEnvironment_LIB_SUFFIX=so.$(shell expr $(libBasicUsageEnvironment_VERSION_CURRENT) - $(libBasicUsageEnvironment_VERSION_AGE)).$(libBasicUsageEnvironment_VERSION_AGE).$(libBasicUsageEnvironment_VERSION_REVISION)
libUsageEnvironment_VERSION_CURRENT=4
libUsageEnvironment_VERSION_REVISION=0
libUsageEnvironment_VERSION_AGE=1
libUsageEnvironment_LIB_SUFFIX=so.$(shell expr $(libUsageEnvironment_VERSION_CURRENT) - $(libUsageEnvironment_VERSION_AGE)).$(libUsageEnvironment_VERSION_AGE).$(libUsageEnvironment_VERSION_REVISION)
libgroupsock_VERSION_CURRENT=9
libgroupsock_VERSION_REVISION=0
libgroupsock_VERSION_AGE=1
libgroupsock_LIB_SUFFIX=so.$(shell expr $(libgroupsock_VERSION_CURRENT) - $(libgroupsock_VERSION_AGE)).$(libgroupsock_VERSION_AGE).$(libgroupsock_VERSION_REVISION)
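The four `*_LIB_SUFFIX` definitions above all apply the same libtool-style rule: the shared-library suffix is `so.(CURRENT-AGE).AGE.REVISION`. A quick shell sketch of that arithmetic (the `suffix` helper function is illustrative, not part of the distribution), using version numbers taken from the definitions above:

```shell
# Mirror of the $(shell expr ...) computation in the *_LIB_SUFFIX lines above.
suffix() {  # args: CURRENT REVISION AGE
  echo "so.$(expr "$1" - "$3").$3.$2"
}
suffix 51 8 1   # libliveMedia -> so.50.1.8
suffix 9 0 1    # libgroupsock -> so.8.1.0
```

The `CURRENT-AGE` term is what becomes the soname's major number, which is why adding interfaces (CURRENT += 1, AGE += 1) leaves the major number unchanged while changing or removing one (AGE = 0) bumps it.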
#####
COMPILE_OPTS = $(INCLUDES) -I. -O2 -DSOCKLEN_T=socklen_t -D_LARGEFILE_SOURCE=1 -D_FILE_OFFSET_BITS=64 -fPIC
C = c
C_COMPILER = $(CC)
C_FLAGS = $(COMPILE_OPTS) $(CPPFLAGS) $(CFLAGS)
CPP = cpp
CPLUSPLUS_COMPILER = $(CXX)
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall -DBSD=1 $(CPPFLAGS) $(CXXFLAGS)
OBJ = o
LINK = $(CXX) -o
LINK_OPTS = -L. $(LDFLAGS)
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = $(CC) -o
SHORT_LIB_SUFFIX = so.$(shell expr $($(NAME)_VERSION_CURRENT) - $($(NAME)_VERSION_AGE))
LIB_SUFFIX = $(SHORT_LIB_SUFFIX).$($(NAME)_VERSION_AGE).$($(NAME)_VERSION_REVISION)
LIBRARY_LINK_OPTS = -shared -Wl,-soname,$(NAME).$(SHORT_LIB_SUFFIX) $(LDFLAGS)
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =
INSTALL2 = install_shared_libraries
live/config.linux-gdb
COMPILE_OPTS = $(INCLUDES) -I. -O -DSOCKLEN_T=socklen_t -g -D_LARGEFILE_SOURCE=1 -D_FILE_OFFSET_BITS=64
C = c
C_COMPILER = cc
C_FLAGS = $(COMPILE_OPTS)
CPP = cpp
CPLUSPLUS_COMPILER = c++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall -DBSD=1
OBJ = o
LINK = c++ -o
LINK_OPTS = -L.
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = ar cr
LIBRARY_LINK_OPTS =
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =
live/config.linux-64bit
COMPILE_OPTS = $(INCLUDES) -m64 -fPIC -I. -O2 -DSOCKLEN_T=socklen_t -D_LARGEFILE_SOURCE=1 -D_FILE_OFFSET_BITS=64
C = c
C_COMPILER = cc
C_FLAGS = $(COMPILE_OPTS)
CPP = cpp
CPLUSPLUS_COMPILER = c++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall -DBSD=1
OBJ = o
LINK = c++ -o
LINK_OPTS = -L.
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = ar cr
LIBRARY_LINK_OPTS =
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =
live/config.linux
COMPILE_OPTS = $(INCLUDES) -I. -O2 -DSOCKLEN_T=socklen_t -D_LARGEFILE_SOURCE=1 -D_FILE_OFFSET_BITS=64
C = c
C_COMPILER = cc
C_FLAGS = $(COMPILE_OPTS) $(CPPFLAGS) $(CFLAGS)
CPP = cpp
CPLUSPLUS_COMPILER = c++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall -DBSD=1 $(CPPFLAGS) $(CXXFLAGS)
OBJ = o
LINK = c++ -o
LINK_OPTS = -L. $(LDFLAGS)
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = ar cr
LIBRARY_LINK_OPTS =
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =
live/config.irix
COMPILE_OPTS = $(INCLUDES) -I. -O
C = c
C_COMPILER = cc
C_FLAGS = $(COMPILE_OPTS) -DIRIX
CPP = cpp
CPLUSPLUS_COMPILER = c++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall -DIRIX
OBJ = o
LINK = c++ -o
LINK_OPTS = -L.
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = ld -o
LIBRARY_LINK_OPTS = $(LINK_OPTS) -r -B static
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =
live/config.iphoneos
# **Note: You must install the relevant "Command line tools (OSX *.*) for Xcode - Xcode *.*"
# for this configuration file to work.
#
# Change the following version number, if necessary, before running "genMakefiles iphoneos"
IOS_VERSION = 8.3
DEVELOPER_PATH = /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer
TOOL_PATH = $(DEVELOPER_PATH)/usr/bin
SDK_PATH = $(DEVELOPER_PATH)/SDKs
SDK = $(SDK_PATH)/iPhoneOS$(IOS_VERSION).sdk
COMPILE_OPTS = $(INCLUDES) -I. $(EXTRA_LDFLAGS) -DBSD=1 -O2 -DSOCKLEN_T=socklen_t -DHAVE_SOCKADDR_LEN=1 -D_LARGEFILE_SOURCE=1 -D_FILE_OFFSET_BITS=64 -fPIC -arch armv7 --sysroot=$(SDK)
C = c
C_COMPILER = /usr/bin/xcrun clang
C_FLAGS = $(COMPILE_OPTS)
CPP = cpp
CPLUSPLUS_COMPILER = /usr/bin/xcrun clang
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall
OBJ = o
LINK = /usr/bin/xcrun clang -o
LINK_OPTS = -v -L. -arch armv7 --sysroot=$(SDK) -L$(SDK)/usr/lib/system /usr/lib/libc++.dylib
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = /usr/bin/xcrun libtool -static -o
LIBRARY_LINK_OPTS =
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =
live/config.iphone-simulator
# **Note: You must install the relevant "Command line tools (OSX *.*) for Xcode - Xcode *.*"
# for this configuration file to work.
# Change the following version number, if necessary, before running "genMakefiles iphone-simulator"
IOS_VERSION = 8.3
MIN_IOS_VERSION = 7.0
DEVELOPER_PATH = /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer
TOOL_PATH = $(DEVELOPER_PATH)/usr/bin
SDK_PATH = $(DEVELOPER_PATH)/SDKs
SDK = $(SDK_PATH)/iPhoneSimulator$(IOS_VERSION).sdk
COMPILE_OPTS = $(INCLUDES) -I. $(EXTRA_LDFLAGS) -DBSD=1 -O2 -DSOCKLEN_T=socklen_t -DHAVE_SOCKADDR_LEN=1 -miphoneos-version-min=$(MIN_IOS_VERSION) -D_LARGEFILE_SOURCE=1 -D_FILE_OFFSET_BITS=64 -fPIC -arch i386 --sysroot=$(SDK) -isysroot $(SDK)
C = c
C_COMPILER = /usr/bin/xcrun clang
C_FLAGS = $(COMPILE_OPTS)
CPP = cpp
CPLUSPLUS_COMPILER = /usr/bin/xcrun clang
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall
OBJ = o
LINK = /usr/bin/xcrun clang -o
LINK_OPTS = -L. -arch i386 -miphoneos-version-min=$(MIN_IOS_VERSION) --sysroot=$(SDK) -isysroot -L$(SDK)/usr/lib/system -I$(SDK)/usr/lib /usr/lib/libc++.dylib
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = /usr/bin/xcrun libtool -static -o
LIBRARY_LINK_OPTS =
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =
live/config.freebsd
COMPILE_OPTS = $(INCLUDES) -I. -O -DBSD=1 -DXLOCALE_NOT_USED=1 -DSOCKLEN_T=socklen_t -DHAVE_SOCKADDR_LEN=1
C = c
C_COMPILER = cc
C_FLAGS = $(COMPILE_OPTS)
CPP = cpp
CPLUSPLUS_COMPILER = c++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall
OBJ = o
LINK = c++ -o
LINK_OPTS = -L.
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = ar cr
LIBRARY_LINK_OPTS =
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =
live/config.cygwin-for-vlc
COMPILE_OPTS = $(INCLUDES) -I. -O -DSOCKLEN_T=socklen_t -DXLOCALE_NOT_USED=1
C = c
C_COMPILER = gcc
C_FLAGS = $(COMPILE_OPTS) -DUSE_OUR_BZERO=1 -D_WIN32 -mno-cygwin
CPP = cpp
CPLUSPLUS_COMPILER = c++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall -DBSD=1 -D_WIN32 -Wno-deprecated -mno-cygwin
OBJ = o
LINK = c++ -o
LINK_OPTS = -L.
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = ld -o
LIBRARY_LINK_OPTS = $(LINK_OPTS) -r -Bstatic
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =
live/config.cygwin
COMPILE_OPTS = $(INCLUDES) -I. -O -DSOCKLEN_T=socklen_t -DXLOCALE_NOT_USED=1
C = c
C_COMPILER = gcc
C_FLAGS = $(COMPILE_OPTS) -DUSE_OUR_BZERO=1 -D__CYGWIN__
CPP = cpp
CPLUSPLUS_COMPILER = c++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall -DBSD=1
OBJ = o
LINK = c++ -o
LINK_OPTS = -L.
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = ld -o
LIBRARY_LINK_OPTS = $(LINK_OPTS) -r -Bstatic
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =
live/config.cris-axis-linux-gnu
# Note: AXIS_TOP_DIR is assumed to already be set in your environment.
# You can set this using the "init_env" script.
# See http://developer.axis.com/doc/software/apps/apps-howto.html
# for more information.
AXIS_DIR = $(AXIS_TOP_DIR)/target/cris-axis-linux-gnu
COMPILE_OPTS = $(INCLUDES) -I. -mlinux -isystem $(AXIS_DIR)/include -Wall -O2 -DSOCKLEN_T=socklen_t -DCRIS -D_LARGEFILE_SOURCE=1 -D_FILE_OFFSET_BITS=64
C = c
C_COMPILER = gcc-cris
C_FLAGS = $(COMPILE_OPTS)
CPP = cpp
CPLUSPLUS_COMPILER = c++-cris
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wno-ctor-dtor-privacy -ansi -pipe
OBJ = o
LINK = c++-cris -static -o
AXIS_LINK_OPTS = -L$(AXIS_DIR)/lib
LINK_OPTS = -L.
CONSOLE_LINK_OPTS = $(LINK_OPTS) -L$(AXIS_DIR)/lib -mlinux
LIBRARY_LINK = ld-cris -mcrislinux -o
LIBRARY_LINK_OPTS = $(LINK_OPTS) -r -Bstatic
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =
live/config.bsplinux
CROSS_COMPILE=
COMPILE_OPTS = $(INCLUDES) -I. -O2 -DSOCKLEN_T=socklen_t -DNO_SSTREAM=1 -D_LARGEFILE_SOURCE=1 -D_FILE_OFFSET_BITS=64
C = c
C_COMPILER = $(CROSS_COMPILE)ecc
C_FLAGS = $(COMPILE_OPTS)
CPP = cpp
CPLUSPLUS_COMPILER = $(CROSS_COMPILE)e++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall -DBSD=1
OBJ = o
LINK = $(CROSS_COMPILE)e++ -o
LINK_OPTS = -L.
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = $(CROSS_COMPILE)eld -o
LIBRARY_LINK_OPTS = $(LINK_OPTS) -r -Bstatic
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION = -lm
LIBS_FOR_GUI_APPLICATION =
EXE =
live/config.bfin-uclinux
CROSS_COMPILER= bfin-uclinux-
COMPILE_OPTS = $(INCLUDES) -I. -DSOCKLEN_T=socklen_t -D_LARGEFILE_SOURCE=1 -DUCLINUX -D_FILE_OFFSET_BITS=64
C = c
C_COMPILER = $(CROSS_COMPILER)gcc
C_FLAGS = $(COMPILE_OPTS) -Wall
CPP = cpp
CPLUSPLUS_COMPILER = $(CROSS_COMPILER)g++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall
OBJ = o
LINK = $(CROSS_COMPILER)g++ -Wl,-elf2flt -o
LINK_OPTS = -L.
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = $(CROSS_COMPILER)ar cr
LIBRARY_LINK_OPTS =
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =
live/config.bfin-linux-uclibc
CROSS_COMPILER = bfin-linux-uclibc-
COMPILE_OPTS = $(INCLUDES) -I. -DSOCKLEN_T=socklen_t -D_LARGEFILE_SOURCE=1 -DUCLINUX -D_FILE_OFFSET_BITS=64
C = c
C_COMPILER = $(CROSS_COMPILER)gcc
C_FLAGS = $(COMPILE_OPTS) -Wall
CPP = cpp
CPLUSPLUS_COMPILER = $(CROSS_COMPILER)g++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall
OBJ = o
LINK = $(CROSS_COMPILER)g++ -o
LINK_OPTS = -L.
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = $(CROSS_COMPILER)ar cr
LIBRARY_LINK_OPTS =
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =
live/config.avr32-linux
CROSS_COMPILE= avr32-linux-uclibc-
COMPILE_OPTS = -Os $(INCLUDES) -msoft-float -D_LARGEFILE_SOURCE -D_LARGEFILE64_SOURCE -D_FILE_OFFSET_BITS=64 -DSOCKLEN_T=socklen_t -DNO_SSTREAM=1
C = c
C_COMPILER = $(CROSS_COMPILE)gcc
C_FLAGS = $(COMPILE_OPTS)
CPP = cpp
CPLUSPLUS_COMPILER = $(CROSS_COMPILE)c++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall -fuse-cxa-atexit -DBSD=1
OBJ = o
LINK = $(CROSS_COMPILE)c++ -o
LINK_OPTS =
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = $(CROSS_COMPILE)ar cr
LIBRARY_LINK_OPTS =
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =
live/config.armlinux
CROSS_COMPILE?= arm-elf-
COMPILE_OPTS = $(INCLUDES) -I. -O2 -DSOCKLEN_T=socklen_t -DNO_SSTREAM=1 -D_LARGEFILE_SOURCE=1 -D_FILE_OFFSET_BITS=64
C = c
C_COMPILER = $(CROSS_COMPILE)gcc
C_FLAGS = $(COMPILE_OPTS)
CPP = cpp
CPLUSPLUS_COMPILER = $(CROSS_COMPILE)g++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall -DBSD=1
OBJ = o
LINK = $(CROSS_COMPILE)g++ -o
LINK_OPTS =
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = $(CROSS_COMPILE)ar cr
LIBRARY_LINK_OPTS = $(LINK_OPTS)
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =
live/config.armeb-uclibc
CROSS_COMPILE= armeb-linux-uclibc-
COMPILE_OPTS = $(INCLUDES) -I. -Os -DSOCKLEN_T=socklen_t -DNO_SSTREAM=1 -D_LARGEFILE_SOURCE=1 -D_FILE_OFFSET_BITS=64
C = c
C_COMPILER = $(CROSS_COMPILE)gcc
C_FLAGS = $(COMPILE_OPTS)
CPP = cpp
CPLUSPLUS_COMPILER = $(CROSS_COMPILE)g++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall -DBSD=1
OBJ = o
LINK = $(CROSS_COMPILE)gcc -o
LINK_OPTS = -L.
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = $(CROSS_COMPILE)ar cr
LIBRARY_LINK_OPTS =
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =
live/config.alpha
COMPILE_OPTS = $(INCLUDES) -I. -O -DTIME_BASE=int
C = c
C_COMPILER = cc
C_FLAGS = $(COMPILE_OPTS) -DALPHA
CPP = cpp
CPLUSPLUS_COMPILER = c++
CPLUSPLUS_FLAGS = $(COMPILE_OPTS) -Wall -DBSD=1 -DALPHA
OBJ = o
LINK = c++ -o
LINK_OPTS = -L.
CONSOLE_LINK_OPTS = $(LINK_OPTS)
LIBRARY_LINK = ld -o
LIBRARY_LINK_OPTS = $(LINK_OPTS) -r -B static
LIB_SUFFIX = a
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
EXE =
live/README
For documentation and instructions for building this software,
see
live/Makefile.head
##### Change the following for your environment:
live/Makefile.tail
##### End of variables to change
LIVEMEDIA_DIR = liveMedia
GROUPSOCK_DIR = groupsock
USAGE_ENVIRONMENT_DIR = UsageEnvironment
BASIC_USAGE_ENVIRONMENT_DIR = BasicUsageEnvironment
TESTPROGS_DIR = testProgs
MEDIA_SERVER_DIR = mediaServer
PROXY_SERVER_DIR = proxyServer
all:
	cd $(LIVEMEDIA_DIR) ; $(MAKE)
	cd $(GROUPSOCK_DIR) ; $(MAKE)
	cd $(USAGE_ENVIRONMENT_DIR) ; $(MAKE)
	cd $(BASIC_USAGE_ENVIRONMENT_DIR) ; $(MAKE)
	cd $(TESTPROGS_DIR) ; $(MAKE)
	cd $(MEDIA_SERVER_DIR) ; $(MAKE)
	cd $(PROXY_SERVER_DIR) ; $(MAKE)
install:
	cd $(LIVEMEDIA_DIR) ; $(MAKE) install
	cd $(GROUPSOCK_DIR) ; $(MAKE) install
	cd $(USAGE_ENVIRONMENT_DIR) ; $(MAKE) install
	cd $(BASIC_USAGE_ENVIRONMENT_DIR) ; $(MAKE) install
	cd $(TESTPROGS_DIR) ; $(MAKE) install
	cd $(MEDIA_SERVER_DIR) ; $(MAKE) install
	cd $(PROXY_SERVER_DIR) ; $(MAKE) install
clean:
	cd $(LIVEMEDIA_DIR) ; $(MAKE) clean
	cd $(GROUPSOCK_DIR) ; $(MAKE) clean
	cd $(USAGE_ENVIRONMENT_DIR) ; $(MAKE) clean
	cd $(BASIC_USAGE_ENVIRONMENT_DIR) ; $(MAKE) clean
	cd $(TESTPROGS_DIR) ; $(MAKE) clean
	cd $(MEDIA_SERVER_DIR) ; $(MAKE) clean
	cd $(PROXY_SERVER_DIR) ; $(MAKE) clean
distclean: clean
	-rm -f $(LIVEMEDIA_DIR)/Makefile $(GROUPSOCK_DIR)/Makefile \
	  $(USAGE_ENVIRONMENT_DIR)/Makefile $(BASIC_USAGE_ENVIRONMENT_DIR)/Makefile \
	  $(TESTPROGS_DIR)/Makefile $(MEDIA_SERVER_DIR)/Makefile \
	  $(PROXY_SERVER_DIR)/Makefile Makefile
live/fix-makefile
#!/bin/sh
# the next line restarts using tclsh \
exec tclsh8.4 "$0" "$@"
set makefileName [lindex $argv 0]
set tmpfileName /tmp/rsftmp
set inFid [open $makefileName r]
set outFid [open $tmpfileName w]
while {![eof $inFid]} {
    set line [gets $inFid]
    if {[string match *\)\$* $line]} {
        set pos [string first \)\$ $line]
        set prefix [string range $line 0 $pos]
        incr pos
        set suffix [string range $line $pos end]
        set line $prefix\ $suffix
    }
    puts $outFid $line
}
close $inFid
close $outFid
file rename -force $tmpfileName $makefileName
live/genWindowsMakefiles.cmd
@Echo OFF
SETLOCAL
for %%I in (%0) do %%~dI
for %%I in (%0) do cd "%%~pI"
cd liveMedia
del /Q liveMedia.mak
type Makefile.head ..\win32config Makefile.tail > liveMedia.mak
cd ../groupsock
del /Q groupsock.mak
type Makefile.head ..\win32config Makefile.tail > groupsock.mak
cd ../UsageEnvironment
del /Q UsageEnvironment.mak
type Makefile.head ..\win32config Makefile.tail > UsageEnvironment.mak
cd ../BasicUsageEnvironment
del /Q BasicUsageEnvironment.mak
type Makefile.head ..\win32config Makefile.tail > BasicUsageEnvironment.mak
cd ../testProgs
del /Q testProgs.mak
type Makefile.head ..\win32config Makefile.tail > testProgs.mak
cd ../mediaServer
del /Q mediaServer.mak
type Makefile.head ..\win32config Makefile.tail > mediaServer.mak
cd ../proxyServer
del /Q proxyServer.mak
type Makefile.head ..\win32config Makefile.tail > proxyServer.mak
ENDLOCAL
live/genWindowsMakefiles
#!/bin/sh
cd liveMedia
/bin/rm -f liveMedia.mak
/bin/rm -f Makefile
cat Makefile.head ../win32config Makefile.tail > liveMedia.mak
cd ../groupsock
/bin/rm -f groupsock.mak
/bin/rm -f Makefile
cat Makefile.head ../win32config Makefile.tail > groupsock.mak
cd ../UsageEnvironment
/bin/rm -f UsageEnvironment.mak
/bin/rm -f Makefile
cat Makefile.head ../win32config Makefile.tail > UsageEnvironment.mak
cd ../BasicUsageEnvironment
/bin/rm -f BasicUsageEnvironment.mak
/bin/rm -f Makefile
cat Makefile.head ../win32config Makefile.tail > BasicUsageEnvironment.mak
cd ../testProgs
/bin/rm -f testProgs.mak
/bin/rm -f Makefile
cat Makefile.head ../win32config Makefile.tail > testProgs.mak
cd ../mediaServer
/bin/rm -f mediaServer.mak
/bin/rm -f Makefile
cat Makefile.head ../win32config Makefile.tail > mediaServer.mak
cd ../proxyServer
/bin/rm -f proxyServer.mak
/bin/rm -f Makefile
cat Makefile.head ../win32config Makefile.tail > proxyServer.mak
live/win32config.Borland
# Comment out the following line to produce Makefiles that generate debuggable code:
NODEBUG=1
# The following definition ensures that we are properly matching
# the WinSock2 library file with the correct header files.
# (will link with "ws2_32.lib" and include "winsock2.h" & "Ws2tcpip.h")
TARGETOS = WINNT
# If for some reason you wish to use WinSock1 instead, uncomment the
# following two definitions.
# (will link with "wsock32.lib" and include "winsock.h")
#TARGETOS = WIN95
#APPVER = 4.0
#!include
UI_OPTS = $(guilflags) $(guilibsdll)
# Use the following to get a console (e.g., for debugging):
CONSOLE_UI_OPTS = $(conlflags) $(conlibsdll)
CPU=i386
TOOLS32 = C:\Progra~1\Borland\CBuilder5
COMPILE_OPTS = $(INCLUDES) $(cdebug) $(cflags) $(cvarsdll) -I. -I$(TOOLS32)\include
C = c
C_COMPILER = $(TOOLS32)\bin\bcc32
C_FLAGS = $(COMPILE_OPTS)
CPP = cpp
CPLUSPLUS_COMPILER = $(C_COMPILER)
CPLUSPLUS_FLAGS = $(COMPILE_OPTS)
OBJ = obj
LINK = $(TOOLS32)\bin\ilink32
LIBRARY_LINK = $(TOOLS32)\bin\tlib
LINK_OPTS_0 = $(linkdebug) msvcirt.lib
LIBRARY_LINK_OPTS = /u
LINK_OPTS = $(LINK_OPTS_0) $(UI_OPTS)
CONSOLE_LINK_OPTS = c0x32
SERVICE_LINK_OPTS = kernel32.lib advapi32.lib shell32.lib -subsystem:console,$(APPVER)
LIB_SUFFIX = lib
LIBS_FOR_CONSOLE_APPLICATION = cw32.lib import32.lib
LIBS_FOR_GUI_APPLICATION = ,,cw32
EXE =
rc32 = "$(TOOLS32)\bin\brc32"
.rc.res:
	$(rc32) $<
live/win32config
# Comment out the following line to produce Makefiles that generate debuggable code:
NODEBUG=1
# The following definition ensures that we are properly matching
# the WinSock2 library file with the correct header files.
# (will link with "ws2_32.lib" and include "winsock2.h" & "Ws2tcpip.h")
TARGETOS = WINNT
# If for some reason you wish to use WinSock1 instead, uncomment the
# following two definitions.
# (will link with "wsock32.lib" and include "winsock.h")
#TARGETOS = WIN95
#APPVER = 4.0
!include
UI_OPTS = $(guilflags) $(guilibsdll)
# Use the following to get a console (e.g., for debugging):
CONSOLE_UI_OPTS = $(conlflags) $(conlibsdll)
CPU=i386
TOOLS32 = c:\Program Files\DevStudio\Vc
COMPILE_OPTS = $(INCLUDES) $(cdebug) $(cflags) $(cvarsdll) -I. -I"$(TOOLS32)\include"
C = c
C_COMPILER = "$(TOOLS32)\bin\cl"
C_FLAGS = $(COMPILE_OPTS)
CPP = cpp
CPLUSPLUS_COMPILER = $(C_COMPILER)
CPLUSPLUS_FLAGS = $(COMPILE_OPTS)
OBJ = obj
LINK = $(link) -out:
LIBRARY_LINK = lib -out:
LINK_OPTS_0 = $(linkdebug) msvcirt.lib
LIBRARY_LINK_OPTS =
LINK_OPTS = $(LINK_OPTS_0) $(UI_OPTS)
CONSOLE_LINK_OPTS = $(LINK_OPTS_0) $(CONSOLE_UI_OPTS)
SERVICE_LINK_OPTS = kernel32.lib advapi32.lib shell32.lib -subsystem:console,$(APPVER)
LIB_SUFFIX = lib
LIBS_FOR_CONSOLE_APPLICATION =
LIBS_FOR_GUI_APPLICATION =
MULTIMEDIA_LIBS = winmm.lib
EXE = .exe
PLATFORM = Windows
rc32 = "$(TOOLS32)\bin\rc"
.rc.res:
	$(rc32) $<
live/liveMedia/include/
live/liveMedia/AC3AudioRTPSink.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// RTP sink for AC3 audio
// Implementation
#include "AC3AudioRTPSink.hh"
AC3AudioRTPSink::AC3AudioRTPSink(UsageEnvironment& env, Groupsock* RTPgs,
u_int8_t rtpPayloadFormat,
u_int32_t rtpTimestampFrequency)
: AudioRTPSink(env, RTPgs, rtpPayloadFormat,
rtpTimestampFrequency, "AC3"),
fTotNumFragmentsUsed(0) {
}
AC3AudioRTPSink::~AC3AudioRTPSink() {
}
AC3AudioRTPSink*
AC3AudioRTPSink::createNew(UsageEnvironment& env, Groupsock* RTPgs,
u_int8_t rtpPayloadFormat,
u_int32_t rtpTimestampFrequency) {
return new AC3AudioRTPSink(env, RTPgs,
rtpPayloadFormat, rtpTimestampFrequency);
}
Boolean AC3AudioRTPSink
::frameCanAppearAfterPacketStart(unsigned char const* /*frameStart*/,
unsigned /*numBytesInFrame*/) const {
// (For now) allow at most 1 frame in a single packet:
return False;
}
void AC3AudioRTPSink
::doSpecialFrameHandling(unsigned fragmentationOffset,
                         unsigned char* frameStart,
                         unsigned numBytesInFrame,
                         struct timeval framePresentationTime,
                         unsigned numRemainingBytes) {
  // Set the 2-byte "payload header", as defined in RFC 4184.
  unsigned char headers[2];

  Boolean isFragment = numRemainingBytes > 0 || fragmentationOffset > 0;
  if (!isFragment) {
    headers[0] = 0; // One or more complete frames
    headers[1] = 1; // because we (for now) allow at most 1 frame per packet
  } else {
    if (fragmentationOffset > 0) {
      headers[0] = 3; // Fragment of frame other than initial fragment
    } else {
      // An initial fragment of the frame
      unsigned const totalFrameSize = fragmentationOffset + numBytesInFrame + numRemainingBytes;
      unsigned const fiveEighthsPoint = totalFrameSize/2 + totalFrameSize/8;
      headers[0] = numBytesInFrame >= fiveEighthsPoint ? 1 : 2;

      // Because this outgoing packet will be full (because it's an initial fragment), we can compute how many total
      // fragments (and thus packets) will make up the complete AC-3 frame:
      fTotNumFragmentsUsed = (totalFrameSize + (numBytesInFrame-1))/numBytesInFrame;
    }
    headers[1] = fTotNumFragmentsUsed;
  }
  setSpecialHeaderBytes(headers, sizeof headers);

  if (numRemainingBytes == 0) {
    // This packet contains the last (or only) fragment of the frame.
    // Set the RTP 'M' ('marker') bit:
    setMarkerBit();
  }

  // Important: Also call our base class's doSpecialFrameHandling(),
  // to set the packet's timestamp:
  MultiFramedRTPSink::doSpecialFrameHandling(fragmentationOffset,
                                             frameStart, numBytesInFrame,
                                             framePresentationTime,
                                             numRemainingBytes);
}
unsigned AC3AudioRTPSink::specialHeaderSize() const {
return 2;
}
live/liveMedia/AC3AudioRTPSource.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// AC3 Audio RTP Sources
// Implementation
#include "AC3AudioRTPSource.hh"
AC3AudioRTPSource*
AC3AudioRTPSource::createNew(UsageEnvironment& env,
Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency) {
return new AC3AudioRTPSource(env, RTPgs, rtpPayloadFormat,
rtpTimestampFrequency);
}
AC3AudioRTPSource::AC3AudioRTPSource(UsageEnvironment& env,
Groupsock* rtpGS,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency)
: MultiFramedRTPSource(env, rtpGS,
rtpPayloadFormat, rtpTimestampFrequency) {
}
AC3AudioRTPSource::~AC3AudioRTPSource() {
}
Boolean AC3AudioRTPSource
::processSpecialHeader(BufferedPacket* packet,
                       unsigned& resultSpecialHeaderSize) {
  unsigned char* headerStart = packet->data();
  unsigned packetSize = packet->dataSize();

  // There's a 2-byte payload header at the beginning:
  if (packetSize < 2) return False;
  resultSpecialHeaderSize = 2;

  unsigned char FT = headerStart[0]&0x03;
  fCurrentPacketBeginsFrame = FT != 3;

  // The RTP "M" (marker) bit indicates the last fragment of a frame.
  // In case the sender did not set the "M" bit correctly, we also test for FT == 0:
  fCurrentPacketCompletesFrame = packet->rtpMarkerBit() || FT == 0;

  return True;
}
char const* AC3AudioRTPSource::MIMEtype() const {
return "audio/AC3";
}
live/liveMedia/AC3AudioStreamFramer.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A filter that breaks up an AC3 audio elementary stream into frames
// Implementation
#include "AC3AudioStreamFramer.hh"
#include "StreamParser.hh"
#include
////////// AC3AudioStreamParser definition //////////
class AC3FrameParams {
public:
AC3FrameParams() : samplingFreq(0) {}
// 8-byte header at the start of each frame:
// u_int32_t hdr0, hdr1;
unsigned hdr0, hdr1;
// parameters derived from the headers
unsigned kbps, samplingFreq, frameSize;
void setParamsFromHeader();
};
class AC3AudioStreamParser: public StreamParser {
public:
AC3AudioStreamParser(AC3AudioStreamFramer* usingSource,
FramedSource* inputSource);
virtual ~AC3AudioStreamParser();
public:
void testStreamCode(unsigned char ourStreamCode,
unsigned char* ptr, unsigned size);
unsigned parseFrame(unsigned& numTruncatedBytes);
// returns the size of the frame that was acquired, or 0 if none was
void registerReadInterest(unsigned char* to, unsigned maxSize);
AC3FrameParams const& currentFrame() const { return fCurrentFrame; }
Boolean haveParsedAFrame() const { return fHaveParsedAFrame; }
void readAndSaveAFrame();
private:
static void afterGettingSavedFrame(void* clientData, unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds);
void afterGettingSavedFrame1(unsigned frameSize);
static void onSavedFrameClosure(void* clientData);
void onSavedFrameClosure1();
private:
AC3AudioStreamFramer* fUsingSource;
unsigned char* fTo;
unsigned fMaxSize;
Boolean fHaveParsedAFrame;
unsigned char* fSavedFrame;
unsigned fSavedFrameSize;
char fSavedFrameFlag;
// Parameters of the most recently read frame:
AC3FrameParams fCurrentFrame;
};
////////// AC3AudioStreamFramer implementation //////////
AC3AudioStreamFramer::AC3AudioStreamFramer(UsageEnvironment& env,
FramedSource* inputSource,
unsigned char streamCode)
: FramedFilter(env, inputSource), fOurStreamCode(streamCode) {
// Use the current wallclock time as the initial 'presentation time':
gettimeofday(&fNextFramePresentationTime, NULL);
fParser = new AC3AudioStreamParser(this, inputSource);
}
AC3AudioStreamFramer::~AC3AudioStreamFramer() {
delete fParser;
}
AC3AudioStreamFramer*
AC3AudioStreamFramer::createNew(UsageEnvironment& env,
FramedSource* inputSource,
unsigned char streamCode) {
// Need to add source type checking here??? #####
return new AC3AudioStreamFramer(env, inputSource, streamCode);
}
unsigned AC3AudioStreamFramer::samplingRate() {
if (!fParser->haveParsedAFrame()) {
// Because we haven't yet parsed a frame, we don't yet know the input
// stream's sampling rate. So, we first need to read a frame
// (into a special buffer that we keep around for later use).
fParser->readAndSaveAFrame();
}
return fParser->currentFrame().samplingFreq;
}
void AC3AudioStreamFramer::flushInput() {
fParser->flushInput();
}
void AC3AudioStreamFramer::doGetNextFrame() {
fParser->registerReadInterest(fTo, fMaxSize);
parseNextFrame();
}
#define MILLION 1000000
struct timeval AC3AudioStreamFramer::currentFramePlayTime() const {
AC3FrameParams const& fr = fParser->currentFrame();
unsigned const numSamples = 1536;
unsigned const freq = fr.samplingFreq;
// result is numSamples/freq
unsigned const uSeconds = (freq == 0) ? 0
: ((numSamples*2*MILLION)/freq + 1)/2; // rounds to nearest integer
struct timeval result;
result.tv_sec = uSeconds/MILLION;
result.tv_usec = uSeconds%MILLION;
return result;
}
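The rounding idiom in currentFramePlayTime() can be sketched as a self-contained helper. An AC3 frame always carries 1536 samples, and `((n*2*1000000)/freq + 1)/2` is the usual integer trick for rounding `n*1000000/freq` to the nearest microsecond. (The function name below is illustrative, not part of the library.)

```cpp
// Standalone sketch of the frame-duration arithmetic in currentFramePlayTime():
unsigned ac3FrameDurationMicroseconds(unsigned samplingFreq) {
  unsigned const numSamples = 1536; // fixed for AC3: 6 blocks * 256 samples
  if (samplingFreq == 0) return 0;  // avoid dividing by zero
  // numSamples*2*1000000 == 3,072,000,000, which still fits in 32 bits:
  return ((numSamples*2*1000000U)/samplingFreq + 1)/2; // rounds to nearest
}
```

For example, at 48 kHz this yields exactly 32000 µs (1536/48000 s), and at 44.1 kHz it rounds 34829.93... µs up to 34830.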
void AC3AudioStreamFramer
::handleNewData(void* clientData, unsigned char* ptr, unsigned size,
struct timeval /*presentationTime*/) {
AC3AudioStreamFramer* framer = (AC3AudioStreamFramer*)clientData;
framer->handleNewData(ptr, size);
}
void AC3AudioStreamFramer
::handleNewData(unsigned char* ptr, unsigned size) {
fParser->testStreamCode(fOurStreamCode, ptr, size);
parseNextFrame();
}
void AC3AudioStreamFramer::parseNextFrame() {
unsigned acquiredFrameSize = fParser->parseFrame(fNumTruncatedBytes);
if (acquiredFrameSize > 0) {
// We were able to acquire a frame from the input.
// It has already been copied to the reader's space.
fFrameSize = acquiredFrameSize;
// Also set the presentation time, and increment it for next time,
// based on the length of this frame:
fPresentationTime = fNextFramePresentationTime;
struct timeval framePlayTime = currentFramePlayTime();
fDurationInMicroseconds = framePlayTime.tv_sec*MILLION + framePlayTime.tv_usec;
fNextFramePresentationTime.tv_usec += framePlayTime.tv_usec;
fNextFramePresentationTime.tv_sec
+= framePlayTime.tv_sec + fNextFramePresentationTime.tv_usec/MILLION;
fNextFramePresentationTime.tv_usec %= MILLION;
// Call our own 'after getting' function. Because we're not a 'leaf'
// source, we can call this directly, without risking infinite recursion.
afterGetting(this);
} else {
// We were unable to parse a complete frame from the input, because:
// - we had to read more data from the source stream, or
// - the source stream has ended.
}
}
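The presentation-time bookkeeping in parseNextFrame() above keeps `tv_usec` normalized to [0, 1000000) by folding any microsecond overflow into the seconds field. The same carry logic on plain integers, as an illustrative standalone helper (not part of the library):

```cpp
// Add a duration to a (seconds, microseconds) pair, keeping usec normalized:
void advancePresentationTime(long& sec, long& usec,
                             long durationSec, long durationUsec) {
  usec += durationUsec;               // may temporarily exceed one second
  sec  += durationSec + usec/1000000; // fold the overflow into the seconds
  usec %= 1000000;                    // renormalize the microseconds
}
```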
////////// AC3AudioStreamParser implementation //////////
static int const kbpsTable[] = {32, 40, 48, 56, 64, 80, 96, 112,
128, 160, 192, 224, 256, 320, 384, 448,
512, 576, 640};
void AC3FrameParams::setParamsFromHeader() {
unsigned char byte4 = hdr1 >> 24;
unsigned char kbpsIndex = (byte4&0x3E) >> 1;
if (kbpsIndex > 18) kbpsIndex = 18;
kbps = kbpsTable[kbpsIndex];
unsigned char samplingFreqIndex = (byte4&0xC0) >> 6;
switch (samplingFreqIndex) {
case 0:
samplingFreq = 48000;
frameSize = 4*kbps;
break;
case 1:
samplingFreq = 44100;
frameSize = 2*(320*kbps/147 + (byte4&1));
break;
case 2:
case 3: // not legal?
samplingFreq = 32000;
frameSize = 6*kbps;
}
}
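An illustrative re-derivation of setParamsFromHeader() above, operating on the single header byte it inspects: the top 2 bits select the sampling rate, the next 5 bits the bitrate code, and the low bit is a pad flag used only at 44.1 kHz. The function name is hypothetical; the table and formulas mirror the code above.

```cpp
#include <utility>

// Decode AC3 header byte 4 into {samplingFreq, frameSizeInBytes}:
std::pair<unsigned, unsigned> decodeAC3Byte4(unsigned char byte4) {
  static int const kbpsTable[19] = {32, 40, 48, 56, 64, 80, 96, 112,
                                    128, 160, 192, 224, 256, 320, 384, 448,
                                    512, 576, 640};
  unsigned char kbpsIndex = (byte4 & 0x3E) >> 1;
  if (kbpsIndex > 18) kbpsIndex = 18; // clamp out-of-range codes, as above
  unsigned kbps = kbpsTable[kbpsIndex];
  switch ((byte4 & 0xC0) >> 6) {
    case 0:  return std::make_pair(48000u, 4*kbps);
    case 1:  return std::make_pair(44100u, 2*(320*kbps/147 + (byte4 & 1)));
    default: return std::make_pair(32000u, 6*kbps); // codes 2 and 3: 32 kHz
  }
}
```

E.g. byte 0x14 (rate code 0, bitrate index 10) decodes to 48 kHz with a 4*192 = 768-byte frame.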
AC3AudioStreamParser
::AC3AudioStreamParser(AC3AudioStreamFramer* usingSource,
FramedSource* inputSource)
: StreamParser(inputSource, FramedSource::handleClosure, usingSource,
&AC3AudioStreamFramer::handleNewData, usingSource),
fUsingSource(usingSource), fHaveParsedAFrame(False),
fSavedFrame(NULL), fSavedFrameSize(0) {
}
AC3AudioStreamParser::~AC3AudioStreamParser() {
}
void AC3AudioStreamParser::registerReadInterest(unsigned char* to,
unsigned maxSize) {
fTo = to;
fMaxSize = maxSize;
}
void AC3AudioStreamParser
::testStreamCode(unsigned char ourStreamCode,
unsigned char* ptr, unsigned size) {
if (ourStreamCode == 0) return; // we assume that there's no stream code at the beginning of the data
if (size < 4) return;
unsigned char streamCode = *ptr;
if (streamCode == ourStreamCode) {
// Remove the first 4 bytes from the stream:
memmove(ptr, ptr + 4, size - 4);
totNumValidBytes() = totNumValidBytes() - 4;
} else {
// Discard all of the data that was just read:
totNumValidBytes() = totNumValidBytes() - size;
}
}
unsigned AC3AudioStreamParser::parseFrame(unsigned& numTruncatedBytes) {
if (fSavedFrameSize > 0) {
// We've already read and parsed a frame. Use it instead:
memmove(fTo, fSavedFrame, fSavedFrameSize);
delete[] fSavedFrame; fSavedFrame = NULL;
unsigned frameSize = fSavedFrameSize;
fSavedFrameSize = 0;
return frameSize;
}
try {
saveParserState();
// We expect an AC3 audio header (first 2 bytes == 0x0B77) at the start:
while (1) {
unsigned next4Bytes = test4Bytes();
if (next4Bytes>>16 == 0x0B77) break;
skipBytes(1);
saveParserState();
}
fCurrentFrame.hdr0 = get4Bytes();
fCurrentFrame.hdr1 = test4Bytes();
fCurrentFrame.setParamsFromHeader();
fHaveParsedAFrame = True;
// Copy the frame to the requested destination:
unsigned frameSize = fCurrentFrame.frameSize;
if (frameSize > fMaxSize) {
numTruncatedBytes = frameSize - fMaxSize;
frameSize = fMaxSize;
} else {
numTruncatedBytes = 0;
}
fTo[0] = fCurrentFrame.hdr0 >> 24;
fTo[1] = fCurrentFrame.hdr0 >> 16;
fTo[2] = fCurrentFrame.hdr0 >> 8;
fTo[3] = fCurrentFrame.hdr0;
getBytes(&fTo[4], frameSize-4);
skipBytes(numTruncatedBytes);
return frameSize;
} catch (int /*e*/) {
#ifdef DEBUG
fUsingSource->envir() << "AC3AudioStreamParser::parseFrame() EXCEPTION (This is normal behavior - *not* an error)\n";
#endif
return 0; // the parsing got interrupted
}
}
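A simplified, buffer-based analogue of the sync hunt at the top of parseFrame(): scan for the AC3 sync word 0x0B77 and return its byte offset, or -1 if it never appears. (The real parser does this incrementally over a stream, testing 4 bytes and skipping 1 at a time; this helper is only an illustration.)

```cpp
#include <cstddef>

// Locate the AC3 sync word 0x0B77 in a buffer; return offset or -1:
long findAC3SyncWord(unsigned char const* data, std::size_t size) {
  for (std::size_t i = 0; i + 1 < size; ++i) {
    if (data[i] == 0x0B && data[i+1] == 0x77) return (long)i;
  }
  return -1;
}
```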
void AC3AudioStreamParser::readAndSaveAFrame() {
unsigned const maxAC3FrameSize = 4000;
fSavedFrame = new unsigned char[maxAC3FrameSize];
fSavedFrameSize = 0;
fSavedFrameFlag = 0;
fUsingSource->getNextFrame(fSavedFrame, maxAC3FrameSize,
afterGettingSavedFrame, this,
onSavedFrameClosure, this);
fUsingSource->envir().taskScheduler().doEventLoop(&fSavedFrameFlag);
}
void AC3AudioStreamParser
::afterGettingSavedFrame(void* clientData, unsigned frameSize,
unsigned /*numTruncatedBytes*/,
struct timeval /*presentationTime*/,
unsigned /*durationInMicroseconds*/) {
AC3AudioStreamParser* parser = (AC3AudioStreamParser*)clientData;
parser->afterGettingSavedFrame1(frameSize);
}
void AC3AudioStreamParser
::afterGettingSavedFrame1(unsigned frameSize) {
fSavedFrameSize = frameSize;
fSavedFrameFlag = ~0;
}
void AC3AudioStreamParser::onSavedFrameClosure(void* clientData) {
AC3AudioStreamParser* parser = (AC3AudioStreamParser*)clientData;
parser->onSavedFrameClosure1();
}
void AC3AudioStreamParser::onSavedFrameClosure1() {
delete[] fSavedFrame; fSavedFrame = NULL;
fSavedFrameSize = 0;
fSavedFrameFlag = ~0;
}
////////// live/liveMedia/ADTSAudioFileServerMediaSubsession.cpp //////////
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A 'ServerMediaSubsession' object that creates new, unicast, "RTPSink"s
// on demand, from an AAC audio file in ADTS format
// Implementation
#include "ADTSAudioFileServerMediaSubsession.hh"
#include "ADTSAudioFileSource.hh"
#include "MPEG4GenericRTPSink.hh"
ADTSAudioFileServerMediaSubsession*
ADTSAudioFileServerMediaSubsession::createNew(UsageEnvironment& env,
char const* fileName,
Boolean reuseFirstSource) {
return new ADTSAudioFileServerMediaSubsession(env, fileName, reuseFirstSource);
}
ADTSAudioFileServerMediaSubsession
::ADTSAudioFileServerMediaSubsession(UsageEnvironment& env,
char const* fileName, Boolean reuseFirstSource)
: FileServerMediaSubsession(env, fileName, reuseFirstSource) {
}
ADTSAudioFileServerMediaSubsession
::~ADTSAudioFileServerMediaSubsession() {
}
FramedSource* ADTSAudioFileServerMediaSubsession
::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
estBitrate = 96; // kbps, estimate
return ADTSAudioFileSource::createNew(envir(), fFileName);
}
RTPSink* ADTSAudioFileServerMediaSubsession
::createNewRTPSink(Groupsock* rtpGroupsock,
unsigned char rtpPayloadTypeIfDynamic,
FramedSource* inputSource) {
ADTSAudioFileSource* adtsSource = (ADTSAudioFileSource*)inputSource;
return MPEG4GenericRTPSink::createNew(envir(), rtpGroupsock,
rtpPayloadTypeIfDynamic,
adtsSource->samplingFrequency(),
"audio", "AAC-hbr", adtsSource->configStr(),
adtsSource->numChannels());
}
////////// live/liveMedia/ADTSAudioFileSource.cpp //////////
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A source object for AAC audio files in ADTS format
// Implementation
#include "ADTSAudioFileSource.hh"
#include "InputFile.hh"
#include <GroupsockHelper.hh> // for gettimeofday()
////////// ADTSAudioFileSource //////////
static unsigned const samplingFrequencyTable[16] = {
96000, 88200, 64000, 48000,
44100, 32000, 24000, 22050,
16000, 12000, 11025, 8000,
7350, 0, 0, 0
};
ADTSAudioFileSource*
ADTSAudioFileSource::createNew(UsageEnvironment& env, char const* fileName) {
FILE* fid = NULL;
do {
fid = OpenInputFile(env, fileName);
if (fid == NULL) break;
// Now, having opened the input file, read the fixed header of the first frame,
// to get the audio stream's parameters:
unsigned char fixedHeader[4]; // it's actually 3.5 bytes long
if (fread(fixedHeader, 1, sizeof fixedHeader, fid) < sizeof fixedHeader) break;
// Check the 'syncword':
if (!(fixedHeader[0] == 0xFF && (fixedHeader[1]&0xF0) == 0xF0)) {
env.setResultMsg("Bad 'syncword' at start of ADTS file");
break;
}
// Get and check the 'profile':
u_int8_t profile = (fixedHeader[2]&0xC0)>>6; // 2 bits
if (profile == 3) {
env.setResultMsg("Bad (reserved) 'profile': 3 in first frame of ADTS file");
break;
}
// Get and check the 'sampling_frequency_index':
u_int8_t sampling_frequency_index = (fixedHeader[2]&0x3C)>>2; // 4 bits
if (samplingFrequencyTable[sampling_frequency_index] == 0) {
env.setResultMsg("Bad 'sampling_frequency_index' in first frame of ADTS file");
break;
}
// Get and check the 'channel_configuration':
u_int8_t channel_configuration
= ((fixedHeader[2]&0x01)<<2)|((fixedHeader[3]&0xC0)>>6); // 3 bits
// If we get here, the frame header was OK.
// Reset the fid to the beginning of the file:
#ifndef _WIN32_WCE
rewind(fid);
#else
SeekFile64(fid, SEEK_SET,0);
#endif
#ifdef DEBUG
fprintf(stderr, "Read first frame: profile %d, "
"sampling_frequency_index %d => samplingFrequency %d, "
"channel_configuration %d\n",
profile,
sampling_frequency_index, samplingFrequencyTable[sampling_frequency_index],
channel_configuration);
#endif
return new ADTSAudioFileSource(env, fid, profile,
sampling_frequency_index, channel_configuration);
} while (0);
// An error occurred:
CloseInputFile(fid);
return NULL;
}
ADTSAudioFileSource
::ADTSAudioFileSource(UsageEnvironment& env, FILE* fid, u_int8_t profile,
u_int8_t samplingFrequencyIndex, u_int8_t channelConfiguration)
: FramedFileSource(env, fid) {
fSamplingFrequency = samplingFrequencyTable[samplingFrequencyIndex];
fNumChannels = channelConfiguration == 0 ? 2 : channelConfiguration;
fuSecsPerFrame
= (1024/*samples-per-frame*/*1000000) / fSamplingFrequency/*samples-per-second*/;
// Construct the 'AudioSpecificConfig', and from it, the corresponding ASCII string:
unsigned char audioSpecificConfig[2];
u_int8_t const audioObjectType = profile + 1;
audioSpecificConfig[0] = (audioObjectType<<3) | (samplingFrequencyIndex>>1);
audioSpecificConfig[1] = (samplingFrequencyIndex<<7) | (channelConfiguration<<3);
sprintf(fConfigStr, "%02X%02X", audioSpecificConfig[0], audioSpecificConfig[1]);
}
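The 2-byte 'AudioSpecificConfig' packing performed in the constructor above follows ISO/IEC 14496-3: 5 bits of audioObjectType, 4 bits of samplingFrequencyIndex, 4 bits of channelConfiguration, then zero padding. A standalone sketch (the helper name is hypothetical; it returns the 4-hex-digit string that the constructor stores in fConfigStr):

```cpp
#include <cstdio>
#include <string>

// Pack an AudioSpecificConfig and render it as a hex string:
std::string makeAudioSpecificConfig(unsigned char profile,
                                    unsigned char samplingFrequencyIndex,
                                    unsigned char channelConfiguration) {
  unsigned char const audioObjectType = profile + 1; // ADTS stores AOT-1
  unsigned char config[2];
  config[0] = (unsigned char)((audioObjectType << 3) | (samplingFrequencyIndex >> 1));
  config[1] = (unsigned char)((samplingFrequencyIndex << 7) | (channelConfiguration << 3));
  char buf[5];
  std::snprintf(buf, sizeof buf, "%02X%02X", config[0], config[1]);
  return std::string(buf);
}
```

For example, ADTS profile 1 (AAC-LC), frequency index 4 (44.1 kHz), 2 channels produces the familiar config string "1210".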
ADTSAudioFileSource::~ADTSAudioFileSource() {
CloseInputFile(fFid);
}
// Note: We should change the following to use asynchronous file reading, #####
// as we now do with ByteStreamFileSource. #####
void ADTSAudioFileSource::doGetNextFrame() {
// Begin by reading the 7-byte fixed_variable headers:
unsigned char headers[7];
if (fread(headers, 1, sizeof headers, fFid) < sizeof headers
|| feof(fFid) || ferror(fFid)) {
// The input source has ended:
handleClosure();
return;
}
// Extract important fields from the headers:
Boolean protection_absent = headers[1]&0x01;
u_int16_t frame_length
= ((headers[3]&0x03)<<11) | (headers[4]<<3) | ((headers[5]&0xE0)>>5);
#ifdef DEBUG
u_int16_t syncword = (headers[0]<<4) | (headers[1]>>4);
fprintf(stderr, "Read frame: syncword 0x%x, protection_absent %d, frame_length %d\n", syncword, protection_absent, frame_length);
if (syncword != 0xFFF) fprintf(stderr, "WARNING: Bad syncword!\n");
#endif
unsigned numBytesToRead
= frame_length > sizeof headers ? frame_length - sizeof headers : 0;
// If there's a 'crc_check' field, skip it:
if (!protection_absent) {
SeekFile64(fFid, 2, SEEK_CUR);
numBytesToRead = numBytesToRead > 2 ? numBytesToRead - 2 : 0;
}
// Next, read the raw frame data into the buffer provided:
if (numBytesToRead > fMaxSize) {
fNumTruncatedBytes = numBytesToRead - fMaxSize;
numBytesToRead = fMaxSize;
}
int numBytesRead = fread(fTo, 1, numBytesToRead, fFid);
if (numBytesRead < 0) numBytesRead = 0;
fFrameSize = numBytesRead;
fNumTruncatedBytes += numBytesToRead - numBytesRead;
// Set the 'presentation time':
if (fPresentationTime.tv_sec == 0 && fPresentationTime.tv_usec == 0) {
// This is the first frame, so use the current time:
gettimeofday(&fPresentationTime, NULL);
} else {
// Increment by the play time of the previous frame:
unsigned uSeconds = fPresentationTime.tv_usec + fuSecsPerFrame;
fPresentationTime.tv_sec += uSeconds/1000000;
fPresentationTime.tv_usec = uSeconds%1000000;
}
fDurationInMicroseconds = fuSecsPerFrame;
// Switch to another task, and inform the reader that he has data:
nextTask() = envir().taskScheduler().scheduleDelayedTask(0,
(TaskFunc*)FramedSource::afterGetting, this);
}
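The 13-bit ADTS 'frame_length' field straddles header bytes 3..5; this standalone helper repeats the bit extraction done in doGetNextFrame() above (the name is illustrative only):

```cpp
// Extract the 13-bit frame_length from a 7-byte ADTS header:
unsigned adtsFrameLength(unsigned char const headers[7]) {
  return ((headers[3] & 0x03) << 11) | (headers[4] << 3) | ((headers[5] & 0xE0) >> 5);
}
```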
////////// live/liveMedia/AMRAudioFileServerMediaSubsession.cpp //////////
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A 'ServerMediaSubsession' object that creates new, unicast, "RTPSink"s
// on demand, from an AMR audio file.
// Implementation
#include "AMRAudioFileServerMediaSubsession.hh"
#include "AMRAudioRTPSink.hh"
#include "AMRAudioFileSource.hh"
AMRAudioFileServerMediaSubsession*
AMRAudioFileServerMediaSubsession::createNew(UsageEnvironment& env,
char const* fileName,
Boolean reuseFirstSource) {
return new AMRAudioFileServerMediaSubsession(env, fileName, reuseFirstSource);
}
AMRAudioFileServerMediaSubsession
::AMRAudioFileServerMediaSubsession(UsageEnvironment& env,
char const* fileName, Boolean reuseFirstSource)
: FileServerMediaSubsession(env, fileName, reuseFirstSource) {
}
AMRAudioFileServerMediaSubsession
::~AMRAudioFileServerMediaSubsession() {
}
FramedSource* AMRAudioFileServerMediaSubsession
::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
estBitrate = 10; // kbps, estimate
return AMRAudioFileSource::createNew(envir(), fFileName);
}
RTPSink* AMRAudioFileServerMediaSubsession
::createNewRTPSink(Groupsock* rtpGroupsock,
unsigned char rtpPayloadTypeIfDynamic,
FramedSource* inputSource) {
AMRAudioFileSource* amrSource = (AMRAudioFileSource*)inputSource;
return AMRAudioRTPSink::createNew(envir(), rtpGroupsock,
rtpPayloadTypeIfDynamic,
amrSource->isWideband(),
amrSource->numChannels());
}
////////// live/liveMedia/AMRAudioFileSink.cpp //////////
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// AMR Audio File sinks
// Implementation
#include "AMRAudioFileSink.hh"
#include "AMRAudioSource.hh"
#include "OutputFile.hh"
////////// AMRAudioFileSink //////////
AMRAudioFileSink
::AMRAudioFileSink(UsageEnvironment& env, FILE* fid, unsigned bufferSize,
char const* perFrameFileNamePrefix)
: FileSink(env, fid, bufferSize, perFrameFileNamePrefix),
fHaveWrittenHeader(False) {
}
AMRAudioFileSink::~AMRAudioFileSink() {
}
AMRAudioFileSink*
AMRAudioFileSink::createNew(UsageEnvironment& env, char const* fileName,
unsigned bufferSize, Boolean oneFilePerFrame) {
do {
FILE* fid;
char const* perFrameFileNamePrefix;
if (oneFilePerFrame) {
// Create the fid for each frame
fid = NULL;
perFrameFileNamePrefix = fileName;
} else {
// Normal case: create the fid once
fid = OpenOutputFile(env, fileName);
if (fid == NULL) break;
perFrameFileNamePrefix = NULL;
}
return new AMRAudioFileSink(env, fid, bufferSize, perFrameFileNamePrefix);
} while (0);
return NULL;
}
Boolean AMRAudioFileSink::sourceIsCompatibleWithUs(MediaSource& source) {
// The input source must be a AMR Audio source:
return source.isAMRAudioSource();
}
void AMRAudioFileSink::afterGettingFrame(unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime) {
AMRAudioSource* source = (AMRAudioSource*)fSource;
if (source == NULL) return; // sanity check
if (!fHaveWrittenHeader && fPerFrameFileNameBuffer == NULL) {
// Output the appropriate AMR header to the start of the file.
// This header is defined in RFC 4867, section 5.
// (However, we don't do this if we're creating one file per frame.)
char headerBuffer[100];
sprintf(headerBuffer, "#!AMR%s%s\n",
source->isWideband() ? "-WB" : "",
source->numChannels() > 1 ? "_MC1.0" : "");
unsigned headerLength = strlen(headerBuffer);
if (source->numChannels() > 1) {
// Also add a 32-bit channel description field:
headerBuffer[headerLength++] = 0;
headerBuffer[headerLength++] = 0;
headerBuffer[headerLength++] = 0;
headerBuffer[headerLength++] = source->numChannels();
}
addData((unsigned char*)headerBuffer, headerLength, presentationTime);
}
fHaveWrittenHeader = True;
// Add the 1-byte header, before writing the file data proper:
// (Again, we don't do this if we're creating one file per frame.)
if (fPerFrameFileNameBuffer == NULL) {
u_int8_t frameHeader = source->lastFrameHeader();
addData(&frameHeader, 1, presentationTime);
}
// Call the parent class to complete the normal file write with the input data:
FileSink::afterGettingFrame(frameSize, numTruncatedBytes, presentationTime);
}
////////// live/liveMedia/AMRAudioFileSource.cpp //////////
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A source object for AMR audio files (as defined in RFC 4867, section 5)
// Implementation
#include "AMRAudioFileSource.hh"
#include "InputFile.hh"
#include "GroupsockHelper.hh"
////////// AMRAudioFileSource //////////
AMRAudioFileSource*
AMRAudioFileSource::createNew(UsageEnvironment& env, char const* fileName) {
FILE* fid = NULL;
Boolean magicNumberOK = True;
do {
fid = OpenInputFile(env, fileName);
if (fid == NULL) break;
// Now, having opened the input file, read the first few bytes, to
// check the required 'magic number':
magicNumberOK = False; // until we learn otherwise
Boolean isWideband = False; // by default
unsigned numChannels = 1; // by default
char buf[100];
// Start with the first 6 bytes (the first 5 of which must be "#!AMR"):
if (fread(buf, 1, 6, fid) < 6) break;
if (strncmp(buf, "#!AMR", 5) != 0) break; // bad magic #
unsigned bytesRead = 6;
// The next bytes must be "\n", "-WB\n", "_MC1.0\n", or "-WB_MC1.0\n"
if (buf[5] == '-') {
// The next bytes must be "WB\n" or "WB_MC1.0\n"
if (fread(&buf[bytesRead], 1, 3, fid) < 3) break;
if (strncmp(&buf[bytesRead], "WB", 2) != 0) break; // bad magic #
isWideband = True;
bytesRead += 3;
}
if (buf[bytesRead-1] == '_') {
// The next bytes must be "MC1.0\n"
if (fread(&buf[bytesRead], 1, 6, fid) < 6) break;
if (strncmp(&buf[bytesRead], "MC1.0\n", 6) != 0) break; // bad magic #
bytesRead += 6;
// The next 4 bytes contain the number of channels:
char channelDesc[4];
if (fread(channelDesc, 1, 4, fid) < 4) break;
numChannels = channelDesc[3]&0xF;
} else if (buf[bytesRead-1] != '\n') {
break; // bad magic #
}
// If we get here, the magic number was OK:
magicNumberOK = True;
#ifdef DEBUG
fprintf(stderr, "isWideband: %d, numChannels: %d\n",
isWideband, numChannels);
#endif
return new AMRAudioFileSource(env, fid, isWideband, numChannels);
} while (0);
// An error occurred:
CloseInputFile(fid);
if (!magicNumberOK) {
env.setResultMsg("Bad (or nonexistent) AMR file header");
}
return NULL;
}
AMRAudioFileSource
::AMRAudioFileSource(UsageEnvironment& env, FILE* fid,
Boolean isWideband, unsigned numChannels)
: AMRAudioSource(env, isWideband, numChannels),
fFid(fid) {
}
AMRAudioFileSource::~AMRAudioFileSource() {
CloseInputFile(fFid);
}
// The mapping from the "FT" field to frame size.
// Values of 65535 are invalid.
#define FT_INVALID 65535
static unsigned short const frameSize[16] = {
12, 13, 15, 17,
19, 20, 26, 31,
5, FT_INVALID, FT_INVALID, FT_INVALID,
FT_INVALID, FT_INVALID, FT_INVALID, 0
};
static unsigned short const frameSizeWideband[16] = {
17, 23, 32, 36,
40, 46, 50, 58,
60, 5, FT_INVALID, FT_INVALID,
FT_INVALID, FT_INVALID, 0, 0
};
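The header check in doGetNextFrame() below can be summarized in one helper: the 1-byte AMR frame header is laid out as P(1) FT(4) Q(1) P(2), the padding bits (mask 0x83) must be zero, and the 4-bit FT field indexes the size tables above. This function is illustrative (its tables repeat the ones above); it returns the per-channel frame size in bytes, or -1 for an invalid header.

```cpp
// Validate an AMR frame header and look up the frame size for its FT field:
int amrFrameSizeFromHeader(unsigned char frameHeader, bool isWideband) {
  static unsigned short const nb[16] = {12, 13, 15, 17, 19, 20, 26, 31,
                                        5, 65535, 65535, 65535,
                                        65535, 65535, 65535, 0};
  static unsigned short const wb[16] = {17, 23, 32, 36, 40, 46, 50, 58,
                                        60, 5, 65535, 65535,
                                        65535, 65535, 0, 0};
  if ((frameHeader & 0x83) != 0) return -1;  // padding bits must be zero
  unsigned char ft = (frameHeader & 0x78) >> 3; // 4-bit frame type
  unsigned short size = isWideband ? wb[ft] : nb[ft];
  return size == 65535 ? -1 : (int)size;
}
```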
// Note: We should change the following to use asynchronous file reading, #####
// as we now do with ByteStreamFileSource. #####
void AMRAudioFileSource::doGetNextFrame() {
if (feof(fFid) || ferror(fFid)) {
handleClosure();
return;
}
// Begin by reading the 1-byte frame header (and checking it for validity)
while (1) {
if (fread(&fLastFrameHeader, 1, 1, fFid) < 1) {
handleClosure();
return;
}
if ((fLastFrameHeader&0x83) != 0) {
#ifdef DEBUG
fprintf(stderr, "Invalid frame header 0x%02x (padding bits (0x83) are not zero)\n", fLastFrameHeader);
#endif
} else {
unsigned char ft = (fLastFrameHeader&0x78)>>3;
fFrameSize = fIsWideband ? frameSizeWideband[ft] : frameSize[ft];
if (fFrameSize == FT_INVALID) {
#ifdef DEBUG
fprintf(stderr, "Invalid FT field %d (from frame header 0x%02x)\n",
ft, fLastFrameHeader);
#endif
} else {
// The frame header is OK
#ifdef DEBUG
fprintf(stderr, "Valid frame header 0x%02x -> ft %d -> frame size %d\n", fLastFrameHeader, ft, fFrameSize);
#endif
break;
}
}
}
// Next, read the frame-block into the buffer provided:
fFrameSize *= fNumChannels; // because multiple channels make up a frame-block
if (fFrameSize > fMaxSize) {
fNumTruncatedBytes = fFrameSize - fMaxSize;
fFrameSize = fMaxSize;
}
fFrameSize = fread(fTo, 1, fFrameSize, fFid);
// Set the 'presentation time':
if (fPresentationTime.tv_sec == 0 && fPresentationTime.tv_usec == 0) {
// This is the first frame, so use the current time:
gettimeofday(&fPresentationTime, NULL);
} else {
// Increment by the play time of the previous frame (20 ms)
unsigned uSeconds = fPresentationTime.tv_usec + 20000;
fPresentationTime.tv_sec += uSeconds/1000000;
fPresentationTime.tv_usec = uSeconds%1000000;
}
fDurationInMicroseconds = 20000; // each frame is 20 ms
// Switch to another task, and inform the reader that he has data:
nextTask() = envir().taskScheduler().scheduleDelayedTask(0,
(TaskFunc*)FramedSource::afterGetting, this);
}
////////// live/liveMedia/AMRAudioRTPSink.cpp //////////
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// RTP sink for AMR audio (RFC 4867)
// Implementation
// NOTE: At present, this is just a limited implementation, supporting:
// octet-alignment only; no interleaving; no frame CRC; no robust-sorting.
#include "AMRAudioRTPSink.hh"
#include "AMRAudioSource.hh"
AMRAudioRTPSink*
AMRAudioRTPSink::createNew(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
Boolean sourceIsWideband,
unsigned numChannelsInSource) {
return new AMRAudioRTPSink(env, RTPgs, rtpPayloadFormat,
sourceIsWideband, numChannelsInSource);
}
AMRAudioRTPSink
::AMRAudioRTPSink(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
Boolean sourceIsWideband, unsigned numChannelsInSource)
: AudioRTPSink(env, RTPgs, rtpPayloadFormat,
sourceIsWideband ? 16000 : 8000,
sourceIsWideband ? "AMR-WB": "AMR",
numChannelsInSource),
fSourceIsWideband(sourceIsWideband), fFmtpSDPLine(NULL) {
}
AMRAudioRTPSink::~AMRAudioRTPSink() {
delete[] fFmtpSDPLine;
}
Boolean AMRAudioRTPSink::sourceIsCompatibleWithUs(MediaSource& source) {
// Our source must be an AMR audio source:
if (!source.isAMRAudioSource()) return False;
// Also, the source must be wideband iff we asked for this:
AMRAudioSource& amrSource = (AMRAudioSource&)source;
if ((amrSource.isWideband()^fSourceIsWideband) != 0) return False;
// Also, the source must have the same number of channels that we
// specified. (It could, in principle, have more, but we don't
// support that.)
if (amrSource.numChannels() != numChannels()) return False;
// Also, because in our current implementation we output only one
// frame in each RTP packet, this means that for multi-channel audio,
// each 'frame-block' will be split over multiple RTP packets, which
// may violate the spec. Warn about this:
if (amrSource.numChannels() > 1) {
envir() << "AMRAudioRTPSink: Warning: Input source has " << amrSource.numChannels()
<< " audio channels. In the current implementation, the multi-frame frame-block will be split over multiple RTP packets\n";
}
return True;
}
void AMRAudioRTPSink::doSpecialFrameHandling(unsigned fragmentationOffset,
unsigned char* frameStart,
unsigned numBytesInFrame,
struct timeval framePresentationTime,
unsigned numRemainingBytes) {
// If this is the 1st frame in the 1st packet, set the RTP 'M' (marker)
// bit (because this is considered the start of a talk spurt):
if (isFirstPacket() && isFirstFrameInPacket()) {
setMarkerBit();
}
// If this is the first frame in the packet, set the 1-byte payload
// header (using CMR 15)
if (isFirstFrameInPacket()) {
u_int8_t payloadHeader = 0xF0;
setSpecialHeaderBytes(&payloadHeader, 1, 0);
}
// Set the TOC field for the current frame, based on the "FT" and "Q"
// values from our source:
AMRAudioSource* amrSource = (AMRAudioSource*)fSource;
if (amrSource == NULL) return; // sanity check
u_int8_t toc = amrSource->lastFrameHeader();
// Clear the "F" bit, because we're the last frame in this packet: #####
toc &=~ 0x80;
setSpecialHeaderBytes(&toc, 1, 1+numFramesUsedSoFar());
// Important: Also call our base class's doSpecialFrameHandling(),
// to set the packet's timestamp:
MultiFramedRTPSink::doSpecialFrameHandling(fragmentationOffset,
frameStart, numBytesInFrame,
framePresentationTime,
numRemainingBytes);
}
Boolean AMRAudioRTPSink
::frameCanAppearAfterPacketStart(unsigned char const* /*frameStart*/,
unsigned /*numBytesInFrame*/) const {
// For now, pack only one AMR frame into each outgoing RTP packet: #####
return False;
}
unsigned AMRAudioRTPSink::specialHeaderSize() const {
// For now, because we're packing only one frame per packet,
// there's just a 1-byte payload header, plus a 1-byte TOC #####
return 2;
}
char const* AMRAudioRTPSink::auxSDPLine() {
if (fFmtpSDPLine == NULL) {
// Generate an "a=fmtp:" line with "octet-align=1"
// (That is the only non-default parameter.)
char buf[100];
sprintf(buf, "a=fmtp:%d octet-align=1\r\n", rtpPayloadType());
delete[] fFmtpSDPLine; fFmtpSDPLine = strDup(buf);
}
return fFmtpSDPLine;
}
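auxSDPLine() above lazily formats a single "a=fmtp:" attribute. The same formatting as a standalone helper (the name and the example payload type 96 in the test are assumptions; "octet-align=1" is the RFC 4867 parameter being set):

```cpp
#include <cstdio>
#include <string>

// Format the AMR "a=fmtp:" SDP attribute for a given dynamic payload type:
std::string amrFmtpLine(int rtpPayloadType) {
  char buf[100];
  std::snprintf(buf, sizeof buf, "a=fmtp:%d octet-align=1\r\n", rtpPayloadType);
  return std::string(buf);
}
```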
////////// live/liveMedia/AMRAudioRTPSource.cpp //////////
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// AMR Audio RTP Sources (RFC 4867)
// Implementation
#include "AMRAudioRTPSource.hh"
#include "MultiFramedRTPSource.hh"
#include "BitVector.hh"
#include <string.h>
#include <stdlib.h>
// This source is implemented internally by two separate sources:
// (i) a RTP source for the raw (and possibly interleaved) AMR frames, and
// (ii) a deinterleaving filter that reads from this.
// Define these two new classes here:
class RawAMRRTPSource: public MultiFramedRTPSource {
public:
static RawAMRRTPSource*
createNew(UsageEnvironment& env,
Groupsock* RTPgs, unsigned char rtpPayloadFormat,
Boolean isWideband, Boolean isOctetAligned,
Boolean isInterleaved, Boolean CRCsArePresent);
Boolean isWideband() const { return fIsWideband; }
unsigned char ILL() const { return fILL; }
unsigned char ILP() const { return fILP; }
unsigned TOCSize() const { return fTOCSize; } // total # of frames in the last pkt
unsigned char* TOC() const { return fTOC; } // FT+Q value for each TOC entry
unsigned& frameIndex() { return fFrameIndex; } // index of frame-block within pkt
Boolean& isSynchronized() { return fIsSynchronized; }
private:
RawAMRRTPSource(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
Boolean isWideband, Boolean isOctetAligned,
Boolean isInterleaved, Boolean CRCsArePresent);
// called only by createNew()
virtual ~RawAMRRTPSource();
private:
// redefined virtual functions:
virtual Boolean hasBeenSynchronizedUsingRTCP();
virtual Boolean processSpecialHeader(BufferedPacket* packet,
unsigned& resultSpecialHeaderSize);
virtual char const* MIMEtype() const;
private:
Boolean fIsWideband, fIsOctetAligned, fIsInterleaved, fCRCsArePresent;
unsigned char fILL, fILP;
unsigned fTOCSize;
unsigned char* fTOC;
unsigned fFrameIndex;
Boolean fIsSynchronized;
};
class AMRDeinterleaver: public AMRAudioSource {
public:
static AMRDeinterleaver*
createNew(UsageEnvironment& env,
Boolean isWideband, unsigned numChannels, unsigned maxInterleaveGroupSize,
RawAMRRTPSource* inputSource);
private:
AMRDeinterleaver(UsageEnvironment& env,
Boolean isWideband, unsigned numChannels,
unsigned maxInterleaveGroupSize, RawAMRRTPSource* inputSource);
// called only by "createNew()"
virtual ~AMRDeinterleaver();
static void afterGettingFrame(void* clientData, unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds);
void afterGettingFrame1(unsigned frameSize, struct timeval presentationTime);
private:
// Redefined virtual functions:
void doGetNextFrame();
virtual void doStopGettingFrames();
private:
RawAMRRTPSource* fInputSource;
class AMRDeinterleavingBuffer* fDeinterleavingBuffer;
Boolean fNeedAFrame;
};
////////// AMRAudioRTPSource implementation //////////
#define MAX_NUM_CHANNELS 20 // far larger than ever expected...
#define MAX_INTERLEAVING_GROUP_SIZE 1000 // far larger than ever expected...
AMRAudioSource*
AMRAudioRTPSource::createNew(UsageEnvironment& env,
Groupsock* RTPgs,
RTPSource*& resultRTPSource,
unsigned char rtpPayloadFormat,
Boolean isWideband,
unsigned numChannels,
Boolean isOctetAligned,
unsigned interleaving,
Boolean robustSortingOrder,
Boolean CRCsArePresent) {
// Perform sanity checks on the input parameters:
if (robustSortingOrder) {
env << "AMRAudioRTPSource::createNew(): 'Robust sorting order' was specified, but we don't yet support this!\n";
return NULL;
} else if (numChannels > MAX_NUM_CHANNELS) {
env << "AMRAudioRTPSource::createNew(): The \"number of channels\" parameter ("
<< numChannels << ") is much too large!\n";
return NULL;
} else if (interleaving > MAX_INTERLEAVING_GROUP_SIZE) {
env << "AMRAudioRTPSource::createNew(): The \"interleaving\" parameter ("
<< interleaving << ") is much too large!\n";
return NULL;
}
// 'Bandwidth-efficient mode' precludes some other options:
if (!isOctetAligned) {
if (interleaving > 0 || robustSortingOrder || CRCsArePresent) {
env << "AMRAudioRTPSource::createNew(): 'Bandwidth-efficient mode' was specified, along with interleaving, 'robust sorting order', and/or CRCs, so we assume 'octet-aligned mode' instead.\n";
isOctetAligned = True;
}
}
Boolean isInterleaved;
unsigned maxInterleaveGroupSize; // in frames (not frame-blocks)
if (interleaving > 0) {
isInterleaved = True;
maxInterleaveGroupSize = interleaving*numChannels;
} else {
isInterleaved = False;
maxInterleaveGroupSize = numChannels;
}
RawAMRRTPSource* rawRTPSource;
resultRTPSource = rawRTPSource
= RawAMRRTPSource::createNew(env, RTPgs, rtpPayloadFormat,
isWideband, isOctetAligned,
isInterleaved, CRCsArePresent);
if (resultRTPSource == NULL) return NULL;
AMRDeinterleaver* deinterleaver
= AMRDeinterleaver::createNew(env, isWideband, numChannels,
maxInterleaveGroupSize, rawRTPSource);
if (deinterleaver == NULL) {
Medium::close(resultRTPSource);
resultRTPSource = NULL;
}
return deinterleaver;
}
////////// AMRBufferedPacket and AMRBufferedPacketFactory //////////
// A subclass of BufferedPacket, used to separate out AMR frames.
class AMRBufferedPacket: public BufferedPacket {
public:
AMRBufferedPacket(RawAMRRTPSource& ourSource);
virtual ~AMRBufferedPacket();
private: // redefined virtual functions
virtual unsigned nextEnclosedFrameSize(unsigned char*& framePtr,
unsigned dataSize);
private:
RawAMRRTPSource& fOurSource;
};
class AMRBufferedPacketFactory: public BufferedPacketFactory {
private: // redefined virtual functions
virtual BufferedPacket* createNewPacket(MultiFramedRTPSource* ourSource);
};
///////// RawAMRRTPSource implementation ////////
RawAMRRTPSource*
RawAMRRTPSource::createNew(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
Boolean isWideband, Boolean isOctetAligned,
Boolean isInterleaved, Boolean CRCsArePresent) {
return new RawAMRRTPSource(env, RTPgs, rtpPayloadFormat,
isWideband, isOctetAligned,
isInterleaved, CRCsArePresent);
}
RawAMRRTPSource
::RawAMRRTPSource(UsageEnvironment& env,
Groupsock* RTPgs, unsigned char rtpPayloadFormat,
Boolean isWideband, Boolean isOctetAligned,
Boolean isInterleaved, Boolean CRCsArePresent)
: MultiFramedRTPSource(env, RTPgs, rtpPayloadFormat,
isWideband ? 16000 : 8000,
new AMRBufferedPacketFactory),
fIsWideband(isWideband), fIsOctetAligned(isOctetAligned),
fIsInterleaved(isInterleaved), fCRCsArePresent(CRCsArePresent),
fILL(0), fILP(0), fTOCSize(0), fTOC(NULL), fFrameIndex(0), fIsSynchronized(False) {
}
RawAMRRTPSource::~RawAMRRTPSource() {
delete[] fTOC;
}
#define FT_SPEECH_LOST 14
#define FT_NO_DATA 15
static void unpackBandwidthEfficientData(BufferedPacket* packet,
Boolean isWideband); // forward
Boolean RawAMRRTPSource
::processSpecialHeader(BufferedPacket* packet,
unsigned& resultSpecialHeaderSize) {
// If the data is 'bandwidth-efficient', first unpack it so that it's
// 'octet-aligned':
if (!fIsOctetAligned) unpackBandwidthEfficientData(packet, fIsWideband);
unsigned char* headerStart = packet->data();
unsigned packetSize = packet->dataSize();
// There's at least a 1-byte header, containing the CMR:
if (packetSize < 1) return False;
resultSpecialHeaderSize = 1;
if (fIsInterleaved) {
// There's an extra byte, containing the interleave parameters:
if (packetSize < 2) return False;
// Get the interleaving parameters, and check them for validity:
unsigned char const secondByte = headerStart[1];
fILL = (secondByte&0xF0)>>4;
fILP = secondByte&0x0F;
if (fILP > fILL) return False; // invalid
++resultSpecialHeaderSize;
}
#ifdef DEBUG
fprintf(stderr, "packetSize: %d, ILL: %d, ILP: %d\n", packetSize, fILL, fILP);
#endif
fFrameIndex = 0; // initially
// Next, there's a "Payload Table of Contents" (one byte per entry):
unsigned numFramesPresent = 0, numNonEmptyFramesPresent = 0;
unsigned tocStartIndex = resultSpecialHeaderSize;
Boolean F;
do {
if (resultSpecialHeaderSize >= packetSize) return False;
unsigned char const tocByte = headerStart[resultSpecialHeaderSize++];
F = (tocByte&0x80) != 0;
unsigned char const FT = (tocByte&0x78) >> 3;
#ifdef DEBUG
unsigned char Q = (tocByte&0x04)>>2;
fprintf(stderr, "\tTOC entry: F %d, FT %d, Q %d\n", F, FT, Q);
#endif
++numFramesPresent;
if (FT != FT_SPEECH_LOST && FT != FT_NO_DATA) ++numNonEmptyFramesPresent;
} while (F);
#ifdef DEBUG
fprintf(stderr, "TOC contains %d entries (%d non-empty)\n", numFramesPresent, numNonEmptyFramesPresent);
#endif
// Now that we know the size of the TOC, fill in our copy:
if (numFramesPresent > fTOCSize) {
delete[] fTOC;
fTOC = new unsigned char[numFramesPresent];
}
fTOCSize = numFramesPresent;
for (unsigned i = 0; i < fTOCSize; ++i) {
unsigned char const tocByte = headerStart[tocStartIndex + i];
fTOC[i] = tocByte&0x7C; // clear everything except the F and Q fields
}
if (fCRCsArePresent) {
// 'numNonEmptyFramesPresent' CRC bytes will follow.
// Note: we currently don't check the CRCs for validity #####
resultSpecialHeaderSize += numNonEmptyFramesPresent;
#ifdef DEBUG
fprintf(stderr, "Ignoring %d following CRC bytes\n", numNonEmptyFramesPresent);
#endif
if (resultSpecialHeaderSize > packetSize) return False;
}
#ifdef DEBUG
fprintf(stderr, "Total special header size: %d\n", resultSpecialHeaderSize);
#endif
return True;
}
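`processSpecialHeader()` above walks a 1-byte CMR header followed by TOC bytes whose F bit (0x80) marks "another entry follows". A minimal standalone sketch of that walk for the simplest case (no interleaving, no CRCs; helper name is hypothetical, not part of live555):

```cpp
#include <cassert>
#include <cstddef>

// Compute the octet-aligned AMR payload header size (CMR byte + TOC bytes).
// Returns 0 if the payload is truncated; sets numFrames to the TOC length.
size_t amrHeaderSize(unsigned char const* payload, size_t payloadSize,
                     unsigned& numFrames) {
  numFrames = 0;
  if (payloadSize < 1) return 0; // need at least the CMR byte
  size_t pos = 1; // skip the CMR byte
  bool F;
  do {
    if (pos >= payloadSize) return 0; // TOC ran off the end of the packet
    unsigned char const tocByte = payload[pos++];
    F = (tocByte & 0x80) != 0; // follow bit: more TOC entries follow
    ++numFrames;
  } while (F);
  return pos;
}
```

Each frame's data then follows the header, in the same order as its TOC entry; that is what lets `nextEnclosedFrameSize()` below split the packet without any per-frame length fields.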
char const* RawAMRRTPSource::MIMEtype() const {
return fIsWideband ? "audio/AMR-WB" : "audio/AMR";
}
Boolean RawAMRRTPSource::hasBeenSynchronizedUsingRTCP() {
return fIsSynchronized;
}
///// AMRBufferedPacket and AMRBufferedPacketFactory implementation
AMRBufferedPacket::AMRBufferedPacket(RawAMRRTPSource& ourSource)
: fOurSource(ourSource) {
}
AMRBufferedPacket::~AMRBufferedPacket() {
}
// The mapping from the "FT" field to frame size.
// Values of 65535 are invalid.
#define FT_INVALID 65535
static unsigned short const frameBytesFromFT[16] = {
12, 13, 15, 17,
19, 20, 26, 31,
5, FT_INVALID, FT_INVALID, FT_INVALID,
FT_INVALID, FT_INVALID, FT_INVALID, 0
};
static unsigned short const frameBytesFromFTWideband[16] = {
17, 23, 32, 36,
40, 46, 50, 58,
60, 5, FT_INVALID, FT_INVALID,
FT_INVALID, FT_INVALID, 0, 0
};
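The two tables above encode the fixed per-mode frame sizes of RFC 4867. The lookup that `nextEnclosedFrameSize()` performs can be sketched standalone as follows (helper and table names are hypothetical; the values mirror the tables above):

```cpp
#include <cassert>

// Frame sizes in bytes, indexed by the 4-bit FT field (RFC 4867).
// 65535 marks FT values that are invalid for the given codec.
static unsigned short const kInvalid = 65535;
static unsigned short const kNarrowband[16] = {
  12, 13, 15, 17, 19, 20, 26, 31,
  5, kInvalid, kInvalid, kInvalid, kInvalid, kInvalid, kInvalid, 0
};
static unsigned short const kWideband[16] = {
  17, 23, 32, 36, 40, 46, 50, 58,
  60, 5, kInvalid, kInvalid, kInvalid, kInvalid, 0, 0
};

// Extract FT from a TOC byte (F | FT(4 bits) | Q | padding) and map it to a
// frame size; an invalid FT is treated as an empty frame, as in the code above:
unsigned amrFrameSize(unsigned char tocByte, bool isWideband) {
  unsigned char const FT = (tocByte & 0x78) >> 3;
  unsigned short const size = isWideband ? kWideband[FT] : kNarrowband[FT];
  return size == kInvalid ? 0 : size;
}
```

Note that the largest entry, 60 bytes for wideband FT 8, is exactly the `AMR_MAX_FRAME_SIZE` used later to size the deinterleaving buffers.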
unsigned AMRBufferedPacket::
nextEnclosedFrameSize(unsigned char*& framePtr, unsigned dataSize) {
if (dataSize == 0) return 0; // sanity check
// The size of the AMR frame is determined by the corresponding 'FT' value
// in the packet's Table of Contents.
unsigned const tocIndex = fOurSource.frameIndex();
if (tocIndex >= fOurSource.TOCSize()) return 0; // sanity check
unsigned char const tocByte = fOurSource.TOC()[tocIndex];
unsigned char const FT = (tocByte&0x78) >> 3;
// ASSERT: FT < 16
unsigned short frameSize
= fOurSource.isWideband() ? frameBytesFromFTWideband[FT] : frameBytesFromFT[FT];
if (frameSize == FT_INVALID) {
// Strange TOC entry!
fOurSource.envir() << "AMRBufferedPacket::nextEnclosedFrameSize(): invalid FT: " << FT << "\n";
frameSize = 0; // This probably messes up the rest of this packet, but...
}
#ifdef DEBUG
fprintf(stderr, "AMRBufferedPacket::nextEnclosedFrameSize(): frame #: %d, FT: %d, isWideband: %d => frameSize: %d (dataSize: %d)\n", tocIndex, FT, fOurSource.isWideband(), frameSize, dataSize);
#endif
++fOurSource.frameIndex();
if (dataSize < frameSize) return 0;
return frameSize;
}
BufferedPacket* AMRBufferedPacketFactory
::createNewPacket(MultiFramedRTPSource* ourSource) {
return new AMRBufferedPacket((RawAMRRTPSource&)(*ourSource));
}
///////// AMRDeinterleavingBuffer /////////
// (used to implement AMRDeinterleaver)
#define AMR_MAX_FRAME_SIZE 60
class AMRDeinterleavingBuffer {
public:
AMRDeinterleavingBuffer(unsigned numChannels, unsigned maxInterleaveGroupSize);
virtual ~AMRDeinterleavingBuffer();
void deliverIncomingFrame(unsigned frameSize, RawAMRRTPSource* source,
struct timeval presentationTime);
Boolean retrieveFrame(unsigned char* to, unsigned maxSize,
unsigned& resultFrameSize, unsigned& resultNumTruncatedBytes,
u_int8_t& resultFrameHeader,
struct timeval& resultPresentationTime,
Boolean& resultIsSynchronized);
unsigned char* inputBuffer() { return fInputBuffer; }
unsigned inputBufferSize() const { return AMR_MAX_FRAME_SIZE; }
private:
unsigned char* createNewBuffer();
class FrameDescriptor {
public:
FrameDescriptor();
virtual ~FrameDescriptor();
unsigned frameSize;
unsigned char* frameData;
u_int8_t frameHeader;
struct timeval presentationTime;
Boolean fIsSynchronized;
};
unsigned fNumChannels, fMaxInterleaveGroupSize;
FrameDescriptor* fFrames[2];
unsigned char fIncomingBankId; // toggles between 0 and 1
unsigned char fIncomingBinMax; // in the incoming bank
unsigned char fOutgoingBinMax; // in the outgoing bank
unsigned char fNextOutgoingBin;
Boolean fHaveSeenPackets;
u_int16_t fLastPacketSeqNumForGroup;
unsigned char* fInputBuffer;
struct timeval fLastRetrievedPresentationTime;
unsigned fNumSuccessiveSyncedFrames;
unsigned char fILL;
};
////////// AMRDeinterleaver implementation /////////
AMRDeinterleaver* AMRDeinterleaver
::createNew(UsageEnvironment& env,
Boolean isWideband, unsigned numChannels, unsigned maxInterleaveGroupSize,
RawAMRRTPSource* inputSource) {
return new AMRDeinterleaver(env, isWideband, numChannels, maxInterleaveGroupSize, inputSource);
}
AMRDeinterleaver::AMRDeinterleaver(UsageEnvironment& env,
Boolean isWideband, unsigned numChannels,
unsigned maxInterleaveGroupSize,
RawAMRRTPSource* inputSource)
: AMRAudioSource(env, isWideband, numChannels),
fInputSource(inputSource), fNeedAFrame(False) {
fDeinterleavingBuffer
= new AMRDeinterleavingBuffer(numChannels, maxInterleaveGroupSize);
}
AMRDeinterleaver::~AMRDeinterleaver() {
delete fDeinterleavingBuffer;
Medium::close(fInputSource);
}
static unsigned const uSecsPerFrame = 20000; // 20 ms
void AMRDeinterleaver::doGetNextFrame() {
// First, try getting a frame from the deinterleaving buffer:
if (fDeinterleavingBuffer->retrieveFrame(fTo, fMaxSize,
fFrameSize, fNumTruncatedBytes,
fLastFrameHeader, fPresentationTime,
fInputSource->isSynchronized())) {
// Success!
fNeedAFrame = False;
fDurationInMicroseconds = uSecsPerFrame;
// Call our own 'after getting' function. Because we're not a 'leaf'
// source, we can call this directly, without risking
// infinite recursion
afterGetting(this);
return;
}
// No luck, so ask our source for help:
fNeedAFrame = True;
if (!fInputSource->isCurrentlyAwaitingData()) {
fInputSource->getNextFrame(fDeinterleavingBuffer->inputBuffer(),
fDeinterleavingBuffer->inputBufferSize(),
afterGettingFrame, this,
FramedSource::handleClosure, this);
}
}
void AMRDeinterleaver::doStopGettingFrames() {
fNeedAFrame = False;
fInputSource->stopGettingFrames();
}
void AMRDeinterleaver
::afterGettingFrame(void* clientData, unsigned frameSize,
unsigned /*numTruncatedBytes*/,
struct timeval presentationTime,
unsigned /*durationInMicroseconds*/) {
AMRDeinterleaver* deinterleaver = (AMRDeinterleaver*)clientData;
deinterleaver->afterGettingFrame1(frameSize, presentationTime);
}
void AMRDeinterleaver
::afterGettingFrame1(unsigned frameSize, struct timeval presentationTime) {
RawAMRRTPSource* source = (RawAMRRTPSource*)fInputSource;
// First, put the frame into our deinterleaving buffer:
fDeinterleavingBuffer->deliverIncomingFrame(frameSize, source, presentationTime);
// Then, try delivering a frame to the client (if he wants one):
if (fNeedAFrame) doGetNextFrame();
}
////////// AMRDeinterleavingBuffer implementation /////////
AMRDeinterleavingBuffer
::AMRDeinterleavingBuffer(unsigned numChannels, unsigned maxInterleaveGroupSize)
: fNumChannels(numChannels), fMaxInterleaveGroupSize(maxInterleaveGroupSize),
fIncomingBankId(0), fIncomingBinMax(0),
fOutgoingBinMax(0), fNextOutgoingBin(0),
fHaveSeenPackets(False), fNumSuccessiveSyncedFrames(0), fILL(0) {
// Use two banks of descriptors - one for incoming, one for outgoing
fFrames[0] = new FrameDescriptor[fMaxInterleaveGroupSize];
fFrames[1] = new FrameDescriptor[fMaxInterleaveGroupSize];
fInputBuffer = createNewBuffer();
}
AMRDeinterleavingBuffer::~AMRDeinterleavingBuffer() {
delete[] fInputBuffer;
delete[] fFrames[0]; delete[] fFrames[1];
}
void AMRDeinterleavingBuffer
::deliverIncomingFrame(unsigned frameSize, RawAMRRTPSource* source,
struct timeval presentationTime) {
fILL = source->ILL();
unsigned char const ILP = source->ILP();
unsigned frameIndex = source->frameIndex();
unsigned short packetSeqNum = source->curPacketRTPSeqNum();
// First perform a sanity check on the parameters:
// (This is overkill, as the source should have already done this.)
if (ILP > fILL || frameIndex == 0) {
#ifdef DEBUG
fprintf(stderr, "AMRDeinterleavingBuffer::deliverIncomingFrame() param sanity check failed (%d,%d,%d,%d)\n", frameSize, fILL, ILP, frameIndex);
#endif
source->envir().internalError();
}
--frameIndex; // because it was incremented by the source when this frame was read
u_int8_t frameHeader;
if (frameIndex >= source->TOCSize()) { // sanity check
frameHeader = FT_NO_DATA<<3;
} else {
frameHeader = source->TOC()[frameIndex];
}
unsigned frameBlockIndex = frameIndex/fNumChannels;
unsigned frameWithinFrameBlock = frameIndex%fNumChannels;
// The input "presentationTime" was that of the first frame-block in this
// packet. Update it for the current frame:
unsigned uSecIncrement = frameBlockIndex*(fILL+1)*uSecsPerFrame;
presentationTime.tv_usec += uSecIncrement;
presentationTime.tv_sec += presentationTime.tv_usec/1000000;
presentationTime.tv_usec = presentationTime.tv_usec%1000000;
// Next, check whether this packet is part of a new interleave group
if (!fHaveSeenPackets
|| seqNumLT(fLastPacketSeqNumForGroup, packetSeqNum + frameBlockIndex)) {
// We've moved to a new interleave group
#ifdef DEBUG
fprintf(stderr, "AMRDeinterleavingBuffer::deliverIncomingFrame(): new interleave group\n");
#endif
fHaveSeenPackets = True;
fLastPacketSeqNumForGroup = packetSeqNum + fILL - ILP;
// Switch the incoming and outgoing banks:
fIncomingBankId ^= 1;
unsigned char tmp = fIncomingBinMax;
fIncomingBinMax = fOutgoingBinMax;
fOutgoingBinMax = tmp;
fNextOutgoingBin = 0;
}
// Now move the incoming frame into the appropriate bin:
unsigned const binNumber
= ((ILP + frameBlockIndex*(fILL+1))*fNumChannels + frameWithinFrameBlock)
% fMaxInterleaveGroupSize; // the % is for sanity
#ifdef DEBUG
fprintf(stderr, "AMRDeinterleavingBuffer::deliverIncomingFrame(): frameIndex %d (%d,%d) put in bank %d, bin %d (%d): size %d, header 0x%02x, presentationTime %lu.%06ld\n", frameIndex, frameBlockIndex, frameWithinFrameBlock, fIncomingBankId, binNumber, fMaxInterleaveGroupSize, frameSize, frameHeader, presentationTime.tv_sec, presentationTime.tv_usec);
#endif
FrameDescriptor& inBin = fFrames[fIncomingBankId][binNumber];
unsigned char* curBuffer = inBin.frameData;
inBin.frameData = fInputBuffer;
inBin.frameSize = frameSize;
inBin.frameHeader = frameHeader;
inBin.presentationTime = presentationTime;
inBin.fIsSynchronized = ((RTPSource*)source)->RTPSource::hasBeenSynchronizedUsingRTCP();
if (curBuffer == NULL) curBuffer = createNewBuffer();
fInputBuffer = curBuffer;
if (binNumber >= fIncomingBinMax) {
fIncomingBinMax = binNumber + 1;
}
}
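The bin index computed in `deliverIncomingFrame()` above is what realizes the interleaving scheme: frames that arrive in the same packet are spread `ILL+1` bins apart, so reading the bins sequentially restores playout order. A minimal sketch of that formula alone (function name hypothetical):

```cpp
#include <cassert>

// Deinterleaving bin for one frame, per the formula in deliverIncomingFrame().
// ILL+1 is the interleave cycle length; ILP is this packet's offset within it.
unsigned binNumber(unsigned ILL, unsigned ILP,
                   unsigned frameBlockIndex, unsigned frameWithinBlock,
                   unsigned numChannels, unsigned maxGroupSize) {
  return ((ILP + frameBlockIndex*(ILL + 1))*numChannels + frameWithinBlock)
         % maxGroupSize; // the % is only a sanity guard, as in the code above
}
```

For example, with `ILL` 1 and one channel: the packet with `ILP` 0 fills bins 0 and 2, the packet with `ILP` 1 fills bins 1 and 3, and `retrieveFrame()` then reads bins 0, 1, 2, 3 in order.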
Boolean AMRDeinterleavingBuffer
::retrieveFrame(unsigned char* to, unsigned maxSize,
unsigned& resultFrameSize, unsigned& resultNumTruncatedBytes,
u_int8_t& resultFrameHeader,
struct timeval& resultPresentationTime,
Boolean& resultIsSynchronized) {
if (fNextOutgoingBin >= fOutgoingBinMax) return False; // none left
FrameDescriptor& outBin = fFrames[fIncomingBankId^1][fNextOutgoingBin];
unsigned char* fromPtr = outBin.frameData;
unsigned char fromSize = outBin.frameSize;
outBin.frameSize = 0; // for the next time this bin is used
resultIsSynchronized = False; // by default; can be changed by:
if (outBin.fIsSynchronized) {
// Don't consider the outgoing frame to be synchronized until we've received at least a complete interleave cycle of
// synchronized frames. This ensures that the receiver will be getting all synchronized frames from now on.
if (++fNumSuccessiveSyncedFrames > fILL) {
resultIsSynchronized = True;
fNumSuccessiveSyncedFrames = fILL+1; // prevents overflow
}
} else {
fNumSuccessiveSyncedFrames = 0;
}
// Check whether this frame is missing; if so, return a FT_NO_DATA frame:
if (fromSize == 0) {
resultFrameHeader = FT_NO_DATA<<3;
// Compute this erasure frame's presentation time via extrapolation:
resultPresentationTime = fLastRetrievedPresentationTime;
resultPresentationTime.tv_usec += uSecsPerFrame;
if (resultPresentationTime.tv_usec >= 1000000) {
++resultPresentationTime.tv_sec;
resultPresentationTime.tv_usec -= 1000000;
}
} else {
// Normal case - a frame exists:
resultFrameHeader = outBin.frameHeader;
resultPresentationTime = outBin.presentationTime;
}
fLastRetrievedPresentationTime = resultPresentationTime;
if (fromSize > maxSize) {
resultNumTruncatedBytes = fromSize - maxSize;
resultFrameSize = maxSize;
} else {
resultNumTruncatedBytes = 0;
resultFrameSize = fromSize;
}
memmove(to, fromPtr, resultFrameSize);
#ifdef DEBUG
fprintf(stderr, "AMRDeinterleavingBuffer::retrieveFrame(): from bank %d, bin %d: size %d, header 0x%02x, presentationTime %lu.%06ld\n", fIncomingBankId^1, fNextOutgoingBin, resultFrameSize, resultFrameHeader, resultPresentationTime.tv_sec, resultPresentationTime.tv_usec);
#endif
++fNextOutgoingBin;
return True;
}
unsigned char* AMRDeinterleavingBuffer::createNewBuffer() {
return new unsigned char[inputBufferSize()];
}
AMRDeinterleavingBuffer::FrameDescriptor::FrameDescriptor()
: frameSize(0), frameData(NULL) {
}
AMRDeinterleavingBuffer::FrameDescriptor::~FrameDescriptor() {
delete[] frameData;
}
// Unpack 'bandwidth-efficient' data to octet-aligned form:
static unsigned short const frameBitsFromFT[16] = {
95, 103, 118, 134,
148, 159, 204, 244,
39, 0, 0, 0,
0, 0, 0, 0
};
static unsigned short const frameBitsFromFTWideband[16] = {
132, 177, 253, 285,
317, 365, 397, 461,
477, 40, 0, 0,
0, 0, 0, 0
};
static void unpackBandwidthEfficientData(BufferedPacket* packet,
Boolean isWideband) {
#ifdef DEBUG
fprintf(stderr, "Unpacking 'bandwidth-efficient' payload (%d bytes):\n", packet->dataSize());
for (unsigned j = 0; j < packet->dataSize(); ++j) {
fprintf(stderr, "%02x:", (packet->data())[j]);
}
fprintf(stderr, "\n");
#endif
BitVector fromBV(packet->data(), 0, 8*packet->dataSize());
unsigned const toBufferSize = 2*packet->dataSize(); // conservatively large
unsigned char* toBuffer = new unsigned char[toBufferSize];
unsigned toCount = 0;
// Begin with the payload header:
unsigned CMR = fromBV.getBits(4);
toBuffer[toCount++] = CMR << 4;
// Then, run through and unpack the TOC entries:
while (1) {
unsigned toc = fromBV.getBits(6);
toBuffer[toCount++] = toc << 2;
if ((toc&0x20) == 0) break; // the F bit is 0
}
// Then, using the TOC data, unpack each frame payload:
unsigned const tocSize = toCount - 1;
for (unsigned i = 1; i <= tocSize; ++i) {
unsigned char tocByte = toBuffer[i];
unsigned char const FT = (tocByte&0x78) >> 3;
unsigned short frameSizeBits
= isWideband ? frameBitsFromFTWideband[FT] : frameBitsFromFT[FT];
unsigned short frameSizeBytes = (frameSizeBits+7)/8;
shiftBits(&toBuffer[toCount], 0, // to
packet->data(), fromBV.curBitIndex(), // from
frameSizeBits // num bits
);
#ifdef DEBUG
if (frameSizeBits > fromBV.numBitsRemaining()) {
fprintf(stderr, "\tWarning: Unpacking frame %d of %d: want %d bits, but only %d are available!\n", i, tocSize, frameSizeBits, fromBV.numBitsRemaining());
}
#endif
fromBV.skipBits(frameSizeBits);
toCount += frameSizeBytes;
}
#ifdef DEBUG
if (fromBV.numBitsRemaining() > 7) {
fprintf(stderr, "\tWarning: %d bits remain unused!\n", fromBV.numBitsRemaining());
}
#endif
// Finally, replace the current packet data with the unpacked data:
packet->removePadding(packet->dataSize()); // throws away current packet data
packet->appendData(toBuffer, toCount);
delete[] toBuffer;
}
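In the bandwidth-efficient packing that `unpackBandwidthEfficientData()` undoes, the CMR is 4 bits and each TOC entry is 6 bits (F, FT, Q), with no padding between fields. A standalone sketch of unpacking just the header portion, using a plain MSB-first bit reader instead of live555's `BitVector` (helper names are hypothetical):

```cpp
#include <cassert>
#include <vector>

// Read n bits MSB-first starting at bitIndex, advancing bitIndex:
unsigned getBits(unsigned char const* buf, unsigned& bitIndex, unsigned n) {
  unsigned result = 0;
  while (n-- > 0) {
    result = (result << 1) | ((buf[bitIndex/8] >> (7 - bitIndex%8)) & 1);
    ++bitIndex;
  }
  return result;
}

// Unpack the bandwidth-efficient payload header into octet-aligned bytes:
// the 4-bit CMR becomes one byte, and each 6-bit TOC entry becomes one byte,
// left-justified as in the code above.
std::vector<unsigned char> unpackHeader(unsigned char const* buf,
                                        unsigned& bitIndex) {
  std::vector<unsigned char> out;
  out.push_back(getBits(buf, bitIndex, 4) << 4); // CMR
  unsigned toc;
  do {
    toc = getBits(buf, bitIndex, 6);
    out.push_back(toc << 2);
  } while (toc & 0x20); // F bit (top bit of the 6-bit entry): more entries
  return out;
}
```

After the header, the frame payloads themselves are bit-packed back to back, which is why the function above has to shift every frame by a sub-byte offset (`shiftBits`) rather than just memcpy it.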
live/liveMedia/AMRAudioSource.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A source object for AMR audio sources
// Implementation
#include "AMRAudioSource.hh"
AMRAudioSource::AMRAudioSource(UsageEnvironment& env,
Boolean isWideband, unsigned numChannels)
: FramedSource(env),
fIsWideband(isWideband), fNumChannels(numChannels), fLastFrameHeader(0) {
}
AMRAudioSource::~AMRAudioSource() {
}
char const* AMRAudioSource::MIMEtype() const {
return "audio/AMR";
}
Boolean AMRAudioSource::isAMRAudioSource() const {
return True;
}
live/liveMedia/AudioInputDevice.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// Copyright (c) 2001-2003 Live Networks, Inc. All rights reserved.
// Generic audio input device (such as a microphone, or an input sound card)
// Implementation
#include <AudioInputDevice.hh>
AudioInputDevice
::AudioInputDevice(UsageEnvironment& env, unsigned char bitsPerSample,
unsigned char numChannels,
unsigned samplingFrequency, unsigned granularityInMS)
: FramedSource(env), fBitsPerSample(bitsPerSample),
fNumChannels(numChannels), fSamplingFrequency(samplingFrequency),
fGranularityInMS(granularityInMS) {
}
AudioInputDevice::~AudioInputDevice() {
}
char** AudioInputDevice::allowedDeviceNames = NULL;
////////// AudioPortNames implementation //////////
AudioPortNames::AudioPortNames()
: numPorts(0), portName(NULL) {
}
AudioPortNames::~AudioPortNames() {
for (unsigned i = 0; i < numPorts; ++i) delete[] portName[i];
delete[] portName;
}
live/liveMedia/AudioRTPSink.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A generic RTP sink for audio codecs (abstract base class)
// Implementation
#include "AudioRTPSink.hh"
AudioRTPSink::AudioRTPSink(UsageEnvironment& env,
Groupsock* rtpgs, unsigned char rtpPayloadType,
unsigned rtpTimestampFrequency,
char const* rtpPayloadFormatName,
unsigned numChannels)
: MultiFramedRTPSink(env, rtpgs, rtpPayloadType, rtpTimestampFrequency,
rtpPayloadFormatName, numChannels) {
}
AudioRTPSink::~AudioRTPSink() {
}
char const* AudioRTPSink::sdpMediaType() const {
return "audio";
}
live/liveMedia/AVIFileSink.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A sink that generates an AVI file from a composite media session
// Implementation
#include "AVIFileSink.hh"
#include "InputFile.hh"
#include "OutputFile.hh"
#include "GroupsockHelper.hh"
#define fourChar(x,y,z,w) ( ((w)<<24)|((z)<<16)|((y)<<8)|(x) )/*little-endian*/
////////// AVISubsessionIOState ///////////
// A structure used to represent the I/O state of each input 'subsession':
class SubsessionBuffer {
public:
SubsessionBuffer(unsigned bufferSize)
: fBufferSize(bufferSize) {
reset();
fData = new unsigned char[bufferSize];
}
virtual ~SubsessionBuffer() { delete[] fData; }
void reset() { fBytesInUse = 0; }
void addBytes(unsigned numBytes) { fBytesInUse += numBytes; }
unsigned char* dataStart() { return &fData[0]; }
unsigned char* dataEnd() { return &fData[fBytesInUse]; }
unsigned bytesInUse() const { return fBytesInUse; }
unsigned bytesAvailable() const { return fBufferSize - fBytesInUse; }
void setPresentationTime(struct timeval const& presentationTime) {
fPresentationTime = presentationTime;
}
struct timeval const& presentationTime() const {return fPresentationTime;}
private:
unsigned fBufferSize;
struct timeval fPresentationTime;
unsigned char* fData;
unsigned fBytesInUse;
};
class AVISubsessionIOState {
public:
AVISubsessionIOState(AVIFileSink& sink, MediaSubsession& subsession);
virtual ~AVISubsessionIOState();
void setAVIstate(unsigned subsessionIndex);
void setFinalAVIstate();
void afterGettingFrame(unsigned packetDataSize,
struct timeval presentationTime);
void onSourceClosure();
UsageEnvironment& envir() const { return fOurSink.envir(); }
public:
SubsessionBuffer *fBuffer, *fPrevBuffer;
AVIFileSink& fOurSink;
MediaSubsession& fOurSubsession;
unsigned short fLastPacketRTPSeqNum;
Boolean fOurSourceIsActive;
struct timeval fPrevPresentationTime;
unsigned fMaxBytesPerSecond;
Boolean fIsVideo, fIsAudio, fIsByteSwappedAudio;
unsigned fAVISubsessionTag;
unsigned fAVICodecHandlerType;
unsigned fAVISamplingFrequency; // for audio
u_int16_t fWAVCodecTag; // for audio
unsigned fAVIScale;
unsigned fAVIRate;
unsigned fAVISize;
unsigned fNumFrames;
unsigned fSTRHFrameCountPosition;
private:
void useFrame(SubsessionBuffer& buffer);
};
///////// AVIIndexRecord definition & implementation //////////
class AVIIndexRecord {
public:
AVIIndexRecord(unsigned chunkId, unsigned flags, unsigned offset, unsigned size)
: fNext(NULL), fChunkId(chunkId), fFlags(flags), fOffset(offset), fSize(size) {
}
AVIIndexRecord*& next() { return fNext; }
unsigned chunkId() const { return fChunkId; }
unsigned flags() const { return fFlags; }
unsigned offset() const { return fOffset; }
unsigned size() const { return fSize; }
private:
AVIIndexRecord* fNext;
unsigned fChunkId;
unsigned fFlags;
unsigned fOffset;
unsigned fSize;
};
////////// AVIFileSink implementation //////////
AVIFileSink::AVIFileSink(UsageEnvironment& env,
MediaSession& inputSession,
char const* outputFileName,
unsigned bufferSize,
unsigned short movieWidth, unsigned short movieHeight,
unsigned movieFPS, Boolean packetLossCompensate)
: Medium(env), fInputSession(inputSession),
fIndexRecordsHead(NULL), fIndexRecordsTail(NULL), fNumIndexRecords(0),
fBufferSize(bufferSize), fPacketLossCompensate(packetLossCompensate),
fAreCurrentlyBeingPlayed(False), fNumSubsessions(0), fNumBytesWritten(0),
fHaveCompletedOutputFile(False),
fMovieWidth(movieWidth), fMovieHeight(movieHeight), fMovieFPS(movieFPS) {
fOutFid = OpenOutputFile(env, outputFileName);
if (fOutFid == NULL) return;
// Set up I/O state for each input subsession:
MediaSubsessionIterator iter(fInputSession);
MediaSubsession* subsession;
while ((subsession = iter.next()) != NULL) {
// Ignore subsessions without a data source:
FramedSource* subsessionSource = subsession->readSource();
if (subsessionSource == NULL) continue;
// If "subsession's" SDP description specified screen dimension
// or frame rate parameters, then use these.
if (subsession->videoWidth() != 0) {
fMovieWidth = subsession->videoWidth();
}
if (subsession->videoHeight() != 0) {
fMovieHeight = subsession->videoHeight();
}
if (subsession->videoFPS() != 0) {
fMovieFPS = subsession->videoFPS();
}
AVISubsessionIOState* ioState
= new AVISubsessionIOState(*this, *subsession);
subsession->miscPtr = (void*)ioState;
// Also set a 'BYE' handler for this subsession's RTCP instance:
if (subsession->rtcpInstance() != NULL) {
subsession->rtcpInstance()->setByeHandler(onRTCPBye, ioState);
}
++fNumSubsessions;
}
// Begin by writing an AVI header:
addFileHeader_AVI();
}
AVIFileSink::~AVIFileSink() {
completeOutputFile();
// Then, stop streaming and delete each active "AVISubsessionIOState":
MediaSubsessionIterator iter(fInputSession);
MediaSubsession* subsession;
while ((subsession = iter.next()) != NULL) {
if (subsession->readSource() != NULL) subsession->readSource()->stopGettingFrames();
AVISubsessionIOState* ioState
= (AVISubsessionIOState*)(subsession->miscPtr);
if (ioState == NULL) continue;
delete ioState;
}
// Then, delete the index records:
AVIIndexRecord* cur = fIndexRecordsHead;
while (cur != NULL) {
AVIIndexRecord* next = cur->next();
delete cur;
cur = next;
}
// Finally, close our output file:
CloseOutputFile(fOutFid);
}
AVIFileSink* AVIFileSink
::createNew(UsageEnvironment& env, MediaSession& inputSession,
char const* outputFileName,
unsigned bufferSize,
unsigned short movieWidth, unsigned short movieHeight,
unsigned movieFPS, Boolean packetLossCompensate) {
AVIFileSink* newSink =
new AVIFileSink(env, inputSession, outputFileName, bufferSize,
movieWidth, movieHeight, movieFPS, packetLossCompensate);
if (newSink == NULL || newSink->fOutFid == NULL) {
Medium::close(newSink);
return NULL;
}
return newSink;
}
Boolean AVIFileSink::startPlaying(afterPlayingFunc* afterFunc,
void* afterClientData) {
// Make sure we're not already being played:
if (fAreCurrentlyBeingPlayed) {
envir().setResultMsg("This sink has already been played");
return False;
}
fAreCurrentlyBeingPlayed = True;
fAfterFunc = afterFunc;
fAfterClientData = afterClientData;
return continuePlaying();
}
Boolean AVIFileSink::continuePlaying() {
// Run through each of our input session's 'subsessions',
// asking for a frame from each one:
Boolean haveActiveSubsessions = False;
MediaSubsessionIterator iter(fInputSession);
MediaSubsession* subsession;
while ((subsession = iter.next()) != NULL) {
FramedSource* subsessionSource = subsession->readSource();
if (subsessionSource == NULL) continue;
if (subsessionSource->isCurrentlyAwaitingData()) continue;
AVISubsessionIOState* ioState
= (AVISubsessionIOState*)(subsession->miscPtr);
if (ioState == NULL) continue;
haveActiveSubsessions = True;
unsigned char* toPtr = ioState->fBuffer->dataEnd();
unsigned toSize = ioState->fBuffer->bytesAvailable();
subsessionSource->getNextFrame(toPtr, toSize,
afterGettingFrame, ioState,
onSourceClosure, ioState);
}
if (!haveActiveSubsessions) {
envir().setResultMsg("No subsessions are currently active");
return False;
}
return True;
}
void AVIFileSink
::afterGettingFrame(void* clientData, unsigned packetDataSize,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned /*durationInMicroseconds*/) {
AVISubsessionIOState* ioState = (AVISubsessionIOState*)clientData;
if (numTruncatedBytes > 0) {
ioState->envir() << "AVIFileSink::afterGettingFrame(): The input frame data was too large for our buffer. "
<< numTruncatedBytes
<< " bytes of trailing data was dropped! Correct this by increasing the \"bufferSize\" parameter in the \"createNew()\" call.\n";
}
ioState->afterGettingFrame(packetDataSize, presentationTime);
}
void AVIFileSink::onSourceClosure(void* clientData) {
AVISubsessionIOState* ioState = (AVISubsessionIOState*)clientData;
ioState->onSourceClosure();
}
void AVIFileSink::onSourceClosure1() {
// Check whether *all* of the subsession sources have closed.
// If not, do nothing for now:
MediaSubsessionIterator iter(fInputSession);
MediaSubsession* subsession;
while ((subsession = iter.next()) != NULL) {
AVISubsessionIOState* ioState
= (AVISubsessionIOState*)(subsession->miscPtr);
if (ioState == NULL) continue;
if (ioState->fOurSourceIsActive) return; // this source hasn't closed
}
completeOutputFile();
// Call our specified 'after' function:
if (fAfterFunc != NULL) {
(*fAfterFunc)(fAfterClientData);
}
}
void AVIFileSink::onRTCPBye(void* clientData) {
AVISubsessionIOState* ioState = (AVISubsessionIOState*)clientData;
struct timeval timeNow;
gettimeofday(&timeNow, NULL);
unsigned secsDiff
= timeNow.tv_sec - ioState->fOurSink.fStartTime.tv_sec;
MediaSubsession& subsession = ioState->fOurSubsession;
ioState->envir() << "Received RTCP \"BYE\" on \""
<< subsession.mediumName()
<< "/" << subsession.codecName()
<< "\" subsession (after "
<< secsDiff << " seconds)\n";
// Handle the reception of an RTCP "BYE" as if the source had closed:

ioState->onSourceClosure();
}
void AVIFileSink::addIndexRecord(AVIIndexRecord* newIndexRecord) {
if (fIndexRecordsHead == NULL) {
fIndexRecordsHead = newIndexRecord;
} else {
fIndexRecordsTail->next() = newIndexRecord;
}
fIndexRecordsTail = newIndexRecord;
++fNumIndexRecords;
}
void AVIFileSink::completeOutputFile() {
if (fHaveCompletedOutputFile || fOutFid == NULL) return;
// Update various AVI 'size' fields to take account of the codec data that
// we've now written to the file:
unsigned maxBytesPerSecond = 0;
unsigned numVideoFrames = 0;
unsigned numAudioFrames = 0;
//// Subsession-specific fields:
MediaSubsessionIterator iter(fInputSession);
MediaSubsession* subsession;
while ((subsession = iter.next()) != NULL) {
AVISubsessionIOState* ioState
= (AVISubsessionIOState*)(subsession->miscPtr);
if (ioState == NULL) continue;
maxBytesPerSecond += ioState->fMaxBytesPerSecond;
setWord(ioState->fSTRHFrameCountPosition, ioState->fNumFrames);
if (ioState->fIsVideo) numVideoFrames = ioState->fNumFrames;
else if (ioState->fIsAudio) numAudioFrames = ioState->fNumFrames;
}
//// Global fields:
add4ByteString("idx1");
addWord(fNumIndexRecords*4*4); // the size of all of the index records, which come next:
for (AVIIndexRecord* indexRecord = fIndexRecordsHead; indexRecord != NULL; indexRecord = indexRecord->next()) {
addWord(indexRecord->chunkId());
addWord(indexRecord->flags());
addWord(indexRecord->offset());
addWord(indexRecord->size());
}
fRIFFSizeValue += fNumBytesWritten;
setWord(fRIFFSizePosition, fRIFFSizeValue);
setWord(fAVIHMaxBytesPerSecondPosition, maxBytesPerSecond);
setWord(fAVIHFrameCountPosition,
numVideoFrames > 0 ? numVideoFrames : numAudioFrames);
fMoviSizeValue += fNumBytesWritten;
setWord(fMoviSizePosition, fMoviSizeValue);
// We're done:
fHaveCompletedOutputFile = True;
}
////////// AVISubsessionIOState implementation ///////////
AVISubsessionIOState::AVISubsessionIOState(AVIFileSink& sink,
MediaSubsession& subsession)
: fOurSink(sink), fOurSubsession(subsession),
fMaxBytesPerSecond(0), fIsVideo(False), fIsAudio(False), fIsByteSwappedAudio(False), fNumFrames(0) {
fBuffer = new SubsessionBuffer(fOurSink.fBufferSize);
fPrevBuffer = sink.fPacketLossCompensate
? new SubsessionBuffer(fOurSink.fBufferSize) : NULL;
FramedSource* subsessionSource = subsession.readSource();
fOurSourceIsActive = subsessionSource != NULL;
fPrevPresentationTime.tv_sec = 0;
fPrevPresentationTime.tv_usec = 0;
}
AVISubsessionIOState::~AVISubsessionIOState() {
delete fBuffer; delete fPrevBuffer;
}
void AVISubsessionIOState::setAVIstate(unsigned subsessionIndex) {
fIsVideo = strcmp(fOurSubsession.mediumName(), "video") == 0;
fIsAudio = strcmp(fOurSubsession.mediumName(), "audio") == 0;
if (fIsVideo) {
fAVISubsessionTag
= fourChar('0'+subsessionIndex/10,'0'+subsessionIndex%10,'d','c');
if (strcmp(fOurSubsession.codecName(), "JPEG") == 0) {
fAVICodecHandlerType = fourChar('m','j','p','g');
} else if (strcmp(fOurSubsession.codecName(), "MP4V-ES") == 0) {
fAVICodecHandlerType = fourChar('D','I','V','X');
} else if (strcmp(fOurSubsession.codecName(), "MPV") == 0) {
fAVICodecHandlerType = fourChar('m','p','g','1'); // what about MPEG-2?
} else if (strcmp(fOurSubsession.codecName(), "H263-1998") == 0 ||
strcmp(fOurSubsession.codecName(), "H263-2000") == 0) {
fAVICodecHandlerType = fourChar('H','2','6','3');
} else if (strcmp(fOurSubsession.codecName(), "H264") == 0) {
fAVICodecHandlerType = fourChar('H','2','6','4');
} else {
fAVICodecHandlerType = fourChar('?','?','?','?');
}
fAVIScale = 1; // ??? #####
fAVIRate = fOurSink.fMovieFPS; // ??? #####
fAVISize = fOurSink.fMovieWidth*fOurSink.fMovieHeight*3; // ??? #####
} else if (fIsAudio) {
fIsByteSwappedAudio = False; // by default
fAVISubsessionTag
= fourChar('0'+subsessionIndex/10,'0'+subsessionIndex%10,'w','b');
fAVICodecHandlerType = 1; // ??? ####
unsigned numChannels = fOurSubsession.numChannels();
fAVISamplingFrequency = fOurSubsession.rtpTimestampFrequency(); // default
if (strcmp(fOurSubsession.codecName(), "L16") == 0) {
fIsByteSwappedAudio = True; // need to byte-swap data before writing it
fWAVCodecTag = 0x0001;
fAVIScale = fAVISize = 2*numChannels; // 2 bytes/sample
fAVIRate = fAVISize*fAVISamplingFrequency;
} else if (strcmp(fOurSubsession.codecName(), "L8") == 0) {
fWAVCodecTag = 0x0001;
fAVIScale = fAVISize = numChannels; // 1 byte/sample
fAVIRate = fAVISize*fAVISamplingFrequency;
} else if (strcmp(fOurSubsession.codecName(), "PCMA") == 0) {
fWAVCodecTag = 0x0006;
fAVIScale = fAVISize = numChannels; // 1 byte/sample
fAVIRate = fAVISize*fAVISamplingFrequency;
} else if (strcmp(fOurSubsession.codecName(), "PCMU") == 0) {
fWAVCodecTag = 0x0007;
fAVIScale = fAVISize = numChannels; // 1 byte/sample
fAVIRate = fAVISize*fAVISamplingFrequency;
} else if (strcmp(fOurSubsession.codecName(), "MPA") == 0) {
fWAVCodecTag = 0x0050;
fAVIScale = fAVISize = 1;
fAVIRate = 0; // ??? #####
} else {
fWAVCodecTag = 0x0001; // ??? #####
fAVIScale = fAVISize = 1;
fAVIRate = 0; // ??? #####
}
} else { // unknown medium
fAVISubsessionTag
= fourChar('0'+subsessionIndex/10,'0'+subsessionIndex%10,'?','?');
fAVICodecHandlerType = 0;
fAVIScale = fAVISize = 1;
fAVIRate = 0; // ??? #####
}
}
void AVISubsessionIOState::afterGettingFrame(unsigned packetDataSize,
struct timeval presentationTime) {
// Begin by checking whether there was a gap in the RTP stream.
// If so, try to compensate for this (if desired):
unsigned short rtpSeqNum
= fOurSubsession.rtpSource()->curPacketRTPSeqNum();
if (fOurSink.fPacketLossCompensate && fPrevBuffer->bytesInUse() > 0) {
short seqNumGap = rtpSeqNum - fLastPacketRTPSeqNum;
for (short i = 1; i < seqNumGap; ++i) {
// Insert a copy of the previous frame, to compensate for the loss:
useFrame(*fPrevBuffer);
}
}
fLastPacketRTPSeqNum = rtpSeqNum;
// Now, continue working with the frame that we just got
if (fBuffer->bytesInUse() == 0) {
fBuffer->setPresentationTime(presentationTime);
}
fBuffer->addBytes(packetDataSize);
useFrame(*fBuffer);
if (fOurSink.fPacketLossCompensate) {
// Save this frame, in case we need it for recovery:
SubsessionBuffer* tmp = fPrevBuffer; // assert: != NULL
fPrevBuffer = fBuffer;
fBuffer = tmp;
}
fBuffer->reset(); // for the next input
// Now, try getting more frames:
fOurSink.continuePlaying();
}
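The gap computation in afterGettingFrame() relies on modular 16-bit arithmetic: subtracting the previous RTP sequence number from the current one and narrowing to a signed short yields the true (small) gap even when the 16-bit sequence number wraps from 65535 back to 0; a gap of N means N-1 packets were lost, so the loop duplicates the previous frame N-1 times. A standalone sketch of that arithmetic (not part of the library):

```cpp
// Signed gap between two 16-bit RTP sequence numbers. The subtraction is
// done in int, then narrowed to short, so wraparound (e.g. 65534 -> 2)
// still produces the small positive gap.
short rtpSeqNumGap(unsigned short cur, unsigned short prev) {
  return (short)(cur - prev);
}
```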
void AVISubsessionIOState::useFrame(SubsessionBuffer& buffer) {
unsigned char* const frameSource = buffer.dataStart();
unsigned const frameSize = buffer.bytesInUse();
struct timeval const& presentationTime = buffer.presentationTime();
if (fPrevPresentationTime.tv_usec != 0||fPrevPresentationTime.tv_sec != 0) {
int uSecondsDiff
= (presentationTime.tv_sec - fPrevPresentationTime.tv_sec)*1000000
+ (presentationTime.tv_usec - fPrevPresentationTime.tv_usec);
if (uSecondsDiff > 0) {
unsigned bytesPerSecond = (unsigned)((frameSize*1000000.0)/uSecondsDiff);
if (bytesPerSecond > fMaxBytesPerSecond) {
fMaxBytesPerSecond = bytesPerSecond;
}
}
}
fPrevPresentationTime = presentationTime;
if (fIsByteSwappedAudio) {
// We need to swap the 16-bit audio samples from big-endian
// to little-endian order, before writing them to a file:
for (unsigned i = 0; i < frameSize; i += 2) {
unsigned char tmp = frameSource[i];
frameSource[i] = frameSource[i+1];
frameSource[i+1] = tmp;
}
}
// Add an index record for this frame:
AVIIndexRecord* newIndexRecord
= new AVIIndexRecord(fAVISubsessionTag, // chunk id
frameSource[0] == 0x67 ? 0x10 : 0, // flags
fOurSink.fMoviSizePosition + 8 + fOurSink.fNumBytesWritten, // offset (note: 8 == size + 'movi')
frameSize + 4); // size
fOurSink.addIndexRecord(newIndexRecord);
// Write the data into the file:
fOurSink.fNumBytesWritten += fOurSink.addWord(fAVISubsessionTag);
if (strcmp(fOurSubsession.codecName(), "H264") == 0) {
// Insert a 'start code' (0x00 0x00 0x00 0x01) in front of the frame:
fOurSink.fNumBytesWritten += fOurSink.addWord(4+frameSize);
fOurSink.fNumBytesWritten += fOurSink.addWord(fourChar(0x00, 0x00, 0x00, 0x01));//add start code
} else {
fOurSink.fNumBytesWritten += fOurSink.addWord(frameSize);
}
fwrite(frameSource, 1, frameSize, fOurSink.fOutFid);
fOurSink.fNumBytesWritten += frameSize;
// Pad to an even length:
if (frameSize%2 != 0) fOurSink.fNumBytesWritten += fOurSink.addByte(0);
++fNumFrames;
}
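RIFF chunks must start on even file offsets, so useFrame() appends a single pad byte after any odd-length frame while the chunk's size word still records the unpadded length. A standalone sketch of the on-disk cost of one non-H.264 data chunk (4-byte tag + 4-byte size word + payload + optional pad):

```cpp
// Bytes a simple AVI data chunk occupies on disk, including the pad byte
// that keeps the next chunk aligned to an even offset.
unsigned chunkBytesOnDisk(unsigned frameSize) {
  unsigned total = 4 /*tag*/ + 4 /*size word*/ + frameSize;
  if (frameSize % 2 != 0) ++total; // pad to an even length
  return total;
}
```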
void AVISubsessionIOState::onSourceClosure() {
fOurSourceIsActive = False;
fOurSink.onSourceClosure1();
}
////////// AVI-specific implementation //////////
unsigned AVIFileSink::addWord(unsigned word) {
// Add "word" to the file in little-endian order:
addByte(word); addByte(word>>8);
addByte(word>>16); addByte(word>>24);
return 4;
}
unsigned AVIFileSink::addHalfWord(unsigned short halfWord) {
// Add "halfWord" to the file in little-endian order:
addByte((unsigned char)halfWord); addByte((unsigned char)(halfWord>>8));
return 2;
}
unsigned AVIFileSink::addZeroWords(unsigned numWords) {
for (unsigned i = 0; i < numWords; ++i) {
addWord(0);
}
return numWords*4;
}
unsigned AVIFileSink::add4ByteString(char const* str) {
addByte(str[0]); addByte(str[1]); addByte(str[2]);
addByte(str[3] == '\0' ? ' ' : str[3]); // e.g., for "AVI "
return 4;
}
void AVIFileSink::setWord(unsigned filePosn, unsigned size) {
do {
if (SeekFile64(fOutFid, filePosn, SEEK_SET) < 0) break;
addWord(size);
if (SeekFile64(fOutFid, 0, SEEK_END) < 0) break; // go back to where we were
return;
} while (0);
// One of the SeekFile64()s failed, probably because the output is not a seekable file
envir() << "AVIFileSink::setWord(): SeekFile64 failed (err "
<< envir().getErrno() << ")\n";
}
// Methods for writing particular file headers. Note the following macros:
#define addFileHeader(tag,name) \
unsigned AVIFileSink::addFileHeader_##name() { \
add4ByteString("" #tag ""); \
unsigned headerSizePosn = (unsigned)TellFile64(fOutFid); addWord(0); \
add4ByteString("" #name ""); \
unsigned ignoredSize = 8;/*don't include size of tag or size fields*/ \
unsigned size = 12
#define addFileHeader1(name) \
unsigned AVIFileSink::addFileHeader_##name() { \
add4ByteString("" #name ""); \
unsigned headerSizePosn = (unsigned)TellFile64(fOutFid); addWord(0); \
unsigned ignoredSize = 8;/*don't include size of name or size fields*/ \
unsigned size = 8
#define addFileHeaderEnd \
setWord(headerSizePosn, size-ignoredSize); \
return size; \
}
addFileHeader(RIFF,AVI);
size += addFileHeader_hdrl();
size += addFileHeader_movi();
fRIFFSizePosition = headerSizePosn;
fRIFFSizeValue = size-ignoredSize;
addFileHeaderEnd;
addFileHeader(LIST,hdrl);
size += addFileHeader_avih();
// Then, add a "strl" header for each subsession (stream):
// (Make the video subsession (if any) come before the audio subsession.)
unsigned subsessionCount = 0;
MediaSubsessionIterator iter(fInputSession);
MediaSubsession* subsession;
while ((subsession = iter.next()) != NULL) {
fCurrentIOState = (AVISubsessionIOState*)(subsession->miscPtr);
if (fCurrentIOState == NULL) continue;
if (strcmp(subsession->mediumName(), "video") != 0) continue;
fCurrentIOState->setAVIstate(subsessionCount++);
size += addFileHeader_strl();
}
iter.reset();
while ((subsession = iter.next()) != NULL) {
fCurrentIOState = (AVISubsessionIOState*)(subsession->miscPtr);
if (fCurrentIOState == NULL) continue;
if (strcmp(subsession->mediumName(), "video") == 0) continue;
fCurrentIOState->setAVIstate(subsessionCount++);
size += addFileHeader_strl();
}
// Then add another JUNK entry
++fJunkNumber;
size += addFileHeader_JUNK();
addFileHeaderEnd;
#define AVIF_HASINDEX 0x00000010 // Index at end of file?
#define AVIF_MUSTUSEINDEX 0x00000020
#define AVIF_ISINTERLEAVED 0x00000100
#define AVIF_TRUSTCKTYPE 0x00000800 // Use CKType to find key frames?
#define AVIF_WASCAPTUREFILE 0x00010000
#define AVIF_COPYRIGHTED 0x00020000
addFileHeader1(avih);
unsigned usecPerFrame = fMovieFPS == 0 ? 0 : 1000000/fMovieFPS;
size += addWord(usecPerFrame); // dwMicroSecPerFrame
fAVIHMaxBytesPerSecondPosition = (unsigned)TellFile64(fOutFid);
size += addWord(0); // dwMaxBytesPerSec (fill in later)
size += addWord(0); // dwPaddingGranularity
size += addWord(AVIF_TRUSTCKTYPE|AVIF_HASINDEX|AVIF_ISINTERLEAVED); // dwFlags
fAVIHFrameCountPosition = (unsigned)TellFile64(fOutFid);
size += addWord(0); // dwTotalFrames (fill in later)
size += addWord(0); // dwInitialFrame
size += addWord(fNumSubsessions); // dwStreams
size += addWord(fBufferSize); // dwSuggestedBufferSize
size += addWord(fMovieWidth); // dwWidth
size += addWord(fMovieHeight); // dwHeight
size += addZeroWords(4); // dwReserved
addFileHeaderEnd;
addFileHeader(LIST,strl);
size += addFileHeader_strh();
size += addFileHeader_strf();
fJunkNumber = 0;
size += addFileHeader_JUNK();
addFileHeaderEnd;
addFileHeader1(strh);
size += add4ByteString(fCurrentIOState->fIsVideo ? "vids" :
fCurrentIOState->fIsAudio ? "auds" :
"????"); // fccType
size += addWord(fCurrentIOState->fAVICodecHandlerType); // fccHandler
size += addWord(0); // dwFlags
size += addWord(0); // wPriority + wLanguage
size += addWord(0); // dwInitialFrames
size += addWord(fCurrentIOState->fAVIScale); // dwScale
size += addWord(fCurrentIOState->fAVIRate); // dwRate
size += addWord(0); // dwStart
fCurrentIOState->fSTRHFrameCountPosition = (unsigned)TellFile64(fOutFid);
size += addWord(0); // dwLength (fill in later)
size += addWord(fBufferSize); // dwSuggestedBufferSize
size += addWord((unsigned)-1); // dwQuality
size += addWord(fCurrentIOState->fAVISize); // dwSampleSize
size += addWord(0); // rcFrame (start)
if (fCurrentIOState->fIsVideo) {
size += addHalfWord(fMovieWidth);
size += addHalfWord(fMovieHeight);
} else {
size += addWord(0);
}
addFileHeaderEnd;
addFileHeader1(strf);
if (fCurrentIOState->fIsVideo) {
// Add a BITMAPINFO header:
unsigned extraDataSize = 0;
size += addWord(10*4 + extraDataSize); // size
size += addWord(fMovieWidth);
size += addWord(fMovieHeight);
size += addHalfWord(1); // planes
size += addHalfWord(24); // bits-per-sample #####
size += addWord(fCurrentIOState->fAVICodecHandlerType); // compr. type
size += addWord(fCurrentIOState->fAVISize);
size += addZeroWords(4); // ??? #####
// Later, add extra data here (if any) #####
} else if (fCurrentIOState->fIsAudio) {
// Add a WAVFORMATEX header:
size += addHalfWord(fCurrentIOState->fWAVCodecTag);
unsigned numChannels = fCurrentIOState->fOurSubsession.numChannels();
size += addHalfWord(numChannels);
size += addWord(fCurrentIOState->fAVISamplingFrequency);
size += addWord(fCurrentIOState->fAVIRate); // bytes per second
size += addHalfWord(fCurrentIOState->fAVISize); // block alignment
unsigned bitsPerSample = (fCurrentIOState->fAVISize*8)/numChannels;
size += addHalfWord(bitsPerSample);
if (strcmp(fCurrentIOState->fOurSubsession.codecName(), "MPA") == 0) {
// Assume MPEG layer II audio (not MP3): #####
size += addHalfWord(22); // wav_extra_size
size += addHalfWord(2); // fwHeadLayer
size += addWord(8*fCurrentIOState->fAVIRate); // dwHeadBitrate #####
size += addHalfWord(numChannels == 2 ? 1: 8); // fwHeadMode
size += addHalfWord(0); // fwHeadModeExt
size += addHalfWord(1); // wHeadEmphasis
size += addHalfWord(16); // fwHeadFlags
size += addWord(0); // dwPTSLow
size += addWord(0); // dwPTSHigh
}
}
addFileHeaderEnd;
#define AVI_MASTER_INDEX_SIZE 256
addFileHeader1(JUNK);
if (fJunkNumber == 0) {
size += addHalfWord(4); // wLongsPerEntry
size += addHalfWord(0); // bIndexSubType + bIndexType
size += addWord(0); // nEntriesInUse #####
size += addWord(fCurrentIOState->fAVISubsessionTag); // dwChunkId
size += addZeroWords(2); // dwReserved
size += addZeroWords(AVI_MASTER_INDEX_SIZE*4);
} else {
size += add4ByteString("odml");
size += add4ByteString("dmlh");
unsigned wtfCount = 248;
size += addWord(wtfCount); // ??? #####
size += addZeroWords(wtfCount/4);
}
addFileHeaderEnd;
addFileHeader(LIST,movi);
fMoviSizePosition = headerSizePosn;
fMoviSizeValue = size-ignoredSize;
addFileHeaderEnd;
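The addFileHeader macros above all follow the same pattern: write a zero size word, remember its file position (headerSizePosn), and patch it via setWord() once the chunk body's length is known. A hypothetical in-memory sketch of that "reserve, then patch" pattern (not the library's implementation, which patches the file itself with SeekFile64):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

struct ChunkWriter {
  std::vector<std::uint8_t> buf;

  // Reserve a 4-byte little-endian size slot and return its position.
  std::size_t beginSizeWord() {
    std::size_t pos = buf.size();
    buf.insert(buf.end(), 4, 0);
    return pos;
  }

  // Patch the reserved slot once the chunk's size is known.
  void patchSizeWord(std::size_t pos, std::uint32_t size) {
    for (int i = 0; i < 4; ++i) buf[pos + i] = std::uint8_t(size >> (8 * i));
  }
};
```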
live/liveMedia/Base64.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// Base64 encoding and decoding
// implementation
#include "Base64.hh"
#include <strDup.hh> // for strDupSize()
#include <string.h> // for strlen(), memmove()
static char base64DecodeTable[256];
static void initBase64DecodeTable() {
int i;
for (i = 0; i < 256; ++i) base64DecodeTable[i] = (char)0x80;
// default value: invalid
for (i = 'A'; i <= 'Z'; ++i) base64DecodeTable[i] = 0 + (i - 'A');
for (i = 'a'; i <= 'z'; ++i) base64DecodeTable[i] = 26 + (i - 'a');
for (i = '0'; i <= '9'; ++i) base64DecodeTable[i] = 52 + (i - '0');
base64DecodeTable[(unsigned char)'+'] = 62;
base64DecodeTable[(unsigned char)'/'] = 63;
base64DecodeTable[(unsigned char)'='] = 0;
}
unsigned char* base64Decode(char const* in, unsigned& resultSize,
Boolean trimTrailingZeros) {
if (in == NULL) return NULL; // sanity check
return base64Decode(in, strlen(in), resultSize, trimTrailingZeros);
}
unsigned char* base64Decode(char const* in, unsigned inSize,
unsigned& resultSize,
Boolean trimTrailingZeros) {
static Boolean haveInitializedBase64DecodeTable = False;
if (!haveInitializedBase64DecodeTable) {
initBase64DecodeTable();
haveInitializedBase64DecodeTable = True;
}
unsigned char* out = (unsigned char*)strDupSize(in); // ensures we have enough space
int k = 0;
int paddingCount = 0;
int const jMax = inSize - 3;
// in case "inSize" is not a multiple of 4 (although it should be)
for (int j = 0; j < jMax; j += 4) {
char inTmp[4], outTmp[4];
for (int i = 0; i < 4; ++i) {
inTmp[i] = in[i+j];
if (inTmp[i] == '=') ++paddingCount;
outTmp[i] = base64DecodeTable[(unsigned char)inTmp[i]];
if ((outTmp[i]&0x80) != 0) outTmp[i] = 0; // this happens only if there was an invalid character; pretend that it was 'A'
}
out[k++] = (outTmp[0]<<2) | (outTmp[1]>>4);
out[k++] = (outTmp[1]<<4) | (outTmp[2]>>2);
out[k++] = (outTmp[2]<<6) | outTmp[3];
}
if (trimTrailingZeros) {
while (paddingCount > 0 && k > 0 && out[k-1] == '\0') { --k; --paddingCount; }
}
resultSize = k;
unsigned char* result = new unsigned char[resultSize];
memmove(result, out, resultSize);
delete[] out;
return result;
}
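Each quartet of decoded 6-bit values is recombined into 3 output bytes exactly as in the inner loop above: 6+2, 4+4, and 2+6 bits respectively. A hypothetical standalone sketch:

```cpp
// Recombine four 6-bit base64 values into three bytes.
// E.g. "TWFu" decodes to the table values {19, 22, 5, 46}, giving "Man".
void decodeQuartet(unsigned char const v[4], unsigned char out[3]) {
  out[0] = (unsigned char)((v[0] << 2) | (v[1] >> 4));
  out[1] = (unsigned char)((v[1] << 4) | (v[2] >> 2));
  out[2] = (unsigned char)((v[2] << 6) | v[3]);
}
```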
static const char base64Char[] =
"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
char* base64Encode(char const* origSigned, unsigned origLength) {
unsigned char const* orig = (unsigned char const*)origSigned; // in case any input bytes have the MSB set
if (orig == NULL) return NULL;
unsigned const numOrig24BitValues = origLength/3;
Boolean havePadding = origLength > numOrig24BitValues*3;
Boolean havePadding2 = origLength == numOrig24BitValues*3 + 2;
unsigned const numResultBytes = 4*(numOrig24BitValues + havePadding);
char* result = new char[numResultBytes+1]; // allow for trailing '\0'
// Map each full group of 3 input bytes into 4 output base-64 characters:
unsigned i;
for (i = 0; i < numOrig24BitValues; ++i) {
result[4*i+0] = base64Char[(orig[3*i]>>2)&0x3F];
result[4*i+1] = base64Char[(((orig[3*i]&0x3)<<4) | (orig[3*i+1]>>4))&0x3F];
result[4*i+2] = base64Char[((orig[3*i+1]<<2) | (orig[3*i+2]>>6))&0x3F];
result[4*i+3] = base64Char[orig[3*i+2]&0x3F];
}
// Now, take padding into account. (Note: i == numOrig24BitValues)
if (havePadding) {
result[4*i+0] = base64Char[(orig[3*i]>>2)&0x3F];
if (havePadding2) {
result[4*i+1] = base64Char[(((orig[3*i]&0x3)<<4) | (orig[3*i+1]>>4))&0x3F];
result[4*i+2] = base64Char[(orig[3*i+1]<<2)&0x3F];
} else {
result[4*i+1] = base64Char[((orig[3*i]&0x3)<<4)&0x3F];
result[4*i+2] = '=';
}
result[4*i+3] = '=';
}
result[numResultBytes] = '\0';
return result;
}
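The encoder above always emits 4*ceil(origLength/3) characters: every full or partial group of 3 input bytes becomes 4 output characters, with '=' padding filling out a partial final group. A standalone sketch of that size computation (mirroring numResultBytes above):

```cpp
// Number of base64 characters produced for origLength input bytes,
// not counting the trailing '\0' the encoder also allocates.
unsigned base64EncodedLength(unsigned origLength) {
  unsigned numOrig24BitValues = origLength / 3;
  bool havePadding = origLength > numOrig24BitValues * 3;
  return 4 * (numOrig24BitValues + (havePadding ? 1u : 0u));
}
```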
live/liveMedia/BasicUDPSink.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A simple UDP sink (i.e., without RTP or other headers added); one frame per packet
// Implementation
#include "BasicUDPSink.hh"
#include <GroupsockHelper.hh> // for gettimeofday()
BasicUDPSink* BasicUDPSink::createNew(UsageEnvironment& env, Groupsock* gs,
unsigned maxPayloadSize) {
return new BasicUDPSink(env, gs, maxPayloadSize);
}
BasicUDPSink::BasicUDPSink(UsageEnvironment& env, Groupsock* gs,
unsigned maxPayloadSize)
: MediaSink(env),
fGS(gs), fMaxPayloadSize(maxPayloadSize) {
fOutputBuffer = new unsigned char[fMaxPayloadSize];
}
BasicUDPSink::~BasicUDPSink() {
delete[] fOutputBuffer;
}
Boolean BasicUDPSink::continuePlaying() {
// Record the fact that we're starting to play now:
gettimeofday(&fNextSendTime, NULL);
// Arrange to get and send the first payload.
// (This will also schedule any future sends.)
continuePlaying1();
return True;
}
void BasicUDPSink::continuePlaying1() {
if (fSource != NULL) {
fSource->getNextFrame(fOutputBuffer, fMaxPayloadSize,
afterGettingFrame, this,
onSourceClosure, this);
}
}
void BasicUDPSink::afterGettingFrame(void* clientData, unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval /*presentationTime*/,
unsigned durationInMicroseconds) {
BasicUDPSink* sink = (BasicUDPSink*)clientData;
sink->afterGettingFrame1(frameSize, numTruncatedBytes, durationInMicroseconds);
}
void BasicUDPSink::afterGettingFrame1(unsigned frameSize, unsigned numTruncatedBytes,
unsigned durationInMicroseconds) {
if (numTruncatedBytes > 0) {
envir() << "BasicUDPSink::afterGettingFrame1(): The input frame data was too large for our specified maximum payload size ("
<< fMaxPayloadSize << "). "
<< numTruncatedBytes << " bytes of trailing data was dropped!\n";
}
// Send the packet:
fGS->output(envir(), fOutputBuffer, frameSize);
// Figure out the time at which the next packet should be sent, based
// on the duration of the payload that we just read:
fNextSendTime.tv_usec += durationInMicroseconds;
fNextSendTime.tv_sec += fNextSendTime.tv_usec/1000000;
fNextSendTime.tv_usec %= 1000000;
struct timeval timeNow;
gettimeofday(&timeNow, NULL);
int secsDiff = fNextSendTime.tv_sec - timeNow.tv_sec;
int64_t uSecondsToGo = secsDiff*1000000 + (fNextSendTime.tv_usec - timeNow.tv_usec);
if (uSecondsToGo < 0 || secsDiff < 0) { // sanity check: Make sure that the time-to-delay is non-negative:
uSecondsToGo = 0;
}
// Delay this amount of time:
nextTask() = envir().taskScheduler().scheduleDelayedTask(uSecondsToGo,
(TaskFunc*)sendNext, this);
}
// The following is called after each delay between packet sends:
void BasicUDPSink::sendNext(void* firstArg) {
BasicUDPSink* sink = (BasicUDPSink*)firstArg;
sink->continuePlaying1();
}
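afterGettingFrame1() advances fNextSendTime by the frame's duration and then renormalizes so that tv_usec stays in [0, 1000000). A standalone sketch of that arithmetic:

```cpp
#include <sys/time.h>

// Advance a timeval by a duration in microseconds, carrying overflow
// from tv_usec into tv_sec, as BasicUDPSink does for fNextSendTime.
struct timeval advanceBy(struct timeval t, unsigned durationInMicroseconds) {
  t.tv_usec += durationInMicroseconds;
  t.tv_sec  += t.tv_usec / 1000000;
  t.tv_usec %= 1000000;
  return t;
}
```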
live/liveMedia/BasicUDPSource.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A simple UDP source, where every UDP payload is a complete frame
// Implementation
#include "BasicUDPSource.hh"
#include <GroupsockHelper.hh> // for increaseReceiveBufferTo(), makeSocketNonBlocking()
BasicUDPSource* BasicUDPSource::createNew(UsageEnvironment& env,
Groupsock* inputGS) {
return new BasicUDPSource(env, inputGS);
}
BasicUDPSource::BasicUDPSource(UsageEnvironment& env, Groupsock* inputGS)
: FramedSource(env), fInputGS(inputGS), fHaveStartedReading(False) {
// Try to use a large receive buffer (in the OS):
increaseReceiveBufferTo(env, inputGS->socketNum(), 50*1024);
// Make the socket non-blocking, even though it will be read from only asynchronously, when packets arrive.
// The reason for this is that, in some OSs, reads on a blocking socket can (allegedly) sometimes block,
// even if the socket was previously reported (e.g., by "select()") as having data available.
// (This can supposedly happen if the UDP checksum fails, for example.)
makeSocketNonBlocking(fInputGS->socketNum());
}
BasicUDPSource::~BasicUDPSource(){
envir().taskScheduler().turnOffBackgroundReadHandling(fInputGS->socketNum());
}
void BasicUDPSource::doGetNextFrame() {
if (!fHaveStartedReading) {
// Await incoming packets:
envir().taskScheduler().turnOnBackgroundReadHandling(fInputGS->socketNum(),
(TaskScheduler::BackgroundHandlerProc*)&incomingPacketHandler, this);
fHaveStartedReading = True;
}
}
void BasicUDPSource::doStopGettingFrames() {
envir().taskScheduler().turnOffBackgroundReadHandling(fInputGS->socketNum());
fHaveStartedReading = False;
}
void BasicUDPSource::incomingPacketHandler(BasicUDPSource* source, int /*mask*/){
source->incomingPacketHandler1();
}
void BasicUDPSource::incomingPacketHandler1() {
if (!isCurrentlyAwaitingData()) return; // we're not ready for the data yet
// Read the packet into our desired destination:
struct sockaddr_in fromAddress;
if (!fInputGS->handleRead(fTo, fMaxSize, fFrameSize, fromAddress)) return;
// Tell our client that we have new data:
afterGetting(this); // we're preceded by a net read; no infinite recursion
}
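makeSocketNonBlocking() is a groupsock helper; on POSIX systems the same effect can be sketched with fcntl() (a simplified stand-in for illustration, not the library's portable implementation, which also handles Windows):

```cpp
#include <fcntl.h>

// Set O_NONBLOCK on a file descriptor; returns false on failure
// (e.g. an invalid descriptor).
bool setNonBlocking(int fd) {
  int flags = fcntl(fd, F_GETFL, 0);
  if (flags < 0) return false;
  return fcntl(fd, F_SETFL, flags | O_NONBLOCK) == 0;
}
```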
live/liveMedia/BitVector.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// Bit Vector data structure
// Implementation
#include "BitVector.hh"
BitVector::BitVector(unsigned char* baseBytePtr,
unsigned baseBitOffset,
unsigned totNumBits) {
setup(baseBytePtr, baseBitOffset, totNumBits);
}
void BitVector::setup(unsigned char* baseBytePtr,
unsigned baseBitOffset,
unsigned totNumBits) {
fBaseBytePtr = baseBytePtr;
fBaseBitOffset = baseBitOffset;
fTotNumBits = totNumBits;
fCurBitIndex = 0;
}
static unsigned char const singleBitMask[8]
= {0x80, 0x40, 0x20, 0x10, 0x08, 0x04, 0x02, 0x01};
#define MAX_LENGTH 32
void BitVector::putBits(unsigned from, unsigned numBits) {
if (numBits == 0) return;
unsigned char tmpBuf[4];
unsigned overflowingBits = 0;
if (numBits > MAX_LENGTH) {
numBits = MAX_LENGTH;
}
if (numBits > fTotNumBits - fCurBitIndex) {
overflowingBits = numBits - (fTotNumBits - fCurBitIndex);
}
tmpBuf[0] = (unsigned char)(from>>24);
tmpBuf[1] = (unsigned char)(from>>16);
tmpBuf[2] = (unsigned char)(from>>8);
tmpBuf[3] = (unsigned char)from;
shiftBits(fBaseBytePtr, fBaseBitOffset + fCurBitIndex, /* to */
tmpBuf, MAX_LENGTH - numBits, /* from */
numBits - overflowingBits /* num bits */);
fCurBitIndex += numBits - overflowingBits;
}
void BitVector::put1Bit(unsigned bit) {
// The following is equivalent to "putBits(..., 1)", except faster:
if (fCurBitIndex >= fTotNumBits) { /* overflow */
return;
} else {
unsigned totBitOffset = fBaseBitOffset + fCurBitIndex++;
unsigned char mask = singleBitMask[totBitOffset%8];
if (bit) {
fBaseBytePtr[totBitOffset/8] |= mask;
} else {
fBaseBytePtr[totBitOffset/8] &=~ mask;
}
}
}
unsigned BitVector::getBits(unsigned numBits) {
if (numBits == 0) return 0;
unsigned char tmpBuf[4];
unsigned overflowingBits = 0;
if (numBits > MAX_LENGTH) {
numBits = MAX_LENGTH;
}
if (numBits > fTotNumBits - fCurBitIndex) {
overflowingBits = numBits - (fTotNumBits - fCurBitIndex);
}
shiftBits(tmpBuf, 0, /* to */
fBaseBytePtr, fBaseBitOffset + fCurBitIndex, /* from */
numBits - overflowingBits /* num bits */);
fCurBitIndex += numBits - overflowingBits;
unsigned result
= (tmpBuf[0]<<24) | (tmpBuf[1]<<16) | (tmpBuf[2]<<8) | tmpBuf[3];
result >>= (MAX_LENGTH - numBits); // move into low-order part of word
result &= (0xFFFFFFFF << overflowingBits); // so any overflow bits are 0
return result;
}
unsigned BitVector::get1Bit() {
// The following is equivalent to "getBits(1)", except faster:
if (fCurBitIndex >= fTotNumBits) { /* overflow */
return 0;
} else {
unsigned totBitOffset = fBaseBitOffset + fCurBitIndex++;
unsigned char curFromByte = fBaseBytePtr[totBitOffset/8];
unsigned result = (curFromByte >> (7-(totBitOffset%8))) & 0x01;
return result;
}
}
void BitVector::skipBits(unsigned numBits) {
if (numBits > fTotNumBits - fCurBitIndex) { /* overflow */
fCurBitIndex = fTotNumBits;
} else {
fCurBitIndex += numBits;
}
}
unsigned BitVector::get_expGolomb() {
unsigned numLeadingZeroBits = 0;
unsigned codeStart = 1;
while (get1Bit() == 0 && fCurBitIndex < fTotNumBits) {
++numLeadingZeroBits;
codeStart *= 2;
}
return codeStart - 1 + getBits(numLeadingZeroBits);
}
void shiftBits(unsigned char* toBasePtr, unsigned toBitOffset,
unsigned char const* fromBasePtr, unsigned fromBitOffset,
unsigned numBits) {
if (numBits == 0) return;
/* Note that from and to may overlap, if from>to */
unsigned char const* fromBytePtr = fromBasePtr + fromBitOffset/8;
unsigned fromBitRem = fromBitOffset%8;
unsigned char* toBytePtr = toBasePtr + toBitOffset/8;
unsigned toBitRem = toBitOffset%8;
while (numBits-- > 0) {
unsigned char fromBitMask = singleBitMask[fromBitRem];
unsigned char fromBit = (*fromBytePtr)&fromBitMask;
unsigned char toBitMask = singleBitMask[toBitRem];
if (fromBit != 0) {
*toBytePtr |= toBitMask;
} else {
*toBytePtr &=~ toBitMask;
}
if (++fromBitRem == 8) {
++fromBytePtr;
fromBitRem = 0;
}
if (++toBitRem == 8) {
++toBytePtr;
toBitRem = 0;
}
}
}
live/liveMedia/ByteStreamFileSource.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A file source that is a plain byte stream (rather than frames)
// Implementation
#include "ByteStreamFileSource.hh"
#include "InputFile.hh"
#include "GroupsockHelper.hh"
////////// ByteStreamFileSource //////////
ByteStreamFileSource*
ByteStreamFileSource::createNew(UsageEnvironment& env, char const* fileName,
unsigned preferredFrameSize,
unsigned playTimePerFrame) {
FILE* fid = OpenInputFile(env, fileName);
if (fid == NULL) return NULL;
ByteStreamFileSource* newSource
= new ByteStreamFileSource(env, fid, preferredFrameSize, playTimePerFrame);
newSource->fFileSize = GetFileSize(fileName, fid);
return newSource;
}
ByteStreamFileSource*
ByteStreamFileSource::createNew(UsageEnvironment& env, FILE* fid,
unsigned preferredFrameSize,
unsigned playTimePerFrame) {
if (fid == NULL) return NULL;
ByteStreamFileSource* newSource = new ByteStreamFileSource(env, fid, preferredFrameSize, playTimePerFrame);
newSource->fFileSize = GetFileSize(NULL, fid);
return newSource;
}
void ByteStreamFileSource::seekToByteAbsolute(u_int64_t byteNumber, u_int64_t numBytesToStream) {
SeekFile64(fFid, (int64_t)byteNumber, SEEK_SET);
fNumBytesToStream = numBytesToStream;
fLimitNumBytesToStream = fNumBytesToStream > 0;
}
void ByteStreamFileSource::seekToByteRelative(int64_t offset, u_int64_t numBytesToStream) {
SeekFile64(fFid, offset, SEEK_CUR);
fNumBytesToStream = numBytesToStream;
fLimitNumBytesToStream = fNumBytesToStream > 0;
}
void ByteStreamFileSource::seekToEnd() {
SeekFile64(fFid, 0, SEEK_END);
}
ByteStreamFileSource::ByteStreamFileSource(UsageEnvironment& env, FILE* fid,
unsigned preferredFrameSize,
unsigned playTimePerFrame)
: FramedFileSource(env, fid), fFileSize(0), fPreferredFrameSize(preferredFrameSize),
fPlayTimePerFrame(playTimePerFrame), fLastPlayTime(0),
fHaveStartedReading(False), fLimitNumBytesToStream(False), fNumBytesToStream(0) {
#ifndef READ_FROM_FILES_SYNCHRONOUSLY
makeSocketNonBlocking(fileno(fFid));
#endif
// Test whether the file is seekable
fFidIsSeekable = FileIsSeekable(fFid);
}
ByteStreamFileSource::~ByteStreamFileSource() {
if (fFid == NULL) return;
#ifndef READ_FROM_FILES_SYNCHRONOUSLY
envir().taskScheduler().turnOffBackgroundReadHandling(fileno(fFid));
#endif
CloseInputFile(fFid);
}
void ByteStreamFileSource::doGetNextFrame() {
if (feof(fFid) || ferror(fFid) || (fLimitNumBytesToStream && fNumBytesToStream == 0)) {
handleClosure();
return;
}
#ifdef READ_FROM_FILES_SYNCHRONOUSLY
doReadFromFile();
#else
if (!fHaveStartedReading) {
// Await readable data from the file:
envir().taskScheduler().turnOnBackgroundReadHandling(fileno(fFid),
(TaskScheduler::BackgroundHandlerProc*)&fileReadableHandler, this);
fHaveStartedReading = True;
}
#endif
}
void ByteStreamFileSource::doStopGettingFrames() {
envir().taskScheduler().unscheduleDelayedTask(nextTask());
#ifndef READ_FROM_FILES_SYNCHRONOUSLY
envir().taskScheduler().turnOffBackgroundReadHandling(fileno(fFid));
fHaveStartedReading = False;
#endif
}
void ByteStreamFileSource::fileReadableHandler(ByteStreamFileSource* source, int /*mask*/) {
if (!source->isCurrentlyAwaitingData()) {
source->doStopGettingFrames(); // we're not ready for the data yet
return;
}
source->doReadFromFile();
}
void ByteStreamFileSource::doReadFromFile() {
// Try to read as many bytes as will fit in the buffer provided (or "fPreferredFrameSize" if less)
if (fLimitNumBytesToStream && fNumBytesToStream < (u_int64_t)fMaxSize) {
fMaxSize = (unsigned)fNumBytesToStream;
}
if (fPreferredFrameSize > 0 && fPreferredFrameSize < fMaxSize) {
fMaxSize = fPreferredFrameSize;
}
#ifdef READ_FROM_FILES_SYNCHRONOUSLY
fFrameSize = fread(fTo, 1, fMaxSize, fFid);
#else
if (fFidIsSeekable) {
fFrameSize = fread(fTo, 1, fMaxSize, fFid);
} else {
// For non-seekable files (e.g., pipes), call "read()" rather than "fread()", to ensure that the read doesn't block:
fFrameSize = read(fileno(fFid), fTo, fMaxSize);
}
#endif
if (fFrameSize == 0) {
handleClosure();
return;
}
fNumBytesToStream -= fFrameSize;
// Set the 'presentation time':
if (fPlayTimePerFrame > 0 && fPreferredFrameSize > 0) {
if (fPresentationTime.tv_sec == 0 && fPresentationTime.tv_usec == 0) {
// This is the first frame, so use the current time:
gettimeofday(&fPresentationTime, NULL);
} else {
// Increment by the play time of the previous data:
unsigned uSeconds = fPresentationTime.tv_usec + fLastPlayTime;
fPresentationTime.tv_sec += uSeconds/1000000;
fPresentationTime.tv_usec = uSeconds%1000000;
}
// Remember the play time of this data:
fLastPlayTime = (fPlayTimePerFrame*fFrameSize)/fPreferredFrameSize;
fDurationInMicroseconds = fLastPlayTime;
} else {
// We don't know a specific play time duration for this data,
// so just record the current time as being the 'presentation time':
gettimeofday(&fPresentationTime, NULL);
}
// Inform the reader that it has data:
#ifdef READ_FROM_FILES_SYNCHRONOUSLY
// To avoid possible infinite recursion, we need to return to the event loop to do this:
nextTask() = envir().taskScheduler().scheduleDelayedTask(0,
(TaskFunc*)FramedSource::afterGetting, this);
#else
// Because the file read was done from the event loop, we can call the
// 'after getting' function directly, without risk of infinite recursion:
FramedSource::afterGetting(this);
#endif
}
live/liveMedia/ByteStreamMemoryBufferSource.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A class for streaming data from a (static) memory buffer, as if it were a file.
// Implementation
#include "ByteStreamMemoryBufferSource.hh"
#include "GroupsockHelper.hh"
////////// ByteStreamMemoryBufferSource //////////
ByteStreamMemoryBufferSource*
ByteStreamMemoryBufferSource::createNew(UsageEnvironment& env,
u_int8_t* buffer, u_int64_t bufferSize,
Boolean deleteBufferOnClose,
unsigned preferredFrameSize,
unsigned playTimePerFrame) {
if (buffer == NULL) return NULL;
return new ByteStreamMemoryBufferSource(env, buffer, bufferSize, deleteBufferOnClose, preferredFrameSize, playTimePerFrame);
}
ByteStreamMemoryBufferSource::ByteStreamMemoryBufferSource(UsageEnvironment& env,
u_int8_t* buffer, u_int64_t bufferSize,
Boolean deleteBufferOnClose,
unsigned preferredFrameSize,
unsigned playTimePerFrame)
: FramedSource(env), fBuffer(buffer), fBufferSize(bufferSize), fCurIndex(0), fDeleteBufferOnClose(deleteBufferOnClose),
fPreferredFrameSize(preferredFrameSize), fPlayTimePerFrame(playTimePerFrame), fLastPlayTime(0),
fLimitNumBytesToStream(False), fNumBytesToStream(0) {
}
ByteStreamMemoryBufferSource::~ByteStreamMemoryBufferSource() {
if (fDeleteBufferOnClose) delete[] fBuffer;
}
void ByteStreamMemoryBufferSource::seekToByteAbsolute(u_int64_t byteNumber, u_int64_t numBytesToStream) {
fCurIndex = byteNumber;
if (fCurIndex > fBufferSize) fCurIndex = fBufferSize;
fNumBytesToStream = numBytesToStream;
fLimitNumBytesToStream = fNumBytesToStream > 0;
}
void ByteStreamMemoryBufferSource::seekToByteRelative(int64_t offset, u_int64_t numBytesToStream) {
int64_t newIndex = fCurIndex + offset;
if (newIndex < 0) {
fCurIndex = 0;
} else {
fCurIndex = (u_int64_t)newIndex; // note: use the clamped new index, not the raw offset
if (fCurIndex > fBufferSize) fCurIndex = fBufferSize;
}
fNumBytesToStream = numBytesToStream;
fLimitNumBytesToStream = fNumBytesToStream > 0;
}
void ByteStreamMemoryBufferSource::doGetNextFrame() {
if (fCurIndex >= fBufferSize || (fLimitNumBytesToStream && fNumBytesToStream == 0)) {
handleClosure();
return;
}
// Try to read as many bytes as will fit in the buffer provided (or "fPreferredFrameSize" if less)
fFrameSize = fMaxSize;
if (fLimitNumBytesToStream && fNumBytesToStream < (u_int64_t)fFrameSize) {
fFrameSize = (unsigned)fNumBytesToStream;
}
if (fPreferredFrameSize > 0 && fPreferredFrameSize < fFrameSize) {
fFrameSize = fPreferredFrameSize;
}
if (fCurIndex + fFrameSize > fBufferSize) {
fFrameSize = (unsigned)(fBufferSize - fCurIndex);
}
memmove(fTo, &fBuffer[fCurIndex], fFrameSize);
fCurIndex += fFrameSize;
fNumBytesToStream -= fFrameSize;
// Set the 'presentation time':
if (fPlayTimePerFrame > 0 && fPreferredFrameSize > 0) {
if (fPresentationTime.tv_sec == 0 && fPresentationTime.tv_usec == 0) {
// This is the first frame, so use the current time:
gettimeofday(&fPresentationTime, NULL);
} else {
// Increment by the play time of the previous data:
unsigned uSeconds = fPresentationTime.tv_usec + fLastPlayTime;
fPresentationTime.tv_sec += uSeconds/1000000;
fPresentationTime.tv_usec = uSeconds%1000000;
}
// Remember the play time of this data:
fLastPlayTime = (fPlayTimePerFrame*fFrameSize)/fPreferredFrameSize;
fDurationInMicroseconds = fLastPlayTime;
} else {
// We don't know a specific play time duration for this data,
// so just record the current time as being the 'presentation time':
gettimeofday(&fPresentationTime, NULL);
}
// Inform the downstream object that it has data:
FramedSource::afterGetting(this);
}
live/liveMedia/ByteStreamMultiFileSource.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A source that consists of multiple byte-stream files, read sequentially
// Implementation
#include "ByteStreamMultiFileSource.hh"
ByteStreamMultiFileSource
::ByteStreamMultiFileSource(UsageEnvironment& env, char const** fileNameArray,
unsigned preferredFrameSize, unsigned playTimePerFrame)
: FramedSource(env),
fPreferredFrameSize(preferredFrameSize), fPlayTimePerFrame(playTimePerFrame),
fCurrentlyReadSourceNumber(0), fHaveStartedNewFile(False) {
// Begin by counting the number of sources:
for (fNumSources = 0; ; ++fNumSources) {
if (fileNameArray[fNumSources] == NULL) break;
}
// Next, copy the source file names into our own array:
fFileNameArray = new char const*[fNumSources];
if (fFileNameArray == NULL) return;
unsigned i;
for (i = 0; i < fNumSources; ++i) {
fFileNameArray[i] = strDup(fileNameArray[i]);
}
// Next, set up our array of component ByteStreamFileSources
// Don't actually create these yet; instead, do this on demand
fSourceArray = new ByteStreamFileSource*[fNumSources];
if (fSourceArray == NULL) return;
for (i = 0; i < fNumSources; ++i) {
fSourceArray[i] = NULL;
}
}
ByteStreamMultiFileSource::~ByteStreamMultiFileSource() {
unsigned i;
for (i = 0; i < fNumSources; ++i) {
Medium::close(fSourceArray[i]);
}
delete[] fSourceArray;
for (i = 0; i < fNumSources; ++i) {
delete[] (char*)(fFileNameArray[i]);
}
delete[] fFileNameArray;
}
ByteStreamMultiFileSource* ByteStreamMultiFileSource
::createNew(UsageEnvironment& env, char const** fileNameArray,
unsigned preferredFrameSize, unsigned playTimePerFrame) {
ByteStreamMultiFileSource* newSource
= new ByteStreamMultiFileSource(env, fileNameArray,
preferredFrameSize, playTimePerFrame);
return newSource;
}
void ByteStreamMultiFileSource::doGetNextFrame() {
do {
// First, check whether we've run out of sources:
if (fCurrentlyReadSourceNumber >= fNumSources) break;
fHaveStartedNewFile = False;
ByteStreamFileSource*& source
= fSourceArray[fCurrentlyReadSourceNumber];
if (source == NULL) {
// The current source hasn't been created yet. Do this now:
source = ByteStreamFileSource::createNew(envir(),
fFileNameArray[fCurrentlyReadSourceNumber],
fPreferredFrameSize, fPlayTimePerFrame);
if (source == NULL) break;
fHaveStartedNewFile = True;
}
// (Attempt to) read from the current source.
source->getNextFrame(fTo, fMaxSize,
afterGettingFrame, this,
onSourceClosure, this);
return;
} while (0);
// An error occurred; consider ourselves closed:
handleClosure();
}
void ByteStreamMultiFileSource
::afterGettingFrame(void* clientData,
unsigned frameSize, unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
ByteStreamMultiFileSource* source
= (ByteStreamMultiFileSource*)clientData;
source->fFrameSize = frameSize;
source->fNumTruncatedBytes = numTruncatedBytes;
source->fPresentationTime = presentationTime;
source->fDurationInMicroseconds = durationInMicroseconds;
FramedSource::afterGetting(source);
}
void ByteStreamMultiFileSource::onSourceClosure(void* clientData) {
ByteStreamMultiFileSource* source
= (ByteStreamMultiFileSource*)clientData;
source->onSourceClosure1();
}
void ByteStreamMultiFileSource::onSourceClosure1() {
// This routine was called because the currently-read source was closed
// (probably due to EOF). Close this source down, and move to the
// next one:
ByteStreamFileSource*& source
= fSourceArray[fCurrentlyReadSourceNumber++];
Medium::close(source);
source = NULL;
// Try reading again:
doGetNextFrame();
}
live/liveMedia/COPYING -> ../COPYING (symlink)
live/liveMedia/DeviceSource.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A template for a MediaSource encapsulating an audio/video input device
//
// NOTE: Sections of this code labeled "%%% TO BE WRITTEN %%%" are incomplete, and need to be written by the programmer
// (depending on the features of the particular device).
// Implementation
#include "DeviceSource.hh"
#include <GroupsockHelper.hh> // for "gettimeofday()"
DeviceSource*
DeviceSource::createNew(UsageEnvironment& env,
DeviceParameters params) {
return new DeviceSource(env, params);
}
EventTriggerId DeviceSource::eventTriggerId = 0;
unsigned DeviceSource::referenceCount = 0;
DeviceSource::DeviceSource(UsageEnvironment& env,
DeviceParameters params)
: FramedSource(env), fParams(params) {
if (referenceCount == 0) {
// Any global initialization of the device would be done here:
//%%% TO BE WRITTEN %%%
}
++referenceCount;
// Any instance-specific initialization of the device would be done here:
//%%% TO BE WRITTEN %%%
// We arrange here for our "deliverFrame" member function to be called
// whenever the next frame of data becomes available from the device.
//
// If the device can be accessed as a readable socket, then one easy way to do this is using a call to
// envir().taskScheduler().turnOnBackgroundReadHandling( ... )
// (See examples of this call in the "liveMedia" directory.)
//
// If, however, the device *cannot* be accessed as a readable socket, then instead we can implement it using 'event triggers':
// Create an 'event trigger' for this device (if it hasn't already been done):
if (eventTriggerId == 0) {
eventTriggerId = envir().taskScheduler().createEventTrigger(deliverFrame0);
}
}
DeviceSource::~DeviceSource() {
// Any instance-specific 'destruction' (i.e., resetting) of the device would be done here:
//%%% TO BE WRITTEN %%%
--referenceCount;
if (referenceCount == 0) {
// Any global 'destruction' (i.e., resetting) of the device would be done here:
//%%% TO BE WRITTEN %%%
// Reclaim our 'event trigger'
envir().taskScheduler().deleteEventTrigger(eventTriggerId);
eventTriggerId = 0;
}
}
void DeviceSource::doGetNextFrame() {
// This function is called (by our 'downstream' object) when it asks for new data.
// Note: If, for some reason, the source device stops being readable (e.g., it gets closed), then you do the following:
if (0 /* the source stops being readable */ /*%%% TO BE WRITTEN %%%*/) {
handleClosure();
return;
}
// If a new frame of data is immediately available to be delivered, then do this now:
if (0 /* a new frame of data is immediately available to be delivered*/ /*%%% TO BE WRITTEN %%%*/) {
deliverFrame();
}
// No new data is immediately available to be delivered. We don't do anything more here.
// Instead, our event trigger must be called (e.g., from a separate thread) when new data becomes available.
}
void DeviceSource::deliverFrame0(void* clientData) {
((DeviceSource*)clientData)->deliverFrame();
}
void DeviceSource::deliverFrame() {
// This function is called when new frame data is available from the device.
// We deliver this data by copying it to the 'downstream' object, using the following parameters (class members):
// 'in' parameters (these should *not* be modified by this function):
// fTo: The frame data is copied to this address.
// (Note that the variable "fTo" is *not* modified. Instead,
// the frame data is copied to the address pointed to by "fTo".)
// fMaxSize: This is the maximum number of bytes that can be copied
// (If the actual frame is larger than this, then it should
// be truncated, and "fNumTruncatedBytes" set accordingly.)
// 'out' parameters (these are modified by this function):
// fFrameSize: Should be set to the delivered frame size (<= fMaxSize).
// fNumTruncatedBytes: Should be set iff the delivered frame would have been
// bigger than "fMaxSize", in which case it's set to the number of bytes
// that have been omitted.
// fPresentationTime: Should be set to the frame's presentation time
// (seconds, microseconds). This time must be aligned with 'wall-clock time' - i.e., the time that you would get
// by calling "gettimeofday()".
// fDurationInMicroseconds: Should be set to the frame's duration, if known.
// If, however, the device is a 'live source' (e.g., encoded from a camera or microphone), then we probably don't need
// to set this variable, because - in this case - data will never arrive 'early'.
// Note the code below.
if (!isCurrentlyAwaitingData()) return; // we're not ready for the data yet
u_int8_t* newFrameDataStart = (u_int8_t*)0xDEADBEEF; //%%% TO BE WRITTEN %%%
unsigned newFrameSize = 0; //%%% TO BE WRITTEN %%%
// Deliver the data here:
if (newFrameSize > fMaxSize) {
fFrameSize = fMaxSize;
fNumTruncatedBytes = newFrameSize - fMaxSize;
} else {
fFrameSize = newFrameSize;
}
gettimeofday(&fPresentationTime, NULL); // If you have a more accurate time - e.g., from an encoder - then use that instead.
// If the device is *not* a 'live source' (e.g., it comes instead from a file or buffer), then set "fDurationInMicroseconds" here.
memmove(fTo, newFrameDataStart, fFrameSize);
// After delivering the data, inform the reader that it is now available:
FramedSource::afterGetting(this);
}
// The following code would be called to signal that a new frame of data has become available.
// This (unlike other "LIVE555 Streaming Media" library code) may be called from a separate thread.
// (Note, however, that "triggerEvent()" cannot be called with the same 'event trigger id' from different threads.
// Also, if you want to have multiple device threads, each one using a different 'event trigger id', then you will need
// to make "eventTriggerId" a non-static member variable of "DeviceSource".)
void signalNewFrameData() {
TaskScheduler* ourScheduler = NULL; //%%% TO BE WRITTEN %%%
DeviceSource* ourDevice = NULL; //%%% TO BE WRITTEN %%%
if (ourScheduler != NULL) { // sanity check
ourScheduler->triggerEvent(DeviceSource::eventTriggerId, ourDevice);
}
}
live/liveMedia/DigestAuthentication.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A class used for digest authentication.
// Implementation
#include "DigestAuthentication.hh"
#include "ourMD5.hh"
#include <strDup.hh>
#include <GroupsockHelper.hh> // for gettimeofday()
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
Authenticator::Authenticator() {
assign(NULL, NULL, NULL, NULL, False);
}
Authenticator::Authenticator(char const* username, char const* password, Boolean passwordIsMD5) {
assign(NULL, NULL, username, password, passwordIsMD5);
}
Authenticator::Authenticator(const Authenticator& orig) {
assign(orig.realm(), orig.nonce(), orig.username(), orig.password(), orig.fPasswordIsMD5);
}
Authenticator& Authenticator::operator=(const Authenticator& rightSide) {
if (&rightSide != this) {
reset();
assign(rightSide.realm(), rightSide.nonce(),
rightSide.username(), rightSide.password(), rightSide.fPasswordIsMD5);
}
return *this;
}
Boolean Authenticator::operator<(const Authenticator* rightSide) {
// Returns True if "rightSide" is 'newer' than us:
if (rightSide != NULL && rightSide != this &&
(rightSide->realm() != NULL || rightSide->nonce() != NULL ||
username() == NULL || password() == NULL ||
strcmp(rightSide->username(), username()) != 0 ||
strcmp(rightSide->password(), password()) != 0)) {
return True;
}
return False;
}
Authenticator::~Authenticator() {
reset();
}
void Authenticator::reset() {
resetRealmAndNonce();
resetUsernameAndPassword();
}
void Authenticator::setRealmAndNonce(char const* realm, char const* nonce) {
resetRealmAndNonce();
assignRealmAndNonce(realm, nonce);
}
void Authenticator::setRealmAndRandomNonce(char const* realm) {
resetRealmAndNonce();
// Construct data to seed the random nonce:
struct {
struct timeval timestamp;
unsigned counter;
} seedData;
gettimeofday(&seedData.timestamp, NULL);
static unsigned counter = 0;
seedData.counter = ++counter;
// Use MD5 to compute a 'random' nonce from this seed data:
char nonceBuf[33];
our_MD5Data((unsigned char*)(&seedData), sizeof seedData, nonceBuf);
assignRealmAndNonce(realm, nonceBuf);
}
void Authenticator::setUsernameAndPassword(char const* username,
char const* password,
Boolean passwordIsMD5) {
resetUsernameAndPassword();
assignUsernameAndPassword(username, password, passwordIsMD5);
}
char const* Authenticator::computeDigestResponse(char const* cmd,
char const* url) const {
// The "response" field is computed as:
//   md5(md5(<username>:<realm>:<password>):<nonce>:md5(<cmd>:<url>))
// or, if "fPasswordIsMD5" is True:
//   md5(<password>:<nonce>:md5(<cmd>:<url>))
char ha1Buf[33];
if (fPasswordIsMD5) {
strncpy(ha1Buf, password(), 32);
ha1Buf[32] = '\0'; // just in case
} else {
unsigned const ha1DataLen = strlen(username()) + 1
+ strlen(realm()) + 1 + strlen(password());
unsigned char* ha1Data = new unsigned char[ha1DataLen+1];
sprintf((char*)ha1Data, "%s:%s:%s", username(), realm(), password());
our_MD5Data(ha1Data, ha1DataLen, ha1Buf);
delete[] ha1Data;
}
unsigned const ha2DataLen = strlen(cmd) + 1 + strlen(url);
unsigned char* ha2Data = new unsigned char[ha2DataLen+1];
sprintf((char*)ha2Data, "%s:%s", cmd, url);
char ha2Buf[33];
our_MD5Data(ha2Data, ha2DataLen, ha2Buf);
delete[] ha2Data;
unsigned const digestDataLen
= 32 + 1 + strlen(nonce()) + 1 + 32;
unsigned char* digestData = new unsigned char[digestDataLen+1];
sprintf((char*)digestData, "%s:%s:%s",
ha1Buf, nonce(), ha2Buf);
char const* result = our_MD5Data(digestData, digestDataLen, NULL);
delete[] digestData;
return result;
}
void Authenticator::reclaimDigestResponse(char const* responseStr) const {
delete[](char*)responseStr;
}
void Authenticator::resetRealmAndNonce() {
delete[] fRealm; fRealm = NULL;
delete[] fNonce; fNonce = NULL;
}
void Authenticator::resetUsernameAndPassword() {
delete[] fUsername; fUsername = NULL;
delete[] fPassword; fPassword = NULL;
fPasswordIsMD5 = False;
}
void Authenticator::assignRealmAndNonce(char const* realm, char const* nonce) {
fRealm = strDup(realm);
fNonce = strDup(nonce);
}
void Authenticator::assignUsernameAndPassword(char const* username, char const* password, Boolean passwordIsMD5) {
if (username == NULL) username = "";
if (password == NULL) password = "";
fUsername = strDup(username);
fPassword = strDup(password);
fPasswordIsMD5 = passwordIsMD5;
}
void Authenticator::assign(char const* realm, char const* nonce,
char const* username, char const* password, Boolean passwordIsMD5) {
assignRealmAndNonce(realm, nonce);
assignUsernameAndPassword(username, password, passwordIsMD5);
}
live/liveMedia/DVVideoFileServerMediaSubsession.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A 'ServerMediaSubsession' object that creates new, unicast, "RTPSink"s
// on demand, from a DV video file.
// Implementation
#include "DVVideoFileServerMediaSubsession.hh"
#include "DVVideoRTPSink.hh"
#include "ByteStreamFileSource.hh"
#include "DVVideoStreamFramer.hh"
DVVideoFileServerMediaSubsession*
DVVideoFileServerMediaSubsession::createNew(UsageEnvironment& env, char const* fileName, Boolean reuseFirstSource) {
return new DVVideoFileServerMediaSubsession(env, fileName, reuseFirstSource);
}
DVVideoFileServerMediaSubsession
::DVVideoFileServerMediaSubsession(UsageEnvironment& env, char const* fileName, Boolean reuseFirstSource)
: FileServerMediaSubsession(env, fileName, reuseFirstSource),
fFileDuration(0.0) {
}
DVVideoFileServerMediaSubsession::~DVVideoFileServerMediaSubsession() {
}
FramedSource* DVVideoFileServerMediaSubsession
::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
// Create the video source:
ByteStreamFileSource* fileSource = ByteStreamFileSource::createNew(envir(), fFileName);
if (fileSource == NULL) return NULL;
fFileSize = fileSource->fileSize();
// Create a framer for the Video Elementary Stream:
DVVideoStreamFramer* framer = DVVideoStreamFramer::createNew(envir(), fileSource, True/*the file source is seekable*/);
// Use the framer to figure out the file's duration:
unsigned frameSize;
double frameDuration;
if (framer->getFrameParameters(frameSize, frameDuration)) {
fFileDuration = (float)(((int64_t)fFileSize*frameDuration)/(frameSize*1000000.0));
estBitrate = (unsigned)((8000.0*frameSize)/frameDuration); // in kbps
} else {
estBitrate = 50000; // kbps, estimate
}
return framer;
}
RTPSink* DVVideoFileServerMediaSubsession::createNewRTPSink(Groupsock* rtpGroupsock,
unsigned char rtpPayloadTypeIfDynamic,
FramedSource* /*inputSource*/) {
return DVVideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
}
char const* DVVideoFileServerMediaSubsession::getAuxSDPLine(RTPSink* rtpSink, FramedSource* inputSource) {
return ((DVVideoRTPSink*)rtpSink)->auxSDPLineFromFramer((DVVideoStreamFramer*)inputSource);
}
float DVVideoFileServerMediaSubsession::duration() const {
return fFileDuration;
}
void DVVideoFileServerMediaSubsession
::seekStreamSource(FramedSource* inputSource, double& seekNPT, double streamDuration, u_int64_t& numBytes) {
// First, get the file source from "inputSource" (a framer):
DVVideoStreamFramer* framer = (DVVideoStreamFramer*)inputSource;
ByteStreamFileSource* fileSource = (ByteStreamFileSource*)(framer->inputSource());
// Then figure out where to seek to within the file:
if (fFileDuration > 0.0) {
u_int64_t seekByteNumber = (u_int64_t)(((int64_t)fFileSize*seekNPT)/fFileDuration);
numBytes = (u_int64_t)(((int64_t)fFileSize*streamDuration)/fFileDuration);
fileSource->seekToByteAbsolute(seekByteNumber, numBytes);
}
}
void DVVideoFileServerMediaSubsession
::setStreamSourceDuration(FramedSource* inputSource, double streamDuration, u_int64_t& numBytes) {
// First, get the file source from "inputSource" (a framer):
DVVideoStreamFramer* framer = (DVVideoStreamFramer*)inputSource;
ByteStreamFileSource* fileSource = (ByteStreamFileSource*)(framer->inputSource());
// Then figure out how many bytes to limit the streaming to:
if (fFileDuration > 0.0) {
numBytes = (u_int64_t)(((int64_t)fFileSize*streamDuration)/fFileDuration);
fileSource->seekToByteRelative(0, numBytes);
}
}
live/liveMedia/DVVideoRTPSink.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// RTP sink for DV video (RFC 3189)
// (Thanks to Ben Hutchings for prototyping this.)
// Implementation
#include "DVVideoRTPSink.hh"
////////// DVVideoRTPSink implementation //////////
DVVideoRTPSink
::DVVideoRTPSink(UsageEnvironment& env, Groupsock* RTPgs, unsigned char rtpPayloadFormat)
: VideoRTPSink(env, RTPgs, rtpPayloadFormat, 90000, "DV"),
fFmtpSDPLine(NULL) {
}
DVVideoRTPSink::~DVVideoRTPSink() {
delete[] fFmtpSDPLine;
}
DVVideoRTPSink*
DVVideoRTPSink::createNew(UsageEnvironment& env, Groupsock* RTPgs, unsigned char rtpPayloadFormat) {
return new DVVideoRTPSink(env, RTPgs, rtpPayloadFormat);
}
Boolean DVVideoRTPSink::sourceIsCompatibleWithUs(MediaSource& source) {
// Our source must be an appropriate framer:
return source.isDVVideoStreamFramer();
}
void DVVideoRTPSink::doSpecialFrameHandling(unsigned fragmentationOffset,
unsigned char* /*frameStart*/,
unsigned /*numBytesInFrame*/,
struct timeval framePresentationTime,
unsigned numRemainingBytes) {
if (numRemainingBytes == 0) {
// This packet contains the last (or only) fragment of the frame.
// Set the RTP 'M' ('marker') bit:
setMarkerBit();
}
// Also set the RTP timestamp:
setTimestamp(framePresentationTime);
}
unsigned DVVideoRTPSink::computeOverflowForNewFrame(unsigned newFrameSize) const {
unsigned initialOverflow = MultiFramedRTPSink::computeOverflowForNewFrame(newFrameSize);
// Adjust (increase) this overflow, if necessary, so that the amount of frame data that we use is an integral number
// of DIF blocks:
unsigned numFrameBytesUsed = newFrameSize - initialOverflow;
initialOverflow += numFrameBytesUsed%DV_DIF_BLOCK_SIZE;
return initialOverflow;
}
char const* DVVideoRTPSink::auxSDPLine() {
// Generate a new "a=fmtp:" line each time, using parameters from
// our framer source (in case they've changed since the last time that
// we were called):
DVVideoStreamFramer* framerSource = (DVVideoStreamFramer*)fSource;
if (framerSource == NULL) return NULL; // we don't yet have a source
return auxSDPLineFromFramer(framerSource);
}
char const* DVVideoRTPSink::auxSDPLineFromFramer(DVVideoStreamFramer* framerSource) {
char const* const profileName = framerSource->profileName();
if (profileName == NULL) return NULL;
char const* const fmtpSDPFmt = "a=fmtp:%d encode=%s;audio=bundled\r\n";
unsigned fmtpSDPFmtSize = strlen(fmtpSDPFmt)
+ 3 // max payload format code length
+ strlen(profileName);
delete[] fFmtpSDPLine; // if it already exists
fFmtpSDPLine = new char[fmtpSDPFmtSize];
sprintf(fFmtpSDPLine, fmtpSDPFmt, rtpPayloadType(), profileName);
return fFmtpSDPLine;
}
live/liveMedia/DVVideoRTPSource.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// DV Video RTP Sources
// Implementation
#include "DVVideoRTPSource.hh"
DVVideoRTPSource*
DVVideoRTPSource::createNew(UsageEnvironment& env,
Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency) {
return new DVVideoRTPSource(env, RTPgs, rtpPayloadFormat, rtpTimestampFrequency);
}
DVVideoRTPSource::DVVideoRTPSource(UsageEnvironment& env,
Groupsock* rtpGS,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency)
: MultiFramedRTPSource(env, rtpGS,
rtpPayloadFormat, rtpTimestampFrequency) {
}
DVVideoRTPSource::~DVVideoRTPSource() {
}
#define DV_DIF_BLOCK_SIZE 80
#define DV_SECTION_HEADER 0x1F
Boolean DVVideoRTPSource
::processSpecialHeader(BufferedPacket* packet,
unsigned& resultSpecialHeaderSize) {
unsigned const packetSize = packet->dataSize();
if (packetSize < DV_DIF_BLOCK_SIZE) return False; // TARFU!
u_int8_t const* data = packet->data();
fCurrentPacketBeginsFrame = data[0] == DV_SECTION_HEADER && (data[1]&0xf8) == 0 && data[2] == 0; // thanks to Ben Hutchings
// The RTP "M" (marker) bit indicates the last fragment of a frame:
fCurrentPacketCompletesFrame = packet->rtpMarkerBit();
// There is no special header
resultSpecialHeaderSize = 0;
return True;
}
char const* DVVideoRTPSource::MIMEtype() const {
return "video/DV";
}
live/liveMedia/DVVideoStreamFramer.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A filter that parses a DV input stream into DV frames to deliver to the downstream object
// Implementation
// (Thanks to Ben Hutchings for his help, including a prototype implementation.)
#include "DVVideoStreamFramer.hh"
#include "GroupsockHelper.hh"
////////// DVVideoStreamFramer implementation //////////
DVVideoStreamFramer::DVVideoStreamFramer(UsageEnvironment& env, FramedSource* inputSource,
Boolean sourceIsSeekable, Boolean leavePresentationTimesUnmodified)
: FramedFilter(env, inputSource),
fLeavePresentationTimesUnmodified(leavePresentationTimesUnmodified),
fOurProfile(NULL), fInitialBlocksPresent(False), fSourceIsSeekable(sourceIsSeekable) {
fTo = NULL; // hack used when reading "fSavedInitialBlocks"
// Use the current wallclock time as the initial 'presentation time':
gettimeofday(&fNextFramePresentationTime, NULL);
}
DVVideoStreamFramer::~DVVideoStreamFramer() {
}
DVVideoStreamFramer*
DVVideoStreamFramer::createNew(UsageEnvironment& env, FramedSource* inputSource,
Boolean sourceIsSeekable, Boolean leavePresentationTimesUnmodified) {
return new DVVideoStreamFramer(env, inputSource, sourceIsSeekable, leavePresentationTimesUnmodified);
}
// Define the parameters for the profiles that we understand:
struct DVVideoProfile {
char const* name;
unsigned apt;
unsigned sType;
unsigned sequenceCount;
unsigned channelCount;
unsigned dvFrameSize; // in bytes (== sequenceCount*channelCount*(DV_NUM_BLOCKS_PER_SEQUENCE*DV_DIF_BLOCK_SIZE i.e. 12000))
double frameDuration; // duration of the above, in microseconds. (1000000/this == frame rate)
};
static DVVideoProfile const profiles[] = {
{ "SD-VCR/525-60", 0, 0x00, 10, 1, 120000, (1000000*1001)/30000.0 },
{ "SD-VCR/625-50", 0, 0x00, 12, 1, 144000, 1000000/25.0 },
{ "314M-25/525-60", 1, 0x00, 10, 1, 120000, (1000000*1001)/30000.0 },
{ "314M-25/625-50", 1, 0x00, 12, 1, 144000, 1000000/25.0 },
{ "314M-50/525-60", 1, 0x04, 10, 2, 240000, (1000000*1001)/30000.0 },
{ "314M-50/625-50", 1, 0x04, 12, 2, 288000, 1000000/25.0 },
{ "370M/1080-60i", 1, 0x14, 10, 4, 480000, (1000000*1001)/30000.0 },
{ "370M/1080-50i", 1, 0x14, 12, 4, 576000, 1000000/25.0 },
{ "370M/720-60p", 1, 0x18, 10, 2, 240000, (1000000*1001)/60000.0 },
{ "370M/720-50p", 1, 0x18, 12, 2, 288000, 1000000/50.0 },
{ NULL, 0, 0, 0, 0, 0, 0.0 }
};
char const* DVVideoStreamFramer::profileName() {
if (fOurProfile == NULL) getProfile();
return fOurProfile != NULL ? ((DVVideoProfile const*)fOurProfile)->name : NULL;
}
Boolean DVVideoStreamFramer::getFrameParameters(unsigned& frameSize, double& frameDuration) {
if (fOurProfile == NULL) getProfile();
if (fOurProfile == NULL) return False;
frameSize = ((DVVideoProfile const*)fOurProfile)->dvFrameSize;
frameDuration = ((DVVideoProfile const*)fOurProfile)->frameDuration;
return True;
}
void DVVideoStreamFramer::getProfile() {
// To determine the stream's profile, we need to first read a chunk of data that we can parse:
fInputSource->getNextFrame(fSavedInitialBlocks, DV_SAVED_INITIAL_BLOCKS_SIZE,
afterGettingFrame, this, FramedSource::handleClosure, this);
// Handle events until the requested data arrives:
envir().taskScheduler().doEventLoop(&fInitialBlocksPresent);
}
Boolean DVVideoStreamFramer::isDVVideoStreamFramer() const {
return True;
}
void DVVideoStreamFramer::doGetNextFrame() {
fFrameSize = 0; // initially, until we deliver data
// If we have saved initial blocks (and won't be seeking back to re-read this data), use this data first.
if (fInitialBlocksPresent && !fSourceIsSeekable) {
// For simplicity, we require the downstream object's buffer to be >= this data's size:
if (fMaxSize < DV_SAVED_INITIAL_BLOCKS_SIZE) {
fNumTruncatedBytes = fMaxSize;
afterGetting(this);
return;
}
memmove(fTo, fSavedInitialBlocks, DV_SAVED_INITIAL_BLOCKS_SIZE);
fFrameSize = DV_SAVED_INITIAL_BLOCKS_SIZE;
fTo += DV_SAVED_INITIAL_BLOCKS_SIZE;
fInitialBlocksPresent = False; // for the future
}
// Arrange to read the (rest of the) requested data.
// (But first, make sure that we read an integral multiple of the DV block size.)
fMaxSize -= fMaxSize%DV_DIF_BLOCK_SIZE;
getAndDeliverData();
}
#define DV_SMALLEST_POSSIBLE_FRAME_SIZE 120000
void DVVideoStreamFramer::getAndDeliverData() {
unsigned const totFrameSize
= fOurProfile != NULL ? ((DVVideoProfile const*)fOurProfile)->dvFrameSize : DV_SMALLEST_POSSIBLE_FRAME_SIZE;
unsigned totBytesToDeliver = totFrameSize < fMaxSize ? totFrameSize : fMaxSize;
unsigned numBytesToRead = totBytesToDeliver - fFrameSize;
fInputSource->getNextFrame(fTo, numBytesToRead, afterGettingFrame, this, FramedSource::handleClosure, this);
}
void DVVideoStreamFramer::afterGettingFrame(void* clientData, unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime, unsigned /*durationInMicroseconds*/) {
DVVideoStreamFramer* source = (DVVideoStreamFramer*)clientData;
source->afterGettingFrame(frameSize, numTruncatedBytes, presentationTime);
}
#define DVSectionId(n) ptr[(n)*DV_DIF_BLOCK_SIZE + 0]
#define DVData(n,i) ptr[(n)*DV_DIF_BLOCK_SIZE + 3+(i)]
#define DV_SECTION_HEADER 0x1F
#define DV_PACK_HEADER_10 0x3F
#define DV_PACK_HEADER_12 0xBF
#define DV_SECTION_VAUX_MIN 0x50
#define DV_SECTION_VAUX_MAX 0x5F
#define DV_PACK_VIDEO_SOURCE 60
#ifndef MILLION
#define MILLION 1000000
#endif
void DVVideoStreamFramer::afterGettingFrame(unsigned frameSize, unsigned numTruncatedBytes, struct timeval presentationTime) {
if (fOurProfile == NULL && frameSize >= DV_SAVED_INITIAL_BLOCKS_SIZE) {
// (Try to) parse this data enough to figure out its profile.
// We assume that the data begins on a (80-byte) block boundary, but not necessarily on a (150-block) sequence boundary.
// We therefore scan each 80-byte block, until we find the 6-block header that begins a sequence:
u_int8_t const* data = (fTo == NULL) ? fSavedInitialBlocks : fTo;
for (u_int8_t const* ptr = data; ptr + 6*DV_DIF_BLOCK_SIZE <= &data[DV_SAVED_INITIAL_BLOCKS_SIZE]; ptr += DV_DIF_BLOCK_SIZE) {
// Check whether "ptr" points to an appropriate header:
u_int8_t const sectionHeader = DVSectionId(0);
u_int8_t const sectionVAUX = DVSectionId(5);
u_int8_t const packHeaderNum = DVData(0,0);
if (sectionHeader == DV_SECTION_HEADER
&& (packHeaderNum == DV_PACK_HEADER_10 || packHeaderNum == DV_PACK_HEADER_12)
&& (sectionVAUX >= DV_SECTION_VAUX_MIN && sectionVAUX <= DV_SECTION_VAUX_MAX)) {
// This data begins a sequence; look up the DV profile from this:
u_int8_t const apt = DVData(0,1)&0x07;
u_int8_t const sType = DVData(5,48)&0x1F;
u_int8_t const sequenceCount = (packHeaderNum == DV_PACK_HEADER_10) ? 10 : 12;
// Use these three parameters (apt, sType, sequenceCount) to look up the DV profile:
for (DVVideoProfile const* profile = profiles; profile->name != NULL; ++profile) {
if (profile->apt == apt && profile->sType == sType && profile->sequenceCount == sequenceCount) {
fOurProfile = profile;
break;
}
}
break; // because we found a correct sequence header (even if we don't happen to define a profile for it)
}
}
}
if (fTo != NULL) { // There is a downstream object; complete delivery to it (or read more data, if necessary)
unsigned const totFrameSize
= fOurProfile != NULL ? ((DVVideoProfile const*)fOurProfile)->dvFrameSize : DV_SMALLEST_POSSIBLE_FRAME_SIZE;
fFrameSize += frameSize;
fTo += frameSize;
fPresentationTime = presentationTime; // by default; may get changed below
if (fFrameSize < totFrameSize && fFrameSize < fMaxSize && numTruncatedBytes == 0) {
// We have more data to deliver; get it now:
getAndDeliverData();
} else {
// We're done delivering this DV frame (but check for truncation):
fNumTruncatedBytes = totFrameSize - fFrameSize;
if (fOurProfile != NULL) {
// Also set the presentation time, and increment it for next time,
// based on the length of this frame:
if (!fLeavePresentationTimesUnmodified) fPresentationTime = fNextFramePresentationTime;
DVVideoProfile const* ourProfile = (DVVideoProfile const*)fOurProfile;
double durationInMicroseconds = (fFrameSize*ourProfile->frameDuration)/ourProfile->dvFrameSize;
fDurationInMicroseconds = (unsigned)durationInMicroseconds;
fNextFramePresentationTime.tv_usec += fDurationInMicroseconds;
fNextFramePresentationTime.tv_sec += fNextFramePresentationTime.tv_usec/MILLION;
fNextFramePresentationTime.tv_usec %= MILLION;
}
afterGetting(this);
}
} else {
// We read data into our special buffer; signal that it has arrived:
fInitialBlocksPresent = True;
}
}
live/liveMedia/EBMLNumber.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// EBML numbers (ids and sizes)
// Implementation
#include "EBMLNumber.hh"
EBMLNumber::EBMLNumber(Boolean stripLeading1)
: stripLeading1(stripLeading1), len(0) {
}
EBMLNumber::~EBMLNumber() {
}
char* EBMLNumber::hexString() const {
static char printBuf[2*EBML_NUMBER_MAX_LEN + 1];
char* to = printBuf;
for (unsigned i = 0; i < len; ++i) {
sprintf(to, "%02X", data[i]);
to += 2;
}
return printBuf;
}
u_int64_t EBMLNumber::val() const {
u_int64_t result = 0;
for (unsigned i = 0; i < len; ++i) {
result = result*256 + data[i];
}
return result;
}
EBMLId::EBMLId()
: EBMLNumber(False) {
}
EBMLId::~EBMLId() {
}
char const* EBMLId::stringName() const {
switch (val()) {
case MATROSKA_ID_EBML: { return "EBML"; }
case MATROSKA_ID_VOID: { return "Void"; }
case MATROSKA_ID_CRC_32: { return "CRC-32"; }
case MATROSKA_ID_SEGMENT: { return "Segment"; }
case MATROSKA_ID_SEEK_HEAD: { return "Seek Head"; }
case MATROSKA_ID_SEEK: { return "Seek"; }
case MATROSKA_ID_SEEK_ID: { return "Seek ID"; }
case MATROSKA_ID_SEEK_POSITION: { return "Seek Position"; }
case MATROSKA_ID_INFO: { return "Segment Info"; }
case MATROSKA_ID_SEGMENT_UID: { return "Segment UID"; }
case MATROSKA_ID_DURATION: { return "Segment Duration"; }
case MATROSKA_ID_TIMECODE_SCALE: { return "Timecode Scale"; }
case MATROSKA_ID_DATE_UTC: { return "Date (UTC)"; }
case MATROSKA_ID_TITLE: { return "Title"; }
case MATROSKA_ID_MUXING_APP: { return "Muxing App"; }
case MATROSKA_ID_WRITING_APP: { return "Writing App"; }
case MATROSKA_ID_CLUSTER: { return "Cluster"; }
case MATROSKA_ID_TIMECODE: { return "TimeCode"; }
case MATROSKA_ID_POSITION: { return "Position"; }
case MATROSKA_ID_PREV_SIZE: { return "Prev. Size"; }
case MATROSKA_ID_SIMPLEBLOCK: { return "SimpleBlock"; }
case MATROSKA_ID_BLOCK_GROUP: { return "Block Group"; }
case MATROSKA_ID_BLOCK: { return "Block"; }
case MATROSKA_ID_BLOCK_DURATION: { return "Block Duration"; }
case MATROSKA_ID_REFERENCE_BLOCK: { return "Reference Block"; }
case MATROSKA_ID_TRACKS: { return "Tracks"; }
case MATROSKA_ID_TRACK_ENTRY: { return "Track Entry"; }
case MATROSKA_ID_TRACK_NUMBER: { return "Track Number"; }
case MATROSKA_ID_TRACK_UID: { return "Track UID"; }
case MATROSKA_ID_TRACK_TYPE: { return "Track Type"; }
case MATROSKA_ID_FLAG_ENABLED: { return "Flag Enabled"; }
case MATROSKA_ID_FLAG_DEFAULT: { return "Flag Default"; }
case MATROSKA_ID_FLAG_FORCED: { return "Flag Forced"; }
case MATROSKA_ID_FLAG_LACING: { return "Flag Lacing"; }
case MATROSKA_ID_MIN_CACHE: { return "Min Cache"; }
case MATROSKA_ID_DEFAULT_DURATION: { return "Default Duration"; }
case MATROSKA_ID_TRACK_TIMECODE_SCALE: { return "Track Timecode Scale"; }
case MATROSKA_ID_MAX_BLOCK_ADDITION_ID: { return "Max Block Addition ID"; }
case MATROSKA_ID_NAME: { return "Name"; }
case MATROSKA_ID_LANGUAGE: { return "Language"; }
case MATROSKA_ID_CODEC: { return "Codec ID"; }
case MATROSKA_ID_CODEC_PRIVATE: { return "Codec Private"; }
case MATROSKA_ID_CODEC_NAME: { return "Codec Name"; }
case MATROSKA_ID_CODEC_DECODE_ALL: { return "Codec Decode All"; }
case MATROSKA_ID_VIDEO: { return "Video Settings"; }
case MATROSKA_ID_FLAG_INTERLACED: { return "Flag Interlaced"; }
case MATROSKA_ID_PIXEL_WIDTH: { return "Pixel Width"; }
case MATROSKA_ID_PIXEL_HEIGHT: { return "Pixel Height"; }
case MATROSKA_ID_DISPLAY_WIDTH: { return "Display Width"; }
case MATROSKA_ID_DISPLAY_HEIGHT: { return "Display Height"; }
case MATROSKA_ID_DISPLAY_UNIT: { return "Display Unit"; }
case MATROSKA_ID_AUDIO: { return "Audio Settings"; }
case MATROSKA_ID_SAMPLING_FREQUENCY: { return "Sampling Frequency"; }
case MATROSKA_ID_OUTPUT_SAMPLING_FREQUENCY: { return "Output Sampling Frequency"; }
case MATROSKA_ID_CHANNELS: { return "Channels"; }
case MATROSKA_ID_BIT_DEPTH: { return "Bit Depth"; }
case MATROSKA_ID_CONTENT_ENCODINGS: { return "Content Encodings"; }
case MATROSKA_ID_CONTENT_ENCODING: { return "Content Encoding"; }
case MATROSKA_ID_CONTENT_COMPRESSION: { return "Content Compression"; }
case MATROSKA_ID_CONTENT_COMP_ALGO: { return "Content Compression Algorithm"; }
case MATROSKA_ID_CONTENT_COMP_SETTINGS: { return "Content Compression Settings"; }
case MATROSKA_ID_CONTENT_ENCRYPTION: { return "Content Encryption"; }
case MATROSKA_ID_ATTACHMENTS: { return "Attachments"; }
case MATROSKA_ID_ATTACHED_FILE: { return "Attached File"; }
case MATROSKA_ID_FILE_DESCRIPTION: { return "File Description"; }
case MATROSKA_ID_FILE_NAME: { return "File Name"; }
case MATROSKA_ID_FILE_MIME_TYPE: { return "File MIME Type"; }
case MATROSKA_ID_FILE_DATA: { return "File Data"; }
case MATROSKA_ID_FILE_UID: { return "File UID"; }
case MATROSKA_ID_CUES: { return "Cues"; }
case MATROSKA_ID_CUE_POINT: { return "Cue Point"; }
case MATROSKA_ID_CUE_TIME: { return "Cue Time"; }
case MATROSKA_ID_CUE_TRACK_POSITIONS: { return "Cue Track Positions"; }
case MATROSKA_ID_CUE_TRACK: { return "Cue Track"; }
case MATROSKA_ID_CUE_CLUSTER_POSITION: { return "Cue Cluster Position"; }
case MATROSKA_ID_CUE_BLOCK_NUMBER: { return "Cue Block Number"; }
case MATROSKA_ID_TAGS: { return "Tags"; }
case MATROSKA_ID_SEEK_PRE_ROLL: { return "SeekPreRoll"; }
case MATROSKA_ID_CODEC_DELAY: { return "CodecDelay"; }
case MATROSKA_ID_DISCARD_PADDING: { return "DiscardPadding"; }
default: { return "*****unknown*****"; }
}
}
EBMLDataSize::EBMLDataSize()
: EBMLNumber(True) {
}
EBMLDataSize::~EBMLDataSize() {
}
live/liveMedia/EBMLNumber.hh:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// EBML numbers (ids and sizes)
// C++ header
#ifndef _EBML_NUMBER_HH
#define _EBML_NUMBER_HH
#include "NetCommon.h"
#include "Boolean.hh"
#include
#define EBML_NUMBER_MAX_LEN 8
class EBMLNumber {
public:
EBMLNumber(Boolean stripLeading1 = True);
virtual ~EBMLNumber();
u_int64_t val() const;
char* hexString() const; // used for debugging
Boolean operator==(u_int64_t arg2) const { return val() == arg2; }
Boolean operator!=(u_int64_t arg2) const { return !(*this == arg2); }
public:
Boolean stripLeading1;
unsigned len;
u_int8_t data[EBML_NUMBER_MAX_LEN];
};
// Definitions of some Matroska/EBML IDs (including the ones that we check for):
#define MATROSKA_ID_EBML 0x1A45DFA3
#define MATROSKA_ID_VOID 0xEC
#define MATROSKA_ID_CRC_32 0xBF
#define MATROSKA_ID_SEGMENT 0x18538067
#define MATROSKA_ID_SEEK_HEAD 0x114D9B74
#define MATROSKA_ID_SEEK 0x4DBB
#define MATROSKA_ID_SEEK_ID 0x53AB
#define MATROSKA_ID_SEEK_POSITION 0x53AC
#define MATROSKA_ID_INFO 0x1549A966
#define MATROSKA_ID_SEGMENT_UID 0x73A4
#define MATROSKA_ID_TIMECODE_SCALE 0x2AD7B1
#define MATROSKA_ID_DURATION 0x4489
#define MATROSKA_ID_DATE_UTC 0x4461
#define MATROSKA_ID_TITLE 0x7BA9
#define MATROSKA_ID_MUXING_APP 0x4D80
#define MATROSKA_ID_WRITING_APP 0x5741
#define MATROSKA_ID_CLUSTER 0x1F43B675
#define MATROSKA_ID_TIMECODE 0xE7
#define MATROSKA_ID_POSITION 0xA7
#define MATROSKA_ID_PREV_SIZE 0xAB
#define MATROSKA_ID_SIMPLEBLOCK 0xA3
#define MATROSKA_ID_BLOCK_GROUP 0xA0
#define MATROSKA_ID_BLOCK 0xA1
#define MATROSKA_ID_BLOCK_DURATION 0x9B
#define MATROSKA_ID_REFERENCE_BLOCK 0xFB
#define MATROSKA_ID_TRACKS 0x1654AE6B
#define MATROSKA_ID_TRACK_ENTRY 0xAE
#define MATROSKA_ID_TRACK_NUMBER 0xD7
#define MATROSKA_ID_TRACK_UID 0x73C5
#define MATROSKA_ID_TRACK_TYPE 0x83
#define MATROSKA_ID_FLAG_ENABLED 0xB9
#define MATROSKA_ID_FLAG_DEFAULT 0x88
#define MATROSKA_ID_FLAG_FORCED 0x55AA
#define MATROSKA_ID_FLAG_LACING 0x9C
#define MATROSKA_ID_MIN_CACHE 0x6DE7
#define MATROSKA_ID_DEFAULT_DURATION 0x23E383
#define MATROSKA_ID_TRACK_TIMECODE_SCALE 0x23314F
#define MATROSKA_ID_MAX_BLOCK_ADDITION_ID 0x55EE
#define MATROSKA_ID_NAME 0x536E
#define MATROSKA_ID_LANGUAGE 0x22B59C
#define MATROSKA_ID_CODEC 0x86
#define MATROSKA_ID_CODEC_PRIVATE 0x63A2
#define MATROSKA_ID_CODEC_NAME 0x258688
#define MATROSKA_ID_CODEC_DECODE_ALL 0xAA
#define MATROSKA_ID_VIDEO 0xE0
#define MATROSKA_ID_FLAG_INTERLACED 0x9A
#define MATROSKA_ID_PIXEL_WIDTH 0xB0
#define MATROSKA_ID_PIXEL_HEIGHT 0xBA
#define MATROSKA_ID_DISPLAY_WIDTH 0x54B0
#define MATROSKA_ID_DISPLAY_HEIGHT 0x54BA
#define MATROSKA_ID_DISPLAY_UNIT 0x54B2
#define MATROSKA_ID_AUDIO 0xE1
#define MATROSKA_ID_SAMPLING_FREQUENCY 0xB5
#define MATROSKA_ID_OUTPUT_SAMPLING_FREQUENCY 0x78B5
#define MATROSKA_ID_CHANNELS 0x9F
#define MATROSKA_ID_BIT_DEPTH 0x6264
#define MATROSKA_ID_CONTENT_ENCODINGS 0x6D80
#define MATROSKA_ID_CONTENT_ENCODING 0x6240
#define MATROSKA_ID_CONTENT_COMPRESSION 0x5034
#define MATROSKA_ID_CONTENT_COMP_ALGO 0x4254
#define MATROSKA_ID_CONTENT_COMP_SETTINGS 0x4255
#define MATROSKA_ID_CONTENT_ENCRYPTION 0x5035
#define MATROSKA_ID_ATTACHMENTS 0x1941A469
#define MATROSKA_ID_ATTACHED_FILE 0x61A7
#define MATROSKA_ID_FILE_DESCRIPTION 0x467E
#define MATROSKA_ID_FILE_NAME 0x466E
#define MATROSKA_ID_FILE_MIME_TYPE 0x4660
#define MATROSKA_ID_FILE_DATA 0x465C
#define MATROSKA_ID_FILE_UID 0x46AE
#define MATROSKA_ID_CUES 0x1C53BB6B
#define MATROSKA_ID_CUE_POINT 0xBB
#define MATROSKA_ID_CUE_TIME 0xB3
#define MATROSKA_ID_CUE_TRACK_POSITIONS 0xB7
#define MATROSKA_ID_CUE_TRACK 0xF7
#define MATROSKA_ID_CUE_CLUSTER_POSITION 0xF1
#define MATROSKA_ID_CUE_BLOCK_NUMBER 0x5378
#define MATROSKA_ID_TAGS 0x1254C367
#define MATROSKA_ID_SEEK_PRE_ROLL 0x56BB
#define MATROSKA_ID_CODEC_DELAY 0x56AA
#define MATROSKA_ID_DISCARD_PADDING 0x75A2
class EBMLId: public EBMLNumber {
public:
EBMLId();
virtual ~EBMLId();
char const* stringName() const; // used for debugging
};
class EBMLDataSize: public EBMLNumber {
public:
EBMLDataSize();
virtual ~EBMLDataSize();
};
#endif
live/liveMedia/FileSink.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// File sinks
// Implementation
#if (defined(__WIN32__) || defined(_WIN32)) && !defined(_WIN32_WCE)
#include
#include
#endif
#include "FileSink.hh"
#include "GroupsockHelper.hh"
#include "OutputFile.hh"
////////// FileSink //////////
FileSink::FileSink(UsageEnvironment& env, FILE* fid, unsigned bufferSize,
char const* perFrameFileNamePrefix)
: MediaSink(env), fOutFid(fid), fBufferSize(bufferSize), fSamePresentationTimeCounter(0) {
fBuffer = new unsigned char[bufferSize];
if (perFrameFileNamePrefix != NULL) {
fPerFrameFileNamePrefix = strDup(perFrameFileNamePrefix);
fPerFrameFileNameBuffer = new char[strlen(perFrameFileNamePrefix) + 100];
} else {
fPerFrameFileNamePrefix = NULL;
fPerFrameFileNameBuffer = NULL;
}
fPrevPresentationTime.tv_sec = ~0; fPrevPresentationTime.tv_usec = 0;
}
FileSink::~FileSink() {
delete[] fPerFrameFileNameBuffer;
delete[] fPerFrameFileNamePrefix;
delete[] fBuffer;
if (fOutFid != NULL) fclose(fOutFid);
}
FileSink* FileSink::createNew(UsageEnvironment& env, char const* fileName,
unsigned bufferSize, Boolean oneFilePerFrame) {
do {
FILE* fid;
char const* perFrameFileNamePrefix;
if (oneFilePerFrame) {
// Create the fid for each frame
fid = NULL;
perFrameFileNamePrefix = fileName;
} else {
// Normal case: create the fid once
fid = OpenOutputFile(env, fileName);
if (fid == NULL) break;
perFrameFileNamePrefix = NULL;
}
return new FileSink(env, fid, bufferSize, perFrameFileNamePrefix);
} while (0);
return NULL;
}
Boolean FileSink::continuePlaying() {
if (fSource == NULL) return False;
fSource->getNextFrame(fBuffer, fBufferSize,
afterGettingFrame, this,
onSourceClosure, this);
return True;
}
void FileSink::afterGettingFrame(void* clientData, unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned /*durationInMicroseconds*/) {
FileSink* sink = (FileSink*)clientData;
sink->afterGettingFrame(frameSize, numTruncatedBytes, presentationTime);
}
void FileSink::addData(unsigned char const* data, unsigned dataSize,
struct timeval presentationTime) {
if (fPerFrameFileNameBuffer != NULL && fOutFid == NULL) {
// Special case: Open a new file on-the-fly for this frame
if (presentationTime.tv_usec == fPrevPresentationTime.tv_usec &&
presentationTime.tv_sec == fPrevPresentationTime.tv_sec) {
// The presentation time is unchanged from the previous frame, so we add a 'counter'
// suffix to the file name, to distinguish them:
sprintf(fPerFrameFileNameBuffer, "%s-%lu.%06lu-%u", fPerFrameFileNamePrefix,
presentationTime.tv_sec, presentationTime.tv_usec, ++fSamePresentationTimeCounter);
} else {
sprintf(fPerFrameFileNameBuffer, "%s-%lu.%06lu", fPerFrameFileNamePrefix,
presentationTime.tv_sec, presentationTime.tv_usec);
fPrevPresentationTime = presentationTime; // for next time
fSamePresentationTimeCounter = 0; // for next time
}
fOutFid = OpenOutputFile(envir(), fPerFrameFileNameBuffer);
}
// Write to our file:
#ifdef TEST_LOSS
static unsigned const framesPerPacket = 10;
static unsigned frameCount = 0;
static Boolean packetIsLost;
if ((frameCount++)%framesPerPacket == 0) {
packetIsLost = (our_random()%10 == 0); // simulate 10% packet loss #####
}
if (!packetIsLost)
#endif
if (fOutFid != NULL && data != NULL) {
fwrite(data, 1, dataSize, fOutFid);
}
}
void FileSink::afterGettingFrame(unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime) {
if (numTruncatedBytes > 0) {
envir() << "FileSink::afterGettingFrame(): The input frame data was too large for our buffer size ("
<< fBufferSize << "). "
<< numTruncatedBytes << " bytes of trailing data was dropped! Correct this by increasing the \"bufferSize\" parameter in the \"createNew()\" call to at least "
<< fBufferSize + numTruncatedBytes << "\n";
}
addData(fBuffer, frameSize, presentationTime);
if (fOutFid == NULL || fflush(fOutFid) == EOF) {
// The output file has closed. Handle this the same way as if the input source had closed:
if (fSource != NULL) fSource->stopGettingFrames();
onSourceClosure();
return;
}
if (fPerFrameFileNameBuffer != NULL) {
if (fOutFid != NULL) { fclose(fOutFid); fOutFid = NULL; }
}
// Then try getting the next frame:
continuePlaying();
}
live/liveMedia/FileServerMediaSubsession.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A 'ServerMediaSubsession' object that creates new, unicast, "RTPSink"s
// on demand, from a file.
// Implementation
#include "FileServerMediaSubsession.hh"
FileServerMediaSubsession
::FileServerMediaSubsession(UsageEnvironment& env, char const* fileName,
Boolean reuseFirstSource)
: OnDemandServerMediaSubsession(env, reuseFirstSource),
fFileSize(0) {
fFileName = strDup(fileName);
}
FileServerMediaSubsession::~FileServerMediaSubsession() {
delete[] (char*)fFileName;
}
live/liveMedia/FramedFileSource.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// Framed File Sources
// Implementation
#include "FramedFileSource.hh"
////////// FramedFileSource //////////
FramedFileSource::FramedFileSource(UsageEnvironment& env, FILE* fid)
: FramedSource(env), fFid(fid) {
}
FramedFileSource::~FramedFileSource() {
}
live/liveMedia/FramedFilter.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// Framed Filters
// Implementation
#include "FramedFilter.hh"
////////// FramedFilter //////////
#include <string.h>
void FramedFilter::detachInputSource() {
if (fInputSource != NULL) {
fInputSource->stopGettingFrames();
reassignInputSource(NULL);
}
}
FramedFilter::FramedFilter(UsageEnvironment& env,
FramedSource* inputSource)
: FramedSource(env),
fInputSource(inputSource) {
}
FramedFilter::~FramedFilter() {
Medium::close(fInputSource);
}
// Default implementations of needed virtual functions. These merely
// call the same function in the input source - i.e., act like a 'null' filter:
char const* FramedFilter::MIMEtype() const {
if (fInputSource == NULL) return "";
return fInputSource->MIMEtype();
}
void FramedFilter::getAttributes() const {
if (fInputSource != NULL) fInputSource->getAttributes();
}
void FramedFilter::doStopGettingFrames() {
FramedSource::doStopGettingFrames();
if (fInputSource != NULL) fInputSource->stopGettingFrames();
}
live/liveMedia/FramedSource.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// Framed Sources
// Implementation
#include "FramedSource.hh"
#include <stdlib.h>
////////// FramedSource //////////
FramedSource::FramedSource(UsageEnvironment& env)
: MediaSource(env),
fAfterGettingFunc(NULL), fAfterGettingClientData(NULL),
fOnCloseFunc(NULL), fOnCloseClientData(NULL),
fIsCurrentlyAwaitingData(False) {
fPresentationTime.tv_sec = fPresentationTime.tv_usec = 0; // initially
}
FramedSource::~FramedSource() {
}
Boolean FramedSource::isFramedSource() const {
return True;
}
Boolean FramedSource::lookupByName(UsageEnvironment& env, char const* sourceName,
FramedSource*& resultSource) {
resultSource = NULL; // unless we succeed
MediaSource* source;
if (!MediaSource::lookupByName(env, sourceName, source)) return False;
if (!source->isFramedSource()) {
env.setResultMsg(sourceName, " is not a framed source");
return False;
}
resultSource = (FramedSource*)source;
return True;
}
void FramedSource::getNextFrame(unsigned char* to, unsigned maxSize,
afterGettingFunc* afterGettingFunc,
void* afterGettingClientData,
onCloseFunc* onCloseFunc,
void* onCloseClientData) {
// Make sure we're not already being read:
if (fIsCurrentlyAwaitingData) {
envir() << "FramedSource[" << this << "]::getNextFrame(): attempting to read more than once at the same time!\n";
envir().internalError();
}
fTo = to;
fMaxSize = maxSize;
fNumTruncatedBytes = 0; // by default; could be changed by doGetNextFrame()
fDurationInMicroseconds = 0; // by default; could be changed by doGetNextFrame()
fAfterGettingFunc = afterGettingFunc;
fAfterGettingClientData = afterGettingClientData;
fOnCloseFunc = onCloseFunc;
fOnCloseClientData = onCloseClientData;
fIsCurrentlyAwaitingData = True;
doGetNextFrame();
}
void FramedSource::afterGetting(FramedSource* source) {
source->fIsCurrentlyAwaitingData = False;
// indicates that we can be read again
// Note that this needs to be done here, in case the "fAfterFunc"
// called below tries to read another frame (which it usually will)
if (source->fAfterGettingFunc != NULL) {
(*(source->fAfterGettingFunc))(source->fAfterGettingClientData,
source->fFrameSize, source->fNumTruncatedBytes,
source->fPresentationTime,
source->fDurationInMicroseconds);
}
}
void FramedSource::handleClosure(void* clientData) {
FramedSource* source = (FramedSource*)clientData;
source->handleClosure();
}
void FramedSource::handleClosure() {
fIsCurrentlyAwaitingData = False; // because we got a close instead
if (fOnCloseFunc != NULL) {
(*fOnCloseFunc)(fOnCloseClientData);
}
}
void FramedSource::stopGettingFrames() {
fIsCurrentlyAwaitingData = False; // indicates that we can be read again
fAfterGettingFunc = NULL;
fOnCloseFunc = NULL;
// Perform any specialized action now:
doStopGettingFrames();
}
void FramedSource::doStopGettingFrames() {
// Default implementation: Do nothing except cancel any pending 'delivery' task:
envir().taskScheduler().unscheduleDelayedTask(nextTask());
// Subclasses may wish to redefine this function.
}
unsigned FramedSource::maxFrameSize() const {
// By default, this source has no maximum frame size.
return 0;
}
live/liveMedia/GSMAudioRTPSink.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// RTP sink for GSM audio
// Implementation
#include "GSMAudioRTPSink.hh"
GSMAudioRTPSink::GSMAudioRTPSink(UsageEnvironment& env, Groupsock* RTPgs)
: AudioRTPSink(env, RTPgs, 3, 8000, "GSM") {
}
GSMAudioRTPSink::~GSMAudioRTPSink() {
}
GSMAudioRTPSink*
GSMAudioRTPSink::createNew(UsageEnvironment& env, Groupsock* RTPgs) {
return new GSMAudioRTPSink(env, RTPgs);
}
Boolean GSMAudioRTPSink
::frameCanAppearAfterPacketStart(unsigned char const* /*frameStart*/,
unsigned /*numBytesInFrame*/) const {
// Allow at most 5 frames in a single packet:
return numFramesUsedSoFar() < 5;
}
live/liveMedia/H261VideoRTPSource.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// H.261 Video RTP Sources
// Implementation
#include "H261VideoRTPSource.hh"
H261VideoRTPSource*
H261VideoRTPSource::createNew(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency) {
return new H261VideoRTPSource(env, RTPgs, rtpPayloadFormat,
rtpTimestampFrequency);
}
H261VideoRTPSource
::H261VideoRTPSource(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency)
: MultiFramedRTPSource(env, RTPgs,
rtpPayloadFormat, rtpTimestampFrequency),
fLastSpecialHeader(0) {
}
H261VideoRTPSource::~H261VideoRTPSource() {
}
Boolean H261VideoRTPSource
::processSpecialHeader(BufferedPacket* packet,
unsigned& resultSpecialHeaderSize) {
// There's a 4-byte video-specific header
if (packet->dataSize() < 4) return False;
unsigned char* headerStart = packet->data();
fLastSpecialHeader
= (headerStart[0]<<24)|(headerStart[1]<<16)|(headerStart[2]<<8)|headerStart[3];
#ifdef DELIVER_COMPLETE_FRAMES
fCurrentPacketBeginsFrame = fCurrentPacketCompletesFrame;
// whether the *previous* packet ended a frame
// The RTP "M" (marker) bit indicates the last fragment of a frame:
fCurrentPacketCompletesFrame = packet->rtpMarkerBit();
#endif
resultSpecialHeaderSize = 4;
return True;
}
char const* H261VideoRTPSource::MIMEtype() const {
return "video/H261";
}
live/liveMedia/H263plusVideoFileServerMediaSubsession.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A 'ServerMediaSubsession' object that creates new, unicast, "RTPSink"s
// on demand, from a H263 video file.
// Implementation
// Author: Bernhard Feiten. (Based on MPEG4VideoFileServerMediaSubsession.)
// Updated by Ross Finlayson (December 2007)
#include "H263plusVideoFileServerMediaSubsession.hh"
#include "H263plusVideoRTPSink.hh"
#include "ByteStreamFileSource.hh"
#include "H263plusVideoStreamFramer.hh"
H263plusVideoFileServerMediaSubsession*
H263plusVideoFileServerMediaSubsession::createNew(UsageEnvironment& env,
char const* fileName,
Boolean reuseFirstSource) {
return new H263plusVideoFileServerMediaSubsession(env, fileName, reuseFirstSource);
}
H263plusVideoFileServerMediaSubsession
::H263plusVideoFileServerMediaSubsession(UsageEnvironment& env,
char const* fileName,
Boolean reuseFirstSource)
: FileServerMediaSubsession(env, fileName, reuseFirstSource) {
}
H263plusVideoFileServerMediaSubsession::~H263plusVideoFileServerMediaSubsession() {
}
FramedSource* H263plusVideoFileServerMediaSubsession
::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
estBitrate = 500; // kbps, estimate ??
// Create the video source:
ByteStreamFileSource* fileSource = ByteStreamFileSource::createNew(envir(), fFileName);
if (fileSource == NULL) return NULL;
fFileSize = fileSource->fileSize();
// Create a framer for the Video Elementary Stream:
return H263plusVideoStreamFramer::createNew(envir(), fileSource);
}
RTPSink* H263plusVideoFileServerMediaSubsession::createNewRTPSink(Groupsock* rtpGroupsock,
unsigned char rtpPayloadTypeIfDynamic,
FramedSource* /*inputSource*/) {
return H263plusVideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
}
live/liveMedia/H263plusVideoRTPSink.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// RTP sink for H.263+ video (RFC 4629)
// Implementation
#include "H263plusVideoRTPSink.hh"
H263plusVideoRTPSink
::H263plusVideoRTPSink(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
u_int32_t rtpTimestampFrequency)
: VideoRTPSink(env, RTPgs, rtpPayloadFormat, rtpTimestampFrequency, "H263-1998") {
}
H263plusVideoRTPSink::~H263plusVideoRTPSink() {
}
H263plusVideoRTPSink*
H263plusVideoRTPSink::createNew(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
u_int32_t rtpTimestampFrequency) {
return new H263plusVideoRTPSink(env, RTPgs, rtpPayloadFormat, rtpTimestampFrequency);
}
Boolean H263plusVideoRTPSink
::frameCanAppearAfterPacketStart(unsigned char const* /*frameStart*/,
unsigned /*numBytesInFrame*/) const {
// A packet can contain only one frame
return False;
}
void H263plusVideoRTPSink
::doSpecialFrameHandling(unsigned fragmentationOffset,
unsigned char* frameStart,
unsigned numBytesInFrame,
struct timeval framePresentationTime,
unsigned numRemainingBytes) {
if (fragmentationOffset == 0) {
// This packet contains the first (or only) fragment of the frame.
// Set the 'P' bit in the special header:
unsigned short specialHeader = 0x0400;
// Also, reuse the first two bytes of the payload for this special
// header. (They should both have been zero.)
if (numBytesInFrame < 2) {
envir() << "H263plusVideoRTPSink::doSpecialFrameHandling(): bad frame size "
<< numBytesInFrame << "\n";
return;
}
if (frameStart[0] != 0 || frameStart[1] != 0) {
envir() << "H263plusVideoRTPSink::doSpecialFrameHandling(): unexpected non-zero first two bytes!\n";
}
frameStart[0] = specialHeader>>8;
frameStart[1] = (unsigned char)specialHeader;
} else {
unsigned short specialHeader = 0;
setSpecialHeaderBytes((unsigned char*)&specialHeader, 2);
}
if (numRemainingBytes == 0) {
// This packet contains the last (or only) fragment of the frame.
// Set the RTP 'M' ('marker') bit:
setMarkerBit();
}
// Also set the RTP timestamp:
setTimestamp(framePresentationTime);
}
unsigned H263plusVideoRTPSink::specialHeaderSize() const {
// There's a 2-byte special video header. However, if we're the first
// (or only) fragment of a frame, then we reuse the first 2 bytes of
// the payload instead.
return (curFragmentationOffset() == 0) ? 0 : 2;
}
live/liveMedia/H263plusVideoRTPSource.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// H.263+ Video RTP Sources
// Implementation
#include "H263plusVideoRTPSource.hh"
H263plusVideoRTPSource*
H263plusVideoRTPSource::createNew(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency) {
return new H263plusVideoRTPSource(env, RTPgs, rtpPayloadFormat,
rtpTimestampFrequency);
}
H263plusVideoRTPSource
::H263plusVideoRTPSource(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency)
: MultiFramedRTPSource(env, RTPgs,
rtpPayloadFormat, rtpTimestampFrequency),
fNumSpecialHeaders(0), fSpecialHeaderBytesLength(0) {
}
H263plusVideoRTPSource::~H263plusVideoRTPSource() {
}
Boolean H263plusVideoRTPSource
::processSpecialHeader(BufferedPacket* packet,
unsigned& resultSpecialHeaderSize) {
unsigned char* headerStart = packet->data();
unsigned packetSize = packet->dataSize();
// The H.263+ payload header is at least 2 bytes in size.
// Extract the known fields from the first 2 bytes:
unsigned expectedHeaderSize = 2;
if (packetSize < expectedHeaderSize) return False;
//unsigned char RR = headerStart[0]>>3;
Boolean P = (headerStart[0]&0x4) != 0;
Boolean V = (headerStart[0]&0x2) != 0;
unsigned char PLEN = ((headerStart[0]&0x1)<<5)|(headerStart[1]>>3);
//unsigned char PEBIT = headerStart[1]&0x7;
if (V) {
// There's an extra VRC byte at the end of the header:
++expectedHeaderSize;
if (packetSize < expectedHeaderSize) return False;
}
if (PLEN > 0) {
// There's an extra picture header at the end:
expectedHeaderSize += PLEN;
if (packetSize < expectedHeaderSize) return False;
}
fCurrentPacketBeginsFrame = P;
if (fCurrentPacketBeginsFrame) {
fNumSpecialHeaders = fSpecialHeaderBytesLength = 0;
}
// Make a copy of the special header bytes, in case a reader
// can use them:
unsigned bytesAvailable
= SPECIAL_HEADER_BUFFER_SIZE - fSpecialHeaderBytesLength - 1;
if (expectedHeaderSize <= bytesAvailable) {
fSpecialHeaderBytes[fSpecialHeaderBytesLength++] = expectedHeaderSize;
for (unsigned i = 0; i < expectedHeaderSize; ++i) {
fSpecialHeaderBytes[fSpecialHeaderBytesLength++] = headerStart[i];
}
fPacketSizes[fNumSpecialHeaders++] = packetSize;
}
if (P) {
// Prepend two zero bytes to the start of the payload proper.
// Hack: Do this by shrinking this special header by 2 bytes:
expectedHeaderSize -= 2;
headerStart[expectedHeaderSize] = 0;
headerStart[expectedHeaderSize+1] = 0;
}
// The RTP "M" (marker) bit indicates the last fragment of a frame:
fCurrentPacketCompletesFrame = packet->rtpMarkerBit();
resultSpecialHeaderSize = expectedHeaderSize;
return True;
}
char const* H263plusVideoRTPSource::MIMEtype() const {
return "video/H263-1998";
}
live/liveMedia/H263plusVideoStreamFramer.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// Author: Bernhard Feiten
// A filter that breaks up an H.263plus video stream into frames.
//
#include "H263plusVideoStreamFramer.hh"
#include "H263plusVideoStreamParser.hh"
#include <string.h>
#include "GroupsockHelper.hh"
///////////////////////////////////////////////////////////////////////////////
////////// H263plusVideoStreamFramer implementation //////////
//public///////////////////////////////////////////////////////////////////////
H263plusVideoStreamFramer* H263plusVideoStreamFramer::createNew(
UsageEnvironment& env,
FramedSource* inputSource)
{
// Need to add source type checking here??? #####
H263plusVideoStreamFramer* fr;
fr = new H263plusVideoStreamFramer(env, inputSource);
return fr;
}
///////////////////////////////////////////////////////////////////////////////
H263plusVideoStreamFramer::H263plusVideoStreamFramer(
UsageEnvironment& env,
FramedSource* inputSource,
Boolean createParser)
: FramedFilter(env, inputSource),
fFrameRate(0.0), // until we learn otherwise
fPictureEndMarker(False)
{
// Use the current wallclock time as the base 'presentation time':
gettimeofday(&fPresentationTimeBase, NULL);
fParser = createParser ? new H263plusVideoStreamParser(this, inputSource) : NULL;
}
///////////////////////////////////////////////////////////////////////////////
H263plusVideoStreamFramer::~H263plusVideoStreamFramer()
{
delete fParser;
}
///////////////////////////////////////////////////////////////////////////////
void H263plusVideoStreamFramer::doGetNextFrame()
{
fParser->registerReadInterest(fTo, fMaxSize);
continueReadProcessing();
}
///////////////////////////////////////////////////////////////////////////////
Boolean H263plusVideoStreamFramer::isH263plusVideoStreamFramer() const
{
return True;
}
///////////////////////////////////////////////////////////////////////////////
void H263plusVideoStreamFramer::continueReadProcessing(
void* clientData,
unsigned char* /*ptr*/, unsigned /*size*/,
struct timeval /*presentationTime*/)
{
H263plusVideoStreamFramer* framer = (H263plusVideoStreamFramer*)clientData;
framer->continueReadProcessing();
}
///////////////////////////////////////////////////////////////////////////////
void H263plusVideoStreamFramer::continueReadProcessing()
{
unsigned acquiredFrameSize;
u_int64_t frameDuration; // in ms
acquiredFrameSize = fParser->parse(frameDuration);
// Calculate some average bitrate information (to be adapted)
// avgBitrate = (totalBytes * 8 * H263_TIMESCALE) / totalDuration;
if (acquiredFrameSize > 0) {
// We were able to acquire a frame from the input.
// It has already been copied to the reader's space.
fFrameSize = acquiredFrameSize;
// fNumTruncatedBytes = fParser->numTruncatedBytes(); // not needed so far
fFrameRate = frameDuration == 0 ? 0.0 : 1000./(long)frameDuration;
// Compute "fPresentationTime"
if (acquiredFrameSize == 5) // first frame
fPresentationTime = fPresentationTimeBase;
else
fPresentationTime.tv_usec += (long) frameDuration*1000;
while (fPresentationTime.tv_usec >= 1000000) {
fPresentationTime.tv_usec -= 1000000;
++fPresentationTime.tv_sec;
}
// Compute "fDurationInMicroseconds"
fDurationInMicroseconds = (unsigned int) frameDuration*1000;
// Call our own 'after getting' function. Because we're not a 'leaf'
// source, we can call this directly, without risking infinite recursion.
afterGetting(this);
} else {
// We were unable to parse a complete frame from the input, because:
// - we had to read more data from the source stream, or
// - the source stream has ended.
}
}
live/liveMedia/H263plusVideoStreamParser.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// Author: Bernhard Feiten
// A filter that breaks up an H.263plus video stream into frames.
// Based on MPEG4IP/mp4creator/h263.c
#include "H263plusVideoStreamParser.hh"
#include "H263plusVideoStreamFramer.hh"
//#include
//#include "GroupsockHelper.hh"
H263plusVideoStreamParser::H263plusVideoStreamParser(
H263plusVideoStreamFramer* usingSource,
FramedSource* inputSource)
: StreamParser(inputSource,
FramedSource::handleClosure,
usingSource,
&H263plusVideoStreamFramer::continueReadProcessing,
usingSource),
fUsingSource(usingSource),
fnextTR(0),
fcurrentPT(0)
{
memset(fStates, 0, sizeof(fStates));
memset(&fNextInfo, 0, sizeof(fNextInfo));
memset(&fCurrentInfo, 0, sizeof(fCurrentInfo));
memset(&fMaxBitrateCtx, 0, sizeof(fMaxBitrateCtx));
memset(fNextHeader,0, H263_REQUIRE_HEADER_SIZE_BYTES);
}
///////////////////////////////////////////////////////////////////////////////
H263plusVideoStreamParser::~H263plusVideoStreamParser()
{
}
///////////////////////////////////////////////////////////////////////////////
void H263plusVideoStreamParser::restoreSavedParserState()
{
StreamParser::restoreSavedParserState();
fTo = fSavedTo;
fNumTruncatedBytes = fSavedNumTruncatedBytes;
}
///////////////////////////////////////////////////////////////////////////////
void H263plusVideoStreamParser::setParseState()
{
fSavedTo = fTo;
fSavedNumTruncatedBytes = fNumTruncatedBytes;
saveParserState(); // Needed for the parsing process in StreamParser
}
///////////////////////////////////////////////////////////////////////////////
void H263plusVideoStreamParser::registerReadInterest(
unsigned char* to,
unsigned maxSize)
{
fStartOfFrame = fTo = fSavedTo = to;
fLimit = to + maxSize;
fMaxSize = maxSize;
fNumTruncatedBytes = fSavedNumTruncatedBytes = 0;
}
///////////////////////////////////////////////////////////////////////////////
// parse() , derived from H263Creator of MPEG4IP, h263.c
unsigned H263plusVideoStreamParser::parse(u_int64_t & currentDuration)
{
// u_int8_t frameBuffer[H263_BUFFER_SIZE]; // The input buffer
// Pointer which tells LoadNextH263Object where to read data to
// u_int8_t* pFrameBuffer = fTo + H263_REQUIRE_HEADER_SIZE_BYTES;
u_int32_t frameSize; // The current frame size
// Pointer to receive address of the header data
// u_int8_t* pCurrentHeader;// = pFrameBuffer;
// u_int64_t currentDuration; // The current frame's duration
u_int8_t trDifference; // The current TR difference
// The previous TR difference
// u_int8_t prevTrDifference = H263_BASIC_FRAME_RATE;
// u_int64_t totalDuration = 0;// Duration accumulator
// u_int64_t avgBitrate; // Average bitrate
// u_int64_t totalBytes = 0; // Size accumulator
try // The get data routines of the class FramedFilter return an error when
{ // the buffer is empty. This occurs at the beginning and at the end of the file.
fCurrentInfo = fNextInfo;
// Parse 1 frame
// For the first time, only the first frame's header is returned.
// The second time the full first frame is returned
frameSize = parseH263Frame();
currentDuration = 0;
if (frameSize > 0) {
// We were able to acquire a frame from the input.
// Parse the returned frame header (if any)
if (!ParseShortHeader(fTo, &fNextInfo)) {
#ifdef DEBUG
fprintf(stderr,"H263plusVideoStreamParser: Fatal error\n");
#endif
}
trDifference = GetTRDifference(fNextInfo.tr, fCurrentInfo.tr);
// calculate the current frame duration
currentDuration = CalculateDuration(trDifference);
// Accumulate the frame's size and duration for avgBitrate calculation
//totalDuration += currentDuration;
//totalBytes += frameSize;
// If needed, recalculate bitrate information
// if (h263Bitrates)
//GetMaxBitrate(&fMaxBitrateCtx, frameSize, prevTrDifference);
//prevTrDifference = trDifference;
setParseState(); // Needed for the parsing process in StreamParser
}
} catch (int /*e*/) {
#ifdef DEBUG
fprintf(stderr, "H263plusVideoStreamParser::parse() EXCEPTION (This is normal behavior - *not* an error)\n");
#endif
frameSize=0;
}
return frameSize;
}
///////////////////////////////////////////////////////////////////////////////
// parseH263Frame derived from LoadNextH263Object of MPEG4IP
// - service routine that reads a single frame from the input file.
// It shall fill the input buffer with data up until - and including - the
// next start code and shall report back both the number of bytes read and a
// pointer to the next start code. The first call to this function shall only
// yield a pointer with 0 data bytes and the last call to this function shall
// only yield data bytes with a NULL pointer as the next header.
//
// TODO: This function only supports valid bit streams. Upon error, it fails
// without the possibility to recover. A Better idea would be to skip frames
// until a parsable frame is read from the file.
//
// Unlike the MPEG4IP original, this version takes no parameters: the frame
// data is written to the buffer previously registered via
// registerReadInterest(), and the next frame's header is saved in the
// fNextHeader member for the following call.
// Returns the total number of bytes read.
// Uses the FramedSource instantiated by the constructor.
///////////////////////////////////////////////////////////////////////////////
int H263plusVideoStreamParser::parseH263Frame( )
{
char row = 0;
u_int8_t * bufferIndex = fTo;
// The buffer end, which leaves room in the buffer for
// the additionalBytesNeeded
u_int8_t * bufferEnd = fTo + fMaxSize - ADDITIONAL_BYTES_NEEDED - 1;
memcpy(fTo, fNextHeader, H263_REQUIRE_HEADER_SIZE_BYTES);
bufferIndex += H263_REQUIRE_HEADER_SIZE_BYTES;
// The state table and the following loop implement a state machine enabling
// us to read bytes from the file until (and including) the requested
// start code (00 00 8X) is found
// Initialize the states array, if it hasn't been initialized yet...
if (!fStates[0][0]) {
// One 00 was read
fStates[0][0] = 1;
// Two sequential 0x00 were read
fStates[1][0] = fStates[2][0] = 2;
// A full start code was read
fStates[2][128] = fStates[2][129] = fStates[2][130] = fStates[2][131] = -1;
}
// Read data from file into the output buffer until either a start code
// is found, or the end of file has been reached.
do {
*bufferIndex = get1Byte();
} while ((bufferIndex < bufferEnd) && // We have place in the buffer
((row = fStates[(unsigned char)row][*(bufferIndex++)]) != -1)); // Start code was not found
if (row != -1) {
fprintf(stderr, "%s: Buffer too small (%u)\n",
"h263reader", (unsigned)(bufferEnd - fTo + ADDITIONAL_BYTES_NEEDED));
return 0;
}
// Cool ... now we have a start code
// Now we just have to read the additionalBytesNeeded
getBytes(bufferIndex, ADDITIONAL_BYTES_NEEDED);
memcpy(fNextHeader, bufferIndex - H263_STARTCODE_SIZE_BYTES, H263_REQUIRE_HEADER_SIZE_BYTES);
int sz = bufferIndex - fTo - H263_STARTCODE_SIZE_BYTES;
if (sz == 5) // first frame
memcpy(fTo, fTo+H263_REQUIRE_HEADER_SIZE_BYTES, H263_REQUIRE_HEADER_SIZE_BYTES);
return sz;
}
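The three-row transition table above is easy to exercise on its own. The following standalone sketch (a hypothetical illustration, not part of liveMedia) reimplements the same 00 00 8X start-code scan over a plain byte buffer:

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>

// Same transition table as parseH263Frame(): row 0 = nothing matched,
// row 1 = one 0x00 seen, row 2 = two (or more) sequential 0x00 seen;
// -1 marks a complete picture start code (00 00 8X, X in 0..3).
int findH263StartCode(const uint8_t* buf, int len) {
  int states[3][256];
  memset(states, 0, sizeof(states));
  states[0][0] = 1;
  states[1][0] = states[2][0] = 2;
  states[2][128] = states[2][129] = states[2][130] = states[2][131] = -1;
  int row = 0;
  for (int i = 0; i < len; ++i) {
    row = states[row][buf[i]];
    if (row == -1) return i + 1;  // index just past the start code
  }
  return -1;  // no start code in this buffer
}
```

Note how a run of more than two zero bytes keeps the machine in row 2, so `00 00 00 8X` is still recognized.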
////////////////////////////////////////////////////////////////////////////////
// ParseShortHeader - service routine that accepts a buffer containing a frame
// header and extracts relevant codec information from it.
//
// NOTE: the first bit in the following comments is 0 (zero).
//
// 0 1 2 3
// 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1
// +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
// | PSC (Picture Start Code=22 bits) | (TR=8 bits) | >
// |0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0| |1 0>
// +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
// < (PTYPE=13 bits) |
// <. . .|(FMT)|Z|. . . .|
// +-+-+-+-+-+-+-+-+-+-+-+
// -> PTYPE.FMT contains a width/height identification
// -> PTYPE.Z is 1 for P-Frames, 0 for I-Frames
// Note: When FMT is 111, there is an extended PTYPE...
//
// Inputs:
// headerBuffer - pointer to the current header buffer
// outputInfoStruct - pointer to the structure receiving the data
// Outputs:
// This function returns a structure of important codec-specific
// information (The Temporal Reference bits, width & height of the current
// frame and the sync - or "frame type" - bit. It reports success or
// failure to the calling function.
////////////////////////////////////////////////////////////////////////////////
bool H263plusVideoStreamParser::ParseShortHeader(
u_int8_t *headerBuffer,
H263INFO *outputInfoStruct)
{
u_int8_t fmt = 0;
// Extract temporal reference (TR) from the buffer (bits 22-29 inclusive)
outputInfoStruct->tr = (headerBuffer[2] << 6) & 0xC0; // 2 LS bits out of the 3rd byte
outputInfoStruct->tr |= (headerBuffer[3] >> 2) & 0x3F; // 6 MS bits out of the 4th byte
// Extract the FMT part of PTYPE from the buffer (bits 35-37 inclusive)
fmt = (headerBuffer[4] >> 2) & 0x07; // bits 3-5 out of the 5th byte
// If PTYPE is not supported, return a failure notice to the calling function
// FIXME: PLUSPTYPE is not supported
if (fmt == 0x07) {
return false;
}
// If PTYPE is supported, calculate the current width and height according to
// a predefined table
if (!GetWidthAndHeight(fmt, &(outputInfoStruct->width),
&(outputInfoStruct->height))) {
return false;
}
// Extract the frame-type bit, which is the 9th bit of PTYPE (bit 38)
outputInfoStruct->isSyncFrame = !(headerBuffer[4] & 0x02);
return true;
}
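The shift/mask arithmetic above can be verified against a hand-packed header. A small standalone check (the helper names and the test header bytes are illustrative, not from the library):

```cpp
#include <cassert>
#include <cstdint>

// Mirrors the extraction in ParseShortHeader(): TR occupies bits 22-29,
// FMT bits 35-37, and the I/P flag is bit 38 of the 40-bit short header.
uint8_t shortHeaderTR(const uint8_t* h) {
  return ((h[2] << 6) & 0xC0) | ((h[3] >> 2) & 0x3F);
}
uint8_t shortHeaderFMT(const uint8_t* h) {
  return (h[4] >> 2) & 0x07;
}
bool shortHeaderIsIFrame(const uint8_t* h) {
  return !(h[4] & 0x02);  // Z bit: 0 = I-frame, 1 = P-frame
}
```

For example, packing PSC, TR=0xA5, FMT=011 (CIF), Z=0 by hand yields the bytes `00 00 82 96 0C`, and the three helpers recover exactly those field values.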
////////////////////////////////////////////////////////////////////////////////
// GetMaxBitrate- service routine that accepts frame information and
// derives bitrate information from it. This function uses a sliding window
// technique to calculate the maximum bitrates in any window of 1 second
// inside the file.
// The sliding window is implemented with a table of bitrates for the last
// second (30 entries - one entry per TR unit).
//
// Inputs:
// ctx - context for this function
// frameSize - the size of the current frame in bytes
// frameTRDiff - the "duration" of the frame in TR units
// Outputs:
// This function returns the up-to-date maximum bitrate
////////////////////////////////////////////////////////////////////////////////
void H263plusVideoStreamParser::GetMaxBitrate( MaxBitrate_CTX *ctx,
u_int32_t frameSize,
u_int8_t frameTRDiff)
{
if (frameTRDiff == 0)
return;
// Calculate the current frame's bitrate as bits per TR unit (round the result
// upwards)
u_int32_t frameBitrate = frameSize * 8 / frameTRDiff + 1;
// for each TRdiff received,
while (frameTRDiff--) {
// Subtract the oldest bitrate entry from the current bitrate
ctx->windowBitrate -= ctx->bitrateTable[ctx->tableIndex];
// Update the oldest bitrate entry with the current frame's bitrate
ctx->bitrateTable[ctx->tableIndex] = frameBitrate;
// Add the current frame's bitrate to the current bitrate
ctx->windowBitrate += frameBitrate;
// Check if we have a new maximum bitrate
if (ctx->windowBitrate > ctx->maxBitrate) {
ctx->maxBitrate = ctx->windowBitrate;
}
// Advance the table index
// Wrapping around the bitrateTable size
ctx->tableIndex = (ctx->tableIndex + 1) %
( sizeof(ctx->bitrateTable) / sizeof(ctx->bitrateTable[0]) );
}
}
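The sliding-window logic above can be sketched as a tiny standalone struct (names are illustrative, not from the library): one slot per TR unit, 30 slots covering one second at the basic 30 fps rate.

```cpp
#include <cassert>
#include <cstdint>

struct BitrateWindow {
  uint32_t table[30] = {};  // one bitrate entry per TR unit (1 second total)
  uint32_t windowBitrate = 0, maxBitrate = 0, index = 0;

  void addFrame(uint32_t frameSizeBytes, uint8_t trDiff) {
    if (trDiff == 0) return;
    // Bits per TR unit, rounded upwards - same formula as GetMaxBitrate()
    uint32_t frameBitrate = frameSizeBytes * 8 / trDiff + 1;
    while (trDiff--) {
      windowBitrate -= table[index];  // drop the oldest slot
      table[index] = frameBitrate;    // overwrite it with this frame's rate
      windowBitrate += frameBitrate;
      if (windowBitrate > maxBitrate) maxBitrate = windowBitrate;
      index = (index + 1) % 30;       // wrap around the 1-second window
    }
  }
};
```

A frame that spans several TR units fills several slots, so slow frames dilute the window instead of inflating it.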
////////////////////////////////////////////////////////////////////////////////
// CalculateDuration - service routine that calculates the current frame's
// duration in milli-seconds using its duration in TR units.
// - In order not to accumulate the calculation error, we are using the TR
// duration to calculate the current and the next frame's presentation time in
// milli-seconds.
//
// Inputs: trDiff - The current frame's duration in TR units
// Return: The current frame's duration in milli-seconds
////////////////////////////////////////////////////////////////////////////////
u_int64_t H263plusVideoStreamParser::CalculateDuration(u_int8_t trDiff)
{
u_int64_t nextPT; // The next frame's presentation time in milli-seconds
u_int64_t duration; // The current frame's duration in milli-seconds
fnextTR += trDiff;
// Calculate the next frame's presentation time, in milli-seconds
nextPT = (fnextTR * 1001) / H263_BASIC_FRAME_RATE;
// The frame's duration is the difference between the next presentation
// time and the current presentation time.
duration = nextPT - fcurrentPT;
// "Remember" the next presentation time for the next time this function is called
fcurrentPT = nextPT;
return duration;
}
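Because each presentation time is computed from the running TR total rather than by summing per-frame durations, the rounding error of the 1001/30 conversion never accumulates. A standalone sketch of the same arithmetic (hypothetical names):

```cpp
#include <cassert>
#include <cstdint>

// Same arithmetic as CalculateDuration(): PT(ms) = TR_total * 1001 / 30,
// where 30 is H263_BASIC_FRAME_RATE.
struct DurationClock {
  uint32_t nextTR = 0;     // running presentation time, in TR units
  uint64_t currentPT = 0;  // current presentation time, in milliseconds

  uint64_t advance(uint8_t trDiff) {  // returns the frame duration in ms
    nextTR += trDiff;
    uint64_t nextPT = (uint64_t)nextTR * 1001 / 30;
    uint64_t duration = nextPT - currentPT;
    currentPT = nextPT;
    return duration;
  }
};
```

Three one-TR frames yield durations 33, 33, 34 ms: the third frame absorbs the rounding remainder, so the total stays exactly `3 * 1001 / 30` ms.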
////////////////////////////////////////////////////////////////////////////////
bool H263plusVideoStreamParser::GetWidthAndHeight( u_int8_t fmt,
u_int16_t *width,
u_int16_t *height)
{
// The 'fmt' corresponds to bits 5-7 of the PTYPE
static struct {
u_int16_t width;
u_int16_t height;
} const dimensionsTable[8] = {
{ 0, 0 }, // 000 - 0 - forbidden, generates an error
{ 128, 96 }, // 001 - 1 - Sub QCIF
{ 176, 144 }, // 010 - 2 - QCIF
{ 352, 288 }, // 011 - 3 - CIF
{ 704, 576 }, // 100 - 4 - 4CIF
{ 1408, 1152 }, // 101 - 5 - 16CIF
{ 0, 0 }, // 110 - 6 - reserved, generates an error
{ 0, 0 } // 111 - 7 - extended, not supported by profile 0
};
if (fmt > 7)
return false;
*width = dimensionsTable[fmt].width;
*height = dimensionsTable[fmt].height;
if (*width == 0)
return false;
return true;
}
////////////////////////////////////////////////////////////////////////////////
u_int8_t H263plusVideoStreamParser::GetTRDifference(
u_int8_t nextTR,
u_int8_t currentTR)
{
if (currentTR > nextTR) {
// Wrap around 255...
return nextTR + (256 - currentTR);
} else {
return nextTR - currentTR;
}
}
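Because TR is an 8-bit counter, the difference must be taken modulo 256. A quick standalone check of the wrap-around branch (the function name is illustrative):

```cpp
#include <cassert>
#include <cstdint>

// Mirrors GetTRDifference(): modulo-256 distance from currentTR to nextTR.
uint8_t trDifference(uint8_t nextTR, uint8_t currentTR) {
  if (currentTR > nextTR) {
    return nextTR + (256 - currentTR);  // counter wrapped past 255
  }
  return nextTR - currentTR;
}
```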
////////////////////////////////////////////////////////////////////////////////
////////////////////////////////////////////////////////////////////////////////
////////////////////////////////////////////////////////////////////////////////
// this is the h263.c file of MPEG4IP mp4creator
/*
#include "mp4creator.h"
// Default timescale for H.263 (1000ms)
#define H263_TIMESCALE 1000
// Default H263 frame rate (30fps)
#define H263_BASIC_FRAME_RATE 30
// Minimum number of bytes needed to parse an H263 header
#define H263_REQUIRE_HEADER_SIZE_BYTES 5
// Number of bytes the start code requires
#define H263_STARTCODE_SIZE_BYTES 3
// This is the input buffer's size. It should contain
// 1 frame with the following start code
#define H263_BUFFER_SIZE 256 * 1024
// The default max difference (in %) between max and average bitrates
#define H263_DEFAULT_CBR_TOLERANCE 10
// The following structure holds information extracted from each frame's header:
typedef struct _H263INFO {
u_int8_t tr; // Temporal Reference, used in duration calculation
u_int16_t width; // Width of the picture
u_int16_t height; // Height of the picture
bool isSyncFrame; // Frame type (true = I frame = "sync" frame)
} H263INFO;
// Context for the GetMaxBitrate function
typedef struct _MaxBitrate_CTX {
u_int32_t bitrateTable[H263_BASIC_FRAME_RATE];// Window of 1 second
u_int32_t windowBitrate; // The bitrate of the current window
u_int32_t maxBitrate; // The up-to-date maximum bitrate
u_int32_t tableIndex; // The next TR unit to update
} MaxBitrate_CTX;
// Forward declarations:
static int LoadNextH263Object( FILE *inputFileHandle,
u_int8_t *frameBuffer,
u_int32_t *frameBufferSize,
u_int32_t additionalBytesNeeded,
u_int8_t **ppNextHeader);
static bool ParseShortHeader( u_int8_t *headerBuffer,
H263INFO *outputInfoStruct);
static u_int8_t GetTRDifference(u_int8_t nextTR,
u_int8_t currentTR);
static void GetMaxBitrate( MaxBitrate_CTX *ctx,
u_int32_t frameSize,
u_int8_t frameTRDiff);
static MP4Duration CalculateDuration(u_int8_t trDiff);
static bool GetWidthAndHeight( u_int8_t fmt,
u_int16_t *width,
u_int16_t *height);
static char states[3][256];
/ *
* H263Creator - Main function
* Inputs:
* outputFileHandle - The handle of the output file
* inputFileHandle - The handle of the input file
* Codec-specific parameters:
* H263Level - H.263 Level used for this track
* H263Profile - H.263 Profile used for this track
* H263Bitrates - A Parameter indicating whether the function
* should calculate H263 bitrates or not.
* cbrTolerance - CBR tolerance indicates when to set the
* average bitrate.
* Outputs:
* This function returns either the track ID of the newly added track upon
* success or a predefined value representing an erroneous state.
* /
MP4TrackId H263Creator(MP4FileHandle outputFileHandle,
FILE* inputFileHandle,
u_int8_t h263Profile,
u_int8_t h263Level,
bool h263Bitrates,
u_int8_t cbrTolerance)
{
H263INFO nextInfo; // Holds information about the next frame
H263INFO currentInfo;// Holds information about the current frame
MaxBitrate_CTX maxBitrateCtx;// Context for the GetMaxBitrate function
memset(&nextInfo, 0, sizeof(nextInfo));
memset(&currentInfo, 0, sizeof(currentInfo));
memset(&maxBitrateCtx, 0, sizeof(maxBitrateCtx));
memset(states, 0, sizeof(states));
u_int8_t frameBuffer[H263_BUFFER_SIZE]; // The input buffer
// Pointer which tells LoadNextH263Object where to read data to
u_int8_t* pFrameBuffer = frameBuffer + H263_REQUIRE_HEADER_SIZE_BYTES;
u_int32_t frameSize; // The current frame size
// Pointer to receive address of the header data
u_int8_t* pCurrentHeader = pFrameBuffer;
MP4Duration currentDuration; // The current frame's duration
u_int8_t trDifference; // The current TR difference
// The previous TR difference
u_int8_t prevTrDifference = H263_BASIC_FRAME_RATE;
MP4Duration totalDuration = 0;// Duration accumulator
MP4Duration avgBitrate; // Average bitrate
u_int64_t totalBytes = 0; // Size accumulator
MP4TrackId trackId = MP4_INVALID_TRACK_ID; // Our MP4 track
bool stay = true; // loop flag
while (stay) {
currentInfo = nextInfo;
memmove(frameBuffer, pCurrentHeader, H263_REQUIRE_HEADER_SIZE_BYTES);
frameSize = H263_BUFFER_SIZE - H263_REQUIRE_HEADER_SIZE_BYTES;
// Read 1 frame and the next frame's header from the file.
// For the first frame, only the first frame's header is returned.
// For the last frame, only the last frame's data is returned.
if (! LoadNextH263Object(inputFileHandle, pFrameBuffer, &frameSize,
H263_REQUIRE_HEADER_SIZE_BYTES - H263_STARTCODE_SIZE_BYTES,
&pCurrentHeader))
break; // Fatal error ...
if (pCurrentHeader) {
// Parse the returned frame header (if any)
if (!ParseShortHeader(pCurrentHeader, &nextInfo))
break; // Fatal error
trDifference = GetTRDifference(nextInfo.tr, currentInfo.tr);
} else {
// This is the last frame ... we have to fake the trDifference ...
trDifference = 1;
// No header data has been read at this iteration, so we have to manually
// add the frame's header we read at the previous iteration.
// Note that LoadNextH263Object returns the number of bytes read, which
// are the current frame's data and the next frame's header
frameSize += H263_REQUIRE_HEADER_SIZE_BYTES;
// There is no need for the next iteration ...
stay = false;
}
// If this is the first iteration ...
if (currentInfo.width == 0) {
// If we have more data than just the header
if ((frameSize > H263_REQUIRE_HEADER_SIZE_BYTES) ||
!pCurrentHeader) // Or no header at all
break; // Fatal error
else
continue; // We have only the first frame's header ...
}
if (trackId == MP4_INVALID_TRACK_ID) {
// If a track has not been added yet, add the track to the file.
trackId = MP4AddH263VideoTrack(outputFileHandle, H263_TIMESCALE,
0, currentInfo.width, currentInfo.height,
h263Level, h263Profile, 0, 0);
if (trackId == MP4_INVALID_TRACK_ID)
break; // Fatal error
}
// calculate the current frame duration
currentDuration = CalculateDuration(trDifference);
// Write the current frame to the file.
if (!MP4WriteSample(outputFileHandle, trackId, frameBuffer, frameSize,
currentDuration, 0, currentInfo.isSyncFrame))
break; // Fatal error
// Accumulate the frame's size and duration for avgBitrate calculation
totalDuration += currentDuration;
totalBytes += frameSize;
// If needed, recalculate bitrate information
if (h263Bitrates)
GetMaxBitrate(&maxBitrateCtx, frameSize, prevTrDifference);
prevTrDifference = trDifference;
} // while (stay)
// If this is the last frame,
if (!stay) {
// If needed and possible, update bitrate information in the file
if (h263Bitrates && totalDuration) {
avgBitrate = (totalBytes * 8 * H263_TIMESCALE) / totalDuration;
if (cbrTolerance == 0)
cbrTolerance = H263_DEFAULT_CBR_TOLERANCE;
// Same as: if (maxBitrate / avgBitrate > (cbrTolerance + 100) / 100.0)
if (maxBitrateCtx.maxBitrate * 100 > (cbrTolerance + 100) * avgBitrate)
avgBitrate = 0;
MP4SetH263Bitrates(outputFileHandle, trackId,
avgBitrate, maxBitrateCtx.maxBitrate);
}
// Return the newly added track ID
return trackId;
}
// If we got to here... something went wrong ...
fprintf(stderr,
"%s: Could not parse input file, invalid video stream?\n", ProgName);
// Upon failure, delete the newly added track if it has been added
if (trackId != MP4_INVALID_TRACK_ID) {
MP4DeleteTrack(outputFileHandle, trackId);
}
return MP4_INVALID_TRACK_ID;
}
/ *
* LoadNextH263Object - service routine that reads a single frame from the input
* file. It shall fill the input buffer with data up until - and including - the
* next start code and shall report back both the number of bytes read and a
* pointer to the next start code. The first call to this function shall only
* yield a pointer with 0 data bytes and the last call to this function shall
* only yield data bytes with a NULL pointer as the next header.
*
* TODO: This function only supports valid bit streams. Upon error, it fails
* without the possibility to recover. A Better idea would be to skip frames
* until a parsable frame is read from the file.
*
* Parameters:
* inputFileHandle - The handle of the input file
* frameBuffer - buffer where to place read data
* frameBufferSize - in/out parameter indicating the size of the buffer on
* entry and the number of bytes copied to the buffer upon
* return
* additionalBytesNeeded - indicates how many additional bytes are to be read
* from the next frame's header (over the 3 bytes that
* are already read).
* NOTE: This number MUST be > 0
* ppNextHeader - output parameter that upon return points to the location
* of the next frame's head in the buffer
* Outputs:
* This function returns two pieces of information:
* 1. The total number of bytes read.
* 2. A Pointer to the header of the next frame. This pointer shall be NULL
* for the last frame read.
* /
static int LoadNextH263Object( FILE *inputFileHandle,
u_int8_t *frameBuffer,
u_int32_t *frameBufferSize,
u_int32_t additionalBytesNeeded,
u_int8_t **ppNextHeader)
{
// This table and the following loop implement a state machine enabling
// us to read bytes from the file until (and including) the requested
// start code (00 00 8X) is found
char row = 0;
u_int8_t *bufferStart = frameBuffer;
// The buffer end which will allow the loop to leave place for
// the additionalBytesNeeded
u_int8_t *bufferEnd = frameBuffer + *frameBufferSize -
additionalBytesNeeded - 1;
// Initialize the states array, if it hasn't been initialized yet...
if (!states[0][0]) {
// One 00 was read
states[0][0] = 1;
// Two sequential 0x00 were read
states[1][0] = states[2][0] = 2;
// A full start code was read
states[2][128] = states[2][129] = states[2][130] = states[2][131] = -1;
}
// Read data from file into the output buffer until either a start code
// is found, or the end of file has been reached.
do {
if (fread(frameBuffer, 1, 1, inputFileHandle) != 1){
// EOF or other error before we got a start code
*ppNextHeader = NULL;
*frameBufferSize = frameBuffer - bufferStart;
return 1;
}
} while ((frameBuffer < bufferEnd) && // We have place in the buffer
((row = states[row][*(frameBuffer++)]) != -1)); // Start code was not found
if (row != -1) {
fprintf(stderr, "%s: Buffer too small (%u)\n",
ProgName, bufferEnd - bufferStart + additionalBytesNeeded);
return 0;
}
// Cool ... now we have a start code
*ppNextHeader = frameBuffer - H263_STARTCODE_SIZE_BYTES;
*frameBufferSize = frameBuffer - bufferStart + additionalBytesNeeded;
// Now we just have to read the additionalBytesNeeded
if(fread(frameBuffer, additionalBytesNeeded, 1, inputFileHandle) != 1) {
/// We got a start code but can't read additionalBytesNeeded ... that's a fatal error
fprintf(stderr, "%s: Invalid H263 bitstream\n", ProgName);
return 0;
}
return 1;
}
/ *
* ParseShortHeader - service routine that accepts a buffer containing a frame
* header and extracts relevant codec information from it.
*
* NOTE: the first bit in the following comments is 0 (zero).
*
*
* 0 1 2 3
* 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1
* +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
* | PSC (Picture Start Code=22 bits) | (TR=8 bits) | >
* |0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0| |1 0>
* +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
* < (PTYPE=13 bits) |
* <. . .|(FMT)|Z|. . . .|
* +-+-+-+-+-+-+-+-+-+-+-+
* -> PTYPE.FMT contains a width/height identification
* -> PTYPE.Z is 1 for P-Frames, 0 for I-Frames
* Note: When FMT is 111, there is an extended PTYPE...
*
* Inputs:
* headerBuffer - pointer to the current header buffer
* outputInfoStruct - pointer to the structure receiving the data
* Outputs:
* This function returns a structure of important codec-specific
* information (The Temporal Reference bits, width & height of the current
* frame and the sync - or "frame type" - bit. It reports success or
* failure to the calling function.
* /
static bool ParseShortHeader( u_int8_t *headerBuffer,
H263INFO *outputInfoStruct)
{
u_int8_t fmt = 0;
// Extract temporal reference (TR) from the buffer (bits 22-29 inclusive)
outputInfoStruct->tr = (headerBuffer[2] << 6) & 0xC0; // 2 LS bits out of the 3rd byte
outputInfoStruct->tr |= (headerBuffer[3] >> 2) & 0x3F; // 6 MS bits out of the 4th byte
// Extract the FMT part of PTYPE from the buffer (bits 35-37 inclusive)
fmt = (headerBuffer[4] >> 2) & 0x07; // bits 3-5 out of the 5th byte
// If PTYPE is not supported, return a failure notice to the calling function
// FIXME: PLUSPTYPE is not supported
if (fmt == 0x07) {
return false;
}
// If PTYPE is supported, calculate the current width and height according to
// a predefined table
if (!GetWidthAndHeight(fmt, &(outputInfoStruct->width),
&(outputInfoStruct->height))) {
return false;
}
// Extract the frame-type bit, which is the 9th bit of PTYPE (bit 38)
outputInfoStruct->isSyncFrame = !(headerBuffer[4] & 0x02);
return true;
}
/ *
* GetMaxBitrate- service routine that accepts frame information and
* derives bitrate information from it. This function uses a sliding window
* technique to calculate the maximum bitrates in any window of 1 second
* inside the file.
* The sliding window is implemented with a table of bitrates for the last
* second (30 entries - one entry per TR unit).
*
* Inputs:
* ctx - context for this function
* frameSize - the size of the current frame in bytes
* frameTRDiff - the "duration" of the frame in TR units
* Outputs:
* This function returns the up-to-date maximum bitrate
* /
static void GetMaxBitrate( MaxBitrate_CTX *ctx,
u_int32_t frameSize,
u_int8_t frameTRDiff)
{
if (frameTRDiff == 0)
return;
// Calculate the current frame's bitrate as bits per TR unit (round the result
// upwards)
u_int32_t frameBitrate = frameSize * 8 / frameTRDiff + 1;
// for each TRdiff received,
while (frameTRDiff--) {
// Subtract the oldest bitrate entry from the current bitrate
ctx->windowBitrate -= ctx->bitrateTable[ctx->tableIndex];
// Update the oldest bitrate entry with the current frame's bitrate
ctx->bitrateTable[ctx->tableIndex] = frameBitrate;
// Add the current frame's bitrate to the current bitrate
ctx->windowBitrate += frameBitrate;
// Check if we have a new maximum bitrate
if (ctx->windowBitrate > ctx->maxBitrate) {
ctx->maxBitrate = ctx->windowBitrate;
}
// Advance the table index
ctx->tableIndex = (ctx->tableIndex + 1) %
// Wrapping around the bitrateTable size
( sizeof(ctx->bitrateTable) / sizeof(ctx->bitrateTable[0]) );
}
}
/ *
* CalculateDuration - service routine that calculates the current frame's
* duration in milli-seconds using its duration in TR units.
* - In order not to accumulate the calculation error, we are using the TR
* duration to calculate the current and the next frame's presentation time in
* milli-seconds.
*
* Inputs:
* trDiff - The current frame's duration in TR units
* Outputs:
* The current frame's duration in milli-seconds
* /
static MP4Duration CalculateDuration(u_int8_t trDiff)
{
static u_int32_t nextTR = 0; // The next frame's presentation time in TR units
static MP4Duration currentPT = 0; // The current frame's presentation time in milli-seconds
MP4Duration nextPT; // The next frame's presentation time in milli-seconds
MP4Duration duration; // The current frame's duration in milli-seconds
nextTR += trDiff;
// Calculate the next frame's presentation time, in milli-seconds
nextPT = (nextTR * 1001) / H263_BASIC_FRAME_RATE;
// The frame's duration is the difference between the next presentation
// time and the current presentation time.
duration = nextPT - currentPT;
// "Remember" the next presentation time for the next time this function is
// called
currentPT = nextPT;
return duration;
}
static bool GetWidthAndHeight( u_int8_t fmt,
u_int16_t *width,
u_int16_t *height)
{
// The 'fmt' corresponds to bits 5-7 of the PTYPE
static struct {
u_int16_t width;
u_int16_t height;
} const dimensionsTable[8] = {
{ 0, 0 }, // 000 - 0 - forbidden, generates an error
{ 128, 96 }, // 001 - 1 - Sub QCIF
{ 176, 144 }, // 010 - 2 - QCIF
{ 352, 288 }, // 011 - 3 - CIF
{ 704, 576 }, // 100 - 4 - 4CIF
{ 1408, 1152 }, // 101 - 5 - 16CIF
{ 0, 0 }, // 110 - 6 - reserved, generates an error
{ 0, 0 } // 111 - 7 - extended, not supported by profile 0
};
if (fmt > 7)
return false;
*width = dimensionsTable[fmt].width;
*height = dimensionsTable[fmt].height;
if (*width == 0)
return false;
return true;
}
static u_int8_t GetTRDifference(u_int8_t nextTR,
u_int8_t currentTR)
{
if (currentTR > nextTR) {
// Wrap around 255...
return nextTR + (256 - currentTR);
} else {
return nextTR - currentTR;
}
}
*/
live/liveMedia/H263plusVideoStreamParser.hh 000444 001752 001752 00000011071 12656261123 020537 0 ustar 00rsf rsf 000000 000000 /**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A filter that breaks up an H263 video stream into frames.
// derived from MPEG4IP h263.c
// Author Bernhard Feiten
#ifndef _H263PLUS_VIDEO_STREAM_PARSER_HH
#define _H263PLUS_VIDEO_STREAM_PARSER_HH
#ifndef _STREAM_PARSER_HH
#include "StreamParser.hh"
#endif
// Default timescale for H.263 (1000ms)
#define H263_TIMESCALE 1000
// Default H263 frame rate (30fps)
#define H263_BASIC_FRAME_RATE 30
// Minimum number of bytes needed to parse an H263 header
#define H263_REQUIRE_HEADER_SIZE_BYTES 5
// Number of bytes the start code requires
#define H263_STARTCODE_SIZE_BYTES 3
// This is the input buffer's size. It should contain
// 1 frame with the following start code
#define H263_BUFFER_SIZE (256 * 1024)
// additionalBytesNeeded - indicates how many additional bytes are to be read
// from the next frame's header (over the 3 bytes that are already read).
#define ADDITIONAL_BYTES_NEEDED (H263_REQUIRE_HEADER_SIZE_BYTES - H263_STARTCODE_SIZE_BYTES)
// The default max difference (in %) between max and average bitrates
#define H263_DEFAULT_CBR_TOLERANCE 10
// The following structure holds information extracted from each frame's header:
typedef struct _H263INFO {
u_int8_t tr; // Temporal Reference, used in duration calculation
u_int16_t width; // Width of the picture
u_int16_t height; // Height of the picture
bool isSyncFrame; // Frame type (true = I frame = "sync" frame)
} H263INFO;
typedef struct _MaxBitrate_CTX {
u_int32_t bitrateTable[H263_BASIC_FRAME_RATE];// Window of 1 second
u_int32_t windowBitrate; // The bitrate of the current window
u_int32_t maxBitrate; // The up-to-date maximum bitrate
u_int32_t tableIndex; // The next TR unit to update
} MaxBitrate_CTX;
class H263plusVideoStreamParser : public StreamParser {
public:
H263plusVideoStreamParser( class H263plusVideoStreamFramer* usingSource,
FramedSource* inputSource);
virtual ~H263plusVideoStreamParser();
void registerReadInterest(unsigned char* to, unsigned maxSize);
unsigned parse(u_int64_t & currentDuration); // returns the size of the frame that was acquired, or 0 if none
unsigned numTruncatedBytes() const { return fNumTruncatedBytes; } // The number of truncated bytes (if any)
protected:
// H263plusVideoStreamFramer* usingSource() {
// return (H263plusVideoStreamFramer*)fUsingSource;
// }
void setParseState();
// void setParseState(H263plusParseState parseState);
private:
int parseH263Frame( );
bool ParseShortHeader(u_int8_t *headerBuffer, H263INFO *outputInfoStruct);
void GetMaxBitrate( MaxBitrate_CTX *ctx, u_int32_t frameSize, u_int8_t frameTRDiff);
u_int64_t CalculateDuration(u_int8_t trDiff);
bool GetWidthAndHeight( u_int8_t fmt, u_int16_t *width, u_int16_t *height);
u_int8_t GetTRDifference( u_int8_t nextTR, u_int8_t currentTR);
virtual void restoreSavedParserState();
protected:
class H263plusVideoStreamFramer* fUsingSource;
unsigned char* fTo;
unsigned fMaxSize;
unsigned char* fStartOfFrame;
unsigned char* fSavedTo;
unsigned char* fLimit;
unsigned fNumTruncatedBytes;
unsigned fSavedNumTruncatedBytes;
private:
H263INFO fNextInfo; // Holds information about the next frame
H263INFO fCurrentInfo; // Holds information about the current frame
MaxBitrate_CTX fMaxBitrateCtx; // Context for the GetMaxBitrate function
char fStates[3][256];
u_int8_t fNextHeader[H263_REQUIRE_HEADER_SIZE_BYTES];
u_int32_t fnextTR; // The next frame's presentation time in TR units
u_int64_t fcurrentPT; // The current frame's presentation time in milliseconds
};
#endif
////////// live/liveMedia/H264or5VideoFileSink.cpp //////////
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// H.264 or H.265 Video File sinks
// Implementation
#include "H264or5VideoFileSink.hh"
#include "H264VideoRTPSource.hh" // for "parseSPropParameterSets()"
////////// H264or5VideoFileSink //////////
H264or5VideoFileSink
::H264or5VideoFileSink(UsageEnvironment& env, FILE* fid,
unsigned bufferSize, char const* perFrameFileNamePrefix,
char const* sPropParameterSetsStr1,
char const* sPropParameterSetsStr2,
char const* sPropParameterSetsStr3)
: FileSink(env, fid, bufferSize, perFrameFileNamePrefix),
fHaveWrittenFirstFrame(False) {
fSPropParameterSetsStr[0] = sPropParameterSetsStr1;
fSPropParameterSetsStr[1] = sPropParameterSetsStr2;
fSPropParameterSetsStr[2] = sPropParameterSetsStr3;
}
H264or5VideoFileSink::~H264or5VideoFileSink() {
}
void H264or5VideoFileSink::afterGettingFrame(unsigned frameSize, unsigned numTruncatedBytes, struct timeval presentationTime) {
unsigned char const start_code[4] = {0x00, 0x00, 0x00, 0x01};
if (!fHaveWrittenFirstFrame) {
// If we have NAL units encoded in "sprop parameter strings", prepend these to the file:
for (unsigned j = 0; j < 3; ++j) {
unsigned numSPropRecords;
SPropRecord* sPropRecords
= parseSPropParameterSets(fSPropParameterSetsStr[j], numSPropRecords);
for (unsigned i = 0; i < numSPropRecords; ++i) {
addData(start_code, 4, presentationTime);
addData(sPropRecords[i].sPropBytes, sPropRecords[i].sPropLength, presentationTime);
}
delete[] sPropRecords;
}
fHaveWrittenFirstFrame = True; // for next time
}
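// Illustrative note (not in the original source): the file produced by this
// sink is an H.264/H.265 "Annex B" byte stream, i.e. every NAL unit
// (including any VPS/SPS/PPS NAL units prepended from the "sprop" strings
// above) is preceded by the 4-byte start code 0x00 0x00 0x00 0x01.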
// Write the input data to the file, with the start code in front:
addData(start_code, 4, presentationTime);
// Call the parent class to complete the normal file write with the input data:
FileSink::afterGettingFrame(frameSize, numTruncatedBytes, presentationTime);
}
////////// live/liveMedia/H264or5VideoRTPSink.cpp //////////
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// RTP sink for H.264 or H.265 video
// Implementation
#include "H264or5VideoRTPSink.hh"
#include "H264or5VideoStreamFramer.hh"
////////// H264or5Fragmenter definition //////////
// Because of the idiosyncrasies of the H.264 RTP payload format, we implement
// "H264or5VideoRTPSink" using a separate "H264or5Fragmenter" class that delivers,
// to the "H264or5VideoRTPSink", only fragments that will fit within an outgoing
// RTP packet. I.e., we implement fragmentation in this separate "H264or5Fragmenter"
// class, rather than in "H264or5VideoRTPSink".
// (Note: This class should be used only by "H264or5VideoRTPSink", or a subclass.)
class H264or5Fragmenter: public FramedFilter {
public:
H264or5Fragmenter(int hNumber, UsageEnvironment& env, FramedSource* inputSource,
unsigned inputBufferMax, unsigned maxOutputPacketSize);
virtual ~H264or5Fragmenter();
Boolean lastFragmentCompletedNALUnit() const { return fLastFragmentCompletedNALUnit; }
private: // redefined virtual functions:
virtual void doGetNextFrame();
virtual void doStopGettingFrames();
private:
static void afterGettingFrame(void* clientData, unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds);
void afterGettingFrame1(unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds);
void reset();
private:
int fHNumber;
unsigned fInputBufferSize;
unsigned fMaxOutputPacketSize;
unsigned char* fInputBuffer;
unsigned fNumValidDataBytes;
unsigned fCurDataOffset;
unsigned fSaveNumTruncatedBytes;
Boolean fLastFragmentCompletedNALUnit;
};
////////// H264or5VideoRTPSink implementation //////////
H264or5VideoRTPSink
::H264or5VideoRTPSink(int hNumber,
UsageEnvironment& env, Groupsock* RTPgs, unsigned char rtpPayloadFormat,
u_int8_t const* vps, unsigned vpsSize,
u_int8_t const* sps, unsigned spsSize,
u_int8_t const* pps, unsigned ppsSize)
: VideoRTPSink(env, RTPgs, rtpPayloadFormat, 90000, hNumber == 264 ? "H264" : "H265"),
fHNumber(hNumber), fOurFragmenter(NULL), fFmtpSDPLine(NULL) {
if (vps != NULL) {
fVPSSize = vpsSize;
fVPS = new u_int8_t[fVPSSize];
memmove(fVPS, vps, fVPSSize);
} else {
fVPSSize = 0;
fVPS = NULL;
}
if (sps != NULL) {
fSPSSize = spsSize;
fSPS = new u_int8_t[fSPSSize];
memmove(fSPS, sps, fSPSSize);
} else {
fSPSSize = 0;
fSPS = NULL;
}
if (pps != NULL) {
fPPSSize = ppsSize;
fPPS = new u_int8_t[fPPSSize];
memmove(fPPS, pps, fPPSSize);
} else {
fPPSSize = 0;
fPPS = NULL;
}
}
H264or5VideoRTPSink::~H264or5VideoRTPSink() {
fSource = fOurFragmenter; // hack: in case "fSource" had gotten set to NULL before we were called
delete[] fFmtpSDPLine;
delete[] fVPS; delete[] fSPS; delete[] fPPS;
stopPlaying(); // call this now, because we won't have our 'fragmenter' when the base class destructor calls it later.
// Close our 'fragmenter' as well:
Medium::close(fOurFragmenter);
fSource = NULL; // for the base class destructor, which gets called next
}
Boolean H264or5VideoRTPSink::continuePlaying() {
// First, check whether we have a 'fragmenter' class set up yet.
// If not, create it now:
if (fOurFragmenter == NULL) {
fOurFragmenter = new H264or5Fragmenter(fHNumber, envir(), fSource, OutPacketBuffer::maxSize,
ourMaxPacketSize() - 12/*RTP hdr size*/);
} else {
fOurFragmenter->reassignInputSource(fSource);
}
fSource = fOurFragmenter;
// Then call the parent class's implementation:
return MultiFramedRTPSink::continuePlaying();
}
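// Illustrative note (not part of the original source): after the first call
// to "continuePlaying()", the delivery chain is:
//   upstream source/framer -> H264or5Fragmenter ("fOurFragmenter") -> this RTP sink
// "fSource" now points at the fragmenter rather than the original source;
// this is also why the destructor (above) first restores
// "fSource = fOurFragmenter" before calling "stopPlaying()".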
void H264or5VideoRTPSink::doSpecialFrameHandling(unsigned /*fragmentationOffset*/,
unsigned char* /*frameStart*/,
unsigned /*numBytesInFrame*/,
struct timeval framePresentationTime,
unsigned /*numRemainingBytes*/) {
// Set the RTP 'M' (marker) bit iff
// 1/ The most recently delivered fragment was the end of (or the only fragment of) an NAL unit, and
// 2/ This NAL unit was the last NAL unit of an 'access unit' (i.e. video frame).
if (fOurFragmenter != NULL) {
H264or5VideoStreamFramer* framerSource
= (H264or5VideoStreamFramer*)(fOurFragmenter->inputSource());
// This relies on our fragmenter's source being a "H264or5VideoStreamFramer".
if (((H264or5Fragmenter*)fOurFragmenter)->lastFragmentCompletedNALUnit()
&& framerSource != NULL && framerSource->pictureEndMarker()) {
setMarkerBit();
framerSource->pictureEndMarker() = False;
}
}
setTimestamp(framePresentationTime);
}
Boolean H264or5VideoRTPSink
::frameCanAppearAfterPacketStart(unsigned char const* /*frameStart*/,
unsigned /*numBytesInFrame*/) const {
return False;
}
////////// H264or5Fragmenter implementation //////////
H264or5Fragmenter::H264or5Fragmenter(int hNumber,
UsageEnvironment& env, FramedSource* inputSource,
unsigned inputBufferMax, unsigned maxOutputPacketSize)
: FramedFilter(env, inputSource),
fHNumber(hNumber),
fInputBufferSize(inputBufferMax+1), fMaxOutputPacketSize(maxOutputPacketSize) {
fInputBuffer = new unsigned char[fInputBufferSize];
reset();
}
H264or5Fragmenter::~H264or5Fragmenter() {
delete[] fInputBuffer;
detachInputSource(); // so that the subsequent ~FramedFilter() doesn't delete it
}
void H264or5Fragmenter::doGetNextFrame() {
if (fNumValidDataBytes == 1) {
// We have no NAL unit data currently in the buffer. Read a new one:
fInputSource->getNextFrame(&fInputBuffer[1], fInputBufferSize - 1,
afterGettingFrame, this,
FramedSource::handleClosure, this);
} else {
// We have NAL unit data in the buffer. There are three cases to consider:
// 1. There is a new NAL unit in the buffer, and it's small enough to deliver
// to the RTP sink (as is).
// 2. There is a new NAL unit in the buffer, but it's too large to deliver to
// the RTP sink in its entirety. Deliver the first fragment of this data,
// as a FU packet, with one extra preceding header byte (for the "FU header").
// 3. There is a NAL unit in the buffer, and we've already delivered some
// fragment(s) of this. Deliver the next fragment of this data,
// as a FU packet, with two (H.264) or three (H.265) extra preceding header bytes
// (for the "NAL header" and the "FU header").
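// For reference, a sketch of the FU header layouts referred to above (per
// RFC 6184 for H.264 and RFC 7798 for H.265; not part of the original source):
//
//   H.264 FU indicator   (1 byte):  |F|NRI| Type=28 |
//   H.264 FU header      (1 byte):  |S|E|R|  Type   |
//
//   H.265 payload header (2 bytes): |F| Type=49 | LayerId | TID |
//   H.265 FU header      (1 byte):  |S|E|    FuType    |
//
// S = first fragment of the NAL unit, E = last fragment; "Type"/"FuType"
// carries the original nal_unit_type.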
if (fMaxSize < fMaxOutputPacketSize) { // shouldn't happen
envir() << "H264or5Fragmenter::doGetNextFrame(): fMaxSize ("
<< fMaxSize << ") is smaller than expected\n";
} else {
fMaxSize = fMaxOutputPacketSize;
}
fLastFragmentCompletedNALUnit = True; // by default
if (fCurDataOffset == 1) { // case 1 or 2
if (fNumValidDataBytes - 1 <= fMaxSize) { // case 1
memmove(fTo, &fInputBuffer[1], fNumValidDataBytes - 1);
fFrameSize = fNumValidDataBytes - 1;
fCurDataOffset = fNumValidDataBytes;
} else { // case 2
// We need to send the NAL unit data as FU packets. Deliver the first
// packet now. Note that we add "NAL header" and "FU header" bytes to the front
// of the packet (overwriting the existing "NAL header").
if (fHNumber == 264) {
fInputBuffer[0] = (fInputBuffer[1] & 0xE0) | 28; // FU indicator
fInputBuffer[1] = 0x80 | (fInputBuffer[1] & 0x1F); // FU header (with S bit)
} else { // 265
u_int8_t nal_unit_type = (fInputBuffer[1]&0x7E)>>1;
fInputBuffer[0] = (fInputBuffer[1] & 0x81) | (49<<1); // Payload header (1st byte)
fInputBuffer[1] = fInputBuffer[2]; // Payload header (2nd byte)
fInputBuffer[2] = 0x80 | nal_unit_type; // FU header (with S bit)
}
memmove(fTo, fInputBuffer, fMaxSize);
fFrameSize = fMaxSize;
fCurDataOffset += fMaxSize - 1;
fLastFragmentCompletedNALUnit = False;
}
} else { // case 3
// We are sending this NAL unit data as FU packets. We've already sent the
// first packet (fragment). Now, send the next fragment. Note that we add
// "NAL header" and "FU header" bytes to the front. (We reuse these bytes that
// we already sent for the first fragment, but clear the S bit, and add the E
// bit if this is the last fragment.)
unsigned numExtraHeaderBytes;
if (fHNumber == 264) {
fInputBuffer[fCurDataOffset-2] = fInputBuffer[0]; // FU indicator
fInputBuffer[fCurDataOffset-1] = fInputBuffer[1]&~0x80; // FU header (no S bit)
numExtraHeaderBytes = 2;
} else { // 265
fInputBuffer[fCurDataOffset-3] = fInputBuffer[0]; // Payload header (1st byte)
fInputBuffer[fCurDataOffset-2] = fInputBuffer[1]; // Payload header (2nd byte)
fInputBuffer[fCurDataOffset-1] = fInputBuffer[2]&~0x80; // FU header (no S bit)
numExtraHeaderBytes = 3;
}
unsigned numBytesToSend = numExtraHeaderBytes + (fNumValidDataBytes - fCurDataOffset);
if (numBytesToSend > fMaxSize) {
// We can't send all of the remaining data this time:
numBytesToSend = fMaxSize;
fLastFragmentCompletedNALUnit = False;
} else {
// This is the last fragment:
fInputBuffer[fCurDataOffset-1] |= 0x40; // set the E bit in the FU header
fNumTruncatedBytes = fSaveNumTruncatedBytes;
}
memmove(fTo, &fInputBuffer[fCurDataOffset-numExtraHeaderBytes], numBytesToSend);
fFrameSize = numBytesToSend;
fCurDataOffset += numBytesToSend - numExtraHeaderBytes;
}
if (fCurDataOffset >= fNumValidDataBytes) {
// We're done with this data. Reset the pointers for receiving new data:
fNumValidDataBytes = fCurDataOffset = 1;
}
// Complete delivery to the client:
FramedSource::afterGetting(this);
}
}
void H264or5Fragmenter::doStopGettingFrames() {
// Make sure that we don't have any stale data fragments lying around, should we later resume:
reset();
FramedFilter::doStopGettingFrames();
}
void H264or5Fragmenter::afterGettingFrame(void* clientData, unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
H264or5Fragmenter* fragmenter = (H264or5Fragmenter*)clientData;
fragmenter->afterGettingFrame1(frameSize, numTruncatedBytes, presentationTime,
durationInMicroseconds);
}
void H264or5Fragmenter::afterGettingFrame1(unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
fNumValidDataBytes += frameSize;
fSaveNumTruncatedBytes = numTruncatedBytes;
fPresentationTime = presentationTime;
fDurationInMicroseconds = durationInMicroseconds;
// Deliver data to the client:
doGetNextFrame();
}
void H264or5Fragmenter::reset() {
fNumValidDataBytes = fCurDataOffset = 1;
fSaveNumTruncatedBytes = 0;
fLastFragmentCompletedNALUnit = True;
}
////////// live/liveMedia/H264or5VideoStreamDiscreteFramer.cpp //////////
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A simplified version of "H264or5VideoStreamFramer" that takes only complete,
// discrete frames (rather than an arbitrary byte stream) as input.
// This avoids the parsing and data copying overhead of the full
// "H264or5VideoStreamFramer".
// Implementation
#include "H264or5VideoStreamDiscreteFramer.hh"
H264or5VideoStreamDiscreteFramer
::H264or5VideoStreamDiscreteFramer(int hNumber, UsageEnvironment& env, FramedSource* inputSource)
: H264or5VideoStreamFramer(hNumber, env, inputSource, False/*don't create a parser*/, False) {
}
H264or5VideoStreamDiscreteFramer::~H264or5VideoStreamDiscreteFramer() {
}
void H264or5VideoStreamDiscreteFramer::doGetNextFrame() {
// Arrange to read data (which should be a complete H.264 or H.265 NAL unit)
// from our data source, directly into the client's input buffer.
// After reading this, we'll do some parsing on the frame.
fInputSource->getNextFrame(fTo, fMaxSize,
afterGettingFrame, this,
FramedSource::handleClosure, this);
}
void H264or5VideoStreamDiscreteFramer
::afterGettingFrame(void* clientData, unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
H264or5VideoStreamDiscreteFramer* source = (H264or5VideoStreamDiscreteFramer*)clientData;
source->afterGettingFrame1(frameSize, numTruncatedBytes, presentationTime, durationInMicroseconds);
}
void H264or5VideoStreamDiscreteFramer
::afterGettingFrame1(unsigned frameSize, unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
// Get the "nal_unit_type", to see if this NAL unit is one that we want to save a copy of:
u_int8_t nal_unit_type;
if (fHNumber == 264 && frameSize >= 1) {
nal_unit_type = fTo[0]&0x1F;
} else if (fHNumber == 265 && frameSize >= 2) {
nal_unit_type = (fTo[0]&0x7E)>>1;
} else {
// This is too short to be a valid NAL unit, so just assume a bogus nal_unit_type
nal_unit_type = 0xFF;
}
// Begin by checking for a (likely) common error: NAL units that (erroneously) begin with a
// 0x00000001 or 0x000001 'start code'. (Those start codes should only be in byte-stream data;
// *not* data that consists of discrete NAL units.)
// Once again, to be clear: The NAL units that you feed to a "H264or5VideoStreamDiscreteFramer"
// MUST NOT include start codes.
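// Illustrative example (not from the original source) of the distinction:
//   0x67 0x42 ...                       correct: a bare NAL unit (H.264 SPS)
//   0x00 0x00 0x00 0x01 0x67 0x42 ...   wrong: byte-stream data with a start code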
if (frameSize >= 4 && fTo[0] == 0 && fTo[1] == 0 && ((fTo[2] == 0 && fTo[3] == 1) || fTo[2] == 1)) {
envir() << "H264or5VideoStreamDiscreteFramer error: MPEG 'start code' seen in the input\n";
} else if (isVPS(nal_unit_type)) { // Video parameter set (VPS)
saveCopyOfVPS(fTo, frameSize);
} else if (isSPS(nal_unit_type)) { // Sequence parameter set (SPS)
saveCopyOfSPS(fTo, frameSize);
} else if (isPPS(nal_unit_type)) { // Picture parameter set (PPS)
saveCopyOfPPS(fTo, frameSize);
}
fPictureEndMarker = nalUnitEndsAccessUnit(nal_unit_type);
// Finally, complete delivery to the client:
fFrameSize = frameSize;
fNumTruncatedBytes = numTruncatedBytes;
fPresentationTime = presentationTime;
fDurationInMicroseconds = durationInMicroseconds;
afterGetting(this);
}
Boolean H264or5VideoStreamDiscreteFramer::nalUnitEndsAccessUnit(u_int8_t nal_unit_type) {
// Check whether this NAL unit ends the current 'access unit' (basically, a video frame).
// Unfortunately, we can't do this reliably, because we don't yet know anything about the
// *next* NAL unit that we'll see. So, we guess as best we can, by assuming that
// if this NAL unit is a VCL NAL unit, then it ends the current 'access unit'.
//
// This will be wrong if you are streaming multiple 'slices' per picture. In that case,
// you can define a subclass that reimplements this virtual function to do the right thing.
return isVCL(nal_unit_type);
}
////////// live/liveMedia/H264or5VideoStreamFramer.cpp //////////
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A filter that breaks up a H.264 or H.265 Video Elementary Stream into NAL units.
// Implementation
#include "H264or5VideoStreamFramer.hh"
#include "MPEGVideoStreamParser.hh"
#include "BitVector.hh"
////////// H264or5VideoStreamParser definition //////////
class H264or5VideoStreamParser: public MPEGVideoStreamParser {
public:
H264or5VideoStreamParser(int hNumber, H264or5VideoStreamFramer* usingSource,
FramedSource* inputSource, Boolean includeStartCodeInOutput);
virtual ~H264or5VideoStreamParser();
private: // redefined virtual functions:
virtual void flushInput();
virtual unsigned parse();
private:
H264or5VideoStreamFramer* usingSource() {
return (H264or5VideoStreamFramer*)fUsingSource;
}
Boolean isVPS(u_int8_t nal_unit_type) { return usingSource()->isVPS(nal_unit_type); }
Boolean isSPS(u_int8_t nal_unit_type) { return usingSource()->isSPS(nal_unit_type); }
Boolean isPPS(u_int8_t nal_unit_type) { return usingSource()->isPPS(nal_unit_type); }
Boolean isVCL(u_int8_t nal_unit_type) { return usingSource()->isVCL(nal_unit_type); }
Boolean isSEI(u_int8_t nal_unit_type);
Boolean isEOF(u_int8_t nal_unit_type);
Boolean usuallyBeginsAccessUnit(u_int8_t nal_unit_type);
void removeEmulationBytes(u_int8_t* nalUnitCopy, unsigned maxSize, unsigned& nalUnitCopySize);
void analyze_video_parameter_set_data(unsigned& num_units_in_tick, unsigned& time_scale);
void analyze_seq_parameter_set_data(unsigned& num_units_in_tick, unsigned& time_scale);
void profile_tier_level(BitVector& bv, unsigned max_sub_layers_minus1);
void analyze_vui_parameters(BitVector& bv, unsigned& num_units_in_tick, unsigned& time_scale);
void analyze_hrd_parameters(BitVector& bv);
void analyze_sei_data(u_int8_t nal_unit_type);
void analyze_sei_payload(unsigned payloadType, unsigned payloadSize, u_int8_t* payload);
private:
int fHNumber; // 264 or 265
unsigned fOutputStartCodeSize;
Boolean fHaveSeenFirstStartCode, fHaveSeenFirstByteOfNALUnit;
u_int8_t fFirstByteOfNALUnit;
double fParsedFrameRate;
// variables set & used in the specification:
unsigned cpb_removal_delay_length_minus1, dpb_output_delay_length_minus1;
Boolean CpbDpbDelaysPresentFlag, pic_struct_present_flag;
double DeltaTfiDivisor;
};
////////// H264or5VideoStreamFramer implementation //////////
H264or5VideoStreamFramer
::H264or5VideoStreamFramer(int hNumber, UsageEnvironment& env, FramedSource* inputSource,
Boolean createParser, Boolean includeStartCodeInOutput)
: MPEGVideoStreamFramer(env, inputSource),
fHNumber(hNumber),
fLastSeenVPS(NULL), fLastSeenVPSSize(0),
fLastSeenSPS(NULL), fLastSeenSPSSize(0),
fLastSeenPPS(NULL), fLastSeenPPSSize(0) {
fParser = createParser
? new H264or5VideoStreamParser(hNumber, this, inputSource, includeStartCodeInOutput)
: NULL;
fNextPresentationTime = fPresentationTimeBase;
fFrameRate = 25.0; // We assume a frame rate of 25 fps, unless we learn otherwise (from parsing a VPS or SPS NAL unit)
}
H264or5VideoStreamFramer::~H264or5VideoStreamFramer() {
delete[] fLastSeenPPS;
delete[] fLastSeenSPS;
delete[] fLastSeenVPS;
}
#define VPS_MAX_SIZE 1000 // larger than the largest possible VPS (Video Parameter Set) NAL unit
void H264or5VideoStreamFramer::saveCopyOfVPS(u_int8_t* from, unsigned size) {
if (from == NULL) return;
delete[] fLastSeenVPS;
fLastSeenVPS = new u_int8_t[size];
memmove(fLastSeenVPS, from, size);
fLastSeenVPSSize = size;
}
#define SPS_MAX_SIZE 1000 // larger than the largest possible SPS (Sequence Parameter Set) NAL unit
void H264or5VideoStreamFramer::saveCopyOfSPS(u_int8_t* from, unsigned size) {
if (from == NULL) return;
delete[] fLastSeenSPS;
fLastSeenSPS = new u_int8_t[size];
memmove(fLastSeenSPS, from, size);
fLastSeenSPSSize = size;
}
void H264or5VideoStreamFramer::saveCopyOfPPS(u_int8_t* from, unsigned size) {
if (from == NULL) return;
delete[] fLastSeenPPS;
fLastSeenPPS = new u_int8_t[size];
memmove(fLastSeenPPS, from, size);
fLastSeenPPSSize = size;
}
Boolean H264or5VideoStreamFramer::isVPS(u_int8_t nal_unit_type) {
// VPS NAL units occur in H.265 only:
return fHNumber == 265 && nal_unit_type == 32;
}
Boolean H264or5VideoStreamFramer::isSPS(u_int8_t nal_unit_type) {
return fHNumber == 264 ? nal_unit_type == 7 : nal_unit_type == 33;
}
Boolean H264or5VideoStreamFramer::isPPS(u_int8_t nal_unit_type) {
return fHNumber == 264 ? nal_unit_type == 8 : nal_unit_type == 34;
}
Boolean H264or5VideoStreamFramer::isVCL(u_int8_t nal_unit_type) {
return fHNumber == 264
? (nal_unit_type <= 5 && nal_unit_type > 0)
: (nal_unit_type <= 31);
}
////////// H264or5VideoStreamParser implementation //////////
H264or5VideoStreamParser
::H264or5VideoStreamParser(int hNumber, H264or5VideoStreamFramer* usingSource,
FramedSource* inputSource, Boolean includeStartCodeInOutput)
: MPEGVideoStreamParser(usingSource, inputSource),
fHNumber(hNumber), fOutputStartCodeSize(includeStartCodeInOutput ? 4 : 0), fHaveSeenFirstStartCode(False), fHaveSeenFirstByteOfNALUnit(False), fParsedFrameRate(0.0),
cpb_removal_delay_length_minus1(23), dpb_output_delay_length_minus1(23),
CpbDpbDelaysPresentFlag(0), pic_struct_present_flag(0),
DeltaTfiDivisor(2.0) {
}
H264or5VideoStreamParser::~H264or5VideoStreamParser() {
}
#define PREFIX_SEI_NUT 39 // for H.265
#define SUFFIX_SEI_NUT 40 // for H.265
Boolean H264or5VideoStreamParser::isSEI(u_int8_t nal_unit_type) {
return fHNumber == 264
? nal_unit_type == 6
: (nal_unit_type == PREFIX_SEI_NUT || nal_unit_type == SUFFIX_SEI_NUT);
}
Boolean H264or5VideoStreamParser::isEOF(u_int8_t nal_unit_type) {
// "end of sequence" or "end of (bit)stream"
return fHNumber == 264
? (nal_unit_type == 10 || nal_unit_type == 11)
: (nal_unit_type == 36 || nal_unit_type == 37);
}
Boolean H264or5VideoStreamParser::usuallyBeginsAccessUnit(u_int8_t nal_unit_type) {
return fHNumber == 264
? (nal_unit_type >= 6 && nal_unit_type <= 9) || (nal_unit_type >= 14 && nal_unit_type <= 18)
: (nal_unit_type >= 32 && nal_unit_type <= 35) || (nal_unit_type == 39)
|| (nal_unit_type >= 41 && nal_unit_type <= 44)
|| (nal_unit_type >= 48 && nal_unit_type <= 55);
}
void H264or5VideoStreamParser
::removeEmulationBytes(u_int8_t* nalUnitCopy, unsigned maxSize, unsigned& nalUnitCopySize) {
u_int8_t* nalUnitOrig = fStartOfFrame + fOutputStartCodeSize;
unsigned const numBytesInNALunit = fTo - nalUnitOrig;
nalUnitCopySize
= removeH264or5EmulationBytes(nalUnitCopy, maxSize, nalUnitOrig, numBytesInNALunit);
}
#ifdef DEBUG
char const* nal_unit_type_description_h264[32] = {
"Unspecified", //0
"Coded slice of a non-IDR picture", //1
"Coded slice data partition A", //2
"Coded slice data partition B", //3
"Coded slice data partition C", //4
"Coded slice of an IDR picture", //5
"Supplemental enhancement information (SEI)", //6
"Sequence parameter set", //7
"Picture parameter set", //8
"Access unit delimiter", //9
"End of sequence", //10
"End of stream", //11
"Filler data", //12
"Sequence parameter set extension", //13
"Prefix NAL unit", //14
"Subset sequence parameter set", //15
"Reserved", //16
"Reserved", //17
"Reserved", //18
"Coded slice of an auxiliary coded picture without partitioning", //19
"Coded slice extension", //20
"Reserved", //21
"Reserved", //22
"Reserved", //23
"Unspecified", //24
"Unspecified", //25
"Unspecified", //26
"Unspecified", //27
"Unspecified", //28
"Unspecified", //29
"Unspecified", //30
"Unspecified" //31
};
char const* nal_unit_type_description_h265[64] = {
"Coded slice segment of a non-TSA, non-STSA trailing picture", //0
"Coded slice segment of a non-TSA, non-STSA trailing picture", //1
"Coded slice segment of a TSA picture", //2
"Coded slice segment of a TSA picture", //3
"Coded slice segment of a STSA picture", //4
"Coded slice segment of a STSA picture", //5
"Coded slice segment of a RADL picture", //6
"Coded slice segment of a RADL picture", //7
"Coded slice segment of a RASL picture", //8
"Coded slice segment of a RASL picture", //9
"Reserved", //10
"Reserved", //11
"Reserved", //12
"Reserved", //13
"Reserved", //14
"Reserved", //15
"Coded slice segment of a BLA picture", //16
"Coded slice segment of a BLA picture", //17
"Coded slice segment of a BLA picture", //18
"Coded slice segment of an IDR picture", //19
"Coded slice segment of an IDR picture", //20
"Coded slice segment of a CRA picture", //21
"Reserved", //22
"Reserved", //23
"Reserved", //24
"Reserved", //25
"Reserved", //26
"Reserved", //27
"Reserved", //28
"Reserved", //29
"Reserved", //30
"Reserved", //31
"Video parameter set", //32
"Sequence parameter set", //33
"Picture parameter set", //34
"Access unit delimiter", //35
"End of sequence", //36
"End of bitstream", //37
"Filler data", //38
"Supplemental enhancement information (SEI)", //39
"Supplemental enhancement information (SEI)", //40
"Reserved", //41
"Reserved", //42
"Reserved", //43
"Reserved", //44
"Reserved", //45
"Reserved", //46
"Reserved", //47
"Unspecified", //48
"Unspecified", //49
"Unspecified", //50
"Unspecified", //51
"Unspecified", //52
"Unspecified", //53
"Unspecified", //54
"Unspecified", //55
"Unspecified", //56
"Unspecified", //57
"Unspecified", //58
"Unspecified", //59
"Unspecified", //60
"Unspecified", //61
"Unspecified", //62
"Unspecified", //63
};
#endif
#ifdef DEBUG
static unsigned numDebugTabs = 1;
#define DEBUG_PRINT_TABS for (unsigned _i = 0; _i < numDebugTabs; ++_i) fprintf(stderr, "\t")
#define DEBUG_PRINT(x) do { DEBUG_PRINT_TABS; fprintf(stderr, "%s: %d\n", #x, x); } while (0)
#define DEBUG_STR(x) do { DEBUG_PRINT_TABS; fprintf(stderr, "%s\n", x); } while (0)
class DebugTab {
public:
DebugTab() {++numDebugTabs;}
~DebugTab() {--numDebugTabs;}
};
#define DEBUG_TAB DebugTab dummy
#else
#define DEBUG_PRINT(x) do {x = x;} while (0)
// Note: the "x=x;" statement is intended to eliminate "unused variable" compiler warning messages
#define DEBUG_STR(x) do {} while (0)
#define DEBUG_TAB do {} while (0)
#endif
void H264or5VideoStreamParser::profile_tier_level(BitVector& bv, unsigned max_sub_layers_minus1) {
bv.skipBits(96);
unsigned i;
Boolean sub_layer_profile_present_flag[7], sub_layer_level_present_flag[7];
for (i = 0; i < max_sub_layers_minus1; ++i) {
sub_layer_profile_present_flag[i] = bv.get1BitBoolean();
sub_layer_level_present_flag[i] = bv.get1BitBoolean();
}
if (max_sub_layers_minus1 > 0) {
bv.skipBits(2*(8-max_sub_layers_minus1)); // reserved_zero_2bits
}
for (i = 0; i < max_sub_layers_minus1; ++i) {
if (sub_layer_profile_present_flag[i]) {
bv.skipBits(88);
}
if (sub_layer_level_present_flag[i]) {
bv.skipBits(8); // sub_layer_level_idc[i]
}
}
}
void H264or5VideoStreamParser
::analyze_vui_parameters(BitVector& bv,
unsigned& num_units_in_tick, unsigned& time_scale) {
Boolean aspect_ratio_info_present_flag = bv.get1BitBoolean();
DEBUG_PRINT(aspect_ratio_info_present_flag);
if (aspect_ratio_info_present_flag) {
DEBUG_TAB;
unsigned aspect_ratio_idc = bv.getBits(8);
DEBUG_PRINT(aspect_ratio_idc);
if (aspect_ratio_idc == 255/*Extended_SAR*/) {
bv.skipBits(32); // sar_width; sar_height
}
}
Boolean overscan_info_present_flag = bv.get1BitBoolean();
DEBUG_PRINT(overscan_info_present_flag);
if (overscan_info_present_flag) {
bv.skipBits(1); // overscan_appropriate_flag
}
Boolean video_signal_type_present_flag = bv.get1BitBoolean();
DEBUG_PRINT(video_signal_type_present_flag);
if (video_signal_type_present_flag) {
DEBUG_TAB;
bv.skipBits(4); // video_format; video_full_range_flag
Boolean colour_description_present_flag = bv.get1BitBoolean();
DEBUG_PRINT(colour_description_present_flag);
if (colour_description_present_flag) {
bv.skipBits(24); // colour_primaries; transfer_characteristics; matrix_coefficients
}
}
Boolean chroma_loc_info_present_flag = bv.get1BitBoolean();
DEBUG_PRINT(chroma_loc_info_present_flag);
if (chroma_loc_info_present_flag) {
(void)bv.get_expGolomb(); // chroma_sample_loc_type_top_field
(void)bv.get_expGolomb(); // chroma_sample_loc_type_bottom_field
}
if (fHNumber == 265) {
bv.skipBits(2); // neutral_chroma_indication_flag, field_seq_flag
Boolean frame_field_info_present_flag = bv.get1BitBoolean();
DEBUG_PRINT(frame_field_info_present_flag);
pic_struct_present_flag = frame_field_info_present_flag; // hack to make H.265 like H.264
Boolean default_display_window_flag = bv.get1BitBoolean();
DEBUG_PRINT(default_display_window_flag);
if (default_display_window_flag) {
(void)bv.get_expGolomb(); // def_disp_win_left_offset
(void)bv.get_expGolomb(); // def_disp_win_right_offset
(void)bv.get_expGolomb(); // def_disp_win_top_offset
(void)bv.get_expGolomb(); // def_disp_win_bottom_offset
}
}
Boolean timing_info_present_flag = bv.get1BitBoolean();
DEBUG_PRINT(timing_info_present_flag);
if (timing_info_present_flag) {
DEBUG_TAB;
num_units_in_tick = bv.getBits(32);
DEBUG_PRINT(num_units_in_tick);
time_scale = bv.getBits(32);
DEBUG_PRINT(time_scale);
if (fHNumber == 264) {
Boolean fixed_frame_rate_flag = bv.get1BitBoolean();
DEBUG_PRINT(fixed_frame_rate_flag);
} else { // 265
Boolean vui_poc_proportional_to_timing_flag = bv.get1BitBoolean();
DEBUG_PRINT(vui_poc_proportional_to_timing_flag);
if (vui_poc_proportional_to_timing_flag) {
unsigned vui_num_ticks_poc_diff_one_minus1 = bv.get_expGolomb();
DEBUG_PRINT(vui_num_ticks_poc_diff_one_minus1);
}
return; // For H.265, don't bother parsing any more of this #####
}
}
// The following is H.264 only: #####
Boolean nal_hrd_parameters_present_flag = bv.get1BitBoolean();
DEBUG_PRINT(nal_hrd_parameters_present_flag);
if (nal_hrd_parameters_present_flag) analyze_hrd_parameters(bv);
Boolean vcl_hrd_parameters_present_flag = bv.get1BitBoolean();
DEBUG_PRINT(vcl_hrd_parameters_present_flag);
if (vcl_hrd_parameters_present_flag) analyze_hrd_parameters(bv);
CpbDpbDelaysPresentFlag = nal_hrd_parameters_present_flag || vcl_hrd_parameters_present_flag;
if (CpbDpbDelaysPresentFlag) {
bv.skipBits(1); // low_delay_hrd_flag
}
pic_struct_present_flag = bv.get1BitBoolean();
DEBUG_PRINT(pic_struct_present_flag);
}
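The VUI timing fields parsed above (`num_units_in_tick`, `time_scale`) feed the frame-rate computation performed later in `parse()`. A minimal standalone sketch of that arithmetic; `frameRateFromTiming` is an illustrative helper, not a liveMedia function, and the sample values are made up:

```cpp
#include <cassert>

// fps = time_scale / (DeltaTfiDivisor * num_units_in_tick), matching the
// expression used in parse() when a parameter set carries timing info.
// For typical progressive H.264 content, DeltaTfiDivisor is 2.0
// (the clock ticks twice per frame).
static double frameRateFromTiming(unsigned time_scale, unsigned num_units_in_tick,
                                  double DeltaTfiDivisor) {
  return time_scale / (DeltaTfiDivisor * num_units_in_tick);
}
```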
void H264or5VideoStreamParser::analyze_hrd_parameters(BitVector& bv) {
  DEBUG_TAB;
  unsigned cpb_cnt_minus1 = bv.get_expGolomb();
  DEBUG_PRINT(cpb_cnt_minus1);
  unsigned bit_rate_scale = bv.getBits(4);
  DEBUG_PRINT(bit_rate_scale);
  unsigned cpb_size_scale = bv.getBits(4);
  DEBUG_PRINT(cpb_size_scale);
  for (unsigned SchedSelIdx = 0; SchedSelIdx <= cpb_cnt_minus1; ++SchedSelIdx) {
    DEBUG_TAB;
    DEBUG_PRINT(SchedSelIdx);
    unsigned bit_rate_value_minus1 = bv.get_expGolomb();
    DEBUG_PRINT(bit_rate_value_minus1);
    unsigned cpb_size_value_minus1 = bv.get_expGolomb();
    DEBUG_PRINT(cpb_size_value_minus1);
    Boolean cbr_flag = bv.get1BitBoolean();
    DEBUG_PRINT(cbr_flag);
  }
  unsigned initial_cpb_removal_delay_length_minus1 = bv.getBits(5);
  DEBUG_PRINT(initial_cpb_removal_delay_length_minus1);
  cpb_removal_delay_length_minus1 = bv.getBits(5);
  DEBUG_PRINT(cpb_removal_delay_length_minus1);
  dpb_output_delay_length_minus1 = bv.getBits(5);
  DEBUG_PRINT(dpb_output_delay_length_minus1);
  unsigned time_offset_length = bv.getBits(5);
  DEBUG_PRINT(time_offset_length);
}
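The many `get_expGolomb()` calls in this parser read unsigned Exp-Golomb (`ue(v)`) codes: count leading zero bits, read that many more bits after the terminating 1, and subtract 1 from the resulting value. A self-contained sketch of the decoding; `SimpleBitReader` is a hypothetical stand-in for the library's `BitVector`:

```cpp
#include <cstdint>
#include <cstddef>

struct SimpleBitReader {
  uint8_t const* data;
  size_t bitPos;
  unsigned getBit() {
    // MSB-first bit extraction from the byte array:
    unsigned b = (data[bitPos >> 3] >> (7 - (bitPos & 7))) & 1;
    ++bitPos;
    return b;
  }
  unsigned getExpGolomb() {
    unsigned numLeadingZeros = 0;
    while (getBit() == 0) ++numLeadingZeros;  // count zeros up to the first 1
    unsigned value = 1;                       // the '1' bit just read
    for (unsigned i = 0; i < numLeadingZeros; ++i) value = (value << 1) | getBit();
    return value - 1;                         // ue(v): codeNum = value - 1
  }
};
```

So the codewords `1`, `010`, `011`, `00100` decode to 0, 1, 2, 3 respectively.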
void H264or5VideoStreamParser
::analyze_video_parameter_set_data(unsigned& num_units_in_tick, unsigned& time_scale) {
num_units_in_tick = time_scale = 0; // default values
// Begin by making a copy of the NAL unit data, removing any 'emulation prevention' bytes:
u_int8_t vps[VPS_MAX_SIZE];
unsigned vpsSize;
removeEmulationBytes(vps, sizeof vps, vpsSize);
BitVector bv(vps, 0, 8*vpsSize);
// Assert: fHNumber == 265 (because this function is called only when parsing H.265)
unsigned i;
bv.skipBits(28); // nal_unit_header, vps_video_parameter_set_id, vps_reserved_three_2bits, vps_max_layers_minus1
unsigned vps_max_sub_layers_minus1 = bv.getBits(3);
DEBUG_PRINT(vps_max_sub_layers_minus1);
bv.skipBits(17); // vps_temporal_id_nesting_flag, vps_reserved_0xffff_16bits
profile_tier_level(bv, vps_max_sub_layers_minus1);
Boolean vps_sub_layer_ordering_info_present_flag = bv.get1BitBoolean();
DEBUG_PRINT(vps_sub_layer_ordering_info_present_flag);
for (i = vps_sub_layer_ordering_info_present_flag ? 0 : vps_max_sub_layers_minus1;
i <= vps_max_sub_layers_minus1; ++i) {
(void)bv.get_expGolomb(); // vps_max_dec_pic_buffering_minus1[i]
(void)bv.get_expGolomb(); // vps_max_num_reorder_pics[i]
(void)bv.get_expGolomb(); // vps_max_latency_increase_plus1[i]
}
unsigned vps_max_layer_id = bv.getBits(6);
DEBUG_PRINT(vps_max_layer_id);
unsigned vps_num_layer_sets_minus1 = bv.get_expGolomb();
DEBUG_PRINT(vps_num_layer_sets_minus1);
for (i = 1; i <= vps_num_layer_sets_minus1; ++i) {
bv.skipBits(vps_max_layer_id+1); // layer_id_included_flag[i][0..vps_max_layer_id]
}
Boolean vps_timing_info_present_flag = bv.get1BitBoolean();
DEBUG_PRINT(vps_timing_info_present_flag);
if (vps_timing_info_present_flag) {
DEBUG_TAB;
num_units_in_tick = bv.getBits(32);
DEBUG_PRINT(num_units_in_tick);
time_scale = bv.getBits(32);
DEBUG_PRINT(time_scale);
Boolean vps_poc_proportional_to_timing_flag = bv.get1BitBoolean();
DEBUG_PRINT(vps_poc_proportional_to_timing_flag);
if (vps_poc_proportional_to_timing_flag) {
unsigned vps_num_ticks_poc_diff_one_minus1 = bv.get_expGolomb();
DEBUG_PRINT(vps_num_ticks_poc_diff_one_minus1);
}
}
Boolean vps_extension_flag = bv.get1BitBoolean();
DEBUG_PRINT(vps_extension_flag);
}
void H264or5VideoStreamParser
::analyze_seq_parameter_set_data(unsigned& num_units_in_tick, unsigned& time_scale) {
num_units_in_tick = time_scale = 0; // default values
// Begin by making a copy of the NAL unit data, removing any 'emulation prevention' bytes:
u_int8_t sps[SPS_MAX_SIZE];
unsigned spsSize;
removeEmulationBytes(sps, sizeof sps, spsSize);
BitVector bv(sps, 0, 8*spsSize);
if (fHNumber == 264) {
bv.skipBits(8); // forbidden_zero_bit; nal_ref_idc; nal_unit_type
unsigned profile_idc = bv.getBits(8);
DEBUG_PRINT(profile_idc);
unsigned constraint_setN_flag = bv.getBits(8); // also "reserved_zero_2bits" at end
DEBUG_PRINT(constraint_setN_flag);
unsigned level_idc = bv.getBits(8);
DEBUG_PRINT(level_idc);
unsigned seq_parameter_set_id = bv.get_expGolomb();
DEBUG_PRINT(seq_parameter_set_id);
if (profile_idc == 100 || profile_idc == 110 || profile_idc == 122 || profile_idc == 244 || profile_idc == 44 || profile_idc == 83 || profile_idc == 86 || profile_idc == 118 || profile_idc == 128 ) {
DEBUG_TAB;
unsigned chroma_format_idc = bv.get_expGolomb();
DEBUG_PRINT(chroma_format_idc);
if (chroma_format_idc == 3) {
DEBUG_TAB;
Boolean separate_colour_plane_flag = bv.get1BitBoolean();
DEBUG_PRINT(separate_colour_plane_flag);
}
(void)bv.get_expGolomb(); // bit_depth_luma_minus8
(void)bv.get_expGolomb(); // bit_depth_chroma_minus8
bv.skipBits(1); // qpprime_y_zero_transform_bypass_flag
Boolean seq_scaling_matrix_present_flag = bv.get1BitBoolean();
DEBUG_PRINT(seq_scaling_matrix_present_flag);
if (seq_scaling_matrix_present_flag) {
for (int i = 0; i < ((chroma_format_idc != 3) ? 8 : 12); ++i) {
DEBUG_TAB;
DEBUG_PRINT(i);
Boolean seq_scaling_list_present_flag = bv.get1BitBoolean();
DEBUG_PRINT(seq_scaling_list_present_flag);
if (seq_scaling_list_present_flag) {
DEBUG_TAB;
unsigned sizeOfScalingList = i < 6 ? 16 : 64;
unsigned lastScale = 8;
unsigned nextScale = 8;
for (unsigned j = 0; j < sizeOfScalingList; ++j) {
DEBUG_TAB;
DEBUG_PRINT(j);
DEBUG_PRINT(nextScale);
if (nextScale != 0) {
DEBUG_TAB;
unsigned delta_scale = bv.get_expGolomb();
DEBUG_PRINT(delta_scale);
nextScale = (lastScale + delta_scale + 256) % 256;
}
lastScale = (nextScale == 0) ? lastScale : nextScale;
DEBUG_PRINT(lastScale);
}
}
}
}
}
unsigned log2_max_frame_num_minus4 = bv.get_expGolomb();
DEBUG_PRINT(log2_max_frame_num_minus4);
unsigned pic_order_cnt_type = bv.get_expGolomb();
DEBUG_PRINT(pic_order_cnt_type);
if (pic_order_cnt_type == 0) {
DEBUG_TAB;
unsigned log2_max_pic_order_cnt_lsb_minus4 = bv.get_expGolomb();
DEBUG_PRINT(log2_max_pic_order_cnt_lsb_minus4);
} else if (pic_order_cnt_type == 1) {
DEBUG_TAB;
bv.skipBits(1); // delta_pic_order_always_zero_flag
(void)bv.get_expGolomb(); // offset_for_non_ref_pic
(void)bv.get_expGolomb(); // offset_for_top_to_bottom_field
unsigned num_ref_frames_in_pic_order_cnt_cycle = bv.get_expGolomb();
DEBUG_PRINT(num_ref_frames_in_pic_order_cnt_cycle);
for (unsigned i = 0; i < num_ref_frames_in_pic_order_cnt_cycle; ++i) {
(void)bv.get_expGolomb(); // offset_for_ref_frame[i]
}
}
unsigned max_num_ref_frames = bv.get_expGolomb();
DEBUG_PRINT(max_num_ref_frames);
Boolean gaps_in_frame_num_value_allowed_flag = bv.get1BitBoolean();
DEBUG_PRINT(gaps_in_frame_num_value_allowed_flag);
unsigned pic_width_in_mbs_minus1 = bv.get_expGolomb();
DEBUG_PRINT(pic_width_in_mbs_minus1);
unsigned pic_height_in_map_units_minus1 = bv.get_expGolomb();
DEBUG_PRINT(pic_height_in_map_units_minus1);
Boolean frame_mbs_only_flag = bv.get1BitBoolean();
DEBUG_PRINT(frame_mbs_only_flag);
if (!frame_mbs_only_flag) {
bv.skipBits(1); // mb_adaptive_frame_field_flag
}
bv.skipBits(1); // direct_8x8_inference_flag
Boolean frame_cropping_flag = bv.get1BitBoolean();
DEBUG_PRINT(frame_cropping_flag);
if (frame_cropping_flag) {
(void)bv.get_expGolomb(); // frame_crop_left_offset
(void)bv.get_expGolomb(); // frame_crop_right_offset
(void)bv.get_expGolomb(); // frame_crop_top_offset
(void)bv.get_expGolomb(); // frame_crop_bottom_offset
}
Boolean vui_parameters_present_flag = bv.get1BitBoolean();
DEBUG_PRINT(vui_parameters_present_flag);
if (vui_parameters_present_flag) {
DEBUG_TAB;
analyze_vui_parameters(bv, num_units_in_tick, time_scale);
}
} else { // 265
unsigned i;
bv.skipBits(16); // nal_unit_header
bv.skipBits(4); // sps_video_parameter_set_id
unsigned sps_max_sub_layers_minus1 = bv.getBits(3);
DEBUG_PRINT(sps_max_sub_layers_minus1);
bv.skipBits(1); // sps_temporal_id_nesting_flag
profile_tier_level(bv, sps_max_sub_layers_minus1);
(void)bv.get_expGolomb(); // sps_seq_parameter_set_id
unsigned chroma_format_idc = bv.get_expGolomb();
DEBUG_PRINT(chroma_format_idc);
if (chroma_format_idc == 3) bv.skipBits(1); // separate_colour_plane_flag
unsigned pic_width_in_luma_samples = bv.get_expGolomb();
DEBUG_PRINT(pic_width_in_luma_samples);
unsigned pic_height_in_luma_samples = bv.get_expGolomb();
DEBUG_PRINT(pic_height_in_luma_samples);
Boolean conformance_window_flag = bv.get1BitBoolean();
DEBUG_PRINT(conformance_window_flag);
if (conformance_window_flag) {
DEBUG_TAB;
unsigned conf_win_left_offset = bv.get_expGolomb();
DEBUG_PRINT(conf_win_left_offset);
unsigned conf_win_right_offset = bv.get_expGolomb();
DEBUG_PRINT(conf_win_right_offset);
unsigned conf_win_top_offset = bv.get_expGolomb();
DEBUG_PRINT(conf_win_top_offset);
unsigned conf_win_bottom_offset = bv.get_expGolomb();
DEBUG_PRINT(conf_win_bottom_offset);
}
(void)bv.get_expGolomb(); // bit_depth_luma_minus8
(void)bv.get_expGolomb(); // bit_depth_chroma_minus8
unsigned log2_max_pic_order_cnt_lsb_minus4 = bv.get_expGolomb();
DEBUG_PRINT(log2_max_pic_order_cnt_lsb_minus4);
Boolean sps_sub_layer_ordering_info_present_flag = bv.get1BitBoolean();
DEBUG_PRINT(sps_sub_layer_ordering_info_present_flag);
for (i = (sps_sub_layer_ordering_info_present_flag ? 0 : sps_max_sub_layers_minus1);
i <= sps_max_sub_layers_minus1; ++i) {
(void)bv.get_expGolomb(); // sps_max_dec_pic_buffering_minus1[i]
(void)bv.get_expGolomb(); // sps_max_num_reorder_pics[i]
(void)bv.get_expGolomb(); // sps_max_latency_increase_plus1[i]
}
(void)bv.get_expGolomb(); // log2_min_luma_coding_block_size_minus3
(void)bv.get_expGolomb(); // log2_diff_max_min_luma_coding_block_size
(void)bv.get_expGolomb(); // log2_min_transform_block_size_minus2
(void)bv.get_expGolomb(); // log2_diff_max_min_transform_block_size
(void)bv.get_expGolomb(); // max_transform_hierarchy_depth_inter
(void)bv.get_expGolomb(); // max_transform_hierarchy_depth_intra
Boolean scaling_list_enabled_flag = bv.get1BitBoolean();
DEBUG_PRINT(scaling_list_enabled_flag);
if (scaling_list_enabled_flag) {
DEBUG_TAB;
Boolean sps_scaling_list_data_present_flag = bv.get1BitBoolean();
DEBUG_PRINT(sps_scaling_list_data_present_flag);
if (sps_scaling_list_data_present_flag) {
// scaling_list_data()
DEBUG_TAB;
for (unsigned sizeId = 0; sizeId < 4; ++sizeId) {
DEBUG_PRINT(sizeId);
for (unsigned matrixId = 0; matrixId < (sizeId == 3 ? 2 : 6); ++matrixId) {
DEBUG_TAB;
DEBUG_PRINT(matrixId);
Boolean scaling_list_pred_mode_flag = bv.get1BitBoolean();
DEBUG_PRINT(scaling_list_pred_mode_flag);
if (!scaling_list_pred_mode_flag) {
(void)bv.get_expGolomb(); // scaling_list_pred_matrix_id_delta[sizeId][matrixId]
} else {
unsigned const c = 1 << (4+(sizeId<<1));
unsigned coefNum = c < 64 ? c : 64;
if (sizeId > 1) {
(void)bv.get_expGolomb(); // scaling_list_dc_coef_minus8[sizeId][matrixId]
}
for (i = 0; i < coefNum; ++i) {
(void)bv.get_expGolomb(); // scaling_list_delta_coef
}
}
}
}
}
}
bv.skipBits(2); // amp_enabled_flag, sample_adaptive_offset_enabled_flag
Boolean pcm_enabled_flag = bv.get1BitBoolean();
DEBUG_PRINT(pcm_enabled_flag);
if (pcm_enabled_flag) {
bv.skipBits(8); // pcm_sample_bit_depth_luma_minus1, pcm_sample_bit_depth_chroma_minus1
(void)bv.get_expGolomb(); // log2_min_pcm_luma_coding_block_size_minus3
(void)bv.get_expGolomb(); // log2_diff_max_min_pcm_luma_coding_block_size
bv.skipBits(1); // pcm_loop_filter_disabled_flag
}
unsigned num_short_term_ref_pic_sets = bv.get_expGolomb();
DEBUG_PRINT(num_short_term_ref_pic_sets);
unsigned num_negative_pics = 0, prev_num_negative_pics = 0;
unsigned num_positive_pics = 0, prev_num_positive_pics = 0;
for (i = 0; i < num_short_term_ref_pic_sets; ++i) {
// short_term_ref_pic_set(i):
DEBUG_TAB;
DEBUG_PRINT(i);
Boolean inter_ref_pic_set_prediction_flag = False;
if (i != 0) {
inter_ref_pic_set_prediction_flag = bv.get1BitBoolean();
}
DEBUG_PRINT(inter_ref_pic_set_prediction_flag);
if (inter_ref_pic_set_prediction_flag) {
DEBUG_TAB;
if (i == num_short_term_ref_pic_sets) {
// This can't happen here, but it's in the spec, so we include it for completeness
(void)bv.get_expGolomb(); // delta_idx_minus1
}
bv.skipBits(1); // delta_rps_sign
(void)bv.get_expGolomb(); // abs_delta_rps_minus1
unsigned NumDeltaPocs = prev_num_negative_pics + prev_num_positive_pics; // correct???
for (unsigned j = 0; j < NumDeltaPocs; ++j) {
DEBUG_PRINT(j);
Boolean used_by_curr_pic_flag = bv.get1BitBoolean();
DEBUG_PRINT(used_by_curr_pic_flag);
if (!used_by_curr_pic_flag) bv.skipBits(1); // use_delta_flag[j]
}
} else {
prev_num_negative_pics = num_negative_pics;
num_negative_pics = bv.get_expGolomb();
DEBUG_PRINT(num_negative_pics);
prev_num_positive_pics = num_positive_pics;
num_positive_pics = bv.get_expGolomb();
DEBUG_PRINT(num_positive_pics);
unsigned k;
for (k = 0; k < num_negative_pics; ++k) {
(void)bv.get_expGolomb(); // delta_poc_s0_minus1[k]
bv.skipBits(1); // used_by_curr_pic_s0_flag[k]
}
for (k = 0; k < num_positive_pics; ++k) {
(void)bv.get_expGolomb(); // delta_poc_s1_minus1[k]
bv.skipBits(1); // used_by_curr_pic_s1_flag[k]
}
}
}
Boolean long_term_ref_pics_present_flag = bv.get1BitBoolean();
DEBUG_PRINT(long_term_ref_pics_present_flag);
if (long_term_ref_pics_present_flag) {
DEBUG_TAB;
unsigned num_long_term_ref_pics_sps = bv.get_expGolomb();
DEBUG_PRINT(num_long_term_ref_pics_sps);
for (i = 0; i < num_long_term_ref_pics_sps; ++i) {
bv.skipBits(log2_max_pic_order_cnt_lsb_minus4 + 4); // lt_ref_pic_poc_lsb_sps[i] (u(v): log2_max_pic_order_cnt_lsb_minus4+4 bits)
bv.skipBits(1); // used_by_curr_pic_lt_sps_flag[i]
}
}
bv.skipBits(2); // sps_temporal_mvp_enabled_flag, strong_intra_smoothing_enabled_flag
Boolean vui_parameters_present_flag = bv.get1BitBoolean();
DEBUG_PRINT(vui_parameters_present_flag);
if (vui_parameters_present_flag) {
DEBUG_TAB;
analyze_vui_parameters(bv, num_units_in_tick, time_scale);
}
Boolean sps_extension_flag = bv.get1BitBoolean();
DEBUG_PRINT(sps_extension_flag);
}
}
#define SEI_MAX_SIZE 5000 // larger than the largest possible SEI NAL unit
#ifdef DEBUG
#define MAX_SEI_PAYLOAD_TYPE_DESCRIPTION_H264 46
char const* sei_payloadType_description_h264[MAX_SEI_PAYLOAD_TYPE_DESCRIPTION_H264+1] = {
"buffering_period", //0
"pic_timing", //1
"pan_scan_rect", //2
"filler_payload", //3
"user_data_registered_itu_t_t35", //4
"user_data_unregistered", //5
"recovery_point", //6
"dec_ref_pic_marking_repetition", //7
"spare_pic", //8
"scene_info", //9
"sub_seq_info", //10
"sub_seq_layer_characteristics", //11
"sub_seq_characteristics", //12
"full_frame_freeze", //13
"full_frame_freeze_release", //14
"full_frame_snapshot", //15
"progressive_refinement_segment_start", //16
"progressive_refinement_segment_end", //17
"motion_constrained_slice_group_set", //18
"film_grain_characteristics", //19
"deblocking_filter_display_preference", //20
"stereo_video_info", //21
"post_filter_hint", //22
"tone_mapping_info", //23
"scalability_info", //24
"sub_pic_scalable_layer", //25
"non_required_layer_rep", //26
"priority_layer_info", //27
"layers_not_present", //28
"layer_dependency_change", //29
"scalable_nesting", //30
"base_layer_temporal_hrd", //31
"quality_layer_integrity_check", //32
"redundant_pic_property", //33
"tl0_dep_rep_index", //34
"tl_switching_point", //35
"parallel_decoding_info", //36
"mvc_scalable_nesting", //37
"view_scalability_info", //38
"multiview_scene_info", //39
"multiview_acquisition_info", //40
"non_required_view_component", //41
"view_dependency_change", //42
"operation_points_not_present", //43
"base_view_temporal_hrd", //44
"frame_packing_arrangement", //45
"reserved_sei_message" // 46 or higher
};
#endif
void H264or5VideoStreamParser::analyze_sei_data(u_int8_t nal_unit_type) {
// Begin by making a copy of the NAL unit data, removing any 'emulation prevention' bytes:
u_int8_t sei[SEI_MAX_SIZE];
unsigned seiSize;
removeEmulationBytes(sei, sizeof sei, seiSize);
unsigned j = fHNumber == 264 ? 1 : 2; // skip the NAL unit header (1 byte for H.264; 2 bytes for H.265); we've already seen it
while (j < seiSize) {
unsigned payloadType = 0;
do {
payloadType += sei[j];
} while (sei[j++] == 255 && j < seiSize);
if (j >= seiSize) break;
unsigned payloadSize = 0;
do {
payloadSize += sei[j];
} while (sei[j++] == 255 && j < seiSize);
if (j >= seiSize) break;
#ifdef DEBUG
char const* description;
if (fHNumber == 264) {
unsigned descriptionNum = payloadType <= MAX_SEI_PAYLOAD_TYPE_DESCRIPTION_H264
? payloadType : MAX_SEI_PAYLOAD_TYPE_DESCRIPTION_H264;
description = sei_payloadType_description_h264[descriptionNum];
} else { // 265
description =
payloadType == 3 ? "filler_payload" :
payloadType == 4 ? "user_data_registered_itu_t_t35" :
payloadType == 5 ? "user_data_unregistered" :
payloadType == 17 ? "progressive_refinement_segment_end" :
payloadType == 22 ? "post_filter_hint" :
(payloadType == 132 && nal_unit_type == SUFFIX_SEI_NUT) ? "decoded_picture_hash" :
nal_unit_type == SUFFIX_SEI_NUT ? "reserved_sei_message" :
payloadType == 0 ? "buffering_period" :
payloadType == 1 ? "pic_timing" :
payloadType == 2 ? "pan_scan_rect" :
payloadType == 6 ? "recovery_point" :
payloadType == 9 ? "scene_info" :
payloadType == 15 ? "picture_snapshot" :
payloadType == 16 ? "progressive_refinement_segment_start" :
payloadType == 19 ? "film_grain_characteristics" :
payloadType == 23 ? "tone_mapping_info" :
payloadType == 45 ? "frame_packing_arrangement" :
payloadType == 47 ? "display_orientation" :
payloadType == 128 ? "structure_of_pictures_info" :
payloadType == 129 ? "active_parameter_sets" :
payloadType == 130 ? "decoding_unit_info" :
payloadType == 131 ? "temporal_sub_layer_zero_index" :
payloadType == 133 ? "scalable_nesting" :
payloadType == 134 ? "region_refresh_info" : "reserved_sei_message";
}
fprintf(stderr, "\tpayloadType %d (\"%s\"); payloadSize %d\n", payloadType, description, payloadSize);
#endif
analyze_sei_payload(payloadType, payloadSize, &sei[j]);
j += payloadSize;
}
}
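The `payloadType` and `payloadSize` loops above implement the SEI length coding shared by H.264 and H.265: a value is the sum of a run of 0xFF bytes plus one terminating byte below 0xFF. A standalone sketch of the same loop; `readSeiValue` is an illustrative helper, not a liveMedia function:

```cpp
#include <cstdint>
#include <cstddef>

// Accumulate 0xFF bytes until a byte < 0xFF terminates the value,
// advancing j past the bytes consumed (mirroring the do/while above).
static unsigned readSeiValue(uint8_t const* p, size_t size, size_t& j) {
  unsigned value = 0;
  do {
    value += p[j];
  } while (p[j++] == 255 && j < size);
  return value;
}
```

For example, the byte sequence 0xFF 0xFF 0x04 encodes the value 255 + 255 + 4 = 514.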
void H264or5VideoStreamParser
::analyze_sei_payload(unsigned payloadType, unsigned payloadSize, u_int8_t* payload) {
if (payloadType == 1/* pic_timing, for both H.264 and H.265 */) {
BitVector bv(payload, 0, 8*payloadSize);
DEBUG_TAB;
if (CpbDpbDelaysPresentFlag) {
unsigned cpb_removal_delay = bv.getBits(cpb_removal_delay_length_minus1 + 1);
DEBUG_PRINT(cpb_removal_delay);
unsigned dpb_output_delay = bv.getBits(dpb_output_delay_length_minus1 + 1);
DEBUG_PRINT(dpb_output_delay);
}
if (pic_struct_present_flag) {
unsigned pic_struct = bv.getBits(4);
DEBUG_PRINT(pic_struct);
// Use this to set "DeltaTfiDivisor" (which is used to compute the frame rate):
double prevDeltaTfiDivisor = DeltaTfiDivisor;
if (fHNumber == 264) {
DeltaTfiDivisor =
pic_struct == 0 ? 2.0 :
pic_struct <= 2 ? 1.0 :
pic_struct <= 4 ? 2.0 :
pic_struct <= 6 ? 3.0 :
pic_struct == 7 ? 4.0 :
pic_struct == 8 ? 6.0 :
2.0;
} else { // H.265
DeltaTfiDivisor =
pic_struct == 0 ? 2.0 :
pic_struct <= 2 ? 1.0 :
pic_struct <= 4 ? 2.0 :
pic_struct <= 6 ? 3.0 :
pic_struct == 7 ? 2.0 :
pic_struct == 8 ? 3.0 :
pic_struct <= 12 ? 1.0 :
2.0;
}
// If "DeltaTfiDivisor" has changed, and we've already computed the frame rate, then
// adjust it, based on the new value of "DeltaTfiDivisor":
if (DeltaTfiDivisor != prevDeltaTfiDivisor && fParsedFrameRate != 0.0) {
usingSource()->fFrameRate = fParsedFrameRate
= fParsedFrameRate*(prevDeltaTfiDivisor/DeltaTfiDivisor);
#ifdef DEBUG
fprintf(stderr, "Changed frame rate to %f fps\n", usingSource()->fFrameRate);
#endif
}
}
// Ignore the rest of the payload (timestamps) for now... #####
}
}
void H264or5VideoStreamParser::flushInput() {
  fHaveSeenFirstStartCode = False;
  fHaveSeenFirstByteOfNALUnit = False;

  StreamParser::flushInput();
}
#define NUM_NEXT_SLICE_HEADER_BYTES_TO_ANALYZE 12
unsigned H264or5VideoStreamParser::parse() {
try {
// The stream must start with a 0x00000001:
if (!fHaveSeenFirstStartCode) {
// Skip over any input bytes that precede the first 0x00000001:
u_int32_t first4Bytes;
while ((first4Bytes = test4Bytes()) != 0x00000001) {
get1Byte(); setParseState(); // ensures that we progress over bad data
}
skipBytes(4); // skip this initial code
setParseState();
fHaveSeenFirstStartCode = True; // from now on
}
if (fOutputStartCodeSize > 0 && curFrameSize() == 0 && !haveSeenEOF()) {
// Include a start code in the output:
save4Bytes(0x00000001);
}
// Then save everything up until the next 0x00000001 (4 bytes) or 0x000001 (3 bytes), or we hit EOF.
// Also make note of the first byte, because it contains the "nal_unit_type":
if (haveSeenEOF()) {
// We hit EOF the last time that we tried to parse this data, so we know that any remaining unparsed data
// forms a complete NAL unit, and that there's no 'start code' at the end:
unsigned remainingDataSize = totNumValidBytes() - curOffset();
#ifdef DEBUG
unsigned const trailingNALUnitSize = remainingDataSize;
#endif
while (remainingDataSize > 0) {
u_int8_t nextByte = get1Byte();
if (!fHaveSeenFirstByteOfNALUnit) {
fFirstByteOfNALUnit = nextByte;
fHaveSeenFirstByteOfNALUnit = True;
}
saveByte(nextByte);
--remainingDataSize;
}
#ifdef DEBUG
if (fHNumber == 264) {
u_int8_t nal_ref_idc = (fFirstByteOfNALUnit&0x60)>>5;
u_int8_t nal_unit_type = fFirstByteOfNALUnit&0x1F;
fprintf(stderr, "Parsed trailing %d-byte NAL-unit (nal_ref_idc: %d, nal_unit_type: %d (\"%s\"))\n",
trailingNALUnitSize, nal_ref_idc, nal_unit_type, nal_unit_type_description_h264[nal_unit_type]);
} else { // 265
u_int8_t nal_unit_type = (fFirstByteOfNALUnit&0x7E)>>1;
fprintf(stderr, "Parsed trailing %d-byte NAL-unit (nal_unit_type: %d (\"%s\"))\n",
trailingNALUnitSize, nal_unit_type, nal_unit_type_description_h265[nal_unit_type]);
}
#endif
(void)get1Byte(); // forces another read, which will cause EOF to get handled for real this time
return 0;
} else {
u_int32_t next4Bytes = test4Bytes();
if (!fHaveSeenFirstByteOfNALUnit) {
fFirstByteOfNALUnit = next4Bytes>>24;
fHaveSeenFirstByteOfNALUnit = True;
}
while (next4Bytes != 0x00000001 && (next4Bytes&0xFFFFFF00) != 0x00000100) {
// We save at least some of "next4Bytes".
if ((unsigned)(next4Bytes&0xFF) > 1) {
// Common case: 0x00000001 or 0x000001 definitely doesn't begin anywhere in "next4Bytes", so we save all of it:
save4Bytes(next4Bytes);
skipBytes(4);
} else {
// Save the first byte, and continue testing the rest:
saveByte(next4Bytes>>24);
skipBytes(1);
}
setParseState(); // ensures forward progress
next4Bytes = test4Bytes();
}
// Assert: next4Bytes starts with 0x00000001 or 0x000001, and we've saved all previous bytes (forming a complete NAL unit).
// Skip over these remaining bytes, up until the start of the next NAL unit:
if (next4Bytes == 0x00000001) {
skipBytes(4);
} else {
skipBytes(3);
}
}
fHaveSeenFirstByteOfNALUnit = False; // for the next NAL unit that we'll parse
u_int8_t nal_unit_type;
if (fHNumber == 264) {
nal_unit_type = fFirstByteOfNALUnit&0x1F;
#ifdef DEBUG
u_int8_t nal_ref_idc = (fFirstByteOfNALUnit&0x60)>>5;
fprintf(stderr, "Parsed %d-byte NAL-unit (nal_ref_idc: %d, nal_unit_type: %d (\"%s\"))\n",
curFrameSize()-fOutputStartCodeSize, nal_ref_idc, nal_unit_type, nal_unit_type_description_h264[nal_unit_type]);
#endif
} else { // 265
nal_unit_type = (fFirstByteOfNALUnit&0x7E)>>1;
#ifdef DEBUG
fprintf(stderr, "Parsed %d-byte NAL-unit (nal_unit_type: %d (\"%s\"))\n",
curFrameSize()-fOutputStartCodeSize, nal_unit_type, nal_unit_type_description_h265[nal_unit_type]);
#endif
}
// Now that we have found (& copied) a NAL unit, process it if it's of special interest to us:
if (isVPS(nal_unit_type)) { // Video parameter set
// First, save a copy of this NAL unit, in case the downstream object wants to see it:
usingSource()->saveCopyOfVPS(fStartOfFrame + fOutputStartCodeSize, curFrameSize() - fOutputStartCodeSize);
if (fParsedFrameRate == 0.0) {
// We haven't yet parsed a frame rate from the stream.
// So parse this NAL unit to check whether frame rate information is present:
unsigned num_units_in_tick, time_scale;
analyze_video_parameter_set_data(num_units_in_tick, time_scale);
if (time_scale > 0 && num_units_in_tick > 0) {
usingSource()->fFrameRate = fParsedFrameRate
= time_scale/(DeltaTfiDivisor*num_units_in_tick);
#ifdef DEBUG
fprintf(stderr, "Set frame rate to %f fps\n", usingSource()->fFrameRate);
#endif
} else {
#ifdef DEBUG
fprintf(stderr, "\tThis \"Video Parameter Set\" NAL unit contained no frame rate information, so we use a default frame rate of %f fps\n", usingSource()->fFrameRate);
#endif
}
}
} else if (isSPS(nal_unit_type)) { // Sequence parameter set
// First, save a copy of this NAL unit, in case the downstream object wants to see it:
usingSource()->saveCopyOfSPS(fStartOfFrame + fOutputStartCodeSize, curFrameSize() - fOutputStartCodeSize);
if (fParsedFrameRate == 0.0) {
// We haven't yet parsed a frame rate from the stream.
// So parse this NAL unit to check whether frame rate information is present:
unsigned num_units_in_tick, time_scale;
analyze_seq_parameter_set_data(num_units_in_tick, time_scale);
if (time_scale > 0 && num_units_in_tick > 0) {
usingSource()->fFrameRate = fParsedFrameRate
= time_scale/(DeltaTfiDivisor*num_units_in_tick);
#ifdef DEBUG
fprintf(stderr, "Set frame rate to %f fps\n", usingSource()->fFrameRate);
#endif
} else {
#ifdef DEBUG
fprintf(stderr, "\tThis \"Sequence Parameter Set\" NAL unit contained no frame rate information, so we use a default frame rate of %f fps\n", usingSource()->fFrameRate);
#endif
}
}
} else if (isPPS(nal_unit_type)) { // Picture parameter set
// Save a copy of this NAL unit, in case the downstream object wants to see it:
usingSource()->saveCopyOfPPS(fStartOfFrame + fOutputStartCodeSize, curFrameSize() - fOutputStartCodeSize);
} else if (isSEI(nal_unit_type)) { // Supplemental enhancement information (SEI)
analyze_sei_data(nal_unit_type);
// Later, perhaps adjust "fPresentationTime" if we saw a "pic_timing" SEI payload??? #####
}
usingSource()->setPresentationTime();
#ifdef DEBUG
unsigned long secs = (unsigned long)usingSource()->fPresentationTime.tv_sec;
unsigned uSecs = (unsigned)usingSource()->fPresentationTime.tv_usec;
fprintf(stderr, "\tPresentation time: %lu.%06u\n", secs, uSecs);
#endif
// Now, check whether this NAL unit ends an 'access unit'.
// (RTP streamers need to know this in order to figure out whether or not to set the "M" bit.)
Boolean thisNALUnitEndsAccessUnit;
if (haveSeenEOF() || isEOF(nal_unit_type)) {
// There is no next NAL unit, so we assume that this one ends the current 'access unit':
thisNALUnitEndsAccessUnit = True;
} else if (usuallyBeginsAccessUnit(nal_unit_type)) {
// These NAL units usually *begin* an access unit, so assume that they don't end one here:
thisNALUnitEndsAccessUnit = False;
} else {
// We need to check the *next* NAL unit to figure out whether
// the current NAL unit ends an 'access unit':
u_int8_t firstBytesOfNextNALUnit[3];
testBytes(firstBytesOfNextNALUnit, 3);
u_int8_t const& next_nal_unit_type = fHNumber == 264
? (firstBytesOfNextNALUnit[0]&0x1F) : ((firstBytesOfNextNALUnit[0]&0x7E)>>1);
if (isVCL(next_nal_unit_type)) {
// The high-order bit of the byte after the "nal_unit_header" tells us whether it's
// the start of a new 'access unit' (and thus the current NAL unit ends an 'access unit'):
u_int8_t const byteAfter_nal_unit_header
= fHNumber == 264 ? firstBytesOfNextNALUnit[1] : firstBytesOfNextNALUnit[2];
thisNALUnitEndsAccessUnit = (byteAfter_nal_unit_header&0x80) != 0;
} else if (usuallyBeginsAccessUnit(next_nal_unit_type)) {
// The next NAL unit's type is one that usually appears at the start of an 'access unit',
// so we assume that the current NAL unit ends an 'access unit':
thisNALUnitEndsAccessUnit = True;
} else {
// The next NAL unit definitely doesn't start a new 'access unit',
// which means that the current NAL unit doesn't end one:
thisNALUnitEndsAccessUnit = False;
}
}
if (thisNALUnitEndsAccessUnit) {
#ifdef DEBUG
fprintf(stderr, "*****This NAL unit ends the current access unit*****\n");
#endif
usingSource()->fPictureEndMarker = True;
++usingSource()->fPictureCount;
// Note that the presentation time for the next NAL unit will be different:
struct timeval& nextPT = usingSource()->fNextPresentationTime; // alias
nextPT = usingSource()->fPresentationTime;
double nextFraction = nextPT.tv_usec/1000000.0 + 1/usingSource()->fFrameRate;
unsigned nextSecsIncrement = (long)nextFraction;
nextPT.tv_sec += (long)nextSecsIncrement;
nextPT.tv_usec = (long)((nextFraction - nextSecsIncrement)*1000000);
}
setParseState();
return curFrameSize();
} catch (int /*e*/) {
#ifdef DEBUG
fprintf(stderr, "H264or5VideoStreamParser::parse() EXCEPTION (This is normal behavior - *not* an error)\n");
#endif
return 0; // the parsing got interrupted
}
}
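The presentation-time update at the end of `parse()` adds one frame duration (1/`fFrameRate` seconds) to a `timeval`, carrying any whole-second overflow of the fractional part into `tv_sec`. A standalone sketch of that arithmetic; `TimeVal` and `advanceByOneFrame` are illustrative stand-ins for `struct timeval` and the inline computation:

```cpp
#include <cassert>

struct TimeVal { long tv_sec; long tv_usec; };

static TimeVal advanceByOneFrame(TimeVal pt, double frameRate) {
  // Fractional seconds of the current time, plus one frame duration:
  double nextFraction = pt.tv_usec / 1000000.0 + 1.0 / frameRate;
  long secsIncrement = (long)nextFraction;  // whole seconds to carry over
  pt.tv_sec += secsIncrement;
  pt.tv_usec = (long)((nextFraction - secsIncrement) * 1000000);
  return pt;
}
```

At 4 fps (a 0.25 s frame duration), a time of 10.750000 s advances to 11.000000 s, with the microseconds field wrapping to zero.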
unsigned removeH264or5EmulationBytes(u_int8_t* to, unsigned toMaxSize,
                                     u_int8_t* from, unsigned fromSize) {
  unsigned toSize = 0;
  unsigned i = 0;
  while (i < fromSize && toSize+1 < toMaxSize) {
    if (i+2 < fromSize && from[i] == 0 && from[i+1] == 0 && from[i+2] == 3) {
      to[toSize] = to[toSize+1] = 0;
      toSize += 2;
      i += 3;
    } else {
      to[toSize] = from[i];
      toSize += 1;
      i += 1;
    }
  }

  return toSize;
}
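A usage sketch for the emulation-byte removal above: within a NAL unit, the three-byte pattern 00 00 03 carries an 'emulation prevention' 0x03 that is dropped while the two zero bytes are kept. The helper below restates the same loop in self-contained form purely for demonstration; it is not part of the library:

```cpp
#include <cstdint>
#include <cassert>

static unsigned demoRemoveEmulationBytes(uint8_t* to, unsigned toMaxSize,
                                         uint8_t const* from, unsigned fromSize) {
  unsigned toSize = 0, i = 0;
  while (i < fromSize && toSize + 1 < toMaxSize) {
    if (i + 2 < fromSize && from[i] == 0 && from[i+1] == 0 && from[i+2] == 3) {
      to[toSize] = to[toSize+1] = 0;  // keep the 00 00, drop the 03
      toSize += 2;
      i += 3;
    } else {
      to[toSize++] = from[i++];
    }
  }
  return toSize;
}
```

For instance, the input 67 00 00 03 01 42 becomes 67 00 00 01 42.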
live/liveMedia/H264VideoFileServerMediaSubsession.cpp

/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A 'ServerMediaSubsession' object that creates new, unicast, "RTPSink"s
// on demand, from a H264 video file.
// Implementation
#include "H264VideoFileServerMediaSubsession.hh"
#include "H264VideoRTPSink.hh"
#include "ByteStreamFileSource.hh"
#include "H264VideoStreamFramer.hh"
H264VideoFileServerMediaSubsession*
H264VideoFileServerMediaSubsession::createNew(UsageEnvironment& env,
char const* fileName,
Boolean reuseFirstSource) {
return new H264VideoFileServerMediaSubsession(env, fileName, reuseFirstSource);
}
H264VideoFileServerMediaSubsession::H264VideoFileServerMediaSubsession(UsageEnvironment& env,
char const* fileName, Boolean reuseFirstSource)
: FileServerMediaSubsession(env, fileName, reuseFirstSource),
fAuxSDPLine(NULL), fDoneFlag(0), fDummyRTPSink(NULL) {
}
H264VideoFileServerMediaSubsession::~H264VideoFileServerMediaSubsession() {
delete[] fAuxSDPLine;
}
static void afterPlayingDummy(void* clientData) {
  H264VideoFileServerMediaSubsession* subsess = (H264VideoFileServerMediaSubsession*)clientData;
  subsess->afterPlayingDummy1();
}

void H264VideoFileServerMediaSubsession::afterPlayingDummy1() {
  // Unschedule any pending 'checking' task:
  envir().taskScheduler().unscheduleDelayedTask(nextTask());

  // Signal the event loop that we're done:
  setDoneFlag();
}
static void checkForAuxSDPLine(void* clientData) {
H264VideoFileServerMediaSubsession* subsess = (H264VideoFileServerMediaSubsession*)clientData;
subsess->checkForAuxSDPLine1();
}
void H264VideoFileServerMediaSubsession::checkForAuxSDPLine1() {
char const* dasl;
if (fAuxSDPLine != NULL) {
// Signal the event loop that we're done:
setDoneFlag();
} else if (fDummyRTPSink != NULL && (dasl = fDummyRTPSink->auxSDPLine()) != NULL) {
fAuxSDPLine = strDup(dasl);
fDummyRTPSink = NULL;
// Signal the event loop that we're done:
setDoneFlag();
} else if (!fDoneFlag) {
// try again after a brief delay:
int uSecsToDelay = 100000; // 100 ms
nextTask() = envir().taskScheduler().scheduleDelayedTask(uSecsToDelay,
(TaskFunc*)checkForAuxSDPLine, this);
}
}
char const* H264VideoFileServerMediaSubsession::getAuxSDPLine(RTPSink* rtpSink, FramedSource* inputSource) {
if (fAuxSDPLine != NULL) return fAuxSDPLine; // it's already been set up (for a previous client)
if (fDummyRTPSink == NULL) { // we're not already setting it up for another, concurrent stream
// Note: For H264 video files, the 'config' information ("profile-level-id" and "sprop-parameter-sets") isn't known
// until we start reading the file. This means that "rtpSink"s "auxSDPLine()" will be NULL initially,
// and we need to start reading data from our file until this changes.
fDummyRTPSink = rtpSink;
// Start reading the file:
fDummyRTPSink->startPlaying(*inputSource, afterPlayingDummy, this);
// Check whether the sink's 'auxSDPLine()' is ready:
checkForAuxSDPLine(this);
}
envir().taskScheduler().doEventLoop(&fDoneFlag);
return fAuxSDPLine;
}
FramedSource* H264VideoFileServerMediaSubsession::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
estBitrate = 500; // kbps, estimate
// Create the video source:
ByteStreamFileSource* fileSource = ByteStreamFileSource::createNew(envir(), fFileName);
if (fileSource == NULL) return NULL;
fFileSize = fileSource->fileSize();
// Create a framer for the Video Elementary Stream:
return H264VideoStreamFramer::createNew(envir(), fileSource);
}
RTPSink* H264VideoFileServerMediaSubsession
::createNewRTPSink(Groupsock* rtpGroupsock,
unsigned char rtpPayloadTypeIfDynamic,
FramedSource* /*inputSource*/) {
return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
}
live/liveMedia/H264VideoFileSink.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// H.264 Video File sinks
// Implementation
#include "H264VideoFileSink.hh"
#include "OutputFile.hh"
////////// H264VideoFileSink //////////
H264VideoFileSink
::H264VideoFileSink(UsageEnvironment& env, FILE* fid,
char const* sPropParameterSetsStr,
unsigned bufferSize, char const* perFrameFileNamePrefix)
: H264or5VideoFileSink(env, fid, bufferSize, perFrameFileNamePrefix,
sPropParameterSetsStr, NULL, NULL) {
}
H264VideoFileSink::~H264VideoFileSink() {
}
H264VideoFileSink*
H264VideoFileSink::createNew(UsageEnvironment& env, char const* fileName,
char const* sPropParameterSetsStr,
unsigned bufferSize, Boolean oneFilePerFrame) {
do {
FILE* fid;
char const* perFrameFileNamePrefix;
if (oneFilePerFrame) {
// Create the fid for each frame
fid = NULL;
perFrameFileNamePrefix = fileName;
} else {
// Normal case: create the fid once
fid = OpenOutputFile(env, fileName);
if (fid == NULL) break;
perFrameFileNamePrefix = NULL;
}
return new H264VideoFileSink(env, fid, sPropParameterSetsStr, bufferSize, perFrameFileNamePrefix);
} while (0);
return NULL;
}
live/liveMedia/H264VideoRTPSink.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// RTP sink for H.264 video (RFC 3984)
// Implementation
#include "H264VideoRTPSink.hh"
#include "H264VideoStreamFramer.hh"
#include "Base64.hh"
#include "H264VideoRTPSource.hh" // for "parseSPropParameterSets()"
////////// H264VideoRTPSink implementation //////////
H264VideoRTPSink
::H264VideoRTPSink(UsageEnvironment& env, Groupsock* RTPgs, unsigned char rtpPayloadFormat,
u_int8_t const* sps, unsigned spsSize, u_int8_t const* pps, unsigned ppsSize)
: H264or5VideoRTPSink(264, env, RTPgs, rtpPayloadFormat,
NULL, 0, sps, spsSize, pps, ppsSize) {
}
H264VideoRTPSink::~H264VideoRTPSink() {
}
H264VideoRTPSink* H264VideoRTPSink
::createNew(UsageEnvironment& env, Groupsock* RTPgs, unsigned char rtpPayloadFormat) {
return new H264VideoRTPSink(env, RTPgs, rtpPayloadFormat);
}
H264VideoRTPSink* H264VideoRTPSink
::createNew(UsageEnvironment& env, Groupsock* RTPgs, unsigned char rtpPayloadFormat,
u_int8_t const* sps, unsigned spsSize, u_int8_t const* pps, unsigned ppsSize) {
return new H264VideoRTPSink(env, RTPgs, rtpPayloadFormat, sps, spsSize, pps, ppsSize);
}
H264VideoRTPSink* H264VideoRTPSink
::createNew(UsageEnvironment& env, Groupsock* RTPgs, unsigned char rtpPayloadFormat,
char const* sPropParameterSetsStr) {
u_int8_t* sps = NULL; unsigned spsSize = 0;
u_int8_t* pps = NULL; unsigned ppsSize = 0;
unsigned numSPropRecords;
SPropRecord* sPropRecords = parseSPropParameterSets(sPropParameterSetsStr, numSPropRecords);
for (unsigned i = 0; i < numSPropRecords; ++i) {
if (sPropRecords[i].sPropLength == 0) continue; // bad data
u_int8_t nal_unit_type = (sPropRecords[i].sPropBytes[0])&0x1F;
if (nal_unit_type == 7/*SPS*/) {
sps = sPropRecords[i].sPropBytes;
spsSize = sPropRecords[i].sPropLength;
} else if (nal_unit_type == 8/*PPS*/) {
pps = sPropRecords[i].sPropBytes;
ppsSize = sPropRecords[i].sPropLength;
}
}
H264VideoRTPSink* result
= new H264VideoRTPSink(env, RTPgs, rtpPayloadFormat, sps, spsSize, pps, ppsSize);
delete[] sPropRecords;
return result;
}
Boolean H264VideoRTPSink::sourceIsCompatibleWithUs(MediaSource& source) {
// Our source must be an appropriate framer:
return source.isH264VideoStreamFramer();
}
char const* H264VideoRTPSink::auxSDPLine() {
// Generate a new "a=fmtp:" line each time, using our SPS and PPS (if we have them),
// otherwise parameters from our framer source (in case they've changed since the last time that
// we were called):
H264or5VideoStreamFramer* framerSource = NULL;
u_int8_t* vpsDummy = NULL; unsigned vpsDummySize = 0;
u_int8_t* sps = fSPS; unsigned spsSize = fSPSSize;
u_int8_t* pps = fPPS; unsigned ppsSize = fPPSSize;
if (sps == NULL || pps == NULL) {
// We need to get SPS and PPS from our framer source:
if (fOurFragmenter == NULL) return NULL; // we don't yet have a fragmenter (and therefore not a source)
framerSource = (H264or5VideoStreamFramer*)(fOurFragmenter->inputSource());
if (framerSource == NULL) return NULL; // we don't yet have a source
framerSource->getVPSandSPSandPPS(vpsDummy, vpsDummySize, sps, spsSize, pps, ppsSize);
if (sps == NULL || pps == NULL) return NULL; // our source isn't ready
}
// Set up the "a=fmtp:" SDP line for this stream:
u_int8_t* spsWEB = new u_int8_t[spsSize]; // "WEB" means "Without Emulation Bytes"
unsigned spsWEBSize = removeH264or5EmulationBytes(spsWEB, spsSize, sps, spsSize);
if (spsWEBSize < 4) { // Bad SPS size => assume our source isn't ready
delete[] spsWEB;
return NULL;
}
u_int32_t profileLevelId = (spsWEB[1]<<16) | (spsWEB[2]<<8) | spsWEB[3];
delete[] spsWEB;
char* sps_base64 = base64Encode((char*)sps, spsSize);
char* pps_base64 = base64Encode((char*)pps, ppsSize);
char const* fmtpFmt =
"a=fmtp:%d packetization-mode=1"
";profile-level-id=%06X"
";sprop-parameter-sets=%s,%s\r\n";
unsigned fmtpFmtSize = strlen(fmtpFmt)
+ 3 /* max char len */
+ 6 /* 3 bytes in hex */
+ strlen(sps_base64) + strlen(pps_base64);
char* fmtp = new char[fmtpFmtSize];
sprintf(fmtp, fmtpFmt,
rtpPayloadType(),
profileLevelId,
sps_base64, pps_base64);
delete[] sps_base64;
delete[] pps_base64;
delete[] fFmtpSDPLine; fFmtpSDPLine = fmtp;
return fFmtpSDPLine;
}
live/liveMedia/H264VideoRTPSource.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// H.264 Video RTP Sources
// Implementation
#include "H264VideoRTPSource.hh"
#include "Base64.hh"
////////// H264BufferedPacket and H264BufferedPacketFactory //////////
class H264BufferedPacket: public BufferedPacket {
public:
H264BufferedPacket(H264VideoRTPSource& ourSource);
virtual ~H264BufferedPacket();
private: // redefined virtual functions
virtual unsigned nextEnclosedFrameSize(unsigned char*& framePtr,
unsigned dataSize);
private:
H264VideoRTPSource& fOurSource;
};
class H264BufferedPacketFactory: public BufferedPacketFactory {
private: // redefined virtual functions
virtual BufferedPacket* createNewPacket(MultiFramedRTPSource* ourSource);
};
///////// H264VideoRTPSource implementation ////////
H264VideoRTPSource*
H264VideoRTPSource::createNew(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency) {
return new H264VideoRTPSource(env, RTPgs, rtpPayloadFormat,
rtpTimestampFrequency);
}
H264VideoRTPSource
::H264VideoRTPSource(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency)
: MultiFramedRTPSource(env, RTPgs, rtpPayloadFormat, rtpTimestampFrequency,
new H264BufferedPacketFactory) {
}
H264VideoRTPSource::~H264VideoRTPSource() {
}
Boolean H264VideoRTPSource
::processSpecialHeader(BufferedPacket* packet,
unsigned& resultSpecialHeaderSize) {
unsigned char* headerStart = packet->data();
unsigned packetSize = packet->dataSize();
unsigned numBytesToSkip;
// Check the 'nal_unit_type' for special 'aggregation' or 'fragmentation' packets:
if (packetSize < 1) return False;
fCurPacketNALUnitType = (headerStart[0]&0x1F);
switch (fCurPacketNALUnitType) {
case 24: { // STAP-A
numBytesToSkip = 1; // discard the type byte
break;
}
case 25: case 26: case 27: { // STAP-B, MTAP16, or MTAP24
numBytesToSkip = 3; // discard the type byte, and the initial DON
break;
}
case 28: case 29: { // FU-A or FU-B
// For these NALUs, the first two bytes are the FU indicator and the FU header.
// If the start bit is set, we reconstruct the original NAL header into byte 1:
if (packetSize < 2) return False;
unsigned char startBit = headerStart[1]&0x80;
unsigned char endBit = headerStart[1]&0x40;
if (startBit) {
fCurrentPacketBeginsFrame = True;
headerStart[1] = (headerStart[0]&0xE0)|(headerStart[1]&0x1F);
numBytesToSkip = 1;
} else {
// The start bit is not set, so we skip both the FU indicator and header:
fCurrentPacketBeginsFrame = False;
numBytesToSkip = 2;
}
fCurrentPacketCompletesFrame = (endBit != 0);
break;
}
default: {
// This packet contains one complete NAL unit:
fCurrentPacketBeginsFrame = fCurrentPacketCompletesFrame = True;
numBytesToSkip = 0;
break;
}
}
resultSpecialHeaderSize = numBytesToSkip;
return True;
}
char const* H264VideoRTPSource::MIMEtype() const {
return "video/H264";
}
SPropRecord* parseSPropParameterSets(char const* sPropParameterSetsStr,
// result parameter:
unsigned& numSPropRecords) {
// Make a copy of the input string, so we can replace the commas with '\0's:
char* inStr = strDup(sPropParameterSetsStr);
if (inStr == NULL) {
numSPropRecords = 0;
return NULL;
}
// Count the number of commas (and thus the number of parameter sets):
numSPropRecords = 1;
char* s;
for (s = inStr; *s != '\0'; ++s) {
if (*s == ',') {
++numSPropRecords;
*s = '\0';
}
}
// Allocate and fill in the result array:
SPropRecord* resultArray = new SPropRecord[numSPropRecords];
s = inStr;
for (unsigned i = 0; i < numSPropRecords; ++i) {
resultArray[i].sPropBytes = base64Decode(s, resultArray[i].sPropLength);
s += strlen(s) + 1;
}
delete[] inStr;
return resultArray;
}
////////// H264BufferedPacket and H264BufferedPacketFactory implementation //////////
H264BufferedPacket::H264BufferedPacket(H264VideoRTPSource& ourSource)
: fOurSource(ourSource) {
}
H264BufferedPacket::~H264BufferedPacket() {
}
unsigned H264BufferedPacket
::nextEnclosedFrameSize(unsigned char*& framePtr, unsigned dataSize) {
unsigned resultNALUSize = 0; // if an error occurs
switch (fOurSource.fCurPacketNALUnitType) {
case 24: case 25: { // STAP-A or STAP-B
// The first two bytes are NALU size:
if (dataSize < 2) break;
resultNALUSize = (framePtr[0]<<8)|framePtr[1];
framePtr += 2;
break;
}
case 26: { // MTAP16
// The first two bytes are NALU size. The next three are the DOND and TS offset:
if (dataSize < 5) break;
resultNALUSize = (framePtr[0]<<8)|framePtr[1];
framePtr += 5;
break;
}
case 27: { // MTAP24
// The first two bytes are NALU size. The next four are the DOND and TS offset:
if (dataSize < 6) break;
resultNALUSize = (framePtr[0]<<8)|framePtr[1];
framePtr += 6;
break;
}
default: {
// Common case: We use the entire packet data:
return dataSize;
}
}
return (resultNALUSize <= dataSize) ? resultNALUSize : dataSize;
}
BufferedPacket* H264BufferedPacketFactory
::createNewPacket(MultiFramedRTPSource* ourSource) {
return new H264BufferedPacket((H264VideoRTPSource&)(*ourSource));
}
live/liveMedia/H264VideoStreamDiscreteFramer.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A simplified version of "H264VideoStreamFramer" that takes only complete,
// discrete frames (rather than an arbitrary byte stream) as input.
// This avoids the parsing and data copying overhead of the full
// "H264VideoStreamFramer".
// Implementation
#include "H264VideoStreamDiscreteFramer.hh"
H264VideoStreamDiscreteFramer*
H264VideoStreamDiscreteFramer::createNew(UsageEnvironment& env, FramedSource* inputSource) {
return new H264VideoStreamDiscreteFramer(env, inputSource);
}
H264VideoStreamDiscreteFramer
::H264VideoStreamDiscreteFramer(UsageEnvironment& env, FramedSource* inputSource)
: H264or5VideoStreamDiscreteFramer(264, env, inputSource) {
}
H264VideoStreamDiscreteFramer::~H264VideoStreamDiscreteFramer() {
}
Boolean H264VideoStreamDiscreteFramer::isH264VideoStreamFramer() const {
return True;
}
live/liveMedia/H264VideoStreamFramer.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A filter that breaks up a H.264 Video Elementary Stream into NAL units.
// Implementation
#include "H264VideoStreamFramer.hh"
H264VideoStreamFramer* H264VideoStreamFramer
::createNew(UsageEnvironment& env, FramedSource* inputSource, Boolean includeStartCodeInOutput) {
return new H264VideoStreamFramer(env, inputSource, True, includeStartCodeInOutput);
}
H264VideoStreamFramer
::H264VideoStreamFramer(UsageEnvironment& env, FramedSource* inputSource, Boolean createParser, Boolean includeStartCodeInOutput)
: H264or5VideoStreamFramer(264, env, inputSource, createParser, includeStartCodeInOutput) {
}
H264VideoStreamFramer::~H264VideoStreamFramer() {
}
Boolean H264VideoStreamFramer::isH264VideoStreamFramer() const {
return True;
}
live/liveMedia/H265VideoFileServerMediaSubsession.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A 'ServerMediaSubsession' object that creates new, unicast, "RTPSink"s
// on demand, from a H265 video file.
// Implementation
#include "H265VideoFileServerMediaSubsession.hh"
#include "H265VideoRTPSink.hh"
#include "ByteStreamFileSource.hh"
#include "H265VideoStreamFramer.hh"
H265VideoFileServerMediaSubsession*
H265VideoFileServerMediaSubsession::createNew(UsageEnvironment& env,
char const* fileName,
Boolean reuseFirstSource) {
return new H265VideoFileServerMediaSubsession(env, fileName, reuseFirstSource);
}
H265VideoFileServerMediaSubsession::H265VideoFileServerMediaSubsession(UsageEnvironment& env,
char const* fileName, Boolean reuseFirstSource)
: FileServerMediaSubsession(env, fileName, reuseFirstSource),
fAuxSDPLine(NULL), fDoneFlag(0), fDummyRTPSink(NULL) {
}
H265VideoFileServerMediaSubsession::~H265VideoFileServerMediaSubsession() {
delete[] fAuxSDPLine;
}
static void afterPlayingDummy(void* clientData) {
H265VideoFileServerMediaSubsession* subsess = (H265VideoFileServerMediaSubsession*)clientData;
subsess->afterPlayingDummy1();
}
void H265VideoFileServerMediaSubsession::afterPlayingDummy1() {
// Unschedule any pending 'checking' task:
envir().taskScheduler().unscheduleDelayedTask(nextTask());
// Signal the event loop that we're done:
setDoneFlag();
}
static void checkForAuxSDPLine(void* clientData) {
H265VideoFileServerMediaSubsession* subsess = (H265VideoFileServerMediaSubsession*)clientData;
subsess->checkForAuxSDPLine1();
}
void H265VideoFileServerMediaSubsession::checkForAuxSDPLine1() {
char const* dasl;
if (fAuxSDPLine != NULL) {
// Signal the event loop that we're done:
setDoneFlag();
} else if (fDummyRTPSink != NULL && (dasl = fDummyRTPSink->auxSDPLine()) != NULL) {
fAuxSDPLine = strDup(dasl);
fDummyRTPSink = NULL;
// Signal the event loop that we're done:
setDoneFlag();
} else if (!fDoneFlag) {
// try again after a brief delay:
int uSecsToDelay = 100000; // 100 ms
nextTask() = envir().taskScheduler().scheduleDelayedTask(uSecsToDelay,
(TaskFunc*)checkForAuxSDPLine, this);
}
}
char const* H265VideoFileServerMediaSubsession::getAuxSDPLine(RTPSink* rtpSink, FramedSource* inputSource) {
if (fAuxSDPLine != NULL) return fAuxSDPLine; // it's already been set up (for a previous client)
if (fDummyRTPSink == NULL) { // we're not already setting it up for another, concurrent stream
// Note: For H265 video files, the 'config' information (used for several payload-format
// specific parameters in the SDP description) isn't known until we start reading the file.
// This means that "rtpSink"s "auxSDPLine()" will be NULL initially,
// and we need to start reading data from our file until this changes.
fDummyRTPSink = rtpSink;
// Start reading the file:
fDummyRTPSink->startPlaying(*inputSource, afterPlayingDummy, this);
// Check whether the sink's 'auxSDPLine()' is ready:
checkForAuxSDPLine(this);
}
envir().taskScheduler().doEventLoop(&fDoneFlag);
return fAuxSDPLine;
}
FramedSource* H265VideoFileServerMediaSubsession::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
estBitrate = 500; // kbps, estimate
// Create the video source:
ByteStreamFileSource* fileSource = ByteStreamFileSource::createNew(envir(), fFileName);
if (fileSource == NULL) return NULL;
fFileSize = fileSource->fileSize();
// Create a framer for the Video Elementary Stream:
return H265VideoStreamFramer::createNew(envir(), fileSource);
}
RTPSink* H265VideoFileServerMediaSubsession
::createNewRTPSink(Groupsock* rtpGroupsock,
unsigned char rtpPayloadTypeIfDynamic,
FramedSource* /*inputSource*/) {
return H265VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
}
live/liveMedia/H265VideoFileSink.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// H.265 Video File sinks
// Implementation
#include "H265VideoFileSink.hh"
#include "OutputFile.hh"
////////// H265VideoFileSink //////////
H265VideoFileSink
::H265VideoFileSink(UsageEnvironment& env, FILE* fid,
char const* sPropVPSStr,
char const* sPropSPSStr,
char const* sPropPPSStr,
unsigned bufferSize, char const* perFrameFileNamePrefix)
: H264or5VideoFileSink(env, fid, bufferSize, perFrameFileNamePrefix,
sPropVPSStr, sPropSPSStr, sPropPPSStr) {
}
H265VideoFileSink::~H265VideoFileSink() {
}
H265VideoFileSink*
H265VideoFileSink::createNew(UsageEnvironment& env, char const* fileName,
char const* sPropVPSStr,
char const* sPropSPSStr,
char const* sPropPPSStr,
unsigned bufferSize, Boolean oneFilePerFrame) {
do {
FILE* fid;
char const* perFrameFileNamePrefix;
if (oneFilePerFrame) {
// Create the fid for each frame
fid = NULL;
perFrameFileNamePrefix = fileName;
} else {
// Normal case: create the fid once
fid = OpenOutputFile(env, fileName);
if (fid == NULL) break;
perFrameFileNamePrefix = NULL;
}
return new H265VideoFileSink(env, fid, sPropVPSStr, sPropSPSStr, sPropPPSStr, bufferSize, perFrameFileNamePrefix);
} while (0);
return NULL;
}
live/liveMedia/H265VideoRTPSink.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// RTP sink for H.265 video
// Implementation
#include "H265VideoRTPSink.hh"
#include "H265VideoStreamFramer.hh"
#include "Base64.hh"
#include "BitVector.hh"
#include "H264VideoRTPSource.hh" // for "parseSPropParameterSets()"
////////// H265VideoRTPSink implementation //////////
H265VideoRTPSink
::H265VideoRTPSink(UsageEnvironment& env, Groupsock* RTPgs, unsigned char rtpPayloadFormat,
u_int8_t const* vps, unsigned vpsSize,
u_int8_t const* sps, unsigned spsSize,
u_int8_t const* pps, unsigned ppsSize)
: H264or5VideoRTPSink(265, env, RTPgs, rtpPayloadFormat,
vps, vpsSize, sps, spsSize, pps, ppsSize) {
}
H265VideoRTPSink::~H265VideoRTPSink() {
}
H265VideoRTPSink* H265VideoRTPSink
::createNew(UsageEnvironment& env, Groupsock* RTPgs, unsigned char rtpPayloadFormat) {
return new H265VideoRTPSink(env, RTPgs, rtpPayloadFormat);
}
H265VideoRTPSink* H265VideoRTPSink
::createNew(UsageEnvironment& env, Groupsock* RTPgs, unsigned char rtpPayloadFormat,
u_int8_t const* vps, unsigned vpsSize,
u_int8_t const* sps, unsigned spsSize,
u_int8_t const* pps, unsigned ppsSize) {
return new H265VideoRTPSink(env, RTPgs, rtpPayloadFormat,
vps, vpsSize, sps, spsSize, pps, ppsSize);
}
H265VideoRTPSink* H265VideoRTPSink
::createNew(UsageEnvironment& env, Groupsock* RTPgs, unsigned char rtpPayloadFormat,
char const* sPropVPSStr, char const* sPropSPSStr, char const* sPropPPSStr) {
u_int8_t* vps = NULL; unsigned vpsSize = 0;
u_int8_t* sps = NULL; unsigned spsSize = 0;
u_int8_t* pps = NULL; unsigned ppsSize = 0;
// Parse each 'sProp' string, extracting and then classifying the NAL unit(s) from each one.
// We're 'liberal in what we accept'; it's OK if the strings don't contain the NAL unit type
// implied by their names (or if one or more of the strings encode multiple NAL units).
SPropRecord* sPropRecords[3];
unsigned numSPropRecords[3];
sPropRecords[0] = parseSPropParameterSets(sPropVPSStr, numSPropRecords[0]);
sPropRecords[1] = parseSPropParameterSets(sPropSPSStr, numSPropRecords[1]);
sPropRecords[2] = parseSPropParameterSets(sPropPPSStr, numSPropRecords[2]);
for (unsigned j = 0; j < 3; ++j) {
SPropRecord* records = sPropRecords[j];
unsigned numRecords = numSPropRecords[j];
for (unsigned i = 0; i < numRecords; ++i) {
if (records[i].sPropLength == 0) continue; // bad data
u_int8_t nal_unit_type = ((records[i].sPropBytes[0])&0x7E)>>1;
if (nal_unit_type == 32/*VPS*/) {
vps = records[i].sPropBytes;
vpsSize = records[i].sPropLength;
} else if (nal_unit_type == 33/*SPS*/) {
sps = records[i].sPropBytes;
spsSize = records[i].sPropLength;
} else if (nal_unit_type == 34/*PPS*/) {
pps = records[i].sPropBytes;
ppsSize = records[i].sPropLength;
}
}
}
H265VideoRTPSink* result = new H265VideoRTPSink(env, RTPgs, rtpPayloadFormat,
vps, vpsSize, sps, spsSize, pps, ppsSize);
delete[] sPropRecords[0]; delete[] sPropRecords[1]; delete[] sPropRecords[2];
return result;
}
Boolean H265VideoRTPSink::sourceIsCompatibleWithUs(MediaSource& source) {
// Our source must be an appropriate framer:
return source.isH265VideoStreamFramer();
}
char const* H265VideoRTPSink::auxSDPLine() {
// Generate a new "a=fmtp:" line each time, using our VPS, SPS and PPS (if we have them),
// otherwise parameters from our framer source (in case they've changed since the last time that
// we were called):
H264or5VideoStreamFramer* framerSource = NULL;
u_int8_t* vps = fVPS; unsigned vpsSize = fVPSSize;
u_int8_t* sps = fSPS; unsigned spsSize = fSPSSize;
u_int8_t* pps = fPPS; unsigned ppsSize = fPPSSize;
if (vps == NULL || sps == NULL || pps == NULL) {
// We need to get VPS, SPS and PPS from our framer source:
if (fOurFragmenter == NULL) return NULL; // we don't yet have a fragmenter (and therefore not a source)
framerSource = (H264or5VideoStreamFramer*)(fOurFragmenter->inputSource());
if (framerSource == NULL) return NULL; // we don't yet have a source
framerSource->getVPSandSPSandPPS(vps, vpsSize, sps, spsSize, pps, ppsSize);
if (vps == NULL || sps == NULL || pps == NULL) {
return NULL; // our source isn't ready
}
}
// Set up the "a=fmtp:" SDP line for this stream.
u_int8_t* vpsWEB = new u_int8_t[vpsSize]; // "WEB" means "Without Emulation Bytes"
unsigned vpsWEBSize = removeH264or5EmulationBytes(vpsWEB, vpsSize, vps, vpsSize);
if (vpsWEBSize < 6/*'profile_tier_level' offset*/ + 12/*num 'profile_tier_level' bytes*/) {
// Bad VPS size => assume our source isn't ready
delete[] vpsWEB;
return NULL;
}
u_int8_t const* profileTierLevelHeaderBytes = &vpsWEB[6];
unsigned profileSpace = profileTierLevelHeaderBytes[0]>>6; // general_profile_space
unsigned profileId = profileTierLevelHeaderBytes[0]&0x1F; // general_profile_idc
unsigned tierFlag = (profileTierLevelHeaderBytes[0]>>5)&0x1; // general_tier_flag
unsigned levelId = profileTierLevelHeaderBytes[11]; // general_level_idc
u_int8_t const* interop_constraints = &profileTierLevelHeaderBytes[5];
char interopConstraintsStr[100];
sprintf(interopConstraintsStr, "%02X%02X%02X%02X%02X%02X",
interop_constraints[0], interop_constraints[1], interop_constraints[2],
interop_constraints[3], interop_constraints[4], interop_constraints[5]);
delete[] vpsWEB;
char* sprop_vps = base64Encode((char*)vps, vpsSize);
char* sprop_sps = base64Encode((char*)sps, spsSize);
char* sprop_pps = base64Encode((char*)pps, ppsSize);
char const* fmtpFmt =
"a=fmtp:%d profile-space=%u"
";profile-id=%u"
";tier-flag=%u"
";level-id=%u"
";interop-constraints=%s"
";sprop-vps=%s"
";sprop-sps=%s"
";sprop-pps=%s\r\n";
unsigned fmtpFmtSize = strlen(fmtpFmt)
+ 3 /* max num chars: rtpPayloadType */ + 20 /* max num chars: profile_space */
+ 20 /* max num chars: profile_id */
+ 20 /* max num chars: tier_flag */
+ 20 /* max num chars: level_id */
+ strlen(interopConstraintsStr)
+ strlen(sprop_vps)
+ strlen(sprop_sps)
+ strlen(sprop_pps);
char* fmtp = new char[fmtpFmtSize];
sprintf(fmtp, fmtpFmt,
rtpPayloadType(), profileSpace,
profileId,
tierFlag,
levelId,
interopConstraintsStr,
sprop_vps,
sprop_sps,
sprop_pps);
delete[] sprop_vps;
delete[] sprop_sps;
delete[] sprop_pps;
delete[] fFmtpSDPLine; fFmtpSDPLine = fmtp;
return fFmtpSDPLine;
}
live/liveMedia/Locale.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// Support for temporarily setting the locale (e.g., to "C" or "POSIX") for (e.g.) parsing or printing
// floating-point numbers in protocol headers, or calling toupper()/tolower() on human-input strings.
// Implementation
#include "Locale.hh"
#include "strDup.hh"
Locale::Locale(char const* newLocale, LocaleCategory category) {
#ifndef LOCALE_NOT_USED
#ifndef XLOCALE_NOT_USED
int categoryMask;
switch (category) {
case All: { categoryMask = LC_ALL_MASK; break; }
case Numeric: { categoryMask = LC_NUMERIC_MASK; break; }
}
fLocale = newlocale(categoryMask, newLocale, NULL);
fPrevLocale = uselocale(fLocale);
#else
switch (category) {
case All: { fCategoryNum = LC_ALL; break; }
case Numeric: { fCategoryNum = LC_NUMERIC; break; }
}
fPrevLocale = strDup(setlocale(fCategoryNum, NULL));
setlocale(fCategoryNum, newLocale);
#endif
#endif
}
Locale::~Locale() {
#ifndef LOCALE_NOT_USED
#ifndef XLOCALE_NOT_USED
if (fLocale != (locale_t)0) {
uselocale(fPrevLocale);
freelocale(fLocale);
}
#else
if (fPrevLocale != NULL) {
setlocale(fCategoryNum, fPrevLocale);
delete[] fPrevLocale;
}
#endif
#endif
}
live/liveMedia/H265VideoRTPSource.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// H.265 Video RTP Sources
// Implementation
#include "H265VideoRTPSource.hh"
////////// H265BufferedPacket and H265BufferedPacketFactory //////////
class H265BufferedPacket: public BufferedPacket {
public:
H265BufferedPacket(H265VideoRTPSource& ourSource);
virtual ~H265BufferedPacket();
private: // redefined virtual functions
virtual unsigned nextEnclosedFrameSize(unsigned char*& framePtr,
unsigned dataSize);
private:
H265VideoRTPSource& fOurSource;
};
class H265BufferedPacketFactory: public BufferedPacketFactory {
private: // redefined virtual functions
virtual BufferedPacket* createNewPacket(MultiFramedRTPSource* ourSource);
};
///////// H265VideoRTPSource implementation ////////
H265VideoRTPSource*
H265VideoRTPSource::createNew(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
Boolean expectDONFields,
unsigned rtpTimestampFrequency) {
return new H265VideoRTPSource(env, RTPgs, rtpPayloadFormat,
expectDONFields, rtpTimestampFrequency);
}
H265VideoRTPSource
::H265VideoRTPSource(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
Boolean expectDONFields,
unsigned rtpTimestampFrequency)
: MultiFramedRTPSource(env, RTPgs, rtpPayloadFormat, rtpTimestampFrequency,
new H265BufferedPacketFactory),
fExpectDONFields(expectDONFields),
fPreviousNALUnitDON(0), fCurrentNALUnitAbsDon((u_int64_t)(~0)) {
}
H265VideoRTPSource::~H265VideoRTPSource() {
}
Boolean H265VideoRTPSource
::processSpecialHeader(BufferedPacket* packet,
unsigned& resultSpecialHeaderSize) {
unsigned char* headerStart = packet->data();
unsigned packetSize = packet->dataSize();
u_int16_t DONL = 0;
unsigned numBytesToSkip;
// Check the Payload Header's 'nal_unit_type' for special aggregation or fragmentation packets:
if (packetSize < 2) return False;
fCurPacketNALUnitType = (headerStart[0]&0x7E)>>1;
switch (fCurPacketNALUnitType) {
case 48: { // Aggregation Packet (AP)
// We skip over the 2-byte Payload Header, and the DONL header (if any).
if (fExpectDONFields) {
if (packetSize < 4) return False;
DONL = (headerStart[2]<<8)|headerStart[3];
numBytesToSkip = 4;
} else {
numBytesToSkip = 2;
}
break;
}
case 49: { // Fragmentation Unit (FU)
// This NALU begins with the 2-byte Payload Header, the 1-byte FU header, and (optionally)
// the 2-byte DONL header.
// If the start bit is set, we reconstruct the original NAL header at the end of these
// 3 (or 5) bytes, and skip over the first 1 (or 3) bytes.
if (packetSize < 3) return False;
u_int8_t startBit = headerStart[2]&0x80; // from the FU header
u_int8_t endBit = headerStart[2]&0x40; // from the FU header
if (startBit) {
fCurrentPacketBeginsFrame = True;
u_int8_t nal_unit_type = headerStart[2]&0x3F; // the last 6 bits of the FU header
u_int8_t newNALHeader[2];
newNALHeader[0] = (headerStart[0]&0x81)|(nal_unit_type<<1);
newNALHeader[1] = headerStart[1];
if (fExpectDONFields) {
if (packetSize < 5) return False;
DONL = (headerStart[3]<<8)|headerStart[4];
headerStart[3] = newNALHeader[0];
headerStart[4] = newNALHeader[1];
numBytesToSkip = 3;
} else {
headerStart[1] = newNALHeader[0];
headerStart[2] = newNALHeader[1];
numBytesToSkip = 1;
}
} else {
// The start bit is not set, so we skip over all headers:
fCurrentPacketBeginsFrame = False;
if (fExpectDONFields) {
if (packetSize < 5) return False;
DONL = (headerStart[3]<<8)|headerStart[4];
numBytesToSkip = 5;
} else {
numBytesToSkip = 3;
}
}
fCurrentPacketCompletesFrame = (endBit != 0);
break;
}
default: {
// This packet contains one complete NAL unit:
fCurrentPacketBeginsFrame = fCurrentPacketCompletesFrame = True;
numBytesToSkip = 0;
break;
}
}
computeAbsDonFromDON(DONL);
resultSpecialHeaderSize = numBytesToSkip;
return True;
}
char const* H265VideoRTPSource::MIMEtype() const {
return "video/H265";
}
void H265VideoRTPSource::computeAbsDonFromDON(u_int16_t DON) {
if (!fExpectDONFields) {
// Without DON fields in the input stream, we just increment our "AbsDon" count each time:
++fCurrentNALUnitAbsDon;
} else {
if (fCurrentNALUnitAbsDon == (u_int64_t)(~0)) {
// This is the very first NAL unit, so "AbsDon" is just "DON":
fCurrentNALUnitAbsDon = (u_int64_t)DON;
} else {
// Use the previous NAL unit's DON and the current DON to compute "AbsDon":
// AbsDon[n] = AbsDon[n-1] + (DON[n] - DON[n-1]) mod 2^16
short signedDiff16 = (short)(DON - fPreviousNALUnitDON);
int64_t signedDiff64 = (int64_t)signedDiff16;
fCurrentNALUnitAbsDon += signedDiff64;
}
fPreviousNALUnitDON = DON; // for next time
}
}
////////// H265BufferedPacket and H265BufferedPacketFactory implementation //////////
H265BufferedPacket::H265BufferedPacket(H265VideoRTPSource& ourSource)
: fOurSource(ourSource) {
}
H265BufferedPacket::~H265BufferedPacket() {
}
unsigned H265BufferedPacket
::nextEnclosedFrameSize(unsigned char*& framePtr, unsigned dataSize) {
unsigned resultNALUSize = 0; // if an error occurs
switch (fOurSource.fCurPacketNALUnitType) {
case 48: { // Aggregation Packet (AP)
if (useCount() > 0) {
// We're other than the first NAL unit inside this Aggregation Packet.
// Update our 'decoding order number':
u_int16_t DONL = 0;
if (fOurSource.fExpectDONFields) {
// There's a 1-byte DOND field next:
if (dataSize < 1) break;
u_int8_t DOND = framePtr[0];
DONL = fOurSource.fPreviousNALUnitDON + (u_int16_t)(DOND + 1);
++framePtr;
--dataSize;
}
fOurSource.computeAbsDonFromDON(DONL);
}
// The next 2 bytes are the NAL unit size:
if (dataSize < 2) break;
resultNALUSize = (framePtr[0]<<8)|framePtr[1];
framePtr += 2;
break;
}
default: {
// Common case: We use the entire packet data:
return dataSize;
}
}
return (resultNALUSize <= dataSize) ? resultNALUSize : dataSize;
}
BufferedPacket* H265BufferedPacketFactory
::createNewPacket(MultiFramedRTPSource* ourSource) {
return new H265BufferedPacket((H265VideoRTPSource&)(*ourSource));
}
live/liveMedia/H265VideoStreamFramer.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A filter that breaks up a H.265 Video Elementary Stream into NAL units.
// Implementation
#include "H265VideoStreamFramer.hh"
H265VideoStreamFramer* H265VideoStreamFramer
::createNew(UsageEnvironment& env, FramedSource* inputSource, Boolean includeStartCodeInOutput) {
return new H265VideoStreamFramer(env, inputSource, True, includeStartCodeInOutput);
}
H265VideoStreamFramer
::H265VideoStreamFramer(UsageEnvironment& env, FramedSource* inputSource, Boolean createParser, Boolean includeStartCodeInOutput)
: H264or5VideoStreamFramer(265, env, inputSource, createParser, includeStartCodeInOutput) {
}
H265VideoStreamFramer::~H265VideoStreamFramer() {
}
Boolean H265VideoStreamFramer::isH265VideoStreamFramer() const {
return True;
}
live/liveMedia/H265VideoStreamDiscreteFramer.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A simplified version of "H265VideoStreamFramer" that takes only complete,
// discrete frames (rather than an arbitrary byte stream) as input.
// This avoids the parsing and data copying overhead of the full
// "H265VideoStreamFramer".
// Implementation
#include "H265VideoStreamDiscreteFramer.hh"
H265VideoStreamDiscreteFramer*
H265VideoStreamDiscreteFramer::createNew(UsageEnvironment& env, FramedSource* inputSource) {
return new H265VideoStreamDiscreteFramer(env, inputSource);
}
H265VideoStreamDiscreteFramer
::H265VideoStreamDiscreteFramer(UsageEnvironment& env, FramedSource* inputSource)
: H264or5VideoStreamDiscreteFramer(265, env, inputSource) {
}
H265VideoStreamDiscreteFramer::~H265VideoStreamDiscreteFramer() {
}
Boolean H265VideoStreamDiscreteFramer::isH265VideoStreamFramer() const {
return True;
}
live/liveMedia/InputFile.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// Common routines for opening/closing named input files
// Implementation
#include "InputFile.hh"
#include <string.h>
FILE* OpenInputFile(UsageEnvironment& env, char const* fileName) {
FILE* fid;
// Check for a special case file name: "stdin"
if (strcmp(fileName, "stdin") == 0) {
fid = stdin;
#if (defined(__WIN32__) || defined(_WIN32)) && !defined(_WIN32_WCE)
_setmode(_fileno(stdin), _O_BINARY); // convert to binary mode
#endif
} else {
fid = fopen(fileName, "rb");
if (fid == NULL) {
env.setResultMsg("unable to open file \"", fileName, "\"");
}
}
return fid;
}
void CloseInputFile(FILE* fid) {
// Don't close 'stdin', in case we want to use it again later.
if (fid != NULL && fid != stdin) fclose(fid);
}
u_int64_t GetFileSize(char const* fileName, FILE* fid) {
u_int64_t fileSize = 0; // by default
if (fid != stdin) {
#if !defined(_WIN32_WCE)
if (fileName == NULL) {
#endif
if (fid != NULL && SeekFile64(fid, 0, SEEK_END) >= 0) {
fileSize = (u_int64_t)TellFile64(fid);
if (fileSize == (u_int64_t)-1) fileSize = 0; // TellFile64() failed
SeekFile64(fid, 0, SEEK_SET);
}
#if !defined(_WIN32_WCE)
} else {
struct stat sb;
if (stat(fileName, &sb) == 0) {
fileSize = sb.st_size;
}
}
#endif
}
return fileSize;
}
int64_t SeekFile64(FILE *fid, int64_t offset, int whence) {
if (fid == NULL) return -1;
clearerr(fid);
fflush(fid);
#if (defined(__WIN32__) || defined(_WIN32)) && !defined(_WIN32_WCE)
return _lseeki64(_fileno(fid), offset, whence) == (int64_t)-1 ? -1 : 0;
#else
#if defined(_WIN32_WCE)
return fseek(fid, (long)(offset), whence);
#else
return fseeko(fid, (off_t)(offset), whence);
#endif
#endif
}
int64_t TellFile64(FILE *fid) {
if (fid == NULL) return -1;
clearerr(fid);
fflush(fid);
#if (defined(__WIN32__) || defined(_WIN32)) && !defined(_WIN32_WCE)
return _telli64(_fileno(fid));
#else
#if defined(_WIN32_WCE)
return ftell(fid);
#else
return ftello(fid);
#endif
#endif
}
Boolean FileIsSeekable(FILE *fid) {
if (SeekFile64(fid, 1, SEEK_CUR) < 0) {
return False;
}
SeekFile64(fid, -1, SEEK_CUR); // seek back to where we were
return True;
}
live/liveMedia/JPEGVideoRTPSink.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// RTP sink for JPEG video (RFC 2435)
// Implementation
#include "JPEGVideoRTPSink.hh"
#include "JPEGVideoSource.hh"
JPEGVideoRTPSink
::JPEGVideoRTPSink(UsageEnvironment& env, Groupsock* RTPgs)
: VideoRTPSink(env, RTPgs, 26, 90000, "JPEG") {
}
JPEGVideoRTPSink::~JPEGVideoRTPSink() {
}
JPEGVideoRTPSink*
JPEGVideoRTPSink::createNew(UsageEnvironment& env, Groupsock* RTPgs) {
return new JPEGVideoRTPSink(env, RTPgs);
}
Boolean JPEGVideoRTPSink::sourceIsCompatibleWithUs(MediaSource& source) {
return source.isJPEGVideoSource();
}
Boolean JPEGVideoRTPSink
::frameCanAppearAfterPacketStart(unsigned char const* /*frameStart*/,
unsigned /*numBytesInFrame*/) const {
// A packet can contain only one frame
return False;
}
void JPEGVideoRTPSink
::doSpecialFrameHandling(unsigned fragmentationOffset,
unsigned char* /*frameStart*/,
unsigned /*numBytesInFrame*/,
struct timeval framePresentationTime,
unsigned numRemainingBytes) {
// Our source is known to be a JPEGVideoSource
JPEGVideoSource* source = (JPEGVideoSource*)fSource;
if (source == NULL) return; // sanity check
u_int8_t mainJPEGHeader[8]; // the special header
u_int8_t const type = source->type();
mainJPEGHeader[0] = 0; // Type-specific
mainJPEGHeader[1] = fragmentationOffset >> 16;
mainJPEGHeader[2] = fragmentationOffset >> 8;
mainJPEGHeader[3] = fragmentationOffset;
mainJPEGHeader[4] = type;
mainJPEGHeader[5] = source->qFactor();
mainJPEGHeader[6] = source->width();
mainJPEGHeader[7] = source->height();
setSpecialHeaderBytes(mainJPEGHeader, sizeof mainJPEGHeader);
unsigned restartMarkerHeaderSize = 0; // by default
if (type >= 64 && type <= 127) {
// There is also a Restart Marker Header:
restartMarkerHeaderSize = 4;
u_int16_t const restartInterval = source->restartInterval(); // should be non-zero
u_int8_t restartMarkerHeader[4];
restartMarkerHeader[0] = restartInterval>>8;
restartMarkerHeader[1] = restartInterval&0xFF;
restartMarkerHeader[2] = restartMarkerHeader[3] = 0xFF; // F=L=1; Restart Count = 0x3FFF
setSpecialHeaderBytes(restartMarkerHeader, restartMarkerHeaderSize,
sizeof mainJPEGHeader/* start position */);
}
if (fragmentationOffset == 0 && source->qFactor() >= 128) {
// There is also a Quantization Header:
u_int8_t precision;
u_int16_t length;
u_int8_t const* quantizationTables
= source->quantizationTables(precision, length);
unsigned const quantizationHeaderSize = 4 + length;
u_int8_t* quantizationHeader = new u_int8_t[quantizationHeaderSize];
quantizationHeader[0] = 0; // MBZ
quantizationHeader[1] = precision;
quantizationHeader[2] = length >> 8;
quantizationHeader[3] = length&0xFF;
if (quantizationTables != NULL) { // sanity check
for (u_int16_t i = 0; i < length; ++i) {
quantizationHeader[4+i] = quantizationTables[i];
}
}
setSpecialHeaderBytes(quantizationHeader, quantizationHeaderSize,
sizeof mainJPEGHeader + restartMarkerHeaderSize/* start position */);
delete[] quantizationHeader;
}
if (numRemainingBytes == 0) {
// This packet contains the last (or only) fragment of the frame.
// Set the RTP 'M' ('marker') bit:
setMarkerBit();
}
// Also set the RTP timestamp:
setTimestamp(framePresentationTime);
}
unsigned JPEGVideoRTPSink::specialHeaderSize() const {
// Our source is known to be a JPEGVideoSource
JPEGVideoSource* source = (JPEGVideoSource*)fSource;
if (source == NULL) return 0; // sanity check
unsigned headerSize = 8; // by default
u_int8_t const type = source->type();
if (type >= 64 && type <= 127) {
// There is also a Restart Marker Header:
headerSize += 4;
}
if (curFragmentationOffset() == 0 && source->qFactor() >= 128) {
// There is also a Quantization Header:
u_int8_t dummy;
u_int16_t quantizationTablesSize;
(void)(source->quantizationTables(dummy, quantizationTablesSize));
headerSize += 4 + quantizationTablesSize;
}
return headerSize;
}
live/liveMedia/JPEGVideoRTPSource.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// JPEG Video (RFC 2435) RTP Sources
// Implementation
#include "JPEGVideoRTPSource.hh"
////////// JPEGBufferedPacket and JPEGBufferedPacketFactory //////////
class JPEGBufferedPacket: public BufferedPacket {
public:
Boolean completesFrame;
private:
// Redefined virtual functions:
virtual void reset();
virtual unsigned nextEnclosedFrameSize(unsigned char*& framePtr,
unsigned dataSize);
};
class JPEGBufferedPacketFactory: public BufferedPacketFactory {
private: // redefined virtual functions
virtual BufferedPacket* createNewPacket(MultiFramedRTPSource* ourSource);
};
////////// JPEGVideoRTPSource implementation //////////
#define BYTE unsigned char
#define WORD unsigned
#define DWORD unsigned long
JPEGVideoRTPSource*
JPEGVideoRTPSource::createNew(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency,
unsigned defaultWidth, unsigned defaultHeight) {
return new JPEGVideoRTPSource(env, RTPgs, rtpPayloadFormat,
rtpTimestampFrequency, defaultWidth, defaultHeight);
}
JPEGVideoRTPSource::JPEGVideoRTPSource(UsageEnvironment& env,
Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency,
unsigned defaultWidth, unsigned defaultHeight)
: MultiFramedRTPSource(env, RTPgs,
rtpPayloadFormat, rtpTimestampFrequency,
new JPEGBufferedPacketFactory),
fDefaultWidth(defaultWidth), fDefaultHeight(defaultHeight) {
}
JPEGVideoRTPSource::~JPEGVideoRTPSource() {
}
enum {
MARKER_SOF0 = 0xc0, // start-of-frame, baseline scan
MARKER_SOI = 0xd8, // start of image
MARKER_EOI = 0xd9, // end of image
MARKER_SOS = 0xda, // start of scan
MARKER_DRI = 0xdd, // restart interval
MARKER_DQT = 0xdb, // define quantization tables
MARKER_DHT = 0xc4, // huffman tables
MARKER_APP_FIRST = 0xe0,
MARKER_APP_LAST = 0xef,
MARKER_COMMENT = 0xfe,
};
static unsigned char const lum_dc_codelens[] = {
0, 1, 5, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0,
};
static unsigned char const lum_dc_symbols[] = {
0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11,
};
static unsigned char const lum_ac_codelens[] = {
0, 2, 1, 3, 3, 2, 4, 3, 5, 5, 4, 4, 0, 0, 1, 0x7d,
};
static unsigned char const lum_ac_symbols[] = {
0x01, 0x02, 0x03, 0x00, 0x04, 0x11, 0x05, 0x12,
0x21, 0x31, 0x41, 0x06, 0x13, 0x51, 0x61, 0x07,
0x22, 0x71, 0x14, 0x32, 0x81, 0x91, 0xa1, 0x08,
0x23, 0x42, 0xb1, 0xc1, 0x15, 0x52, 0xd1, 0xf0,
0x24, 0x33, 0x62, 0x72, 0x82, 0x09, 0x0a, 0x16,
0x17, 0x18, 0x19, 0x1a, 0x25, 0x26, 0x27, 0x28,
0x29, 0x2a, 0x34, 0x35, 0x36, 0x37, 0x38, 0x39,
0x3a, 0x43, 0x44, 0x45, 0x46, 0x47, 0x48, 0x49,
0x4a, 0x53, 0x54, 0x55, 0x56, 0x57, 0x58, 0x59,
0x5a, 0x63, 0x64, 0x65, 0x66, 0x67, 0x68, 0x69,
0x6a, 0x73, 0x74, 0x75, 0x76, 0x77, 0x78, 0x79,
0x7a, 0x83, 0x84, 0x85, 0x86, 0x87, 0x88, 0x89,
0x8a, 0x92, 0x93, 0x94, 0x95, 0x96, 0x97, 0x98,
0x99, 0x9a, 0xa2, 0xa3, 0xa4, 0xa5, 0xa6, 0xa7,
0xa8, 0xa9, 0xaa, 0xb2, 0xb3, 0xb4, 0xb5, 0xb6,
0xb7, 0xb8, 0xb9, 0xba, 0xc2, 0xc3, 0xc4, 0xc5,
0xc6, 0xc7, 0xc8, 0xc9, 0xca, 0xd2, 0xd3, 0xd4,
0xd5, 0xd6, 0xd7, 0xd8, 0xd9, 0xda, 0xe1, 0xe2,
0xe3, 0xe4, 0xe5, 0xe6, 0xe7, 0xe8, 0xe9, 0xea,
0xf1, 0xf2, 0xf3, 0xf4, 0xf5, 0xf6, 0xf7, 0xf8,
0xf9, 0xfa,
};
static unsigned char const chm_dc_codelens[] = {
0, 3, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0,
};
static unsigned char const chm_dc_symbols[] = {
0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11,
};
static unsigned char const chm_ac_codelens[] = {
0, 2, 1, 2, 4, 4, 3, 4, 7, 5, 4, 4, 0, 1, 2, 0x77,
};
static unsigned char const chm_ac_symbols[] = {
0x00, 0x01, 0x02, 0x03, 0x11, 0x04, 0x05, 0x21,
0x31, 0x06, 0x12, 0x41, 0x51, 0x07, 0x61, 0x71,
0x13, 0x22, 0x32, 0x81, 0x08, 0x14, 0x42, 0x91,
0xa1, 0xb1, 0xc1, 0x09, 0x23, 0x33, 0x52, 0xf0,
0x15, 0x62, 0x72, 0xd1, 0x0a, 0x16, 0x24, 0x34,
0xe1, 0x25, 0xf1, 0x17, 0x18, 0x19, 0x1a, 0x26,
0x27, 0x28, 0x29, 0x2a, 0x35, 0x36, 0x37, 0x38,
0x39, 0x3a, 0x43, 0x44, 0x45, 0x46, 0x47, 0x48,
0x49, 0x4a, 0x53, 0x54, 0x55, 0x56, 0x57, 0x58,
0x59, 0x5a, 0x63, 0x64, 0x65, 0x66, 0x67, 0x68,
0x69, 0x6a, 0x73, 0x74, 0x75, 0x76, 0x77, 0x78,
0x79, 0x7a, 0x82, 0x83, 0x84, 0x85, 0x86, 0x87,
0x88, 0x89, 0x8a, 0x92, 0x93, 0x94, 0x95, 0x96,
0x97, 0x98, 0x99, 0x9a, 0xa2, 0xa3, 0xa4, 0xa5,
0xa6, 0xa7, 0xa8, 0xa9, 0xaa, 0xb2, 0xb3, 0xb4,
0xb5, 0xb6, 0xb7, 0xb8, 0xb9, 0xba, 0xc2, 0xc3,
0xc4, 0xc5, 0xc6, 0xc7, 0xc8, 0xc9, 0xca, 0xd2,
0xd3, 0xd4, 0xd5, 0xd6, 0xd7, 0xd8, 0xd9, 0xda,
0xe2, 0xe3, 0xe4, 0xe5, 0xe6, 0xe7, 0xe8, 0xe9,
0xea, 0xf2, 0xf3, 0xf4, 0xf5, 0xf6, 0xf7, 0xf8,
0xf9, 0xfa,
};
static void createHuffmanHeader(unsigned char*& p,
unsigned char const* codelens,
int ncodes,
unsigned char const* symbols,
int nsymbols,
int tableNo, int tableClass) {
*p++ = 0xff; *p++ = MARKER_DHT;
*p++ = 0; /* length msb */
*p++ = 3 + ncodes + nsymbols; /* length lsb */
*p++ = (tableClass << 4) | tableNo;
memcpy(p, codelens, ncodes);
p += ncodes;
memcpy(p, symbols, nsymbols);
p += nsymbols;
}
static unsigned computeJPEGHeaderSize(unsigned qtlen, unsigned dri) {
unsigned qtlen_half = qtlen/2; // in case qtlen is odd; shouldn't happen
qtlen = qtlen_half*2;
unsigned numQtables = qtlen > 64 ? 2 : 1;
return 485 + numQtables*5 + qtlen + (dri > 0 ? 6 : 0);
}
static void createJPEGHeader(unsigned char* buf, unsigned type,
unsigned w, unsigned h,
unsigned char const* qtables, unsigned qtlen,
unsigned dri) {
unsigned char *ptr = buf;
unsigned numQtables = qtlen > 64 ? 2 : 1;
// MARKER_SOI:
*ptr++ = 0xFF; *ptr++ = MARKER_SOI;
// MARKER_APP_FIRST:
*ptr++ = 0xFF; *ptr++ = MARKER_APP_FIRST;
*ptr++ = 0x00; *ptr++ = 0x10; // size of chunk
*ptr++ = 'J'; *ptr++ = 'F'; *ptr++ = 'I'; *ptr++ = 'F'; *ptr++ = 0x00;
*ptr++ = 0x01; *ptr++ = 0x01; // JFIF format version (1.1)
*ptr++ = 0x00; // no units
*ptr++ = 0x00; *ptr++ = 0x01; // Horizontal pixel aspect ratio
*ptr++ = 0x00; *ptr++ = 0x01; // Vertical pixel aspect ratio
*ptr++ = 0x00; *ptr++ = 0x00; // no thumbnail
// MARKER_DRI:
if (dri > 0) {
*ptr++ = 0xFF; *ptr++ = MARKER_DRI;
*ptr++ = 0x00; *ptr++ = 0x04; // size of chunk
*ptr++ = (BYTE)(dri >> 8); *ptr++ = (BYTE)(dri); // restart interval
}
// MARKER_DQT (luma):
unsigned tableSize = numQtables == 1 ? qtlen : qtlen/2;
*ptr++ = 0xFF; *ptr++ = MARKER_DQT;
*ptr++ = 0x00; *ptr++ = tableSize + 3; // size of chunk
*ptr++ = 0x00; // precision(0), table id(0)
memcpy(ptr, qtables, tableSize);
qtables += tableSize;
ptr += tableSize;
if (numQtables > 1) {
unsigned tableSize = qtlen - qtlen/2;
// MARKER_DQT (chroma):
*ptr++ = 0xFF; *ptr++ = MARKER_DQT;
*ptr++ = 0x00; *ptr++ = tableSize + 3; // size of chunk
*ptr++ = 0x01; // precision(0), table id(1)
memcpy(ptr, qtables, tableSize);
qtables += tableSize;
ptr += tableSize;
}
// MARKER_SOF0:
*ptr++ = 0xFF; *ptr++ = MARKER_SOF0;
*ptr++ = 0x00; *ptr++ = 0x11; // size of chunk
*ptr++ = 0x08; // sample precision
*ptr++ = (BYTE)(h >> 8);
*ptr++ = (BYTE)(h); // number of lines (must be a multiple of 8)
*ptr++ = (BYTE)(w >> 8);
*ptr++ = (BYTE)(w); // number of columns (must be a multiple of 8)
*ptr++ = 0x03; // number of components
*ptr++ = 0x01; // id of component
*ptr++ = type ? 0x22 : 0x21; // sampling ratio (h,v)
*ptr++ = 0x00; // quant table id
*ptr++ = 0x02; // id of component
*ptr++ = 0x11; // sampling ratio (h,v)
*ptr++ = numQtables == 1 ? 0x00 : 0x01; // quant table id
*ptr++ = 0x03; // id of component
*ptr++ = 0x11; // sampling ratio (h,v)
*ptr++ = numQtables == 1 ? 0x00 : 0x01; // quant table id
createHuffmanHeader(ptr, lum_dc_codelens, sizeof lum_dc_codelens,
lum_dc_symbols, sizeof lum_dc_symbols, 0, 0);
createHuffmanHeader(ptr, lum_ac_codelens, sizeof lum_ac_codelens,
lum_ac_symbols, sizeof lum_ac_symbols, 0, 1);
createHuffmanHeader(ptr, chm_dc_codelens, sizeof chm_dc_codelens,
chm_dc_symbols, sizeof chm_dc_symbols, 1, 0);
createHuffmanHeader(ptr, chm_ac_codelens, sizeof chm_ac_codelens,
chm_ac_symbols, sizeof chm_ac_symbols, 1, 1);
// MARKER_SOS:
*ptr++ = 0xFF; *ptr++ = MARKER_SOS;
*ptr++ = 0x00; *ptr++ = 0x0C; // size of chunk
*ptr++ = 0x03; // number of components
*ptr++ = 0x01; // id of component
*ptr++ = 0x00; // huffman table id (DC, AC)
*ptr++ = 0x02; // id of component
*ptr++ = 0x11; // huffman table id (DC, AC)
*ptr++ = 0x03; // id of component
*ptr++ = 0x11; // huffman table id (DC, AC)
*ptr++ = 0x00; // start of spectral
*ptr++ = 0x3F; // end of spectral
*ptr++ = 0x00; // successive approximation bit position (high, low)
}
// The default 'luma' and 'chroma' quantizer tables, in zigzag order:
static unsigned char const defaultQuantizers[128] = {
// luma table:
16, 11, 12, 14, 12, 10, 16, 14,
13, 14, 18, 17, 16, 19, 24, 40,
26, 24, 22, 22, 24, 49, 35, 37,
29, 40, 58, 51, 61, 60, 57, 51,
56, 55, 64, 72, 92, 78, 64, 68,
87, 69, 55, 56, 80, 109, 81, 87,
95, 98, 103, 104, 103, 62, 77, 113,
121, 112, 100, 120, 92, 101, 103, 99,
// chroma table:
17, 18, 18, 24, 21, 24, 47, 26,
26, 47, 99, 66, 56, 66, 99, 99,
99, 99, 99, 99, 99, 99, 99, 99,
99, 99, 99, 99, 99, 99, 99, 99,
99, 99, 99, 99, 99, 99, 99, 99,
99, 99, 99, 99, 99, 99, 99, 99,
99, 99, 99, 99, 99, 99, 99, 99,
99, 99, 99, 99, 99, 99, 99, 99
};
static void makeDefaultQtables(unsigned char* resultTables, unsigned Q) {
int factor = Q;
int q;
if (Q < 1) factor = 1;
else if (Q > 99) factor = 99;
if (Q < 50) {
q = 5000 / factor;
} else {
q = 200 - factor*2;
}
for (int i = 0; i < 128; ++i) {
int newVal = (defaultQuantizers[i]*q + 50)/100;
if (newVal < 1) newVal = 1;
else if (newVal > 255) newVal = 255;
resultTables[i] = newVal;
}
}
Boolean JPEGVideoRTPSource
::processSpecialHeader(BufferedPacket* packet,
unsigned& resultSpecialHeaderSize) {
unsigned char* headerStart = packet->data();
unsigned packetSize = packet->dataSize();
unsigned char* qtables = NULL;
unsigned qtlen = 0;
unsigned dri = 0;
// There's at least 8-byte video-specific header
/*
0 1 2 3
0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
| Type-specific | Fragment Offset |
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
| Type | Q | Width | Height |
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
*/
if (packetSize < 8) return False;
resultSpecialHeaderSize = 8;
unsigned Offset = (unsigned)((DWORD)headerStart[1] << 16 | (DWORD)headerStart[2] << 8 | (DWORD)headerStart[3]);
unsigned Type = (unsigned)headerStart[4];
unsigned type = Type & 1;
unsigned Q = (unsigned)headerStart[5];
unsigned width = (unsigned)headerStart[6] * 8;
unsigned height = (unsigned)headerStart[7] * 8;
if ((width == 0 || height == 0) && fDefaultWidth != 0 && fDefaultHeight != 0) {
// Use the default width and height parameters instead:
width = fDefaultWidth;
height = fDefaultHeight;
}
if (width == 0) width = 256*8; // special case
if (height == 0) height = 256*8; // special case
  if (Type > 63) {
    // Restart Marker header present
    /*
     0                   1                   2                   3
     0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1
    +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
    |       Restart Interval        |F|L|       Restart Count       |
    +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
    */
    if (packetSize < resultSpecialHeaderSize + 4) return False;

    unsigned RestartInterval
      = (unsigned)((WORD)headerStart[resultSpecialHeaderSize] << 8
                   | (WORD)headerStart[resultSpecialHeaderSize + 1]);
    dri = RestartInterval;
    resultSpecialHeaderSize += 4;
  }
  if (Offset == 0) {
    if (Q > 127) {
      // Quantization Table header present
      /*
       0                   1                   2                   3
       0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1
      +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
      |      MBZ      |   Precision   |             Length            |
      +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
      |                    Quantization Table Data                    |
      |                              ...                              |
      +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
      */
      if (packetSize < resultSpecialHeaderSize + 4) return False;

      unsigned MBZ = (unsigned)headerStart[resultSpecialHeaderSize];
      if (MBZ == 0) {
        // unsigned Precision = (unsigned)headerStart[resultSpecialHeaderSize + 1];
        unsigned Length
          = (unsigned)((WORD)headerStart[resultSpecialHeaderSize + 2] << 8
                       | (WORD)headerStart[resultSpecialHeaderSize + 3]);
        //ASSERT(Length == 128);
        resultSpecialHeaderSize += 4;

        if (packetSize < resultSpecialHeaderSize + Length) return False;

        qtlen = Length;
        qtables = &headerStart[resultSpecialHeaderSize];
        resultSpecialHeaderSize += Length;
      }
    }
  }
  // If this is the first (or only) fragment of a JPEG frame, then we need
  // to synthesize a JPEG header, and prepend it to the incoming data.
  // Hack: We can do this because we allowed space for it in
  // our special "JPEGBufferedPacket" subclass. We also adjust
  // "resultSpecialHeaderSize" to compensate for this, by subtracting
  // the size of the synthesized header. Note that this will cause
  // "resultSpecialHeaderSize" to become negative, but the code that called
  // us (in "MultiFramedRTPSource") will handle this properly.
  if (Offset == 0) {
    unsigned char newQtables[128];
    if (qtlen == 0) {
      // A quantization table was not present in the RTP JPEG header,
      // so use the default tables, scaled according to the "Q" factor:
      makeDefaultQtables(newQtables, Q);
      qtables = newQtables;
      qtlen = sizeof newQtables;
    }

    unsigned hdrlen = computeJPEGHeaderSize(qtlen, dri);
    resultSpecialHeaderSize -= hdrlen; // goes negative
    headerStart += (int)resultSpecialHeaderSize; // goes backward

    createJPEGHeader(headerStart, type, width, height, qtables, qtlen, dri);
  }

  fCurrentPacketBeginsFrame = (Offset == 0);

  // The RTP "M" (marker) bit indicates the last fragment of a frame:
  ((JPEGBufferedPacket*)packet)->completesFrame
    = fCurrentPacketCompletesFrame = packet->rtpMarkerBit();

  return True;
}

char const* JPEGVideoRTPSource::MIMEtype() const {
  return "video/JPEG";
}

////////// JPEGBufferedPacket and JPEGBufferedPacketFactory implementation

void JPEGBufferedPacket::reset() {
  BufferedPacket::reset();

  // Move our "fHead" and "fTail" forward, to allow space for a synthesized
  // JPEG header to precede the RTP data that comes in over the network.
  unsigned offset = MAX_JPEG_HEADER_SIZE;
  if (offset > fPacketSize) offset = fPacketSize; // shouldn't happen
  fHead = fTail = offset;
}

unsigned JPEGBufferedPacket
::nextEnclosedFrameSize(unsigned char*& framePtr, unsigned dataSize) {
  // Normally, the enclosed frame size is just "dataSize". If, however,
  // the frame does not end with the "EOI" marker, then add this now:
  if (completesFrame && dataSize >= 2 &&
      !(framePtr[dataSize-2] == 0xFF && framePtr[dataSize-1] == MARKER_EOI)) {
    framePtr[dataSize++] = 0xFF;
    framePtr[dataSize++] = MARKER_EOI;
  }
  return dataSize;
}

BufferedPacket* JPEGBufferedPacketFactory
::createNewPacket(MultiFramedRTPSource* /*ourSource*/) {
  return new JPEGBufferedPacket;
}
live/liveMedia/JPEGVideoSource.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// JPEG video sources
// Implementation
#include "JPEGVideoSource.hh"
JPEGVideoSource::JPEGVideoSource(UsageEnvironment& env)
  : FramedSource(env) {
}

JPEGVideoSource::~JPEGVideoSource() {
}

u_int8_t const* JPEGVideoSource::quantizationTables(u_int8_t& precision,
                                                    u_int16_t& length) {
  // Default implementation
  precision = 0;
  length = 0;
  return NULL;
}

u_int16_t JPEGVideoSource::restartInterval() {
  // Default implementation
  return 0;
}

Boolean JPEGVideoSource::isJPEGVideoSource() const {
  return True;
}
live/liveMedia/Makefile.head:
INCLUDES = -Iinclude -I../UsageEnvironment/include -I../groupsock/include
PREFIX = /usr/local
LIBDIR = $(PREFIX)/lib
##### Change the following for your environment:
live/liveMedia/Makefile.tail:
##### End of variables to change
NAME = libliveMedia
LIVEMEDIA_LIB = $(NAME).$(LIB_SUFFIX)
ALL = $(LIVEMEDIA_LIB)
all: $(ALL)
.$(C).$(OBJ):
	$(C_COMPILER) -c $(C_FLAGS) $<
.$(CPP).$(OBJ):
	$(CPLUSPLUS_COMPILER) -c $(CPLUSPLUS_FLAGS) $<
MP3_SOURCE_OBJS = MP3FileSource.$(OBJ) MP3Transcoder.$(OBJ) MP3ADU.$(OBJ) MP3ADUdescriptor.$(OBJ) MP3ADUinterleaving.$(OBJ) MP3ADUTranscoder.$(OBJ) MP3StreamState.$(OBJ) MP3Internals.$(OBJ) MP3InternalsHuffman.$(OBJ) MP3InternalsHuffmanTable.$(OBJ) MP3ADURTPSource.$(OBJ)
MPEG_SOURCE_OBJS = MPEG1or2Demux.$(OBJ) MPEG1or2DemuxedElementaryStream.$(OBJ) MPEGVideoStreamFramer.$(OBJ) MPEG1or2VideoStreamFramer.$(OBJ) MPEG1or2VideoStreamDiscreteFramer.$(OBJ) MPEG4VideoStreamFramer.$(OBJ) MPEG4VideoStreamDiscreteFramer.$(OBJ) H264or5VideoStreamFramer.$(OBJ) H264or5VideoStreamDiscreteFramer.$(OBJ) H264VideoStreamFramer.$(OBJ) H264VideoStreamDiscreteFramer.$(OBJ) H265VideoStreamFramer.$(OBJ) H265VideoStreamDiscreteFramer.$(OBJ) MPEGVideoStreamParser.$(OBJ) MPEG1or2AudioStreamFramer.$(OBJ) MPEG1or2AudioRTPSource.$(OBJ) MPEG4LATMAudioRTPSource.$(OBJ) MPEG4ESVideoRTPSource.$(OBJ) MPEG4GenericRTPSource.$(OBJ) $(MP3_SOURCE_OBJS) MPEG1or2VideoRTPSource.$(OBJ) MPEG2TransportStreamMultiplexor.$(OBJ) MPEG2TransportStreamFromPESSource.$(OBJ) MPEG2TransportStreamFromESSource.$(OBJ) MPEG2TransportStreamFramer.$(OBJ) ADTSAudioFileSource.$(OBJ)
H263_SOURCE_OBJS = H263plusVideoRTPSource.$(OBJ) H263plusVideoStreamFramer.$(OBJ) H263plusVideoStreamParser.$(OBJ)
AC3_SOURCE_OBJS = AC3AudioStreamFramer.$(OBJ) AC3AudioRTPSource.$(OBJ)
DV_SOURCE_OBJS = DVVideoStreamFramer.$(OBJ) DVVideoRTPSource.$(OBJ)
MP3_SINK_OBJS = MP3ADURTPSink.$(OBJ)
MPEG_SINK_OBJS = MPEG1or2AudioRTPSink.$(OBJ) $(MP3_SINK_OBJS) MPEG1or2VideoRTPSink.$(OBJ) MPEG4LATMAudioRTPSink.$(OBJ) MPEG4GenericRTPSink.$(OBJ) MPEG4ESVideoRTPSink.$(OBJ)
H263_SINK_OBJS = H263plusVideoRTPSink.$(OBJ)
H264_OR_5_SINK_OBJS = H264or5VideoRTPSink.$(OBJ) H264VideoRTPSink.$(OBJ) H265VideoRTPSink.$(OBJ)
DV_SINK_OBJS = DVVideoRTPSink.$(OBJ)
AC3_SINK_OBJS = AC3AudioRTPSink.$(OBJ)
MISC_SOURCE_OBJS = MediaSource.$(OBJ) FramedSource.$(OBJ) FramedFileSource.$(OBJ) FramedFilter.$(OBJ) ByteStreamFileSource.$(OBJ) ByteStreamMultiFileSource.$(OBJ) ByteStreamMemoryBufferSource.$(OBJ) BasicUDPSource.$(OBJ) DeviceSource.$(OBJ) AudioInputDevice.$(OBJ) WAVAudioFileSource.$(OBJ) $(MPEG_SOURCE_OBJS) $(H263_SOURCE_OBJS) $(AC3_SOURCE_OBJS) $(DV_SOURCE_OBJS) JPEGVideoSource.$(OBJ) AMRAudioSource.$(OBJ) AMRAudioFileSource.$(OBJ) InputFile.$(OBJ) StreamReplicator.$(OBJ)
MISC_SINK_OBJS = MediaSink.$(OBJ) FileSink.$(OBJ) BasicUDPSink.$(OBJ) AMRAudioFileSink.$(OBJ) H264or5VideoFileSink.$(OBJ) H264VideoFileSink.$(OBJ) H265VideoFileSink.$(OBJ) OggFileSink.$(OBJ) $(MPEG_SINK_OBJS) $(H263_SINK_OBJS) $(H264_OR_5_SINK_OBJS) $(DV_SINK_OBJS) $(AC3_SINK_OBJS) VorbisAudioRTPSink.$(OBJ) TheoraVideoRTPSink.$(OBJ) VP8VideoRTPSink.$(OBJ) VP9VideoRTPSink.$(OBJ) GSMAudioRTPSink.$(OBJ) JPEGVideoRTPSink.$(OBJ) SimpleRTPSink.$(OBJ) AMRAudioRTPSink.$(OBJ) T140TextRTPSink.$(OBJ) TCPStreamSink.$(OBJ) OutputFile.$(OBJ)
MISC_FILTER_OBJS = uLawAudioFilter.$(OBJ)
TRANSPORT_STREAM_TRICK_PLAY_OBJS = MPEG2IndexFromTransportStream.$(OBJ) MPEG2TransportStreamIndexFile.$(OBJ) MPEG2TransportStreamTrickModeFilter.$(OBJ)
RTP_SOURCE_OBJS = RTPSource.$(OBJ) MultiFramedRTPSource.$(OBJ) SimpleRTPSource.$(OBJ) H261VideoRTPSource.$(OBJ) H264VideoRTPSource.$(OBJ) H265VideoRTPSource.$(OBJ) QCELPAudioRTPSource.$(OBJ) AMRAudioRTPSource.$(OBJ) JPEGVideoRTPSource.$(OBJ) VorbisAudioRTPSource.$(OBJ) TheoraVideoRTPSource.$(OBJ) VP8VideoRTPSource.$(OBJ) VP9VideoRTPSource.$(OBJ)
RTP_SINK_OBJS = RTPSink.$(OBJ) MultiFramedRTPSink.$(OBJ) AudioRTPSink.$(OBJ) VideoRTPSink.$(OBJ) TextRTPSink.$(OBJ)
RTP_INTERFACE_OBJS = RTPInterface.$(OBJ)
RTP_OBJS = $(RTP_SOURCE_OBJS) $(RTP_SINK_OBJS) $(RTP_INTERFACE_OBJS)
RTCP_OBJS = RTCP.$(OBJ) rtcp_from_spec.$(OBJ)
GENERIC_MEDIA_SERVER_OBJS = GenericMediaServer.$(OBJ)
RTSP_OBJS = RTSPServer.$(OBJ) RTSPClient.$(OBJ) RTSPCommon.$(OBJ) RTSPServerSupportingHTTPStreaming.$(OBJ) RTSPRegisterSender.$(OBJ)
SIP_OBJS = SIPClient.$(OBJ)
SESSION_OBJS = MediaSession.$(OBJ) ServerMediaSession.$(OBJ) PassiveServerMediaSubsession.$(OBJ) OnDemandServerMediaSubsession.$(OBJ) FileServerMediaSubsession.$(OBJ) MPEG4VideoFileServerMediaSubsession.$(OBJ) H264VideoFileServerMediaSubsession.$(OBJ) H265VideoFileServerMediaSubsession.$(OBJ) H263plusVideoFileServerMediaSubsession.$(OBJ) WAVAudioFileServerMediaSubsession.$(OBJ) AMRAudioFileServerMediaSubsession.$(OBJ) MP3AudioFileServerMediaSubsession.$(OBJ) MPEG1or2VideoFileServerMediaSubsession.$(OBJ) MPEG1or2FileServerDemux.$(OBJ) MPEG1or2DemuxedServerMediaSubsession.$(OBJ) MPEG2TransportFileServerMediaSubsession.$(OBJ) ADTSAudioFileServerMediaSubsession.$(OBJ) DVVideoFileServerMediaSubsession.$(OBJ) AC3AudioFileServerMediaSubsession.$(OBJ) MPEG2TransportUDPServerMediaSubsession.$(OBJ) ProxyServerMediaSession.$(OBJ)
QUICKTIME_OBJS = QuickTimeFileSink.$(OBJ) QuickTimeGenericRTPSource.$(OBJ)
AVI_OBJS = AVIFileSink.$(OBJ)
MATROSKA_FILE_OBJS = MatroskaFile.$(OBJ) MatroskaFileParser.$(OBJ) EBMLNumber.$(OBJ) MatroskaDemuxedTrack.$(OBJ)
MATROSKA_SERVER_MEDIA_SUBSESSION_OBJS = MatroskaFileServerMediaSubsession.$(OBJ) MP3AudioMatroskaFileServerMediaSubsession.$(OBJ)
MATROSKA_RTSP_SERVER_OBJS = MatroskaFileServerDemux.$(OBJ) $(MATROSKA_SERVER_MEDIA_SUBSESSION_OBJS)
MATROSKA_OBJS = $(MATROSKA_FILE_OBJS) $(MATROSKA_RTSP_SERVER_OBJS)
OGG_FILE_OBJS = OggFile.$(OBJ) OggFileParser.$(OBJ) OggDemuxedTrack.$(OBJ)
OGG_SERVER_MEDIA_SUBSESSION_OBJS = OggFileServerMediaSubsession.$(OBJ)
OGG_RTSP_SERVER_OBJS = OggFileServerDemux.$(OBJ) $(OGG_SERVER_MEDIA_SUBSESSION_OBJS)
OGG_OBJS = $(OGG_FILE_OBJS) $(OGG_RTSP_SERVER_OBJS)
MISC_OBJS = BitVector.$(OBJ) StreamParser.$(OBJ) DigestAuthentication.$(OBJ) ourMD5.$(OBJ) Base64.$(OBJ) Locale.$(OBJ)
LIVEMEDIA_LIB_OBJS = Media.$(OBJ) $(MISC_SOURCE_OBJS) $(MISC_SINK_OBJS) $(MISC_FILTER_OBJS) $(RTP_OBJS) $(RTCP_OBJS) $(GENERIC_MEDIA_SERVER_OBJS) $(RTSP_OBJS) $(SIP_OBJS) $(SESSION_OBJS) $(QUICKTIME_OBJS) $(AVI_OBJS) $(TRANSPORT_STREAM_TRICK_PLAY_OBJS) $(MATROSKA_OBJS) $(OGG_OBJS) $(MISC_OBJS)
$(LIVEMEDIA_LIB): $(LIVEMEDIA_LIB_OBJS) \
    $(PLATFORM_SPECIFIC_LIB_OBJS)
	$(LIBRARY_LINK)$@ $(LIBRARY_LINK_OPTS) \
		$(LIVEMEDIA_LIB_OBJS)
Media.$(CPP): include/Media.hh
include/Media.hh: include/liveMedia_version.hh
MediaSource.$(CPP): include/MediaSource.hh
include/MediaSource.hh: include/Media.hh
FramedSource.$(CPP): include/FramedSource.hh
include/FramedSource.hh: include/MediaSource.hh
FramedFileSource.$(CPP): include/FramedFileSource.hh
include/FramedFileSource.hh: include/FramedSource.hh
FramedFilter.$(CPP): include/FramedFilter.hh
include/FramedFilter.hh: include/FramedSource.hh
RTPSource.$(CPP): include/RTPSource.hh
include/RTPSource.hh: include/FramedSource.hh include/RTPInterface.hh
include/RTPInterface.hh: include/Media.hh
MultiFramedRTPSource.$(CPP): include/MultiFramedRTPSource.hh include/RTCP.hh
include/MultiFramedRTPSource.hh: include/RTPSource.hh
SimpleRTPSource.$(CPP): include/SimpleRTPSource.hh
include/SimpleRTPSource.hh: include/MultiFramedRTPSource.hh
H261VideoRTPSource.$(CPP): include/H261VideoRTPSource.hh
include/H261VideoRTPSource.hh: include/MultiFramedRTPSource.hh
H264VideoRTPSource.$(CPP): include/H264VideoRTPSource.hh include/Base64.hh
include/H264VideoRTPSource.hh: include/MultiFramedRTPSource.hh
H265VideoRTPSource.$(CPP): include/H265VideoRTPSource.hh
include/H265VideoRTPSource.hh: include/MultiFramedRTPSource.hh
QCELPAudioRTPSource.$(CPP): include/QCELPAudioRTPSource.hh include/MultiFramedRTPSource.hh include/FramedFilter.hh
include/QCELPAudioRTPSource.hh: include/RTPSource.hh
AMRAudioRTPSource.$(CPP): include/AMRAudioRTPSource.hh include/MultiFramedRTPSource.hh
include/AMRAudioRTPSource.hh: include/RTPSource.hh include/AMRAudioSource.hh
JPEGVideoRTPSource.$(CPP): include/JPEGVideoRTPSource.hh
include/JPEGVideoRTPSource.hh: include/MultiFramedRTPSource.hh
VorbisAudioRTPSource.$(CPP): include/VorbisAudioRTPSource.hh include/Base64.hh
include/VorbisAudioRTPSource.hh: include/MultiFramedRTPSource.hh
TheoraVideoRTPSource.$(CPP): include/TheoraVideoRTPSource.hh
include/TheoraVideoRTPSource.hh: include/MultiFramedRTPSource.hh
VP8VideoRTPSource.$(CPP): include/VP8VideoRTPSource.hh
include/VP8VideoRTPSource.hh: include/MultiFramedRTPSource.hh
VP9VideoRTPSource.$(CPP): include/VP9VideoRTPSource.hh
include/VP9VideoRTPSource.hh: include/MultiFramedRTPSource.hh
ByteStreamFileSource.$(CPP): include/ByteStreamFileSource.hh include/InputFile.hh
include/ByteStreamFileSource.hh: include/FramedFileSource.hh
ByteStreamMultiFileSource.$(CPP): include/ByteStreamMultiFileSource.hh
include/ByteStreamMultiFileSource.hh: include/ByteStreamFileSource.hh
ByteStreamMemoryBufferSource.$(CPP): include/ByteStreamMemoryBufferSource.hh
include/ByteStreamMemoryBufferSource.hh: include/FramedSource.hh
BasicUDPSource.$(CPP): include/BasicUDPSource.hh
include/BasicUDPSource.hh: include/FramedSource.hh
DeviceSource.$(CPP): include/DeviceSource.hh
include/DeviceSource.hh: include/FramedSource.hh
AudioInputDevice.$(CPP): include/AudioInputDevice.hh
include/AudioInputDevice.hh: include/FramedSource.hh
WAVAudioFileSource.$(CPP): include/WAVAudioFileSource.hh include/InputFile.hh
include/WAVAudioFileSource.hh: include/AudioInputDevice.hh
MPEG1or2Demux.$(CPP): include/MPEG1or2Demux.hh include/MPEG1or2DemuxedElementaryStream.hh StreamParser.hh
include/MPEG1or2Demux.hh: include/FramedSource.hh
include/MPEG1or2DemuxedElementaryStream.hh: include/MPEG1or2Demux.hh
StreamParser.hh: include/FramedSource.hh
MPEG1or2DemuxedElementaryStream.$(CPP): include/MPEG1or2DemuxedElementaryStream.hh
MPEGVideoStreamFramer.$(CPP): MPEGVideoStreamParser.hh
MPEGVideoStreamParser.hh: StreamParser.hh include/MPEGVideoStreamFramer.hh
include/MPEGVideoStreamFramer.hh: include/FramedFilter.hh
MPEG1or2VideoStreamFramer.$(CPP): include/MPEG1or2VideoStreamFramer.hh MPEGVideoStreamParser.hh
include/MPEG1or2VideoStreamFramer.hh: include/MPEGVideoStreamFramer.hh
MPEG1or2VideoStreamDiscreteFramer.$(CPP): include/MPEG1or2VideoStreamDiscreteFramer.hh
include/MPEG1or2VideoStreamDiscreteFramer.hh: include/MPEG1or2VideoStreamFramer.hh
MPEG4VideoStreamFramer.$(CPP): include/MPEG4VideoStreamFramer.hh MPEGVideoStreamParser.hh include/MPEG4LATMAudioRTPSource.hh
include/MPEG4VideoStreamFramer.hh: include/MPEGVideoStreamFramer.hh
MPEG4VideoStreamDiscreteFramer.$(CPP): include/MPEG4VideoStreamDiscreteFramer.hh
include/MPEG4VideoStreamDiscreteFramer.hh: include/MPEG4VideoStreamFramer.hh
H264or5VideoStreamFramer.$(CPP): include/H264or5VideoStreamFramer.hh MPEGVideoStreamParser.hh include/BitVector.hh
include/H264or5VideoStreamFramer.hh: include/MPEGVideoStreamFramer.hh
H264or5VideoStreamDiscreteFramer.$(CPP): include/H264or5VideoStreamDiscreteFramer.hh
include/H264or5VideoStreamDiscreteFramer.hh: include/H264or5VideoStreamFramer.hh
H264VideoStreamFramer.$(CPP): include/H264VideoStreamFramer.hh
include/H264VideoStreamFramer.hh: include/H264or5VideoStreamFramer.hh
H264VideoStreamDiscreteFramer.$(CPP): include/H264VideoStreamDiscreteFramer.hh
include/H264VideoStreamDiscreteFramer.hh: include/H264VideoStreamFramer.hh
H265VideoStreamFramer.$(CPP): include/H265VideoStreamFramer.hh
include/H265VideoStreamFramer.hh: include/H264or5VideoStreamFramer.hh
H265VideoStreamDiscreteFramer.$(CPP): include/H265VideoStreamDiscreteFramer.hh
include/H265VideoStreamDiscreteFramer.hh: include/H265VideoStreamFramer.hh
MPEGVideoStreamParser.$(CPP): MPEGVideoStreamParser.hh
MPEG1or2AudioStreamFramer.$(CPP): include/MPEG1or2AudioStreamFramer.hh StreamParser.hh MP3Internals.hh
include/MPEG1or2AudioStreamFramer.hh: include/FramedFilter.hh
MPEG1or2AudioRTPSource.$(CPP): include/MPEG1or2AudioRTPSource.hh
include/MPEG1or2AudioRTPSource.hh: include/MultiFramedRTPSource.hh
MPEG4LATMAudioRTPSource.$(CPP): include/MPEG4LATMAudioRTPSource.hh
include/MPEG4LATMAudioRTPSource.hh: include/MultiFramedRTPSource.hh
MPEG4ESVideoRTPSource.$(CPP): include/MPEG4ESVideoRTPSource.hh
include/MPEG4ESVideoRTPSource.hh: include/MultiFramedRTPSource.hh
MPEG4GenericRTPSource.$(CPP): include/MPEG4GenericRTPSource.hh include/BitVector.hh include/MPEG4LATMAudioRTPSource.hh
include/MPEG4GenericRTPSource.hh: include/MultiFramedRTPSource.hh
MP3FileSource.$(CPP): include/MP3FileSource.hh MP3StreamState.hh include/InputFile.hh
include/MP3FileSource.hh: include/FramedFileSource.hh
MP3StreamState.hh: MP3Internals.hh
MP3Internals.hh: include/BitVector.hh
MP3Transcoder.$(CPP): include/MP3ADU.hh include/MP3Transcoder.hh
include/MP3ADU.hh: include/FramedFilter.hh
include/MP3Transcoder.hh: include/MP3ADU.hh include/MP3ADUTranscoder.hh
include/MP3ADUTranscoder.hh: include/FramedFilter.hh
MP3ADU.$(CPP): include/MP3ADU.hh MP3ADUdescriptor.hh MP3Internals.hh
MP3ADUdescriptor.$(CPP): MP3ADUdescriptor.hh
MP3ADUinterleaving.$(CPP): include/MP3ADUinterleaving.hh MP3ADUdescriptor.hh
include/MP3ADUinterleaving.hh: include/FramedFilter.hh
MP3ADUTranscoder.$(CPP): include/MP3ADUTranscoder.hh MP3Internals.hh
MP3StreamState.$(CPP): MP3StreamState.hh include/InputFile.hh
MP3Internals.$(CPP): MP3InternalsHuffman.hh
MP3InternalsHuffman.hh: MP3Internals.hh
MP3InternalsHuffman.$(CPP): MP3InternalsHuffman.hh
MP3InternalsHuffmanTable.$(CPP): MP3InternalsHuffman.hh
MP3ADURTPSource.$(CPP): include/MP3ADURTPSource.hh MP3ADUdescriptor.hh
include/MP3ADURTPSource.hh: include/MultiFramedRTPSource.hh
MPEG1or2VideoRTPSource.$(CPP): include/MPEG1or2VideoRTPSource.hh
include/MPEG1or2VideoRTPSource.hh: include/MultiFramedRTPSource.hh
MPEG2TransportStreamMultiplexor.$(CPP): include/MPEG2TransportStreamMultiplexor.hh
include/MPEG2TransportStreamMultiplexor.hh: include/FramedSource.hh include/MPEG1or2Demux.hh
MPEG2TransportStreamFromPESSource.$(CPP): include/MPEG2TransportStreamFromPESSource.hh
include/MPEG2TransportStreamFromPESSource.hh: include/MPEG2TransportStreamMultiplexor.hh include/MPEG1or2DemuxedElementaryStream.hh
MPEG2TransportStreamFromESSource.$(CPP): include/MPEG2TransportStreamFromESSource.hh
include/MPEG2TransportStreamFromESSource.hh: include/MPEG2TransportStreamMultiplexor.hh
MPEG2TransportStreamFramer.$(CPP): include/MPEG2TransportStreamFramer.hh
include/MPEG2TransportStreamFramer.hh: include/FramedFilter.hh include/MPEG2TransportStreamIndexFile.hh
ADTSAudioFileSource.$(CPP): include/ADTSAudioFileSource.hh include/InputFile.hh
include/ADTSAudioFileSource.hh: include/FramedFileSource.hh
H263plusVideoRTPSource.$(CPP): include/H263plusVideoRTPSource.hh
include/H263plusVideoRTPSource.hh: include/MultiFramedRTPSource.hh
H263plusVideoStreamFramer.$(CPP): include/H263plusVideoStreamFramer.hh H263plusVideoStreamParser.hh
include/H263plusVideoStreamFramer.hh: include/FramedFilter.hh
H263plusVideoStreamParser.hh: StreamParser.hh
H263plusVideoStreamParser.$(CPP): H263plusVideoStreamParser.hh include/H263plusVideoStreamFramer.hh
AC3AudioStreamFramer.$(CPP): include/AC3AudioStreamFramer.hh StreamParser.hh
include/AC3AudioStreamFramer.hh: include/FramedFilter.hh
AC3AudioRTPSource.$(CPP): include/AC3AudioRTPSource.hh
include/AC3AudioRTPSource.hh: include/MultiFramedRTPSource.hh
DVVideoRTPSource.$(CPP): include/DVVideoRTPSource.hh
include/DVVideoRTPSource.hh: include/MultiFramedRTPSource.hh
JPEGVideoSource.$(CPP): include/JPEGVideoSource.hh
include/JPEGVideoSource.hh: include/FramedSource.hh
AMRAudioSource.$(CPP): include/AMRAudioSource.hh
include/AMRAudioSource.hh: include/FramedSource.hh
AMRAudioFileSource.$(CPP): include/AMRAudioFileSource.hh include/InputFile.hh
include/AMRAudioFileSource.hh: include/AMRAudioSource.hh
InputFile.$(CPP): include/InputFile.hh
StreamReplicator.$(CPP): include/StreamReplicator.hh
include/StreamReplicator.hh: include/FramedSource.hh
MediaSink.$(CPP): include/MediaSink.hh
include/MediaSink.hh: include/FramedSource.hh
FileSink.$(CPP): include/FileSink.hh include/OutputFile.hh
include/FileSink.hh: include/MediaSink.hh
BasicUDPSink.$(CPP): include/BasicUDPSink.hh
include/BasicUDPSink.hh: include/MediaSink.hh
AMRAudioFileSink.$(CPP): include/AMRAudioFileSink.hh include/AMRAudioSource.hh include/OutputFile.hh
include/AMRAudioFileSink.hh: include/FileSink.hh
H264or5VideoFileSink.$(CPP): include/H264or5VideoFileSink.hh include/H264VideoRTPSource.hh
include/H264or5VideoFileSink.hh: include/FileSink.hh
H264VideoFileSink.$(CPP): include/H264VideoFileSink.hh include/OutputFile.hh
include/H264VideoFileSink.hh: include/H264or5VideoFileSink.hh
H265VideoFileSink.$(CPP): include/H265VideoFileSink.hh include/OutputFile.hh
include/H265VideoFileSink.hh: include/H264or5VideoFileSink.hh
OggFileSink.$(CPP): include/OggFileSink.hh include/OutputFile.hh include/VorbisAudioRTPSource.hh include/MPEG2TransportStreamMultiplexor.hh include/FramedSource.hh
include/OggFileSink.hh: include/FileSink.hh
RTPSink.$(CPP): include/RTPSink.hh
include/RTPSink.hh: include/MediaSink.hh include/RTPInterface.hh
MultiFramedRTPSink.$(CPP): include/MultiFramedRTPSink.hh
include/MultiFramedRTPSink.hh: include/RTPSink.hh
AudioRTPSink.$(CPP): include/AudioRTPSink.hh
include/AudioRTPSink.hh: include/MultiFramedRTPSink.hh
VideoRTPSink.$(CPP): include/VideoRTPSink.hh
include/VideoRTPSink.hh: include/MultiFramedRTPSink.hh
TextRTPSink.$(CPP): include/TextRTPSink.hh
include/TextRTPSink.hh: include/MultiFramedRTPSink.hh
RTPInterface.$(CPP): include/RTPInterface.hh
MPEG1or2AudioRTPSink.$(CPP): include/MPEG1or2AudioRTPSink.hh
include/MPEG1or2AudioRTPSink.hh: include/AudioRTPSink.hh
MP3ADURTPSink.$(CPP): include/MP3ADURTPSink.hh
include/MP3ADURTPSink.hh: include/AudioRTPSink.hh
MPEG1or2VideoRTPSink.$(CPP): include/MPEG1or2VideoRTPSink.hh include/MPEG1or2VideoStreamFramer.hh
include/MPEG1or2VideoRTPSink.hh: include/VideoRTPSink.hh
MPEG4LATMAudioRTPSink.$(CPP): include/MPEG4LATMAudioRTPSink.hh
include/MPEG4LATMAudioRTPSink.hh: include/AudioRTPSink.hh
MPEG4GenericRTPSink.$(CPP): include/MPEG4GenericRTPSink.hh include/Locale.hh
include/MPEG4GenericRTPSink.hh: include/MultiFramedRTPSink.hh
MPEG4ESVideoRTPSink.$(CPP): include/MPEG4ESVideoRTPSink.hh include/MPEG4VideoStreamFramer.hh include/MPEG4LATMAudioRTPSource.hh
include/MPEG4ESVideoRTPSink.hh: include/VideoRTPSink.hh
H263plusVideoRTPSink.$(CPP): include/H263plusVideoRTPSink.hh
include/H263plusVideoRTPSink.hh: include/VideoRTPSink.hh
H264or5VideoRTPSink.$(CPP): include/H264or5VideoRTPSink.hh include/H264or5VideoStreamFramer.hh
include/H264or5VideoRTPSink.hh: include/VideoRTPSink.hh include/FramedFilter.hh
H264VideoRTPSink.$(CPP): include/H264VideoRTPSink.hh include/H264VideoStreamFramer.hh include/Base64.hh include/H264VideoRTPSource.hh
include/H264VideoRTPSink.hh: include/H264or5VideoRTPSink.hh
H265VideoRTPSink.$(CPP): include/H265VideoRTPSink.hh include/H265VideoStreamFramer.hh include/Base64.hh include/BitVector.hh include/H264VideoRTPSource.hh
include/H265VideoRTPSink.hh: include/H264or5VideoRTPSink.hh
DVVideoRTPSink.$(CPP): include/DVVideoRTPSink.hh
include/DVVideoRTPSink.hh: include/VideoRTPSink.hh include/DVVideoStreamFramer.hh
include/DVVideoStreamFramer.hh: include/FramedFilter.hh
AC3AudioRTPSink.$(CPP): include/AC3AudioRTPSink.hh
include/AC3AudioRTPSink.hh: include/AudioRTPSink.hh
VorbisAudioRTPSink.$(CPP): include/VorbisAudioRTPSink.hh include/Base64.hh include/VorbisAudioRTPSource.hh
include/VorbisAudioRTPSink.hh: include/AudioRTPSink.hh
TheoraVideoRTPSink.$(CPP): include/TheoraVideoRTPSink.hh include/Base64.hh include/VorbisAudioRTPSource.hh include/VorbisAudioRTPSink.hh
include/TheoraVideoRTPSink.hh: include/VideoRTPSink.hh
VP8VideoRTPSink.$(CPP): include/VP8VideoRTPSink.hh
include/VP8VideoRTPSink.hh: include/VideoRTPSink.hh
VP9VideoRTPSink.$(CPP): include/VP9VideoRTPSink.hh
include/VP9VideoRTPSink.hh: include/VideoRTPSink.hh
GSMAudioRTPSink.$(CPP): include/GSMAudioRTPSink.hh
include/GSMAudioRTPSink.hh: include/AudioRTPSink.hh
JPEGVideoRTPSink.$(CPP): include/JPEGVideoRTPSink.hh include/JPEGVideoSource.hh
include/JPEGVideoRTPSink.hh: include/VideoRTPSink.hh
SimpleRTPSink.$(CPP): include/SimpleRTPSink.hh
include/SimpleRTPSink.hh: include/MultiFramedRTPSink.hh
AMRAudioRTPSink.$(CPP): include/AMRAudioRTPSink.hh include/AMRAudioSource.hh
include/AMRAudioRTPSink.hh: include/AudioRTPSink.hh
T140TextRTPSink.$(CPP): include/T140TextRTPSink.hh
include/T140TextRTPSink.hh: include/TextRTPSink.hh include/FramedFilter.hh
TCPStreamSink.$(CPP): include/TCPStreamSink.hh
include/TCPStreamSink.hh: include/MediaSink.hh
OutputFile.$(CPP): include/OutputFile.hh
uLawAudioFilter.$(CPP): include/uLawAudioFilter.hh
include/uLawAudioFilter.hh: include/FramedFilter.hh
MPEG2IndexFromTransportStream.$(CPP): include/MPEG2IndexFromTransportStream.hh
include/MPEG2IndexFromTransportStream.hh: include/FramedFilter.hh
MPEG2TransportStreamIndexFile.$(CPP): include/MPEG2TransportStreamIndexFile.hh include/InputFile.hh
include/MPEG2TransportStreamIndexFile.hh: include/Media.hh
MPEG2TransportStreamTrickModeFilter.$(CPP): include/MPEG2TransportStreamTrickModeFilter.hh include/ByteStreamFileSource.hh
include/MPEG2TransportStreamTrickModeFilter.hh: include/FramedFilter.hh include/MPEG2TransportStreamIndexFile.hh
RTCP.$(CPP): include/RTCP.hh rtcp_from_spec.h
include/RTCP.hh: include/RTPSink.hh include/RTPSource.hh
rtcp_from_spec.$(C): rtcp_from_spec.h
GenericMediaServer.$(CPP): include/GenericMediaServer.hh
include/GenericMediaServer.hh: include/ServerMediaSession.hh
RTSPServer.$(CPP): include/RTSPServer.hh include/RTSPCommon.hh include/RTSPRegisterSender.hh include/ProxyServerMediaSession.hh include/Base64.hh
include/RTSPServer.hh: include/GenericMediaServer.hh include/DigestAuthentication.hh
include/ServerMediaSession.hh: include/RTCP.hh
RTSPClient.$(CPP): include/RTSPClient.hh include/RTSPCommon.hh include/Base64.hh include/Locale.hh include/ourMD5.hh
include/RTSPClient.hh: include/MediaSession.hh include/DigestAuthentication.hh
RTSPCommon.$(CPP): include/RTSPCommon.hh include/Locale.hh
RTSPServerSupportingHTTPStreaming.$(CPP): include/RTSPServerSupportingHTTPStreaming.hh include/RTSPCommon.hh
include/RTSPServerSupportingHTTPStreaming.hh: include/RTSPServer.hh include/ByteStreamMemoryBufferSource.hh include/TCPStreamSink.hh
RTSPRegisterSender.$(CPP): include/RTSPRegisterSender.hh
include/RTSPRegisterSender.hh: include/RTSPClient.hh
SIPClient.$(CPP): include/SIPClient.hh
include/SIPClient.hh: include/MediaSession.hh include/DigestAuthentication.hh
MediaSession.$(CPP): include/liveMedia.hh include/Locale.hh
include/MediaSession.hh: include/RTCP.hh include/FramedFilter.hh
ServerMediaSession.$(CPP): include/ServerMediaSession.hh
PassiveServerMediaSubsession.$(CPP): include/PassiveServerMediaSubsession.hh
include/PassiveServerMediaSubsession.hh: include/ServerMediaSession.hh include/RTPSink.hh include/RTCP.hh
OnDemandServerMediaSubsession.$(CPP): include/OnDemandServerMediaSubsession.hh
include/OnDemandServerMediaSubsession.hh: include/ServerMediaSession.hh include/RTPSink.hh include/BasicUDPSink.hh include/RTCP.hh
FileServerMediaSubsession.$(CPP): include/FileServerMediaSubsession.hh
include/FileServerMediaSubsession.hh: include/OnDemandServerMediaSubsession.hh
MPEG4VideoFileServerMediaSubsession.$(CPP): include/MPEG4VideoFileServerMediaSubsession.hh include/MPEG4ESVideoRTPSink.hh include/ByteStreamFileSource.hh include/MPEG4VideoStreamFramer.hh
include/MPEG4VideoFileServerMediaSubsession.hh: include/FileServerMediaSubsession.hh
H264VideoFileServerMediaSubsession.$(CPP): include/H264VideoFileServerMediaSubsession.hh include/H264VideoRTPSink.hh include/ByteStreamFileSource.hh include/H264VideoStreamFramer.hh
include/H264VideoFileServerMediaSubsession.hh: include/FileServerMediaSubsession.hh
H265VideoFileServerMediaSubsession.$(CPP): include/H265VideoFileServerMediaSubsession.hh include/H265VideoRTPSink.hh include/ByteStreamFileSource.hh include/H265VideoStreamFramer.hh
include/H265VideoFileServerMediaSubsession.hh: include/FileServerMediaSubsession.hh
H263plusVideoFileServerMediaSubsession.$(CPP): include/H263plusVideoFileServerMediaSubsession.hh include/H263plusVideoRTPSink.hh include/ByteStreamFileSource.hh include/H263plusVideoStreamFramer.hh
include/H263plusVideoFileServerMediaSubsession.hh: include/FileServerMediaSubsession.hh
WAVAudioFileServerMediaSubsession.$(CPP): include/WAVAudioFileServerMediaSubsession.hh include/WAVAudioFileSource.hh include/uLawAudioFilter.hh include/SimpleRTPSink.hh
include/WAVAudioFileServerMediaSubsession.hh: include/FileServerMediaSubsession.hh
AMRAudioFileServerMediaSubsession.$(CPP): include/AMRAudioFileServerMediaSubsession.hh include/AMRAudioRTPSink.hh include/AMRAudioFileSource.hh
include/AMRAudioFileServerMediaSubsession.hh: include/FileServerMediaSubsession.hh
MP3AudioFileServerMediaSubsession.$(CPP): include/MP3AudioFileServerMediaSubsession.hh include/MPEG1or2AudioRTPSink.hh include/MP3ADURTPSink.hh include/MP3FileSource.hh include/MP3ADU.hh
include/MP3AudioFileServerMediaSubsession.hh: include/FileServerMediaSubsession.hh include/MP3ADUinterleaving.hh
MPEG1or2VideoFileServerMediaSubsession.$(CPP): include/MPEG1or2VideoFileServerMediaSubsession.hh include/MPEG1or2VideoRTPSink.hh include/ByteStreamFileSource.hh include/MPEG1or2VideoStreamFramer.hh
include/MPEG1or2VideoFileServerMediaSubsession.hh: include/FileServerMediaSubsession.hh
MPEG1or2FileServerDemux.$(CPP): include/MPEG1or2FileServerDemux.hh include/MPEG1or2DemuxedServerMediaSubsession.hh include/ByteStreamFileSource.hh
include/MPEG1or2FileServerDemux.hh: include/ServerMediaSession.hh include/MPEG1or2DemuxedElementaryStream.hh
MPEG1or2DemuxedServerMediaSubsession.$(CPP): include/MPEG1or2DemuxedServerMediaSubsession.hh include/MPEG1or2AudioStreamFramer.hh include/MPEG1or2AudioRTPSink.hh include/MPEG1or2VideoStreamFramer.hh include/MPEG1or2VideoRTPSink.hh include/AC3AudioStreamFramer.hh include/AC3AudioRTPSink.hh include/ByteStreamFileSource.hh
include/MPEG1or2DemuxedServerMediaSubsession.hh: include/OnDemandServerMediaSubsession.hh include/MPEG1or2FileServerDemux.hh
MPEG2TransportFileServerMediaSubsession.$(CPP): include/MPEG2TransportFileServerMediaSubsession.hh include/SimpleRTPSink.hh
include/MPEG2TransportFileServerMediaSubsession.hh: include/FileServerMediaSubsession.hh include/MPEG2TransportStreamFramer.hh include/ByteStreamFileSource.hh include/MPEG2TransportStreamTrickModeFilter.hh include/MPEG2TransportStreamFromESSource.hh
ADTSAudioFileServerMediaSubsession.$(CPP): include/ADTSAudioFileServerMediaSubsession.hh include/ADTSAudioFileSource.hh include/MPEG4GenericRTPSink.hh
include/ADTSAudioFileServerMediaSubsession.hh: include/FileServerMediaSubsession.hh
DVVideoFileServerMediaSubsession.$(CPP): include/DVVideoFileServerMediaSubsession.hh include/DVVideoRTPSink.hh include/ByteStreamFileSource.hh include/DVVideoStreamFramer.hh
include/DVVideoFileServerMediaSubsession.hh: include/FileServerMediaSubsession.hh
AC3AudioFileServerMediaSubsession.$(CPP): include/AC3AudioFileServerMediaSubsession.hh include/AC3AudioRTPSink.hh include/ByteStreamFileSource.hh include/AC3AudioStreamFramer.hh
include/AC3AudioFileServerMediaSubsession.hh: include/FileServerMediaSubsession.hh
MPEG2TransportUDPServerMediaSubsession.$(CPP): include/MPEG2TransportUDPServerMediaSubsession.hh include/BasicUDPSource.hh include/SimpleRTPSource.hh include/MPEG2TransportStreamFramer.hh include/SimpleRTPSink.hh
include/MPEG2TransportUDPServerMediaSubsession.hh: include/OnDemandServerMediaSubsession.hh
ProxyServerMediaSession.$(CPP): include/liveMedia.hh include/RTSPCommon.hh
include/ProxyServerMediaSession.hh: include/ServerMediaSession.hh include/MediaSession.hh include/RTSPClient.hh include/MediaTranscodingTable.hh
include/MediaTranscodingTable.hh: include/FramedFilter.hh include/MediaSession.hh
QuickTimeFileSink.$(CPP): include/QuickTimeFileSink.hh include/InputFile.hh include/OutputFile.hh include/QuickTimeGenericRTPSource.hh include/H263plusVideoRTPSource.hh include/MPEG4GenericRTPSource.hh include/MPEG4LATMAudioRTPSource.hh
include/QuickTimeFileSink.hh: include/MediaSession.hh
QuickTimeGenericRTPSource.$(CPP): include/QuickTimeGenericRTPSource.hh
include/QuickTimeGenericRTPSource.hh: include/MultiFramedRTPSource.hh
AVIFileSink.$(CPP): include/AVIFileSink.hh include/InputFile.hh include/OutputFile.hh
include/AVIFileSink.hh: include/MediaSession.hh
MatroskaFile.$(CPP): MatroskaFileParser.hh MatroskaDemuxedTrack.hh include/ByteStreamFileSource.hh include/H264VideoStreamDiscreteFramer.hh include/H265VideoStreamDiscreteFramer.hh include/MPEG1or2AudioRTPSink.hh include/MPEG4GenericRTPSink.hh include/AC3AudioRTPSink.hh include/SimpleRTPSink.hh include/VorbisAudioRTPSink.hh include/H264VideoRTPSink.hh include/H265VideoRTPSink.hh include/VP8VideoRTPSink.hh include/VP9VideoRTPSink.hh include/T140TextRTPSink.hh
MatroskaFileParser.hh: StreamParser.hh include/MatroskaFile.hh EBMLNumber.hh
include/MatroskaFile.hh: include/RTPSink.hh
MatroskaDemuxedTrack.hh: include/FramedSource.hh
MatroskaFileParser.$(CPP): MatroskaFileParser.hh MatroskaDemuxedTrack.hh include/ByteStreamFileSource.hh
EBMLNumber.$(CPP): EBMLNumber.hh
MatroskaDemuxedTrack.$(CPP): MatroskaDemuxedTrack.hh include/MatroskaFile.hh
MatroskaFileServerMediaSubsession.$(CPP): MatroskaFileServerMediaSubsession.hh MatroskaDemuxedTrack.hh include/FramedFilter.hh
MatroskaFileServerMediaSubsession.hh: include/FileServerMediaSubsession.hh include/MatroskaFileServerDemux.hh
MP3AudioMatroskaFileServerMediaSubsession.$(CPP): MP3AudioMatroskaFileServerMediaSubsession.hh MatroskaDemuxedTrack.hh
MP3AudioMatroskaFileServerMediaSubsession.hh: include/MP3AudioFileServerMediaSubsession.hh include/MatroskaFileServerDemux.hh
MatroskaFileServerDemux.$(CPP): include/MatroskaFileServerDemux.hh MP3AudioMatroskaFileServerMediaSubsession.hh MatroskaFileServerMediaSubsession.hh
include/MatroskaFileServerDemux.hh: include/ServerMediaSession.hh include/MatroskaFile.hh
OggFile.$(CPP): OggFileParser.hh OggDemuxedTrack.hh include/ByteStreamFileSource.hh include/VorbisAudioRTPSink.hh include/SimpleRTPSink.hh include/TheoraVideoRTPSink.hh
OggFileParser.hh: StreamParser.hh include/OggFile.hh
include/OggFile.hh: include/RTPSink.hh
OggDemuxedTrack.hh: include/FramedSource.hh
OggFileParser.$(CPP): OggFileParser.hh OggDemuxedTrack.hh
OggDemuxedTrack.$(CPP): OggDemuxedTrack.hh include/OggFile.hh
OggFileServerMediaSubsession.$(CPP): OggFileServerMediaSubsession.hh OggDemuxedTrack.hh include/FramedFilter.hh
OggFileServerMediaSubsession.hh: include/FileServerMediaSubsession.hh include/OggFileServerDemux.hh
OggFileServerDemux.$(CPP): include/OggFileServerDemux.hh OggFileServerMediaSubsession.hh
include/OggFileServerDemux.hh: include/ServerMediaSession.hh include/OggFile.hh
BitVector.$(CPP): include/BitVector.hh
StreamParser.$(CPP): StreamParser.hh
DigestAuthentication.$(CPP): include/DigestAuthentication.hh include/ourMD5.hh
ourMD5.$(CPP): include/ourMD5.hh
Base64.$(CPP): include/Base64.hh
Locale.$(CPP): include/Locale.hh
include/liveMedia.hh:: include/MPEG1or2AudioRTPSink.hh include/MP3ADURTPSink.hh include/MPEG1or2VideoRTPSink.hh include/MPEG4ESVideoRTPSink.hh include/BasicUDPSink.hh include/AMRAudioFileSink.hh include/H264VideoFileSink.hh include/H265VideoFileSink.hh include/OggFileSink.hh include/GSMAudioRTPSink.hh include/H263plusVideoRTPSink.hh include/H264VideoRTPSink.hh include/H265VideoRTPSink.hh include/DVVideoRTPSource.hh include/DVVideoRTPSink.hh include/DVVideoStreamFramer.hh include/H264VideoStreamFramer.hh include/H265VideoStreamFramer.hh include/H264VideoStreamDiscreteFramer.hh include/H265VideoStreamDiscreteFramer.hh include/JPEGVideoRTPSink.hh include/SimpleRTPSink.hh include/uLawAudioFilter.hh include/MPEG2IndexFromTransportStream.hh include/MPEG2TransportStreamTrickModeFilter.hh include/ByteStreamMultiFileSource.hh include/ByteStreamMemoryBufferSource.hh include/BasicUDPSource.hh include/SimpleRTPSource.hh include/MPEG1or2AudioRTPSource.hh include/MPEG4LATMAudioRTPSource.hh include/MPEG4LATMAudioRTPSink.hh include/MPEG4ESVideoRTPSource.hh include/MPEG4GenericRTPSource.hh include/MP3ADURTPSource.hh include/QCELPAudioRTPSource.hh include/AMRAudioRTPSource.hh include/JPEGVideoRTPSource.hh include/JPEGVideoSource.hh include/MPEG1or2VideoRTPSource.hh include/VorbisAudioRTPSource.hh include/TheoraVideoRTPSource.hh include/VP8VideoRTPSource.hh include/VP9VideoRTPSource.hh
include/liveMedia.hh:: include/MPEG2TransportStreamFromPESSource.hh include/MPEG2TransportStreamFromESSource.hh include/MPEG2TransportStreamFramer.hh include/ADTSAudioFileSource.hh include/H261VideoRTPSource.hh include/H263plusVideoRTPSource.hh include/H264VideoRTPSource.hh include/H265VideoRTPSource.hh include/MP3FileSource.hh include/MP3ADU.hh include/MP3ADUinterleaving.hh include/MP3Transcoder.hh include/MPEG1or2DemuxedElementaryStream.hh include/MPEG1or2AudioStreamFramer.hh include/MPEG1or2VideoStreamDiscreteFramer.hh include/MPEG4VideoStreamDiscreteFramer.hh include/H263plusVideoStreamFramer.hh include/AC3AudioStreamFramer.hh include/AC3AudioRTPSource.hh include/AC3AudioRTPSink.hh include/VorbisAudioRTPSink.hh include/TheoraVideoRTPSink.hh include/VP8VideoRTPSink.hh include/VP9VideoRTPSink.hh include/MPEG4GenericRTPSink.hh include/DeviceSource.hh include/AudioInputDevice.hh include/WAVAudioFileSource.hh include/StreamReplicator.hh include/RTSPRegisterSender.hh
include/liveMedia.hh:: include/RTSPServerSupportingHTTPStreaming.hh include/RTSPClient.hh include/SIPClient.hh include/QuickTimeFileSink.hh include/QuickTimeGenericRTPSource.hh include/AVIFileSink.hh include/PassiveServerMediaSubsession.hh include/MPEG4VideoFileServerMediaSubsession.hh include/H264VideoFileServerMediaSubsession.hh include/H265VideoFileServerMediaSubsession.hh include/WAVAudioFileServerMediaSubsession.hh include/AMRAudioFileServerMediaSubsession.hh include/AMRAudioFileSource.hh include/AMRAudioRTPSink.hh include/T140TextRTPSink.hh include/TCPStreamSink.hh include/MP3AudioFileServerMediaSubsession.hh include/MPEG1or2VideoFileServerMediaSubsession.hh include/MPEG1or2FileServerDemux.hh include/MPEG2TransportFileServerMediaSubsession.hh include/H263plusVideoFileServerMediaSubsession.hh include/ADTSAudioFileServerMediaSubsession.hh include/DVVideoFileServerMediaSubsession.hh include/AC3AudioFileServerMediaSubsession.hh include/MPEG2TransportUDPServerMediaSubsession.hh include/MatroskaFileServerDemux.hh include/OggFileServerDemux.hh include/ProxyServerMediaSession.hh
clean:
	-rm -rf *.$(OBJ) $(ALL) core *.core *~ include/*~
install: install1 $(INSTALL2)
install1: $(LIVEMEDIA_LIB)
	install -d $(DESTDIR)$(PREFIX)/include/liveMedia $(DESTDIR)$(LIBDIR)
	install -m 644 include/*.hh $(DESTDIR)$(PREFIX)/include/liveMedia
	install -m 644 $(LIVEMEDIA_LIB) $(DESTDIR)$(LIBDIR)
install_shared_libraries: $(LIVEMEDIA_LIB)
	ln -fs $(NAME).$(LIB_SUFFIX) $(DESTDIR)$(LIBDIR)/$(NAME).$(SHORT_LIB_SUFFIX)
	ln -fs $(NAME).$(LIB_SUFFIX) $(DESTDIR)$(LIBDIR)/$(NAME).so
##### Any additional, platform-specific rules come here:
live/liveMedia/MatroskaDemuxedTrack.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A media track, demultiplexed from a Matroska file
// Implementation
#include "MatroskaDemuxedTrack.hh"
#include "MatroskaFile.hh"
void MatroskaDemuxedTrack::seekToTime(double& seekNPT) {
fOurSourceDemux.seekToTime(seekNPT);
}
MatroskaDemuxedTrack::MatroskaDemuxedTrack(UsageEnvironment& env, unsigned trackNumber, MatroskaDemux& sourceDemux)
: FramedSource(env),
fOurTrackNumber(trackNumber), fOurSourceDemux(sourceDemux), fDurationImbalance(0),
fOpusTrackNumber(0) {
fPrevPresentationTime.tv_sec = 0; fPrevPresentationTime.tv_usec = 0;
}
MatroskaDemuxedTrack::~MatroskaDemuxedTrack() {
fOurSourceDemux.removeTrack(fOurTrackNumber);
}
void MatroskaDemuxedTrack::doGetNextFrame() {
fOurSourceDemux.continueReading();
}
char const* MatroskaDemuxedTrack::MIMEtype() const {
MatroskaTrack* track = fOurSourceDemux.fOurFile.lookup(fOurTrackNumber);
if (track == NULL) return "(unknown)"; // shouldn't happen
return track->mimeType;
}
live/liveMedia/MatroskaDemuxedTrack.hh:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A media track, demultiplexed from a Matroska file
// C++ header
#ifndef _MATROSKA_DEMUXED_TRACK_HH
#define _MATROSKA_DEMUXED_TRACK_HH
#ifndef _FRAMED_SOURCE_HH
#include "FramedSource.hh"
#endif
class MatroskaDemux; // forward
class MatroskaDemuxedTrack: public FramedSource {
public:
void seekToTime(double& seekNPT);
private: // We are created only by a MatroskaDemux (a friend)
friend class MatroskaDemux;
MatroskaDemuxedTrack(UsageEnvironment& env, unsigned trackNumber, MatroskaDemux& sourceDemux);
virtual ~MatroskaDemuxedTrack();
private:
// redefined virtual functions:
virtual void doGetNextFrame();
virtual char const* MIMEtype() const;
private: // We are accessed only by MatroskaDemux and by MatroskaFileParser (a friend)
friend class MatroskaFileParser;
unsigned char* to() { return fTo; }
unsigned maxSize() { return fMaxSize; }
unsigned& frameSize() { return fFrameSize; }
unsigned& numTruncatedBytes() { return fNumTruncatedBytes; }
struct timeval& presentationTime() { return fPresentationTime; }
unsigned& durationInMicroseconds() { return fDurationInMicroseconds; }
struct timeval& prevPresentationTime() { return fPrevPresentationTime; }
int& durationImbalance() { return fDurationImbalance; }
private:
unsigned fOurTrackNumber;
MatroskaDemux& fOurSourceDemux;
struct timeval fPrevPresentationTime;
int fDurationImbalance;
unsigned fOpusTrackNumber; // hack for Opus audio
};
#endif
live/liveMedia/MatroskaFile.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A class that encapsulates a Matroska file.
// Implementation
#include "MatroskaFileParser.hh"
#include "MatroskaDemuxedTrack.hh"
#include <ByteStreamFileSource.hh>
#include <H264VideoStreamDiscreteFramer.hh>
#include <H265VideoStreamDiscreteFramer.hh>
#include <MPEG1or2AudioRTPSink.hh>
#include <MPEG4GenericRTPSink.hh>
#include <AC3AudioRTPSink.hh>
#include <SimpleRTPSink.hh>
#include <VorbisAudioRTPSink.hh>
#include <TheoraVideoRTPSink.hh>
#include <H264VideoRTPSink.hh>
#include <H265VideoRTPSink.hh>
#include <VP8VideoRTPSink.hh>
#include <VP9VideoRTPSink.hh>
#include <T140TextRTPSink.hh>
////////// CuePoint definition //////////
class CuePoint {
public:
CuePoint(double cueTime, u_int64_t clusterOffsetInFile, unsigned blockNumWithinCluster/* 1-based */);
virtual ~CuePoint();
static void addCuePoint(CuePoint*& root, double cueTime, u_int64_t clusterOffsetInFile, unsigned blockNumWithinCluster/* 1-based */,
Boolean& needToReviseBalanceOfParent);
// If "cueTime" == "root.fCueTime", replace the existing data, otherwise add to the left or right subtree.
// (Note that this is a static member function because - as a result of tree rotation - "root" might change.)
Boolean lookup(double& cueTime, u_int64_t& resultClusterOffsetInFile, unsigned& resultBlockNumWithinCluster);
static void fprintf(FILE* fid, CuePoint* cuePoint); // used for debugging; it's static to allow for "cuePoint == NULL"
private:
// The "CuePoint" tree is implemented as an AVL Tree, to keep it balanced (for efficient lookup).
CuePoint* fSubTree[2]; // 0 => left; 1 => right
CuePoint* left() const { return fSubTree[0]; }
CuePoint* right() const { return fSubTree[1]; }
char fBalance; // height of right subtree - height of left subtree
static void rotate(unsigned direction/*0 => left; 1 => right*/, CuePoint*& root); // used to keep the tree in balance
double fCueTime;
u_int64_t fClusterOffsetInFile;
unsigned fBlockNumWithinCluster; // 0-based
};
UsageEnvironment& operator<<(UsageEnvironment& env, const CuePoint* cuePoint); // used for debugging
////////// MatroskaTrackTable definition /////////
// For looking up and iterating over the file's tracks:
class MatroskaTrackTable {
public:
MatroskaTrackTable();
virtual ~MatroskaTrackTable();
void add(MatroskaTrack* newTrack, unsigned trackNumber);
MatroskaTrack* lookup(unsigned trackNumber);
unsigned numTracks() const;
class Iterator {
public:
Iterator(MatroskaTrackTable& ourTable);
virtual ~Iterator();
MatroskaTrack* next();
private:
HashTable::Iterator* fIter;
};
private:
friend class Iterator;
HashTable* fTable;
};
////////// MatroskaFile implementation //////////
void MatroskaFile
::createNew(UsageEnvironment& env, char const* fileName, onCreationFunc* onCreation, void* onCreationClientData,
char const* preferredLanguage) {
new MatroskaFile(env, fileName, onCreation, onCreationClientData, preferredLanguage);
}
MatroskaFile::MatroskaFile(UsageEnvironment& env, char const* fileName, onCreationFunc* onCreation, void* onCreationClientData,
char const* preferredLanguage)
: Medium(env),
fFileName(strDup(fileName)), fOnCreation(onCreation), fOnCreationClientData(onCreationClientData),
fPreferredLanguage(strDup(preferredLanguage)),
fTimecodeScale(1000000), fSegmentDuration(0.0), fSegmentDataOffset(0), fClusterOffset(0), fCuesOffset(0), fCuePoints(NULL),
fChosenVideoTrackNumber(0), fChosenAudioTrackNumber(0), fChosenSubtitleTrackNumber(0) {
fTrackTable = new MatroskaTrackTable;
fDemuxesTable = HashTable::create(ONE_WORD_HASH_KEYS);
FramedSource* inputSource = ByteStreamFileSource::createNew(envir(), fileName);
if (inputSource == NULL) {
// The specified input file does not exist!
fParserForInitialization = NULL;
handleEndOfTrackHeaderParsing(); // we have no file, and thus no tracks, but we still need to signal this
} else {
// Initialize ourselves by parsing the file's 'Track' headers:
fParserForInitialization = new MatroskaFileParser(*this, inputSource, handleEndOfTrackHeaderParsing, this, NULL);
}
}
MatroskaFile::~MatroskaFile() {
delete fParserForInitialization;
delete fCuePoints;
// Delete any outstanding "MatroskaDemux"s, and the table for them:
MatroskaDemux* demux;
while ((demux = (MatroskaDemux*)fDemuxesTable->RemoveNext()) != NULL) {
delete demux;
}
delete fDemuxesTable;
delete fTrackTable;
delete[] (char*)fPreferredLanguage;
delete[] (char*)fFileName;
}
void MatroskaFile::handleEndOfTrackHeaderParsing(void* clientData) {
((MatroskaFile*)clientData)->handleEndOfTrackHeaderParsing();
}
class TrackChoiceRecord {
public:
unsigned trackNumber;
u_int8_t trackType;
unsigned choiceFlags;
};
void MatroskaFile::handleEndOfTrackHeaderParsing() {
// Having parsed all of our track headers, iterate through the tracks to figure out which ones should be played.
// The Matroska 'specification' is rather imprecise about this (as usual). However, we use the following algorithm:
// - Use one (but no more) enabled track of each type (video, audio, subtitle). (Ignore all tracks that are not 'enabled'.)
// - For each track type, choose the one that's 'forced'.
// - If more than one is 'forced', choose the first one that matches our preferred language, or the first if none matches.
// - If none is 'forced', choose the one that's 'default'.
// - If more than one is 'default', choose the first one that matches our preferred language, or the first if none matches.
// - If none is 'default', choose the first one that matches our preferred language, or the first if none matches.
unsigned numTracks = fTrackTable->numTracks();
if (numTracks > 0) {
TrackChoiceRecord* trackChoice = new TrackChoiceRecord[numTracks];
unsigned numEnabledTracks = 0;
MatroskaTrackTable::Iterator iter(*fTrackTable);
MatroskaTrack* track;
while ((track = iter.next()) != NULL) {
if (!track->isEnabled || track->trackType == 0 || track->mimeType[0] == '\0') continue; // track not enabled, or not fully-defined
trackChoice[numEnabledTracks].trackNumber = track->trackNumber;
trackChoice[numEnabledTracks].trackType = track->trackType;
// Assign flags for this track so that, when sorted, the largest value becomes our choice:
unsigned choiceFlags = 0;
if (fPreferredLanguage != NULL && track->language != NULL && strcmp(fPreferredLanguage, track->language) == 0) {
// This track matches our preferred language:
choiceFlags |= 1;
}
if (track->isForced) {
choiceFlags |= 4;
} else if (track->isDefault) {
choiceFlags |= 2;
}
trackChoice[numEnabledTracks].choiceFlags = choiceFlags;
++numEnabledTracks;
}
// Choose the desired track for each track type:
for (u_int8_t trackType = 0x01; trackType != MATROSKA_TRACK_TYPE_OTHER; trackType <<= 1) {
int bestNum = -1;
int bestChoiceFlags = -1;
for (unsigned i = 0; i < numEnabledTracks; ++i) {
if (trackChoice[i].trackType == trackType && (int)trackChoice[i].choiceFlags > bestChoiceFlags) {
bestNum = i;
bestChoiceFlags = (int)trackChoice[i].choiceFlags;
}
}
if (bestChoiceFlags >= 0) { // There is a track for this track type
if (trackType == MATROSKA_TRACK_TYPE_VIDEO) fChosenVideoTrackNumber = trackChoice[bestNum].trackNumber;
else if (trackType == MATROSKA_TRACK_TYPE_AUDIO) fChosenAudioTrackNumber = trackChoice[bestNum].trackNumber;
else fChosenSubtitleTrackNumber = trackChoice[bestNum].trackNumber;
}
}
delete[] trackChoice;
}
#ifdef DEBUG
if (fChosenVideoTrackNumber > 0) fprintf(stderr, "Chosen video track: #%d\n", fChosenVideoTrackNumber); else fprintf(stderr, "No chosen video track\n");
if (fChosenAudioTrackNumber > 0) fprintf(stderr, "Chosen audio track: #%d\n", fChosenAudioTrackNumber); else fprintf(stderr, "No chosen audio track\n");
if (fChosenSubtitleTrackNumber > 0) fprintf(stderr, "Chosen subtitle track: #%d\n", fChosenSubtitleTrackNumber); else fprintf(stderr, "No chosen subtitle track\n");
#endif
// Delete our parser, because it's done its job now:
delete fParserForInitialization; fParserForInitialization = NULL;
// Finally, signal our caller that we've been created and initialized:
if (fOnCreation != NULL) (*fOnCreation)(this, fOnCreationClientData);
}
MatroskaTrack* MatroskaFile::lookup(unsigned trackNumber) const {
return fTrackTable->lookup(trackNumber);
}
MatroskaDemux* MatroskaFile::newDemux() {
MatroskaDemux* demux = new MatroskaDemux(*this);
fDemuxesTable->Add((char const*)demux, demux);
return demux;
}
void MatroskaFile::removeDemux(MatroskaDemux* demux) {
fDemuxesTable->Remove((char const*)demux);
}
float MatroskaFile::fileDuration() {
if (fCuePoints == NULL) return 0.0; // Hack, because the RTSP server code assumes that duration > 0 => seekable. (fix this) #####
return segmentDuration()*(timecodeScale()/1000000000.0f);
}
FramedSource* MatroskaFile
::createSourceForStreaming(FramedSource* baseSource, unsigned trackNumber,
unsigned& estBitrate, unsigned& numFiltersInFrontOfTrack) {
if (baseSource == NULL) return NULL;
FramedSource* result = baseSource; // by default
estBitrate = 100; // by default
numFiltersInFrontOfTrack = 0; // by default
// Look at the track's MIME type to set its estimated bitrate (for use by RTCP).
// (Later, try to be smarter about figuring out the bitrate.) #####
// Some MIME types also require adding a special 'framer' in front of the source.
MatroskaTrack* track = lookup(trackNumber);
if (track != NULL) { // should always be true
if (strcmp(track->mimeType, "audio/MPEG") == 0) {
estBitrate = 128;
} else if (strcmp(track->mimeType, "audio/AAC") == 0) {
estBitrate = 96;
} else if (strcmp(track->mimeType, "audio/AC3") == 0) {
estBitrate = 48;
} else if (strcmp(track->mimeType, "audio/VORBIS") == 0) {
estBitrate = 96;
} else if (strcmp(track->mimeType, "video/H264") == 0) {
estBitrate = 500;
// Allow for the possibility of very large NAL units being fed to the sink object:
OutPacketBuffer::increaseMaxSizeTo(300000); // bytes
// Add a framer in front of the source:
result = H264VideoStreamDiscreteFramer::createNew(envir(), result);
++numFiltersInFrontOfTrack;
} else if (strcmp(track->mimeType, "video/H265") == 0) {
estBitrate = 500;
// Allow for the possibility of very large NAL units being fed to the sink object:
OutPacketBuffer::increaseMaxSizeTo(300000); // bytes
// Add a framer in front of the source:
result = H265VideoStreamDiscreteFramer::createNew(envir(), result);
++numFiltersInFrontOfTrack;
} else if (strcmp(track->mimeType, "video/VP8") == 0) {
estBitrate = 500;
} else if (strcmp(track->mimeType, "video/VP9") == 0) {
estBitrate = 500;
} else if (strcmp(track->mimeType, "video/THEORA") == 0) {
estBitrate = 500;
} else if (strcmp(track->mimeType, "text/T140") == 0) {
estBitrate = 48;
}
}
return result;
}
#define getPrivByte(b) if (n == 0) break; else do {b = *p++; --n;} while (0) /* Vorbis/Theora configuration header parsing */
#define CHECK_PTR if (ptr >= limit) break /* H.264/H.265 parsing */
#define NUM_BYTES_REMAINING (unsigned)(limit - ptr) /* H.264/H.265 parsing */
RTPSink* MatroskaFile
::createRTPSinkForTrackNumber(unsigned trackNumber, Groupsock* rtpGroupsock,
unsigned char rtpPayloadTypeIfDynamic) {
RTPSink* result = NULL; // default value, if an error occurs
do {
MatroskaTrack* track = lookup(trackNumber);
if (track == NULL) break;
if (strcmp(track->mimeType, "audio/MPEG") == 0) {
result = MPEG1or2AudioRTPSink::createNew(envir(), rtpGroupsock);
} else if (strcmp(track->mimeType, "audio/AAC") == 0) {
// The Matroska file's 'Codec Private' data is assumed to be the AAC configuration
// information. Use this to generate a hexadecimal 'config' string for the new RTP sink:
char* configStr = new char[2*track->codecPrivateSize + 1]; if (configStr == NULL) break;
// 2 hex digits per byte, plus the trailing '\0'
for (unsigned i = 0; i < track->codecPrivateSize; ++i) {
sprintf(&configStr[2*i], "%02X", track->codecPrivate[i]);
}
result = MPEG4GenericRTPSink::createNew(envir(), rtpGroupsock,
rtpPayloadTypeIfDynamic,
track->samplingFrequency,
"audio", "AAC-hbr", configStr,
track->numChannels);
delete[] configStr;
} else if (strcmp(track->mimeType, "audio/AC3") == 0) {
result = AC3AudioRTPSink
::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic, track->samplingFrequency);
} else if (strcmp(track->mimeType, "audio/OPUS") == 0) {
result = SimpleRTPSink
::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
48000, "audio", "OPUS", 2, False/*only 1 Opus 'packet' in each RTP packet*/);
} else if (strcmp(track->mimeType, "audio/VORBIS") == 0 || strcmp(track->mimeType, "video/THEORA") == 0) {
// The Matroska file's 'Codec Private' data is assumed to be the codec configuration
// information, containing the "Identification", "Comment", and "Setup" headers.
// Extract these headers now:
u_int8_t* identificationHeader = NULL; unsigned identificationHeaderSize = 0;
u_int8_t* commentHeader = NULL; unsigned commentHeaderSize = 0;
u_int8_t* setupHeader = NULL; unsigned setupHeaderSize = 0;
Boolean isTheora = strcmp(track->mimeType, "video/THEORA") == 0; // otherwise, Vorbis
do {
u_int8_t* p = track->codecPrivate;
unsigned n = track->codecPrivateSize;
if (n == 0 || p == NULL) break; // we have no 'Codec Private' data
u_int8_t numHeaders;
getPrivByte(numHeaders);
unsigned headerSize[3]; // we don't handle any more than 2+1 headers
// Extract the sizes of each of these headers:
unsigned sizesSum = 0;
Boolean success = True;
unsigned i;
for (i = 0; i < numHeaders && i < 3; ++i) {
unsigned len = 0;
u_int8_t c;
do {
success = False;
getPrivByte(c);
success = True;
len += c;
} while (c == 255);
if (!success || len == 0) break;
headerSize[i] = len;
sizesSum += len;
}
if (!success) break;
// Compute the implicit size of the final header:
if (numHeaders < 3) {
int finalHeaderSize = n - sizesSum;
if (finalHeaderSize <= 0) break; // error in data; give up
headerSize[numHeaders] = (unsigned)finalHeaderSize;
++numHeaders; // include the final header now
} else {
numHeaders = 3; // The maximum number of headers that we handle
}
// Then, extract and classify each header:
for (i = 0; i < numHeaders; ++i) {
success = False;
unsigned newHeaderSize = headerSize[i];
u_int8_t* newHeader = new u_int8_t[newHeaderSize];
if (newHeader == NULL) break;
u_int8_t* hdr = newHeader;
while (newHeaderSize-- > 0) {
success = False;
getPrivByte(*hdr++);
success = True;
}
if (!success) {
delete[] newHeader;
break;
}
u_int8_t headerType = newHeader[0];
if (headerType == 1 || (isTheora && headerType == 0x80)) { // "identification" header
delete[] identificationHeader; identificationHeader = newHeader;
identificationHeaderSize = headerSize[i];
} else if (headerType == 3 || (isTheora && headerType == 0x81)) { // "comment" header
delete[] commentHeader; commentHeader = newHeader;
commentHeaderSize = headerSize[i];
} else if (headerType == 5 || (isTheora && headerType == 0x82)) { // "setup" header
delete[] setupHeader; setupHeader = newHeader;
setupHeaderSize = headerSize[i];
} else {
delete[] newHeader; // because it was a header type that we don't understand
}
}
if (!success) break;
if (isTheora) {
result = TheoraVideoRTPSink
::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
identificationHeader, identificationHeaderSize,
commentHeader, commentHeaderSize,
setupHeader, setupHeaderSize);
} else { // Vorbis
result = VorbisAudioRTPSink
::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
track->samplingFrequency, track->numChannels,
identificationHeader, identificationHeaderSize,
commentHeader, commentHeaderSize,
setupHeader, setupHeaderSize);
}
} while (0);
delete[] identificationHeader; delete[] commentHeader; delete[] setupHeader;
} else if (strcmp(track->mimeType, "video/H264") == 0) {
// Use our track's 'Codec Private' data: Bytes 5 and beyond contain SPS and PPSs:
u_int8_t* SPS = NULL; unsigned SPSSize = 0;
u_int8_t* PPS = NULL; unsigned PPSSize = 0;
u_int8_t* SPSandPPSBytes = NULL; unsigned numSPSandPPSBytes = 0;
do {
if (track->codecPrivateSize < 6) break;
numSPSandPPSBytes = track->codecPrivateSize - 5;
SPSandPPSBytes = &track->codecPrivate[5];
// Extract, from "SPSandPPSBytes", one SPS NAL unit, and one PPS NAL unit.
// (I hope one is all we need of each.)
unsigned i;
u_int8_t* ptr = SPSandPPSBytes;
u_int8_t* limit = &SPSandPPSBytes[numSPSandPPSBytes];
unsigned numSPSs = (*ptr++)&0x1F; CHECK_PTR;
for (i = 0; i < numSPSs; ++i) {
unsigned spsSize = (*ptr++)<<8; CHECK_PTR;
spsSize |= *ptr++; CHECK_PTR;
if (spsSize > NUM_BYTES_REMAINING) break;
u_int8_t nal_unit_type = ptr[0]&0x1F;
if (SPS == NULL && nal_unit_type == 7/*sanity check*/) { // save the first one
SPSSize = spsSize;
SPS = new u_int8_t[spsSize];
memmove(SPS, ptr, spsSize);
}
ptr += spsSize;
}
unsigned numPPSs = (*ptr++)&0x1F; CHECK_PTR;
for (i = 0; i < numPPSs; ++i) {
unsigned ppsSize = (*ptr++)<<8; CHECK_PTR;
ppsSize |= *ptr++; CHECK_PTR;
if (ppsSize > NUM_BYTES_REMAINING) break;
u_int8_t nal_unit_type = ptr[0]&0x1F;
if (PPS == NULL && nal_unit_type == 8/*sanity check*/) { // save the first one
PPSSize = ppsSize;
PPS = new u_int8_t[ppsSize];
memmove(PPS, ptr, ppsSize);
}
ptr += ppsSize;
}
} while (0);
result = H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
SPS, SPSSize, PPS, PPSSize);
delete[] SPS; delete[] PPS;
} else if (strcmp(track->mimeType, "video/H265") == 0) {
u_int8_t* VPS = NULL; unsigned VPSSize = 0;
u_int8_t* SPS = NULL; unsigned SPSSize = 0;
u_int8_t* PPS = NULL; unsigned PPSSize = 0;
u_int8_t* VPS_SPS_PPSBytes = NULL; unsigned numVPS_SPS_PPSBytes = 0;
unsigned i;
do {
if (track->codecPrivateUsesH264FormatForH265) {
// The data uses the H.264-style format (but including VPS NAL unit(s)).
// The VPS,SPS,PPS NAL unit information starts at byte #5:
if (track->codecPrivateSize >= 6) {
numVPS_SPS_PPSBytes = track->codecPrivateSize - 5;
VPS_SPS_PPSBytes = &track->codecPrivate[5];
}
} else {
// The data uses the proper H.265-style format.
// The VPS,SPS,PPS NAL unit information starts at byte #22:
if (track->codecPrivateSize >= 23) {
numVPS_SPS_PPSBytes = track->codecPrivateSize - 22;
VPS_SPS_PPSBytes = &track->codecPrivate[22];
}
}
// Extract, from "VPS_SPS_PPSBytes", one VPS NAL unit, one SPS NAL unit, and one PPS NAL unit.
// (I hope one is all we need of each.)
if (numVPS_SPS_PPSBytes == 0 || VPS_SPS_PPSBytes == NULL) break; // sanity check
u_int8_t* ptr = VPS_SPS_PPSBytes;
u_int8_t* limit = &VPS_SPS_PPSBytes[numVPS_SPS_PPSBytes];
if (track->codecPrivateUsesH264FormatForH265) {
// The data uses the H.264-style format (but including VPS NAL unit(s)).
while (NUM_BYTES_REMAINING > 0) {
unsigned numNALUnits = (*ptr++)&0x1F; CHECK_PTR;
for (i = 0; i < numNALUnits; ++i) {
unsigned nalUnitLength = (*ptr++)<<8; CHECK_PTR;
nalUnitLength |= *ptr++; CHECK_PTR;
if (nalUnitLength > NUM_BYTES_REMAINING) break;
u_int8_t nal_unit_type = (ptr[0]&0x7E)>>1;
if (nal_unit_type == 32) { // VPS
VPSSize = nalUnitLength;
delete[] VPS; VPS = new u_int8_t[nalUnitLength];
memmove(VPS, ptr, nalUnitLength);
} else if (nal_unit_type == 33) { // SPS
SPSSize = nalUnitLength;
delete[] SPS; SPS = new u_int8_t[nalUnitLength];
memmove(SPS, ptr, nalUnitLength);
} else if (nal_unit_type == 34) { // PPS
PPSSize = nalUnitLength;
delete[] PPS; PPS = new u_int8_t[nalUnitLength];
memmove(PPS, ptr, nalUnitLength);
}
ptr += nalUnitLength;
}
}
} else {
// The data uses the proper H.265-style format.
unsigned numOfArrays = *ptr++; CHECK_PTR;
for (unsigned j = 0; j < numOfArrays; ++j) {
++ptr; CHECK_PTR; // skip the 'array_completeness'|'reserved'|'NAL_unit_type' byte
unsigned numNalus = (*ptr++)<<8; CHECK_PTR;
numNalus |= *ptr++; CHECK_PTR;
for (i = 0; i < numNalus; ++i) {
unsigned nalUnitLength = (*ptr++)<<8; CHECK_PTR;
nalUnitLength |= *ptr++; CHECK_PTR;
if (nalUnitLength > NUM_BYTES_REMAINING) break;
u_int8_t nal_unit_type = (ptr[0]&0x7E)>>1;
if (nal_unit_type == 32) { // VPS
VPSSize = nalUnitLength;
delete[] VPS; VPS = new u_int8_t[nalUnitLength];
memmove(VPS, ptr, nalUnitLength);
} else if (nal_unit_type == 33) { // SPS
SPSSize = nalUnitLength;
delete[] SPS; SPS = new u_int8_t[nalUnitLength];
memmove(SPS, ptr, nalUnitLength);
} else if (nal_unit_type == 34) { // PPS
PPSSize = nalUnitLength;
delete[] PPS; PPS = new u_int8_t[nalUnitLength];
memmove(PPS, ptr, nalUnitLength);
}
ptr += nalUnitLength;
}
}
}
} while (0);
result = H265VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
VPS, VPSSize, SPS, SPSSize, PPS, PPSSize);
delete[] VPS; delete[] SPS; delete[] PPS;
} else if (strcmp(track->mimeType, "video/VP8") == 0) {
result = VP8VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
} else if (strcmp(track->mimeType, "video/VP9") == 0) {
result = VP9VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
} else if (strcmp(track->mimeType, "text/T140") == 0) {
result = T140TextRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
}
} while (0);
return result;
}
void MatroskaFile::addTrack(MatroskaTrack* newTrack, unsigned trackNumber) {
fTrackTable->add(newTrack, trackNumber);
}
void MatroskaFile::addCuePoint(double cueTime, u_int64_t clusterOffsetInFile, unsigned blockNumWithinCluster) {
Boolean dummy = False; // not used
CuePoint::addCuePoint(fCuePoints, cueTime, clusterOffsetInFile, blockNumWithinCluster, dummy);
}
Boolean MatroskaFile::lookupCuePoint(double& cueTime, u_int64_t& resultClusterOffsetInFile, unsigned& resultBlockNumWithinCluster) {
if (fCuePoints == NULL) return False;
(void)fCuePoints->lookup(cueTime, resultClusterOffsetInFile, resultBlockNumWithinCluster);
return True;
}
void MatroskaFile::printCuePoints(FILE* fid) {
CuePoint::fprintf(fid, fCuePoints);
}
////////// MatroskaTrackTable implementation //////////
MatroskaTrackTable::MatroskaTrackTable()
: fTable(HashTable::create(ONE_WORD_HASH_KEYS)) {
}
MatroskaTrackTable::~MatroskaTrackTable() {
// Remove and delete all of our "MatroskaTrack" descriptors, and the hash table itself:
MatroskaTrack* track;
while ((track = (MatroskaTrack*)fTable->RemoveNext()) != NULL) {
delete track;
}
delete fTable;
}
void MatroskaTrackTable::add(MatroskaTrack* newTrack, unsigned trackNumber) {
if (newTrack != NULL && newTrack->trackNumber != 0) fTable->Remove((char const*)newTrack->trackNumber);
MatroskaTrack* existingTrack = (MatroskaTrack*)fTable->Add((char const*)trackNumber, newTrack);
delete existingTrack; // in case it wasn't NULL
}
MatroskaTrack* MatroskaTrackTable::lookup(unsigned trackNumber) {
return (MatroskaTrack*)fTable->Lookup((char const*)trackNumber);
}
unsigned MatroskaTrackTable::numTracks() const { return fTable->numEntries(); }
MatroskaTrackTable::Iterator::Iterator(MatroskaTrackTable& ourTable) {
fIter = HashTable::Iterator::create(*(ourTable.fTable));
}
MatroskaTrackTable::Iterator::~Iterator() {
delete fIter;
}
MatroskaTrack* MatroskaTrackTable::Iterator::next() {
char const* key;
return (MatroskaTrack*)fIter->next(key);
}
////////// MatroskaTrack implementation //////////
MatroskaTrack::MatroskaTrack()
: trackNumber(0/*not set*/), trackType(0/*unknown*/),
isEnabled(True), isDefault(True), isForced(False),
defaultDuration(0),
name(NULL), language(NULL), codecID(NULL),
samplingFrequency(0), numChannels(2), mimeType(""),
codecPrivateSize(0), codecPrivate(NULL),
codecPrivateUsesH264FormatForH265(False), codecIsOpus(False),
headerStrippedBytesSize(0), headerStrippedBytes(NULL),
subframeSizeSize(0) {
}
MatroskaTrack::~MatroskaTrack() {
delete[] name; delete[] language; delete[] codecID;
delete[] codecPrivate;
delete[] headerStrippedBytes;
}
////////// MatroskaDemux implementation //////////
MatroskaDemux::MatroskaDemux(MatroskaFile& ourFile)
: Medium(ourFile.envir()),
fOurFile(ourFile), fDemuxedTracksTable(HashTable::create(ONE_WORD_HASH_KEYS)),
fNextTrackTypeToCheck(0x1) {
fOurParser = new MatroskaFileParser(ourFile, ByteStreamFileSource::createNew(envir(), ourFile.fileName()),
handleEndOfFile, this, this);
}
MatroskaDemux::~MatroskaDemux() {
// Begin by acting as if we've reached the end of the source file. This should cause all of our demuxed tracks to get closed.
handleEndOfFile();
// Then delete our table of "MatroskaDemuxedTrack"s
// - but not the "MatroskaDemuxedTrack"s themselves; that should have already happened:
delete fDemuxedTracksTable;
delete fOurParser;
fOurFile.removeDemux(this);
}
FramedSource* MatroskaDemux::newDemuxedTrack() {
unsigned dummyResultTrackNumber;
return newDemuxedTrack(dummyResultTrackNumber);
}
FramedSource* MatroskaDemux::newDemuxedTrack(unsigned& resultTrackNumber) {
FramedSource* result;
resultTrackNumber = 0;
for (result = NULL; result == NULL && fNextTrackTypeToCheck != MATROSKA_TRACK_TYPE_OTHER;
fNextTrackTypeToCheck <<= 1) {
if (fNextTrackTypeToCheck == MATROSKA_TRACK_TYPE_VIDEO) resultTrackNumber = fOurFile.chosenVideoTrackNumber();
else if (fNextTrackTypeToCheck == MATROSKA_TRACK_TYPE_AUDIO) resultTrackNumber = fOurFile.chosenAudioTrackNumber();
else if (fNextTrackTypeToCheck == MATROSKA_TRACK_TYPE_SUBTITLE) resultTrackNumber = fOurFile.chosenSubtitleTrackNumber();
result = newDemuxedTrackByTrackNumber(resultTrackNumber);
}
return result;
}
FramedSource* MatroskaDemux::newDemuxedTrackByTrackNumber(unsigned trackNumber) {
if (trackNumber == 0) return NULL;
FramedSource* trackSource = new MatroskaDemuxedTrack(envir(), trackNumber, *this);
fDemuxedTracksTable->Add((char const*)trackNumber, trackSource);
return trackSource;
}
MatroskaDemuxedTrack* MatroskaDemux::lookupDemuxedTrack(unsigned trackNumber) {
return (MatroskaDemuxedTrack*)fDemuxedTracksTable->Lookup((char const*)trackNumber);
}
void MatroskaDemux::removeTrack(unsigned trackNumber) {
fDemuxedTracksTable->Remove((char const*)trackNumber);
if (fDemuxedTracksTable->numEntries() == 0) {
// We no longer have any demuxed tracks, so delete ourselves now:
Medium::close(this);
}
}
void MatroskaDemux::continueReading() {
fOurParser->continueParsing();
}
void MatroskaDemux::seekToTime(double& seekNPT) {
if (fOurParser != NULL) fOurParser->seekToTime(seekNPT);
}
void MatroskaDemux::handleEndOfFile(void* clientData) {
((MatroskaDemux*)clientData)->handleEndOfFile();
}
void MatroskaDemux::handleEndOfFile() {
// Iterate through all of our 'demuxed tracks', handling 'end of input' on each one.
// Hack: Because this can cause the hash table to get modified underneath us, we don't call the handlers until after we've
// first iterated through all of the tracks.
unsigned numTracks = fDemuxedTracksTable->numEntries();
if (numTracks == 0) return;
MatroskaDemuxedTrack** tracks = new MatroskaDemuxedTrack*[numTracks];
HashTable::Iterator* iter = HashTable::Iterator::create(*fDemuxedTracksTable);
unsigned i;
char const* trackNumber;
for (i = 0; i < numTracks; ++i) {
tracks[i] = (MatroskaDemuxedTrack*)iter->next(trackNumber);
}
delete iter;
for (i = 0; i < numTracks; ++i) {
if (tracks[i] == NULL) continue; // sanity check; shouldn't happen
tracks[i]->handleClosure();
}
delete[] tracks;
}
////////// CuePoint implementation //////////
CuePoint::CuePoint(double cueTime, u_int64_t clusterOffsetInFile, unsigned blockNumWithinCluster)
: fBalance(0),
fCueTime(cueTime), fClusterOffsetInFile(clusterOffsetInFile), fBlockNumWithinCluster(blockNumWithinCluster - 1) {
fSubTree[0] = fSubTree[1] = NULL;
}
CuePoint::~CuePoint() {
delete fSubTree[0]; delete fSubTree[1];
}
void CuePoint::addCuePoint(CuePoint*& root, double cueTime, u_int64_t clusterOffsetInFile, unsigned blockNumWithinCluster,
Boolean& needToReviseBalanceOfParent) {
needToReviseBalanceOfParent = False; // by default; may get changed below
if (root == NULL) {
root = new CuePoint(cueTime, clusterOffsetInFile, blockNumWithinCluster);
needToReviseBalanceOfParent = True;
} else if (cueTime == root->fCueTime) {
// Replace existing data:
root->fClusterOffsetInFile = clusterOffsetInFile;
root->fBlockNumWithinCluster = blockNumWithinCluster - 1;
} else {
// Add to our left or right subtree:
int direction = cueTime > root->fCueTime; // 0 (left) or 1 (right)
Boolean needToReviseOurBalance = False;
addCuePoint(root->fSubTree[direction], cueTime, clusterOffsetInFile, blockNumWithinCluster, needToReviseOurBalance);
if (needToReviseOurBalance) {
// We need to change our 'balance' number, perhaps while also performing a rotation to bring ourself back into balance:
if (root->fBalance == 0) {
// We were balanced before, but now we're unbalanced (by 1) on the "direction" side:
root->fBalance = -1 + 2*direction; // -1 for "direction" 0; 1 for "direction" 1
needToReviseBalanceOfParent = True;
} else if (root->fBalance == 1 - 2*direction) { // 1 for "direction" 0; -1 for "direction" 1
// We were unbalanced (by 1) on the side opposite to where we added an entry, so now we're balanced:
root->fBalance = 0;
} else {
// We were unbalanced (by 1) on the side where we added an entry, so now we're unbalanced by 2, and have to rebalance:
if (root->fSubTree[direction]->fBalance == -1 + 2*direction) { // -1 for "direction" 0; 1 for "direction" 1
// We're 'doubly-unbalanced' on this side, so perform a single rotation in the opposite direction:
root->fBalance = root->fSubTree[direction]->fBalance = 0;
rotate(1-direction, root);
} else {
// This is the Left-Right case (for "direction" 0) or the Right-Left case (for "direction" 1); perform two rotations:
char newParentCurBalance = root->fSubTree[direction]->fSubTree[1-direction]->fBalance;
if (newParentCurBalance == 1 - 2*direction) { // 1 for "direction" 0; -1 for "direction" 1
root->fBalance = 0;
root->fSubTree[direction]->fBalance = -1 + 2*direction; // -1 for "direction" 0; 1 for "direction" 1
} else if (newParentCurBalance == 0) {
root->fBalance = 0;
root->fSubTree[direction]->fBalance = 0;
} else {
root->fBalance = 1 - 2*direction; // 1 for "direction" 0; -1 for "direction" 1
root->fSubTree[direction]->fBalance = 0;
}
rotate(direction, root->fSubTree[direction]);
root->fSubTree[direction]->fBalance = 0; // the new root will be balanced
rotate(1-direction, root);
}
}
}
}
}
Boolean CuePoint::lookup(double& cueTime, u_int64_t& resultClusterOffsetInFile, unsigned& resultBlockNumWithinCluster) {
if (cueTime < fCueTime) {
if (left() == NULL) {
resultClusterOffsetInFile = 0;
resultBlockNumWithinCluster = 0;
return False;
} else {
return left()->lookup(cueTime, resultClusterOffsetInFile, resultBlockNumWithinCluster);
}
} else {
if (right() == NULL || !right()->lookup(cueTime, resultClusterOffsetInFile, resultBlockNumWithinCluster)) {
// Use this record:
cueTime = fCueTime;
resultClusterOffsetInFile = fClusterOffsetInFile;
resultBlockNumWithinCluster = fBlockNumWithinCluster;
}
return True;
}
}
void CuePoint::fprintf(FILE* fid, CuePoint* cuePoint) {
if (cuePoint != NULL) {
::fprintf(fid, "[");
fprintf(fid, cuePoint->left());
::fprintf(fid, ",%.1f{%d},", cuePoint->fCueTime, cuePoint->fBalance);
fprintf(fid, cuePoint->right());
::fprintf(fid, "]");
}
}
void CuePoint::rotate(unsigned direction/*0 => left; 1 => right*/, CuePoint*& root) {
CuePoint* pivot = root->fSubTree[1-direction]; // ASSERT: pivot != NULL
root->fSubTree[1-direction] = pivot->fSubTree[direction];
pivot->fSubTree[direction] = root;
root = pivot;
}
live/liveMedia/MatroskaFileParser.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A parser for a Matroska file.
// Implementation
#include "MatroskaFileParser.hh"
#include "MatroskaDemuxedTrack.hh"
#include <ByteStreamFileSource.hh>
#include <GroupsockHelper.hh> // for "gettimeofday()"
MatroskaFileParser::MatroskaFileParser(MatroskaFile& ourFile, FramedSource* inputSource,
FramedSource::onCloseFunc* onEndFunc, void* onEndClientData,
MatroskaDemux* ourDemux)
: StreamParser(inputSource, onEndFunc, onEndClientData, continueParsing, this),
fOurFile(ourFile), fInputSource(inputSource),
fOnEndFunc(onEndFunc), fOnEndClientData(onEndClientData),
fOurDemux(ourDemux),
fCurOffsetInFile(0), fSavedCurOffsetInFile(0), fLimitOffsetInFile(0),
fNumHeaderBytesToSkip(0), fClusterTimecode(0), fBlockTimecode(0),
fFrameSizesWithinBlock(NULL),
fPresentationTimeOffset(0.0) {
if (ourDemux == NULL) {
// Initialization
fCurrentParseState = PARSING_START_OF_FILE;
continueParsing();
} else {
fCurrentParseState = LOOKING_FOR_CLUSTER;
// In this case, parsing (of track data) doesn't start until a client starts reading from a track.
}
}
MatroskaFileParser::~MatroskaFileParser() {
delete[] fFrameSizesWithinBlock;
Medium::close(fInputSource);
}
void MatroskaFileParser::seekToTime(double& seekNPT) {
#ifdef DEBUG
fprintf(stderr, "seekToTime(%f)\n", seekNPT);
#endif
if (seekNPT <= 0.0) {
#ifdef DEBUG
fprintf(stderr, "\t=> start of file\n");
#endif
seekNPT = 0.0;
seekToFilePosition(0);
} else if (seekNPT >= fOurFile.fileDuration()) {
#ifdef DEBUG
fprintf(stderr, "\t=> end of file\n");
#endif
seekNPT = fOurFile.fileDuration();
seekToEndOfFile();
} else {
u_int64_t clusterOffsetInFile;
unsigned blockNumWithinCluster;
if (!fOurFile.lookupCuePoint(seekNPT, clusterOffsetInFile, blockNumWithinCluster)) {
#ifdef DEBUG
fprintf(stderr, "\t=> not supported\n");
#endif
return; // seeking not supported
}
#ifdef DEBUG
fprintf(stderr, "\t=> seek time %f, file position %llu, block number within cluster %d\n", seekNPT, clusterOffsetInFile, blockNumWithinCluster);
#endif
seekToFilePosition(clusterOffsetInFile);
fCurrentParseState = LOOKING_FOR_BLOCK;
// LATER handle "blockNumWithinCluster"; for now, we assume that it's 0 #####
}
}
void MatroskaFileParser
::continueParsing(void* clientData, unsigned char* /*ptr*/, unsigned /*size*/, struct timeval /*presentationTime*/) {
((MatroskaFileParser*)clientData)->continueParsing();
}
void MatroskaFileParser::continueParsing() {
if (fInputSource != NULL) {
if (fInputSource->isCurrentlyAwaitingData()) return; // Our input source is currently being read. Wait until that read completes
if (!parse()) {
// We didn't complete the parsing, because we had to read more data from the source, or because we're waiting for
// another read from downstream. Once that happens, we'll get called again.
return;
}
}
// We successfully parsed the file. Call our 'done' function now:
if (fOnEndFunc != NULL) (*fOnEndFunc)(fOnEndClientData);
}
Boolean MatroskaFileParser::parse() {
Boolean areDone = False;
try {
skipRemainingHeaderBytes(True); // if any
do {
switch (fCurrentParseState) {
case PARSING_START_OF_FILE: {
areDone = parseStartOfFile();
break;
}
case LOOKING_FOR_TRACKS: {
lookForNextTrack();
break;
}
case PARSING_TRACK: {
areDone = parseTrack();
if (areDone && fOurFile.fCuesOffset > 0) {
// We've finished parsing the 'Track' information. There are also 'Cues' in the file, so parse those before finishing:
// Seek to the specified position in the file. We were already told that the 'Cues' begins there:
#ifdef DEBUG
fprintf(stderr, "Seeking to file position %llu (the previously-reported location of 'Cues')\n", fOurFile.fCuesOffset);
#endif
seekToFilePosition(fOurFile.fCuesOffset);
fCurrentParseState = PARSING_CUES;
areDone = False;
}
break;
}
case PARSING_CUES: {
areDone = parseCues();
break;
}
case LOOKING_FOR_CLUSTER: {
if (fOurFile.fClusterOffset > 0) {
// Optimization: Seek to the specified position in the file. We were already told that the 'Cluster' begins there:
#ifdef DEBUG
fprintf(stderr, "Optimization: Seeking to file position %llu (the previously-reported location of a 'Cluster')\n", fOurFile.fClusterOffset);
#endif
seekToFilePosition(fOurFile.fClusterOffset);
}
fCurrentParseState = LOOKING_FOR_BLOCK;
break;
}
case LOOKING_FOR_BLOCK: {
lookForNextBlock();
break;
}
case PARSING_BLOCK: {
parseBlock();
break;
}
case DELIVERING_FRAME_WITHIN_BLOCK: {
if (!deliverFrameWithinBlock()) return False;
break;
}
case DELIVERING_FRAME_BYTES: {
deliverFrameBytes();
return False; // Halt parsing for now. A new 'read' from downstream will cause parsing to resume.
break;
}
}
} while (!areDone);
return True;
} catch (int /*e*/) {
#ifdef DEBUG
fprintf(stderr, "MatroskaFileParser::parse() EXCEPTION (This is normal behavior - *not* an error)\n");
#endif
return False; // the parsing got interrupted
}
}
Boolean MatroskaFileParser::parseStartOfFile() {
#ifdef DEBUG
fprintf(stderr, "parsing start of file\n");
#endif
EBMLId id;
EBMLDataSize size;
// The file must begin with the standard EBML header (which we skip):
if (!parseEBMLIdAndSize(id, size) || id != MATROSKA_ID_EBML) {
fOurFile.envir() << "ERROR: File does not begin with an EBML header\n";
return True; // We're done with the file, because it's not valid
}
#ifdef DEBUG
fprintf(stderr, "MatroskaFileParser::parseStartOfFile(): Parsed id 0x%s (%s), size: %lld\n", id.hexString(), id.stringName(), size.val());
#endif
fCurrentParseState = LOOKING_FOR_TRACKS;
skipHeader(size);
return False; // because we have more parsing to do - inside the 'Track' header
}
void MatroskaFileParser::lookForNextTrack() {
#ifdef DEBUG
fprintf(stderr, "looking for Track\n");
#endif
EBMLId id;
EBMLDataSize size;
// Read and skip over (or enter) each Matroska header, until we get to a 'Track'.
while (fCurrentParseState == LOOKING_FOR_TRACKS) {
while (!parseEBMLIdAndSize(id, size)) {}
#ifdef DEBUG
fprintf(stderr, "MatroskaFileParser::lookForNextTrack(): Parsed id 0x%s (%s), size: %lld\n", id.hexString(), id.stringName(), size.val());
#endif
switch (id.val()) {
case MATROSKA_ID_SEGMENT: { // 'Segment' header: enter this
// Remember the position, within the file, of the start of Segment data, because Seek Positions are relative to this:
fOurFile.fSegmentDataOffset = fCurOffsetInFile;
break;
}
case MATROSKA_ID_SEEK_HEAD: { // 'Seek Head' header: enter this
break;
}
case MATROSKA_ID_SEEK: { // 'Seek' header: enter this
break;
}
case MATROSKA_ID_SEEK_ID: { // 'Seek ID' header: get this value
if (parseEBMLNumber(fLastSeekId)) {
#ifdef DEBUG
fprintf(stderr, "\tSeek ID 0x%s:\t%s\n", fLastSeekId.hexString(), fLastSeekId.stringName());
#endif
}
break;
}
case MATROSKA_ID_SEEK_POSITION: { // 'Seek Position' header: get this value
u_int64_t seekPosition;
if (parseEBMLVal_unsigned64(size, seekPosition)) {
u_int64_t offsetInFile = fOurFile.fSegmentDataOffset + seekPosition;
#ifdef DEBUG
fprintf(stderr, "\tSeek Position %llu (=> offset within the file: %llu (0x%llx))\n", seekPosition, offsetInFile, offsetInFile);
#endif
// The only 'Seek Position's that we care about are for 'Cluster' and 'Cues':
if (fLastSeekId == MATROSKA_ID_CLUSTER) {
fOurFile.fClusterOffset = offsetInFile;
} else if (fLastSeekId == MATROSKA_ID_CUES) {
fOurFile.fCuesOffset = offsetInFile;
}
}
break;
}
case MATROSKA_ID_INFO: { // 'Segment Info' header: enter this
break;
}
case MATROSKA_ID_TIMECODE_SCALE: { // 'Timecode Scale' header: get this value
unsigned timecodeScale;
if (parseEBMLVal_unsigned(size, timecodeScale) && timecodeScale > 0) {
fOurFile.fTimecodeScale = timecodeScale;
#ifdef DEBUG
fprintf(stderr, "\tTimecode Scale %u ns (=> Segment Duration == %f seconds)\n",
fOurFile.timecodeScale(), fOurFile.segmentDuration()*(fOurFile.fTimecodeScale/1000000000.0f));
#endif
}
break;
}
case MATROSKA_ID_DURATION: { // 'Segment Duration' header: get this value
if (parseEBMLVal_float(size, fOurFile.fSegmentDuration)) {
#ifdef DEBUG
fprintf(stderr, "\tSegment Duration %f (== %f seconds)\n",
fOurFile.segmentDuration(), fOurFile.segmentDuration()*(fOurFile.fTimecodeScale/1000000000.0f));
#endif
}
break;
}
#ifdef DEBUG
case MATROSKA_ID_TITLE: { // 'Segment Title': display this value
char* title;
if (parseEBMLVal_string(size, title)) {
#ifdef DEBUG
fprintf(stderr, "\tTitle: %s\n", title);
#endif
delete[] title;
}
break;
}
#endif
case MATROSKA_ID_TRACKS: { // enter this, and move on to parsing 'Tracks'
fLimitOffsetInFile = fCurOffsetInFile + size.val(); // Make sure we don't read past the end of this header
fCurrentParseState = PARSING_TRACK;
break;
}
default: { // skip over this header
skipHeader(size);
break;
}
}
setParseState();
}
}
Boolean MatroskaFileParser::parseTrack() {
#ifdef DEBUG
fprintf(stderr, "parsing Track\n");
#endif
// Read and process each Matroska header, until we get to the end of the Track:
MatroskaTrack* track = NULL;
EBMLId id;
EBMLDataSize size;
while (fCurOffsetInFile < fLimitOffsetInFile) {
while (!parseEBMLIdAndSize(id, size)) {}
#ifdef DEBUG
if (id == MATROSKA_ID_TRACK_ENTRY) fprintf(stderr, "\n"); // makes debugging output easier to read
fprintf(stderr, "MatroskaFileParser::parseTrack(): Parsed id 0x%s (%s), size: %lld\n", id.hexString(), id.stringName(), size.val());
#endif
switch (id.val()) {
case MATROSKA_ID_TRACK_ENTRY: { // 'Track Entry' header: enter this
// Create a new "MatroskaTrack" object for this entry:
if (track != NULL && track->trackNumber == 0) delete track; // We had a previous "MatroskaTrack" object that was never used
track = new MatroskaTrack;
break;
}
case MATROSKA_ID_TRACK_NUMBER: {
unsigned trackNumber;
if (parseEBMLVal_unsigned(size, trackNumber)) {
#ifdef DEBUG
fprintf(stderr, "\tTrack Number %d\n", trackNumber);
#endif
if (track != NULL && trackNumber != 0) {
track->trackNumber = trackNumber;
fOurFile.addTrack(track, trackNumber);
}
}
break;
}
case MATROSKA_ID_TRACK_TYPE: {
unsigned trackType;
if (parseEBMLVal_unsigned(size, trackType) && track != NULL) {
// We convert the Matroska 'track type' code into our own code (which we can use as a bitmap):
track->trackType
= trackType == 1 ? MATROSKA_TRACK_TYPE_VIDEO : trackType == 2 ? MATROSKA_TRACK_TYPE_AUDIO
: trackType == 0x11 ? MATROSKA_TRACK_TYPE_SUBTITLE : MATROSKA_TRACK_TYPE_OTHER;
#ifdef DEBUG
fprintf(stderr, "\tTrack Type 0x%02x (%s)\n", trackType,
track->trackType == MATROSKA_TRACK_TYPE_VIDEO ? "video" :
track->trackType == MATROSKA_TRACK_TYPE_AUDIO ? "audio" :
track->trackType == MATROSKA_TRACK_TYPE_SUBTITLE ? "subtitle" :
"");
#endif
}
break;
}
case MATROSKA_ID_FLAG_ENABLED: {
unsigned flagEnabled;
if (parseEBMLVal_unsigned(size, flagEnabled)) {
#ifdef DEBUG
fprintf(stderr, "\tTrack is Enabled: %d\n", flagEnabled);
#endif
if (track != NULL) track->isEnabled = flagEnabled != 0;
}
break;
}
case MATROSKA_ID_FLAG_DEFAULT: {
unsigned flagDefault;
if (parseEBMLVal_unsigned(size, flagDefault)) {
#ifdef DEBUG
fprintf(stderr, "\tTrack is Default: %d\n", flagDefault);
#endif
if (track != NULL) track->isDefault = flagDefault != 0;
}
break;
}
case MATROSKA_ID_FLAG_FORCED: {
unsigned flagForced;
if (parseEBMLVal_unsigned(size, flagForced)) {
#ifdef DEBUG
fprintf(stderr, "\tTrack is Forced: %d\n", flagForced);
#endif
if (track != NULL) track->isForced = flagForced != 0;
}
break;
}
case MATROSKA_ID_DEFAULT_DURATION: {
unsigned defaultDuration;
if (parseEBMLVal_unsigned(size, defaultDuration)) {
#ifdef DEBUG
fprintf(stderr, "\tDefault duration %f ms\n", defaultDuration/1000000.0);
#endif
if (track != NULL) track->defaultDuration = defaultDuration;
}
break;
}
case MATROSKA_ID_MAX_BLOCK_ADDITION_ID: {
unsigned maxBlockAdditionID;
if (parseEBMLVal_unsigned(size, maxBlockAdditionID)) {
#ifdef DEBUG
fprintf(stderr, "\tMax Block Addition ID: %u\n", maxBlockAdditionID);
#endif
}
break;
}
case MATROSKA_ID_NAME: {
char* name;
if (parseEBMLVal_string(size, name)) {
#ifdef DEBUG
fprintf(stderr, "\tName: %s\n", name);
#endif
if (track != NULL) {
delete[] track->name; track->name = name;
} else {
delete[] name;
}
}
break;
}
case MATROSKA_ID_LANGUAGE: {
char* language;
if (parseEBMLVal_string(size, language)) {
#ifdef DEBUG
fprintf(stderr, "\tLanguage: %s\n", language);
#endif
if (track != NULL) {
delete[] track->language; track->language = language;
} else {
delete[] language;
}
}
break;
}
case MATROSKA_ID_CODEC: {
char* codecID;
if (parseEBMLVal_string(size, codecID)) {
#ifdef DEBUG
fprintf(stderr, "\tCodec ID: %s\n", codecID);
#endif
if (track != NULL) {
delete[] track->codecID; track->codecID = codecID;
// Also set the track's "mimeType" field, if we can deduce it from the "codecID":
if (strncmp(codecID, "A_MPEG", 6) == 0) {
track->mimeType = "audio/MPEG";
} else if (strncmp(codecID, "A_AAC", 5) == 0) {
track->mimeType = "audio/AAC";
} else if (strncmp(codecID, "A_AC3", 5) == 0) {
track->mimeType = "audio/AC3";
} else if (strncmp(codecID, "A_VORBIS", 8) == 0) {
track->mimeType = "audio/VORBIS";
} else if (strcmp(codecID, "A_OPUS") == 0) {
track->mimeType = "audio/OPUS";
track->codecIsOpus = True;
} else if (strcmp(codecID, "V_MPEG4/ISO/AVC") == 0) {
track->mimeType = "video/H264";
} else if (strcmp(codecID, "V_MPEGH/ISO/HEVC") == 0) {
track->mimeType = "video/H265";
} else if (strncmp(codecID, "V_VP8", 5) == 0) {
track->mimeType = "video/VP8";
} else if (strncmp(codecID, "V_VP9", 5) == 0) {
track->mimeType = "video/VP9";
} else if (strncmp(codecID, "V_THEORA", 8) == 0) {
track->mimeType = "video/THEORA";
} else if (strncmp(codecID, "S_TEXT", 6) == 0) {
track->mimeType = "text/T140";
}
} else {
delete[] codecID;
}
}
break;
}
case MATROSKA_ID_CODEC_PRIVATE: {
u_int8_t* codecPrivate;
unsigned codecPrivateSize;
if (parseEBMLVal_binary(size, codecPrivate)) {
codecPrivateSize = (unsigned)size.val();
#ifdef DEBUG
fprintf(stderr, "\tCodec Private: ");
for (unsigned i = 0; i < codecPrivateSize; ++i) fprintf(stderr, "%02x:", codecPrivate[i]);
fprintf(stderr, "\n");
#endif
if (track != NULL) {
delete[] track->codecPrivate; track->codecPrivate = codecPrivate;
track->codecPrivateSize = codecPrivateSize;
// Hack for H.264 and H.265: The 'codec private' data contains
// the size of NAL unit lengths:
if (track->codecID != NULL) {
if (strcmp(track->codecID, "V_MPEG4/ISO/AVC") == 0) { // H.264
// Byte 4 of the 'codec private' data contains 'lengthSizeMinusOne':
if (codecPrivateSize >= 5) track->subframeSizeSize = (codecPrivate[4]&0x3) + 1;
} else if (strcmp(track->codecID, "V_MPEGH/ISO/HEVC") == 0) { // H.265
// H.265 'codec private' data is *supposed* to use the format that's described in
// http://lists.matroska.org/pipermail/matroska-devel/2013-September/004567.html
// However, some Matroska files use the same format that was used for H.264.
// We check for this here, by checking various fields that are supposed to be
// 'all-1' in the 'correct' format:
if (codecPrivateSize < 23 || (codecPrivate[13]&0xF0) != 0xF0 ||
(codecPrivate[15]&0xFC) != 0xFC || (codecPrivate[16]&0xFC) != 0xFC ||
(codecPrivate[17]&0xF8) != 0xF8 || (codecPrivate[18]&0xF8) != 0xF8) {
// The 'correct' format isn't being used, so assume the H.264 format instead:
track->codecPrivateUsesH264FormatForH265 = True;
// Byte 4 of the 'codec private' data contains 'lengthSizeMinusOne':
if (codecPrivateSize >= 5) track->subframeSizeSize = (codecPrivate[4]&0x3) + 1;
} else {
// This looks like the 'correct' format:
track->codecPrivateUsesH264FormatForH265 = False;
// Byte 21 of the 'codec private' data contains 'lengthSizeMinusOne':
track->subframeSizeSize = (codecPrivate[21]&0x3) + 1;
}
}
}
} else {
delete[] codecPrivate;
}
}
break;
}
case MATROSKA_ID_VIDEO: { // 'Video settings' header: enter this
break;
}
case MATROSKA_ID_PIXEL_WIDTH: {
unsigned pixelWidth;
if (parseEBMLVal_unsigned(size, pixelWidth)) {
#ifdef DEBUG
fprintf(stderr, "\tPixel Width %d\n", pixelWidth);
#endif
}
break;
}
case MATROSKA_ID_PIXEL_HEIGHT: {
unsigned pixelHeight;
if (parseEBMLVal_unsigned(size, pixelHeight)) {
#ifdef DEBUG
fprintf(stderr, "\tPixel Height %d\n", pixelHeight);
#endif
}
break;
}
case MATROSKA_ID_DISPLAY_WIDTH: {
unsigned displayWidth;
if (parseEBMLVal_unsigned(size, displayWidth)) {
#ifdef DEBUG
fprintf(stderr, "\tDisplay Width %d\n", displayWidth);
#endif
}
break;
}
case MATROSKA_ID_DISPLAY_HEIGHT: {
unsigned displayHeight;
if (parseEBMLVal_unsigned(size, displayHeight)) {
#ifdef DEBUG
fprintf(stderr, "\tDisplay Height %d\n", displayHeight);
#endif
}
break;
}
case MATROSKA_ID_DISPLAY_UNIT: {
unsigned displayUnit;
if (parseEBMLVal_unsigned(size, displayUnit)) {
#ifdef DEBUG
fprintf(stderr, "\tDisplay Unit %d\n", displayUnit);
#endif
}
break;
}
case MATROSKA_ID_AUDIO: { // 'Audio settings' header: enter this
break;
}
case MATROSKA_ID_SAMPLING_FREQUENCY: {
float samplingFrequency;
if (parseEBMLVal_float(size, samplingFrequency)) {
if (track != NULL) {
track->samplingFrequency = (unsigned)samplingFrequency;
#ifdef DEBUG
fprintf(stderr, "\tSampling frequency %f (->%d)\n", samplingFrequency, track->samplingFrequency);
#endif
}
}
break;
}
case MATROSKA_ID_OUTPUT_SAMPLING_FREQUENCY: {
float outputSamplingFrequency;
if (parseEBMLVal_float(size, outputSamplingFrequency)) {
#ifdef DEBUG
fprintf(stderr, "\tOutput sampling frequency %f\n", outputSamplingFrequency);
#endif
}
break;
}
case MATROSKA_ID_CHANNELS: {
unsigned numChannels;
if (parseEBMLVal_unsigned(size, numChannels)) {
#ifdef DEBUG
fprintf(stderr, "\tChannels %d\n", numChannels);
#endif
if (track != NULL) track->numChannels = numChannels;
}
break;
}
case MATROSKA_ID_BIT_DEPTH: {
unsigned bitDepth;
if (parseEBMLVal_unsigned(size, bitDepth)) {
#ifdef DEBUG
fprintf(stderr, "\tBit Depth %d\n", bitDepth);
#endif
}
break;
}
case MATROSKA_ID_CONTENT_ENCODINGS:
case MATROSKA_ID_CONTENT_ENCODING: { // 'Content Encodings' or 'Content Encoding' header: enter this
break;
}
case MATROSKA_ID_CONTENT_COMPRESSION: { // 'Content Compression' header: enter this
// Note: We currently support only 'Header Stripping' compression, not 'zlib' compression (the default algorithm).
// Therefore, we disable this track, unless/until we later see that 'Header Stripping' is supported:
if (track != NULL) track->isEnabled = False;
break;
}
case MATROSKA_ID_CONTENT_COMP_ALGO: {
unsigned contentCompAlgo;
if (parseEBMLVal_unsigned(size, contentCompAlgo)) {
#ifdef DEBUG
fprintf(stderr, "\tContent Compression Algorithm %d (%s)\n", contentCompAlgo,
contentCompAlgo == 0 ? "zlib" : contentCompAlgo == 3 ? "Header Stripping" : "<unknown>");
#endif
// The only compression algorithm that we support is #3: Header Stripping; disable the track otherwise
if (track != NULL) track->isEnabled = contentCompAlgo == 3;
}
break;
}
case MATROSKA_ID_CONTENT_COMP_SETTINGS: {
u_int8_t* headerStrippedBytes;
unsigned headerStrippedBytesSize;
if (parseEBMLVal_binary(size, headerStrippedBytes)) {
headerStrippedBytesSize = (unsigned)size.val();
#ifdef DEBUG
fprintf(stderr, "\tHeader Stripped Bytes: ");
for (unsigned i = 0; i < headerStrippedBytesSize; ++i) fprintf(stderr, "%02x:", headerStrippedBytes[i]);
fprintf(stderr, "\n");
#endif
if (track != NULL) {
delete[] track->headerStrippedBytes; track->headerStrippedBytes = headerStrippedBytes;
track->headerStrippedBytesSize = headerStrippedBytesSize;
} else {
delete[] headerStrippedBytes;
}
}
break;
}
case MATROSKA_ID_CONTENT_ENCRYPTION: { // 'Content Encryption' header: skip this
// Note: We don't currently support encryption at all. Therefore, we disable this track:
if (track != NULL) track->isEnabled = False;
// Fall through to...
}
default: { // We don't process this header, so just skip over it:
skipHeader(size);
break;
}
}
setParseState();
}
fLimitOffsetInFile = 0; // reset
if (track != NULL && track->trackNumber == 0) delete track; // We had a previous "MatroskaTrack" object that was never used
return True; // we're done parsing track entries
}
void MatroskaFileParser::lookForNextBlock() {
#ifdef DEBUG
fprintf(stderr, "looking for Block\n");
#endif
// Read and skip over each Matroska header, until we get to a 'Cluster':
EBMLId id;
EBMLDataSize size;
while (fCurrentParseState == LOOKING_FOR_BLOCK) {
while (!parseEBMLIdAndSize(id, size)) {}
#ifdef DEBUG
fprintf(stderr, "MatroskaFileParser::lookForNextBlock(): Parsed id 0x%s (%s), size: %lld\n", id.hexString(), id.stringName(), size.val());
#endif
switch (id.val()) {
case MATROSKA_ID_SEGMENT: { // 'Segment' header: enter this
break;
}
case MATROSKA_ID_CLUSTER: { // 'Cluster' header: enter this
break;
}
case MATROSKA_ID_TIMECODE: { // 'Timecode' header: get this value
unsigned timecode;
if (parseEBMLVal_unsigned(size, timecode)) {
fClusterTimecode = timecode;
#ifdef DEBUG
fprintf(stderr, "\tCluster timecode: %d (== %f seconds)\n", fClusterTimecode, fClusterTimecode*(fOurFile.fTimecodeScale/1000000000.0));
#endif
}
break;
}
case MATROSKA_ID_BLOCK_GROUP: { // 'Block Group' header: enter this
break;
}
case MATROSKA_ID_SIMPLEBLOCK:
case MATROSKA_ID_BLOCK: { // 'SimpleBlock' or 'Block' header: enter this (and we're done)
fBlockSize = (unsigned)size.val();
fCurrentParseState = PARSING_BLOCK;
break;
}
case MATROSKA_ID_BLOCK_DURATION: { // 'Block Duration' header: get this value (but we currently don't do anything with it)
unsigned blockDuration;
if (parseEBMLVal_unsigned(size, blockDuration)) {
#ifdef DEBUG
fprintf(stderr, "\tblock duration: %d (== %f ms)\n", blockDuration, (float)(blockDuration*fOurFile.fTimecodeScale/1000000.0));
#endif
}
break;
}
// Attachments are parsed only if we're in DEBUG mode (otherwise we just skip over them):
#ifdef DEBUG
case MATROSKA_ID_ATTACHMENTS: { // 'Attachments': enter this
break;
}
case MATROSKA_ID_ATTACHED_FILE: { // 'Attached File': enter this
break;
}
case MATROSKA_ID_FILE_DESCRIPTION: { // 'File Description': get this value
char* fileDescription;
if (parseEBMLVal_string(size, fileDescription)) {
#ifdef DEBUG
fprintf(stderr, "\tFile Description: %s\n", fileDescription);
#endif
delete[] fileDescription;
}
break;
}
case MATROSKA_ID_FILE_NAME: { // 'File Name': get this value
char* fileName;
if (parseEBMLVal_string(size, fileName)) {
#ifdef DEBUG
fprintf(stderr, "\tFile Name: %s\n", fileName);
#endif
delete[] fileName;
}
break;
}
case MATROSKA_ID_FILE_MIME_TYPE: { // 'File MIME Type': get this value
char* fileMIMEType;
if (parseEBMLVal_string(size, fileMIMEType)) {
#ifdef DEBUG
fprintf(stderr, "\tFile MIME Type: %s\n", fileMIMEType);
#endif
delete[] fileMIMEType;
}
break;
}
case MATROSKA_ID_FILE_UID: { // 'File UID': get this value
unsigned fileUID;
if (parseEBMLVal_unsigned(size, fileUID)) {
#ifdef DEBUG
fprintf(stderr, "\tFile UID: 0x%x\n", fileUID);
#endif
}
break;
}
#endif
default: { // skip over this header
skipHeader(size);
break;
}
}
setParseState();
}
}
Boolean MatroskaFileParser::parseCues() {
#if defined(DEBUG) || defined(DEBUG_CUES)
fprintf(stderr, "parsing Cues\n");
#endif
EBMLId id;
EBMLDataSize size;
// Read the next header, which should be MATROSKA_ID_CUES:
if (!parseEBMLIdAndSize(id, size) || id != MATROSKA_ID_CUES) return True; // The header wasn't what we expected, so we're done
fLimitOffsetInFile = fCurOffsetInFile + size.val(); // Make sure we don't read past the end of this header
double currentCueTime = 0.0;
u_int64_t currentClusterOffsetInFile = 0;
while (fCurOffsetInFile < fLimitOffsetInFile) {
while (!parseEBMLIdAndSize(id, size)) {}
#ifdef DEBUG_CUES
if (id == MATROSKA_ID_CUE_POINT) fprintf(stderr, "\n"); // makes debugging output easier to read
fprintf(stderr, "MatroskaFileParser::parseCues(): Parsed id 0x%s (%s), size: %lld\n", id.hexString(), id.stringName(), size.val());
#endif
switch (id.val()) {
case MATROSKA_ID_CUE_POINT: { // 'Cue Point' header: enter this
break;
}
case MATROSKA_ID_CUE_TIME: { // 'Cue Time' header: get this value
unsigned cueTime;
if (parseEBMLVal_unsigned(size, cueTime)) {
currentCueTime = cueTime*(fOurFile.fTimecodeScale/1000000000.0);
#ifdef DEBUG_CUES
fprintf(stderr, "\tCue Time %d (== %f seconds)\n", cueTime, currentCueTime);
#endif
}
break;
}
case MATROSKA_ID_CUE_TRACK_POSITIONS: { // 'Cue Track Positions' header: enter this
break;
}
case MATROSKA_ID_CUE_TRACK: { // 'Cue Track' header: get this value (but only for debugging; we don't do anything with it)
unsigned cueTrack;
if (parseEBMLVal_unsigned(size, cueTrack)) {
#ifdef DEBUG_CUES
fprintf(stderr, "\tCue Track %d\n", cueTrack);
#endif
}
break;
}
case MATROSKA_ID_CUE_CLUSTER_POSITION: { // 'Cue Cluster Position' header: get this value
u_int64_t cueClusterPosition;
if (parseEBMLVal_unsigned64(size, cueClusterPosition)) {
currentClusterOffsetInFile = fOurFile.fSegmentDataOffset + cueClusterPosition;
#ifdef DEBUG_CUES
fprintf(stderr, "\tCue Cluster Position %llu (=> offset within the file: %llu (0x%llx))\n", cueClusterPosition, currentClusterOffsetInFile, currentClusterOffsetInFile);
#endif
// Record this cue point:
fOurFile.addCuePoint(currentCueTime, currentClusterOffsetInFile, 1/*default block number within cluster*/);
}
break;
}
case MATROSKA_ID_CUE_BLOCK_NUMBER: { // 'Cue Block Number' header: get this value
unsigned cueBlockNumber;
if (parseEBMLVal_unsigned(size, cueBlockNumber) && cueBlockNumber != 0) {
#ifdef DEBUG_CUES
fprintf(stderr, "\tCue Block Number %d\n", cueBlockNumber);
#endif
// Record this cue point (overwriting any existing entry for this cue time):
fOurFile.addCuePoint(currentCueTime, currentClusterOffsetInFile, cueBlockNumber);
}
break;
}
default: { // We don't process this header, so just skip over it:
skipHeader(size);
break;
}
}
setParseState();
}
fLimitOffsetInFile = 0; // reset
#if defined(DEBUG) || defined(DEBUG_CUES)
fprintf(stderr, "done parsing Cues\n");
#endif
#ifdef DEBUG_CUES
fprintf(stderr, "Cue Point tree: ");
fOurFile.printCuePoints(stderr);
fprintf(stderr, "\n");
#endif
return True; // we're done parsing Cues
}
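The cue points recorded above are what make seeking efficient: given a seek time, the demuxer wants the cue with the greatest time not exceeding it. A minimal sketch of that lookup, using a hypothetical `CueIndex` (an ordered map) in place of the library's actual cue-point tree:

```cpp
#include <cstdint>
#include <map>
#include <utility>

// Illustrative sketch only; not the library's "CuePoint" implementation.
class CueIndex {
public:
  // Record a cue point; a later entry for the same cue time overwrites the
  // earlier one (as happens when MATROSKA_ID_CUE_BLOCK_NUMBER follows
  // MATROSKA_ID_CUE_CLUSTER_POSITION for the same cue).
  void addCuePoint(double cueTime, uint64_t clusterOffsetInFile, unsigned blockNumWithinCluster) {
    fCues[cueTime] = std::make_pair(clusterOffsetInFile, blockNumWithinCluster);
  }

  // Find the cue with the greatest time <= seekTime:
  bool lookup(double seekTime, uint64_t& clusterOffsetInFile, unsigned& blockNumWithinCluster) const {
    std::map<double, std::pair<uint64_t, unsigned> >::const_iterator it = fCues.upper_bound(seekTime);
    if (it == fCues.begin()) return false; // no cue at or before "seekTime"
    --it;
    clusterOffsetInFile = it->second.first;
    blockNumWithinCluster = it->second.second;
    return true;
  }

private:
  std::map<double, std::pair<uint64_t, unsigned> > fCues;
};
```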
typedef enum { NoLacing, XiphLacing, FixedSizeLacing, EBMLLacing } MatroskaLacingType;
void MatroskaFileParser::parseBlock() {
#ifdef DEBUG
fprintf(stderr, "parsing SimpleBlock or Block\n");
#endif
do {
unsigned blockStartPos = curOffset();
// The block begins with the track number:
EBMLNumber trackNumber;
if (!parseEBMLNumber(trackNumber)) break;
fBlockTrackNumber = (unsigned)trackNumber.val();
// If this track is not being read, then skip the rest of this block, and look for another one:
if (fOurDemux->lookupDemuxedTrack(fBlockTrackNumber) == NULL) {
unsigned headerBytesSeen = curOffset() - blockStartPos;
if (headerBytesSeen < fBlockSize) {
skipBytes(fBlockSize - headerBytesSeen);
}
#ifdef DEBUG
fprintf(stderr, "\tSkipped block for unused track number %d\n", fBlockTrackNumber);
#endif
fCurrentParseState = LOOKING_FOR_BLOCK;
setParseState();
return;
}
MatroskaTrack* track = fOurFile.lookup(fBlockTrackNumber);
if (track == NULL) break; // shouldn't happen
// The next two bytes are the block's timecode (relative to the cluster timecode).
// Read them one at a time, because the order of two "get1Byte()" calls within a
// single expression would be unspecified:
u_int8_t const timecodeHigh = get1Byte();
u_int8_t const timecodeLow = get1Byte();
fBlockTimecode = (timecodeHigh<<8)|timecodeLow;
// The next byte indicates the type of 'lacing' used:
u_int8_t c = get1Byte();
c &= 0x6; // we're interested in bits 5-6 only
MatroskaLacingType lacingType = (c==0x0)?NoLacing : (c==0x02)?XiphLacing : (c==0x04)?FixedSizeLacing : EBMLLacing;
#ifdef DEBUG
fprintf(stderr, "\ttrack number %d, timecode %d (=> %f seconds), %s lacing\n", fBlockTrackNumber, fBlockTimecode, (fClusterTimecode+fBlockTimecode)*(fOurFile.fTimecodeScale/1000000000.0), (lacingType==NoLacing)?"no" : (lacingType==XiphLacing)?"Xiph" : (lacingType==FixedSizeLacing)?"fixed-size" : "EBML");
#endif
if (lacingType == NoLacing) {
fNumFramesInBlock = 1;
} else {
// The next byte tells us how many frames are present in this block
fNumFramesInBlock = get1Byte() + 1;
}
delete[] fFrameSizesWithinBlock; fFrameSizesWithinBlock = new unsigned[fNumFramesInBlock];
if (fFrameSizesWithinBlock == NULL) break;
if (lacingType == NoLacing) {
unsigned headerBytesSeen = curOffset() - blockStartPos;
if (headerBytesSeen > fBlockSize) break;
fFrameSizesWithinBlock[0] = fBlockSize - headerBytesSeen;
} else if (lacingType == FixedSizeLacing) {
unsigned headerBytesSeen = curOffset() - blockStartPos;
if (headerBytesSeen > fBlockSize) break;
unsigned frameBytesAvailable = fBlockSize - headerBytesSeen;
unsigned constantFrameSize = frameBytesAvailable/fNumFramesInBlock;
for (unsigned i = 0; i < fNumFramesInBlock; ++i) {
fFrameSizesWithinBlock[i] = constantFrameSize;
}
// If there are any bytes left over, assign them to the last frame:
fFrameSizesWithinBlock[fNumFramesInBlock-1] += frameBytesAvailable%fNumFramesInBlock;
} else { // EBML or Xiph lacing
unsigned curFrameSize = 0;
unsigned frameSizesTotal = 0;
unsigned i;
for (i = 0; i < fNumFramesInBlock-1; ++i) {
if (lacingType == EBMLLacing) {
EBMLNumber frameSize;
if (!parseEBMLNumber(frameSize)) break;
unsigned fsv = (unsigned)frameSize.val();
if (i == 0) {
curFrameSize = fsv;
} else {
// The value we read is a signed value that's added to the previous frame size, to get the current frame size:
unsigned toSubtract = (fsv>0xFFFFFF)?0x07FFFFFF : (fsv>0xFFFF)?0x0FFFFF : (fsv>0xFF)?0x1FFF : 0x3F;
int fsv_signed = fsv - toSubtract;
curFrameSize += fsv_signed;
if ((int)curFrameSize < 0) break;
}
} else { // Xiph lacing
curFrameSize = 0;
u_int8_t c;
do {
c = get1Byte();
curFrameSize += c;
} while (c == 0xFF);
}
fFrameSizesWithinBlock[i] = curFrameSize;
frameSizesTotal += curFrameSize;
}
if (i != fNumFramesInBlock-1) break; // an error occurred within the "for" loop
// Compute the size of the final frame within the block (from the block's size, and the frame sizes already computed):
unsigned headerBytesSeen = curOffset() - blockStartPos;
if (headerBytesSeen + frameSizesTotal > fBlockSize) break;
fFrameSizesWithinBlock[i] = fBlockSize - (headerBytesSeen + frameSizesTotal);
}
// We're done parsing headers within the block, and (as a result) we now know the sizes of all frames within the block.
// If we have 'stripped bytes' that are common to (the front of) all frames, then count them now:
if (track->headerStrippedBytesSize != 0) {
for (unsigned i = 0; i < fNumFramesInBlock; ++i) fFrameSizesWithinBlock[i] += track->headerStrippedBytesSize;
}
#ifdef DEBUG
fprintf(stderr, "\tThis block contains %d frame(s); size(s):", fNumFramesInBlock);
unsigned frameSizesTotal = 0;
for (unsigned i = 0; i < fNumFramesInBlock; ++i) {
fprintf(stderr, " %d", fFrameSizesWithinBlock[i]);
frameSizesTotal += fFrameSizesWithinBlock[i];
}
if (fNumFramesInBlock > 1) fprintf(stderr, " (total: %u)", frameSizesTotal);
fprintf(stderr, " bytes\n");
#endif
// Next, start delivering these frames:
fCurrentParseState = DELIVERING_FRAME_WITHIN_BLOCK;
fCurOffsetWithinFrame = fNextFrameNumberToDeliver = 0;
setParseState();
return;
} while (0);
// An error occurred. Try to recover:
#ifdef DEBUG
fprintf(stderr, "parseBlock(): Error parsing data; trying to recover...\n");
#endif
fCurrentParseState = LOOKING_FOR_BLOCK;
}
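The two variable-length 'lacing' encodings handled above can be illustrated with a small standalone sketch (the helper names `decodeXiphLacedSize` and `ebmlLacingDelta` are ours, not part of the library):

```cpp
#include <cstdint>
#include <cstddef>

// Illustrative sketch: decode one Xiph-laced frame size from a byte buffer.
// A size is stored as a run of 0xFF bytes followed by a final byte < 0xFF;
// the frame size is the sum of all of those bytes.
unsigned decodeXiphLacedSize(uint8_t const* data, size_t& pos) {
  unsigned frameSize = 0;
  uint8_t c;
  do {
    c = data[pos++];
    frameSize += c;
  } while (c == 0xFF);
  return frameSize;
}

// Illustrative sketch: convert an EBML-laced delta (already read as an
// unsigned EBML number) into the signed value that gets added to the
// previous frame size.  The offset subtracted depends on the number's
// magnitude (i.e., on how many bytes it was encoded in), mirroring the
// "toSubtract" logic in "parseBlock()".
int ebmlLacingDelta(unsigned rawValue) {
  unsigned toSubtract =
      (rawValue > 0xFFFFFF) ? 0x07FFFFFF :
      (rawValue > 0xFFFF)   ? 0x0FFFFF :
      (rawValue > 0xFF)     ? 0x1FFF : 0x3F;
  return (int)rawValue - (int)toSubtract;
}
```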
Boolean MatroskaFileParser::deliverFrameWithinBlock() {
#ifdef DEBUG
fprintf(stderr, "delivering frame within SimpleBlock or Block\n");
#endif
do {
MatroskaTrack* track = fOurFile.lookup(fBlockTrackNumber);
if (track == NULL) break; // shouldn't happen
MatroskaDemuxedTrack* demuxedTrack = fOurDemux->lookupDemuxedTrack(fBlockTrackNumber);
if (demuxedTrack == NULL) break; // shouldn't happen
if (!demuxedTrack->isCurrentlyAwaitingData()) {
// Someone has been reading this stream, but isn't right now.
// We can't deliver this frame until he asks for it, so punt for now.
// The next time he asks for a frame, he'll get it.
#ifdef DEBUG
fprintf(stderr, "\tdeferring delivery of frame #%d (%d bytes)", fNextFrameNumberToDeliver, fFrameSizesWithinBlock[fNextFrameNumberToDeliver]);
if (track->haveSubframes()) fprintf(stderr, "[offset %d]", fCurOffsetWithinFrame);
fprintf(stderr, "\n");
#endif
restoreSavedParserState(); // so we read from the beginning next time
return False;
}
unsigned frameSize;
u_int8_t const* specialFrameSource = NULL;
u_int8_t const opusCommentHeader[16]
= {'O','p','u','s','T','a','g','s', 0, 0, 0, 0, 0, 0, 0, 0};
if (track->codecIsOpus && demuxedTrack->fOpusTrackNumber < 2) {
// Special case for Opus audio. The first frame (the 'configuration' header) comes from
// the 'private data'. The second frame (the 'comment' header) is synthesized by
// us here:
if (demuxedTrack->fOpusTrackNumber == 0) {
specialFrameSource = track->codecPrivate;
frameSize = track->codecPrivateSize;
} else { // demuxedTrack->fOpusTrackNumber == 1
specialFrameSource = opusCommentHeader;
frameSize = sizeof opusCommentHeader;
}
++demuxedTrack->fOpusTrackNumber;
} else {
frameSize = fFrameSizesWithinBlock[fNextFrameNumberToDeliver];
if (track->haveSubframes()) {
// The next "track->subframeSizeSize" bytes contain the length of a 'subframe':
if (fCurOffsetWithinFrame + track->subframeSizeSize > frameSize) break; // sanity check
unsigned subframeSize = 0;
for (unsigned i = 0; i < track->subframeSizeSize; ++i) {
u_int8_t c;
getCommonFrameBytes(track, &c, 1, 0);
if (fCurFrameNumBytesToGet > 0) { // it'll be 1
c = get1Byte();
++fCurOffsetWithinFrame;
}
subframeSize = subframeSize*256 + c;
}
if (subframeSize == 0 || fCurOffsetWithinFrame + subframeSize > frameSize) break; // sanity check
frameSize = subframeSize;
}
}
// Compute the presentation time of this frame (from the cluster timecode, the block timecode, and the default duration):
double pt = (fClusterTimecode+fBlockTimecode)*(fOurFile.fTimecodeScale/1000000000.0)
+ fNextFrameNumberToDeliver*(track->defaultDuration/1000000000.0);
if (fPresentationTimeOffset == 0.0) {
// This is the first time we've computed a presentation time. Compute an offset to make the presentation times aligned
// with 'wall clock' time:
struct timeval timeNow;
gettimeofday(&timeNow, NULL);
double ptNow = timeNow.tv_sec + timeNow.tv_usec/1000000.0;
fPresentationTimeOffset = ptNow - pt;
}
pt += fPresentationTimeOffset;
struct timeval presentationTime;
presentationTime.tv_sec = (unsigned)pt;
presentationTime.tv_usec = (unsigned)((pt - presentationTime.tv_sec)*1000000);
unsigned durationInMicroseconds;
if (specialFrameSource != NULL) {
durationInMicroseconds = 0;
} else { // normal case
durationInMicroseconds = track->defaultDuration/1000;
if (track->haveSubframes()) {
// If this is a 'subframe', use a duration of 0 instead (unless it's the last 'subframe'):
if (fCurOffsetWithinFrame + frameSize + track->subframeSizeSize < fFrameSizesWithinBlock[fNextFrameNumberToDeliver]) {
// There's room for at least one more subframe after this, so give this subframe a duration of 0
durationInMicroseconds = 0;
}
}
}
if (track->defaultDuration == 0) {
// Adjust the frame duration to keep the sum of frame durations aligned with presentation times.
if (demuxedTrack->prevPresentationTime().tv_sec != 0) { // not the first time for this track
demuxedTrack->durationImbalance()
+= (presentationTime.tv_sec - demuxedTrack->prevPresentationTime().tv_sec)*1000000
+ (presentationTime.tv_usec - demuxedTrack->prevPresentationTime().tv_usec);
}
int adjustment = 0;
if (demuxedTrack->durationImbalance() > 0) {
// The duration needs to be increased.
int const adjustmentThreshold = 100000; // don't increase the duration by more than this amount (in case there's a mistake)
adjustment = demuxedTrack->durationImbalance() > adjustmentThreshold
? adjustmentThreshold : demuxedTrack->durationImbalance();
} else if (demuxedTrack->durationImbalance() < 0) {
// The duration needs to be decreased.
adjustment = (unsigned)(-demuxedTrack->durationImbalance()) < durationInMicroseconds
? demuxedTrack->durationImbalance() : -(int)durationInMicroseconds;
}
durationInMicroseconds += adjustment;
demuxedTrack->durationImbalance() -= durationInMicroseconds; // for next time
demuxedTrack->prevPresentationTime() = presentationTime; // for next time
}
demuxedTrack->presentationTime() = presentationTime;
demuxedTrack->durationInMicroseconds() = durationInMicroseconds;
// Deliver the next block now:
if (frameSize > demuxedTrack->maxSize()) {
demuxedTrack->numTruncatedBytes() = frameSize - demuxedTrack->maxSize();
demuxedTrack->frameSize() = demuxedTrack->maxSize();
} else { // normal case
demuxedTrack->numTruncatedBytes() = 0;
demuxedTrack->frameSize() = frameSize;
}
getCommonFrameBytes(track, demuxedTrack->to(), demuxedTrack->frameSize(), demuxedTrack->numTruncatedBytes());
// Next, deliver (and/or skip) bytes from the input file:
if (specialFrameSource != NULL) {
memmove(demuxedTrack->to(), specialFrameSource, demuxedTrack->frameSize());
#ifdef DEBUG
fprintf(stderr, "\tdelivered special frame: %d bytes", demuxedTrack->frameSize());
if (demuxedTrack->numTruncatedBytes() > 0) fprintf(stderr, " (%d bytes truncated)", demuxedTrack->numTruncatedBytes());
fprintf(stderr, " @%u.%06u (%.06f from start); duration %u us\n", demuxedTrack->presentationTime().tv_sec, demuxedTrack->presentationTime().tv_usec, demuxedTrack->presentationTime().tv_sec+demuxedTrack->presentationTime().tv_usec/1000000.0-fPresentationTimeOffset, demuxedTrack->durationInMicroseconds());
#endif
setParseState();
FramedSource::afterGetting(demuxedTrack); // completes delivery
} else { // normal case
fCurrentParseState = DELIVERING_FRAME_BYTES;
setParseState();
}
return True;
} while (0);
// An error occurred. Try to recover:
#ifdef DEBUG
fprintf(stderr, "deliverFrameWithinBlock(): Error parsing data; trying to recover...\n");
#endif
fCurrentParseState = LOOKING_FOR_BLOCK;
return True;
}
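The presentation-time computation in `deliverFrameWithinBlock()` can be sketched as a standalone helper (illustrative only; the function name and parameters are ours — the default Matroska timecode scale of 1,000,000 ns per tick is assumed here):

```cpp
// Illustrative sketch: a frame's presentation time (in seconds, before the
// 'wall clock' offset is applied) is the cluster timecode plus the block's
// relative timecode, scaled by the file's timecode scale, plus the frame's
// index within the block times the track's default frame duration.
double framePresentationSeconds(unsigned clusterTimecode, short blockTimecode,
                                unsigned frameIndexWithinBlock,
                                unsigned timecodeScaleNs, unsigned defaultDurationNs) {
  return (clusterTimecode + blockTimecode) * (timecodeScaleNs / 1000000000.0)
       + frameIndexWithinBlock * (defaultDurationNs / 1000000000.0);
}
```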
void MatroskaFileParser::deliverFrameBytes() {
do {
MatroskaTrack* track = fOurFile.lookup(fBlockTrackNumber);
if (track == NULL) break; // shouldn't happen
MatroskaDemuxedTrack* demuxedTrack = fOurDemux->lookupDemuxedTrack(fBlockTrackNumber);
if (demuxedTrack == NULL) break; // shouldn't happen
unsigned const BANK_SIZE = bankSize();
while (fCurFrameNumBytesToGet > 0) {
// Hack: We can get no more than BANK_SIZE bytes at a time:
unsigned numBytesToGet = fCurFrameNumBytesToGet > BANK_SIZE ? BANK_SIZE : fCurFrameNumBytesToGet;
getBytes(fCurFrameTo, numBytesToGet);
fCurFrameTo += numBytesToGet;
fCurFrameNumBytesToGet -= numBytesToGet;
fCurOffsetWithinFrame += numBytesToGet;
setParseState();
}
while (fCurFrameNumBytesToSkip > 0) {
// Hack: We can skip no more than BANK_SIZE bytes at a time:
unsigned numBytesToSkip = fCurFrameNumBytesToSkip > BANK_SIZE ? BANK_SIZE : fCurFrameNumBytesToSkip;
skipBytes(numBytesToSkip);
fCurFrameNumBytesToSkip -= numBytesToSkip;
fCurOffsetWithinFrame += numBytesToSkip;
setParseState();
}
#ifdef DEBUG
fprintf(stderr, "\tdelivered frame #%d: %d bytes", fNextFrameNumberToDeliver, demuxedTrack->frameSize());
if (track->haveSubframes()) fprintf(stderr, "[offset %d]", fCurOffsetWithinFrame - track->subframeSizeSize - demuxedTrack->frameSize() - demuxedTrack->numTruncatedBytes());
if (demuxedTrack->numTruncatedBytes() > 0) fprintf(stderr, " (%d bytes truncated)", demuxedTrack->numTruncatedBytes());
fprintf(stderr, " @%u.%06u (%.06f from start); duration %u us\n", demuxedTrack->presentationTime().tv_sec, demuxedTrack->presentationTime().tv_usec, demuxedTrack->presentationTime().tv_sec+demuxedTrack->presentationTime().tv_usec/1000000.0-fPresentationTimeOffset, demuxedTrack->durationInMicroseconds());
#endif
if (!track->haveSubframes()
|| fCurOffsetWithinFrame + track->subframeSizeSize >= fFrameSizesWithinBlock[fNextFrameNumberToDeliver]) {
// Either we don't have subframes, or there's no more room for another subframe => We're completely done with this frame now:
++fNextFrameNumberToDeliver;
fCurOffsetWithinFrame = 0;
}
if (fNextFrameNumberToDeliver == fNumFramesInBlock) {
// We've delivered all of the frames from this block. Look for another block next:
fCurrentParseState = LOOKING_FOR_BLOCK;
} else {
fCurrentParseState = DELIVERING_FRAME_WITHIN_BLOCK;
}
setParseState();
FramedSource::afterGetting(demuxedTrack); // completes delivery
return;
} while (0);
// An error occurred. Try to recover:
#ifdef DEBUG
fprintf(stderr, "deliverFrameBytes(): Error parsing data; trying to recover...\n");
#endif
fCurrentParseState = LOOKING_FOR_BLOCK;
}
void MatroskaFileParser
::getCommonFrameBytes(MatroskaTrack* track, u_int8_t* to, unsigned numBytesToGet, unsigned numBytesToSkip) {
if (track->headerStrippedBytesSize > fCurOffsetWithinFrame) {
// We have some common 'header stripped' bytes that remain to be prepended to the frame. Use these first:
unsigned numRemainingHeaderStrippedBytes = track->headerStrippedBytesSize - fCurOffsetWithinFrame;
unsigned numHeaderStrippedBytesToGet;
if (numBytesToGet <= numRemainingHeaderStrippedBytes) {
numHeaderStrippedBytesToGet = numBytesToGet;
numBytesToGet = 0;
if (numBytesToGet + numBytesToSkip <= numRemainingHeaderStrippedBytes) {
numBytesToSkip = 0;
} else {
numBytesToSkip = numBytesToGet + numBytesToSkip - numRemainingHeaderStrippedBytes;
}
} else {
numHeaderStrippedBytesToGet = numRemainingHeaderStrippedBytes;
numBytesToGet = numBytesToGet - numRemainingHeaderStrippedBytes;
}
if (numHeaderStrippedBytesToGet > 0) {
memmove(to, &track->headerStrippedBytes[fCurOffsetWithinFrame], numHeaderStrippedBytesToGet);
to += numHeaderStrippedBytesToGet;
fCurOffsetWithinFrame += numHeaderStrippedBytesToGet;
}
}
fCurFrameTo = to;
fCurFrameNumBytesToGet = numBytesToGet;
fCurFrameNumBytesToSkip = numBytesToSkip;
}
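The 'header stripping' logic above is interleaved with the parser's streaming state; as a simpler, buffer-based sketch (hypothetical helper, not the library's code), reassembling a stripped frame is just a prepend of the `ContentCompSettings` bytes:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Illustrative sketch: reassemble a frame whose common leading bytes were
// removed by Matroska 'Header Stripping' compression.  The stripped bytes
// (saved from the track's 'Content Compression Settings' header) are
// prepended to the frame data read from the file.
std::vector<uint8_t> reassembleStrippedFrame(uint8_t const* strippedBytes, size_t strippedSize,
                                             uint8_t const* fileBytes, size_t fileSize) {
  std::vector<uint8_t> frame(strippedSize + fileSize);
  if (strippedSize > 0) memcpy(&frame[0], strippedBytes, strippedSize);
  if (fileSize > 0) memcpy(&frame[strippedSize], fileBytes, fileSize);
  return frame;
}
```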
Boolean MatroskaFileParser::parseEBMLNumber(EBMLNumber& num) {
unsigned i;
u_int8_t bitmask = 0x80;
for (i = 0; i < EBML_NUMBER_MAX_LEN; ++i) {
while (1) {
if (fLimitOffsetInFile > 0 && fCurOffsetInFile > fLimitOffsetInFile) return False; // We've hit our pre-set limit
num.data[i] = get1Byte();
++fCurOffsetInFile;
// If we're looking for an id, skip any leading bytes that don't contain a '1' in the first 4 bits:
if (i == 0/*we're a leading byte*/ && !num.stripLeading1/*we're looking for an id*/ && (num.data[i]&0xF0) == 0) {
setParseState(); // ensures that we make forward progress if the parsing gets interrupted
continue;
}
break;
}
if ((num.data[0]&bitmask) != 0) {
// num[i] is the last byte of the id
if (num.stripLeading1) num.data[0] &=~ bitmask;
break;
}
bitmask >>= 1;
}
if (i == EBML_NUMBER_MAX_LEN) return False;
num.len = i+1;
return True;
}
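The EBML variable-length encoding that `parseEBMLNumber()` reads from the stream can also be sketched as a standalone, buffer-based decoder (illustrative only; the real function additionally handles parse-state saving, file-offset limits, and skipping of non-id leading bytes):

```cpp
#include <cstdint>
#include <cstddef>

// Illustrative sketch: decode an EBML variable-length number from a buffer.
// The position of the first '1' bit in the first byte gives the total length
// (0x80 => 1 byte, 0x40 => 2 bytes, ...).  When "stripLeading1" is set (as
// for data-size fields), that marker bit is cleared before the bytes are
// accumulated big-endian; for ids, the marker bit is kept.
bool decodeEBMLNumber(uint8_t const* data, size_t maxLen, bool stripLeading1,
                      uint64_t& result, size_t& numBytes) {
  if (maxLen == 0) return false;
  uint8_t bitmask = 0x80;
  size_t len = 1;
  while (len <= 8 && (data[0] & bitmask) == 0) { // find the marker bit
    bitmask >>= 1;
    ++len;
  }
  if (len > 8 || len > maxLen) return false;
  uint8_t firstByte = data[0];
  if (stripLeading1) firstByte &= ~bitmask;
  result = firstByte;
  for (size_t i = 1; i < len; ++i) result = (result << 8) | data[i];
  numBytes = len;
  return true;
}
```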
Boolean MatroskaFileParser::parseEBMLIdAndSize(EBMLId& id, EBMLDataSize& size) {
return parseEBMLNumber(id) && parseEBMLNumber(size);
}
Boolean MatroskaFileParser::parseEBMLVal_unsigned64(EBMLDataSize& size, u_int64_t& result) {
u_int64_t sv = size.val();
if (sv > 8) return False; // size too large
result = 0; // initially
for (unsigned i = (unsigned)sv; i > 0; --i) {
if (fLimitOffsetInFile > 0 && fCurOffsetInFile > fLimitOffsetInFile) return False; // We've hit our pre-set limit
u_int8_t c = get1Byte();
++fCurOffsetInFile;
result = result*256 + c;
}
return True;
}
Boolean MatroskaFileParser::parseEBMLVal_unsigned(EBMLDataSize& size, unsigned& result) {
if (size.val() > 4) return False; // size too large
u_int64_t result64;
if (!parseEBMLVal_unsigned64(size, result64)) return False;
result = (unsigned)result64;
return True;
}
Boolean MatroskaFileParser::parseEBMLVal_float(EBMLDataSize& size, float& result) {
if (size.val() == 4) {
// Normal case. Read the value as if it were a 4-byte integer, then copy it to the 'float' result:
unsigned resultAsUnsigned;
if (!parseEBMLVal_unsigned(size, resultAsUnsigned)) return False;
if (sizeof result != sizeof resultAsUnsigned) return False;
memcpy(&result, &resultAsUnsigned, sizeof result);
return True;
} else if (size.val() == 8) {
// Read the value as if it were an 8-byte integer, then copy it to a 'double', then convert that to the 'float' result:
u_int64_t resultAsUnsigned64;
if (!parseEBMLVal_unsigned64(size, resultAsUnsigned64)) return False;
double resultDouble;
if (sizeof resultDouble != sizeof resultAsUnsigned64) return False;
memcpy(&resultDouble, &resultAsUnsigned64, sizeof resultDouble);
result = (float)resultDouble;
return True;
} else {
// Unworkable size
return False;
}
}
Boolean MatroskaFileParser::parseEBMLVal_string(EBMLDataSize& size, char*& result) {
unsigned resultLength = (unsigned)size.val();
result = new char[resultLength + 1]; // allow for the trailing '\0'
if (result == NULL) return False;
char* p = result;
unsigned i;
for (i = 0; i < resultLength; ++i) {
if (fLimitOffsetInFile > 0 && fCurOffsetInFile > fLimitOffsetInFile) break; // We've hit our pre-set limit
u_int8_t c = get1Byte();
++fCurOffsetInFile;
*p++ = c;
}
if (i < resultLength) { // an error occurred
delete[] result;
result = NULL;
return False;
}
*p = '\0';
return True;
}
Boolean MatroskaFileParser::parseEBMLVal_binary(EBMLDataSize& size, u_int8_t*& result) {
unsigned resultLength = (unsigned)size.val();
result = new u_int8_t[resultLength];
if (result == NULL) return False;
u_int8_t* p = result;
unsigned i;
for (i = 0; i < resultLength; ++i) {
if (fLimitOffsetInFile > 0 && fCurOffsetInFile > fLimitOffsetInFile) break; // We've hit our pre-set limit
u_int8_t c = get1Byte();
++fCurOffsetInFile;
*p++ = c;
}
if (i < resultLength) { // an error occurred
delete[] result;
result = NULL;
return False;
}
return True;
}
void MatroskaFileParser::skipHeader(EBMLDataSize const& size) {
u_int64_t sv = size.val(); // no narrowing cast; header sizes can exceed 32 bits
#ifdef DEBUG
fprintf(stderr, "\tskipping %llu bytes\n", sv);
#endif
fNumHeaderBytesToSkip = sv;
skipRemainingHeaderBytes(False);
}
void MatroskaFileParser::skipRemainingHeaderBytes(Boolean isContinuation) {
if (fNumHeaderBytesToSkip == 0) return; // common case
// Hack: To avoid tripping into a parser 'internal error' if we try to skip an excessively large
// distance, break up the skipping into manageable chunks, to ensure forward progress:
unsigned const maxBytesToSkip = bankSize();
while (fNumHeaderBytesToSkip > 0) {
unsigned numBytesToSkipNow
= fNumHeaderBytesToSkip < maxBytesToSkip ? (unsigned)fNumHeaderBytesToSkip : maxBytesToSkip;
setParseState();
skipBytes(numBytesToSkipNow);
#ifdef DEBUG
if (isContinuation || numBytesToSkipNow < fNumHeaderBytesToSkip) {
fprintf(stderr, "\t\t(skipped %u bytes; %llu bytes remaining)\n",
numBytesToSkipNow, fNumHeaderBytesToSkip - numBytesToSkipNow);
}
#endif
fCurOffsetInFile += numBytesToSkipNow;
fNumHeaderBytesToSkip -= numBytesToSkipNow;
}
}
void MatroskaFileParser::setParseState() {
fSavedCurOffsetInFile = fCurOffsetInFile;
fSavedCurOffsetWithinFrame = fCurOffsetWithinFrame;
saveParserState();
}
void MatroskaFileParser::restoreSavedParserState() {
StreamParser::restoreSavedParserState();
fCurOffsetInFile = fSavedCurOffsetInFile;
fCurOffsetWithinFrame = fSavedCurOffsetWithinFrame;
}
void MatroskaFileParser::seekToFilePosition(u_int64_t offsetInFile) {
ByteStreamFileSource* fileSource = (ByteStreamFileSource*)fInputSource; // we know it's a "ByteStreamFileSource"
if (fileSource != NULL) {
fileSource->seekToByteAbsolute(offsetInFile);
resetStateAfterSeeking();
}
}
void MatroskaFileParser::seekToEndOfFile() {
ByteStreamFileSource* fileSource = (ByteStreamFileSource*)fInputSource; // we know it's a "ByteStreamFileSource"
if (fileSource != NULL) {
fileSource->seekToEnd();
resetStateAfterSeeking();
}
}
void MatroskaFileParser::resetStateAfterSeeking() {
// Because we're resuming parsing after seeking to a new position in the file, reset the parser state:
fCurOffsetInFile = fSavedCurOffsetInFile = 0;
fCurOffsetWithinFrame = fSavedCurOffsetWithinFrame = 0;
flushInput();
}
live/liveMedia/MatroskaFileParser.hh
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A parser for a Matroska file.
// C++ header
#ifndef _MATROSKA_FILE_PARSER_HH
#ifndef _STREAM_PARSER_HH
#include "StreamParser.hh"
#endif
#ifndef _MATROSKA_FILE_HH
#include "MatroskaFile.hh"
#endif
#ifndef _EBML_NUMBER_HH
#include "EBMLNumber.hh"
#endif
// An enum representing the current state of the parser:
enum MatroskaParseState {
PARSING_START_OF_FILE,
LOOKING_FOR_TRACKS,
PARSING_TRACK,
PARSING_CUES,
LOOKING_FOR_CLUSTER,
LOOKING_FOR_BLOCK,
PARSING_BLOCK,
DELIVERING_FRAME_WITHIN_BLOCK,
DELIVERING_FRAME_BYTES
};
class MatroskaFileParser: public StreamParser {
public:
MatroskaFileParser(MatroskaFile& ourFile, FramedSource* inputSource,
FramedSource::onCloseFunc* onEndFunc, void* onEndClientData,
MatroskaDemux* ourDemux = NULL);
virtual ~MatroskaFileParser();
void seekToTime(double& seekNPT);
// StreamParser 'client continue' function:
static void continueParsing(void* clientData, unsigned char* ptr, unsigned size, struct timeval presentationTime);
void continueParsing();
private:
// Parsing functions:
Boolean parse();
// returns True iff we have finished parsing to the end of all 'Track' headers (on initialization)
Boolean parseStartOfFile();
void lookForNextTrack();
Boolean parseTrack();
Boolean parseCues();
void lookForNextBlock();
void parseBlock();
Boolean deliverFrameWithinBlock();
void deliverFrameBytes();
void getCommonFrameBytes(MatroskaTrack* track, u_int8_t* to, unsigned numBytesToGet, unsigned numBytesToSkip);
Boolean parseEBMLNumber(EBMLNumber& num);
Boolean parseEBMLIdAndSize(EBMLId& id, EBMLDataSize& size);
Boolean parseEBMLVal_unsigned64(EBMLDataSize& size, u_int64_t& result);
Boolean parseEBMLVal_unsigned(EBMLDataSize& size, unsigned& result);
Boolean parseEBMLVal_float(EBMLDataSize& size, float& result);
Boolean parseEBMLVal_string(EBMLDataSize& size, char*& result);
// Note: "result" is dynamically allocated; the caller must delete[] it later
Boolean parseEBMLVal_binary(EBMLDataSize& size, u_int8_t*& result);
// Note: "result" is dynamically allocated; the caller must delete[] it later
void skipHeader(EBMLDataSize const& size);
void skipRemainingHeaderBytes(Boolean isContinuation);
void setParseState();
void seekToFilePosition(u_int64_t offsetInFile);
void seekToEndOfFile();
void resetStateAfterSeeking(); // common code, called by both of the above
private: // redefined virtual functions
virtual void restoreSavedParserState();
private:
// General state for parsing:
MatroskaFile& fOurFile;
FramedSource* fInputSource;
FramedSource::onCloseFunc* fOnEndFunc;
void* fOnEndClientData;
MatroskaDemux* fOurDemux;
MatroskaParseState fCurrentParseState;
u_int64_t fCurOffsetInFile, fSavedCurOffsetInFile, fLimitOffsetInFile;
// For skipping over (possibly large) headers:
u_int64_t fNumHeaderBytesToSkip;
// For parsing 'Seek ID's:
EBMLId fLastSeekId;
// Parameters of the most recently-parsed 'Cluster':
unsigned fClusterTimecode;
// Parameters of the most recently-parsed 'Block':
unsigned fBlockSize;
unsigned fBlockTrackNumber;
short fBlockTimecode;
unsigned fNumFramesInBlock;
unsigned* fFrameSizesWithinBlock;
// Parameters of the most recently-parsed frame within a 'Block':
double fPresentationTimeOffset;
unsigned fNextFrameNumberToDeliver;
unsigned fCurOffsetWithinFrame, fSavedCurOffsetWithinFrame; // used if track->haveSubframes()
// Parameters of the (sub)frame that's currently being delivered:
u_int8_t* fCurFrameTo;
unsigned fCurFrameNumBytesToGet;
unsigned fCurFrameNumBytesToSkip;
};
#endif
live/liveMedia/MatroskaFileServerDemux.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A server demultiplexor for a Matroska file
// Implementation
#include "MatroskaFileServerDemux.hh"
#include "MP3AudioMatroskaFileServerMediaSubsession.hh"
#include "MatroskaFileServerMediaSubsession.hh"
void MatroskaFileServerDemux
::createNew(UsageEnvironment& env, char const* fileName,
onCreationFunc* onCreation, void* onCreationClientData,
char const* preferredLanguage) {
(void)new MatroskaFileServerDemux(env, fileName,
onCreation, onCreationClientData,
preferredLanguage);
}
ServerMediaSubsession* MatroskaFileServerDemux::newServerMediaSubsession() {
unsigned dummyResultTrackNumber;
return newServerMediaSubsession(dummyResultTrackNumber);
}
ServerMediaSubsession* MatroskaFileServerDemux
::newServerMediaSubsession(unsigned& resultTrackNumber) {
ServerMediaSubsession* result;
resultTrackNumber = 0;
for (result = NULL; result == NULL && fNextTrackTypeToCheck != MATROSKA_TRACK_TYPE_OTHER; fNextTrackTypeToCheck <<= 1) {
if (fNextTrackTypeToCheck == MATROSKA_TRACK_TYPE_VIDEO) resultTrackNumber = fOurMatroskaFile->chosenVideoTrackNumber();
else if (fNextTrackTypeToCheck == MATROSKA_TRACK_TYPE_AUDIO) resultTrackNumber = fOurMatroskaFile->chosenAudioTrackNumber();
else if (fNextTrackTypeToCheck == MATROSKA_TRACK_TYPE_SUBTITLE) resultTrackNumber = fOurMatroskaFile->chosenSubtitleTrackNumber();
result = newServerMediaSubsessionByTrackNumber(resultTrackNumber);
}
return result;
}
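The loop above walks the track types by left-shifting a bitmask until an 'OTHER' sentinel is reached. A standalone sketch of that iteration pattern (the flag values here are hypothetical stand-ins for the `MATROSKA_TRACK_TYPE_*` constants, which are assumed to be one-hot bits):

```cpp
#include <cassert>

// Sketch (not LIVE555 code) of the "<<= 1" track-type iteration in
// newServerMediaSubsession().  The types are assumed to be one-hot flags,
// so shifting left steps VIDEO -> AUDIO -> SUBTITLE until the OTHER
// sentinel ends the loop.  Values are illustrative only.
enum { TYPE_VIDEO = 0x1, TYPE_AUDIO = 0x2, TYPE_SUBTITLE = 0x4, TYPE_OTHER = 0x8 };

// Returns the first type (checked in shift order, starting at
// 'nextTypeToCheck') for which 'hasTrack' reports a track, or 0 if none.
unsigned firstAvailableType(unsigned nextTypeToCheck, bool (*hasTrack)(unsigned)) {
  for (; nextTypeToCheck != TYPE_OTHER; nextTypeToCheck <<= 1) {
    if (hasTrack(nextTypeToCheck)) return nextTypeToCheck;
  }
  return 0; // no usable track of any remaining type
}

bool hasAudioOnly(unsigned t) { return t == TYPE_AUDIO; } // example predicate
```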
ServerMediaSubsession* MatroskaFileServerDemux
::newServerMediaSubsessionByTrackNumber(unsigned trackNumber) {
MatroskaTrack* track = fOurMatroskaFile->lookup(trackNumber);
if (track == NULL) return NULL;
// Use the track's "codecID" string to figure out which "ServerMediaSubsession" subclass to use:
ServerMediaSubsession* result = NULL;
if (strcmp(track->mimeType, "audio/MPEG") == 0) {
result = MP3AudioMatroskaFileServerMediaSubsession::createNew(*this, track);
} else {
result = MatroskaFileServerMediaSubsession::createNew(*this, track);
}
if (result != NULL) {
#ifdef DEBUG
fprintf(stderr, "Created 'ServerMediaSubsession' object for track #%d: %s (%s)\n", track->trackNumber, track->codecID, track->mimeType);
#endif
}
return result;
}
FramedSource* MatroskaFileServerDemux::newDemuxedTrack(unsigned clientSessionId, unsigned trackNumber) {
MatroskaDemux* demuxToUse = NULL;
if (clientSessionId != 0 && clientSessionId == fLastClientSessionId) {
demuxToUse = fLastCreatedDemux; // use the same demultiplexor as before
// Note: This code relies upon the fact that the creation of streams for different
// client sessions does not overlap - so all demuxed tracks are created for one "MatroskaDemux" at a time.
// Also, the "clientSessionId != 0" test is a hack, because 'session 0' is special; its audio and video streams
// are created and destroyed one-at-a-time, rather than both streams being
// created, and then (later) both streams being destroyed (as is the case
// for other ('real') session ids). Because of this, a separate demultiplexor is used for each 'session 0' track.
}
if (demuxToUse == NULL) demuxToUse = fOurMatroskaFile->newDemux();
fLastClientSessionId = clientSessionId;
fLastCreatedDemux = demuxToUse;
return demuxToUse->newDemuxedTrackByTrackNumber(trackNumber);
}
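The caching policy in "newDemuxedTrack()" above - reuse the last demux for repeated requests with the same non-zero session id, and always create a fresh one for 'session 0' - can be sketched in isolation (the integer "demux ids" below are hypothetical stand-ins for `MatroskaDemux*` pointers):

```cpp
#include <cassert>

// Standalone sketch of MatroskaFileServerDemux::newDemuxedTrack()'s caching:
// tracks for the same non-zero client session share one demultiplexor;
// session id 0 is special and always gets a fresh one.
struct DemuxCache {
  unsigned lastSessionId = 0;
  int lastDemux = 0;   // stands in for fLastCreatedDemux (0 == none)
  int nextDemuxId = 1; // stands in for MatroskaFile::newDemux()

  int demuxFor(unsigned sessionId) {
    int d = 0;
    if (sessionId != 0 && sessionId == lastSessionId) {
      d = lastDemux; // same session as last time: reuse its demux
    }
    if (d == 0) d = nextDemuxId++; // otherwise create a new one
    lastSessionId = sessionId;
    lastDemux = d;
    return d;
  }
};
```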
MatroskaFileServerDemux
::MatroskaFileServerDemux(UsageEnvironment& env, char const* fileName,
onCreationFunc* onCreation, void* onCreationClientData,
char const* preferredLanguage)
: Medium(env),
fFileName(fileName), fOnCreation(onCreation), fOnCreationClientData(onCreationClientData),
fNextTrackTypeToCheck(0x1), fLastClientSessionId(0), fLastCreatedDemux(NULL) {
MatroskaFile::createNew(env, fileName, onMatroskaFileCreation, this, preferredLanguage);
}
MatroskaFileServerDemux::~MatroskaFileServerDemux() {
Medium::close(fOurMatroskaFile);
}
void MatroskaFileServerDemux::onMatroskaFileCreation(MatroskaFile* newFile, void* clientData) {
((MatroskaFileServerDemux*)clientData)->onMatroskaFileCreation(newFile);
}
void MatroskaFileServerDemux::onMatroskaFileCreation(MatroskaFile* newFile) {
fOurMatroskaFile = newFile;
// Now, call our own creation notification function:
if (fOnCreation != NULL) (*fOnCreation)(this, fOnCreationClientData);
}
live/liveMedia/Media.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// Media
// Implementation
#include "Media.hh"
#include "HashTable.hh"
////////// Medium //////////
Medium::Medium(UsageEnvironment& env)
: fEnviron(env), fNextTask(NULL) {
// First generate a name for the new medium:
MediaLookupTable::ourMedia(env)->generateNewName(fMediumName, mediumNameMaxLen);
env.setResultMsg(fMediumName);
// Then add it to our table:
MediaLookupTable::ourMedia(env)->addNew(this, fMediumName);
}
Medium::~Medium() {
// Remove any tasks that might be pending for us:
fEnviron.taskScheduler().unscheduleDelayedTask(fNextTask);
}
Boolean Medium::lookupByName(UsageEnvironment& env, char const* mediumName,
Medium*& resultMedium) {
resultMedium = MediaLookupTable::ourMedia(env)->lookup(mediumName);
if (resultMedium == NULL) {
env.setResultMsg("Medium ", mediumName, " does not exist");
return False;
}
return True;
}
void Medium::close(UsageEnvironment& env, char const* name) {
MediaLookupTable::ourMedia(env)->remove(name);
}
void Medium::close(Medium* medium) {
if (medium == NULL) return;
close(medium->envir(), medium->name());
}
Boolean Medium::isSource() const {
return False; // default implementation
}
Boolean Medium::isSink() const {
return False; // default implementation
}
Boolean Medium::isRTCPInstance() const {
return False; // default implementation
}
Boolean Medium::isRTSPClient() const {
return False; // default implementation
}
Boolean Medium::isRTSPServer() const {
return False; // default implementation
}
Boolean Medium::isMediaSession() const {
return False; // default implementation
}
Boolean Medium::isServerMediaSession() const {
return False; // default implementation
}
////////// _Tables implementation //////////
_Tables* _Tables::getOurTables(UsageEnvironment& env, Boolean createIfNotPresent) {
if (env.liveMediaPriv == NULL && createIfNotPresent) {
env.liveMediaPriv = new _Tables(env);
}
return (_Tables*)(env.liveMediaPriv);
}
void _Tables::reclaimIfPossible() {
if (mediaTable == NULL && socketTable == NULL) {
fEnv.liveMediaPriv = NULL;
delete this;
}
}
_Tables::_Tables(UsageEnvironment& env)
: mediaTable(NULL), socketTable(NULL), fEnv(env) {
}
_Tables::~_Tables() {
}
////////// MediaLookupTable implementation //////////
MediaLookupTable* MediaLookupTable::ourMedia(UsageEnvironment& env) {
_Tables* ourTables = _Tables::getOurTables(env);
if (ourTables->mediaTable == NULL) {
// Create a new table to record the media that are to be created in
// this environment:
ourTables->mediaTable = new MediaLookupTable(env);
}
return ourTables->mediaTable;
}
Medium* MediaLookupTable::lookup(char const* name) const {
return (Medium*)(fTable->Lookup(name));
}
void MediaLookupTable::addNew(Medium* medium, char* mediumName) {
fTable->Add(mediumName, (void*)medium);
}
void MediaLookupTable::remove(char const* name) {
Medium* medium = lookup(name);
if (medium != NULL) {
fTable->Remove(name);
if (fTable->IsEmpty()) {
// We can also delete ourselves (to reclaim space):
_Tables* ourTables = _Tables::getOurTables(fEnv);
delete this;
ourTables->mediaTable = NULL;
ourTables->reclaimIfPossible();
}
delete medium;
}
}
void MediaLookupTable::generateNewName(char* mediumName,
unsigned /*maxLen*/) {
// We should really use snprintf() here, but not all systems have it
sprintf(mediumName, "liveMedia%d", fNameGenerator++);
}
MediaLookupTable::MediaLookupTable(UsageEnvironment& env)
: fEnv(env), fTable(HashTable::create(STRING_HASH_KEYS)), fNameGenerator(0) {
}
MediaLookupTable::~MediaLookupTable() {
delete fTable;
}
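"generateNewName()" above formats names as "liveMedia" plus a counter, and its comment notes that snprintf() would be preferable to sprintf(). A hedged sketch of what a bounds-checked version might look like (this is not LIVE555's implementation; the helper name and use of std::string are illustrative):

```cpp
#include <cassert>
#include <cstdio>
#include <cstring>
#include <string>

// Sketch of MediaLookupTable::generateNewName() using snprintf(), so the
// caller-supplied maxLen is actually honored (the real code ignores it).
std::string generateName(unsigned& nameGenerator, unsigned maxLen) {
  std::string buf(maxLen, '\0');
  // snprintf() truncates rather than overrunning the buffer:
  std::snprintf(&buf[0], maxLen, "liveMedia%u", nameGenerator++);
  buf.resize(std::strlen(buf.c_str())); // trim to the written characters
  return buf;
}
```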
live/liveMedia/MatroskaFileServerMediaSubsession.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A 'ServerMediaSubsession' object that creates new, unicast, "RTPSink"s
// on demand, from a track within a Matroska file.
// Implementation
#include "MatroskaFileServerMediaSubsession.hh"
#include "MatroskaDemuxedTrack.hh"
#include "FramedFilter.hh"
MatroskaFileServerMediaSubsession* MatroskaFileServerMediaSubsession
::createNew(MatroskaFileServerDemux& demux, MatroskaTrack* track) {
return new MatroskaFileServerMediaSubsession(demux, track);
}
MatroskaFileServerMediaSubsession
::MatroskaFileServerMediaSubsession(MatroskaFileServerDemux& demux, MatroskaTrack* track)
: FileServerMediaSubsession(demux.envir(), demux.fileName(), False),
fOurDemux(demux), fTrack(track), fNumFiltersInFrontOfTrack(0) {
}
MatroskaFileServerMediaSubsession::~MatroskaFileServerMediaSubsession() {
}
float MatroskaFileServerMediaSubsession::duration() const { return fOurDemux.fileDuration(); }
void MatroskaFileServerMediaSubsession
::seekStreamSource(FramedSource* inputSource, double& seekNPT, double /*streamDuration*/, u_int64_t& /*numBytes*/) {
for (unsigned i = 0; i < fNumFiltersInFrontOfTrack; ++i) {
// "inputSource" is a filter. Go back to *its* source:
inputSource = ((FramedFilter*)inputSource)->inputSource();
}
((MatroskaDemuxedTrack*)inputSource)->seekToTime(seekNPT);
}
FramedSource* MatroskaFileServerMediaSubsession
::createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate) {
FramedSource* baseSource = fOurDemux.newDemuxedTrack(clientSessionId, fTrack->trackNumber);
if (baseSource == NULL) return NULL;
return fOurDemux.ourMatroskaFile()
->createSourceForStreaming(baseSource, fTrack->trackNumber,
estBitrate, fNumFiltersInFrontOfTrack);
}
RTPSink* MatroskaFileServerMediaSubsession
::createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* /*inputSource*/) {
return fOurDemux.ourMatroskaFile()
->createRTPSinkForTrackNumber(fTrack->trackNumber, rtpGroupsock, rtpPayloadTypeIfDynamic);
}
live/liveMedia/MP3ADU.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// 'ADU' MP3 streams (for improved loss-tolerance)
// Implementation
#include "MP3ADU.hh"
#include "MP3ADUdescriptor.hh"
#include "MP3Internals.hh"
#include <string.h> // for strcmp() and memmove(), used below
#ifdef TEST_LOSS
#include "GroupsockHelper.hh"
#endif
// Segment data structures, used in the implementation below:
#define SegmentBufSize 2000 /* conservatively high */
class Segment {
public:
unsigned char buf[SegmentBufSize];
unsigned char* dataStart() { return &buf[descriptorSize]; }
unsigned frameSize; // if it's a non-ADU frame
unsigned dataHere(); // if it's a non-ADU frame
unsigned descriptorSize;
static unsigned const headerSize;
unsigned sideInfoSize, aduSize;
unsigned backpointer;
struct timeval presentationTime;
unsigned durationInMicroseconds;
};
unsigned const Segment::headerSize = 4;
#define SegmentQueueSize 20
class SegmentQueue {
public:
SegmentQueue(Boolean directionIsToADU, Boolean includeADUdescriptors)
: fDirectionIsToADU(directionIsToADU),
fIncludeADUdescriptors(includeADUdescriptors) {
reset();
}
Segment s[SegmentQueueSize];
unsigned headIndex() {return fHeadIndex;}
Segment& headSegment() {return s[fHeadIndex];}
unsigned nextFreeIndex() {return fNextFreeIndex;}
Segment& nextFreeSegment() {return s[fNextFreeIndex];}
Boolean isEmpty() {return isEmptyOrFull() && totalDataSize() == 0;}
Boolean isFull() {return isEmptyOrFull() && totalDataSize() > 0;}
static unsigned nextIndex(unsigned ix) {return (ix+1)%SegmentQueueSize;}
static unsigned prevIndex(unsigned ix) {return (ix+SegmentQueueSize-1)%SegmentQueueSize;}
unsigned totalDataSize() {return fTotalDataSize;}
void enqueueNewSegment(FramedSource* inputSource, FramedSource* usingSource);
Boolean dequeue();
Boolean insertDummyBeforeTail(unsigned backpointer);
void reset() { fHeadIndex = fNextFreeIndex = fTotalDataSize = 0; }
private:
static void sqAfterGettingSegment(void* clientData,
unsigned numBytesRead,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds);
Boolean sqAfterGettingCommon(Segment& seg, unsigned numBytesRead);
Boolean isEmptyOrFull() {return headIndex() == nextFreeIndex();}
unsigned fHeadIndex, fNextFreeIndex, fTotalDataSize;
// The following is used for asynchronous reads:
FramedSource* fUsingSource;
// This tells us whether the direction in which we're being used
// is MP3->ADU, or vice-versa. (This flag is used for debugging output.)
Boolean fDirectionIsToADU;
// The following is true iff we're used to enqueue incoming
// ADU frames, and these have an ADU descriptor in front
Boolean fIncludeADUdescriptors;
};
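"SegmentQueue" is a fixed-size circular buffer, and its index arithmetic has a classic subtlety: with head and next-free indices alone, head == nextFree is ambiguous between 'empty' and 'full', which is why the class also consults the total buffered data size. A minimal standalone sketch of that arithmetic (mirroring, not reusing, the class's members):

```cpp
#include <cassert>

// Sketch of SegmentQueue's ring-index arithmetic.  kQueueSize mirrors
// SegmentQueueSize; RingState mirrors the head/next-free/total-size trio.
unsigned const kQueueSize = 20;

unsigned nextIndex(unsigned ix) { return (ix + 1) % kQueueSize; }
unsigned prevIndex(unsigned ix) { return (ix + kQueueSize - 1) % kQueueSize; }

struct RingState {
  unsigned head, nextFree, totalDataSize;
  // head == nextFree alone cannot distinguish empty from full:
  bool isEmptyOrFull() const { return head == nextFree; }
  bool isEmpty() const { return isEmptyOrFull() && totalDataSize == 0; }
  bool isFull()  const { return isEmptyOrFull() && totalDataSize >  0; }
};
```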
////////// ADUFromMP3Source //////////
ADUFromMP3Source::ADUFromMP3Source(UsageEnvironment& env,
FramedSource* inputSource,
Boolean includeADUdescriptors)
: FramedFilter(env, inputSource),
fAreEnqueueingMP3Frame(False),
fSegments(new SegmentQueue(True /* because we're MP3->ADU */,
False /*no descriptors in incoming frames*/)),
fIncludeADUdescriptors(includeADUdescriptors),
fTotalDataSizeBeforePreviousRead(0), fScale(1), fFrameCounter(0) {
}
ADUFromMP3Source::~ADUFromMP3Source() {
delete fSegments;
}
char const* ADUFromMP3Source::MIMEtype() const {
return "audio/MPA-ROBUST";
}
ADUFromMP3Source* ADUFromMP3Source::createNew(UsageEnvironment& env,
FramedSource* inputSource,
Boolean includeADUdescriptors) {
// The source must be a MPEG audio source:
if (strcmp(inputSource->MIMEtype(), "audio/MPEG") != 0) {
env.setResultMsg(inputSource->name(), " is not an MPEG audio source");
return NULL;
}
return new ADUFromMP3Source(env, inputSource, includeADUdescriptors);
}
void ADUFromMP3Source::resetInput() {
fSegments->reset();
}
Boolean ADUFromMP3Source::setScaleFactor(int scale) {
if (scale < 1) return False;
fScale = scale;
return True;
}
void ADUFromMP3Source::doGetNextFrame() {
if (!fAreEnqueueingMP3Frame) {
// Arrange to enqueue a new MP3 frame:
fTotalDataSizeBeforePreviousRead = fSegments->totalDataSize();
fAreEnqueueingMP3Frame = True;
fSegments->enqueueNewSegment(fInputSource, this);
} else {
// Deliver an ADU from a previously-read MP3 frame:
fAreEnqueueingMP3Frame = False;
if (!doGetNextFrame1()) {
// An internal error occurred; act as if our source went away:
handleClosure();
}
}
}
Boolean ADUFromMP3Source::doGetNextFrame1() {
// First, check whether we have enough previously-read data to output an
// ADU for the last-read MP3 frame:
unsigned tailIndex;
Segment* tailSeg;
Boolean needMoreData;
if (fSegments->isEmpty()) {
needMoreData = True;
tailSeg = NULL; tailIndex = 0; // unneeded, but stops compiler warnings
} else {
tailIndex = SegmentQueue::prevIndex(fSegments->nextFreeIndex());
tailSeg = &(fSegments->s[tailIndex]);
needMoreData
= fTotalDataSizeBeforePreviousRead < tailSeg->backpointer // bp points back too far
|| tailSeg->backpointer + tailSeg->dataHere() < tailSeg->aduSize; // not enough data
}
if (needMoreData) {
// We don't have enough data to output an ADU from the last-read MP3
// frame, so need to read another one and try again:
doGetNextFrame();
return True;
}
// Output an ADU from the tail segment:
fFrameSize = tailSeg->headerSize+tailSeg->sideInfoSize+tailSeg->aduSize;
fPresentationTime = tailSeg->presentationTime;
fDurationInMicroseconds = tailSeg->durationInMicroseconds;
unsigned descriptorSize
= fIncludeADUdescriptors ? ADUdescriptor::computeSize(fFrameSize) : 0;
#ifdef DEBUG
fprintf(stderr, "m->a:outputting ADU %d<-%d, nbr:%d, sis:%d, dh:%d, (descriptor size: %d)\n", tailSeg->aduSize, tailSeg->backpointer, fFrameSize, tailSeg->sideInfoSize, tailSeg->dataHere(), descriptorSize);
#endif
if (descriptorSize + fFrameSize > fMaxSize) {
envir() << "ADUFromMP3Source::doGetNextFrame1(): not enough room ("
<< descriptorSize + fFrameSize << ">"
<< fMaxSize << ")\n";
fFrameSize = 0;
return False;
}
unsigned char* toPtr = fTo;
// output the ADU descriptor:
if (fIncludeADUdescriptors) {
fFrameSize += ADUdescriptor::generateDescriptor(toPtr, fFrameSize);
}
// output header and side info:
memmove(toPtr, tailSeg->dataStart(),
tailSeg->headerSize + tailSeg->sideInfoSize);
toPtr += tailSeg->headerSize + tailSeg->sideInfoSize;
// go back to the frame that contains the start of our data:
unsigned offset = 0;
unsigned i = tailIndex;
unsigned prevBytes = tailSeg->backpointer;
while (prevBytes > 0) {
i = SegmentQueue::prevIndex(i);
unsigned dataHere = fSegments->s[i].dataHere();
if (dataHere < prevBytes) {
prevBytes -= dataHere;
} else {
offset = dataHere - prevBytes;
break;
}
}
// dequeue any segments that we no longer need:
while (fSegments->headIndex() != i) {
fSegments->dequeue(); // we're done with it
}
unsigned bytesToUse = tailSeg->aduSize;
while (bytesToUse > 0) {
Segment& seg = fSegments->s[i];
unsigned char* fromPtr
= &seg.dataStart()[seg.headerSize + seg.sideInfoSize + offset];
unsigned dataHere = seg.dataHere() - offset;
unsigned bytesUsedHere = dataHere < bytesToUse ? dataHere : bytesToUse;
memmove(toPtr, fromPtr, bytesUsedHere);
bytesToUse -= bytesUsedHere;
toPtr += bytesUsedHere;
offset = 0;
i = SegmentQueue::nextIndex(i);
}
if (fFrameCounter++%fScale == 0) {
// Call our own 'after getting' function. Because we're not a 'leaf'
// source, we can call this directly, without risking infinite recursion.
afterGetting(this);
} else {
// Don't use this frame; get another one:
doGetNextFrame();
}
return True;
}
////////// MP3FromADUSource //////////
MP3FromADUSource::MP3FromADUSource(UsageEnvironment& env,
FramedSource* inputSource,
Boolean includeADUdescriptors)
: FramedFilter(env, inputSource),
fAreEnqueueingADU(False),
fSegments(new SegmentQueue(False /* because we're ADU->MP3 */,
includeADUdescriptors)) {
}
MP3FromADUSource::~MP3FromADUSource() {
delete fSegments;
}
char const* MP3FromADUSource::MIMEtype() const {
return "audio/MPEG";
}
MP3FromADUSource* MP3FromADUSource::createNew(UsageEnvironment& env,
FramedSource* inputSource,
Boolean includeADUdescriptors) {
// The source must be an MP3 ADU source:
if (strcmp(inputSource->MIMEtype(), "audio/MPA-ROBUST") != 0) {
env.setResultMsg(inputSource->name(), " is not an MP3 ADU source");
return NULL;
}
return new MP3FromADUSource(env, inputSource, includeADUdescriptors);
}
void MP3FromADUSource::doGetNextFrame() {
if (fAreEnqueueingADU) insertDummyADUsIfNecessary();
fAreEnqueueingADU = False;
if (needToGetAnADU()) {
// Before returning a frame, we must enqueue at least one ADU:
#ifdef TEST_LOSS
NOTE: This code no longer works, because it uses synchronous reads,
which are no longer supported.
static unsigned const framesPerPacket = 10;
static unsigned const frameCount = 0;
static Boolean packetIsLost;
while (1) {
if ((frameCount++)%framesPerPacket == 0) {
packetIsLost = (our_random()%10 == 0); // simulate 10% packet loss #####
}
if (packetIsLost) {
// Read and discard the next input frame (that would be part of
// a lost packet):
Segment dummySegment;
unsigned numBytesRead;
struct timeval presentationTime;
// (this works only if the source can be read synchronously)
fInputSource->syncGetNextFrame(dummySegment.buf,
sizeof dummySegment.buf, numBytesRead,
presentationTime);
} else {
break; // from while (1)
}
}
#endif
fAreEnqueueingADU = True;
fSegments->enqueueNewSegment(fInputSource, this);
} else {
// Return a frame now:
generateFrameFromHeadADU();
// sets fFrameSize, fPresentationTime, and fDurationInMicroseconds
// Call our own 'after getting' function. Because we're not a 'leaf'
// source, we can call this directly, without risking infinite recursion.
afterGetting(this);
}
}
Boolean MP3FromADUSource::needToGetAnADU() {
// Check whether we need to first enqueue a new ADU before we
// can generate a frame for our head ADU.
Boolean needToEnqueue = True;
if (!fSegments->isEmpty()) {
unsigned index = fSegments->headIndex();
Segment* seg = &(fSegments->headSegment());
int const endOfHeadFrame = (int) seg->dataHere();
unsigned frameOffset = 0;
while (1) {
int endOfData = frameOffset - seg->backpointer + seg->aduSize;
if (endOfData >= endOfHeadFrame) {
// We already have enough data to generate a frame
needToEnqueue = False;
break;
}
frameOffset += seg->dataHere();
index = SegmentQueue::nextIndex(index);
if (index == fSegments->nextFreeIndex()) break;
seg = &(fSegments->s[index]);
}
}
return needToEnqueue;
}
void MP3FromADUSource::insertDummyADUsIfNecessary() {
if (fSegments->isEmpty()) return; // shouldn't happen
// The tail segment (ADU) is assumed to have been recently
// enqueued. If its backpointer would overlap the data
// of the previous ADU, then we need to insert one or more
// empty, 'dummy' ADUs ahead of it. (This situation should occur
// only if an intermediate ADU was lost.)
unsigned tailIndex
= SegmentQueue::prevIndex(fSegments->nextFreeIndex());
Segment* tailSeg = &(fSegments->s[tailIndex]);
while (1) {
unsigned prevADUend; // relative to the start of the new ADU
if (fSegments->headIndex() != tailIndex) {
// there is a previous segment
unsigned prevIndex = SegmentQueue::prevIndex(tailIndex);
Segment& prevSegment = fSegments->s[prevIndex];
prevADUend = prevSegment.dataHere() + prevSegment.backpointer;
if (prevSegment.aduSize > prevADUend) {
// shouldn't happen if the previous ADU was well-formed
prevADUend = 0;
} else {
prevADUend -= prevSegment.aduSize;
}
} else {
prevADUend = 0;
}
if (tailSeg->backpointer > prevADUend) {
// We need to insert a dummy ADU in front of the tail
#ifdef DEBUG
fprintf(stderr, "a->m:need to insert a dummy ADU (%d, %d, %d) [%d, %d]\n", tailSeg->backpointer, prevADUend, tailSeg->dataHere(), fSegments->headIndex(), fSegments->nextFreeIndex());
#endif
tailIndex = fSegments->nextFreeIndex();
if (!fSegments->insertDummyBeforeTail(prevADUend)) return;
tailSeg = &(fSegments->s[tailIndex]);
} else {
break; // no more dummy ADUs need to be inserted
}
}
}
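The overlap test above decides whether the newly-enqueued tail ADU's backpointer reaches back past the end of the previous ADU's data (which would mean an intermediate ADU was lost). That arithmetic can be checked in isolation; the struct and values below are hypothetical, mirroring only the three fields the test uses:

```cpp
#include <cassert>

// Sketch of the test in MP3FromADUSource::insertDummyADUsIfNecessary().
// Relative to the start of the new ADU's frame data, the previous ADU's
// data ends at: prev.dataHere + prev.backpointer - prev.aduSize.
// If the new ADU's backpointer reaches beyond that point, a dummy ADU
// must be inserted to fill the gap.
struct SegSketch { unsigned dataHere, backpointer, aduSize; };

bool needsDummyADU(SegSketch const& prev, unsigned tailBackpointer) {
  unsigned prevADUend = prev.dataHere + prev.backpointer;
  if (prev.aduSize > prevADUend) {
    prevADUend = 0; // shouldn't happen if the previous ADU was well-formed
  } else {
    prevADUend -= prev.aduSize;
  }
  return tailBackpointer > prevADUend;
}
```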
Boolean MP3FromADUSource::generateFrameFromHeadADU() {
// Output a frame for the head ADU:
if (fSegments->isEmpty()) return False;
unsigned index = fSegments->headIndex();
Segment* seg = &(fSegments->headSegment());
#ifdef DEBUG
fprintf(stderr, "a->m:outputting frame for %d<-%d (fs %d, dh %d), (descriptorSize: %d)\n", seg->aduSize, seg->backpointer, seg->frameSize, seg->dataHere(), seg->descriptorSize);
#endif
unsigned char* toPtr = fTo;
// output header and side info:
fFrameSize = seg->frameSize;
fPresentationTime = seg->presentationTime;
fDurationInMicroseconds = seg->durationInMicroseconds;
memmove(toPtr, seg->dataStart(), seg->headerSize + seg->sideInfoSize);
toPtr += seg->headerSize + seg->sideInfoSize;
// zero out the rest of the frame, in case ADU data doesn't fill it all in
unsigned bytesToZero = seg->dataHere();
for (unsigned i = 0; i < bytesToZero; ++i) {
toPtr[i] = '\0';
}
// Fill in the frame with appropriate ADU data from this and
// subsequent ADUs:
unsigned frameOffset = 0;
unsigned toOffset = 0;
unsigned const endOfHeadFrame = seg->dataHere();
while (toOffset < endOfHeadFrame) {
int startOfData = frameOffset - seg->backpointer;
if (startOfData > (int)endOfHeadFrame) break; // no more ADUs needed
int endOfData = startOfData + seg->aduSize;
if (endOfData > (int)endOfHeadFrame) {
endOfData = endOfHeadFrame;
}
unsigned fromOffset;
if (startOfData <= (int)toOffset) {
fromOffset = toOffset - startOfData;
startOfData = toOffset;
if (endOfData < startOfData) endOfData = startOfData;
} else {
fromOffset = 0;
// we may need some padding bytes beforehand
unsigned bytesToZero = startOfData - toOffset;
#ifdef DEBUG
if (bytesToZero > 0) fprintf(stderr, "a->m:outputting %d zero bytes (%d, %d, %d, %d)\n", bytesToZero, startOfData, toOffset, frameOffset, seg->backpointer);
#endif
toOffset += bytesToZero;
}
unsigned char* fromPtr
= &seg->dataStart()[seg->headerSize + seg->sideInfoSize + fromOffset];
unsigned bytesUsedHere = endOfData - startOfData;
#ifdef DEBUG
if (bytesUsedHere > 0) fprintf(stderr, "a->m:outputting %d bytes from %d<-%d\n", bytesUsedHere, seg->aduSize, seg->backpointer);
#endif
memmove(toPtr + toOffset, fromPtr, bytesUsedHere);
toOffset += bytesUsedHere;
frameOffset += seg->dataHere();
index = SegmentQueue::nextIndex(index);
if (index == fSegments->nextFreeIndex()) break;
seg = &(fSegments->s[index]);
}
fSegments->dequeue();
return True;
}
////////// Segment //////////
unsigned Segment::dataHere() {
int result = frameSize - (headerSize + sideInfoSize);
if (result < 0) {
return 0;
}
return (unsigned)result;
}
////////// SegmentQueue //////////
void SegmentQueue::enqueueNewSegment(FramedSource* inputSource,
FramedSource* usingSource) {
if (isFull()) {
usingSource->envir() << "SegmentQueue::enqueueNewSegment() overflow\n";
usingSource->handleClosure();
return;
}
fUsingSource = usingSource;
Segment& seg = nextFreeSegment();
inputSource->getNextFrame(seg.buf, sizeof seg.buf,
sqAfterGettingSegment, this,
FramedSource::handleClosure, usingSource);
}
void SegmentQueue::sqAfterGettingSegment(void* clientData,
unsigned numBytesRead,
unsigned /*numTruncatedBytes*/,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
SegmentQueue* segQueue = (SegmentQueue*)clientData;
Segment& seg = segQueue->nextFreeSegment();
seg.presentationTime = presentationTime;
seg.durationInMicroseconds = durationInMicroseconds;
if (segQueue->sqAfterGettingCommon(seg, numBytesRead)) {
#ifdef DEBUG
char const* direction = segQueue->fDirectionIsToADU ? "m->a" : "a->m";
fprintf(stderr, "%s:read frame %d<-%d, fs:%d, sis:%d, dh:%d, (descriptor size: %d)\n", direction, seg.aduSize, seg.backpointer, seg.frameSize, seg.sideInfoSize, seg.dataHere(), seg.descriptorSize);
#endif
}
// Continue our original calling source where it left off:
segQueue->fUsingSource->doGetNextFrame();
}
// Common code called after a new segment is enqueued
Boolean SegmentQueue::sqAfterGettingCommon(Segment& seg,
unsigned numBytesRead) {
unsigned char* fromPtr = seg.buf;
if (fIncludeADUdescriptors) {
// The newly-read data is assumed to be an ADU with a descriptor
// in front
(void)ADUdescriptor::getRemainingFrameSize(fromPtr);
seg.descriptorSize = (unsigned)(fromPtr-seg.buf);
} else {
seg.descriptorSize = 0;
}
// parse the MP3-specific info in the frame to get the ADU params
unsigned hdr;
MP3SideInfo sideInfo;
if (!GetADUInfoFromMP3Frame(fromPtr, numBytesRead,
hdr, seg.frameSize,
sideInfo, seg.sideInfoSize,
seg.backpointer, seg.aduSize)) {
return False;
}
// If we've just read an ADU (rather than a regular MP3 frame), then use the
// entire "numBytesRead" data for the 'aduSize', so that we include any
// 'ancillary data' that may be present at the end of the ADU:
if (!fDirectionIsToADU) {
unsigned newADUSize
= numBytesRead - seg.descriptorSize - 4/*header size*/ - seg.sideInfoSize;
if (newADUSize > seg.aduSize) seg.aduSize = newADUSize;
}
fTotalDataSize += seg.dataHere();
fNextFreeIndex = nextIndex(fNextFreeIndex);
return True;
}
Boolean SegmentQueue::dequeue() {
if (isEmpty()) {
fUsingSource->envir() << "SegmentQueue::dequeue(): underflow!\n";
return False;
}
Segment& seg = s[headIndex()];
fTotalDataSize -= seg.dataHere();
fHeadIndex = nextIndex(fHeadIndex);
return True;
}
Boolean SegmentQueue::insertDummyBeforeTail(unsigned backpointer) {
if (isEmptyOrFull()) return False;
// Copy the current tail segment to its new position, then modify the
// old tail segment to be a 'dummy' ADU
unsigned newTailIndex = nextFreeIndex();
Segment& newTailSeg = s[newTailIndex];
unsigned oldTailIndex = prevIndex(newTailIndex);
Segment& oldTailSeg = s[oldTailIndex];
newTailSeg = oldTailSeg; // structure copy
// Begin by setting (replacing) the ADU descriptor of the dummy ADU:
unsigned char* ptr = oldTailSeg.buf;
if (fIncludeADUdescriptors) {
unsigned remainingFrameSize
= oldTailSeg.headerSize + oldTailSeg.sideInfoSize + 0 /* 0-size ADU */;
unsigned currentDescriptorSize = oldTailSeg.descriptorSize;
if (currentDescriptorSize == 2) {
ADUdescriptor::generateTwoByteDescriptor(ptr, remainingFrameSize);
} else {
(void)ADUdescriptor::generateDescriptor(ptr, remainingFrameSize);
}
}
// Then zero out the side info of the dummy frame:
if (!ZeroOutMP3SideInfo(ptr, oldTailSeg.frameSize,
backpointer)) return False;
unsigned dummyNumBytesRead
= oldTailSeg.descriptorSize + 4/*header size*/ + oldTailSeg.sideInfoSize;
return sqAfterGettingCommon(oldTailSeg, dummyNumBytesRead);
}
live/liveMedia/MatroskaFileServerMediaSubsession.hh
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A 'ServerMediaSubsession' object that creates new, unicast, "RTPSink"s
// on demand, from a track within a Matroska file.
// C++ header
#ifndef _MATROSKA_FILE_SERVER_MEDIA_SUBSESSION_HH
#define _MATROSKA_FILE_SERVER_MEDIA_SUBSESSION_HH
#ifndef _FILE_SERVER_MEDIA_SUBSESSION_HH
#include "FileServerMediaSubsession.hh"
#endif
#ifndef _MATROSKA_FILE_SERVER_DEMUX_HH
#include "MatroskaFileServerDemux.hh"
#endif
class MatroskaFileServerMediaSubsession: public FileServerMediaSubsession {
public:
static MatroskaFileServerMediaSubsession*
createNew(MatroskaFileServerDemux& demux, MatroskaTrack* track);
protected:
MatroskaFileServerMediaSubsession(MatroskaFileServerDemux& demux, MatroskaTrack* track);
// called only by createNew(), or by subclass constructors
virtual ~MatroskaFileServerMediaSubsession();
protected: // redefined virtual functions
virtual float duration() const;
virtual void seekStreamSource(FramedSource* inputSource, double& seekNPT, double streamDuration, u_int64_t& numBytes);
virtual FramedSource* createNewStreamSource(unsigned clientSessionId,
unsigned& estBitrate);
virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource);
protected:
MatroskaFileServerDemux& fOurDemux;
MatroskaTrack* fTrack;
unsigned fNumFiltersInFrontOfTrack;
};
#endif
live/liveMedia/MediaSink.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// Media Sinks
// Implementation
#include "MediaSink.hh"
#include "GroupsockHelper.hh"
#include <string.h>
////////// MediaSink //////////
MediaSink::MediaSink(UsageEnvironment& env)
: Medium(env), fSource(NULL) {
}
MediaSink::~MediaSink() {
stopPlaying();
}
Boolean MediaSink::isSink() const {
return True;
}
Boolean MediaSink::lookupByName(UsageEnvironment& env, char const* sinkName,
MediaSink*& resultSink) {
resultSink = NULL; // unless we succeed
Medium* medium;
if (!Medium::lookupByName(env, sinkName, medium)) return False;
if (!medium->isSink()) {
env.setResultMsg(sinkName, " is not a media sink");
return False;
}
resultSink = (MediaSink*)medium;
return True;
}
Boolean MediaSink::sourceIsCompatibleWithUs(MediaSource& source) {
// We currently support only framed sources.
return source.isFramedSource();
}
Boolean MediaSink::startPlaying(MediaSource& source,
afterPlayingFunc* afterFunc,
void* afterClientData) {
// Make sure we're not already being played:
if (fSource != NULL) {
envir().setResultMsg("This sink is already being played");
return False;
}
// Make sure our source is compatible:
if (!sourceIsCompatibleWithUs(source)) {
envir().setResultMsg("MediaSink::startPlaying(): source is not compatible!");
return False;
}
fSource = (FramedSource*)&source;
fAfterFunc = afterFunc;
fAfterClientData = afterClientData;
return continuePlaying();
}
void MediaSink::stopPlaying() {
// First, tell the source that we're no longer interested:
if (fSource != NULL) fSource->stopGettingFrames();
// Cancel any pending tasks:
envir().taskScheduler().unscheduleDelayedTask(nextTask());
fSource = NULL; // indicates that we can be played again
fAfterFunc = NULL;
}
void MediaSink::onSourceClosure(void* clientData) {
MediaSink* sink = (MediaSink*)clientData;
sink->onSourceClosure();
}
void MediaSink::onSourceClosure() {
// Cancel any pending tasks:
envir().taskScheduler().unscheduleDelayedTask(nextTask());
fSource = NULL; // indicates that we can be played again
if (fAfterFunc != NULL) {
(*fAfterFunc)(fAfterClientData);
}
}
Boolean MediaSink::isRTPSink() const {
return False; // default implementation
}
////////// OutPacketBuffer //////////
unsigned OutPacketBuffer::maxSize = 60000; // by default
OutPacketBuffer
::OutPacketBuffer(unsigned preferredPacketSize, unsigned maxPacketSize, unsigned maxBufferSize)
: fPreferred(preferredPacketSize), fMax(maxPacketSize),
fOverflowDataSize(0) {
if (maxBufferSize == 0) maxBufferSize = maxSize;
unsigned maxNumPackets = (maxBufferSize + (maxPacketSize-1))/maxPacketSize;
fLimit = maxNumPackets*maxPacketSize;
fBuf = new unsigned char[fLimit];
resetPacketStart();
resetOffset();
resetOverflowData();
}
OutPacketBuffer::~OutPacketBuffer() {
delete[] fBuf;
}
void OutPacketBuffer::enqueue(unsigned char const* from, unsigned numBytes) {
if (numBytes > totalBytesAvailable()) {
#ifdef DEBUG
fprintf(stderr, "OutPacketBuffer::enqueue() warning: %d > %d\n", numBytes, totalBytesAvailable());
#endif
numBytes = totalBytesAvailable();
}
if (curPtr() != from) memmove(curPtr(), from, numBytes);
increment(numBytes);
}
void OutPacketBuffer::enqueueWord(u_int32_t word) {
u_int32_t nWord = htonl(word);
enqueue((unsigned char*)&nWord, 4);
}
void OutPacketBuffer::insert(unsigned char const* from, unsigned numBytes,
unsigned toPosition) {
unsigned realToPosition = fPacketStart + toPosition;
if (realToPosition + numBytes > fLimit) {
if (realToPosition > fLimit) return; // we can't do this
numBytes = fLimit - realToPosition;
}
memmove(&fBuf[realToPosition], from, numBytes);
if (toPosition + numBytes > fCurOffset) {
fCurOffset = toPosition + numBytes;
}
}
void OutPacketBuffer::insertWord(u_int32_t word, unsigned toPosition) {
u_int32_t nWord = htonl(word);
insert((unsigned char*)&nWord, 4, toPosition);
}
void OutPacketBuffer::extract(unsigned char* to, unsigned numBytes,
unsigned fromPosition) {
unsigned realFromPosition = fPacketStart + fromPosition;
if (realFromPosition + numBytes > fLimit) { // sanity check
if (realFromPosition > fLimit) return; // we can't do this
numBytes = fLimit - realFromPosition;
}
memmove(to, &fBuf[realFromPosition], numBytes);
}
u_int32_t OutPacketBuffer::extractWord(unsigned fromPosition) {
u_int32_t nWord;
extract((unsigned char*)&nWord, 4, fromPosition);
return ntohl(nWord);
}
void OutPacketBuffer::skipBytes(unsigned numBytes) {
if (numBytes > totalBytesAvailable()) {
numBytes = totalBytesAvailable();
}
increment(numBytes);
}
void OutPacketBuffer
::setOverflowData(unsigned overflowDataOffset,
unsigned overflowDataSize,
struct timeval const& presentationTime,
unsigned durationInMicroseconds) {
fOverflowDataOffset = overflowDataOffset;
fOverflowDataSize = overflowDataSize;
fOverflowPresentationTime = presentationTime;
fOverflowDurationInMicroseconds = durationInMicroseconds;
}
void OutPacketBuffer::useOverflowData() {
enqueue(&fBuf[fPacketStart + fOverflowDataOffset], fOverflowDataSize);
fCurOffset -= fOverflowDataSize; // undoes increment performed by "enqueue"
resetOverflowData();
}
void OutPacketBuffer::adjustPacketStart(unsigned numBytes) {
fPacketStart += numBytes;
if (fOverflowDataOffset >= numBytes) {
fOverflowDataOffset -= numBytes;
} else {
fOverflowDataOffset = 0;
fOverflowDataSize = 0; // an error otherwise
}
}
void OutPacketBuffer::resetPacketStart() {
if (fOverflowDataSize > 0) {
fOverflowDataOffset += fPacketStart;
}
fPacketStart = 0;
}
live/liveMedia/MediaSource.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// Media Sources
// Implementation
#include "MediaSource.hh"
////////// MediaSource //////////
MediaSource::MediaSource(UsageEnvironment& env)
: Medium(env) {
}
MediaSource::~MediaSource() {
}
Boolean MediaSource::isSource() const {
return True;
}
char const* MediaSource::MIMEtype() const {
return "application/OCTET-STREAM"; // default type
}
Boolean MediaSource::isFramedSource() const {
return False; // default implementation
}
Boolean MediaSource::isRTPSource() const {
return False; // default implementation
}
Boolean MediaSource::isMPEG1or2VideoStreamFramer() const {
return False; // default implementation
}
Boolean MediaSource::isMPEG4VideoStreamFramer() const {
return False; // default implementation
}
Boolean MediaSource::isH264VideoStreamFramer() const {
return False; // default implementation
}
Boolean MediaSource::isH265VideoStreamFramer() const {
return False; // default implementation
}
Boolean MediaSource::isDVVideoStreamFramer() const {
return False; // default implementation
}
Boolean MediaSource::isJPEGVideoSource() const {
return False; // default implementation
}
Boolean MediaSource::isAMRAudioSource() const {
return False; // default implementation
}
Boolean MediaSource::lookupByName(UsageEnvironment& env,
char const* sourceName,
MediaSource*& resultSource) {
resultSource = NULL; // unless we succeed
Medium* medium;
if (!Medium::lookupByName(env, sourceName, medium)) return False;
if (!medium->isSource()) {
env.setResultMsg(sourceName, " is not a media source");
return False;
}
resultSource = (MediaSource*)medium;
return True;
}
void MediaSource::getAttributes() const {
// Default implementation
envir().setResultMsg("");
}
live/liveMedia/MP3ADUdescriptor.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// Descriptor preceding frames of 'ADU' MP3 streams (for improved loss-tolerance)
// Implementation
#include "MP3ADUdescriptor.hh"
////////// ADUdescriptor //////////
//##### NOTE: For now, ignore fragmentation. Fix this later! #####
#define TWO_BYTE_DESCR_FLAG 0x40
unsigned ADUdescriptor::generateDescriptor(unsigned char*& toPtr,
unsigned remainingFrameSize) {
unsigned descriptorSize = ADUdescriptor::computeSize(remainingFrameSize);
switch (descriptorSize) {
case 1: {
*toPtr++ = (unsigned char)remainingFrameSize;
break;
}
case 2: {
generateTwoByteDescriptor(toPtr, remainingFrameSize);
break;
}
}
return descriptorSize;
}
void ADUdescriptor::generateTwoByteDescriptor(unsigned char*& toPtr,
unsigned remainingFrameSize) {
*toPtr++ = (TWO_BYTE_DESCR_FLAG|(unsigned char)(remainingFrameSize>>8));
*toPtr++ = (unsigned char)(remainingFrameSize&0xFF);
}
unsigned ADUdescriptor::getRemainingFrameSize(unsigned char*& fromPtr) {
unsigned char firstByte = *fromPtr++;
if (firstByte&TWO_BYTE_DESCR_FLAG) {
// This is a 2-byte descriptor
unsigned char secondByte = *fromPtr++;
return ((firstByte&0x3F)<<8) | secondByte;
} else {
// This is a 1-byte descriptor
return (firstByte&0x3F);
}
}
live/liveMedia/MP3ADUdescriptor.hh
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// Descriptor preceding frames of 'ADU' MP3 streams (for improved loss-tolerance)
// C++ header
#ifndef _MP3_ADU_DESCRIPTOR_HH
#define _MP3_ADU_DESCRIPTOR_HH
// A class for handling the descriptor that begins each ADU frame:
// (Note: We don't yet implement fragmentation)
class ADUdescriptor {
public:
// Operations for generating a new descriptor
static unsigned computeSize(unsigned remainingFrameSize) {
return remainingFrameSize >= 64 ? 2 : 1;
}
static unsigned generateDescriptor(unsigned char*& toPtr, unsigned remainingFrameSize);
// returns descriptor size; increments "toPtr" afterwards
static void generateTwoByteDescriptor(unsigned char*& toPtr, unsigned remainingFrameSize);
// always generates a 2-byte descriptor, even if "remainingFrameSize" is
// small enough for a 1-byte descriptor
// Operations for reading a descriptor
static unsigned getRemainingFrameSize(unsigned char*& fromPtr);
// increments "fromPtr" afterwards
};
#endif
live/liveMedia/MP3ADUinterleaving.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// Interleaving of MP3 ADUs
// Implementation
#include "MP3ADUinterleaving.hh"
#include "MP3ADUdescriptor.hh"
#include <string.h>
#ifdef TEST_LOSS
#include "GroupsockHelper.hh"
#endif
////////// Interleaving //////////
Interleaving::Interleaving(unsigned cycleSize,
unsigned char const* cycleArray)
: fCycleSize(cycleSize) {
for (unsigned i = 0; i < fCycleSize; ++i) {
fInverseCycle[cycleArray[i]] = i;
}
}
Interleaving::~Interleaving() {
}
////////// MP3ADUinterleaverBase //////////
MP3ADUinterleaverBase::MP3ADUinterleaverBase(UsageEnvironment& env,
FramedSource* inputSource)
: FramedFilter(env, inputSource) {
}
MP3ADUinterleaverBase::~MP3ADUinterleaverBase() {
}
FramedSource* MP3ADUinterleaverBase::getInputSource(UsageEnvironment& env,
char const* inputSourceName) {
FramedSource* inputSource;
if (!FramedSource::lookupByName(env, inputSourceName, inputSource))
return NULL;
if (strcmp(inputSource->MIMEtype(), "audio/MPA-ROBUST") != 0) {
env.setResultMsg(inputSourceName, " is not an MP3 ADU source");
return NULL;
}
return inputSource;
}
void MP3ADUinterleaverBase::afterGettingFrame(void* clientData,
unsigned numBytesRead,
unsigned /*numTruncatedBytes*/,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
MP3ADUinterleaverBase* interleaverBase = (MP3ADUinterleaverBase*)clientData;
// Finish up after reading:
interleaverBase->afterGettingFrame(numBytesRead,
presentationTime, durationInMicroseconds);
// Then, continue to deliver an outgoing frame:
interleaverBase->doGetNextFrame();
}
////////// InterleavingFrames (definition) //////////
class InterleavingFrames {
public:
InterleavingFrames(unsigned maxCycleSize);
virtual ~InterleavingFrames();
Boolean haveReleaseableFrame();
void getIncomingFrameParams(unsigned char index,
unsigned char*& dataPtr,
unsigned& bytesAvailable);
void getReleasingFrameParams(unsigned char index,
unsigned char*& dataPtr,
unsigned& bytesInUse,
struct timeval& presentationTime,
unsigned& durationInMicroseconds);
void setFrameParams(unsigned char index,
unsigned char icc, unsigned char ii,
unsigned frameSize, struct timeval presentationTime,
unsigned durationInMicroseconds);
unsigned nextIndexToRelease() {return fNextIndexToRelease;}
void releaseNext();
private:
unsigned fMaxCycleSize;
unsigned fNextIndexToRelease;
class InterleavingFrameDescriptor* fDescriptors;
};
////////// MP3ADUinterleaver //////////
MP3ADUinterleaver::MP3ADUinterleaver(UsageEnvironment& env,
Interleaving const& interleaving,
FramedSource* inputSource)
: MP3ADUinterleaverBase(env, inputSource),
fInterleaving(interleaving),
fFrames(new InterleavingFrames(interleaving.cycleSize())),
fII(0), fICC(0) {
}
MP3ADUinterleaver::~MP3ADUinterleaver() {
delete fFrames;
}
MP3ADUinterleaver* MP3ADUinterleaver::createNew(UsageEnvironment& env,
Interleaving const& interleaving,
FramedSource* inputSource) {
return new MP3ADUinterleaver(env, interleaving, inputSource);
}
void MP3ADUinterleaver::doGetNextFrame() {
// If there's a frame immediately available, deliver it, otherwise get new
// frames from the source until one's available:
if (fFrames->haveReleaseableFrame()) {
releaseOutgoingFrame();
// Call our own 'after getting' function. Because we're not a 'leaf'
// source, we can call this directly, without risking infinite recursion.
afterGetting(this);
} else {
fPositionOfNextIncomingFrame = fInterleaving.lookupInverseCycle(fII);
unsigned char* dataPtr;
unsigned bytesAvailable;
fFrames->getIncomingFrameParams(fPositionOfNextIncomingFrame,
dataPtr, bytesAvailable);
// Read the next incoming frame (asynchronously)
fInputSource->getNextFrame(dataPtr, bytesAvailable,
&MP3ADUinterleaverBase::afterGettingFrame, this,
handleClosure, this);
}
}
void MP3ADUinterleaver::releaseOutgoingFrame() {
unsigned char* fromPtr;
fFrames->getReleasingFrameParams(fFrames->nextIndexToRelease(),
fromPtr, fFrameSize,
fPresentationTime, fDurationInMicroseconds);
if (fFrameSize > fMaxSize) {
fNumTruncatedBytes = fFrameSize - fMaxSize;
fFrameSize = fMaxSize;
}
memmove(fTo, fromPtr, fFrameSize);
fFrames->releaseNext();
}
void MP3ADUinterleaver::afterGettingFrame(unsigned numBytesRead,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
// Set the (icc,ii) and frame size of the newly-read frame:
fFrames->setFrameParams(fPositionOfNextIncomingFrame,
fICC, fII, numBytesRead,
presentationTime, durationInMicroseconds);
// Prepare our counters for the next frame:
if (++fII == fInterleaving.cycleSize()) {
fII = 0;
fICC = (fICC+1)%8;
}
}
////////// DeinterleavingFrames (definition) //////////
class DeinterleavingFrames {
public:
DeinterleavingFrames();
virtual ~DeinterleavingFrames();
Boolean haveReleaseableFrame();
void getIncomingFrameParams(unsigned char*& dataPtr,
unsigned& bytesAvailable);
void getIncomingFrameParamsAfter(unsigned frameSize,
struct timeval presentationTime,
unsigned durationInMicroseconds,
unsigned char& icc, unsigned char& ii);
void getReleasingFrameParams(unsigned char*& dataPtr,
unsigned& bytesInUse,
struct timeval& presentationTime,
unsigned& durationInMicroseconds);
void moveIncomingFrameIntoPlace();
void releaseNext();
void startNewCycle();
private:
unsigned fNextIndexToRelease;
Boolean fHaveEndedCycle;
unsigned fIIlastSeen;
unsigned fMinIndexSeen, fMaxIndexSeen; // actually, max+1
class DeinterleavingFrameDescriptor* fDescriptors;
};
////////// MP3ADUdeinterleaver //////////
MP3ADUdeinterleaver::MP3ADUdeinterleaver(UsageEnvironment& env,
FramedSource* inputSource)
: MP3ADUinterleaverBase(env, inputSource),
fFrames(new DeinterleavingFrames),
fIIlastSeen(~0), fICClastSeen(~0) {
}
MP3ADUdeinterleaver::~MP3ADUdeinterleaver() {
delete fFrames;
}
MP3ADUdeinterleaver* MP3ADUdeinterleaver::createNew(UsageEnvironment& env,
FramedSource* inputSource) {
return new MP3ADUdeinterleaver(env, inputSource);
}
void MP3ADUdeinterleaver::doGetNextFrame() {
// If there's a frame immediately available, deliver it, otherwise get new
// frames from the source until one's available:
if (fFrames->haveReleaseableFrame()) {
releaseOutgoingFrame();
// Call our own 'after getting' function. Because we're not a 'leaf'
// source, we can call this directly, without risking infinite recursion.
afterGetting(this);
} else {
#ifdef TEST_LOSS
NOTE: This code no longer works, because it uses synchronous reads,
which are no longer supported.
static unsigned const framesPerPacket = 3;
static unsigned const frameCount = 0;
static Boolean packetIsLost;
while (1) {
unsigned packetCount = frameCount/framesPerPacket;
if ((frameCount++)%framesPerPacket == 0) {
packetIsLost = (our_random()%10 == 0); // simulate 10% packet loss #####
}
if (packetIsLost) {
// Read and discard the next input frame (that would be part of
// a lost packet):
unsigned char dummyBuf[2000];
unsigned numBytesRead;
struct timeval presentationTime;
// (this works only if the source can be read synchronously)
fInputSource->syncGetNextFrame(dummyBuf, sizeof dummyBuf,
numBytesRead, presentationTime);
} else {
break; // from while (1)
}
}
#endif
unsigned char* dataPtr;
unsigned bytesAvailable;
fFrames->getIncomingFrameParams(dataPtr, bytesAvailable);
// Read the next incoming frame (asynchronously)
fInputSource->getNextFrame(dataPtr, bytesAvailable,
&MP3ADUinterleaverBase::afterGettingFrame, this,
handleClosure, this);
}
}
void MP3ADUdeinterleaver::afterGettingFrame(unsigned numBytesRead,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
// Get the (icc,ii) and set the frame size of the newly-read frame:
unsigned char icc, ii;
fFrames->getIncomingFrameParamsAfter(numBytesRead,
presentationTime, durationInMicroseconds,
icc, ii);
// Compare these to the values we saw last:
if (icc != fICClastSeen || ii == fIIlastSeen) {
// We've started a new interleave cycle
// (or interleaving was not used). Release all
// pending ADU frames to the ADU->MP3 conversion step:
fFrames->startNewCycle();
} else {
// We're still in the same cycle as before.
// Move the newly-read frame into place, so it can be used:
fFrames->moveIncomingFrameIntoPlace();
}
fICClastSeen = icc;
fIIlastSeen = ii;
}
void MP3ADUdeinterleaver::releaseOutgoingFrame() {
unsigned char* fromPtr;
fFrames->getReleasingFrameParams(fromPtr, fFrameSize,
fPresentationTime, fDurationInMicroseconds);
if (fFrameSize > fMaxSize) {
fNumTruncatedBytes = fFrameSize - fMaxSize;
fFrameSize = fMaxSize;
}
memmove(fTo, fromPtr, fFrameSize);
fFrames->releaseNext();
}
////////// InterleavingFrames (implementation) //////////
#define MAX_FRAME_SIZE 2000 /* conservatively high */
class InterleavingFrameDescriptor {
public:
InterleavingFrameDescriptor() {frameDataSize = 0;}
unsigned frameDataSize; // includes ADU descriptor and (modified) MPEG hdr
struct timeval presentationTime;
unsigned durationInMicroseconds;
unsigned char frameData[MAX_FRAME_SIZE]; // ditto
};
InterleavingFrames::InterleavingFrames(unsigned maxCycleSize)
: fMaxCycleSize(maxCycleSize), fNextIndexToRelease(0),
fDescriptors(new InterleavingFrameDescriptor[maxCycleSize]) {
}
InterleavingFrames::~InterleavingFrames() {
delete[] fDescriptors;
}
Boolean InterleavingFrames::haveReleaseableFrame() {
return fDescriptors[fNextIndexToRelease].frameDataSize > 0;
}
void InterleavingFrames::getIncomingFrameParams(unsigned char index,
unsigned char*& dataPtr,
unsigned& bytesAvailable) {
InterleavingFrameDescriptor& desc = fDescriptors[index];
dataPtr = &desc.frameData[0];
bytesAvailable = MAX_FRAME_SIZE;
}
void InterleavingFrames::getReleasingFrameParams(unsigned char index,
unsigned char*& dataPtr,
unsigned& bytesInUse,
struct timeval& presentationTime,
unsigned& durationInMicroseconds) {
InterleavingFrameDescriptor& desc = fDescriptors[index];
dataPtr = &desc.frameData[0];
bytesInUse = desc.frameDataSize;
presentationTime = desc.presentationTime;
durationInMicroseconds = desc.durationInMicroseconds;
}
void InterleavingFrames::setFrameParams(unsigned char index,
unsigned char icc,
unsigned char ii,
unsigned frameSize,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
InterleavingFrameDescriptor& desc = fDescriptors[index];
desc.frameDataSize = frameSize;
desc.presentationTime = presentationTime;
desc.durationInMicroseconds = durationInMicroseconds;
// Advance over the ADU descriptor, to get to the MPEG 'syncword':
unsigned char* ptr = &desc.frameData[0];
(void)ADUdescriptor::getRemainingFrameSize(ptr);
// Replace the next 11 bits with (ii,icc):
*ptr++ = ii;
*ptr &=~ 0xE0;
*ptr |= (icc<<5);
}
void InterleavingFrames::releaseNext() {
fDescriptors[fNextIndexToRelease].frameDataSize = 0;
fNextIndexToRelease = (fNextIndexToRelease+1)%fMaxCycleSize;
}
////////// DeinterleavingFrames (implementation) //////////
class DeinterleavingFrameDescriptor {
public:
DeinterleavingFrameDescriptor() {frameDataSize = 0; frameData = NULL;}
virtual ~DeinterleavingFrameDescriptor() {delete[] frameData;}
unsigned frameDataSize; // includes ADU descriptor and (modified) MPEG hdr
struct timeval presentationTime;
unsigned durationInMicroseconds;
unsigned char* frameData;
};
DeinterleavingFrames::DeinterleavingFrames()
: fNextIndexToRelease(0), fHaveEndedCycle(False),
fMinIndexSeen(MAX_CYCLE_SIZE), fMaxIndexSeen(0),
fDescriptors(new DeinterleavingFrameDescriptor[MAX_CYCLE_SIZE+1]) {
}
DeinterleavingFrames::~DeinterleavingFrames() {
delete[] fDescriptors;
}
Boolean DeinterleavingFrames::haveReleaseableFrame() {
if (!fHaveEndedCycle) {
// Check just the next frame in the sequence
return fDescriptors[fNextIndexToRelease].frameDataSize > 0;
} else {
// We've just ended a cycle, so we can skip over frames that didn't
// get filled in (due to packet loss):
if (fNextIndexToRelease < fMinIndexSeen) {
fNextIndexToRelease = fMinIndexSeen;
}
while (fNextIndexToRelease < fMaxIndexSeen
&& fDescriptors[fNextIndexToRelease].frameDataSize == 0) {
++fNextIndexToRelease;
}
if (fNextIndexToRelease >= fMaxIndexSeen) {
// No more frames are available from the cycle that we just ended, so
// clear out all previously stored frames, then make available
// the last-read frame, and return false for now:
for (unsigned i = fMinIndexSeen; i < fMaxIndexSeen; ++i) {
fDescriptors[i].frameDataSize = 0;
}
fMinIndexSeen = MAX_CYCLE_SIZE; fMaxIndexSeen = 0;
moveIncomingFrameIntoPlace();
fHaveEndedCycle = False;
fNextIndexToRelease = 0;
return False;
}
return True;
}
}
void DeinterleavingFrames::getIncomingFrameParams(unsigned char*& dataPtr,
unsigned& bytesAvailable) {
// Use fDescriptors[MAX_CYCLE_SIZE] to store the incoming frame,
// prior to figuring out its real position:
DeinterleavingFrameDescriptor& desc = fDescriptors[MAX_CYCLE_SIZE];
if (desc.frameData == NULL) {
// There's no buffer yet, so allocate a new one:
desc.frameData = new unsigned char[MAX_FRAME_SIZE];
}
dataPtr = desc.frameData;
bytesAvailable = MAX_FRAME_SIZE;
}
void DeinterleavingFrames
::getIncomingFrameParamsAfter(unsigned frameSize,
struct timeval presentationTime,
unsigned durationInMicroseconds,
unsigned char& icc, unsigned char& ii) {
DeinterleavingFrameDescriptor& desc = fDescriptors[MAX_CYCLE_SIZE];
desc.frameDataSize = frameSize;
desc.presentationTime = presentationTime;
desc.durationInMicroseconds = durationInMicroseconds;
// Advance over the ADU descriptor, to get to the MPEG 'syncword':
unsigned char* ptr = desc.frameData;
(void)ADUdescriptor::getRemainingFrameSize(ptr);
// Read the next 11 bits into (ii,icc), and replace them with all-1s:
fIIlastSeen = ii = *ptr; *ptr++ = 0xFF;
icc = (*ptr&0xE0)>>5; *ptr |= 0xE0;
}
void DeinterleavingFrames::getReleasingFrameParams(unsigned char*& dataPtr,
unsigned& bytesInUse,
struct timeval& presentationTime,
unsigned& durationInMicroseconds) {
DeinterleavingFrameDescriptor& desc = fDescriptors[fNextIndexToRelease];
dataPtr = desc.frameData;
bytesInUse = desc.frameDataSize;
presentationTime = desc.presentationTime;
durationInMicroseconds = desc.durationInMicroseconds;
}
void DeinterleavingFrames::moveIncomingFrameIntoPlace() {
DeinterleavingFrameDescriptor& fromDesc = fDescriptors[MAX_CYCLE_SIZE];
DeinterleavingFrameDescriptor& toDesc = fDescriptors[fIIlastSeen];
toDesc.frameDataSize = fromDesc.frameDataSize;
toDesc.presentationTime = fromDesc.presentationTime;
// Move the data pointer into place by swapping the data pointers:
unsigned char* tmp = toDesc.frameData;
toDesc.frameData = fromDesc.frameData;
fromDesc.frameData = tmp;
if (fIIlastSeen < fMinIndexSeen) {
fMinIndexSeen = fIIlastSeen;
}
if (fIIlastSeen + 1 > fMaxIndexSeen) {
fMaxIndexSeen = fIIlastSeen + 1;
}
}
void DeinterleavingFrames::releaseNext() {
fDescriptors[fNextIndexToRelease].frameDataSize = 0;
fNextIndexToRelease = (fNextIndexToRelease+1)%MAX_CYCLE_SIZE;
}
void DeinterleavingFrames::startNewCycle() {
fHaveEndedCycle = True;
}
live/liveMedia/MP3ADURTPSink.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// RTP sink for 'ADUized' MP3 frames ("mpa-robust")
// Implementation
#include "MP3ADURTPSink.hh"
MP3ADURTPSink::MP3ADURTPSink(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char RTPPayloadType)
: AudioRTPSink(env, RTPgs, RTPPayloadType, 90000, "MPA-ROBUST") {
}
MP3ADURTPSink::~MP3ADURTPSink() {
}
MP3ADURTPSink*
MP3ADURTPSink::createNew(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char RTPPayloadType) {
return new MP3ADURTPSink(env, RTPgs, RTPPayloadType);
}
static void badDataSize(UsageEnvironment& env, unsigned numBytesInFrame) {
env << "MP3ADURTPSink::doSpecialFrameHandling(): invalid size ("
<< numBytesInFrame << ") of non-fragmented input ADU!\n";
}
void MP3ADURTPSink::doSpecialFrameHandling(unsigned fragmentationOffset,
unsigned char* frameStart,
unsigned numBytesInFrame,
struct timeval framePresentationTime,
unsigned numRemainingBytes) {
// If this is the first (or only) fragment of an ADU, then
// check the "ADU descriptor" (that should be at the front) for validity:
if (fragmentationOffset == 0) {
unsigned aduDescriptorSize;
if (numBytesInFrame < 1) {
badDataSize(envir(), numBytesInFrame);
return;
}
if (frameStart[0]&0x40) {
// We have a 2-byte ADU descriptor
aduDescriptorSize = 2;
if (numBytesInFrame < 2) {
badDataSize(envir(), numBytesInFrame);
return;
}
fCurADUSize = ((frameStart[0]&~0xC0)<<8) | frameStart[1];
} else {
// We have a 1-byte ADU descriptor
aduDescriptorSize = 1;
fCurADUSize = frameStart[0]&~0x80;
}
if (frameStart[0]&0x80) {
envir() << "Unexpected \"C\" bit seen on non-fragment input ADU!\n";
return;
}
// Now, check whether the ADU size in the ADU descriptor is consistent
// with the total data size of (all fragments of) the input frame:
unsigned expectedADUSize =
fragmentationOffset + numBytesInFrame + numRemainingBytes
- aduDescriptorSize;
if (fCurADUSize != expectedADUSize) {
envir() << "MP3ADURTPSink::doSpecialFrameHandling(): Warning: Input ADU size "
<< expectedADUSize << " (=" << fragmentationOffset
<< "+" << numBytesInFrame << "+" << numRemainingBytes
<< "-" << aduDescriptorSize
<< ") did not match the value (" << fCurADUSize
<< ") in the ADU descriptor!\n";
fCurADUSize = expectedADUSize;
}
} else {
// This is the second (or subsequent) fragment.
// Insert a new ADU descriptor:
unsigned char aduDescriptor[2];
aduDescriptor[0] = 0xC0|(fCurADUSize>>8);
aduDescriptor[1] = fCurADUSize&0xFF;
setSpecialHeaderBytes(aduDescriptor, 2);
}
// Important: Also call our base class's doSpecialFrameHandling(),
// to set the packet's timestamp:
MultiFramedRTPSink::doSpecialFrameHandling(fragmentationOffset,
frameStart, numBytesInFrame,
framePresentationTime,
numRemainingBytes);
}
unsigned MP3ADURTPSink::specialHeaderSize() const {
// Normally there's no special header.
// (The "ADU descriptor" is already present in the data.)
unsigned specialHeaderSize = 0;
// However, if we're about to output the second (or subsequent) fragment
// of a fragmented ADU, then we need to insert a new ADU descriptor at
// the front of the packet:
if (curFragmentationOffset() > 0) {
specialHeaderSize = 2;
}
return specialHeaderSize;
}
live/liveMedia/MP3ADURTPSource.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// RTP source for 'ADUized' MP3 frames ("mpa-robust")
// Implementation
#include "MP3ADURTPSource.hh"
#include "MP3ADUdescriptor.hh"
////////// ADUBufferedPacket and ADUBufferedPacketFactory //////////
class ADUBufferedPacket: public BufferedPacket {
private: // redefined virtual functions
virtual unsigned nextEnclosedFrameSize(unsigned char*& framePtr,
unsigned dataSize);
};
class ADUBufferedPacketFactory: public BufferedPacketFactory {
private: // redefined virtual functions
virtual BufferedPacket* createNewPacket(MultiFramedRTPSource* ourSource);
};
///////// MP3ADURTPSource implementation ////////
MP3ADURTPSource*
MP3ADURTPSource::createNew(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency) {
return new MP3ADURTPSource(env, RTPgs, rtpPayloadFormat,
rtpTimestampFrequency);
}
MP3ADURTPSource::MP3ADURTPSource(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency)
: MultiFramedRTPSource(env, RTPgs,
rtpPayloadFormat, rtpTimestampFrequency,
new ADUBufferedPacketFactory) {
}
MP3ADURTPSource::~MP3ADURTPSource() {
}
char const* MP3ADURTPSource::MIMEtype() const {
return "audio/MPA-ROBUST";
}
////////// ADUBufferedPacket and ADUBufferedPacketFactory implementation //////////
unsigned ADUBufferedPacket
::nextEnclosedFrameSize(unsigned char*& framePtr, unsigned dataSize) {
// Return the size of the next MP3 'ADU', on the assumption that
// the input data is ADU-encoded MP3 frames.
unsigned char* frameDataPtr = framePtr;
unsigned remainingFrameSize
= ADUdescriptor::getRemainingFrameSize(frameDataPtr);
unsigned descriptorSize = (unsigned)(frameDataPtr - framePtr);
unsigned fullADUSize = descriptorSize + remainingFrameSize;
return (fullADUSize <= dataSize) ? fullADUSize : dataSize;
}
BufferedPacket* ADUBufferedPacketFactory
::createNewPacket(MultiFramedRTPSource* /*ourSource*/) {
return new ADUBufferedPacket;
}
live/liveMedia/MP3ADUTranscoder.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// Transcoder for ADUized MP3 frames
// Implementation
#include "MP3ADUTranscoder.hh"
#include "MP3Internals.hh"
#include <string.h> // for strcmp()
#include <stdio.h> // for sprintf()
MP3ADUTranscoder::MP3ADUTranscoder(UsageEnvironment& env,
unsigned outBitrate /* in kbps */,
FramedSource* inputSource)
: FramedFilter(env, inputSource),
fOutBitrate(outBitrate),
fAvailableBytesForBackpointer(0),
fOrigADU(new unsigned char[MAX_MP3_FRAME_SIZE]) {
}
MP3ADUTranscoder::~MP3ADUTranscoder() {
delete[] fOrigADU;
}
MP3ADUTranscoder* MP3ADUTranscoder::createNew(UsageEnvironment& env,
unsigned outBitrate /* in kbps */,
FramedSource* inputSource) {
// The source must be an MP3 ADU source:
if (strcmp(inputSource->MIMEtype(), "audio/MPA-ROBUST") != 0) {
env.setResultMsg(inputSource->name(), " is not an MP3 ADU source");
return NULL;
}
return new MP3ADUTranscoder(env, outBitrate, inputSource);
}
void MP3ADUTranscoder::getAttributes() const {
// Begin by getting the attributes from our input source:
fInputSource->getAttributes();
// Then modify them by appending the corrected bandwidth
char buffer[30];
sprintf(buffer, " bandwidth %d", outBitrate());
envir().appendToResultMsg(buffer);
}
void MP3ADUTranscoder::doGetNextFrame() {
fInputSource->getNextFrame(fOrigADU, MAX_MP3_FRAME_SIZE,
afterGettingFrame, this, handleClosure, this);
}
void MP3ADUTranscoder::afterGettingFrame(void* clientData,
unsigned numBytesRead,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
MP3ADUTranscoder* transcoder = (MP3ADUTranscoder*)clientData;
transcoder->afterGettingFrame1(numBytesRead, numTruncatedBytes,
presentationTime, durationInMicroseconds);
}
void MP3ADUTranscoder::afterGettingFrame1(unsigned numBytesRead,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
fNumTruncatedBytes = numTruncatedBytes; // but can we handle this being >0? #####
fPresentationTime = presentationTime;
fDurationInMicroseconds = durationInMicroseconds;
fFrameSize = TranscodeMP3ADU(fOrigADU, numBytesRead, fOutBitrate,
fTo, fMaxSize, fAvailableBytesForBackpointer);
if (fFrameSize == 0) { // internal error - bad ADU data?
handleClosure();
return;
}
// Call our own 'after getting' function. Because we're not a 'leaf'
// source, we can call this directly, without risking infinite recursion.
afterGetting(this);
}
live/liveMedia/MP3AudioFileServerMediaSubsession.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A 'ServerMediaSubsession' object that creates new, unicast, "RTPSink"s
// on demand, from an MP3 audio file.
// (Actually, any MPEG-1 or MPEG-2 audio file should work.)
// Implementation
#include "MP3AudioFileServerMediaSubsession.hh"
#include "MPEG1or2AudioRTPSink.hh"
#include "MP3ADURTPSink.hh"
#include "MP3FileSource.hh"
#include "MP3ADU.hh"
MP3AudioFileServerMediaSubsession* MP3AudioFileServerMediaSubsession
::createNew(UsageEnvironment& env, char const* fileName, Boolean reuseFirstSource,
Boolean generateADUs, Interleaving* interleaving) {
return new MP3AudioFileServerMediaSubsession(env, fileName, reuseFirstSource,
generateADUs, interleaving);
}
MP3AudioFileServerMediaSubsession
::MP3AudioFileServerMediaSubsession(UsageEnvironment& env,
char const* fileName, Boolean reuseFirstSource,
Boolean generateADUs,
Interleaving* interleaving)
: FileServerMediaSubsession(env, fileName, reuseFirstSource),
fGenerateADUs(generateADUs), fInterleaving(interleaving), fFileDuration(0.0) {
}
MP3AudioFileServerMediaSubsession
::~MP3AudioFileServerMediaSubsession() {
delete fInterleaving;
}
FramedSource* MP3AudioFileServerMediaSubsession
::createNewStreamSourceCommon(FramedSource* baseMP3Source, unsigned mp3NumBytes, unsigned& estBitrate) {
FramedSource* streamSource;
do {
streamSource = baseMP3Source; // by default
if (streamSource == NULL) break;
// Use the MP3 file size, plus the duration, to estimate the stream's bitrate:
if (mp3NumBytes > 0 && fFileDuration > 0.0) {
estBitrate = (unsigned)(mp3NumBytes/(125*fFileDuration) + 0.5); // kbps, rounded
} else {
estBitrate = 128; // kbps, estimate
}
if (fGenerateADUs) {
// Add a filter that converts the source MP3s to ADUs:
streamSource = ADUFromMP3Source::createNew(envir(), streamSource);
if (streamSource == NULL) break;
if (fInterleaving != NULL) {
// Add another filter that interleaves the ADUs before packetizing:
streamSource = MP3ADUinterleaver::createNew(envir(), *fInterleaving,
streamSource);
if (streamSource == NULL) break;
}
} else if (fFileDuration > 0.0) {
// Because this is a seekable file, insert a pair of filters: one that
// converts the input MP3 stream to ADUs; another that converts these
// ADUs back to MP3. This allows us to seek within the input stream without
// tripping over the MP3 'bit reservoir':
streamSource = ADUFromMP3Source::createNew(envir(), streamSource);
if (streamSource == NULL) break;
streamSource = MP3FromADUSource::createNew(envir(), streamSource);
if (streamSource == NULL) break;
}
} while (0);
return streamSource;
}
void MP3AudioFileServerMediaSubsession::getBaseStreams(FramedSource* frontStream,
FramedSource*& sourceMP3Stream, ADUFromMP3Source*& aduStream/*if any*/) {
if (fGenerateADUs) {
// There's an ADU stream.
if (fInterleaving != NULL) {
// There's an interleaving filter in front of the ADU stream. So go back one, to reach the ADU stream:
aduStream = (ADUFromMP3Source*)(((FramedFilter*)frontStream)->inputSource());
} else {
aduStream = (ADUFromMP3Source*)frontStream;
}
// Then, go back one more, to reach the MP3 source:
sourceMP3Stream = (MP3FileSource*)(aduStream->inputSource());
} else if (fFileDuration > 0.0) {
// There's a pair of filters - MP3->ADU and ADU->MP3 - in front of the
// original MP3 source. So, go back one, to reach the ADU source:
aduStream = (ADUFromMP3Source*)(((FramedFilter*)frontStream)->inputSource());
// Then, go back one more, to reach the MP3 source:
sourceMP3Stream = (MP3FileSource*)(aduStream->inputSource());
} else {
// There's no filter in front of the source MP3 stream (and there's no ADU stream):
aduStream = NULL;
sourceMP3Stream = frontStream;
}
}
void MP3AudioFileServerMediaSubsession
::seekStreamSource(FramedSource* inputSource, double& seekNPT, double streamDuration, u_int64_t& /*numBytes*/) {
FramedSource* sourceMP3Stream;
ADUFromMP3Source* aduStream;
getBaseStreams(inputSource, sourceMP3Stream, aduStream);
if (aduStream != NULL) aduStream->resetInput(); // because we're about to seek within its source
((MP3FileSource*)sourceMP3Stream)->seekWithinFile(seekNPT, streamDuration);
}
void MP3AudioFileServerMediaSubsession
::setStreamSourceScale(FramedSource* inputSource, float scale) {
FramedSource* sourceMP3Stream;
ADUFromMP3Source* aduStream;
getBaseStreams(inputSource, sourceMP3Stream, aduStream);
if (aduStream == NULL) return; // because, in this case, the stream's not scalable
int iScale = (int)scale;
aduStream->setScaleFactor(iScale);
((MP3FileSource*)sourceMP3Stream)->setPresentationTimeScale(iScale);
}
FramedSource* MP3AudioFileServerMediaSubsession
::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
MP3FileSource* mp3Source = MP3FileSource::createNew(envir(), fFileName);
if (mp3Source == NULL) return NULL;
fFileDuration = mp3Source->filePlayTime();
return createNewStreamSourceCommon(mp3Source, mp3Source->fileSize(), estBitrate);
}
RTPSink* MP3AudioFileServerMediaSubsession
::createNewRTPSink(Groupsock* rtpGroupsock,
unsigned char rtpPayloadTypeIfDynamic,
FramedSource* /*inputSource*/) {
if (fGenerateADUs) {
return MP3ADURTPSink::createNew(envir(), rtpGroupsock,
rtpPayloadTypeIfDynamic);
} else {
return MPEG1or2AudioRTPSink::createNew(envir(), rtpGroupsock);
}
}
void MP3AudioFileServerMediaSubsession::testScaleFactor(float& scale) {
if (fFileDuration <= 0.0) {
// The file is non-seekable, so is probably a live input source.
// We don't support scale factors other than 1
scale = 1;
} else {
// We support any integral scale >= 1
int iScale = (int)(scale + 0.5); // round
if (iScale < 1) iScale = 1;
scale = (float)iScale;
}
}
float MP3AudioFileServerMediaSubsession::duration() const {
return fFileDuration;
}
live/liveMedia/MP3AudioMatroskaFileServerMediaSubsession.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A 'ServerMediaSubsession' object that creates new, unicast, "RTPSink"s
// on demand, from an MP3 audio track within a Matroska file.
// (Actually, any MPEG-1 or MPEG-2 audio track should also work.)
// Implementation
#include "FileServerMediaSubsession.hh"
#include "MP3AudioMatroskaFileServerMediaSubsession.hh"
#include "MatroskaDemuxedTrack.hh"
MP3AudioMatroskaFileServerMediaSubsession* MP3AudioMatroskaFileServerMediaSubsession
::createNew(MatroskaFileServerDemux& demux, MatroskaTrack* track,
Boolean generateADUs, Interleaving* interleaving) {
return new MP3AudioMatroskaFileServerMediaSubsession(demux, track, generateADUs, interleaving);
}
MP3AudioMatroskaFileServerMediaSubsession
::MP3AudioMatroskaFileServerMediaSubsession(MatroskaFileServerDemux& demux, MatroskaTrack* track,
Boolean generateADUs, Interleaving* interleaving)
: MP3AudioFileServerMediaSubsession(demux.envir(), demux.fileName(), False, generateADUs, interleaving),
fOurDemux(demux), fTrackNumber(track->trackNumber) {
fFileDuration = fOurDemux.fileDuration();
}
MP3AudioMatroskaFileServerMediaSubsession::~MP3AudioMatroskaFileServerMediaSubsession() {
}
void MP3AudioMatroskaFileServerMediaSubsession
::seekStreamSource(FramedSource* inputSource, double& seekNPT, double /*streamDuration*/, u_int64_t& /*numBytes*/) {
FramedSource* sourceMP3Stream;
ADUFromMP3Source* aduStream;
getBaseStreams(inputSource, sourceMP3Stream, aduStream);
if (aduStream != NULL) aduStream->resetInput(); // because we're about to seek within its source
((MatroskaDemuxedTrack*)sourceMP3Stream)->seekToTime(seekNPT);
}
FramedSource* MP3AudioMatroskaFileServerMediaSubsession
::createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate) {
FramedSource* baseMP3Source = fOurDemux.newDemuxedTrack(clientSessionId, fTrackNumber);
return createNewStreamSourceCommon(baseMP3Source, 0, estBitrate);
}
live/liveMedia/MP3FileSource.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// MP3 File Sources
// Implementation
#include "MP3FileSource.hh"
#include "MP3StreamState.hh"
#include "InputFile.hh"
////////// MP3FileSource //////////
MP3FileSource::MP3FileSource(UsageEnvironment& env, FILE* fid)
: FramedFileSource(env, fid),
fStreamState(new MP3StreamState(env)) {
}
MP3FileSource::~MP3FileSource() {
delete fStreamState;
}
char const* MP3FileSource::MIMEtype() const {
return "audio/MPEG";
}
MP3FileSource* MP3FileSource::createNew(UsageEnvironment& env, char const* fileName) {
MP3FileSource* newSource = NULL;
do {
FILE* fid;
fid = OpenInputFile(env, fileName);
if (fid == NULL) break;
newSource = new MP3FileSource(env, fid);
if (newSource == NULL) break;
unsigned fileSize = (unsigned)GetFileSize(fileName, fid);
newSource->assignStream(fid, fileSize);
if (!newSource->initializeStream()) break;
return newSource;
} while (0);
Medium::close(newSource);
return NULL;
}
float MP3FileSource::filePlayTime() const {
return fStreamState->filePlayTime();
}
unsigned MP3FileSource::fileSize() const {
return fStreamState->fileSize();
}
void MP3FileSource::setPresentationTimeScale(unsigned scale) {
fStreamState->setPresentationTimeScale(scale);
}
void MP3FileSource::seekWithinFile(double seekNPT, double streamDuration) {
float fileDuration = filePlayTime();
// First, make sure that 0.0 <= seekNPT <= seekNPT + streamDuration <= fileDuration
if (seekNPT < 0.0) {
seekNPT = 0.0;
} else if (seekNPT > fileDuration) {
seekNPT = fileDuration;
}
if (streamDuration < 0.0) {
streamDuration = 0.0;
} else if (seekNPT + streamDuration > fileDuration) {
streamDuration = fileDuration - seekNPT;
}
float seekFraction = (float)seekNPT/fileDuration;
unsigned seekByteNumber = fStreamState->getByteNumberFromPositionFraction(seekFraction);
fStreamState->seekWithinFile(seekByteNumber);
fLimitNumBytesToStream = False; // by default
if (streamDuration > 0.0) {
float endFraction = (float)(seekNPT + streamDuration)/fileDuration;
unsigned endByteNumber = fStreamState->getByteNumberFromPositionFraction(endFraction);
if (endByteNumber > seekByteNumber) { // sanity check
fNumBytesToStream = endByteNumber - seekByteNumber;
fLimitNumBytesToStream = True;
}
}
}
void MP3FileSource::getAttributes() const {
char buffer[200];
fStreamState->getAttributes(buffer, sizeof buffer);
envir().setResultMsg(buffer);
}
void MP3FileSource::doGetNextFrame() {
if (!doGetNextFrame1()) {
handleClosure();
return;
}
// Switch to another task:
#if defined(__WIN32__) || defined(_WIN32)
// HACK: liveCaster/lc uses an implementation of scheduleDelayedTask()
// that performs very badly (chewing up lots of CPU time, apparently polling)
// on Windows. Until this is fixed, we just call our "afterGetting()"
// function directly. This avoids infinite recursion, as long as our sink
// is discontinuous, which is the case for the RTP sink that liveCaster/lc
// uses. #####
afterGetting(this);
#else
nextTask() = envir().taskScheduler().scheduleDelayedTask(0,
(TaskFunc*)afterGetting, this);
#endif
}
Boolean MP3FileSource::doGetNextFrame1() {
if (fLimitNumBytesToStream && fNumBytesToStream == 0) return False; // we've already streamed as much as we were asked for
if (!fHaveJustInitialized) {
if (fStreamState->findNextHeader(fPresentationTime) == 0) return False;
} else {
fPresentationTime = fFirstFramePresentationTime;
fHaveJustInitialized = False;
}
if (!fStreamState->readFrame(fTo, fMaxSize, fFrameSize, fDurationInMicroseconds)) {
char tmp[200];
sprintf(tmp,
"Insufficient buffer size %d for reading MPEG audio frame (needed %d)\n",
fMaxSize, fFrameSize);
envir().setResultMsg(tmp);
fFrameSize = fMaxSize;
return False;
}
if (fNumBytesToStream > fFrameSize) fNumBytesToStream -= fFrameSize; else fNumBytesToStream = 0;
return True;
}
void MP3FileSource::assignStream(FILE* fid, unsigned fileSize) {
fStreamState->assignStream(fid, fileSize);
}
Boolean MP3FileSource::initializeStream() {
// Make sure the file has an appropriate header near the start:
if (fStreamState->findNextHeader(fFirstFramePresentationTime) == 0) {
envir().setResultMsg("not an MPEG audio file");
return False;
}
fStreamState->checkForXingHeader(); // in case this is a VBR file
fHaveJustInitialized = True;
fLimitNumBytesToStream = False;
fNumBytesToStream = 0;
// Hack: It's possible that our environment's 'result message' has been
// reset within this function, so set it again to our name now:
envir().setResultMsg(name());
return True;
}
live/liveMedia/ProxyServerMediaSession.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A subclass of "ServerMediaSession" that can be used to create a (unicast) RTSP server that acts as a 'proxy' for
// another (unicast or multicast) RTSP/RTP stream.
// Implementation
#include "liveMedia.hh"
#include "RTSPCommon.hh"
#include "GroupsockHelper.hh" // for "our_random()"
#ifndef MILLION
#define MILLION 1000000
#endif
// An "OnDemandServerMediaSubsession" subclass, used to implement a unicast RTSP server that's proxying another RTSP stream:
class ProxyServerMediaSubsession: public OnDemandServerMediaSubsession {
public:
ProxyServerMediaSubsession(MediaSubsession& mediaSubsession,
portNumBits initialPortNum, Boolean multiplexRTCPWithRTP);
virtual ~ProxyServerMediaSubsession();
char const* codecName() const { return fCodecName; }
private: // redefined virtual functions
virtual FramedSource* createNewStreamSource(unsigned clientSessionId,
unsigned& estBitrate);
virtual void closeStreamSource(FramedSource *inputSource);
virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
unsigned char rtpPayloadTypeIfDynamic,
FramedSource* inputSource);
virtual Groupsock* createGroupsock(struct in_addr const& addr, Port port);
virtual RTCPInstance* createRTCP(Groupsock* RTCPgs, unsigned totSessionBW, /* in kbps */
unsigned char const* cname, RTPSink* sink);
private:
static void subsessionByeHandler(void* clientData);
void subsessionByeHandler();
int verbosityLevel() const { return ((ProxyServerMediaSession*)fParentSession)->fVerbosityLevel; }
private:
friend class ProxyRTSPClient;
MediaSubsession& fClientMediaSubsession; // the 'client' media subsession object that corresponds to this 'server' media subsession
char const* fCodecName; // copied from "fClientMediaSubsession" once it's been set up
ProxyServerMediaSubsession* fNext; // used when we're part of a queue
Boolean fHaveSetupStream;
};
////////// ProxyServerMediaSession implementation //////////
UsageEnvironment& operator<<(UsageEnvironment& env, const ProxyServerMediaSession& psms) { // used for debugging
return env << "ProxyServerMediaSession[\"" << psms.url() << "\"]";
}
ProxyRTSPClient*
defaultCreateNewProxyRTSPClientFunc(ProxyServerMediaSession& ourServerMediaSession,
char const* rtspURL,
char const* username, char const* password,
portNumBits tunnelOverHTTPPortNum, int verbosityLevel,
int socketNumToServer) {
return new ProxyRTSPClient(ourServerMediaSession, rtspURL, username, password,
tunnelOverHTTPPortNum, verbosityLevel, socketNumToServer);
}
ProxyServerMediaSession* ProxyServerMediaSession
::createNew(UsageEnvironment& env, GenericMediaServer* ourMediaServer,
char const* inputStreamURL, char const* streamName,
char const* username, char const* password,
portNumBits tunnelOverHTTPPortNum, int verbosityLevel, int socketNumToServer,
MediaTranscodingTable* transcodingTable) {
return new ProxyServerMediaSession(env, ourMediaServer, inputStreamURL, streamName, username, password,
tunnelOverHTTPPortNum, verbosityLevel, socketNumToServer,
transcodingTable);
}
ProxyServerMediaSession
::ProxyServerMediaSession(UsageEnvironment& env, GenericMediaServer* ourMediaServer,
char const* inputStreamURL, char const* streamName,
char const* username, char const* password,
portNumBits tunnelOverHTTPPortNum, int verbosityLevel,
int socketNumToServer,
MediaTranscodingTable* transcodingTable,
createNewProxyRTSPClientFunc* ourCreateNewProxyRTSPClientFunc,
portNumBits initialPortNum, Boolean multiplexRTCPWithRTP)
: ServerMediaSession(env, streamName, NULL, NULL, False, NULL),
describeCompletedFlag(0), fOurMediaServer(ourMediaServer), fClientMediaSession(NULL),
fVerbosityLevel(verbosityLevel),
fPresentationTimeSessionNormalizer(new PresentationTimeSessionNormalizer(envir())),
fCreateNewProxyRTSPClientFunc(ourCreateNewProxyRTSPClientFunc),
fTranscodingTable(transcodingTable),
fInitialPortNum(initialPortNum), fMultiplexRTCPWithRTP(multiplexRTCPWithRTP) {
// Open an RTSP connection to the input stream, and send a "DESCRIBE" command.
// We'll use the SDP description in the response to set ourselves up.
fProxyRTSPClient
= (*fCreateNewProxyRTSPClientFunc)(*this, inputStreamURL, username, password,
tunnelOverHTTPPortNum,
verbosityLevel > 0 ? verbosityLevel-1 : verbosityLevel,
socketNumToServer);
ProxyRTSPClient::sendDESCRIBE(fProxyRTSPClient);
}
ProxyServerMediaSession::~ProxyServerMediaSession() {
if (fVerbosityLevel > 0) {
envir() << *this << "::~ProxyServerMediaSession()\n";
}
// Begin by sending a "TEARDOWN" command (without checking for a response):
if (fProxyRTSPClient != NULL && fClientMediaSession != NULL) {
fProxyRTSPClient->sendTeardownCommand(*fClientMediaSession, NULL, fProxyRTSPClient->auth());
}
// Then delete our state:
Medium::close(fTranscodingTable);
Medium::close(fClientMediaSession);
Medium::close(fProxyRTSPClient);
Medium::close(fPresentationTimeSessionNormalizer);
}
char const* ProxyServerMediaSession::url() const {
return fProxyRTSPClient == NULL ? NULL : fProxyRTSPClient->url();
}
Groupsock* ProxyServerMediaSession::createGroupsock(struct in_addr const& addr, Port port) {
// Default implementation; may be redefined by subclasses:
return new Groupsock(envir(), addr, port, 255);
}
RTCPInstance* ProxyServerMediaSession
::createRTCP(Groupsock* RTCPgs, unsigned totSessionBW, /* in kbps */
unsigned char const* cname, RTPSink* sink) {
// Default implementation; may be redefined by subclasses:
return RTCPInstance::createNew(envir(), RTCPgs, totSessionBW, cname, sink, NULL/*we're a server*/);
}
void ProxyServerMediaSession::continueAfterDESCRIBE(char const* sdpDescription) {
describeCompletedFlag = 1;
// Create a (client) "MediaSession" object from the stream's SDP description ("sdpDescription"), then iterate through its
// "MediaSubsession" objects, to set up corresponding "ServerMediaSubsession" objects that we'll use to serve the stream's tracks.
do {
fClientMediaSession = MediaSession::createNew(envir(), sdpDescription);
if (fClientMediaSession == NULL) break;
MediaSubsessionIterator iter(*fClientMediaSession);
for (MediaSubsession* mss = iter.next(); mss != NULL; mss = iter.next()) {
ServerMediaSubsession* smss
= new ProxyServerMediaSubsession(*mss, fInitialPortNum, fMultiplexRTCPWithRTP);
addSubsession(smss);
if (fVerbosityLevel > 0) {
envir() << *this << " added new \"ProxyServerMediaSubsession\" for "
<< mss->protocolName() << "/" << mss->mediumName() << "/" << mss->codecName() << " track\n";
}
}
} while (0);
}
void ProxyServerMediaSession::resetDESCRIBEState() {
// Delete all of our "ProxyServerMediaSubsession"s; they'll get set up again once we get a response to the new "DESCRIBE".
if (fOurMediaServer != NULL) {
// First, close any client connections that may have already been set up:
fOurMediaServer->closeAllClientSessionsForServerMediaSession(this);
}
deleteAllSubsessions();
// Finally, delete the client "MediaSession" object that we had set up after receiving the response to the previous "DESCRIBE":
Medium::close(fClientMediaSession); fClientMediaSession = NULL;
}
///////// RTSP 'response handlers' //////////
static void continueAfterDESCRIBE(RTSPClient* rtspClient, int resultCode, char* resultString) {
char const* res;
if (resultCode == 0) {
// The "DESCRIBE" command succeeded, so "resultString" should be the stream's SDP description.
res = resultString;
} else {
// The "DESCRIBE" command failed.
res = NULL;
}
((ProxyRTSPClient*)rtspClient)->continueAfterDESCRIBE(res);
delete[] resultString;
}
static void continueAfterSETUP(RTSPClient* rtspClient, int resultCode, char* resultString) {
((ProxyRTSPClient*)rtspClient)->continueAfterSETUP(resultCode);
delete[] resultString;
}
static void continueAfterPLAY(RTSPClient* rtspClient, int resultCode, char* resultString) {
((ProxyRTSPClient*)rtspClient)->continueAfterPLAY(resultCode);
delete[] resultString;
}
static void continueAfterOPTIONS(RTSPClient* rtspClient, int resultCode, char* resultString) {
Boolean serverSupportsGetParameter = False;
if (resultCode == 0) {
// Note whether the server told us that it supports the "GET_PARAMETER" command:
serverSupportsGetParameter = RTSPOptionIsSupported("GET_PARAMETER", resultString);
}
((ProxyRTSPClient*)rtspClient)->continueAfterLivenessCommand(resultCode, serverSupportsGetParameter);
delete[] resultString;
}
#ifdef SEND_GET_PARAMETER_IF_SUPPORTED
static void continueAfterGET_PARAMETER(RTSPClient* rtspClient, int resultCode, char* resultString) {
((ProxyRTSPClient*)rtspClient)->continueAfterLivenessCommand(resultCode, True);
delete[] resultString;
}
#endif
////////// "ProxyRTSPClient" implementation //////////
UsageEnvironment& operator<<(UsageEnvironment& env, const ProxyRTSPClient& proxyRTSPClient) { // used for debugging
return env << "ProxyRTSPClient[\"" << proxyRTSPClient.url() << "\"]";
}
ProxyRTSPClient::ProxyRTSPClient(ProxyServerMediaSession& ourServerMediaSession, char const* rtspURL,
char const* username, char const* password,
portNumBits tunnelOverHTTPPortNum, int verbosityLevel, int socketNumToServer)
: RTSPClient(ourServerMediaSession.envir(), rtspURL, verbosityLevel, "ProxyRTSPClient",
tunnelOverHTTPPortNum == (portNumBits)(~0) ? 0 : tunnelOverHTTPPortNum, socketNumToServer),
fOurServerMediaSession(ourServerMediaSession), fOurURL(strDup(rtspURL)), fStreamRTPOverTCP(tunnelOverHTTPPortNum != 0),
fSetupQueueHead(NULL), fSetupQueueTail(NULL), fNumSetupsDone(0), fNextDESCRIBEDelay(1),
fServerSupportsGetParameter(False), fLastCommandWasPLAY(False), fResetOnNextLivenessTest(False),
fLivenessCommandTask(NULL), fDESCRIBECommandTask(NULL), fSubsessionTimerTask(NULL) {
if (username != NULL && password != NULL) {
fOurAuthenticator = new Authenticator(username, password);
} else {
fOurAuthenticator = NULL;
}
}
void ProxyRTSPClient::reset() {
envir().taskScheduler().unscheduleDelayedTask(fLivenessCommandTask); fLivenessCommandTask = NULL;
envir().taskScheduler().unscheduleDelayedTask(fDESCRIBECommandTask); fDESCRIBECommandTask = NULL;
envir().taskScheduler().unscheduleDelayedTask(fSubsessionTimerTask); fSubsessionTimerTask = NULL;
fSetupQueueHead = fSetupQueueTail = NULL;
fNumSetupsDone = 0;
fNextDESCRIBEDelay = 1;
fLastCommandWasPLAY = False;
RTSPClient::reset();
}
ProxyRTSPClient::~ProxyRTSPClient() {
reset();
delete fOurAuthenticator;
delete[] fOurURL;
}
void ProxyRTSPClient::continueAfterDESCRIBE(char const* sdpDescription) {
if (sdpDescription != NULL) {
fOurServerMediaSession.continueAfterDESCRIBE(sdpDescription);
// Unlike most RTSP streams, there might be a long delay between this "DESCRIBE" command (to the downstream server) and the
// subsequent "SETUP"/"PLAY" - which doesn't occur until the first time that a client requests the stream.
// To prevent the proxied connection (between us and the downstream server) from timing out, we send periodic 'liveness'
// ("OPTIONS" or "GET_PARAMETER") commands. (The usual RTCP liveness mechanism wouldn't work here, because RTCP packets
// don't get sent until after the "PLAY" command.)
scheduleLivenessCommand();
} else {
// The "DESCRIBE" command failed, most likely because the server or the stream is not yet running.
// Reschedule another "DESCRIBE" command to take place later:
scheduleDESCRIBECommand();
}
}
void ProxyRTSPClient::continueAfterLivenessCommand(int resultCode, Boolean serverSupportsGetParameter) {
if (fResetOnNextLivenessTest) {
// Hack: We've arranged to reset the connection with the server (regardless of "resultCode"):
fResetOnNextLivenessTest = False;
resultCode = 2; // arbitrary > 0
}
if (resultCode != 0) {
// The periodic 'liveness' command failed, suggesting that the back-end stream is no longer alive.
// We handle this by resetting our connection state with this server. Any current clients will be closed, but
// subsequent clients will cause new RTSP "SETUP"s and "PLAY"s to get done, restarting the stream.
// Then continue by sending more "DESCRIBE" commands, to try to restore the stream.
fServerSupportsGetParameter = False; // until we learn otherwise, in response to a future "OPTIONS" command
if (resultCode < 0) {
// The 'liveness' command failed without getting a response from the server (otherwise "resultCode" would have been > 0).
// This suggests that the RTSP connection itself has failed. Print this error code, in case it's useful for debugging:
if (fVerbosityLevel > 0) {
envir() << *this << ": lost connection to server ('errno': " << -resultCode << "). Resetting...\n";
}
}
reset();
fOurServerMediaSession.resetDESCRIBEState();
setBaseURL(fOurURL); // because we'll be sending an initial "DESCRIBE" all over again
sendDESCRIBE(this);
return;
}
fServerSupportsGetParameter = serverSupportsGetParameter;
// Schedule the next 'liveness' command (i.e., to tell the back-end server that we're still alive):
scheduleLivenessCommand();
}
#define SUBSESSION_TIMEOUT_SECONDS 10 // how many seconds to wait for the last track's "SETUP" to be done (note below)
void ProxyRTSPClient::continueAfterSETUP(int resultCode) {
if (resultCode != 0) {
// The "SETUP" command failed, so arrange to reset the state after the next RTSP 'liveness'
// command. (We don't do this now, because it deletes the "ProxyServerMediaSubsession",
// and we can't do that during "ProxyServerMediaSubsession::createNewStreamSource()".)
fResetOnNextLivenessTest = True;
envir().taskScheduler().rescheduleDelayedTask(fLivenessCommandTask, 0, sendLivenessCommand, this);
return;
}
if (fVerbosityLevel > 0) {
envir() << *this << "::continueAfterSETUP(): head codec: " << fSetupQueueHead->codecName()
<< "; numSubsessions " << fSetupQueueHead->fParentSession->numSubsessions() << "\n\tqueue:";
for (ProxyServerMediaSubsession* p = fSetupQueueHead; p != NULL; p = p->fNext) {
envir() << "\t" << p->codecName();
if (p->fNext == fSetupQueueHead || p->fNext == p) { fprintf(stderr, "##### INTERNAL ERROR 1\n"); break; } //##### TEMP FOR DEBUGGING
}
envir() << "\n";
}
envir().taskScheduler().unscheduleDelayedTask(fSubsessionTimerTask); // in case it had been set
// Dequeue the first "ProxyServerMediaSubsession" from our 'SETUP queue'. It will be the one for which this "SETUP" was done:
ProxyServerMediaSubsession* smss = fSetupQueueHead; // Assert: != NULL
if (fSetupQueueHead == NULL) fprintf(stderr, "##### INTERNAL ERROR 2\n"); else //##### TEMP FOR DEBUGGING
fSetupQueueHead = fSetupQueueHead->fNext;
if (fSetupQueueHead == NULL) fSetupQueueTail = NULL;
if (fSetupQueueHead != NULL) {
// There are still entries in the queue, for tracks for which we have still to do a "SETUP".
// "SETUP" the first of these now:
sendSetupCommand(fSetupQueueHead->fClientMediaSubsession, ::continueAfterSETUP,
False, fStreamRTPOverTCP, False, fOurAuthenticator);
++fNumSetupsDone;
fSetupQueueHead->fHaveSetupStream = True;
} else {
if (fNumSetupsDone >= smss->fParentSession->numSubsessions()) {
// We've now finished setting up each of our subsessions (i.e., 'tracks').
// Continue by sending a "PLAY" command (an 'aggregate' "PLAY" command, on the whole session):
sendPlayCommand(smss->fClientMediaSubsession.parentSession(), ::continueAfterPLAY, -1.0f, -1.0f, 1.0f, fOurAuthenticator);
// the "-1.0f" "start" parameter causes the "PLAY" to be sent without a "Range:" header, in case we'd already done
// a "PLAY" before (as a result of a 'subsession timeout' (note below))
fLastCommandWasPLAY = True;
} else {
// Some of this session's subsessions (i.e., 'tracks') remain to be "SETUP". They might get "SETUP" very soon, but it's
// also possible - if the remote client chose to play only some of the session's tracks - that they might not.
// To allow for this possibility, we set a timer. If the timer expires without the remaining subsessions getting "SETUP",
// then we send a "PLAY" command anyway:
fSubsessionTimerTask
= envir().taskScheduler().scheduleDelayedTask(SUBSESSION_TIMEOUT_SECONDS*MILLION, (TaskFunc*)subsessionTimeout, this);
}
}
}
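// The 'SETUP queue' manipulated above is a plain singly-linked FIFO: because RTSP
// responses arrive in the same order as their requests, popping the head always
// yields the subsession whose "SETUP" just completed.  A minimal, self-contained
// sketch of that pairing (hypothetical "Node"/"SetupQueue" names; not part of the
// library):

```cpp
#include <cassert>
#include <cstddef>

// Stand-in for "ProxyServerMediaSubsession"; only the queue linkage matters here:
struct Node {
  int trackId;
  Node* fNext;
};

struct SetupQueue {
  Node* fHead = nullptr;
  Node* fTail = nullptr;

  // Called when a "SETUP" request is sent for a track:
  void enqueue(Node* n) {
    n->fNext = nullptr;
    if (fHead == nullptr) fHead = n; else fTail->fNext = n;
    fTail = n;
  }

  // Called when the matching "SETUP" response arrives; returns the track it was for:
  Node* dequeue() {
    Node* n = fHead;
    if (fHead != nullptr) fHead = fHead->fNext;
    if (fHead == nullptr) fTail = nullptr;
    return n;
  }
};
```

// (The library keeps the same head/tail pair - "fSetupQueueHead"/"fSetupQueueTail" -
// directly inside "ProxyRTSPClient", rather than in a separate queue object.)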
void ProxyRTSPClient::continueAfterPLAY(int resultCode) {
if (resultCode != 0) {
// The "PLAY" command failed, so arrange to reset the state after the next RTSP 'liveness'
// command. (We don't do this now, because it deletes the "ProxyServerMediaSubsession",
// and we can't do that during "ProxyServerMediaSubsession::createNewStreamSource()".)
fResetOnNextLivenessTest = True;
envir().taskScheduler().rescheduleDelayedTask(fLivenessCommandTask, 0, sendLivenessCommand, this);
return;
}
}
void ProxyRTSPClient::scheduleLivenessCommand() {
// Delay a random time before sending another 'liveness' command.
unsigned delayMax = sessionTimeoutParameter(); // if the server specified a maximum time between 'liveness' probes, then use that
if (delayMax == 0) {
delayMax = 60;
}
// Choose a random time from [delayMax/2,delayMax-1) seconds:
unsigned const us_1stPart = delayMax*500000;
unsigned uSecondsToDelay;
if (us_1stPart <= 1000000) {
uSecondsToDelay = us_1stPart;
} else {
unsigned const us_2ndPart = us_1stPart-1000000;
uSecondsToDelay = us_1stPart + our_random()%us_2ndPart; // uniform jitter, matching the comment above
}
fLivenessCommandTask = envir().taskScheduler().scheduleDelayedTask(uSecondsToDelay, sendLivenessCommand, this);
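// The delay computation above can be isolated as a pure function.  A sketch of the
// intended jitter - uniform in [delayMax/2, delayMax-1) seconds, with a fixed delay
// for very small timeouts - under hypothetical names ("livenessDelayUSecs",
// "randomValue"; the library feeds the random value from "our_random()"):

```cpp
#include <cassert>

static unsigned livenessDelayUSecs(unsigned delayMax/*seconds*/, unsigned randomValue) {
  unsigned const us_1stPart = delayMax*500000;   // delayMax/2, in microseconds
  if (us_1stPart <= 1000000) return us_1stPart;  // timeout <= 2s: no room for jitter
  unsigned const us_2ndPart = us_1stPart - 1000000;
  return us_1stPart + randomValue%us_2ndPart;    // uniform in [delayMax/2, delayMax-1) seconds
}
```

// Probing well before the timeout (at half of it, at the earliest) keeps the back-end
// session alive even if one 'liveness' command or its response is delayed.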
}
void ProxyRTSPClient::sendLivenessCommand(void* clientData) {
ProxyRTSPClient* rtspClient = (ProxyRTSPClient*)clientData;
// Note. By default, we do not send "GET_PARAMETER" as our 'liveness notification' command, even if the server previously
// indicated (in its response to our earlier "OPTIONS" command) that it supported "GET_PARAMETER". This is because
// "GET_PARAMETER" crashes some camera servers (even though they claimed to support "GET_PARAMETER").
#ifdef SEND_GET_PARAMETER_IF_SUPPORTED
MediaSession* sess = rtspClient->fOurServerMediaSession.fClientMediaSession;
if (rtspClient->fServerSupportsGetParameter && rtspClient->fNumSetupsDone > 0 && sess != NULL) {
rtspClient->sendGetParameterCommand(*sess, ::continueAfterGET_PARAMETER, "", rtspClient->auth());
} else {
#endif
rtspClient->sendOptionsCommand(::continueAfterOPTIONS, rtspClient->auth());
#ifdef SEND_GET_PARAMETER_IF_SUPPORTED
}
#endif
}
void ProxyRTSPClient::scheduleDESCRIBECommand() {
// Delay 1s, 2s, 4s, 8s ... 256s until sending the next "DESCRIBE". Then, keep delaying a random time from [256..511] seconds:
unsigned secondsToDelay;
if (fNextDESCRIBEDelay <= 256) {
secondsToDelay = fNextDESCRIBEDelay;
fNextDESCRIBEDelay *= 2;
} else {
secondsToDelay = 256 + (our_random()&0xFF); // [256..511] seconds
}
if (fVerbosityLevel > 0) {
envir() << *this << ": RTSP \"DESCRIBE\" command failed; trying again in " << secondsToDelay << " seconds\n";
}
fDESCRIBECommandTask = envir().taskScheduler().scheduleDelayedTask(secondsToDelay*MILLION, sendDESCRIBE, this);
}
void ProxyRTSPClient::sendDESCRIBE(void* clientData) {
ProxyRTSPClient* rtspClient = (ProxyRTSPClient*)clientData;
if (rtspClient != NULL) rtspClient->sendDescribeCommand(::continueAfterDESCRIBE, rtspClient->auth());
}
void ProxyRTSPClient::subsessionTimeout(void* clientData) {
((ProxyRTSPClient*)clientData)->handleSubsessionTimeout();
}
void ProxyRTSPClient::handleSubsessionTimeout() {
// We still have one or more subsessions ('tracks') left to "SETUP". But we can't wait any longer for them. Send a "PLAY" now:
MediaSession* sess = fOurServerMediaSession.fClientMediaSession;
if (sess != NULL) sendPlayCommand(*sess, ::continueAfterPLAY, -1.0f, -1.0f, 1.0f, fOurAuthenticator);
fLastCommandWasPLAY = True;
}
////////// "ProxyServerMediaSubsession" implementation //////////
ProxyServerMediaSubsession
::ProxyServerMediaSubsession(MediaSubsession& mediaSubsession,
portNumBits initialPortNum, Boolean multiplexRTCPWithRTP)
: OnDemandServerMediaSubsession(mediaSubsession.parentSession().envir(), True/*reuseFirstSource*/,
initialPortNum, multiplexRTCPWithRTP),
fClientMediaSubsession(mediaSubsession), fCodecName(strDup(mediaSubsession.codecName())),
fNext(NULL), fHaveSetupStream(False) {
}
UsageEnvironment& operator<<(UsageEnvironment& env, const ProxyServerMediaSubsession& psmss) { // used for debugging
return env << "ProxyServerMediaSubsession[\"" << psmss.codecName() << "\"]";
}
ProxyServerMediaSubsession::~ProxyServerMediaSubsession() {
if (verbosityLevel() > 0) {
envir() << *this << "::~ProxyServerMediaSubsession()\n";
}
delete[] (char*)fCodecName;
}
FramedSource* ProxyServerMediaSubsession::createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate) {
ProxyServerMediaSession* const sms = (ProxyServerMediaSession*)fParentSession;
if (verbosityLevel() > 0) {
envir() << *this << "::createNewStreamSource(session id " << clientSessionId << ")\n";
}
// If we haven't yet created a data source from our 'media subsession' object, initiate() it to do so:
if (fClientMediaSubsession.readSource() == NULL) {
fClientMediaSubsession.receiveRawMP3ADUs(); // hack for MPA-ROBUST streams
fClientMediaSubsession.receiveRawJPEGFrames(); // hack for proxying JPEG/RTP streams. (Don't do this if we're transcoding.)
fClientMediaSubsession.initiate();
if (verbosityLevel() > 0) {
envir() << "\tInitiated: " << *this << "\n";
}
if (fClientMediaSubsession.readSource() != NULL) {
// First, check whether we have defined a 'transcoder' filter to be used with this codec:
if (sms->fTranscodingTable != NULL) {
char* outputCodecName;
FramedFilter* transcoder
= sms->fTranscodingTable->lookupTranscoder(fClientMediaSubsession, outputCodecName);
if (transcoder != NULL) {
fClientMediaSubsession.addFilter(transcoder);
delete[] (char*)fCodecName; fCodecName = outputCodecName;
}
}
// Then, add to the front of all data sources a filter that will 'normalize' their frames'
// presentation times, before the frames get re-transmitted by our server:
FramedFilter* normalizerFilter = sms->fPresentationTimeSessionNormalizer
->createNewPresentationTimeSubsessionNormalizer(fClientMediaSubsession.readSource(),
fClientMediaSubsession.rtpSource(),
fCodecName);
fClientMediaSubsession.addFilter(normalizerFilter);
// Some data sources require a 'framer' object to be added, before they can be fed into
// a "RTPSink". Adjust for this now:
if (strcmp(fCodecName, "H264") == 0) {
fClientMediaSubsession.addFilter(H264VideoStreamDiscreteFramer
::createNew(envir(), fClientMediaSubsession.readSource()));
} else if (strcmp(fCodecName, "H265") == 0) {
fClientMediaSubsession.addFilter(H265VideoStreamDiscreteFramer
::createNew(envir(), fClientMediaSubsession.readSource()));
} else if (strcmp(fCodecName, "MP4V-ES") == 0) {
fClientMediaSubsession.addFilter(MPEG4VideoStreamDiscreteFramer
::createNew(envir(), fClientMediaSubsession.readSource(),
True/* leave PTs unmodified*/));
} else if (strcmp(fCodecName, "MPV") == 0) {
fClientMediaSubsession.addFilter(MPEG1or2VideoStreamDiscreteFramer
::createNew(envir(), fClientMediaSubsession.readSource(),
False, 5.0, True/* leave PTs unmodified*/));
} else if (strcmp(fCodecName, "DV") == 0) {
fClientMediaSubsession.addFilter(DVVideoStreamFramer
::createNew(envir(), fClientMediaSubsession.readSource(),
False, True/* leave PTs unmodified*/));
}
}
if (fClientMediaSubsession.rtcpInstance() != NULL) {
fClientMediaSubsession.rtcpInstance()->setByeHandler(subsessionByeHandler, this);
}
}
ProxyRTSPClient* const proxyRTSPClient = sms->fProxyRTSPClient;
if (clientSessionId != 0) {
// We're being called as a result of implementing a RTSP "SETUP".
if (!fHaveSetupStream) {
// This is our first "SETUP". Send RTSP "SETUP" and later "PLAY" commands to the proxied server, to start streaming:
// (Before sending "SETUP", enqueue ourselves on the "RTSPClient"s 'SETUP queue', so we'll be able to get the correct
// "ProxyServerMediaSubsession" to handle the response. (Note that responses come back in the same order as requests.))
Boolean queueWasEmpty = proxyRTSPClient->fSetupQueueHead == NULL;
if (queueWasEmpty) {
if (proxyRTSPClient->fSetupQueueTail != NULL) fprintf(stderr, "##### INTERNAL ERROR 3\n");
proxyRTSPClient->fSetupQueueHead = this;
} else {
if (proxyRTSPClient->fSetupQueueTail == NULL) fprintf(stderr, "##### INTERNAL ERROR 4\n"); else //##### TEMP FOR DEBUGGING
proxyRTSPClient->fSetupQueueTail->fNext = this;
}
proxyRTSPClient->fSetupQueueTail = this;
// Hack: If there's already a pending "SETUP" request (for another track), don't send this track's "SETUP" right away, because
// the server might not properly handle 'pipelined' requests. Instead, wait until after previous "SETUP" responses come back.
if (queueWasEmpty) {
proxyRTSPClient->sendSetupCommand(fClientMediaSubsession, ::continueAfterSETUP,
False, proxyRTSPClient->fStreamRTPOverTCP, False, proxyRTSPClient->auth());
++proxyRTSPClient->fNumSetupsDone;
fHaveSetupStream = True;
}
} else {
// This is a "SETUP" from a new client. We know that there are no other currently active clients (otherwise we wouldn't
// have been called here), so we know that the substream was previously "PAUSE"d. Send "PLAY" downstream once again,
// to resume the stream:
if (!proxyRTSPClient->fLastCommandWasPLAY) { // so that we send only one "PLAY"; not one for each subsession
proxyRTSPClient->sendPlayCommand(fClientMediaSubsession.parentSession(), ::continueAfterPLAY, -1.0f/*resume from previous point*/,
-1.0f, 1.0f, proxyRTSPClient->auth());
proxyRTSPClient->fLastCommandWasPLAY = True;
}
}
}
estBitrate = fClientMediaSubsession.bandwidth();
if (estBitrate == 0) estBitrate = 50; // kbps, estimate
return fClientMediaSubsession.readSource();
}
void ProxyServerMediaSubsession::closeStreamSource(FramedSource* inputSource) {
if (verbosityLevel() > 0) {
envir() << *this << "::closeStreamSource()\n";
}
// Because there's only one input source for this 'subsession' (regardless of how many downstream clients are proxying it),
// we don't close the input source here. (Instead, we wait until *this* object gets deleted.)
// However, because (as evidenced by this function having been called) we no longer have any clients accessing the stream,
// we "PAUSE" the downstream proxied stream, until a new client arrives:
if (fHaveSetupStream) {
ProxyServerMediaSession* const sms = (ProxyServerMediaSession*)fParentSession;
ProxyRTSPClient* const proxyRTSPClient = sms->fProxyRTSPClient;
if (proxyRTSPClient->fLastCommandWasPLAY) { // so that we send only one "PAUSE"; not one for each subsession
proxyRTSPClient->sendPauseCommand(fClientMediaSubsession.parentSession(), NULL, proxyRTSPClient->auth());
proxyRTSPClient->fLastCommandWasPLAY = False;
}
}
}
RTPSink* ProxyServerMediaSubsession
::createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource) {
if (verbosityLevel() > 0) {
envir() << *this << "::createNewRTPSink()\n";
}
// Create (and return) the appropriate "RTPSink" object for our codec:
// (Note: The configuration string might not be correct if a transcoder is used. FIX!) #####
RTPSink* newSink;
if (strcmp(fCodecName, "AC3") == 0 || strcmp(fCodecName, "EAC3") == 0) {
newSink = AC3AudioRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
fClientMediaSubsession.rtpTimestampFrequency());
#if 0 // This code does not work; do *not* enable it:
} else if (strcmp(fCodecName, "AMR") == 0 || strcmp(fCodecName, "AMR-WB") == 0) {
Boolean isWideband = strcmp(fCodecName, "AMR-WB") == 0;
newSink = AMRAudioRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
isWideband, fClientMediaSubsession.numChannels());
#endif
} else if (strcmp(fCodecName, "DV") == 0) {
newSink = DVVideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
} else if (strcmp(fCodecName, "GSM") == 0) {
newSink = GSMAudioRTPSink::createNew(envir(), rtpGroupsock);
} else if (strcmp(fCodecName, "H263-1998") == 0 || strcmp(fCodecName, "H263-2000") == 0) {
newSink = H263plusVideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
fClientMediaSubsession.rtpTimestampFrequency());
} else if (strcmp(fCodecName, "H264") == 0) {
newSink = H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
fClientMediaSubsession.fmtp_spropparametersets());
} else if (strcmp(fCodecName, "H265") == 0) {
newSink = H265VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
fClientMediaSubsession.fmtp_spropvps(),
fClientMediaSubsession.fmtp_spropsps(),
fClientMediaSubsession.fmtp_sproppps());
} else if (strcmp(fCodecName, "JPEG") == 0) {
newSink = SimpleRTPSink::createNew(envir(), rtpGroupsock, 26, 90000, "video", "JPEG",
1/*numChannels*/, False/*allowMultipleFramesPerPacket*/, False/*doNormalMBitRule*/);
} else if (strcmp(fCodecName, "MP4A-LATM") == 0) {
newSink = MPEG4LATMAudioRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
fClientMediaSubsession.rtpTimestampFrequency(),
fClientMediaSubsession.fmtp_config(),
fClientMediaSubsession.numChannels());
} else if (strcmp(fCodecName, "MP4V-ES") == 0) {
newSink = MPEG4ESVideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
fClientMediaSubsession.rtpTimestampFrequency(),
fClientMediaSubsession.attrVal_unsigned("profile-level-id"),
fClientMediaSubsession.fmtp_config());
} else if (strcmp(fCodecName, "MPA") == 0) {
newSink = MPEG1or2AudioRTPSink::createNew(envir(), rtpGroupsock);
} else if (strcmp(fCodecName, "MPA-ROBUST") == 0) {
newSink = MP3ADURTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
} else if (strcmp(fCodecName, "MPEG4-GENERIC") == 0) {
newSink = MPEG4GenericRTPSink::createNew(envir(), rtpGroupsock,
rtpPayloadTypeIfDynamic, fClientMediaSubsession.rtpTimestampFrequency(),
fClientMediaSubsession.mediumName(),
fClientMediaSubsession.attrVal_str("mode"),
fClientMediaSubsession.fmtp_config(), fClientMediaSubsession.numChannels());
} else if (strcmp(fCodecName, "MPV") == 0) {
newSink = MPEG1or2VideoRTPSink::createNew(envir(), rtpGroupsock);
} else if (strcmp(fCodecName, "OPUS") == 0) {
newSink = SimpleRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
48000, "audio", "OPUS", 2, False/*only 1 Opus 'packet' in each RTP packet*/);
} else if (strcmp(fCodecName, "T140") == 0) {
newSink = T140TextRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
} else if (strcmp(fCodecName, "THEORA") == 0) {
newSink = TheoraVideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
fClientMediaSubsession.fmtp_config());
} else if (strcmp(fCodecName, "VORBIS") == 0) {
newSink = VorbisAudioRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
fClientMediaSubsession.rtpTimestampFrequency(), fClientMediaSubsession.numChannels(),
fClientMediaSubsession.fmtp_config());
} else if (strcmp(fCodecName, "VP8") == 0) {
newSink = VP8VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
} else if (strcmp(fCodecName, "VP9") == 0) {
newSink = VP9VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
} else if (strcmp(fCodecName, "AMR") == 0 || strcmp(fCodecName, "AMR-WB") == 0) {
// Proxying of these codecs is currently *not* supported, because the data received by the "RTPSource" object is not in a
// form that can be fed directly into a corresponding "RTPSink" object.
if (verbosityLevel() > 0) {
envir() << "\treturns NULL (because we currently don't support the proxying of \""
<< fClientMediaSubsession.mediumName() << "/" << fCodecName << "\" streams)\n";
}
return NULL;
} else if (strcmp(fCodecName, "QCELP") == 0 ||
strcmp(fCodecName, "H261") == 0 ||
strcmp(fCodecName, "H263-1998") == 0 || strcmp(fCodecName, "H263-2000") == 0 ||
strcmp(fCodecName, "X-QT") == 0 || strcmp(fCodecName, "X-QUICKTIME") == 0) {
// This codec requires a specialized RTP payload format; however, we don't yet have an appropriate "RTPSink" subclass for it:
if (verbosityLevel() > 0) {
envir() << "\treturns NULL (because we don't have a \"RTPSink\" subclass for this RTP payload format)\n";
}
return NULL;
} else {
// This codec is assumed to have a simple RTP payload format that can be implemented just with a "SimpleRTPSink":
Boolean allowMultipleFramesPerPacket = True; // by default
Boolean doNormalMBitRule = True; // by default
// Some codecs change the above default parameters:
if (strcmp(fCodecName, "MP2T") == 0) {
doNormalMBitRule = False; // no RTP 'M' bit
}
newSink = SimpleRTPSink::createNew(envir(), rtpGroupsock,
rtpPayloadTypeIfDynamic, fClientMediaSubsession.rtpTimestampFrequency(),
fClientMediaSubsession.mediumName(), fCodecName,
fClientMediaSubsession.numChannels(), allowMultipleFramesPerPacket, doNormalMBitRule);
}
// Because our relayed frames' presentation times are inaccurate until the input frames have been RTCP-synchronized,
// we temporarily disable RTCP "SR" reports for this "RTPSink" object:
newSink->enableRTCPReports() = False;
// Also tell our "PresentationTimeSubsessionNormalizer" object about the "RTPSink", so it can enable RTCP "SR" reports later:
PresentationTimeSubsessionNormalizer* ssNormalizer;
if (strcmp(fCodecName, "H264") == 0 ||
strcmp(fCodecName, "H265") == 0 ||
strcmp(fCodecName, "MP4V-ES") == 0 ||
strcmp(fCodecName, "MPV") == 0 ||
strcmp(fCodecName, "DV") == 0) {
// There was a separate 'framer' object in front of the "PresentationTimeSubsessionNormalizer", so go back one object to get it:
ssNormalizer = (PresentationTimeSubsessionNormalizer*)(((FramedFilter*)inputSource)->inputSource());
} else {
ssNormalizer = (PresentationTimeSubsessionNormalizer*)inputSource;
}
ssNormalizer->setRTPSink(newSink);
return newSink;
}
Groupsock* ProxyServerMediaSubsession::createGroupsock(struct in_addr const& addr, Port port) {
ProxyServerMediaSession* parentSession = (ProxyServerMediaSession*)fParentSession;
return parentSession->createGroupsock(addr, port);
}
RTCPInstance* ProxyServerMediaSubsession
::createRTCP(Groupsock* RTCPgs, unsigned totSessionBW, /* in kbps */
unsigned char const* cname, RTPSink* sink) {
ProxyServerMediaSession* parentSession = (ProxyServerMediaSession*)fParentSession;
return parentSession->createRTCP(RTCPgs, totSessionBW, cname, sink);
}
void ProxyServerMediaSubsession::subsessionByeHandler(void* clientData) {
((ProxyServerMediaSubsession*)clientData)->subsessionByeHandler();
}
void ProxyServerMediaSubsession::subsessionByeHandler() {
if (verbosityLevel() > 0) {
envir() << *this << ": received RTCP \"BYE\". (The back-end stream has ended.)\n";
}
// This "BYE" signals that our input source has (effectively) closed, so pass this onto the front-end clients:
fHaveSetupStream = False; // hack: prevents "closeStreamSource()" (called as a result of the closure below) from sending a "PAUSE"
if (fClientMediaSubsession.readSource() != NULL) {
fClientMediaSubsession.readSource()->handleClosure();
}
// And then treat this as if we had lost connection to the back-end server,
// and can reestablish streaming from it only by sending another "DESCRIBE":
ProxyServerMediaSession* const sms = (ProxyServerMediaSession*)fParentSession;
ProxyRTSPClient* const proxyRTSPClient = sms->fProxyRTSPClient;
proxyRTSPClient->continueAfterLivenessCommand(1/*hack*/, proxyRTSPClient->fServerSupportsGetParameter);
}
////////// PresentationTimeSessionNormalizer and PresentationTimeSubsessionNormalizer implementations //////////
// PresentationTimeSessionNormalizer:
PresentationTimeSessionNormalizer::PresentationTimeSessionNormalizer(UsageEnvironment& env)
: Medium(env),
fSubsessionNormalizers(NULL), fMasterSSNormalizer(NULL) {
}
PresentationTimeSessionNormalizer::~PresentationTimeSessionNormalizer() {
while (fSubsessionNormalizers != NULL) {
Medium::close(fSubsessionNormalizers);
}
}
PresentationTimeSubsessionNormalizer* PresentationTimeSessionNormalizer
::createNewPresentationTimeSubsessionNormalizer(FramedSource* inputSource, RTPSource* rtpSource,
char const* codecName) {
fSubsessionNormalizers
= new PresentationTimeSubsessionNormalizer(*this, inputSource, rtpSource, codecName, fSubsessionNormalizers);
return fSubsessionNormalizers;
}
void PresentationTimeSessionNormalizer
::normalizePresentationTime(PresentationTimeSubsessionNormalizer* ssNormalizer,
struct timeval& toPT, struct timeval const& fromPT) {
Boolean const hasBeenSynced = ssNormalizer->fRTPSource->hasBeenSynchronizedUsingRTCP();
if (!hasBeenSynced) {
// If "fromPT" has not yet been RTCP-synchronized, then it was generated by our own receiving code, and thus
// is already aligned with 'wall-clock' time. Just copy it 'as is' to "toPT":
toPT = fromPT;
} else {
if (fMasterSSNormalizer == NULL) {
// Make "ssNormalizer" the 'master' subsession - meaning that its presentation time is adjusted to align with 'wall clock'
// time, and the presentation times of other subsessions (if any) are adjusted to retain their relative separation with
// those of the master:
fMasterSSNormalizer = ssNormalizer;
struct timeval timeNow;
gettimeofday(&timeNow, NULL);
// Compute: fPTAdjustment = timeNow - fromPT
fPTAdjustment.tv_sec = timeNow.tv_sec - fromPT.tv_sec;
fPTAdjustment.tv_usec = timeNow.tv_usec - fromPT.tv_usec;
// Note: It's OK if one or both of these fields underflows; the result still works out OK later.
}
// Compute a normalized presentation time: toPT = fromPT + fPTAdjustment
toPT.tv_sec = fromPT.tv_sec + fPTAdjustment.tv_sec - 1;
toPT.tv_usec = fromPT.tv_usec + fPTAdjustment.tv_usec + MILLION;
while (toPT.tv_usec >= MILLION) { ++toPT.tv_sec; toPT.tv_usec -= MILLION; } // ">=", so the result is always canonical
// Because "ssNormalizer"s relayed presentation times are accurate from now on, enable RTCP "SR" reports for its "RTPSink":
RTPSink* const rtpSink = ssNormalizer->fRTPSink;
if (rtpSink != NULL) { // sanity check; should always be true
rtpSink->enableRTCPReports() = True;
}
}
}
void PresentationTimeSessionNormalizer
::removePresentationTimeSubsessionNormalizer(PresentationTimeSubsessionNormalizer* ssNormalizer) {
// Unlink "ssNormalizer" from the linked list (starting with "fSubsessionNormalizers"):
if (fSubsessionNormalizers == ssNormalizer) {
fSubsessionNormalizers = fSubsessionNormalizers->fNext;
} else {
PresentationTimeSubsessionNormalizer** ssPtrPtr = &(fSubsessionNormalizers->fNext);
while (*ssPtrPtr != ssNormalizer) ssPtrPtr = &((*ssPtrPtr)->fNext);
*ssPtrPtr = (*ssPtrPtr)->fNext;
}
}
// PresentationTimeSubsessionNormalizer:
PresentationTimeSubsessionNormalizer
::PresentationTimeSubsessionNormalizer(PresentationTimeSessionNormalizer& parent, FramedSource* inputSource, RTPSource* rtpSource,
char const* codecName, PresentationTimeSubsessionNormalizer* next)
: FramedFilter(parent.envir(), inputSource),
fParent(parent), fRTPSource(rtpSource), fRTPSink(NULL), fCodecName(codecName), fNext(next) {
}
PresentationTimeSubsessionNormalizer::~PresentationTimeSubsessionNormalizer() {
fParent.removePresentationTimeSubsessionNormalizer(this);
}
void PresentationTimeSubsessionNormalizer::afterGettingFrame(void* clientData, unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
((PresentationTimeSubsessionNormalizer*)clientData)
->afterGettingFrame(frameSize, numTruncatedBytes, presentationTime, durationInMicroseconds);
}
void PresentationTimeSubsessionNormalizer::afterGettingFrame(unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
// This filter is implemented by passing all frames through unchanged, except that "fPresentationTime" is changed:
fFrameSize = frameSize;
fNumTruncatedBytes = numTruncatedBytes;
fDurationInMicroseconds = durationInMicroseconds;
fParent.normalizePresentationTime(this, fPresentationTime, presentationTime);
// Hack for JPEG/RTP proxying. Because we're proxying JPEG by just copying the raw JPEG/RTP payloads, without interpreting them,
// we need to also 'copy' the RTP 'M' (marker) bit from the "RTPSource" to the "RTPSink":
if (fRTPSource->curPacketMarkerBit() && strcmp(fCodecName, "JPEG") == 0) ((SimpleRTPSink*)fRTPSink)->setMBitOnNextPacket();
// Complete delivery:
FramedSource::afterGetting(this);
}
void PresentationTimeSubsessionNormalizer::doGetNextFrame() {
fInputSource->getNextFrame(fTo, fMaxSize, afterGettingFrame, this, FramedSource::handleClosure, this);
}
// live/liveMedia/MP3AudioMatroskaFileServerMediaSubsession.hh
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A 'ServerMediaSubsession' object that creates new, unicast, "RTPSink"s
// on demand, from an MP3 audio track within a Matroska file.
// (Actually, MPEG-1 or MPEG-2 audio should also work.)
// C++ header
#ifndef _MP3_AUDIO_MATROSKA_FILE_SERVER_MEDIA_SUBSESSION_HH
#define _MP3_AUDIO_MATROSKA_FILE_SERVER_MEDIA_SUBSESSION_HH
#ifndef _MP3_AUDIO_FILE_SERVER_MEDIA_SUBSESSION_HH
#include "MP3AudioFileServerMediaSubsession.hh"
#endif
#ifndef _MATROSKA_FILE_SERVER_DEMUX_HH
#include "MatroskaFileServerDemux.hh"
#endif
class MP3AudioMatroskaFileServerMediaSubsession: public MP3AudioFileServerMediaSubsession {
public:
static MP3AudioMatroskaFileServerMediaSubsession*
createNew(MatroskaFileServerDemux& demux, MatroskaTrack* track,
Boolean generateADUs = False, Interleaving* interleaving = NULL);
// Note: "interleaving" is used only if "generateADUs" is True,
// (and a value of NULL means 'no interleaving')
private:
MP3AudioMatroskaFileServerMediaSubsession(MatroskaFileServerDemux& demux, MatroskaTrack* track,
Boolean generateADUs, Interleaving* interleaving);
// called only by createNew();
virtual ~MP3AudioMatroskaFileServerMediaSubsession();
private: // redefined virtual functions
virtual void seekStreamSource(FramedSource* inputSource, double& seekNPT, double streamDuration, u_int64_t& numBytes);
virtual FramedSource* createNewStreamSource(unsigned clientSessionId,
unsigned& estBitrate);
private:
MatroskaFileServerDemux& fOurDemux;
unsigned fTrackNumber;
};
#endif
live/liveMedia/MP3Internals.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// MP3 internal implementation details
// Implementation
#include "MP3InternalsHuffman.hh"
#include <assert.h>
#include <string.h>
#include <stdio.h>
#include <stdlib.h>
// This is crufty old code that needs to be cleaned up #####
static unsigned const live_tabsel[2][3][16] = {
{ {32,32,64,96,128,160,192,224,256,288,320,352,384,416,448,448},
{32,32,48,56, 64, 80, 96,112,128,160,192,224,256,320,384,384},
{32,32,40,48, 56, 64, 80, 96,112,128,160,192,224,256,320,320} },
{ {32,32,48,56,64,80,96,112,128,144,160,176,192,224,256,256},
{8,8,16,24,32,40,48,56,64,80,96,112,128,144,160,160},
{8,8,16,24,32,40,48,56,64,80,96,112,128,144,160,160} }
};
/* Note: live_tabsel[*][*][0 or 15] shouldn't occur; use dummy values there */
static long const live_freqs[]
= { 44100, 48000, 32000, 22050, 24000, 16000, 11025, 12000, 8000, 0 };
struct bandInfoStruct {
int longIdx[23];
int longDiff[22];
int shortIdx[14];
int shortDiff[13];
};
static struct bandInfoStruct const bandInfo[7] = {
/* MPEG 1.0 */
{ {0,4,8,12,16,20,24,30,36,44,52,62,74, 90,110,134,162,196,238,288,342,418,576},
{4,4,4,4,4,4,6,6,8, 8,10,12,16,20,24,28,34,42,50,54, 76,158},
{0,4*3,8*3,12*3,16*3,22*3,30*3,40*3,52*3,66*3, 84*3,106*3,136*3,192*3},
{4,4,4,4,6,8,10,12,14,18,22,30,56} } ,
{ {0,4,8,12,16,20,24,30,36,42,50,60,72, 88,106,128,156,190,230,276,330,384,576},
{4,4,4,4,4,4,6,6,6, 8,10,12,16,18,22,28,34,40,46,54, 54,192},
{0,4*3,8*3,12*3,16*3,22*3,28*3,38*3,50*3,64*3, 80*3,100*3,126*3,192*3},
{4,4,4,4,6,6,10,12,14,16,20,26,66} } ,
{ {0,4,8,12,16,20,24,30,36,44,54,66,82,102,126,156,194,240,296,364,448,550,576} ,
{4,4,4,4,4,4,6,6,8,10,12,16,20,24,30,38,46,56,68,84,102, 26} ,
{0,4*3,8*3,12*3,16*3,22*3,30*3,42*3,58*3,78*3,104*3,138*3,180*3,192*3} ,
{4,4,4,4,6,8,12,16,20,26,34,42,12} } ,
/* MPEG 2.0 */
{ {0,6,12,18,24,30,36,44,54,66,80,96,116,140,168,200,238,284,336,396,464,522,576},
{6,6,6,6,6,6,8,10,12,14,16,20,24,28,32,38,46,52,60,68,58,54 } ,
{0,4*3,8*3,12*3,18*3,24*3,32*3,42*3,56*3,74*3,100*3,132*3,174*3,192*3} ,
{4,4,4,6,6,8,10,14,18,26,32,42,18 } } ,
{ {0,6,12,18,24,30,36,44,54,66,80,96,114,136,162,194,232,278,330,394,464,540,576},
{6,6,6,6,6,6,8,10,12,14,16,18,22,26,32,38,46,52,64,70,76,36 } ,
{0,4*3,8*3,12*3,18*3,26*3,36*3,48*3,62*3,80*3,104*3,136*3,180*3,192*3} ,
{4,4,4,6,8,10,12,14,18,24,32,44,12 } } ,
{ {0,6,12,18,24,30,36,44,54,66,80,96,116,140,168,200,238,284,336,396,464,522,576},
{6,6,6,6,6,6,8,10,12,14,16,20,24,28,32,38,46,52,60,68,58,54 },
{0,4*3,8*3,12*3,18*3,26*3,36*3,48*3,62*3,80*3,104*3,134*3,174*3,192*3},
{4,4,4,6,8,10,12,14,18,24,30,40,18 } } ,
/* MPEG 2.5, wrong! table (it's just a copy of MPEG 2.0/44.1kHz) */
{ {0,6,12,18,24,30,36,44,54,66,80,96,116,140,168,200,238,284,336,396,464,522,576},
{6,6,6,6,6,6,8,10,12,14,16,20,24,28,32,38,46,52,60,68,58,54 } ,
{0,4*3,8*3,12*3,18*3,24*3,32*3,42*3,56*3,74*3,100*3,132*3,174*3,192*3} ,
{4,4,4,6,6,8,10,14,18,26,32,42,18 } } ,
};
unsigned int n_slen2[512]; /* MPEG 2.0 slen for 'normal' mode */
unsigned int i_slen2[256]; /* MPEG 2.0 slen for intensity stereo */
#define MPG_MD_MONO 3
////////// MP3FrameParams //////////
MP3FrameParams::MP3FrameParams()
: bv(frameBytes, 0, sizeof frameBytes) /* by default */ {
oldHdr = firstHdr = 0;
static Boolean doneInit = False;
if (doneInit) return;
int i,j,k,l;
for (i=0;i<5;i++) {
for (j=0;j<6;j++) {
for (k=0;k<6;k++) {
int n = k + j * 6 + i * 36;
i_slen2[n] = i|(j<<3)|(k<<6)|(3<<12);
}
}
}
for (i=0;i<4;i++) {
for (j=0;j<4;j++) {
for (k=0;k<4;k++) {
int n = k + j * 4 + i * 16;
i_slen2[n+180] = i|(j<<3)|(k<<6)|(4<<12);
}
}
}
for (i=0;i<4;i++) {
for (j=0;j<3;j++) {
int n = j + i * 3;
i_slen2[n+244] = i|(j<<3) | (5<<12);
n_slen2[n+500] = i|(j<<3) | (2<<12) | (1<<15);
}
}
for (i=0;i<5;i++) {
for (j=0;j<5;j++) {
for (k=0;k<4;k++) {
for (l=0;l<4;l++) {
int n = l + k * 4 + j * 16 + i * 80;
n_slen2[n] = i|(j<<3)|(k<<6)|(l<<9)|(0<<12);
}
}
}
}
for (i=0;i<5;i++) {
for (j=0;j<5;j++) {
for (k=0;k<4;k++) {
int n = k + j * 4 + i * 20;
n_slen2[n+400] = i|(j<<3)|(k<<6)|(1<<12);
}
}
}
doneInit = True;
}
MP3FrameParams::~MP3FrameParams() {
}
void MP3FrameParams::setParamsFromHeader() {
if (hdr & (1<<20)) {
isMPEG2 = (hdr & (1<<19)) ? 0x0 : 0x1;
isMPEG2_5 = 0;
}
else {
isMPEG2 = 1;
isMPEG2_5 = 1;
}
layer = 4-((hdr>>17)&3);
if (layer == 4) layer = 3; // layer==4 is not allowed
bitrateIndex = ((hdr>>12)&0xf);
if (isMPEG2_5) {
samplingFreqIndex = ((hdr>>10)&0x3) + 6;
} else {
samplingFreqIndex = ((hdr>>10)&0x3) + (isMPEG2*3);
}
hasCRC = (hdr & 0x10000) == 0;
padding = ((hdr>>9)&0x1);
extension = ((hdr>>8)&0x1);
mode = ((hdr>>6)&0x3);
mode_ext = ((hdr>>4)&0x3);
copyright = ((hdr>>3)&0x1);
original = ((hdr>>2)&0x1);
emphasis = hdr & 0x3;
stereo = (mode == MPG_MD_MONO) ? 1 : 2;
if (((hdr>>10)&0x3) == 0x3) {
#ifdef DEBUG_ERRORS
fprintf(stderr,"Stream error - hdr: 0x%08x\n", hdr);
#endif
}
bitrate = live_tabsel[isMPEG2][layer-1][bitrateIndex];
samplingFreq = live_freqs[samplingFreqIndex];
isStereo = (stereo > 1);
isFreeFormat = (bitrateIndex == 0);
frameSize
= ComputeFrameSize(bitrate, samplingFreq, padding, isMPEG2, layer);
sideInfoSize = computeSideInfoSize();
}
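// For reference, the header bit positions unpacked above (bit 31 = MSB of the
// 4-byte header; the sync bits 31-21 are assumed already checked by the caller):
//   bit 20:     0 => MPEG-2.5         bit 19:     0 => MPEG-2 (when bit 20 is 1)
//   bits 18-17: layer (as 4 - layer)  bit 16:     0 => CRC present
//   bits 15-12: bitrate index         bits 11-10: sampling frequency index
//   bit 9:      padding               bit 8:      private/extension bit
//   bits 7-6:   mode                  bits 5-4:   mode extension
//   bit 3:      copyright             bit 2:      original
//   bits 1-0:   emphasis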
unsigned MP3FrameParams::computeSideInfoSize() {
unsigned size;
if (isMPEG2) {
size = isStereo ? 17 : 9;
} else {
size = isStereo ? 32 : 17;
}
if (hasCRC) {
size += 2;
}
return size;
}
unsigned ComputeFrameSize(unsigned bitrate, unsigned samplingFreq,
Boolean usePadding, Boolean isMPEG2,
unsigned char layer) {
if (samplingFreq == 0) return 0;
unsigned const bitrateMultiplier = (layer == 1) ? 12000*4 : 144000;
unsigned framesize;
framesize = bitrate*bitrateMultiplier;
framesize /= samplingFreq<<(isMPEG2 ? 1 : 0);
framesize = framesize + usePadding - 4;
return framesize;
}
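// Worked example of the computation above: a 128 kbps, 44100 Hz, MPEG-1
// Layer III frame without padding gives
//   framesize = 128*144000/44100 = 417 (truncated), then 417 + 0 - 4 = 413
// i.e., 413 bytes following the 4-byte header - consistent with the familiar
// 417-byte total frame size for 128 kbps at 44.1 kHz.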
#define TRUNC_FAIRLY
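// The TRUNC_* settings select how "updateSideInfoSizes()" (below) distributes
// the truncation between the two granules when they don't both fit:
//   TRUNC_FAIRLY - split in proportion to each granule's "part2_3_length"
//   TRUNC_FAVOR0 / TRUNC_ONLY0 - take bits from granule 1 first
//   TRUNC_FAVOR1 / TRUNC_ONLY1 - take bits from granule 0 first
// Only one of these should be defined at a time.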
static unsigned updateSideInfoSizes(MP3SideInfo& sideInfo, Boolean isMPEG2,
unsigned char const* mainDataPtr,
unsigned allowedNumBits,
unsigned& part23Length0a,
unsigned& part23Length0aTruncation,
unsigned& part23Length0b,
unsigned& part23Length0bTruncation,
unsigned& part23Length1a,
unsigned& part23Length1aTruncation,
unsigned& part23Length1b,
unsigned& part23Length1bTruncation) {
unsigned p23L0, p23L1 = 0, p23L0Trunc = 0, p23L1Trunc = 0;
p23L0 = sideInfo.ch[0].gr[0].part2_3_length;
p23L1 = isMPEG2 ? 0 : sideInfo.ch[0].gr[1].part2_3_length;
#ifdef TRUNC_ONLY0
if (p23L0 < allowedNumBits)
allowedNumBits = p23L0;
#endif
#ifdef TRUNC_ONLY1
if (p23L1 < allowedNumBits)
allowedNumBits = p23L1;
#endif
if (p23L0 + p23L1 > allowedNumBits) {
/* We need to shorten one or both fields */
unsigned truncation = p23L0 + p23L1 - allowedNumBits;
#ifdef TRUNC_FAIRLY
p23L0Trunc = (truncation*p23L0)/(p23L0 + p23L1);
p23L1Trunc = truncation - p23L0Trunc;
#endif
#if defined(TRUNC_FAVOR0) || defined(TRUNC_ONLY0)
p23L1Trunc = (truncation>p23L1) ? p23L1 : truncation;
p23L0Trunc = truncation - p23L1Trunc;
#endif
#if defined(TRUNC_FAVOR1) || defined(TRUNC_ONLY1)
p23L0Trunc = (truncation>p23L0) ? p23L0 : truncation;
p23L1Trunc = truncation - p23L0Trunc;
#endif
}
/* ASSERT: (p23L0Trunc <= p23L0) && (p23l1Trunc <= p23L1) */
p23L0 -= p23L0Trunc; p23L1 -= p23L1Trunc;
#ifdef DEBUG
fprintf(stderr, "updateSideInfoSizes (allowed: %d): %d->%d, %d->%d\n", allowedNumBits, p23L0+p23L0Trunc, p23L0, p23L1+p23L1Trunc, p23L1);
#endif
// The truncations computed above are still estimates. We need to
// adjust them so that the new fields will continue to end on
// Huffman-encoded sample boundaries:
updateSideInfoForHuffman(sideInfo, isMPEG2, mainDataPtr,
p23L0, p23L1,
part23Length0a, part23Length0aTruncation,
part23Length0b, part23Length0bTruncation,
part23Length1a, part23Length1aTruncation,
part23Length1b, part23Length1bTruncation);
p23L0 = part23Length0a + part23Length0b;
p23L1 = part23Length1a + part23Length1b;
sideInfo.ch[0].gr[0].part2_3_length = p23L0;
sideInfo.ch[0].gr[1].part2_3_length = p23L1;
part23Length0bTruncation
+= sideInfo.ch[1].gr[0].part2_3_length; /* allow for stereo */
sideInfo.ch[1].gr[0].part2_3_length = 0; /* output mono */
sideInfo.ch[1].gr[1].part2_3_length = 0; /* output mono */
return p23L0 + p23L1;
}
Boolean GetADUInfoFromMP3Frame(unsigned char const* framePtr,
unsigned totFrameSize,
unsigned& hdr, unsigned& frameSize,
MP3SideInfo& sideInfo, unsigned& sideInfoSize,
unsigned& backpointer, unsigned& aduSize) {
if (totFrameSize < 4) return False; // there's not enough data
MP3FrameParams fr;
fr.hdr = ((unsigned)framePtr[0] << 24) | ((unsigned)framePtr[1] << 16)
| ((unsigned)framePtr[2] << 8) | (unsigned)framePtr[3];
fr.setParamsFromHeader();
fr.setBytePointer(framePtr + 4, totFrameSize - 4); // skip hdr
frameSize = 4 + fr.frameSize;
if (fr.layer != 3) {
// Special case for non-layer III frames
backpointer = 0;
sideInfoSize = 0;
aduSize = fr.frameSize;
return True;
}
sideInfoSize = fr.sideInfoSize;
if (totFrameSize < 4 + sideInfoSize) return False; // not enough data
fr.getSideInfo(sideInfo);
hdr = fr.hdr;
backpointer = sideInfo.main_data_begin;
unsigned numBits = sideInfo.ch[0].gr[0].part2_3_length;
numBits += sideInfo.ch[0].gr[1].part2_3_length;
numBits += sideInfo.ch[1].gr[0].part2_3_length;
numBits += sideInfo.ch[1].gr[1].part2_3_length;
aduSize = (numBits+7)/8;
#ifdef DEBUG
fprintf(stderr, "mp3GetADUInfoFromFrame: hdr: %08x, frameSize: %d, part2_3_lengths: %d,%d,%d,%d, aduSize: %d, backpointer: %d\n", hdr, frameSize, sideInfo.ch[0].gr[0].part2_3_length, sideInfo.ch[0].gr[1].part2_3_length, sideInfo.ch[1].gr[0].part2_3_length, sideInfo.ch[1].gr[1].part2_3_length, aduSize, backpointer);
#endif
return True;
}
static void getSideInfo1(MP3FrameParams& fr, MP3SideInfo& si,
int stereo, int ms_stereo, long sfreq,
int /*single*/) {
int ch, gr;
#if 0
int powdiff = (single == 3) ? 4 : 0;
#endif
/* initialize all four "part2_3_length" fields to zero: */
si.ch[0].gr[0].part2_3_length = 0; si.ch[1].gr[0].part2_3_length = 0;
si.ch[0].gr[1].part2_3_length = 0; si.ch[1].gr[1].part2_3_length = 0;
si.main_data_begin = fr.getBits(9);
if (stereo == 1)
si.private_bits = fr.getBits(5);
else
si.private_bits = fr.getBits(3);
for (ch=0; ch<stereo; ch++) {
si.ch[ch].gr[0].scfsi = -1;
si.ch[ch].gr[1].scfsi = fr.getBits(4);
}
for (gr=0; gr<2; gr++) {
for (ch=0; ch<stereo; ch++) {
MP3SideInfo::gr_info_s_t& gr_info = si.ch[ch].gr[gr];
gr_info.part2_3_length = fr.getBits(12);
gr_info.big_values = fr.getBits(9);
gr_info.global_gain = fr.getBits(8);
gr_info.scalefac_compress = fr.getBits(4);
/* window-switching flag == 1 for block_type != 0; block_type == 0 => win-sw-flag = 0 */
gr_info.window_switching_flag = fr.get1Bit();
if (gr_info.window_switching_flag) {
int i;
gr_info.block_type = fr.getBits(2);
gr_info.mixed_block_flag = fr.get1Bit();
gr_info.table_select[0] = fr.getBits(5);
gr_info.table_select[1] = fr.getBits(5);
/*
* table_select[2] not needed, because there is no region2,
* but we set it anyway, to satisfy some verification tools.
*/
gr_info.table_select[2] = 0;
for (i=0;i<3;i++) {
gr_info.subblock_gain[i] = fr.getBits(3);
gr_info.full_gain[i]
= gr_info.pow2gain + ((gr_info.subblock_gain[i])<<3);
}
#ifdef DEBUG_ERRORS
if (gr_info.block_type == 0) {
fprintf(stderr,"Blocktype == 0 and window-switching == 1 not allowed.\n");
}
#endif
/* region_count/start parameters are implicit in this case. */
gr_info.region1start = 36>>1;
gr_info.region2start = 576>>1;
}
else
{
int i,r0c,r1c;
for (i=0; i<3; i++) {
gr_info.table_select[i] = fr.getBits(5);
}
r0c = gr_info.region0_count = fr.getBits(4);
r1c = gr_info.region1_count = fr.getBits(3);
gr_info.region1start = bandInfo[sfreq].longIdx[r0c+1] >> 1 ;
gr_info.region2start = bandInfo[sfreq].longIdx[r0c+1+r1c+1] >> 1;
gr_info.block_type = 0;
gr_info.mixed_block_flag = 0;
}
gr_info.preflag = fr.get1Bit();
gr_info.scalefac_scale = fr.get1Bit();
gr_info.count1table_select = fr.get1Bit();
}
}
}
static void getSideInfo2(MP3FrameParams& fr, MP3SideInfo& si,
int stereo, int ms_stereo, long sfreq,
int /*single*/) {
int ch;
#if 0
int powdiff = (single == 3) ? 4 : 0;
#endif
/* initialize all four "part2_3_length" fields to zero: */
si.ch[0].gr[0].part2_3_length = 0; si.ch[1].gr[0].part2_3_length = 0;
si.ch[0].gr[1].part2_3_length = 0; si.ch[1].gr[1].part2_3_length = 0;
si.main_data_begin = fr.getBits(8);
if (stereo == 1)
si.private_bits = fr.get1Bit();
else
si.private_bits = fr.getBits(2);
for (ch=0; ch<stereo; ch++) {
MP3SideInfo::gr_info_s_t& gr_info = si.ch[ch].gr[0];
gr_info.part2_3_length = fr.getBits(12);
gr_info.big_values = fr.getBits(9);
gr_info.global_gain = fr.getBits(8);
gr_info.scalefac_compress = fr.getBits(9); // 9 bits here, vs 4 in MPEG-1
/* window-switching flag == 1 for block_type != 0; block_type == 0 => win-sw-flag = 0 */
gr_info.window_switching_flag = fr.get1Bit();
if (gr_info.window_switching_flag) {
int i;
gr_info.block_type = fr.getBits(2);
gr_info.mixed_block_flag = fr.get1Bit();
gr_info.table_select[0] = fr.getBits(5);
gr_info.table_select[1] = fr.getBits(5);
/*
* table_select[2] not needed, because there is no region2,
* but we set it anyway, to satisfy some verification tools.
*/
gr_info.table_select[2] = 0;
for (i=0;i<3;i++) {
gr_info.subblock_gain[i] = fr.getBits(3);
gr_info.full_gain[i]
= gr_info.pow2gain + ((gr_info.subblock_gain[i])<<3);
}
#ifdef DEBUG_ERRORS
if (gr_info.block_type == 0) {
fprintf(stderr,"Blocktype == 0 and window-switching == 1 not allowed.\n");
}
#endif
/* region_count/start parameters are implicit in this case. */
/* check this again! */
if (gr_info.block_type == 2)
gr_info.region1start = 36>>1;
else {
gr_info.region1start = 54>>1;
}
gr_info.region2start = 576>>1;
}
else
{
int i,r0c,r1c;
for (i=0; i<3; i++) {
gr_info.table_select[i] = fr.getBits(5);
}
r0c = gr_info.region0_count = fr.getBits(4);
r1c = gr_info.region1_count = fr.getBits(3);
gr_info.region1start = bandInfo[sfreq].longIdx[r0c+1] >> 1 ;
gr_info.region2start = bandInfo[sfreq].longIdx[r0c+1+r1c+1] >> 1;
gr_info.block_type = 0;
gr_info.mixed_block_flag = 0;
}
gr_info.scalefac_scale = fr.get1Bit();
gr_info.count1table_select = fr.get1Bit();
}
}
#define MPG_MD_JOINT_STEREO 1
void MP3FrameParams::getSideInfo(MP3SideInfo& si) {
// First skip over the CRC if present:
if (hasCRC) getBits(16);
int single = -1;
int ms_stereo;
int sfreq = samplingFreqIndex;
if (stereo == 1) {
single = 0;
}
ms_stereo = (mode == MPG_MD_JOINT_STEREO) && (mode_ext & 0x2);
if (isMPEG2) {
getSideInfo2(*this, si, stereo, ms_stereo, sfreq, single);
} else {
getSideInfo1(*this, si, stereo, ms_stereo, sfreq, single);
}
}
static void putSideInfo1(BitVector& bv,
MP3SideInfo const& si, Boolean isStereo) {
int ch, gr, i;
int stereo = isStereo ? 2 : 1;
bv.putBits(si.main_data_begin,9);
if (stereo == 1)
bv.putBits(si.private_bits, 5);
else
bv.putBits(si.private_bits, 3);
for (ch=0; ch<stereo; ch++) {
bv.putBits(si.ch[ch].gr[1].scfsi, 4);
}
for (gr=0; gr<2; gr++) {
for (ch=0; ch<stereo; ch++) {
MP3SideInfo::gr_info_s_t const& gr_info = si.ch[ch].gr[gr];
bv.putBits(gr_info.part2_3_length, 12);
bv.putBits(gr_info.big_values, 9);
bv.putBits(gr_info.global_gain, 8);
bv.putBits(gr_info.scalefac_compress, 4);
bv.put1Bit(gr_info.window_switching_flag);
if (gr_info.window_switching_flag) {
bv.putBits(gr_info.block_type, 2);
bv.put1Bit(gr_info.mixed_block_flag);
for (i=0; i<2; i++)
bv.putBits(gr_info.table_select[i], 5);
for (i=0; i<3; i++)
bv.putBits(gr_info.subblock_gain[i], 3);
} else {
for (i=0; i<3; i++)
bv.putBits(gr_info.table_select[i], 5);
bv.putBits(gr_info.region0_count, 4);
bv.putBits(gr_info.region1_count, 3);
}
bv.put1Bit(gr_info.preflag);
bv.put1Bit(gr_info.scalefac_scale);
bv.put1Bit(gr_info.count1table_select);
}
}
}
static void putSideInfo2(BitVector& bv,
MP3SideInfo const& si, Boolean isStereo) {
int ch, i;
int stereo = isStereo ? 2 : 1;
bv.putBits(si.main_data_begin,8);
if (stereo == 1)
bv.put1Bit(si.private_bits);
else
bv.putBits(si.private_bits, 2);
for (ch=0; ch<stereo; ch++) {
MP3SideInfo::gr_info_s_t const& gr_info = si.ch[ch].gr[0];
bv.putBits(gr_info.part2_3_length, 12);
bv.putBits(gr_info.big_values, 9);
bv.putBits(gr_info.global_gain, 8);
bv.putBits(gr_info.scalefac_compress, 9);
bv.put1Bit(gr_info.window_switching_flag);
if (gr_info.window_switching_flag) {
bv.putBits(gr_info.block_type, 2);
bv.put1Bit(gr_info.mixed_block_flag);
for (i=0; i<2; i++)
bv.putBits(gr_info.table_select[i], 5);
for (i=0; i<3; i++)
bv.putBits(gr_info.subblock_gain[i], 3);
} else {
for (i=0; i<3; i++)
bv.putBits(gr_info.table_select[i], 5);
bv.putBits(gr_info.region0_count, 4);
bv.putBits(gr_info.region1_count, 3);
}
bv.put1Bit(gr_info.scalefac_scale);
bv.put1Bit(gr_info.count1table_select);
}
}
void PutMP3SideInfoIntoFrame(MP3SideInfo const& sideInfo,
MP3FrameParams const& fr,
unsigned char* framePtr) {
BitVector bv(framePtr, 0, 8*fr.sideInfoSize);
if (fr.hasCRC) bv.skipBits(16); // skip over the CRC
if (fr.isMPEG2) {
putSideInfo2(bv, sideInfo, fr.isStereo);
} else {
putSideInfo1(bv, sideInfo, fr.isStereo);
}
}
static unsigned MP3BitrateToBitrateIndex(unsigned bitrate /* in kbps */,
Boolean isMPEG2) {
for (unsigned i = 1; i < 15; ++i) {
if (live_tabsel[isMPEG2][2][i] >= bitrate)
return i;
}
// "bitrate" was larger than any possible, so return the largest possible:
return 14;
}
static void outputHeader(unsigned char* toPtr, unsigned hdr) {
toPtr[0] = (unsigned char)(hdr>>24);
toPtr[1] = (unsigned char)(hdr>>16);
toPtr[2] = (unsigned char)(hdr>>8);
toPtr[3] = (unsigned char)(hdr);
}
static void assignADUBackpointer(MP3FrameParams const& fr,
unsigned aduSize,
MP3SideInfo& sideInfo,
unsigned& availableBytesForBackpointer) {
// Give the ADU as large a backpointer as possible:
unsigned maxBackpointerSize = fr.isMPEG2 ? 255 : 511; // the "main_data_begin" field is 8 bits in MPEG-2, 9 bits in MPEG-1
unsigned backpointerSize = availableBytesForBackpointer;
if (backpointerSize > maxBackpointerSize) {
backpointerSize = maxBackpointerSize;
}
// Store the new backpointer now:
sideInfo.main_data_begin = backpointerSize;
// Figure out how many bytes are available for the *next* ADU's backpointer:
availableBytesForBackpointer
= backpointerSize + fr.frameSize - fr.sideInfoSize ;
if (availableBytesForBackpointer < aduSize) {
availableBytesForBackpointer = 0;
} else {
availableBytesForBackpointer -= aduSize;
}
}
unsigned TranscodeMP3ADU(unsigned char const* fromPtr, unsigned fromSize,
unsigned toBitrate,
unsigned char* toPtr, unsigned toMaxSize,
unsigned& availableBytesForBackpointer) {
// Begin by parsing the input ADU's parameters:
unsigned hdr, inFrameSize, inSideInfoSize, backpointer, inAduSize;
MP3SideInfo sideInfo;
if (!GetADUInfoFromMP3Frame(fromPtr, fromSize,
hdr, inFrameSize, sideInfo, inSideInfoSize,
backpointer, inAduSize)) {
return 0;
}
fromPtr += (4+inSideInfoSize); // skip to 'main data'
// Alter the 4-byte MPEG header to reflect the output ADU:
// (different bitrate; mono; no CRC)
Boolean isMPEG2 = ((hdr&0x00080000) == 0);
unsigned toBitrateIndex = MP3BitrateToBitrateIndex(toBitrate, isMPEG2);
hdr &=~ 0xF000; hdr |= (toBitrateIndex<<12); // set bitrate index
hdr |= 0x10200; // turn on !error-prot and padding bits
hdr &=~ 0xC0; hdr |= 0xC0; // set mode to 3 (mono)
// Set up the rest of the parameters of the new ADU:
MP3FrameParams outFr;
outFr.hdr = hdr;
outFr.setParamsFromHeader();
// Figure out how big to make the output ADU:
unsigned inAveAduSize = inFrameSize - inSideInfoSize;
unsigned outAveAduSize = outFr.frameSize - outFr.sideInfoSize;
unsigned desiredOutAduSize /*=inAduSize*outAveAduSize/inAveAduSize*/
= (2*inAduSize*outAveAduSize + inAveAduSize)/(2*inAveAduSize);
// (adding half the divisor before dividing rounds the quotient to the nearest integer)
if (toMaxSize < (4 + outFr.sideInfoSize)) return 0;
unsigned maxOutAduSize = toMaxSize - (4 + outFr.sideInfoSize);
if (desiredOutAduSize > maxOutAduSize) {
desiredOutAduSize = maxOutAduSize;
}
// Figure out the new sizes of the various 'part23 lengths',
// and how much they are truncated:
unsigned part23Length0a, part23Length0aTruncation;
unsigned part23Length0b, part23Length0bTruncation;
unsigned part23Length1a, part23Length1aTruncation;
unsigned part23Length1b, part23Length1bTruncation;
unsigned numAduBits
= updateSideInfoSizes(sideInfo, outFr.isMPEG2,
fromPtr, 8*desiredOutAduSize,
part23Length0a, part23Length0aTruncation,
part23Length0b, part23Length0bTruncation,
part23Length1a, part23Length1aTruncation,
part23Length1b, part23Length1bTruncation);
#ifdef DEBUG
fprintf(stderr, "shrinkage %d->%d [(%d,%d),(%d,%d)] (trunc: [(%d,%d),(%d,%d)]) {%d}\n", inAduSize, (numAduBits+7)/8,
part23Length0a, part23Length0b, part23Length1a, part23Length1b,
part23Length0aTruncation, part23Length0bTruncation,
part23Length1aTruncation, part23Length1bTruncation,
maxOutAduSize);
#endif
unsigned actualOutAduSize = (numAduBits+7)/8;
// Give the new ADU an appropriate 'backpointer':
assignADUBackpointer(outFr, actualOutAduSize, sideInfo, availableBytesForBackpointer);
///// Now output the new ADU:
// 4-byte header
outputHeader(toPtr, hdr); toPtr += 4;
// side info
PutMP3SideInfoIntoFrame(sideInfo, outFr, toPtr); toPtr += outFr.sideInfoSize;
// 'main data', using the new lengths
unsigned toBitOffset = 0;
unsigned fromBitOffset = 0;
/* rebuild portion 0a: */
memmove(toPtr, fromPtr, (part23Length0a+7)/8);
toBitOffset += part23Length0a;
fromBitOffset += part23Length0a + part23Length0aTruncation;
/* rebuild portion 0b: */
shiftBits(toPtr, toBitOffset, fromPtr, fromBitOffset, part23Length0b);
toBitOffset += part23Length0b;
fromBitOffset += part23Length0b + part23Length0bTruncation;
/* rebuild portion 1a: */
shiftBits(toPtr, toBitOffset, fromPtr, fromBitOffset, part23Length1a);
toBitOffset += part23Length1a;
fromBitOffset += part23Length1a + part23Length1aTruncation;
/* rebuild portion 1b: */
shiftBits(toPtr, toBitOffset, fromPtr, fromBitOffset, part23Length1b);
toBitOffset += part23Length1b;
/* zero out any remaining bits (probably unnecessary, but...) */
unsigned char const zero = '\0';
shiftBits(toPtr, toBitOffset, &zero, 0,
actualOutAduSize*8 - numAduBits);
return 4 + outFr.sideInfoSize + actualOutAduSize;
}
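// A sketch of how a caller might drive TranscodeMP3ADU() over a sequence of
// ADUs (the helper names here are illustrative, not part of this library):
//   unsigned availableBytes = 0; // no backpointer space before the first ADU
//   while (haveNextInputADU(inBuf, inSize)) {
//     unsigned outSize = TranscodeMP3ADU(inBuf, inSize, 64/*kbps*/,
//                                        outBuf, outBufMaxSize, availableBytes);
//     if (outSize == 0) break; // the input didn't parse as an ADU
//     deliverOutputADU(outBuf, outSize);
//   }
// Note that "availableBytesForBackpointer" is carried from call to call, so
// each output ADU's backpointer accounts for the preceding ADUs' unused space.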
live/liveMedia/MP3Internals.hh:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// MP3 internal implementation details
// C++ header
#ifndef _MP3_INTERNALS_HH
#define _MP3_INTERNALS_HH
#ifndef _BOOLEAN_HH
#include "Boolean.hh"
#endif
#ifndef _BIT_VECTOR_HH
#include "BitVector.hh"
#endif
typedef struct MP3SideInfo {
unsigned main_data_begin;
unsigned private_bits;
typedef struct gr_info_s {
int scfsi;
unsigned part2_3_length;
unsigned big_values;
unsigned global_gain;
unsigned scalefac_compress;
unsigned window_switching_flag;
unsigned block_type;
unsigned mixed_block_flag;
unsigned table_select[3];
unsigned region0_count;
unsigned region1_count;
unsigned subblock_gain[3];
unsigned maxband[3];
unsigned maxbandl;
unsigned maxb;
unsigned region1start;
unsigned region2start;
unsigned preflag;
unsigned scalefac_scale;
unsigned count1table_select;
double *full_gain[3];
double *pow2gain;
} gr_info_s_t;
struct {
gr_info_s_t gr[2];
} ch[2];
} MP3SideInfo_t;
#define SBLIMIT 32
#define MAX_MP3_FRAME_SIZE 2500 /* also big enough for an 'ADU'ized frame */
class MP3FrameParams {
public:
MP3FrameParams();
~MP3FrameParams();
// 4-byte MPEG header:
unsigned hdr;
// a buffer that can be used to hold the rest of the frame:
unsigned char frameBytes[MAX_MP3_FRAME_SIZE];
// public parameters derived from the header
void setParamsFromHeader(); // this sets them
Boolean isMPEG2;
unsigned layer; // currently only 3 is supported
unsigned bitrate; // in kbps
unsigned samplingFreq;
Boolean isStereo;
Boolean isFreeFormat;
unsigned frameSize; // doesn't include the initial 4-byte header
unsigned sideInfoSize;
Boolean hasCRC;
void setBytePointer(unsigned char const* restOfFrame,
unsigned totNumBytes) {// called during setup
bv.setup((unsigned char*)restOfFrame, 0, 8*totNumBytes);
}
// other, public parameters used when parsing input (perhaps get rid of)
unsigned oldHdr, firstHdr;
// Extract (unpack) the side info from the frame into a struct:
void getSideInfo(MP3SideInfo& si);
// The bit pointer used for reading data from frame data
unsigned getBits(unsigned numBits) { return bv.getBits(numBits); }
unsigned get1Bit() { return bv.get1Bit(); }
private:
BitVector bv;
// other, private parameters derived from the header
unsigned bitrateIndex;
unsigned samplingFreqIndex;
Boolean isMPEG2_5;
Boolean padding;
Boolean extension;
unsigned mode;
unsigned mode_ext;
Boolean copyright;
Boolean original;
unsigned emphasis;
unsigned stereo;
private:
unsigned computeSideInfoSize();
};
unsigned ComputeFrameSize(unsigned bitrate, unsigned samplingFreq,
Boolean usePadding, Boolean isMPEG2,
unsigned char layer);
Boolean GetADUInfoFromMP3Frame(unsigned char const* framePtr,
unsigned totFrameSize,
unsigned& hdr, unsigned& frameSize,
MP3SideInfo& sideInfo, unsigned& sideInfoSize,
unsigned& backpointer, unsigned& aduSize);
Boolean ZeroOutMP3SideInfo(unsigned char* framePtr, unsigned totFrameSize,
unsigned newBackpointer);
unsigned TranscodeMP3ADU(unsigned char const* fromPtr, unsigned fromSize,
unsigned toBitrate,
unsigned char* toPtr, unsigned toMaxSize,
unsigned& availableBytesForBackpointer);
// returns the size of the resulting ADU (0 on failure)
#endif
live/liveMedia/MP3InternalsHuffman.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// MP3 internal implementation details (Huffman encoding)
// Implementation
#include "MP3InternalsHuffman.hh"
#include <assert.h>
#include <stdio.h>
#include <string.h>
MP3HuffmanEncodingInfo
::MP3HuffmanEncodingInfo(Boolean includeDecodedValues) {
if (includeDecodedValues) {
decodedValues = new unsigned[(SBLIMIT*SSLIMIT + 1)*4];
} else {
decodedValues = NULL;
}
}
MP3HuffmanEncodingInfo::~MP3HuffmanEncodingInfo() {
delete[] decodedValues;
}
// This is crufty old code that needs to be cleaned up #####
static unsigned debugCount = 0; /* for debugging */
#define TRUNC_FAVORa
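// The TRUNC_* settings below select how a granule's truncation is split
// between field "a" (the bits up to "bigvalStart", which include the scale
// factors) and field "b" (the remainder): TRUNC_FAIRLY splits proportionally
// while protecting the scale-factor bits, TRUNC_FAVORa truncates field b
// first, and TRUNC_FAVORb truncates field a first. Only one should be
// defined at a time.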
void updateSideInfoForHuffman(MP3SideInfo& sideInfo, Boolean isMPEG2,
unsigned char const* mainDataPtr,
unsigned p23L0, unsigned p23L1,
unsigned& part23Length0a,
unsigned& part23Length0aTruncation,
unsigned& part23Length0b,
unsigned& part23Length0bTruncation,
unsigned& part23Length1a,
unsigned& part23Length1aTruncation,
unsigned& part23Length1b,
unsigned& part23Length1bTruncation) {
int i, j;
unsigned sfLength, origTotABsize, adjustment;
MP3SideInfo::gr_info_s_t* gr;
/* First, Huffman-decode each part of the segment's main data,
to see at which bit-boundaries the samples appear:
*/
MP3HuffmanEncodingInfo hei;
++debugCount;
#ifdef DEBUG
fprintf(stderr, "usifh-start: p23L0: %d, p23L1: %d\n", p23L0, p23L1);
#endif
/* Process granule 0 */
{
gr = &(sideInfo.ch[0].gr[0]);
origTotABsize = gr->part2_3_length;
MP3HuffmanDecode(gr, isMPEG2, mainDataPtr, 0, origTotABsize, sfLength, hei);
/* Begin by computing new sizes for parts a & b (& their truncations) */
#ifdef DEBUG
fprintf(stderr, "usifh-0: %d, %d:%d, %d:%d, %d:%d, %d:%d, %d:%d\n",
hei.numSamples,
sfLength/8, sfLength%8,
hei.reg1Start/8, hei.reg1Start%8,
hei.reg2Start/8, hei.reg2Start%8,
hei.bigvalStart/8, hei.bigvalStart%8,
origTotABsize/8, origTotABsize%8);
#endif
if (p23L0 < sfLength) {
/* We can't use this, so give it all to the next granule: */
p23L1 += p23L0;
p23L0 = 0;
}
part23Length0a = hei.bigvalStart;
part23Length0b = origTotABsize - hei.bigvalStart;
part23Length0aTruncation = part23Length0bTruncation = 0;
if (origTotABsize > p23L0) {
/* We need to shorten one or both of fields a & b */
unsigned truncation = origTotABsize - p23L0;
#ifdef TRUNC_FAIRLY
part23Length0aTruncation = (truncation*(part23Length0a-sfLength))
/(origTotABsize-sfLength);
part23Length0bTruncation = truncation - part23Length0aTruncation;
#endif
#ifdef TRUNC_FAVORa
part23Length0bTruncation
= (truncation > part23Length0b) ? part23Length0b : truncation;
part23Length0aTruncation = truncation - part23Length0bTruncation;
#endif
#ifdef TRUNC_FAVORb
part23Length0aTruncation = (truncation > part23Length0a-sfLength)
? (part23Length0a-sfLength) : truncation;
part23Length0bTruncation = truncation - part23Length0aTruncation;
#endif
}
/* ASSERT: part23Length0xTruncation <= part23Length0x */
part23Length0a -= part23Length0aTruncation;
part23Length0b -= part23Length0bTruncation;
#ifdef DEBUG
fprintf(stderr, "usifh-0: interim sizes: %d (%d), %d (%d)\n",
part23Length0a, part23Length0aTruncation,
part23Length0b, part23Length0bTruncation);
#endif
/* Adjust these new lengths so they end on sample bit boundaries: */
for (i = 0; i < (int)hei.numSamples; ++i) {
if (hei.allBitOffsets[i] == part23Length0a) break;
else if (hei.allBitOffsets[i] > part23Length0a) {--i; break;}
}
if (i < 0) { /* should happen only if we couldn't fit sfLength */
i = 0; adjustment = 0;
} else {
adjustment = part23Length0a - hei.allBitOffsets[i];
}
#ifdef DEBUG
fprintf(stderr, "%d usifh-0: adjustment 1: %d\n", debugCount, adjustment);
#endif
part23Length0a -= adjustment;
part23Length0aTruncation += adjustment;
/* Assign the bits we just shaved to field b and granule 1: */
if (part23Length0bTruncation < adjustment) {
p23L1 += (adjustment - part23Length0bTruncation);
adjustment = part23Length0bTruncation;
}
part23Length0b += adjustment;
part23Length0bTruncation -= adjustment;
for (j = i; j < (int)hei.numSamples; ++j) {
if (hei.allBitOffsets[j]
== part23Length0a + part23Length0aTruncation + part23Length0b)
break;
else if (hei.allBitOffsets[j]
> part23Length0a + part23Length0aTruncation + part23Length0b)
{--j; break;}
}
if (j < 0) { /* should happen only if we couldn't fit sfLength */
j = 0; adjustment = 0;
} else {
adjustment = part23Length0a+part23Length0aTruncation+part23Length0b
- hei.allBitOffsets[j];
}
#ifdef DEBUG
fprintf(stderr, "%d usifh-0: adjustment 2: %d\n", debugCount, adjustment);
#endif
if (adjustment > part23Length0b) adjustment = part23Length0b; /*sanity*/
part23Length0b -= adjustment;
part23Length0bTruncation += adjustment;
/* Assign the bits we just shaved to granule 1 */
p23L1 += adjustment;
if (part23Length0aTruncation > 0) {
/* Change the granule's 'big_values' field to reflect the truncation */
gr->big_values = i;
}
}
/* Process granule 1 (MPEG-1 only) */
if (isMPEG2) {
part23Length1a = part23Length1b = 0;
part23Length1aTruncation = part23Length1bTruncation = 0;
} else {
unsigned granule1Offset
= origTotABsize + sideInfo.ch[1].gr[0].part2_3_length;
gr = &(sideInfo.ch[0].gr[1]);
origTotABsize = gr->part2_3_length;
MP3HuffmanDecode(gr, isMPEG2, mainDataPtr, granule1Offset,
origTotABsize, sfLength, hei);
/* Begin by computing new sizes for parts a & b (& their truncations) */
#ifdef DEBUG
fprintf(stderr, "usifh-1: %d, %d:%d, %d:%d, %d:%d, %d:%d, %d:%d\n",
hei.numSamples,
sfLength/8, sfLength%8,
hei.reg1Start/8, hei.reg1Start%8,
hei.reg2Start/8, hei.reg2Start%8,
hei.bigvalStart/8, hei.bigvalStart%8,
origTotABsize/8, origTotABsize%8);
#endif
if (p23L1 < sfLength) {
/* We can't use this, so give up on this granule: */
p23L1 = 0;
}
part23Length1a = hei.bigvalStart;
part23Length1b = origTotABsize - hei.bigvalStart;
part23Length1aTruncation = part23Length1bTruncation = 0;
if (origTotABsize > p23L1) {
/* We need to shorten one or both of fields a & b */
unsigned truncation = origTotABsize - p23L1;
#ifdef TRUNC_FAIRLY
part23Length1aTruncation = (truncation*(part23Length1a-sfLength))
/(origTotABsize-sfLength);
part23Length1bTruncation = truncation - part23Length1aTruncation;
#endif
#ifdef TRUNC_FAVORa
part23Length1bTruncation
= (truncation > part23Length1b) ? part23Length1b : truncation;
part23Length1aTruncation = truncation - part23Length1bTruncation;
#endif
#ifdef TRUNC_FAVORb
part23Length1aTruncation = (truncation > part23Length1a-sfLength)
? (part23Length1a-sfLength) : truncation;
part23Length1bTruncation = truncation - part23Length1aTruncation;
#endif
}
/* ASSERT: part23Length1xTruncation <= part23Length1x */
part23Length1a -= part23Length1aTruncation;
part23Length1b -= part23Length1bTruncation;
#ifdef DEBUG
fprintf(stderr, "usifh-1: interim sizes: %d (%d), %d (%d)\n",
part23Length1a, part23Length1aTruncation,
part23Length1b, part23Length1bTruncation);
#endif
/* Adjust these new lengths so they end on sample bit boundaries: */
for (i = 0; i < (int)hei.numSamples; ++i) {
if (hei.allBitOffsets[i] == part23Length1a) break;
else if (hei.allBitOffsets[i] > part23Length1a) {--i; break;}
}
if (i < 0) { /* should happen only if we couldn't fit sfLength */
i = 0; adjustment = 0;
} else {
adjustment = part23Length1a - hei.allBitOffsets[i];
}
#ifdef DEBUG
fprintf(stderr, "%d usifh-1: adjustment 0: %d\n", debugCount, adjustment);
#endif
part23Length1a -= adjustment;
part23Length1aTruncation += adjustment;
/* Assign the bits we just shaved to field b: */
if (part23Length1bTruncation < adjustment) {
adjustment = part23Length1bTruncation;
}
part23Length1b += adjustment;
part23Length1bTruncation -= adjustment;
for (j = i; j < (int)hei.numSamples; ++j) {
if (hei.allBitOffsets[j]
== part23Length1a + part23Length1aTruncation + part23Length1b)
break;
else if (hei.allBitOffsets[j]
> part23Length1a + part23Length1aTruncation + part23Length1b)
{--j; break;}
}
if (j < 0) { /* should happen only if we couldn't fit sfLength */
j = 0; adjustment = 0;
} else {
adjustment = part23Length1a+part23Length1aTruncation+part23Length1b
- hei.allBitOffsets[j];
}
#ifdef DEBUG
fprintf(stderr, "%d usifh-1: adjustment 1: %d\n", debugCount, adjustment);
#endif
if (adjustment > part23Length1b) adjustment = part23Length1b; /*sanity*/
part23Length1b -= adjustment;
part23Length1bTruncation += adjustment;
if (part23Length1aTruncation > 0) {
/* Change the granule's 'big_values' field to reflect the truncation */
gr->big_values = i;
}
}
#ifdef DEBUG
fprintf(stderr, "usifh-end, new vals: %d (%d), %d (%d), %d (%d), %d (%d)\n",
part23Length0a, part23Length0aTruncation,
part23Length0b, part23Length0bTruncation,
part23Length1a, part23Length1aTruncation,
part23Length1b, part23Length1bTruncation);
#endif
}
static void rsf_getline(char* line, unsigned max, unsigned char**fi) {
unsigned i;
for (i = 0; i < max; ++i) {
line[i] = *(*fi)++;
if (line[i] == '\n') {
line[i++] = '\0';
return;
}
}
line[i] = '\0';
}
static void rsfscanf(unsigned char **fi, unsigned int* v) {
while (sscanf((char*)*fi, "%x", v) == 0) {
/* skip past the next '\0' */
while (*(*fi)++ != '\0') {}
}
/* skip past any white-space before the value: */
while (*(*fi) <= ' ') ++(*fi);
/* skip past the value: */
while (*(*fi) > ' ') ++(*fi);
}
#define HUFFBITS unsigned long int
#define SIZEOF_HUFFBITS 4
#define HTN 34
#define MXOFF 250
struct huffcodetab {
char tablename[3]; /* string containing the table name */
unsigned int xlen; /* number of x values (max. x-index + 1) */
unsigned int ylen; /* number of y values (max. y-index + 1) */
unsigned int linbits; /* number of linbits */
unsigned int linmax; /* max number to be stored in linbits */
int ref; /* a positive value indicates a reference */
HUFFBITS *table; /* pointer to array[xlen][ylen] */
unsigned char *hlen; /* pointer to array[xlen][ylen] */
unsigned char(*val)[2]; /* decoder tree */
unsigned int treelen; /* length of decoder tree */
};
static struct huffcodetab rsf_ht[HTN]; // array of all huffcodetable headers
/* 0..31 Huffman code table 0..31 */
/* 32,33 count1-tables */
/* read the huffman decoder table */
static int read_decoder_table(unsigned char* fi) {
int n,i,nn,t;
unsigned int v0,v1;
char command[100],line[100];
for (n=0;n<HTN;n++) {
/* ... (the remainder of read_decoder_table(), initialize_huffman(),
and the slen[][] table plus the opening of rsf_get_scale_factors_1()
are missing from this copy of the source) ... */
int num0 = slen[0][gr_info->scalefac_compress];
int num1 = slen[1][gr_info->scalefac_compress];
if (gr_info->block_type == 2)
{
numbits = (num0 + num1) * 18;
if (gr_info->mixed_block_flag) {
numbits -= num0; /* num0 * 17 + num1 * 18 */
}
}
else
{
int scfsi = gr_info->scfsi;
if(scfsi < 0) { /* scfsi < 0 => granule == 0 */
numbits = (num0 + num1) * 10 + num0;
}
else {
numbits = 0;
/* a set scfsi bit means that scale-factor group is copied from granule 0,
so it contributes no bits here: */
if(!(scfsi & 0x8)) numbits += num0 * 6;
if(!(scfsi & 0x4)) numbits += num0 * 5;
if(!(scfsi & 0x2)) numbits += num1 * 5;
if(!(scfsi & 0x1)) numbits += num1 * 5;
}
}
return numbits;
}
extern unsigned n_slen2[];
extern unsigned i_slen2[];
static unsigned rsf_get_scale_factors_2(MP3SideInfo::gr_info_s_t *gr_info) {
unsigned char const* pnt;
int i;
unsigned int slen;
int n = 0;
int numbits = 0;
/* scale-factor band counts per block-type class (from the mpg123 decoder,
on which this code is based): */
static unsigned char const stab[3][6][4] = {
{ { 6, 5, 5, 5} , { 6, 5, 7, 3} , {11,10, 0, 0} ,
{ 7, 7, 7, 0} , { 6, 6, 6, 3} , { 8, 8, 5, 0} } ,
{ { 9, 9, 9, 9} , { 9, 9,12, 6} , {18,18, 0, 0} ,
{12,12,12, 0} , {12, 9, 9, 6} , {15,12, 9, 0} } ,
{ { 6, 9, 9, 9} , { 6, 9,12, 6} , {15,18, 0, 0} ,
{ 6,15,12, 0} , { 6,12, 9, 6} , { 6,18, 9, 0} } };
slen = n_slen2[gr_info->scalefac_compress];
gr_info->preflag = (slen>>15) & 0x1;
n = 0;
if( gr_info->block_type == 2 ) {
n++;
if(gr_info->mixed_block_flag)
n++;
}
pnt = stab[n][(slen>>12)&0x7];
for(i=0;i<4;i++) {
int num = slen & 0x7;
slen >>= 3;
numbits += pnt[i] * num;
}
return numbits;
}
static unsigned getScaleFactorsLength(MP3SideInfo::gr_info_s_t* gr,
Boolean isMPEG2) {
return isMPEG2 ? rsf_get_scale_factors_2(gr)
: rsf_get_scale_factors_1(gr);
}
static int rsf_huffman_decoder(BitVector& bv,
struct huffcodetab const* h,
int* x, int* y, int* v, int* w); // forward
void MP3HuffmanDecode(MP3SideInfo::gr_info_s_t* gr, Boolean isMPEG2,
unsigned char const* fromBasePtr,
unsigned fromBitOffset, unsigned fromLength,
unsigned& scaleFactorsLength,
MP3HuffmanEncodingInfo& hei) {
unsigned i;
int x, y, v, w;
struct huffcodetab *h;
BitVector bv((unsigned char*)fromBasePtr, fromBitOffset, fromLength);
/* Compute the size of the scale factors (& also advance bv): */
scaleFactorsLength = getScaleFactorsLength(gr, isMPEG2);
bv.skipBits(scaleFactorsLength);
initialize_huffman();
hei.reg1Start = hei.reg2Start = hei.numSamples = 0;
/* Read bigvalues area. */
if (gr->big_values < gr->region1start + gr->region2start) {
gr->big_values = gr->region1start + gr->region2start; /* sanity check */
}
for (i = 0; i < gr->big_values; ++i) {
if (i < gr->region1start) {
/* in region 0 */
h = &rsf_ht[gr->table_select[0]];
} else if (i < gr->region2start) {
/* in region 1 */
h = &rsf_ht[gr->table_select[1]];
if (hei.reg1Start == 0) {
hei.reg1Start = bv.curBitIndex();
}
} else {
/* in region 2 */
h = &rsf_ht[gr->table_select[2]];
if (hei.reg2Start == 0) {
hei.reg2Start = bv.curBitIndex();
}
}
hei.allBitOffsets[i] = bv.curBitIndex();
rsf_huffman_decoder(bv, h, &x, &y, &v, &w);
if (hei.decodedValues != NULL) {
// Record the decoded values:
unsigned* ptr = &hei.decodedValues[4*i];
ptr[0] = x; ptr[1] = y; ptr[2] = v; ptr[3] = w;
}
}
hei.bigvalStart = bv.curBitIndex();
/* Read count1 area. */
h = &rsf_ht[gr->count1table_select+32];
while (bv.curBitIndex() < bv.totNumBits() && i < SSLIMIT*SBLIMIT) {
hei.allBitOffsets[i] = bv.curBitIndex();
rsf_huffman_decoder(bv, h, &x, &y, &v, &w);
if (hei.decodedValues != NULL) {
// Record the decoded values:
unsigned* ptr = &hei.decodedValues[4*i];
ptr[0] = x; ptr[1] = y; ptr[2] = v; ptr[3] = w;
}
++i;
}
hei.allBitOffsets[i] = bv.curBitIndex();
hei.numSamples = i;
}
HUFFBITS dmask = 1 << (SIZEOF_HUFFBITS*8-1);
unsigned int hs = SIZEOF_HUFFBITS*8;
/* do the huffman-decoding */
static int rsf_huffman_decoder(BitVector& bv,
struct huffcodetab const* h, // ptr to huffman code record
/* unsigned */ int *x, // returns decoded x value
/* unsigned */ int *y, // returns decoded y value
int* v, int* w) {
HUFFBITS level;
unsigned point = 0;
int error = 1;
level = dmask;
*x = *y = *v = *w = 0;
if (h->val == NULL) return 2;
/* table 0 needs no bits */
if (h->treelen == 0) return 0;
/* Lookup in Huffman table. */
do {
if (h->val[point][0]==0) { /*end of tree*/
*x = h->val[point][1] >> 4;
*y = h->val[point][1] & 0xf;
error = 0;
break;
}
if (bv.get1Bit()) {
while (h->val[point][1] >= MXOFF) point += h->val[point][1];
point += h->val[point][1];
}
else {
while (h->val[point][0] >= MXOFF) point += h->val[point][0];
point += h->val[point][0];
}
level >>= 1;
} while (level || (point < h->treelen) );
/* Check for error. */
if (error) { /* set x and y to a medium value as a simple concealment */
fprintf(stderr, "Illegal Huffman code in data.\n");
*x = ((h->xlen-1) << 1);
*y = ((h->ylen-1) << 1);
}
/* Process sign encodings for quadruples tables. */
if (h->tablename[0] == '3'
&& (h->tablename[1] == '2' || h->tablename[1] == '3')) {
*v = (*y>>3) & 1;
*w = (*y>>2) & 1;
*x = (*y>>1) & 1;
*y = *y & 1;
if (*v)
if (bv.get1Bit() == 1) *v = -*v;
if (*w)
if (bv.get1Bit() == 1) *w = -*w;
if (*x)
if (bv.get1Bit() == 1) *x = -*x;
if (*y)
if (bv.get1Bit() == 1) *y = -*y;
}
/* Process sign and escape encodings for dual tables. */
else {
if (h->linbits)
if ((h->xlen-1) == (unsigned)*x)
*x += bv.getBits(h->linbits);
if (*x)
if (bv.get1Bit() == 1) *x = -*x;
if (h->linbits)
if ((h->ylen-1) == (unsigned)*y)
*y += bv.getBits(h->linbits);
if (*y)
if (bv.get1Bit() == 1) *y = -*y;
}
return error;
}
#ifdef DO_HUFFMAN_ENCODING
inline int getNextSample(unsigned char const*& fromPtr) {
int sample
#ifdef FOUR_BYTE_SAMPLES
= (fromPtr[0]<<24) | (fromPtr[1]<<16) | (fromPtr[2]<<8) | fromPtr[3];
#else
#ifdef TWO_BYTE_SAMPLES
= (fromPtr[0]<<8) | fromPtr[1];
#else
// ONE_BYTE_SAMPLES
= fromPtr[0];
#endif
#endif
fromPtr += BYTES_PER_SAMPLE_VALUE;
return sample;
}
static void rsf_huffman_encoder(BitVector& bv,
struct huffcodetab* h,
int x, int y, int v, int w); // forward
unsigned MP3HuffmanEncode(MP3SideInfo::gr_info_s_t const* gr,
unsigned char const* fromPtr,
unsigned char* toPtr, unsigned toBitOffset,
unsigned numHuffBits) {
unsigned i;
struct huffcodetab *h;
int x, y, v, w;
BitVector bv(toPtr, toBitOffset, numHuffBits);
initialize_huffman();
// Encode big_values area:
unsigned big_values = gr->big_values;
if (big_values < gr->region1start + gr->region2start) {
big_values = gr->region1start + gr->region2start; /* sanity check */
}
for (i = 0; i < big_values; ++i) {
if (i < gr->region1start) {
/* in region 0 */
h = &rsf_ht[gr->table_select[0]];
} else if (i < gr->region2start) {
/* in region 1 */
h = &rsf_ht[gr->table_select[1]];
} else {
/* in region 2 */
h = &rsf_ht[gr->table_select[2]];
}
x = getNextSample(fromPtr);
y = getNextSample(fromPtr);
v = getNextSample(fromPtr);
w = getNextSample(fromPtr);
rsf_huffman_encoder(bv, h, x, y, v, w);
}
// Encode count1 area:
h = &rsf_ht[gr->count1table_select+32];
while (bv.curBitIndex() < bv.totNumBits() && i < SSLIMIT*SBLIMIT) {
x = getNextSample(fromPtr);
y = getNextSample(fromPtr);
v = getNextSample(fromPtr);
w = getNextSample(fromPtr);
rsf_huffman_encoder(bv, h, x, y, v, w);
++i;
}
return i;
}
static Boolean lookupHuffmanTableEntry(struct huffcodetab const* h,
HUFFBITS bits, unsigned bitsLength,
unsigned char& xy) {
unsigned point = 0;
unsigned mask = 1;
unsigned numBitsTestedSoFar = 0;
do {
if (h->val[point][0]==0) { // end of tree
xy = h->val[point][1];
if (h->hlen[xy] == 0) { // this entry hasn't already been used
h->table[xy] = bits;
h->hlen[xy] = bitsLength;
return True;
} else { // this entry has already been seen
return False;
}
}
if (numBitsTestedSoFar++ == bitsLength) {
// We don't yet have enough bits for this prefix
return False;
}
if (bits&mask) {
while (h->val[point][1] >= MXOFF) point += h->val[point][1];
point += h->val[point][1];
} else {
while (h->val[point][0] >= MXOFF) point += h->val[point][0];
point += h->val[point][0];
}
mask <<= 1;
} while (mask || (point < h->treelen));
return False;
}
static void buildHuffmanEncodingTable(struct huffcodetab* h) {
h->table = new unsigned long[256];
h->hlen = new unsigned char[256];
if (h->table == NULL || h->hlen == NULL) { h->table = NULL; return; }
for (unsigned i = 0; i < 256; ++i) {
h->table[i] = 0; h->hlen[i] = 0;
}
// Look up entries for each possible bit sequence length:
unsigned maxNumEntries = h->xlen * h->ylen;
unsigned numEntries = 0;
unsigned powerOf2 = 1;
for (unsigned bitsLength = 1;
bitsLength <= 8*SIZEOF_HUFFBITS; ++bitsLength) {
powerOf2 *= 2;
for (HUFFBITS bits = 0; bits < powerOf2; ++bits) {
// Find the table value - if any - for 'bits' (length 'bitsLength'):
unsigned char xy;
if (lookupHuffmanTableEntry(h, bits, bitsLength, xy)) {
++numEntries;
if (numEntries == maxNumEntries) return; // we're done
}
}
}
#ifdef DEBUG
fprintf(stderr, "Didn't find enough entries!\n"); // shouldn't happen
#endif
}
static void lookupXYandPutBits(BitVector& bv, struct huffcodetab const* h,
unsigned char xy) {
HUFFBITS bits = h->table[xy];
unsigned bitsLength = h->hlen[xy];
// Note that "bits" is in reverse order, so read them from right-to-left:
while (bitsLength-- > 0) {
bv.put1Bit(bits&0x00000001);
bits >>= 1;
}
}
static void putLinbits(BitVector& bv, struct huffcodetab const* h,
HUFFBITS bits) {
bv.putBits(bits, h->linbits);
}
static void rsf_huffman_encoder(BitVector& bv,
struct huffcodetab* h,
int x, int y, int v, int w) {
if (h->val == NULL) return;
/* table 0 produces no bits */
if (h->treelen == 0) return;
if (h->table == NULL) {
// We haven't yet built the encoding array for this table; do it now:
buildHuffmanEncodingTable(h);
if (h->table == NULL) return;
}
Boolean xIsNeg = False, yIsNeg = False, vIsNeg = False, wIsNeg = False;
unsigned char xy;
#ifdef FOUR_BYTE_SAMPLES
#else
#ifdef TWO_BYTE_SAMPLES
// Convert 2-byte negative numbers to their 4-byte equivalents:
if (x&0x8000) x |= 0xFFFF0000;
if (y&0x8000) y |= 0xFFFF0000;
if (v&0x8000) v |= 0xFFFF0000;
if (w&0x8000) w |= 0xFFFF0000;
#else
// ONE_BYTE_SAMPLES
// Convert 1-byte negative numbers to their 4-byte equivalents:
if (x&0x80) x |= 0xFFFFFF00;
if (y&0x80) y |= 0xFFFFFF00;
if (v&0x80) v |= 0xFFFFFF00;
if (w&0x80) w |= 0xFFFFFF00;
#endif
#endif
if (h->tablename[0] == '3'
&& (h->tablename[1] == '2' || h->tablename[1] == '3')) {// quad tables
if (x < 0) { xIsNeg = True; x = -x; }
if (y < 0) { yIsNeg = True; y = -y; }
if (v < 0) { vIsNeg = True; v = -v; }
if (w < 0) { wIsNeg = True; w = -w; }
// Sanity check: x,y,v,w must all be 0 or 1:
if (x>1 || y>1 || v>1 || w>1) {
#ifdef DEBUG
fprintf(stderr, "rsf_huffman_encoder quad sanity check fails: %x,%x,%x,%x\n", x, y, v, w);
#endif
}
xy = (v<<3)|(w<<2)|(x<<1)|y;
lookupXYandPutBits(bv, h, xy);
if (v) bv.put1Bit(vIsNeg);
if (w) bv.put1Bit(wIsNeg);
if (x) bv.put1Bit(xIsNeg);
if (y) bv.put1Bit(yIsNeg);
} else { // dual tables
// Sanity check: v and w must be 0:
if (v != 0 || w != 0) {
#ifdef DEBUG
fprintf(stderr, "rsf_huffman_encoder dual sanity check 1 fails: %x,%x,%x,%x\n", x, y, v, w);
#endif
}
if (x < 0) { xIsNeg = True; x = -x; }
if (y < 0) { yIsNeg = True; y = -y; }
// Sanity check: x and y must be <= 255:
if (x > 255 || y > 255) {
#ifdef DEBUG
fprintf(stderr, "rsf_huffman_encoder dual sanity check 2 fails: %x,%x,%x,%x\n", x, y, v, w);
#endif
}
int xl1 = h->xlen-1;
int yl1 = h->ylen-1;
unsigned linbitsX = 0; unsigned linbitsY = 0;
if (((x < xl1) || (xl1 == 0)) && (y < yl1)) {
// normal case
xy = (x<<4)|y;
lookupXYandPutBits(bv, h, xy);
if (x) bv.put1Bit(xIsNeg);
if (y) bv.put1Bit(yIsNeg);
} else if (x >= xl1) {
linbitsX = (unsigned)(x - xl1);
if (linbitsX > h->linmax) {
#ifdef DEBUG
fprintf(stderr,"warning: Huffman X table overflow\n");
#endif
linbitsX = h->linmax;
};
if (y >= yl1) {
xy = (xl1<<4)|yl1;
lookupXYandPutBits(bv, h, xy);
linbitsY = (unsigned)(y - yl1);
if (linbitsY > h->linmax) {
#ifdef DEBUG
fprintf(stderr,"warning: Huffman Y table overflow\n");
#endif
linbitsY = h->linmax;
};
if (h->linbits) putLinbits(bv, h, linbitsX);
if (x) bv.put1Bit(xIsNeg);
if (h->linbits) putLinbits(bv, h, linbitsY);
if (y) bv.put1Bit(yIsNeg);
} else { /* x >= h->xlen, y < h->ylen */
xy = (xl1<<4)|y;
lookupXYandPutBits(bv, h, xy);
if (h->linbits) putLinbits(bv, h, linbitsX);
if (x) bv.put1Bit(xIsNeg);
if (y) bv.put1Bit(yIsNeg);
}
} else { /* ((x < h->xlen) && (y >= h->ylen)) */
xy = (x<<4)|yl1;
lookupXYandPutBits(bv, h, xy);
linbitsY = y-yl1;
if (linbitsY > h->linmax) {
#ifdef DEBUG
fprintf(stderr,"warning: Huffman Y table overflow\n");
#endif
linbitsY = h->linmax;
};
if (x) bv.put1Bit(xIsNeg);
if (h->linbits) putLinbits(bv, h, linbitsY);
if (y) bv.put1Bit(yIsNeg);
}
}
}
#endif
live/liveMedia/MP3InternalsHuffman.hh:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// MP3 internal implementation details (Huffman encoding)
// C++ header
#ifndef _MP3_INTERNALS_HUFFMAN_HH
#define _MP3_INTERNALS_HUFFMAN_HH
#ifndef _MP3_INTERNALS_HH
#include "MP3Internals.hh"
#endif
void updateSideInfoForHuffman(MP3SideInfo& sideInfo, Boolean isMPEG2,
unsigned char const* mainDataPtr,
unsigned p23L0, unsigned p23L1,
unsigned& part23Length0a,
unsigned& part23Length0aTruncation,
unsigned& part23Length0b,
unsigned& part23Length0bTruncation,
unsigned& part23Length1a,
unsigned& part23Length1aTruncation,
unsigned& part23Length1b,
unsigned& part23Length1bTruncation);
#define SSLIMIT 18
class MP3HuffmanEncodingInfo {
public:
MP3HuffmanEncodingInfo(Boolean includeDecodedValues = False);
~MP3HuffmanEncodingInfo();
public:
unsigned numSamples;
unsigned allBitOffsets[SBLIMIT*SSLIMIT + 1];
unsigned reg1Start, reg2Start, bigvalStart; /* special bit offsets */
unsigned* decodedValues;
};
/* forward */
void MP3HuffmanDecode(MP3SideInfo::gr_info_s_t* gr, Boolean isMPEG2,
unsigned char const* fromBasePtr,
unsigned fromBitOffset, unsigned fromLength,
unsigned& scaleFactorsLength,
MP3HuffmanEncodingInfo& hei);
extern unsigned char huffdec[]; // huffman table data
// The following are used if we process Huffman-decoded values
#ifdef FOUR_BYTE_SAMPLES
#define BYTES_PER_SAMPLE_VALUE 4
#else
#ifdef TWO_BYTE_SAMPLES
#define BYTES_PER_SAMPLE_VALUE 2
#else
// ONE_BYTE_SAMPLES
#define BYTES_PER_SAMPLE_VALUE 1
#endif
#endif
#ifdef DO_HUFFMAN_ENCODING
unsigned MP3HuffmanEncode(MP3SideInfo::gr_info_s_t const* gr,
unsigned char const* fromPtr,
unsigned char* toPtr, unsigned toBitOffset,
unsigned numHuffBits);
#endif
#endif
live/liveMedia/MP3InternalsHuffmanTable.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// MP3 internal implementation details (Huffman encoding)
// Table
#include "MP3InternalsHuffman.hh"
unsigned char huffdec[] = {
0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65, 0x20, 0x20, 0x30, 0x20, 0x20,
0x20, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x30, 0x0a,
0x2e, 0x74, 0x72, 0x65, 0x65, 0x64, 0x61, 0x74, 0x61, 0x0a, 0x0a, 0x2e,
0x74, 0x61, 0x62, 0x6c, 0x65, 0x20, 0x20, 0x31, 0x20, 0x20, 0x20, 0x37,
0x20, 0x20, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x30, 0x0a, 0x2e, 0x74,
0x72, 0x65, 0x65, 0x64, 0x61, 0x74, 0x61, 0x0a, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31,
0x31, 0x20, 0x0a, 0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x37, 0x20, 0x20, 0x33, 0x20, 0x20, 0x33, 0x20,
0x20, 0x30, 0x0a, 0x2e, 0x74, 0x72, 0x65, 0x65, 0x64, 0x61, 0x74, 0x61,
0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x30,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x31, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x31,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x32, 0x30, 0x20, 0x20, 0x30, 0x20, 0x32, 0x31,
0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31,
0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20,
0x32, 0x20, 0x20, 0x30, 0x20, 0x32, 0x32, 0x20, 0x0a, 0x0a, 0x2e, 0x74,
0x61, 0x62, 0x6c, 0x65, 0x20, 0x20, 0x33, 0x20, 0x20, 0x31, 0x37, 0x20,
0x20, 0x33, 0x20, 0x20, 0x33, 0x20, 0x20, 0x30, 0x0a, 0x2e, 0x74, 0x72,
0x65, 0x65, 0x64, 0x61, 0x74, 0x61, 0x0a, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x30,
0x20, 0x20, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x31, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x31, 0x30, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x30,
0x20, 0x20, 0x30, 0x20, 0x32, 0x31, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x30, 0x20, 0x32,
0x32, 0x20, 0x0a, 0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65, 0x20, 0x20,
0x34, 0x20, 0x20, 0x20, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x30, 0x20,
0x20, 0x30, 0x0a, 0x2e, 0x74, 0x72, 0x65, 0x65, 0x64, 0x61, 0x74, 0x61,
0x0a, 0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65, 0x20, 0x20, 0x35, 0x20,
0x20, 0x33, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x34, 0x20, 0x20, 0x30,
0x0a, 0x2e, 0x74, 0x72, 0x65, 0x65, 0x64, 0x61, 0x74, 0x61, 0x0a, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x31, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x31, 0x20, 0x20,
0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x30, 0x20, 0x0a,
0x20, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x32, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x32, 0x20,
0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x32, 0x20,
0x20, 0x30, 0x20, 0x33, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x20, 0x33, 0x20, 0x20, 0x30, 0x20, 0x31, 0x33, 0x20,
0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x32,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x33,
0x20, 0x20, 0x30, 0x20, 0x33, 0x33, 0x20, 0x0a, 0x0a, 0x2e, 0x74, 0x61,
0x62, 0x6c, 0x65, 0x20, 0x20, 0x36, 0x20, 0x20, 0x33, 0x31, 0x20, 0x20,
0x34, 0x20, 0x20, 0x34, 0x20, 0x20, 0x30, 0x0a, 0x2e, 0x74, 0x72, 0x65,
0x65, 0x64, 0x61, 0x74, 0x61, 0x0a, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20,
0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x30, 0x20, 0x31, 0x30, 0x20,
0x20, 0x30, 0x20, 0x31, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x30, 0x20,
0x20, 0x30, 0x20, 0x32, 0x31, 0x20, 0x0a, 0x20, 0x36, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x32,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x32,
0x20, 0x20, 0x30, 0x20, 0x32, 0x32, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x31,
0x20, 0x20, 0x30, 0x20, 0x31, 0x33, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x33,
0x30, 0x20, 0x20, 0x30, 0x20, 0x33, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x33, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x33, 0x20, 0x20, 0x30, 0x20, 0x33,
0x33, 0x20, 0x0a, 0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65, 0x20, 0x20,
0x37, 0x20, 0x20, 0x37, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x36, 0x20,
0x20, 0x30, 0x0a, 0x2e, 0x74, 0x72, 0x65, 0x65, 0x64, 0x61, 0x74, 0x61,
0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x30,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x31, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x31,
0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x31, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x30,
0x20, 0x0a, 0x20, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x30, 0x20, 0x32,
0x31, 0x20, 0x31, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31,
0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32,
0x32, 0x20, 0x20, 0x30, 0x20, 0x33, 0x30, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33,
0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x31, 0x33, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x20, 0x33, 0x20, 0x20, 0x30, 0x20, 0x33, 0x32, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x33, 0x20, 0x20, 0x30, 0x20,
0x20, 0x34, 0x20, 0x20, 0x61, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x34, 0x30, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x34, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x34, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x32, 0x20, 0x20, 0x30,
0x20, 0x32, 0x34, 0x20, 0x20, 0x63, 0x20, 0x20, 0x31, 0x20, 0x20, 0x36,
0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x33, 0x20, 0x20, 0x30,
0x20, 0x34, 0x33, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x35, 0x30, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x33, 0x34, 0x20, 0x20, 0x30, 0x20, 0x20, 0x35, 0x20, 0x20,
0x30, 0x20, 0x35, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x35, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x32, 0x20, 0x20,
0x30, 0x20, 0x32, 0x35, 0x20, 0x0a, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x34, 0x20,
0x20, 0x30, 0x20, 0x33, 0x35, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x33, 0x20,
0x20, 0x30, 0x20, 0x35, 0x34, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x34, 0x35, 0x20, 0x20, 0x30, 0x20, 0x35, 0x35, 0x20,
0x0a, 0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65, 0x20, 0x20, 0x38, 0x20,
0x20, 0x37, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x36, 0x20, 0x20, 0x30,
0x0a, 0x2e, 0x74, 0x72, 0x65, 0x65, 0x64, 0x61, 0x74, 0x61, 0x0a, 0x20,
0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x31, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x31, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x32, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x32, 0x20, 0x0a,
0x20, 0x65, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x30, 0x20,
0x20, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x32, 0x32, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x30, 0x20,
0x20, 0x30, 0x20, 0x20, 0x33, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x0a, 0x20, 0x30, 0x20, 0x33, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x33,
0x20, 0x20, 0x65, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x33, 0x32, 0x20, 0x20, 0x30, 0x20, 0x32, 0x33,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x30,
0x20, 0x20, 0x30, 0x20, 0x20, 0x34, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x0a, 0x20, 0x30, 0x20, 0x34, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x34, 0x20, 0x20, 0x30, 0x20, 0x34,
0x32, 0x20, 0x20, 0x63, 0x20, 0x20, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32,
0x34, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33,
0x33, 0x20, 0x20, 0x30, 0x20, 0x35, 0x30, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x34, 0x33, 0x20, 0x20, 0x30, 0x20, 0x33, 0x34, 0x20, 0x20, 0x30, 0x20,
0x35, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x35, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x35, 0x20, 0x20, 0x30, 0x20,
0x35, 0x32, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x32, 0x35, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x34, 0x20, 0x20, 0x30,
0x20, 0x33, 0x35, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x35, 0x33, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x34, 0x35, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x35, 0x34, 0x20, 0x20, 0x30, 0x20, 0x35, 0x35, 0x20, 0x0a, 0x0a,
0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65, 0x20, 0x20, 0x39, 0x20, 0x20, 0x37,
0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x36, 0x20, 0x20, 0x30, 0x0a, 0x2e,
0x74, 0x72, 0x65, 0x65, 0x64, 0x61, 0x74, 0x61, 0x0a, 0x20, 0x38, 0x20,
0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x30, 0x20,
0x31, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x31, 0x20, 0x20, 0x61, 0x20,
0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x30, 0x20, 0x0a, 0x20, 0x30,
0x20, 0x32, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x31, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x20, 0x32, 0x20, 0x20, 0x30, 0x20, 0x32, 0x32, 0x20, 0x20, 0x63,
0x20, 0x20, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x33, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x33, 0x20, 0x0a, 0x20,
0x30, 0x20, 0x33, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x31, 0x33, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x33, 0x32, 0x20, 0x20, 0x30, 0x20, 0x32, 0x33, 0x20, 0x20,
0x63, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x31, 0x20, 0x20,
0x30, 0x20, 0x31, 0x34, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x0a,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x30, 0x20,
0x20, 0x30, 0x20, 0x33, 0x33, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x34, 0x32, 0x20, 0x20, 0x30, 0x20, 0x32, 0x34, 0x20,
0x20, 0x61, 0x20, 0x20, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20,
0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x20, 0x34, 0x20, 0x20, 0x30, 0x20, 0x35, 0x30, 0x20,
0x0a, 0x20, 0x30, 0x20, 0x34, 0x33, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x33, 0x34, 0x20, 0x20, 0x30, 0x20, 0x35, 0x31,
0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x35,
0x20, 0x20, 0x30, 0x20, 0x35, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x32, 0x35, 0x20, 0x20, 0x30, 0x20, 0x34, 0x34,
0x20, 0x0a, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20,
0x35, 0x20, 0x20, 0x30, 0x20, 0x35, 0x34, 0x20, 0x20, 0x30, 0x20, 0x35,
0x33, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33,
0x35, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34,
0x35, 0x20, 0x20, 0x30, 0x20, 0x35, 0x35, 0x20, 0x0a, 0x0a, 0x2e, 0x74,
0x61, 0x62, 0x6c, 0x65, 0x20, 0x31, 0x30, 0x20, 0x31, 0x32, 0x37, 0x20,
0x20, 0x38, 0x20, 0x20, 0x38, 0x20, 0x20, 0x30, 0x0a, 0x2e, 0x74, 0x72,
0x65, 0x65, 0x64, 0x61, 0x74, 0x61, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x30,
0x20, 0x20, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x61, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x31,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x32, 0x30, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x20,
0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32,
0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x32, 0x20, 0x31, 0x63, 0x20, 0x20,
0x31, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32,
0x32, 0x20, 0x20, 0x30, 0x20, 0x33, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20,
0x31, 0x33, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x20, 0x33, 0x20, 0x20, 0x30, 0x20, 0x33, 0x32, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x33, 0x20, 0x20, 0x30, 0x20,
0x34, 0x30, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x31, 0x20, 0x0a, 0x20, 0x30,
0x20, 0x31, 0x34, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x34, 0x20, 0x20, 0x30,
0x20, 0x33, 0x33, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x34, 0x32, 0x20, 0x20, 0x30, 0x20, 0x32, 0x34, 0x20, 0x31, 0x63,
0x20, 0x20, 0x31, 0x20, 0x20, 0x61, 0x20, 0x20, 0x31, 0x20, 0x20, 0x36,
0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x30, 0x20, 0x20,
0x30, 0x20, 0x20, 0x35, 0x20, 0x20, 0x30, 0x20, 0x36, 0x30, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x36, 0x31, 0x20, 0x20,
0x30, 0x20, 0x31, 0x36, 0x20, 0x20, 0x63, 0x20, 0x20, 0x31, 0x20, 0x20,
0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x33, 0x20, 0x0a,
0x20, 0x30, 0x20, 0x33, 0x34, 0x20, 0x20, 0x30, 0x20, 0x35, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x35, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x32, 0x20,
0x20, 0x30, 0x20, 0x32, 0x35, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x36, 0x20,
0x20, 0x30, 0x20, 0x33, 0x36, 0x20, 0x20, 0x30, 0x20, 0x37, 0x31, 0x20,
0x0a, 0x31, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x37,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x34, 0x34, 0x20, 0x20, 0x30, 0x20, 0x35, 0x33,
0x20, 0x20, 0x30, 0x20, 0x20, 0x36, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x0a, 0x20, 0x30, 0x20, 0x33, 0x35, 0x20, 0x20, 0x30, 0x20, 0x34,
0x35, 0x20, 0x20, 0x30, 0x20, 0x36, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x37, 0x20, 0x20, 0x30, 0x20, 0x36,
0x34, 0x20, 0x20, 0x65, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37,
0x32, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x32, 0x37, 0x20, 0x20, 0x36, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x36, 0x33, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x35, 0x34, 0x20, 0x20, 0x30, 0x20, 0x35, 0x35, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x36, 0x20, 0x20, 0x30, 0x20,
0x37, 0x33, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x33, 0x37, 0x20, 0x20, 0x30, 0x20, 0x36, 0x35, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x36, 0x20, 0x20, 0x30,
0x20, 0x37, 0x34, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x37, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x36, 0x36, 0x20, 0x20, 0x30,
0x20, 0x37, 0x35, 0x20, 0x0a, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x37, 0x20, 0x20,
0x30, 0x20, 0x37, 0x36, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x36, 0x37, 0x20, 0x20, 0x30, 0x20, 0x37, 0x37, 0x20, 0x0a,
0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65, 0x20, 0x31, 0x31, 0x20, 0x31,
0x32, 0x37, 0x20, 0x20, 0x38, 0x20, 0x20, 0x38, 0x20, 0x20, 0x30, 0x0a,
0x2e, 0x74, 0x72, 0x65, 0x65, 0x64, 0x61, 0x74, 0x61, 0x0a, 0x20, 0x36,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x20, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x31, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x31, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x30, 0x20, 0x0a, 0x20,
0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x30, 0x20, 0x31, 0x32, 0x20, 0x31,
0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x32, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x30, 0x20, 0x20,
0x30, 0x20, 0x20, 0x33, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x0a,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x31, 0x20,
0x20, 0x30, 0x20, 0x31, 0x33, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x32, 0x20,
0x20, 0x30, 0x20, 0x32, 0x33, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x30, 0x20,
0x20, 0x30, 0x20, 0x20, 0x34, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x0a, 0x20, 0x30, 0x20, 0x34, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x34,
0x20, 0x31, 0x65, 0x20, 0x20, 0x31, 0x20, 0x31, 0x30, 0x20, 0x20, 0x31,
0x20, 0x20, 0x61, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x32,
0x20, 0x20, 0x30, 0x20, 0x32, 0x34, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x33,
0x20, 0x0a, 0x20, 0x30, 0x20, 0x34, 0x33, 0x20, 0x20, 0x30, 0x20, 0x35,
0x30, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x34, 0x20, 0x20, 0x30, 0x20, 0x35,
0x31, 0x20, 0x20, 0x30, 0x20, 0x36, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31,
0x36, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20,
0x36, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x32, 0x36, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x36, 0x32, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x35, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x35, 0x20, 0x20, 0x30, 0x20,
0x35, 0x32, 0x20, 0x31, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x61, 0x20,
0x20, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x32, 0x35, 0x20, 0x20, 0x30, 0x20, 0x34, 0x34, 0x20, 0x20, 0x30,
0x20, 0x36, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x36, 0x33, 0x20, 0x20, 0x30, 0x20, 0x33, 0x36, 0x20, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x37, 0x30, 0x20, 0x20, 0x30, 0x20, 0x31, 0x37, 0x20, 0x20, 0x30,
0x20, 0x37, 0x31, 0x20, 0x0a, 0x31, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20,
0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x37, 0x20, 0x20,
0x30, 0x20, 0x36, 0x34, 0x20, 0x20, 0x30, 0x20, 0x37, 0x32, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x37, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x35, 0x33, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x33, 0x35, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x34, 0x20,
0x20, 0x30, 0x20, 0x34, 0x35, 0x20, 0x20, 0x61, 0x20, 0x20, 0x31, 0x20,
0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x34, 0x36, 0x20, 0x20, 0x30, 0x20, 0x37, 0x33, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x37, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x36, 0x35,
0x20, 0x20, 0x30, 0x20, 0x35, 0x36, 0x20, 0x20, 0x61, 0x20, 0x20, 0x31,
0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x35,
0x20, 0x20, 0x30, 0x20, 0x35, 0x37, 0x20, 0x20, 0x30, 0x20, 0x37, 0x34,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x37,
0x20, 0x20, 0x30, 0x20, 0x36, 0x36, 0x20, 0x0a, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37,
0x35, 0x20, 0x20, 0x30, 0x20, 0x37, 0x36, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x36, 0x37, 0x20, 0x20, 0x30, 0x20, 0x37,
0x37, 0x20, 0x0a, 0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65, 0x20, 0x31,
0x32, 0x20, 0x31, 0x32, 0x37, 0x20, 0x20, 0x38, 0x20, 0x20, 0x38, 0x20,
0x20, 0x30, 0x0a, 0x2e, 0x74, 0x72, 0x65, 0x65, 0x64, 0x61, 0x74, 0x61,
0x0a, 0x20, 0x63, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x30,
0x20, 0x20, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x31, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x32, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x32,
0x20, 0x0a, 0x31, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32,
0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x32, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32,
0x32, 0x20, 0x20, 0x30, 0x20, 0x33, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x33, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x33, 0x30, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x33, 0x20, 0x20, 0x30, 0x20,
0x34, 0x30, 0x20, 0x31, 0x61, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38, 0x20,
0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x32, 0x20, 0x20, 0x30, 0x20,
0x32, 0x33, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x34, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x33, 0x33, 0x20, 0x20, 0x61,
0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x34, 0x20, 0x20, 0x30,
0x20, 0x34, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x32, 0x34, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x20, 0x34, 0x20, 0x20, 0x30, 0x20, 0x35, 0x30, 0x20, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x34, 0x33, 0x20, 0x20, 0x30, 0x20, 0x33, 0x34, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x31, 0x20, 0x20,
0x30, 0x20, 0x31, 0x35, 0x20, 0x31, 0x63, 0x20, 0x20, 0x31, 0x20, 0x20,
0x65, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x35, 0x32, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x32, 0x35, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x33, 0x20,
0x20, 0x30, 0x20, 0x33, 0x35, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x36, 0x30, 0x20,
0x20, 0x30, 0x20, 0x31, 0x36, 0x20, 0x20, 0x30, 0x20, 0x36, 0x31, 0x20,
0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x36, 0x32, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x32, 0x36,
0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x35,
0x20, 0x20, 0x30, 0x20, 0x20, 0x36, 0x20, 0x20, 0x30, 0x20, 0x34, 0x34,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x34,
0x20, 0x20, 0x30, 0x20, 0x34, 0x35, 0x20, 0x31, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x61, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x36,
0x33, 0x20, 0x20, 0x30, 0x20, 0x33, 0x36, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37,
0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x37, 0x20, 0x20, 0x30, 0x20, 0x37,
0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x37, 0x20, 0x0a, 0x20, 0x30, 0x20,
0x36, 0x34, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x34, 0x36, 0x20, 0x20, 0x30, 0x20, 0x37, 0x32, 0x20, 0x20, 0x61, 0x20,
0x20, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x37, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x35, 0x20, 0x20, 0x30, 0x20,
0x37, 0x33, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30,
0x20, 0x33, 0x37, 0x20, 0x20, 0x30, 0x20, 0x35, 0x36, 0x20, 0x20, 0x38,
0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x36, 0x35, 0x20, 0x20, 0x30,
0x20, 0x37, 0x34, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x34, 0x37, 0x20, 0x20, 0x30, 0x20, 0x36, 0x36, 0x20, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20,
0x30, 0x20, 0x37, 0x35, 0x20, 0x20, 0x30, 0x20, 0x35, 0x37, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x36, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x36, 0x37, 0x20, 0x20,
0x30, 0x20, 0x37, 0x37, 0x20, 0x0a, 0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c,
0x65, 0x20, 0x31, 0x33, 0x20, 0x35, 0x31, 0x31, 0x20, 0x31, 0x36, 0x20,
0x31, 0x36, 0x20, 0x20, 0x30, 0x0a, 0x2e, 0x74, 0x72, 0x65, 0x65, 0x64,
0x61, 0x74, 0x61, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x20, 0x30, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x30, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x31, 0x31, 0x20, 0x31, 0x63, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38,
0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x32, 0x30, 0x20, 0x20,
0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x32, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x32, 0x20, 0x20,
0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x32, 0x20, 0x20,
0x30, 0x20, 0x33, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x20, 0x33, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x33, 0x31, 0x20,
0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x31, 0x33, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x33, 0x32, 0x20, 0x20, 0x30, 0x20, 0x32, 0x33, 0x20,
0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x34, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x34, 0x20,
0x20, 0x30, 0x20, 0x34, 0x31, 0x20, 0x0a, 0x34, 0x36, 0x20, 0x20, 0x31,
0x20, 0x31, 0x63, 0x20, 0x20, 0x31, 0x20, 0x20, 0x65, 0x20, 0x20, 0x31,
0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x31, 0x34, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x33, 0x33, 0x20, 0x20, 0x30, 0x20, 0x34, 0x32,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x32, 0x34, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x35,
0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34,
0x33, 0x20, 0x20, 0x30, 0x20, 0x33, 0x34, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35,
0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x35, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20,
0x35, 0x20, 0x20, 0x30, 0x20, 0x35, 0x32, 0x20, 0x0a, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x35, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x34, 0x20, 0x20, 0x30, 0x20,
0x35, 0x33, 0x20, 0x20, 0x65, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38, 0x20,
0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x36, 0x30, 0x20, 0x20, 0x30, 0x20,
0x20, 0x36, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30,
0x20, 0x36, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x36, 0x20, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x38, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x38, 0x20, 0x20, 0x30,
0x20, 0x38, 0x31, 0x20, 0x31, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38,
0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x35, 0x20, 0x0a, 0x20,
0x30, 0x20, 0x36, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x32, 0x36, 0x20, 0x20, 0x30, 0x20, 0x35, 0x34, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x34, 0x35, 0x20, 0x20, 0x30, 0x20, 0x36, 0x33, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x36, 0x20, 0x20,
0x30, 0x20, 0x37, 0x30, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x0a,
0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x20, 0x37, 0x20, 0x20, 0x30, 0x20, 0x35, 0x35, 0x20,
0x20, 0x30, 0x20, 0x37, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x31, 0x37, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x32, 0x37, 0x20, 0x20, 0x30, 0x20, 0x33, 0x37, 0x20,
0x34, 0x38, 0x20, 0x20, 0x31, 0x20, 0x31, 0x38, 0x20, 0x20, 0x31, 0x20,
0x0a, 0x20, 0x63, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x38,
0x20, 0x20, 0x30, 0x20, 0x38, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x32, 0x38, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x36, 0x34,
0x20, 0x20, 0x30, 0x20, 0x34, 0x36, 0x20, 0x20, 0x30, 0x20, 0x37, 0x32,
0x20, 0x0a, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x38,
0x34, 0x20, 0x20, 0x30, 0x20, 0x34, 0x38, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x39, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20,
0x39, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x39,
0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x39, 0x20, 0x31, 0x38, 0x20, 0x20,
0x31, 0x20, 0x0a, 0x20, 0x65, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38, 0x20,
0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x33, 0x20, 0x20, 0x30, 0x20,
0x36, 0x35, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x35, 0x36, 0x20, 0x20, 0x30, 0x20, 0x37, 0x34, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x34, 0x37, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x36, 0x36, 0x20, 0x20, 0x30,
0x20, 0x38, 0x33, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x38, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x35, 0x20, 0x20, 0x30,
0x20, 0x35, 0x37, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x39, 0x32, 0x20, 0x20, 0x30, 0x20, 0x32, 0x39, 0x20, 0x20, 0x65,
0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x36, 0x37, 0x20, 0x20, 0x30, 0x20, 0x38, 0x35, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x38, 0x20, 0x20,
0x30, 0x20, 0x33, 0x39, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x39, 0x33, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x34, 0x39, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x38, 0x36, 0x20,
0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x61, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x36, 0x38, 0x20, 0x20, 0x30, 0x20, 0x20, 0x61, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x61, 0x31, 0x20,
0x20, 0x30, 0x20, 0x31, 0x61, 0x20, 0x34, 0x34, 0x20, 0x20, 0x31, 0x20,
0x31, 0x38, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x63, 0x20, 0x20, 0x31,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x61, 0x32, 0x20, 0x20, 0x30, 0x20, 0x32, 0x61,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x39, 0x35, 0x20, 0x20, 0x30, 0x20, 0x35, 0x39,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x61, 0x33,
0x20, 0x20, 0x30, 0x20, 0x33, 0x61, 0x20, 0x0a, 0x20, 0x38, 0x20, 0x20,
0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x61, 0x20, 0x20, 0x30, 0x20, 0x39,
0x36, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x62,
0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x62, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x62, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31,
0x62, 0x20, 0x31, 0x34, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x38, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x62, 0x32, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x36, 0x20, 0x20, 0x30, 0x20,
0x37, 0x37, 0x20, 0x20, 0x30, 0x20, 0x39, 0x34, 0x20, 0x20, 0x36, 0x20,
0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x38, 0x37, 0x20, 0x0a, 0x20, 0x30,
0x20, 0x37, 0x38, 0x20, 0x20, 0x30, 0x20, 0x61, 0x34, 0x20, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x36, 0x39, 0x20, 0x20, 0x30, 0x20, 0x61, 0x35, 0x20, 0x20, 0x30,
0x20, 0x32, 0x62, 0x20, 0x20, 0x63, 0x20, 0x20, 0x31, 0x20, 0x20, 0x36,
0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x61, 0x20, 0x0a, 0x20,
0x30, 0x20, 0x38, 0x38, 0x20, 0x20, 0x30, 0x20, 0x62, 0x33, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x62, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x39, 0x20, 0x20,
0x30, 0x20, 0x61, 0x36, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x36, 0x61, 0x20, 0x20, 0x30, 0x20, 0x62, 0x34, 0x20, 0x0a,
0x20, 0x30, 0x20, 0x63, 0x30, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x63, 0x20,
0x20, 0x30, 0x20, 0x39, 0x38, 0x20, 0x20, 0x30, 0x20, 0x63, 0x31, 0x20,
0x33, 0x63, 0x20, 0x20, 0x31, 0x20, 0x31, 0x36, 0x20, 0x20, 0x31, 0x20,
0x20, 0x61, 0x20, 0x20, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x63, 0x20,
0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x38, 0x39,
0x20, 0x20, 0x30, 0x20, 0x62, 0x35, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x35, 0x62, 0x20, 0x20, 0x30, 0x20, 0x63, 0x32,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x32, 0x63, 0x20, 0x20, 0x30, 0x20, 0x33, 0x63,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x0a, 0x20, 0x30, 0x20, 0x62, 0x36, 0x20, 0x20, 0x30, 0x20, 0x36,
0x62, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x63,
0x34, 0x20, 0x20, 0x30, 0x20, 0x34, 0x63, 0x20, 0x31, 0x30, 0x20, 0x20,
0x31, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x61,
0x38, 0x20, 0x20, 0x30, 0x20, 0x38, 0x61, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x64, 0x30, 0x20, 0x20, 0x30, 0x20,
0x20, 0x64, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x64, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x34, 0x62, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x39, 0x37, 0x20, 0x20, 0x30, 0x20, 0x61, 0x37, 0x20, 0x20, 0x63, 0x20,
0x20, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x63, 0x33, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x61, 0x20, 0x20, 0x30,
0x20, 0x39, 0x39, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x63, 0x35, 0x20, 0x20, 0x30,
0x20, 0x35, 0x63, 0x20, 0x20, 0x30, 0x20, 0x62, 0x37, 0x20, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x31, 0x64, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x64, 0x32, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x64, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x62, 0x20, 0x20,
0x30, 0x20, 0x64, 0x33, 0x20, 0x33, 0x34, 0x20, 0x20, 0x31, 0x20, 0x31,
0x63, 0x20, 0x20, 0x31, 0x20, 0x20, 0x63, 0x20, 0x20, 0x31, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x33, 0x64, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x63, 0x36, 0x20,
0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x36, 0x63, 0x20, 0x20, 0x30, 0x20, 0x61, 0x39, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x39, 0x61, 0x20,
0x20, 0x30, 0x20, 0x64, 0x34, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20,
0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x62, 0x38, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x38, 0x62,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x64,
0x20, 0x20, 0x30, 0x20, 0x63, 0x37, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x63,
0x20, 0x20, 0x30, 0x20, 0x64, 0x35, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x35, 0x64, 0x20, 0x20, 0x30, 0x20, 0x65, 0x30,
0x20, 0x20, 0x61, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x65,
0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x65, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20,
0x65, 0x20, 0x20, 0x30, 0x20, 0x32, 0x65, 0x20, 0x20, 0x30, 0x20, 0x65,
0x32, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20,
0x65, 0x33, 0x20, 0x20, 0x30, 0x20, 0x36, 0x64, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x38, 0x63, 0x20, 0x20, 0x30, 0x20,
0x65, 0x34, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x65, 0x35, 0x20, 0x20, 0x30, 0x20,
0x62, 0x61, 0x20, 0x20, 0x30, 0x20, 0x66, 0x30, 0x20, 0x32, 0x36, 0x20,
0x20, 0x31, 0x20, 0x31, 0x30, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x66, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x66, 0x20, 0x20, 0x36,
0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x61, 0x61, 0x20, 0x20, 0x30,
0x20, 0x39, 0x62, 0x20, 0x20, 0x30, 0x20, 0x62, 0x39, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x65, 0x20, 0x0a, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x64, 0x36, 0x20, 0x20,
0x30, 0x20, 0x63, 0x38, 0x20, 0x20, 0x63, 0x20, 0x20, 0x31, 0x20, 0x20,
0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x34, 0x65, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x64, 0x37, 0x20, 0x20, 0x30, 0x20, 0x37, 0x64, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x61, 0x62, 0x20, 0x0a,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x65, 0x20,
0x20, 0x30, 0x20, 0x63, 0x39, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x66, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x39, 0x63, 0x20,
0x20, 0x30, 0x20, 0x36, 0x65, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x66, 0x32, 0x20, 0x20, 0x30, 0x20, 0x32, 0x66, 0x20,
0x0a, 0x32, 0x30, 0x20, 0x20, 0x31, 0x20, 0x31, 0x30, 0x20, 0x20, 0x31,
0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x64, 0x38,
0x20, 0x20, 0x30, 0x20, 0x38, 0x64, 0x20, 0x20, 0x30, 0x20, 0x33, 0x66,
0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x66, 0x33, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x0a, 0x20, 0x30, 0x20, 0x65, 0x36, 0x20, 0x20, 0x30, 0x20, 0x63,
0x61, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x66,
0x34, 0x20, 0x20, 0x30, 0x20, 0x34, 0x66, 0x20, 0x20, 0x38, 0x20, 0x20,
0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x62, 0x62, 0x20, 0x20, 0x30, 0x20, 0x61,
0x63, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x65,
0x37, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x66, 0x35, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x64, 0x39, 0x20, 0x20, 0x30, 0x20, 0x39, 0x64, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x66, 0x20, 0x20, 0x30, 0x20,
0x65, 0x38, 0x20, 0x31, 0x65, 0x20, 0x20, 0x31, 0x20, 0x20, 0x63, 0x20,
0x20, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x36, 0x66, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x66, 0x36, 0x20, 0x20, 0x30,
0x20, 0x63, 0x62, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x62, 0x63, 0x20, 0x20, 0x30,
0x20, 0x61, 0x64, 0x20, 0x20, 0x30, 0x20, 0x64, 0x61, 0x20, 0x20, 0x38,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x66, 0x37, 0x20, 0x0a, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x65, 0x20, 0x20,
0x30, 0x20, 0x37, 0x66, 0x20, 0x20, 0x30, 0x20, 0x38, 0x65, 0x20, 0x20,
0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x39, 0x65, 0x20, 0x20,
0x30, 0x20, 0x61, 0x65, 0x20, 0x20, 0x30, 0x20, 0x63, 0x63, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x66, 0x38, 0x20,
0x20, 0x30, 0x20, 0x38, 0x66, 0x20, 0x31, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x64, 0x62, 0x20,
0x20, 0x30, 0x20, 0x62, 0x64, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x65, 0x61, 0x20, 0x20, 0x30, 0x20, 0x66, 0x39, 0x20,
0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x39, 0x66, 0x20, 0x20, 0x30, 0x20, 0x65, 0x62,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x62, 0x65,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x63, 0x64,
0x20, 0x20, 0x30, 0x20, 0x66, 0x61, 0x20, 0x20, 0x65, 0x20, 0x20, 0x31,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x64, 0x64, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x65,
0x63, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x65,
0x39, 0x20, 0x20, 0x30, 0x20, 0x61, 0x66, 0x20, 0x20, 0x30, 0x20, 0x64,
0x63, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x63,
0x65, 0x20, 0x20, 0x30, 0x20, 0x66, 0x62, 0x20, 0x20, 0x38, 0x20, 0x20,
0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x62, 0x66, 0x20, 0x20, 0x30, 0x20,
0x64, 0x65, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x63, 0x66, 0x20, 0x20, 0x30, 0x20, 0x65, 0x65, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x64, 0x66, 0x20, 0x20, 0x30, 0x20, 0x65, 0x66, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x66, 0x66, 0x20, 0x0a, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x65, 0x64, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x66, 0x64, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x66, 0x63, 0x20, 0x20, 0x30,
0x20, 0x66, 0x65, 0x20, 0x0a, 0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65,
0x20, 0x31, 0x34, 0x20, 0x20, 0x20, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20,
0x30, 0x20, 0x20, 0x30, 0x0a, 0x2e, 0x74, 0x72, 0x65, 0x65, 0x64, 0x61,
0x74, 0x61, 0x0a, 0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65, 0x20, 0x31,
0x35, 0x20, 0x35, 0x31, 0x31, 0x20, 0x31, 0x36, 0x20, 0x31, 0x36, 0x20,
0x20, 0x30, 0x0a, 0x2e, 0x74, 0x72, 0x65, 0x65, 0x64, 0x61, 0x74, 0x61,
0x0a, 0x31, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x30,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x30,
0x20, 0x20, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x31, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x30,
0x20, 0x0a, 0x20, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31,
0x32, 0x20, 0x33, 0x32, 0x20, 0x20, 0x31, 0x20, 0x31, 0x30, 0x20, 0x20,
0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x30, 0x20, 0x20, 0x30, 0x20, 0x33,
0x31, 0x20, 0x0a, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x33, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x33, 0x20, 0x20, 0x30, 0x20,
0x34, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x33, 0x32, 0x20, 0x20, 0x30, 0x20, 0x32, 0x33, 0x20, 0x20, 0x65, 0x20,
0x20, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x20, 0x34, 0x20, 0x20, 0x30, 0x20, 0x31, 0x34, 0x20, 0x20, 0x30,
0x20, 0x34, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x33, 0x20, 0x20, 0x30,
0x20, 0x34, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x32, 0x34, 0x20, 0x20, 0x30, 0x20, 0x34, 0x33, 0x20, 0x20, 0x61,
0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x34, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x30, 0x20, 0x20,
0x30, 0x20, 0x20, 0x35, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x35, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x35, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x35, 0x32, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x32, 0x35, 0x20,
0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x34, 0x34, 0x20, 0x20, 0x30, 0x20, 0x35, 0x33, 0x20,
0x20, 0x30, 0x20, 0x36, 0x31, 0x20, 0x35, 0x61, 0x20, 0x20, 0x31, 0x20,
0x32, 0x34, 0x20, 0x20, 0x31, 0x20, 0x31, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x61, 0x20, 0x20, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x33, 0x35,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x36, 0x30,
0x20, 0x20, 0x30, 0x20, 0x20, 0x36, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x31, 0x36, 0x20, 0x20, 0x30, 0x20, 0x36, 0x32,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x32, 0x36, 0x20, 0x20, 0x30, 0x20, 0x35, 0x34,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x34,
0x35, 0x20, 0x20, 0x30, 0x20, 0x36, 0x33, 0x20, 0x20, 0x61, 0x20, 0x20,
0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x36, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20,
0x37, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37,
0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x35, 0x20, 0x0a, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x31, 0x37, 0x20, 0x20, 0x30, 0x20, 0x36, 0x34, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x32, 0x20, 0x20, 0x30, 0x20,
0x32, 0x37, 0x20, 0x31, 0x38, 0x20, 0x20, 0x31, 0x20, 0x31, 0x30, 0x20,
0x20, 0x31, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30,
0x20, 0x34, 0x36, 0x20, 0x20, 0x30, 0x20, 0x37, 0x33, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x37, 0x20, 0x20, 0x30,
0x20, 0x36, 0x35, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x36, 0x20, 0x20, 0x30,
0x20, 0x38, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x20, 0x38, 0x20, 0x20, 0x30, 0x20, 0x37, 0x34, 0x20, 0x0a, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x38, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x38, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x38, 0x32, 0x20, 0x20,
0x30, 0x20, 0x32, 0x38, 0x20, 0x31, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20,
0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x37, 0x20, 0x0a,
0x20, 0x30, 0x20, 0x36, 0x36, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x38, 0x33, 0x20, 0x20, 0x30, 0x20, 0x33, 0x38, 0x20,
0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x37, 0x35, 0x20, 0x20, 0x30, 0x20, 0x35, 0x37, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x38, 0x34, 0x20,
0x20, 0x30, 0x20, 0x34, 0x38, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20,
0x0a, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x39, 0x30, 0x20, 0x20, 0x30, 0x20, 0x31, 0x39,
0x20, 0x20, 0x30, 0x20, 0x39, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x39, 0x32,
0x20, 0x20, 0x30, 0x20, 0x37, 0x36, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x36, 0x37, 0x20, 0x20, 0x30, 0x20, 0x32, 0x39,
0x20, 0x0a, 0x35, 0x63, 0x20, 0x20, 0x31, 0x20, 0x32, 0x34, 0x20, 0x20,
0x31, 0x20, 0x31, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x61, 0x20, 0x20,
0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x38, 0x35, 0x20, 0x20, 0x30, 0x20, 0x35,
0x38, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x39, 0x20, 0x20, 0x30, 0x20, 0x37,
0x37, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x39, 0x33, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x33, 0x39, 0x20, 0x20, 0x30, 0x20, 0x39, 0x34, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x39, 0x20, 0x20, 0x30, 0x20,
0x38, 0x36, 0x20, 0x20, 0x61, 0x20, 0x20, 0x31, 0x20, 0x20, 0x36, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x36, 0x38, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x61, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x61, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x61, 0x31, 0x20, 0x20, 0x30,
0x20, 0x31, 0x61, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x61, 0x32, 0x20, 0x20, 0x30,
0x20, 0x32, 0x61, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x39, 0x35, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x35, 0x39, 0x20, 0x31,
0x61, 0x20, 0x20, 0x31, 0x20, 0x20, 0x65, 0x20, 0x20, 0x31, 0x20, 0x20,
0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x61, 0x33, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x33, 0x61, 0x20, 0x20, 0x30, 0x20, 0x38, 0x37, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x37, 0x38, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x61, 0x34, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x61, 0x20,
0x20, 0x30, 0x20, 0x39, 0x36, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20,
0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x36, 0x39, 0x20, 0x20, 0x30, 0x20, 0x62, 0x30, 0x20,
0x20, 0x30, 0x20, 0x62, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x31, 0x62,
0x20, 0x20, 0x30, 0x20, 0x61, 0x35, 0x20, 0x20, 0x30, 0x20, 0x62, 0x32,
0x20, 0x20, 0x65, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x35, 0x61, 0x20, 0x20, 0x30, 0x20, 0x32, 0x62,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x38, 0x38,
0x20, 0x20, 0x30, 0x20, 0x39, 0x37, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x62, 0x33, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x39, 0x20, 0x20, 0x30, 0x20, 0x33,
0x62, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x36,
0x61, 0x20, 0x20, 0x30, 0x20, 0x62, 0x34, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x62, 0x20, 0x0a, 0x20, 0x30, 0x20,
0x63, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x39, 0x38, 0x20, 0x20, 0x30, 0x20,
0x38, 0x39, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x31, 0x63, 0x20, 0x20, 0x30, 0x20, 0x62, 0x35, 0x20, 0x35, 0x30, 0x20,
0x20, 0x31, 0x20, 0x32, 0x32, 0x20, 0x20, 0x31, 0x20, 0x31, 0x30, 0x20,
0x20, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x35, 0x62, 0x20, 0x20, 0x30, 0x20, 0x32, 0x63, 0x20, 0x20, 0x30,
0x20, 0x63, 0x32, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x20, 0x62, 0x20, 0x20, 0x30, 0x20, 0x63, 0x30, 0x20, 0x20, 0x30,
0x20, 0x61, 0x36, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20,
0x30, 0x20, 0x61, 0x37, 0x20, 0x20, 0x30, 0x20, 0x37, 0x61, 0x20, 0x20,
0x61, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x63, 0x33, 0x20, 0x20,
0x30, 0x20, 0x33, 0x63, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x63, 0x20, 0x20,
0x30, 0x20, 0x39, 0x39, 0x20, 0x20, 0x30, 0x20, 0x62, 0x36, 0x20, 0x0a,
0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x36, 0x62, 0x20, 0x20, 0x30, 0x20, 0x63, 0x34, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x63, 0x20,
0x20, 0x30, 0x20, 0x61, 0x38, 0x20, 0x31, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x61, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x38, 0x61, 0x20,
0x0a, 0x20, 0x30, 0x20, 0x63, 0x35, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x64, 0x30,
0x20, 0x20, 0x30, 0x20, 0x35, 0x63, 0x20, 0x20, 0x30, 0x20, 0x64, 0x31,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x62, 0x37, 0x20, 0x20, 0x30, 0x20, 0x37, 0x62,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x64,
0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20,
0x64, 0x20, 0x20, 0x30, 0x20, 0x32, 0x64, 0x20, 0x20, 0x63, 0x20, 0x20,
0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x64, 0x32, 0x20, 0x20, 0x30, 0x20, 0x64,
0x33, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x64, 0x20, 0x20, 0x30, 0x20, 0x63,
0x36, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x36, 0x63, 0x20, 0x20, 0x30, 0x20, 0x61, 0x39, 0x20, 0x20, 0x36, 0x20,
0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x39, 0x61, 0x20, 0x20, 0x30, 0x20,
0x62, 0x38, 0x20, 0x20, 0x30, 0x20, 0x64, 0x34, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x38, 0x62, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x34, 0x64, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x63, 0x37, 0x20, 0x20, 0x30,
0x20, 0x37, 0x63, 0x20, 0x34, 0x34, 0x20, 0x20, 0x31, 0x20, 0x32, 0x32,
0x20, 0x20, 0x31, 0x20, 0x31, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x61,
0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x64, 0x35, 0x20, 0x20, 0x30,
0x20, 0x35, 0x64, 0x20, 0x0a, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x65, 0x30, 0x20, 0x20,
0x30, 0x20, 0x20, 0x65, 0x20, 0x20, 0x30, 0x20, 0x65, 0x31, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x31, 0x65, 0x20, 0x20, 0x30, 0x20, 0x65, 0x32, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x61, 0x61, 0x20, 0x20,
0x30, 0x20, 0x32, 0x65, 0x20, 0x0a, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20,
0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x62, 0x39, 0x20, 0x20, 0x30, 0x20, 0x39, 0x62, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x65, 0x33, 0x20,
0x20, 0x30, 0x20, 0x64, 0x36, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x36, 0x64, 0x20,
0x20, 0x30, 0x20, 0x33, 0x65, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x63, 0x38, 0x20, 0x20, 0x30, 0x20, 0x38, 0x63,
0x20, 0x31, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x65, 0x34, 0x20, 0x20, 0x30, 0x20, 0x34, 0x65,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x64, 0x37,
0x20, 0x20, 0x30, 0x20, 0x37, 0x64, 0x20, 0x0a, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x65,
0x35, 0x20, 0x20, 0x30, 0x20, 0x62, 0x61, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x61, 0x62, 0x20, 0x20, 0x30, 0x20, 0x35,
0x65, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x63,
0x39, 0x20, 0x20, 0x30, 0x20, 0x39, 0x63, 0x20, 0x0a, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x66, 0x31, 0x20, 0x20, 0x30, 0x20,
0x31, 0x66, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x66, 0x30, 0x20, 0x20, 0x30, 0x20, 0x36, 0x65, 0x20, 0x20, 0x30, 0x20,
0x66, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x32, 0x66, 0x20, 0x20, 0x30, 0x20, 0x65, 0x36, 0x20, 0x0a, 0x32, 0x36,
0x20, 0x20, 0x31, 0x20, 0x31, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38,
0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x64, 0x38, 0x20, 0x20, 0x30,
0x20, 0x66, 0x33, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x33, 0x66, 0x20, 0x20, 0x30, 0x20, 0x66, 0x34, 0x20, 0x20, 0x36,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20,
0x30, 0x20, 0x34, 0x66, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x38, 0x64, 0x20, 0x20, 0x30, 0x20, 0x64, 0x39, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x62, 0x62, 0x20, 0x20,
0x30, 0x20, 0x63, 0x61, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x61, 0x63, 0x20, 0x20, 0x30, 0x20, 0x65, 0x37, 0x20, 0x0a,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x65, 0x20,
0x20, 0x30, 0x20, 0x66, 0x35, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20,
0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x39, 0x64, 0x20, 0x20, 0x30, 0x20, 0x35, 0x66, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x65, 0x38, 0x20,
0x20, 0x30, 0x20, 0x38, 0x65, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x0a, 0x20, 0x30, 0x20, 0x66, 0x36, 0x20, 0x20, 0x30, 0x20, 0x63, 0x62,
0x20, 0x32, 0x32, 0x20, 0x20, 0x31, 0x20, 0x31, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x61, 0x20, 0x20, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x20, 0x66, 0x20, 0x20, 0x30, 0x20, 0x61, 0x65,
0x20, 0x20, 0x30, 0x20, 0x36, 0x66, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x0a, 0x20, 0x30, 0x20, 0x62, 0x63, 0x20, 0x20, 0x30, 0x20, 0x64,
0x61, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x61, 0x64, 0x20, 0x20, 0x30, 0x20, 0x66,
0x37, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37,
0x66, 0x20, 0x20, 0x30, 0x20, 0x65, 0x39, 0x20, 0x20, 0x38, 0x20, 0x20,
0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x39, 0x65, 0x20, 0x20, 0x30, 0x20,
0x63, 0x63, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x66, 0x38, 0x20, 0x20, 0x30, 0x20, 0x38, 0x66, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x64, 0x62, 0x20, 0x20, 0x30, 0x20, 0x62, 0x64, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x65, 0x61, 0x20, 0x20, 0x30, 0x20,
0x66, 0x39, 0x20, 0x0a, 0x31, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38,
0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x39, 0x66, 0x20, 0x20, 0x30,
0x20, 0x64, 0x63, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x63, 0x64, 0x20, 0x20, 0x30, 0x20, 0x65, 0x62, 0x20, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x62, 0x65, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x66, 0x61, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x61, 0x66, 0x20, 0x20,
0x30, 0x20, 0x64, 0x64, 0x20, 0x20, 0x65, 0x20, 0x20, 0x31, 0x20, 0x20,
0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x65, 0x63, 0x20, 0x20,
0x30, 0x20, 0x63, 0x65, 0x20, 0x20, 0x30, 0x20, 0x66, 0x62, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x62, 0x66, 0x20, 0x20, 0x30, 0x20, 0x65, 0x64, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x64, 0x65, 0x20,
0x20, 0x30, 0x20, 0x66, 0x63, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20,
0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x63, 0x66, 0x20, 0x20, 0x30, 0x20, 0x66, 0x64, 0x20,
0x20, 0x30, 0x20, 0x65, 0x65, 0x20, 0x0a, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x64, 0x66,
0x20, 0x20, 0x30, 0x20, 0x66, 0x65, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x65, 0x66, 0x20, 0x20, 0x30, 0x20, 0x66, 0x66,
0x20, 0x0a, 0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65, 0x20, 0x31, 0x36,
0x20, 0x35, 0x31, 0x31, 0x20, 0x31, 0x36, 0x20, 0x31, 0x36, 0x20, 0x20,
0x31, 0x0a, 0x2e, 0x74, 0x72, 0x65, 0x65, 0x64, 0x61, 0x74, 0x61, 0x0a,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x30, 0x20,
0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x31, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x31, 0x20,
0x32, 0x61, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20,
0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x0a, 0x20, 0x30, 0x20, 0x32, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x32,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x31,
0x20, 0x20, 0x30, 0x20, 0x31, 0x32, 0x20, 0x20, 0x61, 0x20, 0x20, 0x31,
0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x32, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x33, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x33,
0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33,
0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x33, 0x20, 0x20, 0x61, 0x20, 0x20,
0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x32, 0x20, 0x20, 0x30, 0x20, 0x32,
0x33, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20,
0x34, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x34, 0x31, 0x20, 0x20, 0x36, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x31, 0x34, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x33, 0x33, 0x20, 0x20, 0x30, 0x20, 0x34, 0x32, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x32, 0x34, 0x20, 0x20, 0x30, 0x20, 0x35, 0x30, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x34, 0x33, 0x20, 0x20, 0x30,
0x20, 0x33, 0x34, 0x20, 0x38, 0x61, 0x20, 0x20, 0x31, 0x20, 0x32, 0x38,
0x20, 0x20, 0x31, 0x20, 0x31, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x36,
0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x35, 0x20, 0x20, 0x30,
0x20, 0x31, 0x35, 0x20, 0x20, 0x30, 0x20, 0x35, 0x31, 0x20, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x35, 0x32, 0x20, 0x20, 0x30, 0x20, 0x32, 0x35, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x34, 0x34, 0x20, 0x20, 0x30, 0x20, 0x33, 0x35, 0x20, 0x20,
0x30, 0x20, 0x35, 0x33, 0x20, 0x20, 0x61, 0x20, 0x20, 0x31, 0x20, 0x20,
0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x36, 0x30, 0x20,
0x20, 0x30, 0x20, 0x20, 0x36, 0x20, 0x20, 0x30, 0x20, 0x36, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x36, 0x20,
0x20, 0x30, 0x20, 0x36, 0x32, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20,
0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x32, 0x36, 0x20, 0x20, 0x30, 0x20, 0x35, 0x34, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x34, 0x35,
0x20, 0x20, 0x30, 0x20, 0x36, 0x33, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x36,
0x20, 0x20, 0x30, 0x20, 0x37, 0x30, 0x20, 0x20, 0x30, 0x20, 0x37, 0x31,
0x20, 0x32, 0x38, 0x20, 0x20, 0x31, 0x20, 0x31, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x31, 0x37, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x37, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x35, 0x20, 0x20, 0x30, 0x20, 0x36,
0x34, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x32, 0x20, 0x20, 0x30, 0x20, 0x32,
0x37, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x36, 0x20, 0x0a, 0x20, 0x30, 0x20,
0x36, 0x35, 0x20, 0x20, 0x30, 0x20, 0x37, 0x33, 0x20, 0x20, 0x61, 0x20,
0x20, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x37, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x36, 0x20, 0x20, 0x30, 0x20,
0x20, 0x38, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x38, 0x30, 0x20, 0x20, 0x30, 0x20, 0x38, 0x31, 0x20, 0x0a, 0x20, 0x36,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x31, 0x38, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x37, 0x34, 0x20, 0x20, 0x30, 0x20, 0x34, 0x37, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x38, 0x32, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x38, 0x20, 0x20, 0x30,
0x20, 0x36, 0x36, 0x20, 0x31, 0x38, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20,
0x65, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x38, 0x33, 0x20, 0x20, 0x30, 0x20, 0x33, 0x38, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x35, 0x20, 0x20,
0x30, 0x20, 0x38, 0x34, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x38, 0x20, 0x0a,
0x20, 0x30, 0x20, 0x39, 0x30, 0x20, 0x20, 0x30, 0x20, 0x39, 0x31, 0x20,
0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x31, 0x39, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x20, 0x39, 0x20, 0x20, 0x30, 0x20, 0x37, 0x36, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x39, 0x32, 0x20,
0x20, 0x30, 0x20, 0x32, 0x39, 0x20, 0x20, 0x65, 0x20, 0x20, 0x31, 0x20,
0x0a, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x38, 0x35,
0x20, 0x20, 0x30, 0x20, 0x35, 0x38, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x39, 0x33, 0x20, 0x20, 0x30, 0x20, 0x33, 0x39,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x61, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x61,
0x20, 0x0a, 0x20, 0x30, 0x20, 0x31, 0x61, 0x20, 0x20, 0x38, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x61,
0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x36,
0x37, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35,
0x37, 0x20, 0x20, 0x30, 0x20, 0x34, 0x39, 0x20, 0x20, 0x36, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x39,
0x34, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x37, 0x37, 0x20, 0x20, 0x30, 0x20, 0x38, 0x36, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x61, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x36, 0x38, 0x20, 0x20, 0x30, 0x20,
0x39, 0x35, 0x20, 0x64, 0x63, 0x20, 0x20, 0x31, 0x20, 0x37, 0x65, 0x20,
0x20, 0x31, 0x20, 0x33, 0x32, 0x20, 0x20, 0x31, 0x20, 0x31, 0x61, 0x20,
0x20, 0x31, 0x20, 0x0a, 0x20, 0x63, 0x20, 0x20, 0x31, 0x20, 0x20, 0x36,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x32, 0x61, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x35, 0x39, 0x20, 0x20, 0x30, 0x20, 0x33, 0x61, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x61, 0x33, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x38, 0x37, 0x20, 0x20, 0x30,
0x20, 0x37, 0x38, 0x20, 0x0a, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x61, 0x34, 0x20, 0x20, 0x30, 0x20, 0x34, 0x61, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x39, 0x36, 0x20, 0x20,
0x30, 0x20, 0x36, 0x39, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x62, 0x30, 0x20, 0x20,
0x30, 0x20, 0x20, 0x62, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x62, 0x31, 0x20,
0x20, 0x61, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x62, 0x20,
0x20, 0x30, 0x20, 0x62, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x32, 0x62, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x61, 0x35, 0x20, 0x20, 0x30, 0x20, 0x35, 0x61, 0x20,
0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x62, 0x33, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x61, 0x36, 0x20, 0x20, 0x30, 0x20, 0x36, 0x61,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x62, 0x34, 0x20, 0x20, 0x30, 0x20, 0x34, 0x62,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x63,
0x20, 0x20, 0x30, 0x20, 0x63, 0x31, 0x20, 0x0a, 0x31, 0x65, 0x20, 0x20,
0x31, 0x20, 0x20, 0x65, 0x20, 0x20, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20,
0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x62, 0x35, 0x20, 0x20, 0x30, 0x20, 0x63,
0x32, 0x20, 0x20, 0x30, 0x20, 0x32, 0x63, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x61,
0x37, 0x20, 0x20, 0x30, 0x20, 0x63, 0x33, 0x20, 0x0a, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x36, 0x62, 0x20, 0x20, 0x30, 0x20,
0x63, 0x34, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x64, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x38, 0x38, 0x20, 0x20, 0x30, 0x20, 0x39, 0x37, 0x20, 0x20, 0x30, 0x20,
0x33, 0x62, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x64, 0x31, 0x20, 0x20, 0x30,
0x20, 0x64, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x32, 0x64, 0x20, 0x20, 0x30, 0x20, 0x64, 0x33, 0x20, 0x31, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x31, 0x65, 0x20, 0x20, 0x30, 0x20, 0x32, 0x65, 0x20, 0x0a, 0x20,
0x30, 0x20, 0x65, 0x32, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x37, 0x39, 0x20, 0x20, 0x30, 0x20, 0x39, 0x38, 0x20, 0x20,
0x30, 0x20, 0x63, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x31, 0x63, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x38, 0x39, 0x20, 0x20, 0x30, 0x20, 0x35, 0x62, 0x20, 0x0a,
0x20, 0x65, 0x20, 0x20, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x63, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x61, 0x20,
0x20, 0x30, 0x20, 0x62, 0x36, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x63, 0x20,
0x20, 0x30, 0x20, 0x39, 0x39, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x0a, 0x20, 0x30, 0x20, 0x61, 0x38, 0x20, 0x20, 0x30, 0x20, 0x38, 0x61,
0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x20, 0x64, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x63, 0x35, 0x20, 0x20, 0x30, 0x20, 0x35, 0x63,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x33, 0x64, 0x20, 0x20, 0x30, 0x20, 0x63, 0x36,
0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x36,
0x63, 0x20, 0x20, 0x30, 0x20, 0x39, 0x61, 0x20, 0x35, 0x38, 0x20, 0x20,
0x31, 0x20, 0x35, 0x36, 0x20, 0x20, 0x31, 0x20, 0x32, 0x34, 0x20, 0x20,
0x31, 0x20, 0x31, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38, 0x20, 0x20,
0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x38, 0x62, 0x20, 0x20, 0x30, 0x20, 0x34,
0x64, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x63, 0x37, 0x20, 0x20, 0x30, 0x20, 0x37, 0x63, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x64, 0x35, 0x20, 0x20, 0x30, 0x20, 0x35, 0x64, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x65, 0x30, 0x20, 0x20, 0x30, 0x20,
0x20, 0x65, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x65, 0x33, 0x20, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x64, 0x30, 0x20, 0x20, 0x30, 0x20, 0x62, 0x37, 0x20, 0x20, 0x30,
0x20, 0x37, 0x62, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x61, 0x39, 0x20, 0x20, 0x30, 0x20, 0x62, 0x38, 0x20, 0x20, 0x30,
0x20, 0x64, 0x34, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x65, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x61, 0x61, 0x20, 0x20, 0x30, 0x20, 0x62, 0x39, 0x20, 0x31,
0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x61, 0x20, 0x20, 0x31, 0x20, 0x20,
0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x39, 0x62, 0x20, 0x20,
0x30, 0x20, 0x64, 0x36, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x36, 0x64, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x65, 0x20,
0x20, 0x30, 0x20, 0x63, 0x38, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20,
0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x38, 0x63, 0x20, 0x20, 0x30, 0x20, 0x65, 0x34, 0x20,
0x20, 0x30, 0x20, 0x34, 0x65, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x64, 0x37,
0x20, 0x20, 0x30, 0x20, 0x65, 0x35, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x62, 0x61, 0x20, 0x20, 0x30, 0x20, 0x61, 0x62,
0x20, 0x20, 0x63, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x39, 0x63,
0x20, 0x20, 0x30, 0x20, 0x65, 0x36, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x36,
0x65, 0x20, 0x20, 0x30, 0x20, 0x64, 0x38, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x38, 0x64, 0x20, 0x20, 0x30, 0x20, 0x62,
0x62, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x65,
0x37, 0x20, 0x20, 0x30, 0x20, 0x39, 0x64, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x65, 0x38, 0x20, 0x0a, 0x20, 0x30, 0x20,
0x38, 0x65, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x63, 0x62, 0x20, 0x20, 0x30, 0x20,
0x62, 0x63, 0x20, 0x20, 0x30, 0x20, 0x39, 0x65, 0x20, 0x20, 0x30, 0x20,
0x66, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x31, 0x66, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x20, 0x66, 0x20, 0x20, 0x30, 0x20, 0x32, 0x66, 0x20, 0x0a, 0x34, 0x32,
0x20, 0x20, 0x31, 0x20, 0x33, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x66, 0x32, 0x20, 0x33, 0x34,
0x20, 0x20, 0x31, 0x20, 0x33, 0x32, 0x20, 0x20, 0x31, 0x20, 0x31, 0x34,
0x20, 0x20, 0x31, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x62, 0x64, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x65, 0x20, 0x0a, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x64, 0x20, 0x20,
0x30, 0x20, 0x63, 0x39, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x63, 0x61, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x61, 0x63, 0x20, 0x20,
0x30, 0x20, 0x37, 0x65, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x64, 0x61, 0x20, 0x0a,
0x20, 0x30, 0x20, 0x61, 0x64, 0x20, 0x20, 0x30, 0x20, 0x63, 0x63, 0x20,
0x20, 0x61, 0x20, 0x20, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x61, 0x65, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x64, 0x62, 0x20,
0x20, 0x30, 0x20, 0x64, 0x63, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x63, 0x64, 0x20, 0x20, 0x30, 0x20, 0x62, 0x65, 0x20,
0x0a, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x65, 0x62,
0x20, 0x20, 0x30, 0x20, 0x65, 0x64, 0x20, 0x20, 0x30, 0x20, 0x65, 0x65,
0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x64, 0x39,
0x20, 0x20, 0x30, 0x20, 0x65, 0x61, 0x20, 0x20, 0x30, 0x20, 0x65, 0x39,
0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x64,
0x65, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x64, 0x64, 0x20, 0x20, 0x30, 0x20, 0x65,
0x63, 0x20, 0x20, 0x30, 0x20, 0x63, 0x65, 0x20, 0x20, 0x30, 0x20, 0x33,
0x66, 0x20, 0x20, 0x30, 0x20, 0x66, 0x30, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x66,
0x33, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x66, 0x34, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x66, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x66, 0x35, 0x20, 0x20, 0x30, 0x20,
0x35, 0x66, 0x20, 0x20, 0x61, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x66, 0x66, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x66, 0x36, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x36, 0x66, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x66, 0x37, 0x20, 0x20, 0x30,
0x20, 0x37, 0x66, 0x20, 0x20, 0x63, 0x20, 0x20, 0x31, 0x20, 0x20, 0x36,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x38, 0x66, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x66, 0x38, 0x20, 0x20, 0x30, 0x20, 0x66, 0x39, 0x20, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x39, 0x66, 0x20, 0x20, 0x30, 0x20, 0x66, 0x61, 0x20, 0x20,
0x30, 0x20, 0x61, 0x66, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x66, 0x62, 0x20, 0x20, 0x30, 0x20, 0x62, 0x66, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x66, 0x63, 0x20, 0x20,
0x30, 0x20, 0x63, 0x66, 0x20, 0x0a, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x66, 0x64, 0x20,
0x20, 0x30, 0x20, 0x64, 0x66, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x66, 0x65, 0x20, 0x20, 0x30, 0x20, 0x65, 0x66, 0x20,
0x0a, 0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65, 0x20, 0x31, 0x37, 0x20,
0x35, 0x31, 0x31, 0x20, 0x31, 0x36, 0x20, 0x31, 0x36, 0x20, 0x20, 0x32,
0x0a, 0x2e, 0x72, 0x65, 0x66, 0x65, 0x72, 0x65, 0x6e, 0x63, 0x65, 0x20,
0x31, 0x36, 0x0a, 0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65, 0x20, 0x31,
0x38, 0x20, 0x35, 0x31, 0x31, 0x20, 0x31, 0x36, 0x20, 0x31, 0x36, 0x20,
0x20, 0x33, 0x0a, 0x2e, 0x72, 0x65, 0x66, 0x65, 0x72, 0x65, 0x6e, 0x63,
0x65, 0x20, 0x31, 0x36, 0x0a, 0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65,
0x20, 0x31, 0x39, 0x20, 0x35, 0x31, 0x31, 0x20, 0x31, 0x36, 0x20, 0x31,
0x36, 0x20, 0x20, 0x34, 0x0a, 0x2e, 0x72, 0x65, 0x66, 0x65, 0x72, 0x65,
0x6e, 0x63, 0x65, 0x20, 0x31, 0x36, 0x0a, 0x0a, 0x2e, 0x74, 0x61, 0x62,
0x6c, 0x65, 0x20, 0x32, 0x30, 0x20, 0x35, 0x31, 0x31, 0x20, 0x31, 0x36,
0x20, 0x31, 0x36, 0x20, 0x20, 0x36, 0x0a, 0x2e, 0x72, 0x65, 0x66, 0x65,
0x72, 0x65, 0x6e, 0x63, 0x65, 0x20, 0x31, 0x36, 0x0a, 0x0a, 0x2e, 0x74,
0x61, 0x62, 0x6c, 0x65, 0x20, 0x32, 0x31, 0x20, 0x35, 0x31, 0x31, 0x20,
0x31, 0x36, 0x20, 0x31, 0x36, 0x20, 0x20, 0x38, 0x0a, 0x2e, 0x72, 0x65,
0x66, 0x65, 0x72, 0x65, 0x6e, 0x63, 0x65, 0x20, 0x31, 0x36, 0x0a, 0x0a,
0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65, 0x20, 0x32, 0x32, 0x20, 0x35, 0x31,
0x31, 0x20, 0x31, 0x36, 0x20, 0x31, 0x36, 0x20, 0x31, 0x30, 0x0a, 0x2e,
0x72, 0x65, 0x66, 0x65, 0x72, 0x65, 0x6e, 0x63, 0x65, 0x20, 0x31, 0x36,
0x0a, 0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65, 0x20, 0x32, 0x33, 0x20,
0x35, 0x31, 0x31, 0x20, 0x31, 0x36, 0x20, 0x31, 0x36, 0x20, 0x31, 0x33,
0x0a, 0x2e, 0x72, 0x65, 0x66, 0x65, 0x72, 0x65, 0x6e, 0x63, 0x65, 0x20,
0x31, 0x36, 0x0a, 0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65, 0x20, 0x32,
0x34, 0x20, 0x35, 0x31, 0x32, 0x20, 0x31, 0x36, 0x20, 0x31, 0x36, 0x20,
0x20, 0x34, 0x0a, 0x2e, 0x74, 0x72, 0x65, 0x65, 0x64, 0x61, 0x74, 0x61,
0x0a, 0x33, 0x63, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x30, 0x20, 0x31, 0x30,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x31, 0x31, 0x20, 0x20, 0x65, 0x20, 0x20, 0x31,
0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32,
0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x30, 0x20, 0x32,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31,
0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32,
0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33,
0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x33, 0x20, 0x20, 0x65, 0x20, 0x20,
0x31, 0x20, 0x0a, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x31, 0x20, 0x20, 0x30, 0x20,
0x31, 0x33, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x32, 0x20, 0x20, 0x30, 0x20,
0x32, 0x33, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x30, 0x20, 0x20, 0x30, 0x20,
0x20, 0x34, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x34, 0x31, 0x20, 0x20, 0x38,
0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x34, 0x20, 0x20, 0x30,
0x20, 0x33, 0x33, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x34, 0x32, 0x20, 0x20, 0x30, 0x20, 0x32, 0x34, 0x20, 0x20, 0x36,
0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x34, 0x33, 0x20, 0x20,
0x30, 0x20, 0x33, 0x34, 0x20, 0x20, 0x30, 0x20, 0x35, 0x31, 0x20, 0x20,
0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x30, 0x20, 0x20,
0x30, 0x20, 0x20, 0x35, 0x20, 0x20, 0x30, 0x20, 0x31, 0x35, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x32, 0x20, 0x20,
0x30, 0x20, 0x32, 0x35, 0x20, 0x0a, 0x66, 0x61, 0x20, 0x20, 0x31, 0x20,
0x36, 0x32, 0x20, 0x20, 0x31, 0x20, 0x32, 0x32, 0x20, 0x20, 0x31, 0x20,
0x31, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x61, 0x20, 0x20, 0x31, 0x20,
0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x34, 0x34, 0x20, 0x20, 0x30, 0x20, 0x35, 0x33, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x35, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x36, 0x30,
0x20, 0x20, 0x30, 0x20, 0x20, 0x36, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x36, 0x31,
0x20, 0x20, 0x30, 0x20, 0x31, 0x36, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x36, 0x32, 0x20, 0x20, 0x30, 0x20, 0x32, 0x36,
0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x35,
0x34, 0x20, 0x20, 0x30, 0x20, 0x34, 0x35, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x36, 0x33, 0x20, 0x20, 0x30, 0x20, 0x33,
0x36, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35,
0x35, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x36,
0x34, 0x20, 0x20, 0x30, 0x20, 0x34, 0x36, 0x20, 0x0a, 0x32, 0x30, 0x20,
0x20, 0x31, 0x20, 0x20, 0x65, 0x20, 0x20, 0x31, 0x20, 0x20, 0x36, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x37, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x32, 0x37, 0x20, 0x20, 0x30, 0x20, 0x33, 0x37, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x33, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30,
0x20, 0x37, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x37, 0x20, 0x20, 0x30,
0x20, 0x31, 0x37, 0x20, 0x20, 0x61, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x36, 0x35, 0x20, 0x20, 0x30, 0x20, 0x35, 0x36, 0x20, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x38, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x38, 0x20, 0x0a, 0x20,
0x30, 0x20, 0x38, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x34, 0x20, 0x20,
0x30, 0x20, 0x34, 0x37, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x31, 0x38, 0x20, 0x20, 0x30, 0x20, 0x38, 0x32, 0x20, 0x31,
0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a,
0x20, 0x30, 0x20, 0x32, 0x38, 0x20, 0x20, 0x30, 0x20, 0x36, 0x36, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x38, 0x33, 0x20,
0x20, 0x30, 0x20, 0x33, 0x38, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x35, 0x20,
0x20, 0x30, 0x20, 0x35, 0x37, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x38, 0x34, 0x20, 0x20, 0x30, 0x20, 0x34, 0x38, 0x20,
0x0a, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x39, 0x31,
0x20, 0x20, 0x30, 0x20, 0x31, 0x39, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x39, 0x32, 0x20, 0x20, 0x30, 0x20, 0x37, 0x36,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x36, 0x37, 0x20, 0x20, 0x30, 0x20, 0x32, 0x39,
0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x38,
0x35, 0x20, 0x20, 0x30, 0x20, 0x35, 0x38, 0x20, 0x35, 0x63, 0x20, 0x20,
0x31, 0x20, 0x32, 0x32, 0x20, 0x20, 0x31, 0x20, 0x31, 0x30, 0x20, 0x20,
0x31, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x39,
0x33, 0x20, 0x20, 0x30, 0x20, 0x33, 0x39, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x39, 0x34, 0x20, 0x20, 0x30, 0x20,
0x34, 0x39, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x37, 0x20, 0x20, 0x30, 0x20,
0x38, 0x36, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x36, 0x38, 0x20, 0x20, 0x30, 0x20, 0x61, 0x31, 0x20, 0x20, 0x38, 0x20,
0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x61, 0x32, 0x20, 0x20, 0x30,
0x20, 0x32, 0x61, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x39, 0x35, 0x20, 0x20, 0x30, 0x20, 0x35, 0x39, 0x20, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x61, 0x33, 0x20, 0x20, 0x30, 0x20, 0x33, 0x61, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x38, 0x37, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x37, 0x38, 0x20, 0x20,
0x30, 0x20, 0x34, 0x61, 0x20, 0x31, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20,
0x63, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x61, 0x34, 0x20, 0x20,
0x30, 0x20, 0x39, 0x36, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x36, 0x39, 0x20, 0x20,
0x30, 0x20, 0x62, 0x31, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x31, 0x62, 0x20, 0x20, 0x30, 0x20, 0x61, 0x35, 0x20,
0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x62, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x35, 0x61, 0x20, 0x20, 0x30, 0x20, 0x32, 0x62, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x38, 0x38, 0x20,
0x20, 0x30, 0x20, 0x62, 0x33, 0x20, 0x0a, 0x31, 0x30, 0x20, 0x20, 0x31,
0x20, 0x20, 0x61, 0x20, 0x20, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x39, 0x30,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x39,
0x20, 0x20, 0x30, 0x20, 0x61, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x39, 0x37, 0x20, 0x20, 0x30, 0x20, 0x37, 0x39,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x61, 0x36, 0x20, 0x20, 0x30, 0x20, 0x36,
0x61, 0x20, 0x20, 0x30, 0x20, 0x62, 0x34, 0x20, 0x20, 0x63, 0x20, 0x20,
0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x61, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x61, 0x20, 0x20, 0x30, 0x20, 0x62,
0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20,
0x33, 0x62, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x20, 0x62, 0x20, 0x20, 0x30, 0x20, 0x63, 0x30, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x34, 0x62, 0x20, 0x20, 0x30, 0x20, 0x63, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x39, 0x38, 0x20, 0x20, 0x30, 0x20,
0x38, 0x39, 0x20, 0x34, 0x33, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x32, 0x32,
0x20, 0x20, 0x31, 0x20, 0x31, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38,
0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x63, 0x20, 0x20, 0x30,
0x20, 0x62, 0x35, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x35, 0x62, 0x20, 0x20, 0x30, 0x20, 0x63, 0x32, 0x20, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20,
0x30, 0x20, 0x32, 0x63, 0x20, 0x20, 0x30, 0x20, 0x61, 0x37, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x61, 0x20, 0x20,
0x30, 0x20, 0x63, 0x33, 0x20, 0x20, 0x61, 0x20, 0x20, 0x31, 0x20, 0x20,
0x36, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x33, 0x63, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x20, 0x63, 0x20, 0x20, 0x30, 0x20, 0x64, 0x30, 0x20, 0x0a,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x62, 0x36, 0x20,
0x20, 0x30, 0x20, 0x36, 0x62, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x63, 0x34, 0x20,
0x20, 0x30, 0x20, 0x34, 0x63, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x39, 0x39, 0x20, 0x20, 0x30, 0x20, 0x61, 0x38, 0x20,
0x31, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20,
0x0a, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x38, 0x61, 0x20, 0x20, 0x30, 0x20, 0x63, 0x35,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x35, 0x63,
0x20, 0x20, 0x30, 0x20, 0x64, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x62, 0x37,
0x20, 0x20, 0x30, 0x20, 0x37, 0x62, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x0a, 0x20, 0x30, 0x20, 0x31, 0x64, 0x20, 0x20, 0x30, 0x20, 0x64,
0x32, 0x20, 0x20, 0x39, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32,
0x64, 0x20, 0x20, 0x30, 0x20, 0x64, 0x33, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x64, 0x20, 0x20, 0x30, 0x20, 0x63,
0x36, 0x20, 0x35, 0x35, 0x20, 0x66, 0x61, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x36, 0x63, 0x20, 0x20, 0x30, 0x20, 0x61, 0x39, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x39, 0x61, 0x20, 0x20, 0x30, 0x20,
0x64, 0x34, 0x20, 0x32, 0x30, 0x20, 0x20, 0x31, 0x20, 0x31, 0x30, 0x20,
0x20, 0x31, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x62, 0x38, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x38, 0x62, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x34, 0x64, 0x20, 0x20, 0x30,
0x20, 0x63, 0x37, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x37, 0x63, 0x20, 0x20, 0x30,
0x20, 0x64, 0x35, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x35, 0x64, 0x20, 0x20, 0x30, 0x20, 0x65, 0x31, 0x20, 0x20, 0x38,
0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x31, 0x65, 0x20, 0x20,
0x30, 0x20, 0x65, 0x32, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x61, 0x61, 0x20, 0x20, 0x30, 0x20, 0x62, 0x39, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x39, 0x62, 0x20, 0x20, 0x30, 0x20, 0x65, 0x33, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x64, 0x36, 0x20,
0x20, 0x30, 0x20, 0x36, 0x64, 0x20, 0x31, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x61, 0x20, 0x20, 0x31, 0x20, 0x20, 0x36, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x33, 0x65, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x32, 0x65, 0x20,
0x20, 0x30, 0x20, 0x34, 0x65, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x63, 0x38, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x38, 0x63,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x65, 0x34, 0x20, 0x20, 0x30, 0x20, 0x64, 0x37,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x37, 0x64, 0x20, 0x20, 0x30, 0x20, 0x61, 0x62,
0x20, 0x20, 0x30, 0x20, 0x65, 0x35, 0x20, 0x20, 0x61, 0x20, 0x20, 0x31,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x62, 0x61, 0x20, 0x20, 0x30, 0x20, 0x35,
0x65, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x63,
0x39, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x39,
0x63, 0x20, 0x20, 0x30, 0x20, 0x36, 0x65, 0x20, 0x20, 0x38, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x65,
0x36, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20,
0x20, 0x64, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x65, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x65, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x64, 0x38, 0x20, 0x20, 0x30, 0x20, 0x38, 0x64, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x62, 0x62, 0x20, 0x20, 0x30, 0x20,
0x63, 0x61, 0x20, 0x34, 0x61, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x66, 0x66, 0x20, 0x34, 0x30,
0x20, 0x20, 0x31, 0x20, 0x33, 0x61, 0x20, 0x20, 0x31, 0x20, 0x32, 0x30,
0x20, 0x20, 0x31, 0x20, 0x31, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38,
0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x61, 0x63, 0x20, 0x20, 0x30,
0x20, 0x65, 0x37, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20,
0x30, 0x20, 0x37, 0x65, 0x20, 0x20, 0x30, 0x20, 0x64, 0x39, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x39, 0x64, 0x20, 0x20, 0x30, 0x20, 0x65, 0x38, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x38, 0x65, 0x20, 0x20,
0x30, 0x20, 0x63, 0x62, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20,
0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x0a,
0x20, 0x30, 0x20, 0x62, 0x63, 0x20, 0x20, 0x30, 0x20, 0x64, 0x61, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x61, 0x64, 0x20,
0x20, 0x30, 0x20, 0x65, 0x39, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x39, 0x65, 0x20,
0x20, 0x30, 0x20, 0x63, 0x63, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x64, 0x62, 0x20, 0x20, 0x30, 0x20, 0x62, 0x64, 0x20,
0x0a, 0x31, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x65, 0x61, 0x20, 0x20, 0x30, 0x20, 0x61, 0x65,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x64, 0x63,
0x20, 0x20, 0x30, 0x20, 0x63, 0x64, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x65, 0x62,
0x20, 0x0a, 0x20, 0x30, 0x20, 0x62, 0x65, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x64, 0x64, 0x20, 0x20, 0x30, 0x20, 0x65,
0x63, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20,
0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x63,
0x65, 0x20, 0x20, 0x30, 0x20, 0x65, 0x64, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x64, 0x65, 0x20, 0x20, 0x30, 0x20, 0x65,
0x65, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x20, 0x66, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x66, 0x30, 0x20, 0x20, 0x30, 0x20, 0x31, 0x66, 0x20, 0x20, 0x30, 0x20,
0x66, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x66, 0x32, 0x20, 0x20, 0x30, 0x20,
0x32, 0x66, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x66, 0x33, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x33, 0x66, 0x20, 0x31, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x66, 0x34, 0x20, 0x20, 0x30, 0x20, 0x34, 0x66, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x66, 0x35, 0x20, 0x20, 0x30,
0x20, 0x35, 0x66, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x66, 0x36, 0x20, 0x20,
0x30, 0x20, 0x36, 0x66, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x66, 0x37, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x37, 0x66, 0x20, 0x20, 0x30, 0x20, 0x38, 0x66, 0x20, 0x20,
0x61, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x66, 0x38, 0x20, 0x20,
0x30, 0x20, 0x66, 0x39, 0x20, 0x0a, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x39, 0x66, 0x20,
0x20, 0x30, 0x20, 0x61, 0x66, 0x20, 0x20, 0x30, 0x20, 0x66, 0x61, 0x20,
0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20,
0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x66, 0x62, 0x20,
0x20, 0x30, 0x20, 0x62, 0x66, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20,
0x20, 0x30, 0x20, 0x66, 0x63, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x63, 0x66,
0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31,
0x20, 0x20, 0x30, 0x20, 0x66, 0x64, 0x20, 0x20, 0x30, 0x20, 0x64, 0x66,
0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x66, 0x65,
0x20, 0x20, 0x30, 0x20, 0x65, 0x66, 0x20, 0x0a, 0x0a, 0x2e, 0x74, 0x61,
0x62, 0x6c, 0x65, 0x20, 0x32, 0x35, 0x20, 0x35, 0x31, 0x32, 0x20, 0x31,
0x36, 0x20, 0x31, 0x36, 0x20, 0x20, 0x35, 0x0a, 0x2e, 0x72, 0x65, 0x66,
0x65, 0x72, 0x65, 0x6e, 0x63, 0x65, 0x20, 0x32, 0x34, 0x0a, 0x0a, 0x2e,
0x74, 0x61, 0x62, 0x6c, 0x65, 0x20, 0x32, 0x36, 0x20, 0x35, 0x31, 0x32,
0x20, 0x31, 0x36, 0x20, 0x31, 0x36, 0x20, 0x20, 0x36, 0x0a, 0x2e, 0x72,
0x65, 0x66, 0x65, 0x72, 0x65, 0x6e, 0x63, 0x65, 0x20, 0x32, 0x34, 0x0a,
0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65, 0x20, 0x32, 0x37, 0x20, 0x35,
0x31, 0x32, 0x20, 0x31, 0x36, 0x20, 0x31, 0x36, 0x20, 0x20, 0x37, 0x0a,
0x2e, 0x72, 0x65, 0x66, 0x65, 0x72, 0x65, 0x6e, 0x63, 0x65, 0x20, 0x32,
0x34, 0x0a, 0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65, 0x20, 0x32, 0x38,
0x20, 0x35, 0x31, 0x32, 0x20, 0x31, 0x36, 0x20, 0x31, 0x36, 0x20, 0x20,
0x38, 0x0a, 0x2e, 0x72, 0x65, 0x66, 0x65, 0x72, 0x65, 0x6e, 0x63, 0x65,
0x20, 0x32, 0x34, 0x0a, 0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65, 0x20,
0x32, 0x39, 0x20, 0x35, 0x31, 0x32, 0x20, 0x31, 0x36, 0x20, 0x31, 0x36,
0x20, 0x20, 0x39, 0x0a, 0x2e, 0x72, 0x65, 0x66, 0x65, 0x72, 0x65, 0x6e,
0x63, 0x65, 0x20, 0x32, 0x34, 0x0a, 0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c,
0x65, 0x20, 0x33, 0x30, 0x20, 0x35, 0x31, 0x32, 0x20, 0x31, 0x36, 0x20,
0x31, 0x36, 0x20, 0x31, 0x31, 0x0a, 0x2e, 0x72, 0x65, 0x66, 0x65, 0x72,
0x65, 0x6e, 0x63, 0x65, 0x20, 0x32, 0x34, 0x0a, 0x0a, 0x2e, 0x74, 0x61,
0x62, 0x6c, 0x65, 0x20, 0x33, 0x31, 0x20, 0x35, 0x31, 0x32, 0x20, 0x31,
0x36, 0x20, 0x31, 0x36, 0x20, 0x31, 0x33, 0x0a, 0x2e, 0x72, 0x65, 0x66,
0x65, 0x72, 0x65, 0x6e, 0x63, 0x65, 0x20, 0x32, 0x34, 0x0a, 0x0a, 0x2e,
0x74, 0x61, 0x62, 0x6c, 0x65, 0x20, 0x33, 0x32, 0x20, 0x20, 0x33, 0x31,
0x20, 0x20, 0x31, 0x20, 0x31, 0x36, 0x20, 0x20, 0x30, 0x0a, 0x2e, 0x74,
0x72, 0x65, 0x65, 0x64, 0x61, 0x74, 0x61, 0x0a, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x38, 0x20, 0x20,
0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x38, 0x20, 0x20, 0x30, 0x20, 0x20,
0x34, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20,
0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x32, 0x20, 0x20, 0x38, 0x20, 0x20,
0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x0a, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x63, 0x20, 0x20, 0x30, 0x20,
0x20, 0x61, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x20, 0x33, 0x20, 0x20, 0x30, 0x20, 0x20, 0x36, 0x20, 0x20, 0x36, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x20, 0x39, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x20, 0x35, 0x20, 0x20, 0x30, 0x20, 0x20, 0x37, 0x20, 0x0a, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x20, 0x65, 0x20, 0x20, 0x30, 0x20, 0x20, 0x64, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x66, 0x20, 0x20, 0x30,
0x20, 0x20, 0x62, 0x20, 0x0a, 0x0a, 0x2e, 0x74, 0x61, 0x62, 0x6c, 0x65,
0x20, 0x33, 0x33, 0x20, 0x20, 0x33, 0x31, 0x20, 0x20, 0x31, 0x20, 0x31,
0x36, 0x20, 0x20, 0x30, 0x0a, 0x2e, 0x74, 0x72, 0x65, 0x65, 0x64, 0x61,
0x74, 0x61, 0x0a, 0x31, 0x30, 0x20, 0x20, 0x31, 0x20, 0x20, 0x38, 0x20,
0x20, 0x31, 0x20, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20,
0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x30, 0x20, 0x20, 0x30, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x20, 0x32, 0x20, 0x20, 0x30, 0x20, 0x20, 0x33, 0x20, 0x20, 0x34, 0x20,
0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20,
0x20, 0x34, 0x20, 0x0a, 0x20, 0x30, 0x20, 0x20, 0x35, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x36, 0x20, 0x20, 0x30,
0x20, 0x20, 0x37, 0x20, 0x20, 0x38, 0x20, 0x20, 0x31, 0x20, 0x20, 0x34,
0x20, 0x20, 0x31, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30,
0x20, 0x20, 0x38, 0x20, 0x20, 0x30, 0x20, 0x20, 0x39, 0x20, 0x20, 0x32,
0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x61, 0x20, 0x20, 0x30,
0x20, 0x20, 0x62, 0x20, 0x0a, 0x20, 0x34, 0x20, 0x20, 0x31, 0x20, 0x20,
0x32, 0x20, 0x20, 0x31, 0x20, 0x20, 0x30, 0x20, 0x20, 0x63, 0x20, 0x20,
0x30, 0x20, 0x20, 0x64, 0x20, 0x20, 0x32, 0x20, 0x20, 0x31, 0x20, 0x20,
0x30, 0x20, 0x20, 0x65, 0x20, 0x20, 0x30, 0x20, 0x20, 0x66, 0x20, 0x0a,
0x0a, 0x2e, 0x65, 0x6e, 0x64, 0x0a
};
live/liveMedia/MP3StreamState.cpp 000444 001752 001752 00000032144 12656261123 016634 0 ustar 00rsf rsf 000000 000000 /**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A class encapsulating the state of an MP3 stream
// Implementation
#include "MP3StreamState.hh"
#include "InputFile.hh"
#include "GroupsockHelper.hh"
#if defined(__WIN32__) || defined(_WIN32)
#define snprintf _snprintf
#if _MSC_VER >= 1400 // 1400 == vs2005
#define fileno _fileno
#endif
#endif
#define MILLION 1000000
MP3StreamState::MP3StreamState(UsageEnvironment& env)
: fEnv(env), fFid(NULL), fPresentationTimeScale(1) {
}
MP3StreamState::~MP3StreamState() {
// Close our open file or socket:
if (fFid != NULL && fFid != stdin) {
if (fFidIsReallyASocket) {
intptr_t fid_long = (intptr_t)fFid;
closeSocket((int)fid_long);
} else {
CloseInputFile(fFid);
}
}
}
void MP3StreamState::assignStream(FILE* fid, unsigned fileSize) {
fFid = fid;
if (fileSize == (unsigned)(-1)) { /*HACK#####*/
fFidIsReallyASocket = 1;
fFileSize = 0;
} else {
fFidIsReallyASocket = 0;
fFileSize = fileSize;
}
fNumFramesInFile = 0; // until we know otherwise
fIsVBR = fHasXingTOC = False; // ditto
// Set the first frame's 'presentation time' to the current wall time:
gettimeofday(&fNextFramePresentationTime, NULL);
}
struct timeval MP3StreamState::currentFramePlayTime() const {
unsigned const numSamples = 1152;
unsigned const freq = fr().samplingFreq*(1 + fr().isMPEG2);
// result is numSamples/freq
unsigned const uSeconds
= ((numSamples*2*MILLION)/freq + 1)/2; // rounds to nearest integer
struct timeval result;
result.tv_sec = uSeconds/MILLION;
result.tv_usec = uSeconds%MILLION;
return result;
}
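The expression above, `((numSamples*2*MILLION)/freq + 1)/2`, computes `numSamples*MILLION/freq` rounded to the nearest integer using only integer arithmetic: it evaluates the quotient at double scale, adds 1, and halves. A standalone sketch of the same trick (the function name is ours, not the library's):

```cpp
#include <cassert>

// Round numSamples*1000000/freq to the nearest integer without
// floating point: compute at double scale, add 1, then halve.
static unsigned frameDurationUSecs(unsigned numSamples, unsigned freq) {
  const unsigned MILLION = 1000000;
  return ((numSamples * 2 * MILLION) / freq + 1) / 2;
}
```

With the largest frame size used here (1152 samples), the double-scaled numerator is 2,304,000,000, which still fits in a 32-bit unsigned.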
float MP3StreamState::filePlayTime() const {
unsigned numFramesInFile = fNumFramesInFile;
if (numFramesInFile == 0) {
// Estimate the number of frames from the file size, and the
// size of the current frame:
numFramesInFile = fFileSize/(4 + fCurrentFrame.frameSize);
}
struct timeval const pt = currentFramePlayTime();
return numFramesInFile*(pt.tv_sec + pt.tv_usec/(float)MILLION);
}
unsigned MP3StreamState::getByteNumberFromPositionFraction(float fraction) {
if (fHasXingTOC) {
// The file is VBR, with a Xing TOC; use it to determine which byte to seek to:
float percent = fraction*100.0f;
unsigned a = (unsigned)percent;
if (a > 99) a = 99;
unsigned fa = fXingTOC[a];
unsigned fb;
if (a < 99) {
fb = fXingTOC[a+1];
} else {
fb = 256;
}
fraction = (fa + (fb-fa)*(percent-a))/256.0f;
}
return (unsigned)(fraction*fFileSize);
}
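The TOC lookup above can be read as follows: each of the 100 table entries gives the byte-offset fraction (scaled by 256) at a whole percent of play time, and the code interpolates linearly between adjacent entries, treating a virtual entry of 256 as the end of file. A sketch of just that interpolation step, using a hypothetical TOC (the helper name is ours):

```cpp
#include <cassert>

// Interpolate a byte-position fraction from a 100-entry Xing TOC,
// mirroring the logic above. Each 'toc' entry is a 0..255 byte-offset
// fraction (scaled by 256) at one whole percent of play time.
static float tocFraction(const unsigned char toc[100], float fraction) {
  float percent = fraction * 100.0f;
  unsigned a = (unsigned)percent;
  if (a > 99) a = 99;
  unsigned fa = toc[a];
  unsigned fb = (a < 99) ? toc[a + 1] : 256; // 256 == end of file
  return (fa + (fb - fa) * (percent - a)) / 256.0f;
}
```

For a constant-bitrate-like (perfectly linear) TOC, the result tracks the input fraction.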
void MP3StreamState::seekWithinFile(unsigned seekByteNumber) {
if (fFidIsReallyASocket) return; // it's not seekable
SeekFile64(fFid, seekByteNumber, SEEK_SET);
}
unsigned MP3StreamState::findNextHeader(struct timeval& presentationTime) {
presentationTime = fNextFramePresentationTime;
if (!findNextFrame()) return 0;
// From this frame, figure out the *next* frame's presentation time:
struct timeval framePlayTime = currentFramePlayTime();
if (fPresentationTimeScale > 1) {
// Scale this value
unsigned secondsRem = framePlayTime.tv_sec % fPresentationTimeScale;
framePlayTime.tv_sec -= secondsRem;
framePlayTime.tv_usec += secondsRem*MILLION;
framePlayTime.tv_sec /= fPresentationTimeScale;
framePlayTime.tv_usec /= fPresentationTimeScale;
}
fNextFramePresentationTime.tv_usec += framePlayTime.tv_usec;
fNextFramePresentationTime.tv_sec
+= framePlayTime.tv_sec + fNextFramePresentationTime.tv_usec/MILLION;
fNextFramePresentationTime.tv_usec %= MILLION;
return fr().hdr;
}
Boolean MP3StreamState::readFrame(unsigned char* outBuf, unsigned outBufSize,
unsigned& resultFrameSize,
unsigned& resultDurationInMicroseconds) {
/* We assume that "mp3FindNextHeader()" has already been called */
resultFrameSize = 4 + fr().frameSize;
if (outBufSize < resultFrameSize) {
#ifdef DEBUG_ERRORS
fprintf(stderr, "Insufficient buffer size for reading input frame (%d, need %d)\n",
outBufSize, resultFrameSize);
#endif
if (outBufSize < 4) outBufSize = 0;
resultFrameSize = outBufSize;
return False;
}
if (resultFrameSize >= 4) {
unsigned& hdr = fr().hdr;
*outBuf++ = (unsigned char)(hdr>>24);
*outBuf++ = (unsigned char)(hdr>>16);
*outBuf++ = (unsigned char)(hdr>>8);
*outBuf++ = (unsigned char)(hdr);
memmove(outBuf, fr().frameBytes, resultFrameSize-4);
}
struct timeval const pt = currentFramePlayTime();
resultDurationInMicroseconds = pt.tv_sec*MILLION + pt.tv_usec;
return True;
}
void MP3StreamState::getAttributes(char* buffer, unsigned bufferSize) const {
char const* formatStr
= "bandwidth %d MPEGnumber %d MPEGlayer %d samplingFrequency %d isStereo %d playTime %d isVBR %d";
unsigned fpt = (unsigned)(filePlayTime() + 0.5); // rounds to nearest integer
#if defined(IRIX) || defined(ALPHA) || defined(_QNX4) || defined(IMN_PIM) || defined(CRIS)
/* snprintf() isn't defined, so just use sprintf() - ugh! */
sprintf(buffer, formatStr,
fr().bitrate, fr().isMPEG2 ? 2 : 1, fr().layer, fr().samplingFreq, fr().isStereo,
fpt, fIsVBR);
#else
snprintf(buffer, bufferSize, formatStr,
fr().bitrate, fr().isMPEG2 ? 2 : 1, fr().layer, fr().samplingFreq, fr().isStereo,
fpt, fIsVBR);
#endif
}
// This is crufty old code that needs to be cleaned up #####
#define HDRCMPMASK 0xfffffd00
Boolean MP3StreamState::findNextFrame() {
unsigned char hbuf[8];
unsigned l; int i;
int attempt = 0;
read_again:
if (readFromStream(hbuf, 4) != 4) return False;
fr().hdr = ((unsigned long) hbuf[0] << 24)
| ((unsigned long) hbuf[1] << 16)
| ((unsigned long) hbuf[2] << 8)
| (unsigned long) hbuf[3];
#ifdef DEBUG_PARSE
fprintf(stderr, "fr().hdr: 0x%08x\n", fr().hdr);
#endif
if (fr().oldHdr != fr().hdr || !fr().oldHdr) {
i = 0;
init_resync:
#ifdef DEBUG_PARSE
fprintf(stderr, "init_resync: fr().hdr: 0x%08x\n", fr().hdr);
#endif
if ( (fr().hdr & 0xffe00000) != 0xffe00000
|| (fr().hdr & 0x00060000) == 0 // undefined 'layer' field
|| (fr().hdr & 0x0000F000) == 0 // 'free format' bitrate index
|| (fr().hdr & 0x0000F000) == 0x0000F000 // undefined bitrate index
|| (fr().hdr & 0x00000C00) == 0x00000C00 // undefined frequency index
|| (fr().hdr & 0x00000003) != 0x00000000 // 'emphasis' field unexpectedly set
) {
/* RSF: Do the following test even if we're not at the
start of the file, in case we have two or more
separate MP3 files cat'ed together:
*/
/* Check for RIFF hdr */
if (fr().hdr == ('R'<<24)+('I'<<16)+('F'<<8)+'F') {
unsigned char buf[70 /*was: 40*/];
#ifdef DEBUG_ERRORS
fprintf(stderr,"Skipped RIFF header\n");
#endif
readFromStream(buf, 66); /* already read 4 */
goto read_again;
}
/* Check for ID3 hdr */
if ((fr().hdr&0xFFFFFF00) == ('I'<<24)+('D'<<16)+('3'<<8)) {
unsigned tagSize, bytesToSkip;
unsigned char buf[1000];
readFromStream(buf, 6); /* already read 4 */
tagSize = ((buf[2]&0x7F)<<21) + ((buf[3]&0x7F)<<14) + ((buf[4]&0x7F)<<7) + (buf[5]&0x7F);
bytesToSkip = tagSize;
while (bytesToSkip > 0) {
unsigned bytesToRead = sizeof buf;
if (bytesToRead > bytesToSkip) {
bytesToRead = bytesToSkip;
}
readFromStream(buf, bytesToRead);
bytesToSkip -= bytesToRead;
}
#ifdef DEBUG_ERRORS
fprintf(stderr,"Skipped %d-byte ID3 header\n", tagSize);
#endif
goto read_again;
}
/* give up after 20,000 bytes */
if (i++ < 20000/*4096*//*1024*/) {
memmove (&hbuf[0], &hbuf[1], 3);
if (readFromStream(hbuf+3,1) != 1) {
return False;
}
fr().hdr <<= 8;
fr().hdr |= hbuf[3];
fr().hdr &= 0xffffffff;
#ifdef DEBUG_PARSE
fprintf(stderr, "calling init_resync %d\n", i);
#endif
goto init_resync;
}
#ifdef DEBUG_ERRORS
fprintf(stderr,"Giving up searching valid MPEG header\n");
#endif
return False;
#ifdef DEBUG_ERRORS
fprintf(stderr,"Illegal Audio-MPEG-Header 0x%08lx at offset 0x%lx.\n",
fr().hdr,tell_stream(str)-4);
#endif
/* Read more bytes until we find something that looks
reasonably like a valid header. This is not a
perfect strategy, but it should get us back on the
track within a short time (and hopefully without
too much distortion in the audio output). */
do {
attempt++;
memmove (&hbuf[0], &hbuf[1], 7);
if (readFromStream(&hbuf[3],1) != 1) {
return False;
}
/* This is faster than combining fr().hdr from scratch */
fr().hdr = ((fr().hdr << 8) | hbuf[3]) & 0xffffffff;
if (!fr().oldHdr)
goto init_resync; /* "considered harmful", eh? */
} while ((fr().hdr & HDRCMPMASK) != (fr().oldHdr & HDRCMPMASK)
&& (fr().hdr & HDRCMPMASK) != (fr().firstHdr & HDRCMPMASK));
#ifdef DEBUG_ERRORS
fprintf (stderr, "Skipped %d bytes in input.\n", attempt);
#endif
}
if (!fr().firstHdr) {
fr().firstHdr = fr().hdr;
}
fr().setParamsFromHeader();
fr().setBytePointer(fr().frameBytes, fr().frameSize);
fr().oldHdr = fr().hdr;
if (fr().isFreeFormat) {
#ifdef DEBUG_ERRORS
fprintf(stderr,"Free format not supported.\n");
#endif
return False;
}
#ifdef MP3_ONLY
if (fr().layer != 3) {
#ifdef DEBUG_ERRORS
fprintf(stderr, "MPEG layer %d is not supported!\n", fr().layer);
#endif
return False;
}
#endif
}
if ((l = readFromStream(fr().frameBytes, fr().frameSize))
!= fr().frameSize) {
if (l == 0) return False;
memset(fr().frameBytes+l, 0, fr().frameSize-l);
}
return True;
}
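The ID3-skipping branch in findNextFrame() decodes the tag size as an ID3v2 "syncsafe" integer: four bytes, each contributing only its low 7 bits, so the high bit of every byte stays 0. A minimal sketch of that decoding (the function name and sample bytes are illustrative, not from the library):

```cpp
#include <cassert>

// Decode an ID3v2 'syncsafe' 28-bit size: the high bit of each byte
// is always 0, so only 7 bits per byte carry data.
static unsigned syncsafeToSize(unsigned char b0, unsigned char b1,
                               unsigned char b2, unsigned char b3) {
  return ((b0 & 0x7F) << 21) | ((b1 & 0x7F) << 14)
       | ((b2 & 0x7F) << 7)  |  (b3 & 0x7F);
}
```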
static Boolean socketIsReadable(int socket) {
const unsigned numFds = socket+1;
fd_set rd_set;
FD_ZERO(&rd_set);
FD_SET((unsigned)socket, &rd_set);
struct timeval timeout;
timeout.tv_sec = timeout.tv_usec = 0;
int result = select(numFds, &rd_set, NULL, NULL, &timeout);
return result != 0; // not > 0, because windows can return -1 for file sockets
}
static char watchVariable;
static void checkFunc(void* /*clientData*/) {
watchVariable = ~0;
}
static void waitUntilSocketIsReadable(UsageEnvironment& env, int socket) {
while (!socketIsReadable(socket)) {
// Delay a short period of time before checking again.
unsigned usecsToDelay = 1000; // 1 ms
env.taskScheduler().scheduleDelayedTask(usecsToDelay,
(TaskFunc*)checkFunc, (void*)NULL);
watchVariable = 0;
env.taskScheduler().doEventLoop(&watchVariable);
// This allows other tasks to run while we're waiting:
}
}
unsigned MP3StreamState::readFromStream(unsigned char* buf,
unsigned numChars) {
// Hack for doing socket I/O instead of file I/O (e.g., on Windows)
if (fFidIsReallyASocket) {
intptr_t fid_long = (intptr_t)fFid;
int sock = (int)fid_long;
unsigned totBytesRead = 0;
do {
waitUntilSocketIsReadable(fEnv, sock);
int bytesRead
= recv(sock, &((char*)buf)[totBytesRead], numChars-totBytesRead, 0);
if (bytesRead < 0) return 0;
totBytesRead += (unsigned)bytesRead;
} while (totBytesRead < numChars);
return totBytesRead;
} else {
#ifndef _WIN32_WCE
waitUntilSocketIsReadable(fEnv, (int)fileno(fFid));
#endif
return fread(buf, 1, numChars, fFid);
}
}
#define XING_FRAMES_FLAG 0x0001
#define XING_BYTES_FLAG 0x0002
#define XING_TOC_FLAG 0x0004
#define XING_VBR_SCALE_FLAG 0x0008
void MP3StreamState::checkForXingHeader() {
// Look for 'Xing' in the first 4 bytes after the 'side info':
if (fr().frameSize < fr().sideInfoSize) return;
unsigned bytesAvailable = fr().frameSize - fr().sideInfoSize;
unsigned char* p = &(fr().frameBytes[fr().sideInfoSize]);
if (bytesAvailable < 8) return;
if (p[0] != 'X' || p[1] != 'i' || p[2] != 'n' || p[3] != 'g') return;
// We found it.
fIsVBR = True;
u_int32_t flags = (p[4]<<24) | (p[5]<<16) | (p[6]<<8) | p[7];
unsigned i = 8;
bytesAvailable -= 8;
if (flags&XING_FRAMES_FLAG) {
// The next 4 bytes are the number of frames:
if (bytesAvailable < 4) return;
fNumFramesInFile = (p[i]<<24)|(p[i+1]<<16)|(p[i+2]<<8)|(p[i+3]);
i += 4; bytesAvailable -= 4;
}
if (flags&XING_BYTES_FLAG) {
// The next 4 bytes are the file size:
if (bytesAvailable < 4) return;
fFileSize = (p[i]<<24)|(p[i+1]<<16)|(p[i+2]<<8)|(p[i+3]);
i += 4; bytesAvailable -= 4;
}
if (flags&XING_TOC_FLAG) {
// Fill in the Xing 'table of contents':
if (bytesAvailable < XING_TOC_LENGTH) return;
fHasXingTOC = True;
for (unsigned j = 0; j < XING_TOC_LENGTH; ++j) {
fXingTOC[j] = p[i+j];
}
i += XING_TOC_LENGTH; bytesAvailable -= XING_TOC_LENGTH;
}
}
live/liveMedia/MP3StreamState.hh 000444 001752 001752 00000005427 12656261123 016455 0 ustar 00rsf rsf 000000 000000 /**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A class encapsulating the state of an MP3 stream
// C++ header
#ifndef _MP3_STREAM_STATE_HH
#define _MP3_STREAM_STATE_HH
#ifndef _USAGE_ENVIRONMENT_HH
#include "UsageEnvironment.hh"
#endif
#ifndef _BOOLEAN_HH
#include "Boolean.hh"
#endif
#ifndef _MP3_INTERNALS_HH
#include "MP3Internals.hh"
#endif
#ifndef _NET_COMMON_H
#include "NetCommon.h"
#endif
#include <stdio.h>
#define XING_TOC_LENGTH 100
class MP3StreamState {
public:
MP3StreamState(UsageEnvironment& env);
virtual ~MP3StreamState();
void assignStream(FILE* fid, unsigned fileSize);
unsigned findNextHeader(struct timeval& presentationTime);
Boolean readFrame(unsigned char* outBuf, unsigned outBufSize,
unsigned& resultFrameSize,
unsigned& resultDurationInMicroseconds);
// called after findNextHeader()
void getAttributes(char* buffer, unsigned bufferSize) const;
float filePlayTime() const; // in seconds
unsigned fileSize() const { return fFileSize; }
void setPresentationTimeScale(unsigned scale) { fPresentationTimeScale = scale; }
unsigned getByteNumberFromPositionFraction(float fraction); // 0.0 <= fraction <= 1.0
void seekWithinFile(unsigned seekByteNumber);
void checkForXingHeader(); // hack for Xing VBR files
protected: // private->protected requested by Pierre l'Hussiez
unsigned readFromStream(unsigned char* buf, unsigned numChars);
private:
MP3FrameParams& fr() {return fCurrentFrame;}
MP3FrameParams const& fr() const {return fCurrentFrame;}
struct timeval currentFramePlayTime() const;
Boolean findNextFrame();
private:
UsageEnvironment& fEnv;
FILE* fFid;
Boolean fFidIsReallyASocket;
unsigned fFileSize;
unsigned fNumFramesInFile;
unsigned fPresentationTimeScale;
// used if we're streaming at other than the normal rate
Boolean fIsVBR, fHasXingTOC;
u_int8_t fXingTOC[XING_TOC_LENGTH]; // set iff "fHasXingTOC" is True
MP3FrameParams fCurrentFrame;
struct timeval fNextFramePresentationTime;
};
#endif
live/liveMedia/MP3Transcoder.cpp 000444 001752 001752 00000003455 12656261123 016507 0 ustar 00rsf rsf 000000 000000 /**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// MP3 Transcoder
// Implementation
#include "MP3Transcoder.hh"
MP3Transcoder::MP3Transcoder(UsageEnvironment& env,
MP3ADUTranscoder* aduTranscoder)
: MP3FromADUSource(env, aduTranscoder, False) {
}
MP3Transcoder::~MP3Transcoder() {
}
MP3Transcoder* MP3Transcoder::createNew(UsageEnvironment& env,
unsigned outBitrate /* in kbps */,
FramedSource* inputSource) {
MP3Transcoder* newSource = NULL;
do {
// Create the intermediate filters that help implement the transcoder:
ADUFromMP3Source* aduFromMP3
= ADUFromMP3Source::createNew(env, inputSource, False);
// Note: This also checks that "inputSource" is an MP3 source
if (aduFromMP3 == NULL) break;
MP3ADUTranscoder* aduTranscoder
= MP3ADUTranscoder::createNew(env, outBitrate, aduFromMP3);
if (aduTranscoder == NULL) break;
// Then create the transcoder itself:
newSource = new MP3Transcoder(env, aduTranscoder);
} while (0);
return newSource;
}
live/liveMedia/MPEG1or2AudioRTPSink.cpp 000444 001752 001752 00000004473 12656261123 017515 0 ustar 00rsf rsf 000000 000000 /**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// RTP sink for MPEG audio (RFC 2250)
// Implementation
#include "MPEG1or2AudioRTPSink.hh"
MPEG1or2AudioRTPSink::MPEG1or2AudioRTPSink(UsageEnvironment& env, Groupsock* RTPgs)
: AudioRTPSink(env, RTPgs, 14, 90000, "MPA") {
}
MPEG1or2AudioRTPSink::~MPEG1or2AudioRTPSink() {
}
MPEG1or2AudioRTPSink*
MPEG1or2AudioRTPSink::createNew(UsageEnvironment& env, Groupsock* RTPgs) {
return new MPEG1or2AudioRTPSink(env, RTPgs);
}
void MPEG1or2AudioRTPSink::doSpecialFrameHandling(unsigned fragmentationOffset,
unsigned char* frameStart,
unsigned numBytesInFrame,
struct timeval framePresentationTime,
unsigned numRemainingBytes) {
// If this is the 1st frame in the 1st packet, set the RTP 'M' (marker)
// bit (because this is considered the start of a talk spurt):
if (isFirstPacket() && isFirstFrameInPacket()) {
setMarkerBit();
}
// If this is the first frame in the packet, set the lower half of the
// audio-specific header (to the "fragmentationOffset"):
if (isFirstFrameInPacket()) {
setSpecialHeaderWord(fragmentationOffset&0xFFFF);
}
// Important: Also call our base class's doSpecialFrameHandling(),
// to set the packet's timestamp:
MultiFramedRTPSink::doSpecialFrameHandling(fragmentationOffset,
frameStart, numBytesInFrame,
framePresentationTime,
numRemainingBytes);
}
unsigned MPEG1or2AudioRTPSink::specialHeaderSize() const {
// There's a 4 byte special audio header:
return 4;
}
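The 4-byte header that doSpecialFrameHandling() fills in is the RFC 2250 MPEG audio header: a 16-bit must-be-zero field followed by the 16-bit fragmentation offset, in network byte order. A sketch of that byte layout (the helper name is ours, not the library's):

```cpp
#include <cassert>

// Pack the RFC 2250 MPEG audio special header: 16 zero (MBZ) bits,
// then the 16-bit fragmentation offset in network (big-endian) order.
static void packMPEGAudioSpecialHeader(unsigned char out[4],
                                       unsigned fragOffset) {
  out[0] = 0; out[1] = 0;                            // MBZ field
  out[2] = (unsigned char)((fragOffset >> 8) & 0xFF); // offset, high byte
  out[3] = (unsigned char)(fragOffset & 0xFF);        // offset, low byte
}
```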
live/liveMedia/MPEG1or2AudioRTPSource.cpp 000444 001752 001752 00000004323 12656261123 020043 0 ustar 00rsf rsf 000000 000000 /**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// MPEG-1 or MPEG-2 Audio RTP Sources
// Implementation
#include "MPEG1or2AudioRTPSource.hh"
MPEG1or2AudioRTPSource*
MPEG1or2AudioRTPSource::createNew(UsageEnvironment& env,
Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency) {
return new MPEG1or2AudioRTPSource(env, RTPgs, rtpPayloadFormat,
rtpTimestampFrequency);
}
MPEG1or2AudioRTPSource::MPEG1or2AudioRTPSource(UsageEnvironment& env,
Groupsock* rtpGS,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency)
: MultiFramedRTPSource(env, rtpGS,
rtpPayloadFormat, rtpTimestampFrequency) {
}
MPEG1or2AudioRTPSource::~MPEG1or2AudioRTPSource() {
}
Boolean MPEG1or2AudioRTPSource
::processSpecialHeader(BufferedPacket* packet,
unsigned& resultSpecialHeaderSize) {
// There's a 4-byte header indicating fragmentation.
if (packet->dataSize() < 4) return False;
// Note: This fragmentation header is actually useless to us, because
// it doesn't tell us whether or not this RTP packet *ends* a
// fragmented frame. Thus, we can't use it to properly set
// "fCurrentPacketCompletesFrame". Instead, we assume that even
// a partial audio frame will be usable to clients.
resultSpecialHeaderSize = 4;
return True;
}
char const* MPEG1or2AudioRTPSource::MIMEtype() const {
return "audio/MPEG";
}
live/liveMedia/MPEG1or2AudioStreamFramer.cpp 000444 001752 001752 00000015243 12656261123 020610 0 ustar 00rsf rsf 000000 000000 /**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A filter that breaks up an MPEG (1,2) audio elementary stream into frames
// Implementation
#include "MPEG1or2AudioStreamFramer.hh"
#include "StreamParser.hh"
#include "MP3Internals.hh"
#include
////////// MPEG1or2AudioStreamParser definition //////////
class MPEG1or2AudioStreamParser: public StreamParser {
public:
MPEG1or2AudioStreamParser(MPEG1or2AudioStreamFramer* usingSource,
FramedSource* inputSource);
virtual ~MPEG1or2AudioStreamParser();
public:
unsigned parse(unsigned& numTruncatedBytes);
// returns the size of the frame that was acquired, or 0 if none was
void registerReadInterest(unsigned char* to, unsigned maxSize);
MP3FrameParams const& currentFrame() const { return fCurrentFrame; }
private:
unsigned char* fTo;
unsigned fMaxSize;
// Parameters of the most recently read frame:
MP3FrameParams fCurrentFrame; // also works for layer I or II
};
////////// MPEG1or2AudioStreamFramer implementation //////////
MPEG1or2AudioStreamFramer
::MPEG1or2AudioStreamFramer(UsageEnvironment& env, FramedSource* inputSource,
Boolean syncWithInputSource)
: FramedFilter(env, inputSource),
fSyncWithInputSource(syncWithInputSource) {
reset();
fParser = new MPEG1or2AudioStreamParser(this, inputSource);
}
MPEG1or2AudioStreamFramer::~MPEG1or2AudioStreamFramer() {
delete fParser;
}
MPEG1or2AudioStreamFramer*
MPEG1or2AudioStreamFramer::createNew(UsageEnvironment& env,
FramedSource* inputSource,
Boolean syncWithInputSource) {
// Need to add source type checking here??? #####
return new MPEG1or2AudioStreamFramer(env, inputSource, syncWithInputSource);
}
void MPEG1or2AudioStreamFramer::flushInput() {
reset();
fParser->flushInput();
}
void MPEG1or2AudioStreamFramer::reset() {
// Use the current wallclock time as the initial 'presentation time':
struct timeval timeNow;
gettimeofday(&timeNow, NULL);
resetPresentationTime(timeNow);
}
void MPEG1or2AudioStreamFramer
::resetPresentationTime(struct timeval newPresentationTime) {
fNextFramePresentationTime = newPresentationTime;
}
void MPEG1or2AudioStreamFramer::doGetNextFrame() {
fParser->registerReadInterest(fTo, fMaxSize);
continueReadProcessing();
}
#define MILLION 1000000
static unsigned const numSamplesByLayer[4] = {0, 384, 1152, 1152};
struct timeval MPEG1or2AudioStreamFramer::currentFramePlayTime() const {
MP3FrameParams const& fr = fParser->currentFrame();
unsigned const numSamples = numSamplesByLayer[fr.layer];
struct timeval result;
unsigned const freq = fr.samplingFreq*(1 + fr.isMPEG2);
if (freq == 0) {
result.tv_sec = 0;
result.tv_usec = 0;
return result;
}
// result is numSamples/freq
unsigned const uSeconds
= ((numSamples*2*MILLION)/freq + 1)/2; // rounds to nearest integer
result.tv_sec = uSeconds/MILLION;
result.tv_usec = uSeconds%MILLION;
return result;
}
void MPEG1or2AudioStreamFramer
::continueReadProcessing(void* clientData,
unsigned char* /*ptr*/, unsigned /*size*/,
struct timeval presentationTime) {
MPEG1or2AudioStreamFramer* framer = (MPEG1or2AudioStreamFramer*)clientData;
if (framer->fSyncWithInputSource) {
framer->resetPresentationTime(presentationTime);
}
framer->continueReadProcessing();
}
void MPEG1or2AudioStreamFramer::continueReadProcessing() {
unsigned acquiredFrameSize = fParser->parse(fNumTruncatedBytes);
if (acquiredFrameSize > 0) {
// We were able to acquire a frame from the input.
// It has already been copied to the reader's space.
fFrameSize = acquiredFrameSize;
// Also set the presentation time, and increment it for next time,
// based on the length of this frame:
fPresentationTime = fNextFramePresentationTime;
struct timeval framePlayTime = currentFramePlayTime();
fDurationInMicroseconds = framePlayTime.tv_sec*MILLION + framePlayTime.tv_usec;
fNextFramePresentationTime.tv_usec += framePlayTime.tv_usec;
fNextFramePresentationTime.tv_sec
+= framePlayTime.tv_sec + fNextFramePresentationTime.tv_usec/MILLION;
fNextFramePresentationTime.tv_usec %= MILLION;
// Call our own 'after getting' function. Because we're not a 'leaf'
// source, we can call this directly, without risking infinite recursion.
afterGetting(this);
} else {
// We were unable to parse a complete frame from the input, because:
// - we had to read more data from the source stream, or
// - the source stream has ended.
}
}
////////// MPEG1or2AudioStreamParser implementation //////////
MPEG1or2AudioStreamParser
::MPEG1or2AudioStreamParser(MPEG1or2AudioStreamFramer* usingSource,
FramedSource* inputSource)
: StreamParser(inputSource, FramedSource::handleClosure, usingSource,
&MPEG1or2AudioStreamFramer::continueReadProcessing, usingSource) {
}
MPEG1or2AudioStreamParser::~MPEG1or2AudioStreamParser() {
}
void MPEG1or2AudioStreamParser::registerReadInterest(unsigned char* to,
unsigned maxSize) {
fTo = to;
fMaxSize = maxSize;
}
unsigned MPEG1or2AudioStreamParser::parse(unsigned& numTruncatedBytes) {
try {
saveParserState();
// We expect a MPEG audio header (first 11 bits set to 1) at the start:
while (((fCurrentFrame.hdr = test4Bytes())&0xFFE00000) != 0xFFE00000) {
skipBytes(1);
saveParserState();
}
fCurrentFrame.setParamsFromHeader();
// Copy the frame to the requested destination:
unsigned frameSize = fCurrentFrame.frameSize + 4; // include header
if (frameSize > fMaxSize) {
numTruncatedBytes = frameSize - fMaxSize;
frameSize = fMaxSize;
} else {
numTruncatedBytes = 0;
}
getBytes(fTo, frameSize);
skipBytes(numTruncatedBytes);
return frameSize;
} catch (int /*e*/) {
#ifdef DEBUG
fprintf(stderr, "MPEG1or2AudioStreamParser::parse() EXCEPTION (This is normal behavior - *not* an error)\n");
#endif
return 0; // the parsing got interrupted
}
}
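The framer's parser above syncs only on the 11-bit header pattern; MP3StreamState::findNextFrame() applies the fuller set of reserved-value checks on the layer, bitrate-index, frequency-index, and emphasis fields. Those checks, restated as one predicate (a sketch under the same masks, not the library's API):

```cpp
#include <cassert>

// True iff 'hdr' passes the same plausibility checks applied in
// MP3StreamState::findNextFrame().
static bool looksLikeMPEGAudioHeader(unsigned long hdr) {
  if ((hdr & 0xffe00000) != 0xffe00000) return false; // 11-bit sync word
  if ((hdr & 0x00060000) == 0)          return false; // reserved 'layer' value
  if ((hdr & 0x0000F000) == 0)          return false; // 'free format' bitrate
  if ((hdr & 0x0000F000) == 0x0000F000) return false; // reserved bitrate index
  if ((hdr & 0x00000C00) == 0x00000C00) return false; // reserved frequency index
  if ((hdr & 0x00000003) != 0)          return false; // 'emphasis' field set
  return true;
}
```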
live/liveMedia/MPEG1or2Demux.cpp 000444 001752 001752 00000062530 12656261123 016321 0 ustar 00rsf rsf 000000 000000 /**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// Demultiplexer for an MPEG-1 or MPEG-2 Program Stream
// Implementation
#include "MPEG1or2Demux.hh"
#include "MPEG1or2DemuxedElementaryStream.hh"
#include "StreamParser.hh"
#include
////////// MPEGProgramStreamParser definition //////////
// An enum representing the current state of the parser:
enum MPEGParseState {
PARSING_PACK_HEADER,
PARSING_SYSTEM_HEADER,
PARSING_PES_PACKET
};
class MPEGProgramStreamParser: public StreamParser {
public:
MPEGProgramStreamParser(MPEG1or2Demux* usingDemux, FramedSource* inputSource);
virtual ~MPEGProgramStreamParser();
public:
unsigned char parse();
// returns the stream id of a stream for which a frame was acquired,
// or 0 if no such frame was acquired.
private:
void setParseState(MPEGParseState parseState);
void parsePackHeader();
void parseSystemHeader();
unsigned char parsePESPacket(); // returns as does parse()
Boolean isSpecialStreamId(unsigned char stream_id) const;
// for PES packet header parsing
private:
MPEG1or2Demux* fUsingDemux;
MPEGParseState fCurrentParseState;
};
////////// MPEG1or2Demux::OutputDescriptor::SavedData definition/implementation //////////
class MPEG1or2Demux::OutputDescriptor::SavedData {
public:
SavedData(unsigned char* buf, unsigned size)
: next(NULL), data(buf), dataSize(size), numBytesUsed(0) {
}
virtual ~SavedData() {
delete[] data;
delete next;
}
SavedData* next;
unsigned char* data;
unsigned dataSize, numBytesUsed;
};
////////// MPEG1or2Demux implementation //////////
MPEG1or2Demux
::MPEG1or2Demux(UsageEnvironment& env,
FramedSource* inputSource, Boolean reclaimWhenLastESDies)
: Medium(env),
fInputSource(inputSource), fMPEGversion(0),
fNextAudioStreamNumber(0), fNextVideoStreamNumber(0),
fReclaimWhenLastESDies(reclaimWhenLastESDies), fNumOutstandingESs(0),
fNumPendingReads(0), fHaveUndeliveredData(False) {
fParser = new MPEGProgramStreamParser(this, inputSource);
for (unsigned i = 0; i < 256; ++i) {
fOutput[i].savedDataHead = fOutput[i].savedDataTail = NULL;
fOutput[i].isPotentiallyReadable = False;
fOutput[i].isCurrentlyActive = False;
fOutput[i].isCurrentlyAwaitingData = False;
}
}
MPEG1or2Demux::~MPEG1or2Demux() {
delete fParser;
for (unsigned i = 0; i < 256; ++i) delete fOutput[i].savedDataHead;
Medium::close(fInputSource);
}
MPEG1or2Demux* MPEG1or2Demux
::createNew(UsageEnvironment& env,
FramedSource* inputSource, Boolean reclaimWhenLastESDies) {
// Need to add source type checking here??? #####
return new MPEG1or2Demux(env, inputSource, reclaimWhenLastESDies);
}
MPEG1or2Demux::SCR::SCR()
: highBit(0), remainingBits(0), extension(0), isValid(False) {
}
void MPEG1or2Demux
::noteElementaryStreamDeletion(MPEG1or2DemuxedElementaryStream* /*es*/) {
if (--fNumOutstandingESs == 0 && fReclaimWhenLastESDies) {
Medium::close(this);
}
}
void MPEG1or2Demux::flushInput() {
fParser->flushInput();
}
MPEG1or2DemuxedElementaryStream*
MPEG1or2Demux::newElementaryStream(u_int8_t streamIdTag) {
++fNumOutstandingESs;
fOutput[streamIdTag].isPotentiallyReadable = True;
return new MPEG1or2DemuxedElementaryStream(envir(), streamIdTag, *this);
}
MPEG1or2DemuxedElementaryStream* MPEG1or2Demux::newAudioStream() {
unsigned char newAudioStreamTag = 0xC0 | (fNextAudioStreamNumber++&~0xE0);
// MPEG audio stream tags are 110x xxxx (binary)
return newElementaryStream(newAudioStreamTag);
}
MPEG1or2DemuxedElementaryStream* MPEG1or2Demux::newVideoStream() {
unsigned char newVideoStreamTag = 0xE0 | (fNextVideoStreamNumber++&~0xF0);
// MPEG video stream tags are 1110 xxxx (binary)
return newElementaryStream(newVideoStreamTag);
}
// Appropriate one of the reserved stream id tags to mean: return raw PES packets:
#define RAW_PES 0xFC
MPEG1or2DemuxedElementaryStream* MPEG1or2Demux::newRawPESStream() {
return newElementaryStream(RAW_PES);
}
void MPEG1or2Demux::registerReadInterest(u_int8_t streamIdTag,
unsigned char* to, unsigned maxSize,
FramedSource::afterGettingFunc* afterGettingFunc,
void* afterGettingClientData,
FramedSource::onCloseFunc* onCloseFunc,
void* onCloseClientData) {
struct OutputDescriptor& out = fOutput[streamIdTag];
// Make sure this stream is not already being read:
if (out.isCurrentlyAwaitingData) {
envir() << "MPEG1or2Demux::registerReadInterest(): attempt to read stream more than once!\n";
envir().internalError();
}
out.to = to; out.maxSize = maxSize;
out.fAfterGettingFunc = afterGettingFunc;
out.afterGettingClientData = afterGettingClientData;
out.fOnCloseFunc = onCloseFunc;
out.onCloseClientData = onCloseClientData;
out.isCurrentlyActive = True;
out.isCurrentlyAwaitingData = True;
// out.frameSize and out.presentationTime will be set when a frame is read
++fNumPendingReads;
}
Boolean MPEG1or2Demux::useSavedData(u_int8_t streamIdTag,
unsigned char* to, unsigned maxSize,
FramedSource::afterGettingFunc* afterGettingFunc,
void* afterGettingClientData) {
struct OutputDescriptor& out = fOutput[streamIdTag];
if (out.savedDataHead == NULL) return False; // common case
unsigned totNumBytesCopied = 0;
while (maxSize > 0 && out.savedDataHead != NULL) {
OutputDescriptor::SavedData& savedData = *(out.savedDataHead);
unsigned char* from = &savedData.data[savedData.numBytesUsed];
unsigned numBytesToCopy = savedData.dataSize - savedData.numBytesUsed;
if (numBytesToCopy > maxSize) numBytesToCopy = maxSize;
memmove(to, from, numBytesToCopy);
to += numBytesToCopy;
maxSize -= numBytesToCopy;
out.savedDataTotalSize -= numBytesToCopy;
totNumBytesCopied += numBytesToCopy;
savedData.numBytesUsed += numBytesToCopy;
if (savedData.numBytesUsed == savedData.dataSize) {
out.savedDataHead = savedData.next;
if (out.savedDataHead == NULL) out.savedDataTail = NULL;
savedData.next = NULL;
delete &savedData;
}
}
out.isCurrentlyActive = True;
if (afterGettingFunc != NULL) {
struct timeval presentationTime;
presentationTime.tv_sec = 0; presentationTime.tv_usec = 0; // should fix #####
(*afterGettingFunc)(afterGettingClientData, totNumBytesCopied,
0 /* numTruncatedBytes */, presentationTime,
0 /* durationInMicroseconds ?????#####*/);
}
return True;
}
void MPEG1or2Demux
::continueReadProcessing(void* clientData,
unsigned char* /*ptr*/, unsigned /*size*/,
struct timeval /*presentationTime*/) {
MPEG1or2Demux* demux = (MPEG1or2Demux*)clientData;
demux->continueReadProcessing();
}
void MPEG1or2Demux::continueReadProcessing() {
while (fNumPendingReads > 0) {
unsigned char acquiredStreamIdTag = fParser->parse();
if (acquiredStreamIdTag != 0) {
// We were able to acquire a frame from the input.
struct OutputDescriptor& newOut = fOutput[acquiredStreamIdTag];
newOut.isCurrentlyAwaitingData = False;
// indicates that we can be read again
// (This needs to be set before the 'after getting' call below,
// in case it tries to read another frame)
// Call our own 'after getting' function. Because we're not a 'leaf'
// source, we can call this directly, without risking infinite recursion.
if (newOut.fAfterGettingFunc != NULL) {
(*newOut.fAfterGettingFunc)(newOut.afterGettingClientData,
newOut.frameSize, 0 /* numTruncatedBytes */,
newOut.presentationTime,
0 /* durationInMicroseconds ?????#####*/);
--fNumPendingReads;
}
} else {
// We were unable to parse a complete frame from the input, because:
// - we had to read more data from the source stream, or
// - we found a frame for a stream that was being read, but whose
// reader is not ready to get the frame right now, or
// - the source stream has ended.
break;
}
}
}
void MPEG1or2Demux::getNextFrame(u_int8_t streamIdTag,
unsigned char* to, unsigned maxSize,
FramedSource::afterGettingFunc* afterGettingFunc,
void* afterGettingClientData,
FramedSource::onCloseFunc* onCloseFunc,
void* onCloseClientData) {
// First, check whether we have saved data for this stream id:
if (useSavedData(streamIdTag, to, maxSize,
afterGettingFunc, afterGettingClientData)) {
return;
}
// Then save the parameters of the specified stream id:
registerReadInterest(streamIdTag, to, maxSize,
afterGettingFunc, afterGettingClientData,
onCloseFunc, onCloseClientData);
// Next, if we're the only currently pending read, continue looking for data:
if (fNumPendingReads == 1 || fHaveUndeliveredData) {
fHaveUndeliveredData = False;
continueReadProcessing();
} // otherwise the continued read processing has already been taken care of
}
void MPEG1or2Demux::stopGettingFrames(u_int8_t streamIdTag) {
struct OutputDescriptor& out = fOutput[streamIdTag];
if (out.isCurrentlyAwaitingData && fNumPendingReads > 0) --fNumPendingReads;
out.isCurrentlyActive = out.isCurrentlyAwaitingData = False;
}
void MPEG1or2Demux::handleClosure(void* clientData) {
MPEG1or2Demux* demux = (MPEG1or2Demux*)clientData;
demux->fNumPendingReads = 0;
// Tell all pending readers that our source has closed.
// Note that we need to make a copy of our readers' close functions
// (etc.) before we start calling any of them, in case one of them
// ends up deleting this.
struct {
FramedSource::onCloseFunc* fOnCloseFunc;
void* onCloseClientData;
} savedPending[256];
unsigned i, numPending = 0;
for (i = 0; i < 256; ++i) {
struct OutputDescriptor& out = demux->fOutput[i];
if (out.isCurrentlyAwaitingData) {
if (out.fOnCloseFunc != NULL) {
savedPending[numPending].fOnCloseFunc = out.fOnCloseFunc;
savedPending[numPending].onCloseClientData = out.onCloseClientData;
++numPending;
}
}
delete out.savedDataHead; out.savedDataHead = out.savedDataTail = NULL;
out.savedDataTotalSize = 0;
out.isPotentiallyReadable = out.isCurrentlyActive = out.isCurrentlyAwaitingData
= False;
}
for (i = 0; i < numPending; ++i) {
(*savedPending[i].fOnCloseFunc)(savedPending[i].onCloseClientData);
}
}
////////// MPEGProgramStreamParser implementation //////////
#include <string.h>
MPEGProgramStreamParser::MPEGProgramStreamParser(MPEG1or2Demux* usingDemux,
FramedSource* inputSource)
: StreamParser(inputSource, MPEG1or2Demux::handleClosure, usingDemux,
&MPEG1or2Demux::continueReadProcessing, usingDemux),
fUsingDemux(usingDemux), fCurrentParseState(PARSING_PACK_HEADER) {
}
MPEGProgramStreamParser::~MPEGProgramStreamParser() {
}
void MPEGProgramStreamParser::setParseState(MPEGParseState parseState) {
fCurrentParseState = parseState;
saveParserState();
}
unsigned char MPEGProgramStreamParser::parse() {
unsigned char acquiredStreamTagId = 0;
try {
do {
switch (fCurrentParseState) {
case PARSING_PACK_HEADER: {
parsePackHeader();
break;
}
case PARSING_SYSTEM_HEADER: {
parseSystemHeader();
break;
}
case PARSING_PES_PACKET: {
acquiredStreamTagId = parsePESPacket();
break;
}
}
} while(acquiredStreamTagId == 0);
return acquiredStreamTagId;
} catch (int /*e*/) {
#ifdef DEBUG
fprintf(stderr, "MPEGProgramStreamParser::parse() EXCEPTION (This is normal behavior - *not* an error)\n");
fflush(stderr);
#endif
return 0; // the parsing got interrupted
}
}
#define PACK_START_CODE 0x000001BA
#define SYSTEM_HEADER_START_CODE 0x000001BB
#define PACKET_START_CODE_PREFIX 0x00000100
static inline Boolean isPacketStartCode(unsigned code) {
return (code&0xFFFFFF00) == PACKET_START_CODE_PREFIX
&& code > SYSTEM_HEADER_START_CODE;
}
void MPEGProgramStreamParser::parsePackHeader() {
#ifdef DEBUG
fprintf(stderr, "parsing pack header\n"); fflush(stderr);
#endif
unsigned first4Bytes;
while (1) {
first4Bytes = test4Bytes();
// We're supposed to have a pack header here, but check also for
// a system header or a PES packet, just in case:
if (first4Bytes == PACK_START_CODE) {
skipBytes(4);
break;
} else if (first4Bytes == SYSTEM_HEADER_START_CODE) {
#ifdef DEBUG
fprintf(stderr, "found system header instead of pack header\n");
#endif
setParseState(PARSING_SYSTEM_HEADER);
return;
} else if (isPacketStartCode(first4Bytes)) {
#ifdef DEBUG
fprintf(stderr, "found packet start code 0x%02x instead of pack header\n", first4Bytes);
#endif
setParseState(PARSING_PES_PACKET);
return;
}
setParseState(PARSING_PACK_HEADER); // ensures we progress over bad data
if ((first4Bytes&0xFF) > 1) { // a system code definitely doesn't start here
skipBytes(4);
} else {
skipBytes(1);
}
}
// The size of the pack header differs depending on whether it's
// MPEG-1 or MPEG-2. The next byte tells us this:
unsigned char nextByte = get1Byte();
MPEG1or2Demux::SCR& scr = fUsingDemux->fLastSeenSCR; // alias
if ((nextByte&0xF0) == 0x20) { // MPEG-1
fUsingDemux->fMPEGversion = 1;
scr.highBit = (nextByte&0x08)>>3;
scr.remainingBits = (nextByte&0x06)<<29;
unsigned next4Bytes = get4Bytes();
scr.remainingBits |= (next4Bytes&0xFFFE0000)>>2;
scr.remainingBits |= (next4Bytes&0x0000FFFE)>>1;
scr.extension = 0;
scr.isValid = True;
skipBits(24);
#if defined(DEBUG_TIMESTAMPS) || defined(DEBUG_SCR_TIMESTAMPS)
fprintf(stderr, "pack hdr system_clock_reference_base: 0x%x",
scr.highBit);
fprintf(stderr, "%08x\n", scr.remainingBits);
#endif
} else if ((nextByte&0xC0) == 0x40) { // MPEG-2
fUsingDemux->fMPEGversion = 2;
scr.highBit = (nextByte&0x20)>>5;
scr.remainingBits = (nextByte&0x18)<<27;
scr.remainingBits |= (nextByte&0x03)<<28;
unsigned next4Bytes = get4Bytes();
scr.remainingBits |= (next4Bytes&0xFFF80000)>>4;
scr.remainingBits |= (next4Bytes&0x0003FFF8)>>3;
scr.extension = (next4Bytes&0x00000003)<<7;
next4Bytes = get4Bytes();
scr.extension |= (next4Bytes&0xFE000000)>>25;
scr.isValid = True;
skipBits(5);
#if defined(DEBUG_TIMESTAMPS) || defined(DEBUG_SCR_TIMESTAMPS)
fprintf(stderr, "pack hdr system_clock_reference_base: 0x%x",
scr.highBit);
fprintf(stderr, "%08x\n", scr.remainingBits);
fprintf(stderr, "pack hdr system_clock_reference_extension: 0x%03x\n",
scr.extension);
#endif
unsigned char pack_stuffing_length = getBits(3);
skipBytes(pack_stuffing_length);
} else { // unknown
fUsingDemux->envir() << "MPEGProgramStreamParser::parsePackHeader() saw strange byte following pack_start_code\n";
}
// Check for a System Header next:
setParseState(PARSING_SYSTEM_HEADER);
}
void MPEGProgramStreamParser::parseSystemHeader() {
#ifdef DEBUG
fprintf(stderr, "parsing system header\n"); fflush(stderr);
#endif
unsigned next4Bytes = test4Bytes();
if (next4Bytes != SYSTEM_HEADER_START_CODE) {
// The system header was optional. Look for a PES Packet instead:
setParseState(PARSING_PES_PACKET);
return;
}
#ifdef DEBUG
fprintf(stderr, "saw system_header_start_code\n"); fflush(stderr);
#endif
skipBytes(4); // we've already seen the system_header_start_code
unsigned short remaining_header_length = get2Bytes();
// According to the MPEG-1 and MPEG-2 specs, "remaining_header_length" should be
// at least 6 bytes. Check this now:
if (remaining_header_length < 6) {
fUsingDemux->envir() << "MPEGProgramStreamParser::parseSystemHeader(): saw strange header_length: "
<< remaining_header_length << " < 6\n";
}
skipBytes(remaining_header_length);
// Check for a PES Packet next:
setParseState(PARSING_PES_PACKET);
}
#define private_stream_1 0xBD
#define private_stream_2 0xBF
// A test for stream ids that are exempt from normal PES packet header parsing
Boolean MPEGProgramStreamParser
::isSpecialStreamId(unsigned char stream_id) const {
if (stream_id == RAW_PES) return True; // hack
if (fUsingDemux->fMPEGversion == 1) {
return stream_id == private_stream_2;
} else { // assume MPEG-2
if (stream_id <= private_stream_2) {
return stream_id != private_stream_1;
} else if ((stream_id&0xF0) == 0xF0) {
unsigned char lower4Bits = stream_id&0x0F;
return lower4Bits <= 2 || lower4Bits == 0x8 || lower4Bits == 0xF;
} else {
return False;
}
}
}
#define READER_NOT_READY 2
unsigned char MPEGProgramStreamParser::parsePESPacket() {
#ifdef DEBUG
fprintf(stderr, "parsing PES packet\n"); fflush(stderr);
#endif
unsigned next4Bytes = test4Bytes();
if (!isPacketStartCode(next4Bytes)) {
// The PES Packet was optional. Look for a Pack Header instead:
setParseState(PARSING_PACK_HEADER);
return 0;
}
#ifdef DEBUG
fprintf(stderr, "saw packet_start_code_prefix\n"); fflush(stderr);
#endif
skipBytes(3); // we've already seen the packet_start_code_prefix
unsigned char stream_id = get1Byte();
#if defined(DEBUG) || defined(DEBUG_TIMESTAMPS)
unsigned char streamNum = stream_id;
char const* streamTypeStr;
if ((stream_id&0xE0) == 0xC0) {
streamTypeStr = "audio";
streamNum = stream_id&~0xE0;
} else if ((stream_id&0xF0) == 0xE0) {
streamTypeStr = "video";
streamNum = stream_id&~0xF0;
} else if (stream_id == 0xbc) {
streamTypeStr = "reserved";
} else if (stream_id == 0xbd) {
streamTypeStr = "private_1";
} else if (stream_id == 0xbe) {
streamTypeStr = "padding";
} else if (stream_id == 0xbf) {
streamTypeStr = "private_2";
} else {
streamTypeStr = "unknown";
}
#endif
#ifdef DEBUG
static unsigned frameCount = 1;
fprintf(stderr, "%d, saw %s stream: 0x%02x\n", frameCount, streamTypeStr, streamNum); fflush(stderr);
#endif
unsigned short PES_packet_length = get2Bytes();
#ifdef DEBUG
fprintf(stderr, "PES_packet_length: %d\n", PES_packet_length); fflush(stderr);
#endif
// Parse over the rest of the header, until we get to the packet data itself.
// This varies depending upon the MPEG version:
if (fUsingDemux->fOutput[RAW_PES].isPotentiallyReadable) {
// Hack: We've been asked to return raw PES packets, for every stream:
stream_id = RAW_PES;
}
unsigned savedParserOffset = curOffset();
#ifdef DEBUG_TIMESTAMPS
unsigned char pts_highBit = 0;
unsigned pts_remainingBits = 0;
unsigned char dts_highBit = 0;
unsigned dts_remainingBits = 0;
#endif
if (fUsingDemux->fMPEGversion == 1) {
if (!isSpecialStreamId(stream_id)) {
unsigned char nextByte;
while ((nextByte = get1Byte()) == 0xFF) { // stuffing_byte
}
if ((nextByte&0xC0) == 0x40) { // '01'
skipBytes(1);
nextByte = get1Byte();
}
if ((nextByte&0xF0) == 0x20) { // '0010'
#ifdef DEBUG_TIMESTAMPS
pts_highBit = (nextByte&0x08)>>3;
pts_remainingBits = (nextByte&0x06)<<29;
unsigned next4Bytes = get4Bytes();
pts_remainingBits |= (next4Bytes&0xFFFE0000)>>2;
pts_remainingBits |= (next4Bytes&0x0000FFFE)>>1;
#else
skipBytes(4);
#endif
} else if ((nextByte&0xF0) == 0x30) { // '0011'
#ifdef DEBUG_TIMESTAMPS
pts_highBit = (nextByte&0x08)>>3;
pts_remainingBits = (nextByte&0x06)<<29;
unsigned next4Bytes = get4Bytes();
pts_remainingBits |= (next4Bytes&0xFFFE0000)>>2;
pts_remainingBits |= (next4Bytes&0x0000FFFE)>>1;
nextByte = get1Byte();
dts_highBit = (nextByte&0x08)>>3;
dts_remainingBits = (nextByte&0x06)<<29;
next4Bytes = get4Bytes();
dts_remainingBits |= (next4Bytes&0xFFFE0000)>>2;
dts_remainingBits |= (next4Bytes&0x0000FFFE)>>1;
#else
skipBytes(9);
#endif
}
}
} else { // assume MPEG-2
if (!isSpecialStreamId(stream_id)) {
// Fields in the next 3 bytes determine the size of the rest:
unsigned next3Bytes = getBits(24);
#ifdef DEBUG_TIMESTAMPS
unsigned char PTS_DTS_flags = (next3Bytes&0x00C000)>>14;
#endif
#ifdef undef
unsigned char ESCR_flag = (next3Bytes&0x002000)>>13;
unsigned char ES_rate_flag = (next3Bytes&0x001000)>>12;
unsigned char DSM_trick_mode_flag = (next3Bytes&0x000800)>>11;
#endif
unsigned char PES_header_data_length = (next3Bytes&0x0000FF);
#ifdef DEBUG
fprintf(stderr, "PES_header_data_length: 0x%02x\n", PES_header_data_length); fflush(stderr);
#endif
#ifdef DEBUG_TIMESTAMPS
if (PTS_DTS_flags == 0x2 && PES_header_data_length >= 5) {
unsigned char nextByte = get1Byte();
pts_highBit = (nextByte&0x08)>>3;
pts_remainingBits = (nextByte&0x06)<<29;
unsigned next4Bytes = get4Bytes();
pts_remainingBits |= (next4Bytes&0xFFFE0000)>>2;
pts_remainingBits |= (next4Bytes&0x0000FFFE)>>1;
skipBytes(PES_header_data_length-5);
} else if (PTS_DTS_flags == 0x3 && PES_header_data_length >= 10) {
unsigned char nextByte = get1Byte();
pts_highBit = (nextByte&0x08)>>3;
pts_remainingBits = (nextByte&0x06)<<29;
unsigned next4Bytes = get4Bytes();
pts_remainingBits |= (next4Bytes&0xFFFE0000)>>2;
pts_remainingBits |= (next4Bytes&0x0000FFFE)>>1;
nextByte = get1Byte();
dts_highBit = (nextByte&0x08)>>3;
dts_remainingBits = (nextByte&0x06)<<29;
next4Bytes = get4Bytes();
dts_remainingBits |= (next4Bytes&0xFFFE0000)>>2;
dts_remainingBits |= (next4Bytes&0x0000FFFE)>>1;
skipBytes(PES_header_data_length-10);
}
#else
skipBytes(PES_header_data_length);
#endif
}
}
#ifdef DEBUG_TIMESTAMPS
fprintf(stderr, "%s stream, ", streamTypeStr);
fprintf(stderr, "packet presentation_time_stamp: 0x%x", pts_highBit);
fprintf(stderr, "%08x\n", pts_remainingBits);
fprintf(stderr, "\t\tpacket decoding_time_stamp: 0x%x", dts_highBit);
fprintf(stderr, "%08x\n", dts_remainingBits);
#endif
// The rest of the packet will be the "PES_packet_data_byte"s
// Make sure that "PES_packet_length" was consistent with where we are now:
unsigned char acquiredStreamIdTag = 0;
unsigned currentParserOffset = curOffset();
unsigned bytesSkipped = currentParserOffset - savedParserOffset;
if (stream_id == RAW_PES) {
restoreSavedParserState(); // so we deliver from the beginning of the PES packet
PES_packet_length += 6; // to include the whole of the PES packet
bytesSkipped = 0;
}
if (PES_packet_length < bytesSkipped) {
fUsingDemux->envir() << "MPEGProgramStreamParser::parsePESPacket(): saw inconsistent PES_packet_length "
<< PES_packet_length << " < "
<< bytesSkipped << "\n";
} else {
PES_packet_length -= bytesSkipped;
#ifdef DEBUG
unsigned next4Bytes = test4Bytes();
#endif
// Check whether our using source is interested in this stream type.
// If so, deliver the frame to him:
MPEG1or2Demux::OutputDescriptor_t& out = fUsingDemux->fOutput[stream_id];
if (out.isCurrentlyAwaitingData) {
unsigned numBytesToCopy;
if (PES_packet_length > out.maxSize) {
fUsingDemux->envir() << "MPEGProgramStreamParser::parsePESPacket() error: PES_packet_length ("
<< PES_packet_length
<< ") exceeds max frame size asked for ("
<< out.maxSize << ")\n";
numBytesToCopy = out.maxSize;
} else {
numBytesToCopy = PES_packet_length;
}
getBytes(out.to, numBytesToCopy);
out.frameSize = numBytesToCopy;
#ifdef DEBUG
fprintf(stderr, "%d, %d bytes of PES_packet_data (out.maxSize: %d); first 4 bytes: 0x%08x\n", frameCount, numBytesToCopy, out.maxSize, next4Bytes); fflush(stderr);
#endif
// set out.presentationTime later #####
acquiredStreamIdTag = stream_id;
PES_packet_length -= numBytesToCopy;
} else if (out.isCurrentlyActive) {
// Someone has been reading this stream, but isn't right now.
// We can't deliver this frame until he asks for it, so punt for now.
// The next time he asks for a frame, he'll get it.
#ifdef DEBUG
fprintf(stderr, "%d, currently undeliverable PES data; first 4 bytes: 0x%08x\n", frameCount, next4Bytes); fflush(stderr);
#endif
restoreSavedParserState(); // so we read from the beginning next time
fUsingDemux->fHaveUndeliveredData = True;
throw READER_NOT_READY;
} else if (out.isPotentiallyReadable &&
out.savedDataTotalSize + PES_packet_length < 1000000 /*limit*/) {
// Someone is interested in this stream, but hasn't begun reading it yet.
// Save this data, so that the reader will get it when he later asks for it.
unsigned char* buf = new unsigned char[PES_packet_length];
getBytes(buf, PES_packet_length);
MPEG1or2Demux::OutputDescriptor::SavedData* savedData
= new MPEG1or2Demux::OutputDescriptor::SavedData(buf, PES_packet_length);
if (out.savedDataHead == NULL) {
out.savedDataHead = out.savedDataTail = savedData;
} else {
out.savedDataTail->next = savedData;
out.savedDataTail = savedData;
}
out.savedDataTotalSize += PES_packet_length;
PES_packet_length = 0;
}
skipBytes(PES_packet_length);
}
// Check for another PES Packet next:
setParseState(PARSING_PES_PACKET);
#ifdef DEBUG
++frameCount;
#endif
return acquiredStreamIdTag;
}
live/liveMedia/MPEG1or2DemuxedElementaryStream.cpp 000444 001752 001752 00000006145 12656261123 022034 0 ustar 00rsf rsf 000000 000000 /**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// An MPEG 1 or 2 Elementary Stream, demultiplexed from a Program Stream
// Implementation
#include "MPEG1or2DemuxedElementaryStream.hh"
////////// MPEG1or2DemuxedElementaryStream //////////
MPEG1or2DemuxedElementaryStream::
MPEG1or2DemuxedElementaryStream(UsageEnvironment& env, u_int8_t streamIdTag,
MPEG1or2Demux& sourceDemux)
: FramedSource(env),
fOurStreamIdTag(streamIdTag), fOurSourceDemux(sourceDemux), fMPEGversion(0) {
// Set our MIME type string for known media types:
if ((streamIdTag&0xE0) == 0xC0) {
fMIMEtype = "audio/MPEG";
} else if ((streamIdTag&0xF0) == 0xE0) {
fMIMEtype = "video/MPEG";
} else {
fMIMEtype = MediaSource::MIMEtype();
}
}
MPEG1or2DemuxedElementaryStream::~MPEG1or2DemuxedElementaryStream() {
fOurSourceDemux.noteElementaryStreamDeletion(this);
}
void MPEG1or2DemuxedElementaryStream::doGetNextFrame() {
fOurSourceDemux.getNextFrame(fOurStreamIdTag, fTo, fMaxSize,
afterGettingFrame, this,
handleClosure, this);
}
void MPEG1or2DemuxedElementaryStream::doStopGettingFrames() {
fOurSourceDemux.stopGettingFrames(fOurStreamIdTag);
}
char const* MPEG1or2DemuxedElementaryStream::MIMEtype() const {
return fMIMEtype;
}
unsigned MPEG1or2DemuxedElementaryStream::maxFrameSize() const {
return 6+65535;
// because the MPEG spec allows for PES packets as large as
// (6 + 65535) bytes (header + data)
}
void MPEG1or2DemuxedElementaryStream
::afterGettingFrame(void* clientData,
unsigned frameSize, unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
MPEG1or2DemuxedElementaryStream* stream
= (MPEG1or2DemuxedElementaryStream*)clientData;
stream->afterGettingFrame1(frameSize, numTruncatedBytes,
presentationTime, durationInMicroseconds);
}
void MPEG1or2DemuxedElementaryStream
::afterGettingFrame1(unsigned frameSize, unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
fFrameSize = frameSize;
fNumTruncatedBytes = numTruncatedBytes;
fPresentationTime = presentationTime;
fDurationInMicroseconds = durationInMicroseconds;
fLastSeenSCR = fOurSourceDemux.lastSeenSCR();
fMPEGversion = fOurSourceDemux.mpegVersion();
FramedSource::afterGetting(this);
}
live/liveMedia/MPEG4ESVideoRTPSink.cpp 000444 001752 001752 00000012720 12656261123 017324 0 ustar 00rsf rsf 000000 000000 /**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// RTP sink for MPEG-4 Elementary Stream video (RFC 3016)
// Implementation
#include "MPEG4ESVideoRTPSink.hh"
#include "MPEG4VideoStreamFramer.hh"
#include "MPEG4LATMAudioRTPSource.hh" // for "parseGeneralConfigStr()"
MPEG4ESVideoRTPSink
::MPEG4ESVideoRTPSink(UsageEnvironment& env, Groupsock* RTPgs, unsigned char rtpPayloadFormat, u_int32_t rtpTimestampFrequency,
u_int8_t profileAndLevelIndication, char const* configStr)
: VideoRTPSink(env, RTPgs, rtpPayloadFormat, rtpTimestampFrequency, "MP4V-ES"),
fVOPIsPresent(False), fProfileAndLevelIndication(profileAndLevelIndication), fFmtpSDPLine(NULL) {
fConfigBytes = parseGeneralConfigStr(configStr, fNumConfigBytes);
}
MPEG4ESVideoRTPSink::~MPEG4ESVideoRTPSink() {
delete[] fFmtpSDPLine;
delete[] fConfigBytes;
}
MPEG4ESVideoRTPSink*
MPEG4ESVideoRTPSink::createNew(UsageEnvironment& env,
Groupsock* RTPgs, unsigned char rtpPayloadFormat,
u_int32_t rtpTimestampFrequency) {
return new MPEG4ESVideoRTPSink(env, RTPgs, rtpPayloadFormat, rtpTimestampFrequency);
}
MPEG4ESVideoRTPSink*
MPEG4ESVideoRTPSink::createNew(UsageEnvironment& env,
Groupsock* RTPgs, unsigned char rtpPayloadFormat, u_int32_t rtpTimestampFrequency,
u_int8_t profileAndLevelIndication, char const* configStr) {
return new MPEG4ESVideoRTPSink(env, RTPgs, rtpPayloadFormat, rtpTimestampFrequency, profileAndLevelIndication, configStr);
}
Boolean MPEG4ESVideoRTPSink::sourceIsCompatibleWithUs(MediaSource& source) {
// Our source must be an appropriate framer:
return source.isMPEG4VideoStreamFramer();
}
#define VOP_START_CODE 0x000001B6
void MPEG4ESVideoRTPSink
::doSpecialFrameHandling(unsigned fragmentationOffset,
unsigned char* frameStart,
unsigned numBytesInFrame,
struct timeval framePresentationTime,
unsigned numRemainingBytes) {
if (fragmentationOffset == 0) {
// Begin by inspecting the 4-byte code at the start of the frame:
if (numBytesInFrame < 4) return; // shouldn't happen
u_int32_t startCode
= (frameStart[0]<<24) | (frameStart[1]<<16) | (frameStart[2]<<8) | frameStart[3];
fVOPIsPresent = startCode == VOP_START_CODE;
}
// Set the RTP 'M' (marker) bit iff this frame ends a VOP
// (and there are no fragments remaining).
// This relies on the source being a "MPEG4VideoStreamFramer".
MPEG4VideoStreamFramer* framerSource = (MPEG4VideoStreamFramer*)fSource;
if (framerSource != NULL && framerSource->pictureEndMarker()
&& numRemainingBytes == 0) {
setMarkerBit();
framerSource->pictureEndMarker() = False;
}
// Also set the RTP timestamp. (We do this for each frame
// in the packet, to ensure that the timestamp of the VOP (if present)
// gets used.)
setTimestamp(framePresentationTime);
}
Boolean MPEG4ESVideoRTPSink::allowFragmentationAfterStart() const {
return True;
}
Boolean MPEG4ESVideoRTPSink
::frameCanAppearAfterPacketStart(unsigned char const* /*frameStart*/,
unsigned /*numBytesInFrame*/) const {
// Once we've packed a VOP into the packet, then no other
// frame can be packed into it:
return !fVOPIsPresent;
}
char const* MPEG4ESVideoRTPSink::auxSDPLine() {
// Generate a new "a=fmtp:" line each time, using our own 'configuration' information (if we have it),
// otherwise parameters from our framer source (in case they've changed since the last time that
// we were called):
unsigned configLength = fNumConfigBytes;
unsigned char* config = fConfigBytes;
if (fProfileAndLevelIndication == 0 || config == NULL) {
// We need to get this information from our framer source:
MPEG4VideoStreamFramer* framerSource = (MPEG4VideoStreamFramer*)fSource;
if (framerSource == NULL) return NULL; // we don't yet have a source
fProfileAndLevelIndication = framerSource->profile_and_level_indication();
if (fProfileAndLevelIndication == 0) return NULL; // our source isn't ready
config = framerSource->getConfigBytes(configLength);
if (config == NULL) return NULL; // our source isn't ready
}
char const* fmtpFmt =
"a=fmtp:%d "
"profile-level-id=%d;"
"config=";
unsigned fmtpFmtSize = strlen(fmtpFmt)
+ 3 /* max char len */
+ 3 /* max char len */
+ 2*configLength /* 2*, because each byte prints as 2 chars */
+ 2 /* trailing \r\n */;
char* fmtp = new char[fmtpFmtSize];
sprintf(fmtp, fmtpFmt, rtpPayloadType(), fProfileAndLevelIndication);
char* endPtr = &fmtp[strlen(fmtp)];
for (unsigned i = 0; i < configLength; ++i) {
sprintf(endPtr, "%02X", config[i]);
endPtr += 2;
}
sprintf(endPtr, "\r\n");
delete[] fFmtpSDPLine;
fFmtpSDPLine = strDup(fmtp);
delete[] fmtp;
return fFmtpSDPLine;
}
live/liveMedia/MPEG1or2DemuxedServerMediaSubsession.cpp 000444 001752 001752 00000012523 12656261123 023034 0 ustar 00rsf rsf 000000 000000 /**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A 'ServerMediaSubsession' object that creates new, unicast, "RTPSink"s
// on demand, from an MPEG-1 or 2 demuxer.
// Implementation
#include "MPEG1or2DemuxedServerMediaSubsession.hh"
#include "MPEG1or2AudioStreamFramer.hh"
#include "MPEG1or2AudioRTPSink.hh"
#include "MPEG1or2VideoStreamFramer.hh"
#include "MPEG1or2VideoRTPSink.hh"
#include "AC3AudioStreamFramer.hh"
#include "AC3AudioRTPSink.hh"
#include "ByteStreamFileSource.hh"
MPEG1or2DemuxedServerMediaSubsession* MPEG1or2DemuxedServerMediaSubsession
::createNew(MPEG1or2FileServerDemux& demux, u_int8_t streamIdTag,
Boolean reuseFirstSource, Boolean iFramesOnly, double vshPeriod) {
return new MPEG1or2DemuxedServerMediaSubsession(demux, streamIdTag,
reuseFirstSource,
iFramesOnly, vshPeriod);
}
MPEG1or2DemuxedServerMediaSubsession
::MPEG1or2DemuxedServerMediaSubsession(MPEG1or2FileServerDemux& demux,
u_int8_t streamIdTag, Boolean reuseFirstSource,
Boolean iFramesOnly, double vshPeriod)
: OnDemandServerMediaSubsession(demux.envir(), reuseFirstSource),
fOurDemux(demux), fStreamIdTag(streamIdTag),
fIFramesOnly(iFramesOnly), fVSHPeriod(vshPeriod) {
}
MPEG1or2DemuxedServerMediaSubsession::~MPEG1or2DemuxedServerMediaSubsession() {
}
FramedSource* MPEG1or2DemuxedServerMediaSubsession
::createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate) {
FramedSource* es = NULL;
do {
es = fOurDemux.newElementaryStream(clientSessionId, fStreamIdTag);
if (es == NULL) break;
if ((fStreamIdTag&0xF0) == 0xC0 /*MPEG audio*/) {
estBitrate = 128; // kbps, estimate
return MPEG1or2AudioStreamFramer::createNew(envir(), es);
} else if ((fStreamIdTag&0xF0) == 0xE0 /*video*/) {
estBitrate = 500; // kbps, estimate
return MPEG1or2VideoStreamFramer::createNew(envir(), es,
fIFramesOnly, fVSHPeriod);
} else if (fStreamIdTag == 0xBD /*AC-3 audio*/) {
estBitrate = 192; // kbps, estimate
return AC3AudioStreamFramer::createNew(envir(), es, 0x80);
} else { // unknown stream type
break;
}
} while (0);
// An error occurred:
Medium::close(es);
return NULL;
}
RTPSink* MPEG1or2DemuxedServerMediaSubsession
::createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic,
FramedSource* inputSource) {
if ((fStreamIdTag&0xF0) == 0xC0 /*MPEG audio*/) {
return MPEG1or2AudioRTPSink::createNew(envir(), rtpGroupsock);
} else if ((fStreamIdTag&0xF0) == 0xE0 /*video*/) {
return MPEG1or2VideoRTPSink::createNew(envir(), rtpGroupsock);
} else if (fStreamIdTag == 0xBD /*AC-3 audio*/) {
// Get the sampling frequency from the audio source; use it for the RTP frequency:
AC3AudioStreamFramer* audioSource
= (AC3AudioStreamFramer*)inputSource;
return AC3AudioRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
audioSource->samplingRate());
} else {
return NULL;
}
}
void MPEG1or2DemuxedServerMediaSubsession
::seekStreamSource(FramedSource* inputSource, double& seekNPT, double /*streamDuration*/, u_int64_t& /*numBytes*/) {
float const dur = duration();
unsigned const size = fOurDemux.fileSize();
unsigned absBytePosition = dur == 0.0 ? 0 : (unsigned)((seekNPT/dur)*size);
// "inputSource" is a 'framer'
// Flush its data, to account for the seek that we're about to do:
if ((fStreamIdTag&0xF0) == 0xC0 /*MPEG audio*/) {
MPEG1or2AudioStreamFramer* framer = (MPEG1or2AudioStreamFramer*)inputSource;
framer->flushInput();
} else if ((fStreamIdTag&0xF0) == 0xE0 /*video*/) {
MPEG1or2VideoStreamFramer* framer = (MPEG1or2VideoStreamFramer*)inputSource;
framer->flushInput();
}
// "inputSource" is a filter; its input source is the original elementary stream source:
MPEG1or2DemuxedElementaryStream* elemStreamSource
= (MPEG1or2DemuxedElementaryStream*)(((FramedFilter*)inputSource)->inputSource());
// Next, get the original source demux:
MPEG1or2Demux& sourceDemux = elemStreamSource->sourceDemux();
// and flush its input buffers:
sourceDemux.flushInput();
// Then, get the original input file stream from the source demux:
ByteStreamFileSource* inputFileSource
= (ByteStreamFileSource*)(sourceDemux.inputSource());
// Note: We can make that cast, because we know that the demux was originally
// created from a "ByteStreamFileSource".
// Do the appropriate seek within the input file stream:
inputFileSource->seekToByteAbsolute(absBytePosition);
}
float MPEG1or2DemuxedServerMediaSubsession::duration() const {
return fOurDemux.fileDuration();
}
live/liveMedia/MPEG1or2FileServerDemux.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A server demultiplexer for an MPEG-1 or 2 Program Stream
// Implementation
#include "MPEG1or2FileServerDemux.hh"
#include "MPEG1or2DemuxedServerMediaSubsession.hh"
#include "ByteStreamFileSource.hh"
MPEG1or2FileServerDemux*
MPEG1or2FileServerDemux::createNew(UsageEnvironment& env, char const* fileName,
Boolean reuseFirstSource) {
return new MPEG1or2FileServerDemux(env, fileName, reuseFirstSource);
}
static float MPEG1or2ProgramStreamFileDuration(UsageEnvironment& env,
char const* fileName,
unsigned& fileSize); // forward
MPEG1or2FileServerDemux
::MPEG1or2FileServerDemux(UsageEnvironment& env, char const* fileName,
Boolean reuseFirstSource)
: Medium(env),
fReuseFirstSource(reuseFirstSource),
fSession0Demux(NULL), fLastCreatedDemux(NULL), fLastClientSessionId(~0) {
fFileName = strDup(fileName);
fFileDuration = MPEG1or2ProgramStreamFileDuration(env, fileName, fFileSize);
}
MPEG1or2FileServerDemux::~MPEG1or2FileServerDemux() {
Medium::close(fSession0Demux);
delete[] (char*)fFileName;
}
ServerMediaSubsession*
MPEG1or2FileServerDemux::newAudioServerMediaSubsession() {
return MPEG1or2DemuxedServerMediaSubsession::createNew(*this, 0xC0, fReuseFirstSource);
}
ServerMediaSubsession*
MPEG1or2FileServerDemux::newVideoServerMediaSubsession(Boolean iFramesOnly,
double vshPeriod) {
return MPEG1or2DemuxedServerMediaSubsession::createNew(*this, 0xE0, fReuseFirstSource,
iFramesOnly, vshPeriod);
}
ServerMediaSubsession*
MPEG1or2FileServerDemux::newAC3AudioServerMediaSubsession() {
return MPEG1or2DemuxedServerMediaSubsession::createNew(*this, 0xBD, fReuseFirstSource);
// because, in a VOB file, the AC3 audio has stream id 0xBD
}
MPEG1or2DemuxedElementaryStream*
MPEG1or2FileServerDemux::newElementaryStream(unsigned clientSessionId,
u_int8_t streamIdTag) {
MPEG1or2Demux* demuxToUse;
if (clientSessionId == 0) {
// 'Session 0' is treated specially, because its audio & video streams
// are created and destroyed one-at-a-time, rather than both streams being
// created, and then (later) both streams being destroyed (as is the case
// for other ('real') session ids). Because of this, a separate demux is
// used for session 0, and its deletion is managed by us, rather than
// happening automatically.
if (fSession0Demux == NULL) {
// Open our input file as a 'byte-stream file source':
ByteStreamFileSource* fileSource
= ByteStreamFileSource::createNew(envir(), fFileName);
if (fileSource == NULL) return NULL;
fSession0Demux = MPEG1or2Demux::createNew(envir(), fileSource, False/*note!*/);
}
demuxToUse = fSession0Demux;
} else {
// First, check whether this is a new client session. If so, create a new
// demux for it:
if (clientSessionId != fLastClientSessionId) {
// Open our input file as a 'byte-stream file source':
ByteStreamFileSource* fileSource
= ByteStreamFileSource::createNew(envir(), fFileName);
if (fileSource == NULL) return NULL;
fLastCreatedDemux = MPEG1or2Demux::createNew(envir(), fileSource, True);
// Note: We tell the demux to delete itself when its last
// elementary stream is deleted.
fLastClientSessionId = clientSessionId;
// Note: This code relies upon the fact that stream creation for
// different client sessions does not overlap - so only one "MPEG1or2Demux"
// is used at a time.
}
demuxToUse = fLastCreatedDemux;
}
if (demuxToUse == NULL) return NULL; // shouldn't happen
return demuxToUse->newElementaryStream(streamIdTag);
}
static Boolean getMPEG1or2TimeCode(FramedSource* dataSource,
MPEG1or2Demux& parentDemux,
Boolean returnFirstSeenCode,
float& timeCode); // forward
static float MPEG1or2ProgramStreamFileDuration(UsageEnvironment& env,
char const* fileName,
unsigned& fileSize) {
FramedSource* dataSource = NULL;
float duration = 0.0; // until we learn otherwise
fileSize = 0; // ditto
do {
// Open the input file as a 'byte-stream file source':
ByteStreamFileSource* fileSource = ByteStreamFileSource::createNew(env, fileName);
if (fileSource == NULL) break;
dataSource = fileSource;
fileSize = (unsigned)(fileSource->fileSize());
if (fileSize == 0) break;
// Create an MPEG demultiplexor that reads from that source.
MPEG1or2Demux* baseDemux = MPEG1or2Demux::createNew(env, dataSource, True);
if (baseDemux == NULL) break;
// Create, from this, a source that returns raw PES packets:
dataSource = baseDemux->newRawPESStream();
// Read the first time code from the file:
float firstTimeCode;
if (!getMPEG1or2TimeCode(dataSource, *baseDemux, True, firstTimeCode)) break;
// Then, read the last time code from the file.
// (Before doing this, flush the demux's input buffers,
// and seek towards the end of the file, for efficiency.)
baseDemux->flushInput();
unsigned const startByteFromEnd = 100000;
unsigned newFilePosition
= fileSize < startByteFromEnd ? 0 : fileSize - startByteFromEnd;
if (newFilePosition > 0) fileSource->seekToByteAbsolute(newFilePosition);
float lastTimeCode;
if (!getMPEG1or2TimeCode(dataSource, *baseDemux, False, lastTimeCode)) break;
// Take the difference between these time codes as being the file duration:
float timeCodeDiff = lastTimeCode - firstTimeCode;
if (timeCodeDiff < 0) break;
duration = timeCodeDiff;
} while (0);
Medium::close(dataSource);
return duration;
}
#define MFSD_DUMMY_SINK_BUFFER_SIZE (6+65535) /* large enough for a PES packet */
class MFSD_DummySink: public MediaSink {
public:
MFSD_DummySink(MPEG1or2Demux& demux, Boolean returnFirstSeenCode);
virtual ~MFSD_DummySink();
char watchVariable;
private:
// redefined virtual function:
virtual Boolean continuePlaying();
private:
static void afterGettingFrame(void* clientData, unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds);
void afterGettingFrame1();
private:
MPEG1or2Demux& fOurDemux;
Boolean fReturnFirstSeenCode;
unsigned char fBuf[MFSD_DUMMY_SINK_BUFFER_SIZE];
};
static void afterPlayingMFSD_DummySink(MFSD_DummySink* sink); // forward
static float computeSCRTimeCode(MPEG1or2Demux::SCR const& scr); // forward
static Boolean getMPEG1or2TimeCode(FramedSource* dataSource,
MPEG1or2Demux& parentDemux,
Boolean returnFirstSeenCode,
float& timeCode) {
// Start reading through "dataSource", until we see a SCR time code:
parentDemux.lastSeenSCR().isValid = False;
UsageEnvironment& env = dataSource->envir(); // alias
MFSD_DummySink sink(parentDemux, returnFirstSeenCode);
sink.startPlaying(*dataSource,
(MediaSink::afterPlayingFunc*)afterPlayingMFSD_DummySink, &sink);
env.taskScheduler().doEventLoop(&sink.watchVariable);
timeCode = computeSCRTimeCode(parentDemux.lastSeenSCR());
return parentDemux.lastSeenSCR().isValid;
}
////////// MFSD_DummySink implementation //////////
MFSD_DummySink::MFSD_DummySink(MPEG1or2Demux& demux, Boolean returnFirstSeenCode)
: MediaSink(demux.envir()),
watchVariable(0), fOurDemux(demux), fReturnFirstSeenCode(returnFirstSeenCode) {
}
MFSD_DummySink::~MFSD_DummySink() {
}
Boolean MFSD_DummySink::continuePlaying() {
if (fSource == NULL) return False; // sanity check
fSource->getNextFrame(fBuf, sizeof fBuf,
afterGettingFrame, this,
onSourceClosure, this);
return True;
}
void MFSD_DummySink::afterGettingFrame(void* clientData, unsigned /*frameSize*/,
unsigned /*numTruncatedBytes*/,
struct timeval /*presentationTime*/,
unsigned /*durationInMicroseconds*/) {
MFSD_DummySink* sink = (MFSD_DummySink*)clientData;
sink->afterGettingFrame1();
}
void MFSD_DummySink::afterGettingFrame1() {
if (fReturnFirstSeenCode && fOurDemux.lastSeenSCR().isValid) {
// We were asked to return the first SCR that we saw, and we've seen one,
// so we're done. (Handle this as if the input source had closed.)
onSourceClosure();
return;
}
continuePlaying();
}
static void afterPlayingMFSD_DummySink(MFSD_DummySink* sink) {
// Return from the "doEventLoop()" call:
sink->watchVariable = ~0;
}
static float computeSCRTimeCode(MPEG1or2Demux::SCR const& scr) {
double result = scr.remainingBits/90000.0 + scr.extension/300.0;
if (scr.highBit) {
// Add (2^32)/90000 == (2^28)/5625
double const highBitValue = (256*1024*1024)/5625.0;
result += highBitValue;
}
return (float)result;
}
live/liveMedia/MPEG1or2VideoFileServerMediaSubsession.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A 'ServerMediaSubsession' object that creates new, unicast, "RTPSink"s
// on demand, from an MPEG-1 or 2 Elementary Stream video file.
// Implementation
#include "MPEG1or2VideoFileServerMediaSubsession.hh"
#include "MPEG1or2VideoRTPSink.hh"
#include "ByteStreamFileSource.hh"
#include "MPEG1or2VideoStreamFramer.hh"
MPEG1or2VideoFileServerMediaSubsession*
MPEG1or2VideoFileServerMediaSubsession::createNew(UsageEnvironment& env,
char const* fileName,
Boolean reuseFirstSource,
Boolean iFramesOnly,
double vshPeriod) {
return new MPEG1or2VideoFileServerMediaSubsession(env, fileName, reuseFirstSource,
iFramesOnly, vshPeriod);
}
MPEG1or2VideoFileServerMediaSubsession
::MPEG1or2VideoFileServerMediaSubsession(UsageEnvironment& env,
char const* fileName,
Boolean reuseFirstSource,
Boolean iFramesOnly,
double vshPeriod)
: FileServerMediaSubsession(env, fileName, reuseFirstSource),
fIFramesOnly(iFramesOnly), fVSHPeriod(vshPeriod) {
}
MPEG1or2VideoFileServerMediaSubsession
::~MPEG1or2VideoFileServerMediaSubsession() {
}
FramedSource* MPEG1or2VideoFileServerMediaSubsession
::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
estBitrate = 500; // kbps, estimate
ByteStreamFileSource* fileSource
= ByteStreamFileSource::createNew(envir(), fFileName);
if (fileSource == NULL) return NULL;
fFileSize = fileSource->fileSize();
return MPEG1or2VideoStreamFramer
::createNew(envir(), fileSource, fIFramesOnly, fVSHPeriod);
}
RTPSink* MPEG1or2VideoFileServerMediaSubsession
::createNewRTPSink(Groupsock* rtpGroupsock,
unsigned char /*rtpPayloadTypeIfDynamic*/,
FramedSource* /*inputSource*/) {
return MPEG1or2VideoRTPSink::createNew(envir(), rtpGroupsock);
}
live/liveMedia/MPEG1or2VideoRTPSink.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// RTP sink for MPEG video (RFC 2250)
// Implementation
#include "MPEG1or2VideoRTPSink.hh"
#include "MPEG1or2VideoStreamFramer.hh"
MPEG1or2VideoRTPSink::MPEG1or2VideoRTPSink(UsageEnvironment& env, Groupsock* RTPgs)
: VideoRTPSink(env, RTPgs, 32, 90000, "MPV") {
fPictureState.temporal_reference = 0;
fPictureState.picture_coding_type = fPictureState.vector_code_bits = 0;
}
MPEG1or2VideoRTPSink::~MPEG1or2VideoRTPSink() {
}
MPEG1or2VideoRTPSink*
MPEG1or2VideoRTPSink::createNew(UsageEnvironment& env, Groupsock* RTPgs) {
return new MPEG1or2VideoRTPSink(env, RTPgs);
}
Boolean MPEG1or2VideoRTPSink::sourceIsCompatibleWithUs(MediaSource& source) {
// Our source must be an appropriate framer:
return source.isMPEG1or2VideoStreamFramer();
}
Boolean MPEG1or2VideoRTPSink::allowFragmentationAfterStart() const {
return True;
}
Boolean MPEG1or2VideoRTPSink
::frameCanAppearAfterPacketStart(unsigned char const* frameStart,
unsigned numBytesInFrame) const {
// A 'frame' (which in this context can mean a header or a slice as well as a
// complete picture) can appear at other than the first position in a packet
// in all situations, EXCEPT when it follows the end of (i.e., the last slice
// of) a picture. I.e., the headers at the beginning of a picture must
// appear at the start of an RTP packet.
if (!fPreviousFrameWasSlice) return True;
// A slice is already packed into this packet. We allow this new 'frame'
// to be packed after it, provided that it is also a slice:
return numBytesInFrame >= 4
&& frameStart[0] == 0 && frameStart[1] == 0 && frameStart[2] == 1
&& frameStart[3] >= 1 && frameStart[3] <= 0xAF;
}
#define VIDEO_SEQUENCE_HEADER_START_CODE 0x000001B3
#define PICTURE_START_CODE 0x00000100
void MPEG1or2VideoRTPSink
::doSpecialFrameHandling(unsigned fragmentationOffset,
unsigned char* frameStart,
unsigned numBytesInFrame,
struct timeval framePresentationTime,
unsigned numRemainingBytes) {
Boolean thisFrameIsASlice = False; // until we learn otherwise
if (isFirstFrameInPacket()) {
fSequenceHeaderPresent = fPacketBeginsSlice = fPacketEndsSlice = False;
}
if (fragmentationOffset == 0) {
// Begin by inspecting the 4-byte code at the start of the frame:
if (numBytesInFrame < 4) return; // shouldn't happen
unsigned startCode = (frameStart[0]<<24) | (frameStart[1]<<16)
| (frameStart[2]<<8) | frameStart[3];
if (startCode == VIDEO_SEQUENCE_HEADER_START_CODE) {
// This is a video sequence header
fSequenceHeaderPresent = True;
} else if (startCode == PICTURE_START_CODE) {
// This is a picture header
// Record the parameters of this picture:
if (numBytesInFrame < 8) return; // shouldn't happen
unsigned next4Bytes = (frameStart[4]<<24) | (frameStart[5]<<16)
| (frameStart[6]<<8) | frameStart[7];
unsigned char byte8 = numBytesInFrame == 8 ? 0 : frameStart[8];
fPictureState.temporal_reference = (next4Bytes&0xFFC00000)>>(32-10);
fPictureState.picture_coding_type = (next4Bytes&0x00380000)>>(32-(10+3));
unsigned char FBV, BFC, FFV, FFC;
FBV = BFC = FFV = FFC = 0;
switch (fPictureState.picture_coding_type) {
case 3:
FBV = (byte8&0x40)>>6;
BFC = (byte8&0x38)>>3;
// fall through to:
case 2:
FFV = (next4Bytes&0x00000004)>>2;
FFC = ((next4Bytes&0x00000003)<<1) | ((byte8&0x80)>>7);
}
fPictureState.vector_code_bits = (FBV<<7) | (BFC<<4) | (FFV<<3) | FFC;
} else if ((startCode&0xFFFFFF00) == 0x00000100) {
unsigned char lastCodeByte = startCode&0xFF;
if (lastCodeByte <= 0xAF) {
// This is (the start of) a slice
thisFrameIsASlice = True;
} else {
// This is probably a GOP header; we don't do anything with this
}
} else {
// The first 4 bytes aren't a code that we recognize.
envir() << "Warning: MPEG1or2VideoRTPSink::doSpecialFrameHandling saw strange first 4 bytes "
<< (void*)startCode << ", but we're not a fragment\n";
}
} else {
// We're a fragment (other than the first) of a slice.
thisFrameIsASlice = True;
}
if (thisFrameIsASlice) {
// This packet begins a slice iff there's no fragmentation offset:
fPacketBeginsSlice = (fragmentationOffset == 0);
// This packet also ends a slice iff there are no fragments remaining:
fPacketEndsSlice = (numRemainingBytes == 0);
}
// Set the video-specific header based on the parameters that we've seen.
// Note that this may get done more than once, if several frames appear
// in the packet. That's OK, because this situation happens infrequently,
// and we want the video-specific header to reflect the most up-to-date
// information (in particular, from a Picture Header) anyway.
unsigned videoSpecificHeader =
// T == 0
(fPictureState.temporal_reference<<16) |
// AN == N == 0
(fSequenceHeaderPresent<<13) |
(fPacketBeginsSlice<<12) |
(fPacketEndsSlice<<11) |
(fPictureState.picture_coding_type<<8) |
fPictureState.vector_code_bits;
setSpecialHeaderWord(videoSpecificHeader);
// Also set the RTP timestamp. (As above, we do this for each frame
// in the packet.)
setTimestamp(framePresentationTime);
// Set the RTP 'M' (marker) bit iff this frame ends (i.e., is the last
// slice of) a picture (and there are no fragments remaining).
// This relies on the source being a "MPEG1or2VideoStreamFramer".
MPEG1or2VideoStreamFramer* framerSource = (MPEG1or2VideoStreamFramer*)fSource;
if (framerSource != NULL && framerSource->pictureEndMarker()
&& numRemainingBytes == 0) {
setMarkerBit();
framerSource->pictureEndMarker() = False;
}
fPreviousFrameWasSlice = thisFrameIsASlice;
}
unsigned MPEG1or2VideoRTPSink::specialHeaderSize() const {
// There's a 4 byte special video header:
return 4;
}
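The video-specific header word assembled in "doSpecialFrameHandling()" above follows the RFC 2250 MPEG video layout. This is a hypothetical standalone mirror of that bit-packing (the function name is illustrative; the T, AN and N fields are left as 0, as in the sink):

```cpp
#include <cstdint>

// Hypothetical mirror of the bit-packing in doSpecialFrameHandling():
// builds the 4-byte RFC 2250 MPEG video-specific header word.
uint32_t packVideoSpecificHeader(unsigned temporalReference,
                                 bool sequenceHeaderPresent,
                                 bool packetBeginsSlice,
                                 bool packetEndsSlice,
                                 unsigned pictureCodingType,
                                 unsigned vectorCodeBits) {
  return (temporalReference << 16)
       | ((sequenceHeaderPresent ? 1u : 0u) << 13)  // S bit
       | ((packetBeginsSlice ? 1u : 0u) << 12)      // B bit
       | ((packetEndsSlice ? 1u : 0u) << 11)        // E bit
       | (pictureCodingType << 8)                   // P field (3 bits)
       | vectorCodeBits;                            // FBV/BFC/FFV/FFC (8 bits)
}
```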
live/liveMedia/MPEG1or2VideoRTPSource.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// MPEG-1 or MPEG-2 Video RTP Sources
// Implementation
#include "MPEG1or2VideoRTPSource.hh"
MPEG1or2VideoRTPSource*
MPEG1or2VideoRTPSource::createNew(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency) {
return new MPEG1or2VideoRTPSource(env, RTPgs, rtpPayloadFormat,
rtpTimestampFrequency);
}
MPEG1or2VideoRTPSource::MPEG1or2VideoRTPSource(UsageEnvironment& env,
Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency)
: MultiFramedRTPSource(env, RTPgs,
rtpPayloadFormat, rtpTimestampFrequency){
}
MPEG1or2VideoRTPSource::~MPEG1or2VideoRTPSource() {
}
Boolean MPEG1or2VideoRTPSource
::processSpecialHeader(BufferedPacket* packet,
unsigned& resultSpecialHeaderSize) {
// There's a 4-byte video-specific header
if (packet->dataSize() < 4) return False;
u_int32_t header = ntohl(*(u_int32_t*)(packet->data()));
u_int32_t sBit = header&0x00002000; // sequence-header-present
u_int32_t bBit = header&0x00001000; // beginning-of-slice
u_int32_t eBit = header&0x00000800; // end-of-slice
fCurrentPacketBeginsFrame = (sBit|bBit) != 0;
fCurrentPacketCompletesFrame = ((sBit != 0) && (bBit == 0)) || (eBit != 0);
resultSpecialHeaderSize = 4;
return True;
}
Boolean MPEG1or2VideoRTPSource
::packetIsUsableInJitterCalculation(unsigned char* packet,
unsigned packetSize) {
// There's a 4-byte video-specific header
if (packetSize < 4) return False;
// Extract the "Picture-Type" field from this, to determine whether
// this packet can be used in jitter calculations:
unsigned header = ntohl(*(u_int32_t*)packet);
unsigned short pictureType = (header>>8)&0x7;
if (pictureType == 1) { // an I frame
return True;
} else { // a P, B, D, or other unknown frame type
return False;
}
}
char const* MPEG1or2VideoRTPSource::MIMEtype() const {
return "video/MPEG";
}
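The flag extraction in "processSpecialHeader()" above (the receiver-side complement of the sink's header packing) can be sketched standalone. This is a hypothetical mirror; the struct and function names are illustrative, not part of the library:

```cpp
#include <cstdint>

// Hypothetical mirror of processSpecialHeader(): decodes the S
// (sequence-header-present), B (beginning-of-slice) and E (end-of-slice)
// flags, plus the 3-bit picture type, from the 4-byte video-specific header.
struct VideoSpecificHeader {
  bool sBit, bBit, eBit;
  unsigned pictureType;
};

VideoSpecificHeader parseVideoSpecificHeader(uint32_t header) {
  VideoSpecificHeader h;
  h.sBit = (header & 0x00002000) != 0;
  h.bBit = (header & 0x00001000) != 0;
  h.eBit = (header & 0x00000800) != 0;
  h.pictureType = (header >> 8) & 0x7;  // 1 == I frame
  return h;
}
```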
live/liveMedia/MPEG1or2VideoStreamDiscreteFramer.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A simplified version of "MPEG1or2VideoStreamFramer" that takes only
// complete, discrete frames (rather than an arbitrary byte stream) as input.
// This avoids the parsing and data copying overhead of the full
// "MPEG1or2VideoStreamFramer".
// Implementation
#include "MPEG1or2VideoStreamDiscreteFramer.hh"
MPEG1or2VideoStreamDiscreteFramer*
MPEG1or2VideoStreamDiscreteFramer::createNew(UsageEnvironment& env,
FramedSource* inputSource,
Boolean iFramesOnly,
double vshPeriod,
Boolean leavePresentationTimesUnmodified) {
// Need to add source type checking here??? #####
return new MPEG1or2VideoStreamDiscreteFramer(env, inputSource,
iFramesOnly, vshPeriod, leavePresentationTimesUnmodified);
}
MPEG1or2VideoStreamDiscreteFramer
::MPEG1or2VideoStreamDiscreteFramer(UsageEnvironment& env,
FramedSource* inputSource,
Boolean iFramesOnly, double vshPeriod, Boolean leavePresentationTimesUnmodified)
: MPEG1or2VideoStreamFramer(env, inputSource, iFramesOnly, vshPeriod,
False/*don't create a parser*/),
fLeavePresentationTimesUnmodified(leavePresentationTimesUnmodified),
fLastNonBFrameTemporal_reference(0),
fSavedVSHSize(0), fSavedVSHTimestamp(0.0),
fIFramesOnly(iFramesOnly), fVSHPeriod(vshPeriod) {
fLastNonBFramePresentationTime.tv_sec = 0;
fLastNonBFramePresentationTime.tv_usec = 0;
}
MPEG1or2VideoStreamDiscreteFramer::~MPEG1or2VideoStreamDiscreteFramer() {
}
void MPEG1or2VideoStreamDiscreteFramer::doGetNextFrame() {
// Arrange to read data (which should be a complete MPEG-1 or 2 video frame)
// from our data source, directly into the client's input buffer.
// After reading this, we'll do some parsing on the frame.
fInputSource->getNextFrame(fTo, fMaxSize,
afterGettingFrame, this,
FramedSource::handleClosure, this);
}
void MPEG1or2VideoStreamDiscreteFramer
::afterGettingFrame(void* clientData, unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
MPEG1or2VideoStreamDiscreteFramer* source
= (MPEG1or2VideoStreamDiscreteFramer*)clientData;
source->afterGettingFrame1(frameSize, numTruncatedBytes,
presentationTime, durationInMicroseconds);
}
static double const frameRateFromCode[] = {
0.0, // forbidden
24000/1001.0, // approx 23.976
24.0,
25.0,
30000/1001.0, // approx 29.97
30.0,
50.0,
60000/1001.0, // approx 59.94
60.0,
0.0, // reserved
0.0, // reserved
0.0, // reserved
0.0, // reserved
0.0, // reserved
0.0, // reserved
0.0 // reserved
};
#define MILLION 1000000
void MPEG1or2VideoStreamDiscreteFramer
::afterGettingFrame1(unsigned frameSize, unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
// Check that the first 4 bytes are a system code:
if (frameSize >= 4 && fTo[0] == 0 && fTo[1] == 0 && fTo[2] == 1) {
fPictureEndMarker = True; // Assume that we have a complete 'picture' here
u_int8_t nextCode = fTo[3];
if (nextCode == 0xB3) { // VIDEO_SEQUENCE_HEADER_START_CODE
// Note the following 'frame rate' code:
if (frameSize >= 8) {
u_int8_t frame_rate_code = fTo[7]&0x0F;
fFrameRate = frameRateFromCode[frame_rate_code];
}
// Also, save away this Video Sequence Header, in case we need it later:
// First, figure out how big it is:
unsigned vshSize;
for (vshSize = 4; vshSize < frameSize-3; ++vshSize) {
if (fTo[vshSize] == 0 && fTo[vshSize+1] == 0 && fTo[vshSize+2] == 1 &&
(fTo[vshSize+3] == 0xB8 || fTo[vshSize+3] == 0x00)) break;
}
if (vshSize == frameSize-3) vshSize = frameSize; // There was nothing else following it
if (vshSize <= sizeof fSavedVSHBuffer) {
memmove(fSavedVSHBuffer, fTo, vshSize);
fSavedVSHSize = vshSize;
fSavedVSHTimestamp
= presentationTime.tv_sec + presentationTime.tv_usec/(double)MILLION;
}
} else if (nextCode == 0xB8) { // GROUP_START_CODE
// If necessary, insert a saved Video Sequence Header in front of this:
double pts = presentationTime.tv_sec + presentationTime.tv_usec/(double)MILLION;
if (pts > fSavedVSHTimestamp + fVSHPeriod &&
fSavedVSHSize + frameSize <= fMaxSize) {
memmove(&fTo[fSavedVSHSize], &fTo[0], frameSize); // make room for the header
memmove(&fTo[0], fSavedVSHBuffer, fSavedVSHSize); // insert it
frameSize += fSavedVSHSize;
fSavedVSHTimestamp = pts;
}
}
unsigned i = 3;
if (nextCode == 0xB3 /*VIDEO_SEQUENCE_HEADER_START_CODE*/ ||
nextCode == 0xB8 /*GROUP_START_CODE*/) {
// Skip to the following PICTURE_START_CODE (if any):
for (i += 4; i < frameSize; ++i) {
if (fTo[i] == 0x00 /*PICTURE_START_CODE*/
&& fTo[i-1] == 1 && fTo[i-2] == 0 && fTo[i-3] == 0) {
nextCode = fTo[i];
break;
}
}
}
if (nextCode == 0x00 /*PICTURE_START_CODE*/ && i+2 < frameSize) {
// Get the 'temporal_reference' and 'picture_coding_type' from the
// following 2 bytes:
++i;
unsigned short temporal_reference = (fTo[i]<<2)|(fTo[i+1]>>6);
unsigned char picture_coding_type = (fTo[i+1]&0x38)>>3;
// If this is not an "I" frame, but we were asked for "I" frames only, then try again:
if (fIFramesOnly && picture_coding_type != 1) {
doGetNextFrame();
return;
}
// If this is a "B" frame, then we have to tweak "presentationTime":
if (!fLeavePresentationTimesUnmodified && picture_coding_type == 3/*B*/
&& (fLastNonBFramePresentationTime.tv_usec > 0 ||
fLastNonBFramePresentationTime.tv_sec > 0)) {
int trIncrement
= fLastNonBFrameTemporal_reference - temporal_reference;
if (trIncrement < 0) trIncrement += 1024; // field is 10 bits in size
unsigned usIncrement = fFrameRate == 0.0 ? 0
: (unsigned)((trIncrement*MILLION)/fFrameRate);
unsigned secondsToSubtract = usIncrement/MILLION;
unsigned uSecondsToSubtract = usIncrement%MILLION;
presentationTime = fLastNonBFramePresentationTime;
if ((unsigned)presentationTime.tv_usec < uSecondsToSubtract) {
presentationTime.tv_usec += MILLION;
if (presentationTime.tv_sec > 0) --presentationTime.tv_sec;
}
presentationTime.tv_usec -= uSecondsToSubtract;
if ((unsigned)presentationTime.tv_sec > secondsToSubtract) {
presentationTime.tv_sec -= secondsToSubtract;
} else {
presentationTime.tv_sec = presentationTime.tv_usec = 0;
}
} else {
fLastNonBFramePresentationTime = presentationTime;
fLastNonBFrameTemporal_reference = temporal_reference;
}
}
}
// ##### Later:
// - do "iFramesOnly" if requested
// Complete delivery to the client:
fFrameSize = frameSize;
fNumTruncatedBytes = numTruncatedBytes;
fPresentationTime = presentationTime;
fDurationInMicroseconds = durationInMicroseconds;
afterGetting(this);
}
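The two-byte decode in "afterGettingFrame1()" above pulls the 10-bit 'temporal_reference' and 3-bit 'picture_coding_type' from the bytes immediately following a PICTURE_START_CODE. This is a hypothetical standalone mirror of that decode (the names are illustrative):

```cpp
// Hypothetical mirror of the picture-header decode in afterGettingFrame1().
struct PictureHeaderInfo {
  unsigned short temporalReference; // 10 bits
  unsigned char pictureCodingType;  // 1 = I, 2 = P, 3 = B
};

PictureHeaderInfo decodePictureHeaderBytes(unsigned char b0, unsigned char b1) {
  PictureHeaderInfo info;
  info.temporalReference = (unsigned short)((b0 << 2) | (b1 >> 6));
  info.pictureCodingType = (unsigned char)((b1 & 0x38) >> 3);
  return info;
}
```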
live/liveMedia/MPEG1or2VideoStreamFramer.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A filter that breaks up an MPEG 1 or 2 video elementary stream into
// frames for: Video_Sequence_Header, GOP_Header, Picture_Header
// Implementation
#include "MPEG1or2VideoStreamFramer.hh"
#include "MPEGVideoStreamParser.hh"
#include <string.h> // for memmove()
////////// MPEG1or2VideoStreamParser definition //////////
// An enum representing the current state of the parser:
enum MPEGParseState {
PARSING_VIDEO_SEQUENCE_HEADER,
PARSING_VIDEO_SEQUENCE_HEADER_SEEN_CODE,
PARSING_GOP_HEADER,
PARSING_GOP_HEADER_SEEN_CODE,
PARSING_PICTURE_HEADER,
PARSING_SLICE
};
#define VSH_MAX_SIZE 1000
class MPEG1or2VideoStreamParser: public MPEGVideoStreamParser {
public:
MPEG1or2VideoStreamParser(MPEG1or2VideoStreamFramer* usingSource,
FramedSource* inputSource,
Boolean iFramesOnly, double vshPeriod);
virtual ~MPEG1or2VideoStreamParser();
private: // redefined virtual functions:
virtual void flushInput();
virtual unsigned parse();
private:
void reset();
MPEG1or2VideoStreamFramer* usingSource() {
return (MPEG1or2VideoStreamFramer*)fUsingSource;
}
void setParseState(MPEGParseState parseState);
unsigned parseVideoSequenceHeader(Boolean haveSeenStartCode);
unsigned parseGOPHeader(Boolean haveSeenStartCode);
unsigned parsePictureHeader();
unsigned parseSlice();
private:
MPEGParseState fCurrentParseState;
unsigned fPicturesSinceLastGOP;
// can be used to compute timestamp for a video_sequence_header
unsigned short fCurPicTemporalReference;
// used to compute slice timestamp
unsigned char fCurrentSliceNumber; // set when parsing a slice
// A saved copy of the most recently seen 'video_sequence_header',
// in case we need to insert it into the stream periodically:
unsigned char fSavedVSHBuffer[VSH_MAX_SIZE];
unsigned fSavedVSHSize;
double fSavedVSHTimestamp;
double fVSHPeriod;
Boolean fIFramesOnly, fSkippingCurrentPicture;
void saveCurrentVSH();
Boolean needToUseSavedVSH();
unsigned useSavedVSH(); // returns the size of the saved VSH
};
////////// MPEG1or2VideoStreamFramer implementation //////////
MPEG1or2VideoStreamFramer::MPEG1or2VideoStreamFramer(UsageEnvironment& env,
FramedSource* inputSource,
Boolean iFramesOnly,
double vshPeriod,
Boolean createParser)
: MPEGVideoStreamFramer(env, inputSource) {
fParser = createParser
? new MPEG1or2VideoStreamParser(this, inputSource,
iFramesOnly, vshPeriod)
: NULL;
}
MPEG1or2VideoStreamFramer::~MPEG1or2VideoStreamFramer() {
}
MPEG1or2VideoStreamFramer*
MPEG1or2VideoStreamFramer::createNew(UsageEnvironment& env,
FramedSource* inputSource,
Boolean iFramesOnly,
double vshPeriod) {
// Need to add source type checking here??? #####
return new MPEG1or2VideoStreamFramer(env, inputSource, iFramesOnly, vshPeriod);
}
double MPEG1or2VideoStreamFramer::getCurrentPTS() const {
return fPresentationTime.tv_sec + fPresentationTime.tv_usec/1000000.0;
}
Boolean MPEG1or2VideoStreamFramer::isMPEG1or2VideoStreamFramer() const {
return True;
}
////////// MPEG1or2VideoStreamParser implementation //////////
MPEG1or2VideoStreamParser
::MPEG1or2VideoStreamParser(MPEG1or2VideoStreamFramer* usingSource,
FramedSource* inputSource,
Boolean iFramesOnly, double vshPeriod)
: MPEGVideoStreamParser(usingSource, inputSource),
fCurrentParseState(PARSING_VIDEO_SEQUENCE_HEADER),
fVSHPeriod(vshPeriod), fIFramesOnly(iFramesOnly) {
reset();
}
MPEG1or2VideoStreamParser::~MPEG1or2VideoStreamParser() {
}
void MPEG1or2VideoStreamParser::setParseState(MPEGParseState parseState) {
fCurrentParseState = parseState;
MPEGVideoStreamParser::setParseState();
}
void MPEG1or2VideoStreamParser::reset() {
fPicturesSinceLastGOP = 0;
fCurPicTemporalReference = 0;
fCurrentSliceNumber = 0;
fSavedVSHSize = 0;
fSkippingCurrentPicture = False;
}
void MPEG1or2VideoStreamParser::flushInput() {
reset();
StreamParser::flushInput();
if (fCurrentParseState != PARSING_VIDEO_SEQUENCE_HEADER) {
setParseState(PARSING_GOP_HEADER); // start from the next GOP
}
}
unsigned MPEG1or2VideoStreamParser::parse() {
try {
switch (fCurrentParseState) {
case PARSING_VIDEO_SEQUENCE_HEADER: {
return parseVideoSequenceHeader(False);
}
case PARSING_VIDEO_SEQUENCE_HEADER_SEEN_CODE: {
return parseVideoSequenceHeader(True);
}
case PARSING_GOP_HEADER: {
return parseGOPHeader(False);
}
case PARSING_GOP_HEADER_SEEN_CODE: {
return parseGOPHeader(True);
}
case PARSING_PICTURE_HEADER: {
return parsePictureHeader();
}
case PARSING_SLICE: {
return parseSlice();
}
default: {
return 0; // shouldn't happen
}
}
} catch (int /*e*/) {
#ifdef DEBUG
fprintf(stderr, "MPEG1or2VideoStreamParser::parse() EXCEPTION (This is normal behavior - *not* an error)\n");
#endif
return 0; // the parsing got interrupted
}
}
void MPEG1or2VideoStreamParser::saveCurrentVSH() {
unsigned frameSize = curFrameSize();
if (frameSize > sizeof fSavedVSHBuffer) return; // too big to save
memmove(fSavedVSHBuffer, fStartOfFrame, frameSize);
fSavedVSHSize = frameSize;
fSavedVSHTimestamp = usingSource()->getCurrentPTS();
}
Boolean MPEG1or2VideoStreamParser::needToUseSavedVSH() {
return usingSource()->getCurrentPTS() > fSavedVSHTimestamp+fVSHPeriod
&& fSavedVSHSize > 0;
}
unsigned MPEG1or2VideoStreamParser::useSavedVSH() {
unsigned bytesToUse = fSavedVSHSize;
unsigned maxBytesToUse = fLimit - fStartOfFrame;
if (bytesToUse > maxBytesToUse) bytesToUse = maxBytesToUse;
memmove(fStartOfFrame, fSavedVSHBuffer, bytesToUse);
// Also reset the saved timestamp:
fSavedVSHTimestamp = usingSource()->getCurrentPTS();
#ifdef DEBUG
fprintf(stderr, "used saved video_sequence_header (%d bytes)\n", bytesToUse);
#endif
return bytesToUse;
}
#define VIDEO_SEQUENCE_HEADER_START_CODE 0x000001B3
#define GROUP_START_CODE 0x000001B8
#define PICTURE_START_CODE 0x00000100
#define SEQUENCE_END_CODE 0x000001B7
static double const frameRateFromCode[] = {
0.0, // forbidden
24000/1001.0, // approx 23.976
24.0,
25.0,
30000/1001.0, // approx 29.97
30.0,
50.0,
60000/1001.0, // approx 59.94
60.0,
0.0, // reserved
0.0, // reserved
0.0, // reserved
0.0, // reserved
0.0, // reserved
0.0, // reserved
0.0 // reserved
};
unsigned MPEG1or2VideoStreamParser
::parseVideoSequenceHeader(Boolean haveSeenStartCode) {
#ifdef DEBUG
fprintf(stderr, "parsing video sequence header\n");
#endif
unsigned first4Bytes;
if (!haveSeenStartCode) {
while ((first4Bytes = test4Bytes()) != VIDEO_SEQUENCE_HEADER_START_CODE) {
#ifdef DEBUG
fprintf(stderr, "ignoring non video sequence header: 0x%08x\n", first4Bytes);
#endif
get1Byte(); setParseState(PARSING_VIDEO_SEQUENCE_HEADER);
// ensures we progress over bad data
}
first4Bytes = get4Bytes();
} else {
// We've already seen the start code
first4Bytes = VIDEO_SEQUENCE_HEADER_START_CODE;
}
save4Bytes(first4Bytes);
// Next, extract the size and rate parameters from the next 8 bytes
unsigned paramWord1 = get4Bytes();
save4Bytes(paramWord1);
unsigned next4Bytes = get4Bytes();
#ifdef DEBUG
unsigned short horizontal_size_value = (paramWord1&0xFFF00000)>>(32-12);
unsigned short vertical_size_value = (paramWord1&0x000FFF00)>>8;
unsigned char aspect_ratio_information = (paramWord1&0x000000F0)>>4;
#endif
unsigned char frame_rate_code = (paramWord1&0x0000000F);
usingSource()->fFrameRate = frameRateFromCode[frame_rate_code];
#ifdef DEBUG
unsigned bit_rate_value = (next4Bytes&0xFFFFC000)>>(32-18);
unsigned vbv_buffer_size_value = (next4Bytes&0x00001FF8)>>3;
fprintf(stderr, "horizontal_size_value: %d, vertical_size_value: %d, aspect_ratio_information: %d, frame_rate_code: %d (=>%f fps), bit_rate_value: %d (=>%d bps), vbv_buffer_size_value: %d\n", horizontal_size_value, vertical_size_value, aspect_ratio_information, frame_rate_code, usingSource()->fFrameRate, bit_rate_value, bit_rate_value*400, vbv_buffer_size_value);
#endif
// Now, copy all bytes that we see, up until we reach a GROUP_START_CODE
// or a PICTURE_START_CODE:
do {
saveToNextCode(next4Bytes);
} while (next4Bytes != GROUP_START_CODE && next4Bytes != PICTURE_START_CODE);
setParseState((next4Bytes == GROUP_START_CODE)
? PARSING_GOP_HEADER_SEEN_CODE : PARSING_PICTURE_HEADER);
// Compute this frame's timestamp by noting how many pictures we've seen
// since the last GOP header:
usingSource()->computePresentationTime(fPicturesSinceLastGOP);
// Save this video_sequence_header, in case we need to insert a copy
// into the stream later:
saveCurrentVSH();
return curFrameSize();
}
unsigned MPEG1or2VideoStreamParser::parseGOPHeader(Boolean haveSeenStartCode) {
// First check whether we should insert a previously-saved
// 'video_sequence_header' here:
if (needToUseSavedVSH()) return useSavedVSH();
#ifdef DEBUG
fprintf(stderr, "parsing GOP header\n");
#endif
unsigned first4Bytes;
if (!haveSeenStartCode) {
while ((first4Bytes = test4Bytes()) != GROUP_START_CODE) {
#ifdef DEBUG
fprintf(stderr, "ignoring non GOP start code: 0x%08x\n", first4Bytes);
#endif
get1Byte(); setParseState(PARSING_GOP_HEADER);
// ensures we progress over bad data
}
first4Bytes = get4Bytes();
} else {
// We've already seen the GROUP_START_CODE
first4Bytes = GROUP_START_CODE;
}
save4Bytes(first4Bytes);
// Next, extract the (25-bit) time code from the next 4 bytes:
unsigned next4Bytes = get4Bytes();
unsigned time_code = (next4Bytes&0xFFFFFF80)>>(32-25);
#if defined(DEBUG) || defined(DEBUG_TIMESTAMPS)
Boolean drop_frame_flag = (time_code&0x01000000) != 0;
#endif
unsigned time_code_hours = (time_code&0x00F80000)>>19;
unsigned time_code_minutes = (time_code&0x0007E000)>>13;
unsigned time_code_seconds = (time_code&0x00000FC0)>>6;
unsigned time_code_pictures = (time_code&0x0000003F);
#if defined(DEBUG) || defined(DEBUG_TIMESTAMPS)
fprintf(stderr, "time_code: 0x%07x, drop_frame %d, hours %d, minutes %d, seconds %d, pictures %d\n", time_code, drop_frame_flag, time_code_hours, time_code_minutes, time_code_seconds, time_code_pictures);
#endif
#ifdef DEBUG
Boolean closed_gop = (next4Bytes&0x00000040) != 0;
Boolean broken_link = (next4Bytes&0x00000020) != 0;
fprintf(stderr, "closed_gop: %d, broken_link: %d\n", closed_gop, broken_link);
#endif
// Now, copy all bytes that we see, up until we reach a PICTURE_START_CODE:
do {
saveToNextCode(next4Bytes);
} while (next4Bytes != PICTURE_START_CODE);
// Record the time code:
usingSource()->setTimeCode(time_code_hours, time_code_minutes,
time_code_seconds, time_code_pictures,
fPicturesSinceLastGOP);
fPicturesSinceLastGOP = 0;
// Compute this frame's timestamp:
usingSource()->computePresentationTime(0);
setParseState(PARSING_PICTURE_HEADER);
return curFrameSize();
}
inline Boolean isSliceStartCode(unsigned fourBytes) {
if ((fourBytes&0xFFFFFF00) != 0x00000100) return False;
unsigned char lastByte = fourBytes&0xFF;
return lastByte <= 0xAF && lastByte >= 1;
}
unsigned MPEG1or2VideoStreamParser::parsePictureHeader() {
#ifdef DEBUG
fprintf(stderr, "parsing picture header\n");
#endif
// Note that we've already read the PICTURE_START_CODE
// Next, extract the temporal reference from the next 4 bytes:
unsigned next4Bytes = get4Bytes();
unsigned short temporal_reference = (next4Bytes&0xFFC00000)>>(32-10);
unsigned char picture_coding_type = (next4Bytes&0x00380000)>>19;
#ifdef DEBUG
unsigned short vbv_delay = (next4Bytes&0x0007FFF8)>>3;
fprintf(stderr, "temporal_reference: %d, picture_coding_type: %d, vbv_delay: %d\n", temporal_reference, picture_coding_type, vbv_delay);
#endif
fSkippingCurrentPicture = fIFramesOnly && picture_coding_type != 1;
if (fSkippingCurrentPicture) {
// Skip all bytes that we see, up until we reach a slice_start_code:
do {
skipToNextCode(next4Bytes);
} while (!isSliceStartCode(next4Bytes));
} else {
// Save the PICTURE_START_CODE that we've already read:
save4Bytes(PICTURE_START_CODE);
// Copy all bytes that we see, up until we reach a slice_start_code:
do {
saveToNextCode(next4Bytes);
} while (!isSliceStartCode(next4Bytes));
}
setParseState(PARSING_SLICE);
fCurrentSliceNumber = next4Bytes&0xFF;
// Record the temporal reference:
fCurPicTemporalReference = temporal_reference;
// Compute this frame's timestamp:
usingSource()->computePresentationTime(fCurPicTemporalReference);
if (fSkippingCurrentPicture) {
return parse(); // try again, until we get a non-skipped frame
} else {
return curFrameSize();
}
}
unsigned MPEG1or2VideoStreamParser::parseSlice() {
// Note that we've already read the slice_start_code:
unsigned next4Bytes = PICTURE_START_CODE|fCurrentSliceNumber;
#ifdef DEBUG_SLICE
fprintf(stderr, "parsing slice: 0x%08x\n", next4Bytes);
#endif
if (fSkippingCurrentPicture) {
// Skip all bytes that we see, up until we reach a code of some sort:
skipToNextCode(next4Bytes);
} else {
// Copy all bytes that we see, up until we reach a code of some sort:
saveToNextCode(next4Bytes);
}
// The next thing to parse depends on the code that we just saw:
if (isSliceStartCode(next4Bytes)) { // common case
setParseState(PARSING_SLICE);
fCurrentSliceNumber = next4Bytes&0xFF;
} else {
// We're no longer seeing slices, so we assume that the current
// picture has ended:
++fPicturesSinceLastGOP;
++usingSource()->fPictureCount;
usingSource()->fPictureEndMarker = True; // HACK #####
switch (next4Bytes) {
case SEQUENCE_END_CODE: {
setParseState(PARSING_VIDEO_SEQUENCE_HEADER);
break;
}
case VIDEO_SEQUENCE_HEADER_START_CODE: {
setParseState(PARSING_VIDEO_SEQUENCE_HEADER_SEEN_CODE);
break;
}
case GROUP_START_CODE: {
setParseState(PARSING_GOP_HEADER_SEEN_CODE);
break;
}
case PICTURE_START_CODE: {
setParseState(PARSING_PICTURE_HEADER);
break;
}
default: {
usingSource()->envir() << "MPEG1or2VideoStreamParser::parseSlice(): Saw unexpected code "
<< (void*)next4Bytes << "\n";
setParseState(PARSING_SLICE); // the safest way to recover...
break;
}
}
}
// Compute this frame's timestamp:
usingSource()->computePresentationTime(fCurPicTemporalReference);
if (fSkippingCurrentPicture) {
return parse(); // try again, until we get a non-skipped frame
} else {
return curFrameSize();
}
}
live/liveMedia/MPEG2IndexFromTransportStream.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A filter that produces a sequence of I-frame indices from an MPEG-2 Transport Stream
// Implementation
#include "MPEG2IndexFromTransportStream.hh"
////////// IndexRecord definition //////////
enum RecordType {
RECORD_UNPARSED = 0,
RECORD_VSH = 1, // a MPEG Video Sequence Header
RECORD_GOP = 2,
RECORD_PIC_NON_IFRAME = 3, // includes slices
RECORD_PIC_IFRAME = 4, // includes slices
RECORD_NAL_H264_SPS = 5, // H.264
RECORD_NAL_H264_PPS = 6, // H.264
RECORD_NAL_H264_SEI = 7, // H.264
RECORD_NAL_H264_NON_IFRAME = 8, // H.264
RECORD_NAL_H264_IFRAME = 9, // H.264
RECORD_NAL_H264_OTHER = 10, // H.264
RECORD_NAL_H265_VPS = 11, // H.265
RECORD_NAL_H265_SPS = 12, // H.265
RECORD_NAL_H265_PPS = 13, // H.265
RECORD_NAL_H265_NON_IFRAME = 14, // H.265
RECORD_NAL_H265_IFRAME = 15, // H.265
RECORD_NAL_H265_OTHER = 16, // H.265
RECORD_JUNK
};
class IndexRecord {
public:
IndexRecord(u_int8_t startOffset, u_int8_t size,
unsigned long transportPacketNumber, float pcr);
virtual ~IndexRecord();
RecordType& recordType() { return fRecordType; }
void setFirstFlag() { fRecordType = (RecordType)(((u_int8_t)fRecordType) | 0x80); }
u_int8_t startOffset() const { return fStartOffset; }
u_int8_t& size() { return fSize; }
float pcr() const { return fPCR; }
unsigned long transportPacketNumber() const { return fTransportPacketNumber; }
IndexRecord* next() const { return fNext; }
void addAfter(IndexRecord* prev);
void unlink();
private:
// Index records are maintained in a doubly-linked list:
IndexRecord* fNext;
IndexRecord* fPrev;
RecordType fRecordType;
u_int8_t fStartOffset; // within the Transport Stream packet
u_int8_t fSize; // in bytes, following "fStartOffset".
// Note: fStartOffset + fSize <= TRANSPORT_PACKET_SIZE
float fPCR;
unsigned long fTransportPacketNumber;
};
#ifdef DEBUG
static char const* recordTypeStr[] = {
"UNPARSED",
"VSH",
"GOP",
"PIC(non-I-frame)",
"PIC(I-frame)",
"SPS (H.264)",
"PPS (H.264)",
"SEI (H.264)",
"H.264 non-I-frame",
"H.264 I-frame",
"other NAL unit (H.264)",
"VPS (H.265)",
"SPS (H.265)",
"PPS (H.265)",
"H.265 non-I-frame",
"H.265 I-frame",
"other NAL unit (H.265)",
"JUNK"
};
UsageEnvironment& operator<<(UsageEnvironment& env, IndexRecord& r) {
return env << "[" << ((r.recordType()&0x80) != 0 ? "1" : "")
<< recordTypeStr[r.recordType()&0x7F] << ":"
<< (unsigned)r.transportPacketNumber() << ":" << r.startOffset()
<< "(" << r.size() << ")@" << r.pcr() << "]";
}
#endif
////////// MPEG2IFrameIndexFromTransportStream implementation //////////
MPEG2IFrameIndexFromTransportStream*
MPEG2IFrameIndexFromTransportStream::createNew(UsageEnvironment& env,
FramedSource* inputSource) {
return new MPEG2IFrameIndexFromTransportStream(env, inputSource);
}
// The largest expected frame size (in bytes):
#define MAX_FRAME_SIZE 400000
// Make our parse buffer twice as large as this, to ensure that at least one
// complete frame will fit inside it:
#define PARSE_BUFFER_SIZE (2*MAX_FRAME_SIZE)
// The PID used for the PAT (as defined in the MPEG Transport Stream standard):
#define PAT_PID 0
MPEG2IFrameIndexFromTransportStream
::MPEG2IFrameIndexFromTransportStream(UsageEnvironment& env,
FramedSource* inputSource)
: FramedFilter(env, inputSource),
fIsH264(False), fIsH265(False),
fInputTransportPacketCounter((unsigned)-1), fClosureNumber(0), fLastContinuityCounter(~0),
fFirstPCR(0.0), fLastPCR(0.0), fHaveSeenFirstPCR(False),
fPMT_PID(0x10), fVideo_PID(0xE0), // default values
fParseBufferSize(PARSE_BUFFER_SIZE),
fParseBufferFrameStart(0), fParseBufferParseEnd(4), fParseBufferDataEnd(0),
fHeadIndexRecord(NULL), fTailIndexRecord(NULL) {
fParseBuffer = new unsigned char[fParseBufferSize];
}
MPEG2IFrameIndexFromTransportStream::~MPEG2IFrameIndexFromTransportStream() {
delete fHeadIndexRecord;
delete[] fParseBuffer;
}
void MPEG2IFrameIndexFromTransportStream::doGetNextFrame() {
// Begin by trying to deliver an index record (for an already-parsed frame)
// to the client:
if (deliverIndexRecord()) return;
// No more index records are left to deliver, so try to parse a new frame:
if (parseFrame()) { // success - try again
doGetNextFrame();
return;
}
// We need to read some more Transport Stream packets. Check whether we have room:
if (fParseBufferSize - fParseBufferDataEnd < TRANSPORT_PACKET_SIZE) {
// There's no room left. Compact the buffer, and check again:
compactParseBuffer();
if (fParseBufferSize - fParseBufferDataEnd < TRANSPORT_PACKET_SIZE) {
envir() << "ERROR: parse buffer full; increase MAX_FRAME_SIZE\n";
// Treat this as if the input source ended:
handleInputClosure1();
return;
}
}
// Arrange to read a new Transport Stream packet:
fInputSource->getNextFrame(fInputBuffer, sizeof fInputBuffer,
afterGettingFrame, this,
handleInputClosure, this);
}
void MPEG2IFrameIndexFromTransportStream
::afterGettingFrame(void* clientData, unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
MPEG2IFrameIndexFromTransportStream* source
= (MPEG2IFrameIndexFromTransportStream*)clientData;
source->afterGettingFrame1(frameSize, numTruncatedBytes,
presentationTime, durationInMicroseconds);
}
#define TRANSPORT_SYNC_BYTE 0x47
void MPEG2IFrameIndexFromTransportStream
::afterGettingFrame1(unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
if (frameSize < TRANSPORT_PACKET_SIZE || fInputBuffer[0] != TRANSPORT_SYNC_BYTE) {
if (fInputBuffer[0] != TRANSPORT_SYNC_BYTE) {
envir() << "Bad TS sync byte: 0x" << fInputBuffer[0] << "\n";
}
// Handle this as if the source ended:
handleInputClosure1();
return;
}
++fInputTransportPacketCounter;
// Figure out how much of this Transport Packet contains PES data:
u_int8_t adaptation_field_control = (fInputBuffer[3]&0x30)>>4;
u_int8_t totalHeaderSize
= adaptation_field_control <= 1 ? 4 : 5 + fInputBuffer[4];
if ((adaptation_field_control == 2 && totalHeaderSize != TRANSPORT_PACKET_SIZE) ||
(adaptation_field_control == 3 && totalHeaderSize >= TRANSPORT_PACKET_SIZE)) {
envir() << "Bad \"adaptation_field_length\": " << fInputBuffer[4] << "\n";
doGetNextFrame();
return;
}
// Check for a PCR:
if (totalHeaderSize > 5 && (fInputBuffer[5]&0x10) != 0) {
// There's a PCR:
u_int32_t pcrBaseHigh
= (fInputBuffer[6]<<24)|(fInputBuffer[7]<<16)
|(fInputBuffer[8]<<8)|fInputBuffer[9];
float pcr = pcrBaseHigh/45000.0f;
if ((fInputBuffer[10]&0x80) != 0) pcr += 1/90000.0f; // add in low-bit (if set)
unsigned short pcrExt = ((fInputBuffer[10]&0x01)<<8) | fInputBuffer[11];
pcr += pcrExt/27000000.0f;
if (!fHaveSeenFirstPCR) {
fFirstPCR = pcr;
fHaveSeenFirstPCR = True;
} else if (pcr < fLastPCR) {
// The PCR timestamp has gone backwards. Display a warning about this
// (because it indicates buggy Transport Stream data), and compensate for it.
envir() << "\nWarning: At about " << fLastPCR-fFirstPCR
<< " seconds into the file, the PCR timestamp decreased - from "
<< fLastPCR << " to " << pcr << "\n";
fFirstPCR -= (fLastPCR - pcr);
}
fLastPCR = pcr;
}
// Get the PID from the packet, and check for special tables: the PAT and PMT:
u_int16_t PID = ((fInputBuffer[1]&0x1F)<<8) | fInputBuffer[2];
if (PID == PAT_PID) {
analyzePAT(&fInputBuffer[totalHeaderSize], TRANSPORT_PACKET_SIZE-totalHeaderSize);
} else if (PID == fPMT_PID) {
analyzePMT(&fInputBuffer[totalHeaderSize], TRANSPORT_PACKET_SIZE-totalHeaderSize);
}
// Ignore transport packets for non-video programs,
// or packets with no data, or packets that duplicate the previous packet:
u_int8_t continuity_counter = fInputBuffer[3]&0x0F;
if ((PID != fVideo_PID) ||
!(adaptation_field_control == 1 || adaptation_field_control == 3) ||
continuity_counter == fLastContinuityCounter) {
doGetNextFrame();
return;
}
fLastContinuityCounter = continuity_counter;
// Also, if this is the start of a PES packet, then skip over the PES header:
Boolean payload_unit_start_indicator = (fInputBuffer[1]&0x40) != 0;
if (payload_unit_start_indicator && totalHeaderSize < TRANSPORT_PACKET_SIZE - 8
&& fInputBuffer[totalHeaderSize] == 0x00 && fInputBuffer[totalHeaderSize+1] == 0x00
&& fInputBuffer[totalHeaderSize+2] == 0x01) {
u_int8_t PES_header_data_length = fInputBuffer[totalHeaderSize+8];
totalHeaderSize += 9 + PES_header_data_length;
if (totalHeaderSize >= TRANSPORT_PACKET_SIZE) {
envir() << "Unexpectedly large PES header size: " << PES_header_data_length << "\n";
// Handle this as if the source ended:
handleInputClosure1();
return;
}
}
// The remaining data is Video Elementary Stream data. Add it to our parse buffer:
unsigned vesSize = TRANSPORT_PACKET_SIZE - totalHeaderSize;
memmove(&fParseBuffer[fParseBufferDataEnd], &fInputBuffer[totalHeaderSize], vesSize);
fParseBufferDataEnd += vesSize;
// And add a new index record noting where it came from:
addToTail(new IndexRecord(totalHeaderSize, vesSize, fInputTransportPacketCounter,
fLastPCR - fFirstPCR));
// Try again:
doGetNextFrame();
}
void MPEG2IFrameIndexFromTransportStream::handleInputClosure(void* clientData) {
MPEG2IFrameIndexFromTransportStream* source
= (MPEG2IFrameIndexFromTransportStream*)clientData;
source->handleInputClosure1();
}
#define VIDEO_SEQUENCE_START_CODE 0xB3 // MPEG-1 or 2
#define VISUAL_OBJECT_SEQUENCE_START_CODE 0xB0 // MPEG-4
#define GROUP_START_CODE 0xB8 // MPEG-1 or 2
#define GROUP_VOP_START_CODE 0xB3 // MPEG-4
#define PICTURE_START_CODE 0x00 // MPEG-1 or 2
#define VOP_START_CODE 0xB6 // MPEG-4
void MPEG2IFrameIndexFromTransportStream::handleInputClosure1() {
if (++fClosureNumber == 1 && fParseBufferDataEnd > fParseBufferFrameStart
&& fParseBufferDataEnd <= fParseBufferSize - 4) {
// This is the first time we saw EOF, and there's still data remaining to be
// parsed. Hack: Append a Picture Header code to the end of the unparsed
// data, and try again. This should use up all of the unparsed data.
fParseBuffer[fParseBufferDataEnd++] = 0;
fParseBuffer[fParseBufferDataEnd++] = 0;
fParseBuffer[fParseBufferDataEnd++] = 1;
fParseBuffer[fParseBufferDataEnd++] = PICTURE_START_CODE;
// Try again:
doGetNextFrame();
} else {
// Handle closure in the regular way:
handleClosure();
}
}
void MPEG2IFrameIndexFromTransportStream
::analyzePAT(unsigned char* pkt, unsigned size) {
// Get the PMT_PID:
while (size >= 17) { // The table is large enough
u_int16_t program_number = (pkt[9]<<8) | pkt[10];
if (program_number != 0) {
fPMT_PID = ((pkt[11]&0x1F)<<8) | pkt[12];
return;
}
pkt += 4; size -= 4;
}
}
void MPEG2IFrameIndexFromTransportStream
::analyzePMT(unsigned char* pkt, unsigned size) {
// Scan the "elementary_PID"s in the map, until we see the first video stream.
// First, get the "section_length", to get the table's size:
u_int16_t section_length = ((pkt[2]&0x0F)<<8) | pkt[3];
if ((unsigned)(4+section_length) < size) size = (4+section_length);
// Then, skip any descriptors following the "program_info_length":
if (size < 22) return; // not enough data
unsigned program_info_length = ((pkt[11]&0x0F)<<8) | pkt[12];
pkt += 13; size -= 13;
if (size < program_info_length) return; // not enough data
pkt += program_info_length; size -= program_info_length;
// Look at each ("stream_type","elementary_PID") pair, looking for a video stream:
while (size >= 9) {
u_int8_t stream_type = pkt[0];
u_int16_t elementary_PID = ((pkt[1]&0x1F)<<8) | pkt[2];
if (stream_type == 1 || stream_type == 2 ||
stream_type == 0x1B/*H.264 video*/ || stream_type == 0x24/*H.265 video*/) {
if (stream_type == 0x1B) fIsH264 = True;
else if (stream_type == 0x24) fIsH265 = True;
fVideo_PID = elementary_PID;
return;
}
u_int16_t ES_info_length = ((pkt[3]&0x0F)<<8) | pkt[4];
pkt += 5; size -= 5;
if (size < ES_info_length) return; // not enough data
pkt += ES_info_length; size -= ES_info_length;
}
}
Boolean MPEG2IFrameIndexFromTransportStream::deliverIndexRecord() {
IndexRecord* head = fHeadIndexRecord;
if (head == NULL) return False;
// Check whether the head record has been parsed yet:
if (head->recordType() == RECORD_UNPARSED) return False;
// Remove the head record (the one whose data we'll be delivering):
IndexRecord* next = head->next();
head->unlink();
if (next == head) {
fHeadIndexRecord = fTailIndexRecord = NULL;
} else {
fHeadIndexRecord = next;
}
if (head->recordType() == RECORD_JUNK) {
// Don't actually deliver the data to the client:
delete head;
// Try to deliver the next record instead:
return deliverIndexRecord();
}
// Deliver data from the head record:
#ifdef DEBUG
envir() << "delivering: " << *head << "\n";
#endif
if (fMaxSize < 11) {
fFrameSize = 0;
} else {
fTo[0] = (u_int8_t)(head->recordType());
fTo[1] = head->startOffset();
fTo[2] = head->size();
// Deliver the PCR, as 24 bits (integer part; little endian) + 8 bits (fractional part)
float pcr = head->pcr();
unsigned pcr_int = (unsigned)pcr;
u_int8_t pcr_frac = (u_int8_t)(256*(pcr-pcr_int));
fTo[3] = (unsigned char)(pcr_int);
fTo[4] = (unsigned char)(pcr_int>>8);
fTo[5] = (unsigned char)(pcr_int>>16);
fTo[6] = (unsigned char)(pcr_frac);
// Deliver the transport packet number (in little-endian order):
unsigned long tpn = head->transportPacketNumber();
fTo[7] = (unsigned char)(tpn);
fTo[8] = (unsigned char)(tpn>>8);
fTo[9] = (unsigned char)(tpn>>16);
fTo[10] = (unsigned char)(tpn>>24);
fFrameSize = 11;
}
// Free the (former) head record (as we're now done with it):
delete head;
// Complete delivery to the client:
afterGetting(this);
return True;
}
Boolean MPEG2IFrameIndexFromTransportStream::parseFrame() {
// At this point, we have a queue of >=0 (unparsed) index records, representing
// the data in the parse buffer from "fParseBufferFrameStart"
// to "fParseBufferDataEnd". We now parse through this data, looking for
// a complete 'frame', where a 'frame', in this case, means:
// for MPEG video: a Video Sequence Header, GOP Header, Picture Header, or Slice
// for H.264 or H.265 video: a NAL unit
// Inspect the frame's initial 4-byte code, to make sure it starts with a system code:
if (fParseBufferDataEnd-fParseBufferFrameStart < 4) return False; // not enough data
unsigned numInitialBadBytes = 0;
unsigned char const* p = &fParseBuffer[fParseBufferFrameStart];
if (!(p[0] == 0 && p[1] == 0 && p[2] == 1)) {
// There's no system code at the beginning. Parse until we find one:
if (fParseBufferParseEnd == fParseBufferFrameStart + 4) {
// Start parsing from the beginning of the frame data:
fParseBufferParseEnd = fParseBufferFrameStart;
}
unsigned char nextCode;
if (!parseToNextCode(nextCode)) return False;
numInitialBadBytes = fParseBufferParseEnd - fParseBufferFrameStart;
fParseBufferFrameStart = fParseBufferParseEnd;
fParseBufferParseEnd += 4; // skip over the code that we just saw
p = &fParseBuffer[fParseBufferFrameStart];
}
unsigned char curCode = p[3];
if (fIsH264) curCode &= 0x1F; // nal_unit_type
else if (fIsH265) curCode = (curCode&0x7E)>>1;
RecordType curRecordType;
unsigned char nextCode;
if (fIsH264) {
switch (curCode) {
case 1: // Coded slice of a non-IDR picture
curRecordType = RECORD_NAL_H264_NON_IFRAME;
if (!parseToNextCode(nextCode)) return False;
break;
case 5: // Coded slice of an IDR picture
curRecordType = RECORD_NAL_H264_IFRAME;
if (!parseToNextCode(nextCode)) return False;
break;
case 6: // Supplemental enhancement information (SEI)
curRecordType = RECORD_NAL_H264_SEI;
if (!parseToNextCode(nextCode)) return False;
break;
case 7: // Sequence parameter set (SPS)
curRecordType = RECORD_NAL_H264_SPS;
if (!parseToNextCode(nextCode)) return False;
break;
case 8: // Picture parameter set (PPS)
curRecordType = RECORD_NAL_H264_PPS;
if (!parseToNextCode(nextCode)) return False;
break;
default:
curRecordType = RECORD_NAL_H264_OTHER;
if (!parseToNextCode(nextCode)) return False;
break;
}
} else if (fIsH265) {
switch (curCode) {
case 19: // Coded slice segment of an IDR picture
case 20: // Coded slice segment of an IDR picture
curRecordType = RECORD_NAL_H265_IFRAME;
if (!parseToNextCode(nextCode)) return False;
break;
case 32: // Video parameter set (VPS)
curRecordType = RECORD_NAL_H265_VPS;
if (!parseToNextCode(nextCode)) return False;
break;
case 33: // Sequence parameter set (SPS)
curRecordType = RECORD_NAL_H265_SPS;
if (!parseToNextCode(nextCode)) return False;
break;
case 34: // Picture parameter set (PPS)
curRecordType = RECORD_NAL_H265_PPS;
if (!parseToNextCode(nextCode)) return False;
break;
default:
curRecordType = (curCode <= 31) ? RECORD_NAL_H265_NON_IFRAME : RECORD_NAL_H265_OTHER;
if (!parseToNextCode(nextCode)) return False;
break;
}
} else { // MPEG-1, 2, or 4
switch (curCode) {
case VIDEO_SEQUENCE_START_CODE:
case VISUAL_OBJECT_SEQUENCE_START_CODE:
curRecordType = RECORD_VSH;
while (1) {
if (!parseToNextCode(nextCode)) return False;
if (nextCode == GROUP_START_CODE ||
nextCode == PICTURE_START_CODE || nextCode == VOP_START_CODE) break;
fParseBufferParseEnd += 4; // skip over the code that we just saw
}
break;
case GROUP_START_CODE:
curRecordType = RECORD_GOP;
while (1) {
if (!parseToNextCode(nextCode)) return False;
if (nextCode == PICTURE_START_CODE || nextCode == VOP_START_CODE) break;
fParseBufferParseEnd += 4; // skip over the code that we just saw
}
break;
default: // picture
curRecordType = RECORD_PIC_NON_IFRAME; // may get changed to IFRAME later
while (1) {
if (!parseToNextCode(nextCode)) return False;
if (nextCode == VIDEO_SEQUENCE_START_CODE ||
nextCode == VISUAL_OBJECT_SEQUENCE_START_CODE ||
nextCode == GROUP_START_CODE || nextCode == GROUP_VOP_START_CODE ||
nextCode == PICTURE_START_CODE || nextCode == VOP_START_CODE) break;
fParseBufferParseEnd += 4; // skip over the code that we just saw
}
break;
}
}
if (curRecordType == RECORD_PIC_NON_IFRAME) {
if (curCode == VOP_START_CODE) { // MPEG-4
if ((fParseBuffer[fParseBufferFrameStart+4]&0xC0) == 0) {
// This is actually an I-frame. Note it as such:
curRecordType = RECORD_PIC_IFRAME;
}
} else { // MPEG-1 or 2
if ((fParseBuffer[fParseBufferFrameStart+5]&0x38) == 0x08) {
// This is actually an I-frame. Note it as such:
curRecordType = RECORD_PIC_IFRAME;
}
}
}
// There is now a parsed 'frame', from "fParseBufferFrameStart"
// to "fParseBufferParseEnd". Tag the corresponding index records to note this:
unsigned frameSize = fParseBufferParseEnd - fParseBufferFrameStart + numInitialBadBytes;
#ifdef DEBUG
envir() << "parsed " << recordTypeStr[curRecordType] << "; length "
<< frameSize << "\n";
#endif
for (IndexRecord* r = fHeadIndexRecord; ; r = r->next()) {
if (numInitialBadBytes >= r->size()) {
r->recordType() = RECORD_JUNK;
numInitialBadBytes -= r->size();
} else {
r->recordType() = curRecordType;
}
if (r == fHeadIndexRecord) r->setFirstFlag();
// indicates that this is the first record for this frame
if (r->size() > frameSize) {
// This record contains extra data that's not part of the frame.
// Shorten this record, and move the extra data to a new record
// that comes afterwards:
u_int8_t newOffset = r->startOffset() + frameSize;
u_int8_t newSize = r->size() - frameSize;
r->size() = frameSize;
#ifdef DEBUG
envir() << "tagged record (modified): " << *r << "\n";
#endif
IndexRecord* newRecord
= new IndexRecord(newOffset, newSize, r->transportPacketNumber(), r->pcr());
newRecord->addAfter(r);
if (fTailIndexRecord == r) fTailIndexRecord = newRecord;
#ifdef DEBUG
envir() << "added extra record: " << *newRecord << "\n";
#endif
} else {
#ifdef DEBUG
envir() << "tagged record: " << *r << "\n";
#endif
}
frameSize -= r->size();
if (frameSize == 0) break;
if (r == fTailIndexRecord) { // this shouldn't happen
envir() << "!!!!!Internal consistency error!!!!!\n";
return False;
}
}
// Finally, update our parse state (to skip over the now-parsed data):
fParseBufferFrameStart = fParseBufferParseEnd;
fParseBufferParseEnd += 4; // to skip over the next code (that we found)
return True;
}
Boolean MPEG2IFrameIndexFromTransportStream
::parseToNextCode(unsigned char& nextCode) {
unsigned char const* p = &fParseBuffer[fParseBufferParseEnd];
unsigned char const* end = &fParseBuffer[fParseBufferDataEnd];
while (p <= end-4) {
if (p[2] > 1) p += 3; // common case (optimized)
else if (p[2] == 0) ++p;
else if (p[0] == 0 && p[1] == 0) { // && p[2] == 1
// We found a code here:
nextCode = p[3];
fParseBufferParseEnd = p - &fParseBuffer[0]; // where we've gotten to
return True;
} else p += 3;
}
fParseBufferParseEnd = p - &fParseBuffer[0]; // where we've gotten to
return False; // no luck this time
}
void MPEG2IFrameIndexFromTransportStream::compactParseBuffer() {
#ifdef DEBUG
envir() << "Compacting parse buffer: [" << fParseBufferFrameStart
<< "," << fParseBufferParseEnd << "," << fParseBufferDataEnd << "]";
#endif
memmove(&fParseBuffer[0], &fParseBuffer[fParseBufferFrameStart],
fParseBufferDataEnd - fParseBufferFrameStart);
fParseBufferDataEnd -= fParseBufferFrameStart;
fParseBufferParseEnd -= fParseBufferFrameStart;
fParseBufferFrameStart = 0;
#ifdef DEBUG
envir() << "-> [" << fParseBufferFrameStart
<< "," << fParseBufferParseEnd << "," << fParseBufferDataEnd << "]\n";
#endif
}
void MPEG2IFrameIndexFromTransportStream::addToTail(IndexRecord* newIndexRecord) {
#ifdef DEBUG
envir() << "adding new: " << *newIndexRecord << "\n";
#endif
if (fTailIndexRecord == NULL) {
fHeadIndexRecord = fTailIndexRecord = newIndexRecord;
} else {
newIndexRecord->addAfter(fTailIndexRecord);
fTailIndexRecord = newIndexRecord;
}
}
////////// IndexRecord implementation //////////
IndexRecord::IndexRecord(u_int8_t startOffset, u_int8_t size,
unsigned long transportPacketNumber, float pcr)
: fNext(this), fPrev(this), fRecordType(RECORD_UNPARSED),
fStartOffset(startOffset), fSize(size),
fPCR(pcr), fTransportPacketNumber(transportPacketNumber) {
}
IndexRecord::~IndexRecord() {
IndexRecord* nextRecord = next();
unlink();
if (nextRecord != this) delete nextRecord;
}
void IndexRecord::addAfter(IndexRecord* prev) {
fNext = prev->fNext;
fPrev = prev;
prev->fNext->fPrev = this;
prev->fNext = this;
}
void IndexRecord::unlink() {
fNext->fPrev = fPrev;
fPrev->fNext = fNext;
fNext = fPrev = this;
}
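The `parseToNextCode()` routine above scans for MPEG start codes (`00 00 01 <code>`) while advancing three bytes at a time in the common case: if `p[2]` is neither 0 nor 1, no start code can straddle this position, so three bytes are safe to skip. A minimal standalone sketch of the same stride trick (the helper name `find_start_code` is ours, not part of the library):

```cpp
#include <cstddef>

// Standalone sketch of the stride-3 start-code scan used in
// parseToNextCode().  An MPEG start code is 00 00 01 <code>; whenever
// p[2] is greater than 1, bytes p[0..2] cannot begin a start code and
// none can end at p[2], so we may hop 3 bytes at once.
// Returns the offset of the first start code, or len if none is found.
static size_t find_start_code(unsigned char const* buf, size_t len) {
  unsigned char const* p = buf;
  unsigned char const* end = buf + len;
  while (p + 4 <= end) {
    if (p[2] > 1) p += 3;           // common case (optimized)
    else if (p[2] == 0) ++p;        // could be the middle of 00 00 01
    else if (p[0] == 0 && p[1] == 0) return (size_t)(p - buf); // found 00 00 01
    else p += 3;                    // p[2] == 1, but p[0..1] aren't 00 00
  }
  return len; // no start code in this buffer
}
```

The library version additionally records where the scan stopped (in `fParseBufferParseEnd`) so that parsing can resume after more data arrives.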
// File: live/liveMedia/MPEG2TransportFileServerMediaSubsession.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A 'ServerMediaSubsession' object that creates new, unicast, "RTPSink"s
// on demand, from a MPEG-2 Transport Stream file.
// Implementation
#include "MPEG2TransportFileServerMediaSubsession.hh"
#include "SimpleRTPSink.hh"
MPEG2TransportFileServerMediaSubsession*
MPEG2TransportFileServerMediaSubsession::createNew(UsageEnvironment& env,
char const* fileName,
char const* indexFileName,
Boolean reuseFirstSource) {
MPEG2TransportStreamIndexFile* indexFile;
if (indexFileName != NULL && reuseFirstSource) {
// It makes no sense to support trick play if all clients use the same source. Fix this:
env << "MPEG2TransportFileServerMediaSubsession::createNew(): ignoring the index file name, because \"reuseFirstSource\" is set\n";
indexFile = NULL;
} else {
indexFile = MPEG2TransportStreamIndexFile::createNew(env, indexFileName);
}
return new MPEG2TransportFileServerMediaSubsession(env, fileName, indexFile,
reuseFirstSource);
}
MPEG2TransportFileServerMediaSubsession
::MPEG2TransportFileServerMediaSubsession(UsageEnvironment& env,
char const* fileName,
MPEG2TransportStreamIndexFile* indexFile,
Boolean reuseFirstSource)
: FileServerMediaSubsession(env, fileName, reuseFirstSource),
fIndexFile(indexFile), fDuration(0.0), fClientSessionHashTable(NULL) {
if (fIndexFile != NULL) { // we support 'trick play'
fDuration = fIndexFile->getPlayingDuration();
fClientSessionHashTable = HashTable::create(ONE_WORD_HASH_KEYS);
}
}
MPEG2TransportFileServerMediaSubsession
::~MPEG2TransportFileServerMediaSubsession() {
if (fIndexFile != NULL) { // we support 'trick play'
Medium::close(fIndexFile);
// Clean out the client session hash table:
while (1) {
ClientTrickPlayState* client
= (ClientTrickPlayState*)(fClientSessionHashTable->RemoveNext());
if (client == NULL) break;
delete client;
}
delete fClientSessionHashTable;
}
}
#define TRANSPORT_PACKET_SIZE 188
#define TRANSPORT_PACKETS_PER_NETWORK_PACKET 7
// The product of these two numbers must be enough to fit within a network packet
void MPEG2TransportFileServerMediaSubsession
::startStream(unsigned clientSessionId, void* streamToken, TaskFunc* rtcpRRHandler,
void* rtcpRRHandlerClientData, unsigned short& rtpSeqNum,
unsigned& rtpTimestamp,
ServerRequestAlternativeByteHandler* serverRequestAlternativeByteHandler,
void* serverRequestAlternativeByteHandlerClientData) {
if (fIndexFile != NULL) { // we support 'trick play'
ClientTrickPlayState* client = lookupClient(clientSessionId);
if (client != NULL && client->areChangingScale()) {
// First, handle this like a "PAUSE", except that we back up to the previous VSH
client->updateStateOnPlayChange(True);
OnDemandServerMediaSubsession::pauseStream(clientSessionId, streamToken);
// Then, adjust for the change of scale:
client->updateStateOnScaleChange();
}
}
// Call the original, default version of this routine:
OnDemandServerMediaSubsession::startStream(clientSessionId, streamToken,
rtcpRRHandler, rtcpRRHandlerClientData,
rtpSeqNum, rtpTimestamp,
serverRequestAlternativeByteHandler, serverRequestAlternativeByteHandlerClientData);
}
void MPEG2TransportFileServerMediaSubsession
::pauseStream(unsigned clientSessionId, void* streamToken) {
if (fIndexFile != NULL) { // we support 'trick play'
ClientTrickPlayState* client = lookupClient(clientSessionId);
if (client != NULL) {
client->updateStateOnPlayChange(False);
}
}
// Call the original, default version of this routine:
OnDemandServerMediaSubsession::pauseStream(clientSessionId, streamToken);
}
void MPEG2TransportFileServerMediaSubsession
::seekStream(unsigned clientSessionId, void* streamToken, double& seekNPT, double streamDuration, u_int64_t& numBytes) {
// Begin by calling the original, default version of this routine:
OnDemandServerMediaSubsession::seekStream(clientSessionId, streamToken, seekNPT, streamDuration, numBytes);
// Then, special handling specific to indexed Transport Stream files:
if (fIndexFile != NULL) { // we support 'trick play'
ClientTrickPlayState* client = lookupClient(clientSessionId);
if (client != NULL) {
unsigned long numTSPacketsToStream = client->updateStateFromNPT(seekNPT, streamDuration);
numBytes = numTSPacketsToStream*TRANSPORT_PACKET_SIZE;
}
}
}
void MPEG2TransportFileServerMediaSubsession
::setStreamScale(unsigned clientSessionId, void* streamToken, float scale) {
if (fIndexFile != NULL) { // we support 'trick play'
ClientTrickPlayState* client = lookupClient(clientSessionId);
if (client != NULL) {
client->setNextScale(scale); // scale won't take effect until the next "PLAY"
}
}
// Call the original, default version of this routine:
OnDemandServerMediaSubsession::setStreamScale(clientSessionId, streamToken, scale);
}
void MPEG2TransportFileServerMediaSubsession
::deleteStream(unsigned clientSessionId, void*& streamToken) {
if (fIndexFile != NULL) { // we support 'trick play'
ClientTrickPlayState* client = lookupClient(clientSessionId);
if (client != NULL) {
client->updateStateOnPlayChange(False);
}
}
// Call the original, default version of this routine:
OnDemandServerMediaSubsession::deleteStream(clientSessionId, streamToken);
}
ClientTrickPlayState* MPEG2TransportFileServerMediaSubsession::newClientTrickPlayState() {
return new ClientTrickPlayState(fIndexFile);
}
FramedSource* MPEG2TransportFileServerMediaSubsession
::createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate) {
// Create the video source:
unsigned const inputDataChunkSize
= TRANSPORT_PACKETS_PER_NETWORK_PACKET*TRANSPORT_PACKET_SIZE;
ByteStreamFileSource* fileSource
= ByteStreamFileSource::createNew(envir(), fFileName, inputDataChunkSize);
if (fileSource == NULL) return NULL;
fFileSize = fileSource->fileSize();
// Use the file size and the duration to estimate the stream's bitrate:
if (fFileSize > 0 && fDuration > 0.0) {
estBitrate = (unsigned)((int64_t)fFileSize/(125*fDuration) + 0.5); // kbps, rounded
} else {
estBitrate = 5000; // kbps, estimate
}
// Create a framer for the Transport Stream:
MPEG2TransportStreamFramer* framer
= MPEG2TransportStreamFramer::createNew(envir(), fileSource);
if (fIndexFile != NULL) { // we support 'trick play'
// Keep state for this client (if we don't already have it):
ClientTrickPlayState* client = lookupClient(clientSessionId);
if (client == NULL) {
client = newClientTrickPlayState();
fClientSessionHashTable->Add((char const*)clientSessionId, client);
}
client->setSource(framer);
}
return framer;
}
RTPSink* MPEG2TransportFileServerMediaSubsession
::createNewRTPSink(Groupsock* rtpGroupsock,
unsigned char /*rtpPayloadTypeIfDynamic*/,
FramedSource* /*inputSource*/) {
return SimpleRTPSink::createNew(envir(), rtpGroupsock,
33, 90000, "video", "MP2T",
1, True, False /*no 'M' bit*/);
}
void MPEG2TransportFileServerMediaSubsession::testScaleFactor(float& scale) {
if (fIndexFile != NULL && fDuration > 0.0) {
// We support any integral scale, other than 0
int iScale = scale < 0.0 ? (int)(scale - 0.5f) : (int)(scale + 0.5f); // round
if (iScale == 0) iScale = 1;
scale = (float)iScale;
} else {
scale = 1.0f;
}
}
float MPEG2TransportFileServerMediaSubsession::duration() const {
return fDuration;
}
ClientTrickPlayState* MPEG2TransportFileServerMediaSubsession
::lookupClient(unsigned clientSessionId) {
return (ClientTrickPlayState*)(fClientSessionHashTable->Lookup((char const*)clientSessionId));
}
////////// ClientTrickPlayState implementation //////////
ClientTrickPlayState::ClientTrickPlayState(MPEG2TransportStreamIndexFile* indexFile)
: fIndexFile(indexFile),
fOriginalTransportStreamSource(NULL),
fTrickModeFilter(NULL), fTrickPlaySource(NULL),
fFramer(NULL),
fScale(1.0f), fNextScale(1.0f), fNPT(0.0f),
fTSRecordNum(0), fIxRecordNum(0) {
}
unsigned long ClientTrickPlayState::updateStateFromNPT(double npt, double streamDuration) {
fNPT = (float)npt;
// Map "fNPT" to the corresponding Transport Stream and Index record numbers:
unsigned long tsRecordNum, ixRecordNum;
fIndexFile->lookupTSPacketNumFromNPT(fNPT, tsRecordNum, ixRecordNum);
updateTSRecordNum();
if (tsRecordNum != fTSRecordNum) {
fTSRecordNum = tsRecordNum;
fIxRecordNum = ixRecordNum;
// Seek the source to the new record number:
reseekOriginalTransportStreamSource();
// Note: We assume that we're asked to seek only in normal
// (i.e., non trick play) mode, so we don't seek within the trick
// play source (if any).
fFramer->clearPIDStatusTable();
}
unsigned long numTSRecordsToStream = 0;
float pcrLimit = 0.0;
if (streamDuration > 0.0) {
// fNPT might have changed when we looked it up in the index file. Adjust "streamDuration" accordingly:
streamDuration += npt - (double)fNPT;
if (streamDuration > 0.0) {
// Specify that we want to stream no more data than this.
if (fNextScale == 1.0f) {
// We'll be streaming from the original file.
// Use the index file to figure out how many Transport Packets we get to stream:
unsigned long toTSRecordNum, toIxRecordNum;
float toNPT = (float)(fNPT + streamDuration);
fIndexFile->lookupTSPacketNumFromNPT(toNPT, toTSRecordNum, toIxRecordNum);
if (toTSRecordNum > tsRecordNum) { // sanity check
numTSRecordsToStream = toTSRecordNum - tsRecordNum;
}
} else {
// We'll be streaming from the trick play stream.
// It'd be difficult to figure out how many Transport Packets we need to stream, so instead set a PCR
// limit in the trick play stream. (We rely upon the fact that PCRs in the trick play stream start at 0.0)
int direction = fNextScale < 0.0 ? -1 : 1;
pcrLimit = (float)(streamDuration/(fNextScale*direction));
}
}
}
fFramer->setNumTSPacketsToStream(numTSRecordsToStream);
fFramer->setPCRLimit(pcrLimit);
return numTSRecordsToStream;
}
void ClientTrickPlayState::updateStateOnScaleChange() {
fScale = fNextScale;
// Change our source objects to reflect the change in scale:
// First, close the existing trick play source (if any):
if (fTrickPlaySource != NULL) {
fTrickModeFilter->forgetInputSource();
// so that the underlying Transport Stream source doesn't get deleted by:
Medium::close(fTrickPlaySource);
fTrickPlaySource = NULL;
fTrickModeFilter = NULL;
}
if (fNextScale != 1.0f) {
// Create a new trick play filter from the original Transport Stream source:
UsageEnvironment& env = fIndexFile->envir(); // alias
fTrickModeFilter = MPEG2TransportStreamTrickModeFilter
::createNew(env, fOriginalTransportStreamSource, fIndexFile, int(fNextScale));
fTrickModeFilter->seekTo(fTSRecordNum, fIxRecordNum);
// And generate a Transport Stream from this:
fTrickPlaySource = MPEG2TransportStreamFromESSource::createNew(env);
fTrickPlaySource->addNewVideoSource(fTrickModeFilter, fIndexFile->mpegVersion());
fFramer->changeInputSource(fTrickPlaySource);
} else {
// Switch back to the original Transport Stream source:
reseekOriginalTransportStreamSource();
fFramer->changeInputSource(fOriginalTransportStreamSource);
}
}
void ClientTrickPlayState::updateStateOnPlayChange(Boolean reverseToPreviousVSH) {
updateTSRecordNum();
if (fTrickPlaySource == NULL) {
// We were in regular (1x) play. Use the index file to look up the
// index record number and npt from the current transport number:
fIndexFile->lookupPCRFromTSPacketNum(fTSRecordNum, reverseToPreviousVSH, fNPT, fIxRecordNum);
} else {
// We were in trick mode, and so already have the index record number.
// Get the transport record number and npt from this:
fIxRecordNum = fTrickModeFilter->nextIndexRecordNum();
if ((long)fIxRecordNum < 0) fIxRecordNum = 0; // we were at the start of the file
unsigned long transportRecordNum;
float pcr;
u_int8_t offset, size, recordType; // all dummy
if (fIndexFile->readIndexRecordValues(fIxRecordNum, transportRecordNum,
offset, size, pcr, recordType)) {
fTSRecordNum = transportRecordNum;
fNPT = pcr;
}
}
}
void ClientTrickPlayState::setSource(MPEG2TransportStreamFramer* framer) {
fFramer = framer;
fOriginalTransportStreamSource = (ByteStreamFileSource*)(framer->inputSource());
}
void ClientTrickPlayState::updateTSRecordNum(){
if (fFramer != NULL) fTSRecordNum += (unsigned long)(fFramer->tsPacketCount());
}
void ClientTrickPlayState::reseekOriginalTransportStreamSource() {
u_int64_t tsRecordNum64 = (u_int64_t)fTSRecordNum;
fOriginalTransportStreamSource->seekToByteAbsolute(tsRecordNum64*TRANSPORT_PACKET_SIZE);
}
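In `createNewStreamSource()` above, the stream's bitrate is estimated from the file size and the duration recorded in the index file: bytes per second divided by 125 gives kilobits per second, since 1000 bits/sec equals 125 bytes/sec. A sketch of that arithmetic as a free function (the name `estimateBitrateKbps` is ours):

```cpp
#include <cstdint>

// Sketch of the bitrate estimate in createNewStreamSource():
// (bytes / seconds) / 125 == kilobits per second, rounded to nearest.
// Falls back to 5000 kbps when the size or duration is unknown,
// matching the library's default.
static unsigned estimateBitrateKbps(uint64_t fileSizeBytes, double durationSecs) {
  if (fileSizeBytes == 0 || durationSecs <= 0.0) return 5000; // kbps, estimate
  return (unsigned)((int64_t)fileSizeBytes/(125*durationSecs) + 0.5); // kbps, rounded
}
```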
// File: live/liveMedia/MPEG2TransportStreamFramer.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A filter that passes through (unchanged) chunks that contain an integral number
// of MPEG-2 Transport Stream packets, but returning (in "fDurationInMicroseconds")
// an updated estimate of the time gap between chunks.
// Implementation
#include "MPEG2TransportStreamFramer.hh"
#include <sys/time.h> // for "gettimeofday()"
#define TRANSPORT_PACKET_SIZE 188
////////// Definitions of constants that control the behavior of this code /////////
#if !defined(NEW_DURATION_WEIGHT)
#define NEW_DURATION_WEIGHT 0.5
// How much weight to give to the latest duration measurement (must be <= 1)
#endif
#if !defined(TIME_ADJUSTMENT_FACTOR)
#define TIME_ADJUSTMENT_FACTOR 0.8
// A factor by which to adjust the duration estimate to ensure that the overall
// packet transmission times remains matched with the PCR times (which will be the
// times that we expect receivers to play the incoming packets).
// (must be <= 1)
#endif
#if !defined(MAX_PLAYOUT_BUFFER_DURATION)
#define MAX_PLAYOUT_BUFFER_DURATION 0.1 // (seconds)
#endif
#if !defined(PCR_PERIOD_VARIATION_RATIO)
#define PCR_PERIOD_VARIATION_RATIO 0.5
#endif
////////// PIDStatus //////////
class PIDStatus {
public:
PIDStatus(double _firstClock, double _firstRealTime)
: firstClock(_firstClock), lastClock(_firstClock),
firstRealTime(_firstRealTime), lastRealTime(_firstRealTime),
lastPacketNum(0) {
}
double firstClock, lastClock, firstRealTime, lastRealTime;
u_int64_t lastPacketNum;
};
////////// MPEG2TransportStreamFramer //////////
MPEG2TransportStreamFramer* MPEG2TransportStreamFramer
::createNew(UsageEnvironment& env, FramedSource* inputSource) {
return new MPEG2TransportStreamFramer(env, inputSource);
}
MPEG2TransportStreamFramer
::MPEG2TransportStreamFramer(UsageEnvironment& env, FramedSource* inputSource)
: FramedFilter(env, inputSource),
fTSPacketCount(0), fTSPacketDurationEstimate(0.0), fTSPCRCount(0),
fLimitNumTSPacketsToStream(False), fNumTSPacketsToStream(0),
fLimitTSPacketsToStreamByPCR(False), fPCRLimit(0.0) {
fPIDStatusTable = HashTable::create(ONE_WORD_HASH_KEYS);
}
MPEG2TransportStreamFramer::~MPEG2TransportStreamFramer() {
clearPIDStatusTable();
delete fPIDStatusTable;
}
void MPEG2TransportStreamFramer::clearPIDStatusTable() {
PIDStatus* pidStatus;
while ((pidStatus = (PIDStatus*)fPIDStatusTable->RemoveNext()) != NULL) {
delete pidStatus;
}
}
void MPEG2TransportStreamFramer::setNumTSPacketsToStream(unsigned long numTSRecordsToStream) {
fNumTSPacketsToStream = numTSRecordsToStream;
fLimitNumTSPacketsToStream = numTSRecordsToStream > 0;
}
void MPEG2TransportStreamFramer::setPCRLimit(float pcrLimit) {
fPCRLimit = pcrLimit;
fLimitTSPacketsToStreamByPCR = pcrLimit != 0.0;
}
void MPEG2TransportStreamFramer::doGetNextFrame() {
if (fLimitNumTSPacketsToStream) {
if (fNumTSPacketsToStream == 0) {
handleClosure();
return;
}
if (fNumTSPacketsToStream*TRANSPORT_PACKET_SIZE < fMaxSize) {
fMaxSize = fNumTSPacketsToStream*TRANSPORT_PACKET_SIZE;
}
}
// Read directly from our input source into our client's buffer:
fFrameSize = 0;
fInputSource->getNextFrame(fTo, fMaxSize,
afterGettingFrame, this,
FramedSource::handleClosure, this);
}
void MPEG2TransportStreamFramer::doStopGettingFrames() {
FramedFilter::doStopGettingFrames();
fTSPacketCount = 0;
fTSPCRCount = 0;
clearPIDStatusTable();
}
void MPEG2TransportStreamFramer
::afterGettingFrame(void* clientData, unsigned frameSize,
unsigned /*numTruncatedBytes*/,
struct timeval presentationTime,
unsigned /*durationInMicroseconds*/) {
MPEG2TransportStreamFramer* framer = (MPEG2TransportStreamFramer*)clientData;
framer->afterGettingFrame1(frameSize, presentationTime);
}
#define TRANSPORT_SYNC_BYTE 0x47
void MPEG2TransportStreamFramer::afterGettingFrame1(unsigned frameSize,
struct timeval presentationTime) {
fFrameSize += frameSize;
unsigned const numTSPackets = fFrameSize/TRANSPORT_PACKET_SIZE;
fNumTSPacketsToStream -= numTSPackets;
fFrameSize = numTSPackets*TRANSPORT_PACKET_SIZE; // an integral # of TS packets
if (fFrameSize == 0) {
// We didn't read a complete TS packet; assume that the input source has closed.
handleClosure();
return;
}
// Make sure the data begins with a sync byte:
unsigned syncBytePosition;
for (syncBytePosition = 0; syncBytePosition < fFrameSize; ++syncBytePosition) {
if (fTo[syncBytePosition] == TRANSPORT_SYNC_BYTE) break;
}
if (syncBytePosition == fFrameSize) {
envir() << "No Transport Stream sync byte in data.";
handleClosure();
return;
} else if (syncBytePosition > 0) {
// There's a sync byte, but not at the start of the data. Move the good data
// to the start of the buffer, then read more to fill it up again:
memmove(fTo, &fTo[syncBytePosition], fFrameSize - syncBytePosition);
fFrameSize -= syncBytePosition;
fInputSource->getNextFrame(&fTo[fFrameSize], syncBytePosition,
afterGettingFrame, this,
FramedSource::handleClosure, this);
return;
} // else normal case: the data begins with a sync byte
fPresentationTime = presentationTime;
// Scan through the TS packets that we read, and update our estimate of
// the duration of each packet:
struct timeval tvNow;
gettimeofday(&tvNow, NULL);
double timeNow = tvNow.tv_sec + tvNow.tv_usec/1000000.0;
for (unsigned i = 0; i < numTSPackets; ++i) {
if (!updateTSPacketDurationEstimate(&fTo[i*TRANSPORT_PACKET_SIZE], timeNow)) {
// We hit a preset limit (based on PCR) within the stream. Handle this as if the input source has closed:
handleClosure();
return;
}
}
fDurationInMicroseconds
= numTSPackets * (unsigned)(fTSPacketDurationEstimate*1000000);
// Complete the delivery to our client:
afterGetting(this);
}
Boolean MPEG2TransportStreamFramer::updateTSPacketDurationEstimate(unsigned char* pkt, double timeNow) {
// Sanity check: Make sure we start with the sync byte:
if (pkt[0] != TRANSPORT_SYNC_BYTE) {
envir() << "Missing sync byte!\n";
return True;
}
++fTSPacketCount;
// If this packet doesn't contain a PCR, then we're not interested in it:
u_int8_t const adaptation_field_control = (pkt[3]&0x30)>>4;
if (adaptation_field_control != 2 && adaptation_field_control != 3) return True;
// there's no adaptation_field
u_int8_t const adaptation_field_length = pkt[4];
if (adaptation_field_length == 0) return True;
u_int8_t const discontinuity_indicator = pkt[5]&0x80;
u_int8_t const pcrFlag = pkt[5]&0x10;
if (pcrFlag == 0) return True; // no PCR
// There's a PCR. Get it, and the PID:
++fTSPCRCount;
u_int32_t pcrBaseHigh = (pkt[6]<<24)|(pkt[7]<<16)|(pkt[8]<<8)|pkt[9];
double clock = pcrBaseHigh/45000.0;
if ((pkt[10]&0x80) != 0) clock += 1/90000.0; // add in low-bit (if set)
unsigned short pcrExt = ((pkt[10]&0x01)<<8) | pkt[11];
clock += pcrExt/27000000.0;
if (fLimitTSPacketsToStreamByPCR) {
if (clock > fPCRLimit) {
// We've hit a preset limit within the stream:
return False;
}
}
unsigned pid = ((pkt[1]&0x1F)<<8) | pkt[2];
// Check whether we already have a record of a PCR for this PID:
PIDStatus* pidStatus = (PIDStatus*)(fPIDStatusTable->Lookup((char*)pid));
if (pidStatus == NULL) {
// We're seeing this PID's PCR for the first time:
pidStatus = new PIDStatus(clock, timeNow);
fPIDStatusTable->Add((char*)pid, pidStatus);
#ifdef DEBUG_PCR
fprintf(stderr, "PID 0x%x, FIRST PCR 0x%08x+%d:%03x == %f @ %f, pkt #%lu\n", pid, pcrBaseHigh, pkt[10]>>7, pcrExt, clock, timeNow, fTSPacketCount);
#endif
} else {
// We've seen this PID's PCR before; update our per-packet duration estimate:
int64_t packetsSinceLast = (int64_t)(fTSPacketCount - pidStatus->lastPacketNum);
// it's "int64_t" because some compilers can't convert "u_int64_t" -> "double"
double durationPerPacket = (clock - pidStatus->lastClock)/packetsSinceLast;
// Hack (suggested by "Romain"): Don't update our estimate if this PCR appeared unusually quickly.
// (This can produce more accurate estimates for wildly VBR streams.)
double meanPCRPeriod = 0.0;
if (fTSPCRCount > 0) {
double tsPacketCount = (double)(int64_t)fTSPacketCount;
double tsPCRCount = (double)(int64_t)fTSPCRCount;
meanPCRPeriod = tsPacketCount/tsPCRCount;
if (packetsSinceLast < meanPCRPeriod*PCR_PERIOD_VARIATION_RATIO) return True;
}
if (fTSPacketDurationEstimate == 0.0) { // we've just started
fTSPacketDurationEstimate = durationPerPacket;
} else if (discontinuity_indicator == 0 && durationPerPacket >= 0.0) {
fTSPacketDurationEstimate
= durationPerPacket*NEW_DURATION_WEIGHT
+ fTSPacketDurationEstimate*(1-NEW_DURATION_WEIGHT);
// Also adjust the duration estimate to try to ensure that the transmission
// rate matches the playout rate:
double transmitDuration = timeNow - pidStatus->firstRealTime;
double playoutDuration = clock - pidStatus->firstClock;
if (transmitDuration > playoutDuration) {
fTSPacketDurationEstimate *= TIME_ADJUSTMENT_FACTOR; // reduce estimate
} else if (transmitDuration + MAX_PLAYOUT_BUFFER_DURATION < playoutDuration) {
fTSPacketDurationEstimate /= TIME_ADJUSTMENT_FACTOR; // increase estimate
}
} else {
// the PCR has a discontinuity from its previous value; don't use it now,
// but reset our PCR and real-time values to compensate:
pidStatus->firstClock = clock;
pidStatus->firstRealTime = timeNow;
}
#ifdef DEBUG_PCR
fprintf(stderr, "PID 0x%x, PCR 0x%08x+%d:%03x == %f @ %f (diffs %f @ %f), pkt #%lu, discon %d => this duration %f, new estimate %f, mean PCR period=%f\n", pid, pcrBaseHigh, pkt[10]>>7, pcrExt, clock, timeNow, clock - pidStatus->firstClock, timeNow - pidStatus->firstRealTime, fTSPacketCount, discontinuity_indicator != 0, durationPerPacket, fTSPacketDurationEstimate, meanPCRPeriod );
#endif
}
pidStatus->lastClock = clock;
pidStatus->lastRealTime = timeNow;
pidStatus->lastPacketNum = fTSPacketCount;
return True;
}
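The PCR decoding in `updateTSPacketDurationEstimate()` above reads the 42-bit Program Clock Reference from the adaptation field: bytes 6..9 hold the top 32 bits of the 33-bit base (90 kHz units), bit 7 of byte 10 is the base's low bit, and the low bit of byte 10 plus byte 11 form the 9-bit extension (27 MHz units). A self-contained sketch of just that decoding step (the helper name `decodePCRSeconds` is ours; it adds explicit casts but is otherwise the same arithmetic):

```cpp
#include <cstdint>

// Sketch of the PCR-to-seconds conversion done in
// updateTSPacketDurationEstimate().  pkt points at a 188-byte Transport
// Stream packet whose adaptation field carries a PCR.
static double decodePCRSeconds(unsigned char const* pkt) {
  uint32_t pcrBaseHigh
    = ((uint32_t)pkt[6]<<24)|((uint32_t)pkt[7]<<16)|((uint32_t)pkt[8]<<8)|pkt[9];
  double clock = pcrBaseHigh/45000.0;           // top 32 bits of the 90 kHz base
  if ((pkt[10]&0x80) != 0) clock += 1/90000.0;  // add in the base's low bit (if set)
  unsigned short pcrExt = ((pkt[10]&0x01)<<8) | pkt[11];
  clock += pcrExt/27000000.0;                   // 9-bit extension, 27 MHz units
  return clock;                                 // seconds
}
```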
// File: live/liveMedia/MPEG2TransportStreamFromESSource.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A filter for converting one or more MPEG Elementary Streams
// to a MPEG-2 Transport Stream
// Implementation
#include "MPEG2TransportStreamFromESSource.hh"
#define MAX_INPUT_ES_FRAME_SIZE 100000
#define SIMPLE_PES_HEADER_SIZE 14
#define LOW_WATER_MARK 1000 // <= MAX_INPUT_ES_FRAME_SIZE
#define INPUT_BUFFER_SIZE (SIMPLE_PES_HEADER_SIZE + 2*MAX_INPUT_ES_FRAME_SIZE)
////////// InputESSourceRecord definition //////////
class InputESSourceRecord {
public:
InputESSourceRecord(MPEG2TransportStreamFromESSource& parent,
FramedSource* inputSource,
u_int8_t streamId, int mpegVersion,
InputESSourceRecord* next, int16_t PID = -1);
virtual ~InputESSourceRecord();
InputESSourceRecord* next() const { return fNext; }
FramedSource* inputSource() const { return fInputSource; }
void askForNewData();
Boolean deliverBufferToClient();
unsigned char* buffer() const { return fInputBuffer; }
void reset() {
// Reset the buffer for future use:
fInputBufferBytesAvailable = 0;
fInputBufferInUse = False;
}
private:
static void afterGettingFrame(void* clientData, unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds);
void afterGettingFrame1(unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime);
private:
InputESSourceRecord* fNext;
MPEG2TransportStreamFromESSource& fParent;
FramedSource* fInputSource;
u_int8_t fStreamId;
int fMPEGVersion;
unsigned char* fInputBuffer;
unsigned fInputBufferBytesAvailable;
Boolean fInputBufferInUse;
MPEG1or2Demux::SCR fSCR;
int16_t fPID;
};
////////// MPEG2TransportStreamFromESSource implementation //////////
MPEG2TransportStreamFromESSource* MPEG2TransportStreamFromESSource
::createNew(UsageEnvironment& env) {
return new MPEG2TransportStreamFromESSource(env);
}
void MPEG2TransportStreamFromESSource
::addNewVideoSource(FramedSource* inputSource, int mpegVersion, int16_t PID) {
u_int8_t streamId = 0xE0 | (fVideoSourceCounter++&0x0F);
addNewInputSource(inputSource, streamId, mpegVersion, PID);
fHaveVideoStreams = True;
}
void MPEG2TransportStreamFromESSource
::addNewAudioSource(FramedSource* inputSource, int mpegVersion, int16_t PID) {
u_int8_t streamId = 0xC0 | (fAudioSourceCounter++&0x0F);
addNewInputSource(inputSource, streamId, mpegVersion, PID);
}
MPEG2TransportStreamFromESSource
::MPEG2TransportStreamFromESSource(UsageEnvironment& env)
: MPEG2TransportStreamMultiplexor(env),
fInputSources(NULL), fVideoSourceCounter(0), fAudioSourceCounter(0),
fAwaitingBackgroundDelivery(False) {
fHaveVideoStreams = False; // unless we add a video source
}
MPEG2TransportStreamFromESSource::~MPEG2TransportStreamFromESSource() {
doStopGettingFrames();
delete fInputSources;
}
void MPEG2TransportStreamFromESSource::doStopGettingFrames() {
// Stop each input source:
for (InputESSourceRecord* sourceRec = fInputSources; sourceRec != NULL;
sourceRec = sourceRec->next()) {
sourceRec->inputSource()->stopGettingFrames();
}
}
void MPEG2TransportStreamFromESSource
::awaitNewBuffer(unsigned char* oldBuffer) {
InputESSourceRecord* sourceRec;
// Begin by resetting the old buffer:
if (oldBuffer != NULL) {
for (sourceRec = fInputSources; sourceRec != NULL;
sourceRec = sourceRec->next()) {
if (sourceRec->buffer() == oldBuffer) {
sourceRec->reset();
break;
}
}
fAwaitingBackgroundDelivery = False;
}
if (isCurrentlyAwaitingData()) {
// Try to deliver one filled-in buffer to the client:
for (sourceRec = fInputSources; sourceRec != NULL;
sourceRec = sourceRec->next()) {
if (sourceRec->deliverBufferToClient()) return;
}
fAwaitingBackgroundDelivery = True;
}
// No filled-in buffers are available. Ask each of our inputs for data:
for (sourceRec = fInputSources; sourceRec != NULL;
sourceRec = sourceRec->next()) {
sourceRec->askForNewData();
}
}
void MPEG2TransportStreamFromESSource
::addNewInputSource(FramedSource* inputSource,
u_int8_t streamId, int mpegVersion, int16_t PID) {
if (inputSource == NULL) return;
fInputSources = new InputESSourceRecord(*this, inputSource, streamId,
mpegVersion, fInputSources, PID);
}
////////// InputESSourceRecord implementation //////////
InputESSourceRecord
::InputESSourceRecord(MPEG2TransportStreamFromESSource& parent,
FramedSource* inputSource,
u_int8_t streamId, int mpegVersion,
InputESSourceRecord* next, int16_t PID)
: fNext(next), fParent(parent), fInputSource(inputSource),
fStreamId(streamId), fMPEGVersion(mpegVersion), fPID(PID) {
fInputBuffer = new unsigned char[INPUT_BUFFER_SIZE];
reset();
}
InputESSourceRecord::~InputESSourceRecord() {
Medium::close(fInputSource);
delete[] fInputBuffer;
delete fNext;
}
void InputESSourceRecord::askForNewData() {
if (fInputBufferInUse) return;
if (fInputBufferBytesAvailable == 0) {
// Reset our buffer, by adding a simple PES header at the start:
fInputBuffer[0] = 0; fInputBuffer[1] = 0; fInputBuffer[2] = 1;
fInputBuffer[3] = fStreamId;
fInputBuffer[4] = 0; fInputBuffer[5] = 0; // fill in later with the length
fInputBuffer[6] = 0x80;
fInputBuffer[7] = 0x80; // include a PTS
fInputBuffer[8] = 5; // PES_header_data_length (enough for a PTS)
// fInputBuffer[9..13] will be the PTS; fill this in later
fInputBufferBytesAvailable = SIMPLE_PES_HEADER_SIZE;
}
if (fInputBufferBytesAvailable < LOW_WATER_MARK &&
!fInputSource->isCurrentlyAwaitingData()) {
// We don't yet have enough data in our buffer. Arrange to read more:
fInputSource->getNextFrame(&fInputBuffer[fInputBufferBytesAvailable],
INPUT_BUFFER_SIZE-fInputBufferBytesAvailable,
afterGettingFrame, this,
FramedSource::handleClosure, &fParent);
}
}
Boolean InputESSourceRecord::deliverBufferToClient() {
if (fInputBufferInUse || fInputBufferBytesAvailable < LOW_WATER_MARK) return False;
// Fill in the PES_packet_length field that we left unset before:
unsigned PES_packet_length = fInputBufferBytesAvailable - 6;
if (PES_packet_length > 0xFFFF) {
// Set the PES_packet_length field to 0. This indicates an unbounded length (see ISO 13818-1, 2.4.3.7)
PES_packet_length = 0;
}
fInputBuffer[4] = PES_packet_length>>8;
fInputBuffer[5] = PES_packet_length;
// Fill in the PES PTS (from our SCR):
fInputBuffer[9] = 0x20|(fSCR.highBit<<3)|(fSCR.remainingBits>>29)|0x01;
fInputBuffer[10] = fSCR.remainingBits>>22;
fInputBuffer[11] = (fSCR.remainingBits>>14)|0x01;
fInputBuffer[12] = fSCR.remainingBits>>7;
fInputBuffer[13] = (fSCR.remainingBits<<1)|0x01;
fInputBufferInUse = True;
// Do the delivery:
fParent.handleNewBuffer(fInputBuffer, fInputBufferBytesAvailable,
fMPEGVersion, fSCR, fPID);
return True;
}
void InputESSourceRecord
::afterGettingFrame(void* clientData, unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned /*durationInMicroseconds*/) {
InputESSourceRecord* source = (InputESSourceRecord*)clientData;
source->afterGettingFrame1(frameSize, numTruncatedBytes, presentationTime);
}
void InputESSourceRecord
::afterGettingFrame1(unsigned frameSize, unsigned numTruncatedBytes,
struct timeval presentationTime) {
if (numTruncatedBytes > 0) {
fParent.envir() << "MPEG2TransportStreamFromESSource: input buffer too small; increase \"MAX_INPUT_ES_FRAME_SIZE\" in \"MPEG2TransportStreamFromESSource\" by at least "
<< numTruncatedBytes << " bytes!\n";
}
if (fInputBufferBytesAvailable == SIMPLE_PES_HEADER_SIZE) {
// Use this presentationTime for our SCR:
fSCR.highBit
= ((presentationTime.tv_sec*45000 + (presentationTime.tv_usec*9)/200)&
0x80000000) != 0;
fSCR.remainingBits
= presentationTime.tv_sec*90000 + (presentationTime.tv_usec*9)/100;
fSCR.extension = (presentationTime.tv_usec*9)%100;
#ifdef DEBUG_SCR
fprintf(stderr, "PES header: stream_id 0x%02x, pts: %u.%06u => SCR 0x%x%08x:%03x\n", fStreamId, (unsigned)presentationTime.tv_sec, (unsigned)presentationTime.tv_usec, fSCR.highBit, fSCR.remainingBits, fSCR.extension);
#endif
}
fInputBufferBytesAvailable += frameSize;
fParent.fPresentationTime = presentationTime;
// Now that we have new input data, check if we can deliver to the client:
if (fParent.fAwaitingBackgroundDelivery) {
fParent.fAwaitingBackgroundDelivery = False;
fParent.awaitNewBuffer(NULL);
}
}
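The SCR computed in "afterGettingFrame1()" above converts the `struct timeval` presentation time to the 90 kHz MPEG system clock (sec*90000 + usec*9/100), keeping the sub-tick remainder in the 'extension' field. A self-contained sketch of the same arithmetic (names hypothetical):

```cpp
#include <cassert>
#include <cstdint>

// Convert seconds + microseconds to 90 kHz clock ticks, as done when
// filling in the SCR above. The 'extension' holds the remainder of the
// usec->tick conversion, in hundredths of a tick.
struct Clock90k { uint32_t ticks; uint16_t extension; };

static Clock90k toClock90k(long sec, long usec) {
  Clock90k c;
  c.ticks = (uint32_t)(sec * 90000 + (usec * 9) / 100);
  c.extension = (uint16_t)((usec * 9) % 100);
  return c;
}
```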
live/liveMedia/OggDemuxedTrack.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A media track, demultiplexed from an Ogg file
// Implementation
#include "OggDemuxedTrack.hh"
#include "OggFile.hh"
OggDemuxedTrack::OggDemuxedTrack(UsageEnvironment& env, unsigned trackNumber, OggDemux& sourceDemux)
: FramedSource(env),
fOurTrackNumber(trackNumber), fOurSourceDemux(sourceDemux),
fCurrentPageIsContinuation(False) {
fNextPresentationTime.tv_sec = 0; fNextPresentationTime.tv_usec = 0;
}
OggDemuxedTrack::~OggDemuxedTrack() {
fOurSourceDemux.removeTrack(fOurTrackNumber);
}
void OggDemuxedTrack::doGetNextFrame() {
fOurSourceDemux.continueReading();
}
char const* OggDemuxedTrack::MIMEtype() const {
OggTrack* track = fOurSourceDemux.fOurFile.lookup(fOurTrackNumber);
if (track == NULL) return "(unknown)"; // shouldn't happen
return track->mimeType;
}
live/liveMedia/MPEG2TransportStreamFromPESSource.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A filter for converting a stream of MPEG PES packets to an MPEG-2 Transport Stream
// Implementation
#include "MPEG2TransportStreamFromPESSource.hh"
#define MAX_PES_PACKET_SIZE (6+65535)
MPEG2TransportStreamFromPESSource* MPEG2TransportStreamFromPESSource
::createNew(UsageEnvironment& env, MPEG1or2DemuxedElementaryStream* inputSource) {
return new MPEG2TransportStreamFromPESSource(env, inputSource);
}
MPEG2TransportStreamFromPESSource
::MPEG2TransportStreamFromPESSource(UsageEnvironment& env,
MPEG1or2DemuxedElementaryStream* inputSource)
: MPEG2TransportStreamMultiplexor(env),
fInputSource(inputSource) {
fInputBuffer = new unsigned char[MAX_PES_PACKET_SIZE];
}
MPEG2TransportStreamFromPESSource::~MPEG2TransportStreamFromPESSource() {
Medium::close(fInputSource);
delete[] fInputBuffer;
}
void MPEG2TransportStreamFromPESSource::doStopGettingFrames() {
fInputSource->stopGettingFrames();
}
void MPEG2TransportStreamFromPESSource
::awaitNewBuffer(unsigned char* /*oldBuffer*/) {
fInputSource->getNextFrame(fInputBuffer, MAX_PES_PACKET_SIZE,
afterGettingFrame, this,
FramedSource::handleClosure, this);
}
void MPEG2TransportStreamFromPESSource
::afterGettingFrame(void* clientData, unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
MPEG2TransportStreamFromPESSource* source
= (MPEG2TransportStreamFromPESSource*)clientData;
source->afterGettingFrame1(frameSize, numTruncatedBytes,
presentationTime, durationInMicroseconds);
}
void MPEG2TransportStreamFromPESSource
::afterGettingFrame1(unsigned frameSize,
unsigned /*numTruncatedBytes*/,
struct timeval /*presentationTime*/,
unsigned /*durationInMicroseconds*/) {
if (frameSize < 4) return;
handleNewBuffer(fInputBuffer, frameSize,
fInputSource->mpegVersion(), fInputSource->lastSeenSCR());
}
live/liveMedia/MPEG2TransportStreamIndexFile.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A class that encapsulates MPEG-2 Transport Stream 'index files'.
// These index files are used to implement 'trick play' operations
// (seek-by-time, fast forward, reverse play) on Transport Stream files.
//
// Implementation
#include "MPEG2TransportStreamIndexFile.hh"
#include "InputFile.hh"
MPEG2TransportStreamIndexFile
::MPEG2TransportStreamIndexFile(UsageEnvironment& env, char const* indexFileName)
: Medium(env),
fFileName(strDup(indexFileName)), fFid(NULL), fMPEGVersion(0), fCurrentIndexRecordNum(0),
fCachedPCR(0.0f), fCachedTSPacketNumber(0), fNumIndexRecords(0) {
// Get the file size, to determine how many index records it contains:
u_int64_t indexFileSize = GetFileSize(indexFileName, NULL);
if (indexFileSize % INDEX_RECORD_SIZE != 0) {
env << "Warning: Size of the index file \"" << indexFileName
<< "\" (" << (unsigned)indexFileSize
<< ") is not a multiple of the index record size ("
<< INDEX_RECORD_SIZE << ")\n";
}
fNumIndexRecords = (unsigned long)(indexFileSize/INDEX_RECORD_SIZE);
}
MPEG2TransportStreamIndexFile* MPEG2TransportStreamIndexFile
::createNew(UsageEnvironment& env, char const* indexFileName) {
if (indexFileName == NULL) return NULL;
MPEG2TransportStreamIndexFile* indexFile
= new MPEG2TransportStreamIndexFile(env, indexFileName);
// Reject empty or non-existent index files:
if (indexFile->getPlayingDuration() == 0.0f) {
delete indexFile;
indexFile = NULL;
}
return indexFile;
}
MPEG2TransportStreamIndexFile::~MPEG2TransportStreamIndexFile() {
closeFid();
delete[] fFileName;
}
void MPEG2TransportStreamIndexFile
::lookupTSPacketNumFromNPT(float& npt, unsigned long& tsPacketNumber,
unsigned long& indexRecordNumber) {
if (npt <= 0.0 || fNumIndexRecords == 0) { // Fast-track a common case:
npt = 0.0f;
tsPacketNumber = indexRecordNumber = 0;
return;
}
// If "npt" is the same as the one that we last looked up, return its cached result:
if (npt == fCachedPCR) {
tsPacketNumber = fCachedTSPacketNumber;
indexRecordNumber = fCachedIndexRecordNumber;
return;
}
// Search for the pair of neighboring index records whose PCR values span "npt".
// Use the 'regula-falsi' method.
Boolean success = False;
unsigned long ixFound = 0;
do {
unsigned long ixLeft = 0, ixRight = fNumIndexRecords-1;
float pcrLeft = 0.0f, pcrRight;
if (!readIndexRecord(ixRight)) break;
pcrRight = pcrFromBuf();
if (npt > pcrRight) npt = pcrRight;
// handle "npt" too large by seeking to the last frame of the file
while (ixRight-ixLeft > 1 && pcrLeft < npt && npt <= pcrRight) {
unsigned long ixNew = ixLeft
+ (unsigned long)(((npt-pcrLeft)/(pcrRight-pcrLeft))*(ixRight-ixLeft));
if (ixNew == ixLeft || ixNew == ixRight) {
// use bisection instead:
ixNew = (ixLeft+ixRight)/2;
}
if (!readIndexRecord(ixNew)) break;
float pcrNew = pcrFromBuf();
if (pcrNew < npt) {
pcrLeft = pcrNew;
ixLeft = ixNew;
} else {
pcrRight = pcrNew;
ixRight = ixNew;
}
}
if (ixRight-ixLeft > 1 || npt <= pcrLeft || npt > pcrRight) break; // bad PCR values in index file?
ixFound = ixRight;
// "Rewind" until we reach the start of a Video Sequence or GOP header:
success = rewindToCleanPoint(ixFound);
} while (0);
if (success && readIndexRecord(ixFound)) {
// Return (and cache) information from record "ixFound":
npt = fCachedPCR = pcrFromBuf();
tsPacketNumber = fCachedTSPacketNumber = tsPacketNumFromBuf();
indexRecordNumber = fCachedIndexRecordNumber = ixFound;
} else {
// An error occurred: Return the default values, for npt == 0:
npt = 0.0f;
tsPacketNumber = indexRecordNumber = 0;
}
closeFid();
}
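The bracketing loop above is the false-position (regula falsi) method: probe at an index interpolated from the bracketing PCR values, and fall back to bisection whenever interpolation stalls on an endpoint. A generic sketch of the same search over a sorted in-memory array (names hypothetical; assumes strictly increasing values with n >= 2):

```cpp
#include <cassert>
#include <cstdint>

// False-position search with bisection fallback, mirroring the
// bracketing loop in lookupTSPacketNumFromNPT() above.
// Returns the smallest index i with values[i] >= target
// (assumes values[0] <= target <= values[n-1]).
static unsigned long falsePositionFind(float const* values, unsigned long n, float target) {
  unsigned long left = 0, right = n - 1;
  float vLeft = values[left], vRight = values[right];
  if (target <= vLeft) return left;
  while (right - left > 1) {
    unsigned long probe = left
      + (unsigned long)(((target - vLeft) / (vRight - vLeft)) * (right - left));
    if (probe == left || probe == right) probe = (left + right) / 2; // bisection fallback
    float v = values[probe];
    if (v < target) { left = probe; vLeft = v; }
    else { right = probe; vRight = v; }
  }
  return right;
}
```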
void MPEG2TransportStreamIndexFile
::lookupPCRFromTSPacketNum(unsigned long& tsPacketNumber, Boolean reverseToPreviousCleanPoint,
float& pcr, unsigned long& indexRecordNumber) {
if (tsPacketNumber == 0 || fNumIndexRecords == 0) { // Fast-track a common case:
pcr = 0.0f;
indexRecordNumber = 0;
return;
}
// If "tsPacketNumber" is the same as the one that we last looked up, return its cached result:
if (tsPacketNumber == fCachedTSPacketNumber) {
pcr = fCachedPCR;
indexRecordNumber = fCachedIndexRecordNumber;
return;
}
// Search for the pair of neighboring index records whose TS packet #s span "tsPacketNumber".
// Use the 'regula-falsi' method.
Boolean success = False;
unsigned long ixFound = 0;
do {
unsigned long ixLeft = 0, ixRight = fNumIndexRecords-1;
unsigned long tsLeft = 0, tsRight;
if (!readIndexRecord(ixRight)) break;
tsRight = tsPacketNumFromBuf();
if (tsPacketNumber > tsRight) tsPacketNumber = tsRight;
// handle "tsPacketNumber" too large by seeking to the last frame of the file
while (ixRight-ixLeft > 1 && tsLeft < tsPacketNumber && tsPacketNumber <= tsRight) {
unsigned long ixNew = ixLeft
+ (unsigned long)(((tsPacketNumber-tsLeft)/(tsRight-tsLeft))*(ixRight-ixLeft));
if (ixNew == ixLeft || ixNew == ixRight) {
// Use bisection instead:
ixNew = (ixLeft+ixRight)/2;
}
if (!readIndexRecord(ixNew)) break;
unsigned long tsNew = tsPacketNumFromBuf();
if (tsNew < tsPacketNumber) {
tsLeft = tsNew;
ixLeft = ixNew;
} else {
tsRight = tsNew;
ixRight = ixNew;
}
}
if (ixRight-ixLeft > 1 || tsPacketNumber <= tsLeft || tsPacketNumber > tsRight) break; // bad PCR values in index file?
ixFound = ixRight;
if (reverseToPreviousCleanPoint) {
// "Rewind" until we reach the start of a Video Sequence or GOP header:
success = rewindToCleanPoint(ixFound);
} else {
success = True;
}
} while (0);
if (success && readIndexRecord(ixFound)) {
// Return (and cache) information from record "ixFound":
pcr = fCachedPCR = pcrFromBuf();
fCachedTSPacketNumber = tsPacketNumFromBuf();
if (reverseToPreviousCleanPoint) tsPacketNumber = fCachedTSPacketNumber;
indexRecordNumber = fCachedIndexRecordNumber = ixFound;
} else {
// An error occurred: Return the default values, for tsPacketNumber == 0:
pcr = 0.0f;
indexRecordNumber = 0;
}
closeFid();
}
Boolean MPEG2TransportStreamIndexFile
::readIndexRecordValues(unsigned long indexRecordNum,
unsigned long& transportPacketNum, u_int8_t& offset,
u_int8_t& size, float& pcr, u_int8_t& recordType) {
if (!readIndexRecord(indexRecordNum)) return False;
transportPacketNum = tsPacketNumFromBuf();
offset = offsetFromBuf();
size = sizeFromBuf();
pcr = pcrFromBuf();
recordType = recordTypeFromBuf();
return True;
}
float MPEG2TransportStreamIndexFile::getPlayingDuration() {
if (fNumIndexRecords == 0 || !readOneIndexRecord(fNumIndexRecords-1)) return 0.0f;
return pcrFromBuf();
}
int MPEG2TransportStreamIndexFile::mpegVersion() {
if (fMPEGVersion != 0) return fMPEGVersion; // we already know it
// Read the first index record, and figure out the MPEG version from its type:
if (!readOneIndexRecord(0)) return 0; // unknown; perhaps the index file is empty?
setMPEGVersionFromRecordType(recordTypeFromBuf());
return fMPEGVersion;
}
Boolean MPEG2TransportStreamIndexFile::openFid() {
if (fFid == NULL && fFileName != NULL) {
if ((fFid = OpenInputFile(envir(), fFileName)) != NULL) {
fCurrentIndexRecordNum = 0;
}
}
return fFid != NULL;
}
Boolean MPEG2TransportStreamIndexFile::seekToIndexRecord(unsigned long indexRecordNumber) {
if (!openFid()) return False;
if (indexRecordNumber == fCurrentIndexRecordNum) return True; // we're already there
if (SeekFile64(fFid, (int64_t)(indexRecordNumber*INDEX_RECORD_SIZE), SEEK_SET) != 0) return False;
fCurrentIndexRecordNum = indexRecordNumber;
return True;
}
Boolean MPEG2TransportStreamIndexFile::readIndexRecord(unsigned long indexRecordNum) {
do {
if (!seekToIndexRecord(indexRecordNum)) break;
if (fread(fBuf, INDEX_RECORD_SIZE, 1, fFid) != 1) break;
++fCurrentIndexRecordNum;
return True;
} while (0);
return False; // an error occurred
}
Boolean MPEG2TransportStreamIndexFile::readOneIndexRecord(unsigned long indexRecordNum) {
Boolean result = readIndexRecord(indexRecordNum);
closeFid();
return result;
}
void MPEG2TransportStreamIndexFile::closeFid() {
if (fFid != NULL) {
CloseInputFile(fFid);
fFid = NULL;
}
}
float MPEG2TransportStreamIndexFile::pcrFromBuf() {
unsigned pcr_int = (fBuf[5]<<16) | (fBuf[4]<<8) | fBuf[3];
u_int8_t pcr_frac = fBuf[6];
return pcr_int + pcr_frac/256.0f;
}
unsigned long MPEG2TransportStreamIndexFile::tsPacketNumFromBuf() {
return (fBuf[10]<<24) | (fBuf[9]<<16) | (fBuf[8]<<8) | fBuf[7];
}
void MPEG2TransportStreamIndexFile::setMPEGVersionFromRecordType(u_int8_t recordType) {
if (fMPEGVersion != 0) return; // we already know it
u_int8_t const recordTypeWithoutStartBit = recordType&~0x80;
if (recordTypeWithoutStartBit >= 1 && recordTypeWithoutStartBit <= 4) fMPEGVersion = 2;
else if (recordTypeWithoutStartBit >= 5 && recordTypeWithoutStartBit <= 10) fMPEGVersion = 5;
// represents H.264
else if (recordTypeWithoutStartBit >= 11 && recordTypeWithoutStartBit <= 16) fMPEGVersion = 6;
// represents H.265
}
Boolean MPEG2TransportStreamIndexFile::rewindToCleanPoint(unsigned long&ixFound) {
Boolean success = False; // until we learn otherwise
while (ixFound > 0) {
if (!readIndexRecord(ixFound)) break;
u_int8_t recordType = recordTypeFromBuf();
setMPEGVersionFromRecordType(recordType);
// A 'clean point' is the start of a 'frame' from which a decoder can cleanly resume
// handling the stream. For H.264, this is a SPS. For H.265, this is a VPS.
// For MPEG-2, this is a Video Sequence Header, or a GOP.
if ((recordType&0x80) != 0) { // This is the start of a 'frame'
recordType &=~ 0x80; // remove the 'start of frame' bit
if (fMPEGVersion == 5) { // H.264
if (recordType == 5/*SPS*/) {
success = True;
break;
}
} else if (fMPEGVersion == 6) { // H.265
if (recordType == 11/*VPS*/) {
success = True;
break;
}
} else { // MPEG-1, 2, or 4
if (recordType == 1/*VSH*/) {
success = True;
break;
} else if (recordType == 2/*GOP*/) {
// Hack: If the preceding record is for a Video Sequence Header, then use it instead:
unsigned long newIxFound = ixFound;
while (--newIxFound > 0) {
if (!readIndexRecord(newIxFound)) break;
recordType = recordTypeFromBuf();
if ((recordType&0x7F) != 1) break; // not a Video Sequence Header
if ((recordType&0x80) != 0) { // this is the start of the VSH; use it
ixFound = newIxFound;
break;
}
}
}
success = True;
break;
}
}
// Keep checking, from the previous record:
--ixFound;
}
if (ixFound == 0) success = True; // use record 0 anyway
return success;
}
live/liveMedia/MPEG2TransportStreamMultiplexor.cpp:
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A class for generating MPEG-2 Transport Stream from one or more input
// Elementary Stream data sources
// Implementation
#include "MPEG2TransportStreamMultiplexor.hh"
#define TRANSPORT_PACKET_SIZE 188
#define PAT_PERIOD 100 // # of packets between Program Association Tables
#define PMT_PERIOD 500 // # of packets between Program Map Tables
#define PID_TABLE_SIZE 256
MPEG2TransportStreamMultiplexor
::MPEG2TransportStreamMultiplexor(UsageEnvironment& env)
: FramedSource(env),
fHaveVideoStreams(True/*by default*/),
fOutgoingPacketCounter(0), fProgramMapVersion(0),
fPreviousInputProgramMapVersion(0xFF), fCurrentInputProgramMapVersion(0xFF),
fPCR_PID(0), fCurrentPID(0),
fInputBuffer(NULL), fInputBufferSize(0), fInputBufferBytesUsed(0),
fIsFirstAdaptationField(True) {
for (unsigned i = 0; i < PID_TABLE_SIZE; ++i) {
fPIDState[i].counter = 0;
fPIDState[i].streamType = 0;
}
}
MPEG2TransportStreamMultiplexor::~MPEG2TransportStreamMultiplexor() {
}
void MPEG2TransportStreamMultiplexor::doGetNextFrame() {
if (fInputBufferBytesUsed >= fInputBufferSize) {
// No more bytes are available from the current buffer.
// Arrange to read a new one.
awaitNewBuffer(fInputBuffer);
return;
}
do {
// Periodically return a Program Association Table packet instead:
if (fOutgoingPacketCounter++ % PAT_PERIOD == 0) {
deliverPATPacket();
break;
}
// Periodically (or when we see a new PID) return a Program Map Table instead:
Boolean programMapHasChanged = fPIDState[fCurrentPID].counter == 0
|| fCurrentInputProgramMapVersion != fPreviousInputProgramMapVersion;
if (fOutgoingPacketCounter % PMT_PERIOD == 0 || programMapHasChanged) {
if (programMapHasChanged) { // reset values for next time:
fPIDState[fCurrentPID].counter = 1;
fPreviousInputProgramMapVersion = fCurrentInputProgramMapVersion;
}
deliverPMTPacket(programMapHasChanged);
break;
}
// Normal case: Deliver (or continue delivering) the recently-read data:
deliverDataToClient(fCurrentPID, fInputBuffer, fInputBufferSize,
fInputBufferBytesUsed);
} while (0);
// NEED TO SET fPresentationTime, durationInMicroseconds #####
// Complete the delivery to the client:
if ((fOutgoingPacketCounter%10) == 0) {
// To avoid excessive recursion (and stack overflow) caused by excessively large input frames,
// occasionally return to the event loop to do this:
envir().taskScheduler().scheduleDelayedTask(0, (TaskFunc*)FramedSource::afterGetting, this);
} else {
afterGetting(this);
}
}
void MPEG2TransportStreamMultiplexor
::handleNewBuffer(unsigned char* buffer, unsigned bufferSize,
int mpegVersion, MPEG1or2Demux::SCR scr, int16_t PID) {
if (bufferSize < 4) return;
fInputBuffer = buffer;
fInputBufferSize = bufferSize;
fInputBufferBytesUsed = 0;
u_int8_t stream_id = fInputBuffer[3];
// Use "stream_id" directly as our PID.
// Also, figure out the Program Map 'stream type' from this.
if (stream_id == 0xBE) { // padding_stream; ignore
fInputBufferSize = 0;
} else if (stream_id == 0xBC) { // program_stream_map
setProgramStreamMap(fInputBufferSize);
fInputBufferSize = 0; // then, ignore the buffer
} else {
if (PID == -1)
fCurrentPID = stream_id;
else
fCurrentPID = (u_int8_t)PID;
// Set the stream's type:
u_int8_t& streamType = fPIDState[fCurrentPID].streamType; // alias
if (streamType == 0) {
// Instead, set the stream's type to default values, based on whether
// the stream is audio or video, and whether it's MPEG-1 or MPEG-2:
if ((stream_id&0xF0) == 0xE0) { // video
streamType = mpegVersion == 1 ? 1 : mpegVersion == 2 ? 2 : mpegVersion == 4 ? 0x10 :
mpegVersion == 5/*H.264*/ ? 0x1B : 0x24/*assume H.265*/;
} else if ((stream_id&0xE0) == 0xC0) { // audio
streamType = mpegVersion == 1 ? 3 : mpegVersion == 2 ? 4 : 0xF;
} else if (stream_id == 0xBD) { // private_stream1 (usually AC-3)
streamType = 0x06; // for DVB; for ATSC, use 0x81
} else { // some other stream type
streamType = 0x81; // private
}
}
if (fPCR_PID == 0) { // set it to this stream, if it's appropriate:
if ((!fHaveVideoStreams && (streamType == 3 || streamType == 4 || streamType == 0xF))/* audio stream */ ||
(streamType == 1 || streamType == 2 || streamType == 0x10 || streamType == 0x1B || streamType == 0x24)/* video stream */) {
fPCR_PID = fCurrentPID; // use this stream's SCR for PCR
}
}
if (fCurrentPID == fPCR_PID) {
// Record the input's current SCR timestamp, for use as our PCR:
fPCR = scr;
}
}
// Now that we have new input data, retry the last delivery to the client:
doGetNextFrame();
}
void MPEG2TransportStreamMultiplexor
::deliverDataToClient(u_int8_t pid, unsigned char* buffer, unsigned bufferSize,
unsigned& startPositionInBuffer) {
// Construct a new Transport packet, and deliver it to the client:
if (fMaxSize < TRANSPORT_PACKET_SIZE) {
fFrameSize = 0; // the client hasn't given us enough space; deliver nothing
fNumTruncatedBytes = TRANSPORT_PACKET_SIZE;
} else {
fFrameSize = TRANSPORT_PACKET_SIZE;
Boolean willAddPCR = pid == fPCR_PID && startPositionInBuffer == 0
&& !(fPCR.highBit == 0 && fPCR.remainingBits == 0 && fPCR.extension == 0);
unsigned const numBytesAvailable = bufferSize - startPositionInBuffer;
unsigned numHeaderBytes = 4; // by default
unsigned numPCRBytes = 0; // by default
unsigned numPaddingBytes = 0; // by default
unsigned numDataBytes;
u_int8_t adaptation_field_control;
if (willAddPCR) {
adaptation_field_control = 0x30;
numHeaderBytes += 2; // for the "adaptation_field_length" and flags
numPCRBytes = 6;
if (numBytesAvailable >= TRANSPORT_PACKET_SIZE - numHeaderBytes - numPCRBytes) {
numDataBytes = TRANSPORT_PACKET_SIZE - numHeaderBytes - numPCRBytes;
} else {
numDataBytes = numBytesAvailable;
numPaddingBytes
= TRANSPORT_PACKET_SIZE - numHeaderBytes - numPCRBytes - numDataBytes;
}
} else if (numBytesAvailable >= TRANSPORT_PACKET_SIZE - numHeaderBytes) {
// This is the common case
adaptation_field_control = 0x10;
numDataBytes = TRANSPORT_PACKET_SIZE - numHeaderBytes;
} else {
adaptation_field_control = 0x30;
++numHeaderBytes; // for the "adaptation_field_length"
// ASSERT: numBytesAvailable <= TRANSPORT_PACKET_SIZE - numHeaderBytes
numDataBytes = numBytesAvailable;
if (numDataBytes < TRANSPORT_PACKET_SIZE - numHeaderBytes) {
++numHeaderBytes; // for the adaptation field flags
numPaddingBytes = TRANSPORT_PACKET_SIZE - numHeaderBytes - numDataBytes;
}
}
// ASSERT: numHeaderBytes+numPCRBytes+numPaddingBytes+numDataBytes
// == TRANSPORT_PACKET_SIZE
// Fill in the header of the Transport Stream packet:
unsigned char* header = fTo;
*header++ = 0x47; // sync_byte
*header++ = (startPositionInBuffer == 0) ? 0x40 : 0x00;
// transport_error_indicator, payload_unit_start_indicator, transport_priority,
// first 5 bits of PID
*header++ = pid;
// last 8 bits of PID
unsigned& continuity_counter = fPIDState[pid].counter; // alias
*header++ = adaptation_field_control|(continuity_counter&0x0F);
// transport_scrambling_control, adaptation_field_control, continuity_counter
++continuity_counter;
if (adaptation_field_control == 0x30) {
// Add an adaptation field:
u_int8_t adaptation_field_length
= (numHeaderBytes == 5) ? 0 : 1 + numPCRBytes + numPaddingBytes;
*header++ = adaptation_field_length;
if (numHeaderBytes > 5) {
u_int8_t flags = willAddPCR ? 0x10 : 0x00;
if (fIsFirstAdaptationField) {
flags |= 0x80; // discontinuity_indicator
fIsFirstAdaptationField = False;
}
*header++ = flags;
if (willAddPCR) {
u_int32_t pcrHigh32Bits = (fPCR.highBit<<31) | (fPCR.remainingBits>>1);
u_int8_t pcrLowBit = fPCR.remainingBits&1;
u_int8_t extHighBit = (fPCR.extension&0x100)>>8;
*header++ = pcrHigh32Bits>>24;
*header++ = pcrHigh32Bits>>16;
*header++ = pcrHigh32Bits>>8;
*header++ = pcrHigh32Bits;
*header++ = (pcrLowBit<<7)|0x7E|extHighBit;
*header++ = (u_int8_t)fPCR.extension; // low 8 bits of extension
}
}
}
// Add any padding bytes:
for (unsigned i = 0; i < numPaddingBytes; ++i) *header++ = 0xFF;
// Finally, add the data bytes:
memmove(header, &buffer[startPositionInBuffer], numDataBytes);
startPositionInBuffer += numDataBytes;
}
}
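The first four bytes written in "deliverDataToClient()" above form the fixed Transport Stream packet header. Because that function only ever receives an 8-bit PID, the high five PID bits in the second byte are always zero there; a sketch of the general 13-bit packing (names hypothetical):

```cpp
#include <cassert>
#include <cstdint>

// Build the fixed 4-byte TS packet header: sync byte 0x47,
// payload_unit_start_indicator, 13-bit PID, adaptation_field_control,
// and the 4-bit continuity counter.
static void buildTSHeader(uint16_t pid13, bool payloadUnitStart,
                          uint8_t adaptationFieldControl, uint8_t continuityCounter,
                          uint8_t out[4]) {
  out[0] = 0x47;                                                    // sync_byte
  out[1] = (payloadUnitStart ? 0x40 : 0x00) | ((pid13 >> 8) & 0x1F); // PUSI + PID high bits
  out[2] = (uint8_t)(pid13 & 0xFF);                                 // PID low bits
  out[3] = adaptationFieldControl | (continuityCounter & 0x0F);
}
```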
#define PAT_PID 0
#ifndef OUR_PROGRAM_NUMBER
#define OUR_PROGRAM_NUMBER 1
#endif
#define OUR_PROGRAM_MAP_PID 0x30
void MPEG2TransportStreamMultiplexor::deliverPATPacket() {
// First, create a new buffer for the PAT packet:
unsigned const patSize = TRANSPORT_PACKET_SIZE - 4; // allow for the 4-byte header
unsigned char* patBuffer = new unsigned char[patSize];
// and fill it in:
unsigned char* pat = patBuffer;
*pat++ = 0; // pointer_field
*pat++ = 0; // table_id
*pat++ = 0xB0; // section_syntax_indicator; 0; reserved, section_length (high)
*pat++ = 13; // section_length (low)
*pat++ = 0; *pat++ = 1; // transport_stream_id
*pat++ = 0xC3; // reserved; version_number; current_next_indicator
*pat++ = 0; // section_number
*pat++ = 0; // last_section_number
*pat++ = OUR_PROGRAM_NUMBER>>8; *pat++ = OUR_PROGRAM_NUMBER; // program_number
*pat++ = 0xE0|(OUR_PROGRAM_MAP_PID>>8); // reserved; program_map_PID (high)
*pat++ = OUR_PROGRAM_MAP_PID; // program_map_PID (low)
// Compute the CRC from the bytes we currently have (not including "pointer_field"):
u_int32_t crc = calculateCRC(patBuffer+1, pat - (patBuffer+1));
*pat++ = crc>>24; *pat++ = crc>>16; *pat++ = crc>>8; *pat++ = crc;
// Fill in the rest of the packet with padding bytes:
while (pat < &patBuffer[patSize]) *pat++ = 0xFF;
// Deliver the packet:
unsigned startPosition = 0;
deliverDataToClient(PAT_PID, patBuffer, patSize, startPosition);
// Finally, remove the new buffer:
delete[] patBuffer;
}
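The CRC appended above is the CRC-32 used for PSI sections: polynomial 0x04C11DB7, initial value 0xFFFFFFFF, no bit reflection, no final XOR. "calculateCRC()" itself is defined elsewhere in the library; the standalone bitwise version below is an assumption about its behavior (a real implementation would typically be table-driven):

```cpp
#include <cassert>
#include <cstdint>

// CRC-32/MPEG-2, as used for PSI sections (PAT/PMT):
// poly 0x04C11DB7, init 0xFFFFFFFF, not reflected, no final XOR.
static uint32_t crc32Mpeg2(unsigned char const* data, unsigned len) {
  uint32_t crc = 0xFFFFFFFF;
  for (unsigned i = 0; i < len; ++i) {
    crc ^= (uint32_t)data[i] << 24;
    for (int bit = 0; bit < 8; ++bit) {
      crc = (crc & 0x80000000) ? (crc << 1) ^ 0x04C11DB7 : (crc << 1);
    }
  }
  return crc;
}
```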
void MPEG2TransportStreamMultiplexor::deliverPMTPacket(Boolean hasChanged) {
if (hasChanged) ++fProgramMapVersion;
// First, create a new buffer for the PMT packet:
unsigned const pmtSize = TRANSPORT_PACKET_SIZE - 4; // allow for the 4-byte header
unsigned char* pmtBuffer = new unsigned char[pmtSize];
// and fill it in:
unsigned char* pmt = pmtBuffer;
*pmt++ = 0; // pointer_field
*pmt++ = 2; // table_id
*pmt++ = 0xB0; // section_syntax_indicator; 0; reserved, section_length (high)
unsigned char* section_lengthPtr = pmt; // save for later
*pmt++ = 0; // section_length (low) (fill in later)
*pmt++ = OUR_PROGRAM_NUMBER>>8; *pmt++ = OUR_PROGRAM_NUMBER; // program_number
*pmt++ = 0xC1|((fProgramMapVersion&0x1F)<<1); // reserved; version_number; current_next_indicator
*pmt++ = 0; // section_number
*pmt++ = 0; // last_section_number
*pmt++ = 0xE0; // reserved; PCR_PID (high)
*pmt++ = fPCR_PID; // PCR_PID (low)
*pmt++ = 0xF0; // reserved; program_info_length (high)
*pmt++ = 0; // program_info_length (low)
for (int pid = 0; pid < PID_TABLE_SIZE; ++pid) {
if (fPIDState[pid].streamType != 0) {
// This PID gets recorded in the table
*pmt++ = fPIDState[pid].streamType;
*pmt++ = 0xE0; // reserved; elementary_pid (high)
*pmt++ = pid; // elementary_pid (low)
*pmt++ = 0xF0; // reserved; ES_info_length (high)
*pmt++ = 0; // ES_info_length (low)
}
}
unsigned section_length = pmt - (section_lengthPtr+1) + 4 /*for CRC*/;
*section_lengthPtr = section_length;
// Compute the CRC from the bytes we currently have (not including "pointer_field"):
u_int32_t crc = calculateCRC(pmtBuffer+1, pmt - (pmtBuffer+1));
*pmt++ = crc>>24; *pmt++ = crc>>16; *pmt++ = crc>>8; *pmt++ = crc;
// Fill in the rest of the packet with padding bytes:
while (pmt < &pmtBuffer[pmtSize]) *pmt++ = 0xFF;
// Deliver the packet:
unsigned startPosition = 0;
deliverDataToClient(OUR_PROGRAM_MAP_PID, pmtBuffer, pmtSize, startPosition);
// Finally, remove the new buffer:
delete[] pmtBuffer;
}
void MPEG2TransportStreamMultiplexor::setProgramStreamMap(unsigned frameSize) {
if (frameSize <= 16) return; // program_stream_map is too small to be useful
if (frameSize > 0xFF) return; // program_stream_map is too large
u_int16_t program_stream_map_length = (fInputBuffer[4]<<8) | fInputBuffer[5];
if ((u_int16_t)frameSize > 6+program_stream_map_length) {
frameSize = 6+program_stream_map_length;
}
u_int8_t versionByte = fInputBuffer[6];
if ((versionByte&0x80) == 0) return; // "current_next_indicator" is not set
fCurrentInputProgramMapVersion = versionByte&0x1F;
u_int16_t program_stream_info_length = (fInputBuffer[8]<<8) | fInputBuffer[9];
unsigned offset = 10 + program_stream_info_length; // skip over 'descriptors'
u_int16_t elementary_stream_map_length
= (fInputBuffer[offset]<<8) | fInputBuffer[offset+1];
offset += 2;
frameSize -= 4; // sizeof CRC_32
if (frameSize > offset + elementary_stream_map_length) {
frameSize = offset + elementary_stream_map_length;
}
while (offset + 4 <= frameSize) {
u_int8_t stream_type = fInputBuffer[offset];
u_int8_t elementary_stream_id = fInputBuffer[offset+1];
fPIDState[elementary_stream_id].streamType = stream_type;
u_int16_t elementary_stream_info_length
= (fInputBuffer[offset+2]<<8) | fInputBuffer[offset+3];
offset += 4 + elementary_stream_info_length;
}
}
static u_int32_t const CRC32[256] = {
0x00000000, 0x04c11db7, 0x09823b6e, 0x0d4326d9,
0x130476dc, 0x17c56b6b, 0x1a864db2, 0x1e475005,
0x2608edb8, 0x22c9f00f, 0x2f8ad6d6, 0x2b4bcb61,
0x350c9b64, 0x31cd86d3, 0x3c8ea00a, 0x384fbdbd,
0x4c11db70, 0x48d0c6c7, 0x4593e01e, 0x4152fda9,
0x5f15adac, 0x5bd4b01b, 0x569796c2, 0x52568b75,
0x6a1936c8, 0x6ed82b7f, 0x639b0da6, 0x675a1011,
0x791d4014, 0x7ddc5da3, 0x709f7b7a, 0x745e66cd,
0x9823b6e0, 0x9ce2ab57, 0x91a18d8e, 0x95609039,
0x8b27c03c, 0x8fe6dd8b, 0x82a5fb52, 0x8664e6e5,
0xbe2b5b58, 0xbaea46ef, 0xb7a96036, 0xb3687d81,
0xad2f2d84, 0xa9ee3033, 0xa4ad16ea, 0xa06c0b5d,
0xd4326d90, 0xd0f37027, 0xddb056fe, 0xd9714b49,
0xc7361b4c, 0xc3f706fb, 0xceb42022, 0xca753d95,
0xf23a8028, 0xf6fb9d9f, 0xfbb8bb46, 0xff79a6f1,
0xe13ef6f4, 0xe5ffeb43, 0xe8bccd9a, 0xec7dd02d,
0x34867077, 0x30476dc0, 0x3d044b19, 0x39c556ae,
0x278206ab, 0x23431b1c, 0x2e003dc5, 0x2ac12072,
0x128e9dcf, 0x164f8078, 0x1b0ca6a1, 0x1fcdbb16,
0x018aeb13, 0x054bf6a4, 0x0808d07d, 0x0cc9cdca,
0x7897ab07, 0x7c56b6b0, 0x71159069, 0x75d48dde,
0x6b93dddb, 0x6f52c06c, 0x6211e6b5, 0x66d0fb02,
0x5e9f46bf, 0x5a5e5b08, 0x571d7dd1, 0x53dc6066,
0x4d9b3063, 0x495a2dd4, 0x44190b0d, 0x40d816ba,
0xaca5c697, 0xa864db20, 0xa527fdf9, 0xa1e6e04e,
0xbfa1b04b, 0xbb60adfc, 0xb6238b25, 0xb2e29692,
0x8aad2b2f, 0x8e6c3698, 0x832f1041, 0x87ee0df6,
0x99a95df3, 0x9d684044, 0x902b669d, 0x94ea7b2a,
0xe0b41de7, 0xe4750050, 0xe9362689, 0xedf73b3e,
0xf3b06b3b, 0xf771768c, 0xfa325055, 0xfef34de2,
0xc6bcf05f, 0xc27dede8, 0xcf3ecb31, 0xcbffd686,
0xd5b88683, 0xd1799b34, 0xdc3abded, 0xd8fba05a,
0x690ce0ee, 0x6dcdfd59, 0x608edb80, 0x644fc637,
0x7a089632, 0x7ec98b85, 0x738aad5c, 0x774bb0eb,
0x4f040d56, 0x4bc510e1, 0x46863638, 0x42472b8f,
0x5c007b8a, 0x58c1663d, 0x558240e4, 0x51435d53,
0x251d3b9e, 0x21dc2629, 0x2c9f00f0, 0x285e1d47,
0x36194d42, 0x32d850f5, 0x3f9b762c, 0x3b5a6b9b,
0x0315d626, 0x07d4cb91, 0x0a97ed48, 0x0e56f0ff,
0x1011a0fa, 0x14d0bd4d, 0x19939b94, 0x1d528623,
0xf12f560e, 0xf5ee4bb9, 0xf8ad6d60, 0xfc6c70d7,
0xe22b20d2, 0xe6ea3d65, 0xeba91bbc, 0xef68060b,
0xd727bbb6, 0xd3e6a601, 0xdea580d8, 0xda649d6f,
0xc423cd6a, 0xc0e2d0dd, 0xcda1f604, 0xc960ebb3,
0xbd3e8d7e, 0xb9ff90c9, 0xb4bcb610, 0xb07daba7,
0xae3afba2, 0xaafbe615, 0xa7b8c0cc, 0xa379dd7b,
0x9b3660c6, 0x9ff77d71, 0x92b45ba8, 0x9675461f,
0x8832161a, 0x8cf30bad, 0x81b02d74, 0x857130c3,
0x5d8a9099, 0x594b8d2e, 0x5408abf7, 0x50c9b640,
0x4e8ee645, 0x4a4ffbf2, 0x470cdd2b, 0x43cdc09c,
0x7b827d21, 0x7f436096, 0x7200464f, 0x76c15bf8,
0x68860bfd, 0x6c47164a, 0x61043093, 0x65c52d24,
0x119b4be9, 0x155a565e, 0x18197087, 0x1cd86d30,
0x029f3d35, 0x065e2082, 0x0b1d065b, 0x0fdc1bec,
0x3793a651, 0x3352bbe6, 0x3e119d3f, 0x3ad08088,
0x2497d08d, 0x2056cd3a, 0x2d15ebe3, 0x29d4f654,
0xc5a92679, 0xc1683bce, 0xcc2b1d17, 0xc8ea00a0,
0xd6ad50a5, 0xd26c4d12, 0xdf2f6bcb, 0xdbee767c,
0xe3a1cbc1, 0xe760d676, 0xea23f0af, 0xeee2ed18,
0xf0a5bd1d, 0xf464a0aa, 0xf9278673, 0xfde69bc4,
0x89b8fd09, 0x8d79e0be, 0x803ac667, 0x84fbdbd0,
0x9abc8bd5, 0x9e7d9662, 0x933eb0bb, 0x97ffad0c,
0xafb010b1, 0xab710d06, 0xa6322bdf, 0xa2f33668,
0xbcb4666d, 0xb8757bda, 0xb5365d03, 0xb1f740b4
};
u_int32_t calculateCRC(u_int8_t const* data, unsigned dataLength, u_int32_t initialValue) {
u_int32_t crc = initialValue;
while (dataLength-- > 0) {
crc = (crc<<8) ^ CRC32[(crc>>24) ^ (u_int32_t)(*data++)];
}
return crc;
}
live/liveMedia/MPEG2TransportStreamTrickModeFilter.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A filter that converts a MPEG Transport Stream file - with corresponding index file
// - to a corresponding Video Elementary Stream. It also uses a "scale" parameter
// to implement 'trick mode' (fast forward or reverse play, using I-frames) on
// the video stream.
// Implementation
#include "MPEG2TransportStreamTrickModeFilter.hh"
#include <ByteStreamFileSource.hh>
// Define the following to be True if we want the output file to have the same frame rate as the original file.
// (Because the output file contains I-frames only, this means that each I-frame will appear in the output file
// several times, and therefore the output file's bitrate will be significantly higher than that of the original.)
// Define the following to be False if we want the output file to include each I-frame no more than once.
// (This means that - except for high 'scale' values - both the output frame rate and the output bit rate
// will be less than that of the original.)
#define KEEP_ORIGINAL_FRAME_RATE False
MPEG2TransportStreamTrickModeFilter* MPEG2TransportStreamTrickModeFilter
::createNew(UsageEnvironment& env, FramedSource* inputSource,
MPEG2TransportStreamIndexFile* indexFile, int scale) {
return new MPEG2TransportStreamTrickModeFilter(env, inputSource, indexFile, scale);
}
MPEG2TransportStreamTrickModeFilter
::MPEG2TransportStreamTrickModeFilter(UsageEnvironment& env, FramedSource* inputSource,
MPEG2TransportStreamIndexFile* indexFile, int scale)
: FramedFilter(env, inputSource),
fHaveStarted(False), fIndexFile(indexFile), fScale(scale), fDirection(1),
fState(SKIPPING_FRAME), fFrameCount(0),
fNextIndexRecordNum(0), fNextTSPacketNum(0),
fCurrentTSPacketNum((unsigned long)(-1)), fUseSavedFrameNextTime(False) {
if (fScale < 0) { // reverse play
fScale = -fScale;
fDirection = -1;
}
}
MPEG2TransportStreamTrickModeFilter::~MPEG2TransportStreamTrickModeFilter() {
}
Boolean MPEG2TransportStreamTrickModeFilter::seekTo(unsigned long tsPacketNumber,
unsigned long indexRecordNumber) {
seekToTransportPacket(tsPacketNumber);
fNextIndexRecordNum = indexRecordNumber;
return True;
}
#define isIFrameStart(type) ((type) == 0x81/*actually, a VSH*/ || (type) == 0x85/*actually, a SPS, for H.264*/ || (type) == 0x8B/*actually, a VPS, for H.265*/)
// This relies upon I-frames always being preceded by a VSH+GOP (for MPEG-2 data),
// by a SPS (for H.264 data), or by a VPS (for H.265 data)
#define isNonIFrameStart(type) ((type) == 0x83 || (type) == 0x88/*for H.264*/ || (type) == 0x8E/*for H.265*/)
void MPEG2TransportStreamTrickModeFilter::doGetNextFrame() {
// fprintf(stderr, "#####DGNF1\n");
// If our client's buffer size is too small, then deliver
// a 0-byte 'frame', to tell it to process all of the data that it has
// already read, before asking for more data from us:
if (fMaxSize < TRANSPORT_PACKET_SIZE) {
fFrameSize = 0;
afterGetting(this);
return;
}
while (1) {
// Get the next record from our index file.
// This tells us the type of frame this data is, which Transport Stream packet
// (from the input source) the data comes from, and where in the Transport Stream
// packet it comes from:
u_int8_t recordType;
float recordPCR;
Boolean endOfIndexFile = False;
if (!fIndexFile->readIndexRecordValues(fNextIndexRecordNum,
fDesiredTSPacketNum, fDesiredDataOffset,
fDesiredDataSize, recordPCR,
recordType)) {
// We ran off the end of the index file. If we're not delivering a
// pre-saved frame, then handle this the same way as if the
// input Transport Stream source ended.
if (fState != DELIVERING_SAVED_FRAME) {
onSourceClosure1();
return;
}
endOfIndexFile = True;
} else if (!fHaveStarted) {
fFirstPCR = recordPCR;
fHaveStarted = True;
}
// fprintf(stderr, "#####read index record %ld: ts %ld: %c, PCR %f\n", fNextIndexRecordNum, fDesiredTSPacketNum, isIFrameStart(recordType) ? 'I' : isNonIFrameStart(recordType) ? 'j' : 'x', recordPCR);
fNextIndexRecordNum
+= (fState == DELIVERING_SAVED_FRAME) ? 1 : fDirection;
// Handle this index record, depending on the record type and our current state:
switch (fState) {
case SKIPPING_FRAME:
case SAVING_AND_DELIVERING_FRAME: {
// if (fState == SKIPPING_FRAME) fprintf(stderr, "\tSKIPPING_FRAME\n"); else fprintf(stderr, "\tSAVING_AND_DELIVERING_FRAME\n");//#####
if (isIFrameStart(recordType)) {
// Save a record of this frame:
fSavedFrameIndexRecordStart = fNextIndexRecordNum - fDirection;
fUseSavedFrameNextTime = True;
// fprintf(stderr, "\trecording\n");//#####
if ((fFrameCount++)%fScale == 0 && fUseSavedFrameNextTime) {
// A frame is due now.
fFrameCount = 1; // reset to avoid overflow
if (fDirection > 0) {
// Begin delivering this frame, as we're scanning it:
fState = SAVING_AND_DELIVERING_FRAME;
// fprintf(stderr, "\tdelivering\n");//#####
fDesiredDataPCR = recordPCR; // use this frame's PCR
attemptDeliveryToClient();
return;
} else {
// Deliver this frame, then resume normal scanning:
// (This relies on the index records having begun with an I-frame.)
fState = DELIVERING_SAVED_FRAME;
fSavedSequentialIndexRecordNum = fNextIndexRecordNum;
fDesiredDataPCR = recordPCR;
// use this frame's (not the saved frame's) PCR
fNextIndexRecordNum = fSavedFrameIndexRecordStart;
// fprintf(stderr, "\tbeginning delivery of saved frame\n");//#####
}
} else {
// No frame is needed now:
fState = SKIPPING_FRAME;
}
} else if (isNonIFrameStart(recordType)) {
if ((fFrameCount++)%fScale == 0 && fUseSavedFrameNextTime) {
// A frame is due now, so begin delivering the one that we had saved:
// (This relies on the index records having begun with an I-frame.)
fFrameCount = 1; // reset to avoid overflow
fState = DELIVERING_SAVED_FRAME;
fSavedSequentialIndexRecordNum = fNextIndexRecordNum;
fDesiredDataPCR = recordPCR;
// use this frame's (not the saved frame's) PCR
fNextIndexRecordNum = fSavedFrameIndexRecordStart;
// fprintf(stderr, "\tbeginning delivery of saved frame\n");//#####
} else {
// No frame is needed now:
fState = SKIPPING_FRAME;
}
} else {
// Not the start of a frame, but deliver it, if it's needed:
if (fState == SAVING_AND_DELIVERING_FRAME) {
// fprintf(stderr, "\tdelivering\n");//#####
fDesiredDataPCR = recordPCR; // use this frame's PCR
attemptDeliveryToClient();
return;
}
}
break;
}
case DELIVERING_SAVED_FRAME: {
// fprintf(stderr, "\tDELIVERING_SAVED_FRAME\n");//#####
if (endOfIndexFile
|| (isIFrameStart(recordType)
&& fNextIndexRecordNum-1 != fSavedFrameIndexRecordStart)
|| isNonIFrameStart(recordType)) {
// fprintf(stderr, "\tended delivery of saved frame\n");//#####
// We've reached the end of the saved frame, so revert to the
// original sequence of index records:
fNextIndexRecordNum = fSavedSequentialIndexRecordNum;
fUseSavedFrameNextTime = KEEP_ORIGINAL_FRAME_RATE;
fState = SKIPPING_FRAME;
} else {
// Continue delivering:
// fprintf(stderr, "\tdelivering\n");//#####
attemptDeliveryToClient();
return;
}
break;
}
}
}
}
void MPEG2TransportStreamTrickModeFilter::doStopGettingFrames() {
FramedFilter::doStopGettingFrames();
fIndexFile->stopReading();
}
void MPEG2TransportStreamTrickModeFilter::attemptDeliveryToClient() {
if (fCurrentTSPacketNum == fDesiredTSPacketNum) {
// fprintf(stderr, "\t\tdelivering ts %d:%d, %d bytes, PCR %f\n", fCurrentTSPacketNum, fDesiredDataOffset, fDesiredDataSize, fDesiredDataPCR);//#####
// We already have the Transport Packet that we want. Deliver its data:
memmove(fTo, &fInputBuffer[fDesiredDataOffset], fDesiredDataSize);
fFrameSize = fDesiredDataSize;
float deliveryPCR = fDirection*(fDesiredDataPCR - fFirstPCR)/fScale;
if (deliveryPCR < 0.0) deliveryPCR = 0.0;
fPresentationTime.tv_sec = (unsigned long)deliveryPCR;
fPresentationTime.tv_usec
= (unsigned long)((deliveryPCR - fPresentationTime.tv_sec)*1000000.0f);
// fprintf(stderr, "#####DGNF9\n");
afterGetting(this);
} else {
// Arrange to read the Transport Packet that we want:
readTransportPacket(fDesiredTSPacketNum);
}
}
void MPEG2TransportStreamTrickModeFilter::seekToTransportPacket(unsigned long tsPacketNum) {
if (tsPacketNum == fNextTSPacketNum) return; // we're already there
ByteStreamFileSource* tsFile = (ByteStreamFileSource*)fInputSource;
u_int64_t tsPacketNum64 = (u_int64_t)tsPacketNum;
tsFile->seekToByteAbsolute(tsPacketNum64*TRANSPORT_PACKET_SIZE);
fNextTSPacketNum = tsPacketNum;
}
void MPEG2TransportStreamTrickModeFilter::readTransportPacket(unsigned long tsPacketNum) {
seekToTransportPacket(tsPacketNum);
fInputSource->getNextFrame(fInputBuffer, TRANSPORT_PACKET_SIZE,
afterGettingFrame, this,
onSourceClosure, this);
}
void MPEG2TransportStreamTrickModeFilter
::afterGettingFrame(void* clientData, unsigned frameSize,
unsigned /*numTruncatedBytes*/,
struct timeval presentationTime,
unsigned /*durationInMicroseconds*/) {
MPEG2TransportStreamTrickModeFilter* filter = (MPEG2TransportStreamTrickModeFilter*)clientData;
filter->afterGettingFrame1(frameSize);
}
void MPEG2TransportStreamTrickModeFilter::afterGettingFrame1(unsigned frameSize) {
if (frameSize != TRANSPORT_PACKET_SIZE) {
// Treat this as if the input source ended:
onSourceClosure1();
return;
}
fCurrentTSPacketNum = fNextTSPacketNum; // i.e., the one that we just read
++fNextTSPacketNum;
// Attempt delivery again:
attemptDeliveryToClient();
}
void MPEG2TransportStreamTrickModeFilter::onSourceClosure(void* clientData) {
MPEG2TransportStreamTrickModeFilter* filter = (MPEG2TransportStreamTrickModeFilter*)clientData;
filter->onSourceClosure1();
}
void MPEG2TransportStreamTrickModeFilter::onSourceClosure1() {
fIndexFile->stopReading();
handleClosure();
}
live/liveMedia/MPEG2TransportUDPServerMediaSubsession.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A 'ServerMediaSubsession' object that creates new, unicast, "RTPSink"s
// on demand, from an incoming UDP (or RTP/UDP) MPEG-2 Transport Stream
// Implementation
#include "MPEG2TransportUDPServerMediaSubsession.hh"
#include "BasicUDPSource.hh"
#include "SimpleRTPSource.hh"
#include "MPEG2TransportStreamFramer.hh"
#include "SimpleRTPSink.hh"
#include "GroupsockHelper.hh"
MPEG2TransportUDPServerMediaSubsession*
MPEG2TransportUDPServerMediaSubsession::createNew(UsageEnvironment& env,
char const* inputAddressStr, Port const& inputPort, Boolean inputStreamIsRawUDP) {
return new MPEG2TransportUDPServerMediaSubsession(env, inputAddressStr, inputPort, inputStreamIsRawUDP);
}
MPEG2TransportUDPServerMediaSubsession
::MPEG2TransportUDPServerMediaSubsession(UsageEnvironment& env,
char const* inputAddressStr, Port const& inputPort, Boolean inputStreamIsRawUDP)
: OnDemandServerMediaSubsession(env, True/*reuseFirstSource*/),
fInputPort(inputPort), fInputGroupsock(NULL), fInputStreamIsRawUDP(inputStreamIsRawUDP) {
fInputAddressStr = strDup(inputAddressStr);
}
MPEG2TransportUDPServerMediaSubsession::
~MPEG2TransportUDPServerMediaSubsession() {
delete fInputGroupsock;
delete[] (char*)fInputAddressStr;
}
FramedSource* MPEG2TransportUDPServerMediaSubsession
::createNewStreamSource(unsigned/* clientSessionId*/, unsigned& estBitrate) {
estBitrate = 5000; // kbps, estimate
if (fInputGroupsock == NULL) {
// Create a 'groupsock' object for receiving the input stream:
struct in_addr inputAddress;
inputAddress.s_addr = fInputAddressStr == NULL ? 0 : our_inet_addr(fInputAddressStr);
fInputGroupsock = new Groupsock(envir(), inputAddress, fInputPort, 255);
}
FramedSource* transportStreamSource;
if (fInputStreamIsRawUDP) {
transportStreamSource = BasicUDPSource::createNew(envir(), fInputGroupsock);
} else {
transportStreamSource = SimpleRTPSource::createNew(envir(), fInputGroupsock, 33, 90000, "video/MP2T", 0, False /*no 'M' bit*/);
}
return MPEG2TransportStreamFramer::createNew(envir(), transportStreamSource);
}
RTPSink* MPEG2TransportUDPServerMediaSubsession
::createNewRTPSink(Groupsock* rtpGroupsock, unsigned char /*rtpPayloadTypeIfDynamic*/, FramedSource* /*inputSource*/) {
return SimpleRTPSink::createNew(envir(), rtpGroupsock,
33, 90000, "video", "MP2T",
1, True, False /*no 'M' bit*/);
}
live/liveMedia/MPEG4ESVideoRTPSource.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// MP4V-ES video RTP stream sources
// Implementation
#include "MPEG4ESVideoRTPSource.hh"
///////// MPEG4ESVideoRTPSource implementation ////////
//##### NOTE: INCOMPLETE!!! #####
MPEG4ESVideoRTPSource*
MPEG4ESVideoRTPSource::createNew(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency) {
return new MPEG4ESVideoRTPSource(env, RTPgs, rtpPayloadFormat,
rtpTimestampFrequency);
}
MPEG4ESVideoRTPSource
::MPEG4ESVideoRTPSource(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency)
: MultiFramedRTPSource(env, RTPgs,
rtpPayloadFormat, rtpTimestampFrequency) {
}
MPEG4ESVideoRTPSource::~MPEG4ESVideoRTPSource() {
}
Boolean MPEG4ESVideoRTPSource
::processSpecialHeader(BufferedPacket* packet,
unsigned& resultSpecialHeaderSize) {
// The packet begins a frame iff its data begins with a system code
// (i.e., 0x000001??)
fCurrentPacketBeginsFrame
= packet->dataSize() >= 4 && (packet->data())[0] == 0
&& (packet->data())[1] == 0 && (packet->data())[2] == 1;
// The RTP "M" (marker) bit indicates the last fragment of a frame:
fCurrentPacketCompletesFrame = packet->rtpMarkerBit();
// There is no special header
resultSpecialHeaderSize = 0;
return True;
}
char const* MPEG4ESVideoRTPSource::MIMEtype() const {
return "video/MP4V-ES";
}
live/liveMedia/MPEG4GenericRTPSink.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// MPEG4-GENERIC ("audio", "video", or "application") RTP stream sinks
// Implementation
#include "MPEG4GenericRTPSink.hh"
#include "Locale.hh"
#include <ctype.h> // needed on some systems to define "tolower()"
MPEG4GenericRTPSink
::MPEG4GenericRTPSink(UsageEnvironment& env, Groupsock* RTPgs,
u_int8_t rtpPayloadFormat,
u_int32_t rtpTimestampFrequency,
char const* sdpMediaTypeString,
char const* mpeg4Mode, char const* configString,
unsigned numChannels)
: MultiFramedRTPSink(env, RTPgs, rtpPayloadFormat,
rtpTimestampFrequency, "MPEG4-GENERIC", numChannels),
fSDPMediaTypeString(strDup(sdpMediaTypeString)),
fMPEG4Mode(strDup(mpeg4Mode)), fConfigString(strDup(configString)) {
// Check whether "mpeg4Mode" is one that we handle:
if (mpeg4Mode == NULL) {
env << "MPEG4GenericRTPSink error: NULL \"mpeg4Mode\" parameter\n";
} else {
// To ease comparison, convert "mpeg4Mode" to lower case:
size_t const len = strlen(mpeg4Mode) + 1;
char* m = new char[len];
Locale l("POSIX");
for (size_t i = 0; i < len; ++i) m[i] = tolower(mpeg4Mode[i]);
if (strcmp(m, "aac-hbr") != 0) {
env << "MPEG4GenericRTPSink error: Unknown \"mpeg4Mode\" parameter: \"" << mpeg4Mode << "\"\n";
}
delete[] m;
}
// Set up the "a=fmtp:" SDP line for this stream:
char const* fmtpFmt =
"a=fmtp:%d "
"streamtype=%d;profile-level-id=1;"
"mode=%s;sizelength=13;indexlength=3;indexdeltalength=3;"
"config=%s\r\n";
unsigned fmtpFmtSize = strlen(fmtpFmt)
+ 3 /* max char len */
+ 3 /* max char len */
+ strlen(fMPEG4Mode)
+ strlen(fConfigString);
char* fmtp = new char[fmtpFmtSize];
sprintf(fmtp, fmtpFmt,
rtpPayloadType(),
strcmp(fSDPMediaTypeString, "video") == 0 ? 4 : 5,
fMPEG4Mode,
fConfigString);
fFmtpSDPLine = strDup(fmtp);
delete[] fmtp;
}
MPEG4GenericRTPSink::~MPEG4GenericRTPSink() {
delete[] fFmtpSDPLine;
delete[] (char*)fConfigString;
delete[] (char*)fMPEG4Mode;
delete[] (char*)fSDPMediaTypeString;
}
MPEG4GenericRTPSink*
MPEG4GenericRTPSink::createNew(UsageEnvironment& env, Groupsock* RTPgs,
u_int8_t rtpPayloadFormat,
u_int32_t rtpTimestampFrequency,
char const* sdpMediaTypeString,
char const* mpeg4Mode,
char const* configString, unsigned numChannels) {
return new MPEG4GenericRTPSink(env, RTPgs, rtpPayloadFormat,
rtpTimestampFrequency,
sdpMediaTypeString, mpeg4Mode,
configString, numChannels);
}
Boolean MPEG4GenericRTPSink
::frameCanAppearAfterPacketStart(unsigned char const* /*frameStart*/,
unsigned /*numBytesInFrame*/) const {
// (For now) allow at most 1 frame in a single packet:
return False;
}
void MPEG4GenericRTPSink
::doSpecialFrameHandling(unsigned fragmentationOffset,
unsigned char* frameStart,
unsigned numBytesInFrame,
struct timeval framePresentationTime,
unsigned numRemainingBytes) {
// Set the "AU Header Section". This is 4 bytes: 2 bytes for the
// initial "AU-headers-length" field, and 2 bytes for the first
// (and only) "AU Header":
unsigned fullFrameSize
= fragmentationOffset + numBytesInFrame + numRemainingBytes;
unsigned char headers[4];
headers[0] = 0; headers[1] = 16 /* bits */; // AU-headers-length
headers[2] = fullFrameSize >> 5; headers[3] = (fullFrameSize&0x1F)<<3;
setSpecialHeaderBytes(headers, sizeof headers);
if (numRemainingBytes == 0) {
// This packet contains the last (or only) fragment of the frame.
// Set the RTP 'M' ('marker') bit:
setMarkerBit();
}
// Important: Also call our base class's doSpecialFrameHandling(),
// to set the packet's timestamp:
MultiFramedRTPSink::doSpecialFrameHandling(fragmentationOffset,
frameStart, numBytesInFrame,
framePresentationTime,
numRemainingBytes);
}
unsigned MPEG4GenericRTPSink::specialHeaderSize() const {
return 2 + 2;
}
char const* MPEG4GenericRTPSink::sdpMediaType() const {
return fSDPMediaTypeString;
}
char const* MPEG4GenericRTPSink::auxSDPLine() {
return fFmtpSDPLine;
}
live/liveMedia/MPEG4GenericRTPSource.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// MPEG4-GENERIC ("audio", "video", or "application") RTP stream sources
// Implementation
#include "MPEG4GenericRTPSource.hh"
#include "BitVector.hh"
#include "MPEG4LATMAudioRTPSource.hh" // for parseGeneralConfigStr()
////////// MPEG4GenericBufferedPacket and MPEG4GenericBufferedPacketFactory
class MPEG4GenericBufferedPacket: public BufferedPacket {
public:
MPEG4GenericBufferedPacket(MPEG4GenericRTPSource* ourSource);
virtual ~MPEG4GenericBufferedPacket();
private: // redefined virtual functions
virtual unsigned nextEnclosedFrameSize(unsigned char*& framePtr,
unsigned dataSize);
private:
MPEG4GenericRTPSource* fOurSource;
};
class MPEG4GenericBufferedPacketFactory: public BufferedPacketFactory {
private: // redefined virtual functions
virtual BufferedPacket* createNewPacket(MultiFramedRTPSource* ourSource);
};
////////// AUHeader //////////
struct AUHeader {
unsigned size;
unsigned index; // indexDelta for the 2nd & subsequent headers
};
///////// MPEG4GenericRTPSource implementation ////////
//##### NOTE: INCOMPLETE!!! Support more modes, and interleaving #####
MPEG4GenericRTPSource*
MPEG4GenericRTPSource::createNew(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency,
char const* mediumName,
char const* mode,
unsigned sizeLength, unsigned indexLength,
unsigned indexDeltaLength
) {
return new MPEG4GenericRTPSource(env, RTPgs, rtpPayloadFormat,
rtpTimestampFrequency, mediumName,
mode, sizeLength, indexLength,
indexDeltaLength
);
}
MPEG4GenericRTPSource
::MPEG4GenericRTPSource(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency,
char const* mediumName,
char const* mode,
unsigned sizeLength, unsigned indexLength,
unsigned indexDeltaLength
)
: MultiFramedRTPSource(env, RTPgs,
rtpPayloadFormat, rtpTimestampFrequency,
new MPEG4GenericBufferedPacketFactory),
fSizeLength(sizeLength), fIndexLength(indexLength),
fIndexDeltaLength(indexDeltaLength),
fNumAUHeaders(0), fNextAUHeader(0), fAUHeaders(NULL) {
unsigned mimeTypeLength =
strlen(mediumName) + 14 /* strlen("/MPEG4-GENERIC") */ + 1;
fMIMEType = new char[mimeTypeLength];
if (fMIMEType != NULL) {
sprintf(fMIMEType, "%s/MPEG4-GENERIC", mediumName);
}
fMode = strDup(mode);
// Check for a "mode" that we don't yet support: //#####
if (mode == NULL ||
(strcmp(mode, "aac-hbr") != 0 && strcmp(mode, "generic") != 0)) {
envir() << "MPEG4GenericRTPSource Warning: Unknown or unsupported \"mode\": "
<< mode << "\n";
}
}
MPEG4GenericRTPSource::~MPEG4GenericRTPSource() {
delete[] fAUHeaders;
delete[] fMode;
delete[] fMIMEType;
}
Boolean MPEG4GenericRTPSource
::processSpecialHeader(BufferedPacket* packet,
unsigned& resultSpecialHeaderSize) {
unsigned char* headerStart = packet->data();
unsigned packetSize = packet->dataSize();
fCurrentPacketBeginsFrame = fCurrentPacketCompletesFrame;
// whether the *previous* packet ended a frame
// The RTP "M" (marker) bit indicates the last fragment of a frame:
fCurrentPacketCompletesFrame = packet->rtpMarkerBit();
// default values:
resultSpecialHeaderSize = 0;
fNumAUHeaders = 0;
fNextAUHeader = 0;
delete[] fAUHeaders; fAUHeaders = NULL;
if (fSizeLength > 0) {
// The packet begins with a "AU Header Section". Parse it, to
// determine the "AU-header"s for each frame present in this packet:
resultSpecialHeaderSize += 2;
if (packetSize < resultSpecialHeaderSize) return False;
unsigned AU_headers_length = (headerStart[0]<<8)|headerStart[1];
unsigned AU_headers_length_bytes = (AU_headers_length+7)/8;
if (packetSize
< resultSpecialHeaderSize + AU_headers_length_bytes) return False;
resultSpecialHeaderSize += AU_headers_length_bytes;
// Figure out how many AU-headers are present in the packet:
int bitsAvail = AU_headers_length - (fSizeLength + fIndexLength);
if (bitsAvail >= 0 && (fSizeLength + fIndexDeltaLength) > 0) {
fNumAUHeaders = 1 + bitsAvail/(fSizeLength + fIndexDeltaLength);
}
if (fNumAUHeaders > 0) {
fAUHeaders = new AUHeader[fNumAUHeaders];
// Fill in each header:
BitVector bv(&headerStart[2], 0, AU_headers_length);
fAUHeaders[0].size = bv.getBits(fSizeLength);
fAUHeaders[0].index = bv.getBits(fIndexLength);
for (unsigned i = 1; i < fNumAUHeaders; ++i) {
fAUHeaders[i].size = bv.getBits(fSizeLength);
fAUHeaders[i].index = bv.getBits(fIndexDeltaLength);
}
}
}
return True;
}
char const* MPEG4GenericRTPSource::MIMEtype() const {
return fMIMEType;
}
////////// MPEG4GenericBufferedPacket
////////// and MPEG4GenericBufferedPacketFactory implementation
MPEG4GenericBufferedPacket
::MPEG4GenericBufferedPacket(MPEG4GenericRTPSource* ourSource)
: fOurSource(ourSource) {
}
MPEG4GenericBufferedPacket::~MPEG4GenericBufferedPacket() {
}
unsigned MPEG4GenericBufferedPacket
::nextEnclosedFrameSize(unsigned char*& /*framePtr*/, unsigned dataSize) {
// WE CURRENTLY DON'T IMPLEMENT INTERLEAVING. FIX THIS! #####
AUHeader* auHeader = fOurSource->fAUHeaders;
if (auHeader == NULL) return dataSize;
unsigned numAUHeaders = fOurSource->fNumAUHeaders;
if (fOurSource->fNextAUHeader >= numAUHeaders) {
fOurSource->envir() << "MPEG4GenericBufferedPacket::nextEnclosedFrameSize("
<< dataSize << "): data error ("
<< auHeader << "," << fOurSource->fNextAUHeader
<< "," << numAUHeaders << ")!\n";
return dataSize;
}
auHeader = &auHeader[fOurSource->fNextAUHeader++];
return auHeader->size <= dataSize ? auHeader->size : dataSize;
}
BufferedPacket* MPEG4GenericBufferedPacketFactory
::createNewPacket(MultiFramedRTPSource* ourSource) {
return new MPEG4GenericBufferedPacket((MPEG4GenericRTPSource*)ourSource);
}
////////// samplingFrequencyFromAudioSpecificConfig() implementation //////////
static unsigned const samplingFrequencyFromIndex[16] = {
96000, 88200, 64000, 48000, 44100, 32000, 24000, 22050,
16000, 12000, 11025, 8000, 7350, 0, 0, 0
};
unsigned samplingFrequencyFromAudioSpecificConfig(char const* configStr) {
unsigned char* config = NULL;
unsigned result = 0; // if returned, indicates an error
do {
// Begin by parsing the config string:
unsigned configSize;
config = parseGeneralConfigStr(configStr, configSize);
if (config == NULL) break;
if (configSize < 2) break;
unsigned char samplingFrequencyIndex = ((config[0]&0x07)<<1) | (config[1]>>7);
if (samplingFrequencyIndex < 15) {
result = samplingFrequencyFromIndex[samplingFrequencyIndex];
break;
}
// Index == 15 means that the actual frequency is next (24 bits):
if (configSize < 5) break;
result = ((config[1]&0x7F)<<17) | (config[2]<<9) | (config[3]<<1) | (config[4]>>7);
} while (0);
delete[] config;
return result;
}
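The index extraction above is easy to get wrong because the 4-bit samplingFrequencyIndex straddles a byte boundary (the audioObjectType takes the first 5 bits, so the index is the 3 low bits of byte 0 plus the top bit of byte 1). A self-contained sketch of the same logic, operating on an already-parsed byte array rather than the hex config string:

```cpp
#include <assert.h>
#include <stddef.h>

// Sketch of samplingFrequencyFromAudioSpecificConfig(), working from raw
// AudioSpecificConfig bytes.  The table is the same 16-entry index table
// used above.
static unsigned const freqFromIndex[16] = {
  96000, 88200, 64000, 48000, 44100, 32000, 24000, 22050,
  16000, 12000, 11025, 8000, 7350, 0, 0, 0
};

unsigned samplingFrequencyFromASC(unsigned char const* config, size_t size) {
  if (config == NULL || size < 2) return 0;
  // 3 low bits of byte 0, followed by the top bit of byte 1:
  unsigned char index = ((config[0]&0x07)<<1) | (config[1]>>7);
  if (index < 15) return freqFromIndex[index];
  // index == 15 means an explicit 24-bit frequency follows (needs >= 5 bytes):
  if (size < 5) return 0;
  return ((config[1]&0x7F)<<17) | (config[2]<<9) | (config[3]<<1) | (config[4]>>7);
}
```

For example, the common AAC-LC stereo 44.1 kHz config bytes 0x12 0x10 decode to index 4, hence 44100.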
live/liveMedia/MPEG4LATMAudioRTPSink.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// RTP sink for MPEG-4 audio, using LATM multiplexing (RFC 3016)
// Implementation
#include "MPEG4LATMAudioRTPSink.hh"
MPEG4LATMAudioRTPSink
::MPEG4LATMAudioRTPSink(UsageEnvironment& env, Groupsock* RTPgs,
u_int8_t rtpPayloadFormat,
u_int32_t rtpTimestampFrequency,
char const* streamMuxConfigString,
unsigned numChannels,
Boolean allowMultipleFramesPerPacket)
: AudioRTPSink(env, RTPgs, rtpPayloadFormat,
rtpTimestampFrequency, "MP4A-LATM", numChannels),
fStreamMuxConfigString(strDup(streamMuxConfigString)),
fAllowMultipleFramesPerPacket(allowMultipleFramesPerPacket) {
// Set up the "a=fmtp:" SDP line for this stream:
char const* fmtpFmt =
"a=fmtp:%d "
"cpresent=0;config=%s\r\n";
unsigned fmtpFmtSize = strlen(fmtpFmt)
+ 3 /* max char len */
+ strlen(fStreamMuxConfigString);
char* fmtp = new char[fmtpFmtSize];
sprintf(fmtp, fmtpFmt,
rtpPayloadType(),
fStreamMuxConfigString);
fFmtpSDPLine = strDup(fmtp);
delete[] fmtp;
}
MPEG4LATMAudioRTPSink::~MPEG4LATMAudioRTPSink() {
delete[] fFmtpSDPLine;
delete[] (char*)fStreamMuxConfigString;
}
MPEG4LATMAudioRTPSink*
MPEG4LATMAudioRTPSink::createNew(UsageEnvironment& env, Groupsock* RTPgs,
u_int8_t rtpPayloadFormat,
u_int32_t rtpTimestampFrequency,
char const* streamMuxConfigString,
unsigned numChannels,
Boolean allowMultipleFramesPerPacket) {
return new MPEG4LATMAudioRTPSink(env, RTPgs, rtpPayloadFormat,
rtpTimestampFrequency, streamMuxConfigString,
numChannels,
allowMultipleFramesPerPacket);
}
Boolean MPEG4LATMAudioRTPSink
::frameCanAppearAfterPacketStart(unsigned char const* /*frameStart*/,
unsigned /*numBytesInFrame*/) const {
return fAllowMultipleFramesPerPacket;
}
void MPEG4LATMAudioRTPSink
::doSpecialFrameHandling(unsigned fragmentationOffset,
unsigned char* frameStart,
unsigned numBytesInFrame,
struct timeval framePresentationTime,
unsigned numRemainingBytes) {
if (numRemainingBytes == 0) {
// This packet contains the last (or only) fragment of the frame.
// Set the RTP 'M' ('marker') bit:
setMarkerBit();
}
// Important: Also call our base class's doSpecialFrameHandling(),
// to set the packet's timestamp:
MultiFramedRTPSink::doSpecialFrameHandling(fragmentationOffset,
frameStart, numBytesInFrame,
framePresentationTime,
numRemainingBytes);
}
char const* MPEG4LATMAudioRTPSink::auxSDPLine() {
return fFmtpSDPLine;
}
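The constructor above sizes its buffer from the format string plus 3 digits for the payload type (RTP payload types fit in 7 bits, so at most 3 decimal digits). A hypothetical standalone helper sketching the same "a=fmtp:" line construction; `makeLATMFmtpLine` is illustrative, not part of the class:

```cpp
#include <assert.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

// Sketch of the "a=fmtp:" SDP line built in the MPEG4LATMAudioRTPSink
// constructor.  "cpresent=0" signals that the StreamMuxConfig is carried
// out-of-band in the hex "config" parameter, not in-band in the LATM stream.
char* makeLATMFmtpLine(unsigned char payloadType, char const* configHex) {
  char const* fmt = "a=fmtp:%d cpresent=0;config=%s\r\n";
  size_t size = strlen(fmt)
    + 3 /* max decimal digits in a 7-bit payload type */
    + strlen(configHex) + 1;
  char* line = (char*)malloc(size);
  snprintf(line, size, fmt, payloadType, configHex);
  return line; // caller frees
}
```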
live/liveMedia/MPEG4LATMAudioRTPSource.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// MPEG-4 audio, using LATM multiplexing
// Implementation
#include "MPEG4LATMAudioRTPSource.hh"
////////// LATMBufferedPacket and LATMBufferedPacketFactory //////////
class LATMBufferedPacket: public BufferedPacket {
public:
LATMBufferedPacket(Boolean includeLATMDataLengthField);
virtual ~LATMBufferedPacket();
private: // redefined virtual functions
virtual unsigned nextEnclosedFrameSize(unsigned char*& framePtr,
unsigned dataSize);
private:
Boolean fIncludeLATMDataLengthField;
};
class LATMBufferedPacketFactory: public BufferedPacketFactory {
private: // redefined virtual functions
virtual BufferedPacket* createNewPacket(MultiFramedRTPSource* ourSource);
};
////////// MPEG4LATMAudioRTPSource implementation //////////
MPEG4LATMAudioRTPSource*
MPEG4LATMAudioRTPSource::createNew(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency) {
return new MPEG4LATMAudioRTPSource(env, RTPgs, rtpPayloadFormat,
rtpTimestampFrequency);
}
MPEG4LATMAudioRTPSource
::MPEG4LATMAudioRTPSource(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency)
: MultiFramedRTPSource(env, RTPgs,
rtpPayloadFormat, rtpTimestampFrequency,
new LATMBufferedPacketFactory),
fIncludeLATMDataLengthField(True) {
}
MPEG4LATMAudioRTPSource::~MPEG4LATMAudioRTPSource() {
}
void MPEG4LATMAudioRTPSource::omitLATMDataLengthField() {
fIncludeLATMDataLengthField = False;
}
Boolean MPEG4LATMAudioRTPSource
::processSpecialHeader(BufferedPacket* packet,
unsigned& resultSpecialHeaderSize) {
fCurrentPacketBeginsFrame = fCurrentPacketCompletesFrame;
// whether the *previous* packet ended a frame
// The RTP "M" (marker) bit indicates the last fragment of a frame:
fCurrentPacketCompletesFrame = packet->rtpMarkerBit();
// There is no special header
resultSpecialHeaderSize = 0;
return True;
}
char const* MPEG4LATMAudioRTPSource::MIMEtype() const {
return "audio/MP4A-LATM";
}
////////// LATMBufferedPacket and LATMBufferedPacketFactory implementation
LATMBufferedPacket::LATMBufferedPacket(Boolean includeLATMDataLengthField)
: fIncludeLATMDataLengthField(includeLATMDataLengthField) {
}
LATMBufferedPacket::~LATMBufferedPacket() {
}
unsigned LATMBufferedPacket
::nextEnclosedFrameSize(unsigned char*& framePtr, unsigned dataSize) {
// Look at the LATM data length byte(s), to determine the size
// of the LATM payload.
unsigned resultFrameSize = 0;
unsigned i;
for (i = 0; i < dataSize; ++i) {
resultFrameSize += framePtr[i];
if (framePtr[i] != 0xFF) break;
}
++i;
if (fIncludeLATMDataLengthField) {
resultFrameSize += i;
} else {
framePtr += i;
dataSize -= i;
}
return (resultFrameSize <= dataSize) ? resultFrameSize : dataSize;
}
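The loop above decodes the LATM PayloadLengthInfo: the payload length is the sum of the leading length bytes, where every 0xFF byte means "add 255 and keep reading" and the first non-0xFF byte terminates the field. A standalone sketch of just that decoding step:

```cpp
#include <assert.h>

// Sketch of the LATM PayloadLengthInfo decoding in nextEnclosedFrameSize():
// sum bytes until the first byte that is not 0xFF, which ends the field.
unsigned latmPayloadLength(unsigned char const* p, unsigned dataSize,
                           unsigned& lengthFieldBytes) {
  unsigned result = 0, i;
  for (i = 0; i < dataSize; ++i) {
    result += p[i];
    if (p[i] != 0xFF) break;
  }
  lengthFieldBytes = i + 1; // number of bytes consumed by the length field
  return result;
}
```

So a single length byte 0x05 means a 5-byte payload, while the pair 0xFF 0x05 means 255+5 = 260 bytes.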
BufferedPacket* LATMBufferedPacketFactory
::createNewPacket(MultiFramedRTPSource* ourSource) {
MPEG4LATMAudioRTPSource* source = (MPEG4LATMAudioRTPSource*)ourSource;
return new LATMBufferedPacket(source->returnedFrameIncludesLATMDataLengthField());
}
////////// parseStreamMuxConfigStr() implementation //////////
static Boolean getNibble(char const*& configStr,
unsigned char& resultNibble) {
char c = configStr[0];
if (c == '\0') return False; // we've reached the end
if (c >= '0' && c <= '9') {
resultNibble = c - '0';
} else if (c >= 'A' && c <= 'F') {
resultNibble = 10 + c - 'A';
} else if (c >= 'a' && c <= 'f') {
resultNibble = 10 + c - 'a';
} else {
return False;
}
++configStr; // move to the next nibble
return True;
}
static Boolean getByte(char const*& configStr, unsigned char& resultByte) {
resultByte = 0; // by default, in case parsing fails
unsigned char firstNibble;
if (!getNibble(configStr, firstNibble)) return False;
resultByte = firstNibble<<4;
unsigned char secondNibble = 0;
if (!getNibble(configStr, secondNibble) && configStr[0] != '\0') {
// There's a second nibble, but it's malformed
return False;
}
resultByte |= secondNibble;
return True;
}
Boolean
parseStreamMuxConfigStr(char const* configStr,
// result parameters:
Boolean& audioMuxVersion,
Boolean& allStreamsSameTimeFraming,
unsigned char& numSubFrames,
unsigned char& numProgram,
unsigned char& numLayer,
unsigned char*& audioSpecificConfig,
unsigned& audioSpecificConfigSize) {
// Set default versions of the result parameters:
audioMuxVersion = False;
allStreamsSameTimeFraming = True;
numSubFrames = numProgram = numLayer = 0;
audioSpecificConfig = NULL;
audioSpecificConfigSize = 0;
do {
if (configStr == NULL) break;
unsigned char nextByte;
if (!getByte(configStr, nextByte)) break;
audioMuxVersion = (nextByte&0x80) != 0;
if (audioMuxVersion) break;
allStreamsSameTimeFraming = ((nextByte&0x40)>>6) != 0;
numSubFrames = (nextByte&0x3F);
if (!getByte(configStr, nextByte)) break;
numProgram = (nextByte&0xF0)>>4;
numLayer = (nextByte&0x0E)>>1;
// The one remaining bit, and the rest of the string,
// are used for "audioSpecificConfig":
unsigned char remainingBit = nextByte&1;
unsigned ascSize = (strlen(configStr)+1)/2 + 1;
audioSpecificConfig = new unsigned char[ascSize];
Boolean parseSuccess;
unsigned i = 0;
do {
nextByte = 0;
parseSuccess = getByte(configStr, nextByte);
audioSpecificConfig[i++] = (remainingBit<<7)|((nextByte&0xFE)>>1);
remainingBit = nextByte&1;
} while (parseSuccess);
if (i != ascSize) break; // part of the remaining string was bad
audioSpecificConfigSize = ascSize;
return True; // parsing succeeded
} while (0);
delete[] audioSpecificConfig;
return False; // parsing failed
}
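The first two bytes parsed above carry fixed-width bit fields: byte 0 holds audioMuxVersion (1 bit), allStreamsSameTimeFraming (1 bit), and numSubFrames (6 bits); byte 1 holds numProgram (4 bits), numLayer (3 bits), and the first bit of the AudioSpecificConfig. A sketch of just that bit-field split (the `MuxConfigHead` struct is illustrative, not part of the library):

```cpp
#include <assert.h>

// Sketch of the bit layout parsed by parseStreamMuxConfigStr() when
// audioMuxVersion == 0 (the only case the parser above accepts).
struct MuxConfigHead {
  bool audioMuxVersion, allStreamsSameTimeFraming;
  unsigned char numSubFrames, numProgram, numLayer;
};

MuxConfigHead parseMuxConfigHead(unsigned char b0, unsigned char b1) {
  MuxConfigHead h;
  h.audioMuxVersion          = (b0 & 0x80) != 0;
  h.allStreamsSameTimeFraming = (b0 & 0x40) != 0;
  h.numSubFrames             = b0 & 0x3F;
  h.numProgram               = (b1 & 0xF0) >> 4;
  h.numLayer                 = (b1 & 0x0E) >> 1;
  return h;
}
```

For example, a StreamMuxConfig string beginning "4000…" gives audioMuxVersion=0, allStreamsSameTimeFraming=1, and zero for the three counts.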
unsigned char* parseStreamMuxConfigStr(char const* configStr,
// result parameter:
unsigned& audioSpecificConfigSize) {
Boolean audioMuxVersion, allStreamsSameTimeFraming;
unsigned char numSubFrames, numProgram, numLayer;
unsigned char* audioSpecificConfig;
if (!parseStreamMuxConfigStr(configStr,
audioMuxVersion, allStreamsSameTimeFraming,
numSubFrames, numProgram, numLayer,
audioSpecificConfig, audioSpecificConfigSize)) {
audioSpecificConfigSize = 0;
return NULL;
}
return audioSpecificConfig;
}
unsigned char* parseGeneralConfigStr(char const* configStr,
// result parameter:
unsigned& configSize) {
unsigned char* config = NULL;
do {
if (configStr == NULL) break;
configSize = (strlen(configStr)+1)/2;
config = new unsigned char[configSize];
if (config == NULL) break;
unsigned i;
for (i = 0; i < configSize; ++i) {
if (!getByte(configStr, config[i])) break;
}
if (i != configSize) break; // part of the string was bad
return config;
} while (0);
configSize = 0;
delete[] config;
return NULL;
}
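The hex-string decoding above (via `getByte()`/`getNibble()`) maps each pair of hex digits to one byte, with an odd-length string yielding a final byte whose low nibble is zero. A compact standalone sketch of the same behavior:

```cpp
#include <assert.h>
#include <stdlib.h>
#include <string.h>

// Sketch of the decoding done by parseGeneralConfigStr(): two hex digits
// per byte; an odd-length string pads the final low nibble with 0; any
// non-hex character fails the whole parse (NULL, size 0).
unsigned char* hexConfigToBytes(char const* s, unsigned& size) {
  size = (strlen(s)+1)/2;
  unsigned char* out = (unsigned char*)malloc(size);
  for (unsigned i = 0; i < size; ++i) {
    unsigned byte = 0;
    for (int half = 0; half < 2; ++half) {
      byte <<= 4;
      char c = s[0];
      if (c == '\0') continue; // odd length: pad the low nibble with 0
      if (c >= '0' && c <= '9')      byte |= c - '0';
      else if (c >= 'A' && c <= 'F') byte |= 10 + c - 'A';
      else if (c >= 'a' && c <= 'f') byte |= 10 + c - 'a';
      else { free(out); size = 0; return NULL; }
      ++s;
    }
    out[i] = (unsigned char)byte;
  }
  return out; // caller frees
}
```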
live/liveMedia/MPEG4VideoFileServerMediaSubsession.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A 'ServerMediaSubsession' object that creates new, unicast, "RTPSink"s
// on demand, from an MPEG-4 video file.
// Implementation
#include "MPEG4VideoFileServerMediaSubsession.hh"
#include "MPEG4ESVideoRTPSink.hh"
#include "ByteStreamFileSource.hh"
#include "MPEG4VideoStreamFramer.hh"
MPEG4VideoFileServerMediaSubsession*
MPEG4VideoFileServerMediaSubsession::createNew(UsageEnvironment& env,
char const* fileName,
Boolean reuseFirstSource) {
return new MPEG4VideoFileServerMediaSubsession(env, fileName, reuseFirstSource);
}
MPEG4VideoFileServerMediaSubsession
::MPEG4VideoFileServerMediaSubsession(UsageEnvironment& env,
char const* fileName, Boolean reuseFirstSource)
: FileServerMediaSubsession(env, fileName, reuseFirstSource),
fAuxSDPLine(NULL), fDoneFlag(0), fDummyRTPSink(NULL) {
}
MPEG4VideoFileServerMediaSubsession::~MPEG4VideoFileServerMediaSubsession() {
delete[] fAuxSDPLine;
}
static void afterPlayingDummy(void* clientData) {
MPEG4VideoFileServerMediaSubsession* subsess
= (MPEG4VideoFileServerMediaSubsession*)clientData;
subsess->afterPlayingDummy1();
}
void MPEG4VideoFileServerMediaSubsession::afterPlayingDummy1() {
// Unschedule any pending 'checking' task:
envir().taskScheduler().unscheduleDelayedTask(nextTask());
// Signal the event loop that we're done:
setDoneFlag();
}
static void checkForAuxSDPLine(void* clientData) {
MPEG4VideoFileServerMediaSubsession* subsess
= (MPEG4VideoFileServerMediaSubsession*)clientData;
subsess->checkForAuxSDPLine1();
}
void MPEG4VideoFileServerMediaSubsession::checkForAuxSDPLine1() {
char const* dasl;
if (fAuxSDPLine != NULL) {
// Signal the event loop that we're done:
setDoneFlag();
} else if (fDummyRTPSink != NULL && (dasl = fDummyRTPSink->auxSDPLine()) != NULL) {
fAuxSDPLine = strDup(dasl);
fDummyRTPSink = NULL;
// Signal the event loop that we're done:
setDoneFlag();
} else if (!fDoneFlag) {
// try again after a brief delay:
int uSecsToDelay = 100000; // 100 ms
nextTask() = envir().taskScheduler().scheduleDelayedTask(uSecsToDelay,
(TaskFunc*)checkForAuxSDPLine, this);
}
}
char const* MPEG4VideoFileServerMediaSubsession::getAuxSDPLine(RTPSink* rtpSink, FramedSource* inputSource) {
if (fAuxSDPLine != NULL) return fAuxSDPLine; // it's already been set up (for a previous client)
if (fDummyRTPSink == NULL) { // we're not already setting it up for another, concurrent stream
// Note: For MPEG-4 video files, the 'config' information isn't known
// until we start reading the file. This means that "rtpSink"s
// "auxSDPLine()" will be NULL initially, and we need to start reading data from our file until this changes.
fDummyRTPSink = rtpSink;
// Start reading the file:
fDummyRTPSink->startPlaying(*inputSource, afterPlayingDummy, this);
// Check whether the sink's 'auxSDPLine()' is ready:
checkForAuxSDPLine(this);
}
envir().taskScheduler().doEventLoop(&fDoneFlag);
return fAuxSDPLine;
}
FramedSource* MPEG4VideoFileServerMediaSubsession
::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
estBitrate = 500; // kbps, estimate
// Create the video source:
ByteStreamFileSource* fileSource
= ByteStreamFileSource::createNew(envir(), fFileName);
if (fileSource == NULL) return NULL;
fFileSize = fileSource->fileSize();
// Create a framer for the Video Elementary Stream:
return MPEG4VideoStreamFramer::createNew(envir(), fileSource);
}
RTPSink* MPEG4VideoFileServerMediaSubsession
::createNewRTPSink(Groupsock* rtpGroupsock,
unsigned char rtpPayloadTypeIfDynamic,
FramedSource* /*inputSource*/) {
return MPEG4ESVideoRTPSink::createNew(envir(), rtpGroupsock,
rtpPayloadTypeIfDynamic);
}
live/liveMedia/MPEG4VideoStreamDiscreteFramer.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A simplified version of "MPEG4VideoStreamFramer" that takes only complete,
// discrete frames (rather than an arbitrary byte stream) as input.
// This avoids the parsing and data copying overhead of the full
// "MPEG4VideoStreamFramer".
// Implementation
#include "MPEG4VideoStreamDiscreteFramer.hh"
MPEG4VideoStreamDiscreteFramer*
MPEG4VideoStreamDiscreteFramer::createNew(UsageEnvironment& env,
FramedSource* inputSource, Boolean leavePresentationTimesUnmodified) {
// Need to add source type checking here??? #####
return new MPEG4VideoStreamDiscreteFramer(env, inputSource, leavePresentationTimesUnmodified);
}
MPEG4VideoStreamDiscreteFramer
::MPEG4VideoStreamDiscreteFramer(UsageEnvironment& env,
FramedSource* inputSource, Boolean leavePresentationTimesUnmodified)
: MPEG4VideoStreamFramer(env, inputSource, False/*don't create a parser*/),
fLeavePresentationTimesUnmodified(leavePresentationTimesUnmodified), vop_time_increment_resolution(0), fNumVTIRBits(0),
fLastNonBFrameVop_time_increment(0) {
fLastNonBFramePresentationTime.tv_sec = 0;
fLastNonBFramePresentationTime.tv_usec = 0;
}
MPEG4VideoStreamDiscreteFramer::~MPEG4VideoStreamDiscreteFramer() {
}
void MPEG4VideoStreamDiscreteFramer::doGetNextFrame() {
// Arrange to read data (which should be a complete MPEG-4 video frame)
// from our data source, directly into the client's input buffer.
// After reading this, we'll do some parsing on the frame.
fInputSource->getNextFrame(fTo, fMaxSize,
afterGettingFrame, this,
FramedSource::handleClosure, this);
}
void MPEG4VideoStreamDiscreteFramer
::afterGettingFrame(void* clientData, unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
MPEG4VideoStreamDiscreteFramer* source = (MPEG4VideoStreamDiscreteFramer*)clientData;
source->afterGettingFrame1(frameSize, numTruncatedBytes,
presentationTime, durationInMicroseconds);
}
void MPEG4VideoStreamDiscreteFramer
::afterGettingFrame1(unsigned frameSize, unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
// Check that the frame begins with a start-code prefix (0x000001):
if (frameSize >= 4 && fTo[0] == 0 && fTo[1] == 0 && fTo[2] == 1) {
fPictureEndMarker = True; // Assume that we have a complete 'picture' here
unsigned i = 3;
if (fTo[i] == 0xB0) { // VISUAL_OBJECT_SEQUENCE_START_CODE
// The next byte is the "profile_and_level_indication":
if (frameSize >= 5) fProfileAndLevelIndication = fTo[4];
// The start of this frame - up to the first GROUP_VOP_START_CODE
// or VOP_START_CODE - is stream configuration information. Save this:
for (i = 7; i < frameSize; ++i) {
if ((fTo[i] == 0xB3 /*GROUP_VOP_START_CODE*/ ||
fTo[i] == 0xB6 /*VOP_START_CODE*/)
&& fTo[i-1] == 1 && fTo[i-2] == 0 && fTo[i-3] == 0) {
break; // The configuration information ends here
}
}
fNumConfigBytes = i < frameSize ? i-3 : frameSize;
delete[] fConfigBytes; fConfigBytes = new unsigned char[fNumConfigBytes];
for (unsigned j = 0; j < fNumConfigBytes; ++j) fConfigBytes[j] = fTo[j];
// This information (should) also contain a VOL header, which we need
// to analyze, to get "vop_time_increment_resolution" (which we need
// - along with "vop_time_increment" - in order to generate accurate
// presentation times for "B" frames).
analyzeVOLHeader();
}
if (i < frameSize) {
u_int8_t nextCode = fTo[i];
if (nextCode == 0xB3 /*GROUP_VOP_START_CODE*/) {
// Skip to the following VOP_START_CODE (if any):
for (i += 4; i < frameSize; ++i) {
if (fTo[i] == 0xB6 /*VOP_START_CODE*/
&& fTo[i-1] == 1 && fTo[i-2] == 0 && fTo[i-3] == 0) {
nextCode = fTo[i];
break;
}
}
}
if (nextCode == 0xB6 /*VOP_START_CODE*/ && i+5 < frameSize) {
++i;
// Get the "vop_coding_type" from the next byte:
u_int8_t nextByte = fTo[i++];
u_int8_t vop_coding_type = nextByte>>6;
// Next, get the "modulo_time_base" by counting the '1' bits that
// follow. We look at the next 32 bits only.
// This should be enough in most cases.
u_int32_t next4Bytes
= (fTo[i]<<24)|(fTo[i+1]<<16)|(fTo[i+2]<<8)|fTo[i+3];
i += 4;
u_int32_t timeInfo = (nextByte<<(32-6))|(next4Bytes>>6);
unsigned modulo_time_base = 0;
u_int32_t mask = 0x80000000;
while ((timeInfo&mask) != 0) {
++modulo_time_base;
mask >>= 1;
}
mask >>= 2;
// Then, get the "vop_time_increment".
unsigned vop_time_increment = 0;
// First, make sure we have enough bits left for this:
if ((mask>>(fNumVTIRBits-1)) != 0) {
for (unsigned i = 0; i < fNumVTIRBits; ++i) {
vop_time_increment |= timeInfo&mask;
mask >>= 1;
}
while (mask != 0) {
vop_time_increment >>= 1;
mask >>= 1;
}
}
// If this is a "B" frame, then we have to tweak "presentationTime":
if (!fLeavePresentationTimesUnmodified && vop_coding_type == 2/*B*/
&& (fLastNonBFramePresentationTime.tv_usec > 0 ||
fLastNonBFramePresentationTime.tv_sec > 0)) {
int timeIncrement
= fLastNonBFrameVop_time_increment - vop_time_increment;
if (timeIncrement<0) timeIncrement += vop_time_increment_resolution;
unsigned const MILLION = 1000000;
double usIncrement = vop_time_increment_resolution == 0 ? 0.0
: ((double)timeIncrement*MILLION)/vop_time_increment_resolution;
unsigned secondsToSubtract = (unsigned)(usIncrement/MILLION);
unsigned uSecondsToSubtract = ((unsigned)usIncrement)%MILLION;
presentationTime = fLastNonBFramePresentationTime;
if ((unsigned)presentationTime.tv_usec < uSecondsToSubtract) {
presentationTime.tv_usec += MILLION;
if (presentationTime.tv_sec > 0) --presentationTime.tv_sec;
}
presentationTime.tv_usec -= uSecondsToSubtract;
if ((unsigned)presentationTime.tv_sec > secondsToSubtract) {
presentationTime.tv_sec -= secondsToSubtract;
} else {
presentationTime.tv_sec = presentationTime.tv_usec = 0;
}
} else {
fLastNonBFramePresentationTime = presentationTime;
fLastNonBFrameVop_time_increment = vop_time_increment;
}
}
}
}
// Complete delivery to the client:
fFrameSize = frameSize;
fNumTruncatedBytes = numTruncatedBytes;
fPresentationTime = presentationTime;
fDurationInMicroseconds = durationInMicroseconds;
afterGetting(this);
}
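The B-frame timestamp tweak above converts a vop_time_increment delta (in ticks of vop_time_increment_resolution per second) into microseconds, adding the resolution back when the tick counter has wrapped. A sketch of just that conversion, with hypothetical tick values in the tests:

```cpp
#include <assert.h>

// Sketch of the B-frame offset computation in afterGettingFrame1():
// the distance from the last non-B frame, measured in vop_time_increment
// ticks, converted to microseconds.
unsigned bFrameOffsetMicroseconds(unsigned lastNonBVti, unsigned vti,
                                  unsigned resolution /* ticks per second */) {
  if (resolution == 0) return 0;
  int delta = (int)lastNonBVti - (int)vti;
  if (delta < 0) delta += resolution; // the tick counter wrapped modulo the resolution
  return (unsigned)(((double)delta * 1000000.0) / resolution);
}
```

E.g. with a resolution of 30000 ticks/s, a delta of 1000 ticks is about 33333 microseconds, which is what gets subtracted from the last non-B frame's presentation time.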
Boolean MPEG4VideoStreamDiscreteFramer::getNextFrameBit(u_int8_t& result) {
if (fNumBitsSeenSoFar/8 >= fNumConfigBytes) return False;
u_int8_t nextByte = fConfigBytes[fNumBitsSeenSoFar/8];
result = (nextByte>>(7-fNumBitsSeenSoFar%8))&1;
++fNumBitsSeenSoFar;
return True;
}
Boolean MPEG4VideoStreamDiscreteFramer::getNextFrameBits(unsigned numBits,
u_int32_t& result) {
result = 0;
for (unsigned i = 0; i < numBits; ++i) {
u_int8_t nextBit;
if (!getNextFrameBit(nextBit)) return False;
result = (result<<1)|nextBit;
}
return True;
}
void MPEG4VideoStreamDiscreteFramer::analyzeVOLHeader() {
// Begin by moving to the VOL header:
unsigned i;
for (i = 3; i < fNumConfigBytes; ++i) {
if (fConfigBytes[i] >= 0x20 && fConfigBytes[i] <= 0x2F
&& fConfigBytes[i-1] == 1
&& fConfigBytes[i-2] == 0 && fConfigBytes[i-3] == 0) {
++i;
break;
}
}
fNumBitsSeenSoFar = 8*i + 9;
do {
u_int8_t is_object_layer_identifier;
if (!getNextFrameBit(is_object_layer_identifier)) break;
if (is_object_layer_identifier) fNumBitsSeenSoFar += 7;
u_int32_t aspect_ratio_info;
if (!getNextFrameBits(4, aspect_ratio_info)) break;
if (aspect_ratio_info == 15 /*extended_PAR*/) fNumBitsSeenSoFar += 16;
u_int8_t vol_control_parameters;
if (!getNextFrameBit(vol_control_parameters)) break;
if (vol_control_parameters) {
fNumBitsSeenSoFar += 3; // chroma_format; low_delay
u_int8_t vbw_parameters;
if (!getNextFrameBit(vbw_parameters)) break;
if (vbw_parameters) fNumBitsSeenSoFar += 79;
}
fNumBitsSeenSoFar += 2; // video_object_layer_shape
u_int8_t marker_bit;
if (!getNextFrameBit(marker_bit)) break;
if (marker_bit != 1) break; // sanity check
if (!getNextFrameBits(16, vop_time_increment_resolution)) break;
if (vop_time_increment_resolution == 0) break; // shouldn't happen
// Compute how many bits are necessary to represent this:
fNumVTIRBits = 0;
for (unsigned test = vop_time_increment_resolution; test>0; test /= 2) {
++fNumVTIRBits;
}
} while (0);
}
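The final loop in analyzeVOLHeader() computes the bit width of vop_time_increment_resolution by repeated halving, i.e. the number of bits needed to represent the value. A standalone sketch:

```cpp
#include <assert.h>

// Sketch of the fNumVTIRBits computation at the end of analyzeVOLHeader():
// count how many bits the binary representation of the resolution needs.
unsigned numVTIRBits(unsigned vop_time_increment_resolution) {
  unsigned n = 0;
  for (unsigned test = vop_time_increment_resolution; test > 0; test /= 2) ++n;
  return n;
}
```

For a typical resolution of 30000 this yields 15 (since 2^14 < 30000 <= 2^15 - 1), which is the field width later used when reading "vop_time_increment" from a VOP header.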
live/liveMedia/MPEG4VideoStreamFramer.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A filter that breaks up an MPEG-4 video elementary stream into
// frames for:
// - Visual Object Sequence (VS) Header + Visual Object (VO) Header
// + Video Object Layer (VOL) Header
// - Group of VOP (GOV) Header
// - VOP frame
// Implementation
#include "MPEG4VideoStreamFramer.hh"
#include "MPEGVideoStreamParser.hh"
#include "MPEG4LATMAudioRTPSource.hh" // for "parseGeneralConfigStr()"
#include <string.h>
////////// MPEG4VideoStreamParser definition //////////
// An enum representing the current state of the parser:
enum MPEGParseState {
PARSING_VISUAL_OBJECT_SEQUENCE,
PARSING_VISUAL_OBJECT_SEQUENCE_SEEN_CODE,
PARSING_VISUAL_OBJECT,
PARSING_VIDEO_OBJECT_LAYER,
PARSING_GROUP_OF_VIDEO_OBJECT_PLANE,
PARSING_VIDEO_OBJECT_PLANE,
PARSING_VISUAL_OBJECT_SEQUENCE_END_CODE
};
class MPEG4VideoStreamParser: public MPEGVideoStreamParser {
public:
MPEG4VideoStreamParser(MPEG4VideoStreamFramer* usingSource,
FramedSource* inputSource);
virtual ~MPEG4VideoStreamParser();
private: // redefined virtual functions:
virtual void flushInput();
virtual unsigned parse();
private:
MPEG4VideoStreamFramer* usingSource() {
return (MPEG4VideoStreamFramer*)fUsingSource;
}
void setParseState(MPEGParseState parseState);
unsigned parseVisualObjectSequence(Boolean haveSeenStartCode = False);
unsigned parseVisualObject();
unsigned parseVideoObjectLayer();
unsigned parseGroupOfVideoObjectPlane();
unsigned parseVideoObjectPlane();
unsigned parseVisualObjectSequenceEndCode();
// These are used for parsing within an already-read frame:
Boolean getNextFrameBit(u_int8_t& result);
Boolean getNextFrameBits(unsigned numBits, u_int32_t& result);
// ...which, in turn, are used by:
void analyzeVOLHeader();
private:
MPEGParseState fCurrentParseState;
unsigned fNumBitsSeenSoFar; // used by the getNextFrameBit*() routines
u_int32_t vop_time_increment_resolution;
unsigned fNumVTIRBits;
// # of bits needed to count to "vop_time_increment_resolution"
u_int8_t fixed_vop_rate;
unsigned fixed_vop_time_increment; // used if 'fixed_vop_rate' is set
unsigned fSecondsSinceLastTimeCode, fTotalTicksSinceLastTimeCode, fPrevNewTotalTicks;
unsigned fPrevPictureCountDelta;
Boolean fJustSawTimeCode;
};
////////// MPEG4VideoStreamFramer implementation //////////
MPEG4VideoStreamFramer*
MPEG4VideoStreamFramer::createNew(UsageEnvironment& env,
FramedSource* inputSource) {
// Need to add source type checking here??? #####
return new MPEG4VideoStreamFramer(env, inputSource);
}
unsigned char* MPEG4VideoStreamFramer
::getConfigBytes(unsigned& numBytes) const {
numBytes = fNumConfigBytes;
return fConfigBytes;
}
void MPEG4VideoStreamFramer
::setConfigInfo(u_int8_t profileAndLevelIndication, char const* configStr) {
fProfileAndLevelIndication = profileAndLevelIndication;
delete[] fConfigBytes;
fConfigBytes = parseGeneralConfigStr(configStr, fNumConfigBytes);
}
MPEG4VideoStreamFramer::MPEG4VideoStreamFramer(UsageEnvironment& env,
FramedSource* inputSource,
Boolean createParser)
: MPEGVideoStreamFramer(env, inputSource),
fProfileAndLevelIndication(0),
fConfigBytes(NULL), fNumConfigBytes(0),
fNewConfigBytes(NULL), fNumNewConfigBytes(0) {
fParser = createParser
? new MPEG4VideoStreamParser(this, inputSource)
: NULL;
}
MPEG4VideoStreamFramer::~MPEG4VideoStreamFramer() {
delete[] fConfigBytes; delete[] fNewConfigBytes;
}
void MPEG4VideoStreamFramer::startNewConfig() {
delete[] fNewConfigBytes; fNewConfigBytes = NULL;
fNumNewConfigBytes = 0;
}
void MPEG4VideoStreamFramer
::appendToNewConfig(unsigned char* newConfigBytes, unsigned numNewBytes) {
// Allocate a new block of memory for the new config bytes:
unsigned char* configNew
= new unsigned char[fNumNewConfigBytes + numNewBytes];
// Copy the old, then the new, config bytes there:
memmove(configNew, fNewConfigBytes, fNumNewConfigBytes);
memmove(&configNew[fNumNewConfigBytes], newConfigBytes, numNewBytes);
delete[] fNewConfigBytes; fNewConfigBytes = configNew;
fNumNewConfigBytes += numNewBytes;
}
void MPEG4VideoStreamFramer::completeNewConfig() {
delete[] fConfigBytes; fConfigBytes = fNewConfigBytes;
fNewConfigBytes = NULL;
fNumConfigBytes = fNumNewConfigBytes;
fNumNewConfigBytes = 0;
}
Boolean MPEG4VideoStreamFramer::isMPEG4VideoStreamFramer() const {
return True;
}
////////// MPEG4VideoStreamParser implementation //////////
MPEG4VideoStreamParser
::MPEG4VideoStreamParser(MPEG4VideoStreamFramer* usingSource,
FramedSource* inputSource)
: MPEGVideoStreamParser(usingSource, inputSource),
fCurrentParseState(PARSING_VISUAL_OBJECT_SEQUENCE),
vop_time_increment_resolution(0), fNumVTIRBits(0),
fixed_vop_rate(0), fixed_vop_time_increment(0),
fSecondsSinceLastTimeCode(0), fTotalTicksSinceLastTimeCode(0),
fPrevNewTotalTicks(0), fPrevPictureCountDelta(1), fJustSawTimeCode(False) {
}
MPEG4VideoStreamParser::~MPEG4VideoStreamParser() {
}
void MPEG4VideoStreamParser::setParseState(MPEGParseState parseState) {
fCurrentParseState = parseState;
MPEGVideoStreamParser::setParseState();
}
void MPEG4VideoStreamParser::flushInput() {
fSecondsSinceLastTimeCode = 0;
fTotalTicksSinceLastTimeCode = 0;
fPrevNewTotalTicks = 0;
fPrevPictureCountDelta = 1;
StreamParser::flushInput();
if (fCurrentParseState != PARSING_VISUAL_OBJECT_SEQUENCE) {
setParseState(PARSING_VISUAL_OBJECT_SEQUENCE); // later, change to GOV or VOP? #####
}
}
unsigned MPEG4VideoStreamParser::parse() {
try {
switch (fCurrentParseState) {
case PARSING_VISUAL_OBJECT_SEQUENCE: {
return parseVisualObjectSequence();
}
case PARSING_VISUAL_OBJECT_SEQUENCE_SEEN_CODE: {
return parseVisualObjectSequence(True);
}
case PARSING_VISUAL_OBJECT: {
return parseVisualObject();
}
case PARSING_VIDEO_OBJECT_LAYER: {
return parseVideoObjectLayer();
}
case PARSING_GROUP_OF_VIDEO_OBJECT_PLANE: {
return parseGroupOfVideoObjectPlane();
}
case PARSING_VIDEO_OBJECT_PLANE: {
return parseVideoObjectPlane();
}
case PARSING_VISUAL_OBJECT_SEQUENCE_END_CODE: {
return parseVisualObjectSequenceEndCode();
}
default: {
return 0; // shouldn't happen
}
}
} catch (int /*e*/) {
#ifdef DEBUG
fprintf(stderr, "MPEG4VideoStreamParser::parse() EXCEPTION (This is normal behavior - *not* an error)\n");
#endif
return 0; // the parsing got interrupted
}
}
#define VISUAL_OBJECT_SEQUENCE_START_CODE 0x000001B0
#define VISUAL_OBJECT_SEQUENCE_END_CODE 0x000001B1
#define GROUP_VOP_START_CODE 0x000001B3
#define VISUAL_OBJECT_START_CODE 0x000001B5
#define VOP_START_CODE 0x000001B6
unsigned MPEG4VideoStreamParser
::parseVisualObjectSequence(Boolean haveSeenStartCode) {
#ifdef DEBUG
fprintf(stderr, "parsing VisualObjectSequence\n");
#endif
usingSource()->startNewConfig();
u_int32_t first4Bytes;
if (!haveSeenStartCode) {
while ((first4Bytes = test4Bytes()) != VISUAL_OBJECT_SEQUENCE_START_CODE) {
#ifdef DEBUG
fprintf(stderr, "ignoring non VS header: 0x%08x\n", first4Bytes);
#endif
get1Byte(); setParseState(PARSING_VISUAL_OBJECT_SEQUENCE);
// ensures we progress over bad data
}
first4Bytes = get4Bytes();
} else {
// We've already seen the start code
first4Bytes = VISUAL_OBJECT_SEQUENCE_START_CODE;
}
save4Bytes(first4Bytes);
// The next byte is the "profile_and_level_indication":
u_int8_t pali = get1Byte();
#ifdef DEBUG
fprintf(stderr, "profile_and_level_indication: %02x\n", pali);
#endif
saveByte(pali);
usingSource()->fProfileAndLevelIndication = pali;
// Now, copy all bytes that we see, up until we reach
// a VISUAL_OBJECT_START_CODE:
u_int32_t next4Bytes = get4Bytes();
while (next4Bytes != VISUAL_OBJECT_START_CODE) {
saveToNextCode(next4Bytes);
}
setParseState(PARSING_VISUAL_OBJECT);
// Compute this frame's presentation time:
usingSource()->computePresentationTime(fTotalTicksSinceLastTimeCode);
// This header forms part of the 'configuration' information:
usingSource()->appendToNewConfig(fStartOfFrame, curFrameSize());
return curFrameSize();
}
static inline Boolean isVideoObjectStartCode(u_int32_t code) {
return code >= 0x00000100 && code <= 0x0000011F;
}
unsigned MPEG4VideoStreamParser::parseVisualObject() {
#ifdef DEBUG
fprintf(stderr, "parsing VisualObject\n");
#endif
// Note that we've already read the VISUAL_OBJECT_START_CODE
save4Bytes(VISUAL_OBJECT_START_CODE);
// Next, extract the "visual_object_type" from the next 1 or 2 bytes:
u_int8_t nextByte = get1Byte(); saveByte(nextByte);
Boolean is_visual_object_identifier = (nextByte&0x80) != 0;
u_int8_t visual_object_type;
if (is_visual_object_identifier) {
#ifdef DEBUG
fprintf(stderr, "visual_object_verid: 0x%x; visual_object_priority: 0x%x\n", (nextByte&0x78)>>3, (nextByte&0x07));
#endif
nextByte = get1Byte(); saveByte(nextByte);
visual_object_type = (nextByte&0xF0)>>4;
} else {
visual_object_type = (nextByte&0x78)>>3;
}
#ifdef DEBUG
fprintf(stderr, "visual_object_type: 0x%x\n", visual_object_type);
#endif
// At present, we support only the "Video ID" "visual_object_type" (1)
if (visual_object_type != 1) {
usingSource()->envir() << "MPEG4VideoStreamParser::parseVisualObject(): Warning: We don't handle visual_object_type " << visual_object_type << "\n";
}
// Now, copy all bytes that we see, up until we reach
// a video_object_start_code
u_int32_t next4Bytes = get4Bytes();
while (!isVideoObjectStartCode(next4Bytes)) {
saveToNextCode(next4Bytes);
}
save4Bytes(next4Bytes);
#ifdef DEBUG
fprintf(stderr, "saw a video_object_start_code: 0x%08x\n", next4Bytes);
#endif
setParseState(PARSING_VIDEO_OBJECT_LAYER);
// Compute this frame's presentation time:
usingSource()->computePresentationTime(fTotalTicksSinceLastTimeCode);
// This header forms part of the 'configuration' information:
usingSource()->appendToNewConfig(fStartOfFrame, curFrameSize());
return curFrameSize();
}
static inline Boolean isVideoObjectLayerStartCode(u_int32_t code) {
return code >= 0x00000120 && code <= 0x0000012F;
}
Boolean MPEG4VideoStreamParser::getNextFrameBit(u_int8_t& result) {
if (fNumBitsSeenSoFar/8 >= curFrameSize()) return False;
u_int8_t nextByte = fStartOfFrame[fNumBitsSeenSoFar/8];
result = (nextByte>>(7-fNumBitsSeenSoFar%8))&1;
++fNumBitsSeenSoFar;
return True;
}
Boolean MPEG4VideoStreamParser::getNextFrameBits(unsigned numBits,
u_int32_t& result) {
result = 0;
for (unsigned i = 0; i < numBits; ++i) {
u_int8_t nextBit;
if (!getNextFrameBit(nextBit)) return False;
result = (result<<1)|nextBit;
}
return True;
}
void MPEG4VideoStreamParser::analyzeVOLHeader() {
// Extract timing information (in particular,
// "vop_time_increment_resolution") from the VOL Header:
fNumBitsSeenSoFar = 41;
do {
u_int8_t is_object_layer_identifier;
if (!getNextFrameBit(is_object_layer_identifier)) break;
if (is_object_layer_identifier) fNumBitsSeenSoFar += 7;
u_int32_t aspect_ratio_info;
if (!getNextFrameBits(4, aspect_ratio_info)) break;
if (aspect_ratio_info == 15 /*extended_PAR*/) fNumBitsSeenSoFar += 16;
u_int8_t vol_control_parameters;
if (!getNextFrameBit(vol_control_parameters)) break;
if (vol_control_parameters) {
fNumBitsSeenSoFar += 3; // chroma_format; low_delay
u_int8_t vbw_parameters;
if (!getNextFrameBit(vbw_parameters)) break;
if (vbw_parameters) fNumBitsSeenSoFar += 79;
}
fNumBitsSeenSoFar += 2; // video_object_layer_shape
u_int8_t marker_bit;
if (!getNextFrameBit(marker_bit)) break;
if (marker_bit != 1) { // sanity check
usingSource()->envir() << "MPEG4VideoStreamParser::analyzeVOLHeader(): marker_bit 1 not set!\n";
break;
}
if (!getNextFrameBits(16, vop_time_increment_resolution)) break;
#ifdef DEBUG
fprintf(stderr, "vop_time_increment_resolution: %d\n", vop_time_increment_resolution);
#endif
if (vop_time_increment_resolution == 0) {
usingSource()->envir() << "MPEG4VideoStreamParser::analyzeVOLHeader(): vop_time_increment_resolution is zero!\n";
break;
}
// Compute how many bits are necessary to represent this:
fNumVTIRBits = 0;
for (unsigned test = vop_time_increment_resolution; test>0; test /= 2) {
++fNumVTIRBits;
}
if (!getNextFrameBit(marker_bit)) break;
if (marker_bit != 1) { // sanity check
usingSource()->envir() << "MPEG4VideoStreamParser::analyzeVOLHeader(): marker_bit 2 not set!\n";
break;
}
if (!getNextFrameBit(fixed_vop_rate)) break;
if (fixed_vop_rate) {
// Get the following "fixed_vop_time_increment":
if (!getNextFrameBits(fNumVTIRBits, fixed_vop_time_increment)) break;
#ifdef DEBUG
fprintf(stderr, "fixed_vop_time_increment: %d\n", fixed_vop_time_increment);
if (fixed_vop_time_increment == 0) {
usingSource()->envir() << "MPEG4VideoStreamParser::analyzeVOLHeader(): fixed_vop_time_increment is zero!\n";
}
#endif
}
// Use "vop_time_increment_resolution" as the 'frame rate'
// (really, 'tick rate'):
usingSource()->fFrameRate = (double)vop_time_increment_resolution;
#ifdef DEBUG
fprintf(stderr, "fixed_vop_rate: %d; 'frame' (really tick) rate: %f\n", fixed_vop_rate, usingSource()->fFrameRate);
#endif
return;
} while (0);
if (fNumBitsSeenSoFar/8 >= curFrameSize()) {
char errMsg[200];
sprintf(errMsg, "Not enough bits in VOL header: %u/8 >= %u\n", fNumBitsSeenSoFar, curFrameSize());
usingSource()->envir() << errMsg;
}
}
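The halving loop in "analyzeVOLHeader()" that sets "fNumVTIRBits" computes the number of bits needed to represent "vop_time_increment_resolution", i.e. floor(log2(v)) + 1; each subsequent "vop_time_increment" field is then read with exactly that many bits. The computation isolated for illustration (the function name is hypothetical):

```cpp
#include <cassert>

// Number of bits required to represent 'v' in binary: floor(log2(v)) + 1.
// This is the same halving loop the parser uses to set fNumVTIRBits.
unsigned numBitsFor(unsigned v) {
  unsigned n = 0;
  for (unsigned test = v; test > 0; test /= 2) ++n;
  return n;
}
```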
unsigned MPEG4VideoStreamParser::parseVideoObjectLayer() {
#ifdef DEBUG
fprintf(stderr, "parsing VideoObjectLayer\n");
#endif
// The first 4 bytes must be a "video_object_layer_start_code".
// If not, this is a 'short video header', which we currently
// don't support:
u_int32_t next4Bytes = get4Bytes();
if (!isVideoObjectLayerStartCode(next4Bytes)) {
usingSource()->envir() << "MPEG4VideoStreamParser::parseVideoObjectLayer(): This appears to be a 'short video header', which we currently don't support\n";
}
// Now, copy all bytes that we see, up until we reach
// a GROUP_VOP_START_CODE or a VOP_START_CODE:
do {
saveToNextCode(next4Bytes);
} while (next4Bytes != GROUP_VOP_START_CODE
&& next4Bytes != VOP_START_CODE);
analyzeVOLHeader();
setParseState((next4Bytes == GROUP_VOP_START_CODE)
? PARSING_GROUP_OF_VIDEO_OBJECT_PLANE
: PARSING_VIDEO_OBJECT_PLANE);
// Compute this frame's presentation time:
usingSource()->computePresentationTime(fTotalTicksSinceLastTimeCode);
// This header ends the 'configuration' information:
usingSource()->appendToNewConfig(fStartOfFrame, curFrameSize());
usingSource()->completeNewConfig();
return curFrameSize();
}
unsigned MPEG4VideoStreamParser::parseGroupOfVideoObjectPlane() {
#ifdef DEBUG
fprintf(stderr, "parsing GroupOfVideoObjectPlane\n");
#endif
// Note that we've already read the GROUP_VOP_START_CODE
save4Bytes(GROUP_VOP_START_CODE);
// Next, extract the (18-bit) time code from the next 3 bytes:
u_int8_t next3Bytes[3];
getBytes(next3Bytes, 3);
saveByte(next3Bytes[0]);saveByte(next3Bytes[1]);saveByte(next3Bytes[2]);
unsigned time_code
= (next3Bytes[0]<<10)|(next3Bytes[1]<<2)|(next3Bytes[2]>>6);
unsigned time_code_hours = (time_code&0x0003E000)>>13;
unsigned time_code_minutes = (time_code&0x00001F80)>>7;
#if defined(DEBUG) || defined(DEBUG_TIMESTAMPS)
Boolean marker_bit = (time_code&0x00000040) != 0;
#endif
unsigned time_code_seconds = (time_code&0x0000003F);
#if defined(DEBUG) || defined(DEBUG_TIMESTAMPS)
fprintf(stderr, "time_code: 0x%05x, hours %d, minutes %d, marker_bit %d, seconds %d\n", time_code, time_code_hours, time_code_minutes, marker_bit, time_code_seconds);
#endif
fJustSawTimeCode = True;
// Now, copy all bytes that we see, up until we reach a VOP_START_CODE:
u_int32_t next4Bytes = get4Bytes();
while (next4Bytes != VOP_START_CODE) {
saveToNextCode(next4Bytes);
}
// Compute this frame's presentation time:
usingSource()->computePresentationTime(fTotalTicksSinceLastTimeCode);
// Record the time code:
usingSource()->setTimeCode(time_code_hours, time_code_minutes,
time_code_seconds, 0, 0);
// Note: Because the GOV header can appear anywhere (not just at a 1s point), we
// don't pass "fTotalTicksSinceLastTimeCode" as the "picturesSinceLastGOP" parameter.
fSecondsSinceLastTimeCode = 0;
if (fixed_vop_rate) fTotalTicksSinceLastTimeCode = 0;
setParseState(PARSING_VIDEO_OBJECT_PLANE);
return curFrameSize();
}
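The GOV header packs its time code into the top 18 bits of 3 bytes: 5 bits of hours, 6 of minutes, a marker bit, then 6 of seconds, exactly as the masks above extract them. A hedged sketch of the same unpacking as a standalone helper (the struct and function names are illustrative):

```cpp
#include <cassert>
#include <cstdint>

struct GovTimeCode { unsigned hours, minutes, seconds; bool markerBit; };

// Unpack the 18-bit GOV time code carried in the top bits of 3 bytes:
// 5 bits hours | 6 bits minutes | 1 marker bit | 6 bits seconds.
GovTimeCode decodeGovTimeCode(uint8_t const b[3]) {
  unsigned timeCode = (b[0] << 10) | (b[1] << 2) | (b[2] >> 6);
  GovTimeCode r;
  r.hours     = (timeCode & 0x0003E000) >> 13;
  r.minutes   = (timeCode & 0x00001F80) >> 7;
  r.markerBit = (timeCode & 0x00000040) != 0;
  r.seconds   =  timeCode & 0x0000003F;
  return r;
}
```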
unsigned MPEG4VideoStreamParser::parseVideoObjectPlane() {
#ifdef DEBUG
fprintf(stderr, "#parsing VideoObjectPlane\n");
#endif
// Note that we've already read the VOP_START_CODE
save4Bytes(VOP_START_CODE);
// Get the "vop_coding_type" from the next byte:
u_int8_t nextByte = get1Byte(); saveByte(nextByte);
u_int8_t vop_coding_type = nextByte>>6;
// Next, get the "modulo_time_base" by counting the '1' bits that follow.
// We look at the next 32-bits only. This should be enough in most cases.
u_int32_t next4Bytes = get4Bytes();
u_int32_t timeInfo = (nextByte<<(32-6))|(next4Bytes>>6);
unsigned modulo_time_base = 0;
u_int32_t mask = 0x80000000;
while ((timeInfo&mask) != 0) {
++modulo_time_base;
mask >>= 1;
}
mask >>= 1;
// Check the following marker bit:
if ((timeInfo&mask) == 0) {
usingSource()->envir() << "MPEG4VideoStreamParser::parseVideoObjectPlane(): marker bit not set!\n";
}
mask >>= 1;
// Then, get the "vop_time_increment".
// First, make sure we have enough bits left for this:
if ((mask>>(fNumVTIRBits-1)) == 0) {
usingSource()->envir() << "MPEG4VideoStreamParser::parseVideoObjectPlane(): 32-bits are not enough to get \"vop_time_increment\"!\n";
}
unsigned vop_time_increment = 0;
for (unsigned i = 0; i < fNumVTIRBits; ++i) {
vop_time_increment |= timeInfo&mask;
mask >>= 1;
}
while (mask != 0) {
vop_time_increment >>= 1;
mask >>= 1;
}
#ifdef DEBUG
fprintf(stderr, "vop_coding_type: %d(%c), modulo_time_base: %d, vop_time_increment: %d\n", vop_coding_type, "IPBS"[vop_coding_type], modulo_time_base, vop_time_increment);
#endif
// Now, copy all bytes that we see, up until we reach a code of some sort:
saveToNextCode(next4Bytes);
// Update our counters based on the frame timing information that we saw:
if (fixed_vop_time_increment > 0) {
// This is a 'fixed_vop_rate' stream. Use 'fixed_vop_time_increment':
usingSource()->fPictureCount += fixed_vop_time_increment;
if (vop_time_increment > 0 || modulo_time_base > 0) {
fTotalTicksSinceLastTimeCode += fixed_vop_time_increment;
// Note: "fSecondsSinceLastTimeCode" and "fPrevNewTotalTicks" are not used.
}
} else {
// Use 'vop_time_increment':
unsigned newTotalTicks
= (fSecondsSinceLastTimeCode + modulo_time_base)*vop_time_increment_resolution
+ vop_time_increment;
if (newTotalTicks == fPrevNewTotalTicks && fPrevNewTotalTicks > 0) {
// This is apparently a buggy MPEG-4 video stream, because
// "vop_time_increment" did not change. Overcome this error,
// by pretending that it did change.
#ifdef DEBUG
fprintf(stderr, "Buggy MPEG-4 video stream: \"vop_time_increment\" did not change!\n");
#endif
// The following assumes that we don't have 'B' frames. If we do, then TARFU!
usingSource()->fPictureCount += vop_time_increment;
fTotalTicksSinceLastTimeCode += vop_time_increment;
fSecondsSinceLastTimeCode += modulo_time_base;
} else {
if (newTotalTicks < fPrevNewTotalTicks && vop_coding_type != 2/*B*/
&& modulo_time_base == 0 && vop_time_increment == 0 && !fJustSawTimeCode) {
// This is another kind of buggy MPEG-4 video stream, in which
// "vop_time_increment" wraps around, but without
// "modulo_time_base" changing (or just having had a new time code).
// Overcome this by pretending that "vop_time_increment" *did* wrap around:
#ifdef DEBUG
fprintf(stderr, "Buggy MPEG-4 video stream: \"vop_time_increment\" wrapped around, but without \"modulo_time_base\" changing!\n");
#endif
++fSecondsSinceLastTimeCode;
newTotalTicks += vop_time_increment_resolution;
}
fPrevNewTotalTicks = newTotalTicks;
if (vop_coding_type != 2/*B*/) {
int pictureCountDelta = newTotalTicks - fTotalTicksSinceLastTimeCode;
if (pictureCountDelta <= 0) pictureCountDelta = fPrevPictureCountDelta;
// ensures that the picture count is always increasing
usingSource()->fPictureCount += pictureCountDelta;
fPrevPictureCountDelta = pictureCountDelta;
fTotalTicksSinceLastTimeCode = newTotalTicks;
fSecondsSinceLastTimeCode += modulo_time_base;
}
}
}
fJustSawTimeCode = False; // for next time
// The next thing to parse depends on the code that we just saw,
// but we are assumed to have ended the current picture:
usingSource()->fPictureEndMarker = True; // HACK #####
switch (next4Bytes) {
case VISUAL_OBJECT_SEQUENCE_END_CODE: {
setParseState(PARSING_VISUAL_OBJECT_SEQUENCE_END_CODE);
break;
}
case VISUAL_OBJECT_SEQUENCE_START_CODE: {
setParseState(PARSING_VISUAL_OBJECT_SEQUENCE_SEEN_CODE);
break;
}
case VISUAL_OBJECT_START_CODE: {
setParseState(PARSING_VISUAL_OBJECT);
break;
}
case GROUP_VOP_START_CODE: {
setParseState(PARSING_GROUP_OF_VIDEO_OBJECT_PLANE);
break;
}
case VOP_START_CODE: {
setParseState(PARSING_VIDEO_OBJECT_PLANE);
break;
}
default: {
if (isVideoObjectStartCode(next4Bytes)) {
setParseState(PARSING_VIDEO_OBJECT_LAYER);
} else if (isVideoObjectLayerStartCode(next4Bytes)){
// copy all bytes that we see, up until we reach a VOP_START_CODE:
u_int32_t next4Bytes = get4Bytes();
while (next4Bytes != VOP_START_CODE) {
saveToNextCode(next4Bytes);
}
setParseState(PARSING_VIDEO_OBJECT_PLANE);
} else {
usingSource()->envir() << "MPEG4VideoStreamParser::parseVideoObjectPlane(): Saw unexpected code "
<< (void*)next4Bytes << "\n";
setParseState(PARSING_VIDEO_OBJECT_PLANE); // the safest way to recover...
}
break;
}
}
// Compute this frame's presentation time:
usingSource()->computePresentationTime(fTotalTicksSinceLastTimeCode);
return curFrameSize();
}
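"parseVideoObjectPlane()" above recovers "modulo_time_base" by counting the run of '1' bits that precedes the first '0' in the 32 bits following "vop_coding_type". That run-counting step in isolation (the function name is hypothetical):

```cpp
#include <cassert>
#include <cstdint>

// Count the run of leading '1' bits in 'timeInfo' (MSB first) -- the same
// loop the parser uses to derive "modulo_time_base" from the VOP header.
unsigned countLeadingOnes(uint32_t timeInfo) {
  unsigned count = 0;
  for (uint32_t mask = 0x80000000; mask != 0 && (timeInfo & mask) != 0; mask >>= 1) {
    ++count;
  }
  return count;
}
```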
unsigned MPEG4VideoStreamParser::parseVisualObjectSequenceEndCode() {
#ifdef DEBUG
fprintf(stderr, "parsing VISUAL_OBJECT_SEQUENCE_END_CODE\n");
#endif
// Note that we've already read the VISUAL_OBJECT_SEQUENCE_END_CODE
save4Bytes(VISUAL_OBJECT_SEQUENCE_END_CODE);
setParseState(PARSING_VISUAL_OBJECT_SEQUENCE);
// Treat this as if we had ended a picture:
usingSource()->fPictureEndMarker = True; // HACK #####
return curFrameSize();
}
live/liveMedia/OggFile.cpp
/**********

This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A class that encapsulates an Ogg file.
// Implementation
#include "OggFileParser.hh"
#include "OggDemuxedTrack.hh"
#include "ByteStreamFileSource.hh"
#include "VorbisAudioRTPSink.hh"
#include "SimpleRTPSink.hh"
#include "TheoraVideoRTPSink.hh"
////////// OggTrackTable definition /////////
// For looking up and iterating over the file's tracks:
class OggTrackTable {
public:
OggTrackTable();
virtual ~OggTrackTable();
void add(OggTrack* newTrack);
OggTrack* lookup(u_int32_t trackNumber);
unsigned numTracks() const;
private:
friend class OggTrackTableIterator;
HashTable* fTable;
};
////////// OggFile implementation //////////
void OggFile::createNew(UsageEnvironment& env, char const* fileName,
onCreationFunc* onCreation, void* onCreationClientData) {
new OggFile(env, fileName, onCreation, onCreationClientData);
}
OggTrack* OggFile::lookup(u_int32_t trackNumber) {
return fTrackTable->lookup(trackNumber);
}
OggDemux* OggFile::newDemux() {
OggDemux* demux = new OggDemux(*this);
fDemuxesTable->Add((char const*)demux, demux);
return demux;
}
unsigned OggFile::numTracks() const {
return fTrackTable->numTracks();
}
FramedSource* OggFile
::createSourceForStreaming(FramedSource* baseSource, u_int32_t trackNumber,
unsigned& estBitrate, unsigned& numFiltersInFrontOfTrack) {
if (baseSource == NULL) return NULL;
FramedSource* result = baseSource; // by default
numFiltersInFrontOfTrack = 0; // by default
// Look at the track's MIME type to set its estimated bitrate (for use by RTCP).
// (Later, try to be smarter about figuring out the bitrate.) #####
// Some MIME types also require adding a special 'framer' in front of the source.
OggTrack* track = lookup(trackNumber);
if (track != NULL) { // should always be true
estBitrate = track->estBitrate;
}
return result;
}
RTPSink* OggFile
::createRTPSinkForTrackNumber(u_int32_t trackNumber, Groupsock* rtpGroupsock,
unsigned char rtpPayloadTypeIfDynamic) {
OggTrack* track = lookup(trackNumber);
if (track == NULL || track->mimeType == NULL) return NULL;
RTPSink* result = NULL; // default value for unknown media types
if (strcmp(track->mimeType, "audio/VORBIS") == 0) {
// For Vorbis audio, we use the special "identification", "comment", and "setup" headers
// that we read when we initially read the headers at the start of the file:
result = VorbisAudioRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
track->samplingFrequency, track->numChannels,
track->vtoHdrs.header[0], track->vtoHdrs.headerSize[0],
track->vtoHdrs.header[1], track->vtoHdrs.headerSize[1],
track->vtoHdrs.header[2], track->vtoHdrs.headerSize[2]);
} else if (strcmp(track->mimeType, "audio/OPUS") == 0) {
result = SimpleRTPSink
::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
48000, "audio", "OPUS", 2, False/*only 1 Opus 'packet' in each RTP packet*/);
} else if (strcmp(track->mimeType, "video/THEORA") == 0) {
// For Theora video, we use the special "identification", "comment", and "setup" headers
// that we read when we initially read the headers at the start of the file:
result = TheoraVideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic,
track->vtoHdrs.header[0], track->vtoHdrs.headerSize[0],
track->vtoHdrs.header[1], track->vtoHdrs.headerSize[1],
track->vtoHdrs.header[2], track->vtoHdrs.headerSize[2]);
}
return result;
}
OggFile::OggFile(UsageEnvironment& env, char const* fileName,
onCreationFunc* onCreation, void* onCreationClientData)
: Medium(env),
fFileName(strDup(fileName)),
fOnCreation(onCreation), fOnCreationClientData(onCreationClientData) {
fTrackTable = new OggTrackTable;
fDemuxesTable = HashTable::create(ONE_WORD_HASH_KEYS);
FramedSource* inputSource = ByteStreamFileSource::createNew(envir(), fileName);
if (inputSource == NULL) {
// The specified input file does not exist!
fParserForInitialization = NULL;
handleEndOfBosPageParsing(); // we have no file, and thus no tracks, but we still need to signal this
} else {
// Initialize ourselves by parsing the file's headers:
fParserForInitialization
= new OggFileParser(*this, inputSource, handleEndOfBosPageParsing, this);
}
}
OggFile::~OggFile() {
delete fParserForInitialization;
// Delete any outstanding "OggDemux"s, and the table for them:
OggDemux* demux;
while ((demux = (OggDemux*)fDemuxesTable->RemoveNext()) != NULL) {
delete demux;
}
delete fDemuxesTable;
delete fTrackTable;
delete[] (char*)fFileName;
}
void OggFile::handleEndOfBosPageParsing(void* clientData) {
((OggFile*)clientData)->handleEndOfBosPageParsing();
}
void OggFile::handleEndOfBosPageParsing() {
// Delete our parser, because it's done its job now:
delete fParserForInitialization; fParserForInitialization = NULL;
// Finally, signal our caller that we've been created and initialized:
if (fOnCreation != NULL) (*fOnCreation)(this, fOnCreationClientData);
}
void OggFile::addTrack(OggTrack* newTrack) {
fTrackTable->add(newTrack);
}
void OggFile::removeDemux(OggDemux* demux) {
fDemuxesTable->Remove((char const*)demux);
}
////////// OggTrackTable implementation /////////
OggTrackTable::OggTrackTable()
: fTable(HashTable::create(ONE_WORD_HASH_KEYS)) {
}
OggTrackTable::~OggTrackTable() {
// Remove and delete all of our "OggTrack" descriptors, and the hash table itself:
OggTrack* track;
while ((track = (OggTrack*)fTable->RemoveNext()) != NULL) {
delete track;
}
delete fTable;
}
void OggTrackTable::add(OggTrack* newTrack) {
OggTrack* existingTrack
= (OggTrack*)fTable->Add((char const*)newTrack->trackNumber, newTrack);
delete existingTrack; // if any
}
OggTrack* OggTrackTable::lookup(u_int32_t trackNumber) {
return (OggTrack*)fTable->Lookup((char const*)trackNumber);
}
unsigned OggTrackTable::numTracks() const { return fTable->numEntries(); }
OggTrackTableIterator::OggTrackTableIterator(OggTrackTable& ourTable) {
fIter = HashTable::Iterator::create(*(ourTable.fTable));
}
OggTrackTableIterator::~OggTrackTableIterator() {
delete fIter;
}
OggTrack* OggTrackTableIterator::next() {
char const* key;
return (OggTrack*)fIter->next(key);
}
////////// OggTrack implementation //////////
OggTrack::OggTrack()
: trackNumber(0), mimeType(NULL),
samplingFrequency(48000), numChannels(2), estBitrate(100) { // default settings
vtoHdrs.header[0] = vtoHdrs.header[1] = vtoHdrs.header[2] = NULL;
vtoHdrs.headerSize[0] = vtoHdrs.headerSize[1] = vtoHdrs.headerSize[2] = 0;
vtoHdrs.vorbis_mode_count = 0;
vtoHdrs.vorbis_mode_blockflag = NULL;
}
OggTrack::~OggTrack() {
delete[] vtoHdrs.header[0]; delete[] vtoHdrs.header[1]; delete[] vtoHdrs.header[2];
delete[] vtoHdrs.vorbis_mode_blockflag;
}
///////// OggDemux implementation /////////
FramedSource* OggDemux::newDemuxedTrack(u_int32_t& resultTrackNumber) {
OggTrack* nextTrack;
do {
nextTrack = fIter->next();
} while (nextTrack != NULL && nextTrack->mimeType == NULL);
if (nextTrack == NULL) { // no more tracks
resultTrackNumber = 0;
return NULL;
}
resultTrackNumber = nextTrack->trackNumber;
FramedSource* trackSource = new OggDemuxedTrack(envir(), resultTrackNumber, *this);
fDemuxedTracksTable->Add((char const*)resultTrackNumber, trackSource);
return trackSource;
}
FramedSource* OggDemux::newDemuxedTrackByTrackNumber(unsigned trackNumber) {
if (trackNumber == 0) return NULL;
FramedSource* trackSource = new OggDemuxedTrack(envir(), trackNumber, *this);
fDemuxedTracksTable->Add((char const*)trackNumber, trackSource);
return trackSource;
}
OggDemuxedTrack* OggDemux::lookupDemuxedTrack(u_int32_t trackNumber) {
return (OggDemuxedTrack*)fDemuxedTracksTable->Lookup((char const*)trackNumber);
}
OggDemux::OggDemux(OggFile& ourFile)
: Medium(ourFile.envir()),
fOurFile(ourFile), fDemuxedTracksTable(HashTable::create(ONE_WORD_HASH_KEYS)),
fIter(new OggTrackTableIterator(*fOurFile.fTrackTable)) {
FramedSource* fileSource = ByteStreamFileSource::createNew(envir(), ourFile.fileName());
fOurParser = new OggFileParser(ourFile, fileSource, handleEndOfFile, this, this);
}
OggDemux::~OggDemux() {
// Begin by acting as if we've reached the end of the source file.
// This should cause all of our demuxed tracks to get closed.
handleEndOfFile();
// Then delete our table of "OggDemuxedTrack"s
// - but not the "OggDemuxedTrack"s themselves; that should have already happened:
delete fDemuxedTracksTable;
delete fIter;
delete fOurParser;
fOurFile.removeDemux(this);
}
void OggDemux::removeTrack(u_int32_t trackNumber) {
fDemuxedTracksTable->Remove((char const*)trackNumber);
if (fDemuxedTracksTable->numEntries() == 0) {
// We no longer have any demuxed tracks, so delete ourselves now:
delete this;
}
}
void OggDemux::continueReading() {
fOurParser->continueParsing();
}
void OggDemux::handleEndOfFile(void* clientData) {
((OggDemux*)clientData)->handleEndOfFile();
}
void OggDemux::handleEndOfFile() {
// Iterate through all of our 'demuxed tracks', handling 'end of input' on each one.
// Hack: Because this can cause the hash table to get modified underneath us,
// we don't call the handlers until after we've first iterated through all of the tracks.
unsigned numTracks = fDemuxedTracksTable->numEntries();
if (numTracks == 0) return;
OggDemuxedTrack** tracks = new OggDemuxedTrack*[numTracks];
HashTable::Iterator* iter = HashTable::Iterator::create(*fDemuxedTracksTable);
unsigned i;
char const* trackNumber;
for (i = 0; i < numTracks; ++i) {
tracks[i] = (OggDemuxedTrack*)iter->next(trackNumber);
}
delete iter;
for (i = 0; i < numTracks; ++i) {
if (tracks[i] == NULL) continue; // sanity check; shouldn't happen
tracks[i]->handleClosure();
}
delete[] tracks;
}
live/liveMedia/MPEGVideoStreamFramer.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A filter that breaks up an MPEG video elementary stream into
// headers and frames
// Implementation
#include "MPEGVideoStreamParser.hh"
#include
////////// TimeCode implementation //////////
TimeCode::TimeCode()
: days(0), hours(0), minutes(0), seconds(0), pictures(0) {
}
TimeCode::~TimeCode() {
}
int TimeCode::operator==(TimeCode const& arg2) {
return pictures == arg2.pictures && seconds == arg2.seconds
&& minutes == arg2.minutes && hours == arg2.hours && days == arg2.days;
}
////////// MPEGVideoStreamFramer implementation //////////
MPEGVideoStreamFramer::MPEGVideoStreamFramer(UsageEnvironment& env,
FramedSource* inputSource)
: FramedFilter(env, inputSource),
fFrameRate(0.0) /* until we learn otherwise */,
fParser(NULL) {
reset();
}
MPEGVideoStreamFramer::~MPEGVideoStreamFramer() {
delete fParser;
}
void MPEGVideoStreamFramer::flushInput() {
reset();
if (fParser != NULL) fParser->flushInput();
}
void MPEGVideoStreamFramer::reset() {
fPictureCount = 0;
fPictureEndMarker = False;
fPicturesAdjustment = 0;
fPictureTimeBase = 0.0;
fTcSecsBase = 0;
fHaveSeenFirstTimeCode = False;
// Use the current wallclock time as the base 'presentation time':
gettimeofday(&fPresentationTimeBase, NULL);
}
#ifdef DEBUG
static struct timeval firstPT;
#endif
void MPEGVideoStreamFramer
::computePresentationTime(unsigned numAdditionalPictures) {
// Computes "fPresentationTime" from the most recent GOP's
// time_code, along with the "numAdditionalPictures" parameter:
TimeCode& tc = fCurGOPTimeCode;
unsigned tcSecs
= (((tc.days*24)+tc.hours)*60+tc.minutes)*60+tc.seconds - fTcSecsBase;
double pictureTime = fFrameRate == 0.0 ? 0.0
: (tc.pictures + fPicturesAdjustment + numAdditionalPictures)/fFrameRate;
while (pictureTime < fPictureTimeBase) { // "if" should be enough, but just in case
if (tcSecs > 0) tcSecs -= 1;
pictureTime += 1.0;
}
pictureTime -= fPictureTimeBase;
if (pictureTime < 0.0) pictureTime = 0.0; // sanity check
unsigned pictureSeconds = (unsigned)pictureTime;
double pictureFractionOfSecond = pictureTime - (double)pictureSeconds;
fPresentationTime = fPresentationTimeBase;
fPresentationTime.tv_sec += tcSecs + pictureSeconds;
fPresentationTime.tv_usec += (long)(pictureFractionOfSecond*1000000.0);
if (fPresentationTime.tv_usec >= 1000000) {
fPresentationTime.tv_usec -= 1000000;
++fPresentationTime.tv_sec;
}
#ifdef DEBUG
if (firstPT.tv_sec == 0 && firstPT.tv_usec == 0) firstPT = fPresentationTime;
struct timeval diffPT;
diffPT.tv_sec = fPresentationTime.tv_sec - firstPT.tv_sec;
diffPT.tv_usec = fPresentationTime.tv_usec - firstPT.tv_usec;
if (fPresentationTime.tv_usec < firstPT.tv_usec) {
--diffPT.tv_sec;
diffPT.tv_usec += 1000000;
}
fprintf(stderr, "MPEGVideoStreamFramer::computePresentationTime(%d) -> %lu.%06ld [%lu.%06ld]\n", numAdditionalPictures, fPresentationTime.tv_sec, fPresentationTime.tv_usec, diffPT.tv_sec, diffPT.tv_usec);
#endif
}
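"computePresentationTime()" above splits the fractional picture time into whole seconds plus microseconds, adds both to the base timeval, and carries any microsecond overflow into the seconds field. The carry step as a standalone helper (a minimal sketch; the function name is illustrative):

```cpp
#include <cassert>
#include <sys/time.h>

// Add 'seconds' (possibly fractional) to a base timeval, normalizing so that
// tv_usec stays in [0, 1000000) -- the same carry computePresentationTime() does.
struct timeval addSeconds(struct timeval base, double seconds) {
  unsigned whole = (unsigned)seconds;
  double frac = seconds - (double)whole;
  base.tv_sec += whole;
  base.tv_usec += (long)(frac * 1000000.0);
  if (base.tv_usec >= 1000000) {
    base.tv_usec -= 1000000;
    ++base.tv_sec;
  }
  return base;
}
```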
void MPEGVideoStreamFramer
::setTimeCode(unsigned hours, unsigned minutes, unsigned seconds,
unsigned pictures, unsigned picturesSinceLastGOP) {
TimeCode& tc = fCurGOPTimeCode; // abbrev
unsigned days = tc.days;
if (hours < tc.hours) {
// Assume that the 'day' has wrapped around:
++days;
}
tc.days = days;
tc.hours = hours;
tc.minutes = minutes;
tc.seconds = seconds;
tc.pictures = pictures;
if (!fHaveSeenFirstTimeCode) {
fPictureTimeBase = fFrameRate == 0.0 ? 0.0 : tc.pictures/fFrameRate;
fTcSecsBase = (((tc.days*24)+tc.hours)*60+tc.minutes)*60+tc.seconds;
fHaveSeenFirstTimeCode = True;
} else if (fCurGOPTimeCode == fPrevGOPTimeCode) {
// The time code has not changed since last time. Adjust for this:
fPicturesAdjustment += picturesSinceLastGOP;
} else {
// Normal case: The time code changed since last time.
fPrevGOPTimeCode = tc;
fPicturesAdjustment = 0;
}
}
void MPEGVideoStreamFramer::doGetNextFrame() {
fParser->registerReadInterest(fTo, fMaxSize);
continueReadProcessing();
}
void MPEGVideoStreamFramer::doStopGettingFrames() {
flushInput();
FramedFilter::doStopGettingFrames();
}
void MPEGVideoStreamFramer
::continueReadProcessing(void* clientData,
unsigned char* /*ptr*/, unsigned /*size*/,
struct timeval /*presentationTime*/) {
MPEGVideoStreamFramer* framer = (MPEGVideoStreamFramer*)clientData;
framer->continueReadProcessing();
}
void MPEGVideoStreamFramer::continueReadProcessing() {
unsigned acquiredFrameSize = fParser->parse();
if (acquiredFrameSize > 0) {
// We were able to acquire a frame from the input.
// It has already been copied to the reader's space.
fFrameSize = acquiredFrameSize;
fNumTruncatedBytes = fParser->numTruncatedBytes();
// "fPresentationTime" should have already been computed.
// Compute "fDurationInMicroseconds" now:
fDurationInMicroseconds
= (fFrameRate == 0.0 || ((int)fPictureCount) < 0) ? 0
: (unsigned)((fPictureCount*1000000)/fFrameRate);
#ifdef DEBUG
fprintf(stderr, "%d bytes @%u.%06d, fDurationInMicroseconds: %d ((%d*1000000)/%f)\n", acquiredFrameSize, fPresentationTime.tv_sec, fPresentationTime.tv_usec, fDurationInMicroseconds, fPictureCount, fFrameRate);
#endif
fPictureCount = 0;
// Call our own 'after getting' function. Because we're not a 'leaf'
// source, we can call this directly, without risking infinite recursion.
afterGetting(this);
} else {
// We were unable to parse a complete frame from the input, because:
// - we had to read more data from the source stream, or
// - the source stream has ended.
}
}
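The fDurationInMicroseconds value above is derived from the number of pictures acquired and the stream's frame rate, with a guard for a zero (or corrupted) rate. The same computation as a standalone sketch (durationMicroseconds is a hypothetical name, not the library's):

```cpp
#include <cassert>

// Duration of 'pictureCount' pictures at 'frameRate' frames/sec, in
// microseconds, guarding against a zero frame rate and a negative count
// the way continueReadProcessing() does.
unsigned durationMicroseconds(int pictureCount, double frameRate) {
  if (frameRate == 0.0 || pictureCount < 0) return 0;
  return (unsigned)((pictureCount*1000000.0)/frameRate);
}
```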
live/liveMedia/MPEGVideoStreamParser.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// An abstract parser for MPEG video streams
// Implementation
#include "MPEGVideoStreamParser.hh"
MPEGVideoStreamParser
::MPEGVideoStreamParser(MPEGVideoStreamFramer* usingSource,
FramedSource* inputSource)
: StreamParser(inputSource, FramedSource::handleClosure, usingSource,
&MPEGVideoStreamFramer::continueReadProcessing, usingSource),
fUsingSource(usingSource) {
}
MPEGVideoStreamParser::~MPEGVideoStreamParser() {
}
void MPEGVideoStreamParser::restoreSavedParserState() {
StreamParser::restoreSavedParserState();
fTo = fSavedTo;
fNumTruncatedBytes = fSavedNumTruncatedBytes;
}
void MPEGVideoStreamParser::registerReadInterest(unsigned char* to,
unsigned maxSize) {
fStartOfFrame = fTo = fSavedTo = to;
fLimit = to + maxSize;
fNumTruncatedBytes = fSavedNumTruncatedBytes = 0;
}
live/liveMedia/MPEGVideoStreamParser.hh
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// An abstract parser for MPEG video streams
// C++ header
#ifndef _MPEG_VIDEO_STREAM_PARSER_HH
#define _MPEG_VIDEO_STREAM_PARSER_HH
#ifndef _STREAM_PARSER_HH
#include "StreamParser.hh"
#endif
#ifndef _MPEG_VIDEO_STREAM_FRAMER_HH
#include "MPEGVideoStreamFramer.hh"
#endif
////////// MPEGVideoStreamParser definition //////////
class MPEGVideoStreamParser: public StreamParser {
public:
MPEGVideoStreamParser(MPEGVideoStreamFramer* usingSource,
FramedSource* inputSource);
virtual ~MPEGVideoStreamParser();
public:
void registerReadInterest(unsigned char* to, unsigned maxSize);
virtual unsigned parse() = 0;
// returns the size of the frame that was acquired, or 0 if none was acquired
// The number of truncated bytes (if any) is given by:
unsigned numTruncatedBytes() const { return fNumTruncatedBytes; }
protected:
void setParseState() {
fSavedTo = fTo;
fSavedNumTruncatedBytes = fNumTruncatedBytes;
saveParserState();
}
// Record "byte" in the current output frame:
void saveByte(u_int8_t byte) {
if (fTo >= fLimit) { // there's no space left
++fNumTruncatedBytes;
return;
}
*fTo++ = byte;
}
void save4Bytes(u_int32_t word) {
if (fTo+4 > fLimit) { // there's no space left
fNumTruncatedBytes += 4;
return;
}
*fTo++ = word>>24; *fTo++ = word>>16; *fTo++ = word>>8; *fTo++ = word;
}
// Save data until we see a sync word (0x000001xx):
void saveToNextCode(u_int32_t& curWord) {
saveByte(curWord>>24);
curWord = (curWord<<8)|get1Byte();
while ((curWord&0xFFFFFF00) != 0x00000100) {
if ((unsigned)(curWord&0xFF) > 1) {
// a sync word definitely doesn't begin anywhere in "curWord"
save4Bytes(curWord);
curWord = get4Bytes();
} else {
// a sync word might begin in "curWord", although not at its start
saveByte(curWord>>24);
unsigned char newByte = get1Byte();
curWord = (curWord<<8)|newByte;
}
}
}
// Skip data until we see a sync word (0x000001xx):
void skipToNextCode(u_int32_t& curWord) {
curWord = (curWord<<8)|get1Byte();
while ((curWord&0xFFFFFF00) != 0x00000100) {
if ((unsigned)(curWord&0xFF) > 1) {
// a sync word definitely doesn't begin anywhere in "curWord"
curWord = get4Bytes();
} else {
// a sync word might begin in "curWord", although not at its start
unsigned char newByte = get1Byte();
curWord = (curWord<<8)|newByte;
}
}
}
protected:
MPEGVideoStreamFramer* fUsingSource;
// state of the frame that's currently being read:
unsigned char* fStartOfFrame;
unsigned char* fTo;
unsigned char* fLimit;
unsigned fNumTruncatedBytes;
unsigned curFrameSize() { return fTo - fStartOfFrame; }
unsigned char* fSavedTo;
unsigned fSavedNumTruncatedBytes;
private: // redefined virtual functions
virtual void restoreSavedParserState();
};
#endif
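The saveToNextCode() and skipToNextCode() members above both scan for an MPEG start code, i.e. the byte pattern 00 00 01 xx. A simplified byte-wise illustration of the same search (the word-at-a-time loops above are faster; findStartCode is a hypothetical helper, not part of the library):

```cpp
#include <cassert>
#include <cstddef>

// Return the index of the first MPEG start code (00 00 01 followed by a
// code byte) in 'buf', or 'len' if none is present.
size_t findStartCode(const unsigned char* buf, size_t len) {
  for (size_t i = 0; i + 3 < len; ++i) {
    if (buf[i] == 0 && buf[i+1] == 0 && buf[i+2] == 1) return i;
  }
  return len;
}
```

The parser classes avoid this naive rescan by keeping the last four bytes in a 32-bit "curWord" and testing `(curWord&0xFFFFFF00) != 0x00000100` as each new byte is shifted in.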
live/liveMedia/MultiFramedRTPSink.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// RTP sink for a common kind of payload format: Those which pack multiple,
// complete codec frames (as many as possible) into each RTP packet.
// Implementation
#include "MultiFramedRTPSink.hh"
#include "GroupsockHelper.hh"
////////// MultiFramedRTPSink //////////
void MultiFramedRTPSink::setPacketSizes(unsigned preferredPacketSize,
unsigned maxPacketSize) {
if (preferredPacketSize > maxPacketSize || preferredPacketSize == 0) return;
// sanity check
delete fOutBuf;
fOutBuf = new OutPacketBuffer(preferredPacketSize, maxPacketSize);
fOurMaxPacketSize = maxPacketSize; // save value, in case subclasses need it
}
MultiFramedRTPSink::MultiFramedRTPSink(UsageEnvironment& env,
Groupsock* rtpGS,
unsigned char rtpPayloadType,
unsigned rtpTimestampFrequency,
char const* rtpPayloadFormatName,
unsigned numChannels)
: RTPSink(env, rtpGS, rtpPayloadType, rtpTimestampFrequency,
rtpPayloadFormatName, numChannels),
fOutBuf(NULL), fCurFragmentationOffset(0), fPreviousFrameEndedFragmentation(False),
fOnSendErrorFunc(NULL), fOnSendErrorData(NULL) {
setPacketSizes(1000, 1456);
// Default max packet size (1500, minus allowance for IP, UDP, UMTP headers)
// (Also, make it a multiple of 4 bytes, just in case that matters.)
}
MultiFramedRTPSink::~MultiFramedRTPSink() {
delete fOutBuf;
}
void MultiFramedRTPSink
::doSpecialFrameHandling(unsigned /*fragmentationOffset*/,
unsigned char* /*frameStart*/,
unsigned /*numBytesInFrame*/,
struct timeval framePresentationTime,
unsigned /*numRemainingBytes*/) {
// default implementation: If this is the first frame in the packet,
// use its presentationTime for the RTP timestamp:
if (isFirstFrameInPacket()) {
setTimestamp(framePresentationTime);
}
}
Boolean MultiFramedRTPSink::allowFragmentationAfterStart() const {
return False; // by default
}
Boolean MultiFramedRTPSink::allowOtherFramesAfterLastFragment() const {
return False; // by default
}
Boolean MultiFramedRTPSink
::frameCanAppearAfterPacketStart(unsigned char const* /*frameStart*/,
unsigned /*numBytesInFrame*/) const {
return True; // by default
}
unsigned MultiFramedRTPSink::specialHeaderSize() const {
// default implementation: Assume no special header:
return 0;
}
unsigned MultiFramedRTPSink::frameSpecificHeaderSize() const {
// default implementation: Assume no frame-specific header:
return 0;
}
unsigned MultiFramedRTPSink::computeOverflowForNewFrame(unsigned newFrameSize) const {
// default implementation: Just call numOverflowBytes()
return fOutBuf->numOverflowBytes(newFrameSize);
}
void MultiFramedRTPSink::setMarkerBit() {
unsigned rtpHdr = fOutBuf->extractWord(0);
rtpHdr |= 0x00800000;
fOutBuf->insertWord(rtpHdr, 0);
}
void MultiFramedRTPSink::setTimestamp(struct timeval framePresentationTime) {
// First, convert the presentation time to a 32-bit RTP timestamp:
fCurrentTimestamp = convertToRTPTimestamp(framePresentationTime);
// Then, insert it into the RTP packet:
fOutBuf->insertWord(fCurrentTimestamp, fTimestampPosition);
}
void MultiFramedRTPSink::setSpecialHeaderWord(unsigned word,
unsigned wordPosition) {
fOutBuf->insertWord(word, fSpecialHeaderPosition + 4*wordPosition);
}
void MultiFramedRTPSink::setSpecialHeaderBytes(unsigned char const* bytes,
unsigned numBytes,
unsigned bytePosition) {
fOutBuf->insert(bytes, numBytes, fSpecialHeaderPosition + bytePosition);
}
void MultiFramedRTPSink::setFrameSpecificHeaderWord(unsigned word,
unsigned wordPosition) {
fOutBuf->insertWord(word, fCurFrameSpecificHeaderPosition + 4*wordPosition);
}
void MultiFramedRTPSink::setFrameSpecificHeaderBytes(unsigned char const* bytes,
unsigned numBytes,
unsigned bytePosition) {
fOutBuf->insert(bytes, numBytes, fCurFrameSpecificHeaderPosition + bytePosition);
}
void MultiFramedRTPSink::setFramePadding(unsigned numPaddingBytes) {
if (numPaddingBytes > 0) {
// Add the padding bytes (with the last one being the padding size):
unsigned char paddingBuffer[255]; //max padding
memset(paddingBuffer, 0, numPaddingBytes);
paddingBuffer[numPaddingBytes-1] = numPaddingBytes;
fOutBuf->enqueue(paddingBuffer, numPaddingBytes);
// Set the RTP padding bit:
unsigned rtpHdr = fOutBuf->extractWord(0);
rtpHdr |= 0x20000000;
fOutBuf->insertWord(rtpHdr, 0);
}
}
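setFramePadding() follows the RTP padding convention: the padding octets are zero except the last, which carries the padding count (RFC 3550). A small sketch of the same layout (makeRtpPadding is an illustrative name, not the library's):

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Build an RTP-style padding run: zero bytes, with the final byte holding
// the padding count, as setFramePadding() writes into its paddingBuffer.
std::vector<uint8_t> makeRtpPadding(unsigned numPaddingBytes) {
  std::vector<uint8_t> pad(numPaddingBytes, 0);
  if (numPaddingBytes > 0) pad[numPaddingBytes - 1] = (uint8_t)numPaddingBytes;
  return pad;
}
```

A receiver that sees the P bit (0x20000000 in the first header word) set reads the last byte of the packet to learn how many trailing bytes to discard.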
Boolean MultiFramedRTPSink::continuePlaying() {
// Send the first packet.
// (This will also schedule any future sends.)
buildAndSendPacket(True);
return True;
}
void MultiFramedRTPSink::stopPlaying() {
fOutBuf->resetPacketStart();
fOutBuf->resetOffset();
fOutBuf->resetOverflowData();
// Then call the default "stopPlaying()" function:
MediaSink::stopPlaying();
}
void MultiFramedRTPSink::buildAndSendPacket(Boolean isFirstPacket) {
fIsFirstPacket = isFirstPacket;
// Set up the RTP header:
unsigned rtpHdr = 0x80000000; // RTP version 2; marker ('M') bit not set (by default; it can be set later)
rtpHdr |= (fRTPPayloadType<<16);
rtpHdr |= fSeqNo; // sequence number
fOutBuf->enqueueWord(rtpHdr);
// Note where the RTP timestamp will go.
// (We can't fill this in until we start packing payload frames.)
fTimestampPosition = fOutBuf->curPacketSize();
fOutBuf->skipBytes(4); // leave a hole for the timestamp
fOutBuf->enqueueWord(SSRC());
// Allow for a special, payload-format-specific header following the
// RTP header:
fSpecialHeaderPosition = fOutBuf->curPacketSize();
fSpecialHeaderSize = specialHeaderSize();
fOutBuf->skipBytes(fSpecialHeaderSize);
// Begin packing as many (complete) frames into the packet as we can:
fTotalFrameSpecificHeaderSizes = 0;
fNoFramesLeft = False;
fNumFramesUsedSoFar = 0;
packFrame();
}
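buildAndSendPacket() assembles the first 32-bit word of the RTP header from the version bits, payload type, and sequence number. The same bit layout as a standalone sketch (makeRtpFirstWord is a hypothetical name):

```cpp
#include <cassert>
#include <cstdint>

// Assemble the first 32-bit word of an RTP header the way
// buildAndSendPacket() does: version 2 in the top two bits, the payload
// type in bits 16-22, the sequence number in the low 16 bits
// (marker and padding bits initially clear).
uint32_t makeRtpFirstWord(uint8_t payloadType, uint16_t seqNo) {
  uint32_t w = 0x80000000u;                    // RTP version 2
  w |= (uint32_t)(payloadType & 0x7F) << 16;   // payload type
  w |= seqNo;                                  // sequence number
  return w;
}
```

setMarkerBit() and setFramePadding() later OR in 0x00800000 (M bit) and 0x20000000 (P bit) respectively, which is why the word is built with those bits clear.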
void MultiFramedRTPSink::packFrame() {
// Get the next frame.
// First, see if we have an overflow frame that was too big for the last pkt
if (fOutBuf->haveOverflowData()) {
// Use this frame before reading a new one from the source
unsigned frameSize = fOutBuf->overflowDataSize();
struct timeval presentationTime = fOutBuf->overflowPresentationTime();
unsigned durationInMicroseconds = fOutBuf->overflowDurationInMicroseconds();
fOutBuf->useOverflowData();
afterGettingFrame1(frameSize, 0, presentationTime, durationInMicroseconds);
} else {
// Normal case: we need to read a new frame from the source
if (fSource == NULL) return;
fCurFrameSpecificHeaderPosition = fOutBuf->curPacketSize();
fCurFrameSpecificHeaderSize = frameSpecificHeaderSize();
fOutBuf->skipBytes(fCurFrameSpecificHeaderSize);
fTotalFrameSpecificHeaderSizes += fCurFrameSpecificHeaderSize;
fSource->getNextFrame(fOutBuf->curPtr(), fOutBuf->totalBytesAvailable(),
afterGettingFrame, this, ourHandleClosure, this);
}
}
void MultiFramedRTPSink
::afterGettingFrame(void* clientData, unsigned numBytesRead,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
MultiFramedRTPSink* sink = (MultiFramedRTPSink*)clientData;
sink->afterGettingFrame1(numBytesRead, numTruncatedBytes,
presentationTime, durationInMicroseconds);
}
void MultiFramedRTPSink
::afterGettingFrame1(unsigned frameSize, unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds) {
if (fIsFirstPacket) {
// Record the fact that we're starting to play now:
gettimeofday(&fNextSendTime, NULL);
}
fMostRecentPresentationTime = presentationTime;
if (fInitialPresentationTime.tv_sec == 0 && fInitialPresentationTime.tv_usec == 0) {
fInitialPresentationTime = presentationTime;
}
if (numTruncatedBytes > 0) {
unsigned const bufferSize = fOutBuf->totalBytesAvailable();
envir() << "MultiFramedRTPSink::afterGettingFrame1(): The input frame data was too large for our buffer size ("
<< bufferSize << "). "
<< numTruncatedBytes << " bytes of trailing data was dropped! Correct this by increasing \"OutPacketBuffer::maxSize\" to at least "
<< OutPacketBuffer::maxSize + numTruncatedBytes << ", *before* creating this 'RTPSink'. (Current value is "
<< OutPacketBuffer::maxSize << ".)\n";
}
unsigned curFragmentationOffset = fCurFragmentationOffset;
unsigned numFrameBytesToUse = frameSize;
unsigned overflowBytes = 0;
// If we have already packed one or more frames into this packet,
// check whether this new frame is eligible to be packed after them.
// (This is independent of whether the packet has enough room for this
// new frame; that check comes later.)
if (fNumFramesUsedSoFar > 0) {
if ((fPreviousFrameEndedFragmentation
&& !allowOtherFramesAfterLastFragment())
|| !frameCanAppearAfterPacketStart(fOutBuf->curPtr(), frameSize)) {
// Save away this frame for next time:
numFrameBytesToUse = 0;
fOutBuf->setOverflowData(fOutBuf->curPacketSize(), frameSize,
presentationTime, durationInMicroseconds);
}
}
fPreviousFrameEndedFragmentation = False;
if (numFrameBytesToUse > 0) {
// Check whether this frame overflows the packet
if (fOutBuf->wouldOverflow(frameSize)) {
// Don't use this frame now; instead, save it as overflow data, and
// send it in the next packet instead. However, if the frame is too
// big to fit in a packet by itself, then we need to fragment it (and
// use some of it in this packet, if the payload format permits this.)
if (isTooBigForAPacket(frameSize)
&& (fNumFramesUsedSoFar == 0 || allowFragmentationAfterStart())) {
// We need to fragment this frame, and use some of it now:
overflowBytes = computeOverflowForNewFrame(frameSize);
numFrameBytesToUse -= overflowBytes;
fCurFragmentationOffset += numFrameBytesToUse;
} else {
// We don't use any of this frame now:
overflowBytes = frameSize;
numFrameBytesToUse = 0;
}
fOutBuf->setOverflowData(fOutBuf->curPacketSize() + numFrameBytesToUse,
overflowBytes, presentationTime, durationInMicroseconds);
} else if (fCurFragmentationOffset > 0) {
// This is the last fragment of a frame that was fragmented over
// more than one packet. Do any special handling for this case:
fCurFragmentationOffset = 0;
fPreviousFrameEndedFragmentation = True;
}
}
if (numFrameBytesToUse == 0 && frameSize > 0) {
// Send our packet now, because we have filled it up:
sendPacketIfNecessary();
} else {
// Use this frame in our outgoing packet:
unsigned char* frameStart = fOutBuf->curPtr();
fOutBuf->increment(numFrameBytesToUse);
// do this now, in case "doSpecialFrameHandling()" calls "setFramePadding()" to append padding bytes
// Here's where any payload format specific processing gets done:
doSpecialFrameHandling(curFragmentationOffset, frameStart,
numFrameBytesToUse, presentationTime,
overflowBytes);
++fNumFramesUsedSoFar;
// Update the time at which the next packet should be sent, based
// on the duration of the frame that we just packed into it.
// However, if this frame has overflow data remaining, then don't
// count its duration yet.
if (overflowBytes == 0) {
fNextSendTime.tv_usec += durationInMicroseconds;
fNextSendTime.tv_sec += fNextSendTime.tv_usec/1000000;
fNextSendTime.tv_usec %= 1000000;
}
// Send our packet now if (i) it's already at our preferred size, or
// (ii) (heuristic) another frame of the same size as the one we just
// read would overflow the packet, or
// (iii) it contains the last fragment of a fragmented frame, and we
// don't allow anything else to follow this or
// (iv) one frame per packet is allowed:
if (fOutBuf->isPreferredSize()
|| fOutBuf->wouldOverflow(numFrameBytesToUse)
|| (fPreviousFrameEndedFragmentation &&
!allowOtherFramesAfterLastFragment())
|| !frameCanAppearAfterPacketStart(fOutBuf->curPtr() - frameSize,
frameSize) ) {
// The packet is ready to be sent now
sendPacketIfNecessary();
} else {
// There's room for more frames; try getting another:
packFrame();
}
}
}
static unsigned const rtpHeaderSize = 12;
Boolean MultiFramedRTPSink::isTooBigForAPacket(unsigned numBytes) const {
// Check whether a 'numBytes'-byte frame - together with a RTP header and
// (possible) special headers - would be too big for an output packet:
// (Later allow for RTP extension header!) #####
numBytes += rtpHeaderSize + specialHeaderSize() + frameSpecificHeaderSize();
return fOutBuf->isTooBigForAPacket(numBytes);
}
void MultiFramedRTPSink::sendPacketIfNecessary() {
if (fNumFramesUsedSoFar > 0) {
// Send the packet:
#ifdef TEST_LOSS
if ((our_random()%10) != 0) // simulate 10% packet loss #####
#endif
if (!fRTPInterface.sendPacket(fOutBuf->packet(), fOutBuf->curPacketSize())) {
// if failure handler has been specified, call it
if (fOnSendErrorFunc != NULL) (*fOnSendErrorFunc)(fOnSendErrorData);
}
++fPacketCount;
fTotalOctetCount += fOutBuf->curPacketSize();
fOctetCount += fOutBuf->curPacketSize()
- rtpHeaderSize - fSpecialHeaderSize - fTotalFrameSpecificHeaderSizes;
++fSeqNo; // for next time
}
if (fOutBuf->haveOverflowData()
&& fOutBuf->totalBytesAvailable() > fOutBuf->totalBufferSize()/2) {
// Efficiency hack: Reset the packet start pointer to just in front of
// the overflow data (allowing for the RTP header and special headers),
// so that we probably don't have to "memmove()" the overflow data
// into place when building the next packet:
unsigned newPacketStart = fOutBuf->curPacketSize()
- (rtpHeaderSize + fSpecialHeaderSize + frameSpecificHeaderSize());
fOutBuf->adjustPacketStart(newPacketStart);
} else {
// Normal case: Reset the packet start pointer back to the start:
fOutBuf->resetPacketStart();
}
fOutBuf->resetOffset();
fNumFramesUsedSoFar = 0;
if (fNoFramesLeft) {
// We're done:
onSourceClosure();
} else {
// We have more frames left to send. Figure out when the next frame
// is due to start playing, then make sure that we wait this long before
// sending the next packet.
struct timeval timeNow;
gettimeofday(&timeNow, NULL);
int secsDiff = fNextSendTime.tv_sec - timeNow.tv_sec;
int64_t uSecondsToGo = secsDiff*1000000 + (fNextSendTime.tv_usec - timeNow.tv_usec);
if (uSecondsToGo < 0 || secsDiff < 0) { // sanity check: Make sure that the time-to-delay is non-negative:
uSecondsToGo = 0;
}
// Delay this amount of time:
nextTask() = envir().taskScheduler().scheduleDelayedTask(uSecondsToGo, (TaskFunc*)sendNext, this);
}
}
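The scheduling step above converts the difference between fNextSendTime and the current wall-clock time into a non-negative microsecond delay. A minimal sketch of that conversion (uSecondsUntil is an illustrative name):

```cpp
#include <cassert>
#include <cstdint>
#include <sys/time.h>

// Microseconds from 'now' until 'target', clamped at zero, matching the
// uSecondsToGo computation in sendPacketIfNecessary().
int64_t uSecondsUntil(const struct timeval& target, const struct timeval& now) {
  int64_t secsDiff = (int64_t)target.tv_sec - now.tv_sec;
  int64_t uSecs = secsDiff*1000000 + (target.tv_usec - now.tv_usec);
  return uSecs < 0 ? 0 : uSecs;  // never schedule a negative delay
}
```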
// The following is called after each delay between packet sends:
void MultiFramedRTPSink::sendNext(void* firstArg) {
MultiFramedRTPSink* sink = (MultiFramedRTPSink*)firstArg;
sink->buildAndSendPacket(False);
}
void MultiFramedRTPSink::ourHandleClosure(void* clientData) {
MultiFramedRTPSink* sink = (MultiFramedRTPSink*)clientData;
// There are no frames left, but we may have a partially built packet
// to send
sink->fNoFramesLeft = True;
sink->sendPacketIfNecessary();
}
live/liveMedia/MultiFramedRTPSource.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// RTP source for a common kind of payload format: Those that pack multiple,
// complete codec frames (as many as possible) into each RTP packet.
// Implementation
#include "MultiFramedRTPSource.hh"
#include "RTCP.hh"
#include "GroupsockHelper.hh"
#include <string.h>
////////// ReorderingPacketBuffer definition //////////
class ReorderingPacketBuffer {
public:
ReorderingPacketBuffer(BufferedPacketFactory* packetFactory);
virtual ~ReorderingPacketBuffer();
void reset();
BufferedPacket* getFreePacket(MultiFramedRTPSource* ourSource);
Boolean storePacket(BufferedPacket* bPacket);
BufferedPacket* getNextCompletedPacket(Boolean& packetLossPreceded);
void releaseUsedPacket(BufferedPacket* packet);
void freePacket(BufferedPacket* packet) {
if (packet != fSavedPacket) {
delete packet;
} else {
fSavedPacketFree = True;
}
}
Boolean isEmpty() const { return fHeadPacket == NULL; }
void setThresholdTime(unsigned uSeconds) { fThresholdTime = uSeconds; }
void resetHaveSeenFirstPacket() { fHaveSeenFirstPacket = False; }
private:
BufferedPacketFactory* fPacketFactory;
unsigned fThresholdTime; // uSeconds
Boolean fHaveSeenFirstPacket; // used to set initial "fNextExpectedSeqNo"
unsigned short fNextExpectedSeqNo;
BufferedPacket* fHeadPacket;
BufferedPacket* fTailPacket;
BufferedPacket* fSavedPacket;
// to avoid calling new/free in the common case
Boolean fSavedPacketFree;
};
////////// MultiFramedRTPSource implementation //////////
MultiFramedRTPSource
::MultiFramedRTPSource(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency,
BufferedPacketFactory* packetFactory)
: RTPSource(env, RTPgs, rtpPayloadFormat, rtpTimestampFrequency) {
reset();
fReorderingBuffer = new ReorderingPacketBuffer(packetFactory);
// Try to use a big receive buffer for RTP:
increaseReceiveBufferTo(env, RTPgs->socketNum(), 50*1024);
}
void MultiFramedRTPSource::reset() {
fCurrentPacketBeginsFrame = True; // by default
fCurrentPacketCompletesFrame = True; // by default
fAreDoingNetworkReads = False;
fPacketReadInProgress = NULL;
fNeedDelivery = False;
fPacketLossInFragmentedFrame = False;
}
MultiFramedRTPSource::~MultiFramedRTPSource() {
delete fReorderingBuffer;
}
Boolean MultiFramedRTPSource
::processSpecialHeader(BufferedPacket* /*packet*/,
unsigned& resultSpecialHeaderSize) {
// Default implementation: Assume no special header:
resultSpecialHeaderSize = 0;
return True;
}
Boolean MultiFramedRTPSource
::packetIsUsableInJitterCalculation(unsigned char* /*packet*/,
unsigned /*packetSize*/) {
// Default implementation:
return True;
}
void MultiFramedRTPSource::doStopGettingFrames() {
if (fPacketReadInProgress != NULL) {
fReorderingBuffer->freePacket(fPacketReadInProgress);
fPacketReadInProgress = NULL;
}
envir().taskScheduler().unscheduleDelayedTask(nextTask());
fRTPInterface.stopNetworkReading();
fReorderingBuffer->reset();
reset();
}
void MultiFramedRTPSource::doGetNextFrame() {
if (!fAreDoingNetworkReads) {
// Turn on background read handling of incoming packets:
fAreDoingNetworkReads = True;
TaskScheduler::BackgroundHandlerProc* handler
= (TaskScheduler::BackgroundHandlerProc*)&networkReadHandler;
fRTPInterface.startNetworkReading(handler);
}
fSavedTo = fTo;
fSavedMaxSize = fMaxSize;
fFrameSize = 0; // for now
fNeedDelivery = True;
doGetNextFrame1();
}
void MultiFramedRTPSource::doGetNextFrame1() {
while (fNeedDelivery) {
// If we already have packet data available, then deliver it now.
Boolean packetLossPrecededThis;
BufferedPacket* nextPacket
= fReorderingBuffer->getNextCompletedPacket(packetLossPrecededThis);
if (nextPacket == NULL) break;
fNeedDelivery = False;
if (nextPacket->useCount() == 0) {
// Before using the packet, check whether it has a special header
// that needs to be processed:
unsigned specialHeaderSize;
if (!processSpecialHeader(nextPacket, specialHeaderSize)) {
// Something's wrong with the header; reject the packet:
fReorderingBuffer->releaseUsedPacket(nextPacket);
fNeedDelivery = True;
continue;
}
nextPacket->skip(specialHeaderSize);
}
// Check whether we're part of a multi-packet frame, and whether
// there was packet loss that would render this packet unusable:
if (fCurrentPacketBeginsFrame) {
if (packetLossPrecededThis || fPacketLossInFragmentedFrame) {
// We didn't get all of the previous frame.
// Forget any data that we used from it:
fTo = fSavedTo; fMaxSize = fSavedMaxSize;
fFrameSize = 0;
}
fPacketLossInFragmentedFrame = False;
} else if (packetLossPrecededThis) {
// We're in a multi-packet frame, with preceding packet loss
fPacketLossInFragmentedFrame = True;
}
if (fPacketLossInFragmentedFrame) {
// This packet is unusable; reject it:
fReorderingBuffer->releaseUsedPacket(nextPacket);
fNeedDelivery = True;
continue;
}
// The packet is usable. Deliver all or part of it to our caller:
unsigned frameSize;
nextPacket->use(fTo, fMaxSize, frameSize, fNumTruncatedBytes,
fCurPacketRTPSeqNum, fCurPacketRTPTimestamp,
fPresentationTime, fCurPacketHasBeenSynchronizedUsingRTCP,
fCurPacketMarkerBit);
fFrameSize += frameSize;
if (!nextPacket->hasUsableData()) {
// We're completely done with this packet now
fReorderingBuffer->releaseUsedPacket(nextPacket);
}
if (fCurrentPacketCompletesFrame && fFrameSize > 0) {
// We have all the data that the client wants.
if (fNumTruncatedBytes > 0) {
envir() << "MultiFramedRTPSource::doGetNextFrame1(): The total received frame size exceeds the client's buffer size ("
<< fSavedMaxSize << "). "
<< fNumTruncatedBytes << " bytes of trailing data will be dropped!\n";
}
// Call our own 'after getting' function, so that the downstream object can consume the data:
if (fReorderingBuffer->isEmpty()) {
// Common case optimization: There are no more queued incoming packets, so this code will not get
// executed again without having first returned to the event loop. Call our 'after getting' function
// directly, because there's no risk of a long chain of recursion (and thus stack overflow):
afterGetting(this);
} else {
// Special case: Call our 'after getting' function via the event loop.
nextTask() = envir().taskScheduler().scheduleDelayedTask(0,
(TaskFunc*)FramedSource::afterGetting, this);
}
} else {
// This packet contained fragmented data, and does not complete
// the data that the client wants. Keep getting data:
fTo += frameSize; fMaxSize -= frameSize;
fNeedDelivery = True;
}
}
}
void MultiFramedRTPSource
::setPacketReorderingThresholdTime(unsigned uSeconds) {
fReorderingBuffer->setThresholdTime(uSeconds);
}
#define ADVANCE(n) do { bPacket->skip(n); } while (0)
void MultiFramedRTPSource::networkReadHandler(MultiFramedRTPSource* source, int /*mask*/) {
source->networkReadHandler1();
}
void MultiFramedRTPSource::networkReadHandler1() {
BufferedPacket* bPacket = fPacketReadInProgress;
if (bPacket == NULL) {
// Normal case: Get a free BufferedPacket descriptor to hold the new network packet:
bPacket = fReorderingBuffer->getFreePacket(this);
}
// Read the network packet, and perform sanity checks on the RTP header:
Boolean readSuccess = False;
do {
struct sockaddr_in fromAddress;
Boolean packetReadWasIncomplete = fPacketReadInProgress != NULL;
if (!bPacket->fillInData(fRTPInterface, fromAddress, packetReadWasIncomplete)) {
if (bPacket->bytesAvailable() == 0) { // should not happen??
envir() << "MultiFramedRTPSource internal error: Hit limit when reading incoming packet over TCP\n";
}
fPacketReadInProgress = NULL;
break;
}
if (packetReadWasIncomplete) {
// We need additional read(s) before we can process the incoming packet:
fPacketReadInProgress = bPacket;
return;
} else {
fPacketReadInProgress = NULL;
}
#ifdef TEST_LOSS
setPacketReorderingThresholdTime(0);
// don't wait for 'lost' packets to arrive out-of-order later
if ((our_random()%10) == 0) break; // simulate 10% packet loss
#endif
// Check for the 12-byte RTP header:
if (bPacket->dataSize() < 12) break;
unsigned rtpHdr = ntohl(*(u_int32_t*)(bPacket->data())); ADVANCE(4);
Boolean rtpMarkerBit = (rtpHdr&0x00800000) != 0;
unsigned rtpTimestamp = ntohl(*(u_int32_t*)(bPacket->data())); ADVANCE(4);
unsigned rtpSSRC = ntohl(*(u_int32_t*)(bPacket->data())); ADVANCE(4);
// Check the RTP version number (it should be 2):
if ((rtpHdr&0xC0000000) != 0x80000000) break;
// Check the Payload Type.
unsigned char rtpPayloadType = (unsigned char)((rtpHdr&0x007F0000)>>16);
if (rtpPayloadType != rtpPayloadFormat()) {
if (fRTCPInstanceForMultiplexedRTCPPackets != NULL
&& rtpPayloadType >= 64 && rtpPayloadType <= 95) {
// This is a multiplexed RTCP packet, and we've been asked to deliver such packets.
// Do so now:
fRTCPInstanceForMultiplexedRTCPPackets
->injectReport(bPacket->data()-12, bPacket->dataSize()+12, fromAddress);
}
break;
}
// Skip over any CSRC identifiers in the header:
unsigned cc = (rtpHdr>>24)&0x0F;
if (bPacket->dataSize() < cc*4) break;
ADVANCE(cc*4);
// Check for (& ignore) any RTP header extension
if (rtpHdr&0x10000000) {
if (bPacket->dataSize() < 4) break;
unsigned extHdr = ntohl(*(u_int32_t*)(bPacket->data())); ADVANCE(4);
unsigned remExtSize = 4*(extHdr&0xFFFF);
if (bPacket->dataSize() < remExtSize) break;
ADVANCE(remExtSize);
}
// Discard any padding bytes:
if (rtpHdr&0x20000000) {
if (bPacket->dataSize() == 0) break;
unsigned numPaddingBytes
= (unsigned)(bPacket->data())[bPacket->dataSize()-1];
if (bPacket->dataSize() < numPaddingBytes) break;
bPacket->removePadding(numPaddingBytes);
}
// The rest of the packet is the usable data. Record and save it:
if (rtpSSRC != fLastReceivedSSRC) {
// The SSRC of incoming packets has changed. Unfortunately we don't yet handle streams that contain multiple SSRCs,
// but we can handle a single-SSRC stream where the SSRC changes occasionally:
fLastReceivedSSRC = rtpSSRC;
fReorderingBuffer->resetHaveSeenFirstPacket();
}
unsigned short rtpSeqNo = (unsigned short)(rtpHdr&0xFFFF);
Boolean usableInJitterCalculation
= packetIsUsableInJitterCalculation((bPacket->data()),
bPacket->dataSize());
struct timeval presentationTime; // computed by:
Boolean hasBeenSyncedUsingRTCP; // computed by:
receptionStatsDB()
.noteIncomingPacket(rtpSSRC, rtpSeqNo, rtpTimestamp,
timestampFrequency(),
usableInJitterCalculation, presentationTime,
hasBeenSyncedUsingRTCP, bPacket->dataSize());
// Fill in the rest of the packet descriptor, and store it:
struct timeval timeNow;
gettimeofday(&timeNow, NULL);
bPacket->assignMiscParams(rtpSeqNo, rtpTimestamp, presentationTime,
hasBeenSyncedUsingRTCP, rtpMarkerBit,
timeNow);
if (!fReorderingBuffer->storePacket(bPacket)) break;
readSuccess = True;
} while (0);
if (!readSuccess) fReorderingBuffer->freePacket(bPacket);
doGetNextFrame1();
// If we didn't get proper data this time, we'll get another chance
}
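The handler above extracts the marker bit, payload type, CSRC count, and sequence number from the first 32-bit word of the RTP header via the masks 0x00800000, 0x007F0000, 0x0F, and 0xFFFF. A standalone sketch of the same RFC 3550 bit layout (the struct and function names here are illustrative, not library identifiers):

```cpp
#include <cstdint>

// Decode the fields of the first 32-bit RTP header word, already in host
// order (i.e. after ntohl()).  Layout per RFC 3550:
//   V(2) P(1) X(1) CC(4) M(1) PT(7) sequence-number(16)
struct RtpHdrFields {
  unsigned version;     // should be 2
  bool     padding;     // P bit
  bool     extension;   // X bit
  unsigned csrcCount;   // CC: number of 32-bit CSRC ids that follow
  bool     marker;      // M bit
  unsigned payloadType; // PT
  uint16_t seqNo;       // sequence number
};

inline RtpHdrFields decodeRtpHdrWord(uint32_t w) {
  RtpHdrFields f;
  f.version     = (w >> 30) & 0x3;
  f.padding     = ((w >> 29) & 0x1) != 0;
  f.extension   = ((w >> 28) & 0x1) != 0;
  f.csrcCount   = (w >> 24) & 0xF;
  f.marker      = ((w >> 23) & 0x1) != 0;
  f.payloadType = (w >> 16) & 0x7F;
  f.seqNo       = (uint16_t)(w & 0xFFFF);
  return f;
}
```

The masks match the checks above: the version test `(rtpHdr&0xC0000000) != 0x80000000` requires the top two bits to be `10` (version 2).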
////////// BufferedPacket and BufferedPacketFactory implementation /////
#define MAX_PACKET_SIZE 65536
BufferedPacket::BufferedPacket()
: fPacketSize(MAX_PACKET_SIZE),
fBuf(new unsigned char[MAX_PACKET_SIZE]),
fNextPacket(NULL) {
}
BufferedPacket::~BufferedPacket() {
delete fNextPacket;
delete[] fBuf;
}
void BufferedPacket::reset() {
fHead = fTail = 0;
fUseCount = 0;
fIsFirstPacket = False; // by default
}
// The following function has been deprecated:
unsigned BufferedPacket
::nextEnclosedFrameSize(unsigned char*& /*framePtr*/, unsigned dataSize) {
// By default, use the entire buffered data, even though it may consist
// of more than one frame, on the assumption that the client doesn't
// care. (This is more efficient than delivering a frame at a time)
return dataSize;
}
void BufferedPacket
::getNextEnclosedFrameParameters(unsigned char*& framePtr, unsigned dataSize,
unsigned& frameSize,
unsigned& frameDurationInMicroseconds) {
// By default, use the entire buffered data, even though it may consist
// of more than one frame, on the assumption that the client doesn't
// care. (This is more efficient than delivering a frame at a time)
// For backwards-compatibility with existing uses of (the now deprecated)
// "nextEnclosedFrameSize()", call that function to implement this one:
frameSize = nextEnclosedFrameSize(framePtr, dataSize);
frameDurationInMicroseconds = 0; // by default. Subclasses should correct this.
}
Boolean BufferedPacket::fillInData(RTPInterface& rtpInterface, struct sockaddr_in& fromAddress,
Boolean& packetReadWasIncomplete) {
if (!packetReadWasIncomplete) reset();
unsigned const maxBytesToRead = bytesAvailable();
if (maxBytesToRead == 0) return False; // exceeded buffer size when reading over TCP
unsigned numBytesRead;
int tcpSocketNum; // not used
unsigned char tcpStreamChannelId; // not used
if (!rtpInterface.handleRead(&fBuf[fTail], maxBytesToRead,
numBytesRead, fromAddress,
tcpSocketNum, tcpStreamChannelId,
packetReadWasIncomplete)) {
return False;
}
fTail += numBytesRead;
return True;
}
void BufferedPacket
::assignMiscParams(unsigned short rtpSeqNo, unsigned rtpTimestamp,
struct timeval presentationTime,
Boolean hasBeenSyncedUsingRTCP, Boolean rtpMarkerBit,
struct timeval timeReceived) {
fRTPSeqNo = rtpSeqNo;
fRTPTimestamp = rtpTimestamp;
fPresentationTime = presentationTime;
fHasBeenSyncedUsingRTCP = hasBeenSyncedUsingRTCP;
fRTPMarkerBit = rtpMarkerBit;
fTimeReceived = timeReceived;
}
void BufferedPacket::skip(unsigned numBytes) {
fHead += numBytes;
if (fHead > fTail) fHead = fTail;
}
void BufferedPacket::removePadding(unsigned numBytes) {
if (numBytes > fTail-fHead) numBytes = fTail-fHead;
fTail -= numBytes;
}
void BufferedPacket::appendData(unsigned char* newData, unsigned numBytes) {
if (numBytes > fPacketSize-fTail) numBytes = fPacketSize - fTail;
memmove(&fBuf[fTail], newData, numBytes);
fTail += numBytes;
}
void BufferedPacket::use(unsigned char* to, unsigned toSize,
unsigned& bytesUsed, unsigned& bytesTruncated,
unsigned short& rtpSeqNo, unsigned& rtpTimestamp,
struct timeval& presentationTime,
Boolean& hasBeenSyncedUsingRTCP,
Boolean& rtpMarkerBit) {
unsigned char* origFramePtr = &fBuf[fHead];
unsigned char* newFramePtr = origFramePtr; // may change in the call below
unsigned frameSize, frameDurationInMicroseconds;
getNextEnclosedFrameParameters(newFramePtr, fTail - fHead,
frameSize, frameDurationInMicroseconds);
if (frameSize > toSize) {
bytesTruncated += frameSize - toSize;
bytesUsed = toSize;
} else {
bytesTruncated = 0;
bytesUsed = frameSize;
}
memmove(to, newFramePtr, bytesUsed);
fHead += (newFramePtr - origFramePtr) + frameSize;
++fUseCount;
rtpSeqNo = fRTPSeqNo;
rtpTimestamp = fRTPTimestamp;
presentationTime = fPresentationTime;
hasBeenSyncedUsingRTCP = fHasBeenSyncedUsingRTCP;
rtpMarkerBit = fRTPMarkerBit;
// Update "fPresentationTime" for the next enclosed frame (if any):
fPresentationTime.tv_usec += frameDurationInMicroseconds;
if (fPresentationTime.tv_usec >= 1000000) {
fPresentationTime.tv_sec += fPresentationTime.tv_usec/1000000;
fPresentationTime.tv_usec = fPresentationTime.tv_usec%1000000;
}
}
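use() finishes by advancing fPresentationTime by the frame's duration and renormalizing the microseconds field into [0, 1000000). The same arithmetic as a standalone helper (the struct and function names are illustrative; the library works directly on a struct timeval):

```cpp
// Advance a (seconds, microseconds) timestamp by a duration in microseconds,
// keeping the microseconds field normalized to [0, 1000000) -- the same
// arithmetic that use() applies to fPresentationTime above.
struct UsecTime { long sec; long usec; };

inline UsecTime advanceBy(UsecTime t, unsigned durationUsec) {
  t.usec += (long)durationUsec;
  if (t.usec >= 1000000) {
    t.sec  += t.usec / 1000000; // carry whole seconds (may be more than one)
    t.usec  = t.usec % 1000000;
  }
  return t;
}
```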
BufferedPacketFactory::BufferedPacketFactory() {
}
BufferedPacketFactory::~BufferedPacketFactory() {
}
BufferedPacket* BufferedPacketFactory
::createNewPacket(MultiFramedRTPSource* /*ourSource*/) {
return new BufferedPacket;
}
////////// ReorderingPacketBuffer implementation //////////
ReorderingPacketBuffer
::ReorderingPacketBuffer(BufferedPacketFactory* packetFactory)
: fThresholdTime(100000) /* default reordering threshold: 100 ms */,
fHaveSeenFirstPacket(False), fHeadPacket(NULL), fTailPacket(NULL), fSavedPacket(NULL), fSavedPacketFree(True) {
fPacketFactory = (packetFactory == NULL)
? (new BufferedPacketFactory)
: packetFactory;
}
ReorderingPacketBuffer::~ReorderingPacketBuffer() {
reset();
delete fPacketFactory;
}
void ReorderingPacketBuffer::reset() {
if (fSavedPacketFree) delete fSavedPacket; // because fSavedPacket is not in the list
delete fHeadPacket; // will also delete fSavedPacket if it's in the list
resetHaveSeenFirstPacket();
fHeadPacket = fTailPacket = fSavedPacket = NULL;
}
BufferedPacket* ReorderingPacketBuffer::getFreePacket(MultiFramedRTPSource* ourSource) {
if (fSavedPacket == NULL) { // we're being called for the first time
fSavedPacket = fPacketFactory->createNewPacket(ourSource);
fSavedPacketFree = True;
}
if (fSavedPacketFree == True) {
fSavedPacketFree = False;
return fSavedPacket;
} else {
return fPacketFactory->createNewPacket(ourSource);
}
}
Boolean ReorderingPacketBuffer::storePacket(BufferedPacket* bPacket) {
unsigned short rtpSeqNo = bPacket->rtpSeqNo();
if (!fHaveSeenFirstPacket) {
fNextExpectedSeqNo = rtpSeqNo; // initialization
bPacket->isFirstPacket() = True;
fHaveSeenFirstPacket = True;
}
// Ignore this packet if its sequence number is less than the one
// that we're looking for (in this case, it's been excessively delayed).
if (seqNumLT(rtpSeqNo, fNextExpectedSeqNo)) return False;
if (fTailPacket == NULL) {
// Common case: There are no packets in the queue; this will be the first one:
bPacket->nextPacket() = NULL;
fHeadPacket = fTailPacket = bPacket;
return True;
}
if (seqNumLT(fTailPacket->rtpSeqNo(), rtpSeqNo)) {
// The next-most common case: There are packets already in the queue; this packet arrived in order => put it at the tail:
bPacket->nextPacket() = NULL;
fTailPacket->nextPacket() = bPacket;
fTailPacket = bPacket;
return True;
}
if (rtpSeqNo == fTailPacket->rtpSeqNo()) {
// This is a duplicate packet - ignore it
return False;
}
// Rare case: This packet is out-of-order. Run through the list (from the head), to figure out where it belongs:
BufferedPacket* beforePtr = NULL;
BufferedPacket* afterPtr = fHeadPacket;
while (afterPtr != NULL) {
if (seqNumLT(rtpSeqNo, afterPtr->rtpSeqNo())) break; // it comes here
if (rtpSeqNo == afterPtr->rtpSeqNo()) {
// This is a duplicate packet - ignore it
return False;
}
beforePtr = afterPtr;
afterPtr = afterPtr->nextPacket();
}
// Link our new packet between "beforePtr" and "afterPtr":
bPacket->nextPacket() = afterPtr;
if (beforePtr == NULL) {
fHeadPacket = bPacket;
} else {
beforePtr->nextPacket() = bPacket;
}
return True;
}
void ReorderingPacketBuffer::releaseUsedPacket(BufferedPacket* packet) {
// ASSERT: packet == fHeadPacket
// ASSERT: fNextExpectedSeqNo == packet->rtpSeqNo()
++fNextExpectedSeqNo; // because we're finished with this packet now
fHeadPacket = fHeadPacket->nextPacket();
if (!fHeadPacket) {
fTailPacket = NULL;
}
packet->nextPacket() = NULL;
freePacket(packet);
}
BufferedPacket* ReorderingPacketBuffer
::getNextCompletedPacket(Boolean& packetLossPreceded) {
if (fHeadPacket == NULL) return NULL;
// Check whether the next packet we want is already at the head
// of the queue:
// ASSERT: fHeadPacket->rtpSeqNo() >= fNextExpectedSeqNo
if (fHeadPacket->rtpSeqNo() == fNextExpectedSeqNo) {
packetLossPreceded = fHeadPacket->isFirstPacket();
// (The very first packet is treated as if there was packet loss beforehand.)
return fHeadPacket;
}
// We're still waiting for our desired packet to arrive. However, if
// our time threshold has been exceeded, then forget it, and return
// the head packet instead:
Boolean timeThresholdHasBeenExceeded;
if (fThresholdTime == 0) {
timeThresholdHasBeenExceeded = True; // optimization
} else {
struct timeval timeNow;
gettimeofday(&timeNow, NULL);
unsigned uSecondsSinceReceived
= (timeNow.tv_sec - fHeadPacket->timeReceived().tv_sec)*1000000
+ (timeNow.tv_usec - fHeadPacket->timeReceived().tv_usec);
timeThresholdHasBeenExceeded = uSecondsSinceReceived > fThresholdTime;
}
if (timeThresholdHasBeenExceeded) {
fNextExpectedSeqNo = fHeadPacket->rtpSeqNo();
// we've given up on earlier packets now
packetLossPreceded = True;
return fHeadPacket;
}
// Otherwise, keep waiting for our desired packet to arrive:
return NULL;
}
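The queue ordering above relies on seqNumLT() (defined elsewhere in this file) to compare 16-bit sequence numbers modulo 2^16, so that wraparound (65535 -> 0) is treated as forward progress rather than a huge jump backwards. A minimal sketch of such a serial-number comparison (a local illustration, not the library's definition):

```cpp
#include <cstdint>

// 'Less-than' on 16-bit RTP sequence numbers, modulo 2^16: s1 precedes s2
// iff the forward distance from s1 to s2 is less than half the number space.
// (Sketch of the behavior that storePacket()/releaseUsedPacket() depend on.)
inline bool seqNumLessThan(uint16_t s1, uint16_t s2) {
  return s1 != s2 && (uint16_t)(s2 - s1) < 0x8000;
}
```

With this ordering, a packet whose sequence number is "less than" fNextExpectedSeqNo is taken to be excessively delayed and dropped, even across the 65535 -> 0 boundary.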
live/liveMedia/OggDemuxedTrack.hh
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A media track, demultiplexed from an Ogg file
// C++ header
#ifndef _OGG_DEMUXED_TRACK_HH
#define _OGG_DEMUXED_TRACK_HH
#ifndef _FRAMED_SOURCE_HH
#include "FramedSource.hh"
#endif
class OggDemux; // forward
class OggDemuxedTrack: public FramedSource {
private: // We are created only by an OggDemux (a friend)
friend class OggDemux;
OggDemuxedTrack(UsageEnvironment& env, unsigned trackNumber, OggDemux& sourceDemux);
virtual ~OggDemuxedTrack();
private:
// redefined virtual functions:
virtual void doGetNextFrame();
virtual char const* MIMEtype() const;
private: // We are accessed only by OggDemux and by OggFileParser (a friend)
friend class OggFileParser;
unsigned char*& to() { return fTo; }
unsigned& maxSize() { return fMaxSize; }
unsigned& frameSize() { return fFrameSize; }
unsigned& numTruncatedBytes() { return fNumTruncatedBytes; }
struct timeval& presentationTime() { return fPresentationTime; }
unsigned& durationInMicroseconds() { return fDurationInMicroseconds; }
struct timeval& nextPresentationTime() { return fNextPresentationTime; }
private:
unsigned fOurTrackNumber;
OggDemux& fOurSourceDemux;
Boolean fCurrentPageIsContinuation;
struct timeval fNextPresentationTime;
};
#endif
live/liveMedia/OggFileParser.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A parser for an Ogg file.
// Implementation
#include "OggFileParser.hh"
#include "OggDemuxedTrack.hh"
#include <GroupsockHelper.hh> // for "gettimeofday()"
PacketSizeTable::PacketSizeTable(unsigned number_page_segments)
: numCompletedPackets(0), totSizes(0), nextPacketNumToDeliver(0),
lastPacketIsIncomplete(False) {
size = new unsigned[number_page_segments];
for (unsigned i = 0; i < number_page_segments; ++i) size[i] = 0;
}
PacketSizeTable::~PacketSizeTable() {
delete[] size;
}
OggFileParser::OggFileParser(OggFile& ourFile, FramedSource* inputSource,
FramedSource::onCloseFunc* onEndFunc, void* onEndClientData,
OggDemux* ourDemux)
: StreamParser(inputSource, onEndFunc, onEndClientData, continueParsing, this),
fOurFile(ourFile), fInputSource(inputSource),
fOnEndFunc(onEndFunc), fOnEndClientData(onEndClientData),
fOurDemux(ourDemux), fNumUnfulfilledTracks(0),
fPacketSizeTable(NULL), fCurrentTrackNumber(0), fSavedPacket(NULL) {
if (ourDemux == NULL) {
// Initialization
fCurrentParseState = PARSING_START_OF_FILE;
continueParsing();
} else {
fCurrentParseState = PARSING_AND_DELIVERING_PAGES;
// In this case, parsing (of page data) doesn't start until a client starts reading from a track.
}
}
OggFileParser::~OggFileParser() {
delete[] fSavedPacket;
delete fPacketSizeTable;
Medium::close(fInputSource);
}
void OggFileParser::continueParsing(void* clientData, unsigned char* ptr, unsigned size, struct timeval presentationTime) {
((OggFileParser*)clientData)->continueParsing();
}
void OggFileParser::continueParsing() {
if (fInputSource != NULL) {
if (fInputSource->isCurrentlyAwaitingData()) return;
// Our input source is currently being read. Wait until that read completes
if (!parse()) {
// We didn't complete the parsing, because we had to read more data from the source,
// or because we're waiting for another read from downstream.
// Once that happens, we'll get called again.
return;
}
}
// We successfully parsed the file. Call our 'done' function now:
if (fOnEndFunc != NULL) (*fOnEndFunc)(fOnEndClientData);
}
Boolean OggFileParser::parse() {
try {
while (1) {
switch (fCurrentParseState) {
case PARSING_START_OF_FILE: {
if (parseStartOfFile()) return True;
}
case PARSING_AND_DELIVERING_PAGES: {
parseAndDeliverPages();
}
case DELIVERING_PACKET_WITHIN_PAGE: {
if (deliverPacketWithinPage()) return False;
}
}
}
} catch (int /*e*/) {
#ifdef DEBUG
fprintf(stderr, "OggFileParser::parse() EXCEPTION (This is normal behavior - *not* an error)\n");
#endif
return False; // the parsing got interrupted
}
}
Boolean OggFileParser::parseStartOfFile() {
#ifdef DEBUG
fprintf(stderr, "parsing start of file\n");
#endif
// Read and parse each 'page', until we see the first non-BOS page, or until we have
// collected all required headers for Vorbis, Theora, or Opus track(s) (if any).
u_int8_t header_type_flag;
do {
header_type_flag = parseInitialPage();
} while ((header_type_flag&0x02) != 0 || needHeaders());
#ifdef DEBUG
fprintf(stderr, "Finished parsing start of file\n");
#endif
return True;
}
static u_int32_t byteSwap(u_int32_t x) {
return (x<<24)|((x<<8)&0x00FF0000)|((x>>8)&0x0000FF00)|(x>>24);
}
u_int8_t OggFileParser::parseInitialPage() {
u_int8_t header_type_flag;
u_int32_t bitstream_serial_number;
parseStartOfPage(header_type_flag, bitstream_serial_number);
// If this is a BOS page, examine the first 8 bytes of the first 'packet', to see whether
// the track data type is one that we know how to stream:
OggTrack* track;
if ((header_type_flag&0x02) != 0) { // BOS
char const* mimeType = NULL; // if unknown
if (fPacketSizeTable != NULL && fPacketSizeTable->size[0] >= 8) { // sanity check
char buf[8];
testBytes((u_int8_t*)buf, 8);
if (strncmp(&buf[1], "vorbis", 6) == 0) {
mimeType = "audio/VORBIS";
++fNumUnfulfilledTracks;
} else if (strncmp(buf, "OpusHead", 8) == 0) {
mimeType = "audio/OPUS";
++fNumUnfulfilledTracks;
} else if (strncmp(&buf[1], "theora", 6) == 0) {
mimeType = "video/THEORA";
++fNumUnfulfilledTracks;
}
}
// Add a new track descriptor for this track:
track = new OggTrack;
track->trackNumber = bitstream_serial_number;
track->mimeType = mimeType;
fOurFile.addTrack(track);
} else { // not a BOS page
// Because this is not a BOS page, the specified track should already have been seen:
track = fOurFile.lookup(bitstream_serial_number);
}
if (track != NULL) { // sanity check
#ifdef DEBUG
fprintf(stderr, "This track's MIME type: %s\n",
track->mimeType == NULL ? "(unknown)" : track->mimeType);
#endif
if (track->mimeType != NULL &&
(strcmp(track->mimeType, "audio/VORBIS") == 0 ||
strcmp(track->mimeType, "video/THEORA") == 0 ||
strcmp(track->mimeType, "audio/OPUS") == 0)) {
// Special-case handling of Vorbis, Theora, or Opus tracks:
// Make a copy of each packet, until we get the three special headers that we need:
Boolean isVorbis = strcmp(track->mimeType, "audio/VORBIS") == 0;
Boolean isTheora = strcmp(track->mimeType, "video/THEORA") == 0;
for (unsigned j = 0; j < fPacketSizeTable->numCompletedPackets && track->weNeedHeaders(); ++j) {
unsigned const packetSize = fPacketSizeTable->size[j];
if (packetSize == 0) continue; // sanity check
delete[] fSavedPacket/*if any*/; fSavedPacket = new u_int8_t[packetSize];
getBytes(fSavedPacket, packetSize);
fPacketSizeTable->totSizes -= packetSize;
// The start of the packet tells us whether it's a header that we know about:
Boolean headerIsKnown = False;
unsigned index = 0;
if (isVorbis) {
u_int8_t const firstByte = fSavedPacket[0];
headerIsKnown = firstByte == 1 || firstByte == 3 || firstByte == 5;
index = (firstByte-1)/2; // 1, 3, or 5 => 0, 1, or 2
} else if (isTheora) {
u_int8_t const firstByte = fSavedPacket[0];
headerIsKnown = firstByte == 0x80 || firstByte == 0x81 || firstByte == 0x82;
index = firstByte &~0x80; // 0x80, 0x81, or 0x82 => 0, 1, or 2
} else { // Opus
if (strncmp((char const*)fSavedPacket, "OpusHead", 8) == 0) {
headerIsKnown = True;
index = 0; // "identification" header
} else if (strncmp((char const*)fSavedPacket, "OpusTags", 8) == 0) {
headerIsKnown = True;
index = 1; // "comment" header
}
}
if (headerIsKnown) {
#ifdef DEBUG
char const* headerName[3] = { "identification", "comment", "setup" };
fprintf(stderr, "Saved %d-byte %s \"%s\" header\n", packetSize, track->mimeType,
headerName[index]);
#endif
// This is a header, but first check it for validity:
if (!validateHeader(track, fSavedPacket, packetSize)) continue;
// Save this header (deleting any old header of the same type that we'd saved before)
delete[] track->vtoHdrs.header[index];
track->vtoHdrs.header[index] = fSavedPacket;
fSavedPacket = NULL;
track->vtoHdrs.headerSize[index] = packetSize;
if (!track->weNeedHeaders()) {
// We now have all of the needed Vorbis, Theora, or Opus headers for this track:
--fNumUnfulfilledTracks;
}
// Note: The above code won't work if a required header is fragmented over
// more than one 'page'. We assume that that won't ever happen...
}
}
}
}
// Skip over any remaining packet data bytes:
if (fPacketSizeTable->totSizes > 0) {
#ifdef DEBUG
fprintf(stderr, "Skipping %d remaining packet data bytes\n", fPacketSizeTable->totSizes);
#endif
skipBytes(fPacketSizeTable->totSizes);
}
return header_type_flag;
}
// A simple bit vector class for reading bits in little-endian order.
// (We can't use our usual "BitVector" class, because that's big-endian.)
class LEBitVector {
public:
LEBitVector(u_int8_t const* p, unsigned numBytes)
: fPtr(p), fEnd(&p[numBytes]), fNumBitsRemainingInCurrentByte(8) {
}
u_int32_t getBits(unsigned numBits/*<=32*/) {
if (noMoreBits()) {
return 0;
} else if (numBits == fNumBitsRemainingInCurrentByte) {
u_int32_t result = (*fPtr++)>>(8-fNumBitsRemainingInCurrentByte);
fNumBitsRemainingInCurrentByte = 8;
return result;
} else if (numBits < fNumBitsRemainingInCurrentByte) {
u_int8_t mask = 0xFF>>(8-numBits);
u_int32_t result = ((*fPtr)>>(8-fNumBitsRemainingInCurrentByte)) & mask;
fNumBitsRemainingInCurrentByte -= numBits;
return result;
} else { // numBits > fNumBitsRemainingInCurrentByte
// Do two recursive calls to get the result:
unsigned nbr = fNumBitsRemainingInCurrentByte;
u_int32_t firstBits = getBits(nbr);
u_int32_t nextBits = getBits(numBits - nbr);
return (nextBits<<nbr) | firstBits;
}
}
void skipBits(unsigned numBits) {
while (numBits > 32) {
(void)getBits(32);
numBits -= 32;
}
(void)getBits(numBits);
}
unsigned numBitsRemaining() { return (fEnd-fPtr-1)*8 + fNumBitsRemainingInCurrentByte; }
Boolean noMoreBits() const { return fPtr >= fEnd; }
private:
u_int8_t const* fPtr;
u_int8_t const* fEnd;
unsigned fNumBitsRemainingInCurrentByte; // 1..8
};
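As the comment notes, this class reads bits in little-endian order: getBits() consumes the low-order bits of each byte first. A stripped-down, standalone analogue (illustrative names; bit-at-a-time for brevity, unlike the optimized class above):

```cpp
#include <cstdint>

// Read 'numBits' from a byte buffer in the little-endian bit order used by
// Vorbis: the low-order bits of each byte are consumed first.
// A simplified analogue of LEBitVector::getBits() -- no bounds checking.
struct LEReader {
  const uint8_t* p; // current byte
  unsigned bitPos;  // bits already consumed from *p (0..7)

  uint32_t getBits(unsigned numBits) {
    uint32_t result = 0;
    for (unsigned i = 0; i < numBits; ++i) {
      result |= (uint32_t)((*p >> bitPos) & 1) << i; // low bits first
      if (++bitPos == 8) { bitPos = 0; ++p; }
    }
    return result;
  }
};
```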
static unsigned ilog(int n) {
if (n < 0) return 0;
unsigned x = (unsigned)n;
unsigned result = 0;
while (x > 0) {
++result;
x >>= 1;
}
return result;
}
static unsigned lookup1_values(unsigned codebook_entries, unsigned codebook_dimensions) {
// "the greatest integer value for which [return_value] to the power of [codebook_dimensions]
// is less than or equal to [codebook_entries]"
unsigned return_value = 0;
unsigned powerValue;
do {
++return_value;
// Compute powerValue = return_value ** codebook_dimensions
if (return_value == 1) powerValue = 1; // optimization
else {
powerValue = 1;
for (unsigned i = 0; i < codebook_dimensions; ++i) {
powerValue *= return_value;
}
}
} while (powerValue <= codebook_entries);
return_value -= 1;
return return_value;
}
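In other words, ilog(n) is the number of bits needed to represent n, and lookup1_values() is the integer floor of the codebook_dimensions-th root of codebook_entries. Standalone restatements of the two helpers, for checking them in isolation (sketch names, same logic as above):

```cpp
// Number of bits needed to represent n (ilog(0) == 0, ilog(7) == 3).
static unsigned ilogSketch(int n) {
  if (n < 0) return 0;
  unsigned x = (unsigned)n, result = 0;
  while (x > 0) { ++result; x >>= 1; }
  return result;
}

// Greatest v such that v**dims <= entries (Vorbis "lookup1_values").
static unsigned lookup1Sketch(unsigned entries, unsigned dims) {
  unsigned v = 0, power;
  do {
    ++v;
    power = 1;
    for (unsigned i = 0; i < dims; ++i) power *= v;
  } while (power <= entries);
  return v - 1;
}
```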
static Boolean parseVorbisSetup_codebook(LEBitVector& bv) {
if (bv.noMoreBits()) return False;
unsigned sync = bv.getBits(24);
if (sync != 0x564342) return False;
unsigned codebook_dimensions = bv.getBits(16);
unsigned codebook_entries = bv.getBits(24);
unsigned ordered = bv.getBits(1);
#ifdef DEBUG_SETUP_HEADER
fprintf(stderr, "\t\t\tcodebook_dimensions: %d; codebook_entries: %d, ordered: %d\n",
codebook_dimensions, codebook_entries, ordered);
#endif
if (!ordered) {
unsigned sparse = bv.getBits(1);
#ifdef DEBUG_SETUP_HEADER
fprintf(stderr, "\t\t\t!ordered: sparse %d\n", sparse);
#endif
for (unsigned i = 0; i < codebook_entries; ++i) {
unsigned codewordLength;
if (sparse) {
unsigned flag = bv.getBits(1);
if (flag) {
codewordLength = bv.getBits(5) + 1;
} else {
codewordLength = 0;
}
} else {
codewordLength = bv.getBits(5) + 1;
}
#ifdef DEBUG_SETUP_HEADER
fprintf(stderr, "\t\t\t\tcodeword length[%d]:\t%d\n", i, codewordLength);
#else
codewordLength = codewordLength; // to prevent compiler warning
#endif
}
} else { // ordered
#ifdef DEBUG_SETUP_HEADER
fprintf(stderr, "\t\t\tordered:\n");
#endif
unsigned current_entry = 0;
unsigned current_length = bv.getBits(5) + 1;
do {
unsigned number = bv.getBits(ilog(codebook_entries - current_entry));
#ifdef DEBUG_SETUP_HEADER
fprintf(stderr, "\t\t\t\tcodeword length[%d..%d]:\t%d\n",
current_entry, current_entry + number - 1, current_length);
#endif
current_entry += number;
if (current_entry > codebook_entries) {
fprintf(stderr, "Vorbis codebook parsing error: current_entry %d > codebook_entries %d!\n", current_entry, codebook_entries);
return False;
}
++current_length;
} while (current_entry < codebook_entries);
}
unsigned codebook_lookup_type = bv.getBits(4);
#ifdef DEBUG_SETUP_HEADER
fprintf(stderr, "\t\t\tcodebook_lookup_type: %d\n", codebook_lookup_type);
#endif
if (codebook_lookup_type > 2) {
fprintf(stderr, "Vorbis codebook parsing error: codebook_lookup_type %d!\n", codebook_lookup_type);
return False;
} else if (codebook_lookup_type > 0) { // 1 or 2
bv.skipBits(32+32); // "codebook_minimum_value" and "codebook_delta_value"
unsigned codebook_value_bits = bv.getBits(4) + 1;
bv.skipBits(1); // "codebook_lookup_p"
unsigned codebook_lookup_values;
if (codebook_lookup_type == 1) {
codebook_lookup_values = lookup1_values(codebook_entries, codebook_dimensions);
} else { // 2
codebook_lookup_values = codebook_entries*codebook_dimensions;
}
bv.skipBits(codebook_lookup_values*codebook_value_bits); // "codebook_multiplicands"
}
return True;
}
static Boolean parseVorbisSetup_codebooks(LEBitVector& bv) {
if (bv.noMoreBits()) return False;
unsigned vorbis_codebook_count = bv.getBits(8) + 1;
#ifdef DEBUG_SETUP_HEADER
fprintf(stderr, "\tCodebooks: vorbis_codebook_count: %d\n", vorbis_codebook_count);
#endif
for (unsigned i = 0; i < vorbis_codebook_count; ++i) {
#ifdef DEBUG_SETUP_HEADER
fprintf(stderr, "\t\tCodebook %d:\n", i);
#endif
if (!parseVorbisSetup_codebook(bv)) return False;
}
return True;
}
static Boolean parseVorbisSetup_timeDomainTransforms(LEBitVector& bv) {
if (bv.noMoreBits()) return False;
unsigned vorbis_time_count = bv.getBits(6) + 1;
#ifdef DEBUG_SETUP_HEADER
fprintf(stderr, "\tTime domain transforms: vorbis_time_count: %d\n", vorbis_time_count);
#endif
for (unsigned i = 0; i < vorbis_time_count; ++i) {
unsigned val = bv.getBits(16);
if (val != 0) {
fprintf(stderr, "Vorbis Time domain transforms, read non-zero value %d\n", val);
return False;
}
}
return True;
}
static Boolean parseVorbisSetup_floors(LEBitVector& bv) {
if (bv.noMoreBits()) return False;
unsigned vorbis_floor_count = bv.getBits(6) + 1;
#ifdef DEBUG_SETUP_HEADER
fprintf(stderr, "\tFloors: vorbis_floor_count: %d\n", vorbis_floor_count);
#endif
for (unsigned i = 0; i < vorbis_floor_count; ++i) {
unsigned floorType = bv.getBits(16);
if (floorType == 0) {
bv.skipBits(8+16+16+6+8);
unsigned floor0_number_of_books = bv.getBits(4) + 1;
bv.skipBits(floor0_number_of_books*8);
} else if (floorType == 1) {
unsigned floor1_partitions = bv.getBits(5);
unsigned* floor1_partition_class_list = new unsigned[floor1_partitions];
unsigned maximum_class = 0, j;
for (j = 0; j < floor1_partitions; ++j) {
floor1_partition_class_list[j] = bv.getBits(4);
if (floor1_partition_class_list[j] > maximum_class) maximum_class = floor1_partition_class_list[j];
}
unsigned* floor1_class_dimensions = new unsigned[maximum_class + 1];
for (j = 0; j <= maximum_class; ++j) {
floor1_class_dimensions[j] = bv.getBits(3) + 1;
unsigned floor1_class_subclasses = bv.getBits(2);
if (floor1_class_subclasses != 0) {
bv.skipBits(8); // "floor1_class_masterbooks[j]"
}
unsigned twoExp_floor1_class_subclasses = 1 << floor1_class_subclasses;
bv.skipBits(twoExp_floor1_class_subclasses*8); // "floor1_subclass_books[j][*]"
}
bv.skipBits(2); // "floor1_multiplier"
unsigned rangebits = bv.getBits(4);
for (j = 0; j < floor1_partitions; ++j) {
unsigned current_class_number = floor1_partition_class_list[j];
bv.skipBits(floor1_class_dimensions[current_class_number] * rangebits);
}
delete[] floor1_partition_class_list;
delete[] floor1_class_dimensions;
} else { // floorType > 1
fprintf(stderr, "Vorbis Floors, read bad floor type %d\n", floorType);
return False;
}
}
return True;
}
static Boolean parseVorbisSetup_residues(LEBitVector& bv) {
if (bv.noMoreBits()) return False;
unsigned vorbis_residue_count = bv.getBits(6) + 1;
#ifdef DEBUG_SETUP_HEADER
fprintf(stderr, "\tResidues: vorbis_residue_count: %d\n", vorbis_residue_count);
#endif
for (unsigned i = 0; i < vorbis_residue_count; ++i) {
unsigned vorbis_residue_type = bv.getBits(16);
if (vorbis_residue_type > 2) {
fprintf(stderr, "Vorbis Residues, read bad vorbis_residue_type: %d\n", vorbis_residue_type);
return False;
} else {
bv.skipBits(24+24+24); // "residue_begin", "residue_end", "residue_partition_size"
unsigned residue_classifications = bv.getBits(6) + 1;
bv.skipBits(8); // "residue_classbook"
u_int8_t* residue_cascade = new u_int8_t[residue_classifications];
unsigned j;
for (j = 0; j < residue_classifications; ++j) {
u_int8_t high_bits = 0;
u_int8_t low_bits = bv.getBits(3);
unsigned bitflag = bv.getBits(1);
if (bitflag) {
high_bits = bv.getBits(5);
}
residue_cascade[j] = (high_bits<<3) | low_bits;
}
for (j = 0; j < residue_classifications; ++j) {
u_int8_t const cascade = residue_cascade[j];
u_int8_t mask = 0x80;
while (mask != 0) {
if ((cascade&mask) != 0) bv.skipBits(8); // "residue_books[j][*]"
mask >>= 1;
}
}
delete[] residue_cascade;
}
}
return True;
}
static Boolean parseVorbisSetup_mappings(LEBitVector& bv, unsigned audio_channels) {
if (bv.noMoreBits()) return False;
unsigned vorbis_mapping_count = bv.getBits(6) + 1;
#ifdef DEBUG_SETUP_HEADER
fprintf(stderr, "\tMappings: vorbis_mapping_count: %d\n", vorbis_mapping_count);
#endif
for (unsigned i = 0; i < vorbis_mapping_count; ++i) {
unsigned vorbis_mapping_type = bv.getBits(16);
if (vorbis_mapping_type != 0) {
fprintf(stderr, "Vorbis Mappings, read bad vorbis_mapping_type: %d\n", vorbis_mapping_type);
return False;
}
unsigned vorbis_mapping_submaps = 1;
if (bv.getBits(1)) vorbis_mapping_submaps = bv.getBits(4) + 1;
if (bv.getBits(1)) { // "square polar channel mapping is in use"
unsigned vorbis_mapping_coupling_steps = bv.getBits(8) + 1;
for (unsigned j = 0; j < vorbis_mapping_coupling_steps; ++j) {
unsigned ilog_audio_channels_minus_1 = ilog(audio_channels - 1);
bv.skipBits(2*ilog_audio_channels_minus_1); // "vorbis_mapping_magnitude", "vorbis_mapping_angle"
}
}
unsigned reserved = bv.getBits(2);
if (reserved != 0) {
fprintf(stderr, "Vorbis Mappings, read bad 'reserved' field\n");
return False;
}
if (vorbis_mapping_submaps > 1) {
for (unsigned j = 0; j < audio_channels; ++j) {
unsigned vorbis_mapping_mux = bv.getBits(4);
#ifdef DEBUG_SETUP_HEADER
fprintf(stderr, "\t\t\t\tvorbis_mapping_mux[%d]: %d\n", j, vorbis_mapping_mux);
#endif
if (vorbis_mapping_mux >= vorbis_mapping_submaps) {
fprintf(stderr, "Vorbis Mappings, read bad \"vorbis_mapping_mux\" %d (>= \"vorbis_mapping_submaps\" %d)\n", vorbis_mapping_mux, vorbis_mapping_submaps);
return False;
}
}
}
bv.skipBits(vorbis_mapping_submaps*(8+8+8)); // "the floor and residue numbers"
}
return True;
}
static Boolean parseVorbisSetup_modes(LEBitVector& bv, OggTrack* track) {
if (bv.noMoreBits()) return False;
unsigned vorbis_mode_count = bv.getBits(6) + 1;
unsigned ilog_vorbis_mode_count_minus_1 = ilog(vorbis_mode_count - 1);
#ifdef DEBUG_SETUP_HEADER
fprintf(stderr, "\tModes: vorbis_mode_count: %d (ilog(%d-1):%d)\n",
vorbis_mode_count, vorbis_mode_count, ilog_vorbis_mode_count_minus_1);
#endif
track->vtoHdrs.vorbis_mode_count = vorbis_mode_count;
track->vtoHdrs.ilog_vorbis_mode_count_minus_1 = ilog_vorbis_mode_count_minus_1;
track->vtoHdrs.vorbis_mode_blockflag = new u_int8_t[vorbis_mode_count];
for (unsigned i = 0; i < vorbis_mode_count; ++i) {
track->vtoHdrs.vorbis_mode_blockflag[i] = (u_int8_t)bv.getBits(1);
#ifdef DEBUG_SETUP_HEADER
fprintf(stderr, "\t\tMode %d: vorbis_mode_blockflag: %d\n", i, track->vtoHdrs.vorbis_mode_blockflag[i]);
#endif
bv.skipBits(16+16+8); // "vorbis_mode_windowtype", "vorbis_mode_transformtype", "vorbis_mode_mapping"
}
return True;
}
static Boolean parseVorbisSetupHeader(OggTrack* track, u_int8_t const* p, unsigned headerSize) {
LEBitVector bv(p, headerSize);
do {
if (!parseVorbisSetup_codebooks(bv)) break;
if (!parseVorbisSetup_timeDomainTransforms(bv)) break;
if (!parseVorbisSetup_floors(bv)) break;
if (!parseVorbisSetup_residues(bv)) break;
if (!parseVorbisSetup_mappings(bv, track->numChannels)) break;
if (!parseVorbisSetup_modes(bv, track)) break;
unsigned framingFlag = bv.getBits(1);
if (framingFlag == 0) {
fprintf(stderr, "Vorbis \"setup\" header did not end with a 'framing flag'!\n");
break;
}
return True;
} while (0);
// An error occurred:
return False;
}
#ifdef DEBUG
#define CHECK_PTR if (p >= pEnd) return False
#define printComment(p, len) do { for (unsigned k = 0; k < len; ++k) { CHECK_PTR; fprintf(stderr, "%c", *p++); } } while (0)
#endif
static Boolean validateCommentHeader(u_int8_t const *p, unsigned headerSize,
unsigned isOpus = 0) {
if (headerSize < 15+isOpus) { // need 7+isOpus + 4(vendor_length) + 4(user_comment_list_length)
fprintf(stderr, "\"comment\" header is too short (%d bytes)\n", headerSize);
return False;
}
#ifdef DEBUG
u_int8_t const* pEnd = &p[headerSize];
p += 7+isOpus;
u_int32_t vendor_length = (p[3]<<24)|(p[2]<<16)|(p[1]<<8)|p[0]; p += 4;
fprintf(stderr, "\tvendor_string:");
printComment(p, vendor_length);
fprintf(stderr, "\n");
u_int32_t user_comment_list_length = (p[3]<<24)|(p[2]<<16)|(p[1]<<8)|p[0]; p += 4;
for (unsigned i = 0; i < user_comment_list_length; ++i) {
CHECK_PTR; u_int32_t length = (p[3]<<24)|(p[2]<<16)|(p[1]<<8)|p[0]; p += 4;
fprintf(stderr, "\tuser_comment[%d]:", i);
printComment(p, length);
fprintf(stderr, "\n");
}
#endif
return True;
}
static unsigned blocksizeFromExponent(unsigned exponent) {
unsigned result = 1;
for (unsigned i = 0; i < exponent; ++i) result = 2*result;
return result;
}
Boolean OggFileParser::validateHeader(OggTrack* track, u_int8_t const* p, unsigned headerSize) {
// Assert: headerSize >= 7 (because we've already checked "XXXXXX" or "OpusXXXX")
if (strcmp(track->mimeType, "audio/VORBIS") == 0) {
u_int8_t const firstByte = p[0];
if (firstByte == 1) { // "identification" header
if (headerSize < 30) {
fprintf(stderr, "Vorbis \"identification\" header is too short (%d bytes)\n", headerSize);
return False;
} else if ((p[29]&0x1) != 1) {
fprintf(stderr, "Vorbis \"identification\" header: 'framing_flag' is not set\n");
return False;
}
p += 7;
u_int32_t vorbis_version = (p[3]<<24)|(p[2]<<16)|(p[1]<<8)|p[0]; p += 4;
if (vorbis_version != 0) {
fprintf(stderr, "Vorbis \"identification\" header has a bad 'vorbis_version': 0x%08x\n", vorbis_version);
return False;
}
u_int8_t audio_channels = *p++;
if (audio_channels == 0) {
fprintf(stderr, "Vorbis \"identification\" header: 'audio_channels' is 0!\n");
return False;
}
track->numChannels = audio_channels;
u_int32_t audio_sample_rate = (p[3]<<24)|(p[2]<<16)|(p[1]<<8)|p[0]; p += 4;
if (audio_sample_rate == 0) {
fprintf(stderr, "Vorbis \"identification\" header: 'audio_sample_rate' is 0!\n");
return False;
}
track->samplingFrequency = audio_sample_rate;
p += 4; // skip over 'bitrate_maximum'
u_int32_t bitrate_nominal = (p[3]<<24)|(p[2]<<16)|(p[1]<<8)|p[0]; p += 4;
if (bitrate_nominal > 0) track->estBitrate = (bitrate_nominal+500)/1000; // round
p += 4; // skip over 'bitrate_minimum'
// Note the two 'block sizes' (samples per packet), and their durations in microseconds:
u_int8_t blocksizeBits = *p++;
unsigned& blocksize_0 = track->vtoHdrs.blocksize[0]; // alias
unsigned& blocksize_1 = track->vtoHdrs.blocksize[1]; // alias
blocksize_0 = blocksizeFromExponent(blocksizeBits&0x0F);
blocksize_1 = blocksizeFromExponent(blocksizeBits>>4);
double uSecsPerSample = 1000000.0/(track->samplingFrequency*2);
// The "2" is needed because successive Vorbis transform blocks overlap by half;
// each packet therefore yields only about half a block of new output samples
track->vtoHdrs.uSecsPerPacket[0] = (unsigned)(uSecsPerSample*blocksize_0);
track->vtoHdrs.uSecsPerPacket[1] = (unsigned)(uSecsPerSample*blocksize_1);
#ifdef DEBUG
fprintf(stderr, "\t%u Hz, %u-channel, %u kbps (est), block sizes: %u,%u (%u,%u us)\n",
track->samplingFrequency, track->numChannels, track->estBitrate,
blocksize_0, blocksize_1,
track->vtoHdrs.uSecsPerPacket[0], track->vtoHdrs.uSecsPerPacket[1]);
#endif
// To be valid, "blocksize_0" must be <= "blocksize_1", and both must be in [64,8192]:
if (!(blocksize_0 <= blocksize_1 && blocksize_0 >= 64 && blocksize_1 <= 8192)) {
fprintf(stderr, "Invalid Vorbis \"blocksize_0\" (%d) and/or \"blocksize_1\" (%d)!\n",
blocksize_0, blocksize_1);
return False;
}
} else if (firstByte == 3) { // "comment" header
if (!validateCommentHeader(p, headerSize)) return False;
} else if (firstByte == 5) { // "setup" header
// Parse the "setup" header to get the values that we want:
// "vorbis_mode_count", and "vorbis_mode_blockflag" for each mode. Unfortunately these come
// near the end of the header, so we have to parse lots of other crap first.
p += 7;
if (!parseVorbisSetupHeader(track, p, headerSize)) {
fprintf(stderr, "Failed to parse Vorbis \"setup\" header!\n");
return False;
}
}
} else if (strcmp(track->mimeType, "video/THEORA") == 0) {
u_int8_t const firstByte = p[0];
if (firstByte == 0x80) { // "identification" header
if (headerSize < 42) {
fprintf(stderr, "Theora \"identification\" header is too short (%d bytes)\n", headerSize);
return False;
} else if ((p[41]&0x7) != 0) {
fprintf(stderr, "Theora \"identification\" header: 'res' bits are non-zero\n");
return False;
}
track->vtoHdrs.KFGSHIFT = ((p[40]&3)<<3) | (p[41]>>5);
u_int32_t FRN = (p[22]<<24) | (p[23]<<16) | (p[24]<<8) | p[25]; // Frame rate numerator
u_int32_t FRD = (p[26]<<24) | (p[27]<<16) | (p[28]<<8) | p[29]; // Frame rate denominator
#ifdef DEBUG
fprintf(stderr, "\tKFGSHIFT %d, Frame rate numerator %d, Frame rate denominator %d\n", track->vtoHdrs.KFGSHIFT, FRN, FRD);
#endif
if (FRN == 0 || FRD == 0) {
fprintf(stderr, "Theora \"identification\" header: Bad FRN and/or FRD values: %d, %d\n", FRN, FRD);
return False;
}
track->vtoHdrs.uSecsPerFrame = (unsigned)((1000000.0*FRD)/FRN);
#ifdef DEBUG
fprintf(stderr, "\t\t=> %u microseconds per frame\n", track->vtoHdrs.uSecsPerFrame);
#endif
} else if (firstByte == 0x81) { // "comment" header
if (!validateCommentHeader(p, headerSize)) return False;
} else if (firstByte == 0x82) { // "setup" header
// We don't care about the contents of the Theora "setup" header; just assume it's valid
}
} else { // Opus audio
if (strncmp((char const*)p, "OpusHead", 8) == 0) { // "identification" header
// Just check the size, and the 'major' number of the version byte:
if (headerSize < 19 || (p[8]&0xF0) != 0) return False;
} else { // comment header
if (!validateCommentHeader(p, headerSize, 1/*isOpus*/)) return False;
}
}
return True;
}
void OggFileParser::parseAndDeliverPages() {
#ifdef DEBUG
fprintf(stderr, "parsing and delivering data\n");
#endif
while (parseAndDeliverPage()) {}
}
Boolean OggFileParser::parseAndDeliverPage() {
u_int8_t header_type_flag;
u_int32_t bitstream_serial_number;
parseStartOfPage(header_type_flag, bitstream_serial_number);
OggDemuxedTrack* demuxedTrack = fOurDemux->lookupDemuxedTrack(bitstream_serial_number);
if (demuxedTrack == NULL) { // this track is not being read
#ifdef DEBUG
fprintf(stderr, "\tIgnoring page from unread track; skipping %d remaining packet data bytes\n",
fPacketSizeTable->totSizes);
#endif
skipBytes(fPacketSizeTable->totSizes);
return True;
} else if (fPacketSizeTable->totSizes == 0) {
// This page is empty (has no packets). Skip it and continue
#ifdef DEBUG
fprintf(stderr, "\t[track: %s] Skipping empty page\n", demuxedTrack->MIMEtype());
#endif
return True;
}
// Start delivering packets next:
demuxedTrack->fCurrentPageIsContinuation = (header_type_flag&0x01) != 0;
fCurrentTrackNumber = bitstream_serial_number;
fCurrentParseState = DELIVERING_PACKET_WITHIN_PAGE;
saveParserState();
return False;
}
Boolean OggFileParser::deliverPacketWithinPage() {
OggDemuxedTrack* demuxedTrack = fOurDemux->lookupDemuxedTrack(fCurrentTrackNumber);
if (demuxedTrack == NULL) return False; // should not happen
unsigned packetNum = fPacketSizeTable->nextPacketNumToDeliver;
unsigned packetSize = fPacketSizeTable->size[packetNum];
if (!demuxedTrack->isCurrentlyAwaitingData()) {
// Someone has been reading this stream, but isn't right now.
// We can't deliver this frame until he asks for it, so punt for now.
// The next time he asks for a frame, he'll get it.
#ifdef DEBUG
fprintf(stderr, "\t[track: %s] Deferring delivery of packet %d (%d bytes%s)\n",
demuxedTrack->MIMEtype(), packetNum, packetSize,
packetNum == fPacketSizeTable->numCompletedPackets ? " (incomplete)" : "");
#endif
return True;
}
// Deliver the next packet:
#ifdef DEBUG
fprintf(stderr, "\t[track: %s] Delivering packet %d (%d bytes%s)\n", demuxedTrack->MIMEtype(),
packetNum, packetSize,
packetNum == fPacketSizeTable->numCompletedPackets ? " (incomplete)" : "");
#endif
unsigned numBytesDelivered
= packetSize < demuxedTrack->maxSize() ? packetSize : demuxedTrack->maxSize();
getBytes(demuxedTrack->to(), numBytesDelivered);
u_int8_t firstByte = numBytesDelivered > 0 ? demuxedTrack->to()[0] : 0x00;
u_int8_t secondByte = numBytesDelivered > 1 ? demuxedTrack->to()[1] : 0x00;
demuxedTrack->to() += numBytesDelivered;
if (demuxedTrack->fCurrentPageIsContinuation) { // the previous page's read was incomplete
demuxedTrack->frameSize() += numBytesDelivered;
} else {
// This is the first delivery for this "doGetNextFrame()" call.
demuxedTrack->frameSize() = numBytesDelivered;
}
if (packetSize > demuxedTrack->maxSize()) {
demuxedTrack->numTruncatedBytes() += packetSize - demuxedTrack->maxSize();
}
demuxedTrack->maxSize() -= numBytesDelivered;
// Figure out the duration and presentation time of this frame.
unsigned durationInMicroseconds;
OggTrack* track = fOurFile.lookup(demuxedTrack->fOurTrackNumber);
if (strcmp(track->mimeType, "audio/VORBIS") == 0) {
if ((firstByte&0x01) != 0) { // This is a header packet
durationInMicroseconds = 0;
} else { // This is a data packet.
// Parse the first byte to figure out its duration.
// Extract the next "track->vtoHdrs.ilog_vorbis_mode_count_minus_1" bits of the first byte:
u_int8_t const mask = 0xFE<<(track->vtoHdrs.ilog_vorbis_mode_count_minus_1);
u_int8_t const modeNumber = (firstByte&~mask)>>1;
if (modeNumber >= track->vtoHdrs.vorbis_mode_count) {
fprintf(stderr, "Error: Bad mode number %d (>= vorbis_mode_count %d) in Vorbis packet!\n",
modeNumber, track->vtoHdrs.vorbis_mode_count);
durationInMicroseconds = 0;
} else {
unsigned blockNumber = track->vtoHdrs.vorbis_mode_blockflag[modeNumber];
durationInMicroseconds = track->vtoHdrs.uSecsPerPacket[blockNumber];
}
}
} else if (strcmp(track->mimeType, "video/THEORA") == 0) {
if ((firstByte&0x80) != 0) { // This is a header packet
durationInMicroseconds = 0;
} else { // This is a data packet.
durationInMicroseconds = track->vtoHdrs.uSecsPerFrame;
}
} else { // "audio/OPUS"
if (firstByte == 0x4F/*'O'*/ && secondByte == 0x70/*'p'*/) { // This is a header packet
durationInMicroseconds = 0;
} else { // This is a data packet.
// Parse the first byte to figure out the duration of each frame, and then (if necessary)
// parse the second byte to figure out how many frames are in this packet:
u_int8_t config = firstByte >> 3;
u_int8_t c = firstByte & 0x03;
unsigned const configDuration[32] = { // in microseconds
10000, 20000, 40000, 60000, // config 0..3
10000, 20000, 40000, 60000, // config 4..7
10000, 20000, 40000, 60000, // config 8..11
10000, 20000, // config 12..13
10000, 20000, // config 14..15
2500, 5000, 10000, 20000, // config 16..19
2500, 5000, 10000, 20000, // config 20..23
2500, 5000, 10000, 20000, // config 24..27
2500, 5000, 10000, 20000 // config 28..31
};
unsigned const numFramesInPacket = c == 0 ? 1 : c == 3 ? (secondByte&0x3F) : 2;
durationInMicroseconds = numFramesInPacket*configDuration[config];
}
}
if (demuxedTrack->nextPresentationTime().tv_sec == 0 && demuxedTrack->nextPresentationTime().tv_usec == 0) {
// This is the first delivery. Initialize "demuxedTrack->nextPresentationTime()":
gettimeofday(&demuxedTrack->nextPresentationTime(), NULL);
}
demuxedTrack->presentationTime() = demuxedTrack->nextPresentationTime();
demuxedTrack->durationInMicroseconds() = durationInMicroseconds;
demuxedTrack->nextPresentationTime().tv_usec += durationInMicroseconds;
while (demuxedTrack->nextPresentationTime().tv_usec >= 1000000) {
++demuxedTrack->nextPresentationTime().tv_sec;
demuxedTrack->nextPresentationTime().tv_usec -= 1000000;
}
saveParserState();
// And check whether there's a next packet in this page:
if (packetNum == fPacketSizeTable->numCompletedPackets) {
// This delivery was for an incomplete packet, at the end of the page.
// Return without completing delivery:
fCurrentParseState = PARSING_AND_DELIVERING_PAGES;
return False;
}
if (packetNum < fPacketSizeTable->numCompletedPackets-1
|| fPacketSizeTable->lastPacketIsIncomplete) {
// There is at least one more packet (possibly incomplete) left in this page.
// Deliver it next:
++fPacketSizeTable->nextPacketNumToDeliver;
} else {
// Start parsing a new page next:
fCurrentParseState = PARSING_AND_DELIVERING_PAGES;
}
FramedSource::afterGetting(demuxedTrack); // completes delivery
return True;
}
void OggFileParser::parseStartOfPage(u_int8_t& header_type_flag,
u_int32_t& bitstream_serial_number) {
saveParserState();
// First, make sure we start with the 'capture_pattern': 0x4F676753 ('OggS'):
while (test4Bytes() != 0x4F676753) {
skipBytes(1);
saveParserState(); // ensures forward progress through the file
}
skipBytes(4);
#ifdef DEBUG
fprintf(stderr, "\nSaw Ogg page header:\n");
#endif
u_int8_t stream_structure_version = get1Byte();
if (stream_structure_version != 0) {
fprintf(stderr, "Saw page with unknown Ogg file version number: 0x%02x\n", stream_structure_version);
}
header_type_flag = get1Byte();
#ifdef DEBUG
fprintf(stderr, "\theader_type_flag: 0x%02x (", header_type_flag);
if (header_type_flag&0x01) fprintf(stderr, "continuation ");
if (header_type_flag&0x02) fprintf(stderr, "bos ");
if (header_type_flag&0x04) fprintf(stderr, "eos ");
fprintf(stderr, ")\n");
#endif
u_int32_t granule_position1 = byteSwap(get4Bytes());
u_int32_t granule_position2 = byteSwap(get4Bytes());
bitstream_serial_number = byteSwap(get4Bytes());
u_int32_t page_sequence_number = byteSwap(get4Bytes());
u_int32_t CRC_checksum = byteSwap(get4Bytes());
u_int8_t number_page_segments = get1Byte();
#ifdef DEBUG
fprintf(stderr, "\tgranule_position 0x%08x%08x, bitstream_serial_number 0x%08x, page_sequence_number 0x%08x, CRC_checksum 0x%08x, number_page_segments %d\n", granule_position2, granule_position1, bitstream_serial_number, page_sequence_number, CRC_checksum, number_page_segments);
#else
// Dummy statements to prevent 'unused variable' compiler warnings:
#define DUMMY_STATEMENT(x) do {x = x;} while (0)
DUMMY_STATEMENT(granule_position1);
DUMMY_STATEMENT(granule_position2);
DUMMY_STATEMENT(page_sequence_number);
DUMMY_STATEMENT(CRC_checksum);
#endif
// Look at the "segment_table" to count the sizes of the packets in this page:
delete fPacketSizeTable/*if any*/; fPacketSizeTable = new PacketSizeTable(number_page_segments);
u_int8_t lacing_value = 0;
#ifdef DEBUG
fprintf(stderr, "\tsegment_table\n");
#endif
for (unsigned i = 0; i < number_page_segments; ++i) {
lacing_value = get1Byte();
#ifdef DEBUG
fprintf(stderr, "\t\t%d:\t%d", i, lacing_value);
#endif
fPacketSizeTable->totSizes += lacing_value;
fPacketSizeTable->size[fPacketSizeTable->numCompletedPackets] += lacing_value;
if (lacing_value < 255) {
// This completes a packet:
#ifdef DEBUG
fprintf(stderr, " (->%d)", fPacketSizeTable->size[fPacketSizeTable->numCompletedPackets]);
#endif
++fPacketSizeTable->numCompletedPackets;
}
#ifdef DEBUG
fprintf(stderr, "\n");
#endif
}
fPacketSizeTable->lastPacketIsIncomplete = lacing_value == 255;
}
live/liveMedia/OggFileParser.hh 000444 001752 001752 00000006022 12656261123 016362 0 ustar 00rsf rsf 000000 000000 /**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A parser for an Ogg file
// C++ header
#ifndef _OGG_FILE_PARSER_HH
#ifndef _STREAM_PARSER_HH
#include "StreamParser.hh"
#endif
#ifndef _OGG_FILE_HH
#include "OggFile.hh"
#endif
// An enum representing the current state of the parser:
enum OggParseState {
PARSING_START_OF_FILE,
PARSING_AND_DELIVERING_PAGES,
DELIVERING_PACKET_WITHIN_PAGE
};
// A structure that counts the sizes of 'packets' given by each page's "segment_table":
class PacketSizeTable {
public:
PacketSizeTable(unsigned number_page_segments);
~PacketSizeTable();
unsigned numCompletedPackets; // will be <= "number_page_segments"
unsigned* size; // an array of sizes of each of the packets
unsigned totSizes;
unsigned nextPacketNumToDeliver;
Boolean lastPacketIsIncomplete; // iff the last segment's 'lacing' was 255
};
class OggFileParser: public StreamParser {
public:
OggFileParser(OggFile& ourFile, FramedSource* inputSource,
FramedSource::onCloseFunc* onEndFunc, void* onEndClientData,
OggDemux* ourDemux = NULL);
virtual ~OggFileParser();
// StreamParser 'client continue' function:
static void continueParsing(void* clientData, unsigned char* ptr, unsigned size, struct timeval presentationTime);
void continueParsing();
private:
Boolean needHeaders() { return fNumUnfulfilledTracks > 0; }
// Parsing functions:
Boolean parse(); // returns True iff we have finished parsing all BOS pages (on initialization)
Boolean parseStartOfFile();
u_int8_t parseInitialPage(); // returns the 'header_type_flag' byte
void parseAndDeliverPages();
Boolean parseAndDeliverPage();
Boolean deliverPacketWithinPage();
void parseStartOfPage(u_int8_t& header_type_flag, u_int32_t& bitstream_serial_number);
Boolean validateHeader(OggTrack* track, u_int8_t const* p, unsigned headerSize);
private:
// General state for parsing:
OggFile& fOurFile;
FramedSource* fInputSource;
FramedSource::onCloseFunc* fOnEndFunc;
void* fOnEndClientData;
OggDemux* fOurDemux;
OggParseState fCurrentParseState;
unsigned fNumUnfulfilledTracks;
PacketSizeTable* fPacketSizeTable;
u_int32_t fCurrentTrackNumber;
u_int8_t* fSavedPacket; // used to temporarily save a copy of a 'packet' from a page
};
#endif
live/liveMedia/OggFileServerDemux.cpp 000444 001752 001752 00000010157 12656261123 017566 0 ustar 00rsf rsf 000000 000000 /**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A server demultiplexor for an Ogg file
// Implementation
#include "OggFileServerDemux.hh"
#include "OggFileServerMediaSubsession.hh"
void OggFileServerDemux
::createNew(UsageEnvironment& env, char const* fileName,
onCreationFunc* onCreation, void* onCreationClientData) {
(void)new OggFileServerDemux(env, fileName,
onCreation, onCreationClientData);
}
ServerMediaSubsession* OggFileServerDemux::newServerMediaSubsession() {
u_int32_t dummyResultTrackNumber;
return newServerMediaSubsession(dummyResultTrackNumber);
}
ServerMediaSubsession* OggFileServerDemux
::newServerMediaSubsession(u_int32_t& resultTrackNumber) {
resultTrackNumber = 0;
OggTrack* nextTrack = fIter->next();
if (nextTrack == NULL) return NULL;
return newServerMediaSubsessionByTrackNumber(nextTrack->trackNumber);
}
ServerMediaSubsession* OggFileServerDemux
::newServerMediaSubsessionByTrackNumber(u_int32_t trackNumber) {
OggTrack* track = fOurOggFile->lookup(trackNumber);
if (track == NULL) return NULL;
ServerMediaSubsession* result = OggFileServerMediaSubsession::createNew(*this, track);
if (result != NULL) {
#ifdef DEBUG
fprintf(stderr, "Created 'ServerMediaSubsession' object for track #%d: (%s)\n", track->trackNumber, track->mimeType);
#endif
}
return result;
}
FramedSource* OggFileServerDemux::newDemuxedTrack(unsigned clientSessionId, u_int32_t trackNumber) {
OggDemux* demuxToUse = NULL;
if (clientSessionId != 0 && clientSessionId == fLastClientSessionId) {
demuxToUse = fLastCreatedDemux; // use the same demultiplexor as before
// Note: This code relies upon the fact that the creation of streams for different
// client sessions does not overlap - so all demuxed tracks are created for one "OggDemux" at a time.
// Also, the "clientSessionId != 0" test is a hack, because 'session 0' is special; its audio and video streams
// are created and destroyed one-at-a-time, rather than both streams being
// created, and then (later) both streams being destroyed (as is the case
// for other ('real') session ids). Because of this, a separate demultiplexor is used for each 'session 0' track.
}
if (demuxToUse == NULL) demuxToUse = fOurOggFile->newDemux();
fLastClientSessionId = clientSessionId;
fLastCreatedDemux = demuxToUse;
return demuxToUse->newDemuxedTrackByTrackNumber(trackNumber);
}
OggFileServerDemux
::OggFileServerDemux(UsageEnvironment& env, char const* fileName,
onCreationFunc* onCreation, void* onCreationClientData)
: Medium(env),
fFileName(fileName), fOnCreation(onCreation), fOnCreationClientData(onCreationClientData),
fIter(NULL/*until the OggFile is created*/),
fLastClientSessionId(0), fLastCreatedDemux(NULL) {
OggFile::createNew(env, fileName, onOggFileCreation, this);
}
OggFileServerDemux::~OggFileServerDemux() {
Medium::close(fOurOggFile);
delete fIter;
}
void OggFileServerDemux::onOggFileCreation(OggFile* newFile, void* clientData) {
((OggFileServerDemux*)clientData)->onOggFileCreation(newFile);
}
void OggFileServerDemux::onOggFileCreation(OggFile* newFile) {
fOurOggFile = newFile;
fIter = new OggTrackTableIterator(fOurOggFile->trackTable());
// Now, call our own creation notification function:
if (fOnCreation != NULL) (*fOnCreation)(this, fOnCreationClientData);
}
live/liveMedia/OggFileServerMediaSubsession.cpp 000444 001752 001752 00000004344 12656261123 021602 0 ustar 00rsf rsf 000000 000000 /**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A 'ServerMediaSubsession' object that creates new, unicast, "RTPSink"s
// on demand, from a track within an Ogg file.
// Implementation
#include "OggFileServerMediaSubsession.hh"
#include "OggDemuxedTrack.hh"
#include "FramedFilter.hh"
OggFileServerMediaSubsession* OggFileServerMediaSubsession
::createNew(OggFileServerDemux& demux, OggTrack* track) {
return new OggFileServerMediaSubsession(demux, track);
}
OggFileServerMediaSubsession
::OggFileServerMediaSubsession(OggFileServerDemux& demux, OggTrack* track)
: FileServerMediaSubsession(demux.envir(), demux.fileName(), False),
fOurDemux(demux), fTrack(track), fNumFiltersInFrontOfTrack(0) {
}
OggFileServerMediaSubsession::~OggFileServerMediaSubsession() {
}
FramedSource* OggFileServerMediaSubsession
::createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate) {
FramedSource* baseSource = fOurDemux.newDemuxedTrack(clientSessionId, fTrack->trackNumber);
if (baseSource == NULL) return NULL;
return fOurDemux.ourOggFile()
->createSourceForStreaming(baseSource, fTrack->trackNumber,
estBitrate, fNumFiltersInFrontOfTrack);
}
RTPSink* OggFileServerMediaSubsession
::createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* /*inputSource*/) {
return fOurDemux.ourOggFile()
->createRTPSinkForTrackNumber(fTrack->trackNumber, rtpGroupsock, rtpPayloadTypeIfDynamic);
}
live/liveMedia/OggFileServerMediaSubsession.hh 000444 001752 001752 00000003734 12656261123 021421 0 ustar 00rsf rsf 000000 000000 /**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A 'ServerMediaSubsession' object that creates new, unicast, "RTPSink"s
// on demand, from a track within an Ogg file.
// C++ header
#ifndef _OGG_FILE_SERVER_MEDIA_SUBSESSION_HH
#define _OGG_FILE_SERVER_MEDIA_SUBSESSION_HH
#ifndef _FILE_SERVER_MEDIA_SUBSESSION_HH
#include "FileServerMediaSubsession.hh"
#endif
#ifndef _OGG_FILE_SERVER_DEMUX_HH
#include "OggFileServerDemux.hh"
#endif
class OggFileServerMediaSubsession: public FileServerMediaSubsession {
public:
static OggFileServerMediaSubsession*
createNew(OggFileServerDemux& demux, OggTrack* track);
protected:
OggFileServerMediaSubsession(OggFileServerDemux& demux, OggTrack* track);
// called only by createNew(), or by subclass constructors
virtual ~OggFileServerMediaSubsession();
protected: // redefined virtual functions
virtual FramedSource* createNewStreamSource(unsigned clientSessionId,
unsigned& estBitrate);
virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource);
protected:
OggFileServerDemux& fOurDemux;
OggTrack* fTrack;
unsigned fNumFiltersInFrontOfTrack;
};
#endif
live/liveMedia/OggFileSink.cpp 000444 001752 001752 00000026763 12656261123 016233 0 ustar 00rsf rsf 000000 000000 /**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// 'Ogg' File Sink (recording a single media track only)
// Implementation
#include "OggFileSink.hh"
#include "OutputFile.hh"
#include "VorbisAudioRTPSource.hh" // for "parseVorbisOrTheoraConfigStr()"
#include "MPEG2TransportStreamMultiplexor.hh" // for calculateCRC()
#include "FramedSource.hh"
OggFileSink* OggFileSink
::createNew(UsageEnvironment& env, char const* fileName,
unsigned samplingFrequency, char const* configStr,
unsigned bufferSize, Boolean oneFilePerFrame) {
do {
FILE* fid;
char const* perFrameFileNamePrefix;
if (oneFilePerFrame) {
// Create the fid for each frame
fid = NULL;
perFrameFileNamePrefix = fileName;
} else {
// Normal case: create the fid once
fid = OpenOutputFile(env, fileName);
if (fid == NULL) break;
perFrameFileNamePrefix = NULL;
}
return new OggFileSink(env, fid, samplingFrequency, configStr, bufferSize, perFrameFileNamePrefix);
} while (0);
return NULL;
}
OggFileSink::OggFileSink(UsageEnvironment& env, FILE* fid,
unsigned samplingFrequency, char const* configStr,
unsigned bufferSize, char const* perFrameFileNamePrefix)
: FileSink(env, fid, bufferSize, perFrameFileNamePrefix),
fSamplingFrequency(samplingFrequency), fConfigStr(configStr),
fHaveWrittenFirstFrame(False), fHaveSeenEOF(False),
fGranulePosition(0), fGranulePositionAdjustment(0), fPageSequenceNumber(0),
fIsTheora(False), fGranuleIncrementPerFrame(1),
fAltFrameSize(0), fAltNumTruncatedBytes(0) {
fAltBuffer = new unsigned char[bufferSize];
// Initialize our 'Ogg page header' array with constant values:
u_int8_t* p = fPageHeaderBytes;
*p++=0x4f; *p++=0x67; *p++=0x67; *p++=0x53; // bytes 0..3: 'capture_pattern': "OggS"
*p++=0; // byte 4: 'stream_structure_version': 0
*p++=0; // byte 5: 'header_type_flag': set on each write
*p++=0; *p++=0; *p++=0; *p++=0; *p++=0; *p++=0; *p++=0; *p++=0;
// bytes 6..13: 'granule_position': set on each write
*p++=1; *p++=0; *p++=0; *p++=0; // bytes 14..17: 'bitstream_serial_number': 1
*p++=0; *p++=0; *p++=0; *p++=0; // bytes 18..21: 'page_sequence_number': set on each write
*p++=0; *p++=0; *p++=0; *p++=0; // bytes 22..25: 'CRC_checksum': set on each write
*p=0; // byte 26: 'number_page_segments': set on each write
}
OggFileSink::~OggFileSink() {
// We still have the previously-arrived frame, so write it to the file before we end:
fHaveSeenEOF = True;
OggFileSink::addData(fAltBuffer, fAltFrameSize, fAltPresentationTime);
delete[] fAltBuffer;
}
Boolean OggFileSink::continuePlaying() {
// Identical to "FileSink::continuePlaying()",
// except that we use our own 'on source closure' function:
if (fSource == NULL) return False;
fSource->getNextFrame(fBuffer, fBufferSize,
FileSink::afterGettingFrame, this,
ourOnSourceClosure, this);
return True;
}
#define PAGE_DATA_MAX_SIZE (255*255)
void OggFileSink::addData(unsigned char const* data, unsigned dataSize,
struct timeval presentationTime) {
if (dataSize == 0) return;
// Set "fGranulePosition" for this frame:
if (fIsTheora) {
// Special case for Theora: "fGranulePosition" is supposed to be made up of a pair:
// (frame count to last key frame) | (frame count since last key frame)
// However, because there appears to be no easy way to figure out which frames are key frames,
// we just assume that all frames are key frames.
if (!(data[0] >= 0x80 && data[0] <= 0x82)) { // for header pages, "fGranulePosition" remains 0
fGranulePosition += fGranuleIncrementPerFrame;
}
} else {
double ptDiff
= (presentationTime.tv_sec - fFirstPresentationTime.tv_sec)
+ (presentationTime.tv_usec - fFirstPresentationTime.tv_usec)/1000000.0;
int64_t newGranulePosition
= (int64_t)(fSamplingFrequency*ptDiff) + fGranulePositionAdjustment;
if (newGranulePosition < fGranulePosition) {
// Update "fGranulePositionAdjustment" so that "fGranulePosition" remains monotonic
fGranulePositionAdjustment += fGranulePosition - newGranulePosition;
} else {
fGranulePosition = newGranulePosition;
}
}
// Write the frame to the file as a single Ogg 'page' (or perhaps as multiple pages
// if it's too big for a single page). We don't aggregate more than one frame within
// an Ogg page because that's not legal for some headers, and because that would make
// it difficult for us to properly set the 'eos' (end of stream) flag on the last page.
// First, figure out how many pages to write here
// (a page can contain no more than PAGE_DATA_MAX_SIZE bytes)
unsigned numPagesToWrite = dataSize/PAGE_DATA_MAX_SIZE + 1;
// Note that if "dataSize" is an integral multiple of PAGE_DATA_MAX_SIZE, there will
// be an extra 0-size page at the end
for (unsigned i = 0; i < numPagesToWrite; ++i) {
// First, fill in the changeable parts of our 'page header' array;
u_int8_t header_type_flag = 0x0;
if (!fHaveWrittenFirstFrame && i == 0) {
header_type_flag |= 0x02; // 'bos'
fHaveWrittenFirstFrame = True; // for the future
}
if (i > 0) header_type_flag |= 0x01; // 'continuation'
if (fHaveSeenEOF && i == numPagesToWrite-1) header_type_flag |= 0x04; // 'eos'
fPageHeaderBytes[5] = header_type_flag;
if (i < numPagesToWrite-1) {
// For pages where the frame does not end, set 'granule_position' in the header to -1:
fPageHeaderBytes[6] = fPageHeaderBytes[7] = fPageHeaderBytes[8] = fPageHeaderBytes[9] =
fPageHeaderBytes[10] = fPageHeaderBytes[11] = fPageHeaderBytes[12] = fPageHeaderBytes[13]
= 0xFF;
} else {
fPageHeaderBytes[6] = (u_int8_t)fGranulePosition;
fPageHeaderBytes[7] = (u_int8_t)(fGranulePosition>>8);
fPageHeaderBytes[8] = (u_int8_t)(fGranulePosition>>16);
fPageHeaderBytes[9] = (u_int8_t)(fGranulePosition>>24);
fPageHeaderBytes[10] = (u_int8_t)(fGranulePosition>>32);
fPageHeaderBytes[11] = (u_int8_t)(fGranulePosition>>40);
fPageHeaderBytes[12] = (u_int8_t)(fGranulePosition>>48);
fPageHeaderBytes[13] = (u_int8_t)(fGranulePosition>>56);
}
fPageHeaderBytes[18] = (u_int8_t)fPageSequenceNumber;
fPageHeaderBytes[19] = (u_int8_t)(fPageSequenceNumber>>8);
fPageHeaderBytes[20] = (u_int8_t)(fPageSequenceNumber>>16);
fPageHeaderBytes[21] = (u_int8_t)(fPageSequenceNumber>>24);
++fPageSequenceNumber;
unsigned pageDataSize;
u_int8_t number_page_segments;
if (dataSize >= PAGE_DATA_MAX_SIZE) {
pageDataSize = PAGE_DATA_MAX_SIZE;
number_page_segments = 255;
} else {
pageDataSize = dataSize;
number_page_segments = (pageDataSize+255)/255; // so that we don't end with a lacing of 255
}
fPageHeaderBytes[26] = number_page_segments;
u_int8_t segment_table[255];
for (unsigned j = 0; j < (unsigned)(number_page_segments-1); ++j) {
segment_table[j] = 255;
}
segment_table[number_page_segments-1] = pageDataSize%255;
// Compute the CRC from the 'page header' array, the 'segment_table', and the frame data:
u_int32_t crc = 0;
fPageHeaderBytes[22] = fPageHeaderBytes[23] = fPageHeaderBytes[24] = fPageHeaderBytes[25] = 0;
crc = calculateCRC(fPageHeaderBytes, 27, 0);
crc = calculateCRC(segment_table, number_page_segments, crc);
crc = calculateCRC(data, pageDataSize, crc);
fPageHeaderBytes[22] = (u_int8_t)crc;
fPageHeaderBytes[23] = (u_int8_t)(crc>>8);
fPageHeaderBytes[24] = (u_int8_t)(crc>>16);
fPageHeaderBytes[25] = (u_int8_t)(crc>>24);
// Then write out the 'page header' array:
FileSink::addData(fPageHeaderBytes, 27, presentationTime);
// Then write out the 'segment_table':
FileSink::addData(segment_table, number_page_segments, presentationTime);
// Then add frame data, to complete the page:
FileSink::addData(data, pageDataSize, presentationTime);
data += pageDataSize;
dataSize -= pageDataSize;
}
}
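The segment-table arithmetic used above can be sketched in isolation. This is a hedged reconstruction of the lacing-value rule for a page that is not completely full (the `else` branch above) — a standalone helper for illustration, not a function from the library:

```cpp
#include <cassert>
#include <vector>

// Compute the Ogg 'lacing values' for one page's worth of frame data
// (pageDataSize < 255*255).  Each lacing value is 0..255; a value of 255
// means "the packet continues in the next segment", so the final lacing
// value is always pageDataSize%255 (i.e., less than 255), which is what
// marks the end of the packet.
static std::vector<unsigned char> lacingValues(unsigned pageDataSize) {
  // Mirrors the code above: (pageDataSize+255)/255 segments, all 255
  // except the last.  A 0-byte packet still occupies one segment (lacing 0).
  unsigned numSegments = (pageDataSize + 255) / 255;
  std::vector<unsigned char> table(numSegments, 255);
  table[numSegments - 1] = pageDataSize % 255;
  return table;
}
```

For example, a 600-byte packet becomes three segments laced 255, 255, 90 — the values sum back to the packet size, and the last one is below 255.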
void OggFileSink::afterGettingFrame(unsigned frameSize, unsigned numTruncatedBytes, struct timeval presentationTime) {
if (!fHaveWrittenFirstFrame) {
fFirstPresentationTime = presentationTime;
// If we have a 'config string' representing 'packed configuration headers'
// ("identification", "comment", "setup"), unpack them and prepend them to the file:
if (fConfigStr != NULL && fConfigStr[0] != '\0') {
u_int8_t* identificationHdr; unsigned identificationHdrSize;
u_int8_t* commentHdr; unsigned commentHdrSize;
u_int8_t* setupHdr; unsigned setupHdrSize;
u_int32_t identField;
parseVorbisOrTheoraConfigStr(fConfigStr,
identificationHdr, identificationHdrSize,
commentHdr, commentHdrSize,
setupHdr, setupHdrSize,
identField);
if (identificationHdrSize >= 42
&& strncmp((const char*)&identificationHdr[1], "theora", 6) == 0) {
// Hack for Theora video: Parse the "identification" hdr to get the "KFGSHIFT" parameter:
fIsTheora = True;
u_int8_t const KFGSHIFT = ((identificationHdr[40]&3)<<3) | (identificationHdr[41]>>5);
fGranuleIncrementPerFrame = (u_int64_t)(1 << KFGSHIFT);
}
OggFileSink::addData(identificationHdr, identificationHdrSize, presentationTime);
OggFileSink::addData(commentHdr, commentHdrSize, presentationTime);
// Hack: Handle the "setup" header as if it had arrived in the previous delivery, so that it
// gets written properly below:
if (setupHdrSize > fBufferSize) {
fAltFrameSize = fBufferSize;
fAltNumTruncatedBytes = setupHdrSize - fBufferSize;
} else {
fAltFrameSize = setupHdrSize;
fAltNumTruncatedBytes = 0;
}
memmove(fAltBuffer, setupHdr, fAltFrameSize);
fAltPresentationTime = presentationTime;
delete[] identificationHdr;
delete[] commentHdr;
delete[] setupHdr;
}
}
// Save this input frame for next time, and instead write the previous input frame now:
unsigned char* tmpPtr = fBuffer; fBuffer = fAltBuffer; fAltBuffer = tmpPtr;
unsigned prevFrameSize = fAltFrameSize; fAltFrameSize = frameSize;
unsigned prevNumTruncatedBytes = fAltNumTruncatedBytes; fAltNumTruncatedBytes = numTruncatedBytes;
struct timeval prevPresentationTime = fAltPresentationTime; fAltPresentationTime = presentationTime;
// Call the parent class to complete the normal file write with the (previous) input frame:
FileSink::afterGettingFrame(prevFrameSize, prevNumTruncatedBytes, prevPresentationTime);
}
void OggFileSink::ourOnSourceClosure(void* clientData) {
((OggFileSink*)clientData)->ourOnSourceClosure();
}
void OggFileSink::ourOnSourceClosure() {
fHaveSeenEOF = True;
// We still have the previously-arrived frame, so write it to the file before we end:
OggFileSink::addData(fAltBuffer, fAltFrameSize, fAltPresentationTime);
// Handle the closure for real:
onSourceClosure();
}
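The `calculateCRC()` calls above chain a page checksum across the header, segment table, and frame data. A minimal sketch of that checksum follows, assuming the CRC variant that the Ogg framing spec uses (generator polynomial 0x04C11DB7, initial value 0, no bit reflection, no final XOR); the three-argument chaining signature mirrors the calls above, but this body is an illustration, not the library's implementation:

```cpp
#include <cassert>
#include <cstdint>

// Bitwise, MSB-first CRC-32 over 'length' bytes, continuing from
// 'initialValue' so the CRC can be chained across multiple buffers.
static uint32_t calculateCRC(unsigned char const* data, unsigned length,
                             uint32_t initialValue) {
  uint32_t crc = initialValue;
  for (unsigned i = 0; i < length; ++i) {
    crc ^= ((uint32_t)data[i]) << 24; // feed the next byte into the top of the register
    for (int bit = 0; bit < 8; ++bit) {
      crc = (crc & 0x80000000) ? (crc << 1) ^ 0x04C11DB7 : (crc << 1);
    }
  }
  return crc;
}
```

Because the register simply continues between calls, computing the CRC over a concatenation in one shot gives the same result as chaining two calls — which is why the code above can checksum the page header, segment table, and data separately.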
live/liveMedia/ourMD5.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// Because MD5 may not be implemented (at least, with the same interface) on all systems,
// we have our own implementation.
// Implementation
#include "ourMD5.hh"
#include <NetCommon.h> // for u_int32_t, u_int64_t
#include <string.h>
#define DIGEST_SIZE_IN_BYTES 16
#define DIGEST_SIZE_IN_HEX_DIGITS (2*DIGEST_SIZE_IN_BYTES)
#define DIGEST_SIZE_AS_STRING (DIGEST_SIZE_IN_HEX_DIGITS+1)
// The state of a MD5 computation in progress:
class MD5Context {
public:
MD5Context();
~MD5Context();
void addData(unsigned char const* inputData, unsigned inputDataSize);
void end(char* outputDigest /*must point to an array of size DIGEST_SIZE_AS_STRING*/);
void finalize(unsigned char* outputDigestInBytes);
// Like "end()", except that the argument is a byte array, of size DIGEST_SIZE_IN_BYTES.
// This function is used to implement "end()".
private:
void zeroize(); // to remove potentially sensitive information
void transform64Bytes(unsigned char const block[64]); // does the actual MD5 transform
private:
u_int32_t fState[4]; // ABCD
u_int64_t fBitCount; // number of bits, modulo 2^64
unsigned char fWorkingBuffer[64];
};
char* our_MD5Data(unsigned char const* data, unsigned dataSize, char* outputDigest) {
MD5Context ctx;
ctx.addData(data, dataSize);
if (outputDigest == NULL) outputDigest = new char[DIGEST_SIZE_AS_STRING];
ctx.end(outputDigest);
return outputDigest;
}
unsigned char* our_MD5DataRaw(unsigned char const* data, unsigned dataSize,
unsigned char* outputDigest) {
MD5Context ctx;
ctx.addData(data, dataSize);
if (outputDigest == NULL) outputDigest = new unsigned char[DIGEST_SIZE_IN_BYTES];
ctx.finalize(outputDigest);
return outputDigest;
}
////////// MD5Context implementation //////////
MD5Context::MD5Context()
: fBitCount(0) {
// Initialize with magic constants:
fState[0] = 0x67452301;
fState[1] = 0xefcdab89;
fState[2] = 0x98badcfe;
fState[3] = 0x10325476;
}
MD5Context::~MD5Context() {
zeroize();
}
void MD5Context::addData(unsigned char const* inputData, unsigned inputDataSize) {
// Begin by noting how much of our 64-byte working buffer remains unfilled:
u_int64_t const byteCount = fBitCount>>3;
unsigned bufferBytesInUse = (unsigned)(byteCount&0x3F);
unsigned bufferBytesRemaining = 64 - bufferBytesInUse;
// Then update our bit count:
fBitCount += inputDataSize<<3;
unsigned i = 0;
if (inputDataSize >= bufferBytesRemaining) {
// We have enough input data to do (64-byte) MD5 transforms.
// Do this now, starting with a transform on our working buffer, then with
// (as many as possible) transforms on rest of the input data.
memcpy((unsigned char*)&fWorkingBuffer[bufferBytesInUse], (unsigned char*)inputData, bufferBytesRemaining);
transform64Bytes(fWorkingBuffer);
bufferBytesInUse = 0;
for (i = bufferBytesRemaining; i + 63 < inputDataSize; i += 64) {
transform64Bytes(&inputData[i]);
}
}
// Copy any remaining (and currently un-transformed) input data into our working buffer:
if (i < inputDataSize) {
memcpy((unsigned char*)&fWorkingBuffer[bufferBytesInUse], (unsigned char*)&inputData[i], inputDataSize - i);
}
}
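The buffering pattern in "addData()" above — fill a 64-byte working buffer, flush it with a transform when it fills, consume whole 64-byte blocks directly from the input, and carry the tail over — can be sketched on its own. In this illustrative class, a counter stands in for `transform64Bytes()`:

```cpp
#include <cassert>
#include <cstring>

// Accumulates input of arbitrary size into 64-byte blocks, counting how
// many blocks have been "transformed" and how many tail bytes remain
// buffered.  The control flow mirrors MD5Context::addData() above.
class BlockAccumulator {
public:
  BlockAccumulator() : fBytesInUse(0), fTransformCount(0) {}
  void addData(unsigned char const* data, unsigned size) {
    unsigned bytesRemaining = 64 - fBytesInUse;
    unsigned i = 0;
    if (size >= bytesRemaining) {
      // Fill and flush the working buffer, then consume whole blocks
      // directly from the input:
      memcpy(&fBuffer[fBytesInUse], data, bytesRemaining);
      ++fTransformCount; // would be: transform64Bytes(fBuffer)
      fBytesInUse = 0;
      for (i = bytesRemaining; i + 63 < size; i += 64) {
        ++fTransformCount; // would be: transform64Bytes(&data[i])
      }
    }
    if (i < size) { // buffer the un-transformed tail for next time
      memcpy(&fBuffer[fBytesInUse], &data[i], size - i);
      fBytesInUse += size - i;
    }
  }
  unsigned fBytesInUse, fTransformCount;
private:
  unsigned char fBuffer[64];
};
```

Feeding 100 bytes transforms one block and buffers 36; feeding 30 more completes a second block and leaves 2 bytes buffered — the same block boundaries as feeding all 130 bytes at once.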
void MD5Context::end(char* outputDigest) {
unsigned char digestInBytes[DIGEST_SIZE_IN_BYTES];
finalize(digestInBytes);
// Convert the digest from bytes (binary) to hex digits:
static char const hex[]="0123456789abcdef";
unsigned i;
for (i = 0; i < DIGEST_SIZE_IN_BYTES; ++i) {
outputDigest[2*i] = hex[digestInBytes[i] >> 4];
outputDigest[2*i+1] = hex[digestInBytes[i] & 0x0F];
}
outputDigest[2*i] = '\0';
}
// Routines that unpack 32 and 64-bit values into arrays of bytes (in little-endian order).
// (These are used to implement "finalize()".)
static void unpack32(unsigned char out[4], u_int32_t in) {
for (unsigned i = 0; i < 4; ++i) {
out[i] = (unsigned char)((in>>(8*i))&0xFF);
}
}
static void unpack64(unsigned char out[8], u_int64_t in) {
for (unsigned i = 0; i < 8; ++i) {
out[i] = (unsigned char)((in>>(8*i))&0xFF);
}
}
static unsigned char const PADDING[64] = {
0x80, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0
};
void MD5Context::finalize(unsigned char* outputDigestInBytes) {
// Unpack our bit count:
unsigned char bitCountInBytes[8];
unpack64(bitCountInBytes, fBitCount);
// Before 'finalizing', make sure that we transform any remaining bytes in our working buffer:
u_int64_t const byteCount = fBitCount>>3;
unsigned bufferBytesInUse = (unsigned)(byteCount&0x3F);
unsigned numPaddingBytes
= (bufferBytesInUse < 56) ? (56 - bufferBytesInUse) : (64 + 56 - bufferBytesInUse);
addData(PADDING, numPaddingBytes);
addData(bitCountInBytes, 8);
// Unpack our 'state' into the output digest:
unpack32(&outputDigestInBytes[0], fState[0]);
unpack32(&outputDigestInBytes[4], fState[1]);
unpack32(&outputDigestInBytes[8], fState[2]);
unpack32(&outputDigestInBytes[12], fState[3]);
zeroize();
}
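The padding-length formula in "finalize()" above encodes the MD5 rule that the buffered data, plus padding, plus the 8-byte bit count must land exactly on a 64-byte block boundary, with at least one padding byte (the 0x80 marker) always added. Extracted as a standalone helper for illustration:

```cpp
#include <cassert>

// Number of padding bytes MD5 adds before the 8-byte length field, given
// how many bytes of the current 64-byte block are already in use.
static unsigned md5NumPaddingBytes(unsigned bufferBytesInUse) {
  return (bufferBytesInUse < 56) ? (56 - bufferBytesInUse)
                                 : (64 + 56 - bufferBytesInUse);
}
```

For every possible buffer fill level 0..63, the result is between 1 and 64 and makes `bufferBytesInUse + padding + 8` a multiple of 64.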
void MD5Context::zeroize() {
fState[0] = fState[1] = fState[2] = fState[3] = 0;
fBitCount = 0;
for (unsigned i = 0; i < 64; ++i) fWorkingBuffer[i] = 0;
}
////////// Implementation of the MD5 transform ("MD5Context::transform64Bytes()") //////////
// Constants for the transform:
#define S11 7
#define S12 12
#define S13 17
#define S14 22
#define S21 5
#define S22 9
#define S23 14
#define S24 20
#define S31 4
#define S32 11
#define S33 16
#define S34 23
#define S41 6
#define S42 10
#define S43 15
#define S44 21
// Basic MD5 functions:
#define F(x, y, z) (((x) & (y)) | ((~x) & (z)))
#define G(x, y, z) (((x) & (z)) | ((y) & (~z)))
#define H(x, y, z) ((x) ^ (y) ^ (z))
#define I(x, y, z) ((y) ^ ((x) | (~z)))
// Rotate "x" left "n" bits:
#define ROTATE_LEFT(x, n) (((x) << (n)) | ((x) >> (32-(n))))
// Other transforms:
#define FF(a, b, c, d, x, s, ac) { \
(a) += F((b), (c), (d)) + (x) + (u_int32_t)(ac); \
(a) = ROTATE_LEFT((a), (s)); \
(a) += (b); \
}
#define GG(a, b, c, d, x, s, ac) { \
(a) += G((b), (c), (d)) + (x) + (u_int32_t)(ac); \
(a) = ROTATE_LEFT((a), (s)); \
(a) += (b); \
}
#define HH(a, b, c, d, x, s, ac) { \
(a) += H((b), (c), (d)) + (x) + (u_int32_t)(ac); \
(a) = ROTATE_LEFT((a), (s)); \
(a) += (b); \
}
#define II(a, b, c, d, x, s, ac) { \
(a) += I((b), (c), (d)) + (x) + (u_int32_t)(ac); \
(a) = ROTATE_LEFT((a), (s)); \
(a) += (b); \
}
void MD5Context::transform64Bytes(unsigned char const block[64]) {
u_int32_t a = fState[0], b = fState[1], c = fState[2], d = fState[3];
// Begin by packing "block" into an array ("x") of 16 32-bit values (in little-endian order):
u_int32_t x[16];
for (unsigned i = 0, j = 0; i < 16; ++i, j += 4) {
x[i] = ((u_int32_t)block[j]) | (((u_int32_t)block[j+1]) << 8) | (((u_int32_t)block[j+2]) << 16) | (((u_int32_t)block[j+3]) << 24);
}
// Now, perform the transform on the array "x":
// Round 1
FF(a, b, c, d, x[0], S11, 0xd76aa478); // 1
FF(d, a, b, c, x[1], S12, 0xe8c7b756); // 2
FF(c, d, a, b, x[2], S13, 0x242070db); // 3
FF(b, c, d, a, x[3], S14, 0xc1bdceee); // 4
FF(a, b, c, d, x[4], S11, 0xf57c0faf); // 5
FF(d, a, b, c, x[5], S12, 0x4787c62a); // 6
FF(c, d, a, b, x[6], S13, 0xa8304613); // 7
FF(b, c, d, a, x[7], S14, 0xfd469501); // 8
FF(a, b, c, d, x[8], S11, 0x698098d8); // 9
FF(d, a, b, c, x[9], S12, 0x8b44f7af); // 10
FF(c, d, a, b, x[10], S13, 0xffff5bb1); // 11
FF(b, c, d, a, x[11], S14, 0x895cd7be); // 12
FF(a, b, c, d, x[12], S11, 0x6b901122); // 13
FF(d, a, b, c, x[13], S12, 0xfd987193); // 14
FF(c, d, a, b, x[14], S13, 0xa679438e); // 15
FF(b, c, d, a, x[15], S14, 0x49b40821); // 16
// Round 2
GG(a, b, c, d, x[1], S21, 0xf61e2562); // 17
GG(d, a, b, c, x[6], S22, 0xc040b340); // 18
GG(c, d, a, b, x[11], S23, 0x265e5a51); // 19
GG(b, c, d, a, x[0], S24, 0xe9b6c7aa); // 20
GG(a, b, c, d, x[5], S21, 0xd62f105d); // 21
GG(d, a, b, c, x[10], S22, 0x2441453); // 22
GG(c, d, a, b, x[15], S23, 0xd8a1e681); // 23
GG(b, c, d, a, x[4], S24, 0xe7d3fbc8); // 24
GG(a, b, c, d, x[9], S21, 0x21e1cde6); // 25
GG(d, a, b, c, x[14], S22, 0xc33707d6); // 26
GG(c, d, a, b, x[3], S23, 0xf4d50d87); // 27
GG(b, c, d, a, x[8], S24, 0x455a14ed); // 28
GG(a, b, c, d, x[13], S21, 0xa9e3e905); // 29
GG(d, a, b, c, x[2], S22, 0xfcefa3f8); // 30
GG(c, d, a, b, x[7], S23, 0x676f02d9); // 31
GG(b, c, d, a, x[12], S24, 0x8d2a4c8a); // 32
// Round 3
HH(a, b, c, d, x[5], S31, 0xfffa3942); // 33
HH(d, a, b, c, x[8], S32, 0x8771f681); // 34
HH(c, d, a, b, x[11], S33, 0x6d9d6122); // 35
HH(b, c, d, a, x[14], S34, 0xfde5380c); // 36
HH(a, b, c, d, x[1], S31, 0xa4beea44); // 37
HH(d, a, b, c, x[4], S32, 0x4bdecfa9); // 38
HH(c, d, a, b, x[7], S33, 0xf6bb4b60); // 39
HH(b, c, d, a, x[10], S34, 0xbebfbc70); // 40
HH(a, b, c, d, x[13], S31, 0x289b7ec6); // 41
HH(d, a, b, c, x[0], S32, 0xeaa127fa); // 42
HH(c, d, a, b, x[3], S33, 0xd4ef3085); // 43
HH(b, c, d, a, x[6], S34, 0x4881d05); // 44
HH(a, b, c, d, x[9], S31, 0xd9d4d039); // 45
HH(d, a, b, c, x[12], S32, 0xe6db99e5); // 46
HH(c, d, a, b, x[15], S33, 0x1fa27cf8); // 47
HH(b, c, d, a, x[2], S34, 0xc4ac5665); // 48
// Round 4
II(a, b, c, d, x[0], S41, 0xf4292244); // 49
II(d, a, b, c, x[7], S42, 0x432aff97); // 50
II(c, d, a, b, x[14], S43, 0xab9423a7); // 51
II(b, c, d, a, x[5], S44, 0xfc93a039); // 52
II(a, b, c, d, x[12], S41, 0x655b59c3); // 53
II(d, a, b, c, x[3], S42, 0x8f0ccc92); // 54
II(c, d, a, b, x[10], S43, 0xffeff47d); // 55
II(b, c, d, a, x[1], S44, 0x85845dd1); // 56
II(a, b, c, d, x[8], S41, 0x6fa87e4f); // 57
II(d, a, b, c, x[15], S42, 0xfe2ce6e0); // 58
II(c, d, a, b, x[6], S43, 0xa3014314); // 59
II(b, c, d, a, x[13], S44, 0x4e0811a1); // 60
II(a, b, c, d, x[4], S41, 0xf7537e82); // 61
II(d, a, b, c, x[11], S42, 0xbd3af235); // 62
II(c, d, a, b, x[2], S43, 0x2ad7d2bb); // 63
II(b, c, d, a, x[9], S44, 0xeb86d391); // 64
fState[0] += a; fState[1] += b; fState[2] += c; fState[3] += d;
// Zeroize sensitive information.
for (unsigned k = 0; k < 16; ++k) x[k] = 0;
}
live/liveMedia/OutputFile.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// Common routines for opening/closing named output files
// Implementation
#if (defined(__WIN32__) || defined(_WIN32)) && !defined(_WIN32_WCE)
#include <io.h>
#include <fcntl.h>
#endif
#ifndef _WIN32_WCE
#include
#endif
#include <string.h>
#include "OutputFile.hh"
FILE* OpenOutputFile(UsageEnvironment& env, char const* fileName) {
FILE* fid;
// Check for special case 'file names': "stdout" and "stderr"
if (strcmp(fileName, "stdout") == 0) {
fid = stdout;
#if (defined(__WIN32__) || defined(_WIN32)) && !defined(_WIN32_WCE)
_setmode(_fileno(stdout), _O_BINARY); // convert to binary mode
#endif
} else if (strcmp(fileName, "stderr") == 0) {
fid = stderr;
#if (defined(__WIN32__) || defined(_WIN32)) && !defined(_WIN32_WCE)
_setmode(_fileno(stderr), _O_BINARY); // convert to binary mode
#endif
} else {
fid = fopen(fileName, "wb");
}
if (fid == NULL) {
env.setResultMsg("unable to open file \"", fileName, "\"");
}
return fid;
}
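The special-case handling in "OpenOutputFile()" above — the literal file names "stdout" and "stderr" map to the standard streams instead of being opened on disk — can be sketched as a small resolver. This is an illustrative helper, not part of the library:

```cpp
#include <cassert>
#include <cstdio>
#include <cstring>

// Resolve the special 'file names' recognized above; return NULL for an
// ordinary name, in which case the caller falls through to
// fopen(fileName, "wb").
static FILE* resolveSpecialName(char const* fileName) {
  if (strcmp(fileName, "stdout") == 0) return stdout;
  if (strcmp(fileName, "stderr") == 0) return stderr;
  return NULL;
}
```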
void CloseOutputFile(FILE* fid) {
// Don't close 'stdout' or 'stderr', in case we want to use it again later.
if (fid != NULL && fid != stdout && fid != stderr) fclose(fid);
}
live/liveMedia/PassiveServerMediaSubsession.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A 'ServerMediaSubsession' object that represents an existing
// 'RTPSink', rather than one that creates new 'RTPSink's on demand.
// Implementation
#include "PassiveServerMediaSubsession.hh"
#include <GroupsockHelper.hh>
////////// PassiveServerMediaSubsession //////////
PassiveServerMediaSubsession*
PassiveServerMediaSubsession::createNew(RTPSink& rtpSink,
RTCPInstance* rtcpInstance) {
return new PassiveServerMediaSubsession(rtpSink, rtcpInstance);
}
PassiveServerMediaSubsession
::PassiveServerMediaSubsession(RTPSink& rtpSink, RTCPInstance* rtcpInstance)
: ServerMediaSubsession(rtpSink.envir()),
fSDPLines(NULL), fRTPSink(rtpSink), fRTCPInstance(rtcpInstance) {
fClientRTCPSourceRecords = HashTable::create(ONE_WORD_HASH_KEYS);
}
class RTCPSourceRecord {
public:
RTCPSourceRecord(netAddressBits addr, Port const& port)
: addr(addr), port(port) {
}
netAddressBits addr;
Port port;
};
PassiveServerMediaSubsession::~PassiveServerMediaSubsession() {
delete[] fSDPLines;
// Clean out the RTCPSourceRecord table:
while (1) {
RTCPSourceRecord* source = (RTCPSourceRecord*)(fClientRTCPSourceRecords->RemoveNext());
if (source == NULL) break;
delete source;
}
delete fClientRTCPSourceRecords;
}
Boolean PassiveServerMediaSubsession::rtcpIsMuxed() {
if (fRTCPInstance == NULL) return False;
// Check whether RTP and RTCP use the same "groupsock" object:
return &(fRTPSink.groupsockBeingUsed()) == fRTCPInstance->RTCPgs();
}
char const*
PassiveServerMediaSubsession::sdpLines() {
if (fSDPLines == NULL ) {
// Construct a set of SDP lines that describe this subsession:
// Use the components from "rtpSink":
Groupsock const& gs = fRTPSink.groupsockBeingUsed();
AddressString groupAddressStr(gs.groupAddress());
unsigned short portNum = ntohs(gs.port().num());
unsigned char ttl = gs.ttl();
unsigned char rtpPayloadType = fRTPSink.rtpPayloadType();
char const* mediaType = fRTPSink.sdpMediaType();
unsigned estBitrate
= fRTCPInstance == NULL ? 50 : fRTCPInstance->totSessionBW();
char* rtpmapLine = fRTPSink.rtpmapLine();
char const* rtcpmuxLine = rtcpIsMuxed() ? "a=rtcp-mux\r\n" : "";
char const* rangeLine = rangeSDPLine();
char const* auxSDPLine = fRTPSink.auxSDPLine();
if (auxSDPLine == NULL) auxSDPLine = "";
char const* const sdpFmt =
"m=%s %d RTP/AVP %d\r\n"
"c=IN IP4 %s/%d\r\n"
"b=AS:%u\r\n"
"%s"
"%s"
"%s"
"%s"
"a=control:%s\r\n";
unsigned sdpFmtSize = strlen(sdpFmt)
+ strlen(mediaType) + 5 /* max short len */ + 3 /* max char len */
+ strlen(groupAddressStr.val()) + 3 /* max char len */
+ 20 /* max int len */
+ strlen(rtpmapLine)
+ strlen(rtcpmuxLine)
+ strlen(rangeLine)
+ strlen(auxSDPLine)
+ strlen(trackId());
char* sdpLines = new char[sdpFmtSize];
sprintf(sdpLines, sdpFmt,
mediaType, // m=
portNum, // m=
rtpPayloadType, // m=
groupAddressStr.val(), // c=
ttl, // c= TTL
estBitrate, // b=AS:
rtpmapLine, // a=rtpmap:... (if present)
rtcpmuxLine, // a=rtcp-mux:... (if present)
rangeLine, // a=range:... (if present)
auxSDPLine, // optional extra SDP line
trackId()); // a=control:
delete[] (char*)rangeLine; delete[] rtpmapLine;
fSDPLines = strDup(sdpLines);
delete[] sdpLines;
}
return fSDPLines;
}
void PassiveServerMediaSubsession
::getStreamParameters(unsigned clientSessionId,
netAddressBits clientAddress,
Port const& /*clientRTPPort*/,
Port const& clientRTCPPort,
int /*tcpSocketNum*/,
unsigned char /*rtpChannelId*/,
unsigned char /*rtcpChannelId*/,
netAddressBits& destinationAddress,
u_int8_t& destinationTTL,
Boolean& isMulticast,
Port& serverRTPPort,
Port& serverRTCPPort,
void*& streamToken) {
isMulticast = True;
Groupsock& gs = fRTPSink.groupsockBeingUsed();
if (destinationTTL == 255) destinationTTL = gs.ttl();
if (destinationAddress == 0) { // normal case
destinationAddress = gs.groupAddress().s_addr;
} else { // use the client-specified destination address instead:
struct in_addr destinationAddr; destinationAddr.s_addr = destinationAddress;
gs.changeDestinationParameters(destinationAddr, 0, destinationTTL);
if (fRTCPInstance != NULL) {
Groupsock* rtcpGS = fRTCPInstance->RTCPgs();
rtcpGS->changeDestinationParameters(destinationAddr, 0, destinationTTL);
}
}
serverRTPPort = gs.port();
if (fRTCPInstance != NULL) {
Groupsock* rtcpGS = fRTCPInstance->RTCPgs();
serverRTCPPort = rtcpGS->port();
}
streamToken = NULL; // not used
// Make a record of this client's source - for RTCP RR handling:
RTCPSourceRecord* source = new RTCPSourceRecord(clientAddress, clientRTCPPort);
fClientRTCPSourceRecords->Add((char const*)clientSessionId, source);
}
void PassiveServerMediaSubsession::startStream(unsigned clientSessionId,
void* /*streamToken*/,
TaskFunc* rtcpRRHandler,
void* rtcpRRHandlerClientData,
unsigned short& rtpSeqNum,
unsigned& rtpTimestamp,
ServerRequestAlternativeByteHandler* /*serverRequestAlternativeByteHandler*/,
void* /*serverRequestAlternativeByteHandlerClientData*/) {
rtpSeqNum = fRTPSink.currentSeqNo();
rtpTimestamp = fRTPSink.presetNextTimestamp();
// Try to use a big send buffer for RTP - at least 0.1 second of
// specified bandwidth and at least 50 KB
unsigned streamBitrate = fRTCPInstance == NULL ? 50 : fRTCPInstance->totSessionBW(); // in kbps
unsigned rtpBufSize = streamBitrate * 25 / 2; // 1 kbps * 0.1 s = 12.5 bytes
if (rtpBufSize < 50 * 1024) rtpBufSize = 50 * 1024;
increaseSendBufferTo(envir(), fRTPSink.groupsockBeingUsed().socketNum(), rtpBufSize);
if (fRTCPInstance != NULL) {
// Hack: Send a RTCP "SR" packet now, so that receivers will (likely) be able to
// get RTCP-synchronized presentation times immediately:
fRTCPInstance->sendReport();
// Set up the handler for incoming RTCP "RR" packets from this client:
RTCPSourceRecord* source = (RTCPSourceRecord*)(fClientRTCPSourceRecords->Lookup((char const*)clientSessionId));
if (source != NULL) {
fRTCPInstance->setSpecificRRHandler(source->addr, source->port,
rtcpRRHandler, rtcpRRHandlerClientData);
}
}
}
float PassiveServerMediaSubsession::getCurrentNPT(void* streamToken) {
// Return the elapsed time between our "RTPSink"'s creation time and the current time:
struct timeval const& creationTime = fRTPSink.creationTime(); // alias
struct timeval timeNow;
gettimeofday(&timeNow, NULL);
return (float)(timeNow.tv_sec - creationTime.tv_sec + (timeNow.tv_usec - creationTime.tv_usec)/1000000.0);
}
void PassiveServerMediaSubsession
::getRTPSinkandRTCP(void* streamToken,
RTPSink const*& rtpSink, RTCPInstance const*& rtcp) {
rtpSink = &fRTPSink;
rtcp = fRTCPInstance;
}
void PassiveServerMediaSubsession::deleteStream(unsigned clientSessionId, void*& /*streamToken*/) {
// Lookup and remove the 'RTCPSourceRecord' for this client. Also turn off RTCP "RR" handling:
RTCPSourceRecord* source = (RTCPSourceRecord*)(fClientRTCPSourceRecords->Lookup((char const*)clientSessionId));
if (source != NULL) {
if (fRTCPInstance != NULL) {
fRTCPInstance->unsetSpecificRRHandler(source->addr, source->port);
}
fClientRTCPSourceRecords->Remove((char const*)clientSessionId);
delete source;
}
}
live/liveMedia/RTCP.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// RTCP
// Implementation
#include "RTCP.hh"
#include "GroupsockHelper.hh"
#include "rtcp_from_spec.h"
#if defined(__WIN32__) || defined(_WIN32) || defined(_QNX4)
#define snprintf _snprintf
#endif
#define HACK_FOR_CHROME_WEBRTC_BUG 1 //#####@@@@@
////////// RTCPMemberDatabase //////////
class RTCPMemberDatabase {
public:
RTCPMemberDatabase(RTCPInstance& ourRTCPInstance)
: fOurRTCPInstance(ourRTCPInstance), fNumMembers(1 /*ourself*/),
fTable(HashTable::create(ONE_WORD_HASH_KEYS)) {
}
virtual ~RTCPMemberDatabase() {
delete fTable;
}
Boolean isMember(unsigned ssrc) const {
return fTable->Lookup((char*)(long)ssrc) != NULL;
}
Boolean noteMembership(unsigned ssrc, unsigned curTimeCount) {
Boolean isNew = !isMember(ssrc);
if (isNew) {
++fNumMembers;
}
// Record the current time, so we can age stale members
fTable->Add((char*)(long)ssrc, (void*)(long)curTimeCount);
return isNew;
}
Boolean remove(unsigned ssrc) {
Boolean wasPresent = fTable->Remove((char*)(long)ssrc);
if (wasPresent) {
--fNumMembers;
}
return wasPresent;
}
unsigned numMembers() const {
return fNumMembers;
}
void reapOldMembers(unsigned threshold);
private:
RTCPInstance& fOurRTCPInstance;
unsigned fNumMembers;
HashTable* fTable;
};
void RTCPMemberDatabase::reapOldMembers(unsigned threshold) {
Boolean foundOldMember;
u_int32_t oldSSRC = 0;
do {
foundOldMember = False;
HashTable::Iterator* iter
= HashTable::Iterator::create(*fTable);
uintptr_t timeCount;
char const* key;
while ((timeCount = (uintptr_t)(iter->next(key))) != 0) {
#ifdef DEBUG
fprintf(stderr, "reap: checking SSRC 0x%lx: %ld (threshold %d)\n", (unsigned long)key, timeCount, threshold);
#endif
if (timeCount < (uintptr_t)threshold) { // this SSRC is old
uintptr_t ssrc = (uintptr_t)key;
oldSSRC = (u_int32_t)ssrc;
foundOldMember = True;
}
}
delete iter;
if (foundOldMember) {
#ifdef DEBUG
fprintf(stderr, "reap: removing SSRC 0x%x\n", oldSSRC);
#endif
fOurRTCPInstance.removeSSRC(oldSSRC, True);
}
} while (foundOldMember);
}
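The reaping policy above — drop every member whose recorded time count is below the threshold — can be sketched with `std::map` standing in for the library's `HashTable`. (The original re-scans from the top after each removal because its iterator cannot survive table mutation; with `std::map` the stale entries can be erased in a single pass.) An illustrative sketch, not library code:

```cpp
#include <cassert>
#include <cstdint>
#include <map>

// Remove every SSRC whose last-seen time count is below 'threshold';
// return how many members were reaped.
static unsigned reapOldMembers(std::map<uint32_t, unsigned>& members,
                               unsigned threshold) {
  unsigned numReaped = 0;
  for (std::map<uint32_t, unsigned>::iterator it = members.begin();
       it != members.end(); ) {
    if (it->second < threshold) {
      members.erase(it++); // this SSRC is stale
      ++numReaped;
    } else {
      ++it;
    }
  }
  return numReaped;
}
```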
////////// RTCPInstance //////////
static double dTimeNow() {
struct timeval timeNow;
gettimeofday(&timeNow, NULL);
return (double) (timeNow.tv_sec + timeNow.tv_usec/1000000.0);
}
static unsigned const maxRTCPPacketSize = 1456;
// bytes (1500, minus some allowance for IP, UDP, UMTP headers)
static unsigned const preferredRTCPPacketSize = 1000; // bytes
RTCPInstance::RTCPInstance(UsageEnvironment& env, Groupsock* RTCPgs,
unsigned totSessionBW,
unsigned char const* cname,
RTPSink* sink, RTPSource* source,
Boolean isSSMSource)
: Medium(env), fRTCPInterface(this, RTCPgs), fTotSessionBW(totSessionBW),
fSink(sink), fSource(source), fIsSSMSource(isSSMSource),
fCNAME(RTCP_SDES_CNAME, cname), fOutgoingReportCount(1),
fAveRTCPSize(0), fIsInitial(1), fPrevNumMembers(0),
fLastSentSize(0), fLastReceivedSize(0), fLastReceivedSSRC(0),
fTypeOfEvent(EVENT_UNKNOWN), fTypeOfPacket(PACKET_UNKNOWN_TYPE),
fHaveJustSentPacket(False), fLastPacketSentSize(0),
fByeHandlerTask(NULL), fByeHandlerClientData(NULL),
fSRHandlerTask(NULL), fSRHandlerClientData(NULL),
fRRHandlerTask(NULL), fRRHandlerClientData(NULL),
fSpecificRRHandlerTable(NULL),
fAppHandlerTask(NULL), fAppHandlerClientData(NULL) {
#ifdef DEBUG
fprintf(stderr, "RTCPInstance[%p]::RTCPInstance()\n", this);
#endif
if (fTotSessionBW == 0) { // not allowed!
env << "RTCPInstance::RTCPInstance error: totSessionBW parameter should not be zero!\n";
fTotSessionBW = 1;
}
if (isSSMSource) RTCPgs->multicastSendOnly(); // don't receive multicast
double timeNow = dTimeNow();
fPrevReportTime = fNextReportTime = timeNow;
fKnownMembers = new RTCPMemberDatabase(*this);
fInBuf = new unsigned char[maxRTCPPacketSize];
if (fKnownMembers == NULL || fInBuf == NULL) return;
fNumBytesAlreadyRead = 0;
fOutBuf = new OutPacketBuffer(preferredRTCPPacketSize, maxRTCPPacketSize, maxRTCPPacketSize);
if (fOutBuf == NULL) return;
if (fSource != NULL && fSource->RTPgs() == RTCPgs) {
// We're receiving RTCP reports that are multiplexed with RTP, so ask the RTP source
// to give them to us:
fSource->registerForMultiplexedRTCPPackets(this);
} else {
// Arrange to handle incoming reports from the network:
TaskScheduler::BackgroundHandlerProc* handler
= (TaskScheduler::BackgroundHandlerProc*)&incomingReportHandler;
fRTCPInterface.startNetworkReading(handler);
}
// Send our first report.
fTypeOfEvent = EVENT_REPORT;
onExpire(this);
}
struct RRHandlerRecord {
TaskFunc* rrHandlerTask;
void* rrHandlerClientData;
};
RTCPInstance::~RTCPInstance() {
#ifdef DEBUG
fprintf(stderr, "RTCPInstance[%p]::~RTCPInstance()\n", this);
#endif
// Begin by sending a BYE. We have to do this immediately, without
// 'reconsideration', because "this" is going away.
fTypeOfEvent = EVENT_BYE; // not used, but...
sendBYE();
if (fSource != NULL && fSource->RTPgs() == fRTCPInterface.gs()) {
// We were receiving RTCP reports that were multiplexed with RTP, so tell the RTP source
// to stop giving them to us:
fSource->deregisterForMultiplexedRTCPPackets();
fRTCPInterface.forgetOurGroupsock();
// so that the "fRTCPInterface" destructor doesn't turn off background read handling
}
if (fSpecificRRHandlerTable != NULL) {
AddressPortLookupTable::Iterator iter(*fSpecificRRHandlerTable);
RRHandlerRecord* rrHandler;
while ((rrHandler = (RRHandlerRecord*)iter.next()) != NULL) {
delete rrHandler;
}
delete fSpecificRRHandlerTable;
}
delete fKnownMembers;
delete fOutBuf;
delete[] fInBuf;
}
void RTCPInstance::noteArrivingRR(struct sockaddr_in const& fromAddressAndPort,
int tcpSocketNum, unsigned char tcpStreamChannelId) {
// If a 'RR handler' was set, call it now:
// Specific RR handler:
if (fSpecificRRHandlerTable != NULL) {
netAddressBits fromAddr;
portNumBits fromPortNum;
if (tcpSocketNum < 0) {
// Normal case: We read the RTCP packet over UDP
fromAddr = fromAddressAndPort.sin_addr.s_addr;
fromPortNum = ntohs(fromAddressAndPort.sin_port);
} else {
// Special case: We read the RTCP packet over TCP (interleaved)
// Hack: Use the TCP socket and channel id to look up the handler
fromAddr = tcpSocketNum;
fromPortNum = tcpStreamChannelId;
}
Port fromPort(fromPortNum);
RRHandlerRecord* rrHandler
= (RRHandlerRecord*)(fSpecificRRHandlerTable->Lookup(fromAddr, (~0), fromPort));
if (rrHandler != NULL) {
if (rrHandler->rrHandlerTask != NULL) {
(*(rrHandler->rrHandlerTask))(rrHandler->rrHandlerClientData);
}
}
}
// General RR handler:
if (fRRHandlerTask != NULL) (*fRRHandlerTask)(fRRHandlerClientData);
}
RTCPInstance* RTCPInstance::createNew(UsageEnvironment& env, Groupsock* RTCPgs,
unsigned totSessionBW,
unsigned char const* cname,
RTPSink* sink, RTPSource* source,
Boolean isSSMSource) {
return new RTCPInstance(env, RTCPgs, totSessionBW, cname, sink, source,
isSSMSource);
}
Boolean RTCPInstance::lookupByName(UsageEnvironment& env,
char const* instanceName,
RTCPInstance*& resultInstance) {
resultInstance = NULL; // unless we succeed
Medium* medium;
if (!Medium::lookupByName(env, instanceName, medium)) return False;
if (!medium->isRTCPInstance()) {
env.setResultMsg(instanceName, " is not an RTCP instance");
return False;
}
resultInstance = (RTCPInstance*)medium;
return True;
}
Boolean RTCPInstance::isRTCPInstance() const {
return True;
}
unsigned RTCPInstance::numMembers() const {
if (fKnownMembers == NULL) return 0;
return fKnownMembers->numMembers();
}
void RTCPInstance::setByeHandler(TaskFunc* handlerTask, void* clientData,
Boolean handleActiveParticipantsOnly) {
fByeHandlerTask = handlerTask;
fByeHandlerClientData = clientData;
fByeHandleActiveParticipantsOnly = handleActiveParticipantsOnly;
}
void RTCPInstance::setSRHandler(TaskFunc* handlerTask, void* clientData) {
fSRHandlerTask = handlerTask;
fSRHandlerClientData = clientData;
}
void RTCPInstance::setRRHandler(TaskFunc* handlerTask, void* clientData) {
fRRHandlerTask = handlerTask;
fRRHandlerClientData = clientData;
}
void RTCPInstance
::setSpecificRRHandler(netAddressBits fromAddress, Port fromPort,
TaskFunc* handlerTask, void* clientData) {
if (handlerTask == NULL && clientData == NULL) {
unsetSpecificRRHandler(fromAddress, fromPort);
return;
}
RRHandlerRecord* rrHandler = new RRHandlerRecord;
rrHandler->rrHandlerTask = handlerTask;
rrHandler->rrHandlerClientData = clientData;
if (fSpecificRRHandlerTable == NULL) {
fSpecificRRHandlerTable = new AddressPortLookupTable;
}
RRHandlerRecord* existingRecord = (RRHandlerRecord*)fSpecificRRHandlerTable->Add(fromAddress, (~0), fromPort, rrHandler);
delete existingRecord; // if any
}
void RTCPInstance
::unsetSpecificRRHandler(netAddressBits fromAddress, Port fromPort) {
if (fSpecificRRHandlerTable == NULL) return;
RRHandlerRecord* rrHandler
= (RRHandlerRecord*)(fSpecificRRHandlerTable->Lookup(fromAddress, (~0), fromPort));
if (rrHandler != NULL) {
fSpecificRRHandlerTable->Remove(fromAddress, (~0), fromPort);
delete rrHandler;
}
}
void RTCPInstance::setAppHandler(RTCPAppHandlerFunc* handlerTask, void* clientData) {
fAppHandlerTask = handlerTask;
fAppHandlerClientData = clientData;
}
void RTCPInstance::sendAppPacket(u_int8_t subtype, char const* name,
u_int8_t* appDependentData, unsigned appDependentDataSize) {
// Set up the first 4 bytes: V,P,subtype,PT,length:
u_int32_t rtcpHdr = 0x80000000; // version 2, no padding
rtcpHdr |= (subtype&0x1F)<<24;
rtcpHdr |= (RTCP_PT_APP<<16);
unsigned length = 2 + (appDependentDataSize+3)/4;
rtcpHdr |= (length&0xFFFF);
fOutBuf->enqueueWord(rtcpHdr);
// Set up the next 4 bytes: SSRC:
fOutBuf->enqueueWord(fSource != NULL ? fSource->SSRC() : fSink != NULL ? fSink->SSRC() : 0);
// Set up the next 4 bytes: name:
char nameBytes[4];
nameBytes[0] = nameBytes[1] = nameBytes[2] = nameBytes[3] = '\0'; // by default
if (name != NULL) {
// Note: The "name" field is 4 ASCII characters and is not NUL-terminated, so copy
// up to 4 bytes of "name" directly. (snprintf() with size 4 would copy at most
// 3 characters plus a NUL, truncating a full 4-character name.)
for (unsigned i = 0; i < 4 && name[i] != '\0'; ++i) nameBytes[i] = name[i];
}
fOutBuf->enqueue((u_int8_t*)nameBytes, 4);
// Set up the remaining bytes (if any): application-dependent data (+ padding):
if (appDependentData != NULL && appDependentDataSize > 0) {
fOutBuf->enqueue(appDependentData, appDependentDataSize);
unsigned modulo = appDependentDataSize%4;
unsigned paddingSize = modulo == 0 ? 0 : 4-modulo;
u_int8_t const paddingByte = 0x00;
for (unsigned i = 0; i < paddingSize; ++i) fOutBuf->enqueue(&paddingByte, 1);
}
// Finally, send the packet:
sendBuiltPacket();
}
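As an illustrative sketch (not part of the library) of the header arithmetic in "sendAppPacket()" above: the 16-bit length field counts the 32-bit words that follow the header word, i.e. one word for the SSRC, one for the 4-byte name, plus the padded application-dependent data. The helper name below is hypothetical.

```cpp
#include <assert.h>
#include <stdint.h>

// Hypothetical standalone helper mirroring the header word built in sendAppPacket().
// RTCP_PT_APP is 204.
static uint32_t makeAppRtcpHeader(uint8_t subtype, unsigned appDataSize) {
  uint32_t hdr = 0x80000000u;                   // version 2, no padding
  hdr |= (uint32_t)(subtype & 0x1F) << 24;      // 5-bit subtype
  hdr |= 204u << 16;                            // payload type: APP
  hdr |= (2 + (appDataSize + 3) / 4) & 0xFFFF;  // SSRC + name + padded data, in words
  return hdr;
}
```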
void RTCPInstance::setStreamSocket(int sockNum,
unsigned char streamChannelId) {
// Turn off background read handling:
fRTCPInterface.stopNetworkReading();
// Switch to RTCP-over-TCP:
fRTCPInterface.setStreamSocket(sockNum, streamChannelId);
// Turn background reading back on:
TaskScheduler::BackgroundHandlerProc* handler
= (TaskScheduler::BackgroundHandlerProc*)&incomingReportHandler;
fRTCPInterface.startNetworkReading(handler);
}
void RTCPInstance::addStreamSocket(int sockNum,
unsigned char streamChannelId) {
// First, turn off background read handling for the default (UDP) socket:
envir().taskScheduler().turnOffBackgroundReadHandling(fRTCPInterface.gs()->socketNum());
// Add the RTCP-over-TCP interface:
fRTCPInterface.addStreamSocket(sockNum, streamChannelId);
// Turn on background reading for this socket (in case it's not on already):
TaskScheduler::BackgroundHandlerProc* handler
= (TaskScheduler::BackgroundHandlerProc*)&incomingReportHandler;
fRTCPInterface.startNetworkReading(handler);
}
void RTCPInstance
::injectReport(u_int8_t const* packet, unsigned packetSize, struct sockaddr_in const& fromAddress) {
if (packetSize > maxRTCPPacketSize) packetSize = maxRTCPPacketSize;
memmove(fInBuf, packet, packetSize);
processIncomingReport(packetSize, fromAddress, -1, 0xFF); // assume report received over UDP
}
static unsigned const IP_UDP_HDR_SIZE = 28;
// overhead (bytes) of IP and UDP hdrs
#define ADVANCE(n) pkt += (n); packetSize -= (n)
void RTCPInstance::incomingReportHandler(RTCPInstance* instance,
int /*mask*/) {
instance->incomingReportHandler1();
}
void RTCPInstance::incomingReportHandler1() {
do {
if (fNumBytesAlreadyRead >= maxRTCPPacketSize) {
envir() << "RTCPInstance error: Hit limit when reading incoming packet over TCP. Increase \"maxRTCPPacketSize\"\n";
break;
}
unsigned numBytesRead;
struct sockaddr_in fromAddress;
int tcpSocketNum;
unsigned char tcpStreamChannelId;
Boolean packetReadWasIncomplete;
Boolean readResult
= fRTCPInterface.handleRead(&fInBuf[fNumBytesAlreadyRead], maxRTCPPacketSize - fNumBytesAlreadyRead,
numBytesRead, fromAddress,
tcpSocketNum, tcpStreamChannelId,
packetReadWasIncomplete);
unsigned packetSize = 0;
if (packetReadWasIncomplete) {
fNumBytesAlreadyRead += numBytesRead;
return; // more reads are needed to get the entire packet
} else { // normal case: We've read the entire packet
packetSize = fNumBytesAlreadyRead + numBytesRead;
fNumBytesAlreadyRead = 0; // for next time
}
if (!readResult) break;
// Ignore the packet if it was looped-back from ourself:
Boolean packetWasFromOurHost = False;
if (RTCPgs()->wasLoopedBackFromUs(envir(), fromAddress)) {
packetWasFromOurHost = True;
// However, we still want to handle incoming RTCP packets from
// *other processes* on the same machine. To distinguish this
// case from a true loop-back, check whether we've just sent a
// packet of the same size. (This check isn't perfect, but it seems
// to be the best we can do.)
if (fHaveJustSentPacket && fLastPacketSentSize == packetSize) {
// This is a true loop-back:
fHaveJustSentPacket = False;
break; // ignore this packet
}
}
if (fIsSSMSource && !packetWasFromOurHost) {
// This packet is assumed to have been received via unicast (because we're a SSM source,
// and SSM receivers send back RTCP "RR" packets via unicast).
// 'Reflect' the packet by resending it to the multicast group, so that any other receivers
// can also get to see it.
// NOTE: Denial-of-service attacks are possible here.
// Users of this software may wish to add their own,
// application-specific mechanism for 'authenticating' the
// validity of this packet before reflecting it.
// NOTE: The test for "!packetWasFromOurHost" means that we won't reflect RTCP packets
// that come from other processes on the same host as us. The reason for this is that the
// 'packet size' test above is not 100% reliable; some packets that were truly looped back
// from us might not be detected as such, and this might lead to infinite
// forwarding/receiving of some packets. To avoid this possibility, we reflect only
// RTCP packets that we know for sure originated elsewhere.
// (Note, though, that if we ever re-enable the code in "Groupsock::multicastSendOnly()",
// then we could remove the test for "!packetWasFromOurHost".)
fRTCPInterface.sendPacket(fInBuf, packetSize);
fHaveJustSentPacket = True;
fLastPacketSentSize = packetSize;
}
processIncomingReport(packetSize, fromAddress, tcpSocketNum, tcpStreamChannelId);
} while (0);
}
void RTCPInstance
::processIncomingReport(unsigned packetSize, struct sockaddr_in const& fromAddressAndPort,
int tcpSocketNum, unsigned char tcpStreamChannelId) {
do {
Boolean callByeHandler = False;
unsigned char* pkt = fInBuf;
#ifdef DEBUG
fprintf(stderr, "[%p]saw incoming RTCP packet (from ", this);
if (tcpSocketNum < 0) {
// Note that "fromAddressAndPort" is valid only if we're receiving over UDP (not over TCP):
fprintf(stderr, "address %s, port %d", AddressString(fromAddressAndPort).val(), ntohs(fromAddressAndPort.sin_port));
} else {
fprintf(stderr, "TCP socket #%d, stream channel id %d", tcpSocketNum, tcpStreamChannelId);
}
fprintf(stderr, ")\n");
for (unsigned i = 0; i < packetSize; ++i) {
if (i%4 == 0) fprintf(stderr, " ");
fprintf(stderr, "%02x", pkt[i]);
}
fprintf(stderr, "\n");
#endif
int totPacketSize = IP_UDP_HDR_SIZE + packetSize;
// Check the RTCP packet for validity:
// It must at least contain a header (4 bytes), and this header
// must be version=2, with no padding bit, and a payload type of
// SR (200), RR (201), or APP (204):
if (packetSize < 4) break;
unsigned rtcpHdr = ntohl(*(u_int32_t*)pkt);
if ((rtcpHdr & 0xE0FE0000) != (0x80000000 | (RTCP_PT_SR<<16)) &&
(rtcpHdr & 0xE0FF0000) != (0x80000000 | (RTCP_PT_APP<<16))) {
#ifdef DEBUG
fprintf(stderr, "rejected bad RTCP packet: header 0x%08x\n", rtcpHdr);
#endif
break;
}
// Process each of the individual RTCP 'subpackets' in (what may be)
// a compound RTCP packet.
int typeOfPacket = PACKET_UNKNOWN_TYPE;
unsigned reportSenderSSRC = 0;
Boolean packetOK = False;
while (1) {
u_int8_t rc = (rtcpHdr>>24)&0x1F;
u_int8_t pt = (rtcpHdr>>16)&0xFF;
unsigned length = 4*(rtcpHdr&0xFFFF); // doesn't count hdr
ADVANCE(4); // skip over the header
if (length > packetSize) break;
// Assume that each RTCP subpacket begins with a 4-byte SSRC:
if (length < 4) break; length -= 4;
reportSenderSSRC = ntohl(*(u_int32_t*)pkt); ADVANCE(4);
#ifdef HACK_FOR_CHROME_WEBRTC_BUG
if (reportSenderSSRC == 0x00000001 && pt == RTCP_PT_RR) {
// Chrome (and Opera) WebRTC receivers have a bug that causes them to always send
// SSRC 1 in their "RR"s. To work around this (to help us distinguish between different
// receivers), we use a fake SSRC in this case consisting of the IP address, XORed with
// the port number:
reportSenderSSRC = fromAddressAndPort.sin_addr.s_addr^fromAddressAndPort.sin_port;
}
#endif
Boolean subPacketOK = False;
switch (pt) {
case RTCP_PT_SR: {
#ifdef DEBUG
fprintf(stderr, "SR\n");
#endif
if (length < 20) break; length -= 20;
// Extract the NTP timestamp, and note this:
unsigned NTPmsw = ntohl(*(u_int32_t*)pkt); ADVANCE(4);
unsigned NTPlsw = ntohl(*(u_int32_t*)pkt); ADVANCE(4);
unsigned rtpTimestamp = ntohl(*(u_int32_t*)pkt); ADVANCE(4);
if (fSource != NULL) {
RTPReceptionStatsDB& receptionStats
= fSource->receptionStatsDB();
receptionStats.noteIncomingSR(reportSenderSSRC,
NTPmsw, NTPlsw, rtpTimestamp);
}
ADVANCE(8); // skip over packet count, octet count
// If a 'SR handler' was set, call it now:
if (fSRHandlerTask != NULL) (*fSRHandlerTask)(fSRHandlerClientData);
// The rest of the SR is handled like a RR (so, no "break;" here)
}
case RTCP_PT_RR: {
#ifdef DEBUG
fprintf(stderr, "RR\n");
#endif
unsigned reportBlocksSize = rc*(6*4);
if (length < reportBlocksSize) break;
length -= reportBlocksSize;
if (fSink != NULL) {
// Use this information to update stats about our transmissions:
RTPTransmissionStatsDB& transmissionStats = fSink->transmissionStatsDB();
for (unsigned i = 0; i < rc; ++i) {
unsigned senderSSRC = ntohl(*(u_int32_t*)pkt); ADVANCE(4);
// We care only about reports about our own transmission, not others'
if (senderSSRC == fSink->SSRC()) {
unsigned lossStats = ntohl(*(u_int32_t*)pkt); ADVANCE(4);
unsigned highestReceived = ntohl(*(u_int32_t*)pkt); ADVANCE(4);
unsigned jitter = ntohl(*(u_int32_t*)pkt); ADVANCE(4);
unsigned timeLastSR = ntohl(*(u_int32_t*)pkt); ADVANCE(4);
unsigned timeSinceLastSR = ntohl(*(u_int32_t*)pkt); ADVANCE(4);
transmissionStats.noteIncomingRR(reportSenderSSRC, fromAddressAndPort,
lossStats,
highestReceived, jitter,
timeLastSR, timeSinceLastSR);
} else {
ADVANCE(4*5);
}
}
} else {
ADVANCE(reportBlocksSize);
}
if (pt == RTCP_PT_RR) { // i.e., we didn't fall through from 'SR'
noteArrivingRR(fromAddressAndPort, tcpSocketNum, tcpStreamChannelId);
}
subPacketOK = True;
typeOfPacket = PACKET_RTCP_REPORT;
break;
}
case RTCP_PT_BYE: {
#ifdef DEBUG
fprintf(stderr, "BYE\n");
#endif
// If a 'BYE handler' was set, arrange for it to be called at the end of this routine.
// (Note: We don't call it immediately, in case it happens to cause "this" to be deleted.)
if (fByeHandlerTask != NULL
&& (!fByeHandleActiveParticipantsOnly
|| (fSource != NULL
&& fSource->receptionStatsDB().lookup(reportSenderSSRC) != NULL)
|| (fSink != NULL
&& fSink->transmissionStatsDB().lookup(reportSenderSSRC) != NULL))) {
callByeHandler = True;
}
// We should really check for & handle >1 SSRCs being present #####
subPacketOK = True;
typeOfPacket = PACKET_BYE;
break;
}
case RTCP_PT_APP: {
u_int8_t& subtype = rc; // In "APP" packets, the "rc" field gets used as "subtype"
#ifdef DEBUG
fprintf(stderr, "APP (subtype 0x%02x)\n", subtype);
#endif
if (length < 4) {
#ifdef DEBUG
fprintf(stderr, "\tError: No \"name\" field!\n");
#endif
break;
}
#ifdef DEBUG
fprintf(stderr, "\tname:%c%c%c%c\n", pkt[0], pkt[1], pkt[2], pkt[3]);
#endif
u_int32_t nameBytes = (pkt[0]<<24)|(pkt[1]<<16)|(pkt[2]<<8)|(pkt[3]);
ADVANCE(4); // skip over "name", to the 'application-dependent data'
#ifdef DEBUG
fprintf(stderr, "\tapplication-dependent data size: %d bytes\n", length);
#endif
// If an 'APP' packet handler was set, call it now:
if (fAppHandlerTask != NULL) {
(*fAppHandlerTask)(fAppHandlerClientData, subtype, nameBytes, pkt, length);
}
subPacketOK = True;
typeOfPacket = PACKET_RTCP_APP;
break;
}
// Other RTCP packet types that we don't yet handle:
case RTCP_PT_SDES: {
#ifdef DEBUG
// 'Handle' SDES packets only in debugging code, by printing out the 'SDES items':
fprintf(stderr, "SDES\n");
// Process each 'chunk':
Boolean chunkOK = False;
ADVANCE(-4); length += 4; // hack so that we see the first SSRC/CSRC again
while (length >= 8) { // A valid chunk must be at least 8 bytes long
chunkOK = False; // until we learn otherwise
u_int32_t SSRC_CSRC = ntohl(*(u_int32_t*)pkt); ADVANCE(4); length -= 4;
fprintf(stderr, "\tSSRC/CSRC: 0x%08x\n", SSRC_CSRC);
// Process each 'SDES item' in the chunk:
u_int8_t itemType = *pkt; ADVANCE(1); --length;
while (itemType != 0) {
unsigned itemLen = *pkt; ADVANCE(1); --length;
// Make sure "itemLen" allows for at least 1 zero byte at the end of the chunk:
if (itemLen + 1 > length || pkt[itemLen] != 0) break;
fprintf(stderr, "\t\t%s:%s\n",
itemType == 1 ? "CNAME" :
itemType == 2 ? "NAME" :
itemType == 3 ? "EMAIL" :
itemType == 4 ? "PHONE" :
itemType == 5 ? "LOC" :
itemType == 6 ? "TOOL" :
itemType == 7 ? "NOTE" :
itemType == 8 ? "PRIV" :
"(unknown)",
itemType < 8 ? (char*)pkt // hack, because we know it's '\0'-terminated
: "???"/* don't try to print out PRIV or unknown items */);
ADVANCE(itemLen); length -= itemLen;
itemType = *pkt; ADVANCE(1); --length;
}
if (itemType != 0) break; // bad 'SDES item'
// Thus, itemType == 0. This zero 'type' marks the end of the list of SDES items.
// Skip over remaining zero padding bytes, so that this chunk ends on a 4-byte boundary:
while (length%4 > 0 && *pkt == 0) { ADVANCE(1); --length; }
if (length%4 > 0) break; // Bad (non-zero) padding byte
chunkOK = True;
}
if (!chunkOK || length > 0) break; // bad chunk, or not enough bytes for the last chunk
#endif
subPacketOK = True;
break;
}
case RTCP_PT_RTPFB: {
#ifdef DEBUG
fprintf(stderr, "RTPFB(unhandled)\n");
#endif
subPacketOK = True;
break;
}
case RTCP_PT_PSFB: {
#ifdef DEBUG
fprintf(stderr, "PSFB(unhandled)\n");
// Temporary code to show "Receiver Estimated Maximum Bitrate" (REMB) feedback reports:
//#####
if (length >= 12 && pkt[4] == 'R' && pkt[5] == 'E' && pkt[6] == 'M' && pkt[7] == 'B') {
u_int8_t exp = pkt[9]>>2;
u_int32_t mantissa = ((pkt[9]&0x03)<<16)|(pkt[10]<<8)|pkt[11];
double remb = (double)mantissa;
while (exp > 0) {
remb *= 2.0;
--exp; // the advertised bitrate is mantissa * 2^exp
}
fprintf(stderr, "\tReceiver Estimated Max Bitrate (REMB): %g bps\n", remb);
}
#endif
subPacketOK = True;
break;
}
case RTCP_PT_XR: {
#ifdef DEBUG
fprintf(stderr, "XR(unhandled)\n");
#endif
subPacketOK = True;
break;
}
case RTCP_PT_AVB: {
#ifdef DEBUG
fprintf(stderr, "AVB(unhandled)\n");
#endif
subPacketOK = True;
break;
}
case RTCP_PT_RSI: {
#ifdef DEBUG
fprintf(stderr, "RSI(unhandled)\n");
#endif
subPacketOK = True;
break;
}
case RTCP_PT_TOKEN: {
#ifdef DEBUG
fprintf(stderr, "TOKEN(unhandled)\n");
#endif
subPacketOK = True;
break;
}
case RTCP_PT_IDMS: {
#ifdef DEBUG
fprintf(stderr, "IDMS(unhandled)\n");
#endif
subPacketOK = True;
break;
}
default: {
#ifdef DEBUG
fprintf(stderr, "UNKNOWN TYPE(0x%x)\n", pt);
#endif
subPacketOK = True;
break;
}
}
if (!subPacketOK) break;
// need to check for (& handle) SSRC collision! #####
#ifdef DEBUG
fprintf(stderr, "validated RTCP subpacket: rc:%d, pt:%d, bytes remaining:%d, report sender SSRC:0x%08x\n", rc, pt, length, reportSenderSSRC);
#endif
// Skip over any remaining bytes in this subpacket:
ADVANCE(length);
// Check whether another RTCP 'subpacket' follows:
if (packetSize == 0) {
packetOK = True;
break;
} else if (packetSize < 4) {
#ifdef DEBUG
fprintf(stderr, "extraneous %d bytes at end of RTCP packet!\n", packetSize);
#endif
break;
}
rtcpHdr = ntohl(*(u_int32_t*)pkt);
if ((rtcpHdr & 0xC0000000) != 0x80000000) {
#ifdef DEBUG
fprintf(stderr, "bad RTCP subpacket: header 0x%08x\n", rtcpHdr);
#endif
break;
}
}
if (!packetOK) {
#ifdef DEBUG
fprintf(stderr, "rejected bad RTCP subpacket: header 0x%08x\n", rtcpHdr);
#endif
break;
} else {
#ifdef DEBUG
fprintf(stderr, "validated entire RTCP packet\n");
#endif
}
onReceive(typeOfPacket, totPacketSize, reportSenderSSRC);
// Finally, if we need to call a "BYE" handler, do so now (in case it causes "this" to get deleted):
if (callByeHandler && fByeHandlerTask != NULL/*sanity check*/) {
TaskFunc* byeHandler = fByeHandlerTask;
fByeHandlerTask = NULL; // because we call the handler only once, by default
(*byeHandler)(fByeHandlerClientData);
}
} while (0);
}
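As an illustrative sketch (not part of the library) of the first-header validity test in "processIncomingReport()" above: the mask 0xE0FE0000 checks the version and padding bits together with all but the low bit of the payload type, so a single comparison against SR (200) also accepts RR (201); APP (204) is checked separately with the full 0xE0FF0000 mask. The helper name below is hypothetical.

```cpp
#include <assert.h>
#include <stdint.h>

// Hypothetical standalone predicate mirroring the test above
// (RTCP_PT_SR == 200, RTCP_PT_APP == 204):
static bool firstSubpacketLooksValid(uint32_t rtcpHdr) {
  return (rtcpHdr & 0xE0FE0000) == (0x80000000 | (200u << 16))   // SR or RR
      || (rtcpHdr & 0xE0FF0000) == (0x80000000 | (204u << 16));  // APP
}
```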
void RTCPInstance::onReceive(int typeOfPacket, int totPacketSize,
unsigned ssrc) {
fTypeOfPacket = typeOfPacket;
fLastReceivedSize = totPacketSize;
fLastReceivedSSRC = ssrc;
int members = (int)numMembers();
int senders = (fSink != NULL) ? 1 : 0;
OnReceive(this, // p
this, // e
&members, // members
&fPrevNumMembers, // pmembers
&senders, // senders
&fAveRTCPSize, // avg_rtcp_size
&fPrevReportTime, // tp
dTimeNow(), // tc
fNextReportTime);
}
void RTCPInstance::sendReport() {
#ifdef DEBUG
fprintf(stderr, "sending REPORT\n");
#endif
// Begin by including a SR and/or RR report:
if (!addReport()) return;
// Then, include a SDES:
addSDES();
// Send the report:
sendBuiltPacket();
// Periodically clean out old members from our SSRC membership database:
const unsigned membershipReapPeriod = 5;
if ((++fOutgoingReportCount) % membershipReapPeriod == 0) {
unsigned threshold = fOutgoingReportCount - membershipReapPeriod;
fKnownMembers->reapOldMembers(threshold);
}
}
void RTCPInstance::sendBYE() {
#ifdef DEBUG
fprintf(stderr, "sending BYE\n");
#endif
// The packet must begin with a SR and/or RR report:
(void)addReport(True);
addBYE();
sendBuiltPacket();
}
void RTCPInstance::sendBuiltPacket() {
#ifdef DEBUG
fprintf(stderr, "sending RTCP packet\n");
unsigned char* p = fOutBuf->packet();
for (unsigned i = 0; i < fOutBuf->curPacketSize(); ++i) {
if (i%4 == 0) fprintf(stderr," ");
fprintf(stderr, "%02x", p[i]);
}
fprintf(stderr, "\n");
#endif
unsigned reportSize = fOutBuf->curPacketSize();
fRTCPInterface.sendPacket(fOutBuf->packet(), reportSize);
fOutBuf->resetOffset();
fLastSentSize = IP_UDP_HDR_SIZE + reportSize;
fHaveJustSentPacket = True;
fLastPacketSentSize = reportSize;
}
int RTCPInstance::checkNewSSRC() {
return fKnownMembers->noteMembership(fLastReceivedSSRC,
fOutgoingReportCount);
}
void RTCPInstance::removeLastReceivedSSRC() {
removeSSRC(fLastReceivedSSRC, False/*keep stats around*/);
}
void RTCPInstance::removeSSRC(u_int32_t ssrc, Boolean alsoRemoveStats) {
fKnownMembers->remove(ssrc);
if (alsoRemoveStats) {
// Also, remove records of this SSRC from any reception or transmission stats
if (fSource != NULL) fSource->receptionStatsDB().removeRecord(ssrc);
if (fSink != NULL) fSink->transmissionStatsDB().removeRecord(ssrc);
}
}
void RTCPInstance::onExpire(RTCPInstance* instance) {
instance->onExpire1();
}
// Member functions to build specific kinds of report:
Boolean RTCPInstance::addReport(Boolean alwaysAdd) {
// Include a SR or a RR, depending on whether we have an associated sink or source:
if (fSink != NULL) {
if (!alwaysAdd) {
if (!fSink->enableRTCPReports()) return False;
// Hack: Don't send a SR during those (brief) times when the timestamp of the
// next outgoing RTP packet has been preset, to ensure that that timestamp gets
// used for that outgoing packet. (David Bertrand, 2006.07.18)
if (fSink->nextTimestampHasBeenPreset()) return False;
}
addSR();
}
if (fSource != NULL) {
if (!alwaysAdd) {
if (!fSource->enableRTCPReports()) return False;
}
addRR();
}
return True;
}
void RTCPInstance::addSR() {
// ASSERT: fSink != NULL
enqueueCommonReportPrefix(RTCP_PT_SR, fSink->SSRC(),
5 /* extra words in a SR */);
// Now, add the 'sender info' for our sink
// Insert the NTP and RTP timestamps for the 'wallclock time':
struct timeval timeNow;
gettimeofday(&timeNow, NULL);
fOutBuf->enqueueWord(timeNow.tv_sec + 0x83AA7E80);
// NTP timestamp most-significant word (1970 epoch -> 1900 epoch)
double fractionalPart = (timeNow.tv_usec/15625.0)*0x04000000; // 2^32/10^6
fOutBuf->enqueueWord((unsigned)(fractionalPart+0.5));
// NTP timestamp least-significant word
unsigned rtpTimestamp = fSink->convertToRTPTimestamp(timeNow);
fOutBuf->enqueueWord(rtpTimestamp); // RTP ts
// Insert the packet and byte counts:
fOutBuf->enqueueWord(fSink->packetCount());
fOutBuf->enqueueWord(fSink->octetCount());
enqueueCommonReportSuffix();
}
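As an illustrative sketch (not part of the library) of the timestamp conversion in "addSR()" above: 0x83AA7E80 (2208988800) is the number of seconds between the NTP epoch (1900) and the Unix epoch (1970), and the fractional word scales microseconds by 2^32/10^6 (note that 10^6/2^6 == 15625). The helper name below is hypothetical.

```cpp
#include <assert.h>
#include <stdint.h>

// Hypothetical standalone helper mirroring the arithmetic in addSR():
static void unixToNTP(uint32_t sec, uint32_t usec,
                      uint32_t& msw, uint32_t& lsw) {
  msw = sec + 0x83AA7E80u;                                // 1970 epoch -> 1900 epoch
  lsw = (uint32_t)((usec / 15625.0) * 0x04000000 + 0.5);  // usec * 2^32 / 10^6
}
```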
void RTCPInstance::addRR() {
// ASSERT: fSource != NULL
enqueueCommonReportPrefix(RTCP_PT_RR, fSource->SSRC());
enqueueCommonReportSuffix();
}
void RTCPInstance::enqueueCommonReportPrefix(unsigned char packetType,
unsigned SSRC,
unsigned numExtraWords) {
unsigned numReportingSources;
if (fSource == NULL) {
numReportingSources = 0; // we don't receive anything
} else {
RTPReceptionStatsDB& allReceptionStats
= fSource->receptionStatsDB();
numReportingSources = allReceptionStats.numActiveSourcesSinceLastReset();
// This must be <32, to fit in 5 bits:
if (numReportingSources > 31) { numReportingSources = 31; }
// Later: support adding more reports to handle >32 sources (unlikely)#####
}
unsigned rtcpHdr = 0x80000000; // version 2, no padding
rtcpHdr |= (numReportingSources<<24);
rtcpHdr |= (packetType<<16);
rtcpHdr |= (1 + numExtraWords + 6*numReportingSources);
// each report block is 6 32-bit words long
fOutBuf->enqueueWord(rtcpHdr);
fOutBuf->enqueueWord(SSRC);
}
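As an illustrative sketch (not part of the library) of the header word built in "enqueueCommonReportPrefix()" above: a 2-bit version, 1-bit padding flag, 5-bit report count, 8-bit packet type, and a 16-bit length measured in 32-bit words not counting the header word itself (one word for the SSRC, plus any extra words, plus six words per report block). The helper name below is hypothetical.

```cpp
#include <assert.h>
#include <stdint.h>

// Hypothetical standalone helper mirroring the arithmetic above:
static uint32_t makeReportHeader(uint8_t packetType, unsigned numSources,
                                 unsigned numExtraWords) {
  uint32_t hdr = 0x80000000u;                  // version 2, no padding
  hdr |= (uint32_t)(numSources & 0x1F) << 24;  // 5-bit report count
  hdr |= (uint32_t)packetType << 16;
  hdr |= 1 + numExtraWords + 6 * numSources;   // length, in 32-bit words
  return hdr;
}
```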
void RTCPInstance::enqueueCommonReportSuffix() {
// Output the report blocks for each source:
if (fSource != NULL) {
RTPReceptionStatsDB& allReceptionStats
= fSource->receptionStatsDB();
RTPReceptionStatsDB::Iterator iterator(allReceptionStats);
while (1) {
RTPReceptionStats* receptionStats = iterator.next();
if (receptionStats == NULL) break;
enqueueReportBlock(receptionStats);
}
allReceptionStats.reset(); // because we have just generated a report
}
}
void
RTCPInstance::enqueueReportBlock(RTPReceptionStats* stats) {
fOutBuf->enqueueWord(stats->SSRC());
unsigned highestExtSeqNumReceived = stats->highestExtSeqNumReceived();
unsigned totNumExpected
= highestExtSeqNumReceived - stats->baseExtSeqNumReceived();
int totNumLost = totNumExpected - stats->totNumPacketsReceived();
// 'Clamp' this loss number to a 24-bit signed value:
if (totNumLost > 0x007FFFFF) {
totNumLost = 0x007FFFFF;
} else if (totNumLost < 0) {
if (totNumLost < -0x00800000) totNumLost = 0x00800000; // unlikely, but...
totNumLost &= 0x00FFFFFF;
}
unsigned numExpectedSinceLastReset
= highestExtSeqNumReceived - stats->lastResetExtSeqNumReceived();
int numLostSinceLastReset
= numExpectedSinceLastReset - stats->numPacketsReceivedSinceLastReset();
unsigned char lossFraction;
if (numExpectedSinceLastReset == 0 || numLostSinceLastReset < 0) {
lossFraction = 0;
} else {
lossFraction = (unsigned char)
((numLostSinceLastReset << 8) / numExpectedSinceLastReset);
}
fOutBuf->enqueueWord((lossFraction<<24) | totNumLost);
fOutBuf->enqueueWord(highestExtSeqNumReceived);
fOutBuf->enqueueWord(stats->jitter());
unsigned NTPmsw = stats->lastReceivedSR_NTPmsw();
unsigned NTPlsw = stats->lastReceivedSR_NTPlsw();
unsigned LSR = ((NTPmsw&0xFFFF)<<16)|(NTPlsw>>16); // middle 32 bits
fOutBuf->enqueueWord(LSR);
// Figure out how long has elapsed since the last SR rcvd from this src:
struct timeval const& LSRtime = stats->lastReceivedSR_time(); // "last SR"
struct timeval timeNow, timeSinceLSR;
gettimeofday(&timeNow, NULL);
if (timeNow.tv_usec < LSRtime.tv_usec) {
timeNow.tv_usec += 1000000;
timeNow.tv_sec -= 1;
}
timeSinceLSR.tv_sec = timeNow.tv_sec - LSRtime.tv_sec;
timeSinceLSR.tv_usec = timeNow.tv_usec - LSRtime.tv_usec;
// The enqueued time is in units of 1/65536 seconds.
// (Note that 65536/1000000 == 1024/15625)
unsigned DLSR;
if (LSR == 0) {
DLSR = 0;
} else {
DLSR = (timeSinceLSR.tv_sec<<16)
| ( (((timeSinceLSR.tv_usec<<11)+15625)/31250) & 0xFFFF);
}
fOutBuf->enqueueWord(DLSR);
}
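As an illustrative sketch (not part of the library) of the DLSR ("delay since last SR") conversion used in "enqueueReportBlock()" above: RTCP expresses this delay in units of 1/65536 second, and since 65536/1000000 == 2048/31250, the microseconds part can be scaled with pure integer arithmetic (the +15625 term rounds to the nearest unit). The helper name below is hypothetical.

```cpp
#include <assert.h>
#include <stdint.h>

// Hypothetical standalone helper mirroring the DLSR arithmetic above:
static uint32_t toDLSRUnits(uint32_t sec, uint32_t usec) {
  // usec << 11 == usec * 2048; dividing by 31250 yields usec * 65536 / 10^6
  return (sec << 16) | ((((usec << 11) + 15625) / 31250) & 0xFFFF);
}
```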
void RTCPInstance::addSDES() {
// For now we support only the CNAME item; later support more #####
// Begin by figuring out the size of the entire SDES report:
unsigned numBytes = 4;
// counts the SSRC, but not the header; it'll get subtracted out
numBytes += fCNAME.totalSize(); // includes id and length
numBytes += 1; // the special END item
unsigned num4ByteWords = (numBytes + 3)/4;
unsigned rtcpHdr = 0x81000000; // version 2, no padding, 1 SSRC chunk
rtcpHdr |= (RTCP_PT_SDES<<16);
rtcpHdr |= num4ByteWords;
fOutBuf->enqueueWord(rtcpHdr);
if (fSource != NULL) {
fOutBuf->enqueueWord(fSource->SSRC());
} else if (fSink != NULL) {
fOutBuf->enqueueWord(fSink->SSRC());
}
// Add the CNAME:
fOutBuf->enqueue(fCNAME.data(), fCNAME.totalSize());
// Add the 'END' item (i.e., a zero byte), plus any more needed to pad:
unsigned numPaddingBytesNeeded = 4 - (fOutBuf->curPacketSize() % 4);
unsigned char const zero = '\0';
while (numPaddingBytesNeeded-- > 0) fOutBuf->enqueue(&zero, 1);
}
void RTCPInstance::addBYE() {
unsigned rtcpHdr = 0x81000000; // version 2, no padding, 1 SSRC
rtcpHdr |= (RTCP_PT_BYE<<16);
rtcpHdr |= 1; // 2 32-bit words total (i.e., with 1 SSRC)
fOutBuf->enqueueWord(rtcpHdr);
if (fSource != NULL) {
fOutBuf->enqueueWord(fSource->SSRC());
} else if (fSink != NULL) {
fOutBuf->enqueueWord(fSink->SSRC());
}
}
void RTCPInstance::schedule(double nextTime) {
fNextReportTime = nextTime;
double secondsToDelay = nextTime - dTimeNow();
if (secondsToDelay < 0) secondsToDelay = 0;
#ifdef DEBUG
fprintf(stderr, "schedule(%f->%f)\n", secondsToDelay, nextTime);
#endif
int64_t usToGo = (int64_t)(secondsToDelay * 1000000);
nextTask() = envir().taskScheduler().scheduleDelayedTask(usToGo,
(TaskFunc*)RTCPInstance::onExpire, this);
}
void RTCPInstance::reschedule(double nextTime) {
envir().taskScheduler().unscheduleDelayedTask(nextTask());
schedule(nextTime);
}
void RTCPInstance::onExpire1() {
// Note: fTotSessionBW is kbits per second
double rtcpBW = 0.05*fTotSessionBW*1024/8; // -> bytes per second
OnExpire(this, // event
numMembers(), // members
(fSink != NULL) ? 1 : 0, // senders
rtcpBW, // rtcp_bw
(fSink != NULL) ? 1 : 0, // we_sent
&fAveRTCPSize, // ave_rtcp_size
&fIsInitial, // initial
dTimeNow(), // tc
&fPrevReportTime, // tp
&fPrevNumMembers // pmembers
);
}
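As an illustrative sketch (not part of the library) of the bandwidth figure computed in "onExpire1()" above: per RFC 3550, RTCP traffic is budgeted at 5% of the session bandwidth; "fTotSessionBW" is in kbits/sec, so multiplying by 1024/8 converts it to bytes/sec before taking the 5% fraction. The helper name below is hypothetical.

```cpp
#include <assert.h>

// Hypothetical standalone helper mirroring the arithmetic in onExpire1():
static double rtcpBandwidth(unsigned totSessionBWKbps) {
  return 0.05 * totSessionBWKbps * 1024 / 8; // 5% of session bw, in bytes/sec
}
```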
////////// SDESItem //////////
SDESItem::SDESItem(unsigned char tag, unsigned char const* value) {
unsigned length = strlen((char const*)value);
if (length > 0xFF) length = 0xFF; // maximum data length for a SDES item
fData[0] = tag;
fData[1] = (unsigned char)length;
memmove(&fData[2], value, length);
}
unsigned SDESItem::totalSize() const {
return 2 + (unsigned)fData[1];
}
////////// Implementation of routines imported by the "rtcp_from_spec" C code
extern "C" void Schedule(double nextTime, event e) {
RTCPInstance* instance = (RTCPInstance*)e;
if (instance == NULL) return;
instance->schedule(nextTime);
}
extern "C" void Reschedule(double nextTime, event e) {
RTCPInstance* instance = (RTCPInstance*)e;
if (instance == NULL) return;
instance->reschedule(nextTime);
}
extern "C" void SendRTCPReport(event e) {
RTCPInstance* instance = (RTCPInstance*)e;
if (instance == NULL) return;
instance->sendReport();
}
extern "C" void SendBYEPacket(event e) {
RTCPInstance* instance = (RTCPInstance*)e;
if (instance == NULL) return;
instance->sendBYE();
}
extern "C" int TypeOfEvent(event e) {
RTCPInstance* instance = (RTCPInstance*)e;
if (instance == NULL) return EVENT_UNKNOWN;
return instance->typeOfEvent();
}
extern "C" int SentPacketSize(event e) {
RTCPInstance* instance = (RTCPInstance*)e;
if (instance == NULL) return 0;
return instance->sentPacketSize();
}
extern "C" int PacketType(packet p) {
RTCPInstance* instance = (RTCPInstance*)p;
if (instance == NULL) return PACKET_UNKNOWN_TYPE;
return instance->packetType();
}
extern "C" int ReceivedPacketSize(packet p) {
RTCPInstance* instance = (RTCPInstance*)p;
if (instance == NULL) return 0;
return instance->receivedPacketSize();
}
extern "C" int NewMember(packet p) {
RTCPInstance* instance = (RTCPInstance*)p;
if (instance == NULL) return 0;
return instance->checkNewSSRC();
}
extern "C" int NewSender(packet /*p*/) {
return 0; // we don't yet recognize senders other than ourselves #####
}
extern "C" void AddMember(packet /*p*/) {
// Do nothing; all of the real work was done when NewMember() was called
}
extern "C" void AddSender(packet /*p*/) {
// we don't yet recognize senders other than ourselves #####
}
extern "C" void RemoveMember(packet p) {
RTCPInstance* instance = (RTCPInstance*)p;
if (instance == NULL) return;
instance->removeLastReceivedSSRC();
}
extern "C" void RemoveSender(packet /*p*/) {
// we don't yet recognize senders other than ourselves #####
}
extern "C" double drand30() {
unsigned tmp = our_random()&0x3FFFFFFF; // a random 30-bit integer
return tmp/(double)(1024*1024*1024);
}
live/liveMedia/QCELPAudioRTPSource.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// Qualcomm "PureVoice" (aka. "QCELP") Audio RTP Sources
// Implementation
#include "QCELPAudioRTPSource.hh"
#include "MultiFramedRTPSource.hh"
#include "FramedFilter.hh"
#include <string.h> // for "memmove()"
#include <stdio.h> // for "fprintf()" (DEBUG output)
// This source is implemented internally by two separate sources:
// (i) an RTP source for the raw (interleaved) QCELP frames, and
// (ii) a deinterleaving filter that reads from this.
// Define these two new classes here:
class RawQCELPRTPSource: public MultiFramedRTPSource {
public:
static RawQCELPRTPSource* createNew(UsageEnvironment& env,
Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency);
unsigned char interleaveL() const { return fInterleaveL; }
unsigned char interleaveN() const { return fInterleaveN; }
unsigned char& frameIndex() { return fFrameIndex; } // index within pkt
private:
RawQCELPRTPSource(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency);
// called only by createNew()
virtual ~RawQCELPRTPSource();
private:
// redefined virtual functions:
virtual Boolean processSpecialHeader(BufferedPacket* packet,
unsigned& resultSpecialHeaderSize);
virtual char const* MIMEtype() const;
virtual Boolean hasBeenSynchronizedUsingRTCP();
private:
unsigned char fInterleaveL, fInterleaveN, fFrameIndex;
unsigned fNumSuccessiveSyncedPackets;
};
class QCELPDeinterleaver: public FramedFilter {
public:
static QCELPDeinterleaver* createNew(UsageEnvironment& env,
RawQCELPRTPSource* inputSource);
private:
QCELPDeinterleaver(UsageEnvironment& env,
RawQCELPRTPSource* inputSource);
// called only by "createNew()"
virtual ~QCELPDeinterleaver();
static void afterGettingFrame(void* clientData, unsigned frameSize,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned durationInMicroseconds);
void afterGettingFrame1(unsigned frameSize, struct timeval presentationTime);
private:
// Redefined virtual functions:
void doGetNextFrame();
virtual void doStopGettingFrames();
private:
class QCELPDeinterleavingBuffer* fDeinterleavingBuffer;
Boolean fNeedAFrame;
};
////////// QCELPAudioRTPSource implementation //////////
FramedSource*
QCELPAudioRTPSource::createNew(UsageEnvironment& env,
Groupsock* RTPgs,
RTPSource*& resultRTPSource,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency) {
RawQCELPRTPSource* rawRTPSource;
resultRTPSource = rawRTPSource
= RawQCELPRTPSource::createNew(env, RTPgs, rtpPayloadFormat,
rtpTimestampFrequency);
if (resultRTPSource == NULL) return NULL;
QCELPDeinterleaver* deinterleaver
= QCELPDeinterleaver::createNew(env, rawRTPSource);
if (deinterleaver == NULL) {
Medium::close(resultRTPSource);
resultRTPSource = NULL;
}
return deinterleaver;
}
////////// QCELPBufferedPacket and QCELPBufferedPacketFactory //////////
// A subclass of BufferedPacket, used to separate out QCELP frames.
class QCELPBufferedPacket: public BufferedPacket {
public:
QCELPBufferedPacket(RawQCELPRTPSource& ourSource);
virtual ~QCELPBufferedPacket();
private: // redefined virtual functions
virtual unsigned nextEnclosedFrameSize(unsigned char*& framePtr,
unsigned dataSize);
private:
RawQCELPRTPSource& fOurSource;
};
class QCELPBufferedPacketFactory: public BufferedPacketFactory {
private: // redefined virtual functions
virtual BufferedPacket* createNewPacket(MultiFramedRTPSource* ourSource);
};
///////// RawQCELPRTPSource implementation ////////
RawQCELPRTPSource*
RawQCELPRTPSource::createNew(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency) {
return new RawQCELPRTPSource(env, RTPgs, rtpPayloadFormat,
rtpTimestampFrequency);
}
RawQCELPRTPSource::RawQCELPRTPSource(UsageEnvironment& env,
Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency)
: MultiFramedRTPSource(env, RTPgs, rtpPayloadFormat,
rtpTimestampFrequency,
new QCELPBufferedPacketFactory),
fInterleaveL(0), fInterleaveN(0), fFrameIndex(0),
fNumSuccessiveSyncedPackets(0) {
}
RawQCELPRTPSource::~RawQCELPRTPSource() {
}
Boolean RawQCELPRTPSource
::processSpecialHeader(BufferedPacket* packet,
unsigned& resultSpecialHeaderSize) {
unsigned char* headerStart = packet->data();
unsigned packetSize = packet->dataSize();
// First, check whether this packet's RTP timestamp is synchronized:
if (RTPSource::hasBeenSynchronizedUsingRTCP()) {
++fNumSuccessiveSyncedPackets;
} else {
fNumSuccessiveSyncedPackets = 0;
}
// There's a 1-byte header indicating the interleave parameters
if (packetSize < 1) return False;
// Get the interleaving parameters from the 1-byte header,
// and check them for validity:
unsigned char const firstByte = headerStart[0];
unsigned char const interleaveL = (firstByte&0x38)>>3;
unsigned char const interleaveN = firstByte&0x07;
#ifdef DEBUG
fprintf(stderr, "packetSize: %d, interleaveL: %d, interleaveN: %d\n", packetSize, interleaveL, interleaveN);
#endif
if (interleaveL > 5 || interleaveN > interleaveL) return False; //invalid
fInterleaveL = interleaveL;
fInterleaveN = interleaveN;
fFrameIndex = 0; // initially
resultSpecialHeaderSize = 1;
return True;
}
char const* RawQCELPRTPSource::MIMEtype() const {
return "audio/QCELP";
}
Boolean RawQCELPRTPSource::hasBeenSynchronizedUsingRTCP() {
// Don't report ourselves as being synchronized until we've received
// at least a complete interleave cycle of synchronized packets.
// This ensures that the receiver is currently getting a frame from
// a packet that was synchronized.
if (fNumSuccessiveSyncedPackets > (unsigned)(fInterleaveL+1)) {
fNumSuccessiveSyncedPackets = fInterleaveL+2; // prevents overflow
return True;
}
return False;
}
///// QCELPBufferedPacket and QCELPBufferedPacketFactory implementation
QCELPBufferedPacket::QCELPBufferedPacket(RawQCELPRTPSource& ourSource)
: fOurSource(ourSource) {
}
QCELPBufferedPacket::~QCELPBufferedPacket() {
}
unsigned QCELPBufferedPacket::
nextEnclosedFrameSize(unsigned char*& framePtr, unsigned dataSize) {
// The size of the QCELP frame is determined by the first byte:
if (dataSize == 0) return 0; // sanity check
unsigned char const firstByte = framePtr[0];
unsigned frameSize;
switch (firstByte) {
case 0: { frameSize = 1; break; }
case 1: { frameSize = 4; break; }
case 2: { frameSize = 8; break; }
case 3: { frameSize = 17; break; }
case 4: { frameSize = 35; break; }
default: { frameSize = 0; break; }
}
#ifdef DEBUG
fprintf(stderr, "QCELPBufferedPacket::nextEnclosedFrameSize(): frameSize: %d, dataSize: %d\n", frameSize, dataSize);
#endif
if (dataSize < frameSize) return 0;
++fOurSource.frameIndex();
return frameSize;
}
BufferedPacket* QCELPBufferedPacketFactory
::createNewPacket(MultiFramedRTPSource* ourSource) {
return new QCELPBufferedPacket((RawQCELPRTPSource&)(*ourSource));
}
///////// QCELPDeinterleavingBuffer /////////
// (used to implement QCELPDeinterleaver)
#define QCELP_MAX_FRAME_SIZE 35
#define QCELP_MAX_INTERLEAVE_L 5
#define QCELP_MAX_FRAMES_PER_PACKET 10
#define QCELP_MAX_INTERLEAVE_GROUP_SIZE \
((QCELP_MAX_INTERLEAVE_L+1)*QCELP_MAX_FRAMES_PER_PACKET)
class QCELPDeinterleavingBuffer {
public:
QCELPDeinterleavingBuffer();
virtual ~QCELPDeinterleavingBuffer();
void deliverIncomingFrame(unsigned frameSize,
unsigned char interleaveL,
unsigned char interleaveN,
unsigned char frameIndex,
unsigned short packetSeqNum,
struct timeval presentationTime);
Boolean retrieveFrame(unsigned char* to, unsigned maxSize,
unsigned& resultFrameSize, unsigned& resultNumTruncatedBytes,
struct timeval& resultPresentationTime);
unsigned char* inputBuffer() { return fInputBuffer; }
unsigned inputBufferSize() const { return QCELP_MAX_FRAME_SIZE; }
private:
class FrameDescriptor {
public:
FrameDescriptor();
virtual ~FrameDescriptor();
unsigned frameSize;
unsigned char* frameData;
struct timeval presentationTime;
};
// Use two banks of descriptors - one for incoming, one for outgoing
FrameDescriptor fFrames[QCELP_MAX_INTERLEAVE_GROUP_SIZE][2];
unsigned char fIncomingBankId; // toggles between 0 and 1
unsigned char fIncomingBinMax; // in the incoming bank
unsigned char fOutgoingBinMax; // in the outgoing bank
unsigned char fNextOutgoingBin;
Boolean fHaveSeenPackets;
u_int16_t fLastPacketSeqNumForGroup;
unsigned char* fInputBuffer;
struct timeval fLastRetrievedPresentationTime;
};
////////// QCELPDeinterleaver implementation /////////
QCELPDeinterleaver*
QCELPDeinterleaver::createNew(UsageEnvironment& env,
RawQCELPRTPSource* inputSource) {
return new QCELPDeinterleaver(env, inputSource);
}
QCELPDeinterleaver::QCELPDeinterleaver(UsageEnvironment& env,
RawQCELPRTPSource* inputSource)
: FramedFilter(env, inputSource),
fNeedAFrame(False) {
fDeinterleavingBuffer = new QCELPDeinterleavingBuffer();
}
QCELPDeinterleaver::~QCELPDeinterleaver() {
delete fDeinterleavingBuffer;
}
static unsigned const uSecsPerFrame = 20000; // 20 ms
void QCELPDeinterleaver::doGetNextFrame() {
// First, try getting a frame from the deinterleaving buffer:
if (fDeinterleavingBuffer->retrieveFrame(fTo, fMaxSize,
fFrameSize, fNumTruncatedBytes,
fPresentationTime)) {
// Success!
fNeedAFrame = False;
fDurationInMicroseconds = uSecsPerFrame;
// Call our own 'after getting' function. Because we're not a 'leaf'
// source, we can call this directly, without risking
// infinite recursion
afterGetting(this);
return;
}
// No luck, so ask our source for help:
fNeedAFrame = True;
if (!fInputSource->isCurrentlyAwaitingData()) {
fInputSource->getNextFrame(fDeinterleavingBuffer->inputBuffer(),
fDeinterleavingBuffer->inputBufferSize(),
afterGettingFrame, this,
FramedSource::handleClosure, this);
}
}
void QCELPDeinterleaver::doStopGettingFrames() {
fNeedAFrame = False;
fInputSource->stopGettingFrames();
}
void QCELPDeinterleaver
::afterGettingFrame(void* clientData, unsigned frameSize,
unsigned /*numTruncatedBytes*/,
struct timeval presentationTime,
unsigned /*durationInMicroseconds*/) {
QCELPDeinterleaver* deinterleaver = (QCELPDeinterleaver*)clientData;
deinterleaver->afterGettingFrame1(frameSize, presentationTime);
}
void QCELPDeinterleaver
::afterGettingFrame1(unsigned frameSize, struct timeval presentationTime) {
RawQCELPRTPSource* source = (RawQCELPRTPSource*)fInputSource;
// First, put the frame into our deinterleaving buffer:
fDeinterleavingBuffer
->deliverIncomingFrame(frameSize, source->interleaveL(),
source->interleaveN(), source->frameIndex(),
source->curPacketRTPSeqNum(),
presentationTime);
// Then, try delivering a frame to the client (if he wants one):
if (fNeedAFrame) doGetNextFrame();
}
////////// QCELPDeinterleavingBuffer implementation /////////
QCELPDeinterleavingBuffer::QCELPDeinterleavingBuffer()
: fIncomingBankId(0), fIncomingBinMax(0),
fOutgoingBinMax(0), fNextOutgoingBin(0),
fHaveSeenPackets(False) {
fInputBuffer = new unsigned char[QCELP_MAX_FRAME_SIZE];
}
QCELPDeinterleavingBuffer::~QCELPDeinterleavingBuffer() {
delete[] fInputBuffer;
}
void QCELPDeinterleavingBuffer
::deliverIncomingFrame(unsigned frameSize,
unsigned char interleaveL,
unsigned char interleaveN,
unsigned char frameIndex,
unsigned short packetSeqNum,
struct timeval presentationTime) {
// First perform a sanity check on the parameters:
// (This is overkill, as the source should have already done this.)
if (frameSize > QCELP_MAX_FRAME_SIZE
|| interleaveL > QCELP_MAX_INTERLEAVE_L || interleaveN > interleaveL
|| frameIndex == 0 || frameIndex > QCELP_MAX_FRAMES_PER_PACKET) {
#ifdef DEBUG
fprintf(stderr, "QCELPDeinterleavingBuffer::deliverIncomingFrame() param sanity check failed (%d,%d,%d,%d)\n", frameSize, interleaveL, interleaveN, frameIndex);
#endif
return;
}
// The input "presentationTime" was that of the first frame in this
// packet. Update it for the current frame:
unsigned uSecIncrement = (frameIndex-1)*(interleaveL+1)*uSecsPerFrame;
presentationTime.tv_usec += uSecIncrement;
presentationTime.tv_sec += presentationTime.tv_usec/1000000;
presentationTime.tv_usec = presentationTime.tv_usec%1000000;
// Next, check whether this packet is part of a new interleave group
if (!fHaveSeenPackets
|| seqNumLT(fLastPacketSeqNumForGroup, packetSeqNum)) {
// We've moved to a new interleave group
fHaveSeenPackets = True;
fLastPacketSeqNumForGroup = packetSeqNum + interleaveL - interleaveN;
// Switch the incoming and outgoing banks:
fIncomingBankId ^= 1;
unsigned char tmp = fIncomingBinMax;
fIncomingBinMax = fOutgoingBinMax;
fOutgoingBinMax = tmp;
fNextOutgoingBin = 0;
}
// Now move the incoming frame into the appropriate bin:
unsigned const binNumber
= interleaveN + (frameIndex-1)*(interleaveL+1);
FrameDescriptor& inBin = fFrames[binNumber][fIncomingBankId];
unsigned char* curBuffer = inBin.frameData;
inBin.frameData = fInputBuffer;
inBin.frameSize = frameSize;
inBin.presentationTime = presentationTime;
if (curBuffer == NULL) curBuffer = new unsigned char[QCELP_MAX_FRAME_SIZE];
fInputBuffer = curBuffer;
if (binNumber >= fIncomingBinMax) {
fIncomingBinMax = binNumber + 1;
}
}
Boolean QCELPDeinterleavingBuffer
::retrieveFrame(unsigned char* to, unsigned maxSize,
unsigned& resultFrameSize, unsigned& resultNumTruncatedBytes,
struct timeval& resultPresentationTime) {
if (fNextOutgoingBin >= fOutgoingBinMax) return False; // none left
FrameDescriptor& outBin = fFrames[fNextOutgoingBin][fIncomingBankId^1];
unsigned char* fromPtr;
unsigned char fromSize = outBin.frameSize;
outBin.frameSize = 0; // for the next time this bin is used
// Check whether this frame is missing; if so, return an 'erasure' frame:
unsigned char erasure = 14;
if (fromSize == 0) {
fromPtr = &erasure;
fromSize = 1;
// Compute this erasure frame's presentation time via extrapolation:
resultPresentationTime = fLastRetrievedPresentationTime;
resultPresentationTime.tv_usec += uSecsPerFrame;
if (resultPresentationTime.tv_usec >= 1000000) {
++resultPresentationTime.tv_sec;
resultPresentationTime.tv_usec -= 1000000;
}
} else {
// Normal case - a frame exists:
fromPtr = outBin.frameData;
resultPresentationTime = outBin.presentationTime;
}
fLastRetrievedPresentationTime = resultPresentationTime;
if (fromSize > maxSize) {
resultNumTruncatedBytes = fromSize - maxSize;
resultFrameSize = maxSize;
} else {
resultNumTruncatedBytes = 0;
resultFrameSize = fromSize;
}
memmove(to, fromPtr, resultFrameSize);
++fNextOutgoingBin;
return True;
}
QCELPDeinterleavingBuffer::FrameDescriptor::FrameDescriptor()
: frameSize(0), frameData(NULL) {
}
QCELPDeinterleavingBuffer::FrameDescriptor::~FrameDescriptor() {
delete[] frameData;
}
live/liveMedia/QuickTimeFileSink.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// A sink that generates a QuickTime file from a composite media session
// Implementation
#include "QuickTimeFileSink.hh"
#include "QuickTimeGenericRTPSource.hh"
#include "GroupsockHelper.hh"
#include "InputFile.hh"
#include "OutputFile.hh"
#include "H263plusVideoRTPSource.hh" // for the special header
#include "MPEG4GenericRTPSource.hh" //for "samplingFrequencyFromAudioSpecificConfig()"
#include "MPEG4LATMAudioRTPSource.hh" // for "parseGeneralConfigStr()"
#include "Base64.hh"
#include <string.h> // for "strcmp()"
#define fourChar(x,y,z,w) ( ((x)<<24)|((y)<<16)|((z)<<8)|(w) )
#define H264_IDR_FRAME 0x65 //bit 8 == 0, bits 7-6 (ref) == 3, bits 5-0 (type) == 5
////////// SubsessionIOState, ChunkDescriptor ///////////
// A structure used to represent the I/O state of each input 'subsession':
class ChunkDescriptor {
public:
ChunkDescriptor(int64_t offsetInFile, unsigned size,
unsigned frameSize, unsigned frameDuration,
struct timeval presentationTime);
ChunkDescriptor* extendChunk(int64_t newOffsetInFile, unsigned newSize,
unsigned newFrameSize,
unsigned newFrameDuration,
struct timeval newPresentationTime);
// this may end up allocating a new chunk instead
public:
ChunkDescriptor* fNextChunk;
int64_t fOffsetInFile;
unsigned fNumFrames;
unsigned fFrameSize;
unsigned fFrameDuration;
struct timeval fPresentationTime; // of the start of the data
};
class SubsessionBuffer {
public:
SubsessionBuffer(unsigned bufferSize)
: fBufferSize(bufferSize) {
reset();
fData = new unsigned char[bufferSize];
}
virtual ~SubsessionBuffer() { delete[] fData; }
void reset() { fBytesInUse = 0; }
void addBytes(unsigned numBytes) { fBytesInUse += numBytes; }
unsigned char* dataStart() { return &fData[0]; }
unsigned char* dataEnd() { return &fData[fBytesInUse]; }
unsigned bytesInUse() const { return fBytesInUse; }
unsigned bytesAvailable() const { return fBufferSize - fBytesInUse; }
void setPresentationTime(struct timeval const& presentationTime) {
fPresentationTime = presentationTime;
}
struct timeval const& presentationTime() const {return fPresentationTime;}
private:
unsigned fBufferSize;
struct timeval fPresentationTime;
unsigned char* fData;
unsigned fBytesInUse;
};
class SyncFrame {
public:
SyncFrame(unsigned frameNum);
public:
class SyncFrame *nextSyncFrame;
unsigned sfFrameNum;
};
// A 64-bit counter, used below:
class Count64 {
public:
Count64()
: hi(0), lo(0) {
}
void operator+=(unsigned arg);
u_int32_t hi, lo;
};
class SubsessionIOState {
public:
SubsessionIOState(QuickTimeFileSink& sink, MediaSubsession& subsession);
virtual ~SubsessionIOState();
Boolean setQTstate();
void setFinalQTstate();
void afterGettingFrame(unsigned packetDataSize,
struct timeval presentationTime);
void onSourceClosure();
Boolean syncOK(struct timeval presentationTime);
// returns true iff data is usable despite a sync check
static void setHintTrack(SubsessionIOState* hintedTrack,
SubsessionIOState* hintTrack);
Boolean isHintTrack() const { return fTrackHintedByUs != NULL; }
Boolean hasHintTrack() const { return fHintTrackForUs != NULL; }
UsageEnvironment& envir() const { return fOurSink.envir(); }
public:
static unsigned fCurrentTrackNumber;
unsigned fTrackID;
SubsessionIOState* fHintTrackForUs; SubsessionIOState* fTrackHintedByUs;
SubsessionBuffer *fBuffer, *fPrevBuffer;
QuickTimeFileSink& fOurSink;
MediaSubsession& fOurSubsession;
unsigned short fLastPacketRTPSeqNum;
Boolean fOurSourceIsActive;
Boolean fHaveBeenSynced; // used in synchronizing with other streams
struct timeval fSyncTime;
Boolean fQTEnableTrack;
unsigned fQTcomponentSubtype;
char const* fQTcomponentName;
typedef unsigned (QuickTimeFileSink::*atomCreationFunc)();
atomCreationFunc fQTMediaInformationAtomCreator;
atomCreationFunc fQTMediaDataAtomCreator;
char const* fQTAudioDataType;
unsigned short fQTSoundSampleVersion;
unsigned fQTTimeScale;
unsigned fQTTimeUnitsPerSample;
unsigned fQTBytesPerFrame;
unsigned fQTSamplesPerFrame;
// These next fields are derived from the ones above,
// plus the information from each chunk:
unsigned fQTTotNumSamples;
unsigned fQTDurationM; // in media time units
unsigned fQTDurationT; // in track time units
int64_t fTKHD_durationPosn;
// position of the duration in the output 'tkhd' atom
unsigned fQTInitialOffsetDuration;
// if there's a pause at the beginning
ChunkDescriptor *fHeadChunk, *fTailChunk;
unsigned fNumChunks;
SyncFrame *fHeadSyncFrame, *fTailSyncFrame;
// Counters to be used in the hint track's 'udta'/'hinf' atom;
struct hinf {
Count64 trpy;
Count64 nump;
Count64 tpyl;
// Is 'maxr' needed? Computing this would be a PITA. #####
Count64 dmed;
Count64 dimm;
// 'drep' is always 0
// 'tmin' and 'tmax' are always 0
unsigned pmax;
unsigned dmax;
} fHINF;
private:
void useFrame(SubsessionBuffer& buffer);
void useFrameForHinting(unsigned frameSize,
struct timeval presentationTime,
unsigned startSampleNumber);
// used by the above two routines:
unsigned useFrame1(unsigned sourceDataSize,
struct timeval presentationTime,
unsigned frameDuration, int64_t destFileOffset);
// returns the number of samples in this data
private:
// A structure used for temporarily storing frame state:
struct {
unsigned frameSize;
struct timeval presentationTime;
int64_t destFileOffset; // used for non-hint tracks only
// The remaining fields are used for hint tracks only:
unsigned startSampleNumber;
unsigned short seqNum;
unsigned rtpHeader;
unsigned char numSpecialHeaders; // used when our RTP source has special headers
unsigned specialHeaderBytesLength; // ditto
unsigned char specialHeaderBytes[SPECIAL_HEADER_BUFFER_SIZE]; // ditto
unsigned packetSizes[256];
} fPrevFrameState;
};
////////// QuickTimeFileSink implementation //////////
QuickTimeFileSink::QuickTimeFileSink(UsageEnvironment& env,
MediaSession& inputSession,
char const* outputFileName,
unsigned bufferSize,
unsigned short movieWidth,
unsigned short movieHeight,
unsigned movieFPS,
Boolean packetLossCompensate,
Boolean syncStreams,
Boolean generateHintTracks,
Boolean generateMP4Format)
: Medium(env), fInputSession(inputSession),
fBufferSize(bufferSize), fPacketLossCompensate(packetLossCompensate),
fSyncStreams(syncStreams), fGenerateMP4Format(generateMP4Format),
fAreCurrentlyBeingPlayed(False),
fLargestRTPtimestampFrequency(0),
fNumSubsessions(0), fNumSyncedSubsessions(0),
fHaveCompletedOutputFile(False),
fMovieWidth(movieWidth), fMovieHeight(movieHeight),
fMovieFPS(movieFPS), fMaxTrackDurationM(0) {
fOutFid = OpenOutputFile(env, outputFileName);
if (fOutFid == NULL) return;
fNewestSyncTime.tv_sec = fNewestSyncTime.tv_usec = 0;
fFirstDataTime.tv_sec = fFirstDataTime.tv_usec = (unsigned)(~0);
// Set up I/O state for each input subsession:
MediaSubsessionIterator iter(fInputSession);
MediaSubsession* subsession;
while ((subsession = iter.next()) != NULL) {
// Ignore subsessions without a data source:
FramedSource* subsessionSource = subsession->readSource();
if (subsessionSource == NULL) continue;
// If "subsession's" SDP description specified screen dimension
// or frame rate parameters, then use these. (Note that this must
// be done before the call to "setQTState()" below.)
if (subsession->videoWidth() != 0) {
fMovieWidth = subsession->videoWidth();
}
if (subsession->videoHeight() != 0) {
fMovieHeight = subsession->videoHeight();
}
if (subsession->videoFPS() != 0) {
fMovieFPS = subsession->videoFPS();
}
SubsessionIOState* ioState
= new SubsessionIOState(*this, *subsession);
if (ioState == NULL || !ioState->setQTstate()) {
// We're not able to output a QuickTime track for this subsession
delete ioState; ioState = NULL;
continue;
}
subsession->miscPtr = (void*)ioState;
if (generateHintTracks) {
// Also create a hint track for this track:
SubsessionIOState* hintTrack
= new SubsessionIOState(*this, *subsession);
SubsessionIOState::setHintTrack(ioState, hintTrack);
if (!hintTrack->setQTstate()) {
delete hintTrack;
SubsessionIOState::setHintTrack(ioState, NULL);
}
}
// Also set a 'BYE' handler for this subsession's RTCP instance:
if (subsession->rtcpInstance() != NULL) {
subsession->rtcpInstance()->setByeHandler(onRTCPBye, ioState);
}
unsigned rtpTimestampFrequency = subsession->rtpTimestampFrequency();
if (rtpTimestampFrequency > fLargestRTPtimestampFrequency) {
fLargestRTPtimestampFrequency = rtpTimestampFrequency;
}
++fNumSubsessions;
}
// Use the current time as the file's creation and modification
// time. Use Apple's time format: seconds since January 1, 1904
gettimeofday(&fStartTime, NULL);
fAppleCreationTime = fStartTime.tv_sec - 0x83dac000;
// Begin by writing a "mdat" atom at the start of the file.
// (Later, when we've finished copying data to the file, we'll come
// back and fill in its size.)
fMDATposition = TellFile64(fOutFid);
addAtomHeader64("mdat");
// add 64Bit offset
fMDATposition += 8;
}
QuickTimeFileSink::~QuickTimeFileSink() {
completeOutputFile();
// Then, stop streaming and delete each active "SubsessionIOState":
MediaSubsessionIterator iter(fInputSession);
MediaSubsession* subsession;
while ((subsession = iter.next()) != NULL) {
if (subsession->readSource() != NULL) subsession->readSource()->stopGettingFrames();
SubsessionIOState* ioState
= (SubsessionIOState*)(subsession->miscPtr);
if (ioState == NULL) continue;
delete ioState->fHintTrackForUs; // if any
delete ioState;
}
// Finally, close our output file:
CloseOutputFile(fOutFid);
}
QuickTimeFileSink*
QuickTimeFileSink::createNew(UsageEnvironment& env,
MediaSession& inputSession,
char const* outputFileName,
unsigned bufferSize,
unsigned short movieWidth,
unsigned short movieHeight,
unsigned movieFPS,
Boolean packetLossCompensate,
Boolean syncStreams,
Boolean generateHintTracks,
Boolean generateMP4Format) {
QuickTimeFileSink* newSink =
new QuickTimeFileSink(env, inputSession, outputFileName, bufferSize, movieWidth, movieHeight, movieFPS,
packetLossCompensate, syncStreams, generateHintTracks, generateMP4Format);
if (newSink == NULL || newSink->fOutFid == NULL) {
Medium::close(newSink);
return NULL;
}
return newSink;
}
void QuickTimeFileSink
::noteRecordedFrame(MediaSubsession& /*inputSubsession*/,
unsigned /*packetDataSize*/, struct timeval const& /*presentationTime*/) {
// Default implementation: Do nothing
}
Boolean QuickTimeFileSink::startPlaying(afterPlayingFunc* afterFunc,
void* afterClientData) {
// Make sure we're not already being played:
if (fAreCurrentlyBeingPlayed) {
envir().setResultMsg("This sink has already been played");
return False;
}
fAreCurrentlyBeingPlayed = True;
fAfterFunc = afterFunc;
fAfterClientData = afterClientData;
return continuePlaying();
}
Boolean QuickTimeFileSink::continuePlaying() {
// Run through each of our input session's 'subsessions',
// asking for a frame from each one:
Boolean haveActiveSubsessions = False;
MediaSubsessionIterator iter(fInputSession);
MediaSubsession* subsession;
while ((subsession = iter.next()) != NULL) {
FramedSource* subsessionSource = subsession->readSource();
if (subsessionSource == NULL) continue;
if (subsessionSource->isCurrentlyAwaitingData()) continue;
SubsessionIOState* ioState
= (SubsessionIOState*)(subsession->miscPtr);
if (ioState == NULL) continue;
haveActiveSubsessions = True;
unsigned char* toPtr = ioState->fBuffer->dataEnd();
unsigned toSize = ioState->fBuffer->bytesAvailable();
subsessionSource->getNextFrame(toPtr, toSize,
afterGettingFrame, ioState,
onSourceClosure, ioState);
}
if (!haveActiveSubsessions) {
envir().setResultMsg("No subsessions are currently active");
return False;
}
return True;
}
void QuickTimeFileSink
::afterGettingFrame(void* clientData, unsigned packetDataSize,
unsigned numTruncatedBytes,
struct timeval presentationTime,
unsigned /*durationInMicroseconds*/) {
SubsessionIOState* ioState = (SubsessionIOState*)clientData;
if (!ioState->syncOK(presentationTime)) {
// Ignore this data:
ioState->fOurSink.continuePlaying();
return;
}
if (numTruncatedBytes > 0) {
ioState->envir() << "QuickTimeFileSink::afterGettingFrame(): The input frame data was too large for our buffer. "
<< numTruncatedBytes
<< " bytes of trailing data was dropped! Correct this by increasing the \"bufferSize\" parameter in the \"createNew()\" call.\n";
}
ioState->afterGettingFrame(packetDataSize, presentationTime);
}
void QuickTimeFileSink::onSourceClosure(void* clientData) {
SubsessionIOState* ioState = (SubsessionIOState*)clientData;
ioState->onSourceClosure();
}
void QuickTimeFileSink::onSourceClosure1() {
// Check whether *all* of the subsession sources have closed.
// If not, do nothing for now:
MediaSubsessionIterator iter(fInputSession);
MediaSubsession* subsession;
while ((subsession = iter.next()) != NULL) {
SubsessionIOState* ioState
= (SubsessionIOState*)(subsession->miscPtr);
if (ioState == NULL) continue;
if (ioState->fOurSourceIsActive) return; // this source hasn't closed
}
completeOutputFile();
// Call our specified 'after' function:
if (fAfterFunc != NULL) {
(*fAfterFunc)(fAfterClientData);
}
}
void QuickTimeFileSink::onRTCPBye(void* clientData) {
SubsessionIOState* ioState = (SubsessionIOState*)clientData;
struct timeval timeNow;
gettimeofday(&timeNow, NULL);
unsigned secsDiff
= timeNow.tv_sec - ioState->fOurSink.fStartTime.tv_sec;
MediaSubsession& subsession = ioState->fOurSubsession;
ioState->envir() << "Received RTCP \"BYE\" on \""
<< subsession.mediumName()
<< "/" << subsession.codecName()
<< "\" subsession (after "
<< secsDiff << " seconds)\n";
// Handle the reception of a RTCP "BYE" as if the source had closed:
ioState->onSourceClosure();
}
static Boolean timevalGE(struct timeval const& tv1,
struct timeval const& tv2) {
return (unsigned)tv1.tv_sec > (unsigned)tv2.tv_sec
|| (tv1.tv_sec == tv2.tv_sec
&& (unsigned)tv1.tv_usec >= (unsigned)tv2.tv_usec);
}
void QuickTimeFileSink::completeOutputFile() {
if (fHaveCompletedOutputFile || fOutFid == NULL) return;
// Begin by filling in the initial "mdat" atom with the current
// file size:
int64_t curFileSize = TellFile64(fOutFid);
setWord64(fMDATposition, (u_int64_t)curFileSize);
// Then, note the time of the first received data:
MediaSubsessionIterator iter(fInputSession);
MediaSubsession* subsession;
while ((subsession = iter.next()) != NULL) {
SubsessionIOState* ioState
= (SubsessionIOState*)(subsession->miscPtr);
if (ioState == NULL) continue;
ChunkDescriptor* const headChunk = ioState->fHeadChunk;
if (headChunk != NULL
&& timevalGE(fFirstDataTime, headChunk->fPresentationTime)) {
fFirstDataTime = headChunk->fPresentationTime;
}
}
// Then, update the QuickTime-specific state for each active track:
iter.reset();
while ((subsession = iter.next()) != NULL) {
SubsessionIOState* ioState
= (SubsessionIOState*)(subsession->miscPtr);
if (ioState == NULL) continue;
ioState->setFinalQTstate();
// Do the same for a hint track (if any):
if (ioState->hasHintTrack()) {
ioState->fHintTrackForUs->setFinalQTstate();
}
}
if (fGenerateMP4Format) {
// Begin with a "ftyp" atom:
addAtom_ftyp();
}
// Then, add a "moov" atom for the file metadata:
addAtom_moov();
// We're done:
fHaveCompletedOutputFile = True;
}
////////// SubsessionIOState, ChunkDescriptor implementation ///////////
unsigned SubsessionIOState::fCurrentTrackNumber = 0;
SubsessionIOState::SubsessionIOState(QuickTimeFileSink& sink,
MediaSubsession& subsession)
: fHintTrackForUs(NULL), fTrackHintedByUs(NULL),
fOurSink(sink), fOurSubsession(subsession),
fLastPacketRTPSeqNum(0), fHaveBeenSynced(False), fQTTotNumSamples(0),
fHeadChunk(NULL), fTailChunk(NULL), fNumChunks(0),
fHeadSyncFrame(NULL), fTailSyncFrame(NULL) {
fTrackID = ++fCurrentTrackNumber;
fBuffer = new SubsessionBuffer(fOurSink.fBufferSize);
fPrevBuffer = sink.fPacketLossCompensate
? new SubsessionBuffer(fOurSink.fBufferSize) : NULL;
FramedSource* subsessionSource = subsession.readSource();
fOurSourceIsActive = subsessionSource != NULL;
fPrevFrameState.presentationTime.tv_sec = 0;
fPrevFrameState.presentationTime.tv_usec = 0;
fPrevFrameState.seqNum = 0;
}
SubsessionIOState::~SubsessionIOState() {
delete fBuffer; delete fPrevBuffer;
// Delete the list of chunk descriptors:
ChunkDescriptor* chunk = fHeadChunk;
while (chunk != NULL) {
ChunkDescriptor* next = chunk->fNextChunk;
delete chunk;
chunk = next;
}
// Delete the list of sync frames:
SyncFrame* syncFrame = fHeadSyncFrame;
while (syncFrame != NULL) {
SyncFrame* next = syncFrame->nextSyncFrame;
delete syncFrame;
syncFrame = next;
}
}
Boolean SubsessionIOState::setQTstate() {
char const* noCodecWarning1 = "Warning: We don't implement a QuickTime ";
char const* noCodecWarning2 = " Media Data Type for the \"";
char const* noCodecWarning3 = "\" track, so we'll insert a dummy \"????\" Media Data Atom instead. A separate, codec-specific editing pass will be needed before this track can be played.\n";
do {
fQTEnableTrack = True; // enable this track in the movie by default
fQTTimeScale = fOurSubsession.rtpTimestampFrequency(); // by default
fQTTimeUnitsPerSample = 1; // by default
fQTBytesPerFrame = 0;
// by default - indicates that the whole packet data is a frame
fQTSamplesPerFrame = 1; // by default
// Make sure our subsession's medium is one that we know how to
// represent in a QuickTime file:
if (isHintTrack()) {
// Hint tracks are treated specially
fQTEnableTrack = False; // hint tracks are marked as inactive
fQTcomponentSubtype = fourChar('h','i','n','t');
fQTcomponentName = "hint media handler";
fQTMediaInformationAtomCreator = &QuickTimeFileSink::addAtom_gmhd;
fQTMediaDataAtomCreator = &QuickTimeFileSink::addAtom_rtp;
} else if (strcmp(fOurSubsession.mediumName(), "audio") == 0) {
fQTcomponentSubtype = fourChar('s','o','u','n');
fQTcomponentName = "Apple Sound Media Handler";
fQTMediaInformationAtomCreator = &QuickTimeFileSink::addAtom_smhd;
fQTMediaDataAtomCreator
= &QuickTimeFileSink::addAtom_soundMediaGeneral; // by default
fQTSoundSampleVersion = 0; // by default
// Make sure that our subsession's codec is one that we can handle:
if (strcmp(fOurSubsession.codecName(), "X-QT") == 0 ||
strcmp(fOurSubsession.codecName(), "X-QUICKTIME") == 0) {
fQTMediaDataAtomCreator = &QuickTimeFileSink::addAtom_genericMedia;
} else if (strcmp(fOurSubsession.codecName(), "PCMU") == 0) {
fQTAudioDataType = "ulaw";
fQTBytesPerFrame = 1;
} else if (strcmp(fOurSubsession.codecName(), "GSM") == 0) {
fQTAudioDataType = "agsm";
fQTBytesPerFrame = 33;
fQTSamplesPerFrame = 160;
} else if (strcmp(fOurSubsession.codecName(), "PCMA") == 0) {
fQTAudioDataType = "alaw";
fQTBytesPerFrame = 1;
} else if (strcmp(fOurSubsession.codecName(), "QCELP") == 0) {
fQTMediaDataAtomCreator = &QuickTimeFileSink::addAtom_Qclp;
fQTSamplesPerFrame = 160;
} else if (strcmp(fOurSubsession.codecName(), "MPEG4-GENERIC") == 0 ||
strcmp(fOurSubsession.codecName(), "MP4A-LATM") == 0) {
fQTMediaDataAtomCreator = &QuickTimeFileSink::addAtom_mp4a;
fQTTimeUnitsPerSample = 1024; // QT considers each frame to be a 'sample'
// The time scale (frequency) comes from the 'config' information.
// It might be different from the RTP timestamp frequency (e.g., aacPlus).
unsigned frequencyFromConfig
= samplingFrequencyFromAudioSpecificConfig(fOurSubsession.fmtp_config());
if (frequencyFromConfig != 0) fQTTimeScale = frequencyFromConfig;
} else {
envir() << noCodecWarning1 << "Audio" << noCodecWarning2
<< fOurSubsession.codecName() << noCodecWarning3;
fQTMediaDataAtomCreator = &QuickTimeFileSink::addAtom_dummy;
fQTEnableTrack = False; // disable this track in the movie
}
} else if (strcmp(fOurSubsession.mediumName(), "video") == 0) {
fQTcomponentSubtype = fourChar('v','i','d','e');
fQTcomponentName = "Apple Video Media Handler";
fQTMediaInformationAtomCreator = &QuickTimeFileSink::addAtom_vmhd;
// Make sure that our subsession's codec is one that we can handle:
if (strcmp(fOurSubsession.codecName(), "X-QT") == 0 ||
strcmp(fOurSubsession.codecName(), "X-QUICKTIME") == 0) {
fQTMediaDataAtomCreator = &QuickTimeFileSink::addAtom_genericMedia;
} else if (strcmp(fOurSubsession.codecName(), "H263-1998") == 0 ||
strcmp(fOurSubsession.codecName(), "H263-2000") == 0) {
fQTMediaDataAtomCreator = &QuickTimeFileSink::addAtom_h263;
fQTTimeScale = 600;
fQTTimeUnitsPerSample = fQTTimeScale/fOurSink.fMovieFPS;
} else if (strcmp(fOurSubsession.codecName(), "H264") == 0) {
fQTMediaDataAtomCreator = &QuickTimeFileSink::addAtom_avc1;
fQTTimeScale = 600;
fQTTimeUnitsPerSample = fQTTimeScale/fOurSink.fMovieFPS;
} else if (strcmp(fOurSubsession.codecName(), "MP4V-ES") == 0) {
fQTMediaDataAtomCreator = &QuickTimeFileSink::addAtom_mp4v;
fQTTimeScale = 600;
fQTTimeUnitsPerSample = fQTTimeScale/fOurSink.fMovieFPS;
} else {
envir() << noCodecWarning1 << "Video" << noCodecWarning2
<< fOurSubsession.codecName() << noCodecWarning3;
fQTMediaDataAtomCreator = &QuickTimeFileSink::addAtom_dummy;
fQTEnableTrack = False; // disable this track in the movie
}
} else {
envir() << "Warning: We don't implement a QuickTime Media Handler for media type \""
<< fOurSubsession.mediumName() << "\"";
break;
}
#ifdef QT_SUPPORT_PARTIALLY_ONLY
envir() << "Warning: We don't have sufficient codec-specific information (e.g., sample sizes) to fully generate the \""
<< fOurSubsession.mediumName() << "/" << fOurSubsession.codecName()
<< "\" track, so we'll disable this track in the movie. A separate, codec-specific editing pass will be needed before this track can be played.\n";
fQTEnableTrack = False; // disable this track in the movie
#endif
return True;
} while (0);
envir() << ", so a track for the \"" << fOurSubsession.mediumName()
<< "/" << fOurSubsession.codecName()
<< "\" subsession will not be included in the output QuickTime file\n";
return False;
}
void SubsessionIOState::setFinalQTstate() {
// Compute derived parameters, by running through the list of chunks:
fQTDurationT = 0;
ChunkDescriptor* chunk = fHeadChunk;
while (chunk != NULL) {
unsigned const numFrames = chunk->fNumFrames;
unsigned const dur = numFrames*chunk->fFrameDuration;
fQTDurationT += dur;
chunk = chunk->fNextChunk;
}
// Convert this duration from track to movie time scale:
double scaleFactor = fOurSink.movieTimeScale()/(double)fQTTimeScale;
fQTDurationM = (unsigned)(fQTDurationT*scaleFactor);
if (fQTDurationM > fOurSink.fMaxTrackDurationM) {
fOurSink.fMaxTrackDurationM = fQTDurationM;
}
}
void SubsessionIOState::afterGettingFrame(unsigned packetDataSize,
struct timeval presentationTime) {
// Begin by checking whether there was a gap in the RTP stream.
// If so, try to compensate for this (if desired):
if (fOurSubsession.rtpSource() != NULL) { // we have a RTP stream
unsigned short rtpSeqNum
= fOurSubsession.rtpSource()->curPacketRTPSeqNum();
if (fOurSink.fPacketLossCompensate && fPrevBuffer->bytesInUse() > 0) {
short seqNumGap = rtpSeqNum - fLastPacketRTPSeqNum;
for (short i = 1; i < seqNumGap; ++i) {
// Insert a copy of the previous frame, to compensate for the loss:
useFrame(*fPrevBuffer);
}
}
fLastPacketRTPSeqNum = rtpSeqNum;
}
// Now, continue working with the frame that we just got
fOurSink.noteRecordedFrame(fOurSubsession, packetDataSize, presentationTime);
if (fBuffer->bytesInUse() == 0) {
fBuffer->setPresentationTime(presentationTime);
}
fBuffer->addBytes(packetDataSize);
// If our RTP source is a "QuickTimeGenericRTPSource", then
// use its 'qtState' to set some parameters that we need:
if (fOurSubsession.rtpSource() != NULL // we have a RTP stream
&& fQTMediaDataAtomCreator == &QuickTimeFileSink::addAtom_genericMedia) {
QuickTimeGenericRTPSource* rtpSource
= (QuickTimeGenericRTPSource*)fOurSubsession.rtpSource();
QuickTimeGenericRTPSource::QTState& qtState = rtpSource->qtState;
fQTTimeScale = qtState.timescale;
if (qtState.width != 0) {
fOurSink.fMovieWidth = qtState.width;
}
if (qtState.height != 0) {
fOurSink.fMovieHeight = qtState.height;
}
// Also, if the media type in the "sdAtom" is one that we recognize
// to have special parameters, then set them here:
if (qtState.sdAtomSize >= 8) {
char const* atom = qtState.sdAtom;
unsigned mediaType = fourChar(atom[4],atom[5],atom[6],atom[7]);
switch (mediaType) {
case fourChar('a','g','s','m'): {
fQTBytesPerFrame = 33;
fQTSamplesPerFrame = 160;
break;
}
case fourChar('Q','c','l','p'): {
fQTBytesPerFrame = 35;
fQTSamplesPerFrame = 160;
break;
}
case fourChar('H','c','l','p'): {
fQTBytesPerFrame = 17;
fQTSamplesPerFrame = 160;
break;
}
case fourChar('h','2','6','3'): {
fQTTimeUnitsPerSample = fQTTimeScale/fOurSink.fMovieFPS;
break;
}
}
}
} else if (fQTMediaDataAtomCreator == &QuickTimeFileSink::addAtom_Qclp) {
// For QCELP data, make a note of the frame size (even though it's the
// same as the packet data size), because it varies depending on the
// 'rate' of the stream, and this size gets used later when setting up
// the 'Qclp' QuickTime atom:
fQTBytesPerFrame = packetDataSize;
}
useFrame(*fBuffer);
if (fOurSink.fPacketLossCompensate) {
// Save this frame, in case we need it for recovery:
SubsessionBuffer* tmp = fPrevBuffer; // assert: != NULL
fPrevBuffer = fBuffer;
fBuffer = tmp;
}
fBuffer->reset(); // for the next input
// Now, try getting more frames:
fOurSink.continuePlaying();
}
void SubsessionIOState::useFrame(SubsessionBuffer& buffer) {
unsigned char* const frameSource = buffer.dataStart();
unsigned const frameSize = buffer.bytesInUse();
struct timeval const& presentationTime = buffer.presentationTime();
int64_t const destFileOffset = TellFile64(fOurSink.fOutFid);
unsigned sampleNumberOfFrameStart = fQTTotNumSamples + 1;
Boolean avcHack = fQTMediaDataAtomCreator == &QuickTimeFileSink::addAtom_avc1;
// If we're not syncing streams, or this subsession is not video, then
// just give this frame a fixed duration:
if (!fOurSink.fSyncStreams
|| fQTcomponentSubtype != fourChar('v','i','d','e')) {
unsigned const frameDuration = fQTTimeUnitsPerSample*fQTSamplesPerFrame;
unsigned frameSizeToUse = frameSize;
if (avcHack) frameSizeToUse += 4; // H.264/AVC gets the frame size prefix
fQTTotNumSamples += useFrame1(frameSizeToUse, presentationTime, frameDuration, destFileOffset);
} else {
// For synced video streams, we use the difference between successive
// frames' presentation times as the 'frame duration'. So, record
// information about the *previous* frame:
struct timeval const& ppt = fPrevFrameState.presentationTime; //abbrev
if (ppt.tv_sec != 0 || ppt.tv_usec != 0) {
// There has been a previous frame.
double duration = (presentationTime.tv_sec - ppt.tv_sec)
+ (presentationTime.tv_usec - ppt.tv_usec)/1000000.0;
if (duration < 0.0) duration = 0.0;
unsigned frameDuration
= (unsigned)((2*duration*fQTTimeScale+1)/2); // round
unsigned frameSizeToUse = fPrevFrameState.frameSize;
if (avcHack) frameSizeToUse += 4; // H.264/AVC gets the frame size prefix
unsigned numSamples
= useFrame1(frameSizeToUse, ppt, frameDuration, fPrevFrameState.destFileOffset);
fQTTotNumSamples += numSamples;
sampleNumberOfFrameStart = fQTTotNumSamples + 1;
}
if (avcHack && (*frameSource == H264_IDR_FRAME)) {
SyncFrame* newSyncFrame = new SyncFrame(fQTTotNumSamples + 1);
if (fTailSyncFrame == NULL) {
fHeadSyncFrame = newSyncFrame;
} else {
fTailSyncFrame->nextSyncFrame = newSyncFrame;
}
fTailSyncFrame = newSyncFrame;
}
// Remember the current frame for next time:
fPrevFrameState.frameSize = frameSize;
fPrevFrameState.presentationTime = presentationTime;
fPrevFrameState.destFileOffset = destFileOffset;
}
if (avcHack) fOurSink.addWord(frameSize);
// Write the data into the file:
fwrite(frameSource, 1, frameSize, fOurSink.fOutFid);
// If we have a hint track, then write to it also (only if we have a RTP stream):
if (hasHintTrack() && fOurSubsession.rtpSource() != NULL) {
// Because presentation times are used for RTP packet timestamps,
// we don't start writing to the hint track until we've been synced:
if (!fHaveBeenSynced) {
fHaveBeenSynced = fOurSubsession.rtpSource()->hasBeenSynchronizedUsingRTCP();
}
if (fHaveBeenSynced) {
fHintTrackForUs->useFrameForHinting(frameSize, presentationTime,
sampleNumberOfFrameStart);
}
}
}
void SubsessionIOState::useFrameForHinting(unsigned frameSize,
struct timeval presentationTime,
unsigned startSampleNumber) {
// At this point, we have a single, combined frame - not individual packets.
// For the hint track, we need to split the frame back up into separate packets.
// However, for some RTP sources, we also need to reuse the special
// header bytes that were at the start of each of the RTP packets.
Boolean hack263 = strcmp(fOurSubsession.codecName(), "H263-1998") == 0;
Boolean hackm4a_generic = strcmp(fOurSubsession.mediumName(), "audio") == 0
&& strcmp(fOurSubsession.codecName(), "MPEG4-GENERIC") == 0;
Boolean hackm4a_latm = strcmp(fOurSubsession.mediumName(), "audio") == 0
&& strcmp(fOurSubsession.codecName(), "MP4A-LATM") == 0;
Boolean hackm4a = hackm4a_generic || hackm4a_latm;
Boolean haveSpecialHeaders = (hack263 || hackm4a_generic);
// If there has been a previous frame, then output a 'hint sample' for it.
// (We use the current frame's presentation time to compute the previous
// hint sample's duration.)
RTPSource* const rs = fOurSubsession.rtpSource(); // abbrev (ASSERT: != NULL)
struct timeval const& ppt = fPrevFrameState.presentationTime; //abbrev
if (ppt.tv_sec != 0 || ppt.tv_usec != 0) {
double duration = (presentationTime.tv_sec - ppt.tv_sec)
+ (presentationTime.tv_usec - ppt.tv_usec)/1000000.0;
if (duration < 0.0) duration = 0.0;
unsigned msDuration = (unsigned)(duration*1000); // milliseconds
if (msDuration > fHINF.dmax) fHINF.dmax = msDuration;
unsigned hintSampleDuration
= (unsigned)((2*duration*fQTTimeScale+1)/2); // round
if (hackm4a) {
// Because multiple AAC frames can appear in a RTP packet, the presentation
// times of the second and subsequent frames will not be accurate.
// So, use the known "hintSampleDuration" instead:
hintSampleDuration = fTrackHintedByUs->fQTTimeUnitsPerSample;
// Also, if the 'time scale' was different from the RTP timestamp frequency,
// (as can happen with aacPlus), then we need to scale "hintSampleDuration"
// accordingly:
if (fTrackHintedByUs->fQTTimeScale != fOurSubsession.rtpTimestampFrequency()) {
unsigned const scalingFactor
= fOurSubsession.rtpTimestampFrequency()/fTrackHintedByUs->fQTTimeScale ;
hintSampleDuration *= scalingFactor;
}
}
int64_t const hintSampleDestFileOffset = TellFile64(fOurSink.fOutFid);
unsigned const maxPacketSize = 1450;
unsigned short numPTEntries
= (fPrevFrameState.frameSize + (maxPacketSize-1))/maxPacketSize; // normal case
unsigned char* immediateDataPtr = NULL;
unsigned immediateDataBytesRemaining = 0;
if (haveSpecialHeaders) { // special case
numPTEntries = fPrevFrameState.numSpecialHeaders;
immediateDataPtr = fPrevFrameState.specialHeaderBytes;
immediateDataBytesRemaining
= fPrevFrameState.specialHeaderBytesLength;
}
unsigned hintSampleSize
= fOurSink.addHalfWord(numPTEntries);// Entry count
hintSampleSize += fOurSink.addHalfWord(0x0000); // Reserved
unsigned offsetWithinSample = 0;
for (unsigned i = 0; i < numPTEntries; ++i) {
// Output a Packet Table entry (representing a single RTP packet):
unsigned short numDTEntries = 1;
unsigned short seqNum = fPrevFrameState.seqNum++;
// Note: This assumes that the input stream had no packets lost #####
unsigned rtpHeader = fPrevFrameState.rtpHeader;
if (i+1 < numPTEntries) {
// This is not the last RTP packet, so clear the marker bit:
rtpHeader &=~ (1<<23);
}
unsigned dataFrameSize = (i+1 < numPTEntries)
? maxPacketSize : fPrevFrameState.frameSize - i*maxPacketSize; // normal case
unsigned sampleNumber = fPrevFrameState.startSampleNumber;
unsigned char immediateDataLen = 0;
if (haveSpecialHeaders) { // special case
++numDTEntries; // to include a Data Table entry for the special hdr
if (immediateDataBytesRemaining > 0) {
if (hack263) {
immediateDataLen = *immediateDataPtr++;
--immediateDataBytesRemaining;
if (immediateDataLen > immediateDataBytesRemaining) {
// shouldn't happen (length byte was bad)
immediateDataLen = immediateDataBytesRemaining;
}
} else {
immediateDataLen = fPrevFrameState.specialHeaderBytesLength;
}
}
dataFrameSize = fPrevFrameState.packetSizes[i] - immediateDataLen;
if (hack263) {
Boolean PbitSet
= immediateDataLen >= 1 && (immediateDataPtr[0]&0x4) != 0;
if (PbitSet) {
offsetWithinSample += 2; // to omit the two leading 0 bytes
}
}
}
// Output the Packet Table:
hintSampleSize += fOurSink.addWord(0); // Relative transmission time
hintSampleSize += fOurSink.addWord(rtpHeader|seqNum);
// RTP header info + RTP sequence number
hintSampleSize += fOurSink.addHalfWord(0x0000); // Flags
hintSampleSize += fOurSink.addHalfWord(numDTEntries); // Entry count
unsigned totalPacketSize = 0;
// Output the Data Table:
if (haveSpecialHeaders) {
// use the "Immediate Data" format (1):
hintSampleSize += fOurSink.addByte(1); // Source
unsigned char len = immediateDataLen > 14 ? 14 : immediateDataLen;
hintSampleSize += fOurSink.addByte(len); // Length
totalPacketSize += len; fHINF.dimm += len;
unsigned char j;
for (j = 0; j < len; ++j) {
hintSampleSize += fOurSink.addByte(immediateDataPtr[j]); // Data
}
for (j = len; j < 14; ++j) {
hintSampleSize += fOurSink.addByte(0); // Data (padding)
}
immediateDataPtr += immediateDataLen;
immediateDataBytesRemaining -= immediateDataLen;
}
// use the "Sample Data" format (2):
hintSampleSize += fOurSink.addByte(2); // Source
hintSampleSize += fOurSink.addByte(0); // Track ref index
hintSampleSize += fOurSink.addHalfWord(dataFrameSize); // Length
totalPacketSize += dataFrameSize; fHINF.dmed += dataFrameSize;
hintSampleSize += fOurSink.addWord(sampleNumber); // Sample number
hintSampleSize += fOurSink.addWord(offsetWithinSample); // Offset
// Get "bytes|samples per compression block" from the hinted track:
unsigned short const bytesPerCompressionBlock
= fTrackHintedByUs->fQTBytesPerFrame;
unsigned short const samplesPerCompressionBlock
= fTrackHintedByUs->fQTSamplesPerFrame;
hintSampleSize += fOurSink.addHalfWord(bytesPerCompressionBlock);
hintSampleSize += fOurSink.addHalfWord(samplesPerCompressionBlock);
offsetWithinSample += dataFrameSize;// for the next iteration (if any)
// Tally statistics for this packet:
fHINF.nump += 1;
fHINF.tpyl += totalPacketSize;
totalPacketSize += 12; // add in the size of the RTP header
fHINF.trpy += totalPacketSize;
if (totalPacketSize > fHINF.pmax) fHINF.pmax = totalPacketSize;
}
// Make note of this completed hint sample frame:
fQTTotNumSamples += useFrame1(hintSampleSize, ppt, hintSampleDuration,
hintSampleDestFileOffset);
}
// Remember this frame for next time:
fPrevFrameState.frameSize = frameSize;
fPrevFrameState.presentationTime = presentationTime;
fPrevFrameState.startSampleNumber = startSampleNumber;
fPrevFrameState.rtpHeader
= rs->curPacketMarkerBit()<<23
| (rs->rtpPayloadFormat()&0x7F)<<16;
if (hack263) {
H263plusVideoRTPSource* rs_263 = (H263plusVideoRTPSource*)rs;
fPrevFrameState.numSpecialHeaders = rs_263->fNumSpecialHeaders;
fPrevFrameState.specialHeaderBytesLength = rs_263->fSpecialHeaderBytesLength;
unsigned i;
for (i = 0; i < rs_263->fSpecialHeaderBytesLength; ++i) {
fPrevFrameState.specialHeaderBytes[i] = rs_263->fSpecialHeaderBytes[i];
}
for (i = 0; i < rs_263->fNumSpecialHeaders; ++i) {
fPrevFrameState.packetSizes[i] = rs_263->fPacketSizes[i];
}
} else if (hackm4a_generic) {
// Synthesize a special header, so that this frame can be in its own RTP packet.
unsigned const sizeLength = fOurSubsession.attrVal_unsigned("sizelength");
unsigned const indexLength = fOurSubsession.attrVal_unsigned("indexlength");
if (sizeLength + indexLength != 16) {
envir() << "Warning: unexpected 'sizeLength' " << sizeLength
<< " and 'indexLength' " << indexLength
<< " seen when creating hint track\n";
}
fPrevFrameState.numSpecialHeaders = 1;
fPrevFrameState.specialHeaderBytesLength = 4;
fPrevFrameState.specialHeaderBytes[0] = 0; // AU_headers_length (high byte)
fPrevFrameState.specialHeaderBytes[1] = 16; // AU_headers_length (low byte)
fPrevFrameState.specialHeaderBytes[2] = ((frameSize<<indexLength)&0xFF00)>>8;
fPrevFrameState.specialHeaderBytes[3] = (frameSize<<indexLength);
fPrevFrameState.packetSizes[0] = frameSize;
}
}
unsigned SubsessionIOState::useFrame1(unsigned sourceDataSize,
struct timeval presentationTime,
unsigned frameDuration,
int64_t destFileOffset) {
// Figure out the actual frame size for this data:
unsigned frameSize = fQTBytesPerFrame;
if (frameSize == 0) {
// The entire packet data is assumed to be a single frame:
frameSize = sourceDataSize;
}
unsigned const numFrames = sourceDataSize/frameSize;
unsigned const numSamples = numFrames*fQTSamplesPerFrame;
// Record the information about which 'chunk' this data belongs to:
ChunkDescriptor* newTailChunk;
if (fTailChunk == NULL) {
newTailChunk = fHeadChunk
= new ChunkDescriptor(destFileOffset, sourceDataSize,
frameSize, frameDuration, presentationTime);
} else {
newTailChunk = fTailChunk->extendChunk(destFileOffset, sourceDataSize,
frameSize, frameDuration,
presentationTime);
}
if (newTailChunk != fTailChunk) {
// This data created a new chunk, rather than extending the old one
++fNumChunks;
fTailChunk = newTailChunk;
}
return numSamples;
}
void SubsessionIOState::onSourceClosure() {
fOurSourceIsActive = False;
fOurSink.onSourceClosure1();
}
Boolean SubsessionIOState::syncOK(struct timeval presentationTime) {
QuickTimeFileSink& s = fOurSink; // abbreviation
if (!s.fSyncStreams || fOurSubsession.rtpSource() == NULL) return True; // we don't care
if (s.fNumSyncedSubsessions < s.fNumSubsessions) {
// Not all subsessions have yet been synced. Check whether ours was
// one of the unsynced ones, and, if so, whether it is now synced:
if (!fHaveBeenSynced) {
// We weren't synchronized before
if (fOurSubsession.rtpSource()->hasBeenSynchronizedUsingRTCP()) {
// Is this an H.264 video track?
if (fQTMediaDataAtomCreator == &QuickTimeFileSink::addAtom_avc1) {
// special case: audio + H264 video: wait until audio is in sync
if ((s.fNumSubsessions == 2) && (s.fNumSyncedSubsessions < (s.fNumSubsessions - 1))) return False;
// if audio is in sync, wait for the next IDR frame to start
unsigned char* const frameSource = fBuffer->dataStart();
if (*frameSource != H264_IDR_FRAME) return False;
}
// But now we are
fHaveBeenSynced = True;
fSyncTime = presentationTime;
++s.fNumSyncedSubsessions;
if (timevalGE(fSyncTime, s.fNewestSyncTime)) {
s.fNewestSyncTime = fSyncTime;
}
}
}
}
// Check again whether all subsessions have been synced:
if (s.fNumSyncedSubsessions < s.fNumSubsessions) return False;
// Allow this data if it is more recent than the newest sync time:
return timevalGE(presentationTime, s.fNewestSyncTime);
}
void SubsessionIOState::setHintTrack(SubsessionIOState* hintedTrack,
SubsessionIOState* hintTrack) {
if (hintedTrack != NULL) hintedTrack->fHintTrackForUs = hintTrack;
if (hintTrack != NULL) hintTrack->fTrackHintedByUs = hintedTrack;
}
SyncFrame::SyncFrame(unsigned frameNum)
: nextSyncFrame(NULL), sfFrameNum(frameNum) {
}
void Count64::operator+=(unsigned arg) {
unsigned newLo = lo + arg;
if (newLo < lo) { // lo has overflowed
++hi;
}
lo = newLo;
}
ChunkDescriptor
::ChunkDescriptor(int64_t offsetInFile, unsigned size,
unsigned frameSize, unsigned frameDuration,
struct timeval presentationTime)
: fNextChunk(NULL), fOffsetInFile(offsetInFile),
fNumFrames(size/frameSize),
fFrameSize(frameSize), fFrameDuration(frameDuration),
fPresentationTime(presentationTime) {
}
ChunkDescriptor* ChunkDescriptor
::extendChunk(int64_t newOffsetInFile, unsigned newSize,
unsigned newFrameSize, unsigned newFrameDuration,
struct timeval newPresentationTime) {
// First, check whether the new space is just at the end of this
// existing chunk:
if (newOffsetInFile == fOffsetInFile + fNumFrames*fFrameSize) {
// We can extend this existing chunk, provided that the frame size
// and frame duration have not changed:
if (newFrameSize == fFrameSize && newFrameDuration == fFrameDuration) {
fNumFrames += newSize/fFrameSize;
return this;
}
}
// We'll allocate a new ChunkDescriptor, and link it to the end of us:
ChunkDescriptor* newDescriptor
= new ChunkDescriptor(newOffsetInFile, newSize,
newFrameSize, newFrameDuration,
newPresentationTime);
fNextChunk = newDescriptor;
return newDescriptor;
}
////////// QuickTime-specific implementation //////////
unsigned QuickTimeFileSink::addWord64(u_int64_t word) {
addByte((unsigned char)(word>>56)); addByte((unsigned char)(word>>48));
addByte((unsigned char)(word>>40)); addByte((unsigned char)(word>>32));
addByte((unsigned char)(word>>24)); addByte((unsigned char)(word>>16));
addByte((unsigned char)(word>>8)); addByte((unsigned char)(word));
return 8;
}
unsigned QuickTimeFileSink::addWord(unsigned word) {
addByte(word>>24); addByte(word>>16);
addByte(word>>8); addByte(word);
return 4;
}
unsigned QuickTimeFileSink::addHalfWord(unsigned short halfWord) {
addByte((unsigned char)(halfWord>>8)); addByte((unsigned char)halfWord);
return 2;
}
unsigned QuickTimeFileSink::addZeroWords(unsigned numWords) {
for (unsigned i = 0; i < numWords; ++i) {
addWord(0);
}
return numWords*4;
}
unsigned QuickTimeFileSink::add4ByteString(char const* str) {
addByte(str[0]); addByte(str[1]); addByte(str[2]); addByte(str[3]);
return 4;
}
unsigned QuickTimeFileSink::addArbitraryString(char const* str,
Boolean oneByteLength) {
unsigned size = 0;
if (oneByteLength) {
// Begin with a byte containing the string length:
unsigned strLength = strlen(str);
if (strLength >= 256) {
envir() << "QuickTimeFileSink::addArbitraryString(\""
<< str << "\") saw string longer than we know how to handle ("
<< strLength << ")\n";
}
size += addByte((unsigned char)strLength);
}
while (*str != '\0') {
size += addByte(*str++);
}
return size;
}
unsigned QuickTimeFileSink::addAtomHeader(char const* atomName) {
// Output a placeholder for the 4-byte size:
addWord(0);
// Output the 4-byte atom name:
add4ByteString(atomName);
return 8;
}
unsigned QuickTimeFileSink::addAtomHeader64(char const* atomName) {
// Output a 64-bit size marker:
addWord(1);
// Output the 4-byte atom name:
add4ByteString(atomName);
addWord64(0);
return 16;
}
void QuickTimeFileSink::setWord(int64_t filePosn, unsigned size) {
do {
if (SeekFile64(fOutFid, filePosn, SEEK_SET) < 0) break;
addWord(size);
if (SeekFile64(fOutFid, 0, SEEK_END) < 0) break; // go back to where we were
return;
} while (0);
// One of the SeekFile64()s failed, probably because the output is not a seekable file
envir() << "QuickTimeFileSink::setWord(): SeekFile64 failed (err "
<< envir().getErrno() << ")\n";
}
void QuickTimeFileSink::setWord64(int64_t filePosn, u_int64_t size) {
do {
if (SeekFile64(fOutFid, filePosn, SEEK_SET) < 0) break;
addWord64(size);
if (SeekFile64(fOutFid, 0, SEEK_END) < 0) break; // go back to where we were
return;
} while (0);
// One of the SeekFile64()s failed, probably because the output is not a seekable file
envir() << "QuickTimeFileSink::setWord64(): SeekFile64 failed (err "
<< envir().getErrno() << ")\n";
}
// Methods for writing particular atoms. Note the following macros:
#define addAtom(name) \
unsigned QuickTimeFileSink::addAtom_##name() { \
int64_t initFilePosn = TellFile64(fOutFid); \
unsigned size = addAtomHeader("" #name "")
#define addAtomEnd \
setWord(initFilePosn, size); \
return size; \
}
addAtom(ftyp);
size += add4ByteString("mp42");
size += addWord(0x00000000);
size += add4ByteString("mp42");
size += add4ByteString("isom");
addAtomEnd;
addAtom(moov);
size += addAtom_mvhd();
if (fGenerateMP4Format) {
size += addAtom_iods();
}
// Add a 'trak' atom for each subsession:
// (For some unknown reason, QuickTime Player (5.0 at least)
// doesn't display the movie correctly unless the audio track
// (if present) appears before the video track. So ensure this here.)
MediaSubsessionIterator iter(fInputSession);
MediaSubsession* subsession;
while ((subsession = iter.next()) != NULL) {
fCurrentIOState = (SubsessionIOState*)(subsession->miscPtr);
if (fCurrentIOState == NULL) continue;
if (strcmp(subsession->mediumName(), "audio") != 0) continue;
size += addAtom_trak();
if (fCurrentIOState->hasHintTrack()) {
// This track has a hint track; output it also:
fCurrentIOState = fCurrentIOState->fHintTrackForUs;
size += addAtom_trak();
}
}
iter.reset();
while ((subsession = iter.next()) != NULL) {
fCurrentIOState = (SubsessionIOState*)(subsession->miscPtr);
if (fCurrentIOState == NULL) continue;
if (strcmp(subsession->mediumName(), "audio") == 0) continue;
size += addAtom_trak();
if (fCurrentIOState->hasHintTrack()) {
// This track has a hint track; output it also:
fCurrentIOState = fCurrentIOState->fHintTrackForUs;
size += addAtom_trak();
}
}
addAtomEnd;
addAtom(mvhd);
size += addWord(0x00000000); // Version + Flags
size += addWord(fAppleCreationTime); // Creation time
size += addWord(fAppleCreationTime); // Modification time
// For the "Time scale" field, use the largest RTP timestamp frequency
// that we saw in any of the subsessions.
size += addWord(movieTimeScale()); // Time scale
unsigned const duration = fMaxTrackDurationM;
fMVHD_durationPosn = TellFile64(fOutFid);
size += addWord(duration); // Duration
size += addWord(0x00010000); // Preferred rate
size += addWord(0x01000000); // Preferred volume + Reserved[0]
size += addZeroWords(2); // Reserved[1-2]
size += addWord(0x00010000); // matrix top left corner
size += addZeroWords(3); // matrix
size += addWord(0x00010000); // matrix center
size += addZeroWords(3); // matrix
size += addWord(0x40000000); // matrix bottom right corner
size += addZeroWords(6); // various time fields
size += addWord(SubsessionIOState::fCurrentTrackNumber+1);// Next track ID
addAtomEnd;
addAtom(iods);
size += addWord(0x00000000); // Version + Flags
size += addWord(0x10808080);
size += addWord(0x07004FFF);
size += addWord(0xFF0FFFFF);
addAtomEnd;
addAtom(trak);
size += addAtom_tkhd();
// If we're synchronizing the media streams (or are a hint track),
// add an edit list that helps do this:
if (fCurrentIOState->fHeadChunk != NULL
&& (fSyncStreams || fCurrentIOState->isHintTrack())) {
size += addAtom_edts();
}
// If we're generating a hint track, add a 'tref' atom:
if (fCurrentIOState->isHintTrack()) size += addAtom_tref();
size += addAtom_mdia();
// If we're generating a hint track, add a 'udta' atom:
if (fCurrentIOState->isHintTrack()) size += addAtom_udta();
addAtomEnd;
addAtom(tkhd);
if (fCurrentIOState->fQTEnableTrack) {
size += addWord(0x0000000F); // Version + Flags
} else {
// Disable this track in the movie:
size += addWord(0x00000000); // Version + Flags
}
size += addWord(fAppleCreationTime); // Creation time
size += addWord(fAppleCreationTime); // Modification time
size += addWord(fCurrentIOState->fTrackID); // Track ID
size += addWord(0x00000000); // Reserved
unsigned const duration = fCurrentIOState->fQTDurationM; // movie units
fCurrentIOState->fTKHD_durationPosn = TellFile64(fOutFid);
size += addWord(duration); // Duration
size += addZeroWords(3); // Reserved+Layer+Alternate grp
size += addWord(0x01000000); // Volume + Reserved
size += addWord(0x00010000); // matrix top left corner
size += addZeroWords(3); // matrix
size += addWord(0x00010000); // matrix center
size += addZeroWords(3); // matrix
size += addWord(0x40000000); // matrix bottom right corner
if (strcmp(fCurrentIOState->fOurSubsession.mediumName(), "video") == 0) {
size += addWord(fMovieWidth<<16); // Track width
size += addWord(fMovieHeight<<16); // Track height
} else {
size += addZeroWords(2); // not video: leave width and height fields zero
}
addAtomEnd;
addAtom(edts);
size += addAtom_elst();
addAtomEnd;
#define addEdit1(duration,trackPosition) do { \
unsigned trackDuration \
= (unsigned) ((2*(duration)*movieTimeScale()+1)/2); \
/* in movie time units */ \
size += addWord(trackDuration); /* Track duration */ \
totalDurationOfEdits += trackDuration; \
size += addWord(trackPosition); /* Media time */ \
size += addWord(0x00010000); /* Media rate (1x) */ \
++numEdits; \
} while (0)
#define addEdit(duration) addEdit1((duration),editTrackPosition)
#define addEmptyEdit(duration) addEdit1((duration),(~0))
addAtom(elst);
size += addWord(0x00000000); // Version + Flags
// Add a dummy "Number of entries" field
// (and remember its position). We'll fill this field in later:
int64_t numEntriesPosition = TellFile64(fOutFid);
size += addWord(0); // dummy for "Number of entries"
unsigned numEdits = 0;
unsigned totalDurationOfEdits = 0; // in movie time units
// Run through our chunks, looking at their presentation times.
// From these, figure out the edits that need to be made to keep
// the track media data in sync with the presentation times.
double const syncThreshold = 0.1; // 100 ms
// don't allow the track to get out of sync by more than this
struct timeval editStartTime = fFirstDataTime;
unsigned editTrackPosition = 0;
unsigned currentTrackPosition = 0;
double trackDurationOfEdit = 0.0;
unsigned chunkDuration = 0;
ChunkDescriptor* chunk = fCurrentIOState->fHeadChunk;
while (chunk != NULL) {
struct timeval const& chunkStartTime = chunk->fPresentationTime;
double movieDurationOfEdit
= (chunkStartTime.tv_sec - editStartTime.tv_sec)
+ (chunkStartTime.tv_usec - editStartTime.tv_usec)/1000000.0;
trackDurationOfEdit = (currentTrackPosition-editTrackPosition)
/ (double)(fCurrentIOState->fQTTimeScale);
double outOfSync = movieDurationOfEdit - trackDurationOfEdit;
if (outOfSync > syncThreshold) {
// The track's data is too short, so end this edit, add a new
// 'empty' edit after it, and start a new edit
// (at the current track posn.):
if (trackDurationOfEdit > 0.0) addEdit(trackDurationOfEdit);
addEmptyEdit(outOfSync);
editStartTime = chunkStartTime;
editTrackPosition = currentTrackPosition;
} else if (outOfSync < -syncThreshold) {
// The track's data is too long, so end this edit, and start
// a new edit (pointing at the current track posn.):
if (movieDurationOfEdit > 0.0) addEdit(movieDurationOfEdit);
editStartTime = chunkStartTime;
editTrackPosition = currentTrackPosition;
}
// Note the duration of this chunk:
unsigned numChannels = fCurrentIOState->fOurSubsession.numChannels();
chunkDuration = chunk->fNumFrames*chunk->fFrameDuration/numChannels;
currentTrackPosition += chunkDuration;
chunk = chunk->fNextChunk;
}
// Write out the final edit
trackDurationOfEdit
+= (double)chunkDuration/fCurrentIOState->fQTTimeScale;
if (trackDurationOfEdit > 0.0) addEdit(trackDurationOfEdit);
// Now go back and fill in the "Number of entries" field:
setWord(numEntriesPosition, numEdits);
// Also, if the sum of all of the edit durations exceeds the
// track duration that we already computed (from sample durations),
// then reset the track duration to this new value:
if (totalDurationOfEdits > fCurrentIOState->fQTDurationM) {
fCurrentIOState->fQTDurationM = totalDurationOfEdits;
setWord(fCurrentIOState->fTKHD_durationPosn, totalDurationOfEdits);
// Also, check whether the overall movie duration needs to change:
if (totalDurationOfEdits > fMaxTrackDurationM) {
fMaxTrackDurationM = totalDurationOfEdits;
setWord(fMVHD_durationPosn, totalDurationOfEdits);
}
// Also, convert to track time scale:
double scaleFactor
= fCurrentIOState->fQTTimeScale/(double)movieTimeScale();
fCurrentIOState->fQTDurationT
= (unsigned)(totalDurationOfEdits*scaleFactor);
}
addAtomEnd;
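// The edit-list loop above decides, per chunk, whether the track data has
// drifted out of sync with the presentation times. As an illustrative sketch
// (not part of live555; the helper name is hypothetical), the decision can
// be isolated like this, using the same 100 ms threshold:

```cpp
#include <cassert>

// Classify the drift between the movie-time duration of the current edit and
// the track-time duration. Returns +1 when the track data is too short (an
// 'empty' edit is needed), -1 when it is too long (the edit must be cut), and
// 0 when the track is still within tolerance.
static int classifyEditDrift(double movieDurationOfEdit,
                             double trackDurationOfEdit,
                             double syncThreshold = 0.1) {
  double outOfSync = movieDurationOfEdit - trackDurationOfEdit;
  if (outOfSync > syncThreshold) return 1;   // track too short
  if (outOfSync < -syncThreshold) return -1; // track too long
  return 0;                                  // in sync
}
```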
addAtom(tref);
size += addAtom_hint();
addAtomEnd;
addAtom(hint);
SubsessionIOState* hintedTrack = fCurrentIOState->fTrackHintedByUs;
// Assert: hintedTrack != NULL
size += addWord(hintedTrack->fTrackID);
addAtomEnd;
addAtom(mdia);
size += addAtom_mdhd();
size += addAtom_hdlr();
size += addAtom_minf();
addAtomEnd;
addAtom(mdhd);
size += addWord(0x00000000); // Version + Flags
size += addWord(fAppleCreationTime); // Creation time
size += addWord(fAppleCreationTime); // Modification time
unsigned const timeScale = fCurrentIOState->fQTTimeScale;
size += addWord(timeScale); // Time scale
unsigned const duration = fCurrentIOState->fQTDurationT; // track units
size += addWord(duration); // Duration
size += addWord(0x00000000); // Language+Quality
addAtomEnd;
addAtom(hdlr);
size += addWord(0x00000000); // Version + Flags
size += add4ByteString("mhlr"); // Component type
size += addWord(fCurrentIOState->fQTcomponentSubtype);
// Component subtype
size += add4ByteString("appl"); // Component manufacturer
size += addWord(0x00000000); // Component flags
size += addWord(0x00000000); // Component flags mask
size += addArbitraryString(fCurrentIOState->fQTcomponentName);
// Component name
addAtomEnd;
addAtom(minf);
SubsessionIOState::atomCreationFunc mediaInformationAtomCreator
= fCurrentIOState->fQTMediaInformationAtomCreator;
size += (this->*mediaInformationAtomCreator)();
size += addAtom_hdlr2();
size += addAtom_dinf();
size += addAtom_stbl();
addAtomEnd;
addAtom(smhd);
size += addZeroWords(2); // Version+Flags+Balance+Reserved
addAtomEnd;
addAtom(vmhd);
size += addWord(0x00000001); // Version + Flags
size += addWord(0x00408000); // Graphics mode + Opcolor[red]
size += addWord(0x80008000); // Opcolor[green] + Opcolor[blue]
addAtomEnd;
addAtom(gmhd);
size += addAtom_gmin();
addAtomEnd;
addAtom(gmin);
size += addWord(0x00000000); // Version + Flags
// The following fields probably aren't used for hint tracks, so just
// use values that I've seen in other files:
size += addWord(0x00408000); // Graphics mode + Opcolor (1st 2 bytes)
size += addWord(0x80008000); // Opcolor (last 4 bytes)
size += addWord(0x00000000); // Balance + Reserved
addAtomEnd;
unsigned QuickTimeFileSink::addAtom_hdlr2() {
int64_t initFilePosn = TellFile64(fOutFid);
unsigned size = addAtomHeader("hdlr");
size += addWord(0x00000000); // Version + Flags
size += add4ByteString("dhlr"); // Component type
size += add4ByteString("alis"); // Component subtype
size += add4ByteString("appl"); // Component manufacturer
size += addZeroWords(2); // Component flags+Component flags mask
size += addArbitraryString("Apple Alias Data Handler"); // Component name
addAtomEnd;
addAtom(dinf);
size += addAtom_dref();
addAtomEnd;
addAtom(dref);
size += addWord(0x00000000); // Version + Flags
size += addWord(0x00000001); // Number of entries
size += addAtom_alis();
addAtomEnd;
addAtom(alis);
size += addWord(0x00000001); // Version + Flags
addAtomEnd;
addAtom(stbl);
size += addAtom_stsd();
size += addAtom_stts();
if (fCurrentIOState->fQTcomponentSubtype == fourChar('v','i','d','e')) {
size += addAtom_stss(); // only for video streams
}
size += addAtom_stsc();
size += addAtom_stsz();
size += addAtom_co64();
addAtomEnd;
addAtom(stsd);
size += addWord(0x00000000); // Version+Flags
size += addWord(0x00000001); // Number of entries
SubsessionIOState::atomCreationFunc mediaDataAtomCreator
= fCurrentIOState->fQTMediaDataAtomCreator;
size += (this->*mediaDataAtomCreator)();
addAtomEnd;
unsigned QuickTimeFileSink::addAtom_genericMedia() {
int64_t initFilePosn = TellFile64(fOutFid);
// Our source is assumed to be a "QuickTimeGenericRTPSource"
// Use its "sdAtom" state for our contents:
QuickTimeGenericRTPSource* rtpSource = (QuickTimeGenericRTPSource*)
fCurrentIOState->fOurSubsession.rtpSource();
unsigned size = 0;
if (rtpSource != NULL) {
QuickTimeGenericRTPSource::QTState& qtState = rtpSource->qtState;
char const* from = qtState.sdAtom;
size = qtState.sdAtomSize;
for (unsigned i = 0; i < size; ++i) addByte(from[i]);
}
addAtomEnd;
unsigned QuickTimeFileSink::addAtom_soundMediaGeneral() {
int64_t initFilePosn = TellFile64(fOutFid);
unsigned size = addAtomHeader(fCurrentIOState->fQTAudioDataType);
// General sample description fields:
size += addWord(0x00000000); // Reserved
size += addWord(0x00000001); // Reserved+Data reference index
// Sound sample description fields:
unsigned short const version = fCurrentIOState->fQTSoundSampleVersion;
size += addWord(version<<16); // Version+Revision level
size += addWord(0x00000000); // Vendor
unsigned short numChannels
= (unsigned short)(fCurrentIOState->fOurSubsession.numChannels());
size += addHalfWord(numChannels); // Number of channels
size += addHalfWord(0x0010); // Sample size
// size += addWord(0x00000000); // Compression ID+Packet size
size += addWord(0xfffe0000); // Compression ID+Packet size #####
unsigned const sampleRateFixedPoint = fCurrentIOState->fQTTimeScale << 16;
size += addWord(sampleRateFixedPoint); // Sample rate
addAtomEnd;
unsigned QuickTimeFileSink::addAtom_Qclp() {
// The beginning of this atom looks just like a general Sound Media atom,
// except with a version field of 1:
int64_t initFilePosn = TellFile64(fOutFid);
fCurrentIOState->fQTAudioDataType = "Qclp";
fCurrentIOState->fQTSoundSampleVersion = 1;
unsigned size = addAtom_soundMediaGeneral();
// Next, add the four fields that are particular to version 1:
// (Later, parameterize these #####)
size += addWord(0x000000a0); // samples per packet
size += addWord(0x00000000); // ???
size += addWord(0x00000000); // ???
size += addWord(0x00000002); // bytes per sample (uncompressed)
// Other special fields are in a 'wave' atom that follows:
size += addAtom_wave();
addAtomEnd;
addAtom(wave);
size += addAtom_frma();
if (strcmp(fCurrentIOState->fQTAudioDataType, "Qclp") == 0) {
size += addWord(0x00000014); // ???
size += add4ByteString("Qclp"); // ???
if (fCurrentIOState->fQTBytesPerFrame == 35) {
size += addAtom_Fclp(); // full-rate QCELP
} else {
size += addAtom_Hclp(); // half-rate QCELP
} // what about other QCELP 'rates'??? #####
size += addWord(0x00000008); // ???
size += addWord(0x00000000); // ???
size += addWord(0x00000000); // ???
size += addWord(0x00000008); // ???
} else if (strcmp(fCurrentIOState->fQTAudioDataType, "mp4a") == 0) {
size += addWord(0x0000000c); // ???
size += add4ByteString("mp4a"); // ???
size += addWord(0x00000000); // ???
size += addAtom_esds(); // ESDescriptor
size += addWord(0x00000008); // ???
size += addWord(0x00000000); // ???
}
addAtomEnd;
addAtom(frma);
size += add4ByteString(fCurrentIOState->fQTAudioDataType); // ???
addAtomEnd;
addAtom(Fclp);
size += addWord(0x00000000); // ???
addAtomEnd;
addAtom(Hclp);
size += addWord(0x00000000); // ???
addAtomEnd;
unsigned QuickTimeFileSink::addAtom_mp4a() {
unsigned size = 0;
// The beginning of this atom looks just like a general Sound Media atom,
// except with a version field of 1:
int64_t initFilePosn = TellFile64(fOutFid);
fCurrentIOState->fQTAudioDataType = "mp4a";
if (fGenerateMP4Format) {
fCurrentIOState->fQTSoundSampleVersion = 0;
size = addAtom_soundMediaGeneral();
size += addAtom_esds();
} else {
fCurrentIOState->fQTSoundSampleVersion = 1;
size = addAtom_soundMediaGeneral();
// Next, add the four fields that are particular to version 1:
// (Later, parameterize these #####)
size += addWord(fCurrentIOState->fQTTimeUnitsPerSample);
size += addWord(0x00000001); // ???
size += addWord(0x00000001); // ???
size += addWord(0x00000002); // bytes per sample (uncompressed)
// Other special fields are in a 'wave' atom that follows:
size += addAtom_wave();
}
addAtomEnd;
addAtom(esds);
//#####
MediaSubsession& subsession = fCurrentIOState->fOurSubsession;
if (strcmp(subsession.mediumName(), "audio") == 0) {
// MPEG-4 audio
size += addWord(0x00000000); // ???
size += addWord(0x03808080); // ???
size += addWord(0x2a000000); // ???
size += addWord(0x04808080); // ???
size += addWord(0x1c401500); // ???
size += addWord(0x18000000); // ???
size += addWord(0x6d600000); // ???
size += addWord(0x6d600580); // ???
size += addByte(0x80); size += addByte(0x80); // ???
} else if (strcmp(subsession.mediumName(), "video") == 0) {
// MPEG-4 video
size += addWord(0x00000000); // ???
size += addWord(0x03330000); // ???
size += addWord(0x1f042b20); // ???
size += addWord(0x1104fd46); // ???
size += addWord(0x000d4e10); // ???
size += addWord(0x000d4e10); // ???
size += addByte(0x05); // ???
}
// Add the source's 'config' information:
unsigned configSize;
unsigned char* config
= parseGeneralConfigStr(subsession.fmtp_config(), configSize);
size += addByte(configSize);
for (unsigned i = 0; i < configSize; ++i) {
size += addByte(config[i]);
}
delete[] config;
if (strcmp(subsession.mediumName(), "audio") == 0) {
// MPEG-4 audio
size += addWord(0x06808080); // ???
size += addHalfWord(0x0102); // ???
} else {
// MPEG-4 video
size += addHalfWord(0x0601); // ???
size += addByte(0x02); // ???
}
//#####
addAtomEnd;
addAtom(srcq);
//#####
size += addWord(0x00000040); // ???
//#####
addAtomEnd;
addAtom(h263);
// General sample description fields:
size += addWord(0x00000000); // Reserved
size += addWord(0x00000001); // Reserved+Data reference index
// Video sample description fields:
size += addWord(0x00020001); // Version+Revision level
size += add4ByteString("appl"); // Vendor
size += addWord(0x00000000); // Temporal quality
size += addWord(0x000002fc); // Spatial quality
unsigned const widthAndHeight = (fMovieWidth<<16)|fMovieHeight;
size += addWord(widthAndHeight); // Width+height
size += addWord(0x00480000); // Horizontal resolution
size += addWord(0x00480000); // Vertical resolution
size += addWord(0x00000000); // Data size
size += addWord(0x00010548); // Frame count+Compressor name (start)
// "H.263"
size += addWord(0x2e323633); // Compressor name (continued)
size += addZeroWords(6); // Compressor name (continued - zero)
size += addWord(0x00000018); // Compressor name (final)+Depth
size += addHalfWord(0xffff); // Color table id
addAtomEnd;
addAtom(avc1);
// General sample description fields:
size += addWord(0x00000000); // Reserved
size += addWord(0x00000001); // Reserved+Data reference index
// Video sample description fields:
size += addWord(0x00000000); // Version+Revision level
size += add4ByteString("appl"); // Vendor
size += addWord(0x00000000); // Temporal quality
size += addWord(0x00000000); // Spatial quality
unsigned const widthAndHeight = (fMovieWidth<<16)|fMovieHeight;
size += addWord(widthAndHeight); // Width+height
size += addWord(0x00480000); // Horizontal resolution
size += addWord(0x00480000); // Vertical resolution
size += addWord(0x00000000); // Data size
size += addWord(0x00010548); // Frame count+Compressor name (start)
// "H.264"
size += addWord(0x2e323634); // Compressor name (continued)
size += addZeroWords(6); // Compressor name (continued - zero)
size += addWord(0x00000018); // Compressor name (final)+Depth
size += addHalfWord(0xffff); // Color table id
size += addAtom_avcC();
addAtomEnd;
addAtom(avcC);
// Begin by Base-64 decoding the "sprop" parameter sets strings:
char* psets = strDup(fCurrentIOState->fOurSubsession.fmtp_spropparametersets());
if (psets == NULL) return 0;
size_t comma_pos = strcspn(psets, ",");
psets[comma_pos] = '\0';
char const* sps_b64 = psets;
char const* pps_b64 = &psets[comma_pos+1];
unsigned sps_size;
unsigned char* sps_data = base64Decode(sps_b64, sps_size, false);
unsigned pps_size;
unsigned char* pps_data = base64Decode(pps_b64, pps_size, false);
// Then add the decoded data (assumes the decoded SPS is at least 4 bytes):
size += addByte(0x01); // configuration version
size += addByte(sps_data[1]); // profile
size += addByte(sps_data[2]); // profile compatibility
size += addByte(sps_data[3]); // level
size += addByte(0xff); // 6 'reserved' bits (0b111111) + lengthSizeMinusOne (0b11 => 4-byte NALU lengths)
size += addByte(0xe0 | (sps_size > 0 ? 1 : 0)); // 3 'reserved' bits (0b111) + number of SPSs
if (sps_size > 0) {
size += addHalfWord(sps_size); // SPS size, in bytes
for (unsigned i = 0; i < sps_size; i++) {
size += addByte(sps_data[i]);
}
}
size += addByte(pps_size > 0 ? 1 : 0); // number of PPSs
if (pps_size > 0) {
size += addHalfWord(pps_size); // PPS size, in bytes
for (unsigned i = 0; i < pps_size; i++) {
size += addByte(pps_data[i]);
}
}
// Finally, delete the data that we allocated:
delete[] pps_data; delete[] sps_data;
delete[] psets;
addAtomEnd;
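// The avcC atom above begins by splitting the SDP "sprop-parameter-sets"
// value at its first comma into the Base64 SPS and PPS strings. A standalone
// sketch of that split (not part of live555; the helper name is hypothetical,
// and the Base64 decoding step is omitted):

```cpp
#include <cassert>
#include <cstring>
#include <string>
#include <utility>

// Split an SDP "sprop-parameter-sets" value at its first comma into the
// Base64 SPS and PPS strings -- the same strcspn() split performed above
// before each half is Base64-decoded.
static std::pair<std::string, std::string>
splitSpropParameterSets(char const* psets) {
  size_t commaPos = strcspn(psets, ",");
  std::string sps(psets, commaPos);
  std::string pps = (psets[commaPos] == ',')
      ? std::string(psets + commaPos + 1) : std::string();
  return std::make_pair(sps, pps);
}
```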
addAtom(mp4v);
// General sample description fields:
size += addWord(0x00000000); // Reserved
size += addWord(0x00000001); // Reserved+Data reference index
// Video sample description fields:
size += addWord(0x00020001); // Version+Revision level
size += add4ByteString("appl"); // Vendor
size += addWord(0x00000200); // Temporal quality
size += addWord(0x00000400); // Spatial quality
unsigned const widthAndHeight = (fMovieWidth<<16)|fMovieHeight;
size += addWord(widthAndHeight); // Width+height
size += addWord(0x00480000); // Horizontal resolution
size += addWord(0x00480000); // Vertical resolution
size += addWord(0x00000000); // Data size
size += addWord(0x00010c4d); // Frame count+Compressor name (start)
// "MPEG-4 Video"
size += addWord(0x5045472d); // Compressor name (continued)
size += addWord(0x34205669); // Compressor name (continued)
size += addWord(0x64656f00); // Compressor name (continued)
size += addZeroWords(4); // Compressor name (continued - zero)
size += addWord(0x00000018); // Compressor name (final)+Depth
size += addHalfWord(0xffff); // Color table id
size += addAtom_esds(); // ESDescriptor
size += addWord(0x00000000); // ???
addAtomEnd;
unsigned QuickTimeFileSink::addAtom_rtp() {
int64_t initFilePosn = TellFile64(fOutFid);
unsigned size = addAtomHeader("rtp ");
size += addWord(0x00000000); // Reserved (1st 4 bytes)
size += addWord(0x00000001); // Reserved (last 2 bytes) + Data ref index
size += addWord(0x00010001); // Hint track version + Last compat htv
size += addWord(1450); // Max packet size
size += addAtom_tims();
addAtomEnd;
addAtom(tims);
size += addWord(fCurrentIOState->fOurSubsession.rtpTimestampFrequency());
addAtomEnd;
addAtom(stts); // Time-to-Sample
size += addWord(0x00000000); // Version+flags
// First, add a dummy "Number of entries" field
// (and remember its position). We'll fill this field in later:
int64_t numEntriesPosition = TellFile64(fOutFid);
size += addWord(0); // dummy for "Number of entries"
// Then, run through the chunk descriptors, and enter the entries
// in this (compressed) Time-to-Sample table:
unsigned numEntries = 0, numSamplesSoFar = 0;
unsigned prevSampleDuration = 0;
unsigned const samplesPerFrame = fCurrentIOState->fQTSamplesPerFrame;
ChunkDescriptor* chunk = fCurrentIOState->fHeadChunk;
while (chunk != NULL) {
unsigned const sampleDuration = chunk->fFrameDuration/samplesPerFrame;
if (sampleDuration != prevSampleDuration) {
// This chunk will start a new table entry,
// so write out the old one (if any):
if (chunk != fCurrentIOState->fHeadChunk) {
++numEntries;
size += addWord(numSamplesSoFar); // Sample count
size += addWord(prevSampleDuration); // Sample duration
numSamplesSoFar = 0;
}
}
unsigned const numSamples = chunk->fNumFrames*samplesPerFrame;
numSamplesSoFar += numSamples;
prevSampleDuration = sampleDuration;
chunk = chunk->fNextChunk;
}
// Then, write out the last entry:
++numEntries;
size += addWord(numSamplesSoFar); // Sample count
size += addWord(prevSampleDuration); // Sample duration
// Now go back and fill in the "Number of entries" field:
setWord(numEntriesPosition, numEntries);
addAtomEnd;
addAtom(stss); // Sync-Sample
size += addWord(0x00000000); // Version+flags
// First, add a dummy "Number of entries" field
// (and remember its position). We'll fill this field in later:
int64_t numEntriesPosition = TellFile64(fOutFid);
size += addWord(0); // dummy for "Number of entries"
unsigned numEntries = 0, numSamplesSoFar = 0;
if (fCurrentIOState->fHeadSyncFrame != NULL) {
SyncFrame* currentSyncFrame = fCurrentIOState->fHeadSyncFrame;
// First, count the number of frames (to use as a sanity check; see below):
unsigned totNumFrames = 0;
for (ChunkDescriptor* chunk = fCurrentIOState->fHeadChunk; chunk != NULL; chunk = chunk->fNextChunk) totNumFrames += chunk->fNumFrames;
while (currentSyncFrame != NULL) {
if (currentSyncFrame->sfFrameNum >= totNumFrames) break; // sanity check
++numEntries;
size += addWord(currentSyncFrame->sfFrameNum);
currentSyncFrame = currentSyncFrame->nextSyncFrame;
}
} else {
// First, run through the chunk descriptors, counting up the total number of samples:
unsigned const samplesPerFrame = fCurrentIOState->fQTSamplesPerFrame;
ChunkDescriptor* chunk = fCurrentIOState->fHeadChunk;
while (chunk != NULL) {
unsigned const numSamples = chunk->fNumFrames*samplesPerFrame;
numSamplesSoFar += numSamples;
chunk = chunk->fNextChunk;
}
// Then, write out the sample numbers that we deem correspond to 'sync samples':
unsigned i;
for (i = 0; i < numSamplesSoFar; i += 12) {
// For an explanation of the constant "12", see http://lists.live555.com/pipermail/live-devel/2009-July/010969.html
// (Perhaps we should really try to keep track of which 'samples' ('frames' for video) really are 'key frames'?)
size += addWord(i+1);
++numEntries;
}
// Then, write out the last entry (if we haven't already done so):
if (i != (numSamplesSoFar - 1)) {
size += addWord(numSamplesSoFar);
++numEntries;
}
}
// Now go back and fill in the "Number of entries" field:
setWord(numEntriesPosition, numEntries);
addAtomEnd;
addAtom(stsc); // Sample-to-Chunk
size += addWord(0x00000000); // Version+flags
// First, add a dummy "Number of entries" field
// (and remember its position). We'll fill this field in later:
int64_t numEntriesPosition = TellFile64(fOutFid);
size += addWord(0); // dummy for "Number of entries"
// Then, run through the chunk descriptors, and enter the entries
// in this (compressed) Sample-to-Chunk table:
unsigned numEntries = 0, chunkNumber = 0;
unsigned prevSamplesPerChunk = ~0;
unsigned const samplesPerFrame = fCurrentIOState->fQTSamplesPerFrame;
ChunkDescriptor* chunk = fCurrentIOState->fHeadChunk;
while (chunk != NULL) {
++chunkNumber;
unsigned const samplesPerChunk = chunk->fNumFrames*samplesPerFrame;
if (samplesPerChunk != prevSamplesPerChunk) {
// This chunk will be a new table entry:
++numEntries;
size += addWord(chunkNumber); // Chunk number
size += addWord(samplesPerChunk); // Samples per chunk
size += addWord(0x00000001); // Sample description ID
prevSamplesPerChunk = samplesPerChunk;
}
chunk = chunk->fNextChunk;
}
// Now go back and fill in the "Number of entries" field:
setWord(numEntriesPosition, numEntries);
addAtomEnd;
addAtom(stsz); // Sample Size
size += addWord(0x00000000); // Version+flags
// Begin by checking whether our chunks all have the same
// 'bytes-per-sample'. This determines whether this atom's table
// has just a single entry, or multiple entries.
Boolean haveSingleEntryTable = True;
double firstBPS = 0.0;
ChunkDescriptor* chunk = fCurrentIOState->fHeadChunk;
while (chunk != NULL) {
double bps
= (double)(chunk->fFrameSize)/(fCurrentIOState->fQTSamplesPerFrame);
if (bps < 1.0) {
// I don't think a multiple-entry table would make sense in
// this case, so assume a single entry table ??? #####
break;
}
if (firstBPS == 0.0) {
firstBPS = bps;
} else if (bps != firstBPS) {
haveSingleEntryTable = False;
break;
}
chunk = chunk->fNextChunk;
}
unsigned sampleSize;
if (haveSingleEntryTable) {
if (fCurrentIOState->isHintTrack()
&& fCurrentIOState->fHeadChunk != NULL) {
sampleSize = fCurrentIOState->fHeadChunk->fFrameSize
/ fCurrentIOState->fQTSamplesPerFrame;
} else {
// The following doesn't seem right, but seems to do the right thing:
sampleSize = fCurrentIOState->fQTTimeUnitsPerSample; //???
}
} else {
sampleSize = 0; // indicates a multiple-entry table
}
size += addWord(sampleSize); // Sample size
unsigned const totNumSamples = fCurrentIOState->fQTTotNumSamples;
size += addWord(totNumSamples); // Number of entries
if (!haveSingleEntryTable) {
// Multiple-entry table:
// Run through the chunk descriptors, entering the sample sizes:
ChunkDescriptor* chunk = fCurrentIOState->fHeadChunk;
while (chunk != NULL) {
unsigned numSamples
= chunk->fNumFrames*(fCurrentIOState->fQTSamplesPerFrame);
unsigned sampleSize
= chunk->fFrameSize/(fCurrentIOState->fQTSamplesPerFrame);
for (unsigned i = 0; i < numSamples; ++i) {
size += addWord(sampleSize);
}
chunk = chunk->fNextChunk;
}
}
addAtomEnd;
addAtom(co64); // Chunk Offset
size += addWord(0x00000000); // Version+flags
size += addWord(fCurrentIOState->fNumChunks); // Number of entries
// Run through the chunk descriptors, entering the file offsets:
ChunkDescriptor* chunk = fCurrentIOState->fHeadChunk;
while (chunk != NULL) {
size += addWord64(chunk->fOffsetInFile);
chunk = chunk->fNextChunk;
}
addAtomEnd;
addAtom(udta);
size += addAtom_name();
size += addAtom_hnti();
size += addAtom_hinf();
addAtomEnd;
addAtom(name);
char description[100];
sprintf(description, "Hinted %s track",
fCurrentIOState->fOurSubsession.mediumName());
size += addArbitraryString(description, False); // name of object
addAtomEnd;
addAtom(hnti);
size += addAtom_sdp();
addAtomEnd;
unsigned QuickTimeFileSink::addAtom_sdp() {
int64_t initFilePosn = TellFile64(fOutFid);
unsigned size = addAtomHeader("sdp ");
// Add this subsession's SDP lines:
char const* sdpLines = fCurrentIOState->fOurSubsession.savedSDPLines();
// We need to change any "a=control:trackID=" values to be this
// track's actual track id:
char* newSDPLines = new char[strlen(sdpLines)+100/*overkill*/];
char const* searchStr = "a=control:trackid=";
Boolean foundSearchString = False;
char const *p1, *p2, *p3;
for (p1 = sdpLines; *p1 != '\0'; ++p1) {
for (p2 = p1,p3 = searchStr; tolower(*p2) == *p3; ++p2,++p3) {}
if (*p3 == '\0') {
// We found the end of the search string, at p2.
int beforeTrackNumPosn = p2-sdpLines;
// Look for the subsequent track number, and skip over it:
int trackNumLength;
if (sscanf(p2, " %*d%n", &trackNumLength) < 0) break;
int afterTrackNumPosn = beforeTrackNumPosn + trackNumLength;
// Replace the old track number with the correct one:
int i;
for (i = 0; i < beforeTrackNumPosn; ++i) newSDPLines[i] = sdpLines[i];
sprintf(&newSDPLines[i], "%d", fCurrentIOState->fTrackID);
i = afterTrackNumPosn;
int j = i + strlen(&newSDPLines[i]);
while (1) {
if ((newSDPLines[j] = sdpLines[i]) == '\0') break;
++i; ++j;
}
foundSearchString = True;
break;
}
}
if (!foundSearchString) {
// Because we didn't find a "a=control:trackID=" line,
// add one of our own:
sprintf(newSDPLines, "%s%s%d\r\n",
sdpLines, searchStr, fCurrentIOState->fTrackID);
}
size += addArbitraryString(newSDPLines, False);
delete[] newSDPLines;
addAtomEnd;
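// The pointer arithmetic in addAtom_sdp() above rewrites the number that
// follows a case-insensitive "a=control:trackid=" prefix, or appends such a
// line when none exists. A std::string sketch of the same fix-up (not part
// of live555; the helper name is hypothetical):

```cpp
#include <cassert>
#include <cctype>
#include <string>

// Replace the track number after a case-insensitive "a=control:trackid="
// with "trackId", or append such a line if the prefix is absent.
static std::string fixupControlTrackId(std::string const& sdp, unsigned trackId) {
  std::string const key = "a=control:trackid=";
  // Case-insensitive search for the key:
  std::string lowered(sdp);
  for (size_t i = 0; i < lowered.size(); ++i) {
    lowered[i] = (char)tolower((unsigned char)lowered[i]);
  }
  size_t pos = lowered.find(key);
  if (pos == std::string::npos) {
    // No control line found; add one of our own:
    return sdp + key + std::to_string(trackId) + "\r\n";
  }
  // Skip over the old track number, then splice in the new one:
  size_t numStart = pos + key.size();
  size_t numEnd = numStart;
  while (numEnd < sdp.size() && isdigit((unsigned char)sdp[numEnd])) ++numEnd;
  return sdp.substr(0, numStart) + std::to_string(trackId) + sdp.substr(numEnd);
}
```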
addAtom(hinf);
size += addAtom_totl();
size += addAtom_npck();
size += addAtom_tpay();
size += addAtom_trpy();
size += addAtom_nump();
size += addAtom_tpyl();
// Is 'maxr' required? #####
size += addAtom_dmed();
size += addAtom_dimm();
size += addAtom_drep();
size += addAtom_tmin();
size += addAtom_tmax();
size += addAtom_pmax();
size += addAtom_dmax();
size += addAtom_payt();
addAtomEnd;
addAtom(totl);
size += addWord(fCurrentIOState->fHINF.trpy.lo);
addAtomEnd;
addAtom(npck);
size += addWord(fCurrentIOState->fHINF.nump.lo);
addAtomEnd;
addAtom(tpay);
size += addWord(fCurrentIOState->fHINF.tpyl.lo);
addAtomEnd;
addAtom(trpy);
size += addWord(fCurrentIOState->fHINF.trpy.hi);
size += addWord(fCurrentIOState->fHINF.trpy.lo);
addAtomEnd;
addAtom(nump);
size += addWord(fCurrentIOState->fHINF.nump.hi);
size += addWord(fCurrentIOState->fHINF.nump.lo);
addAtomEnd;
addAtom(tpyl);
size += addWord(fCurrentIOState->fHINF.tpyl.hi);
size += addWord(fCurrentIOState->fHINF.tpyl.lo);
addAtomEnd;
addAtom(dmed);
size += addWord(fCurrentIOState->fHINF.dmed.hi);
size += addWord(fCurrentIOState->fHINF.dmed.lo);
addAtomEnd;
addAtom(dimm);
size += addWord(fCurrentIOState->fHINF.dimm.hi);
size += addWord(fCurrentIOState->fHINF.dimm.lo);
addAtomEnd;
addAtom(drep);
size += addWord(0);
size += addWord(0);
addAtomEnd;
addAtom(tmin);
size += addWord(0);
addAtomEnd;
addAtom(tmax);
size += addWord(0);
addAtomEnd;
addAtom(pmax);
size += addWord(fCurrentIOState->fHINF.pmax);
addAtomEnd;
addAtom(dmax);
size += addWord(fCurrentIOState->fHINF.dmax);
addAtomEnd;
addAtom(payt);
MediaSubsession& ourSubsession = fCurrentIOState->fOurSubsession;
RTPSource* rtpSource = ourSubsession.rtpSource();
if (rtpSource != NULL) {
size += addWord(rtpSource->rtpPayloadFormat());
// Also, add a 'rtpmap' string: "<codec name>/<timestamp frequency>"
unsigned rtpmapStringLength = strlen(ourSubsession.codecName()) + 20;
char* rtpmapString = new char[rtpmapStringLength];
sprintf(rtpmapString, "%s/%d",
ourSubsession.codecName(), rtpSource->timestampFrequency());
size += addArbitraryString(rtpmapString);
delete[] rtpmapString;
}
addAtomEnd;
// A dummy atom (with name "????"):
unsigned QuickTimeFileSink::addAtom_dummy() {
int64_t initFilePosn = TellFile64(fOutFid);
unsigned size = addAtomHeader("????");
addAtomEnd;
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See .)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// RTP Sources containing generic QuickTime stream data, as defined in
//
// Implementation
#include "QuickTimeGenericRTPSource.hh"
///// QTGenericBufferedPacket and QTGenericBufferedPacketFactory /////
// A subclass of BufferedPacket, used to separate out
// individual frames (when PCK == 2)
class QTGenericBufferedPacket: public BufferedPacket {
public:
QTGenericBufferedPacket(QuickTimeGenericRTPSource& ourSource);
virtual ~QTGenericBufferedPacket();
private: // redefined virtual functions
virtual unsigned nextEnclosedFrameSize(unsigned char*& framePtr,
unsigned dataSize);
private:
QuickTimeGenericRTPSource& fOurSource;
};
class QTGenericBufferedPacketFactory: public BufferedPacketFactory {
private: // redefined virtual functions
virtual BufferedPacket* createNewPacket(MultiFramedRTPSource* ourSource);
};
////////// QuickTimeGenericRTPSource //////////
QuickTimeGenericRTPSource*
QuickTimeGenericRTPSource::createNew(UsageEnvironment& env,
Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency,
char const* mimeTypeString) {
return new QuickTimeGenericRTPSource(env, RTPgs, rtpPayloadFormat,
rtpTimestampFrequency,
mimeTypeString);
}
QuickTimeGenericRTPSource
::QuickTimeGenericRTPSource(UsageEnvironment& env, Groupsock* RTPgs,
unsigned char rtpPayloadFormat,
unsigned rtpTimestampFrequency,
char const* mimeTypeString)
: MultiFramedRTPSource(env, RTPgs,
rtpPayloadFormat, rtpTimestampFrequency,
new QTGenericBufferedPacketFactory),
fMIMEtypeString(strDup(mimeTypeString)) {
qtState.PCK = 0;
qtState.timescale = 0;
qtState.sdAtom = NULL;
qtState.sdAtomSize = qtState.width = qtState.height = 0;
}
QuickTimeGenericRTPSource::~QuickTimeGenericRTPSource() {
delete[] qtState.sdAtom;
delete[] (char*)fMIMEtypeString;
}
Boolean QuickTimeGenericRTPSource
::processSpecialHeader(BufferedPacket* packet,
unsigned& resultSpecialHeaderSize) {
unsigned char* headerStart = packet->data();
unsigned packetSize = packet->dataSize();
// The "QuickTime Header" must be at least 4 bytes in size:
// Extract the known fields from the first 4 bytes:
unsigned expectedHeaderSize = 4;
if (packetSize < expectedHeaderSize) return False;
unsigned char VER = (headerStart[0]&0xF0)>>4;
if (VER > 1) return False; // unknown header version
qtState.PCK = (headerStart[0]&0x0C)>>2;
#ifdef DEBUG
Boolean S = (headerStart[0]&0x02) != 0;
#endif
Boolean Q = (headerStart[0]&0x01) != 0;
Boolean L = (headerStart[1]&0x80) != 0;
#ifdef DEBUG
Boolean D = (headerStart[2]&0x80) != 0;
unsigned short payloadId = ((headerStart[2]&0x7F)<<8)|headerStart[3];
#endif
headerStart += 4;
#ifdef DEBUG
fprintf(stderr, "PCK: %d, S: %d, Q: %d, L: %d, D: %d, payloadId: %d\n", qtState.PCK, S, Q, L, D, payloadId);
#endif
if (Q) { // A "QuickTime Payload Description" follows
expectedHeaderSize += 4;
if (packetSize < expectedHeaderSize) return False;
#ifdef DEBUG
Boolean K = (headerStart[0]&0x80) != 0;
Boolean F = (headerStart[0]&0x40) != 0;
Boolean A = (headerStart[0]&0x20) != 0;
Boolean Z = (headerStart[0]&0x10) != 0;
#endif
unsigned payloadDescriptionLength = (headerStart[2]<<8)|headerStart[3];
headerStart += 4;
#ifdef DEBUG
fprintf(stderr, "\tK: %d, F: %d, A: %d, Z: %d, payloadDescriptionLength: %d\n", K, F, A, Z, payloadDescriptionLength);
#endif
// Make sure "payloadDescriptionLength" is valid
if (payloadDescriptionLength < 12) return False;
expectedHeaderSize += (payloadDescriptionLength - 4);
unsigned nonPaddedSize = expectedHeaderSize;
expectedHeaderSize += 3;
expectedHeaderSize -= expectedHeaderSize%4; // adds padding
if (packetSize < expectedHeaderSize) return False;
unsigned char padding = expectedHeaderSize - nonPaddedSize;
#ifdef DEBUG
unsigned mediaType = (headerStart[0]<<24)|(headerStart[1]<<16)
|(headerStart[2]<<8)|headerStart[3];
#endif
qtState.timescale = (headerStart[4]<<24)|(headerStart[5]<<16)
|(headerStart[6]<<8)|headerStart[7];
headerStart += 8;
payloadDescriptionLength -= 12;
#ifdef DEBUG
fprintf(stderr, "\tmediaType: '%c%c%c%c', timescale: %d, %d bytes of TLVs left\n", mediaType>>24, (mediaType&0xFF0000)>>16, (mediaType&0xFF00)>>8, mediaType&0xFF, qtState.timescale, payloadDescriptionLength);
#endif
while (payloadDescriptionLength > 3) {
unsigned short tlvLength = (headerStart[0]<<8)|headerStart[1];
unsigned short tlvType = (headerStart[2]<<8)|headerStart[3];
payloadDescriptionLength -= 4;
if (tlvLength > payloadDescriptionLength) return False; // bad TLV
headerStart += 4;
#ifdef DEBUG
fprintf(stderr, "\t\tTLV '%c%c', length %d, leaving %d remaining bytes\n", tlvType>>8, tlvType&0xFF, tlvLength, payloadDescriptionLength - tlvLength);
for (int i = 0; i < tlvLength; ++i) fprintf(stderr, "%02x:", headerStart[i]); fprintf(stderr, "\n");
#endif
// Check for 'TLV's that we can use for our 'qtState'
switch (tlvType) {
case ('s'<<8|'d'): { // session description atom
// Sanity check: the first 4 bytes of this must equal "tlvLength":
unsigned atomLength = (headerStart[0]<<24)|(headerStart[1]<<16)
|(headerStart[2]<<8)|(headerStart[3]);
if (atomLength != (unsigned)tlvLength) break;
delete[] qtState.sdAtom; qtState.sdAtom = new char[tlvLength];
memmove(qtState.sdAtom, headerStart, tlvLength);
qtState.sdAtomSize = tlvLength;
break;
}
case ('t'<<8|'w'): { // track width
qtState.width = (headerStart[0]<<8)|headerStart[1];
break;
}
case ('t'<<8|'h'): { // track height
qtState.height = (headerStart[0]<<8)|headerStart[1];
break;
}
}
payloadDescriptionLength -= tlvLength;
headerStart += tlvLength;
}
if (payloadDescriptionLength > 0) return False; // malformed TLV data
headerStart += padding;
}
if (L) { // Sample-Specific info follows
expectedHeaderSize += 4;
if (packetSize < expectedHeaderSize) return False;
unsigned ssInfoLength = (headerStart[2]<<8)|headerStart[3];
headerStart += 4;
#ifdef DEBUG
fprintf(stderr, "\tssInfoLength: %d\n", ssInfoLength);
#endif
// Make sure "ssInfoLength" is valid
if (ssInfoLength < 4) return False;
expectedHeaderSize += (ssInfoLength - 4);
unsigned nonPaddedSize = expectedHeaderSize;
expectedHeaderSize += 3;
expectedHeaderSize -= expectedHeaderSize%4; // adds padding
if (packetSize < expectedHeaderSize) return False;
unsigned char padding = expectedHeaderSize - nonPaddedSize;
ssInfoLength -= 4;
while (ssInfoLength > 3) {
unsigned short tlvLength = (headerStart[0]<<8)|headerStart[1];
#ifdef DEBUG
unsigned short tlvType = (headerStart[2]<<8)|headerStart[3];
#endif
ssInfoLength -= 4;
if (tlvLength > ssInfoLength) return False; // bad TLV
#ifdef DEBUG
fprintf(stderr, "\t\tTLV '%c%c', length %d, leaving %d remaining bytes\n", tlvType>>8, tlvType&0xFF, tlvLength, ssInfoLength - tlvLength);
for (int i = 0; i < tlvLength; ++i) fprintf(stderr, "%02x:", headerStart[4+i]); fprintf(stderr, "\n");
#endif
ssInfoLength -= tlvLength;
headerStart += 4 + tlvLength;
}
if (ssInfoLength > 0) return False; // malformed TLV data
headerStart += padding;
}
fCurrentPacketBeginsFrame = fCurrentPacketCompletesFrame;
// whether the *previous* packet ended a frame
fCurrentPacketCompletesFrame = packet->rtpMarkerBit();
resultSpecialHeaderSize = expectedHeaderSize;
#ifdef DEBUG
fprintf(stderr, "Result special header size: %d\n", resultSpecialHeaderSize);
#endif
return True;
}
char const* QuickTimeGenericRTPSource::MIMEtype() const {
if (fMIMEtypeString == NULL) return MultiFramedRTPSource::MIMEtype();
return fMIMEtypeString;
}
////////// QTGenericBufferedPacket and QTGenericBufferedPacketFactory impl
QTGenericBufferedPacket
::QTGenericBufferedPacket(QuickTimeGenericRTPSource& ourSource)
: fOurSource(ourSource) {
}
QTGenericBufferedPacket::~QTGenericBufferedPacket() {
}
unsigned QTGenericBufferedPacket::
nextEnclosedFrameSize(unsigned char*& framePtr, unsigned dataSize) {
// We use the entire packet for a frame, unless "PCK" == 2
if (fOurSource.qtState.PCK != 2) return dataSize;
if (dataSize < 8) return 0; // sanity check
unsigned short sampleLength = (framePtr[2]<<8)|framePtr[3];
// later, extract and use the "timestamp" field #####
framePtr += 8;
dataSize -= 8;
return sampleLength < dataSize ? sampleLength : dataSize;
}
BufferedPacket* QTGenericBufferedPacketFactory
::createNewPacket(MultiFramedRTPSource* ourSource) {
return new QTGenericBufferedPacket((QuickTimeGenericRTPSource&)(*ourSource));
}
live/liveMedia/rtcp_from_spec.c
/* RTCP code taken directly from the most recent RTP specification:

* RFC 3550
* Implementation
*/
#include "rtcp_from_spec.h"
/*****
A.7 Computing the RTCP Transmission Interval
The following functions implement the RTCP transmission and reception
rules described in Section 6.2. These rules are coded in several
functions:
o rtcp_interval() computes the deterministic calculated
interval, measured in seconds. The parameters are defined in
Section 6.3.
o OnExpire() is called when the RTCP transmission timer expires.
o OnReceive() is called whenever an RTCP packet is received.
Both OnExpire() and OnReceive() have event e as an argument. This is
the next scheduled event for that participant, either an RTCP report
or a BYE packet. It is assumed that the following functions are
available:
o Schedule(time t, event e) schedules an event e to occur at
time t. When time t arrives, the function OnExpire is called
with e as an argument.
o Reschedule(time t, event e) reschedules a previously scheduled
event e for time t.
o SendRTCPReport(event e) sends an RTCP report.
o SendBYEPacket(event e) sends a BYE packet.
o TypeOfEvent(event e) returns EVENT_BYE if the event being
processed is for a BYE packet to be sent, else it returns
EVENT_REPORT.
o PacketType(p) returns PACKET_RTCP_REPORT if packet p is an
RTCP report (not BYE), PACKET_BYE if it's a BYE RTCP packet,
and PACKET_RTP if it's a regular RTP data packet.
o ReceivedPacketSize() and SentPacketSize() return the size of
the referenced packet in octets.
o NewMember(p) returns a 1 if the participant who sent packet p
is not currently in the member list, 0 otherwise. Note this
function is not sufficient for a complete implementation
because each CSRC identifier in an RTP packet and each SSRC in
a BYE packet should be processed.
o NewSender(p) returns a 1 if the participant who sent packet p
is not currently in the sender sublist of the member list, 0
otherwise.
o AddMember() and RemoveMember() to add and remove participants
from the member list.
o AddSender() and RemoveSender() to add and remove participants
from the sender sublist of the member list.
*****/
double rtcp_interval(int members,
int senders,
double rtcp_bw,
int we_sent,
double avg_rtcp_size,
int initial)
{
/*
* Minimum average time between RTCP packets from this site (in
* seconds). This time prevents the reports from `clumping' when
* sessions are small and the law of large numbers isn't helping
* to smooth out the traffic. It also keeps the report interval
* from becoming ridiculously small during transient outages like
* a network partition.
*/
double const RTCP_MIN_TIME = 5.;
/*
* Fraction of the RTCP bandwidth to be shared among active
* senders. (This fraction was chosen so that in a typical
* session with one or two active senders, the computed report
* time would be roughly equal to the minimum report time so that
* we don't unnecessarily slow down receiver reports.) The
* receiver fraction must be 1 - the sender fraction.
*/
double const RTCP_SENDER_BW_FRACTION = 0.25;
double const RTCP_RCVR_BW_FRACTION = (1-RTCP_SENDER_BW_FRACTION);
/*
* To compensate for "unconditional reconsideration" converging to a
* value below the intended average.
*/
double const COMPENSATION = 2.71828 - 1.5;
double t; /* interval */
double rtcp_min_time = RTCP_MIN_TIME;
int n; /* no. of members for computation */
/*
* Very first call at application start-up uses half the min
* delay for quicker notification while still allowing some time
* before reporting for randomization and to learn about other
* sources so the report interval will converge to the correct
* interval more quickly.
*/
if (initial) {
rtcp_min_time /= 2;
}
/*
* If there were active senders, give them at least a minimum
* share of the RTCP bandwidth. Otherwise all participants share
* the RTCP bandwidth equally.
*/
n = members;
if (senders > 0 && senders < members * RTCP_SENDER_BW_FRACTION) {
if (we_sent) {
rtcp_bw *= RTCP_SENDER_BW_FRACTION;
n = senders;
} else {
rtcp_bw *= RTCP_RCVR_BW_FRACTION;
n -= senders;
}
}
/*
* The effective number of sites times the average packet size is
* the total number of octets sent when each site sends a report.
* Dividing this by the effective bandwidth gives the time
* interval over which those packets must be sent in order to
* meet the bandwidth target, with a minimum enforced. In that
* time interval we send one report so this time is also our
* average time between reports.
*/
t = avg_rtcp_size * n / rtcp_bw;
if (t < rtcp_min_time) t = rtcp_min_time;
/*
* To avoid traffic bursts from unintended synchronization with
* other sites, we then pick our actual next report interval as a
* random number uniformly distributed between 0.5*t and 1.5*t.
*/
t = t * (drand48() + 0.5);
t = t / COMPENSATION;
return t;
}
void OnExpire(event e,
int members,
int senders,
double rtcp_bw,
int we_sent,
double *avg_rtcp_size,
int *initial,
time_tp tc,
time_tp *tp,
int *pmembers)
{
/* This function is responsible for deciding whether to send
* an RTCP report or BYE packet now, or to reschedule transmission.
* It is also responsible for updating the pmembers, initial, tp,
* and avg_rtcp_size state variables. This function should be called
* upon expiration of the event timer used by Schedule(). */
double t; /* Interval */
double tn; /* Next transmit time */
/* In the case of a BYE, we use "unconditional reconsideration" to
* reschedule the transmission of the BYE if necessary */
if (TypeOfEvent(e) == EVENT_BYE) {
t = rtcp_interval(members,
senders,
rtcp_bw,
we_sent,
*avg_rtcp_size,
*initial);
tn = *tp + t;
if (tn <= tc) {
SendBYEPacket(e);
exit(1);
} else {
Schedule(tn, e);
}
} else if (TypeOfEvent(e) == EVENT_REPORT) {
t = rtcp_interval(members,
senders,
rtcp_bw,
we_sent,
*avg_rtcp_size,
*initial);
tn = *tp + t;
if (tn <= tc) {
SendRTCPReport(e);
*avg_rtcp_size = (1./16.)*SentPacketSize(e) +
(15./16.)*(*avg_rtcp_size);
*tp = tc;
/* We must redraw the interval. Don't reuse the
one computed above, since it's not actually
distributed the same, as we are conditioned
on it being small enough to cause a packet to
be sent */
t = rtcp_interval(members,
senders,
rtcp_bw,
we_sent,
*avg_rtcp_size,
*initial);
Schedule(t+tc,e);
*initial = 0;
} else {
Schedule(tn, e);
}
*pmembers = members;
}
}
void OnReceive(packet p,
event e,
int *members,
int *pmembers,
int *senders,
double *avg_rtcp_size,
double *tp,
double tc,
double tn)
{
/* What we do depends on whether we have left the group, and
* are waiting to send a BYE (TypeOfEvent(e) == EVENT_BYE) or
* an RTCP report. p represents the packet that was just received. */
if (PacketType(p) == PACKET_RTCP_REPORT) {
if (NewMember(p) && (TypeOfEvent(e) == EVENT_REPORT)) {
AddMember(p);
*members += 1;
}
*avg_rtcp_size = (1./16.)*ReceivedPacketSize(p) +
(15./16.)*(*avg_rtcp_size);
} else if (PacketType(p) == PACKET_RTP) {
if (NewMember(p) && (TypeOfEvent(e) == EVENT_REPORT)) {
AddMember(p);
*members += 1;
}
if (NewSender(p) && (TypeOfEvent(e) == EVENT_REPORT)) {
AddSender(p);
*senders += 1;
}
} else if (PacketType(p) == PACKET_BYE) {
*avg_rtcp_size = (1./16.)*ReceivedPacketSize(p) +
(15./16.)*(*avg_rtcp_size);
if (TypeOfEvent(e) == EVENT_REPORT) {
if (NewSender(p) == FALSE) {
RemoveSender(p);
*senders -= 1;
}
if (NewMember(p) == FALSE) {
RemoveMember(p);
*members -= 1;
}
if(*members < *pmembers) {
tn = tc + (((double) *members)/(*pmembers))*(tn - tc);
*tp = tc - (((double) *members)/(*pmembers))*(tc - *tp);
/* Reschedule the next report for time tn */
Reschedule(tn, e);
*pmembers = *members;
}
} else if (TypeOfEvent(e) == EVENT_BYE) {
*members += 1;
}
}
}
live/liveMedia/rtcp_from_spec.h
/* RTCP code taken directly from the most recent RTP specification:
* RFC 3550
* C header
*/
#ifndef _RTCP_FROM_SPEC_H
#define _RTCP_FROM_SPEC_H
#include <stdlib.h>
/* Definitions of _ANSI_ARGS and EXTERN that will work in either
C or C++ code:
*/
#undef _ANSI_ARGS_
#if ((defined(__STDC__) || defined(SABER)) && !defined(NO_PROTOTYPE)) || defined(__cplusplus) || defined(USE_PROTOTYPE)
# define _ANSI_ARGS_(x) x
#else
# define _ANSI_ARGS_(x) ()
#endif
#ifdef __cplusplus
# define EXTERN extern "C"
#else
# define EXTERN extern
#endif
/* The code from the spec assumes a type "event"; make this a void*: */
typedef void* event;
#define EVENT_UNKNOWN 0
#define EVENT_REPORT 1
#define EVENT_BYE 2
/* The code from the spec assumes a type "time_tp"; make this a double: */
typedef double time_tp;
/* The code from the spec assumes a type "packet"; make this a void*: */
typedef void* packet;
#define PACKET_UNKNOWN_TYPE 0
#define PACKET_RTP 1
#define PACKET_RTCP_REPORT 2
#define PACKET_BYE 3
#define PACKET_RTCP_APP 4
/* The code from the spec calls drand48(), but we have drand30() instead */
#define drand48 drand30
/* The code calls "exit()", but we don't want to exit, so make it a noop: */
#define exit(n) do {} while (0)
#ifndef FALSE
#define FALSE 0
#endif
#ifndef TRUE
#define TRUE 1
#endif
/* EXPORTS: */
EXTERN void OnExpire _ANSI_ARGS_((event, int, int, double, int, double*, int*, time_tp, time_tp*, int*));
EXTERN void OnReceive _ANSI_ARGS_((packet, event, int*, int*, int*, double*, double*, double, double));
/* IMPORTS: */
EXTERN void Schedule _ANSI_ARGS_((double,event));
EXTERN void Reschedule _ANSI_ARGS_((double,event));
EXTERN void SendRTCPReport _ANSI_ARGS_((event));
EXTERN void SendBYEPacket _ANSI_ARGS_((event));
EXTERN int TypeOfEvent _ANSI_ARGS_((event));
EXTERN int SentPacketSize _ANSI_ARGS_((event));
EXTERN int PacketType _ANSI_ARGS_((packet));
EXTERN int ReceivedPacketSize _ANSI_ARGS_((packet));
EXTERN int NewMember _ANSI_ARGS_((packet));
EXTERN int NewSender _ANSI_ARGS_((packet));
EXTERN void AddMember _ANSI_ARGS_((packet));
EXTERN void AddSender _ANSI_ARGS_((packet));
EXTERN void RemoveMember _ANSI_ARGS_((packet));
EXTERN void RemoveSender _ANSI_ARGS_((packet));
EXTERN double drand30 _ANSI_ARGS_((void));
#endif
live/liveMedia/RTPInterface.cpp
/**********
This library is free software; you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation; either version 2.1 of the License, or (at your
option) any later version. (See <http://www.gnu.org/copyleft/lesser.html>.)
This library is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
more details.
You should have received a copy of the GNU Lesser General Public License
along with this library; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
**********/
// "liveMedia"
// Copyright (c) 1996-2016 Live Networks, Inc. All rights reserved.
// An abstraction of a network interface used for RTP (or RTCP).
// (This allows the RTP-over-TCP hack (RFC 2326, section 10.12) to
// be implemented transparently.)
// Implementation
#include "RTPInterface.hh"
#include