upload android base code part4

This commit is contained in:
August 2018-08-08 17:00:29 +08:00
parent b9e30e05b1
commit 78ea2404cd
23455 changed files with 5250148 additions and 0 deletions

# 5\. Multimedia Compatibility
Device implementations:
* [C-0-1] MUST support the media formats, encoders, decoders, file types,
and container formats defined in [section 5.1](#5_1_media-codecs.md)
for each and every codec declared by `MediaCodecList`.
* [C-0-2] MUST declare and report support of the encoders, decoders available
to third-party applications via [`MediaCodecList`](
http://developer.android.com/reference/android/media/MediaCodecList.html).
* [C-0-3] MUST be able to decode and make available to third-party apps all
the formats it can encode. This includes all bitstreams that its encoders
generate and the profiles reported in its [`CamcorderProfile`](
http://developer.android.com/reference/android/media/CamcorderProfile.html).
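The C-0-3 rule above can be sketched as a set-containment check. This is an illustrative model only: a real implementation would derive both MIME-type sets from `MediaCodecList`, and the class and method names here are not Android API.

```java
import java.util.Set;

// Models C-0-3: every MIME type the device can encode must also be
// decodable. Hypothetical stand-in for a real MediaCodecList query.
public class CodecSymmetryCheck {
    public static boolean decodesEverythingItEncodes(Set<String> encoderMimes,
                                                     Set<String> decoderMimes) {
        return decoderMimes.containsAll(encoderMimes);
    }
}
```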
Device implementations:
* SHOULD aim for minimum codec latency, in other words, they:
    * SHOULD NOT consume and store input buffers and return input buffers only
      once processed.
    * SHOULD NOT hold onto decoded buffers for longer than specified by the
      standard (e.g. SPS).
    * SHOULD NOT hold onto encoded buffers longer than required by the GOP
      structure.
All of the codecs listed in the section below are provided as software
implementations in the preferred Android implementation from the Android Open
Source Project.
Please note that neither Google nor the Open Handset Alliance make any
representation that these codecs are free from third-party patents. Those
intending to use this source code in hardware or software products are advised
that implementations of this code, including in open source software or
shareware, may require patent licenses from the relevant patent holders.

## 5.10\. Professional Audio
If device implementations report support for feature
`android.hardware.audio.pro` via the
[android.content.pm.PackageManager](
http://developer.android.com/reference/android/content/pm/PackageManager.html)
class, they:
* [C-1-1] MUST report support for feature
`android.hardware.audio.low_latency`.
* [C-1-2] MUST have a continuous round-trip audio latency, as defined in
[section 5.6 Audio Latency](#5_6_audio_latency), of 20 milliseconds or less and SHOULD be
10 milliseconds or less over at least one supported path.
* [C-1-3] MUST include a USB port(s) supporting USB host mode and USB
peripheral mode.
* [C-1-4] MUST report support for feature `android.software.midi`.
* [C-1-5] MUST meet latencies and USB audio requirements using the
[OpenSL ES](https://developer.android.com/ndk/guides/audio/opensl-for-android.html)
PCM buffer queue API.
* SHOULD provide a sustainable level of CPU performance while audio is active.
* SHOULD minimize audio clock inaccuracy and drift relative to standard time.
* SHOULD minimize audio clock drift relative to the CPU `CLOCK_MONOTONIC` when both
are active.
* SHOULD minimize audio latency over on-device transducers.
* SHOULD minimize audio latency over USB digital audio.
* SHOULD document audio latency measurements over all paths.
* SHOULD minimize jitter in audio buffer completion callback entry times, as this
affects usable percentage of full CPU bandwidth by the callback.
* SHOULD provide zero audio underruns (output) or overruns (input) under normal use
at reported latency.
* SHOULD provide zero inter-channel latency difference.
* SHOULD minimize MIDI mean latency over all transports.
* SHOULD minimize MIDI latency variability under load (jitter) over all transports.
* SHOULD provide accurate MIDI timestamps over all transports.
* SHOULD minimize audio signal noise over on-device transducers, including the
period immediately after cold start.
* SHOULD provide zero audio clock difference between the input and output sides of
corresponding end-points, when both are active. Examples of corresponding
end-points include the on-device microphone and speaker, or the audio jack input
and output.
* SHOULD handle audio buffer completion callbacks for the input and output sides
of corresponding end-points on the same thread when both are active, and enter
the output callback immediately after the return from the input callback. Or
if it is not feasible to handle the callbacks on the same thread, then enter the
output callback shortly after entering the input callback to permit the
application to have a consistent timing of the input and output sides.
* SHOULD minimize the phase difference between HAL audio buffering for the input
and output sides of corresponding end-points.
* SHOULD minimize touch latency.
* SHOULD minimize touch latency variability under load (jitter).
If device implementations meet all of the above requirements, they:
* [SR] STRONGLY RECOMMENDED to report support for feature
`android.hardware.audio.pro` via the [`android.content.pm.PackageManager`](
http://developer.android.com/reference/android/content/pm/PackageManager.html)
class.
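The feature implications above (C-1-1, C-1-4) amount to a consistency rule over declared feature flags. A minimal sketch, using a plain set in place of a real `PackageManager.hasSystemFeature()` query; the class and method names are illustrative, not Android API:

```java
import java.util.Set;

// A device that declares android.hardware.audio.pro must also declare
// low_latency (C-1-1) and midi (C-1-4); otherwise the rules do not apply.
public class ProAudioFeatureCheck {
    static final String PRO = "android.hardware.audio.pro";
    static final String LOW_LATENCY = "android.hardware.audio.low_latency";
    static final String MIDI = "android.software.midi";

    public static boolean consistent(Set<String> declaredFeatures) {
        if (!declaredFeatures.contains(PRO)) return true;
        return declaredFeatures.contains(LOW_LATENCY)
            && declaredFeatures.contains(MIDI);
    }
}
```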
If device implementations meet the requirements via the OpenSL ES PCM buffer
queue API, they:
* [SR] STRONGLY RECOMMENDED to also meet the same requirements via the
[AAudio](https://developer.android.com/ndk/guides/audio/aaudio/aaudio.html) API.
If device implementations include a 4 conductor 3.5mm audio jack, they:
* [C-2-1] MUST have a continuous round-trip audio latency of 20
milliseconds or less over the audio jack path.
* [SR] STRONGLY RECOMMENDED to comply with
section [Mobile device (jack) specifications](
https://source.android.com/devices/accessories/headset/jack-headset-spec)
of the [Wired Audio Headset Specification (v1.1)](
https://source.android.com/devices/accessories/headset/plug-headset-spec).
* The continuous round-trip audio latency SHOULD be 10 milliseconds
or less over the audio jack path.
If device implementations omit a 4 conductor 3.5mm audio jack, they:
* [C-3-1] MUST have a continuous round-trip audio latency of 20
milliseconds or less.
* The continuous round-trip audio latency SHOULD be 10 milliseconds
or less over the USB host mode port using USB audio class.
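The round-trip latency bounds above (20 ms MUST, 10 ms SHOULD, over at least one supported path) can be expressed as a simple check over measured path latencies. An illustrative sketch, not CTS logic:

```java
import java.util.List;

// The bound has to hold over at least one supported audio path,
// not over every path.
public class RoundTripLatency {
    public static boolean meetsMust(List<Double> pathLatenciesMs) {
        return pathLatenciesMs.stream().anyMatch(ms -> ms <= 20.0);
    }
    public static boolean meetsShould(List<Double> pathLatenciesMs) {
        return pathLatenciesMs.stream().anyMatch(ms -> ms <= 10.0);
    }
}
```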
If device implementations include a USB port(s) supporting USB host mode, they:
* [C-4-1] MUST implement the USB audio class.
If device implementations include an HDMI port, they:
* [C-5-1] MUST support output in stereo and eight channels at 20-bit or
24-bit depth and 192 kHz without bit-depth loss or resampling.

## 5.11\. Capture for Unprocessed
Android includes support for recording of unprocessed audio via the
`android.media.MediaRecorder.AudioSource.UNPROCESSED` audio source. In
OpenSL ES, it can be accessed with the record preset
`SL_ANDROID_RECORDING_PRESET_UNPROCESSED`.
If device implementations intend to support the unprocessed audio source and
make it available to third-party apps, they:
* [C-1-1] MUST report the support through the `android.media.AudioManager`
property [PROPERTY_SUPPORT_AUDIO_SOURCE_UNPROCESSED](http://developer.android.com/reference/android/media/AudioManager.html#PROPERTY_SUPPORT_AUDIO_SOURCE_UNPROCESSED).
* [C-1-2] MUST exhibit approximately flat amplitude-versus-frequency
characteristics in the mid-frequency range: specifically ±10dB from
100 Hz to 7000 Hz for each and every microphone used to record the unprocessed
audio source.
* [C-1-3] MUST exhibit amplitude levels in the low frequency
range: specifically within ±20 dB of the mid-frequency range from 5 Hz to
100 Hz for each and every microphone used to record the unprocessed audio
source.
* [C-1-4] MUST exhibit amplitude levels in the high frequency
range: specifically within ±30 dB of the mid-frequency range from 7000 Hz to
22 kHz for each and every microphone used to record the unprocessed audio
source.
* [C-1-5] MUST set audio input sensitivity such that a 1000 Hz sinusoidal
tone source played at 94 dB Sound Pressure Level (SPL) yields a response with
RMS of 520 for 16 bit-samples (or -36 dB Full Scale for floating point/double
precision samples) for each and every microphone used to record the unprocessed
audio source.
* [C-1-6] MUST have a signal-to-noise ratio (SNR) at 60 dB or higher for
each and every microphone used to record the unprocessed audio source.
(where SNR is measured as the difference between 94 dB SPL and the equivalent
SPL of self noise, A-weighted).
* [C-1-7] MUST have a total harmonic distortion (THD) of less than
1% for 1 kHz at 90 dB SPL input level at each and every microphone used to
record the unprocessed audio source.
* MUST NOT have any other signal processing (e.g. Automatic Gain Control,
High Pass Filter, or Echo cancellation) in the path other than a level
multiplier to bring the level to the desired range. In other words:
* [C-1-8] If any signal processing is present in the architecture for any
reason, it MUST be disabled and effectively introduce zero delay or extra
latency to the signal path.
* [C-1-9] The level multiplier, while allowed to be on the path, MUST NOT
introduce delay or latency to the signal path.
All SPL measurements are made directly next to the microphone under test.
For multiple microphone configurations, these requirements apply to
each microphone.
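The C-1-5 sensitivity target above can be verified numerically: an RMS of 520 for 16-bit samples, against a full scale of 32768, works out to roughly -36 dB Full Scale, matching the floating-point figure the requirement quotes.

```java
// Converts an RMS level of 16-bit samples to dB Full Scale (dBFS).
// 32768 is full scale for 16-bit PCM, so rmsToDbfs16Bit(520) is about
// -36 dBFS, the C-1-5 target for a 1000 Hz tone at 94 dB SPL.
public class MicSensitivity {
    public static double rmsToDbfs16Bit(double rms) {
        return 20.0 * Math.log10(rms / 32768.0);
    }
}
```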
If device implementations declare `android.hardware.microphone` but do not
support unprocessed audio source, they:
* [C-2-1] MUST return `null` for the `AudioManager.getProperty(PROPERTY_SUPPORT_AUDIO_SOURCE_UNPROCESSED)`
API method, to properly indicate the lack of support.
* [SR] are still STRONGLY RECOMMENDED to satisfy as many of the requirements
for the signal path for the unprocessed recording source as possible.

## 5.1\. Media Codecs
### 5.1.1\. Audio Encoding
See more details in [5.1.3. Audio Codecs Details](#5_1_3_audio_codecs_details).
Handheld device implementations MUST support the following audio encoding:
* [H-0-1] AMR-NB
* [H-0-2] AMR-WB
* [H-0-3] MPEG-4 AAC Profile (AAC LC)
* [H-0-4] MPEG-4 HE AAC Profile (AAC+)
* [H-0-5] AAC ELD (enhanced low delay AAC)
Television device implementations MUST support the following audio encoding:
* [T-0-1] MPEG-4 AAC Profile (AAC LC)
* [T-0-2] MPEG-4 HE AAC Profile (AAC+)
* [T-0-3] AAC ELD (enhanced low delay AAC)
Automotive device implementations MUST support the following audio encoding:
* [A-1-1] MPEG-4 AAC Profile (AAC LC)
* [A-1-2] MPEG-4 HE AAC Profile (AAC+)
* [A-1-3] AAC ELD (enhanced low delay AAC)
If device implementations declare `android.hardware.microphone`,
they MUST support the following audio encoding:
* [C-1-1] PCM/WAVE
### 5.1.2\. Audio Decoding
See more details in [5.1.3. Audio Codecs Details](#5_1_3_audio_codecs_details).
Handheld device implementations MUST support the following decoding:
* [H-0-1] AMR-NB
* [H-0-2] AMR-WB
If device implementations declare support for the
`android.hardware.audio.output` feature, they MUST support decoding of the
following formats:
* [C-1-1] MPEG-4 AAC Profile (AAC LC)
* [C-1-2] MPEG-4 HE AAC Profile (AAC+)
* [C-1-3] MPEG-4 HE AACv2 Profile (enhanced AAC+)
* [C-1-4] AAC ELD (enhanced low delay AAC)
* [C-1-5] FLAC
* [C-1-6] MP3
* [C-1-7] MIDI
* [C-1-8] Vorbis
* [C-1-9] PCM/WAVE
* [C-1-10] Opus
If device implementations support the decoding of AAC input buffers of
multichannel streams (i.e. more than two channels) to PCM through the default
AAC audio decoder in the `android.media.MediaCodec` API, the following MUST be
supported:
* [C-2-1] Decoding MUST be performed without downmixing (e.g. a 5.0 AAC
stream must be decoded to five channels of PCM, a 5.1 AAC stream must be decoded
to six channels of PCM).
* [C-2-2] Dynamic range metadata MUST be handled as defined in "Dynamic Range
Control (DRC)" in ISO/IEC 14496-3, and via the `android.media.MediaFormat` DRC
keys that configure the dynamic range-related behaviors of the audio decoder.
The AAC DRC keys were introduced in API 21, and are:
`KEY_AAC_DRC_ATTENUATION_FACTOR`, `KEY_AAC_DRC_BOOST_FACTOR`,
`KEY_AAC_DRC_HEAVY_COMPRESSION`, `KEY_AAC_DRC_TARGET_REFERENCE_LEVEL` and
`KEY_AAC_ENCODED_TARGET_LEVEL`.
### 5.1.3\. Audio Codecs Details
<table>
<tr>
<th>Format/Codec</th>
<th>Details</th>
<th>Supported File Types/Container Formats</th>
</tr>
<tr>
<td>MPEG-4 AAC Profile<br />(AAC LC)</td>
<td>Support for mono/stereo/5.0/5.1 content with standard
sampling rates from 8 to 48 kHz.</td>
<td>
<ul>
<li class="table_list">3GPP (.3gp)</li>
<li class="table_list">MPEG-4 (.mp4, .m4a)</li>
<li class="table_list">ADTS raw AAC (.aac, ADIF not supported)</li>
<li class="table_list">MPEG-TS (.ts, not seekable)</li></ul>
</td>
</tr>
<tr>
<td>MPEG-4 HE AAC Profile (AAC+)</td>
<td>Support for mono/stereo/5.0/5.1 content with standard
sampling rates from 16 to 48 kHz.</td>
<td></td>
</tr>
<tr>
<td>MPEG-4 HE AACv2<br />
Profile (enhanced AAC+)</td>
<td>Support for mono/stereo/5.0/5.1 content with standard
sampling rates from 16 to 48 kHz.</td>
<td></td>
</tr>
<tr>
<td>AAC ELD (enhanced low delay AAC)</td>
<td>Support for mono/stereo content with standard sampling rates from 16 to
48 kHz.</td>
<td></td>
</tr>
<tr>
<td>AMR-NB</td>
<td>4.75 to 12.2 kbps sampled @ 8 kHz</td>
<td>3GPP (.3gp)</td>
</tr>
<tr>
<td>AMR-WB</td>
<td>9 rates from 6.60 kbit/s to 23.85 kbit/s sampled @ 16 kHz</td>
<td></td>
</tr>
<tr>
<td>FLAC</td>
<td>Mono/Stereo (no multichannel). Sample rates up to 48 kHz (but up to 44.1
kHz is RECOMMENDED on devices with 44.1 kHz output, as the 48 to 44.1 kHz
downsampler does not include a low-pass filter). 16-bit RECOMMENDED; no
dither applied for 24-bit.</td>
<td>FLAC (.flac) only</td>
</tr>
<tr>
<td>MP3</td>
<td>Mono/Stereo 8-320Kbps constant (CBR) or variable bitrate (VBR)</td>
<td>MP3 (.mp3)</td>
</tr>
<tr>
<td>MIDI</td>
<td>MIDI Type 0 and 1. DLS Version 1 and 2. XMF and Mobile XMF. Support for
ringtone formats RTTTL/RTX, OTA, and iMelody</td>
<td><ul>
<li class="table_list">Type 0 and 1 (.mid, .xmf, .mxmf)</li>
<li class="table_list">RTTTL/RTX (.rtttl, .rtx)</li>
<li class="table_list">OTA (.ota)</li>
<li class="table_list">iMelody (.imy)</li></ul></td>
</tr>
<tr>
<td>Vorbis</td>
<td></td>
<td><ul>
<li class="table_list">Ogg (.ogg)</li>
<li class="table_list">Matroska (.mkv, Android 4.0+)</li></ul></td>
</tr>
<tr>
<td>PCM/WAVE</td>
<td>16-bit linear PCM (rates up to limit of hardware). Devices MUST support
sampling rates for raw PCM recording at 8000, 11025, 16000, and 44100 Hz
frequencies.</td>
<td>WAVE (.wav)</td>
</tr>
<tr>
<td>Opus</td>
<td></td>
<td>Matroska (.mkv), Ogg(.ogg)</td>
</tr>
</table>
### 5.1.4\. Image Encoding
See more details in [5.1.6. Image Codecs Details](#5_1_6_image_codecs_details).
Device implementations MUST support encoding of the following image formats:
* [C-0-1] JPEG
* [C-0-2] PNG
* [C-0-3] WebP
### 5.1.5\. Image Decoding
See more details in [5.1.6. Image Codecs Details](#5_1_6_image_codecs_details).
Device implementations MUST support decoding of the following image formats:
* [C-0-1] JPEG
* [C-0-2] GIF
* [C-0-3] PNG
* [C-0-4] BMP
* [C-0-5] WebP
* [C-0-6] Raw
### 5.1.6\. Image Codecs Details
<table>
<tr>
<th>Format/Codec</th>
<th>Details</th>
<th>Supported File Types/Container Formats</th>
</tr>
<tr>
<td>JPEG</td>
<td>Base+progressive</td>
<td>JPEG (.jpg)</td>
</tr>
<tr>
<td>GIF</td>
<td></td>
<td>GIF (.gif)</td>
</tr>
<tr>
<td>PNG</td>
<td></td>
<td>PNG (.png)</td>
</tr>
<tr>
<td>BMP</td>
<td></td>
<td>BMP (.bmp)</td>
</tr>
<tr>
<td>WebP</td>
<td></td>
<td>WebP (.webp)</td>
</tr>
<tr>
<td>Raw</td>
<td></td>
<td>ARW (.arw), CR2 (.cr2), DNG (.dng), NEF (.nef), NRW (.nrw), ORF (.orf),
PEF (.pef), RAF (.raf), RW2 (.rw2), SRW (.srw)</td>
</tr>
</table>
### 5.1.7\. Video Codecs
* For acceptable quality of web video streaming and video-conference
services, device implementations SHOULD use a hardware VP8 codec that meets the
[requirements](http://www.webmproject.org/hardware/rtc-coding-requirements/).
If device implementations include a video decoder or encoder:
* [C-1-1] Video codecs MUST support output and input bytebuffer sizes that
accommodate the largest feasible compressed and uncompressed frame as dictated
by the standard and configuration, but also do not overallocate.
* [C-1-2] Video encoders and decoders MUST support YUV420 flexible color
format (COLOR_FormatYUV420Flexible).
If device implementations advertise HDR profile support through
[`Display.HdrCapabilities`](
https://developer.android.com/reference/android/view/Display.HdrCapabilities.html),
they:
* [C-2-1] MUST support HDR static metadata parsing and handling.
If device implementations advertise intra refresh support through
`FEATURE_IntraRefresh` in the [`MediaCodecInfo.CodecCapabilities`](
https://developer.android.com/reference/android/media/MediaCodecInfo.CodecCapabilities.html#FEATURE_IntraRefresh)
class, they:
* [C-3-1] MUST support refresh periods in the range of 10 to 60 frames and
accurately operate within 20% of the configured refresh period.
### 5.1.8\. Video Codecs List
<table>
<tr>
<th>Format/Codec</th>
<th>Details</th>
<th>Supported File Types/<br>Container Formats</th>
</tr>
<tr>
<td>H.263</td>
<td></td>
<td><ul>
<li class="table_list">3GPP (.3gp)</li>
<li class="table_list">MPEG-4 (.mp4)</li></ul></td>
</tr>
<tr>
<td>H.264 AVC</td>
<td>See <a href="#5_2_video_encoding">section 5.2 </a>and
<a href="#5_3_video_decoding">5.3</a> for details</td>
<td><ul>
<li class="table_list">3GPP (.3gp)</li>
<li class="table_list">MPEG-4 (.mp4)</li>
<li class="table_list">MPEG-2 TS (.ts, AAC audio only, not seekable, Android
3.0+)</li></ul></td>
</tr>
<tr>
<td>H.265 HEVC</td>
<td>See <a href="#5_3_video_decoding">section 5.3</a> for details</td>
<td>MPEG-4 (.mp4)</td>
</tr>
<tr>
<td>MPEG-2</td>
<td>Main Profile</td>
<td>MPEG2-TS</td>
</tr>
<tr>
<td>MPEG-4 SP</td>
<td></td>
<td>3GPP (.3gp)</td>
</tr>
<tr>
<td>VP8</td>
<td>See <a href="#5_2_video_encoding">section 5.2</a> and
<a href="#5_3_video_decoding">5.3</a> for details</td>
<td><ul>
<li class="table_list"><a href="http://www.webmproject.org/">WebM
(.webm)</a></li>
<li class="table_list">Matroska (.mkv)</li></ul>
</td>
</tr>
<tr>
<td>VP9</td>
<td>See <a href="#5_3_video_decoding">section 5.3</a> for details</td>
<td><ul>
<li class="table_list"><a href="http://www.webmproject.org/">WebM
(.webm)</a></li>
<li class="table_list">Matroska (.mkv)</li></ul>
</td>
</tr>
</table>

## 5.2\. Video Encoding
Handheld device implementations MUST support the following encoding and make it
available to third-party applications.
* [H-0-1] H.264 AVC
* [H-0-2] VP8
Television device implementations MUST support the following encoding.
* [T-0-1] H.264 AVC
* [T-0-2] VP8
Automotive device implementations MUST support the following encoding:
* [A-0-1] H.264 AVC
* [A-0-2] VP8
If device implementations support any video encoder and make it available
to third-party apps, they:
* SHOULD NOT exceed the configured bitrate by more than ~15% between
intraframe (I-frame) intervals, measured over two sliding windows.
* SHOULD NOT exceed the configured bitrate by more than ~100% over a sliding
window of 1 second.
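The sliding-window bounds above can be sketched for the 1-second case: the encoder's output over any one-second window should stay at most about twice the configured bitrate. The frame sizes and frame rate below are hypothetical inputs, not values from any real encoder.

```java
// Checks the ~100%-over-bitrate bound over a 1-second sliding window.
public class BitrateWindow {
    /** Max bits emitted in any window of fps consecutive frames (about 1 s). */
    public static long maxOneSecondBits(long[] frameBits, int fps) {
        long max = 0, window = 0;
        for (int i = 0; i < frameBits.length; i++) {
            window += frameBits[i];
            if (i >= fps) window -= frameBits[i - fps]; // slide the window
            max = Math.max(max, window);
        }
        return max;
    }

    public static boolean withinWindowBound(long[] frameBits, int fps,
                                            long targetBps) {
        return maxOneSecondBits(frameBits, fps) <= 2 * targetBps;
    }
}
```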
If device implementations include an embedded screen display with the
diagonal length of at least 2.5 inches or include a video output port or
declare the support of a camera via the `android.hardware.camera.any`
feature flag, they:
* [C-1-1] MUST include the support of at least one of the VP8 or H.264 video
encoders, and make it available for third-party applications.
* SHOULD support both VP8 and H.264 video encoders, and make it available
for third-party applications.
If device implementations support any of the H.264, VP8, VP9 or HEVC video
encoders and make it available to third-party applications, they:
* [C-2-1] MUST support dynamically configurable bitrates.
* SHOULD support variable frame rates, where video encoder SHOULD determine
instantaneous frame duration based on the timestamps of input buffers, and
allocate its bit bucket based on that frame duration.
If device implementations support the MPEG-4 SP video encoder and make it
available to third-party apps, they:
* SHOULD support dynamically configurable bitrates for the supported encoder.
### 5.2.1\. H.263
If device implementations support H.263 encoders and make it available
to third-party apps, they:
* [C-1-1] MUST support Baseline Profile Level 45.
* SHOULD support dynamically configurable bitrates for the supported encoder.
### 5.2.2\. H-264
Television device implementations are:
* [T-SR] STRONGLY RECOMMENDED to support H.264 encoding of 720p and 1080p
resolution videos.
* [T-SR] STRONGLY RECOMMENDED to support H.264 encoding of 1080p resolution
video at 30 frame-per-second (fps).
If device implementations support H.264 codec, they:
* [C-1-1] MUST support Baseline Profile Level 3.
However, support for ASO (Arbitrary Slice Ordering), FMO (Flexible Macroblock
Ordering) and RS (Redundant Slices) is OPTIONAL. Moreover, to maintain
compatibility with other Android devices, it is RECOMMENDED that ASO, FMO
and RS are not used for Baseline Profile by encoders.
* [C-1-2] MUST support the SD (Standard Definition) video encoding profiles
in the following table.
* SHOULD support Main Profile Level 4.
* SHOULD support the HD (High Definition) video encoding profiles as
indicated in the following table.
If device implementations report support of H.264 encoding for 720p or 1080p
resolution videos through the media APIs, they:
* [C-2-1] MUST support the encoding profiles in the following table.
<table>
<tr>
<th></th>
<th>SD (Low quality)</th>
<th>SD (High quality)</th>
<th>HD 720p</th>
<th>HD 1080p</th>
</tr>
<tr>
<th>Video resolution</th>
<td>320 x 240 px</td>
<td>720 x 480 px</td>
<td>1280 x 720 px</td>
<td>1920 x 1080 px</td>
</tr>
<tr>
<th>Video frame rate</th>
<td>20 fps</td>
<td>30 fps</td>
<td>30 fps</td>
<td>30 fps</td>
</tr>
<tr>
<th>Video bitrate</th>
<td>384 Kbps</td>
<td>2 Mbps</td>
<td>4 Mbps</td>
<td>10 Mbps</td>
</tr>
</table>
### 5.2.3\. VP8
If device implementations support VP8 codec, they:
* [C-1-1] MUST support the SD video encoding profiles.
* SHOULD support the following HD (High Definition) video encoding profiles.
* SHOULD support writing Matroska WebM files.
* SHOULD use a hardware VP8 codec that meets the
[WebM project RTC hardware coding requirements](
http://www.webmproject.org/hardware/rtc-coding-requirements), to ensure
acceptable quality of web video streaming and video-conference services.
If device implementations report support of VP8 encoding for 720p or 1080p
resolution videos through the media APIs, they:
* [C-2-1] MUST support the encoding profiles in the following table.
<table>
<tr>
<th></th>
<th>SD (Low quality)</th>
<th>SD (High quality)</th>
<th>HD 720p</th>
<th>HD 1080p</th>
</tr>
<tr>
<th>Video resolution</th>
<td>320 x 180 px</td>
<td>640 x 360 px</td>
<td>1280 x 720 px</td>
<td>1920 x 1080 px</td>
</tr>
<tr>
<th>Video frame rate</th>
<td>30 fps</td>
<td>30 fps</td>
<td>30 fps</td>
<td>30 fps</td>
</tr>
<tr>
<th>Video bitrate</th>
<td>800 Kbps </td>
<td>2 Mbps</td>
<td>4 Mbps</td>
<td>10 Mbps</td>
</tr>
</table>
### 5.2.4\. VP9
If device implementations support VP9 codec, they:
* SHOULD support writing Matroska WebM files.

## 5.3\. Video Decoding
Handheld device implementations:
* [H-0-1] MUST support decoding of H.264 AVC.
* [H-0-2] MUST support decoding of H.265 HEVC.
* [H-0-3] MUST support decoding of MPEG-4 SP.
* [H-0-4] MUST support decoding of VP8.
* [H-0-5] MUST support decoding of VP9.
Television device implementations:
* [T-0-1] MUST support decoding of H.264 AVC.
* [T-0-2] MUST support decoding of H.265 HEVC.
* [T-0-3] MUST support decoding of MPEG-4 SP.
* [T-0-4] MUST support decoding of VP8.
* [T-0-5] MUST support decoding of VP9.
* [T-SR] Are STRONGLY RECOMMENDED to support MPEG-2 decoding.
Automotive device implementations:
* [A-0-1] MUST support decoding of H.264 AVC.
* [A-0-2] MUST support decoding of MPEG-4 SP.
* [A-0-3] MUST support decoding of VP8.
* [A-0-4] MUST support decoding of VP9.
* [A-SR] Are STRONGLY RECOMMENDED to support H.265 HEVC decoding.
If device implementations support VP8, VP9, H.264, or H.265 codecs, they:
* [C-1-1] MUST support dynamic video resolution and frame rate switching
through the standard Android APIs within the same stream for all VP8, VP9,
H.264, and H.265 codecs in real time and up to the maximum resolution supported
by each codec on the device.
If device implementations declare support for the Dolby Vision decoder through
[`HDR_TYPE_DOLBY_VISION`](https://developer.android.com/reference/android/view/Display.HdrCapabilities.html#HDR_TYPE_DOLBY_VISION)
, they:
* [C-2-1] MUST provide a Dolby Vision-capable extractor.
* [C-2-2] MUST properly display Dolby Vision content on the device screen or
on a standard video output port (e.g., HDMI).
* [C-2-3] MUST set the track index of backward-compatible base-layer(s) (if
present) to be the same as the combined Dolby Vision layer's track index.
### 5.3.1\. MPEG-2
If device implementations support MPEG-2 decoders, they:
* [C-1-1] MUST support the Main Profile High Level.
### 5.3.2\. H.263
If device implementations support H.263 decoders, they:
* [C-1-1] MUST support Baseline Profile Level 30 and Level 45.
### 5.3.3\. MPEG-4
If device implementations support MPEG-4 decoders, they:
* [C-1-1] MUST support Simple Profile Level 3.
### 5.3.4\. H.264
If device implementations support H.264 decoders, they:
* [C-1-1] MUST support Main Profile Level 3.1 and Baseline Profile. Support
for ASO (Arbitrary Slice Ordering), FMO (Flexible Macroblock Ordering) and RS
(Redundant Slices) is OPTIONAL.
* [C-1-2] MUST be capable of decoding videos with the SD (Standard Definition)
profiles listed in the following table and encoded with the Baseline Profile
and Main Profile Level 3.1 (including 720p30).
* SHOULD be capable of decoding videos with the HD (High Definition) profiles
as indicated in the following table.
If the height that is reported by the `Display.getSupportedModes()` method is
equal to or greater than the video resolution, device implementations:
* [C-2-1] MUST support the HD 720p video encoding profiles in the following
table.
* [C-2-2] MUST support the HD 1080p video encoding profiles in the following
table.
If Television device implementations support H.264 decoders, they:
* [T-1-1] MUST support High Profile Level 4.2 and the HD 1080p (at 60 fps)
decoding profile.
* [T-1-2] MUST be capable of decoding videos with both HD profiles as
indicated in the following table and encoded with either the Baseline Profile,
Main Profile, or the High Profile Level 4.2.
<table>
<tr>
<th></th>
<th>SD (Low quality)</th>
<th>SD (High quality)</th>
<th>HD 720p</th>
<th>HD 1080p</th>
</tr>
<tr>
<th>Video resolution</th>
<td>320 x 240 px</td>
<td>720 x 480 px</td>
<td>1280 x 720 px</td>
<td>1920 x 1080 px</td>
</tr>
<tr>
<th>Video frame rate</th>
<td>30 fps</td>
<td>30 fps</td>
<td>60 fps</td>
<td>30 fps (60 fps<sup>Television</sup>)</td>
</tr>
<tr>
<th>Video bitrate</th>
<td>800 Kbps </td>
<td>2 Mbps</td>
<td>8 Mbps</td>
<td>20 Mbps</td>
</tr>
</table>
### 5.3.5\. H.265 (HEVC)
If device implementations support H.265 codec, they:
* [C-1-1] MUST support the Main Profile Level 3 Main tier and the SD video
decoding profiles as indicated in the following table.
* SHOULD support the HD decoding profiles as indicated in the following table.
* [C-1-2] MUST support the HD decoding profiles as indicated in the following
table if there is a hardware decoder.
If the height that is reported by the `Display.getSupportedModes()` method is
equal to or greater than the video resolution, then:
* [C-2-1] Device implementations MUST support at least one of H.265 or VP9
decoding of 720, 1080 and UHD profiles.
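The display-height gate in C-2-1 above can be sketched as a filter over the profile heights from the tables in this section; the class and method names are illustrative, and a real check would read the height from `Display.getSupportedModes()`.

```java
import java.util.ArrayList;
import java.util.List;

// Profiles become mandatory (via H.265 or VP9) once the display can
// show at least that many rows: 720p, 1080p, UHD.
public class HevcVp9Gate {
    static final int[] PROFILE_HEIGHTS = {720, 1080, 2160};

    public static List<Integer> requiredProfileHeights(int displayHeight) {
        List<Integer> required = new ArrayList<>();
        for (int h : PROFILE_HEIGHTS) {
            if (displayHeight >= h) required.add(h);
        }
        return required;
    }
}
```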
If Television device implementations support H.265 codec and the HD 1080p
decoding profile, they:
* [T-1-1] MUST support the Main Profile Level 4.1 Main tier.
* [T-SR] STRONGLY RECOMMENDED to support 60 fps video frame rate
for HD 1080p.
If Television device implementations support H.265 codec and the UHD decoding
profile, then:
* [T-2-1] The codec MUST support Main10 Level 5 Main Tier profile.
<table>
<tr>
<th></th>
<th>SD (Low quality)</th>
<th>SD (High quality)</th>
<th>HD 720p</th>
<th>HD 1080p</th>
<th>UHD</th>
</tr>
<tr>
<th>Video resolution</th>
<td>352 x 288 px</td>
<td>720 x 480 px</td>
<td>1280 x 720 px</td>
<td>1920 x 1080 px</td>
<td>3840 x 2160 px</td>
</tr>
<tr>
<th>Video frame rate</th>
<td>30 fps</td>
<td>30 fps</td>
<td>30 fps</td>
<td>30/60 fps (60 fps<sup>Television with H.265 hardware decoding</sup>)</td>
<td>60 fps</td>
</tr>
<tr>
<th>Video bitrate</th>
<td>600 Kbps </td>
<td>1.6 Mbps</td>
<td>4 Mbps</td>
<td>5 Mbps</td>
<td>20 Mbps</td>
</tr>
</table>
### 5.3.6\. VP8
If device implementations support VP8 codec, they:
* [C-1-1] MUST support the SD decoding profiles in the following table.
* SHOULD use a hardware VP8 codec that meets the
[requirements](http://www.webmproject.org/hardware/rtc-coding-requirements/).
* SHOULD support the HD decoding profiles in the following table.
If the height as reported by the `Display.getSupportedModes()` method is equal
to or greater than the video resolution, then:
* [C-2-1] Device implementations MUST support 720p profiles in the
following table.
* [C-2-2] Device implementations MUST support 1080p profiles in the
following table.
If Television device implementations support VP8 codec, they:
* [T-1-1] MUST support the HD 1080p60 decoding profile.
If Television device implementations support VP8 codec and support 720p, they:
* [T-2-1] MUST support the HD 720p60 decoding profile.
<table>
<tr>
<th></th>
<th>SD (Low quality)</th>
<th>SD (High quality)</th>
<th>HD 720p</th>
<th>HD 1080p</th>
</tr>
<tr>
<th>Video resolution</th>
<td>320 x 180 px</td>
<td>640 x 360 px</td>
<td>1280 x 720 px</td>
<td>1920 x 1080 px</td>
</tr>
<tr>
<th>Video frame rate</th>
<td>30 fps</td>
<td>30 fps</td>
<td>30 fps (60 fps<sup>Television</sup>)</td>
<td>30 fps (60 fps<sup>Television</sup>)</td>
</tr>
<tr>
<th>Video bitrate</th>
<td>800 Kbps </td>
<td>2 Mbps</td>
<td>8 Mbps</td>
<td>20 Mbps</td>
</tr>
</table>
### 5.3.7\. VP9
If device implementations support VP9 codec, they:
* [C-1-1] MUST support the SD video decoding profiles as indicated in the
following table.
* SHOULD support the HD decoding profiles as indicated in the following table.
If device implementations support VP9 codec and a hardware decoder, they:
* [C-2-1] MUST support the HD decoding profiles as indicated in the following
table.
If the height that is reported by the `Display.getSupportedModes()` method is
equal to or greater than the video resolution, then:
* [C-3-1] Device implementations MUST support at least one of VP9 or H.265
decoding of the 720, 1080 and UHD profiles.
If Television device implementations support VP9 codec and the UHD video
decoding, they:
* [T-1-1] MUST support 8-bit color depth and SHOULD support VP9 Profile 2
(10-bit).
If Television device implementations support VP9 codec, the 1080p profile and
VP9 hardware decoding, they:
* [T-2-1] MUST support 60 fps for 1080p.
<table>
<tr>
<th></th>
<th>SD (Low quality)</th>
<th>SD (High quality)</th>
<th>HD 720p</th>
<th>HD 1080p</th>
<th>UHD</th>
</tr>
<tr>
<th>Video resolution</th>
<td>320 x 180 px</td>
<td>640 x 360 px</td>
<td>1280 x 720 px</td>
<td>1920 x 1080 px</td>
<td>3840 x 2160 px</td>
</tr>
<tr>
<th>Video frame rate</th>
<td>30 fps</td>
<td>30 fps</td>
<td>30 fps</td>
<td>30 fps (60 fps<sup>Television with VP9 hardware decoding</sup>)</td>
<td>60 fps</td>
</tr>
<tr>
<th>Video bitrate</th>
<td>600 Kbps</td>
<td>1.6 Mbps</td>
<td>4 Mbps</td>
<td>5 Mbps</td>
<td>20 Mbps</td>
</tr>
</table>

## 5.4\. Audio Recording
While some of the requirements outlined in this section are listed as SHOULD
since Android 4.3, the Compatibility Definition for future versions is planned
to change these to MUST. Existing and new Android devices are **STRONGLY
RECOMMENDED** to meet these requirements that are listed as SHOULD, or they
will not be able to attain Android compatibility when upgraded to the future
version.
### 5.4.1\. Raw Audio Capture
If device implementations declare `android.hardware.microphone`, they:
* [C-1-1] MUST allow capture of raw audio content with the following
characteristics:
* **Format**: Linear PCM, 16-bit
* **Sampling rates**: 8000, 11025, 16000, 44100 Hz
* **Channels**: Mono
* [C-1-2] MUST capture at the above sample rates without up-sampling.
* [C-1-3] MUST include an appropriate anti-aliasing filter when the
sample rates given above are captured with down-sampling.
* SHOULD allow AM radio and DVD quality capture of raw audio content, that is,
with the following characteristics:
* **Format**: Linear PCM, 16-bit
* **Sampling rates**: 22050, 48000 Hz
* **Channels**: Stereo
If device implementations allow AM radio and DVD quality capture of raw audio
content, they:
* [C-2-1] MUST capture without up-sampling at any ratio higher
than 16000:22050 or 44100:48000.
* [C-2-2] MUST include an appropriate anti-aliasing filter for any
up-sampling or down-sampling.
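A minimal sketch of the ratio limit in [C-2-1], assuming it is read as capping the permitted up-sampling ratio at the larger of the two listed conversions (the helper name and this interpretation are illustrative, not from the CDD):

```python
# Permitted up-sampling ratios per the requirement above: 22050/16000 and
# 48000/44100. This sketch assumes the limit is the larger of the two.
MAX_UPSAMPLE_RATIO = max(22050 / 16000, 48000 / 44100)  # 22050/16000 = 1.378125

def upsample_within_limit(native_rate_hz, target_rate_hz):
    """True if capturing target_rate_hz from native_rate_hz stays within the limit."""
    if target_rate_hz <= native_rate_hz:
        # Down-sampling or pass-through; anti-aliasing still required per [C-2-2].
        return True
    return target_rate_hz / native_rate_hz <= MAX_UPSAMPLE_RATIO
```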
### 5.4.2\. Capture for Voice Recognition
If device implementations declare `android.hardware.microphone`, they:
* [C-1-1] MUST capture
`android.media.MediaRecorder.AudioSource.VOICE_RECOGNITION` audio source at
one of the sampling rates, 44100 and 48000.
* [C-1-2] MUST, by default, disable any noise reduction audio processing when
recording an audio stream from the `AudioSource.VOICE_RECOGNITION` audio
source.
* [C-1-3] MUST, by default, disable any automatic gain control when recording
an audio stream from the `AudioSource.VOICE_RECOGNITION` audio source.
* SHOULD record the voice recognition audio stream with approximately flat
amplitude versus frequency characteristics: specifically, ±3 dB, from 100 Hz
to 4000 Hz.
* SHOULD record the voice recognition audio stream with input sensitivity set
such that a 90 dB sound power level (SPL) source at 1000 Hz yields RMS of
2500 for 16-bit samples.
* SHOULD record the voice recognition audio stream so that the PCM amplitude
levels linearly track input SPL changes over at least a 30 dB range from -18
dB to +12 dB re 90 dB SPL at the microphone.
* SHOULD record the voice recognition audio stream with total harmonic
distortion (THD) less than 1% for 1 kHz at 90 dB SPL input level at the
microphone.
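A back-of-the-envelope check of the sensitivity and tracking-range figures above (purely illustrative; the headroom calculation compares the RMS target against the 16-bit full-scale value, so it is approximate):

```python
import math

FULL_SCALE_16BIT = 32767.0
TARGET_RMS = 2500.0      # target RMS for a 90 dB SPL, 1 kHz source (from the SHOULD above)
REFERENCE_SPL_DB = 90.0

# Approximate headroom between the 90 dB SPL operating point and digital full scale.
headroom_db = 20.0 * math.log10(FULL_SCALE_16BIT / TARGET_RMS)  # about 22.4 dB

# The linear-tracking range runs from -18 dB to +12 dB re 90 dB SPL,
# i.e. 72 dB SPL up to 102 dB SPL at the microphone.
max_tracked_spl_db = REFERENCE_SPL_DB + 12.0
clip_spl_db = REFERENCE_SPL_DB + headroom_db  # roughly where clipping would begin
```

The top of the tracking range (102 dB SPL) therefore sits comfortably below the approximate clipping point (about 112 dB SPL).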
If device implementations declare `android.hardware.microphone` and noise
suppression (reduction) technologies tuned for speech recognition, they:
* [C-2-1] MUST allow this audio effect to be controllable with the
`android.media.audiofx.NoiseSuppressor` API.
* [C-2-2] MUST uniquely identify each noise suppression technology
implementation via the `AudioEffect.Descriptor.uuid` field.
### 5.4.3\. Capture for Rerouting of Playback
The `android.media.MediaRecorder.AudioSource` class includes the `REMOTE_SUBMIX`
audio source.
If device implementations declare both `android.hardware.audio.output` and
`android.hardware.microphone`, they:
* [C-1-1] MUST properly implement the `REMOTE_SUBMIX` audio source so that
when an application uses the `android.media.AudioRecord` API to record from this
audio source, it captures a mix of all audio streams except for the following:
* `AudioManager.STREAM_RING`
* `AudioManager.STREAM_ALARM`
* `AudioManager.STREAM_NOTIFICATION`
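A hypothetical sketch of the mixing rule in [C-1-1]: given the set of active stream types, a `REMOTE_SUBMIX` capture includes everything except the three excluded stream types (the helper and plain stream-name strings are illustrative, not an Android API):

```python
# Stream types excluded from the REMOTE_SUBMIX mix, per [C-1-1] above.
EXCLUDED_FROM_REMOTE_SUBMIX = {"STREAM_RING", "STREAM_ALARM", "STREAM_NOTIFICATION"}

def remote_submix_streams(active_streams):
    """Return the subset of active streams a REMOTE_SUBMIX capture would include."""
    return [s for s in active_streams if s not in EXCLUDED_FROM_REMOTE_SUBMIX]
```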

## 5.5\. Audio Playback
Android includes support for apps to play back audio through the audio
output peripheral as defined in section 7.8.2.
### 5.5.1\. Raw Audio Playback
If device implementations declare `android.hardware.audio.output`, they:
* [C-1-1] MUST allow playback of raw audio content with the following
characteristics:
* **Format**: Linear PCM, 16-bit
* **Sampling rates**: 8000, 11025, 16000, 22050, 32000, 44100 Hz
* **Channels**: Mono, Stereo
* SHOULD allow playback of raw audio content with the following
characteristics:
* **Sampling rates**: 24000, 48000 Hz
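The required playback characteristics pin down the raw data rate of each configuration; a small illustrative helper (not part of the CDD):

```python
def pcm_byte_rate(sample_rate_hz, channels, bits_per_sample=16):
    """Bytes per second of raw linear PCM for the given configuration."""
    return sample_rate_hz * channels * bits_per_sample // 8

# e.g. 16-bit stereo at 44100 Hz is 176,400 bytes of audio data per second.
```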
### 5.5.2\. Audio Effects
Android provides an [API for audio effects](
http://developer.android.com/reference/android/media/audiofx/AudioEffect.html)
for device implementations.
If device implementations declare the feature `android.hardware.audio.output`,
they:
* [C-1-1] MUST support the `EFFECT_TYPE_EQUALIZER` and
`EFFECT_TYPE_LOUDNESS_ENHANCER` implementations controllable through the
AudioEffect subclasses `Equalizer`, `LoudnessEnhancer`.
* [C-1-2] MUST support the visualizer API implementation, controllable through
the `Visualizer` class.
* SHOULD support the `EFFECT_TYPE_BASS_BOOST`, `EFFECT_TYPE_ENV_REVERB`,
`EFFECT_TYPE_PRESET_REVERB`, and `EFFECT_TYPE_VIRTUALIZER` implementations
controllable through the `AudioEffect` sub-classes `BassBoost`,
`EnvironmentalReverb`, `PresetReverb`, and `Virtualizer`.
### 5.5.3\. Audio Output Volume
Television device implementations:
* [T-0-1] MUST include support for system Master Volume and digital audio
output volume attenuation on supported outputs,
except for compressed audio passthrough output (where no audio decoding is done
on the device).
Automotive device implementations:
* SHOULD allow adjusting audio volume
separately for each audio stream using the content type or usage as defined
by [AudioAttributes](http://developer.android.com/reference/android/media/AudioAttributes.html)
and car audio usage as publicly defined in `android.car.CarAudioManager`.

## 5.6\. Audio Latency
Audio latency is the time delay as an audio signal passes through a system.
Many classes of applications rely on short latencies to achieve real-time
sound effects.
For the purposes of this section, use the following definitions:
* **output latency**. The interval between when an application writes a frame
of PCM-coded data and when the corresponding sound is presented to the
environment at an on-device transducer, or the signal leaves the device via a
port and can be observed externally.
* **cold output latency**. The output latency for the first frame, when the
audio output system has been idle and powered down prior to the request.
* **continuous output latency**. The output latency for subsequent frames,
after the device is playing audio.
* **input latency**. The interval between when a sound is presented by the
environment to the device at an on-device transducer, or the signal enters the
device via a port, and when an application reads the corresponding frame of
PCM-coded data.
* **lost input**. The initial portion of an input signal that is unusable or
unavailable.
* **cold input latency**. The sum of lost input time and the input latency
for the first frame, when the audio input system has been idle and powered down
prior to the request.
* **continuous input latency**. The input latency for subsequent frames,
while the device is capturing audio.
* **cold output jitter**. The variability among separate measurements of cold
output latency values.
* **cold input jitter**. The variability among separate measurements of cold
input latency values.
* **continuous round-trip latency**. The sum of continuous input latency plus
continuous output latency plus one buffer period. The buffer period allows
time for the app to process the signal and time for the app to mitigate phase
difference between input and output streams.
* **OpenSL ES PCM buffer queue API**. The set of PCM-related
[OpenSL ES](https://developer.android.com/ndk/guides/audio/opensl/index.html)
APIs within [Android NDK](https://developer.android.com/ndk/index.html).
* **AAudio native audio API**. The set of
[AAudio](https://developer.android.com/ndk/guides/audio/aaudio/aaudio.html) APIs
within [Android NDK](https://developer.android.com/ndk/index.html).
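The round-trip definition above can be sketched numerically; the buffer size and latency figures below are hypothetical examples, not requirements:

```python
def buffer_period_ms(frames, sample_rate_hz):
    """Duration of one buffer, in milliseconds."""
    return 1000.0 * frames / sample_rate_hz

def continuous_round_trip_ms(input_ms, output_ms, buffer_frames, sample_rate_hz):
    """Continuous round-trip latency: continuous input latency plus
    continuous output latency plus one buffer period."""
    return input_ms + output_ms + buffer_period_ms(buffer_frames, sample_rate_hz)

# Hypothetical device: 20 ms continuous input, 25 ms continuous output,
# and a 240-frame buffer at 48000 Hz (a 5 ms buffer period) -> 50 ms round trip.
```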
If device implementations declare `android.hardware.audio.output`, they are
STRONGLY RECOMMENDED to meet or exceed the following requirements:
* [SR] Cold output latency of 100 milliseconds or less
* [SR] Continuous output latency of 45 milliseconds or less
* [SR] Minimize the cold output jitter
If device implementations meet the above requirements after any initial
calibration when using the OpenSL ES PCM buffer queue API, for continuous output
latency and cold output latency over at least one supported audio output device,
they are:
* [SR] STRONGLY RECOMMENDED to report low latency audio by declaring
`android.hardware.audio.low_latency` feature flag.
* [SR] STRONGLY RECOMMENDED to also meet the requirements for low-latency
audio via the AAudio API.
If device implementations do not meet the requirements for low-latency audio
via the OpenSL ES PCM buffer queue API, they:
* [C-1-1] MUST NOT report support for low-latency audio.
If device implementations include `android.hardware.microphone`, they are
STRONGLY RECOMMENDED to meet these input audio requirements:
* [SR] Cold input latency of 100 milliseconds or less
* [SR] Continuous input latency of 30 milliseconds or less
* [SR] Continuous round-trip latency of 50 milliseconds or less
* [SR] Minimize the cold input jitter

## 5.7\. Network Protocols
Device implementations MUST support the [media network protocols](
http://developer.android.com/guide/appendix/media-formats.html)
for audio and video playback as specified in the Android SDK documentation.
If device implementations include an audio or a video decoder, they:
* [C-1-1] MUST support all required codecs and container formats in
[section 5.1](#5_1_media_codecs) over HTTP(S).
* [C-1-2] MUST support the media segment formats shown in
the Media Segment Formats table below over
[HTTP Live Streaming draft protocol, Version 7](
http://tools.ietf.org/html/draft-pantos-http-live-streaming-07).
* [C-1-3] MUST support the following RTP audio video profile and related
codecs in the RTSP table below. For exceptions please see the table footnotes
in [section 5.1](#5_1_media_codecs).
Media Segment Formats
<table>
<tr>
<th>Segment formats</th>
<th>Reference(s)</th>
<th>Required codec support</th>
</tr>
<tr id="mp2t">
<td>MPEG-2 Transport Stream</td>
<td><a href="http://www.iso.org/iso/catalogue_detail?csnumber=44169">ISO 13818</a></td>
<td>
Video codecs:
<ul>
<li class="table_list">H264 AVC</li>
<li class="table_list">MPEG-4 SP</li>
<li class="table_list">MPEG-2</li>
</ul>
See <a href="#5_1_3_video_codecs">section 5.1.3</a> for details on H264 AVC, MPEG-4 SP,<br/>
and MPEG-2.
<p>Audio codecs:
<ul>
<li class="table_list">AAC</li>
</ul>
See <a href="#5_1_1_audio_codecs">section 5.1.1 </a> for details on AAC and its variants.
</td>
</tr>
<tr>
<td>AAC with ADTS framing and ID3 tags</td>
<td><a href="http://www.iso.org/iso/home/store/catalogue_tc/catalogue_detail.htm?csnumber=43345">ISO 13818-7</a></td>
<td>See <a href="#5_1_1_audio_codecs">section 5.1.1 </a>
for details on AAC and its variants</td>
</tr>
<tr>
<td>WebVTT</td>
<td><a href="http://dev.w3.org/html5/webvtt/">WebVTT</a></td>
<td></td>
</tr>
</table>
RTSP (RTP, SDP)
<table>
<tr>
<th>Profile name</th>
<th>Reference(s)</th>
<th>Required codec support</th>
</tr>
<tr>
<td>H264 AVC</td>
<td><a href="https://tools.ietf.org/html/rfc6184">RFC 6184</a></td>
<td>See <a href="#5_1_3_video_codecs">section 5.1.3 </a>
for details on H264 AVC</td>
</tr>
<tr>
<td>MP4A-LATM</td>
<td><a href="https://tools.ietf.org/html/rfc6416">RFC 6416</a></td>
<td>See <a href="#5_1_1_audio_codecs">section 5.1.1 </a>
for details on AAC and its variants</td>
</tr>
<tr>
<td>H263-1998</td>
<td>
<a href="https://tools.ietf.org/html/rfc3551">RFC 3551</a><br/>
<a href="https://tools.ietf.org/html/rfc4629">RFC 4629</a><br/>
<a href="https://tools.ietf.org/html/rfc2190">RFC 2190</a>
</td>
<td>See <a href="#5_1_3_video_codecs">section 5.1.3 </a>
for details on H263
</td>
</tr>
<tr>
<td>H263-2000</td>
<td>
<a href="https://tools.ietf.org/html/rfc4629">RFC 4629</a>
</td>
<td>See <a href="#5_1_3_video_codecs">section 5.1.3 </a>
for details on H263
</td>
</tr>
<tr>
<td>AMR</td>
<td>
<a href="https://tools.ietf.org/html/rfc4867">RFC 4867</a>
</td>
<td>See <a href="#5_1_1_audio_codecs">section 5.1.1 </a>
for details on AMR-NB
</td>
</tr>
<tr>
<td>AMR-WB</td>
<td>
<a href="https://tools.ietf.org/html/rfc4867">RFC 4867</a>
</td>
<td>See <a href="#5_1_1_audio_codecs">section 5.1.1 </a>
for details on AMR-WB
</td>
</tr>
<tr>
<td>MP4V-ES</td>
<td>
<a href="https://tools.ietf.org/html/rfc6416">RFC 6416</a>
</td>
<td>See <a href="#5_1_3_video_codecs">section 5.1.3 </a>
for details on MPEG-4 SP
</td>
</tr>
<tr>
<td>mpeg4-generic</td>
<td><a href="https://tools.ietf.org/html/rfc3640">RFC 3640</a></td>
<td>See <a href="#5_1_1_audio_codecs">section 5.1.1 </a>
for details on AAC and its variants</td>
</tr>
<tr>
<td>MP2T</td>
<td><a href="https://tools.ietf.org/html/rfc2250">RFC 2250</a></td>
<td>See <a href="#mp2t">MPEG-2 Transport Stream</a> underneath HTTP Live Streaming for details</td>
</tr>
</table>

## 5.8\. Secure Media
If device implementations support secure video output and are capable of
supporting secure surfaces, they:
* [C-1-1] MUST declare support for `Display.FLAG_SECURE`.
If device implementations declare support for `Display.FLAG_SECURE` and support
wireless display protocol, they:
* [C-2-1] MUST secure the link with a cryptographically strong mechanism such
as HDCP 2.x or higher for the displays connected through wireless protocols
such as Miracast.
If device implementations declare support for `Display.FLAG_SECURE` and
support wired external display, they:
* [C-3-1] MUST support HDCP 1.2 or higher for all wired external displays.
If device implementations are Android Television devices and support 4K
resolution, they:
* [T-1-1] MUST support HDCP 2.2 for all wired external displays.
If Television device implementations don't support 4K resolution, they:
* [T-2-1] MUST support HDCP 1.4 for all wired external displays.
* [T-SR] Television device implementations are STRONGLY RECOMMENDED to
support simultaneous decoding of secure streams. At minimum, simultaneous
decoding of two streams is STRONGLY RECOMMENDED.

## 5.9\. Musical Instrument Digital Interface (MIDI)
If a device implementation supports the inter-app MIDI software transport
(virtual MIDI devices), and it supports MIDI over _all_ of the following
MIDI-capable hardware transports for which it provides generic non-MIDI
connectivity, it is:
* [SR] STRONGLY RECOMMENDED to report support for feature
`android.software.midi` via the [android.content.pm.PackageManager](http://developer.android.com/reference/android/content/pm/PackageManager.html)
class.
The MIDI-capable hardware transports are:
* USB host mode (section 7.7 USB)
* USB peripheral mode (section 7.7 USB)
* MIDI over Bluetooth LE acting in central role (section 7.4.3 Bluetooth)
If the device implementation provides generic non-MIDI connectivity over a
particular MIDI-capable hardware transport listed above, but does not support
MIDI over that hardware transport, it:
* [C-1-1] MUST NOT report support for feature `android.software.midi`.