Android Multimedia Framework

DESCRIPTION

This slide deck shows how the StageFright framework works on Android.

TRANSCRIPT

Android Multimedia Framework

on Jelly Bean

Author: Picker

Introduction to OpenMAX

Resource from Khronos::OpenMAX

From Multimedia Framework to OpenMAX

Resource from Khronos::OpenMAX

Operating System
Multimedia Framework
OpenMAX IL
Software & Hardware Codec

Go through OpenMAX

Developer API (MediaPlayer)
EventHandler / SurfaceHolder
StageFright
OpenMAX Interface
OpenMAX IL
Software & Hardware Codec

The simple stack architecture
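To make the developer-facing layers of the stack above (Developer API, EventHandler, SurfaceHolder) concrete, here is a hedged, minimal sketch of that top layer only: a MediaPlayer bound to a SurfaceHolder, which is the point where an application hands playback down to StageFright and the codec layers below. The activity structure and the media path are illustrative assumptions, not part of the original slides.

import android.app.Activity;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import java.io.IOException;

// Developer API layer of the stack: MediaPlayer + SurfaceHolder.
// Everything below setDataSource()/prepare()/start() is handled by
// StageFright through the OpenMAX IL codecs.
public class PlayerActivity extends Activity implements SurfaceHolder.Callback {

    private MediaPlayer mPlayer;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        SurfaceView view = new SurfaceView(this);
        setContentView(view);
        view.getHolder().addCallback(this);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        try {
            mPlayer = new MediaPlayer();
            mPlayer.setDisplay(holder);                   // SurfaceHolder layer
            mPlayer.setDataSource("/sdcard/sample.mp4");  // placeholder path
            mPlayer.prepare();                            // StageFright does the real work
            mPlayer.start();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        // No-op for this sketch.
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        if (mPlayer != null) {
            mPlayer.release();
            mPlayer = null;
        }
    }
}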

Big picture of StageFright

The Simple Workflow

The StageFright Workflow

Path of All the Components

Path of the Source Files:
frameworks/base/media/java/android/media
frameworks/base/media/jni
frameworks/av/media/libmedia
frameworks/av/media/libmediaplayerservice
frameworks/av/media/libstagefright

Big Picture: Class Diagram

High-Level Applications Framework

StageFright Framework

OpenMAX Interface

Four Steps for Playing a Media File

Step 1: MediaPlayer mp = new MediaPlayer();
Step 2: mp.setDataSource(mediaPath);
Step 3: mp.prepare();
Step 4: mp.start();
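The same four steps, written out as a hedged snippet with the error handling the slide omits; the helper class and the mediaPath argument are illustrative. setDataSource() and prepare() can throw IOException, and release() returns the resources the media server holds for the player.

import android.media.MediaPlayer;
import java.io.IOException;

// Hedged sketch of the four steps, with cleanup on failure.
final class FourSteps {
    static MediaPlayer play(String mediaPath) throws IOException {
        MediaPlayer mp = new MediaPlayer();  // Step 1: create the player (Idle)
        try {
            mp.setDataSource(mediaPath);     // Step 2: point it at the media (Initialized)
            mp.prepare();                    // Step 3: blocking prepare (Prepared)
            mp.start();                      // Step 4: begin playback (Started)
            return mp;
        } catch (IOException e) {
            mp.release();                    // free the media server resources on failure
            throw e;
        }
    }
}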

How to set the data source?

The workflow of the setDataSource

The sequence flow of the setDataSource
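As a hedged illustration of this step, the snippet below shows the common setDataSource() overloads on the Java side; whichever one is used, the call is forwarded over JNI to the media player service, which hands the source to StageFright. The paths and the URI are placeholders.

import android.content.Context;
import android.media.MediaPlayer;
import android.net.Uri;
import java.io.FileInputStream;
import java.io.IOException;

final class DataSourceExamples {
    // A local file path (placeholder).
    static void fromPath(MediaPlayer mp) throws IOException {
        mp.setDataSource("/sdcard/sample.mp4");
    }

    // A content:// or http:// URI resolved by the framework (placeholder URI).
    static void fromUri(MediaPlayer mp, Context context) throws IOException {
        mp.setDataSource(context, Uri.parse("http://example.com/stream.mp4"));
    }

    // An already-open file descriptor; it may be closed again as soon as the call returns.
    static void fromFd(MediaPlayer mp, String path) throws IOException {
        FileInputStream in = new FileInputStream(path);
        try {
            mp.setDataSource(in.getFD());
        } finally {
            in.close();
        }
    }
}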

How to prepare to play?

The workflow of the prepare

The sequence flow of the prepare
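A hedged sketch of this step: prepare() parses the media synchronously and blocks the calling thread, while prepareAsync() returns immediately and reports back through the EventHandler as an onPrepared() callback, which is the usual choice for network streams. The helper class and listener wiring are illustrative.

import android.media.MediaPlayer;
import java.io.IOException;

final class PrepareExamples {
    // Synchronous: fine for short local files, blocks until the metadata is parsed.
    static void prepareBlocking(MediaPlayer mp) throws IOException {
        mp.prepare();
        mp.start();
    }

    // Asynchronous: the framework posts onPrepared() back through the EventHandler.
    static void prepareNonBlocking(MediaPlayer mp) {
        mp.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer player) {
                player.start();   // safe to start once Prepared
            }
        });
        mp.prepareAsync();
    }
}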

How to start playing?

The workflow of the start

The sequence flow of the start
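A hedged sketch of start() and the calls around it: once playback has started, pause(), seekTo(), and the completion callback cover the rest of the lifecycle, and release() gives the codec and player-service resources back. The helper class and listener wiring are illustrative.

import android.media.MediaPlayer;

final class StartExamples {
    static void startWithCallbacks(final MediaPlayer mp) {
        mp.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
            @Override
            public void onCompletion(MediaPlayer player) {
                player.release();          // playback finished, free the player
            }
        });
        mp.start();                        // Prepared -> Started

        if (mp.isPlaying()) {
            mp.pause();                    // Started -> Paused
            mp.seekTo(0);                  // rewind; completes asynchronously
            mp.start();                    // Paused -> Started
        }
    }
}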

Synchronization Architecture of StageFright

Synchronization Flow of StageFright

The Synchronization Formula of StageFright

mPositionTimeRealUs = ((mNumFramesPlayed + size_done / mFrameSize) / mSampleRate) × 1000000

mTimeSourceDeltaUs=mPositionTimeRealUs−mPositionTimeMediaUs

nowUs=RealTimeUs−mTimeSourceDeltaUs

latenessUs=nowUs−timeUs

mPositionTimeRealUs: the elapsed real playback time
mPositionTimeMediaUs: the time defined in the media source
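To make the formula concrete, here is a hedged sketch that recomputes latenessUs from the same quantities in plain Java, standing in for the C++ member variables; the class and method names are invented for illustration.

// Hedged re-statement of the StageFright A/V sync formula in plain Java.
// Inputs are in the same units as the slide: frames, bytes, Hz, microseconds.
final class AvSync {
    // Real playback time derived from how much audio has actually been consumed.
    static long positionTimeRealUs(long numFramesPlayed, long sizeDone,
                                   long frameSize, long sampleRate) {
        return ((numFramesPlayed + sizeDone / frameSize) * 1000000L) / sampleRate;
    }

    // Drift between the real clock and the media timestamps.
    static long timeSourceDeltaUs(long positionTimeRealUs, long positionTimeMediaUs) {
        return positionTimeRealUs - positionTimeMediaUs;
    }

    // How late (positive) or early (negative) the current video frame is.
    static long latenessUs(long realTimeUs, long timeSourceDeltaUs, long frameTimeUs) {
        long nowUs = realTimeUs - timeSourceDeltaUs;
        return nowUs - frameTimeUs;
    }
}

A positive latenessUs means the video frame is behind the audio clock; the conditions on the next slide decide whether StageFright drops the frame, renders it, or waits.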

The Synchronization Conditions

Conditions of latenessUs:
1. latenessUs > 500000ll
2. latenessUs > 40000
3. latenessUs < -10000

if (latenessUs > 500000ll
        && mAudioPlayer != NULL
        && mAudioPlayer->getMediaTimeMapping(
            &realTimeUs, &mediaTimeUs)) {
    // ...
    mVideoBuffer->release();
    mVideoBuffer = NULL;
    // ...
    postVideoEvent_l();
    return;
}

if (latenessUs > 40000) {
    // We're more than 40ms late.
    // ...
    mVideoBuffer->release();
    mVideoBuffer = NULL;
    // ...
    postVideoEvent_l();
    return;
}

if (latenessUs < -10000) {
    // We're more than 10ms early.
    postVideoEvent_l(10000);
    return;
}

Conclusion

● Easier to maintain than OpenCore
● Supports software and hardware codecs
● An unstable product
● Rough mechanisms
● NuPlayer was created for playing multimedia streaming
