

JBoss Communications Platform 1.2

Media Server User Guide
The JBCP Media Server Gateway Guide

Jared Morgan

Tom Wells

Douglas Silas

Ivelin Ivanov

Vladimir Ralev

Eduardo Martins

Jean Deruelle


Oleg Kulikov

Amit Bhayani

Luis Barreiro

Alexandre Mendonça

Bartosz Baranowski

Pavel Šlégr


JBoss Communications Platform 1.2 Media Server User Guide
The JBCP Media Server Gateway Guide
Edition 1.2.8

Authors:
Jared Morgan [email protected]
Tom Wells [email protected]
Douglas Silas [email protected]
Ivelin Ivanov [email protected]
Vladimir Ralev [email protected]
Eduardo Martins [email protected]
Jean Deruelle [email protected]
Oleg Kulikov [email protected]
Amit Bhayani [email protected]
Luis Barreiro [email protected]
Alexandre Mendonça [email protected]
Bartosz Baranowski [email protected]
Pavel Šlégr [email protected]

Copyright © 2010 Red Hat Inc.

The text of and illustrations in this document are licensed by Red Hat under a Creative Commons Attribution-Share Alike 3.0 Unported license ("CC-BY-SA"). An explanation of CC-BY-SA is available at http://creativecommons.org/licenses/by-sa/3.0/. In accordance with CC-BY-SA, if you distribute this document or an adaptation of it, you must provide the URL for the original version.

Red Hat, as the licensor of this document, waives the right to enforce, and agrees not to assert, Section 4d of CC-BY-SA to the fullest extent permitted by applicable law.

Red Hat, Red Hat Enterprise Linux, the Shadowman logo, JBoss, MetaMatrix, Fedora, the Infinity Logo, and RHCE are trademarks of Red Hat, Inc., registered in the United States and other countries.

Linux® is the registered trademark of Linus Torvalds in the United States and other countries.

Java® is a registered trademark of Oracle and/or its affiliates.

XFS® is a trademark of Silicon Graphics International Corp. or its subsidiaries in the United States and/or other countries.

MySQL® is a registered trademark of MySQL AB in the United States, the European Union and other countries.

All other trademarks are the property of their respective owners.

The JBoss Communications Platform (JBCP) is the first and only open source VoIP platform certified for JAIN SLEE 1.0 and SIP Servlets 1.1 compliance. JBCP serves as a high-performance core for Service Delivery Platforms (SDPs) and IP Multimedia Subsystems (IMSs) by leveraging J2EE to enable the convergence of data and video in Next-Generation Intelligent Network (NGIN) applications.

JBCP enables the composition of predefined Service Building Blocks (SBBs) such as Call-Control, Billing, User-Provisioning, Administration and Presence-Sensing. Out-of-the-box monitoring and management of JBCP components is achieved through JMX Consoles. JSLEE allows popular protocol stacks such as SIP to be plugged in as Resource Adapters (RAs), and Service Building Blocks, which share many similarities with EJBs, allow the easy accommodation and integration of enterprise applications with end points such as the Web, Customer Relationship Management (CRM) systems and Service-Oriented Architectures (SOAs). The JBCP is the natural choice for telecommunication Operations Support Systems (OSSs) and Network Management Systems (NMSs).

In addition to the telecommunication industry, JBCP is suitable for a variety of problem domains demanding an Event-Driven Architecture (EDA) for high-volume, low-latency signaling, such as financial trading, online gaming, RFID sensor network integration, and distributed control.


Preface
    1. Document Conventions
        1.1. Typographic Conventions
        1.2. Pull-quote Conventions
        1.3. Notes and Warnings
    2. We Need Feedback
1. Introduction to the JBCP Media Server
    1.1. Overview: the Reasoning and Need for Media Servers
    1.2. Media Server Architecture
        1.2.1. Design Overview
        1.2.2. Endpoints
        1.2.3. Endpoint Identifiers
        1.2.4. Controller Modules
        1.2.5. Connections
        1.2.6. Events and Signals
2. Installing
    2.1. Writing and Running Tests
3. Configuring the Mobicents Media Server
    3.1. RTPManager
    3.2. Announcement Server Access Points
    3.3. Interactive Voice Response
    3.4. Packet Relay Endpoint
    3.5. Conference Bridge Endpoint
    3.6. MMS STUN Support
4. Controlling and Programming
    4.1. Media Server Control Protocols
        4.1.1. Media Gateway Control Protocol Interface
    4.2. Media Server Control API
        4.2.1. Basic Components of the Media Server API
        4.2.2. Basic API Patterns: Listeners
        4.2.3. Events
        4.2.4. MSC API Objects: Finite State Machines
        4.2.5. API Methods and Usage
5. Event Packages
6. Demonstration Example
7. Best Practices
    7.1. JBCP Media Server Best Practices
        7.1.1. DTMF Detection Mode: RFC2833 versus Inband versus Auto
        7.1.2. Transcoding Is CPU-Intensive
        7.1.3. Conference Endpoints block the Number of Connections at Start Time
A. Understanding Digital Signal Processing and Streaming
    A.1. Introduction to Digital Signal Processing
    A.2. Analog and Digital Signals
        A.2.1. Discrete Signals
    A.3. Sampling, Quantization, and Packetization
    A.4. Transfer Protocols
        A.4.1. Real-time Transport Protocol
        A.4.2. Real-time Transport Control Protocol
        A.4.3. Jitter
B. Revision History


Preface

1. Document Conventions

This manual uses several conventions to highlight certain words and phrases and draw attention to specific pieces of information.

In PDF and paper editions, this manual uses typefaces drawn from the Liberation Fonts set (https://fedorahosted.org/liberation-fonts/). The Liberation Fonts set is also used in HTML editions if the set is installed on your system. If not, alternative but equivalent typefaces are displayed. Note: Red Hat Enterprise Linux 5 and later includes the Liberation Fonts set by default.

1.1. Typographic Conventions

Four typographic conventions are used to call attention to specific words and phrases. These conventions, and the circumstances they apply to, are as follows.

Mono-spaced Bold

Used to highlight system input, including shell commands, file names and paths. Also used to highlight keycaps and key combinations. For example:

To see the contents of the file my_next_bestselling_novel in your current working directory, enter the cat my_next_bestselling_novel command at the shell prompt and press Enter to execute the command.

The above includes a file name, a shell command and a keycap, all presented in mono-spaced bold and all distinguishable thanks to context.

Key combinations can be distinguished from keycaps by the hyphen connecting each part of a key combination. For example:

Press Enter to execute the command.

Press Ctrl+Alt+F2 to switch to the first virtual terminal. Press Ctrl+Alt+F1 to return to your X-Windows session.

The first paragraph highlights the particular keycap to press. The second highlights two key combinations (each a set of three keycaps with each set pressed simultaneously).

If source code is discussed, class names, methods, functions, variable names and returned values mentioned within a paragraph will be presented as above, in mono-spaced bold. For example:

File-related classes include filesystem for file systems, file for files, and dir for directories. Each class has its own associated set of permissions.

Proportional Bold

This denotes words or phrases encountered on a system, including application names; dialog box text; labeled buttons; check-box and radio button labels; menu titles and sub-menu titles. For example:

Choose System → Preferences → Mouse from the main menu bar to launch Mouse Preferences. In the Buttons tab, click the Left-handed mouse check box and click Close to switch the primary mouse button from the left to the right (making the mouse suitable for use in the left hand).

To insert a special character into a gedit file, choose Applications → Accessories → Character Map from the main menu bar. Next, choose Search → Find… from the Character Map menu bar, type the name of the character in the Search field and click Next. The character you sought will be highlighted in the Character Table. Double-click this highlighted character to place it in the Text to copy field and then click the Copy button. Now switch back to your document and choose Edit → Paste from the gedit menu bar.

The above text includes application names; system-wide menu names and items; application-specific menu names; and buttons and text found within a GUI interface, all presented in proportional bold and all distinguishable by context.

Mono-spaced Bold Italic or Proportional Bold Italic

Whether mono-spaced bold or proportional bold, the addition of italics indicates replaceable or variable text. Italics denotes text you do not input literally or displayed text that changes depending on circumstance. For example:

To connect to a remote machine using ssh, type ssh username@domain.name at a shell prompt. If the remote machine is example.com and your username on that machine is john, type ssh john@example.com.

The mount -o remount file-system command remounts the named file system. For example, to remount the /home file system, the command is mount -o remount /home.

To see the version of a currently installed package, use the rpm -q package command. It will return a result as follows: package-version-release.

Note the words in bold italics above: username, domain.name, file-system, package, version and release. Each word is a placeholder, either for text you enter when issuing a command or for text displayed by the system.

Aside from standard usage for presenting the title of a work, italics denotes the first use of a new and important term. For example:

Publican is a DocBook publishing system.

1.2. Pull-quote Conventions

Terminal output and source code listings are set off visually from the surrounding text.

Output sent to a terminal is set in mono-spaced roman and presented thus:

books        Desktop   documentation  drafts  mss    photos   stuff  svn
books_tests  Desktop1  downloads      images  notes  scripts  svgs

Source-code listings are also set in mono-spaced roman but add syntax highlighting as follows:

package org.jboss.book.jca.ex1;


import javax.naming.InitialContext;

public class ExClient
{
   public static void main(String args[]) throws Exception
   {
      InitialContext iniCtx = new InitialContext();
      Object ref = iniCtx.lookup("EchoBean");
      EchoHome home = (EchoHome) ref;
      Echo echo = home.create();

      System.out.println("Created Echo");

      System.out.println("Echo.echo('Hello') = " + echo.echo("Hello"));
   }
}

1.3. Notes and Warnings

Finally, we use three visual styles to draw attention to information that might otherwise be overlooked.

Note
Notes are tips, shortcuts or alternative approaches to the task at hand. Ignoring a note should have no negative consequences, but you might miss out on a trick that makes your life easier.

Important
Important boxes detail things that are easily missed: configuration changes that only apply to the current session, or services that need restarting before an update will apply. Ignoring a box labeled 'Important' will not cause data loss but may cause irritation and frustration.

Warning
Warnings should not be ignored. Ignoring warnings will most likely cause data loss.

2. We Need Feedback

If you find a typographical error in this manual, or if you have thought of a way to make this manual better, submit a report in Bugzilla: http://bugzilla.redhat.com/bugzilla/ against the JBoss Communications Platform.

When submitting a bug report, be sure to mention the manual's identifier: doc-Media_Server_User_Guide.

If you have a suggestion for improving the documentation, try to be as specific as possible when describing it. If you have found an error, please include the section number and some of the surrounding text so we can find it easily.


Chapter 1. Introduction to the JBCP Media Server

1.1. Overview: the Reasoning and Need for Media Servers

Media Gateways Bridge Multiple Technologies

Today, all communications can be routed through computers. Widespread access to broadband Internet and the ubiquity of Internet Protocol (IP) enable the convergence of voice, data and video. Media gateways provide the ability to switch voice media between a network and its access point. Using Digital Subscriber Line (DSL) and fast-Internet cable technology, a media gateway converts, compresses and packetizes voice data for transmission back-and-forth across the Internet backbone for landline and wireless phones. Media gateways sit at the intersection of Public Switched Telephone Networks (PSTNs) and wireless or IP-based networks.

The Justification for Media Gateways for VoIP

Multiple market demands are pushing companies to converge all of their media services using media gateways with Voice-over-IP (VoIP) capabilities. Companies have expectations for such architectures, which include:

Lowering initial costs
Capital investment is decreased because low-cost commodity hardware can be used for multiple functions.

Lowering development costs
Open system hardware and software standards with well-defined applications reduce costs, and Application Programming Interfaces (APIs) accelerate development.

Handling multiple media types
Companies want VoIP solutions today, but also need to choose extensible solutions that will handle video in the near future.

Lowering the costs of deployment and maintenance
Standardized, modular systems reduce training costs and maintenance while simultaneously improving uptime.

Enabling rapid time-to-market
Early market entry hits the window of opportunity and maximizes revenue.

What Is the JBCP Media Server?

The JBCP Media Gateway is an open source Media Server aimed at:

• Delivering competitive, complete, best-of-breed media gateway functionality of the highest quality.

• Meeting the demands of converged wireless and landline networks, DSL and cable broadband access, and fixed-mobile converged VoIP networks from a single and singularly-capable media gateway platform.


• Increasing flexibility with a media gateway that supports a wide variety of call control protocols, which possesses an architecture that can scale to meet the demands of small-carrier providers as well as large enterprises.

1.2. Media Server Architecture

Media services have played an important role in the traditional Time Division Multiplexing (TDM)-based telephone network. As the network migrates to an Internet Protocol (IP)-based environment, media services are also moving to new environments.

One of the most exciting trends is the emergence and adoption of complementary modular standards that leverage the Internet to enable media services to be developed, deployed and updated more rapidly than before, in a network architecture that supports the two concepts called provisioning-on-demand and scaling-on-demand.

1.2.1. Design Overview

General Design Overview

The Mobicents Media Server is developed on top of existing Java technologies. The Java platform is ideal for network computing. It offers a single, unified-and-unifying programming model that can connect all elements of a business infrastructure. The modularization effort is supported by the use of the Java Management Extension (JMX) API and the industry-standard Service Logic Execution Environment (SLEE) container. Using JMX enables easy management of both the server's media components and the control modules hosted by the SLEE.


The Media Server's high degree of modularity benefits the application developer in several ways. The already-tight code can be further optimized to support applications that require small footprints. For example, if PSTN interconnection is unnecessary in an application, then the D-channel feature can be removed from the Media Server. In the future, if the same application is deployed within a Signaling System 7 (SS7) network, then the appropriate endpoint can be enabled, and the application is then compatible.

The Media Server architecture assumes that call control intelligence lies outside of the Media Server, and is handled by an external entity. The Media Server also assumes that call controllers will use control procedures such as MGCP, Megaco or MSML, among others. Each specific control module can be plugged in directly to the server as a standard deployable unit. Utilizing the JBoss Microcontainer for the implementation of control protocol-specific communication logic allows for simple deployment. It is therefore unnecessary for developers to configure low-level transaction and state management details, multi-threading, connection-pooling and other low-level details and APIs.

Note
The Media Server uses SLEE for implementing its own communication capabilities. The SLEE container does not serve here as a call controller.

In addition to control protocol modules, the SLEE container is aimed at providing high-level features like Interactive Voice Response (IVR), the Drools business rule management system, and VoiceXML engines.

The modules deployed under SLEE control interact with the Media Server's Service Provider Interface (SPI) through the Media Server Control Resource Adapter, or MSC-RA. The MSC-RA follows the recommendations of JSR-309 (http://jcp.org/en/jsr/detail?id=309) and implements asynchronous interconnection with the Media Server SPI stack. This local implementation is restricted and does not use high-level abstractions (for example, VoiceXML dialogs).

Media Flow Path

Service Objects are used to represent the media flow path for media objects used with the Media Server. By implementing Service Objects to manage components, constructing media services can be separated into two areas:

• Implementing components that generate, or consume, media data.

• Assembling media component chains to build a media flow path.



Media Components consist of a number of sub-components

1.2.1.1. Typical Deployment Scenario

The Media Server offers a complete media gateway and server solution; here is a non-exhaustive list of the Media Server's capabilities:

• Digital Signal Processing to convert and compress TDM voice circuits into IP packets.

• Announcement access points.

• Conferencing.

• High-level Interactive Voice Response (IVR) engines.

The gateway is able to provide signaling conversion and can operate as a Session Border Controller at the boundaries of Local Access Networks (LANs). The Media Server is always controlled by an external JBCP Platform application server, which implements the call control logic.

Typical Media Server Deployment Scenario

1.2.2. Endpoints

Endpoints

It is convenient to consider a media gateway as a collection of endpoints. An endpoint is a logical representation of a physical entity such as an analog phone or a channel in a trunk. Endpoints are sources or sinks of data and can be either physical or virtual. Physical endpoint creation requires hardware installation, while software is sufficient for creating virtual endpoints. An interface on a gateway that terminates at a trunk connected to a PSTN switch would be an example of a physical endpoint. An audio source in an audio content server would be an example of a virtual endpoint.

The type of the endpoint determines its functionality. Our analysis, so far, has led us to isolate the following basic endpoint types:

• digital signal 0 (DS0)

• analog line


• announcement server access point

• conference bridge access point

• packet relay

• Asynchronous Transfer Mode (ATM) "trunk side" interface

This list is not final: other endpoint types may be defined in the future, such as test endpoints which could be used to check network quality, or frame-relay endpoints that could be used to manage audio channels multiplexed over a frame-relay virtual circuit.

Descriptions of Various Access Point Types

Announcement Server Access Point
An announcement server endpoint provides access, intuitively, to an announcement server. Upon receiving requests from the call agent, the announcement server “plays” a specified announcement. A given announcement endpoint is not expected to support more than one connection at a time. Connections to an announcement server are typically one-way; they are “half-duplex”: the announcement server is not expected to listen to audio signals from the connection. Announcement access points are capable of playing announcements; however, these endpoints do not have the capability of transcoding. To achieve transcoding, a Packet Relay must be used. Also note that the announcement server endpoint can generate tones, such as dual-tone multi-frequency (DTMF).

Interactive Voice Response Access Point
An Interactive Voice Response (IVR) endpoint provides access to an IVR service. Upon requests from the call agent, the IVR server “plays” announcements and tones, and “listens” for responses, such as DTMF input or voice messages, from the user. A given IVR endpoint is not expected to support more than one connection at a time. Similarly to announcement endpoints, IVR endpoints do not possess media-transcoding capabilities. IVR plays and records in the format in which the media was stored or received.

Conference Bridge Access Point
A conference bridge endpoint is used to provide access to a specific conference. Media gateways should be able to establish several connections between the endpoint and packet networks, or between the endpoint and other endpoints in the same gateway. The signals originating from these connections are mixed according to the connection “mode” (as specified later in this document). The precise number of connections that an endpoint supports is characteristic of the gateway, and may, in fact, vary according to the allocation of resources within the gateway.

Packet Relay Endpoint
A packet relay endpoint is a specific form of conference bridge that typically only supports two connections. Packet relays can be found in firewalls between a protected and an open network, or in transcoding servers used to provide interoperation between incompatible gateways, such as gateways which don't support compatible compression algorithms and gateways which operate over different transmission networks, such as IP or ATM.

Echo Endpoint
An echo (or loopback) endpoint is a test endpoint that is used for maintenance and/or continuity testing. The endpoint returns the incoming audio signal from the endpoint back to that same endpoint, thus creating an echo effect.


Signal Generators (SGs) and Signal Detectors (SDs)
This endpoint contains a set of resources which provide media-processing functionality. It manages the interconnection of media streams between the resources, and arbitrates the flow of media stream data between them. Media services, also called commands, are invoked by a client application on the endpoint; that endpoint causes the resources to perform the desired services, and directs events sent by the resources to the appropriate client. A primary resource and zero or more secondary resources are included in the endpoint. The primary resource is typically connected to an external media stream, and provides the data from that stream to secondary resources. The secondary resources may process that stream (for example, recording it and/or performing automatic speech recognition on it), or may themselves generate media stream data (for example, playing a voice file) which is then transmitted to the primary resource.

A resource is statically prepared if the preparation takes place at the time of creation. A resource is dynamically prepared if preparation of a particular resource (and its associated media streams) does not occur until it is required by a media operation. Static preparation can lead to less efficient usage of the Media Server's resources, because those resources tend to be allocated for a longer time before use. However, once a resource has been prepared, it is guaranteed to be available for use. Dynamic preparation may utilize resources more efficiently because just-in-time (JIT) allocation algorithms may be used.

An endpoint is divided logically into a Service Provider Interface that is used to implement a specific endpoint, and a management interface, which is used to implement the manageable resources of that endpoint. All endpoints are plugged into the JBCP SLEE server by registering each endpoint with the appropriate JBoss Microcontainer. All major endpoints are manageable JBoss Microcontainers which are interconnected through the server. The most effective way to add endpoints to a Media Server is to create the endpoint application within a JBoss Microcontainer.


The SPI layer is an abstraction that endpoint providers must implement in order to enable their media-processing features. An implementation of an SPI for an endpoint is referred to as an Endpoint Provider.

1.2.3. Endpoint Identifiers

An endpoint is identified by its local name. The syntax of the local name depends on the type of endpoint being named. However, the local name for each of these types is naturally hierarchical, beginning with a term that identifies the physical gateway containing the given endpoint, and ending with a term which specifies the individual endpoint concerned. With this in mind, the JNDI naming rules are applied to the endpoint identifiers.
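For illustration only, the sketch below resolves an endpoint by one of the hierarchical JNDI names used in the configuration examples of Chapter 3 (media/trunk/Announcement). The lookup uses the standard javax.naming API; the returned reference is left as a plain Object because the concrete endpoint interface depends on the deployment.

import javax.naming.InitialContext;
import javax.naming.NamingException;

public class EndpointLookupSketch {
    public static void main(String[] args) throws NamingException {
        // The local name is hierarchical: the trunk group first, then the
        // endpoint type. "media/trunk/Announcement" is the JndiName used for
        // the announcement endpoint in Chapter 3 of this guide.
        InitialContext ctx = new InitialContext();
        Object announcementTrunk = ctx.lookup("media/trunk/Announcement");
        System.out.println("Resolved endpoint: " + announcementTrunk);
    }
}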

1.2.4. Controller Modules

Controller Modules allow external interfaces to be implemented for the Media Server. Each controller module implements an industry standard control protocol, and uses a generic SPI to control processing components or endpoints.

One such controller module is the Media Gateway Control Protocol (MGCP). This controller module is implemented as an internal protocol within a distributed system, and appears to external networks as a single VoIP gateway. The MGCP is composed of a Call Agent, a set of gateways including at least one "media gateway", and a "signalling gateway" (when connecting to an SS7 controlled network). The Call Agent can be distributed over several computer platforms. Each gateway handles the conversion of media signals between circuits and packets.

1.2.5. Connections

Connections are created by the call agent on each endpoint that will be involved in the “call”. In the classic example of a connection between two “DS0” endpoints, EP1 and EP2, the call agents controlling the endpoints establish two connections (C1 and C2):

Media Server Connections

Each connection is designated locally by a connection identifier, and will be characterized by connection attributes.

Resources and Connection Attributes

Many types of resources can be associated with a connection, such as specific signal-processing functions or packetization functions. Generally, these resources fall into two categories:

Two Types of Resources

Externally-Visible Resources
Externally-visible resources are ones which affect the format of “the bits on the network”, and must be communicated to the second endpoint involved in the connection.

Internal Resources
Internal resources are resources which determine which signal is being sent over the connection and how the received signals are processed by the endpoint.


The resources allocated to a connection or, more generally, to the handling of the connection, are chosen by the Media Server under instructions from the call agent. The call agent provides these instructions by sending two sets of parameters to the Media Server:

• The local directives instruct the gateway on the choice of resources that should be used for a connection.

• When available, the session description is provided by the other end of the connection.

The local directives specify parameters such as the mode of the connection (e.g. send-only, or send-receive), preferred coding or packetization methods, the usage of echo-cancellation or silence suppression, etc. (A more comprehensive and detailed list can be found in the specification of the LocalConnectionOptions parameter of the CreateConnection command.) For each of these parameters, the call agent can either specify a value, a range of values, or no value at all. This allows various implementations to implement various levels of control, from very tight control, where the call agent specifies minute details of the connection-handling, to very loose control, where the call agent only specifies broad guidelines, such as the maximum bandwidth, and lets the gateway select the detailed values itself.

Based on the value of the local directives, the gateway determines the resources allocated to the connection. When this is possible, the gateway will choose values that are in line with the remote session description; however, there is no absolute requirement that the parameters will be exactly the same.

Once the resources have been allocated, the gateway will compose a session description that describes the way it intends to receive packets. Note that the session description may in some cases present a range of values. For example, if the gateway is ready to accept one of several compression algorithms, it can provide a list of these accepted algorithms.
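As an illustration only (the addresses, ports and session identifiers below are placeholders, not values produced by the Media Server), a session description advertising two acceptable audio compression algorithms, G.711 µ-law (payload type 0) and A-law (payload type 8), could look like this:

v=0
o=- 2890844526 2890842807 IN IP4 192.0.2.10
s=-
c=IN IP4 192.0.2.10
t=0 0
m=audio 1024 RTP/AVP 0 8
a=rtpmap:0 PCMU/8000
a=rtpmap:8 PCMA/8000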

Local Connections Are a Special Case

Large gateways include a large number of endpoints which are often of different types. In some networks, we may often have to set up connections between endpoints located within the same gateway. Examples of such connections may be:

• connecting a trunk line to a wiretap device;

• connecting a call to an Interactive Voice-Response (IVR) unit;

• connecting a call to a conferencing unit; or,

• routing a call from one endpoint to another, something often described as a hairpin connection.

Local connections are much simpler to establish than network connections. In most cases, the connection will be established through a local interconnecting device, such as, for example, a TDM bus.

1.2.6. Events and Signals

The concept of events and signals is central to the Media Server. A Call Controller may ask to be notified about certain events occurring in an endpoint (for example: off-hook events) by passing an event identifier as a parameter to an endpoint's subscribe() method.

A Call Controller may also request certain signals to be applied to an endpoint (for example: a dial-tone) by supplying the identifier of the event as an argument to the endpoint's apply() method.


Events and signals are grouped in packages, within which they share the same namespace of identifiers, which we will refer to as event identifiers. Event identifiers are integer constants. Some of the events may be parametrized with additional data, such as a DTMF mask.

Signals are divided into different types depending on their behavior:

Types of Signals

On/off (OO)
Once applied, these signals last until they are turned off. This can only happen as the result of a reboot/restart or a new signal request where the signal is explicitly turned off. Signals of type OO are defined to be idempotent; thus, multiple requests to turn a given OO signal on (or off) are perfectly valid. An On/Off signal could be a visual message-waiting indicator (VMWI). Once turned on, it must not be turned off until explicitly instructed to by the Call Agent, or as a result of an endpoint restart. In other words, these signals will not turn off as a result of the detection of a requested event.

Time-out (TO)
Once applied, these signals last until they are either cancelled (by the occurrence of an event or by explicit releasing of the signal generator), or a signal-specific period of time has elapsed. A TO signal that times out will generate an operation complete event. If an event occurs prior to the allotted 180 seconds, the signal will, by default, be stopped (the keep signals active action can be used to override this behavior). If the signal is not stopped, the signal will time out, stop, and generate an operation complete event, about which the server controller may or may not have requested to be notified. A TO signal that fails after being started, but before having generated an operation complete event, will generate an operation failure event that includes the name of the signal that failed. Deletion of a connection with an active TO signal will result in such a failure.

Brief (BR)
The duration of these signals is normally so short that they stop on their own. If a signal stopping event occurs, or a new signal request is applied, a currently active BR signal will not stop. However, any pending BR signals not yet applied will be cancelled (a BR signal becomes pending if a signal request includes a BR signal, and there is already an active BR signal). As an example, a brief tone could be a DTMF digit. If the DTMF digit 1 is currently being played, and a signal stopping event occurs, the 1 would play to completion. If a request to play DTMF digit 2 arrives before DTMF digit 1 finishes playing, DTMF digit 2 would become pending.

Signal(s) may be generated on endpoints or on connections. One or more actions such as the following are associated with each event:

• notify the event immediately, together with the accumulated list of observed events;

• accumulate the event in an event buffer, but do not yet notify;

• keep signal(s) active; or

• ignore the event.


Chapter 2. Installing

The Media Server is a self-contained Java software stack consisting of multiple servers architecturally designed to work together. This server stack includes the JBoss Application Server and the JAIN SLEE Server; both of these required servers are included in the Media Server distribution.

The Media Server can be installed along with all other JBCP servers by downloading the JBoss Communications Platform installation package. For more information regarding installing the package, refer to the JBCP Platform Installation Guide (http://www.redhat.com/docs/en-US/JBoss_Communications_Platform/1.2.3/html-single/Platform_Installation_Guide/index.htm).

2.1. Writing and Running Tests

For information about the different kinds of tests that the Media Server provides, refer to Writing and Running Tests Against Media Server (http://groups.google.com/group/mobicents-public/web/mobicents-ms-tests).



Chapter 3. Configuring the Mobicents Media Server

All endpoints are plugged into the Mobicents JAIN SLEE server by registering with the MBean server. Note that if you have configured the servlet container, e.g. Tomcat, to service a different port, then you will need to supply a different port number in the URL.

3.1. RTPManager

RTPManager is responsible for managing the actual RTP socket. A reference to the RTPManager is passed to each endpoint (the endpoint does the look-up via JNDI), and endpoints leverage the RTPManager to create Connections and decide on supported codecs.

The configurable aspects of the RTPManager are:

JndiName
The Java Naming and Directory Interface (JNDI) name under which the endpoint is to be bound.

BindAddress
The IP address to which this endpoint is bound.

Jitter
The size of the jitter buffer in milliseconds.

PacketizationPeriod
The period of media stream packetization in milliseconds.

PortRange
The port range within which the RTP socket will be created. The first free port in the given range will be used.

AudioFormats
The audio formats supported by this instance of RTPManager.

UseStun
Whether the Mobicents Media Server is behind a NAT and a STUN setting is required. The STUN details are explained in Section 3.6, “MMS STUN Support”.

Supported RTP Formats

The RTPManager is able to receive the following RTP media types:

Media Type                       Payload Number
Audio: G711 (A-law) 8bit, 8kHz   8
Audio: G711 (U-law) 8bit, 8kHz   0
telephone-event                  101
Audio: GSM 8bit, 8kHz            3
Audio: G729 8bit, 8kHz           18
Audio: Speex 8bit, 8kHz          97

Table 3.1. Supported RTP Formats

<mbean code="org.mobicents.media.server.impl.jmx.rtp.RTPManager" name="media.mobicents:service=RTPManager,QID=1"> <attribute name="JndiName">java:media/mobicents/protocol/RTP</attribute> <attribute name="BindAddress">${jboss.bind.address}</attribute> <attribute name="Jitter">60</attribute> <attribute name="PacketizationPeriod">20</attribute> <attribute name="PortRange">1024-65535</attribute> <attribute name="AudioFormats">0 = ULAW, 8000, 8, 1; 3 = GSM, 8000, 8, 1; 8 = ALAW, 8000, 8, 1; 97 = SPEEX, 8000, 8, 1; 101 = telephone-event/8000</attribute></mbean>

Example 3.1. The RTPManager MBean

3.2. Announcement Server Access Points

An Announcement Server endpoint provides access to an announcement service. Upon requests from the call agent, an Announcement Server will “play” a specified announcement. A given announcement endpoint is not expected to support more than one connection at a time. Connections to an Announcement Server are typically one-way, i.e. “half-duplex”: the Announcement Server is not expected to listen to audio signals from the connection. Announcement endpoints do not transcode announced media; in order to achieve this, the application must use Packet Relay endpoints on the media path. Also note that the announcement server endpoint can generate a tone such as, for example, DTMF.

<mbean code="org.mobicents.media.server.impl.jmx.enp.ann.AnnEndpointManagement" name="media.mobicents:endpoint=announcement"> <attribute name="JndiName">media/trunk/Announcement</attribute> <attribute name="RtpFactoryName">java:media/mobicents/protocol/RTP</attribute></mbean>

Example 3.2. The AnnEndpointManagement MBean

Configuration of an Announcement Server Access Point

The configurable attributes of the Announcement Server are as follows:

JndiName
The Java Naming and Directory Interface (JNDI) name under which the endpoint is to be bound.

RtpFactoryName
The JNDI name of RTPManager.


Channels
Controls the number of announcement endpoints available in the server instance, in an endpoints pool. Endpoints are not created dynamically. At any given time, the number of endpoints in use cannot exceed the Channels value. It is not subject to change during runtime.

Supported Packages

The supported packages are as follows:

• org.mobicents.media.server.spi.events.Announcement

3.3. Interactive Voice Response

An Interactive Voice Response (IVR) endpoint provides access to an IVR service. Upon requests from the Call Agent, the IVR server “plays” announcements and tones, and “listens” to voice messages from the user. A given IVR endpoint is not expected to support more than one connection at a time. For example, if several connections were established to the same endpoint, then the same tones and announcements would be played simultaneously over all connections. IVR endpoints do not possess the capability of transcoding played or recorded media streams. IVRs record or play in the format that the data was delivered.

<mbean code="org.mobicents.media.server.impl.jmx.enp.ivr.IVRTrunkManagement" name="media.mobicents:endpoint=ivr"> <depends>media.mobicents:service=RTPManager,QID=1</depends> <attribute name="JndiName">media/trunk/IVR</attribute> <attribute name="RtpFactoryName">java:media/mobicents/protocol/RTP</attribute> <attribute name="MediaType">audio.x_wav</attribute> <!-- DtmfMode can be either RFC2833 or INBAND or AUTO --> <attribute name="DtmfMode">AUTO</attribute> <attribute name="RecordDir">${jboss.server.data.dir}</attribute> <attribute name="Channels">24</attribute></mbean>

Example 3.3. The IVREndpointManagement MBean

Configuration of the Interactive Voice Response Endpoint

The configurable attributes of the Interactive Voice Response endpoint are as follows:

JndiName
The Java Naming and Directory Interface (JNDI) name under which the endpoint is to be bound.

RtpFactoryName
The JNDI name of RTPManager.

RecordDir
The directory where the recorded files should be created and stored.


Channels
Controls the number of announcement endpoints available in the server instance, in an endpoints pool. Endpoints are not created dynamically. At any given time, the number of endpoints in use cannot exceed the Channels value. It is not subject to change during runtime.

MediaType
It currently defaults to WAV.

DtmfMode
Controls DTMF detection mode. Possible values are: RFC2833, INBAND or AUTO.

Supported Media Types and Formats

The supported media types and formats are listed as follows:

WAVE (.wav)
16-bit mono/stereo linear

Record Directory Configuration

You can specify the common directory where all the recorded files should be stored, or simply omit this attribute, in which case the default directory is null, and the application needs to pass an absolute directory path to record to.

Supported Packages

The supported packages are as follows:

• org.mobicents.media.server.spi.events.Announcement

• org.mobicents.media.server.spi.events.Basic

• org.mobicents.media.server.spi.events.AU

3.4. Packet Relay Endpoint

A packet relay endpoint is a specific form of conference bridge that typically only supports two connections. Packet relays can be found in firewalls between a protected and an open network, or in transcoding servers used to provide interoperation between incompatible gateways (for example, gateways which do not support compatible compression algorithms, or gateways which operate over different transmission networks such as IP or ATM).

<mbean code="org.mobicents.media.server.impl.jmx.enp.prl.PRTrunkManagement" name="media.mobicents:endpoint=packet-relay"> <depends>media.mobicents:service=RTPManager,QID=1</depends> <attribute name="JndiName">media/trunk/PacketRelay</attribute> <attribute name="RtpFactoryName">java:media/mobicents/protocol/RTP</attribute> <attribute name="Channels">10</attribute></mbean>

Example 3.4. The PREndpointManagement MBean


Configuration of the Packet Relay Endpoint

The configurable attributes of the Packet Relay endpoint are as follows:

JndiName
The JNDI name under which the endpoint is to be bound.

RtpFactoryName
The JNDI name of RTPManager.

Channels
Controls the number of announcement endpoints available in the server instance, in an endpoints pool. Endpoints are not created dynamically. At any given time, the number of endpoints in use cannot exceed the Channels value. It is not subject to change during runtime.

Supported Packages

The supported packages are as follows:

• org.mobicents.media.server.spi.events.Basic

3.5. Conference Bridge Endpoint

The Mobicents Media Server should be able to establish several connections between the endpoint and packet networks, or between the endpoint and other endpoints in the same gateway. The signals originating from these connections shall be mixed according to the connection “mode”. The precise number of connections an endpoint supports is a characteristic of the gateway, and may in fact vary according to the allocation of resources within the gateway. The conference endpoint can play an announcement directly on connections, and hence only for the participant listening to the announcement, and can even detect DTMF per connection.

<mbean code="org.mobicents.media.server.impl.jmx.enp.cnf.ConfTrunkManagement" name="media.mobicents:endpoint=conf"> <depends>media.mobicents:service=RTPManager,QID=1</depends> <attribute name="JndiName">media/trunk/Conference</attribute> <attribute name="RtpFactoryName"> java:media/mobicents/protocol/RTP</attribute> <attribute name="Channels">10</attribute></mbean>

Example 3.5. The ConfEndpointManagement MBean

Configuration of the Conference Bridge Endpoint

The configurable attributes of the Conference Bridge endpoint are as follows:

JndiName
The JNDI name under which the endpoint is to be bound.

RtpFactoryName
The JNDI name of RTPManager.


Channels
Controls the number of announcement endpoints available in the server instance, in an endpoints pool. Endpoints are not created dynamically. At any given time, the number of endpoints in use cannot exceed the Channels value. It is not subject to change during runtime.

Supported Packages

The supported packages are as follows:

• org.mobicents.media.server.spi.events.Basic

3.6. MMS STUN Support

When using the Mobicents Media Server behind a routing device performing Network Address Translation, you may need to employ the Simple Traversal of User Datagram Protocol through Network Address Translators (STUN) protocol in order for the server to operate correctly. In general, it is recommended to avoid deploying the MMS behind a NAT, since doing so can incur significant performance penalties and failures. Nevertheless, the current MMS implementation does work with a static NAT, a.k.a. a one-to-one (1-1) NAT, in which no port-mapping occurs. Full Cone NAT should also work with Address-Restricted NAT.

For more information on STUN NAT classification, refer to chapter 5 of RFC 3489, STUN - Simple Traversal of User Datagram Protocol (UDP) (http://www.faqs.org/rfcs/rfc3489.html).

MMS STUN Configuration

Each RTPManager in the Media Server can have its own STUN preferences. The STUN options are specified in the jboss-service.xml configuration file. Here is an example of an RTPManager MBean with static NAT configuration:

<mbean code="org.mobicents.media.server.impl.jmx.rtp.RTPManager" name="media.mobicents:service=RTPManager,QID=1"> <attribute name="JndiName">java:media/mobicents/protocol/RTP</attribute> <attribute name="BindAddress">${jboss.bind.address}</attribute> <attribute name="Jitter">60</attribute> <attribute name="PacketizationPeriod">20</attribute> <attribute name="PortRange">1024-65535</attribute> <attribute name="AudioFormats">8 = ALAW, 8000, 8, 1;0 = ULAW, 8000, 8, 1;101 = telephone-event</attribute>

<attribute name="UseStun">true</attribute> <attribute name="StunServerAddress">stun.ekiga.net</attribute> <attribute

1 http://www.faqs.org/rfcs/rfc3489.html

Page 29: JBoss Communications Platform-1.2-Media Server User Guide-En-US

MMS STUN Support

19

name="StunServerPort">3478</attribute> <attribute name="UsePortMapping">false</attribute></mbean>

Example 3.6. Static NAT configuration of an Announcement Endpoint in jboss-service.xml

There are four attributes related to STUN configuration:

UseStun
A boolean attribute which enables or disables STUN for the current endpoint.

StunServerAddress
A string attribute; the address of a STUN server. In the jboss-service.xml configuration file example, this attribute is set to stun.ekiga.net.

StunServerPort
A string attribute representing the port number of the STUN server. In the jboss-service.xml configuration file example, 3478 is the port of the Ekiga server.

UsePortMapping
A boolean attribute that specifies whether the NAT is mapping the port numbers. A NAT is mapping ports if the internal and external ports are not guaranteed to be the same for every connection through the NAT. In other words, if the client established a connection with the NAT at the hypothetical address 111.111.111.111, on port 1024, then the NAT will establish the second leg of the connection to some different (private) address, but on the same port, such as 192.168.1.1:1024. If these ports are the same (1024), then your NAT is not mapping the ports, and you can set this attribute to false, which improves the performance of the NAT traversal by doing the STUN lookup only once at boot-time, instead of doing it every time a new connection is established. NATs that don't map ports are also known as static NATs.


Chapter 4. Controlling and Programming

4.1. Media Server Control Protocols

The JBCP Media Server adopts a call control architecture where the call control “intelligence” is located outside of the Media Server itself, and is handled by external call control elements collectively known as the Call State Control Function (CSCF). The Media Server assumes that these call control elements will synchronize with each other to send coherent commands and responses to the media servers under their control. A Server Control Protocol is, in essence, an asynchronous master/slave protocol, where the Server Control Modules are expected to execute commands sent by the CSCF. Each Server Control Module is implemented as a JSLEE application, and consists of a set of Service Building Blocks (SBBs), which are in charge of communicating with media server endpoints via the SPI. Such an architecture avoids difficulties with programming concurrency, low-level transaction and state-management details, connection-pooling and other complex APIs.

4.1.1. Media Gateway Control Protocol Interface

The Media Gateway Control Protocol (MGCP) is a protocol for controlling media gateways (for example, the Media Server) from external call control elements such as media gateway controllers or Call Agents. The MGCP assumes that the Call Agents will synchronize with each other to send coherent commands and responses to the gateways under their control.

The MGCP module is included in the binary distribution. The Call Agent uses the MGCP to tell the Media Server:

• which events should be reported to the Call Agent;

• how endpoints should be connected; and,

• which signals should be played on which endpoints.

MGCP is, in essence, a master/slave protocol, where the gateways are expected to execute commands sent by the Call Agents. The general base architecture and programming interface is described in RFC 2805 (http://www.ietf.org/rfc/rfc2805.txt), and the current specific MGCP definition is located in RFC 3435 (http://www.ietf.org/rfc/rfc3435.txt).

4.2. Media Server Control API

The main objective of the Media Server Control API is to provide multimedia application developers with a Media Server abstraction interface.

The JavaDoc for the Media Server Control API

The JavaDoc documentation for the JBCP Media Server Control API is available here: http://hudson.jboss.org/hudson/job/MobicentsDocumentation/lastSuccessfulBuild/artifact/msc-api/apidocs/index.html.



4.2.1. Basic Components of the Media Server API

This section describes the basic objects of the API as well as some common design patterns.

The API components consist of a related set of interfaces, classes, operations, events, capabilities, and exceptions. The API provides seven key objects common to media servers, and more advanced packages. We provide a very brief description of the API in this overview document; the seven key objects are:

MsProvider
Represents the “window” through which an application views the call processing.

MsSession
Represents a call; this object is a dynamic collection of physical and logical entities that bring two or more endpoints together.

MsEndpoint
Represents a logical endpoint (e.g., an announcement access server, or an interactive voice response server).

MsConnection
Represents the dynamic relationship between an MsSession object and a user agent.

MsLink
Represents the dynamic relationship between two endpoints located on the same Media Server.

MsRequestedEvent
The application requests the detection of certain events, like DTMF, on an endpoint using this API.

MsRequestedSignal
The application requests the application of signals on endpoints, such as Play on an Announcement endpoint, using this API.

The purpose of an MsConnection object is to describe the relationship between an MsSession object and a user agent. An MsConnection object exists if the user agent is part of the media session. MsConnection objects are immutable in terms of their MsSession and user agent references. In other words, the MsSession and user agent object references do not change throughout the lifetime of the MsConnection object instance. The same MsConnection object may not be used in another MsSession.


Interface Diagram of the Media Server API

MsProvider can be used to create the MsSession object and to create an instance of MsEventFactory.

MsSession is a transient association of zero or more connections for the purposes of engaging in a real-time communication exchange. The session and its associated connection objects describe the control and media flows taking place in a communication network. Applications create instances of an MsSession object with the MsProvider.createSession() method, which returns an MsSession object that has zero connections and is in the IDLE state. The MsProvider object instance does not change throughout the lifetime of the MsSession object. The MsProvider object associated with an MsSession object is obtained via the getProvider() method.

Applications create instances of MsConnection objects with the MsSession.createNetworkConnection(String endpointName) method. At this stage, the MsConnection is in the IDLE state. The application calls MsConnection.modify(String localDesc, String remoteDesc), passing the local SDP and the remote SDP. The MsConnection will then look up the corresponding endpoint, using JNDI and the endpointName passed to it. It then calls createConnection(int mode) to create an instance of Connection. This Connection creates an instance of RtpSocketAdaptorImpl, which opens the socket for RTP data transfer. However, the transfer of data does not yet begin, and the state of the MsConnection is HALF_OPEN. At this stage, the Connection can only accept RTP packets, as it has no knowledge of a peer to which to send RTP packets. If remoteDesc is not null, it is applied to the Connection, and the state of the MsConnection becomes OPEN: it now knows the peer SDP and can receive as well as send RTP packets. Once MsConnection.release() is called, all of the resources of the MsConnection are released and it transitions to the CLOSED state. An MsConnection is unusable in the CLOSED state and gets garbage-collected.
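The life cycle just described can be summarized in the following minimal sketch. It follows the conventions of the examples later in this chapter (imports omitted); the endpoint name is the Announcement endpoint used in Example 4.1, and the SDP strings are placeholders.

MsProvider msProvider = new MsProviderImpl();
MsSession msSession = msProvider.createSession();          // MsSession is IDLE

MsConnection msConnection =
        msSession.createNetworkConnection("media/trunk/Announcement/$");  // IDLE

// Open the local RTP socket only; the remote SDP is not known yet.
msConnection.modify("$", null);                             // HALF_OPEN

// Later, once the remote SDP is known (for example, from the SIP answer):
String remoteDesc = "...";                                  // placeholder SDP
msConnection.modify("$", remoteDesc);                       // OPEN

// Release all resources when the call ends.
msConnection.release();                                     // CLOSED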

Applications create instances of MsLink objects with the MsSession.createLink(MsLinkMode mode) method. At this stage, the MsLink is in the IDLE state. The application calls MsLink.join(String endpointName1, String endpointName2), passing the endpoint names of the two local endpoints to be joined. At this point, the MsLink object will look up the corresponding endpoints, using JNDI and the endpoint names passed to it. It then calls createConnection(int mode) to create an instance of the Connection object for each endpoint. These connections are local connections, and hence no network resources (sockets) are acquired. As soon as Connections are created for both endpoints, setOtherParty(Connection other) is called on each Connection, passing the other Connection, which starts the data transfer between the two Connections. At this stage, the MsLink changes to the CONNECTED state. As soon as the application calls MsLink.release(), release() is called on the connection of each endpoint. As soon as both of the connections are released, the MsLink changes to DISCONNECTED and becomes unusable. Soon after this, the MsLink gets garbage-collected.
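The corresponding MsLink life cycle can be sketched as follows. The endpoint names are illustrative only, and the mode constant is the one used in the demonstration example in Chapter 6.

MsLink link = msSession.createLink(MsLink.MODE_FULL_DUPLEX);            // MsLink is IDLE

// Join two local endpoints; media starts flowing between their connections.
link.join("media/trunk/PacketRelay/$", "media/trunk/Announcement/$");   // CONNECTED

// ...

// Release the connections of both endpoints when finished.
link.release();                                                          // DISCONNECTED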

The application may ask to be notified about certain events occurring in an endpoint (e.g., DTMF), or the application may request certain signals to be applied to an endpoint (e.g., play an announcement). To achieve this, the application needs to get an instance of MsEventFactory by calling MsProvider.getEventFactory(), and then create an instance of MsRequestedEvent to request the notification of events, or an instance of MsRequestedSignal to apply signals at endpoints. The application needs to pass the corresponding MsEventIdentifier as a parameter to MsEventFactory.createRequestedEvent(MsEventIdentifier eventID) or MsEventFactory.createRequestedSignal(MsEventIdentifier eventID). The examples below will clarify this.

4.2.2. Basic API Patterns: Listeners

The basic programming pattern of the API is that applications (which reside “above” the API) make synchronous calls to API methods. The platform or network element implementing the API can inform the application of underlying events (for example, the arrival of incoming calls) by means of Java events. The application provides Listener objects corresponding to the events it is interested in obtaining.

Listeners

MsSessionListener
Applications which are interested in receiving events for changes in state of the MsSession object should implement MsSessionListener.

MsConnectionListener
Applications which are interested in receiving events for changes in state of MsConnection should implement MsConnectionListener.

MsLinkListener
Applications which are interested in receiving events for changes in state of MsLink should implement MsLinkListener.

MsResourceListener
Applications interested in receiving events for changes in state of MsSignalDetector or MsSignalGenerator should implement MsResourceListener.
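Listeners are registered on the MsProvider, as the examples later in this section show. The fragment below is only a registration sketch; app stands for an application object (an assumption made here for illustration) that implements the relevant listener interfaces.

MsProvider msProvider = new MsProviderImpl();
msProvider.addSessionListener(app);      // app implements MsSessionListener
msProvider.addConnectionListener(app);   // app implements MsConnectionListener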

4.2.3. Events

Each of the listeners defined above listens to different types of events fired by the server.

Events Related to MsSession

MsSessionListener listens for MsSessionEvent, which carries the MsSessionEventID representing an MsSession state change. The following list shows the different types of MsSessionEventID, when these events are fired, and the corresponding methods of MsSessionListener that will be called.

SESSION_CREATED
Fired when MsProvider.createSession() is called and a new MsSession is created.
Listener method called: public void sessionCreated(MsSessionEvent evt)

SESSION_ACTIVE
Fired when an MsConnection or MsLink is created on the MsSession for the first time and the session transitions to the ACTIVE state. Afterwards, the state remains ACTIVE even if the application creates more MsConnections or MsLinks.
Listener method called: public void sessionActive(MsSessionEvent evt)

SESSION_INVALID
Fired when all MsConnection and MsLink objects are disassociated from the MsSession and it transitions to the INVALID state.
Listener method called: public void sessionInvalid(MsSessionEvent evt)

Events Related to MsConnection

MsConnectionListener listens for an MsConnectionEvent, which carries the MsConnectionEventID that represents an MsConnection state change. The following list shows the different types of MsConnectionEventID, when these events are fired, and the corresponding methods of MsConnectionListener that will be called.

CONNECTION_CREATED
Fired as soon as the creation of the MsConnection is successful. The MsConnection is not holding any resources yet.
Listener method called: public void connectionCreated(MsConnectionEvent event)

CONNECTION_HALF_OPEN
Fired as soon as the modification of the MsConnection is successful. At this stage the RTP socket is open in the Media Server to receive a stream, but nothing is known about the remote SDP yet. The application may call MsConnection.modify(localDesc, null), passing null for the remote SDP if it is not known yet, and then call modify again with the actual SDP once it is known.
Listener method called: public void connectionHalfOpen(MsConnectionEvent event)

CONNECTION_MODIFIED
Fired as soon as the MsConnection is successfully modified by calling MsConnection.modify(String localDesc, String remoteDesc). When modify() is called, the MsConnection checks whether there is already an endpoint associated with it; if so, the call is treated as a modification request.

CONNECTION_OPEN
Fired as soon as the modification of the MsConnection is successful and the SDP passed by the Call Agent is successfully applied to an RTP connection. At this stage, there is a flow of RTP packets from the user agent to the Media Server and vice versa. It is possible that the application calls MsConnection.modify(localDesc, remoteDesc), passing the remoteDesc (remote SDP).
Listener method called: public void connectionOpen(MsConnectionEvent event)

CONNECTION_DISCONNECTED
Fired as soon as the MsConnection is successfully released by MsConnection.release().
Listener method called: public void connectionDisconnected(MsConnectionEvent event)

CONNECTION_FAILED
Fired as soon as the creation of the MsConnection fails, for reasons specified in MsConnectionEventCause. Immediately after CONNECTION_FAILED, CONNECTION_DISCONNECTED is fired, giving the listener a chance to perform clean-up.
Listener method called: public void connectionFailed(MsConnectionEvent event)
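As the CONNECTION_HALF_OPEN entry above suggests, a common pattern is to open the connection with only the local description and apply the remote SDP from the listener callback once it becomes known. The sketch below assumes a remoteDesc field filled in by the application's signaling code; that field is an assumption, not part of the API.

public void connectionHalfOpen(MsConnectionEvent event) {
    MsConnection connection = event.getConnection();
    if (remoteDesc != null) {
        // The remote SDP arrived in the meantime; move the connection to OPEN.
        connection.modify("$", remoteDesc);
    }
    // Otherwise wait: modify() is called again when the remote SDP is learned.
}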

Events Related to MsLink

MsLinkListener listens for an MsLinkEvent, which carries the MsLinkEventID that represents an MsLink state change. The following list shows the different types of MsLinkEventID, when these events are fired, and the corresponding methods of MsLinkListener that are called.


LINK_CREATED
Fired as soon as a new MsLink is created by calling MsSession.createLink(MsLinkMode mode).
Listener method called: public void linkCreated(MsLinkEvent evt)

LINK_CONNECTED
Fired as soon as the join(String a, String b) operation of the MsLink is successful.
Listener method called: public void linkConnected(MsLinkEvent evt)

LINK_DISCONNECTED
Fired as soon as the release() operation of the MsLink is successful.
Listener method called: public void linkDisconnected(MsLinkEvent evt)

LINK_FAILED
Fired as soon as the join(String a, String b) operation of the MsLink fails.
Listener method called: public void linkFailed(MsLinkEvent evt)

4.2.4. MSC API Objects: Finite State Machines

MsSessionState Finite State Machine

The behavior of MsSession is specified in terms of Finite State Machines (FSMs) represented by MsSessionState, shown below:

IDLE
This state indicates that the session has zero connections or links.

ACTIVE
This state indicates that the session has one or more connections or links.

INVALID
This state indicates that the session has lost all of its connections or links.

MsConnection Finite State Machine

MsConnection state is represented by the MsConnectionState enum:

IDLE
This state indicates that the MsConnection has only been created and has no resources attached to it.

HALF_OPEN
This state indicates that the MsConnection has created the RTP socket, but does not yet have any information about the remote SDP, so it cannot send RTP packets. The MsConnection is still usable in the HALF_OPEN state if it only needs to receive RTP packets and does not have to send any.


OPEN
This state indicates that the MsConnection now has information about the remote SDP and can send RTP packets to the remote IP address (for example, to a remote user agent).

FAILED
This state indicates that the creation or modification of the MsConnection failed, and that the MsConnection object is not reusable anymore.

CLOSED
This state indicates that the MsConnection has released all of its resources and closed the RTP sockets. It is not usable anymore.

MsLink Finite State Machine

MsLink state is represented by the MsLinkState enum:

IDLE
This state indicates that the MsLink has been created and has no endpoints associated with it.

CONNECTED
This state indicates that the connections from both endpoints have been created and that data transfer has started.

FAILED
This state indicates that the creation of the MsLink failed and that the link is not usable anymore.

DISCONNECTED
This state indicates that the MsLink has closed the connections of both endpoints and is not usable anymore.

4.2.5. API Methods and Usage

So far we have specified the key objects as well as their Finite State Machines (FSMs). To understand operationally how these objects are used and the methods they provide, we can look at the UML sequence diagram examples. The following call flow depicts a simple announcement.

Click to see the Announcement call flow diagram³.

/**
 * This is just pseudocode to show how to use the MSC API. This example uses
 * the Announcement endpoint to play an announcement.
 *
 * user agent <----> RTP Connection <--- Announcement Endpoint
 *
 * @author amit bhayani
 */
public class AnnouncementExample implements MsSessionListener, MsConnectionListener {

    private MsProvider msProvider;
    private MsSession msSession;

3 http://mobicents-public.googlegroups.com/web/sas-MMSControlAPI-dia-IVRMsConnectionAPI.png?gda=hEmmFl4AAAAF_VX0TG5xx-FBSRUj3rSwgeNEfkM5quPf0dNuRU50JeoDajkVeSsnUQ5nTudipElRBm39yBjFjuPyiOBf15ilwxyWU4Owty8oB440nFYg8OOwpdWz5ftt1dlzlu5J-bE


    public void startMedia(String remoteDesc) {
        // Creating the provider
        msProvider = new MsProviderImpl();

        // Registering the listeners
        msProvider.addSessionListener(this);
        msProvider.addConnectionListener(this);

        // Creating the session
        msSession = msProvider.createSession();

        // Creating the connection, passing the endpoint name. Here we are
        // creating an Announcement endpoint which will be connected to the
        // user agent (remoteDesc is the SDP of the remote end).
        MsConnection msConnection =
                msSession.createNetworkConnection("media/trunk/Announcement/$");

        // Get the remote SDP here and pass it to the connection. If creation of
        // the connection is successful, the connectionCreated method will be called.
        msConnection.modify("$", remoteDesc);
    }

    public void sessionActive(MsSessionEvent evt) {
        // TODO Auto-generated method stub
    }

    public void sessionCreated(MsSessionEvent evt) {
        // TODO Auto-generated method stub
    }

    public void sessionInvalid(MsSessionEvent evt) {
        // TODO Auto-generated method stub
    }

    public void connectionCreated(MsConnectionEvent event) {
        MsConnection connection = event.getConnection();
        MsEndpoint endpoint = connection.getEndpoint();

        // This is the actual name, which could be something like
        // 'media/trunk/Announcement/enp-1'
        String endpointName = endpoint.getLocalName();

        // URL of the audio file to play.
        String url = "http://something/mobicents.wav";

        MsEventFactory eventFactory = msProvider.getEventFactory();

        MsPlayRequestedSignal play = null;
        play = (MsPlayRequestedSignal) eventFactory.createRequestedSignal(MsAnnouncement.PLAY);
        play.setURL(url);

        // Request the Announcement COMPLETED event, or FAILED in case the
        // announcement cannot be played.
        MsRequestedEvent onCompleted = null;
        MsRequestedEvent onFailed = null;

        onCompleted = eventFactory.createRequestedEvent(MsAnnouncement.COMPLETED);
        onCompleted.setEventAction(MsEventAction.NOTIFY);

        onFailed = eventFactory.createRequestedEvent(MsAnnouncement.FAILED);
        onFailed.setEventAction(MsEventAction.NOTIFY);

        MsRequestedSignal[] requestedSignals = new MsRequestedSignal[] { play };
        MsRequestedEvent[] requestedEvents = new MsRequestedEvent[] { onCompleted, onFailed };

        endpoint.execute(requestedSignals, requestedEvents, connection);
    }

    public void connectionDisconnected(MsConnectionEvent event) {
        // TODO Auto-generated method stub
    }

    public void connectionFailed(MsConnectionEvent event) {
        // TODO Auto-generated method stub
    }

    public void connectionHalfOpen(MsConnectionEvent event) {
        // TODO Auto-generated method stub
    }

    public void connectionOpen(MsConnectionEvent event) {
        // TODO Auto-generated method stub
    }


}

Example 4.1. MSC API Example Code

// Example that shows how to listen for DTMF. For simplicity, all imports and other code have been removed.

public class IVRExample implements MsSessionListener, MsConnectionListener, MsNotificationListener {

    private MsProvider msProvider;
    private MsSession msSession;

    public void startMedia(String remoteDesc) {
        // Creating the provider
        msProvider = new MsProviderImpl();

        // Registering the listeners
        msProvider.addSessionListener(this);
        msProvider.addConnectionListener(this);
        msProvider.addNotificationListener(this);

        // Creating the session
        msSession = msProvider.createSession();

        // Creating the connection, passing the endpoint name. Here we are
        // creating an IVR endpoint which will be connected to the user agent
        // (remoteDesc is the SDP of the remote end).
        MsConnection msConnection = msSession.createNetworkConnection("media/trunk/IVR/$");

        // Get the remote SDP here and pass it to the connection. If creation of
        // the connection is successful, the connectionCreated method will be called.
        msConnection.modify("$", remoteDesc);
    }

    public void connectionCreated(MsConnectionEvent event) {
        MsConnection connection = event.getConnection();
        MsEndpoint endpoint = connection.getEndpoint();

        // This is the actual name, which could be something like
        // 'media/trunk/IVR/enp-1'
        String endpointName = endpoint.getLocalName();

        MsEventFactory factory = msProvider.getEventFactory();
        MsDtmfRequestedEvent dtmf = (MsDtmfRequestedEvent) factory.createRequestedEvent(DTMF.TONE);

        MsRequestedSignal[] signals = new MsRequestedSignal[] {};
        MsRequestedEvent[] events = new MsRequestedEvent[] { dtmf };

        endpoint.execute(signals, events, connection);
    }

    public void update(MsNotifyEvent evt) {
        MsEventIdentifier identifier = evt.getEventID();
        if (identifier.equals(DTMF.TONE)) {
            MsDtmfNotifyEvent event = (MsDtmfNotifyEvent) evt;
            String seq = event.getSequence();
            if (seq.equals("0")) {
                // handle digit 0
            } else if (seq.equals("1")) {
                // handle digit 1
            } else if (seq.equals("2")) {
                // handle digit 2
            } else if (seq.equals("3")) {
                // handle digit 3
            } else if (seq.equals("4")) {
                // handle digit 4
            } else if (seq.equals("5")) {
                // handle digit 5
            } else if (seq.equals("6")) {
                // handle digit 6
            } else if (seq.equals("7")) {
                // handle digit 7
            } else if (seq.equals("8")) {
                // handle digit 8
            } else if (seq.equals("9")) {
                // handle digit 9
            }
        }
    }
}

Example 4.2. DTMF Listener Example Code


// Example that shows how a DTMF signal can be applied to an endpoint.

MsEventFactory eventFactory = msProvider.getEventFactory();

MsRequestedSignal dtmf = eventFactory.createRequestedSignal(DTMF.TONE);
dtmf.setTone("1");

MsRequestedSignal[] signals = new MsRequestedSignal[] { dtmf };
MsRequestedEvent[] events = new MsRequestedEvent[] {};

msEndpoint.execute(signals, events, connection);

Example 4.3. DTMF Signal to Endpoint Example Code

// Example that shows how to begin recording and listen for the FAILED event.

String RECORDER = "file://home/user/recordedfile.wav";

MsEventFactory eventFactory = msProvider.getEventFactory();

MsRecordRequestedSignal record =
        (MsRecordRequestedSignal) eventFactory.createRequestedSignal(MsAudio.RECORD);
record.setFile(RECORDER);

MsRequestedEvent onFailed = eventFactory.createRequestedEvent(MsAudio.FAILED);
onFailed.setEventAction(MsEventAction.NOTIFY);

MsRequestedSignal[] requestedSignals = new MsRequestedSignal[] { record };
MsRequestedEvent[] requestedEvents = new MsRequestedEvent[] { onFailed };

endpoint.execute(requestedSignals, requestedEvents, connection);

// NOTE: Passing an empty MsRequestedSignal[] and MsRequestedEvent[] will nullify
// all previously requested MsRequestedSignals and MsRequestedEvents.

Example 4.4. Record and Listen FAILED Event Example Code

Chapter 5. Event Packages

The Basic Package

Package name: org.mobicents.media.server.spi.events.Basic

Event ID: org.mobicents.media.server.spi.events.Basic.DTMF
Description: DTMF Event
Type: BR

The Announcement Package

Package name: org.mobicents.media.server.spi.event.Announcement

Event ID: org.mobicents.media.server.spi.event.Announcement.PLAY
Description: Play an announcement
Type: TO
Duration: Variable

Event ID: org.mobicents.media.server.spi.event.Announcement.COMPLETED

Event ID: org.mobicents.media.server.spi.event.Announcement.FAILED

Announcement actions are qualified by URLs and by sets of initial parameters. The “operation completed” (COMPLETED) event will be detected once an announcement has finished playing. If the announcement cannot be played in its entirety, an “operation failure” (FAILED) event can be returned. The failure can also be explained with a commentary.

The Advanced Audio Package

Package name: org.mobicents.media.server.spi.events.AU

Event ID: org.mobicents.media.server.spi.event.AU.PLAY_RECORD
Description: Play a prompt (optional) and then record some speech
Type: TO
Duration: Variable

Event ID: org.mobicents.media.server.spi.event.AU.PROMPT_AND_COLLECT

Event ID: org.mobicents.media.server.spi.event.Announcement.FAILED

The function of PLAY_RECORD is to play a prompt and record the user's speech. If the user does not speak, the user may be re-prompted and given another chance to record. By default, PLAY_RECORD does not play an initial prompt and makes only one attempt to record, and therefore functions as a simple record operation.


Chapter 6. Demonstration Example

The motive of this example is to demonstrate the capabilities of the new Media Server (MS) and the Media Server Resource Adapters (MSC-RA).

The example demonstrates the usage of the following Endpoints:

• Announcement

• Packet Relay

• Loop

• Conference

• IVR

For more information on each of these types of endpoints, refer to Section 1.2, “Media Server Architecture”.

Where is the Code?

Check out the 'mms-demo' example from http://code.google.com/p/mobicents/source/browse/#svn/branches/servers/media/1.x.y/examples/mms-demo.

Install and Run

Start the JBCP Server (this will also start the Media Server). Make sure you have server/default/deploy/mobicents.sar and server/default/deploy/mediaserver.sar in your JBCP Server.

From Binary

Go to /examples/mms-demo and call 'ant deploy-all'. This will deploy the SIP RA, the MSC RA, the mms-demo example and also mms-demo-audio.war. The WAR file contains the audio *.wav files that are used by the mms-demo example.

From Source Code

If you are deploying from source code, you may deploy each of the resource adapters individually:

• Make sure JBOSS_HOME is set and the server is running.

• Call mvn install from servers/jain-slee/resources/sip to deploy SIP RA

• Call mvn install from servers/media/controllers/msc to deploy media RA

• Call mvn install from servers/media/examples/mms-demo to deploy example

Once the example is deployed, make a call from your SIP Phone to TBD.

1010: Loop Endpoint Usage Demonstration

As soon as the call is established, CallSbb creates a Connection using PREndpointImpl. PREndpointImpl has two Connections, one of which is connected to the calling UA by calling msConnection.modify("$", sdp). Once the connection is established, CallSbb creates the child LoopDemoSbb and calls startDemo() on it, passing the PREndpoint name as an argument. LoopDemoSbb creates the child AnnouncementSbb, which uses the AnnEndpointImpl to make an announcement. The other Connection of PREndpointImpl is connected to the Connection from AnnEndpointImpl using the MsLink.

MsLink link = session.createLink(MsLink.MODE_FULL_DUPLEX);
.......
link.join(userEndpoint, ANNOUNCEMENT_ENDPOINT);

Once the link is created (look at onLinkConnected()), AnnouncementSbb creates an instance of MsPlayRequestedSignal and sets the URL of the audio file. AnnouncementSbb also creates an instance of MsRequestedEvent for MsAnnouncement.COMPLETED and MsAnnouncement.FAILED, so that the Media resource adapter fires the respective events, and the SBB has a handler for the org.mobicents.media.events.announcement.COMPLETED event to handle Announcement Complete.

MsEventFactory eventFactory = msProvider.getEventFactory();

MsPlayRequestedSignal play = null;
play = (MsPlayRequestedSignal) eventFactory.createRequestedSignal(MsAnnouncement.PLAY);
play.setURL(url);

MsRequestedEvent onCompleted = null;
MsRequestedEvent onFailed = null;

onCompleted = eventFactory.createRequestedEvent(MsAnnouncement.COMPLETED);
onCompleted.setEventAction(MsEventAction.NOTIFY);

onFailed = eventFactory.createRequestedEvent(MsAnnouncement.FAILED);
onFailed.setEventAction(MsEventAction.NOTIFY);

MsRequestedSignal[] requestedSignals = new MsRequestedSignal[] { play };
MsRequestedEvent[] requestedEvents = new MsRequestedEvent[] { onCompleted, onFailed };

link.getEndpoints()[1].execute(requestedSignals, requestedEvents, link);

Announcement Endpoint

As soon as the announcement is over, LoopDemoSbb creates the child LoopbackSbb and calls startConversation() on it, passing the PREndpoint name as an argument. LoopbackSbb uses an MsLink to associate the other connection of PREndpointImpl with LoopEndpointImpl. LoopEndpointImpl simply forwards the voice packets received from the caller back to the caller.

MsLink link = session.createLink(MsLink.MODE_FULL_DUPLEX);
..........
link.join(endpointName, LOOP_ENDPOINT);


Loop Endpoint

The SBB Child Relation Diagram

1011: DTMF Usage Demonstration

As soon as the call is established, CallSbb creates a Connection using PREndpointImpl. PREndpointImpl has two Connections, one of which is connected to the calling UA by calling msConnection.modify("$", sdp). Once the connection is established, CallSbb creates the child DtmfDemoSbb and calls startDemo() on it, passing the PREndpoint name as an argument. DtmfDemoSbb creates the child AnnouncementSbb, which uses the AnnEndpointImpl to make an announcement. The other Connection of PREndpointImpl is connected to the Connection from AnnEndpointImpl using the MsLink.

MsLink link = session.createLink(MsLink.MODE_FULL_DUPLEX);
.......
link.join(userEndpoint, ANNOUNCEMENT_ENDPOINT);

Once the link is created, the flow is the same as that of 1010 to play the announcement.

Announcement Endpoint Implementation

As soon as the announcement is over, DtmfDemoSbb creates an instance of MsDtmfRequestedEvent and applies it to the IVR endpoint. Look at the onAnnouncementComplete() method of DtmfDemoSbb:

MsLink link = (MsLink) evt.getSource();
MsEndpoint ivr = link.getEndpoints()[1];

MsEventFactory factory = msProvider.getEventFactory();
MsDtmfRequestedEvent dtmf = (MsDtmfRequestedEvent) factory.createRequestedEvent(DTMF.TONE);
MsRequestedSignal[] signals = new MsRequestedSignal[] {};
MsRequestedEvent[] events = new MsRequestedEvent[] { dtmf };

ivr.execute(signals, events, link);

On every DTMF digit received, DtmfDemoSbb plays the corresponding WAV file using the AnnouncementSbb, as explained above.

The SBB Child Relation Diagram

1012: ConfEndpointImpl Usage Demonstration

As soon as the call is established, CallSbb creates a Connection using PREndpointImpl. PREndpointImpl has two Connections, one of which is connected to the calling UA by calling msConnection.modify("$", sdp). Once the connection is established, CallSbb creates the child ConfDemoSbb and calls startDemo() on it, passing the PREndpoint name as an argument. ConfDemoSbb creates the child AnnouncementSbb, which uses the AnnEndpointImpl to make an announcement. The other Connection of PREndpointImpl is connected to the Connection from AnnEndpointImpl using the MsLink.

....
MsLink link = session.createLink(MsLink.MODE_FULL_DUPLEX);
.......
link.join(userEndpoint, ANNOUNCEMENT_ENDPOINT);

Once the link is created, the flow is the same as that of 1010 to play the announcement.

Announcement Endpoint Implementation

As soon as the announcement is over, ConfDemoSbb creates the child ForestSbb and calls enter() on it, passing the PREndpoint name as an argument. ForestSbb uses an MsLink to associate the other Connection of PREndpointImpl with ConfEndpointImpl:


MsLink link = session.createLink(MsLink.MODE_FULL_DUPLEX);
link.join(endpointName, CNF_ENDPOINT);

Once the link is established (look at onConfBridgeCreated()), ForestSbb creates many child AnnouncementSbbs, each responsible for a unique announcement (in this case, playing crickets.wav and mocking.wav). The UA is therefore listening to several announcements at the same time.

Conference Endpoint Implementation

SBB Child Relation

Recording Usage Demonstration

As soon as the call is established, CallSbb creates a Connection using PREndpointImpl. PREndpointImpl has two Connections, one of which is connected to the calling User Agent by calling msConnection.modify("$", sdp). Once the connection is established, CallSbb creates the child RecorderDemoSbb and calls startDemo() on it, passing the PREndpoint name as an argument. RecorderDemoSbb creates the child AnnouncementSbb, which uses the AnnEndpointImpl to make an announcement. The other Connection of PREndpointImpl is connected to the Connection from AnnEndpointImpl using the MsLink.

Chapter 7. Best Practices

7.1. JBCP Media Server Best Practices

Note: these best practices apply to JBCP Media Server version 1.0.0.CR6 and later.

7.1.1. DTMF Detection Mode: RFC2833 versus Inband versus Auto

The JBCP Media Server will block resources depending on the DTMF detection mode configured in jboss-service.xml at start-up time. Inband detection is highly resource-intensive and must perform many more calculations to detect DTMF when compared to RFC2833. So if your application already knows that the User Agents (UAs) support RFC2833, it is always better to configure the DTMF mode as RFC2833 rather than as Inband or Auto. Also note that Auto is even more resource-intensive, because it does not know beforehand whether DTMF will be Inband or RFC2833, and hence both detection methods must be started. The default detection mode is RFC2833.

All of the Conference, Packet Relay and IVR endpoints have DTMF detection enabled; the mode can be configured using jboss-service.xml. We advise retaining the same mode for all three, but this is not a necessity.

7.1.2. Transcoding Is CPU-Intensive

Digital Signal Processing (DSP) is very costly and should be avoided as much as possible. By default, Announcement endpoints and IVR endpoints do not have DSP enabled. What this means is that your application needs to know beforehand which codecs are supported by your UA; you can then ask Announcement or IVR to play an audio file which has been pre-encoded in one of these formats. The onus of deciding which pre-encoded file to play lies with the application. For example, if I am writing a simple announcement application that would only play announcements to the end user, and I know that my end users have either the PCMU or the GSM codec, then I would make sure to have pre-encoded audio files such as helloworld-pcmu.wav and helloworld-gsm.gsm. Then, when the UA attempts to connect to the Media Server, my application knows which codecs the UA supports and can ask the Media Server to play the respective file.

This strategy will work fine because, most of the time in the telecommunications world, applications have a known set of supported codecs. However, if this is not true, or if you are writing a simple demo application and need or want all codecs to be supported, you can put a Packet Relay endpoint in front of the Announcement or IVR endpoint. This way, the Packet Relay will do all necessary digital signal processing, and your application need not bother about which audio file to play. The audio file in this case will be encoded in Linear format, and all UAs, irrespective of whether they support the PCMU, PCMA, Speex, G729 or GSM codecs, will be able to hear the announcement.
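The file-selection logic described above can be sketched roughly as follows. The helper method, the codec strings and the file URLs are illustrative assumptions; the codec itself is assumed to have been determined by the application, for example by inspecting the remote SDP.

// Illustrative sketch only: pick a pre-encoded announcement file based on the
// codec negotiated with the UA, so that no transcoding is needed on the endpoint.
String chooseAnnouncementFile(String negotiatedCodec) {
    if ("PCMU".equalsIgnoreCase(negotiatedCodec)) {
        return "http://something/helloworld-pcmu.wav";
    } else if ("GSM".equalsIgnoreCase(negotiatedCodec)) {
        return "http://something/helloworld-gsm.gsm";
    }
    // Fall back to a Packet Relay endpoint in front of the Announcement
    // endpoint and a Linear-encoded file if the codec is not known.
    return "http://something/helloworld-linear.wav";
}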

7.1.3. Conference Endpoints Block the Number of Connections at Start Time

The Conference endpoint starts all of its connections at boot time. This means that Conference blocks all the necessary resources at start time, even if UAs are not yet connected. In our experience, this is required because resource allocation at runtime causes jitter for the other participants. Due to this, there is a cap on the maximum number of connections a conference can handle, which takes effect at start time. By default, this number is set to five in jboss-service.xml:


<attribute name="MaxConnections">5</attribute>

If your requirements are such that your application will have conferences ranging from five to ten simultaneous users, it is best to define two or more ConfTrunkManagement MBeans and allow your application to use the correct Conference endpoint, rather than changing the value of MaxConnections to "10" for all. For example:

<mbean code="org.mobicents.media.server.impl.jmx.enp.cnf.ConfTrunkManagement"
       name="media.mobicents:endpoint=conf">
    <depends>media.mobicents:service=RTPManager,QID=1</depends>
    <attribute name="JndiName">media/trunk/Conference5</attribute>
    <attribute name="RtpFactoryName">java:media/mobicents/protocol/RTP</attribute>
    <attribute name="Channels">1</attribute>
    <attribute name="DtmfMode">RFC2833</attribute>
    <!-- MaxConnections represents the maximum number of participants who can
         join a conference. Use judiciously: this blocks resources at MMS startup -->
    <attribute name="MaxConnections">5</attribute>
</mbean>

<mbean code="org.mobicents.media.server.impl.jmx.enp.cnf.ConfTrunkManagement"
       name="media.mobicents:endpoint=conf">
    <depends>media.mobicents:service=RTPManager,QID=1</depends>
    <attribute name="JndiName">media/trunk/Conference7</attribute>
    <attribute name="RtpFactoryName">java:media/mobicents/protocol/RTP</attribute>
    <attribute name="Channels">1</attribute>
    <attribute name="DtmfMode">RFC2833</attribute>
    <!-- MaxConnections represents the maximum number of participants who can
         join a conference. Use judiciously: this blocks resources at MMS startup -->
    <attribute name="MaxConnections">7</attribute>
</mbean>

<mbean code="org.mobicents.media.server.impl.jmx.enp.cnf.ConfTrunkManagement"
       name="media.mobicents:endpoint=conf">
    <depends>media.mobicents:service=RTPManager,QID=1</depends>
    <attribute name="JndiName">media/trunk/Conference10</attribute>
    <attribute name="RtpFactoryName">java:media/mobicents/protocol/RTP</attribute>
    <attribute name="Channels">1</attribute>
    <attribute name="DtmfMode">RFC2833</attribute>
    <!-- MaxConnections represents the maximum number of participants who can
         join a conference. Use judiciously: this blocks resources at MMS startup -->
    <attribute name="MaxConnections">10</attribute>
</mbean>

Finally, ensure that you configure Channels carefully: this represents the number of conferences that can occur concurrently.


Appendix A. Understanding Digital Signal Processing and Streaming

The following information provides a basic introduction to Digital Signal Processing and streaming technologies. These two technologies are used extensively in the Media Server, so understanding these concepts will assist developers in creating customized media services for the Media Server.

A.1. Introduction to Digital Signal Processing

Digital Signal Processing, as the name suggests, is the processing of signals by digital means. A signal in this context can mean a number of different things. Historically, the origins of signal processing are in electrical engineering, and a signal here means an electrical signal carried by a wire or telephone line, or perhaps by a radio wave. More generally, however, a signal is a stream of information representing anything from stock prices to data from a remote-sensing satellite. The term "digital" originates from the word "digit", meaning a number, so "digital" literally means numerical. This introduction to DSP focuses primarily on two types of digital signals: audio and voice.

A.2. Analog and Digital Signals

Data can already be in a digital format (for example, the data stream from a Compact Disc player), and will not require any conversion. In many cases, however, a signal is received in the form of an analog electrical voltage or current, produced by a microphone or other type of transducer. Before DSP techniques can be applied to an analog signal, it must be converted into digital form. Analog electrical voltage signals can be digitized using an analog-to-digital converter (ADC), which generates a digital output as a stream of binary numbers. These numbers represent the electrical voltage input to the device at each sampling instant.

A.2.1. Discrete Signals

When converting a continuous analog signal to a digital signal, the analog signal must be converted to a signal format that computers can analyze and perform complex calculations on. Discrete signals are easily stored and transmitted over digital networks, and can be discrete in magnitude, time, or both.

Discrete-in-time values only exist at certain points in time. For example, if a sample of discrete-in-time data is taken at a point in time where there is no data, the result is zero.


Discrete-in-magnitude values exist across a time range; however, the value of the datum in each time range consists of one constant result, rather than a variable set of results.

By converting continuous analog signals to discrete signals, finer computer data analysis is possible, and the signal can be stored and transmitted efficiently over digital networks.

A.3. Sampling, Quantization, and Packetization

Sampling is the process of recording the values of a signal at given points in time. For ADCs, these points in time are equidistant, and the number of samples taken during one second is called the sample rate. It is important to understand that these samples are still analog values. The mathematical description of ideal sampling is the multiplication of the signal with a sequence of Dirac pulses.

Quantization is the process of representing the value of an analog signal by a fixed number of bits. The value of the analog signal is compared to a set of pre-defined levels. Each level is represented by a unique binary number, and the binary number that corresponds to the level closest to the analog signal value is chosen to represent that sample.
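The following small sketch illustrates the quantization step described above. It is not Media Server code; it simply maps already-sampled values (assumed to lie in the range -1.0 to 1.0) onto 256 pre-defined 8-bit levels, choosing the closest level for each sample.

int[] quantize(double[] samples) {
    int levels = 256;                       // 8 bits -> 256 pre-defined levels
    int[] quantized = new int[samples.length];
    for (int i = 0; i < samples.length; i++) {
        // Map [-1.0, 1.0] onto [0, 255] and pick the closest level.
        double scaled = (samples[i] + 1.0) / 2.0 * (levels - 1);
        quantized[i] = (int) Math.round(scaled);
    }
    return quantized;
}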

Sampling and quantization prepare digitized media for further processing or streaming. However, streaming and processing individual samples is not effective for the high volumes of data transferred via a network. The risk of data loss is much higher when a large portion of data is transferred in a block. Networked media should be transmitted using media packets that carry several samples, thereby reducing the risk of data loss during transmission. This process is referred to as packetization.

A.4. Transfer Protocols

The Real-time Streaming Protocol (RTSP), the Real-time Transport Protocol (RTP) and the Real-time Transport Control Protocol (RTCP) were specifically designed to stream media over networks. The latter two are built on top of UDP.

A.4.1. Real-time Transport Protocol

RTP provides end-to-end network transport functions suitable for applications transmitting real-time data, such as audio, video or simulation data, over multicast or unicast network services. RTP does not address resource reservation and does not guarantee quality-of-service for real-time services. The data transport is augmented by the Real-time Transport Control Protocol (RTCP) to allow monitoring of the data delivery in a manner scalable to large multicast networks, and to provide minimal control and identification functionality. RTP and RTCP are designed to be independent of the underlying transport and network layers.

An RTP packet consists of an RTP header followed by the data to send. In the RTP specification, this data is referred to as the payload. The header is transmitted in network byte order, just like the IP header. Table A.1 describes the components of the RTP header.

V (Version)
Contains the version number of the RTP protocol; the current version number is 2. This part of the header consumes 2 bits of the RTP packet.

P (Padding)
Indicates that the packet contains padding bytes, which are excluded from the payload data count. The last padding byte contains the number of padding bytes present in the packet. Padding may be required for certain encryption algorithms that need the payload to be aligned on a multi-byte boundary.

X (Extension)
Specifies whether the header contains an Extension Header.

CC (CSRC Count)
Specifies how many contributing sources are specified in the header.

M (Marker)
Contains arbitrary data that can be interpreted by an application. The RTP specification does not limit the information type contained in this component of the header. For example, the Marker component might specify that media data is contained within the packet.

PT (Payload Type)
Specifies the type of data the packet contains, which determines how an application receiving the packet interprets the payload.

Sequence Number
Contains a unique numerical value that can be used by applications to place received packets in the correct order. Video streams rely on the sequence number to order the packets for individual video frames received by an application. The starting number for a packet stream is randomized for security reasons.

Time Stamp
Contains the synchronization information for a stream of packets. The value specifies when the first byte of the payload was sampled. The starting number for the Time Stamp is also randomized for security reasons. For audio, the timestamp is typically incremented by the number of samples in the packet, so that the receiving application can play the audio data at exactly the right time. For video, the timestamp is typically incremented per image. One image of a video will generally be sent in several packets, so those pieces of data will have the same Time Stamp but different Sequence Numbers.

SSRC ID
Contains the Synchronization Source (SSRC) identifier of the sender. The information contained in this component of the header is used to correctly order multiple RTP streams contained in a packet. This scenario often occurs when an application sends both video and audio RTP streams in one packet. So that the receiving application can correctly order and synchronize the data, the identifier is chosen randomly. This reduces the chance of a packet in both streams having the same identifier.

CSRC ID
Contains one (or more) Contributing Source (CSRC) identifiers for each RTP stream present in the packet. To assist audio stream re-assembly, the SSRC IDs can be appended to this packet component. The SSRC ID of the packet then becomes the source identifier for the forwarded packet.

Extension Header
Contains arbitrary information, specified by the application. RTP defines the extension mechanism only; the extensions contained within the Extension Header are controlled by the application.

Table A.1. RTP Header Components

Note

RTP headers do not contain a payload length field. The protocol relies on the underlying protocol to determine the end of the payload. For example, in the TCP/IP architecture, RTP is used on top of UDP, which does contain length information. Using this, an application can determine the size of the whole RTP packet and, after its header has been processed, the application automatically knows the amount of data in its payload section.
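As a small worked example of the Time Stamp behaviour described in Table A.1: for 8000 Hz PCMU audio sent in 20 ms packets, each packet carries 160 samples, so the RTP timestamp advances by 160 from one packet to the next. The arithmetic is sketched below (the variable names are illustrative).

int sampleRate = 8000;           // samples per second for PCMU audio
int packetDurationMs = 20;       // packetization interval
int samplesPerPacket = sampleRate * packetDurationMs / 1000;   // 160 samples
int timestampIncrement = samplesPerPacket;                     // timestamp grows by 160 per packet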

A.4.2. Real-time Transport Control Protocol

RTP is accompanied by a control protocol, the Real-time Transport Control Protocol (RTCP). Each participant of an RTP session periodically sends RTCP packets to all other participants in the session for the following reasons:

• To provide feedback on the quality of data distribution. The information can be used by the application to perform flow and congestion control functions, and can be used for diagnostic purposes.

• To distribute identifiers that are used to group different streams together (for example, audio and video). Such a mechanism is necessary since RTP itself does not provide this information.

• To observe the number of participants. The RTP data cannot be used to determine the number of participants because participants may not be sending packets, only receiving them; for example, students listening to an on-line lecture.

• To distribute information about a participant; for example, information used to identify students in the lecturer's conferencing user interface.

There are several types of RTCP packets that provide this functionality.

• Sender

• Receiver


• Source Description

• Application-specific Data

Sender reports (SR) are used by active senders to distribute transmission and reception statistics. If a participant is not an active sender, reception statistics are still transmitted by sending receiver reports (RR).

Descriptive participant information is transmitted in the form of Source Description (SDES) items. SDES items give general information about a participant, such as their name and e-mail address. However, they also include a canonical name (CNAME) string, which identifies the sender of the RTP packets. Unlike the SSRC identifier, the SDES CNAME stays constant for a given participant, is independent of the current session, and is normally unique for each participant. Thanks to this identifier it is possible to group different streams coming from the same source.

There is a packet type that allows application-specific data (APP) to be transmitted with RTP data. When a participant is about to leave the session, a goodbye (BYE) packet is transmitted.

The transmission statistics which an active sender distributes include both the number of bytes sent and the number of packets sent. The statistics also include two timestamps: a Network Time Protocol (NTP) timestamp, which gives the time when the report was created, and an RTP timestamp, which describes the same time, but in the same units and with the same random offset as the timestamps in the RTP packets.

This is particularly useful when several RTP packet streams have to be associated with each other. For example, if both video and audio signals are distributed, there has to be synchronization between these two media types on playback, called inter-media synchronization. Since their RTP timestamps have no relation whatsoever, there has to be some other way to do this. By giving the relation between each timestamp format and the NTP time, the receiving application can do the necessary calculations to synchronize the streams.
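The calculation hinted at above can be sketched as follows. Given the (NTP timestamp, RTP timestamp) pair carried in a sender report, a receiver can map any RTP timestamp of that stream to wall-clock time; the method and parameter names below are illustrative.

// Map an RTP timestamp to wall-clock time using a sender report's
// (NTP time, RTP timestamp) pair. clockRate is the RTP clock rate of the
// stream (for example, 8000 for PCMU audio).
double wallClockSeconds(long rtpTimestamp, long srRtpTimestamp,
                        double srNtpSeconds, int clockRate) {
    // Offset of this packet relative to the sender report, in seconds.
    double offset = (rtpTimestamp - srRtpTimestamp) / (double) clockRate;
    return srNtpSeconds + offset;
}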

A participant in an RTP session distributes reception statistics about each sender in the session. For a specific sender, a reception report includes the following information:

• The fraction of packets lost since the last report. An increase in this value can be used as an indication of congestion.

• The total number of packets lost since the start of the session.

• The amount of inter-arrival jitter, measured in timestamp units. When the jitter increases, this is also a possible indication of congestion.

• Information used by the sender to measure the round-trip propagation time to this receiver. The round-trip propagation time is the time it takes for a packet to travel to this receiver and back.

Because RTCP packets are sent periodically by each participant to all destinations, the bandwidth they consume must be kept as small as possible. The RTCP packet interval is calculated from the number of participants and the amount of bandwidth the RTCP packets may occupy. To stagger the transmission of RTCP packets by participants, the packet interval value is multiplied by a random number.
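A heavily simplified sketch of this interval calculation is shown below; it loosely follows the idea in RFC 3550 (a 5% bandwidth share, a 5-second minimum and a 0.5 to 1.5 randomization factor), but it is an illustration rather than the full algorithm.

double rtcpIntervalSeconds(int participants, double sessionBandwidthBps, double avgRtcpPacketBytes) {
    // Allow RTCP to use roughly 5% of the session bandwidth, in bytes per second.
    double rtcpBandwidth = 0.05 * sessionBandwidthBps / 8.0;
    double interval = participants * avgRtcpPacketBytes / rtcpBandwidth;
    interval = Math.max(interval, 5.0);          // enforce a minimum interval
    // Multiply by a random factor between 0.5 and 1.5 so that participants
    // do not all transmit their reports at the same moment.
    return interval * (0.5 + Math.random());
}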


A.4.3. Jitter

The term jitter refers to processing delays that occur at each endpoint, generally caused by packet processing in operating systems, codecs, and networks. Jitter affects the quality of the audio and video stream when it is decoded by the receiving application.

End-to-end delay is caused by the processing delay at each endpoint, and may be caused in part by IP packets travelling through different network paths from the source to the destination. The time it takes a router to process a packet depends on its congestion situation, and this may also vary during the session.

Although a large overall delay can cause loss of interactivity, jitter may also cause loss of intelligibility. Though jitter cannot be totally removed, its effects can be reduced by using a jitter buffer at the receiving end. The diagram below shows the effect with and without a media buffer.


Fig. a shows that packet 3 is lost because it arrived late. Fig. b uses a jitter buffer: arriving packets are stored in the buffer, and the media components read from it once it is half full. This way, even if packet 3 arrives a little late, it is still read by the components.
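The "read once half full" behaviour can be illustrated with a toy jitter buffer; this is an illustration of the concept only, not Media Server code.

import java.util.ArrayDeque;
import java.util.Queue;

// Toy jitter buffer: arriving packets are queued, and the reader only starts
// consuming once the buffer is half full, absorbing small variations in
// packet arrival time.
public class SimpleJitterBuffer {
    private final Queue<byte[]> buffer = new ArrayDeque<>();
    private final int capacity;
    private boolean playing;

    public SimpleJitterBuffer(int capacity) {
        this.capacity = capacity;
    }

    public synchronized void write(byte[] packet) {
        if (buffer.size() < capacity) {
            buffer.add(packet);
        }
        if (!playing && buffer.size() >= capacity / 2) {
            playing = true;   // start reading once the buffer is half full
        }
    }

    public synchronized byte[] read() {
        if (!playing || buffer.isEmpty()) {
            return null;      // caller plays silence until data is available
        }
        return buffer.poll();
    }
}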


Appendix B. Revision History

Revision 8.0    Mon Sep 13 2010    Tom Wells [email protected]
Seventh release of the "parameterized" documentation.

Revision 7.0    Fri Aug 6 2010    Tom Wells [email protected]
Sixth release of the "parameterized" documentation.

Revision 6.0    Fri Apr 23 2010    Tom Wells [email protected]
Fifth release of the "parameterized" documentation.

Revision 5.0    Tue Feb 2 2010    Tom Wells [email protected]
Fourth release of the "parameterized" documentation.

Revision 4.0    Thu Nov 19 2009    Tom Wells [email protected]
Third release of the "parameterized" documentation.

Revision 3.0    Thu Jun 11 2009    Jared Morgan [email protected]
Second release of the "parameterized" documentation.

Revision 2.0    Fri Mar 06 2009    Douglas Silas [email protected]
First release of the "parameterized", and much-improved JBCP documentation.
