The Rich Communication Services (RCS) programme is a GSM Association (GSMA) initiative to create inter-operator communication services based on the IP Multimedia Subsystem (IMS). The initiative arose as the global telecom operators' response to declining revenues and as a way to compete with 'Over The Top' (OTT) service providers such as Viber and WhatsApp. Unlike OTT services with their closed communities, RCS is a universal standard and therefore interoperable across multiple service providers. RCS services use IMS as the underlying architecture, with an RCS stack implemented as an Android background service that offers a high-level API.

Testing RCS stack functionality, which in real telecommunication scenarios is usually affected by external dependencies such as third-party vendors or ISP customizations, creates a persistent demand for scenario-replay tools that can recreate the range of test conditions experienced in live deployments. There is also a need to evaluate the performance of the services provided by application servers in the network in order to predict the factors affecting the RCS service in general.

In this work, we propose a tool that addresses RCS scenario reproduction in a test environment. The tool is implemented within an automated test environment with full control over the interaction with the RCS stack, and hence the ability to replay a scenario in a controlled fashion. To achieve this, the tool replays a trace interactively with the RCS stack in a stateful manner, ensuring that no improper packets are generated, a critical feature for test environments where accuracy of protocol semantics is fundamental. A detailed demonstration of how the tool can be deployed in test environments is given. A novel approach is used to validate the effectiveness of the replayed scenario: the sequence of events and states is compared to that of the recorded scenario using a call-back service that indicates the state. The replayed scenario showed a strong relationship with the recorded RCS scenario.

The paper also presents a performance evaluation of the application server by considering the request-response times of the network registration procedure. The results show that the average time taken for the registration process is 555 milliseconds, with a few instances deviating substantially from this average, indicating faulty behaviour of the server, which is most relevant to developers during debugging.
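To make the validation idea more concrete, the following is a minimal, hypothetical sketch of how replayed state transitions reported by a call-back service could be collected and compared against the recorded sequence. The class and method names (ReplayValidator, StateEvent, onStateChanged) are illustrative assumptions, not part of the thesis tool or any published RCS/Android API.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Objects;

// Hypothetical sketch: collect state transitions from a call-back service
// during replay and check them against the recorded scenario's sequence.
public class ReplayValidator {

    // One state transition reported by the call-back service.
    public static class StateEvent {
        final String state;    // e.g. "REGISTERING", "REGISTERED", "SESSION_ESTABLISHED"
        final long offsetMs;   // time since scenario start, in milliseconds

        StateEvent(String state, long offsetMs) {
            this.state = state;
            this.offsetMs = offsetMs;
        }
    }

    private final List<StateEvent> recorded;
    private final List<StateEvent> replayed = new ArrayList<>();

    public ReplayValidator(List<StateEvent> recorded) {
        this.recorded = recorded;
    }

    // Invoked by the call-back service each time the RCS stack changes state.
    public void onStateChanged(String state, long offsetMs) {
        replayed.add(new StateEvent(state, offsetMs));
    }

    // The replay is considered consistent if the replayed state sequence
    // matches the recorded one, event for event.
    public boolean sequencesMatch() {
        if (replayed.size() != recorded.size()) {
            return false;
        }
        for (int i = 0; i < replayed.size(); i++) {
            if (!Objects.equals(replayed.get(i).state, recorded.get(i).state)) {
                return false;
            }
        }
        return true;
    }
}
```

In such a setup, a stricter check could also compare the timing offsets of matching events within a tolerance, which would additionally surface the large request-response deviations mentioned for the registration procedure.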
Identifier | oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:bth-10846 |
Date | January 2015 |
Creators | Yellakonda, Amulya |
Publisher | Blekinge Tekniska Högskola, Institutionen för kommunikationssystem |
Source Sets | DiVA Archive at Uppsala University |
Language | English |
Detected Language | English |
Type | Student thesis, info:eu-repo/semantics/bachelorThesis, text |
Format | application/pdf |
Rights | info:eu-repo/semantics/openAccess |