
6  Conclusions and Future Work 
Performance testing is needed to reduce the risk of putting any system into production, 
but, as it is expensive and resource demanding, it is often performed poorly or 
incompletely, or its results arrive too late. The most demanding task is automating the 
functionalities to be tested, which consumes time that could otherwise be spent 
executing tests and analyzing how to improve the system. 
With this in mind, this article presented a tool that generates performance test 
scripts at a lower cost by reusing the functional test scripts. This approach not only 
provides greater flexibility when adjusting test scripts to the changes and 
improvements made to the application (which are inevitable in any performance 
testing project), but it also helps to produce higher-quality scripts in less time and 
with less effort. 
The tool implementing this approach has been used in several projects to test the 
performance of a variety of systems, demonstrating the benefits of the proposal. 
We plan to extend the performance test script generation to other load generators, 
such as JMeter, which supports several communication protocols. This would allow 
tests to be executed against systems accessed through different interfaces (HTTP, 
SOAP, FTP) while managing them centrally from a single tool. 
Acknowledgements 
This work has been partially funded by the Agencia Nacional de Investigación e 
Innovación (ANII, Uruguay) and by the GEODAS-BC project (TIN2012-37493-C03-01). 
We would also like to express our special acknowledgement to the Abstracta team. 