GXtest2OpenSTA for Performance Testing

{{Idiomas
|GXtest2OpenSTA para pruebas de performance
|GXtest2OpenSTA for Performance Testing
}}
 
[[category:GXtest Guides]]
 
== Introduction ==
GXtest2OpenSTA is a new feature that lets us run performance tests (stress testing, load testing, scalability testing, capacity testing) while cutting automation costs by roughly 80%. How? Even though GXtest is a functional testing tool, once we have automated our test cases it can automatically generate scripts for an open source performance testing tool, [http://opensta.org OpenSTA], giving the tester numerous advantages that we will cover later in this article.
  
In this article we will start with an introduction to performance testing, then go through how these tests are usually carried out with OpenSTA, and finish by looking at how to use GXtest2OpenSTA, so we can understand the real benefits this tool brings for guaranteeing the quality of our systems before putting them into production.
  
== Introducing Performance Testing ==
An essential point to consider about performance tests is that, for several reasons, it is impossible to test every functionality of an application. With that in mind, a new problem comes up: Which functionalities should we test? Which test cases should we select to automate and include in a performance test?
  
To answer these questions we can resort to historical data taken from the previous system (when there is an earlier version in production), which gives us an idea of the number of active users, how often the functionalities we care about are used, and so on. If that is not possible, we can turn to the knowledge of experts on the application (project managers, developers, users, or anyone else who can estimate which functionalities are the most appropriate to include in the tests).
  
Once the test cases to run have been selected, they have to be automated. Building the scripts is usually the most time-consuming part of the project; fortunately, GXtest helps with this task and reduces the effort considerably.
  
  
=== When should we do performance testing? ===
  
Basically, there are two strategies for approaching performance testing. The traditional one is to start automating the test cases after the product is finished, so there will be no further changes in the source code. Its advantage is that no time has to be spent maintaining the automated test cases, since we work against a single version of the system. Its main drawback, however, is that design or architectural problems may be discovered very late in the development process, forcing a major rework of the system.
  
The other way to approach a performance project is to start the performance tests in parallel with development, as the main functionalities are released. The advantage of this method is that design, architectural, or even functional problems can be discovered earlier and fixed as development progresses. However, the cost of the performance tests can grow considerably, since the test cases have to be adapted every time the application changes. For this maintenance work, using GXtest as the automation tool is key, because it lets us quickly adapt the automated test cases to changes in the system.
  
 
== OpenSTA ==
 
[http://www.opensta.org OpenSTA] is an open source performance testing tool. It is used to automate test cases into scripts containing the HTTP (or HTTPS) requests made in each test case, and it lets us modify those scripts to perform different actions (such as adding validations, conditions, parameterizations, etc.). The figure below shows the format of an OpenSTA script.
  
  
  
[[Image:OpenSTAScript-English.png|frame|center]]
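
OpenSTA scripts are written in SCL (Script Control Language). As a rough orientation only, the sketch below shows the typical shape of such a script; it is hand-written for this article, not produced by a recording or by GXtest, and the host, page and timer names are made up.

<pre>
!Illustrative SCL sketch -- a real recorded script contains many more headers and requests
Environment
    Description "Example login step"
    Mode        HTTP
    Wait        UNIT MILLISECONDS

Definitions
    Timer T_Login

Code
    Start Timer T_Login

    !Primary request for the page (host and path are fictitious)
    PRIMARY GET URI "http://myserver/myapp/login.aspx HTTP/1.0" ON 1

    !Secondary request for an embedded resource
    GET URI "http://myserver/myapp/images/logo.gif HTTP/1.0" ON 2

    End Timer T_Login

Exit
</pre>

In a real recording each request also carries its headers (and POST bodies where needed), which is part of what makes maintaining these scripts by hand so laborious.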
  
  
  
Moreover, OpenSTA lets us set up test scenarios where we can configure the number of active users, the number of iterations, the way virtual users enter the system, and other options that make the test represent a scenario as realistic as possible. The figure below shows an example scenario.
  
  
[[Image:OpenSTAEscenario.png|frame|center]]
  
  
When it comes to performance testing, OpenSTA is one of the best tools available, because:
  
*It can drive a large number of virtual users per load-generating machine.
*It makes distributing the load across machines simple.
*Its model for building the scenario simulation is well suited to web applications.
*It offers many options for analyzing the results: numerous charts, tables, etc.
*Scripting productivity is much higher once you have experience with the tool.
  
On the other hand, a drawback of OpenSTA is the test case recording process. Recording a test case produces a script with the HTTP requests that were made, so someone who does not know the application will find it very hard to tell the steps apart or to identify what each request does. To make the script clearer, comments are often inserted between the steps during the recording, along with timers that measure the time of each step. If on top of that we want to add validations or parameterize variables, we end up concluding that writing an OpenSTA script is a heavy task, and it can become cumbersome in projects where the application code is constantly changing, forcing the scripts to be regenerated frequently.
  
In a performance project it is advisable to complement the OpenSTA measurements with monitoring tools for the GeneXus application, such as [http://wiki.gxtechnical.com/commwiki/servlet/hwiki?Application+Monitoring+and+Management%3A+using+a+JMX+or+WMI+monitor JMX].
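
For Java environments, a common way to make the application server's JVM observable over JMX is to start it with the standard remote JMX system properties. The port number and the disabled authentication/SSL below are example values for an isolated test environment only, not a recommended production configuration.

<pre>
(JVM startup options -- example values only)
-Dcom.sun.management.jmxremote
-Dcom.sun.management.jmxremote.port=9010
-Dcom.sun.management.jmxremote.authenticate=false
-Dcom.sun.management.jmxremote.ssl=false
</pre>

With these options set, a JMX console (jconsole, for instance) can connect to the server while the OpenSTA scenario is running and record CPU, memory and thread behavior alongside the response times measured by OpenSTA.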
  
 
== GXtest2OpenSTA ==
 
As mentioned in the introduction, GXtest offers an alternative that lets us overcome this difficulty.
  
Thanks to this new functionality, GXtest can generate OpenSTA scripts from a previously recorded test case. This gives the tester many advantages, among them:
  
*Test cases are recorded with GXtest, which is much more practical and allows adding validations and the other options mentioned in other articles.
*The heaviest part of the script is produced automatically, with no time wasted on repetitive tasks.
*If the application changes, the changes can be applied in GXtest and the scripts regenerated easily, without having to record them again in GXtest.
*There is no need to manually add validations, timers, or parameterizations. This is normally a fairly long task and often causes problems; GXtest inserts all these elements automatically.
*Secondary requests (css, gif, jpg, js, etc.) are kept in files separate from the primary ones, resulting in a tidier script that is easier to analyze.
*A debug flag controls the secondary requests and log messages.
*NTLM authentication is supported.
*Redirects are handled automatically.
  
=== How does the script generator work? ===
  
The performance script is created by clicking the GXtest button "Generate OpenSTA scripts".
  
[[File:OpenSTAButton.png]]
  
When the request to create an OpenSTA script is sent, the test case starts executing on the user's screen as usual. Internally, however, several different things happen: as the test case runs, an XML file is created with all the GXtest commands executed in the test, while in parallel the HTTP requests are recorded into a file with the ''saz'' extension. These files are left inside a folder named after the test case, under the GXtest Designer installation directory, normally in: ''Abstracta\GXtest Designer\Performance\ExecutionLog\''.
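
For example, after running a test case the contents of that folder might look like this (the test case name ''Login'' and the exact file names are hypothetical; GXtest determines the real names):

<pre>
Abstracta\GXtest Designer\Performance\ExecutionLog\
    Login\
        Login.xml   (GXtest commands executed during the run)
        Login.saz   (HTTP(S) traffic recorded through Fiddler)
</pre>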
  
The ''saz'' format belongs to the [http://www.fiddler2.com Fiddler] tool, a proxy that records the HTTP(S) traffic between the PC and the Internet so it can later be analyzed, allowing breakpoints to be inserted, requests to be edited at the protocol level, and so on.
  
From these two files (XML and saz) the OpenSTA script is generated, taking the GXtest commands from the XML file and the HTTP(S) requests from the saz file. Once the execution is completed, the resulting script has each executed GXtest command associated with the HTTP(S) requests it performs, together with their corresponding timers and, if any were added, their corresponding validations. Moreover, if the GXtest test case uses a variable whose value is taken from a datapool, it will appear parameterized in the OpenSTA script. The generated scripts are placed in the folder ''Abstracta\GXtest Designer\Repository''.
  
In general, the resulting scripts are ready to run without any changes, although, depending on the system, modifications may be made to add conditions, variables, validations, etc., so that the script behaves as realistically as possible.
  
=== Installing OpenSTA ===
  
 
Abstracta has made some bug fixes on top of the latest officially released version of OpenSTA. To apply these fixes, install the latest released version of OpenSTA and download the fix [http://www.abstracta.com.uy/en/descargas-2.html here]. Then apply the patch following the instructions in the readme file included in the downloaded files.

=== Enable GXtest2OpenSTA ===

In order to generate the files required for performance tests with OpenSTA, file generation must be enabled in the menu ''Options -> Local Settings''.

[[Image:HabilitaropenSta.PNG|frame|center]]

There we can enable or disable the file generation, and also configure the locations of the Fiddler tool and the OpenSTA Modeller.
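
As a purely hypothetical illustration (the real values depend on where the tools are installed on your machine), those two settings could point to locations such as:

<pre>
Fiddler:           C:\Program Files (x86)\Fiddler2\Fiddler.exe    (example only)
OpenSTA Modeller:  C:\Program Files\OpenSTA\                      (example only -- point it to your own install)
</pre>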

== Summary ==
[[Image:TestersTrabajando.png|frame|center|200px]]

GXtest makes it possible to create scripts for performance tests with the OpenSTA tool. Generating the scripts from GXtest saves the tester a great deal of repetitive work and frees time to focus on other activities aimed at catching as many performance errors as possible, resulting in a higher quality product.

== References ==
  
 
*[http://perftestingguide.codeplex.com/ Scott Barber; Performance Testing Guidance for Web Applications]
*[http://www.ibm.com/developerworks/rational/library/4228.html Scott Barber; User experience, not metrics]
*[http://blog.ces.com.uy/?cat=10 Blog del CES]
