

Showing posts with label Variables. Show all posts

Wednesday, May 22, 2013

What are Address Binding, Dynamic Loading and Dynamic Linking?

In this article we discuss three interrelated concepts, namely address binding, dynamic loading, and dynamic linking.

1. Address Binding: 
- Computer memory has two kinds of addresses. 
- These are called the physical address and the logical address. 
- The logical address is sometimes also referred to as the virtual address. 
- Address binding is the process of allocating a physical memory location to a logical address; it is simply the association of the physical address and the logical address with each other. 
- This concept is an important part of memory management. 
- The operating system carries out address binding on behalf of the applications and programs that need access to memory. 
- A program cannot be executed without first bringing it into main memory. 
- The instructions of the program have to be bound to the right address spaces in physical memory, and address binding is the scheme for performing this job. 
- It can be thought of as something similar to address mapping. 
- Address binding can be carried out at any of the following times:
Ø  Compile time
Ø  Loading time
Ø  Execution time

- In execution-time binding, whenever the program accesses memory, the address goes through a register called the relocation register (similar to a base register), and the offset is added to it. 
- In load-time binding the same mapping is performed, but the register does not need to be consulted on every access: the addresses are mapped once, at the time the program is loaded into memory. 
- If the base address then changes, the whole program has to be reloaded.
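As a purely illustrative sketch (the class and numbers below are invented, not any real MMU's API), execution-time binding adds the relocation-register base to every logical address on each access:

```python
# Sketch of execution-time address binding: every logical address is
# translated through a relocation (base) register at each memory access.
class MMU:
    def __init__(self, relocation_register, limit):
        self.base = relocation_register  # physical start of the process
        self.limit = limit               # size of the logical address space

    def translate(self, logical_address):
        if not 0 <= logical_address < self.limit:
            # a real CPU would trap here (addressing error)
            raise ValueError("logical address out of range")
        return self.base + logical_address  # physical = base + offset

mmu = MMU(relocation_register=0x4000, limit=0x1000)
print(hex(mmu.translate(0x10)))  # -> 0x4010

# Load-time binding applies the same base once, at load time; moving the
# program afterwards requires reloading it, because the addresses have
# already been rewritten rather than being translated on each access.
```
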

2. Dynamic Loading: 
- Dynamic loading is a mechanism that lets a program do the following things:
Ø  Load a library into main memory.
Ø  Retrieve the addresses of the variables and routines contained in the library.
Ø  Access those variables and execute those routines.
Ø  Unload the library.
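The four steps above can be sketched with Python's ctypes module, which performs dynamic loading at run time. This assumes a POSIX system where the standard C math library can be located; it is an illustration, not part of any particular loader's API:

```python
import ctypes
import ctypes.util

# 1. Load the library into main memory (lookup is platform-specific;
#    this assumes a POSIX system with a C math library).
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# 2. Retrieve the address of a routine contained in the library.
cos = libm.cos
cos.argtypes = [ctypes.c_double]
cos.restype = ctypes.c_double

# 3. Access and execute the routine through the retrieved symbol.
print(cos(0.0))  # -> 1.0

# 4. Unloading: ctypes has no explicit unload call; CPython releases the
#    handle when the CDLL object is garbage-collected.
```
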
- Dynamic loading is quite different from load-time linking and static linking. 
- It allows a system to start up even if some libraries are absent; the absent libraries can be discovered later and their additional functionality gained then. 
- Dynamic loading is a largely transparent process, since the operating system handles it. 
- Its main advantages are, first, that patches can be applied at once without the need for re-linking and, second, that libraries are protected against unauthorized modification. 
- Dynamic loading finds its major use in the implementation of software plugins.
- It is also used in programs where the requisite functionality is supplied by different libraries and the user is free to select which libraries to provide.

3. Dynamic Linking: 
- Dynamic linking is an important part of the binding process. 
- Its purpose is to resolve references, or symbols, into links to the library modules. 
- This process is carried out by a linker program, which searches a set of library modules in a given sequence. 
- The process takes place during the creation of the executable file. 
- The resolved references may be the addresses of jump calls and routines. 
- These may be in different modules or in the main program.
- Dynamic linking resolves them into relocatable or fixed addresses by allocating memory to each memory segment of the referenced module. 


Friday, October 26, 2012

How to run the silk scripts?


Silk Test is a test automation tool from Borland, launched in 2006; however, the credit for its original creation goes to Segue Software. The tool facilitates automated regression testing and functional testing.
Silk Test offers various clients, as described below:
  1. Silk Test Classic: This client uses a domain-specific language called "4Test" for scripting the test automation scripts. Like C++, 4Test is an object-oriented language and makes use of object-oriented concepts such as the following:
a)   inheritance,
b)   classes, and
c)   objects.
  2. Silk4J: This client enables test automation using Java as the scripting language in Eclipse.
  3. Silk4NET: This client enables test automation using VBScript or C# as the scripting language in Visual Studio.
  4. Silk Test Workbench: This client enables testers to carry out automation testing using VB.NET as the scripting language, as well as at a visual level.

What kind of files does a Silk script consist of?

In this article we are going to see how test scripts are executed in Silk Test.
-  A basic or a typical silk script consists of two files namely:
  1. Include file and
  2. Script file
- The include file of a Silk script is saved with the extension .inc and is used in particular for declaring the following:
  1. Window names
  2. Window objects
  3. Variables
  4. Constants
  5. Structures
  6. Classes and so on.
- The second file, i.e., the script file, is where the scripts themselves are written. 
- It is saved with the extension .t and is used for defining the body of the scripts. 
- Test cases meeting the various test conditions are defined in this file. 
- The two different file types are used in order to keep the code as clear as possible. 
- If the script file does not contain any test cases, it may compile but cannot be executed. 
- If execution is attempted, an error will be generated saying that there are no test cases. 
- Only a file that contains at least one test case can be run. 
- Before you run the script, always make sure that you have made the declarations in separate files:
  1. one for the declaration of the objects, and
  2. one for the creation of the scripts using the declaration file.

Steps for running Test Scripts

After the two declarations have been made, you should proceed to compile them. Below are the steps to be followed for running the test scripts:
  1. Launch the silk test automation tool.
  2. Open the script file of the script to be executed.
  3. Compile the above script by using the compile option from menu bar.
  4. Once the compilation of the file is complete, the status of the script can be checked via the progress status.
  5. Any errors present in the script will be displayed at the end of the compiling process.
  6. Now run the silk scripts by clicking the run option in the menu bar.
  7. For running the test scripts you have two options: either you can run all of them at a single stretch or you can run them selectively.
  8. If you have opted for the latter option then you need to specify which all tests have to be executed.
  9. After selection you can again give the run command. 


Monday, October 8, 2012

How is Run Time Data (Parameterization) Handled in QTP?


Efficient run-time data handling is quite important for proper test automation with QuickTest Professional (QTP). In this article we discuss how run-time data is parameterized, or handled, during a run session in QTP.
The parameterization of run-time data is necessary in QTP because it enhances the test components. 

What happens in Run-time Data Parameterization?

- In run-time data parameterization, a variable is passed as a parameter or argument from an external source, which in most cases is a data generator. 
- The parameter that is passed carries a value assigned by that generator. 
- Variables in a test component can be parameterized in a series of different checkpoints as well as in a series of different steps, as the situation requires.
- Apart from normal values, action data values can also be parameterized. 
- To parameterize the same value across a series of several steps, the data driver wizard can be used.  

How it is handled in QTP?

- The QTP test automation suite comes with a built-in integrated spreadsheet in which the run-time data table is filled with test data. 
- This spreadsheet is much like an Excel spreadsheet, so multiple test iterations can be created, which in turn saves a great deal of programming effort. 
- The run-time data can be entered either manually or automatically, by importing the data from spreadsheets, text files, databases, and so on. 
- The spreadsheets in QTP offer the full functionality of Excel spreadsheets. 
- Using these spread sheets following tasks can be achieved:
  1. Manipulation of the data sets
  2. Creation of multiple iterations of the tests
  3. Expanding test coverage without unnecessary programming and so on.
- In simple terms, parameterization can be defined as providing multiple test data inputs to the tests. 
- While working in QTP, input can be supplied in the following four ways:
  1. Input through Notepad
  2. Input through the keyboard
  3. Input or import via a database
  4. Input through an Excel spreadsheet.
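QTP's actual data table is driven from VBScript; as a language-neutral sketch of the underlying idea (one test body, one iteration per data row), in Python with made-up table contents:

```python
# Sketch of data-driven parameterization: the test body is written once
# and run for every row of a data table, the way QTP iterates over its
# run-time data table. All names and values here are illustrative.
data_table = [
    {"username": "alice", "password": "secret1", "expected": "welcome"},
    {"username": "bob",   "password": "secret2", "expected": "welcome"},
    {"username": "eve",   "password": "wrong",   "expected": "denied"},
]

def login(username, password):
    # Stand-in for the application under test.
    return "welcome" if password.startswith("secret") else "denied"

results = []
for row in data_table:  # one test iteration per data row
    actual = login(row["username"], row["password"])
    results.append(actual == row["expected"])

print(results)  # -> [True, True, True]
```
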

What is Run Time Data?

Run-time data is simply a live version of the data currently associated with the test under execution. 
Two methods are available for parameterizing run-time data, as mentioned below:
  1. DTSheet.GetParameter: with this method the specified parameter can be retrieved from the run-time data table.
  2. DTSheet.AddParameter: with this method a new column is added to the run-time data table.
Properties of Run-Time Data
  1. Name property: It defines the name of the column in the run time data table.
  2. Raw value property: It gives the raw value of a cell in the current row of the parameter under consideration. The raw value is the actual string written in the cell before any computation, for example the actual text of a formula.
  3. Value property: This is the default property of the parameter and is used for retrieving as well as setting the value of the cell in the active row of the run-time data table.
  4. Value-by-row property: It is used for retrieving the value of the cell in the row specified by the parameter.  


Wednesday, October 3, 2012

How would you export a script from one PC to another in QTP? Can you launch two instances of QTP on the same machine?


While working on a collaborative project in QTP it sometimes becomes necessary to move scripts from one machine to another. In this article we discuss how this is done, and also whether two instances of QTP can be launched on the same machine.

Exporting a script from one PC to another

- As you may have noticed, QTP comes with a tab titled "export". 
- To export, simply mark the code that you want to export, copy it, and paste it into a text file. 
- The copied code may consist of repository items, parts of tests, parts of settings, parameters, environment variables, and so on. 
- However, the export option is not directly available in Quality Center's test script lab.
- A suitable version of QTP needs to be installed. 
- Then, you need to connect it to the Quality Center project. 
- The next step is to open the test via QTP. 
- Once you are done with these 3 basic steps, you can export the file via the "export to zip file" feature, either into the QTP database or some other location, and either on the same machine or on a different machine. 
- One may have a bulk of QTP test automation scripts on a local machine that need to be exported from the local machine to QC. 
- Exporting sometimes becomes necessary when there are too many scripts and it would take a whole lot of time to open each and every script and then save it in Quality Center.
- Some third-party tools are available that serve the purpose of uploading many QTP scripts at once from one PC to another.  
- There is one more method, which makes use of the "generate script" function built into QTP. 
- This function is available in the object identification section, under the test settings, tools, and options tabs. 
- Using this function, a zip file of the scripts to be exported can be created. 
- The zip file of the specified scripts is created on the source computer itself. 
- Later, this zip file can easily be transferred to QTP on the destination computer.

Now let us answer the second question of this article, regarding launching two instances on the same machine. 
- The answer is no, since QTP only has the ability to work with multiple instances of the AUT (application under test). 
- An example of this is multiple windows of the Internet Explorer browser, which QTP can handle. 
- Therefore, only a single instance of QTP can be launched on a machine at a time. 
- Two instances cannot be launched on the same machine, but you can go for the virtualization option. 
- Today, many virtualization tools are available, such as Sandboxie, Altiris SVS, and so on. 
- These two tools have been used for running two Mozilla Firefox profiles on the same PC and for testing software in an isolated PC environment, but have not been put to use for launching two instances of QTP on the same machine. 
- The possibility remains, however, that these tools could be used for running multiple instances of QTP on the same machine. 


Thursday, September 27, 2012

How would you connect to a database using VBScript?


VBScript, or Visual Basic Scripting Edition, is the basic scripting language of HP's test automation suite QuickTest Professional (QTP). It is a lively scripting language interpreted by the Microsoft Windows Script Host, and it comes with a whole lot of excellent and powerful functions along with good support for data types, error handling, and variables. 

There are two engines that can interpret VBScript, namely:
  1. Wscript.exe and Cscript.exe: these work with the help of WSH, the Windows Script Host (Wscript.exe in the Windows GUI environment, Cscript.exe at the command line). This engine is basically used for automating administration tasks as well as for automating systems.
  2. VBScript.dll: this engine can be invoked by asp.dll and basically serves in the web environment.
VBScript is used in QTP through the following objects:
  1. Database recordset object
  2. Database command object
  3. Database connection object
In this article, we discuss how a connection between a database and QTP can be established through VBScript. 

The below mentioned are the various VBScript operations available for connecting to and accessing a database in QTP:
  1. Updating records in a record set
  2. Deleting a record from the record set
  3. Finding a record in record set
  4. Clearing a data base table
  5. Connecting to an ADO data base
  6. Adding a new record to the data base

How to connect to a database using VBScript?

- To connect to a database using VBScript, a DSN (data source name) is required. 
- An ADO connection is another requirement for making an open connection with a specific data source. 
- Once this connection is established, the database can easily be accessed as well as manipulated. 
- Also, once this connection is established, it does not matter how many times the database is accessed. 
- Another way to establish a connection with the database using VBScript is to pass a connection string through a recordset object or a particular command. 
- Though this method is quite quick, it holds good only for single, specific queries. 
- A data source is essentially a connection from a web server to a database. 
- This connection can be established either via a dedicated machine running SQL Server or through a database file on a web server at some other location. 
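In VBScript this open-query-close pattern runs through ADO objects; as a language-neutral sketch of the same pattern, here is a version using Python's built-in sqlite3 module, with an in-memory database standing in for the real data source (table and data are invented for illustration):

```python
import sqlite3

# Open a connection (in ADO/VBScript this would be Connection.Open with
# a connection string or DSN; here an in-memory SQLite DB stands in).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Once the connection is established, the database can be accessed and
# manipulated any number of times over the same open connection.
cur.execute("CREATE TABLE users (name TEXT, role TEXT)")
cur.execute("INSERT INTO users VALUES (?, ?)", ("alice", "tester"))
conn.commit()

cur.execute("SELECT role FROM users WHERE name = ?", ("alice",))
row = cur.fetchone()
print(row[0])  # -> tester

# Close the connection when access is complete.
conn.close()
```
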

Two types of DSNs or data sources are available namely:
  1. File DSN: This is a connection created by the script when access to the database is required. Here the path and name of the database to be accessed need to be specified. Another condition for this DSN to work is that the database to be connected to must reside on the server in a directory that is accessible to the script.
  2. System DSN: The creation of this kind of connection is in the charge of the administrator of the web server that holds the required database. This is the most popular database connection, since it is more reliable than its former counterpart.
- After access to the database is complete, it is important that the connection be closed at the end of each page. 
- For creating any application, the foremost thing required is the database itself. 
- A number of programs are available for developing a database; however, Microsoft Access remains the most popular of them all. 
- It is important for a web application designer to know the basics of database usage with VBScript as well as Active Server Pages when developing real web applications.


Wednesday, September 19, 2012

What are the types of environment variables in QTP?


QuickTest Professional (QTP) comes with a whole lot of environment variables, of many types. In this article we are going to discuss them, but first let us focus on what these environment variables actually are. 

What are Environment Variables

- Environment variables are "dynamic named values" that can affect the way processes behave on a computer system during their execution.
- Environment variables create the operating environment in which applications or processes run. 

We give an example to make the concept of environment variables easier to understand:
- Suppose a particular system uses a particular location for storing its temporary files. 
- This location can be designated with an environment variable that has a standard name. 
- The location may differ from computer to computer. 
- The environment variable can be accessed by any process using the standard name assigned to it. 
- This mechanism makes sure that a process stores its temporary files in a directory that actually exists and has sufficient space for the data. 
- On UNIX and UNIX-like operating systems, each process is provided with its own set of environment variables.

More about Environment Variables
- Whenever a process is created, by default the child process inherits a duplicate of its parent process's environment. 
- However, changes made to either copy afterwards are not shared between the two. 
- From a command shell, the user can change environment variables by an indirect invocation or by using the notation:
ENVIRONMENT_VARIABLE=value
- Operating systems such as MS-DOS and Microsoft Windows also come with environment variables. 
- However, they do not all use exactly the same environment variables. 
- Running processes can use the values of these environment variables for configuration purposes. 
- Below are some common examples of environment variables:
  1. HOME
  2. USERPROFILE
  3. PATH
  4. TERM
  5. MAIL
  6. TEMP
- For batch files as well as shell scripts, environment variables serve the purpose of communicating preferences and data to child processes. 
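A short sketch of reading and setting such variables from a program, using Python's os module (the variable names HOME and USERPROFILE are the standard ones listed above; MY_TEMP_DIR and its value are invented for illustration):

```python
import os

# Read a standard environment variable, with a fallback if it is unset.
home = os.environ.get("HOME") or os.environ.get("USERPROFILE", "(unset)")

# Set a variable for this process; child processes spawned from here
# inherit a copy of it (the VAR=value notation shown above).
os.environ["MY_TEMP_DIR"] = "/tmp/scratch"

child_env = os.environ.copy()    # what a child process would inherit
print(child_env["MY_TEMP_DIR"])  # -> /tmp/scratch
```
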

Environment Variables in QTP

Now we shall look at the concept of environment variables in QTP. 
- In QTP, environment variables are more like global variables, which can be accessed by any part of the test script.
- Their values do not change across iterations, but they can be changed by modifying them in the script. 
- The basic advantage of environment variables is that they can be used or shared across several reusable actions. 
- Two types of environment variables are defined, namely:
  1. built-in environment variables, and
  2. user-defined environment variables.
- The first type, the built-in environment variables, are internal variables provided by QTP that hold valuable information such as folder paths, the iteration name, the version of the operating system, and so on.
- The user-defined environment variables can be further divided into 2 types, namely:
  1. User-defined internal: variables defined within the test and accessible only within the test, and
  2. User-defined external: variables predefined in an external environment variable file.
Below mentioned are the built in environment variables in quick test professional:
  1. ActionName
  2. ActionIteration
  3. OS
  4. OSversion
  5. ProductName
  6. ProductVer
  7. ResultDir
  8. TestDir
  9. UserName
  10. TestName
  11. UpdatingActiveScreen
  12. UpdatingCheckPoints
  13. GroupName
  14. ScenarioID
To get the values of the environment variables during run time, you should know the path of the folder where the test is located. 


Tuesday, August 21, 2012

When do you use breakpoints? What is a toggle breakpoint? How does it differ from normal breakpoints?


What do you do when you need to pause a test run in the middle of its execution? 
- Breakpoints are used to mark the points in the test script where you want the test to be paused. 
- Breakpoints have proved to be a very useful tool in discovering flaws that might be present in a test script. 
- A breakpoint marker marks each breakpoint present in the test script. 
- This breakpoint marker resides in the left margin of the test script.
- WinRunner pauses the test run whenever it encounters a breakpoint. 
- At a particular breakpoint, the following tasks can be carried out:
  1. The effects of the test run can be examined.
  2. The current value of the variables can be viewed.
  3. Any necessary changes can be made.
  4. Test can be continued to further execution.

How to use Break Points?

- To continue execution after completing the tasks to be carried out at a breakpoint, give the "run from arrow" command and the test will restart from the breakpoint. 
- Once execution resumes, it continues until another breakpoint is encountered or the test is complete. 
- One thing that should always be taken care of is to make sure that WinRunner is out of batch mode; for WinRunner to pause on breakpoints, it is mandatory that it be out of batch mode, otherwise it is sure to ignore them. 
- There are 3 main uses of the break points:
  1. They are used to suspend the test run at a particular point so that the state of your software system or application can be inspected.
  2. They are used for monitoring the entries that are there in the watch list.
  3. They are used to mark certain points throughout the test script for the stepping purpose by using the step commands.

Types of Break Points

We have two types of break points as mentioned below:
  1. Break-at-location point: This breakpoint makes the test stop at a line number of the test script specified by the user, and
  2. Break-in-function point: This breakpoint makes the test stop whenever a function specified by the user is called in a loaded compiled module.
The breakpoints that you define in a session remain active for that particular WinRunner session only. Once you terminate that WinRunner session, you must redefine the breakpoints to continue debugging the test script in the next session.

Toggle Break Point and how they differ from normal break point

- The toggle breakpoint command is used whenever you have to set a break-at-location breakpoint. 
- To do this, just move the insertion point to the line of the test script where you want execution to stop. 
- Now go to the debug menu and click the option "toggle break point", or simply press the button entitled toggle breakpoint. 
- Once you have clicked it, you can observe a breakpoint symbol in the left margin of the current WinRunner window. 
- This break, like other breakpoints, is also listed in the breakpoints list.
- There is just one simple difference: toggle breakpoints automatically use a pass count of 0, and if some other pass count is required these breakpoints can easily be modified. 
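The pass-count idea (a breakpoint only pauses after execution has passed it a given number of times) can be sketched in any language; here is an illustrative Python version with invented names, not WinRunner's actual mechanism:

```python
# Sketch of a breakpoint with a pass count: the breakpoint fires only
# after execution has passed it `pass_count` times. All names invented.
class Breakpoint:
    def __init__(self, pass_count=0):
        self.pass_count = pass_count  # 0 = fire on the very first hit
        self.hits = 0
        self.fired_at = []

    def check(self, state):
        """Call at the marked line; returns True when the run would pause."""
        self.hits += 1
        if self.hits > self.pass_count:
            self.fired_at.append(state)  # a debugger would pause here and
            return True                  # let you inspect variables
        return False

bp = Breakpoint(pass_count=2)  # skip the first two passes
paused = [bp.check(i) for i in range(5)]
print(paused)  # -> [False, False, True, True, True]
```

With pass count 0, as toggle breakpoints use by default, the breakpoint fires on every pass.
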


Saturday, June 9, 2012

What are different scrum controls?


It happens in some cases that the whole scrum process comes to the verge of collapse. In such cases it is required that the management controls stay in order, undisturbed and firm at all times. 
There are many scrum controls; however, risk assessment continues to be the most valuable one, along with its impacts.

What are different Scrum Controls?


The below mentioned are the effective scrum controls:

1. Issues: 
Issues can be thought of as obstacles that do not pose any major risk, defect or bug, but cannot be considered a positive aspect for the software project.

2. Risk assessment: 
This is the most influential scrum control, as it influences the other scrum controls considerably. The success of the project depends largely upon this control and its impacts.

3. Packets: 
These are product elements pending for the modification in order to facilitate the implementation of the product backlog items in to the working software that is to be released at the end of the sprint.

4. Backlog: 
This backlog consists of all the details of the bugs, defects and the requests of the customers that could not be implemented in the current release and have to be incorporated in to the next release. In addition to all these, the backlogs also consist of the technology and functionality upgrades.

5. Solutions: 
These are the scrum controls occurring between the risks, problems and changes.

6. Release and Enhancement: 
After the risk assessment, this is the second most valuable scrum control for the entire development cycle. This scrum control at any point of time represents a viable release based up on the requirement variables.

How do these scrum controls help?


- Most of the above mentioned scrum controls are employed for the management of the product backlogs and the sprint backlogs. 
- These scrum controls are used for the following purposes:
  1. Managing issues
  2. Obtaining better solutions
- Even these controls are reviewed from time to time and modified or reconciled if and whenever required during the sprint planning meetings. 
- These scrum controls help control chaos that occurs during the development process. 
- All the above mentioned scrum controls play a great role in the following stages of the scrum:
  1. Defined processes
  2. Project cost
  3. Final product
  4. Responsiveness to the environment
  5. Completion date
  6. Knowledge transfer
  7. Team flexibility creativity
  8. Probability of success

Scrum, we can say is an enhanced version of the iterative and incremental object oriented development cycle. 
The software releases in a scrum are planned according to the below mentioned variables:
  1. Time pressure: Time frame required to make most of the competitive advantage.
  2. Quality
  3. Resource: It includes staff availability and funds.
  4. Vision (system vision)
  5. Competition: What is required to gain the competitive edge?
  6. Customer requirements: How the current system can be enhanced?
All of the variables mentioned above can be modified in the development plan during the project, but any further processes should take the changed variables into account. A system that requires a complicated and complex development process requires appropriate and maximally efficient control.



Tuesday, May 15, 2012

How does a DU path segment play a role in data flow testing?


Whenever you have come across the topic of data flow testing, you will surely have heard the term "du path segment", perhaps without being familiar with it. This article focuses on du path segments and the role they play in data flow testing. 
We will discuss du path segments in the context of data flow testing, not as a separate topic, so that they are easier to understand. 
The whole process of data flow testing is guided by a control flow graph which, apart from guiding the testing process, also helps in rooting out the anomalies present in the data flow. With the anomalies discovered, one can design better path selection strategies that take these data flow anomalies into consideration. 

There are nine possible anomaly combinations, as mentioned below:
  1. dd: harmless but suspicious
  2. dk: might be a bug
  3. du: a normal case
  4. kd: a normal situation
  5. kk: harmless but might be containing bugs
  6. ku: a bug or error
  7. ud: not a bug because of re- assignment
  8. uk: a normal situation
  9. uu: a normal situation
For data flow testing, the following data object states and usages have been defined:
  1. Defined, initialized, created → d
  2. Killed, undefined, released → k
  3. Used for:
(a)    calculations → c
(b)   predicates → p
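The two-letter codes combine the states d (defined), k (killed), and u (used); a small sketch that classifies a variable's consecutive state pairs (the table simply mirrors the nine combinations listed above):

```python
# Classify consecutive d/k/u state pairs of a variable along a path,
# mirroring the nine anomaly combinations listed above.
ANOMALY = {
    "dd": "harmless but suspicious",
    "dk": "might be a bug",
    "du": "normal",
    "kd": "normal",
    "kk": "harmless but might contain a bug",
    "ku": "bug",
    "ud": "normal (re-assignment)",
    "uk": "normal",
    "uu": "normal",
}

def classify(states):
    """states: a string such as 'dku' tracing one variable along a path."""
    return [(a + b, ANOMALY[a + b]) for a, b in zip(states, states[1:])]

print(classify("dku"))  # -> [('dk', 'might be a bug'), ('ku', 'bug')]
```
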

Terminology associated with Data Flow Testing


Almost all the strategies that are implemented for the data flow testing are structural in nature. There are certain terminologies associated with the data flow testing as stated below:
  1. Definition clear path segment
  2. Loop free path segment
  3. Simple path segment and lastly
  4. Du path

What is a DU path Segment?


- A du path segment can be defined as a path segment that is simple and definition-clear, where the last link or node of the path has a use of the variable x.

Let us take an example to make the concept of a du path segment clearer. 
- Suppose a du path for a variable X exists between two nodes A and B, such that the last link between the two nodes contains a computational use of the variable X. 
- This path is definition-clear and simple. 
- If, instead, the path has a predicate use of X at a node C in the last-but-one position, then the path from node A to node C must be definition-clear and must not contain any loop. 
- Several strategies have been defined for carrying out data flow testing, such as:
  1. ADUP or all du paths strategy
  2. AU or all uses strategy
  3. APU+ C or all p uses/ some c uses strategy
  4. ACU +P or all c uses/ some p uses strategy
  5. AD or all definitions strategy
  6. APU or all predicate uses strategy
  7. ACU or all computational uses strategy

Strategy for DU Path Strategy

We shall describe in detail here only the ADUP, or all du paths, strategy. 
- This strategy is considered one of the most reliable and strongest data flow testing strategies. 
- It involves exercising all the du paths from every definition of a variable to every use of that definition.
- "All du paths" appears to be a strong criterion for testing, but it does not involve as many tests as it seems:
- many criteria are satisfied simultaneously by a single test, for several definitions and uses of the variables.



