

Showing posts with label Simulation. Show all posts

Tuesday, January 1, 2013

What is TPT (time partition testing) test methodology?


TPT, or Time Partition Testing, is a systematic methodology used for carrying out automated testing of software systems and applications. The tool is also used to verify data-flow programs and embedded control systems.

About Time Partition Testing

- Time partition testing has been designed for the verification and validation of embedded systems whose input and output data can be represented as signals. 
- It is also used for testing the continuous behavior of software systems and applications. 
- Many control systems belong to this category of software systems and applications.
- A characteristic feature of such systems is their close interaction with the real-world environment. 
- The controllers need to closely observe the environment and react to it accordingly. 
- These systems are meant to work within an interactive cycle with their environment. 
- This cycle is subject to temporal constraints. 
- Testing such systems means simulating and checking their timing behavior. 
- Time partition testing uses model-based testing rather than the scripts of traditional functional testing methods. 
- Time partition testing combines graphical and systematic modeling techniques so that test cases can be run fully automatically in a variety of environments and evaluated automatically. 
- The following test activities are covered by time partition testing:
  1. Test case modeling
  2. Automated test documentation
  3. Test management
  4. Automated test execution in a number of environments
  5. Automated test result analysis including automated test assessment
- In this methodology, special state machines and time partitioning are used for modeling the tests. 
- All the tests belonging to an SUT (system under test) are modeled as hybrid automata.
- Tests follow a logical sequence of phases. 
- The logical phases are similar for all the tests and are represented by the states of a finite state machine. 
- The transitions between the phases of a test are modeled by trigger conditions. 
- Each transition is characterized by different variants. 
- These individual variants are combined into unique combinations which in turn model individual test cases. 
- Natural language text becomes part of the graphics, since it supports clear and simple reading for programmers and non-programmers alike. 
- There are several techniques that help in modeling complex test cases via intuitive, graphical modeling:
  1. Parallel state machines
  2. Hierarchical branching state machines
  3. Conditional branching
  4. Re-activity
  5. Signal description
  6. Measured signals
  7. Test step list
- The graphics hide the complexity of a test. 
- Test step lists (also known as direct definitions) constitute the lowest level of the signal description. 
- The last technique, the test step list, comprises a number of predefined actions which are executed sequentially and are therefore ordered chronologically. 
- This technique suits test cases which can be described in a sequential manner, with steps chosen from the following:
  1. Channel set i.e., setting a signal
  2. Ramp channel or ramp signal
  3. Set parameters
  4. Compare
  5. Wait
  6. Conditional constructs such as if, else if and else
- In addition, these sequences can be modeled in combination with the other modeling techniques. 
- The direct definition method is used for specifying testlets, which involve an ordered list of equations and signal definitions. 
- Signals can be imported via a function wizard or a manual signal editor. 
However, TPT remains limited to testing the reactive behavior of embedded systems. 
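The test-step-list idea above (set a signal, ramp it, wait, compare) can be sketched in code. This is not TPT itself, only an illustrative interpreter for such a step list; all names, step kinds and signal values here are hypothetical assumptions.

```python
# A minimal, hypothetical sketch of a TPT-style test step list.
# Each step acts on a dictionary of named signals over simulated time.

def run_test_steps(steps, dt=0.1):
    """Execute a sequence of (action, args) steps against a signal dict."""
    signals, t, failures = {}, 0.0, []
    for action, args in steps:
        if action == "set":            # channel set: assign a signal value
            name, value = args
            signals[name] = value
        elif action == "ramp":         # ramp a signal linearly to a target
            name, target, duration = args
            start = signals.get(name, 0.0)
            n = max(1, int(duration / dt))
            for i in range(1, n + 1):
                signals[name] = start + (target - start) * i / n
                t += dt
        elif action == "wait":         # advance simulated time
            t += args
        elif action == "compare":      # assessment: check an expectation
            name, expected, tol = args
            if abs(signals.get(name, 0.0) - expected) > tol:
                failures.append((t, name, signals.get(name), expected))
    return signals, failures

steps = [
    ("set", ("throttle", 0.0)),
    ("ramp", ("throttle", 1.0, 0.5)),   # ramp to 1.0 over 0.5 s
    ("wait", 0.2),
    ("compare", ("throttle", 1.0, 1e-6)),
]
final, failures = run_test_steps(steps)
```

Because every step is data rather than script code, the same list can be replayed and assessed automatically in different environments, which mirrors the motivation for the model-based approach described above.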


Tuesday, May 1, 2012

How does penetration testing tool emphasize on data base security?


The database is one of the most critical elements of a web application and crucial for its proper functioning. All sensitive information regarding the functioning of the application, as well as the user data, is stored in the database. 

This data is of great use to an attacker, who can steal it and use it to his advantage. Therefore, it becomes absolutely necessary that the database of an application be provided with adequate security coverage.

Penetration testing is one of the ways to ensure database security. Most of us are familiar with what penetration testing actually is. In this article we discuss how penetration testing tools emphasize database security. 

About Penetration Testing and Database Security


- Penetration testing is a testing methodology that has been adopted for testing the security of a computer network or system against malicious attacks.
- It is quite a decent measure for evaluating the security level of a computer network by bombarding the network with simulated attacks that mimic malicious attacks from outside as well as inside attackers.
- Penetration testing is concerned with the security of the database both against outside attackers, who hold no authorized access to the computer system or network, and against inside attackers, who do have access but only up to a certain level. 
- The whole process of penetration testing involves performing an active analysis using the penetration testing tools.
- This active analysis brings about an assessment of all the potential vulnerabilities of the whole database system that result from poor security and configuration of the application. 
- This active analysis is deemed successful only if it has been carried out from the viewpoint of a malicious attacker and is concerned with the active exploitation of the recognized vulnerabilities.
- Database security depends upon the effectiveness of the testing, which in turn is affected by the effectiveness of the tools employed in the testing. 
- The tools indeed affect database security: the more effective the tools, the more improvement there will be in the security mechanisms.

How Does Penetration Testing Emphasize Database Security?


- The first step in the penetration testing of a database is always the identification and recognition of vulnerabilities and security leaks. 
- A number of penetration tests are then carried out on that particular application database, while simultaneously coupling the information with an active assessment of the risks and threats associated with the database, using the penetration testing tools.
- A whole lot of effective tools are designed to reduce the impact of these vulnerabilities.
- Penetration testing tools have been recognized as an important component of database security audits.
- There are several other reasons why penetration testing tools hold good for database security:
  1. They provide assistance in assessing the operational and business impacts of attacks on the database system.
  2. They successfully test the effectiveness of the security defenders in detecting and responding to attacks.
  3. They provide evidence in support of the investments that need to be made in database security.
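One very early step in such a test, taking the attacker's viewpoint, is simply checking whether database ports are reachable from an untrusted network position. The sketch below uses only the Python standard library; the host and port list are illustrative assumptions, not part of any particular tool.

```python
# A minimal sketch of a reconnaissance step in database penetration testing:
# probing whether common database ports accept connections from outside.
import socket

COMMON_DB_PORTS = {3306: "MySQL", 5432: "PostgreSQL",
                   1433: "SQL Server", 1521: "Oracle"}

def probe_db_ports(host, ports=COMMON_DB_PORTS, timeout=1.0):
    """Return the database ports on `host` that accept TCP connections."""
    exposed = {}
    for port, name in ports.items():
        try:
            with socket.create_connection((host, port), timeout=timeout):
                exposed[port] = name   # reachable: usually should be firewalled
        except OSError:
            pass                       # closed or filtered: the expected result
    return exposed

# A database reachable from an untrusted position is a finding to report,
# to be followed by active exploitation attempts against it.
findings = probe_db_ports("127.0.0.1")
```

Real penetration tests go far beyond this, but port exposure is the kind of configuration-level weakness the active analysis described above is meant to surface.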



How does penetration testing tool emphasize on security subsystem?


Security is one of the important contributing factors in the success of a software system or application. The security level of a software system or application also influences the security of its users. The higher the security of a system, the safer it is to use. 

Since security plays a very important role in the computer world, there has to be some strategy or testing methodology that can judge or assess the security levels and mechanisms of software systems and applications.
Do we have any such testing methodology? Yes, of course we have: penetration testing! 

About Penetration Testing and Security Sub Systems


- This software testing methodology has the answers to all our security-related issues.
- The security mechanism of a software system or application is comprised of many sub-mechanisms, commonly addressed as security subsystems. 
- These security subsystems are the security components that make up the whole security model of the system.
- These subsystems ensure that applications are not able to access resources without being authorized and authenticated.
- Furthermore, they keep track of the security policies and user accounts of the system. 
- There is a subsystem called the LSA (Local Security Authority) which is responsible for maintaining all the information and details about the local security of the system. 
- The interactive user authentication services are provided by the security subsystems.
- The tokens containing the user information regarding security privileges are also generated by these subsystems. 
- The audit settings and policies are also managed by the security subsystems. 
- The following aspects are identified by the subsystems:
  1. The domain
  2. Who can access the system
  3. Who has what privileges
  4. What security auditing is to be performed
  5. Memory quotas

How Does the Penetration Testing Tool Emphasize Security Subsystems?


For better security at the surface, it is important that security at the subsystem level is not overlooked. All these matters make the security subsystems very essential. 
Therefore, to improve the overall quality of the security mechanisms, these subsystems should be tested. 

- The penetration testing tools emphasize the security subsystems in the same way as they emphasize network security.
- Penetration testing was first adopted for testing the security of a computer network or system against malicious attacks.
- It provides a way to evaluate the security level of the computer network by bombarding the network with simulated attacks that mimic malicious attacks from outside as well as inside attackers. 
- The whole process of penetration testing is driven by an active analysis, which involves an assessment of all the potential vulnerabilities of the security subsystems that result from their poor security and configuration levels. 
- Apart from this, flaws from both the hardware and software components contribute to these vulnerabilities, rather than only operational weaknesses. 
- Security at the subsystem level depends upon the effectiveness of the testing. 
- The testing in turn is affected by the effectiveness of the tools that have been employed in it. 
- The tools indeed affect the subsystems' security: if the tools are reliable and efficient in finding vulnerabilities, there will obviously be more improvement in the security mechanisms. 
- A whole lot of effective tools are designed to reduce the impact of these vulnerabilities.
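The subsystem aspects listed earlier (accounts, privileges, audit settings, quotas) lend themselves to automated checks. The sketch below is a hypothetical audit of a policy description; every key name and rule in it is an assumption for illustration, not the API of any real security subsystem.

```python
# A hedged sketch: auditing the aspects a security subsystem tracks.
# The policy dictionary layout and rule names are hypothetical.

def audit_subsystem(policy):
    """Return a list of findings for a security-subsystem policy dict."""
    findings = []
    if not policy.get("audit_logins", False):
        findings.append("login auditing disabled")
    for user, privs in policy.get("accounts", {}).items():
        # privilege check: admin rights only for expected accounts
        if "admin" in privs and user not in policy.get("admins", ()):
            findings.append(f"unexpected admin privilege: {user}")
    if policy.get("memory_quota_mb", 0) <= 0:
        findings.append("no memory quota configured")
    return findings

policy = {
    "audit_logins": True,
    "admins": ("root",),
    "accounts": {"root": ["admin"], "guest": ["read", "admin"]},
    "memory_quota_mb": 512,
}
findings = audit_subsystem(policy)   # flags guest's unexpected admin right
```

A penetration test would then try to actively exploit any finding such an audit reports, rather than stopping at the list.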




Saturday, April 28, 2012

What is meant by production verification testing?


Production verification is an important part of the software testing life cycle like the other software testing methodologies, but it is much less heard of! Therefore we have dedicated this article entirely to the discussion of what production verification testing is. 

This software testing methodology is carried out after the user acceptance testing phase has been completed successfully. Production verification testing is aimed at simulating the cutover of the whole production process as closely to the real conditions as possible. 

This software testing methodology has been designed for the verification of the below mentioned aspects:
  1. Business process flows
  2. Proper functioning of the data entry functions
  3. Proper running of any batch processes against the actual data values of the production process.

About Production Verification Testing


- Production verification testing can be thought of as an opportunity for conducting a full dress rehearsal of the changes in the business requirements, if any. 
- Production verification is not to be confused with parallel testing, since the goals differ.
- We mean to say that the goal of production verification testing is to verify that the data is being processed properly by the software system or application, rather than comparing the results of the data handling of the new software system or application with the current one, as in the case of parallel testing. 
- For production verification testing to commence, it is important that the documentation of the previous testing phases is produced and that the issues and faults discovered then are fixed and closed.
- If there is a final opportunity for determining whether or not the software system or application is ready for release, it is production verification testing. 
- Apart from just the simulation of the actual production cutover, the real business activities are also simulated during the production verification testing phase. 
- Since it is a full rehearsal of the production phase and business activities, it should serve the purpose of identifying unexpected changes or anomalies residing in the existing processes as a result of introducing the new software system or application currently under test. 
- The importance of this software testing technique cannot be overstated in the case of critical software applications.
- For production verification testing, the testers need to remove or uninstall the software system or application from the testing environment and reinstall it exactly as it will be installed in the production implementation.
- This is for carrying out a mock test of the whole production process, since such mock tests help a lot in verifying the interfaces and existing business flows. 
- The batch processes continue to execute alongside those mock tests. 
- This is entirely different from parallel testing, in which both the new and the old systems run beside each other.
- In parallel testing, therefore, mock testing is not an option to provide accurate results for data handling issues, since access to the source data or database is limited. 

Entry and Exit Criterion for Production Verification Testing


Here we list some of the entry and exit criteria of the production verification testing:
Entry criteria:
  1. User acceptance testing is complete and has been approved by all the involved parties.
  2. The documentation of the known defects is ready.
  3. The documentation of the migration package has been completed, reviewed and approved by all the parties and without fail by the production systems manager.
Exit Criteria:
  1. The processing of the migration package is complete.
  2. The installation testing has been performed and its documentation is ready and signed off.
  3. The documentation of the mock testing has been approved and reviewed.
  4. A record of the system changes has been prepared and approved.
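Entry criteria like these are often enforced as an automated pre-flight check before the cutover rehearsal begins. The sketch below encodes them as data; the flag names and status dictionary are illustrative assumptions, not part of any standard.

```python
# A minimal sketch: encoding entry criteria for production verification
# testing as an automated pre-flight check. Flag names are hypothetical.

ENTRY_CRITERIA = [
    "uat_approved",                 # user acceptance testing signed off
    "known_defects_documented",     # defect documentation ready
    "migration_package_approved",   # migration package reviewed and approved
]

def blocking_criteria(status):
    """Return the entry criteria that are not yet satisfied."""
    return [c for c in ENTRY_CRITERIA if not status.get(c, False)]

status = {"uat_approved": True,
          "known_defects_documented": True,
          "migration_package_approved": False}
blocking = blocking_criteria(status)   # the test phase cannot start yet
```

An analogous list for the exit criteria would gate the sign-off at the end of the phase in the same way.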


Monday, September 20, 2010

What is a possible test approach for a simulation system?

A simulation system's primary responsibility is to replicate the behavior of the real system as accurately as possible. Therefore, a good place to start creating a test plan would be to understand the behavior of the real system.

- Subjective Testing:
It mainly depends on an expert's opinion. An expert is a person who is proficient and experienced in the system under test. Conducting the test involves test runs of the simulation by the expert and then the expert evaluates and validates the results based on some criteria. Advantage of this approach is that it can test those conditions which cannot be tested objectively. Disadvantage is that the evaluation of the system is based on the expert's opinion which may differ from expert to expert.
- Objective Testing:
It is mainly used in the systems where the data can be recorded while the simulation is running. This testing technique relies on the application of statistical and automated methods to the data collected.
Statistical methods are used to provide an insight into the accuracy of the simulation. These methods include hypothesis testing, data plots, principal component analysis and cluster analysis.
Automated testing requires a knowledge base of valid outcomes for various runs of simulation. The knowledge base is created by domain experts of the simulation system being tested. The data collected in various test runs is compared against this knowledge base to automatically validate the system under test. An advantage of this kind of testing is that the system can continually be regression tested as it is being developed.
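The objective approach can be sketched concretely: compare the data collected from a run against expected outcomes recorded by domain experts. The knowledge-base contents and the tolerance rule (observed mean within k standard deviations of the expected mean) are illustrative assumptions.

```python
# A sketch of objective, automated validation: simulation run data is
# compared against a knowledge base of expert-supplied expected outcomes.
import statistics

KNOWLEDGE_BASE = {            # scenario -> (expected mean, expected stdev)
    "rush_hour": (42.0, 3.0),
    "off_peak": (12.0, 2.0),
}

def validate_run(scenario, samples, k=2.0):
    """Pass if the observed mean lies within k stdevs of the expected mean."""
    expected_mean, expected_sd = KNOWLEDGE_BASE[scenario]
    observed = statistics.mean(samples)
    return abs(observed - expected_mean) <= k * expected_sd

ok = validate_run("rush_hour", [40.1, 43.5, 41.8, 44.0])
```

Because the check is mechanical, it can run after every build, which is exactly the continual regression-testing advantage noted above.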


Sunday, September 19, 2010

Types of Simulation Systems: Dynamic, Discrete, Continuous and Social Simulation Systems

- Dynamic Simulation Systems: These have a model that accommodates changes in data over time. This means that the input data affecting the results are entered into the simulation during its entire lifetime rather than just at the beginning. A simulation system used to predict the growth of the economy, which may need to incorporate changes in economic data, is a good example of a dynamic simulation system.

- Discrete Simulation Systems: These systems use models that have discrete entities with multiple attributes. Each of these entities can be in any state, at any given time, represented by the values of its attributes. The state of the system is the set of the states of all its entities. This state changes one discrete step at a time as events happen in the system. Therefore, the actual designing of the simulation involves making choices about which entities to model. Examples include simulated battlefield scenarios, highway traffic control systems etc.

- Continuous Simulation Systems: If instead of using a model with discrete entities, we use data with continuous values, we will end up with continuous simulation.

- Social Simulation Systems: Social simulation is not a technique by itself but uses the various types of simulation described above. However, because of the specialized application of those techniques for social simulation, it deserves a special mention of its own. The field of social simulation involves using simulation to learn about and predict various social phenomena such as voting patterns, migration patterns, economic decisions made by the general population etc.
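The discrete/continuous distinction above can be sketched with two toy models. Both systems and their parameters are illustrative assumptions: a queue whose state changes one event at a time, and a tank level that evolves continuously and is integrated with Euler steps.

```python
# Discrete vs. continuous simulation of two illustrative toy systems.

def simulate_discrete(arrivals):
    """Discrete: the queue length (system state) changes one event at a time."""
    length, history = 0, []
    for event in arrivals:             # +1 = arrival, -1 = departure
        length = max(0, length + event)
        history.append(length)
    return history

def simulate_continuous(inflow_rate, outflow_rate, t_end, dt=0.01):
    """Continuous: the level evolves as dV/dt = inflow - outflow (Euler)."""
    level, t = 0.0, 0.0
    while t < t_end:
        level += (inflow_rate - outflow_rate) * dt
        t += dt
    return level

queue_history = simulate_discrete([+1, +1, -1, +1, -1, -1])
final_level = simulate_continuous(inflow_rate=2.0, outflow_rate=0.5, t_end=10.0)
```

In the first model the state is a set of discrete values that jumps at events; in the second it is a continuous quantity updated at every small time step, which is the defining difference between the two system types.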


Saturday, September 18, 2010

Types of Simulation Systems: Deterministic, Stochastic, Static Simulation Systems

Simulation is widely used in many fields. Some of the applications are :
- Models of planes and cars that are tested in wind tunnels to determine the aerodynamic properties.
- It is used in computer games e.g. simCity, car games etc. This simulates the roads, people talking, playing games etc.
- War tactics that are simulated using simulated battlefields.
- Most of the embedded systems are developed by simulation software before they ever make it to the chip fabrication labs.
- Stochastic simulation models are often used to model applications such as weather forecasting systems.
- Social simulation is used to model socio-economic situations.
- It is extensively used in the field of operations research.

Simulation systems can be characterized in numerous ways depending on the characterization criteria applied. Some of them are:
- Deterministic Simulation Systems: These systems have completely predictable outcomes. Given a certain input, we can predict the exact outcome. Another feature of these systems is idempotency, which means that the results for any given input are always the same. Examples include population prediction models, atmospheric science etc.
- Stochastic Simulation Systems: These systems have models with random variables. This means that the exact outcome is not predictable for any given input resulting in potentially very different outcomes for the same input.
- Static Simulation Systems: These systems use statistical models in which time does not play any role. These models include various probabilistic scenarios which are used to calculate the results of any given input. Examples of such systems include financial portfolio valuation models.


Friday, September 17, 2010

Types of Software Systems : Diagnostic Software Systems, Sensor and Signal Processing Systems, Simulation Systems

The type of software system refers to the processing that will be performed by that system.
Diagnostic Software Systems:
These systems help in diagnosing the computer hardware components. When a new device is plugged into your computer, a diagnostic software system does some work. The "New Hardware Found" dialog can be seen as a result of such a system.

Sensor and Signal Processing Systems:
Message processing systems help in sending and receiving messages. Sensor and signal processing systems are more complex because they make use of mathematics for signal processing. In a signal processing system, the computer receives input in the form of signals and then transforms the signals into a user-understandable output.

Simulation Systems:
Simulation is the process of designing a model of a real system and conducting experiments with this model for the purpose of understanding the behavior of the system or evaluating various strategies for the operation of the system. A simulation is a software package that re-creates or simulates, albeit in a simplified manner, a complex phenomenon, environment, or experience, providing the user an opportunity for some new level of understanding.
Simulation systems are easier, cheaper and safer to use than real systems, and are often the only practical way to study the real system. For example, learning to fly a fighter plane using a simulator is much safer and less expensive than learning on a real fighter plane. System simulation mimics the operation of a real system, such as the operations in a bank or the running of an assembly line in a factory.
Simulation in the early stages of the design cycle is important because the cost of mistakes increases dramatically later in the product life cycle. Also, simulation software can analyze the operation of a real system without the involvement of an expert, i.e. it can also be analyzed by a non-expert such as a manager.


Tuesday, October 27, 2009

Introduction to System Simulation

Systems simulation is a set of techniques for using computers to imitate, or simulate, the operations of various kinds of real-world facilities or processes. The computer is used to generate a numerical model of reality for the purpose of describing complex interactions among the components of a system. The complexity of the system arises from the stochastic (probabilistic) nature of the events, from the rules for the interactions of the elements, and from the difficulty of perceiving the behavior of the system as a whole with the passing of time.

When to use simulations?
Simulation suits systems that change with time, such as a gas station where cars come and go (called dynamic systems), and that involve randomness. Modeling complex dynamic systems theoretically requires too many simplifications, and the emerging models may therefore not be valid. Simulation does not require that many simplifying assumptions, making it a suitable tool even in the absence of randomness.

System terminology:
- State: A variable characterizing an attribute in the system.
- Event: An occurrence at a point in time which may change the state of the system.
- Entity: An object that passes through the system.
- Queue: It is a task list.
- Creating: Creating is causing an arrival of a new entity to the system at some point in time.
- Scheduling: Scheduling is the act of assigning a new future event to an existing entity.
- Random variable: A random variable is a quantity that is uncertain.
- Random variate: A random variate is an artificially generated random variable.
- Distribution: A distribution is the mathematical law which governs the probabilistic features of a random variable.
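The terminology above fits together in an event-scheduled simulation. The sketch below is a minimal single-server queue, with illustrative rates: entities arrive, wait in a queue, and are served; arrival and service times are random variates from exponential distributions, and the event list drives the state changes.

```python
# A minimal event-scheduled simulation tying the terminology together:
# state, events, entities, a queue, scheduling, and random variates.
import heapq
import random

def simulate_queue(n_customers, arrival_rate, service_rate, seed=1):
    rng = random.Random(seed)
    events = []                                  # scheduled future events
    t = 0.0
    for _ in range(n_customers):                 # creating: schedule arrivals
        t += rng.expovariate(arrival_rate)       # random variate (exponential)
        heapq.heappush(events, (t, "arrival"))
    queue, busy, served = 0, False, 0            # state variables
    while events:
        now, kind = heapq.heappop(events)        # next event in time order
        if kind == "arrival":
            if busy:
                queue += 1                       # entity joins the queue
            else:
                busy = True                      # scheduling: plan a departure
                heapq.heappush(events,
                               (now + rng.expovariate(service_rate), "departure"))
        else:                                    # departure event
            served += 1
            if queue:
                queue -= 1                       # next entity enters service
                heapq.heappush(events,
                               (now + rng.expovariate(service_rate), "departure"))
            else:
                busy = False
    return served

served = simulate_queue(100, arrival_rate=1.0, service_rate=1.5)
```

Each concept maps onto a line of code: the queue length and server status are the state, heap entries are scheduled events, and `expovariate` draws the random variates whose distribution governs the model's probabilistic behavior.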

