

Showing posts with label Model. Show all posts

Wednesday, July 17, 2013

What are network layer design issues?

- The network layer, i.e., the third layer of the OSI model, is responsible for the exchange of individual pieces of data between hosts over the network. 
- This exchange takes place only between identified end devices. 
To accomplish this task, the network layer uses 4 processes:
Ø  Addressing
Ø  Encapsulation
Ø  Routing
Ø  Decapsulation
In this article we focus on the design issues of the network layer. 

- To accomplish this task, the network layer also needs to know the topology of the communication subnet and select appropriate routes through it. 
- It must also choose routes that avoid overloading some routers and communication lines while leaving others idle.

Below are some of the major issues in network layer design:
  1. Services provided to layer 4, i.e., the transport layer.
  2. Implementation of connection-oriented services.
  3. Store-and-forward packet switching.
  4. Implementation of connectionless services.
  5. Comparison of datagram subnets and virtual-circuit subnets.
- The sending host transmits the packet to its nearest router, either over a LAN or over a point-to-point carrier link. 
- The packet is stored there until it has fully arrived and its checksum has been verified. 
- Once verified, the packet is forwarded to the next intermediate router. 
- This process continues until the packet reaches its destination. 
- This mechanism is called store-and-forward packet switching.
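The hop-by-hop behavior described above can be sketched as a small simulation. This is a minimal illustration, not a real router implementation: the path, the node names, and the use of CRC-32 as the checksum are all illustrative assumptions.

```python
import zlib

# Hypothetical hop-by-hop path from the sending host to the destination.
PATH = ["host_A", "router_1", "router_2", "host_B"]

def make_packet(payload: bytes) -> dict:
    # The sender attaches a checksum so each hop can verify integrity.
    return {"payload": payload, "checksum": zlib.crc32(payload)}

def store_and_forward(packet: dict) -> list:
    trail = []
    for hop in PATH[1:]:
        # Each node stores the complete packet, verifies the checksum,
        # and only then forwards it toward the next hop.
        if zlib.crc32(packet["payload"]) != packet["checksum"]:
            raise ValueError(f"checksum mismatch at {hop}")
        trail.append(hop)
    return trail

print(store_and_forward(make_packet(b"hello")))
```

A corrupted packet would be rejected at the first node that checks it, which is exactly why each router must hold the whole packet before forwarding.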

The services provided to the transport layer are designed with the following goals in mind:
  1. They should be independent of the router technology.
  2. The transport layer should be shielded from the number, type and topology of the routers.
  3. The network addresses made available to the transport layer should follow a uniform numbering plan, whether it is a LAN or a WAN.
Based on the type of service offered, two different organizations are possible.

Offered service is connectionless: 
- Packets are injected into the subnet individually and routed independently of each other. 
- No advance setup is required. 
- The subnet is referred to as a datagram subnet and the packets are called datagrams.

Offered service is connection-oriented: 
- In this case a route from the source to the destination must be established before the transmission of packets can begin. 
- This connection is termed a virtual circuit and the subnet a "virtual circuit subnet", or simply a VC subnet.

- The basic idea behind virtual circuits is to avoid choosing a new route for every packet sent. 
- Whenever a connection is established, a route from source to destination is selected. 
- This is counted as part of the connection setup. 
- The route is saved in tables managed by the routers and is then used by all traffic flowing over the connection. 
- On release of the connection, the virtual circuit is terminated. 
- With connection-oriented service, each packet carries an identifier telling which virtual circuit it belongs to.

- In a datagram subnet no circuit setup is required, whereas it is required in a VC subnet. 
- Routers in a datagram subnet hold no state information, whereas in a VC subnet each connection consumes router table space. 
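The difference between the two forwarding styles can be sketched from one router's point of view. The table contents, addresses, and VC identifiers below are made-up illustrative values, not any real protocol's format.

```python
# Illustrative forwarding tables for a single router.
DATAGRAM_TABLE = {"host_B": "line_2", "host_C": "line_3"}
VC_TABLE = {}  # per-connection state, filled in at connection setup

def forward_datagram(packet):
    # Every datagram carries the full destination address and is
    # routed independently; the router keeps no per-connection state.
    return DATAGRAM_TABLE[packet["dest"]]

def setup_vc(vc_id, dest):
    # The route is chosen once, during connection setup, and stored.
    VC_TABLE[vc_id] = DATAGRAM_TABLE[dest]

def forward_vc(packet):
    # VC packets carry only a short identifier; forwarding is a simple
    # lookup in the per-connection table built during setup.
    return VC_TABLE[packet["vc_id"]]

setup_vc(7, "host_B")
print(forward_datagram({"dest": "host_B"}))  # line_2
print(forward_vc({"vc_id": 7}))              # line_2
```

Note how `VC_TABLE` is exactly the "router table space for each connection" mentioned above, while the datagram path needs no such state.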


Sunday, March 24, 2013

What are types of artificial neural networks?


In this article we discuss the types of artificial neural networks. These models simulate the biological nervous system.
1. Feed-forward neural network: 
- This is the simplest type of neural network ever devised. 
- In these networks the information flow is unidirectional; data moves only in the forward direction. 
- Data flows from the input nodes to the output nodes via the hidden nodes (if there are any). 
- There are no loops or cycles in this model. 
- Different types of units can be used to construct feed-forward networks, for example McCulloch–Pitts neurons.
- Continuous neurons with sigmoidal activation are used in networks trained by error back-propagation.
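The forward-only flow can be sketched in a few lines of Python. The network size and all weight values below are arbitrary illustrative choices; a real network would learn them via back-propagation.

```python
import math

def sigmoid(x):
    # Sigmoidal activation, as used with continuous neurons.
    return 1.0 / (1.0 + math.exp(-x))

def feed_forward(inputs, w_hidden, w_output):
    # Data flows strictly forward: inputs -> hidden layer -> output.
    # No value ever loops back, so there are no cycles.
    hidden = [sigmoid(sum(w * x for w, x in zip(row, inputs)))
              for row in w_hidden]
    return sigmoid(sum(w * h for w, h in zip(w_output, hidden)))

# Tiny 2-input, 2-hidden-unit, 1-output network with made-up weights.
out = feed_forward([1.0, 0.5],
                   w_hidden=[[0.4, -0.6], [0.3, 0.8]],
                   w_output=[0.5, -0.2])
print(round(out, 3))
```

The output is always a single pass over the weights; contrast this with the recurrent networks discussed later, where earlier outputs are fed back in.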
2. Radial basis function network: 
- Radial basis functions are powerful tools for interpolating in a multi-dimensional space. 
- These functions are built around a distance criterion with respect to some center.
- They can be applied in neural networks. 
- In these networks, the sigmoidal hidden-layer transfer characteristic is replaced by radial basis functions.
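A minimal sketch of the idea follows. The Gaussian form of the basis function, the centers, and the output weights are all illustrative assumptions; the point is only that each hidden unit responds to distance from a center rather than to a weighted sum passed through a sigmoid.

```python
import math

def gaussian_rbf(x, center, width=1.0):
    # Radial basis: the response depends only on the distance
    # between the input and the unit's center.
    dist2 = sum((a - b) ** 2 for a, b in zip(x, center))
    return math.exp(-dist2 / (2 * width ** 2))

def rbf_network(x, centers, weights):
    # A hidden layer of radial units replaces the sigmoidal layer;
    # the output is a weighted sum of the radial responses.
    return sum(w * gaussian_rbf(x, c) for w, c in zip(weights, centers))

centers = [(0.0, 0.0), (1.0, 1.0)]   # made-up centers
weights = [1.0, -0.5]                # made-up output weights
print(rbf_network((0.0, 0.0), centers, weights))
```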
3. Kohonen self-organizing network: 
- Unsupervised learning is performed with the help of the self-organizing map, or SOM. 
- The map was invented by Teuvo Kohonen.
- A set of neurons learns to map points in the input space to coordinates in the output space. 
- The input space can have different dimensions and topology from the output space; the SOM attempts to preserve these.
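A single SOM-style training step can be sketched as follows. This is a deliberately simplified illustration: the number of units, the learning rate, and the omission of neighborhood updates (a full SOM also moves the winner's grid neighbors) are all simplifying assumptions.

```python
import random

random.seed(0)
# Four units, each with a weight vector living in the 2-D input space.
weights = [[random.random(), random.random()] for _ in range(4)]

def train_step(x, rate=0.5):
    # Unsupervised competitive step: the winner is the unit whose
    # weight vector is closest to the input.
    winner = min(range(len(weights)),
                 key=lambda i: sum((w - v) ** 2
                                   for w, v in zip(weights[i], x)))
    # Move the winning unit toward the input.
    weights[winner] = [w + rate * (v - w)
                       for w, v in zip(weights[winner], x)]
    return winner

for _ in range(20):
    train_step([1.0, 1.0])
```

After repeated presentations, the winning unit's weights converge toward the input point, with no labels or target outputs ever supplied.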
4. Learning vector quantization, or LVQ: 
- This can also be considered a neural network architecture. 
- It, too, was suggested by Teuvo Kohonen.  
- Here prototypical representatives are parameterized, together with two important things: a distance measure and a distance-based classification scheme.
5. Recurrent neural network: 
- These networks are the counterpart of feed-forward networks. 
- They offer a bidirectional flow of data.
- Whereas a feed-forward network propagates data linearly from input to output, a recurrent network also transfers data from later stages of processing back to earlier stages. 
- Recurrent networks can also serve as general sequence processors. 
- Recurrent neural networks come in a number of types:
Ø  Fully recurrent network
Ø  Hopfield network
Ø  Boltzmann machine
Ø  Simple recurrent network
Ø  Echo state network
Ø  Long short-term memory (LSTM) network
Ø  Bidirectional RNN
Ø  Hierarchical RNN
Ø  Stochastic neural network
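The feedback flow that distinguishes recurrent networks, in which data from later stages returns to earlier ones, can be sketched with a single recurrent unit. The weight values and the input sequence are arbitrary illustrative choices.

```python
import math

def rnn_step(x, h, w_in=0.5, w_rec=0.8):
    # The previous hidden state h is fed back into the unit, so the
    # state at each step depends on all earlier inputs, not just the
    # current one. This feedback is absent in feed-forward networks.
    return math.tanh(w_in * x + w_rec * h)

h = 0.0
for x in [1.0, 0.0, 0.0]:
    h = rnn_step(x, h)
# Even though the last two inputs are zero, h is still nonzero:
# the first input keeps influencing the state through the feedback.
print(h)
```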
6. Modular neural networks: 
- Studies have shown that the human brain actually works as a collection of several small networks rather than as one huge network; this insight led to modular neural networks, in which smaller networks cooperate to solve a problem. 
- Modular networks also come in several types, such as:
Ø  Committee of machines: different networks that work together on a given problem are collectively termed a committee of machines. The result achieved is usually better than that of any single network, and it is far more stable.
Ø  Associative neural network, or ASNN: an extension of the committee of machines that goes beyond a weighted average of the individual models. It combines the k-nearest-neighbor technique (kNN) with feed-forward neural networks, and its memory coincides with the training set.
7. Physical neural network: 
- This consists of electrically adjustable resistance material capable of simulating artificial synapses.
There are other types of ANNs that do not fall into any of the above categories:
Ø  Holographic associative memory
Ø  Instantaneously trained networks
Ø  Spiking neural networks
Ø  Dynamic neural networks
Ø  Cascading neural networks
Ø  Neuro-fuzzy networks
Ø  Compositional pattern-producing networks
Ø  One-shot associative memory


Wednesday, February 27, 2013

Explain TestOptimal - Web Functional/Regression Test Tool



About TestOptimal Testing Tool

- TestOptimal provides a convenient way to automate functional, regression, load and stress testing of web-based applications. 
- It also works for Java applications. 
- The technology behind the TestOptimal testing tool is MBT, or model-based testing, together with some mathematical optimization techniques. 
- It generates and executes test cases directly from a model of the application. 
- TestOptimal is itself a web-based application. 
- It has facilities for integration with JUnit. 
- Furthermore, it can be run inside NetBeans and Eclipse.
- Another striking feature of TestOptimal, apart from the technology it uses, is that applications are modeled with graphs.
- An example of such a graph notation is State Chart XML, or SCXML for short. 
- The charts are edited through a drag-and-drop user interface that runs in standard browsers. 
- TestOptimal has a number of test sequencers that effectively meet the testing needs of different users. 
- Tests are automated in mScript (XML-based scripting) or Java. 
- TestOptimal provides statistical analysis of the virtual users and test executions required for load testing. 
- TestOptimal can be integrated with other tools such as QTP and Quality Center through its web service interface. 
- TestOptimal supports multiple browsers on a number of platforms, including Unix, Linux and Windows.
- The following constitute this model-based test automation suite for load and performance testing:
  1. TestOptimal Basic MBT
  2. ProMBT
  3. Enterprise MBT
  4. Runtime MBT
- TestOptimal combines model-based testing with data-driven testing (DDT) to provide a sophisticated and efficient tool for test automation and test case generation. 
- With MBT, defects can be found in the early stages of the development cycle, enabling a quick and efficient response. 
- TestOptimal animates the test execution, giving the user insight into the testing. 
- This also enables the user to validate the model visually. 
- It also lets you track requirement coverage.
- The test cases can be visualized with the help of various graphs. 
- A number of algorithms can generate the test sequences required to achieve the desired test coverage. 
- The same models and automation scripts can be re-purposed if the user wants to perform load and performance testing.
- TestOptimal helps you cut down the length of the development cycle while still achieving the desired test coverage. 
- This in turn improves your response to frequent changes and increases confidence in your software.
- With TestOptimal, over 90 percent of coverage requirements can be met and release turnaround time improved. 


Features of TestOptimal

Below are some unique features of this testing tool:
  1. MBT modeling with finite state machine notation.
  2. Superstates and sub-models: a larger model can be partitioned into smaller, reusable library components.
  3. Graphs: various graphs such as the MSC (message sequence chart), coverage graph, model graph, sequence graph and so on.
  4. Model import and merge: various modeling formats based on XML and UML XMI, such as GraphML and GraphXML.
  5. Test case generation: many sequencers, such as the optimal sequencer, custom test case sequencer, random walk and so on.
  6. Scriptless data-driven testing.
  7. Scripting in mScript and Java.
  8. ODBC/JDBC support: relational databases can be accessed for reading, writing, storing and verifying test results.
  9. Integration with REST web services, JUnit, Java IDEs (specifically NetBeans and Eclipse), remote agents and so on.
  10. Cross-browser testing support for Chrome, IE, Opera, Firefox and Safari.






Monday, February 25, 2013

What is meant by Software Process Improvement?


About Software Process Improvement

- SPI, or Software Process Improvement, is a program that provides guidance through an integrated long-range plan for initiating and managing process improvement in an organization. 
- SPI is based on a model called the IDEAL model, which has the following 5 major stages:
  1. Initiating
  2. Diagnosing
  3. Establishing
  4. Acting
  5. Leveraging
- These 5 major stages form a continuous loop. 
- However, the time taken to complete one cycle varies from one organization to another. 
- Depending on the available resources, an organization must decide whether or not it can commit to software process improvement. 
- SPI requires many activities to be carried out in parallel. 
- One part of the organization may take care of the activities of one phase while others handle the activities of another.
- In practice, the boundaries between the various stages of software process improvement are not clearly defined. 
- The infrastructure also plays a great role in the success of SPI. 
- The value infrastructure adds to SPI cannot be overestimated; it greatly helps in understanding the roles involved.

About Initiating Phase

- As the name indicates, this is the starting point of the process. 
- This stage involves setting up the improvement infrastructure. 
- The infrastructure's roles and responsibilities are then defined. 
- Resources are checked for availability and assigned.
- Finally, an SPI plan is drawn up that will guide this initiating phase as well as the later stages. 
- It is during this stage that the goals of software process improvement are defined and established based on the organization's business needs. 
- During the establishing phase these goals are further refined and specified.
Two components are typically established, namely:
Ø  A software engineering process group, or SEPG
Ø  A management steering group, or MSG 

About Diagnosing Phase

- In this stage, the organization begins working according to the SPI plan. 
- This stage serves as the foundation for the stages that follow. 
- The plan is initiated keeping in view the organization's vision, its business strategy, lessons from the past, current business issues and long-term goals. 
- Appraisal activities are carried out to establish a baseline of the organization's current state. 
- The results of these activities are reconciled with existing improvement efforts so they can be included in the main plan.

About Establishing Phase

 
- In this stage the issues to be addressed by the improvement activities are assigned priorities.
- Strategies for obtaining solutions are also pursued. 
- The draft of the plan is completed as per the organization's vision, plan, goals and issues. 
- From the general goals, measurable goals are developed and put into the final SPI plan. 
- Metrics essential to the process are also defined.

About Acting Phase

 
- Solutions addressing the improvement issues discovered in the previous stages are created and deployed throughout the organization. 
- Further plans are developed for evaluating the improved processes.

About Leveraging Phase

 
- This stage is driven by the objective of making the next pass through the process more effective. 
- By this time the organization has developed solutions and collected metrics on performance and goal achievement. 
- All of this data is stored in a process database that will later serve as source information for the next pass. 
- This information is also used to re-evaluate the strategies and methods involved in the SPI program.
- Software process improvement activities work with two components, namely a tactical component and a strategic component. 
- The former is driven by the latter, which in turn is based on the needs of the organization. 


Saturday, February 9, 2013

What are the main features of TOSCA? What are its benefits and limitations?

About TOSCA

The TOSCA test suite was developed by TRICENTIS Technology & Consulting GmbH as a tool for the automated execution of regression and functional tests.
- TOSCA combines an application programming interface, a graphical user interface, integrated test management and a command line interface. 
- In 2011, TOSCA became the second most widely used software test automation tool. 
- Since then, it has received a number of awards for its web and customer support.  
The architecture of TOSCA comprises the following components:
Ø  TOSCA Commander
Ø  TOSCA Wizard
Ø  TOSCA Executor
Ø  TOSCA Exchange Portal
Ø  TOSCA Test Repository
- This test automation tool follows a model-driven approach so that the whole test can be made dynamic in nature. 
- Modules can be dragged and dropped to create new tests.  
- This in turn makes a business-based description of the test cases possible.
- With such a description, it becomes possible for non-technical users to design and automate test cases.


Features of TOSCA


- The main features of TOSCA include:
Ø  Generation of dynamic test data
Ø  Generation of synthetic test data
Ø  Business-dynamic steering of highly automated test case generation
Ø  Unified handling and execution of automated and manual test cases
Ø  Unified handling and execution of GUI and non-GUI test cases
- In addition to the features mentioned above, TOSCA weighs test cases according to their importance for the proper execution of the business processes. 
- This enables TOSCA to provide detailed reports about the impact that existing technical weaknesses have on the fulfillment of the requirements.
- The company Fecher, for example, uses TOSCA in its database and application migration projects.


Limitations of TOSCA

TOSCA has two major limitations compared with other software test automation tools:
Ø  It cannot perform load testing or stress testing.
Ø  It has no ActiveX components.

The following extensions are provided with TOSCA:
Ø  Requirements: lets requirements be imported, edited, exported and administered; the risk associated with the requirements can be weighted. The requirements are then linked to the tests after those have been designed.
Ø  Test case design workbench: defines the test cases required to cover specific test objects on the basis of the requirements. The test cases are then generated by combining values in the following ways:
1.    Pairwise
2.    Orthogonal array
3.    Linear expansion
Ø  Reporting: lets you collect, analyze and present the test results. Test reports can be created with Crystal Reports or exported as XML or PDF files.
Ø  TOSCA Easy Entrance: lets you create entities that can be reused via drag and drop.
Ø  User management: integrates a multi-user concept with check-in/check-out mechanisms as well as versioning.
Ø  Web access: provides remote access.
Ø  SAP Solution Manager ready: lets you integrate TOSCA with SAP.

Because of its many features, TOSCA is popular among testers. A total of 300 customers have been recorded worldwide. The software is used in numerous sectors, such as:
  1. Banking
  2. Insurance
  3. Telecommunications
  4. Industrial companies, and so on.
It makes use of a third-generation model that uses plain English for the creation of tests.  




Monday, February 4, 2013

How are unit and integration testing done in EiffelStudio?


- EiffelStudio provides a complete and well-integrated development environment.
- This environment is ideal for performing unit testing and integration testing. 
- EiffelStudio lets you create software systems and applications that are scalable, robust and, of course, fast. 
- With EiffelStudio, you can model your application just the way you want. 
- EiffelStudio has effective tools for capturing your thought process as well as the requirements. 
- Once you are ready to follow your design, you can start building on the model you have already created. 
- Both the creation and the implementation of models can be done in EiffelStudio. 
- There is no need to set one thing aside and start over. 
- Further, you do not need any external tools to go back and modify the architecture. 
- EiffelStudio provides all the tools. 
- EiffelStudio provides round-trip engineering by default, in addition to productivity and test metrics tools.
EiffelStudio provides the facility for unit and integration testing through its component called AutoTest. 
- With AutoTest, software developers can build sophisticated unit and integration test suites that remain quite simple in their construction. 
- AutoTest lets the developer execute and test Eiffel class code at the feature level. 
- At this level, the testing is considered unit testing. 
- However, when the code is executed and tested for entire class systems, the testing is considered integration testing.
- Executing this code also exercises the contracts of the attributes and features involved. 
- AutoTest thus serves as a means of testing the assumptions made in the design, as expressed by the contract conditions. 
- Therefore, things already specified in the contracts of the class texts need not be re-tested by unit and integration tests through test oracles or assertions.

- AutoTest offers three methods for creating test cases for unit and integration testing:
  1. For manually created tests, AutoTest creates a test class containing the test framework, so the user only needs to fill in the code for the test.
  2. The second method is based on a failure of the application at run time. Such a test is known as an 'extracted' test. Whenever an unexpected failure occurs while the system under test is running, AutoTest works on the information provided by the debugger to produce a new test case that reproduces the calls and state that made the system fail. After the failure is fixed, the extracted test is added to the suite to guard against a recurrence of the problem.
  3. The third method produces tests known as generated tests. The user supplies the classes for which tests are required, plus any additional information AutoTest needs to control test generation. The tool then calls the routines of the target classes with randomized argument values. Whenever a class invariant or postcondition is violated, a single new test is created that reproduces the failing call.
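The "extracted test" mechanism of method 2 can be illustrated in miniature. AutoTest itself works on Eiffel code via the debugger; the Python decorator below is only a sketch of the underlying idea of recording a failing call so it can be replayed as a regression test.

```python
import functools

extracted_tests = []

def extract_on_failure(fn):
    # On an unexpected failure, record the call that caused it so it
    # can later be replayed as an "extracted" regression test.
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        try:
            return fn(*args, **kwargs)
        except Exception as exc:
            extracted_tests.append((fn.__name__, args, kwargs, type(exc)))
            raise
    return wrapper

@extract_on_failure
def divide(a, b):
    # Illustrative routine under test.
    return a / b

try:
    divide(1, 0)          # unexpected failure at "run time"
except ZeroDivisionError:
    pass

print(extracted_tests)    # the recorded call, ready to be replayed
```

After the bug is fixed, replaying the recorded call with the expectation that it no longer raises gives exactly the regression guard described above.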


Thursday, January 17, 2013

What is meant by Behavior Driven Development (BDD)?


BDD, or Behavior Driven Development, is one of the most important development approaches in software engineering, and is essentially a refined implementation of TDD, or test driven development. It combines the guiding principles and general techniques of test driven development with ideas from two sources, namely:
  1. Domain-driven design
  2. Object-oriented analysis and design
- Behavior driven development thus provides a shared process and shared tools with which business analysts and software developers can collaborate on software development.
- Ideally, behavior driven development manages the software development process from both a technical and a business perspective.
- The approach assumes that specialized software tools are used to support the development process. 
- The tools involved have been developed especially for projects following the BDD approach. 
- They can be considered specialized tools that support test driven development. 
- The primary purpose of these tools is to automate the ubiquitous-language statements around which the BDD process is centered. 
- Agile software development succeeds only when testing is considered from the beginning; in many projects, however, it is the last thing to be considered. 
- BDD and TDD provide ways of keeping testing at the center of development from the start. 
- The word 'driven' in TDD signifies that the test cases are written first, and the code to pass them afterwards. 
- TDD is a low-level methodology for accomplishing tasks and is rather developer-oriented. 
- In BDD, on the other hand, the description of the tests is written in a natural language, making the tests more accessible to people outside the development team. 
- Such descriptions can describe the functionality in the same way a specification does. 
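The contrast between a developer-oriented TDD test and a behavior-oriented BDD description can be sketched as follows. The given/when/then wording is the common BDD style; the function and its behavior here are purely illustrative, not tied to any particular BDD framework.

```python
# TDD style: a low-level, developer-oriented unit test, written first.
def test_add():
    assert add(2, 3) == 5

# BDD style: the same expectation phrased as behavior in natural
# language, readable by people outside the development team.
def test_when_two_numbers_are_added_then_their_sum_is_returned():
    """Given the numbers 2 and 3,
    when they are added,
    then the result should be 5."""
    assert add(2, 3) == 5

def add(a, b):
    # Minimal code written after the tests, just enough to pass them.
    return a + b

test_add()
test_when_two_numbers_are_added_then_their_sum_is_returned()
```

Both tests check the same code; only the BDD version reads as a specification of behavior rather than as a programmer's assertion.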
Dan North introduced the concept of behavior driven development in response to issues experienced in applying test driven development, such as:
  1. Where to start testing?
  2. What is to be tested?
  3. How much should be tested in one go?
  4. How is the reason for a test failure to be understood?
- At the heart of the behavior driven development approach lie approaches to acceptance testing and unit testing, again identified by North. 
- He emphasized that acceptance tests should be written using the user-story framework. 
- Starting from scratch, the development of the BDD approach continued over a couple of years, and today it is a communication and collaboration framework. 
- It has been designed especially for the QA people and business participants involved in a project.
- The Agile Specifications, BDD and Testing eXchange highlighted the following characteristics of BDD:
  1. Second-generation methodology
  2. Outside-in approach
  3. Pull-based
  4. Multiple-stakeholder support
  5. Multiple-scale generation
  6. Highly automated process
  7. Agile methodology
- It was further said that BDD involves a cycle of interactions with well-defined outputs, resulting in the delivery of working software that matters. 
- The first BDD framework to come into existence was JBehave, followed by RBehave, a story-level BDD framework for Ruby.
- These frameworks were later superseded by the Cucumber testing tool, developed by Aslak Hellesoy. 

