Monday, January 31, 2011
Overview of The Content Model for Web Applications
The content model is derived from an examination of the use cases developed for the web application. It contains structural elements that provide a view of the content requirements for the web application. Structural elements include content objects such as text, images, audio, video and photographs. The model also contains analysis classes, covering the attributes that describe them, the operations that affect their behavior, and their collaborations.
A content object is any item of cohesive information that is to be presented to an end user. Content objects are extracted from the use cases. Content can be developed before the web application is implemented, while it is being built, or long after it is operational.
The web engineer meets the person who designed the use case to obtain a more detailed understanding of what descriptive and pricing information means. The descriptive information includes a one-paragraph general description of the component, a photograph, a multi-paragraph technical description, a schematic diagram, and a thumbnail video that shows how to install the component.
A list of content objects, with a brief description of each, is often enough to define the requirements for content that must be designed and implemented. In some cases, a data tree is used to represent a hierarchy of content objects. The content model may also contain entity-relationship diagrams that describe the relationships among content objects.
Posted by Sunflower at 1/31/2011 09:55:00 PM 0 comments
Labels: Analysis, Analysis Model, Application, Classes, Components, Content, Content Model, Content Object, Data, Design, Implementation, Information, Objects, Use cases, Users, Web Applications, WebApp
Analysis Model and analysis activities for Web Applications
The analysis model of a web application is driven by information contained within the use cases that have been developed for the application. Analysis modeling focuses on four fundamental aspects: content, interaction, function, and configuration. The content to be presented by the web application is identified, and the functions to be performed are extracted from the use case descriptions.
TYPES OF ANALYSIS ACTIVITY OCCURRING DURING MODELLING OF A WEB APPLICATION
- Content analysis: identifies the full spectrum of content to be provided by the web application, including text, graphics, images, audio and video.
- Interaction analysis: describes how the user interacts with the web application.
- Functional analysis: defines the operations that will be applied to web application content and other processing functions that are independent of content but necessary for the end user.
- Configuration analysis: describes the environment and infrastructure in which the web application resides.
All the information obtained from these four activities is reviewed, modified as required, and then organized into a model that contains structural elements (analysis classes and content objects) and dynamic elements that describe how the structural elements interact with one another and with end users.
Posted by Sunflower at 1/31/2011 08:58:00 PM 0 comments
Labels: activities, Analysis, Analysis Model, Application, Aspects, Configuration, Content, Functional, Functions, Interaction, Modelling, software engineering, Web Applications, WebApps
Overview of analysis for Web Applications
Web application analysis focuses on the information or content that is to be presented, the functions that are to be performed for the end user and behaviors that a web application exhibits as it presents content and performs functions.
The web application analysis model comprises a set of UML diagrams and text that describe content, interaction, function and configuration. Analysis of a web application is warranted when the following conditions are met:
- Web application to be built is large and complex.
- Number of stakeholders is large.
- Number of web engineers and other contributors is large.
- Web application's goals and objectives will affect the business bottom line.
- Success of web application will have strong bearing on the success of business.
Requirement analysis consists of formulation, requirement gathering and analysis modeling. In formulation, goals, objectives and categories of users are defined. In requirement gathering, communication between web engineering team and web application stakeholders intensifies.
Building a user hierarchy provides a snapshot of the user population and a cross-check to ensure that the needs of every user have been addressed.
A use case diagram is created for each user category. Use cases are organized into functional packages and each package is assessed to ensure that it is comprehensible, cohesive, loosely coupled and hierarchically shallow.
Requirements analysis and modelling are iterative activities: new use cases are defined and added to packages, existing use cases are refined, and specific use cases may be reallocated to different packages.
Posted by Sunflower at 1/31/2011 03:17:00 PM 0 comments
Labels: Analysis, Analysis Model, Content, Formulation, Functions, Requirements, Software, Use cases, Users, Web Applications, Web engineers, WebApps
Friday, January 28, 2011
Introduction to Navigation Design - Navigation Semantics and Syntax
Once the web application architecture has been established and the content objects are defined, navigation pathways that enable users to access web application content and functions must be designed. To accomplish this:
- semantics of the navigation for different users of the site should be identified.
- syntax of achieving the navigation should be defined.
NAVIGATION SEMANTICS
Each user category has a user hierarchy and related use cases, and each actor has different navigation requirements. A set of classes is defined for the use cases developed for each user; each class encapsulates one or more content objects or web application functions. As interaction with the web application occurs, a series of Navigation Semantic Units (NSUs) is encountered. An NSU describes the navigation requirements for a use case and shows how an actor moves between content objects or web application functions.
Navigation Semantic Unit is a set of information and related navigation structures that collaborate in the fulfillment of a subset of related user requirements.
The web application designer creates a navigation semantic unit for each use case associated with each user role. During the initial stages of navigation design, the web application content architecture is assessed to determine one or more ways of navigating for each use case. Each way of navigating identifies the navigation nodes and the links that enable movement between them.
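A minimal illustrative sketch of how an NSU might be recorded during navigation design; the class, its fields and the sample use case are assumptions, not part of any prescribed notation:

```python
# Hedged sketch: a Navigation Semantic Unit recorded as the navigation nodes
# (content objects or functions) and the links between them, tied to one use
# case and one user role. All names are illustrative only.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class NavigationSemanticUnit:
    use_case: str
    user_role: str
    nodes: List[str] = field(default_factory=list)              # content objects / functions
    links: List[Tuple[str, str]] = field(default_factory=list)  # (from_node, to_node)

    def add_path(self, *nodes: str) -> None:
        """Record one way of navigating as an ordered sequence of nodes."""
        for src, dst in zip(nodes, nodes[1:]):
            if src not in self.nodes:
                self.nodes.append(src)
            if dst not in self.nodes:
                self.nodes.append(dst)
            self.links.append((src, dst))

nsu = NavigationSemanticUnit("select component", "customer")
nsu.add_path("home page", "component category", "component detail", "add to cart")
```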
NAVIGATION SYNTAX
Options for the syntax (mechanics) of navigation design include:
- Tabs: a variation of navigation bar or column.
- Individual navigation links: text-based links, icons, buttons and graphical metaphors.
- Horizontal Navigation bar: lists major content or functional categories in a bar containing appropriate links.
- Vertical Navigation column: lists major content or functional categories, or lists virtually all major content objects within the web application.
- Site maps: provide an all-inclusive table of contents for navigation to all content objects and functionality contained within web application.
Posted by Sunflower at 1/28/2011 02:06:00 PM 0 comments
Labels: Application, Classes, Design, Identification, Interaction, Mechanics, navigation, Navigation Design, NSU, Objects, Requirements, Semantics, Syntax, Units, Use cases, Users, Web Applications, WebApp
Thursday, January 27, 2011
Introduction to Architecture Design - WebApp Architecture
Architectural design is the design process for identifying the subsystems that make up a system and the framework for sub-system control and communication. Architectural design:
- is an early stage of the system design process.
- is conducted in parallel with other design activities.
- establishes links among the goals established for the web application, its content, the users who will visit it, and the navigation criteria.
- identifies system components and their communication.
A web application is an application that is accessed over a network such as the Internet or an intranet. Web application architecture provides an infrastructure that enables a web-based system to achieve its business objectives.
The Model View Controller (MVC) architecture decouples the user interface from web application functionality and information content.
The MVC design pattern divides applications into three components:
- The Model maintains the state and data that the application represents.
- The View allows the display of information about the model to the user. It contains all interface specific functions.
- The Controller allows the user to manipulate the application. It coordinates the flow of data between model and view.
Web application architecture is defined within the context of the development environment in which the application is to be implemented.
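The sketch below is a minimal illustration of the MVC separation described above; the catalog example, class names and methods are hypothetical and stand in for real web framework machinery:

```python
# Minimal, illustrative MVC sketch: the Model holds state, the View renders it,
# and the Controller mediates user actions. Names here are hypothetical.

class Model:
    """Maintains the state and data that the application represents."""
    def __init__(self):
        self.items = []

    def add_item(self, name):
        self.items.append(name)

class View:
    """Displays information about the model; holds interface-specific logic only."""
    def render(self, items):
        if not items:
            return "Catalog is empty"
        return "Catalog: " + ", ".join(items)

class Controller:
    """Coordinates the flow of data between model and view in response to user input."""
    def __init__(self, model, view):
        self.model = model
        self.view = view

    def handle_add(self, name):
        self.model.add_item(name)                   # update application state
        return self.view.render(self.model.items)   # re-render the view

if __name__ == "__main__":
    controller = Controller(Model(), View())
    print(controller.handle_add("sensor"))   # Catalog: sensor
    print(controller.handle_add("camera"))   # Catalog: sensor, camera
```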
Posted by Sunflower at 1/27/2011 06:26:00 PM 1 comments
Labels: activities, Architectural, Architecture, Content, Control, Design, Model-View-Controller, MVC, navigation, Network, Objects, Relationships, Web Application Architecture
Introduction to Architecture Design - Content Architecture
Architectural design is the design process for identifying the subsystems that make up a system and the framework for sub-system control and communication. Architectural design:
- is an early stage of the system design process.
- is conducted in parallel with other design activities.
- establishes links among the goals established for the web application, its content, the users who will visit it, and the navigation criteria.
- identifies system components and their communication.
Content architecture focuses on how content objects are structured for presentation and navigation; it addresses the overall hypermedia structure of the web application. It focuses on:
- identifying links and relationships among content and documents.
- defining the structure of content.
- specifying consistent document requirements and attributes.
The designer can choose from four basic content structures, or combine them into composites (a short sketch of a composite structure follows this list):
- Linear Structures : encountered when a predictable sequence of interactions is common. The sequence of content presentation is predefined and linear in nature.
- Grid Structures : applied when the web application content can be organized categorically in two dimensions. This architecture is useful when highly regular content is encountered.
- Hierarchical Structures : the most common web application architecture. It can be designed in a manner that enables flow of control horizontally, across vertical branches of the structure.
- Networked Structures : architectural components are designed so that they may pass control to virtually every other component in the system. This provides navigational flexibility, but it can also confuse the user.
- Composite Structures : the overall architecture of the web application may be hierarchical, but one part of the structure may exhibit linear characteristics while another part is networked.
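As promised above, a brief sketch of a composite content structure: a hierarchy of content nodes plus a cross-branch link that introduces a networked path. The node class and page names are invented for illustration:

```python
# Illustrative sketch of a composite content architecture: a hierarchy of content
# nodes with optional cross-links that introduce networked navigation paths.

class ContentNode:
    def __init__(self, title):
        self.title = title
        self.children = []     # hierarchical (vertical) structure
        self.cross_links = []  # networked (horizontal) links to other nodes

    def add_child(self, node):
        self.children.append(node)
        return node

    def link_to(self, node):
        self.cross_links.append(node)

home = ContentNode("Home")
products = home.add_child(ContentNode("Products"))
support = home.add_child(ContentNode("Support"))
faq = support.add_child(ContentNode("FAQ"))
products.link_to(faq)  # cross-branch link makes this a composite structure
```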
Posted by Sunflower at 1/27/2011 03:55:00 PM 0 comments
Labels: activities, Architectural, Architecture, Content, Content Architecture, Design, Grids, Hierarchical, Linear, navigation, Network, Objects, Relationships, Structures
Wednesday, January 26, 2011
Introduction to Content Design - Content Objects and Content Design Issues
Content design focuses on two different design issues, each addressed by individuals with different skill sets. In content design:
- a design representation is developed for content objects and for the mechanisms required to establish their relationships to one another, and
- the representation of information within a specific content object is designed.
The latter is a design activity conducted by copywriters, graphic designers, and others who generate the content to be used within a web application.
A content object is closely aligned with a data object. A content object has attributes that include content specific information and implementation specific attributes that are specified as part of design.
UML association and aggregation may be used to represent relationships between content objects.
Once all the content objects are modeled, the information that each object is to deliver must be authored and then formatted to best meet the customer's needs. Content authoring is the job of specialists who design the content object by providing an outline of information to be delivered and an indication of the types of generic content objects that will be used to deliver the information.
As content objects are designed, they are chunked to form web application pages. The number of content objects incorporated into a single page is a function of user needs, constraints imposed by download speed of internet connections, and restrictions imposed by the amount of scrolling that the user will tolerate.
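A small, hypothetical sketch of content objects being chunked into a page under a download-size budget; the attribute names and the 500 KB limit are assumptions used only to illustrate the idea:

```python
# Hypothetical sketch: content objects carrying content-specific and
# implementation-specific attributes, aggregated ("chunked") into pages
# subject to a simple size constraint.

from dataclasses import dataclass, field
from typing import List

@dataclass
class ContentObject:
    name: str
    media_type: str   # e.g. "text", "image", "video"
    size_kb: int      # implementation-specific attribute

@dataclass
class WebAppPage:
    title: str
    objects: List[ContentObject] = field(default_factory=list)

    def add(self, obj: ContentObject, max_page_kb: int = 500) -> bool:
        # Only chunk the object into this page if the download-size budget allows it.
        if sum(o.size_kb for o in self.objects) + obj.size_kb <= max_page_kb:
            self.objects.append(obj)
            return True
        return False

page = WebAppPage("Component detail")
page.add(ContentObject("description", "text", 5))
page.add(ContentObject("installation video", "video", 800))  # rejected: exceeds budget
```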
Posted by Sunflower at 1/26/2011 11:40:00 PM 0 comments
Labels: Application, Attributes, Content, Content Design, Content Object, Design, Issues, Objects, Relationships, Representation, software engineering, Web Applications, WebApp, WebApps
Monday, January 24, 2011
Aesthetic or Graphic Design - Layout Issues and Graphic Design Issues
Aesthetic or graphic design makes the web application appealing, even though it is functional in nature. It is the artistic angle that complements the technical aspects of web engineering. To perform aesthetic design, web application users decide how they want the application to look.
General Layout Guidelines To Consider When Screen Layout Is Designed
- Do not overcrowd the web page with too much information; it makes it difficult for the user to identify the needed information and creates visual chaos.
- Scrolling on a web page should be reduced, as users prefer not to scroll much. The best approach is to trim the content to what is necessary.
- The design should specify all layout items as a percentage of available space.
- Navigation, content and function should be grouped geographically within the page, following a consistent pattern.
- The web application should be designed to emphasize content. A typical web page should devote about eighty percent of its area to content, with the remainder dedicated to navigation and other features.
- Users tend to scan a web page the way they scan the page of a book, i.e. from top-left to bottom-right, so high-priority elements should be placed in the upper-left portion of the page.
GRAPHIC DESIGN ISSUES
Graphic design begins with layout and then it covers the color schemes, sizes, styles, use of media, typefaces and all other aesthetic elements of an application.
Different web sites provide design tips and guidelines.
Posted by Sunflower at 1/24/2011 08:23:00 PM 0 comments
Labels: Aesthetic, Application, Design, Graphic Design, Guidlines, Information, Issues, Layout, Pattern, User, User Interface, Web Applications, Web page, WebApps
What are different tasks representing WebApp interface design?
The user interface design begins with
- identification of user.
- identification of task.
- identification of environmental requirements.
Once these are identified, user scenarios are created and analyzed.
TASKS REPRESENTING WEB APPLICATION INTERFACE DESIGN ARE:
- The information that is contained in the analysis model is reviewed and refined as required.
- A rough sketch of web application interface layout is developed.
- The user objectives should be mapped into specific interface actions.
- A set of user tasks associated with action should be defined.
- Each interface action should have a storyboard screen image which depicts the interface response to user interaction.
- The input from aesthetic design should be used to refine the interface layout and storyboards.
- Implementing interface requires user interface objects. These user interface objects should be identified which may require a search through an existing object library.
- A procedural representation of user's interaction with interface is developed.
- A behavioral representation of the interface is also developed. It may use UML state diagrams to represent state mechanisms.
- Each state's interface layout is described using the design information developed in tasks 2 and 5.
- The interface design model should be refined and reviewed.
Posted by Sunflower at 1/24/2011 12:49:00 PM 0 comments
Labels: Actions, Application, Define, Design, Images, Interface, Layout, Mapping, Representation, software engineering, Tasks, User, User Interface, Web Applications, WebApp
Friday, January 21, 2011
The WebApp Design - Attributes, Goals and Web Design Pyramid
Design is an engineering activity that leads to a high quality product. The major attributes of quality for web applications are:
- Security : The main emphasis of security is the ability of the web application and its environment to avoid unauthorized access or attack.
- Availability : A web application will not meet users' needs if it is unavailable. Availability is the measure of the percentage of time that a web application is available for use.
- Scalability : The ability of the web application and its supporting systems to handle significant variation in volume. It is important to build a web application that can accommodate the burden of success.
- Time to market : It is a measure of quality from business point of view.
What should be considered when assessing content quality?
- Can the scope and depth of content be easily determined, so that it meets the user's needs?
- Can the background and authority of the content's authors be easily identified?
- Is it possible to determine the currency of the content and when it was last updated?
- Are the content and its location stable?
- Is the content credible?
- Is the content unique?
- Is content well organized?
- Is content valuable?
Design Goals
The design goals for every web application are:
- Simplicity
- Consistency
- Identity
- Robustness
- Navigability
- Visual appeal
- Compatibility
Web Design Pyramid
Each level of the pyramid represents the design activities:
- Interface Design : It describes structure and organization of the user interface. It includes screen layout, interaction modes, navigation mechanisms.
- Aesthetic Design : It describes the look and feel of the application.
- Content Design : It defines layout, structure and outline of all content.
- Navigation Design : It describes the navigational flow for web application.
- Architectural Design : It represents the overall hypermedia structure.
- Component Design : It develops detailed processing logic.
Posted by Sunflower at 1/21/2011 01:04:00 PM 0 comments
Labels: Activity, Aesthetic, Application, Architectural, Attributes, Content, Design Pyramid, Goals, Interface, navigation, Product, Quality, Representation, Scope, User, Web Applications, WebApp
Thursday, January 20, 2011
What are different user interface design principles and guidelines in software engineering ?
A good web application interface is understandable and forgiving, providing the user with a sense of control. The inner workings of the system are not a concern for users. Effective applications perform a maximum of work while requiring a minimum of information from users.
USER INTERFACE DESIGN PRINCIPLES AND GUIDELINES
- Consistency : Consistent actions should be required in similar situations. The use of navigation controls, menus and icons should be consistent throughout the web application.
- Anticipation : The web application should be designed so that it anticipates the user's next move.
- Communication : The interface should communicate the status of any activity initiated by the user. Communication can be obvious or subtle. User status and location should also be communicated by the interface.
- Efficiency : The design of the web application and its interface should optimize the user's work efficiency, not the efficiency of the web engineer who designs and builds it.
- Flexibility : The interface should be flexible enough to enable some users to get tasks done directly and others to explore the web application in a random fashion.
- Controlled Autonomy : The interface should facilitate user movement while enforcing the navigation conventions established for the application.
- Focus : The web application interface should be focused on the user's tasks at hand.
- Human Interface Objects : Reuse human interface objects. An interface object that can be seen, heard or touched by the end user can be acquired from existing object libraries.
- Learnability : A well-designed web application interface reduces learning time.
- Latency Reduction : The web application should use multitasking so that the user can proceed with work rather than waiting for an internal operation to complete.
- Metaphors : A metaphor should call on images and concepts from the user's experience, but it need not be an exact reproduction of a real-world experience. A web application interface that uses an interaction metaphor is easier to learn and use.
- Maintain work product integrity A work product must be automatically saved so that there is no loss of information if an error occurs.
- Readability Every person should be able to read the information in the user interface.
- Track State The state of the user interaction should be saved and stored so that the user can return to the same point even if he or she logs off.
- Visible navigation A well designed web application interface provides the illusion that users are in the same place, with the work brought to them.
Posted by Sunflower at 1/20/2011 01:07:00 PM 0 comments
Labels: Appliaction, Communication, Consistency, Design, Effective, Guidelines, Information, Interface, Interpret, Learnability, navigation, Principles, Product, User, User Interface, Users, Web Applications
Wednesday, January 19, 2011
Object Oriented Hypermedia Design Method (OOHDM)
Object Oriented Hypermedia Design Method (OOHDM) is a method for designing web applications. The development of hypermedia applications consists of four different design activities :
- Conceptual Design for OOHDM
In conceptual design, a conceptual schema is built that represents the objects, relationships and collaborations existing in the target domain. The Unified Modelling Language (UML) is used to create class diagrams, aggregations, class representations and other related information.
- Navigational Design for OOHDM
In OOHDM, the navigational design is built as a view over the conceptual design. It uses a predefined set of navigation classes such as nodes, links, anchors and structures. Navigation design is expressed in two schemas: the navigational class schema and the navigational context schema.
Objects that are derived from classes defined in conceptual design are called navigational objects. A series of navigational classes or nodes are defined to encapsulate these objects.
Navigational design must take into account the way in which the user explores the hypermedia space.
- Abstract Interface Design
Abstract Interface Design specifies the interface objects. To represent the relationship between interface and navigation objects, a formal model of interface objects called Abstract Data View is used. The abstract data view defines:
- a static layout.
- behavioral component.
- Implementation
Any application-specific model, like OOHDM design, must be eventually implemented using an implementation technology. This activity represents a design iteration that is specific to the environment in which web application will operate.
Posted by Sunflower at 1/19/2011 04:11:00 PM 0 comments
Labels: Abstract Interface, Applications, conceptual design, Design, Hypermedia, Implementation, Methods, Navigational, Object Oriented Hypermedia Design Method, Objects, OOHDM
Hypermedia Design Patterns in Web Engineering
Web engineering uses design patterns. These are of two types:
- Generic design patterns - applicable to all software.
- Hypermedia design patterns - specific to WebApps.
Design problems can be solved by using design patterns. There are some pattern categories:
NAVIGATION PATTERNS
These patterns help in the design of NSUs, navigation links and the overall navigation flow of the web application.
ARCHITECTURAL PATTERNS
These patterns help in the design of content and overall web application architecture. Many architectural patterns are available to web engineers who design web applications in various business domains.
COMPONENT CONSTRUCTION PATTERNS
The web application components can be combined by the methods provided by these patterns. When data processing functionality is required within a web application, the architectural and component level design patterns are applicable.
PRESENTATION PATTERNS
Presentation patterns assist in presenting content to the user via the interface. They describe how to organize user interface control functions, show the relationship between an interface action and the content object it affects, and establish content hierarchies.
BEHAVIOR AND USER INTERACTION PATTERNS
These patterns assist in the design of user-machine interaction. They address how the interface informs the user of the consequences of a specific action, how a user expands content based on usage context, and how best to describe the destination implied by a link.
Posted by Sunflower at 1/19/2011 04:08:00 PM 0 comments
Labels: Architectural, Behavior, Component Construction patterns, Design, Hypermedia, Hypermedia Design Patterns, navigation, Patterns, Presentation, Web Applications, Web Engineering, WebApps
Tuesday, January 18, 2011
The Risk Mitigation, Monitoring and Management (RMMM) Plan
The Risk Mitigation, Monitoring and Management, RMMM, plan documents all work performed as part of risk analysis and is used by the project manager as part of overall project plan.
The goal of the risk mitigation, monitoring and management plan is to identify as many potential risks as possible. It is the organization's responsibility to perform risk mitigation, monitoring, and management in order to produce a quality product.
Every phase of this plan is of equal importance. More focus is maintained in the initial phases i.e. the identification and assessment of possible risks.
Once the RMMM has been documented and the project has begun, risk mitigation and monitoring steps commence.
The effectiveness of these mitigation/contingency plans should be measured carefully, to ensure that after such plans are executed the risk exposure is reduced or, preferably, eliminated.
- Risk Mitigation covers efforts taken to reduce either the probability or consequences of a threat.
- Risk monitoring and control is the process of identifying, analyzing, and planning for newly discovered risks and managing identified risks.
- Risk management is the identification, assessment, and prioritization of risks.
Posted by Sunflower at 1/18/2011 08:53:00 PM 0 comments
Labels: Analysis, Control, Document, Focus areas, Goals, Management, Monitoring, Organization, Risk Mitigation, Risk Mitigation Monitoring and Management Plan, Risks, RMMM Plan
Software Six Sigma for Software Engineering
Software Six Sigma is a strategy to enhance and sustain continuous improvement in the software development process and quality management. It uses data and statistical analysis to measure and improve a company's performance by eliminating defects in manufacturing and service-related processes.
ATTRIBUTES OF SIX SIGMA
- genuine metric data.
- accurate planning.
- real time analysis and decision support by the use of statistical tools.
- high quality product.
- software improvement costs and benefits.
STEPS IN SIX SIGMA METHODOLOGY
- Customer requirements and project goals are defined via well-defined methods.
- Quality performance is determined by measuring the existing process and its output.
- The defect metrics are analyzed.
- The process is improved by eliminating the root causes of defects.
- The process is controlled to ensure that future changes do not reintroduce the causes of defects.
These steps are referred to as the DMAIC (define, measure, analyze, improve and control) method.
When a new process is being designed rather than an existing one improved, the last two steps are replaced:
- Design the process to avoid the root causes of defects and to meet customer requirements.
- Verify that the process model will avoid defects and meet customer requirements.
This variation is called the DMADV (define, measure, analyze, design and verify) method.
Posted by Sunflower at 1/18/2011 07:16:00 PM 0 comments
Labels: Attributes, Changes, Control, Defect, Defect Metrics, Goals, Improvement, Output, Project, Quality, Requirements, Six Sigma, Steps
Monday, January 17, 2011
The COCOMO II (Constructive Cost Estimation Model) Model
COCOMO II is an extension to the COCOMO model. COCOMO II takes into account new development processes, increased flexibility in software development, need for decision making with incomplete information and new data about projects.
COCOMO II is really three different models :
- The Application Composition Model used for prototyping.
- The Early Design Model used when requirements are available but design has not yet started.
- The Post-Architecture Model used once the system architecture has been designed.
COCOMO II models require sizing information. Three different sizing options are available as part of the model hierarchy: object points, function points, and lines of source code.
The application composition model uses object points, an indirect software measure computed from counts of the number of screens, reports and components likely to be required to build the application.
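The sketch below illustrates how an object-point count might be computed from such counts. The complexity weights are commonly cited values for the application composition model but should be treated here as assumptions; consult the COCOMO II definition for the authoritative tables:

```python
# Rough sketch of an object-point count for the application composition model.
# The complexity weights below are commonly cited values, used here as assumptions.

WEIGHTS = {
    "screen":    {"simple": 1, "medium": 2, "difficult": 3},
    "report":    {"simple": 2, "medium": 5, "difficult": 8},
    "component": {"3gl": 10},
}

def object_points(counts):
    """counts: list of (kind, complexity, how_many) tuples."""
    return sum(WEIGHTS[kind][complexity] * n for kind, complexity, n in counts)

op = object_points([
    ("screen", "simple", 4),
    ("screen", "medium", 2),
    ("report", "medium", 3),
    ("component", "3gl", 1),
])
print(op)  # 4*1 + 2*2 + 3*5 + 1*10 = 33 object points
```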
DIFFERENCES BETWEEN COCOMO I AND COCOMO II
- COCOMO I requires software size in KDSI (thousands of delivered source instructions) as an input, but COCOMO II is based on KSLOC (thousands of source lines of code).
- COCOMO I provides point estimates of effort and schedule, but COCOMO II provides likely ranges of estimates.
- In COCOMO II, the estimation equation exponent is determined by five scale factors instead of three.
- Data points in COCOMO I: 63 and COCOMO II: 161.
- COCOMO II adjusts for software reuse and re-engineering but COCOMO I made little accommodation for these factors.
Posted by Sunflower at 1/17/2011 09:46:00 PM 0 comments
Labels: Application, COCOMO II Model, Comparisons, composition, Design, Development, Information, Object Points, Points, Post-Architectural, Software
The COCOMO (Constructive Cost Estimation Model) Model - Basic, Intermediate and Complete COCOMO
COCOMO (Constructive Cost Estimation Model) was proposed by Boehm. COCOMO is a widely used model that combines statistical figures, mathematical equations and expert judgement. COCOMO is an open model: the underlying cost estimation equations, every assumption made in the model, every definition, and the costs included in an estimate are all explicitly stated.
- COCOMO estimates are more objective and repeatable than estimates made by methods relying on proprietary models.
- COCOMO can be calibrated to reflect your software development environment, and to produce more accurate estimates.
Software cost estimation should be done through three stages:
- Basic COCOMO : It is a single-valued, static model in which the development effort is estimated as a function of program size (a worked example follows this list).
Effort = a1 × (KLOC)^a2 PM
Tdev = b1 × (Effort)^b2 months
where:
• KLOC is the estimated size of the software product expressed in Kilo Lines of Code,
• a1, a2, b1, b2 are constants for each category of software products,
• Tdev is the estimated time to develop the software, expressed in months,
• Effort is the total effort required to develop the software product, expressed in person-months (PM).
- Intermediate COCOMO : It computes software development effort as a function of program size and a set of fifteen "cost drivers". It takes into account factors such as required product reliability, database size, execution and storage constraints, personnel aptitude, and the use of software tools.
- Complete COCOMO : The main shortcoming of the basic and intermediate COCOMO models is that they consider a software product as a single homogeneous entity. In reality the system is made up of sub-systems, each with its own characteristics: sub-systems may differ in inherent development complexity, reliability requirements and development team experience.
The complete COCOMO model considers these differences in characteristics of the subsystems and estimates the effort and development time as the sum of the estimates for the individual subsystems.
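A short worked example of the Basic COCOMO equations listed above. The constants used are the commonly quoted organic-mode values and are assumptions for illustration only; other product categories use different constants:

```python
# Worked example of the Basic COCOMO equations quoted above.
# a1, a2, b1, b2 are the commonly quoted organic-mode constants, used here
# purely for illustration.

def basic_cocomo(kloc, a1=2.4, a2=1.05, b1=2.5, b2=0.38):
    effort = a1 * (kloc ** a2)   # person-months
    tdev = b1 * (effort ** b2)   # months
    return effort, tdev

effort, tdev = basic_cocomo(32)  # a hypothetical 32 KLOC product
print(f"Effort = {effort:.1f} person-months, Tdev = {tdev:.1f} months")
# Effort = 91.3 person-months, Tdev = 13.9 months (approximately)
```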
Posted by Sunflower at 1/17/2011 08:42:00 PM 0 comments
Labels: Basic, COCOMO Model, Code, Complete, Constructive, Cost, Development, Estimation, Estmates, Function, Intermediate, Product, program, Software, Stages
Saturday, January 15, 2011
Project Management - The W5HH Principle
Barry Boehm suggested an approach that addresses project objectives, milestones and schedules, responsibilities, management and technical approaches, and required resources. This is called the W5HH principle. The questions answered by this principle are:
- Why is the system being developed?
- What will be done by When?
- Who is responsible for a function?
- Where are they organizationally located?
- How will the job be done technically and managerially?
- How much of each resource is needed?
WHY IS THE SYSTEM BEING DEVELOPED?
It enables the parties to assess the validity of business reasons for the software work. It justifies the expenditure of people, time, and money.
WHAT WILL BE DONE?
It specifies the task set required for the project.
WHEN WILL IT BE DONE?
It helps to determine the project schedule. It helps in determining when tasks are conducted and when milestones are reached.
WHO IS RESPONSIBLE FOR A FUNCTION?
It helps to establish the role and responsibilities of each member of the software team.
WHERE ARE THEY ORGANIZATIONALLY LOCATED?
Not every role and responsibility resides within the software team; customers, users and other stakeholders also have responsibilities.
HOW WILL THE JOB BE DONE TECHNICALLY AND MANAGERIALLY?
The management and technical strategy of project is defined once the scope of the product is established.
HOW MUCH OF EACH RESOURCE IS NEEDED?
It helps in deriving estimates based on the answers to the above questions.
Posted by Sunflower at 1/15/2011 10:32:00 PM 0 comments
Labels: Approach, Development, Functions, Location, Organization, Principle, Project, Project Management, Resources, Software, software engineering, Task set, Tasks, Technically, validity, W5HH principle
Friday, January 14, 2011
Cleanroom Software Engineering - Advantages, Principles and Process Teams
Cleanroom software engineering involves the integrated use of software engineering modeling, program verification and statistical software quality assurance.
- Cleanroom software engineering verifies design specification using mathematically-based proof of correctness.
- Cleanroom software engineering relies heavily on statistical use testing to uncover high impact errors.
- Cleanroom software engineering generally follows an incremental development process.
CLEANROOM PRINCIPLES
- Small Teams include independent specification, development, and certification sub-teams.
- Incremental development under statistical quality control.
- Software development is based on mathematical principles. The box structure principle is used for specification and design, and formal verification is used to confirm correctness of the implementation against the specification. Program correctness is verified by team reviews using questionnaires.
- Testing is based on statistical principles.
CLEANROOM PROCESS TEAMS
- The specification team develops and maintains the system specification.
- The development team develops and verifies the software. The software is not compiled or executed during verification.
- The certification team develops a set of statistical tests to exercise the software after development. Reliability growth models are used to assess reliability.
BENEFITS OF CLEANROOM SOFTWARE ENGINEERING
- Zero failures in the field is the goal; a realistic expectation is fewer than 5 failures per KLOC on first program execution in a team's first project.
- Short development cycles
- Longer product life.
Posted by Sunflower at 1/14/2011 09:53:00 PM 0 comments
Labels: Advantages, Benefits, Certification, Cleanroom Software engineering, Development, Incremental, Principles, Process teams, Quality, Software, Specification, statistical use testing, Strategy, Teams
Thursday, January 13, 2011
Cleanroom Testing - Statistical Use Testing and Certfication
Clean room testing is different from conventional testing approaches. The goal of clean room testing is to validate software requirements by demonstrating that a statistical sample of use-cases have been executed successfully.
Statistical use testing exercises the software in the way that users intend to use it. The cleanroom testing team determines a usage probability distribution for the software: each increment's specification is analyzed to define a set of inputs or events that cause the software to change its behavior, and a probability of use is assigned to each input or event based on interviews with users, the creation of usage scenarios and a general understanding of the application domain. The testing team executes these use cases and verifies software behavior against the specification for the system. Using the recorded interval times between failures, the certification team can compute mean time to failure.
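A minimal sketch of the certification team's mean-time-to-failure calculation mentioned above, assuming the inter-failure interval times recorded during statistical use testing are simply averaged; the data values are invented:

```python
# Minimal sketch: mean time to failure estimated from recorded inter-failure
# interval times. The figures below are hypothetical.

interfailure_times_hours = [12.0, 30.5, 18.0, 44.0, 25.5]  # hypothetical recordings

mttf = sum(interfailure_times_hours) / len(interfailure_times_hours)
print(f"Estimated MTTF: {mttf:.1f} hours")  # 26.0 hours
```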
Within the clean room software engineering approach, certification implies that the reliability can be specified for each component. Reusable software components can be stored along with their usage scenarios, program stimuli, and probability distributions. The certification approach involves :
- usage scenarios must be created.
- usage profile is specified.
- test cases are generated from profile.
- tests are executed and failure data are recorded and analyzed.
- reliability is computed and certified.
Certification for clean room software engineering requires creation of three models:
- Sampling Model : Certification is given if no failure or a specified number of failures occur after executing m random test cases. The value of m is derived mathematically.
- Component Model : This model enables the analyst to determine the probability that component i will fail prior to completion.
- Certification Model : The overall reliability of the system is projected and certified.
Posted by Sunflower at 1/13/2011 09:48:00 PM 0 comments
Labels: Approaches, Certification, Cleanroom Software engineering, Cleanroom strategy, Cleanroom testing, Components, Models, Probability, Reliability, Samples, statistical use testing, Verification
Wednesday, January 12, 2011
Model Driven Architecture (MDA) - Advantages and MDA Process
The MDA is a new way of writing specifications, based on a platform-independent model.
Why should we use Model Driven Architecture
- Portability
- Interoperability
- Domain facilities provide much wider interoperability.
- MDA allows the functionality and behavior to be modelled only once, which saves a lot of time.
- Requirements are always changing.
- New technologies keep arising.
- Old systems need to be integrated with new systems, and with any other systems in the future.
- MDA makes it easier to integrate applications and facilities across middle-ware boundaries.
Model Driven Architecture Process
THE BASIC PROCESS
- To construct an MDA application, the first step is for a business analyst to create a computation independent model (CIM).
- The CIM is transformed into a platform independent model (PIM) by an enterprise architect.
- The resulting PIM has to be targeted to a platform to complete the build process.
- The transformation of the PIM to a platform specific model (PSM) is done by a platform specialist (a minimal sketch of this pipeline follows).
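A hypothetical sketch of the CIM to PIM to PSM pipeline just described. Real MDA tooling operates on formal (e.g. UML/MOF) models and defined mappings; plain Python dictionaries are used here only to make the sequence of transformations concrete:

```python
# Hypothetical sketch of the CIM -> PIM -> PSM pipeline. The entities, rules and
# target platform are invented for illustration.

def business_analyst_creates_cim():
    # Computation independent model: requirements in business terms only.
    return {"entity": "Order", "rule": "an order must reference a customer"}

def architect_transforms_cim_to_pim(cim):
    # Platform independent model: structure and behaviour, no platform detail.
    return {"class": cim["entity"],
            "attributes": ["id", "customer_id"],
            "constraint": cim["rule"]}

def specialist_transforms_pim_to_psm(pim, platform="relational-db"):
    # Platform specific model: the PIM targeted to one concrete platform.
    return {"platform": platform,
            "table": pim["class"].lower() + "s",
            "columns": pim["attributes"]}

psm = specialist_transforms_pim_to_psm(
    architect_transforms_cim_to_pim(business_analyst_creates_cim()))
print(psm)  # {'platform': 'relational-db', 'table': 'orders', 'columns': ['id', 'customer_id']}
```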
THE COMPLEX PROCESS
The process from computation independent model to platform specific model can be a bit more complex.
- There can be gaps between the models which make transformation difficult.
- As a result, you can have interrelated models at different layers of abstraction.
- One consequence is that a single layer of abstraction can also have horizontal transformations; consider the example where a PIM is converted multiple times into more detailed PIMs. These horizontal transformations exist in addition to the vertical transformations between layers.
Posted by Sunflower at 1/12/2011 09:02:00 PM 0 comments
Labels: Complex, Details, Development, Focus areas, Goals, Independent, MDA, Model Driven Architecture, Organization, Platforms, Portable, Simple Process, Software, Specific, Specification, Structure
Model Driven Architecture (MDA) - Characteristics and Viewpoints
OMG was formed as a standards organization to help reduce complexity, lower costs, and hasten the introduction of new software applications.
- The Object Management Group (OMG) adopted the Model Driven Architecture as an approach for using models in software development.
- Its three primary goals are portability, interoperability and reusability through architectural separation of concerns.
Characteristics of Model Driven Architecture(MDA)
- MDA enables development of new specifications.
- MDA provides a comprehensive, structured solution for application interoperability and portability into the future.
- MDA consists of services specified by OMG. It also includes directory services, event handling, persistence, transactions, and security.
- MDA enables the creation of standardized domain models for some vertical industries.
- MDA separates the specification of a system's operation from the way the system uses the capabilities of its platform.
- MDA enables converting platform-independent models to produce platform-specific models using mappings.
What are different viewpoints of Model Driven Architecture
- The first viewpoint is called the Computation Independent Viewpoint, which focuses on the environment and requirements of the system rather than the details of the system's structure and processing.
- The second viewpoint is called the Platform Independent Viewpoint, which focuses on how the system operates while hiding the details necessary for a particular platform. This part of the complete specification does not change from one platform to another.
- The third viewpoint is called the Platform Specific Viewpoint, which adds to the platform independent viewpoint the details of the use of a specific platform.
Posted by Sunflower at 1/12/2011 08:11:00 PM 0 comments
Labels: Characteristics, Details, Development, Focus areas, Goals, Independent, MDA, Model Driven Architecture, Organization, Platforms, Portable, Software, Specific, Specification, Structure, Viewpoints
Monday, January 10, 2011
Rapid Application Development (RAD) - Advantages and Disadvantages
The main objective of Rapid Application Development is to avoid extensive pre-planning, generally allowing software to be written much faster and making it easier to change requirements.
The Rapid Application Development Model (RAD Model) is a high-speed adaptation of the linear sequential software process model, in which a very short development cycle is achieved by using a component-based construction approach.
When organizations adopt rapid development methodologies, care must be taken to avoid role and responsibility confusion and communication breakdown within the development team, and between the team and the client.
To facilitate rapid development, strong emphasis was placed on the idea of software re-use. The notion of software components began to be nurtured.
ADVANTAGES OF RAPID APPLICATION DEVELOPMENT(RAD)
- It increases speed of developing software. It can be achieved using methods like rapid prototyping, virtualization of system related routines, the use of CASE tools and other techniques.
- Re-usability of components help to speed up development.
- It increases the quality.
- Some systems also deliver advantages of interoperability, extensibility, and portability.
- It incorporates short development cycles.
- Promotes strong collaborative atmosphere and dynamic gathering of requirements.
DISADVANTAGES OF RAPID APPLICATION DEVELOPMENT(RAD)
- Unknown cost of product.
- Difficult to commit the time required for success of the RAD process.
- Short iteration may not add enough functionality, leading to significant delays in final iterations.
- Early RAD systems faced reduced scalability because a RAD-developed application starts as a prototype and evolves into a finished application.
- Early RAD systems had reduced feature sets due to time boxing, where features are pushed to later versions in order to finish a release in a short amount of time.
- Dependency on strong cohesive teams and individual commitment to the project.
Posted by Sunflower at 1/10/2011 05:35:00 PM 0 comments
Labels: Application, Components, Design, Development, Disadvantages, Focus areas, Phases, Product, Prototyping, RAD, Rapid Application Development, Requirements, Software Development Methodology Advantages
Rapid Application Development (RAD) - Characteristics and Phases
Rapid Application Development is a software development methodology that aims to decrease the time needed to design and build software through: gathering requirements using workshops or focus groups; prototyping and early, iterative user testing of designs; the re-use of software components; a rigidly paced schedule that defers design improvements to the next product version; and less formality in reviews and other team communication.
Characteristics of Rapid Application Development(RAD)
- It involves techniques like iterative development and software prototyping.
- Focused scope where the business objectives are well defined and narrow is well suited for RAD.
- RAD suits projects where the data already exists (completely or in part) and the project largely comprises analysis or reporting of that data.
- RAD suits projects where decisions can be made by a small number of people who are available and preferably co-located.
- A small project team (preferably six people or less) is suitable for RAD.
- In RAD, the technical architecture is defined and clear and the key technology components are in place and tested.
Phases of Rapid Application Development
RAD has a step by step process.
- Planning of Requirements: Developers meet with the project coordinator or manager to create specific objectives from the desired program. Strategies for development and tools for development are also laid out in a specific project.
- RAD Design Workshop: Using the agreed tools and interfaces, developers will start to create different programs based on the business need.
- Implementation Phase: Even though the software has gone through hundreds or even thousands of tests and critiques, the stage at which it is implemented at larger scale is different, so new suggestions and bugs should be expected from different users.
Posted by Sunflower at 1/10/2011 01:24:00 PM 0 comments
Labels: Application, Characteristics, Components, Design, Development, Focus areas, Phases, Product, Prototyping, RAD, Rapid Application Development, Requirements, Software, Software Development Methodology
Saturday, January 8, 2011
Software Development Methodology - Joint Application Development (JAD)
- Joint Application Development (JAD) is a process that was originally used to develop computer-based systems.
- Joint Application Development is a process that accelerates the design of information technology solutions.
- JAD uses customer involvement and group dynamics to accurately depict the user's view of the business need and to jointly develop a solution.
- JAD is thought to lead to shorter development times and greater client satisfaction because the client is involved throughout the development process.
- JAD centers around a workshop session that is structured and focused. Participants of these sessions would typically include a facilitator, end users, developers, observers, mediators and experts.
- In order to get agreement on the goals and scope of the project, a series of structured interviews are held.
- The sessions are very focused, conducted in a dedicated environment, quickly drive major requirements.
Concept of Joint Application Development
- Users who do the job have the best understanding of that job.
- The developers have the best understanding of the technology.
- The software development process and business process work in the same way.
- When all groups work equal and as one team with a single goal, the best software comes out.
Principles of JAD Process
- Define session objectives.
- Prepare for the session.
- Conduct the JAD session.
- Produce the documents.
JAD improves the final quality of the product by keeping the focus on the up-front part of the development cycle, thus reducing errors that are likely to cause huge expenses later.
Advantages of Joint Application Development
- JAD decreases time and costs associated with requirements elicitation process.
- The experts get a chance to share their views, understand views of others, and develop the sense of project ownership.
- The techniques of JAD implementation are well known as it is the first accelerated design technique.
- Easy integration of CASE tools into JAD workshops improves session productivity and provides systems analysts with discussed and ready to use models.
- Enhances quality.
- Creates a design from the customer's perspective.
Posted by Sunflower at 1/08/2011 05:07:00 PM 0 comments
Labels: Advantages, Concepts, Customer, Design, Development, JAD, Joint Application Development, Methods, Principles, Product, Quality, Sessions, Software Development Methodology, Users
Quick Glance : Sociability(Sensitivity) Tests, Tuning Cycle Tests, Protocol Tests, Thick Client Application Tests, Thin Client Application tests
SOCIABILITY(sensitivity) TESTS
Sensitivity analysis testing can determine the impact of activities in one system on another, related system. Such testing involves a mathematical approach to determine the impact that one system will have on another. For example, web-enabling a customer 'order status' facility may impact the performance of telemarketing screens that interrogate the same tables in the same database. The issue with web enabling can be that it is more successful than anticipated and results in many more inquiries than originally envisioned, which loads the IT system with more work than had been planned.
TUNING CYCLE TESTS
A series of test cycles can be executed with a primary purpose of identifying tuning opportunities. Tests can be refined and re-targeted on the fly to allow technology support staff to make configuration changes so that the impact of those changes can be immediately measured.
PROTOCOL TESTS
Protocol tests involve the mechanisms used in an application, rather than the applications themselves. For example, a protocol test of a web server will involve a number of HTTP interactions that would typically occur if a web browser were to interact with a web server - but the test would not be done using a web browser. LoadRunner is usually used to drive load into a system using VUGen at a protocol level, so that a small number of computers can be used to simulate thousands of users.
THICK CLIENT APPLICATION TESTS
Thick clients, also called heavy clients, are full-featured computers that are connected to a network. While a thick client is fully functional without a network connection, it is only a "client" when it is connected to a server. The server may provide the thick client with programs and files that are not stored on the local machine's hard drive.
A thick client is a purpose built piece of software that has been developed to work as a client with a server. It often has substantial business logic embedded within it, beyond the simple validation that is able to be achieved through a web browser. A thick client is often able to be very efficient with the amount of data that is transferred between it and its server, but is also often sensitive to any poor communication links. Testing tools such as WinRunner are able to be used to drive a thick client, so that response time can be measured under a variety of circumstances within a testing regime.
Developing a load test based on thick client activity usually requires significantly more effort for the coding stage of testing, as VUGen must be used to simulate the protocol between the client and the server. That protocol may be database connection based, COM/DCOM based, a proprietary communications protocol or even a combination of protocols.
THIN CLIENT APPLICATION TESTS
An internet browser that is used to run an application is said to be a thin client. But even thin clients can consume substantial amounts of CPU time on the computer on which they are running. This is particularly the case with complex web pages that utilize many recently introduced features to liven up the page. Rendering a page after hitting a SUBMIT button may take several seconds, even though the server may have responded to the request in less than one second. Testing tools such as WinRunner can be used to drive a thin client, so that response time can be measured from a user's perspective rather than at a protocol level.
Thursday, January 6, 2011
Volume tests - Volume Testing of Batch Processing Systems
Capacity drivers in batch processing systems are also critical as certain record types may require significant CPU processing, while other record types may invoke substantial database and disk activity. Some batch processes also contain substantial aggregation processing, and the mix of transactions can significantly impact the processing requirements of the aggregation phase.
In addition to the contents of any batch file, the total amount of processing effort may also depend on the size and makeup of the database that the batch process interacts with. Also, some details in the database may be used to validate batch records, so the test database must match test batch files.
Before conducting a meaningful test on a batch system, the following must be known :
- The capacity drivers for the batch records.
- The mix of batch records to be processed, grouped by capacity driver.
- Peak expected batch sizes (check end of month, quarter and year batch sizes).
- Similarity of production database and test database.
- Performance Requirements.
Batch runs can be analyzed and the capacity drivers can be identified, so that large batches can be generated for validation of processing within batch windows. Volume tests are also executed to ensure that the anticipated numbers of transactions are able to be processed and that they satisfy the stated performance requirements.
Posted by Sunflower at 1/06/2011 01:45:00 PM 0 comments
Labels: Application, Batch Processing Systems, Batch Systems, Capacity Drivers, CPU, Database, Drivers, files, Records, Software testing, Volume, Volume test, Volume testing
Wednesday, January 5, 2011
Volume tests - Volume Testing of Messaging systems
Volume tests are often most appropriate to messaging, batch and conversion processing type situations. In a volume test, there is often no such measure as response time. Instead, there is usually a concept of throughput. A key to effective volume testing is the identification of the relevant capacity drivers. A capacity driver is something that directly impacts on the total processing capacity. For a messaging system, a capacity driver may well be the size of messages being processed.
Most messaging systems do not interrogate the body of the messages they are processing, so varying the content of the test messages may not impact the total message throughput capacity, but significantly changing the size of the messages may have a significant effect. However, the message header may include indicators that have a very significant impact on processing efficiency. For example, a flag saying that the message need not be delivered under certain circumstances is much easier to deal with than a flag saying that the message must be held for delivery for as long as necessary and must not be lost. In the former example, the message may be held in memory; in the latter, the message must be physically written to disk multiple times.
Before conducting a meaningful test on a messaging system, the following must be known:
- the capacity drivers for the messages.
- the peak rate of messages that need to be processed, grouped by capacity driver.
- the duration of peak message activity that needs to be replicated.
- the required message processing rates.
A test can then be designed to measure the throughput of a messaging system as well as the internal messaging system metrics while that throughput rate is being processed. Such measures would typically include CPU utilization and disk activity. It is important that a test be run, at peak load, for a period of time equal to or greater than the expected production duration of peak load.
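An illustrative throughput calculation for such a volume test: messages processed over the measured peak window compared against the required processing rate. All figures are invented for the example:

```python
# Illustrative throughput check for a messaging-system volume test.
# All figures below are invented for the example.

messages_processed = 1_800_000   # counted during the test window
window_seconds = 2 * 60 * 60     # two-hour sustained peak

throughput_per_sec = messages_processed / window_seconds
required_per_sec = 230           # required peak processing rate (assumed)

print(f"Measured throughput: {throughput_per_sec:.0f} msg/s")   # 250 msg/s
print("PASS" if throughput_per_sec >= required_per_sec else "FAIL")
```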
Posted by Sunflower at 1/05/2011 02:48:00 PM 0 comments
Labels: Capacity, Duration, Header, Messages, Messaging System, Peak, Process, Response time, Software testing, Tests, Throughput, Time, Volume, Volume test, Volume testing
Tuesday, January 4, 2011
What is the need to execute Network Sensitivity Tests?
The three principal reasons for executing network sensitivity tests are as follows:
- Determine the impact on response time of WAN link.
- Determine the capacity of a system based on a given WAN link.
- Determine the impact on the system under test that is under dirty communications load.
Execution of performance and load tests for the analysis of network sensitivity requires the test system to be configured to emulate a WAN. Once a WAN link has been configured, the performance and load tests conducted become network sensitivity tests.
There are two ways of configuring such tests:
- Use a simulated WAN and inject appropriate background traffic
This can be achieved by putting back to back routers between a load generator and the system under test. The routers can be configured to allow the required level of bandwidth, and instead of connecting to a real WAN, they connect directly through to each other.
When back-to-back routers are configured to be part of a test, they will basically limit the bandwidth. If the test is to be realistic, then additional traffic will need to be applied to the routers. This can be achieved by a web server at one end of the link serving pages and another load generator generating requests. It is important that the mix of traffic is realistic.
For example, a few continuous file transfers may impact response time in a different way than a large number of small transmissions. By forcing extra traffic over the simulated WAN link, the latency will increase and some packet loss may even occur. While this is much more realistic than testing over a high-speed LAN, it does not take into account many features of a congested WAN, such as out-of-sequence packets.
- Use the WAN emulation facility within LoadRunner
The WAN emulation facility within LoadRunner supports a variety of WAN scenarios. Each load generator can be assigned a number of WAN emulation parameters, such as error rates and latency. WAN parameters can be set individually, or WAN link types can be selected from a list of pre-set configurations.
It is important to ensure that measured response times incorporate the impact of WAN effects both at an individual session, as part of a performance test, and under load as part of a load test, because a system under WAN affected load may work much harder than a system doing the same actions over a clean communications link.
Posted by Sunflower at 1/04/2011 04:17:00 PM 0 comments
Labels: Configuring, Impact, Load, Load tests, LoadRunner, Network, Network Sensitivity Tests, Response time, Routers, Sensitive, Tests, traffic, WAN
Monday, January 3, 2011
How to execute Performance Tests?
Performance testing involves executing the same test case multiple times with data variations for each execution, and then collating response times and computing response time statistics to compare against the formal expectations. Often, performance differs when the data used in the test case differs, as different numbers of rows are processed in the database, different processing and validation come into play, and so on. By executing a test case many times with different data, a statistical measure of response time can be computed that can be directly compared against a formally stated expectation.
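A small sketch of that collation step, assuming response times from repeated executions are reduced to a mean and an approximate 90th percentile and compared against a hypothetical formal expectation:

```python
# Sketch: response times from repeated executions of one test case (with varied
# data) reduced to summary statistics. The data and the 2-second expectation are
# invented for illustration.

import statistics

response_times_sec = [1.2, 0.9, 1.5, 2.1, 1.1, 0.8, 1.7, 1.3, 1.0, 2.4]

mean_rt = statistics.mean(response_times_sec)
p90_rt = statistics.quantiles(response_times_sec, n=10)[8]  # ~90th percentile

expectation_sec = 2.0   # hypothetical formal requirement, e.g. "90% under 2 seconds"
print(f"mean={mean_rt:.2f}s, 90th percentile={p90_rt:.2f}s")
print("Meets expectation" if p90_rt <= expectation_sec else "Does not meet expectation")
```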
Network sensitivity tests are variations on load tests and performance tests that focus on the Wide Area Network (WAN) limitations and network activity. Network sensitivity tests can be used to predict the impact of a given WAN segment or traffic profile on various applications that are bandwidth dependent. Network issues often arise at low levels of concurrency over low bandwidth WAN segments. Very chatty applications can appear to be more prone to response time degradation under certain conditions than other applications that actually use more bandwidth. For example, some applications may degrade to unacceptable levels of response time when a certain pattern of network traffic uses 50% of available bandwidth, while other applications are virtually un-changed in response time even with 85% of available bandwidth consumed elsewhere.
This is a particularly important test for the deployment of a time-critical application over a WAN. Also, some front-end systems such as web servers need to work much harder with dirty communications than with the clean communications encountered on a high-speed LAN in an isolated load and performance testing environment.
Posted by Sunflower at 1/03/2011 03:28:00 PM 0 comments
Labels: Activity, Application, Bandwidth, Compute, Data, Expectations, Load, Load tests, Network, Performance, Performance testing, Response time, Software testing, Tests, WAN