Software Development Glossary

3-tier application - a program that is organized into three major parts: the workstation or presentation interface; the business logic; and the database and related programming. Each of these is distributed to one or more separate places on a network.
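
As a rough sketch only (the database file, table, and tax rule below are invented for illustration), the three tiers might be separated like this, with each function deployable to a different machine:

    import sqlite3

    def fetch_order_items(order_id, db_path="orders.db"):       # data tier (assumed schema)
        with sqlite3.connect(db_path) as conn:
            return conn.execute(
                "SELECT quantity, unit_price FROM order_items WHERE order_id = ?",
                (order_id,),
            ).fetchall()

    def order_total_with_tax(order_id):                          # business-logic tier
        subtotal = sum(qty * price for qty, price in fetch_order_items(order_id))
        return subtotal * 1.15                                   # illustrative tax rule

    def show_order(order_id):                                    # presentation tier
        print(f"Order {order_id} total: {order_total_with_tax(order_id):.2f}")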

Agile software development - calls for keeping code simple, testing often, and delivering small, functional bits of the application as soon as they're ready. The focus is to build a succession of parts, rather than delivering one large application at the end of the project.

Amdahl's law - stipulates that, in a program with parallel processing, the relatively few instructions that have to be performed in sequence act as a limiting factor on program speedup, so that adding more processors may not make the program run much faster.
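
A small arithmetic sketch of the law; the 95% parallel fraction and the processor counts are illustrative only:

    def amdahl_speedup(parallel_fraction, processors):
        """Speedup predicted by Amdahl's law: S = 1 / ((1 - p) + p / N)."""
        serial_fraction = 1.0 - parallel_fraction
        return 1.0 / (serial_fraction + parallel_fraction / processors)

    # Even if 95% of the work parallelizes, speedup stays below 20x:
    for n in (2, 8, 64, 1024):
        print(n, round(amdahl_speedup(0.95, n), 2))
    # prints roughly: 2 1.9, 8 5.93, 64 15.42, 1024 19.64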

Amelioration pattern - a design pattern that describes how to go from a bad solution to a better one.

Antipattern - a frequently used, but largely ineffective solution to a problem. The term was originally used to refer to a design pattern gone wrong.

API (application programming interface) - a specific method prescribed by a computer operating system or by an application program by which a programmer writing an application program can make requests of the operating system or another application.

Application integration - the process of bringing data or a function from one application program together with that of another application program. Where these programs already exist, the process is sometimes realized by using middleware.

Application program - a program designed to perform a specific function directly for the user or, in some cases, for another application program.

Aspect-oriented programming (AOP) - an approach to programming that allows global properties of a program to determine how it is compiled into an executable program.
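
Full AOP frameworks (AspectJ, for example) work with aspects, join points, and weaving; as a rough analogue only, a Python decorator can apply a cross-cutting concern such as logging without editing the business code it wraps (the transfer function is hypothetical):

    import functools
    import logging

    logging.basicConfig(level=logging.INFO)

    def logged(func):
        """Cross-cutting logging concern applied from outside the function."""
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            logging.info("calling %s args=%r kwargs=%r", func.__name__, args, kwargs)
            result = func(*args, **kwargs)
            logging.info("%s returned %r", func.__name__, result)
            return result
        return wrapper

    @logged
    def transfer(amount, src, dst):   # hypothetical business operation
        return f"moved {amount} from {src} to {dst}"

    transfer(100, "savings", "checking")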

Best practice - a technique or methodology that, through experience and research, has proven to reliably lead to a desired result.

Bug - a coding error in a computer program.

Build - a version of a program, usually pre-release, and identified by a build number, rather than by a release number. As a verb, to build can mean either to write code or to put individual coded components of a program together.

Build tool - a programming utility that is used when building a new version of a program.

Capability Maturity Model - a methodology used to develop and refine an organization's software development process. The model describes a five-level evolutionary path of increasingly organized and systematically more mature processes.

Data modeling - the analysis of data objects that are used in a business or other context and the identification of the relationships among these data objects.

Debugging - the process of locating and fixing or bypassing bugs (errors) in computer program code or the engineering of a hardware device.

Design pattern - a written document that describes a general solution to a design problem that recurs repeatedly in many projects.

Development environment - the set of processes and programming tools used to create the program or software product.

Development process - a set of tasks performed for a given purpose in a software development project.

Driver - a program that interacts with a particular device or special kind of software. The driver contains special knowledge of the device or special software interface that programs using the driver do not.

Driver development kit (DDK) - a set of programs and related files that are used to develop a new software or hardware driver or to update an existing legacy application driver for an operating system.

Elegant solution - a solution in which the maximum desired effect is achieved with the smallest, or simplest effort.

Embedded systems programming - the programming of an embedded system in some device using the permitted programming interfaces provided by that system.

Enterprise application integration - the plans, methods, and tools aimed at modernizing, consolidating, and coordinating the computer applications in an enterprise.

Entity-relationship diagram - a data modeling technique that creates a graphical representation of the entities, and the relationships between entities, within an information system.

Ergonomics - the science of refining the design of products to optimize them for human use. Human characteristics, such as height, weight, and proportions are considered, as well as information about human hearing, sight, temperature preferences, and so on.

Exploratory model - a systems development method that consists of planning and trying different designs until one of them seems to be the right one to develop.

Extreme Programming - a pragmatic approach to program development that emphasizes business results first, and takes an incremental, get-something-started approach to building the product, using continual testing and revision.

Feature creep - a tendency for product or project requirements to increase during development beyond those originally foreseen, leading to features that weren't originally planned and resulting risk to product quality or schedule.

Functional programming - a style of programming that emphasizes the evaluation of expressions rather than the execution of commands.
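
A small illustration of the expression-oriented style next to command-style mutation; the numbers are arbitrary:

    # Command style: a sequence of statements mutating a variable.
    total = 0
    for n in [1, 2, 3, 4]:
        if n % 2 == 0:
            total += n * n

    # Functional style: the same result as the value of a single expression.
    total_fp = sum(n * n for n in [1, 2, 3, 4] if n % 2 == 0)

    assert total == total_fp == 20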

Functional specification - a formal document used to describe in detail for software developers a product's intended capabilities, appearance, and interactions with users.

Gantt chart - a horizontal bar chart frequently used in project management that provides a graphical illustration of a schedule that helps to plan, coordinate, and track specific tasks in a project.

Gap analysis - the study of the differences between two different information systems or applications, often for the purpose of determining how to get from one state to a new state. Sometimes spoken of as "the space between where we are and where we want to be."

Genetic programming - a model of programming that uses the ideas of biological evolution to handle a complex problem; it is most appropriate for problems involving a large number of fluctuating variables, such as those related to artificial intelligence.

Gold code - the final, ready-to-manufacture (that is, replicate onto media) version of the software.

Help system - a documentation component of a software program that explains the features of the program and helps the user understand its capabilities.

Hotfix - code (sometimes called a patch) that fixes a bug in a product.

Human factors - the study of how humans behave physically and psychologically in relation to particular environments, products, or services.

Information architecture - the set of ideas about how all information in a given context should be treated philosophically and, in a general way, how it should be organized; this is expressed in an information architecture document.

Information design - the detailed planning of specific information that is to be provided to a particular audience to meet specific objectives. In one hierarchical model, the information design follows the information architecture and information planning stages.

Integrated development environment - a programming environment that has been packaged as an application program, typically consisting of a code editor, a compiler, a debugger, and a GUI builder.

ISV (independent software vendor) - a company that makes and sells software products that run on one or more computer hardware or operating system platforms.

Iterative - describes a heuristic planning and development process where an application is developed in small sections called iterations.

ITIL - a set of best practices standards for information technology (IT) service management developed by the United Kingdom's Central Computer and Telecommunications Agency (CCTA).

Joint application development - a methodology that involves the client or end user in the design and development of an application, through a succession of collaborative workshops called JAD sessions.

KISS Principle (Keep It Simple, Stupid) - the principle that people want products that are easy to learn and use, and that companies realize time and cost benefits by producing such products.

KLOC (thousands of lines of code) - a traditional measure of how large a computer program is or how long or how many people it will take to write it, sometimes used as a rough measure of programmer productivity.

Lean programming - a concept that emphasizes optimizing efficiency and minimizing waste in the development of a computer program; the concept is also applicable to all enterprise practices.

Legacy application - an enterprise application that is based on languages, platforms, and/or techniques that predate current technology.

Metric - the measurement of a particular characteristic of a program's performance or efficiency.

Object-oriented programming - a programming model organized around objects rather than actions, and data rather than logic; it is based on the idea that what we really care about are the objects we want to manipulate rather than the logic required to manipulate them.
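
A minimal sketch of organizing code around an object; the class and its methods are invented for illustration:

    class BankAccount:
        """The data (balance) and the logic that manipulates it live together."""

        def __init__(self, owner, balance=0.0):
            self.owner = owner
            self.balance = balance

        def deposit(self, amount):
            self.balance += amount

        def withdraw(self, amount):
            if amount > self.balance:
                raise ValueError("insufficient funds")
            self.balance -= amount

    acct = BankAccount("Alice", 50.0)
    acct.deposit(25.0)
    acct.withdraw(30.0)
    print(acct.balance)   # 45.0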

Open source - describes a program whose source code is made available for use or modification as users or other developers see fit.

Outsourcing - an arrangement in which one company provides services for another company that could also be, or usually have been, provided in-house.

Pasta Theory of Programming - the idea that various programming structures can be likened to the structures of well-known pasta dishes: unstructured procedural programming is called spaghetti code, structured programming is called lasagna code, and object-oriented programming is called ravioli code.

Patch - a quick-repair job for the problems in a piece of programming, often available for download through the software maker's Web site.

Pattern - see design pattern

Peer review - a process used for checking the work performed by one's equals (peers) to ensure it meets specific criteria.

PERT chart (Program Evaluation Review Technique) - a project management tool, developed by the U.S. Navy in the 1950s, used to schedule, organize, and coordinate tasks within a project.

Polymorphism - from the Greek meaning "having multiple forms," the characteristic of being able to assign a different meaning or usage to something in different contexts - specifically, to allow an entity such as a variable, a function, or an object to have more than one form.
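
A short sketch: the same call, area(), takes a different form depending on the object's class (the shapes are illustrative):

    import math

    class Circle:
        def __init__(self, radius):
            self.radius = radius
        def area(self):
            return math.pi * self.radius ** 2

    class Rectangle:
        def __init__(self, width, height):
            self.width, self.height = width, height
        def area(self):
            return self.width * self.height

    # One piece of code handles many forms of "shape".
    for shape in (Circle(1.0), Rectangle(2.0, 3.0)):
        print(type(shape).__name__, shape.area())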

Portability - a characteristic attributed to a computer program if it can be used in operating systems other than the one in which it was created without requiring major rework.

PRINCE2 - a project management methodology developed by the government of the United Kingdom that makes use of the best proven practices from a variety of industries and backgrounds.

Program layer - a separate functional component that interacts with others in some sequential and hierarchical way, with each layer usually having an interface only to the layer above it and the layer below it.

Project planning - a discipline for stating how to complete a project within a certain timeframe, usually with defined stages, and with designated resources.

Prototyping - a systems development method (SDM) in which a prototype (an early approximation of a final system or product) is built, tested, and then reworked as necessary until an acceptable prototype is achieved, from which the complete system or product can then be developed.

Pseudo code (pronounced SOO-doh-kohd) - a detailed yet readable description of what a computer program or algorithm must do, expressed in a formally-styled natural language rather than in a programming language.

Rapid application development (RAD) - an approach based on the concept that products can be developed faster and of higher quality through: gathering requirements using workshops or focus groups; prototyping and early, reiterative user testing of designs; reusing software components; and using less formality in communication documents, such as reviews.

Rational Unified Process (RUP) - an object-oriented and Web-enabled program development methodology that is said to be like an online mentor that provides guidelines, templates, and examples for all aspects and stages of program development.

Refactoring - a process that improves the internal structure of a software system without changing its external behavior.
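
A tiny illustrative before-and-after (the function is hypothetical): the external behavior, the returned total, is unchanged, while the duplicated discount logic is pulled into one place:

    # Before: discount logic repeated inside the loop.
    def invoice_total_before(items, is_member):
        total = 0.0
        for price, qty in items:
            if is_member:
                total += price * qty * 0.9
            else:
                total += price * qty
        return total

    # After: same observable behavior, clearer internal structure.
    def invoice_total_after(items, is_member):
        discount = 0.9 if is_member else 1.0
        return sum(price * qty for price, qty in items) * discount

    items = [(10.0, 2), (5.0, 4)]
    assert invoice_total_before(items, True) == invoice_total_after(items, True)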

Regression testing - the process of testing changes to computer programs to make sure that the older programming still works with the new changes.

Risk management - the process of planning, organizing, leading, and controlling the activities of an organization in order to minimize the effects of risk on an organization's capital and earnings.

ROI (return on investment) - for a given use of money in an enterprise, the amount of profit or cost saving realized.

Runtime - when a program is running.

SDK (software development kit) - a set of programs used by a computer programmer to write application programs.

Service pack - an orderable or downloadable update to a customer's software that fixes existing problems and, in some cases, delivers product enhancements.

Shotgun debugging - the debugging of a program, hardware, or system problem using the approach of trying several possible solutions at the same time in the hope that one of them will work.

Smoke testing - non-exhaustive software testing, ascertaining that the most crucial functions of a program work, but not bothering with finer details.

Spaghetti code - computer programming that is unnecessarily convoluted and particularly programming code that uses frequent branching from one section of code to another.

Spiral model - a systems development method (SDM) that combines the features of the prototyping model and the waterfall model.

SSADM (Structured Systems Analysis & Design Method) - a widely-used computer application development method in the UK that divides an application development project into modules, stages, steps, and tasks, and provides a framework for describing projects in a fashion suited to managing the project.

Structured programming - a subset of procedural programming that enforces a logical structure on the program being written to make it more efficient and easier to understand and modify.

Synchronize-and-stabilize - a systems development life cycle model in which teams work in parallel on individual application modules, frequently synchronizing their code with that of other teams, and debugging (stabilizing) code regularly throughout the development process.

Systems development method (SDM) - a work discipline that is chosen by the developers of a computer system or product as a way to ensure successful results.

Systems development life cycle model (SDLC) - one of a number of structured approaches to information system development, created to guide all the processes involved, from an initial feasibility study through maintenance of the completed application. Models include the waterfall model; rapid application development (RAD); joint application development (JAD); the fountain model; the spiral model; build and fix; and synchronize-and-stabilize.

Systems thinking - a holistic approach to analysis that focuses on the way that a system's constituent parts interrelate and how systems work over time and within the context of larger systems.

TCO (total cost of ownership) - a type of calculation designed to help consumers and enterprise managers assess both direct and indirect costs and benefits related to the purchase of any IT component.

Tool Kit (Tk) - a companion program to Tool Command Language (Tcl) for creating graphical user interfaces. Together with Tcl, Tk is a rapid program development tool.

User acceptance testing - a phase of software development in which the software is tested in the "real world" by the intended audience.

User interface - everything designed into an information device with which a human being may interact -- including display screen, keyboard, mouse, light pen, the appearance of a desktop, illuminated characters, help messages, and how an application program or a Web site invites interaction and responds to it.

Utility - a small program that provides an addition to the capabilities provided by the operating system.

Waterfall model - popular version of the systems development life cycle model that describes a linear and sequential development method.

Web services - services made available from a business's Web server for Web users or other Web-connected programs.

Write-only code - programming code that is hard to read.


Application: Software programs, such as word processors and spreadsheets that most users use to do work on a computer.


Application Server:
Also called an Appserver. A program that handles all application operations between users and an organization's backend business applications or databases. Application servers are typically used for complex transaction-based applications. To support high-end needs, an application server has to have built-in redundancy, monitors for high availability, high-performance distributed application services and support for complex database access.


ASP:
Active Server Pages; a specification for a dynamically created Web page with an .asp extension that utilizes ActiveX scripting -- usually VBScript or JScript code. When a browser requests an ASP, the Web server generates a page with HTML code and sends it back to the browser. ASPs are therefore similar to CGI scripts, but they enable Visual Basic programmers to work with familiar tools.

 

Bandwidth: The amount of information or data that can be sent over a network connection in a given period of time. Bandwidth is usually stated in bits per second (bps), kilobits per second (Kbps), or megabits per second (Mbps).

 

Backup: To create a copy of data as a precaution against the loss or damage of the original data. Most users backup some of their files, and many computer networks utilize automatic backup software to make regular copies of some or all of the data on the network. Some backup systems use digital audio tape (DAT) as a storage medium.

 

Backup Data: Backup Data is information that is not presently in use by an organization and is routinely stored separately upon portable media, to free up space and permit data recovery in the event of disaster.

 

CD-ROM: Data storage medium that uses compact discs, each holding roughly 650-700 MB of data (on the order of 1,500 older floppy discs' worth).

 

Desktop: Usually refers to an individual PC - a user's desktop computer.

 

Data: Information stored on the computer system, used by applications to accomplish tasks.

 

Data Cleansing: Also referred to as data scrubbing, the act of detecting and removing and/or correcting a database's dirty data (i.e., data that is incorrect, out-of-date, redundant, incomplete, or formatted incorrectly). The goal of data cleansing is not just to clean up the data in a database but also to bring consistency to different sets of data that have been merged from separate databases. Sophisticated software applications are available to clean a database's data using algorithms, rules, and look-up tables, a task that was once done manually and was therefore subject to human error.
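
A minimal sketch of rule- and lookup-table-based cleansing; the field names, rules, and lookup values are assumptions made for illustration:

    # Hypothetical lookup table mapping inconsistent country spellings to one form.
    COUNTRY_LOOKUP = {"usa": "United States", "u.s.a.": "United States",
                      "uk": "United Kingdom"}

    def clean_record(record):
        """Apply simple cleansing rules to one customer record (a dict)."""
        cleaned = dict(record)
        cleaned["name"] = record.get("name", "").strip().title()
        country = record.get("country", "").strip().lower()
        cleaned["country"] = COUNTRY_LOOKUP.get(country, country.title())
        # Rule: null out obviously invalid email addresses instead of keeping dirty data.
        if "@" not in record.get("email", ""):
            cleaned["email"] = None
        return cleaned

    print(clean_record({"name": "  ada LOVELACE ", "country": "USA", "email": "ada"}))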

 

Data Migration:

(1) The process of translating data from one format to another. Data migration is necessary when an organization decides to use a new computing system or database management system that is incompatible with the current system. Typically, data migration is performed by a set of customized programs or scripts that automatically transfer the data.

(2) The process of moving data from one storage device to another.

 

Electronic Mail: Electronic Mail, commonly referred to as e-mail, is an electronic means for communicating information under specified conditions, generally in the form of text messages, through systems that will send, store, process, and receive information and in which messages are held in storage until the addressee accesses them.

 

Encryption: A procedure that renders the contents of a message or file unintelligible to anyone not authorized to read it.

 

ETL: Short for extract, transform, load, three database functions that are combined into one tool to pull data out of one database and place it into another database.

Extract -- the process of reading data from a database.

Transform -- the process of converting the extracted data from its previous form into the form it needs to be in so that it can be placed into another database. Transformation occurs by using rules or lookup tables or by combining the data with other data.

Load -- the process of writing the data into the target database.

ETL is used to migrate data from one database to another, to form data marts and data warehouses and also to convert databases from one format or type to another.
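
A compact sketch of the three steps, using SQLite files as stand-ins for the source and target databases; the table and column names are assumptions:

    import sqlite3

    def etl(source_path, target_path):
        # Extract: read rows from the source database.
        with sqlite3.connect(source_path) as src:
            rows = src.execute("SELECT id, name, amount_cents FROM sales").fetchall()

        # Transform: normalize names and convert cents to a decimal amount.
        transformed = [(row_id, name.strip().title(), cents / 100.0)
                       for row_id, name, cents in rows]

        # Load: write the reshaped rows into the target database.
        with sqlite3.connect(target_path) as tgt:
            tgt.execute("CREATE TABLE IF NOT EXISTS sales_clean "
                        "(id INTEGER PRIMARY KEY, name TEXT, amount REAL)")
            tgt.executemany("INSERT OR REPLACE INTO sales_clean VALUES (?, ?, ?)",
                            transformed)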

 

Firewall: A set of related programs that protect the resources of a private network from users from other networks.

 

Hard disk: A data storage device that may be found inside a desktop or laptop as the internal hard drive, or in a transportable external form attached to a desktop or laptop.

 

Hard drive: The primary storage unit on PCs, consisting of one or more magnetic media platters on which digital data can be written and erased magnetically.

 

HTML (Hypertext Markup Language): The tag-based ASCII language used to create pages on the web.

 

Internet: An interconnecting global public network made by connecting smaller shared public networks. The most well-known such internetwork is the Internet, the worldwide network of networks that uses the TCP/IP protocol to facilitate information exchange.

 

Intranet: A network of interconnected smaller private networks that are isolated from the public Internet.

 

IT: Short for Information Technology, and pronounced as separate letters, the broad subject concerned with all aspects of managing and processing information, especially within a large organization or company. Because computers are central to information management, computer departments within companies and universities are often called IT departments. Some companies refer to this department as IS (Information Services) or MIS (Management Information Services).

 

LAN (Local Area Network): Usually refers to a network of computers in a single building or other discrete location.

 

Migrated data: Migrated data is information that has been moved from one database or format to another, usually as a result of a change from one hardware or software technology to another.

 

Mirroring: The duplication of data for purposes of backup or to distribute network traffic among several computers with identical data.

 

MIS: Short for Management Information Systems (or management information services) and pronounced as separate letters, a general term for the computer systems in an enterprise that provide information about its business operations; the term is also used to refer to the people who manage these systems. Typically, in a large corporation, "MIS" or the "MIS department" refers to a central or centrally coordinated system of computer expertise and management, often including mainframe systems but also, by extension, the corporation's entire network of computer resources.

In the beginning, business computers were used for the practical business of computing the payroll and keeping track of accounts payable and receivable. As applications were developed that provided managers with information about sales, inventories, and other data that would help in managing the enterprise, the term "MIS" arose to describe these kinds of applications. Today, the term is used broadly in a number of contexts and includes (but is not limited to) decision support systems, resource and people management applications, project management, and database retrieval applications.

More broadly, an MIS is a computer-based system that provides managers with the tools for organizing, evaluating, and efficiently running their departments. In order to provide past, present, and predictive information, an MIS can include software that helps in decision making, data resources such as databases, the hardware resources of a system, decision support systems, people management and project management applications, and any computerized processes that enable the department to run efficiently. Within companies and large organizations, the department responsible for computer systems is sometimes called the MIS department; other names for MIS include IS (Information Services) and IT (Information Technology). In short, an MIS is a set of interrelated components that collect (retrieve), process, store, and distribute information to support decision making and control in an organization, concerning both the management of information technology and the use of information technology for managerial and organizational purposes.

 

Modem: A piece of hardware that lets a computer talk to another computer over a phone line.

 

MS SQL: Microsoft SQL Server is a database management system used to store large amounts of data and manipulate them securely using a variety of tools.

 

Network: A group of computers or devices that is connected together for the exchange of data and sharing of resources.

 

Operating System (OS): The software that the rest of the software depends on to make the computer functional. On most PCs this is Windows or the Macintosh OS. UNIX and Linux are other operating systems often found in scientific and technical environments.

 

ODBC: Short for Open Database Connectivity, a standard database access method developed by the SQL Access group in 1992. The goal of ODBC is to make it possible to access any data from any application, regardless of which database management system (DBMS) is handling the data. ODBC manages this by inserting a middle layer, called a database driver, between an application and the DBMS. The purpose of this layer is to translate the application's data queries into commands that the DBMS understands. For this to work, both the application and the DBMS must be ODBC-compliant -- that is, the application must be capable of issuing ODBC commands and the DBMS must be capable of responding to them.
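
A hedged sketch using the third-party pyodbc package (one of several ODBC bridges for Python); the DSN, credentials, table, and column names are placeholders, not part of any real system:

    import pyodbc  # third-party ODBC bridge

    # The same code works against any ODBC-compliant DBMS; only the
    # connection string (and the driver behind it) changes.
    conn = pyodbc.connect("DSN=sales_dsn;UID=report_user;PWD=secret")
    cursor = conn.cursor()
    cursor.execute("SELECT id, name FROM customers")   # assumed table
    for row in cursor.fetchall():
        print(row.id, row.name)
    conn.close()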

 

PC: Personal computer.

 

RAM (Random Access Memory): The working memory of the computer into which application programs can be loaded and executed.

 

Router: A piece of hardware that routes data between networks, for example from a local area network (LAN) to the line that connects it to the Internet.

Report: A formatted and organized presentation of data. Most database management systems include a report writer that enables you to design and generate reports.

 

Server: Any computer on a network that contains data or applications shared by users of the network on their client PCs.

 

Software: Coded instructions (programs) that make a computer do useful work.

 

System administrator: (sysadmin, sysop) the person in charge of keeping a network working.

 

SQL: Abbreviation of structured query language, and pronounced either see-kwell or as separate letters. SQL is a standardized query language for requesting information from a database. The original version called SEQUEL (structured English query language) was designed by an IBM research center in 1974 and 1975. SQL was first introduced as a commercial database system in 1979 by Oracle Corporation.

Historically, SQL has been the favorite query language for database management systems running on minicomputers and mainframes. Increasingly, however, SQL is being supported by PC database systems because it supports distributed databases (databases that are spread out over several computer systems). This enables several users on a local-area network to access the same database simultaneously. Although there are different dialects of SQL, it is nevertheless the closest thing to a standard query language that currently exists. In 1986, ANSI approved a rudimentary version of SQL as the official standard, but most versions of SQL since then have included many extensions to the ANSI standard. ANSI has updated the standard several times since, most notably with the SQL-92 revision in 1992.
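
A short example of a standard SQL query, run here through Python's built-in sqlite3 module so the example is self-contained (the table and data are invented):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE employees (name TEXT, dept TEXT, salary REAL)")
    conn.executemany("INSERT INTO employees VALUES (?, ?, ?)",
                     [("Ann", "IT", 70000), ("Bob", "IT", 62000), ("Cy", "HR", 58000)])

    # A standard SQL query: average salary by department.
    for dept, avg_salary in conn.execute(
            "SELECT dept, AVG(salary) FROM employees GROUP BY dept ORDER BY dept"):
        print(dept, avg_salary)
    # HR 58000.0
    # IT 66000.0
    conn.close()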

 

VPN (Virtual Private Network): A private network that is constructed by using public wires to connect nodes.

 

World Wide Web: The WWW is made up of all of the computers on the Internet that use HTML-capable software (browsers such as Netscape or Internet Explorer) to exchange data. Data exchange on the WWW is characterized by easy-to-use graphical interfaces, hypertext links, images, and sound. Today the WWW has become synonymous with the Internet, although technically it is really just one component.

 

WAN: Short for Wide Area Network. A computer network that spans a relatively large geographical area. Typically, a WAN consists of two or more local-area networks (LANs).

Computers connected to a wide-area network are often connected through public networks, such as the telephone system. They can also be connected through leased lines or satellites.

The largest WAN in existence is the Internet.


Project Management Terminology

 

Initiation

The purpose of this procedure is to initiate the project and identify the higher-level information needed for the project management plan, based on the information provided in the Statement of Work (SOW) document or any equivalent documentation from the pre-sales phase. In this phase, the standards by which the project's work products are to be developed, the Life Cycle Model (LCM) for software development, and the tools to be used on the project are also identified.

 

Estimation

The purpose of this procedure is to identify project estimates, including size, effort, and cost. These estimates are used to develop and define the project schedule, the project team organization, and non-human resources.

 

Risk Planning

The purpose of this procedure is to identify the project's risks, assess and assign a priority to each risk, and finally plan for mitigation and contingencies.

 

Finalize Project Plan

The purpose of this procedure is to consolidate the project master plan with the product development plan, quality assurance plan, configuration management plan, peer review plan, and estimation data. After the consolidation, the procedure defines the tracking activities during project execution and the responsibilities and assignments for those activities. Obtaining commitment from the project team and senior management is as important as the project plan itself; these commitments are obtained from all relevant stakeholders in this procedure.

 

Monitor and Control

The purpose of this procedure is to monitor, collect, and analyze all project data. The aim of the monitoring activities is to determine whether the project is on track, to take corrective actions when the project deviates significantly from the Project Management Plan, and finally, to communicate the project status to management, the team, and the customer.

 

Closure

The purpose of the project closure procedure is to ensure a graceful exit for the project, collect closure acceptance from the customer, discuss the closure conditions with the project team and senior management, and record the experience gained in practice in order to guide the enactment of future projects. The project closure procedure may involve data analysis to suggest process improvements and lessons learned.

 

Product Development Terminology

 

Requirements Planning

The objective of the requirements planning procedure is to guide the enactment of the requirements elicitation process. The procedure involves establishing an agreement among the project’s stakeholders on who will do what, and by when the process tasks should be accomplished.

 

Requirements Elicitation

The objective of the elicitation procedure is to discover and capture candidate software requirements (both functional and non-functional) by communicating with the customer and/or end users and others who have a stake in the system development. There are several techniques for eliciting requirements, including brainstorming, interviews, questionnaires, and focus groups.

 

Requirements Analysis

The objective of the analysis procedure is to further understand the requirements to resolve conflicts and inconsistencies and to ensure that they meet the required quality attributes and reflect the customer needs. The procedure also involves negotiation among stakeholders to agree on a set of requirements. Tasks of this procedure will likely be repeated several times until an agreement is reached.

 

Requirements Development

The objective of the requirements development procedure is to transform identified requirements into a formal software requirement specification document. The result of the formalization procedure is a document, Software Requirements Specifications (SRS), which is used to communicate requirements among all stakeholders.

 

Requirements Validation

The objective of the validation procedure is to ensure that the developed SRS reflects the customer requirements. The process involves communicating SRS to all stakeholders and facilitating agreement among them.

 

Requirements Acceptance

The objective of the acceptance process is to confirm that the baseline requirements reflect the project’s acceptance criteria. This procedure can also be used as a milestone to report progress to the customer and senior management.

Requirements Administration

The objective of the requirements administration process is to ensure that all requirements are traceable and under control. The procedure mainly involves administering and maintaining the requirements database, in addition to maintaining requirements traceability. This procedure is needed when any changes to the approved SRS occur. The change may be a modification to an existing requirement or the addition of a new one.

 

Development Planning

The objective of this process is to establish a reasonable plan for performing development activities and to involve development in the project from the initial stages, which provides a strong infrastructure for the project's success. The product development team will share the planning of the project with the other stakeholders. Selecting the appropriate software development life cycle is an important step. The project deliverables and the estimates of product size and effort will be shared with other team members.

 

Architecture Designing

The objective of the design process is to develop a coherent, well-organized representation of the software product that meets the customer’s requirements and satisfies the predefined quality criteria. The process comprises the architectural design, which will be followed by the detailed design in the next procedure. Architectural design provides the infrastructure for this detailed design. The importance of software design can be summed up in the phrase ‘quality design is the place where quality is fostered in software engineering’. Design is an iterative process through which requirements are translated into a ‘blueprint’ for constructing the software.

 

Detailed Designing

Architectural design and detailed design are usually carried out in sequence because detailed design is largely dependent on the architectural design. Detailed design provides the basis for the product implementation.

 

Implementation

The objective of the implementation procedure is to transform the detailed design representation into a programming-language realization by applying the appropriate coding standard, and to develop the required product documentation to support the coded product. The code will be grouped into units (as dictated by the selected language and design information). All units shall be transformed into executable code to be debugged. Incorrect code and other product components will be reworked until they run free of errors.

Unit Testing Preparation and Execution

The unit test is a procedure used to validate that a particular module of source code is working properly. The procedure is to write test cases for all functions and methods so that whenever a change causes a regression, it can be quickly identified and fixed. Ideally, each test case is separate from the others. This type of testing is mostly done by the developer/tester and not by end-users. The goal of unit testing is to isolate each part of the program and show that the individual parts are correct.
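
A minimal unit test written with Python's built-in unittest module; the function under test is hypothetical:

    import unittest

    def apply_discount(price, percent):
        """Hypothetical unit under test."""
        if not 0 <= percent <= 100:
            raise ValueError("percent out of range")
        return round(price * (1 - percent / 100), 2)

    class ApplyDiscountTests(unittest.TestCase):
        def test_normal_discount(self):
            self.assertEqual(apply_discount(200.0, 25), 150.0)

        def test_invalid_percent_rejected(self):
            with self.assertRaises(ValueError):
                apply_discount(100.0, 150)

    if __name__ == "__main__":
        unittest.main()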

 

Integration Testing Preparation and Execution

Integration testing is the phase of software testing in which individual software modules are combined and tested as a group. It follows unit testing and precedes system testing. Integration testing takes as its input ‘modules’ that have been checked out by unit testing, groups them into larger aggregates, applies tests defined in an integration test plan to those aggregates, and delivers them as an integrated system that is ready for system testing.
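
As with the unit test sketch above, an integration test exercises two already unit-tested pieces working together; both functions here are hypothetical stand-ins for separate modules:

    import unittest

    def load_prices():                      # stand-in for a data-access module
        return {"book": 20.0, "pen": 2.0}

    def invoice(cart, prices):              # stand-in for a pricing module
        return sum(prices[item] * qty for item, qty in cart.items())

    class CheckoutIntegrationTest(unittest.TestCase):
        def test_modules_work_together(self):
            prices = load_prices()
            self.assertEqual(invoice({"book": 2, "pen": 5}, prices), 50.0)

    if __name__ == "__main__":
        unittest.main()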

 

System Testing Preparation and Execution

System testing is testing conducted on a complete, integrated system to evaluate the system's compliance with its specified requirements. System testing is the first time that the entire system can be tested against the functional and non-functional requirements. System testing is intended to test up to and beyond the bounds defined in the software/hardware requirements specifications.

 

Acceptance Testing Preparation and Execution

The acceptance test is jointly performed by users or sponsors with manufacturers or producers through black-box testing (that is, testers need not know anything about the internal workings of the system). The results will determine acceptance of the system. Acceptance tests generally take the form of a suite of tests designed to be executed on the completed system. Each individual test, known as a case, exercises a particular operating condition of the user's environment or feature of the system, and will result in a pass or fail Boolean outcome. The objective is to provide confidence that the delivered system meets the business requirements of both sponsors and users.

 

Product Releasing

A software release refers to the creation and availability of a new version of a software product. Each time a software program has major changes, the project team should decide how to distribute the changes or the changed system to the customer. The release procedure is concerned with the compilation, assembly, and delivery of source code and any related documentation into finished products or other software components.

 

Peer Review Terminology

 

Planning

The planning phase enables the identification of the work products to be reviewed, the method to be used to perform the review, and the requirements to be satisfied by each selected work product.

Execution

The execution of peer review involves scheduling, preparing, and executing peer review meetings.

Reworking

The reworking procedure will give the author the opportunity to rework the work product and resolve all raised issues in the review meeting. This procedure is optional.

Follow-up

In the follow-up procedure, corrections of all defects planned for rework are verified. The performers confirm that all open issues have been resolved and all redline errors have been corrected, and then close out the review.

 

 

Quality Assurance Terminology

 

Planning

The objective of this procedure is to establish a reasonable plan for performing QA auditing activities at the project level and to involve QA in the project from the initial stages, which provides a strong infrastructure for the project's success.

 

Execution

The objective of this procedure is to execute the QA auditing activities at the project level according to the QA plan. Auditing and evaluating project processes and work products are done to ensure adherence to the applicable process descriptions, standards, and procedures defined for the project's processes. Auditing ensures that all issues and/or deviations detected during audits are communicated to the relevant stakeholders and that corrective actions are identified and documented.

 

Follow-up

The objective of this procedure is to ensure the closure of all nonconformances (NCs) arising from the QA audit according to the agreed corrective actions, or to refer unclosed NCs to senior management so that the required and suitable action can be taken.

Configuration Management Terminology

 

 

Planning

The objective of the CM planning procedure is to develop the required guidance for the deployment of the configuration management procedures. This procedure involves the identification of the project configuration controller, developing the CM plan, and ensuring its integration with the overall project plan.

 

Establishing CM Environment

The purpose of this procedure is to establish the configuration management system including the storage media, system, and the tools for accessing the configuration system. The procedure will produce the required infrastructure for the overall development project.

 

 

Maintaining CM Environment

The purpose of this procedure is to maintain the configuration management system, including the storage media, system, and the tools for accessing the configuration system. The procedure maintains the required infrastructure for the overall development project.

 

Functional Configuration Audit

The purpose of this procedure is to guide the performance of the functional audit activities on the configuration management system, including the storage media, system, and the tools for accessing the configuration system. When performed, the procedure ensures the logical consistency of the contents of the configuration management system and approves the readiness for performing the baselining. This audit is not related to the quality assurance audit: the quality assurance audit seeks compliance with predefined standards, while this audit seeks integrity and compliance with the requirements specification.

Physical Configuration Audit

The purpose of this procedure is to guide the physical audit activities on the configuration management system, including the storage media, system, and the tools for accessing the configuration system. When performed, the procedure ensures the physical existence and consistency of the contents of the configuration management system and approves the readiness for performing the baselining. Unlike the functional audit, this procedure is not only performed before the baselining but can also be performed at any time to ensure the required level of integrity.

 

Baselining

The purpose of this procedure is to produce baselines from the identified configuration items (CIs).

When performed, the procedure ensures that the baselines exist and that their existence is announced; these baselines are considered a starting point for further development phases in the product development life cycle. The quality of a baseline is completely dependent on the functional and physical audits performed before the baselining.

 

Change Control

The purpose of this procedure is to guide the performance of safe changes to any stable work product. The change control procedure starts by raising a change request, which is then evaluated, implemented, and verified. The impact of these changes should be estimated and then evaluated at the end of the project, during the project closure phase.

 

Quality Management System (QMS)