DEX testing
Date: 2010/02/10 16:56:25 Revision: 1.8
This document focuses on the need to establish a framework for testing that supports
the overall objectives of the PLCS Technical Committee of OASIS. This applies to
several targets,
- Testing applied to the Capabilities used in the DEXs.
- Testing applied to the DEXs as deliverables from the TC.
- Testing applied to example Data sets that are made available to
support formal documents.
- Testing of software implementations.
Testing applied to capabilities takes the form of a quality check. This quality
check relates to the definition of a complete capability, which comprises,
- QC internal to the team,
    - Completed review.
- Complete all sections according to PLCS/771 - "Project Specification for
DEX Development".
- Checked by second modeller.
- Accepted by teams using the capability.
- User guidance/ Documentation,
- The prime readers and users of the capabilities are the modellers and
  implementers.
- Business experts in the development teams need sufficient understanding
to confirm that business requirements are met. Hence the documentation of
each capability should comprise a short business focused overview.
- Instantiation diagram complete.
- XML is semantically correct,
- Checked by second modeller.
- Sign off by team leader.
- QC external to the team
- Accepted by teams using the capability.
- Sign off by DEX coordinator.
When DEXs using the capability have reached Level 2 and Level 3, higher levels
of quality apply to that capability. This should be reflected in a master
listing of the capabilities.
The testing of DEXs as deliverables relates to the definition of a completed DEX.
The completeness of a DEX is divided into four (4) levels,
Level 1 - Drafting the DEX documentation
- Completed all sections according to PLCS/771 -
"Project Specification for DEX Development".
- Reference data.
    - Normative Reference data established and mandatory classes identified.
    - Provide examples for each of the other class types.
- The DEX should contain documentation on how the Reference data is
implemented in the DEX.
- Accepted by the team. Sign off by team leader.
Level 2 - Testing by mapping
- The DEX should contain the longform needed to implement the DEX.
- The DEX should contain the DEX Reference data.
- Generated exchange file (Part 21 or Part 28) and Reference Data
based on industrial data.
- Update DEX and capabilities according to lessons learned.
- Submit DEX for Committee Draft ballot (full membership of the PLCS TC).
Level 3 - Testing by exchange
- Establish a test data set and Reference Data (Extending 'bike data set').
- Exchange tested between two or more systems
(systems may be based on the same application).
- Verify the DEX in a Business Context.
- Update DEX and capabilities according to lessons learned.
Level 4 - Published
- At least 3 implementations.
- Submit for OASIS Standard ballot (entire OASIS).
The availability of test data sets is vital in establishing a critical mass of
software implementations. It is important that such data sets are published in
support of the formal DEX documentation and developed in conjunction with the
DEX. (The development of test data sets provides necessary feedback on the DEX
and its documentation). Data sets may be presented in a variety of formats,
- ISO 10303-21.
- XML according to the bindings in ISO 10303-28:2003.
- XML according to the XML Schema binding (in preparation).
It may be preferable to have the same data set available in more than one format.
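As an illustration of the first format, a minimal ISO 10303-21 ("Part 21")
exchange file has the overall shape below. The header values and the schema
name are placeholders for illustration only, not taken from a real DEX.

```text
ISO-10303-21;
HEADER;
FILE_DESCRIPTION(('Example data set'),'2;1');
FILE_NAME('example.stp','2010-02-10T16:56:25',('Author'),('Organisation'),
          'preprocessor','originating system','');
FILE_SCHEMA(('SCHEMA_NAME'));  /* placeholder schema name */
ENDSEC;
DATA;
/* entity instances, e.g. #1=... ; */
ENDSEC;
END-ISO-10303-21;
```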
It is anticipated that data sets will fall into two major categories,
- Simple or artificial - Simple or artificial data sets are those designed
primarily for the purposes of either documentation or testing. Such
artificial data sets may, of necessity, be hand-coded.
- Production - Production data sets will be based on "real life" data and
will have been written by a software implementation.
Independent of format, it is suggested that the following criteria be applied,
- Is the content of the data set within the scope of the DEX?
- Is there supporting documentation?
- Text description of content.
- Instance diagram (possibly only for simpler data sets).
- Supporting illustrations where appropriate.
- Is the data set syntactically correct according to the relevant format's rules?
- Does the data set properly correspond to the data model for the DEX?
- Does the data set properly correspond to the data model for AP239?
- Is the meta data defined in the data set (file header)?
- Where a data set has been created by a software system with import (read)
capability, has the data set been successfully re-imported?
(The so-called loop-back test).
- Has the data set successfully been processed by 2 or more implementations,
excluding the creating system? The implementations should themselves claim
to support the same DEX.
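The syntactic and meta data criteria above lend themselves to simple automated
checks. The following Python sketch is this document's own illustration for the
ISO 10303-21 format (the function name and the exact checks are assumptions,
not a defined PLCS tool); it tests only the outermost file structure and the
presence of the header entries, not full Part 21 syntax.

```python
def basic_part21_checks(text):
    """Return a list of failed checks for a Part 21 data set (empty = pass)."""
    failures = []
    stripped = text.strip()
    # Syntax criterion: the file must be bracketed by the Part 21 markers.
    if not stripped.startswith("ISO-10303-21;"):
        failures.append("missing ISO-10303-21; start marker")
    if not stripped.endswith("END-ISO-10303-21;"):
        failures.append("missing END-ISO-10303-21; end marker")
    # Both the header and the data section must be present.
    for section in ("HEADER;", "DATA;"):
        if section not in stripped:
            failures.append("missing %s section" % section)
    # Meta data criterion: the header should identify the file and its schema.
    for entry in ("FILE_DESCRIPTION", "FILE_NAME", "FILE_SCHEMA"):
        if entry not in stripped:
            failures.append("missing %s header entry" % entry)
    return failures
```

A real checker would go on to parse the instances in the DATA section and
validate them against the DEX's data model; this sketch covers only the first
two criteria.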
All of the above criteria merit further explanation and expansion. Data sets
that meet them should be made available through a version-controlled repository.
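One of the criteria above, the loop-back test, can be sketched as a small test
harness. In this illustrative Python sketch the exporter and importer are
stand-ins supplied by the caller (a real test would call the software
implementation's own export and import functions):

```python
import os
import tempfile

def loopback_test(export, reimport, model):
    """Export `model` to a file, re-import it, and compare with the original.

    `export(model, path)` and `reimport(path)` stand in for the
    implementation's write and read capabilities.
    """
    fd, path = tempfile.mkstemp(suffix=".stp")
    os.close(fd)
    try:
        export(model, path)
        recovered = reimport(path)
        # The re-imported data set should match what was exported.
        return recovered == model
    finally:
        os.remove(path)
```

For example, with a trivial JSON-based exporter and importer standing in for a
real system, `loopback_test` returns True when the round trip is lossless.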
The following types of testing could apply to software implementations,
- Conformance - does the implementation satisfy requirements defined in
the applicable standard?
- Interoperability - can different implementations exchange or share data?
- Robustness - How well does an implementation handle invalid data,
large data volumes, etc?
- Performance - How well does the implementation perform?
Of these, interoperability testing is the closest to the desired business
functionality. Conformance testing also merits further consideration in that
it can be used as the basis for a certification program, allowing vendors of
software implementations to support their claims. Given that, at the time of
writing, there are very few implementations that could claim conformance and
AP239 has yet to complete DIS balloting, it is reasonable to treat development
of testing processes for implementations as a lower priority.