ProSTEP iViP/VDA JT Application Benchmark
7th JT Application Benchmark
SHORT REPORT
Table of Contents

Disclaimer
Copyright
1 Introduction
2 Approach
   2.1 Four Steps
   2.2 Building Blocks
   2.3 Documentation
   2.4 Criteria
   2.5 Translation Quality Benchmark
      2.5.1 Structural Criteria
      2.5.2 Geometry Criteria
      2.5.3 Tessellation Criteria
      2.5.4 Attribute Criteria
   2.6 Application Performance Benchmark
      2.6.1 Translator Performance
      2.6.2 Viewer Performance
3 Testing
   3.1 Configuration and Settings
   3.2 Translation Quality Benchmark
      3.2.1 Tested Translators
      3.2.2 Test Models
      3.2.3 Testing Procedure
   3.3 Application Performance Benchmark
      3.3.1 Tested Translators
      3.3.2 Tested Viewing Applications
         3.3.2.1 Siemens PLM JT2Go
         3.3.2.2 TechniaTranscat LiteBox3D
         3.3.2.3 Siemens PLM Teamcenter Vis Base & Professional
      3.3.3 Test Models
      3.3.4 Testing Procedure
4 Results
   4.1 Translation Quality Benchmark Results
      4.1.1 CAD to JT and STEP AP242 XML Test Results
      4.1.2 JT and STEP AP242 XML to CAD Test Results
   4.2 Application Performance Benchmark Results
5 Summary and Outlook
6 Acknowledgements


Disclaimer

This document is a ProSTEP iViP / VDA Documentation. It is freely available for all ProSTEP iViP e. V. members and those of the VDA AK PLM. Anyone using these recommendations is responsible for ensuring that they are used correctly.

This Documentation gives due consideration to the prevailing state of the art at the time of publication. Anyone using PSI Documentations must assume responsibility for his or her actions and acts at their own risk. The ProSTEP iViP Association and the parties involved in drawing up the Documentation assume no liability whatsoever.

We request that anyone encountering an error or the possibility of an incorrect interpretation when using the Documentations contact the ProSTEP iViP Association ([email protected]) immediately so that any errors can be rectified.

Full or partial reproduction of tables or pictures is only allowed using the complete original underline, including a description of symbols.

For all full or partial reproductions, a deposit copy to ProSTEP iViP is required.

Copyright

I. All rights on this PSI Documentation, in particular the copyright rights of use and sale such as the right to duplicate, distribute or publish the Documentation, remain exclusively with the ProSTEP iViP Association and its members.

II. The PSI Documentation may be duplicated and distributed unchanged, for instance for use in the context of creating software or services.

III. It is not permitted to change or edit this PSI Documentation.

IV. A suitable notice indicating the copyright owner and the restrictions on use must always appear.

1 Introduction

JT has become a widely used format for product visualization during the product development process. The ProSTEP iViP Association and the German Association of the Automotive Industry (VDA) have launched three JT-related projects in succession, which are coordinated with each other: the ProSTEP iViP/VDA JT Workflow Forum, the ProSTEP iViP/VDA JT Application Benchmark and the ProSTEP iViP/VDA JT Implementor Forum. In August 2010, the ProSTEP iViP Association submitted the latest JT specification (version 9.5) to the ISO for standardization. JT 9.5 was published by ISO as ISO 14306:2012 in December 2012.

Since then, ProSTEP iViP and VDA have developed implementation guidelines for JT and discussed issues and further requirements. This eventually led to the publication of the JT Industrial Application Package (PSI 14), which combines the guidelines and enhancements with the latest JT specification. This specification is compatible with the ISO standard released in 2012 as well as with the latest proprietary versions of JT.

As the latest in a series of seven benchmarks, the JT Application Benchmark was carried out in 2016 to achieve an independent evaluation of the progress being made with regard to the development of JT translators and viewing applications. The object of testing was ISO 14306:2012 (JT 9.5). Additionally, the interoperability between JT and the STEP AP242 Business Object (BO) Model XML schema (published as ISO standard ISO 10303-242:2014) was part of the benchmark.

The benchmark was managed by the JT Workflow Forum and the JT Implementor Forum. Because the benchmark is an independent activity, it was financed directly by the two organizations, the ProSTEP iViP Association and the VDA, and not by the participating companies whose products were tested. It is a neutral test of trendsetting JT applications with regard to the selected criteria. Therefore, the results of the benchmark can be used not only to evaluate the application of JT in PLM environments but also to improve the interoperability of the applications.

As such applications are undergoing permanent development, the benchmark can only give a snapshot of the functions and qualities of the applications.

2 Approach

The JT Application Benchmark is a common activity of the ProSTEP iViP Association and the VDA. The goal of the benchmark is a neutral evaluation of current JT applications. Focal points of this 7th benchmark were the testing of CAD-JT and JT-CAD translators as well as testing the state of the art regarding the interaction of JT and STEP AP242 XML. In addition, the performance of viewing applications was tested.

2.1 Four Steps

Based on the lessons learned from previous benchmarks, the JT Workflow and JT Implementor Forum agreed on the following four-step approach:

1. The JT Workflow Forum clarified the target intent for the benchmark and provided details on the expected outcome.

2. The vendors made proposals for the JT file scope, configuration settings and evaluation approach which in their eyes would best fit the requirements.

3. A proof of concept / test run for the benchmark was conducted, using the agreed-on settings and test models, with close involvement of the vendors.

4. After the successful test run, the actual benchmark was conducted.


2.2 Building Blocks

This benchmark was composed of two independent building blocks, the translation quality benchmark and the application performance benchmark:

In the translation quality benchmark, the results of CAD to JT and STEP AP242 XML translations and the results of JT and STEP AP242 XML to CAD translations were evaluated in two steps:
• CAD to JT translations: translation of CATIA V5-6R2014, Creo 2 and NX 9 CAD models to JT and STEP AP242 XML, with quality checks regarding geometry (XT-BREP), tessellation, assembly structure and attributes.
• JT to CAD translations: translation of the JT and STEP AP242 XML models that were created during the first translation step back to CATIA V5-6R2014, Creo 2 and NX 9 CAD models, with quality checks regarding geometry (XT-BREP), assembly structure and attributes.

In the application performance benchmark, the time needed for several actions was measured:
• Converting complex CAD models to JT.
• Loading and viewing large JT models.

2.3 Documentation

This short report is made publicly available; a long report with more detailed information is provided to the project members.

2.4 Criteria

The criteria were defined by the JT Workflow Forum. Details, especially the validation methods, were elaborated in collaboration with the JT Implementor Forum.

2.5 Translation Quality Benchmark

The translation quality benchmark focused on the application of different options for converting the assembly structure. The investigated options were: a single JT file for the entire assembly; one JT file containing the assembly structure with references to JT files holding the parts' geometry; and a STEP AP242 XML file containing the assembly structure with references to JT files holding the parts' geometry. For the translation to CAD systems, a STEP AP242 XML file referencing a JT assembly file was also introduced. The CAD source formats used for the CAD to JT translations and the target CAD formats for the JT to CAD translations were CATIA V5-6R2014, Creo 2 and NX 9. The JT format used for all translations was JT ISO 14306:2012.

2.5.1 Structural Criteria

Independent of the file structure option used, the resulting models should fulfill the following criteria (a minimal comparison sketch follows the list):
• The assembly structure in the target system is equivalent to the source assembly structure.
• The positions of all instances are correct in the target application.
• The instantiation of components in the assembly is equivalent to the source definition.
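To make the structural comparison concrete, the following Python sketch walks two assembly trees, assuming both structures have been exported to simple nested dictionaries (node name, 4x4 placement matrix, list of child instances). This data model, the tolerance and the example names are illustrative assumptions, not the validation tooling actually used in the benchmark.

```python
# A minimal sketch of a structural comparison. The assembly structures are
# assumed to be nested dictionaries of the form
# {"name": str, "transform": 4x4 matrix, "children": [...]} (hypothetical).

def structures_equivalent(src: dict, tgt: dict, tol: float = 1e-6) -> bool:
    """Recursively compare node names, instance positions and the
    instantiation of child components (children assumed in the same order)."""
    if src["name"] != tgt["name"]:
        return False
    # Instance position: compare the placement matrices element-wise.
    for row_s, row_t in zip(src["transform"], tgt["transform"]):
        if any(abs(a - b) > tol for a, b in zip(row_s, row_t)):
            return False
    # Instantiation: same number of child instances, compared pairwise.
    if len(src["children"]) != len(tgt["children"]):
        return False
    return all(structures_equivalent(s, t, tol)
               for s, t in zip(src["children"], tgt["children"]))


# Hypothetical example: an assembly compared with itself.
identity = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
leaf = {"name": "bolt", "transform": identity, "children": []}
asm = {"name": "torque_converter", "transform": identity, "children": [leaf]}
print(structures_equivalent(asm, asm))  # True
```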

2.5.2 Geometry Criteria

Regarding the correctness of the geometry, it was checked whether the center of gravity, volume and surface area of the resulting geometry were the same as in the source geometry. The BREP data had to be stored as XT-BREP.
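As an illustration of this kind of check (not the benchmark's actual validation tooling), the following Python sketch compares the mass properties of a source and a translated model within a relative tolerance; the data class, units, tolerance and example values are hypothetical.

```python
# A minimal sketch of the geometry check, assuming the mass properties have
# already been extracted from the source and the translated model; the
# tolerance and example numbers below are hypothetical placeholders.
from dataclasses import dataclass


@dataclass
class MassProperties:
    center_of_gravity: tuple  # (x, y, z), e.g. in mm
    volume: float             # e.g. in mm^3
    surface_area: float       # e.g. in mm^2


def close(a: float, b: float, rel_tol: float) -> bool:
    # Relative comparison with a floor of 1.0 to avoid division-like issues near zero.
    return abs(a - b) <= rel_tol * max(abs(a), abs(b), 1.0)


def properties_match(src: MassProperties, tgt: MassProperties,
                     rel_tol: float = 1e-4) -> bool:
    """True if the translated geometry reproduces the source mass properties
    within the given relative tolerance."""
    return (all(close(a, b, rel_tol)
                for a, b in zip(src.center_of_gravity, tgt.center_of_gravity))
            and close(src.volume, tgt.volume, rel_tol)
            and close(src.surface_area, tgt.surface_area, rel_tol))


# Hypothetical example values
source = MassProperties((12.30, 0.00, -4.70), 125000.0, 48200.0)
target = MassProperties((12.30, 0.00, -4.70), 124998.5, 48199.2)
print(properties_match(source, target))  # True
```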

2.5.3 Tessellation Criteria

Regarding the correctness of the created tessellations, it was checked whether the settings for Level of Detail (LOD) creation conformed to the content harmonization guidelines of the JT Workflow Forum.

Figure 1: Benchmark building blocks


2.5.4 Attribute Criteria

Regarding the correct translation of metadata from CAD to JT, three categories of attributes were checked (a simple check is sketched after the list):
• The CAD properties, defined in the JT ISO specification.
• The translator information, defined in the JT-IF Implementation Guideline.
• User-defined properties created on different nodes of the assembly structure.
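The sketch below illustrates the basic idea of such an attribute check in Python, assuming the attributes of an assembly node are available as name/value dictionaries on both sides. How these dictionaries are obtained from the CAD or JT files, and the example property names, are assumptions for illustration only.

```python
# A minimal sketch of the attribute check, assuming the attributes of an
# assembly node have been read into name/value dictionaries for the source
# model and the translated model (hypothetical example values).

def missing_or_changed(source_attrs: dict, target_attrs: dict) -> dict:
    """Return attributes that were lost or altered during translation."""
    return {key: value for key, value in source_attrs.items()
            if target_attrs.get(key) != value}


# Hypothetical user-defined properties on one assembly node
src_node = {"MaterialCode": "AlSi9Cu3", "ReleaseState": "approved"}
tgt_node = {"MaterialCode": "AlSi9Cu3"}
print(missing_or_changed(src_node, tgt_node))  # {'ReleaseState': 'approved'}
```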

2.6 Application Performance Benchmark

In the second part of this benchmark, the performance of translators and viewing applications was tested.

2.6.1 Translator Performance

For the translator performance tests, the time that elapsed during the translation of a complex CAD model to JT was measured.

2.6.2 Viewer Performance

The performance of viewing applications was measured in three steps:
• First, the time for the start of the application itself was measured.
• Second, the elapsed time between giving the load/open command and the full visualization of the model was measured.
• Third, the model was rotated and zoomed to see if there is any recognizable delay between the end of the user's action and the re-visualization of the model.

3 Testing

The benchmark tests were executed at PROSTEP to ensure neutral testing and documentation. The vendors provided the software to be benchmarked and licenses for the runtime of the benchmark testing and evaluation.

3.1 Configuration and Settings

The vendors of the translators were asked to provide the configuration and settings that would best fit the benchmark criteria. The viewing applications were tested as installed, without any changes to the default settings.

3.2 Translation Quality Benchmark

Table 1 gives an overview of the participating vendors in the translation quality benchmark. It also shows who participated in the CAD-JT tests or in the JT-CAD tests.

3.2.1 Tested Translators

Table 2 gives an overview of the translators tested in the CAD to JT tests of the translation quality benchmark. It also shows which of the tested CAD systems were supported by each translator. Table 3 gives an overview of the translators tested in the JT to CAD tests of the translation quality benchmark. It also shows which of the tested CAD systems were supported by each translator.


Vendor | Translator | Version | CATIA V5-6R2014 | Creo 2 | NX 9
CT CoreTechnologie | 3D_Evolution | 4 | x | x | x
Elysium | Asfalis | EX 7.0 | x | x | x
Siemens PLM | JT bi-directional translator for CatiaV5 | 11 | x | - | -
Siemens PLM | JT translator for ProE/Creo | 12.1 | - | x | -
Siemens PLM | NX | 11.0 | - | - | x
Theorem | CADverter | 19.3 | x | - | -
T-Systems | COM/FOX | 6.2.4 | x | - | -

Table 2: Benchmarked JT translators and supported CAD formats in the CAD to JT/AP242 XML tests

Vendor | Translator | Version | CATIA V5-6R2014 | Creo 2 | NX 9
Elysium | Asfalis | EX 7.0 | x | x | x
Siemens PLM | JT bi-directional translator for CatiaV5 | 11.0 | x | - | -
Siemens PLM | NX | 11.0 | - | - | x
Theorem | CADverter | 19.3 | x | - | -
T-Systems | COM/FOX | 6.2.4 | x | - | -

Table 3: Benchmarked JT translators and supported CAD formats in the JT/STEP AP242 XML to CAD tests

Vendor | CAD-JT/STEP AP242 XML | JT/STEP AP242 XML-CAD
CoreTechnologie | Yes | No
Elysium | Yes | Yes
Siemens PLM | Yes | Yes
Theorem | Yes | Yes
T-Systems | Yes | Yes

Table 1: Vendor participation in the translation quality benchmark


3.2.2 Test Models

The test model was the torque converter assembly which was already used in previous JT and STEP benchmarks. It is available in all tested source formats. It contains several subassemblies and multiple instances of several parts. Therefore, it fits the criteria of the translation quality benchmark. User-defined attributes were added to the model on different assembly nodes.

Figure 2 shows the NX 9 test model and its assembly structure.

3.2.3 Testing Procedure

Figure 3 gives an overview of the translation quality benchmark testing procedure. The quality of each individual step was checked, and errors that occurred in the first step did not affect the results of the second step.

3.3 Application Performance Benchmark

The application performance benchmark consisted of two separate sections: the translator performance tests and the viewing performance tests.

3.3.1 Tested Translators

For the translator performance tests, the same translators as in the translation quality benchmark were tested.

3.3.2 Tested Viewing Applications

The tested viewing applications include basic viewers as well as more advanced engineering tools with functionalities for collaboration, simulation, planning and analysis.

Figure 2: NX 9 test model


Figure 4: Translator performance test model

Figure 3: Testing procedure for translation quality benchmark

3.3.2.1 Siemens PLM JT2Go

Tested Version: 11.2.2
JT2Go is a basic viewer, which is available for free.

3.3.2.2 TechniaTranscat LiteBox3D

Tested Version: 2016.3
LiteBox3D is an easy-to-use, free JT viewer based on the specification of the JT ISO standard. It can be enhanced for multiple use cases.

3.3.2.3 Siemens PLM Teamcenter Vis Base & Professional

Tested Version: 11.2.2
Teamcenter Vis is a scalable solution for different engineering visualization use cases. For the application performance benchmark, the basic-level "Base" and the high-level "Professional" versions were tested.


3.3.3 Test Models

For the translator tests, a large pressure cast part with complex geometry was used (see Figure 4). This model was kindly provided by Daimler.

For the viewing tests, two models were used: a generic large model created especially for stress tests, also kindly provided by Daimler, and a model of a nuclear-powered submarine, kindly provided by Siemens PLM and BAE (see Figure 5).

Figure 5: Viewing performance test models

3.3.4 Testing Procedure

The translator performance tests were run in batch mode. The time elapsed between the start of the batch command and the finished translation was measured.
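The following Python sketch illustrates this kind of wall-clock measurement around a batch run. The translator executable name and its options are hypothetical placeholders, since every vendor tool has its own batch interface.

```python
# A minimal sketch of the batch timing approach. The command "cad2jt" and
# its options are hypothetical placeholders for a vendor-specific batch
# translator interface.
import subprocess
import time


def time_translation(command: list) -> float:
    """Run a translator in batch mode and return the elapsed wall-clock
    time in seconds between command start and finished translation."""
    start = time.perf_counter()
    subprocess.run(command, check=True)
    return time.perf_counter() - start


# Hypothetical invocation of a CAD-to-JT batch translator
elapsed = time_translation(["cad2jt", "--input", "pressure_cast_part.prt",
                            "--output", "pressure_cast_part.jt"])
print(f"Translation took {elapsed:.1f} s")
```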

The viewing performance was measured in three steps:
• Duration of application start: the elapsed time from the start command until menus and buttons were usable.
• Duration of part load: the elapsed time between the load/open command and the full visualization of the model.
• Duration of part rotate/zoom: the fully visualized model was rotated and zoomed in on, and the time elapsed from the end of the user interaction until the model was re-visualized was measured.

All tests were repeated several times to ensure that the measured times are representative.


4 Results

In the following sections, an overview of the test results for the two building blocks is given.

4.1 Translation Quality Benchmark Results

In the following, the results for the two steps of the translation quality benchmark are summarized.

4.1.1 CAD to JT and STEP AP242 XML Test Results

In the first step of the translation quality benchmark, the quality of the CAD to JT and STEP AP242 XML translations was evaluated.

The tests showed very good results for the translation of geometry, both for the tessellated and the exact representation. With the file structure options using only JT files, the assembly structure was always translated correctly. The results for the attributes were also very good. The use of STEP AP242 XML as assembly format was not yet fully supported by all translators. This mostly concerned the support of attributes; the assembly structure was always translated correctly.

4.1.2 JT and STEP AP242 XML to CAD Test Results

In the second step of the translation quality benchmark, the quality of the JT and STEP AP242 XML to CAD translations was evaluated.

Again, the tests showed very good results for the translation of geometry. They also showed that the choice of file structure has little impact on the assembly structure or the translation of user-defined attributes. Only one tool converted monolithic JT files not to an assembly but to a part with multiple bodies.

The import of STEP AP242 XML files is not yet supported by all tested translators. Also, there were interoperability issues with some of the STEP AP242 XML files, which will be solved with future versions of the translators.

Figure 6: Overview of CAD to JT and STEP AP242 XML test results


4.2 Application Performance Benchmark Results

For the translator performance, the measured times ranged from a few seconds to several minutes. The tests showed that the time needed depends more on the source format than on the translator used. For the same source format, the translation times of the different translators were close to each other.

The viewing performance benchmark showed that all tested viewers were capable of handling the test models. There was no measurable delay between user interactions such as zooming and rotating and the visualization of the model from the new viewpoint. For the loading times, the tests showed that they depend on the level of detail that is loaded: some of the tested viewers load a coarse, simplified tessellation first, while another always loads the finest tessellation available by default.

Figure 7: Overview of JT and STEP AP242 XML to CAD test results

5 Summary and Outlook

The translation quality benchmark has proven the capabilities of JT translators. It has shown that the quality and completeness of JT translations from and to other CAD formats is very high. This applies to geometric data as well as to assembly structure and metadata. Compared to previous benchmarks, the tests showed a massive improvement of the tested translators.

As STEP AP242 XML is a relatively new format, it is not yet supported by all translators and there are still some interoperability issues.

Due to the close cooperation between the software vendors and the testers, issues found during testing could be communicated directly. This allows the vendors to consider the benchmark results in their current development. With issues discussed in the JT Workflow Forum and the JT Implementor Forum, a common understanding of the users' requirements is achieved. Also, the discussion of issues and results among the vendors will lead to improved interoperability between the various tools.

6 Acknowledgements

We would like to thank the software vendors who provided their software for the benchmark testing and supported the installation and configuration of their software.


ProSTEP iViP Association
Dolivostraße 11
64293 Darmstadt
Germany
Phone: +49-6151-9287-336
[email protected]
2017