
Introduction to Teamcenter Customization

Teamcenter provides a variety of mechanisms for customizing Teamcenter to business requirements. All customization is built on the base framework of APIs provided by Teamcenter. In this blog I will discuss the customization options and their architecture.

Customization Architecture
Teamcenter customization architecture broadly follows the Teamcenter technical architecture and can be categorized into three layers.

Server or Enterprise Layer
Web Layer
Client Layer
The client layer covers portal (rich client) and thin client customization, which usually deals with the UI and with handling the data returned by server requests. The SOA client is the SOA API for calling SOA services; you can find details of Teamcenter SOA services in my SOA blogs. The web layer is the Teamcenter J2EE deployment layer, which basically communicates between server and client. Server customization is the core of all customization, as most of the business logic is written in this layer; it handles the business transactions and interacts with the database through the Persistent Object Model (POM) API. FMS is the resource layer that supports the actual file transfer between client and server through the FMS framework; for more detail on FMS, see my blog on Teamcenter FMS. Server customization is done through the C-based API provided by Teamcenter, also called the Integration Toolkit (ITK). Apart from the customizations discussed above, there are SOA customization and BMIDE extensions, which are essentially server customization, client/web customization, or both. The diagram below depicts the customization architecture of Teamcenter. As shown in the diagram, all BMIDE extensions sit on the server side, because most BMIDE extensions override or change object behavior based on business requirements, which can only be accomplished in the server layer; hence all extensions are implemented using the core ITK API provided in the server layer.

Based on this customization architecture, Teamcenter customization can be categorized into the following areas.
1. Server Customization
2. Portal Customization
3. Web or Thin client customization
4. SOA based customization
5. BMIDE extension customization
Server Customization: Server-side customization is the most frequently used customization, as all the business logic is written in this layer. Basically, all requests for Teamcenter transactions pass through the server layer, so it is the core of Teamcenter customization. As discussed under Customization Architecture, Teamcenter provides a C-based API called the Integration Toolkit (ITK) for server-side customization. The toolkit provides hundreds of APIs covering the various business processes of Teamcenter and is organized by module and functionality. ITK also provides various extension mechanisms to plug in custom code based on Teamcenter events and object status. A detailed discussion of ITK customization is out of scope for this blog; I will cover it in another blog.
Portal Customization: The Teamcenter client layer is written on Java JFC and Eclipse SWT. The core client APIs are written on the Java JFC framework and are gradually being ported to the Eclipse SWT framework. Teamcenter presently supports both JFC and SWT customization, but SWT is recommended given Teamcenter's future direction. Portal customization can be done by extending the OOTB plug-ins or by developing your own plug-in. Apart from the JFC/SWT UI APIs, the Teamcenter client API also provides the object interface component, which is an encapsulation of the Teamcenter data model through the client object model. This object interface component also forms the interface layer between client and server.

Web or Thin Client Customization: This customization is for the Teamcenter web client. Teamcenter provides a standard web interface for viewing and editing Teamcenter objects in a web browser. The web client is built on asynchronous JavaScript and XML (AJAX) to allow dynamic loading of data in the browser; the HTML pages are rendered by JavaScript from XML data. Most thin client customization is therefore carried out in JavaScript, which handles both the rendering and the request/response exchange with the web server. Both client-to-server requests and server-to-client responses in the Teamcenter thin client are standard HTTP.
SOA Customization: Also called Teamcenter Services, this is the standard SOA-based set of services provided by Teamcenter for integrating with third-party applications as well as custom clients. Teamcenter also provides a framework for creating your own custom SOA services. I covered Teamcenter SOA services in detail in my SOA blogs.
BMIDE Extension Customization: This is mainly server customization using the Teamcenter BMIDE. BMIDE provides various extension mechanisms for implementing the desired behavior in Teamcenter; examples of BMIDE extensions are pre-action and post-action operations on business objects, runtime properties and so on. These extensions are implemented in the BMIDE environment by writing C/C++ server code, mainly using the ITK API. The BMIDE framework creates the stub code and other required classes for the extension implementation, so the developer only needs to implement the core logic of the extension. I will try to cover extension implementation in a future blog.
Apart from the customizations above, the Teamcenter Dispatcher module can also be customized for required translation behavior. Most of the time a Dispatcher client has to be implemented for extracting files from, and loading translated files into, Teamcenter. The Dispatcher client framework is based on Teamcenter SOA services; mostly OOTB SOA APIs are used, along with the Dispatcher API, which encapsulates most of the complex Teamcenter SOA calls. I have already covered Dispatcher customization in my blog on the Teamcenter Dispatcher.
See Also :
Teamcenter FMS Overview
Teamcenter SOA : Introduction
Teamcenter Dispatcher Framework

Sunday, August 4, 2013


Teamcenter Query Builder
Query Builder is one of the most frequently used modules in Teamcenter for creating object-based search queries. It is a powerful tool for quickly configuring object searches based on attribute criteria. It is also used in Report Builder for creating reports based on query output, which I covered in detail in my Report Builder blog. In this blog I will explain local query builder configuration and the basic concepts for creating or modifying a query.

Basics of Query Builder:


Teamcenter queries are based on the Teamcenter persistent data model, which I cover in my Teamcenter Data Model blog. The following are the basic characteristics of a query in Teamcenter.
1. A query is created against a Teamcenter persistent class.
2. Query criteria can be defined on attributes of the target object, or on related objects connected to the search class through either a GRM relation or a reference.
3. A criterion can be made invisible to the end user by leaving its user entry field empty.
4. Predefined keyword variables can be used for some attributes; these are resolved at runtime. For example, the keyword variable $USERID resolves to the session user ID when the query is executed.
5. Teamcenter also supports keyword queries, which search dataset file content for a specific keyword or combination of keywords.
6. Queries can also be customized through Teamcenter user exits.
7. Query results always show a list of objects of the class defined as the search class in Query Builder.

Steps for creating Query:


The list below shows the tasks required to define a new query in Teamcenter through Query Builder.
1. Define the query: Before creating a query, first define exactly what needs to be searched and in what context. Context means conditions or criteria that the user either provides while executing the search or that are defined as base criteria of the search itself. For example, if you want a query that finds a specific data type only, you define the object type in Query Builder and make that field invisible.
2. Map to the data model: After defining the search requirement, map it to the persistent data model of Teamcenter. This becomes your search class in Query Builder; the output will show instances of this class only. It is assumed that the admin user building the query knows the basic Teamcenter data model, which I covered in my Teamcenter Data Model blog. Once you select the search class, its attributes and GRM relations are automatically shown in the attribute display area. The attributes can be filtered to show only class-specific attributes or all attributes inherited from the parent classes.
3. Define the search context: Next, define the criteria or context in which the search should happen. For example, suppose you want to get Item Revisions that have an attachment of a specific dataset type only. In this case you have to traverse from the search class (Item Revision) to Dataset, either through a predefined GRM relation shown in the attribute display area or through a reference from the relation object to the Dataset (primary object to secondary object).
4. Define the search criteria fields: Finally, define the criteria you want to expose to the user for executing the search. This is the list of attributes shown to the user, whose values the user can set before query execution. The attributes can belong to the search class or to a related class.

Query Builder Dialog Section


The Query Builder module has three important sections, as shown in the image below.

Search Class: This is the target class against which the search is executed; the search results are a list of instances of this class. As described in the previous section, this is the base class of the query. You can navigate to the required class from the Class Selection dialog, which shows the Teamcenter data model hierarchy.

Attribute Selection: This provides the list of attributes of the search class. An option is provided to see only the display attributes of the given class or all attributes inherited from the parent class hierarchy. This section is used to select the attributes that make up the search criteria.
Search Criteria: This defines the criteria for the context search as well as the user search criteria fields. The fields of the Search Criteria section are:
Boolean Rule: Connects multiple search clauses in the search criteria through AND/OR conditions. AND requires all related clauses to be satisfied, whereas OR requires any one of them.
User Entry Name: The display name shown in the query; it refers to the query locale file for the actual display name of the attribute in the search dialog. Leave it empty if you want to hide a clause from the end user.
Logical Operator: This is the most important field of a search clause; it defines the actual query condition to be executed, that is, the rule by which the attribute is evaluated against the value provided at execution time. For example, for a date attribute you can use a greater-than or less-than operator to query for dates after or before a given value. Query Builder supports almost all SQL operator clauses; the image below shows all supported operators in Teamcenter.

Limitations of Query Builder:

Although Query Builder is a very powerful Teamcenter tool for configuring object-based queries, it also has some limitations due to the Teamcenter data model design.
1. Queries can only be configured for persistent data objects.
2. Queries can only be built for objects in the context of related objects defined in the data model. That means a query against two independent objects that are not related through some relation or reference cannot be created.
3. There is no way to filter search results using specific filter rules. For example, if I want to get only the latest revision of an Item, this cannot be achieved through a configured Query Builder search; the results will show all revisions of the Item. The reason is that all Item Revisions are related to the Item through the same reference relation, and there is no way to define search criteria that filter out old revisions.
4. Query rules cannot be built as joined queries or grouped sets of AND/OR clauses the way they can in SQL. For example, in SQL you can build a WHERE clause from sets of AND/OR clauses by grouping them in parentheses.
5. Query results show only the list of instances of the search class. You cannot see related objects along with the searched objects in the query results dialog.
See Also :

Teamcenter Data Model


Teamcenter Report Framework

Teamcenter Data Model

The data model is the core of any packaged software. To have good technical command of any package, it is important to have a good understanding of its data model, and Teamcenter is no different. In this blog I will explain the basic data model of Teamcenter as well as the corresponding schema in the database. This will help people new to Teamcenter gain a better understanding of the system.
The Teamcenter data model can be categorized into three distinct layers. They are:
POM or Schema Layer
Business and Relation Object Layer
Business Rules
POM, or Persistent Object Model, is the lowest layer; it basically represents the mapping to the underlying Teamcenter database. It is not always a one-to-one mapping, but for most classes it is closest to the DB tables. Developers should know the POM layer in detail for customization and extension of the system.
The business and relation object layer resides above the POM layer. This layer represents the actual entities of the business and its processes, and it is the layer at which business analysts and solution architects mainly interact. Business objects and relations define the overall data model from a business process perspective.
Business rules are the top layer of the data model. This layer constitutes business object behavior based on the rules configured in BMIDE; business rules along with business objects encapsulate the overall PLM business process. Teamcenter provides both configurable rules (naming rules, conditions, etc.) and custom rules (extensions) for defining business behavior.

The diagram below shows the basic building blocks of the Teamcenter data model.

POM Schema of Teamcenter Data Model:


The Teamcenter data model schema is hierarchy based, meaning there is a base-level object from which all objects in the system are derived. The base object in Teamcenter is called POM_object, and it is the parent of all objects defined in Teamcenter. POM-level objects are represented as tables in the Teamcenter database, and every derived class in the data model has a corresponding table. Under POM_object there are many immediate child classes, mainly used as storage classes (for example, form storage classes). One important class among them is POM_application_object, because it effectively represents all business objects of Teamcenter: WorkspaceObject, the parent of all objects a user can see in Teamcenter, is derived from POM_application_object.
All business classes in Teamcenter are derived from WorkspaceObject either directly or indirectly (through the hierarchy). For example, the Item class is derived from WorkspaceObject, and the same is true for Folder, Dataset and ItemRevision. The diagram below shows the class hierarchy for the basic workspace objects.


Most of the time you create custom types by extending the Item or Form data model. Once deployed from BMIDE, this creates a new table in the database with columns for the custom attributes defined in BMIDE. All derived classes automatically inherit their parent's attributes, so a child's attributes are the combination of the parent attributes plus its own.
Business Object:
The building block of Teamcenter is the business object. It resides above the POM objects, or DB classes. A business object can be seen as the representation of a real-life entity, encapsulated as a business object; the underlying objects are still persistent schema classes. Teamcenter UA provides hundreds of OOTB business objects. The major characteristics of business objects are:
1) Business objects are related to each other through relations.
2) Business objects have properties, which can be persistent (attributes from the underlying classes) or dynamic (evaluated at run time).
3) Business object behavior can be controlled through rules defined in BMIDE. A rule can be either configurable (e.g. naming rules) or custom (extensions, user exits, etc.).
GRM Relation: The Teamcenter relation is the second building block. Relations define the interdependence of the various business objects on each other. In Teamcenter, relations can be categorized into two groups:
a) Reference by: The business object's underlying schema class has a direct reference to the other object through an attribute; it can be compared to a pointer reference to another class in object-oriented terms. For example, POM_application_object has references to its owning group and owning user.

b) GRM relation: Here the relation is created by creating a relation object that links the two business objects through the concept of primary and secondary objects. The advantage of using a GRM relation rather than a direct reference is greater flexibility in defining business rules; for example, you can define deep copy rules or GRM rules, and different relation types can be created to carry different business rules.
Property:
Properties describe business objects. All attributes present in the underlying POM class of a given business object automatically become properties of that business object. Apart from persistent properties, there are other properties that are either derived from related objects or created at run time by custom code. Teamcenter properties can be classified into the following four categories:
a) Persistent property: An attribute stored in the database; these are defined in the underlying schema classes.
b) Compound property: A property that propagates a property of another object related to the target business object through either a reference or a relation. An example is a form property shown on an Item or Item Revision.
c) Runtime property: A property defined dynamically through custom code, which executes when the property value is fetched from the server.
d) Relation property: A property that defines the relation between the target object and a source object.
That's all from the Teamcenter basic data model perspective. I hope this provides a good starting point for people who want to understand the Teamcenter data model.

Teamcenter Report Framework


Reporting is one of the key modules in any PLM system, and with the growing emphasis on analytics and reporting over enterprise data, it will play an even bigger role in the future. Until now, reports in PLM have been static, so-called dumb data: they just provide data extraction and transformation, mainly done through style sheets. With the advancement of technology and more powerful systems, reporting is becoming more sophisticated and is expected to provide answers rather than just dumping data in a nice format. Teamcenter provides tools for creating both static and intelligent reports. In this blog we will discuss static reports, which can be created in Teamcenter through the built-in report module; functionality for complex, analytic reports is also available in Teamcenter through third-party tools.

Reports in Teamcenter: Teamcenter provides the Report Builder module to create reports based on PLM data from Teamcenter. Reports in Teamcenter are based on the Query and PLMXML frameworks. The basic concept is to get objects from Teamcenter through a query, which the Teamcenter server converts into PLMXML output, on top of which an XSL style sheet is applied for layout and report UI. The PLMXML output can be controlled in the same way as when creating a transfer mode for exporting/importing objects from Teamcenter; I already covered PLMXML in an earlier blog. Report Builder always creates static reports. Reports are categorized into three types in Report Builder:
1) Summary reports: General, overall reports that are not in the context of a specific object. Example: a change object status report.
2) Item reports: Reports generated in the context of a specific object, generally selected by the user when creating the report. Example: a signoff report for a CR, where the CR has to be selected.
3) Custom reports: Teamcenter provides a custom hook for creating reports that cannot be produced through a query and XSL alone. A custom executable can be written that creates the report when executed from the user's Teamcenter session.
Report Builder Module Components:
Report Builder has the following major components, which define the extraction and transformation rules for reports.

Query: This is a search query created in the Query Builder module of Teamcenter. The query defines which objects are to be extracted under different search criteria. The search criteria can be based either on properties of the target object or on related objects (e.g. Item Revision as target and Dataset as related object), and they can be exposed to the end user or hidden. I covered queries in detail in my Query Builder blog.
Closure Rule: This component comes from the PLMXML module. It defines the extraction rule for the objects returned by the query, that is, which other related objects should be extracted along with the target objects. See my PLMXML blog to know more about closure rules.
Property Set: This defines additional attributes to extract, beyond the default properties the PLMXML engine exports for a given object type. See my PLMXML blog to know more about property sets.
Report Format: This defines the XML format of the report extracted from Teamcenter. Teamcenter offers two formats: traditional PLMXML and TcXML, based on the new schema available in Teamcenter Unified.
Report Style Sheet: This defines the layout of the report. The style sheet is an XSL file that converts the XML output from Teamcenter into different compatible formats with the proper layout. You can learn more about style sheets on the W3C site or Wikipedia. Most of the time, report customization is limited to XSL creation based on the report layout requirements.
Custom Component: This is used when you want to create custom reports through custom code. You provide the path of the custom executable and any parameters it requires.
Most generic reports can be created from Report Builder without any customization. The limitation is that such reports are always static: no intelligence (filter criteria, if-else analysis, etc.) can be added, and there is no way to fetch data from other sources and normalize it with Teamcenter data for reporting.
Due to these limitations, Teamcenter also provides the tightly integrated Teamcenter Reporting and Analytics product for advanced reports from different data sources as well as for analytics.

Teamcenter FMS Overview


File Management System (FMS) is the Teamcenter component for managing files, or vaults, in Teamcenter. FMS is responsible for all file-related transactions between the Teamcenter server and the client. In this blog I will discuss the basic architecture of FMS and its interaction with the Teamcenter application.

FMS Overview:
FMS is an independent tool that runs as a service on the server (as FSC) and on the client machine (as FCC). The Teamcenter application tier and client tier interact with the FMS framework through the HTTP or HTTPS protocol. The two components of FMS are the FMS server cache (FSC) and the FMS client cache (FCC). As the names suggest, the FSC is a service running on the server side that caches files on the server and serves multiple user requests, whereas the FCC runs on the client machine, where it serves requests for a single user and interacts with the FSC to get the latest or new files from the server.
Architecture of FMS:
As discussed in the overview, FMS has two components: FSC and FCC. A basic installation usually has one FSC and multiple FCCs, depending on the number of users of the Teamcenter client; each portal client has one FCC running on the client machine. In a production environment, however, users may be spread across multiple geographical locations, or the number of users may be so high that a single FSC cannot serve them all. Also, if volumes are mounted on different servers, an FSC is required on each volume server. Hence we need multiple FSCs running on different servers to serve different geographies, sets of users or volume servers, distributed so that they are close to each geographical location. With a multi-FSC architecture we then need to define one FSC server as the master, which manages requests and routes them to the other FSC servers. The diagram below shows the FMS architecture.

FMS Configuration
Configuration of FMS is managed through XML files. There are basically three types of configuration file:
FMS master
FSC
FCC


The FMS master configuration file resides on the master FSC server. It defines the various FSC sites in the cluster, or FSC groups. Apart from FSC information, it may contain information about the volumes related to each FSC. It also holds default configuration values for FSC and FCC, which can be overridden by the respective configuration files.
An FSC configuration file is installed on each FSC server. The FSC configuration basically contains two main elements:
FMSMaster: Defines the FMS master location from which the FMS master configuration file can be read by the FSC. The FMS master information helps the FSC route a file request when the file does not reside in its own volume or cache.
FSC: Defines the details of the FSC installed on the server. It has various parameters that define file transfer characteristics as well as error and log settings, plus parameters for the FSC file cache and its location. The parameter values are basically decided based on load, file sizes, performance requirements and the overall FSC architecture.
The FCC configuration file is installed on each client. It has two main elements:
fccdefaults: Overrides the FCC defaults coming from the FSC; it holds various configuration parameters related to the client cache and requests.
parentfsc: Defines the FSC that the FCC contacts for downloading the FMS configuration. You can define multiple parent FSCs as backups for failover.
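As an illustration, a minimal FCC configuration file might look like the sketch below. This is only an assumed shape of the file: the root element, property names and default values vary by Teamcenter release, so check the FMS documentation for your version before using it.

<fccconfig>
   <fccdefaults>
      <!-- client cache settings; the property name and value here are illustrative only -->
      <property name="FCC_CacheLocation" value="C:/Temp/FCCCache" overridable="true"/>
   </fccdefaults>
   <!-- FSCs this client contacts; additional entries act as failover backups -->
   <parentfsc address="http://fschost1:4544" priority="0"/>
   <parentfsc address="http://fschost2:4544" priority="1"/>
</fccconfig>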
Communication Flow between FMS and Teamcenter:
Below is the process for communication between Teamcenter and FMS.
1. A user tries to retrieve a file from a dataset.
2. Whenever a user requests a file in Teamcenter, the application server forwards the request to FMS for retrieving the file from the vault.
3. FMS creates an FMS ticket corresponding to the file retrieval. The ticket is sent to the client, which then makes its request to FMS using that ticket.
4. The request is routed to the FCC installed on the client for file retrieval.
5. The FCC checks whether the file is already cached locally and unmodified. The modification check is done through a GUID associated with every file in Teamcenter. The GUID is a business-neutral identifier for the file contents, used to determine when a file can be served from the local cache: every file in a Teamcenter vault has a single file GUID associated with every replicated copy of the file, and any change to the file results in a new GUID. In this way the FCC detects modification.
6. If the file does not reside in the FCC, or has changed, the FCC sends the request to the FSC associated with the site ID. If the FCC is configured with multiple FSCs for a given site ID, a priority defines the FSC request sequence.
7. The FSC checks whether the file is cached on its own server or belongs to its own volume; otherwise it forwards the request to the corresponding FSC. The other FSC site information is retrieved from the FMS master configuration file.
8. The FSC sends the file to the FCC, which in turn routes it to the requesting client.

The diagram below depicts the overall flow of an FMS request.

That covers the FMS overview. I hope it helps in understanding how FMS works and how it is configured.

Teamcenter Dispatcher Framework.


Presently I am working on the translation services of Teamcenter, so I thought I would share my learning experience with you. The translation service comes as the Dispatcher service in the Teamcenter installation. A translation service simply translates one file format into another, for example DOC to PDF. The broad tasks of any translation are as follows:
a) Extract data from Teamcenter.
b) Execute the translation.
c) Load the translated result back into Teamcenter.
Hence the Dispatcher service of Teamcenter has three main components:
1) Scheduler
2) Module
3) Dispatcher Client
There is one more component called Dispatcher Admin, which is used for admin activity and is optional. Each of the three main components runs independently, can be run as a service or in a console, and can run on a different server. As the name suggests, the Scheduler manages the whole framework by coordinating between the Module and the Dispatcher Client. The Dispatcher Client manages extraction and loading of data, and the Module does the actual translation. The diagram below depicts the translation framework.

The Dispatcher Client is the front end of the Dispatcher framework; it interacts with Teamcenter through SOA for translation requests. Teamcenter has to be configured through ETS preferences for each new translation service and the object types on which that service is valid. Once a request is received, the Dispatcher Client processes it and puts all extracted files that need to be translated into a directory called the staging directory, which is configured during Dispatcher service installation. In the staging directory a unique subfolder is created for each request by the Dispatcher Client, based on the task ID generated when the user made the request in Teamcenter. Once the Dispatcher Client completes the extraction, it informs the Scheduler that the task is ready for translation. The Scheduler in turn tells the Module to start processing the task. The Module translates the file and puts the output in the staging directory. Once that is complete, the Scheduler pings the Dispatcher Client, which loads the translated files back into Teamcenter.
Siemens PLM provides many out-of-the-box translation services, which only need to be activated. In the next blog I will provide more detail about each component and its configuration.

Configuring Translator
In my last blog I discussed the Dispatcher framework of Teamcenter. In this blog I will discuss each component in detail and how to configure a COTS translation service. As discussed in the last blog, there are three components of the translator:
1) Scheduler
2) Module
3) Dispatcher Client
Each component runs as an independent service, possibly on a different server, and the components communicate through RMI.
Scheduler: This component acts as the moderator between the Module and the Dispatcher Client; both must be configured to communicate with the Scheduler. Once installed, the Scheduler directory structure is created as shown below.

The transscheduler.properties file, stored in the config subdirectory, defines the port and other properties for the Scheduler. The Scheduler can be run via runscheduler.bat in the bin subfolder, and lib stores all the JAR files belonging to the Scheduler. Most of the time no change is required in the Scheduler once it is installed.
Module: The Module is the component that does the actual translation. It interacts with the Scheduler to get translation tasks and invokes a specific translator based on the task information and the translator services available to it. Once installed, the Module directory structure is created as shown below.

The conf folder contains translator.xml, which holds the details of all translation services for the Module. The Module publishes these services to the Scheduler, and based on this information the Scheduler dispatches specific translation tasks to the Module.
The content of translator.xml looks like the following.
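Here is a simplified sketch of one service entry, reconstructed from the description below; the exact element and attribute names may vary by Dispatcher release.

<JtToCatiaV5 provider="SIEMENS" service="jttocatiaV5" isactive="false">
   <TransExecutable dir="d:/Siemens/DISPAT~1/Module/Translators/jttocatia/v5" name="jttocatiav5.bat"/>
   <Options>
      <!-- argument flags passed to the bat file; the actual values are supplied at run time -->
      <Option name="inputpath" string="-i" description="directory containing the input file"/>
      <Option name="outputpath" string="-o" description="directory for the translated output"/>
   </Options>
   <!-- expected input and output file extensions -->
   <FileExtensions input="*.jt" output="*.CATPart;*.CATProduct"/>
</JtToCatiaV5>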

The sketch shows the subsection of translator.xml that configures one translation service in the Module. Every translation service in the Module starts with the name of the service as an element in translator.xml. In the example above, the translation service JtToCatiaV5 is configured under provider SIEMENS with the service name jttocatiaV5. The configuration shows that the service is presently not active, meaning the Module will not publish it. If you want to make the service active, change the isactive value to true.


The TransExecutable element defines the location of the translator root directory and the name of the exe or bat file to call for the translation. In the example above, jttocatiav5.bat, located at d:/Siemens/DISPAT~1/Module/Translators/jttocatia/v5, will be invoked.
The Options element defines the arguments to be passed. In the example above, two arguments, -i and -o, with their values, will be passed by the Module to jttocatiav5.bat. The Module converts the configuration into the following call to the actual jttocatia translator:
d:/Siemens/DISPAT~1/Module/Translators/jttocatia/v5/jttocatiav5.bat -i {input file dir} -o {output file dir}
The input and output file directories are supplied at run time by the DispatcherClient component through the Scheduler component.
The FileExtensions element defines the input and output extensions expected by this translator; in this example the input file is of type .jt and the output of type .CATPart or .CATProduct.
Apart from this, the Module has a transmodule.properties file, which holds details such as the Scheduler RMI port and the staging root directory. The Module can be run via runmodule.bat in the bin subfolder, and lib stores all the JAR files belonging to the Module.
DispatcherClient: This is the component that interfaces with Teamcenter. Its main job is to fetch data files from Teamcenter and upload translated files back to Teamcenter. Once installed, the DispatcherClient directory structure is created as shown below.

The lib folder contains the JAR files for the Dispatcher Client; all custom JAR files for new translator services must be copied into this folder (details of translator customization will be discussed in my next blog). The config folder contains all files related to translator configuration for the DispatcherClient. DispatcherClient.config holds information related to the translator server, such as the RMI port of the Scheduler and the staging directory, whereas Service.properties holds the information required to connect to Teamcenter, such as the host, the translator's Teamcenter user and the tcserver port number. For activating a COTS translation we usually don't need to change anything in the DispatcherClient component.

Activating COTS Translation Services: Siemens provides many COTS services with the installation, but by default these services are not activated. Also, some translator services require core services to be installed; for example, the preview service translation only works if Visualization and the Vis translator are installed. The following steps are required to make any COTS translator active:

1) Modify translator.xml inside the Module/conf folder: make the service active by changing the isactive attribute to true (see the translator.xml example above).
2) Change the core bat file for that translator, present in the root service directory, and provide all the required variable values such as tc_root, tc_data, ug_root, etc. For example, the jttocatiaV5 translator requires UGS_LICENSE_SERVER to be provided in catiav5tojt.bat.
3) Verify the following preferences are present in Teamcenter to enable the translator there:
a. ETS.PROVIDERS: Provides the list of translator providers. SIEMENS is the default value, so for a COTS translation nothing needs to be changed.
b. ETS.TRANSLATORS.<provider>: Provides the list of translations available from a provider. For a COTS configuration, the translation service name has to be added to the preference ETS.TRANSLATORS.SIEMENS; for example, jttocatiav5 needs to be added here.
c. ETS.DATASETTYPES.<provider>.<service>: Provides the list of business object types on which the translator can be invoked as primary object. For example, jttocatiav5 should be invoked for JT dataset types only, so the preference ETS.DATASETTYPES.SIEMENS.JTTOCATIAV5 is set as a multi-valued preference with values such as JtSimplification and other dataset types that hold JT files.
There are other preferences which are optional.
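As a consolidated illustration for the jttocatiav5 example above, the preference values would look roughly like this (the names follow the pattern described; verify the exact preference names in your environment):

ETS.PROVIDERS                         = SIEMENS
ETS.TRANSLATORS.SIEMENS               = jttocatiav5 (plus any other active services)
ETS.DATASETTYPES.SIEMENS.JTTOCATIAV5  = JtSimplification (plus other dataset types that hold JT files)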

Customizing Translation Services


In my last two blogs I have given details about the Teamcenter translation framework and its configuration. In this blog I will cover customization and creating a new translator service. As discussed under the translator framework, the three main components are the Dispatcher Client, the Module and the Scheduler, and for any customization these components need to be customized. In almost all cases it is the Dispatcher Client and the Module that are customized: in the Dispatcher Client you write code for extraction, loading or both, and for the Module you define the exe or bat file to call that actually does the translation. You also need to configure translator.xml as discussed in my earlier configuration blog.

Customization Steps: Let's take as an example a translation use case where we need to translate the content of a dataset from one language to another and upload the translated file to the same attached revision with a specific relation and dataset type. The requirement is that this relation type and dataset type should be configurable. The steps to perform are as follows:
1) Extract the file to be translated from Teamcenter.
2) Translate the file to the required language, using either a custom language translator or a third-party translator such as Google Translate.
3) Upload the file back to Teamcenter by creating a dataset, uploading the file and attaching the dataset with the given relation to the Item Revision.
Steps 1 and 3 will be part of the Dispatcher Client, whereas step 2 will be part of the Module implementation.
Dispatcher Customization: The Dispatcher provides a Java-based out-of-the-box implementation, TaskPrep, for extracting specific data from Teamcenter. It also provides an OOTB data loader for loading the output files automatically. This automatic behavior can be controlled without writing any code by configuring translator.xml for the specific translation service. The entire class documentation for the Dispatcher Client can be found in the docs folder under the Dispatcher Client root directory, and sample implementations can be found in the sample folder there.
Implementation:
The Dispatcher framework provides two main interfaces for customizing a translator:
1) The TaskPrep class, for extracting files from Teamcenter; its implementation prepares the task to submit to the translation service.
2) The DatabaseOperation class, for loading the translated files back into Teamcenter.
Usually both classes have to be implemented for a new translation service. TaskPrep is called first when a translation request is created, and the Dispatcher finds the specific TaskPrep implementation for the corresponding translation service request. Once the translation is done by the Module, the Dispatcher invokes the DatabaseOperation implementation for the given service to upload the data to Teamcenter. In our example of converting a text document from one language to another, the TaskPrep first extracts the document from the target dataset and puts the file in the staging location; once the Module completes the translation, the DatabaseOperation class is invoked and the translated file is uploaded to Teamcenter with the specified dataset type and attached to the target object with the specified relation.

Extraction Implementation: The TaskPrep implementation is done by extending com.teamcenter.ets.extract.TaskPrep, which is an abstract implementation of extraction. The abstract class has member variables that encapsulate all the details of the translation request as well as the staging directory location. Some of the key members are:
request: the request object for the current extract session.
stagingLoc: the staging location in which all the files will be placed.
The function called at execution time is prepareTask(). It is declared abstract in TaskPrep and has to be implemented by every extending class.
Pseudo implementation: Usually the implementation accesses the target object, called the primary object, and its associated objects, called secondary objects, through the current request object. The pseudo code for this is:
m_PrimaryObjs = request.getProperty("primaryObjects").getModelObjectArrayValue();
m_SecondaryObjs = request.getProperty("secondaryObjects").getModelObjectArrayValue();

ModelObject is the SOA base wrapper class for all Teamcenter objects. Primary objects are the objects in Teamcenter that were selected for translation, and secondary objects are the objects associated with the target through a relation. For example, in the language-translation case we decide that the text file dataset is the target (primary) object and the Item Revision to which it is attached is the secondary object.
Once we get the primary object, the named reference file has to be extracted from the dataset and put in the staging location. This is done through an SOA call to Teamcenter.
Sample code snippet:

Dataset dataset = (Dataset) m_PrimaryObjs[i];
ModelObject contexts[] = dataset.get_ref_list();
ImanFile zIFile = null;
for (int j = 0; j < contexts.length; j++)
{
    if (!(contexts[j] instanceof ImanFile))
    {
        continue;
    }
    zIFile = (ImanFile) contexts[j];
    m_InputFile = TranslationRequest.getFileToStaging(zIFile, stagingLoc);
}

We also need to create the translation request details, which the Module refers to for the translation. For example, in the language-translator use case we would need an option where the user can specify from which language to which language the translation is required. This is done by having translator arguments in the translation request created by the user; these can be retrieved and processed further in TaskPrep. The snippets for accessing the arguments are as follows:
Map translationArgsMap = TranslationRequest.getTranslationArgs(request);

This map contains each argument as a key and its value as the value. TaskPrep can also add its own arguments based on its processing, which can then be used by the Module or the database loader for further processing:

translationArgsMap.put(argumentKey, argumentValue);

For example, in the language translator we might need to set the character set to a specific value based on the target language selected by the user.
The translation request details are created as an XML file that resides in the staging directory under the unique task ID. Basically, this XML contains the user arguments required for the translation; in our example it would hold the from and to languages, which the Module uses while translating. It also has details of the corresponding dataset, item and item revision. These can be populated but are not always required; in our example we also populate the UIDs of the primary and secondary objects, which can be used when loading the translated file back with a specific relation to an object.
Loader Implementation: The database loader is implemented by extending the com.teamcenter.ets.load.DatabaseOperation abstract class; its Load() function has to be implemented for a translation service. The DatabaseOperation class has a TranslationData member, transData, which encapsulates all the data of the translated request. From the translation data we can get all the information we populated during extraction (TaskPrep); for example, from transData you can get the list of result files from the translation, which the loader uses to load all the files into Teamcenter. The pseudo code for this is:

TranslationDBMapInfo zDbMapInfo = transData.getTranslationDBMapInfo();
List zResultFileList = TranslationTaskUtil.getMapperResults(zDbMapInfo, scResultFileType);

Here TranslationTaskUtil is a utility class providing various generic facilities, and scResultFileType is the expected file type of the translation result. User and TaskPrep options can be accessed through TranslationTask, which is a member of DatabaseOperation:

TranslatorOptions transOpts = transTask.getTranslatorOptions();

TranslatorOptions encapsulates all options as key/value pairs, which can be accessed like this:

Option transOption = transOpts.getOption(i);
if (transOption.getName().equals("SomeoptionName"))
    strOutputType = transOption.getValue();

Uploading all the result files is done through SOA calls to Teamcenter. The Dispatcher framework provides various helper classes that encapsulate the SOA calls to Teamcenter; if a requirement cannot be fulfilled through the helper classes, the SOA services can be called directly. One of the helper classes is DatasetHelper, which provides all the functions related to datasets. One of its functions creates a new dataset for the result file list and attaches it to the primary or secondary target with a given relation. The pseudo code for this is:

dtsethelper.createInsertDataset((ItemRevision) secondaryObj, (Dataset) primaryObj,
    datasettype, relationtype, namereferencetype, resultdir, filelist,
    flag to insert to source dataset or item revision);
Jar File Packaging and Dispatcher Configuration:
We need to create a JAR file for the custom Dispatcher code. The JAR should also contain a property file that defines the implemented task preparation and database loader classes, which are instantiated through the reflection mechanism of the Dispatcher framework. A sample property file looks like this (placeholders in angle brackets):

Translator.<service provider>.<translation service name>.Prepare=<package name>.<TaskPrep class name>
Translator.<service provider>.<translation service name>.Load=<package name>.<DatabaseOperation class name>

The service provider is the name of the provider of the service; for OOTB translation services it is SIEMENS, configured in the preference ETS.PROVIDERS. The translation service name is the name of the translation service configured in the Module configuration and in the Teamcenter preference ETS.TRANSLATORS.
In the language-translation use case the service provider name could be ExampleTranslation and the service name examplelanguagetranslation. The property file content for this would be:

Translator.ExampleTranslation.examplelanguagetranslation.Prepare=<package name>.LanguageTransTaskPrep
Translator.ExampleTranslation.examplelanguagetranslation.Load=<package name>.LanguageTransDatabaseOperation

Once the JAR package is created, it has to be put in the DispatcherClient\lib folder. Also, service.properties under DispatcherClient\config has to be updated with the property file name:

import TSBasicService,TSCatiaService,TSProEService,TSUGNXService,TSProjectTransService,<our property file name>

This loads all the classes when the DispatcherClient is started.

Module Customization: Module customization usually has the following steps:
1) Create the translation program, or identify the third-party application to be used for the translation.
2) Usually, create a bat file as a wrapper to run the program. This is not strictly required, but most of the time we need to set some environment, such as the Teamcenter root or TC_DATA.
3) Add the service details in translator.xml under Module\conf.
Let's take our example use case of the language translator. We are going to use a third-party translator, for example Google Translate. There are Java APIs that invoke the remote Google translator, and we would create a Java wrapper on top of such an API to create our Teamcenter translator. Our program will take the input file, the output file with its location, the language to translate from and the language to translate to.
We probably also need a bat file to set JAVA_HOME and other environment variables; this bat file will also take the four parameters required by our program. The sample configuration to be added to translator.xml is roughly as follows.
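The sketch below follows the same shape as the JtToCatiaV5 example earlier; the service, provider and option names are the hypothetical ones from this use case, and the exact XML schema may vary by release.

<ExampleLanguageTranslation provider="exampletranslation" service="examplelanguagetranslation" isactive="true">
   <TransExecutable dir="$ModuleRoot/Translators/examplelanguagetranslation" name="examplelanguagetranslation.bat"/>
   <Options>
      <Option name="inputfile" string="input" description="absolute location of the file to translate"/>
      <Option name="outputdir" string="outputdir" description="directory for the translated output"/>
      <Option name="fromlang" string="from_lang" description="language to translate from"/>
      <Option name="tolang" string="to_lang" description="language to translate to"/>
   </Options>
</ExampleLanguageTranslation>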

Here ModuleRoot is a keyword that refers to the Module base directory.
When the above configuration is read by the Module framework, it is converted into a call of the form:

$ModuleRoot/Translators/examplelanguagetranslation/examplelanguagetranslation.bat input=<absolute file location> outputdir=<output dir> from_lang <value> to_lang <value>

A workflow action handler can also be created to integrate language translation with Change Management and other business processes; the ITK API to use is DISPATCHER_create_request. The handler can be implemented so that it takes as arguments the translation provider, the service name and the various options supported by the translation service, which gives the flexibility to reuse the handler for other translation services.
That is all from my side on the Teamcenter translation framework. I hope it helps people who want a quick start on translation services. Any comments are welcome.

Teamcenter SOA : Introduction


A Service-Oriented Architecture (SOA) framework is offered by many enterprise product vendors because of its advantages in interoperability and reusability. Also, because the services are framed around business use cases, SOA APIs are easy to use, as they hide the complexity from the application developer. Teamcenter likewise offers an SOA framework for customization as well as for integration with other applications. In this series of blogs I will cover the concepts of the Teamcenter SOA framework and how to create your own SOA services on top of it.
Teamcenter UA SOA :
Teamcenter provides an SOA framework as well as a set of out-of-the-box SOA services for direct consumption. Teamcenter SOA can basically be used in two ways.
1) Using the OOTB SOA services as an SOA client.
2) Creating your own SOA services, which can be consumed by others.
Teamcenter SOA presently supports the following languages: C#, C++ and Java. Development can be done in any of these languages, either using the OOTB SOA services for application development or developing your own SOA services for other developers to use. The list of SOA services can be seen in BMIDE under Extensions -> Code Generation -> Services; it lists all the services available in the given Teamcenter environment. You can also get all the details of the data types and operations corresponding to the SOA services in BMIDE, as shown in the image below. We will discuss data types and operations in detail in future blogs.

Teamcenter SOA Framework:


The Teamcenter SOA framework provides a set of connection protocols such as HTTP and CORBA, auto-generated stubs on the server, and a client data model to support client applications. The SOA server architecture resides above the business object layer (AOM layer), so SOA server code can call ITK APIs to perform the business logic, as shown in the diagram below.

Teamcenter SOA is the set of APIs, or programming interfaces, used by application developers. The API libraries are present in the soa_client.zip file on the Teamcenter software distribution image, with separate libraries inside soa_client for each supported programming language: Java, C++ and C#. This ZIP should be extracted, preferably into the TC_ROOT folder, for linking application code that uses the SOA services. soa_client.zip also contains sample SOA code in all supported languages. In my next blog we will see how to use the SOA API, establish a connection through SOA and use the OOTB SOA services.

Teamcenter SOA : Using OOTB SOA Services


In the last blog I gave an introduction to Teamcenter SOA services. In this blog I will provide details of using the OOTB SOA services provided by Teamcenter. Teamcenter provides a set of core SOA framework classes for login, error logging, the object model change framework and so on. Apart from the core framework APIs, Teamcenter also provides SOA services for all modules, such as Workflow, Change Management, Query and Structure Manager. The details of all the APIs and SOA services can be found in BMIDE in the Extensions view under Code Generation -> Services.
Process of using an SOA client:
Teamcenter provides the SOA framework for creating an SOA session and using the published SOA services. Implementing an SOA client requires the following steps.
1) Implement the login credential and session classes that encapsulate the connection.
2) Use the published SOA services.
3) Optionally use other framework services such as object model change handling, exception handling, etc.

I suggest going through the sample programs provided in soa_client.zip; I have provided the setup steps for the sample code in this blog.
To use an SOA service, the first thing is to connect to the Teamcenter server, either two-tier (IIOP connection) or four-tier (web URL). The Teamcenter SOA framework encapsulates most of the connection implementation in the com.teamcenter.soa.client.Connection class; you only need to handle credential management by implementing the CredentialManager interface.
The SOA client can be implemented in .NET, C++ or Java. The first step is to get the connection instance for SOA, which is done by creating a Connection instance, where url is the connection string for the two-tier or four-tier server and credentialManagerInstance is an instance of a class that implements the CredentialManager interface:

public class AppCredentialManager implements CredentialManager
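Putting these pieces together, connection setup looks roughly like the sketch below. The exact Connection constructor arguments differ between Teamcenter releases, so treat this as an outline and check the HelloTeamcenter sample in soa_client.zip for the precise signature in your version; AppCredentialManager here is your own implementation of the CredentialManager interface.

import com.teamcenter.soa.client.Connection;
import com.teamcenter.services.strong.query.SavedQueryService;

// Four-tier web-tier URL (use an iiop: address for a two-tier connection)
String url = "http://localhost:7001/tc";

// Supplies the user name/password (or SSO credentials) when the framework asks for them
AppCredentialManager credentialManagerInstance = new AppCredentialManager();

// Constructor arguments are illustrative and vary by release
Connection connection = new Connection(url, credentialManagerInstance);

// Service handles are then obtained from the connection
SavedQueryService queryService = SavedQueryService.getService(connection);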
Calling SOA services:
Teamcenter provides a set of OOTB SOA services for different modules such as workflow. Each of these services has its own static service class that provides a handle to that specific service. For example, a Query service handle can be obtained by:

SavedQueryService queryService = SavedQueryService.getService(connectionObj);

and a Workflow service instance can be obtained through:

WorkflowService wfser = WorkflowService.getService(connectionObj);

connectionObj is the connection instance we created above. Similarly, there are several other services for different functionality. The details of all services and their operations can be seen in BMIDE, as shown in the images below.

The image above shows SavedQuery and the operations available for it. Once we have an instance of the query service, we can invoke all the available SOA operations on it. The functionality of each operation and its argument data types are described in BMIDE.
Quick Example of using an SOA service:
Let's look at a small code snippet using an OOTB SOA service in Java. The code below releases an object using an operation from the Workflow SOA service.

The code releases the given object by calling the Teamcenter SOA service. These are the basic steps for any SOA service call:
1) Get an instance of the required SOA service by providing the connection object.
2) Prepare the input argument data for the specific service request.
3) Call the specific operation to achieve the required result.
In the example above we first get the Workflow service instance, then prepare the input arguments by creating the release options object and the release input object and populating them with appropriate values.
Note that most arguments to the SOA APIs are taken as arrays, and return values are also arrays. The reason is that Teamcenter SOA is designed to take bulk requests (as arrays) in a single operation, to avoid multiple calls from the client to the server. Hence in an SOA implementation we first create the instances of the objects and then wrap them in an array.
I hope this blog helps you understand the basic aspects of using the OOTB SOA services.

Teamcenter SOA : Sample SOA Code Setup


In this blog I will provide the detailed steps for setting up the sample SOA code provided on the install image under the SOA zip file. The steps below are for the Java SOA sample code.

1. Unzip soa_client.zip from the install image to the TC_ROOT directory. It will create a soa_client folder under TC_ROOT.
2. Install Eclipse 3.6 and launch it.
3. Add a classpath variable for the root location of the soa_client folder:
Open the Preferences dialog of Eclipse (Window --> Preferences...).
Select the Classpath Variables page (Java --> Build Path --> Classpath Variables).
Add the variable 'TEAMCENTER_SERVICES_HOME' and set it to the root path of the 'soa_client' folder.
4. Import the project:
Open the Import dialog (File --> Import...).
Select Existing Projects into Workspace (General --> Existing Projects into Workspace).
Click Next, then browse to .../soa_client/java/sample and select Finish.
Note: while importing the FileManagement sample, FMS_HOME needs to be added just like TEAMCENTER_SERVICES_HOME.
5. Compile the project. If the IDE is not configured to Build Automatically, force a build of the project.
6. Execute the client application:
Open the Debug/Run dialog (Run --> Debug...).
Select the HelloTeamcenter Java application.
The launch is configured to connect to the server on http://localhost:7001/tc; change this URI on the Args tab if needed.
If the connection is HTTP, ensure it points to the right URI; this can be checked through Run -> Debug Configurations -> Arguments.
If the connection is through IIOP, the argument should be set as follows:
-Dhost=iiop:localhost:1572/TcServer1
Note: only the Java and C++ bindings work in 2-tier mode.
7. Go through the files to understand how the services are being called and how they can be used in a custom solution. Refer to my OOTB SOA Service blog for a basic understanding of the sample code.

Teamcenter SOA : Create Your Own SOA


In my last two blogs I discussed the basic fundamentals of SOA and
how to use OOTB SOA on the client side. In this blog we will discuss the
basics of creating your own SOA. Usually a new SOA service needs to be
created when you don't find an OOTB SOA that satisfies your requirement,
or when you want the service to behave in a different way. The following
are guidelines for creating your own SOA.

1) Check whether you can achieve the required result at the client side by
using OOTB SOA.
2) Analyze the desired state of the transaction between the client and the
SOA server layer. For example, look at whether the transaction must be
committed in one request or whether the multiple requests provided by
OOTB SOA are acceptable.
3) Analyze what service you want to implement: whether a single request or
multiple requests will satisfy the use case.
4) Analyze what input is needed to perform the service. This will help in
defining the input data types.
5) Analyze what output will be provided for the requested service and how
the output will be used. This will also help in deciding whether you
require loose or strong binding.

Before You Start:

Before you start, you should understand some aspects of SOA services in
Teamcenter.
Loose vs. Strong Binding: The binding defines how data is mapped
between the SOA server and the client. Loose binding means key/value pair
mapping, a string map of data and its values, whereas strong binding
means the client data model is populated with typed objects that
correspond one-to-one with their server-side object types. The decision
needs to be taken based on the use case. For example, if the SOA service
is for creating static reports then loose binding can be a good option,
while for integrations strong binding is a good option.
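As a minimal illustration of the difference, assuming the standard soa_client model classes (accessor names may differ slightly between versions):

// Loose vs. strong binding: the same item_id property read both ways.
import com.teamcenter.soa.client.model.ModelObject;
import com.teamcenter.soa.client.model.strong.Item;
import com.teamcenter.soa.exceptions.NotLoadedException;

public class BindingContrast
{
    // Loose binding: a generic ModelObject, properties addressed by name (key/value style).
    static String itemIdLoose(ModelObject obj) throws NotLoadedException
    {
        return obj.getPropertyObject("item_id").getStringValue();
    }

    // Strong binding: a generated typed class with one-to-one accessors for server properties.
    static String itemIdStrong(Item item) throws NotLoadedException
    {
        return item.get_item_id();
    }
}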
Data Type:
Data types define the encapsulation of data when creating a request for a
SOA service as well as when getting output from a SOA service. In SOA you
can define a data type as either a Structure, a Map or an Enumeration. The
meanings of these data types are the same as in a programming language
like C or Java. A Structure defines a set of basic data types encapsulated
together; similarly, a Map is a key/value data container.
Operations:
An operation defines a function that will be available to the client for
requesting a service to be performed. A SOA service can be defined through
a single operation if it is a simple service, or a set of operations if it
is a complex use case.

Creating SOA services:

Now that you understand the basic building blocks, you are ready to create
your own SOA service in the BMIDE. The detailed steps for creating a SOA
service are given in the next blog. The following are the steps for
creating a SOA service:
1) Define the preference binding. For example, for Java you can select loose
or strong binding.
2) Create a new SOA service library, or use an existing one if you already
have one.
3) Create a new service under the SOA service library.
4) Define all the data types which will be used in the SOA operations.
5) Define all the operations using the defined data types. Each operation
will have input data types and return data types. A data type can be
either a user-defined data type or an OOTB data type.
6) Generate code in BMIDE by selecting Generate Code -> Service
Artifacts. This creates stub code for both the server and the client side
(see the detailed-steps blog for more).
7) Implement the server side code for all defined operations in BMIDE.
The stubs for all operations are already created by BMIDE; you only need
to implement the business logic for each operation. The server side
implementation is always in C++ and the ITK API can be used for the
implementation.
8) No implementation is required at the client side.
9) Build the BMIDE project. A jar file is created for the client (if it is
the Java binding) and a dll for the server.
10) Deploy the server dll in the TC_ROOT\bin directory.
11) Implement the custom SOA service at the client side. The process is the
same as defined for OOTB services in the earlier blog. The only addition is
to add the client SOA jar file to the classpath when using the Java client,
and the client dll if using the C++ binding.

The detailed steps are given in the Detail Steps for SOA Creation blog below.

This closes my series on Teamcenter SOA. I thank my friend Bhaskar for
introducing me to the Teamcenter SOA framework.
I hope these blogs provide a good foundation for everyone who wants to learn
Teamcenter SOA services and serve as a quick start for using and creating
your own services.

Teamcenter SOA : Detail Steps for SOA Creation


This blog provides detailed steps for creating your own SOA. The SOA service
provides all the queries present in the Teamcenter Query Builder. The SOA has
one operation which gets all query objects in Teamcenter. The operation
doesn't take any input argument; it returns a list of query objects
encapsulated in a custom data structure.

Detail Steps:
1. Create a new service library in the BMIDE Extensions tab by selecting
Project -> Code Generation -> Services.

2. Select the newly created service library (Ex: D4SoaTest) and create a
new service.

3. Open the new service (Ex: TestService).


4. Define the data types and operations as per the requirement. Ex: the
operation is to get all the saved queries. Define the data types as below.

Define data type TestQueryObject:
a. Click on the Add button in the Data Types tab.
b. Select Structure and click Next.
c. Enter the name as TestQueryObject and a description for it.
d. Add the data type elements by clicking the Add button:
query -> Teamcenter::ImanQuery [holds the query tag]
queryname -> std::string [holds the name of the query]
querydescription -> std::string [holds the description of the query]

Define data type TestGetSavedQueriesResponse, used as the return data type
in the operation; it returns a vector of query objects holding the query
tag, name and description:
a. Click on the Add button in the Data Types tab.
b. Select Structure and click Next.
c. Enter the name as TestGetSavedQueriesResponse and a description for it.
d. Add the data type elements by clicking the Add button:
queries -> Vector of TestQueryObject
services -> Teamcenter::Soa::Server::ServiceData

5. Create an operation using the above defined data types. Select the
Operations tab of the service (Ex: TestService). Click Add to create a new
operation.
a. Click on the Browse button of the New Service Operation dialog.
b. Select
D4::Soa::Test::_2012_02::TestService::TestGetSavedQueriesResponse
as the return data type.
c. Enter Service Operation Name: testGetSavedQueries
d. Enter Service Operation Description: SOA Operation for getting Query List
e. Enter Return Description: List of Query Object in Teamcenter
f. Click on Finish.

6. Once the operation is created, it is listed in the Operations tab when
the service is opened.
7. Save the data model.

8. Select the library created (D4SoaTest). Select Generate Code -> Service Artifacts.

9. Check the Console view to know whether the code generation was
successful, or to see the errors otherwise.

10. Source and header files corresponding to each service are created.
They are placed in the following paths in the Navigator view:
Project->src->server->LibraryName
Ex: AccentureTest->src->server->D4SoaTest (this folder contains the files
which need to be implemented by the developer)
Project->output->client->LibraryName - contains the auto generated
files for the client (no changes required to these files)
Project->output->server->LibraryName - contains the auto generated
files for the server (no changes required to these files)
Project->output->types->LibraryName - contains the auto generated
files for the data types that are created (no changes required to these
files)

Inferences:
a. Name of the service library: D4SoaTest
b. Services in the library D4SoaTest: TestService
c. Operations in the service TestService: testGetSavedQueries

11. Open the source file and enter the required code. Shown below is
the file generated in step #8. The operation testGetSavedQueries is
highlighted for quick notice.

12. Build the project. Dlls are generated in:
a. Project\output\client\jars (Project\output\types\lib if it is the C++
binding) - jar files to be added to the Java client classpath
b. Project\output\server\lib - libraries which should be copied to the
Teamcenter server TC_ROOT\bin
c. Project\output\types\jars (Project\output\types\lib if it is the C++
binding) - contains the type definition jar files to be added to the Java
client classpath
Note:
a. Check the console to see a successful build message.
b. libD4soatypes.dll under types\lib and server\lib are the same.
c. Ensure the dependent libraries are added to the library definition.
Ex: the query module is used in the D4SoaTest library and is hence added
to the D4SoaTest library definition.

13. Deploy the data model. This will deploy the library, service and
operation definitions in the template to the database, and it also copies
the built server libraries to TC_ROOT\bin (Ex: Project\output\server\lib to
TC_ROOT\bin).

You can now call your SOA at the client side in the same way as you called
the sample SOA; a minimal sketch is shown below.
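Assuming the generated Java client stubs follow the library and namespace defined above (the actual package and class names should be read from the jars under Project\output\client\jars), the call would look roughly like this:

// Sketch only: the stub class names are assumptions derived from the service
// definition above; take the real names from the generated client jars.
public class CustomServiceCall
{
    public static void printQueries(com.teamcenter.soa.client.Connection connection)
    {
        // Get the custom service instance, exactly as for an OOTB service.
        TestService service = TestService.getInstance(connection);           // generated stub (assumed name)

        // The operation takes no input and returns the custom response structure.
        TestGetSavedQueriesResponse response = service.testGetSavedQueries();

        // The structure fields mirror the data type elements defined in BMIDE.
        for (TestQueryObject q : response.queries)
        {
            System.out.println(q.queryname + " : " + q.querydescription);
        }
        if (response.services.sizeOfPartialErrors() > 0)
        {
            System.out.println("Operation returned partial errors.");
        }
    }
}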
