Azure DevOps Server/Services/Version Control
Prerequisites
Privileges for User
These are the common privileges for the user. To know the specific privileges required for the user, refer to User Privileges.
Add a user in Azure DevOps that is dedicated to OpsHub Integration Manager. This user shouldn't perform any other action from the Azure DevOps user interface. Make sure this user or Service Principal has a unique display name across the instance.
The user must be a member of the Project Administrators group for work item entities and Build entity migration. For meta entities like Group and User, the integration user must be a member of the Project Collection Administrators group or the Project Administrators group. Refer to the section How to Add User or Service Principal in group.
Note: If the integration user is not a member of the Project Collection Administrators group, collection-level permissions will not be synchronized.
The 'Bypass rules on work item updates' permission in the Boards section of the project permissions must be set to Allow in order to impersonate the comment author.
If you are using Service Principal authentication, the steps described above for the user are also applicable to the Service Principal. For more information, refer to Service Principal Privileges.
User Privileges
The user can use either the Basic Authentication or the Personal Access Token authentication method to communicate with the Azure DevOps API.
In case of Personal Access Token authentication, check the Personal Access Token Permission section for the required permission details. Personal Access Token is supported for Team Foundation Server On-Premises (TFS instances with HTTPS installation only) with version 2017 and above, and for Azure DevOps.
For On-Premises deployments, either Basic or PAT authentication needs to be enabled on the server. Refer to Internet Information Services (IIS) Configurations to learn more about enabling Basic/PAT authentication in IIS.
If you want to synchronize User type fields of Azure DevOps with any other system using the default mapping generated by OpsHub Integration Manager, it is necessary that all users have their preferred e-mail address set in Azure DevOps.
The user for both the source and target systems requires a minimum access level of Basic + Test Plans to synchronize both query-based and requirement-based suites. Additionally, the user of the target system must also have at least Basic access to synchronize new tags. Refer to the Access Level documentation to know more about this access level or subscription for the user. Otherwise, Test Suite synchronization will result in a job error/sync failure: "You are not authorized to access this API. Please contact your project administrator".
If your Azure DevOps is configured with SSO, then the above-mentioned privileges and permissions are sufficient.
If Bypass Rules is set to Yes in the system configuration, make sure the user or Service Principal has the Bypass rules on work item updates permission set to Allow at the project level in Azure DevOps.
Personal Access Token Minimum Required Permission
Refer to the Create Personal Access Token section to learn how to create a Personal Access Token.
For On-Cloud Deployment
For On-Cloud deployments, the Personal Access Token should be created with Full access scope for entities such as Test Plan, Test Result, Test Suite, Test Run, Build, Team, User, Group & Permission. For other entities, the user can create a Personal Access Token with Full access scope if possible; otherwise, the user can create a Personal Access Token with a Custom defined scope with the essential permissions specified below.
Least required permissions for all entity types (except Version Control and Git)
| Permission Types | Required Permission Values |
|---|---|
| Identity | Read & Manage |
| Member Entitlement Management | Read & Write |
| Project and Team | Read, Write & Manage |
| Service Connections | Read, Write & Manage |
| Work Items | Read, Write & Manage |
| Graph | Read |
Additional permissions for specific entities (except Version Control and Git)
| Entity Types | Permission Types | Required Permission Values |
|---|---|---|
| Build | Build | Read |
| Test Case, Shared Parameters & Shared Step | Test Management | Read & Write |
| Dashboard & Widget | Tokens | Read & Manage |
Permissions required for Version Control and Git
| Permission Types | Required Permission Values |
|---|---|
| Code | Full |
| Identity | Read & Manage |
| Project and Team | Read, Write & Manage |
| Security | Manage |
Permissions required for Pipeline
| Permission Types | Required Permission Values |
|---|---|
| Build | Read & Execute |
| Secure Files | Read |
| Task Groups | Read |
| Variable Groups | Read & Create |
| Agent Pools | Read |

Note: If the build pipeline is created with TFSGit as the source code repository, you will need to provide additional permissions for Git data (as specified in Permissions required for Version Control and Git) while creating the Personal Access Token.
For On-Premises Deployment
The Personal Access Token should be created with Full access scope for all entities if the user is using an On-Premises deployed server.
Service Principal Privileges
It is applicable when the authentication mode is set to Service Principal - Client Secret or Service Principal - Client Certificate.
When these authentication modes are selected, the supported entity types are: Work Items, Build, Pipeline, Areas, Iterations, Test Entities (Test Plan, Test Suite, Test Run, Test Result), Shared Parameter, Git Commit Information.
Note: Entities not yet supported with Service Principal authentication: Pull Request, Query, Dashboard, Widget, User, Group, and Team.
The Azure DevOps collection must be connected to the Microsoft Entra (Azure Active Directory) tenant for which the Service Principal is being used.
Refer to the Secret key & Certificate section to learn how to generate a secret key or upload a certificate in Microsoft Entra (Azure Active Directory).
Service configuration
OpsHub Integration Manager requires this service to communicate with Azure DevOps. It acts as a translation layer between Azure DevOps and OpsHub Integration Manager and must be configured for synchronization with Azure DevOps.
Service pre-requisites
Operating System (Tested On): Windows Server 2008 R2, Windows Server 2012, Windows Server 2012 R2, Windows Server 2016, Windows 7, Windows 8, Windows 8.1, Windows 10
It is recommended to install the Service on a machine with a quad-core processor and a minimum of 4 GB RAM.
The required disk space for the Service depends upon the size of the source control data. It is recommended to have disk space greater than the total data size of the source control.
It is recommended to install the Service on a different machine, on which Team Foundation Server is not installed.
The OpsHub Integration Manager Service requires .NET Framework 4.7.2 or higher to be installed on the machine.
Note: Refer to the table below to check which entity types require this pre-requisite. A check mark indicates a mandatory pre-requisite, while a cross mark indicates an optional one.
| Entity Type | Azure DevOps Services | Azure DevOps Server (version >= 2020) | Azure DevOps Server (version < 2020) |
|---|---|---|---|
| Work Items | ❌ | ❌ | ✅ |
| Test entities (Test Suite, Test Plan, Test Run and Test Result) | ❌ | ❌ | ✅ |
| Area and Iteration | ❌ | ❌ | ✅ |
| Git Commit Information | ❌ | ❌ | ✅ |
| Pipeline | ❌ | ❌ | ✅ |
| Build | ❌ | ❌ | ❌ |
| Other entities | ✅ | ✅ | ✅ |
Follow the steps given below for installation:
1. Navigate to the path <OPSHUB_INSTALLATION_PATH>\Other_Resources\Resources.
2. Extract the OpsHubTFSService.zip package.
3. The Service will be installed on port 9090 by default. Check that the port you configure for the Service is available (the default port is 9090). Refer to the section How to change the port of service to learn how to change the default port of the Service.
4. Open the command prompt with Run As Administrator, navigate to the extracted folder in which registerTFSWCFService.bat is placed, and execute registerTFSWCFService.bat.
5. Once the command is executed, go to Windows Services and look for a service with the name OpsHubTFSService. Check whether the service has started; if it has not started, start the service.
6. Test the web service by opening this URL in a browser: http://<hostname>:<port>/TFSService, e.g. http://localhost:9090/TFSService. For troubleshooting, refer to the Service Troubleshooting section.
If the machine on which OpsHub Integration Manager is installed is behind a proxy (network proxy), perform the steps mentioned in the Proxy settings section.
It is also required to configure the proxy settings for the OpsHub Integration Manager Service; refer to Proxy settings for the OpsHub Integration Manager Service in the appendix section to learn the configuration steps.
Internet Information Services(IIS) Configurations
When TFS System is to be configured with 'Basic Authentication' in OpsHub Integration Manager
If the TFS Server version is >= 2015, the IIS setting for Basic Authentication needs to be Enabled. For the steps to enable Basic Authentication in IIS, please refer to this link.
Note: If Basic Authentication is used but this option is Disabled in IIS Manager, you might receive an 'Unauthorized Access' processing failure.
When TFS System is to be configured with Personal Access Token in OpsHub Integration Manager
The IIS setting for Basic Authentication needs to be kept Disabled.
Note: If a Personal Access Token is used but the Basic Authentication option is Enabled in IIS Manager, you might receive an 'Unauthorized Access' processing failure.
Please refer to this link for more information.
System Configuration
Before you continue to the integration, you must first configure Azure DevOps.
Click System Configuration to learn the step-by-step process to configure a system.
Refer to the screenshot given below.


Azure DevOps System Form Details
| Field Name | When field is visible on the System form | Description |
|---|---|---|
| System Name | Always | Provide the Azure DevOps system name. |
| Deployment Mode | Always | Choose the deployment type of the server. |
| Version | Deployment type is On-Premises | Enter the version of the Team Foundation Server. Refer to the section How to find Team Foundation Server's version to learn how to find the version of the installed Team Foundation Server. |
| Server URL | Always | In case of On-Premises deployment, set the URL to http://<host name>:<port no>/tfs; in case of a Visual Studio Team Services (VSTS On-Cloud) instance, set the URL to https://<instance name>.visualstudio.com; and for a new Azure DevOps (On-Cloud) instance, set the URL to https://dev.azure.com/<organization name>. |
| Authentication Mode | Always | Select the authentication mode you would like to use for communicating with the Azure DevOps system's API. |
| User Name | Deployment type is On-Premises | Enter the primary username with the user domain (if there is any). The user must have administrator privileges in the Project Administrators or Project Collection Administrators user group. Make sure this user has a unique display name across the instance. Refer to the Add User in Group section to learn how to add a user to a user group. |
| User Email | Deployment type is On-Cloud | Enter the user's email address. The user must have administrator privileges in the Project Administrators or Project Collection Administrators user group. |
| User Password | Authentication mode is Basic | For an On-Premises deployed server, enter the primary password. |
| Personal Access Token | Authentication mode is Personal Access Token | Enter the Personal Access Token generated for the integration user in Azure DevOps. Refer to the Create Personal Access Token section to learn how to create a Personal Access Token. |
| Tenant ID | Deployment type is On-Cloud & Authentication mode is Service Principal - Client Secret or Service Principal - Client Certificate | Enter the Tenant ID of the Azure Active Directory to which the organization is connected. This can be found in Microsoft Entra (Azure Active Directory). |
| Application ID | Deployment type is On-Cloud & Authentication mode is Service Principal - Client Secret or Service Principal - Client Certificate | Enter the Application (client) ID of a dedicated application for API communication with your Azure DevOps instance. This can be found in Microsoft Entra (Azure Active Directory). |
| Secret Value | Deployment type is On-Cloud & Authentication mode is Service Principal - Client Secret | Provide the secret value generated in Azure Active Directory for the application given in the "Application ID" input. This can be found in Microsoft Entra (Azure Active Directory) while generating the secret key. |
| Private Key | Deployment type is On-Cloud & Authentication mode is Service Principal - Client Certificate | Provide the private key of the certificate uploaded in Azure Active Directory for the application given in the "Application ID" input. |
| Thumbprint | Deployment type is On-Cloud & Authentication mode is Service Principal - Client Certificate | Provide the thumbprint of the certificate uploaded in Azure Active Directory for the application given in the "Application ID" input. This can be found in Microsoft Entra (Azure Active Directory) in the "Certificates & secrets" section. |
| Team Collection Name | Deployment type is On-Premises | Enter the collection name. For example, PrimaryCollection. |
| Service URL | Always | Provide the Service URL where the Service is installed. For example: http://<service_host>:<port>/TFSService. The Service URL is mandatory for all versions of Azure DevOps Server below 2020, regardless of the entity being integrated. For Azure DevOps Server 2020 and above and Azure DevOps Services, the Service URL is mandatory for these entities: Team, Group, User, Query, Dashboard, Widget, and Pull Request. |
| Bypass rules | Always | Setting Bypass Rules to 'Yes' disables the rules while writing changes to the system. This allows users to write invalid value(s) to any field in the system. To overwrite fields such as 'Changed By' and 'Changed Date', enable Bypass rules. Refer to Bypass Rule with User Impersonation in the appendix section to learn in detail about User Impersonation and Bypass Rule. Note: If Bypass Rules is set to 'Yes' in the system configuration, make sure the user or Service Principal has the 'Bypass rules on work item updates' permission set to Allow at the project level in Azure DevOps. |
Mapping Configuration
Map the fields between Azure DevOps and the other system to be integrated to ensure that the data between both the systems synchronizes correctly.
Click Mapping Configuration to learn the step-by-step process to configure mapping between the systems.

For Changed By and Changed Date synchronization, mark overwrite as true in the mapping (for the source system). Refer to the Overwrite section to learn how to mark a field as overwrite.
When Azure DevOps Server or Services is the target system and Iterations and Area Paths are not considered separate entities, the default behavior is to verify and create these entities within the Azure DevOps system if they are mapped as fields in work items or test entities.
If the iteration or area path does not already exist in the target system, it will be created (without start and end dates) when the checkAndCreate property is either not specified or set to 'true'. If these entities already exist, the new entity will be placed under the designated Area Path or Iteration.
If the user wants to disable this "Check and Create" behavior, they can set the checkAndCreate property to false. This will prevent the creation of new iterations or area paths in the ADO system if they do not already exist.
In such cases, processing will fail for any work item or test entity that references non-existent area paths or iterations.
<Area-space-Path checkAndCreate="false"> <xsl:value-of xmlns:xsl="http://www.w3.org/1999/XSL/Transform" select="SourceXML/updatedFields/Property/Area-space-Path"/> </Area-space-Path>

For Azure DevOps to Azure DevOps integration, if the source and target project names are different, then advanced mapping is required for the Path field. The mapping is as follows:
<xsl:choose xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
<xsl:when test="SourceXML/updatedFields/Property/Path !='<Source Project Name>'">
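<!-- The Path has nodes below the project root: replace the source project-name prefix with the target project name -->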
<Path>
<xsl:value-of select="concat('<Target Project Name>', substring-after(SourceXML/updatedFields/Property/Path ,'\'))"/>
</Path>
</xsl:when>
<xsl:otherwise>
<Path>
<xsl:value-of select="'<Target Project Name>'"/>
</Path>
</xsl:otherwise>
</xsl:choose>

If you want to create a mapping between HP Test and TFS Test Case with HP Test Parameters and TFS Parameters, then a few changes need to be made in the Parameter default mapping. Find the field mapping between the HP 'Test Parameters' field and the TFS 'Parameter' field, click 'View/Edit XSLT Configuration' to open the advanced mapping, and make the following changes:
1. In case of HP to TFS mapping, find <xsl:value-of select="value"/> in the default mapping and replace it with <xsl:value-of select="utils:convertHTMLToPlainText(value)"/>.
2. In case of HP to TFS mapping, find <xsl:value-of select="parameterName"/> in the default mapping and replace it with <xsl:value-of select="utils:replace(parameterName,' ','_')"/>. In the replace method, you can use any character to be substituted in place of the space.
3. In case of bi-directional configuration, from TFS to HP, find <xsl:value-of select="parameterName"/> in the default mapping and replace it with <xsl:value-of select="utils:replace(parameterName,'_','')"/>. The character provided in the second parameter of the replace method should be the same as the one given in the previous configuration during HP to TFS mapping.
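For reference, a minimal sketch of how the two HP-to-TFS replacements fit together is given below; the surrounding element names (Parameter, name, value) are illustrative placeholders and not the exact structure of the default mapping.
<Parameter xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
<!-- Parameter name with spaces replaced by underscores -->
<name><xsl:value-of select="utils:replace(parameterName,' ','_')"/></name>
<!-- Parameter value converted from HTML to plain text -->
<value><xsl:value-of select="utils:convertHTMLToPlainText(value)"/></value>
</Parameter>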
To synchronize the Steps field [having "Shared Steps"] of the Test Case entity to other systems, advanced mapping needs to be configured in OpsHub Integration Manager to convert Shared Steps into single-level steps. Given below is a sample advanced mapping from TFS to Jira that synchronizes the Steps field [having "Shared Steps"] of the Test Case entity to the Zephyr Teststep field of the Test entity along with formatting:
<Zephyr-space-Teststep xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
<xsl:variable name="testCaseId" select="SourceXML/opshubEntityId"/>
<xsl:variable name="orderedSteps" select="utils:getSharedStepsInSingleLevel($workflowId,$sourceSystemId,$testCaseId)"/>
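<!-- getSharedStepsInSingleLevel flattens Shared Steps into a single ordered list of step maps (position, action, expectedResult, description) -->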
<xsl:for-each xmlns:map="http://java.util.Map" select="$orderedSteps">
<xsl:variable name="testStep" select="."/>
<xsl:variable name="position">
<xsl:value-of select="map:get($testStep,'position')"/>
</xsl:variable>
<xsl:variable name="action">
<xsl:value-of select="map:get($testStep,'action')"/>
</xsl:variable>
<xsl:variable name="expectedResult">
<xsl:value-of select="map:get($testStep,'expectedResult')"/>
</xsl:variable>
<xsl:variable name="description">
<xsl:value-of select="map:get($testStep,'description')"/>
</xsl:variable>
<xsl:element name="{concat('_',$position)}">
<xsl:element name="order">
<xsl:value-of select="$position"/>
</xsl:element>
<xsl:element name="step">
<xsl:value-of select="utils:convertHTMLToPlainText($action)"/>
</xsl:element>
<xsl:element name="expected">
<xsl:value-of select="utils:convertHTMLToPlainText($expectedResult)"/>
</xsl:element>
<xsl:element name="description">
<xsl:value-of select="utils:convertHTMLToPlainText($description)"/>
</xsl:element>
</xsl:element>
</xsl:for-each>
</Zephyr-space-Teststep>

Test Point Advance Mapping Configuration
Test Point is an association between Test Suite and Test Case with configuration and tester. This association is synchronized by configuring the Test-Case linkage with Test Suite integration.

The Advance Mapping required for synchronizing configuration/tester with Test-Case linkage is given in the snippet below:
<OHEntityReferences>
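<!-- Source link-type names (element text) mapped to their target link types (targetLinkType attribute) -->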
<xsl:variable xmlns:xsl="http://www.w3.org/1999/XSL/Transform" name="nonDefaultLinks" as="item()*">
<Item targetLinkType="Test-Case-space-Linkage">Test-Case Linkage</Item>
<Item targetLinkType="Tested-space-By">Tested By</Item>
<Item targetLinkType="Test-space-Suite-space-Child">Test Suite Child</Item>
<Item targetLinkType="Test-space-Case">Test Case</Item>
<Item targetLinkType="Test-space-Suite-space-Parent">Test Suite Parent</Item>
<Item targetLinkType="Tests">Tests</Item>
</xsl:variable>
<xsl:variable xmlns:xsl="http://www.w3.org/1999/XSL/Transform" name="entityTypeMapping" as="item()*">
<Item targetEntityType="Test Case">Test Case</Item>
<Item targetEntityType="Test Suite">Test Suite</Item>
<Item targetEntityType="Test Result">Test Result</Item>
</xsl:variable>
<xsl:variable xmlns:xsl="http://www.w3.org/1999/XSL/Transform" name="defaultLinkWithSourceLinks" as="item()*" />
<xsl:variable xmlns:xsl="http://www.w3.org/1999/XSL/Transform" name="defaultLinkWithOH_DEFAULT" as="item()*" />
<xsl:variable xmlns:xsl="http://www.w3.org/1999/XSL/Transform" name="entityReferencecontext" select="/SourceXML/updatedFields/Property/OHEntityReferences/OHEntityReference" />
<xsl:variable xmlns:xsl="http://www.w3.org/1999/XSL/Transform" name="linksToBeCarriedFromSourceEvent" as="item()*">
<xsl:for-each select="$nonDefaultLinks">
<xsl:variable name="currentLinkType" select="." />
<xsl:if test="$entityReferencecontext/linkType[text()=$currentLinkType]">
<xsl:if test="$entityReferencecontext/links/EAILinkEntityItem">
<Item targetLinkType="{@targetLinkType}">
<xsl:value-of select="." />
</Item>
</xsl:if>
</xsl:if>
</xsl:for-each>
</xsl:variable>
<xsl:variable xmlns:xsl="http://www.w3.org/1999/XSL/Transform" name="linksToBeAddedAsDefault" as="item()*">
<xsl:for-each select="$defaultLinkWithOH_DEFAULT">
<Item lookupQuery="{@lookupQuery}" entityType="{@entityType}">
<xsl:value-of select="." />
</Item>
</xsl:for-each>
<xsl:for-each select="$defaultLinkWithSourceLinks">
<xsl:variable name="currentLinkType" select="." />
<xsl:if test="not($linksToBeCarriedFromSourceEvent[text() = $currentLinkType] = $currentLinkType)">
<Item lookupQuery="{@lookupQuery}" entityType="{@entityType}">
<xsl:value-of select="./@targetLinkType" />
</Item>
</xsl:if>
</xsl:for-each>
</xsl:variable>
<xsl:for-each xmlns:xsl="http://www.w3.org/1999/XSL/Transform" select="$linksToBeCarriedFromSourceEvent">
<xsl:variable name="currentLinkType" select="." />
<op_list>
<xsl:element name="{@targetLinkType}">
<xsl:for-each select="$entityReferencecontext[linkType=$currentLinkType]/links/EAILinkEntityItem">
<xsl:element name="{concat('_',position())}">
<xsl:element name="EntityType">
<xsl:variable name="sourceEntityType" select="entityType" />
<xsl:value-of select="$entityTypeMapping[text()=$sourceEntityType]/@targetEntityType" />
</xsl:element>
<xsl:element name="GlobalId">
<xsl:value-of select="linkGlobalId" />
</xsl:element>
<xsl:element name="LinkAddedDate">
<xsl:value-of select="linkAddedDate" />
</xsl:element>
<xsl:element name="LinkedID">
<xsl:value-of select="entityInternalId" />
</xsl:element>
<xsl:element name="IsExternalLink">
<xsl:value-of select="isExternalLink" />
</xsl:element>
<xsl:element name="EntityLinkComment">
<xsl:value-of select="linkComment" />
</xsl:element>
<xsl:if test="order!=''">
<xsl:element name="order">
<xsl:value-of select="order" />
</xsl:element>
</xsl:if>
<xsl:if test="id!=''">
<xsl:element name="id">
<xsl:value-of select="id" />
</xsl:element>
</xsl:if>
<config>
<xsl:for-each select="linkProps/Property/config/string">
<fieldvalue>
<xsl:value-of select="." />
</fieldvalue>
</xsl:for-each>
</config>
<testers>
<xsl:for-each select="linkProps/Property/testers/Property/*">
<xsl:element name="{name()}">
<xsl:for-each select="com.opshub.eai.metadata.UserMeta">
<fieldvalue>
<xsl:variable name="tgtUserOutput" select="userUtils:getUserNameFromEmail($workflowId, $targetSystemId, SourceXML/updatedFields/Property/testers, 'false', 'false')" />
<xsl:choose>
<xsl:when test="$tgtUserOutput != ''">
<xsl:value-of select="$tgtUserOutput" />
</xsl:when>
<xsl:otherwise>
<xsl:value-of select="userName" />
</xsl:otherwise>
</xsl:choose>
</fieldvalue>
</xsl:for-each>
</xsl:element>
</xsl:for-each>
</testers>
</xsl:element>
</xsl:for-each>
</xsl:element>
</op_list>
</xsl:for-each>
<xsl:for-each xmlns:xsl="http://www.w3.org/1999/XSL/Transform" select="$linksToBeAddedAsDefault">
<op_list>
<xsl:element name="{.}">
<xsl:element name="_1">
<xsl:element name="EntityType">
<xsl:value-of select="@entityType" />
</xsl:element>
<xsl:element name="LookupQuery">
<xsl:value-of select="@lookupQuery" />
</xsl:element>
</xsl:element>
</xsl:element>
</op_list>
</xsl:for-each>
</OHEntityReferences>

To preserve Test Case order, the OH Enable Rank field must be configured in the Test Suite mapping.
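A minimal sketch of such a mapping is given below, assuming the field is exposed to advanced mapping as "OH Enable Rank" and accepts the constant value 'true'; the element name and value are assumptions to be verified against the Test Suite mapping screen in OpsHub Integration Manager.
<OH-space-Enable-space-Rank xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
<!-- Assumed constant value; confirm the accepted values for OH Enable Rank in your mapping -->
<xsl:value-of select="'true'"/>
</OH-space-Enable-space-Rank>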
Lookup Fields Configuration
In Azure DevOps, if any lookup field contains a value that is the same as one of the values of the "State" field but with a different letter case, the lookup field value will not sync to the target. For example, if one of the states is "In Progress" and the lookup field value is "in progress", then "In Progress" (instead of "in progress") will be present in the mapping of the lookup field. Hence, the lookup field value "in progress" will not sync to the target.
Note: For the above-mentioned case, if the lookup field of Azure DevOps is mapped to a mandatory field of the target, a processing failure will be generated during synchronization.
The images below show the value list of the State field and one of the lookup fields. In both lists, the "In progress" option is common, but the alphabetical case is different.


The images below depict the sample mapping which will be generated when the lookup field contains the "in progress" option. The "In Progress" value is visible in the mapping.


As a corrective action, configure the advanced mapping and replace the current value in the mapping with the actual field value of the lookup field of Azure DevOps. For example, to sync the lookup field value "in progress" to the target, update the advanced XSLT as given below:
<xsl:variable xmlns:xsl="http://www.w3.org/1999/XSL/Transform" name="xPathVariable" select="SourceXML/updatedFields/Property/Lookup_Field_Name"/>
<xsl:choose xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
<xsl:when test="$xPathVariable='in progress'">
<xsl:value-of select="'target_field_option'"/>
</xsl:when>
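<!-- Optional addition (assumption, not part of the documented sample): pass any other lookup value through unchanged -->
<xsl:otherwise>
<xsl:value-of select="$xPathVariable"/>
</xsl:otherwise>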
</xsl:choose>

Relationship Configuration
Git Commit/Branch Link Configuration
To synchronize Git Commit/Branch links of an entity to other systems, the Commit/Branch links need to be mapped in OpsHub Integration Manager relationship mapping.
When the Git Commit/Branch links are mapped in OpsHub Integration Manager:
While synchronizing a work item, if any Git artifact's project or repository is not found in the target system, that artifact will be skipped by OpsHub Integration Manager. If any Git artifact is missing in the target repository, the work item's artifact link will be synced pointing to the missing object. On syncing delta changes, those links will be re-established with the artifact object if it is found in the target repository.
To sync delta changes from the source repository to the target repository, refer to this link for more details: https://docs.github.com/en/repositories/creating-and-managing-repositories/duplicating-a-repository#mirroring-a-repository.
For syncing Git Commit/Branch links with a work item to target [TFS/VSTS] systems, you must import the source repository into the target repository to bring all the Git commit and branch links into the target repository.
If the Commit/Branch link has a different project name or repository name in the target:
Provide the respective project name or repository name using advanced XSLT.
For example, suppose the source commit is found in project 'project-xyz' and repository 'repository-xyz', and the corresponding commit in the target is found in project 'project-abc' and repository 'repository-abc'. To sync the commit link of an entity, update the advanced XSLT from this:
<xsl:for-each select="linkProps/Property">
<xsl:for-each select="*">
<xsl:element name="{name(.)}">
<xsl:value-of select="."/>
</xsl:element>
</xsl:for-each>
</xsl:for-each>

to this:
<xsl:for-each select="linkProps/Property">
<xsl:for-each select="*">
<xsl:element name="{name(.)}">
<xsl:choose>
<xsl:when test="name(.)='GitProject'">
<xsl:if test=".='project-xyz'">
<xsl:value-of select="'project-abc'"/>
</xsl:if>
</xsl:when>
<xsl:when test="name(.)='GitRepository'">
<xsl:if test=".='repository-xyz'">
<xsl:value-of select="'repository-abc'"/>
</xsl:if>
</xsl:when>
<xsl:otherwise>
<xsl:value-of select="."/>
</xsl:otherwise>
</xsl:choose>
</xsl:element>
</xsl:for-each>
</xsl:for-each>

Mapping for Entity mention field
When Team Foundation Server ALM/Azure DevOps Services is configured as the source system in the integration and the field/comment type is rich text (HTML), entity mention synchronization is supported.
Click on Known Behaviors & Limitation to know about entity mention sync limitation for this system.
Click on Rank configuration to know more about entity mention mapping and synchronization behavior in general.
Mapping for Soft Delete Configuration
When Team Foundation Server is the target system, the Soft Delete operation is performed by default when synchronizing a Source Delete event.
After the Soft Delete operation is performed by OpsHub Integration Manager in Team Foundation Server, the entity is deleted in Team Foundation Server and can be found in the "Recycle bin" of the project in which it existed earlier.
To enable only the logical delete operation in the target, the "OH Soft Delete" field shall be mapped with the default value "No" in the Delete Mode mapping.
Note: The above behavior is supported only for work items. Additionally, it is supported from Team Foundation Server 2017 and above.
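A minimal sketch of such a Delete Mode mapping is given below, assuming the field is exposed to advanced mapping as "OH Soft Delete"; the element name and the accepted value are assumptions to be verified against the Delete Mode mapping screen in OpsHub Integration Manager.
<OH-space-Soft-space-Delete xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
<!-- Assumed constant default value as described above -->
<xsl:value-of select="'No'"/>
</OH-space-Soft-space-Delete>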
Kanban Board Field Configuration
To sync the Kanban Board field, advanced mapping is required in OpsHub Integration Manager.
Below is a sample advanced mapping for syncing the Kanban Board field between two Azure DevOps systems.
<Kanban-space-Board>
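<!-- One op_list entry is emitted per board entry on the source work item: team, board reference, column, column-done state, and swimlane -->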
<xsl:for-each xmlns:xsl="http://www.w3.org/1999/XSL/Transform" select="SourceXML/updatedFields/Property/Kanban-space-Board/list">
<op_list>
<xsl:element name="teamName">
<xsl:value-of select="Property/teamName"/>
</xsl:element>
<xsl:element name="boardRef">
<xsl:value-of select="Property/boardRef"/>
</xsl:element>
<xsl:element name="column">
<xsl:value-of select="Property/column"/>
</xsl:element>
<xsl:element name="columnDone">
<xsl:value-of select="Property/columnDone"/>
</xsl:element>
<xsl:element name="lane">
<xsl:value-of select="Property/lane"/>
</xsl:element>
</op_list>
</xsl:for-each>
</Kanban-space-Board>

Pipeline Variables Advance Mapping Configuration
To sync variables of pipeline, advance mapping is required in OpsHub Integration Manager.
Below is the sample advanced mapping for syncing Variables field:
<variables>
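<!-- Copy each pipeline variable element and its nested properties from the source event as-is -->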
<xsl:for-each xmlns:xsl="http://www.w3.org/1999/XSL/Transform" select="SourceXML/updatedFields/Property/variables/Property/*">
<xsl:element name="{name()}">
<xsl:for-each select="./Property/*">
<xsl:element name="{name()}">
<xsl:value-of select="."/>
</xsl:element>
</xsl:for-each>
</xsl:element>
</xsl:for-each>
</variables>

Comments Field Advance Mapping Configuration for Pipeline Entity
By default, the comments field is synchronized as-is for each revision of the pipeline entity.
If you need to add the actual revision time and user email to each revision comment, the following XSLT can be used:
<comment xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
<xsl:variable name="date" select="SourceXML/updatedFields/Property/createdDate"/>
<xsl:variable name="user" select="SourceXML/updatedFields/Property/authoredBy/userEmail"/>
<xsl:variable name="info" select="concat('[ Originally changed by ',$user,' on ', $date, ' ]')"/>
<xsl:variable name="comment" select="SourceXML/updatedFields/Property/comment"/>
<xsl:value-of select="concat($comment, ' ',$info)"/>
</comment>

Perform check & create for Variable Groups in Pipeline
To perform check & create for Variable Groups in pipeline, Variable Group details field should be mapped.
Advanced mapping is required for the same in OpsHub Integration Manager. Below is the sample advanced mapping:
<Variable-space-group-space-details>
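<!-- One op_list entry is emitted per variable group associated with the pipeline, copying its variables, type, name, description, and id -->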
<xsl:for-each xmlns:xsl="http://www.w3.org/1999/XSL/Transform" select="SourceXML/updatedFields/Property/Variable-space-group-space-details/list">
<op_list>
<xsl:element name="variables">
<xsl:for-each select="Property/variables/Property/*">
<xsl:element name="{name()}">
<xsl:for-each select="./Property/*">
<xsl:element name="{name()}">
<xsl:value-of select="."/>
</xsl:element>
</xsl:for-each>
</xsl:element>
</xsl:for-each>
</xsl:element>
<xsl:element name="type">
<xsl:value-of select="Property/type"/>
</xsl:element>
<xsl:element name="name">
<xsl:value-of select="Property/name"/>
</xsl:element>
<xsl:element name="description">
<xsl:value-of select="Property/description"/>
</xsl:element>
<xsl:element name="id">
<xsl:value-of select="Property/id"/>
</xsl:element>
</op_list>
</xsl:for-each>
</Variable-space-group-space-details>

While configuring integration for the same, Default Integration Workflow Pipeline should be selected to perform check & create for variable groups. For more details, refer to Workflow Association.
Integration Configuration
In this step, set a time to synchronize data between Azure DevOps and the other system to be integrated. Also, define parameters and conditions, if any, for integration. Click Integration Configuration to learn the step-by-step process to configure integration between two systems.

Criteria Configuration
If you want to specify conditions for synchronizing an entity between Azure DevOps and the other system to be integrated, you can use the Criteria Configuration feature.
To configure criteria in Azure DevOps, the integration needs to be created with Azure DevOps as the source system. A query in the Azure DevOps system is a valid Azure DevOps project query whose criteria use the reference names of the fields available in the Azure DevOps project. Values for the criteria fields are the same as the display values in the Azure DevOps system's UI.
To know the reference name of Azure DevOps work-item fields, refer to the section Find Reference name of field. It will open a window in which you can find the Ref Name of the field for which you want to write the query.
Note: The table Sample Criteria Examples includes examples for all work items except Test Suite (TFS < 2013), the Build entity, and Pull Request, for which separate tables are included below.
Sample Criteria Examples
| Field Type | Criteria Description | Criteria Snippet |
|---|---|---|
| Lookup | Synchronize all entities which have priority value as '1' | [Microsoft.VSTS.Common.Priority] = '1' |
| Lookup | Synchronize all entities which have backlog status value 'Active' or 'Done' | [Microsoft.VSTS.Common.State] in ('Active', 'Done') |
| Text | Synchronize all entities which contain 'Bug Title' in title field | [System.Title] = 'Bug Title' |
| User | Synchronize all entities which are created by '[email protected]' user | [System.CreatedBy] = '[email protected]' |
| Lookup and User | Synchronize all entities which are assigned to '[email protected]' user and have priority '1' | [System.AssignedTo] = '[email protected]' and [Microsoft.VSTS.Common.Priority] = '1' |
Sample Criteria Examples for 'Test Suite' entity (Team Foundation Server version < 2013)
| Field Type | Criteria Description | Criteria Snippet |
|---|---|---|
| User | Synchronize all entities which are updated by 'TestUser' | LastUpdatedBy='TestUser' |
| Text | Synchronize all entities whose title contains 'Demo' | Title contains 'Demo' |
| Text | Synchronize all entities whose status is 'In Progress' | Status = 'In Progress' |
| Date | Synchronize all entities which are updated before 01 Jan 2020 | LastUpdated < '2020-01-01 00:00:00.000' |
| Text | Synchronize entity whose id is 10 | SuiteId = '10' |
| Text | Synchronize all entities whose description contains 'Test Description' | Description Contains 'Test Description' |
Note: For Team Foundation Server version >= 2013 or Azure DevOps, refer to the table Sample Criteria Examples above.
Sample Criteria Examples for 'Build' entity
| Field Type | Criteria Description | Criteria Snippet |
|---|---|---|
| Lookup | Synchronize builds data including the deleted builds | deletedFilter=includeDeleted |
| Lookup | Synchronize builds data with the specific result | resultFilter=failed |
| Multiple lookup | Synchronize builds where result is succeeded and status is completed | resultFilter=succeeded&statusFilter=completed |
| Text multivalue list | Only synchronize builds with id 38 and 39 | buildIds=38,39 |
You can refer to the Microsoft API documentation to check all the possible criteria available for the Build entity. Special symbols [%, $, !, |] are not supported in criteria.
Sample Criteria Examples for 'Pull Request' entity
| Field Name | Criteria Description | Criteria Snippet |
|---|---|---|
| Status | Synchronize only active state Pull Requests | searchCriteria.status=active |
| Status | Synchronize only completed state Pull Requests | searchCriteria.status=completed |
| Source Branch Name | Synchronize all Pull Requests having the "main" source branch name | searchCriteria.sourceRefName=refs/heads/main |
| Target Branch Name and Status | Synchronize all completed state Pull Requests having the "main" target branch name | searchCriteria.targetRefName=refs/heads/main&searchCriteria.status=completed |
Note: If searchCriteria.status is not used in the query, the active state is assumed automatically.
You can refer to the Microsoft API documentation to check all the possible criteria available for the Pull Request entity.
Sample Criteria Examples for 'Pipeline' entity
| Field Name | Criteria Description | Criteria Snippet |
|---|---|---|
| Name | Synchronize all entities with the name 'TestPipeline' | name=TestPipeline |
| Path | Synchronize all entities present on the folder path "\Pipeline" | path=%5CPipeline |
| Repository Id and Repository Type | Synchronize all entities with repository Id "$/" and repository type "TfsVersionControl" | repositoryId=$%2F&repositoryType=TfsVersionControl |
Note: Set the query as per Native ADO URL encoded query format.
Refer to the Microsoft API documentation to check all the possible criteria available for the Pipeline entity.
You can find more Criteria Configuration details on the Integration Configuration page.
Target LookUp Configuration
Target Lookup Queries for Work Items
Provide a query in Target Search Query such that it is possible to search for the entity in Azure DevOps as the destination system.
General syntax: [Target_System_Field_Reference_Name] operator (=, in, under, not under, <, >, <>, etc.) @Source_System_Field_name@
Sample queries for work items:
Target Lookup query based on title field: [System.Title] = '@Title@'
Target Lookup query based on AreaPath field: [System.AreaPath] under '@AreaPathValue@'
Supported Target Lookup Query for Query Entity
The query must be in the format: Path=@path@/@name@
Here, @path@ and @name@ are internal field names (Folder and Name respectively), and are dynamically replaced from the source query.
If a query named TestQuery exists in the folder Shared Queries/FolderA, then the target lookup query becomes:
Shared Queries/FolderA/TestQuery
Supported Target Lookup Query for Pipeline Entity
The query must be in the format: name=@name@
Supported Target Lookup Queries for Other Entities
Users: Supported user attributes and their equivalent queries:
Username: UserName=@FullUserName@
Display Name: UserDisplayName=@UserDisplayName@
Email address: UserEmail=@UserEmail@
Groups: Only supported on the group name attribute: GroupName=@Name@
Teams: Teams can only be queried by name: Name=@Name@
Configuring Rich Text Field Format for Write Operations
Azure DevOps Services (Cloud) now supports using Markdown in rich text fields like Description and Acceptance Criteria.
By default, these fields use HTML, but you can now choose to use Markdown when syncing data.
This configuration option in OpsHub Integration Manager lets you decide whether a rich text field should be written in HTML or Markdown when syncing to Azure DevOps Services as the target system.
Steps to Configure
To configure the rich text field format, navigate to 'Override parameters for write operations (Destination)' in the entity-level advanced configurations.
As shown in the image below, select either HTML or Markdown as the target format in the Rich Text Field Format field.

Note: If no format is selected, the content will be written in HTML by default.
Best Practices
Keep the format consistent: Once you choose HTML or Markdown, avoid changing it later. Switching formats after writing data can cause display/rendering or sync issues.
Use HTML for complex formatting: As per Microsoft guidelines, if your content has more complex HTML formatting, it is recommended to keep it in HTML only.
Remote Link and Id field Configuration (Important): When using Azure DevOps Services as the source system, it is recommended not to change the format of the field used for remote links. By default, OpsHub Integration Manager uses HTML format to write the remote link.
Meta Entities
OpsHub Integration Manager supports migration of meta-entities including Users, Groups, Teams, Areas, Iterations & Security Permissions for Team Foundation Server and Azure DevOps.
Supported versions of Team Foundation Server are listed in the Systems Supported List.
Users
Pre-requisite: The same set of users must exist in both the source and target systems, and domain names must match for successful migration.
Behavior: Users are not created in the target system but rather linked to their equivalents. This enables OpsHub Integration Manager to use source user equivalents during other migrations (e.g., assign work-items, impersonation).
Known Issues: If a user exists in the source but not in the target, the migration user (i.e., the integration user) will be assigned to all related changes.
Groups
Pre-requisite: Source and target should either use the same Active Directory or have AD groups with identical names. AD groups in the source must exist as members in at least one native group in the target.
Example target lookup query:
GroupName=@Name@&Requestor=@Requestor@

Behavior:
Default Collection/Project group(s) are not duplicated on the target side. They will be auto-detected, and their hierarchy and permissions will be updated as per the source. 'Members' and 'Member of' relationships will be set as in the source.
For Collection Level Group synchronization, the integration user must be a Project Collection Administrator to sync 'Members' and 'Member of' relationships. Otherwise, synchronization may fail due to insufficient permissions.
Active Directory Groups:
Active directory group(s) will not be duplicated or created on the target side due to unavailability of the APIs.
They will be auto-detected, and their hierarchy and permissions will be updated as per the source via OpsHub Integration Manager.
If groups are missing in the target:
Same AD: Add AD groups to a native group in the target.
Different AD: Create groups with matching names in the target AD and assign them as needed.
Teams
Behavior: Collection/Project teams are not duplicated. Hierarchy and permissions are updated based on the source system.
Areas & Iterations
Behavior: Default project-level nodes are not duplicated. Existing nodes are updated. Hierarchical Area and Iteration nodes are created to match the source.
Security Permissions
Pre-requisite: The migration user must have permissions to read security namespaces and user/group permissions in the source.
Behavior: Permissions are migrated for:
Collection level (Users & Groups)
Project level (Users, Groups & Teams)
Area & Iteration nodes
Version Control paths
Build definitions
If a security namespace is missing in the target, related permissions are ignored.
Known Issues:
Permissions with the value 'Not Set' in the source overwrite target values. To retain existing values in the target, remove the Permissions field from the mapping configuration.
The following collection-level permissions are not supported:
Delete team project (unless at user/group level)
Delete team project collection
Widgets
Widgets can refer to various items like Queries, Teams, Projects. To resolve the correct references in the target:
A JSON input is required that defines:
Referenced item types per widget
Their location within the API response
If not provided, OpsHub Integration Manager uses a default JSON.

JSON Structure Overview
The JSON input consists of the following sections:
generic: Defines a set of default reference rules for widgets that do not have specific configurations in the JSON input. Each object in this section contains:
referenceTypes (array of strings): Specifies the types of referenced items (e.g., "Query", "Team"). Other than the entities synced by OpsHub Integration Manager, it can have the following values: Release (for Release Pipelines), Project, Repository (for Git Repos).
jsonPath (string): A valid JSON Path expression to locate values in the API response. Either jsonPath or regex must be provided.
regex (string): A regular expression to search for referenced IDs within the API response. When combined with jsonPath, the search is confined to values found at the specified path.
widgetSpecific: Defines widget-specific reference rules for certain widget types. Each object in this section contains:
widgetType (string): Specifies the widget type, corresponding to the contributionId key in the widget API response.
referenceInformation (array of objects): A list of reference rules specific to this widget type. Each object in this list contains:
referenceTypes (array of strings): Same as described above.
regex (string): Same as described above.
jsonPath (string): Same as described above.
Note: Using jsonPath is preferred for accurate transformation of referenced IDs.
A sample snippet of JSON is given below:
{
"generic": [
{
"referenceTypes": [
"Team",
"Query",
"Project"
],
"regex": "[0-9a-fA-F]{8}\\-[0-9a-fA-F]{4}\\-[0-9a-fA-F]{4}\\-[0-9a-fA-F]{4}\\-[0-9a-fA-F]{12}"
}
],
"widgetSpecific": [
{
"widgetType": "ms.vss-dashboards-web.Microsoft.VisualStudioOnline.Dashboards.QueryScalarWidget",
"referenceInformation": [
{
"referenceTypes": [
"Query"
],
"jsonPath": "$.queryId"
}
]
}
]
}

Known Behaviors & Limitations
Common
For synchronization of user type fields, make sure all users have signed in to the organization at least once or are assigned to at least one work item.
Reason: ADO\TFS API Limitations.
The above behavior has been confirmed with Microsoft. Please refer to the thread for more information: https://developercommunity2.visualstudio.com/t/all-aad-users-not-coming-in-response/1243303?from=email&viewtype=all#T-ND1246993
For Team Foundation Server as the target system, if the attachment file name contains Windows-invalid file name characters (<, >, :, ", /, \, |, ?, *), then those characters will be replaced by an underscore (_).
Reason: ADO\TFS API Limitations.
To avoid this replacement, it is recommended to follow the file naming conventions mentioned in Microsoft File Naming Conventions.
For the rich text type of field (HTML) or comments:
Entity mention synchronization is not supported for entity type(s) Test Suite, and Test Plan.
Entity mention synchronization is not supported for the Team Foundation Server ALM with version < 2015.
The default entity mention synchronization option is Sync source id. So, the mentioned entity will be synchronized as the source entity's ID in the target.
Refer "Mention Sync Option" for more details.
The following fields are read-only and can only be synchronized from Azure DevOps to other systems:
Area Id, Attached File Count, Authorized As, Authorized Date, Board Lane, External Link Count, Hyperlink Count, ID, Iteration Id, Node Name, Related Link Count, Rev, Revised Date, Team Project, Work Item Type, Board Column, Board Column Done
Specific Authentication Mode
Service Principal - Client Secret & Service Principal - Client Certificate
When these authentication modes are selected, the supported entity types are: Work Items, Build, Pipeline, Areas, Iterations, Test Entities (Test Plan, Test Suite, Test Run, Test Result), Shared Parameter, Git Commit Information.
Reason: At present, only these entities use REST APIs. Other entities make use of both REST APIs and the Object Model.
Note: Entities not yet supported with Service Principal authentication: Pull Request, Query, Dashboard, Widget, User, Group, and Team.
The User Mention functionality can be used to mention a User, but it does not work for a Service Principal.
Reason: Azure DevOps does not allow mentioning a Service Principal in the UI.
Note: When Azure DevOps is configured as the target system, it is recommended to map the default user instead of the Service Principal for User Mention. If the Service Principal is mapped, it will not result in a failure; however, an email will be sent by Azure DevOps saying "ServicePrincipalName cannot be mentioned. The identity is not configured to receive notifications."
Work Item Entities (Bug, User Story, Task, etc)
Inline images that are added using the Microsoft's "Test & Feedback" tool and identity images (user profile images) will not be synchronized.
When TFS\ADO is a source endpoint, any change performed in links/relationships among entities will be synchronized to the target with the next update on those entities.
Reason: ADO\TFS API Limitations.
If a link/relationship's comment is updated in TFS/ADO, then this comment update will not be synchronized to the target system.
In the above-mentioned case, a processing failure will be observed in OpsHub Integration Manager.
Links of 'Build', 'Integrated in Release', 'Pull Request', 'Tag' (Repository Tag), 'Versioned Item', 'Wiki page', 'GitHub Commit', 'GitHub pull request' and 'GitHub Issue' are not supported.
Synchronization of Kanban Board field is supported for ADO/TFS version 2019 and above.
Comment Author details in case of 3-way integration with Team Foundation Server as middle system [System 1 -> Team Foundation Server -> System 2]
This limitation is applicable only when integrations have more than 2 systems involved. For example, if there is one integration from System 1 to Team Foundation Server and another from Team Foundation Server to System 2.
In this case, if impersonation is configured for Team Foundation Server [and in the system configuration the username given is IntegrationUser] and the Changed By field is mapped in the System 1 -> Team Foundation Server integration, then when user1 adds a comment in System 1 it gets synchronized to Team Foundation Server as user1 [assumption: user1 is present in System 1 and TFS], but the Team Foundation Server to System 2 integration will still have the comment author as IntegrationUser (not user1).
Entity created by an integration user won't get polled if that particular entity does not meet the criteria.
This limitation is applicable when bi-directional integration is configured, the criteria is configured with the end system storage setting, and the bypass rule is enabled for the Azure DevOps system.
OIM normally polls entities created by the integration user even if they do not meet the criteria, but due to certain API limitations this is not possible for a system with the above configuration.
Example: Suppose a bi-directional integration is configured between the X system and the TFS system. Further, criteria with end system storage is configured for the integration from TFS Bug to X System Bug, and the bypass rule is enabled for the TFS system used in the integration from X System Bug to TFS Bug. Entities created by OIM in the TFS system via the X System Bug to TFS Bug integration will not get polled by the TFS Bug to X System Bug integration if they do not meet the criteria configured for that integration. Refer to the screenshots below for more clarity on the configuration along with the workaround.
Workaround: Edit the mapping of the X System Bug to TFS Bug integration to map the field used for the end system storage criteria setting of the TFS Bug to X System Bug integration to -NONE- with the value 'True'.
For Team Foundation Server with version equal to or above 2017, the Remote URL will be different from the remote URLs of the older versions of Team Foundation Server. Also, for Azure DevOps, the Remote URLs will be different.
Test Entities (Test Case, Test Plan, Test Suite, Test Result, Test Run)
Impersonation is not supported.
Entity Specific
Following are the limitations and behaviors specific to the individual entities in addition to the common:
Test Case
Test Parameters having only numeric characters in their name are not supported.
If Test Case step field like Action or Expected Result contains only image and no text, then Azure DevOps does not render the inline image on user interface. However, the inline image can be seen from history.
If Test Case step field contains a hyperlink, then Azure DevOps does not allow to click on the link from user interface.
If only test case steps, parameters, or parameter values are updated, those updates will not be synced to the target. To overcome this limitation, it is recommended to use the overwrite option for all these fields while configuring the field mapping.
Test Plan
REST API–based synchronization is not supported for on-premises instances of Azure DevOps Server prior to version 2020.
Links of 'Remote Work', 'Release pipeline', 'Build', 'GitHub', 'Git' and 'Wiki' with Test Plan are not supported. Test Run settings, Outcome settings, MTM settings and MTM environments are not supported in Test Plan.
For TFS versions below 2013 Update 3, Test Plan is not supported as a separate entity; a Test Plan with a few fields will be synchronized along with Test Suite synchronization. Also, Test Plans with duplicate names will not get synchronized.
Test Plan as a separate entity is supported from OIM version 7.46. For fresh project synchronization, it is recommended to synchronize Test Plan separately, followed by Test Suite, Test Run and Test Result. After migrating to OIM version >= 7.46 from older version, if you want to synchronize Test Plan separately in running integrations, please refer to post migration guideline for Test Plan entity support.
Test Suite
REST API–based synchronization is not supported for on-premises instances of Azure DevOps Server prior to version 2020.
Test Suite will migrate only the current state for on-premises instances with version 2013 (Update 3) and below.
Synchronization of Test Case chart and Test Result chart created within test suite is not supported.
Query-Based Suite or Requirement-Based Suite.
Once a Query-Based Suite or Requirement-Based Suite is synced to the target system, any new Test Case linkage that is added to the Test Suite afterwards due to a modification in a test case will not sync to the target system, and any Test Run with the corresponding test point will result in a processing failure. In such a case, click here for the troubleshooting steps.
Any update in Test Suite Configuration will only synchronize when the test suite is updated.
Ordering of Test Cases added to a Test Suite is supported only for version 2019 onwards for on-premises deployments (i.e. Team Foundation Server) and for all cloud deployments (i.e. Azure DevOps). In addition, ordering is only possible when the user has selected Personal Access Token as the authentication type in the system configuration. Refer to the section Create Personal Access Token.
If the source endpoint is Team Foundation Server with a version lower than 2017, or the target endpoint is not Azure DevOps, all types of Test Suite (Static/Query-based/Requirement-based) will be migrated as Static Suites.
If the source endpoint is Azure DevOps or an on-premises deployment (i.e. Team Foundation Server) with version 2017 onwards and the target endpoint is Azure DevOps, then a Static suite is synchronized as a Static Test Suite, a Requirement-based test suite is synchronized as a Requirement-based test suite, and a Query-based test suite is synchronized as a Query-based test suite.
* The user of both the source and target endpoints requires the access level Basic + Test Plans in the end system to synchronize query-based and requirement-based suites. Refer to [Access Level](https://docs.microsoft.com/en-us/azure/devops/organizations/security/access-levels?view=azure-devops) to know more about this access level or subscription for the sync user. Otherwise, Test Suite synchronization will result in a job error/sync failure: "You are not authorized to access this API. Please contact your project administrator".
* Synchronization behavior of the **Query Text** field of a Query-based Test Suite:
  * The Query-based test suite has a field **Query Text** that represents the actual criteria given in the Query Suite entity. The **Query Text** follows a specific format, for which you can refer to [WIQL syntax](https://docs.microsoft.com/en-us/azure/devops/boards/queries/wiql-syntax?view=azure-devops).
  * Refer to the section Synchronization Behavior of fields with WIQL format to know the general synchronization behavior applicable to this type of field. The following behaviors are specific to the Query Text field of the Test Suite entity:
  * It is recommended that both the source and target endpoints have identical templates (fields, lookups, iterations, areas, etc.) to synchronize the Query Text field of the Query-based suite. Any difference in the templates could lead to a mismatch in Test Case association and cause a Test Suite sync failure, which may require the end user to manually correct the Query Text field of the Test Suite in the source or target end system and retry the failure.
  * **User values mentioned in Query Text**
    * A Query Text field with a user type field clause will only transform the user(s), not a Group or Team present as part of the clause value.
    * The user will be transformed to the corresponding target end system user as per the user mentions mapping of the field **Query Text**.
  * **Id values mentioned in Query Text**
    * In the **Query Text** field, an id clause can refer to a particular work item of type Test Case or a set of them.
    * The synchronized test case will have a different id in the target system than the source entity. It is therefore required to transform the id clause as per the target end system to avoid a mismatch in the association of Test Case(s) with the Test Suite between the source and target systems. By default, the Query Text field with an ID field clause will not be changed as per the target entity id. Perform the following configuration(s) in order to transform the ID as per the target end system:
    * Create a custom field with any name, but of type Integer, for the Test Case entity in the target system.
    * Configure the Remote Id field for the Test Case integration using the above-created custom field prior to synchronizing Test Case(s).
    * Configure an advanced mapping for the field Query Text to replace the ID field with the created custom field. A sample advanced mapping to replace the ID field with a custom field named "Source Workitem ID" is given below.
    * If the type of the above-created custom field is "String", then:
      * The synchronization of the Query Text field with an ID clause is restricted to operators which are compatible with a String type of field, for example >, <, =, <=, >=, <>, In, Not In, etc. Operators which are only compatible with the Integer type of field and not with the String type will cause a sync failure for the Test Suite. Such incompatible operators are [=Field], [>Field], [>=Field], [<=Field], [<>Field], etc. The custom field used to replace the ID field, being of String type, requires a compatible operator and value.
      * It is required to use the advanced workflow **Default Integration Workflow For TFS to TFS Test Suit.xml** to synchronize the Query Text with an ID clause when the target end system's custom field is of String type.
  * **Sample Advance Mapping to replace the ID field with the custom field named Source Workitem ID**
* **Transform the field clause of Query Text field having different source and target field value(s)**
* Scenario: Different lookups of a field in the source and target end systems for the Test Case entity.
* Example: Suppose the valid lookups of the Priority field of the source system are 1, 2, 3, 4, whereas the Priority lookups of the target system are 1, 2, 3. As per the Test Case mapping, the lookup 4 missing in the target system is mapped to 3.
* **Sample Query Text of test suite entity of source system**
```sql
select [System.Id], [System.Title], [System.AssignedTo], [System.AreaPath]
from WorkItems
where [System.TeamProject] = @project
and [System.WorkItemType] in group 'Test Case Category'
and [Microsoft.VSTS.Common.Priority] = 4
and [System.Id] >= 1
and [System.State] = 'Closed'
and [System.Reason] = 'Duplicate'
order by [System.Id]
```
* **Expected Query Text to be synchronized in target system**
```sql
select [System.Id], [System.Title], [System.AssignedTo], [System.AreaPath]
from WorkItems
where [System.TeamProject] = @project
and [System.WorkItemType] in group 'Test Case Category'
and [Microsoft.VSTS.Common.Priority] = 3
and [System.Id] >= 1
and [System.State] = 'Closed'
and [System.Reason] = 'Duplicate'
order by [System.Id]
```
* **Sample Advance mapping for query text for above transformation**
* Following is the sample mapping which will transform the Priority field clause of the Query Text field of the Test Suite entity as per the given Test Case mapping. After transformation using the below sample mapping, the field clause [Priority] = 4 will be transformed to [Priority] = 3.
```xml
<Query-space-Text>
  <!-- Read the source value of the Query Text field -->
  <xsl:variable name="xpathQueryText" select="SourceXML/updatedFields/Property/Query-space-Text"/>
  <!-- Transform the Priority clause value (e.g. [Priority] = 4 or [Priority] in 4) as per the 'Test Case Mapping' field mapping -->
  <xsl:variable name="xpathFieldPriority" select="utils:transformWIQLClauseAsPerMapping($xpathQueryText,'Priority','\\[Priority\\] = ([0-9]+),\\[Priority\\] in ([0-9]+)','Test Case Mapping',false(),$sourceSystemId,$targetSystemId)"/>
  <xsl:value-of select="$xpathFieldPriority"/>
</Query-space-Text>
```
* **About utility method `transformWIQLClauseAsPerMapping`**
This method requires the following inputs, in this sequence:
* **Query Text:** Source value of the Query Text field whose field clause is to be transformed as per the target system.
* **Clause Field Name:** Name of the field whose clause needs to match against the following RegEx input.
* **Clause Match RegExSet:** Comma-separated RegEx to match the possible value part of the given field clause, used to extract the value from the source query and replace it as per the Test Case mapping input.
* **Mapping Name:** Name of a valid Test Case mapping to transform the given field clause. This mapping is expected to have a mapping for the field name given as the second input parameter.
* **Enclosed Value With Quote:** Boolean parameter indicating whether to have the transformed value within single quotes or not. If invoked as `true()`, the value will be enclosed in single quotes in the resulting query text. If `false()`, then the transformed value will not be enclosed.
* **Source System Id:** System id of the source end system.
* **Target System Id:** System id of the target end system.
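For illustration only, a hypothetical variant of the earlier Priority mapping for a quoted lookup clause such as State is sketched below; the field name, regular expression, and quote escaping are assumptions and may need adjustment for your template and Test Case mapping:

```xml
<Query-space-Text>
  <xsl:variable name="xpathQueryText" select="SourceXML/updatedFields/Property/Query-space-Text"/>
  <!-- Hypothetical sketch: transform the quoted State clause value (e.g. [State] = 'Closed') as per the
       'Test Case Mapping' field mapping; true() re-encloses the transformed value in single quotes. -->
  <xsl:variable name="xpathFieldState" select="utils:transformWIQLClauseAsPerMapping($xpathQueryText,'State','\\[State\\] = ''([^'']+)''','Test Case Mapping',true(),$sourceSystemId,$targetSystemId)"/>
  <xsl:value-of select="$xpathFieldState"/>
</Query-space-Text>
```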
{% endif %}

## **Test Result and Test Run**

REST API–based synchronization is not supported for on-premises instances of Azure DevOps Server prior to version 2020.
Test Run and Test Result migrate only the current state. Any changes in the target system after synchronization may show as data inconsistency between the two endpoints.
The following Test Results and Test Runs will not be synchronized (such Runs and Results are logged in the OpsHub Integration Manager logs):
Run and Result created with a Test Suite that does not exist in the source.
Run and Result created with a Test Case that was not associated with the Test Suite when synchronization was performed.
Any Run and Result created with a Test Case/Configuration that was associated with the Test Suite after synchronization.
Run and Result existing directly in the Plan.
Automated Run and Result are not supported.
The following limitations apply only to Test Result:
Synchronization of video format attachments of Result Steps and Results is not supported.
Synchronization of Parameter Values for Step Results is not supported.
Reason: ADO/TFS API limitations.
The 'Duration' field will be synchronized only when the value of the 'Status' field is 'completed'.
Meta Entities (User, Group and Team, Area, Iteration)
Impersonation is not supported.
Synchronization of Meta Entities for Team Foundation Server 2010 or lower is not supported.
OpsHub Integration Manager will not sync the following two permissions at the collection level for Groups and Users due to a lack of API support:
Delete team project
Delete team project collection
However, the first permission (Delete team project) will be set at the project level for a group or a user.
Synchronization of groups with reserved names is only possible if they are present in the target system. If such groups are not present in the target system, processing failures will be observed in OpsHub Integration Manager.
Reason: Groups cannot be created with reserved names; e.g., the group name 'Endpoint Creators' is reserved by the end system. While trying to create this group, a failure error message will be generated: 'Cannot complete the operation because the group name 'Endpoint Creators' is reserved by the system.'
The user needs to manually delete this failure and start the synchronization again.
If integration user is not a member of Project Collection Administrators group, collection level permissions will not be synchronized.
The following are the limitations of OpsHub Integration Manager when syncing Area or Iteration:
Target Lookup Query is supported for only one field, i.e., Path, and the query must be Path=@Path@ for Team Foundation Server to Team Foundation Server integration.
Recovery functionality is effective only when Manual Conflict Detection is turned off for the field Path. It can be set to 'disable conflict detection' or enabled with either 'Source Wins' or 'Target Wins'.
Restart Team Foundation Server and OpsHubTFSService.
Dashboard/Query/Widgets Entities
Pull Request
Pipeline Entity
Attachments, Comments, and Inline images' synchronization is not supported.
Reason: Pipeline does not have Attachments, Comments, and Inline images.
End System Criteria Storage is not supported.
Reason: Pipeline does not have any custom fields.
Service Connections, Agent Pools, Secure Files, Task Groups, and Azure Git Repositories with the same names in the source system must be present in the target system to avoid any sync failures.
Impersonation is not supported.
Release Pipeline is not supported [only Build Pipeline is supported].
For on-premises deployment, there is a Retention tab in the Pipeline entity [not available in cloud deployment]. Synchronization of this Retention tab is not supported.
Process parameters are not supported.
During Pipeline entity synchronization, a processing failure may occur while syncing the Service Connection for the use case mentioned below. For more details on the next steps, refer to this section.
Use case: Service Connection 1 was associated with some steps of a job in the Pipeline entity. The user changed the Service Connection from Service Connection 1 to Service Connection 2 and deleted Service Connection 1 from the end system.
Appendix
Query Synchronization
The Query entity has a field WIQL that represents the actual criteria that has been given in the Query. The WIQL follows a specific format for which you can refer to [WIQL syntax](https://docs.microsoft.com/en-us/azure/devops/boards/queries/wiql-syntax?view=azure-devops). As WIQL is an internal format of Team Foundation Server/Azure DevOps, it will contain details of the source endpoint in a pre-defined format, for example field names in the form [System.Id] and user values in the form 'automationsyncuser [email protected]'. With the synchronization, such details need to be transformed to the corresponding details of the target endpoint for the fields and users. Below is the detailed information about this transformation.
Field names in WIQL
Team Foundation Server/Azure DevOps End point Format - [Field internal name]. Example: [System.ID]
Format being used for processing/synchronization - [Field display name]. Example: [ID]
For example, consider a WIQL: select [System.Id], [System.WorkItemType], [System.AssignedTo] from WorkItems where [System.TeamProject] = @project and [System.RemoteLink] = '[System.TestField]'. This will be transformed internally to: select [ID], [Work Item Type], [Assigned To] from WorkItems where [Team Project] = @project and [Remote Link] = [Test Field] for processing.
Note: If a field name present in the WIQL is not in this format, then OpsHub Integration Manager will not do any transformation and the details will remain as stated in the "Team Foundation Server/Azure DevOps End point Format" only. In such a case, if any transformation is needed, you can do it with the help of advance mapping as per the expected format.
What happens when the source field is not present in the target system
During synchronization, failures will occur for the entities that refer to the missing target field. To resolve these failures, any one of the following configurations can be done:
Create the missing field with the same datatype in any unused template in the target system. For adding the field, refer to [Add custom field](https://docs.microsoft.com/en-us/azure/devops/organizations/settings/work/add-custom-field?view=azure-devops-2020).
Replace the missing field names with a matching existing field name of the same datatype using advanced XSLT:
```xml
<wiql>
  <xsl:variable xmlns:xsl="http://www.w3.org/1999/XSL/Transform" name="wiqlUpdatedValue" select="SourceXML/updatedFields/Property/wiql"/>
  <!-- Replace each missing custom field reference with an existing field of the same datatype (here [ID]) -->
  <xsl:value-of xmlns:xsl="http://www.w3.org/1999/XSL/Transform" select="replace(replace(replace(replace(replace($wiqlUpdatedValue,'\[Custom_Field\]','[ID]'),'\[Custom_Date_Field\]','[ID]'),'\[Custom_Integer_Field\]','[ID]'),'\[Custom Date 2\]','[ID]'),'\[Custom Field\]','[ID]')"/>
</wiql>
```
Note: The behavior is the same for missing field values in the target. For example, if the WIQL refers to area path 'Area1' in the source which is not present in the target, then advance mapping can be done to transform the source area path to the corresponding target area path, as sketched below.
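For illustration only, a minimal sketch of such an area-path advance mapping, assuming the source area path 'Area1' should be rewritten to a hypothetical target area path 'TargetArea1' (adjust the names and, if needed, the regex escaping to your own setup):

```xml
<wiql>
  <xsl:variable xmlns:xsl="http://www.w3.org/1999/XSL/Transform" name="wiqlUpdatedValue" select="SourceXML/updatedFields/Property/wiql"/>
  <!-- Rewrite the source area path value to the corresponding (hypothetical) target area path -->
  <xsl:value-of xmlns:xsl="http://www.w3.org/1999/XSL/Transform" select="replace($wiqlUpdatedValue,'Area1','TargetArea1')"/>
</wiql>
```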
User values mentioned in WIQL
Team Foundation Server/Azure DevOps End point Format - User Display Name. Example: demouser1 [email protected]
Format being used for processing/synchronization - User Display Name. Example: demouser1 [email protected] [No change is done here and hence it is expected that the User Display Name is the same in the source and target endpoints; based on that, the user values will be synchronized in the target endpoint]
In case a user with the same display name is not available in the target endpoint, the source user display name will be synchronized as plain text in the WIQL field in the target end system. For example, consider a WIQL:
select [System.ID], [System.WorkItemType] from WorkItems where [System.State] = 'Active' and [System.AssignedTo] in ('demouser1 <[email protected]>', 'demouser2 <[email protected]>'). This will be synchronized as: select [System.ID], [System.WorkItemType] from WorkItems where [System.State] = 'Active' and [System.AssignedTo] in ('demouser1 <[email protected]>', demouser2), if no user with the user name demouser2 exists in the target end system.
Id values mentioned in WIQL
In WIQL, the id of a work item can be referred to in a field value.
Team Foundation Server/Azure DevOps End point Format - [ID] [=, <, >, <=, >=, <>, in] [Source entity id]. Example: [ID] = [12345]
Format being used for processing/synchronization - [ID] [=, <, >, <=, >=, <>, in] [Source entity id]. Example: [ID] = [12345] [No change is done here and hence the source work item id will be synchronized/visible in the target end point]
In case you want the source workitem id to be replaced with its corresponding target id [which is synchronized by OpsHub Integration Manager], please use the customized workflow Default Integration Workflow - TFS to TFS - Query.xml.
For example, consider a WIQL: select [System.ID], [System.WorkItemType] from WorkItems where [System.ID] = 1234 and [System.AssignedTo]. This will be synchronized as: select [System.ID], [System.WorkItemType] from WorkItems where [System.ID] = 6789 and [System.AssignedTo]. Here, "1234" is the source workitem id and "6789" is the corresponding target work item id.
Create Personal Access Token
Log in with the integration user to the Azure DevOps server.
Click on your user name at the top-right corner and select Security option.

Select Personal Access Tokens and click on New Token option.

Provide a name for the token and select the All accessible organizations option for the Organization. Then choose the scope for the Personal Access Token and click on the Create button.

Copy the token value.

Proxy settings for the Service
Click Proxy Setting to see step-by-step details on how to configure a proxy in OpsHub Integration Manager. After configuring the proxy in OpsHub Integration Manager, follow the steps given below.
Open File Explorer, navigate to the service installation folder (e.g. <OPSHUB_INSTALLATION_PATH>\Other_Resources\Resources\OpsHubTFSService), and open the file named OpsHubTFSService.exe.config in any text editor.
Un-comment the following code in the OpsHubTFSService.exe.config file:
```xml
<!--
<system.net>
  <defaultProxy enabled="true" useDefaultCredentials="false" >
    <module type="com.opshub.tfs.test.Proxy, opshubtfsservice" />
  </defaultProxy>
</system.net>
-->
```
Open Run on the machine (you can open it by pressing Windows + R).
Type services.msc and click OK.
Find the service named OpsHubTFSService and click Restart.
Find Reference name of field
Log in to Team Foundation Server with a user having administrative rights.
Select the 'Open WIT from Server' menu item under the Tools > Process Editor > Work Item Types menu. Note: Please make sure Microsoft Visual Studio has been installed with the 'Process Template Editor' extension to see the above options.

Select the Team Foundation Server collection which contains the project to synchronize.
Expand the project and then select the entity which is used for synchronization (in this case, Bug).
Click 'OK' to open the Work Item Type Fields screen.

Here the user will see the list of all the fields with their data type and reference name for the selected work item.

How to change the port of service
Open file explorer and navigate to the service installation folder (Ex: C:\Program Files\OpsHub\Other_Resources\Resources\OpsHubTFSService).
Open the file named "opshubtfsservice.exe.config" in any text editor.
Search for the <baseAddresses> tag in the file. In the <add baseAddress ...> tag, change 9090 to the port on which you want to deploy the service, then save the changes. Refer to the image below for reference; an illustrative sketch of this tag is also shown after the image.
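For illustration only, a typical WCF-style <baseAddresses> entry in opshubtfsservice.exe.config might look like the following sketch (the exact surrounding elements and address may differ in your installation; 9191 is just an example replacement port):

```xml
<baseAddresses>
  <!-- Replace the default port 9090 with the port on which the service should listen -->
  <add baseAddress="http://localhost:9191/TFSService" />
</baseAddresses>
```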

Open the command prompt as 'Run As Administrator' and navigate to the service installation folder (Sample Path: C:\Program Files\OpsHub\Other_Resources\Resources\OpsHubTFSService).
Run "registerTFSWCFService.bat".
Once the command is executed, go to Windows Services and look for a service with the name "OpsHubTFSService". Check if the service has started or not. If it has not started, then start the service.
Test the web service by opening this URL in a browser:
http://<hostname>:<port>/TFSService, e.g. http://localhost:<port>/TFSService. For troubleshooting, refer to the Service Troubleshooting section.
How to add a user in Collection/Organization
Add User in Team Foundation Server Collection
Open Team Foundation Server Administration Console.
Click "Team Foundation Collection" under "Application Tier".
Select Collection and click "Administer Security".

Under "Add Users and Groups", select "Windows User or Group" option and Click "Add".

Enter the name of the user and then click "Check Names" to check user existence.

Click "Ok". This will add the user in the selected collection.
Add User in Azure DevOps Organization
Log in to Azure DevOps with a user having administrative rights.
Click the "Organization Settings".

In the left panel, under "General" option, click "Users" option and then click "Add New Users".

Add the email address of the user/s under "Users" field and select "Access Level".

Click "Ok".
How to add user or Service Principal in group
Add User or Service Principal in Collection Administration Group
Log in to Azure DevOps with a user having administrative rights.
For Azure DevOps system click on the "Organization Settings"

For Team Foundation Server click on "Settings".

Click on the "Security" option.

Click on the "Project Collection Administrators" group. Then click on "Members".

Click on "+ Add" button.

Search for the User, Service Principal, or user group name in the search box. Then click on the "Save Changes" button.

Add User or Service Principal in Project Administration Group
Log in to Azure DevOps with a user having administrative rights.
Navigate to the project. Then click on "Settings" icon and select "Security" option.

Select "Project Administration" group and select members.
Follow number 5 to 7 point of section Add User in Collection Administration Group to add a User or Service Principal in "Project Administration".
Secret key & Certificate in Microsoft Entra (Azure Active Directory)
Generate Secret key in Microsoft Entra (Azure Active Directory)
Log into Microsoft Entra (Azure Active Directory) with the administrative user.
Navigate to Microsoft Entra ID -> Applications, select the application added as Service Principal in the Azure DevOps collection, and open Certificates & secrets.
Navigate to Client secrets tab and add a new client secret.

Upload Certificate in Microsoft Entra (Azure Active Directory)
Log into Microsoft Entra (Azure Active Directory) with the administrative user.
Navigate to Microsoft Entra ID -> Applications, select the application added as Service Principal in the Azure DevOps collection, and open Certificates & secrets.
Navigate to Certificates tab and upload a new certificate.

How to find Azure DevOps/Team Foundation Server's version
Please follow the given steps to find the Team Foundation Server/Azure DevOps Server version.
Open Team Foundation Server Administration Console.
You can see the Team Foundation Server instance version details on the right side of the panel. Please refer to the given screenshot for reference.

QTP MTM Test Extension Installation and Configuration
QtpMtmTestInstall.zip is bundled with the OpsHub Integration Manager installation.
On the OpsHub Integration Manager installation machine, navigate to <OpsHub_Installation_Directory>\Other_Resources\Resources, then copy and extract QtpMtmTestInstall.zip to the machine where the QTP MTM Test Extension has to be installed (i.e. MTM Test Agent, MTM Test Controller, etc.).
For installation of the QTP MTM Test Extension for MTM 2010, launch Install QTP MTM Test Extension - MTM 2010.bat.
Note: Launch as Administrator.
For installation of the QTP MTM Test Extension for MTM 2012, launch Install QTP MTM Test Extension - MTM 2012.bat.
Note: Launch as Administrator.
Test Storage File Configuration
Copy the default.qtpmtm Test Storage file from the QtpMtmTestInstall > QtpMtmTestExtension directory into your Team Foundation Server source project.
Open the default.qtpmtm Test Storage file in Notepad.
Provide the Windows share path of the QTP Test Case Storage directory in the first line of the default.qtpmtm file. All the QTP Tests available in the given directory will be discoverable by the QTP MTM Test Extension.
Save the default.qtpmtm file and check it into the Team Foundation Server project.
Azure DevOps Web Hook Support
Web Hooks provide functionality to trigger the synchronization process on the create/update of any workitem on Azure DevOps. This enables real-time synchronization of any changes made on Azure DevOps to any target system. For more details on Azure DevOps Web Hooks, please refer to the following document for configuring a web hook: https://docs.microsoft.com/en-us/azure/devops/service-hooks/services/webhooks?view=azure-devops
OpsHub Integration Manager supports the following workitem events:
Work item created
Work item updated
Comments added to a work item
Note: OpsHub supports web hook for Azure DevOps instance only.
While configuring the web hook on Azure DevOps, provide the URL in this pattern: http://[Opshub_Path]/OpsHubWS/ServiceHook/tfs to send the Web Hook request to a valid OpsHub instance. Provide the URL of OpsHub which is accessible from the Visual Studio Team Services instance. Refer to the following figure for the URL configuration of the Web Hook for the OpsHub Service.

Bypass Rule with User Impersonation
If an integration is configured to Azure DevOps from any other system with 'Bypass Rule' option enabled, OpsHub Integration Manager will consider the audit revision's author as the user on the basis of which impersonation is to be performed.
Link impersonation will be supported between Azure DevOps systems. When two entities are linked, then on the Azure DevOps side only one entity will contain the actual link added by the user, while on the other entity the link will be added by the default integration user.
Bypass rules also allow the Azure DevOps system to write any data (valid or invalid) into the server. By enabling this feature, OpsHub Integration Manager can also create data on past dates.
In case of Current State Synchronization/ Reconciliation:
Fields and Attachments:
They will be impersonated with the last-changed-by user of the source entity.
Comments:
They will be impersonated with the comment user of the source entity.
Bypass Rule with Time Impersonation
If an integration is configured to Azure DevOps from any other system with 'Bypass Rule' option enabled, OpsHub Integration Manager will consider the audit revision's timestamp as the timestamp on the basis of which impersonation is to be performed.
In case of Current State Synchronization/ Reconciliation:
Fields, Comments, and Attachments will be impersonated with the last changed time of the source entity.
State Transitions known behavior
For the Team Foundation Server system, state transitions are performed implicitly by OIM using the API, given no customization has been done for dependent fields of state transitions. If a user-defined field is configured as a dependent field for the state transition, then configuring the state transitions using mapping XML is required.
How to configure the transitions XML using mapping? Refer to the [Transition Section](../integrate/mapping-configuration.md#attachments-comments-relationships-and-workflow-transition).
Following is an example of a transition script for Team Foundation Server:
A particular field "customblock" is required in the end system when the state is changed from 'Active' to 'Block'; otherwise it is hidden. The other dependent field(s) are system-defined, for example the 'Reason' field. As a user-defined field is configured as a dependent field for the transition, we must configure the transitions in the mapping as shown below:
```xml
<FieldTransitions>
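<!-- Transition 1 (default): the work item is created directly in the 'Proposed' state -->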
<FieldTransition>
<transitionName>transitionName 1</transitionName>
<fromField>State</fromField>
<toField>State</toField>
<sourceValue/>
<targetValue>Proposed</targetValue>
<defaultTransition>true</defaultTransition>
</FieldTransition>
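<!-- Transition 2: 'Proposed' to 'Active'; the dependent system field 'Reason' is set to 'Approved' -->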
<FieldTransition>
<transitionName>transitionName 2</transitionName>
<fromField>State</fromField>
<toField>State</toField>
<sourceValue>Proposed</sourceValue>
<targetValue>Active</targetValue>
<dependentFields>
<dependentField>
<fieldName>Reason</fieldName>
<possibleTargetValues>
<possibleValue>Approved</possibleValue>
</possibleTargetValues>
<defaultValue>Approved</defaultValue>
</dependentField>
</dependentFields>
</FieldTransition>
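<!-- Transition 3: 'Active' to 'Block'; 'Reason' defaults to 'Fixed' and the user-defined dependent field 'customblock' is included with no restricted values -->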
<FieldTransition>
<transitionName>transitionName 3</transitionName>
<fromField>State</fromField>
<toField>State</toField>
<sourceValue>Active</sourceValue>
<targetValue>Block</targetValue>
<dependentFields>
<dependentField>
<fieldName>Reason</fieldName>
<possibleTargetValues>
<possibleValue>Fixed</possibleValue>
</possibleTargetValues>
<defaultValue>Fixed</defaultValue>
</dependentField>
<dependentField>
<fieldName>customblock</fieldName>
<possibleTargetValues>
</possibleTargetValues>
</dependentField>
</dependentFields>
</FieldTransition>
</FieldTransitions>
```
Troubleshoot
Test Point does not exist failure
For detailed understanding of Test Point, please refer to Test Point Advance Mapping Configuration section.
Some possible scenarios that may cause this failure:
Test Point is not yet synchronized in the target system
To resolve this issue, do the following: let the Test Case and then the Test Suite sync with the Test-Case linkage configuration prior to the Test Run sync or failure retry. For synchronizing the configuration/tester with Test-Case linkage, advance mapping is required. For advance mapping, please refer to the Test Point Advance Mapping Configuration section.
Test Point is deleted from the target system
A Test Point can be deleted in either of the following ways:
The Test Case linkage is removed from the Test Suite. In such a case, all Test Points corresponding to that Test Case will be deleted. For example, if we remove Test Case 81639, this will remove the first Test Point shown in the above screenshot, whereas if we remove Test Case 81640, it will remove both Test Points belonging to Test Case 81640, i.e., both the 2nd and 3rd Test Points will be removed.
The configuration of a particular Test Point is removed/changed. For example, if the configuration of the first Test Point is changed to Firefox instead of Chrome, a new Test Point will be created and the old Test Point will be removed.
Deleting the Test Case will remove all Test Points corresponding to that Test Case.
Deleting the Test Suite itself will remove all Test Points corresponding to that Test Suite.
In such cases, the failure remains for the Test Run until the required Test Point is added back to the Test Suite in the target system.
A Test Case can be added to a Query-Based Suite or Requirement-Based Suite after the Test Suite is synced in the target system, or the Test-Case linkage can be configured after synchronization.
Perform the following steps to resolve a failure due to this scenario:
Update the Test Suite of the source end system for which the failure is generated.
Execute the Test Suite integration.
Once the updates of the Test Suite are synced to the target end system, retry the Test Run failure.