Recommended User Acceptance Testing (UAT) Checklist

Amanda Jennewein

Updated: July 2024

Summary

This document provides recommended checks for customer users to perform during User Acceptance Testing (UAT).

During the UAT period, the customer has access to a copy of their production data in a temporary Jama Connect instance in their preferred deployment type, created by a Dry Run Migration. The UAT period enables customers to test and confirm that all Jama Connect functionality works the same way as in their original deployment.

Please note that this list is a set of recommendations and is not exhaustive; it may not cover all customer-specific use cases or every business use of Jama Connect. Customers are welcome to add items to the list or complete only the recommendations that fit their project plan.

Customers are encouraged to select one or two projects for these checks and to keep the checks consistent between the dry run and the production cutover.

End Goal

The customer must confirm completion of UAT and provide a go/no-go decision to Jama Software® before production deployment migration planning proceeds.

UAT Recommendations and Checks

  • Users can log in to Jama Connect® successfully. 
  • Deactivated users cannot log in.
  • Jama Connect sessions time out after the predefined number of minutes.
  • Users can add bookmarks.
  • Users can delete bookmarks.
  • Users can perform keyword searches, and searches return the expected results.
  • Users can see active reviews. 
  • Users can see stream activity.
  • Users can see recently viewed items. 
  • Users can edit their profiles. 
  • Users can create and delete API credentials.
  • Creator users can create items.
  • Creator users can edit items. 
  • Creator users can lock and unlock items.
  • Verify Item Types across instances.
  • Verify Item Type fields across instances.
  • Verify item Project ID, Global ID, and API ID across instances (for one possible scripted spot check, see the example after this list).
  • Verify Project Explorer Tree Hierarchy across instances.
  • Verify item versions across instances.
  • Verify item relationships across instances. 
  • For synced items, verify that sync status and content are accurate.
  • Users can create test cases.
  • Users can create test plans.
  • Users can create test runs.
  • Test plans, test cycles, test runs, and test results are validated against the testing data.
  • Ensure standard and custom reports are consistent across all instances. 
  • Confirm reviews are consistent across all instances, including the IDs, status, and comments.
  • Both creator and stakeholder users can create filters.
  • Ensure that the public or personal filters are consistent across all instances.
  • Verify email subscriptions across different instances.
  • Confirm that the activity streams for version history are consistent across all instances.
  • Verify the user groups in all instances.
  • Confirm the permissions for all instances.
  • Confirm the item type workflows across instances.
  • Verify the relationship rules across all instances.
  • Ensure editor templates are consistent across instances and review them for errors.
  • Verify the picklists across all instances.
  • Verify the list of projects across all instances.
  • Verify the baselines for content, version history, and IDs across all instances.
  • Verify the releases across all instances for accuracy.
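Optional Scripted Spot Check

Some of the verification items above, such as the project list and item Global ID checks, can be spot-checked with a short script run against the Jama Connect REST API in both instances. The sketch below is a minimal, unsupported example of that approach, not part of the official UAT process. It assumes basic authentication, the standard /rest/latest/projects and /rest/latest/items endpoints, and paged responses returned in a "data" array; the host names, credentials, and project ID shown are placeholders that must be replaced with your own values.

    # Hypothetical UAT spot check: compare the project list and item Global IDs
    # between the self-hosted instance and the cloud UAT instance.
    # All hosts, credentials, and the project ID below are placeholders.
    import requests

    SELF_HOSTED = ("https://jama.example-onprem.com", ("uat_user", "uat_password"))
    CLOUD_UAT = ("https://example.jamacloud.com", ("uat_user", "uat_password"))
    PROJECT_ID = 123  # one of the projects selected for UAT checks

    def get_all(base_url, auth, resource, params=None):
        """Page through a REST collection and return the combined 'data' list."""
        results, start_at, page_size = [], 0, 50
        while True:
            query = dict(params or {}, startAt=start_at, maxResults=page_size)
            resp = requests.get(f"{base_url}/rest/latest/{resource}",
                                auth=auth, params=query, timeout=30)
            resp.raise_for_status()
            page = resp.json().get("data", [])
            results.extend(page)
            if len(page) < page_size:
                return results
            start_at += page_size

    def snapshot(base_url, auth):
        """Collect project IDs and item Global IDs for the selected project."""
        projects = {p["id"] for p in get_all(base_url, auth, "projects")}
        items = get_all(base_url, auth, "items", {"project": PROJECT_ID})
        global_ids = {i["globalId"] for i in items if i.get("globalId")}
        return projects, global_ids

    if __name__ == "__main__":
        sh_projects, sh_items = snapshot(*SELF_HOSTED)
        cl_projects, cl_items = snapshot(*CLOUD_UAT)
        print("Projects only in self-hosted:", sorted(sh_projects - cl_projects))
        print("Projects only in cloud UAT:", sorted(cl_projects - sh_projects))
        print("Item Global IDs missing in cloud UAT:", sorted(sh_items - cl_items))
        print("Item Global IDs missing in self-hosted:", sorted(cl_items - sh_items))

An empty set of differences for the selected project is one quick signal that the migrated data is consistent; it supplements, but does not replace, the manual checks above.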
