Platform Overview
Understanding the core concepts and structure of the ConductorQA platform will help you use it more effectively and organize your testing activities efficiently.
Core Concepts
1. Organization
Your Organization is the top-level container that represents your company or team. It serves as the foundation for all testing activities and includes:
- All Projects: Every project belongs to your organization
- Team Members: Users and their assigned roles
- Security Settings: Access controls and permissions
- API Management: Keys for external integrations
- Billing Information: Subscription and usage details
2. Project
A Project represents a software application or system you want to test. It acts as a container for all testing activities related to a specific product or service.
Project Examples:
- “E-commerce Website” - Your main web application
- “Mobile Banking App” - iOS and Android applications
- “Payment API Gateway” - Backend payment processing system
- “Customer Support Portal” - Internal tools and interfaces
Project Benefits:
- Organized test management per application
- Team collaboration focused on specific systems
- Project-specific analytics and reporting
- Granular access control and permissions
3. Test Suite
A Test Suite is a logical grouping of related test cases. It helps you organize tests by feature, component, testing type, or any other meaningful categorization.
Suite Organization Examples:
By Feature
- “User Authentication Tests”
- “Payment Processing Tests”
- “Order Management Tests”
- “Customer Dashboard Tests”
By Component
- “Frontend UI Tests”
- “Backend API Tests”
- “Database Integration Tests”
- “Third-Party Service Tests”
By Test Type
- “Functional Tests”
- “Integration Tests”
- “Performance Tests”
- “Security Tests”
4. Test Case
A Test Case is an individual test that validates specific functionality or requirements. Each test case includes:
- Detailed Steps: Step-by-step instructions for execution
- Expected Outcomes: What should happen at each step
- Priority Level: CRITICAL, HIGH, MEDIUM, or LOW
- Application Association: Which system components it tests
- Automation Status: Manual or automated execution tracking
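The fields above can be sketched as a small data structure. The class and field names below are illustrative only, not ConductorQA's actual schema:

```python
from dataclasses import dataclass, field
from enum import Enum

class Priority(Enum):
    CRITICAL = "CRITICAL"
    HIGH = "HIGH"
    MEDIUM = "MEDIUM"
    LOW = "LOW"

@dataclass
class TestStep:
    instruction: str       # the action to perform at this step
    expected_outcome: str  # what should happen at this step

@dataclass
class TestCase:
    title: str
    priority: Priority
    application: str         # which system component it tests
    automated: bool = False  # manual vs. automated execution tracking
    steps: list = field(default_factory=list)

login_test = TestCase(
    title="Login - Success with Valid Credentials",
    priority=Priority.CRITICAL,
    application="E-commerce Website",
    steps=[
        TestStep("Enter a valid email and password", "Fields accept the input"),
        TestStep("Click 'Sign in'", "User lands on the account dashboard"),
    ],
)
print(login_test.priority.value)  # CRITICAL
```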
5. Test Run
A Test Run represents the execution of one or more test cases. It captures:
- Execution Status: Pending, Running, Passed, Failed, etc.
- Timing Information: Start time, duration, completion time
- Results and Artifacts: Screenshots, logs, error details
- Execution Context: Who ran it, environment details
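A test-run record built from these fields might look like the sketch below; the key names and values are illustrative, not the platform's real schema:

```python
from datetime import datetime, timezone

# Hypothetical shape of a test-run record (illustrative field names).
run = {
    "status": "Passed",  # Pending / Running / Passed / Failed / ...
    "started_at": datetime(2024, 5, 1, 9, 0, tzinfo=timezone.utc),
    "completed_at": datetime(2024, 5, 1, 9, 12, tzinfo=timezone.utc),
    "executed_by": "qa@example.com",
    "environment": "staging",
    "artifacts": ["login_failure.png", "run.log"],
}

# Duration is derived from the timing information.
duration_minutes = (run["completed_at"] - run["started_at"]).total_seconds() / 60
print(duration_minutes)  # 12.0
```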
Platform Hierarchy
Understanding how these concepts nest helps you organize your testing activities:
Organization (Your Company)
├── Project 1 (E-commerce Website)
│   ├── Test Suite A (User Authentication)
│   │   ├── Test Case 1 (Login with Valid Credentials)
│   │   ├── Test Case 2 (Login with Invalid Password)
│   │   └── Test Case 3 (Password Reset Flow)
│   ├── Test Suite B (Payment Processing)
│   │   ├── Test Case 4 (Credit Card Payment Success)
│   │   └── Test Case 5 (Payment Error Handling)
│   └── Test Runs
│       ├── Run 1 (Executed Suite A - Authentication Tests)
│       └── Run 2 (Executed Suite B - Payment Tests)
└── Project 2 (Mobile Banking App)
    └── [Similar hierarchical structure]
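The same hierarchy can be modeled as nested data, which makes it easy to answer questions like "how many test cases does this organization have?". The structure below is a sketch, not a platform API:

```python
# The Organization -> Project -> Suite -> Case hierarchy as nested dicts.
org = {
    "name": "Your Company",
    "projects": [
        {
            "name": "E-commerce Website",
            "suites": [
                {"name": "User Authentication", "cases": [
                    "Login with Valid Credentials",
                    "Login with Invalid Password",
                    "Password Reset Flow",
                ]},
                {"name": "Payment Processing", "cases": [
                    "Credit Card Payment Success",
                    "Payment Error Handling",
                ]},
            ],
        },
    ],
}

def count_cases(organization):
    """Total test cases across every project and suite."""
    return sum(
        len(suite["cases"])
        for project in organization["projects"]
        for suite in project["suites"]
    )

print(count_cases(org))  # 5
```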
Multi-Application Support
Application Management
Each project can contain multiple applications or system components:
Frontend Applications
- Web Applications: React, Angular, Vue.js applications
- Mobile Applications: iOS and Android native/hybrid apps
- Desktop Applications: Electron, native desktop apps
Backend Services
- REST APIs: RESTful service endpoints
- GraphQL APIs: GraphQL service implementations
- Microservices: Individual service components
- Databases: Data layer testing
Integration Points
- Third-Party Services: External API integrations
- Payment Gateways: Stripe, PayPal, payment processors
- Authentication Providers: OAuth, SAML, SSO systems
Infrastructure Components
- Load Balancers: Traffic distribution systems
- CDNs: Content delivery networks
- Monitoring Systems: Application performance monitoring
Key Platform Features
Real-Time Execution Monitoring
- Live Status Updates: See test progress as it happens
- Progress Tracking: Visual indicators of completion status
- Resource Monitoring: Track execution time and performance
- Detailed Logging: Comprehensive execution logs and artifacts
Comprehensive Analytics
- Success Rate Tracking: Monitor test pass/fail rates over time
- Performance Analysis: Execution time trends and bottlenecks
- Flaky Test Detection: Identify unreliable or inconsistent tests
- Quality Metrics: Overall testing quality insights
Team Collaboration Features
- Shared Dashboards: Team-wide visibility into test status
- Comments and Notes: Collaborative documentation on test cases
- Issue Tracking: Link test failures to bug tracking systems
- Role-Based Access: Granular permissions for different team roles
External Integration Capabilities
- REST API: Comprehensive API for external tool integration
- Secure Authentication: API key-based access control
- Bulk Operations: Efficient batch processing for large datasets
- Multiple Formats: Support for various data import/export formats
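A bulk API call with key-based authentication might be assembled as below. The base URL, endpoint path, and header scheme are assumptions for illustration; consult the platform's API reference for the real paths and authentication details. The request is prepared but not sent:

```python
import json
import urllib.request

# Hypothetical values -- replace with your organization's real ones.
API_BASE = "https://conductorqa.example.com/api/v1"
API_KEY = "your-api-key"

def build_bulk_request(suite_id, cases):
    """Prepare (but do not send) a bulk test-case creation request."""
    payload = json.dumps({"test_cases": cases}).encode()
    return urllib.request.Request(
        f"{API_BASE}/suites/{suite_id}/test-cases/bulk",  # assumed endpoint
        data=payload,
        method="POST",
        headers={
            "Authorization": f"Bearer {API_KEY}",  # assumed auth scheme
            "Content-Type": "application/json",
        },
    )

req = build_bulk_request(42, [{"title": "Login - Success", "priority": "HIGH"}])
print(req.full_url)
```

Sending it would then be a single `urllib.request.urlopen(req)` call (or the equivalent in your HTTP client of choice).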
Navigation Structure
Main Platform Navigation
Home Dashboard
- Organization-wide analytics and metrics
- Recent activity across all projects
- Interactive charts and trend analysis
- Quick access to important insights
Projects Section
- List view of all accessible projects
- Project creation and management tools
- Quick access to project dashboards
- Search and filtering capabilities
Settings Area
- Account and profile management
- Organization and team settings
- API key creation and management
- Security and access controls
Project-Level Navigation
When working within a specific project:
Project Overview
- Project-specific dashboard and analytics
- Recent activity within the project
- Key metrics and performance indicators
- Team member activity
Test Suites Management
- Create and organize test suites
- Manage test cases within suites
- Bulk operations for efficiency
- Import and export capabilities
Test Execution
- Create and manage test runs
- Monitor execution progress
- Review results and artifacts
- Historical execution data
Project Settings
- Application and environment configuration
- Team member management for the project
- Integration settings and API access
- Project-specific preferences
Data Organization Principles
Logical Grouping Strategies
1. Feature-Based Organization
Group tests by the features they validate:
- All authentication-related tests in one suite
- All payment-related tests in another suite
- All user management tests grouped together
2. Component-Based Organization
Organize by system components:
- Frontend tests separate from backend tests
- Database tests grouped by database type
- API tests organized by service endpoints
3. Priority-Based Organization
Structure by business importance:
- Critical path tests that must always pass
- High-priority tests for key functionality
- Medium and low priority tests for comprehensive coverage
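Feature-based grouping is mechanical once each test carries a feature tag. A minimal sketch, with illustrative test and feature names:

```python
from collections import defaultdict

# Flat list of (test title, feature tag) pairs -- illustrative data.
flat_cases = [
    ("Login with Valid Credentials", "Authentication"),
    ("Password Reset Flow", "Authentication"),
    ("Credit Card Payment Success", "Payments"),
]

# Group the flat list into feature-based suites.
suites = defaultdict(list)
for title, feature in flat_cases:
    suites[feature].append(title)

print(sorted(suites))  # ['Authentication', 'Payments']
```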
Scalability Considerations
The platform is designed to handle:
- Multiple Projects: Dozens of projects per organization
- Large Test Suites: Hundreds of test suites per project
- Extensive Test Cases: Thousands of test cases per suite
- High Execution Volume: Unlimited test runs and executions
Flexibility Features
- Custom Templates: Create reusable test case templates
- Custom Fields: Add metadata specific to your needs
- Multiple Organization Methods: Organize tests in ways that fit your workflow
- Adaptive Structure: Modify organization as your needs evolve
Status and Priority Systems
Test Case Priority Levels
- Critical: Must pass for any release - core business functionality
- High: Important features that significantly impact user experience
- Medium: Standard functionality that should work correctly
- Low: Nice-to-have features and edge cases
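The four levels impose a natural execution order: most important first. A small sorting sketch (the rank mapping mirrors the list above; the test data is illustrative):

```python
# Rank each priority so sorting puts the most important tests first.
PRIORITY_ORDER = {"Critical": 0, "High": 1, "Medium": 2, "Low": 3}

cases = [
    ("Password Reset Flow", "Medium"),
    ("Login - Success with Valid Credentials", "Critical"),
    ("Dark Mode Toggle", "Low"),
    ("Credit Card Payment Success", "High"),
]
cases.sort(key=lambda case: PRIORITY_ORDER[case[1]])
print(cases[0][0])  # Login - Success with Valid Credentials
```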
Test Run Status Indicators
- Pending: Test run created but execution not started
- Running: Currently executing tests
- Passed: All tests completed successfully
- Failed: One or more tests failed during execution
- Skipped: Tests were skipped due to dependencies or conditions
- Blocked: Tests cannot be executed due to system issues
- Cancelled: Test run was manually cancelled
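These statuses map cleanly onto an enum, and a run's overall status can be rolled up from its individual case results. The roll-up rule below (any failure fails the whole run) is an assumed convention, not necessarily the platform's exact behavior:

```python
from enum import Enum

class RunStatus(Enum):
    PENDING = "Pending"
    RUNNING = "Running"
    PASSED = "Passed"
    FAILED = "Failed"
    SKIPPED = "Skipped"
    BLOCKED = "Blocked"
    CANCELLED = "Cancelled"

def overall_status(case_results):
    """Roll individual case results up into one run status.
    Assumed convention: any failure fails the run; no results means
    the run has not started."""
    if not case_results:
        return RunStatus.PENDING
    if RunStatus.FAILED.value in case_results:
        return RunStatus.FAILED
    return RunStatus.PASSED

print(overall_status(["Passed", "Failed", "Passed"]).value)  # Failed
```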
Best Practices for Organization
Project Structure Guidelines
Separate by System Boundaries
- Create different projects for distinctly different systems
- Consider environment-specific projects (Development, Staging, Production)
- Use projects to separate different product lines or major versions
Logical Suite Organization
- Keep related test cases together in suites
- Aim for 10-50 test cases per suite for manageability
- Use descriptive names that clearly indicate the suite’s purpose
- Consider test execution time when grouping tests
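The 10-50 guideline is easy to audit mechanically. A sketch that flags suites outside that range (suite names and counts are illustrative):

```python
# Case counts per suite -- illustrative data, not real platform output.
SUITES = {
    "User Authentication": 23,
    "Payment Processing": 61,
    "Password Reset": 4,
}

def outside_range(suites, low=10, high=50):
    """Return suites whose case count falls outside the suggested range."""
    return [name for name, count in suites.items() if not low <= count <= high]

print(outside_range(SUITES))  # ['Payment Processing', 'Password Reset']
```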
Naming Conventions
Consistent Naming Patterns
- Projects: “System Name - Environment” (e.g., “E-commerce - Production”)
- Test Suites: “Feature/Component Description” (e.g., “User Authentication”)
- Test Cases: “Action - Expected Outcome” (e.g., “Login - Success with Valid Credentials”)
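All three patterns share a "Left part - Right part" shape, which a simple check can enforce in review tooling. The regex below is an illustrative assumption, not a platform requirement:

```python
import re

# Matches "Something - Something", e.g. "E-commerce - Production"
# or "Login - Success with Valid Credentials".
NAME_PATTERN = re.compile(r"^.+\s-\s.+$")

def follows_convention(name):
    """True if the name contains a ' - ' separator with text on both sides."""
    return bool(NAME_PATTERN.match(name))

print(follows_convention("E-commerce - Production"))  # True
print(follows_convention("Login success"))            # False
```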
Version Management
- Include version numbers when applicable
- Use consistent prefixes for different types of tests
- Document naming conventions for team consistency
Next Steps: Now that you understand the platform structure, you’re ready to create your first project or explore test suite organization in detail.