Platform Contributor Guide
How to run QA tests


Test runs are usually guided by a checklist: either the test cases that live in TestPad, or a checklist found on the issue/task card itself.

From issue/task card

Tests are run for a specific issue/task when only that single issue is being tested. This happens throughout the development cycle on an as-needed basis, i.e. whenever an issue is ready to be tested.

Where the checklist is on the issue/task card, once the issue is ready for testing and has been deployed, the tester works through the list, ticking off each item as it passes. A list that passes completely has every item ticked off. Where an item fails, it is left unmarked to indicate that it did not pass. Where relevant, additional information is left in the comment section of the issue/task card. The relevant developer is then alerted and can follow up, knowing which item failed. A hypothetical example of such a checklist is shown below.
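For illustration only, an issue-card checklist might look like the following after a test run. The item names here are invented, not taken from a real issue; the unticked item is the one the developer would follow up on:

```markdown
Test checklist (build deployed to staging):
- [x] Survey can be created with the default field types
- [x] New post can be submitted against the survey
- [ ] Post appears on the map view after publishing  <!-- failed, see comment below -->
- [x] Post can be edited and re-saved by an admin
```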

From TestPad

TestPad houses a comprehensive test suite that covers most parts of the Platform application. You'll turn to TestPad when you're running either smoke or regression tests.

Smoke testing usually happens before a release. It's quick and looks at the core functionality of the application to verify that the main application flows are working.

Regression tests are more detailed and cover the application in its entirety.

Tests in TestPad are structured in a checklist format. As you run the tests, you mark off every item as either passing or failing.

TestPad also records more detail about a test run, including when the test was run, who ran it, and relevant statistics, e.g. passing and failing tests as percentages.
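As a worked example with illustrative numbers only: if 18 of 20 checklist items pass, the run would show a 90% pass rate and a 10% failure rate.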
