Project KPIs

The sections below list examples of the success criteria indicators we use in our typical projects.

Code Quality

  • Every piece of code is reviewed and signed off by the team; buddy developers perform the first round of code review and code review groups perform a second round; external parties can also be invited to review code, as needed.
  • Every code check-in triggers our Continuous Integration procedures, which execute the automated test suites to detect regressions.
  • All code must comply with the coding standards defined in the Software Design & Development Plan.

Test Quality

  • Unit test coverage is at least 95%.
  • Functional test case coverage is at least 75%.
  • Automated functional test case coverage is at least 60%.
  • All P1 test cases and at least 80% of the P2 test cases must pass to meet the deliverable exit criteria.
  • All applicable types of testing are performed (functional, security, localization, integration, etc.).
  • The test code is reviewed and meets the same quality bar as the product code.
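
The thresholds above can be combined into a single deliverable exit-criteria check. The following is a minimal, hypothetical sketch; the function name, parameters, and input values are illustrative, not part of our tooling.

```python
# Hypothetical exit-criteria gate based on the thresholds listed above.
# All names and example values are illustrative.

UNIT_COVERAGE_MIN = 0.95                  # unit test coverage >= 95%
FUNCTIONAL_COVERAGE_MIN = 0.75            # functional test case coverage >= 75%
AUTOMATED_FUNCTIONAL_COVERAGE_MIN = 0.60  # automated functional coverage >= 60%
P2_PASS_RATE_MIN = 0.80                   # at least 80% of P2 test cases pass

def exit_criteria_met(unit_cov, functional_cov, automated_cov,
                      p1_passed, p1_total, p2_passed, p2_total):
    """Return True when every coverage and pass-rate threshold is satisfied."""
    return (unit_cov >= UNIT_COVERAGE_MIN
            and functional_cov >= FUNCTIONAL_COVERAGE_MIN
            and automated_cov >= AUTOMATED_FUNCTIONAL_COVERAGE_MIN
            and p1_passed == p1_total                      # all P1 tests pass
            and p2_passed / p2_total >= P2_PASS_RATE_MIN)  # P2 pass rate

# Example: 96% unit, 78% functional, 65% automated, 40/40 P1, 85/100 P2
print(exit_criteria_met(0.96, 0.78, 0.65, 40, 40, 85, 100))  # True
```

In practice these numbers would come from the coverage and test-run reports produced by the CI pipeline rather than being passed in by hand.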

Project Management Quality

The project management methodology is sometimes part of the project's non-functional requirements:

  • Is project management methodology followed properly?
  • Are all the project management documents up to date?
  • Are there clear responsibilities defined in the team?
  • Is compliance with external auditors’ checklists met?
  • Is there:
    • A change management plan available and enforced?
    • A risk management plan available and enforced?
    • etc.

Project Documentation

The project documentation includes the following documents, which must be reviewed, approved, and maintained:
  • Product Architecture
  • Quality Assurance Plan
  • Software Design & Development Plan
  • Test Plan
  • Test cases documentation
  • Scrum documentation artefacts (product backlog, sprint backlogs, etc.)
  • etc.

Development Environment Quality

We also measure and analyze, depending on the project specific needs:

  • Hardware resource availability figures
  • Downtimes
  • Root Cause Analysis (RCA) for development environment downtimes
  • Costs of hardware and software maintenance
  • Costs of the specific testing environment

Delivery Performance

The following indicators are measured, and delivery performance is assessed against them:
  • Deliveries completed on time
  • Deliveries accepted as complete on time
  • Requests to fix
  • Bugs: incoming rate, severity and priority, average time to resolve/fix/close, area distribution, etc.
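
Two of the bug indicators above — incoming rate and average time to resolve — are simple aggregations over the bug tracker's records. The sketch below is a hypothetical illustration; the record fields and sample dates are invented for the example.

```python
# Hypothetical computation of two bug indicators: average time to resolve
# and weekly incoming rate. Field names and sample data are illustrative.
from datetime import date

bugs = [
    {"opened": date(2024, 1, 8),  "resolved": date(2024, 1, 10), "priority": "P1"},
    {"opened": date(2024, 1, 9),  "resolved": date(2024, 1, 15), "priority": "P2"},
    {"opened": date(2024, 1, 16), "resolved": None,              "priority": "P2"},
]

def avg_days_to_resolve(bugs):
    """Average resolution time in days, counting resolved bugs only."""
    durations = [(b["resolved"] - b["opened"]).days
                 for b in bugs if b["resolved"] is not None]
    return sum(durations) / len(durations) if durations else 0.0

def incoming_per_week(bugs, start, end):
    """Bugs opened in [start, end], normalized to a weekly rate."""
    weeks = (end - start).days / 7
    opened = sum(1 for b in bugs if start <= b["opened"] <= end)
    return opened / weeks

print(avg_days_to_resolve(bugs))  # (2 + 6) / 2 = 4.0
```

The same style of aggregation extends naturally to the other indicators (distribution by area, severity, or priority).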

Support and Maintenance

We usually measure:

  • Responses completed on time
  • Change request trends and evolution
  • Average time for handling change requests
  • Change request types, severity, and priority
  • Customer support satisfaction


Depending on the nature of the project, we can deal with specific requirements. For example, there are projects where security policies require:

  • A security plan for the project development lifecycle
  • An environment resources access list
  • An audit policy for the project team
  • Security review activities
  • etc.

We always define the project success criteria together with our customers and partners, and we agree on what to measure, how, by whom, and when. The key is to be flexible and to use high-value KPIs that genuinely add value to the project.