RFP Evaluation Scorecard
A practical template for objectively evaluating vendor proposals and making confident technology decisions. Stop choosing vendors based on who gives the best presentation — start choosing based on who'll deliver the best results.
Built from our experience on both sides of the RFP process — as the vendor being evaluated and as the advisor helping organizations evaluate.
Before You Write the RFP
The most common RFP mistake happens before a single word is written: not knowing what you actually need. A vague RFP gets vague proposals. A clear RFP gets proposals you can actually compare.
Define Your Requirements
Before drafting the RFP, document:
- The problem you're solving — Not the solution you want, but the problem you have. "Our permit application process takes 45 days and generates 200 support calls per month" is better than "We need a new permitting system."
- Must-haves vs. nice-to-haves — Be ruthless about the distinction. If everything is a must-have, nothing is. Aim for no more than 10 must-haves and categorize everything else.
- Success criteria — How will you know the project succeeded? Define measurable outcomes before you start evaluating vendors.
- Integration requirements — What existing systems must the new solution work with? Document APIs, data formats, and authentication requirements.
Budget Reality
Include a budget range in your RFP. Yes, really. Vendors who know your budget can propose a solution that fits. Vendors who don't know your budget either over-engineer (hoping for a bigger contract) or under-scope (hoping to win on price and add change orders later). Neither serves you well.
If you genuinely don't know what this should cost, ask 2-3 vendors for ballpark estimates before writing the RFP. There's no rule that says you can't have conversations before the formal process begins.
Evaluation Criteria
We recommend five weighted categories. Adjust the weights based on your priorities, but resist the urge to weight cost above 20% — the cheapest proposal is rarely the best value.
Technical Approach
Weight: 30%. Does the vendor understand the problem and propose a sound solution?
- Clear understanding of requirements and constraints
- Appropriate technology choices with rationale
- Realistic architecture that accounts for scale and security
- Accessibility built into the approach (not bolted on)
- Data migration and integration strategy
- Maintenance and support plan
Team & Experience
Weight: 25%. Do they have the right people with relevant experience?
- Named team members (not generic roles)
- Relevant government/nonprofit experience
- Team availability and dedication (full-time vs. shared)
- Subcontractor transparency
- Staff retention and continuity plan
- Technical certifications and expertise
Past Performance
Weight: 20%. Have they done this before — successfully?
- Similar projects in scope and complexity
- Government or nonprofit sector experience
- References from comparable organizations
- Evidence of on-time and on-budget delivery
- Case studies with measurable outcomes
- Long-term client relationships (not just one-off projects)
Cost
Weight: 15%. Is the pricing realistic, transparent, and sustainable?
- Total cost of ownership (not just implementation)
- Transparent rate structure
- Clear assumptions and exclusions
- Change order pricing methodology
- Ongoing maintenance and licensing costs
- Payment terms and milestones tied to deliverables
Timeline & Methodology
Weight: 10%. Is the plan realistic and well-structured?
- Realistic timeline with clear milestones
- Dependencies and assumptions documented
- Risk mitigation plan
- Client responsibilities clearly defined
- Communication and reporting cadence
- Acceptance criteria for each phase
Scoring Methodology
Use a 1–5 scale with clear definitions for each level. The most important thing is that every evaluator uses the same definitions. Calibrate before scoring by reviewing one proposal together as a group.
5 – Exceptional
Significantly exceeds requirements. Demonstrates deep understanding and innovative approach. Provides substantial added value beyond what was requested.
4 – Good
Exceeds requirements in meaningful ways. Clearly understands the problem and proposes a strong solution. Minor areas could be stronger.
3 – Acceptable
Meets requirements adequately. Competent approach but nothing distinguishing. No significant concerns but no standout strengths.
2 – Marginal
Partially meets requirements. Notable gaps or concerns. Would need significant clarification or revision to be viable.
1 – Unacceptable
Fails to meet requirements. Major gaps, misunderstandings, or concerns. Not a viable candidate without fundamental changes.
Calculating the Final Score
For each vendor, multiply each raw score (1–5) by its category weight, then sum across all five categories:

Final Score = (Technical Approach × 0.30) + (Team & Experience × 0.25) + (Past Performance × 0.20) + (Cost × 0.15) + (Timeline & Methodology × 0.10)
Maximum possible score: 5.0. A score below 3.0 should be an automatic disqualification.
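As a quick illustration, the calculation above can be sketched in a few lines of Python. The category names and weights come from the scorecard; the vendor's scores here are hypothetical.

```python
# Category weights from the scorecard (must sum to 1.0).
WEIGHTS = {
    "Technical Approach": 0.30,
    "Team & Experience": 0.25,
    "Past Performance": 0.20,
    "Cost": 0.15,
    "Timeline & Methodology": 0.10,
}

def weighted_total(scores: dict) -> float:
    """Multiply each raw score (1-5) by its category weight and sum."""
    return round(sum(WEIGHTS[cat] * score for cat, score in scores.items()), 2)

# Hypothetical scores for one vendor:
vendor_a = {
    "Technical Approach": 4,
    "Team & Experience": 5,
    "Past Performance": 3,
    "Cost": 4,
    "Timeline & Methodology": 4,
}

print(weighted_total(vendor_a))  # 4.05 — above the 3.0 disqualification floor
```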
Red Flags to Watch For
Beyond the scorecard, watch for these warning signs. Any one of them should prompt deeper investigation. Two or more should give you serious pause.
Vague or Aggressive Timelines
If a vendor promises to deliver a complex system in half the time everyone else quoted, they're either cutting corners or don't understand the scope. Unrealistic timelines are the #1 predictor of project failure.
No Referenceable Clients
Every credible vendor can provide references from similar projects. Hearing "we can't share client names due to NDAs" for every single project is a red flag. You should be able to talk to at least 2–3 past clients.
Scope Padding
Watch for vendors who add services you didn't ask for to inflate the budget. 'Strategic discovery workshops' and 'governance framework development' might be valuable — or they might be padding. Ask what happens if you remove them.
Technology Lock-In
Proprietary platforms, custom frameworks, or solutions that only the vendor can maintain are designed to make you dependent. Ask: 'If we part ways, can another vendor take over this codebase?'
No Accessibility Plan
If accessibility isn't mentioned in the proposal — or is a single bullet point — the vendor doesn't take it seriously. For government work, this isn't optional. Ask for their specific accessibility testing methodology.
Bait and Switch on Team
The senior architect who presented in the pitch isn't always the person who does the work. Ask which team members will be dedicated to your project, what percentage of their time, and what happens if they leave.
Decision Framework
The scorecard narrows your field. The decision framework helps you choose from the finalists.
Step 1: Shortlist (Top 2–3 Vendors)
After scoring, identify your top 2–3 vendors. If there's a clear winner (more than 0.5 points ahead), you may not need the remaining steps. If scores are close, proceed with deeper evaluation.
Step 2: Reference Checks
Call references. Don't just confirm they completed the project — ask:
- Would you hire them again? Why or why not?
- How did they handle scope changes or disagreements?
- Was the team that pitched the same team that delivered?
- What surprised you — positively or negatively?
- Did the project come in on time and on budget? If not, why?
Step 3: Pilot or Proof of Concept
For high-stakes projects, consider a paid pilot (2–4 weeks) with your top finalist. This tells you more about what it's like to work with a vendor than any proposal ever will. You'll see their communication style, their problem-solving approach, and the actual quality of their work.
Step 4: Final Selection
Make your decision based on the complete picture: scorecard ranking, reference feedback, and pilot results. Document the rationale. In government procurement, you may need to defend this decision — a clear paper trail protects everyone.
Sample Scorecard Template
Use this template for each evaluator. Average scores across evaluators for the final ranking.
| Category | Weight | Score (1–5) | Weighted | Notes |
|---|---|---|---|---|
| Technical Approach | 30% | ___ | ___ | _______________ |
| Team & Experience | 25% | ___ | ___ | _______________ |
| Past Performance | 20% | ___ | ___ | _______________ |
| Cost | 15% | ___ | ___ | _______________ |
| Timeline & Methodology | 10% | ___ | ___ | _______________ |
| Total | 100% | — | ___ | _______________ |
Tips for Using This Scorecard
- Have each evaluator score independently before discussing as a group
- Use the notes column to document specific evidence for each score
- Discuss any score where evaluators differ by more than 1 point
- Keep completed scorecards as part of your procurement record
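The averaging and divergence checks in the tips above can be sketched as follows. The evaluator names and scores are hypothetical; the ">1 point" threshold is the one stated in the tips.

```python
from statistics import mean

def summarize_category(scores: dict) -> tuple:
    """Average evaluator scores for one category and flag
    any spread over 1 point as needing group discussion."""
    avg = round(mean(scores.values()), 2)
    needs_discussion = max(scores.values()) - min(scores.values()) > 1
    return avg, needs_discussion

# Hypothetical independent scores from three evaluators for one category:
technical = {"Evaluator A": 4, "Evaluator B": 5, "Evaluator C": 2}

avg, discuss = summarize_category(technical)
print(avg, discuss)  # 3.67 True — a 3-point spread, so discuss before finalizing
```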
Need help evaluating vendors?
We help organizations write better RFPs, evaluate proposals objectively, and select technology partners they won't regret.