About 75-85% of SBIR Phase I proposals are rejected. That's normal. The reviewer feedback you received in that rejection is worth more than most paid consulting -- it's specific, expert analysis of exactly where your proposal fell short. The founders who win on their second or third attempt are the ones who treat feedback as a resubmission roadmap, not a verdict.
This guide covers the resubmission process agency by agency, how to decode what reviewers are really telling you, and the structural fixes that turn rejections into awards.
Resubmission rules by agency
Each agency handles resubmissions differently. Know the rules before you start revising.
| Agency | Resubmission Allowed? | Formal Mechanism | Key Rules |
|---|---|---|---|
| NIH | Yes -- one A1 resubmission | 1-page Introduction to Revised Application | Must address every major concern from the summary statement. The A1 is reviewed by the same study section. If the A1 is rejected, you must submit as a new (A0) application. |
| NSF | Yes -- new proposal | No formal resubmission; submit new proposal next cycle | Revise your Project Pitch first (rolling, resubmit anytime). If invited again, submit a new full proposal. Include a "Resubmission Change Description" in your proposal. |
| DoD | Yes -- if topic recurs | No formal resubmission; submit to next matching topic | Topics rotate by solicitation cycle. If your topic doesn't reappear, look for adjacent topics at the same agency or a different DoD branch. |
| DOE | Yes -- new proposal | No formal resubmission | Submit to matching topic in the next solicitation. Topics may change between cycles. |
| NASA | Yes -- new proposal | No formal resubmission | Submit to matching subtopic in the next solicitation. |
| AFWERX | Yes -- resubmit to Open Topic | No formal resubmission | Open Topic rolls continuously. Revise and resubmit in the next submission window. |
The NIH distinction matters: NIH's A1 mechanism is the only formal resubmission process in the SBIR system. The 1-page Introduction to Revised Application is your chance to show reviewers you heard them. At every other agency, you're submitting what is technically a new proposal -- but informed by prior feedback.
How to decode reviewer feedback
Reviewer comments follow patterns. Learning to read between the lines is the first step to a strong resubmission.
What reviewers say vs. what they mean
| What They Write | What They Mean | What to Fix |
|---|---|---|
| "The innovation is incremental" | Your proposal reads as an improvement, not a breakthrough. No clear technical gap. | Reframe the problem. Lead with what's unknown, not what you'd build. Show why current approaches fundamentally can't solve this. |
| "The approach lacks detail" | You described goals but not methodology. Reviewers can't evaluate what they can't see. | Add specific experimental design, success metrics, go/no-go criteria, and risk mitigation for each aim. |
| "Commercialization plan is weak" | You said "large market" without naming customers. No evidence of demand. | Add customer discovery data, letters of support, named pilot sites, competitive positioning, and revenue model. |
| "Significance is not well established" | The problem you're solving isn't framed as important enough. Reviewers don't feel the urgency. | Quantify the problem. How many people are affected? What does the current approach cost? What fails without your solution? |
| "The team lacks relevant expertise" | Your biosketches don't demonstrate domain experience for THIS specific problem. | Add relevant publications, patents, prior projects, or industry experience that directly maps to the proposed work. |
| "Feasibility is not demonstrated" | No preliminary data, no published evidence, nothing tangible. Just a plan. | Add pilot data, computational models, literature-based feasibility arguments, or analogous results from adjacent work. |
| "The scope is too broad for Phase I" | You're proposing Phase II work in a Phase I budget. | Cut scope to the minimum needed to answer "is this feasible?" Save the rest for Phase II. |
| "The proposal doesn't address the solicitation" | You wrote a generic proposal and pasted it into this agency's format. Topic mismatch. | Reread the solicitation. Mirror their language. Reorganize around their stated priorities. |
Read for themes, not individual comments
Don't respond to each reviewer comment in isolation. Group feedback by theme:
- If 2+ reviewers flag the approach: your methodology section needs structural redesign
- If 2+ reviewers flag commercialization: you need real market evidence, not assertions
- If 1 reviewer flags something and others don't: it's worth addressing but may be a personal preference
- If the overall score is close to the funding line: your proposal is competitive and small fixes can push it over
- If the overall score is far from the funding line: you may need a fundamental reframe, not incremental edits
The 5 structural fixes that win on resubmission
Across 500+ proposals at Cada, these are the patterns we see when a rejection turns into an award.
1. Reframe the problem, not just the solution
The most common rejection reason across all agencies: the innovation isn't framed as a genuine research question. Your resubmission should open with a sharper problem statement -- what's technically unknown, why current solutions fail, and what gap your research closes.
If your first submission led with "we're building X," your resubmission should lead with "Y is impossible with current approaches because of Z, and here's how we'd investigate a fundamentally different path."
Example: a climate tech company we worked with rewrote their Aim 1 from "develop an electrochemical sensor" to "determine whether electrochemical detection can achieve sub-5ppb sensitivity under field conditions" -- the approach score moved from a 4 to a 2 at NIH.
2. Add data you didn't have before
The strongest resubmissions include new preliminary data collected between submissions. Run a small experiment. Generate benchmark results. Publish a technical preprint. Any tangible evidence that wasn't in the first submission strengthens your feasibility argument and shows reviewers you're actively working the problem.
If you can't generate new data, add literature-based feasibility arguments or computational models that weren't in the original proposal.
3. Restructure the approach with measurable milestones
Vague approaches are the second most common rejection cause. Your resubmission should have explicit go/no-go decision points for each aim, quantifiable success metrics (not "we will develop" but "we will demonstrate >X performance on Y metric"), and risk mitigation strategies for the highest-uncertainty steps.
4. Strengthen the commercialization evidence
Between submissions, talk to potential customers. Get a letter of support. Run a customer discovery interview series. Even 5-10 structured conversations with potential buyers give you concrete evidence that didn't exist in your first submission.
One specific letter from a potential customer is worth more than three pages of market analysis.
5. Get someone outside your team to read it
The single cheapest fix: have someone who's never seen your technology read your proposal and summarize what they think you're proposing. If their summary doesn't match your intent, the writing isn't clear enough. Reviewers read 8-15 proposals per cycle. If yours requires re-reading to understand, it loses.
The NIH resubmission deep dive
NIH's A1 process is the most structured resubmission in SBIR. If you're resubmitting to NIH, this section is your playbook.
The 1-page Introduction to Revised Application
This is your most important page. It goes at the front of your resubmission and tells the study section exactly what changed and why.
Structure it like this:
- Thank the reviewers -- briefly, not obsequiously. One sentence.
- Summarize the key concerns -- group by theme, not by individual reviewer. Show you understand the pattern.
- For each concern, state: what you changed, where in the proposal to find it, and why the change addresses the concern.
- Highlight new data -- if you ran experiments or collected customer evidence between submissions, call it out explicitly.
What NOT to do:
- Don't argue with reviewers. Even if a comment was wrong, address the underlying concern.
- Don't just say "we added more detail." Point to the specific section and the specific change.
- Don't ignore any major concern. If a reviewer raised it and you didn't fix it, the study section notices.
After the A1
If your A1 resubmission is rejected, NIH requires you to submit a new application (A0) with substantial changes. You can't submit an A2. The new application should reflect a materially different approach -- not the same proposal with additional edits.
When to switch agencies instead of resubmitting
Sometimes the feedback tells you the proposal was strong but the fit was wrong. In those cases, reframing for a different agency is smarter than revising for the same one.
Signs you should switch:
- Feedback focuses on mission alignment, not quality ("interesting work, but doesn't fit our priorities")
- Your technology is at the intersection of two agency domains (e.g., health AI that could go NIH or NSF)
- The topic you applied to has been retired or won't recur
- Reviewers praised the technology but scored commercialization or broader impacts poorly (different agencies weight these differently)
Common switches:
- NSF to NIH: if reviewers said "too clinical" for NSF, NIH is probably the right home
- NIH to NSF: if reviewers said "not enough biomedical relevance" but the tech is broadly applicable
- DoD branch to AFWERX: if you need open-topic flexibility instead of specific topic fit
- Any agency to DARPA: if reviewers said "too ambitious" -- DARPA wants ambitious
For help mapping your technology to the right agency, see our SBIR guide for startups or how to win an SBIR grant.
The resubmission timeline
Don't sit on feedback. The fastest path from rejection to award:
| Step | Timeline | What to Do |
|---|---|---|
| Receive feedback | Day 0 | Read it once for emotion. Read it again for data. |
| Thematic analysis | Week 1 | Group feedback by theme. Identify the 2-3 structural issues. |
| Call the program officer | Weeks 1-2 | Ask which concerns are most critical. Get guidance on fit. |
| Collect new data / customer evidence | Weeks 2-8 | Run experiments, talk to customers, generate what was missing. |
| Rewrite (not edit) the weak sections | Weeks 4-8 | Structural revision, not line editing. |
| Internal review | Weeks 8-10 | Have someone outside your team read the full proposal. |
| Submit | Next deadline | NIH: next receipt date. NSF: anytime (pitch). DoD: next matching topic. |
Want help diagnosing what went wrong?
We've reviewed hundreds of rejected SBIR proposals and helped teams resubmit successfully. The most common issue isn't the technology -- it's the framing. Our Strategy Review includes a feedback analysis specific to your proposal and target agency.