
How to Research Prior SBIR Awardees Before You Apply: A 3-Step Competitive Intelligence Method

Last updated: April 3, 2026 | Author: Nalin Vahil, Cada -- 50+ SBIR/STTR applications, 86% success rate

Three months before submitting her first NIH SBIR Phase I, a MedTech founder ran a simple search on USAspending.gov. She found 12 funded companies in her niche -- but she also found something else: a gap. Every award in her area targeted oncology applications. Nobody had applied her technology to neurological conditions. She repositioned her proposal around the gap, quoted the competitive landscape in her Specific Aims, and won on her first submission.

Most first-time SBIR applicants skip this step entirely. They write their proposal without ever looking at who's been funded before them -- and waste 40-80 hours on an application they could have sharpened (or redirected) with a few hours of research.

The data on every SBIR award -- who got funded, for how much, and for what technology -- is free, public, and searchable on USAspending.gov. Here's the 3-Step SBIR Competitive Scan, a method for turning that data into intelligence that sharpens your proposal.

SBIR competitive research is the process of using federal spending databases -- primarily USAspending.gov -- to identify prior awardees in your technology area, profile their funding history, and assess how crowded or open the competitive landscape is before you invest 40-80 hours writing an application.


What Is USAspending.gov and Why Should SBIR Applicants Care?

USAspending.gov is the U.S. government's public database of every federal dollar spent -- grants, contracts, loans, and direct payments. For SBIR applicants, it's the most complete searchable record of which companies received SBIR and STTR awards, from which agencies, for how much, and for what technology.

Why USAspending over sbir.gov's award search? Three reasons:

  1. USAspending covers contracts. DOD funds SBIR through contracts, not grants. The sbir.gov awards database is grant-focused and misses a significant chunk of DOD SBIR activity.
  2. Better filtering. USAspending lets you filter by CFDA number, amount range, fiscal year, recipient location, and more. The sbir.gov search is keyword-only.
  3. Structured data. You can export results to CSV for analysis. The sbir.gov interface is designed for browsing, not research.

Step 1: Find Prior Awardees in Your Technology Area

Start at USAspending.gov's Advanced Search. You'll need to set five filters: award type, program identifier, keywords, fiscal year range, and amount range.

Choose the Right Award Type

This is where most beginners make their first mistake. The award type depends on the agency:

  • NIH, NSF, DOE, USDA: Select "Grants" (award type codes 02-05)
  • DOD: Select "Contracts" (award type codes A-D). DOD SBIR is funded as contracts, not grants. If you search grants for DOD SBIR, you'll get zero results and think nobody's been funded in your area.
  • ARPA-H: Select "Grants" but note that historical data is limited since the agency was established in 2022.

Set Your Program Identifier (CFDA Number)

Each agency's SBIR program has a CFDA number (now called "Assistance Listing Number") that narrows your search. But there are gotchas:

  • NIH -- CFDA varies by Institute (e.g., 93.242 for NIMH, 93.393 for NCI). Gotcha: each CFDA covers ALL grant types at that Institute, not just SBIR. Fix: post-filter results by Award ID prefix: R43 (SBIR Phase I), R44 (SBIR Phase II), R41 (STTR Phase I), R42 (STTR Phase II).
  • NSF -- CFDA 47.084 (TIP Directorate). Gotcha: covers the entire TIP directorate, not just SBIR. Fix: add "SBIR Phase I" or "SBIR Phase II" to your keyword filter.
  • DOD -- not applicable (contracts). Gotcha: DOD uses contract classification, not CFDA. Fix: search by keywords ("SBIR" + your technology terms) with the contract award type.
  • ARPA-H -- no dedicated SBIR CFDA. Gotcha: the agency is too new for a stable CFDA structure. Fix: search by keyword only ("ARPA-H" + technology terms).
  • DOE -- CFDA varies by office. Gotcha: multiple offices fund SBIR with different CFDAs. Fix: use keyword search ("SBIR" + technology terms) as a starting point.

Add Technology Keywords

Add 2-3 specific technology keywords. Be precise. "AI" will return thousands of results. "Federated learning for medical imaging" will return a manageable, relevant set.

Set Fiscal Year Range

Three fiscal years is the sweet spot. Shorter than that and you miss patterns. Longer and you're looking at potentially outdated competitive landscapes. For FY2024-FY2026, set dates from October 1, 2023 to September 30, 2026. Federal fiscal years start in October, not January.
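The October start date is a common off-by-one trap when setting the date filter. If you script your searches, a small helper keeps the conversion straight:

```python
from datetime import date

def fiscal_year_range(start_fy: int, end_fy: int) -> tuple[date, date]:
    """Return (start_date, end_date) covering federal fiscal years
    start_fy through end_fy inclusive. FY N runs October 1 of
    calendar year N-1 through September 30 of calendar year N."""
    return date(start_fy - 1, 10, 1), date(end_fy, 9, 30)

# FY2024-FY2026 -> October 1, 2023 through September 30, 2026
start, end = fiscal_year_range(2024, 2026)
print(start, end)  # 2023-10-01 2026-09-30
```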

Set Amount Range

Amount range helps exclude non-SBIR awards that slip through your filters:

  • NIH: $50,000 -- $3,000,000 (excludes large R01 grants that aren't SBIR)
  • DOD: $50,000 -- $2,000,000 (excludes SBIR III production contracts)
  • NSF: $50,000 -- $2,000,000 (Phase I $275K typical, Phase II up to $2M under the 2022 SBIR/STTR Extension Act)

Worked Example

Say you're a biotech startup developing a novel drug delivery platform and you want to know who's been funded by NIH in your space.

  1. Go to USAspending.gov Advanced Search
  2. Select Award Type: Grants
  3. Enter CFDA: 93.847 (NIDDK) or the relevant IC for your disease area
  4. Keywords: "drug delivery" AND "nanoparticle" (or your specific technology)
  5. Fiscal Years: FY2024-FY2026
  6. Amount Range: $50,000 -- $3,000,000
  7. Run the search
  8. Critical step: Export the results and filter the Award ID column. Keep only rows starting with R43, R44, R41, or R42. Everything else is a non-SBIR grant.

You should now have a list of 5-50 SBIR/STTR awards in your technology area.
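Step 8 is the one people skip. Here's a minimal sketch of that post-filter in Python -- note the "Award ID" column header is an assumption, since USAspending exports name it differently depending on the download type, so check your CSV header first:

```python
import csv
from typing import Iterable

# SBIR Phase I/II and STTR Phase I/II activity codes
SBIR_PREFIXES = ("R43", "R44", "R41", "R42")

def filter_sbir_rows(lines: Iterable[str], id_column: str = "Award ID"):
    """Keep only CSV rows whose award ID starts with an SBIR/STTR
    activity code. Pass the id_column that matches your export's
    actual header -- USAspending column names vary by download type."""
    return [row for row in csv.DictReader(lines)
            if row.get(id_column, "").startswith(SBIR_PREFIXES)]

# Usage: with open("export.csv", newline="") as f: kept = filter_sbir_rows(f)
sample = ["Award ID,Recipient Name",
          "R43MH000001,Acme Neuro Inc",      # SBIR Phase I -- keep
          "R01CA000002,Big Lab University",  # R01 research grant -- drop
          "R44NS000003,Acme Neuro Inc"]      # SBIR Phase II -- keep
kept = filter_sbir_rows(sample)
print(len(kept))  # 2
```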


Step 2: Profile the Funded Companies

Raw search results are data. The value comes from profiling.

For each awardee on your list, build a quick competitive profile:

What to Capture

  • Company name and location: Are they in a specific geographic cluster? Some ICs show regional funding patterns.
  • Number of awards (3-year window): A company with 5+ SBIR awards in 3 years is an experienced SBIR player. A company with 1 award is likely a first-timer like you.
  • Award amounts: Phase I awards are typically $275K (NIH) or $150K-$275K (NSF/DOD). Phase II awards are $1M-$2M. If a company has both, they successfully converted from Phase I to Phase II -- that's a signal of strong execution.
  • Award descriptions: Read these carefully. The 200-character descriptions reveal what framing and language succeeded. If 8 out of 10 funded descriptions mention "clinical validation," that's a pattern worth noting.
  • Phase I to Phase II conversion: If 3 companies got Phase I awards in your area but none converted to Phase II, that tells you the agency might be skeptical of the technology's commercial viability.

Identifying Patterns

Once you've profiled 10-15 awardees, look for:

  • Repeat winners: Companies with 3+ awards in your technology area are the established players. Your proposal needs to explain why you're different.
  • Funding concentration: If 80% of awards went to 3 companies, the space is concentrated. New entrants need a strong differentiation story.
  • Technology clustering: Are awards clustered around a specific application (e.g., "wearable biosensors for diabetes") or spread across the broader technology space? Clustered means more competition in that niche. Spread means more room to define your angle.
  • Median award amount: This tells you what to budget for. If the median NIH Phase I in your area is $275K, don't budget $400K -- it signals you don't understand the program.
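If you exported your results to CSV, these patterns take a few lines of Python to compute. A sketch, with hypothetical 'recipient' and 'amount' field names you'd map to your actual CSV columns:

```python
from collections import Counter
from statistics import median

def landscape_metrics(awards):
    """Summarize profiled awards. Each award is a dict with
    'recipient' and 'amount' keys (hypothetical names -- map
    them to whatever your exported CSV actually uses)."""
    counts = Counter(a["recipient"] for a in awards)
    dollars = Counter()
    for a in awards:
        dollars[a["recipient"]] += a["amount"]
    total = sum(dollars.values())
    top3 = sum(amount for _, amount in dollars.most_common(3))
    return {
        # companies with 3+ awards: the established players
        "repeat_winners": sorted(c for c, n in counts.items() if n >= 3),
        # funding concentration: share of dollars held by the top 3
        "top3_dollar_share": round(top3 / total, 2) if total else 0.0,
        # what to budget for
        "median_award": median(a["amount"] for a in awards),
    }

demo = [
    {"recipient": "Acme Nano", "amount": 275_000},
    {"recipient": "Acme Nano", "amount": 275_000},
    {"recipient": "Acme Nano", "amount": 1_500_000},  # Phase II conversion
    {"recipient": "NewCo Bio", "amount": 275_000},
    {"recipient": "NewCo Bio", "amount": 275_000},
]
m = landscape_metrics(demo)
print(m["repeat_winners"], m["median_award"])  # ['Acme Nano'] 275000
```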

Worked Example

A fictional biotech startup found 15 NIH SBIR awards in the drug delivery nanoparticle space over 3 years:

  • 4 companies received 2+ awards each (repeat players)
  • Median Phase I amount: $275,000
  • 3 companies converted Phase I to Phase II ($1.5M median Phase II)
  • Award descriptions cluster around "targeted delivery for oncology" (9 of 15)
  • Only 2 awards mention "CNS delivery" (less competitive niche)

Insight: Oncology-focused nanoparticle delivery is crowded. CNS delivery is an underserved niche with clear opportunity. If your technology applies to CNS, lead with that angle.


Step 3: Assess Competitive Density and Position Your Proposal

Competitive density is the number of funded companies in your specific niche over a 3-year period. It tells you how the agency views your technology area -- and how hard you'll need to work to stand out.

The Competitive Density Spectrum

  • Low (0-4 awards in 3 years): The agency may not prioritize this area, or it's genuinely novel. Strategy: emphasize why the agency SHOULD fund this and tie it to stated agency priorities. Higher risk, but less competition.
  • Medium (5-15 awards): A validated area with room for new entrants. Strategy: this is the sweet spot -- show differentiation from funded companies and reference the landscape to demonstrate awareness.
  • High (16+ awards): A well-established, crowded funding area. Strategy: clearly articulate what's novel about your approach. "Me too" proposals don't win in crowded spaces.
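If you're scripting your scan, the density bands above reduce to a trivial classifier:

```python
def competitive_density(awards_in_3_years: int) -> str:
    """Classify a niche by its 3-year SBIR award count,
    using the low/medium/high bands above."""
    if awards_in_3_years <= 4:
        return "low"      # emphasize why the agency SHOULD fund this
    if awards_in_3_years <= 15:
        return "medium"   # sweet spot: differentiate from funded companies
    return "high"         # crowded: articulate what's genuinely novel

print(competitive_density(12))  # medium
```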

Positioning Against the Landscape

Your competitive scan gives you three positioning tools:

  1. Gap identification: What technology angles aren't being funded? If everyone's doing oncology drug delivery and nobody's doing CNS delivery, that gap is your opportunity.
  2. Differentiation language: You can now write "Unlike prior approaches funded by [IC name] that focused on [X], our technology addresses [Y]." That signals to reviewers you've done your homework.
  3. Budget calibration: Match your budget to the median award in your area. Outlier budgets (high or low) raise reviewer questions.

When Competitive Research Changes Your Strategy

Sometimes the data tells you to pivot:

  • Zero awards in your exact niche at one agency, 12 at another? Target the agency that's already funding your technology area. A validated niche with existing funding is easier to enter than convincing an agency to fund something new.
  • All awards going to a single company? That company may have a close relationship with the program manager. Consider whether a different agency or IC might be more receptive to a new entrant.
  • Declining award counts year-over-year? The agency may be deprioritizing this area. Check the agency's current strategic plan before investing 80 hours in an application.

Agency-Specific Query Gotchas That Trip Up Beginners

These aren't edge cases -- they affect the majority of first-time searches. Here are the four most common query construction errors and how to fix them.

NIH: The CFDA Contamination Problem

NIH organizes funding by Institute and Center (IC), and each IC has a CFDA number. But that CFDA number covers every grant type the IC funds -- R01 research grants, R21 exploratory grants, F31 fellowships, and SBIR/STTR awards.

If you search USAspending for CFDA 93.242 (NIMH), you'll get hundreds of results, most of which are NOT SBIR. The fix: export your results and filter the "Award ID" column. SBIR/STTR awards use specific prefixes:

  • R43: SBIR Phase I
  • R44: SBIR Phase II
  • R41: STTR Phase I
  • R42: STTR Phase II

Everything else (R01, R21, K99, F31, etc.) is a different grant mechanism and should be excluded.

NSF: The TIP Directorate Catch-All

NSF's SBIR program lives under the Technology, Innovation, and Partnerships (TIP) Directorate, CFDA 47.084. But that CFDA covers everything TIP does -- not just SBIR.

Fix: add "SBIR Phase I" or "SBIR Phase II" as a keyword alongside your technology terms. This narrows results to actual SBIR awards.

DOD: The Contract Trap

DOD SBIR is structured as contracts, not grants. This is the single most common error for first-time searchers. If you select "Grants" as your award type and search for DOD SBIR, you'll get zero or near-zero results.

Fix: select "Contracts" as your award type. Add "SBIR" as a keyword along with your technology terms. Set the amount range to $50,000 -- $2,000,000 to exclude SBIR III production contracts (which can be $5M+).
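If you'd rather script this than click through the UI, USAspending also exposes a public API at api.usaspending.gov. Here's a sketch of the DOD contract query as a request payload for the v2 spending_by_award endpoint -- the filter key names follow that API's documented schema as best I know it, so verify against the current API docs before relying on this:

```python
import json

# Payload for POST https://api.usaspending.gov/api/v2/search/spending_by_award/
# Filter key names follow the v2 schema but may drift -- check the API docs.
payload = {
    "filters": {
        "keywords": ["SBIR", "hypersonic propulsion"],  # your technology terms
        "award_type_codes": ["A", "B", "C", "D"],       # contracts, not grants
        "time_period": [{"start_date": "2023-10-01",    # FY2024-FY2026
                         "end_date": "2026-09-30"}],
        "award_amounts": [{"lower_bound": 50_000,       # exclude Phase III
                           "upper_bound": 2_000_000}],  # production contracts
    },
    "fields": ["Award ID", "Recipient Name", "Award Amount", "Description"],
    "limit": 100,
}
print(json.dumps(payload, indent=2))
```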

ARPA-H: Limited Historical Data

ARPA-H was established in 2022 and its SBIR programs are still maturing. There's no dedicated CFDA number for ARPA-H SBIR. Search by keyword only ("ARPA-H" plus your technology terms) and expect limited results.

If you find fewer than 3 awards, don't conclude ARPA-H isn't funding your area. The agency's portfolio is still small. Supplement your USAspending search with ARPA-H's own announcements and program pages.


What This Research Tells You (and What It Doesn't)

What It Tells You

  • Who's been funded in your technology area, by name
  • How much agencies typically award for your type of technology
  • How competitive the landscape is (density of awards)
  • What language works in successful award descriptions
  • Funding trends -- whether the agency is increasing or decreasing investment in your area

What It Doesn't Tell You

  • Why specific applications won. You see outcomes, not reviewer scores or feedback.
  • Who applied and didn't get funded. Only awarded data is public. You can't calculate success rates from USAspending alone.
  • Reviewer preferences. Study sections and program managers change. Past funding patterns suggest, but don't guarantee, future priorities.
  • Real-time data. USAspending data has a 30-90 day reporting lag. Very recent awards may not appear yet.

Complementary Data Sources

USAspending is your primary tool, but supplement it with:

  • NIH RePORTER: Detailed project abstracts, PI information, and publication links for NIH-funded projects
  • NSF Award Search: NSF-specific award details including project abstracts
  • sbir.gov Awards: Browse-friendly interface, good for reading full award descriptions
  • Agency strategic plans: Confirm whether your technology area aligns with stated priorities

Frankly, competitive research has limits. If you're working on something genuinely novel -- technology that doesn't map to existing SBIR categories -- the absence of data IS the data. It means you'll need to spend more time in your proposal explaining why this area deserves funding. That's not a dealbreaker, but it's a different positioning challenge.


Frequently Asked Questions

How far back should I search to find previous SBIR awardees on USAspending?

Three fiscal years is the sweet spot. It gives you enough data to spot trends without including outdated competitive landscapes. Going back 5+ years means you're looking at companies that may no longer be active or technology areas that have shifted.

Can I find out who applied for SBIR and didn't get funded?

No. Only awarded data is public. Unfunded applications are confidential. This is why you can't calculate true success rates from USAspending -- you see the winners but not the total applicant pool.

Is USAspending data accurate for SBIR research?

Yes, with a caveat. Federal agencies are required to report spending data, and USAspending undergoes regular audits. However, there's a 30-90 day reporting lag, so very recent awards may not appear. Occasional data entry errors exist (especially in description fields), but recipient names and amounts are reliable.

Should I contact prior SBIR awardees I find on USAspending?

It depends. Some funded companies are open to conversations, especially if you're exploring STTR partnerships (which require a research institution partner). Others view you as competition. A brief, professional outreach that references their funded work and proposes a specific collaboration angle has the best response rate.

How long does an SBIR competitor analysis take?

A thorough competitive scan -- searching 2-3 agencies, profiling 10-15 awardees, and assessing density -- takes 2-4 hours. That's a meaningful investment, but small compared to the 40-80 hours you'll spend on the application itself. Better to spend 3 hours confirming you're competitive than 80 hours on an application in a space where you're not.


Do It Yourself -- or Let Cada Do It for You

Everything in this guide is something you can do yourself, today, for free. USAspending.gov is a public tool and the 3-Step SBIR Competitive Scan works.

That said, we run this exact competitive scan for every Cada client as part of our grant roadmap process. Across 50+ SBIR/STTR applications, Cada maintains an 86% success rate -- and competitive landscape research is one of the reasons. We've built query templates for 7+ federal agencies, we know which CFDA numbers map to which SBIR programs, and we synthesize the results into a competitive positioning strategy -- not a data dump.

If you're not sure whether your company is competitive for SBIR, that's the first question to answer -- before investing 40+ hours in an application. We do a free 15-minute assessment call that includes a quick competitive landscape overview for your technology area. No pitch, no obligation -- just a straight answer on where you stand.