Independent and vendor-neutral. Every figure on this site is either a source-cited published statistic or a reader-controlled bounded calculation. No vendor averages presented as fact.

ShadowITCost

Last verified April 2026

How to detect shadow IT

Four methods to discover shadow IT in your organization

Each method has a blind spot. Combining them gets you to most of the portfolio. This page compares effort, coverage, and blind spots so you can sequence the work.

Why one method is never enough

Every discovery method covers a subset of the shadow IT footprint. SSO-based approaches see what connects to your IdP. Financial approaches see what leaves a payment trail. Network approaches see what transits your managed egress. Survey approaches see what employees will tell you. The portfolio you want to catalogue is the union of what each method sees, plus whatever escapes all four.

Combined, the four methods below typically yield 80 to 95 percent of the visible app portfolio. That 80 to 95 percent figure is a methodological estimate from practitioner experience across discovery sprints, not a measurement from a peer-reviewed study. Apply it as order-of-magnitude guidance.

Comparison table

| Method | What it covers | Effort | Coverage estimate | Primary blind spot |
| --- | --- | --- | --- | --- |
| SSO gap analysis | Apps connected to your IdP (Okta, Entra ID, Google Workspace) | Low (half-day) | 40 to 70 percent | Apps not using SSO at all, personal browser logins |
| Expense audit | Paid apps that leave a financial trail (corporate card, expense reports, vendor invoices) | Low to medium (one week) | 30 to 60 percent | Free-tier apps, personal-card spend never reimbursed, annual billing in unexpected categories |
| CASB and network analysis | SaaS traffic observed from managed devices or network egress points | Medium to high (weeks to months; tool deployment) | 60 to 85 percent on managed devices | Personal device access, home-network usage, privacy-compliant logging constraints |
| Browser inventory + employee survey | Browser extensions, local app use, and honest disclosure of tools | Medium (two to four weeks, includes survey wave) | 20 to 50 percent incremental over the above | Response bias in surveys, personal-device browsing, tools actively being hidden |

Coverage estimates are practitioner heuristics from discovery sprints, not measured figures. Individual results vary by SSO adoption, device management posture, finance system completeness, and survey response rate.

Method summaries

SSO gap analysis

Apps connected to your IdP (Okta, Entra ID, Google Workspace)

Method detail ->

Effort: Low (half-day)
Coverage: 40 to 70 percent
Tools: Your IdP's admin console, a spreadsheet
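The gap analysis itself is a set difference: apps the IdP has seen, minus apps in the approved catalog. A minimal Python sketch, assuming two hypothetical CSV exports (`idp_apps.csv` from the admin console, `approved_catalog.csv` from your catalog), each with an `app_name` column; your IdP's actual export format will differ:

```python
import csv

def normalize(names):
    """Lowercase and strip so 'Notion ' and 'notion' compare equal."""
    return {n.strip().lower() for n in names}

def sso_gap(idp_app_names, approved_names):
    """Apps that authenticate through the IdP but are absent from the catalog."""
    return sorted(normalize(idp_app_names) - normalize(approved_names))

def load_column(path, column):
    """Pull one column out of a CSV export (IdP app list or approved catalog)."""
    with open(path, newline="") as f:
        return [row[column] for row in csv.DictReader(f)]

# Usage (hypothetical file and column names; substitute what your
# IdP admin console actually exports):
#   gap = sso_gap(load_column("idp_apps.csv", "app_name"),
#                 load_column("approved_catalog.csv", "app_name"))
```

The resulting gap list becomes the baseline that the expense audit results are merged against in week 2.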

Expense audit

Paid apps that leave a financial trail (corporate card, expense reports, vendor invoices)

Method detail ->

Effort: Low to medium (one week)
Coverage: 30 to 60 percent
Tools: Expense system export, finance partner, MCC filter
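The MCC filter step can be scripted against the expense system export. A sketch, assuming each transaction is a dict with hypothetical `mcc`, `merchant`, and `amount` fields; the MCC list below is a common starting filter for software charges, not an exhaustive one:

```python
from collections import defaultdict

# MCCs where SaaS charges commonly land; extend after reviewing your
# own card data, since annual billing often hides in other categories.
SAAS_MCCS = {"5734",  # computer software stores
             "7372",  # computer programming / data processing
             "4816"}  # computer network / information services

def saas_spend_by_merchant(rows, mcc_field="mcc",
                           merchant_field="merchant",
                           amount_field="amount"):
    """Aggregate card spend per merchant, keeping only SaaS-like MCCs."""
    totals = defaultdict(float)
    for row in rows:
        if row.get(mcc_field, "").strip() in SAAS_MCCS:
            totals[row[merchant_field].strip()] += float(row[amount_field])
    # Largest spend first, so the review starts with the biggest exposure.
    return dict(sorted(totals.items(), key=lambda kv: -kv[1]))
```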

CASB and network analysis

SaaS traffic observed from managed devices or network egress points

Method detail ->

Effort: Medium to high (weeks to months; tool deployment)
Coverage: 60 to 85 percent on managed devices
Tools: Netskope, Zscaler, Microsoft Defender for Cloud Apps, DNS log analytics
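Where a full CASB is not yet deployed, DNS resolver logs give a rough version of the same signal. A sketch that counts lookups landing under known SaaS domains; the suffix list here is illustrative only, and a real run would use a maintained SaaS domain dataset or your CASB vendor's catalog:

```python
from collections import Counter

# Illustrative suffixes; substitute a maintained SaaS domain list.
SAAS_SUFFIXES = ("notion.so", "loom.com", "airtable.com", "figma.com")

def saas_domains_seen(dns_queries):
    """Count DNS lookups that fall under a known SaaS domain suffix.

    dns_queries: iterable of queried hostnames from resolver logs.
    """
    hits = Counter()
    for name in dns_queries:
        name = name.strip().lower().rstrip(".")  # drop trailing root dot
        for suffix in SAAS_SUFFIXES:
            if name == suffix or name.endswith("." + suffix):
                hits[suffix] += 1
                break
    return hits
```

Lookup counts are a proxy for usage intensity, not user counts; one heavy user and a browser tab auto-refreshing look the same in DNS.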

Browser inventory + employee survey

Browser extensions, local app use, and honest disclosure of tools

Method detail ->

Effort: Medium (two to four weeks, includes survey wave)
Coverage: 20 to 50 percent incremental over the above
Tools: Chrome Browser Cloud Management, MDM extension inventory, survey platform
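Survey responses need de-duplication before they are useful: the number you want per tool is distinct respondents, not raw row count. A sketch, assuming the survey export has one row per (respondent, tool) disclosure with hypothetical `tool` and `email` fields:

```python
def tally_disclosures(survey_rows, tool_field="tool", user_field="email"):
    """Count distinct respondents per disclosed tool from a survey export.

    survey_rows: list of dicts, one per (respondent, tool) disclosure.
    Returns {normalized tool name: distinct user count}.
    """
    users_per_tool = {}
    for row in survey_rows:
        tool = row[tool_field].strip().lower()
        user = row[user_field].strip().lower()
        users_per_tool.setdefault(tool, set()).add(user)
    return {tool: len(users) for tool, users in users_per_tool.items()}
```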

Four-week discovery sprint sequencing

  1. Week 1: SSO gap analysis. Export IdP app lists, compare against approved catalog, build baseline.
  2. Week 2: Expense audit. Pull 12 months of expense report and corporate card data, filter for SaaS merchants, merge with SSO gap baseline.
  3. Week 3: Browser inventory plus amnesty survey. Deploy extension inventory via MDM, launch short amnesty-framed survey.
  4. Week 4: Consolidation. Merge findings, assign owners and data classifications, write the disposition for each app (approve, consolidate, retire, require controls). CASB or network analysis typically follows as an ongoing capability rather than a sprint deliverable.
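The week-4 consolidation step is essentially a keyed merge: one registry entry per normalized app name, carrying the set of methods that detected it. A minimal sketch, with method labels and app lists as placeholders:

```python
def merge_findings(findings_by_method):
    """Merge per-method app lists into one registry keyed by app name.

    findings_by_method: e.g. {"sso": [...], "expense": [...], "survey": [...]}
    Returns {normalized app name: sorted list of detecting methods}.
    """
    registry = {}
    for method, apps in findings_by_method.items():
        for app in apps:
            registry.setdefault(app.strip().lower(), set()).add(method)
    return {app: sorted(methods) for app, methods in sorted(registry.items())}
```

Apps detected by two or more methods are high-confidence findings; single-method rows warrant a second look before you assign an action.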

The output: a consolidated shadow app registry

The deliverable at the end of the sprint is a single spreadsheet with one row per shadow app and the following columns. Every row should trace back to evidence from at least one of the four methods.

| App name | Category | Detected by | Users | Department | Data class | Annual spend | Action |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Notion | Productivity | SSO + expense | 42 | Product, Eng | Confidential | $7,560 | Approve, add to catalog |
| Loom | Video | Survey only | ~80 | Multi | Confidential | Unknown | Consolidate |
| ChatGPT Team | AI | Expense + browser | 15 | Marketing | Confidential | $5,400 | Require controls |
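If you maintain the registry as code rather than a hand-edited spreadsheet, the row schema above maps directly to a small dataclass that round-trips through CSV. Field names below are illustrative, chosen to mirror the columns in the example rows:

```python
import csv
from dataclasses import dataclass, asdict, fields

@dataclass
class RegistryRow:
    app_name: str
    category: str
    detected_by: str   # e.g. "SSO + expense"
    users: str         # exact count or an estimate like "~80"
    department: str
    data_class: str    # PII, PHI, cardholder data, confidential, none
    annual_spend: str  # annualized dollar figure, or "Unknown"
    action: str        # approve / consolidate / retire / require controls

def write_registry(rows, path):
    """Write the consolidated registry as CSV, one row per shadow app."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(
            f, fieldnames=[fld.name for fld in fields(RegistryRow)])
        writer.writeheader()
        writer.writerows(asdict(r) for r in rows)
```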

Tool categories

CASB vs SaaS management platform ->

Cost the findings

Measure your exposure ->

Frequently asked questions

Why run four methods instead of just buying one tool?
Because every method has a blind spot. SSO gap analysis misses apps not configured for SSO. The expense audit misses free tiers. CASB misses personal devices and home networks. Browser inventory misses tools actively being hidden. Combined coverage is roughly 80 to 95 percent of the visible portfolio; no single method reliably exceeds 70 percent. For a first-pass audit, running the two cheapest methods (SSO gap analysis plus expense audit) gets you most of the way. Layering CASB and browser inventory closes the remaining gap over a longer timeline.
Are those coverage percentages measured or estimated?
Estimated. These are practitioner heuristics from discovery sprints across multiple mid-market organizations, not figures from a peer-reviewed study. Treat them as order-of-magnitude guidance. Your specific coverage depends on your SSO adoption rate, your device management posture, your finance system completeness, and the honesty of your survey response.
Which method should I run first?
SSO gap analysis. It takes a half-day, requires no tool procurement, uses data you already own, and typically surfaces the long tail of departmental SaaS that was set up with corporate email but without IT review. The findings form the approved-catalog baseline you then compare expense audit results against. Running expense audit without the SSO baseline creates re-work.
Do I need a CASB for shadow IT discovery?
Not necessarily. CASB is the heaviest tool in the category and is most valuable when you already have network egress visibility requirements for other reasons (DLP, malware inspection, zero trust network access). For shadow IT discovery alone, SSO gap and expense audit typically deliver 60 to 80 percent of the findings that a CASB deployment would surface. The tools overview page covers when a CASB is worth the spend.
Is the employee survey really part of discovery?
Yes, if you frame it as an amnesty. An amnesty-framed survey (no disciplinary consequence for honest disclosure, short response window, clear benefit to the responder in terms of getting their tool officially supported) typically adds 10 to 30 percent incremental app count over the technical methods. Survey alone is weak. Survey after SSO gap and expense audit, with the approved catalog in hand to reduce noise, works.
What output should I produce at the end?
A consolidated shadow app registry with one row per app: name, category, detected-by method(s), user count or known-users, department owner, data classification (PII, PHI, cardholder data, confidential, none), annualized spend, and action (approve into catalog, consolidate with approved alternative, retire, require controls). Every row should trace back to evidence from one of the four methods.