Independent and vendor-neutral. Every figure on this site is either a source-cited published statistic or a reader-controlled bounded calculation. No vendor averages presented as fact.


Last verified April 2026

Category 1: Observable spend

Observable Spend: Measuring Shadow IT License Waste

Observable spend is the most quantifiable bucket of shadow IT cost: every unauthorized subscription leaves a financial trail. This page covers three complementary measurement methods, vendor-published benchmarks cited honestly, and a worked range for a typical mid-market organization.

Definition

Observable spend is the direct subscription cost of SaaS applications, cloud services, and digital tools that are in use by employees but have not been catalogued in the approved application inventory and have not been through formal procurement. The category covers paid subscriptions that leave a financial trail. Free-tier apps, which create compliance and breach exposure without subscription cost, are counted in categories 2 and 3.

The three measurement methods

Expense audit (pull 12 months of expense reports and corporate card data, filter for SaaS merchants using MCC codes and keyword matches, de-duplicate by merchant, sum annual spend). This is the primary method for organizations without a SaaS management platform.
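A minimal sketch of the expense-audit step, assuming transactions are exported as records from the finance system. The field names, keyword list, and MCC codes here are illustrative, not a standard schema; use your card provider's actual MCC codes and merchant descriptions.

```python
# Expense-audit sketch: filter 12 months of transactions to likely-SaaS
# merchants not in the approved catalog, de-duplicate by merchant, sum spend.
# All field names and sample data are illustrative.

SAAS_KEYWORDS = {"saas", "software", "subscription", "cloud"}
SAAS_MCC_CODES = {"5734", "7372"}  # computer software stores / programming services

def observable_spend(transactions, approved_merchants):
    by_merchant = {}
    for t in transactions:
        is_saas = (t["mcc"] in SAAS_MCC_CODES or
                   any(k in t["description"].lower() for k in SAAS_KEYWORDS))
        if is_saas and t["merchant"] not in approved_merchants:
            by_merchant[t["merchant"]] = by_merchant.get(t["merchant"], 0.0) + t["amount"]
    return by_merchant, sum(by_merchant.values())

txns = [
    {"merchant": "NotionLabs",     "mcc": "7372", "description": "Notion subscription", "amount": 960.0},
    {"merchant": "AcmeCRM",        "mcc": "7372", "description": "CRM software",        "amount": 12000.0},
    {"merchant": "OfficeSupplyCo", "mcc": "5943", "description": "stationery",          "amount": 200.0},
]
by_merchant, total = observable_spend(txns, approved_merchants={"AcmeCRM"})
print(by_merchant, total)  # NotionLabs flagged; AcmeCRM is approved; stationery ignored
```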

SSO gap analysis (export federated and OAuth app lists from your identity provider, compare against approved catalog, the gap is authenticated-but-not-catalogued apps). This method captures use but not spend directly; you reconcile it against the expense audit to attach dollar values.
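The gap itself is a set difference, with the expense audit supplying the dollar values. A sketch with illustrative app names:

```python
# SSO gap analysis sketch: identity-provider app list minus the approved
# catalog, reconciled against expense-audit figures. Names are illustrative.

idp_apps = {"Slack", "Figma", "Notion", "Miro", "Salesforce"}
approved_catalog = {"Slack", "Salesforce"}

gap = sorted(idp_apps - approved_catalog)  # authenticated but not catalogued
print(gap)  # ['Figma', 'Miro', 'Notion']

# Attach dollar values from the expense audit; apps with no financial
# trail are likely free-tier (categories 2 and 3, not observable spend).
expense_audit = {"Figma": 1800.0, "Notion": 960.0}
for app in gap:
    print(app, expense_audit.get(app, "no financial trail (free tier?)"))
```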

SaaS management platform deployment (Zylo, Torii, BetterCloud, Productiv, CloudEagle, Nudge Security). A SaaS management platform continuously ingests expense, SSO, and contract data to produce an ongoing observable-spend inventory. The tools overview page covers when the spend is justified.

Benchmark data and how to read it

Productiv publishes an annual State of SaaS report with aggregated telemetry from its customer base.

Productiv State of SaaS Apps Report (2024)
https://productiv.com/state-of-saas/

Measures: Average and median number of SaaS applications per surveyed customer organization, departmental SaaS adoption patterns, and license usage rates.

Methodology: Vendor-published. Aggregated telemetry from the Productiv platform customer base; not a representative sample of all enterprises. Sample size and methodology self-disclosed in the report.

Trust: Vendor-published, methodology self-disclosed.

How to read it: the sample is Productiv customers (organizations that have already bought SaaS management tooling), which is not a random sample of the enterprise population. Treat the data as directionally useful, not as a representative enterprise benchmark.

Zylo publishes an annual SaaS Management Index with similar telemetry from its customer base; the same methodology caveats apply.

Zylo Annual SaaS Management Index (2024)
https://zylo.com/saas-management-index/

Measures: SaaS spending and application portfolio benchmarks across the Zylo customer base, including spend by employee band and by category.

Methodology: Vendor-published. Aggregated Zylo platform telemetry from a self-selecting customer set. Sample size and methodology self-disclosed.

Trust: Vendor-published, methodology self-disclosed.

BetterCloud's State of SaaSOps is a practitioner survey: different methodology, different biases (survey respondents rather than telemetry), same directional usefulness.

BetterCloud State of SaaSOps (2024)
https://www.bettercloud.com/state-of-saasops/

Measures: SaaS adoption growth, IT versus non-IT app procurement, and SaaSOps practices.

Methodology: Vendor-published. Practitioner survey conducted by BetterCloud. Sample size and respondent profile self-disclosed.

Trust: Vendor-published, methodology self-disclosed.

The widely cited Gartner 30 to 40 percent figure is an analyst estimate of large-enterprise technology spending occurring outside the formal IT organization, not a measurement of observable spend specifically.

Gartner CIO Agenda research, analyst estimate of business-led IT spending (2019/2022)
https://www.gartner.com/en/information-technology/insights/cio-agenda

Measures: Estimated share of enterprise technology spending occurring outside the formal IT organization in large enterprises.

Methodology: Analyst estimate derived from Gartner's CIO survey panel and analyst forecasting models. Not a primary measurement of any single organization. Range commonly cited as 30 to 40 percent of large-enterprise technology spending.

Trust: Analyst estimate, methodology partially disclosed.

The figure is often quoted as a shadow IT spending statistic, but it measures a broader concept; cite it carefully.

Worked range for a 1,000-employee mid-market organization

Inputs: 1,000 employees, partial SaaS management maturity, general mid-market industry. Assumptions: 1.5 to 3 shadow apps per employee (maturity-adjusted range), $15 to $45 per app per month.

Calculation: 1,000 employees x 1.5 to 3 apps x $15 to $45 per month x 12 months. Low bound: 1,000 x 1.5 x $15 x 12 = $270,000. High bound: 1,000 x 3 x $45 x 12 = $1,620,000. Expected value (geometric mean): approximately $660,000. The interactive estimator lets you adjust all inputs and see the range update.
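The same calculation, reproduced so the inputs can be varied. All numbers are the stated assumptions for the worked example, not measurements.

```python
# Bounded observable-spend estimate for the worked example above.
# Inputs are the stated assumptions (1,000 employees, 1.5-3 shadow apps
# per employee, $15-$45 per app per month), not measured values.
import math

employees = 1_000
apps_low, apps_high = 1.5, 3.0     # shadow apps per employee
cost_low, cost_high = 15.0, 45.0   # dollars per app per month

low = employees * apps_low * cost_low * 12      # annual low bound
high = employees * apps_high * cost_high * 12   # annual high bound
expected = math.sqrt(low * high)                # geometric mean

print(f"${low:,.0f} .. ${high:,.0f}, expected ${expected:,.0f}")
# -> $270,000 .. $1,620,000, expected $661,362 (approximately $660,000)
```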

The variance of that range (roughly 6x between low and high) is normal and communicates the inherent uncertainty of estimating before measurement. Running the expense audit and SSO gap will tighten the range substantially, often to within a 2x ratio of low to high.

How to present this in a board deck

Lead with the single expected figure from your actual expense audit if complete, or from the estimator if you have not yet measured. Show the low and high range adjacent. List the top five apps by spend, top five consolidation candidates (same tool, multiple subscriptions), and top five apps with unknown owners. Close with the next-step commitment: complete expense audit within 30 days, complete SSO gap within 45 days, return with a refined number. The board will rarely object to the expected figure when the methodology is disclosed.

Related measurement

Observable spend feeds directly into the governance ROI calculation. The 'reduction' side of that calculation uses the expected-reduction benchmark, which is labelled honestly on /statistics as a vendor marketing range (60 to 70 percent) that should be applied conservatively (20 to 40 percent) for a defensible business case.
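A sketch of the conservative framing, using an illustrative observable-spend figure. The percentages are the ranges named above; the spend value is hypothetical.

```python
# Governance ROI reduction sketch: apply both the vendor marketing range
# (60-70%) and the conservative range (20-40%) to an illustrative
# observable-spend figure, and present the conservative one.
spend = 660_000.0  # illustrative annual observable spend, dollars

conservative = (spend * 20 / 100, spend * 40 / 100)     # defensible case
vendor_marketing = (spend * 60 / 100, spend * 70 / 100) # label as marketing

print("conservative:", conservative)
print("vendor marketing:", vendor_marketing)
```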

Other cost categories: Category 2 (breach risk), Category 3 (compliance), Category 4 (operational).

Frequently asked questions

What counts as observable spend?
Any SaaS subscription, cloud service, or digital tool paid for by the organization (corporate card, expense reimbursement, vendor invoice) that has not been catalogued in the approved application inventory and has not been through formal procurement. Free-tier SaaS is not observable spend (there is no financial trail) and falls into the compliance and breach exposure categories instead.
Why is this the easiest category to measure?
Because there is a financial trail that finance and procurement have already partially instrumented for other reasons (tax, accounting, budgeting). The methods are established (expense audit, SSO gap, SaaS management platform), you can validate one method against another, and the result is a dollar figure that does not require defending a probability assumption. It is the category you produce first when building a shadow IT business case because it lands on the board deck without methodological disclaimers.
What vendor-published benchmark data is available?
Productiv, Zylo, and BetterCloud publish annual State-of-SaaS or State-of-SaaSOps reports with data from their respective customer bases. These are useful benchmarks for app count per organization, spend per employee, and category breakdown; they are vendor-published and their customer base self-selects, so they do not represent the overall population of organizations. Every figure on this site cites these reports as vendor-published with the self-disclosed methodology caveat.
How do I quantify the 'waste' portion specifically?
Waste is the subset of observable spend where the value delivered is less than the cost. Common waste patterns: (a) unused licenses (seats assigned, not active in the last 30 or 90 days); (b) duplicated tools (two or more apps performing the same function because different teams bought separately); (c) over-tier licensing (enterprise plan purchased when a team plan would suffice); (d) orphaned subscriptions (the owner left the company, no one reviewed). Measuring waste requires usage data, which is typically only available once a SaaS management platform is deployed. The expense audit finds the spend; the usage instrumentation separates waste from used.
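Once per-seat usage data exists, the unused-license pattern reduces to a date filter. A sketch with illustrative seat records and a 90-day inactivity threshold; the schema and figures are hypothetical.

```python
# Unused-license waste sketch: flag seats with no activity in the last
# 90 days (or never used) and sum their annual cost. Data is illustrative.
from datetime import date, timedelta

TODAY = date(2026, 4, 1)
INACTIVE_AFTER = timedelta(days=90)

seats = [
    {"app": "Figma", "user": "a@example.com", "last_active": date(2026, 3, 20), "annual_cost": 540.0},
    {"app": "Figma", "user": "b@example.com", "last_active": date(2025, 11, 2), "annual_cost": 540.0},
    {"app": "Miro",  "user": "c@example.com", "last_active": None,              "annual_cost": 120.0},
]

unused = [s for s in seats
          if s["last_active"] is None or TODAY - s["last_active"] > INACTIVE_AFTER]
waste = sum(s["annual_cost"] for s in unused)
print(len(unused), waste)  # 2 unused seats, $660.00 of unused-license waste
```

The duplicated-tool and over-tier patterns need category and plan metadata on top of this, which is the usage instrumentation a SaaS management platform provides.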
What should I expect to find?
Practitioner patterns in mid-market organizations doing a first pass: unauthorized SaaS spend typically falls in the low hundreds of thousands to low millions of dollars annually; the same tool often has 10 to 50 separate personal subscriptions before consolidation; duplicated-tool waste is often 15 to 30 percent of total SaaS spend once consolidated. These are heuristics for sanity checking, not a replacement for your own measurement.
How often should I re-run this measurement?
Quarterly. The portfolio shifts continuously: new subscriptions added between quarters, license counts fluctuate, departments reorganize, tools get consolidated. Quarterly also aligns with most organizations' budget and procurement review cadence, which is when the observable spend number has the most operational leverage. Annual-only measurement misses the variance.