Impact & Reach

Evidence of use, usefulness, and public benefit

WhereWeLearn exists to support learning by helping people discover and organise free educational material from across the internet.

This page provides a high-level view of reach and use, shared in a way that respects privacy, avoids profiling, and prioritises public understanding over performance claims.

Impact here is understood as evidence of usefulness, not measurement of individuals.


Our approach to impact

WhereWeLearn takes a restrained and proportionate approach to impact measurement.

We focus on:

  • whether the service is used
  • whether people return to it
  • whether it remains broadly accessible

We deliberately avoid metrics that:

  • profile individuals
  • infer personal outcomes or characteristics
  • create pressure or comparison
  • incentivise behaviour over learning

Not everything that can be measured should be.


What we measure (and why)

At a high level, WhereWeLearn reviews aggregate usage information to help answer simple questions such as:

  • Is the service being used?
  • Do people return to it?
  • Is it accessible across different contexts?

Examples of measures reviewed may include:

  • total visits or sessions over a given period
  • repeat usage at an aggregate level
  • high-level geographic distribution (country level)
  • broad patterns of discovery and navigation

These measures exist to support stewardship and improvement — not optimisation or surveillance.
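As a purely hypothetical illustration (not WhereWeLearn's actual implementation), aggregate measures like those listed above can be derived from anonymous session records that carry no user identifier at all — only a country code is assumed here:

```python
from collections import Counter

# Hypothetical log format: each entry is an anonymous session record
# carrying only a country code -- no user ID, no timestamp trail.
sessions = [
    {"country": "IE"}, {"country": "IE"},
    {"country": "US"}, {"country": "SG"},
]

# Total visits/sessions over the period (scale of use only).
total_visits = len(sessions)

# Country-level distribution -- the only geographic granularity kept.
by_country = Counter(s["country"] for s in sessions)

print(total_visits)      # 4
print(by_country["IE"])  # 2
```

Because nothing in the records identifies a person, questions like "is the service being used?" can be answered without any individual-level tracking.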


Age and child safety

WhereWeLearn is used by people of different ages, including children.

To support age-appropriate exploration when an account is used, WhereWeLearn may ask for a user’s date of birth.

This information is used only to:

  • distinguish between child and adult users
  • ensure learning materials presented are appropriate to age
  • support parents or caregivers in allowing children to explore safely

This approach is similar to age-based profiles used by family services such as children’s media platforms.

Providing a date of birth is not required to browse or explore WhereWeLearn without logging in.
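A minimal sketch of the child/adult distinction described above, assuming a conventional 18-year threshold (the actual threshold and logic used by WhereWeLearn are not specified here). Only the binary outcome is derived; the date itself serves no other purpose:

```python
from datetime import date

ADULT_AGE = 18  # assumed threshold; not stated by WhereWeLearn

def is_child(date_of_birth: date, today: date) -> bool:
    """Derive only a child/adult flag from a date of birth."""
    age = today.year - date_of_birth.year
    # Subtract one if this year's birthday has not happened yet.
    if (today.month, today.day) < (date_of_birth.month, date_of_birth.day):
        age -= 1
    return age < ADULT_AGE

print(is_child(date(2015, 6, 1), date(2025, 1, 1)))  # True
print(is_child(date(2000, 6, 1), date(2025, 1, 1)))  # False
```

Keeping the derived flag rather than reusing the raw date elsewhere is one way a service can honour the "reduce risk, not increase data collection" principle.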


How age information is treated

When a date of birth is provided:

  • it is used only for age-appropriateness and safeguarding
  • it is not used for advertising, profiling, or commercial purposes
  • it is not shared or sold
  • it is not used to infer identity, behaviour, or outcomes

Age information exists to reduce risk, not to increase data collection.


Children, families, and restraint

Where children may be using the service:

  • persuasive or manipulative design is avoided
  • no pressure-based prompts are used
  • learning is presented as exploratory rather than directive

Safeguards are applied proportionately and with care.

Parents and caregivers retain responsibility for how children use the service, and WhereWeLearn aims to support — not replace — that role.


What we do not measure

To protect trust and avoid misuse, WhereWeLearn does not:

  • track individuals over time
  • build user profiles
  • infer learning outcomes, abilities, or intent
  • collect demographic or behavioural data for analysis
  • rank or score people, content, or providers

Learning is treated as a human activity, not a dataset.


Operational history

WhereWeLearn has been operational for several years.

During this time, the service has:

  • been used consistently over multiple years
  • supported learners across multiple countries, including Ireland
  • demonstrated sustained, repeat use rather than one-off traffic

This history provides confidence that the underlying need — help navigating free learning — is real and ongoing.


Reach overview

The figures below are shared at a year-level and aggregate level only.

  • Period covered: 2025
  • Total visits / sessions: 57,154
  • Geographic reach (top countries by use, country level only):

    Country   Visits   Share (%)
    IE        30,232   52.9
    US        15,315   26.8
    SG         2,885    5.05
    CN         2,006    3.51
    GB         1,583    2.77
    CA         1,196    2.09
    DE           426    0.75

All figures are intended to show scale of use, not intensity or individual behaviour.

(Exact figures are published cautiously and updated periodically as governance and reporting mature.)
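As an arithmetic check on the figures above, each country's share is simply its visit count divided by the total — a quick sketch using the published numbers:

```python
# Published country-level counts and the 2025 total.
counts = {"IE": 30232, "US": 15315, "SG": 2885, "CN": 2006,
          "GB": 1583, "CA": 1196, "DE": 426}
total = 57154

# Each country's share of total visits, as a percentage.
shares = {c: round(100 * n / total, 2) for c, n in counts.items()}

print(shares["IE"])  # 52.9
print(shares["SG"])  # 5.05
```

The remaining few percent not shown in the table corresponds to countries outside the listed top entries.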


How to interpret these numbers

These figures should be read as:

  • evidence that people find the service useful enough to return to
  • confirmation that the need spans more than one location or context
  • an indication that free learning navigation remains a real problem

They should not be read as:

  • measures of learning success or failure
  • indicators of educational outcomes
  • comparisons between groups or regions

Impact here is about availability and usefulness, not assessment.


Limits and caveats

All impact reporting has limits.

For WhereWeLearn, these include:

  • reliance on aggregate technical signals
  • avoidance of individual-level data
  • deliberate restraint in interpretation

We prefer to under-claim rather than over-state impact.


How impact reporting will evolve

As the charity formalises and governance structures mature, impact reporting may evolve to:

  • improve clarity and consistency
  • better explain usefulness at a system level
  • support accountability to funders and the public

Any changes to what is measured — or how it is reported — will be documented openly through the Transparency Hub.


Closing principle

WhereWeLearn measures impact to remain accountable — not to create pressure, competition, or surveillance.

Reach matters because access matters.
Use matters because usefulness matters.
Trust matters most of all.