TellSomeone is fully compliant with the UK Online Safety Act 2023 (OSA), yet operates from secure servers in the United States, protected by the First Amendment. This matters.
In the U.S., anonymous crime reporting is a constitutionally protected form of speech — in the UK, the OSA's framework, as currently drafted, risks extinguishing it.
We are not a "user-to-user" or "search" service under the OSA. Our platform hosts no public feeds, no comment threads, and no searchable content.
The Online Safety Act 2023 places duties on "search services", defined as internet services that are, or include, a search engine. A search engine under the Act is a tool that allows a person to search multiple websites and/or multiple separate databases.
TellSomeone is not, and will not be, a search engine in this sense. Our planned archive of testimonies, FOI responses, and related materials will be a closed repository. Any search function will only operate within our own database — it will not access, index, or retrieve live results from multiple external websites or third-party databases.
Every submission is encrypted and accessible only to authorised safeguarding, legal, or journalistic professionals. That is how we remain lawful while refusing to strip victims of their anonymity.
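As a rough illustration of that design, and not a description of TellSomeone's actual codebase, the sketch below uses libsodium-style sealed boxes (via the PyNaCl library) to encrypt each submission to the reviewers' public key the moment it arrives, and gates decryption on an authorised role. Every name in it (AUTHORISED_ROLES, encrypt_submission, decrypt_for_reviewer) is hypothetical.

```python
# Illustrative sketch only: hypothetical names, not TellSomeone's real code.
# Requires PyNaCl: pip install pynacl
from nacl.public import PrivateKey, PublicKey, SealedBox

AUTHORISED_ROLES = {"safeguarding", "legal", "journalist"}


def encrypt_submission(plaintext: bytes, reviewers_public_key: PublicKey) -> bytes:
    """Seal a submission to the reviewers' public key.

    A sealed box encrypts without identifying the sender, so nothing in the
    stored ciphertext links back to the person who reported.
    """
    return SealedBox(reviewers_public_key).encrypt(plaintext)


def decrypt_for_reviewer(ciphertext: bytes, reviewer_role: str,
                         reviewers_private_key: PrivateKey) -> bytes:
    """Decrypt a submission only for an authorised reviewer role."""
    if reviewer_role not in AUTHORISED_ROLES:
        raise PermissionError(f"role '{reviewer_role}' may not read submissions")
    return SealedBox(reviewers_private_key).decrypt(ciphertext)


if __name__ == "__main__":
    # Key generation would normally happen once, offline, under key-custody rules.
    reviewers_key = PrivateKey.generate()
    sealed = encrypt_submission(b"anonymous testimony ...", reviewers_key.public_key)
    print(decrypt_for_reviewer(sealed, "safeguarding", reviewers_key))
```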
The archive is a closed, secure repository designed for research, accountability, and safeguarding, not for general internet search.
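To make the contrast with an OSA "search engine" concrete, here is a minimal sketch, under the assumption that the archive sits in the platform's own relational database, of what a closed-repository search looks like: a query against a single internal table, with no crawler, no HTTP client, and no indexing of third-party sites. The table and column names (archive_documents, title, body, submitted_at) are hypothetical.

```python
# Illustrative sketch only: searching a closed internal repository.
# Table and column names are hypothetical.
import sqlite3


def search_archive(conn: sqlite3.Connection, query: str, limit: int = 20):
    """Return matching rows from the platform's own database.

    There is deliberately no HTTP client, crawler, or third-party index here:
    the function cannot reach external websites or databases, which is what
    separates a closed repository from a "search engine" as the OSA defines it.
    """
    pattern = f"%{query}%"
    return conn.execute(
        "SELECT id, title, submitted_at FROM archive_documents "
        "WHERE title LIKE ? OR body LIKE ? "
        "ORDER BY submitted_at DESC LIMIT ?",
        (pattern, pattern, limit),
    ).fetchall()
```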
The OSA imposes a Children's Access Assessment duty (s. 11) and, if a service is "likely to be accessed by children," demands age-assurance measures (s. 12). Ofcom's own guidance openly contemplates checks such as photo-ID matching, credit card verification, and facial age estimation.
These are not "optional extras" — they are enforcement expectations for platforms within scope. While designed to shield children from harmful content, these same measures can bar children from the very platforms they need to reach safety.
The consequence? A child being groomed, a teenager trapped in an abusive household, or a young person under threat from an organised exploitation ring could be told: No ID, no entry. No verification, no voice.
For victims, especially children, anonymity is often the only shield against retaliation. The OSA's identity-first logic creates a chilling, sometimes lethal, barrier:
Identification documents can be impossible to obtain — especially for children estranged from guardians, victims in witness protection, or undocumented minors.
Verification systems create a traceable link — which abusers can exploit if the system is ever compromised.
Fear of exposure silences victims — many would rather endure abuse than risk discovery through a mandatory ID process.
If implemented by Ofcom without exception, the OSA's requirements will actively endanger lives by cutting off access to anonymous reporting routes.
In the United States, anonymity in speech and reporting is not a technical allowance. It is a legal right. People reporting crimes, including minors, can communicate with journalists, lawyers, and safeguarding professionals without showing ID, uploading a passport scan, or submitting to biometric analysis.
This is why TellSomeone's hosting in the U.S. is deliberate. It allows us to reject intrusive identification demands while still maintaining full compliance with UK safeguarding duties.
We still provide:
Encrypted submissions, accessible only to authorised safeguarding, legal, and journalistic professionals.
Full compliance with UK safeguarding duties.
A reporting route that never demands identity documents, passport scans, or biometric checks.
The OSA was written to make the internet safer for children. But without clear exemptions for victim reporting in how Ofcom chooses to enforce it, its age-assurance rules risk turning into a locked gate: one that bars the desperate, silences the vulnerable, and protects abusers by default.
TellSomeone will never force victims to hand over passports, credit cards, or facial scans just to be heard. We meet the law's intent, not its most dangerous technical overreach.