
Welcoming Discord users amidst the challenge of Age Verification

I appreciate their effort, but isn't Matrix based out of the UK, with its primary hosted instances on AWS in the UK? The UK was the first, AFAIK, to create such internet laws [0]. I could imagine people running their own instances in places where the age laws are not yet active, but that number is shrinking fast. [1]

Their solution is for everyone to pay for Matrix with a credit card to verify age. I assume that means there must be a way to force only paid, registered accounts to join one's instance? What percentage of the accounts on Discord are paid for with a credit or debit card? Or boosted? I don't keep up with the terminology.

[0] - https://en.wikipedia.org/wiki/Online_age_verification_in_the...

[1] - https://avpassociation.com/4271-2/

an hour agoBender

I wrote the OP, so to try to clarify:

> isn't Matrix based out of the UK and primary hosted instances on AWS in the UK?

It doesn't matter what country you run your server in or where your company is based; if you're providing public signup to a chat server then the countries (UK, AU, NZ etc) which require age verification will object if you don't age verify the users from those countries. (This is why Discord is doing it, despite being US HQ'd). In other words, the fact that The Matrix.org Foundation happens to be UK HQ'd doesn't affect the situation particularly.

(Edit: also, as others have pointed out, Matrix is a protocol, not a service or a product. The Matrix Foundation is effectively a standards body which happens to run the matrix.org server instance, but the jurisdiction that the standards body is incorporated in makes little difference - just like IETF being US-based doesn't mean the Internet is actually controlled by the US govt).

> Their solution is for everyone to pay for Matrix with a credit card to verify age.

Verifying users in affected countries based on owning a credit card is one solution we're proposing; suspect there will be other ways to do so too. However: this would only apply on the matrix.org server instance. Meanwhile, there are 23,306 other servers currently federating with matrix.org (out of a total of 156,055) - and those other servers, if they provide public signup, can figure out how to solve the problem in their own way.

Also, the current plan on the matrix.org server is to only verify users who are in affected countries (as opposed to trying to verify the whole userbase, as Discord is).

23 minutes agoArathorn

Matrix is a protocol, not a service. It's likely the UK government can enforce laws against content and accounts hosted on the matrix.org servers, but no single government has jurisdiction over the entire network.

an hour agoZak

That sounds more like a recipe for overreach than a method to escape the law, to be honest. Governments don't typically go "aw, shucks, you've caught us on a technicality" without getting the courts involved.

Clueless lawmakers will see this app called Element full of kids chatting without restrictions and tell it to add a filter. When the app says "we can't", the government says "sucks to be you, figure it out" and either hands out a fine or blocks the app.

There are distinctions between the community vibe Discord is going for (with things like forums and massive chat rooms with thousands of people) and Matrix (which has a few chatrooms but mostly contains small groups of people). No in-app purchases, hype generation, or other predatory designs, just the bare basics to get a functional chat app (and even less than that if you go for some clients).

I'd say being based in the UK will put matrix.org and Element users at risk, but with Matrix development being funded mostly by the people behind matrix.org, that implies an impact on the larger decentralized network.

30 minutes agojeroenhd

It would take some clever crafting to outlaw Matrix clients without also outlawing web browsers and conventional email clients. Let's assume they did though. The best they can do is block it from app stores, which won't stop anyone but iOS users.

More likely, it just won't become popular enough for lawmakers to notice because the UX is a little rough, and people have very little patience for such things anymore.

12 minutes agoZak

> Matrix is a protocol, not a service

I thought it was both, and their hosted service is in the UK. Is it not? I know people can host their own, but I have had very little success in getting people to host their own things. Most here at HN will not do anything that requires more than their cell phone. Who knows, maybe Discord's actions will incentivize more people to self-host.

38 minutes agoBender

> had very little success in getting people to host their own things. Most here at HN will not do anything that requires more than their cell phone.

You're just talking to the wrong ones :-)

18 minutes agoesseph

Like all global finance goes through NYC, they will find a throat to choke if motivated.

38 minutes agotoomuchtodo

Couldn't you simply set up your own instance and link up with the wider network? I guess you would have to age verify yourself if you live in a country that requires it, but regulating that would be sort of hilarious.

an hour agoQuothling

Yes, you could.

Whether or not authorities with jurisdiction over you would notice your instance (homeserver) or bother you about age verification is an issue you'd have to consider for yourself.
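
The mechanics are the easy part. A rough sketch using the official matrixdotorg/synapse Docker image - chat.example.com is a placeholder, and you'd still need a TLS reverse proxy plus either port 8448 or .well-known delegation before other homeservers will federate with you:

    # Generate a homeserver.yaml and signing keys
    docker run -it --rm \
        -v $(pwd)/synapse-data:/data \
        -e SYNAPSE_SERVER_NAME=chat.example.com \
        -e SYNAPSE_REPORT_STATS=no \
        matrixdotorg/synapse:latest generate

    # Run the homeserver on the usual client port
    docker run -d --name synapse \
        -v $(pwd)/synapse-data:/data \
        -p 8008:8008 \
        matrixdotorg/synapse:latest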

29 minutes agoforesto

The problem is that "simply" is a lie.

26 minutes agodirewolf20

> Couldn't you simply set up your own instance and link up with the wider network?

I honestly have no idea. As much as they love money, I am not paying my lawyers to research this one. I would probably wait for others to be made an example of.

an hour agoBender

It's an interesting legal question, but I would imagine for a federated service, the burden of proof should be on the individual's home server for age verification. That's where the user account is, after all.

Matrix is basically labeled "adults only" everywhere, so restricting certain servers/rooms due to possible innocent eyes is likely out of scope.

an hour agoTheCraiggers
[deleted]
an hour ago

The Australian law doesn't care where servers are run. I don't know about others.

an hour agostevage

People without a physical or legal presence in Australia likely don't care what the Australian law cares about.

an hour agoZak

Yeah that's the thing. No matter what you do, it's bound to be illegal somewhere in the world. Be it North Korea or Iran or Australia. You simply can't follow everyone's laws because they are often contradictory.

an hour agowolvoleo

ISIS cuts off hands for watching porn. They will have to cut through my porn induced callused skin using hydraulics.

39 minutes agoBender

For everyone not reading the post:

> Practically speaking, that means that people and organisations running a Matrix server with open registration must verify the ages of users in countries which require it. Last summer we announced a series of changes to the terms and conditions of the Matrix.org homeserver instance, to ensure UK-based users are handled in alignment with the UK’s Online Safety Act (OSA).

At least you can self-host matrix and messages are end to end encrypted, unlike IRC.

an hour agocuillevel3

> unlike IRC

There are a few IRC clients that support OTR: irssi-otr is one [1], weechat-otr is another [2]. Pidgin supports it too, though I have not used it in a very long time, and Hexchat does via an always-work-in-progress plugin. There may be others.

OTR could use some updates to include modern ciphers, similar to the recent work on OpenSSH, but it is probably good enough for most people.

E2EE aside, having chat split up into gazillions of self-hosted instances makes it much harder for chat to be hoovered up all in one place. It takes more effort to target each person, and that becomes a government scalability issue. Example of the effort involved: [3]

[1] - https://github.com/cryptodotis/irssi-otr

[2] - https://github.com/mmb/weechat-otr

[3] - https://archive.ph/4wi5t

an hour agoBender

IRC is also most commonly used for open servers where anyone can join whenever they want to without so much as needing to register for an 'account'! You just pick a nickname out of thin air and off you go.

In that kind of environment, end to end encryption really doesn't add value.

43 minutes agowolvoleo

The IRC admins can read all your messages, be it to a channel or to another user.

Even without registering my nick, I would expect a modern protocol to keep my pm communication private by default.

34 minutes agocuillevel3

How will you verify who you're talking to?

25 minutes agodirewolf20

Recent and related. Others?

Discord/Twitch/Snapchat age verification bypass - https://news.ycombinator.com/item?id=46982421 - Feb 2026 (435 comments)

Discord faces backlash over age checks after data breach exposed 70k IDs - https://news.ycombinator.com/item?id=46951999 - Feb 2026 (21 comments)

Discord Alternatives, Ranked - https://news.ycombinator.com/item?id=46949564 - Feb 2026 (465 comments)

Discord will require a face scan or ID for full access next month - https://news.ycombinator.com/item?id=46945663 - Feb 2026 (2018 comments)

17 minutes agodang

Last time I tried matrix (~2022) they still didn't have voice channels--they had voice calls but not a mechanism where people can join/leave a particular voice chat at will. To me this is a must have feature for anyone who has used discord/mumble/ventrilo.

28 minutes agosregister

I agree with you. The good news is that it looks like some of the alternate clients are focusing on it. https://commet.chat/ has voice channels (video rooms but default to camera off), and cinny's element call support PR defaults to camera off in video rooms as well iirc.

8 minutes agowkrp

I look at discussions on Hacker News for Discord replacements frequently with despair.

If it doesn't have enough of the utility, performance, and positive UX, it will never gain enough market share to matter.

E2EE doesn't matter if you don't have someone else to communicate with over it!

17 minutes agorockskon

This appeal falls flat when you get to the parts about their homeserver requiring some form of age verification:

> From our perspective, the matrix.org homeserver instance has never been a service aimed at children, which our terms of use reflect by making it clear that users need to be at least 18 years old to use the server. However, the various age-verification laws require stricter forms of age verification measures than a self-declaration. Our Safety team and DPO are evaluating options that preserve your privacy while satisfying the age verification requirements in the jurisdictions where we have users.

Which is actually stricter than Discord's upcoming policy, which allows accounts to operate for free without any verification, with some limitations around adult-oriented servers and content.

There has been a lot of FUD about the Discord age verification, so a refresher: The upcoming changes do not actually require you to verify anything to use Discord. It just leaves the account in teen mode by default. This means the account can't join age-restricted channels, can't unblur images marked as sensitive, and incoming message requests from unknown users will go to a second inbox with a warning by default.

You can, of course, run your own Matrix server. Having been there before I would suggest reading up on some typical experiences in running one of these servers. Unless you have someone willing to spend a lot of time running the server and playing IT person for people using it, it can be a real headache. They also note that running a server doesn't actually get around any age requirements:

> Practically speaking, that means that people and organisations running a Matrix server with open registration must verify the ages of users in countries which require it.
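
Note the "open registration" qualifier, though. If your server is just for yourself or people you invite, signup stays closed with a one-line setting; a minimal sketch assuming Synapse (option names can vary between versions):

    # homeserver.yaml
    enable_registration: false
    # or, to allow signups only with an invite token:
    # enable_registration: true
    # registration_requires_token: true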

an hour agoAurornis

Except Discord's verification applies globally, while Matrix is only aiming to implement it for users who live somewhere it is required by law.

an hour agokennywinker

The list of locations with those laws is growing very large. From the post:

> Last summer we announced a series of changes to the terms and conditions of the Matrix.org homeserver instance, to ensure UK-based users are handled in alignment with the UK’s Online Safety Act (OSA). Since then Australia, New Zealand and the EU have introduced similar legislation, with movement in the US and Canada too.

an hour agoAurornis

It doesn't matter how many locations have those laws if even one country doesn't have them, because VPNs exist... unless a platform decides to proactively engage in voluntary compliance with authoritarianism and the construction of globe-spanning surveillance states, like Discord is doing.

41 minutes agoanonymous908213

...and while we have no choice but to implement it on the matrix.org instance, other folks running their own servers are responsible for their own choices.

an hour agoArathorn

Has anyone managed to run Matrix over I2P or other similar overlay network technologies?

12 minutes agoapopapo

I wanted to love Matrix and its clients, but it's just not quite there yet, honestly.

I'm hopeful the experience will improve in the future.

44 minutes agopuppycodes

Totally agree there and they actually talk about that in the post:

> Finally: we’re painfully aware that none of the Matrix clients available today provide a full drop-in replacement for Discord yet. All the ingredients are there, and the initial goal for the project was always to provide a decentralised, secure, open platform where communities and organisations could communicate together. However, the reality is that the team at Element who originally created Matrix have had to focus on providing deployments for the public sector (see here or here) to be able to pay developers working on Matrix. Some of the key features expected by Discord users have yet to be prioritised (game streaming, push-to-talk, voice channels, custom emoji, extensible presence, richer hierarchical moderation, etc).

31 minutes agocuillevel3

Same here, tried a couple of years ago. I was drawn to it because of the protocol concept. The experience was not bad, everything worked. But I remember the signup/domain/keys/backups/etc UX was a bit confusing. Happy to see there is more attention going to Matrix lately. Time to give it another go perhaps

36 minutes agokaboomshebang

I cannot even use Discord if I wanted to... every time I try to sign up I get immediately phone-walled and/or banned, and the appeal is always denied with "our automated system is working properly." I have been trying for close to ten(!) years now off and on, with all different combinations of browsers, OSes, ISPs and physical machines. No VPN or proxy either.

And even if I was able to register, that "automated system" still randomly bans people whenever it feels like it. Search the r/discordapp subreddit or just google "discord random ban", it's a widespread problem with no solution and I have no idea how so many other people seem to have no issues, yet at the same time you can find lots of people just as frustrated as me.

an hour agoranger_danger

"Automated system discriminating against me with no appeal or recourse" may not be the biggest injustice in the world right now, but I fear/loathe that it seems like it's going to keep getting bigger.

A bug blocking functionality is an annoyance, but a Scarlet Letter branded onto a secret dossier is terrifying.

an hour agoTerr_

Does phone-walled mean you have to verify with a phone number? Are you unable to do it because it doesn’t work, or because you don’t want to give it your phone number?

an hour agocortesoft

Most likely they don't want to. It's ridiculous that you would need to give your phone number for some chat program.

an hour agothe_gipsy

The fact that you need to provide a phone number to sign up for Discord is the reason I never will.

There should be no reason for a phone number, nor do I want to waste my time trying to bypass it with internet-provided single-use numbers.

If it is a service I must use, then I will provide a phone number. If it is a service I get to choose to use, then I will never provide one.

34 minutes agoyndoendo

On the two occasions I’ve tried to chat with someone on the public Matrix server, I was completely unable to get it to work. I’ve tried with the new Mac app and with some older thing years ago.

So… choose your poison? I’m sure Matrix/Element works for someone or they would be out of business, but it does not work for me.

an hour agoamluto

I have a similar issue with Matrix as well... even though it's federated, most large rooms use the same bots and blocklists so I end up getting banned from many rooms before I've even attempted to join.

Apparently my monopoly ISP rotates IPs fairly often and I am sharing them with people that have been doing bad things with them, so not only are many Matrix channels blocked but even large regular websites like etsy or locals are completely blocked for me as well. Anything with a CF captcha is also an infinite loop.

an hour agoranger_danger

As far as I know I wasn’t banned or restricted or anything. The client just never managed to create a room or initiate a chat or whatever they called it.

an hour agoamluto

If you live in a GDPR country, you could try that — they're required to explain automated profiling.

22 minutes agodirewolf20

My security collective is honestly considering going back to IRC.

It's becoming increasingly apparent that if you don't use something truly free and open source and host it yourself, you're just setting yourself up for more of this sort of thing.

You can't trust anyone to properly handle the problem of "how the hell do we keep creeps the f*ck away from kids?" with any amount of common sense.

an hour agolenerdenator

Even if you self-host Matrix, there are still multiple ways you could be liable for content you don't even know exists. Especially the last 4 points here:

https://telegra.ph/why-not-matrix-08-07

There are even custom message/media types that people use to upload hidden content you can't see even if you're joined to the same channel using a typical client.
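
To make that concrete: the client-server API will happily relay event types that ordinary clients don't render. A rough sketch in Python using only the standard library - the homeserver, room ID, token and event type here are made-up placeholders:

    import json
    import urllib.request
    from urllib.parse import quote

    # Hypothetical values for illustration only.
    HOMESERVER = "https://matrix.example.org"
    ROOM_ID = "!abc123:example.org"
    TOKEN = "syt_example_access_token"

    # Any event type is accepted by the server; clients that don't
    # recognise "com.example.hidden" typically just won't display it.
    url = (f"{HOMESERVER}/_matrix/client/v3/rooms/"
           f"{quote(ROOM_ID, safe='')}/send/com.example.hidden/txn1")
    req = urllib.request.Request(
        url,
        data=json.dumps({"body": "payload most clients never render"}).encode(),
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
        method="PUT",
    )
    print(urllib.request.urlopen(req).read().decode())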

an hour agoranger_danger

Does matrix self-hosting allow you to disable federation & uploads?

16 minutes agojamespo

Has this actually happened, or is it hypothetical?

19 minutes agodirewolf20
[deleted]
an hour ago

I'll be closing and uninstalling Discord the first time I get a face scan pop up.

2 hours agojosefritzishere

FWIW It's done on the client side and there are multiple ways to bypass it.

https://news.ycombinator.com/item?id=46982421

https://tech.yahoo.com/social-media/articles/now-bypass-disc...

an hour agoranger_danger

That K-ID bypass has already been patched, and even if it's bypassed again, Discord is apparently directing some users to Persona instead now. Persona does server-side classification so that one won't be as easy as nulling out the checks on the client.

The 3D model method might work on Persona, but that demo only shows it fooling K-ID's classifier.

an hour agojsheard

Oh, so they promised your face was only processed on the client and then deleted, but none of that is true? They're courting some huge GDPR fines.

21 minutes agodirewolf20

Eh, the worldwide rollout hasn't happened yet, so currently the only people getting directed to Persona after they promised client-side scanning are those who are fiddling around with Discord's internals to trigger the age verification flow early. But yeah, if they stick with Persona they will need to retract the client-side promise before the full rollout, and it'll be even more fuel on the fire.

18 minutes agojsheard

There's just something about that headline that doesn't land well.

an hour agogenghisjahn

I'll be willing to believe that matrix is a home when they can get their shit together and stop transphobic hate waves for good.

an hour agoxena

Why (and more importantly how) are you proposing a decentralized protocol censors something?

an hour agoGaryBluto

I've always wished there was a market for mod actions.

Moderation and centralization, while typically not independent, aren't necessarily dependent on each other. One can imagine one person viewing content with one set of moderation actions and another person viewing the same content with a different set of moderation actions.

We sort of have this on HN already with the option to view flagged content. It's essentially using an empty set of mod actions.

I believe it's technically viable to syndicate mod actions, and it possibly solves the moderation labor problem, but whether it's a socially viable way to build a network is another question.
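
A toy sketch of the idea, just to make it concrete (the data shapes here are invented for illustration):

    # The same thread rendered under different, subscribable sets of mod actions.
    posts = [
        {"id": 1, "text": "on-topic comment"},
        {"id": 2, "text": "flamebait"},
        {"id": 3, "text": "spam link"},
    ]

    # Each "feed" is just a published set of hide actions anyone can subscribe to.
    mod_feeds = {
        "strict": {2, 3},   # hides flamebait and spam
        "minimal": {3},     # hides only spam
        "none": set(),      # the showdead / view-flagged experience
    }

    def view(posts, feed_name):
        hidden = mod_feeds[feed_name]
        return [p for p in posts if p["id"] not in hidden]

    for name in mod_feeds:
        print(name, [p["id"] for p in view(posts, name)])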

an hour agokelseyfrog

Consider the ActivityPub Fediverse. With notable, short-lived exceptions (when a bad actor shows up with a new technique), the majority of the abuse comes from a handful of instances, whose administrators are generally either negligent or complicit.

an hour agowizzwizz4

So your solution to people using a decentralized, federated protocol to say things you don't like is to stop various servers interacting with each other? At that point why not just use federated services with multiple accounts?

It seems far too risky to sign up on a service for the purpose of intercommunication that is able (or even likely) to burn bridges with another for any reason at any time. In the end people will just accumulate on 2 or 3 big providers and then you have pseudo-federation anyway.

an hour agoGaryBluto

Servers stopping federation with each other is pretty normal IMO. If I had a mastodon server I would also not federate with something like gab.com.

However all the LGBT+ friendly servers federate with each other and that's good enough for me. I like not having to see toxicity, there's too much of it in the world already.

40 minutes agowolvoleo

My solution is for instances to stop being negligent. Mastodon still directs everyone to create an account on mastodon.social using dark patterns (see https://joinmastodon.org/), which has led to the flagship instance being far bigger than its moderation team can handle, leading to a situation where it's a major source of abuse and where defederation is too costly for many to consider.

"People will just accumulate on 2 or 3 big providers" is far from an inevitable circumstance, but there are conditions that make it more likely. That, too, is largely down to negligence or malice (but less so than the abusive communications problem).

an hour agowizzwizz4

> which has led to the flagship instance being far bigger than its moderation team can handle, leading to a situation where it's a major source of abuse

Is that still true? As the admin of a small instance, I find the abuse coming from mastodon.social has been really low for a few years. There is the occasional spammer, but they often deal with it as quickly as I do.

an hour agoprogval

Throwing in Nostr as a truly decentralized alternative. Instead of relying on federated servers, the messages themselves are signed and relayed for anyone to receive.
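
For the curious, a Nostr "message" is just a signed JSON event (NIP-01); relays only pass it along, and anyone can verify the signature, so there's no homeserver to lean on. A minimal sketch in Python, with the actual Schnorr signing left as a comment since it needs a secp256k1 library:

    import hashlib
    import json
    import time

    pubkey = "ab" * 32            # hex-encoded public key (placeholder)
    created_at = int(time.time())
    kind, tags, content = 1, [], "hello nostr"

    # The event id is the sha256 of this canonical serialisation (NIP-01).
    serialised = json.dumps([0, pubkey, created_at, kind, tags, content],
                            separators=(",", ":"), ensure_ascii=False)
    event_id = hashlib.sha256(serialised.encode()).hexdigest()

    event = {
        "id": event_id,
        "pubkey": pubkey,
        "created_at": created_at,
        "kind": kind,
        "tags": tags,
        "content": content,
        # "sig" would be a 64-byte BIP-340 Schnorr signature over id,
        # made with the key behind pubkey (e.g. via a secp256k1 library).
    }
    print(json.dumps(event, indent=2))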

34 minutes agolittlecranky67

it's up to the maintainer of a particular server to moderate what goes on in said server. Now, if the Matrix.org Foundation wants to moderate their servers one way or the other, that's one thing, but to expect the protocol/spec to lay down a content policy is, with all due respect, dumb as hell.

an hour agob00ty4breakfast

you are literally on hackernews

an hour agoaystatic

Is the implication that HN is transphobic?

an hour agopoly2it

you're free to have your own opinion based on your experiences here, but i wouldn't blame anyone for feeling that way. for the record, i don't think dang or anybody is a transphobe, but i have to imagine the culture here is pretty off-putting to trans people

https://news.ycombinator.com/item?id=36231993

an hour agoaystatic

This is a wild take. HN has transphobic users like it has trans and ally users. It's neutral to this topic, it's about tech.

an hour agooytis

i don't think it's that "wild". sure, i'm not so cynical as to feel hn's become a nazi bar or anything, but i am willing to recognize that some of the incidents i've witnessed could be reason enough for a trans person to want to avoid this site.

> It's neutral to this topic, it's about tech.

this thread began by xe bringing up failures in moderation affecting trans people

41 minutes agoaystatic

That isn't how it works. The presence of neutral allies doesn't somehow counterbalance and cancel out the transphobia. If a platform allows transphobic users - as Hacker News does because transphobia isn't against the guidelines - and transphobia is common in threads where trans issues or people are a subject (and it is) then it's a hostile platform to trans people.

Asking trans people to ignore this is like asking Jews to be comfortable in a bar where only ten percent of the patrons are Nazis. Arguing that "well not everyone is a Nazi" doesn't help, an attitude of "we're neutral about Nazis, we serve drinks to anyone" still makes it a Nazi bar, just implicitly rather than explicitly.

42 minutes agokrapp

I'd agree with this logic if we were discussing all kinds of different topics here, and one's stance on gender would be immediately visible to anyone. But I can't remember the last time the matters of gender were discussed here at all, and pretty sure anything openly transphobic would be flagged or deleted pretty soon.

39 minutes agooytis

>I'd agree with this logic if we were discussing all kinds of different topics here, and one's stance on gender would be immediately visible to anyone.

We do discuss all kinds of different topics here. Despite what many people here want to believe, Hacker News isn't exclusively for tech and tech-related subjects.

>and pretty sure anything openly transphobic would be flagged or deleted pretty soon.

But not banned, that's the problem. The guidelines are extremely pedantic but nowhere is bigotry, racism, antisemitism or transphobia mentioned as being against those guidelines. You might say that shouldn't be necessary, but it's weird that so much effort is put into tone policing specific edge cases but the closest the guidelines come to defending marginalized groups is "Please don't use Hacker News for political or ideological battle. It tramples curiosity." Transphobia is treated as a mere faux pas on the same par as being too snarky, or tediously repetitive. The real transgression being not the bigotry but "trampling curiosity." Any trans person who posts here knows that bigots who hate them and want to do them harm aren't going to suffer meaningful consequences (especially if they just spin up a green account) and that the culture here isn't that concerned about their safety.

Read the green account just below me. That sort of thing happens all the time. Yes, the comment is [dead] but why should a trans person be comfortable here, or consider themselves welcome, knowing that this is the kind of thing they'll encounter?

19 minutes agokrapp

I'm not in a position to tell marginalized people how they should feel, but a moderation policy that wouldn't even allow offensive messages by new accounts to appear for a short time would make this place into just another social media site - walled off and tracking its users. I understand the point though.

2 minutes agooytis
[deleted]
24 minutes ago

[dead]

32 minutes agouxhoiuewfhhiu

Elaborate?