
The Internet runs on free and open source software and so does the DNS

Report: The Domain Name System Runs on Free and Open Source Software (FOSS) [pdf] https://itp.cdn.icann.org/en/files/security-and-stability-ad...

It's interesting that there is a generation of developers now who seem to believe that the Internet is an achievement of purely commercial market dynamics, and who are surprised to learn about ARPANET and its early development within academia (we were taught this history in the first year of university). If the foundations of the Internet (particularly the protocol suites) had not been open, government-funded and not-for-profit, we would probably have a number of competing closed platforms instead of a single Internet, with paid services to perform protocol translations between them.

8 hours ago | hliyan

Even among the open protocols there was healthy competition, like Gopher and BBS networks.

It was truly a time to be alive, the 1980s and 1990s, if you were into technology. The change was interesting and quite rapid, with a lot of variety and experimentation.

4 hours ago | no_wizard

Yep, services like AOL, CompuServe, Prodigy, and so on would absolutely be separate. As it was, each of these services basically had to allow access to the greater Internet.

3 hours ago | Sophira

We would have had telcos (or satellite TVs) creating and competing on siloed services and eventually some big company would buy a telco to have access to their customers. Some multinational telcos would provide the same services around the world but most countries would have a wall at the border.

5 hours ago | pmontra

The thing is, we did have telcos and others provide siloed services. AOL being the big one, but also things like Prodigy, CompuServe, French Minitel, etc. What I wonder about is: would an Internet-like system have arisen anyway just because it's such a better UX? You saw a development like that with FidoNet, for example, but I think the Internet stole its oxygen.

4 hours ago | foobarian

Minitel was offered by France Télécom, then a government-operated service, under what was originally the Ministry of Postes, Télégraphes et Téléphones. (The name "Minitel" is not short for "ministry of telecommunications", but rather Médium interactif par numérisation d'information téléphonique. Though I suspect some advantageous ambiguity was exploited.)

France Télécom was privatised during the 1990s with the acquisition of the Orange Group, its current (commercial) branding.

2 hours ago | dredmorbius

We did have that well into the Internet years, because some services were not there yet. I'll give you an example: I worked for one of the early 3G providers (2003-). YouTube is from 2005. We developed and ran streaming video services because our customers did not have any meaningful source of videos to watch, so we had to build that and other services to drive the sales of our phone contracts. Mobile web sites were not a thing for a few more years, so we had news and weather and many other things. Our little private internet. Then YouTube became big, other services were born and became big too, and all phone companies basically became internet providers.

Without IP protocols we probably would not even be born as a company. BBSes on analog modems or GSM data would be all we had.

4 hours ago | pmontra

So true. I find it more proof that altruistic collaboration beats any other model, even though users may not perceive it as such, or there is no interest in spreading these facts.

7 hours ago | n3storm

Altruistic? DARPA is a military agency, and ARPANET was a prototype network designed to survive a nuclear strike. I think the grandparent comment's point is that the innovation was government-funded and made available openly; none of which depends in the slightest on its being altruistic.

5 hours ago | cousin_it

The resilience of ARPANET was influenced by CYCLADES, which was developed in French Academia: https://en.wikipedia.org/wiki/CYCLADES

> The CYCLADES network was the first to make the hosts responsible for the reliable delivery of data, rather than this being a centralized service of the network itself. Datagrams were exchanged on the network using transport protocols that do not guarantee reliable delivery, but only attempt best-effort [..] The experience with these concepts led to the design of key features of the Internet Protocol in the ARPANET project

Keeping with the theme of the thread, CYCLADES was destroyed because of greed:

> Data transmission was a state monopoly in France at the time, and IRIA needed a special dispensation to run the CYCLADES network. The PTT did not agree to funding by the government of a competitor to their Transpac network, and insisted that the permission and funding be rescinded. By 1981, Cyclades was forced to shut down.

5 hours ago | tremon

Any technology that is developed to be federated and resilient in the face of apocalyptic events is definitionally altruistic toward humanity.

3 hours ago | fellowniusmonk

https://siliconfolklore.com/internet-history/

> Rumors had persisted for years that the ARPANET had been built to protect national security in the face of a nuclear attack. It was a myth that had gone unchallenged long enough to become widely accepted as fact.

No, the Internet (inclusive of ARPANET, NSFNet, and so on) was not designed to survive a nuclear war. It's the worst kind of myth: One you can cite legitimate sources for, because it's been repeated long enough even semi-experts believe it.

The ARPANET was made to help researchers and to justify the cost of a mainframe computer:

> It's understandable how it could spread. Military communications during Nuclear War makes a more memorable story than designing a way to remote access what would become the first massively parallel computer, the ILLIAC IV. The funding and motivation for building ARPANET was partially to get this computer, once built, to be "online" in order to justify the cost of building it. This way more scientists could use the expensive machine.

4 hours ago | msla

That's a valiant attempt at myth-fighting, but it doesn't fully convince me. For example, one hop to Wikipedia gives this:

> Later, in the 1970s, ARPA did emphasize the goal of "command and control". According to Stephen J. Lukasik, who was deputy director (1967–1970) and Director of DARPA (1970–1975):

> "The goal was to exploit new computer technologies to meet the needs of military command and control against nuclear threats, achieve survivable control of US nuclear forces, and improve military tactical and management decision making."

3 hours ago | cousin_it

The protocols were made open by necessity, not by design. The motive was to connect academic, government, and commercial institutions across the country, all of which operated on incompatible operating systems and data networks. However, the common man would not have benefited from this before 1993, as the government effectively operated as a semi-competent firewall against commercial content and the broader public. They even sued ISPs that permitted legitimate account holders to remotely access the net over PPP or SLIP. Not even commercial news feeds were permitted until the late 80s.

The only Internet the common man interacted with is the one that began to flourish as the government relinquished control. The Internet since the mid-90s is and has been a purely commercial achievement.

2 hours ago | Dracophoenix

There were some early ISPs, like The World (Boston), that had IP access around 1991 or so. I believe they were connected through UUNET. I don't know if they were routable on the NSFNet? "Commercial" traffic was supposedly prohibited.

2 hours ago | icedchai

I remember a few BBSs that offered email (done via nightly UUCP so slower than people today are used to) to the common person.

Because it was new, there weren't really well-thought-out mechanisms for people connecting early on, but some did.

http://www.armory.com/~spcecdt/deepthought.html

2 hours ago | _DeadFred_

I had a UUCP feed, by either late 1992 or early '93. Several other local BBSes had one, as well.

an hour ago | icedchai

> we would probably have a number of competing closed platforms instead of a single Internet, with paid services to perform protocol translations between them.

That did exist with the likes of Tymnet and the various Online Services (AOL, CompuServe, etc). The Internet won out over those because it was open, as you alluded to. Internet adoption really exploded with unlimited services rather than ISPs that billed hourly.

4 hours ago | giantrobot

Not really surprising when you have political parties and large tech companies either minimizing the role public institutions had in developing any of this or outright lying that the public sector can't possibly do anything at all.

4 hours ago | mhurron

Open source and hacker culture is basically a massive ethical triumph that has catapulted humanity forward.

There isn't a single person living on the planet who isn't touched and benefited by this in some way, even remote island tribes we consider untouched; there is hardly a single ounce of space payload that doesn't have open source in its causal chain.

There is no more pragmatically altruistic culture that has ever existed or impacted more people.

In the U.S., no culture has done more to help lift the impoverished out of poverty and into the upper middle class through empowerment (ask any of us adults who were once kids going to bed hungry).

People have their pet beliefs and metaphysics and ideological communities but "information deserves to be free" is the goat.

3 hours ago | fellowniusmonk

Thanks for pointing this out. We're so surrounded by the dystopian tech future, it's nice to see some positivity toward tech; it's been a while.

2 hours ago | _DeadFred_

> There isn't a single person living on the planet that isn't touched and benefitted by this in some way, even remote island tribes we consider untouched...

That seems a bit over the top. I'd love to see your causal chain for that claim.

3 hours ago | AnimalMuppet

The Sentinelese monitoring program run by the government of India, which works to enforce no-contact rules, relies on open source in its causal chain.

My mother did remote tribal work in the Brazilian interior; she grew up on the Amazon. I personally know well-off Ruby devs who grew up unable to even read.

Whether you are against intervention or pro intervention, all across that spectrum, whatever your luxury belief may be, everyone is impacted by open-source.

Open source is an insane force multiplier, not just for production but for the entire training and R&D pipeline as well.

2 hours ago | fellowniusmonk

Imagine a scenario where you want to start gardening. Go to gardening clubs and you'll find a lot of free information there and people to guide you. Public libraries exist if you want to join a book club and start reading. Again, free. Agriculture, irrigation, building homes, woodworking, stitching clothes, etc.: everything essential has been free to learn and do.

Apply this to the internet and the essentials are FOSS. Linux, DNS, and maybe RISC-V someday will mean you can build computers and the internet on essentials that are free to learn and use.

15 hours ago | mumber_typhoon

In the same analogy, doesn't that mean that vendor-locked software like iOS or ChromeOS would be akin to vendor-locked seeds from Monsanto?

11 hours ago | cobertos

Bayer these days, and yes, avoid it like the plague, for nothing good will come of it.

10 hours ago | mrarjen

Raspberry Pi's obviously trying to make this a reality.

Learning to self-host and get off cloud services might be one of the most personally freeing feelings I've had in a long time.

Rent-seeking is obviously growing out of control and one of the most powerful ways to combat it is personal ownership (if possible).

13 hours ago | staplers

Land for a garden in my town costs about 1000 EUR per square meter. Gardening clubs are full of old dudes who want to have sex with me. Public libraries are homeless shelters now. If I actually plant some vegetables (in pots or in the front yard), they will be full of dog and cat excrement the next day!

You are living in an imaginary land; nothing is free in today's society!

10 hours ago | throw844958585i

Just because you live somewhere shitty doesn't mean everyone has to, or wants to...

(I do realise you used euros; I just don't think we need to adjust our standards down when our locality sucks.)

4 hours ago | 1718627440

From what I remember, most of the DNS root servers used to run Bind9 exclusively. I'm glad to see that this is now more diverse, with NSD and Knot also being used (see table 4 in the report).

Nothing against Bind9, but it is almost exclusively maintained by the ISC, so the DNS's future used to depend heavily on the ISC getting the funding needed to continue operating.
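
For what it's worth, you can ask a server what software it reports running. A minimal sketch, assuming dnspython is installed; the CHAOS-class "version.bind" TXT query is a long-standing convention, and many operators hide or spoof the answer, so treat it as a hint only:

    # Minimal sketch, assuming dnspython is installed (pip install dnspython).
    # "version.bind" in the CHAOS class asks a DNS server to report its
    # software/version string; many servers deliberately hide or mask it.
    import dns.message
    import dns.query

    ROOTS = {
        "a.root-servers.net": "198.41.0.4",     # well-known root server addresses
        "k.root-servers.net": "193.0.14.129",
    }

    for name, addr in ROOTS.items():
        query = dns.message.make_query("version.bind", "TXT", rdclass="CH")
        try:
            reply = dns.query.udp(query, addr, timeout=3)
            answers = [rrset.to_text() for rrset in reply.answer]
            print(name, answers or "(hidden / no answer)")
        except Exception as exc:
            print(name, "query failed:", exc)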

8 hours ago | LeonM

Not to mention how much better it is for standards, protocols, and standardization to have multiple implementations of the same protocol in real-world usage, so we can nail down the protocol in all situations. Bind9 almost ended up being "DNS" itself, which would have been bad overall; instead we're seeing more diversity, which means we'll be able to make better use of the specifications.

6 hours ago | embedding-shape

Perhaps it doesn't even matter anymore, but I'm not yet past the point where it's disheartening every time I click on a link and it's clear that it came out of an LLM. Hopefully this doesn't extend to the actual report.

10 hours ago | evertedsphere

We should tax Cloudflare, AWS, etc. for using public infrastructure.

13 hours ago | seydor

I don't understand your sentiment against Cloudflare here.

Cloudflare also delivers a rather large portion of said public infrastructure free of charge. They also released a few of their own projects as FOSS, and regularly contribute.

Granted, the centralisation part worries me too, but it feels like a bit of a cheap shot against CF just because they are a large player.

9 hours ago | LeonM

We do tax them, my dude.

13 hours ago | renewiltord

Just maybe not enough.

9 hours ago | tellarin

>In the cloud, hyperscale computing platforms such as Microsoft Azure, Google Cloud, and Amazon Web Services all operate significant resolver infrastructure to support their services. At least four of the biggest hyperscalers rely on FOSS for DNS resolving, while others have built proprietary solutions based on FOSS DNS libraries.

This is surprising. I would have expected them to have such custom needs, with so many customers, that an off-the-shelf solution would not be sufficient.

18 hours ago | charcircuit

It's all cURL.

16 hours ago | lofties

It's cURL, SQLite, and one weird awk script no one understands but that works, so no one touches it.

https://xkcd.com/2347/

4 hours ago | giantrobot

But the infrastructure is highly centralized, and only certain chosen entities can operate gTLDs and certificate authorities. It's extremely misleading to call it 'free software'. Why can't there be multiple competing systems? There should be a push for blockchain-based alternatives. I still don't understand why projects like Unstoppable Domains aren't getting more traction. The idea of a domain name that you actually own is appealing.

20 hours ago | jongjong

There can be and are multiple competing systems. There are alternative DNS roots (opennic.org, for example) and entirely separate protocols like IPFS and I2P with their own methods of mapping names to numbers.

You can go and make your very own alternative DNS system, with your own governance and policy. As free as you like. You just have to convince people to resolve against you.
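
And "resolving against you" is just a matter of which servers clients send their queries to. A minimal sketch, assuming dnspython is installed; the address and TLD below are placeholders rather than real values, so substitute whatever the alternative root you trust actually publishes (e.g. OpenNIC's listed servers):

    # Minimal sketch, assuming dnspython is installed. The address and TLD are
    # placeholders, not real values; substitute the servers published by
    # whichever alternative root you choose to trust.
    import dns.message
    import dns.query

    ALT_ROOT = "203.0.113.53"   # placeholder address (TEST-NET-3), not a real server
    NAME = "example.libre."     # placeholder name under a hypothetical alternative TLD

    # Ask the alternative root which nameservers it delegates the TLD to.
    query = dns.message.make_query(NAME, "NS")
    try:
        reply = dns.query.udp(query, ALT_ROOT, timeout=3)
        for section in (reply.answer, reply.authority):
            for rrset in section:
                print(rrset.to_text())
    except Exception as exc:
        print("query failed (expected with placeholder values):", exc)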

19 hours ago | denkmoon

I think permanent identifiers might be helpful; they could include a timestamp and other data. This could be combined with a web of trust, so that names can then be verified that way.

19 hours ago | zzo38computer

I think you may be applying a very weird definition of "free software", even compared to the usual gratis vs libre axis.

Also, I really don't think controlling a domain-name NFT in a system that's mostly computers you neither own nor control constitutes "more ownership" than the IRL, law- and contract-bound rental world we currently live in. Especially if all the requirements and outcomes (payments for control resulting in land grabs of valuable names) are the same as in our current system.

20 hours ago | lelandbatey

Maybe not more ownership by the owner, but certainly more ownership by the user, which is what's important. If there are multiple blockchain-based alternatives, I can choose which one to resolve with; it's also essentially built-in namespacing (with each name marked with its resolver.) And although I'm personally very crypto-negative, a distributed ledger is exactly what I would want to make sure that any nodes that I use to resolve a name on a particular registry are trustworthy.

The throughput problem that poisons cryptocurrency becomes irrelevant when we're talking about something as naturally long-lived as domain names. Every domain blockchain can have its own gatekeeping process; one can sell names for thousands of dollars each, and another can give away thousands for a dollar. They can require that domain owners have a camera pointed at them personally 24 hours a day or be revoked, or they can hand out infinite names through an onion-routed API.

2 hours ago | pessimizer

They just run one root. You can run a different root; e.g., some people run an ENS bridge.