IMHO D just missed the mark with the GC in core. It was released in a time where a replacement for C++ was sorely needed, and it tried to position itself as that (obvious from the name).
But by including the GC/runtime it went into a category with C# and Java which are much better options if you're fine with shipping a runtime and GC. Eventually Go showed up to crowd out this space even further.
Meanwhile in the C/C++ replacement camp there was nothing credible until Rust showed up, and nowadays I think Zig is what D wanted to be with more momentum behind it.
Still kind of salty about the directions they took because we could have had a viable C++ alternative way earlier - I remember getting excited about the language a lifetime ago :D
I'd rather say that the GC is the superpower of the language. It allows you to quickly prototype without focusing too much on performance, but it also allows you to come back to the exact same piece of code and rewrite it using malloc at any time. C# or Java don't have this, nor can they compile C code and seamlessly interoperate with it — but in D, this is effortless.
Furthermore, if you dig deeper, you'll find that D offers far greater control over its garbage collector than any other high-level language, to the point that you can eagerly free chunks of allocated memory, minimizing or eliminating garbage collector stops where it matters.
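Roughly the kind of control being described, as a minimal sketch using druntime's core.memory.GC (names and sizes are just illustrative):

    import core.memory : GC;

    void latencySensitiveWork()
    {
        GC.disable();              // no collections while this runs
        scope (exit) GC.enable();

        auto buf = new ubyte[](256 * 1024);   // GC-allocated scratch buffer
        // ... use buf ...
        GC.free(buf.ptr);          // eagerly hand the block back instead of
                                   // waiting for a collection
    }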
> C# or Java don't have this, nor can they compile C code and seamlessly interoperate with it — but in D, this is effortless.
C# C interop is pretty smooth, Java is a different story. The fact that C# is becoming the GC language in game dev is proving my point.
>Furthermore, if you dig deeper, you'll find that D offers far greater control over its garbage collector than any other high-level language, to the point that you can eagerly free chunks of allocated memory, minimizing or eliminating garbage collector stops where it matters.
Yes, and the no-GC stuff was just an attempt to backpedal on the wrong initial decision, to fit the use cases they should have targeted from the start, in my opinion.
Look, D was an OK language, but it had no corporate backing and there was no case where it was "the only good solution". If it had been an actual C++ modernization attempt that stayed C-compatible, it would have seen much better adoption.
Fil-C, the new memory-safe C/C++ compiler, actually achieved that by introducing a GC. With that in mind, I'd say D was kind of a misunderstood prodigy in retrospect.
There are two classes of programs: stuff written in C for historic reasons that could have been written in a higher-level language, but where a rewrite is too expensive - Fil-C.
Stuff where you need low level - Rust/C++/Zig.
I often see people lament the lack of popularity of D in comparison to Rust. I've always been curious about D as I like a lot of what Rust does, but I never found the time to do a deep dive and would appreciate someone whetting my appetite.
Are there technical reasons that Rust took off and D didn't?
What are some advantages of D over Rust (and vice versa)?
D and Rust sit at opposite ends of the spectrum in dealing with memory safety. Rust ensures safety by constantly making you think about memory with its highly sophisticated compile-time checks. D, on the other hand, lets you either employ a GC and forget about (almost) all memory-safety concerns, or take a block-scoped opt-out with cowboy-style manual memory management (roughly sketched below).
D retains object-oriented programming but also allows functional programming, while Rust seems to be specifically designed for functional programming and does not allow OOP in the conventional sense.
I've been working with D for a couple of months now and I noticed that it's almost a no-brainer to port C/C++ code to D because it mostly builds on the same semantics. With Rust, porting a piece of code may often require rethinking the whole thing from scratch.
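The block-scoped opt-out mentioned above looks roughly like this (a minimal sketch; core.stdc.stdlib is druntime's binding to the C allocator):

    import core.stdc.stdlib : malloc, free;

    @nogc nothrow void manualPath()
    {
        auto p = cast(int*) malloc(100 * int.sizeof);
        if (p is null) return;
        scope (exit) free(p);      // cowboy-style: you free it yourself
        p[0] = 42;
    }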
> block scoped opt-out with cowboy-style manual memory management
Is this a Walter Bright alt? I've seen him use the cowboy programmer term a few times on the forum before.
Yeah, I just saw his posts too and picked up the term :)
> Are there technical reasons that Rust took off and D didn't?
As someone who considered it back then when it actually stood a chance to become the next big thing, from what I remember, the whole ecosystem was just too confusing and simply didn't look stable and reliable enough to build upon long-term. A few examples:
* The compiler situation: The official compiler was not yet FOSS, and other compilers were not available or at least not usable. The switch to FOSS happened way too late, and GCC support took too long to mature.
* This whole D version 1 vs version 2 thingy
* This whole Phobos vs Tango standard library thingy
* This whole GC vs no-GC thingy
This is not a judgement on D itself or its governance. I always thought it was a very nice language, and the project simply lacked manpower and commercial backing to overcome the magical barrier of wide adoption. There was some excitement when Facebook picked it up, but unfortunately, it seems it didn't really stick.
How many people were working on the core compiler/language at the time versus Rust? This could explain it.
I think 3 things:
1. D had a split similar to Python 2 vs 3 early on with having the garbage collector or not (and therefore effectively 2 standard libraries), but unlike Python it didn't already have a massive community that was willing to suffer through it.
2. It didn't really have any big backing. Rust having Mozilla backing it for integration with Firefox makes a pretty big difference.
3. D wasn't different enough; it felt much more like "this is C++ done better" than its own language, but unlike C++, which is mostly a superset of C, you couldn't do "C with classes"-style migrations.
One feature of D that I really wish other languages would adopt (not sure about Rust, but I also think it lacks it, though if it has it to a similar extent as D, that might be the reason I check it again more seriously) is the metaprogramming and compile-time code evaluation features it has (IIRC you can use most of the language at compile time, as it runs in a bytecode VM), down to even having functions that generate source code which is then treated as part of the compilation process (a small example follows below).
Of course you can make codegen as part of your build process with any language, but that can be kludgy (and often limited to a single project).
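A small, hedged sketch of the CTFE + string-mixin combination being described; the generator is an ordinary D function that the compiler evaluates at compile time:

    // Ordinary function; the compiler can evaluate it at compile time (CTFE).
    string makeGetters(string[] fields)
    {
        string code;
        foreach (f; fields)
            code ~= "int " ~ f ~ "() const { return _" ~ f ~ "; }\n";
        return code;
    }

    struct Point
    {
        private int _x, _y;
        // The generated source text is compiled as if written by hand.
        mixin(makeGetters(["x", "y"]));
    }

    static assert(Point(2, 3).x == 2);   // also evaluated at compile time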
Arguably, most of the metaprogramming in D is done with templates, and it comes with all the flaws of templates in C++. The error messages are long and it's hard to decipher what exactly went wrong (static asserts help a lot here, when they actually exist). IDE support is non-existent after a certain point because the IDE can't reason about code that doesn't exist yet. And code gets less self-documenting because it's all Output(T,U) foo(T, U)(T t, U u), and even the official samples use auto everywhere because it's hard to get the actual output types.
I'd say D's template error messages are much better than C++'s, because D prints the instantiation stack with exact locations in the code and the whole message is just more concise. In C++, it just prints a bunch of gibberish, and you're basically left guessing.
> Are there technical reasons that Rust took off and D didn't?
My (somewhat outdated) experience is that D feels like a better and more elegant C++. Rust certainly has been influenced by C and C++, but it also took a lot of inspiration from the ML-family of languages and it has a much stronger type system as a consequence.
It's more about the companies that jumped on D versus Rust; D only had Facebook and Remedy Games toy with it a bit.
Many of us believe in automatic memory management for systems programming, having used quite a few such languages in those scenarios, so that is already one thing D does better than Rust.
There is the GC phobia, mostly from folks that don't get that not all GCs are born alike. Just like you need to pick and choose your malloc()/free() implementation depending on the scenario, there are many ways to implement a GC, and having a GC doesn't preclude having value types, stack allocation, and global memory segment allocation.
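A quick hedged illustration of that last point: structs are value types and live wherever you put them; only things like new expressions, dynamic arrays, and closures reach for the GC heap.

    struct Vec3 { float x, y, z; }     // value type, no GC involvement

    __gshared Vec3 origin;             // lives in the global data segment

    void demo()
    {
        Vec3 local = Vec3(1, 2, 3);    // plain stack allocation
        Vec3[16] scratch;              // fixed-size array, also on the stack
        auto boxed = new Vec3;         // only this one touches the GC heap
        *boxed = local;
    }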
D has compile-time reflection, its compile-time metaprogramming is much easier to use than Rust macros, and it does compile-time execution as well.
And the compile times! It is like using Turbo Pascal, Delphi, ... even though the language is like C++ in capabilities. Yet another proof that complexity doesn't have to imply slow compile times in a native systems language.
For me, C# and Swift have replaced the tasks at work where in the past I could have reached for D, mostly due to who is behind those languages, and because I don't want to be that guy who leaves and is the only one who knew the stack.
> Many of us believe on automatic memory management for systems programming
The problem is the term "systems programming". For some, it's kernels and device drivers. For some, it's embedded real-time systems. For some, it's databases, game engines, compilers, language run-times, whatever.
There is no GC that could possibly handle all these use-cases.
But there could be a smoother path between having a GC and having no GC.
Right now, you'd have to switch languages.
But in a Great Language you'd just have to refactor some code.
> Are there technical reasons that Rust took off and D didn't?
Yes. D tried to jump on the "systems programming with garbage collection" dead horse, with predictable results.
(People who want that sort of stupidity already have Go and Java, they don't need D.)
> (People who want that sort of stupidity already have Go and Java, they don't need D.)
Go wasn't around when D was created, and Java was an unbelievable memory hog, with execution speeds that could only be described as "glacial".
As an example, using my 2001 desktop, the `ls` program at the time was a few kb, needed about the same in runtime RAM and started up and completed execution in under 100ms.
The almost equivalent Java program I wrote in 2001 to list files (with `ls` options) took over 5s just to start up and chewed through about 16MB of RAM (around 1/4 of my system's RAM).
Java was a non-starter.
Go wasn't around when D was released and Java has for the longest time been quite horrible (I first learnt it before diamond inference was a thing, but leaving that aside it's been overly verbose and awkward until relatively recently).
Is Java even a "systems programming" language?
I don't even know what that term means anymore; but afaik Java didn't really have reliable low-level APIs until recently.
Depends if one considers writing compilers, linkers, JITs, database engines, and running bare metal on embedded real time systems "systems programming".
As far as adoption is concerned, I'm not sure it should be that big of a concern.
After all, D is supported by GCC and Clang and continually being maintained, and if updates stopped coming at some point in the future, anyone who knew a bit of C / Java / insert language here could easily port it to their language of choice.
Meanwhile, its syntax is more expressive than many other compiled languages, the library is feature-rich and fairly tidy, and for me it's been a joy to use.
It has an LLVM backend, LDC, that is separate from the LLVM project/Clang.
GCC usually drops frontends if there are no maintainers around, it already happened to gcj, and I am waiting for the same to happen to gccgo any time now, as it has hardly gotten any updates since Go 1.18.
The team is quite small and mostly volunteers, so there is the question of how long Walter Bright can keep at it, and who will keep it going when he passes the torch.
I like D in general, however it is missing out on WASM, where other languages like Rust, Zig, and even Go are thriving. The official reasoning usually involved waiting for GC support in the WASM runtime, but other GC languages seem to just ship their own GC and move on.
D is boring, let's see how to recreate the B language:
https://www.youtube.com/playlist?list=PLpM-Dvs8t0VZn81xEz6Ng...
What can D do other languages can't?
Say you're starting a new Staff Engineer or Tech Lead job. What gets you to convince a CTO that we need to have a team learn D?
On the flip side, where are the 200k base salary D positions?
Get me an interview in 2 months and I'll drop 10 hours a week into learning it.
Well, I would say it's more like glasses - you can't convince those who don't wear them, and you don't need to convince those who need them either.
What problem is D solving?
One good case for it that I see is a viable basis for cross-platform desktop apps. Today, cross-platform desktop GUI apps are either just a snapshot of the website contained inside Electron, or a C/C++ code base with manual memory management. D can serve as a nice middle ground in that space.
Off topic: back in the day, Andrei Alexandrescu's C++ programming books were a joy to read, especially Modern C++ Design.
Also, this presentation https://accu.org/conf-docs/PDFs_2007/Alexandrescu-Choose_You... killed a lot of bike shedding!
When I was a student, our group was forced to use D instead of C++ for CS2* classes. That was back in 2009. After 16 years, I see that the level of adoption hasn't changed at all.
Seen D being posted regularly on here, seems like flogging a dead horse. It's the equivalent of keeping grandma on life support when there is no hope.
You'd be surprised to see how active the D community is, despite your fair point that it's noticeably smaller than in the "competing" (in quotes because it's not a competition, actually) languages.
The latest release [1] was on Jan 7th, and it contains more updates than, say, the latest release of Dart, which has one of the largest corporations behind it.
1. https://dlang.org/changelog/2.112.0.html
How good are the big LLMs at writing D code? Just curious.
I was personally a lot more excited by D and subsequently Nim, but ultimately it's Rust and Zig that got adoption. Sigh.
D is a treasure we should continue to cherish and protect
A language with sane Compile Time features (Type Introspection, CTFE, mixins, etc)
A language that can embrace the C ecosystem with sane diagnostics
A language that ships with its own optimizing code generator and inline assembler!
A compiler that compiles code VERY fast
A compiler with readable source code that bootstraps itself in just 5 seconds
People who dunk on it "bEcAuSe iT Is nOt MaInsTrEaM" are clueless
I remember the creator of the D programming language replying to me on HN on one of my posts!
https://news.ycombinator.com/item?id=46261452
Walter's a regular on HN.
This very post is probably his too, under an alt :)
I had an interview at Facebook 10+ years ago and my interviewer was the other creator!
I never understood why this language didn't gain much traction. It seems very solid.
At the same time, I've never used it, and I'm not sure why.
Anyway, the author of the D language is here on HN (Walter Bright).
Sigh.
Ownership and borrowing are so much less baroque in D than in Rust. And compile times are superb.
In a better world, we would all be using D instead of C, C++ or Rust.
However, in this age of Kali...
For those curious what ownership and borrowing look like in D: https://dlang.org/blog/2019/07/15/ownership-and-borrowing-in...
This is a somewhat simplistic view of ownership and borrowing for modern programming languages.
Pointers are not the only 'pointers' to resources. You can have handles specific to your codebase or system, you can have indices to objects in some flat array that the rest of your codebase uses, even temporary file names.
An object oriented (or 'multi paradigm') language has to account for these and not just literal pointers.
This is handled reasonably well in both Rust and C++. (In the spirit of avoiding yet another C++ vs Rust flamewar here: yes, the semantics are different; no, it does not make sense for C++ to adopt Rust semantics.)
How does Rust (or C++) treat array indices as resources? And won't that defeat the purpose of using indices over pointers?
I don't know D so I'm probably missing some basic syntax. If pointers cannot be copied how do you have multiple objects referencing the same shared object?
> If pointers cannot be copied
They can.
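A trivial sketch of what that means (hedged; names are just for illustration): D pointers and class references copy freely, and the GC is what keeps those shared references safe by default.

    class Node { int value; }

    void demo()
    {
        auto a = new Node;        // one object on the GC heap
        auto b = a;               // copies the reference, not the object
        b.value = 7;
        assert(a.value == 7);     // both names refer to the same Node
    }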
Is there any experience on how this works in practice?
OOP and ownership are two concepts that mix poorly - ownership in the presence of OOP-like constructs is never simple.
The reason for that is that OOP tends to favor constructs where each object holds references to other objects, creating whole graphs; it's not uncommon that from a single object, hundreds of others can be traversed.
Even something so simple as calling a member function from a member function becomes incredibly difficult to handle.
Tbh, this is with good reason: one of the biggest flaws of OOP is that if x.foo() calls x.bar() in the middle, x.bar() can clobber a lot of local state, resulting in code that's very difficult to reason about, both for the compiler and the programmer.
And that's a simple case; OOP offers tons of tools to make the programmer's job even more difficult: virtual methods, object chains with callbacks, etc. It's just not a clean programming style.
Edit: Just to make it clear, I am not pointing out these problems to sell you a solution or even to imply that I have one. I'm not saying programming style X is better.
I work at a D company. We tend to use OOP only for state owners with strict dependencies, so it's rare to even get cycles. It is extremely useful for modeling application state. However, all the domain data is described by immutable values and objects are accessed via parameters as much as fields.
When commandline apps were everywhere, people dreamed of graphical interfaces. Burdened by having to also do jobs that it was bad at, the commandline got a bad reputation. It took the dominance of the desktop for commandline apps to find their niche.
In a similar way, OOP is cursed by its popularity. It has to become part of a mixed diet so that people can put it where it has advantages, and it does have advantages.
It worked all right for Rust, and yes, Rust does support OOP; there are many meanings of OOP from a CS point of view.
I have ported Ray Tracing in One Weekend into Rust, while keeping the same OOP design from the tutorial, and affine types were not an impediment to interfaces, polymorphism and dynamic dispatch.
>one of the biggest flaws of OOP is that if x.foo() calls x.bar() in the middle, x.bar() can clobber a lot of local state, and result in code that's very difficult to reason about
That's more a problem of having mutable references; you'd have the same problem in a procedural language.
On the flip side, with OOP it is usually quite easy to put a debugger breakpoint on a particular line and see the full picture of what the program is doing.
In diehard FP (e.g. Haskell) it's hard to even place a breakpoint, let alone see the complete state. In many cases, where implementing a piece of logic without carrying a lot of state is impossible, functional programming can also become very confusing. This is especially true when introducing certain theoretical concepts that facilitate working with IO and state, such as Monad Transformers.
That is true, but on the flip-flip side, while procedural or FP programs are usually easy to run piecewise, with OOP you have to run the entire app and navigate to the statement in question to even be able to debug it.
Imho, most FP languages have very serious human-interface issues.
It's no accident that C likes statements (and not too complex ones at that). You can read and parse a statement atomically, which makes the code much easier to read.
In contrast, FP tends to be very, very dense, or even worse, have a density that's super inconsistent.
Slowly it is going to be only skills.md.
I agree with the sentiment. I really like D, and I find it a missed opportunity that it never took off in terms of adoption.
Most of what made D special is nowadays partially available in mainstream languages, making the adoption pitch even harder, and lack of LLM training data doesn't help either.
> lack of LLM training data doesn't help either.
That shouldn't stop any self-respecting programmer.
Self respecting developers are an endangered species, otherwise we would not have so much Electron crap.
Those that learn to do robot maintenance, are the ones left at the factory.
Exactly. We wrote code before LLMs and we can after their advent too
Yeah, that is why carpenters are still around and no one buys Ikea.
Is your proposition that programmers are now incapable of writing code?
Eventually yes, when incapable becomes synonymous with finding a job in an AI-dominated software factory industry.
Enterprise CMS deployment projects have already dropped a number of asset teams, translators, integration teams, and backend devs, replaced by a mix of AI, SaaS, and iPaaS tools.
Now the teams are a fraction of the size they were, say, five years ago.
Fear not, there will be always a place for the few ones that can invert a tree, calculate how many golf balls fit into a plane, and are elected to work at the AI dungeons as the new druids.
Same for ERP/CRM/HRM and some financial systems; all systems that were heavily 'no-code' (or a lot of configuration with knobs and switches rather than code) before AI are now just going to lose their programmers (and the other roles). The business logic, financial calcs, etc. were already done by other people upfront in Excel, Visio, etc.; now you can just throw that into Claude Code. These systems have decades of rigid code practices, so there is not a lot of architecting/design to be done in the first place.
While I don't share this cynical worldview, I am mildly amused by the concept of a future where, Warhammer 40,000 style, us code monkeys get replaced by tech priests who appease the machine gods by burning incense and invoking hymns.
> Yeah, that is why carpenters are still around and no one buys Ikea.
I'm sorry, what? Are you suggesting that Ikea made carpenters obsolete? It's been less than 6 months since last I had a professional carpenter do work in my house. He seemed very real. And charged very real prices. This despite the fact that I've got lots of Ikea stuff.
Compared to before, not a lot of carpenters/furniture makers are left. This is due to automation.
> Compared to before, not a lot of carpenters/furniture makers are left.
Which is it? Carpenters or furniture makers? Because the two have nothing in common beyond the fact that both professions primarily work with wood. The former has been unaffected by automation – or even might plausibly have more demand due to the overall economic activity caused by automation! The latter certainly has been greatly affected.
The fact that people all over the thread are mixing up the two is mindboggling. Is there a language issue or something?
> that is why carpenters are still around and no one buys Ikea
The irony in this statement is hilarious, and perfectly sums up the reality of the situation IMO.
For anyone who doesn't understand the irony: a carpenter is someone who makes things like houses, out of wood. They absolutely still fucking exist.
Industrialised furniture such as IKEA sells has reduced the reliance on a workforce of cabinet makers - people who make furniture using joinery.
Now if you want to go ask a carpenter to make you a table he can probably make one, but it's going to look like construction lumber nailed together. Which is also quite a coincidence when you consider the results of asking spicy autocomplete to do anything more complex than auto-complete a half-written line of code.
I think you have misunderstood what a carpenter is. A carpenter is someone who makes wooden furniture (among other things).
> I think you have misunderstood what a carpenter is. A carpenter is someone who makes wooden furniture (among other things).
I think _you_ have misunderstood what a carpenter is. At least where I live, you might get a carpenter to erect the wood framing for a house. Or build a wooden staircase. Or erect a drywall. I'm sure most carpenters worth their salt could plausibly also make wooden furniture, at an exorbitant cost, but it's not at all what they do.
I sanity checked with Wiktionary, and it agrees: "A person skilled at carpentry, the trade of cutting and joining timber in order to construct buildings or other structures."
Self-respecting programmers write assembly for the machines they built themselves. I swear, kids these days have no respect for the craft
My experience is that all LLMs that I have tested so far did a very good job producing D code.
I actually think that the average D code produced has been superior to the code produced for the C++ problems I tested. This may be an outlier (the problems are quite different), but the quality issues I saw on the C++ side came partially from the ease with which the language enables incompatible use of different features to achieve similar goals (e.g. smart pointers vs new/delete).
I work with D and LLMs do very well with it. I don't know if it could be better but it does D well enough. The problem is only working on a complex system that cannot all be held in context at once.
Serious question: how is this on the front page? We all know of the language and have chosen not to use it.
Edit: Instead of downvoting, just answer the question if you've upvoted it. But I'm guessing it's the same sock accounts that upvoted it.
> We all know...
HN isn't as homogeneous as you think. By this measuring stick, half of the posts on the front page can be put into question every day.
Let's be serious, most people are regulars and this has been on the front page multiple times like constantly. And it was upvoted 4 times on new to get to the front page rapidly. It's not something new that we're all "Oh that's cool".
We also know there are tons of sock accounts.
And no, half of the posts on the front page can't be put in that category, since they aren't constantly reposted like this.
So, while there are a few people who will have learnt about this for the first time, most of you know what it is and somehow feel like this is your chance to go "look, I'm smarter than Iain". And I think you've failed again.
Do you know the joke with "I'll repeat the joke to you until you understand it?".
That's why some things get reposted and upvoted. In hope of getting someone else to understand them.
By the way, do you complain about sock accounts when yet another "Here is this problem, and by the way we sell a product that claims to solve it" gets upvoted?
> Do you know the joke with "I'll repeat the joke to you until you understand it?".
Nope. That's not a joke. That's not funny.
> That's why some things get reposted and upvoted. In hope of getting someone else to understand them.
No, they get reposted and upvoted by sock accounts in hope that someone will finally be interested in a 30 year old programming language.
> By the way, do you complain about sock accounts when yet another "Here is this problem, and by the way we sell a product that claims to solve it" gets upvoted?
What does content marketing have to do with sock accounts?
I'm honestly not sure what point you thought was getting made. Do you honestly think people don't understand D? It's been looked at repeatedly and still nothing cool is built in it.
You're harsh but that's OK. There is a lot of truth in what you're saying. I really wish people would quit downvoting everything they disagree with. HN would be 100x better if both the downvote and flag buttons were removed.
To me, a C guy, the focus on garbage collection is a turn-off. I'm aware that D can work without it, but it's unclear how much of the standard library etc works fine with no garbage collection. That hasn't been explained, that I saw at least.
The biggest problem however is the bootstrapping requirement, which is annoyingly difficult or too involved. (See explanation in my other post.)
I'm not sure how I'm being harsh. It's literally a somewhat well known programming language being reposted for the 100th time or something silly like that. I'm literally just pointing out the truth and it's almost certainly the main poster downvoting things.
> I'm literally just pointing out the truth
Problem identified.
That's not popular here.
As evidenced by several other comments, even if someone already knows about D they can still use posts like this as a prompt for talking about their experiences and current thoughts about it (which can be different from 1, 5 or 10 years ago).
Weird post. How does one of today's 10,000 who have never heard of a subject learn about it?
Interestingly, today someone can be one of the lucky ones learning about the lucky 10,000:
In all seriousness, do you honestly think this site has 10,000 new users a day? How many people do you think are on here that aren't very well informed? Honestly, I'm just wondering.
Also, do you know it only gets to the front page if the hardcore users who browse "new" upvote it? How many of those hardcore people don't know what D is?
Genuinely curious as I'm relatively new compared to the time of inception of this language. Can you cite the reasons why people didn't choose D?
It was competing with C and Java when it came out. People who like C will not use a language with garbage collection, even one that allows you to not use it. Against Java, it was a losing battle due to Java being backed by a giant (Sun, then Oracle) and basically taking the world by storm. Then there were also license problems in early versions of D, and two incompatible and competing standard libraries dividing the community. By the time all these problems were fixed, roughly a decade ago, it was already too late to make a comeback.
Today D is a nice language with 3 different compilers with different strengths: one very fast, one that produces faster code, and one that also does that but works in the GCC ecosystem. That's something few languages have. D even has a betterC mode now which makes it very good as a C replacement, with speed and size equivalent to or better than an equivalent C binary... and D has arguably the best metaprogramming capabilities of any language that is not a Lisp, including Zig. But no one seems to care anymore, as all the hotness is now with Rust and Zig in the systems languages space.
I like and use D but Nim has better metaprogramming capabilities (but D's templates are top-notch except for the error message cascades). (And Zig's metaprogramming is severely hobbled by Andrew's hatred of macros, mixins, and anything else that smells of code generation.)
Can you explain what BetterC is, and what it is used for?
I think there's also something called ImportC.
Not sure what that is either.
I read the D blog sometimes, and have written some programs in D, but am not quite clear about these two terms.
> Note: ImportC and BetterC are very different. ImportC is an actual C compiler. BetterC is a subset of D that relies only on the existence of the C Standard library. BetterC code can be linked with ImportC code, too.
D contains an actual C compiler because Walter Bright wrote one long ago and then incorporated it into D.
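For reference, a minimal BetterC program looks roughly like this (a hedged sketch; it relies only on the C standard library and is built with the -betterC switch):

    // hello.d  --  build with:  dmd -betterC hello.d
    import core.stdc.stdio : printf;

    extern (C) int main()
    {
        printf("hello from BetterC\n");
        return 0;
    }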
Zig also contains an actual C compiler, based on clang, and has a @cImport directive.
I had D support in my distro for a while, but regrettably had to remove it. There's just too many problems with this language and how it's packaged and offered to the end user, IMO. It was too much hassle to keep it around.
To get it onto one's system, a bootstrapping step is required. Either building gcc 9 (and only gcc 9) with D support, then using that gcc to bootstrap a later version, or bootstrapping dmd with itself.
In the former case I'm already having to bootstrap Ada onto the system, so D just adds another level of pain. It also doesn't support all the same architectures as other gcc languages.
In the case of dmd, last I checked they just shove a tarball at you containing vague instructions and dead FTP links. Later I think they "updated" this to some kind of fancy script that autodownloads things. Neither is acceptable for my purposes.
I just want a simple tarball containing everything needed with clear instructions, and no auto downloading anything, like at least 90% of other packages provide. Why is this so hard?
Tip: pretend it's still the BBS days and you are distributing your software. How would you do it? That's how you should still do it.
I haven't tried the LLVM D compiler, and at this point quite frankly I don't want to waste any more time with the language, in its current form at least--with apologies to Walter Bright, who is truly a smart and likeable guy. Like I said, it's regrettable.
The only way to revive interest in D is through a well planned rebranding and marketing campaign. I think the technical foundation is pretty sound, but the whole image and presentation needs a major overhaul. I have an idea of how to approach that, were there interest.
The first step would be to revive and update the C/C++ version of the D compiler for gcc so as to remove the bootstrapping requirement and allow the latest D to be built, plus a commitment to keeping this up to date indefinitely. It needs to support all architectures that GCC does.
Next, a rebranding focused on the power of D without garbage collection.
I'm willing to offer ongoing consultation in this area and assistance in the form of distro support and promotion, in exchange for a Broadwell or later Xeon workstation with at least 40 cores. (Approx $350 on Ebay.) That's the cost of entry for me as I have way too much work to do and too few available CPU cycles to process it.
Otherwise, I sincerely wish the D folks best of luck. The language has a lot of good ideas and I trust that Walter knows what he is doing from a technical standpoint. The marketing has not been successful however, sadly.
"We all know of the language and chosen not to use it."
Is a strange claim, and hard to cite. But I think many HNers have tried out D and decided that it's not good enough for them for anything. It is certainly advertised hard here.
Even in this empty thread there are people who don't know it.
It's a programming language that some people like, and or would like to see become more mainstream?
I think any presumption about what "we all know" will earn you downvotes.
D is like a forced meme at that point.
Never has an old language gained traction; it's all about the initial network effects created by excitement.
No matter how much better it is than C now, C is slowly losing traction, and its potential replacements already have up-and-running communities (Rust, Zig, etc.).
Not everything needs to have "traction", "excitement" or the biggest community. D is a useful, well designed programming language that many thousands of people in this vast world enjoy using, and if you enjoy it too, you can use it. Isn't that nice?
Oh a programming language certainly needs to have traction and community for it to succeed, or be a viable option for serious projects.
You can code your quines in whatever you'd like, but a serious project needs good tooling, good libraries, a proven track record, and devs that speak the language.
"Good tooling, good libraries, proven track record" are all relative concepts, it's not something you have or don't have.
There are serious projects being written in D as we speak, I'm sure, and the language has a track record of having been consistently maintained and improved since 2001, and has some very good libraries and tooling (very nice standard library, three independent and supported compiler implementations!) It does not have good libraries and tooling for all things; certainly integrations with other libs and systems often lag behind more popular languages, but no programming language is suitable for everything.
What I'm saying is there's a big world out there, not all programmers are burdened with having to care about CV-maxxing, community or the preferences of other devs, some of them can just do things in the language they prefer. And therefore, not everything benefits from being written in Rust or whatever the top #1 Most Popular! Trending! Best Choice for System Programming 2026! programming language of the week happens to be.
Python was first released in 1991. It rumbled along for about 20 years until exploding in popularity with ML and the rise of data science.
That's not how I remember it. Excitement for Python strongly predated ML and data science. I remember Python being the cool new language in 1997 when I was still in high school. Python 2.4 was already out, and O'Reilly had already put out several books on the topic. Python was known as this almost pseudo-code-like language that used indentation for blocking. MIT was considering switching to it for its introductory classes. It was definitely already hyped back then, which led to U Toronto picking it for its first ML projects that everyone eventually adopted when deep learning got started.
It was popular as a teaching language when it started out, alongside BASIC or Pascal. When the Web took off, it was one of a few languages that took off for scripting simple backends, alongside PHP, JS, and Ruby.
But the real explosion happened with ML.
I agree with the person you're replying to. Python was definitely already a thing before ML. The way I remember it, it started taking off as a nice scripting language that was more user-friendly than Perl, the king of scripting languages at the time. The popularity gain accelerated with the proliferation of web frameworks, with Django tailgating the then immensely popular Ruby on Rails, and Flask capturing the micro-framework enthusiast crowd. At the same time, the perceived ease of use and the availability of numeric libraries established Python in scientific circles. By the time ML started breaking into the mainstream, Python was already one of the most popular programming languages.
Sure, but the point was that it being used for web backends was years after it was invented, an area in which it never ruled the roost. ML is where it has gained massive traction outside SW dev.
As I remember it there was a time when Ruby and Python were the two big up-and-coming scripting languages while Perl was in decline.
Python was commonplace long before ML. Ever since 1991, it would jump in popularity every now and then, collect enough mindshare, then dive again once people found better tools for the job. It long ago took the place of Perl as the quick "Linux script that's too complex for bash", especially when python2 shipped with almost all distros.
For example, Python got a similar boost in popularity in the late 2000s and early 2010s when almost every startup was either Ruby on Rails or Django. Then again in the mid-2010s when "data science" got popular with pandas. Then again at the end of the 2010s with ML. Then again in the 2020s with LLMs. Every time, people eventually drop it for something else. It's arguably in a much better place with types, asyncio, and a much better ecosystem in general these days than it was back then. As someone who worked on developer tools and devops for most of that time, I always dread dealing with Python developers though, tbh.
> I always dread dealing with python developers though tbh.
Out of curiosity, why is that?
There are plenty of brilliant people who use Python. However, in every one of these boom cycles with Python I dealt with A LOT of developers with horrific software engineering practices, little understanding of how their applications and dependencies work, and just plain bizarre ideas of how services work. Like the one who comes with one 8k-line run.py with like 3 functions asking to "deploy it as a service", expecting it to literally launch `python3 run.py` for every request. It takes 5 minutes to run. It assumes there is only 1 execution at a time per VM because it always writes to /tmp/data.tmp. Then poses a lot of "You guys don't know what you're doing" questions like "yeah, it takes a minute, but can't you just return a progress bar?" In a REST API? Or "yeah, just run one per machine. Shouldn't you provide isolation?". Then there is the guy who zips up their venv from a Mac or Windows machine and expects it to just run on a Linux server. Or the guy who has no idea what system libs their application needs and is so confused we're not running a full Ubuntu desktop in a server environment. Or the guy who gives you a 12GB Docker image because "well, I'm using anaconda".
Containers have certainly helped a lot with Python deployments these days, even if the Python community was late to adopt them for some reason. Throughout the 2010s, when containers would have provided a much better story, especially for Python where most libraries are just C wrappers and you must pip install on the same target environment, the Python developers I dealt with were all very dismissive of them and just wanted to upload a zip or tarball because "Python is cross-platform. It shouldn't matter." Then we had to invent all sorts of workarounds to make sure we had hundreds of random system libs installed, because who knows what they are using and what pip will need to build their things. Prebuilt wheels were a lot less common back then too, causing pip installs to be very resource-intensive, slow, and flaky because some system lib was missing or was updated. Still, Python application Docker images always range in the tens of GBs.
Python crossed the chasm in the early 2000s with scripting, web applications, and teaching. Yes, it's riding an ML rocket, but it didn't become popular because it was used for ML, it was chosen for ML because it was popular.
Python had already exploded in popularity in the early 2000s, and for all sorts of things (like cross-platform shell scripting or as scripting/plugin system for native applications).
Oh? How about Raymond's "Why python?" article that basically described the language as the best thing since sliced bread? Published in 2000, and my first contact with python.
Not really. Back in 2003 when I joined CERN, it was already the official scripting language on ATLAS, our build pipeline at the time (CMT) used Python, there were Python trainings available for the staff, and it was a required skill for anyone working in Grid Computing.
I started using Python at version 1.6; there were already several O'Reilly books and Dr. Dobb's issues dedicated to Python.
This is not true. It took about 20 years for Python to reach the levels of its today's popularity. JavaScript also wasn't so dominant and omnipresent until the Chrome era.
Also, many languages that see a lot of hype initially lose most of their admirers in the long run, e.g. Scala.
> Never has an old language gained traction, its all about the initial network effects created by excitement.
Python?! Created in 1991, became increasingly popular – especially in university circles – only in the mid-2000s, and then completely exploded thanks to the ML/DL boom of the 2010s. That boom fed back into programming training, and it's now a very popular first language too.
Love it or hate it, Python was a teenager by the time it properly took off.
IMHO D just missed the mark with the GC in core. It was released in a time where a replacement for C++ was sorely needed, and it tried to position itself as that (obvious from the name).
But by including the GC/runtime it went into a category with C# and Java which are much better options if you're fine with shipping a runtime and GC. Eventually Go showed up to crowd out this space even further.
Meanwhile in the C/C++ replacement camp there was nothing credible until Rust showed up, and nowadays I think Zig is what D wanted to be with more momentum behind it.
Still kind of salty about the directions they took because we could have had a viable C++ alternative way earlier - I remember getting excited about the language a lifetime ago :D
I'd rather say that the GC is the superpower of the language. It allows you to quickly prototype without focusing too much on performance, but it also allows you to come back to the exact same piece of code and rewrite it using malloc at any time. C# or Java don't have this, nor can they compile C code and seamlessly interoperate with it — but in D, this is effortless.
Furthermore, if you dig deeper, you'll find that D offers far greater control over its garbage collector than any other high-level language, to the point that you can eagerly free chunks of allocated memory, minimizing or eliminating garbage collector stops where it matters.
> C# or Java don't have this, nor can they compile C code and seamlessly interoperate with it — but in D, this is effortless.
C# C interop is pretty smooth, Java is a different story. The fact that C# is becoming the GC language in game dev is proving my point.
>Furthermore, if you dig deeper, you'll find that D offers far greater control over its garbage collector than any other high-level language, to the point that you can eagerly free chunks of allocated memory, minimizing or eliminating garbage collector stops where it matters.
Yes, and the no-gc stuff was just attempts to backpedal on the wrong initial decision to fit into the use-cases they should have targeted from the start in my opinion.
Look D was an OK language but it had no corporate backing and there was no case where it was "the only good solution". If it was an actual C++ modernization attempt that stayed C compatible it would have seen much better adoption.
FIl-C, the new memory-safe C/C++ compiler actually achieved that through introducing a GC, with that in mind I'd say D was kind of a misunderstood prodigy in retrospect.
There's two classes of programs - stuff written in C for historic reasons that could have been written in higher level language but rewrite is too expensive - fill c. Stuff where you need low level - Rust/C++/Zig
I often see people lament the lack of popularity for D in comparison to Rust. I've always been curios about D as I like a lot of what Rust does, but never found the time to deep dive and would appreciate someone whetting my appetite.
Are there technical reasons that Rust took off and D didn't?
What are some advantages of D over Rust (and vice versa)?
D and Rust are on the opposite sides at dealing with memory safety. Rust ensures safety by constantly making you think about memory with its highly sophisticated compile-time checks. D, on the other hand, offers you to either employ a GC and forget about (almost) all memory-safety concerns or a block scoped opt-out with cowboy-style manual memory management.
D retains object-oriented programming but also allows functional programming, while Rust seems to be specifically designed for functional programming and does not allow OOP in the conventional sense.
I've been working with D for a couple of months now and I noticed that it's almost a no-brainer to port C/C++ code to D because it mostly builds on the same semantics. With Rust, porting a piece of code may often require rethinking the whole thing from scratch.
> block scoped opt-out with cowboy-style manual memory management
Is this a Walter Bright alt? I've seen him use the cowboy programmer term a few times on the forum before.
Yeah, I just saw his posts too and picked up the term :)
> Are there technical reasons that Rust took off and D didn't?
As someone who considered it back then when it actually stood a chance to become the next big thing, from what I remember, the whole ecosystem was just too confusing and simply didn't look stable and reliable enough to build upon long-term. A few examples:
* The compiler situation: The official compiler was not yet FOSS and other compilers were not available or at least not usable. Switch to FOSS happened way too late and GCC support took too long to mature.
* This whole D version 1 vs version 2 thingy
* This whole Phobos vs Tango standard library thingy
* This whole GC vs no-GC thingy
This is not a judgement on D itself or its governance. I always thought it's a very nice language and the project simply lacked man-power and commercial backing to overcome the magical barrier of wide adoption. There was some excitement when Facebook picked it up, but unfortunately, it seems it didn't really stick.
How many people were working on the core compiler/language at the time versus Rust? This could explain it.
I think 3 things
1. D had a split similar to python 2 vs 3 early on with having the garbage collector or not (and therefor effectively 2 standard libraries), but unlike python it didn't already have a massive community that was willing to suffer through it.
2. It didn't really have any big backing. Rust having Mozilla backing it for integration with Firefox makes a pretty big difference.
3. D wasn't different enough, it felt much more "this is c++ done better" than it's own language, but unlike c++ where it's mostly a superset of c you couldn't do "c with classes" style migrations
One feature of D that i really wish other languages would adopt (not sure about Rust but i also think it lacks it, though if it has it to a similar extent as D it might be the reason i check it again more seriously) is the metaprogramming and compile-time code evaluation features it has (IIRC you can use most of the language during compile time as it runs in a bytecode VM), down to even having functions that generate source code which is then treated as part of the compilation process.
Of course you can make codegen as part of your build process with any language, but that can be kludgy (and often limited to a single project).
Arguably, most of the metaprogramming in D is done with templates and it comes with all the flaws of templates in C++. The error messages are long and it's hard to decipher what exactly went wrong (static asserts help a lot for this, when they actually exist). IDE support is non-existent after a certain point because IDE can't reason about code that doesn't exist yet. And code gets less self-documenting because it's all Output(T,U) foo(T, U)(T t, U u) and even the official samples use auto everywhere because it's hard to get the actual output types.
I'd say D's template error messages are much better than C++'s, because D prints the instantiation stack with exact locations in the code and the whole message is just more concise. In C++, it just prints a bunch of gibberish, and you're basically left guessing.
> Are there technical reasons that Rust took off and D didn't?
My (somewhat outdated) experience is that D feels like a better and more elegant C++. Rust certainly has been influenced by C and C++, but it also took a lot of inspiration from the ML-family of languages and it has a much stronger type system as a consequence.
More like the companies that jumped into D versus Rust, D only had Facebook and Remedy Games toy a bit with it.
Many of us believe on automatic memory management for systems programming, having used quite a few in such scenarios, so that is already one thing that D does better than Rust.
There is the GC phobia, mostly by folks that don't get not all GCs were born alike, and just like you need to pick and chose your malloc()/free() implementation depending on the scenario, there are many ways to implement a GC, and having a GC doesn't preclude having value types, stack and global memory segment allocation.
D has compile time reflection, and compile time metaprogramming is much easier to use than Rust macros, and it does compile time execution as well.
And the compile times! It is like using Turbo Pascal, Delphi,... even thought the language is like C++ in capabilities. Yet another proof complexity doesn't imply slow compile natives in a native systems language.
For me, C# and Swift replace the tasks at work were I in the past could have reached for D instead, mostly due to who is behind those languages, and I don't want to be that guy that leaves and is the one that knew the stack.
> Many of us believe on automatic memory management for systems programming
The problem is the term "systems programming". For some, it's kernels and device drivers. For some, it's embedded real-time systems. For some, it's databases, game engines, compilers, language run-times, whatever.
There is no GC that could possibly handle all these use-cases.
But there could be a smoother path between having a GC and having no GC.
Right now, you'd have to switch languages.
But in a Great Language you'd just have to refactor some code.
> Are there technical reasons that Rust took off and D didn't?
Yes. D tried to jump on the "systems programming with garbage collection" dead horse, with predictable results.
(People who want that sort of stupidity already have Go and Java, they don't need D.)
> (People who want that sort of stupidity already have Go and Java, they don't need D.)
Go wasn't around when D was created, and Java was an unbelievable memory hog, with execution speeds that could only be described as "glacial".
As an example, using my 2001 desktop, the `ls` program at the time was a few kb, needed about the same in runtime RAM and started up and completed execution in under 100ms.
The almost equivalent Java program I wrote in 2001 to list files (with `ls` options) took over 5s just to start up and chewed through about 16MB of RAM (around 1/4 of my system's RAM).
Java was a non-started.
Go wasn't around when D was released and Java has for the longest time been quite horrible (I first learnt it before diamond inference was a thing, but leaving that aside it's been overly verbose and awkward until relatively recently).
Is Java even a "systems programming" language?
I don't even know what that term means anymore; but afaik Java didn't really have reliable low-level APIs until recently.
Depends if one considers writing compilers, linkers, JITs, database engines, and running bare metal on embedded real time systems "systems programming".
As far as adoption is concerned, I'm not sure it should be that big of a concern.
After all, D is supported by GCC and Clang and continually being maintained, and if updates stopped coming at some point in the future, anyone who knew a bit of C / Java / insert language here could easily port it to their language of choice.
Meanwhile, its syntax is more expressive than many other compiled languages, the library is feature-rich and fairly tidy, and for me it's been a joy to use.
It has an LLVM backend, LDC, that is separate from the LLVM project/Clang.
GCC usually drops frontends if there are no maintainers around, it already happened to gcj, and I am waiting for the same to happen to gccgo any time now, as it has hardly gotten any updates since Go 1.18.
The team is quite small and mostly volunteers, so there is the question how long can Walter Bright keep at it, and who will keep it going afterwards when he passes the torch.
I like D in general, however it is missing out in WASM where other languages like Rust, Zig, even Go are thriving. Official reasoning usually included waiting for GC support from WASM runtime, but other GC languages seem to just ship their own GC and move on.
D is boring, let's see how to recreate the B language:
https://www.youtube.com/playlist?list=PLpM-Dvs8t0VZn81xEz6Ng...
What can D do other languages can't?
Say your starting a new Staff Engineer or Tech Lead job. What gets you to convince a CTO that we need to have a team learn D ?
On the flip side, where are the 200k base salary D positions.
Get me an interview in 2 months and I'll drop 10 hours a week into learning
Well, I would say it's more like glasses - you can't convince those who don't wear them, and you don't need to convince those who need them either.
What problem is D solving ?
One good case for it that I see is a viable basis for cross-platform desktop apps. Today, cross-platform desktop GUI apps are either just a snapshot of the website contained inside Electron, or a C/C++ code base with manual memory management. D can serve as a nice middle ground in that space.
Off topic: Back in the day, C++ programming books Andrei Alexandrescu are a joy to read, especially, Modern C++ design.
Also, this presentation https://accu.org/conf-docs/PDFs_2007/Alexandrescu-Choose_You... killed a lot of bike shedding!
When I was student, our group was forced to use D lang instead C++ for CS2* classes. That was back in 2009. After 16 years I see that level of adoption did not change at all.
Seen D being posted regularly on here, seems like flogging a dead horse. It's the equivalent of keeping grandma on life support when there is no hope.
You'd be surprised to see how active the D community is, despite your fair point that it's noticeably smaller than in the "competing" (in quotes because it's not a competition, actually) languages.
The latest release [1] was on Jan 7th, and it contains more updates than, say, the latest release of Dart, which has one of the largest corporations behind it.
1. https://dlang.org/changelog/2.112.0.html
How good are the big LLMs at writing D code? Just curious.
I was personally a lot more excited by D and subsequently Nim, but ultimately it's Rust and Zig that got adoption. Sigh.
D is a treasure we should continue to cherish and protect
A language with sane Compile Time features (Type Introspection, CTFE, mixins, etc)
A language that can embrace C ecosystem with sane diagnostics
A language that ships with its own optimizing code generator and inline assembler!
A compiler that compiles code VERY fast
A compiler with a readable source code that bootstraps itself in just 5 seconds
People who dunk on it "bEcAuSe iT Is nOt MaInsTrEaM" are clueless
I remember the creator of D programming Language replying to me on HN on one of my posts!
https://news.ycombinator.com/item?id=46261452
Walter's a regular on HN.
This very post is probably his too, under an alt :)
I had an interview at Facebook 10+ years ago and my interviewer was the other creator!
I never understood why this language didn't gain much traction. It seems very solid.
At the same time, I've never used it, I'm not sure why.
Anyway, the author of D language is here on HN (Walter Bright).
Sigh.
Ownership and borrowing are so much less baroque in D than in Rust. And compile times are superb.
In a better world, we would all be using D instead of C, C++ or Rust.
However in this age of Kali...
For those curious what ownership and borrowing looks like in D: https://dlang.org/blog/2019/07/15/ownership-and-borrowing-in...
This is a somewhat simplistic view of ownership and borrowing for modern programming languages.
Pointers are not the only 'pointer's to resources. You can have handles specific to your codebase or system, you can have indices to objects in some flat array that the rest of your codebase uses, even temporary file names.
An object oriented (or 'multi paradigm') language has to account for these and not just literal pointers.
This is handled reasonably well both in Rust and C++. (In the spirit of avoiding yet another C++ vs Rust flamewar here: yes, the semantics are different; no, it does not make sense for C++ to adopt Rust semantics.)
How does Rust (or C++) treat array indices as resources? And won't that defy the reason to use indices over pointers?
I don't know D so I'm probably missing some basic syntax. If pointers cannot be copied how do you have multiple objects referencing the same shared object?
> If pointers cannot be copied
They can.
Is there any experience on how this works in practice?
OOP and ownership are two concepts that mix poorly - ownership in the presence of OOP-like constructs is never simple.
The reason for that is OOP tends to favor constructs where each object holds references to other objects, creating whole graphs; it's not uncommon that from a single object, hundreds of others can be traversed.
Even something so simple as calling a member function from a member function becomes incredibly difficult to handle.
Tbh - this is with good reason, one of the biggest flaws of OOP is that if x.foo() calls x.bar() in the middle, x.bar() can clobber a lot of local state, and result in code that's very difficult to reason about, both for the compiler and the programmer.
And it's a simple case, OOP offers tons of tools to make the programmers job even more difficult - virtual methods, object chains with callbacks, etc. It's just not a clean programming style.
Edit: Just to make it clear, I am not pointing out these problems to sell you a solution or even to imply that I have one. I'm not saying programming style X is better.
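A tiny illustration of that x.foo()/x.bar() clobbering, in D to stay on topic (the class and methods are invented):

    class Widget
    {
        int[] items;

        void foo()
        {
            foreach (i; 0 .. items.length)
            {
                bar();              // looks innocent...
                auto v = items[i];  // ...but items may have shrunk: out of bounds at runtime
            }
        }

        void bar()
        {
            if (items.length) items = items[0 .. $ - 1]; // silently mutates state foo() relies on
        }
    }

    void main()
    {
        auto w = new Widget();
        w.items = [1, 2, 3];
        // w.foo();  // would eventually hit a RangeError once bar() shrinks items under foo()
    }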
I work at a D company. We tend to use OOP only for state owners with strict dependencies, so it's rare to even get cycles. It is extremely useful for modeling application state. However, all the domain data is described by immutable values, and objects are accessed via parameters as much as via fields.
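Not our actual code, obviously, but the shape of the pattern is roughly this: one class owns mutable application state, while domain data stays a plain value passed in through parameters:

    struct Order            // domain data: a plain value, never shared by reference
    {
        string id;
        long amountCents;
    }

    final class OrderBook   // state owner with a narrow, explicit responsibility
    {
        private long totalCents;

        void record(in Order order)     // data flows in through parameters, not fields
        {
            totalCents += order.amountCents;
        }

        long total() const { return totalCents; }
    }

    void main()
    {
        auto book = new OrderBook();
        book.record(Order("a-1", 1_000));
        book.record(Order("a-2", 2_500));
        assert(book.total == 3_500);
    }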
When commandline apps were everywhere, people dreamed of graphical interfaces. Burdened by having to also do jobs that it was bad at, the commandline got a bad reputation. It took the dominance of the desktop for commandline apps to find their niche.
In a similar way, OOP is cursed by its popularity. It has to become part of a mixed diet so that people can put it where it has advantages, and it does have advantages.
It worked alright for Rust, and yes, Rust does support OOP; there are many meanings of what OOP is from a CS point of view.
I have ported Ray Tracing in One Weekend into Rust, while keeping the same OOP design from the tutorial, and affine types were not an impediment to interfaces, polymorphism and dynamic dispatch.
>one of the biggest flaws of OOP is that if x.foo() calls x.bar() in the middle, x.bar() can clobber a lot of local state, and result in code that's very difficult to reason about
That's more a problem of having mutable references, you'd have the same problem in a procedural language.
On the flip side, with OOP it is usually quite easy to put a debugger breakpoint on a particular line and see the full picture of what the program is doing.
In diehard FP (e.g. Haskell) it's hard to even place a breakpoint, let alone see the complete state. In many cases, where implementing a piece of logic without carrying a lot of state is impossible, functional programming can also become very confusing. This is especially true when introducing certain theoretical concepts that facilitate working with IO and state, such as Monad Transformers.
That is true, but on the flip-flip side, while procedural or FP programs are usually easy to run piecewise, with OOP you have to run the entire app and navigate to the statement in question to even be able to debug it.
Imho, most FP languages have very serious human-interface issues.
It's no accident that C likes statements (and not too complex ones at that). You can read and parse a statement atomically, which makes the code much easier to read.
In contrast, FP tends to be very, very dense, or even worse, have a density that's super inconsistent.
Slowly it is going to be only skills.md.
I agree with the sentiment. I really like D and find it a missed opportunity that it never took off in terms of adoption.
Most of what made D special is nowadays partially available in mainstream languages, making the adoption pitch even harder, and lack of LLM training data doesn't help either.
> lack of LLM training data doesn't help either.
That shouldn't stop any self-respecting programmer.
Self respecting developers are an endangered species, otherwise we would not have so much Electron crap.
Those that learn to do robot maintenance are the ones left at the factory.
Exactly. We wrote code before LLMs, and we can after their advent too.
Yeah, that is why carpenters are still around and no one buys Ikea.
Is your proposition that programmers are now incapable of writing code?
Eventually yes, when incapable becomes synonymous with finding a job in an AI-dominated software factory industry.
Enterprise CMS deployment projects have already dropped asset teams, translators, integration teams, and backend devs, replacing them with a mix of AI, SaaS, and iPaaS tools.
Now the teams are a fraction of the size they were just five years ago.
Fear not, there will always be a place for the few who can invert a tree, calculate how many golf balls fit into a plane, and are elected to work in the AI dungeons as the new druids.
Same for ERP/CRM/HRM and some financial systems; all systems that were heavily 'no-code' (or a lot of configuration with knobs and switches rather than code) before AI are now just going to lose their programmers (and the other roles). The business logic, financial calcs, etc. were already done by other people upfront in Excel, Visio, etc.; now you can just throw that into Claude Code. These systems have decades of rigid code practices, so there is not a lot of architecting/design to be done in the first place.
While I don't share this cynical worldview, I am mildly amused by the concept of a future where, Warhammer 40,000 style, us code monkeys get replaced by tech priests who appease the machine gods by burning incense and invoking hymns.
> Yeah, that is why carpenters are still around and no one buys Ikea.
I'm sorry, what? Are you suggesting that Ikea made carpenters obsolete? It's been less than 6 months since last I had a professional carpenter do work in my house. He seemed very real. And charged very real prices. This despite the fact that I've got lots of Ikea stuff.
Compared to before, not a lot of carpenters/furniture makers are left. This is due to automation.
> Compared to before, not a lot of carpenters/furniture makers are left.
Which is it? Carpenters or furniture makers? Because the two have nothing in common beyond the fact that both professions primarily work with wood. The former has been unaffected by automation – or might even plausibly have more demand due to the overall economic activity caused by automation! The latter certainly has been greatly affected.
The fact that people all over the thread are mixing up the two is mindboggling. Is there a language issue or something?
> that is why carpenters are still around and no one buys Ikea
The irony in this statement is hilarious, and perfectly sums up the reality of the situation IMO.
For anyone who doesn't understand the irony: a carpenter is someone who makes things like houses, out of wood. They absolutely still fucking exist.
Industrialised furniture such as IKEA sells has reduced the reliance on a workforce of cabinet makers - people who make furniture using joinery.
Now if you want to go ask a carpenter to make you a table he can probably make one, but it's going to look like construction lumber nailed together. Which is also quite a coincidence when you consider the results of asking spicy autocomplete to do anything more complex than auto-complete a half-written line of code.
I think you have misunderstood what a carpenter is. A carpenter is someone who makes wooden furniture (among other things).
> I think you have misunderstood what a carpenter is. A carpenter is someone who makes wooden furniture (among other things).
I think _you_ have misunderstood what a carpenter is. At least where I live, you might get a carpenter to erect the wood framing for a house. Or build a wooden staircase. Or erect a drywall. I'm sure most carpenters worth their salt could plausibly also make wooden furniture, at an exorbitant cost, but it's not at all what they do.
I sanity checked with Wiktionary, and it agrees: "A person skilled at carpentry, the trade of cutting and joining timber in order to construct buildings or other structures."
https://en.wikipedia.org/wiki/Carpentry
Carpenters make many things besides houses.
See the section "Types of carpentry".
Self-respecting programmers write assembly for the machines they built themselves. I swear, kids these days have no respect for the craft
My experience is that all LLMs that I have tested so far did a very good job producing D code.
I actually think that the average D code produced has been superior to the code produced for the C++ problems I tested. This may be an outlier (the problems are quite different), but the quality issues I saw on the C++ side came partially from the ease with which the language enables incompatible use of different features to achieve similar goals (e.g. smart_ptr vs. new/delete).
I work with D and LLMs do very well with it. I don't know if it could be better but it does D well enough. The problem is only working on a complex system that cannot all be held in context at once.
I based my opinion on this recent thread, https://forum.dlang.org/thread/bvteanmgrxnjiknrkeyg@forum.dl...
The discussion there seems to imply it kind of works, but not without a few pain points.
Kali Yuga.
https://en.wikipedia.org/wiki/Kali_Yuga
Serious question, how is this on the front page? We all know of the language and have chosen not to use it.
Edit: Instead of downvoting, just answer the question if you've upvoted it. But I'm guessing it's the same sock accounts that upvoted it.
> We all know...
HN isn't as homogeneous as you think. By this measuring stick, half of the posts on the front page can be put into question every day.
Let's be serious: most people here are regulars, and this has been on the front page multiple times, practically constantly. And it was upvoted 4 times on /new to get to the front page rapidly. It's not something new where we all go "Oh, that's cool".
We also know there are tons of sock accounts.
And no, half of the posts on the front page can't be put in that category, since they aren't constantly reposted like this.
So, while there are a few people who will have learnt about this for the first time, most of you know what it is and somehow feel like this is your chance to go "look, I'm smarter than Iain". And I think you've failed again.
Do you know the joke with "I'll repeat the joke to you until you understand it?".
That's why some things get reposted and upvoted. In hope of getting someone else to understand them.
By the way, do you complain about sock accounts when yet another "Here is this problem, and by the way we sell a product that claims to solve it" gets upvoted?
> Do you know the joke with "I'll repeat the joke to you until you understand it?".
Nope. That's not a joke. That's not funny.
> That's why some things get reposted and upvoted. In hope of getting someone else to understand them.
No, they get reposted and upvoted by sock accounts in hope that someone will finally be interested in a 30 year old programming language.
> By the way, do you complain about sock accounts when yet another "Here is this problem, and by the way we sell a product that claims to solve it" gets upvoted?
What does content marketing have to do with sock accounts?
I'm honestly not sure what point you thought was getting made. Do you honestly think people don't understand D? It's been looked at repeatedly and still nothing cool is built in it.
You're harsh but that's OK. There is a lot of truth in what you're saying. I really wish people would quit downvoting everything they disagree with. HN would be 100x better if both the downvote and flag buttons were removed.
To me, a C guy, the focus on garbage collection is a turn-off. I'm aware that D can work without it, but it's unclear how much of the standard library etc. works fine with no garbage collection. That hasn't been explained, at least not anywhere I've seen.
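For what it's worth, the no-GC subset looks roughly like this (a hedged sketch: @nogc statically forbids GC allocation inside a function, and slices can wrap malloc'd memory; how much of the standard library is usable this way varies by module, so don't take this as a survey):

    import core.stdc.stdio : printf;
    import core.stdc.stdlib : free, malloc;

    @nogc nothrow int sum(const(int)[] data)
    {
        int total = 0;
        foreach (x; data) total += x;
        return total;
        // auto copy = data.dup;  // would be rejected: .dup allocates on the GC heap
    }

    void main() @nogc nothrow
    {
        auto buf = cast(int*) malloc(3 * int.sizeof);
        if (buf is null) return;
        scope(exit) free(buf);

        int[] data = buf[0 .. 3];   // a slice over non-GC memory
        data[0] = 1; data[1] = 2; data[2] = 3;
        printf("sum = %d\n", sum(data));
    }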
The biggest problem however is the bootstrapping requirement, which is annoyingly difficult or too involved. (See explanation in my other post.)
I'm not sure how I'm being harsh. It's literally a somewhat well known programming language being reposted for the 100th time or something silly like that. I'm literally just pointing out the truth and it's almost certainly the main poster downvoting things.
> I'm literally just pointing out the truth
Problem identified.
That's not popular here.
As evidenced by several other comments, even if someone already knows about D they can still use posts like this as a prompt for talking about their experiences and current thoughts about it (which can be different from 1, 5 or 10 years ago).
Weird post. How does one of today's 10,000 who have never heard of a subject learn about it?
Interestingly, today someone can be one of the lucky 10,000 who get to learn about the "lucky 10,000" itself:
https://xkcd.com/1053/
meta
In all seriousness, do you honestly think this site has 10,000 new users a day? How many people do you think are on here who aren't very well informed? Honestly, I'm just wondering.
Also, do you know it only gets to the front page if the hardcore users who browse /new upvote it? How many of those hardcore users don't know what D is?
Genuinely curious as I'm relatively new compared to the time of inception of this language. Can you cite the reasons why people didn't choose D?
It was competing with C and Java when it came out. People who like C will not use a language with garbage collection, even one that allows you to not use it. Against Java it was a losing battle, since Java was backed by a giant (Sun, then Oracle) and basically took the world by storm. Then there were also license problems in early versions of D, and two incompatible, competing standard libraries dividing the community. By the time all these problems were fixed, about a decade ago, it was already too late to make a comeback.

Today D is a nice language with three different compilers with different strengths: one compiles very fast, one produces faster binaries, and one does that too but works within the GCC ecosystem. That's something few languages have. D even has a betterC mode now, which makes it very good as a C replacement, with speed and size equivalent to or better than an equivalent C binary… and D has arguably the best metaprogramming capabilities of any language that is not a Lisp, including Zig. But no one seems to care anymore, as all the hotness is now with Rust and Zig in the systems languages space.
I like and use D but Nim has better metaprogramming capabilities (but D's templates are top-notch except for the error message cascades). (And Zig's metaprogramming is severely hobbled by Andrew's hatred of macros, mixins, and anything else that smells of code generation.)
Can you explain what BetterC is, and what it is used for?
I think there's also something called ImportC. Not sure what that is either.
I read the D blog sometimes, and have written some programs in D, but am not quite clear about these two terms.
https://dlang.org/spec/betterc.html
https://dlang.org/spec/importc.html
> Note: ImportC and BetterC are very different. ImportC is an actual C compiler. BetterC is a subset of D that relies only on the existence of the C Standard library. BetterC code can be linked with ImportC code, too.
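For a feel of what -betterC code looks like (a minimal sketch per the linked spec page: no druntime, no GC, just the C standard library underneath; compile with `dmd -betterC`):

    import core.stdc.stdio : printf;

    extern (C) int main()
    {
        // Templates, structs and CTFE still work in betterC;
        // the GC, exceptions and runtime type info do not.
        enum answer = 6 * 7;   // computed at compile time
        printf("the answer is %d\n", answer);
        return 0;
    }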
D contains an actual C compiler because Walter Bright wrote one long ago and then incorporated it into D.
Zig also contains an actual C compiler, based on clang, and has a @cImport directive.
I had D support in my distro for a while, but regrettably had to remove it. There's just too many problems with this language and how it's packaged and offered to the end user, IMO. It was too much hassle to keep it around.
To get it onto one's system, a bootstrapping step is required. Either building gcc 9 (and only gcc 9) with D support, then using that gcc to bootstrap a later version, or bootstrapping dmd with itself.
In the former case I'm already having to bootstrap Ada onto the system, so D just adds another level of pain. It also doesn't support all the same architectures as other gcc languages.
In the case of dmd, last I checked they just shove a tarball at you containing vague instructions and dead FTP links. Later I think they "updated" this to some kind of fancy script that autodownloads things. Neither is acceptable for my purposes.
I just want a simple tarball containing everything needed with clear instructions, and no auto downloading anything, like at least 90% of other packages provide. Why is this so hard?
Tip: pretend it's still the BBS days and you are distributing your software. How would you do it? That's how you should still do it.
I haven't tried the LLVM D compiler, and at this point quite frankly I don't want to waste any more time with the language, in its current form at least--with apologies to Walter Bright, who is truly a smart and likeable guy. Like I said, it's regrettable.
The only way to revive interest in D is through a well planned rebranding and marketing campaign. I think the technical foundation is pretty sound, but the whole image and presentation needs a major overhaul. I have an idea of how to approach that, were there interest.
The first step would be to revive and update the C/C++ version of the D compiler for gcc so as to remove the bootstrapping requirement and allow the latest D to be built, plus a commitment to keeping this up to date indefinitely. It needs to support all architectures that GCC does.
Next, a rebranding focused on the power of D without garbage collection.
I'm willing to offer ongoing consultation in this area and assistance in the form of distro support and promotion, in exchange for a Broadwell or later Xeon workstation with at least 40 cores. (Approx $350 on Ebay.) That's the cost of entry for me as I have way too much work to do and too few available CPU cycles to process it.
Otherwise, I sincerely wish the D folks best of luck. The language has a lot of good ideas and I trust that Walter knows what he is doing from a technical standpoint. The marketing has not been successful however, sadly.
"We all know of the language and chosen not to use it."
Is a strange claim, and hard to cite. But I think many HNers have tried out D and decided that it's not good enough for them for anything. It is certainly advertised hard here.
Maybe you should Ask HN.
You should familiarize yourself with these: https://news.ycombinator.com/newsguidelines.html
Even in this empty thread there are people who don't know it.
It's a programming language that some people like, and/or would like to see become more mainstream?
I think any presumption about what "we all know" will earn you downvotes.
D is like a forced meme at this point.
Never has an old language gained traction; it's all about the initial network effects created by excitement.
No matter how much better it is than C now, C is slowly losing traction, and its potential replacements already have up-and-running communities (Rust, Zig, etc.).
Not everything needs to have "traction", "excitement" or the biggest community. D is a useful, well designed programming language that many thousands of people in this vast world enjoy using, and if you enjoy it too, you can use it. Isn't that nice?
Oh, a programming language certainly needs traction and a community to succeed, or to be a viable option for serious projects.
You can code your quines in whatever you'd like, but a serious project needs good tooling, good libraries, a proven track record, and devs who speak the language.
"Good tooling, good libraries, proven track record" are all relative concepts, it's not something you have or don't have.
There are serious projects being written in D as we speak, I'm sure, and the language has a track record of having been consistently maintained and improved since 2001, and has some very good libraries and tooling (very nice standard library, three independent and supported compiler implementations!) It does not have good libraries and tooling for all things; certainly integrations with other libs and systems often lag behind more popular languages, but no programming language is suitable for everything.
What I'm saying is there's a big world out there, not all programmers are burdened with having to care about CV-maxxing, community or the preferences of other devs, some of them can just do things in the language they prefer. And therefore, not everything benefits from being written in Rust or whatever the top #1 Most Popular! Trending! Best Choice for System Programming 2026! programming language of the week happens to be.
Python was first released in 1991. It rumbled along for about 20 years until exploding in popularity with ML and the rise of data science.
That's not how I remember it. Excitement for Python strongly predated ML and data science. I remember Python being the cool new language in 1997 when I was still in high school. Python 2.4 was already out, and O'Reilly had already put out several books on the topic. Python was known as this almost pseudo-code-like language that used indentation for blocking. MIT was considering switching to it for its introductory classes. It was definitely already hyped back then -- which led to U Toronto picking it for its first ML projects, the ones everyone eventually adopted when deep learning got started.
It was popular as a teaching language when it started out, alongside BASIC or Pascal. When the Web took off, it was one of a few languages that took off for scripting simple backends, alongside PHP, JS, and Ruby.
But the real explosion happened with ML.
I agree with the person you're replying to. Python was definitely already a thing before ML. The way I remember it, it started taking off as a nice scripting language that was more user-friendly than Perl, the king of scripting languages at the time. The popularity gain accelerated with the proliferation of web frameworks, with Django tailgating the immensely popular (at the time) Ruby on Rails, and Flask capturing the micro-framework enthusiast crowd. At the same time, the perceived ease of use and the availability of numeric libraries established Python in scientific circles. By the time ML started breaking into the mainstream, Python was already one of the most popular programming languages.
Sure, but the point was that it being used for web backends was years after it was invented, an area in which it never ruled the roost. ML is where it has gained massive traction outside SW dev.
As I remember it there was a time when Ruby and Python were the two big up-and-coming scripting languages while Perl was in decline.
Python was commonplace long before ML. Ever since 1991, it would jump in popularity every now and then, collect enough mindshare, then dive again once people found better tools for the job. It long ago took the place of Perl for the quick "Linux script that's too complex for bash", especially when Python 2 was shipping with almost all distros.
For example, Python got a similar boost in popularity in the late 2000s and early 2010s, when almost every startup was either Ruby on Rails or Django. Then again in the mid-2010s when "data science" got popular with pandas. Then again at the end of the 2010s with ML. Then again in the 2020s with LLMs. Every time, people eventually drop it for something else. It's arguably in a much better place with types, asyncio, and a much better ecosystem in general these days than it was back then. As someone who worked on developer tools and devops for most of the time, I always dread dealing with python developers though tbh.
> I always dread dealing with python developers though tbh.
Out of curiosity, why is that?
There are plenty of brilliant people who use Python. However, in every one of these boom cycles with Python I dealt with A LOT of developers with horrific software engineering practices, little understanding of how their applications and dependencies work, and just plain bizarre ideas of how services work. Like the one who comes with a single 8k-line run.py with like 3 functions asking to "deploy it as a service", expecting it to literally launch `python3 run.py` for every request. It takes 5 minutes to run. It assumes there is only 1 execution at a time per VM because it always writes to /tmp/data.tmp. Then they pose a lot of "You guys don't know what you're doing" questions like "yeah, it takes a minute, but can't you just return a progress bar?" In a REST API? Or "yeah, just run one per machine. Shouldn't you provide isolation?". Then there is the guy who zips up their venv from a Mac or Windows machine and expects it to just run on a Linux server. Or the guy who has no idea what system libs their application needs and is so confused that we're not running a full Ubuntu desktop in a server environment. Or the guy who gives you a 12GB Docker image because "well, I'm using Anaconda".
Containers have certainly helped a lot with Python deployments these days, even if the Python community was late to adopt them for some reason. Throughout the 2010s, when containers would have provided a much better story (especially for Python, where most libraries are just C wrappers and you must pip install on the same target environments), the Python developers I dealt with were all very dismissive of them and just wanted to upload a zip or tarball, because "Python is cross-platform. It shouldn't matter." Then we had to invent all sorts of workarounds to make sure we had hundreds of random system libs installed, because who knows what they are using and what pip will need to build their things. Prebuilt wheels were a lot less common back then too, causing pip installs to be very resource-intensive, slow, and flaky because some system lib was missing or had been updated. Still, Python application Docker images always range in the tens of GBs.
Python crossed the chasm in the early 2000s with scripting, web applications, and teaching. Yes, it's riding an ML rocket, but it didn't become popular because it was used for ML, it was chosen for ML because it was popular.
Python had already exploded in popularity in the early 2000s, and for all sorts of things (like cross-platform shell scripting or as scripting/plugin system for native applications).
Oh? How about Raymond's "Why python?" article that basically described the language as the best thing since sliced bread? Published in 2000, and my first contact with python.
Not really, back in 2003 when I joined CERN it was already the official scripting language on ATLAS, our build pipeline at the time (CMT) used Python, there were Python trainings available for the staff, and it was a required skill for anyone working in Grid Computing.
I started using Python at version 1.6; there were already several O'Reilly books and Dr. Dobb's issues dedicated to Python.
This is not true. It took about 20 years for Python to reach the levels of its today's popularity. JavaScript also wasn't so dominant and omnipresent until the Chrome era.
Also, many languages that see a lot of hype initially lose most of their admirers in the long run, e.g. Scala.
> Never has an old language gained traction; it's all about the initial network effects created by excitement.
Python?! Created in 1991, became increasingly popular – especially in university circles – only in the mid-2000s, and then completely exploded thanks to the ML/DL boom of the 2010s. That boom fed back into programming training, and it's now a very popular first language too.
Love it or hate it, Python was a teenager by the time it properly took off.
Oohh, riiiighttt, D is new(s).
Slow day?