I remembered reading about this news back when that first message was posted on the mailing list, and didn't think much of it then (rust has been worming its way into a lot of places over the past few years, just one more thing I tack on for some automation)...
But seeing the maintainer works for Canonical, it seems like the tail (Ubuntu) keeps trying to wag the dog (Debian ecosystem) without much regard for the wider non-Ubuntu community.
I think the whole message would be more palatable if it weren't written as a decree including the dig on "retro computers", but instead positioned only on the merits of the change.
As an end user, it doesn't concern me too much, but someone choosing to add a new dependency chain to critical software plumbing does, at least slightly, if not done for very good reason.
Agreed. I think that announcement was unprofessional.
This was a unilateral decision affecting others' hard work, and the author didn't give them the opportunity to provide feedback on the change.
It disregards the importance of ports. Even if an architecture isn't widely used, supporting multiple architectures can help reveal bugs in the original implementation that wouldn't otherwise be obvious.
This is breaking support for multiple ports to rewrite some feature for a tiny security benefit. And doing so on an unacceptably short timeline. Introducing breakage like this is unacceptable.
There's no clear cost-benefit analysis for this change. Canonical or Debian should work on porting the Rust toolchain (ideally with tier 1 support) to every architecture they release for, and put the horse before the cart for once.
I love and use rust, it is my favorite language and I use it in several of my OSS projects but I'm tired of this "rewrite it in rust" evangelism and the reputational damage it does to the rust community.
> I love and use rust, it is my favorite language and I use it in several of my OSS projects but I'm tired of this "rewrite it in rust" evangelism and the reputational damage it does to the rust community.
Thanks for this.
I know, intellectually, that there are sane/pragmatic people who appreciate Rust.
But often the vibe I’ve gotten is the evangelism, the clear “I’ve found a tribe to be part of and it makes me feel special”.
So it helps when the reasonable signal breaks through the noisy minority.
> This is breaking support for multiple ports to rewrite some feature for a tiny security benefit. And doing so on an unacceptably short timeline. Introducing breakage like this is unacceptable.
Normally I’d agree, but the ports in question are really quite old and obscure. I don’t think anything would have changed with an even longer timeline.
I think the best move would have been to announce deprecation of those ports separately. As it was announced, people who will never be impacted by their deprecation are upset because the deprecation was tied to something else (Rust) that is a hot topic.
If the deprecation of those ports was announced separately I doubt it would have even been news. Instead we’ve got this situation where people are angry that Rust took something away from someone.
Those ports were never official, and so aren't being deprecated. Nothing changes about Debian's support policies with this change.
EDIT: okay so I was slightly too strong: some of them were official as of 2011, but haven't been since then. The main point that this isn't deprecating any supported ports is still accurate.
> It disregards the importance of ports. Even if an architecture isn't widely used, supporting multiple architectures can help reveal bugs in the original implementation that wouldn't otherwise be obvious.
Imo this is true for going from one to a handful, but less true when going from a handful to more. Afaict there are 6 official ports and 12 unofficial ports (from https://www.debian.org/ports/).
> I love and use rust, it is my favorite language and I use it in several of my OSS projects but I'm tired of this "rewrite it in rust" evangelism and the reputational damage it does to the rust community.
This right here.
As a side-note, I was reading one of Cloudflare's docs on how it implemented its firewall rules, and it's utterly disappointing how the document suddenly stops being informative and starts to read like a parody of the whole cargo cult around Rust. Rust this, Rust that, while I was there trying to read up on how Cloudflare actually supports firewall rules. The way they focus on a specific and frankly irrelevant implementation detail conveys the idea that things are run by amateurs charmed by a shiny toy.
> I think the whole message would be more palatable if it weren't written as a decree including the dig on "retro computers", but instead positioned only on the merits of the change.
The wording could have been better, but I don’t see it as a dig. When you look at the platforms that would be left behind they’re really, really old.
It’s unfortunate that it would be the end of the road for them, but holding up progress for everyone to retain support for some very old platforms would be the definition of the tail wagging the dog, and any project that starts doing so is making a mistake.
It might have been better to leave out any mention of the old platforms in the Rust announcement and wait for someone to mention it in another post. As it was written, it became an unfortunate focal point of the announcement despite having such a small impact that it shouldn’t be a factor holding up progress.
> As an end user, it doesn't concern me too much ...
It doesn't concern me either, but there's some attitude here that makes me uneasy.
This could have been managed better. I see a similar change in the future that could affect me, and there will be precedent. Canonical paying Devs and all, it isn't a great way of influencing a community.
I agree. It's sad to see maintainers take a "my way or the highway" approach to package maintenance, but this attitude has gradually become more accepted in Debian over the years. I've seen this play before, with different actors: gcc maintainers (regarding cross-bootstrapping ports), udev (regarding device naming, I think?), systemd (regarding systemd), and now with apt. Not all of them involved Canonical employees, and sometimes the Canonical employees were the voice of reason (e.g. that's how I remember Steve Langasek).
I'm sure some will point out that each example above was just an isolated incident, but I perceive a growing pattern of incidents. There was a time when Debian proudly called itself "The Universal Operating System", but I think that hasn't been true for a while now.
> It's sad to see maintainers take a "my way or the highway" approach to package maintenance, but this attitude has gradually become more accepted in Debian over the years.
It's frankly the only way to maintain a distribution relying almost completely on volunteer work! The more different options there are, the more expensive (in terms of human cost, engineering time, and hardware) testing gets.
It's one thing if you're, say, Red Hat with a serious amount of commercial customers, they can and do pay for conformance testing and all the options. But for a fully FOSS project like Debian, eventually it becomes unmaintainable.
Additionally, the more "liberty" distributions take in how the system is set up, the more work software developers have to put in. Just look at autotools, an abomination that is sadly necessary.
> Canonical paying Devs and all, it isn't a great way of influencing a community.
That's kind of the point of modern open source organizations. Let corporations fund the projects, and in exchange they get a say in terms of direction, and hopefully everything works out. The bigger issue with Ubuntu is that they lack vision, and when they ram things through, they give up at the slightest hint of opposition (and waste a tremendous amount of resources and time along the way). For example, Mir and Unity were perfectly fine technologies, but they retired them because they didn't want to see things through. For such a successful company, it's surprising that their technical direction-setting is so unserious.
There are many high profile DDs who work or have worked for Canonical who are emphatically not the inverse — Canonical employees who are part of the Debian org.
The conclusion you drew is perfectly reasonable but I’m not sure it is correct, especially when in comparison Canonical is the newcomer. It could even be seen to impugn their integrity.
If you look at the article, it seems like the hard dependency on Rust is being added for parsing functionality that only Canonical uses:
> David Kalnischkies, who is also a major contributor to APT, suggested that if the goal is to reduce bugs, it would be better to remove the code that is used to parse the .deb, .ar, and .tar formats that Klode mentioned from APT entirely. It is only needed for two tools, apt-ftparchive and apt-extracttemplates, he said, and the only "serious usage" of apt-ftparchive was by Klode's employer, Canonical, for its Launchpad software-collaboration platform. If those were taken out of the main APT code base, then it would not matter whether they were written in Rust, Python, or another language, since the tools are not directly necessary for any given port.
Mmm, apt-ftparchive is pretty useful for cooking up repos for "in-house" distros (which we certainly thought was serious...) but those tools are already a separate binary package (apt-utils) so factoring them out at the source level wouldn't be particularly troublesome. (I was going to add that there are also nicer tools that have turned up in the last 10 years but the couple of examples I looked at depend on apt-utils, oops)
apt-utils comes from the same top-level source package though:
I know you can make configure-time decisions based on the architecture and ship a leaner apt-utils on a legacy platform, but it's not as obvious as "oh yeah that thing is fully auxiliary and in a totally different codebase".
I understand, but the comment to which I was replying implied that this keeps happening, and in general. That’s not fair to the N-1 other DDs who aren’t the subject of this LWN article (which I read!)
The most interesting criticism / idea in the article was that the parts that are intended for Rust-ification should actually be removed from core apt.
> it would be better to remove the code that is used to parse the .deb, .ar, and .tar formats [...] from APT entirely. It is only needed for two tools, apt-ftparchive and apt-extracttemplates [...]
Another interesting, although perhaps tangential, criticism was that the "new solver" currently lacks a testsuite (unit tests; it has integration tests). I'm actually kind of surprised that writing a dependency solver is a greenfield project instead of using an existing one. Or is this just a dig at something that pulls in a well-tested external module for solving?
Posted in curiosity, not knowing much about apt.
It seems silly to say that it has no tests. If I had to pick between unit and integration tests, I'd pick integration tests every time.
> the "new solver" currently lacks a testsuite
To borrow a phrase I recently coined:
If it's not tested then it's not Engineered.
You'd think that core tools would have proper Software Engineering behind them. Alas, it's surprising how many do not.
Unit tests do not make Software Engineering. That's simply part of the development phase, which should be the smallest of all the phases involved in REAL Software Engineering, which is rarely even done these days outside of DO-178 (et al.) monotony. The entire private-to-public industry has even polluted upper management in defense software engineering into accepting SCRUM as somehow more desirable than the ability to effectively plan your requirements and execute without deviation. Yes it's possible, and yes it's even plausible. SWE laziness turns engineers into developers. Running some auto-documentation script or drawing a generic, non-official block diagram is not the same as a civil PE creating blueprints for a house, let alone a mile-long bridge or skyscraper.
Nonsense. I know and talk to multiple Engineers all the time and they all envy our position of continuing to fix issues in the project.
Mechanical engineers have to work around other components' failures all the time because their lead times are gigantic, and no matter how much planning they do, failures still pop up.
The idea that Software Engineering has more bugs is absurd. Electronics, mechanical, and electrical engineers all face similar issues to what we face, and normally don't have the capacity to deploy fixes as fast as we do because of real-world constraints.
Not nonsense. Don't be reductive.
As far as I understand the idea behind scrum it's not that you don't plan, it's that you significantly shorten the planning-implementation-review cycle.
Perhaps that is the ideal when it was laid out, but the reality of the common implementation is that planning is dispensed with. It gives some management a great excuse to look no further than the next jira ticket, if that.
The ideal implementation of a methodology is only relevant for a small number of management who would do well with almost any methodology because they will take initiative to improve whatever they are doing. The best methodology for wide adoption is the one that works okay for the largest number of management who struggle to take responsibility or initiative.
That is to say, the methodology that requires management to take responsibility in its "lowest energy state" is the best one for most people-- because they will migrate to the lowest energy state. If the "lowest energy state" allows management to do almost nothing, then they will. If the structure allows being clueless, a lot of managers will migrate to pointy haired Dilbert manager cluelessness.
With that said; I do agree with getting products to clients quickly, getting feedback quickly, and being "agile" in adapting to requirements; but having a good plan based on actual knowledge of the requirements is important. Any strict adherence to an extreme methodology is probably going to fail in edge cases, so having the judgement of when to apply which methodology is a characteristic of good management. You've got to know your domain, know your team, and use the right tool for the job.
I've got a bridge to sell. It's made from watered-down concrete and comes with blueprints written on site. It was very important to get the implementation started asap to shorten the review cycle.
Integration tests are still tests. There are definitely cases for tools where you can largely get by without unit tests in favor of integration tests. I've written a lot of code generation tools this way for instance.
Given that Cargo is written in Rust, you would think there would be at least one battle tested solver that could be used. Perhaps it was harder to extract and make generic than write a new one?
Cargo's solver incorporates concepts that .debs don't have, like Cargo features, and I'm sure that .debs have features that Cargo packages don't have either.
[deleted]
Could the rust code be transpiled to readable C?
Dependency solvers are actually an area that can benefit from updating IMO.
Every time I consider learning Rust, I am taken aback by how... "janky" the syntax is. It seems to me that we ought to have a systems-level language which builds upon the learnings of the past 20+ years. Can someone help me understand this? Why are we pushing forward with a language that has a Perl-esque unreadability...?
Comparison: I often program in Python (and teach it) - and while it has its own syntax warts & frustrations - overall the language has a "pseudocode which compiles" approach, which I appreciate. Similarly, I appreciate what Kotlin has done with Java. Is there a "Kotlin for Rust"? or another high quality system language we ought to be investing in? I genuinely believe that languages ought to start with "newbie friendliness", and would love to hear challenges to that idea.
You might find this blog post interesting; it argues that it's Rust's semantics, not its syntax, that produce the noisiness, i.e. it's intrinsic complexity:
I found it reasonably convincing. For what it's worth, I found Rust's syntax quite daunting at first (coming from Python as well), but it only took a few months of continuous use to get used to it. I think "Perl-esque" is an overstatement.
It has some upsides over Python as well, notably that the lack of significant whitespace means inserting a small change and letting the autoformatter deal with syntax changes is quite easy, whereas in Python I occasionally have to faff with indentation before Black/Ruff will let me autoformat.
I appreciate that for teaching, the trade-offs go in the other direction.
I'm not sure which of the dozen Rust-syntax supporters I should reply to, but consider something like these four (probably equivalent) syntaxes:
let mut a = Vec::<u32>::new();
let mut b = <Vec::<u32>>::new();
let mut c = <Vec<u32>>::new();
let mut d: Vec<u32> = Vec::new();
Which one will your coworker choose? What will your other coworkers choose?
This is day one stuff for declaring a dynamic array. What you really want is something like:
let mut z = Vec<u32>::new();
However, the grammar is problematic here because of using less-than and greater-than as brackets in a type "context". You can explain that as either not learning from C++'s mistakes or trying to appeal to a C++ audience I guess.
Yes, I know there is a `vec!` macro. Will you require your coworkers to declare a similar macro when they start to implement their own generic types?
There are lots of other examples when you get to what traits are required to satisfy generics ("where clauses" vs "bounds"), or the lifetime signature stuff and so on...
You can argue that strong typing has some intrinsic complexity, but it's tougher to defend the multiple ways to do things, and that WAS one of Perl's mantras.
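For what it's worth, the four forms above really are equivalent; a quick sanity check (a toy program of my own, not from the thread):

```rust
fn main() {
    // The four spellings from above, all producing an empty Vec<u32>.
    let mut a = Vec::<u32>::new();
    let mut b = <Vec::<u32>>::new();
    let mut c = <Vec<u32>>::new();
    let mut d: Vec<u32> = Vec::new();

    // Push the same element into each; all four are the same type.
    for v in [&mut a, &mut b, &mut c, &mut d] {
        v.push(1);
    }
    assert!(a == b && b == c && c == d);
    println!("all equal: {:?}", a);
}
```

rustfmt will leave all four alone, which is arguably part of the complaint.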
This is like complaining that in C you can write
a->b
(a->b)
(*a).b
((*a).b)
Being able to use disambiguated syntaxes, and being able to add extra brackets, isn't an issue.
PS. The formatting tooling normalizes your second and third example to the same syntax. Personally I think it ought to normalize both of them to the first syntax as well, but it's not particularly surprising that it doesn't because they aren't things anyone ever writes.
This will be the case in any language with both generics and type inference. It's nothing to do specifically with Rust.
Most likely
let e = Vec::new()
or
let f = vec![]
rustc will figure out the type
exactly. you specify types for function parameters and structs and let the language do its thing. it's a bit of a niche to specify a type within a function...
There is a reason the multiple methods detailed above exist, mostly for random iterator syntax, such as summing an array or calling collect on an iterator. Most Rust devs probably don't use all of these syntaxes in a single year or maybe even their careers.
I've only ever seen `a` and `d`. Personally I prefer `a`. The only time I've seen `c` is for trait methods like `<Self as Trait<Generic>>::func`. Noisy? I guess. Not sure how else this could really be written.
Fwiw, I didn't go looking for obscure examples to make HN posts. I've had three rounds of sincerely trying to really learn and understand Rust. The first was back when pointer types had sigils, but this exact declaration was my first stumbling block on my second time around.
The first version I got working was `d`, and my first thought was, "you're kidding me - the right hand side is inferring its type from the left?!?" I didn't learn about "turbofish" until some time later.
> Which one will your coworker choose? What will your other corworkers choose?
I don’t think I’ve ever seen the second two syntaxes anywhere.
I really don’t think this is a problem.
[deleted]
I mean, the fact that you mention "probably equivalent" is part of the reality here: Nobody writes the majority of these forms in real code. They are equivalent, by the way.
In real code, the only form I've ever seen out of these in the wild is your d form.
This is some No True Scotsman style counterargument, and it's hard for me to make a polite reply to it.
There are people who program with a "fake it till you make it" approach, cutting and pasting from Stack Overflow, and hoping the compiler errors are enough to fix their mess. Historically, these are the ones your pages/books cater to, and the ones who think the borrow checker is the hard part. It doesn't surprise me that you only see code from that kind of beginner and experts on some rust-dev forum and nothing in between.
The issue though is that this isn't a solvable "problem". This is how programming languages' syntax work. It's like saying that C's if syntax is bad because these are equivalent:
if (x > y) {
if ((x > y)) {
if (((x) > (y))) {
Yes, one of your co-workers may write the third form. But it's just not possible for a programming language to stop this from existing, or at least, maybe you could do it, but it would add a ton of complexity for something that in practice isn't a problem.
Well, the solution usually isn't in syntax, but it often is solved by way of code formatters, which can normalize the syntax to a preferred form among several equivalent options.
I think Perl-esque is apt, but that's because I've done quite a bit of Perl and think the syntax concerns are overblown. Once you get past the sigils on the variables Perl's syntax is generally pretty straightforward, albeit with a few warts in places like almost every language. The other area where people complained about Perl's opaqueness was the regular expressions, which most languages picked up anyway because people realized just how useful they are.
That's it exactly.
Once you're writing Rust at full speed, you'll find you won't be putting lifetimes and trait bounds on everything. Some of this becomes implicit, some of it you can just avoid with simpler patterns.
When you write Rust code without lifetimes and trait bounds and nested types, the language looks like Ruby lite.
When you write Rust code with traits or nested types, it looks like Java + Ruby.
When you sprinkle in the lifetimes, it takes on a bit of character of its own.
It honestly isn't hard to read once you use the language a lot. Imagine what Python looks like to a day zero newbie vs. a seasoned python developer.
You can constrain complexity (if you even need it) to certain modules, leaving other code relatively clean. Imagine the Python modules that use all the language features - you've seen them!
One of the best hacks of all: if you're writing HTTP services, you might be able to write nearly 100% of your code without lifetimes at all. Because almost everything happening in request flow is linear and not shared.
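A minimal sketch of that pattern (the types and names here are illustrative, not from any real framework): a handler that takes and returns owned values needs no lifetime annotations at all.

```rust
// Illustrative only: a hand-rolled request/response pair, not a real framework.
struct Request {
    path: String,
    body: String,
}

struct Response {
    status: u16,
    body: String,
}

// Owned in, owned out: data flows linearly through the handler,
// so there is nothing for the borrow checker to ask about.
fn handle(req: Request) -> Response {
    Response {
        status: 200,
        body: format!("echo from {}: {}", req.path, req.body),
    }
}

fn main() {
    let resp = handle(Request {
        path: "/hello".to_string(),
        body: "hi".to_string(),
    });
    assert_eq!(resp.status, 200);
    println!("{}", resp.body);
}
```

The moment you try to return a `&str` borrowed from the request, lifetimes appear; as long as everything is owned and moved, they don't.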
>When you write Rust code without lifetimes and trait bounds and nested types, the language looks like Ruby lite.
And once you learn a few idioms this is mostly the default.
This honestly reads like the cliche "you just don't get it yet" dismissals of many rust criticisms.
That article is really good, because it highlights that Rust doesn't have to look messy. Part of the problem, I think, is that there are a few too many people who think the messy version is better, because it "uses more of the language" and makes them look smarter. Or maybe Rust just makes it too hard to see through the semantics and realize that just because a feature is there doesn't mean you need it.
There's also a massive difference between the type of C or Perl someone like me would write versus someone coping with a more hostile environment or needing higher performance. My code might be easier to read, but it technically has issues; they are mostly not relevant, while the reverse is true for a more skilled developer in a different environment. Rust seems to attract really skilled people who have really defensive code styles or who use more of the provided language features, and that makes the code harder to read, but that would also be the case in e.g. C++.
> I am thrown back by how... "janky" the syntax is.
Well if you come from C++ it's a breath of fresh air! Rust is like a "cleaned-up" C++, that does not carry the historical baggage forced by backwards compatibility. It is well-thought out from the start. The syntax may appear a bit too synthetic; but that's just the first day of use. If you use it for a few days, you'll soon find that it's a great, beautiful language!
The main problem with rust is that the community around it has embraced all the toxic traditions of the js/node ecosystem, and then some. Cargo is a terrifying nightmare. If you could install regular rust dependencies with "apt install" in debian stable, that would be a different story! But no. They want the version churn: continuously adding and removing bugs, like particle/anti-particle pairs at the boundary of a black hole.
Concerning TFA, adding rust to apt might be a step in the right direction. But it should be symmetric: apt depends on rust, that's great! But all the rust that it depends on needs to be installed by apt, and by apt alone!
I am coming from C++ and think Cargo is a blessing.
I like that I can just add a dependency and be done instead of having to deal with dependencies which require downloading stuff from the internet and making them discoverable for the project specific tool chain - which works differently on every operating system.
Same goes for compiling other projects.
While it kinda flies under the radar, most modern C projects do have a kind of package management solution in the form of pkg-config. Instead of the wild west of downloading and installing every dependency and figuring out how to integrate it properly with the OS and your project you can add a bit of syntactic sugar to your Makefile and have that mostly handled for you, save for the part where you will need to use your platform's native package manager to install the dependencies first. On a modern system using a package on a C project just requires a Makefile that looks something like this:
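Something like the following, perhaps (the library is illustrative; the snippet assumes libcurl's `.pc` file is installed):

```make
# Hypothetical Makefile: pkg-config supplies the compile and link flags.
CFLAGS += $(shell pkg-config --cflags libcurl)
LDLIBS += $(shell pkg-config --libs libcurl)

myprog: main.c
	$(CC) $(CFLAGS) -o $@ $< $(LDLIBS)
```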
But that is the kind of convenience and ease of use that brings us another npm malware incident every other month at this point.
This is a real problem but I wouldn't blame the existence of good tooling on it.
Sure you don't have this issue with C or C++, but thats because adding even a single dependency to a C or C++ project sucks, the tooling sucks.
I wholly blame developers who are too eager to just pull new dependencies in when they could've just written 7 lines themselves.
I remember hearing a few years ago about how developers considered every line of code they wrote as a failing, and talked about how modern development was just gluing otherwise-maintained modules together to avoid having to maintain their own project. I thought this sounded insane and I still do.
And in a way I think AI can help here, where instead you get just the snippet vs having to add that dep that then becomes a long-term security liability.
Debian already builds its Rust dependencies as apt packages, so it would satisfy that criterion.
> Cargo is a terrifying nightmare
Really? Why? I'm not a Rust guru, but Cargo is the only part of Rust that gave me a great first impression.
GP mostly answered that in the comment already:
> If you could install regular rust dependencies with "apt install" in debian stable, that would be a different story! But no. They want the version churn: continuously adding and removing bugs, like particle/anti-particle pairs at the boundary of a black hole.
I don't know, it doesn't explain how and why Cargo causes "continuously adding and removing bugs, like particle/anti-particle pairs at the boundary of a black hole."
The problem, of course, is that "apt install" only works on platforms that use apt to manage their packages.
As a c/c++ cmake user, cargo sounds like a utopia in comparison. It still amazes me that c/c++ package management is still spread between about 5 different solutions.
IMO, the biggest improvement to C/C++ would be ISO defining a package manager à la pip or uv or cargo. I'm so tired of writing cmake. Just... tired.
People that don't understand make are destined to recreate it poorly, and there's no better example than cmake, imho.
Here's my arc through C/C++ build systems:
- make (copy pasted examples)
- RTFM [1]
- recursive make for all sorts of non-build purposes - this is as good as hadoop up to about 16 machines
- autotools
- cmake
- read "recursive make considered harmful" [2]
- make + templates
Anyway, once you've understood [1] and [2], it's pretty hard to justify cmake over make + manual vendoring. If you need windows + linux builds (cmake's most-advertised feature), you'll pretty quickly realize the VS projects it produces are a hot mess, and wonder why you don't just maintain a separate build config for windows.
If I was going to try to improve on the state of the art, I'd clean up a few corner cases in make semantics where it misses productions in complicated corner cases (the problems are analogous to prolog vs datalog), and then fix the macro syntax.
If you want a good package manager for C/C++, check out Debian or its derivatives. (I'm serious -- if you're upset about the lack of packages, there's a pretty obvious solution. Now that docker exists, the packages run most places. Support for some sort of AppImage style installer would be nice for use with lesser distros.)
cmake exists not because people didn't understand make, but because there was no one make to understand. The "c" is for "cross platform." It's a replacement for autoconf/automake, not a replacement for make.
> If I was going to try to improve on the state of the art
The state of the art is buck/bazel/nix/build2.
cmake is a self-inflicted problem of some C++ users, and an independent issue of the language itself (just like cargo for rust). If you want, you can use a makefile and distribution-provided dependencies, or vendored dependencies, and you don't need cmake.
imo the biggest single problem with C++ that the simple act of building it is not (and it seems, cannot) be standardized.
This creates kind of geographic barriers that segregate populations of C++ users, and just like any language, that isolation begets dialects and idioms that are foreign to anyone from a different group.
But the stewards of the language seem to pretend these barriers don't exist, or at least don't understand them, and go on to make the mountain ranges separating our valleys even steeper.
So it's not that CMake is a self-inflicted wound. It's the natural evolution of a tool to fill in the gaps left underspecified by the language developers.
> Rust is like a "cleaned-up" C++
Except they got the order of type and variable wrong. That alone is enough reason to never use Rust, Go, TypeScript or any other language that botches such a critical cornerstone of language syntax.
I just can’t help but wonder if you’re 12 or autistic if this is a stance you’re willing to take on a public forum.
It’s completely inconsequential and makes the language easier to parse.
That was needlessly rude.
> Comparison: I often program in Python (and teach it) - and while it has its own syntax warts & frustrations - overall the language has a "pseudocode which compiles" approach, which I appreciate.
I think this is why you don’t like Rust: In Rust you have to be explicit by design. Being explicit adds syntax.
If you appreciate languages where you can write pseudocode and have the details handled automatically for you, then you’re probably not going to enjoy any language that expects you to be explicit about details.
As far as “janky syntax”, that’s a matter of perspective. Every time I deal with Python and do things like “__slots__” it feels like janky layer upon layer of ideas added on top of a language that has evolved to support things it wasn’t originally planned to do, which feels janky to me. All of the things I have to do in order to get a performant Python program feel incredibly janky relative to using a language with first class support for the things I need to do.
Both Python and JS evolved by building on top of older versions, but somehow JS did a way better job than Python, even though Py forced a major breaking change.
Agree about Rust, all the syntax is necessary for what it's trying to do.
You mean typescript?
Before that. The classes and stuff added in ES6 and earlier
Syntax tends to be deeply personal. I would say the most straightforward answer to your question is "many people disagree that it is unreadable."
Rust did build on the learnings of the past 20 years. Essentially all of its syntax was taken from other languages, even lifetimes.
Are the many who disagree that it is unreadable more than the people who agree? I have been involved with the language for a while now, and while I appreciate what you and many others have done for it, the sense that the group is immune to feedback just becomes too palpable too often. That, and the really aggressive PR.
Rust is trying to solve a really important problem, and so far it might well be one of the best solutions we have for it in a general sense. I 100% support its use in as many places as possible, so that it can evolve. However, its evolution seems to be thwarted by a very vocal subset of its leadership and community who have made it a part of their identity and whatever socio-political leverage toolset they use.
Rust is almost git hype 2.0. That hype set the world up with (a) a dominant VCS that is spectacularly bad at almost everything it does compared to its competitors and (b) the dominant GitHub social network, owned by MS, that got ripped to train Copilot.
Developers have a way of running with a hype that can be quite disturbing and detrimental in the long run. The one difference here is that Rust has some solid ideas implemented underneath. But the community proselytizing and throwing non-believers under the bus is quite real.
> Are the many who disagree that it is unreadable more than the people who agree?
I have no way to properly evaluate that statement. My gut says no, because I see people complain about other things far more often, but I do think it's unknowable.
I'm not involved with Rust any more, and I also agree with you that sometimes Rust leadership can be insular and opaque. But the parent isn't really feedback. It's just a complaint. There's nothing actionable to do here. In fact, when I read the parent's post, I said "hm, I'm not that familiar with Kotlin actually, maybe I'll go check it out," loaded up https://kotlinlang.org/docs/basic-syntax.html, and frankly, it looks a lot like Rust.
But even beyond that: it's not reasonably possible to change a language's entire syntax ten years post 1.0. Sure, you can make tweaks, but turning Rust into Python simply is not going to happen. It would be irresponsible.
I've found the rust core team to be very open to feedback. And maybe I've just been using Rust for too long, but the syntax feels quite reasonable to me.
Just for my own curiosity, do you have an examples of suggestions for how to improve the syntax that have been brought up and dismissed by the language maintainers?
> the sense that the group is immune to feedback
Is complaining about syntax really productive though? What is really going to be done about it?
This is such a weird take. What do you suggest? Should Rust’s syntax have been democratically decided?
There’s syntax that is objectively easier to both read and write, and there’s syntax that is both harder to read and write. For a majority.
In general, using English words consisting of a-z is easier to read. Using regex-like mojibake is harder.
For a concrete example in Rust, using pipes in lambdas instead of an arrow is awful.
Rust's pipes in lambdas come from Ruby, a language that's often regarded as having beautiful syntax.
Rust is objectively not mojibake. The equivalent here would be like using a-z, as Rust's syntax is borrowed from other languages in wide use, not anything particularly esoteric. (Unless you count OCaml as esoteric, which I do believe is somewhat arguable, but that's only one thing; the argument still holds for the vast majority of the language.)
uuuh I like the pipes even though its my first language with them?
Concise and much clearer to read vs parentheses where you gotta wonder if the params are just arguments, or a tuple, etc. What are you talking about.
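To make the syntax under discussion concrete, here is a minimal sketch of Rust's pipe-delimited closure parameters (the variable names are mine, purely illustrative):

```rust
fn main() {
    // Rust delimits closure parameters with pipes, a choice borrowed from
    // Ruby blocks (Ruby: `[1, 2].map { |x| x * 2 }`).
    let double = |x: i32| x * 2;
    let add = |a: i32, b: i32| a + b;
    assert_eq!(double(21), 42);
    assert_eq!(add(2, 3), 5);

    // A zero-parameter closure is the `|| {}` shape mentioned elsewhere
    // in the thread.
    let greet = || "hello";
    assert_eq!(greet(), "hello");
}
```

Because the parameter list can't be confused with a parenthesized expression or a tuple, the pipes remove the ambiguity the commenter above is pointing at.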
I would encourage you to give it a try anyways. Unfamiliar syntax is off-putting for sure, but you can get comfortable with any syntax.
Coming from Python, I needed to work on some legacy Perl code. Perl code looks quite rough to a new user. After time, I got used to it. The syntax becomes a lot less relevant as you spend more time with the language.
Sure... but you don't want to spend time if it's such a mess to read it.
Once one does spend some time to become comfortable with the language, that feeling of messiness with unfamiliar syntax fades away. That's the case with any unfamiliar language, not just Rust.
I used Rust for a year and still wasn't used to the syntax, though this was v1.0 so idk what changed. I see why it's so complicated and would definitely prefer it over C or Cpp, but wouldn't do higher-level code in it.
I’ve been writing python professionally for over 10 years. In the last year I’ve been writing more and most Rust. At first I thought the same as you. It’s a fugly language, there’s no denying it. But once I started to learn what all the weird syntax was for, it began to ruin Python for me.
Now I begrudge any time I have to go back to python. It feels like its beauty is only skin deep, but the ugly details are right there beneath the surface: prolific duck typing, exceptions as control flow, dynamic attributes. All these now make me uneasy, like I can’t be sure what my code will really do at runtime.
Rust is ugly but it’s telling you exactly what it will do.
Seems like a fairly decent syntax. It’s less simple than many systems languages because it has a very strong type system. That’s a choice of preference in how you want to solve a problem.
I don’t think the memory safety guarantees of Rust could be expressed in the syntax of a language like C or Go.
> It’s less simple than many systems languages because it has a very strong type system.
I don’t think that’s the case, somehow most ML derived languages ended up with stronger type system and cleaner syntax.
Is ML a systems language? Sorry, maybe my definition is wrong, but I consider a systems language something that’s used by a decent amount of OS’es, programming languages and OS utilities.
I assume you’re talking about OCaml et al? I’m intrigued by it, but I’m coming from a Haskell/C++ background.
Rust is somewhat unique as a systems language because it's the first one that isn't "simple" like C but is still used for systems tools, more so than Go as far as I'm aware.
Which probably has to do with its performance characteristics being close to the machine, which Go cannot match (i.e. based on LLVM, no GC, etc.).
There is no other ML-like that is as low level. Except perhaps ATS, which has terrible syntax.
One of the design goals of rust is explicitness. I think if Rust had type elision, like many other functional languages, it would go a long way to cleaning up the syntax.
Rust's most complained about syntax, the lifetime syntax, was borrowed from an ML: OCaml.
I code mostly in Go and the typing sloppiness is a major pain point.
Example: You read the expression "x.f", say, in the output of git-diff. Is x a struct object, or a pointer to a struct? Only by referring to enclosing context can you know for sure.
Maybe I've Stockholm'd myself, but I think Rust's syntax is very pleasant. I also think a lot of C code looks very good (although there is some _ugly_ C code out there).
Sometimes the different sets of angle and curly brackets adding up can look ugly at first, and maybe the anonymous function syntax of || {}, but it grows on you if you spend some time with the language (as do all syntaxes, in my experience).
> It seems to me that we ought to have a system-level language which builds upon the learnings of the past 20+ years.
Maybe Ada, D or Nim might qualify?
The family of languages that started with ML[0] mostly look like this. Studying that language family will probably help you feel much more at home in Rust.
Many features and stylistic choices from ML derivatives have made their way into Swift, Typescript, and other non-ML languages.
I often say that if you want to be a career programmer, it is a good idea to deeply learn one Lisp-type language (which will help with stuff like Python), one ML-type language (which will help with stuff like Rust) and one C-type language (for obvious reasons.)
F# looks nothing like Rust. It's much more readable to me.
I don’t program much in Rust, but I find it a beautiful syntax… they took C++ and made it pretty much strictly better along with taking some inspiration from ML (which is beautiful imo)
The sigils in Rust (and perl) are there to aid readability. After you use it a bit, you get used to ignoring them unless they look weird.
All the python programs I've had to maintain (I never choose python) have had major maintainability problems due to python's clean looking syntax. I can still look at crazy object oriented perl meta-programming stuff I wrote 20 years ago, and figure out what it's doing.
Golang takes another approach: They impoverished the language until it didn't need fancy syntax to be unambiguously readable. As a workaround, they heavily rely on codegen, so (for instance) Kubernetes is around 2 million lines of code. The lines are mostly readable (even the machine generated ones), but no human is going to be able to read them at the rate they churn.
Anyway, pick your poison, I guess, but there's a reason Rust attracts experienced systems programmers.
I think this is subjective, because I think Rust's syntax is (mostly) beautiful.
Given the constraint that they had to keep it familiar to C++ people, I'd say they did a wonderful job. It's like C++ meets OCaml.
Do you have any particular complaints about the syntax?
Aside from async/await which I agree is somewhat janky syntaxtically, I'm curious what you consider to be janky. I think Rust is overall pretty nice to read and write. Patterns show up where you want them, type inference is somewhat limited but still useful. Literals are readily available. UFCS is really elegant. I could go on.
Ironically, I find Python syntax frustrating. Imports and list comprehensions read half backwards, variable bindings escape scope, dunder functions, doc comments inside the function, etc.
What do people actually mean when they say "the syntax is janky"?
I often see comparisons to languages like Python and Kotlin, but both encode far less information on their syntax because they don't have the same features as Rust, so there's no way for them to express the same semantics as rust.
Sure, you can make Rust look simpler by removing information, but at that point you're not just changing syntax, you're changing the language's semantics.
Is there any language that preserves the same level of type information while using a less "janky" syntax?
Kotlin programmer here who is picking up Rust recently. you're right, it's no Kotlin when it comes to the elegance of APIs but it's also not too bad at all.
In fact there are some things about the syntax that are actually nice like range syntax, Unit type being (), match expressions, super explicit types, how mutability is represented etc.
I'd argue it's the most similar system level language to Kotlin I've encountered. I encourage you to power through that initial discomfort because in the process it does unlock a level of performance other languages dream of.
> Why are we pushing forward with a language that has a Perl-esque unreadability...?
The reason is the same for any language (including Perl, and excepting those meme languages where obfuscation is a feature): the early adopters don't think it's unreadable.
> Is there a "Kotlin for Rust"?
While it's not a systems language, have you tried Swift?
Swift is as relevant to this discussion as Common Lisp.
On the contrary, Swift is very relevant on this subject. It has high feature parity with Rust, with a much more readable syntax.
But Swift is not "Kotlin for Rust" though, I can't see the connection at all. "Kotlin for Rust" would be a language that keeps you in the Rust ecosystem.
The commenter I replied to seems to like Kotlin. Swift is extremely close to Kotlin in syntax and features, but is not for the JVM. Swift also has a lot of similarities with Rust, if you ignore the fact that it manages memory with automatic reference counting.
Ah yeah ok, makes sense in that way
[deleted]
Have you considered that part of it is not the language but the users?
I'm learning rust and the sample code I frequently find is... cryptically terse. But the (unidiomatic, amateurish) code I write ironically reads a lot better.
I think rust attracts a former c/c++ audience, which then bring the customs of that language here. Something as simple as your variable naming (character vs c, index vs i) can reduce issues already.
As an official greybeard who has written much in C, C++, Perl, Python, and now Rust, I can say Rust is a wonderful systems programming language. Nothing at all like Perl, and as others have mentioned, a great relief from C++ while providing all the power and low-level bits and bobs important for systems programming.
[deleted]
I would argue that anything that is not Lisp has a complicated syntax.
The question is: is it worth it?
With Rust, the answer is yes. The reliability, speed, and data-race-free nature of the code I get from Rust absolutely justifies the syntax quirks (for me!).
what makes it unreadable for you?
Legit question, really. A comparative study on language readability, using code that does the same thing written idiomatically in different languages, would be interesting. Beyond syntax, idioms/paradigm/familiarity should also play a role.
Not the person you're replying to, but as someone who doesn't know Rust, on first glance it seems like it's littered with too many special symbols and is very verbose. As I understand it, this is required because of the very granular low-level control Rust offers.
maybe unreadable is too strong of a word, but there is a valid point of it looking unapproachable to someone new
People often misuse unreadable when they mean unfamiliar. Rust really isn't that difficult to read when you get used to it.
I think the main issue people who don't like the syntax have with it is that it's dense. We can imagine a much less dense syntax that preserves the same semantics, but IMO it'd be far worse.
Using matklad's first example from his article on how the issue is more the semantics[1]
we can imagine a much less symbol-heavy syntax inspired by POSIX shell, FORTH, & ADA:
    generic
        type P is Path containedBy AsRef
    public function read takes type Path named path returns u8 containedBy Vector containedBy Result fromModule io
        function inner takes type reference to Path named path returns u8 containedBy Vector containedBy Result fromModule io
            try
            let mutable file = path open fromModule File
            let mutable bytes = new fromModule Vector
            try
            mutable reference to bytes file.read_to_end
            bytes Ok return
        noitcnuf
        path as_ref inner return
    noitcnuf
and I think we'll all agree that's much less readable even though the only punctuation is `=` and `.`. So "symbol heavy" isn't a root cause of the confusion, it's trivial to make worse syntax with fewer symbols. And I like RPN syntax & FORTH.
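For comparison, the Rust function being parodied above is std::fs::read; this is a sketch of it from memory, approximating (not verbatim) the standard library source:

```rust
use std::fs::File;
use std::io::{self, Read};
use std::path::Path;

// Approximately how std::fs::read is written: a generic outer function
// that converts its argument, and a non-generic inner function that does
// the work (a common trick to limit monomorphization).
pub fn read<P: AsRef<Path>>(path: P) -> io::Result<Vec<u8>> {
    fn inner(path: &Path) -> io::Result<Vec<u8>> {
        let mut file = File::open(path)?;
        let mut bytes = Vec::new();
        file.read_to_end(&mut bytes)?;
        Ok(bytes)
    }
    inner(path.as_ref())
}

fn main() -> io::Result<()> {
    // Demonstration: write a temp file, then read it back.
    let path = std::env::temp_dir().join("read_demo.txt");
    std::fs::write(&path, b"hello")?;
    assert_eq!(read(&path)?, b"hello".to_vec());
    Ok(())
}
```

Every symbol in the real version (`<>`, `&`, `?`, `->`) carries one of the pieces of information the keyword-heavy parody spells out in words.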
In your opinion how does Rust compare to C++ for readability?
> Every time I consider learning Rust, I am thrown back by how... "janky" the syntax is. It seems to me that we ought to have a system-level language which builds upon the learnings of the past 20+ years.
I said this years ago and I was basically told "skill issue". It's unreadable. I shudder to think what it's like to maintain a Rust system at scale.
You get used to it. Like any language.
I'm writing this as a heavy python user in my day job. Python is terrible for writing complex systems in. Both the language and the libraries are full of footguns for the novice and expert alike. It has 20 years of baggage, the packaging and environment handling is nothing short of an unmitigated disaster, although uv seems to be a minor light at the end of the tunnel. It is not a simple language at this point. It has had so many features tacked on, that it needs years of use to have a solid understanding of all the interactions.
Python is a language that became successful not because it was the best in its class, but because it was the least bad. It became the lingua franca of quantitative analysis because R was even worse and MATLAB was a closed ecosystem with strong whiffs of the 80s. It became successful because it was the least bad glue language for getting up and running with ML and later on LLMs.
In comparison, Rust is a very predictable and robust language. The tradeoff it makes is that it buys safety for the price of higher upfront complexity. I'd never use Rust to do research in. It'd be an exercise in frustration. However, for writing reliable and robust systems, it's the least bad currently.
What's wrong with R? I used it and liked it in undergrad. I certainly didn't use it as seriously as the users who made Python popular, but to this day I remember R fondly and would never choose Python for a personal project.
My R use was self-taught, as well. I refused to use proprietary software for school all through high school and university, so I used R where we were expected to use Excel or MatLab (though I usually used GNU Octave for the latter), including for at least one or two math classes. I don't remember anything being tricky or difficult to work with.
R is the most haphazard programming environment I've ever used. It feels like an agglomeration of hundreds of different people's shell aliases and scripting one-liners.
I'll grant my only exposure has been a two- or three-day "Intro to R" class but I ran screaming from that experience and have never touched it again.
It maybe worked against me that I am a programmer, not a statistician or researcher.
Python had already become vastly popular before ML/AI. Scripting/tools/apps/web/... Only space that hasn't entered is mobile.
> upon the learnings of the past 20+ years.
That's the thing though... Rust does build on many of those learnings. For starters, managing a big type system is better when some types are implicit, so Rust features type inference to ease the burden in that area. They've also learned from C++'s mistake of having a context sensitive grammar. They learned from C++'s template nightmare error messages so generics are easier to work with. They also applied learnings about immutability being a better default that mutability. The reason Rust is statically linked and packages are managed by a central repository is based on decades of seeing how difficult it is to build and deploy projects in C++, and how easy it is to build and deploy projects in the Node / NPM ecosystem. Pattern matching and tagged unions were added because of how well they worked in functional languages.
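Two of those borrowed features, tagged unions (enums) and exhaustive pattern matching, can be shown in a small sketch (the Shape type here is mine, purely illustrative):

```rust
// A tagged union: each variant can carry its own data, an idea Rust
// took from the ML family of languages.
enum Shape {
    Circle { radius: f64 },
    Rect { w: f64, h: f64 },
}

fn area(s: &Shape) -> f64 {
    // The compiler checks that every variant is handled; adding a new
    // variant to Shape makes this match a compile error until updated.
    match s {
        Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
        Shape::Rect { w, h } => w * h,
    }
}

fn main() {
    assert_eq!(area(&Shape::Rect { w: 2.0, h: 3.0 }), 6.0);
}
```

Type inference is also visible here: none of the locals in main need annotations, only the function boundaries do.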
As for "Perl-esque unreadability" I submit that it's not unreadable, you are just unfamiliar. I myself find Chinese unreadable, but that doesn't mean Chinese is unreadable.
> Is there a "Kotlin for Rust"?
Kotlin came out 16 years after Java. Rust is relatively new, and it has built on other languages, but it's not the end point. Languages will be written that build on Rust, but that will take some time. Already many nascent projects are out there, but it is yet to be seen which will rise to the top.
> It seems to me that we ought to have a system-level language which builds upon the learnings of the past 20+ years
I mean, Rust does. It builds on 20+ years of compiler and type system advancements. The syntax is verbose if you include all the things you can possibly do. If you stick to the basics, it's pretty similar to most other languages. Hell, I'd say a lot of Rust syntax is similar to type-hinted Python.
Having said that, comparing a GC'd dynamic language to a systems programming language just isn't a fair comparison. When you need to be concerned about memory allocation you just need more syntax.
Perl’s most notable syntax feature is sigils on all variables.
So it’s strange to hear a comparison. Maybe there’s something I’m missing.
It seems closer to C++ syntax than Perl.
Does it really add any value to the conversation?
What are you talking about? Rust’s function signature and type declaration syntaxes are extremely vanilla, unless you venture into some really extreme use cases with lots of lifetime annotations and generic bounds.
That's just a weird and unrealistic example, though. Like, why is process_handler taking an owned, boxed reference to something it only needs shared access to? Why is there an unnecessary 'a bound on handler?
In the places where you need to add lifetime annotations, it's certainly useful to be able to see them in the types, rather than relegate them to the documentation like in C++; cf. all the places where C++'s STL has to mention iterator and reference invalidation.
LLMs LOVE to write Rust like this. They add smart pointers, options and lifetimes everywhere when none of those things are necessary. I don’t know what it is, but they love over-engineering it.
I agree that the signature for process_handler is weird, but you could steelman it to take a borrowed trait object instead, which would have an extra sigil.
The handler function isn't actually unnecessary, or at least, it isn't superfluous: by default, the signature would include 'a on self as well, and that's probably not what you actually want.
I do think that the example basically boils down to the lifetime syntax though, and yes, while it's a bit odd at first, every other thing that was tried was worse.
> The handler function isn't actually unnecessary, or at least, it isn't superfluous: by default, the signature would include 'a on self as well, and that's probably not what you actually want.
To clarify, I meant the 'a in `Box<dyn Handler + 'a>` in the definition of `process_handler` is unnecessary. I'm not saying that the <'a> parameter in the definition of Handler::handle is unnecessary, which seems to be what you think I said, unless I misunderstood.
Ah yes, I misunderstood you in exactly that way, my apologies.
Lifetimes really only come into play if you are doing something really obscure. Often times when I’m about to add lifetimes to my code I re-think it and realize there is a better way to architect it that doesn’t involve them at all. They are a warning sign.
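Since the snippet being discussed was deleted, here is a hypothetical reconstruction of the shape of it; the names Handler and process_handler come from the surrounding comments, and the bodies are invented for illustration:

```rust
// Hypothetical reconstruction of the deleted example upthread.
trait Handler {
    fn handle(&self, input: &str) -> String;
}

struct Echo;

impl Handler for Echo {
    fn handle(&self, input: &str) -> String {
        input.to_string()
    }
}

// Without an explicit `+ 'a` bound, `Box<dyn Handler>` defaults to
// `+ 'static`, i.e. the handler owns all its data, which is usually
// what you want and keeps the signature free of lifetime annotations.
fn process_handler(handler: Box<dyn Handler>, input: &str) -> String {
    handler.handle(input)
}

fn main() {
    assert_eq!(process_handler(Box::new(Echo), "hi"), "hi");
}
```

This is the point made above: the `'a` only appears in the signature when the trait object is allowed to borrow, and dropping that requirement makes the syntax vanish.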
now show me an alternative syntax encoding the same information
[deleted]
...
There's a deeper connection there: lifetimes are a form of type variable, just like in OCaml.
[deleted]
While I don’t disagree that this is at first blush quite complex, using it as an example also obscures a few additional details that aren’t present in something like python, namely monads and lifetimes. I think in absence of these, this code is a bit easier to read. However, if you had prior exposure to these concepts, I think that this is more approachable. I guess what I’m getting at here is that rust doesn’t seem to be syntactic spaghetti as much as it is a confluence of several lesser-used concepts not typically used in other “simpler” languages.
> > really extreme use cases with lots of lifetime annotations and generic bounds
You choose as your example a pretty advanced use case.
Which is the exact use case someone would choose rust for over other languages
No, the use cases of Rust are pretty much the same as the use cases of C++. Most Rust code shouldn't have objects with complicated lifetimes, just like most code in any language should avoid objects with complicated lifetimes.
Could have thrown a few uses of macros with the # and ! which threw me off completely while trying to read a Rust codebase as a non-Rust programmer.
That's simple even in Perl. The problem is when you start adding the expected idioms for real world problems.
Python users don’t even believe in enabling cursory type checking, their language design is surpassed even by JavaScript, should it really even be mentioned in a language comparison? It is a tool for ML, nothing else in that language is good or worthwhile
”[One] major contributor to APT suggested it would be better to remove the Rust code entirely as it is only needed by Canonical for its Launchpad platform. If it were taken out of the main APT code base, then it would not matter whether they were written in Rust, Python, or another language, since the tools are not directly necessary [for regular installations].”
Given the abundance of the hundreds of deb-* and dh-* tools across different packages, it is surprising that apt isn’t more actively split into separate, independent tools. Or maybe it is, but they are all in a monorepo, and the debate is about how if one niche part of the monorepo uses Rust then the whole suite can only be built on platforms that support Rust?
#!/bin/sh
build_core
if has_rust; then
    build_launchpad_utils
fi
It’s like arguing about the bike shed when everyone takes the bus except for one guy who cycles in every four weeks to clean the windows.
If this could be done it seems like the ideal compromise. Everyone gets what they want.
That said eventually more modern languages will be dependencies of the tools one way or another (and they should). So probably Debian as a whole should come to a consensus on how that should happen, so it can happen in some sort of standard and fair fashion.
Interesting how instead of embracing Rust as a required toolchain for APT, the conversation quickly devolved into
"why don't we just build a tool that can translate memory-safe Rust code into memory-unsafe C code? Then we don't have to do anything."
This feels like swimming upstream just for spite.
>tool that can translate memory-safe Rust code into memory-unsafe C code
FWIW, there are two such ongoing efforts. One[1] is an alternative Rust compiler, written in C++, that emits C (in the project's words, high-level assembly); the other[2] is a Rust compiler backend/plugin (an extra goal beyond its initial one of compiling Rust to CLR assembly). The latter apparently is[3] quite modular and could be adapted for other targets too. Other options are continuing/improving the GCC front-end for Rust, and a recent attempt at a Rust compiler written in C[4] that compiles to QBE IR, which can then be compiled with QBE/cc.
That's not what the comment said. It said, "How about a Rust to C converter?..." The idea was that using a converter could eliminate the problem of not having a rust compiler for certain platforms.
The problem is that rust is being shoved in pointless places with a rewrite-everything-in-rust mentality.
There are lunatics who want to replace basic Unix tools like sudo, which have been battle-tested for ages, and the replacement has been a mess of bugs so far.
Instead, Rust should find its niche not by rewriting what works, but by tackling what doesn't.
FWIW sudo has been maintained by an OpenBSD developer for a while now but got replaced in the base system by doas. Independent of any concerns about Rust versus C, I don't think it's quite as unreasonable as you're claiming to consider alternatives to sudo given that the OS that maintains it felt that it was flawed enough to be worth writing a replacement for from scratch.
sudo had grown a lot of features and a complicated config syntax over the years, which ended up being confusing and rarely needed in practice. doas is a lot simpler. It wasn't just a rewrite of a flawed utility but a simplification of it.
sudo is not fully battle tested, even today. You just don't really see the CVEs getting press.
Applying strict compile time rules makes software better. And with time it will also become battle tested.
Cue all those battle-tested programs where people keep finding vulnerabilities several decades after they were considered "done". You should try looking at the test results once in a while.
And by the way, we had to replace almost all of the basic Unix tools at the turn of the century because they were completely unfit for purpose. There aren't many left.
Calling it pointless comes across as jaded. It's not pointless.
Supporting Rust attracts contributors, and those contributors are much less likely to introduce vulnerabilities in Rust when contributing vs alternatives.
to introduce certain common vulnerabilities ...
not vulnerabilities in general.
Converting parsers to Rust is not "pointless". Doing string manipulation in C is both an awful experience and also extremely fertile ground for serious issues.
apt is C++
It’s very easy to write a string library in C which makes string operations high level (both in API and memory management). Sure, you shouldn’t HAVE to do this. I get it. But anyone writing a parser is definitely skilled enough to maintain a couple hundred lines of code for a linear allocator and a pointer plus length string. And to be frank, doing things like “string operations but cheaply allocated” is something you have to do ANYWAY if you’re writing e.g. a parser.
This holds for many things in C
This is just a variation of the "skill issue" argument.
If it were correct, we wouldn't see these issues continue to pop up. But we do.
> a pointer plus length
What would length represent? Bytes? Code points?
Anyway, I think what you are asking for already exists in the excellent ICU library.
And it's not a very easy thing to maintain. Unicode stuff changes more often than you might think and it can be political.
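Rust's own string type makes the bytes-versus-code-points distinction raised above explicit; a minimal sketch:

```rust
fn main() {
    // A &str is literally a pointer plus a length, and len() counts
    // bytes, not code points.
    let s = "héllo";
    assert_eq!(s.len(), 6);           // bytes: "é" is two bytes in UTF-8
    assert_eq!(s.chars().count(), 5); // Unicode scalar values
}
```

A hand-rolled C "pointer plus length" string has to pick one of these meanings and document it; here the API forces the caller to choose.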
Issues that have themselves been battle-tested for ages.
Sure, which is highly valuable information that hopefully made its way into a testing / verification suite. Which can then be used to rewrite the tool into a memory-safe language, which allows a lot of fixes and edge cases that were added over time to deal with said issues to be refactored out.
Of course there's a risk that new issues are introduced, but again, that depends a lot on the verification suite for the existing tool.
Also, just because someone did a port, doesn't mean it has to be adopted or that it should replace the original. That's open source / the UNIX mentality.
I seem to remember going through this with systemd in Ubuntu. Lots of lessons learned seemed to come back as "didn't we fix this bug 3 years ago?"
We need lisp, cobol, and java in apt, too. and firefox.
Is the apt package manager a pointless place? It seems like a pretty foundational piece of supply chain software with a large surface area.
The author of the rust software did not solve the platform problem, as a result it is not a solution. Since it is not a solution, it should be reverted. It's really that simple.
All compilers do anyways is translate from one language specification to another. There's nothing magical about Rust or any specific architecture target. The compiler of a "memory safe" language like Rust could easily output assembly with severe issues in the presence of a compiler bug. There's no difference between compiling to assembly vs. C in that regard.
The assumption here is that there exists an unambiguous C representation for all LLVM IR bitcode emitted by the Rust compiler.
To my knowledge, this isn’t the case.
> The assumption here is that there exists an unambiguous C representation for all LLVM IR bitcode emitted by the Rust compiler.
> To my knowledge, this isn’t the case.
Tell us more?
Source-to-source translation will be very hard to get right, because lots of things are UB in C that aren’t in Rust, and obviously vice versa.
Rust has unwinding (panics), C doesn’t.
For one, signed integer overflow is allowed and well-defined in Rust (the result simply wraps around in release builds), while it's Undefined Behavior in C. This means that the LLVM IR emitted by the Rust compiler for signed integer arithmetic can't be directly translated into the analogous C code, because that would change the semantics of the program. There are ways around this and other issues, but they aren't necessarily simple, efficient, and portable all at once.
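The overflow behavior described above can be shown directly; a small sketch:

```rust
fn main() {
    // Signed overflow is defined behavior in Rust: debug builds panic,
    // release builds wrap two's-complement, and the wrapping_*/checked_*
    // methods make the choice explicit in the source. The equivalent C
    // expression `INT_MAX + 1` is undefined behavior.
    assert_eq!(i32::MAX.wrapping_add(1), i32::MIN);
    assert_eq!(i32::MAX.checked_add(1), None);
}
```

A naive Rust-to-C translation of wrapping arithmetic into plain `int` addition would therefore change the program's semantics, which is the portability problem being discussed.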
You guys seem to be assuming transpiling to C means it must produce C that DTRT on any random C compiler invoked any which way on the other side, where UB is some huge possibility space.
There's nothing preventing it from being some specific invocation of a narrow set of compilers like gcc-only of some specific version range with a set of flags configuring the UB to match what's required. UB doesn't mean non-deterministic, it's simply undefined by the standard and generally defined by the implementation (and often something you can influence w/cli flags).
The gigantic difference is that assembly language has extremely simple semantics, while C has very complex semantics. Similarly, assembler output is quite predictable, while C compilers are anything but. So the level of match between the Rust code and the machine code you'll get from a Rust-to-assembly compiler will be much, much easier to understand than the match you'll get between the Rust code and the machine code produced by a C compiler compiling C code output by a Rust-to-C transpiler.
Rust developers are so dogmatic about their way being the best and only way that I just avoid it altogether. I've had people ask about Rust in issues/discussions in small hobby projects I released as open source - I just ban them immediately because there is no reasoning with them and they never give up. Open source terrorists.
"Open source terrorism" is a hilarious designation for Rust-like traditions and customs. I wonder what other programming language/software communities may fall under this definition?
Shouldn't we wait until Rust gets full support in GCC? This should resolve the issue with ports without a working Rust compiler.
I don't have a problem with Rust, it is just a language, but it doesn't seem to play along well with the mostly C/C++-based UNIX ecosystem, particularly when it comes to dependencies and package management. C and C++ don't have a standard package manager and often rely on system-wide dynamic libraries, while Rust has cargo, which promotes large dependency graphs of small libraries, and static linking.
I have never seen a program segfault and crash more than apt. The status quo is extremely bad, and it desperately needs to be revamped in some way. Targeted rewrites in a memory safe & less mistake-prone language sounds like a great way to do that.
If you think this is a random decision caused by hype, cargo culting, or a maintainer's/canonical's mindless whims... please, have a tour through the apt codebase some day. It is a ticking time bomb, way more than you ever imagined such an important project would be.
You know, it is easy to find this kind of nitpicking and seemingly eternal discussion over details exhausting and meaningless, but I do think it is actually a good sign and a consequence of "openness".
In politics, authoritarianism tends to show a pretty façade where everyone mostly agrees (reality be damned), and discussion and dissenting voices are only allowed to a certain extent, as a communication tool. This is usually what we see in corporate development.
Free software is much more like a democracy: everyone can voice their opinion freely, and it tends to be messy, confrontational, nitpicky. It does often slow down changes, but it also avoids the common pitfall of authoritarian regimes: going head first into a wall at the speed of light.
What?
Open source software doesn't have one governance model, and most of it starts out as basically a pure authoritarian run.
It's only as the software ages, grows, and becomes more integral that it switches to more democratic forms of maintenance.
Even then, the most important OS code on the planet, the kernel, is basically a monarchy with King Linus holding absolute authority to veto the decision of any of the Lords. Most stuff is maintained by the Lords but if Linus says "no" or "yes" then there's no parliament which can override his decision (beyond forking the kernel).
""and not be held back by trying to shoehorn modern software on retro computing devices""
Nice. So, discrimination against poor users who are running "retro" machines because that is the best they can afford or acquire.
I know of at least two devs who are stuck with older 32 bit machines as that is what they can afford/obtain. I even offered to ship them a spare laptop with a newer CPU and they said thanks, but import duties in their country would be unaffordable. Thankfully they are also tinkering with 9front, which has little to no issues with portability and still supports 32 bit.
Looking at the list of affected architectures: Alpha (alpha), Motorola 680x0 (m68k), PA-RISC (hppa), and SuperH (sh4). I think these are much, much more likely to be run by enthusiasts than by someone needing an affordable computer.
The last 32bit laptop CPU was produced nearly 20 years ago.
Further, there are still several LTS linux distros (including the likes of Ubuntu and Debian) which don't have the rust requirement and won't until the next LTS. 24.04 is supported until 2029. Meaning you are talking about a 25 year old CPU at that point.
And even if you continue to need support, Debian-based distros aren't the only ones on the planet. You can pick something else if it really matters.
No one is using an Alpha, Motorola 680x0, PA-RISC, or SuperH computer because that's the only thing they can afford. Rust supports 32bit x86.
Rust works fine on 32 bit, (and 16 bit) that’s not what they mean…
Rust even works on 8-bit via the LLVM-MOS backend for MOS 6502 :)
Poor people aren’t running exotic hardware.
You seem to be involved with 9front.
Are you trying to suggest there is a nontrivial community of people who cannot afford modern 64-bit Linux platforms, and opt for 9front on some ancient 32-bit hardware instead? Where are they coming from? Don't get me wrong, I love the 9 as much as the next guy, but you seem to paint it as some kind of affordability frontier...
The announcement says:
>In particular, our code to parse .deb, .ar, .tar, and the HTTP signature verification code would strongly benefit from memory safe languages and a stronger approach to unit testing.
I can understand the importance of safe signature verification, but how is .deb parsing a problem? If you're installing a malicious package you've already lost. There's no need to exploit the parser when the user has already given you permission to modify arbitrary files.
It is possible the deb package is parsed to extract some metadata before being installed and before verifying the signature.
Also there is the aspect of defence in depth. Maybe you can compromise one package that itself can't do much, but the installer runs with higher privileges and has network access.
Another angle -- installed package may compromise one container, while a bug in apt can compromise the environment which provisions containers.
And then at some point there is the "oh..." moment when the holes in different layers align nicely to turn four "bad but not exploitable" bugs into a zero-day shitshow.
> It is possible the deb package is parsed to extract some metadata before being installed and before verifying signature.
Yes, .deb violates the cryptographic doom principle[1] (if you have to perform any cryptographic operation before verifying the message authentication code (or signature) on a message you’ve received, it will somehow inevitably lead to doom).
Their signed package formats (there are two) add extra sections to the `ar` archive for the signature, so they have to parse the archive metadata & extract the contents before validating the signature. This gives attackers a window to try to exploit this parsing & extraction code. Moving this to Rust will make attacks harder, but the root cause is a file format violating the cryptographic doom principle.
The parser can run before the user is asked for permission to make changes. The parsed metadata can then discourage the user from installing the package (e.g. because of extremely questionable dependencies).
Dependencies are probably in the apt database and do not need parsing, but not everything is, or perhaps apt can install arbitrary .deb files now?
.deb is a packaging format like any other. There are plenty of reasons for parsing without running the code inside them.
I have a dual pentium pro 200 that runs gentoo and openbsd, but rust doesn't ship i586 binaries, only i686+. So I would need to compile on a separate computer to use any software that is using rust.
There is already an initrd package tool I can't use since it is rust based, but I don't use initrd on that machine so it is not a problem so far.
The computer runs modern linux just fine, I just wish the rust team would at least release an "i386" bootstrap binary that actually works on all i386 like all of the other compilers.
"We don't care about retro computers" is not a good argument imho, especially when there is an easy fix. It was the same when the Xorg project patched out support for RAMDAC and obsoleted a bunch of drivers instead of fixing it easily. I had to fix the S3 driver myself to be able to use my S3 trio 64v+ with a new Xorg server.
/rant off
This sounds like it's fun. However, I have to ask, why should the linux world cater to supporting 30 year old systems? Just because it scratches an itch?
You can grab a $150 NUC which will run circles around this dual pentium pro system while also using a fraction of the power.
You obviously have to do a lot of extra work, including having a second system, just to keep this old system running. More work than it'd take to migrate to a new CPU.
The system is actually running fine standalone since I have been able to avoid rust software.
As to why it should cater to it, it's more that there is no need to remove something that already works just to remove it.
It is possible to compile rustc on another system so it supports i586 and below. Just a small change in the command line options. And it doesn't degrade the newer systems.
I have plenty of faster machines, I just enjoy not throwing things away or making odd systems work. It's called having fun :)
Surely retro hardware is fine with retro software.
Yes, sorry I remembered incorrectly.
The rust compiler claims to be i686 and the CPU is i686 too, but the rust compiler is using Pentium 4 only instructions so it doesn't actually work for i686.
Edit: I see from the sister post that it is actually llvm and not rust, so I'm half barking up the wrong tree. But somehow this is not an issue with gcc and friends.
Pentium Pro is the first i686 CPU, so you should be fine.
I mean... Pentium Pro is 30 years old at this point. I don't think it's unreasonable that modern software isn't targeting those machines.
Sometimes you do wonder if those 4chan memes about those who push rust rewrites are just memes or what..
A maintainer of a project making a decision about their project is not someone pushing a re-write.
This thing gets everywhere.
This is just one reason I'm not the biggest fan of Rust. The language is good (as well as what it solves), but this tendency to force it into everything (even where it would provide no benefit whatsoever) is just mind-boggling to me. And the Rust evangelists then wonder why there are so many anti-rust folk.
Maybe there's a place for Future Debian distro that could be a place for phasing out old tech and introducing new features?
Isn't that literally what debian unstable is for?
Or maybe old devices and tech should expect a limited support window, or be expected to fork after some time?
It sounds like all of the affected Debian ports are long since diverged from the official Debian releases anyway:
> The sh4 port has never been officially supported, and none of the other ports have been supported since Debian 6.0.
Wikipedia tells me Debian 6 was released on 6 February 2011
[dead]
[flagged]
> What is it about Rust fanatics [....]
The universalization from one developer's post to all Rust "fanatics" is itself an unwelcome attack. I prefer to keep my discussion as civilized as possible.
Just criticize the remark.
I read that more as "here's a perfect example of something I'd noticed already" rather than "wow this is a terrible first impression your group is making".
Perhaps this reading is colored by how this same pair of sentiments seems to come up practically every single time there's a push to change the language for some project.
[flagged]
I think you'll experience some pushback on the assertion that that particular quote has a lot of arrogance or disdain in it.
Building large legacy projects can be difficult and tapping into a thriving ecosystem of packages might be a good thing. But it's also possible to have "shiny object" or "grass is greener" syndrome.
“If you maintain a port without a working Rust toolchain, please ensure it has one within the next 6 months, or sunset the port.”
If that’s not arrogant, I don’t know what is.
Is it arrogant or a clear and straightforward announcement that a Decision has been made and these are the consequences? I'm not seeing any arrogance in the message myself.
How is this arrogant? Are open source developers now responsible for ensuring every fork works with the dependencies and changes they make?
This seems like a long window, given to ports to say, "we are making changes that may impact you, heads up." The options presented are, frankly, the two primary options "add the dependency or tell people you are no longer a current port".
"Arrogant" does not mean "forceful" or "assertive" or "makes me angry".
This is forceful, assertive, and probably makes people angry.
Does the speaker have the authority to make this happen? Because if so, this is just a mandate and it's hard to find some kind of moral failing with a change in development direction communicated clearly.
> I think you'll experience some pushback on the assertion that that particular quote has a lot of arrogance or disdain in it.
It's just a roundabout way of saying "anything that isn't running Rust isn't a REAL computer". Which is pretty clearly an arrogant statement, I don't see any other way of interpreting it.
Be real for a second. People are arguing against Rust because it supports fewer target architectures than GCC. Which of the target architectures do you believe is important enough that it should decide the future development of apt?
I read it as a straightforward way of saying "support for a few mostly unused architectures is all that is holding us back from adopting rust, and adopting rust is viewed as a good thing"
Is it the borrow checker? Normally rust has your back when it comes to memory oopsies. Maybe we need a borrow checker for empathy.
from the outside it looks like a defense mechanism from a group of developers who have been suffering crusades against them ever since a very prolific c developer decided rust would be a good fit for this rather successful project he created in his youth.
Maybe they are just really tired of having to deal with people who constantly object and throw every possible obstacle they can on the way.
Maybe they wouldn't experience so much pushback if they were more humble, had more respect for established software and practices, and were more open to discussion.
You can't go around screaming "your code SUCKS and you need to rewrite it my way NOW" at everyone all the time and expect people to not react negatively.
> You can't go around screaming "your code SUCKS and you need to rewrite it my way NOW"
It seems you are imagining things and hate people for the things you imagined.
In reality there are situations where during technical discussions some people stand up and with trembling voice start derailing these technical discussions with "arguments" like "you are trying to convince everyone to switch over to the religion".
https://youtu.be/WiPp9YEBV0Q?t=1529
That’s also not something anybody has actually said.
While no one has explicitly said that, it is the implied justification of rewriting so much stuff in rust
I disagree very strongly that a suggestion to change something is also a personal attack on the author of the original code. That’s not a professional or constructive attitude.
Are you serious? It's basically impossible to discuss C/C++ anymore without someone bringing up Rust.
If you search for HN posts with C++ in the title from the last year, the top post is about how C++ sucks and Rust is better. The fourth result is a post titled "C++ is an absolute blast" and the comments contain 128 (one hundred and twenty eight) mentions of the word "Rust". It's ridiculous.
Lots of current and former C++ developers are excited about Rust, so it’s natural that it comes up in similar conversations. But bringing up Rust in any conversation still does not amount to a personal attack, and I would encourage some reflection here if that is your first reaction.
To be clear, the "you" and "my" in your sentence refer to the same person. Julian appears to be the APT maintainer, so there's no compulsion except what he applies to himself.
(Maybe you mean this in some general sense, but the actual situation at hand doesn't remotely resemble a hostile unaffiliated demand against a project.)
> Julian appears to be the APT maintainer, so there's no compulsion except what he applies to himself.
To whom is this addressed?
> If you maintain a port without a working Rust toolchain, please ensure it has one within the next 6 months, or sunset the port.
Because that sure reads as a compulsion to me.
The endless crusades are indeed tiresome.
Yes, the immediate and endless backlash we get whenever anybody says the word "Rust" is quite tiresome.
No, honestly Rust has just really crappy attitude and culture. Even as a person who should naturally like Rust and I do plan to learn it despite that I find these people really grating.
[flagged]
And just like vegans, their detractors are far more vocal in reality.
>reality
In reality everyone eats meats because it's what the human body evolved to consume. There's nothing to detract
Untrue.
As evidenced by this very comment chain. I've seen, by far, way more comment from people annoyed by vegans. I can't even remember the last time I've heard a vegan discuss it outside of just stating the food preference when we got out to eat.
Actually, a vegan has to preach to some degree; otherwise it would be like a human rights advocate looking away when humans are tortured.
As a vegetarian on ethical grounds (mostly due to factory farming of meat) I politely disagree with your assessment.
I have to decline and explain in social settings all the time, because I will not eat meat served to me. But I do not need to preach when I observe others eating meat. I, like all humans, have a finite amount of time and energy. I'd rather spend that time focused on where I think it will do the greatest good. And that's rarely explaining why factory farming of meat is truly evil.
The best time is when someone asks, "why don't you eat meat?" Then you can have a conversation. Otherwise I've found it best to just quietly and politely decline, as more often than not one can be accommodated easily. (Very occasionally, though, someone feels it necessary to try and score imaginary points on you because they have some axe to grind against vegetarians and vegans. I've found it best to let them burn themselves out and move on. Life's too short to worry about them.)
Frankly, I more often see meat eaters get defensive. We go to a restaurant, the vegan guy gets a meatless meal. The vegan guy gets bombarded with "Oh, you don't eat meat?" "Why?" "What's wrong with eating meat?" "I just like having a steak now and then."
That's a bit of a jump. Veganism is a personal lifestyle / dietary choice. Objecting to livestock is activism. You can do either without the other.
it's not just a dietary choice and it's a personal lifestyle in the sense of it being your choice, but not in the sense of a lifestyle which is limited to your private space.
You think it's wrong abusing animals. Why would you relate that only to you and think it would be ok for others to abuse them? You wouldn't
Why is this still a discussion?
> was no room for a change in plan
yes, pretty much
At least the questions about it breaking unofficial distros, mostly related to some long-discontinued architectures, should never affect how a distro focused on current desktop and server usage develops.
If you have worries/problems beyond unsupported things breaking, then it should be obvious that you can discuss them; that is what the mailing list is for, and that is why you announce intent beforehand instead of just putting things in the changelog.
> complained that Klode's wording was unpleasant and that the approach was confrontational
It's mostly just very direct communication, which in a professional setting is preferable IMHO. I have seen too much time wasted on misunderstandings caused by people not saying things directly out of fear of offending someone.
Though he still could have done better.
> also questioned the claim that Rust was necessary to achieve the stronger approach to unit testing that Klode mentioned:
Given the focus on Sequoia in the mail, my interpretation was that this is less about writing unit tests and more about using some AFAIK very well-tested dependencies. But even when it comes to writing code, experience shows that the ease with which you can write tests hugely affects how much it's done, and rust makes it very easy and convenient to unit test everything all the time. That is, if we speak about unit tests; other tests are still nice but not quite at the same level of convenience.
> "currently has problems with rebuilding packages of types that systematically use static linking"
That seems like a _huge_ issue even outside of rust; no reliable Linux distro should have problems reliably rebuilding things after security fixes, no matter how they're linked.
If I were to guess, this might be related to how the lower levels of dependency management on Linux are quite a mess, due to requirements from the 90s that are no longer relevant today, but which some people still obsess over.
To elaborate (sorry for the wall of text) you can _roughly_ fit all dependencies of a application (app) into 3 categories:
1. programs the system provides (opt.) called by the app (e.g. over ipc, or spawning a sub process), communicating over well defined non language specific protocols. E.g. most cmd-line tools, or you systems file picker/explorer should be invoked like that (that it often isn't is a huge annoyance).
2. libraries the system needs to provide, called using a programming-language ABI (Application Binary Interface, i.e. mostly the C ABI; can have platform-dependent layout/encoding)
3. code reused to not rewrite everything all the time, e.g. hash maps, algorithms etc.
The messy part in Linux is that, for historic reasons, the latter two categories were not treated differently even though they have _very_ different properties wrt. the software life cycle. Dependencies in the last category are for your code and specific use case only! The supported versions usable with your program are often far more limited; breaking changes are far more normal; LTO is often desirable or even needed; other programs needing different, incompatible versions is the norm; even versions with security vulnerabilities can be fine _iff_ the vulnerabilities are on code paths not used by your application; etc. The fact that Linux has a long history of treating them the same is IMHO a huge fuck-up.
It made sense in the 90s. It hasn't for the last ~20 years.
It's just completely in conflict with how software development works in practice and this has put a huge amount of strain on OSS maintainers, due to stuff like distros shipping incompatible versions potentially by (even incorrectly) patching your code.... and end users blaming you for it.
IMHO Linux should have a way to handle such application-specific dependencies in all cases, from scripting dependencies (e.g. python), over shared objects, to static linking (which doesn't need any special handling outside of the build tooling).
People have estimated the storage size difference of linking everything statically, and AFAIK it's irrelevant given the availability and pricing of storage on modern systems.
And the argument that you might want to use a patched version of a dependency "for security" reasons fails if we consider that this has led to security incidents more than once. Most software isn't developed to support this at all, and bugs can be subtle and bad to the point of an RCE.
And yes, there are special cases and gray areas in between these categories.
E.g. dependencies in the 3rd category you want to be able to update independently, or dependencies from the 2nd which are often handled like the 3rd for various practical reasons, etc.
Anyway, coming back to the article: Rust can handle dynamic linking just fine, but only for the C ABI as of now. And while rust might get some form of RustABI to make dynamic linking better, it will _never_ handle it for arbitrary libraries, as that is neither desirable nor technically possible.
---
EDIT: Just for context, in the case of C you also have to rebuild everything using header-only libraries and pre-processor macros; not doing so is risky, as you then mix different versions of the same software in one build. Same (somewhat) for C++ with anything using template libraries. The way you can speed it up is by caching intermediate build artifacts, and that works for rust, too.
I hate learning new things. It sucks. Also, I hate things that make my knowledge of C++ obsolete. I hate all the people that are getting good at rust and are threatening to take away my job. I hate that rust is a great leveler, making all my esoteric knowledge of C++ that I have been able to lord over others irrelevant. I hate that other people are allowed to do this to me and to do whatever they want, like making the decision to use rust in apt. It’s just sad and crazy to me. I can’t believe it. There are lots of people like me who are scared and angry and we should be able to control anyone else who makes us feel this way. Wow, I’m upset. I hope there is another negative post about rust I can upvote soon.
Think tech space isn’t for you if you hate learning new things.
Can you confirm these C++ fascists you speak of are in the room with you right now?
I remembered reading about this news back when that first message was posted on the mailing list, and didn't think much of it then (rust has been worming its way into a lot of places over the past few years, just one more thing I tack on for some automation)...
But seeing the maintainer works for Canonical, it seems like the tail (Ubuntu) keeps trying to wag the dog (Debian ecosystem) without much regard for the wider non-Ubuntu community.
I think the whole message would be more palatable if it weren't written as a decree including the dig on "retro computers", but instead positioned only on the merits of the change.
As an end user, it doesn't concern me too much, but someone choosing to add a new dependency chain to critical software plumbing does, at least slightly, if not done for very good reason.
Agreed. I think that announcement was unprofessional.
This was a unilateral decision affecting others' hard work, and the author didn't give them the opportunity to provide feedback on the change.
It disregards the importance of ports. Even if an architecture isn't widely used, supporting multiple architectures can help reveal bugs in the original implementation that wouldn't otherwise be obvious.
This is breaking support for multiple ports to rewrite some feature for a tiny security benefit. And doing so on an unacceptably short timeline. Introducing breakage like this is unacceptable.
There's no clear cost-benefit analysis for this change. Canonical or Debian should work on porting the rust toolchain (ideally with tier 1 support) to every architecture they release for, and actually put the horse before the cart.
I love and use rust; it is my favorite language and I use it in several of my OSS projects. But I'm tired of this "rewrite it in rust" evangelism and the reputational damage it does to the rust community.
> I love and use rust; it is my favorite language and I use it in several of my OSS projects. But I'm tired of this "rewrite it in rust" evangelism and the reputational damage it does to the rust community.
Thanks for this.
I know intellectually, that there are sane/pragmatic people who appreciate Rust.
But often the vibe I’ve gotten is the evangelism, the clear “I’ve found a tribe to be part of and it makes me feel special”.
So it helps when the reasonable signal breaks through the noisy minority.
> This is breaking support for multiple ports to rewrite some feature for a tiny security benefit. And doing so on an unacceptably short timeline. Introducing breakage like this is unacceptable.

Normally I’d agree, but the ports in question are really quite old and obscure. I don’t think anything would have changed with an even longer timeline.
I think the best move would have been to announce deprecation of those ports separately. As it was announced, people who will never be impacted by their deprecation are upset because the deprecation was tied to something else (Rust) that is a hot topic.
If the deprecation of those ports was announced separately I doubt it would have even been news. Instead we’ve got this situation where people are angry that Rust took something away from someone.
Those ports were never official, and so aren't being deprecated. Nothing changes about Debian's support policies with this change.
EDIT: okay so I was slightly too strong: some of them were official as of 2011, but haven't been since then. The main point that this isn't deprecating any supported ports is still accurate.
> It disregards the importance of ports. Even if an architecture isn't widely used, supporting multiple architectures can help reveal bugs in the original implementation that wouldn't otherwise be obvious.
Imo this is true for going from one to a handful, but less true when going from a handful to more. Afaict there are 6 official ports and 12 unofficial ports (from https://www.debian.org/ports/).
> I love and use rust; it is my favorite language and I use it in several of my OSS projects. But I'm tired of this "rewrite it in rust" evangelism and the reputational damage it does to the rust community.
This right here.
As a side note, I was reading one of Cloudflare's docs on how it implemented its firewall rules, and it's so utterly disappointing how the document suddenly stops being informative and starts to read like a parody of the whole cargo cult around Rust. Rust this, Rust that, while I was there trying to read up on how Cloudflare actually supports firewall rules. The way they focus on a specific and frankly irrelevant implementation detail conveys the idea that things are run by amateurs charmed by a shiny toy.
> I think the whole message would be more palatable if it weren't written as a decree including the dig on "retro computers", but instead positioned only on the merits of the change.
The wording could have been better, but I don’t see it as a dig. When you look at the platforms that would be left behind they’re really, really old.
It’s unfortunate that it would be the end of the road for them, but holding up progress for everyone to retain support for some very old platforms would be the definition of the tail wagging the dog; any project that does so would be making a mistake.
It might have been better to leave out any mention of the old platforms in the Rust announcement and wait for someone to mention it in another post. As it was written, it became an unfortunate focal point of the announcement despite having such a small impact that it shouldn’t be a factor holding up progress.
> As an end user, it doesn't concern me too much ...
It doesn't concern me either, but there's some attitude here that makes me uneasy.
This could have been managed better. I can see a similar change in the future that could affect me, and now there will be precedent. With Canonical paying the devs and all, it isn't a great way of influencing a community.
I agree. It's sad to see maintainers take a "my way or the highway" approach to package maintenance, but this attitude has gradually become more accepted in Debian over the years. I've seen this play before, with different actors: gcc maintainers (regarding cross-bootstrapping ports), udev (regarding device naming, I think?), systemd (regarding systemd), and now with apt. Not all of them involved Canonical employees, and sometimes the Canonical employees were the voice of reason (e.g. that's how I remember Steve Langasek).
I'm sure some will point out that each example above was just an isolated incident, but I perceive a growing pattern of incidents. There was a time when Debian proudly called itself "The Universal Operating System", but I think that hasn't been true for a while now.
> It's sad to see maintainers take a "my way or the highway" approach to package maintenance, but this attitude has gradually become more accepted in Debian over the years.
It's frankly the only way to maintain a distribution that relies almost completely on volunteer work! The more different options there are, the more expensive testing gets, in terms of human effort, engineering time, and hardware cost.
It's one thing if you're, say, Red Hat with a serious amount of commercial customers, they can and do pay for conformance testing and all the options. But for a fully FOSS project like Debian, eventually it becomes unmaintainable.
Additionally, the more "liberty" distributions take in how the system is set up, the more work software developers have to put in. Just look at autotools, an abomination that is sadly necessary.
> Canonical paying Devs and all, it isn't a great way of influencing a community.
That's kind of the point of modern open source organizations. Let corporations fund the projects, and in exchange they get a say in terms of direction, and hopefully everything works out. The bigger issue with Ubuntu is that they lack vision, and when they ram things through, they give up at the slightest hint of opposition (and waste a tremendous amount of resources and time along the way). For example, Mir and Unity were perfectly fine technologies, but they retired them because they didn't want to see things through. For such a successful company, it's surprising that their technical direction-setting is so unserious.
https://www.reddit.com/r/linux/comments/15brwi0/why_canonica...
[dead]
There are many high profile DDs who work or have worked for Canonical who are emphatically not the inverse — Canonical employees who are part of the Debian org.
The conclusion you drew is perfectly reasonable but I’m not sure it is correct, especially when in comparison Canonical is the newcomer. It could even be seen to impugn their integrity.
If you look at the article, it seems like the hard dependency on Rust is being added for parsing functionality that only Canonical uses:
> David Kalnischkies, who is also a major contributor to APT, suggested that if the goal is to reduce bugs, it would be better to remove the code that is used to parse the .deb, .ar, and .tar formats that Klode mentioned from APT entirely. It is only needed for two tools, apt-ftparchive and apt-extracttemplates, he said, and the only "serious usage" of apt-ftparchive was by Klode's employer, Canonical, for its Launchpad software-collaboration platform. If those were taken out of the main APT code base, then it would not matter whether they were written in Rust, Python, or another language, since the tools are not directly necessary for any given port.
Mmm, apt-ftparchive is pretty useful for cooking up repos for "in-house" distros (which we certainly thought was serious...) but those tools are already a separate binary package (apt-utils) so factoring them out at the source level wouldn't be particularly troublesome. (I was going to add that there are also nicer tools that have turned up in the last 10 years but the couple of examples I looked at depend on apt-utils, oops)
apt-utils comes from the same top-level source package though:
https://packages.debian.org/source/sid/apt
I know you can make configure-time decisions based on the architecture and ship a leaner apt-utils on a legacy platform, but it's not as obvious as "oh yeah that thing is fully auxiliary and in a totally different codebase".
I understand, but the comment to which I was replying implied that this keeps happening, and in general. That’s not fair to the N-1 other DDs who aren’t the subject of this LWN article (which I read!)
The most interesting criticism / idea in the article was that the parts that are intended for Rust-ification should actually be removed from core apt.
> it would be better to remove the code that is used to parse the .deb, .ar, and .tar formats [...] from APT entirely. It is only needed for two tools, apt-ftparchive and apt-extracttemplates [...]
Another interesting, although perhaps tangential, criticism was that the "new solver" currently lacks a testsuite (unit tests; it has integration tests). I'm actually kind of surprised that writing a dependency solver is a greenfield project instead of using an existing one. Or is this just a dig at something that pulls in a well-tested external module for solving?
Posted in curiosity, not knowing much about apt.
It seems silly to say that it has no tests. If I had to pick between unit and integration tests, I'd pick integration tests every time.
> the "new solver" currently lacks a testsuite
To borrow a phrase I recently coined:
If it's not tested then it's not Engineered.
You'd think that core tools would have proper Software Engineering behind them. Alas, it's surprising how many do not.
Unit tests do not make Software Engineering. They're simply part of the development phase, which should be the smallest of all the phases involved in real Software Engineering, something rarely even done these days outside of DO-178 (et al.) monotony. Private industry has even polluted upper management in defense software engineering into accepting SCRUM as somehow more desirable than the ability to effectively plan your requirements and execute without deviation. Yes, it's possible, and yes, it's even plausible. SWE laziness turns engineers into developers. Running an auto-documentation script or drawing a generic, non-official block diagram is not the same as a civil PE creating blueprints for a house, let alone a mile-long bridge or a skyscraper.
Nonsense. I know and talk to multiple engineers all the time, and they all envy our position of being able to continue fixing issues in the project.
Mechanical engineers have to work around other components' failures all the time because their lead times are gigantic, and no matter how much planning they do, failures still pop up.
The idea that Software Engineering has more bugs is absurd. Electronic, mechanical, and electrical engineers all face similar issues to what we face, and normally don't have the capacity to deploy fixes as fast as we do because of real-world constraints.
Not nonsense. Don't be reductive.
As far as I understand it, the idea behind scrum is not that you don't plan; it's that you significantly shorten the planning-implementation-review cycle.
Perhaps that is the ideal when it was laid out, but the reality of the common implementation is that planning is dispensed with. It gives some management a great excuse to look no further than the next jira ticket, if that.
The ideal implementation of a methodology is only relevant for a small number of management who would do well with almost any methodology because they will take initiative to improve whatever they are doing. The best methodology for wide adoption is the one that works okay for the largest number of management who struggle to take responsibility or initiative.
That is to say, the methodology that requires management to take responsibility in its "lowest energy state" is the best one for most people-- because they will migrate to the lowest energy state. If the "lowest energy state" allows management to do almost nothing, then they will. If the structure allows being clueless, a lot of managers will migrate to pointy haired Dilbert manager cluelessness.
With that said; I do agree with getting products to clients quickly, getting feedback quickly, and being "agile" in adapting to requirements; but having a good plan based on actual knowledge of the requirements is important. Any strict adherence to an extreme methodology is probably going to fail in edge cases, so having the judgement of when to apply which methodology is a characteristic of good management. You've got to know your domain, know your team, and use the right tool for the job.
I've got a bridge to sell. It's made from watered-down concrete and comes with blueprints written on site. It was very important to get the implementation started asap to shorten the review cycle.
Integration tests are still tests. There are definitely cases for tools where you can largely get by without unit tests in favor of integration tests. I've written a lot of code generation tools this way for instance.
Given that Cargo is written in Rust, you would think there would be at least one battle tested solver that could be used. Perhaps it was harder to extract and make generic than write a new one?
Cargo's solver incorporates concepts that .debs don't have, like Cargo features, and I'm sure that .debs have features that Cargo packages don't have either.
Could the Rust code be transpiled to readable C?
Dependency solvers are actually an area that can benefit from updating IMO.
Every time I consider learning Rust, I am thrown back by how... "janky" the syntax is. It seems to me that we ought to have a system-level language which builds upon the learnings of the past 20+ years. Can someone help me understand this? Why are we pushing forward with a language that has a Perl-esque unreadability...?
Comparison: I often program in Python (and teach it) - and while it has its own syntax warts & frustrations - overall the language has a "pseudocode which compiles" approach, which I appreciate. Similarly, I appreciate what Kotlin has done with Java. Is there a "Kotlin for Rust"? or another high quality system language we ought to be investing in? I genuinely believe that languages ought to start with "newbie friendliness", and would love to hear challenges to that idea.
You might find this blog post interesting; it argues that it's Rust's semantics, not its syntax, that produce the noisiness, i.e. that it's intrinsic complexity:
https://matklad.github.io/2023/01/26/rusts-ugly-syntax.html
I found it reasonably convincing. For what it's worth, I found Rust's syntax quite daunting at first (coming from Python as well), but it only took a few months of continuous use to get used to it. I think "Perl-esque" is an overstatement.
It has some upsides over Python as well, notably that the lack of significant whitespace means inserting a small change and letting the autoformatter deal with syntax changes is quite easy, whereas in Python I occasionally have to faff with indentation before Black/Ruff will let me autoformat.
I appreciate that for teaching, the trade-offs go in the other direction.
I'm not sure which of the dozen Rust-syntax supporters I should reply to, but consider something like these three (probably equivalent) syntaxes:
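For illustration, three (assumed) equivalent ways to declare a growable array of `u32` in Rust, labeled to match the forms discussed below:

```rust
fn main() {
    // (a) annotate the binding
    let a: Vec<u32> = Vec::new();
    // (b) "turbofish" on the type
    let b = Vec::<u32>::new();
    // (c) fully qualified form
    let c = <Vec<u32>>::new();
    // All three produce the same empty vector.
    assert!(a.is_empty() && b.is_empty() && c.is_empty());
}
```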
Which one will your coworker choose? What will your other coworkers choose? This is day-one stuff for declaring a dynamic array. What you really want is something like:
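Presumably a form like the following hypothetical one, which Rust's grammar rejects (sketched here as a comment next to the turbofish that Rust actually accepts):

```rust
fn main() {
    // The "obvious" form does not parse: `Vec<u32>` in expression
    // position is ambiguous with the `<` comparison operator.
    // let v = Vec<u32>::new();   // error: comparison operators cannot be chained
    let v = Vec::<u32>::new(); // what Rust actually requires
    assert!(v.is_empty());
}
```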
However, the grammar is problematic here because of using less-than and greater-than as brackets in a type "context". You can explain that as either not learning from C++'s mistakes or trying to appeal to a C++ audience, I guess. Yes, I know there is a `vec!` macro. Will you require your coworkers to declare a similar macro when they start to implement their own generic types?
There are lots of other examples when you get to what traits are required to satisfy generics ("where clauses" vs "bounds"), or the lifetime signature stuff and so on...
You can argue that strong typing has some intrinsic complexity, but it's tougher to defend the multiple ways to do things, and that WAS one of Perl's mantras.
This is like complaining that in C you can write
Being able to use disambiguated syntaxes, and being able to add extra brackets, isn't an issue. PS: The formatting tooling normalizes your second and third examples to the same syntax. Personally I think it ought to normalize both of them to the first syntax as well, but it's not particularly surprising that it doesn't, because they aren't things anyone ever writes.
This will be the case in any language with both generics and type inference. It's nothing to do specifically with Rust.
Most likely
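For example (assumed forms): annotate the binding once, or let inference work it out from later use.

```rust
fn main() {
    // Annotate the binding once...
    let mut v: Vec<u32> = Vec::new();
    v.push(1);
    // ...or let rustc infer the element type from the push below.
    let mut w = Vec::new();
    w.push(1u32);
    assert_eq!(v, w);
}
```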
or rustc will figure out the type.
Exactly. You specify types for function parameters and structs and let the language do its thing. It's a bit niche to specify a type within a function...
There is a reason the multiple methods detailed above exist: mostly for iterator syntax, such as summing an array or calling collect on an iterator. Most Rust devs probably don't use all of these syntaxes in a single year, or maybe even in their careers.
I've only ever seen `a` and `d`. Personally I prefer `a`. The only time I've seen `c` is for trait methods like `<Self as Trait<Generic>>::func`. Noisy? I guess. Not sure how else this could really be written.
Fwiw, I didn't go looking for obscure examples to make HN posts. I've had three rounds of sincerely trying to really learn and understand Rust. The first was back when pointer types had sigils, but this exact declaration was my first stumbling block on my second time around.
The first version I got working was `d`, and my first thought was, "you're kidding me, the right-hand side is inferring its type from the left?!" I didn't learn about "turbofish" until some time later.
> Which one will your coworker choose? What will your other corworkers choose?
I don’t think I’ve ever seen the second two syntaxes anywhere.
I really don’t think this is a problem.
I mean, the fact that you mention "probably equivalent" is part of the reality here: Nobody writes the majority of these forms in real code. They are equivalent, by the way.
In real code, the only form I've ever seen out of these in the wild is your d form.
This is a No True Scotsman-style counterargument, and it's hard for me to make a polite reply to it.
There are people who program with a "fake it till you make it" approach, cutting and pasting from Stack Overflow, and hoping the compiler errors are enough to fix their mess. Historically, these are the ones your pages/books cater to, and the ones who think the borrow checker is the hard part. It doesn't surprise me that you only see code from that kind of beginner and experts on some rust-dev forum and nothing in between.
The issue though is that this isn't a solvable "problem". This is how programming languages' syntax work. It's like saying that C's if syntax is bad because these are equivalent:
Yes, one of your co-workers may write the third form. But it's just not possible for a programming language to stop this from existing; or at least, maybe you could do it, but it would add a ton of complexity for something that in practice isn't a problem.
Well, the solution usually isn't in syntax, but it often is solved by way of code formatters, which can normalize the syntax to a preferred form among several equivalent options.
I think Perl-esque is apt, but that's because I've done quite a bit of Perl and think the syntax concerns are overblown. Once you get past the sigils on the variables Perl's syntax is generally pretty straightforward, albeit with a few warts in places like almost every language. The other area where people complained about Perl's opaqueness was the regular expressions, which most languages picked up anyway because people realized just how useful they are.
That's it exactly.
Once you're writing Rust at full speed, you'll find you won't be putting lifetimes and trait bounds on everything. Some of this becomes implicit, some of it you can just avoid with simpler patterns.
When you write Rust code without lifetimes and trait bounds and nested types, the language looks like Ruby lite.
When you write Rust code with traits or nested types, it looks like Java + Ruby.
When you sprinkle in the lifetimes, it takes on a bit of character of its own.
It honestly isn't hard to read once you use the language a lot. Imagine what Python looks like to a day zero newbie vs. a seasoned python developer.
You can constrain complexity (if you even need it) to certain modules, leaving other code relatively clean. Imagine the Python modules that use all the language features - you've seen them!
One of the best hacks of all: if you're writing HTTP services, you might be able to write nearly 100% of your code without lifetimes at all. Because almost everything happening in request flow is linear and not shared.
>When you write Rust code without lifetimes and trait bounds and nested types, the language looks like Ruby lite.
And once you learn a few idioms this is mostly the default.
This honestly reads like the cliche "you just don't get it yet" dismissals of many rust criticisms.
That article is really good, because it highlights that Rust doesn't have to look messy. Part of the problem, I think, is that there are a few too many people who think the messy version is better, because it "uses more of the language" and makes them look smarter. Or maybe Rust just makes it too hard to see through the semantics and realize that just because a feature is there doesn't mean you need it.
There's also a massive difference between the type of C or Perl someone like me would write, versus someone coping with a more hostile environment or needing higher levels of performance. My code might be easier to read, but it technically has issues; they are mostly not relevant, while the reverse is true for a more skilled developer in a different environment. Rust seems to attract really skilled people, who have really defensive code styles or who use more of the provided language features, and that makes the code harder to read, but that would also be the case in e.g. C++.
> I am thrown back by how... "janky" the syntax is.
Well if you come from C++ it's a breath of fresh air! Rust is like a "cleaned-up" C++, that does not carry the historical baggage forced by backwards compatibility. It is well-thought out from the start. The syntax may appear a bit too synthetic; but that's just the first day of use. If you use it for a few days, you'll soon find that it's a great, beautiful language!
The main problem with rust is that the community around it has embraced all the toxic traditions of the js/node ecosystem, and then some. Cargo is a terrifying nightmare. If you could install regular rust dependencies with "apt install" in debian stable, that would be a different story! But no. They want the version churn: continuously adding and removing bugs, like particle/anti-particle pairs at the boundary of a black hole.
Concerning TFA, adding rust to apt might be a step in the right direction. But it should be symmetric: apt depends on rust, that's great! But all the rust that it depends on needs to be installed by apt, and by apt alone!
I am coming from C++ and think Cargo is a blessing.
I like that I can just add a dependency and be done instead of having to deal with dependencies which require downloading stuff from the internet and making them discoverable for the project specific tool chain - which works differently on every operating system.
Same goes for compiling other projects.
While it kinda flies under the radar, most modern C projects do have a kind of package management solution in the form of pkg-config. Instead of the wild west of downloading and installing every dependency and figuring out how to integrate it properly with the OS and your project you can add a bit of syntactic sugar to your Makefile and have that mostly handled for you, save for the part where you will need to use your platform's native package manager to install the dependencies first. On a modern system using a package on a C project just requires a Makefile that looks something like this:
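A minimal sketch (libcurl is an assumed, illustrative dependency; the package name after `pkg-config` is whatever your distro ships):

```make
# pkg-config emits the right -I/-L/-l flags for the installed package,
# so the Makefile doesn't hardcode platform-specific paths.
CFLAGS += $(shell pkg-config --cflags libcurl)
LDLIBS += $(shell pkg-config --libs libcurl)

app: main.c
	$(CC) $(CFLAGS) -o $@ $< $(LDLIBS)
```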
But that is the kind of convenience and ease of use that brings us another npm malware incident every other month at this point.
This is a real problem but I wouldn't blame the existence of good tooling on it. Sure you don't have this issue with C or C++, but thats because adding even a single dependency to a C or C++ project sucks, the tooling sucks.
I wholly blame developers who are too eager to just pull new dependencies in when they could've just written 7 lines themselves.
I remember hearing a few years ago about developers who considered every line of code they wrote a failing, and who talked about how modern development was just gluing otherwise-maintained modules together to avoid having to maintain their own project. I thought that sounded insane, and I still do.
And in a way I think AI can help here: you get just the snippet, instead of having to add a dependency that then becomes a long-term security liability.
Debian already builds Rust packages installable with apt, so it would satisfy that criterion.
> Cargo is a terrifying nightmare
Really? Why? I'm not a Rust guru, but Cargo is the only part of Rust that gave me a great first impression.
GP mostly answered that in the comment already:
> If you could install regular rust dependencies with "apt install" in debian stable, that would be a different story! But no. They want the version churn: continuously adding and removing bugs, like particle/anti-particle pairs at the boundary of a black hole.
I don't know, it doesn't explain how and why Cargo causes "continuously adding and removing bugs, like particle/anti-particle pairs at the boundary of a black hole."
The problem, of course, is that "apt install" only works on platforms that use apt to manage their packages.
As a c/c++ cmake user, cargo sounds like a utopia in comparison. It still amazes me that c/c++ package management is still spread between about 5 different solutions.
IMO, the biggest improvement to C/C++ would be ISO defining a package manager a.la pip or uv or cargo. I'm so tired of writing cmake. just... tired.
People that don't understand make are destined to recreate it poorly, and there's no better example than cmake, imho.
Here's my arc through C/C++ build systems:
- make (copy pasted examples)
- RTFM [1]
- recursive make for all sorts of non-build purposes - this is as good as hadoop up to about 16 machines
- autotools
- cmake
- read "recursive make considered harmful" [2]
- make + templates
Anyway, once you've understood [1] and [2], it's pretty hard to justify cmake over make + manual vendoring. If you need windows + linux builds (cmake's most-advertised feature), you'll pretty quickly realize the VS projects it produces are a hot mess, and wonder why you don't just maintain a separate build config for windows.
[1] https://www.gnu.org/software/make/manual/
[2] https://news.ycombinator.com/item?id=20014348
If I was going to try to improve on the state of the art, I'd clean up a few corner cases in make semantics where it misses productions in complicated corner cases (the problems are analogous to prolog vs datalog), and then fix the macro syntax.
If you want a good package manager for C/C++, check out Debian or its derivatives. (I'm serious -- if you're upset about the lack of packages, there's a pretty obvious solution. Now that docker exists, the packages run most places. Support for some sort of AppImage style installer would be nice for use with lesser distros.)
cmake exists not because people didn't understand make, but because there was no one make to understand. The "c" is for "cross platform." It's a replacement for autoconf/automake, not a replacement for make.
> If I was going to try to improve on the state of the art
The state of the art is buck/bazel/nix/build2.
cmake is a self-inflicted problem of some C++ users, and an independent issue of the language itself (just like cargo for rust). If you want, you can use a makefile and distribution-provided dependencies, or vendored dependencies, and you don't need cmake.
imo the biggest single problem with C++ is that the simple act of building it is not (and, it seems, cannot be) standardized.
This creates kind of geographic barriers that segregate populations of C++ users, and just like any language, that isolation begets dialects and idioms that are foreign to anyone from a different group.
But the stewards of the language seem to pretend these barriers don't exist, or at least don't understand them, and go on to make the mountain ranges separating our valleys even steeper.
So it's not that CMake is a self-inflicted wound. It's the natural evolution of a tool to fill in the gaps left under specified by the language developers.
> Rust is like a "cleaned-up" C++
Except they got the order of type and variable wrong. That alone is enough reason to never use Rust, Go, TypeScript or any other language that botches such a critical cornerstone of language syntax.
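For reference, the ordering in question (the C form is shown in a comment for contrast):

```rust
fn main() {
    // C declares type-first:  int x = 42;
    // Rust (like Go and TypeScript) puts the name first,
    // with a trailing type annotation after a colon:
    let x: i32 = 42;
    assert_eq!(x, 42);
}
```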
I just can’t help but wonder if you’re 12 or autistic if this is a stance you’re willing to take on a public forum.
It’s completely inconsequential and makes the language easier to parse.
That was needlessly rude.
> Comparison: I often program in Python (and teach it) - and while it has its own syntax warts & frustrations - overall the language has a "pseudocode which compiles" approach, which I appreciate.
I think this is why you don’t like Rust: In Rust you have to be explicit by design. Being explicit adds syntax.
If you appreciate languages where you can write pseudocode and have the details handled automatically for you, then you’re probably not going to enjoy any language that expects you to be explicit about details.
As far as “janky syntax”, that’s a matter of perspective. Every time I deal with Python and do things like “__slots__” it feels like janky layer upon layer of ideas added on top of a language that has evolved to support things it wasn’t originally planned to do, which feels janky to me. All of the things I have to do in order to get a performant Python program feel incredibly janky relative to using a language with first class support for the things I need to do.
Both Python and JS evolved by building on top of older versions, but somehow JS did a way better job than Python, even though Py forced a major breaking change.
Agree about Rust, all the syntax is necessary for what it's trying to do.
You mean typescript?
Before that. The classes and stuff added in ES6 and earlier
Syntax tends to be deeply personal. I would say the most straightforward answer to your question is "many people disagree that it is unreadable."
Rust did build on the learnings of the past 20 years. Essentially all of its syntax was taken from other languages, even lifetimes.
Are the many who disagree that it is unreadable more than the people who agree? I have been involved with the language for a while now, and while I appreciate what you and many others have done for it, the sense that the group is immune to feedback just becomes too palpable too often. That, and the really aggressive PR.
Rust is trying to solve a really important problem, and so far it might well be one of the best solutions we have for it in a general sense. I 100% support its use in as many places as possible, so that it can evolve. However, its evolution seems to be thwarted by a very vocal subset of its leadership and community who have made it a part of their identity and whatever socio-political leverage toolset they use.
Rust is almost git hype 2.0. That hype set the world up with (a) a dominant VCS that is spectacularly bad at almost everything it does compared to its competitors, and (b) the dominant GitHub social network, owned by MS, which got ripped to train Copilot.
Developers have a way of running with a hype that can be quite disturbing and detrimental in the long run. The one difference here is that Rust has some solid ideas implemented underneath. But the community proselytizing and throwing non-believers under the bus is quite real.
> Are the many who disagree that it is unreadable more than the people who agree?
I have no way to properly evaluate that statement. My gut says no, because I see people complain about other things far more often, but I do think it's unknowable.
I'm not involved with Rust any more, and I also agree with you that sometimes Rust leadership can be insular and opaque. But the parent isn't really feedback. It's just a complaint. There's nothing actionable to do here. In fact, when I read the parent's post, I said "hm, I'm not that familiar with Kotlin actually, maybe I'll go check it out," loaded up https://kotlinlang.org/docs/basic-syntax.html, and frankly, it looks a lot like Rust.
But even beyond that: it's not reasonably possible to change a language's entire syntax ten years post 1.0. Sure, you can make tweaks, but turning Rust into Python simply is not going to happen. It would be irresponsible.
I've found the rust core team to be very open to feedback. And maybe I've just been using Rust for too long, but the syntax feels quite reasonable to me.
Just for my own curiosity, do you have an examples of suggestions for how to improve the syntax that have been brought up and dismissed by the language maintainers?
> the sense that the group is immune to feedback
Is complaining about syntax really productive though? What is really going to be done about it?
This is such a weird take. What do you suggest? Should Rust’s syntax have been democratically decided?
There’s syntax that is objectively easier to both read and write, and there’s syntax that is both harder to read and write. For a majority.
In general, using english words consisting of a-z is easier to read. Using regex-like mojibake is harder.
For a concrete example in Rust: using pipes in lambdas, instead of an arrow, is awful.
Rust's pipes in lambdas come from Ruby, a language that's often regarded as having beautiful syntax.
Rust is objectively not mojibake. The equivalent here would be like using a-z, as Rust's syntax is borrowed from other languages in wide use, not anything particularly esoteric. (Unless you count OCaml as esoteric, which I believe is somewhat arguable, but that's only one thing; the argument still holds for the vast majority of the language.)
Uuuh, I like the pipes, even though this is my first language with them?
Concise and much clearer to read vs. parentheses, where you've got to wonder whether the params are just arguments, or a tuple, etc. What are you talking about?
I would encourage you to give it a try anyways. Unfamiliar syntax is off-putting for sure, but you can get comfortable with any syntax.
Coming from Python, I needed to work on some legacy Perl code. Perl code looks quite rough to a new user. After time, I got used to it. The syntax becomes a lot less relevant as you spend more time with the language.
Sure... but you don't want to spend time if it's such a mess to read it.
Once one does spend some time to become comfortable with the language, that feeling of messiness with unfamiliar syntax fades away. That's the case with any unfamiliar language, not just Rust.
I used Rust for a year and still wasn't used to the syntax, though this was v1.0 so idk what changed. I see why it's so complicated and would definitely prefer it over C or Cpp, but wouldn't do higher-level code in it.
I’ve been writing Python professionally for over 10 years. In the last year I’ve been writing more and more Rust. At first I thought the same as you: it’s a fugly language, there’s no denying it. But once I started to learn what all the weird syntax was for, it began to ruin Python for me.
Now I begrudge any time I have to go back to python. It feels like its beauty is only skin deep, but the ugly details are right there beneath the surface: prolific duck typing, exceptions as control flow, dynamic attributes. All these now make me uneasy, like I can’t be sure what my code will really do at runtime.
Rust is ugly but it’s telling you exactly what it will do.
Seems like a fairly decent syntax. It’s less simple than many systems languages because it has a very strong type system. That’s a choice of preference in how you want to solve a problem.
I don’t think the memory safety guarantees of Rust could be expressed in the syntax of a language like C or Go.
> It’s less simple than many systems languages because it has a very strong type system.
I don’t think that’s the case, somehow most ML derived languages ended up with stronger type system and cleaner syntax.
Is ML a systems language? Sorry, maybe my definition is wrong, but I consider a systems language something that’s used by a decent amount of OS’es, programming languages and OS utilities.
I assume you’re talking about OCaml et al? I’m intrigued by it, but I’m coming from a Haskell/C++ background.
Rust is somewhat unique among systems languages because it’s the first one that’s not “simple” like C but is still used for systems tools, more than Go is as far as I’m aware.
Which probably has to do with its performance characteristics being close to the machine, which Go cannot match (i.e. LLVM-based, no GC, etc.)
There is no other ML-like that is as low level. Except perhaps ATS, which has terrible syntax.
One of the design goals of rust is explicitness. I think if Rust had type elision, like many other functional languages, it would go a long way to cleaning up the syntax.
Rust's most complained about syntax, the lifetime syntax, was borrowed from an ML: OCaml.
I code mostly in Go and the typing sloppiness is a major pain point.
Example: You read the expression "x.f", say, in the output of git-diff. Is x a struct object, or a pointer to a struct? Only by referring to enclosing context can you know for sure.
Maybe I've Stockholm'd myself, but I think Rust's syntax is very pleasant. I also think a lot of C code looks very good (although there is some _ugly_ C code out there).
Sometimes the different sets of angle and curly brackets adding up can look ugly at first, and maybe the anonymous function syntax of || {}, but it grows on you if you spend some time with the language (as do all syntaxes, in my experience).
> It seems to me that we ought to have a system-level language which builds upon the learnings of the past 20+ years.
Maybe Ada, D or Nim might qualify?
The family of languages that started with ML[0] mostly look like this. Studying that language family will probably help you feel much more at home in Rust.
Many features and stylistic choices from ML derivatives have made their way into Swift, Typescript, and other non-ML languages.
I often say that if you want to be a career programmer, it is a good idea to deeply learn one Lisp-type language (which will help with stuff like Python), one ML-type language (which will help with stuff like Rust) and one C-type language (for obvious reasons.)
[0] https://en.wikipedia.org/wiki/ML_(programming_language)
F# looks nothing like Rust. Is much more readable for me.
I don’t program much in Rust, but I find it a beautiful syntax… they took C++ and made it pretty much strictly better along with taking some inspiration from ML (which is beautiful imo)
The sigils in Rust (and perl) are there to aid readability. After you use it a bit, you get used to ignoring them unless they look weird.
All the python programs I've had to maintain (I never choose python) have had major maintainability problems due to python's clean looking syntax. I can still look at crazy object oriented perl meta-programming stuff I wrote 20 years ago, and figure out what it's doing.
Golang takes another approach: They impoverished the language until it didn't need fancy syntax to be unambiguously readable. As a workaround, they heavily rely on codegen, so (for instance) Kubernetes is around 2 million lines of code. The lines are mostly readable (even the machine generated ones), but no human is going to be able to read them at the rate they churn.
Anyway, pick your poison, I guess, but there's a reason Rust attracts experienced systems programmers.
I think this is subjective, because I think Rust's syntax is (mostly) beautiful.
Given the constraint that they had to keep it familiar to C++ people, I'd say they did a wonderful job. It's like C++ meets OCaml.
Do you have any particular complaints about the syntax?
Aside from async/await which I agree is somewhat janky syntaxtically, I'm curious what you consider to be janky. I think Rust is overall pretty nice to read and write. Patterns show up where you want them, type inference is somewhat limited but still useful. Literals are readily available. UFCS is really elegant. I could go on.
Ironically, I find Python syntax frustrating. Imports and list comprehensions read half backwards, variable bindings escape scope, dunder functions, doc comments inside the function, etc.
What do people actually mean when they say "the syntax is janky"?
I often see comparisons to languages like Python and Kotlin, but both encode far less information on their syntax because they don't have the same features as Rust, so there's no way for them to express the same semantics as rust.
Sure, you can make Rust look simpler by removing information, but at that point you're not just changing syntax, you're changing the language's semantics.
Is there any language that preserves the same level of type information while using a less "janky" syntax?
Kotlin programmer here who is picking up Rust recently. you're right, it's no Kotlin when it comes to the elegance of APIs but it's also not too bad at all.
In fact there are some things about the syntax that are actually nice like range syntax, Unit type being (), match expressions, super explicit types, how mutability is represented etc.
I'd argue it's the most similar system level language to Kotlin I've encountered. I encourage you to power through that initial discomfort because in the process it does unlock a level of performance other languages dream of.
> Why are we pushing forward with a language that has a Perl-esque unreadability...?
The reason is the same for any language, Perl included (except those meme languages where obfuscation is a feature): the early adopters don't think it's unreadable.
> Is there a "Kotlin for Rust"?
While it's not a systems language, have you tried Swift?
Swift is as relevant to this discussion as Common Lisp.
On the contrary, Swift is very relevant on this subject. It has high feature parity with Rust, with a much more readable syntax.
But Swift is not "Kotlin for Rust" though, I can't see the connection at all. "Kotlin for Rust" would be a language that keeps you in the Rust ecosystem.
The commenter I replied to seems to like Kotlin. Swift is extremely close to Kotlin in syntax and features, but is not for the JVM. Swift also has a lot of similarities with Rust, if you ignore the fact that it has a garbage collector.
Ah yeah ok, makes sense in that way
Have you considered that part of it is not the language but the users?
I'm learning rust and the sample code I frequently find is... cryptically terse. But the (unidiomatic, amateurish) code I write ironically reads a lot better.
I think rust attracts a former c/c++ audience, which then bring the customs of that language here. Something as simple as your variable naming (character vs c, index vs i) can reduce issues already.
As an official greybeard who has written much in C, C++, Perl, Python, and now Rust, I can say Rust is a wonderful systems programming language. Nothing at all like Perl, and as others have mentioned, a great relief from C++ while providing all the power and low-level bits and bobs important for systems programming.
I would argue that anything that is not Lisp has a complicated syntax.
The question is: is it worth it?
With Rust for the answer is yes. The reliability, speed, data-race free nature of the code I get from Rust absolutely justifies the syntax quirks (for me!).
what makes it unreadable for you?
Legit question, really. A comparative study on language readability, using code doing the same thing written idiomatically in different languages, would be interesting. Beyond syntax, idioms/paradigm/familiarity should also play a role.
nta you're replying to, but as someone who doesn't know rust, on first glance it seems like it's littered with too many special symbols and very verbose. as i understand it this is required because of the very granular low level control rust offers
maybe unreadable is too strong of a word, but there is a valid point of it looking unapproachable to someone new
People often misuse unreadable when they mean unfamiliar. Rust really isn't that difficult to read when you get used to it.
I think the main issue people who don't like the syntax have with it is that it's dense. We can imagine a much less dense syntax that preserves the same semantics, but IMO it'd be far worse.
Using matklad's first example from his article on how the issue is more the semantics[1]
we can imagine a much less symbol-heavy syntax inspired by POSIX shell, FORTH, and Ada, and I think we'll all agree it would be much less readable even though the only punctuation is `=` and `.`. So "symbol heavy" isn't a root cause of the confusion; it's trivial to make worse syntax with fewer symbols. And I like RPN syntax & FORTH.

[1] https://matklad.github.io/2023/01/26/rusts-ugly-syntax.html
In your opinion how does Rust compare to C++ for readability?
> Every time I consider learning Rust, I am thrown back by how... "janky" the syntax is. It seems to me that we ought to have a system-level language which builds upon the learnings of the past 20+ years.
I said this years ago and I was basically told "skill issue". It's unreadable. I shudder to think what it's like to maintain a Rust system at scale.
You get used to it. Like any language.
I'm writing this as a heavy python user in my day job. Python is terrible for writing complex systems in. Both the language and the libraries are full of footguns for the novice and expert alike. It has 20 years of baggage, the packaging and environment handling is nothing short of an unmitigated disaster, although uv seems to be a minor light at the end of the tunnel. It is not a simple language at this point. It has had so many features tacked on, that it needs years of use to have a solid understanding of all the interactions.
Python is a language that became successful not because it was the best in its class, but because it was the least bad. It became the lingua franca of quantitative analysis because R was even worse and MATLAB was a closed ecosystem with strong whiffs of the 80s. It became successful because it was the least bad glue language for getting up and running with ML and later on LLMs.
In comparison, Rust is a very predictable and robust language. The tradeoff it makes is that it buys safety for the price of higher upfront complexity. I'd never use Rust to do research in. It'd be an exercise in frustration. However, for writing reliable and robust systems, it's the least bad currently.
What's wrong with R? I used it and liked it in undergrad. I certainly didn't use it as seriously as the users who made Python popular, but to this day I remember R fondly and would never choose Python for a personal project.
My R use was self-taught, as well. I refused to use proprietary software for school all through high school and university, so I used R where we were expected to use Excel or MatLab (though I usually used GNU Octave for the latter), including for at least one or two math classes. I don't remember anything being tricky or difficult to work with.
R is the most haphazard programming environment I've ever used. It feels like an agglomeration of hundreds of different people's shell aliases and scripting one-liners.
I'll grant my only exposure has been a two- or three-day "Intro to R" class but I ran screaming from that experience and have never touched it again.
It maybe worked against me that I am a programmer, not a statistician or researcher.
Python had already become vastly popular before ML/AI. Scripting/tools/apps/web/... Only space that hasn't entered is mobile.
> upon the learnings of the past 20+ years.
That's the thing though... Rust does build on many of those learnings. For starters, managing a big type system is better when some types are implicit, so Rust features type inference to ease the burden in that area. They've also learned from C++'s mistake of having a context sensitive grammar. They learned from C++'s template nightmare error messages so generics are easier to work with. They also applied learnings about immutability being a better default that mutability. The reason Rust is statically linked and packages are managed by a central repository is based on decades of seeing how difficult it is to build and deploy projects in C++, and how easy it is to build and deploy projects in the Node / NPM ecosystem. Pattern matching and tagged unions were added because of how well they worked in functional languages.
As for "Perl-esque unreadability" I submit that it's not unreadable, you are just unfamiliar. I myself find Chinese unreadable, but that doesn't mean Chinese is unreadable.
> Is there a "Kotlin for Rust"?
Kotlin came out 16 years after Java. Rust is relatively new, and it has built on other languages, but it's not the end point. Languages will be written that build on Rust, but that will take some time. Already many nascent projects are out there, but it is yet to be seen which will rise to the top.
> It seems to me that we ought to have a system-level language which builds upon the learnings of the past 20+ years
I mean, Rust does. It builds on 20+ years of compiler and type system advancements. The syntax is verbose if you include all the things you can possibly do. If you stick to the basics, it's pretty similar to most other languages. Hell, I'd say a lot of Rust syntax is similar to type-hinted Python.
Having said that, comparing a GC'd dynamic language to a systems programming language just isn't a fair comparison. When you need to be concerned about memory allocation you just need more syntax.
Perl’s most notable syntax feature is sigils on all variables.
So it’s strange to hear a comparison. Maybe there’s something I’m missing.
It seems closer to C++ syntax than Perl.
Does it really add any value to the conversation?
What are you talking about? Rust’s function signature and type declaration syntaxes are extremely vanilla, unless you venture into some really extreme use cases with lots of lifetime annotations and generic bounds.
I seriously don’t get it.
Where’s the “Perl-esqueness”?
That's just a weird and unrealistic example, though. Like, why is process_handler taking an owned, boxed reference to something it only needs shared access to? Why is there an unnecessary 'a bound on handler?
In the places where you need to add lifetime annotations, it's certainly useful to be able to see them in the types, rather than relegate them to the documentation like in C++; cf. all the places where C++'s STL has to mention iterator and reference invalidation.
LLMs LOVE to write Rust like this. They add smart pointers, options and lifetimes everywhere when none of those things are necessary. I don’t know what it is, but they love over-engineering it.
I agree that the signature for process_handler is weird, but you could steelman it to take a borrowed trait object instead, which would have an extra sigil.
The handler function isn't actually unnecessary, or at least, it isn't superfluous: by default, the signature would include 'a on self as well, and that's probably not what you actually want.
I do think that the example basically boils down to the lifetime syntax though, and yes, while it's a bit odd at first, every other thing that was tried was worse.
> The handler function isn't actually unnecessary, or at least, it isn't superfluous: by default, the signature would include 'a on self as well, and that's probably not what you actually want.
To clarify, I meant the 'a in `Box<dyn Handler + 'a>` in the definition of `process_handler` is unnecessary. I'm not saying that the <'a> parameter in the definition of Handler::handle is unnecessary, which seems to be what you think I said, unless I misunderstood.
Ah yes, I misunderstood you in exactly that way, my apologies.
Lifetimes really only come into play if you are doing something really obscure. Oftentimes when I’m about to add lifetimes to my code, I rethink it and realize there is a better way to architect it that doesn’t involve them at all. They are a warning sign.
now show me an alternative syntax encoding the same information
...
There's a deeper connection there: lifetimes are a form of type variable, just like in OCaml.
While I don’t disagree that this is at first blush quite complex, using it as an example also obscures a few additional details that aren’t present in something like python, namely monads and lifetimes. I think in absence of these, this code is a bit easier to read. However, if you had prior exposure to these concepts, I think that this is more approachable. I guess what I’m getting at here is that rust doesn’t seem to be syntactic spaghetti as much as it is a confluence of several lesser-used concepts not typically used in other “simpler” languages.
> > really extreme use cases with lots of lifetime annotations and generic bounds
You choose as your example a pretty advanced use case.
Which is the exact use case someone would choose rust for over other languages
No, the use cases of Rust are pretty much the same as the use cases of C++. Most Rust code shouldn't have objects with complicated lifetimes, just like most code in any language should avoid objects with complicated lifetimes.
Could have thrown a few uses of macros with the # and ! which threw me off completely while trying to read a Rust codebase as a non-Rust programmer.
That's simple even in Perl. The problem is when you start adding the expected idioms for real world problems.
Python users don’t even believe in enabling cursory type checking, their language design is surpassed even by JavaScript, should it really even be mentioned in a language comparison? It is a tool for ML, nothing else in that language is good or worthwhile
”[One] major contributor to APT suggested it would be better to remove the Rust code entirely as it is only needed by Canonical for its Launchpad platform. If it were taken out of the main APT code base, then it would not matter whether they were written in Rust, Python, or another language, since the tools are not directly necessary [for regular installations].”
Given the abundance of the hundreds of deb-* and dh-* tools across different packages, it is surprising that apt isn’t more actively split into separate, independent tools. Or maybe it is, but they are all in a monorepo, and the debate is about how if one niche part of the monorepo uses Rust then the whole suite can only be built on platforms that support Rust?
It’s like arguing about the bike shed when everyone takes the bus except for one guy who cycles in every four weeks to clean the windows.
If this could be done it seems like the ideal compromise. Everyone gets what they want.
That said, eventually more modern languages will be dependencies of the tools one way or another (and they should be). So Debian as a whole should probably come to a consensus on how that should happen, so it can happen in some sort of standard and fair fashion.
Interesting how instead of embracing Rust as a required toolchain for APT, the conversation quickly devolved into
"why don't we just build a tool that can translate memory-safe Rust code into memory-unsafe C code? Then we don't have to do anything."
This feels like swimming upstream just for spite.
>tool that can translate memory-safe Rust code into memory-unsafe C code
FWIW, there are two such ongoing efforts. One[1] is an alternative Rust compiler, written in C++, that emits C (in the project's words, high-level assembly); the other[2] is a Rust compiler backend/plugin (an extra goal beyond its initial aim of compiling Rust to CLR assembly). The latter is apparently[3] quite modular and could be adapted for other targets too. Other options are continuing/improving the GCC front-end for Rust, and a recent attempt to write a Rust compiler in C[4] that compiles to QBE IR, which can then be compiled with QBE/cc.
[1]: https://github.com/thepowersgang/mrustc [2]: https://github.com/FractalFir/rustc_codegen_clr [3]: https://old.reddit.com/r/rust/comments/1bhajzp/ [4]: https://codeberg.org/notgull/dozer
That's not what the comment said. It said, "How about a Rust to C converter?..." The idea was that using a converter could eliminate the problem of not having a rust compiler for certain platforms.
The problem is that rust is being shoved in pointless places with a rewrite-everything-in-rust mentality.
There are lunatics who want to replace basic Unix tools like sudo that have been battle-tested for ages, and the replacement has been a mess of bugs so far.
Instead, Rust should find its niche not in rewriting what works, but in tackling what doesn't.
FWIW sudo has been maintained by an OpenBSD developer for a while now but got replaced in the base system by doas. Independent of any concerns about Rust versus C, I don't think it's quite as unreasonable as you're claiming to consider alternatives to sudo given that the OS that maintains it felt that it was flawed enough to be worth writing a replacement for from scratch.
sudo had grown a lot of features and a complicated config syntax over the years, which ended up being confusing and rarely needed in practice. doas is a lot simpler. It wasn't just a rewrite of a flawed utility but a simplification of it.
sudo is not fully battle tested, even today. You just don't really see the CVEs getting press.
https://www.oligo.security/blog/new-sudo-vulnerabilities-cve...
Neither of those vulnerabilities look like rust would necessarily help however
> The problem is that rust is being shoved in pointless places with a rewrite-everything-in-rust mentality.
> There's lunatics ...
I think the problem is people calling developers "lunatics" and telling them which languages they must use and which software they must not rewrite.
Battle tested is not bulletproof: https://cybersecuritynews.com/sudo-linux-vulnerability/
Applying strict compile time rules makes software better. And with time it will also become battle tested.
Cue all those battle-tested programs where people keep finding vulnerabilities several decades after they were considered "done". You should try looking at the test results once in a while.
And by the way, we had to replace almost all of the basic Unix tools at the turn of the century because they were completely unfit for purpose. There aren't many left.
Calling it pointless comes across as jaded. It's not pointless.
Supporting Rust attracts contributors, and those contributors are much less likely to introduce vulnerabilities in Rust when contributing vs alternatives.
to introduce certain common vulnerabilities ...
not vulnerabilities in general.
Converting parsers to Rust is not "pointless". Doing string manipulation in C is both an awful experience and also extremely fertile ground for serious issues.
apt is C++
It’s very easy to write a string library in C which makes string operations high level (both in API and memory management). Sure, you shouldn’t HAVE to do this. I get it. But anyone writing a parser is definitely skilled enough to maintain a couple hundred lines of code for a linear allocator and a pointer plus length string. And to be frank, doing things like “string operations but cheaply allocated” is something you have to do ANYWAY if you’re writing e.g. a parser.
This holds for many things in C
This is just a variation of the "skill issue" argument.
If it were correct, we wouldn't see these issues continue to pop up. But we do.
> a pointer plus length
What would length represent? Bytes? Code points?
Anyway, I think what you are asking for already exists in the excellent ICU library.
And it's not a very easy thing to maintain. Unicode stuff changes more often than you might think and it can be political.
Issues that are battle tested from ages.
Sure, which is highly valuable information that hopefully made its way into a testing / verification suite. Which can then be used to rewrite the tool into a memory-safe language, which allows a lot of fixes and edge cases that were added over time to deal with said issues to be refactored out.
Of course there's a risk that new issues are introduced, but again, that depends a lot on the verification suite for the existing tool.
Also, just because someone did a port, doesn't mean it has to be adopted or that it should replace the original. That's open source / the UNIX mentality.
I seem to remember going through this with systemD in Ubuntu. Lots of lessons learned seemed to come back as "didn't we fix this bug 3 years ago?"
We need lisp, cobol, and java in apt, too. and firefox.
Is the apt package manager a pointless place? It seems like a pretty foundational piece of supply chain software with a large surface area.
The author of the rust software did not solve the platform problem, as a result it is not a solution. Since it is not a solution, it should be reverted. It's really that simple.
All compilers do anyways is translate from one language specification to another. There's nothing magical about Rust or any specific architecture target. The compiler of a "memory safe" language like Rust could easily output assembly with severe issues in the presence of a compiler bug. There's no difference between compiling to assembly vs. C in that regard.
The assumption here is that there exists an unambiguous C representation for all LLVM IR bitcode emitted by the Rust compiler.
To my knowledge, this isn’t the case.
> The assumption here is that there exists an unambiguous C representation for all LLVM IR bitcode emitted by the Rust compiler.
> To my knowledge, this isn’t the case.
Tell us more?
Source-to-source translation will be very hard to get right, because lots of things are UB in C that aren’t in Rust, and obviously vice versa.
Rust has unwinding (panics), C doesn’t.
For one, signed integer overflow is allowed and well-defined in Rust (the result simply wraps around in release builds), while it's Undefined Behavior in C. This means that the LLVM IR emitted by the Rust compiler for signed integer arithmetic can't be directly translated into the analogous C code, because that would change the semantics of the program. There are ways around this and other issues, but they aren't necessarily simple, efficient, and portable all at once.
You guys seem to be assuming transpiling to C means it must produce C that DTRT on any random C compiler invoked any which way on the other side, where UB is some huge possibility space.
There's nothing preventing it from being some specific invocation of a narrow set of compilers like gcc-only of some specific version range with a set of flags configuring the UB to match what's required. UB doesn't mean non-deterministic, it's simply undefined by the standard and generally defined by the implementation (and often something you can influence w/cli flags).
The gigantic difference is that assembly language has extremely simple semantics, while C has very complex semantics. Similarly, assembler output is quite predictable, while C compilers are anything but. So the level of match between the Rust code and the machine code you'll get from a Rust-to-assembly compiler will be much, much easier to understand than the match you'll get between the Rust code and the machine code produced by a C compiler compiling C code output by a Rust-to-C transpiler.
Rust developers are so dogmatic about their way being the best and only way that I just avoid it altogether. I've had people ask about Rust in issues/discussions in small hobby projects I released as open source - I just ban them immediately because there is no reasoning with them and they never give up. Open source terrorists.
"Open source terrorism" is a hilarious designation for Rust-like traditions and customs. I wonder what other programming language/software communities may fall under this definition?
Shouldn't we wait until Rust gets full support in GCC? This should resolve the issue with ports without a working Rust compiler.
I don't have a problem with Rust, it is just a language, but it doesn't seem to play along well with the mostly C/C++ based UNIX ecosystem, particularly when it comes to dependencies and package management. C and C++ don't have one, and often rely on system-wide dynamic libraries, while Rust has cargo, which promotes large dependency graphs of small libraries, and static linking.
I have never seen a program segfault and crash more than apt. The status quo is extremely bad, and it desperately needs to be revamped in some way. Targeted rewrites in a memory safe & less mistake-prone language sounds like a great way to do that.
If you think this is a random decision caused by hype, cargo culting, or a maintainer's/canonical's mindless whims... please, have a tour through the apt codebase some day. It is a ticking time bomb, way more than you ever imagined such an important project would be.
You know, it is easy to find this kind of nitpicking and seemingly eternal discussion over details exhausting and meaningless, but I do think it is actually a good sign and a consequence of "openness". In politics, authoritarianism tend to show a pretty façade where everyone mostly agrees (the reality be damned), and discussion and dissenting voice are only allowed to a certain extent as a communication tool. This is usually what we see in corporate development.
Free software are much more like democracy, everyone can voice their opinion freely, and it tends to be messy, confrontational, nitpicky. It does often lead to slowing down changes, but it also avoids the common pitfall of authoritarian regime of going head first into a wall at the speed of light.
What?
Opensource software doesn't have 1 governance model and most of it starts out as basically a pure authoritarian run.
It's only as the software ages, grows, and becomes more integral that it switches to more democratic forms of maintenance.
Even then, the most important OS code on the planet, the kernel, is basically a monarchy with King Linus holding absolute authority to veto the decision of any of the Lords. Most stuff is maintained by the Lords but if Linus says "no" or "yes" then there's no parliament which can override his decision (beyond forking the kernel).
"and not be held back by trying to shoehorn modern software on retro computing devices"
Nice. So discrimination of poor users who are running "retro" machines because that is the best they can afford or acquire.
I knew of at least two devs who are stuck with older 32 bit machines as that is what they can afford/obtain. I even offered to ship them a spare laptop with a newer CPU and they said thanks but import duties in their country would be unaffordable. Thankfully they are also tinkering with 9front which has little to no issues with portability and still supports 32 bit.
Looking at the list of affected architectures: Alpha (alpha), Motorola 680x0 (m68k), PA-RISC (hppa), and SuperH (sh4) I think these are much much more likely to be run by enthusiasts than someone needing an affordable computer.
The last 32-bit laptop CPU was produced nearly 20 years ago.
Further, there are still several LTS Linux distros (including the likes of Ubuntu and Debian) which don't have the Rust requirement and won't until the next LTS. 24.04 is supported until 2029, meaning you are talking about a 25-year-old CPU at that point.
And even if you continue to need support, Debian-based distros aren't the only ones on the planet. You can pick something else if it really matters.
No one is using an Alpha, Motorola 680x0, PA-RISC, or SuperH computer because that's the only thing they can afford. Rust supports 32bit x86.
Rust works fine on 32 bit, (and 16 bit) that’s not what they mean…
Rust even works on 8-bit via the LLVM-MOS backend for MOS 6502 :)
Poor people aren’t running exotic hardware.
You seem to be involved with 9front.
Are you trying to suggest there is a nontrivial community of people who cannot afford modern 64-bit Linux platforms, and opt for 9front on some ancient 32-bit hardware instead? Where are they coming from? Don't get me wrong, I love the 9 as much as the next guy, but you seem to paint it as some kind of affordability frontier...
The announcement says:
>In particular, our code to parse .deb, .ar, .tar, and the HTTP signature verification code would strongly benefit from memory safe languages and a stronger approach to unit testing.
I can understand the importance of safe signature verification, but how is .deb parsing a problem? If you're installing a malicious package you've already lost. There's no need to exploit the parser when the user has already given you permission to modify arbitrary files.
It is possible the .deb package is parsed to extract some metadata before being installed and before the signature is verified.
Also there is the aspect of defence in depth. Maybe you can compromise one package that itself can't do much, but the installer runs with higher privileges and has network access.
Another angle -- installed package may compromise one container, while a bug in apt can compromise the environment which provisions containers.
And then at some point there is the "oh..." moment when the holes in different layers align nicely to turn four "bad but not exploitable" bugs into a zero-day shitshow.
> It is possible the deb package is parsed to extract some metadata before being installed and before verifying signature.
Yes, .deb violates the cryptographic doom principle[1] (if you have to perform any cryptographic operation before verifying the message authentication code (or signature) on a message you’ve received, it will somehow inevitably lead to doom).
Their signed package formats (there are two) add extra sections to the `ar` archive for the signature, so they have to parse the archive metadata & extract the contents before validating the signature. This gives attackers a window to try to exploit this parsing & extraction code. Moving this to Rust will make attacks harder, but the root cause is a file format violating the cryptographic doom principle.
[1] https://moxie.org/2011/12/13/the-cryptographic-doom-principl...
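To make the attack window concrete, here is a toy sketch (not apt's actual parser) of walking the `ar` container that a .deb is built on. The member names and helper functions are illustrative; the 60-byte header layout follows the classic `ar` format. The point is that all of this parsing must happen before any signature over the contents could be checked:

```python
AR_MAGIC = b"!<arch>\n"

def parse_ar_members(data):
    """Walk an ar archive (the container format of .deb) and return
    (name, body) pairs. In the signed-package schemes discussed above,
    this kind of metadata parsing runs *before* signature verification."""
    assert data[:8] == AR_MAGIC, "not an ar archive"
    offset = 8
    members = []
    while offset + 60 <= len(data):
        header = data[offset:offset + 60]
        # Classic ar header: name(16) mtime(12) uid(6) gid(6) mode(8)
        # size(10) terminator(2) -- all ASCII, space-padded.
        name = header[0:16].decode("ascii").rstrip()
        size = int(header[48:58].decode("ascii").strip())
        body = data[offset + 60:offset + 60 + size]
        members.append((name, body))
        offset += 60 + size + (size % 2)  # members are 2-byte aligned
    return members

def make_ar(entries):
    """Build a minimal ar archive from (name, bytes) pairs (test helper)."""
    out = bytearray(AR_MAGIC)
    for name, body in entries:
        out += name.encode("ascii").ljust(16)
        out += b"0".ljust(12)              # mtime
        out += b"0".ljust(6)               # uid
        out += b"0".ljust(6)               # gid
        out += b"100644".ljust(8)          # mode
        out += str(len(body)).encode().ljust(10)
        out += b"`\n"                      # header terminator
        out += body
        if len(body) % 2:
            out += b"\n"                   # pad to even offset
    return bytes(out)
```

Every field extraction above (the `int(...)` on the size field especially) is code an attacker gets to exercise with unverified input, which is exactly the exposure a memory-safe parser reduces.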
The parser can run before the user is asked for permission to make changes. The parsed metadata can then discourage the user from installing the package (e.g. because of extremely questionable dependencies).
Dependencies are probably in the apt database and do not need parsing, but not everything is, or perhaps apt can install arbitrary .deb files now?
.deb is a packaging format like any other. There are plenty of reasons for parsing without running the code inside them.
I have a dual pentium pro 200 that runs gentoo and openbsd, but rust doesn't ship i586 binaries, only i686+. So I would need to compile on a separate computer to use any software that is using rust.
There is already an initrd package tool I can't use since it is rust based, but I don't use initrd on that machine so it is not a problem so far.
The computer runs modern Linux just fine, I just wish the Rust team would at least release an "i386" bootstrap binary that actually works on all i386 like all of the other compilers.
"We don't care about retro computers" is not a good argument imho, especially when there is an easy fix. It was the same when the Xorg project patched out support for RAMDAC and obsoleted a bunch of drivers instead of fixing it easily. I had to fix the S3 driver myself to be able to use my S3 trio 64v+ with a new Xorg server.
/rant off
This sounds like it's fun. However, I have to ask, why should the linux world cater to supporting 30 year old systems? Just because it scratches an itch?
You can grab a $150 NUC [1] which will run circles around this dual Pentium Pro system while also using a fraction of the power.
You obviously have to do a lot of extra work, including having a second system, just to keep this old system running. More work than it'd take to migrate to a new CPU.
[1] https://www.amazon.com/KAMRUI-AK1PLUS-Processor-Computer-Eth...
The system is actually running fine standalone since I have been able to avoid rust software.
As to why it should cater to it, it's more that there is no need to remove something that already works just to remove it.
It is possible to compile rustc on another system so it supports i586 and below. Just a small change in the command line options. And it doesn't degrade the newer systems.
I have plenty of faster machines, I just enjoy not throwing things away or making odd systems work. It's called having fun :)
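For the curious, a rough sketch of what cross-building for a pre-i686 machine looks like today (assuming a host with rustup installed; `i586-unknown-linux-gnu` is a real tier-2 Rust target, though full rustc-bootstrap details for a distro differ from this):

```shell
# Cross-compile a Rust program for pre-i686 x86 from a faster host.
# This covers building programs, not rebuilding the rustc toolchain
# itself, which the comment above alludes to.
rustup target add i586-unknown-linux-gnu
cargo build --release --target i586-unknown-linux-gnu
# The i586 target's baseline avoids the SSE2-era instructions that
# break the stock i686 binaries on older CPUs.
```

This is roughly the "small change in the command line options" in question: the pain is not capability but that no pre-built toolchain binaries ship for these baselines.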
Surely retro hardware is fine with retro software.
I thought the Pentium Pro _was_ a 686?
Wikipedia seems to correlate: https://en.wikipedia.org/wiki/Pentium_Pro, as do discussions on CMOV: https://stackoverflow.com/a/4429563
Yes, sorry I remembered incorrectly. The rust compiler claims to be i686 and the CPU is i686 too, but the rust compiler is using Pentium 4 only instructions so it doesn't actually work for i686.
Yeah, that sucks. I assume this is SSE2?
It does look like there are legitimate issues with x87 floating-point: https://github.com/rust-lang/rust/issues/114479
That is correct :)
Edit: I see from the sister post that it is actually LLVM and not Rust, so I'm half barking up the wrong tree. But somehow this is not an issue with GCC and friends.
Pentium Pro is the first i686 CPU, so you should be fine.
I mean... Pentium Pro is 30 years old at this point. I don't think it's unreasonable that modern software isn't targeting those machines.
Sometimes you do wonder if those 4chan memes about those who push rust rewrites are just memes or what..
A maintainer of a project making a decision about their project is not someone pushing a re-write.
This thing gets everywhere.
This is just one reason I'm not the biggest fan of Rust. The language is good (as well as what it solves), but this tendency to force it into everything (even where it would provide no benefit whatsoever) is just mind-boggling to me. And the Rust evangelists then wonder why there are so many anti-rust folk.
Related:
Hard Rust requirements from May onward
https://news.ycombinator.com/item?id=45779860
Maybe there's room for a "Future Debian" distro that phases out old tech and introduces new features?
Isn't that literally what debian unstable is for?
Or maybe old devices and tech should expect a limited support window, or be expected to fork after some time?
It sounds like all of the affected Debian ports are long since diverged from the official Debian releases anyway:
> The sh4 port has never been officially supported, and none of the other ports have been supported since Debian 6.0.
Wikipedia tells me Debian 6 was released on 6 February 2011
[dead]
[flagged]
> What is it about Rust fanatics [....]
The universalization from one developer's post to all Rust "fanatics" is itself an unwelcome attack. I prefer to keep my discussion as civilized as possible.
Just criticize the remark.
I read that more as "here's a perfect example of something I'd noticed already" rather than "wow this is a terrible first impression your group is making".
Perhaps this reading is colored by how this same pair of sentiments seems to come up practically every single time there's a push to change the language for some project.
[flagged]
I think you'll experience some pushback on the assertion that that particular quote has a lot of arrogance or disdain in it.
Building large legacy projects can be difficult and tapping into a thriving ecosystem of packages might be a good thing. But it's also possible to have "shiny object" or "grass is greener" syndrome.
“If you maintain a port without a working Rust toolchain, please ensure it has one within the next 6 months, or sunset the port.”
If that’s not arrogant, I don’t know what is.
Is it arrogant or a clear and straightforward announcement that a Decision has been made and these are the consequences? I'm not seeing any arrogance in the message myself.
How is this arrogant? Are open source developers now responsible for ensuring every fork works with the dependencies and changes they make?
This seems like a long window, given to ports to say, "we are making changes that may impact you, heads up." The options presented are, frankly, the two primary options "add the dependency or tell people you are no longer a current port".
"Arrogant" does not mean "forceful" or "assertive" or "makes me angry".
This is forceful, assertive, and probably makes people angry.
Does the speaker have the authority to make this happen? Because if so, this is just a mandate and it's hard to find some kind of moral failing with a change in development direction communicated clearly.
> I think you'll experience some pushback on the assertion that that particular quote has a lot of arrogance or disdain in it.
It's just a roundabout way of saying "anything that isn't running Rust isn't a REAL computer". Which is pretty clearly an arrogant statement, I don't see any other way of interpreting it.
Be real for a second. People are arguing against Rust because it supports fewer target architectures than GCC. Which of the target architectures do you believe is important enough that it should decide the future development of apt?
I read it as a straightforward way of saying "support for a few mostly unused architectures is all that is holding us back from adopting rust, and adopting rust is viewed as a good thing"
Is it the borrow checker? Normally Rust has your back when it comes to memory oopsies. Maybe we need a borrow checker for empathy..
From the outside it looks like a defense mechanism from a group of developers who have been suffering crusades against them ever since a very prolific C developer decided Rust would be a good fit for the rather successful project he created in his youth.
Maybe they are just really tired of having to deal with people who constantly object and throw every possible obstacle they can on the way.
Maybe they wouldn't experience so much pushback if they were more humble, had more respect for established software and practices, and were more open to discussion.
You can't go around screaming "your code SUCKS and you need to rewrite it my way NOW" at everyone all the time and expect people to not react negatively.
> You can't go around screaming "your code SUCKS and you need to rewrite it my way NOW"
It seems you are imagining things and hate people for the things you imagined.
In reality there are situations where during technical discussions some people stand up and with trembling voice start derailing these technical discussions with "arguments" like "you are trying to convince everyone to switch over to the religion". https://youtu.be/WiPp9YEBV0Q?t=1529
That’s also not something anybody has actually said.
While no one has explicitly said that, it is the implied justification of rewriting so much stuff in rust
I disagree very strongly that a suggestion to change something is also a personal attack on the author of the original code. That’s not a professional or constructive attitude.
Are you serious? It's basically impossible to discuss C/C++ anymore without someone bringing up Rust.
If you search for HN posts with C++ in the title from the last year, the top post is about how C++ sucks and Rust is better. The fourth result is a post titled "C++ is an absolute blast" and the comments contain 128 (one hundred and twenty eight) mentions of the word "Rust". It's ridiculous.
Lots of current and former C++ developers are excited about Rust, so it’s natural that it comes up in similar conversations. But bringing up Rust in any conversation still does not amount to a personal attack, and I would encourage some reflection here if that is your first reaction.
To be clear, the "you" and "my" in your sentence refer to the same person. Julian appears to be the APT maintainer, so there's no compulsion except what he applies to himself.
(Maybe you mean this in some general sense, but the actual situation at hand doesn't remotely resemble a hostile unaffiliated demand against a project.)
> Julian appears to be the APT maintainer, so there's no compulsion except what he applies to himself.
To whom is this addressed?
> If you maintain a port without a working Rust toolchain, please ensure it has one within the next 6 months, or sunset the port.
Because that sure reads as a compulsion to me.
The endless crusades are indeed tiresome.
Yes, the immediate and endless backlash we get whenever anybody says the word "Rust" is quite tiresome.
No, honestly, Rust just has a really crappy attitude and culture around it. Even as a person who should naturally like Rust (and I do plan to learn it despite that), I find these people really grating.
[flagged]
And just like vegans, their detractors are far more vocal in reality.
>reality

In reality everyone eats meat because it's what the human body evolved to consume. There's nothing to detract.
Untrue.
As evidenced by this very comment chain. I've seen, by far, way more comments from people annoyed by vegans. I can't even remember the last time I heard a vegan discuss it outside of just stating the food preference when we go out to eat.
Actually, a vegan has to preach to some degree; otherwise it would be like a human rights advocate looking away when humans are tortured.
As a vegetarian on ethical grounds (mostly due to factory farming of meat) I politely disagree with your assessment.
I have to decline and explain in social settings all the time, because I will not eat meat served to me. But I do not need to preach when I observe others eating meat. I, like all humans, have a finite amount of time and energy. I'd rather spend that time focused on where I think it will do the greatest good. And that's rarely explaining why factory farming of meat is truly evil.
The best time is when someone asks, "why don't you eat meat?" Then you can have a conversation. Otherwise I've found it best to just quietly and politely decline, as more often than not one can be accommodated easily. (Very occasionally, though, someone feels it necessary to try and score imaginary points on you because they have some axe to grind against vegetarians and vegans. I've found it best to let them burn themselves out and move on. Life's too short to worry about them.)
Frankly, I more often see meat eaters get defensive. We go to a restaurant, the vegan guy gets a meatless meal. The vegan guy gets bombarded with "Oh, you don't eat meat?" "Why?" "What's wrong with eating meat?" "I just like having a steak now and then."
That's a bit of a jump. Veganism is a personal lifestyle / dietary choice. Objecting to livestock is activism. You can do either without the other.
It's not just a dietary choice, and it's a personal lifestyle in the sense of it being your choice, but not in the sense of a lifestyle limited to your private space.
You think it's wrong to abuse animals. Why would you apply that only to yourself and think it would be OK for others to abuse them? You wouldn't.
Why is this still a discussion?
> was no room for a change in plan
yes, pretty much
At least the questions about it breaking unofficial distros, mostly related to some long-discontinued architectures, should never affect how a distro focused on current desktop and server usage develops.
If you have worries/problems beyond unsupported things breaking, then it should be obvious that you can discuss them; that is what the mailing list is for, and that is why you announce intent beforehand instead of just putting things in the changelog.
> complained that Klode's wording was unpleasant and that the approach was confrontational
It's mostly just very direct communication; in a professional setting that is preferable, IMHO. I have seen too much time wasted on misunderstandings because people didn't say things directly out of fear of offending someone.
Though he still could have done better.
> also questioned the claim that Rust was necessary to achieve the stronger approach to unit testing that Klode mentioned:
Given the focus on Sequoia in the mail, my interpretation was that this is less about writing unit tests and more about using some (AFAIK) very well-tested dependencies. But even when it comes to writing code, in my experience the ease with which you can write tests hugely affects how much it's done, and Rust makes it very easy and convenient to unit test everything all the time. That is, if we speak about unit tests; other kinds of tests are still nice, but not quite at the same level of convenience.
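As a minimal illustration of what "convenient" means here (the function is a made-up helper, not apt code): Rust unit tests live in the same file as the code they test, behind a `cfg` flag, and run with `cargo test` with no extra framework or build wiring:

```rust
/// Parse a "major.minor" version string (hypothetical example helper).
fn parse_version(s: &str) -> Option<(u32, u32)> {
    let (major, minor) = s.split_once('.')?;
    Some((major.parse().ok()?, minor.parse().ok()?))
}

// Unit tests sit right next to the code, compiled only for `cargo test`.
#[cfg(test)]
mod tests {
    use super::parse_version;

    #[test]
    fn accepts_well_formed_versions() {
        assert_eq!(parse_version("1.42"), Some((1, 42)));
    }

    #[test]
    fn rejects_garbage() {
        assert_eq!(parse_version("not a version"), None);
    }
}

fn main() {
    // Running the binary directly just demonstrates the function.
    println!("{:?}", parse_version("2.9"));
}
```

Because the tests are zero-friction and colocated, the edge cases (malformed input, overflow, etc.) tend to actually get written down, which is the "stronger approach to unit testing" argument in a nutshell.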
> "currently has problems with rebuilding packages of types that systematically use static linking"
That seems like a _huge_ issue even outside of Rust; no reliable Linux distro should have problems reliably rebuilding things after security fixes, no matter how they're linked.
If I were to guess, this might be related to how the lower levels of dependency management on Linux are quite a mess, due to requirements from the '90s that are no longer relevant today but which some people still obsess over.
To elaborate (sorry for the wall of text): you can _roughly_ fit all dependencies of an application (app) into 3 categories:
1. programs the system provides (optionally), called by the app (e.g. over IPC, or by spawning a sub-process), communicating over well-defined, non-language-specific protocols. E.g. most command-line tools, or your system's file picker/explorer, should be invoked like that (that they often aren't is a huge annoyance).
2. libraries the system needs to provide, called using a programming language ABI (Application Binary Interface, i.e. mostly the C ABI; can have platform-dependent layout/encoding)
3. code reused to not rewrite everything all the time, e.g. hash maps, algorithms etc.
The messy part in Linux is that, for historic reasons, the latter two categories were not treated differently even though they have _very_ different properties wrt. the software life cycle. Dependencies in the last category are for your code and your specific use case only! The supported versions usable with your program are often far more limited: breaking changes are far more normal; LTO is often desirable or even needed; other programs needing different, incompatible versions is the norm; even versions with security vulnerabilities can be fine _iff_ the vulnerabilities are on code paths not used by your application; etc. The fact that Linux has a long history of treating them the same is IMHO a huge fuck-up.
It made sense in the '90s. It hasn't for the last ~20 years.
It's just completely in conflict with how software development works in practice, and this has put a huge amount of strain on OSS maintainers, due to stuff like distros shipping incompatible versions, potentially by (even incorrectly) patching your code... and end users blaming you for it.
IMHO Linux should have a way to handle such application-specific dependencies in all cases, from scripting dependencies (e.g. Python), over shared objects, to static linking (which doesn't need any special handling outside of the build tooling).
People have estimated the storage size difference of linking everything statically, and AFAIK it's irrelevant in relation to storage availability and pricing on modern systems.
And the argument that you might want to use a patched version of a dependency "for security reasons" fails if we consider that this has led to security incidents more than once. Most software isn't developed to support this at all, and bugs can be subtle and bad to the point of an RCE.
And yes, there are special cases and gray areas in between these categories.
E.g. there are dependencies in the 3rd category you want to be able to update independently, or dependencies from the 2nd which are often handled like the 3rd for various practical reasons, etc.
Anyway, coming back to the article: Rust can handle dynamic linking just fine, but only via the C ABI as of now. And while Rust might get some form of Rust ABI to make dynamic linking better, it will _never_ handle it for arbitrary libraries, as that is neither desirable nor technically possible.
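A sketch of what that C-ABI boundary looks like in practice (the function name and checksum are invented for illustration): a Rust crate built as a `cdylib` exposes plain `extern "C"` symbols, and only C-compatible types (raw pointers, integers) can cross; generics and trait objects cannot:

```rust
// Built as a shared library with crate-type = ["cdylib"] in Cargo.toml;
// only C-compatible types can cross this boundary.
#[no_mangle]
pub extern "C" fn toy_checksum(data: *const u8, len: usize) -> u64 {
    // Safety: the caller must pass a valid pointer/length pair.
    let bytes = unsafe { std::slice::from_raw_parts(data, len) };
    bytes
        .iter()
        .fold(0u64, |acc, &b| acc.wrapping_mul(31).wrapping_add(u64::from(b)))
}

fn main() {
    // Calling it from Rust directly, just to demonstrate.
    let msg = b"hello";
    println!("{}", toy_checksum(msg.as_ptr(), msg.len()));
}
```

Everything richer than this (slices, `String`, generic functions) gets monomorphized or has an unstable layout, which is why distro-style "swap the .so underneath" updates can't work for arbitrary Rust libraries.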
---
EDIT: Just for context: in the case of C you also have to rebuild everything that uses header-only libraries or pre-processor macros; not doing so is risky, as you'd mix different versions of the same software in one build. Same (somewhat) for C++ with anything using template libraries. The way you can speed it up is by caching intermediate build artifacts, and that works for Rust, too.
I hate learning new things. It sucks. Also, I hate things that make my knowledge of C++ obsolete. I hate all the people that are getting good at rust and are threatening to take away my job. I hate that rust is a great leveler, making all my esoteric knowledge of C++ that I have been able to lord over others irrelevant. I hate that other people are allowed to do this to me and to do whatever they want, like making the decision to use rust in apt. It’s just sad and crazy to me. I can’t believe it. There are lots of people like me who are scared and angry and we should be able to control anyone else who makes us feel this way. Wow, I’m upset. I hope there is another negative post about rust I can upvote soon.
Think tech space isn’t for you if you hate learning new things.
Can you confirm these C++ fascists you speak of are in the room with you right now?