FWIW, maintaining a language is really, really hard. That said, this is going to kill adoption until it's either addressed or forked. While I doubt commercial adoption is a major driving factor for D, no informed tech lead in their right mind should approve or endorse this language for any commercial stack.
What troubles me more is the attitude towards the reports. Or hiding behind "hey you can always give us money to do what you want." The maintainers are absolutely right in principle that they really answer to no one and donate their time, but this plus the constant breaking changes is going to burn through any trust left. Things aren't perfect, people aren't perfect, shit breaks, mistakes are made; getting defensive about it isn't going to make things better.
marcosdumay 670 days ago [-]
> The maintainers are absolutely right in principle that they really answer to no one and donate their time
Well, it makes the project unprofessional. Which is perfectly ok, but then you don't want to use it professionally.
They have every right to that choice, and they state it very clearly, so nobody has any right to complain. But yeah, it is a very good reason for not using the language.
destructionator 670 days ago [-]
> Well, it makes the project unprofessional.
Next time I hire a professional, I'll be sure to call them unprofessional when they hand me a bill. A real professional doesn't expect to be paid!
marcosdumay 670 days ago [-]
Oh, sure, next time I talk about a project, you can make sure to extend it to anybody involved too. You can even extend it to a country or some other random group; it's a proven way to spice up the discussion.
cassepipe 670 days ago [-]
To be fair, this is not Walter Bright responding, and I doubt the person is saying "everything has to be bought" so much as "you're not entitled to my free work".
And I say this as someone who considers that the free market must be constrained by strict, enforced rules to prevent the negative societal externalities of the economic jungle.
ht85 670 days ago [-]
Not being entitled is a given, but healthy open source projects are driven by the passion of the community to create something they are proud of.
A prominent member of the community immediately steering the discussion towards the workload and "who is going to pay for it" is very unappealing — to me at least.
cassepipe 669 days ago [-]
Fully agree. If you complain about some specific usability pain point you encounter as a user, you are providing valuable feedback, and you shouldn't be dismissed with a handwavy "Oh yeah? Well, are you even donating?"
It's almost an admission of guilt
badsectoracula 670 days ago [-]
I've considered using D several times instead of Free Pascal because D has much stronger metaprogramming functionality, which some of my projects really needed. But this stance towards backwards compatibility is one of the two main reasons I didn't (the other being that Free Pascal has a great IDE and framework in Lazarus/LCL, whereas the D community never seemed to get behind a single one to polish it - and amusingly, the only one in the wiki that doesn't look dead is actually written in Free Pascal and Lazarus).
FPC "solves" the backwards compatibility issue by simply not breaking existing code: any potentially conflicting functionality is enabled with compiler directives. This way you can still use code you wrote more than a decade ago without any modifications, while at the same time using brand new language functionality on a per-source level (which also lets you bring a larger codebase up to speed piece by piece instead of forcing you to upgrade everything in one huge step).
This is IMO the best approach for evolving a language without breaking existing code. Does it make the first few lines of a source file look weird, with all the directives that enable functionality? Sure it does, but that is far preferable to having code you spent time on break. Breaking it shows a complete disregard for the time your users have invested learning and working with your language. We are not immortal, after all.
lumost 670 days ago [-]
In my experience, this is the main problem with choosing an off-the-beaten-path language/database/library X.
For all the benefits this X brings, you have to deal with either being a steward of X - or dealing with the aftermath of whatever decisions X made. In a mainstream X, you get a collective averaging effect from the majority of users who simply want their stuff to keep working.
mhd 670 days ago [-]
> In my experience, this is the main problem with choosing an off-the-beaten-path language/database/library X.
It increasingly is also the case with regular libraries and programming languages. This was somewhat accepted for the former, outside of the most static enterprise environments or OS standard libraries, but is somewhat new for programming languages. Mostly because they're now managed like products, and actually are products: a single implementation, no standardization effort attempted. Thus "move fast and break things" and ever-increasing scope creep are part of the deal. If you aren't adding some feature, you seem dead.
Granted, within that "late stage PL design" area you've got different degrees of severity, especially given how all those new features relate to backwards compatibility. TypeScript and Rust never got the bad reputation that D had, for example (mostly relating to the old debates, when D was still at an earlier stage of development and GC vs non-GC was a big issue; I can't say anything about this recent debate).
Waterluvian 670 days ago [-]
> It increasingly is also the case with regular libraries and programming languages.
Let me posit this, without actually feeling sure that this is correct:
As a rough example, say there’s 100 hardened libraries and languages that meet the reliable and slow and steady sensibilities. Now consider that programming becomes less and less niche, and in addition to those 100 libraries, we also have 500 libraries of dubious dev attitude and quality.
People will feel that quality is falling off a cliff when they experience it as a whole. But really there’s just more choice, most of which you can just ignore.
I wonder the same thing about people’s feelings that programmers are getting worse or that everyone’s just a web dev these days. I’m gonna guess there are more “”serious programmers”” than ever before. They just make up a smaller slice of the pie.
So I wonder if this is actually true and can be measured beyond a handful of anecdotes, or if we’re just being biased by a growing selection of “move fast / I’m not too serious about this” libraries?
mhd 670 days ago [-]
So, basically Sturgeon's law, where 80% of everything is crud, and now we've got a huge crud heap, but also lots of shining gems beside it?
I'm not so sure about that, mostly because I'd say the actual quality of something is orthogonal to the featuritis it so often exhibits. For one, all the shiny new features of something might actually be very good. And a change breaking compatibility might be the right design choice - getting it right the first time or dragging Sisyphus's rock along for eons shouldn't be the only two possible outcomes.
It's a trade-off, especially when you consider the other 80/20 split here, where 80% use only 20% of features, but different 20%...
In these days of easy release cycles (just tag a release, finish a sprint, etc.), easier communication (issue trackers, emails, social media) and easier contribution (open source, pull requests), we're just upstreaming more workarounds and library additions that previously would've stayed in your local projects.
I have an easier time accepting this in libraries. They're malleable, maybe even ephemeral objects you handle when working on your project. Maybe a bit loosey-goosey these days, never mind the tendency to hop between frameworks/libraries, but well, we never got our true component system where you can just extend existing libraries that then remain untouched.
But languages operating in the same way still rub me the wrong way. I'd prefer a more solid foundation, not mutating turtles all the way down. But after C++ unleashed the feature genie, it's hard to put it back in the bottle.
pjmlp 669 days ago [-]
> Mostly because they're now managed like products, and actually are products: a single implementation, no standardization effort attempted. Thus "move fast and break things" and ever-increasing scope creep are part of the deal. If you aren't adding some feature, you seem dead.
That has always been the case, even more so when all programming languages were commercial products and newer versions had to be sold.
mhd 669 days ago [-]
Well, you still had to ship floppies, which kinda limits your pace. And with heavily fragmented platforms, standards – informal or formal – helped, as one vendor didn't cover everything. And thus compatibility with someone else's Pascal/Cobol/C compiler actually helped sales.
badsectoracula 670 days ago [-]
It is more of a culture thing than an off-the-beaten-path thing. Some language communities care more about backwards compatibility than others; e.g., one famous example often brought up here is Common Lisp, and that is certainly far from mainstream.
On the other hand while JavaScript-the-language is indeed backwards compatible, the culture around it is anything but - and that is despite JavaScript being as mainstream as it can get.
waselighis 670 days ago [-]
I used D for a while several years ago. The language was always a huge mess. The compiler and standard library were riddled with bugs; D was the only language I ever used where "it's a compiler bug" was actually a plausible explanation. D was a mess of features that didn't play well together; things were added with little thought or care for how they'd fit with existing language features. I speak in the past tense because I haven't paid close attention to D's development in several years, but it seems like every time D pops up in some news feed, it's never good. Seeing posts like this, it seems things haven't improved much. I've seen a few posts like this over the years complaining about minor releases introducing breaking changes.
I understand building a language and compiler is hard. However, D is over 20 years old at this point. It's had plenty of time to mature despite being a small community project. There are much younger languages that are far more stable. Minor releases should not be introducing so many backwards-incompatible changes. D is actually a pretty nice language with a lot of great features, but I would never recommend it to anybody.
jerf 670 days ago [-]
Throughout my career, my appreciation and support for backwards compatibility has only grown, monotonically. It seems to be an ever-increasing problem: as we solve a lot of the other problems around using libraries, this one gets bigger and bigger.
However, I don't know how things are actually supposed to advance in such an environment. At the programming language level, my answer is, develop new languages. Existing languages should not klunk along and keep adding feature after feature trying to chase the state of the world 20-30 years after they were initially designed. But even if you accept that statement from me, which I know many of you won't and I'm OK with that, that's still not a very big help anyhow. I don't have a language I'm maintaining, and neither do most of you. I have some apps and libraries. What am I supposed to do with them as deficiencies in their underlying foundation are revealed?
To which I simply don't have an answer. My little team has two UIs we've inherited, both in super deprecated languages or UI frameworks that we can barely work on anymore, but we can't afford to be rewriting them every four years. A nontrivial part of the reason they're still staring at us is that we're reluctant to put a lot of time into upgrading them, only to discover 2-4 years later that we upgraded one obsolete stack we can't actually change anymore into another obsolete stack we can't change without a rewrite.
The only thing I can say is that I see the programming world coming to a greater understanding every year that every dependency is a bigger long-term cost than meets the eye, and thumbs up to that; it's something we need to continue to appreciate collectively. It isn't a solution, but at least it leads to little improvements, and lots of little improvements are still better than nothing.
tialaramex 670 days ago [-]
> However, I don't know how things are actually supposed to advance in such an environment.
Gradually.
Rust has lots of examples of this, but look at std::mem::uninitialized::<Thing>() https://doc.rust-lang.org/std/mem/fn.uninitialized.html -- back in 2015 the assumption was: OK, surely I just mark this memory as a Thing, then scribble over the values in memory so that it makes sense as an actual Thing, and we're all good. Right? Wrong. That's actually definitely nonsense.
In 1.36 Rust got a type to do this properly, std::mem::MaybeUninit<Thing>. Now we say: OK, this memory might be a Thing, but for now it's just uninitialized. Then we scribble over whatever is necessary to make a Thing, and then we call MaybeUninit::assume_init() to promise that this is a Thing now. That's type correct; the compiler won't get confused and assume it's a Thing until it really is a Thing. In 1.39 Rust deprecated std::mem::uninitialized().
But if you wrote code back then and never updated it, your code is no worse than it was back then; you don't need to rewrite it (though if it uses this deprecated function you should consider doing so). It's just now indirectly calling MaybeUninit::uninit(), doing some best-effort work to tell the compiler "please don't hurt this programmer", and then calling MaybeUninit::assume_init() to get the result you asked for. Even though that might be a bad idea, it is what you asked for.
lenkite 669 days ago [-]
Languages need to break stuff once every X years or they will die. Witness C++, which is currently in a death spiral. Thanks to permanent ABI and language compatibility requirements, even genius-level people find it impossible to land workable language/stdlib improvements. There is always an edge case that breaks compatibility, which is unacceptable. Most folks just give up after the Nth rejection of their feature proposal.
And the performance of many components of the standard library is now laughable. C++ std::regex is slower than most interpreted languages' regex libraries. It is utterly embarrassing for a language that prides itself on performance. This keep-it-backward-compatible-till-the-heat-death-of-the-universe attitude of the standards committee is pushing folks to create successor languages that transpile to C++ - or to just move to Rust.
It's the perfect example of how prioritizing backward compatibility over everything else is effectively suicide for a language.
jerf 669 days ago [-]
C++'s problem isn't that it stopped breaking stuff. C++'s problem, or as I like to call it, Katamari Dama-C++, is that it needed to freeze a long time ago and they just needed to build a new language. They're exactly who I am thinking of in terms of just making a new language.
By language scales, Rust is exploding onto the scene and eating C++'s lunch precisely because the committee won't stop mucking with C++. You might say that in absolute terms Rust is still not that much code, and that's true, but in terms of how languages tend to turn over, Rust is blowing C++ away - much faster than the usual such process.
There are other languages that just need to accept that they are what they are, and that endlessly adding features only makes their situation worse. Looking at you, Python. The community should have accepted that Python can not be multithreaded and doesn't have a good async solution, and people who needed those things should have started moving to other languages, instead of Python accreting an ever-increasing pile of features flailing about trying to solve these deep, fundamental problems and convincing a whole bunch of people that if they just hang on for another year, 3 years, 5 years, maybe a bit longer, Python will someday solve their problem. Now they're years into that process and would have been better off switching years ago instead of sitting on an increasingly complicated language, and Python would have been better off being a good Python instead of trying to chase the kids.
Some of the issues regarding ABI are caused by the compiler vendors themselves; ISO C++, just like ISO C, says nothing about ABI.
The same applies to stuff like std::regex: compiler vendors are the ones to blame.
In any case, there is plenty of C++98 code that won't compile under C++23, because even ISO C++ does break backwards compatibility.
p0nce 669 days ago [-]
> To which I simply don't have an answer.
D does it with gradual deprecation, and not too much at once; it's the best approach to deprecating stuff (you don't need to fix your code immediately). But there is always residual complaining from people who assume maintaining software should take no work. In reality, someone maintains your compiler/runtime, and their needs are part of your stack.
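To illustrate, a minimal sketch of what that cycle looks like from the user's side (the names here are invented):

    // A library author marks the old symbol. Callers keep compiling
    // for several releases, with a deprecation message, before it
    // ever becomes a hard error.
    deprecated("use newApi instead")
    int oldApi(int x) { return x + 1; }

    int newApi(int x) { return x + 1; }

    void main()
    {
        // Emits "Deprecation: ... use newApi instead" at compile
        // time today; you migrate on your own schedule.
        auto y = oldApi(41);
    }

And dmd can silence deprecations (-d) or promote them to errors (-de), which is what makes the gradual part workable.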
Deukhoofd 670 days ago [-]
So if I'm reading this correctly the Dlang syntax has non-backwards compatible changes every couple months? That's absurd, how would you even develop for that?
1980phipsi 670 days ago [-]
It's usually the result of deprecations that produce warnings for several releases before being turned into errors. But there have been a few more deprecations lately than previously. In at least one case (disabling "alias this" for classes), I think Walter expressed some sympathy, since there isn't always an obvious replacement.
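For context, a hedged sketch of the pattern being deprecated (the types are invented for illustration):

    // `alias this` lets a type implicitly convert to one of its
    // members. It stays fine for structs; its use in classes is
    // what newer compilers now deprecate.
    class Meters
    {
        double value;
        this(double v) { value = v; }
        alias value this;
    }

    void takesDouble(double d) {}

    void main()
    {
        auto m = new Meters(3.0);
        takesDouble(m); // implicit conversion through alias this
    }

The awkward part is that there's no drop-in replacement: you end up writing explicit conversion methods and touching every call site.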
destructionator 670 days ago [-]
I have D code I wrote back in 2008 that still works fine today.
That said, there are often small breakages scattered around, and it can be annoying. If it's your own code, you spend the 5-10 minutes to update it (with some exceptions - the last couple of months have been particularly bad, but most releases aren't like that). But if it's someone else's code and you have to wait for them to update it and push a new version tag, and that has to cascade through the dependency list, it can suck.
of course dependencies suck so you get what you pay for lol
temp51723 670 days ago [-]
You pick a version of the compiler and stick with it until you're ready to try a newer version?
ubercore 670 days ago [-]
That ignores any other libraries you use which may not make the same choice as you.
temp51723 670 days ago [-]
It doesn't. You stick to the versions of those libraries that work with that version of the compiler - like in every other language.
ObscureScience 670 days ago [-]
The difference is that most other languages are backwards compatible to a fault (it's hard to have the stability cake and eat improvements too).
ubercore 670 days ago [-]
That's fine in a vacuum, but I feel like you'd very quickly run into the need for a security update, or some external API that changes, etc. You can't ignore all externalities, usually.
kzrdude 670 days ago [-]
Then you risk missing the deprecation warnings and heading straight into errors.
mhh__ 670 days ago [-]
It's absurd because it's not really true.
pjmlp 670 days ago [-]
D seemed quite cool when Andrei Alexandrescu's book about The D Programming Language came out.
Since then, the language evolved, kept chasing the next big feature that would bring people in, never quite finishing the previous big feature.
Meanwhile C++, Java and C# kept improving, Go, Rust, Swift, Nim, Odin, Zig came along, some of them took away possible scenarios for D's adoption.
Remedy Games played with D, but it was C# with Unity that won indie developers' hearts.
Andrei Alexandrescu also returned to C++, now doing CUDA research.
Nice language, but unless a big corporation makes it relevant, it will stay on the sidelines.
memefrog 670 days ago [-]
Funny how WalterBright seems to comment in every single HN thread other than this one...
anticensor 670 days ago [-]
He tends to be absent from D-related threads for some reason.
asveikau 670 days ago [-]
I could see various legit reasons for this. He's obviously very attached to the topic and would skew discussion. Or maybe he'd be too into it. Or maybe it's like a judge recusing themselves. Whatever the reason, I wouldn't read too much into it.
hyperman1 669 days ago [-]
This probably makes sense.
Whatever he says instantly becomes Word of God and will influence both the D language and public opinion of it. Better to do that in a well-thought-out message, e.g. on the D lang site.
Meanwhile, he can listen to the HN 'jungle drums' and get a feel for what the public thinks. Even if the silent majority and the noisy opinions don't always agree, you can learn a lot about how to shape both the language and the public opinion of it.
mhh__ 670 days ago [-]
He's probably asleep
Alifatisk 670 days ago [-]
Remember how Dlang split the community into two when there were two different standards?
This doesn't shock me to be honest, and this is coming from someone who used to write some of my projects in D!
foxyv 670 days ago [-]
The best feature of the Java language, in my opinion, is that I can compile Java 4 code using JDK 17. This is what killed my enthusiasm for Python: Python 3 couldn't run Python 2 code without some special sauce. Backwards compatibility is just such a huge boon when you maintain applications that have been running for over 20 years.
tjalfi 670 days ago [-]
I have compiled Fortran programs from the 70s on modern platforms without changing a line. The compiler, OS, and CPU architecture had all disappeared but the programs still worked correctly.
foxyv 670 days ago [-]
Fortran is the silent king of the world. I think the singularity will be caused by an old piece of extremely optimal Fortran code that has been quietly taking over the world on exceedingly faster hardware.
pjmlp 670 days ago [-]
Except that since Java 8 this is no longer true.
Nowadays with modules and linked runtimes, deprecated APIs are indeed removed.
Plenty of Java 4 code won't run in Java 17, if it happens to use any of those.
foxyv 669 days ago [-]
Sure, the APIs are gone, but you can still compile the code, include the libraries that provided those APIs, and run the dang thing. Also, the removal of those APIs was done in an extremely responsible manner, with long periods of deprecation. Most of them were deprecated back in Java 5 anyway.
pjmlp 669 days ago [-]
No, you cannot, because some of those APIs depend on native code that is now gone, as with Thread.stop().
It doesn't change the fact that Java 4 code using those features won't run any longer without additional effort.
Another area that has occasionally had breaking changes is JDBC drivers and related interfaces.
Java folks try to minimize breaking changes, yet every release throughout Java's history has had a section on breaking changes for a reason.
foxyv 669 days ago [-]
If you are using obscure features, then yeah, you are going to see breaking changes. Just try running Tomcat 7 on Java 8+. But there will always be exceptions like that in any language, just as Rust has safety guarantees until you start using "unsafe". Still, I have applications that were coded in the 90s that are running fine on Java 17 with minimal maintenance for security updates. No transpiling necessary.
I don't know about these "Java folks", but no one is claiming that Java is a magic unicorn. It is, however, a well designed language with excellent backwards compatibility.
https://stackoverflow.com/a/33240753 ?
Notice how this uses a sun.* package, i.e. a hidden implementation detail of one specific JVM. Java was never guaranteed to have it in the first place. Some of these classes were so useful that people used them even though they shouldn't. There are compatibility flags in the JVM to allow access even in Java 9, but each new version slowly causes additional pain.
One risk of using libraries (or working with other people) is that you don't know whether they limited themselves to good-enough practices. Most do; a small percentage doesn't, but they cause pain for all of us.
foxyv 670 days ago [-]
The best part is you can still get around this, although I've been lucky enough to not need to. I think my oldest project originally used Java 5 and doesn't use sun.* classes.
https://stackoverflow.com/questions/4065401/using-internal-s...
The wave of dismissive comments is a pretty informative signal to stay far, far away from D:
> I would love to ensure that every release of the compiler doesn't have any bugs, please let me know when you have figured out how to guarantee that.
Oh dear
WhereIsTheTruth 670 days ago [-]
The problem with D is that they made mistakes in the past, they acknowledge it, yet they refuse to fix them. They are afraid of "breaking changes" and act like they are a pillar of the "industry"(?) when they are just a niche. Niche languages should be willing to make breaking changes and explore new features.
D is perfectly positioned to explore ownership, for example; they tried "@live", yet they don't commit.
Take a look at Nim: they explored a ton with memory management - GC, RC, and now ORC. They do what D is afraid of doing.
All the other languages caught up (including both Java and C#). D lost its charm; it's a language that is stuck, afraid of trying new things and afraid of fixing broken things.
Zig/Odin are eating the betterC market share,
Go/C#/Java are eating the high-level market share,
Nim ate the niche market share,
and Rust ate the better-C++ market share.
tyg13 670 days ago [-]
This comment feels like a non sequitur, given that the linked thread is complaining about too many breaking changes, which leave people constantly fixing code after compiler releases.
WhereIsTheTruth 670 days ago [-]
It's not the same issue; versioning is also a problem with D. Minor releases shouldn't introduce breaking changes.
The linked thread specifically complains about that, and the OP requests an LTS version.
p0nce 669 days ago [-]
So: D is bad because it needs more breakage! But at the same time D is bad because it needs less breakage. Right.
KingOfCoders 670 days ago [-]
This is what killed Scala for me some years ago [0]. I was coming from Java, where despite changes to the JVM, compiler and JDK upgrades were painless.
[0] and the abysmal compile times.
globalreset 669 days ago [-]
I would not extrapolate from a single case. A language makes some minor tweaks, someone gets unlucky and keeps hitting issues in their own codebase and their dependencies, and so they get rightfully pissed.
Even in allegedly super-stable and backward compatible languages, in a large enough codebase you'll keep hitting issues when you change deps/compiler versions. If I could get $1 every time a C++ package refuses to compile due to some weird compilation issue...
mhh__ 670 days ago [-]
I work on and maintain a few D codebases. We have issues upstream every now and again - you only notice the things that don't work. I recently moved forward several compiler releases with no problems.
PrimeMcFly 670 days ago [-]
There are too many languages as it is; too much reinventing the wheel, only worse.
Is D ever truly the best option?
JonChesterfield 670 days ago [-]
I'm considering compiling a dynamic language to it, mainly because it gives you C-style performance on the fast paths and a built-in garbage collector. Decent chance it's optimal for that.
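For what it's worth, a tiny sketch of that mix, with invented names - GC-managed objects where convenient, plus hot paths that are statically guaranteed not to touch the GC:

    import std.stdio;

    // Boxed values for the dynamic language's slow paths can lean
    // on D's built-in GC...
    class Boxed
    {
        double value;
        this(double v) { value = v; }
    }

    // ...while compiled fast paths stay GC-free, so the optimizer
    // can treat them like C.
    double sumSquares(const(double)[] xs) @nogc nothrow @safe
    {
        double total = 0;
        foreach (x; xs)
            total += x * x;
        return total;
    }

    void main()
    {
        auto b = new Boxed(2.0);  // GC allocation, collected automatically
        double[3] xs = [1, 2, 3];
        writeln(sumSquares(xs[]), " ", b.value);
    }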