
  • What I find interesting is that move semantics silently add something to C++ that did not exist before: invalid objects.

    Before, if you created an object, you could design it so that it kept all invariants until it was destroyed. I’d even argue that it is the true core of OOP that you get data structures with guaranteed invariants - a vector or hash map or binary heap never ceases to guarantee its invariants.

    But now, you can construct complex objects and then move their data away with std::move().

    What happens to the invariants of these objects?
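
    A minimal sketch of what I mean (the names are made up for illustration): the moved-from vector still satisfies the container's own invariants, but any invariant the surrounding program attached to it is silently gone.

        #include <iostream>
        #include <string>
        #include <utility>
        #include <vector>

        int main() {
            // Program-level invariant: "keywords always holds exactly three entries".
            std::vector<std::string> keywords{"alpha", "beta", "gamma"};

            // std::move transfers the internal buffer; the source is left in a
            // "valid but unspecified" state (for std::vector, typically empty).
            std::vector<std::string> stolen = std::move(keywords);

            std::cout << stolen.size() << '\n';   // 3
            std::cout << keywords.size() << '\n'; // unspecified, usually 0

            // The vector's own class invariants still hold, but the program-level
            // invariant on keywords has quietly been broken.
        }

    The standard only promises a "valid but unspecified state" for moved-from standard library objects; for your own classes, you have to decide what the moved-from state even means.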




  • Have you ever noticed that when intelligent engineers talk about designs (or, quite generally, when intelligent people talk about consequential decisions they took), they talk about their goals, about the alternatives they had, about what they knew of the properties of these alternatives and how those weighed against their goals, about which alternative they chose in the end, and about how they addressed the inevitable difficulties they encountered?

    For me, this is a very telling sign of intelligence in individuals. And truly good engineering organizations collect and treasure that knowledge - it is path-dependent, and you cannot quickly and fully reproduce it once it is lost. More importantly, some fundamental reasons behind your decisions and designs might change, and then you have to revise them. Good decisions also have a quality of stability: the route taken does not change dramatically when an external factor changes a little.

    Now compare that to letting a routing app automatically plan a route through a dense, complex suburban train network. The route you get will likely be the fastest one, with the implicit assumption that this is of course what you want - but any small hiccup or delay in the transport network can easily turn it into the slowest option.





  • So, how many users of Debian would even think about creating their own packages?

    I already have a hunch what went wrong: they were probably trying to package software that has no standard build system. This is painful because the standard tools - GNU autotools for C programs, or CMake, or setuptools and its newer siblings for Python - make sure that the right commands are used to build a package on whatever platform, and, importantly, that its components are installed into the right places. If the software doesn’t use one of these, it will be hard to build packages for it on any standard distribution.

    Guix has support for all the major build systems (otherwise, it could not support building its 50,000 packages).



  • Yes, Nix solves the same problem. The main difference is that Nix’s language for package descriptions is less attractive to some developers than the one Guix uses, which is Guile Scheme. Guile is very mature, well documented, and has good performance.

    I think that will give Guix an advantage in the long run: a successful distribution needs a large set of packages, and for that, volunteers need to write and maintain package definitions. Guix makes it easier to write those definitions.

    Clearly, the strict focus on FLOSS will prevent some packages, like the NVIDIA drivers, from appearing there. But on the other hand, it gives you a system that you will still be able to compile completely from source in 10 years’ time.


  • Guix is really making fantastic progress and is a good alternative in the space between stable, fully FOSS distributions, like Debian, and distributions which are more up-to-date, like Arch.

    And one interesting thing is that the number of packages is now so large that one can frequently install additional, more recent packages on a Debian system, or ones that Debian does not package at all.

    For example, I run Debian stable as my base system and Guix as an extra package manager (and Arch in a VM for trying out the latest software for programming).

    The thing is that Guix now often provides more recent packages than Debian, like many Rust command-line tools, where Debian is lagging a bit. There are many interesting ones, and most are recent because Rust is progressing so fast. Using Guix, I can install them without touching the language’s own package manager, regardless of whether a tool is written in Rust, Go, or Python 3.13.

    Or, today I read an article about improvements in spaced-repetition learning algorithms. It mentioned that the FLOSS software Anki provides them, and I became curious and wanted to have a look at Anki. Well, Debian has no “anki” package - and it is written, among other languages, in Python and Rust, so good luck getting it onto Debian stable. With Guix, I only had to do “guix install anki” and had it installed.

    This works a tad slower than apt-get … but it still saves time compared to installing stuff and dependencies manually.



  • I don’t get why people constantly complain that the Guix project does not distribute, or actively support the distribution of, binary proprietary software. That is like complaining that Apple does not sell its laptops with Linux, that Microsoft does not sell Google’s Chromebooks, or that Amazon does not distribute free eBooks from Project Gutenberg, ScienceHub, or O’Reilly.

    And users can of course use the nonguix channel to get their non-free firmware or whatever, but they should not complain and demand that the volunteers of other projects do more unpaid work. Instead, they should donate money or volunteer to do it themselves.

    But guess what? I think these complaints come in good part from companies which want to sell their proprietary software. Valve and Steam show that a company can very well sell software for Linux, with mutual benefit - but not by freeloading on volunteer work.

    And one more thing: Guix allows you to do exactly what Flatpak etc. promise. Any company - as well as any lone coder, team of scientists, or small FLOSS project - can build their own packages on top of a stable Guix base system, with libraries and everything, binary or from source, and distribute them from their own website in their own channel - just like any Emacs user can distribute their own, self-written Emacs extensions from a web page. And thanks to the portability of the Guix package manager, this software can be installed on any Linux system, resting on a fully reproducible base.



  • If you walk around my city with open eyes, you will see that half of the bars and restaurants are closed because there is a shortage of even unskilled staff: restaurants didn’t pay people enough, and those people now work in other sectors.

    And yes, software developers are leaving jobs with unreasonable demands and shitty working conditions - not least because preserving their mental health is more important. Go, for example, to the news.ycombinator.com forum and just search for the keyword “burnout”. That’s becoming a massive problem for companies, because rising complexity is not matched by adequate organizational practices.

    And AI is not going to help with that - it is already massively increasing technical debt.


  • It’s the Dunning-Kruger effect.

    And it’s fostered by a massive amount of spam and astroturfing coming from “AI” companies, lying that LLMs are good at this or that. Sure, algorithms like neural networks can recognize patterns. Algorithms like backtracking can play chess or solve or transform algebraic equations. But these are not LLMs, and LLMs will not and cannot replace software engineering.

    Sure, companies want to pay less for programming. But they don’t pay software developers to generate gibberish in source-code syntax; they need working code. And this is why software engineers and good programmers will not only remain scarce but will be in even shorter supply.

    And companies that don’t pay six-figure salaries to developers will find that experienced developers will flat out refuse to work on AI-generated codebases, because they are unmaintainable and lead to burnout and brain rot.




  • The early stages of a project are exactly where you should think long and hard about what exactly you want to achieve, what qualities you want the software to have, what the detailed requirements are, how you will test them, and what the UI should look like. And from that, you derive the architecture.

    AI is fucking useless at all of that.

    In all complex planned activities, laying the right groundwork and foundations is essential for success. Software engineering is no different. You wouldn’t ask a bricklayer’s apprentice to draw the plans for a new house.

    And if your difficulty is a lack of detailed knowledge of a programming language, the best approach might be - depending on the case! - to write a first prototype in a language you know well, so that your head is free to think about the concerns listed in the first paragraph.