• 314 Posts
  • 1.07K Comments
Joined 10 months ago
Cake day: April 4th, 2025

  • Doctorow writes:

    After more than 20 years of being consistently wrong and terrible for artists’ rights, the US Copyright Office has finally done something gloriously, wonderfully right. All through this AI bubble, the Copyright Office has maintained – correctly – that AI-generated works cannot be copyrighted, because copyright is exclusively for humans. That is why the “monkey selfie” is in the public domain. Copyright is only awarded to works of human creative expression that are fixed in a tangible medium.

    And not only has the Copyright Office taken this position, they have defended it vigorously in court, repeatedly winning judgments to uphold this principle.

    The fact that every AI-created work is in the public domain means that if Getty or Disney or Universal or Hearst newspapers use AI to generate works – then anyone else can take those works, copy them, sell them or give them away for nothing.

    Genius.




  • I would agree with that.

    Especially: being “70% finished” does not mean you will get a working product at all. If the fundamental understanding is not there, you will not get a working product without fundamental rewrites.

    I have seen code from such bullshit developers myself. Vibe-coded device drivers by people who do not understand the fundamentals of multi-threading, or why and when you need locks in C++. No clear API descriptions. Messaging architectures that look like a rat’s nest. A wild mix of synchronous and async code. Insistence that their code is self-documenting and needs neither comments nor docs. And: aggression when confronted with all that, because the bullshit taints any working relationship.
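    A minimal sketch of the kind of race such code gets wrong. The comment is about C++, but the shared-counter hazard is the same in any language; here it is in Python, with all names invented for illustration:

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        # Without the lock, this read-modify-write is not atomic:
        # two threads can read the same value and one update is lost.
        with lock:
            counter += 1

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 400000, deterministic only because of the lock
```

    Drop the `with lock:` and the final count becomes unpredictable under contention; that is exactly the class of bug a driver author has to understand before shipping.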





  • I would expect the “fat head” of the most-used open source projects to make up the vast majority of the open source code included in apps. It is not common practice to include 1000 small projects in an app’s code base, or even 100.

    Not usually 1000. But nowadays apps really do have a lot of dependencies - often more than 100.

    An article about this:

    https://wiki.alopex.li/LetsBeRealAboutDependencies

    Rust apps have also been criticized for this. The thing is that when building a Rust app, every direct and indirect dependency is fetched and built locally, which makes them very visible. The same happens, e.g., on Debian systems when installing C or Python libraries — except those arrive as pre-compiled packages bundling many dependencies, which is faster and not as visible.



  • If an app includes 50 well-known big projects and 1000 small projects, the sum result can still be that small projects make up a large fraction of the code.

    It does not need to be an even distribution, it can be a “long tail” distribution.
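    The long-tail point is just arithmetic; with hypothetical line counts (all numbers invented purely for illustration):

```python
# Hypothetical line counts, chosen only to illustrate the long-tail effect.
big = 50 * 200_000      # 50 well-known projects, ~200k lines each
small = 1_000 * 15_000  # 1000 small projects, ~15k lines each

small_share = small / (big + small)
print(f"{small_share:.0%}")  # 60%: the small projects dominate
```

    Even though each small project is tiny next to a flagship, a thousand of them can outweigh the fat head.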

    And FOSS code is often inherited. Some years ago, a bug in a string-to-floating-point conversion routine was found. If I remember correctly, it affected PHP — but it turned out that many more languages were affected.

    Something similar happened with the Timsort algorithm, which Tim Peters wrote for Python and which Java and Android later adopted as well.
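    Timsort is what CPython’s `sorted()` and `list.sort()` use; it is stable and exploits runs already present in the input. A small illustration of the stability guarantee:

```python
# sorted() in CPython is Timsort: stable, and fast on partially ordered input.
records = [("b", 2), ("a", 1), ("a", 0), ("b", 1)]
by_key = sorted(records, key=lambda r: r[0])

# Stability: records with equal keys keep their original relative order.
print(by_key)  # [('a', 1), ('a', 0), ('b', 2), ('b', 1)]
```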











  • The insight that a majority of open source projects are small contributions by hobby developers, and that it is their summed joint effort that matters, is very interesting.

    This is part of a political discussion and lobbying effort that FLOSS development should be “funded better” to secure digital infrastructure.

    What if the majority of contributors

    (1) are not motivated by money in the first place, and

    (2) don’t have time to work more?

    Another thought: I think that one reason why the proportion of open source code grows is also software quality:

    Companies would love to own all their code. So, when they employ people who work on proprietary code, the amount of proprietary code should grow, shouldn’t it?

    Except that companies have mostly very short-term goals. And this affects quality: A lot of proprietary code has quite shit quality and is not really maintainable. Which has the effect that either the project dies, or becomes very slow to develop further, because of tons of technical debt.

    FOSS projects do not have this constraint on short-term returns, so they often have better quality. That makes it more likely that these projects live and prosper a bit longer. The short-term difference might not even be large — but the process repeats year after year, round after round, and it becomes an evolutionary advantage.

    In the end, everyone uses that Finnish student’s former hobby kernel project, and nobody uses Windows 95 — or wants to use its shitty successors.

    (And this is why I also think that Guix will win in the long term: the capability to automatically reproduce all components of a program or system from freely available source is, in the long run, an overwhelming evolutionary advantage.)


  • Well, hobbyist projects are surely not the only pillar of the open source systems, and big projects like the Linux kernel matter immensely, too. But the article author does not deny that. He makes a point that the hobbyist projects are very important, too. Without them, there would be very little desktop software. I’d guess that much of KDE is hobbyist-powered.

    And apart from that, financial support for projects important for infrastructure is a popular talking point. But I don’t see that happen much. Where are the SW engineering jobs for maintainers and contributors of real time Linux, messaging middleware, things like Ceph and file systems, FLOSS browsers, conference software, and so on? And now there are calls that the FLOSS community should care for security in infrastructure and industrial applications. If this were serious, one could simply pay the people who already do that (and massively hire more of them).


  • Guix vs Nix will be an interesting example. Nix has a way bigger user base right now but it has the whole Anduril & governance issue.

    Guix has a way better configuration language, and one can learn enough in an afternoon to use it productively.

    What is your experience with guix like?

    I am mainly using Guix as a package manager on top of Debian stable (and on top of my Arch install running in a VM). I use it mostly to have a reproducible development environment for my free-time projects (which use Rust and Guile), and it works very nicely for that. It is also certainly a nice way to distribute software as source, with very little effort (just put your own package definitions into a channel repo).

    Does getting away from systemd affect things?

    I have also started to run it directly on my PC as a base system. After replacing the NVidia GeForce card with an AMD Radeon one, I had no issues.

    The configuration and init system work well — the only thing left for me to do is write my own stumpwm(*) init script, which I haven’t had time for, so as a fallback I use i3wm and Gnome or Xfce, which I also use at work.

    (*) Stumpwm is a highly configurable tiling window manager written in Common Lisp. Similar to i3, but using key chords, and window manager actions are just lisp functions one can program and extend - they are called via key chords like Emacs commands.

    With respect to init systems, I have to confess that I am mostly agnostic. As long as it works, I am fine. I do think Guix’s approach is the more modern and better one.






  • Half of the over-80s book no train ticket in the Bahn app and use no Google Maps for navigation.

    I don’t use the Bahn app, the DHL app, Doctolib, Android, Windows, or Google Maps either — because data protection is missing or outright refused.

    Instead: paper tickets, OpenStreetMap, brouter.de, OSMAnd, Linux, SailfishOS — and the DHL courier is welcome to leave my parcel in the stairwell, and gets a euro in hand for it.

    A study by the industry association Bitkom concludes that about half of all people over 80 never use the internet. The study states:

    Lacking knowledge or resources, concerns, or simply no interest — the reasons for not using the internet are manifold.

    Your own fault, dear Bitkom companies — without data protection you only get a minimum of cost-cutting digitization. Because cost-cutting is what this is about, isn’t it?