
It’s not uncommon for older projects to use plain Git, patch files, and mailing lists. Linux kernel development still gets done that way every day.
“Forge” is a newish term for systems like GitHub, GitLab, Forgejo, Gitea, etc. that provide source control, project management, issue tracking, and discussion features for projects.
Lmao you linked to the same page I did where this text appears:
GPTBot is used to make our generative AI foundation models more useful and safe. It is used to crawl content that may be used in training our generative AI foundation models.
Also you’re so capitalism-brained you assume anyone running a website must be doing so for profit. My hobby projects (a personal homepage and a personal git forge) were getting slammed by bots while I just paid the bills. I could have locked them both behind an auth portal, but then I might as well just take them off the internet and run everything on my LAN.
Oh ok, I’ll just ignore the constant requests from GPTBot, Bytespider, and the hundreds of others who very plainly, sometimes right in their user agent, tell you that they’re grabbing content for training data. Robots.txt is nice and all, but manually adding every single up-and-coming AI company is impossible. Like I said, Anubis is the first time I’ve gotten them all to even remotely calm down.
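To put that in perspective, a minimal robots.txt sketch of the by-hand approach might look like the following (GPTBot, Bytespider, and CCBot are real published crawler user agents; anything newer or less honest just isn’t on the list yet):

```
# robots.txt — opt out of a few known AI training crawlers by user agent
User-agent: GPTBot
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: CCBot
Disallow: /

# ...plus a new entry for every crawler you ever hear about,
# assuming it even reads this file
```

And that only covers the crawlers that identify themselves and choose to comply, which is exactly why a proof-of-work wall like Anubis ends up being the blunter but more effective tool.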
You clearly haven’t run a website recently. Until I set up Anubis last week, I was getting constant requests from dozens of different bot scrapers 24/7, including the big ones.
Home-grown slop is still slop. The lying machine can’t make anything else.
You’re describing exactly how all these web tools worked. “HTML, CSS, and JS are too hard to do manually. Here’s a shiny new tool that abstracts all that away and lets you get right to making your site!” Except they all added extra headaches and security concerns, and failed to cover the edge cases, so you still needed to know how to do all that HTML, CSS, and JS anyway. That’s exactly how LLM-generated code works now. It’ll be useful and common for a while, then the technical debt will pile up and pile up, and eventually everyone will look around, think “what the hell were we thinking,” and tear it all down.
The killer product being the lying machine or the deleting-working-code machine? I think there’s a small number of people for whom these tools really fit into their workflows, but they’re not universal, so growth is limited and they’re already wildly unprofitable.
This problem right here is why the entirety of containerization was invented.
It has basically the same limitations that Lemmy-Mastodon federation has. Pixelfed users can follow Lemmy communities to see posts in their timelines; they can see top-level comments as replies and can reply to create new top-level comments. Lemmy users can’t follow Pixelfed accounts in any way.
Due to immense anger at the orphan-crushing machine, people bribe it with loads of money to not crush one orphan.
That would make sense; I’m pretty sure communities are already just actors that auto-boost posts.
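For anyone curious what that looks like on the wire, here’s a rough sketch of the ActivityPub Announce a community (a Group actor) sends to its followers when a post comes in (the domain, post ID, and activity ID below are made up for illustration):

```json
{
  "@context": "https://www.w3.org/ns/activitystreams",
  "id": "https://lemmy.example/activities/announce/9c2f1a",
  "type": "Announce",
  "actor": "https://lemmy.example/c/fediverse",
  "to": ["https://www.w3.org/ns/activitystreams#Public"],
  "cc": ["https://lemmy.example/c/fediverse/followers"],
  "object": "https://lemmy.example/post/123456"
}
```

As I understand it, Mastodon renders that the same way it renders a boost, which is why community posts show up in followers’ timelines at all.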
Could also look at a Coral M.2 accelerator if tensor-based detectors are more your speed.
This is highly relevant to my interests, thanks for making it!
Peeking at the GitHub repo, there has been basically no activity in the last couple of months. The frontend code gets continual updates from a dependency bot, and otherwise nothing has happened since August.
No love for Lemmy in this survey at all.
Google ruined their search with AI bullshit; try this other thing that is entirely AI bullshit without even pretending to be a search engine.
The fun part is that Tesla FSD shuts off just before accidents, so you’re always the one at fault.