

thanks!
Though I should mention my original motivation with MakeMKV was to rip Blu-ray discs, which involves complications beyond DVD. But the DVD guide will still be quite useful.
I’m so tired of overly busy QR codes.
I’m tired of having to search through text to get enough of an idea of what a QR code is before I go to the trouble of pulling out a scanner. Is it a URL? Wi-Fi creds? It’s not about being cute. It’s about being informative in as little space as possible. Do you scan a naked QR code without cause? Streetwise users want an indication of what they’re scanning, at the very least.
It should also be noted that the QR code pixels will get smaller and smaller the more data you’re encoding.
You have control over that. If you want to hold the pixel size constant, the QR code’s geometry gets bigger. The qrcode LaTeX package includes a size parameter. Either way, up to 30% of the space could be wasted, depending on the use case.
QR codes have countless applications. Not all QR codes need to be scanned from the other side of a room. When a QR code appears on a document that someone is holding, as opposed to a sign, it only needs to function within 10 cm. I’m working on 2-column bilingual legal documents citing laws from different countries. There is insufficient space for country indicators, and 30% of the QR code is just wasted space in this context, which really adds up if you have many QR codes. In a corner case, flaws from multiple generations of photocopies could manifest, but 30% redundancy is overkill. So putting the country indicator for the law being referenced inside the QR code makes the most efficient use of page real estate without resorting to poor aesthetics.
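For anyone curious, a minimal sketch of what that looks like with the qrcode package (the height, level, and URL here are placeholders for illustration; IIRC level=L is the lowest error-correction setting at roughly 7% redundancy, versus ~30% for level=H):

```latex
\documentclass{article}
\usepackage{qrcode}
\begin{document}
% height pins the printed size regardless of how much data is encoded;
% level=L drops error correction to ~7%, fine for a document held in hand.
\qrcode[height=1.5cm, level=L]{https://example.org/statute/123}
\end{document}
```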
Also, QR codes are ugly. I’m happy to see creative people dress them up. Of course, there is only room for clever artists in this space, and it’s easy for kids making QR codes to get carried away.
Fun suggestion… could be useful as a side hack if congestion becomes an issue, but I doubt it would come to that. They have what seems to be a high-end switch with 20 or so ports and internal fans.
The event is ~2-3 hours. If someone needs the full Debian set (80 GB!), I don’t think it would transfer over USB 2 in that timeframe: even at an optimistic ~10 MB/s write speed for a USB 2 stick, 80 GB takes over two hours, and cheap sticks are often slower. USB 2 sticks may be rare, but at this event there are some people with old laptops that have no USB 3 sockets. A lot of people plug into ethernet. And the switch looks somewhat more serious than a 4-port SOHO box… it has 20+ ports with fans, so I don’t get the impression ethernet congestion would be an issue.
I think they could do the job. I’ve never admin’d NFS, so I’m figuring there’s a notable learning curve there. Samba, well, maybe; I’ve used it before. I’m leaning toward ProFTPD at the moment, but if that gives me any friction I guess I’ll consider Samba. Perhaps I’ll go into overachiever mode and have both Samba and ProFTPD pointing to the same directory.
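If I go the overachiever route, I imagine the two configs would look roughly like this, assuming a hypothetical read-only tree at /srv/share (directives lifted from the stock Debian examples, so double-check against the shipped config files):

```
# /etc/samba/smb.conf -- read-only guest share
[share]
   path = /srv/share
   read only = yes
   guest ok = yes

# /etc/proftpd/proftpd.conf -- anonymous read-only FTP on the same tree
<Anonymous /srv/share>
   User      ftp
   Group     nogroup
   UserAlias anonymous ftp
   <Limit WRITE>
     DenyAll
   </Limit>
</Anonymous>
```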
Two possible issues w/that w.r.t my use case:
Nonetheless, I appreciate the suggestion. It could be handy in some situations.
oh, sorry. Indeed. I answered from the notifications page without context. Glad to know FileZilla will work for that!
I use FileZilla, but AFAIK it’s just a client, not a server.
Indeed, I noticed openssh-sftp-server was automatically installed with Debian 12. Guess I’ll look into that first. Might be interesting if people could choose between FTP or mounting with SSHFS.
(edit) found this guide
Thanks for mentioning it. It encouraged me to look closer, and I believe it’s well suited to my needs.
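For the record, the sftp-only setup I have in mind looks roughly like this, assuming a hypothetical “guests” group and a /srv/share tree (the -R flag of internal-sftp should make sessions read-only, if I’m reading the sshd_config man page right):

```
# /etc/ssh/sshd_config -- confine a guest group to read-only SFTP
Subsystem sftp internal-sftp
Match Group guests
    ChrootDirectory /srv/share    # must be root-owned, not group-writable
    ForceCommand internal-sftp -R
    AllowTcpForwarding no
    X11Forwarding no
```

People could then point any SFTP client at it, or mount it with `sshfs guest@server:/ /mnt/share` (and `fusermount -u /mnt/share` to detach).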
Well, it’s still the same problem. I mean, it’s likely piracy to copy the public library’s disc to begin with, even if just for a moment. From there, if I want to share it with others, I still need to be able to exit the library with the data before they close. So it’d still be a matter of transcoding as a distinctly separate step.
Not sure how that makes sense. Why would a captive portal block the 1st 39 attempts but not the 40th, for example?
My workaround is to establish a VPN (which happens quickly without issue), then run Tor over that, which also works instantly over the VPN.
What’s the point of spending a day compressing something that I only need to watch once?
If I pop into the public library and start a ripping process using HandBrake, the library will close for the day before the job is complete for even a single title. I could check out the media, but there are trade-offs:
Wow, thanks for the research and effort! I will be taking your approach for sure.
I’ll have a brief look but I doubt ffmpeg would know about DVD CSS encryption.
The design approach would also help serve people in impoverished areas, where you might imagine they have no internet access at home but can still participate through some community access point.
Indeed, I really meant tools that have some cloud interaction but give us asynchronous autonomy from the cloud.
Of course, there are also scenarios that normally use the cloud but can be made fully offline, e.g. Argos Translate. If you use a web-based translator like Google Translate or Yandex Translate, you are not only exposed to the dependency of having a WAN when you need to translate, but you give up privacy. Argos Translate empowers you to translate text without cloud dependency while also getting a sensible level of privacy. Or in the case of Google Maps vs. OsmAnd, you have the privacy of not sharing your location and also the robustness of not being dependent on a functioning uplink.
Both scenarios (fully offline apps and periodic syncing of msgs) are about power and control. If all your content is sitting on someone else’s server, you are disempowered: they can boot you at any moment, alter your content, or pull the plug on their server spontaneously without warning (this has happened to me many times). They can limit your searching capability too. A natural artifact of offline consumption is that you have your own copy of the data.
if it aint broke dont fix it
It’s broken from where I’m sitting. Many times, Mastodon and Lemmy servers have gone offline out of the blue and all my msgs were mostly gone, apart from what got cached on other hosts, which is tedious and non-trivial to track down. That’s technically broken security in the form of data loss / loss of availability.
I have nothing for these use cases, off the top of my head:
You just wrote your response using an app that’s dysfunctional offline. You had to be online.
Perhaps before your time, Usenet was the way to do forums. Gnus (an Emacs mode) was good for this. Gnus would fetch everything to my specification and store a local copy, serving as an offline newsreader. I could search my local archive of messages, and the search was not constrained to a specific tool (e.g. grep would work, but Gnus was better). I could configure it to grab all headers for new msgs in a particular newsgroup, or full payloads. Then when disconnected it was possible to read posts. I never tested replies because I had other complexities in play (Mixmaster), but it was likely possible to compose a reply and sync/upload it later when online. The UX was similar to how mailing lists work.
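From memory, the relevant bits of the setup looked something like this (the server name is a placeholder, and the keybindings are as I recall them from the Gnus manual, so verify before relying on them):

```elisp
;; ~/.gnus.el -- offline newsreading sketch (server is a placeholder)
(setq gnus-select-method '(nntp "news.example.org"))

;; The Gnus Agent is what caches headers and/or full articles locally,
;; so groups can be read and searched with no connection.
(setq gnus-agent t)
```

Then, as I recall, `J s` (gnus-agent-fetch-session) in the Group buffer pulls everything down according to the agent categories you’ve set, and replies composed while unplugged sit in a queue until `J S` sends them.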
None of that is possible with Lemmy. It’s theoretically possible given the API, but the tools don’t exist for that.
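To illustrate, the missing tool could start as small as this hypothetical sketch, which pulls your own comment history into one local JSON file that grep can then search offline. It assumes the Lemmy v3 person-details endpoint (/api/v3/user) with username/page/limit parameters; check your instance’s API docs, since the endpoint and field names may differ by version:

```python
import json
import requests

INSTANCE = "https://lemmy.example"  # placeholder instance
USERNAME = "someuser"               # placeholder account

archive = []
page = 1
while True:
    # Assumed endpoint: person details, paginated
    r = requests.get(
        f"{INSTANCE}/api/v3/user",
        params={"username": USERNAME, "sort": "New",
                "page": page, "limit": 50},
        timeout=30,
    )
    r.raise_for_status()
    comments = r.json().get("comments", [])
    if not comments:
        break
    archive.extend(comments)
    page += 1

# One file on disk = searchable offline, no server required.
with open("lemmy-archive.json", "w") as f:
    json.dump(archive, f, indent=2)
```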
Offline workflows were designed to accommodate WAN access interruptions, but an unforeseen benefit was control. Having your own copy naturally gives you a bit of control and censorship resilience.
(update) It makes no sense that I have to be online to read something I previously wrote. I sometimes post some useful bit of information, but there are only so many notes I can keep organised. Then I later need to recall it (e.g. what was that legal statute I cited for situation X?). If I wrote it in a Lemmy post, I have to be online to find it again. The search tool might be too limited to search the way I need… and that assumes the host I wrote it on is even still online.
That page is dead for me.
But I just wanted to add that Chevron is an ALEC member in the US. Thus they feed the GOP and a lot of extreme-right policy. Should be boycotting them already anyway. They also got caught financing the Cloakroom project.
I have not tried much of anything yet. I just got a cheap laptop with a BD drive, which came with Windows and VLC. I popped in a Blu-ray disc from the library and it could not handle it… something about not having an AACS decoder or something like that. I haven’t spent any time on it yet, but ultimately, in principle, I would install Debian and try to liberate the drive to read BDs.