e.g. flac for lossless audio because…
Just going to leave this xkcd comic here.
Yes, you already know what it is.
One could say it is the standard comic for these kinds of discussions.
🪛
how did i know it was standards
now, proliferate
Open Document Standard (.odt) for all documents. In all public institutions (it’s already a NATO standard for documents).
Because the Microsoft Word ones (.doc, .docx) are unusable outside the Microsoft Office ecosystem. I feel outraged every time I need to edit a .docx file because it breaks the layout easily. And some older .doc files won’t even open in modern Microsoft Word.
Actually, IMHO, there should be a better alternative to .odt as well. Something more declarative/scripted, like LaTeX, but still WYSIWYG. LaTeX (and XeTeX, for my use cases) is too messy for me to work with, especially when a package is Byzantine. And it can be non-reproducible if I share/reuse the same document somewhere else.
Something has to be made with document files.
I was too young to use it in any serious context, but I kinda dig how WordPerfect does formatting: the formatting codes are hidden by default, but you can show them and manipulate them as needed.
It might already be a thing, but I am imagining a LaTeX-based standard for document formatting would do well with a WYSIWYG editor that would hide the complexity by default, but is available for those who need to manipulate it.
There are programs (LyX, TexMacs) that implement WYSIWYG for LaTeX, TexMacs is exceptionally good. I don’t know about the standards, though.
Another problem with LaTeX and most of the other document formats is that they are so bloated and depend on so many other components that it is hardly possible to embed the tooling into a larger application. That’s a bit of criticism of the UNIX design philosophy as well. And LaTeX code is especially hard to make portable.

There used to be a similar situation with PDFs: it was really hard to display a PDF embedded in an application. Finally, Firefox’s pdf.js came along and solved that issue.
The only embedded and easy-to-implement standard that describes a ‘document’ is HTML, for now (with JavaScript for scripting). Only it’s not aware of page layout. If only there were an extension standard that could turn an HTML page into a paginated document…
I was actually thinking of something like markdown or HTML forming the base of that standard. But it’s almost impossible (is it?) to do page layout with either of them.
But yeah! What I was thinking when I mentioned a LaTeX-based standard is to have a base set of “modules” (for a lack of a better term) that everyone should have and that would guarantee interoperability. That it’s possible to create a document with the exact layout one wants with just the base standard functionality. That things won’t be broken when opening up a document in a different editor.
There could be additional modules to facilitate things, but nothing like the 90’s proprietary IE tags. The way I’m imagining this is that the additional modules would work on the base modules, making things slightly easier but that they ultimately depend on the base functionality.
IDK, it’s really an idea that probably won’t work upon further investigation, but I just really like the idea of an open standard for documents based on LaTeX (kinda like how HTML has been for web pages), where you could work on it as a text file (with all the tags) if needed.
Markdown, AsciiDoc, and reStructuredText are kinda like simple alternatives to LaTeX.
There is also https://github.com/typst/typst/
It is unbelievable we do not have standard document format.
What’s messed up is that, technically, we do. Originally, OpenDocument was the ISO standard document format. But then, baffling everyone, Microsoft got the ISO to also adopt `.docx` as an ISO standard. So now we have 2 competing document standards, the second of which is simply worse.

That’s awful! We should design something that covers both use cases!
Bro, trying to set padding in MS Word, when you know… YOU KNOOOOW… they can convert to HTML. It drives me up the wall.
And don’t get me started on excel.
Kill em all, I say.
zip or 7z for compressed archives. I hate that, for some reason, rar has become the de facto standard for piracy. It’s just so bad.
The other day I saw a tar.gz containing a multipart-rar which contained an iso which contained a compressed bin file with an exe to decompress it. Soooo unnecessary.
Edit: And the decompressed game of course has all of its compressed assets in renamed zip files.
A .tarducken, if you will.
Ziptarar?
It was originally rar because it’s so easy to separate into multiple files. Now you can do that in other formats, but the legacy has stuck.
Not just that. RAR also has recovery records.
.tar.zst all the way, IMO. I’ve almost entirely switched to archiving with zstd, it’s a fantastic format.

why not gzip?
Gzip is slower and compresses worse. Zstandard, on the other hand, is dramatically faster than any of the existing standards in terms of compression speed; this is its killer feature. Also, it provides a bit better compression ratio than gzip [citation needed].
Yes, all compression levels of gzip have some zstd compression level that is both faster and better in compression ratio.
Additionally, the highest compression levels of zstd are comparable in compression ratio to LZMA, while also being slightly faster at compression and many, many times faster at decompression.
gzip is very slow compared to zstd for similar levels of compression.
The zstd algorithm is a project by the same author as lz4. lz4 was designed for decompression speed, zstd was designed to balance resource utilization, speed and compression ratio and it does a fantastic job of it.
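If you want to sanity-check the speed claims yourself, here’s a minimal benchmark sketch in Python; it assumes the third-party `zstandard` package, and the input file name is illustrative:

```python
import gzip
import time

import zstandard  # third-party: pip install zstandard

# Read any reasonably large file; the name is illustrative.
data = open("sample.bin", "rb").read()

t0 = time.perf_counter()
gz = gzip.compress(data, compresslevel=6)
t1 = time.perf_counter()
zs = zstandard.ZstdCompressor(level=3).compress(data)
t2 = time.perf_counter()

print(f"gzip -6: {len(gz):>10} bytes in {t1 - t0:.3f}s")
print(f"zstd -3: {len(zs):>10} bytes in {t2 - t1:.3f}s")
```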
.tar.xz masterrace
This comment didn’t age well.
I don’t know what to pick, but something other than PDF for the task of transferring documents between multiple systems. And yes, I know PDF has its strengths and there’s a reason why it’s so widely used, but that doesn’t mean I have to like it.

Additionally, all proprietary formats, especially ones that have gained enough users that they’re treated like a standard or a requirement if you want to work with X.
oh it’s x, not x… i hate our timeline
When PDF was introduced it made these things so much better than they were before that I’ll probably remain grateful for PDF forever and always forgive it all its flaws.
I would be fine with PDFs exactly the same except Adobe doesn’t exist and neither does Acrobat.
Ogg Opus for all lossy audio compression (mp3 needs to die)
7z or tar.zst for general purpose compression (zip and rar need to die)
The existence of zip, and especially rar files, actually hurts me. They’re slow, they’re insecure, and the compression is from the Jurassic era. We can do better.
@dinckelman @Supermariofan67 I think you mean unsecure. It doesn’t feel unsure of itself. 😁
in·se·cure (ĭn′sĭ-kyo͝or′) adj.
- Inadequately guarded or protected; unsafe: A shortage of military police made the air base insecure.
https://www.thefreedictionary.com/insecure
Unsecure
a. 1. Insecure.
@hungprocess touché.
@hungprocess Also this. https://english.stackexchange.com/questions/19653/insecure-or-unsecure-when-dealing-with-security
It seems that I was quite wrong, but that a lot of other people are wrong as well. lol
why do zip and rar need to die
Again, I’m not the original poster. But zip isn’t as dense as 7zip, and I honestly haven’t seen rar used much.

Also, if I remember correctly, the audio codecs and compression types the other poster listed are open source. But I could be mistaken. I know at least 7zip is, and I believe Opus or something like that is too.
Most mods on Nexus are in rar or zip. Also most game cracks; or as iso, which is even worse.
Zip has terrible compression ratio compared to modern formats, it’s also a mess of different partially incompatible implementations by different software, and also doesn’t enforce utf8 or any standard for that matter for filenames, leading to garbled names when extracting old files. Its encryption is vulnerable to a known-plaintext attack and its key-derivation function is very easy to brute force.
Rar is proprietary. That alone is reason enough not to use it. It’s also very slow.
why does mp3 need to die
It’s a 30-year-old format, and large amounts of research and innovation in lossy audio compression have occurred since then. Opus can achieve better quality at around 40% of the bitrate. Also, much like zip, the format was a mess of partially broken implementations in the early days (although now everyone uses LAME, so it’s not as big of a deal). Its container/stream format is very messy too. Also, there’s no native tag format, so it needs ID3 tags, which don’t enforce any standardized text encoding.
Not the original poster, but there are newer audio codecs that are more efficient at storing data than mp3, I believe. And there’s also lossless standards, compared to mp3’s lossy compression.
What’s wrong with mp3
Big file size for rather bad audio quality.
I’ve yet to meet someone who can genuinely pass the 320kbps vs. lossless blind-test on anything but very high-end equipment.
People are able to on some songs because mp3 is poorly optimized for certain sounds, especially cymbals. However, opus can achieve better quality than that at 128k with fewer outliers than mp3 at 320k, which saves a lot of space.
False. OPUS achieves transparency at 192kbps compared to 320kbps for LAME MP3.
We’re not talking lossless. The comment above specified Opus-encoded OGG, which is lossy.
For example, I converted my music library from MP3 to OGG Opus and the size shrank from 16 GB to just 3 GB.
And if converting from lossless to both MP3 and OGG Opus, then OGG does sound quite a bit better at smaller file sizes.
So, the argument here is that musicians are underselling their art by primarily offering MP3 downloads. If the whole industry would just magically switch to OGG Opus, that would be quite an improvement for everyone involved.
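For anyone wanting to try the same, here’s a minimal sketch of a batch conversion via ffmpeg; it assumes an ffmpeg build with libopus on PATH, the paths and bitrate are illustrative, and ideally you’d encode from lossless sources (as noted above) rather than transcoding MP3s:

```python
import pathlib
import subprocess

# Encode every FLAC in the library to Opus at 128 kbps.
for src in pathlib.Path("music").rglob("*.flac"):
    dst = src.with_suffix(".opus")
    subprocess.run(
        ["ffmpeg", "-i", str(src), "-c:a", "libopus", "-b:a", "128k", str(dst)],
        check=True,
    )
```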
320 kbps is approaching lossless audio compression bitrates.
Opus does better in about half the space. And goes down to comically low bitrates. And it has obscenely low latency. It’s not simple, but hot dang, is it good.
The Quite Okay Imaging guy did a Quite Okay Audio follow-up, aiming for aggressive simplicity and sufficient performance, but it’s fixed at a bitrate of 278 kbps for stereo. It’s really competing with ADPCM for sound effects in video games.
Personally, I think an aggressively simple frequency-domain format could displace MP3 as a no-brainer music library format, circa 128 kbps. All you have to do is get forty samples out of sixty-four bits. Bad answers are easy and plentiful. The trick is, when each frame barely lasts a millisecond, bad answers might work anyway.
yeah, but guess what opus is transparent at? depending on the music, anywhere from 96kbps to 160kbps.
How about tar.gz? How does gzip compare to zstd?
Both slower and worse at compression at all its levels.
It’s worth noting that AAC is actually pretty good in a lot of cases too.
However, it is very patent encumbered and therefore wouldn’t make for a good standard.
AAC-LC and HE-AAC are both free now; HE-AACv2 and xHE-AAC aren’t, but those have more limited use.
> (mp3 needs to die)
How are you going to recreate the MP3 audio artifacts that give a lot of music its originality, when encoding to OPUS? Past audio recordings cannot be fiddled with too much.
Also, fuck Zstandard. It’s a problematic format due to its single-file compression ability: hard to repair, not fully stable, and lacking too many features compared to 7z/RAR. zstd is also 15-20% worse in compression ratio. It’s only a good format for temporary, fast data-transit applications (webpage/CDN serving, quick temporary database backups).
> How are you going to recreate the MP3 audio artifacts that give a lot of music its originality, when encoding to OPUS?
Oh, a gramophone user.
Jokes aside, I find Ogg Opus often sounds better than the original. Probably something to do with its psychoacoustic optimizations.
The artifacts can define the quirky sounds. They are like film grain in MPEG/WMV/AVI or old VHS rips; you cannot recreate them in OPUS, because they were recorded that way. The recording gear and the mastering determine how the streaming audio should be encoded. OPUS probably sounds better now with SoX-resampler-equipped audio players.
Resume information. There have been several attempts, but none have become an accepted standard.
When I was a consultant, this was the one standard I longed for the most. A data file where I could put all of my information, and then filter and format it for each application. But ultimately, I wanted to be able to submit the information in a standardised format - without having to re-enter it endlessly into crappy web forms.
I think things have gotten better today, but at the cost of a reliance on a monopoly (LinkedIn). And I’m no longer in that sort of job market. But I think that desire was so strong it’ll last me until I’m in my grave.
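For what it’s worth, the community JSON Resume schema is one attempt at exactly this: one canonical data file, filtered and rendered per application. A hedged sketch (the field layout is illustrative, loosely modeled on that schema, and the `keywords` field is hypothetical):

```python
import json

# One canonical resume file; filter entries for a specific application.
with open("resume.json") as f:
    resume = json.load(f)

# Keep only work entries tagged with a relevant keyword (hypothetical field).
relevant = [job for job in resume["work"] if "embedded" in job.get("keywords", [])]
for job in relevant:
    print(f"{job['position']} at {job['name']} ({job['startDate']})")
```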
JPEG-XL for rasterized images.
I agree.
I especially love that it addresses the biggest pitfall of the typical “fancy new format does things better than the one we’re already using” transition, in that it’s specifically engineered to make migration easier, by allowing a lossless conversion from the dominant format.
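Concretely, with the reference libjxl tools the round trip is a pair of one-liners; a sketch assuming the `cjxl`/`djxl` binaries are installed (file names illustrative):

```python
import subprocess

# Losslessly re-wrap an existing JPEG as JPEG XL (reversible transcode).
subprocess.run(["cjxl", "photo.jpg", "photo.jxl", "--lossless_jpeg=1"], check=True)

# Reconstruct the original JPEG from the .jxl.
subprocess.run(["djxl", "photo.jxl", "photo_restored.jpg"], check=True)
```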
Never heard of that, thanks for bringing it to my attention!
GNOME introduced its support in version 45, AFAIK there isn’t a stable distro release yet that ships it.
Unfortunately, adoption has been slow, and the Alliance for Open Media is pushing back somewhat (especially Google[1], who leads the group) in favor of their inferior `.avif` format.
This is the kind of thing I think about all the time, so I have a few.

- Archive files: `.tar.zst`
  - Produces better compression ratios than the DEFLATE compression algorithm (used by `.zip` and `gzip`/`.gz`) and does so faster.
  - By separating the jobs of archiving (`.tar`), compressing (`.zst`), and (if you so choose) encrypting (`.gpg`), `.tar.zst` follows the Unix philosophy of “Make each program do one thing well.”
  - `.tar.xz` is also very good and seems more popular (probably since it was released 6 years earlier, in 2009), but, when tuned to its maximum compression level, `.tar.zst` can achieve a compression ratio pretty close to LZMA (used by `.tar.xz` and `.7z`) and do it faster[1]:
    > zstd and xz trade blows in their compression ratio. Recompressing all packages to zstd with our options yields a total ~0.8% increase in package size on all of our packages combined, but the decompression time for all packages saw a ~1300% speedup.
- Image files: JPEG XL / `.jxl`
  - “Why JPEG XL”
  - Free and open format.
  - Can handle lossy images, lossless images, images with transparency, images with layers, and animated images, giving it the potential of being a universal image format.
  - Much better quality and compression efficiency than current lossy and lossless image formats (`.jpeg`, `.png`, `.gif`).
  - Produces much smaller files for lossless images than AVIF[2].
  - Supports much larger resolutions than AVIF’s 9-megapixel limit (important for lossless images).
  - Supports up to 24-bit color depth, much more than AVIF’s 12-bit color depth limit (which, to be fair, is probably good enough).
- Videos (codec): AV1
  - Free and open format.
  - Much more efficient than x264 (used by `.mp4`) and VP9[3].
- Documents: OpenDocument / ODF / `.odt`
  - `.odt` is simply a better standard than `.docx`.
  - @raubarno@lemmy.ml says it best here:
    > it’s already a NATO standard for documents … Because the Microsoft Word ones (.doc, .docx) are unusable outside the Microsoft Office ecosystem. I feel outraged every time I need to edit a .docx file because it breaks the layout easily. And some older .doc files won’t even open in modern Microsoft Word.
is av1 lossy
AV1 can do lossy video as well as lossless video.
> By separating the jobs of archiving (`.tar`), compressing (`.zst`), and (if you so choose) encrypting (`.gpg`), `.tar.zst` follows the Unix philosophy of “Make each program do one thing well.”

wait so does it do all of those things?

So there’s a tool called tar that creates an archive (a `.tar` file). Then there’s a tool called zstd that can be used to compress files, including `.tar` files, which then become `.tar.zst` files. And then you can encrypt your `.tar.zst` file using a tool called gpg, which would leave you with an encrypted, compressed `.tar.zst.gpg` archive.

Now, most people aren’t doing everything in the terminal, so the process for most people would be pretty much the same as creating a ZIP archive.
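For the curious, here’s a minimal sketch of the first two steps done programmatically; it assumes the third-party `zstandard` package, `my_dir` is an illustrative path, and encryption would be a third pass through gpg:

```python
import io
import tarfile

import zstandard  # third-party: pip install zstandard

# 1) archive: pack a directory into an in-memory, uncompressed tar
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tar:
    tar.add("my_dir")

# 2) compress: run the tar bytes through zstd
with open("my_dir.tar.zst", "wb") as f:
    f.write(zstandard.ZstdCompressor(level=19).compress(buf.getvalue()))
```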
Damn didn’t realize that JXL was such a big deal. That whole JPEG recompression actually seems pretty damn cool as well. There was some noise about GNOME starting to make use of JXL in their ecosystem too…
`.tar` is pretty bad as it lacks an index, making it impossible to quickly seek around in the file. The compression on top adds another layer of complication. It might still work great as a tape archiver, but for sending files around the Internet it is quite horrible. It’s really just getting dragged around for cargo-cult reasons, not because it’s good at the job it is doing.

In general I find the archive situation a little annoying, as archives are largely completely unnecessary; that’s what we have directories for. But directories don’t exist as far as HTML is concerned, and only single files can be downloaded easily. So everything has to get packed and unpacked again, for absolutely no reason. It’s a job computers should handle transparently in the background, not an explicit user action.

Many file managers try to add support for `.zip` and allow you to go into an archive like it is a folder, but that abstraction is always quite leaky and never as smooth as it should be.

> By separating the jobs of archiving (`.tar`), compressing (`.zst`), and (if you so choose) encrypting (`.gpg`), `.tar.zst` follows the Unix philosophy of “Make each program do one thing well.”
The problem here being that GnuPG does nothing really well.
> Videos (codec): AV1
> - Much more efficient than x264 (used by `.mp4`) and VP9[3].

AV1 is also much younger than H.264 (AV1 is a specification, x264 is an implementation), and only recently have software encoders become somewhat viable; a more apt comparison would have been AV1 to HEVC, though the latter is also somewhat old nowadays and still a competitive codec. Unfortunately there currently aren’t many options to use AV1 in a very meaningful way; you can encode your own media with it, but that’s about it. You can stream to YouTube, but YouTube will recode to another codec.
> The problem here being that GnuPG does nothing really well.

Could you elaborate? I’ve never had any issues with gpg before and am curious what people are having issues with.
> Unfortunately there currently aren’t many options to use AV1 in a very meaningful way; you can encode your own media with it, but that’s about it. You can stream to YouTube, but YouTube will recode to another codec.

AV1 has almost full browser support (iirc), and companies like YouTube, Netflix, and Meta have started moving over to AV1 from VP9 (since AV1 is the successor to VP9). But you’re right, it’s still working on adoption; this is more my dreamworld than a prediction of future standardization.
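Encoding your own media is quite approachable already, though; a hedged sketch using SVT-AV1 through ffmpeg (assumes an ffmpeg build with libsvtav1 and libopus; file names and quality settings are illustrative):

```python
import subprocess

# Encode video to AV1 (SVT-AV1) with Opus audio, in a Matroska container.
subprocess.run(
    ["ffmpeg", "-i", "input.mp4",
     "-c:v", "libsvtav1", "-crf", "35", "-preset", "6",
     "-c:a", "libopus", "-b:a", "128k",
     "output.mkv"],
    check=True,
)
```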
> Could you elaborate? I’ve never had any issues with gpg before and am curious what people are having issues with.

This article and the blog post linked within it summarize it very well.
> **Encrypting Email**
>
> Don’t. Email is insecure. Even with PGP, it’s default-plaintext, which means that even if you do everything right, some totally reasonable person you mail, doing totally reasonable things, will invariably CC the quoted plaintext of your encrypted message to someone else.
Okay, provide me with a widely-used open standard that provides similar functionality. It isn’t there. There are parties who would like to move email users into their own little proprietary walled gardens, but that’s not a replacement for email.
The guy is literally saying that encrypting email is unacceptable because it hasn’t been built from the ground up to support encryption.
I mean, the PGP guys added PGP to an existing system because otherwise nobody would use their nifty new system. Hell, it’s hard enough to get people to use PGP as it is. Saying “well, if everyone in the world just adopted a similar-but-new system that is more-amenable to encryption, that would be helpful”, sure, but people aren’t going to do that.
The message to be taken from here is rather “don’t bother”: if you need secure communication, use something else. If you’re just using it so that Google can’t read your mail, it might be OK, but don’t expect this solution to be secure or anything. It’s security theater for the reasons listed.

But the threat model for some people is a powerful adversary who can spend millions on software to find something against you in your communication and controls at least a significant portion of the infrastructure your data travels through. Think about whistleblowers in oppressive regimes; it’s absolutely crucial there that no information at all leaks. There’s just no way to safely rely on mail + PGP for secure communication there. And if you’re fine with your secrets leaking at one point or another, you didn’t really need that felt security in the first place. But then again, you’re just doing what the blog calls LARPing in the first place.
wait im confused, whats the difference between .tar.zst and .tar.xz
Different ways of compressing the initial `.tar` archive.
But it’s not a tarxz, it’s an xz containing a tar, and you perform operations from right to left until you arrive back at the original files with whatever extensions they use.
If I compress an exe into a zip, would you expect that to be an exezip? No, you expect it to be file.exe.zip, informing you (and your system) that this file should first be unzipped and then executed.
use a real operating system then
I get your point. Since a `.tar.zst` file can be handled natively by `tar`, using `.tzst` instead does make sense.
> .odt is simply a better standard than .docx.

No surprise, since OOXML is barely even a standard.
I wish there was a more standardized open format for documents. And more people and software should use markdown/.md because you just don’t need anything fancier for most types of documents.
Yes, but only if everyone adheres to the CommonMark version of Markdown.
I agree, we need support for it in LibreOffice and then in other document editors.

We can’t expect people to write the markup by hand, but an editor that saves to it would be great.
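For what it’s worth, rendering strictly per the CommonMark spec is already easy from code; a small sketch assuming the third-party markdown-it-py package:

```python
from markdown_it import MarkdownIt  # third-party: pip install markdown-it-py

# Use the strict CommonMark preset so output doesn't depend on one site's dialect.
md = MarkdownIt("commonmark")
print(md.render("Some *formatted* text with [a link](https://commonmark.org)."))
```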
Data output from manufacturing equipment. Just pick a standard. JSON works. TOML / YAML if you need to write as you go. Stop creating your own format that’s 80% JSON anyways.
JSON is nicer for some things, and YAML is nicer for others. It’d be nice if more apps would let you use whichever you prefer. The data can be represented in either, so let me choose.
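A sketch of what “let the user choose” could look like on the consuming side; `tomllib` is stdlib since Python 3.11, PyYAML is a third-party assumption, and the `load_data` function name is hypothetical:

```python
import json
import pathlib
import tomllib  # stdlib since Python 3.11

import yaml  # third-party: pip install pyyaml

def load_data(path: str):
    """Parse equipment output regardless of which format it was written in."""
    p = pathlib.Path(path)
    if p.suffix == ".json":
        return json.loads(p.read_text())
    if p.suffix == ".toml":
        return tomllib.loads(p.read_text())
    if p.suffix in (".yaml", ".yml"):
        return yaml.safe_load(p.read_text())
    raise ValueError(f"unsupported format: {p.suffix}")
```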
SQLite for all “I’m going to write my own binary format because I is haxor” jobs.
There are some specific cases where SQLite isn’t appropriate (streaming). But broadly it fits in 99% of cases.
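A minimal sketch of the idea with Python’s built-in sqlite3 module (the file, table, and key names are all illustrative):

```python
import sqlite3

# A tiny application "save file" backed by SQLite instead of a custom binary format.
con = sqlite3.connect("app_state.db")
con.execute("CREATE TABLE IF NOT EXISTS kv (key TEXT PRIMARY KEY, value TEXT)")
con.execute("INSERT OR REPLACE INTO kv VALUES (?, ?)", ("last_open", "report.odt"))
con.commit()

(value,) = con.execute("SELECT value FROM kv WHERE key = ?", ("last_open",)).fetchone()
print(value)  # -> report.odt
```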
To chase this - converting to json or another standardized format in every single case where someone is tempted to write their own custom parser. Never write custom parsers kids, they’re an absolutely horrible time-suck and you’ll be fixing them quite literally forever as you discover new and interesting ways for your input data to break them.
Edit: it doesn’t have to be json, I really don’t care what format you use, just pick an existing data format that uses a robust, thoroughly tested, parser.
Also parquet if the data aren’t mutated much.
XML for machine-readable data, because I live to cause chaos.

Either Markdown or Org for human-readable text-only documents. MS Office formats and the way they are handled have been a mess since the 2007 -x versions were introduced, and those and Open Document formats are way too bloated for when you only want to share a presentable text file.
While we’re at it, standardize the fucking markdown syntax! I still have nightmares about Reddit’s degenerate four-space-indent code blocks.
Man, I’d love if markdown was more widely used, it’s pretty much the perfect format for everything I do
Markdown, CommonMark, and .rst formats are good for producing basic rich text for technical documentation and so on, when text styling is handled by an external application and you don’t care about reproducible layout.

But when you also want custom styles (font size, text alignment, colours), page layout (paper format, margin size, etc.), reproducibility across multiple processing applications so the layout doesn’t break, authoring tools, maybe even some version control, etc., that’s when it bites you.
Markdown has no checkboxes anywhere, especially in tables.
But Markdown is just good. It’s basically just writing text as normal.
I’d like an update to the epub ebook format that leverages zstd compression and jpeg-xl. You’d see much better decompression performance (especially for very large books,) smaller file sizes and/or better image quality. I’ve been toying with the idea of implementing this as a .zpub book format and plugin for KOReader but haven’t written any code for it yet.
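A hedged sketch of what that repacking could look like; the `.zpub` container here is purely hypothetical, and it assumes the third-party `zstandard` package plus an existing EPUB (which is itself just a ZIP):

```python
import io
import tarfile
import zipfile

import zstandard  # third-party: pip install zstandard

# Unpack the EPUB (a ZIP) and repack its contents as a zstd-compressed tar.
with zipfile.ZipFile("book.epub") as epub:
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tar:
        for name in epub.namelist():
            data = epub.read(name)
            info = tarfile.TarInfo(name)
            info.size = len(data)
            tar.addfile(info, io.BytesIO(data))

with open("book.zpub", "wb") as out:
    out.write(zstandard.ZstdCompressor(level=19).compress(buf.getvalue()))
```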
Matroska for media; we already have MKA for audio and MKV for video. An image container would be good too.

mp4 is more prone to data loss and slower to parse, while also being less flexible; despite this, it seems to be a sort of pseudo-standard.
(MP4, M4A, HEIF formats like heic, avif)
wait why not av1 or jpegxl
those are media formats, not containers.
TOML for configuration files
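Parsing it doesn’t even need a third-party library anymore; a minimal sketch with Python’s stdlib parser (the keys are illustrative):

```python
import tomllib  # stdlib since Python 3.11

config = tomllib.loads("""
[server]
host = "127.0.0.1"
port = 8080
""")
print(config["server"]["port"])  # -> 8080
```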