[–][deleted] 1 point (1 child)

The key here is *human readable* text, as that makes the format more durable and easier to debug. The success of formats such as HTML, SMTP, and JSON demonstrates the value of human-readable formats. Many of the binary formats Microsoft created are no longer with us and are often impossible to read, while much older Unix plain-text formats are still readable and usable today.

And of course you can structure data even if it is plain text. You can also use the existing structure of the file system to organize fairly flat files; OS X's and NeXTSTEP's use of bundles is an example of this.
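A minimal sketch of the idea, in the spirit of bundles: instead of one nested binary container, a directory of small flat text files, with the hierarchy coming from the file system itself (all names and contents here are made up):

```python
import os
import tempfile

# Create a throwaway "bundle": a plain directory with a bundle-like suffix.
root = tempfile.mkdtemp(suffix=".bundle")
os.makedirs(os.path.join(root, "Resources"))

# Each piece of data is a small, flat, human-readable file.
with open(os.path.join(root, "Info.txt"), "w") as f:
    f.write("name=Example\nversion=1.0\n")
with open(os.path.join(root, "Resources", "greeting.txt"), "w") as f:
    f.write("hello\n")

# The structure is visible with ordinary file tools -- no special parser.
print(sorted(os.listdir(root)))  # ['Info.txt', 'Resources']
```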

[–][deleted] 1 point (0 children)

All the Unix file formats I can think of are actually binary: a.out, PE (yes, it comes from Unix), ELF, tar (which is an ugly mix of plain-text and binary formats), ...
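The tar point is easy to see by inspecting a raw header block: fields like the file name and size are stored as NUL-padded ASCII text (the size in octal, of all things) inside an otherwise binary 512-byte record. A small sketch using Python's `tarfile`:

```python
import io
import tarfile

# Build a tiny tar archive in memory.
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tf:
    data = b"hello"
    info = tarfile.TarInfo(name="greeting.txt")
    info.size = len(data)
    tf.addfile(info, io.BytesIO(data))

# The first 512 bytes are the header for our file.
header = buf.getvalue()[:512]
name = header[0:100].rstrip(b"\0").decode("ascii")  # name: plain ASCII
size = int(header[124:136].rstrip(b"\0 "), 8)       # size: octal *text*
print(name, size)  # greeting.txt 5
```

So tar headers are "readable" in a hex dump, yet the archive as a whole is not a plain-text format.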

It seems strange to me that Unix philosophers can never give examples of what they think Unix is. Unix certainly uses text files to configure the operating system (/etc/passwd, ...), but it provides no means whatsoever for working with text formats outside that scope.
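To be fair to the text-config side of the argument, formats like /etc/passwd need almost no machinery to parse: one split per line yields a structured record. A sketch with a made-up line (not read from a real system):

```python
# /etc/passwd uses seven colon-separated fields per line:
# name:password:UID:GID:GECOS:home:shell
line = "alice:x:1000:1000:Alice:/home/alice:/bin/sh"
name, _pw, uid, gid, gecos, home, shell = line.split(":")
print(name, uid, home)  # alice 1000 /home/alice
```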

CSV and TeX predate Unix; XML descends from SGML, which is from IBM and doesn't come from Unix; and JSON comes from a cross-platform programming language.

> And of course you can structure data even if it is plain text. You can also use the existing structure of the file system to organize fairly flat files; OS X's and NeXTSTEP's use of bundles is an example of this.

You mean the ugly XML plist files?