Most modern software comes with a web UI because... well, no one can really explain why, but it's what floats programmers' boats these days. Anyway, the stuff is installed, Apache has finally given up its resistance to delivering the fancy UI, and now the problems are only starting. "Click this", the manpage says, "and SUPERSOFTWARE will automatically download the necessary additional scripts/themes/whatever". No, it won't, because the software cannot reach the Internet: there is no Internet here. But that won't stop this piece of crap someone actually dares to call a product ready for use from trying -- and failing miserably by burping up a complete Java stack trace containing every, literally every, detail about the program that I'm not the least bit interested in. It tells me everything except the URL the program tried to fetch and the directory the download needs to go to (so I could fetch it myself from a system that does have Internet access, in case the internetlessness is there for security reasons).
My next step? Use tcpdump to see which host (and port) the program tried to connect to; set up a fake daemon and a DNS record to snatch that connection (and hope it doesn't validate any certificates in case of HTTPS -- a sure bet it doesn't) and extract the URL. If that's too much work, grep through the program's files for URLs and the like. Things I totally like to spend my time on, preferably my free time.
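For the record, that detour can be sketched roughly like this -- the interface name, install path, and hostname are made up for illustration, and the grep demo runs against a scratch directory standing in for the install tree:

```shell
# 1) While reproducing the failure, watch DNS lookups and outgoing TCP SYNs
#    in another terminal (needs root; eth0 is an assumption):
#    tcpdump -ni eth0 'udp port 53 or (tcp[tcpflags] & tcp-syn != 0)'
#
# 2) Then point the looked-up name at a local listener (e.g. via /etc/hosts)
#    and read the request line off something like "nc -l 80".
#
# 3) Often quicker: grep the installed tree for hard-coded URLs.
#    Demo against a scratch directory instead of the real install path:
demo=$(mktemp -d)
printf 'repo = "http://updates.example.com/themes/pack.zip"\n' > "$demo/config.ini"
grep -rEoh 'https?://[^"'\'' )]+' "$demo" | sort -u
# prints: http://updates.example.com/themes/pack.zip
rm -rf "$demo"
```

Step 3 alone often suffices: URLs baked into config files or jars survive a plain recursive grep, no packet capture required.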
Some minutes later, I'm again toying around with the super-shiny web interface, where it says "Documentation". I like documentation that comes in a convenient form, say, a web page. A manpage is fine, too, but web pages may even contain those brand-new things, pictures. You may have heard of them. I get my expectations up, click the link -- and my web browser sadly informs me that there is a technical problem with following it. Sure enough, that's because the developer linked to an external resource which is only to be found on the Internet. With a link that looks like
http://deep.structure.suckme.projectdom.tld/lets/go/very/deepintodetail.fancyextension?q=ba55fbceee1411e5a360001e378c8735bddee6aaee1411e5a360001e378c8735. If my system is Internet-disconnected for security purposes, I will now probably have to hand-copy this handy link character by character... or just start googling for a picture of the developer and get out the darts.
Leaving the rage behind, here are the pros and cons of excluding some resources from the installation archive. The pros:
- It's easy to keep resources current. Documentation can always be updated to include new aspects that were forgotten before, providing a better experience for users. Scripts, themes, and templates may even be subject to security-relevant updates which take effect for everyone who downloads them, without the need to wait for distributions to pick up and deploy a new binary package release.
- It works for 99 percent of users. "Every system today has Internet access" -- one way or another. Thus delivering things over this channel has simply become common. No one would expect me to install an RPM from a floppy disk; the natural expectation is that packages are freshly downloaded.
- Not packing all available themes/extension scripts into the RPM saves precious space, since the user downloads only those extensions he actually needs. The same goes for the documentation. The user won't have his hard disk filled with stuff he hardly ever looks at, yet in case he needs it, it's available.
The cons:
- If you do not want to be permanently Internet-connected, or you cannot be, you've lost the game; mostly before you even started it.
- Whenever I follow a link, a referrer tells the target site where I just came from (forum-referrers.html). The same goes for a download: both leak information to the target site, starting with the mere existence of the software installation. I may not want such information to be sent out -- no discussion. Maybe because I have a special mistrust of this particular provider, or because I want to install the software in such a sensitive context that I cannot risk those downloads being used as attack vectors.
- In many cases the savings in bandwidth are hardly a valid argument, as most entertaining Internet sites come with several megabytes of additional scripts/images/interactive content which I happily download.
Please, pretty fucking please, do include all documentation in binary packages. I can live with it being updated only as part of the regular update cycle and thus not being day-current. The same goes for scripts/extensions/themes. As long as they do not consume hundreds of megabytes, I do have the necessary space on my hard disk.
You can even make it optional: create a full-package and a net-package.
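In RPM terms, such a split could be as little as one extra subpackage -- a hypothetical spec fragment, with all names invented for illustration:

```
# Hypothetical spec fragment: ship docs and themes in an optional
# "full" subpackage so offline users can install everything at once.
%package full
Summary:  SUPERSOFTWARE with all documentation and themes bundled
Requires: %{name} = %{version}-%{release}

%description full
Everything the net-package would otherwise download on demand.
```

Online users install the lean base package; the rest of us grab the -full subpackage once, from a mirror we can actually reach.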
Do not try to install something on your notebook while travelling through Mecklenburg-West Pomerania by train.