Experiment: try some fancy new piece of software (OSS preferably) which needs some serious system integration (I'm certainly not talking about a fuckin' editor or office suite; think Graylog, Cobbler, and similar). Condition: you download the binary package (be it deb, be it rpm) and then try to set it up without Internet access. Sucks? Don't tell me.

The experience

Most modern software comes with a web UI because... oh well, no one can really explain why, but it's what's floating programmers' poop these days. Anyway, the stuff is installed, Apache has finally given up its resistance to delivering the fancy UI, and then the problems are only starting. "Click this", the manpage says, "and SUPERSOFTWARE will automatically download the necessary additional scripts/themes/whatever". No, it won't, because the software cannot reach the Internet, there being none. But that won't stop this piece of crap someone actually dares to call a product ready for use from trying -- and failing miserably by burping up a complete Java stack trace containing every, literally every detail about the program I'm not the least bit interested in. It tells me everything except the URL the program tried to fetch and the directory the file needs to land in (so I could fetch it myself from a system that has Internet access, if the internetlessness is there for security purposes). Something like the sketch below would have saved the afternoon.
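For contrast, here is what a useful failure could look like -- a minimal sketch with made-up names and URLs throughout (SUPERSOFTWARE's real code obviously looks nothing like this):

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URI;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class ResourceFetcher {

    /** Hypothetical example: download one extra resource and, on failure,
     *  tell the admin exactly what to fetch and where to put it. */
    static void fetchResource(URI url, Path targetDir) throws IOException {
        Path target = targetDir.resolve(Paths.get(url.getPath()).getFileName());
        try (InputStream in = url.toURL().openStream()) {
            Files.createDirectories(targetDir);
            Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
        } catch (IOException e) {
            // The one line an offline admin actually needs -- no stack-trace burp:
            throw new IOException("Could not download " + url
                    + ". If this host has no Internet access, fetch the file "
                    + "on another machine and place it at " + target, e);
        }
    }

    public static void main(String[] args) throws IOException {
        // Made-up URL and directory, purely for illustration.
        fetchResource(URI.create("http://example.org/themes/default.zip"),
                Paths.get("/var/lib/supersoftware/themes"));
    }
}
```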
My next step? Use tcpdump to see which host the program tried to connect to (and on which port); set up a fake daemon and a DNS record to snatch that connection (hoping it doesn't validate any certs in case of HTTPS -- a sure bet that it doesn't), and extract the URL. If that's too much work, grep through the program's files for URLs and the like. Things I totally like to spend my time on, preferably in my free time. The fake-daemon part, at least, is quick; see the sketch below.
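Something like this, perhaps -- a minimal sketch (plain HTTP only; the hard-coded port and the whole approach are my assumptions, not any particular tool):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.ServerSocket;
import java.net.Socket;

/** Tiny fake daemon: accept connections, dump the request line and the
 *  headers, send nothing back. Point the faked DNS record at this box
 *  and the mystery URL shows up on stdout. */
public class SnatchDaemon {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(80)) { // needs root or a port redirect
            while (true) {
                try (Socket client = server.accept();
                     BufferedReader in = new BufferedReader(
                             new InputStreamReader(client.getInputStream()))) {
                    String line;
                    // Read headers until the empty line that ends them.
                    while ((line = in.readLine()) != null && !line.isEmpty()) {
                        System.out.println(line); // e.g. "GET /scripts/foo.js HTTP/1.1"
                    }
                }
            }
        }
    }
}
```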

Some minutes later, I'm again toying around with the super-shiny web interface, where it says "Documentation". I like documentation which comes in a convenient manner, say, as a web page. A manpage is fine, too, but web pages may even contain these brand-new things, pictures. You may have heard of them. I get my expectations up, click the link -- and my web browser sadly informs me that there is a technical problem with following that link. Sure enough, that's because the developer set a link to an external resource which is to be found on the Internet. A link that looks like http://deep.structure.suckme.projectdom.tld/lets/go/very/deepintodetail.fancyextension?q=ba55fbceee1411e5a360001e378c8735bddee6aaee1411e5a360001e378c8735. If my system is Internet-disconnected for security purposes, I'll probably now have to copy this handy link over by hand... or just start googling for a picture of the developer and get me the darts.

Analysis

Leaving the rage behind, here are the pros and cons of excluding some resources from the installation archive.

The pros:

- Smaller packages: nobody has to download megabytes of themes, scripts, and documentation they may never use.
- Fetched resources are always current instead of frozen at whatever state they had when the package was built.

The cons:

- On a host without Internet access the software plainly does not work, and it fails at the worst moment: at runtime, halfway through setup.
- The error messages name neither the URL nor the target directory, so the workaround is tcpdump, fake daemons, and grep.
- Documentation links point straight into the void.

The solution

Please, pretty fucking please, do include all documentation in binary packages. I can live with it being updated as part of the regular update cycle and thus not being day-current. The same goes for scripts/extensions/themes. As long as they don't consume hundreds of megabytes, I have the necessary space on my hard disk.
You can even make it optional: ship a full package and a net package, as sketched below.
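From the software's side, preferring bundled resources over the network is not exactly rocket science either -- a minimal sketch, with a hypothetical bundle directory and download URL:

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URI;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class ThemeLocator {

    // Hypothetical: the full package ships everything here,
    // the net package leaves it empty.
    static final Path BUNDLED = Paths.get("/usr/share/supersoftware/themes");

    /** Prefer the theme shipped in the package; only touch the network
     *  if it isn't bundled -- and then fail with a usable message. */
    static InputStream openTheme(String name) throws IOException {
        Path local = BUNDLED.resolve(name);
        if (Files.exists(local)) {
            return Files.newInputStream(local); // the offline-friendly path
        }
        URI remote = URI.create("http://example.org/themes/" + name);
        try {
            return remote.toURL().openStream();
        } catch (IOException e) {
            throw new IOException("Theme " + name + " is not bundled and "
                    + remote + " is unreachable; install the full package "
                    + "or place the file at " + local, e);
        }
    }
}
```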

The inside scoop

Do not try to install something on your notebook while travelling through Mecklenburg-West Pomerania by train.
