May 15, 2018
 

Hard or Easy; Keeping up with Progress or “What are you going to do?”

There are several styles of computer use: the home and basic Internet user, the DIY hobbyist, the specialist looking for a better tool, the professional dependent on IT for productivity, the network professional, the programmer/developer, and the scientific and mathematical specialist in computer science. One can place oneself into one of these categories and decide how much energy to spend getting done what one wants.

The home user is geared to applications that click and work. Tutored now by smartphones, we expect things to just work and update automatically, and if something stops working, we just buy the new phone that does work. In GNU/Linux computing, buying one’s way out is not really necessary. Why buy new when an upgrade will nicely improve the suite of applications that most people use? Ubuntu fits this user pretty well: long term support for five years and a guarantee that if an application comes from the repository it will work and not break the system or other applications. The alert user will be aware of update cycles and support. One extra consideration at present is the state of CPU architecture: 64-bit multi-core processors are going to be more widely supported than the older single-core 32-bit types, at least in Ubuntu; Debian still supports a much wider range of architectures. Both are easy to install and update. But they are not the same, so one should think about the level of support to be found in person and via the Internet.
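(A quick, hedged aside for the curious home user: on either Debian or Ubuntu, two stock commands will report which architecture is installed and which release, and therefore which support cycle, the machine is on. This assumes the usual lsb-release package is present, and the exact output will vary from system to system.)

    # Print the architecture the installed system was built for,
    # e.g. "amd64" for 64-bit or "i386" for an older 32-bit install.
    dpkg --print-architecture

    # Show the distribution name, release number and codename, which
    # together tell you how long updates will keep arriving.
    lsb_release -a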

The DIY hobbyist is dedicated to a certain kind of work: photography, art, blogging, communication, sales via the Internet, etc. He or she has a focused goal and has found a computing solution that works. Sometimes that is available via the GNU/Linux distribution, hereafter ‘distro’, or through a cooperating third-party program. In Ubuntu there are PPAs (“Personal Package Archives”) and third-party repositories, and Ubuntu cooperates with these in the update process; see UbuntuUpdates.org for more information. The list is long, but not endless. Note that although these packages are labelled “.deb”, they are not part of the Debian OS. So far, so good, if Ubuntu and your particular third party are still cooperating when a new upgrade of Ubuntu is released. If not, there is a very good chance of a broken system. Why? Dependencies.
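For the curious, the usual PPA routine on Ubuntu looks roughly like the sketch below. The archive and package names here are placeholders, not a recommendation; substitute the third party you have actually decided to trust.

    # Hypothetical PPA and package, used purely for illustration.
    sudo add-apt-repository ppa:example-team/example-app

    # Refresh the package lists so apt can see the new archive,
    # then install from it.
    sudo apt update
    sudo apt install example-app

Once added, the PPA’s packages ride along with normal updates. That convenience is also the risk: when Ubuntu moves on and the PPA does not, the dependencies described next are where things break.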

Linux programming has long been based on linking smaller, already-built programs to do bigger jobs or to solve a problem in a new way. The same holds within a distro: applications make use of existing parts of the system and save time, space and work by sharing those common parts. These shared parts are called dependencies. So if you install an application and watch the process, it may say “Depends:” and list a variety of ‘lib_this&that’ items you had no idea were involved. In Ubuntu an example would be installing a KDE application on a GNOME desktop. The libs for each are quite different, as each team coordinated the basics in its own way, so the first KDE application you grab will bring in a group of dependency libs. These libs, named along the lines of “lib_someTask”, make the system work better and faster; they work as coordinated teams of linking parts. Except, that is, when they are removed as no longer necessary or replaced by a total rewrite under a different name. This happens over time as hardware and programmers improve how the computer works.
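You can see this for yourself without changing anything on the system. Roughly, and using the KDE text editor Kate only as a convenient example of a KDE application on a GNOME desktop:

    # List the declared dependencies of a package without installing it.
    apt-cache depends kate

    # Simulate the install (-s makes no changes and needs no root):
    # apt-get prints the long list of extra lib packages it would pull in.
    apt-get -s install kate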

The specialist user has gone beyond the distro to seek out that very special and useful program that is not necessarily part of the distro system. So he or she has either built the new application from source code or found a set of intermediate libs or equivalents to make an “alien” program run with the distro. Building an application means compiling from source, which goes a long way toward ensuring compatibility with the kernel and hardware; that is what compiling is about, building an application on the given machine. If you work at this level you accept the workload of compiling and testing until it works. There is also WINE, the non-emulator that offers the equivalent of the libs needed by a certain other software stream. And of course there is the option of virtualization, for running a totally foreign OS on a GNU/Linux machine. There are other ways to find and incorporate specialist applications by searching the Internet and getting advice from others. Going outside the distro obligates the specialist user to keep track of how many external sources are in use and whether their dependencies are met. Upgrading in this condition is not easy, as your distro may alter dependencies and create new ones. If your third-party items match, then all works; if not, it’s back to the Internet to see what others are doing. One solution might be to freeze the system at a working point and not upgrade. Off the Internet and with updating turned off this will work, but you are on your own, without one central place to find a solution should something fail.
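The classic source build on a Debian or Ubuntu system runs along these lines. The tarball name is a placeholder and an autotools-style project is assumed, so treat it as a sketch rather than a recipe.

    # Compiler, make and common build tools first.
    sudo apt install build-essential

    # Unpack the source and step inside.
    tar xf some-app-1.0.tar.gz
    cd some-app-1.0

    ./configure          # checks this machine for the needed libs
    make                 # compiles against exactly what was found
    sudo make install    # installs outside the package manager's view

That last comment is the catch: anything installed this way is invisible to apt, so it is the builder’s job to remember it at upgrade time.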

Network and programming professionals make many more sophisticated decisions about hardware and software on a daily basis. The basic operating system that best suits their work is just the beginning. The Internet is built on Linux servers, so the choice there is obvious. Programming can be done in any computing environment, but some are better than others; this may explain the use of Linux systems in CGI and the inclusion of the Bash shell in certain commercial/proprietary systems. For those new to GNU/Linux, the Bash shell gives a way in to learn about the next level of computing. Also called “the command line,” this text-only way of getting work done can be really efficient and helps the user get a feel for how the computer takes input to make output.
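One small taste of that efficiency, as a hedged example rather than a prescription: three ordinary tools chained together, each feeding its output to the next.

    # Show the ten largest files and directories under the home folder.
    du -ah ~ 2>/dev/null | sort -rh | head -n 10

du measures, sort orders, head trims; nothing here is special to any one distro, which is part of the command line’s appeal.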

The mathematical and scientific users need no help from these notes. They have studied the intricacies of the hardware/software interface and can think in the abstract, effective processes that get the electrons to do the necessary computational work: making multiple events flow in nanoseconds, doing vast calculations, gathering, analyzing or distributing data. Getting the fastest processors and the most efficient hardware systems to advance science is a combination of hardware budget and improvements in computational circuits. The computer is a tool that can handle great volumes of repeated tasks for the user; the nuanced job of choosing which tasks to automate, and how, is up to us humans.

So, “What do you want to do and how hard do you want to work at it?” Those are the questions to ask yourself. If it’s e-mail and a browser with some video, photography or audio work, you can have a really full experience with Debian or Ubuntu. I choose Ubuntu because the Internet support and information are copious and the availability of third-party applications is sometimes desirable. Debian has broader hardware support, is totally community based, and makes up most of what Ubuntu is anyway. [In a future post I may address the most obvious user-facing difference between them: root vs. sudo for administration.]

Debian and Ubuntu provide fully graphical interfaces. Installation is still a bit easier with Ubuntu, but the difference is small. Either way, a choice must be made, because they are not interchangeable. I believe the variety in the appearance of the desktop is a benefit available in either one, and that GNU/Linux offers the user the best computing experience and customization. Will it work for you? I think so, wherever you fall on the continuum of GNU/Linux users.