It all began with a hatred of Windows (especially after years of using other systems, which were much better). The final straw was the continual crashing of a particular PC (a rather pathetic computer, one that I really only got for experimenting with), which now wouldn't even boot up; and another PC, which was forever having GDI errors, was also making me heartily sick of Windows (for a so-called multi-tasking computer, it has a hell of a job just trying to do a few things at once).
Faced with having to reinstall an operating system on at least one computer, I decided to have yet another go at an alternative system; and now having a spare 13 gig HDD, and two computers, I could experiment around without risking losing something important, being put out of action, or having to spend a lot of money.
One thing I often hear is: “You're running Windows 98 SE? Update it!” Well, I paid for Win98SE, and it's never worked right. As far as I'm concerned, I'm owed an “operating” system! However, I don't see Microsoft ever fulfilling their end of the deal; and I'm certainly not about to spend another couple of hundred dollars (or more) buying another of their half-operating systems, complete with all their bugs, security risks, and sloppy programming practices, locking myself into forever spending money on a system in a vain attempt to improve it. And as much as I have no respect for Microsoft, I don't consider “not paying” for one of their systems to be a solution, either (not only is it wrong, but I'd still be stuck with a bad system).
So, it's time to try an alternative operating system, again. Linux has finally listed my hardware on their “supported” hardware lists, so I'll have another bash at it; and if I can get one PC working nicely using it, I'll probably run both using it.
If you're not familiar with what's meant by operating systems, or software: the operating system is the basic programming that makes the computer work (it gives you a way of accessing the hardware, such as screens, disk drives and keyboards, and gives other programs a common way of working with your computer), while software means the programs that you run (your word processor, web browser, etc.).
Some of the common “operating systems” are AmigaOS, MacOS, Linux, MS-DOS, Windows, and Unix. They're all different from each other, and basically incompatible, although some have ways of utilising one, or more, of the other operating systems. Because they're different, programs have to be written differently for each of them; and although some manufacturers do produce different versions of their programs for the different operating systems (e.g. a Macintosh and a Windows version of their word processor), most programs are written for a specific operating system (e.g. a Windows-only word processor). Alternatively, a common programming language (e.g. Java) can be used on different types of computers, acting as a middle-man: one program is written to run on Java, and since Java can be run on various different types of computers, they can all run that one program. Even so, there still tends to be variations between the different operating systems (so that Java on the Macintosh is not exactly equal to Java on a Windows computer).
The obvious thing to occur (to most people) is: wouldn't it be easier if there was just one operating system? Yes, it would, but the trouble is that it's not very likely to happen (there are too many commercial interests vested in the different systems), and the most prolific operating system (Windows) is very bad (it's expensive, unsafe, unstable, unreliable, and so on); it wouldn't be in our best interests if it managed to become the only operating system (other systems are safer; less prone to viruses; more stable; cheaper; don't make such huge demands as to need expensive hardware; don't force you into upgrading just because other people you work with have upgraded, and you cannot get your systems, or data, to work with each other; and aren't subjected to ridiculously restrictive, irresponsible, incomprehensible and complex licenses; etc.).
Then there's the case that we really don't all need to be using the same system, anyway (despite what people are often told, and end up believing, without really “thinking” about it). Very few of us actually need to be doing the exact same thing on our computer as someone else; and even when we're doing the same “task” (word processing, e-mailing, or whatever else), we rarely need to be using the exact same program as another person, so long as we can exchange data in a form that's understood on more than one type of program (there are many ways of writing text, or generating pictures, that can be used on many different types of computer systems).
It started with MULTICS (Multiplexed Information and Computing Service), a multi-user operating system developed in the late 1960s, that wasn't what you'd call a success (even though it was developed into a working system).
[“Multiplexing,” meaning that it handles several different tasks sequentially, sharing its time between different tasks, giving it the ability to apparently do several things at the same time. And “multi-user,” meaning that several different people could use the computer at the same time; they'd have individual consoles (“terminals”), which connect to the main computer (the “main frame,” the “frame” referring to a rack of equipment, that collectively formed a computer system). The idea of mainframe systems, is that there's a central computer system, that does all the work, and people use simple terminals which are little more than a display, and a way to communicate with the mainframe. Hence the need for the mainframe to be able to do many tasks, for several users.]
UNIX (named as a pun on the MULTICS name) was developed in the early 1970s, by some of the people who'd worked on MULTICS, as a simpler successor to it; and to this day, it's still used in a variety of forms. It's a stable and reliable system, designed to be used by multiple users (people, typically), keeping them isolated from each other, and from vital system functions.
Then, in the early 1990's, Linus Torvalds developed Linux (allegedly pronounced somewhat like “lee nucks,” as a pun on the UNIX name, and combining his own name into the operating system's name), as a “free” version of UNIX.
Yes, a “free” operating system. You can have it for free, if you can find some way to get your hands on it, for free. It's “free” in the same sense that you can go and walk in a park, for “free”; but you might have to spend money travelling to get there.
Linux is a huge download, so it might cost you something to download it (the same as anything else that you download, unless you have a free internet and phone service), or you can buy copies that someone else has already downloaded for you, or buy a copy that someone else has produced for sale (e.g. a boxed set, with printed manuals).
Although the Linux system is free, people are allowed to sell copies of it (they're providing you with a service; whether they charge for it, or how much, is up to them). Some people also provide a pay service, to give you assistance in using it (the same as you might pay for tuition, or service, for something else).
There's also a lot of free software that you can use with it. So, apart from the cost of your computer hardware, and possibly any expenses on obtaining copies of software, you can actually have an operating system, and just about any software you need, without having to pay for it. For people with several computers, this can be a godsend, as you don't have to buy several copies of the same thing (as many commercial software licenses dictate you must do). Quite often, the Linux installation discs also include a lot of software, so you might even be able to get everything in one go (operating system, word processors, image manipulation programs, databases, web and mail servers, etc).
I'd tried Red Hat Linux before (version 6.0), but only had a small spare hard drive, and Linux only supported a fraction of my hardware (little more than drives, mouse, keyboard, and the screen; no sound, no network, no peripherals). The PC had less RAM, back then, and all of that put the mockers on almost anything that I tried.
I also tried BeOS, but it also didn't support much of my hardware, and this PC100 (PC Chips?) computer, has some of the crappest hardware I've come across (most PCI cards and RAM DIMMs will not work in it).
And I had even less luck with other versions of Linux (Caldera OpenLinux, for one; I cannot recall the other one that I tried, it may have been Debian). They crashed during the install routine, never getting much past the mouse and keyboard detection; a later version of Caldera got a little further, but detected no graphics card.
Just so you know what this PC contains, it's got just about everything on a cheap (unbranded) M571LMR motherboard. It was designed for Windows 98, and came with a few utilities and Corel WordPerfect Suite 8 (which completely trashed the Windows installation, twice over, and wanted the installation of the ancient, and dire, Windows Messaging service; so the Corel suite never got allowed a third chance at being installed).
The other PC, the one with recurring GDI problems, is an ABIT BE6 motherboard, with fewer things on the motherboard, but lots of things plugged into expansion slots. It didn't have an actual floppy disk drive, so that made writing bootdisks a problem (the LS-120 emulates a floppy, to some degree, but doesn't always support the type of “raw” writing utilities typically used to make bootdisks).
I bought Red Hat 7.3 as a 3 disc set (software, but no source files) with a book sold at the local newsagent (“Operating in Linux,” 3rd edition; an Australian publication by Next Publishing Pty. Ltd.). It got stuck any time the installer tried to get a file from the third disc. It was a major pain trying to run an install which avoided that disc, as you didn't really know which disc files were coming from, and you'd sit through around an hour of installing before it died, complaining that disc 3 really wasn't disc 3. Although you could inspect the discs (on another computer), to see what files were where, various packages required other packages, which you had no clues about.
Also, it would crash while testing your configuration of the X server. If you bypassed this step, and tried to configure X from the installed Linux, it'd complain about parameters not being suitable (or words to that effect), with no combination of resolutions being acceptable. If I tried to get X to run, it'd complain that the mouse was busy, amongst other nonsensical error messages (this seemed to be some malfunction of how they deal with plug and play, and my ISA LAN NIC).
There's something wrong with their distribution discs, though I'm not sure exactly what (disc three, or the installer). Their explanation didn't seem right, to me (that one broken file on the third disc was the problem, rather than the installation routine). Their work-around also leaves a lot to be desired; of running a minimal install, then manually installing other packages afterwards (extremely inconvenient), and their suggestion of not installing XFree86 then using GnomeRPM to install the rest (how do you run a Gnome application, when you don't have an X system running?), was verging on the ridiculous.
I borrowed a friend's Red Hat Linux 7.0 disk, and that installed fine, but stuffed up spectacularly (die, restart, ad infinitum) while trying to run and/or configure the Xserver (XFree86). I dug out my old Red Hat Linux 6.0, which installed without any problems, and ran fine, too; but left me with almost no hardware support.
After much hair pulling, and looking for useful information on the net, I opted for more traditional means, and bought a nice 3 inch thick Linux manual, which included a couple of discs: Red Hat Linux 7.1 (which had the same problems as 7.0) and Caldera OpenLinux eDesktop 2.4 (which crashes during the install, when it can't find my graphics card). However, the book was quite useful, providing me with some information that I needed, such as running things in text mode and, more importantly, how to edit files there (which is damn hard to work out using on-line manuals on the computer that you're trying to work on, especially in a text mode), and I'm sure it's going to be useful for a long time to come. Along with that came a bit more information, gleaned (probably) from the internet (though I can't remember now), about problems caused by Plug and Play ISA cards (I had one in there, as well as the on-board LAN NIC, because I have a 75 ohm coaxial LAN between three or four PCs).
Once I ripped the ISA LAN card out (a SureCom unit, based on the RealTek RTL8019AS chip), things were starting to work (I can only guess that in the absence of a Plug and Play system, to configure it, it was falling onto using the same IRQ or I/O address as the mouse port). I could get X to run properly. I had sound. I could use my USB scanner (a Mustek 1200CU with an Olympic brand badge on it). I had a network between two PCs (I haven't worked out how I'm going to include the third PC, seeing as UTP LAN cabling can only go directly between two devices; whether I get a hub, or opt for some other solution, like presetting the ISA LAN NIC IRQ and I/O addresses, though how I pick ones that won't conflict with the other Plug and Play set ones, I don't know).
Though the LAN seems to stall (using the motherboard NIC) every now and then, for some indecipherable reason. I can't use the Vision USB webcam (even though it's listed as compatible hardware), as it doesn't provide the video format wanted (and the error message doesn't mean much to me, at this point). And it doesn't seem to find the on-board modem (again, listed as compatible hardware), nor did I have much luck with an external one (which did work with Red Hat Linux 7.3 on a different PC); the PC would drop DTR after the handshake completed, and repeatedly try to reconnect, without letting you abort (installing other bits seemed to have got that working, though not through any coherent sequence of events).
That was on the M571LMR motherboard computer. I did do a brief test run on the ABIT BE6 motherboard computer, and managed to get it to dial my ISP quite fine; it ran moderately quickly (not as fast as Windows 98SE's GUI, though), but didn't want a bar of my PnP ISA Intel LAN NIC. I didn't test any of the other hardware, except the PnP ISA sound card (which made sounds fine, at least).
Installing packages is still hard. When a package depended on others, in some cases I wasn't told which other packages were needed, just which files, without any clue where they might be; nor were the extra packages automatically picked to be installed along with what I wanted (that was using GnomeRPM, by the way; try clicking on an RPM in the Nautilus file manager, and it doesn't give you any “install” or “open with the RPM manager program” type of option). At least Midnight Commander had right-click options to install, check, or otherwise operate on, RPMs. In several cases, I couldn't install things at all, because it was impossible to determine where to get the other files those packages said that they needed.
Well, there's no avoiding this issue. Just about all hardware is designed with Windows in mind, very few devices have Linux drivers deliberately developed by the manufacturer; they tend to come from users who've designed their own. Of course, this means that they're not always as functional (as the person developing the driver has to guess at how the manufacturer has designed their product, and may only support features that they want to use, themselves), and that drivers for such things take some time before they're developed, often long after the device has ceased to be a current model. Trying to find drivers for some new hardware, isn't going to be very successful; and trying to buy old (but supported) hardware, isn't easy, either.
A common problem is people needing Windows for something (e.g. it's the only thing that supports their digital camera), so they opt for a dual-boot system. I tried that, but it's a major pain having to reboot for some tasks, and there's problems with sharing data between the two systems. I've made my evil PC Chips computer a dedicated Linux box, and kept Windows on the ABIT motherboard computer. That's the easiest solution, for me; and my ABIT board seems to have problems with large UDMA hard drives and Linux, anyway.
If there's one thing that any form of Unix is famous for, it's being stable and reliable. In the short time that I've been using and abusing Linux, I've not managed to crash the operating system (once I managed to get it running). I have had a few applications die on me, but that's a different thing: You can always avoid bad applications, but you're stuck with your operating system. And in the case of the application dying, only it died; other things that I was working on, carried on regardless.
One of the big problems with Windows is the way that they've integrated their unstable Explorer GUI system into the operating system. If you crash Explorer (and that's easy enough to do, just by looking at a listing of a large directory, or viewing some web pages), you can bring the whole system crashing down. Windows crashing has been a major pain for me, where one thing crashing has interrupted everything else that I was doing, sometimes losing things which I couldn't really recover. That, and having to reboot after making even minor changes to a configuration option.
Though, one thing puzzles me: In the Linux world, odd number releases are the not-so-stable versions, and even number versions are considered stable. So, a publication promoting Linux as your operating system, picking the newest, and not-so-reliable, version (7.3), instead of the latest reliable version (7.2), doesn't seem such a wise move to me. Publishing a broken version of it, doesn't help, either (though I don't know if the problem is in their particular publication, or Red Hat Linux 7.3, itself).
I've also tried Red Hat on a very unreliable Hewlett Packard Pavilion PC which came with Windows 98 Second Edition pre-installed. The computer actually became reliable, and usable, as a computer. Though Linux is more of a burden on its graphics capability than Windows was (not that it was great with Windows).
Later on, I have managed to crash Linux a few times, but only a few times, and without that stuffing up the system. Windows crashes a lot, all the time, and frequently stuffs up things on the drive. I am testing Linux on the most dire hardware, so I can't really say where the fault lies, but I'd lay odds on this crap hardware. The integrated sound seems to be the worst offender: on this computer, crashes are most likely (no matter what operating system is being used) at the moment the system attempts to make a sound. Unfortunately, the on-board sound cannot be disabled, despite there being jumpers, and BIOS settings, for that.
I find that the GUI has become too CPU intensive, particularly the new Nautilus file manager (which is seriously limited, being little more than a “file browser”, very slow, and tarted up too much), the old Red Hat 6.0's Midnight Commander was more practical. I re-installed the old Midnight Commander from Linux 7.1 onto 7.3, and although it worked, managed to muck up the desktop (I don't know if it damaged anything else). It seems that they've gone the Windows way, and integrated what should have been a stand-alone file manager, into the GUI.
I find GUIs more convenient than CLIs. Especially when it comes to things like configuring software, where you can see what your options are, and mutually exclusive options can cancel each other out, right in front of your eyes, rather than cause software to foul up with two conflicting settings being configured. And for when you need to flick between doing more than one task, they tend to be more convenient about it than CLIs; the same for having two different tasks displayed side by side. I don't like GUIs that are slow, cumbersome, over complex, and just prettying things up for the sake of it; they're a waste of effort and processing power.
The concept of having a networked computer without a registered domain name seems to be foreign to Linux. It's a pain to run your own networks when it wants to do DNS look-ups on your LAN machine names (there are ways around it, but if your machines are also connected to the internet, it becomes messy). Likewise, for running your own servers, they want a registered name, even though that mightn't be applicable. You can't just arbitrarily invent domain names, so it's a problem area (quite why no-one ever came up with a local domain name that could be used on a LAN, I don't know; we only have localhost, which can only be used for a machine looking at itself; if we had something like anything-you-like.lan, only usable on a LAN, then life would be so much easier).
Yes, I know there are some names set aside for non-internet use (“example.com.”, “example.net.”, “localhost.”, & “test.”), but they also have problems with LAN use (e.g. you might want to see the real internet results for the “example” ones, trying to use sub-domains with localhost isn't always successful, and local mail servers may reject “test” domains).
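The usual work-around is to put your LAN names straight into the hosts file, which the resolver consults before it tries any DNS server (assuming the typical “hosts: files dns” order in /etc/nsswitch.conf). A minimal sketch; the machine names and addresses here are made-up examples, and on a real system the lines would be appended to /etc/hosts, whereas this demonstration uses a temporary file so nothing is actually changed:

```shell
# Map made-up LAN names to private addresses, hosts-file style.
# On a real system these lines would go in /etc/hosts; a temporary
# file is used here so nothing on the running system is touched.
hostsfile=$(mktemp)
cat >> "$hostsfile" <<'EOF'
192.168.1.1   boxone.lan   boxone
192.168.1.2   boxtwo.lan   boxtwo
EOF
# Count the invented ".lan" entries, just to show they're in place.
grep -c '\.lan' "$hostsfile"   # prints 2
rm "$hostsfile"
```

With real entries in /etc/hosts, look-ups for those names get answered locally and never leave the LAN.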
Also, Linux doesn't seem too good at supporting the notion of dynamic IPs (getting a different IP, each time you connect to the internet); or more to the point, running servers, or forms of internet connection sharing, on a Linux system using dynamic IPs can be a problem.
They also don't seem to understand the idea of having more than one ISP, and having your connection settings change some of the major parameters, depending on which ISP you're connecting to (such as the ISP's DNS and proxy server addresses).
There are at least two very different ways of dealing with printers on Linux, so there's one problem straight up, although CUPS seems to be taking over. Many printers are only designed to work with Windows, and that can be a difficult problem to overcome. Sharing a printer on a mixed operating-system network can be a problem, but I've found that newer versions of Linux find a shared printer using CUPS all by themselves (I didn't have to configure anything on the terminals).
This is a bug-bear on many operating systems, with many of them being unable to determine that media has been removed or swapped, and stupidly assuming that what was previously there still is. Linux has its own problem with dealing with the simplest of removable media, CD-ROMs, making it damn hard to swap discs. Seeing as they're read-only, and the contents cannot possibly be disrupted by removing the disc from the drive at any particular moment, there's little need for the operating system to keep such a vice-like grip on them, and refuse to let you swap them unless you unmount the media first.
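For what it's worth, the disc-swapping ritual boils down to the following; the mount point and device name are the usual Red Hat defaults of the time, and may well differ on other systems:

```shell
# Typical CD-ROM swap on Linux (run as root; /mnt/cdrom and /dev/cdrom
# are the common Red Hat defaults, and may differ elsewhere).
mount /mnt/cdrom        # mount the disc (device/options come from /etc/fstab)
ls /mnt/cdrom           # browse its contents
umount /mnt/cdrom       # note the odd spelling: "umount", not "unmount"
eject /dev/cdrom        # only after unmounting will the tray open
```

Until that umount is done, the drive's eject button typically does nothing, which is exactly the vice-like grip complained about above.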
For other writable media, it's sensible to be more strict about when you can remove it, though it'd be better if the operating system would lock a device until all writes had been done, then allow it to be removed without any special requirement on the user (e.g. floppies and Zip disks).
Unix (on which Linux is based) uses case-sensitive file systems. That means that files named “EXAMPLE” and “example” are two different things. It also has case-sensitive command parameters (e.g. the “ls” command behaves differently when you type “-l” or “-L” after it, to customise the output).
This can be a nuisance for a variety of reasons, a common one being finding or typing commands, as you must know exactly how each one is named (which is contrary to how most English-speaking people consider the naming of things to be equivalent, whether or not capital letters were used). There's a convention of always using lower case names, though like most “conventions”, it's only sometimes adhered to. What's really annoying is when capitals are applied inconsistently, such as in the file names used with the X windowing system (e.g. the “XFree86” system, “XF86Config” file, “xf86config” program, “xf86cfg” program, “XF86Setup” program, and so on; notice the inconsistency of using capital letters).
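The file-name side of it is easy enough to demonstrate; this little sketch works in a throw-away temporary directory, so it touches nothing else:

```shell
# Case-sensitive filenames: "EXAMPLE" and "example" coexist as two
# separate files in the same directory.
dir=$(mktemp -d)
touch "$dir/EXAMPLE" "$dir/example"
ls "$dir" | wc -l     # prints 2: they really are distinct files
rm -r "$dir"
```

On Windows's FAT file systems, by contrast, the second touch would simply refer to the same file as the first.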
There's a habit of using extremely abbreviated names (such as “ls” to “list” files), which aren't all that obvious, and other abbreviations which don't make sense, such as “umount” to “unmount” something (would it have been that hard to have named the command properly?).
Filenames don't rely on suffixes to distinguish what they're for (like Windows uses .exe to indicate that a file is executable). While that has some advantages (other non-Linux systems also let you name things arbitrarily), it does make it harder to spot what something is at a glance (whether it's a command, data, or a configuration file). Though, there's nothing stopping you naming files, in that manner; conversely, there's nothing insisting that a program does that for you, giving them an appropriate filename suffix (i.e. leaving it up to the user, to make a suitable choice).
A little while after I began trialling Red Hat Linux 7.3, version 8.0 came out. Once I'd got fairly comfortable with 7.3, I decided to upgrade. A few things have improved, like being able to set the display properties in a less cumbersome manner, dial-up networking working properly (with 7.3, it wouldn't connect until after I'd messed with bits of earlier versions of Linux), and right-click menus being more useful in Nautilus (now there's a way to install RPMs, and there are more “open-with” options). But some things have degraded, like the screen rendering being a more burdensome task on this modestly powered PC.
Shortly after that, Red Hat 9 emerged. I've just begun to play with it, and I'm not too familiar with it, yet. It's disconcerting that such a major jump in version levels has come so rapidly, and twice. And annoying, as they bring about changes that can break certain things. It's also annoying that they've jumped from one unstable system, to another, without giving it time to mature and be properly fixed up.
Then, shortly after that, Red Hat Linux got dropped for the Fedora Core project, with an even shorter life span (2 or 3 releases a year). Each release seems to be concentrated on experimenting like mad with different things, rather than putting an effort into making a stable and consistent system. Red Hat's copping a bit of stick for turning away from the free versions, to concentrate on making money from Linux; it's not the sort of thing that's appreciated, considering its origins.
I've had a play with some other versions of Linux, with varying degrees of success, or dislike for them. I've tried several versions of Mandrake on several different computers, and I couldn't get it to install on any of them (it'd never complete the installation process, if it could even start it). I've played with a SuSE Live edition (one that runs directly from a CD-ROM, without being installed to a hard drive), and Knoppix (another one, Debian-based, that runs from a CD-ROM).
Running from a CD-ROM is slow and tedious, but it allows you to try a Linux without changing anything on an already running system, as well as letting you set up a machine that's virtually impervious to being stuffed up by malicious activities or accidents. The slowness was more than a little off-putting, as well as their systems being sufficiently different from what I'd got used to (Red Hat) that I had to fiddle around more than I'd like to, to try things out.
I'm happier with the Linux system than Windows. It doesn't operate as some sort of secret; your machine is yours to control, and you're not owned by Microsoft. But I find that many of the applications leave a lot to be desired. GUI-based software is a relatively recent thing in *ix-land, and it's far from polished: it's slow, still primitive, and quite inconsistent across the different GUI systems available on it, as well as across the different applications on the one system. It's not without reason that many *ix users will state that the command line is better than a GUI, but that's often because they're using a badly designed GUI application; an application isn't bad just because it uses a GUI. In a lot of cases, messing with the CLI is extremely tedious; I've spent ages typing in command after command, where I could have much more easily clicked on a few things and dragged them around.