I broke down and ranted. This has been on my mind for a long time now, and I'm going to officially set my philosophy: I'm no longer going to go the techy route. I don't care anymore. I'm not going to waste my life dinking with crap. I'd rather waste it writing about not wasting my life dinking with crap. So there.
Most people would say that it's a good thing to be able to look "under the hood" to troubleshoot. I've had a lifetime of doing that, to the point of being able to intuit issues. This has led to phenomenal software testing skills. Intuition means that I can do exploratory testing of such calibre as to boggle the mind. Yes, I can "stumble" into bugs that I can reproduce. Fear and wonder, baby.
But instead of going into sysadminning and software testing, I'm far more interested in design and usability.
I've spent a long, long time figuring things out. I'm one of those thinking types, except I have one big disadvantage: I don't remember stuff. This isn't to say that I wake up in the morning and read the tattoos on my chest to learn my name (see Memento, the 2000 movie), but it is to say that I've been so endlessly curious that I've started to notice that the stuff I've been curious about previously -- as important and interesting as it is -- will slowly fade from my mind and become less useful. Paying attention to this, I've become able to see my memory creep away on me.
So this means that I can learn a skill, walk away from it, and it's gone -- not summonable in any functional way. My one saving grace is that everything becomes vaguely familiar when I've tinkered in a field enough, and so I can re-learn things with fair competence. What's even better is that with excellent usability (there's that word again) or notes (that's this place) I can pick things up in no time.
So with all of that in mind, it should be easy to see why tinkering under the hood can be a nightmarish ordeal. I shouldn't have to do it in the first place (design), and if I do, it should be a walk in the park because everything should be consistent and familiar (usability).
Life started off with my being curious about computers in general. I guess I could have had just about anything deposited in front of me and I would have perked up; I was like that back then. However, things basically started off in a computer lab in high school. Previous exposure doesn't really count. The early 386 technology was fascinating to me, and I shoulder-surfed to figure stuff out. DOS became intuitive for me, as I was able to poke commands in and figure things out. I was able to figure things out because even my mistakes gave me clues.
When using software, I started wondering about how things were put together. I had a lot of exposure to Microsoft's software back then, and I really fell in love with Microsoft Works 3.0 for DOS (well, I used and liked 2.0 at first). I got pretty good at typing and eventually poked around the local BBSing scene on my 300 baud acoustic modem. I was still on it when the 14.4 came out.. heh.
But that's the foundation.. and although I'm an independently-thinking guy it's likely that my foundation is what has built towards my preferences today.
Roundabout those times I started taking notes on the software I used. I don't really know why, although I had in mind some vague notion of changing the software. I knew that software was closed and was compiled, but I saw what could be done to software that was bottled up and thought I could learn enough machine language / assembly language to just tear something open and rewrite things as I saw fit, without needing the original source. Easier said than done, of course. Those notes that I took sat around, and some of them have been collected into /tag/reviews+software (particularly the notes in /tag/dos+software). As of this writing there is more to be put there, if I get around to finding/porting that work.
So even back in those days I started paying attention to software layout and intent. Microsoft was damned good at making software back then, because everything I was exposed to from the DOS side of things was exceptionally easy to pick up. Sure I might have been a bright kid, but I'd argue against that. This software usability experience is what I look back to when I use software today, and I'm absolutely stunned at how much more growth is possible in the tools I use.
I did eventually learn of Microsoft's various evils, even before OS/2 became a contender. But it was the documentation (and ranting) detailing the technological strengths and weaknesses of OS/2 and Windows 3.1/Windows 95 that helped me understand things in greater depth.
In those days I had taken my spare time and come to understand enough about DOS that when I wanted to experiment with multitasking (task-swapping) and turned to Windows 3.1, I really felt Windows' shortcomings. Like many others, I felt abandoned when DOS stuff didn't work nearly the same in a shell. But the pull of being able to switch from application to application was quite great, so I did really fiddle around a lot (sigh, see Windows "3.12").
As an aside, I would later fiddle with some DOS-native applications for task swapping/multitasking, but that experience was poor.
At any rate, stuck in microsoftland and hating every moment of it, I did eventually learn about Linux. It should go without saying that my experiences were horrific. I documented some of my experiences throughout a timeline at one point, though those notes are strewn through this content management system now. Basic things like being able to figure out what an application does were completely absent. I'm still looking for a solution to be able to write notes on the filesystem level to describe my files, like 4DOS' descript.ion files. I think I'll have to implement it myself. Man pages are still an awful experience and still don't do silly little things like explain what the software is for or show examples of common use. Crazy, I know.
So my love affair would bounce back and forth because Linux just couldn't cut it. It was when I really sat down and worked with Slackware that things started coming together. Slack just worked. Even when I bounced to another Linux distribution, it just worked. I was amazed. I still couldn't figure everything out, because there are walls behind walls behind walls in anything unixlike; everything in its past is filled with the works of idiot designers and developers who only build for themselves and not the real world or real people.
However, even when Slackware "just worked" (there's a pun in there, I just know it) it didn't provide good software which was pre-configured.. that concept didn't really exist back then, and most software for it was too young to have evolved much yet.
It's around here that the actual story begins.
Having been a Linux user for a fairly long time now (since before the great Slackware version jump) I must say that I'm still completely baffled. Firstly, the underlying mechanisms have indeed evolved, but a significant amount of stuff "under the hood" still presents giant walls of stone to regular use. While distributions have improved, it has taken the efforts of complete distributions with great package management to make the experience better (think LiveCD and GUI-oriented distributions). Configuration tools have improved greatly, although I note that XF86Config (or X.org's tool?) still doesn't have a per-section editing concept. I guess everyone must love going through the entire configuration over and over and over to try a different video setting when hand-editing the file isn't possible.
Speaking of editing, Linux text editors are almost usable. Or maybe I've just gotten used to the crap.
So nowadays, I've been through hell on earth and have forgotten more than I care to mention, only to now be given a distribution that can install in half an hour, automatically detect my hardware, and just bloody well go. So Linux is now at the Windows 95 stage. All my knowledge can now be safely thrown away, being almost entirely obsolete. The intention for a good distribution and for most tools should be to obsolete that knowledge with autoconfiguration, good design (i.e. actually work out of the box), good usability (features that are expected, etc.) and good documentation. Ok, good docs are still not everywhere, which is why I still take notes.. but most good applications have a wiki -- the wrong kind, but at least a wiki -- which means that docs can be worked on. Inline docs are still absent though. Go figure.
So now that I'm at the point where I can just turn things on and my computer just works (like Slackware, but now with all sorts of software too), I am faced with a strange problem. What do I do when things go wrong?
And things do go wrong. Especially for me.. things go sideways.
Now when a problem happens I can probably figure out how to fix it. Probably. The question is.. do I care anymore? Should I care anymore? The goal of a distribution and all its software should all be to stop wasting my fucking time and just work as-expected. I've spent a lot of time figuring things out. I mean a lot. I could have been out petting kittens instead of troubleshooting. Oh the regret.
No, I don't troubleshoot anymore. I'm tired of working more than I have to for a piece of software. I'm tired of trying out CVS builds of software.
I don't need to do all that crap anymore because my experience has finally improved to the point when I can step away from all of that. Now when something doesn't work, I can't lay the blame on the software or scene being young, I can smugly blame the developers for being retarded.
Now let me chew on my foot for a while. Mmmhrmgrlth. Ok, so software and the "stupid user" experience is now hopeful enough that there are standards that can be thought to be in place. This means that anything under the bar can be poo-pooed.
So now when a problem happens, I don't have to think about how to troubleshoot it. I can now think that there's no longer any fucking excuse for problems to happen or be non-obvious to solve. If I get some random problem and I don't get a usable error message that I can search for answers with, then the software is bad. If I perform an average installation under average circumstances and a major cockup occurs, then the software is bad. There is a trend here..
So when something happens, and I can't immediately solve it, I move on to better things. Like petting kittens.
I've wasted too much of my life troubleshooting, and learning skills that are at best stressful to have and at worst entirely unusable, and so I now just want to be a regular user and.. I dunno, try to figure out why I started using computers in the first place. So these days I'd rather see a problem and bloody well reboot my damned computer.. and if the problem goes away, I'll just get on with life.. like writing crap like this.
And if rebooting doesn't fix it, I'm fucking well reinstalling the distribution. I don't care anymore! =p
Less troubleshooting, more kitten-time!