
Thread: Why one OS VS another isn't over played here

  1. #21
    Senior Member gore's Avatar
    Join Date
    Oct 2002
    Location
    Michigan
    Posts
    7,177
    You can spell-check it; I can copy it over and edit it for you if needed.

  2. #22
    Senior Member nihil's Avatar
    Join Date
    Jul 2003
    Location
    United Kingdom: Bridlington
    Posts
    17,188
    Hmmm,

    I think that RoadClosed has a good point with the "backwards compatibility" angle?

    I have software that defragments and compacts the Windows registry hives. It doesn't need to do that with Windows 2000 (NT5) or XP, because they have been enhanced to handle this.

    It certainly was an issue with 9x/ME

    MS seem to have dumped this "backwards compatibility" icon when they went to XP? I expect them to improve rapidly, as it is a fairly easy and obvious area, and the old hardware won't really support the new operating systems.

    My suggestion is that MS were trying to corner the OS market in the commercial sector (the only people who could really afford computers back then) so this "backwards compatibility" became some sort of sacred cow?

    Just a thought

  3. #23
    Senior Member RoadClosed's Avatar
    Join Date
    Jun 2003
    Posts
    3,834
    became some sort of sacred cow?
    It was a sacred cow, and perhaps the reason Linux has succeeded. Anyone who was around for the transition from FAT16 to FAT32 KNOWS the issues. And that was backward compatible. From the very first kernel, the Linux FS was superior. But did it EVER gain a major performance advantage outside extreme applications? I would say no.

    But now, as hard drives get bigger and bigger, the advantage becomes apparent, and yes, MS knows this.
    West of House
    You are standing in an open field west of a white house, with a boarded front door.
    There is a small mailbox here.

  4. #24
    Senior Member gore's Avatar
    Join Date
    Oct 2002
    Location
    Michigan
    Posts
    7,177
    LinuxFS? Errrr, MinixFS?

  5. #25
    Senior Member
    Join Date
    Apr 2004
    Posts
    228

    Re: Why one OS VS another isn't over played here

    Originally posted here by gore
    It seems when I decide I want to make a post or get a decent discussion going about OSs, it takes maybe one day before someone says "This has been talked to death and it's old" or I get called a Microsoft basher.

    I don't think this has been done to death and I sure as hell don't think we should stop talking about it.

    Why?

    Simple: this is the OS forum; it was made to talk about OSs. That has yet to happen very often, though. The only thing we see here is questions.

    So, in the near future I'll be starting another thread about Windows and other OSs. However, this is a pre-warning:

    Don't think for one minute someone is going to ruin it with "my OS is better than yours" bull.... Not happening.

    Maybe people would have realised this had there been more discussions on OSs here, but because people seem to think that once a thread about Unix and Windows was done it didn't need to be done again, I have to explain it:

    MS-DOS is different from.... Well Windows.

    Windows 3.1 and Windows 95 are not the same thing.

    Windows NT was a complete re-design from Windows 95. And at that time Linux was very different than it is now.

    Windows 2000 came out and so did another version of the Linux Kernel. Why did we not talk about it?

    Windows XP came out as did the new Linux Kernel.... Again, another chance for discussion on new changes and features in both OSs.


    Well, Windows Vista is coming. The 2.6 kernel for Linux isn't the same thing it was, as Linus wants to slow down on adding features and start fixing up issues. FreeBSD also has a new release.

    We could have a GREAT discussion on the differences from past releases... And we could talk about who is using them... How do they like it?

    And we are. For the closet zealots (Windows users who pretend they only want fairness when really they are Windows zealots, in the same way we have Linux zealots who ruin every discussion I try to get going here): if you think for one minute of ruining this new discussion coming up, here is a warning: I WILL delete your posts. I WILL re-open the thread if need be. We CAN talk about things without "my OS is better", and if you don't like it, select the option that says front page display and un-tick OSs.

    Windows Vista

    FreeBSD 6

    SUSE Enterprise 10

    Slackware 11

    Lots of topics.
    Gore, darling. I'm starting in tech support at Sun this July and will have more time to dig into different OSs while getting paid for it, so I'll definitely be up for discussing their pros and cons.
    Don't post if you've got nothing constructive to say. Flooding is annoying

  6. #26
    Senior Member gore's Avatar
    Join Date
    Oct 2002
    Location
    Michigan
    Posts
    7,177
    Cool. Darling?? =o

  7. #27
    Senior Member RoadClosed's Avatar
    Join Date
    Jun 2003
    Posts
    3,834
    I was off working on a phone system cutover and was thinking about Linux and MS and this defrag debate. I had to smile, because there is one more thing...

    Just because a defrag tool does not exist for a file system doesn't necessarily mean it doesn't need defraggin'ation. It just means no one has written one, either because it is difficult or impossible given the implementation of the file system, or because no one thinks it is necessary. Having said that: ReiserFS 4 has what in development? A defragger.

    They "used" to say Linux didn't need anti-virus too. Tail packing is a tool in Reiser the eliminates excessive fragmentation (note I didn't say stop it). And it's only possible (to my knowledge) using meta data to store file tags. But even the Reiser folks recommend disabling it in resource critical applications. The performance hit on tracking file location isn't worth the benefit of decreased fragmentation. So it makes sense in some applications to reserve defragging to offline functions. Such as maintenance windows and patch time. I can't remember where I read about ReiserFS v4 and their defrag. Stay tuned. Linking follows.

    //EDIT Instead of the article, I went right for the source: the Reiser dudes.

    Repacker
    Another way of escaping from the balancing time vs. space efficiency tradeoff is to use a repacker. 80% of files on the disk remain unchanged for long periods of time. It is efficient to pack them perfectly, by using a repacker that runs much less often than every write to disk. This repacker goes through the entire tree ordering, from left to right and then from right to left, alternating each time it runs. When it goes from left to right in the tree ordering, it shoves everything as far to the left as it will go, and when it goes from right to left it shoves everything as far to the right as it will go. (Left means small in key or in block number:-) ). In the absence of FS activity the effect of this over time is to sort by tree order (defragment), and to pack with perfect efficiency.

    Reiser4.1 will modify the repacker to insert controlled "air holes", as it is well known that insertion efficiency is harmed by overly tight packing.

    I hypothesize that it is more efficient to periodically run a repacker that systematically repacks using large IOs than to perform lots of 1 block reads of neighboring nodes of the modification points so as to preserve a balancing invariant in the face of poorly localized modifications to the tree.
    Hmmm, seems the move is on from on-the-fly defragging to periodic defragging during system inactivity. Could that be the reason Windows has stayed its course with scheduled defragging? I only said that to jab some Linux freakzoids. In fun. But as you see, that is the methodology Reiser is going to, because it leaves more resources to be used in real time, possibly increasing efficiency quite a bit. Gore, could you imagine running Doom 3 at full resolution with a high-overhead FS taking the slow way to write data, journaling it and defragging it on the fly? Then comparing disk performance with FAT32.
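
    To picture what one repacker pass does, here is a toy Python sketch (purely illustrative, nothing from the namesys code): treat the tree ordering as sorted keys and lay the extents out contiguously at one end of the device, alternating ends on each run:

    # Toy model of a single repacker pass, not namesys code.
    def repack(layout, capacity, direction="L"):
        # Walk the tree ordering (sorted keys) and pack every extent
        # contiguously at the left or the right end of the device.
        ordered = sorted(layout)
        if direction == "L":
            return {key: slot for slot, key in enumerate(ordered)}
        return {key: capacity - len(ordered) + slot
                for slot, key in enumerate(ordered)}

    # Scattered, fragmented layout: key -> block number (made up).
    layout = {"k1": 40, "k2": 7, "k3": 93, "k4": 12}
    print(repack(layout, capacity=100, direction="L"))
    # {'k1': 0, 'k2': 1, 'k3': 2, 'k4': 3}  <- contiguous, in tree order

    The real thing also has to move the data and, per the 4.1 note above, leave some "air holes" for future inserts, but the alternating left/right sweep is the core idea.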

    But there is no question that Linux-based file systems are more advanced. And Windows, as a usable OS, is more advanced than Linux.

    http://www.namesys.com/v4/v4.html#repacker
    West of House
    You are standing in an open field west of a white house, with a boarded front door.
    There is a small mailbox here.

  8. #28
    Senior Member gore's Avatar
    Join Date
    Oct 2002
    Location
    Michigan
    Posts
    7,177
    The way I've seen defragging for Linux and UNIX explained is that if you don't have more than 90% of the HD used up, you won't see more than a percent or so of fragmentation.

    A lot of the Linux and BSD books I have say this.
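
    If you want to see where a box stands against that rule of thumb, here is a quick Python sketch (just an illustration; the 90% figure is the books', and the mount point is whatever you care about):

    # Report how full a filesystem is, against the 90% rule of thumb.
    import os

    def usage_percent(path="/"):
        st = os.statvfs(path)
        used = st.f_blocks - st.f_bfree
        return 100.0 * used / st.f_blocks

    pct = usage_percent("/")
    print("/ is %.1f%% full" % pct)
    if pct > 90:
        print("past the 90% mark, so fragmentation may start to show")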

  9. #29
    Senior Member gore's Avatar
    Join Date
    Oct 2002
    Location
    Michigan
    Posts
    7,177
    Oh, by the way, no, I can't imagine playing Doom 3 on that... Well, I can; what I can't imagine is it on FAT32, because Doom 3 only runs on Windows 2000 and XP, and I don't know anyone using FAT anything for a file system. Teehee.


    I also fully support ReiserFS. I use it on everything. Except floppy disks.

  10. #30
    Senior Member gore's Avatar
    Join Date
    Oct 2002
    Location
    Michigan
    Posts
    7,177
    Something I should probably put in Cosmos. I really liked this, and I hope everyone does read the somewhat long... but very good story here:

    [ From http://www.performancecomputing.com/.../9809of1.shtml ]

    The Elements Of Style: UNIX As Literature

    If there's nothing different about UNIX people, how come
    so many were liberal-arts majors? It's the love of words
    that makes UNIX stand out.

    Thomas Scoville

    In the late 1980s, I worked in the advanced R&D arm of the Silicon
    Valley's regional telephone company. My lab was populated mostly by
    Ph.D.s and gifted hackers. It was, as you might expect, an all-UNIX
    shop.

    The manager of the group was an exception: no advanced degree, no
    technical credentials. He seemed pointedly self-conscious about it. We
    suspected he felt (wrongly, we agreed) underconfident of his education
    and intellect.

    One day, a story circulated through the group that
    confirmed our suspicions: the manager had confided he was indeed
    intimidated by the intelligence of the group, and was taking steps to
    remedy the situation.

    His prescription, though, was unanticipated: "I
    need to become more of an intellectual," he said. "I'm going to learn
    UNIX."

    Needless to say, we made more than a little fun out of this. I mean,
    come on: as if UNIX could transform him into a mastermind, like the
    supplicating scarecrow in "The Wizard of Oz." I uncharitably imagined
    a variation on the old Charles Atlas ads: "Those senior engineers will
    never kick sand in my face again."

    But part of me was sympathetic: "The boss isn't entirely wrong, is he?
    There is something different about UNIX people, isn't there?" In the
    years since, I've come to recognize what my old manager was getting
    at.

    I still think he was misguided, but in retrospect I think his
    belief was more accurate than I recognized at the time.

    To be sure, the UNIX community has its own measure of technical
    parochialism and nerdy tunnel vision, but in my experience there
    seemed to be a suspicious overrepresentation of polyglots and
    liberal-arts folks in UNIX shops.

    I'll admit my evidence is sketchy and anecdotal. For instance, while banging out a line of shell, with
    a fellow engineer peering over my shoulder, I might make an
    intentionally obscure literary reference:

    if test -z `ps -fe | grep whom`
    then
        echo ^G
    fi
    # Let's see for whom the bell tolls.

    UNIX colleagues were much more likely to recognize and play in a way
    I'd never expect in the VMS shops, IBM's big-iron data centers, or DOS
    ghettos on my consulting beat.

    Being a liberal-arts type myself (though I cleverly concealed this in
    my resume), I wondered why this should be true.

    My original
    explanation--UNIX's historical association with university computing
    environments, like UC Berkeley's--didn't hold up over the years; many
    of the UNIX-philiacs I met came from schools with small or absent
    computer science departments.

    There had to be a connection, but I had no plausible hypothesis.

    It wasn't until I started regularly asking UNIX refuseniks what they
    didn't like about UNIX that better explanations emerged.

    Some of the prevailing dislike had a distinctly populist
    flavor--people caught a whiff of snobbery about UNIX and regarded it
    with the same proletarian resentment usually reserved for highbrow
    institutions like opera or ballet.

    They had a point: until recently, UNIX was the lingua franca of computing's upper crust. The more
    harried, practical, and underprivileged of the computing world seemed
    to object to this aura of privilege.

    UNIX adepts historically have been a coddled bunch, and tend to be proud of their hard-won
    knowledge. But these class differences are fading fast in modern
    computing environments.

    Now UNIX engineers are more common, and low-
    or no-cost UNIX variations run on inexpensive hardware. Certainly UNIX
    folks aren't as coddled in the age of NT.

    There was a standard litany of more specific criticisms: UNIX is
    difficult and time-consuming to learn. There are too many things to
    remember. It's arcane and needlessly complex.

    But the most recurrent complaint was that it was too
    text-oriented. People really hated the command line, with all the
    utilities, obscure flags, and arguments they had to memorize. They
    hated all the typing.

    One mislaid character and you had to start
    over. Interestingly, this complaint came most often from users of the
    GUI-laden Macintosh or Windows platforms.
    People who had slaved away
    on DOS batch scripts or spent their days on character-based terminals
    of multiuser non-UNIX machines were less likely to express the same
    grievance.

    Though I understood how people might be put off by having to remember
    such willfully obscure utility names like cat and grep, I continued to
    be puzzled at why they resented typing.

    Then I realized I could connect the complaint with the scores of "intellectual elite" (as my
    manager described them) in UNIX shops. The common thread was
    wordsmithing; a suspiciously high proportion of my UNIX colleagues had
    already developed, in some prior career, a comfort and fluency with
    text and printed words.

    They were adept readers and writers, and UNIX played handily to those strengths. UNIX was, in some sense,
    literature to them. Suddenly the overrepresentation of polyglots,
    liberal-arts types, and voracious readers in the UNIX community didn't
    seem so mysterious, and pointed the way to a deeper issue: in a world
    increasingly dominated by image culture (TV, movies, .jpg files), UNIX
    remains rooted in the culture of the word.

    UNIX programmers express themselves in a rich vocabulary of system
    utilities and command-line arguments, along with a flexible, varied
    grammar and syntax.

    For UNIX enthusiasts, the language becomes second
    nature.

    Once, I overheard a conversation in a Palo Alto restaurant:

    "there used to be a shrimp-and-pasta plate here under ten bucks. Let
    me see...cat menu | grep shrimp | test -lt $10..." though not
    syntactically correct (and less-than-scintillating conversation), a
    diner from an NT shop probably couldn't have expressed himself as
    casually.

    With UNIX, text--on the command line, STDIN, STDOUT, STDERR--is the
    primary interface mechanism: UNIX system utilities are a sort of Lego
    construction set for word-smiths.

    Pipes and filters connect one utility to the next, text flows invisibly between. Working with a
    shell, awk/lex derivatives, or the utility set is literally a word
    dance.

    Working on the command line, hands poised over the keys uninterrupted
    by frequent reaches for the mouse, is a posture familiar to wordsmiths
    (especially the really old guys who once worked on teletypes or
    electric typewriters).

    It makes some of the same demands as writing an essay. Both require composition skills. Both demand a thorough knowledge of grammar and syntax. Both reward mastery with powerful,
    compact expression.

    At the risk of alienating both techies and writers alike, I also
    suggest that UNIX offers something else prized in literature: a
    coherence, a consistent style, something writers call a voice.

    It doesn't take much exposure to UNIX before you realize that the UNIX
    core was the creation of a very few well-synchronized minds.

    I've never met Dennis Ritchie, Brian Kernighan, or Ken Thompson, but after
    a decade and a half on UNIX I imagine I might greet them as friends,
    knowing something of the shape of their thoughts.

    You might argue that UNIX is as visually oriented as other OSs. Modern
    UNIX offerings certainly have their fair share of GUI-based OS
    interfaces.

    In practice though, the UNIX core subverts them; they end
    up serving UNIX's tradition of word culture, not replacing it.

    Take a look at the console of most UNIX workstations: half the windows you
    see are terminal emulators with command-line prompts or vi jobs
    running within.

    Nowhere is this word/image culture tension better represented than in
    the contrast between UNIX and NT. When the much-vaunted UNIX-killer
    arrived a few years ago, backed by the full faith and credit of the
    Redmond juggernaut, I approached it with an open mind.

    But NT left me cold. There was something deeply unsatisfying about it. I had that
    ineffable feeling (apologies to Gertrude Stein) there was no there
    there.

    Granted, I already knew the major themes of system and network
    administration from my UNIX days, and I will admit that registry
    hacking did vex me for a few days, but after my short scramble up the
    learning curve I looked back at UNIX with the feeling I'd been demoted
    from a backhoe to a leaf-blower.

    NT just didn't offer room to move. The one-size-fits-all, point-and-click,
    we've-already-anticipated-all-your-needs world of NT had me yearning
    for those obscure command-line flags and man -k.

    I wanted to craft my own solutions from my own toolbox, not have my ideas slammed into the
    visually homogenous, prepackaged, Soviet world of Microsoft Foundation
    Classes.

    NT was definitely much too close to image culture for my comfort:
    endless point-and-click graphical dialog boxes, hunting around the
    screen with the mouse, pop-up after pop-up demanding my attention.

    The experience was almost exclusively reactive. Every task demanded a
    GUI-based utility front-end loaded with insidious assumptions about
    how to visualize (and thus conceptualize) the operation.

    I couldn't think "outside the box" because everything literally was a box. There
    was no opportunity for ad hoc consideration of how a task might
    alternately be performed.

    I will admit NT made my life easier in some respects. I found myself
    doing less remembering (names of utilities, command arguments, syntax)
    and more recognizing (solution components associated with check boxes,
    radio buttons, and pull-downs).

    I spent much less time typing. Certainly my right hand spent much more time herding the mouse
    around the desktop.

    But after a few months I started to get a tired, desolate feeling, akin to the fatigue I feel after too much channel
    surfing or videogaming: too much time spent reacting, not enough spent
    in active analysis and expression. In short, image-culture burnout.

    The one ray of light that illuminated my tenure in NT environments was
    the burgeoning popularity of Perl. Perl seemed to find its way into NT
    shops as a CGI solution for Web development, but people quickly
    recognized its power and adopted it for uses far outside the scope of
    Web development: system administration, revision control, remote file
    distribution, network administration.

    The irony is that Perl itself is a subset of UNIX features condensed into a quick-and-dirty scripting
    language. In a literary light, if UNIX is the Great Novel, Perl is the
    Cliffs Notes.

    Mastery of UNIX, like mastery of language, offers real freedom. The
    price of freedom is always dear, but there's no
    substitute.

    Personally, I'd rather pay for my freedom than live in a
    bitmapped, pop-up-happy dungeon like NT. I'm hoping that as IT folks
    become more seasoned and less impressed by superficial convenience at
    the expense of real freedom, they will yearn for the kind of freedom
    and responsibility UNIX allows. When they do, UNIX will be there to
    fill the need.

    Thomas Scoville has been wrestling with UNIX since 1983. He currently
    works at Expert Support Inc. in Mountain View, CA.
