Teen Programmers Unite  


Need some details about NT/XP

Posted by vikram_1982 [send private reply] at April 14, 2002, 11:54:28 AM

It is common knowledge that the Windows NT engine is much more stable than 95/98. You don't get the "blue screen of death" as often. What I want to know is: what does NT/XP have that Win 95/98 doesn't have that makes it that strong?

(By the way, I have searched Google, but I couldn't find any intelligible answers.)

Posted by RedX [send private reply] at April 14, 2002, 01:35:01 PM

Win 98 is based on Win 95, which was based on Win 3.11 on top of DOS. Win NT was written from scratch.


Posted by metamorphic [send private reply] at April 14, 2002, 03:51:26 PM

Also, Win NT has a pure 32-bit kernel, like Linux. DOS, Win 3.11, 9x and ME, however, all have 16-bit kernels (note: only the kernels; the operating system as a whole is 32-bit).

Posted by vikram_1982 [send private reply] at April 14, 2002, 09:47:37 PM

Now, now. I am a layman. Tell me: how does a 32-bit kernel help?

Posted by gian [send private reply] at April 15, 2002, 02:19:48 AM

It can utilize 32-bit hardware (as most hardware is these days) without the aid of hardware abstraction layers and jazz like that.

Posted by RedX [send private reply] at April 15, 2002, 11:46:06 AM

I think the major difference in stability between the two is that Win NT doesn't have all the work-around-the-design-mistakes code that Win 98 has inherited from everything before it.


Posted by taubz [send private reply] at April 15, 2002, 04:40:02 PM

It probably was just written "better."

One possibility: because NT is designed to work in a network environment, it's important that a problem in one process not affect other running processes. For instance, in Win2k at least, it's very strictly designed so that one process cannot interfere with another. This level of protection and separation between processes may give it one added level of stability. Though this is just a guess.

- taubz

Posted by Mycroft [send private reply] at April 15, 2002, 05:05:30 PM

One of the big things about the NT kernel is that it doesn't let software access the hardware layer without first asking the OS. This prevents programs such as games from crashing the hardware; the disadvantage is that games and other graphics programs run slower.

Posted by metamorphic [send private reply] at April 16, 2002, 07:57:44 AM

On that topic, it doesn't matter whether you are in a network environment or a desktop user at home: the computer crashing is still a disaster. A computer should never crash. If it does crash, it has failed to perform the task that it and its software were designed for. When my parents called tech support for help when we first got this PC five years ago, they just said "restart your computer," and to this day a lot of tech support people say the same. Your computer should never have to be restarted involuntarily. Unfortunately, MS doesn't share that view.

Posted by RedX [send private reply] at April 16, 2002, 01:58:28 PM

MS protects the producers of reset-buttons.

What those tech support people actually mean is that the only real working solution is to use pen and paper until the time comes for MS to hire a real programmer.


Posted by metamorphic [send private reply] at April 16, 2002, 02:32:44 PM

Sounds about right.

Posted by AngelOD [send private reply] at April 16, 2002, 04:15:37 PM

Indeed, although you have to remember that so far there is no such thing as a perfect OS.

Some of them may come closer than others, but they're still not perfect. Some may claim that Linux never crashes, or that BSD never does, but it's not true. I haven't found a system yet that doesn't crash at some point, either during normal work or during a stress test. Sometimes it crashes with normal use *after* a stress test.

Anyway, computers should be stable, but we're still quite far from achieving that goal, sadly enough.

Posted by RedX [send private reply] at April 16, 2002, 04:21:48 PM

Linux has a better chance than Windows of becoming an uncrashable OS (under normal conditions), because bugs are more likely to be detected when so many people are working on it.


Posted by AngelOD [send private reply] at April 16, 2002, 04:55:50 PM

Actually, the more people working on a project, the bigger the chance of introducing new bugs. Windows shows this too, since a huge number of programmers work on the same thing.

Linux has the advantage of being open source. This allows individuals and small groups of developers to write their modules/applications and contribute them. If the software turns out to be useful to a lot of users, it's often integrated into larger distributions.

Now, if Linux hadn't been open source but still had only the number of actual developers (people working on the core system) that it has today, it wouldn't be anywhere *near* this good.

Anyway, that's just my 2 cents. :)

Copyright TPU 2002. See the Credits and About TPU for more information.