That's what Microsoft calls it, and for some reason people actually believe that that's what they've done. It's not. In reality, rebuilding an operating system as complex as Windows "from the ground up" would take a decade, at least. What they ACTUALLY did was swap out the XP core that they were using pre-2004, and plug in the Windows 2003 core.
Yes, of course. I see that my posting could be understood this way, but I wasn't implying that. I'm not a native speaker, so I hope you can forgive the confusion my convoluted phrasing caused. I was merely trying to say that Microsoft replaced the very foundations on which Longhorn was built and started over again.
This swap was also a very trivial move from a development perspective.
Frankly, I don't know if it really was a walk in the park. If my recollection is correct, it took Microsoft at least a year to make the shift with the 5xxx builds and get back to where the latest 4xxx builds had been in terms of functionality.
Anyway, that's very off topic and you don't really have to reply to it. It just irritates me when people spread the Microsoft line about "rebuilding from the ground up" and other associated garbage.
Actually, I haven't seen a single announcement from Microsoft claiming that. So far I've only read about the replacement of the foundations (XP vs. 2003). You can blame my struggles with the language, however: yes, my statement was quite ambiguous and vague. I apologize.
I was replying to the fact that you said that OSX for Intel is pretty much complete. My point (which you ignored in your reply)
I don't think I ignored this; I just didn't think it necessary to comment on it. I believed you were referring to the ever-evolving nature of software, and in another, older post I have already questioned the notion that any piece of software may be considered "done" or "complete", especially with regard to an OS, as there is always some kind of progress: http://forum.osx86pr...pic=2430&st=60#
is that OSX for Intel is not complete, and is likely nowhere near complete,
Well, OK. For my part, I would think something had gone seriously wrong if a software developer didn't have things largely together half a year before the scheduled final release...
and -- most importantly -- you are not in a position to make that judgement; neither am I, neither is MrBond, because none of us work for Apple.
I'm inclined to think that at least those developers using the developer transition kit to port their applications to Mac OS for Intel will have some insight into how far the OS has evolved. I simply cannot believe Apple is encouraging people to start porting their apps to an operating system that isn't there yet, while supplying them with fake previews whose very basics are still in dramatic flux. Developers need something definite to rely on.
Whether you or I think it's an "alpha" or a "beta" is also irrelevant.
Sure it is...
That's a label that developers assign to their software, not users.
...I just adopted the terminology here as a shorthand to distinguish between different degrees of maturity.
I realize that users like to try, but since user experience (the only thing that users can judge) is NOT the determining factor in assigning development-stage labels, what any user thinks about the matter is not relevant.
So you're basically saying that ordinary users cannot distinguish even rough stages of maturity in a software product?