
Intel Employee – Apple Likely to Use Stock Chips


Shuddertrix

Mashugly interviewed an Intel employee:

 

I had the chance recently to speak with an Intel employee working in the Fab. department where Apple’s next-gen Intel chips are being made. He spoke under condition of anonymity, giving only his opinions and not the official sentiments of Intel. His credibility is sterling.

 

He gave us his opinions and an insider's perspective on Apple's move to Intel. Note that while he states that there are many ways Apple could restrict its OS, he and his colleagues assume that Apple will use stock production chips. But that's not all he had to say... read on.

 

1. What is the atmosphere in your dept? Excitement? Slight nervousness?

First, let me give you a bit better understanding of what it is I do. I'm a technician in the "FAB", that is, one of the factories where the processors are manufactured. When I see the chips they are still in complete wafers and several weeks, if not months, from being what you would know as the chip in your computer. People in my department, and the rest of the FAB, are concerned with keeping our tools running and processing the wafers to get them out of the FAB. We manufacture several different processor types (i.e., laptop, desktop, and server) so there isn't really a whole lot of attention paid to what product is running on a tool at any given time.

 

What I can tell you is that in the seven-plus years I have been working for Intel, this is the first time in years that I have seen the majority of the people in the FAB excited about the direction the company is moving in. The Apple deal is part of that, but the confidence in our new CEO and our new products has a lot more to do with it. Intel has always had a mindset of "we can achieve anything we set our minds to". With the resources the company has, for the most part that is true. There are not many other chip companies that can afford to spend billions of dollars to develop new technologies. One thing that people totally overlook in the whole Intel vs. AMD thing is that in the past few years Intel has completely redesigned its chips from the core out. There were a lot of problems to overcome in making that happen, and that is the source of the majority of the performance issues that people have seen in Intel chips during that time period. AMD is still designing on a processor core that is getting to the end of its usable life cycle. It will continue to get harder and harder for them to get more performance out of this older design while Intel starts to fine-tune its newer designs.

2. Will the parts that are used in the first Intel Macs be generic P4s or will they be a special processor made just for Apple?

I do not know for sure but I would highly doubt that we are making special parts for Apple. It's just too expensive to design a special part, work out the manufacturing bugs and then ramp it into production. Again you have to look at it from the Intel point of view. We manufacture hundreds of millions of processors each year. While we are proud of the fact that Apple is using our chips they will be a very small percentage of the product we make. I would be more inclined to believe that during the design cycle of our new chips input from Apple was used to decide what features would be included in the final design. As we all have now learned Apple and Intel have been talking for a long time about a lot of things. Also Apple has been working on porting their OS to an x86 platform for at least a few years now so I'm sure there were a lot of ideas exchanged over that time.

 

3. How did you (and your colleagues) react to the news of OS X being hacked to run on normal hardware? About breaking the TPM?

I didn't know about it until I saw what you had reported on it. I'm not surprised that there is a hack floating around out there. Not many people at Intel are really into Macs. You have to remember that until very recently they were the competition. I still get some funny looks when I walk around with my iPod. :)

4. Is Apple planning on using the TPM or something similar to restrict OS X to their own hardware in the final product?

No idea about this one. There are many things that can actually be built right into the chip that could be used to restrict what OS is run with it. Each chip has its own ID, and I'm sure it would be possible for Intel to use a special convention for the ID code on chips intended to go in Apple machines. Features such as clock speed and amount of cache are actually set when the chip is e-tested and they see how that chip performs. When they come out of the Fab there is no difference between a 3 GHz P4 and a 3.6 GHz P4. It's how the chip performs at e-test that decides what speed it will be certified at, and the final configuration is burned into the chip at e-test.
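The speed-binning process he describes can be sketched roughly like this (a hypothetical illustration; the grade list and function are mine, not real Intel data):

```python
# Hypothetical sketch of speed binning at e-test: a die is certified at
# the highest speed grade it passed electrical test at. The grade list
# below is illustrative, not real Intel data.
SPEED_GRADES_MHZ = [3600, 3400, 3200, 3000]  # highest bin first

def bin_die(max_stable_mhz):
    """Return the highest speed grade the die qualifies for, or None."""
    for grade in SPEED_GRADES_MHZ:
        if max_stable_mhz >= grade:
            return grade
    return None  # failed every bin

print(bin_die(3450))  # a die stable at 3450 MHz ships as a 3400 MHz part
```

In other words, identical dies off the same wafer end up sold at different clock speeds purely on how each one tests.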

 

5. Are most of the people you work with Apple fans? Or are they just working in that Dept. because they were assigned to it?

Again reference the fact that we don't make chips just for Apple at the Fab I work in. As far as being fans I would say that most people are happy that we are working with Apple but it's not the most exciting thing in the world to them. Another big misconception that people have about us folks at Intel is that we are all big computer geeks. I have actually known a few people who work for the company that don't even have computers at home.

 

6. In your opinion, who will benefit most from the Intel-Apple partnership?

I think both companies will benefit in the short and long term. How much comes out of this is mostly in Apple's hands if you ask me. They are in a position where they have to make some very important decisions about Apple's future. Once Intel chips go into Macs there will be no difference between a Mac and a Dell but the OS running on the machine. If they decide to stay with their image of the "rebel" company that prides itself on doing things differently, then there is no way that they will separate their OS from their machines and sell it on its own. If they do that, they will put themselves in competition with all the other PC makers out there and they will learn what the rest have learned. Profit margins are very small, and to make any money you have to sell a lot of computers - way more than 4 or 5 percent of the market. The other thing that selling their OS by itself will do is get them the attention of all the hackers and virus writers that until now have left them alone. If Apple had as many people trying to break its code as Windows does, I don't think that it would keep its rep for being so stable and secure for very long. On the other hand, if they do decide to try and go big in the market and expand their share by opening up their OS to be run on PCs, Intel is a great partner to have from the perspective of supplying parts and supporting their new design ideas. As I've said before, I think Apple will be one of our most demanding customers, but that will only make Intel better and force us to move in new directions.

 

7. What do you think was so attractive about Intel to Apple? I mean, other than the obvious stuff (like what Jobs said about lower wattage) what can you offer Apple that IBM/PPC couldn't?

Since I'm going under the assumption that Apple will be using normal production chips from us, I think that Apple likes the idea of having their chips made by a company with such a strong history of pushing the computer industry, and also the fact that making processors is our first and most important job. It is what defines Intel, so we are going to be sure to do everything we can to make things better all the time. IBM is such a big company and involved in so many different things that making chips for Apple was just one more thing on their to-do list. I don't think it was a real priority for them. Also, we can supply more chips than anyone else could.

 

8. Any ideas on what they're planning on calling the chips for Apple? If not a final name, what codename are you using?

No word on that yet at all.

 

Thanks for your time!

 

Russian translation (thanks Kajy)


User Feedback

Recommended Comments



That could be said. However, if Be thought they would do well in the OS market, they wouldn't have switched over to PDAs. Be obviously believed that they could get a piece of the Internet appliance/mobile/PDA market if they decided to make such a drastic change.

 

It's not as clear-cut with NeXT, since Steve Jobs was running the company, and he clearly was not happy with Apple. Here's a quote from http://www.simson.net/nextworld/NextWorld_...ExpoSpec04.html :

Now, it seems to me that NeXT was fairly reluctant to get into the shrink-wrapped software business and that they still wanted a hold over the hardware NeXTstep came out on.

 

I'm not saying anyone was forced onto Intel, just that they made a business decision to abandon proprietary boxes and that they got absorbed by bigger companies.

 

 

With Be, it may have been the right decision, or at least what they wanted. They got bought out. The handheld hardware market was much smaller and easier to design for, and Be would have been a good place to start, considering how it ran (fast) on almost anything.

 

Basically PDAs were an easy, open market at the time, so why fight with Microsoft?

 

 

The problem with NeXT is simple. It is the same problem with Apple now.

 

Steve Jobs. You never know what that guy will do. He has hurt Apple in the past just to punish hardware manufacturers (ATI; little does he know it only hurt the customer). He is kind of like the kid who brought the basketball: if you make him angry, he takes his ball and goes home. Jobs is still that kid.

 

Keep in mind, the head man at Be was the guy who replaced Jobs at one time. The apple (no pun intended) does not fall far from the tree. Both like to do things just to trip you up.

 

Sort of like Richard Branson, who will practically throw his company away on a harebrained scheme that will never make money.


Most developers SHOULD know unix or have learned how to program in it while in school.

And you forgot group d) of Apple users: professionals.

 

That's right, most developers. How many home users do you think that makes up? Um... 5%? 2%? 1%? None?

 

And as for group "d" they always lie in either b or c :blink:


A lot of what you're saying is that the new CPUs will combine aspects of both cores, whichever parts combine to make them. Didn't I say that the cores aren't going to die but rather converge?

 

Dual cores at present do not perform anywhere near "twice as fast" in real-world scenarios, Pentium D or Yonah. On paper you would think that they are twice as fast, but they're not. Just like two SLI'd 6800 Ultras aren't twice as fast as one.

 

With the CPUs at present this is largely due to the fact that a lot of developers can't program for them; in fact most apps written for dual cores today run slower than their single-core counterparts (programmers are having problems utilising the dual cores and in particular the cache), and until they get to grips with things like this you won't see significant increases in a given app. This change should come around quite quickly, you would hope. You will see an improvement in running multiple apps, but how much still remains to be seen; there will still be a small but significant OS overhead preventing the "twice as fast". I have spent some time playing with a Pentium D rig, and at present I wouldn't trade in my HT P4 for one. Running Half-Life 2, and just to make it fair I put the same gfx card in both PCs (a 6600GT and a 6800 Ultra), the Pentium D wasn't anywhere near my HT P4, and won't be for quite some time. Until Vista is mainstream and you're at least one software engine revision, maybe two, down the road, you're not going to see a huge benefit in running one app or game. You don't have to take my word for it; try it yourself with a Pentium D.

 

The current generation of PCs is more than fast enough for today's average user; the only thing that is going to drive CPU sales will be things like the resource-devouring Vista. PCs have now got to the stage where they can do most things asked of them by most users.

 

Maybe when the second CPU version ships, or I might even wait for the third to buy one. Right now I have a very fast P4 media center that handles everything I throw at it. And I won't be upgrading to Vista until maybe 2007.

 

And I'm probably wrong, lol, but I thought I'd read on The Register that Intel were going to be producing two low-power Itaniums before the desktop CPUs to combat something AMD was doing?

 

With regard to the design of the core, I would be interested to see how much of the M's benefits are down to improved fab processes (smaller scale, reduced inductance, reduced leakage, different materials, strained silicon, etc.) rather than redesigned core logic. How much of the core logic is the same (forgetting the hyper-threading)? More than you would think, I would bet! How much has been done to reduce leakage across the transistor gates to reduce the power loss, and have the switching times been reduced (which happens when transistor sizes are reduced)? How similar are the cores from a logic perspective?
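For rough intuition on the fab-process side of that question: dynamic switching power scales roughly as P = C·V²·f, so a shrink that lowers gate capacitance and supply voltage cuts power sharply even at the same clock. A back-of-the-envelope sketch with made-up numbers (not Intel data):

```python
# Back-of-the-envelope sketch: dynamic switching power P = C * V^2 * f.
# The capacitance and voltage figures below are made up for illustration.
def dynamic_power_watts(capacitance_f, voltage_v, frequency_hz):
    return capacitance_f * voltage_v ** 2 * frequency_hz

old_node = dynamic_power_watts(30e-9, 1.5, 3e9)  # ~202.5 W at 3 GHz
new_node = dynamic_power_watts(20e-9, 1.2, 3e9)  # ~86.4 W at the same clock
```

The quadratic voltage term is why even a modest voltage drop at a smaller node matters more than the capacitance drop, separate from any change in core logic (leakage power is a different term again and is what strained silicon and new gate materials target).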

 

IA64 is where the future is... x86 with EM64T won't live forever...

 

lol

 

people take things far too seriously


terry-

 

Well, thanks for your comments. I understand that it wasn't a fantastic interview - I hesitated to post it, as I knew that he didn't know as much as I thought he did. He didn't "fool" me though - he made no claims as to what he knew before the interview.

 

But the fact that he works at Intel does count for something. First, no other news site has gotten such an interview, regardless of quality. Secondly, rumors do float around the "fab" which might have told him something we didn't already know. "What was that memo I saw about production being changed to another fab the other day?" "Oh, something to do with Apple. Word is that they..." You know how it goes.

 

And on the DMCA thing - yeah, we did delete some things that were clearly in violation of the DMCA, and if necessary, we will continue to do so. While we still want to remain compliant, we're also working out a clear standard for what can and can't be posted. I'm also aware of copyright law (although I had to educate myself about the DMCA once this site grew), but when other sites are receiving C/D letters, it just makes sense to batten down the hatches.

 

Constructive criticism is fine, and I truly welcome it. Asking that I live up to the standard set by Ars Technica is unfair - I'm a college student doing this on the side and I have no industry connections whatsoever. However, when I try to find unique content for this site and discover that it's called "totally biased" (OSX Blows - to what questions were you referring?) and that it "sucked," it makes me question why I bother providing this content at all.

 

Hi, I didn't mean any offense. You're obviously enthusiastic, which is a good thing, but the whole interview was filled with leading questions, which makes it very biased. If you're doing a constructive interview your questions need to be open-ended; you were trying to get the tech to say great things about Apple that he was never going to say. The interview came across as someone very enthusiastic trying to lead someone into saying that Apple were the best thing since sliced bread. Sorry, but that's how it came over. All credit to you for getting access to someone at Intel, though. And don't be put off by getting some flak about it; if you're new to it, it's one of those things that improves with experience. Interviewing is a lot, lot harder than most people think, and you did a pretty good job.


I am glad you posted this interview, Mashugly; information is information.

 

Since this guy had some time to reply to your questions, it is not necessarily the answers he gave but maybe the syntax and word structure that he uses that makes this interview interesting to me.

 

It is notable that before he even answers anything he stresses that he only works in the "FAB" department that only deals with processors, and he goes on to blabber about wafers and all to emphasize this. Although the questions he chose to answer did ask about P4s, this just makes him justify not talking about any other "things" Intel and Apple have been talking about when he says

 

As we all have now learned Apple and Intel have been talking for a long time about a lot of things.

He talks about how he isn't much of an Apple fan and how most Intel employees are oblivious to the computer world or something along those lines and just seems to shrug off the TPM being hacked like he just heard it happened but didn't care.

 

Anyway, I hate to write long posts because shorter ones can be more effective, but if Intel and Apple have been working on developing a proprietary Apple chipset to run the regular Pentiums, a TPM would not be needed. Also, the x86 Darwin experiment for generic 386 chipsets (and 915) was just to get OS X to "sing on x86" (as Jobs said), not onto the plethora of motherboards.

 

This just adds to my theory that the x86 Macs will have a unique proprietary chipset (not processor or socket type), and Jaguar x86 will be coded so that it will work on only the Apple chipset, which may be quite different from 386, 586, etc. chipsets.


A lot of what you're saying is that the new CPUs will combine aspects of both cores, whichever parts combine to make them. Didn't I say that the cores aren't going to die but rather converge?
No, you said:

 

Pentium D's (and furture multicore chips) are still going to be largely based on this design but with some of the Pentium M optimisations and power reduction properties
Which simply isn't true; it's the other way round. There's very little, virtually nothing, from NetBurst left in the new CPU designs. They're based on a completely different design approach from the ground up.

 

Dual cores at present do not perform anywhere near "twice as fast" in real-world scenarios, Pentium D or Yonah.
Sorry, you lost me until the end of the sentence; grammatically this doesn't make sense to me... I suppose you wanted to say that dual core doesn't mean twice the speed compared to single core. That's correct, of course. But I didn't say that anywhere. I said that the dual-core chip Conroe is expected to be twice as fast as its predecessor Presler, which happens to be a dual-core CPU, too.

 

On paper you would think that they are twice as fast, but they're not. Just like two SLI'd 6800 Ultras aren't twice as fast as one.
Sure, of course.

 

With the CPUs at present this is largely due to the fact that a lot of developers can't program for them,
I'm not sure whether they can't, but most of them simply didn't have to, as SMP has until now just been an issue in high-performance computing and not in the home computer market. Besides that, CPUs are generally fast enough to run most apps now at reasonable speed. I wouldn't see a reason why Thomson ResearchSoft's EndNote, for instance, would have to be heavily multi-threaded in order to take advantage of the multi-core architecture. But you're certainly right that with projections like a hundred CPU cores on a chip, new approaches to programming will be necessary.

 

in fact most apps written for dual cores today run slower than their single core counterparts
Nonsense. Single-threaded software runs a bit slower, but software that has been designed to utilize multiple cores runs much faster -- in the range of 50 to 80% -- on a dual-core CPU than on a single-core one with the same clock speed and feature set.
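The kind of speedup being argued about only appears when the work is actually split across cores. A minimal sketch using Python's standard multiprocessing module (the workload itself is made up for illustration):

```python
# Minimal sketch: splitting a CPU-bound sum across worker processes.
# With enough work per chunk, wall-clock time improves with core count;
# a single-threaded version of the same loop sees no such benefit.
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=2):
    step = n // workers
    chunks = [(i * step, n if i == workers - 1 else (i + 1) * step)
              for i in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    n = 100_000
    assert parallel_sum_of_squares(n) == sum(i * i for i in range(n))
```

The splitting and recombining is exactly the part single-threaded code never had to do, which is why existing apps don't automatically get faster on a dual core.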

 

(programmers are having problems utilising the dual cores and in particular the cache) and until they get to grips with things like this you won't see significant increases in a given app.
I wouldn't call that "problems", but yes, as stated above, new approaches to programming are necessary to exploit the power.

 

I have spent some time playing with a Pentium D rig, and at present I wouldn't trade in my HT P4 for one. Running Half-Life 2, and just to make it fair I put the same gfx card in both PCs (a 6600GT and a 6800 Ultra), the Pentium D wasn't anywhere near my HT P4, and won't be for quite some time.
I wouldn't call contemporary games the perfect testbed for exploring the power potential of multi-core CPUs... Try encoding apps, for instance, and you'll see how great the benefits can be.

 

you're not going to see a huge benefit in running one app or game. You don't have to take my word for it; try it yourself with a Pentium D.
Sure, I didn't question that.

 

The current generation of PCs is more than fast enough for today's average user; the only thing that is going to drive CPU sales will be things like the resource-devouring Vista. PCs have now got to the stage where they can do most things asked of them by most users.
Yep, that's also the reason why now it is the perfect time to stop the clock speed race and try other approaches to make processors more powerful in the future.

 

With regard to the design of the core, I would be interested to see how much of the M's benefits are down to improved fab processes (smaller scale, reduced inductance, reduced leakage, different materials, strained silicon, etc.) rather than redesigned core logic.
Actually the P4 is the "redesigned core logic". The P-M is rather a new iteration of the old P6 design that started with the Pentium Pro. And this P4 redesign proved to be inferior to the common P6 architecture from the ground up; all Intel was hoping for was scalability, with the acknowledged design goal of going up to 10 GHz. The first P4 CPUs were in some respects even slower than their direct P3 predecessors, despite higher clock speeds.


