x86 Reactions

Chris Liscio

I received some emails from customers asking what I thought about the x86 announcement, and whether I plan to continue development despite the big changes ahead. I can say for certain that development will continue, and creating a Universal Binary to support PPC and x86 should not take too much effort.

Now, regarding the announcement: I feel Apple has stepped up to the plate, and can now go out into the mainstream and prove just how well their hardware designs go head-to-head against the mainstream PC manufacturers. Of course, not everyone feels the same way I do. Below I list some reactions I have read (paraphrased, from memory), and why I do not agree with them.

Reaction: Apple's own site shows the G5 outperforming the Intel P4. Now Apple's machines will get slower!
Sources: developers, users

First of all, Apple's benchmarks appear on a page that tries to sell you an Apple PowerMac G5. Everyone claims their machine runs faster than everyone else's. Get over it. I didn't buy a Mac to run faster than a PC; I bought a Mac because I enjoy working and developing on Mac OS X.

So let's fast-forward to when Apple produces x86 hardware. As with all their current products, Apple produces their own mainboards, has full control over picking and choosing technologies to integrate into their systems, and from what I can tell, they do as good a job as they can. Unfortunately, as everyone can plainly see, they have lagged the last few years. PowerMacs still lack PCI Express (PCI-X != PCI Express), DDR2 RAM, and other goodies you can only find on PCs nowadays. Either this lag came from integration issues with the G4 and G5 processors, or Apple has already spent a good deal of time building their x86 hardware, or something else entirely. We will likely never know.

I expect to see all these goodies (or better, newer ones) appearing soon after Apple finishes adopting x86.
With Intel chipsets, Apple now has access to a drop-in solution to support faster RAM, better video cards, and all the cutting-edge technologies that exist only on the PC right now. The processor alone does not dictate the winner in the performance wars. I feel confident that Apple will remain competitive in all their markets with x86 products.

Also, before I close this point out, keep in mind that in general, benchmarks suck. They do not tell you anything useful about how well your computer works for you. For end-users, they will not tell you that when you switch from Safari to Mail.app while playing a QuickTime stream, Mail.app may take longer to display a message with 12 large attachments. For developers, benchmarks will not tell you how long it takes to compile your own code, and they certainly will not tell you how long it takes to develop a product and get it out to market. For creative professionals, benchmarks will not tell you how efficiently you can manage your media and ramp up on new creative tools. Everything I listed in this paragraph matters to the respective consumers, and how long it takes to apply a Gaussian blur to a large picture does not (unless, of course, your job is to take large numbers of images and blur them out, in which case I pity you).

Reaction: I will now throw my Mac in the garbage and switch to {Linux,Windows} now that Macs run x86 hardware.
Sources: users

And what hardware do you plan to run? Did you seriously buy a Mac solely because it had a G4 or G5 CPU inside? Admittedly, this comment could have come from a teenager, which would make me feel much better about the state of humanity right now. Even so, if you feel another platform better meets your needs, then go ahead and switch. Nobody cares what you do anyway.

Reaction: I have used a Mac for many years, and this processor switch has made me feel physically ill.
Sources: users, developers

Sir, you need to see a psychiatrist about psychosomatic illnesses brought on by events that cause relatively inconsequential changes in your life. Life involves change, and you just have to deal with it. Grow up.

Reaction: I don't like x86, so I will now stop building software for Mac OS X and move to {Windows,Linux} instead.
Sources: developers

I cannot believe a Mac developer would actually say this, but I saw it with my own two eyes. I just felt more people needed to read this one. Again, I hope a seriously troubled teenager typed this one out.

If you develop software for the Mac, what brought you here in the first place? Either you made a marketing decision, or you enjoy the development environment. If you made a marketing decision to target the Mac platform exclusively, then switching only gives you a lot of porting to do, and you could instead focus your efforts on making your product multi-platform. If you made a marketing decision to include the Mac platform as one of your many targets, then you will throw away a market segment for no good reason. In either case, your idea to switch only demonstrates that you are a bad marketing decision-maker anyway.

If you enjoy the Xcode/GCC development environment, I do not see it disappearing unless it gets replaced with something better. I feel Apple treats its developers quite well, and we get spoiled with so many free tools for writing great software.

Reaction: I planned to buy a G5 machine, but now I won't!
Sources: developers, users

This reaction often came with a side comment that the user has run their G3 machine for over 7 years, and buying a new machine would only waste their money because it would become obsolete in two years. I would argue that Apple plans to keep their PowerPC hardware running for a long time beyond the full product-line transition.
Do you honestly believe that Apple would act in such a way as to alienate the largest installed base they have had in a long time? Also, once the transition to x86 has completed, developers can easily keep their software compiling on both architectures, so you will not get left in the dark!

I feel the same way about people who hold off buying until Apple makes new product announcements. If you can wait, then you probably do not need the machine now anyway. Simple as that.

Reaction: But the x86 architecture sucks!
Sources: developers, users

If you have ever studied computer architecture and looked at how modern CPUs evolved, you would quickly realize that CPUs get built around making benchmarks go faster. Every trick in the book gets exploited so that processors run faster. Once you accept that fact, it follows that all processors contain ugly hacks. No one architecture rules them all. In the end, as long as companies keep putting their R&D dollars into beefing up their CPUs' performance, we consumers keep getting faster products. I believe Intel has done a good job innovating over the last few years, and I expect more good things to come.

Most users should not care about this. Developers must make their applications run as fast as they reasonably can. Mac OS X lets you do other things while waiting for long operations to complete, anyway.

Reaction: But AMD Opterons beat Intel CPUs!
Sources: developers, users

I have some ideas about why Apple went Intel. Most of all, Intel has a very strong brand. Consumers recognize the Pentium brand, and consumers also equate Intel with stability. I realize that consumers don't really understand CPUs, and don't care, but if you compare computer A with a Pentium against computer B with an Opteron, consumers go with what they know.
I have actually built computers with Durons inside them for family members, but told them they were Celerons to avoid confusion. When I told my dad he had an AMD Athlon, he got very confused, and was only happy when I said it was "like a Pentium, but faster".

I also have a feeling that the decision involves politics. As I understand it, AMD gets help from IBM to make its chips. If IBM has a beef with Apple, and AMD needed IBM's help to meet demand from Apple (who makes servers that compete directly with IBM's), IBM might not help out, and Apple would land in the same kind of supply fiasco that caused their falling-out with IBM and Motorola in the first place. It could happen.

And finally, what goes faster today may not lead the pack tomorrow. We know this happens with video cards, CPUs, and every other aspect of computing. Intel may have some surprises up their sleeves that will blow us away.

Reaction: What about 64-bit computing?
Sources: developers, users

Use your imagination and listening skills for this one. I can imagine Apple keeping the G5 around to maintain its 64-bit computing line for a while. When customers require 64-bit precision or address spaces, they can get G5s for the time being. Remember, Apple has not promised the full product-line transition until the end of 2007. If Intel has not provided Apple with a 64-bit part by then, I would go into a state of shock, given the market's trend towards 64 bits right now.

I do not see PowerMacs and Xserves switching over to Intel chips until Intel gets a 64-bit part ready for Apple, and one that can perform multi-processor duties.

Reaction: But the PowerMac will only have one Pentium 4 3.6 GHz in it!
Sources: developers, users

The developer box passed out in the transition kit only takes that form because it allows developers to muck with HDDs, RAM, and PCI peripherals easily.
What I envision is the P4 3.6 (more likely a Pentium M) ending up in the Mac mini, iBook, and so on.

I highly doubt Apple would ship an Intel-based box that does not outperform their existing products. In fact, it would make sense for Apple to ship an Intel-based Mac mini first, so that the slower speed fits nicely into the product line (faster than the G4 Mac mini, slower than the iMac G5, and so on). Also, it serves as a cheap entry point for smaller developers to start ensuring x86 conformance.

I will stop for now… I had too much fun doing this. :)
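P.S. A few people asked what the Universal Binary work I mentioned at the top actually looks like. For well-behaved code, it should mostly come down to build settings rather than source changes (endian-sensitive code aside). A rough sketch from the command line, assuming the transition-kit toolchain where Apple's gcc accepts multiple -arch flags:

```shell
# Build a single executable containing both PowerPC and x86 code.
# (Assumes Apple's gcc driver, which compiles once per -arch and
# glues the results into one fat binary.)
gcc -arch ppc -arch i386 -o MyApp main.c

# Inspect the result; lipo reports which architectures the binary contains.
lipo -info MyApp
```

In Xcode, the equivalent should be listing both architectures in the Architectures build setting. Byte-order-sensitive code (custom file formats, network I/O) still needs auditing by hand, but the toolchain handles the fat-binary packaging for you.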