A (very) Brief History of (x86) Processors
The x86 architecture that we all know (even if you don't, you still actually do) and love (?) begins with - big surprise - Intel's 8086 processor, released in 1978.
Wondering what "x86" actually means? The name implies far more than what actually matters - "x86" is just shorthand for the family of chips whose part numbers ended in 86: the 8086, 80186, 80286, 80386, and so on. The 32-bit version of the architecture is formally called IA-32, though "x86" gets used for that too. Intel didn't make the first 32-bit processor - but it wouldn't matter if they had; they made the first wildly successful one. The first one worth giving a hoot about. Go Intel.
The original 8086 was essentially a 16-bit extension of Intel's 8080 line of processors, most of which were 8-bit designs. As these designs advanced, the 80386 was born: Intel's first 32-bit microprocessor, and a truly well built, successful one. It is from this 32-bit ancestor that all modern x86 *CISC processors are descended. And it was the 80386 that cemented Intel's market dominance.
* Complex Instruction Set Computing. This is the basis for the majority of desktop and laptop computers. CISC - as opposed to RISC (Reduced Instruction Set Computing) - is the style of processor most PC users are familiar with.
How cores work
What isn't apparent is that each core is subdivided into multiple parts. Much of the die space is dedicated to execution units: integer ALUs for ordinary arithmetic, and vector units that process *SIMD instructions - or, more specifically, **SSE instructions. In fact, most cores have several execution units and ports so that multiple instructions can be handled at once. Modern cores also tend to host most of the following:
FPU - Floating Point Unit (the old x87 unit, largely superseded by SSE and AVX)
AVX - Advanced Vector Extensions
FMA - Fused Multiply-Add unit (computes a×b+c in one step, with a single rounding)
Memory Controller (the on-die RAM controller; dictates supported memory types and speeds)
*SIMD: Single Instruction Multiple Data
**SSE: Streaming SIMD Extensions
Basically, the way it works is this:
In a standard, no-***SMT processor, each core presents exactly 1 hardware thread to the operating system. (Bit width has nothing to do with it - a 32-bit core and a 64-bit core each present 1 thread; the "32" and "64" describe how wide the registers and memory addresses are, not how many threads there are.) The only problem is: that core can only execute 1 software thread at a time, because only 1 thread can occupy the core at any given point.
***SMT: Simultaneous Multi-Threading
Enter SMT, or "Hyper-Threading". For each core, you now have 2 hardware threads. Two software threads can be scheduled onto the same core simultaneously... provided they aren't fighting over the same execution units. In this way, you utilize more of the resources found on that core. Sounds like it would be pretty nifty, right? Sounds like you could accomplish double the work in half the time, right?? Sounds like something everyone would need, right??? RIGHT?????
Wrong. The truth is: most programs do pretty much the same things; they perform most of the same functions and therefore fight over most of the same resources. Two threads on one SMT core share that core's execution units, caches, and front end - when both threads want the same resources at the same time, they simply stall each other, and in the worst case a workload can actually run slower with SMT enabled than without it.

The Operating System also has to be optimized to understand what the hardware is capable of. Any geek old enough to remember when the P4-HT was released will remember what a fiasco that was: a scheduler that couldn't tell a physical core from a logical one would pile two busy threads onto the same core while another core sat idle. It took Microsoft a couple of years to get with the program and ship a scheduler fix - in the form of a service pack. (That particular drawback is no longer a consideration for Intel's "Hyper-Threading".)

Contrast that with AMD's "Bulldozer" architecture: instead of two threads sharing one whole core, each Bulldozer module gives its two threads their own dedicated integer cores and shares only the floating-point unit and front end between them - a different set of trade-offs, with far less contention for integer work. Hence the reason not every company implements SMT the same way, or at all.
Then why would Intel put "Hyper-Threading" on so many of their processors?
The answer is actually very simple: Marketing.
If you think you're getting more, you will be more likely to pay whatever price asked for the product offered.
If this is true, then why are the Intel processors so much faster?
So much faster than what? An AMD processor? They're not; not really. Anyone who wants to go look up comparisons on the original "i5" line vs. AMD's Phenom II line will find that the Phenom II holds up remarkably well. Core for core, clock for clock, thread for thread, the Phenom II line outperformed Intel's non-"Hyper-Threaded" processors in a great many tests - I would know; I owned both and tested both, in my own home. They performed even better when the workloads were 64-bit processes. (Try to remember: AMD invented the 64-bit extension of x86 that we all use today - AMD64 - which Intel later adopted under the name "Intel 64".)
The reason Intel processors - in the here and now - seem faster isn't necessarily that they are; it's that while Intel has been optimizing their processors for how programs and operating systems currently call functions and calculations, AMD has been innovating with designs for how programs and operating systems will - in the future - call functions and processes. This has proven to be a very smart move on Intel's part... for now. If AMD can last another 5 years, they'll be so far ahead of Intel in every regard, Intel will be struggling to keep up technologically. Assuming, of course, that programming catches up with hardware. And, if you're old enough to remember the really old tech, you know that it will. It always does.
So why wouldn't we all just buy AMD?
Ah, a question for the ages. The truth of it is: AMD has looked to the future - and spent so much effort doing it that they've stumbled in the here-and-now. Intel processors have been optimized several times since the first iterations of the original i7. And where Intel briefly stumbled, they've begun to pick up the slack. Is it good to be future proof? Yes; absolutely. But how future proof should one really be? AMD's "Bulldozer" will easily outperform any of Intel's 32nm processors... but only in heavily multi-threaded applications that've been optimized to take advantage of the new architecture... And programming that takes advantage of that type of architecture for daily applications and games is still 5 years away. Single-threaded, everyday applications that normal folks use? Sorry AMD; Intel gets the gold on that one. And even though a Hyper-Threaded i7 will typically only get around a 30% boost over its i5 counterpart (in most programs), it still gets a boost nonetheless. Because they were designed for the here and now. And they were designed well.
In short, Intel is cashing in on the human nature of "I want it right now; and I want the best right now". And, for right now, Intel is as good as it gets (mostly; can't count their video though; that HD 3000 and HD 4000 line of theirs is a bad joke)... though, I wouldn't waste my time or money on an i3 processor; it's a bit like buying one of those cheap Chinese knock-offs - looks great on paper, reality is a different story.
What should I buy?
That depends; what do you want to do? Are you looking to support a particular company? Do you like supporting the under-dog? Are you on a small, tight budget but still need performance? What tasks will you be trying to accomplish on a daily basis? Is HD Video playback more important to you than spreadsheet calculations?
These are just a few of the questions you should ask yourself - and answer - before making a computer purchase. For the vast majority of the public, the answers would ultimately lead them to purchasing an AMD processor, as AMD's integrated graphics are several times as powerful as anything Intel is offering and are rapidly getting better - for cheaper.
Which is why companies like Intel hate it when you actually take the time to make an informed decision: You might not purchase their products if you do. That's why you see all of those cutesy Intel commercials; Intel wants you to believe that, not only are you getting the best, but that they are your friend and want the best for you.
Nowhere close to true.
Almost every major corporation will lie to you; pay off website reviewers (Tom's Hardware comes to mind); launch smear campaigns on the internet and pay hundreds of millions of dollars in advertising just to convince you to buy their product - even if they know it's inferior. And then there are the die-hard fans. We call them "fanboys" or, to spare their feelings and not invoke thoughts of perverted nerds in skin-tight T-Shirts: Hammer Legion Members. HLM will always argue that the company of their choice is the best... even when faced with verifiable proof that they couldn't possibly be more wrong. Be ye warned.
Bottom line? Buy what will fit your needs best; regardless of the company that makes it. Since most users don't utilize programs that can even recognize Hyper-Threading - much less actually make use of it - you're probably better off not wasting your money on an i7. Get an i5. Or, if you're more into graphics (which 99.9% of you are; whether you realize it or not), get an AMD "Llano" or "Trinity" based processor.
Take the time to educate yourselves on what the tasks you wish to accomplish need in terms of hardware and then purchase appropriately.
That's what we technicians refer to as: Optimization.
If I can't trust web reviewers, commercials or bloggers; why should I trust you?
Who said you should?
Always question. Everything.
I've nothing invested one way or the other. I don't work for any one of those companies and wouldn't lie for them if I did. AMD doesn't pay me to like their products. Nor does Intel (yes, I do like quite a few of their products, even if I often find myself hating the company). In fact, my favorite thing about working with/for Samsung is: They want me to tell the truth to my customers... even if that means the customer won't buy a Samsung product. Do I own anything with the Samsung brand? Yep; a cell phone and a microwave. Also, a few DRAM chips for a laptop (that are just sitting, unused, at the moment). But there are plenty of Samsung products that I wouldn't touch with a 40ft pole even if you were the one holding the pole.
Do I own AMD stuff? Yep; sure do; lots of it. Processors, video cards; the whole shebang.
What about Intel? Yep; own some of that too. Just got an i7 "Ivy Bridge" 3770K for rendering with Mental Ray (one of the few programs that actually fully utilizes Hyper-Threading). And current Intel processors are much, much faster for Mental Ray rendering than any of AMD's offerings... though, in my experience the original i7 line was complete trash in spite of the speed gain, because of the degradation and image loss I saw in renders. For quality, Thuban was far more accurate with image rendering for animations, even if it was much slower. Since "Ivy Bridge" has - supposedly - worked out those kinks and been far more optimized, I decided it was time to give Intel another go. For real this time. (After testing the first batch of i7's and i5's, I sold that trash. The AMDs might've been slower than the i7's but they turned out a much higher image quality when rendering. And that's what I needed. Now I need speed on the rendering process and image quality is less of an issue. How the world doth turn, eh?)
Why would you share any of this stuff? If you're not invested one way or the other, why would you share at all?
The answer is simpler than you think: I dislike ignorance and I hate stupidity. For future reference, use the following definitions for the two:
Ignorance: the state of not knowing any better because of a lack of education or experience
Stupidity: the state of knowing better because of education and experience and still choosing to ignore them both and stand behind what you know is wrong/incorrect
The short answer is I'm doing my part to lessen the amount of ignorance in the world.
Because there is no cure for stupidity.