
Intel i9 processor

cnsaguy

Now that this has been introduced, how long until i7s come down in price? I need a new computer, but I'm waiting now.
 
How much better (in reality) is the i9 processor? I understand the i7 is significantly better than the i5, especially for gaming...
Is it worth that much more to get the i9?
 
How much better (in reality) is the i9 processor? I understand the i7 is significantly better than the i5, especially for gaming...
Is it worth that much more to get the i9?
Gaming???? How old are you?

Now if this I9 helps with my porno watching then I want one.
 
How much better (in reality) is the i9 processor? I understand the i7 is significantly better than the i5, especially for gaming...
Is it worth that much more to get the i9?
When I built my PC years ago, I found data that suggested the processor is really not a big factor in gaming and the bigger bang for the buck was in the GPU, RAM, and HD (SSD over HDD). I'll see if I can't dig it up. In the meantime, what makes the i7 take such a big step up in gaming over the i5 (or i3 for that matter)?
 
When I built my PC years ago, I found data that suggested the processor is really not a big factor in gaming and the bigger bang for the buck was in the GPU, RAM, and HD (SSD over HDD). I'll see if I can't dig it up. In the meantime, what makes the i7 take such a big step up in gaming over the i5 (or i3 for that matter)?

I thought the processor directly impacted the resolution of the graphics being displayed on the screen? i.e., the better the processor the higher the potential for better resolution.
 
I thought the processor directly impacted the resolution of the graphics being displayed on the screen? i.e., the better the processor the higher the potential for better resolution.
The processor doesn't impact the resolution of the graphics; that's entirely on the graphics card. You can run into bottlenecking with a lesser CPU, because the CPU does handle non-rendered processing like AI positioning, team positioning, etc. Those processes can push a lesser CPU to its limit and result in reduced performance (which can't be fixed by reducing resolution). The key is to get just enough CPU to prevent bottlenecking, but don't go overboard, because you won't get large improvements for the money. There's a balance, and it depends on budget as well as the type of gaming.
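A toy model makes the distinction concrete: per-frame time is bounded by whichever of the CPU and GPU takes longer, so lowering resolution only helps when the GPU is the slow side. The millisecond figures below are illustrative round numbers, not measurements of any real hardware.

```python
# Toy model: frame time is bounded by the slower of CPU and GPU work.
# All numbers are illustrative, not benchmarks of real parts.

def frame_rate(cpu_ms_per_frame, gpu_ms_per_frame):
    """Frames per second when CPU and GPU work overlap each frame."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# GPU-bound: lowering resolution (less GPU work) raises the frame rate.
print(frame_rate(cpu_ms_per_frame=8, gpu_ms_per_frame=20))   # 50 fps
print(frame_rate(cpu_ms_per_frame=8, gpu_ms_per_frame=10))   # 100 fps

# CPU-bound: the same resolution drop changes nothing, because the
# CPU-side work (AI, game logic) is unaffected by render resolution.
print(frame_rate(cpu_ms_per_frame=25, gpu_ms_per_frame=20))  # 40 fps
print(frame_rate(cpu_ms_per_frame=25, gpu_ms_per_frame=10))  # still 40 fps
```

That last pair is the "can't be fixed with reducing resolution" case in a nutshell.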
 
The processor doesn't impact resolution of the graphics, that's entirely on the Graphics Card. You can run into bottlenecking with a lesser CPU because the CPU does handle non-rendered processing like AI positioning, team positioning, etc. Those processes can push a lesser CPU to the limit and result in reduced performance (which can't be fixed with reducing resolution). The key is to make sure you get just enough CPU to prevent bottlenecking, but don't go overboard because you won't get large improvements for the money. There's a balance and it's dependent on budget as well as the type of gaming.

You hit my misunderstanding perfectly. Thanks for the explanation.
 
Intel is increasingly just becoming a supplier for Amazon, and I don't mean for customers buying on amazon.com... I'm talking Amazon data centers.
This i9 has 12 cores, each one multithreaded. There is no use for this in a desktop PC.
 
In the meantime, what makes the i7 take such a big step up in gaming over the i5 (or i3 for that matter)?


I have several games that won't even function using an i5. World of Tanks, for instance, will play, but you have to dumb down the graphics so much that it's almost unplayable.

When I bought my current laptop, I initially purchased an i5 Asus. When one game wouldn't work, I called their tech service department and the guy on the phone was doing his best not to laugh at me trying to use an i5 computer for gaming. I took the computer back and got the i7 version. (Republic of Gamers Asus).
 
I have several games that won't even function using an i5. World of Tanks, for instance, will play, but you have to dumb down the graphics so much that it's almost unplayable.

When I bought my current laptop, I initially purchased an i5 Asus. When one game wouldn't work, I called their tech service department and the guy on the phone was doing his best not to laugh at me trying to use an i5 computer for gaming. I took the computer back and got the i7 version. (Republic of Gamers Asus).
Did the original laptop have a dedicated graphics card or integrated graphics on the motherboard? What generation i5? Based on your statement that reducing graphics settings made it playable, your CPU was not bottlenecking the system; it was your GPU. Glad you found a more capable machine, though.
 
Did the original laptop have a dedicated graphics card or integrated graphics on the motherboard? What generation i5? Based on your statement that reducing graphics settings made it playable, your CPU was not bottlenecking the system; it was your GPU. Glad you found a more capable machine, though.

Both had the graphics card. The i7 has 2gb of memory on the card, the i5 had 1gb I think.
 
Both had the graphics card. The i7 has 2gb of memory on the card, the i5 had 1gb I think.

Both have to have a graphics "card" or you wouldn't get anything on the monitor. [winking]

What matters is if one was "discrete" and one wasn't.

Discrete graphic cards usually have their own memory to handle the video chores and almost always a dedicated GPU which leaves the CPU free to do CPU chores.

Non-discrete graphics almost always share CPU cycles and system memory to handle graphics chores. I have found this often degrades system performance, since you lose system memory and CPU cycles. Low system memory forces the system to hit the hard drive more, and on laptops those are usually slow 5400 RPM drives, which bottlenecks the CPU.

This article indicates an i5 or i7 are both good for games.

http://opticsgamer.com/intel-i5-vs-i7/
 
I thought the processor directly impacted the resolution of the graphics being displayed on the screen? i.e., the better the processor the higher the potential for better resolution.
Not sure if serious ... but in case you are ...

GPUs are basically high-end vector processors, able to transform floating-point data in ways that SIMD on a classic von Neumann CPU never could. GPUs are used outside of gaming for the same reason, basically as a superior co-processor for all sorts of workloads, especially where precision is secondary.

E.g., we use them heavily in finance for analysis, especially predictive modeling, among other things.
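As a sketch of why finance likes them: pricing by Monte Carlo is embarrassingly parallel, since every simulated path is independent, which is exactly the shape of work a GPU's thousands of threads chew through. The toy below runs one such pricer serially in plain Python just to show the structure; every parameter value is made up for illustration.

```python
# Sketch of an embarrassingly parallel workload that maps well to GPUs:
# a Monte Carlo estimate of a European call option price. Each simulated
# path is independent, so on a GPU each thread would run one path; here
# we loop serially in plain Python purely for illustration.
import math
import random

def monte_carlo_call(spot, strike, rate, vol, years, paths, seed=42):
    rng = random.Random(seed)
    payoff_sum = 0.0
    for _ in range(paths):
        # Geometric Brownian motion terminal price for one path.
        z = rng.gauss(0.0, 1.0)
        terminal = spot * math.exp((rate - 0.5 * vol**2) * years
                                   + vol * math.sqrt(years) * z)
        payoff_sum += max(terminal - strike, 0.0)
    # Discounted average payoff across all paths.
    return math.exp(-rate * years) * payoff_sum / paths

price = monte_carlo_call(spot=100, strike=105, rate=0.02, vol=0.2,
                         years=1.0, paths=50_000)
print(round(price, 2))
```

The inner loop is pure independent arithmetic with no shared state, which is why the same computation ports so cleanly to GPU kernels.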

There really hasn't been much gain in CPUs since the circa-2010 Intel Sandy Bridge generation (2nd-generation Core i series). The improvements have been slight, mostly better fabrication efficiency.

E.g., even an i5-2500, a true 4-core 3.3 GHz (3.7 GHz turbo) part with 6 MiB of L3, can drive most modern GPUs without too much of a hit.

As far as the new Intel i9 series goes, it's just the consumer/workstation version of the pre-existing Intel Xeon E5 server parts. Intel unlocks the multiplier so it can be overclocked but, otherwise, there's not much to it. It even has two (2) internal rings, so it's really like a NUMA SoC, especially when it's exposed in Cluster-on-Die (CoD) mode.
 
Both had the graphics card. The i7 has 2gb of memory on the card, the i5 had 1gb I think.
The CPU and the size of the dedicated, dual-ported VRAM have very little to do with performance. It's the ...
  • Pixel shaders
  • Vertex shaders
  • Texture mapping units
  • Render output units
Different GPUs can literally vary 10-fold in the numbers of each: a full order of magnitude between low-end add-on video cards and the top end. On-die (integrated) GPUs can have 1/10th (or even less) the number of shaders/units of even a mid-range, $200 GPU.

That's where the performance impact is: the sheer vector throughput, capable of hundreds of double-precision GFLOPS (billions of floating-point operations per second).
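A back-of-the-envelope sketch of that order-of-magnitude gap: peak throughput is roughly shader count times clock times two (one fused multiply-add, i.e. two FLOPs, per shader per cycle). The shader counts and clocks below are illustrative round numbers, not the specs of any particular part.

```python
# Rough peak throughput: shaders x clock x 2 (one fused multiply-add,
# i.e. 2 FLOPs, per shader per cycle). Counts and clocks below are
# illustrative round numbers, not real product specs.

def peak_gflops(shaders, clock_ghz):
    return shaders * clock_ghz * 2

integrated = peak_gflops(shaders=192, clock_ghz=1.0)   # small on-die GPU
midrange   = peak_gflops(shaders=1280, clock_ghz=1.5)  # ~$200 add-in card

print(integrated)             # 384.0 GFLOPS
print(midrange)               # 3840.0 GFLOPS
print(midrange / integrated)  # a full 10x apart
```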

E.g., an nVidia GTX 1060 6GiB is usually no faster than a GTX 1060 3GiB (within 5%), all while the latter can meet or usually best a GTX 970 4GiB, at 2-2.5K resolutions.

The only time VRAM size starts impacting performance is when the working set will not fit in local VRAM: everything from super-sampling (e.g., rendering a larger framebuffer than displayed) to the textures (even compressed, they can eat up a lot of VRAM).

But usually by then, at those sizes, the GPU has to have enough shaders/units to handle the sheer volume of data. That's why cards are often sized the way they are.

E.g., 3 GiB is good enough for 2-2.5K resolutions with typical textures and super-sampling, with ~1000 pixel and ~100 vertex/texture units (nVidia vector architectures). Once you start hitting 4K, you're not only talking a larger framebuffer; the resolution usually dictates higher-resolution textures (otherwise it shows), the corresponding super-sampling, etc., so the requirement for far more shaders/units compounds.
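The framebuffer arithmetic itself is easy to sketch: width times height times bytes per pixel, scaled by the super-sampling factor in each dimension. It also shows why the framebuffer alone rarely fills VRAM; textures and intermediate buffers eat the rest. A minimal sketch, assuming a plain RGBA8 (4 bytes/pixel) buffer:

```python
# Rough framebuffer cost in MiB: width x height x 4 bytes (RGBA8),
# scaled by the supersampling factor in each dimension. Real engines
# also keep depth, intermediate, and texture data, so this is a floor.

def framebuffer_mib(width, height, supersample=1.0, bytes_per_pixel=4):
    pixels = (width * supersample) * (height * supersample)
    return pixels * bytes_per_pixel / (1024 * 1024)

print(round(framebuffer_mib(2560, 1440), 1))                  # 14.1 MiB at 1440p
print(round(framebuffer_mib(3840, 2160), 1))                  # 31.6 MiB at 4K
print(round(framebuffer_mib(3840, 2160, supersample=2.0), 1)) # 126.6 MiB, 4x the pixels
```

Even the super-sampled 4K buffer is a tiny slice of a 3 GiB card, which is why textures, not the framebuffer, are what push VRAM limits.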

The CPU is only needed to thread most of the other data flow and control: inputs, secondary outputs (even audio DACs are largely just digital streams) and secondary I/O (loading portions of the game world for rendering). The GPU even talks directly to main system memory, and can even execute code in it (unlike other I/O, which typically just maps memory for direct transfers). This is also part of the reason why GPU drivers can be unstable: they basically act like a peer CPU from the standpoint of memory, with all the same coherency issues as a multi-socket system, only going over the less capable PCIe channels.

That's also why HP, IBM and others are building new architectures where the vector processor, which is what a GPU essentially is, is really the 'main CPU', and the traditional von Neumann CPU is more of a 'peripheral controller.' That's why the 2020s will largely be a Windows PC-free gaming world at the high end, because Microsoft is way, way behind, unless they introduce a completely different OS (although some theorize this is part of why Microsoft is porting everything it can to Linux).

For now, the economies-of-scale of the PC lets Windows remain in control.
 
Intel is increasingly just becoming a supplier for Amazon, and I don't mean for customers buying on amazon.com... I'm talking Amazon data centers.
This i9 has 12 cores, each one multithreaded. There is no use for this in a desktop PC.
That's why they have long been called Xeon E5. ;)

The problem with the i9 is the same as with the higher-end Ryzen: you're talking about having to understand and control NUMA inside the CPU die to use it efficiently. NUMA is not something most home consumers and game coders want to address.

It's one thing to have such on a server, or even in a compute cluster ... where you're tuning the application or running multiple containers or VMs, on a subset of processors, with pinned memory and cores. That's what HPC (High Performance Computing) people are for.

It's a completely other thing to have consumers and game coders have to deal with it.
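For a flavor of what that hand-tuning looks like, here is a minimal sketch using the Linux-only scheduler-affinity calls in Python's standard library. The core set is a hypothetical node-0 layout, not a universal one, and this only pins cores; binding memory to the node, the other half of NUMA tuning, takes tools like numactl or libnuma, which the stdlib doesn't cover.

```python
import os

# Hypothetical layout: assume cores 0-3 sit on NUMA node 0. The real
# node-to-core map varies per machine (see /sys/devices/system/node/).
node0_cores = {0, 1, 2, 3}

if hasattr(os, "sched_setaffinity"):        # Linux-only API
    # Only pin to cores we are actually allowed to run on.
    target = node0_cores & os.sched_getaffinity(0)
    if target:
        os.sched_setaffinity(0, target)     # 0 = the current process
        assert os.sched_getaffinity(0) == target
        print("pinned to cores:", sorted(target))
```

An HPC admin does the equivalent (plus memory binding) with something like `numactl --cpunodebind=0 --membind=0`; expecting a game to ship that kind of per-machine tuning is the problem.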
 
Now that this has been introduced, how long until i7s come down in price? I need a new computer, but I'm waiting now.
They won't.

The i9 is a totally different market than the i7. Maybe some of the planned AMD Ryzen products are really what will cause costs to drop, but usually they've only been affecting i3 or, in the case of gamers, i5 prices.

The best way to buy is to get a board + CPU + memory combination. I've found in-person MicroCenter (closest is Atlanta) has bundle deals (easily save up to $100 when well-timed). NewEgg has sucked for such the past 2-3 years.

Today, even I still have a ...

- Cheap $50, 2014-era ASRock** H97M-ITX/ac, with an
- Old 2014-era Haswell i7-4790K, plus ...
Upgraded last November (from a 2014-era GTX 970 4GiB VRAM) to a ...
- $500 nVidia GTX 1080 8GiB VRAM GPU

The last is what kicks butt, and it still would even with a lower-end i5-4590K.

It runs Elite: Dangerous on my Oculus Rift CV1 at 90Hz just fine, although I leave shadows at medium and leave super-sampling at 1.0x using FXAA (low overhead), while everything else is Ultra. I get a few artifacts in large planetary renderings because of the shadows setting, but that's about it.

There is no reason for me to upgrade the base platform, and I don't plan to do so until at least 2018. I'm still on Windows 7 Ultimate, and by late 2018, I'll probably move to Windows 10.

I'm also waiting for ASRock** to make MXM commonplace in the new Micro-STX form-factor ...

Micro-STX is a wider version of the Intel Mini-STX (5x5"), with the MXM card edge added.
- http://www.asrock.com/ipc/overview.asp?Model=H110-STX MXM

MXM is the notebook GPU mezzanine-like PCIe card edge, so you don't need a 'vertical slot' like PCIe. Micro-STX is basically a notebook-like mainboard using SO-DIMMs and other approaches. The combination will allow extremely small, low-height form-factor set-tops with high-end GPUs, compared to Mini-ITX/DTX ... and, more importantly, far better cooling (the bane of vertical PCIe slots).

The STX+MXM design is really only ASRock right now, because
A) STX itself, let alone Micro-STX, still has to catch on, and ...
B) MXM cards themselves aren't available in retail at all
I.e., MXM notebook GPU cards are only ordered in qty. of 100s, and only by notebook manufacturers, directly from nVidia or other ODMs -- usually the Clevo (various resellers, Sager being their primary distributor and reseller) along with Dell (especially Alienware, who previously used Clevo).

I'm hoping they become commonplace, and that the economies-of-scale of set-top gaming PCs bring costs down by 2018.

Until then, I usually go with a Silverstone SG05 7x9x11" case with a 450-600W Mini-SFX power supply, or for a full ATX PS, the CoolerMaster Elite 110.
- http://www.silverstonetek.com/product.php?pid=210

I then add a 7-8" long Mini GPU, like the Gigabyte or Zotac options in a GTX 970/1070 or even 1080 now.
- https://www.zotac.com/us/product/graphics_card/zotac-geforce-gtx-1080-mini

Portability is most important to me. Not surprisingly, Silverstone has also made the first case for the ASRock STX+MXM design, the RVZ04, although the +19V is limiting (<400W, and usually only 220W capable at this time).
- https://smallformfactor.net/news/silverstone-cases-computex-2017-pio-micro-stx


**NOTE: ASRock is the new Asus. Asus sold their ODM, Pegatron, to ASRock in 2010. I.e., ASRock fabs their own products, which is why they are leading most everyone. Asus boards are now fabbed by ODMs like ECS, FIC, Foxconn, etc. It's ironic, because Asus originally created and spun off ASRock to make cheap boards fabbed by ODMs like ECS, FIC and Foxconn, while Asus owned Pegatron and made their own stuff.
 
Gaming???? How old are you?
Now if this I9 helps with my porno watching then I want one.
I'm surprised you don't have a Rift or Vive, then. The 180-degree (360 is overrated) stereoscopic 3D is pretty much excellent, even if it's not 'true VR': you're fixed at a point in space as the viewer, other than looking around.

Even a little Nano- or Pico-ITX PC from 2013 is good enough for watching 2D video. I just picked another one of these up from MicroCenter for $100 a couple of weeks ago. It's basically the same base CPU+GPU that's in the original Xbox One and PS4, except those have two of these on-die (8 cores, 2 GPUs).
- https://www.zotac.com/de/product/mini_pcs/aq01

Runs Windows 7, RHEL 7, etc... perfectly, out-of-the-box.
 