Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Feb 11, 2020.
48V input! What are they using as transistors? Light switches?
...without characterizing you, and dealing only with your point: you're quite wrong.
The market exists, has existed, and will continue to exist regardless of your opinion. This market generates the media you consume, from games to internet content to streaming to video.
Bring on the power-hungry GPUs: my 65W R5 3600 barely uses anything, and my RX 5700 XT only adds another ~240W. My total system draw is probably around ~330W maxed out, yet I have an 850W PSU sitting in my case sleeping half the time.
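For fun, here's that napkin math as a quick Python sketch (the ~25W line item for board, RAM, and drives is my own rough assumption to make the total land near ~330W):

```python
# Back-of-the-envelope PSU headroom check (figures from the post above;
# the ~25W "everything else" entry is an assumed value).
cpu_w = 65      # R5 3600 TDP
gpu_w = 240     # RX 5700 XT, approximate peak draw
other_w = 25    # board, RAM, fans, drives (assumed)
psu_w = 850

total_w = cpu_w + gpu_w + other_w
print(f"Max system draw: ~{total_w} W")            # ~330 W
print(f"PSU load at peak: {total_w / psu_w:.0%}")  # ~39%
```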
More nonsense. Since this market exists and has existed, name one product in the category. Just one.
That's a rhetorical question, because THERE ARE NONE! A 400 to 500W TDP for a GPU targeted at datacenters is a non-starter unless it has massive throughput, and with stacked chiplets that is not likely, because the heat dissipation will be prohibitive.
I will be going to GTC again this year and I will be sure to look for Intel's booth there and also for all of the competing products in this market.
A little comprehension goes a long way... The chart in the article has nothing to do with products being released; it refers specifically to development and validation, not consumer or data center products. There is not a single mention of a consumer or data center product in that chart.
During the winter, sure, that'll be fine; it's just inefficient electric heat. But during the summer I have higher electricity rates. 400W alone isn't a lot, but from my experience and my understanding of the coefficient of performance, it takes extra electricity on top of that for the AC to remove the 400W of heat, roughly the 400W divided by the unit's COP.
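To put numbers on that, a minimal sketch assuming a typical residential air conditioner with a COP around 3 (the COP value is my assumption, not something from the article):

```python
# Rough cooling-cost estimate for GPU waste heat (a sketch, not a sizing tool).
gpu_heat_w = 400   # heat the GPU dumps into the room
cop = 3.0          # watts of heat moved per watt of electricity (assumed)

ac_power_w = gpu_heat_w / cop
total_extra_w = gpu_heat_w + ac_power_w  # run the GPU *and* pump its heat back out
print(f"AC draw to remove the heat: ~{ac_power_w:.0f} W")    # ~133 W
print(f"Total extra wall power:     ~{total_extra_w:.0f} W") # ~533 W
```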
Guys, as said above, let's try to keep it more relaxed and academic.
We are talking about hardware, and tests, benchmarks and time will reveal what is true and what's not.
Patience and comprehension never hurt anyone.
Thanks in advance.
Programming (VMs), video production, filmmaking, music production, and many more industries have discovered the HEDT market for (relatively) low-cost computing.
This is the market that once used X99, now uses Threadripper, and that Intel is targeting.
This market generates billions of dollars from consumers and is well worth the attention of the CPU manufacturers for the tens of millions it earns them.
To suggest otherwise (that this market doesn't exist) means you require further education.
Time equals money, and Threadripper (and potential Intel competition) earns money.
You can get 48V by wiring multiple 12V rails in series.
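As a trivial sanity check on the arithmetic (the four-rail count is just what the math implies, not something stated in the post):

```python
# Voltages add for supplies wired in series (the same current flows through each).
# Note: in practice the rails must be galvanically isolated to be stacked this way.
rails = [12, 12, 12, 12]   # four 12V rails
print(f"{sum(rails)} V")   # 48 V
```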
So, we're looking at 2 usable layers in the stack at the moment.
Yes, comprehension certainly does go a long way. From the article: "The interesting part is that some TDPs are noted down as well, running to 400 and 500 watts indicative of a datacenter product."
Maybe you should direct your snark at HH.
Intel Xe HP (12.5) 1-Tile GPU: 512 EUs [Est: 4,096 Cores, 12.2 TFLOPs assuming 1.5 GHz, 150W]
Intel Xe HP (12.5) 2-Tile GPU: 1024 EUs [Est: 8,192 Cores, 20.48 TFLOPs assuming 1.25 GHz, 300W]
Intel Xe HP (12.5) 4-Tile GPU: 2048 EUs [Est: 16,384 Cores, 36 TFLOPs assuming 1.1 GHz, 400W/500W]
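Those TFLOPs figures follow from the usual FP32 napkin math; here's a small sketch, assuming 8 ALUs per EU and 2 FLOPs per ALU per clock via FMA (the standard assumptions behind estimates like these, not Intel-confirmed specs):

```python
# FP32 throughput estimate: EUs -> "cores" -> TFLOPs.
# Assumptions: 8 ALUs per EU, 2 FLOPs per ALU per clock (FMA).
def est_tflops(eus: int, clock_ghz: float, alus_per_eu: int = 8) -> float:
    cores = eus * alus_per_eu
    return cores * 2 * clock_ghz / 1000.0

for eus, ghz in [(512, 1.5), (1024, 1.25), (2048, 1.1)]:
    print(f"{eus:4d} EUs @ {ghz} GHz -> {eus * 8} cores, {est_tflops(eus, ghz):.2f} TFLOPs")
# 512 EUs -> 12.29 TFLOPs; 1024 -> 20.48; 2048 -> 36.04
```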
How long has Intel been talking about releasing a GPU? I'll believe it when I see it.
They already did.
I'm talking about something available to purchase that's meant to compete with Nvidia and AMD desktop graphics cards.
Those numbers make a BIG difference and I had not seen them before. 16K cores at 400 or 500W with 36 TFLOPS is actually pretty good. In comparison, Nvidia's V100 has 5K cores at 250W and puts out 14 TFLOPS.
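Putting those figures side by side as raw perf-per-watt (a sketch using only the numbers quoted in this thread; I'm taking 400W, the optimistic end of the 400/500W range, for Xe):

```python
# FP32 efficiency from the figures above (GFLOPs per watt).
cards = {
    "Xe HP 4-tile (est.)": (36_000, 400),  # 36 TFLOPs, 400W (low end of 400/500W)
    "Nvidia V100 (PCIe)":  (14_000, 250),  # 14 TFLOPs, 250W
}
for name, (gflops, watts) in cards.items():
    print(f"{name}: {gflops / watts:.0f} GFLOPs/W")
# ~90 GFLOPs/W for Xe HP vs ~56 GFLOPs/W for the V100, on these numbers
```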
One big difference: the V100 has been out for a while and a new generation is expected soon, while the Xe HP 4-tile GPU is expected next year. It will be interesting to see how they make stacked chips dissipate that much power.