Kingston HyperX reaches Fastest 128GB DDR4 Memory Kit

HyperX announced that it has created the world's fastest 128GB DDR4 memory kit, running at 3000MHz. The kit consists of eight 16GB HyperX Predator modules (16GB x 8) running on a motherboard in an eight-module, qu...
For servers... Or only as a demonstration of DDR4's capabilities... Or maybe for some rich bastard who wants to be future proof.
MS SQL Server 2014 has a new in-memory database engine (In-Memory OLTP). A boatload of high-speed RAM would be awesome for that.
Future proof? 32GB is still future proof, even 64GB is beyond future proof. I got 32GB back in 2012 and I still haven't even come close to utilizing 75% of it at once.
Well, same here... For a regular user, this amount of RAM is just insane. But on the other hand, professional users who run servers, like SQL databases or even large SharePoint farms, could use more than 64GB of RAM with ease. So I would say these are meant for professionals or hardcore enthusiasts.
For those of you saying this is for servers - no, it isn't. You don't put 3GHz unbuffered non-ECC RAM in a server; that's just stupid in most cases. Also, as far as I'm aware, the HyperX line was never intended for servers. Gamers obviously don't need anywhere near this much - 8GB is enough for just about any game as long as you don't run many background tasks, and even if you run antivirus or have a knack for multitasking, 12GB is a safe maximum for gaming. That being said, this memory is really only "useful" for workstations. But unless you use RAM disks or virtual machines, no workstation is going to need 128GB, and 3GHz isn't necessary either. If you honestly think that whatever task you're doing will take up that much memory, you either have very poor task management or your software has a memory leak.
Future proof? I'd say by the time you start using 50% of it, it will be useless; DDR6 will sweep the floor with it even with a quarter of the memory.
Obviously you haven't used many scientific applications, then. Ansys software would eat that up easily, and the workstation we use at work currently has 128GB for exactly that reason. For everyday users, no, this is completely pointless. Servers, again, wouldn't ever use this, but powerful workstations sure as hell could.
Source? All the scientific applications I've heard of so far use clusters of servers. What kind of application would run on a single machine and use up to 128GB of RAM? Unless there are some crazy virtual machines running inside it, it just doesn't make sense.