

RAM advice

laxie Member RarePosts: 1,065
Hi everyone,

As I was trying to run a calculation, I just realised my computer only has 16GB of RAM.
I have 2x https://www.amazon.co.uk/HyperX-HX430C15PB3-Predator-288-Pin-Memory/dp/B071ZZCSQZ/

I would like to upgrade, to make sure I can run all of the calculations comfortably - this means at least 32GB.

Is the best strategy to buy 2x more of the same? That would be 4 x 8GB DDR4?
Is there any throttling when you fill all 4 slots?
Also, is it true that the RAM performs better if you stick with one model, as opposed to mixing different ones?

Comments

  • Quizzical Member LegendaryPosts: 22,242
    What processor and motherboard do you have?

    Assuming you have a mainstream consumer CPU and not a HEDT version, your CPU will have two 64-bit memory channels.  It can use each with a 64-bit connection to one memory module or two separate 32-bit connections to two separate modules.  Filling all four slots means that you get the same bandwidth as before, but just divided evenly among more modules.

    Using more memory modules places more stress on the memory controller, however.  This commonly means that it can't clock as high as before.  Depending on how much headroom you have, you might be able to run four modules at 3000 MHz (which is already overclocking, incidentally), or maybe you'll have to dial back the clock speeds a ways to make it stable.

    Mixing modules of different capacities will hurt your bandwidth considerably.  But so long as all modules have the same capacity, the performance hit to mixing different modules is much, much smaller.  At best, if some modules can handle higher clock speeds or tighter timings than others, you'll have to run all modules at the speed of the slowest.

    Memory isn't a case of one module being systematically faster than another in all ways, however.  If one module can handle lower values on one timing and another module tighter values on a different timing, to mix them means that you have to choose timings that all of the modules can handle--which will be looser in some ways than what any module on its own could do.  In principle, this doesn't have to be all that bad, but you're not going to have good luck with trying to auto-detect XMP profiles if different modules have different profiles.
  • Ridelynn Member EpicPosts: 7,076
    What kind of calculations are you doing? Not that it’s impossible, but for 99% of all users intentionally doing calculations on computers, 16GB of RAM is plenty.

    Hell even Bill Gates said you’ll never need more than 640kB. And he’s smart.
  • laxie Member RarePosts: 1,065
    Quizzical said:
    What processor and motherboard do you have?
    Thank you Quizzical, that's a really comprehensive answer. I understand it a lot better now. The computer has a mainstream 8700k - so it has two channels. From your answer, it sounds like the best solution is to replace my 2 x 8 GB of memory with 2 x 16GB. And simply give the old memory away.

    Ridelynn said:
    What kind of calculations are you doing? Not that it’s impossible, but for 99% of all users intentionally doing calculations on computers, 16GB of RAM is plenty.
    I believe the calculation scans over a large pool of text looking for patterns; if it has seen a specific pattern before, it adds +1 to that pattern's entry in a data frame. As you process more text, you see more unique patterns, so the table grows a fair bit. Basically it's building up a huge table and then exporting it at the end.

    I'm not sure it's viable to use anything but RAM, since you need to constantly be checking if your values exist and inserting/modifying them.

    Now that I've been thinking about it though, I wonder if it would be possible to batch this process. Exporting the tables and flushing the memory. And ultimately writing something that merges all of it together.
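    To make that batching idea concrete, here's a minimal Python sketch (assuming, purely for illustration, that a "pattern" is a whitespace-separated token) - each chunk gets its own small table, which is merged into a running total so the per-chunk table can be thrown away:

```python
from collections import Counter

def count_patterns(chunks):
    """Count pattern occurrences across text chunks, merging per-chunk tables."""
    total = Counter()
    for chunk in chunks:
        partial = Counter(chunk.split())  # build one small table per chunk
        total.update(partial)             # merge it into the running total
        # `partial` can now be discarded, freeing its memory
    return total

counts = count_patterns(["a b a", "b c"])
```

    The same `update` call would also work for merging tables exported to disk, which is basically the "process in batches, merge at the end" plan.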
  • Vrika Member EpicPosts: 6,524
    laxie said:
    From your answer, it sounds like the best solution is to replace my 2 x 8 GB of memory with 2 x 16GB. And simply give the old memory away.
    If you can get 2 more of the same module you already have, the performance impact will be small. Buying a new mobo + processor with quad-channel support might give you a better price/performance ratio than ditching 16 GB of existing RAM just so that you don't have to use all the RAM slots.

    Having only two memory sticks on a dual-channel motherboard is slightly better, but it's normally not that important.
  • Quizzical Member LegendaryPosts: 22,242
    laxie said:
    Now that I've been thinking about it though, I wonder if it would be possible to batch this process. Exporting the tables and flushing the memory. And ultimately writing something that merges all of it together.
    You could also cut your RAM usage by half by processing the data twice while only looking for half of the possible patterns in each pass.  That would take twice as long, of course, but wouldn't be nearly as bad as constantly paging to disk.
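    A rough sketch of what I mean in Python (hypothetical names; assumes your patterns are hashable so each pass can deterministically claim its share of them within a run):

```python
from collections import Counter

def count_partition(tokens, num_passes, pass_idx):
    """Count only the patterns whose hash falls into this pass's partition."""
    return Counter(t for t in tokens if hash(t) % num_passes == pass_idx)

tokens = "a b a c b a".split()   # stand-in for the real text scan
merged = Counter()
for i in range(2):               # two passes over the same data
    # each pass only ever holds roughly half of the full table in memory
    merged.update(count_partition(tokens, 2, i))
# merged now matches what a single full-memory pass would produce
```

    Each pattern lands in exactly one partition, so the merged result is identical to a one-pass count; you've just traded time for peak memory.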
  • Ridelynn Member EpicPosts: 7,076
    16GB of text is an awful lot of text.

    To put it into context, it’s the entire King James Bible, nearly 4,000 times over.

    Take a text that size and cut it up into sub patterns and yeah, you could get some big tables. But there is also a good deal of room for optimization too. And I don’t know exactly what you are doing, it just seems a bit suspect from my 4,000 mile away perspective. 

    I will admit, sometimes it’s cheaper to just throw hardware at the problem. And sometimes there is no other good answer than to throw hardware at the problem.

    Regardless, to answer your original question: RAM brands and speeds don't have to match; it will all run at the lowest common denominator. But matching does eliminate some occasionally quirky compatibility problems, if you are able to.
  • Torval Member LegendaryPosts: 20,172
    edited May 2018
    We do a lot of that stuff at work. My rig has 64GB of RAM because I'm usually running a local SQL Server instance and a Win7 VM hogging 4 virtual cores and its own 8GB of RAM.

    Our files typically range from a few MB to a few GB in size, depending on the data set. We try to break anything past 2GB up into chunks.

    I use an editor called Pilot Edit to parse and make little changes to large files and Notepad++ for small files.

    You can also use JavaScript and regex to manipulate files, although we typically write something in C#, Python, or use Pentaho Data Integration to do big file stuff. You might look into Pentaho. It's a big data ETL (extract, transform, load) utility. It can connect to most any DB (with the right connector) or file, including flat files and Excel.

    When dealing with large data I've found it's best to use some sort of data store. Keeping things only in memory is dangerous. If the program or script crashes or throws an exception, you've lost all the work and processing you've done. Or it's made partial changes you can't undo or pick back up from where you left off, corrupting your data.

    I would recommend increasing your RAM, because more RAM is never really wrong imo. However, I would also recommend engineering your process to look at smaller chunks at a time, save progress frequently, and have a way to unwind or pick up if the process crashes. Even on good hardware, extracting and manipulating say 500k - 1M rows of data comprising a few GB can take hours or days depending on the size and structure of the discrete components.
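    A bare-bones sketch of the save-progress idea in Python (file name, chunk layout, and counting logic are placeholders, not our actual tooling) - checkpoint after every chunk so a crash only costs you the chunk in flight:

```python
import json
import os

STATE_FILE = "progress.json"   # hypothetical checkpoint path

def load_checkpoint():
    """Resume from the last saved position, or start fresh."""
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            return json.load(f)
    return {"next_chunk": 0, "counts": {}}

def save_checkpoint(state):
    """Write to a temp file, then atomically swap it in, so a crash
    mid-write can't leave a half-written checkpoint behind."""
    tmp = STATE_FILE + ".tmp"
    with open(tmp, "w") as f:
        json.dump(state, f)
    os.replace(tmp, STATE_FILE)

state = load_checkpoint()
chunks = ["a b", "a c", "b b"]          # stand-in for the real data chunks
for i in range(state["next_chunk"], len(chunks)):
    for tok in chunks[i].split():
        state["counts"][tok] = state["counts"].get(tok, 0) + 1
    state["next_chunk"] = i + 1
    save_checkpoint(state)              # progress survives a crash after this point
```

    Rerunning the script after a crash just picks up at `next_chunk` instead of starting over.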

    We did increase our RAM from 32 to 64 because we had a heavy job and, for a few hundred bucks, thought it would be worth the trouble. It did make the job more stable, but it doesn't make things process that much faster. That aspect didn't deliver as expected.

    Just a few thoughts to mull over as you try and work through your calculations.


  • Ozmodan Member EpicPosts: 9,726
    edited May 2018
    To think my first job was working with an IBM 370 with 128k of memory.  They had just bought 1 MB of core memory from Memorex that came in four huge cabinets (cost over a million).  That supported about 25 computer terminals throughout the building.  Data was kept on tape, as the hard drives were reserved for programs and the OS.  And yes, we used keypunch cards to code and enter data.
  • Cleffy Member RarePosts: 6,262
    RAM speed isn't as important on an Intel. Better to run all 4 DIMMs if memory capacity is the issue.
  • time007 Member UncommonPosts: 1,061
    edited May 2018
    laxie said:
    Is the best strategy to buy 2x more of the same? That would be 4 x 8GB DDR4?
    I applaud you greatly. Back when I built my PC 5 years ago, everyone told me to get 8GB, not 16, as 16GB was overkill. I regret it and wish I had it now. I could get it now, but I'm probably going to do a new build this year. So kudos to you buddy, load it up, RAM is so cheap heheh.

  • ceratop001 Member RarePosts: 1,594
    guys should I upgrade my sdram? B)
  • Ridelynn Member EpicPosts: 7,076
    time007 said:
    So kudos to you buddy, load it up, RAM is so cheap heheh.
    It is?
  • Cleffy Member RarePosts: 6,262
    time007 said:
    I applaud you greatly. Back when I built my PC 5 years ago, everyone told me to get 8GB, not 16, as 16GB was overkill. I regret it and wish I had it now. I could get it now, but I'm probably going to do a new build this year. So kudos to you buddy, load it up, RAM is so cheap heheh.

    5 years ago 8 GB was enough. Buying memory today isn't the same as buying memory 5 years ago. You can reasonably place higher-clocked memory in all 4 DIMMs without any adverse effects.
  • Quizzical Member LegendaryPosts: 22,242
    Buying a given quantity of memory today could easily cost you more than it did five years ago.
  • Renoaku Member EpicPosts: 3,134
    Ripjaws <3
  • laxie Member RarePosts: 1,065
    @Torval
    Thanks for the tips. I'll definitely look into Pentaho.

    I started working with large data sets 3 years ago and it's been a wild ride. This current project is probably the trickiest so far. We started with an 800GB dataset, with the goal of real-time visualisation. It's now processed down to 60GB.

    I work in a Psychology department at university, which means the teams are very small. Most people around me know the theory; I have to do all the implementation. I had to do all of the analytical back end, the server-database communication, and all of the client side.

    The deadline is tomorrow and I can't wait to wrap this up. :grin: