
Help me choose one of these PCs


Comments

  • GrayGhost79 Member Uncommon, Posts: 4,775

    Originally posted by Catamount

    If one really likes OS X (enough to pay in the ballpark of $1000 for it, and I do know people who do), then there's nothing wrong with buying an Apple desktop. Outside of that, though, there's not much Apple can give you that you can't get by home-building, for almost immeasurably less. Apple, like many other companies, doesn't actually manufacture its own hardware; I believe ASUSTeK is their main ODM/manufacturer for everything from the MacBooks to the iPods.

    Yeah, the only real selling point of Apple has been its OS, but if you want to play PC games... you don't want to use Apple's OS, lol. So in the end, if you buy an Apple for gaming, you end up with an extremely overpriced PC running Windows.

  • Catamount Member, Posts: 773

    Originally posted by unbound55

    Originally posted by jillyronald

    The best PC is an Apple; it provides the very best service and a lifetime warranty. But it is very costly. There are so many good PCs available on the market, and which computer you buy also depends on your budget.

    Just pointing out that Apple (like all other companies) does not offer a lifetime warranty on their computers...

    From the looks of it, it's just the AppleCare Protection Plan that you get access to (a search for "warranty" at Apple.com yielded no other useful results, and it's all that was mentioned in the Store section), and that's $250 for three years with some tech support added.

    As I said, you can get all this by home-building. Newegg.com offers warranties you can purchase on any parts that don't already come with good ones (and many do). Manufacturers can give fine tech support, and honestly, in this day and age, a good technical forum will probably get you more qualified people to help solve your problem than waiting on the phone for Apple's support to produce someone who knows the first thing about your computer.

    I'm certainly not anti-Apple, and use some of their products, but for actual personal computers, the appeal just isn't what it used to be.


    Originally posted by GrayGhost79

    Yeah, the only real selling point of Apple has been its OS, but if you want to play PC games... you don't want to use Apple's OS, lol. So in the end, if you buy an Apple for gaming, you end up with an extremely overpriced PC running Windows.

    Yep, pretty much.

    I kind of like Apple Aperture for certain photo work, but again, there's no way in hell I'm paying $1000 more for my computer in order to get access to it.

  • Quizzical Member Legendary, Posts: 25,351

    BFG offered lifetime warranties on their video cards.  The warranty was good for the lifetime of the company.  Which, for people who bought a card earlier this year, ended up being not very long.  Lifetime warranties aren't always all that great.  Even if they were, if a computer part dies after five years, do you really want to replace it with something that is then five years old?

    Companies know a lot more about how reliable their own hardware is than you do, and they offer (or price) their warranties accordingly.  Paying extra for an extended warranty is generally a bad idea.  Insurance really only makes sense for low-probability events that would be devastating if they occur, such as a healthy 40-year-old buying life insurance so his family won't be impoverished if he dies.

  • Quizzical Member Legendary, Posts: 25,351

    Originally posted by Catamount

    I kind of like Apple Aperture for certain photo work, but again, there's no way in hell I'm paying $1000 more for my computer in order to get access to it.

    That's why the Hackintosh was invented.  It was also the premise behind Psystar, though Apple has sued them into oblivion.

    Note that there is no Windows equivalent of a hackintosh.  If you want to assemble a computer from whatever random parts you like, Microsoft will gladly sell you a Windows license for it and do their utmost to make it work.

  • Catamount Member, Posts: 773

    Originally posted by Quizzical

    BFG offered lifetime warranties on their video cards.  The warranty was good for the lifetime of the company.  Which, for people who bought a card earlier this year, ended up being not very long.  Lifetime warranties aren't always all that great.  Even if they were, if a computer part dies after five years, do you really want to replace it with something that is then five years old?

    Companies know a lot more about how reliable their own hardware is than you do, and they offer (or price) their warranties accordingly.  Paying extra for an extended warranty is generally a bad idea.  Insurance really only makes sense for low-probability events that would be devastating if they occur, such as a healthy 40-year-old buying life insurance so his family won't be impoverished if he dies.

    This is very true. Consumer Reports published a study a few years back that overwhelmingly concluded that extended warranties lose the average person far more money than they save.

    It's also something one can figure out through simple reasoning: if companies didn't make more from extended warranties on average than they paid out, they wouldn't offer them in the first place. That same $250 for AppleCare could just as easily be stuffed into a jar (which these days earns about the same interest as a savings account) and spent on the next needed repair.
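    The break-even reasoning above can be sketched as a quick expected-value calculation. The failure probability and repair cost below are illustrative assumptions, not real statistics for any product:

```python
# Expected-value comparison: extended warranty vs. self-insuring.
# All figures here are illustrative assumptions, not real failure data.

warranty_price = 250.0   # e.g. an AppleCare-style 3-year plan
repair_cost = 600.0      # assumed cost of one out-of-warranty repair
failure_prob = 0.15      # assumed chance of a covered failure in 3 years

# If you skip the plan and just pay for repairs as they happen,
# this is what you expect to spend on average.
expected_repair_cost = failure_prob * repair_cost

print(f"Expected repair cost without plan: ${expected_repair_cost:.2f}")
print(f"Warranty price:                    ${warranty_price:.2f}")
# With these numbers the plan costs well over the expected payout,
# which is exactly why vendors can afford to sell it.
```

    Under any assumptions where the plan price exceeds the expected payout, the "money in a jar" approach comes out ahead on average.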


    Originally posted by Quizzical

    That's why the Hackintosh was invented.  It was also the premise behind Psystar, though Apple has sued them into oblivion.

    Note that there is no Windows equivalent of a hackintosh.  If you want to assemble a computer from whatever random parts you like, Microsoft will gladly sell you a Windows license for it and do their utmost to make it work.

    Hypothetically, it's a good idea, but I've found that whereas Windows is basically exactly as you describe it, OS X is not only restricted by design to a narrow set of hardware, but is also incredibly finicky about it; many times it just won't load on a computer unless you specifically buy the narrow range of hardware it was designed for. Even something as simple as an unsupported southbridge, as many are, will end your attempt right there. The last time I tried was on an Asus P5Q series motherboard, and after many failed attempts (just repeated kernel panics), I believe I read that the issue was lack of support for the ICH10 southbridge, though it was some time ago (a short experiment; I don't remember much of what I did in detail).

  • Quizzical Member Legendary, Posts: 25,351

    I'm not saying that a hackintosh is a good idea or a bad idea.  I'm saying that Apple's enormous markup is the reason why hackintoshes exist.  Apple actually tries to make them not work, in order to force people to pay Apple's huge markup to get something that works.

  • nevermore82 Member, Posts: 55

    A lot of good suggestions have been made already, so I'll just give you a few of my golden rules for building a custom PC (if you decide to go in that direction, of course). I've been doing custom builds since I was 14 and made a lot of mistakes (some explosive ones were kind of cool... if you don't mind the smell of burnt PSU and such, lol), so I guess I can at least give advice on what not to do, heh :)

    - Don't buy an SSD unless you plan on getting a new one in a couple of years. I know they're fast, blah blah blah... but except for some (pricey) models you don't get reliable data storage; they all start losing performance quicker than you think, and since they're flash- or RAM-based drives, they suffer the same long-term issues that RAM and flash drives do. An HDD with good speed and a good cache is a much wiser choice for the moment.

    - Nvidia vs. ATI... oh my, this could generate an entire forum of hatred and trolling, so I'll keep it simple and objective. I have an Nvidia SLI build, but I've done a lot of CrossFire builds too. Of the cards currently on the market, I think you're better off choosing ATI; mid-range ATIs are better than their Nvidia counterparts. The new top Nvidia models seem to outperform the top ATI models (it depends on a lot of factors), but with that budget you're clearly going for the mid-range models.

    - Intel vs. AMD... same issue as above. Keeping it "in house" (AMD + ATI) seems logical, but I've done a lot of Intel + ATI builds and they were never outperformed by the AMD + ATI ones. Again, talking mid-range models, I think you could go for more than the Phenom X2; maybe a Phenom X4, no? I did one such build last week with a 1GB 5870 and it ran great, but I also did a 2x5750 CrossFire build with an i7 920 and it was also very good. Honestly, I did not test them intensively enough to state that the Intel was faster than the AMD. What I can state is that they both performed very well in the few games I tested (GRID and Crysis). So after all this, what's the conclusion? Well, I'd focus on other things and then decide the CPU based more on budget than on brand.

    - RAM... buy the best you can. Of course this has to take into account the motherboard you choose, but my advice is to get the best mobo and the best memory you can. To me, the most important bus is in the northbridge, where the memory and GPU transactions occur; if that's clogged, no super CPU can save you. So again: get a good mobo and good memory.

     

    Regarding the Apple suggestion... c'mon, you're going to buy an Apple to put Windows on it? That's like buying a Ferrari and giving it a Fiat Punto engine... or whatever analogy works for you :). As an old Apple user (a really old Apple user), to me it's a disgrace that it can even run Windows... it seriously disgusts me to see people putting Windows on an Apple. And it's because of games that they do it: no DirectX on OS X, so here comes Mr. Windows to the rescue. They should just get a PC instead. An Apple computer these days is nothing more than a fashion statement: look at me, I'm part of the herd, I own an Apple something, I'm hip and cool. Oh well, enough ranting about this, because it's off topic.

    Since I live in Europe and we deal in euros, I didn't provide any examples of the hardware I mentioned, but I can check that Newegg site you all talk about for some ideas.

    Good luck with the shopping :)

    TU2 Closed Beta Testing... looking very good so far :D

  • Catamount Member, Posts: 773

    Originally posted by Quizzical

    I'm not saying that a hackintosh is a good idea or a bad idea.  I'm saying that Apple's enormous markup is the reason why hackintoshes exist.  Apple actually tries to make them not work, in order to force people to pay Apple's huge markup to get something that works.

    I find it curious that I've never heard of any attempts to take a Wine-style approach and just set up a virtual environment, either for the software itself (like Wine) or for the entire OS. I don't personally use it, but Wine is a great program for gaining access to basic Windows software without having to own Windows (I've even managed to play a few older games in it). It would seem far more logical to do it for the Mac OS, which is completely restricted to Apple computers.

  • Quizzical Member Legendary, Posts: 25,351

    While SSDs do lose performance with time, by now, it's well-known which ones lose how much performance.  For example, my own 120 GB OCZ Agility (Indilinx Barefoot) is just under a year old now.  Its sequential reads are as good as new, and sequential writes are actually faster than the day I bought it due to firmware updates.  Random read performance has dropped, but now instead of being 50x as fast as a WD Caviar Black (hard drive), it's "only" 30x as fast.  Random write performance has dropped from about 20x as fast as a Caviar Black to "only" 15x as fast.  (Benchmarks as measured by CrystalDiskMark.)  The advantage over most other hard drives would be larger yet.  It still performs like a good SSD ought to, and is still massively faster than any hard drive that ever has been made or likely ever will be.

    I don't see where you're going with memory there.  Newer Intel architectures don't have a northbridge at all.  Both AMD and Intel integrate the memory controller into the CPU now, though I guess Clarkdale only has it in the CPU package but on a separate die.  Lynnfield even puts a PCI Express controller on the CPU die, and soon both AMD and Intel will put a GPU on the CPU die.

    There isn't any real synergy advantage to going with AMD for both the CPU and the video card.  That will change in one sense with the imminent launch of APUs.  I'd expect AMD to eventually implement a way to have switchable graphics for a system with both an APU and a discrete AMD card, so that the card can be powered down and shut off entirely when integrated graphics are good enough.  That hasn't been announced yet, but is just my speculation--though Nvidia has managed to do something like it with "Optimus" laptops mixing an Nvidia discrete card with integrated Intel graphics.  Surely it would be easier if both the APU and discrete card are from the same vendor and use the same drivers.

  • nevermore82 Member, Posts: 55

    Originally posted by Quizzical

    While SSDs do lose performance with time, by now, it's well-known which ones lose how much performance.  For example, my own 120 GB OCZ Agility (Indilinx Barefoot) is just under a year old now.  Its sequential reads are as good as new, and sequential writes are actually faster than the day I bought it due to firmware updates.  Random read performance has dropped, but now instead of being 50x as fast as a WD Caviar Black (hard drive), it's "only" 30x as fast.  Random write performance has dropped from about 20x as fast as a Caviar Black to "only" 15x as fast.  (Benchmarks as measured by CrystalDiskMark.)  The advantage over most other hard drives would be larger yet.  It still performs like a good SSD ought to, and is still massively faster than any hard drive that ever has been made or likely ever will be.

    I don't see where you're going with memory there.  Newer Intel architectures don't have a northbridge at all.  Both AMD and Intel integrate the memory controller into the CPU now, though I guess Clarkdale only has it in the CPU package but on a separate die.  Lynnfield even puts a PCI Express controller on the CPU die, and soon both AMD and Intel will put a GPU on the CPU die.

    There isn't any real synergy advantage to going with AMD for both the CPU and the video card.  That will change in one sense with the imminent launch of APUs.  I'd expect AMD to eventually implement a way to have switchable graphics for a system with both an APU and a discrete AMD card, so that the card can be powered down and shut off entirely when integrated graphics are good enough.  That hasn't been announced yet, but is just my speculation--though Nvidia has managed to do something like it with "Optimus" laptops mixing an Nvidia discrete card with integrated Intel graphics.  Surely it would be easier if both the APU and discrete card are from the same vendor and use the same drivers.

    Since you felt the need to be obnoxious and show off your knowledge, I'm going to reply with some cold, hard facts.

    In 2 years you'll not only suffer performance losses but also data losses in that SSD drive. So I stick with my opinion that it's still a bad move. And if you doubt it will happen, I can guarantee that it will; it's the problem with using flash-drive technology... you have nothing more than an oversized pen drive. I did a lot of lab tests with those memory chips and believe me, they will fail. I know you've read that it has 1.5 million hours MTBF but in reality it won't get anywhere near that.

    But of course those numbers do sound impressive; the problem is that they're raw data-transfer numbers. A lot of extensive testing has already shown that at best an SSD can boost system performance by up to 50%. OK, still an impressive number, I give you that, but it's still too much money for a pen drive on steroids.

    The northbridge still exists, it's just in a different place... and he might not buy one of the new chipset boards and CPUs (he might find a good deal on a quad core, for example... I can still get my hands on quad cores... but maybe you're so evolved that the minute the Nehalem family came out, you threw all your quads and duals in the trash), so what I said makes complete sense. Memory is a critical component, a lot more important than having an SSD, and even more so if he's thinking of doing a bit of overclocking later on: he has to keep good ratios, good voltages, etc., and that all comes down to how good a mobo and memory sticks you have. Even having the memory controller on the CPU changes nothing... where do you think the memory sticks are placed? On the CPU? Information has to go from the memory banks to the controller in the CPU; do you think that path is the same on every motherboard? It's not.

    That's my point with memory and motherboards, quality is essential.

    Intel already puts a GPU in the CPU die; I'm using one such processor right now. It's rubbish; they don't have the capability to put a good GPU on the CPU die... yet. Neither Intel nor AMD does, so that won't happen for a long time in performance-oriented machines. If you're looking at power consumption (important in laptops), then yes, those kinds of solutions are already on the market and more will keep appearing.

    Oh, and there is indeed some synergy between AMD and ATI: you can't use OverDrive to its full capabilities unless you have both. Also, the instruction set with AMD+ATI is different from Intel+ATI; whether developers even exploit that, I don't know... it's just a fact that Intel and AMD have different instruction sets. I've only coded for RISC processors, so I can't delve any further into that subject. But it seems a bit of a waste not to take advantage of your own hardware combined... then again, Fusion is coming, so I guess there's no point in dwelling on the current architecture anymore.

    Again, don't buy an SSD, no matter what amazing performance numbers people show you. I'll be laughing 2 years from now when regions of the drive simply... disappear :D ... gotta love flash chips :D. Of course it's attractive and everyone is pushing it as the hottest thing out there; it's not. Focus your attention and money on other components.

    Building on a budget means you can either get all mid-range parts, or save on one component to buy something a little better elsewhere, i.e. save money buying an HDD (my build plays every game to date with every setting maxed and I only have a poor little slow VelociRaptor... poor me without an SSD :D ) and get a better graphics card, better memory, or a better board.

    TU2 Closed Beta Testing... looking very good so far :D

  • Quizzical Member Legendary, Posts: 25,351

    "In 2 years you'll not only suffer performance losses but also data losses in that SSD drive."

    Uh huh.  And what makes you so certain of that?  SSD manufacturers selling expensive products that will reliably die before the warranty expires doesn't sound like a good way to make money.

    For that matter, SSDs have been around for more than two years.  Even the first good one, Intel's first generation X25-M, is over two years old now.  Where's the rash of old SSD failures?

    "I know you've read that it has 1.5 million hours MTBF but in reality it won't get anywhere near that."

    I'm well aware that the mean time between failures numbers that get tossed around are phony.  The grade of NAND flash used in SSDs is expected to last about 10 years before it loses its ability to hold a charge.  Alternatively, 10,000 writes for each cell, but that should take a lot longer than 10 years.
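    Rough arithmetic behind the "10,000 writes per cell should take a lot longer than 10 years" claim. This is an idealized sketch (perfect wear leveling, no write amplification), and the drive size and daily write volume are illustrative assumptions:

```python
# Back-of-envelope NAND wear-out estimate. Idealized: assumes perfect
# wear leveling and no write amplification. Inputs are illustrative.

drive_gb = 120            # drive capacity, GB (e.g. a 120 GB SSD)
cycles_per_cell = 10_000  # rated program/erase cycles for this flash grade
writes_per_day_gb = 20    # assumed host writes per day, GB

# Total data the drive can absorb before cells wear out.
total_lifetime_writes_gb = drive_gb * cycles_per_cell

# How long that lasts at the assumed daily write rate.
years = total_lifetime_writes_gb / writes_per_day_gb / 365

print(f"Estimated wear-out time: {years:.0f} years")
```

    Even with these conservative assumptions, write endurance comes out at well over a century, so charge retention (the ~10-year figure) is the binding limit, not write cycles.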

    "a lot of extensive testing has already shown that at best an SSD can boost system performance by up to 50%."

    In things that are storage-bound, the real world speed boost is a lot better than 50%.  My parents have a computer very similar to my own, except with a WD Caviar Black instead of an SSD, and in things that are storage bound, mine is a whole lot faster than theirs.

    Throwing out a 50% figure, or any other percentage, is intrinsically ridiculous, though.  50% faster at what?  The SSD speed boost varies wildly from one purpose to another.  In some things, it makes no difference at all.  In others, a 500% speed boost would be an underestimate.
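    The "50% faster at what?" point can be put in Amdahl's-law terms: the system-level gain depends entirely on how storage-bound the workload is. The fractions and the 30x storage speedup below are illustrative assumptions:

```python
# Amdahl's-law view of an SSD upgrade: only the storage-bound fraction
# of a workload speeds up, so the overall gain varies wildly by task.

def overall_speedup(storage_fraction: float, storage_speedup: float) -> float:
    """System speedup when only the storage-bound fraction accelerates."""
    return 1 / ((1 - storage_fraction) + storage_fraction / storage_speedup)

# Illustrative workloads: barely storage-bound, mixed, heavily storage-bound.
for frac in (0.05, 0.5, 0.9):
    s = overall_speedup(frac, 30)  # e.g. ~30x faster random reads
    print(f"storage-bound {frac:.0%}: {s:.2f}x overall")
```

    A workload that spends 5% of its time on storage barely notices the SSD, while one that spends 90% waiting on disk sees a several-fold improvement, which is exactly why a single percentage figure is meaningless.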

    "Northbridge still exists, it's just in a different place"

    The northbridge functionality still exists, but there's no longer a separate northbridge chip that you can point to on Nehalem/Westmere motherboards.  The Front Side Bus is dead, though.  Intel and Nvidia are suing each other over it.  HyperTransport, Quick Path Interconnect, and Direct Media Interface aren't nearly as big of bottlenecks as the FSB was, either.

    "That's my point with memory and motherboards, quality is essential."

    Well yes, good quality is critical.  Most important for power supplies, then motherboards, then some other components as well.

    How do you tell what good quality memory is, though?  You don't know what chips are inside it.  You can look at the rated specs, and I guess some companies have reputations for being more conservative with their specs than others.

    Really, though, memory chip speeds have outpaced the ability of processors to make use of the extra bandwidth.  1600 MHz DDR3 is pretty cheap, but most CPU memory controllers simply don't support that speed because it would offer no real advantage over 1333 MHz.  Now, memory speed may become more important with Sandy Bridge, and will become more important with Llano, but we're not there yet.

    "Intel already puts a GPU in the CPU die; I'm using one such processor right now."

    No.  Clarkdale/Arrandale has two separate dies:  one on a 32 nm process for the CPU, and one on a 45 nm process for the GPU and memory controller.  They're in the same package, but they're two separate physical dies.  I think Atom is two separate dies like that, too, but I'm not sure.

    "It's rubbish; they don't have the capability to put a good GPU on the CPU die... yet."

    The problem is that Intel doesn't have the capability to make a good GPU at all, whether integrated or discrete.  There are good reasons why the i740 didn't have a successor and Larrabee was canceled.

    "Neither Intel nor AMD does, so that won't happen for a long time in performance-oriented machines."

    AMD, on the other hand, does have the capability to make good graphics.  Bobcat basically has a cut-and-paste of Cedar.  That's not a gaming chip, but it's good enough for everything else.  Llano will have roughly Redwood-class graphics, and that is good enough for everything, including lower-midrange gaming.

    Now, TDP and die size are reasons to put the CPU and GPU on separate dies and have a discrete video card for higher end products.  But for someone who wants a cheap gaming system, Llano will work just fine.  And it will be great for gaming laptops.

    My point above wasn't that a gaming machine would want an APU only.  My point was that when not gaming, the graphics on the APU could be good enough.  It would be nice if my discrete card could mostly shut down and run off the graphics in an APU and save 30 W or so at idle.

    "save money buying an HDD (my build plays every game to date with every setting maxed and I only have a poor little slow VelociRaptor... poor me without an SSD :D )"

    Wait, you want to buy a VelociRaptor to save money?  If you're going to pay SSD prices, you might as well get SSD performance.
