An All-In-One Gaming Machine?

Slapshot1188 Member Legendary Posts: 16,982
OK. My 14-year-old is in need of a new PC. It will likely be a Christmas present, so I have time, but I started researching now. One option that caught my eye was this all-in-one from Lenovo:

http://www.computershopper.com/desktops/reviews/lenovo-ideacentre-aio-y910

I have seen it for sale with a 1080 for $1,599.

I priced out a home-built system and I am getting to over $1,600 with a 1070 before I even add a monitor (sticking with Nvidia).

So my question is... since this AIO is more like a desktop, with easy access to upgradeable parts such as the CPU, memory, and storage, and it uses desktop parts for the CPU and video card... what would you think the downside is?

The only one I can think of is that if the screen goes for some reason, the whole PC is shot. Thoughts?

All time classic  MY NEW FAVORITE POST!  (Keep laying those bricks)

"I should point out that no other company has shipped out a beta on a disc before this." - Official Mortal Online Lead Community Moderator

Proudly wearing the Harbinger badge since Dec 23, 2017. 

Coined the phrase "Role-Playing a Development Team" January 2018

"Oddly Slap is the main reason I stay in these forums." - Mystichaze April 9th 2018

Comments

  • Ridelynn Member Epic Posts: 7,383
    From my experience with AIOs:

    Most are glorified laptops, meaning extremely limited upgradeability, and prone to heat issues if you aren't careful.

    The Lenovo you have linked may happen to have some upgradeability, but I'd still think about it just like I would a laptop.

    A lot of people are fine gaming on laptops; some even prefer it due to the portability. Here, you're getting most of the downsides and not getting the one great upside of a laptop.

    That's my opinion about AIOs, at least insofar as it relates to gaming.
  • Ridelynn Member Epic Posts: 7,383
    Also, $1600 should be getting you a good build with an i5 and 1080, maybe even a 1080 Ti and/or an i7, without cutting any serious corners, if you are pricing out your own build and being conscious about the parts you're choosing.
  • Nyghthowler Member Uncommon Posts: 392
    Ridelynn pretty much covered it. 
    I've worked with them in the past, and despite the manufacturers' claims, they are terrible for gaming. I won't own one.

  • Slapshot1188 Member Legendary Posts: 16,982
    I had a previous one (well, my kid did), and yes, it was a glorified laptop. This one, however, doesn't use mobile parts. It uses an actual desktop CPU/GPU/etc.

    Here is my quick pcpartpicker.com link. I didn't research specifics but just wanted a general idea of what it will run (got it down a bit from my first pass):
    https://pcpartpicker.com/list/RhqB3F

    Note I'd have to add a monitor on top of that...

  • Quizzical Member Legendary Posts: 25,348
    edited July 2017
    If you want to upgrade in the future, you're going to be very sharply power limited.  In particular, if you want to get a high end video card in the future, the power supply and/or cooling probably won't be able to handle it.  If you're looking to spend $1600 on a gaming rig now, you're probably the sort of person who might want a high end video card at some point.

    It's probable that the motherboard, power supply, and some other components aren't exactly of the caliber that you'd build in your own rig.

    There's also the issue of monitors. You should be able to move a monitor from one computer to another, or get a new monitor without having to get a new computer. With an all-in-one, you can't. I'm not sure if it's even possible to use multiple monitors with that all-in-one; with an ordinary desktop, it's trivial to connect more.

    From the review, it looks like the CPU and GPU cooling really isn't very good. In particular, even its low-power CPU hit 93 C under a gaming load. Some other game or program might happen to push the CPU considerably harder, and then bad things happen.

    Ridelynn is right:  it's better to think of an all-in-one as a larger laptop, not a smaller desktop.
  • Quizzical Member Legendary Posts: 25,348
    Slapshot1188 said:
    I had a previous one (well, my kid did), and yes, it was a glorified laptop. This one, however, doesn't use mobile parts. It uses an actual desktop CPU/GPU/etc.

    Here is my quick pcpartpicker.com link. I didn't research specifics but just wanted a general idea of what it will run (got it down a bit from my first pass):
    https://pcpartpicker.com/list/RhqB3F

    Note I'd have to add a monitor on top of that...

    Most of those parts that you list in that link are better than what you'd get in the all-in-one.  The CPU, CPU cooler, motherboard, memory, and power supply almost certainly are, and likely the video card and case, too.

    For example, you could save a lot of money on the power supply by getting this instead:

    https://www.newegg.com/Product/Product.aspx?Item=N82E16817151136

    And you'd still have a far superior power supply to what comes in that all-in-one, certainly by wattage and probably by quality, too.
  • Slapshot1188 Member Legendary Posts: 16,982
    Thanks!

    I guess I'll focus on watching for the individual parts to come on sale and build it.
    I always like that, and it's a learning experience for a 14-year-old... I just fear the possibility of a black screen on bootup and the hours and hours of trying to figure out whether it was something I did or a bad part.

  • Cleffy Member Rare Posts: 6,412
    Don't build a PC right now at current GPU prices unless you are getting a GTX 1080. All-in-ones are also poor choices unless one is specifically needed: you get a mobile GPU, and you cannot upgrade individual components. Lenovo typically SUCKS.
    The problem with all-in-ones is that the monitor and other peripherals typically last much longer than the computer. You can typically run the same monitor, speakers, keyboard, mouse, and microphone across 2~3 towers. That means in 3~4 years, when you need to upgrade, you will need to repurchase those along with the computer.
  • Dakeru Member Epic Posts: 3,802
    I have to say, Slap, you are quite generous at Christmas.

    Do you want to be my dad? lol
    Harbinger of Fools
  • Wylf Member Uncommon Posts: 376
    Slapshot1188 said:
    I had a previous one (well, my kid did), and yes, it was a glorified laptop. This one, however, doesn't use mobile parts. It uses an actual desktop CPU/GPU/etc.

    Here is my quick pcpartpicker.com link. I didn't research specifics but just wanted a general idea of what it will run (got it down a bit from my first pass):
    https://pcpartpicker.com/list/RhqB3F

    Note I'd have to add a monitor on top of that...

    Personally, I would never game on an AIO. That pcpartpicker.com link is a great resource. Check out the video builds and spend a little bit of time exploring the site. Every part of the computer is covered and debated. My experience is that most forum dwellers on pcpartpicker.com are very helpful. I would be very surprised if you couldn't put together a far better build on your own, and cheaper. Also, the way things are, you ought to build it with your 14-year-old. A great experience.


  • Grunty Member Epic Posts: 8,657
    edited July 2017
    NO! I can't make it any clearer. AIOs have absolutely no upgradeability. Their PSUs are suspect. Their cooling capability is little better than a laptop's. Their screens can be problematic, especially Dell's. If one thing breaks after warranty...

    Speaking as a former Dell warranty tech: never get an AIO. I would rather have worked on a hardened laptop.
    "I used to think the worst thing in life was to be all alone.  It's not.  The worst thing in life is to end up with people who make you feel all alone."  Robin Williams
  • waynejr2 Member Epic Posts: 7,769
    Grunty said:
    NO! I can't make it any clearer. AIOs have absolutely no upgradeability. Their PSUs are suspect. Their cooling capability is little better than a laptop's. Their screens can be problematic, especially Dell's. If one thing breaks after warranty...

    Speaking as a former Dell warranty tech: never get an AIO. I would rather have worked on a hardened laptop.

    This is just common sense! 
    http://www.youhaventlived.com/qblog/2010/QBlog190810A.html  

    Epic Music:   https://www.youtube.com/watch?v=vAigCvelkhQ&list=PLo9FRw1AkDuQLEz7Gvvaz3ideB2NpFtT1

    https://archive.org/details/softwarelibrary_msdos?&sort=-downloads&page=1

    Kyleran:  "Now there's the real trick, learning to accept and enjoy a game for what it offers rather than pass on what might be a great playing experience because it lacks a few features you prefer."

    John Henry Newman: "A man would do nothing if he waited until he could do it so well that no one could find fault."

    FreddyNoNose:  "A good game needs no defense; a bad game has no defense." "Easily digested content is just as easily forgotten."

    LacedOpium: "So the question that begs to be asked is, if you are not interested in the game mechanics that define the MMORPG genre, then why are you playing an MMORPG?"
  • Quizzical Member Legendary Posts: 25,348
    edited July 2017
    All-in-ones do have a legitimate place in the world.  They're basically for people with very strict space requirements and not so much in the way of performance or longevity requirements.  That's not most gamers.
  • Hrimnir Member Rare Posts: 2,415
    As Quiz said, all-in-ones absolutely have a place. They're sleek, generally stylish, take up very little desk space, etc. They're perfect for people who need something that will just do typing work, web browsing, or media viewing. However, one thing they simply are not, nor will likely ever be, is gaming rigs.

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • Hrimnir Member Rare Posts: 2,415
    Slapshot1188 said:
    Thanks!

    I guess I'll focus on watching for the individual parts to come on sale and build it.
    I always like that, and it's a learning experience for a 14-year-old... I just fear the possibility of a black screen on bootup and the hours and hours of trying to figure out whether it was something I did or a bad part.


    I was gonna mention, right now GPU prices are heavily inflated due to a sharp increase in demand from bitcoin miners. That should hopefully die down within a few months, and by Christmas you will probably have a little more to work with in the budget.

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • Quizzical Member Legendary Posts: 25,348
    Hrimnir said:
    Slapshot1188 said:
    Thanks!

    I guess I'll focus on watching for the individual parts to come on sale and build it.
    I always like that, and it's a learning experience for a 14-year-old... I just fear the possibility of a black screen on bootup and the hours and hours of trying to figure out whether it was something I did or a bad part.

    I was gonna mention, right now GPU prices are heavily inflated due to a sharp increase in demand from bitcoin miners. That should hopefully die down within a few months, and by Christmas you will probably have a little more to work with in the budget.
    The issue right now is Ethereum, not bitcoin. And it's only affecting certain GPUs that happen to be good at Ethereum mining. In another thread, someone said that the GTX 1080 is being ignored because it's slower at Ethereum mining than the Radeon RX 470/480/570/580 or even the GTX 1070. I'm not sure why it would be slower than a GTX 1070; memory latency on GDDR5X is the only thing I can think of that's even plausible.
  • Hrimnir Member Rare Posts: 2,415
    Quizzical said:
    Hrimnir said:
    Slapshot1188 said:
    Thanks!

    I guess I'll focus on watching for the individual parts to come on sale and build it.
    I always like that, and it's a learning experience for a 14-year-old... I just fear the possibility of a black screen on bootup and the hours and hours of trying to figure out whether it was something I did or a bad part.

    I was gonna mention, right now GPU prices are heavily inflated due to a sharp increase in demand from bitcoin miners. That should hopefully die down within a few months, and by Christmas you will probably have a little more to work with in the budget.
    The issue right now is Ethereum, not bitcoin. And it's only affecting certain GPUs that happen to be good at Ethereum mining. In another thread, someone said that the GTX 1080 is being ignored because it's slower at Ethereum mining than the Radeon RX 470/480/570/580 or even the GTX 1070. I'm not sure why it would be slower than a GTX 1070; memory latency on GDDR5X is the only thing I can think of that's even plausible.


    That is interesting; I was told it was bitcoin, and I'm not even familiar with Ethereum personally. However, yeah, it doesn't make sense that a 1080 would be worse than a 1070, for example. My understanding of that sort of workload (assuming Ethereum is similar to bitcoin) is that it's not especially memory-intensive. So, who knows.

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • Quizzical Member Legendary Posts: 25,348
    edited July 2017
    Hrimnir said:
    Quizzical said:
    Hrimnir said:
    Slapshot1188 said:
    Thanks!

    I guess I'll focus on watching for the individual parts to come on sale and build it.
    I always like that, and it's a learning experience for a 14-year-old... I just fear the possibility of a black screen on bootup and the hours and hours of trying to figure out whether it was something I did or a bad part.

    I was gonna mention, right now GPU prices are heavily inflated due to a sharp increase in demand from bitcoin miners. That should hopefully die down within a few months, and by Christmas you will probably have a little more to work with in the budget.
    The issue right now is Ethereum, not bitcoin. And it's only affecting certain GPUs that happen to be good at Ethereum mining. In another thread, someone said that the GTX 1080 is being ignored because it's slower at Ethereum mining than the Radeon RX 470/480/570/580 or even the GTX 1070. I'm not sure why it would be slower than a GTX 1070; memory latency on GDDR5X is the only thing I can think of that's even plausible.

    That is interesting; I was told it was bitcoin, and I'm not even familiar with Ethereum personally. However, yeah, it doesn't make sense that a 1080 would be worse than a 1070, for example. My understanding of that sort of workload (assuming Ethereum is similar to bitcoin) is that it's not especially memory-intensive. So, who knows.

    Bitcoin is just some variant of SHA-2 (double SHA-256, specifically). That takes only trivial amounts of memory bandwidth. If you build a custom ASIC to do bitcoin mining, it doesn't even take very much die space to lay out an FPGA-like pipelined version that gives one output per clock. So it's possible to build such an ASIC that absolutely blows away everything except other custom ASICs at bitcoin mining. I mean a difference of something like 100x a high-end GPU, not just a little faster. Once those ASICs exist--and they have for several years now--there's no sense in mining bitcoins with anything else.
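
    For concreteness, bitcoin's proof-of-work hash is SHA-256 applied twice over the 80-byte block header. A minimal Python sketch (the header field values below are made up purely for illustration) shows how little state is involved: no big tables, no memory lookups, which is exactly why a tiny pipelined ASIC circuit runs it so fast.

        import hashlib
        import struct

        def bitcoin_pow_hash(header: bytes) -> bytes:
            # Bitcoin's proof of work: SHA-256 applied twice ("SHA-256d").
            # No large tables and no random memory lookups, just arithmetic,
            # so a pipelined ASIC can emit one hash per clock cycle.
            return hashlib.sha256(hashlib.sha256(header).digest()).digest()

        # A made-up 80-byte header: version, previous block hash, merkle root,
        # timestamp, difficulty bits, nonce. Values are purely illustrative.
        header = struct.pack("<I32s32sIII", 2, b"\x00" * 32, b"\x11" * 32,
                             1500000000, 0x1D00FFFF, 42)
        print(bitcoin_pow_hash(header)[::-1].hex())  # big-endian, as explorers display it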

    That's generally considered a major flaw of bitcoin.  Rather than coins being mined by millions of members of the general public, they're only mined in any meaningful volume by the handful of people with a bitcoin mining ASIC.  No one else would make enough money by mining bitcoins to cover their power costs.  I read one article a while ago that said that a majority of the entire world's bitcoin mining capability is owned by one person in China.

    Ethereum was created with the explicit goal of avoiding that problem.  Here's their hashing function:

    https://github.com/ethereum/wiki/wiki/Ethash

    The hashing function is dominated by random lookups into a table somewhat larger than 1 GB. That's large enough that no processor I'm aware of (interpreting "processor" loosely to include CPUs, GPUs, FPGAs, custom ASICs, etc.) can cache it on die, so the work comes down to random-access reads from off-die memory.
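
    To make that concrete, here is a toy Python sketch of a memory-hard mixing loop in the same spirit. This is not real Ethash (the actual algorithm mixes with FNV hashes over a dataset that starts above 1 GB and grows); the table size, constants, and function name are invented for illustration.

        import hashlib

        TABLE_WORDS = 1 << 18  # toy size; real Ethash uses a dataset over 1 GB
        # Stand-in dataset (real Ethash generates its DAG from an epoch seed).
        table = [(i * 2654435761) & 0xFFFFFFFF for i in range(TABLE_WORDS)]

        def toy_memory_hard_hash(nonce: int, rounds: int = 64) -> bytes:
            # Seed the mix from the nonce.
            mix = int.from_bytes(
                hashlib.sha256(nonce.to_bytes(8, "little")).digest()[:4], "little")
            for _ in range(rounds):
                # Each index depends on the previous lookup, so the 64 accesses
                # are serially dependent and effectively random: caches can't
                # help, and off-die memory bandwidth dominates the runtime.
                mix = (mix * 33 + table[mix % TABLE_WORDS]) & 0xFFFFFFFF
            return hashlib.sha256(mix.to_bytes(4, "little")).digest()

        print(toy_memory_hard_hash(42).hex())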

    GDDR5 memory requires 128-byte alignment, so any access is exactly as expensive as reading in the full 128 bytes, even if you only grab 4 bytes and ignore the rest of the cache line. One hash takes 64 accesses, so 8 KB worth of reads. Some simple arithmetic gives you that a GPU with 256 GB/s of memory bandwidth could do up to 256/8 = 32 million hashes per second. Or rather, a little less than that, because 8 KB = 8192 bytes, not 8000 bytes. You can't actually exhaust a GPU's theoretical memory bandwidth, and there is a little bit of other work in occasionally setting up the table, but it's probably not a coincidence that the Radeon RX 480 and RX 570 and the GeForce GTX 1070, all rated at 256 GB/s, all score around 25 million hashes per second, not that far under the theoretical cap.
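
    That back-of-the-envelope bound is easy to check in code. A quick sketch using the figures from the paragraph above (the GTX 1080's 320 GB/s is its paper spec; treat these as ceilings, not predictions, since the 1080 notably underperforms its bound in practice):

        BYTES_PER_ACCESS = 128   # GDDR5 makes you pay for a full 128-byte line
        ACCESSES_PER_HASH = 64   # each Ethash hash does 64 lookups
        BYTES_PER_HASH = BYTES_PER_ACCESS * ACCESSES_PER_HASH  # 8192 bytes

        for name, bandwidth_gb_s in [("RX 480", 256), ("RX 570", 256),
                                     ("GTX 1070", 256), ("GTX 1080", 320)]:
            ceiling_mh_s = bandwidth_gb_s * 1e9 / BYTES_PER_HASH / 1e6
            print(f"{name}: at most {ceiling_mh_s:.1f} MH/s from {bandwidth_gb_s} GB/s")
        # The 256 GB/s cards cap out near 31.3 MH/s, which is why ~25 MH/s in
        # practice is "not that far under the theoretical cap."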

    GPUs, like CPUs, have a variety of caches to try to avoid using excessive memory bandwidth.  Ethash is explicitly designed to break the functionality of those caches and make you lean heavily on global memory bandwidth.

    Ever since Fermi, Nvidia's GPUs have had more sophisticated memory controllers than AMD's.  They do that to try to prevent the memory controller from choking on realistic workloads because the memory accesses don't access all of the physical memory chips evenly.  If you're inclined to do so, it's pretty trivial to design an algorithm that will make all memory accesses on an AMD GPU hit the same memory channel, and then performance completely chokes.  A milder version of that does happen sometimes in real software, too.  Nvidia's more sophisticated memory controller means that it's nearly immune to that particular problem, as you'd basically have to reverse-engineer a large chunk of their memory controller to come up with an access pattern that makes everything go to the same memory channel.
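
    As a toy illustration of that failure mode, suppose (hypothetically) a GPU interleaves addresses across channels in fixed 256-byte chunks, so that channel = (address // 256) % num_channels. Any access stride that is a multiple of 256 * num_channels bytes then lands every access on one channel. The chunk size and mapping below are assumptions for illustration, not AMD's actual scheme:

        CHANNEL_BYTES = 256   # assumed interleave granularity (illustrative)
        NUM_CHANNELS = 8      # assumed channel count (illustrative)

        def channel(addr: int) -> int:
            # Simple modulo interleaving; an Nvidia-style controller instead
            # hashes the address bits, which breaks up patterns like this one.
            return (addr // CHANNEL_BYTES) % NUM_CHANNELS

        bad_stride = CHANNEL_BYTES * NUM_CHANNELS   # pathological: 2048 bytes
        print({channel(i * bad_stride) for i in range(16)})   # {0}: one channel takes every access

        good_stride = CHANNEL_BYTES                 # friendly: 256 bytes
        print({channel(i * good_stride) for i in range(16)})  # all 8 channels used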

    But the extra complexity means that some Nvidia GPUs choke if you ask them to do random accesses to a large enough memory buffer.  I'm not sure why that happens.  The article that Cleffy linked only lists Pascal GPUs among Nvidia options.  Had they tested older Fermi, Kepler, and Maxwell GPUs, I'd be surprised if they didn't see a lot of them choking with far lower performance than you'd expect from the paper specs, like the GTX 1080 does.  I also wouldn't be surprised if the GTX 1060 and GTX 1070 similarly choked if you made the dataset larger--and the Ethereum dataset is designed to grow over time.

    AMD has talked up the HBM2 on Vega constituting a "high bandwidth cache".  I'm not sure exactly what that means, but I've interpreted it as meaning that Vega is moving to a more sophisticated memory controller like what Nvidia GPUs have had for several years now.  I don't know if that will mean that Vega chokes like a GTX 1080 and some other Nvidia GPUs on random accesses to large datasets.  But it wouldn't surprise me if it does.