
GW2 Mac Client Available NOW!

Comments

  • botrytisbotrytis In Flux, MIPosts: 2,567Member
    Originally posted by Fendel84M
    Smart move IMO gives the wow Mac players another choice without needing windows.

    No good deed goes unpunished.

     

    Apple really doesn't want to support gaming, as shown by the video cards they put in their computers.


    "In 50 years, when I talk to my grandchildren about these days, I'll make sure to mention what an accomplished MMO player I was. They are going to be so proud ..."
    by Naqaj - 7/17/2013 MMORPG.com forum

  • RocketeerRocketeer NachrodtPosts: 1,304Member
    Originally posted by botrytis
    Originally posted by Fendel84M
    Smart move IMO gives the wow Mac players another choice without needing windows.

    No good deed goes unpunished.

     

    Apple really doesn't want to support gaming as it shows by the video cards they have in their PCs.

    Never had any issue with mine. I can run GW2 with everything turned to max at 2560x1440 resolution; what more do you need? I have yet to find a game that won't run well on my 27" 2011 iMac. The 6970M they use is a fairly powerful card, IMHO.

  • IcewhiteIcewhite Elmhurst, ILPosts: 6,403Member
    Originally posted by Naqaj
    Originally posted by Lord.Bachus

    With MAC OS X being superior to Windows in every possible way, its just plain stupid to have to run a duall boot on a macbook pro.

    You did that on purpose, didn't you? Really not helping ...

    Psssst, here on fanboycentral.com, someone's always going to start.

    But I haven't heard old Mac-n-PC since the 80s.  It's heartwarming, in a strange way.

    Self-pity imprisons us in the walls of our own self-absorption. The whole world shrinks down to the size of our problem, and the more we dwell on it, the smaller we are and the larger the problem seems to grow.

  • QuizzicalQuizzical Posts: 14,779Member Uncommon
    Originally posted by Rocketeer
    Originally posted by botrytis
    Originally posted by Fendel84M
    Smart move IMO gives the wow Mac players another choice without needing windows.

    No good deed goes unpunished.

     

    Apple really doesn't want to support gaming as it shows by the video cards they have in their PCs.

    Never had any issue with mine. I can run GW2 with everything turned to max at 2560x1440 resolution, what more do you need? I have yet to find a game that won't run well on my 27" 2011 IMac. The 6970M they use is a fairly powerful card imho.

    Never had any issues... except, of course, for the major issues that you describe above.

    By performance, a Radeon HD 6970M is roughly a desktop Radeon HD 6850.  In the latest generation, that's roughly a Radeon HD 7770, which is a $120 card.  When you bought the 6970M, the 6850 was probably a $150-$180 card.  That's certainly capable of handling games, but you're getting roughly the performance of a $800-$1000 (excluding peripherals) gaming PC, and surely paid a lot more than that for it.  Though I guess a fair chunk of the price tag is the large monitor.

    -----

    As for your replies above that I won't quote for the sake of brevity:

    1)  Logitech may have made the keyboard, but what about the monitor?  Adjusting monitor brightness from a keyboard isn't a standard monitor function.  I still say it's almost certainly a driver issue.

    2)  If it's only intermittently loud, then the issue I brought up surely isn't your problem.  If it were, the fan would be going full blast all of the time.

    3)  Is the Windows side of Boot Camp somehow fundamentally different from wiping the hard drive and doing a clean install of Windows?  Or would the latter not have the proper drivers for an iMac?

    Laptop drivers can be finicky, especially if you're using some sort of discrete switchable graphics.  I don't know if iMacs go that route.

    4)  Perhaps the bigger question isn't why ArenaNet didn't recode everything in OpenGL, but rather, why Turbine did.  If your DirectX implementation relies heavily on features that aren't available in OpenGL (or are only available in newer versions of OpenGL that Apple doesn't support), then porting it from DirectX to OpenGL is impractical.  Sometimes it would mean you lose a couple of features that you can live without, but sometimes it would mean losing features that are so central to the game that there's no point in bothering.

    5)  Mac OS X is a much bigger market than Linux.  On the other hand, people who buy a video card for a Linux machine will consider the brand and drivers, while that isn't even an option if buying from Apple, so it's not clear that high quality Mac OS X drivers would increase sales for AMD or Nvidia.

    Regardless of whether drivers would be better or worse if written by AMD or Nvidia rather than Apple, by not providing modern drivers, Apple is making it much harder than necessary to port games to Mac OS X.  By not letting you use standard Windows drivers, Apple is making it much harder than necessary to run games through Boot Camp.  If you have trouble playing games on your iMac, you should point your finger at Apple first, before taking game developers to task.

  • RocketeerRocketeer NachrodtPosts: 1,304Member
    Originally posted by Quizzical
    Originally posted by Rocketeer
    Originally posted by botrytis
    Originally posted by Fendel84M
    Smart move IMO gives the wow Mac players another choice without needing windows.

    No good deed goes unpunished.

     

    Apple really doesn't want to support gaming as it shows by the video cards they have in their PCs.

    Never had any issue with mine. I can run GW2 with everything turned to max at 2560x1440 resolution, what more do you need? I have yet to find a game that won't run well on my 27" 2011 IMac. The 6970M they use is a fairly powerful card imho.

    Never had any issues... except, of course, for the major issues that you describe above.

    By performance, a Radeon HD 6970M is roughly a desktop Radeon HD 6850.  In the latest generation, that's roughly a Radeon HD 7770, which is a $120 card.  When you bought the 6970M, the 6850 was probably a $150-$180 card.  That's certainly capable of handling games, but you're getting roughly the performance of a $800-$1000 (excluding peripherals) gaming PC, and surely paid a lot more than that for it.  Though I guess a fair chunk of the price tag is the large monitor.

    Well, obviously I meant I never had an issue in the context of the capabilities of the graphics card. I didn't feel the need to reiterate the issues I do have with the system, as they are in the same thread and I assume people read the posts in order :D. And yes, the display makes up about $900-1k of the system's price tag. Back then (mid 2011) that's just what you paid for a high-quality LED-backlit display in 27" size and at that resolution.

    -----

    As for your replies above that I won't quote for the sake of brevity:

    1)  Logitech may have made the keyboard, but what about the monitor?  Adjusting monitor brightness from a keyboard isn't a standard monitor function.  I still say it's almost certainly a driver issue.

    I don't want to get technical, but changing brightness is an OS-controlled option to me. There is a slider in the energy settings for it, and the system needs it to dim the monitor for inactivity etc. What I want is to tell Windows "if I press F1, dim the screen by 5%". That's apparently not possible. That's not a driver issue to me; I want to assign a keypress to a system function, the same way you can assign keys to change the volume level. Again, if you know a workaround I'd be happy to hear it. I don't see the difference between changing volume and changing brightness.
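    The feature being asked for can be sketched like this (a toy illustration only: the key names, `set_brightness`, and the dispatch are hypothetical stand-ins, not real Windows APIs):

```python
# Sketch of "bind a keypress to the brightness the OS slider controls".
# BrightnessControl and its key bindings are invented for illustration;
# a real implementation would need OS-specific driver/API support.

def clamp(value, low=0, high=100):
    """Keep a brightness percentage inside the slider's 0-100 range."""
    return max(low, min(high, value))

class BrightnessControl:
    def __init__(self, level=50):
        self.level = level  # percent, mirroring the energy-settings slider

    def set_brightness(self, level):
        self.level = clamp(level)

    def handle_key(self, key):
        # "If I press F1, dim the screen by 5%" -- and F2 to brighten.
        deltas = {"F1": -5, "F2": +5}
        if key in deltas:
            self.set_brightness(self.level + deltas[key])

ctrl = BrightnessControl(level=50)
ctrl.handle_key("F1")
print(ctrl.level)  # 45
```

    The logic itself is trivial; the whole argument in this thread is about whether the missing piece is the OS exposing the hook or the driver providing it.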

    2)  If it's only intermittently loud, then the issue I brought up surely isn't your problem.  If it were, the fan would be going full blast all of the time.

    3)  Is the Windows side of Boot Camp somehow fundamentally different from wiping the hard drive and doing a clean install of Windows?  Or would the latter not have the proper drivers for an iMac?

    Laptop drivers can be finicky, especially if you're using some sort of discrete switchable graphics.  I don't know if iMacs go that route.

    Yes, AFAIK the iMacs do. Thinking about it, that's probably the issue, but let's answer in order.

    1. Boot Camp is basically installing Windows as normal, then booting into it and installing the Apple-provided drivers on top.

    2. The iMac AFAIK has two graphics cards, the Intel one and the ATI. It probably switches between them as needed on OS X but not on Windows; that could explain the loudness ... maybe.

    4)  Perhaps the bigger question isn't why ArenaNet didn't recode everything in OpenGL, but rather, why Turbine did.  If your DirectX implementation relies heavily on features that aren't available in OpenGL (or are only available in newer versions of OpenGL that Apple doesn't support), then porting it from DirectX to OpenGL is impractical.  Sometimes it would mean you lose a couple of features that you can live without, but sometimes it would mean losing features that are so central to the game that there's no point in bothering.

    I don't think you can circumvent that issue by using a TransGaming wrapper. That's basically a modified Wine that translates DirectX calls into OpenGL calls. If a DirectX call has no corresponding OpenGL call, you can't translate it. Well, I guess it could be done in software emulation, but you sound tech-savvy enough to guess how that would turn out.
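    The core problem with such a wrapper can be sketched as a lookup table (the call names below are invented placeholders, not real D3D/GL entry points, and real translation layers like Wine's are vastly more involved):

```python
# Toy model of a DirectX-to-OpenGL translation layer: calls with a direct
# equivalent translate cheaply; calls without one must be emulated in
# software (slow) or dropped. All names here are illustrative placeholders.

DX_TO_GL = {
    "DrawIndexed": "glDrawElements",
    "SetTexture":  "glBindTexture",
    # A hypothetical D3D feature with no counterpart in the old OpenGL
    # version Apple ships would simply have no entry in this table.
}

class UntranslatableCall(Exception):
    """Raised when a DirectX call has no OpenGL equivalent to map to."""

def translate(dx_call):
    try:
        return DX_TO_GL[dx_call]
    except KeyError:
        # A real wrapper would have to emulate this or cut the feature.
        raise UntranslatableCall(dx_call)

print(translate("DrawIndexed"))  # glDrawElements
```

    Which is exactly the point being argued: the missing-call problem hits a translation wrapper just as hard as it hits a hand-written native port.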

    5)  Mac OS X is a much bigger market than Linux.  On the other hand, people who buy a video card for a Linux machine will consider the brand and drivers, while that isn't even an option if buying from Apple, so it's not clear that high quality Mac OS X drivers would increase sales for AMD or Nvidia.

    Agreed. But the quality of drivers might influence Apple's choice of which vendor to include in the next Mac generations. Apple does have some clout that way, and I'm not talking about iCloud.

    Regardless of whether drivers would be better or worse if written by AMD or Nvidia rather than Apple, by not providing modern drivers, Apple is making it much harder than necessary to port games to Mac OS X.  By not letting you use standard Windows drivers, Apple is making it much harder than necessary to run games through Boot Camp.  If you have trouble playing games on your iMac, you should point your finger at Apple first, before taking game developers to task.

    Hmm, maybe I chose my words poorly. I'm not taking ArenaNet to task over the state of the iMac's Windows drivers, or the old OpenGL version in OS X. I'm taking them to task for choosing a translation company with an IMHO poor track record over producing a native version. I wouldn't have done so a month ago, but if Turbine can do it, any excuse ArenaNet could bring would just sound hollow. This isn't Blizzard we are talking about, and neither is it a multi-million-sub game; it's freaking Turbine and freaking LotRO we are talking about. And apparently they are doing it without having to sell a kidney per dev.

     

  • QuizzicalQuizzical Posts: 14,779Member Uncommon

    1)  I change brightness on my monitors by pressing buttons directly on the monitors.  Windows probably isn't even aware of the monitor brightness.  Doing it from a keyboard requires doing something special with drivers.

    3)  If you've got discrete switchable graphics, then Apple is supposed to be the one that repackages the normal drivers to make them available to you.  If they're not, then the problem is with Apple.  Discrete switchable graphics is nifty for laptops, as it allows lengthy battery life at idle.  But it's a dumb idea if you're not going to run the system off of a battery, because you run into driver problems.

    4)  If something can't be recoded directly in an old version of OpenGL, then it surely can't be done by automatically translating DirectX API calls into OpenGL API calls.

    5)  Apple is notorious for being extremely finicky and dumping suppliers for any reason or no reason at all.  If AMD or Nvidia were to write drivers for Mac OS X and actually make them better than the Windows drivers, Apple would probably still switch away from them within a few years, anyway.  Threats of "I won't buy your stuff if you write bad drivers" are less effective if you weren't going to buy their stuff anyway.

    How hard it would be to convert GW2 to OpenGL for a Mac OS X (and perhaps Linux while you're at it, since switching to OpenGL likely does most of the work here) version depends greatly on internal details: not merely which features are implemented, but how those particular features are implemented. That's basically impossible to gauge without seeing the source code.

    Really, though, if GW2 is successful for long enough (say, if they're still selling 100K copies per month a year from now), then porting the game to Mac OS X and running it natively in OpenGL likely makes sense.  But it shouldn't be the highest priority as the first thing to do.  You do the Windows version first, since that's where most of the market is (and a much larger percentage of the gaming market than the total laptop or desktop markets), and don't worry about other versions until the game is stable and polished.  You cite LotRO, but they didn't even announce a Mac version until more than five years after the Windows version launched.

    And you definitely don't start working on a port before launch unless it's written in OpenGL right from the start without any use of DirectX.  If the Windows version of your game flops, then a Mac version isn't likely to do much better.

  • RocketeerRocketeer NachrodtPosts: 1,304Member
    Originally posted by Quizzical

    1)  I change brightness on my monitors by pressing buttons directly on the monitors.  Windows probably isn't even aware of the monitor brightness.  Doing it from a keyboard requires doing something special with drivers.

    Do you even bother reading what I write? There is a setting for it in the energy settings in the system panel, and Windows is perfectly aware of it, since it shows the current brightness level. Do I have to post a screenshot of it? I can click on it and drag it to a higher or lower level; are you telling me that doing a keypress instead of a point, click, and drag requires a special driver?

    3)  If you've got discrete switchable graphics, then Apple is supposed to be the one that repackages the normal drivers to make them available to you.  If they're not, then the problem is with Apple.  Discrete switchable graphics is nifty for laptops, as it allows lengthy battery life at idle.  But it's a dumb idea if you're not going to run the system off of a battery, because you run into driver problems.

    Works fine in OS X, and that dumb idea reduces my energy bill as well as bringing peace to my room. And switchable graphics means there is different hardware for the same job; choosing which to use, and when, is an OS-level decision, not a driver decision. The drivers have no clue when you need high-power graphics. It's like multicore CPUs: assigning processes to different cores is the job of the OS.
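    The OS-level decision being described can be sketched as a tiny policy function (a made-up illustration of the idea, not how OS X actually schedules its GPUs):

```python
# Sketch of switchable graphics as an OS policy: the OS, not the GPU
# driver, decides per workload which piece of hardware runs it.
# The inputs and the policy are invented for illustration.

def pick_gpu(app_needs_3d, on_battery):
    """Return which GPU the OS would route a workload to."""
    # Idle/desktop work stays on the low-power integrated GPU;
    # demanding 3D work wakes the discrete GPU (more heat, louder fans).
    if app_needs_3d and not on_battery:
        return "discrete"   # e.g. the Radeon 6970M in the 2011 iMac
    return "integrated"     # e.g. the on-die Intel GPU

print(pick_gpu(app_needs_3d=True, on_battery=False))   # discrete
print(pick_gpu(app_needs_3d=False, on_battery=False))  # integrated
```

    If the Windows side of Boot Camp never makes this decision and just pins everything to the discrete card, that alone would explain the extra noise under Windows.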

    4)  If something can't be recoded directly in an old version of OpenGL, then it surely can't be done by automatically translating DirectX API calls into OpenGL API calls.

    That's what I said ... hence non-existent OpenGL calls are a non-factor in deciding whether to make a native port or use translation software.

    5)  Apple is notorious for being extremely finicky and dumping suppliers for any reason or no reason at all.  If AMD or Nvidia were to write drivers for Mac OS X and actually make them better than the Windows drivers, Apple would probably still switch away from them within a few years, anyway.  Threats of "I won't buy your stuff if you write bad drivers" are less effective if you weren't going to buy their stuff anyway.

    Then why did Intel panic when Apple told them their CPUs were not energy-efficient enough? If you think a threat from Apple wouldn't affect their suppliers, you're foolish, sorry. Look at what kind of labour standards they made Foxconn jump through just to freshen up their image somewhat.

    How hard it would be to convert GW2 to OpenGL for a Mac OS X (and perhaps Linux while you're at it, since switching to OpenGL likely does most of the work here) version depends greatly in internal details not merely of which features are implemented, but how those particular features are implemented.  That's basically impossible to gauge without seeing the source code.

    I agree. However, we are talking about a 2007 game with DirectX 11 updated graphics; that's a lot of codebase, some of which actually predates Intel Macs, while GW2 probably planned this for quite a while. You don't just pull a client for a different architecture out of your hat a month after release ... They probably at least considered this while the game was still in development, and didn't go out of their way to build in any major hurdles.

    Really, though, if GW2 is successful for long enough (say, if they're still selling 100K copies per month a year from now), then porting the game to Mac OS X and running it natively in OpenGL likely makes sense.  But it shouldn't be the highest priority as the first thing to do.  You do the Windows version first, since that's where most of the market is (and a much larger percentage of the gaming market than the total laptop or desktop markets), and don't worry about other versions until the game is stable and polished.  You cite LotRO, but they didn't even announce a Mac version until more than five years after the Windows version launched.

    The thing is, if it's not top priority it's never going to get done, because something always is top priority; they just shuffle it around (balance, bugs, content, etc.) based on what gets whined about the most while generating the most revenue. Point is, they decided to go Mac, and now TransGaming is going to take their cut, because that's their business.

    And you definitely don't start working on a port before launch unless it's written in OpenGL right from the start without any use of DirectX.  If the Windows version of your game flops, then a Mac version isn't likely to do much better.

    Probably, but I'd say "if the Windows version flops, a success on the Mac version won't save you either." Because let's face it, the Mac has maybe 3 AAA MMOs, including GW2. If you release on that platform it's going to be a success, because there simply is not much competition. Among the blind, the one-eyed man is king.

     

  • QuizzicalQuizzical Posts: 14,779Member Uncommon

    1)  I couldn't find what you're talking about on my desktop.  I tried on my laptop, and I'm guessing that you mean Control Panel -> Power Options -> Change Plan Settings.  That has a brightness option on my laptop that isn't there on my desktop, even though both run Windows 7.  If it's there on some computers and not others, then presumably something has to be done to make it show up.

    3)  There are trade-offs between burning an extra 10-15 W at idle and reliably working right.  If a video card can't be quiet while putting out 15 W, then that's a bad cooler on it or a bad BIOS running the fan or something.

    5)  You're claiming that Apple is the reason why Intel released ULV Sandy Bridge processors?  Intel has been releasing ULV processors at least since Core 2 Duo, and it would have been a huge shock if they stopped doing so for the Sandy Bridge generation.  It's really just a case where, if someone is willing to pay a considerable price premium for Intel to bin out the "best" chips (in the sense of, can run at the lowest voltages), then Intel will do it.  So will AMD (e.g., A10-4655M or Z-01), Nvidia, or anyone else.

    Yes, pressure from Apple will have some effect.  But it's not clear exactly how it would work out.  And pressure from Apple has its limits: it wasn't enough to get Intel to produce the planned top bin of Ivy Bridge graphics when no one else was interested.

  • andre369andre369 .Posts: 944Member Uncommon
    Can MACs even run this game? Thought they were just a cheap computer with an upgraded visual style. 
  • QuizzicalQuizzical Posts: 14,779Member Uncommon
    Originally posted by andre369
    Can MACs even run this game? Thought they were just a cheap computer with an upgraded visual style. 

    Macs are typically somewhere around mid-range hardware, not low end.  Smaller form factors like a MacBook Air would likely choke, but recent iMacs or many MacBook Pros have the hardware to run it fine.  OS and driver support may be a different matter, as discussed throughout this thread.

  • RocketeerRocketeer NachrodtPosts: 1,304Member
    Originally posted by Quizzical

    1)  I couldn't find what you're talking about on my desktop.  I tried on my laptop, and I'm guessing that you mean Control Panel -> Power Options -> Change Plan Settings.  That has a brightness option on my laptop that isn't there on my desktop, even though both run Windows 7.  If it's there on some computers and not others, then presumably something has to be done to make it show up.

    3)  There are trade-offs between burning an extra 10-15 W at idle and reliably working right.  If a video card can't be quiet while putting out 15 W, then that's a bad cooler on it or a bad BIOS running the fan or something.

    5)  You're claiming that Apple is the reason why Intel released ULV Sandy Bridge processors?  Intel has been releasing ULV processors at least since Core 2 Duo, and it would have been a huge shock if they stopped doing so for the Sandy Bridge generation.  It's really just a case where, if someone is willing to pay a considerable price premium for Intel to bin out the "best" chips (in the sense of, can run at the lowest voltages), then Intel will do it.  So will AMD (e.g., A10-4655M or Z-01), Nvidia, or anyone else.

    Yes, pressure from Apple will have some effect.  But it's not clear exactly how it would work out.  And pressure from Apple has its limits: it wasn't enough to get Intel to produce the planned top bin of Ivy Bridge graphics when no one else was interested.

    1. This is simple: Windows detects whether the display's settings can be adjusted by software or not. High-quality and laptop displays can; cheap or old displays can't.

    3. There is a difference between silent and quiet. When I'm talking about my iMac getting loud, it's still quieter than any desktop I ever used at idle. It's a relative thing; the iMac never gets loud in a way that would disturb me ...

    5. I'm not claiming anything; in the link I posted there is a quote from an Intel chief designer who said exactly that. Apple warned them they would part ways, and Intel reacted with Sandy Bridge.

  • QuizzicalQuizzical Posts: 14,779Member Uncommon

    To claim that the existence of Sandy Bridge, or even ULV bins of it, is a response to threats from Apple is ridiculous.  It takes 3+ years to develop a processor architecture, so if Apple warned Intel today that we need such and such or we're leaving, unless Intel already had it in the pipeline (or a slight modification of a project that they were already working on), the soonest they could deliver would be around 2015-2016.

    The entire article is muddled nonsense, for that matter.  Intel's Ultrabooks do seem to be inspired by the MacBook Air, but in an "Apple will pay us a lot more than normal for ULV bins of processors, so let's see if we can convince other laptop vendors to do so, too" sense.  It turned out that the answer to that was mostly "no", as other laptop vendors assumed that consumers willing to sacrifice everything else (price, performance, reliability, features, etc.) for the sake of a sleeker form factor would buy from Apple anyway.

    So Intel created a $300 million marketing campaign for Ultrabooks to try to convince consumers that thinness and an Intel sticker are the only things they should care about, even at the expense of price, performance, reliability, features, and so forth.  And AMD is competitive with Intel in low-power laptop processors but not in high-performance ones, so convincing people that they should really want ULV processors rather than high-performance ones would likely increase AMD's market share.  But Intel thinks consumers are stupid, just like most laptop vendors do.

  • RocketeerRocketeer NachrodtPosts: 1,304Member
    Originally posted by Quizzical

    To claim that the existence of Sandy Bridge, or even ULV bins of it, are a response to threats from Apple is ridiculous.  It takes 3+ years to develop a processor architecture, so if Apple warned Intel today that we need such and such or we're leaving, unless Intel already had it in the pipeline (or at slight modification of a project that they were already working on), the soonest they could deliver would be around 2015-2016.

    The entire article is muddled nonsense for that matter.  Intel's Ultrabooks do seem to be inspired by the MacBook Air, but it's in an "Apple will pay us a lot more than normal for ULV bins of processors, so let's see if we can convince other laptop vendors to do so, too."  It turned out that the answer to that was mostly "no", as other laptop vendors assumed that consumers willing to sacrifice everything else (price, performance, reliability, features, etc.) for the sake of a sleeker form factor would buy from Apple anyway.

    So Intel created a $300 million marketing campaign for Ultrabooks to try to convince consumers that thinness and an Intel sticker are the only things they should care about, even at the expense of price, performance, reliability, features, and so forth.  And even though AMD is competitive with Intel in low power laptop processors but not in high performance ones, so convincing people that they should really want ULV processors rather than high performance ones would likely increase AMD market share.  But Intel thinks consumers are stupid, just like most laptop vendors do.

     "Apple informed Intel that it better drastically slash its power consumption or would likely lose Apple’s business."

    That is a confirmed quote from Greg Welch; he also said it was a real wakeup call for them. Considering he is the director of Intel's Ultrabook group, I'm kinda putting more stock in his opinion than yours, no offense. Obviously they didn't say "we are going to ditch you today"; they said "make more energy-efficient CPUs, or we will start making our own based on the ARM architecture". A threat that would have been wholly unnecessary if Intel was going to make Sandy Bridge anyway, right? Cause I'm sure Apple can read roadmaps just fine; they probably have a lawyer for that.

  • QuizzicalQuizzical Posts: 14,779Member Uncommon

    Here's an article talking about Sandy Bridge way back in April 2007:

    http://www.theinquirer.net/inquirer/news/1026742/justin-ratner-brings-babes

    It has details like an on-die GPU and PCI Express controller, too.  Unless Apple's threat to switch away from Intel came before that (and remember that they had only switched to Intel in 2006), they're not responsible for Sandy Bridge.  The particular bins that you can make depend on the silicon you have, and just saying that you'd really like to have a 2.0 GHz quad core bin with a TDP of 12 W doesn't make it magically happen.

    The rumors about Apple switching to ARM are old news, too:

    http://semiaccurate.com/2011/05/05/apple-dumps-intel-from-laptop-lines/

    Intel might have been surprised that Apple might switch to ARM for desktops and laptops, but that doesn't mean that Intel is going to (or can) change their roadmap in response.  Higher performance, lower power consumption, and lower cost of production have been the goals for a long time.  Intel will deliver the best that they can (as will AMD and ARM), and Apple saying that's what they want just like everyone else won't change what Intel, AMD, and ARM can deliver.
