
NVIDIA’s GTX 1080: The Tip Of The Iceberg?

By Marco Solorio | Editorials | 44 Comments

Marco Solorio of OneRiver Media shares his thoughts on the newly announced Nvidia GeForce GTX 1080. More than just better performance at a lower cost, the new GPU may cause a larger shift in computing than what appears on the surface.

Nvidia dropped a bombshell with the announcement of the GeForce GTX 1080, and it is making waves throughout many circles. Aside from the obvious videogame applications, it matters for VR, video editing, color grading systems (namely DaVinci Resolve), and 3D modeling/animation systems. In fact, many applications use these cards purely for their computational power, with nothing to do with graphics processing at all.

On top of that, the GTX 1080 is claimed to be about twice as powerful as Nvidia’s Titan X (which we use) and will cost only $699 (almost half of what the Nvidia Titan X costs). It also draws far less power than the Titan X. By all accounts, this is truly a groundbreaking move in the GPU technology market. Nvidia, in short, is killing it.

The announcement of the GTX 1080 is so big that this card alone will most likely cause a shift in computer workstation ownership. Last year I wrote an article about how I upgraded our Mac Pro tower with new CPUs, RAM, flash-based boot drives, and of course, the Titan X. The system still churns through heavy tasks, including 4.6K RAW footage edited in real time in DaVinci Resolve on 4K UHD timelines (even basic node structures play in real time without the need for rendering).


Our current Nvidia Titan X GPU packs a serious punch (as part of my “Mac Pro Tower Reborn” article last year) but pales in comparison to the newly announced GTX 1080 card… and at almost half the cost. Bittersweet.

But as good as that juiced-up Mac Pro tower is today, I know its time will eventually come to an end, simply because Apple hasn’t built a PCIe-slot-based system in many years now. As my article described, the alternative Mac Pro trashcan is simply not a solution for our needs, imposing too many limitations combined with a very high price tag.

The Nvidia GTX 1080 might be the final nail in the coffin. I can guarantee at this point that we will have to move to a Windows-based workstation for our main edit suite, and one that supports multiple PCIe slots specifically for the GTX 1080 (I’ll most likely get two 1080s at that new price point). I’m no stranger to working on Windows systems (I’ve built my own Windows boxes since Windows 3/NT) and have Windows systems running now in our facility. That said, I do prefer Apple OS X when possible. But with no modern PCIe-based workstation from Apple, our hands are tied: we have to move to Windows (we may get an HP Z840 system, something similar, or a custom build we’ll do in-house). Even if the GTX 1080 could be flashed for OS X, we wouldn’t be able to take full advantage of what the 1080 has to offer, due to the Mac Pro tower’s older PCIe bus technology.

Even a Thunderbolt-connected PCIe expansion chassis attached to a Mac Pro trashcan won’t help, due to the inherent bandwidth limits of Thunderbolt compared to the bus speeds these GPU cards are designed for. And forget about stacking these cards in an expansion chassis… just not going to happen.
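
To put rough numbers behind that claim, here is a quick back-of-the-envelope comparison of nominal link rates (these are approximate spec-sheet figures for illustration only, not measured throughput, and real-world numbers will be lower):

```python
# Rough comparison of nominal interconnect bandwidth (Gbit/s).
# Spec-sheet figures only; actual sustained throughput is lower.
links_gbps = {
    "Thunderbolt 2 (Mac Pro trashcan)": 20,          # per TB2 port
    "Thunderbolt 3 (rumored for new Macs)": 40,      # per TB3 port
    "PCIe 2.0 x16 (classic Mac Pro tower)": 64,      # ~8 GB/s
    "PCIe 3.0 x16 (current PC workstations)": 126,   # ~15.75 GB/s
}

full_slot = links_gbps["PCIe 3.0 x16 (current PC workstations)"]
for name, gbps in links_gbps.items():
    print(f"{name}: ~{gbps} Gbit/s ({gbps / full_slot:.0%} of a PCIe 3.0 x16 slot)")
```

Even Thunderbolt 3, at roughly a third of the bandwidth of a single full x16 slot, leaves little headroom once displays and storage share the same connection, which is why stacking multiple GPUs over Thunderbolt is a non-starter for this kind of workload.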

Blackmagic Design's DaVinci Resolve

With Blackmagic Design’s DaVinci Resolve making serious inroads as an NLE (not to mention its decades of color correction/grading superiority), powerful GPU cards are a major plus for our workflow. The GTX 1080 looks to be a winner among many DaVinci Resolve users.

With all that said, I see (and have already seen) a huge migration of longtime Apple users (such as myself) to Windows systems for their main workstation needs. The sheer power at a lower cost is just too compelling at this point, and the Nvidia GTX 1080 only compounds it.

The biggest issue for us is that we have software (and licenses) specific to OS X, so there may be some added cost on our end for new Windows-compatible software and/or license transfers. Luckily, some packages (DaVinci Resolve, the Adobe CC suite, NewTek LightWave 3D, and others) will not require buying new licenses for Windows and can simply be transferred to the new Windows system.

For the record, I’m not going to invest in a new Windows system tomorrow for our edit suite, but it is on the horizon. My estimate is several months from now, as early adopters put the GTX 1080 through its paces with the same tools we use. Once things stabilize and performance results are out, I’ll slowly make the transition. We just have an enormous amount of projects and assets on the current system (including feature film work) that will take some time to migrate from OS X to Windows. But in the end, it will happen.

What are your thoughts on the new Nvidia GTX 1080? Are you also a Mac Pro Tower user looking to switch to a Windows-based system that supports the GTX 1080? Looking forward to it or dreading it?


What's Next?
Let us know if there’s a specific topic you’d like us to cover, whether it be workflow, production, shooting techniques, or whatever! And if you subscribe to our blog, you will be notified the second a new blog post is released… BEFORE it’s posted anywhere else online, including our own social media platforms. Cheers!



About the Author

Marco Solorio


Marco Solorio is an award-winning feature filmmaker, published author, and international speaker as a 30-year veteran in the industry with many industry awards to his name. As owner of OneRiver Media, he has produced, directed, and lensed content spanning from commercial to feature film works. As a credited thought-leader in the industry, Marco has consulted for the likes of Pixar, Apple, and Google to name a few. Along with published books to his credit, Marco has also been featured in... Read Marco's full bio here.

FCC Disclaimer - Links from this article might go to affiliate links at B&H and/or Amazon (not much different from the ads you see on YouTube, but more focused). Visiting these links (or buying products from them) costs you nothing additional and may in turn help us pay for the cost of hosting these free articles we write. These articles take an enormous amount of time (in most cases, weeks and months) to research, draft, write, edit, rewrite, and shoot custom photography/video for. In no way do these affiliate links cover the cost of all of that, so any help these links can provide is a huge help to us in continuing to offer free content to our readers.



Comments 44

  1. Food for thought. Insightful post and one that’s given me something to ponder. I still use FCPX because my iMac doesn’t have the juice to support the new version of Resolve. I was toying with the idea of a tower, but this post illuminated the limitations of the Mac Pro. Hmm…

    1. Marco Solorio (author):

      Hi Ryan. Yeah, anything Apple related for desktop computing at this point does impose some limits, unfortunately. A Mac Pro tower uses an older PCIe bus (with only one 16x slot at that) and the Mac Pro trashcan has no PCIe slots, with Thunderbolt expansion not providing enough bandwidth. Windows is really the only viable way to move forward for a seriously powerful system, with options to expand as your needs and budget allow.

  2. Apple will likely be putting Thunderbolt 3 on MacBook Pros and Mac Pros soon, which negates the need for PCIe slots. Thunderbolt 3 has enough bandwidth to support external graphics cards, which is especially exciting for laptop users. Basically, I wouldn’t make any rash decisions yet, or at least not until Apple’s WWDC.

    1. Marco Solorio (author):

      While I agree that waiting to see what happens at WWDC is beneficial, I do so with a grain of salt, as Apple has been on a track of downsizing its desktop computing for many years now. While I rejoice in TB3, I still don’t see it as a complete solution on its own if I want to (and will) stack GPU cards (especially with hard disk systems in the chain). With Windows systems offering multiple 16x PCIe slots (among 8x and lower) along with TB (fantastic for low-cost RAID solutions), I can’t ignore the flexibility and expandability of Windows systems at this point. The trashcan just doesn’t offer that for those who truly need a high-end system to expand on. But as mentioned in the article, it’ll be several months before we make any transition, so I’ll definitely see what happens at WWDC first.

  3. I have a Mac, but I’ve always been predominantly Windows PC myself, due to being a Softimage / 3ds Max user and having to build dirt-cheap render farms. There has been a small movement over to Windows PCs in and around the people I work with, and it’s due to the same things you refer to: flexibility and bang for buck. Robert is right that Apple will add TB3 to their upcoming machines, and for the majority they will be super happy, but being tied down to hardware components just isn’t for me. This card, and more so the even cheaper GTX 1070, could be what liberates many amateurs and hobbyists using DaVinci, Octane Render, and other GPU-dependent software; it certainly cuts about £250 off a machine able to run Rift or Vive. The 5D of the graphics card world.

    1. Marco Solorio (author):

      Yeah, I agree. For us, it’s more about the expandability and flexibility of an open tower, even beyond the limits of what TB3 can do for our needs with stacked GPUs. But yes, there will definitely be a segment of users who will use a single 1070 in a TB3 chassis expansion and call it a day. This is, of course, assuming Apple brings TB3 at WWDC; I have to believe they will. But more importantly… whether the 1080/1070 can even function under OS X. That in and of itself will take a fair amount of time, if it can be done at all.

  4. I currently use a 2013 Mac Pro and this graphics card is the final nail in the coffin for me to switch to Windows. Adobe Premiere Pro and Resolve work well with my mid-level Mac Pro, but the real struggle is After Effects and rendering 4K. With the price of this new GPU along with a Xeon chipset, I can get double the power for half the cost of my current Mac Pro. If Apple doesn’t do something radical again (like making it expandable in the PCIe world), then I’m switching for sure, no questions. WWDC will decide whether I stay with OS X or leave it.

    1. Marco Solorio (author):

      Yeah, we’re pretty much in the same boat, Caleb. For us though, unless Apple comes out with a PCIe-based solution that includes multiple 16x slots at WWDC, which is incredibly unlikely, we’re done with Apple. Even TB3 isn’t enough for us to invest in a limited trashcan. Our needs have always required a tower-based host workstation for expandability. Maybe Apple can throw a Hail Mary? Unlikely, unfortunately for us.

  5. Hey Marco,

    Nice posting. My main client switched to custom built Windows workstations a little over a year ago and they just absolutely crush anything that Apple has to offer. Sad, because I am a big Apple guy but I cannot dispute what I am seeing as far as performance with Adobe CC and stuff like Cinema 4D on these hogs. Even for me, once I got used to the “gotchas” of Windows again, I can work as quickly as I can on a Mac with tons more horsepower at my disposal. Also of note: they are planning on adding the 1080 card to these bad boys too.

    – Keith

    1. Marco Solorio (author):

      Yeah, the performance-to-cost ratio on Windows systems is just too great at this point. We’ve stretched the potential of the Mac Pro tower as far as it can go, while current Windows towers have multiple 16x PCIe slots and Thunderbolt built right in. I can’t ignore that!

  6. Sing it, brother! Last Fall I built an “entry-level” Win 8.1 Pro based PC for Resolve, utilizing a GTX 980ti GPU and an inexpensive TB2 card. It was surprisingly easy to build and works great:
    https://herefortheweather.wordpress.com

    I’ve owned & used Macs since the very first 1984 model, but Mac hardware has gotten way, way behind. Apple _might_ announce a new Mac Pro at WWDC this June, but I needed a new computer last year. It’s anyone’s guess if a new MP will be cost effective/competitive (they were once).

    Aside from the hardware spec wars, an important factor for me is the environmental cost of not being able to reuse/repurpose old Mac hardware & peripherals anywhere near the extent possible with generic PC hardware.

    Cheers!

    1. Marco Solorio (author):

      Hey Peter! I definitely agree with that. I do wish Apple would be innovative with their desktop workstations again, but obviously it’s no longer their focus. I do hope I’m wrong about their showing at the next WWDC, but I’m not holding my breath. The price/performance gap has just gotten so wide at this point. I’m definitely in favor of repurposing old towers as well!

  7. Anyone who actually bought the trashcan Mac Pro either didn’t bother doing the sums or really, really thought the Mac was still a good OS. I know a lot of people still paying £5000 for a machine you can build for around £800, even if you insist on pointlessly using HP, Xeon, and Quadro! I was a 15-year Mac-only person and it was easy as pie to switch. After Shake, then Final Touch, then Final Cut, then QuickTime, I simply cannot trust them to support my business in a mature and professional way. I have seen tests, however, that show bandwidth may not be as important for GPU performance as we think, and I am sure somebody will make TB3 GPU compute work. But why would you, when you could buy a new camera and have a holiday with the leftover cash?

    1. Marco Solorio (author):

      Yeah, as limiting as the trashcan can be, I think it works for some people’s scenarios, but definitely not ours. GPU performance is critical for us, as we also run 3D modeling/animation software besides the primary need for DaVinci Resolve (this is why we use the Titan X now). And yes, the whole Apple Pro Apps EOL with Shake et al. permanently left a very sour taste in my mouth, and I can no longer trust Apple with our professional needs either. It is one of several reasons we’ve never gone the FCP X route. I also agree: why jump through the hoops of a TB3/GTX 1080 hack when a PC system at this point in time is so much more powerful, less costly, and compatible out of the gate? I’ll still see what WWDC has to offer, but I’m not holding my breath!

  8. We’re a 3 seat DaVinci and AdobeCC based grading and finishing shop faced with the same issues. We switched our main grading rig over to a custom built PC just after the TrashCan pricing was announced and are not going back. Our second seat runs a Mac Cheese-Grater with a 6GB Titan and 32GB of RAM quite nicely. We use that primarily for conforming and edit. The third seat runs an i7 iMac with 32GB RAM and 2nd display and is mostly used for ingest, dailies and some render. All 3 are connected via ExaSan. The point being, running a hybrid setup is quite possible giving you the best of both worlds where and when you need it. It has the added benefit of not putting the future of your hard-earned business in the hands of Apple, who, to put it in the mildest terms, have an exceptionally poor record of listening to their customers.

    Down the road, I would totally agree. I’m a long time Mac user, but PC and Nvidia are now far and away the better choice and unless Apple changes course (and pricing) radically, it’s over for them in the professional realm within the next few years.

    1. Marco Solorio (author):

      I totally agree, Clay. Having PC systems in our facility now, I know the move to switch our Mac workstations to PC should be smooth for the most part. The bottleneck for us will be completing and transitioning current projects (including feature film work) from Mac to PC. But it’s not so much a question of “how” as of “when”. I like your example, as it shows people are definitely doing this successfully and enjoying the benefits of a stronger main workstation. I definitely don’t want to continue to throw money at Apple, especially with the sour taste I still have from their Pro Apps EOL!

  9. This is a difficult issue.
    I have been preaching for quite some time that getting behind Apple was the road to post-production disaster. Simply look at what they do: the end of the Pro Apps, the trashcan Mac that die-hards make excuses for. As a person who has run facilities in the past, what an absolute disaster (only now are the die-hards, bloody from hitting their heads against the wall, beginning to see the light). I would love it if Apple would release a hardware platform suited to pros, or better, open OS X to general top-end hardware, making Hackintosh use a supported model.
    Personally, I think they are struggling with how TB3 dumped them and went PC-centric rather than Mac-centric (USB-C). E.g., the recent MacBook update had a mainboard capable of TB3 with a USB-C connector, but they decided not to make it TB3-capable. Really, Apple???

    It makes me very sad that Apple is so arrogant about this… but the only way we can get Apple to take notice is to abandon them for a few years. Make them hurt. It’s all about money, guys, and now that iPhone sales are flat, it’s a good time to make them hurt and get them to actually start listening to users instead of always thinking they know best.

    Otherwise, if Linux could actually get its act together, that is where I would prefer to go. I use everything and I can tell you, Linux gets the most out of the hardware. I imagine it comes from how nearly all of the top 100 supercomputers in the world run Linux-based kernels.

    And really, with 4K at 10-12 bit or more, HFR, etc. (needed for HDR), we will need every bit of CPU power we can get in the future.

    1. Marco Solorio (author):

      Hi James. I definitely agree with your points. I feel there’s a bit of arrogance by Apple as well. It’s a shame that they haven’t pushed out innovative new professional products for the last several years now, but I guess that’s the toll of being a phone company first and foremost. And as you say, even in that regard, they’re not making huge gains in the mobile device market these days. And yes, Linux would be a cool option as well, but for us there is just not enough supporting software (and hardware, for that matter) for our needs. Not sure about your point on HFR being needed for HDR though… even standard 23.98 fps is an accepted rate for HDR, as it’s about latitude quality and not frame rate.

      1. Greater contrast brings with it more stutter discomfort. Everyone knows 24fps has issues and can look very bad when moving the camera at certain speeds, etc. The greater contrast in the image makes that stutter considerably more annoying.
        The only way to “fix” this issue is to go to HFR, though there is a stigma that HFR looks like video.
        In reality, HFR looks far more realistic, though we grew up with and are comfortable with 24fps.

        It is expected that the stutter in HDR content at 24fps will be so annoying that HFR is likely to be adopted sooner rather than later. I.e., sooner than all you old fuddy-duddy people would like… The younger crowd, with their 60fps-minimum games, will be looking at you the way you look at your father and his conviction that Glenn Miller’s big band music is the best… I.e., it is generational to a degree. And in reality, there is nothing wrong with HFR apart from your not being used to it.
        Case in point: Ang Lee’s demo at 120fps was considered groundbreaking and more realistic than anything before it. But are we going to poo-poo it because it still kind of looks like video? Video uses a higher frame rate and shorter exposure for sports, etc., precisely to make it look more realistic.

        1. I’ve never quite understood the preoccupation with shooting video at 24fps. It never really looked like ‘film’ at all, because the stutter effect looks poor on all but the slowest pans. People forget that when traditional ‘film’ was projected to an audience in a cinema at 24fps on a regular film projector, there was a mechanical three-blade shutter which came into play, so the cinema audience actually saw 72 flashes of light per second, not 24. The brain fills in the missing blanks and the pans look acceptably smooth. Some modern 4K TVs will reproduce this shutter effect electronically when viewing video/BD material at 24fps.

        2. Marco Solorio (author):

          My point, though, is that HFR is not a requirement of HDR. 24p is completely acceptable under the HDR pipeline. Whether someone inherently likes or dislikes 24p is merely a personal preference (and an age-old topic unto itself, debated for decades), just as 60p and 120p are also personal preferences. There is no required frame rate in the HDR pipeline that deems 24 fps to be out of spec.

          1. This is true; 24fps HDR is supported. Though everyone I discuss HDR with who is connected with HDR for cinema release does agree HFR is going to happen. The industry seems quite happy to move forward with what we have at 24fps; not so much with HDR. But really, I have never liked the stutter myself.
            In some ways it is an excuse for the technical purists to force the point. (I.e., purist: “Look, it’s more accurate and has more detail.” Old pundit: “But I don’t like the look… please turn down the accuracy of the image.” Purist bangs head against wall.)
            I find it strange how going from Dolby SR to Dolby Digital/SDDS in cinema was applauded, while improving the image quality is poo-pooed. It’s quite a strange phenomenon.

            But like Pixar said at the Future of Cinema Conference, HFR is not currently socially acceptable in the current climate. (I’ll post a video of the Pixel panel on HDR in about a month.)

            1. Marco Solorio (author):

              I presented on HDR at HPA and was on the HDR panel at NAB and the topic of HFR specifically for HDR was never in the realm of a replacement but more so an addition to. I personally am open to traditional 24, up to 60, 120, et al… it’s all about how it’s being used for the medium and telling the story, but again, it’s a personal preference. Love it or hate it, there is no closed HDR spec for either end of the spectrum: 24 or 120. Again (and personal opinions of aesthetics aside), HFR is not a requirement of HDR.

    1. Marco Solorio (author):

      You’re correct, Craig; unfortunately it has come to this. But even as an iPhone 6 Plus user myself, I’m not even sure Apple is making huge gains against the mobile device competition like they used to. Uninspiring times from them, unfortunately.

        1. Marco Solorio (author):

          Thank you for quoting our blog post, Craig. Very much appreciated! Great write-up of yours as well. Yeah, I think the underlying tone among pro Mac users is that their days are numbered running towers or trashcans (if the jump to Windows hasn’t already been made). A bit saddening really. I’ll surely miss using OSX on my main workstations. Still holding out for now while I can though!

  10. Well, for using PC hardware with OS X there is always the Hackintosh option (I built one motion graphics / VJ workstation that has been running rock solid for a year, under basically constant load all day, every day).
    But driver support is a real issue in this (new GPU architecture) case…

    1. Marco Solorio (author):

      A Hackintosh solution has been a thought of mine for years, but it ends up scaring me: we have paying clients in our edit suites, and risking stability for the sake of staying on OS X with PC hardware just isn’t appealing. I’d rather just work in Windows, despite the fact that I do prefer OS X for many reasons. Tough decisions!

  11. Just purchased a PC workstation for a song. Did some upgrades and am now achieving incredible performance. Windows 10 is every bit as stable and modern as OSX. Both are good OSes, but the flexibility and economics of a Windows box cannot be overstated.

  12. Hi Marco, I’m a mobile video producer, so a laptop is essential. Looking forward to an updated MacBook Pro with TB3. Considering an external chassis GPU, would a PC laptop have an advantage over a MacBook Pro?

    1. Marco Solorio (author):

      Hi Joe. I definitely understand your need for mobile power on the go. I’m barely squeaking by with my mid-2014 15″ Retina MacBook Pro, which even when brand new was a bit behind the PC curve. There’s no confirmation yet as to what WWDC will bring on the MacBook Pro front (or the MacBook Air front, for that matter). From what I can gather, TB3 with an external PCIe chassis can handle one 16x card for sure, and possibly two 16x cards, but that wouldn’t leave room for another display and/or a storage system. The bigger problem, however, is if/when OS X drivers would even work with the 1080 (Nvidia has made no comment on OS X compatibility, since there are no Macs that can support its maximum power). Short answer: I’d wager a PC laptop would be the way to go for maximum power and display size/resolution. However, I’ll ping my longtime buddy Gary Adcock, who has a vast amount of experience on both laptop platforms and has used one of the fattest/fastest Windows-based laptops on the planet (he was telling me all about it last week when I was presenting in Chicago). He’s also a Thunderbolt guru in his own right.

      1. Thanks Marco. What laptop is he using? I’ll get in touch with him. Also, ProRes is a biggie in our workflow. How does one (i.e., how will you) work around this on a Windows-based system? Thanks again!

        1. Marco Solorio (author):

          I’m not sure of the laptop, which is why I pinged him. Hopefully he’ll get back with an answer. There was a really good blog post on ProRes for Windows, but my memory is failing me on where I saw it (I should have bookmarked it). I’ll post it here once I find it.

          1. Hey Marco & Joe,
            I have TB3 on laptops from HP, Dell, and Lenovo. The powerhouse model that Marco is referring to is the Lenovo ThinkPad P70. Mine is configured with 64GB of RAM, a UHD screen with Pantone/X-Rite calibration built in, dual TB3 ports, and an Nvidia M5000 for graphics, running Win 10.1x. It is nearly as powerful as my 2-year-old HP 820 desktop.

            The Dell Precision 15 is my “carry around” model and the Dell TB3 Dock is pretty nice.

            There are rumblings that Apple will include TB3 on the next generation of devices. They have the existing TB user base and it would be in their best interest to continue that process. With WWDC just around the corner, I assume we will find something out then, but there are no guarantees.

            My hope would be for Apple to look at the Razer model for TB3 adaptability. Mac users are fully entrenched in TB for their workflows, and it behooves Apple to make smaller, more portable versions of their mainstay desktop products to maintain their edge in the creative marketplace; making the power externally available over a single TB3 cable could allow creatives a flexibility in the process they have never had before.

            Let me add that one other advantage of TB workflows is the underlying DisplayPort protocols, which offer users higher quality and better performance than the existing HDMI specs. (Note: HDMI 2.0 is just starting to be included on the latest devices, and it will be 2-3 years before it is mainstream in the market.)

          2. Hi Marco and Gary,

            Now that the new MacBook Pro is out, I’m pretty disappointed by how under-spec it is. I’m looking to finally switch over to a Windows laptop (in addition to using my 2011 MBP). Your thoughts on the MSI GT62VR Dominator Pro? Key things are the GTX 1070 GPU, max 64GB RAM, and one Thunderbolt 3 port. Way more powerful than the 2016 MBP! (other than the additional TB3 ports). Main use will be DaVinci Resolve, Adobe Premiere, and After Effects. Do you think this will be a good, future-proof mobile workstation for only $1500? I’d prefer the GTX 1080 but couldn’t find a laptop for under $2000.

            http://amzn.to/2pGDocc

          1. Mac users seem to be biased toward, or preoccupied with, ProRes. If we’re discussing HD workflows, then DNxHD is virtually the same thing; the sooner producers understand that, the better, as both ProRes and DNxHD can be considered ‘visually lossless’ codecs. My HD video cameras all shoot native ProRes HQ 4:2:2, so we work with ProRes every day – on Windows 10 64-bit. There are simply no issues at all, and if you need to encode to ProRes on any PC, there are dozens of free (or low-cost) apps based around FFMBC/FFmpeg where you can encode to any ProRes format.
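
            For a concrete example of that last point, here is a minimal sketch (in Python, with placeholder file names) of driving an FFmpeg ProRes encode via the prores_ks encoder; it assumes an ffmpeg build that includes prores_ks is available on the PATH:

            ```python
            # Minimal sketch: transcode a clip to ProRes 422 HQ with FFmpeg.
            # Assumes "ffmpeg" (built with the prores_ks encoder) is on the PATH;
            # the file names below are placeholders.
            import subprocess

            cmd = [
                "ffmpeg",
                "-i", "source_clip.mov",   # input file (placeholder)
                "-c:v", "prores_ks",       # FFmpeg's ProRes encoder
                "-profile:v", "3",         # 0=Proxy, 1=LT, 2=422, 3=HQ, 4=4444
                "-c:a", "pcm_s16le",       # uncompressed PCM audio, typical for mezzanine files
                "prores_hq_output.mov",    # output file (placeholder)
            ]
            subprocess.run(cmd, check=True)
            ```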

  13. Yea, good article! I would so much prefer to work on a Mac, but….

    I had the very best iMac I could get, with 4GB of VRAM, 32GB of memory, and a Core i7, and it just couldn’t cut it with 4K video. I then went Hackintosh, but yes, you have to wonder what a client thinks if they ever see you working on a Hack as opposed to a real Mac.

    Windows 10 is a huge improvement over 8, and my stuff runs on it just fine. All the Adobe stuff, Avid, and DaVinci run great. And as I’m looking not only at the 1080 but also at the 6950X processor, getting that to run on a Hackintosh could be a real challenge, and the likelihood of Apple supporting that stuff is, in my view, nil. Technically, Hackintoshes aren’t legit, although Apple doesn’t challenge them.

    I think you’re right. Apple has given up on their desktop hardware and machines. There is a good chance that we won’t ever see an update for the trashcan for a long, long time.

    1. Marco Solorio (author):

      It’s good to hear your Hackintosh is working for you, Meagan. It’s funny because I don’t think my clients would actually care one way or the other if I used a Hackintosh, as long as the projects got done in time and such. But my biggest concern is ongoing compatibility with software and hardware updates as well as stability… my clients definitely wouldn’t like a Hackintosh if the system started crashing out during their sessions! But it seems like Hackintosh systems are running smoother these days. Still, as tempting as it is, I must stick with a native system one way or the other. Again, very glad to hear it’s working out for you though! The input is good to hear.

  14. I stopped buying Macs 2 years ago, and before then I used Macs for 6 years. I even learned how to use Final Cut Pro while working at the Apple Store. Unfortunately, I prefer the Mac OS to Windows any day, but the price-per-performance ratio was not making any sense. And at the end of the day, when you are inside your program of choice, fullscreen, do you even notice the OS? So 2 years ago I bought a PC laptop with 6GB of VRAM and RAID SSDs living inside a chassis thinner than a MacBook Pro for $1800, and I haven’t looked back since. I just built a dual 2.6 GHz 8-core LGA 2011 Xeon system with 64GB of RAM for $1200; add a GTX 1080 (which is really hard to find at a reasonable price right now) and it will be $1800. My friend just ordered a Mac Pro for work and spent $10,000 for a 2.7 GHz 12-core with 64GB of RAM. With the leftover $8K I can build 4 more computers and have a render farm, with 4 more cores than the Mac Pro in each computer. This is part of the problem with buying a Mac: gaming computers (and some laptops) that cost less these days can run circles around the Mac Pro, whose price has not scaled with the parts inside. Why should you pay $10K for parts that cost closer to $2K or $3K? I know compactness is a selling point, but unless you are really tight on space, how much room do you actually need to save?

    1. Marco Solorio (author):

      Yeah, I agree, and it all makes sense. June’s WWDC was a huge letdown, both generally speaking and of course for the lack of any desktop announcements (even though that was expected). No way I could logically spend $10k on a trashcan at this point! It just makes no sense. Especially with TB3 making waves on PC systems and all the relatively lower costs associated with it.
