AnandTech's hardware reviews and Ars Technica's software reviews (especially of OS X) are works of art in themselves. The level of detail that goes into these pieces is nothing short of amazing. They are examples of tech journalism done right.
AnandTech is often the only place that measures and reports anything about the screens of computers, tablets, and phones aside from how many pixels they have (color space, color accuracy, etc.). For this alone they are gold.
> AnandTech is often the only place that measures and reports anything about the screens of computers, tablets, and phones aside from how many pixels they have (color space, color accuracy, etc.). For this alone they are gold.
Not the only place. I have found the German site NotebookCheck.net to be more thorough than Anandtech when it comes to laptop reviews.
Not only do they measure color space and color accuracy, but they also measure backlighting levels across the screen, comment on backlight bleed, test for viewing angles, check the reflectiveness of screens, etc.
In addition, they also test decibel levels of the fan, temperature of the laptop surface, etc. at various loads. They point out situations where a certain laptop may ship in two configurations. They point out the pros/cons of the nearest products from competing vendors, which have also been tested with equal thoroughness.
In other words, they actually test the machines against a checklist, to assess the performance of each element. In contrast, when an Anandtech reviewer claims that a machine is quiet or that the screen is matte, you don't actually get all the information. Maybe the ventilation system was masking some of the sound. Maybe the screen is only semi-matte.
The only problem is that NotebookCheck is based in Germany, so some of the machines they review are not available in the United States. Still, I'm happy restricting my choices a bit to avoid being surprised with a laptop purchase.
>I like Anandtech but they have come into criticism recently, from some of their long-time readers, for fawning over Apple products.
Which, instantly, should be reason enough to understand that said criticism is BS.
Apple products are among the best in the industry, period. Not just in industrial design, but overall: coherence of product vision, attention to the characteristics that matter for the target market (battery life, portability, weight), quality machining and materials, and attention to small details (from the multitouch trackpad to the MagSafe adapter, and from the backlit keyboard to the magnetic, non-protruding lid hinge).
These people think that because they are not specced and designed like gaming PCs they are not worthy ("I can have a better GPU for less money in my custom box, and with xenon lights on the sides too").
And they attribute their popularity to some BS "reality distortion" effect, ignoring the fact that hardcore hackers, prominent programmers, and old-school neckbeards use them: from Rob Pike, DHH, and Duncan Davidson to Jamie Zawinski and Miguel de Icaza (the frigging founder of the GNOME desktop), down to Linus Torvalds, who waxes poetic about his MacBook Air as the best in the market.
So, "fawning over Apple" justs translates to "did some favorable reviews of products, instead of making up BS reasons to dislike them".
As for Miguel de Icaza, he's eating his own dog food: he now makes his money selling an IDE for iOS development, so of course he's going to choose the Mac, because that's where the toolchain is.
>Rob Pike wrote a blog post "Thank you Apple" (sarcasm) because of the problems he was having with his iMac and Apple's software.
Half of it is about how it's Apple's fault that he didn't have a USB drive to boot off, so he tried to boot off a (camera) CF card (unsupported) and then off an old Mac that wasn't up to running the latest OS version. As if Apple should decide which older machines an OS supports not based on the hardware specs the new OS needs, but on the needs of people upgrading without a USB drive who want to use their older machine as a FireWire device (huh?). It's the kind of complaint you read on TripAdvisor ("the bellboy didn't smile at me enough", "the bedsheets weren't the exact Pantone blue they had on the hotel website", etc.).
Yes, but he used an MBA up until then, as he writes on his Google+ page. Also, if you read the article, the only thing he likes about the Pixel is the screen resolution, and he dislikes its weight. He looks like a perfect candidate for the inevitable retina MBA.
>As for Miguel De Icaza, he's eating his own dog food, as he now makes his money selling an IDE for iOS development, so of course he's going to choose the Mac because that's where the toolchain is.*
Well, it's not just that. He also wrote a post about why he moved to OS X and how he got disillusioned with the prospects of the Linux desktop.
From now on when people ask me why I use Linux instead of OSX and why I refuse to install closed-source, copy-protected software, I'm going to send them to Rob Pike's eloquent description of his self-inflicted torture.
Self-inflicted about sums it up. He didn't have a USB drive (needed for installation), so he tried a few offbeat methods that didn't work, and he didn't have a bootable backup image (so he had to mess with a lengthy copying process). The only legit complaint is that Time Machine wouldn't let him restore from his Time Machine backup. The rest is self-inflicted.
Well, FWIW, I have had 3-4 MagSafe cords (MBA, MBP, MBPr) and have used them for five years (MBP), three years (MBA), and one year (MBPr) without issue.
I've had other parts crap out on me though, e.g. a battery that had to be replaced after 2-3 years of use. I also had an iMac (since sold) with a faulty DVD drive (also replaced).
The thing is, those things happen in ALL production runs; some percentage of units is defective. You can be Apple, IBM, Dell, or BMW, and you still get this. I've had "upmarket" IBM hard disks die on me, for example ( http://en.wikipedia.org/wiki/HGST_Deskstar ).
And, out of tens of millions of machines sold, you only read about the far fewer faulty ones on such problem forums (well, duh!) -- so it's not much to build an accurate picture on.
I like to do extensive research before I buy something (I buy lots of tech gear, from DSLRs to audio interfaces), and if I gave much prominence to the occasional forum complaint, I wouldn't have bought anything at all, because there are always people having issues with any product you search for. I prefer to stick to reviews, seeing units in action at friends' places and in the store, etc. Case in point, my latest buy, a Focusrite Scarlett interface: pages of complaints in audio forums about strange audio glitches with Mountain Lion / iMacs, etc. It has been working 100% fine for me.
The trouble with reviews on that sort of product is that almost nobody has any reason to ever write a good one, because everyone writing one is buying the thing because their old one broke. Unless you have a product that _never ever fails_, reviews on spare parts are always going to be pretty awful.
Which Apple is fixing with a free replacement policy. There was a similar glitch with 2010-era MacBook Pros, which was uncovered by a new OS or firmware version in 2013 -- three years later. They fixed mine (a complete replacement of the main logic board) free of charge, two years out of warranty.
Even their known defects make Apple come out smelling like roses. Contrast that with obvious design defects in other, cheaper PC laptops, which get ignored or refused by the third-party retailers they come from, and it makes Apple a pretty simple recommendation for power users and casual users alike. Nobody I've recommended Apple laptops to has been disappointed.
Well that's silly. "AnandTech does great reviews, unless they're favorable to something I'm against, in which case those reviews must be biased." Either you trust AnandTech or you don't. And if you do, then you shouldn't call the reviews that go against your views biased.
By "you", I don't mean you, yapcguy. Just in general towards those that are criticizing Anandtech.
I can see why. They've been quite selective in what they compare the Mac Pro to in this review, for instance, because otherwise it wouldn't have come out so well. When they're arguing that it doesn't need expandability, they compare it to laptops and desktops that aren't expandable. Yet when it comes time to justify the pricing, they exclusively compare it to workstations that are aimed at a completely different market to desktops or laptops and have far more expansion options because that market expects them. If they compared the pricing to desktops and laptops or the expansion options to workstations, the Mac Pro would look a lot worse.
They compared its price to a few workstations to show that the price isn't unreasonable for similar hardware (i.e. Apple isn't adding a $1200 workstation tax).
For the rest of the review they compared it to other Macs, because chances are that's what buyers are going to compare it with. If you want a Mac, those are your choices. I really doubt too many people who are in the market for an HP or Dell workstation are going to consider a Mac Pro.
Plus there is the problem of benchmarks. The OS can make a big difference, so you'd have to run every benchmark twice on each system (once on OS X, once on Windows or Linux) -- except the non-Macs can't run OS X. It would be a ton of extra work, and I'm not sure how much it would gain.
Again, I think the number of workstation shoppers who will consider this machine is small. I expect the vast majority of its sales will be to Mac users who want something more powerful than an iMac or a MacBook Pro.
> I really doubt too many people who are in the market for an HP or Dell workstation are going to consider a Mac Pro
Especially with the gap in Mac Pro releases over the past few years, I have seen many people deciding between a Mac Pro and a custom-built hackintosh. For the work that really justifies a Mac Pro, these were the only two options for some time (short of moving off an OS X stack). Sure, someone custom-building a multi-thousand-dollar workstation is not your typical consumer, but neither is your typical buyer of a specced-out Mac Pro.
I feel like he addresses the new design as a risk he thinks is worth taking. If it's something the target market doesn't appreciate, then ultimately this will be a failed product. But the Mac Pro is targeted at a different audience than an HP workstation: the Hollywood and animation community, where there is a lot of Thunderbolt penetration.
I always attributed this to the anti-Apple-ism of the "enthusiast" crowd, who like their machines big, overclocked, and running hot.
AnandTech fawns over a lot of products, but I suspect this is because they review products they're interested in and don't waste time on stuff they aren't. I'd start to worry if people couldn't reproduce their lab numbers, but at the moment they're considered the most detailed and accurate of the lot.
"Some say…" is the Fox News way of saying "we just made this up on the show that preceded this one".
Unless someone can actually call their conclusions and numbers into question using actual facts, then it's just anti-Apple whining, whether it's him or "some readers".
Pretty much every review of Apple gear has people moaning about bias in the comments.
I'm not saying there is any favoritism, but certainly the perception amongst Anandtech readers is that there might be.
At the end of the day, there's no reason why the writers at Anandtech would be any more immune to access journalism than other folk who have tried and failed.
Those kinds of complaints can be seen anywhere there is a positive review of Apple products.
You are actually saying there is favoritism. It's dishonest to claim otherwise. You've just suggested that Anand has succumbed to access journalism.
Why not just be honest about what you think, rather than pretending to be disinterestedly reporting the views of others?
Let's also note that the commenter accusing AnandTech of bias in that comment thread you linked is resoundingly and overwhelmingly rebutted by other readers.
> Those kinds of complaints can be seen anywhere there is a positive review of Apple products.
Those kinds of complaints can be seen anywhere there is a positive review of any product. If someone likes Google Glass, there are four hundred comments calling them a glasshole and insisting they must like Google stealing all their data. Positive reviews of the Samsung Galaxy Note drew huge criticism from the Apple sphere, everyone agog over how anyone could possibly like something so big (must be payola). And on and on and on.
Apple gives special treatment to outlets and bloggers who treat them "well" (and punishes those who don't). It's all the same inducement, and it's all just as grubby.
So knocking Gizmodo — who chose to profit from the receipt of an ill-gotten prototype — off the list of invitees is somehow just as bad as buying journalists outright?
If you want to present a boorish caricature and then snarkily knock it down, I guess congratulations? Gizmodo and the stolen phone debacle has absolutely nothing to do with this.
Having early access to products and personnel for Apple reviews is a huge coup for media outlets: they need Apple far more than Apple needs them. (If AnandTech hadn't been sent a day-0 $7000 review unit, AnandTech would have gone without a lot of views and a front page on HN and many other sites. Apple would have lost nothing -- the target market for the Pro would have just read the reviews elsewhere, and is at no risk of buying a competing workstation.)
Apple hand-selects who gets these early-access units, and thus who gets the attention of an early review. There is a strong incentive for those reviews to gently understate negatives and overstate positives, to stay on the list for the next go-around. In this case: a workstation that prioritizes the irrelevant (box size is never even a discussion point when talking about high-power workstations, but suddenly it's the primary design goal?) and has some astounding faults that most other companies would be eviscerated for. The dongle approach to expansion; power draw that at times exceeds the power supply rating; chipsets going to 100C+ because the "thermally prioritized" design was actually a "be novel and small" design; performance that even in CPU+GPU scenarios only marginally improves on a box from four years ago.
I would posit that had this box carried a Lenovo or HP tag, the reviews would be extremely negative, if not mocking.
This is not unique to Apple: exactly the same thing has happened over the years with various industry or namespace leaders, reviews veering towards the positive to ensure a spot at the front of the list for the next wave, in a perpetual cycle. There was a time when early access to Microsoft inspired a whole industry of fawning and soft-gloving.
Bizarre that gress is so desperately trying to present the notion that I'm somehow defending Samsung and their pathetic attempts at astroturfing (though that is the entire business model of the PR industry, which every business engages in, so pretending it's unique is delightfully naive). Their tactics seem very trollish, so I'll simply ignore their nonsense.
If you want to attack his "caricature" you should be able to point out some way in which it is inaccurate.
Samsung has been convicted of paying shills to make false forum postings and reviews.
You are accusing Apple of choosing who to give review units and press invites to based on who they prefer.
Every company does this. What else do you expect them to do? Provide a review unit to every blogger who asks for one? If you claim their behavior is underhand, you should be able to explain an alternative.
Comparing giving out review units to paying for shills is plainly absurd.
As to your comments about the review itself, they certainly reveal things that you would like to complain about, but none of these things were whitewashed or concealed by AnandTech, in fact they were fully exposed in statements, charts, and numbers.
Your chief complaint seems to be that you would have liked the review to have a vitriolic tone.
It's hard to believe that you are equating the fact that Apple chooses who to invite to press conferences (what else do you expect them to do?), with Samsung paying for shills to fake reviews and forum postings.
> Pretty much every review of Apple gear has people moaning about bias in the comments.
Well, yes, but any time just about anyone anywhere reviews an Apple thing and doesn't say "This is worse than iHitler", there are cries of bias, evil conspiracy, etc. People are a bit funny about Apple.
That's just them being silly, since Apple actually does make fantastic products and cares about more than the superficial marketing spec wars (the GHz race, core-count race, megapixel race, screen-size race, etc.).
It's funny how most PC review sites get into trouble for allegedly fawning over apple hardware once they start reviewing it regularly. I wonder why that might be?
Really? It's the same format that's been around since the earliest days of Web reviews. Newer sites are moving on to data that allows comparisons between anything (StorageReview was an early leader in that). Although it's not quite as pathetic as sites that post screen captures of benchmark programs, I find sites like AnandTech quaint bedtime stories in comparison.
For me the disappointing takeaway is that good high-res monitor support isn't implemented. Like the reviewer, I had assumed the same types of scaling options that the high-res MacBook Pros have.
However, according to this review, the new Mac Pro doesn't work with the new Dell 4K monitors (I don't consider 30Hz refresh as 'working'), and even with the 4K display that Apple sells, it only works at its native 3840 x 2160 at 60Hz. When choosing a 'Scaled' resolution, it renders blurry junk.
That is pretty disappointing (although I imagine it will be fixed at some point).
What I found interesting is that the system used for all the benchmarks is significantly more expensive than the systems listed on the first page: 12 cores, 32GB RAM, and a 512GB SSD prices out at $7699, nearly double the cost of the more expensive of the two configurations listed on the first page, and still 10% more than the "Most Expensive Configuration Upgrade Path" on page two -- which means your wallet will be $700 lighter.
And that's probably generous: from the GPU analysis it appears that the tested unit has D700s, which bumps the price to $8299 -- a configuration that isn't mentioned anywhere in the article. About the only thing left to upgrade on the test unit is the RAM, to 64GB.
Since the article calls itself a review, it would be better if the review unit were accurately described. It seems to me there's a bit of bait and switch, because the performance numbers presented are not for the $3000 or $4000 configurations presented in the article's lead.
The systems listed on the first page are the advertised base models from Apple. The unit they benchmarked for the review is presumably a review unit that they had no choice about.
OTOH, it's a little surprising that Apple gave out 12-core review units, since the 8-core seems to benchmark better.
They gave Macworld and The Verge, at least, 8 core units. Perhaps a case of know your audience; neither of those sites would be necessarily too inclined to go into _why_ the 12 core part is slower in many benchmarks, whereas Anandtech, of course, will.
The author chose what to write and the editor chose what to edit and the publisher chose what to publish. Likewise, Apple chose what to send AnandTech, and chose to send the influential publication an $8300 machine rather than a more common model for benchmarking.
In light of the fact that much less expensive machines often performed better, it seems obvious that the review's favorable conclusion ought to rest on some rational case.
Wow. For me, the big takeaway from this review is the benchmarks showing the iMac and MBPs beating the Mac Pro. That shows this machine is really for niche markets like professional video editing. Makes you wonder why Apple even bothers.
It's a "forward looking" architecture (to steal the line from the iPhone 5S pitch.) They're betting big on GPU compute. I wouldn't say it is for "niche markets", though perhaps today it is only useful to smaller markets, it's that this computer is going to over time get faster and faster relative to the 2013 iMac and MBPs as more apps take advantage of the GPUs. It's kind of a unique phenomenon and makes the benchmarks misleading. That said, it remains to be seen if this bet will pay off -- we could end up 5 years from now with the same small subset of apps taking advantage of GPUs as there are now.
As a mostly-hobbyist 3D artist, I'm already feeling really left out when I look at my relatively poor GPU vs. the capabilities of the rendering software that I use. If this fever has already started to seep down to my level, I certainly wouldn't predict against continued growth of the GPU computing world.
As someone doing GPGPU, I just hope they'll release an NVIDIA option. Scientific GPGPU relies heavily on CUDA support; OpenCL just isn't there yet, if it ever will be.
I would argue that at least the current model actually is for a niche market, i.e. applications that use GPUs right now. By the time more non-gaming applications outside of media editing are using GPUs more extensively, the top end mobile graphics chips will have the power of this dual GPU setup.
It's a bold bet on a possible trend, which I like a lot, since it would mean that non-gamers would profit from GPUs that would otherwise bore themselves to death in their machines. Also, it would give AMD a better position, maybe averting x86 becoming a complete Intel monopoly.
This has kind of been the claim for years now, though, with a burgeoning market of compute apps just around the corner. Only it isn't so easy, and compute applications only apply for a specific set of problems, not only because of the GPU geared restrictions and architecture of these designs, but because of the islands of memory forcing endless memory copies back and forth. Unified memory should go a long way to making compute more generally usable, though of course that does nothing for the person paying $6000 for this unit.
I wouldn't be surprised if Apple develops a library to help facilitate GPU usage, similar to how they developed Grand Central Dispatch to help developers utilize multicore CPUs more effectively.
Even better: what if Apple developed a whole language for GPU compute? They could eventually get other vendors to participate and make it an open standard. How about "Open Compute Language"? Nah, too verbose. How about "OpenCL"? ... =)
No - GCD is only about distributing workloads across CPU cores, and doesn't involve the GPU.
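To make that distinction concrete, here's a minimal sketch of the GCD side (my own illustration, not from anyone's post; the array setup is made up): dispatch_apply fans loop iterations out across CPU cores, with no GPU anywhere.

    #include <dispatch/dispatch.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        size_t n = 1 << 20;
        float *data = malloc(n * sizeof *data);
        for (size_t i = 0; i < n; i++) data[i] = (float)i;

        /* dispatch_apply runs the block n times, spread across a
           concurrent queue backed by the machine's CPU cores. */
        dispatch_apply(n,
                       dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0),
                       ^(size_t i) { data[i] *= 2.0f; });

        printf("data[42] = %f\n", data[42]); /* 84.0 */
        free(data);
        return 0;
    }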
OpenCL uses a special programming model, so you can't use it for general application code. It's good for doing repetitive operations on large arrays -- e.g. image or signal processing, or machine learning. OpenCL code will run on the CPU if there is no GPU, or if the overhead of shipping the data to the GPU is too high.
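And by contrast, here's a hedged sketch of the OpenCL model just described. The kernel name ("scale") and sizes are made up, and error checking is omitted for brevity, but it shows the per-array-element style and the fallback to the CPU device when no GPU is present:

    #ifdef __APPLE__
    #include <OpenCL/opencl.h>
    #else
    #include <CL/cl.h>
    #endif
    #include <stdio.h>

    /* The kernel: one work-item per array element -- the "repetitive
       operation on a large array" style OpenCL is good at. */
    static const char *src =
        "__kernel void scale(__global float *buf, float k) {\n"
        "    buf[get_global_id(0)] *= k;\n"
        "}\n";

    int main(void) {
        enum { N = 1 << 20 };
        static float data[N];
        for (size_t i = 0; i < N; i++) data[i] = (float)i;

        cl_platform_id plat; cl_device_id dev; cl_int err;
        clGetPlatformIDs(1, &plat, NULL);
        /* Prefer the GPU, but fall back to the CPU device if there isn't one. */
        if (clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL) != CL_SUCCESS)
            clGetDeviceIDs(plat, CL_DEVICE_TYPE_CPU, 1, &dev, NULL);

        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, &err);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, &err);
        cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                    sizeof data, data, &err);

        cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
        cl_kernel k = clCreateKernel(prog, "scale", &err);

        float factor = 2.0f;
        clSetKernelArg(k, 0, sizeof buf, &buf);
        clSetKernelArg(k, 1, sizeof factor, &factor);

        size_t global = N;
        clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
        /* Shipping data to/from the device is the overhead mentioned above. */
        clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof data, data, 0, NULL, NULL);

        printf("data[42] = %f\n", data[42]); /* 84.0 */
        return 0;
    }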
In general, the Mac Pro loses out in single-threaded performance only, since Xeons are typically one architecture behind. When the Haswell-based Xeons come out, Apple will refresh the Mac Pro line, and you'll get single-threaded parity.
That being said, I wonder what the chances are that Apple will use the same socket for the Haswell-EP CPU daughterboard. Is there anything obvious that'd prevent them from doing that (e.g, chipset compatibility)?
The line of video professionals happy to pay $10k/box to get their renders done faster screaming "TAKE MY MONEEY NAOW" might have something to do with it...
Also, since you missed all the charts showing the MP demolishing everything else by 2.5x+ on multithreaded workloads (ya know, the thing that people buy MPs for) you may want to verify your consumption of what the kids call "h8rade".
I think that is exactly the kind of niche they are looking to capitalize on. Think about the reasons to buy a powerful workstation computer. Games? You are probably going to build your own custom machine and put Windows on it. Server? Call up Dell and get a rack, or cobble together some basement stuff and throw Linux on it. The other case for needing a powerful workstation today is professional work like video editing, rendering, and music production, and Apple already does well in that market. It's clearly one of their lowest priorities, as can be seen from the fact that they updated every other product line multiple times before they came back around to the Mac Pro, but at least they seem to understand the market. If they were marketing this as a gaming machine or a server platform, they would be making a mistake.
The Mac Pro has been pretty niche for years, and has been losing out to the top-end iMac for a while on poorly-threaded stuff. To a large extent this is due to Intel's product cycle, where the many-cored high-bandwidth Xeon is at least one core iteration, and sometimes two, behind the iX.
I think many of the design/art fields that they cater to have a need for this sort of MP-capable machine. It's also probably a neat exercise in technology development -- I'm sure some nice engineering data and expertise came out of this that might "trickle down" somehow.
I think the video editing market is growing fairly rapidly, as it gets easier to take and share high quality video. Of course, most of that growth is entry level youtube-channel stuff, but as those barriers fall so will those to the professional level.
I'm glad they mention the HP Z420. The killer for me on the Z420:
three-year Mon-Fri 8-5 next business day, parts, labor, and 24x7 phone support.
They come to me.
Edit: to all the naysayers: I'm in the UK. HP here is pretty good. We have over 200 machines on next-day coverage and we've had only two (!) problems, and those related to part supply, resulting in a quick purchase on Misco that arrived the next day.
Hi. I'm an HP warranty victim; an SO's old laptop had 3-year next-business-day support. On three occasions those lying sacks of shit made us mail it to them for a week at a time. Don't believe it; your "next day" warranty is just a donation of beer money for HP execs.
When I worked at a pharmacy doing IT work, they purchased some desktops from HP with a multi year on site support deal. One day we had a computer's Hard Drive die so we called them up to fix it, which they did... and nothing more. I asked them about putting Windows back on it (as the restore partition was nowhere to be found) and the guy said "not our department."
I ended up torrenting an HP restore ISO and thankfully the BIOS's licensing key matched the ISO.
Yeah -- I spent an embarrassing amount of time trying to fix an HP laptop. The Nvidia chip didn't touch the cooling bar, I think -- it was a long time ago. Customers eventually won a lawsuit, but I wasn't notified? HP really needs to clean up their image. If any Sonoma State grads read this, forward it to the next CEO.
Hah! I had P410 cache controllers on 'next day' warranty service that took 3 weeks to replace. I swore I would never have an HP server in my data center again if I could avoid it.
Do a lot of folks here generally like Mac desktops? I've grown to like Macbooks because I do like working on a portable UNIX-y platform, and have had generally bad luck with Linux laptops in the past.
On desktops though, I'm willing to put more effort into settling software update issues/device conflicts, since I probably have to do that anyway to write performance-optimized code (depends on the exact purpose of the desktop though, but I do a lot of scientific computing). So a Linux/Windows split boot on a generic PC usually wins out. I used to do a lot of PC gaming, but that's really less of a factor now.
After running every version of Windows since 3.1, I've decided to go Mac for my desktop. I run stuff on it 24/7 and have 6 external disks, so an MBP is out of the question. The cost of external enclosures for the new Mac Pro, though, puts it out of reach ($3000 + $2400 = $5400... I can put together a nice hackintosh for $1700).
> The DIY PC route is still going to be more affordable. If we go the Ivy Bridge E route and opt for a Core i7-4930K, you get more cores than either of the options above for around $600 for the CPU. Adding in another $330 for a motherboard, $180 for 12GB of DDR3-1866 memory, $1400 for two W7000 GPUs and $220 for a fast SATA SSD (Samsung 840 Pro) we’re at $2730 for a configuration that would cost at least $3499 from Apple. That’s excluding case, PSU and OS, but adding another ~$350 takes care of that and still saves you some money.
As the AnandTech review points out, you're looking at $2700ish if you opt for hardware that makes an apples-to-apples comparison... regardless of whether you go with a hackintosh, Linux, or Windows.
I use an MBPr for general development, have an iMac close to top specs, and have a beefy Windows/Linux box. Of the three, the iMac gets the least use, and I will probably be retiring it fairly soon. For development, I really do not need a powerhouse machine.
I code with a focus on TDD and small unit tests, so even though I do mostly statistical computing work, my development tests use small amounts of data and low computing power. When I actually need to run big production stuff, it goes off to the Linux box or a Linux server cluster.
My beef with the iMac is that it is not really that much faster than my MBPr, but it has terrible heat management. When I am running big jobs on it, it gets a few degrees off from a toaster and heats up my office. A 2.5k Linux box would not have this problem and would be way faster.
Your iMac is not producing any more heat per unit of computational work than your other machines unless it is significantly older and thereby less efficient. The iMac doesn't conduct the heat away from the processors and into the room as quickly as a larger desktop would, so the chips operate at higher temperatures under load. This does not have any effect on the actual amount of energy dissipated into the room over the long run.
That makes sense to me and all... but the difference is quite noticeable. As in, I need to take off some layers if I am doing something on my iMac, but not with the others.
If I had to guess, it probably has mostly to do with the iMac sitting closer to me while I type, and with my tendency to use it for more graphics-heavy tasks (since it has the largest screen). Also, I am in the southeast, so heat changes are unusually noticeable, especially in the summer.
You are right though, the iMac probably is as efficient as or even more efficient than my other machines; it's just more noticeable when the heat is in your face instead of blowing out next to a baseboard.
Mac desktops are great if you can find a model/configuration that fits your needs. If you don't need any GPU power, that really opens up your options. It gets tricky when you want a more balanced system: you go from about $800 to $2k+ for an iMac 27" + GPU upgrade, when you could be perfectly happy with an imaginary Mac Mini + discrete GPU.
I have been running Linux on laptops for about 12 years now. The only way to do it and retain your sanity/get any useful work done is to run it inside a VM on OSX or Windows. Full screen, you won't even notice the underlying OS if you don't want to.
A 1-year warranty is such a joke for this much money. I think this and the new MBP -- where, should anything break, you basically need a full replacement -- need some serious warranty coverage changes. 3 years minimum in my mind. It's just another way to nickel-and-dime you with an almost-required extended warranty.
Apple, your stuff is mostly nice. Fix your product warranty and maybe I'll open my wallet.
> Isn't it worth it to purchase the Applecare warranty for $250 which extends the warranty period to 3 years?
payoff * P(payoff) = EV

At break-even, the expected payoff equals the premium:

3000 * P = 250

P = 250 / 3000

∴ P = 1 / 12 ≈ 8.3%
So you need 8.3% of these computers to fail in years 2 and 3, each failure to avoid $3000 worth of 'damage', and the failure to be something covered under the warranty terms, for the gamble to make sense.
Judging by the fact that they offer the warranty, I would imagine the failure rate is somewhat lower than this, or the failures are vastly less expensive than that -- else they'd be losing money.
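If you want to poke at the numbers yourself, the break-even arithmetic above fits in a few lines (a sketch; the $3000/$250 figures are just the ones from this thread):

    /* Break-even probability for an extended warranty: EV = payoff * P - premium. */
    #include <stdio.h>

    int main(void) {
        double payoff  = 3000.0;  /* repair/replacement cost the warranty would cover */
        double premium = 250.0;   /* AppleCare price quoted above */

        double p_breakeven = premium / payoff;  /* EV = 0 when P = premium / payoff */
        printf("break-even failure probability: %.1f%% (1/%.0f)\n",
               p_breakeven * 100.0, payoff / premium);
        return 0;
    }

This prints "break-even failure probability: 8.3% (1/12)".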
When you see the price of the dual GPUs, the ECC RAM, PCIe SSD, and the small design, you will see why they didn't toss in the warranty for free. You do get 1 year free warranty though and that's something. It's only $249 for the extra two years which is worth it in my opinion.
ECC isn't that much more expensive, while AFAIK the price premium for pro-model GPUs is almost pure price-gouging, and Apple's surely not paying those inflated prices for the GPUs.
I think Australians get 2 years free as per the legal requirements. However I suspect they increase the overall price to compensate so we are not really better off.
I just want to make something clear:
In the EU, during the first 6 months the seller/manufacturer has to prove that you did something wrong when there is a problem with the device. After the first 6 months the roles are reversed and YOU have the burden of proof. In most cases you cannot prove that it is the manufacturer's fault. There are some cases, however, where it clearly is: serial faults.
Interesting they compared the new Mac to a workstation.
Older workstations these days are a lot more affordable and can easily be upgraded. With most MACS you're stuck with what you get.
Case in point: I just purchased an HP 8400 workstation for a friend. $320 for a dual-processor 2.6GHz quad-core Xeon, 16GB RAM, two 320GB SAS drives in a RAID config, and an ATI FireGL V7350 1GB video card. Sure, it's a pig and isn't the quietest PC in the room, but it completely shreds anything I could find in a retail setting.
Really? $320 for a new machine with those specs? That graphics card alone cost more than that when it was new; did the CPU/storage/power supply come at a negative price?
If this was true, it would really explain HP's financial situation /s
I suspected as much, but then it's silly to bring it into a cost/performance comparison against a brand-new top-of-the-line 2013 desktop. Yes, they are upgradable, but by the time you upgrade one to modern spec it will cost a lot more than $320.
Writing "Mac" in all caps as if the writer obtusely believes it to be an acronym for something is part of the style manual for passive-aggressive platform debate.
Note that the D700 specs closely match the R9 280, 280X, or 290, cards which sell retail for $349 to $449 or so each (the Mac Pro has two of these in the 2x D700 config). See https://en.wikipedia.org/wiki/AMD_Radeon_Rx_200_Series . The 2048 shader count would match the 280X, I think (if I am reading the chart right).
The W7000 is the much more expensive "pro" version, which has ECC RAM on the card and much lower sales volume.
It's funny though, I remember as a kid thinking how cool it was that you could have a backpack-able computer (the original Macintosh could be ordered with a padded backpack). Now, at 11 lbs and not very large, you could almost do the same again!
Were it not for the current shortage of 280X cards due to Litecoin mining, one could build a similar Hackintosh for quite a low price. Though you'd be stuck with 3GiB of memory per card rather than 6GiB (at least until MSI releases their 6GiB 280X, which they have listed on their site).
What bugs have you found? Have you reported them to Apple?
I haven't found any since the 10.9.1 update. They fixed the Quick Look slowness bug, which was my only complaint. Mavericks is one of the most polished OS X releases I have used, behind Snow Leopard.
(not OP) I have reported 4 bugs to Apple about Mavericks, and one (QuickTime) was fixed in 10.9.1. The other 3 are in the Finder; they first appeared in 10.9 and were not fixed in 10.9.1. Mavericks has the buggiest Finder I have seen in a 10.x release. It is a pain to have to restart the Finder multiple times a day.
Move some picture files from one folder to another with preview turned on and watch the Finder hang after about twenty files. It is damn near unusable at this point. I gave up on big sorting jobs and just have a button on the Finder toolbar that opens a terminal in the folder the window is showing, and do copies that way.
I have a weird issue where, after upgrading, opening the Activity Monitor causes a process called 'systemstats' to balloon to >1.5GB memory and peg the CPU. Doesn't seem to be just a startup thing, because it does it indefinitely until I kill the process (closing Activity Monitor doesn't fix it). Googling around turns up some other people with that issue but no obvious diagnosis.
I would say that at least the Finder in Mavericks is "buggy as hell"[1]. It looks like it has real issues with preview images (when a file is dragged to another window), AppleScript, and updating in response to file system changes.
Mavericks is absolutely a regression. My Mac Mini i7 (2013), for instance, will regularly wake from sleep to a black screen. If you are careful you can still log in (hope the focus is in the password box, type your password, and hit return), but this has been there since the Mavericks release. Many people have reported it. Still there. It is one of a long list of bizarre oddities and system quirks that simply didn't happen pre-Mavericks.
Does anybody know how much work is required to run Linux on this thing? I read some documents but found them complicated enough that I don't want to deal with it on my daily computer. There also seem to be driver problems related to thermal issues. So I've ended up with virtual machines every time I've considered installing Linux on a Mac.
Anybody running Linux on Macbooks or Mac Pro? Does it work well?
I run Linux on a MacBook Pro (late 2013). It works remarkably well considering this hardware came out a few months ago. The only things that don't work are the Thunderbolt ports (they work for everything except monitors, which crash the machine after ~10 minutes).
1. People hate Apple for whatever they do.
2. Those who said the Mac Pro makes no sense have absolutely zero understanding of the GFX market.
3. People who cry about money vs. what you get to buy and build have absolutely zero understanding of engineering trade-offs. (Which I expect HNers to have, even if only from a software engineering perspective.)
I wonder if there's a future for unified thermal cores in non-Mac systems? It seems like a pretty big win, but it looks hard to coordinate around a design if you're not Apple doing everything in-house.
How would it be hard to coordinate? You don't even really need a single thermal model, you just need a standard heatsink layout for parts to align with.
Which is complicated when you are trying to build hardware and realize "oh crap, the standard thermal layout leaves no room for us to actively cool <insert-x-part>; how are we gonna keep this thing from overheating?"
The way we have our computers laid out today - vertical PCIe cards, CPU sockets with either the AMD snap brace or the Intel screw-in backplate heatsink - is entirely arbitrary, but it persists because nobody wants to be the one to throw out 10+ years of expansion card compatibility.
The slots thing is a design decision. Apple doesn't think people need them, and I think they may be right. The Thunderbolt connectors are wired directly to the CPU, without even the north bridge in the way.
I remember when they said people wouldn't need ADB or SCSI. Or floppies. Or serial ports. It worked out.
Is your 5 GHz CPU a Xeon with ECC? The RAM can be upgraded (those are options), the SSD can be upgraded (1 TB is an option) and is faster than SATA, and you can attach as many disks as you want via Thunderbolt arrays. Did you get dual GPUs for that price?
It's a nice looking machine, very quiet, and very innovative. Maybe it will be a misstep, but I'm glad Apple is trying something interesting. I want to see what happens with this.
Thus far, they've made correct but initially mocked decisions on the necessity of floppy drives, optical drives, Ethernet ports, removable batteries, and physical keyboards.
The one strange assumption that most Mac Pro reviews start with is a baseline of two compute cards -- that the Mac Pro is competitive when compared with other machines with two high-end compute cards.
But I don't want two high-end compute cards, and I suspect that many who are trying to convince themselves that they'll benefit from it will gain no value from it.
For many, many workloads, modern compute still represents an iffy proposition (at the price levels being talked about, the Xeon Phi would almost certainly represent a better proposition). With unified memory things might get more workable, but as is it remains a relatively fringe benefit, and it seems odd that the entire value proposition of the machine relies upon it.
In which case, do you want a Xeon workstation of any sort? As mentioned later on in the review, you make significant sacrifices for Xeon (startlingly expensive, last-gen cores), and, besides the option for more cores than you can get on an iX, the main thing you get is extra PCIe lanes, which are not actually that useful for most things; one of the few things they _are_ useful for is dual hefty GPUs.
ECC memory is a big one. A Xeon workstation usually comes with SMP, though not the new Mac Pro. Big memory support. Lots of PCIe lanes. Usually lots of space to drop in extra storage, and a couple of 10GbE ports (the Mac Pro has just 1Gbps ports, which is another oddity).
There are a lot of traditional reasons a so-called workstation features a Xeon.
Worth noting that there are a couple of Haswell (and therefore AVX2-supporting) Xeons -- the E3 v3s. Unfortunately they're the baby ones, so they have ridiculously low max memory, no SMP, and max out at 4 cores. Hopefully the E5 v3s are out soon.
I honestly don't get how the Mac Pro hasn't gotten more mainstream criticism. It solves a problem that I don't remember anyone ever having (honestly a garbage can form factor seems like more of a nuisance than the flexible cubes we're all used to), while bringing a ton of problems to the table, and being a massive sunk cost for fixed hardware that is going to be outdated very, very quickly.
For the Mac Pro target market, 2xGPU combined with massive memory bandwidth is a blessing. Final Cut, Photoshop, and other creative software are GPU accelerated.
The Xeon Phi is significantly more expensive, draws significantly more power, and is significantly less useful for creative software loads.
Which is what? It's just anecdotal, but most of the people I know with Pros got them as high-end development workstations, for building iOS apps, etc. Final Cut Pro is pretty much the only app that benefits from the dual GPUs, and even then the gain is relatively marginal over a four-year-old machine. And that's paying for very high-priced "workstation" GPUs (I called them compute cards because that is what they are geared for, though as with all compute cards they are derivatives of GPUs; they're really price-ineffective as GPUs), when you can get almost all of the same advantages from a basic ATI card for a couple hundred dollars.
Virtually every review of the Pro seems to be giving it a very soft glove approach for some reason. It is an enormously expensive monument to the dual compute GPU, for marginal gains in most apps.
Buying a Mac Pro to "build iOS apps" or even to do heavy Photoshop work is a waste of money. The people that bought them for those reasons did not make an educated decision. A 15" rMBP or iMac is perfectly fast for those tasks.
The Mac Pro is a reasonably priced dual compute GPU workstation. But it's enormously expensive when compared to what most people actually need.
I'd say that if you are doing 4K Final Cut Pro work or writing your own OpenCL software, this machine is for you. In the future it may also suit people using GPU-based 3D renderers (I don't think there are any great ones on OpenCL at the moment).
Aside from the above niche target market, the Mac Pro target market doesn't exist yet. Apple seems to be using this machine to push the development of OpenCL and to push the development of "Pro" software for Mac OS X in the direction they want to see it go.
Personally, I think it's a very exciting direction to take pro software. I hope many developers jump on board. Once that happens I think we'll start to see a larger target market for the machines.
Are you debating nomenclature? One of the GPUs is effectively dedicated to compute, while the other is architecturally chosen to optimize compute. These cards are geared to compute and not graphics.
Your post was unclear and I asked what you meant. I didn't debate anything. I was extremely clear and explicit about this: I asked a question, said I didn't know, said I was curious, and didn't dispute anything you said.
Only a ___domain expert could be expected to automatically know whether double GPUs are effectively the same thing as double CPUs or not. Apple has put a lot of effort into making their GPU structure effective (in general, not just for a few special cases), and I didn't know if you were saying they'd failed or not. I know parallel processing is hard, but I also know Apple has smart people who've worked on it. I don't know things like whether "lack of unified memory" is a problem for Apple's design too, or not. It seems completely plausible to me, not knowing the ___domain that well, that Apple could have something that works well in general, or not -- I don't really know, and you didn't say; you just assumed your reader would somehow know what you meant (which will basically only work for people who already know your point and have no need to read your comment at all).
His entire point boils down to "GPU Compute (of all forms) is overweighted in comparisons between the Mac Pro and other workstations".
He got antsy because, from his perspective, your question seems to come out of the air. In his original comment, he's basically implying that GPU-based compute solutions of any kind (plain gaming-optimized GPUs, "professional" GPUs, or compute-optimized GPUs) aren't worth it for the majority of use cases.
The "second card" he talked about was the Xeon Phi, and he makes the differentiation between it and other "GPU based" compute solutions since Xeon Phi consists of a "large" (sub 100) number of relatively simple, but full blown CPU cores (for example, the current Xeon Phi is based on an old Pentium core, the next gen is supposed to be basically an Atom core). This should, in theory, make it easier to exploit parallelism.