
About the Author

From Suffolk, UK, Rachel Berry has been a Datacenter Ecosystem Solution Architect since joining NVIDIA in January 2016. Having never worked for a GPU or hardware vendor before, Berry began her career as an astrophysicist in academia, then became a CAD kernel engineer (on the Parasolid kernel at Siemens PLM) working on applications such as SOLIDWORKS, Siemens NX, and Ansys Workbench. She eventually moved on to hypervisor and VDI engineering, including virtualized GPUs, at Citrix, working on XenDesktop/XenApp and XenServer. Berry’s background and experience are in enterprise software development, and because of that experience and her passion for the field, she primarily follows CAD and 3D blogs. We are publishing her article in three parts. Parts II and III can be downloaded at the end of Part I.

The great “APIs, GPUs, and drivers: CAD graphical conspiracy” Part I

A few weeks ago I saw a new post from Ed Lopategui at GrabCAD, a 3D-printing/CAD company (I’ve blogged about them before – awesome company!), entitled “APIs, GPUs, and drivers: CAD graphical conspiracy?“. Ed’s customers are often the same customers the GRID GPUs and virtualization technologies I work on are designed to serve: professional graphics fit for the enterprise, run from the cloud/datacenter on server GPUs and delivered to mobile devices. Ed is also someone I once was, or would have been if career paths had been different, and we share a lot of the same “upbringing” in CAD (as well as previous employers). I think Ed and I probably share the same insight into just how stringent the requirements and how high the expectations are that high-end automotive and aerospace customers place on tier 1 CAD suppliers. I’d like to think I knew exactly what Ed’s customers would demand in terms of support, reliability, and traceable process, as well as product quality and testing, before risking putting a supplier’s product in their environment, so it was very interesting to read Ed’s take on GPU pricing.

Basically, Ed echoed a view I’ve heard before: that because professional cards (NVIDIA cards like the GRID and Quadro product lines) cost a lot more than consumer ones (gaming cards such as the GeForce lines), there must be some sort of cartel/conspiracy to charge professional users more than the product is worth – a price inflated above the manufacturing cost relative to consumer cards. Ed was writing on the GrabCAD blog, and GrabCAD is a company heavily involved in 3D printing; given that context, I was even more surprised by his views regarding the value of professional drivers. My own experience of drivers in 2D printing has been that the quality control, certification, and testing are woefully below the standards enterprise manufacturing demands from its existing software, and I suspect that, long term, this will be one of the biggest challenges to adoption of the technology. At the moment, mainstream printer firmware and driver updates regularly cause memory leaks and material changes in behavior of a kind that has long been unacceptable in most software, let alone in manufacturing. Microsoft introduced an entire driver isolation model to protect servers from rogue drivers, and the drivers even have a nasty habit of interacting with one another. Is 3D-printing regression control even halfway near the standards it needs to be to ensure a 3D-printed part can truly be trusted to be manufactured the same, version after version? If behavior has changed significantly since the simulations and physical stress tests were run, should that replacement part still be used?

Printer drivers seem stuck in a dark age: an added bit of software needed to get the main product working. They are usually fairly lightweight and perform a relatively limited range of operations, yet still, on average, they are… flaky as hell. GPU drivers, on the other hand, are used and developed in the same way as enterprise software. The nearest product I would compare them to is probably a hypervisor, where there is both hardware and software interaction. For professional graphics, this involves designing and optimizing functionality for specific applications and even OSs – e.g. CATIA, Autodesk Revit, Petrel, Linux OSs, and both Windows workstation and server OSs.

I work for the NVIDIA GRID product group, and although I’ve only been here a few months, I’ve met hundreds of people working on software for professional graphics – and precisely nobody involved in raw silicon GPU design or manufacturing, and nobody working on consumer gaming cards. Computer games tend to share a similar architecture and run on similar devices, and users expect to replace their cards frequently to satisfy their gaming habits. My division is staffed by people doing jobs very similar to those I did back at Siemens PLM on CAD or at Citrix on XenServer.

If NVIDIA wanted to make just graphics cards, I’d have been a pretty pointless and useless hire. I don’t play computer games – they are a bit boring – never got it! However, NVIDIA produces professional and enterprise graphics, so there are numerous folk like me who hopefully have a clue about the impact of NX lightweight faceting, CATIA v4 vs. v5 NURBS, or H.264 artifacts on hidden-line CAD. Essentially, I am Ed’s GPU conspiracy… I and my colleagues, and the work we do, are the reason a professional graphics card costs a lot more than a gaming card. We have vast armies of people who spend their time working on:

  • Regression testing thousands of 3D graphical and CAD applications

  • Teams of people on joint development, test, and documentation projects with CAD ISVs, optimizing APIs for geometries even weirder than CATIA v4

  • Support staff who can actually use CAD applications and reproduce issues

  • Certification programs with CAD vendors

  • Development with hypervisor, virtualization, and remote protocol vendors such as Microsoft, VMware, Citrix, and NICE

I have a huge amount of respect for Ed as I know he has a commitment to enterprise quality burned into his soul. I’m also a long-time fan of his CAD blogs. I just want to persuade him that, “Hey! CAD bunnies like me need to be in professional graphics!!! My job is worth paying for!!!” Now that I’ve become terribly over-sensitive on this issue, I keep seeing tweets popping up both for and against the conspiracy. It’s worth reading the various views and experiences in the comments by readers of Ed’s conspiracy blog here.

GPU Drivers are serious software

I actually kind of get where Ed is coming from: the way GPUs have historically been sold, and the fact that in consumer gaming land they are still very much a hardware purchase, only helps fuel the conspiracy theories. However, CAD is probably one of the industries with the most precedent for paying for software functionality, rigorous testing, and certification.
Consider the Parasolid kernel: the same modeling kernel is licensed within low-cost viewers, mid-range CAD packages such as Solid Edge and SOLIDWORKS, and high-end Siemens NX. It’s available in a variety of editions at different price points, with lower-cost versions allowing use of only limited subsets of the APIs. This is a win-win:

  • A single kernel is tested and developed, so all QA effort is focused on a single product

  • Those products that consume the costly-to-support-and-develop APIs – class A surfacing, non-manifold Booleans, tolerant geometry – essentially fund really high-quality support. Ed’s argument that professional drivers should be given away with gaming cards is a bit like saying Dassault should give away CATIA to every SOLIDWORKS user.

  • Lower-cost products are available on the market, e.g. CAD viewers, which would not be economically feasible if they were forced to bear the average support and development costs of the full feature set and support organization. Gaming cards are essentially that lower-cost product. It’s not that professional graphics users are being overcharged; it’s simply that gamers are getting something that is a lot cheaper to make, support, and develop.

Would you run your CAD software unsupported? Or Microsoft Windows?

The model of loading the cost of software development into the price of the GPU card is, in my opinion, flawed. Enterprise software is about testing and support, as well as interoperability with other products and hardware. If that cost is loaded into the GPU, it can in turn be marked up by server OEMs, leaving the user paying more to an OEM for margin that is never allocated to the development or support of the GPU software – the main product a professional user actually requires. A GPU without drivers and software development is just raw silicon: a rather expensive paperweight or brick!
The sophistication of the software and support needed for GPUs today makes them comparable to an OS or hypervisor within the graphics stack. I simply can’t imagine any serious enterprise being willing to run their Microsoft OS, VMware stack, or CAD software unsupported. The idea of just buying a GPU as a piece of hardware and having no guaranteed way forward if there is an issue with the driver seems a complete anomaly.

CLICK HERE TO DOWNLOAD THE ENTIRE ARTICLE, INCLUDING PARTS II AND III

For more information on BOXX GPU dense workstations, fill out the form here and one of our performance specialists will reach out to discuss your needs and workflow.