Blender GTX980ti

So last year I upgraded from my old EVGA GTX580 Classified to a Palit GTX980ti JetStream.  I expected a massive increase in Blender performance, both in preview rendering whilst working on models and in final render times.  Since I've been incredibly busy on other projects, though, I've not had much time for working in Blender.

I ran a few benchmarks and was more than a little disappointed with the results.  Using the BMW Blender benchmark file I was getting render times that were in fact slower than on my old GTX580.  Since I wasn't doing any serious work at the time it wasn't a huge problem, but I've now got a bit of time to look at this again.  It turns out there is a known compatibility/performance issue with this card.  The issue is not limited to a single manufacturer and seems to be related to the GM200 chip used in the 980ti and Titan X cards.

This issue is now being investigated by the Blender developers and you can keep up with developments on their bug reporting site here.

Update!

OK, so we've actually seen quite a bit of movement on this issue, both from Blender and from NVidia.  NVidia have reproduced the issue and are looking into it.  If they make additional improvements on top of the gains already landing in the Blender nightlies, things are looking very good indeed.

I've just performed a new test with Mike Pan's BMW benchmark file.

With the very latest NVidia driver (368.22) and the latest Blender nightly (2.77.0 – nightly – Thu May 26 04:46:46 2016), my time has drastically improved from 2:10 down to 1:01.  Given the rapidly moving world of NVidia drivers, we may well see further improvements on these times.
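For anyone who wants to compare numbers, the benchmark can be run headlessly from a terminal.  This sketch assumes blender is on your PATH and BMW27.blend is in the current directory; Cycles prints the render time to the console at the end of the run.

```shell
# Render frame 1 of the benchmark in background (no UI) mode;
# the // prefix writes the output image next to the .blend file
blender -b BMW27.blend -o //render_ -f 1
```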


Pixar RenderMan now Free for Non-Commercial Use

Wow, you can now get your hands on a non-commercial version of RenderMan for FREE.  Pixar has been selling RenderMan for a long time, and to be fair the asking price of £1200 was pretty damn reasonable for such an amazing piece of software.  Nowhere near as good as FREE though.

The question now is whether your favourite 3D application is supported and integrated.  Blender isn't at the moment, so no dice there unfortunately.


Blender – Default Cubism – Cube a Day(ish)

I was recently reading some blogs I subscribe to and came across a great idea for learning and developing some interesting skills in Blender. The idea was penned by Mike Pan and you can read about the rules here. The basic idea is that you make something visually interesting using just the geometry provided by the default cube, nothing else.

Basically I'm going to have a go at this myself and see what I arrive at. I'm certainly not going to find the time to do this once a day, every day, but I'm going to keep going for a while.

To be fair, the first one I've completed (currently rendering as I blearily type this over my first coffee) is complete plagiarism of one of Mike's, but since I am such a Blender noob I see that as my starting point for some inspiration. Mike knows his stuff and has done some seriously cool things with the default cube.

I've always found that my best creative efforts, be it photography, music, graphics or even code, occur when I'm given strict rules and boundaries to work within. I think that taps into some form of ingrained survival instinct that promotes creative thinking. Anyway, it's going to be interesting to learn just what can be done with so little.

[Screenshot: Blender 2.73 default file]


Compiling Blender on Mac

So I decided that after all these years I would like to start looking at and contributing to an open source project.  And since I’ve been using Blender a lot lately this is the project I’m going to get stuck into.

So … how to build Blender.  I've had a few problems getting this to work on my 2013 MacBook Pro, so I'm going to keep track of all the issues and solutions I needed in order to get things working.

I've been following the Blender documentation here:

http://wiki.blender.org/index.php/Dev:Doc/Building_Blender/Mac

Getting all the source code from git and svn was very simple, so no problems to report there.  The first issue I encountered was that the CMake installation was broken and didn't register itself on the system correctly.  As soon as I got to these commands everything started failing:

cd ~/blender-build/blender
make

The error I was getting was "make: command not found" (Error 127). I cannot find the post that helped me get past this issue at the moment, but I will update this post as soon as I track it down. The point is that I eventually gave up trying to get these commands working: I kicked off the build at about 23:30 and it was still hung, doing nothing (no error messages either), at 8am the following morning. Beachball of death …
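For what it's worth, the usual suspects on a Mac are the Xcode command line tools not being installed (xcode-select --install provides make) and a drag-and-drop CMake.app not registering cmake on the PATH. A quick sanity check worth running before kicking off another build (this is my own suggestion, not the fix from the post I lost):

```shell
# Verify the basic build tools are actually visible on the PATH
# before blaming the Blender build itself
for tool in make cmake svn git; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "$tool found: $(command -v "$tool")"
    else
        echo "$tool MISSING"
    fi
done
```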

So I moved on to building Blender in Xcode, using the project files generated with CMake. The build completed in about 5 minutes and launched straight into Blender in debug mode. I may go back and find out why the make command failed when Xcode was fine, but to be honest I'm not sure I'll have the time (or the inclination) at this stage.
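For reference, the Xcode route looks roughly like this. The directory layout is the ~/blender-build one assumed by the Blender wiki page above, and cmake's -G Xcode generator is what swaps the Unix makefiles for an Xcode project:

```shell
cd ~/blender-build
mkdir -p build_xcode
cd build_xcode

# Generate an Xcode project rather than Unix makefiles
cmake -G Xcode ../blender

# Build from the terminal, or open Blender.xcodeproj in Xcode instead
xcodebuild -configuration Debug
```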


EVGA GTX580 Classified Overclocking – Blender Benchmarks

Over the last few weeks I've been playing around with MSI Afterburner and EVGA Precision x16 to see what I can meaningfully squeeze out of this card before plunging into the land of the GTX 900 series cards.  I've had this card in my current machine since January 2012 and it has served me very well.  The vast majority of its life has been spent "idling" at the stock GPU clock of 855MHz, which to be fair is a solid overclock over the reference GTX580 specification anyway.  What little gaming I do it has munched through, and it's only since starting to work with Blender more recently that the GTX580 has begun to look a little long in the tooth.

The 900 series cards were announced in September 2014 and they really do look like an immense upgrade over the GTX580 in terms of raw performance per watt.  When the GTX580 Classified was released it really was one of the highest performing single GPUs money could buy (whilst retaining ownership of all one's organs).  I needed the GPU acceleration for a lot of video editing work I was doing, and in that respect it was an incredible investment.  Yes it was expensive (£550), but it was and still is a total workhorse of a component: it still plays recent high-end games at max settings without any fuss, which is impressive to say the least considering the GTX580 reference designs are now 5+ years old.

So anyway, I have been playing around with the Classified and thought I'd write this all up so that when I do decide to upgrade to a 900 series card I can rerun these benchmarks and see what my cold hard cash has bought me.  The overclocking settings I have been playing with are pretty rudimentary really: I'm simply using the software tools provided by EVGA and no special hardware, as I'm still far more preoccupied with stability than pure blistering speed.

Test Machine

The host machine specifications are:

EVGA GTX 580 Classified Blender Benchmarks

The results of the various benchmarks I ran can be seen in the table below:

File                            OC Settings         Tile Size   Processor   Time
BMW27.blend                     N/A                 16×16       CPU         06:10.03
BMW27.blend                     N/A                 256×256     CPU         08:05.94
BMW27.blend                     Stock               128×128     GPU         02:08.91
BMW27.blend                     Stock               256×256     GPU         01:49.18
BMW27.blend                     GPU +50 Mem +50     256×256     GPU         01:43.82
BMW27.blend                     GPU +50 Mem +100    256×256     GPU         01:42.05
BMW27.blend                     GPU +50 Mem +100    512×512     GPU         01:42.71
BMW27.blend                     GPU +50 Mem +100    16×16       GPU         33:16.59
BMW27.blend                     GPU +55 Mem +100    256×256     GPU         01:41.58
BMW27.blend                     GPU +55 Mem +125    256×256     GPU         01:40.77
BMW27.blend                     GPU +60 Mem +125    256×256     GPU         01:40.63
BMW27.blend                     GPU +70 Mem +125    256×256     GPU         01:40.20
BMW27.blend                     GPU +75 Mem +100    256×256     GPU
BMW27.blend                     GPU +75 Mem +125    256×256     GPU         01:40.63
cycles_bench_272.blend          Stock               256×256     GPU         12:01.27
cycles_bench_272.blend          GPU +50 Mem +100    256×256     GPU         11:12.48
sponza_cycles_benchmark.blend   Stock               256×256     GPU         10:12.55
sponza_cycles_benchmark.blend   GPU +50 Mem +100    256×256     GPU         09:47.31
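Converting the mm:ss.ss times to seconds makes the headline gain easy to quantify: the stock 256×256 GPU run (01:49.18) against the same run at GPU +50 / Mem +100 (01:42.05) works out at roughly a 6.5% saving.  A quick shell sketch of the arithmetic:

```shell
# Convert a "MM:SS.ss" render time into seconds
to_secs() { echo "$1" | awk -F: '{ print $1 * 60 + $2 }'; }

stock=$(to_secs "01:49.18")   # stock clocks, 256x256 tiles
oc=$(to_secs "01:42.05")      # GPU +50 / Mem +100, 256x256 tiles

# Percentage improvement of the overclocked run over stock
awk -v s="$stock" -v o="$oc" 'BEGIN { printf "%.1f%% faster\n", (s - o) / s * 100 }'
```

What really jumps out of the table, though, is that tile size dwarfs the overclock entirely: on the GPU, going from 16×16 tiles to 256×256 cut the time from over half an hour to under two minutes.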

The end result of all this is that I have a rock solid +50 on the GPU and +100 on the memory clock, not bad.  My specific card has an ASIC score of 85.4%, which is damn fine, and I didn't see the GPU temps rise above 71C, which is nice.  The final overclocking settings look like this:

[Screenshot: final overclock settings]

So what now?

GTX 900 Series

After having had a good look over the 900 series it's a no-brainer to decide to upgrade.  The decision is then between the 980 and the 970.  The last time I made a decision on a graphics card I had cash to burn, so I blindly bought the card I wanted and cost definitely came second in the decision process.  Hence I bought the 580 Classified and was a very happy shopper.  The problem was I bought it thinking I would be buying another in order to go SLI and have a frankly epic amount of GPU horsepower to call on when needed.  But cash comes and goes, and I never found myself able to justify shelling out an additional £550 for a second 580 Classified.

It just never happened …

Considering that would have been an investment of £1100 on GPUs alone, I always backed off from clicking that "buy it now" button.  Only when doing some intensive video work did I feel I'd like the extra grunt.  Anyway, I'm not making that mistake again, so this time I'm going to buy both cards at the same time and go SLI immediately.  To that end I've decided not to go for the bleeding edge of single-GPU insanity and instead rein in the spending.  That said, the sheer processing grunt of a dual GTX 970 setup in SLI configuration will be mind-bending.  The most incredible thing is that two 970s will come in at around, or a little more than, the price of a single 980.  It's just a total no-brainer.

I really haven't thought about which 970s to get yet, as in reality I don't have the cash right now, and I'm also waiting to see what is announced over the next few months.  Waiting for the 8GB GDDR5 cards is also a no-brainer.  Since the RAM on the GPU is so important to Blender, it makes sense to wait a little while until the larger capacity cards hit the streets.  Seeing as the 900 series third-party overclocked offerings appeared very near the release of the NVidia reference cards, it should follow that the larger 8GB third-party cards will appear just as quickly.

So, 2x 8GB GTX 970s in SLI it is then …

 


Blender Shenanigans

The other day I decided to get myself up to speed on all things Blender. I've not used the app very much, though I had done some simple rendering work for a couple of clients in the past. Anyway, I decided to download the latest version (2.72b) and have another look.

WOW …

This app has come on in leaps and bounds since I last used it. I always felt the UI suffered from a lack of finesse (just like Gimp, really); it was itself a blocker to getting anything done unless you were prepared to spend a long time just getting to grips with it. The new UI really is very nice. I've also put together a couple of renders and a nice little physics demo animation, which I'll post soon once they're completed and uploaded.

All I can say is that right now I'm really digging Blender in all its free glory. Really impressive stuff, and it turns 20 years old in 2015!
