The GDC (that’s the Game Developers Conference by the way, not the General Dental Council) is the big industry-focused event (unlike more consumer-oriented events like E3) and as such is the perfect place for the major players to showcase new technology and make big announcements. In the first few days we saw a flurry of exciting announcements that not only affect the game industry but also cross over into other areas (such as CG). With that in mind, here’s a quick rundown of the news from last week’s event:
Epic give away $12M worth of assets for free!
Yep, you read that right. Twelve million dollars’ worth! Why would they do such a thing? Well, let’s have a bit of backstory first. For the last few years Epic have been developing a game called Paragon, a new take on the MOBA genre; its key feature was bringing the action closer with a third-person perspective (as opposed to the traditional isometric views favoured by Dota and LoL), with more emphasis on action and some truly stunning graphics. The first announcement trailer two years ago absolutely blew me away; as a CG nerd I found the detail in the environments and characters astounding, and at 4K it was just a joy to watch (plus the soundtrack helped). Unfortunately for Epic the game never quite took off as much as they had hoped, and when they stumbled on the magic formula of Fortnite + Battle Royale mode (which continues to go from strength to strength, ironically unseating League of Legends from its throne atop the Twitch viewer rankings) they decided to pull the plug on Paragon and focus development on Fortnite.
I thought this was a real shame; they had created such a rich world and array of characters with Paragon that it seemed madness to just give it all up. But it looks like the years of work on Paragon aren’t going entirely to waste, as Epic announced that they were uploading it all to the Unreal Marketplace for developers (such as us) to download and do what they like with for free: character models, animation rigs and blueprints, voice assets and, of most interest to us, the environment assets used to create the world. This is another great move from Epic, who continue to develop Unreal Engine in new and interesting ways and support their ever-growing community with top-tier free content.
The advent of real-time raytracing.
The oft-touted ‘holy grail’ (seriously, almost every article on this subject will say that) of the CG world, real-time raytracing, was unveiled. The first reports came out on Monday with Nvidia announcing support for Microsoft’s new DirectX Raytracing (DXR) API along with Nvidia RTX technology. DXR will run on older cards (and we’re looking forward to seeing what the performance will be like on our Titan X cards) but RTX will only be available on the new Volta cards, which, at the time of the announcement, were still a mystery! The initial demo from Remedy Entertainment, running on their Northlight engine, was impressive but, unfortunately, a little disappointing. While some of the detail was good and seeing the complex lighting changes in reflections was great, there was just way too much noise on the glossy reflections for us to get properly excited about it (although Nvidia has announced an update for the GameWorks SDK which will include a denoiser to cope with this issue). That was until the Epic ‘State of Unreal’ keynote.
The Star Wars showstopper.
Epic had a full day of impressive keynotes lined up for the Wednesday, but everything was kicked off in the morning with their ‘State of Unreal’ session. In this they looked at some of the great work produced with the engine over the last year in a variety of disciplines, and covered impressive new work such as Siren, an Andy Serkis mocap performance, and AR work from Riot. Then a chap called Mohen Leo from ILMxLAB came on stage to talk about how they used Unreal to build the ‘hyper-reality’ experience ‘Secrets of the Empire’, which was all great, but it was what he said next that made jaws hit the floor! He announced that they had been working together with Epic, Nvidia and Microsoft on an experimental project showcasing real-time raytracing and, after a brief intro, they brought up a truly stunning demo: changing area lights casting beautiful ray-traced soft shadows from three stormtroopers in a scene. Adjusting the light position, size and colour updated these shadows on the fly (with the view controlled by an iPad running ARKit, linked to the rendering PC), all with no noticeable flicker, noise or edge artifacts. The scene was then updated to a more complex environment with moving lights to show off surface reflections, before they brought in a much shinier character (three guesses who it was) who walked into a different position so they could show off interactive adjustments on a glossy floor, which went from a practically mirror surface to almost completely diffuse and then somewhere in between; all while the camera position was moving and all in real time. To say that Monday’s scepticism was banished would be an understatement!
Of course this was just a tech demo, put together by some of the best artists in the world and running on kit out of reach for mere mortals (it was driven by an Nvidia DGX Station running four Volta GPUs, which costs an eye-watering £86,000!), and it remains to be seen how, or if, any of this will run on more feasible setups. But it represents the beginning of a move towards faster iteration and development in CG, and we think that’s a great thing. Our initial thoughts are that the reflections are great, but for the majority of projects (and considering that cost) standard reflection maps will be fine; we are excited about those shadows and area lights, though, as that is one area currently lacking in UE.
There was a lot more at GDC to get excited about this year. The release of Unreal Engine 4.19 was kind of drowned out by the bigger announcements, but it too boasts an impressive list of features and updates (certainly when you compare it to the recent feature list for 3ds Max; *sigh* oh dear, Autodesk), and 4.20 is around the corner, bringing a lot of the features announced at GDC with it. It feels like a really exciting time to be working in CG and we can’t wait to see how we can develop all of this to produce bigger and better 3D experiences, animations and visuals!
You may have noticed the addition of a new badge to the sidebar of the site. Down there it sits, positively beaming its ‘Google Street View Trusted’ status to all who wish to look at it. But what’s that all about then?
Simply put, it means we have been through the accreditation process with Google to be recommended as 360 photography professionals and get listed on the Street View professionals directory. The process requires you to upload over fifty 360 images to the platform that meet the eligibility criteria for quality, stitching accuracy and resolution. To us it’s nice to be recognised and have something to show for the hard work we’ve put in getting this stuff right; for clients it’s a bit of peace of mind that we are an agency that can be trusted and actually knows what it’s talking about!
Street View is a great platform and has revolutionised the way you find out about a location; I’m old enough to remember scribbling lists of directions down to go somewhere new (which would always have one key bit of info missing… maybe that was just me though), so having the ability to drop down to street level and scope somewhere out before you get there is amazing. Giving that functionality to businesses is a great way to drive engagement, and that’s one of the reasons we’re so excited about 360 and VR.
There are a lot of general questions we keep getting asked about VR and 360 work; with that in mind we’re putting together an FAQ on the subject as a handy reference for everyone so check back soon for the update.
So it’s been quite some time since we last posted an update (almost 3 years!) and there are a lot of reasons for this; the main one being that we’ve just been incredibly busy (which we’re totally not complaining about) but also because a lot of our posts can get pretty technical and, before you know it, that ‘quick’ post you were going to make has taken days to craft, edit, check and add graphics to.
But no more! This year we’re going to make a concerted effort to get simpler, shorter posts out that cover some of the great stuff we’ve been doing over the past couple of years and the new areas we have expanded into. So keep an eye out for more info on our adventures in 360 photography and filming, VR, real-time visualisation in Unreal Engine (did we mention we’d had our work included in the global enterprise showreel for Unreal? Oh yeah, no posts for nearly 3 years…tsk), interactive touch-screen experiences and app development.
Hopefully that lot will give us something to talk about; plus this post didn’t take days so we’re off to a great start!
Design Corps officially joined the NewTek Developer Network today to offer a range of unique, user-customisable virtual set designs for use with NewTek TriCaster multi-camera video production systems.
Design Corps, a UK-based creative agency providing design, visualisation, animation and interactive services to clients worldwide, are pleased to announce their collaboration with NewTek through the NewTek Developer Network.
Design Corps provide a range of unique virtual sets for use with TriCaster systems and the Virtual Set Editor (VSE). The sets come with user-customisable features and are supplied ready to use. Design Corps also offer extended customisation on existing sets as well as a bespoke design service so that clients can have a set that is truly unique to them.
Virtual sets are an excellent way to have your own impressive studio set but without any of the overheads of actually building a real-world one. Design Corps’ sets are designed and rendered to the highest standard and are full of all of the tiny details that help sell them as a ‘real’ image. A virtual set can be customised to increase ROI and can be easily replaced with a new design when the time is right for an update.
Michael Kornet, executive vice president of Business Development for NewTek said, “We are pleased to welcome Design Corps into the NewTek Developer Network. Virtual sets provide video producers an incredible opportunity to add high production value at a minimum cost. Design Corps provides TriCaster users with unlimited creative possibilities to customize their virtual sets to enhance their brands and distinguish themselves to their audience.”
Chris Trill, MD of Design Corps said “We are extremely excited to be joining the NewTek Developer Network. We think we have a unique approach to set design due to our background in visualisation for Architecture and Engineering and we’re looking forward to working with the global network of TriCaster users on some great sets.”
International 3D community magazine 3D Artist asked me to write an expanded version of my iPhone texture photography tip recently and, of course, I jumped at the chance. It was great fun writing it and I even developed a few new methods as I was creating the files to go on the CD.
Now don’t just rush out to buy the mag because I’m in it, there’s also an excellent interview with concept vehicle designer Daniel Simon, a BTS article with WETA Digital on Dawn of the Planet of the Apes and a great article on VR headsets; for a detailed listing of what’s in the magazine click here. You can pick the magazine up from all good newsagents (Ha! Always wanted to say that) or order it from their online store here.
As before if anyone uses this approach in their work I’d love to hear about it and see the results!
As has been mentioned before on here, part of the CG process that I really enjoy is creating custom texture sets. I think it’s a rewarding process anyway but it makes a lot of sense as a CG artist to create your own texture library that you know you can fall back on.
All textures begin life as a photograph of course and the higher res you can get on the initial photography the better. I like to keep a look out for interesting textures whenever I’m out and about but lugging a load of camera kit around with you everywhere you go isn’t really an option (especially not when you’ve got 2 kids and a wife who frowns at you whenever the camera bag comes out!). Fortunately pretty much everyone carries a camera round in their pocket these days and I reckon the humble camera-phone is a perfect tool for this particular instance.
Let’s get some stuff out of the way first though; this is a camera-phone we’re using so you can forget huge resolution (although there is a workaround for this that I’ll go into shortly), focus can be iffy, there’s no manual mode and you will be working with jpegs as opposed to RAWs. However, if you’re just shooting interesting textures as you find them it doesn’t really matter too much if some go a bit wrong. Another advantage of shooting with a camera-phone is that every image is geotagged, so if you mess a really good texture up or you need to shoot a high-res one with a big camera you can easily find where it was shot and go back there.
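If you ever want to make use of those geotags programmatically rather than just eyeballing them in the Photos app, it’s worth knowing that EXIF stores GPS coordinates as degrees/minutes/seconds values plus a hemisphere letter. As a rough sketch of the arithmetic (pure Python, no particular EXIF library assumed), converting them to the decimal degrees you can paste straight into a maps search looks like this:

```python
def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style degrees/minutes/seconds GPS values to
    decimal degrees. ref is the hemisphere letter: N/S/E/W."""
    decimal = degrees + minutes / 60.0 + seconds / 3600.0
    # Southern and western hemispheres are negative in decimal notation.
    if ref in ("S", "W"):
        decimal = -decimal
    return decimal

# Illustrative values only: roughly central London.
lat = dms_to_decimal(51, 30, 26, "N")   # ~51.5072
lon = dms_to_decimal(0, 7, 39, "W")     # ~-0.1275
```

Note that most EXIF readers hand you each of those components as a rational pair (e.g. numerator 26, denominator 1 for 26 seconds), so you may need to divide them out before feeding them in.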
The other issues we mentioned, iffy focus and no manual control, can be tackled by making sure you tap the screen to focus (and check it before you start shooting) and by purchasing one of the other camera apps out there that offer (a bit) more control. However, these apps are missing one important thing that we need for the high-res workaround.
I’d been trying to think of a way to get better res out of the images taken with my phone. Initially I thought of just shooting lots of images from close range, but then the thought occurred to me: why not use the panorama tool to create one long, close image? That way you get much better res and can grab huge textures in one shot. Another thing this is good for is ‘unwrapping’ cylindrical objects (trees, for instance): you can just start your panorama in one position, walk around the object and presto, instant unwrapped texture!
Ok, ok, so it’s not quite ‘instant’. For a start it can be a bit of a fiddle to capture properly; my first attempts had a lot of wobbles and repeats in them but, with a bit of practice, it gets better. The trick is to move slowly and keep the arrow on the centre line… do that and you can get a pretty clean image. You will also need to edit the edges to create a proper tile, but you’d need to do that with whatever image you use. I’ve included a shot I took at the weekend around a tree; the ground was slightly uneven so there are some repeat patterns, but these could be cloned out easily. This one was just another test and, if I were actually using it, I’d have done a few more circuits of the tree at different heights. Still, not too bad to get that resolution (21MP) out of a phone! If you’d like to have a look at the full-res version, click here to download it.
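A handy trick for that edge-editing step is the classic ‘offset’ check: wrap the image round by half its width and height so the seams land in the middle of the frame, where they’re easy to see and clone out. Photoshop’s Offset filter (Filter > Other > Offset, with wrap-around turned on) does exactly this; as a minimal sketch of the idea, using plain Python lists standing in for pixel rows rather than any real image library:

```python
def offset_wrap(pixels, dx, dy):
    """Shift a grid of pixels right by dx and down by dy, wrapping
    around at the edges. Offsetting by half the width and half the
    height moves the tile's seams into the centre of the image."""
    h, w = len(pixels), len(pixels[0])
    return [
        [pixels[(y - dy) % h][(x - dx) % w] for x in range(w)]
        for y in range(h)
    ]

# Tiny 2x2 'image': a half-width/half-height offset swaps every
# pixel to the opposite corner.
tile = [[1, 2],
        [3, 4]]
shifted = offset_wrap(tile, 1, 1)  # [[4, 3], [2, 1]]
```

Offset, clone out whatever seam you can now see, and repeat until nothing shows; a properly seamless tile looks clean at any offset, so there’s no need to shift it back afterwards.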
It should be blindingly obvious that you’re not going to get textures that can hold up to ultra-close renders at 8k; however for anything up to middle distance the images you grab with your phone will do just fine and your life will be (relatively) frown-free!
Has anyone else tried this method? I’d be interested to see the results if so.
NOTE: The phone I’m using is an iPhone 4S that shoots images at 8MP (3264 x 2448px) which is quite measly compared to some of the Android/Windows based phones. I’m using the standard camera software; as mentioned above there are other camera apps but these tend to focus on normal photos and don’t have panorama modes. There are dedicated panorama apps but none of them seem as suited to this task as the Apple one; I’ve been on the lookout for one with image stabilisation but haven’t found one yet (so if you know of one please mention it in the comments). If image wobbles are an issue you could break out a tripod and a dolly…but that then kind of negates the whole ‘quick and easy shots with a camera you put in your pocket’ thing!