User Guide
V-Ray Application SDK
I. Introduction
The V-Ray Application SDK allows third-party integrators to initiate and control a rendering process that makes use of the V-Ray engine. It provides a high-level API that enables users to load scenes, render them in a host process as well as in distributed mode, and manipulate the scene objects known as V-Ray plugins. The scene files that the V-Ray Application SDK works with are ".vrscene" files, which are exported from 3D content creation applications such as Autodesk 3ds Max and Maya, or by any other V-Ray integration, including your own. Rendering can also be done directly from memory without going through a scene file.
II. Basic Rendering
The V-Ray Application SDK exposes a specially designed VRayRenderer class that serves as the entry point for the initiation and control of any rendering process. The class defines high-level methods for loading and rendering a V-Ray scene that hide the inner complexities of the V-Ray engine and yet provide its powerful capabilities.
The basic workflow for starting a render process usually consists of the following steps:
- Instantiating the configurable VRayRenderer class
- Either of (or even a combination of):
- Loading a scene by specifying a path to a .vrscene file (the V-Ray native format for serializing scenes)
- Creating new instances of V-Ray render plugins to translate your native scene description to V-Ray and setting their parameters
- Invoking the method for rendering
- Waiting for the image to become available
- Cleaning up memory resources by closing the VRayRenderer
All language implementations of the VRayRenderer class offer a method without arguments (named render() or start()/startSync() depending on the language binding, as the examples below show) that starts rendering the currently loaded scene. The call is non-blocking and internally runs the V-Ray engine in a separate thread, so rendering can conveniently take place in the background without language-specific threading code.
The VRayRenderer class gives access to the current state of the rendered image at any point in time. Irrespective of whether the render process has finished or not, an image can be extracted to track the progress of the rendering. All language implementations expose a method that returns a specialized image class whose purpose is to capture the image data at the moment of the request. The class can be used for direct display or to save the rendered image to a file in one of several popular compressed formats.
import vray

with vray.VRayRenderer() as renderer:
    renderer.load('./car.vrscene')
    renderer.start()
    renderer.waitForImageReady(6000)
    image = renderer.getImage()
    image.save('car.png')
#include "vraysdk.hpp" using namespace VRay; VRayInit init; VRayRenderer renderer; renderer.load("./car.vrscene"); renderer.startSync(); renderer.waitForImageReady(6000); VRayImage* image = renderer.getImage(); image->saveToPngFile("car.png"); delete image;
using VRay;

using (VRayRenderer renderer = new VRayRenderer())
{
    renderer.Load("./car.vrscene");
    renderer.Start();
    renderer.WaitForImageReady(6000);
    VRayImage image = renderer.GetImage();
    image.SaveToPNG("car.png");
}
import com.chaosgroup.vray.*;

VRayRenderer renderer = null;
try {
    renderer = new VRayRenderer();
    renderer.load("./car.vrscene");
    renderer.render();
    renderer.waitForImageReady(6000);
    VRayImage image = renderer.getImage();
    image.saveToPNG("car.png");
} catch (VRayException e) {
    System.err.println(e.getMessage());
} finally {
    VRayRenderer.close(renderer);
}
var vray = require('../vray');

var renderer = vray.VRayRenderer();
renderer.load('./car.vrscene', function(err) {
    if (err) throw err;
    renderer.render();
    renderer.waitForImageReady(6000, function() {
        var image = renderer.getImage();
        image.save('car.png', function() {
            renderer.close();
        });
    });
});
III. Render Modes
The V-Ray Application SDK offers the ability to run the V-Ray engine in three distinct render modes - Production, RT CPU and RT GPU. Production mode is suitable for obtaining the final image after the scene has been carefully configured, since the render process can take a long time before a meaningful overview of the whole image is available. The strength of the RT modes, on the other hand, lies in fast image preview and progressive quality improvement, which makes them the perfect choice for experimenting with scene settings, modifying content and receiving fast render feedback. The RT engine comes in two flavors - CPU and GPU. The former uses the CPU resources, while the latter takes advantage of the graphics card's computing power. GPU rendering is in turn subdivided into two modes - OpenCL and CUDA (NVIDIA only) - depending on the technology used on the GPU.
The render mode is chosen through an instance of the specialized options class passed as an argument to the VRayRenderer constructor and is determined by those options when the VRayRenderer is instantiated. The default render mode is RT CPU, and instantiating the VRayRenderer class without specifying any options is equivalent to passing an instance of the default options to the constructor. The render mode can also be changed between renders, so the scene does not have to be re-created or re-loaded. This allows you to work interactively on a scene with V-Ray RT and then switch to Production mode for the final rendering without any overhead.
import vray

with vray.VRayRenderer(renderMode='production') as renderer:
    renderer.load('./car.vrscene')
    renderer.start()
    renderer.waitForImageReady(6000)
    image = renderer.getImage()
    image.save('car.png')
VRayInit init;
RendererOptions options;
options.renderMode = RendererOptions::RENDER_MODE_PRODUCTION;
VRayRenderer renderer(options);
renderer.load("./car.vrscene");
renderer.startSync();
renderer.waitForImageReady(6000);
VRayImage* image = renderer.getImage();
image->saveToPngFile("car.png");
delete image;
RendererOptions options = new RendererOptions();
options.RenderMode = RenderMode.Production;

using (VRayRenderer renderer = new VRayRenderer(options))
{
    renderer.Load("./car.vrscene");
    renderer.Start();
    renderer.WaitForImageReady(6000);
    VRayImage image = renderer.GetImage();
    image.SaveToPNG("car.png");
}
VRayRenderer renderer = null;
try {
    RendererOptions options = new RendererOptions();
    options.setRenderMode(RenderMode.PRODUCTION);
    renderer = new VRayRenderer(options);
    renderer.load("./car.vrscene");
    renderer.render();
    renderer.waitForImageReady(6000);
    VRayImage image = renderer.getImage();
    image.saveToPNG("car.png");
} catch (VRayException e) {
    System.err.println(e.getMessage());
} finally {
    VRayRenderer.close(renderer);
}
var vray = require('../vray');

var renderer = vray.VRayRenderer({ renderMode: 'production' });
renderer.load('./car.vrscene', function(err) {
    if (err) throw err;
    renderer.render();
    renderer.waitForImageReady(6000, function() {
        var image = renderer.getImage();
        image.save('car.png', function() {
            renderer.close();
        });
    });
});
IV. Common Events
While the rendering process runs in the background in a separate thread, it emits a number of events that are available to AppSDK users. You can subscribe to these rendering events through the main VRayRenderer class.
The V-Ray Application SDK provides the means for attaching callbacks that execute custom client logic when an event occurs. The code of the callbacks is executed in a separate thread that is different from the main V-Ray rendering thread. This design allows the V-Ray rendering process to remain fast without additional overhead and ensures that slow operations in client callback code will not affect the total rendering time.
The events can be broadly classified into three types: events common for all render modes, events specific to Production mode and events specific to RT mode. This section covers the events that are emitted regardless of the used render mode.
The common render events occur when:
- The image rendering starts and stops
- V-Ray outputs text messages prior to and during rendering
- V-Ray switches the current task (e.g. loading, rendering) and reports progress percentage
Start Event
The start event is the first event emitted after the render process has been started with the VRayRenderer render() method. The V-Ray Application SDK enables API users to attach a callback carrying no event-specific data whose code is executed immediately after the start event is emitted.
import vray

def onRenderStart(renderer):
    print 'The V-Ray render process has started!'

with vray.VRayRenderer() as renderer:
    renderer.setOnRenderStart(onRenderStart)
    renderer.load('./car.vrscene')
    renderer.start()
    renderer.waitForImageReady(6000)
void onRenderStart(VRayRenderer& renderer, void* userData) {
    printf("The V-Ray render process has started!\n");
}

VRayInit init;
VRayRenderer renderer;
renderer.setOnRenderStart(onRenderStart);
renderer.load("./car.vrscene");
renderer.startSync();
renderer.waitForImageReady(6000);
using (VRayRenderer renderer = new VRayRenderer())
{
    renderer.RenderStarted += new EventHandler((source, e) =>
    {
        Console.WriteLine("The V-Ray render process has started!");
    });
    renderer.Load("./car.vrscene");
    renderer.Start();
    renderer.WaitForImageReady(6000);
}
VRayRenderer renderer = null;
try {
    renderer = new VRayRenderer();
    renderer.setOnRenderStart(new OnRenderStartListener() {
        @Override
        public void onRenderStart() {
            System.out.println("The V-Ray render process has started!");
        }
    });
    renderer.load("./car.vrscene");
    renderer.render();
    renderer.waitForImageReady(6000);
} catch (VRayException e) {
    System.err.println(e.getMessage());
} finally {
    VRayRenderer.close(renderer);
}
var vray = require('../vray');

var renderer = vray.VRayRenderer();
renderer.on('start', function() {
    console.log('The V-Ray render process has started!');
});
renderer.load('./car.vrscene', function(err) {
    if (err) throw err;
    renderer.render();
    renderer.waitForImageReady(6000, function() {});
});
Log Message Event
Output messages are produced by the V-Ray engine during scene loading and rendering and they can be captured by subscribing to the special log event. The callback data that becomes available when a message is logged is the text of the message and the log level type (info, warning or error).
import vray

def onDumpMessage(renderer, message, level):
    print '[{0}] {1}'.format(level, message)

with vray.VRayRenderer() as renderer:
    renderer.setOnDumpMessage(onDumpMessage)
    renderer.load('./car.vrscene')
    renderer.start()
    renderer.waitForImageReady(6000)
void onLogMessage(VRayRenderer& renderer, const char* message, int level, void* userData) {
    printf("[%d] %s\n", level, message);
}

VRayInit init;
VRayRenderer renderer;
renderer.setOnDumpMessage(onLogMessage);
renderer.load("./car.vrscene");
renderer.startSync();
renderer.waitForImageReady(6000);
using (VRayRenderer renderer = new VRayRenderer())
{
    renderer.MessageLogged += new EventHandler<MessageEventArgs>((source, e) =>
    {
        Console.WriteLine("[{0}] {1}", e.LogLevel.Type, e.Message);
    });
    renderer.Load("./car.vrscene");
    renderer.Start();
    renderer.WaitForImageReady(6000);
}
VRayRenderer renderer = null;
try {
    renderer = new VRayRenderer();
    renderer.setOnLogListener(new OnLogListener() {
        @Override
        public void onLogMessage(String message, LogLevel logLevel) {
            System.out.format("[%s] %s\n", logLevel.getType(), message);
        }
    });
    renderer.load("./car.vrscene");
    renderer.render();
    renderer.waitForImageReady(6000);
} catch (VRayException e) {
    System.err.println(e.getMessage());
} finally {
    VRayRenderer.close(renderer);
}
var vray = require('../vray');

var renderer = vray.VRayRenderer();
renderer.on('dumpMessage', function(message, level) {
    console.log('[%d] %s', level, message);
});
renderer.load('./car.vrscene', function(err) {
    if (err) throw err;
    renderer.render();
    renderer.waitForImageReady(6000, function() {});
});
Progress Event
Progress events are emitted whenever the current task changes and also when the amount of work done increases. They include a short message with the current task name and two numbers for the total amount of work to do and the work that is already complete.
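The progress event has no dedicated sample in this guide, so here is a minimal Python sketch. The callback name setOnProgress and its exact signature are assumptions made by analogy with the other callbacks above (setOnRenderStart, setOnDumpMessage) and may differ in your AppSDK version.

import vray

# Assumed signature: the renderer, a short task message and the done/total counters
def onProgress(renderer, message, elementsDone, elementsTotal):
    if elementsTotal > 0:
        print('{0}: {1:.1f}%'.format(message, 100.0 * elementsDone / elementsTotal))

with vray.VRayRenderer() as renderer:
    renderer.setOnProgress(onProgress)  # assumed name, by analogy with setOnRenderStart
    renderer.load('./car.vrscene')
    renderer.start()
    renderer.waitForImageReady(6000)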
Image Ready Event
When the rendering process finishes, an image ready event is emitted to notify subscribers that the resulting image can be obtained. The purpose of the event is to mark the end of the rendering process, so that API clients know the point after which no more images are produced. Note that the event is also emitted when the rendering is aborted, so you can still count on it to perform your cleanup.
import vray

def onImageReady(renderer):
    image = renderer.getImage()
    image.save('car.png')

with vray.VRayRenderer(renderMode='production') as renderer:
    renderer.setOnImageReady(onImageReady)
    renderer.load('./car.vrscene')
    renderer.start()
    renderer.waitForImageReady()
void onImageReady(VRayRenderer& renderer, void* userData) {
    VRayImage* image = renderer.getImage();
    image->saveToPngFile("car.png");
    delete image;
}

VRayInit init;
RendererOptions options;
options.renderMode = RendererOptions::RENDER_MODE_PRODUCTION;
VRayRenderer renderer(options);
renderer.setOnImageReady(onImageReady);
renderer.load("./car.vrscene");
renderer.startSync();
renderer.waitForImageReady();
RendererOptions options = new RendererOptions();
options.RenderMode = RenderMode.Production;

using (VRayRenderer renderer = new VRayRenderer(options))
{
    renderer.ImageReady += new EventHandler((source, e) =>
    {
        VRayImage image = renderer.GetImage();
        image.SaveToPNG("car.png");
    });
    renderer.Load("./car.vrscene");
    renderer.Start();
    renderer.WaitForImageReady();
}
VRayRenderer renderer = null;
try {
    RendererOptions options = new RendererOptions();
    options.setRenderMode(RenderMode.PRODUCTION);
    renderer = new VRayRenderer(options);
    final VRayRenderer rendererRef = renderer;
    renderer.setOnImageReady(new OnImageReadyListener() {
        @Override
        public void onImageReady() {
            VRayImage image = rendererRef.getImage();
            image.saveToPNG("car.png");
        }
    });
    renderer.load("./car.vrscene");
    renderer.render();
    renderer.waitForImageReady();
} catch (VRayException e) {
    System.err.println(e.getMessage());
} finally {
    VRayRenderer.close(renderer);
}
var vray = require('../vray');

var renderer = vray.VRayRenderer({ renderMode: 'production' });
renderer.on('imageReady', function() {
    var image = renderer.getImage();
    image.save('car.png', function() {
        renderer.close();
    });
});
renderer.load('./car.vrscene', function(err) {
    if (err) throw err;
    renderer.render();
});
V. Production Events
There are two types of image sampling in Production mode - "bucket" rendering and progressive rendering. Bucket rendering splits the image into small rectangular sub-images, each processed independently by a different local CPU thread or network server (see the Distributed Rendering section). The sub-images are only returned when completely ready. Progressive rendering works the same way as in RT mode - the whole image is sampled and images are returned for each sampling pass, reducing noise with each pass. This section covers the events which are specific to Production bucket rendering only. The progressive mode emits the image event described in the RT Events section further below.
There are two main phases in Production bucket rendering - assigning a bucket to a render host (or local thread) and rendering the assigned region of the image. This is why there are two events that are raised for each bucket of the image - initializing a bucket and receiving the image result. Users of the API can subscribe to the events via the main VRayRenderer class.
In Production mode the image is divided into a grid of buckets, each of which holds a part of the final image. A bucket is uniquely identified by the coordinates of its top-left corner and the width and height of its rectangular region; the top-left corner of the whole image has coordinates (0, 0). This bucket data is passed as an argument to the callbacks attached to the bucket init and bucket ready events, so the event data published during rendering carries all the information needed to reconstruct the final image from its constituent buckets.
Bucket Init Event
The bucket init event is raised when the main V-Ray thread assigns a bucket to a network render host or local thread. The callback data that is provided for this event is the bucket size and coordinates as well as the name of the render host (if any) as it appears on the network.
import vray

def onBucketInit(renderer, bucket):
    print 'Starting bucket:'
    print '\t x: ', bucket.x
    print '\t y: ', bucket.y
    print '\t width: ', bucket.width
    print '\t height:', bucket.height
    print '\t host:', bucket.host

with vray.VRayRenderer(renderMode='production') as renderer:
    renderer.setOnBucketInit(onBucketInit)
    renderer.load('./car.vrscene')
    renderer.start()
    renderer.waitForImageReady()
void onBucketInit(VRayRenderer& renderer, int x, int y, int width, int height, const char* host, void* userData) {
    printf("Starting bucket:\n");
    printf("\t x: %d\n", x);
    printf("\t y: %d\n", y);
    printf("\t width: %d\n", width);
    printf("\t height: %d\n", height);
    printf("\t host: %s\n", host);
}

VRayInit init;
RendererOptions options;
options.renderMode = RendererOptions::RENDER_MODE_PRODUCTION;
VRayRenderer renderer(options);
renderer.setOnBucketInit(onBucketInit);
renderer.load("./car.vrscene");
renderer.startSync();
renderer.waitForImageReady();
RendererOptions options = new RendererOptions();
options.RenderMode = RenderMode.Production;

using (VRayRenderer renderer = new VRayRenderer(options))
{
    renderer.BucketInit += new EventHandler<BucketRegionEventArgs>((source, e) =>
    {
        Console.WriteLine("Starting bucket:");
        Console.WriteLine("\t x:" + e.X);
        Console.WriteLine("\t y:" + e.Y);
        Console.WriteLine("\t width:" + e.Width);
        Console.WriteLine("\t height:" + e.Height);
        Console.WriteLine("\t host:" + e.Host);
    });
    renderer.Load("./car.vrscene");
    renderer.Start();
    renderer.WaitForImageReady();
}
VRayRenderer renderer = null;
try {
    RendererOptions options = new RendererOptions();
    options.setRenderMode(RenderMode.PRODUCTION);
    renderer = new VRayRenderer(options);
    renderer.setOnBucketInit(new OnBucketInitListener() {
        @Override
        public void onBucket(BucketRegion bucket) {
            System.out.println("Starting bucket:");
            System.out.println("\t x:" + bucket.getX());
            System.out.println("\t y:" + bucket.getY());
            System.out.println("\t width:" + bucket.getWidth());
            System.out.println("\t height:" + bucket.getHeight());
            System.out.println("\t host:" + bucket.getHost());
        }
    });
    renderer.load("./car.vrscene");
    renderer.render();
    renderer.waitForImageReady();
} catch (VRayException e) {
    System.err.println(e.getMessage());
} finally {
    VRayRenderer.close(renderer);
}
var vray = require('../vray');

var renderer = vray.VRayRenderer({ renderMode: 'production' });
renderer.on('bucketInit', function(region) {
    console.log('Starting bucket:');
    console.log('\t x:' + region.x);
    console.log('\t y:' + region.y);
    console.log('\t width:' + region.width);
    console.log('\t height:' + region.height);
    console.log('\t host:' + region.host);
});
renderer.load('./car.vrscene', function(err) {
    if (err) throw err;
    renderer.render();
    renderer.waitForImageReady(function() {
        renderer.close();
    });
});
Bucket Ready Event
The bucket ready event is raised when the render host that has been assigned a bucket has finished rendering the part of the image and has sent its result to the main V-Ray thread. The returned callback data for the event contains the size and coordinates of the region, the render host name and the produced image.
import vray

def onBucketReady(renderer, bucket):
    print 'Bucket ready:'
    print '\t x: ', bucket.x
    print '\t y: ', bucket.y
    print '\t width: ', bucket.width
    print '\t height:', bucket.height
    print '\t host:', bucket.host
    fileName = 'car-{0}-{1}.png'.format(bucket.x, bucket.y)
    bucket.save(fileName)

with vray.VRayRenderer(renderMode='production') as renderer:
    renderer.setOnBucketReady(onBucketReady)
    renderer.load('./car.vrscene')
    renderer.start()
    renderer.waitForImageReady()
void onBucketReadyCallback(VRayRenderer& renderer, int x, int y, const char* host, VRayImage* image, void* userData) {
    printf("Bucket ready:\n");
    printf("\t x: %d\n", x);
    printf("\t y: %d\n", y);
    printf("\t width: %d\n", image->getWidth());
    printf("\t height: %d\n", image->getHeight());
    printf("\t host: %s\n", host);
    char fileName[64];
    sprintf(fileName, "car-%d-%d.png", x, y);
    image->saveToPngFile(fileName);
}

VRayInit init;
RendererOptions options;
options.renderMode = RendererOptions::RENDER_MODE_PRODUCTION;
VRayRenderer renderer(options);
renderer.setOnBucketReady(onBucketReadyCallback);
renderer.load("./car.vrscene");
renderer.startSync();
renderer.waitForImageReady();
RendererOptions options = new RendererOptions();
options.RenderMode = RenderMode.Production;

using (VRayRenderer renderer = new VRayRenderer(options))
{
    renderer.BucketReady += new EventHandler<BucketImageEventArgs>((source, e) =>
    {
        Console.WriteLine("Bucket ready:");
        Console.WriteLine("\t x:" + e.X);
        Console.WriteLine("\t y:" + e.Y);
        Console.WriteLine("\t width:" + e.Width);
        Console.WriteLine("\t height:" + e.Height);
        Console.WriteLine("\t host:" + e.Host);
        VRayImage image = e.Image;
        image.SaveToPNG(string.Format("car-{0}-{1}.png", e.X, e.Y));
        image.Dispose();
    });
    renderer.Load("./car.vrscene");
    renderer.Start();
    renderer.WaitForImageReady();
}
VRayRenderer renderer = null;
try {
    RendererOptions options = new RendererOptions();
    options.setRenderMode(RenderMode.PRODUCTION);
    renderer = new VRayRenderer(options);
    renderer.setOnBucketListener(new OnBucketListener() {
        @Override
        public void onBucket(BucketImage bucket) {
            System.out.println("Bucket ready:");
            System.out.println("\t x:" + bucket.getX());
            System.out.println("\t y:" + bucket.getY());
            System.out.println("\t width:" + bucket.getWidth());
            System.out.println("\t height:" + bucket.getHeight());
            System.out.println("\t host:" + bucket.getHost());
            VRayImage image = bucket.getImage();
            image.saveToPNG(String.format("car-%d-%d.png", bucket.getX(), bucket.getY()));
            image.close();
        }
    });
    renderer.load("./car.vrscene");
    renderer.render();
    renderer.waitForImageReady();
} catch (VRayException e) {
    System.err.println(e.getMessage());
} finally {
    VRayRenderer.close(renderer);
}
var vray = require('../vray');

var renderer = vray.VRayRenderer({ renderMode: 'production' });
renderer.on('bucketReady', function(bucket) {
    console.log('Bucket ready:');
    console.log('\t x:' + bucket.x);
    console.log('\t y:' + bucket.y);
    console.log('\t width:' + bucket.width);
    console.log('\t height:' + bucket.height);
    console.log('\t host:' + bucket.host);
    var fileName = 'car-' + bucket.x + '-' + bucket.y + '.png';
    bucket.save(fileName, function() {
        bucket.close();
    });
});
renderer.load('./car.vrscene', function(err) {
    if (err) throw err;
    renderer.render();
    renderer.waitForImageReady(function() {
        renderer.close();
    });
});
VI. RT Events
One of the render modes provided by the V-Ray Application SDK is RT. Its purpose is to allow users to receive fast feedback for the scene they have configured. Noisy images are quickly available when the RT render process starts. As this process continues, it returns new intermediate images with progressively better quality. The V-Ray Application SDK gives access to those intermediate images by publishing image data when the RT engine event for an updated image is emitted. This event is also emitted in production mode when the image sampler is set to progressive mode, not buckets.
import vray

counter = 0

def onImageUpdated(renderer, image):
    global counter
    counter += 1
    fileName = 'car-{0}.jpeg'.format(counter)
    image.save(fileName)

with vray.VRayRenderer(renderMode='rtCPU') as renderer:
    renderer.setOnRtImageUpdated(onImageUpdated)
    renderer.load('./car.vrscene')
    renderer.start()
    renderer.waitForImageReady()
int counter = 0;

void onRTImageUpdated(VRayRenderer& renderer, VRayImage* image, void* userData) {
    char fileName[64];
    sprintf(fileName, "car-%d.jpeg", ++counter);
    image->saveToJpegFile(fileName);
}

VRayInit init;
RendererOptions options;
options.renderMode = RendererOptions::RENDER_MODE_RT_CPU;
VRayRenderer renderer(options);
renderer.setOnRTImageUpdated(onRTImageUpdated);
renderer.load("./car.vrscene");
renderer.startSync();
renderer.waitForImageReady();
RendererOptions options = new RendererOptions();
options.RenderMode = RenderMode.RT_CPU;

using (VRayRenderer renderer = new VRayRenderer(options))
{
    int counter = 0;
    renderer.RTImageUpdated += new EventHandler<VRayImageEventArgs>((source, e) =>
    {
        string fileName = string.Format("car-{0}.jpeg", ++counter);
        e.Image.SaveToJPEG(fileName);
        e.Image.Dispose();
    });
    renderer.Load("./car.vrscene");
    renderer.Start();
    renderer.WaitForImageReady();
}
VRayRenderer renderer = null;
try {
    RendererOptions options = new RendererOptions();
    options.setRenderMode(RenderMode.RT_CPU);
    renderer = new VRayRenderer(options);
    renderer.setOnRTImageUpdated(new OnRTImageUpdatedListener() {
        private int counter = 0;

        @Override
        public void onImageUpdated(VRayImage image) {
            String fileName = String.format("car-%s.jpeg", ++counter);
            image.saveToJPEG(fileName);
            image.close();
        }
    });
    renderer.load("./car.vrscene");
    renderer.render();
    renderer.waitForImageReady();
} catch (VRayException e) {
    System.err.println(e.getMessage());
} finally {
    VRayRenderer.close(renderer);
}
var vray = require('../vray');

var renderer = vray.VRayRenderer({ renderMode: 'rtCPU' });
var counter = 0;
renderer.on('rtImageUpdated', function(image) {
    var fileName = 'car-' + (++counter) + '.jpeg';
    image.save(fileName, function() {
        image.close();
    });
});
renderer.load('./car.vrscene', function(err) {
    if (err) throw err;
    renderer.render();
    renderer.waitForImageReady(function() {
        renderer.close();
    });
});
VII. V-Ray Images
The VRayImage class provides access to the images rendered by V-Ray. It is returned by the VRayRenderer and is also passed as an argument to some of the callbacks invoked when an event occurs. The VRayImage class provides utility functions for manipulating the retrieved binary image data. The binary data is in full 32-bit float format and can be accessed directly, but there are convenience methods that compress it into several of the most popular 8-bit formats - BMP, JPEG, PNG and WebP. Saving EXR and other high dynamic range formats is also possible, but not discussed here.
The methods exposed by the VRayImage class come in two flavors: the first group returns the bytes of the compressed image directly, while the second group compresses the image and saves it to a file.
import vray

with vray.VRayRenderer() as renderer:
    renderer.load('./car.vrscene')
    renderer.start()
    renderer.waitForImageReady(6000)
    image = renderer.getImage()
    data = image.compress('png')
    with open('car.png', 'wb') as outStream:
        outStream.write(data)
VRayInit init;
VRayRenderer renderer;
renderer.load("./car.vrscene");
renderer.startSync();
renderer.waitForImageReady(6000);
VRayImage* image = renderer.getImage();
Png png = image->toPng();
ofstream outputStream("car.png", ofstream::binary);
outputStream.write((char*) png.getBuf(), png.getLen());
outputStream.close();
delete image;
using (VRayRenderer renderer = new VRayRenderer())
{
    renderer.Load("./car.vrscene");
    renderer.Start();
    renderer.WaitForImageReady(6000);
    VRayImage image = renderer.GetImage();
    byte[] data = image.ToPNG();
    using (FileStream outStream = new FileStream("car.png", FileMode.Create, FileAccess.Write))
    {
        outStream.Write(data, 0, data.Length);
    }
}
FileOutputStream outStream = null;
VRayRenderer renderer = null;
try {
    renderer = new VRayRenderer();
    renderer.load("./car.vrscene");
    renderer.render();
    renderer.waitForImageReady(6000);
    VRayImage image = renderer.getImage();
    byte[] data = image.toPNG();
    outStream = new FileOutputStream("car.png");
    outStream.write(data);
} catch (VRayException e) {
    System.err.println(e.getMessage());
} catch (IOException e) {
    System.err.println(e.getMessage());
} finally {
    VRayRenderer.close(renderer);
    if (outStream != null) {
        try {
            outStream.close();
        } catch (IOException e) {
            System.err.println(e.getMessage());
        }
    }
}
var fs = require('fs');
var vray = require('../vray');

var renderer = vray.VRayRenderer();
renderer.load('./car.vrscene', function(err) {
    if (err) throw err;
    renderer.render();
    renderer.waitForImageReady(6000, function() {
        var image = renderer.getImage();
        image.compress('png', function(err, buffer) {
            fs.writeFile('car.png', buffer, function() {
                renderer.close();
            });
        });
    });
});
All instances of the VRayImage class must be closed so that the memory resources they hold are released. It is recommended to always close an image once you have finished working with it. On platforms with a garbage collector, however, the internally held resources are eventually freed even if the user does not close the retrieved image explicitly.
Downscaling
In addition to the utility methods for compression, the VRayImage class supports downscale operations that resize the retrieved image to a smaller one. The result of the downscale operations is another VRayImage instance which has access to the full functionality of the utility methods of the class.
import vray

with vray.VRayRenderer() as renderer:
    renderer.load('./car.vrscene')
    renderer.start()
    renderer.waitForImageReady(6000)
    image = renderer.getImage()
    downscaled = image.downscale(260, 180)
    downscaled.save('car.png')
VRayInit init;
VRayRenderer renderer;
renderer.load("./car.vrscene");
renderer.startSync();
renderer.waitForImageReady(6000);
LocalVRayImage image = renderer.getImage();
LocalVRayImage downscaled = image->downscale(260, 180);
downscaled->saveToPngFile("car.png");
using (VRayRenderer renderer = new VRayRenderer())
{
    renderer.Load("./car.vrscene");
    renderer.Start();
    renderer.WaitForImageReady(6000);
    using (VRayImage image = renderer.GetImage())
    {
        using (VRayImage downscaled = image.GetDownscaled(260, 180))
        {
            downscaled.SaveToPNG("car.png");
        }
    }
}
VRayRenderer renderer = null;
try {
    renderer = new VRayRenderer();
    renderer.load("./car.vrscene");
    renderer.render();
    renderer.waitForImageReady(6000);
    VRayImage image = renderer.getImage();
    VRayImage downscaled = image.getDownscaled(260, 180);
    downscaled.saveToPNG("car.png");
    downscaled.close();
    image.close();
} catch (VRayException e) {
    System.err.println(e.getMessage());
} finally {
    VRayRenderer.close(renderer);
}
var vray = require('../vray');

var renderer = vray.VRayRenderer();
renderer.load('./car.vrscene', function (err) {
    if (err) throw err;
    renderer.render();
    renderer.waitForImageReady(6000, function () {
        var image = renderer.getImage();
        image.downscale(260, 180, function(downscaled) {
            downscaled.save('car.png', function() {
                downscaled.close(); // Not mandatory, can be left to the garbage collector
                image.close();      // Not mandatory, can be left to the garbage collector
                renderer.close();
            });
        });
    });
});
Changing the Image Size
The size of the rendered image can be controlled through the renderer options passed to the constructor of the VRayRenderer class. The width and height of the target image are set before instantiating a renderer, so all images rendered by that renderer have the same dimensions. If the image width and height are not specified (i.e. they remain 0 by default), the size of the rendered image is determined on a scene-by-scene basis by the settings exported in the ".vrscene" file of the currently loaded scene.
import vray

with vray.VRayRenderer(imageWidth=640, imageHeight=360) as renderer:
    renderer.load('./car.vrscene')
    renderer.start()
    renderer.waitForImageReady(6000)
    image = renderer.getImage()
    image.save('car.png')
VRayInit init;
RendererOptions options;
options.imageWidth = 640;
options.imageHeight = 360;
VRayRenderer renderer(options);
renderer.load("./car.vrscene");
renderer.startSync();
renderer.waitForImageReady(6000);
LocalVRayImage image = renderer.getImage();
image->saveToPngFile("car.png");
RendererOptions options = new RendererOptions();
options.ImageWidth = 640;
options.ImageHeight = 360;

using (VRayRenderer renderer = new VRayRenderer(options))
{
    renderer.Load("./car.vrscene");
    renderer.Start();
    renderer.WaitForImageReady(6000);
    using (VRayImage image = renderer.GetImage())
    {
        image.SaveToPNG("car.png");
    }
}
VRayRenderer renderer = null;
try {
    RendererOptions options = new RendererOptions();
    options.setImageWidth(640);
    options.setImageHeight(360);
    renderer = new VRayRenderer(options);
    renderer.load("./car.vrscene");
    renderer.render();
    renderer.waitForImageReady(6000);
    VRayImage image = renderer.getImage();
    image.saveToPNG("car.png");
} catch (VRayException e) {
    System.err.println(e.getMessage());
} finally {
    VRayRenderer.close(renderer);
}
var vray = require('../vray');

var renderer = vray.VRayRenderer({ imageWidth: 640, imageHeight: 360 });
renderer.load('./car.vrscene', function(err) {
    if (err) throw err;
    renderer.render();
    renderer.waitForImageReady(6000, function() {
        var image = renderer.getImage();
        image.save('car.png', function() {
            image.close();
            renderer.close();
        });
    });
});
VIII. Render Elements
V-Ray Render Elements (also known as AOVs) are images containing various types of render data encoded as 3-component colors, single floats or integers. Examples include Z-depth, surface normals, UV coordinates, velocity, lighting, reflections, etc. Each V-Ray scene may contain an arbitrary number of render elements (also called channels). Each channel is enabled by a dedicated plugin, except for the RGB and Alpha channels, which are always enabled.
To access the render elements in the current scene, use the VRayRenderer instance where the scene is loaded. Each render element's data can be taken either as a VRayImage, or as raw data (as byte, integer, or float buffers). Optionally, provide a sub-region of interest to the APIs to get that part of the data.
import vray

# Render elements are only available in production mode!
with vray.VRayRenderer(renderMode='production') as renderer:
    renderer.load('./car.vrscene')
    renderer.start()

    # Render elements are available when the first bucket region is ready.
    # Here, we simply wait for the entire image to be completed before we access
    # the render elements.
    renderer.waitForImageReady()

    # Access render elements via the VRayRenderer instance
    renderElements = renderer.renderElements.getAll()

    # Always good to check for null (i.e. non-production mode)
    if not renderElements:
        exit()

    # List all available render elements and process each one
    for r in renderElements:
        print '{0}, Format({1}), PixelFormat({2})'.format(r.name, r.binaryFormat, r.defaultPixelFormat)

        # Output render element's data as an image.
        # Optionally, specify an image sub-region, or leave blank to get the entire image.
        img = r.getImage()
        img.save(r.name + '.png', preserveAlpha=True)

        # Similar to the image output, get the raw bytes.
        # Again, a sub-region can be specified, or the entire data can be obtained if left blank.
        rawData = r.getData()
        # Do something with rawData...
// Init V-Ray
VRayInit init("VRaySDKLibrary.dll", true);

// Set up rendering options
RendererOptions options;
options.imageWidth = 320;
options.imageHeight = 200;
options.renderMode = RendererOptions::RENDER_MODE_PRODUCTION;

// Do some rendering
VRayRenderer renderer(options);
renderer.load("car.vrscene");
renderer.startSync();

// Render elements are available when the first bucket region is ready.
// Here, we simply wait for the entire image to be completed before we access the render elements.
renderer.waitForImageReady();

// Access render elements via the VRayRenderer instance
RenderElements renderElements = renderer.getRenderElements();
int reCount = renderElements.getCount();

// List all available render elements and process each one
for (int i = 0; i < reCount; i++) {
    RenderElement re = renderElements[i];
    printf("%d. %s, Format(%d), PixelFormat(%d)\n", re.getIndex(), re.getName().c_str(), re.getBinaryFormat(), re.getDefaultPixelFormat());

    // Output render element's data as an image.
    // Optionally, specify an image sub-region, or leave blank to get the entire image.
    LocalVRayImage img = re.getImage();
    img->saveToPngFile(re.getName() + ".png");

    // Similar to the image output, get the raw bytes.
    // Again, a sub-region can be specified, or the entire data can be obtained if left blank.
    void* data = NULL;
    if (int dataSize = re.getData(&data)) {
        ofstream datFile(re.getName() + ".dat", ios::out | ios::binary);
        datFile.write(reinterpret_cast<const char*>(data), dataSize);
        datFile.close();
        RenderElement::releaseData(data);
    }
}
// Set up rendering options
RendererOptions options = new RendererOptions();
options.RenderMode = RenderMode.Production;

// Render elements are only available in production mode!
using (VRayRenderer vr = new VRayRenderer(options))
{
    // Do some rendering
    vr.Load("./car.vrscene");

    // Access render elements via the VRayRenderer instance
    RenderElements re = vr.RenderElements;

    // Always good to check for null (i.e. non-production mode)
    if (re == null)
    {
        return;
    }

    // Add render element for normals; it can be as well present in the scene
    re.New(RenderElementType.NORMALS, "NormalsChannel", "Normals Channel");

    vr.Start();

    // Render elements are available as plugins which are part of the scene and can be obtained in both production and RT mode.
    // Only "RGB" and "Alpha" are implicit.
    // Here, we wait for the entire image to be completed before we access the render elements.
    vr.WaitForImageReady();

    // All Get* methods on the render elements may return nulls if data is not available
    var allRenderElements = re.GetAll();
    if (allRenderElements == null)
    {
        return;
    }

    // List all available render elements and process each one
    foreach (RenderElement r in allRenderElements)
    {
        Console.WriteLine("{0}, Format({1}), PixelFormat({2})\n", r.Name, r.Format, r.PixelFormat);

        // Output render element's data as an image.
        // Optionally, specify an image sub-region, or leave blank to get the entire image.
        VRayImage img = r.GetImage();
        img.SaveToPNG(true, r.Name + ".png");

        // Similar to the image output, get the raw bytes.
        // Again, a sub-region can be specified, or the entire data can be obtained if left blank.
        byte[] rawData = r.GetData();
        // Do something with rawData...
    }
}
// Set up rendering options
RendererOptions options = new RendererOptions();
options.setRenderMode(RenderMode.PRODUCTION);

// Render elements are only available in production mode!
try (VRayRenderer vr = new VRayRenderer(options)) {
    // Do some rendering
    vr.load("./car.vrscene");
    vr.render();

    // Render elements are available when the first bucket region is ready.
    // Here, we simply wait for the entire image to be completed before we access the render elements.
    vr.waitForImageReady();

    // Access render elements via the VRayRenderer instance
    RenderElements re = vr.getRenderElements();

    // Always good to check for null (i.e. non-production mode)
    if (re == null) {
        return;
    }

    // All get* methods on the render elements may return nulls if data is not available
    LinkedList<RenderElement> allRenderElements = re.getAll();
    if (allRenderElements == null) {
        return;
    }

    // List all available render elements and process each one
    for (RenderElement r : allRenderElements) {
        System.out.printf("%d. %s, Format(%s), PixelFormat(%s)\n", r.getIndex(), r.getName(), r.getFormat().toString(), r.getPixelFormat().toString());

        // Output render element's data as an image.
        // Optionally, specify an image sub-region, or leave blank to get the entire image.
        VRayImage img = r.getImage(/*new RectRegion(20, 100, 200, 100)*/);
        img.saveToPNG(r.getName() + ".png", true);

        // Similar to the image output, get the raw bytes.
        // Again, a sub-region can be specified, or the entire data can be obtained if left blank.
        byte[] rawData = r.getData(new RectRegion(20, 100, 200, 100));
        // Do something with rawData...
    }
} catch (VRayException e) {
    e.printStackTrace();
}
var vray = require('./vray.js');

var r = vray.VRayRenderer({ renderMode: "Production", imageWidth: 320, imageHeight: 200 });
r.load('./car_p.vrscene', function(err) {
    if (err) throw err;

    // Do some rendering
    r.render();

    // Render elements are available when the first bucket region is ready.
    // Here, we simply wait for the entire image to be completed before we access the render elements.
    r.waitForImageReady(function() {
        // Always good to check for null (i.e. non-production mode)
        if (r.renderElements == null) {
            r.close();
            return;
        }

        // Access render elements via the VRayRenderer instance
        var renderElements = r.renderElements.getAll();

        // List all available render elements and process each one
        for (var i in renderElements) {
            var renderElement = renderElements[i];
            console.log(i + ". " + renderElement.name + ", Format(" + renderElement.binaryFormat + "), PixelFormat(" + renderElement.pixelFormat + ")");

            // Output render element's data as an image.
            // Optionally, specify an image sub-region, or leave blank to get the entire image.
            var reImage = renderElement.getImage();
            reImage.saveSync(renderElement.name + ".png");

            // Similar to the image output, get the raw bytes.
            // Again, a sub-region can be specified, or the entire data can be obtained if left blank.
            var rawData = renderElement.getData();
            console.log(rawData);
        }
        r.close();
    });
});
IX. Plugins
Plugins are the objects that specify the lights, geometry, materials and settings that define the 3D scene. Each V-Ray scene consists of a set of plugin instances. The V-Ray Application SDK exposes methods in the main VRayRenderer class that can be used to create plugin instances or list the ones already present in the scene.
Plugin objects can be retrieved by name. Once a Plugin instance has been obtained, its property (a.k.a. parameter) values can be read or set to affect the rendered image. Plugin properties are identified by name, which is the same across all instances of the same plugin type. In RT mode, changes to Plugin property values are usually applied immediately during rendering and become visible almost instantly, although the image sampling is reset and the first images after a change are noisy. In Production mode, changes to Plugin property values take effect only if they are applied before the rendering process starts. The following example demonstrates how the transform of the render view (camera) in a scene can be changed with the V-Ray Application SDK.
import vray

with vray.VRayRenderer(showFrameBuffer=True) as renderer:
    renderer.load('./car.vrscene')
    renderer.start()

    # find the RenderView plugin in the scene
    renderView = renderer.plugins.renderView

    # change the transform value
    newTransform = renderView.transform
    newOffset = newTransform.offset
    newOffset.x = -170
    newOffset.y = 120
    renderView.transform = newTransform

    renderer.waitForImageReady(6000)
VRayInit init;
RendererOptions options;
options.showFrameBuffer = true;
VRayRenderer renderer(options);
renderer.load("./car.vrscene");

// find the RenderView plugin in the scene
RenderView renderView = renderer.getPlugin<RenderView>("renderView");

// change the transform value
Transform newTransform = renderView.getTransform("transform");
Vector& newOffset = newTransform.offset;
newOffset.set(-170, 120, newOffset.z);
renderView.set_transform(newTransform);

renderer.startSync();
renderer.waitForImageReady(6000);
RendererOptions options = new RendererOptions();
options.IsFrameBufferShown = true;

using (VRayRenderer renderer = new VRayRenderer(options))
{
    renderer.Load("./car.vrscene");
    renderer.Start();

    // find the RenderView plugin in the scene
    RenderView renderView = renderer.GetPlugin<RenderView>("renderView");

    // change the transform value
    Transform newTransform = renderView.Transform;
    newTransform.Offset.Set(new float[] { -170, 120, newTransform.Offset.Z });
    renderView.Transform = newTransform;

    renderer.WaitForImageReady(6000);
}
VRayRenderer renderer = null;
try {
    RendererOptions options = new RendererOptions();
    options.setShowFrameBuffer(true);
    renderer = new VRayRenderer(options);
    renderer.load("./car.vrscene");
    renderer.render();

    // find the RenderView plugin in the scene
    Plugin renderView = renderer.getPlugin("renderView");

    // change the transform value
    Transform newTransform = renderView.getTransform("transform");
    Vector newOffset = newTransform.getOffset();
    newOffset.set(new float[] { -170, 120, newOffset.getZ() });
    renderView.setValue("transform", newTransform);

    renderer.waitForImageReady(6000);
} catch (VRayException e) {
    System.err.println(e.getMessage());
} finally {
    VRayRenderer.close(renderer);
}
var vray = require('../vray');

var renderer = vray.VRayRenderer({ showFrameBuffer: true });
renderer.load('./car.vrscene', function(err) {
    if (err) throw err;
    renderer.render();

    // find the RenderView plugin in the scene
    var renderView = renderer.plugins.renderView;

    // change the transform value
    var newTransform = renderView.transform;
    var newOffset = newTransform.offset;
    newOffset.x = -170;
    newOffset.y = 120;
    renderView.transform = newTransform;

    renderer.waitForImageReady(6000, function() {
        renderer.close();
    });
});
The call to a method that retrieves the value of a Plugin property always returns a copy of the internal value stored in the local V-Ray engine. To change the property so that it affects the rendered image, the specialized setter method of the Plugin class must be called. Simply modifying the value returned by a getter will not change the scene, because it is a copy, not a reference.
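To make the copy-versus-setter distinction concrete, here is a minimal Python sketch based on the renderView example above; the comments describe the expected behavior rather than a guaranteed result in every AppSDK version.

import vray

with vray.VRayRenderer() as renderer:
    renderer.load('./car.vrscene')
    renderer.start()

    renderView = renderer.plugins.renderView

    # The getter returns a copy of the internal value...
    newTransform = renderView.transform
    newOffset = newTransform.offset
    newOffset.x = -170  # ...changing only this local copy does not affect the scene yet

    # To apply the change, assign the modified value back through the property setter
    renderView.transform = newTransform

    renderer.waitForImageReady(6000)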
Adding and removing plugins
Besides changing the property values of existing plugins, plugins can also be created and removed dynamically with the V-Ray Application SDK. The next example demonstrates how a new light can be created and added to the scene.
import vray

with vray.VRayRenderer() as renderer:
    renderer.load('./softShadows.vrscene')

    # create a new light plugin
    lightOmni = renderer.classes.LightOmni('LightOmniBlue')
    lightOmni.color = 'Color(0, 0, 6000)'
    lightOmni.shadowBias = 0.2
    lightOmni.decay = 2.0
    lightOmni.shadowRadius = 40.0
    lightOmni.transform = vray.Transform(
        vray.Matrix(vray.Vector(1.0, 0.0, 0.0),
                    vray.Vector(0.0, 0.0, 1.0),
                    vray.Vector(0.0, -1.0, 0.0)),
        vray.Vector(160, -30, 200))

    renderer.start()
    renderer.waitForImageReady(6000)
VRayInit init;
VRayRenderer renderer;
renderer.load("./softShadows.vrscene");

// create a new light plugin
LightOmni lightOmni = renderer.newPlugin<LightOmni>("LightOmniBlue");
lightOmni.set_color(Color(0.f, 0.f, 1.f));
lightOmni.set_intensity(60000.f);
lightOmni.set_decay(2.0f);
lightOmni.set_shadowRadius(40.0f);
lightOmni.set_transform(Transform(
    Matrix(Vector(1.0, 0.0, 0.0), Vector(0.0, 0.0, 1.0), Vector(0.0, -1.0, 0.0)),
    Vector(160.f, -30.f, 200.f)));

renderer.startSync();
renderer.waitForImageReady(6000);
using (VRayRenderer renderer = new VRayRenderer())
{
    renderer.Load("./softShadows.vrscene");

    // create a new light plugin
    LightOmni lightOmni = renderer.NewPlugin<LightOmni>("LightOmniBlue");
    lightOmni.Color = new Color(0, 0, 60000);
    lightOmni.ShadowBias = 0.2f;
    lightOmni.Decay = 2.0f;
    lightOmni.ShadowRadius = 40.0f;
    lightOmni.Transform = new Transform(
        new Matrix(new Vector(1.0, 0.0, 0.0), new Vector(0.0, 0.0, 1.0), new Vector(0.0, -1.0, 0.0)),
        new Vector(160, -30, 200));

    renderer.Start();
    renderer.WaitForImageReady(6000);
}
VRayRenderer renderer = null;
try {
    renderer = new VRayRenderer();
    renderer.load("./softShadows.vrscene");

    // create a new light plugin
    LightOmni lightOmni = renderer.newPlugin("LightOmniBlue", LightOmni.class);
    lightOmni.setColor(new Color(0f, 0f, 60000f));
    lightOmni.setShadowBias(0.2f);
    lightOmni.setDecay(2.0f);
    lightOmni.setShadowRadius(40.0f);
    Transform transform = Transform.fromString(
        "Transform(Matrix(Vector(1.0, 0.0, 0.0), Vector(0.0, 0.0, 1.0), Vector(0.0, -1.0, 0.0)), Vector(160, -30, 200))");
    lightOmni.setTransform(transform);

    renderer.render();
    renderer.waitForImageReady(6000);
} catch (VRayException e) {
    System.err.println(e.getMessage());
} finally {
    VRayRenderer.close(renderer);
}
var vray = require('../vray');

var renderer = vray.VRayRenderer();
renderer.load('./softShadows.vrscene', function(err) {
    if (err) throw err;

    // create a new light plugin
    var lightOmni = renderer.classes.LightOmni('LightOmniBlue');
    lightOmni.color = 'Color(0, 0, 1)';
    lightOmni.intensity = 60000;
    lightOmni.decay = 2.0;
    lightOmni.shadowRadius = 40.0;
    lightOmni.transform = vray.Transform(
        vray.Matrix(vray.Vector(1.0, 0.0, 0.0), vray.Vector(0.0, 0.0, 1.0), vray.Vector(0.0, -1.0, 0.0)),
        vray.Vector(160, -30, 200));

    renderer.render();
    renderer.waitForImageReady(6000, function() {
        renderer.close();
    });
});
In the following example we remove a sphere plugin by its name:
import vray

with vray.VRayRenderer() as renderer:
    renderer.load('./softShadows.vrscene')
    del renderer.plugins.GeoSphere01_node
    renderer.start()
    renderer.waitForImageReady(6000)
VRayInit init;
VRayRenderer renderer;
renderer.load("./softShadows.vrscene");
Plugin sphere = renderer.getPlugin("GeoSphere01_node");
renderer.removePlugin(sphere);
renderer.startSync();
renderer.waitForImageReady(6000);
using (VRayRenderer renderer = new VRayRenderer())
{
    renderer.Load("./softShadows.vrscene");
    Plugin sphere = renderer.GetPlugin("GeoSphere01_node");
    renderer.RemovePlugin(sphere);
    renderer.Start();
    renderer.WaitForImageReady(6000);
}
VRayRenderer renderer = null;
try {
    renderer = new VRayRenderer();
    renderer.load("./softShadows.vrscene");
    Plugin p = renderer.getPlugin("GeoSphere01_node");
    renderer.removePlugin(p);
    renderer.render();
    renderer.waitForImageReady(6000);
} catch (VRayException e) {
    System.err.println(e.getMessage());
} finally {
    VRayRenderer.close(renderer);
}
var vray = require('../vray');

var renderer = vray.VRayRenderer();
renderer.load('./softShadows.vrscene', function(err) {
    if (err) throw err;
    delete renderer.plugins.GeoSphere01_node;
    renderer.render();
    renderer.waitForImageReady(6000, function() {
        renderer.close();
    });
});
Auto commit of property changes
When rendering in Production mode, all changes to plugin properties should be made before the rendering starts. Any changes made after that do not affect the current rendering, but they are not lost. In interactive (RT) mode, changes made after the rendering starts are reflected dynamically in the scene. By default each change is applied immediately, but the user can choose to batch a group of changes and apply them together, usually for better performance. This is done using the autoCommit property of the VRayRenderer class. The following example demonstrates how a group of changes is batched and commit is called explicitly to apply them.
"""
This example shows how to make a group of changes to a scene simultaneously.
The renderer property autoCommit is set to False so changes are applied with
delayed commit. See also addRemovePlugins.py.
"""
from vray import *
import os

def createNewPlugin(renderer):
    # create a new light plugin
    lightOmni = renderer.classes.LightOmni()
    lightOmni.color = Color(0, 0, 60000)
    lightOmni.shadowBias = 0.2
    lightOmni.decay = 2.0
    lightOmni.shadowRadius = 40.0
    lightOmni.transform = Transform(
        Matrix(Vector(1.0, 0.0, 0.0), Vector(0.0, 0.0, 1.0), Vector(0.0, -1.0, 0.0)),
        Vector(160, -30, 200))

with VRayRenderer() as r:
    r.autoCommit = False
    r.load('./softShadows.vrscene')
    r.startSync()

    # This change won't be applied immediately
    createNewPlugin(r)

    r.waitForImageReady(2000)

    # Make a group of changes 2 seconds after the render starts
    r.plugins.renderView.fov = 1.5
    del r.plugins.GeoSphere01_node

    # Commit applies all 3 changes
    r.commit()
    r.waitForImageReady(4000)
VRayInit init;
RendererOptions options;
options.renderMode = RendererOptions::RENDER_MODE_RT_CPU;
VRayRenderer renderer(options);
renderer.setAutoCommit(false);
renderer.load("./softShadows.vrscene");

// This change won't be applied immediately
LightOmni lightOmni = renderer.newPlugin<LightOmni>("LightOmniBlue");
lightOmni.set_color(Color(0.f, 0.f, 60000.f));
lightOmni.set_shadowBias(0.2f);
lightOmni.set_decay(2.0f);
lightOmni.set_shadowRadius(40.0f);
lightOmni.setValueAsString("transform", "Transform(Matrix(Vector(1.0, 0.0, 0.0), Vector(0.0, 0.0, 1.0), Vector(0.0, -1.0, 0.0)), Vector(160, -30, 200))");

renderer.startSync();
renderer.waitForImageReady(2000);

// Make a group of changes 2 seconds after the render starts
Plugin sphere = renderer.getPlugin("GeoSphere01_node");
renderer.removePlugin(sphere);
RenderView renderView = renderer.getPlugin<RenderView>("renderView");
renderView.set_fov(1.5f);

// Commit applies all 3 changes
renderer.commit();
renderer.waitForImageReady(4000);
RendererOptions options = new RendererOptions();
options.RenderMode = RenderMode.RT_CPU;
options.AutoCommit = false;

using (VRayRenderer renderer = new VRayRenderer(options))
{
    renderer.Load("./softShadows.vrscene");

    // This change won't be applied immediately
    LightOmni lightOmni = renderer.NewPlugin<LightOmni>("LightOmniBlue");
    lightOmni.Color = new Color(0, 0, 60000);
    lightOmni.ShadowBias = 0.2f;
    lightOmni.Decay = 2.0f;
    lightOmni.ShadowRadius = 40.0f;
    lightOmni.Transform = Transform.FromString("Transform(Matrix(Vector(1.0, 0.0, 0.0), Vector(0.0, 0.0, 1.0), Vector(0.0, -1.0, 0.0)), Vector(160, -30, 200))");

    renderer.Start();
    renderer.WaitForImageReady(2000);

    // Make a group of changes 2 seconds after the render starts
    Plugin sphere = renderer.GetPlugin("GeoSphere01_node");
    renderer.RemovePlugin(sphere);
    RenderView renderView = renderer.GetPlugin<RenderView>("renderView");
    renderView.Fov = 1.5f;

    // Commit applies all 3 changes
    renderer.Commit();
    renderer.WaitForImageReady(4000);
}
VRayRenderer renderer = null;
try {
    RendererOptions options = new RendererOptions();
    options.setRenderMode(RenderMode.RT_CPU);
    renderer = new VRayRenderer(options);
    renderer.setAutoCommit(false);
    renderer.load("./softShadows.vrscene");

    // This change won't be applied immediately
    LightOmni lightOmni = renderer.newPlugin("LightOmniBlue", LightOmni.class);
    lightOmni.setColor(new Color(0f, 0f, 60000f));
    lightOmni.setShadowBias(0.2f);
    lightOmni.setDecay(2.0f);
    lightOmni.setShadowRadius(40.0f);
    Transform transform = Transform.fromString(
        "Transform(Matrix(Vector(1.0, 0.0, 0.0), Vector(0.0, 0.0, 1.0), Vector(0.0, -1.0, 0.0)), Vector(160, -30, 200))");
    lightOmni.setTransform(transform);

    renderer.render();
    renderer.waitForImageReady(2000);

    // Make a group of changes 2 seconds after the render starts
    Plugin p = renderer.getPlugin("GeoSphere01_node");
    renderer.removePlugin(p);
    Plugin renderView = renderer.getPlugin("renderView");
    renderView.setValue("fov", 1.5f);

    // Commit applies all 3 changes
    renderer.commit();
    renderer.waitForImageReady(6000);
} catch (VRayException e) {
    System.err.println(e.getMessage());
} finally {
    VRayRenderer.close(renderer);
}
var vray = require('../vray');

var renderer = vray.VRayRenderer({ renderMode: 'rtCPU', autoCommit: false });
renderer.load('./softShadows.vrscene', function(err) {
    if (err) throw err;
    renderer.render();

    // This change won't be applied immediately
    var lightOmni = renderer.classes.LightOmni();
    lightOmni.color = 'Color(0, 0, 60000)';
    lightOmni.decay = 2.0;
    lightOmni.shadowRadius = 40.0;
    lightOmni.transform = 'Transform(Matrix(Vector(1.0, 0.0, 0.0), ' +
        'Vector(0.0, 0.0, 1.0), ' +
        'Vector(0.0, -1.0, 0.0)), Vector(160, -30, 200))';

    // Make a group of changes 2 seconds after the render starts
    setTimeout(function() {
        renderer.plugins.renderView.fov = 1.5;
        delete renderer.plugins.GeoSphere01_node;

        // Commit applies all 3 changes
        renderer.commit();
    }, 2000);

    renderer.waitForImageReady(6000, function() {
        renderer.close();
    });
});
X. Property Types
The following types are recognized for plugin properties (a short sketch of setting several of them follows the list):
- Basic types: int, bool, float, Color (3 float RGB), AColor (4 float ARGB), Vector (3 float), string (UTF-8), Matrix (3 Vectors), Transform (a Matrix and a Vector for translation)
- Objects: references to other plugin instances
- Typed lists: The typed lists in AppSDK are IntList, FloatList, ColorList and VectorList.
- Generic heterogeneous lists: The AppSDK uses a generic type class called Value for items in a generic list. Note that generic lists can be nested.
- Output parameters: These are additional values generated by a given plugin which may be used as input by others.
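As a quick orientation, the sketch below sets properties of several of these types in Python, reusing the LightOmni, Transform, Matrix and Vector types shown earlier in this guide; the commented-out lines for object references and lists use hypothetical plugin and property names purely for illustration.

import vray

with vray.VRayRenderer() as renderer:
    renderer.load('./softShadows.vrscene')

    light = renderer.classes.LightOmni('LightOmniTypes')

    # Basic types
    light.decay = 2.0                         # float
    light.color = vray.Color(1.0, 0.5, 0.25)  # Color (3 float RGB)
    light.transform = vray.Transform(         # Transform (a Matrix plus an offset Vector)
        vray.Matrix(vray.Vector(1, 0, 0),
                    vray.Vector(0, 1, 0),
                    vray.Vector(0, 0, 1)),
        vray.Vector(0, 0, 100))

    # Object references: a property can point to another plugin instance
    # (the TexChecker class and the property name below are hypothetical):
    # texture = renderer.classes.TexChecker('someTexture')
    # someMaterial.someTextureSlot = texture

    # Typed and generic lists are assigned as plain Python lists
    # (hypothetical property names):
    # plugin.someIntList = [1, 2, 3]
    # plugin.someGenericList = [1, 2.5, 'text', [4, 5]]

    renderer.start()
    renderer.waitForImageReady(6000)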
XI. Animations
The V-Ray AppSDK supports rendering of animated scenes. Animated scenes contain animated plugin properties. A plugin property is considered animated if it has a sequence of values, where each value is given for a specific time or frame number. These values define key frames, and during sampling V-Ray interpolates between them.
Two additional events are emitted while rendering a sequence of frames - sequence start and sequence done. The image ready event is emitted for each finished frame, and the current frame number can be obtained from the sequenceFrame property of the renderer.
import vray
import os

def onImageReady(r):
    print('Image Ready, frame ' + str(r.sequenceFrame) + ' (sequenceDone = ' + str(r.sequenceDone) + ')')
    r.continueSequence()

with vray.VRayRenderer(renderMode='rtGPU', rtSampleLevel=15) as renderer:
    renderer.load('./anim/anim_cube.vrscene')
    renderer.setOnImageReady(onImageReady)

    # When a single number is passed, it's interpreted as "start", with "end" being the last frame and step=1
    sequence = [{'start': 4, 'end': 5}, {'start': 1, 'end': 5, 'step': 2}, 9]
    renderer.renderSequence(sequence)
    renderer.waitForSequenceDone()
#include "vraysdk.hpp" using namespace VRay; using namespace std; VRayInit init; VRayRenderer renderer; renderer.load("./anim/anim_cube.vrscene"); renderer.renderSequence(); while (false == renderer.isSequenceDone()) { renderer.waitForImageReady(60000); printf("Frame %d is ready\n", renderer.getSequenceFrame()); renderer.continueSequence(); }
using VRay; static string CubeSceneFilePath = @"\scenes\anim\anim_cube.vrscene"; static VRayRenderer renderer; private static void OnImageReady(object sender, EventArgs e) { Console.WriteLine("Image ready, frame {0}", renderer.SequenceFrame); renderer.ContinueSequence(); } private static void OnSequenceStarted(object sender, EventArgs e) { Console.WriteLine("Sequence started."); } private static void OnSequenceDone(object sender, EventArgs e) { Console.WriteLine("Sequence done."); } public static void Main(string[] args) { RendererOptions options = new RendererOptions(); options.RenderMode = RenderMode.RT_CPU; // Set the noise threshold to lower per frame rendering times options.RTNoiseThreshold = 0.25f; using (renderer = new VRayRenderer(options)) { renderer.Load(CubeSceneFilePath); renderer.ImageReady += new EventHandler(OnImageReady); renderer.SequenceStarted += new EventHandler(OnSequenceStarted); renderer.SequenceDone += new EventHandler(OnSequenceDone); // SubSequenceDesc(startFrame, endFrame, step) - the latter two parameters are optional. // By default endFrame is the last frame and step is 1, so it could be just a single frame and the implicit step is 1. // The range can be inverted and the separate ranges need not be sorted. // All these options are seen in the example array. SubSequenceDesc[] seq = { new SubSequenceDesc(3, 5), new SubSequenceDesc(1), new SubSequenceDesc(2), new SubSequenceDesc(10, 6, 2) }; renderer.RenderSequence(seq); renderer.WaitForSequenceDone(); } }
import com.chaosgroup.vray.*; VRayRenderer renderer = null; try { final String VRAY_SDK = System.getenv("VRAY_SDK"); final String SCENES_FOLDER_PATH = VRAY_SDK + "/scenes/anim/"; final String SCENE_FILENAME = "anim_cube.vrscene"; final String SCENE_FILE = SCENES_FOLDER_PATH + SCENE_FILENAME; // create RT renderer renderer = new VRayRenderer(new RendererOptions() .setRenderMode(RenderMode.RT_GPU_CUDA) ); final VRayRenderer r = renderer; // set image ready callback renderer.setOnImageReady(new OnImageReadyListener() { @Override public void onImageReady() { System.out.format("Image Ready, frame %d (sequenceDone = %b)%n", r.getSequenceFrame(), r.isSequenceDone()); r.continueSequence(); } }); // load scene renderer.load(SCENE_FILE); SubSequenceDesc[] ss = {new SubSequenceDesc(1), new SubSequenceDesc(5, 2, 2), new SubSequenceDesc(5, 10)}; // start the rendering process in a new thread - non-blocking call renderer.renderSequence(ss); // wait for the whole sequence to finish renderer.waitForSequenceDone(); System.out.format("Sequence done.%n"); } catch (VRayException e) { System.err.println(e.getMessage()); } finally { VRayRenderer.close(renderer); }
var vray = require('../vray'); var renderer = vray.VRayRenderer({ renderMode: 'rtCPU' }); renderer.on('imageReady', function () { console.log('Image Ready, frame ' + renderer.sequenceFrame + ' (sequenceDone = ' + renderer.sequenceDone + ')'); renderer.continueSequence(); }); renderer.on('sequenceStart', function () { console.log('Sequence started.'); }); renderer.on('sequenceDone', function () { console.log('Sequence done. Closing renderer...'); renderer.close(); }); renderer.load('./anim/anim_cube.vrscene', function (err) { if (err) { renderer.close(); throw err; } // renderSequence() arguments can be in an array but it's not obligatory // simple example: renderSequence({ start: 3, end: 8}) // When a single number is passed, it's interpreted as "start", with "end" being the last frame and step=1 renderer.renderSequence([{ start: 4, end: 5}, 1, 3, 2, { start: 10, end: 1, step: -2}, 5, { start: 2, end: 6}]); });
XII. V-Ray Server
A V-Ray Server is used as a remote render host during the rendering of a scene. The server is a process that listens for render requests on a specific network port, and clients such as instances of the VRayRenderer class can connect to it to delegate part of the image rendering. The server cannot be used on its own to start a rendering process, but it plays an essential role in completing distributed tasks initiated by clients. In order to take advantage of the distributed rendering capabilities of V-Ray, such servers need to be running and available for requests.
The V-Ray Application SDK allows for easily preparing V-Ray server processes that can be used in distributed rendering. The API offers a specialized class VRayServer that exposes methods for starting and stopping server processes. In addition the VRayServer class enables users to subscribe to V-Ray server specific events so that custom logic can be executed upon their occurrence.
The supported events occur when:
- The server process starts and is ready to accept render requests
- A render client connects to the server to request a distributed rendering task
- A render client disconnects from the server
- A server message is produced to denote a change in the server status or to track the progress of the server work
The next code snippet demonstrates how the VRayServer can be instantiated and used to start a server process.
import vray def onStart(server): print('Server is ready.') def onDumpMessage(server, message, level): print('[{0}] {1}'.format(level, message)) def onDisconnect(server, host): print('Host {0} disconnected from server.'.format(host)) def onConnect(server, host): print('Host {0} connected to server.'.format(host)) server = vray.VRayServer(portNumber=20207) server.setOnStart(onStart) server.setOnDumpMessage(onDumpMessage) server.setOnDisconnect(onDisconnect) server.setOnConnect(onConnect) server.run()
#include "vraysrv.hpp" using namespace VRay; void onLogMessage(VRayServer& server, const char* message, int level, void* userData) { printf("[%d] %s\n", level, message); } VRayServerInit init; ServerOptions options; options.portNumber = 20207; VRayServer server(options); server.setOnDumpMessage(onLogMessage); server.run();
ServerOptions options = new ServerOptions(); options.PortNumber = 20207; using (VRayServer server = new VRayServer(options)) { server.Started += new EventHandler((source, e) => { Console.WriteLine("Server is ready"); }); server.HostConnected += new EventHandler<HostEventArgs>((source, e) => { Console.WriteLine("Host {0} connected to server.", e.Host); }); server.HostDisconnected += new EventHandler<HostEventArgs>((source, e) => { Console.WriteLine("Host {0} disconnected from server.", e.Host); }); server.MessageLogged += new EventHandler<MessageEventArgs>((source, e) => { Console.WriteLine("[{0}] {1}", e.LogLevel.Type, e.Message); }); server.Run(); }
ServerOptions options = new ServerOptions(); options.setPortNumber(20207); VRayServer server = null; try { server = new VRayServer(options); server.setOnServerEventListener(new OnServerEventListener() { @Override public void onStart() { System.out.println("Server is ready"); } @Override public void onLogMessage(String message, LogLevel logLevel) { System.out.format("[%s] %s\n", logLevel.getType(), message); } @Override public void onDisconnect(String host) { System.out.format("Host %s disconnected from server.\n", host); } @Override public void onConnect(String host) { System.out.format("Host %s connected to server.\n", host); } }); server.run(); } catch (VRayException e) { System.err.println(e.getMessage()); } finally { if (server != null) { server.close(); } }
var vray = require('../vray'); var server = vray.VRayServer({ portNumber : 20207 }); server.on('start', function() { console.log('Server is ready'); }); server.on('connect', function(host) { console.log('Host ' + host + ' connected to server.'); }) server.on('disconnect', function(host) { console.log('Host ' + host + ' disconnected from server.'); }) server.on('dumpMessage', function(message, level) { console.log('[%d] %s', level, message); }) server.start(); process.on('exit', function() { server.close(); });
XIII. Distributed Rendering
The V-Ray Application SDK allows third-party integrators to make full use of the V-Ray distributed rendering engine. Distributed rendering allows parts of the image to be rendered in parallel by multiple render hosts. The client process which initiates the rendering synchronizes the results produced by the render servers, also known as render slaves. All render modes support distributed rendering.
Render hosts can be dynamically added and removed at any point in time during the rendering process. A render host should have a V-Ray server running on it when the render client tries to add it to its list of active hosts, because the client verifies at that moment whether a connection can be established for a distributed task.
Each render host is identified by an address:port pair specifying the IP or DNS address of the machine with the V-Ray server and the port on which this server is listening for render requests. The VRayRenderer class exposes methods for the dynamic addition and removal of hosts. Irrespective of the progress made by the main render process in the client, remote machines can successfully begin to participate in the rendering as long as they have running V-Ray servers.
import vray import time with vray.VRayRenderer() as renderer: renderer.load('./car.vrscene') renderer.start() # attach several hosts after a few seconds # If port is omitted, it defaults to 20207 time.sleep(6) renderer.addHosts('10.0.0.132:20207;10.0.0.132:20208') # display render host info print('Active hosts: ', renderer.getActiveHosts()) print('All hosts: ', renderer.getAllHosts()) # wait for the final image no more than the specified number of # milliseconds renderer.waitForImageReady(6000)
VRayInit init; VRayRenderer renderer; renderer.load("./car.vrscene"); renderer.startSync(); // Attach several hosts after a few seconds // If port is omitted, it defaults to 20207 renderer.waitForImageReady(6000); renderer.addHosts("10.0.0.132:20207;10.0.0.132:20208"); // display render host info printf("Active hosts: %s\n", renderer.getActiveHosts().c_str()); printf("All hosts: %s\n", renderer.getAllHosts().c_str()); // wait for the final image no more than the specified number of milliseconds renderer.waitForImageReady(6000);
using (VRayRenderer renderer = new VRayRenderer()) { renderer.Load("./car.vrscene"); renderer.Start(); // attach several hosts after a few seconds // If port is omitted, it defaults to 20207 System.Threading.Thread.Sleep(6000); renderer.AddHosts("10.0.0.132:20207;10.0.0.132:20208"); // display render host info Console.WriteLine("Active hosts: {0}", renderer.ActiveHosts); Console.WriteLine("All hosts: {0}", renderer.AllHosts); // wait for the final image no more than the specified number of milliseconds renderer.WaitForImageReady(6000); }
VRayRenderer renderer = null; try { renderer = new VRayRenderer(); renderer.load("./car.vrscene"); renderer.render(); // attach several hosts after a few seconds // If port is omitted, it defaults to 20207 Thread.sleep(6000); renderer.addHosts("10.0.0.132:20207;10.0.0.132:20208"); // display render host info System.out.println("Active hosts: " + renderer.getActiveHosts()); System.out.println("All hosts: " + renderer.getAllHosts()); // wait for the final image no more than the specified number of milliseconds renderer.waitForImageReady(6000); } catch (VRayException e) { System.err.println(e.getMessage()); } catch (InterruptedException e) { System.err.println(e.getMessage()); } finally { VRayRenderer.close(renderer); }
var vray = require('../vray'); var renderer = vray.VRayRenderer(); renderer.load('./car.vrscene', function(err) { if (err) throw err; renderer.render(); // attach several hosts after a few seconds // If port is omitted, it defaults to 20207 setTimeout(function() { renderer.addHosts("10.0.0.132:20207;10.0.0.132:20208", function() { // display render host info console.log("Active hosts: " + renderer.getActiveHostsSync()); console.log("All hosts: " + renderer.getAllHostsSync()); // wait for the final image no more than the specified number of milliseconds renderer.waitForImageReady(6000, function() { renderer.close(); }); }); }, 6000); });
Removing render hosts is just as easy as adding them. Even if all remote render hosts disconnect or are removed from the render host list, the rendering process will continue on the client machine that initiated it.
import vray import time with vray.VRayRenderer() as renderer: renderer.addHosts('10.0.0.132:20207;10.0.0.132:20208') renderer.load('./car.vrscene') renderer.start() # detach a host after a few seconds time.sleep(6) renderer.removeHosts('10.0.0.132:20208') # display render host info print('Active hosts: ', renderer.getActiveHosts()) print('Inactive hosts: ', renderer.getInactiveHosts()) print('All hosts: ', renderer.getAllHosts()) # wait for the final image no more than the specified number of # milliseconds renderer.waitForImageReady(6000)
VRayInit init; VRayRenderer renderer; renderer.addHosts("10.0.0.132:20207;10.0.0.132:20208"); renderer.load("./car.vrscene"); renderer.startSync(); // detach a host after a few seconds renderer.waitForImageReady(6000); renderer.removeHosts("10.0.0.132:20208"); // display render host info printf("Active hosts: %s\n", renderer.getActiveHosts().c_str()); printf("Inactive hosts: %s\n", renderer.getInactiveHosts().c_str()); printf("All hosts: %s\n", renderer.getAllHosts().c_str()); // wait for the final image no more than the specified number of milliseconds renderer.waitForImageReady(6000);
using (VRayRenderer renderer = new VRayRenderer()) { renderer.AddHosts("10.0.0.132:20207;10.0.0.132:20208"); renderer.Load("./car.vrscene"); renderer.Start(); // detach a host after a few seconds System.Threading.Thread.Sleep(6000); renderer.RemoveHosts("10.0.0.132:20208"); // display render host info Console.WriteLine("Active hosts: {0}", renderer.ActiveHosts); Console.WriteLine("Inactive hosts: {0}", renderer.InactiveHosts); Console.WriteLine("All hosts: {0}", renderer.AllHosts); // wait for the final image no more than the specified number of milliseconds renderer.WaitForImageReady(6000); }
VRayRenderer renderer = null; try { renderer = new VRayRenderer(); renderer.addHosts("10.0.0.132:20207;10.0.0.132:20208"); renderer.load("./car.vrscene"); renderer.render(); // detach a host after a few seconds Thread.sleep(6000); renderer.removeHosts("10.0.0.132:20208"); // display render host info System.out.println("Active hosts: " + renderer.getActiveHosts()); System.out.println("Inactive hosts: " + renderer.getInactiveHosts()); System.out.println("All hosts: " + renderer.getAllHosts()); // wait for the final image no more than the specified number of milliseconds renderer.waitForImageReady(6000); } catch (VRayException e) { System.err.println(e.getMessage()); } catch (InterruptedException e) { System.err.println(e.getMessage()); } finally { VRayRenderer.close(renderer); }
var vray = require('../vray'); var renderer = vray.VRayRenderer(); renderer.addHosts("10.0.0.132:20207;10.0.0.132:20208", function() { renderer.load('./car.vrscene', function(err) { if (err) throw err; renderer.render(); // detach a host after a few seconds setTimeout(function() { renderer.removeHosts("10.0.0.132:20208", function() { // display render host info console.log("Active hosts: " + renderer.getActiveHostsSync()); console.log("Inactive hosts: " + renderer.getInactiveHostsSync()); console.log("All hosts: " + renderer.getAllHostsSync()); // wait for the final image no more than the specified number of milliseconds renderer.waitForImageReady(6000, function () { renderer.close(); }); }); }, 6000); }); });
XIV. Setup
Each time the AppSDK is used, one of the setenv{python version}.sh scripts should be run to set up the required environment variables; if you are not interested in the Python binding, you can run either of them. The following variables are set (a short sanity-check sketch follows the list):
- PATH (Windows) - Appended with the path to V-Ray's and the SDK's binaries
- LD_LIBRARY_PATH (Linux) - Appended with the path to V-Ray's and the SDK's binaries
- DYLD_LIBRARY_PATH (MacOS) - Appended with the path to V-Ray's and the SDK's binaries
- VRAY_PATH - Path to where V-Ray and its plugins reside
- PYTHONPATH - Used only by the Python binding
- VRAY_SDK - Path to the main SDK folder used only by code examples
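As an illustration, here is a small, hypothetical Python sketch (not part of the SDK) that checks the cross-platform variables from the list above before importing the vray module, so that a missing setenv step is reported clearly:
import os

# Verify the variables that the setenv script is expected to define.
# PATH / LD_LIBRARY_PATH / DYLD_LIBRARY_PATH are platform-specific,
# so only the cross-platform ones are checked here.
required = ['VRAY_PATH', 'VRAY_SDK', 'PYTHONPATH']
missing = [name for name in required if not os.environ.get(name)]
if missing:
    raise RuntimeError('Run the setenv script first; missing: ' + ', '.join(missing))

import vray  # loads correctly only if the library paths have been set
print('AppSDK scenes folder:', os.path.join(os.environ['VRAY_SDK'], 'scenes'))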