Every nDisplay setup has a single primary computer, and any number of additional computers, called secondary nodes.

Each computer in the network runs one or more instances of your Unreal project, either in -game mode or in packaged format.

Each Unreal Engine instance handles rendering to one or more display devices, such as screens, LED displays, or projectors.

For each display device an instance of Unreal Engine handles, it renders a viewport that shares the same view origin, or viewpoint. By setting up these viewports so that their locations in the 3D world match the physical locations of the screens or projection surfaces in the real world, you give viewers the illusion of being present and immersed in the virtual world.

The primary node is also responsible for accepting input from spatial trackers and controllers through connections to Virtual-Reality Peripheral Network (VRPN) servers via Live Link, and for replicating that input to all other connected computers.

The image above shows a possible nDisplay network. Like all nDisplay networks, one of its PCs acts as the primary node. This primary node accepts input into the system from a VRPN server, which relays signals that come from spatial tracking devices and other controller devices. The network also contains several other PCs that run other instances of the Unreal Engine project. Each of these secondary nodes drives one or more display projectors.

**One application instance and host computer per multiple display devices**

With this option, you run a single instance of Unreal Engine per computer, but you set it up to render multiple views of the scene's 3D space. Using the Output Mapping tool, these separate viewports are then mapped into different areas of a large 2D canvas, referred to as the application window.

We recommend leveraging multi-display technologies from graphics card vendors, such as NVIDIA Mosaic or AMD Eyefinity, to treat multiple connected displays as one display. This ensures that Unreal can render fullscreen on these aggregated screens for better display sync and performance. In this scenario, you can map all your viewports onto this extended display canvas using the nDisplay Output Mapping tool.

For improved performance, you can also leverage a second GPU for viewport rendering. Once all pixels are copied and available on the display-facing GPU, they are composited in the application window and sent to the GPU outputs. For more details on using multiple graphics cards, refer to Multi-GPU Support.

nDisplay adds several components to the usual Unreal system architecture:

A plugin that works inside the Unreal Engine. It communicates and synchronizes information between all the application instances that make up the cluster, ensures all instances render the same frame at the same time, ensures each display device renders the correct frustum of the game world, and more.

A shared configuration asset that contains all the settings nDisplay needs to start up the correct number of instances on the correct computers, each rendering the right points of view on the game's 3D world to produce the illusion of a seamless rendering across all display screens or projectors. To create it, you express all of this information in a series of settings: you tell nDisplay about the different computers in your network, the size and resolution of the screens or projectors those computers render to, the spatial relationships between those screens in 3D space, and more. See the nDisplay 3D Config Editor for more information.
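As a concrete illustration of what such a configuration asset captures, here is a hypothetical sketch loosely modeled on nDisplay's older text-based config format. Every id, address, and dimension below is invented for illustration, and recent engine versions define this data in a `.ndisplay` asset edited through the 3D Config Editor rather than by hand:

```ini
; Illustrative only -- all ids, addresses, and sizes are made up.
; One cluster node per computer, each tied to a window and viewport.
[cluster_node] id=node_left  addr=192.168.0.101 window=wnd_left master=true
[cluster_node] id=node_right addr=192.168.0.102 window=wnd_right

; Fullscreen output window and the viewport rendered into it.
[window]   id=wnd_left fullscreen=true ResX=1920 ResY=1080 viewports=vp_left
[viewport] id=vp_left x=0 y=0 width=1920 height=1080 projection=proj_left

; The physical screen: its position, rotation, and size in 3D space,
; plus the shared view origin all viewports render from.
[projection] id=proj_left type=simple screen=scr_left
[screen] id=scr_left loc="X=1.5,Y=-0.5,Z=0" rot="P=0,Y=-30,R=0" size="X=1.77,Y=1.0"
[camera] id=cam_static loc="X=0,Y=0,Z=0"
```

The key idea is that the asset couples network topology (which machine runs which instance) with real-world geometry (where each screen physically sits relative to the shared viewpoint).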
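Because each viewport shares one view origin but covers a differently placed physical screen, each display needs an asymmetric (off-axis) projection frustum. The sketch below shows the standard generalized-perspective-projection geometry for deriving the near-plane extents from the eye position and the screen's corners; it illustrates the underlying math, not nDisplay's internal code, and all names are invented:

```python
# Derive asymmetric near-plane extents (left, right, bottom, top) for an
# off-axis frustum, given a shared eye position and a physical screen
# rectangle in world space. Illustrative sketch, not nDisplay API.
from math import sqrt

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])
def norm(a):
    m = sqrt(dot(a, a))
    return tuple(x / m for x in a)

def off_axis_frustum(eye, pa, pb, pc, near):
    """pa, pb, pc: lower-left, lower-right, upper-left screen corners."""
    vr, vu = norm(sub(pb, pa)), norm(sub(pc, pa))  # screen right/up axes
    vn = norm(cross(vr, vu))                       # screen normal, toward eye
    va, vb, vc = sub(pa, eye), sub(pb, eye), sub(pc, eye)
    d = -dot(va, vn)                               # eye-to-screen distance
    scale = near / d
    return (dot(vr, va) * scale, dot(vr, vb) * scale,   # left, right
            dot(vu, va) * scale, dot(vu, vc) * scale)   # bottom, top
```

With the eye centered in front of the screen the frustum comes out symmetric; moving the eye sideways skews it, which is exactly what keeps the rendered image correct from the viewer's physical position.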
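The Output Mapping step itself is conceptually simple: each viewport is assigned a pixel rectangle inside the one large application window. A minimal sketch of a side-by-side layout, assuming hypothetical viewport records (this is not the Output Mapping tool's actual API):

```python
# Tile several per-display viewports left to right into one application
# window, the way viewports are laid out on the Output Mapping canvas.
# Illustrative sketch; record shape and names are invented.
def layout_row(viewports):
    """viewports: list of (id, width, height) in pixels.
    Returns (canvas_width, canvas_height) and per-viewport (x, y) offsets."""
    offsets, x = {}, 0
    for vid, w, h in viewports:
        offsets[vid] = (x, 0)   # place each viewport after the previous one
        x += w
    canvas = (x, max(h for _, _, h in viewports))
    return canvas, offsets
```

For example, three 1920x1080 viewports tile into a 5760x1080 application window, which matches the single extended surface that Mosaic or Eyefinity presents to the engine.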