Building on the architecture of the OGRE engine makes it convenient to simulate a realistic three-dimensional stage lighting environment. The immersion and interaction offered by virtual reality technology not only turn static design work into dynamic reproduction, but also capture and present the designer's conception, creativity and inspiration in a timely way. A mature and complete virtual design platform is therefore a professional and practical tool for lighting designers, performance directors, lighting console operators, lighting art teaching and lighting effect demonstrations. Its real-time, convenient interaction makes the system responsive and easy to operate, and it provides users with a real-time interactive working environment that meets their needs.
1. System architecture
1. Logic architecture
Whether for a game or for virtual reality, the virtual scene is usually complex in order to appear realistic, so it is generally built with a 3D modeling tool and then rendered and output in real time. For stage lighting design, 3DMAX provides the basic scene files; these are parsed through a DOM (Document Object Model) interface, the resources are imported and the scene is organized, and the result is finally handed to the OGRE system for rendering.
There are various lighting models, and each of their effects has to be implemented separately in the scene. At the same time, users must be able to perform a variety of operations through the UI. The system therefore involves complex operations and a large amount of computation, and it must be highly scalable.
It is therefore necessary to design an overall implementation architecture (shown in Figure 1) with efficient data processing and computation, strong scalability, and loosely coupled, highly cohesive functional modules. The system is divided into a resource layer, an interface layer and a rendering layer. Resource layer: the scene organization, material, entity, texture and other resource files required by the system, exported from 3DMAX through the Ofusion plug-in. Interface layer: responsible for importing these resource files and building the scene from them.
Rendering layer: renders the stage scene and lights, manages system resources, and responds to user interaction in real time.
2. Research on key technologies of system realization
1. Organization of the scene
The resource files required by the system are exported from 3DMAX through the Ofusion plug-in. The exported scene organization file is in XML format and records the basic parameters of the stage as well as the position and orientation of each stage entity, roughly as sketched below.
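As a hedged illustration only: the snippet below reads a dotScene-style XML file of the kind the Ofusion exporter produces. The file name and the element and attribute names (node, position, entity, meshFile) follow the common dotScene convention and are assumptions, not details taken from the paper.

```cpp
// Sketch: reading stage entities from an oFusion/dotScene-style XML file.
// File name and element/attribute names are assumed; adjust to the real export.
#include <cstdio>
#include <tinyxml2.h>

int main() {
    using namespace tinyxml2;
    XMLDocument doc;
    if (doc.LoadFile("stage.scene") != XML_SUCCESS) {   // hypothetical file name
        std::printf("cannot load scene file\n");
        return 1;
    }
    // Assumes <scene><nodes><node name="..."><position x y z/><entity meshFile="..."/></node>...
    XMLElement* nodes = doc.FirstChildElement("scene")->FirstChildElement("nodes");
    for (XMLElement* n = nodes->FirstChildElement("node"); n != nullptr;
         n = n->NextSiblingElement("node")) {
        const char* name = n->Attribute("name");
        XMLElement* pos  = n->FirstChildElement("position");
        float x = pos->FloatAttribute("x");
        float y = pos->FloatAttribute("y");
        float z = pos->FloatAttribute("z");
        const char* mesh = n->FirstChildElement("entity")->Attribute("meshFile");
        std::printf("%s -> %s at (%.1f, %.1f, %.1f)\n", name, mesh, x, y, z);
        // In the real system each record becomes an Ogre::Entity attached to a SceneNode.
    }
    return 0;
}
```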
Scene nodes are organized as a tree. Each node has a parent node, so by operating on a parent node we can conveniently move and rotate several child nodes at the same time, as in the sketch below.
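A minimal sketch of this parent–child behaviour with the OGRE 1.x scene graph; the node names and mesh files ("Stage", "truss.mesh", "lamp.mesh") are hypothetical:

```cpp
#include <Ogre.h>

// Sketch: one operation on a parent SceneNode moves/rotates every child under it.
// Node and mesh names are illustrative only.
void buildStageSubtree(Ogre::SceneManager* sceneMgr) {
    Ogre::SceneNode* stage = sceneMgr->getRootSceneNode()->createChildSceneNode("Stage");

    Ogre::SceneNode* truss = stage->createChildSceneNode("Truss");
    truss->attachObject(sceneMgr->createEntity("TrussEntity", "truss.mesh"));

    Ogre::SceneNode* lamp = truss->createChildSceneNode("Lamp01");
    lamp->attachObject(sceneMgr->createEntity("LampEntity01", "lamp.mesh"));
    lamp->setPosition(2.0f, 0.0f, 0.0f);          // offset relative to the truss

    // Transform the parent once; the truss and the lamp follow automatically.
    stage->translate(0.0f, 1.0f, 0.0f);
    stage->yaw(Ogre::Degree(15.0f));
}
```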
2. 3D coordinate transformation
To display the 3D rendering result on a 2D screen, 3D coordinates must be converted to plane coordinates. First a three-dimensional coordinate system is established: an oblique dimetric coordinate system in which the x-axis points horizontally to the left, the z-axis points vertically upward, and the y-axis makes a 45° angle with the horizontal direction. When graphics are displayed in this coordinate system, lengths along the x-axis and z-axis keep their actual values, while lengths along the y-axis are halved; the corresponding axial deformation coefficients of the x, y and z axes are ηx, ηy and ηz. Applying the axonometric projection transformation yields an equation whose coefficients d and f belong to the axonometric transformation matrix; solving it gives their values, and setting d = f = -0.354 strengthens the three-dimensional impression and fixes the axonometric projection transformation matrix. Next, the three-dimensional coordinates of the graphics must be converted into device coordinates on the screen. In the viewport, the coordinate origin is at the upper-left corner of the screen, with x positive to the right and y positive downward.
Suppose a point (x, y, z) in three-dimensional space has device coordinates (xx, yy); applying the axonometric projection transformation matrix above gives the conversion formula, in which xX and yY are the device coordinates of the origin of the three-dimensional coordinate system. Substituting formula (4) into formula (5) yields the overall transformation equation; a hedged reconstruction of this mapping is sketched below.
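Because formulas (1)–(6) do not survive in this copy, the following is only a minimal reconstruction of the mapping they describe, assuming the standard oblique dimetric convention stated above (ηx = ηz = 1, ηy = 1/2, y-axis at 45°); the exact signs and offsets depend on the screen orientation chosen in the paper.

```latex
% Hedged reconstruction of the oblique dimetric (axonometric) projection
% described in the text; signs and offsets are assumptions, not the paper's formulas.
\[
\begin{pmatrix} x_p & y_p \end{pmatrix}
  = \begin{pmatrix} x & y & z \end{pmatrix}
    \begin{pmatrix}
      \eta_x & 0      \\
      d      & f      \\
      0      & \eta_z
    \end{pmatrix},
\qquad
d = f = -\,\eta_y \cos 45^{\circ} \approx -0.354,
\quad \eta_x = \eta_z = 1,\ \eta_y = \tfrac12 .
\]
\[
xx = x_p + xX, \qquad yy = -\,y_p + yY
\]
% The second line maps the projected point into device coordinates whose origin
% offset (xX, yY) is the screen position of the 3D origin and whose y-axis points down.
```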
3. Particle system effect simulation
Particles are represented as quadrilaterals (billboards) with attributes such as width and height, direction, color, lifespan, quantity, material, weight and speed. These properties are determined jointly by the particle emitter and the particle affector. The emitter is responsible for launching particles and assigns their initial properties, including velocity, color and lifespan; the affector governs how particle properties change from the moment of emission until the particle dies, and can be used to simulate effects such as gravity, pulling forces and color decay. When emitters continuously spew out large numbers of particles, effects such as smoke, fire and explosions can be created.
OGRE provides a particle-system scripting language in which the various particle properties can be set. In this paper, the stage fireworks, rain and cloud effects are described through particle scripts; combined with OGRE's animation facilities, fairly realistic scene particle effects can be achieved, as the usage sketch below suggests.
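A hedged sketch of how such a scripted effect is used from application code: the template name "Stage/Fireworks" is hypothetical and stands for a particle_system definition (material, quota, emitter, affectors) assumed to exist in a .particle script.

```cpp
#include <Ogre.h>

// Sketch: instantiating a scripted particle effect and tweaking it at run time.
// "Stage/Fireworks" is a hypothetical particle_system template defined in a
// .particle script (emitter + affectors); adjust the name to the real script.
void addFireworks(Ogre::SceneManager* sceneMgr) {
    Ogre::ParticleSystem* fireworks =
        sceneMgr->createParticleSystem("FireworksPS", "Stage/Fireworks");

    Ogre::SceneNode* fxNode =
        sceneMgr->getRootSceneNode()->createChildSceneNode("FireworksNode");
    fxNode->setPosition(0.0f, 8.0f, -5.0f);   // above the stage (assumed units)
    fxNode->attachObject(fireworks);

    // Emitter properties set in the script can still be overridden in code.
    if (fireworks->getNumEmitters() > 0) {
        fireworks->getEmitter(0)->setEmissionRate(150.0f);
        fireworks->getEmitter(0)->setTimeToLive(3.0f);
    }
}
```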
4. Illumination effect simulation
Illumination is the key factor in the stage effect and the core technology of this design system. The rendering engine provides several commonly used light types, such as point lights, directional lights and spotlights; a basic setup using them is sketched below, but these alone are not enough for realistic stage lighting simulation.
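A minimal sketch of creating these built-in light types with the OGRE 1.x-style API (where lights carry their own position and direction); all names, colours, positions and angles are illustrative values only:

```cpp
#include <Ogre.h>

// Sketch: the three built-in OGRE light types mentioned above.
// Names and numeric values are made up for illustration.
void createBasicStageLights(Ogre::SceneManager* sceneMgr) {
    // A point light, e.g. a bare bulb over the stage.
    Ogre::Light* point = sceneMgr->createLight("HouseBulb");
    point->setType(Ogre::Light::LT_POINT);
    point->setDiffuseColour(1.0f, 0.95f, 0.8f);
    point->setPosition(0.0f, 10.0f, 0.0f);

    // A directional light approximating a distant, parallel wash.
    Ogre::Light* wash = sceneMgr->createLight("FrontWash");
    wash->setType(Ogre::Light::LT_DIRECTIONAL);
    wash->setDiffuseColour(0.4f, 0.4f, 0.5f);
    wash->setDirection(0.0f, -1.0f, -1.0f);

    // A spotlight, e.g. a simple follow-spot aimed at centre stage.
    Ogre::Light* spot = sceneMgr->createLight("FollowSpot");
    spot->setType(Ogre::Light::LT_SPOTLIGHT);
    spot->setDiffuseColour(1.0f, 1.0f, 1.0f);
    spot->setPosition(0.0f, 12.0f, 15.0f);
    spot->setDirection(0.0f, -0.6f, -1.0f);
    spot->setSpotlightRange(Ogre::Degree(15.0f), Ogre::Degree(25.0f));
}
```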
Some special stage lighting effects, such as volumetric light, have to be realized through the programmable rendering pipeline (shaders). There are two kinds of shaders. The first is vertex-level, the vertex shader (called a vertex program in OpenGL), which replaces the transformation and lighting stages of the fixed-function pipeline so that programmers can control vertex transformation, lighting and so on themselves; the hardware units that execute vertex shaders are called vertex processing units.
The second is pixel-level, the pixel shader (called a fragment program in OpenGL), which replaces the per-fragment shading stage after rasterization in the fixed pipeline so that programmers can control pixel color and texture sampling themselves; the hardware units that execute pixel shaders are called pixel processing units. From the application side, such shader effects are typically driven by updating their parameters, as sketched below.
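A hedged sketch of feeding parameters to such a shader through an OGRE material; the material name "Stage/VolumeLight" and the uniform names "beamColour" and "beamIntensity" are hypothetical and would have to match the actual material script and its fragment program.

```cpp
#include <Ogre.h>

// Sketch: updating uniforms of a pixel (fragment) shader attached to a material.
// Material and parameter names are hypothetical.
void updateVolumeLight(float intensity) {
    Ogre::MaterialPtr mat =
        Ogre::MaterialManager::getSingleton().getByName("Stage/VolumeLight");
    if (mat.isNull())
        return;

    Ogre::GpuProgramParametersSharedPtr params =
        mat->getTechnique(0)->getPass(0)->getFragmentProgramParameters();

    params->setNamedConstant("beamColour", Ogre::Vector4(1.0f, 0.9f, 0.7f, 1.0f));
    params->setNamedConstant("beamIntensity", intensity);
}
```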
To make the lighting simulation more realistic, 3D lighting equations must also be used for the simulation and calculation. Such equations are usually approximations, but they achieve a good simulation effect at very high running speed. There are two common classes of lighting model, the direct illumination model and the global illumination model; this system uses the global illumination model.
The global illumination model simulates realism very well: it accounts simultaneously for reflection, refraction, transmission, shadows and the mutual interaction of light between object surfaces. Using it requires simulating the actual propagation of light and the radiance exchanged between surfaces.
For ray tracing, the direct illumination from the light source and the illumination contributed by light reflected onto the point must both be considered and combined. To compute the radiance, the illumination on each surface is evaluated as L = Ld + TLi, where Ld is the light arriving directly from the light source, T is the light transport factor, TLi is the light reflected from other surfaces, and L is the final required light value.
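Read as an operator equation, and assuming the reflected term Li itself obeys the same relation (a standard reading of global illumination, not spelled out in the paper), the combined illumination expands as follows:

```latex
% Hedged expansion of L = L_d + T L_i, assuming L_i satisfies the same relation,
% i.e. indirect light is itself directly lit plus further transported light.
\[
L \;=\; L_d + T L_i
  \;=\; L_d + T L_d + T^{2} L_d + \cdots
  \;=\; \sum_{n=0}^{\infty} T^{n} L_d ,
\]
% so direct lighting, one-bounce reflection, two-bounce reflection, and so on are
% summed; truncating the series after a few terms gives the usual approximation.
```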
3. System interface and summary
System interface: The system supports stage switching, the various scene effects within the stage, and real-time interaction with the stage scene and the different lights. Figure 4 shows the stage fireworks effect, and Figure 5 shows the volumetric light effect.
Summary: Stage lighting design has become a major challenge for lighting designers, who routinely face problems of high cost, high energy consumption and long design times. With the vigorous development of the information industry, professional stage lighting has also entered a fully digital era.
This system uses the OGRE engine to build a virtual stage and presents stage lighting design and adjustment in real time, while providing stage lighting designers with rich interaction functions, which addresses these problems well. Future work will further enrich the stage lighting models so that more fixtures can be simulated, such as soft lights, flash (strobe) lights and follow spots, and the user interface will be further optimized to improve user-friendliness.