Focus General System Description

Introduction

Virtual studios of the Focus family are an instrument for creating television programs. This is a state-of-the-art and increasingly popular television production technology that combines virtual set dressing (3D imagery produced on a computer) with real video, live actors, computer characters and so on. Traditional virtual studios are already well described in the literature, and products created with these technologies appear on TV and cinema screens more and more often. The main obstacles to the wide adoption of the technology in all spheres of television production are its high price and the complexity (or rather unfamiliarity) of the production process. The Focus family was designed to overcome these problems while preserving the quality and functionality characteristic of systems several times more expensive. Focus can be used not only as a virtual studio, but also as a live broadcast design system based on real-time 3D graphics: animated computer characters, virtual plasma display panels, business graphics, complex titles and more. Focus can also play the role of a computer-based multichannel mixer.

Purpose and Functional Capabilities

The Focus system is intended for television studios, Internet studios and television company subdivisions. As a virtual studio, it helps organize the production of TV programs (news, weather, entertainment) and other video products (video clips, etc.) using just one prepared room and simplified requirements for lighting and television equipment. The visual quality of the final product meets the current requirements of leading international television companies. An example of a product created with the help of the Focus system is shown in Fig. 1.

Fig. 1. Example of utilization of the Focus virtual studio

The main characteristics of the Focus system are:

  • multiformat video input with time-base correction, which makes it possible to use asynchronous video sources;
  • video output synchronization (GenLock) support;
  • unique key compositing (chroma keying, i.e. color-based keying); thin and semi-transparent objects such as hair, smoke and glass are processed with high quality;
  • the number of input channels with independent keying and time correction can easily be increased;
  • integrated audio delay for synchronization of the output video and sound;
  • real-time rendering of three-dimensional (3D) scenes, with input video usable as dynamic textures on any scene objects;
  • an unlimited number of virtual animated cameras;
  • support for static or robotic source cameras with real-time switching among them;
  • ample opportunities for creating animated computer characters, interior elements and special effects;
  • control of any objects and parameters, either automatically according to a scenario or interactively;
  • integration with video servers, title creation systems and other peripheral equipment, so that all equipment works according to a single scenario.

The technical solutions make it possible to use various video sources, for example cameras of different types, and still obtain a high-quality result. This is achieved through digital decoders and time correction in each input channel, and through specially adapted keying, filtering and mixing algorithms. Due to its modularity and extendability, the system can be extended to work with non-composite (component), digital SDI or HD SDI signals.
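
The keying algorithms themselves are proprietary and not described in this document. Purely to illustrate the general idea of color keying mentioned above, here is a minimal sketch, assuming uncompressed YUV frames held as NumPy arrays: an alpha matte is computed from the chroma distance to the backdrop color, with a soft transition band that lets thin and semi-transparent detail survive, and the keyed foreground is then mixed over the synthesized background. All names and thresholds are illustrative.

    # Minimal color-keying sketch (illustration only; the Focus keyer is far more
    # sophisticated, e.g. in how it preserves hair, smoke and glass).
    import numpy as np

    def chroma_key_alpha(frame_yuv, key_u, key_v, tolerance=20.0, softness=20.0):
        """Compute a per-pixel alpha matte from the chroma distance to the key color.

        frame_yuv: float array of shape (h, w, 3) holding Y, U, V components.
        tolerance: chroma distance still treated as fully transparent backdrop.
        softness:  width of the transition band that gives soft, semi-transparent edges.
        """
        du = frame_yuv[..., 1] - key_u
        dv = frame_yuv[..., 2] - key_v
        dist = np.sqrt(du * du + dv * dv)
        # 0 inside the key color, 1 far away from it, a linear ramp in between.
        return np.clip((dist - tolerance) / softness, 0.0, 1.0)

    def composite(foreground, background, alpha):
        """Mix the keyed foreground over the synthesized background."""
        return foreground * alpha[..., None] + background * (1.0 - alpha[..., None])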

A characteristic feature of Focus virtual studios is that the actor's image obtained after keying is used as a texture (material) in the synthesized three-dimensional set, which makes it possible to place the actor's image anywhere in the three-dimensional scene and to use virtual cameras that are not tied to the real ones. As a result, camera movement and zooming are emulated by the computer instead of a complicated and expensive system of tracking sensors on a real camera. Arbitrarily complex movement trajectories can be prepared in advance, including ones that are impossible in a real studio or would require very expensive crane equipment. The range of possible movements of the virtual camera is, of course, limited when real static cameras are used, but it is sufficient for a very wide range of applications. For example, placing the virtual camera very close to the actor will look like a digital zoom, and without special means (a "turn table") Focus cannot show the actor in profile, i.e. imitate walking around the actor. On the other hand, the system can be used in the usual overlay mode, when the full, unscaled video image of the actor is simply mixed with the synthesized computer scenery (texel-to-pixel mapping is used), but in this case it is not possible to imitate zooming the camera in and out. Note that the cost of tracking systems (sensors of camera position and parameters) alone is several times higher than the cost of the whole studio.
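
To make the digital-zoom analogy concrete, here is a minimal sketch, assuming the keyed actor frame is used as a plain 2D texture; "moving the virtual camera closer" to a static real camera then amounts to cropping a window of the frame and scaling it back up, so no new detail appears. OpenCV is used only for convenience, and none of the names reflect the actual Focus renderer.

    # Illustrative digital-zoom emulation on a keyed actor frame (not the Focus renderer).
    import cv2

    def virtual_zoom(actor_frame, zoom, center=None):
        """Emulate a virtual-camera zoom (zoom >= 1.0) on a static real camera's frame."""
        h, w = actor_frame.shape[:2]
        cx, cy = center if center is not None else (w // 2, h // 2)
        crop_w, crop_h = int(w / zoom), int(h / zoom)
        # Clamp the crop window to the frame borders.
        x0 = min(max(cx - crop_w // 2, 0), w - crop_w)
        y0 = min(max(cy - crop_h // 2, 0), h - crop_h)
        window = actor_frame[y0:y0 + crop_h, x0:x0 + crop_w]
        # Scaling the window back to full size is exactly a digital zoom.
        return cv2.resize(window, (w, h), interpolation=cv2.INTER_LINEAR)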

For some classes of applications (for example, weather forecasts) with several static camera positions, the overlay mode is quite suitable, especially since it shows the actor's video image with the highest quality: it avoids the additional processing and filtering that are inevitable when video is used as a texture in a synthesized three-dimensional scene. On the other hand, in addition to virtual camera animation, using video as a texture offers great opportunities for various special effects such as morphing, duplication of objects, mirror images, curved surfaces and many others. In particular, it is possible to create virtual television panels with fantastic animation, to distort the actor's image at will, etc. The source of such a video texture can be not only a "live" source (a video camera or video tape recorder) but also a video file, reproduced either directly from the hard disk of the virtual studio computer or from a separate file server, provided that the network connection ensures failure-free transfer of the required data volume. Working with video files over a network can even be preferable, as it removes the additional load that the disk subsystem would otherwise place on the system. There are, of course, limits on the throughput available for simultaneous playback of several such dynamic objects. Current configurations allow simultaneous use of three or four full-size video textures, depending on the overall complexity of the synthesized scene; if the resolution, and consequently the data flow, is reduced, the number of different video textures used simultaneously in the scene can be correspondingly increased.
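
A back-of-the-envelope estimate makes the throughput argument concrete. Assuming, only for illustration, uncompressed 8-bit YUV 4:2:2 frames for each video texture:

    # Rough data-rate estimate for uncompressed 8-bit YUV 4:2:2 video textures
    # (an assumption made purely to illustrate the throughput limits above).
    def yuv422_rate_mb_s(width, height, fps):
        bytes_per_frame = width * height * 2   # 4:2:2 averages 2 bytes per pixel
        return bytes_per_frame * fps / 1e6

    one_pal_texture = yuv422_rate_mb_s(720, 576, 25)        # ~20.7 MB/s per texture
    print(f"One PAL texture:   {one_pal_texture:.1f} MB/s")
    print(f"Four PAL textures: {4 * one_pal_texture:.1f} MB/s")  # ~83 MB/s to feed the scene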

Technology Characteristics

The technological process of television production with the use of the Focus virtual studio can be conditionally divided into three stages:

  • Creation of 3D scenery
  • Creation of a scenario for program recording management
  • Recording of a television program

Virtual scenery is created using the Autodesk® 3ds Max® software. At the request of the program director, the designer creates a three-dimensional scene taking into account the requirements and characteristics of the television and real-time computer systems.

The program recording management scenario is created in the HotActions or HotActions Designer software on the basis of the picture editor's technical specification. The scenario takes into account the characteristics of working with cameras, external sources, media data, etc.; the management interface is also optimized for the commissioning editor.

The program is recorded or broadcast using the HotActions or HotActions Live software. Any action or change of parameters during recording can be made either automatically according to the scenario or interactively, under the control of the operator. Moreover, several operator workstations can be used for interactive management of different aspects of the shooting process (for example, of virtual actors with the help of several joysticks). The system is built on the principles of modularity and extendability; it can be used independently, with a set of options appropriate to the production process, or integrated with the user's existing video equipment. Let us discuss the technological stages in more detail.

Preparation

Determining Equipment Composition and Configuration

Determine the composition of the television equipment used in the shooting process: the number of cameras, their type, parameters and location (several different video sources can be used in the shot simultaneously), video equipment (video tape recorders, mixing consoles, etc.), audio equipment and so on. Based on this, a keying process flowchart is drawn up, the joint use of the equipment in the studio is planned, and the number of personnel involved and the requirements for them are determined.

Creation of a Three-Dimensional Scene (Scenery)

The main instrument for creating virtual scenery is the popular Autodesk® 3ds Max® package, with the help of which the three-dimensional environment is created, movable objects are animated, video textures and virtual cameras are placed, etc. Naturally, real-time requirements impose certain limitations on the complexity of the created scenes and require appropriate knowledge from the designers (limits on the number of polygons in the scene, texture sizes, the material parameters used, the allowed light sources, etc.). This is described in more detail in the user's guide for the creation of 3D scenes. Even the explosive growth in the performance of computers and graphics accelerators, recent and anticipated, will not remove such limitations. On the other hand, users with little knowledge of 3D graphics can use a set of standard projects supplied with the system or purchased from external designers who work with the virtual studio. Users of standard projects can make the required adaptations (for example, change colors, messages on objects, textures) with the help of project parameters.

Editing of Video Materials

Media materials (jingles, bumpers, picture shots, plugs, etc.) are edited in advance with the help of any available non-linear editing system. For real-time playback, the integrated scripting or interactive means can be used. In the case of integration with external playback equipment (using an additional video input channel), editing and playback are done with other available means.

Preparation and Configuration of the 3D Scenery Management Interface

To manage the recording of a television program (switching virtual cameras, launching animation, changing the virtual scenery, etc.), button panels and interactive controllers are created. During studio operation, these panels are used by the operator for control in real time: the operator initiates the required actions by pushing buttons with a mouse or touch screen, or by using the assigned hot keys on the keyboard. Chapter 5 below describes the means that make the operator's real-time interaction with the system as simple as possible.

Television Program Shooting

On-Air Shooting

Operation in this mode is performed in real time, which is why it requires thorough training of the personnel and careful preparation of the equipment and of the Focus virtual studio. To ensure reliability, scenario automation should be used to the greatest possible extent, and the interface for the real-time operators should be kept as simple as possible. In the interactive mode, the operator uses a virtual studio management interface configured in advance, which is a set of button panels and controllers that execute different control commands or actions in the three-dimensional scene. Actions in the virtual studio can also be performed by pressing hot keys or via GPI ports (control signals arriving via COM or USB ports) from external control units, for example mixing consoles. In this case, production of standard products, for example a daily weather forecast, does not require deep knowledge of the system from the on-air operators; a brief description of what each button or controller of the interface does in the scene is sufficient.
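
As an illustration only (the actual Focus GPI protocol, wiring and port assignments are not described in this document), a GPI-style trigger can be pictured as watching a control line of a COM port and running the assigned action on a rising edge. The port name and callback below are hypothetical; pyserial is used as a stand-in.

    # Hypothetical GPI-style trigger: poll the CTS line of a COM port (pyserial)
    # and fire a callback on each rising edge.
    import time
    import serial  # pyserial

    def watch_gpi(port_name, on_trigger, poll_interval=0.01):
        port = serial.Serial(port_name)
        last_state = port.cts
        while True:
            state = port.cts
            if state and not last_state:   # rising edge = external device closed the contact
                on_trigger()
            last_state = state
            time.sleep(poll_interval)

    # Example: a pulse from a mixing console on COM3 could launch a pre-configured action.
    # watch_gpi("COM3", lambda: print("GPI pulse received - run the assigned action"))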

Shooting in the Recording Mode

This mode makes it possible to shoot interactively, select the best takes and achieve the best results. Shooting can be done in short fragments, which makes working with the script easier and leaves room for improvisation.

System Components

Base Configuration

The base set of the system is a specially selected and configured IBM PC class computer with the following pre-installed:

  • Hardware modules of audio and video input-output with time correction;
  • Graphic 3D-accelerator;
  • Special software.

The base configuration of Focus is one PC (system unit) with additional modules installed. The base configuration comes in an analog or digital version for Standard or High Definition television signals and provides two corresponding independent video input channels. Video images from the connected video sources undergo time correction and keying and can be switched in Focus in real time. Both video sources from the input channels can be used in a 3D scene. Video files are supported as additional video textures, provided the scene is optimized accordingly for performance. The configuration includes a sound delay and a video output without genlock synchronization. External devices for interactive management can be connected via a local network, joysticks or GPI ports.

Optional Extensions

The set of optional extensions includes:

  • Additional input channels of different formats;
  • A passive switching panel;
  • An active switching panel;
  • 3D keying units;
  • A device for signal transfer via GPI;
  • Means of integration with a video server and the Forward TA / Forward TT title creation system.

Fig. 2 and Fig. 3 show diagrams of typical configurations of the Focus system.

Fig. 2. Base configuration
Fig. 3. Extended configuration

User Interface Ideology

The main instrument for managing the program recording script is the HotActions (HotActions Live) program, the principal user application of the Focus studio. HotActions can be used in three modes:

1. Editing – creation of a scene management interface (there is also a HotActions Designer version for this purpose).

2. Testing – script development, management interface checking.

3. On air – the mode for program recording or broadcast.

The first two modes are intended for the designers and directors who develop the virtual scenery, animation, script, etc.; the third mode is intended for commissioning editors or on-air operators.

Creation of a Scenery Management Interface (Editing)

This mode is used in HotActions to create and configure the virtual scene management interface that will later be used by the operator during shooting (in the on-air mode). All preparatory work is performed here. The following items describe the interface creation process.

Creation of Actions (Scene Behavior Programming)

An Action is a logically complete operation consisting of one or several combined commands, whose result is an immediate modification of the virtual scene. These commands can be described in a special scripting language or chosen from the library of standard actions.

In particular, Actions regulate:


  • Virtual camera switching inside the scene (and changing their parameters);
  • Launching/stopping animation of various scene objects;
  • Quick loading of individual textures and fonts as well as whole scenes;
  • Playing/stopping the sound;
  • Changes in some properties of scene elements (for example, object visibility, scale, position, color, etc.).

Some Actions can sequentially (or simultaneously) call other Actions, i.e. form scripted sequences. Initializing Actions are especially significant: they are performed automatically during the transition to the on-air mode and are created to bring the three-dimensional scene to some initial (base) condition and to load and initialize the components required during on-air operation.

As a rule, Actions are created by the scene designer/animator, who knows the scene structure better than anyone else, and/or by the program director. To simplify this task, the application includes a library of built-in Actions that perform standard operations with scene objects: creation and playback of video streams, virtual camera switching, object modification, texture replacement, etc.
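
The HotActions script language itself is not reproduced in this document. The hypothetical Python model below only fixes the terminology used above: an Action is a named, logically complete sequence of commands, and one Action may in turn call others.

    # Hypothetical model of an Action; the real HotActions command set is not shown here.
    from dataclasses import dataclass, field
    from typing import Callable, List

    @dataclass
    class Action:
        """A named, logically complete sequence of scene commands."""
        name: str
        commands: List[Callable[[], None]] = field(default_factory=list)

        def run(self):
            for command in self.commands:   # commands run in order; a command may
                command()                   # itself be another Action's run method

    # Example: an initializing Action bringing the scene to its base condition.
    init_scene = Action("Init", commands=[
        lambda: print("load base scene"),
        lambda: print("switch to virtual camera 1"),
        lambda: print("reset object animations"),
    ])
    init_scene.run()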

Controller Configuration

The HotActions application supports operation with an unlimited number of simultaneously connected controllers, both software and hardware ones, in particular joysticks or 3D keying units. Each of them can be configured to perform certain actions (for example, to execute Actions when buttons are pushed or released, or to rotate/move various scene objects in accordance with the position of a slider or handle, etc.). As a rule, information about controller configuration is stored in the text of the initializing Actions (see above).
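
As a rough illustration (pygame stands in here for whatever controllers are actually attached, and the bindings are hypothetical), configuring a controller amounts to mapping its buttons and axes to Actions or to scene object parameters:

    # Hypothetical controller binding: joystick axis 0 drives a scene object's rotation.
    import pygame

    pygame.init()
    pygame.joystick.init()
    stick = pygame.joystick.Joystick(0)
    stick.init()

    def poll_controller(set_object_rotation):
        pygame.event.pump()                  # refresh joystick state
        value = stick.get_axis(0)            # -1.0 .. 1.0
        set_object_rotation(value * 180.0)   # e.g. turn a virtual panel by +/-180 degrees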

Creation and Configuration of Hotsets

Since every Action is a performed operation (or a sequence of operations), each Action can be assigned to a button, and pushing the button will perform it. A panel consisting of such functional buttons is called a Hotbar. Images and names on the buttons, as well as the form and name of the panel (Hotbar), are chosen for ergonomic reasons. If desired, each button can also be matched with a hot key on the keyboard.

Hotbars are placed in the workspace of the project document window or of a special document, a Hotset. When any Hotbar is loaded, the program automatically loads the Actions that are initiated when its buttons are pushed. Hotbars and Hotsets do not depend on the content of the Actions, as they contain only references to them. In this way, the scenery management operator can arrange the panels purely for the convenience of his work on air.
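
A tiny hypothetical sketch of this "reference only" relationship, reusing the Action model sketched earlier: the Hotbar stores only hot keys and Action names, and the Actions themselves are resolved from a library when the panel is loaded.

    # Hypothetical Hotbar: buttons reference Actions by name only.
    hotbar = {
        "F1": "Init",                        # hot key -> name of the Action to perform
        "F2": "Camera 2",
        "F3": "Start weather animation",
    }

    def press(key, action_library):
        """Resolve the button's Action by name and run it."""
        action_library[hotbar[key]].run()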

Combining into a Project

For convenience of loading, virtual scene files, the created Actions, their collections and Hotsets are combined into one project. The project document (*.vs) stores references to the files used, as well as the state of each document (opened/closed/hidden) and the location of open document windows and Hotbars relative to the main application window.

On-Air Control Mode

It is on air that direct interactive work with the three-dimensional scene is done: shooting and demonstration.

The main user of this mode is the virtual stage operator. By pushing Hotbar buttons or using hot keys, the operator launches Actions; the result is an interactive modification of the three-dimensional scenery. At the same time, scene objects can be controlled from different controllers: joysticks, the mouse and other input devices.

During the transition to this mode, the HotActions 3.0 application changes its appearance, hiding everything except the necessary buttons. This helps the operator concentrate on the work.

It is also possible to initiate Actions from another computer connected via a network. In this way, in especially complicated and significant cases, there may be several stage control operators.
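
As an illustration only (the actual HotActions network protocol is not documented here), triggering an Action from a second workstation can be pictured as sending its name over the network; the host, port and framing below are hypothetical.

    # Hypothetical network trigger: a second operator workstation sends an Action name.
    import socket

    def send_action(host, port, action_name):
        with socket.create_connection((host, port)) as conn:
            conn.sendall(action_name.encode("utf-8") + b"\n")

    # send_action("192.168.0.10", 5000, "Camera 2")   # hypothetical address and port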

Virtual Studio Technical Specification (with options)

Specifications of particular products depend on the studio configuration used and the hardware modules included in it. Only general and typical characteristics and formats are given here.

Input video data formats

PAL: 720x576, 25 fps, 4:2:2 or 4:2:2:4.

NTSC: 720x480, 29.97 fps, 4:2:2 or 4:2:2:4.

HDTV: 1280x720p, 50/59.94 fps; 1920x1080i, 25/29.97 fps.

Output video data formats

PAL: 720x576, 25 fps, 4:2:2 or 4:2:2:4.

NTSC: 720x480, 29.97 fps, 4:2:2 or 4:2:2:4.

HDTV: 1280x720p, 50/59.94 fps; 1920x1080i, 25/29.97 fps.

Internal video data processing

YUV 4:2:2 or YUV 4:2:2:4, 8-bit per component.

Video inputs

Up to 12 composite (RCA) (1.0 Vp-p, 75 Ohm).

Up to 8 S-Video (4-pin mini-DIN or BNC) (Y: 1.0 Vp-p, 75 Ohm; C: 0.286 or 0.3 Vp-p at burst level, 75 Ohm).

Up to 8 component YUV (BNC) (Y: 1.0 Vp-p, 75 Ohm; U/V: 0.7 Vp-p, 75 Ohm) or RGB (BNC) (R/G/B: 1.0 Vp-p, 75 Ohm).

Up to 8 SDI (BNC) (SMPTE 259M, 270 Mbps).

Up to 4 HD SDI (BNC) (SMPTE 292M).

Video outputs

Composite (RCA) (1.0 Vp-p, 75 Ohm).

S-Video (4-pin mini-DIN or BNC) (Y: 1.0 Vp-p, 75 Ohm; C: 0.286 or 0.3 Vp-p at burst level, 75 Ohm).

Component YUV (BNC) (Y: 1.0 Vp-p, 75 Ohm; U/V: 0.7 Vp-p, 75 Ohm) or RGB (BNC) (R/G/B: 1.0 Vp-p, 75 Ohm).

SDI (BNC) (SMPTE 259M, 270 Mbps).

HD SDI (BNC) (SMPTE 292M).

Audio

Up to 4 input mono channels (3 stereo channels), 16-bit, up to 48 kHz.

Up to 4 output mono channels (3 stereo channels), 16-bit, up to 48 kHz.

Configurable sound delay for each audio source.

Synchronization of sound to current video signal.

Balanced XLR connectors (optional).

Embedded audio in SDI / HD SDI.

Timing error correction

From 2 to 8 channels of TBC.

Keying

Original table or shade keying with grading, clipping and cropping.

Three-Dimensional Graphics

Real time, with approximately 500,000 visible polygons per frame.

Video Processing Delay

Fixed, from 3 to 5 frames depending on the operating mode (e.g. about 120-200 ms at 25 fps).

Synchronization

Optional genlock synchronization module.

Notes on the Current Implementation

The complexity of the 3D scenes used in the system is limited mainly by the capacity of the hardware and software platform used (based on the Windows operating system), which is regularly improved. The producer reserves the right to change the internal configuration of the system at any moment and offers free and paid updates to all system users. Changing the internal configuration yourself is not recommended and can lead to cancellation of warranty service and maintenance.