
ON THIS PAGE:

Visible Degradation
Format Difference
Down The Generations
Loss of Chroma
Measuring Playback

DV, DVCAM, DVCPRO

What is the difference between the various DV formats? Which is best, and how can we tell?
Now, this all started with an email from Vedran Klepac of Croatian Radiotelevision. David sent back the answers below. Then Colin Green of JVC pitched in with a few more words of wisdom.

Vedran Klepac, Croatian Radiotelevision, wrote: "I found your articles very interesting and very useful, and I have some additional questions. Here is a thought: DVCPRO decks will play DV tapes with a resolution of 4:1:0, and the question is - is this degradation visible on a TV screen, and how could it be measured (with a waveform monitor, selecting line by line)?"

[DAVID FOX] Sorry, I am not an expert on test and measurement equipment, but in real world use, you will have trouble noticing any degradation on a first generation tape on even a Barco or Sony broadcast monitor - the pictures will certainly be far better than anything a consumer TV could display. Also, if you use a DV tape in a DVCPRO player, it should actually be resampled as 4:1:1 with little noticeable loss.

[COLIN GREEN] First, the statement "DVCPRO decks will play DV tapes with a resolution of 4:1:0" indicates that Mr. Klepac understands something about 4:1:0 hybridisation... But please note: if we are talking PAL, then the DV material is 4:2:0, and although replay of a PAL DV tape on a DVCPro deck is still at 4:2:0 resolution, if you record this onto another DVCPro deck then re-sampling occurs, which produces a 4:1:0 weakling hybrid.

4:1:0 hybrids have half the colour resolution, both vertically and horizontally, of the same image in 4:2:2. The problem is so bad that the EBU recommended that if 4:2:0 and 4:1:1 sources have to be 'processed' (edited) in their de-compressed form (say in a linear suite without SDTI), then the material should be processed using a 4:2:2 sampling system, which does not 'hybridise' with either 4:1:1 or 4:2:0 sources.
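As a rough check on that claim, here is a minimal Python sketch that works out the size of each colour-difference plane for a 720 x 576 PAL frame under the schemes discussed. The subsampling factors are just the usual reading of the notation, not anything taken from a format specification.

```python
# (horizontal, vertical) chroma subsampling factors relative to luminance
SCHEMES = {
    "4:2:2": (2, 1),   # half horizontal chroma resolution, full vertical
    "4:2:0": (2, 2),   # half horizontal, half vertical (PAL DV / DVCAM)
    "4:1:1": (4, 1),   # quarter horizontal, full vertical (DVCPRO, NTSC DV)
    "4:1:0": (4, 2),   # the hybrid: quarter horizontal AND half vertical
}

def chroma_plane(width, height, scheme):
    """Size of each colour-difference plane for a given scheme."""
    h_factor, v_factor = SCHEMES[scheme]
    return width // h_factor, height // v_factor

for name in SCHEMES:
    w, h = chroma_plane(720, 576, name)
    print(f"{name}: {w} x {h} chroma samples per colour-difference plane")
```

The 4:1:0 result (180 x 288) is exactly half of the 4:2:2 result (360 x 576) in both directions, which is Colin's point.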

This is good advice… if you use the right type of 4:2:2 sampling system. For instance: if you use a low-compression, intra-frame based system such as AVR77 MJPEG or DV50 (D9 or DVCPro50) or Digital Betacam (about 2:1 compressed, intra-frame 4:2:2), then there is no problem. In fact, the end result after 'x' number of generational dubs can look cleaner than even staying with the original acquisition format for the same amount of processing. The exception to this statement is where loss-less transfer of the compressed video takes place during dubs using DV-native FireWire systems or SDTI (the official recognition of Sony and Panasonic's QSDI and CSDI). Unfortunately, most editing involves some form of picture manipulation, such as a dissolve or wipe, which requires de-compressed video, so causing picture degradation courtesy of the decode/encode process, particularly where the compression system uses high compression rates, 4:2:0 or 4:1:1 sampling, and (worst of all) analogue signals for re-encoding. Hence the above 'exception' only applies where cuts-only editing and loss-less transfer are used between editing devices.
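The logic of that exception can be reduced to a few lines of bookkeeping. This is only a toy sketch, not a model of any real edit system: a dub costs one decode/encode cycle unless it is a cuts-only transfer of the native compressed stream.

```python
def reencode_cycles(dubs):
    """dubs: one entry per dub; True = lossless native transfer (FireWire/SDTI,
    cuts only), False = needs decompressed video (dissolve, wipe, analogue pass)."""
    return sum(0 if lossless else 1 for lossless in dubs)

cuts_only_chain = [True, True, True, True]   # four generations over SDTI
mixed_chain = [True, False, False, True]     # two dissolves in the middle

print(reencode_cycles(cuts_only_chain))  # 0 - no coding loss added by the dubs
print(reencode_cycles(mixed_chain))      # 2 - two decode/encode hits
```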

HOWEVER: if you choose something like Betacam-SX (which is 4:2:2) then you can compound the problem by adding in temporal compression artefacts (Betacam SX is inter-frame, IB-sequenced MPEG422P). This wasn't understood fully enough at the time of the EBU's recommendation.

Where material from both 4:1:1 and 4:2:0 sub-sampled systems has crossed over to either system, then unless special measures are taken to preserve the original sampling sequence, 4:1:0 hybridisation will occur. You would think that, armed with this knowledge, most broadcasters would actively avoid mixing the two, but very often this is out of their control. For instance, most modern digital contribution systems are based around MPEG-2 MP@ML… and guess what, it uses 4:2:0. Unless you have one of those specialist MPEG wrapping encoders designed by Panasonic and its partners in the distribution industry (designed to ensure the data comes out as 4:1:1 packets at the other end), then any time DVCPro is passed through the link, hybridisation will occur (and it only needs to happen once). The overall consequences for the late 20th and early 21st century world archive are not good… don't forget that the important news material we gather, distribute and store today has to withstand up-conversion to other formats in the future… and you can imagine how a high-def or even advanced-def system will treat 4:1:0 standard def.
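A worst-case way to think about it: once material has crossed systems, each axis keeps only the coarsest chroma sampling it has seen anywhere in the chain. The sketch below is just that bookkeeping, an assumption for illustration rather than a model of how any particular encoder behaves.

```python
SCHEMES = {"4:2:2": (2, 1), "4:2:0": (2, 2), "4:1:1": (4, 1), "4:1:0": (4, 2)}

def worst_case_hybrid(chain):
    """Effective scheme after passing through every stage in 'chain',
    keeping the coarsest horizontal and vertical chroma factor seen."""
    h = max(SCHEMES[s][0] for s in chain)
    v = max(SCHEMES[s][1] for s in chain)
    for name, factors in SCHEMES.items():
        if factors == (h, v):
            return name
    return f"{h}x{v} subsampling"

# DVCPro (4:1:1) sent over an MPEG-2 MP@ML contribution link (4:2:0):
print(worst_case_hybrid(["4:1:1", "4:2:0"]))   # -> 4:1:0
# PAL DV (4:2:0) re-recorded onto a DVCPro deck (4:1:1):
print(worst_case_hybrid(["4:2:0", "4:1:1"]))   # -> 4:1:0
```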


"Also, if we shot the same scene (for example, only red test chart) with different types of camcorders, with compression 4:2:2 (Digi Beta), 4:1:1 (DVCPRO), 4:2:0 (DV), and play back tapes could we notice the difference or degradation between these formats? Can we measure this difference with waveform monitor (Tektronix VM 700) selecting line by line?"

[DAVID FOX] You would have to ask Tektronix (who would probably say yes - the VM 700 is a pretty powerful piece of kit and will give you extensive measurements for luminance and chrominance), but the difference between the formats would be almost invisible to even the trained eye. To explain (and you probably know this already, but others might not), it is not only Digital Betacam or Betacam SX that are 4:2:2 [where the colour (or chrominance) is sampled at half the rate of the luminance (Y) - the 2:2 is made up of the two colour difference samples (R-Y and B-Y - also known as Cr and Cb)].

DV-based systems like Digital-S (or D9, as its standard is called) and DVCPRO-50 are also 4:2:2. DVCPRO (at 25 Mbps), and the NTSC versions of DVCAM and DV, are 4:1:1 [which samples colour half as often as 4:2:2].

4:2:0 is used for PAL DV (as well as DVD), but the 2:0 doesn't represent the two colour difference samples; instead, it means that chroma is sampled just as often as in 4:2:2, just not on every line (but on every second line - the other being sampled at 4:0:0). This would possibly give a better colour representation with a progressively scanned image than 4:1:1, but with interlaced images (which is what almost all video is), the result isn't as good for multi-generational performance - although you can hardly tell any difference between first generation tapes of each. [Incidentally, there is also 4:4:4, which is RGB for computer use; 4:2:2:4, which adds a key signal sampled at the same 13.5 MHz as luminance; and 4:4:4:4, with everything sampled at full rate for the very highest quality post-production.]
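One way to picture the 4:1:1 versus 4:2:0 trade-off is to mark which pixel positions carry a colour-difference sample. The sketch below is purely illustrative (it ignores the exact chroma siting the real formats use), but it shows the horizontal-versus-vertical difference described above.

```python
def sample_map(scheme, width=8, lines=4):
    """'C' marks a position with luma + chroma samples, '.' luma only."""
    h, v = {"4:2:2": (2, 1), "4:1:1": (4, 1), "4:2:0": (2, 2)}[scheme]
    rows = []
    for y in range(lines):
        rows.append("".join("C" if (x % h == 0 and y % v == 0) else "."
                            for x in range(width)))
    return "\n".join(rows)

for s in ("4:2:2", "4:1:1", "4:2:0"):
    print(s)
    print(sample_map(s))
    print()
```

4:1:1 keeps chroma on every line but only every fourth pixel; 4:2:0 keeps chroma on every second pixel but only every second line.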

[COLIN GREEN] Point of note here: this chap called Nyquist invented a rule about sampling. Whatever your sampling frequency, the maximum resolution before complete extinction is half of the sampling frequency. Example: all systems (4:2:2, 4:1:1 and 4:2:0) use a 13.5 MHz luminance sampling frequency. This means the maximum frequency retained by the sampling recorder is actually 6.75 MHz. In practice this is limited to 5.75 MHz to avoid the steep roll-off towards the cut-off point.

For colour definition and sampling: 4:2:2 systems (D9, DVCPro50, Digi-Beta etc) and 4:2:0 systems (like PAL DV and DVCAM) use a 6.75 MHz 'colour difference' signal sampling frequency (even though in 4:2:0 this is applied either as a two-line average or once every two lines for each R-Y or B-Y line). This means a theoretical maximum colour frequency of 3.375 MHz, in practice limited to about 2 to 2.5 MHz (because human eyes are not as sensitive to changes in high-definition colour as they are to changes in brightness).

So what about 4:1:1 systems? They use a sampling frequency for 'colour difference' signals of just 3.375 MHz, meaning a theoretical maximum colour frequency of 1.6875 MHz, which in practice is band-limited to about 1.2 MHz. This has a bearing on the statements about Betacam SP later, which manages a 1.5 MHz colour bandwidth even in its cheapo UVW form.
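The arithmetic behind those figures is just the Nyquist rule applied to each sampling frequency. A quick check (the practical limits quoted above are engineering margins, not outputs of this sum):

```python
LUMA_SAMPLING_MHZ = 13.5

def nyquist(sampling_mhz):
    """Theoretical maximum frequency a sampled system can carry."""
    return sampling_mhz / 2

print(nyquist(LUMA_SAMPLING_MHZ))       # 6.75   MHz luminance (limited to ~5.75 MHz in practice)
print(nyquist(LUMA_SAMPLING_MHZ / 2))   # 3.375  MHz chroma for 4:2:2 and 4:2:0 (6.75 MHz sampling)
print(nyquist(LUMA_SAMPLING_MHZ / 4))   # 1.6875 MHz chroma for 4:1:1 (3.375 MHz sampling)
```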


"Is it [any degradation] noticeable in first generation, or shall we copy to make 3rd or 5th generation?"

[DAVID FOX] If viewing first generation material, I'd have difficulty identifying which digital tape format was used, at least if it was shot under good conditions, even on a Barco monitor. With material shot in poor light, or with very wide-angle or tight shots, you probably would - but is the difference you notice down to the camera, the quality of the lens, or a better CCD (all of which you'd expect on a Digital Betacam system)? When you see just how good DV pictures can be from one of JVC's professional DV camcorders (DV500 or DV700) or Sony's DSR 500 in DV mode, you realise that for first generation material the choice of tape format matters a lot less than the CCD, choice of lens and the other features in the camcorder.

On a third generation copy (even a digital copy) on tape, you will probably begin to notice a difference, certainly between DigiBeta and DV; by the fifth generation, you will see even DVCPRO beginning to fade compared to DigiBeta. That is why many facilities are still installing Digital Betacam tape edit suites. It is a very robust format, despite its compression - and if it does degrade, the Digital Betacam VTRs have very powerful error correction that will rescue most shots so that you don't see any drop-out. Not many broadcasters would risk even editing tape to tape with DVCPRO, and certainly not with DV.

[COLIN GREEN] Point of order, Mr. Chairman: most systems on the market use the same type of error correction system, called double Reed-Solomon. This works courtesy of a two-dimensional error coding system based around the use of an expanded polynomial equation used to predict a sequence of prime numbers (the main key to the success of the system). For DV25, one dimension of this 2D grid is called 85,77 coding, as it adds 8 bytes of error code to every 77-byte compressed macroblock. It has the ability to correct up to 4 faulty bytes within the 77-byte stream. This is known as inner code. Outer code, on the other hand, is a further 11 bytes of error coding data added to a column of data some 138 bytes tall (called 149,138 coding) and has similar correction powers. These two (inner and outer coding) combine to produce a very powerful error correction system… and most current systems on the market use it in one form or another, including Digital Betacam.
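As a quick check of those figures: for a Reed-Solomon code written as (n, k) - n total bytes, k of them data - the number of unknown byte errors it can correct per codeword is (n - k) / 2, rounded down.

```python
def rs_correctable_bytes(n, k):
    """Correctable byte errors per codeword for an (n, k) Reed-Solomon code."""
    return (n - k) // 2

print(rs_correctable_bytes(85, 77))    # inner code: 8 parity bytes -> up to 4 faulty bytes
print(rs_correctable_bytes(149, 138))  # outer code: 11 parity bytes -> up to 5 faulty bytes
```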

What has this got to do with the above? Well, because standard DV uses it too, the ability of Digital Betacam to look good does not necessarily come down to its error correction system. Rather, the reason is the sheer number of errors the format has to deal with when replaying a recording. Factors here are track pitch (width), linear tape speed, and the tape type used (the off-tape RF levels, etc., that the recorder can produce).

As a consequence, you will note that even the EBU/SMPTE Task Force did not use Digi-Beta as a fair test against DVCPro and Beta-SX; they used standard Beta-SP as a reference (the very system that these two formats were attempting to replace). Even then, Beta-SP was still better than either of them at 1st and 4th generation.

Please also note that Digi-Beta was used as a reference only for the MPEG422P50 and D9 tests they carried out later…and D9 beat both of its rivals on several points.

[DAVID FOX] If you want to edit DV, DVCAM or DVCPRO, get a disk-based system. Even a DV iMac will deliver excellent results and any degradation should be minimal.

A point to note, is that if you ever need to upconvert material from standard definition (SD) to high definition (HD), it is well worth originating on 4:2:2. I visited Fuji TV in Japan, where they have to upconvert news material shot on SX and various DV formats, to HDTV for one of their services. The SX material looks very good, but they have problems with DV (even DVCPRO) material. Given how upconversion technology is progressing, this may not be a problem in the future, but the lesson for now seems to be that if you will want to keep material in SD for conventional transmission, and won't need to process it too much, then DV (or its variants) is good enough. If you will want to output to different formats (including lower quality ones like the Web), it does pay to start with a better quality image - so look at DVCPRO-50, D9 or SX.


"Is it possible to notice loss of chroma in 4:1:1 and 4:2:0 systems?"

[DAVID FOX] 4:1:1 is better than Betacam SP, which contains the same colour information, but about 25% less luminance information. Many post houses have successfully done chroma key work with SP in the past, and should have little trouble getting a reasonable key from 4:1:1 material, although some limited fringing will probably be evident, depending on the type of pictures. It just won't be as good or as easy as with 4:2:2. If you need to use a 4:2:0 tape to create a key, it can be done, it just might take a lot more work, and you might have to clean up the matte in Photoshop or some other application, to make it acceptable. A high-quality digital chroma keyer uses more than just the chrominance signal to create the key - it will typically create a linear key for each of the three signal components then add them together to create the overall key signal, to get the best key it can.
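As a very rough illustration of that last point, here is a minimal, hypothetical sketch of building a linear key from each component and combining them. It is not any real keyer's algorithm, and the backing-colour values and softness figure are made up for the example.

```python
import numpy as np

def linear_key(component, backing_value, softness=40.0):
    """Linear key from one component: 1.0 far from the backing colour, 0.0 on it."""
    return np.clip(np.abs(component - backing_value) / softness, 0.0, 1.0)

def combined_key(y, cb, cr, backing=(110.0, 90.0, 60.0)):
    """Sum (and clip) the per-component keys into an overall key signal."""
    k = (linear_key(y, backing[0]) + linear_key(cb, backing[1])
         + linear_key(cr, backing[2]))
    return np.clip(k, 0.0, 1.0)

# Tiny worked example: two pixels, one on the backing colour, one on the subject.
y  = np.array([110.0, 180.0])
cb = np.array([ 90.0, 128.0])
cr = np.array([ 60.0, 140.0])
print(combined_key(y, cb, cr))   # ~[0.0, 1.0] - backing keys out, subject stays
```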


"Please, do you have any idea how to measure the play back from these three different systems?"

[DAVID FOX] If you want strict technical evaluation, this really is something you should ask Tektronix (or Snell & Wilcox or Hamlet, or one of the other test & measurement manufacturers). However, the ultimate test is what the picture looks like. If you can't tell the difference visually on a broadcast monitor, then the pictures are good enough.

The Ultimatte test, of course, is to put the pictures through a chroma key system - it will give a good indication of how the pictures stand up under processing. If you put the pictures through a compression system (using Media Cleaner Pro, for example), this might also show up differences as the cleaner the original signal, the better the pictures will compress.

Some useful links: Videouniversity - http://videouniversity.com/dvformat.htm - has a section on the difference between 4:2:2, 4:1:1 and 4:2:0 within its excellent explanation of the different DV formats. Hamlet's Adept (launched at IBC 2000) measures serial digital (SDI), FireWire (DV), analogue component, Y/C (S-video) and composite signals, and seems to be a relatively low-cost DV test system. Also try Tektronix or Snell & Wilcox.

© 2000 - 2010



David Fox