Error: undecodable DPX Expected data size is bigger than real file size.
I'm getting the error "Error: undecodable DPX Expected data size is bigger than real file size." when I attempt to process some DPX files.
See below. Please let me know if you need any additional info or samples.
$ rawcooked --version
RAWcooked 18.10.1.20190608
$ rawcooked --check-padding dpx_folder
[...]
Error: undecodable DPX Expected data size is bigger than real file size.
Error: unsupported DPX Offset to image data in bytes.
Info from terminal on Mac:
$ ls -al dpx_00000000.dpx
-rwxrwxrwx@ 1 some one 16373168 Jun 11 17:45 dpx_00000000.dpx
Windows properties: "Size": 16373168, "Size on disk": 16373760
mediainfo --Details=1 dpx_00000000.dpx
0000000000 -------------------------
0000000000 --- DPX, accepted ---
0000000000 -------------------------
0000000000 Generic section header (1664 bytes)
0000000000 File information (768 bytes)
0000000000 Magic number: SDPX
0000000004 Offset to image data: 2480 (0x000009B0)
0000000008 Version number of header format: V1.0
0000000010 Total image file size: 16378880 (0x00F9EC00)
0000000014 Ditto Key: 1 (0x00000001)
0000000018 Generic section header length: 1664 (0x00000680)
000000001C Industry specific header length: 384 (0x00000180)
0000000020 User-defined header length: 432 (0x000001B0)
0000000024 FileName: dpx_00000000.dpx
0000000088 Creation Date: 2019-06-06T13:44:27Z
00000000A0 Creator: US, Some/Info
0000000104 Project: Item ID, local, 1234; Collection ID, local, 5678
0000000294 Encryption key: 4294967295 (0xFFFFFFFF)
0000000298 Reserved for future use: (104 bytes)
0000000300 Image information (640 bytes)
0000000300 Image orientation: 0 (0x0000) - Left to right, Top to bottom
0000000302 Number of image elements: 1 (0x0001)
0000000304 Pixels per line: 2336 (0x00000920)
0000000308 Lines per image element: 1752 (0x000006D8)
000000030C image element (72 bytes)
000000030C Data sign: 0 (0x00000000) - unsigned
0000000310 Reference low data code value: 0 (0x00000000)
0000000314 Reference low quantity represented: 0.000
0000000318 Reference high data code value: 1023 (0x000003FF)
000000031C Reference high quantity represented: 2.047
0000000320 Descriptor: 50 (0x32) - R,G,B
0000000321 Transfer characteristic: 1 (0x01) - Printing density
0000000322 Colorimetric specification: 2 (0x02) -
0000000323 Bit depth: 10 (0x0A) - integer
0000000324 Packing: 1 (0x0001) - Filled A
0000000326 Encoding: 0 (0x0000) - Raw
0000000328 Offset to data: 8192 (0x00002000)
000000032C End-of-line padding: 0 (0x00000000)
0000000330 End-of-image padding: 0 (0x00000000)
0000000334 Description of image element: IMAGE DESCRIPTION DATA
0000000354 Padding: (504 bytes)
000000054C Reserved for future use: (52 bytes)
0000000580 Image source information (256 bytes)
0000000580 X Offset: 0 (0x00000000)
0000000584 Y Offset: 0 (0x00000000)
0000000588 X center: 0.000
000000058C Y center: 0.000
0000000590 X original size: 2336 (0x00000920)
0000000594 Y original size: 1752 (0x000006D8)
0000000598 Source image filename:
00000005FC Source image date/time:
0000000614 Input device name: Some Input Device Info
0000000634 Input device serial number: SNSA-123
0000000654 Border validity (8 bytes)
0000000654 XL border: 65535 (0xFFFF)
0000000656 XR border: 65535 (0xFFFF)
0000000658 YT border: 65535 (0xFFFF)
000000065A YB border: 65535 (0xFFFF)
000000065C Pixel ratio : horizontal: 16777216 (0x01000000)
0000000660 Pixel ratio : vertical: 16777216 (0x01000000)
0000000664 Additional source image information (28 bytes)
0000000664 X scanned size: 0.000
0000000668 Y scanned size: 0.000
000000066C Reserved for future use: (20 bytes)
0000000680 Industry specific header (384 bytes)
0000000680 Motion-picture film information (256 bytes)
0000000680 Film mfg. ID code:
0000000682 Film type:
0000000684 Offset in perfs:
0000000686 Prefix:
000000068C Count:
0000000690 Format - e.g. Academy:
00000006B0 Frame position in sequence: 0 (0x00000000)
00000006B4 Sequence length (frames): 16777216 (0x01000000)
00000006B8 Held count (1 = default): 16777216 (0x01000000)
00000006BC Frame rate of original (frames/s): 18.000
00000006C0 Shutter angle of camera in degrees: 0.000
00000006C4 Frame identification - e.g. keyframe:
00000006E4 Slate information: SLATE INFO
0000000748 Reserved for future use: (56 bytes)
0000000780 Television information (128 bytes)
0000000780 SMPTE time code: 0 (0x00000000)
0000000784 SMPTE user bits: 0 (0x00000000)
0000000788 Interlace: 255 (0xFF) - 2:1 interlace
0000000789 Field number: 255 (0xFF)
000000078A Video signal standard: 255 (0xFF) - Reserved for future high-definition progressive
000000078B Zero: 255 (0xFF)
000000078C Horizontal sampling rate (Hz): 0.000
0000000790 Vertical sampling rate (Hz): 0.000
0000000794 Temporal sampling rate or frame rate (Hz): 18.000
0000000798 Time offset from sync to first pixel (ms): 0.000
000000079C Gamma: 0.000
00000007A0 Black level code value: 0.000
00000007A4 Black gain: 0.000
00000007A8 Breakpoint: 0.000
00000007AC Reference white level code value: 0.000
00000007B0 Integration time (s): 0.000
00000007B4 Reserved for future use: (76 bytes)
0000000800 User defined header (432 bytes)
0000000800 User identification: FADGI process history
0000000820 User defined: (400 bytes)
00000009B0 Image Data (16370688 bytes)
00000009B0 Data: (16370688 bytes)
0000F9D5B0 ------------------------
0000F9D5B0 --- DPX, filling ---
0000F9D5B0 ------------------------
0000F9D5B0 -------------------------
0000F9D5B0 --- DPX, finished ---
0000F9D5B0 -------------------------
Looks like the file is buggy: 5712 bytes of data are missing (the DPX header says 16378880, reality is 16373168).
Checking the theoretical size: 2336 (width) x 1752 (height) x 4 bytes (3-component, 10-bit, Filled A packing = 32 bits per pixel) + 8192 ("Offset to data" in the image element information) = 16378880, so the DPX header seems coherent and the file size is wrong; the last 1428 pixels are missing.
BUT: "Offset to image data" (global to all image elements) is 2480, and 8192 minus 5712 is 2480. Looks like we found the guilty metadata: the image element's "Offset to data" seems wrong, it should be 2480.
So: the file is rejected because there is a risk of a corrupted DPX on input, but we could detect such a pattern and compress anyway.
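For illustration, here is a minimal Python sketch of that check (not RAWcooked code; it assumes a single image element with 10-bit R,G,B and Filled A packing, i.e. one 32-bit word per pixel, as in the file above):

```python
# Sketch: read the relevant DPX header fields and compare them against the
# real file size, to flag the "incoherent offsets" pattern described above.
import os
import struct
import sys

def check_dpx(path):
    real_size = os.path.getsize(path)
    with open(path, "rb") as f:
        header = f.read(0x330)

    magic = header[0:4]
    if magic == b"SDPX":
        e = ">"   # big-endian
    elif magic == b"XPDS":
        e = "<"   # little-endian
    else:
        raise ValueError("not a DPX file")

    offset_to_image_data, = struct.unpack_from(e + "I", header, 0x004)  # generic header
    total_file_size,      = struct.unpack_from(e + "I", header, 0x010)
    width,                = struct.unpack_from(e + "I", header, 0x304)
    height,               = struct.unpack_from(e + "I", header, 0x308)
    offset_to_data,       = struct.unpack_from(e + "I", header, 0x328)  # first image element

    # 10-bit R,G,B with Filled A packing: 3 components per 32-bit word
    image_data_size = width * height * 4

    print(f"real file size:                 {real_size}")
    print(f"header 'Total image file size': {total_file_size}")
    print(f"global offset + data size:      {offset_to_image_data + image_data_size}")
    print(f"element offset + data size:     {offset_to_data + image_data_size}")
    if offset_to_image_data != offset_to_data:
        print("mismatch between 'Offset to image data' and 'Offset to data'")

if __name__ == "__main__":
    check_dpx(sys.argv[1])
```

On the file above this prints 16373168 for "global offset + data size" (matching the real file size) and 16378880 for the other two, which is exactly the mismatch described.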
In the meantime, you should contact the DPX tool vendor and ask them to fix the "Offset to data" and "Total image file size" values.
> In the meantime, you should contact the DPX tool vendor and ask them to fix the "Offset to data" and "Total image file size" values.
Will do.
For the moment, FFmpeg (the tool currently used for encoding, until we have our own encoder) relies on the global "Offset to image data" and ignores the image element's "Offset to data", so encoding is fine with the file from this issue.
But if one day FFmpeg changes its parsing algorithm, or if the user switches to another encoding tool that relies on the image element's "Offset to data" instead of the global "Offset to image data", encoding and reversibility may be broken. The way RAWcooked currently handles this potential issue is to reject such incoherent files (when there is a mismatch between these two pieces of metadata).
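A quick way to see what a given FFmpeg build actually does with such a file (a sketch, assuming ffmpeg is on the PATH): decode the frame and hash it; if the frame decodes with no error output, the decoder cannot have used the image element's "Offset to data", because that offset leaves too few bytes in the file for a full frame.

```python
# Sketch: decode the DPX with the installed ffmpeg and hash the decoded frame.
# No error output suggests the global "Offset to image data" (2480) was used.
import subprocess

result = subprocess.run(
    ["ffmpeg", "-v", "error", "-i", "dpx_00000000.dpx", "-f", "framemd5", "-"],
    capture_output=True,
    text=True,
)
print(result.stdout if not result.stderr else result.stderr)
```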
I see several possibilities for handling this issue:
1/ reject the stream if the --check option is not set, i.e. we accept to encode only if we verify the result, so RAWcooked fails if reversibility is not possible
2/ add an "I know what I am doing" option letting the user encode at their own risk
3/ test the FFmpeg version and reject any version above the one from today, then bump the accepted version every x months after checking that FFmpeg didn't change their algorithm (see the sketch after this list)
4/ do nothing until we have our own encoder (so we control everything), no ETA
5/ something else?
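For option 3, the version gate could look roughly like the sketch below (the cut-off is a hypothetical "verified up to" value, not an actual RAWcooked constant):

```python
# Sketch of an FFmpeg version gate for option 3. MAX_VERIFIED is a
# hypothetical "checked up to" release; git/snapshot builds that don't
# match the pattern are treated as unverified and rejected too.
import re
import subprocess

MAX_VERIFIED = (4, 1)  # hypothetical: last release whose DPX parsing was reviewed

def ffmpeg_release():
    out = subprocess.run(["ffmpeg", "-version"],
                         capture_output=True, text=True).stdout
    match = re.search(r"ffmpeg version n?(\d+)\.(\d+)", out)
    return (int(match.group(1)), int(match.group(2))) if match else None

release = ffmpeg_release()
if release is None or release > MAX_VERIFIED:
    print("FFmpeg version not verified against this DPX quirk; rejecting the file")
else:
    print(f"FFmpeg {release[0]}.{release[1]} verified; proceeding")
```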
@archisysio @retokromer @kieranjol @dericed @bturkus @pjotrek-b @digitensions & others any opinion?
Hi Jérôme, sorry I missed this prompt to respond before.
We now have two files showing this same error, but at the RAWcooked encoding stage rather than the --check stage, so they are not progressing to MKV files.
Would you have a recommended workaround to encode these files, or should we TAR them? Happy to send more data for the DPX files. MediaInfo output attached as a txt file, full Details=1:
Thanks! metadata.txt
> Would you have a recommended workaround to encode these files, or should we TAR them?
Option 1 is quick to implement, and you (@digitensions) already use --check so there is no impact for you; I will implement that soon.