Opened 12 years ago

Closed 11 years ago

#4026 closed defect (invalid)

Default playback group for XvMC does not include 1080i/p

Reported by: anonymous Owned by: danielk
Priority: minor Milestone: 0.21
Component: mythtv Version: head
Severity: low Keywords:
Cc: Ticket locked: no

Description

The default playback groups include profiles for resolutions up to 720p but not 1080. As a result, playback of 1080i content will not use the Chromakey OSD.

Change History (3)

comment:1 Changed 12 years ago by skd5aner@…

I noticed this last night. Because of it, playback was somehow defaulting to OpenGL rendering under the CPU+ playback profile (which made no sense to me).

Anyway, I changed the first condition in the profile to match anything <= 1920x1088, and that fixed it.

I do think this should probably be fixed upstream: right now the profile only covers content up to 720p, so 1080 content falls through to some other option. I don't understand how/where/why mine defaulted to OpenGL rendering, but from the -v playback logs it looked like this:

2007-09-26 19:38:53.312 AFD: Successfully opened decoder for file: "myth://192.168.1.200:6543/227225_20070925210000.mpg". novideo(0)
2007-09-26 19:38:53.317 VideoOutput: Allowed renderers: opengl,xv-blit,xshm,xlib
2007-09-26 19:38:53.317 VideoOutput: Allowed renderers (filt: ffmpeg): xlib,xshm,xv-blit,opengl
2007-09-26 19:38:53.319 VDP: Accepting: cmp(<= 720 576,> 0 0) dec(ffmpeg) rend(xv-blit) osd(softblend) osdfade(enabled) deint(kerneldeint,linearblend) filt()
2007-09-26 19:38:53.319 VDP: Accepting: cmp(<= 1280 720,> 720 576) dec(ffmpeg) rend(xv-blit) osd(softblend) osdfade(enabled) deint(kerneldeint,linearblend) filt()
2007-09-26 19:38:53.319 VDP: Accepting: cmp(<= 1280 720,> 720 576) dec(libmpeg2) rend(xv-blit) osd(softblend) osdfade(enabled) deint(bobdeint,onefield) filt()
2007-09-26 19:38:53.320 VDP: Accepting: cmp(> 0 0) dec(xvmc) rend(xvmc-blit) osd(ia44blend) osdfade(disabled) deint(bobdeint,onefield) filt()
2007-09-26 19:38:53.320 VDP: Accepting: cmp(> 0 0) dec(libmpeg2) rend(xv-blit) osd(chromakey) osdfade(disabled) deint(bobdeint,onefield) filt()
2007-09-26 19:38:53.320 VDP: LoadBestPreferences(2048x2048, 0)
2007-09-26 19:38:53.320 VDP: LoadBestPreferences(2048x2048, 60)
2007-09-26 19:38:53.320 VDP: LoadBestPreferences(1920x1088, 60)
2007-09-26 19:38:53.320 VideoOutput: Trying video renderer: xv-blit
2007-09-26 19:38:53.322 VDP: Accepting: cmp(<= 720 576,> 0 0) dec(ffmpeg) rend(xv-blit) osd(softblend) osdfade(enabled) deint(kerneldeint,linearblend) filt()
2007-09-26 19:38:53.322 VDP: Accepting: cmp(<= 1280 720,> 720 576) dec(ffmpeg) rend(xv-blit) osd(softblend) osdfade(enabled) deint(kerneldeint,linearblend) filt()
2007-09-26 19:38:53.322 VDP: Accepting: cmp(<= 1280 720,> 720 576) dec(libmpeg2) rend(xv-blit) osd(softblend) osdfade(enabled) deint(bobdeint,onefield) filt()
2007-09-26 19:38:53.322 VDP: Accepting: cmp(> 0 0) dec(xvmc) rend(xvmc-blit) osd(ia44blend) osdfade(disabled) deint(bobdeint,onefield) filt()
2007-09-26 19:38:53.322 VDP: Accepting: cmp(> 0 0) dec(libmpeg2) rend(xv-blit) osd(chromakey) osdfade(disabled) deint(bobdeint,onefield) filt()
2007-09-26 19:38:53.322 VDP: LoadBestPreferences(2048x2048, 0)
2007-09-26 19:38:53.323 VDP: LoadBestPreferences(2048x2048, 60)
2007-09-26 19:38:53.323 VideoOutputXv: ctor
2007-09-26 19:38:53.324 XOff: 0, YOff: 0
2007-09-26 19:38:53.324 VDP: LoadBestPreferences(1920x1088, 60)
2007-09-26 19:38:53.324 Display Rect  left: 0, top: 90, width: 1280, height: 540, aspect: 1.33333
2007-09-26 19:38:53.324 Video Rect    left: 0, top: 0, width: 1920, height: 1080, aspect: 1.77778
2007-09-26 19:38:53.324 VideoOutputXv: Pixel dimensions: Screen 1280x720, window 1280x720
2007-09-26 19:38:53.325 VideoOutputXv: Estimated display dimensions: 325x183 mm  Aspect: 1.77596
2007-09-26 19:38:53.325 VideoOutputXv: Estimated window dimensions: 325x183 mm  Aspect: 1.77596
2007-09-26 19:38:53.325 VideoOutputXv: InitSetupBuffers() render: xvmc-blit, allowed: opengl,xv-blit,xshm,xlib
2007-09-26 19:38:53.325 VideoOutputXv: Desired video renderer 'xvmc-blit' not available.
                        codec 'MPEG2' makes 'opengl,xv-blit,xshm,xlib,' available, using 'opengl' instead.
2007-09-26 19:38:53.325 VDP: SetVideoRenderer(opengl)
2007-09-26 19:38:53.325 VDP: Old preferences: rend(xvmc-blit) osd(ia44blend) deint(bobdeint,onefield) filt()
2007-09-26 19:38:53.325 VDP: New preferences: rend(opengl) osd(softblend) deint(bobdeint,linearblend) filt()
2007-09-26 19:38:53.328 GLCtx: Created window and context.
2007-09-26 19:38:53.344 GLCtx: GLX Version: 1.3
2007-09-26 19:38:53.344 GLCtx: Direct rendering: Yes
2007-09-26 19:38:53.344 GLCtx: OpenGL vendor  : NVIDIA Corporation
2007-09-26 19:38:53.344 GLCtx: OpenGL renderer: GeForce 7300 GS/PCI/SSE2
2007-09-26 19:38:53.344 GLCtx: OpenGL version : 2.1.1 NVIDIA 100.14.11
2007-09-26 19:38:53.344 GLCtx: Max texture size: 4096 x 4096
2007-09-26 19:38:53.344 GLVid: Viewport: 1920x1088
2007-09-26 19:38:53.351 GLVid: Created main input texture 960x544
2007-09-26 19:38:53.359 GLVid: Created main input texture 960x544
2007-09-26 19:38:53.385 GLVid: Created main input texture 1920x1088
2007-09-26 19:38:53.385 GLVid: Creating master filter.
2007-09-26 19:38:53.385 GLVid: Created fragment program master.
2007-09-26 19:38:53.385 GLVid: Creating resize filter.
2007-09-26 19:38:53.425 GLCtx: Created frame buffer object (1920x1088).
2007-09-26 19:38:53.425 GLVid: Turning off deinterlacing.
2007-09-26 19:38:53.425 GLVid: Turning off deinterlacing.
2007-09-26 19:38:53.425 Created data @0xab612020->0xab90f022
2007-09-26 19:38:53.425 Created data @0xab314020->0xab611022
2007-09-26 19:38:53.425 Created data @0xab016020->0xab313022
...
2007-09-26 19:38:53.427 Created data @0xa2f10020->0xa320d022
2007-09-26 19:38:53.516 VDP: GetFilteredDeint() : opengl -> 'bobdeint'
2007-09-26 19:38:53.517 Using deinterlace method bobdeint
2007-09-26 19:38:53.517 VDP: SetVideoRenderer(opengl)
2007-09-26 19:38:53.517 VDP: SetVideoRender(opengl) == GetVideoRenderer()
2007-09-26 19:38:53.522 Display Rect  left: 0, top: 0, width: 1280, height: 720, aspect: 1.77778
2007-09-26 19:38:53.522 Video Rect    left: 0, top: 0, width: 1920, height: 1080, aspect: 1.77778
2007-09-26 19:38:53.523 Over/underscan. V: 0, H: 0
2007-09-26 19:38:53.523 Display Rect  left: 0, top: 0, width: 1280, height: 720, aspect: 1.77778
2007-09-26 19:38:53.523 Video Rect    left: 0, top: 0, width: 1920, height: 1080, aspect: 1.77778
2007-09-26 19:38:53.523 VDP: LoadBestPreferences(1920x1088, 29.97)
2007-09-26 19:38:53.744 NVP: ClearAfterSeek(1)
2007-09-26 19:38:53.744 VideoOutputXv: ClearAfterSeek()
2007-09-26 19:38:53.744 VideoOutputXv: DiscardFrames(0)
2007-09-26 19:38:53.744 VideoBuffers::DiscardFrames(0): AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
2007-09-26 19:38:53.745 VideoBuffers::DiscardFrames(0): AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA -- done
2007-09-26 19:38:53.745 VideoOutputXv: DiscardFrames() 3: AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA -- done()
2007-09-26 19:38:53.746 TV: StartPlayer(): took 1011 ms to start player.
2007-09-26 19:38:53.747 TV: Changing from None to WatchingPreRecorded
2007-09-26 19:38:53.748 VDP: GetFilteredDeint() : opengl -> 'bobdeint'
2007-09-26 19:38:53.748 VDP: GetFilteredDeint() : opengl -> 'bobdeint'
2007-09-26 19:38:53.749 Using deinterlace method bobdeint
2007-09-26 19:38:53.750 Realtime priority would require SUID as root.
2007-09-26 19:38:53.773 AFD: DoFastForward(27553 (1), do discard frames)
2007-09-26 19:38:53.773 Dec: DoFastForward(27553 (1), do discard frames)
2007-09-26 19:38:53.774 AFD: SeekReset(27567, 0, do flush, do discard)
2007-09-26 19:38:53.774 AFD: SeekReset() flushing
2007-09-26 19:38:53.774 VideoOutputXv: DiscardFrames(1)
2007-09-26 19:38:53.774 VideoBuffers::DiscardFrames(1): UAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
2007-09-26 19:38:53.774 VideoBuffers::DiscardFrames(): AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA -- done()
2007-09-26 19:38:53.774 VideoBuffers::DiscardFrames(1): AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA -- done
2007-09-26 19:38:53.774 VideoOutputXv: DiscardFrames() 3: AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA -- done()
...

Once I increased the maximum resolution, it did what I wanted it to do, which was to use standard ffmpeg decoding with xv-blit, without xvmc/opengl. I've got to admit, the new playback profiles are cool, but there's a slight learning curve and I'm still learning.

Again, like I said, maybe the defaults should be changed to accommodate 1080i signals. Of course, the rules that say >0x0 should cover it too... but those are more of a fallback.
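For anyone puzzling over the VDP log lines above, here is a rough sketch (not actual MythTV code; the helper names are made up) of how resolution conditions like "<= 1280 720" and "> 0 0" select a profile entry, and why 1920x1088 skips the default rules and lands on a catch-all:

```python
# Illustrative sketch of profile-condition matching; not MythTV source.
def cond_matches(cond, width, height):
    """cond is (op, w, h), e.g. ('<=', 1280, 720) for 'cmp(<= 1280 720)'."""
    op, w, h = cond
    if op == '<=':
        return width <= w and height <= h
    if op == '>':
        return width > w and height > h
    raise ValueError('unknown operator: %s' % op)

def pick_profile(entries, width, height):
    """Return the first entry whose every condition matches, else None."""
    for conds, renderer in entries:
        if all(cond_matches(c, width, height) for c in conds):
            return renderer
    return None

# Entries mirroring the default profile seen in the log above:
entries = [
    ([('<=', 720, 576), ('>', 0, 0)],     'xv-blit/ffmpeg'),
    ([('<=', 1280, 720), ('>', 720, 576)], 'xv-blit/ffmpeg'),
    # No rule covers anything above 1280x720, so 1080i falls
    # through to the '> 0 0' catch-all:
    ([('>', 0, 0)],                        'xvmc-blit/xvmc'),
]

print(pick_profile(entries, 1280, 720))   # 720p hits the second rule
print(pick_profile(entries, 1920, 1088))  # 1080i skips both, hits the catch-all
```

This is why widening the first condition to <= 1920x1088 changes which renderer 1080i content gets: it then matches an explicit rule before reaching the fallback.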

comment:2 Changed 11 years ago by danielk

Milestone: unknown → 0.21
Owner: changed from Isaac Richards to danielk
Status: new → assigned
Version: unknown → head

comment:3 Changed 11 years ago by danielk

Resolution: set to invalid
Status: assigned → closed

The CPU+ profile is an example that mixes hardware + software rendering methods. The CPU-- example is a better starting point for you.
