1080i source mostly deinterlaced before encoding?

Discussion in 'Helpdesk' started by wasper, Oct 15, 2018.

  1. wasper

    wasper Well-Known Member

    Joined:
    Jan 8, 2018
    Threads:
    49
    Messages:
    2,697
    Likes Received:
    536
    Hello to all,

    This might be a rather stupid question...

    I usually deinterlace the video when the source is said to be 1080i. However, in some cases, I don't see any difference during playback between deinterlacing on or off. If the frame rate is 50 or 59, I can tell that it was deinterlaced before compression. However, when the frame rate is 25 or 29, I assume it has not been deinterlaced. Could the compression make interlacing harder to detect visually, or could it have been deinterlaced to the same frame rate before encoding? Hope you understand my question!
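    Rather than judging by eye, this can be checked with ffmpeg/ffprobe, assuming they are installed; `input.mkv` here is a placeholder for the file in question:

```shell
# Check the declared field order in the stream metadata
# (values like "progressive", "tt", "bb"; note metadata can be wrong).
ffprobe -v error -select_streams v:0 \
  -show_entries stream=field_order \
  -of default=noprint_wrappers=1:nokey=1 input.mkv

# Analyze the actual pixels: the idet filter counts how many sampled
# frames it judges interlaced (TFF/BFF) vs. progressive.
ffmpeg -hide_banner -i input.mkv -an \
  -filter:v idet -frames:v 500 -f null - 2>&1 | grep "Multi frame"
```

    If idet reports mostly "Progressive", the 25/29 fps file was likely deinterlaced (or was progressive all along) before encoding, which would explain seeing no difference with the player's deinterlacer on or off.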
     

  3. ThePixelFarm

    ThePixelFarm Livin' in a dream... Capper Uploader V . I . P. Collector Donor

    Joined:
    Aug 31, 2018
    Threads:
    935
    Messages:
    3,898
    Likes Received:
    16,082
    50 or 59 fps is not interlaced in the first place. Both of those frame rates give you the original 'video look/feel'.

    25 or 29 fps is interlaced, meaning it has a 'video look' by default (because it is 'video', as opposed to being captured on film).

    Deinterlacing basically gets rid of the two fields that make up a frame in video footage (removing the 'video look/feel' and instead making it look like it was made on film), giving you one frame (no fields).

    For me, no one should deinterlace anything... Your media player can do it automatically in most cases, or you can do it manually before you watch, which will retain the 'video look/feel' of the original...

    Best I can do without going into more detail...
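    For anyone who does want to bake the deinterlace into the encode anyway, a common sketch uses ffmpeg's yadif filter (filenames and the x264 settings here are placeholders, not a recommendation):

```shell
# yadif=mode=1 emits one frame per field, so 1080i25 (50 fields/s)
# becomes 1080p50 and keeps the fluid 'video look/feel'.
# mode=0 emits one frame per field *pair* instead, giving 1080p25.
ffmpeg -i input.mkv -vf yadif=mode=1 -c:v libx264 -crf 18 -c:a copy output.mkv
```

    The alternative described above, leaving the file interlaced and letting the player handle it, works too; in mpv, for example, the `d` key toggles deinterlacing at playback time.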
     
    ianarturis, ElFunadisimo and Dream66 like this.