Gentoo Forums
Howto transcode recordings for mythtv & standalone automatically
Gentoo Forums Forum Index :: Documentation, Tips & Tricks
devsk
Advocate


Joined: 24 Oct 2003
Posts: 2995
Location: Bay Area, CA

Posted: Sun Jan 14, 2007 5:18 am    Post subject: Howto transcode recordings for mythtv & standalone automatically

edit: this script is now more general-purpose and can be used as a mythtv user job as well as a weekly cron job. It updates the db more cleanly, uses the parallelism of a dual-core CPU, and handles the preview PNG files as well.

I faced a very strange situation: MPEG-2 files were taking huge amounts of disk space, and if I transcoded them with mythtv's built-in transcoder, it would give me NUV files that won't play on my standalone DivX player. ffmpeg creates perfectly compatible XviD-encoded AVI files with no seek errors, no audio/video sync problems, and no fast-forward or rewind problems. These same AVI files play fine with mythtv's built-in player and the MythVideo plugin.

So the problem was well defined: keep the nuvexport and mythtv filename formats the same, replace .mpg with .avi, update the database to point to the .avi file, and rebuild the seek table.

One more problem: the built-in transcoder wasn't using my dual-core CPU fully (only 50% usage). Why did I buy a dual-core CPU if I were to use only half of it?

So I went on to write this very simple script, which you can put in /usr/bin (or /home/<user>/bin) and add a crontab entry for, to run over the weekend or whenever you want to transcode. It creates nuvexport jobs for all the .mpg files in your mythtv storage area, running two jobs at a time if you have a dual-core CPU.

First, make sure you have mythrename.pl and a link to it in /usr/bin:

Code:

# ls -al /usr/bin/mythrename
lrwxrwxrwx 1 root root 39 Dec 20 20:43 /usr/bin/mythrename -> /usr/share/mythtv/contrib/mythrename.pl*


Second, apply this patch to /usr/share/mythtv/contrib/mythrename.pl :
Code:

# diff -u /usr/share/mythtv/contrib/mythrename.pl.orig /usr/share/mythtv/contrib/mythrename.pl
--- /usr/share/mythtv/contrib/mythrename.pl.orig        2007-01-08 18:32:18.000000000 -0800
+++ /usr/share/mythtv/contrib/mythrename.pl     2007-01-08 18:33:43.000000000 -0800
@@ -111,7 +111,7 @@
       codes above; i.e. "\%oY" gives the year in which the episode was first
       aired.

-    * A suffix of .mpg or .nuv will be added where appropriate.
+    * A suffix of .mpg or .avi will be added where appropriate.

     * To separate links into subdirectories, include the / format specifier
       between the appropriate fields.  For example, "\%T/\%S" would create
@@ -308,8 +308,8 @@
         my ($oyear, $omonth, $oday) = split(/\-/, $info{'originalairdate'}, 3);
     # Build a list of name format options
         my %fields;
-        ($fields{'T'} = ($info{'title'}       or '')) =~ s/%/%%/g;
-        ($fields{'S'} = ($info{'subtitle'}    or '')) =~ s/%/%%/g;
+        ($fields{'T'} = ($info{'title'}       or 'Untitled')) =~ s/%/%%/g;
+        ($fields{'S'} = ($info{'subtitle'}    or 'Untitled')) =~ s/%/%%/g;
         ($fields{'R'} = ($info{'description'} or '')) =~ s/%/%%/g;
         ($fields{'C'} = ($info{'category'}    or '')) =~ s/%/%%/g;
         ($fields{'U'} = ($info{'recgroup'}    or '')) =~ s/%/%%/g;
@@ -382,7 +382,7 @@
         $safe_file = "'$safe_file'";
     # Figure out the suffix
         my $out    = `file -b $safe_file 2>/dev/null`;
-        my $suffix = ($out =~ /mpe?g/i) ? '.mpg' : '.nuv';
+        my $suffix = ($out =~ /mpe?g/i) ? '.mpg' : '.avi';
     # Link destination
         if ($dest) {
         # Check for duplicates


Third, create a ~/.nuvexportrc. Mine looks like:
Code:

<nuvexport>
    export_prog=ffmpeg
    mode=xvid
    underscores=yes
    filename=%t %m %s
    date=%Y-%m-%d %i-%M%p
    crop_top    = 0
    crop_right  = 1
    crop_bottom = 1
    crop_left   = 2
</nuvexport>
<generic>
    use_cutlist = yes
    multipass = yes
    noise_reduction = no
    deinterlace     = yes
    crop = yes
</generic>
<transcode>
    force_mythtranscode = yes
    mythtranscode_cutlist = yes
</transcode>
<XviD>
    vbr          = yes
    multipass    = yes
    quantisation = 6
    a_bitrate    = 128
    v_bitrate    = 1500
    width        = 640
    height       = auto
</XviD>

#
#  Default mp3 bitrate in MythTV is 128
#
<MP3>
    bitrate = 128
</MP3>
<profile::mencodexvid>

    mode        = xvid
    export_prog = mencoder
    multipass   = yes
    vbr         = yes
    deinterlace = yes

</profile::mencodexvid>

Go ahead and edit it to your heart's content. I chose 1500 kbps for video because it gives me close to 500 MB for a one-hour show, with great quality, after cutting out the commercials. When playing the AVI on my TV set, I can't tell it from the actual broadcast.
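That 500 MB figure checks out with some quick arithmetic. A throwaway sketch (the 44-minute figure is my assumption for how much of an hour-long show is left after the commercial cut):

```shell
#!/bin/sh
# Rough size estimate: (video + audio bitrate) * duration / 8 bits per byte.
# 1500 kbit/s video + 128 kbit/s audio from the .nuvexportrc above,
# ~44 minutes of show left after cutting commercials from an hour.
v_kbps=1500
a_kbps=128
minutes=44
size_mb=$(( (v_kbps + a_kbps) * minutes * 60 / 8 / 1024 ))
echo "estimated size: ${size_mb} MB"
# -> estimated size: 524 MB
```

So "close to 500 MB" is exactly what the bitrates predict; raise or lower v_bitrate to taste.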

Now, the /usr/bin/my_transcode_job script itself:
Code:

#!/bin/bash
# bash is required for the arrays and string substitutions used below

# get the pathname for storage dir
stor_dir=`echo "select data from settings where value=\"RecordFilePrefix\";" \
                | mysql -umythtv -pmythtv -Dmythconverg --skip-column-names`

logfile="${HOME:-/tmp}/myth-transcode.log"

# parse the args
filen=
update_db=0
while [ $# -gt 0 ]
do
        case "$1" in
        --help)
                echo "Transcode to AVI and update the database with the newly created file."
                echo "Doesn't create a NUV file like mythtv, but a compatible AVI which can"
                echo "play on standalone divx players as well as in mythtv."
                echo ""
                echo "$0 :"
                echo "   --help    - Print this help message."
                echo "   --ifile   - Supply one file, to make it possible to run this as a user job."
                echo "   --all     - Default mode to process all files, suitable for a weekly cron."
                echo "   --update-db - Update the db for a file created manually (mencoder)."
                echo "                 Can be used in conjunction with other methods: create the"
                echo "                 AVI file first, then use this script to just update the DB."
                echo "                 Must pass the filename with the .mpg ext using --ifile."
                echo "                 Must have the AVI with the same name format in $stor_dir ."
                exit 0
                ;;
        --ifile)
                shift
                filen="$1"
                echo "Working on file ${filen}." >> $logfile
                ;;
        --all)
                filen=
                ;;
        --update*)
                update_db=1
                ;;
        esac
        shift
done

function update_database()
{
        echo "Updating database for ${1} with ${1%.*}.avi..." >> $logfile
        nfilen=${1%.*}.avi
        # update basename
        echo "update recorded set basename=\"$nfilen\" where basename=\"$1\" ;" \
                | mysql -umythtv -pmythtv -Dmythconverg
        # delete recorded markup
        echo "delete from recordedmarkup where \
                starttime=(select starttime from recorded where basename=\"${nfilen}\") and \
                chanid=(select chanid from recorded where basename=\"${nfilen}\")" \
                | mysql -umythtv -pmythtv -Dmythconverg

        # change the filesize
        filesize=`/bin/ls -al "${stor_dir}/${nfilen}" 2> /dev/null | awk '{print $5}'`
        [ -z "$filesize" ] && filesize=0
        echo "update recorded set filesize=${filesize} where basename=\"${nfilen}\" ;" \
                | mysql -umythtv -pmythtv -Dmythconverg

        # move the preview png
        # NOTE: the png file is not renamed by mythrename, nor is it stored in the db.
        # The assumption is that it has to be basename.png to be picked up as the preview.
        if [ -f "${stor_dir}/${1}.png" ]
        then
                echo "Moving preview ${stor_dir}/${1}.png to ${stor_dir}/${nfilen}.png ..." >> $logfile
                mv "${stor_dir}/${1}.png" "${stor_dir}/${nfilen}.png"
        else
                # means that the png is chanid_datetime format
                strt=`echo "select chanid,starttime from recorded where basename=\"${nfilen}\" ;" \
                        | mysql -umythtv -pmythtv -Dmythconverg --skip-column-names`
                # replace first ' ' with '_'
                pngnm=${strt/[ \        ]/_}
                # replace rest of ' ', '-' and ':' with ''
                pngnm=${pngnm//[ \      \-:]/}.mpg.png

                echo "Moving preview ${stor_dir}/$pngnm to ${stor_dir}/${nfilen}.png ..." >> $logfile

                if [ -f "${stor_dir}/$pngnm" ]
                then
                        mv "${stor_dir}/$pngnm" "${stor_dir}/${nfilen}.png"
                fi
        fi

        # I may not have cut out the comms from the orig, and seektable is goodness.
        mythcommflag --rebuild -f "${nfilen}"

        # remove the original file
        echo "Deleting \"${stor_dir}/${1}\"" >> $logfile
        /bin/rm -f "${stor_dir}/${1}"
}

function nuvexport_spawn()
{
        nuvexport --path "$1" --infile "$2" --profile "$3"
        rc=$?
        if [ $rc -ne 0 ]
        then
                echo "nuvexport failed with status $rc on $2" >> $logfile
        else
                update_database "$2"
                echo "Finished transcoding \"$2\"." >> $logfile
        fi
}

if [ $update_db -eq 1 ]
then
        if [ -z "$filen" ]
        then
                echo "ERROR: --update-db requires --ifile parameter."
                echo ""
                exit 1
        fi
        if [ ! -f "${stor_dir}/${filen%.*}.avi" ]
        then
                echo "ERROR: --update-db requires an AVI in ${stor_dir} with same name format."
                echo ""
                exit 1
        fi
        update_database $filen
        exit 0
fi

# get starttime and chanid for the infile
if [ -n "$filen" ]
then
        starttime=`echo "select starttime from recorded where basename=\"${filen}\";" \
                        | mysql -umythtv -pmythtv -Dmythconverg --skip-column-names`
        chanid=`echo "select chanid from recorded where basename=\"${filen}\";" \
                        | mysql -umythtv -pmythtv -Dmythconverg --skip-column-names`
fi

# mythtv doesn't always call the rename, so call it now
/usr/bin/mythrename --underscores --format "%T %Y-%m-%d %H-%i-%s %S"

if [ -n "$filen" ]
then
        # filename has changed in the db
        filen=`echo "select basename from recorded where starttime='$starttime' and chanid='$chanid';" \
                        | mysql -umythtv -pmythtv -Dmythconverg --skip-column-names`
fi

num_jobs=$(grep -c '^processor' /proc/cpuinfo)

# fall back to a single job if the cpuinfo parse came up empty
if [ "$num_jobs" -eq 0 ]; then num_jobs=1 ;fi

# construct a working set of files
if [ -n "$filen" ]
then
        file_set="$filen"
else
        file_set=
        # drop the ones recording right now
        for i in `ls ${stor_dir}/*.mpg 2>/dev/null`
        do
                filen=`basename $i`
                chanid=`echo "select chanid from inuseprograms where \
                        starttime=(select starttime from recorded where basename=\"${filen}\") \
                        and chanid=(select chanid from recorded where basename=\"${filen}\") \
                        and recusage='recorder' ;" \
                        | mysql -umythtv -pmythtv -Dmythconverg --skip-column-names`
                # append if not recording
                if [ -z "$chanid" ]
                then
                        # append only if in the db
                        filen=`echo "select basename from recorded where basename=\"${filen}\" ;" \
                                | mysql -umythtv -pmythtv -Dmythconverg --skip-column-names`
                        [ -n "$filen" ] && file_set="$file_set $filen"
                fi
        done
fi

echo "Working on ${file_set} in directory ${stor_dir}..." >> $logfile
slot=0
for i in $file_set
do
        currJobcount=$(jobs -r | grep -c .)
        while [ "$currJobcount" -ge "$num_jobs" ]
        do
                # sleep 30 seconds; these are heavy-duty jobs and will take time
                sleep 30
                currJobcount=$(jobs -r | grep -c .)
        done

        # spawn a nuvexport for this file
        filen=`basename $i`
        echo "Starting job with file \"${filen}\" in ${stor_dir}..." >> $logfile
        nuvexport_spawn "${stor_dir}" "${filen}" "mencodexvid" &
        # get current jobid
        myJobs[$slot]=$(jobs %% | sed 's/^[^0-9]*//' | sed 's/[^0-9].*$//')
        slot=$((slot+1))
done

echo "" >> $logfile
echo "All jobs finished spawning." >> $logfile
echo "" >> $logfile
echo "Waiting for jobs to finish." >> $logfile
echo "" >> $logfile
for (( i=0 ; i < slot ; i++ ))
do
        wait %${myJobs[$i]}
done


And lastly, edit your crontab:
Code:

crontab -e

and enter something like "0 23 * * 6 /usr/bin/my_transcode_job" to run this script at 11 PM every Saturday night.


A few things to note:

0. Do all of this as a normal user, who should be made a member of the mythtv and cron groups before proceeding. stor_dir should have group write permission for the mythtv group.

1. Understand that this is going to rename your recordings. With the --underscores option, a format like "%T %Y-%m-%d %g-%i%A %S" comes out as:

NUMB3RS_2007-01-12_10-00PM_Finders_Keepers.avi

(The mythrename call in the script as posted uses "%T %Y-%m-%d %H-%i-%s %S".)

If you don't like that, change both the nuvexport and mythrename format strings in the script to your liking.

2. It updates your database with the newer basename and seek table. So back up your database daily or weekly.
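For that backup, a crontab fragment along these lines would do; this is only a sketch, assuming the same mythtv/mythtv credentials the script uses and that mysqldump is in the path:

```
# nightly dump of the mythconverg database at 4 AM
0 4 * * * mysqldump -umythtv -pmythtv mythconverg > $HOME/mythconverg-backup.sql
```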

3. If and when storage groups come out and RecordFilePrefix changes, the stor_dir query in the script may need a 'tail -1' (or a rewrite of that whole line) to account for multiple storage places.
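The 'tail -1' idea looks like this; a sketch only, with the mysql query faked by printf so you can see the shape of it (the real script pipes the SELECT through mysql):

```shell
#!/bin/sh
# Sketch: if the settings query ever returns several storage paths,
# keep only the last row so the rest of the script still sees one dir.
# /var/store1 and /var/store2 are made-up stand-ins for the query result.
all_dirs=$(printf '%s\n' /var/store1 /var/store2)
stor_dir=$(echo "$all_dirs" | tail -n 1)
echo "$stor_dir"
# -> /var/store2
```

Picking one directory is only a stopgap; real storage-group support would mean looping over every path.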

4. If you want, the \rm -f of the original MPEG file can be commented out of the script for now, because running \rm -f <blah>/*.mpg manually is not that difficult. If and when you completely trust nuvexport, uncomment it for complete automation.

5. transcode (the encoder program) produced worse quality than ffmpeg with the same settings; I have posted screenshots elsewhere on the forums to prove it. Moreover, it didn't create standalone-compatible AVI files: I couldn't seek properly, couldn't fast forward, and got horrible audio/video sync problems. So the choice was clearly ffmpeg.


Last edited by devsk on Mon Feb 05, 2007 6:57 am; edited 2 times in total
pteppic
l33t


Joined: 28 Nov 2005
Posts: 781

Posted: Sun Jan 14, 2007 9:27 am

Ha, ha, glad you got it to work.
Is there any reason you don't run it as a 'user job' in mythtv?

Also, if you're looking to use both cores [CPU], then why not patch nuvexport** to support threading with the xvid encoder, or run the first and second passes at the same time (albeit with a small delay or more 'niceness' on the second pass)? I know neither of these sounds trivial, but you're not using most of nuvexport, just one export profile for one encoder.



**Yes yes, I know I hated patching nuvexport so much it was the main reason for embarking on my own little crusade.
devsk
Advocate


Joined: 24 Oct 2003
Posts: 2995
Location: Bay Area, CA

Posted: Sun Jan 14, 2007 3:28 pm

pteppic wrote:
Ha, ha, glad you got it to work.
Is there any reason you don't run it as a 'user job' in mythtv ?
There is no reason, just that I like to do it at a particular time rather than right after recording the show. Transcoding typically needs a lot of time, so it's always good to do it in a batch overnight. I have noticed that starting at 11 pm on Saturday, running two jobs at a time, I am able to encode 7 one-hour shows during the night compared to 4. If I were to use a user job, it would run sort of inline, eat only 50% of the CPU, and disrupt my normal usage of the system anyway.
pteppic wrote:

Also, if you're looking to use both cores [CPU], then why not patch nuvexport** to support threading with the xvid encoder, or run the first and second passes at the same time (albeit with a small delay or more 'niceness' on the second pass)? I know neither of these sounds trivial, but you're not using most of nuvexport, just one export profile for one encoder.
Yeah, it's limiting in that regard, but it's supposed to solve only one particular problem.

transcode is supposed to use both cores with multiple threads, like you say, and take advantage of the available CPU. But the end result is abysmal: one transcode job uses more than 90% of the CPU and is still slower than ffmpeg, which uses only 50%. In the second pass particularly, ffmpeg gets around 24-27 fps in my setup while transcode manages only 10-11 fps with the same xvid and image settings. That is the result of too many threads, with most of the time eaten up by synchronization, and it makes a huge difference in the overall time to encode with a multi-pass setup. So two ffmpeg jobs work out much, much better than trying to multi-thread the encoding process itself. And since the shows build up during the week, the CPU has plenty of work to do on Saturday night.
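Those fps numbers translate directly into wall-clock time; a quick sketch, assuming an ~29.97 fps NTSC source and taking 25 and 10.5 fps as representative of the encode rates quoted above:

```shell
#!/bin/sh
# An hour of NTSC video is ~29.97 fps * 3600 s ~= 107892 frames.
frames=107892
# Per-pass wall-clock time at ffmpeg's ~25 fps vs transcode's ~10.5 fps
awk -v f=$frames 'BEGIN {
    printf "ffmpeg:    %.0f minutes per pass\n", f / 25 / 60;
    printf "transcode: %.0f minutes per pass\n", f / 10.5 / 60;
}'
# -> ffmpeg:    72 minutes per pass
# -> transcode: 171 minutes per pass
```

So even before parallelism, each ffmpeg pass finishes in well under half the time; running two ffmpeg jobs side by side just widens the gap.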