Saturday, January 2, 2010

Windows 7 Right Click Manage stops working

SkyHi @ Saturday, January 02, 2010

I also had the same problem. I tried replacing the blocked CompMgmtLauncher.exe with a known working CompMgmtLauncher.exe file from another win 7 x64 system, and still nothing would happen.

I discovered that a third-party shell extension was causing the problem. I had installed a security product, "a-squared Anti-Malware 4.5" (http://www.emsisoft.com/en/software/antimalware/), on the computer with the malfunctioning right-click context menu; its shell extension prevented CompMgmtLauncher.exe from executing, and hence CompMgmt.msc would not start.

I solved the issue by removing the a-squared context menu option, as follows:

Open a-squared Anti-Malware, click Configuration , and uncheck Activate Explorer integration. The trade-off of this option is that you won't be able to scan individual files/folders from the context menu (if you really do need that functionality, it's easy to recheck the box and re-activate explorer integration).

Now, I can right click my computer and "manage" works like a charm.



Reference: http://social.technet.microsoft.com/Forums/en/w7itproui/thread/70897346-38b3-4872-99ec-8e1c701fc7dc



Friday, January 1, 2010

Having Windows Live Messenger in the notification area

SkyHi @ Friday, January 01, 2010
When using Windows 7, Windows Live Messenger does not minimize to the system tray when you close its window, as it did on Vista or XP.


The solution is to run WLM in compatibility mode for Windows Vista.

To do this:
Start/Computer/Open your system partition (C:)

Default path: C:\Program Files\Windows Live\Messenger\msnmsgr

Right-click the file msnmsgr and select Properties.

Next, open the Compatibility tab.

Check "Run this program in compatibility mode for:" and select Windows Vista (Service Pack 2, for example).

Confirm by clicking OK.

Open WLM; it will now appear in the notification area.
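If you prefer to script this rather than clicking through the Properties dialog, the same compatibility layer can be set per user in the registry. This is only a sketch, assuming the default install path shown above (on 64-bit Windows the executable usually lives under Program Files (x86) instead):

reg add "HKCU\Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers" /v "C:\Program Files\Windows Live\Messenger\msnmsgr.exe" /t REG_SZ /d "VISTASP2" /f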


Change the icon of a folder or shortcut


If you want to change the icon of a folder on your desktop then:
  • Start by right-clicking the folder whose icon you want to change.
  • Next, go to Properties then Customize.



Then go to Change Icon ... where you can replace the folder icon with one of the many icons proposed.

Click Apply to see if the new icon fits your needs; if it does, click OK.

Reference: http://en.kioskea.net/faq/sujet-3135-having-windows-live-messenger-in-the-notification-area

Change the Windows 7 Taskbar to Work Like Vista

SkyHi @ Friday, January 01, 2010

While many think the new taskbar feature in Windows 7 is a great improvement, others may not want to use it. Today we take a look at how to get the Vista style taskbar back on Windows 7.

The new Windows 7 taskbar does take some getting used to for sure. However, if you need to get things done on your new OS quickly, you don’t have time to learn how it works.


To get the Vista-style taskbar back, right-click an open area on the Taskbar and select Properties.


The Taskbar and Start Menu Properties window opens up, where you will want to check the box next to “Use small icons” and select “Combine when taskbar is full” from the drop-down next to Taskbar buttons.


Now the Taskbar looks similar to how it did in Vista. It doesn’t look exactly the same, but it has similar functionality.


Another tweak is to unpin all programs from the Windows 7 Taskbar.


Then follow The Geek’s tutorial on adding the Quick Launch Bar in Windows 7 to get a look and feel that is exactly like Vista with the Quick Launch Bar enabled.


This should help you out when you are too busy to sit and learn the new Windows 7 Taskbar or simply are not a fan of it.



These steps do NOT give you back the Vista taskbar. It is as close as you can get in 7, but sadly many things are missing.
1) a double height taskbar in Vista will efficiently triple-stack notification icons. A double height taskbar in 7 gives you a very poorly spaced double-stack arrangement.
2) a double height taskbar in Vista allows you to designate one row for quicklaunch, and another for active program windows. In 7, it’s all one big goulash.
3) in Vista, you can fit TWICE as many quicklaunch icons into a row. In 7, each icon has a glob of wasted space around it.
4) in Vista, you can launch a second instance of a program by clicking your quicklaunch icon again. In 7, your icon becomes the active program bar, making launching a second instance require a different process from the first (that is truly stupid and probably breaks some philosophical design law somewhere).
5) In Vista, your quicklaunch icons are always in the same place. In 7, they move around based on what you have open.
6) In Vista, you can make pretty much anything you want into a quicklaunch icon just by dragging the icon there. In 7, certain things are not “pinned” the way you might expect them to be, depending on what they are.
7) In Vista, the taskbar and associated areas are functional. In 7, they’re just pretty.

I have yet to find any redeeming qualities about the 7 taskbar. In fact, I HATE IT. I see people lauding it everywhere, but I have yet to hear someone say exactly what it is that they like about it. Yes, it is “cool looking”, but how is it more functional? As far as I can tell, it’s just a pain in the ass. I use the hell out of quicklaunch, and now it is ruined.


Reference:

http://www.howtogeek.com/howto/3784/change-the-windows-7-taskbar-to-work-like-vista/

http://www.intowindows.com/how-to-get-back-vista-taskbar-in-windows-7/



Solution:

http://windows.ittoolbox.com/groups/technical-functional/windows-7-l/vista-taskbar-in-windows-7-3063338

Vista like taskbar in Windows 7
If you don't like the new Windows 7 taskbar, you can always revert to the old order and make it look like the one from Windows Vista.
When you get to the part where you create the quick launch toolbar, this is what you paste in the address: "%appdata%\Microsoft\Internet Explorer\Quick Launch". What we forgot to mention in the video, because we thought it was implied, is that you must hit Enter after pasting the address. Good luck!





Windows 7 - Change Remote Desktop Connection (RDC) port

SkyHi @ Friday, January 01, 2010
7. Remote Desktop Connection behind a router (Single computer)
A) Configure your router's port forwarding to allow connections on port 3389
B) Now in RDC just type the IP address supplied by your ISP
8. Remote Desktop Connection behind a router (Multiple computers)
A) For each computer you want to connect to remotely, configure it to listen on a different port (to avoid port conflicts) by changing the registry value at
HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp\PortNumber
(see the command-line sketch after these steps for one way to make this change)

B) Port Numbers are in Decimal format
Note: To avoid conflict with other programs it is suggested to select a port between 49152 and 65535.

C) Now take the port number you've chosen for that computer and configure your router to forward that port to it

D) To connect to that computer your format will now be ISPIPAddress:PortNumber (ex. 222.222.2.8:1234) to connect to that specific computer behind the router
Important: If you have firewall software running, it too has to be configured to allow communication on the port you opened in your router.

Note: The above steps assume that you're not part of a domain
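If you prefer the command line to regedit, the same change can be made from an elevated Command Prompt roughly as follows. This is a sketch only: 50001 is just an example port in the suggested range, and the firewall rule name is arbitrary. Reboot (or restart the Remote Desktop Services service) afterwards for the new port to take effect.

reg add "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp" /v PortNumber /t REG_DWORD /d 50001 /f
netsh advfirewall firewall add rule name="RDP on port 50001" dir=in action=allow protocol=TCP localport=50001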

Reference: http://www.sevenforums.com/tutorials/3922-remote-desktop-connection-rdc-network.html

Thursday, December 31, 2009

Converting Video Formats with FFmpeg

SkyHi @ Thursday, December 31, 2009
FFmpeg allows Linux users to convert video files easily between a variety of different formats.
Today's affordable digital video cameras have placed the power of digital recording within most people's reach. Unfortunately, this has been accompanied with a corresponding increase in the variety of file formats and codecs available. Some of these formats are more efficient than others, and some are less encumbered by proprietary licensing restrictions. So, having the ability to convert from one format to another is a great help, as you can decide what format you are comfortable with and use that one instead of being restricted to a specific file format.
FFmpeg is a simple and straightforward application that allows Linux users to convert video files easily between a variety of different formats. In this article, I walk you through installing FFmpeg and provide a few instructive examples to demonstrate the range of applications for which it can be used.
FFmpeg Installation
FFmpeg is an open-source audio and video converter that supports most industry-standard codecs and can convert from one file format to another quickly and easily. It also lets you capture video and audio from a live source and process it.
The source code for FFmpeg is available for download from the project Web site (ffmpeg.sourceforge.net/index.php) and at the time of this writing, the latest version available at the site is 0.4.9-pre1.
Once you download the file, extract it using the following command:
tar -zxf ffmpeg-0.4.9-pre1.tar.gz
This creates a new directory containing the source code for FFmpeg. To install it with the default configuration options, run ./configure from within the FFmpeg source directory. Once the configuration script finishes, compile it by issuing make. Once the compile finishes without any errors, you can install FFmpeg by running make install as root.
On the other hand, if you like to have control over what is installed and prefer customizing software installs, you can pass some command-line parameters to the configure script. To see all the options available for the installer, run the following command:
./configure --help
This command gives you multiple screens of the various settings that can be modified, and you can choose any options you like. The on-screen display does a decent job of explaining what each option does, so I will not go into a lot of detail on this.
I suggest that you enable the following options, but this is not a requirement—feel free to experiment:
  • --enable-mp3lame: highly recommended—you won't be able to encode MP3s without this. Needs lame to be installed already.
  • --enable-a52: enables GPLed A52 support, needed for decoding some VOB files.
  • --enable-gpl: required for the previous component; otherwise, not needed.
As I didn't have lame installed on my system, I ran the following command to configure FFmpeg:
./configure --enable-a52 --enable-gpl
Once the configuration is complete, read through the output to make sure no errors were generated. Then, run make, and go have a drink or something as this might take a little while. Once the system finishes compiling FFmpeg, run make install as root to install FFmpeg, and you are done with the installation.
Basic Usage
Now that you have successfully installed FFmpeg, you can start experimenting with it. The first thing you have to do is choose a video file with which to experiment. As this is your first time with FFmpeg, making a backup copy of this file is highly recommended. You don't want to be responsible for ruining the only copy of a rare video.
This input file most probably has been encoded using a particular codec, but because FFmpeg supports most of the popular formats, we don't need to worry a lot about that. Formats supported by FFmpeg include MPEG, MPEG-4 (Divx), ASF, AVI, Real Audio/Video and Quicktime. To see a list of all the codecs/formats supported by FFmpeg, run the following command:
ffmpeg --formats
A detailed list of supported file formats is also available at the FFmpeg site.
FFmpeg supports a large list of command-line parameters that control various settings in FFmpeg. To get a listing of the various options available, run the following command:
ffmpeg --help
Don't let the multipage listing scare you from using FFmpeg, the basic usage is actually very simple. To convert a file with the default settings, run the following command:
ffmpeg -i InputFile OutputFile
The -i option tells FFmpeg that the filename immediately after it is the name of the file to be used as input. If this option is omitted, FFmpeg treats that filename as another output file and will attempt to overwrite it when it creates the output. FFmpeg uses the extension of the output file to try to determine the format and codec to use, though this can be overridden using command-line parameters (more on this later).
The default settings create an output file that has radio-quality sound (64kbps bitrate) and very bad video quality (200kbps bitrate). Fortunately, these settings can be changed for each encoding, which allows you to choose the quality of each file depending on the need.
To change the audio bitrate, add -ab bitrate to the command used earlier, where bitrate is the bitrate you want to use. See www.mp3-tech.org/tests/gb for information on the sound quality the various bitrates represent. I prefer to encode files with a bitrate between 128–192kbps depending on my needs, but you can put in a higher value if you so desire. Keep in mind, however, that the higher the bitrate you use, the larger the output file size will be. Also keep in mind that if your source file is encoded in a low bitrate, increasing the bitrate won't accomplish much other than increasing the output file size.
Now, getting a CD-quality audio track for the video doesn't really make sense if the video looks like it was taken using a five-year-old Webcam having a bad day. Thankfully, this problem also is easily solved by adding another parameter to the command line.
To change the video bitrate, add the -b bitrate option to the command line. The bitrate here can be any numeric value you like, and I have seen bitrates all the way up to 23,000 (DVD Rips). Although the quality of video encoded with a 23,000kbps bitrate is amazing, the resulting file size of that encoding is also very amazing (a 90-minute video is about 4GB). In my experience, most videos look pretty decent at bitrates between 1,000–1,400, but this is a personal preference, so play with the numbers until you figure out what works for you.
So, to encode a video with a 128kbps audio bitrate and 1,200kbps video stream, we would issue the following command:
ffmpeg -i InputFile.avi -ab 128 -b 1200 OutputFile.mpg
If you are creating a video CD or DVD, FFmpeg makes it even easier by letting you specify a target type. Then, it uses the target type to calculate the format options required automatically. To set a target type, add -target type; type can be vcd, svcd, dvd, dv, pal-vcd or ntsc-svcd on the command line. So, if we were creating a VCD, we would run the following command:
ffmpeg -i InputFile.mpg -target vcd vcd_file.mpg
FFmpeg also has support for encoding audio files. The command to convert audio files is the same as the command to encode video files. To convert a WAV file to a 128kbps MP3 file, issue the following command:
ffmpeg -i Input.wav -ab 128 Output.mp3
Now, the biggest selling point of FFmpeg is that you can customize it to a level that you are comfortable with. So, if all you want to do is convert from one codec to another, and you don't really care about the advanced features, you can stop reading here and still be able to encode/decode videos. On the other hand, if you like to have more control over the encoding, keep reading as we cover more of the advanced options available in FFmpeg.
There are far too many options available in FFmpeg for me to go over each of them here, so I cover some of the ones I found most interesting and leave the rest for you to explore.

Forcing the Use of a Particular Video Codec
There are a times when you will want to encode a video using a particular codec and file format. FFmpeg lets you choose the codec with which you want to encode by adding -vcodec codec to the command line, where codec is the name of the codec you want to use. So if we want to encode using the MPEG-4 codec at 1,200kbps video bitrate and 128kbps audio bitrate, the command looks like this:
ffmpeg -i InputFile.mpg -ab 128 -b 1200 -vcodec mpeg4 OutputFile.avi
Remove the Audio Stream
Let's say you have recorded a video that has a lot of background noise and undesired commentary, so you decide to remove the audio component of the video completely. To accomplish this, all you have to do is add the -an option to the command line, and FFmpeg automatically removes all audio from the output. Keep in mind that using this option negates any other option that affects the audio stream.
So, in our example, to remove the audio component, we would run the following command:
ffmpeg -i InputFile.mpg -an -b 1200 OutputFile.avi
Remove the Video Stream
Let's say you downloaded a news video from the Net that you want to listen to on your iPod on the way to work, but in order to do that, you have to remove the video component from the output file. FFmpeg allows you to remove the video component of the file completely by adding the -vn option to the command line. Using this option negates any other option that affects the video stream.
So, in our example, to remove the video component and save the audio as a 256kbps MP3 file, we would run the following command:
ffmpeg -i InputFile.mpg -vn -ab 256 OutputFile.mp3
Choose between Multiple Audio Streams to Encode the Output File
Many DVDs have multiple language tracks available, and you can choose in which language you want to watch the video. Having multiple audio tracks is cool if you speak multiple languages and want to be able to watch videos in multiple languages. However, if you don't speak multiple languages, the extra audio tracks are useless and are taking up disk space.
FFmpeg lets you choose which streams you want to keep and ignore the rest. The command-line parameter that allows you to map streams is called -map. So, if in our test file, stream 0 is the video stream, stream 1 is the Spanish audio stream and stream 2 is the English audio stream, and we want to keep the English audio in the output file, we would issue the following command:
ffmpeg -i InputFile.mpg -map 0:0 -map 2:1 -b 1200 OutputFile.avi
In my experience, stream 0 in most video files is usually the video stream, and the remaining streams are the audio streams available with the video.
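If you are not sure which stream is which, a quick way to check is to run FFmpeg with only the input file; it prints the input's stream layout and then exits complaining that no output file was specified, which is harmless here:

ffmpeg -i InputFile.mpg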
Conclusion
FFmpeg provides a wide range of options for manipulating and converting video files between a variety of formats. For more information, or to download the latest version of FFmpeg for yourself, please refer to the project Web site.
Suramya Tomar is a Linux system administrator who also likes to program. Visit www.suramya.com for more information on his background.
Reference: http://www.linuxjournal.com/article/8517

Video Editing Magic with ffmpeg


Non-linear video editing tools are great, but they're not always the best tool for the job. This is where a powerful tool like ffmpeg becomes useful. This tutorial by Elliot Isaacson covers the basics of transcoding video, as well as more advanced tricks like creating animations, screen captures, and slow motion effects.
Reference: http://www.linuxjournal.com/video/linux-howto-video-editing-magic-ffmpeg


How to Install FFmpeg in Linux ~The Easy Way~

SkyHi @ Thursday, December 31, 2009

FFmpeg is essential if you are planning to run a video website with streaming and conversion of video files to different video formats. This tutorial is intended for CentOS/Red Hat versions of Linux, where any novice user can install FFmpeg without compiling the source, which is the more traditional way of installing FFmpeg on Linux servers. In this tutorial I will show you the easy way to install ffmpeg and ffmpeg-php (the PHP extension) with just yum, rather than compiling ffmpeg from source files.

 

The following software will be used in this setup:

FFmpeg (http://ffmpeg.mplayerhq.hu)
Mplayer + Mencoder (http://www.mplayerhq.hu/design7/dload.html)
Flv2tool (http://inlet-media.de/flvtool2)
Libogg + Libvorbis (http://www.xiph.org/downloads)
LAME MP3 Encoder (http://lame.sourceforge.net)
FlowPlayer - A Free Flash Video Player - http://flowplayer.org/

Installing FFMpeg

yum install ffmpeg ffmpeg-devel

If you get "package not found", then you will need to add a few lines to the yum repository configuration for the DAG packages. Create a file named dag.repo in /etc/yum.repos.d with the following contents:

[dag]
name=Dag RPM Repository for Red Hat Enterprise Linux
baseurl=http://apt.sw.be/redhat/el$releasever/en/$basearch/dag
gpgcheck=1
enabled=1

then

yum install ffmpeg ffmpeg-devel

If everything is fine, the installation should proceed smoothly. If not, you will get something like a "GPG public key missing" warning.

Common Errors

To fix rpmforge GPG key warning:

rpm -Uhv http://apt.sw.be/redhat/el5/en/i386/rpmforge/RPMS/rpmforge-release-0.3.6-1.el5.rf.i386.rpm

For more information, refer to this FAQ for your CentOS version

Missing Dependency Error:

If you get missing dependency errors like the ones shown below in the middle of the ffmpeg installation

Error: Missing Dependency: libc.so.6(GLIBC_2.4) is needed by package ffmpeg
Error: Missing Dependency: libtheora.so.0(libtheora.so.1.0) is needed by package ffmpeg
Error: Missing Dependency: rtld(GNU_HASH) is needed by package ffmpeg
Error: Missing Dependency: libc.so.6(GLIBC_2.4) is needed by package imlib2
Error: Missing Dependency: rtld(GNU_HASH) is needed by package a52dec
Error: Missing Dependency: rtld(GNU_HASH) is needed by package imlib2
Error: Missing Dependency: rtld(GNU_HASH) is needed by package gsm
Error: Missing Dependency: libc.so.6(GLIBC_2.4) is needed by package x264
Error: Missing Dependency: rtld(GNU_HASH) is needed by package xvidcore
Error: Missing Dependency: libc.so.6(GLIBC_2.4) is needed by package lame
Error: Missing Dependency: libc.so.6(GLIBC_2.4) is needed by package a52dec
Error: Missing Dependency: rtld(GNU_HASH) is needed by package faad2
Error: Missing Dependency: rtld(GNU_HASH) is needed by package x264
Error: Missing Dependency: rtld(GNU_HASH) is needed by package lame
Error: Missing Dependency: libc.so.6(GLIBC_2.4) is needed by package xvidcore
Error: Missing Dependency: libc.so.6(GLIBC_2.4) is needed by package faac
Error: Missing Dependency: libc.so.6(GLIBC_2.4) is needed by package faad2
Error: Missing Dependency: libgif.so.4 is needed by package imlib2
Error: Missing Dependency: rtld(GNU_HASH) is needed by package faac
Error: Missing Dependency: libc.so.6(GLIBC_2.4) is needed by package gsm
Error: Missing Dependency: libpng12.so.0(PNG12_0) is needed by package imlib2
Error: Missing Dependency: rtld(GNU_HASH) is needed by package libmp4v2
Error: Missing Dependency: libc.so.6(GLIBC_2.4) is needed by package libmp4v2

then most commonly you have glibc 2.3 installed instead of glibc 2.4 (the GLIBC_2.4 symbols in the errors come from the GNU C library, glibc). To check the current version installed on your server, just use

yum list glib*

and it should list the installed glibc (and glib) package versions.
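If you prefer to query the RPM database directly, this should report the same information (it simply asks rpm which glibc package is installed):

rpm -q glibc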

The reason I was getting this error was that my rpmforge repository was pointing at the CentOS 5 packages instead of CentOS 4.6.

To fix dependency error:

To fix this error, check that your rpmforge packages match the release of your existing CentOS version.
Check the file /etc/yum.repos.d/rpmforge.repo; for CentOS 4.6 (Final) it should look like the example below. If you have lines like http://apt.sw.be/redhat/el5/en/mirrors-rpmforge, you will need to change rpmforge.repo as shown below

Note: Backup the original rpmforge.repo file before you edit its content.

[rpmforge]
name = Red Hat Enterprise $releasever - RPMforge.net - dag
#baseurl = http://apt.sw.be/redhat/el4/en/$basearch/dag
mirrorlist = http://apt.sw.be/redhat/el4/en/mirrors-rpmforge
#mirrorlist = file:///etc/yum.repos.d/mirrors-rpmforge
enabled = 1
protect = 0
gpgkey = file:///etc/pki/rpm-gpg/RPM-GPG-KEY-rpmforge-dag
gpgcheck = 1

To find out what Linux distribution and version you are running

cat /etc/redhat-release

Once this is done, run yum install ffmpeg again.

This trick resolved the problem on my Linux box running CentOS 4.6, and it is the only way I found to install ffmpeg using yum.

To check that FFmpeg is working:

Finally, check whether ffmpeg is working or not.

> ffmpeg
> ffmpeg -formats
> ffmpeg --help
// These print FFmpeg's version and build configuration, the supported formats/codecs, and the available options


ffmpeg -i Input.file Output.file

To check what audio/video formats are supported

ffmpeg -formats > ffmpeg-format.txt

Open ffmpeg-format.txt to see the output

D means decode
E means encode
V means video
A means audio
T means truncated

Install FFMPEG-PHP Extension

FFmpeg-php is a very good extension and wrapper for PHP which can pull useful information about video files through an API interface. In order to install it, you will need to download the source file and then compile and install the extension on your server. You can download the source tarball here: http://ffmpeg-php.sourceforge.net/

wget /path/to/this/file/ffmpeg-php-0.5.2.1.tbz2

tar -xjf ffmpeg-php-0.5.2.1.tbz2

cd ffmpeg-php-0.5.2.1

phpize

./configure
make
make install

Common Errors

1. If you get a "command not found" error for phpize, then you will need to run yum install php-devel

2. If you get an error like "ffmpeg headers not found" while configuring the source:

configure: error: ffmpeg headers not found. Make sure ffmpeg is compiled as shared libraries using the --enable-shared option

then it means you have not installed ffmpeg-devel packages.

To Fix: Just install ffmpeg-devel using

yum install ffmpeg-devel

3. If you get a "shared libraries not found" error and the program halts in the middle, then you must specify the ffmpeg install path explicitly to ./configure.

configure: error: ffmpeg shared libraries not found. Make sure ffmpeg is compiled as shared libraries using the --enable-shared option

To Fix:

1. First find out the ffmpeg install path; the configuration banner that ffmpeg prints on startup (for example when you run ffmpeg --help) includes the --prefix it was built with. The prefix may be something like /usr/local/cpffmpeg
2. Configure ffmpeg-php with the --with-ffmpeg option

./configure --with-ffmpeg=/usr/local/cpffmpeg

That should resolve the problem!

Editing PHP.INI

Once you have done that without any problems, you will see the PHP extension file /usr/local/lib/php/extensions/no-debug-non-zts-20060613/ffmpeg.so, and you will need to mention that extension in the php.ini file

nano /usr/local/lib/php.ini

Put the below two lines at the end of the php.ini file

[ffmpeg]
extension=ffmpeg.so

Then restart the web server: service httpd restart

To check whether ffmpeg is enabled in PHP, point your browser to the test.php file. It should show the installed ffmpeg PHP extension in the output

// #test.php

<?php

phpinfo();

?>

If ffmpeg does not show up in the phpinfo() test, make sure that the php.ini path to ffmpeg.so is correct. If the problem still occurs, the reason could be that you are using an older version of ffmpeg-php, which is buggy. Just download the latest version of the ffmpeg-php source and compile it.
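A quick way to double-check from the shell, assuming the command-line PHP uses the same php.ini edited above, is to list the loaded modules and look for ffmpeg:

php -m | grep -i ffmpeg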

Installing Mplayer + Mencoder

Just issue the following yum commands to install the rest of the packages.

yum install mplayer mencoder

Installing FlvTool2

Flvtool2 is a flash video file manipulation tool. It can calculate metadata and can cut and edit cue points for flv files.

If you are on CentOS 5, try yum install flvtool2 with the DAG repository; if you get "package not found", you will need to manually download and compile flvtool2. You can download the latest version of flvtool2 here: http://rubyforge.org/projects/flvtool2/

wget <url-link>

# extract the archive you downloaded and cd into the resulting directory, then:
ruby setup.rb config
ruby setup.rb setup
sudo ruby setup.rb install

If you get a "command not found" error, it probably means that you don't have Ruby installed.

yum install ruby

That's it! Once ffmpeg works fine with the PHP extension, download a sample video, convert it to .flv format on the command line and plug it into FlowPlayer to see it work in your web browser. Also try downloading the video file for offline playback and see whether the converted flv file works well with both audio and video.
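A minimal conversion along those lines might look like this (sample.avi is just a placeholder input; tweak the bitrates and frame size to taste, and the flvtool2 step assumes its -U option for calculating and injecting metadata):

ffmpeg -i sample.avi -ar 22050 -ab 64 -b 500 -s 320x240 sample.flv
flvtool2 -U sample.flv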

Useful Links

FFmpeg (http://ffmpeg.mplayerhq.hu)
Mplayer + Mencoder (http://www.mplayerhq.hu/design7/dload.html)
Flv2tool (http://inlet-media.de/flvtool2)
Libogg + Libvorbis (http://www.xiph.org/downloads)
LAME MP3 Encoder (http://lame.sourceforge.net)
FlowPlayer - A Free Flash Video Player - http://flowplayer.org/

Install FFmpeg from Compiling Source (Tutorial Link)
Nice FFmpeg Installation Tutorial (click here)
Important Audio Codecs (http://www.mplayerhq.hu/DOCS/HTML/en/audio-codecs.html)
Common Errors & Fixes while Installing FFmpeg (click here)



Reference: http://www.mysql-apache-php.com/ffmpeg-install.htm



Converting video & audio files using ffmpeg in GNU/Linux

SkyHi @ Thursday, December 31, 2009

A few days ago I downloaded a video file (.flv) from a website. I wanted to convert the video from .flv to .avi (I know it's not a free format, but I needed to). I searched the Internet and found out about FFmpeg. FFmpeg is a command line tool used to convert multimedia files between formats. Not only does it convert video files, it also converts audio files.

To make sure FFmpeg is already  installed on your GNU/Linux distribution, open the package manager you use and search for ffmpeg. Install it if it’s not already installed. I usually use the Konsole (Kubuntu’s Terminal), so I typed:

$ sudo apt-get install ffmpeg

Ok, now we have the program, but how does it work?

To begin with, open up a terminal. I understand that lots of people are afraid of it, bored by it, or simply don't like using it, but once you get the hang of it, you can perform many useful tasks.

The generic syntax is:

$ ffmpeg [[infile options][`-i' infile]]… {[outfile options] outfile}…

Keep in mind that options that precede the infile are applied to the infile, and options placed between the infile and the outfile are applied to the outfile.

In my case, I had a .flv file and I wanted to convert it to .avi. So the way to do it is:

$ ffmpeg -i infile.flv outfile.avi

Done! Simple as that! You will find the .avi file in the same folder!

Keep in mind that by default, FFmpeg tries to convert as losslessly as possible, attempting to keep the best quality for the output file. It uses the same audio and video parameters for the outputs as the one specified for the inputs.

If you want to see what formats and codecs are available on your PC, just type:

$ ffmpeg -formats

FFmpeg supports .3gp, .avi, .mp3, .mp4, .mvi, .rm , .mkv among many others.

  • Some video options that you might find useful are:

-b bitrate where bitrate is the video bitrate in bit/s. The default is  200kb/s

-r fps where fps is the frame rate in Hz. The default value is 25Hz.

-s size where size is the frame size. The default in FFmpeg is the same as the source. The format is ‘wxh’ (width x height), for example 160x128; you can also use a named size abbreviation such as vga (see the video options link below for all sizes).

-aspect aspect where aspect is the aspect ratio with values 4:3, 16:9 or 1.3333 or 1.7777.

-vcodec codec where codec is the name of the video codec that you want the FFmpeg to use.

  • Some audio options that you might find useful:

-ar frequency where frequency is the audio sampling frequency. The default value is 44100 Hz.

-ab bitrate where bitrate is the audio bitrate in bit/s. The default is 64k.

-acodec codec where codec is the name of the audio codec that you want the FFmpeg to use.

More details can be found here:

  1. Main Options
  2. Audio Options
  3. Video Options

Following are some examples mentioned on ffmpeg documentation.

Example 1:

ffmpeg -i input.avi -b 64k output.avi

With the -b 64k option we set the output file's video bitrate to 64 kbit/s

Example 2:

ffmpeg -r 1 -i input.m2v -r 24 output.avi

Using -r 1 we force the frame rate of the input file (valid for raw formats only) to 1 fps, and with -r 24 we set the frame rate of the output file to 24 fps.

Example 3:

ffmpeg -i test.wav -acodec mp3 -ab 64k test.mp3

In this example we can see the use of FFmpeg for audio files. We convert a .wav file to .mp3. As said before, using -acodec mp3 we force FFmpeg to use the mp3 audio codec to create the output file. The -ab 64k tells ffmpeg to use an audio bitrate of 64k.

Example 4:

ffmpeg -i video.avi -ar 22050 -ab 32 -s 640x480 video.mp4

Using -ar we set the audio sampling rate to 22050 Hz. The -ab option, as mentioned above, sets the audio bitrate (here 32k) and -s sets the frame size to 640x480. Here, instead of 640x480, we could have used vga like this:

ffmpeg -i video.avi -ar 22050 -ab 32 -s vga video.mp4

Remember to see the video options link for all available sizes.

Conclusion: With FFmpeg you can manipulate and convert video and audio files between a variety of formats, fast and easy, with just a simple command. Remember not to hesitate to use the terminal!

6 Responses to “Converting video & audio files using ffmpeg in GNU/Linux”

  1. tuxhelper said:

    convert .flv to .mpg
    ffmpeg -i yourfile.flv -ab 56 -ar 22050 -b 500 -s 320x240 yourfile.mpg

    convert .3gp to .avi
    ffmpeg -i file.3gp -f avi -acodec mp3 *.avi

    extract audio .3gp to .mp3
    ffmpeg -y -i *.3gp -ac 1 -acodec mp3 -ar 22050 -f wav *.mp3

    Convert wma to ogg
    ffmpeg -i *.wma -acodec vorbis -aq 100 *.ogg

    Convert wma to mp3
    ffmpeg -i *wma -acodec mp3 -aq 100 *.mp3

    If you have a lot of files to convert run this command in the directory where your wma files reside
    convert wma to ogg
    for i in *.wma; do ffmpeg -i $i -acodec vorbis -aq 100 ${i%.wma}.ogg; done

    convert wma to mp3
    for i in *.wma; do ffmpeg -i $i -acodec mp3 -aq 100 ${i%.wma}.mp3; done

Reference: http://www.mygnulinux.com/?p=56

Why is Internet access and Wi-Fi always so terrible at large tech conferences?

SkyHi @ Thursday, December 31, 2009

Managing your log files

SkyHi @ Thursday, December 31, 2009
Are you responsible for any Linux systems that are important to the running of your business? A web server, database server or mail server, perhaps, or some edge-of-network appliance like a firewall? If so, it's important to monitor the health of these machines, and the log files are perhaps your first port of call. Log files can tell you if things are misconfigured, alert you to break-in attempts, or simply reassure you that all is well.
In this tutorial we'll begin by taking a peek inside a few log files to get a hint about the kind of stuff you'll find there: then we'll move on to examine some tools for summarising and managing the files.
Logged messages fall into two broad types: low-volume messages that you might want to read line by line, and high-volume messages that really need to be summarised. Let's take a look at a couple of examples of the low-volume type first.
e1000: 0000:04:03.0: e1000_probe: (PCI:66MHz:32-bit) 00:0e:0c:4a:42:4e
e1000: eth0: e1000_probe: Intel(R) PRO/1000 Network Connection
e1000: eth0: e1000_watchdog_task: NIC Link is Up 10 Mbps Full Duplex
Exhibit A is taken from the kernel's ring buffer - these three lines show a kernel module discovering and initialising a network interface. Such messages are encouraging if you have added new hardware and are trying to establish if Linux can find it.
Feb  6 15:11:56 shrek named: zone localdomain/IN: loaded serial 42
Feb  6 15:11:56 shrek named: zone localhost/IN: loaded serial 42
Feb  6 15:11:56 shrek named: named.local:9: unknown RR type 'PT'
Exhibit B is taken from that all-purpose log file, /var/log/messages, and shows a DNS server failing to start up because of a syntax error in one of its zone files. Feb 6 15:11:56 is the date of the event; shrek is the name of the computer. Next, named is the service reporting the message; the first two lines tell us that the zones loaded serial 42. The third line goes into detail about the exact error, and tells us the line number and filename in question.
These next three examples are of messages that are likely to occur in large numbers, even on a normal healthy system.
Feb  6 15:08:06 shrek su: pam_unix(su:session):
session opened for user root by pete (uid=571)
Exhibit C, taken from /var/log/secure, shows an ordinary user gaining root access with su. On this particular machine, direct root logins are disabled, forcing a user to reveal their identity (and leave an audit trail) when they want to become root.
Feb  2 18:54:16 shrek sshd[4244]:
Failed password for root from 125.246.84.5 port 39409 ssh2
Exhibit D represents a failed SSH login attempt as root. Taken individually, it could just be a fumble-fingered user mis-typing the password, until you notice that it is one of almost 10,000 such messages occurring at roughly five-second intervals over a period of more than 12 hours...
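When a single message like this shows up thousands of times, a raw count is more useful than reading the lines; on this machine, where sshd logs to /var/log/secure, a quick (and slightly naive) way to gauge the scale of the attack is:
grep -c 'Failed password for root' /var/log/secure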
194.105.66.18 - - [05/Feb/2007:08:31:38 +0000]
"GET /media/images/premiumskin07d/bg.gif HTTP/1.1" 404 320
Finally, exhibit E shows a line from an Apache access log. Of course, you expect your web server to get accessed, so lines in the access log should not come as a great surprise; however, this one is interesting because of the 404 (file not found) response, which probably indicates a broken link on the site.
By the way, if you define a custom log format Apache will be capable of writing much more detailed logs than this, including the contents of any fields in the HTTP request header. In fact, there is a fairly standard extension called the combined log format, which many log file analysis tools can understand (more details at http://httpd.apache.org/docs/1.3/logs.html#combined).
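For reference, the combined format mentioned above is usually defined in the Apache configuration along these lines (this is the stock definition; the log path shown is just a common Red Hat-style default and will vary by distribution):
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-agent}i\"" combined
CustomLog /var/log/httpd/access_log combined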
Generally speaking, you do not want to read log messages such as exhibits C, D, and E line by line. You would prefer to have them summarised for you, providing a report that simply says, for example, 'pete used the su command 26 times to become root'.

Become a champion birler

Without intervention, log files will grow without bound, and much of the data in each file is too old to be useful. To help you keep on top of all this informational lumber and send the old logs downstream, Linux distributions include a tool called logrotate. This, ah, rotates the log files; that is, it periodically divides them into manageable pieces and, eventually, deletes the oldest ones.
Typically, logrotate is run daily as a Cron job. It has its own configuration file specifying which logs to manage - choose to rotate logs after a set period of time (daily, weekly or monthly) or when they exceed whatever size you've specified in the logrotate config file. The old logs can be compressed, and files can be mailed to specified users.
The top-level configuration file for logrotate is /etc/logrotate.conf - let's take a look at it. This file defines some default settings for logrotate, and uses an include directive to include config files for individual services from the directory /etc/logrotate.d. Excluding comment lines, on a Fedora test installation logrotate.conf looks like this (note that the line numbers are just here for reference):
1. weekly
2. rotate 4
3. create
4. include /etc/logrotate.d
5. # no packages own wtmp -- we'll rotate them here
6. /var/log/wtmp {
7.    monthly
8.    create 0664 root utmp
9.    rotate 1
10. }
Line 1 instructs logrotate to rotate the files every week. Line 2 says that the last four log files should be kept; in the context of this example, this means the log files for the last four weeks. As log files are rotated, the current log file (for example /var/log/messages) is renamed as messages.1; the file messages.1 is renamed as messages.2, and so on.
Line 3 tells us that after each log has been rotated, a new log file will be created. You can configure logrotate's handling of each log file separately. For example, you might want to rotate /var/log/messages every day and keep seven generations, but you might want to rotate /var/log/mail.log every week and keep four generations.
On to line 4, which tells logrotate to read all the config files in /etc/logrotate.d. In that directory you'll find a collection of files, each of which relates to a specific service (or to be more accurate, a specific log file). Splitting up the configuration into individual service-specific files in this way allows each service to simply drop its own config file into /etc/logrotate.d as it is installed.
However, it is quite possible to put service-specific configurations (or even the entire configuration) directly into logrotate.conf, and indeed we see an example controlling the rotation of the wtmp file at lines 5-10 of the file.
As an example of a service-specific file, here's /etc/logrotate.d/ppp (also from Fedora). This file controls the rotation of the log file generated by the ppp daemon, which manages dial-up connections.
1. /var/log/ppp/connect-errors {
2.         missingok
3.         compress
4.         notifempty
5.         daily
6.         rotate 5
7.         create 0600 root root
8. }
Let's run through this. Line 1 lists the log files to which the following entries apply. In line 2, logrotate is told not to complain if the log file is missing. Line 3 requests that the old (rotated) log files are to be compressed - in this example they will end up with names of the form connect-errors.1.gz.
In lines 4 and 5, the administrator has configured things so that rotation is skipped if the file is empty, but that otherwise rotation should be done daily (note that this overrides the default setting in logrotate.conf). For this to make any sense, of course, logrotate must be run at least once a day.
The remaining settings should be clear to you by now: line 6 says to retain five generations of the log (again overriding the default setting), while line 7 (also overriding the default) says that after rotating, the log file should be recreated with the specified mode, ownership and group. There are quite a few more logrotate directives to pinpoint how a service's log file is dealt with; consult the logrotate man page for details.
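Incidentally, if you are experimenting with a configuration like this, you do not have to wait for the nightly Cron job. Logrotate can be run by hand; a quick sketch using the ppp config above (the -d flag does a dry run and reports what would happen, -f forces a rotation right now):
logrotate -d /etc/logrotate.d/ppp
logrotate -f /etc/logrotate.conf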
Here we can see the result of log file rotation by listing the directory /var/log. Below, we're looking at the four rotations of the log file maillog:
$ ls -l /var/log/maillog*
-rw------- 1 root root 4112 Jan 19 04:02 /var/log/maillog
-rw------- 1 root root 6584 Jan 14 04:02 /var/log/maillog.1
-rw------- 1 root root 5771 Jan  7 04:02 /var/log/maillog.2
-rw------- 1 root root 7794 Dec 31 04:02 /var/log/maillog.3
-rw------- 1 root root 5561 Dec 24 04:02 /var/log/maillog.4
You can see that all these files were created at 04:02 in the morning (not because a neurotic sysadmin felt the need to run logrotate, but because this is the time that the sysadmin has chosen to run the daily Cron jobs), though you'll notice that the rotation occurs only once a week. Because we are keeping only four weekly generations, any mail logs dated prior to 17 December (when maillog.4 was started) have been deleted. In this example, the files are not compressed.

Watching logs

Although logrotate takes care of the problem of parcelling the log files into manageable pieces, it does not actually report on their contents. There are a number of open source applications (and several proprietary offerings, too) that can help with this.
One of the most widely used is logwatch. This is a simple but useful program that undertakes the tedious activity of reading the log files and summarising them, so that instead of looking through tens of thousands of lines relating to individual page fetches by Apache, you can see text reports or bar charts showing site usage by day, by hour, by URL and so on.
Logwatch is available on most Linux distributions. Like logrotate, it is usually run daily as a Cron job. Typically, it is configured to mail its output to root; indeed, if you log in as root and read your mail on a long-neglected server, you are quite likely to find a long list of daily messages from logwatch waiting for you (and very little else). The heart of logwatch is a Perl script, /usr/share/logwatch/scripts/logwatch.pl.
This is supplemented by a rather confusing hierarchy of config files and a collection of service-specific filters to cut out logged information about the services you're not interested in. On Fedora, logwatch is run daily as a result of a symlink to the main logwatch.pl script in the /etc/cron.daily directory.
Most people will be content to stick with the default configuration of logwatch, but once you've mastered its rather labyrinthine assortment of config files and scripts, there's a lot you can do to configure it, including writing your own filter scripts if you have appropriate programming skills - probably with Perl or PHP.
Directory structure of logwatch
The Directory Structure Of Logwatch diagram above illustrates the main directory hierarchy of logwatch, which is rooted at /usr/share/logwatch. The left-hand ellipse in the figure (the directory /usr/share/logwatch/default.conf) contains the files that define a base-line configuration. The files here are common to all installations of logwatch. The middle ellipse (directory dist.conf) contains subdirectories that parallel those in default.conf. These directories are intended to contain distro-specific configuration.
There is actually a third directory, /etc/logwatch/conf (not shown in the diagram) that again parallels the default.conf directory and is intended for site-specific configuration. The idea is that you should leave the default configuration alone, and use the distro-specific and site-specific configurations to override or extend it, but on our test Fedora system the distro- and site-specific configurations are essentially empty. Not all distros do things quite this way. Red Hat Enterprise Linux, for example, has only one of these directory hierarchies, rooted at /etc/log.d.
Central to the configuration of logwatch is the concept of a 'service'. We feel obliged to put the word into quotes because we are not (necessarily) talking about a single network service such as DNS or FTP that is provided by a specific Linux daemon. In logwatch-speak, a service is simply some source of logging information, identified by a name. All clear?
Because you're probably suffering from exposure out here nattering about services, let's get back into the warmth of a configuration file: specifically, the top-level logwatch file, /usr/share/logwatch/default.conf/logwatch.conf. Here it is, minus comments (but plus line numbers):
1. LogDir = /var/log
2. TmpDir = /var/cache/logwatch
3. MailTo = root
4. Print = No
5. Range = yesterday
6. Detail = Low 
7. Service = All
8. Service = "-zz-network"
9. Mailer = /bin/mail
Line 1 tells logwatch where to look for the log files. We'll refer to this when we see how to define log-file groups in a minute. Lines 3 and 4 explain that the output from logwatch should be mailed to root and not written to standard output.
Line 5 specifies a time range from which to process log entries (valid values are yesterday, today and all), while line 6 determines the level of detail you'd like in the report - the values Low, Med and High are defined, and it is up to the filters for the individual services to decide how to interpret this setting.
If all services are to be processed you'll have line 7 set to All as in the example here, but you can set exceptions, as in line 8, which explicitly excludes the zz-network service. Line 9 tells us where the mailer is.
Before we go any further it's worth mentioning that you can override most of these settings with command-line options when you invoke logwatch. For example, the command
# logwatch --detail High --service cron --print
...processes only the service Cron, at a high level of output detail, with output written to standard output rather than being mailed. Try man logwatch or logwatch --help for details on command line options.

Get into groups

Next on our config file tour we'll stop by at one of the service definition files in default.conf/services. The files in this directory define the services that logwatch knows about. For example, here's the file sshd.conf, minus comments:
1. Title = "SSHD"
2. LogFile = secure
3. LogFile = messages
4. *OnlyService = sshd 
5. *RemoveHeaders
Line 1 defines a title for the service. Any output generated by the filter for sshd will be bracketed thus:
-------- SSHD Begin ---------
--------- SSHD End ----------
In lines 2 and 3, secure and messages are the names of a couple of log file groups. The log file groups simply serve to factor out configuration information that might be common to several service definitions. These groups are defined by files in default.conf/logfiles; in this case, the files secure.conf and messages.conf. We'll look at those next on our tour, but basically, each log file group names one or more files that will be taken as a source of logged messages.
Line 4 introduces a filter, OnlyService, to be run with the parameter sshd. This is one of the shared scripts in the scripts/shared directory; the purpose of this particular filter is to select only those lines that appear to come from sshd. Another shared filter appears in line 5: this one removes the 'garbage' at the beginning of each syslog-style logged message, by which we mean the time stamp, the host name and the facility name - so much flotsam and jetsam to sysadmin lumberjacks.
Time now to visit one of the log file group definitions. We'll take secure.conf as our example. Here it is:
1. LogFile = secure
 2. LogFile = authlog
 3. Archive = secure.*
 4. Archive = secure.*.gz
 5. Archive = archiv/secure.*
 6. Archive = archiv/secure.*.gz
 7. Archive = authlog.*
 8. Archive = authlog.*.gz
 9. *ExpandRepeats
10. *OnlyHost
11. *ApplyStdDate
Lines 1 and 2 specify the names of the actual log files. You can use absolute path names here, but if you don't, the names are taken relative to LogDir as defined in the top-level config file, which you may remember was the directory /var/log. In this example, the contents of both /var/log/secure and /var/log/authlog are passed into the filter chain for this service.
Lines 3-8 define sources of archived logs; these are used only if logwatch is started with the --archives option or if the line Archives = yes appears in the top-level config file.
Lines 9, 10 and 11 specify a further three filters (also in the scripts/shared directory) that will be placed at the head of the filter chain. The ExpandRepeats filter was originally intended to expand the 'last message repeated N times' messages that syslog generates, but the authors have evidently had a change of heart and the filter now merely suppresses these messages.
OnlyHosts filters messages according to the host from which they originated - syslog can accept messages from other hosts on the network. This filter is used in conjunction with the --splithosts command line option, which tells logwatch to create a separate report for each host found in the raw logs. Finally, the ApplyStdDate filter discards those lines whose syslog-style time stamps do not lie within the day being processed.
All of this configuration stitches together a processing chain of filters, which runs under control of the logwatch.pl script, as shown below. The shared filters shown here live in /usr/share/logwatch/scripts/shared; the service-specific script is in /usr/share/logwatch/scripts/services. It is the service-specific scripts that ultimately generate output.
The chain of filters for log data.

One of our files is missing

If you want to see logwatch in action you can run it directly from the command line, courtesy of another symbolic link in /usr/sbin. You will need to specify the --print option to see the output, because the default action is to mail it to root. Here's a sample run:
########### Logwatch 7.2.1 (01/18/06) #############
      Processing Initiated: Thu Feb  1 04:02:02 2007
      Date Range Processed: yesterday
                            ( 2007-Jan-31 )
                            Period is day.
    Detail Level of Output: 0
            Type of Output: unformatted
         Logfiles for Host: www.example.com
 ###################################################
 
 ------------------- httpd Begin -------------------

 Requests with error response codes
    400 Bad Request
       /: 1 Time(s)
    404 Not Found
       /media/images/premiumskin07d/bg.gif: 51 Time(s)
       /favicon.ico: 3 Time(s)
  -------------------- httpd End --------------------
 
 ------------------ pam_unix Begin ----------------- 
 sshd:
    Authentication Failures:
       unknown (61.19.233.102): 27 Time(s)
       admin (61.19.233.102): 3 Time(s)
       admin (61.136.58.249): 1 Time(s)
       root (61.136.58.249): 1 Time(s)
       unknown (61.136.58.249): 1 Time(s)
    Invalid Users:
       Unknown Account: 28 Time(s)
  su:
    Authentication Failures:
       pete(571) -> root: 1 Time(s)
    Sessions Opened:
       pete(uid=571) -> root: 3 Time(s)
What are we to make of this output? Well, we appear to have a missing file on our website. Judging by the name, it's probably just cosmetic. Second, there have been a number of failed attempts to log in to sshd (sshd and Apache are the only services enabled on this particular machine).
Thirdly, user Pete has logged in as root three times (and apparently fumbled the password once; we all do it, Pete). Because direct root logins are disabled on the machine, all attempts (successful and unsuccessful) to become root will leave an audit trail like the one shown here. All in all, there's nothing very exciting here. Just the daemons going about their day-to-day business.
Logwatch is not the only fish in this particular sea. Debian users would probably draw our attention to logcheck, which filters out 'normal' lines from log files on the basis of a regular expression match against a database of regular expressions, then mails the remaining 'abnormal' lines to the system administrator. And of course, if you have decent Perl or PHP programming skills, these kinds of tools are not that hard to custom build.
Well, there you are. Log files. We warned you they aren't exciting! But on a rainy afternoon, if you've finished the paper, read your copy of Linux Format magazine from cover to cover, and need an excuse not to shampoo the cat, you could always disappear into your office and read a few.

Web analytics

When it comes to the analysis of web server log files, you could use one of the tools at the top end of the scale such as Google Analytics and Clicktracks. These are intended to help the marketing department figure out how people travel through your site and how they got there in the first place.
The open-source Webalizer (www.mrunix.net/webalizer) is a more traditional tool, which simply feeds off the log files written by Apache. What you'll get are usage statistics in HTML format for viewing with a browser. There are yearly, monthly, daily and hourly usage stats, and it can also display usage by site, URL, referrer, user agent (ie browser), username, search strings, entry/exit pages and country.
The Webalizer: each of the underlined links in the table leads to a more detailed analysis for that month, including day-by-day and hour-by-hour usage charts, and a list of popular URLs.
Usefully, The Webalizer maintains its own 'running total' summaries of site activity so that it can present statistics that reach further back in time than your log file rotation regime would preserve. It also supports FTP transfer logs from the wu-ftpd server, and will automatically uncompress gzipped log files if necessary.

Reference: http://tuxradar.com/content/managing-your-log-files

HOWTO-Customize-LogWatch

SkyHi @ Thursday, December 31, 2009
Logwatch is a customizable log analysis system. 
Logwatch parses through your system's logs for a given period of time and creates a report analyzing areas that you specify, in as much detail as you require. Logwatch is easy to use and will work right out of the package on most systems. 
1. Login to your server as root via SSH. 

2. Load the logwatch configuration file
Type: pico -w /etc/log.d/conf/logwatch.conf 

3. Search for where the log files are mailed to.
Press: CTRL-W
Type: MailTo 
Set the e-mail address to an off-server account, so in case you get hacked they cannot delete the mail without hacking at least two servers.
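For example, the finished line might look something like this (the address is just a placeholder for your own off-server mailbox):
MailTo = logwatch-reports@offsite.example.com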

4. Now let's change which actions you are alerted about.
CTRL-W
Type: Detail 

5. Detail = Low
Change that to Medium, or High.
I suggest high, because you will get more detailed logs with all actions. 

6. Once you are done, let's exit and save:
CTRL-X then Y then Enter 





LogWatch is a customizable, pluggable log-monitoring system. It will go through your logs for a given period of time and make a report in the areas that you wish, with the detail that you wish. Easy to use: it works right out of the package on almost all systems. This utility is not a daemon; it is a Perl script, usually run as a daily cron job, that reports the most important aspects of your log files via email.
Newcomers to the Linux world can just accept the default configuration, read the messages that are sent to their mailbox and get a view of the status of the box (security alerts, disk space available, failed SSH log-in attempts and so on). It is also possible to run logwatch from the terminal with parameters and display the result directly on the terminal, so there is no need to wait for the report until it is executed from the cron job (usually once a day). The daily report is set up via a symlink in /etc/cron.daily:
0logwatch -> /usr/share/logwatch/scripts/logwatch.pl
If you prefer to customise logwatch, you have to work through a rather confusing hierarchy of config files and a collection of service-specific filters to cut out logged information about services you're not interested in. On CentOS 5.x, logwatch runs daily as a result of a symlink to the main logwatch.pl script in the /etc/cron.daily directory.
Most people will be content to stick with the default configuration of logwatch, but once you’ve mastered its rather labyrinthine assortment of config files and scripts, there’s a lot you can do to configure it, including writing your own filter scripts if you have appropriate programming skills – probably with Perl or PHP.
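To see how this is wired up on your own box, and to generate a one-off report without waiting for cron, something like the following works (the exact symlink name can vary between distributions):

    # Inspect the daily cron entry (name may differ, e.g. 0logwatch)
    ls -l /etc/cron.daily/0logwatch
    # Produce a fresh report for today and print it to the terminal
    logwatch --range today --print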
The most important logwatch files on CentOS 5.x are:
  • /usr/share/logwatch/scripts/logwatch.pl: the script that is run when cron executes the scheduled logwatch job. The /usr/sbin/logwatch command is a symlink to this script.
  • /usr/share/doc/logwatch.xxx/HOWTO-Customize-LogWatch: a must-read file, and not only for those who need to customize logwatch.
  • /etc/logwatch/conf/logwatch.conf: sets the values of all the options (see the table below). These values are used when LogWatch is called without any parameters (i.e. from cron.daily). The file is well documented, and the explanations below also apply to this config file.
  • /etc/logwatch/scripts/services/*: actual filter programs for the various services.
  • /etc/logwatch/scripts/shared/*: filters common to many services and/or logfiles.
  • /etc/logwatch/scripts/logfiles/*: filters specific to particular logfiles.
Configuration file priority:
Logwatch can be highly customized through its configuration files, which are organized in a directory structure.
Logwatch actually uses three directories for its configuration files; all have the same structure but a different priority level:
  • /usr/share/logwatch/default.conf/…: the default configuration shipped with logwatch.
  • /usr/share/logwatch/dist.conf/…: distribution-specific configuration (CentOS 5.x does not ship any configuration here, so the defaults apply if no user-specific configuration is available).
  • /etc/logwatch/…: the place where the user makes custom configuration changes.
Think of it like CSS specificity in HTML. The highest-priority level is the /etc/logwatch/… directory. If a setting is not present there, the next level (the distribution-specific configuration) takes effect, and if there is no second level either, the last level (logwatch's default configuration) applies.
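For example (a sketch only; sshd is just an illustrative service), copying a shipped default into the /etc/logwatch tree and editing it there gives your changes the highest priority while leaving the defaults untouched:

    # Illustrative: override the sshd service configuration locally
    mkdir -p /etc/logwatch/conf/services
    cp /usr/share/logwatch/default.conf/services/sshd.conf /etc/logwatch/conf/services/
    pico -w /etc/logwatch/conf/services/sshd.conf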
Launching logwatch from the terminal: as mentioned previously, a fresh report can be generated from the terminal. These are the most useful parameters that can be passed to the Perl script.
Logwatch terminal options
  • --usage or --help: displays usage information.
  • --detail level: the detail level of the report; level can be high, med or low.
  • --logfile log-file-group: forces LogWatch to process only the set of logfiles defined by log-file-group (i.e. messages, xferlog, …). LogWatch will therefore process all services that use those logfiles. This option can be specified more than once to specify multiple logfile groups.
  • --service service-name: forces LogWatch to process only the service specified in service-name (i.e. login, pam, identd, …). LogWatch will therefore also process any log-file-groups necessary to process these services. This option can be specified more than once to specify multiple services. A useful service-name is All, which processes all services (and logfile groups) for which you have filters installed.
  • --print: prints the results to stdout (i.e. the screen).
  • --mailto address: mails the results to the email address or user specified in address.
  • --archives: each log-file-group has basic logfiles (i.e. /var/log/messages) as well as archives (i.e. /var/log/messages.? or /var/log/messages.?.gz). This option makes LogWatch search through the archives in addition to the regular logfiles. The entries must still be in the proper date range (see --range below) to be processed, however.
  • --range range: specifies a date range to process. This option is currently limited to Yesterday, Today and All.
  • --debug level: for debugging purposes; level can range from 0 to 100. This will really clutter up your output, so you probably don't want to use it.
  • --save file-name: saves the output to file-name instead of displaying or mailing it.
  • --logdir directory: looks in directory for log files instead of the default directory.
  • --hostname hostname: uses hostname for the reports instead of this system's hostname. In addition, if HostLimit is set in /etc/log.d/logwatch.conf, then only logs from this hostname will be processed (where appropriate).
Examples
  • logwatch --service ftpd-xferlog --range all --detail high --print --archives
    This will print out all FTP transfers that are stored in all current and archived xferlogs.
  • logwatch --service pam_pwdb --range yesterday --detail high --print
    This will print out login information for the previous day.
  • logwatch --print
    Forces logwatch to run immediately and print the results to the terminal.
  • logwatch
    Runs logwatch exactly as if it had been launched from cron, so your mailbox will receive the report.
  • logwatch --detail high --logfile secure --print
    Scans the "secure" log and prints the report to the terminal.
  • logwatch --detail high --logfile messages --mailto yourname@domain.com
    Scans the "messages" log and sends the report to a custom email address.






Keep in mind that you can find some Logwatch documentation at /usr/share/doc/logwatch-*/HOWTO-Customize-LogWatch, and it contains a few useful examples.
  1. On RHEL/CentOS/SL, the default logwatch configuration is under /usr/share/logwatch/default.conf/logwatch.conf
    These settings can be overridden by placing your local configuration under /etc/logwatch/conf/logwatch.conf. Place the following in that file to tell logwatch to completely ignore services like 'httpd' and the daily disk usage checks:
    # Don't spam about the following Services
    Service = "-http"
    Service = "-zz-disk_space"
    
  2. Sometimes I don't want to completely disable logwatch for a specific service; I just want to fine-tune the results to make them less noisy. /usr/share/logwatch/default.conf/services/*.conf contains the default configuration for the services. These parameters can be overridden by placing your local configuration under /etc/logwatch/conf/services/$SERVICE.conf. Unfortunately, logwatch's ability here is limited, and many of the logwatch executables are full of undocumented Perl. Your choice is to replace the executable with something else, or try to override some settings using /etc/logwatch/conf/services.
    For example, I have a security scanner which runs scans across the network. As the tests run, the security scanner generates many error messages in the application logs. I would like logwatch to ignore errors from my security scanners, but still notify me of attacks from other hosts. This is covered in more detail at Logwatch: Ignore certain IPs for SSH & PAM checks?. To do this, I place the following under /etc/logwatch/conf/services/sshd.conf:
    # Ignore these hosts
    *Remove = 192.168.100.1
    *Remove = X.Y.123.123
    # Ignore these usernames
    *Remove = testuser
    # Ignore other noise. Note that we need to escape the ()
    *Remove = "pam_succeed_if\(sshd:auth\): error retrieving information about user netscan.*
    
    "
  3. logwatch also allows you to strip out output from the logwatch emails by placing regular expressions in /etc/logwatch/conf/ignore.conf. HOWTO-Customize-LogWatch says:
    ignore.conf: This file specifies regular expressions that, when matched by the output of logwatch, will suppress the matching line, regardless of which service is being executed.
    However, I haven't had much luck with this. My requirements need a conditional statement, something like: 'If there are security warnings due to my security scanner, then don't print the output. But if there are security warnings from my security scanner and from some bad guys, then print the useful parts: the header that says "Failed logins from:" and the IPs of the bad hosts, but not the IPs of the scanners.' (A short ignore.conf sketch follows after this list.)
  4. Nip it at the source (As suggested by @user48838). These messages are being generated by some application, and then Logwatch is happily spewing the results to you. In these cases, you can modify the application to log less.
    This isn't always desirable, because sometimes you want the full logs to be sent somewhere (to a Central syslog server, central IDS server, Splunk, Nagios, etc.), but you don't want logwatch to email you about this from every server, every day.
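As a minimal sketch of the ignore.conf approach from point 3 (the patterns below are purely illustrative, not taken from the original post), each line is a regular expression that suppresses any matching line of logwatch output:

    # /etc/logwatch/conf/ignore.conf -- illustrative patterns only
    # Suppress any output line mentioning a hypothetical scanner host
    netscan\.example\.com
    # Suppress lines about a hypothetical test account
    testuser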


REFERENCES
http://www.logwatch.org/tabs/docs/HOWTO-Customize-LogWatch.html
http://tournasdimitrios1.wordpress.com/2011/01/15/logwatch-system-log-analyzer-and-reporter-for-linux/
http://serverfault.com/questions/293226/linux-logwatch8-is-too-noisy-how-can-i-control-the-noise-level