#175 [RFE] Logconv improvements
Closed: wontfix · Opened 10 years ago by rmeggins.

https://bugzilla.redhat.com/show_bug.cgi?id=761614

Description of problem:

logconv.pl currently only produces a summary of operations for a file or for a
requested period.

It would help to spot peaks if some sort of running per-period values could be
generated.


Version-Release number of selected component (if applicable):

redhat-ds-base-8.2.6-1.el5dsrv


Attached is a patch against logconv.pl to add optional generation of per-second
and per-minute statistics in CSV format, allowing for further post-processing.

Adds the following command-line options:

  -m <per second stats file>
  -M <per minute stats file>

One small side effect should be an improvement in speed, since the
time-conversion calls have been optimised to accommodate the stats changes.
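The per-second/per-minute aggregation described above can be sketched as follows. This is an illustrative Python sketch of the approach, not the actual Perl patch, and the function names are hypothetical; caching the timestamp conversion is also where the speed win comes from, since each distinct timestamp string is converted only once.

```python
from collections import Counter

def bucket_stats(timestamps):
    """Bucket access-log timestamps ('DD/Mon/YYYY:HH:MM:SS') per second
    and per minute, memoizing the string-to-key conversion."""
    per_second = Counter()
    per_minute = Counter()
    cache = {}  # one conversion per distinct timestamp string
    for ts in timestamps:
        if ts not in cache:
            # second key is the full timestamp; minute key drops the seconds
            cache[ts] = (ts, ts.rsplit(":", 1)[0])
        sec_key, min_key = cache[ts]
        per_second[sec_key] += 1
        per_minute[min_key] += 1
    return per_second, per_minute

def to_csv(counter):
    """Render the counts in CSV form for further post-processing."""
    return "\n".join(f"{key},{count}" for key, count in sorted(counter.items()))
```

The CSV output can then be fed to a spreadsheet or plotting tool to spot peaks.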

batch move to milestone 1.3

Decided to lump some other improvements into this fix as well:

  • Report the span of log time that was processed:

    Processed Log Time:  15 Days, 2 Hours, 5 Minutes, 51 Seconds
    
  • Add ldap compare and "mod dn" operation stats

  • Improve file processing:

    • Reorder the logs so "access" is processed last
    • Process sub directories correctly
  • Reduced the logging output while processing lines (now every 10000 lines triggers a "Lines Processed" message), and enhanced the output:

Filename Total Lines

[01] /tmp/access.20120116-145119 28400
10000 Lines Processed
20000 Lines Processed
28400 Lines Processed

[02] /tmp/access 20600
10000 Lines Processed
20000 Lines Processed
20600 Lines Processed

Total Log Lines Analysed: 49000

  • Misc code cleanup
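The "Processed Log Time" line above is just a fixed-radix breakdown of the elapsed seconds between the first and last processed log entries. A minimal sketch (Python, hypothetical function name):

```python
def format_log_time(total_seconds):
    """Break an elapsed-seconds count into the Days/Hours/Minutes/Seconds
    form used by the 'Processed Log Time' summary line."""
    days, rem = divmod(total_seconds, 86400)      # 86400 seconds per day
    hours, rem = divmod(rem, 3600)
    minutes, seconds = divmod(rem, 60)
    return f"{days} Days, {hours} Hours, {minutes} Minutes, {seconds} Seconds"
```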

Looks very cool!

This is the doc we have for logconv.pl.

http://docs.redhat.com/docs/en-US/Red_Hat_Directory_Server/9.0/html/Configuration_Command_and_File_Reference/Perl_Scripts.html#ldif2db.pl_Import-logconv.pl_Log_converter

Could you review the doc and note how it should be revised based on your enhancements (if any)? The info would be passed to our doc writer, and she will update the doc on the web...

Thanks for the review, Noriko! I've attached an OpenOffice doc with all the doc changes for logconv.pl.

Thanks,
Mark

[mareynol@localhost src]$ git merge ticket175
Updating a48252b..b8a874a
Fast-forward
ldap/admin/src/logconv.pl | 630 ++++++++++++++++++++++++++++++++++++---------
1 files changed, 510 insertions(+), 120 deletions(-)

[mareynol@localhost src]$ git push origin master
Counting objects: 11, done.
Delta compression using up to 4 threads.
Compressing objects: 100% (5/5), done.
Writing objects: 100% (6/6), 4.83 KiB, done.
Total 6 (delta 4), reused 0 (delta 0)
To ssh://git.fedorahosted.org/git/389/ds.git
a48252b..b8a874a master -> master

Forgot to mention: also added a "Proxied Authenticated Operation" stat, as well as per-second and per-minute rates for the basic operations (search, add, delete, etc.).

Hi Mark,

I really liked the idea of enhancing logconv (we use it daily in a way similar to logwatch), thank you! So I've tested the new logconv.pl (taken from git).

The "Lines Processed .." lines appear as
10001 Lines Processed
20002 Lines Processed
30003 Lines Processed
40004 Lines Processed
50005 Lines Processed
60006 Lines Processed
70007 Lines Processed
80008 Lines Processed
90009 Lines Processed
100010 Lines Processed
etc

In order to have the round numbers the line
if ($iff > $limit){ print STDERR sprintf" %10s Lines Processed\n",$ff; $iff="0";}

should be changed to
if ($iff >= $limit){ print STDERR sprintf" %10s Lines Processed\n",$ff; $iff="0";}
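The behaviour is easy to reproduce. A small sketch (Python, hypothetical names) showing that the strict `>` comparison lets the counter reach limit+1 before resetting, while `>=` resets exactly at the limit:

```python
def processed_messages(total_lines, limit=10000, strict=True):
    """Return the line numbers at which a 'Lines Processed' message fires.
    strict=True mimics the original '>' comparison; False mimics '>='."""
    reported = []
    count = 0
    for line_no in range(1, total_lines + 1):
        count += 1
        fire = count > limit if strict else count >= limit
        if fire:
            reported.append(line_no)
            count = 0  # reset the interval counter
    return reported
```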

The other thing that confused me is the order and number of arguments for the "-m" and "-M" switches.
According to the result of "logconv.pl -h":

./logconv.pl [-h] [-d <rootDN>] [-s <size limit>] [-v] [-V]
[-S <start time>] [-E <end time>]
[-efcibaltnxgju] [ access log ... ... ]

So in order to use "-m/-M" I wrote
./logconv.pl -m /Logs/Ldap/access

It truncated my log file, because with -m the first argument is now treated as the result file instead of the log file. I expected it instead to analyse the file and print the per-second stats to STDOUT. I should have written "./logconv.pl -m res.txt /Logs/Ldap/access" to avoid this.

So I think it would be wise either:
  • to change the "-h" help message, or
  • (better) to write by default to some file like sec-analyze.txt if there is only one argument, or
  • (even better) if there is only one argument and a -m/-M switch is used, to print a warning that the analyzer will not overwrite the log file and do nothing.
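The last suggestion amounts to a simple pre-flight check before any file is opened for writing. A sketch of the idea (Python, hypothetical names, not the actual logconv.pl code):

```python
import os

def check_output_path(stats_file, log_files):
    """Validate the -m/-M output path before opening anything for writing."""
    # Refuse to guess when no log file was given at all.
    if not log_files:
        raise SystemExit("no access log given; refusing to guess an output file")
    # Never allow the stats output to clobber one of the input logs.
    log_paths = {os.path.realpath(p) for p in log_files}
    if os.path.realpath(stats_file) in log_paths:
        raise SystemExit(f"refusing to overwrite log file: {stats_file}")
    return stats_file
```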

Regards,
Andrey Ivanov

Thanks for the feedback Andrey!

For now I've just refined the usage information, as it wasn't very clear that you needed to provide an output file name. I also corrected the log-lines output.

Thanks,
Mark

Also I am planning on doing a "report" stat, with intervals of month, day, hour, minute, and second. This would be output to STDOUT or to a file.

[mareynol@localhost src]$ git merge ticket175
Updating d4eedec..70f1c83
Fast-forward
ldap/admin/src/logconv.pl | 17 +++++++++--------
1 files changed, 9 insertions(+), 8 deletions(-)
[mareynol@localhost src]$ git push origin master
Counting objects: 11, done.
Delta compression using up to 4 threads.
Compressing objects: 100% (5/5), done.
Writing objects: 100% (6/6), 703 bytes, done.
Total 6 (delta 4), reused 0 (delta 0)
To ssh://git.fedorahosted.org/git/389/ds.git
d4eedec..70f1c83 master -> master

Added initial screened field value.

Metadata Update from @mreynolds:
- Issue assigned to mreynolds
- Issue set to the milestone: 1.2.10

5 years ago

389-ds-base is moving from Pagure to Github. This means that new issues and pull requests
will be accepted only in 389-ds-base's github repository.

This issue has been cloned to Github and is available here:
- https://github.com/389ds/389-ds-base/issues/175

If you want to receive further updates on the issue, please navigate to the github issue
and click on subscribe button.

Thank you for understanding. We apologize for all inconvenience.

Metadata Update from @spichugi:
- Issue close_status updated to: wontfix (was: Fixed)

2 years ago
