/usr/lib/python2.7/dist-packages/haproxy_log_analysis-2.0b0.egg-info/PKG-INFO is in python-haproxy-log-analysis 2.0~b0-1.
Metadata-Version: 1.1
Name: haproxy-log-analysis
Version: 2.0b0
Summary: Haproxy log analyzer that tries to give insight into what's going on
Home-page: https://github.com/gforcada/haproxy_log_analysis
Author: Gil Forcada
Author-email: gforcada@gnome.org
License: GPL v3
Description: .. -*- coding: utf-8 -*-
HAProxy log analyzer
====================
This Python package is a `HAProxy`_ log parser.
It analyzes HAProxy log files in multiple ways (see commands section below).
.. note::
Currently only the `HTTP log format`_ is supported.
Tests and coverage
------------------
No project is trustworthy if it does not have tests and decent coverage!
.. image:: https://travis-ci.org/gforcada/haproxy_log_analysis.svg?branch=master
:target: https://travis-ci.org/gforcada/haproxy_log_analysis
:alt: Tests
.. image:: https://coveralls.io/repos/gforcada/haproxy_log_analysis/badge.svg?branch=master
:target: https://coveralls.io/github/gforcada/haproxy_log_analysis
:alt: Coverage
.. image:: https://img.shields.io/pypi/dm/haproxy_log_analysis.svg
:target: https://pypi.python.org/pypi/haproxy_log_analysis/
:alt: Downloads
.. image:: https://img.shields.io/pypi/v/haproxy_log_analysis.svg
:target: https://pypi.python.org/pypi/haproxy_log_analysis/
:alt: Latest Version
.. image:: https://img.shields.io/pypi/status/haproxy_log_analysis.svg
:target: https://pypi.python.org/pypi/haproxy_log_analysis/
:alt: Egg Status
.. image:: https://img.shields.io/pypi/l/haproxy_log_analysis.svg
:target: https://pypi.python.org/pypi/haproxy_log_analysis/
:alt: License
Documentation
-------------
See the `documentation and API`_ at ReadTheDocs_.
Command-line interface
----------------------
The current ``--help`` looks like this::
usage: haproxy_log_analysis [-h] [-l LOG] [-s START] [-d DELTA] [-c COMMAND]
[-f FILTER] [-n] [--list-commands]
[--list-filters]
Analyzes HAProxy log files and outputs statistics about them
optional arguments:
-h, --help show this help message and exit
-l LOG, --log LOG HAProxy log file to analyze
-s START, --start START
Process log entries starting at this time, in HAProxy
date format (e.g. 11/Dec/2013 or
11/Dec/2013:19:31:41). At least provide the
day/month/year. Values not specified will use their
base value (e.g. 00 for hour). Use in conjunction with
-d to limit the number of entries to process.
-d DELTA, --delta DELTA
Limit the number of entries to process. Express the
time delta as a number and a time unit, e.g.: 1s, 10m,
3h or 4d (for 1 second, 10 minutes, 3 hours or 4
days). Use in conjunction with -s to only analyze
certain time delta. If no start time is given, the
time on the first line will be used instead.
-c COMMAND, --command COMMAND
List of commands, comma separated, to run on the log
file. See -l to get a full list of them.
-f FILTER, --filter FILTER
List of filters to apply on the log file. Passed as
comma separated and parameters within square brackets,
e.g. ip[192.168.1.1],ssl,path[/some/path]. See --list-
filters to get a full list of them.
-n, --negate-filter Make filters passed with -f work the other way around,
i.e. if the ``ssl`` filter is passed, instead of showing
only ssl requests it will show non-ssl traffic. If the
``ip`` filter is used, then all but the ip passed to
the filter will be used.
--list-commands Lists all commands available.
--list-filters Lists all filters available.
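The ``-d`` delta format described above (a number followed by a time unit) can be sketched as a small parser. Note that ``parse_delta`` below is a hypothetical helper written for illustration, not the package's actual implementation:

```python
import re
from datetime import timedelta

# Map the documented unit suffixes (s, m, h, d) to timedelta keyword arguments.
DELTA_UNITS = {'s': 'seconds', 'm': 'minutes', 'h': 'hours', 'd': 'days'}

def parse_delta(value):
    """Parse a delta such as '1s', '10m', '3h' or '4d' into a timedelta."""
    match = re.match(r'(\d+)([smhd])$', value)
    if match is None:
        raise ValueError('Invalid time delta: {0}'.format(value))
    amount, unit = match.groups()
    return timedelta(**{DELTA_UNITS[unit]: int(amount)})
```

For example, ``parse_delta('10m')`` yields a ten-minute ``timedelta`` that can be added to the ``--start`` time to compute the end of the analyzed window.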
Commands
--------
Commands are small, purpose-specific programs in themselves that report specific statistics about the log file being analyzed.
See the ``--help`` (or the section above) to know how to run them.
``counter``
Reports how many log lines could be parsed.
``counter_invalid``
Reports how many log lines could *not* be parsed.
``http_methods``
Reports a breakdown of how many requests have been made per HTTP method
(GET, POST...).
``ip_counter``
Reports a breakdown of how many requests have been made per IP.
Note that for this to work you need to configure HAProxy to capture the header that has the IP on it
(usually the X-Forwarded-For header).
Something like:
``capture request header X-Forwarded-For len 20``
``top_ips``
Reports the 10 IPs with most requests (and the amount of requests).
``status_codes_counter``
Reports a breakdown of how many requests per HTTP status code
(404, 500, 200, 301..) are on the log file.
``request_path_counter``
Reports a breakdown of how many requests per path (/rss, /, /another/path).
``top_request_paths``
Reports the 10 paths with most requests.
``slow_requests``
Reports a list of requests that downstream servers took more than 1 second to respond to.
``counter_slow_requests``
Reports the amount of requests that downstream servers took more than 1 second to respond to.
``average_response_time``
Reports the average time (in milliseconds) servers take to answer requests.
.. note:: Aborted requests are not considered.
``average_waiting_time``
Reports the average time (in milliseconds) requests spend waiting on the various HAProxy queues.
``server_load``
Reports a breakdown of how many requests were processed by each downstream server.
Note that currently it does not take into account the backend the server is configured on.
``queue_peaks``
Reports a list of queue peaks.
A queue peak is the highest backend queue value within a run of consecutive log lines that were queued, bounded on both sides by log lines that were not queued.
``connection_type``
Reports on how many requests were made on SSL and how many on plain HTTP.
This command only works if the default port for SSL (443) appears on the path.
``requests_per_minute``
Reports on how many requests were made per minute.
It works best when used with ``-s`` and ``-d`` command line arguments,
as the output can be huge.
``print``
Prints the raw lines.
This can be useful to trim down a file (with ``-s`` and ``-d`` for example) so that later runs are faster.
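The ``queue_peaks`` definition above can be illustrated with a short sketch: given the backend queue size reported on each log line, find the peak of every run of queued lines. The function below is a simplified re-implementation for illustration, not the package's code:

```python
def queue_peaks(queue_sizes):
    """Return the peak of every run of consecutive non-zero queue values.

    ``queue_sizes`` is the backend queue size of each log line, in order.
    """
    peaks = []
    current_peak = 0
    for size in queue_sizes:
        if size > 0:
            current_peak = max(current_peak, size)
        elif current_peak > 0:
            # The queue drained: the run ended, record its peak.
            peaks.append(current_peak)
            current_peak = 0
    if current_peak > 0:
        # The log ended while requests were still queued.
        peaks.append(current_peak)
    return peaks
```

For instance, ``queue_peaks([0, 2, 5, 3, 0, 0, 1, 4, 0])`` reports the peaks ``[5, 4]``: one per run of queued lines.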
Filters
-------
Filters, contrary to commands,
are a way to reduce the amount of log lines to be processed.
.. note::
The ``-n`` command line argument reverses the output of the filters.
This helps when looking for specific traces, like a certain IP, a path...
``ip``
Filters log lines by the given IP.
``ip_range``
Filters log lines by the given IP range
(all IPs that begin with the same prefix).
``path``
Filters log lines by the given string.
``ssl``
Filters log lines that are from SSL connections.
See :meth:`.HaproxyLogLine.is_https` for its limitations.
``slow_requests``
Filters log lines that take at least the given time to get answered
(in milliseconds).
``time_frame``
This is an implicit filter that is used when ``--start`` and, optionally, ``--delta`` are used.
Do not use this filter on the command line, use ``--start`` and ``--delta`` instead.
``status_code``
Filters log lines that match the given HTTP status code (i.e. 404, 200...).
``status_code_family``
Filters log lines that match the given HTTP status code family
(i.e. 4 for all 4xx status codes, 5 for 5xx status codes...).
``http_method``
Filters log lines by the HTTP method used (GET, POST...).
``backend``
Filters log lines by the HAProxy backend the connection was handled with.
``frontend``
Filters log lines by the HAProxy frontend the connection arrived from.
``server``
Filters log lines by the downstream server that handled the connection.
``response_size``
Filters log lines by the response size (in bytes).
Especially useful when looking for big file downloads.
``wait_on_queues``
Filters log lines by the amount of time the request had to wait on HAProxy queues.
A request is accepted if it waited less than the given amount of time.
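Filters and the ``-n`` flag can be thought of as predicates over log lines. The sketch below uses hypothetical names and plain strings instead of the package's actual API, just to show how a filter and its negation could compose:

```python
def filter_ssl(line):
    """Accept lines whose request path contains the default SSL port (443)."""
    return ':443' in line

def negate(filter_func):
    """Invert a filter, mirroring the -n command line flag."""
    return lambda line: not filter_func(line)

lines = [
    'GET https://example.com:443/a',
    'GET http://example.com:80/b',
]
ssl_only = [line for line in lines if filter_ssl(line)]
plain_only = [line for line in lines if negate(filter_ssl)(line)]
```

With ``-n`` the same filter selects the complement: ``ssl_only`` keeps the first line and ``plain_only`` the second.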
Installation
------------
Install with::
$ python setup.py install
After installation you will have a console script ``haproxy_log_analysis``.
TODO
----
- add more commands: *(help appreciated)*
- reports on servers connection time
- reports on termination state
- reports around connections (active, frontend, backend, server)
- *your ideas here*
- think of a way to show the commands output in a meaningful way
- be able to specify an output format. For any command that makes sense (slow
requests for example) output the given fields for each log line (i.e.
acceptance date, path, downstream server, load at that time...)
- *your ideas*
.. _HAProxy: http://haproxy.1wt.eu/
.. _HTTP log format: http://cbonte.github.io/haproxy-dconv/configuration-1.4.html#8.2.3
.. _documentation and API: http://haproxy-log-analyzer.readthedocs.org/en/latest/
.. _ReadTheDocs: http://readthedocs.org
CHANGES
=======
2.0b0 (2016-04-18)
------------------
- Check the divisor before doing a division to not get ``ZeroDivisionError`` exceptions.
[gforcada]
2.0a0 (2016-03-29)
------------------
- Major refactoring:
# Rename modules and classes:
- haproxy_logline -> line
- haproxy_logfile -> logfile
- HaproxyLogLine -> Line
- HaproxyLogFile -> Log
# Parse the log file on Log() creation (i.e. in its __init__)
[gforcada]
1.3 (2016-03-29)
----------------
- New filter: ``filter_wait_on_queues``.
Get all requests that waited at maximum X amount of milliseconds on HAProxy queues.
[gforcada]
- Code/docs cleanups and add code analysis.
[gforcada]
- Avoid using eval.
[gforcada]
1.2.1 (2016-02-23)
------------------
- Support -1 as a status_code
[Christopher Baines]
1.2 (2015-12-07)
----------------
- Allow a hostname on the syslog part (not only IPs)
[danny crasto]
1.1 (2015-04-19)
----------------
- Make syslog optional.
Fixes issue https://github.com/gforcada/haproxy_log_analysis/issues/10.
[gforcada]
1.0 (2015-03-24)
----------------
- Fix issue #9.
log line on the syslog part was too strict,
it was expecting the hostname to be a string and was
failing if it was an IP.
[gforcada]
0.0.3.post2 (2015-01-05)
------------------------
- Finally really fixed issue #7.
``namespace_packages`` was not meant to be on setup.py at all.
Silly copy&paste mistake.
[gforcada]
0.0.3.post (2015-01-04)
-----------------------
- Fix release on PyPI.
Solves GitHub issue #7.
https://github.com/gforcada/haproxy_log_analysis/issues/7
[gforcada]
0.0.3 (2014-07-09)
------------------
- Fix release on PyPI (again).
[gforcada]
0.0.2 (2014-07-09)
------------------
- Fix release on PyPI.
[gforcada]
0.0.1 (2014-07-09)
------------------
- Pickle :class:`.HaproxyLogFile` data for faster performance.
[gforcada]
- Add a way to negate the filters, so that instead of being able to filter by
IP, it can output all but that IP information.
[gforcada]
- Add lots of filters: ip, path, ssl, backend, frontend, server, status_code
and so on. See ``--list-filters`` for a complete list of them.
[gforcada]
- Add :meth:`.HaproxyLogFile.parse_data` method to get data from a data stream.
It allows you to use it as a library.
[bogdangi]
- Add ``--list-filters`` argument on the command line interface.
[gforcada]
- Add ``--filter`` argument on the command line interface, inspired by
Bogdan's early design.
[bogdangi] [gforcada]
- Create a new module :mod:`haproxy.filters` that holds all available filters.
[gforcada]
- Improve :meth:`.HaproxyLogFile.cmd_queue_peaks` output to not only show
peaks but also when requests started to queue, when they finished queuing and
the amount of requests that had been queued.
[gforcada]
- Show help when no argument is given.
[gforcada]
- Polish documentation and docstrings here and there.
[gforcada]
- Add a ``--list-commands`` argument on the command line interface.
[gforcada]
- Generate an API doc for ``HaproxyLogLine`` and ``HaproxyLogFile``.
[bogdangi]
- Create a ``console_script`` `haproxy_log_analysis` for ease of use.
[bogdangi]
- Add Sphinx documentation system, still empty.
[gforcada]
- Keep valid log lines sorted so that the exact order of connections is kept.
[gforcada]
- Add quite a few commands, see `README.rst`_ for a complete list of them.
[gforcada]
- Run commands passed as arguments (with -c flag).
[gforcada]
- Add a requirements.txt file to keep track of dependencies and pin them.
[gforcada]
- Add travis_ and coveralls_ support. See its badges on `README.rst`_.
[gforcada]
- Add argument parsing and custom validation logic for all arguments.
[gforcada]
- Add regular expressions for haproxy log lines (HTTP format) and to
parse HTTP requests path.
Added tests to ensure they work as expected.
[gforcada]
- Create distribution.
[gforcada]
.. _travis: https://travis-ci.org/
.. _coveralls: https://coveralls.io/
.. _README.rst: http://github.com/gforcada/haproxy_log_analysis
Keywords: haproxy log analysis report
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Environment :: Console
Classifier: Intended Audience :: System Administrators
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3.3
Classifier: Programming Language :: Python :: 3.4
Classifier: Topic :: Internet :: Log Analysis