BinkD-Stats 2.0 - BinkD Log Analyzer
From Stephen Walsh@3:633/280 to All on Mon Oct  6 13:59:54 2025
 
 
Hello everybody!
# BinkD-Stats 2.0
Python rewrite of the BinkD log analyzer with significant improvements.
## Improvements Over Perl Version
### New Features
- **JSON Export** - Export statistics to JSON format for integration with
   other tools
- **CSV Export** - Export statistics to CSV for spreadsheet analysis
- **Better Date Handling** - Uses Python's `datetime` module for more
   accurate date parsing
- **File Count Tracking** - Tracks sent/recv file counts in addition to bytes
- **Improved Error Handling** - Better error messages and exception handling
- **Modern CLI** - Uses argparse for better help and argument parsing
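For context, binkd log timestamps are year-less (`DD MMM HH:MM:SS`), so the script infers the year when parsing; a minimal standalone sketch of the same approach used by the script's `parse_date` method:

```python
import re
from datetime import datetime

MONTHS = {'Jan': 1, 'Feb': 2, 'Mar': 3, 'Apr': 4, 'May': 5, 'Jun': 6,
          'Jul': 7, 'Aug': 8, 'Sep': 9, 'Oct': 10, 'Nov': 11, 'Dec': 12}

def parse_log_date(line):
    """Parse a binkd timestamp (DD MMM HH:MM:SS); the year is inferred."""
    m = re.match(r'[+\-!? ]\s*(\d+)\s+(\w+)\s+(\d+):(\d+):(\d+)', line)
    if not m:
        return None
    day, month_name, hour, minute, second = m.groups()
    month = MONTHS.get(month_name)
    if month is None:
        return None
    year = datetime.now().year
    dt = datetime(year, month, int(day), int(hour), int(minute), int(second))
    if dt > datetime.now():
        # A "future" date must actually be from last year
        dt = dt.replace(year=year - 1)
    return dt
```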
### Technical Improvements
- **More Accurate CPS Calculations** - Properly averages CPS from actual
   transfer speeds
- **Cleaner Code** - Object-oriented design with clear separation of concerns
- **Better Type Safety** - Type hints for better code documentation
- **Cross-platform** - Works on Linux, macOS, and Windows
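To illustrate, "properly averages CPS" means each transfer's reported CPS value is weighted equally, rather than dividing total bytes by total elapsed time; a small sketch with made-up sample values:

```python
def avg_cps(cps_samples):
    """Mean of the per-transfer CPS values reported in rcvd:/sent: lines."""
    return sum(cps_samples) / len(cps_samples) if cps_samples else 0.0

# Hypothetical CPS readings from three individual transfers:
print(f"{avg_cps([22000.0, 21500.0, 23015.88]):.2f}")
```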
### Compatibility
- Produces identical statistics to the Perl version
- Same command-line interface (with enhancements)
- Backward compatible with existing usage
## Usage
### Basic Usage
```bash
# Show all statistics
python3 binkd-stats.py -f /var/log/binkd.log

# Show last 7 days
python3 binkd-stats.py -f /var/log/binkd.log -d 7

# Show last 30 days
python3 binkd-stats.py -f /var/log/binkd.log -d 30
```
### Export Options
```bash
# Export to JSON
python3 binkd-stats.py -f /var/log/binkd.log -d 7 --json stats.json

# Export to CSV
python3 binkd-stats.py -f /var/log/binkd.log -d 7 --csv stats.csv

# Both text and JSON/CSV output
python3 binkd-stats.py -f /var/log/binkd.log --json stats.json --csv stats.csv
```
### Help
```bash
python3 binkd-stats.py --help
```
## Output Format
### Text Report (Console)
```
                         BinkD Connections

         Statistics For Last 7 days, Created Mon Oct  6 12:58:59 2025
-----------------------------------------------------------------------------
Address               Sessions       Sent   Received     CPS In    CPS Out
-----------------------------------------------------------------------------
1:9/36                      15      97.0k          0       0.00    8744.85
3:633/10                   345       3.2k       8.0M   22171.96     650.00
-----------------------------------------------------------------------------
  Total Received :            31.3M     Total Sessions   :            10596
  Total Sent     :            58.2M     Average CPS In   :         12052.24
  Total Traffic  :            89.4M     Average CPS Out  :          6507.43
-----------------------------------------------------------------------------
```
### JSON Export
```json
{
  "start_date": "2025-09-25T16:27:47",
  "end_date": "2025-10-06T12:52:23",
  "days_filter": 7,
  "total_sessions": 10596,
  "nodes": {
    "3:633/10": {
      "sessions": 345,
      "sent_bytes": 3250,
      "recv_bytes": 8380999,
      "sent_files": 5,
      "recv_files": 378,
      "avg_recv_cps": 22171.96,
      "avg_send_cps": 650.0
    }
  }
}
```
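Since the JSON export is plain `json.dump` output, other tools can consume it directly with `json.load`; a minimal sketch (the sample data is inlined here so the snippet stands alone; in practice you would `open('stats.json')`):

```python
import json

# Minimal inline sample in the export's shape (normally read from stats.json)
sample = '''{
  "total_sessions": 10596,
  "nodes": {
    "3:633/10": {"sessions": 345, "recv_bytes": 8380999},
    "1:9/36":   {"sessions": 15,  "recv_bytes": 0}
  }
}'''
data = json.loads(sample)

# Rank nodes by bytes received:
busiest = sorted(data['nodes'].items(),
                 key=lambda item: item[1]['recv_bytes'], reverse=True)
for addr, node in busiest:
    print(f"{addr}: {node['sessions']} sessions, "
          f"{node['recv_bytes']} bytes received")
```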
### CSV Export
```csv
Address,Sessions,Sent_Bytes,Recv_Bytes,Sent_Files,Recv_Files,Avg_Send_CPS,Avg_Recv_CPS
3:633/10,345,3250,8380999,5,378,650.00,22171.96
```
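The CSV export likewise reads back cleanly with Python's `csv.DictReader`; a small sketch with the sample row inlined (in practice you would `open('stats.csv')`):

```python
import csv
import io

# Inline sample matching the export's header and row format
sample = ("Address,Sessions,Sent_Bytes,Recv_Bytes,Sent_Files,Recv_Files,"
          "Avg_Send_CPS,Avg_Recv_CPS\n"
          "3:633/10,345,3250,8380999,5,378,650.00,22171.96\n")
rows = list(csv.DictReader(io.StringIO(sample)))
for row in rows:
    print(row['Address'], int(row['Sessions']), float(row['Avg_Recv_CPS']))
```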
## Requirements
- Python 3.6 or later (no additional dependencies required)
## Credits
- Original Perl version: (c) Netsurge//Frank Linhares % bbs.diskshop.ca
- Python rewrite: Stephen Walsh, 2025
## License
Same as the original Perl version.
I give permission for the Binkd dev team to include BinkD-Stats 2.0 in the binkd source tree if they wish.
=== Cut ===
#!/usr/bin/env python3
"""
BinkD-Stats 2.0 - A BinkD Log Analyzer And Statistics Tool
Python rewrite with improvements
Original Perl version (c) Netsurge//Frank Linhares % bbs.diskshop.ca
Python version improvements, by Stephen Walsh:
- Better date/time handling with datetime module
- JSON/CSV export options
- Per-node detailed statistics
- Better error handling and logging
- More accurate CPS calculations
"""
import re
import sys
import argparse
import json
from datetime import datetime, timedelta
from collections import defaultdict
from pathlib import Path
from typing import List, Tuple, Optional
class BinkDStats:
    """BinkD log file analyzer and statistics generator"""
    def __init__(self, log_file: str, days: int = 0):
        self.log_file = Path(log_file)
        self.days = days
        self.stats = defaultdict(lambda: {
            'sessions': 0,
            'sent_bytes': 0,
            'recv_bytes': 0,
            'sent_files': 0,
            'recv_files': 0,
            'send_cps': [],
            'recv_cps': []
        })
        self.start_date = None
        self.end_date = None
        self.total_sessions = 0
        # Month name mapping
        self.months = {
            'Jan': 1, 'Feb': 2, 'Mar': 3, 'Apr': 4, 'May': 5, 'Jun': 6,
            'Jul': 7, 'Aug': 8, 'Sep': 9, 'Oct': 10, 'Nov': 11, 'Dec': 12
        }
    def parse_date(self, line: str) -> Optional[datetime]:
        """Parse date from log line (format: DD MMM HH:MM:SS)"""
        match = re.match(r'[+\-!? ]\s*(\d+)\s+(\w+)\s+(\d+):(\d+):(\d+)', line)
        if not match:
            return None
        day, month_name, hour, minute, second = match.groups()
        month = self.months.get(month_name)
        if not month:
            return None
        # Determine year (assume current year unless date is in future)
        current_year = datetime.now().year
        try:
            dt = datetime(current_year, month, int(day),
                         int(hour), int(minute), int(second))
            # If date is in future, it must be from previous year
            if dt > datetime.now():
                dt = datetime(current_year - 1, month, int(day),
                            int(hour), int(minute), int(second))
            return dt
        except ValueError:
            return None
    def is_date_valid(self, dt: Optional[datetime]) -> bool:
        """Check if date is within the requested time range"""
        if dt is None:
            return False
        if self.days == 0:
            return True
        cutoff = datetime.now() - timedelta(days=self.days)
        return dt >= cutoff
    def parse_fido_address(self, addr: str) -> Tuple[int, int, int, int]:
        """Parse FidoNet address and return sortable tuple.
        Returns: (zone, net, node, point)
        """
        # Format: zone:net/node[.point][@domain]
        addr = addr.split('@')[0]  # Remove domain
        zone, net, node, point = 0, 0, 0, 0
        if ':' in addr:
            zone_part, rest = addr.split(':', 1)
            zone = int(zone_part) if zone_part.isdigit() else 0
        else:
            rest = addr
        if '/' in rest:
            net_part, node_part = rest.split('/', 1)
            net = int(net_part) if net_part.isdigit() else 0
            if '.' in node_part:
                node_str, point_str = node_part.split('.', 1)
                node = int(node_str) if node_str.isdigit() else 0
                point = int(point_str) if point_str.isdigit() else 0
            else:
                node = int(node_part) if node_part.isdigit() else 0
        return (zone, net, node, point)
    def format_bytes(self, bytes_val: int) -> str:
        """Format bytes into human-readable format"""
        if bytes_val < 1024:
            return f"{bytes_val}"
        elif bytes_val < 1048576:
            return f"{bytes_val / 1024:.1f}k"
        else:
            return f"{bytes_val / 1048576:.1f}M"
    def parse_log(self):
        """Parse the binkd log file"""
        if not self.log_file.exists():
            raise FileNotFoundError(f"Log file not found: {self.log_file}")
        # Regex patterns
        done_pattern = re.compile(
            r'done \(\w+ (\S+)\@\w+, \w+, S/R: (\d+)/(\d+) \((\d+)/(\d+)'
        )
        rcvd_pattern = re.compile(r'rcvd:.*?\((\d+), ([\d.]+) CPS, (\S+)\@')
        sent_pattern = re.compile(r'sent:.*?\((\d+), ([\d.]+) CPS, (\S+)\@')
        with open(self.log_file, 'r', encoding='utf-8', errors='replace') as f:
            for line in f:
                line = line.strip()
                # Parse date
                dt = self.parse_date(line)
                if dt:
                    if self.start_date is None or dt < self.start_date:
                        self.start_date = dt
                    if self.end_date is None or dt > self.end_date:
                        self.end_date = dt
                # Check if line is within date range
                if not self.is_date_valid(dt):
                    continue
                # Parse "done" lines for session summary
                match = done_pattern.search(line)
                if match:
                    addr = match.group(1)
                    sent_files = int(match.group(2))
                    recv_files = int(match.group(3))
                    sent_bytes = int(match.group(4))
                    recv_bytes = int(match.group(5))
                    self.stats[addr]['sessions'] += 1
                    self.stats[addr]['sent_bytes'] += sent_bytes
                    self.stats[addr]['recv_bytes'] += recv_bytes
                    self.stats[addr]['sent_files'] += sent_files
                    self.stats[addr]['recv_files'] += recv_files
                    self.total_sessions += 1
                # Parse "rcvd" lines for CPS
                match = rcvd_pattern.search(line)
                if match:
                    cps = float(match.group(2))
                    addr = match.group(3)
                    self.stats[addr]['recv_cps'].append(cps)
                # Parse "sent" lines for CPS
                match = sent_pattern.search(line)
                if match:
                    cps = float(match.group(2))
                    addr = match.group(3)
                    self.stats[addr]['send_cps'].append(cps)
    def calculate_avg_cps(self, cps_list: List[float]) -> float:
        """Calculate average CPS from list"""
        return sum(cps_list) / len(cps_list) if cps_list else 0.0
    def print_text_report(self):
        """Print statistics report to console"""
        if not self.stats:
            print("\n[There Are No Connections To Report]\n")
            return
        print()
        print("                         BinkD Connections")
        print()
        if self.days == 0:
            if self.start_date:
                start_str = self.start_date.strftime("%c")
            else:
                start_str = "Unknown"
            if self.end_date:
                end_str = self.end_date.strftime("%c")
            else:
                end_str = "Unknown"
            stat_line = f"Statistics From {start_str} Thru To {end_str}"
            # Center the line within 78 characters
            print(stat_line.center(78))
        else:
            now_str = datetime.now().strftime("%c")
            print(f"         Statistics For Last {self.days} days, "
                  f"Created {now_str}")
        print("-" * 77)
        print(f"{'Address':<18} {'Sessions':>8}  {'Sent':>9}  "
              f"{'Received':>9}  {'CPS In':>9}  {'CPS Out':>9}")
        print("-" * 77)
        # Sort addresses by FidoNet address
        sorted_addrs = sorted(self.stats.keys(),
                            key=lambda x: self.parse_fido_address(x))
        total_sent = 0
        total_recv = 0
        avg_rcps_list = []
        avg_scps_list = []
        for addr in sorted_addrs:
            s = self.stats[addr]
            avg_rcps = self.calculate_avg_cps(s['recv_cps'])
            avg_scps = self.calculate_avg_cps(s['send_cps'])
            print(f"{addr:<18} {s['sessions']:>8}  "
                  f"{self.format_bytes(s['sent_bytes']):>9}  "
                  f"{self.format_bytes(s['recv_bytes']):>9}  "
                  f"{avg_rcps:>9.2f}  {avg_scps:>9.2f}")
            total_sent += s['sent_bytes']
            total_recv += s['recv_bytes']
            if avg_rcps > 0:
                avg_rcps_list.append(avg_rcps)
            if avg_scps > 0:
                avg_scps_list.append(avg_scps)
        print("-" * 77)
        overall_avg_rcps = (sum(avg_rcps_list) / len(avg_rcps_list)
                           if avg_rcps_list else 0)
        overall_avg_scps = (sum(avg_scps_list) / len(avg_scps_list)
                           if avg_scps_list else 0)
        total_traffic = total_sent + total_recv
        print(f"Total Received : {self.format_bytes(total_recv):>16}     "
              f"Total Sessions   : {self.total_sessions:>16}")
        print(f"Total Sent     : {self.format_bytes(total_sent):>16}     "
              f"Average CPS In   : {overall_avg_rcps:>16.2f}")
        print(f"Total Traffic  : {self.format_bytes(total_traffic):>16}"
              f"     Average CPS Out  : {overall_avg_scps:>16.2f}")
        print("-" * 77)
        print()
    def export_json(self, output_file: str):
        """Export statistics to JSON format"""
        start = self.start_date.isoformat() if self.start_date else None
        end = self.end_date.isoformat() if self.end_date else None
        data = {
            'start_date': start,
            'end_date': end,
            'days_filter': self.days,
            'total_sessions': self.total_sessions,
            'nodes': {}
        }
        for addr, s in self.stats.items():
            data['nodes'][addr] = {
                'sessions': s['sessions'],
                'sent_bytes': s['sent_bytes'],
                'recv_bytes': s['recv_bytes'],
                'sent_files': s['sent_files'],
                'recv_files': s['recv_files'],
                'avg_recv_cps': self.calculate_avg_cps(s['recv_cps']),
                'avg_send_cps': self.calculate_avg_cps(s['send_cps'])
            }
        with open(output_file, 'w', encoding='utf-8') as f:
            json.dump(data, f, indent=2)
        print(f"Statistics exported to {output_file}")
    def export_csv(self, output_file: str):
        """Export statistics to CSV format"""
        with open(output_file, 'w', encoding='utf-8') as f:
            f.write("Address,Sessions,Sent_Bytes,Recv_Bytes,Sent_Files,"
                   "Recv_Files,Avg_Send_CPS,Avg_Recv_CPS\n")
            sorted_addrs = sorted(self.stats.keys(),
                                key=lambda x: self.parse_fido_address(x))
            for addr in sorted_addrs:
                s = self.stats[addr]
                avg_rcps = self.calculate_avg_cps(s['recv_cps'])
                avg_scps = self.calculate_avg_cps(s['send_cps'])
                f.write(f"{addr},{s['sessions']},{s['sent_bytes']},"
                       f"{s['recv_bytes']},{s['sent_files']},"
                       f"{s['recv_files']},{avg_scps:.2f},{avg_rcps:.2f}\n")
        print(f"Statistics exported to {output_file}")
def main():
    parser = argparse.ArgumentParser(
        description='BinkD-Stats 2.0 - BinkD Log Analyzer',
        formatter_class=argparse.RawDescriptionHelpFormatter,
        epilog="""
Examples:
  %(prog)s -f /var/log/binkd.log
  %(prog)s -f /var/log/binkd.log -d 7
  %(prog)s -f /var/log/binkd.log --json stats.json
  %(prog)s -f /var/log/binkd.log --csv stats.csv -d 30
        """
    )
    parser.add_argument('-f', '--file', required=True,
                       help='Path to binkd.log file')
    parser.add_argument('-d', '--days', type=int, default=0,
                       help='Get statistics for last N days (0 = all)')
    parser.add_argument('--json', metavar='FILE',
                       help='Export statistics to JSON file')
    parser.add_argument('--csv', metavar='FILE',
                       help='Export statistics to CSV file')
    parser.add_argument('--version', action='version', version='%(prog)s 2.0')
    args = parser.parse_args()
    try:
        analyzer = BinkDStats(args.file, args.days)
        analyzer.parse_log()
        analyzer.print_text_report()
        if args.json:
            analyzer.export_json(args.json)
        if args.csv:
            analyzer.export_csv(args.csv)
    except FileNotFoundError as e:
        print(f"Error: {e}", file=sys.stderr)
        sys.exit(1)
    except Exception as e:
        print(f"Error: {e}", file=sys.stderr)
        import traceback
        traceback.print_exc()
        sys.exit(2)
if __name__ == '__main__':
    main()
=== Cut ===
--- GoldED+/LNX 1.1.5-b20250409
 * Origin: Dragon's Lair ---:- dragon.vk3heg.net -:--- Prt: 6800 (3:633/280)