feat: Add benchmark logging system for performance analysis (closes #104)

Add Python API for capturing performance data to JSON files (usage sketch below):
- mcrfpy.start_benchmark() - start capturing frame data
- mcrfpy.end_benchmark() - stop and return filename
- mcrfpy.log_benchmark(msg) - add log message to current frame
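
A minimal usage sketch; only the three calls above are introduced by this commit, while the surrounding script and the zero-argument form of start_benchmark() are assumptions:

```python
# Hedged sketch: only start_benchmark/end_benchmark/log_benchmark are
# confirmed by this commit; everything else here is illustrative.
import mcrfpy

mcrfpy.start_benchmark()          # begin capturing per-frame data

# ... let the game loop run the workload under test; annotate
# interesting frames as they happen:
mcrfpy.log_benchmark("spawned 500 entities")

path = mcrfpy.end_benchmark()     # stop capturing, returns the filename
print(f"benchmark written to {path}")
```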

The benchmark system captures per-frame data (sample record below), including:
- Frame timing (frame_time_ms, fps, timestamp)
- Detailed timing breakdown (grid_render, entity_render, python, animation, fov)
- Draw call and element counts
- User log messages attached to frames
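
For illustration, a single frame record might look like the following; frame_time_ms, fps, and timestamp are named in this commit, while the remaining keys (timings, draw_calls, elements, logs) are guesses at how the listed data could be laid out:

```json
{
  "frame_time_ms": 16.4,
  "fps": 61.0,
  "timestamp": 1052.3,
  "timings": {
    "grid_render": 4.1,
    "entity_render": 3.2,
    "python": 2.0,
    "animation": 0.7,
    "fov": 0.4
  },
  "draw_calls": 128,
  "elements": 342,
  "logs": ["spawned 500 entities"]
}
```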

The output JSON format supports analysis tools and includes (overall file sketch below):
- Benchmark metadata (PID, timestamps, duration, total frames)
- Full frame-by-frame metrics array
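
A sketch of the overall file shape; every key name here is a guess consistent with the metadata listed above, and "frames" would hold per-frame records like the sample shown earlier:

```json
{
  "pid": 12345,
  "start_time": "2025-01-01T12:00:00-05:00",
  "end_time": "2025-01-01T12:01:00-05:00",
  "duration_ms": 60000,
  "total_frames": 3600,
  "frames": []
}
```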

Also refactors ProfilingMetrics from a struct nested inside GameEngine
to a top-level struct so it can be forward-declared more easily.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
John McCardle 2025-11-28 16:05:55 -05:00
commit a7fef2aeb6
7 changed files with 551 additions and 59 deletions

```diff
@@ -82,6 +82,11 @@ public:
     // Profiling/metrics
     static PyObject* _getMetrics(PyObject*, PyObject*);
+    // Benchmark logging (#104)
+    static PyObject* _startBenchmark(PyObject*, PyObject*);
+    static PyObject* _endBenchmark(PyObject*, PyObject*);
+    static PyObject* _logBenchmark(PyObject*, PyObject*);
     // Developer console
     static PyObject* _setDevConsole(PyObject*, PyObject*);
```