feat: Add benchmark logging system for performance analysis (closes #104)

Add Python API for capturing performance data to JSON files (usage
sketch below):
- mcrfpy.start_benchmark() - start capturing frame data
- mcrfpy.end_benchmark() - stop capturing and return the output filename
- mcrfpy.log_benchmark(msg) - attach a log message to the current frame
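
A minimal usage sketch of the three calls above; the workload and the
log message are placeholders, and only the function names and return
behavior come from this commit:

    import mcrfpy

    # Begin capturing per-frame metrics.
    mcrfpy.start_benchmark()

    # ... run the scene or workload being profiled ...

    # Attach a note to the frame currently being recorded.
    mcrfpy.log_benchmark("spawned 500 entities")

    # Stop capturing; frame data is written to a JSON file and the
    # filename is returned.
    path = mcrfpy.end_benchmark()
    print(f"benchmark written to {path}")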

The benchmark system captures per-frame data (see the example record
below), including:
- Frame timing (frame_time_ms, fps, timestamp)
- Detailed timing breakdown (grid_render, entity_render, python, animation, fov)
- Draw call and element counts
- User log messages attached to frames
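
For illustration, one frame record might look roughly like this; the
field names come from the list above, but the nesting and the numbers
are assumptions rather than the actual schema:

    # Hypothetical shape of a single frame entry (Python dict form).
    frame = {
        "timestamp": 1764363955.123,
        "frame_time_ms": 16.4,
        "fps": 61.0,
        # Detailed timing breakdown, in milliseconds.
        "timing": {
            "grid_render": 4.2,
            "entity_render": 3.1,
            "python": 2.0,
            "animation": 0.7,
            "fov": 1.1,
        },
        "draw_calls": 128,
        "elements": 342,
        "logs": ["spawned 500 entities"],
    }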

Output JSON format supports analysis tools (parsing sketch below) and
includes:
- Benchmark metadata (PID, timestamps, duration, total frames)
- Full frame-by-frame metrics array
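
A sketch of how an analysis tool could consume the file; the "frames"
key and the example filename are assumptions about the schema, and
only the listed per-frame fields come from this commit:

    import json
    import statistics

    # Load a file produced by mcrfpy.end_benchmark().
    with open("benchmark_12345.json") as fh:
        data = json.load(fh)

    frames = data["frames"]  # assumed key name for the metrics array
    times = [frame["frame_time_ms"] for frame in frames]

    print("total frames:", len(frames))
    print("mean frame time (ms):", statistics.mean(times))
    # quantiles(n=20) yields 19 cut points; the last is the 95th percentile.
    print("p95 frame time (ms):", statistics.quantiles(times, n=20)[-1])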

Also refactor ProfilingMetrics from a nested struct inside GameEngine
to a top-level struct so it can be forward-declared more easily.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
John McCardle 2025-11-28 16:05:55 -05:00
commit a7fef2aeb6
7 changed files with 551 additions and 59 deletions


@@ -6,6 +6,7 @@
#include "Resources.h"
#include "Animation.h"
#include "Timer.h"
#include "BenchmarkLogger.h"
#include "imgui.h"
#include "imgui-SFML.h"
#include <cmath>
@@ -308,7 +309,10 @@ void GameEngine::run()
// Update profiling metrics
metrics.updateFrameTime(frameTime * 1000.0f); // Convert to milliseconds
// Record frame data for benchmark logging (if running)
g_benchmarkLogger.recordFrame(metrics);
int whole_fps = metrics.fps;
int tenth_fps = static_cast<int>(metrics.fps * 10) % 10;