Results View
Browse and analyze experiment outputs in a structured tree interface.
Overview
The Results view displays all experiment runs from the experiments/ directory, organized in a hierarchical tree structure. It provides quick access to experiment artifacts and analysis tools.
Tree Structure
Results
├─ Configure Evaluation… (opens configs/evaluation.yaml)
└─ Experiments
└─ customer_support_test_20251104_143610
├─ Parse Results (action)
├─ per_trace_analysis/ (folder, appears after parsing)
│ ├─ 01_trace_abc123.md
│ └─ 02_trace_def456.md
├─ summary.json
├─ traces.jsonl
├─ observations.jsonl
├─ errors.json
└─ logs.json
Features
Experiments Folder
All experiment runs are grouped under a collapsible Experiments folder:
- Shows up to 15 most recent experiments
- Sorted newest to oldest by folder name
- Each experiment displays:
  - Name: From summary.json or the folder name
  - Timestamp: Formatted as YYYY-MM-DD HH:MM:SS
  - Statistics: Success rate and run count (if available)
Example label: customer_support_test with the description 2025-11-04 14:36:10 • 10 runs • 100% success (assembled as sketched below).
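A minimal Python sketch of how such a label could be composed from the folder name and summary.json. The total_runs and success_rate keys are placeholders for illustration, not the actual summary schema:

```python
import json
from datetime import datetime
from pathlib import Path

def experiment_label(folder: Path) -> str:
    """Build a description like '2025-11-04 14:36:10 • 10 runs • 100% success'."""
    # Experiment folders end in a timestamp suffix, e.g. ..._20251104_143610
    stamp = "_".join(folder.name.split("_")[-2:])
    parts = [datetime.strptime(stamp, "%Y%m%d_%H%M%S").strftime("%Y-%m-%d %H:%M:%S")]

    summary_path = folder / "summary.json"
    if summary_path.exists():
        summary = json.loads(summary_path.read_text())
        runs = summary.get("total_runs")    # placeholder key, not the real schema
        rate = summary.get("success_rate")  # placeholder key, assumed to be 0.0-1.0
        if runs is not None:
            parts.append(f"{runs} runs")
        if rate is not None:
            parts.append(f"{rate:.0%} success")
    return " • ".join(parts)

print(experiment_label(Path("experiments/customer_support_test_20251104_143610")))
```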
Parse Results Action
Clicking Parse Results at the top of any experiment:
- Prompts for the output directory (default: per_trace_analysis)
- Checks if the output already exists
- Confirms overwrite if needed
- Runs fluxloop parse experiment <folder> --output <dir> (see the sketch below)
- Refreshes the view to show the new per_trace_analysis/ folder
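A rough Python equivalent of the check-then-run flow above, shelling out to the documented CLI command. This is a sketch of the described behavior, not the extension's actual code, and the experiment path is illustrative:

```python
import subprocess
from pathlib import Path

def parse_experiment(experiment: Path, output: str = "per_trace_analysis",
                     overwrite: bool = False) -> None:
    """Replicate the Parse Results flow: check for existing output, then run the CLI."""
    if (experiment / output).exists() and not overwrite:
        raise FileExistsError(f"{experiment / output} exists; set overwrite=True to replace it")

    cmd = ["fluxloop", "parse", "experiment", str(experiment), "--output", output]
    if overwrite:
        cmd.append("--overwrite")  # same flag the view passes when you confirm overwrite
    subprocess.run(cmd, check=True)

parse_experiment(Path("experiments/customer_support_test_20251104_143610"))
```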
Per-Trace Analysis Folder
After parsing, a per_trace_analysis/ folder appears:
- Contains one Markdown file per trace
- Files are named <iteration>_<trace_id>.md
- Recursively browsable for nested outputs (see the sketch below)
- Click any file to open in editor
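Because the folder can contain subdirectories, a recursive listing is the simplest way to enumerate the reports outside the editor. A short sketch, with an illustrative experiment path:

```python
from pathlib import Path

analysis_dir = Path("experiments/customer_support_test_20251104_143610/per_trace_analysis")

# rglob mirrors the view's recursive browsing; sorting keeps the numeric
# <iteration> prefix order (01_..., 02_..., ...)
for report in sorted(analysis_dir.rglob("*.md")):
    print(report.relative_to(analysis_dir))
```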
Artifact Files
Standard experiment output files:
- summary.json: Aggregate statistics, metadata, success rates
- traces.jsonl: Line-delimited trace records (one per iteration)
- observations.jsonl: Observation stream with inputs/outputs
- errors.json: Error details (only if failures occurred)
- logs.json: Runtime and debug logs
Files only appear if they exist in the experiment directory.
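For ad-hoc analysis outside the view, the artifacts are plain JSON and JSONL and can be loaded directly. A minimal sketch (the experiment path is illustrative, and record fields are not shown because they depend on the experiment):

```python
import json
from pathlib import Path

exp = Path("experiments/customer_support_test_20251104_143610")

# summary.json: a single JSON document with aggregate statistics
summary_path = exp / "summary.json"
if summary_path.exists():
    summary = json.loads(summary_path.read_text())
    print("summary keys:", sorted(summary))

# traces.jsonl: line-delimited JSON, one trace record per iteration
with (exp / "traces.jsonl").open() as fh:
    traces = [json.loads(line) for line in fh if line.strip()]
print(f"{len(traces)} trace records")
```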
Actions
Configure Evaluation
Top-level button that opens configs/evaluation.yaml for editing evaluator definitions.
Parse Results
Appears inside each experiment folder. Executes:
fluxloop parse experiment <experiment_path> --output <dir> [--overwrite]
Open Files
Click any artifact file to open it in the appropriate editor:
- JSON files: Syntax-highlighted JSON editor
- JSONL files: Line-by-line text view
- Markdown files: Markdown editor with preview support
File Watchers
The Results view automatically refreshes when:
- New experiments are created in experiments/
- Existing experiment files are modified
- Parse operations complete
Tips
- Quick Parse: Use Parse Results directly from the tree instead of Command Palette
- Timestamp Sorting: Most recent experiments appear at the top
- Multi-Parse: Parse different experiments with different output directories for comparison (see the sketch after this list)
- Nested Analysis: per_trace_analysis/ can contain subdirectories; all are browsable
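For the Multi-Parse tip, the same CLI call can be scripted across several experiments, each with its own output directory. A sketch; the analysis_<name> directory naming is arbitrary:

```python
import subprocess
from pathlib import Path

# Parse every experiment run into its own output folder so the per-trace
# reports from different runs can be compared side by side.
for folder in sorted(Path("experiments").iterdir()):
    if folder.is_dir():
        subprocess.run(
            ["fluxloop", "parse", "experiment", str(folder),
             "--output", f"analysis_{folder.name}"],
            check=True,
        )
```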
Related
- Viewing Results User Guide - Detailed parsing workflow
- Experiments View - Running new experiments
- Workflow Commands - Parse command details