# GNN Ruby Code Study
Systematic study of Graph Neural Network architectures for Ruby code complexity prediction and generation.
## Key Findings
1. **5-layer GraphSAGE** achieves MAE 4.018 (R² = 0.709) for complexity prediction, 16% better than the 3-layer baseline
2. **GNN autoencoders produce 0% valid Ruby** across all tested architectures and loss functions
3. **Teacher-forced GIN decoder** achieves 7% validity, the only non-zero result across 35 experiments
4. **Total cost: ~$4.20** on Vast.ai RTX 4090 instances
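The complexity-prediction setup in finding 1 can be sketched with a minimal mean-aggregation GraphSAGE layer in pure NumPy. This is an illustrative assumption, not the repository's actual model: layer widths, the toy graph, global mean pooling, and the linear regression head are all invented here for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def sage_layer(H, adj, W):
    """One GraphSAGE layer with mean aggregation:
    h_v' = ReLU(W @ [h_v ; mean of neighbor features])."""
    agg = np.zeros_like(H)
    for v, neigh in enumerate(adj):
        if neigh:
            agg[v] = H[neigh].mean(axis=0)
    Z = np.concatenate([H, agg], axis=1) @ W.T
    return np.maximum(Z, 0.0)  # ReLU

def predict_complexity(H, adj, layers, w_out):
    """Run node features through stacked SAGE layers, pool over
    the graph, and apply a linear head to get one scalar score."""
    for W in layers:
        H = sage_layer(H, adj, W)
    g = H.mean(axis=0)       # global mean pool over AST nodes
    return float(g @ w_out)  # hypothetical regression head

# Toy "AST" graph: 4 nodes, undirected adjacency lists.
adj = [[1, 2], [0, 3], [0], [1]]
H = rng.normal(size=(4, 8))  # random node features (dim 8)
# 5 SAGE layers; each weight is (out_dim, 2 * in_dim) for the
# [self ; aggregated-neighbors] concatenation.
layers = [rng.normal(scale=0.1, size=(8, 16)) for _ in range(5)]
w_out = rng.normal(size=8)

print(predict_complexity(H, adj, layers, w_out))
```

The real model would be trained (e.g. with an MSE loss against measured complexity scores) rather than using random weights; this sketch only shows the forward pass of a 5-layer mean-aggregation GraphSAGE regressor.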
## Contents
- `paper.md` - Full research paper
- `results/experiments.json` - All fleet experiment metrics
- `results/autonomous_research.json` - 18 baseline variance replicates
- `experiments/` - Fleet YAML specifications for all 3 tracks
- `specs/` - Ratiocinator ResearchSpec YAMLs
## Source Code
- Model code: [jubilant-palm-tree](https://github.com/timlawrenz/jubilant-palm-tree) (branch: `experiment/ratiocinator-gnn-study`)
- Orchestrator: [ratiocinator](https://github.com/timlawrenz/ratiocinator)