173 lines
5.0 KiB
Markdown
173 lines
5.0 KiB
Markdown
# ScolioVis API - Test Report
|
|
|
|
## Overview
|
|
|
|
| Property | Value |
|
|
|----------|-------|
|
|
| **Repository** | scoliovis-api |
|
|
| **Source** | https://github.com/blankeos/scoliovis-api |
|
|
| **Paper** | "ScolioVis: Automated Cobb Angle Measurement using Keypoint RCNN" |
|
|
| **Model** | Keypoint R-CNN (ResNet50-FPN backbone) |
|
|
| **Output** | Vertebra landmarks (4 corners each) + 3 Cobb angles |
|
|
| **Pretrained Weights** | Yes (227 MB) |
|
|
|
|
---
|
|
|
|
## Purpose
|
|
|
|
ScolioVis detects **vertebra corners** and calculates **Cobb angles** from the detected landmarks:
|
|
- Outputs 4 keypoints per vertebra (corners)
|
|
- Calculates PT, MT, TL angles from vertebra orientations
|
|
- Provides interpretable results (can visualize detected vertebrae)
|
|
|
|
---
|
|
|
|
## Test Results (OUTPUT_TEST_1)
|
|
|
|
### Test Configuration
|
|
|
|
- **Test Dataset**: Spinal-AI2024 subset5 (test set)
|
|
- **Images Tested**: 5 (016001.jpg - 016005.jpg)
|
|
- **Weights**: Pretrained (keypointsrcnn_weights.pt)
|
|
- **Device**: CPU
|
|
|
|
### Results Comparison
|
|
|
|
| Image | GT PT | Pred PT | GT MT | Pred MT | GT TL | Pred TL | Verts |
|
|
|-------|-------|---------|-------|---------|-------|---------|-------|
|
|
| 016001.jpg | 0.0° | - | 4.09° | - | 12.45° | - | 6 (failed) |
|
|
| 016002.jpg | 7.77° | 0.0° | 21.09° | 17.2° | 24.34° | 24.1° | 9 |
|
|
| 016003.jpg | 5.8° | 0.0° | 11.17° | 11.9° | 15.37° | 15.8° | 8 |
|
|
| 016004.jpg | 0.0° | - | 11.94° | - | 20.01° | - | 2 (failed) |
|
|
| 016005.jpg | 9.97° | 0.0° | 16.88° | 10.6° | 20.77° | 16.2° | 11 |
|
|
|
|
**GT = Ground Truth, Pred = Predicted, Verts = Vertebrae Detected**
|
|
|
|
### Error Analysis (Successful Predictions Only)
|
|
|
|
| Image | PT Error | MT Error | TL Error | Mean Error |
|
|
|-------|----------|----------|----------|------------|
|
|
| 016002.jpg | -7.8° | -3.9° | -0.2° | 4.0° |
|
|
| 016003.jpg | -5.8° | +0.7° | +0.4° | 2.3° |
|
|
| 016005.jpg | -10.0° | -6.3° | -4.6° | 7.0° |
|
|
|
|
**Average Error: 4.4°** (on successful predictions)
|
|
|
|
### Success Rate
|
|
|
|
- **3/5 images** (60%) successfully calculated angles
|
|
- **2/5 images** failed (too few vertebrae detected)
|
|
|
|
---
|
|
|
|
## Output Files
|
|
|
|
```
|
|
OUTPUT_TEST_1/
|
|
├── 016001_result.png # Visualization (6 verts, failed)
|
|
├── 016002_result.png # Visualization (9 verts, success)
|
|
├── 016003_result.png # Visualization (8 verts, success)
|
|
├── 016004_result.png # Visualization (2 verts, failed)
|
|
├── 016005_result.png # Visualization (11 verts, success)
|
|
└── results.json # JSON results
|
|
```
|
|
|
|
---
|
|
|
|
## How It Works
|
|
|
|
```
|
|
Input Image (JPG/PNG)
|
|
│
|
|
▼
|
|
┌─────────────────────────┐
|
|
│ Keypoint R-CNN │
|
|
│ (ResNet50-FPN) │
|
|
│ - Detect vertebrae │
|
|
│ - Predict 4 corners │
|
|
└─────────────────────────┘
|
|
│
|
|
▼
|
|
┌─────────────────────────┐
|
|
│ Post-processing │
|
|
│ - Filter by score >0.5 │
|
|
│ - NMS (IoU 0.3) │
|
|
│ - Sort by Y position │
|
|
│ - Keep top 17 verts │
|
|
└─────────────────────────┘
|
|
│
|
|
▼
|
|
┌─────────────────────────┐
|
|
│ Cobb Angle Calculation │
|
|
│ - Compute midpoint │
|
|
│ lines per vertebra │
|
|
│ - Find max angles │
|
|
│ - Classify S vs C │
|
|
└─────────────────────────┘
|
|
│
|
|
▼
|
|
Output: {
|
|
landmarks: [...],
|
|
angles: {pt, mt, tl},
|
|
curve_type: "S" | "C"
|
|
}
|
|
```
|
|
|
|
---
|
|
|
|
## Strengths
|
|
|
|
1. **Pretrained weights available** - Ready to use
|
|
2. **Interpretable output** - Can visualize detected vertebrae
|
|
3. **Good accuracy** - 4.4° average error when detection succeeds
|
|
4. **Curve type detection** - Identifies S-curve vs C-curve
|
|
|
|
## Limitations
|
|
|
|
1. **Detection failures** - 40% failure rate on test set
|
|
2. **Requires sufficient vertebrae** - Needs ~8+ vertebrae for reliable angles
|
|
3. **Synthetic image challenges** - May perform differently on real X-rays
|
|
4. **PT angle often 0** - Model tends to underestimate proximal thoracic
|
|
|
|
---
|
|
|
|
## Usage
|
|
|
|
```bash
|
|
# Activate venv
|
|
.\venv\Scripts\activate
|
|
|
|
# Run test script
|
|
python test_subset5.py
|
|
|
|
# Or start FastAPI server
|
|
uvicorn main:app --reload
|
|
# Then POST image to /v2/getprediction
|
|
```
|
|
|
|
---
|
|
|
|
## Comparison with Seg4Reg
|
|
|
|
| Metric | ScolioVis | Seg4Reg (no weights) |
|
|
|--------|-----------|---------------------|
|
|
| Avg Error | **4.4°** | 35.7° |
|
|
| Success Rate | 60% | 100% |
|
|
| Interpretable | **Yes** | No |
|
|
| Pretrained | **Yes** | No |
|
|
|
|
**Winner**: ScolioVis (when detection succeeds)
|
|
|
|
---
|
|
|
|
## Conclusion
|
|
|
|
ScolioVis with pretrained weights produces **clinically reasonable results** (4.4° average error) when vertebra detection succeeds. The main limitation is detection reliability on synthetic images - 40% of test images had too few vertebrae detected.
|
|
|
|
**Recommendation**: Good for real X-rays; may need fine-tuning for synthetic Spinal-AI2024 images.
|
|
|
|
---
|
|
|
|
*Report generated: January 2026*
|
|
*Test data: Spinal-AI2024 subset5*
|