Daily Study Log [2025-07-01]
Today was a well-balanced study day across three key areas: competition experimentation, hands-on GAN practice, and my very first formal paper review!
SCU_Competition – Submissions 36–40
Focus: Comparing model types under a fixed feature structure (a rough comparison sketch follows the highlights)
Goal: Find the sweet spot between simplicity and generalization
Highlights:
- Submission 36: Full feature set + plain LGBM (no tuning) → CV AUC 0.8724 / Kaggle AUC 0.8892
- Submission 37: Tuned LGBM (simple structure) → Kaggle AUC 0.8932
- Submission 38: Switched to RandomForest → noticeable drop (AUC 0.8805)
- Submission 39: LogisticRegression baseline → consistent but limited (AUC 0.8736)
- Submission 40: LGBM + RandomizedSearchCV → robust tuning (AUC 0.8915)
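For future me, a rough sketch of what this comparison step looks like, assuming scikit-learn-style `X`, `y` already built from the fixed feature set; it captures the shape of the experiment, not the exact submission code:

```python
from lightgbm import LGBMClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# X, y are assumed to be the fixed feature matrix and target from the current pipeline.
models = {
    "lgbm_plain": LGBMClassifier(random_state=42),
    "random_forest": RandomForestClassifier(n_estimators=500, random_state=42),
    # Scaling only matters for the linear baseline.
    "logreg_baseline": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
}

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
    print(f"{name}: CV AUC {scores.mean():.4f} (std {scores.std():.4f})")
```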
Takeaways:
- Simple LGBM still outperforms both ensemble trees and linear models in this task
- Feature structure is stable; now tuning is the key lever
- RandomizedSearchCV with a constrained search space yields solid results without overfitting (rough sketch below)
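And the tuning step from Submission 40, sketched with scikit-learn's RandomizedSearchCV over the same `X`, `y`; the parameter ranges are illustrative placeholders rather than the ones I actually submitted:

```python
from lightgbm import LGBMClassifier
from scipy.stats import randint, uniform
from sklearn.model_selection import RandomizedSearchCV, StratifiedKFold

# Constrained search space: small-ish trees, moderate learning rates, some regularization.
param_distributions = {
    "num_leaves": randint(16, 64),
    "learning_rate": uniform(0.01, 0.09),   # samples from [0.01, 0.10]
    "n_estimators": randint(200, 800),
    "min_child_samples": randint(20, 100),
    "subsample": uniform(0.7, 0.3),         # samples from [0.7, 1.0]
    "colsample_bytree": uniform(0.7, 0.3),
}

search = RandomizedSearchCV(
    LGBMClassifier(random_state=42, subsample_freq=1),  # subsample_freq > 0 so row subsampling is active
    param_distributions=param_distributions,
    n_iter=50,
    scoring="roc_auc",
    cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=42),
    random_state=42,
    n_jobs=-1,
)
search.fit(X, y)
print("best CV AUC:", search.best_score_)
print("best params:", search.best_params_)
```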
Computer Vision Self-Study – GAN Playground
What I Did:
- Played with a basic GAN model using PyTorch
- Generated simple synthetic digits (MNIST-style)
- Adjusted generator/discriminator learning rates and batch sizes (a minimal training-loop sketch follows the insights below)
Insights:
- Watching generator loss collapse is oddly satisfying
- Even basic GANs teach a lot about optimization dynamics
- Planning to explore conditional GANs next
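For the record, this is roughly what "basic" meant today: a fully connected generator/discriminator pair on MNIST, with the learning rate, batch size, and latent dimension as the knobs I was turning. The numbers below are placeholders, not the exact settings from my notebook:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

latent_dim, batch_size, lr = 100, 128, 2e-4
device = "cuda" if torch.cuda.is_available() else "cpu"

generator = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 512), nn.LeakyReLU(0.2),
    nn.Linear(512, 784), nn.Tanh(),  # outputs in [-1, 1], matching the normalization below
).to(device)

discriminator = nn.Sequential(
    nn.Linear(784, 512), nn.LeakyReLU(0.2),
    nn.Linear(512, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),  # raw logit; BCEWithLogitsLoss applies the sigmoid
).to(device)

opt_g = torch.optim.Adam(generator.parameters(), lr=lr, betas=(0.5, 0.999))
opt_d = torch.optim.Adam(discriminator.parameters(), lr=lr, betas=(0.5, 0.999))
criterion = nn.BCEWithLogitsLoss()

transform = transforms.Compose([transforms.ToTensor(), transforms.Normalize((0.5,), (0.5,))])
loader = DataLoader(datasets.MNIST("data", train=True, download=True, transform=transform),
                    batch_size=batch_size, shuffle=True)

for epoch in range(5):
    for real, _ in loader:
        real = real.view(real.size(0), -1).to(device)
        ones = torch.ones(real.size(0), 1, device=device)
        zeros = torch.zeros(real.size(0), 1, device=device)

        # Discriminator step: push real images toward 1 and generated images toward 0.
        noise = torch.randn(real.size(0), latent_dim, device=device)
        fake = generator(noise)
        d_loss = criterion(discriminator(real), ones) + criterion(discriminator(fake.detach()), zeros)
        opt_d.zero_grad()
        d_loss.backward()
        opt_d.step()

        # Generator step: try to fool the discriminator into predicting 1 for fakes.
        g_loss = criterion(discriminator(fake), ones)
        opt_g.zero_grad()
        g_loss.backward()
        opt_g.step()
    print(f"epoch {epoch}: d_loss={d_loss.item():.3f} g_loss={g_loss.item():.3f}")
```

Watching d_loss and g_loss chase each other is where most of the tuning intuition came from.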
Paper Review Kickoff – MobileNetV2
Started: MobileNetV2: Inverted Residuals and Linear Bottlenecks
Today's Focus: Day 1 – Abstract, Introduction, Problem Statement
Repo: Paper Review – GitHub
Reflection:
- Really enjoyed this first deep-dive
- Realized the importance of mobile constraints in architecture design
- Looking forward to dissecting the inverted residual block tomorrow (a rough PyTorch sketch of the block is below)
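Before tomorrow's deep dive, here is my current mental model of the block, sketched in PyTorch from the abstract and intro only (a best guess, not the paper's reference implementation): expand with a 1x1 conv, filter with a 3x3 depthwise conv, project back down with a linear 1x1 conv, and add the skip connection only when the stride is 1 and the channel counts match.

```python
import torch
import torch.nn as nn

class InvertedResidual(nn.Module):
    def __init__(self, in_ch: int, out_ch: int, stride: int = 1, expand_ratio: int = 6):
        super().__init__()
        hidden = in_ch * expand_ratio
        self.use_residual = stride == 1 and in_ch == out_ch
        self.block = nn.Sequential(
            # 1x1 expansion
            nn.Conv2d(in_ch, hidden, 1, bias=False),
            nn.BatchNorm2d(hidden),
            nn.ReLU6(inplace=True),
            # 3x3 depthwise convolution
            nn.Conv2d(hidden, hidden, 3, stride=stride, padding=1, groups=hidden, bias=False),
            nn.BatchNorm2d(hidden),
            nn.ReLU6(inplace=True),
            # 1x1 linear bottleneck: no non-linearity after the projection
            nn.Conv2d(hidden, out_ch, 1, bias=False),
            nn.BatchNorm2d(out_ch),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.block(x)
        return x + out if self.use_residual else out

# e.g. a stride-1 block that keeps 32 channels, so the skip connection is used
x = torch.randn(1, 32, 56, 56)
print(InvertedResidual(32, 32)(x).shape)  # torch.Size([1, 32, 56, 56])
```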
Next Steps
- CV: Try conditional GAN with labels
- SCU: Add SHAP filtering to the next submission (sketch after this list)
- Paper Review: Sketch MobileNetV2 architecture + start Section 3 analysis
- Blog: Consider auto-generated paper index page by category/tag
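A note-to-self sketch of the planned SHAP filtering step, assuming the fitted LightGBM model and the training frame are available as `model` and `X_train` (a pandas DataFrame); the 1% importance threshold is an arbitrary placeholder:

```python
import numpy as np
import shap

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_train)

# Normalize across shap versions: binary classifiers can come back as a list of
# per-class arrays or as a 3D (samples, features, classes) array.
if isinstance(shap_values, list):
    shap_values = shap_values[1]
elif shap_values.ndim == 3:
    shap_values = shap_values[:, :, 1]

# Rank features by mean |SHAP| and drop the ones contributing almost nothing.
importance = np.abs(shap_values).mean(axis=0)
importance = importance / importance.sum()
keep = X_train.columns[importance >= 0.01]
print(f"keeping {len(keep)} of {X_train.shape[1]} features")
X_filtered = X_train[keep]
```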
TL;DR
SCU: Simple LGBM wins again – balance is better than brute force
CV: GANs are super fun and super sensitive to tuning
Paper: Started the MobileNetV2 review, a great kickoff to the reading habit