🧠 Daily Study Log [2025-07-03]
Today's study spanned advanced ensemble experimentation in the SCU_Competition, hands-on DeepDream visualization in computer vision, a continuation of MobileNetV2 paper reading, and TOEIC grammar/listening/writing practice.


📊 SCU_Competition – Submission 46–50

Focus: Feature pruning, cluster reintroduction, ensemble tuning
Goal: Refine generalization and ensemble strength via clean feature sets
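
As context for the ensemble tuning above, here is a minimal sketch of the weighted soft voting mentioned in the TL;DR, using scikit-learn's VotingClassifier. The estimators, weights, and synthetic data are illustrative assumptions, not the actual submission configuration.

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in data; the real competition features are not reproduced here.
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.2, random_state=42)

# Weighted soft voting: average predicted class probabilities, giving the tree models more say.
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("rf", RandomForestClassifier(n_estimators=300, random_state=42)),
        ("gb", GradientBoostingClassifier(random_state=42)),
    ],
    voting="soft",
    weights=[1, 2, 2],  # illustrative weights, not the tuned ones
)
ensemble.fit(X_tr, y_tr)
print("validation accuracy:", ensemble.score(X_va, y_va))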

Highlights:

Takeaways:


📄 Paper Review – MobileNetV2 (Day 3)

Continued: MobileNetV2: Inverted Residuals and Linear Bottlenecks
Today's Focus: Final part of Section 2 – Understanding inverted residuals with linear bottlenecks
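
To anchor the idea, here is a minimal sketch of one inverted residual block (1x1 expansion, depthwise 3x3, then a linear 1x1 projection) as described in the paper; the channel sizes and expansion ratio below are example values, not the paper's full configuration.

import torch
import torch.nn as nn

class InvertedResidual(nn.Module):
    # Expand -> depthwise conv -> linear (activation-free) bottleneck projection.
    def __init__(self, in_ch, out_ch, stride=1, expand_ratio=6):
        super().__init__()
        hidden = in_ch * expand_ratio
        self.use_residual = stride == 1 and in_ch == out_ch
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, hidden, 1, bias=False),   # 1x1 expansion
            nn.BatchNorm2d(hidden),
            nn.ReLU6(inplace=True),
            nn.Conv2d(hidden, hidden, 3, stride, 1, groups=hidden, bias=False),  # depthwise 3x3
            nn.BatchNorm2d(hidden),
            nn.ReLU6(inplace=True),
            nn.Conv2d(hidden, out_ch, 1, bias=False),  # 1x1 linear bottleneck (no ReLU after)
            nn.BatchNorm2d(out_ch),
        )

    def forward(self, x):
        out = self.block(x)
        return x + out if self.use_residual else out  # shortcut only when shapes match

x = torch.randn(1, 32, 56, 56)
print(InvertedResidual(32, 32)(x).shape)  # torch.Size([1, 32, 56, 56])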

Reflections:


🎨 CV Practice – DeepDream (Hands-on)

What I Did:

Code Summary:

def deep_dream(img, iterations=100, lr=0.15):
    # Gradient ascent on the input image to maximize the hooked layer's mean activation.
    # Assumes a global `model` and a `features` dict populated by a forward hook.
    img = img.clone().detach().requires_grad_(True)   # leaf tensor we can take gradients on
    for _ in range(iterations):
        model.zero_grad()
        model(img)                                    # forward pass fills features['target']
        loss = features['target'].mean()
        loss.backward()
        grad = img.grad.data
        img.data += lr * grad / (grad.std() + 1e-8)   # normalized ascent step
        img.grad.data.zero_()
    return img
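
The summary above assumes a global model and a features dict filled by a forward hook. A minimal sketch of that setup follows; the backbone (VGG16) and the hooked layer index are placeholders for illustration, not necessarily the exact ones used here.

import torch
import torchvision.models as models

model = models.vgg16(weights=models.VGG16_Weights.DEFAULT).eval()

features = {}
def save_activation(module, inputs, output):
    features['target'] = output  # stored for the loss in deep_dream

# Hook an intermediate conv layer; deeper layers tend to dream higher-level patterns.
model.features[28].register_forward_hook(save_activation)

# Toy random input; a real run would start from a preprocessed photo.
img = torch.rand(1, 3, 224, 224)
dreamed = deep_dream(img)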

Insights:


📘 TOEIC – Light Review

Today's Practice:

Reflections:


🎯 Next Steps


✅ TL;DR

πŸ“ SCU: VotingClassifier with weighted soft voting performed well; log1p features no longer useful
πŸ“ CV: DeepDream implemented successfully β€” surreal outputs and layer insights gained
πŸ“ Paper: MobileNetV2’s efficiency lies in its clean residual design and minimal bottlenecks
πŸ“ TOEIC: Completed Listening/Writing Unit 1 β€” daily 30-min routine is now in place