Although I haven’t pushed any GitHub commits or done hands-on coding today, I’ve been deep in coursework, and that counts too!
I completed two full classes today and have been focusing intently on my current machine learning assignment, which revolves around ensemble techniques.
It’s been an eye-opener.
Lately, I’ve been gaining a clearer intuition about how ensemble methods like bagging, boosting, and stacking truly work — not just in theory, but in practice.
It’s not just about combining models; it’s about understanding how and why they complement each other.
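To make that intuition concrete, here’s a minimal sketch comparing a single decision tree against bagging, boosting, and stacking on a synthetic dataset. It assumes scikit-learn, which the coursework may or may not use; the model choices and parameters here are just illustrative defaults, not the assignment’s setup.

```python
# Sketch: how bagging, boosting, and stacking each build on base models.
# Assumes scikit-learn; dataset and hyperparameters are illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import (
    BaggingClassifier,
    GradientBoostingClassifier,
    StackingClassifier,
)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

models = {
    "single tree": DecisionTreeClassifier(random_state=0),
    # Bagging: many trees trained on bootstrap samples, averaged
    # to reduce variance.
    "bagging": BaggingClassifier(
        DecisionTreeClassifier(random_state=0),
        n_estimators=50,
        random_state=0,
    ),
    # Boosting: trees fit sequentially, each correcting the
    # previous ensemble's errors.
    "boosting": GradientBoostingClassifier(random_state=0),
    # Stacking: a meta-learner combines the base models' predictions.
    "stacking": StackingClassifier(
        estimators=[
            ("tree", DecisionTreeClassifier(random_state=0)),
            ("boost", GradientBoostingClassifier(random_state=0)),
        ],
        final_estimator=LogisticRegression(),
    ),
}

scores = {}
for name, model in models.items():
    scores[name] = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: {scores[name]:.3f}")
```

Running a comparison like this is one way to probe the question of which datasets benefit: the gap between the single tree and the ensembles tends to shrink or grow depending on the data's noise and structure.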
I keep wondering:
“Which types of datasets would really benefit from this?”
It’s sparked a curiosity to revisit past projects or even explore Kaggle datasets with these techniques in mind.
Unfortunately, next week will be packed, so realistically I won’t get much done on projects or coding until the weekend.
Even though I’m not pushing commits or building something flashy,
this kind of “quiet progress” — wrestling with concepts, connecting the dots — is what keeps me moving forward.
Not every step needs to be visible. But every step still counts.