Category: Career

  • Seven Days, Seven Lessons: A Data Engineer’s Weekly Reflection

    Sundays are for slowing down. Not for scrolling through tutorials, not for chasing the next framework — but for actually sitting with what the week taught you.

    This week was unusually rich. I spent seven days writing about Spark partitioning, AI tools, Python idioms, career moves, advanced Airflow patterns, and real-world healthcare AI. By Saturday night, I noticed something strange: the daily topics were all different, but the lessons kept rhyming.

    Here are the seven that stuck. Not theory — things I actually changed my mind about this week.

    1. Tools Are Disposable. Judgment Is Not.

    Early in my career, I collected tools like trading cards. Airflow, dbt, Spark, Kafka, Flink, Snowflake, Databricks, Polars, DuckDB, Iceberg — if it had a logo, I wanted it on my résumé.

    This week I watched a senior engineer replace a 200-line Airflow DAG with a 40-line Python script and a cron job. The pipeline ran faster, broke less often, and was readable by a junior hire on day one.

    The lesson: Most of the time, the question isn’t “which tool is best?” It’s “do we even need a tool here?” Judgment is what turns a toolkit into a career.
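For the curious: the "40-line script plus cron" pattern is less exotic than it sounds. Here's a minimal sketch of what such a pipeline can look like, using only the standard library. Everything here is illustrative: the file paths, the `events` table, and the `fetch → transform → load` shape are my hypothetical reconstruction, not the engineer's actual script.

```python
#!/usr/bin/env python3
"""Hypothetical sketch of a cron-driven pipeline: read a CSV export,
drop incomplete rows, load the rest into SQLite. Names and paths are
illustrative, not the real script from the anecdote."""
import csv
import sqlite3
from pathlib import Path


def transform(rows):
    # Keep only rows where every field is non-empty, and trim whitespace.
    for row in rows:
        if row and all(field.strip() for field in row):
            yield [field.strip() for field in row]


def run(source: Path, db: Path) -> int:
    """Load one CSV file into the events table; return rows loaded."""
    with source.open(newline="") as f:
        rows = list(transform(csv.reader(f)))
    con = sqlite3.connect(db)
    with con:  # one transaction: either all rows land, or none do
        con.execute("CREATE TABLE IF NOT EXISTS events (user TEXT, amount TEXT)")
        con.executemany("INSERT INTO events VALUES (?, ?)", rows)
    con.close()
    return len(rows)
```

Scheduling is then a single (hypothetical) crontab line, e.g. `0 2 * * * /usr/bin/python3 /opt/pipelines/load_events.py`. No scheduler UI, no DAG dependencies, no worker fleet. The point isn't that Airflow is bad; it's that this much machinery is sometimes all the job requires.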

    2. Fundamentals Compound. Trends Don’t.

    I’ve paid for three courses on “next-generation” data warehouses in the last two years. The knowledge that has actually served me across every one of those warehouses? How query planners work. How indexes get chosen. Why a seemingly innocent OR in a WHERE clause can destroy a plan.
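You can watch that "innocent OR" effect in miniature with SQLite's `EXPLAIN QUERY PLAN` (the table and column names below are made up for the demo). With an index on `user_id` alone, an equality filter gets an index search, but adding `OR email = ...` forces the planner into a full table scan, because one branch of the OR has no index to use:

```python
# Demo of the "innocent OR" problem using SQLite's query planner.
# Hypothetical schema: users(user_id, email), index on user_id only.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE users (user_id INTEGER, email TEXT);
    CREATE INDEX idx_user_id ON users(user_id);
""")


def plan(sql: str) -> str:
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail);
    # join the detail strings into one readable line.
    return " ".join(row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql))


indexed = plan("SELECT * FROM users WHERE user_id = 42")
with_or = plan("SELECT * FROM users WHERE user_id = 42 OR email = 'a@b.c'")

print(indexed)  # a SEARCH using idx_user_id
print(with_or)  # a full SCAN of users
```

Other planners (Postgres, Snowflake, Spark's Catalyst) differ in the details, but the shape of the problem is the same. The usual fixes are indexing the second column or rewriting the OR as a `UNION` of two indexable predicates.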

    Fundamentals are boring to post about. They don’t trend. But they compound for decades.

    The lesson: Spend 20% of your learning budget on shiny things. Spend 80% on the fundamentals — SQL internals, distributed systems, data modeling, Linux.

    3. AI Isn’t Replacing Data Engineers. It’s Replacing a Certain Kind of Data Engineer.

Every week there’s a new “AI will replace data engineers” post. So this week I ran the experiment in the open: I used AI to scaffold dbt models, write Spark transforms, and review my Python.

    The honest result: AI is extraordinary at boilerplate. It is still bad at judgment, architecture, cost modeling, and political navigation inside a company.

    The lesson: If your day-to-day is 80% boilerplate, 2026 is a wake-up call. If you spend your day on schemas, trade-offs, stakeholder alignment, and system design — AI is a jetpack, not a guillotine.

    4. Writing Publicly Is the Best Career Move I’ve Made.

    I didn’t get my last opportunity from a job board. I got it because someone read my LinkedIn posts and decided I thought clearly.

    Writing publicly forces something a promotion never will: you have to actually understand your own work well enough to explain it to strangers. That pressure makes you a better engineer.

    The lesson: Even if nobody reads it for the first six months, keep writing. The audience is a bonus. The clarity is the product.

    5. The Hardest Skill in 2026 Is Saying “Let’s Not Build That.”

    This one hurts to admit. For years, I measured my value by what I built. Pipelines shipped. DAGs authored. Models deployed.

    This week I killed three proposed pipelines before they started. Each would have added 3–5 weeks of work, two new data sources, and an ongoing maintenance burden. The business outcome we actually needed? A spreadsheet and a stakeholder conversation.

    The lesson: The best data engineers I know have a finely tuned “not now” reflex. They optimize for problems solved, not code shipped.

    6. Taste Is the Real Moat.

    You can teach someone Spark. You cannot teach them, in a weekend, to sense when a pipeline is getting too clever. To feel when a schema is drifting toward technical debt. To notice that a dashboard is answering the wrong question.

    That sensitivity is taste. It comes from reading other people’s code, breaking your own in production, and paying attention on purpose.

    The lesson: If you want to stand out in a field full of certifications, build taste. It takes years. It’s also the one thing the machine can’t clone.

    7. Unlearning Matters as Much as Learning.

    I started the week planning to write about new things I learned. I ended it realizing half the value was in unlearning — habits, tools, and opinions I had outgrown.

    • Unlearned: pandas is the only option. (Polars handled the heavy lifting in a fraction of the time.)
    • Unlearned: every pipeline deserves a DAG. (Some deserve a cron job.)
    • Unlearned: silent senior engineers are humble. (They’re just invisible. Speak up.)

    The lesson: Your growth isn’t only what you add. It’s what you’re willing to let go of.

    Closing Thought

    One week is a small window. But a week of deliberate attention will teach you more than a month of passive consumption.

    If you’re reading this on a Sunday, I’ll ask you what I asked myself this morning: What did you unlearn this week?

    Write it down. It’s probably the most valuable thing you touched all week.

    See you in the next post.

    — Pushpjeet Cholkar, Data Engineer