Week 2 Retro: Escaping the Hype Cycle & Systemizing the Build
2026-02-21


7 min read · Engineering · Retrospectives · Process · Productivity · Build in Public · AI Automation · Analytics

A transparent look at Week 2 of building in public. Covers the transition from launch hype to systemic execution, including analytics deep dives, failed deployments, and process optimizations.

The Week 2 Reality Check

There is a concept from the Gartner Hype Cycle known as the "Trough of Disillusionment." It sets in once the excitement of a technology trigger (or, in this case, a Week 1 launch) wears off. The adrenaline of shipping the portfolio and announcing the roadmap has settled. Now, it is just me, the IDE, and the backlog.

Week 1 is about noise. Week 2 is about signal.

This week wasn't about flashy announcements; it was about infrastructure and iteration. I focused on proving that the systems I'm building can actually sustain a workflow, rather than just looking good in a screenshot. If Week 1 was the MVP, Week 2 is the v1.1 patch that fixes the memory leaks.

Here is the retrospective on what shipped, what broke, and what the data is telling me.


The Build Montage: What Actually Shipped?

This week's focus was "Proof." I needed to demonstrate that the AI agents I talk about aren't just wrappers for ChatGPT, but functional tools integrated into a pipeline.

1. The Content Repurposing Agent (v0.5)

I built a Python script using the OpenAI API and FFmpeg. The goal: take a single video file, transcribe it via Whisper, identify viral hooks through semantic analysis, and cut it into clips. A simplified sketch of the pipeline follows the list below.

  • The Win: It successfully processed a 10-minute video into 3 usable shorts without hallucinating the timestamps.
  • The Stack: Python, OpenAI (Whisper + GPT-4o), MoviePy.
  • The Bottleneck: Processing time. Local rendering is slow. I need to move this to a cloud worker.
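
Here is that sketch. It is illustrative rather than the production script: the prompt, file names, and clip count are placeholders, and it assumes the openai>=1.x Python SDK and MoviePy 1.x.

# Simplified, illustrative version of the repurposing pipeline (not the production script).
import json
from openai import OpenAI
from moviepy.editor import VideoFileClip

client = OpenAI()

def transcribe(video_path: str):
    # Whisper transcription with segment-level timestamps.
    with open(video_path, "rb") as f:
        return client.audio.transcriptions.create(
            model="whisper-1",
            file=f,
            response_format="verbose_json",
        )

def find_hooks(transcript, max_clips: int = 3) -> list[dict]:
    # Ask GPT-4o to pick the strongest hooks, constrained to timestamps that actually exist.
    segments = [{"start": s.start, "end": s.end, "text": s.text} for s in transcript.segments]
    resp = client.chat.completions.create(
        model="gpt-4o",
        response_format={"type": "json_object"},
        messages=[
            {"role": "system", "content": (
                f"Pick the {max_clips} most engaging hooks. "
                'Respond as JSON: {"clips": [{"start": float, "end": float}]}. '
                "Only use start/end values present in the provided segments."
            )},
            {"role": "user", "content": json.dumps(segments)},
        ],
    )
    return json.loads(resp.choices[0].message.content)["clips"]

def cut_clips(video_path: str, clips: list[dict]) -> None:
    # Render each selected window as its own short (MoviePy shells out to FFmpeg).
    video = VideoFileClip(video_path)
    for i, clip in enumerate(clips):
        video.subclip(clip["start"], clip["end"]).write_videofile(f"short_{i}.mp4")

transcript = transcribe("episode.mp4")
cut_clips("episode.mp4", find_hooks(transcript))

The local rendering inside cut_clips is exactly where the time goes, which is why a cloud worker is next.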

2. Portfolio Optimization

I noticed a significant Cumulative Layout Shift (CLS) on the main project page during Week 1. I refactored the image loading strategy in Next.js, implementing proper aspect ratio placeholders. The Lighthouse score moved from 82 to 96.


Analytics Review: The Retention Reality

This is the part most creators hide. Everyone shares the viral spike; almost no one shares the drop-off.

The Data:

  • Week 1 Unique Visitors: 1,200+ (fueled by launch posts)
  • Week 2 Unique Visitors: 450 (The baseline)

At first glance, a 60% drop looks like a failure. But as an engineer, I look at retention, not just acquisition.

The Insight: While traffic dropped, Time on Page increased by 40%. The people who visited in Week 2 weren't just clicking a link from Twitter/X out of curiosity; they were reading the documentation and looking at the code snippets. The bounce rate on the "Projects" page dropped significantly.

My takeaway: The "tourists" left. The "builders" stayed. I am optimizing for the latter.


The Best & The Worst

The Best Moment: Automated Feedback Loops

I implemented a simple webhook notification system. Whenever a user interacts with the demo agent on the site, I get a structured log in Discord. On Tuesday, I watched a user try to break the agent with prompt injection. The agent held up. Seeing the security protocols work in real time was the highlight of the week.
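
The notification layer itself is tiny. A minimal sketch, assuming a standard Discord webhook URL in an environment variable (the real payload carries a few more fields):

# Minimal sketch of the Discord notification; the real payload has more fields.
import os
import requests

DISCORD_WEBHOOK_URL = os.environ["DISCORD_WEBHOOK_URL"]

def log_agent_interaction(session_id: str, prompt: str, flagged: bool) -> None:
    # Push one structured interaction event into a Discord channel via the webhook.
    requests.post(
        DISCORD_WEBHOOK_URL,
        json={
            "embeds": [{
                "title": "Demo agent interaction",
                "fields": [
                    {"name": "Session", "value": session_id, "inline": True},
                    {"name": "Flagged", "value": str(flagged), "inline": True},
                    {"name": "Prompt", "value": prompt[:1024]},  # Discord caps embed field values at 1024 chars
                ],
            }]
        },
        timeout=5,
    )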

The Worst Moment: The API Bill

I accidentally left a recursive loop running on a testing branch for the scraping agent. It wasn't infinite, but it was aggressive. I woke up to a usage alert from OpenAI. It wasn't a bank-breaking amount, but it was a rookie mistake.

The fix: I implemented strict budget caps and rate limiting at the application level, not just the provider level.
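
Concretely, the guard looks something like this; the dollar and rate limits below are placeholders, not my actual numbers:

# Application-level budget cap and rate limit (limits shown are placeholders).
import time

class BudgetGuard:
    def __init__(self, max_usd_per_day: float = 5.0, max_calls_per_min: int = 30):
        self.max_usd = max_usd_per_day
        self.max_calls = max_calls_per_min
        self.spent_usd = 0.0
        self.call_times: list[float] = []

    def check(self, estimated_cost_usd: float) -> None:
        # Called before every provider request; raises instead of letting a loop run up the bill.
        now = time.time()
        self.call_times = [t for t in self.call_times if now - t < 60]
        if len(self.call_times) >= self.max_calls:
            raise RuntimeError("Rate limit hit: too many calls in the last minute")
        if self.spent_usd + estimated_cost_usd > self.max_usd:
            raise RuntimeError("Budget cap hit: daily spend limit reached")
        self.call_times.append(now)

    def record(self, actual_cost_usd: float) -> None:
        # Called after the response, once token counts give the actual cost.
        self.spent_usd += actual_cost_usd

Even if the provider-side alert lags, a runaway loop now dies at the application boundary first.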


Process Tweaks: Systemizing the Grind

In Week 1, I was operating on adrenaline. That isn't scalable. Week 2 required installing systems.

1. CI/CD for Content

I started treating my content pipeline like a deployment pipeline. I set up a Kanban board in Notion that mirrors a Jira workflow: Backlog -> In Progress -> Code Review (Editing) -> Staging -> Production. No more writing posts on the fly 20 minutes before they go live.

2. The "Code-First" Rule

I realized I was spending too much time designing thumbnails and not enough time coding. I established a new rule: Code first, document second. If I haven't committed code to GitHub, I am not allowed to open Figma. This ensures that every piece of content is backed by actual engineering work.


Technical Deep Dive: Solving the Context Window Issue

One of the biggest hurdles this week was managing the context window for the Retrieval-Augmented Generation (RAG) system I'm building for the specialized chatbot.

I found that simply dumping the vector search results into the prompt was diluting the answer quality. The "Needle in a Haystack" problem is real.

The Solution: I implemented a re-ranking step using a Cross-Encoder model (via `sentence-transformers`).

# Re-ranking step: a Cross-Encoder scores each retrieved chunk against the query,
# then only the top 3 go into the prompt. vector_db and llm are my own wrappers; the model choice is illustrative.
from sentence_transformers import CrossEncoder

reranker = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2")
docs = vector_db.search(query)  # raw vector-similarity hits (list of chunk strings)
hits = reranker.rank(query, docs, top_k=3, return_documents=True)
top_context = [hit["text"] for hit in hits]
response = llm.generate(query, context=top_context)

This added about 200ms of latency but improved the relevance of the answers by an order of magnitude. It’s a trade-off I’m willing to make.


Looking Ahead: Week 3 Objectives

Week 2 was about stabilization. Week 3 is about Automation.

I am tired of manually aggregating the data for these retrospectives. My goal for next week is to build a dashboard that pulls my GitHub commits, website analytics, and task completion rates into a single view automatically. I want to automate the "Retro" itself.
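
The GitHub side is the easy part. As a rough starting point (the repo name below is a placeholder, and this is not the dashboard code itself), pulling a week of commits from the GitHub REST API looks like this:

# Rough starting point: fetch the last 7 days of commit messages from the GitHub REST API.
# "my-username/my-repo" is a placeholder; a token is only needed for private repos or higher rate limits.
import datetime
import os
import requests

def weekly_commit_messages(owner: str = "my-username", repo: str = "my-repo") -> list[str]:
    since = (datetime.datetime.now(datetime.timezone.utc) - datetime.timedelta(days=7)).isoformat()
    headers = {"Accept": "application/vnd.github+json"}
    if token := os.getenv("GITHUB_TOKEN"):
        headers["Authorization"] = f"Bearer {token}"
    resp = requests.get(
        f"https://api.github.com/repos/{owner}/{repo}/commits",
        params={"since": since, "per_page": 100},
        headers=headers,
        timeout=10,
    )
    resp.raise_for_status()
    return [c["commit"]["message"] for c in resp.json()]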

The Roadmap:

  1. Deploy the "Retro-Dashboard" micro-SaaS.
  2. Open-source the RAG boilerplate I refined this week.
  3. Publish a video breakdown of the code behind the Content Repurposing Agent.

The hype is gone. Now we build.
