I've been a web developer for a decade. When I start something new, I go with my gut: npx create-next-app. It's familiar. It just works. Deploy to Vercel and you're live.
When I started building Reddit Toolbox - a marketing tool for indie makers - I did exactly that, no second thoughts. A slick dashboard in Next.js, Supabase wired in for the backend, and the scraping running inside serverless functions.
It ran just fine on localhost.
Then I deployed it to production, and almost nothing worked.
Here's what happens when you try to automate Reddit - or LinkedIn, or Twitter - from an ordinary web server:
You are not just "you." To Reddit's anti-spam systems, you are AWS us-east-1. You are Google Cloud, unknown-host. Your IP range is shared with thousands of bots, crawlers, and scrapers. A day after I launched, my test logins were being silently blocked. No spam sent, nothing suspicious done - the browser environment simply announced "this is headless Chrome, running in a server room."
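To make that concrete, here's a toy sketch of the kind of cheap checks an anti-bot layer can run before a request even reaches application code. This is my illustration, not Reddit's actual logic; the ASNs and headers are just examples:

```python
# Illustrative only - not Reddit's real detection logic.
DATACENTER_ASNS = {16509, 15169}  # AWS, Google Cloud (example ASNs)

def looks_automated(asn: int, user_agent: str, accept_language: str = "") -> bool:
    if asn in DATACENTER_ASNS:
        return True  # request originates from a cloud provider's IP range
    if "HeadlessChrome" in user_agent:
        return True  # headless builds announce themselves in the UA by default
    if not accept_language:
        return True  # real browsers virtually always send Accept-Language
    return False

# A scraper running in a serverless function trips all three at once:
print(looks_automated(16509, "Mozilla/5.0 ... HeadlessChrome/120.0.0.0", ""))  # True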
I tried rotating proxies. Then stealth plugins. Neither worked reliably, because a plain Node.js request leaves a TLS fingerprint that is trivially distinguishable from a real browser's.
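You can see this for yourself. Here's a quick sketch using curl_cffi, a Python library that impersonates browser TLS handshakes, against browserleaks' fingerprint-echo endpoint (the response field names are theirs and may change):

```python
# Sketch: compare the TLS fingerprint of a stock HTTP client with one
# that impersonates Chrome's handshake. JA3 is a common hash of the
# TLS ClientHello that anti-bot vendors match against known clients.
import requests as stock                     # pip install requests
from curl_cffi import requests as chromeish  # pip install curl_cffi

URL = "https://tls.browserleaks.com/json"  # echoes your TLS fingerprint back

plain = stock.get(URL).json()
impersonated = chromeish.get(URL, impersonate="chrome").json()

# The hashes differ: the stock client's handshake is identifiable as
# "script, not browser" no matter what User-Agent header it sends.
print("stock client JA3:", plain.get("ja3_hash"))
print("chrome-like JA3: ", impersonated.get("ja3_hash"))
```

And even impersonation only fixes one signal - the datacenter IP and the headless browser fingerprint are still there.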
That's when it clicked: to build a tool that genuinely protects its users' accounts, I couldn't ship it as a web app.
So I did something that felt like a step backward - I rewrote the whole thing from scratch as a desktop app, with Electron plus Python.
Why? Three reasons:
1. IP reputation: requests come from the user's own residential connection, not a datacenter range that every blocklist already knows.
2. Browser fingerprint: the automation drives a real, visible browser on the user's machine, not headless Chrome in a server room.
3. TLS fingerprint: the handshake belongs to an actual browser, not a Node.js HTTP client.
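In practice it looks roughly like this - a minimal sketch assuming Playwright for Python, with a hypothetical profile path. In my setup, Electron handles the UI and spawns a local Python worker like this one:

```python
# Minimal sketch of the desktop approach (assumes Playwright for Python;
# the profile path is hypothetical). The automation drives the user's
# locally installed Chrome, so traffic carries their residential IP,
# a genuine browser fingerprint, and Chrome's own TLS handshake.
from pathlib import Path
from playwright.sync_api import sync_playwright

PROFILE_DIR = Path.home() / ".reddit-toolbox" / "chrome-profile"  # hypothetical

with sync_playwright() as p:
    context = p.chromium.launch_persistent_context(
        user_data_dir=str(PROFILE_DIR),
        channel="chrome",  # use the installed Chrome, not bundled Chromium
        headless=False,    # a real, visible window - no headless tells
    )
    page = context.new_page()
    page.goto("https://www.reddit.com")
    # ...automation steps run here, inside a normal-looking local session...
    context.close()
```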
If you're building a CRUD app, stay on the web.
But if you're building anything near the gray zone of bots and scraping, run it locally.
It's more work to build, and shipping and updating binaries is genuinely annoying. But without it, you don't beat modern bot detection.
(I'm still refining this architecture, but if you want to poke around the beta, it's live here: Reddit Toolbox)