building data pipelines and browser automations that actually work
i work at pimatrix as a web scraping engineer — writing scrapers, building automation pipelines, and extracting structured data from the web at scale.
most of my work lives in the background. scrapers running at 3am, pipelines delivering clean data before a client's morning meeting, automations replacing hours of manual work. that's the job.
i work mostly in node.js. puppeteer and playwright for anything dynamic, cheerio for the fast stuff, crawlee and apify when it needs to scale.
what i build
- scrapers for e-commerce, real estate, google maps, job portals
- anti-bot handling — rotating proxies, fingerprint evasion, session management
- apify actors, ready to deploy and scale
- structured output in json, csv, or excel
- workflow automations — instagram, google drive, scheduled pipelines
stack
- node.js — runtime for everything
- puppeteer / playwright — dynamic, js-heavy pages
- cheerio — fast static html parsing
- crawlee + apify — orchestration, deployment, scale
shipped
- pimatrix portfolio — company site
- video splitter — split videos in the browser, no upload needed
- frame splitter — extract frames from any video file
- instachat extension — better instagram dms, chrome extension