I build custom Python automation scripts and web scraping pipelines — production-grade, reliable, and documented.
What you get:
• Custom Python scripts or packages tailored to your workflow
• Scheduled automation (cron, Windows Task Scheduler, or cloud-based)
• Structured data output: CSV, JSON, Excel, or direct database insert
• Error handling, logging, retry logic, and monitoring (see the retry sketch after this list)
• Clear documentation so your team can maintain it
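To show what that looks like in practice, here is a minimal sketch of the retry-and-logging pattern I build into deliveries. The function name, retry limits, and backoff values are illustrative defaults, not lifted from any specific project:

```python
import logging
import time

import requests

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("pipeline")

MAX_RETRIES = 3        # illustrative defaults, tuned per project
BACKOFF_SECONDS = 2

def fetch_page(url: str) -> str:
    """Fetch a URL with retries, growing backoff, and logged failures."""
    for attempt in range(1, MAX_RETRIES + 1):
        try:
            resp = requests.get(url, timeout=10)
            resp.raise_for_status()
            return resp.text
        except requests.RequestException as exc:
            log.warning("attempt %d/%d for %s failed: %s", attempt, MAX_RETRIES, url, exc)
            if attempt == MAX_RETRIES:
                raise          # surface the error after the last attempt
            time.sleep(BACKOFF_SECONDS * attempt)   # wait longer each retry
```

The same idea scales up to per-job monitoring and alerting; the point is that transient failures get retried and permanent ones get logged and surfaced instead of silently dropped.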
What I automate:
• Web scraping — static sites, JavaScript-rendered pages (Playwright, Selenium), and APIs (see the Playwright sketch after this list)
• Data extraction and transformation (ETL pipelines)
• Report generation — daily/weekly automated reports from multiple data sources
• API integrations — connecting systems that don't natively talk to each other
• File processing — bulk PDF parsing, Excel manipulation, image handling
• Messaging automation, form filling, and data entry
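As an illustration of the scraping side, here is a short sketch of pulling structured rows from a JavaScript-rendered page with Playwright and writing them out as CSV. The URL and CSS selectors are placeholders, not taken from a client project:

```python
import pandas as pd
from playwright.sync_api import sync_playwright

URL = "https://example.com/listings"   # placeholder target

def scrape_listings() -> list[dict]:
    """Render the page in a headless browser and pull out structured rows."""
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto(URL, wait_until="networkidle")   # wait for JS-rendered content
        rows = []
        for card in page.query_selector_all(".listing-card"):   # placeholder selector
            rows.append({
                "title": card.query_selector(".title").inner_text(),
                "price": card.query_selector(".price").inner_text(),
            })
        browser.close()
        return rows

if __name__ == "__main__":
    # Structured output, as promised above: a plain CSV the client can open anywhere.
    pd.DataFrame(scrape_listings()).to_csv("listings.csv", index=False)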
Real examples from my work:
• Scraping and scoring 1000+ job postings daily across 6 platforms with weighted matching
• Automated marketplace analytics pipeline with deduplication and HTML dashboards
• Multi-platform data aggregation with scheduled runs and alerting
• Profile audit tools that detect content issues and generate fix reports
Tech stack: Python, BeautifulSoup, Scrapy, Playwright, Selenium, pandas, requests, FastAPI, SQLite/PostgreSQL, scheduling (APScheduler, cron).
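On the scheduling side, a minimal sketch of how an APScheduler-driven run can look; the daily 06:00 trigger is just an example schedule, and run_pipeline stands in for the real scrape/transform/export steps:

```python
from apscheduler.schedulers.blocking import BlockingScheduler

def run_pipeline():
    # In a real delivery this would call the scrape -> transform -> export steps.
    print("pipeline run")

scheduler = BlockingScheduler()
# Daily at 06:00 as an example; any cron-style schedule works the same way.
scheduler.add_job(run_pipeline, "cron", hour=6, minute=0)
scheduler.start()
```

The same job can just as easily be wired to system cron, Windows Task Scheduler, or a cloud scheduler, depending on where it needs to run.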
I write clean, maintainable code with proper structure — not throwaway scripts. Every delivery includes error handling, logging and documentation.