Job Description
Data Engineer – AI & Web Scraping Automation (5+ Years)
Experience:
6–10 Years
Location:
Remote (Work from Home)
Mode of Engagement:
Full-time
No. of Positions:
2
Educational Qualification:
Bachelor’s degree in Computer Science, IT, or related field
Industry:
IT / Software Services / Data & AI
Notice Period:
Immediate Joiners Preferred
What We Are Looking For
Strong hands-on experience in Python-based web scraping and crawling using Requests, Scrapy, Selenium, and Playwright.
Deep hands-on experience working with JavaScript-heavy and dynamic websites using Selenium, Playwright, or similar browser automation frameworks.
Proven expertise handling large-scale enterprise data scraping across complex, high-traffic platforms.
Lead and mentor a web scraping team to build and scale enterprise-grade crawling solutions using Python, Requests, Scrapy, Selenium, and Playwright.
Strong understanding of HTTP protocols, cookies, sessions, headers, tokens, and browser storage (local/session storage).
Hands-on experience with proxy rotation, IP management, routers, rate limiting, fingerprinting, and anti-bot evasion techniques.
Ability to design and maintain scalable, reliable, and fault-tolerant crawling architectures.
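To illustrate the kind of proxy-rotation and session-handling work described above, here is a minimal sketch in Python using Requests. The proxy addresses, user-agent string, and retry policy are placeholder assumptions, not part of this role's actual infrastructure; production systems would source proxies from a provider and layer on fingerprinting countermeasures.

```python
import itertools
import requests

# Hypothetical proxy pool; real deployments pull these from a proxy provider.
PROXIES = ["http://proxy1:8080", "http://proxy2:8080", "http://proxy3:8080"]
proxy_cycle = itertools.cycle(PROXIES)

# A persistent session carries cookies and headers across requests.
session = requests.Session()
session.headers.update({"User-Agent": "Mozilla/5.0 (X11; Linux x86_64)"})

def fetch(url: str, retries: int = 3) -> requests.Response:
    """Fetch a URL, rotating to the next proxy on each failed attempt."""
    last_error: Exception | None = None
    for _ in range(retries):
        proxy = next(proxy_cycle)
        try:
            resp = session.get(
                url, proxies={"http": proxy, "https": proxy}, timeout=10
            )
            resp.raise_for_status()
            return resp
        except requests.RequestException as exc:
            last_error = exc  # rotate to the next proxy and retry
    raise RuntimeError(f"All {retries} attempts failed for {url}") from last_error
```

Keeping a single `Session` object is the key design choice here: it preserves cookies and headers the way a real browser session would, while the proxy changes per attempt.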
Responsibilities
Build scalable ETL pipelines using Python and AWS
Develop and maintain web scraping systems (Scrapy, Selenium, BeautifulSoup)
Integrate OpenAI / Azure OpenAI APIs for structured data extraction
Build workflow automation using n8n or similar tools
Implement orchestration using Dagster or Airflow
Manage structured storage using PostgreSQL, S3, Parquet
Lead and mentor junior engineers
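As a small illustration of the structured-storage step in the pipeline above, the sketch below loads scraped records into a relational table. SQLite stands in for the PostgreSQL layer named in the responsibilities so the example stays self-contained; the table schema and sample rows are invented for illustration.

```python
import sqlite3

# In-memory SQLite as a stand-in for PostgreSQL; the SQL is deliberately portable.
conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE pages (
        url        TEXT PRIMARY KEY,
        title      TEXT,
        fetched_at TEXT DEFAULT CURRENT_TIMESTAMP
    )
    """
)

# Hypothetical scraped records, e.g. the output of a Scrapy pipeline stage.
rows = [
    ("https://example.com/a", "Item A"),
    ("https://example.com/b", "Item B"),
]
conn.executemany("INSERT INTO pages (url, title) VALUES (?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM pages").fetchone()[0]
```

Using `url` as the primary key makes reloads idempotent at the database level, which matters when a crawler revisits the same pages.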
Qualifications
6–10 years of hands-on experience in web scraping and crawling using Python.
Strong practical knowledge of Requests, Scrapy, Selenium, and Playwright; Python, AWS (RDS/S3/EC2), the OpenAI API, and ETL design.
Proven experience scraping JavaScript-heavy, dynamic, and access-controlled enterprise websites.
Deep understanding of cookies, sessions, headers, proxies, IP rotation, routers, and anti-detection strategies.
Experience working with SQL/NoSQL databases for structured data storage.
Strong debugging, system design, and problem-solving skills.
Ability to independently own and scale scraping systems in a remote environment.
Ready to Apply?
Don't miss this opportunity! Apply now and join our team.
Job Details
Posted Date:
February 27, 2026
Job Type:
Technology
Location:
India
Company:
AIMLEAP