Job Description
Job Title: IT Engineer Data Engineering
Company Name: ContiTech
Company Description:
Continental develops pioneering technologies and services for sustainable and connected mobility of people and their goods. Founded in 1871, the technology company offers safe, efficient, intelligent, and affordable solutions for vehicles, machines, traffic, and transportation. In 2023, Continental generated sales of €41.4 billion and currently employs around 200,000 people in 56 countries and markets.
Guided by the vision of being the customer's first choice for material-driven solutions, the ContiTech group sector focuses on development competence and material expertise for products and systems made of rubber, plastics, metal, and fabrics. These can also be equipped with electronic components in order to optimize them functionally for individual services. ContiTech's industrial growth areas are primarily energy, agriculture, construction, and surfaces. In addition, ContiTech serves the automotive and transportation industries as well as rail transport.
The IT Digital and Data Services Competence Center of ContiTech caters to all Business Areas in ContiTech and is responsible, among other things, for Data & Analytics, Web and Mobile Software Development, and AI.
The Data Services team specializes in all platforms, business applications, and products in the domain of data and analytics, covering the entire spectrum including AI, machine learning, data science, data analysis, reporting, and dashboarding.
Job Description
* Design, implement, and maintain secure and scalable data ingestion pipelines from a wide variety of source systems, including SAP, Salesforce, SharePoint, APIs, and (legacy) manufacturing platforms.
* Build and enhance metadata-driven services that enable discoverability, access governance, and operational transparency of enterprise data.
* Serve as a technical expert and cross-functional enabler for structured and unstructured data acquisition, quality, and compliance.
* Establish and maintain holistic data quality management, monitoring and reporting.
* Contribute to a global data engineering team delivering to all major business domains.
* Drive ingestion and metadata service implementation for over 100 enterprise data sources.
* Collaborate across business IT, cybersecurity, infrastructure, and architecture teams to ensure secure and sustainable delivery.
Main Tasks:
▪ Build and maintain Python- or Scala-based extraction services (e.g., Debezium Server, custom APIs, rclone)
• Implement CDC, delta, and event-based patterns (a minimal delta sketch follows this task block).
• Provide push-based HTTP, Kerberos-authenticated DLT delivery.
• Establish, operate, and troubleshoot extraction from SAP using tools like Theobald Xtract Universal.
• Integrate with systems such as Salesforce, SharePoint, and other API- or file-based endpoints.
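To make the delta pattern above concrete, here is a minimal Python sketch: it polls a source table for rows changed since a persisted watermark and pushes them to an HTTP ingestion endpoint. The table name `orders`, the watermark file, and the endpoint URL are illustrative assumptions, not part of the actual stack.

```python
"""Watermark-based delta extraction sketch (illustrative assumptions only)."""
import json
import sqlite3
from pathlib import Path

import requests

STATE_FILE = Path("watermark.json")            # hypothetical state location
INGEST_URL = "https://example.invalid/ingest"  # hypothetical HTTP endpoint


def load_watermark() -> str:
    """Return the last successfully delivered `last_modified` timestamp."""
    if STATE_FILE.exists():
        return json.loads(STATE_FILE.read_text())["last_modified"]
    return "1970-01-01T00:00:00"  # first run: take everything


def save_watermark(ts: str) -> None:
    STATE_FILE.write_text(json.dumps({"last_modified": ts}))


def extract_delta(conn: sqlite3.Connection, since: str) -> list[dict]:
    """Select only rows changed after the watermark (the 'delta')."""
    conn.row_factory = sqlite3.Row
    rows = conn.execute(
        "SELECT * FROM orders WHERE last_modified > ? ORDER BY last_modified",
        (since,),
    ).fetchall()
    return [dict(r) for r in rows]


def run_once(conn: sqlite3.Connection) -> None:
    batch = extract_delta(conn, load_watermark())
    if not batch:
        return  # nothing new since the last run
    # Push-based HTTP delivery; in a Kerberos-protected setup this request
    # would carry a negotiated SPNEGO token rather than run unauthenticated.
    requests.post(INGEST_URL, json=batch, timeout=30).raise_for_status()
    # Advance the watermark only after delivery succeeded (at-least-once).
    save_watermark(batch[-1]["last_modified"])
```

Advancing the watermark only after a successful delivery yields at-least-once semantics, the usual trade-off for a pull-then-push delta service of this kind.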
▪ Establish and maintain a business-friendly, web-accessible data catalog application, with dataset profiles, metadata, and usability features.
▪ Integrate dataset discoverability, preview/exploration options, and lineage information using Unity Catalog as a backend metadata system (see the metadata sketch below).
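As a rough illustration of Unity Catalog as the backend metadata system, the sketch below lists the tables of one schema through the Databricks REST API and reduces them to lightweight profile entries for a web catalog front end. The workspace host, the token in an environment variable, and the profile shape are assumptions for illustration.

```python
"""Sketch: pull table metadata from Unity Catalog for a catalog-app backend.
Assumes the Databricks REST endpoint /api/2.1/unity-catalog/tables and a
personal access token in DATABRICKS_TOKEN; all names are placeholders."""
import os

import requests

HOST = "https://adb-0000000000000000.0.azuredatabricks.net"  # placeholder


def list_table_profiles(catalog: str, schema: str) -> list[dict]:
    """Return dataset profiles suitable for a business-friendly catalog UI."""
    resp = requests.get(
        f"{HOST}/api/2.1/unity-catalog/tables",
        headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
        params={"catalog_name": catalog, "schema_name": schema},
        timeout=30,
    )
    resp.raise_for_status()
    return [
        {
            # The fully qualified name doubles as a stable dataset id.
            "dataset": f"{catalog}.{schema}.{t['name']}",
            "description": t.get("comment", ""),
            "owner": t.get("owner", ""),
            "columns": [c["name"] for c in t.get("columns", [])],
        }
        for t in resp.json().get("tables", [])
    ]
```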
▪ Design and implement structured access request workflows including request submission, approval chains, audit trail, and enablement triggers (see the workflow sketch after this block).
• Perform design reviews with Cybersecurity.
• Ensure documentation and compliance for all interfaces and data ingress points.
• Manage audit and traceability requirements.
• Collaborate closely with IT and business users to translate requirements into scalable technical patterns.
• Serve as the technical escalation point for complex source integrations.
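The access request workflow above can be pictured as a small state machine with an append-only audit trail. The sketch below is a deliberately storage-free, hypothetical illustration; the states, class names, and enablement hook are all assumptions.

```python
"""Sketch of an access request workflow: submission -> approval chain ->
enablement trigger, with an append-only audit trail (illustrative only)."""
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class State(Enum):
    SUBMITTED = "submitted"
    ENABLED = "enabled"


@dataclass
class AccessRequest:
    requester: str
    dataset: str
    approvers: list[str]                       # ordered approval chain
    state: State = State.SUBMITTED
    audit_trail: list[dict] = field(default_factory=list)

    def _log(self, actor: str, action: str) -> None:
        """Append-only audit record covering traceability requirements."""
        self.audit_trail.append({
            "ts": datetime.now(timezone.utc).isoformat(),
            "actor": actor,
            "action": action,
        })

    def approve(self, approver: str) -> None:
        if self.state is not State.SUBMITTED:
            raise ValueError(f"cannot approve in state {self.state}")
        if approver != self.approvers[0]:
            raise PermissionError(f"{approver} is not the next approver")
        self.approvers.pop(0)
        self._log(approver, "approved")
        if not self.approvers:                 # chain exhausted: fire trigger
            self._enable()

    def _enable(self) -> None:
        # Enablement trigger: a real system would call the grant mechanism
        # (e.g. a catalog GRANT); this sketch only records the transition.
        self.state = State.ENABLED
        self._log("system", f"access to {self.dataset} enabled")


if __name__ == "__main__":
    req = AccessRequest("j.doe", "sales.orders", ["data_owner", "ciso"])
    req.approve("data_owner")
    req.approve("ciso")                        # last approval auto-enables
    print(req.state, req.audit_trail)
```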
▪ Define and implement a multi-layered data quality framework, including unit-level, integration-level, and cross-pipeline validation rules.
▪ Establish centralized and version-controlled storage of DQ rules, with integration into orchestration and CI/CD pipelines.
▪ Implement automatic DQ monitoring with severity levels (Critical, High, Medium, Low) and enable flagging, filtering, and quarantining logic at relevant stages of the pipeline (see the sketch after this list).
▪ Collaborate with source system owners and business stakeholders to define meaningful and actionable DQ thresholds.
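To make the severity and quarantining logic concrete, here is a minimal hypothetical sketch: declarative rules that could live in version control, evaluated per record, with Critical/High failures quarantined and Medium/Low failures merely flagged. The rule names, fields, and thresholds are invented for illustration.

```python
"""Minimal data-quality sketch: severity-ranked rules; Critical/High
failures are quarantined, Medium/Low failures are flagged and passed on.
All rules and field names here are invented for illustration."""
from dataclasses import dataclass
from enum import IntEnum
from typing import Callable


class Severity(IntEnum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3
    CRITICAL = 4


@dataclass(frozen=True)
class DQRule:
    name: str
    severity: Severity
    check: Callable[[dict], bool]   # True means the record passes


# In the framework described above, rules like these would be stored in
# version control and loaded by the orchestrator; these two are made up.
RULES = [
    DQRule("plant_code_present", Severity.CRITICAL,
           lambda r: bool(r.get("plant_code"))),
    DQRule("quantity_non_negative", Severity.MEDIUM,
           lambda r: r.get("quantity", 0) >= 0),
]


def apply_rules(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split records into (passed or merely flagged, quarantined)."""
    passed, quarantined = [], []
    for record in records:
        failures = [rule for rule in RULES if not rule.check(record)]
        record["dq_flags"] = [f"{r.name}:{r.severity.name}" for r in failures]
        if any(r.severity >= Severity.HIGH for r in failures):
            quarantined.append(record)   # blocked at this pipeline stage
        else:
            passed.append(record)        # flows onward, flags attached
    return passed, quarantined
```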
Qualifications
Degree in Computer Science, Data Engineering, or a related field. Azure or Databricks certification is a plus.
5–8 years in data engineering, with hands-on experience in ingesting structured and unstructured enterprise data into modern cloud platforms.
Proven implementation of source system ingestion frameworks, metadata automation, and compliance-controlled interfaces.
Leadership experience is not required; however, experience mentoring junior developers or leading implementation workstreams is a plus, as are contributions to engineering standards and code-quality improvement initiatives.
Comfortable working across geographies and time zones, collaborating effectively with global teams and enterprise stakeholders.