Job Description
We are looking for a Senior Data Architect & Program Delivery Lead who can operate confidently across two worlds: legacy enterprise data platforms (especially Informatica PowerCenter and Oracle) and modern AWS cloud data architecture.
This role is not a pure coding position and not a pure project management role. It requires someone who can lead technical architecture discussions with engineers in English, while also driving program governance, delivery oversight, and stakeholder alignment in a large-scale, multi-team, multi-vendor environment.
You will own the technical direction of the migration, define and validate the target architecture, unblock engineering teams, and ensure delivery stays on track through strong governance and execution discipline.
1) Technical Architecture Leadership (AWS + Legacy)
Design and validate cloud-native data architectures using AWS data services.
Analyze and decompose complex Informatica PowerCenter ETL workflows and Oracle data models to define migration patterns and sequencing.
Define the technical roadmap, drive architectural decision records (ADRs), and enforce engineering best practices and standards.
Review designs, code, and infrastructure changes to ensure scalability, reliability, and maintainability.
Troubleshoot high-impact technical issues across data pipelines, orchestration, and performance.
Act as technical owner of the stack, including PySpark, AWS Glue/EMR, S3, Redshift, Lake Formation, Informatica PowerCenter, Apache Iceberg, Terraform, SQL/Oracle, and Python.
2) Delivery Leadership and Team Management
Lead delivery execution across multiple agile teams, including planning, ceremonies, and delivery tracking.
Manage and guide teams of 9 to 15 engineers across different geographies and disciplines.
Coordinate dependencies across workstreams and ensure teams are aligned on priorities and implementation approach.
Drive quality standards and ensure deliverables meet expected technical and operational requirements.
Communicate clearly and credibly with engineers and non-technical stakeholders, translating complexity into decisions and actions.
3) Program Governance and Stakeholder Management
Own program governance across stakeholders, vendors, and subcontractors in a multi-party delivery model.
Manage evolving scope, requirements, and expectations, ensuring transparency and decision clarity.
Track progress against milestones, manage risks and blockers, and support resourcing and timeline planning.
Provide clear reporting and updates to senior leadership and governance forums.
Ensure coordination with partner organizations and accountability for third-party contributions.
Required Qualifications
Technical Background
3+ years of hands-on experience with Informatica PowerCenter in complex ETL environments.
Strong PySpark skills for production-grade data transformations.
Strong AWS data engineering knowledge, including Glue, EMR, S3, Redshift, and Lake Formation.
Experience with Apache Iceberg or similar modern table formats (Delta Lake, Hudi).
Infrastructure as Code experience, ideally with Terraform.
Strong SQL and data modeling skills (Oracle preferred).
Delivery and Leadership Experience
Proven track record leading large-scale data migrations involving multiple systems and high data volumes.
At least 5 years managing agile data engineering or analytics teams (5 to 15 people).
Experience working in multi-vendor or subcontractor delivery models.
Strong experience leading distributed, international teams.
Demonstrated ability to communicate technical concepts clearly to non-technical audiences.
Communication and Soft Skills
Strong English communication skills (spoken and written), comfortable leading architecture discussions with engineers.
Proactive, solutions-oriented, and confident managing ambiguity and complexity.
Strong stakeholder management and ability to influence decisions across senior audiences.
Structured approach to governance, reporting, and delivery accountability.
Preferred Qualifications
Experience with Azure or GCP.
Exposure to data quality frameworks and tools.
Familiarity with BI and analytics tools (Power BI, Tableau, Qlik).
Knowledge of DevOps practices (CI/CD, GitHub Actions).
Experience with data governance or cataloging solutions.
Experience migrating or extracting data from SAP.
Enterprise data warehousing background (dimensional modeling).
Domain experience in manufacturing or supply chain data is a plus.
Certifications (Nice to Have)
AWS Solutions Architect (Associate or Professional).
SAFe, PRINCE2, or PMP.