Job Description
We are looking for a Freelance Data Architect to join a team of rockstar developers. Candidates should have a minimum of 12 years of experience.
There are multiple openings. If you're looking for a freelance/part-time opportunity (alongside your day job) and a chance to work with the top 0.1% of developers in the industry, this one is for you! You will report to IITians/BITS grads with 10+ years of development experience and work with Fortune 500 companies (our customers).
Company Background
- We are a multinational software company growing at a fast pace, with offices in Florida and New Delhi. Our clientele spans the US, Australia, and APAC. To give you a sense of our growth rate: we've added 70+ employees in the last 6 weeks alone and expect another 125+ by the end of Q4 2025.
Key Responsibilities
Data Architecture & Strategy
Define and implement enterprise data architecture strategies leveraging GCP (BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage) and Snowflake.
Design scalable, high-performance data models, warehouses, and lakes to support analytics and AI/ML initiatives (a brief illustration follows this list).
Establish best practices for data governance, security, and compliance (GDPR, CCPA, HIPAA).
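To make the data-modeling expectation concrete, here is a minimal sketch using the google-cloud-bigquery Python client that defines a partitioned, clustered fact table of the kind this role would design; the project, dataset, and field names are illustrative assumptions, not details from this posting.

```python
# Hypothetical sketch: a partitioned, clustered BigQuery fact table
# supporting a scalable analytics model. All names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # assumed project ID

table = bigquery.Table(
    "example-project.analytics.fact_events",  # hypothetical table
    schema=[
        bigquery.SchemaField("event_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
        bigquery.SchemaField("payload", "JSON"),
    ],
)
# Partition by event time and cluster by customer so queries scan only
# the partitions they need, keeping both latency and cost down.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_ts"
)
table.clustering_fields = ["customer_id"]

client.create_table(table, exists_ok=True)
```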
Cloud Expertise
Develop and manage GCP-based data pipelines using Dataflow, Dataproc, Cloud Composer (Airflow), and BigQuery.
Architect and optimize Snowflake environments (virtual warehouses, data sharing, Snowpipe, Snowpark).
Implement CDC, real-time streaming (Pub/Sub), and batch processing solutions (see the pipeline sketch after this list).
Optimize query performance, cost efficiency, and storage strategies in BigQuery & Snowflake.
Design and automate ETL/ELT workflows using GCP services and Informatica.
Integrate multi-source data (SQL/NoSQL, APIs, IoT, logs) into GCP & Snowflake.
Ensure data quality, lineage, and metadata management using tools like Data Catalog, Collibra, or Alation.
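As an illustration of the streaming work referenced above, below is a minimal Apache Beam (Dataflow) sketch that reads CDC events from Pub/Sub and appends them to BigQuery; the topic and table names are hypothetical assumptions.

```python
# Hypothetical sketch: streaming CDC events from Pub/Sub into BigQuery
# with Apache Beam. Topic, table, and project names are placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # keep the Pub/Sub read unbounded

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadCDC" >> beam.io.ReadFromPubSub(
            topic="projects/example-project/topics/cdc-events"  # assumed topic
        )
        | "Parse" >> beam.Map(json.loads)  # Pub/Sub delivers bytes; parse to dicts
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "example-project:analytics.cdc_events",  # assumed table
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```

The same pipeline runs unchanged on Dataflow by passing `--runner=DataflowRunner`; locally it runs on the direct runner for testing.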
Collaboration & Leadership
Partner with Data Engineers, Analysts, and Business teams to align data solutions with organizational goals.
Evaluate emerging technologies (AI/ML, Lakehouse, Databricks) for continuous improvement.
Required Skills & Qualifications
12+ years in Data Architecture, Engineering, or Warehousing, with 3+ years in GCP & Snowflake.
Deep hands-on experience with:
GCP: BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, IAM
Snowflake: Architecture design, performance tuning, SnowSQL, Snowpark, data sharing.
SQL/Python for data transformation & automation (a Snowpark sketch follows this list)
ETL/ELT tools: dbt, Informatica, Fivetran
Data modeling: star schema, data vault, dimensional modeling
Strong understanding of data security, governance, and compliance.
Certifications preferred: GCP Professional Data Engineer, Snowflake SnowPro Advanced Architect.
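For a concrete sense of the SQL/Python transformation work, here is a minimal Snowpark (Python) sketch that pushes an aggregation down into a Snowflake virtual warehouse; the connection parameters and table/column names are placeholders, not details from this posting.

```python
# Hypothetical sketch: a Snowpark aggregation executed inside Snowflake.
# Credentials and object names below are placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

session = Session.builder.configs({
    "account": "<account>",      # placeholder connection parameters
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}).create()

# Roll raw orders up into a daily revenue summary table.
(
    session.table("RAW_ORDERS")                      # assumed source table
    .group_by(col("ORDER_DATE"))
    .agg(sum_(col("AMOUNT")).alias("DAILY_REVENUE"))
    .write.save_as_table("DAILY_REVENUE_SUMMARY", mode="overwrite")
)
```

Because Snowpark compiles these operations to SQL that runs in the virtual warehouse, the client never pulls RAW_ORDERS down locally, which is what makes this pattern scale.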
Preferred Skills
Experience with Databricks, Apache Spark, or Kafka for real-time analytics.
Familiarity with CI/CD for data pipelines (Git, Terraform, Jenkins).
What we need
- ~35 hours of work per week.
- 100% remote (from our side).
- You will be paid monthly.
- Minimum 12 years of experience.
- Please apply only if you currently have a 100% remote job.
- If you do well, this engagement will continue long-term.