Job Description
Company Description
ThreatXIntel is a growing cybersecurity, IT staffing, and consulting company delivering end-to-end technology and security solutions. Our services include cloud security, web and mobile application security testing, DevSecOps, vulnerability assessments, IT consulting, and professional staffing. We support global corporate clients by hiring and deploying skilled professionals across IT and cybersecurity domains while helping organizations strengthen security, optimize operations, and scale efficiently. ThreatXIntel is committed to enabling business growth through secure, reliable, and high-quality technology solutions.
Role Overview
We are seeking a Freelance Data Pipeline Engineer with strong expertise in Vector to design and implement scalable, modular, and reusable data flow pipelines for large-scale security telemetry environments. The consultant will be responsible for building platform-agnostic ingestion frameworks capable of handling multi-source telemetry data and integrating with downstream analytics platforms such as Snowflake, Splunk, Azure Data Explorer (ADX), and Log Analytics. This role requires hands-on experience with Vector-based data ingestion pipelines, schema normalization using OCSF (Open Cybersecurity Schema Framework), and advanced data transformation and enrichment across large-scale telemetry sources.
Key Responsibilities
Data Pipeline Architecture
Design and implement scalable data ingestion pipelines using Vector
Build modular and reusable ingestion frameworks
Handle ingestion from multiple sources including:
Syslog
Kafka
HTTP
Azure Event Hubs
Blob Storage
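As a rough illustration of the kind of work involved (not part of the role description), a minimal Vector configuration wiring several of these sources into a single pipeline might look like the sketch below. All addresses, topics, and component names are placeholders.

```toml
# Hypothetical Vector config sketch -- addresses and topics are placeholders.

# Syslog over TCP
[sources.syslog_in]
type    = "syslog"
address = "0.0.0.0:514"
mode    = "tcp"

# Kafka consumer
[sources.kafka_in]
type              = "kafka"
bootstrap_servers = "kafka.example.internal:9092"
group_id          = "vector-telemetry"
topics            = ["security-events"]

# Generic HTTP ingest endpoint
[sources.http_in]
type    = "http_server"
address = "0.0.0.0:8080"

# Downstream sink (console shown as a stand-in for Splunk/Snowflake/ADX)
[sinks.debug_out]
type           = "console"
inputs         = ["syslog_in", "kafka_in", "http_in"]
encoding.codec = "json"
```

In a production framework, each source block would typically live in its own reusable module so new telemetry feeds can be onboarded without touching existing ones.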
Data Processing & Transformation
Implement data transformation logic including filtering, enrichment, and dynamic routing
Support format transformations such as:
JSON
CSV
XML
Logfmt
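To sketch what such transformation logic can look like in Vector's remap language (VRL), the fragment below parses JSON with a logfmt fallback, enriches each event, and routes by severity. Component names, field names, and routing conditions are illustrative assumptions, not requirements from this posting.

```toml
# Hypothetical remap transform: parse, enrich, and tag events (VRL).
[transforms.normalize]
type   = "remap"
inputs = ["kafka_in"]            # assumed upstream source name
source = '''
# Try JSON first, then fall back to logfmt
structured, err = parse_json(.message)
if err != null {
  structured, err = parse_logfmt(.message)
}
if err == null && is_object(structured) {
  . = merge(., object!(structured))
}

# Enrichment: tag each event with pipeline metadata
.pipeline    = "telemetry-v1"
.ingested_at = now()
'''

# Hypothetical dynamic routing by severity
[transforms.by_severity]
type   = "route"
inputs = ["normalize"]

[transforms.by_severity.route]
high = '.severity == "critical" || .severity == "high"'
low  = '.severity == "low"'
```

Downstream sinks can then subscribe to `by_severity.high` or `by_severity.low` independently.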
Schema & Data Governance
Implement schema normalization using the Open Cybersecurity Schema Framework (OCSF)
Build field mapping templates and schema validation logic
Ensure governance and security compliance for telemetry pipelines
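A field-mapping template of this kind could also be expressed as a VRL remap. The OCSF field names below (`class_uid`, `activity_id`, `severity_id`, `metadata`) come from the published OCSF schema, but the class choice, source fields, and mapping values are illustrative assumptions only.

```toml
# Hypothetical OCSF normalization template (VRL remap).
[transforms.ocsf_map]
type   = "remap"
inputs = ["normalize"]           # assumed upstream transform
source = '''
event = {}
event.class_uid   = 4002         # illustrative OCSF class (HTTP Activity)
event.activity_id = 1
event.time        = to_unix_timestamp(now())   # real mapping would parse the source timestamp
event.severity_id = if .severity == "high" { 4 } else { 1 }
event.metadata    = { "version": "1.1.0", "product": { "name": "example-source" } }
event.raw_data    = encode_json(.)             # preserve the original for lineage
. = event
'''
```

Keeping templates like this per source family is one way to make the mapping layer reusable and auditable.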
Security & Data Integrity
Implement SSL/TLS security controls and client authentication
Maintain data lineage, metadata tagging, and correlation IDs
Ensure minimal data loss, duplication, or transformation drift
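For reference, Vector exposes TLS and client-certificate options on most sources and sinks; a sketch of a TLS-enabled syslog source follows, with all file paths as placeholders.

```toml
# Hypothetical TLS-enabled syslog source with client certificate verification.
[sources.syslog_tls]
type    = "syslog"
address = "0.0.0.0:6514"
mode    = "tcp"

[sources.syslog_tls.tls]
enabled            = true
crt_file           = "/etc/vector/certs/server.crt"
key_file           = "/etc/vector/certs/server.key"
ca_file            = "/etc/vector/certs/ca.crt"
verify_certificate = true   # require client certs signed by the trusted CA
```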
Observability & Monitoring
Integrate pipeline monitoring and anomaly detection
Implement logging for pipeline failures and transformation errors
Support observability platforms and operational monitoring
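Vector's own health can be monitored through its internal metrics; a minimal self-monitoring sketch exposing them to a Prometheus-compatible scraper is shown below (the port is a placeholder).

```toml
# Hypothetical self-monitoring setup: expose Vector's internal metrics.
[sources.vector_metrics]
type = "internal_metrics"

[sinks.prometheus]
type    = "prometheus_exporter"
inputs  = ["vector_metrics"]
address = "0.0.0.0:9598"
```

Dashboards and alerts built on these metrics (component errors, discarded events, buffer usage) are a common way to catch transformation failures and data loss early.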
Integration with Analytics Platforms
Deliver telemetry data into platforms such as:
Snowflake
Splunk
Azure Data Explorer (ADX)
Log Analytics
Anvilogic
Collaboration & Documentation
Work closely with security, analytics, and platform engineering teams
Maintain documentation for ingestion patterns, transformation libraries, and governance standards
Mandatory Skills
Vector (Data Pipeline Platform)
Security Telemetry Data Pipelines
Kafka / Event Streaming
Data Transformation & Enrichment
OCSF (Open Cybersecurity Schema Framework)
Data Pipeline Architecture
Snowflake / Splunk / ADX integrations
Python / Groovy / JavaScript scripting
Data Governance & Schema Normalization
Observability & Pipeline Monitoring
Required Experience
7+ years of experience in Data Engineering or Security Data Platforms
Strong hands-on experience building Vector-based data pipelines
Experience managing large-scale telemetry ingestion (100+ data sources)
Experience integrating with security analytics platforms
Experience designing scalable ingestion frameworks
Ready to Apply?
Don't miss this opportunity! Apply now and join our team.
Job Details
Posted Date:
March 17, 2026
Job Type:
Technology
Location:
India
Company:
ThreatXIntel