I'm reaching out from Cozen Tech, and I'm pleased to present an urgent job opportunity for a Quantexa Engineer in Phoenix, AZ.
Why Apply Now? Due to an immediate hiring need, qualified candidates who apply early will be fast-tracked through the hiring process. If your background and skills align with the role, we strongly encourage you to submit your resume promptly so we can move your application to the next stage.
Job Title: Quantexa Engineer
Employment: Long-Term Contract
Location: Phoenix, AZ
Skills: Cloud Data Engineering, Quantexa, ETL, GCP
Role Summary:
We are seeking an experienced Data Engineer with strong expertise in Google Cloud Data Services to design, build, and optimize large-scale data pipelines.
The ideal candidate will have a proven track record in ETL/ELT development, advanced analytics enablement, and team leadership, with a focus on delivering high-quality, scalable, and performant data solutions.
Key Responsibilities:
- Design & Development: Build and optimize end-to-end data pipelines to support advanced analytics and downstream platforms (e.g., Quantexa).
- Reusable Pipelines: Develop generic, reusable frameworks for ingesting and integrating both batch and incremental data.
- Data Preparation: Transform and prepare data for entity resolution, contextual analysis, and graph-based analytics aligned with Quantexa use cases.
- Data Migration: Execute TB-scale data migration and processing with efficiency and reliability.
- Quality & Governance: Ensure data quality, consistency, and lineage across pipelines to support analytics and decision-making platforms.
- Leadership: Lead and mentor the data processing team, driving best practices and ensuring scalable, high-performance delivery.
Required Skills & Experience:
- Experience: 6–9 years in data engineering or ETL/ELT development, including 4+ years of hands-on work with Google Cloud Dataflow.
- Google Cloud Expertise: Strong proficiency in Dataflow, Cloud Storage, BigQuery, Cloud Composer, Secret Manager, and Cloud Functions.
- Data Engineering Fundamentals: Solid understanding of ETL/ELT concepts, data modeling, and large-scale data processing.
- Pipeline Optimization: Proven ability to design and optimize pipelines for performance, scalability, and reusability.
- Analytics Enablement: Experience preparing data for entity resolution, contextual analysis, and graph-based analytics.
- Team Leadership: Demonstrated ability to lead data engineering teams and deliver high-quality outcomes.
- Data Governance: Strong focus on data quality, consistency, and lineage management.
Please fill out the Required Skill Matrix below:
| Skill Area | Relevant Experience (Years) | Expertise Level (1–10) |
| --- | --- | --- |
| Data Engineering / ETL-ELT Development | | |
| Google Cloud Dataflow | | |
| Google Cloud Storage | | |
| BigQuery | | |
| Cloud Composer | | |
| Secret Manager | | |
| Cloud Functions | | |
| ETL/ELT Concepts & Data Modeling | | |
| TB-scale Data Migration & Processing | | |
| End-to-End Data Pipeline Design & Optimization | | |
| Reusable Pipeline Development (Batch & Incremental) | | |
| Data Preparation for Entity Resolution & Contextual Analysis | | |
| Graph-based Analytics (Quantexa) | | |
| Team Leadership in Data Processing | | |
| Data Quality, Consistency & Lineage Management | | |
If your skill set, experience, and background align with this job description, we encourage you to apply by emailing your resume to vignesh@cozentech.com.
