Quantexa Engineer Job in Phoenix, AZ.

I’m writing from Cozen Tech, and I’m pleased to present an urgent job opportunity for a Quantexa Engineer in Phoenix, AZ.

Why Apply Now? Due to an immediate hiring need, qualified candidates who apply early will be fast-tracked through the hiring process. If your background and skills align with the role, we strongly encourage you to submit your resume promptly so we can move you to the next stage.

Job Title: Quantexa Engineer
Employment: Long-term Contract
Location: Phoenix, AZ
Skills: Cloud Data Engineering, Quantexa, ETL, GCP

Role Summary:
We are seeking an experienced Data Engineer with strong expertise in Google Cloud Data Services to design, build, and optimize large-scale data pipelines.
The ideal candidate will have a proven track record in ETL/ELT development, advanced analytics enablement, and team leadership, with a focus on delivering high-quality, scalable, and performant data solutions.

Key Responsibilities:

  1. Design & Development: Build and optimize end-to-end data pipelines to support advanced analytics and downstream platforms (e.g., Quantexa).
  2. Reusable Pipelines: Develop generic, reusable frameworks for ingesting and integrating both batch and incremental data.
  3. Data Preparation: Transform and prepare data for entity resolution, contextual analysis, and graph-based analytics aligned with Quantexa use cases.
  4. Data Migration: Execute TB-scale data migration and processing with efficiency and reliability.
  5. Quality & Governance: Ensure data quality, consistency, and lineage across pipelines to support analytics and decision-making platforms.
  6. Leadership: Lead and mentor the data processing team, driving best practices and ensuring scalable, high-performance delivery.

Required Skills & Experience:

  1. Experience: 6–9 years in data engineering or ETL/ELT development, including 4+ years of hands-on work with Google Cloud Dataflow.
  2. Google Cloud Expertise: Strong proficiency in Dataflow, Cloud Storage, BigQuery, Cloud Composer, Secret Manager, and Cloud Functions.
  3. Data Engineering Fundamentals: Solid understanding of ETL/ELT concepts, data modeling, and large-scale data processing.
  4. Pipeline Optimization: Proven ability to design and optimize pipelines for performance, scalability, and reusability.
  5. Analytics Enablement: Experience preparing data for entity resolution, contextual analysis, and graph-based analytics.
  6. Team Leadership: Demonstrated ability to lead data engineering teams and deliver high-quality outcomes.
  7. Data Governance: Strong focus on data quality, consistency, and lineage management.

Please fill in the Required Skill Matrix below:

| Skill Area | Relevant Experience (Years) | Expertise Level (1–10) |
| --- | --- | --- |
| Data Engineering / ETL-ELT Development | | |
| Google Cloud Dataflow | | |
| Google Cloud Storage | | |
| BigQuery | | |
| Cloud Composer | | |
| Secret Manager | | |
| Cloud Functions | | |
| ETL/ELT Concepts & Data Modeling | | |
| TB-scale Data Migration & Processing | | |
| End-to-End Data Pipeline Design & Optimization | | |
| Reusable Pipeline Development (Batch & Incremental) | | |
| Data Preparation for Entity Resolution & Contextual Analysis | | |
| Graph-based Analytics (Quantexa) | | |
| Team Leadership in Data Processing | | |
| Data Quality, Consistency & Lineage Management | | |

If your skill set, experience, and responsibilities align with the job description, we encourage you to submit your application by email to vignesh@cozentech.com.
