Candidates must have permanent, unrestricted authorization to work and must not require visa sponsorship now or in the future (including OPT/CPT).
Key Responsibilities:
- Design and develop ETL/ELT workflows and data pipelines for batch and real-time processing.
- Build and maintain data pipelines for reporting and downstream applications using open-source frameworks and cloud technologies.
- Implement operational and analytical data stores leveraging Delta Lake and modern database concepts.
- Optimize data structures for performance and scalability across large datasets.
- Collaborate with architects and engineering teams to ensure alignment with target state architecture.
- Apply best practices for data governance, lineage tracking, and metadata management.
- Troubleshoot and resolve issues in data pipelines and ensure high availability and reliability.
Prerequisites:
- Bachelor’s Degree in Computer Science, Data Science, Information Technology, Engineering, or related fields.
- Hands-on professional experience with Python, PySpark, and SQL.
- Familiarity with ETL/ELT and database concepts.
- Exposure to GCP and BigQuery.
- Knowledge of CI/CD for data engineering.
- Strong problem-solving skills and eagerness to learn.
- Good communication and teamwork abilities.
Quintrix is an Equal Opportunity Employer and does not discriminate in employment on the basis of minority status, gender, disability, religion, LGBTQI status, age, or veteran status.
Pay: $35.00 - $38.00 per hour
Work Location: In person