Job Title: Senior Data Engineer
Location: Phoenix, AZ
Reports To: Manager, Engineering
Department: Information Technology
Position Summary
We are seeking a highly skilled Senior Data Engineer to design, build, and optimize enterprise data platforms that enable advanced analytics and AI initiatives. This is a hands-on development role focused on implementing scalable, secure, and high-performing data solutions. The ideal candidate has deep expertise in data modeling, modern data architecture, and data governance, along with experience building data lakes, lakehouses, and data warehouses. Proficiency in Microsoft Fabric and familiarity with Power BI are essential.
Key Responsibilities
- Design and implement robust data models (conceptual, logical, physical) to support analytics and AI workloads.
- Architect and maintain data pipelines leveraging modern architecture concepts for managing structured and unstructured data.
- Establish and enforce data governance frameworks, including data quality, lineage, metadata management, and compliance.
- Build and manage data lakes, lakehouses, and warehouses using Microsoft Fabric and Azure services.
- Develop and optimize ETL/ELT processes for batch and real-time data ingestion.
- Plan and execute data migration strategies from SQL Server databases to Microsoft Fabric, ensuring data integrity and minimal disruption.
- Collaborate with data scientists, analysts, and business stakeholders to deliver high-quality datasets.
- Integrate data solutions with Power BI for reporting and visualization.
- Ensure compliance with cybersecurity, data privacy, and regulatory requirements.
- Define and enforce best practices for data performance, scalability, and maintainability.
- Implement CI/CD pipelines and automated testing for data workflows.
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- 5+ years of experience in data engineering roles.
- Strong hands-on experience with:
  - Data modeling, schema evolution, and pipeline development.
  - Microsoft Fabric and Azure Data Services.
  - Data migration, ingestion, and database programming.
- Knowledge of Medallion Architecture, event-driven architecture, data lakehouse concepts, and data governance principles.
- Experience with AI/ML data pipelines.
- Proficiency in Python, SQL, and Java for building scalable data solutions.
- Strong understanding of data structures and algorithms for efficient data processing and optimization.
- Knowledge of API development and integration for data ingestion and service-oriented architecture.
- Hands-on experience with CI/CD practices and automated testing for data pipelines.
- Understanding of streaming and event-driven architectures (e.g., Kafka, Kinesis) for real-time data processing.
- Familiarity with Power BI integration and optimization.
- Knowledge of SSRS (SQL Server Reporting Services) preferred.
- Proven experience working in Agile environments (Scrum/Kanban).
- Excellent communication and documentation skills.
Job Type: Full-time
Pay: $106,612.91 - $128,394.05 per year
Work Location: In person