STRATEGIC STAFFING SOLUTIONS HAS AN OPENING!
This is a Contract Opportunity with our company that MUST be worked on a W2 only; there is no C2C eligibility for this position. Visa sponsorship is available. The details are below.
“Beware of scams. S3 never asks for money during its onboarding process.”
Job Title: Senior Data Engineer
Contract Length: 18+ months
Location: WEST DES MOINES, IA 50266
Work Schedule: 3 days onsite / 2 days remote
Pay: $42.00 - $46.00 per hour on W2
The Data Engineer will design, develop, and maintain data pipelines and ETL/ELT workflows supporting batch and real-time data processing. This role focuses on building scalable data infrastructure, implementing modern data storage solutions, and ensuring reliable data delivery for reporting and downstream applications.
Key Responsibilities
- Design and develop ETL/ELT workflows and data pipelines for batch and real-time processing.
- Build and maintain data pipelines for reporting and downstream applications using open source frameworks and cloud technologies.
- Implement operational and analytical data stores leveraging Delta Lake and modern database concepts.
- Optimize data structures for performance and scalability across large datasets.
- Collaborate with architects and engineering teams to ensure alignment with target state architecture.
- Apply best practices for data governance, lineage tracking, and metadata management, including integration with Google Dataplex for centralized governance and data quality enforcement.
- Develop, schedule, and orchestrate complex workflows using Apache Airflow, including designing and managing Airflow DAGs (a brief illustrative sketch follows this list).
- Troubleshoot and resolve issues in data pipelines to ensure high availability and reliability.
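For context, the Airflow work described above typically looks something like the minimal sketch below. The DAG name, schedule, and task callables are hypothetical placeholders (not part of any actual codebase for this role), and the example assumes Airflow 2.4 or later.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Pull the day's batch from a source system (placeholder logic).
    pass


def load_to_warehouse(**context):
    # Write transformed records to the analytical store (placeholder logic).
    pass


with DAG(
    dag_id="daily_orders_pipeline",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ keyword argument
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)

    # Run extract before load.
    extract >> load
```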
Required Technical Skills
- Strong understanding of data structures, data modeling, and lifecycle management.
- Hands-on experience designing and managing ETL/ELT data pipelines.
- Advanced PySpark skills for distributed data processing and transformation (see the sketch after this list).
- Experience implementing open table formats such as Apache Iceberg.
- Knowledge of the Hadoop ecosystem, including HDFS and Hive.
- Experience with cloud platforms, including GCP (BigQuery, Dataflow), Delta Lake, and Dataplex for governance and metadata management.
- Programming and orchestration using Python, Spark, and SQL.
- Strong experience with Apache Airflow, including authoring and maintaining DAGs for complex workflows.
- Strong understanding of relational and distributed database systems and reporting concepts.
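As a rough illustration of the PySpark and lakehouse skills listed above, here is a minimal sketch. The table paths, column names, and GCS bucket are hypothetical, and it assumes a Spark session configured with the Delta Lake connector.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_order_totals").getOrCreate()

# Read raw order events from a Delta table (path is a placeholder).
orders = spark.read.format("delta").load("gs://example-bucket/raw/orders")

# Transform to a reporting-friendly grain: completed orders by date and region.
daily_totals = (
    orders
    .where(F.col("status") == "COMPLETE")
    .groupBy("order_date", "region")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.countDistinct("order_id").alias("order_count"),
    )
)

# Write a partitioned Delta table for downstream reporting.
(
    daily_totals.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("gs://example-bucket/curated/daily_order_totals")
)
```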