Data Quality Engineer
Dallas, TX
Hybrid (virtual + face-to-face)
Job Requirements:
Must-Have Skills
- Extensive experience as a Data Engineer with Python and cloud technologies (preferably AWS).
- Experience automating ETL processes/pipelines and AWS data and infrastructure with Python (see the sketch after this list).
- Extensive experience with AWS services such as S3, Athena, EMR, Glue, Redshift, Kinesis, and SageMaker.
- Extensive experience with SQL and Unix/Linux scripting.
- Development and testing experience with cloud/on-premises ETL technologies (Ab Initio, AWS Glue, Informatica, Alteryx).
- Experience with data migration from on-premises to cloud is a plus.
- Experience in large-scale application development testing across cloud/on-premises data warehouse, data lake, and data science environments.
- Extensive experience in the DevOps/DataOps space.
- Experience with data science platforms such as SageMaker, Machine Learning Studio, or H2O is a plus.
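The following is an illustrative sketch only (not part of the requirements) of the kind of Python-based AWS ETL automation described above: triggering an AWS Glue job with boto3 and polling until it reaches a terminal state. The region and job name are placeholders, and AWS credentials are assumed to be configured.

```python
"""Sketch: start an AWS Glue ETL job and wait for a terminal state."""
import time

import boto3

glue = boto3.client("glue", region_name="us-east-1")  # placeholder region


def run_glue_job(job_name: str, poll_seconds: int = 30) -> str:
    """Start a Glue job run and poll until it succeeds, fails, stops, or times out."""
    run_id = glue.start_job_run(JobName=job_name)["JobRunId"]
    while True:
        state = glue.get_job_run(JobName=job_name, RunId=run_id)["JobRun"]["JobRunState"]
        if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
            return state
        time.sleep(poll_seconds)


if __name__ == "__main__":
    print(run_glue_job("daily_sales_etl"))  # hypothetical job name
```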
Work Description: SDET (Python, AWS, Unix, and ETL)
- Work with business stakeholders, business systems analysts, and developers to ensure delivery of data applications.
- Build automation frameworks using Python.
- Design and manage data workflows using Python during development and deployment of data products.
- Design and develop reports and dashboards.
- Analyze and evaluate data sources, data volumes, and business rules.
- Be well versed in data flow and test strategy for cloud/on-premises ETL testing.
- Interpret and analyze data from various source systems to support data integration and data reporting needs.
- Test database applications to validate source-to-destination data movement and transformation (see the validation sketch after this list).
- Work with team leads to prioritize business and information needs.
- Develop and summarize data quality analyses and dashboards.
- Apply knowledge of data modeling and data warehousing concepts, with an emphasis on cloud/on-premises ETL.
- Execute data analytics and data integration testing on time and within budget.
- Troubleshoot and determine the best resolution for data issues and anomalies.
- Demonstrate a deep understanding of data architecture and data modeling best practices and guidelines for different data and analytics platforms.
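As referenced above, here is a minimal sketch of a source-to-target reconciliation check, written in a pytest style (pytest is an assumption; the posting does not name a test framework). sqlite3 in-memory databases stand in for the real source and target systems, and the `orders` table and its columns are hypothetical.

```python
"""Sketch: validate that target data matches source data after an ETL load."""
import sqlite3


def row_count(conn: sqlite3.Connection, table: str) -> int:
    """Count rows in a table."""
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]


def column_sum(conn: sqlite3.Connection, table: str, column: str) -> float:
    """A simple aggregate check; real pipelines often hash full rows instead."""
    return conn.execute(f"SELECT COALESCE(SUM({column}), 0) FROM {table}").fetchone()[0]


def test_source_matches_target():
    source = sqlite3.connect(":memory:")  # stand-in for the source system
    target = sqlite3.connect(":memory:")  # stand-in for the warehouse/lake
    for conn in (source, target):  # seed both sides with identical sample data
        conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
        conn.executemany("INSERT INTO orders VALUES (?, ?)",
                         [(1, 10.0), (2, 25.5), (3, 7.25)])
    assert row_count(source, "orders") == row_count(target, "orders")
    assert column_sum(source, "orders", "amount") == column_sum(target, "orders", "amount")
```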
Nice-to-Have Skills
- Experience using Jenkins and GitLab.
- Experience using both Waterfall and Agile methodologies.
- Experience testing storage systems such as S3 and HDFS (see the sketch after this list).
- Experience with one or more industry-standard defect or test case management tools.
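For illustration, a minimal sketch of the kind of S3 storage check implied above: confirming that a pipeline's expected output objects exist and are non-empty. The bucket and prefix are placeholders, and boto3 credentials are assumed to be configured.

```python
"""Sketch: verify that expected, non-empty objects landed under an S3 prefix."""
import boto3


def nonempty_keys(bucket: str, prefix: str) -> list[str]:
    """Return keys under a prefix whose objects have a non-zero size."""
    s3 = boto3.client("s3")
    # Note: list_objects_v2 returns at most 1,000 keys per call; fine for a sketch.
    response = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    return [obj["Key"] for obj in response.get("Contents", []) if obj["Size"] > 0]


if __name__ == "__main__":
    keys = nonempty_keys("example-data-lake", "curated/orders/")  # placeholder names
    assert keys, "no non-empty objects found under the expected prefix"
    print(f"Found {len(keys)} object(s)")
```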
Soft Skills
- Strong communication skills (regularly interacts with cross-functional team members).
- Takes ownership and completes tasks on time with minimal supervision.
- Guides developers and automation teams when issues arise.
- Monitors, reviews, and manages technical operations.
- Effective problem-solving, troubleshooting, code debugging, and root cause analysis skills.
Note: We are actively looking for Data Engineering (Python) profiles; testing experience is good to have, but strong data engineering skills are the priority.
Job Type: Contract
Pay: $62.74 - $65.00 per hour
Schedule:
- 8-hour shift
- Monday to Friday
Work Location: In person