Required Qualifications
- 5+ years of experience working with Hadoop and its ecosystem
- Proficiency in Hadoop and SQL
- Familiarity with big data frameworks like Apache Spark and Kafka
- Strong understanding of Linux/Unix systems and shell scripting
- Knowledge of data security practices in Hadoop environments
- Excellent problem-solving and communication skills
- Ability to handle multiple tasks, lead the team through delivery, and adapt to a constantly changing environment
- Ability to learn quickly and work with minimal supervision.
Job Type: Contract
Pay: $62.00 - $65.00 per hour
Experience:
- Big Data: 5 years (Preferred)
- HIVE/Spark: 5 years (Preferred)
- ETL: 10 years (Required)
- Hadoop: 5 years (Required)
- SQL: 5 years (Required)
Ability to Commute:
- Charlotte, NC 28202 (Required)
Ability to Relocate:
- Charlotte, NC 28202: Relocate before starting work (Required)
Work Location: In person