- Working knowledge of Snowflake/Databricks fundamentals and how to use them effectively in practice.
- In-depth knowledge of Databricks/Spark concepts and common patterns for building and operating data pipelines.
- Knowledge of data formats, Delta tables, and Apache Iceberg (table semantics, schema evolution, ACID/time travel, partitioning/compaction).
- Ability to provide concrete approaches to improving project efficiency (performance tuning, cost optimization, scaling strategies), a key requirement for this role.
Job Type: Contract
Pay: $60.00 - $62.00 per hour
Work Location: In person