Work location: Onsite in Glendale, CA; Santa Monica, CA; San Francisco, CA; or Seattle, WA.
Duration of Assignment: 12+ Months
W2 Only Position
JOB DESCRIPTION:
We are looking for a Senior Engineer to join our Advertising Data Platform Engineering group. This group powers core capabilities across the addressable ad ecosystem, including operational data infrastructure, audience solutions, inventory forecasting, and full-funnel measurement.
As a senior engineer, you will play a critical role in driving technical direction, setting engineering standards, and mentoring other developers. You will be expected to lead by example: designing scalable systems, writing high-quality code, and solving complex data and engineering challenges with autonomy and accountability. Your work will have a direct impact on the foundation and evolution of our ad platform.
The ideal candidate will bring deep expertise in big data systems and strong experience building scalable backend or full stack services. You will work closely with engineers, data scientists, product managers, and stakeholders to build foundational systems that enable the future of advertising.
Responsibilities:
- Design, build, and maintain scalable data platform components for both real-time and batch processing. Own the full software development lifecycle — from requirements gathering and design to implementation, testing, and deployment.
- Drive engineering best practices including code quality, performance optimization, automated testing, CI/CD, and system reliability.
- Define and evaluate technical architecture, contribute to system-level design discussions, and lead decision-making on key engineering trade-offs.
- Collaborate cross-functionally with product managers, program managers, SDETs, and data scientists to deliver impactful solutions aligned with business goals.
- Lead by example to foster an inclusive, high-performing engineering culture; provide technical guidance and mentorship to junior and mid-level engineers.
- Troubleshoot and resolve complex production issues, ensuring system performance, availability, and reliability.
- Continuously evaluate emerging technologies and contribute to innovation efforts across the organization.
Qualifications:
- 5+ years of professional programming experience in Scala, Python, or similar languages.
- 3+ years of big data development experience with technologies such as Spark, Flink, Airflow, SingleStore, Kafka, and AWS big data services.
- Deep understanding of data modeling, distributed systems, and performance optimization.
- Knowledge of system and application design and architecture.
- Experience building industry-grade, highly available, and scalable services.
- Passion for technology and openness to interdisciplinary work.
- Excellent communication and collaboration skills.
Preferred Qualifications:
- Hands-on experience with Databricks for development and deployment of data pipelines.
- Experience with data governance, compliance, or observability tooling.
- Demonstrated ability with cloud infrastructure technologies, including Terraform, Kubernetes (K8s), Spinnaker, IAM, and ALB.
- Experience with Snowflake, Kinesis, Lambda, etc.
- Experience with microservice frameworks such as Spring Boot, Spring Cloud, FastAPI, or NestJS.
Required Education:
- Bachelor's degree in a STEM field
- 5+ years relevant experience
Job Type: Contract
Pay: $80.00 - $89.00 per hour
Expected hours: 40 per week
Application Question(s):
- This is a W2 only position. Can you be considered on a W2 basis on day one of the assignment without sponsorship?
- Can you support an onsite work schedule in Glendale, CA; Santa Monica, CA; San Francisco, CA; or Seattle, WA?
Experience:
- Scala: 5 years (Required)
- Python: 5 years (Required)
- Spark: 3 years (Required)
- Apache Flink: 3 years (Required)
- Airflow: 3 years (Required)
- AWS Big Data: 3 years (Required)
- AWS Services: 3 years (Required)
- Databricks: 3 years (Required)
- Snowflake: 3 years (Required)
- Microservice Architecture: 3 years (Required)
- Data pipelines: 5 years (Required)
Work Location: In person