About Us
v4c.ai is an AI-powered platform designed to help businesses make smarter, faster decisions. We specialize in turning complex data into clear, actionable insights through automation, analytics, and intelligent workflows. Our mission is to simplify the way organizations harness AI so they can focus on what matters most—growth, innovation, and impact.
As a fast-growing company at the forefront of artificial intelligence, we value creativity, curiosity, and collaboration. At v4c.ai, you’ll have the opportunity to contribute to cutting-edge solutions while working in a dynamic, remote-first environment that encourages learning and growth.
About the Role
We’re looking for a Data Engineer to join our high-impact team. The ideal candidate is highly motivated and skilled in building and maintaining scalable data pipelines, developing ETL workflows, optimizing database and cloud infrastructure, and ensuring data quality and reliability across the organization. We’re seeking someone eager to learn and stay current with the data engineering tools and technologies that shape how we store, process, and leverage data. For the right person, this role offers real room to grow and to take their career to the next level.
Key Responsibilities
- Design, build, and maintain scalable data pipelines to support analytics, machine learning, and operational workloads.
- Develop and manage ETL/ELT processes to extract, transform, and load data from multiple sources.
- Architect, optimize, and manage data infrastructure, including data warehouses, data lakes, and cloud-based storage systems.
- Ensure data quality, integrity, and reliability through validation, monitoring, and automated testing.
- Implement data governance and security best practices, including access controls and compliance standards.
- Collaborate with data scientists, analysts, and engineering teams to understand data needs and support new product features.
- Integrate new data sources and develop pipelines to onboard third-party or internal datasets.
- Monitor performance and optimize databases and pipelines to improve efficiency, scalability, and cost-effectiveness.
- Build and maintain real-time or streaming data systems using technologies like Kafka, Kinesis, or Pub/Sub.
- Document data systems, pipelines, and architecture to support maintainability and knowledge sharing.
What We’re Looking For
- 0-2 years of experience in a data engineering role
- Bachelor's degree in Computer Science, Engineering, or a related field
- Proficiency in SQL and Python (or a comparable programming language)
- Excellent verbal and written communication skills
Why v4c.ai
- A collaborative, high-growth environment where your contributions matter
- Opportunities to learn directly from experienced professionals
- Flexible remote-first culture with team connections across locations
- A career launchpad into one of the fastest-growing industries today
Pay: $60,000 – $90,000 per year
Work Location: Remote