Job Overview
We are seeking a skilled Data Engineer, experienced with SQL, Databricks, and AI, to join our team. Only US citizens need apply.
ESSENTIAL DUTIES AND RESPONSIBILITIES
Data Ingestion and Integration: Responsible for ingesting data from various sources into the data lakehouse. This includes handling structured and unstructured data and ensuring continuous data ingestion to meet real-time business needs.
Data Transformation and Processing: Transforming and processing ingested data to make it ready for use in AI, ML, data science, and analytics. This involves using Databricks to perform the transformations and ensuring the data is clean and usable.
Data Governance and Security: Implementing and managing data governance policies to ensure data quality, consistency, and security. This includes managing role-based access controls (RBAC) and ensuring data encryption at rest and in transit.
Performance Optimization: Optimizing data processing workflows and managing resources efficiently to maintain the performance of the data lakehouse environment. This includes configuring Azure resources to minimize operational costs while meeting performance requirements.
Collaboration and Communication: Working closely with other teams, such as DevOps, to manage integrations and deployments. Effective communication is crucial for coordinating efforts and ensuring smooth operations.
Maintenance and Support: Providing ongoing maintenance and support for the data lakehouse environment. This includes monitoring system performance, troubleshooting issues, and implementing updates and improvements.
Tier II Support: Providing Tier II support for the data lakehouse.
Other Duties: Perform other duties as assigned.
EDUCATION
Bachelor’s degree in a related field or the equivalent through a combination of education and related work experience.
EXPERIENCE
- 5+ years of related work experience; US citizenship is required.
- Certifications in Databricks and Azure data technologies preferred.
- Experience with Databricks core components such as DataFrames, Datasets, Spark SQL, Delta Lake, Databricks Notebooks, DBFS, and Databricks Connect.
- Strong proficiency in programming languages such as Python or Scala.
- Experience in implementing enterprise-scale data platforms as part of a collaborative team.
- Experience working in a fast-paced, collaborative, and team-based project environment.
- Extensive experience with SQL Server, including database design, stored procedures, and performance tuning.
- Hands-on experience using version control systems such as Git and CI/CD workflows and practices.
- Hands-on experience with Azure Data Factory (ADF) for building and deploying data pipelines.
- Familiarity with other Azure data services (e.g., Azure Data Lake Storage, Azure Synapse Analytics, Azure Databricks) is preferred.
- Strong understanding of data warehousing concepts and dimensional modeling.
- Experience with data quality management and data governance principles.
- Experience working in a shared service, hybrid environment.
LOCATION
Hybrid remote position with 2-3 days per month onsite in Lansing, Michigan; the remainder of the month is worked from home. Priority will be given to Michigan residents and candidates willing to relocate to Michigan. Fully remote positions are possible for out-of-state staff.
You can also apply at https://careers.ajboggs.com/careers
MORE ABOUT OUR BENEFITS
● Group Medical, Dental, Life, HSA/FSA, and Vision Insurance
● SIMPLE IRA accounts with an immediately vested 3% company match
● Paid company holidays and personal days
● Partial Internet and mobile phone expense reimbursement
● Professional development opportunities
● Great culture with excellent teams that are collaborative, hard-working, and innovative.
Job Type: Full-time
Pay: $95,000.00 - $101,000.00 per year
Benefits:
- Dental insurance
- Employee assistance program
- Flexible schedule
- Flexible spending account
- Health insurance
- Health savings account
- Life insurance
- Paid time off
- Parental leave
- Professional development assistance
- Referral program
- Relocation assistance
- Retirement plan
- Tuition reimbursement
- Vision insurance
Schedule:
- 8-hour shift
- Monday to Friday
Application Question(s):
- Are you a US citizen?
- How many years of experience do you have with Databricks technology?
- Do you need a relocation package to move to Michigan, or are you only interested in working out of state?
- What do you like to do for fun?
Work Location: Hybrid remote in Lansing, MI 48915