Description:
The Data Platform team at Deel is composed of senior engineers focused on improving data quality, data pipeline performance, the data development experience, and cost management across the entire data stack, from ingestion to outbound integrations (and everything in between). As a Senior Analytics Engineer on the Data Platform team, you will spend significant time shaping the future of our Snowflake data warehouse and ensuring it can scale with Deel's business growth.
We work cross-functionally with analysts, analytics engineers, data scientists, ML engineers, software engineers, and leadership to accomplish these goals.
As a Senior Analytics Engineer on our Data Platform team you will:
- Design and implement RBAC on Snowflake to meet all business and compliance requirements.
- Enforce data governance policies and practices to maintain data integrity, security, and compliance with global data privacy regulations.
- Design and implement scalable and efficient data models for our Core data mart.
- Develop and optimize SQL queries for analytical and reporting purposes.
- Improve data team productivity by contributing components to our data processing, transformation, orchestration, and messaging framework(s).
- Build data observability systems to provide near real-time visibility into errors, anomalous behavior, and late-arriving data.
- Build a world-class development environment and CI/CD system that will allow our systems to scale for years to come.
- Collaborate with cross-functional teams, such as data scientists, software engineers, and business stakeholders, to understand data platform needs.
- Mentor other Analytics Engineers on the team, help them grow, and ensure their work meets internal standards.
What you’ll need to be successful in this role:
- Proficiency with RBAC principles and their implementation in modern data warehouses.
- Proficiency in a data engineering tech stack:
  - 5+ years of experience writing complex, performant, readable SQL to process and transform large-scale data sets
  - Deep dbt Core experience required (experience building on top of dbt Core libraries would be awesome)
  - Experience with Snowflake (or another cloud data warehouse platform) desired
  - Experience with workflow management frameworks like Airflow, Luigi, Dagster, etc.
- You are proficient in Python and/or another data-centric language
- You have experience building highly observable data systems
- You are familiar with dimensional data modeling techniques and best practices
- You have an eye for detail, good data intuition, and a passion for data quality
- You appreciate the importance of great documentation and data debugging skills
- You are comfortable working in a rapidly changing environment with ambiguous requirements. You are nimble and take intelligent risks.