Position : Data Engineer
Location : Cambridge / Luton, UK (Hybrid, 2-3 days onsite per week)
Duration : Long Term B2B Contract
Job Description :
The ideal candidate will have at least 5 years of experience working with Snowflake, DBT, Python, and AWS to deliver ETL / ELT pipelines.
- Proficiency in Snowflake data warehouse architecture; design, build, and optimize ETL / ELT pipelines using DBT (Data Build Tool) and Snowflake.
- Experience with DBT for data transformation and modeling; implement transformation workflows using DBT Core or DBT Cloud.
- Strong Python programming skills for automation and data processing; write scripts to automate workflows and optimize data processing tasks.
- Proficiency in SQL performance tuning and query optimization techniques using Snowflake.
- Troubleshoot and optimize DBT models and Snowflake performance.
- Knowledge of CI / CD and version control (Git) tools. Experience with orchestration tools such as Airflow.
- Strong analytical and problem-solving skills with the ability to work independently in an agile development environment.
- Ensure data quality, reliability, and consistency across different environments.
- Collaborate with other data engineers, data analysts, and business stakeholders to understand data needs and translate them into engineering solutions.
- Certification in AWS, Snowflake, or DBT is a plus.