Middle Data Engineer (for Investor Services Solutions)
Posted: 2 days ago
Brightgrove Inc.
About the Client
Our customer is a leading investor services group employing 4,750+ people across 25 jurisdictions worldwide. They bring together a rare combination of global technical expertise and a deep understanding of their clients' needs, helping their clients and the sector stay compliant. They act as a guardian and facilitator of their clients' investments.
Project Details
Bridge current capacity for BAU data engineering: keep pipelines healthy, improve reliability, and free senior engineers to focus on roadmap and strategic work.
Initial engagement is 6 months with an option to extend.
Your Team
A small, high-performing group led by the Group Head of Data Transformation and the Head of Data Engineering. Tooling includes Snowflake, DBT, Python, and orchestration on Kubernetes. You will collaborate with DevOps and Data Science. The culture values ownership, clarity, and calm incident response.
What's in it for you
- Interview process that respects people and their time
- Professional and open IT community
- Internal meet-ups and resources for knowledge sharing
- Time for recovery and relaxation
- Bright online and offline events
- Opportunity to become part of our internal volunteer community
Responsibilities
- Build, monitor and support data pipelines that land data from varied sources into Snowflake
- Create and maintain DBT models, tests, documentation and environments
- Write clean Python for ELT and utilities, including packaging and simple CI/CD steps with Jenkins or similar
- Investigate and resolve BAU incidents in line with agreed priorities and handover notes
- Improve observability, data quality tests and runbooks
- Contribute concise PRs and reviews, follow coding standards and branching strategy
- Coordinate with DevOps on resource usage, secrets and deploys on Kubernetes
- Support knowledge transfer and keep documentation up to date
Skills
Must have
- 4+ years in data engineering with production experience in Python
- Strong DBT modeling with tests, exposures and environment management
- Solid Snowflake skills including performance basics, roles and warehouses
- Comfortable reading logs, tracing failures and fixing pipeline issues
- Clear English communication and pragmatic problem solving
Nice to have
- PySpark or Snowpark
- Dagster or another orchestrator
- Jenkins or similar CI/CD
- Experience with Azure or AWS storage and integrations