Data Engineer (PySpark, Azure, Databricks)
Added: Yesterday
Adecco
Role: Sr. Data Engineer (PySpark, Azure, Databricks)
Work type: Full remote (from Romania)
Collaboration type: employment contract or B2B contract
Role Objective
• Design, develop and maintain scalable data pipelines within a modern cloud-based data platform
• Contribute to the development of a Lakehouse architecture leveraging Azure technologies and Databricks for advanced analytics and AI-driven use cases
Key Responsibilities
• Design and implement scalable data pipelines using Microsoft Fabric
• Develop data processing and transformation logic using Python, PySpark and SparkSQL
• Work with OneLake and Delta Lake concepts to support modern Lakehouse data architectures
• Develop and support solutions using Cosmos DB (NoSQL API)
• Contribute to Microsoft Fabric workloads, including Data Engineering, Data Factory (Dataflow Gen2) and Lakehouse
• Build and optimize Spark workloads using Databricks
• Implement CI/CD pipelines and follow DevOps best practices
• Integrate data solutions with Power BI for reporting and analytics
• Collaborate with AI, data science and product teams to enable data-driven and AI-powered solutions
• Ensure data quality, performance, reliability and security across data platforms
• Participate in Agile ceremonies and contribute to sprint deliveries
• Support production environments and contribute to continuous improvements
Technical Skills
• 5+ years of experience in Data Engineering or related engineering roles
• Strong hands-on experience with Microsoft Fabric
• Proficiency in Python
• Solid experience with PySpark and SparkSQL
• Experience with batch data processing
• Experience with Spark Streaming
• Hands-on experience with OneLake / Delta Lake (open lakehouse concepts)
• Knowledge of Dataflow Gen2 and the M language
• Experience with CI/CD pipelines (Azure DevOps or equivalent)
• Good understanding of Azure services
• Experience integrating data solutions with Power BI
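For the CI/CD requirement, a minimal Azure DevOps pipeline for a Databricks-backed project might look like the sketch below. Stage names, paths and the deployment step are assumptions for illustration (the deploy step assumes Databricks Asset Bundles are in use), not a prescribed setup.

```yaml
# Illustrative Azure DevOps pipeline: lint and test on every push; deploy from main.
trigger:
  branches:
    include: [main]

pool:
  vmImage: ubuntu-latest

steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: "3.11"

  - script: |
      pip install -r requirements.txt
      ruff check src/
      pytest tests/
    displayName: Lint and test

  # Assumes the repo is structured as a Databricks Asset Bundle.
  - script: databricks bundle deploy --target prod
    displayName: Deploy to Databricks
    condition: and(succeeded(), eq(variables['Build.SourceBranchName'], 'main'))
```

Keeping tests in the same pipeline as deployment means a broken transformation never reaches the production workspace.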
Nice-to-Have Skills
• Experience with code generation, including non-AI and AI-assisted approaches
• Expertise with Cosmos DB and its alternative APIs (MongoDB, Cassandra, Table)
• Exposure to Azure AI Foundry
• Experience with Data Science workflows
• Strong background in Big Data and Spark ecosystems
• Knowledge of financial instruments and financial services data
• Hands-on experience with industry-standard LLMs (including GPT, Claude, or similar)
Qualifications
• University degree in Computer Science, Engineering, Information Systems or related field
• Strong understanding of modern data platforms and big data ecosystems
• Experience working in Agile development environments
• Ability to work independently and collaboratively in distributed teams
Competencies
• Strong analytical and problem-solving skills
• Excellent communication and collaboration abilities
• Results-oriented mindset
• Adaptability and continuous learning mindset
• Attention to detail and quality