Job description

We are looking for a highly skilled and proactive Senior Data Engineer to join our engineering team, with a primary focus on building Delta Lakehouse solutions within Microsoft Fabric. With hands-on experience in Azure Data Factory, Azure Databricks and Synapse Analytics, you will be instrumental in orchestrating the migration, integration and optimisation of our next-generation data platform.

Duties
  • Design and lead the implementation of robust, scalable Delta Lakehouse architectures within Microsoft Fabric.

  • Develop, optimise and maintain complex data ingestion, transformation and orchestration pipelines using Data Factory pipelines and other objects within Fabric, as well as Python/PySpark in advanced Notebooks, including CDC techniques, incremental updates, file manipulation and Delta logging, to facilitate the movement and preparation of data for analysis.

  • Collaborate closely with data architects, data scientists and business teams to translate functional requirements into efficient technical data solutions aligned with business strategy.

  • Implement and ensure best practices in data quality, data security and data governance.

  • Optimise the performance of data processing workloads in Fabric, Databricks and Data Factory.

  • Act as a technical reference and mentor for junior data engineers.

  • Continually research and evaluate new features and updates to Microsoft Fabric.

  • Actively participate in the definition and implementation of data engineering standards, patterns and methodologies.

Qualifications
  • Academic: degree in Engineering, Statistics or a related field.

  • English: Level B2 or higher

  • Certifications in the Microsoft environment (DP-700, DP-600, etc.).

Professional Experience
  • More than three (3) years of demonstrable experience as a Data Engineer.

  • In-depth, hands-on experience with Azure Data Factory (ADF).

  • Strong and proven experience with Azure Databricks/Synapse Analytics.

  • Demonstrable experience building and optimizing Delta Lakehouse architectures.

  • At least one (1) year of hands-on experience developing and implementing data pipelines within Microsoft Fabric.

  • Advanced knowledge of dimensional data modelling.

  • Experience in implementing ‘Medallion’ data architectures (Bronze, Silver, Gold layers) for data management and refinement in a Lakehouse.

  • Advanced SQL proficiency for data manipulation and analysis.

  • Familiarity with CI/CD (Continuous Integration/Continuous Delivery) principles.

  • Strong understanding of data modelling concepts, data warehousing and data architecture patterns.

  • Strong analytical, problem solving and communication skills, with ability to explain complex technical concepts to non-technical audiences.

Salary range

Commensurate with experience, starting from 45.000 €.

Data Engineer

  • Madrid
  • Full-time