
Data Engineer Data Platform Team
This will be your role
Are you a highly motivated Data Platform Engineer who wants to help APG Asset Management become a data-driven company, advance our cloud transformation and Agile way of working, and grow into a sustainable Digital Investment Solutions Provider? Do you have relevant experience with data platform work in a highly demanding enterprise environment? Would you like to work on the large-scale digitalization of one of the biggest asset managers in Europe? Then read on.
The role
We are looking for a Data Platform Engineer to join the Data Platform Team in the Orchestration & Integration (O&I) department at APG Asset Management. The Data Platform Team has various responsibilities, including delivering fit-for-purpose and reliable tech solutions for our Data Office, providing Data Platform consumers with the means to monitor and control their components and content, and (re)creating data capabilities that accelerate consumers of the data platform.
In your role as Data Platform Engineer on the Data Platform Team, you will facilitate the consumption of components from our data platform. You will be responsible for building and optimizing the data infrastructure on which our consumers build their data solutions. You will work collaboratively with use case teams, IT specialists and various other stakeholders. You enjoy laying the technical groundwork to deliver information products. Your key responsibilities will include:
- Building scalable, reusable and future-proof patterns for the data platform.
- Centralizing and documenting business requirements and methodologies (rules, definitions, guardrails).
- Ensuring that the data organization in place is clean, compliant, reliable and prepared for any usage scenario that may arise.
- Keeping abreast of technological developments (in the cloud and beyond) and translating how these innovations can be applied to our data platform.
- Continuously optimizing the performance of the company’s data ecosystem.
The Data Platform Team is part of the Data Integration Team. Besides Data Integration, the O&I department also comprises an Application Integration Team, a Cloud Competence Centre and Generic Services. Our professional goal is to deliver integrated and seamless IT solutions that ensure a coherent and sustainable IT landscape and experience. Together, we are a technical enablement department that delivers value to the Asset Management organization. We make it possible for our internal DevOps and global engineering teams to focus on delivering business value for their customers.
What you would do
Your daily tasks will involve:
- Designing and implementing the data storage layer with Azure Data Lake Storage Gen2.
- Ingesting batch data using Azure Data Factory pipelines, datasets and data connections.
- Transforming data with Azure Databricks (Spark) notebooks using the Delta Lake/Parquet format.
- Implementing user authentication and authorization with Azure Active Directory (AAD), applying the principle of least privilege, securing data both at rest and in transit, and enforcing compliance controls for sensitive data.
- Monitoring, managing and optimizing data storage and data processing.
- Managing data backup and recovery, troubleshooting based on metrics and log data using Azure Monitor.
- Setting up and maintaining Azure DevOps repositories to facilitate version control and collaborative development.
- Sharing and transferring knowledge with team members and the broader organization.
- Providing support for testing activities and implementations.
What you bring
You think in terms of opportunities and are service-minded. Taking ownership and accepting responsibility come naturally to you. You have a pragmatic attitude and are motivated to deliver quality. You are willing to develop or improve your T-shaped skills. You also have:
- A bachelor’s degree in Computer Science/Engineering or comparable.
- 3-5 years’ relevant experience.
- Hands-on experience with Azure Databricks, Azure Data Factory and Azure DevOps.
- Hands-on experience with Bicep and Terraform templates in pipelines-as-code.
- Scripting knowledge (PowerShell, Azure CLI).
- Technical expertise in Data Engineering, Data Architecture, Python, PySpark and Parquet is considered a plus.
- An automate-first mindset and experience delivering via CI/CD.
- Familiarity with Agile practices and a willingness to be part of a DevOps team.
- Soft skills such as curiosity, resilience, autonomy, an innovative mindset and team spirit.
- You live in the Netherlands or nearby and are fluent in spoken and written English.
What we offer you
We offer you a gross salary of up to €6,000 per month (based on 40 hours) and great employment conditions aimed at flexibility, such as:
- A guaranteed end-of-year bonus of 8.33%
- Attention to your vitality and personal development
- Possibility to work from home 2-3 days per week
- And obviously a well-managed pension
Where you will work
For pension provider APG, pension is about people, life, and living together. With careful asset management, pension administration, communication and advice, we work on a livable future for current and future generations. One in which we share prosperity and well-being fairly and sustainably. Now and later.
APG is committed to around 4.5 million people in the Netherlands, which is why we believe it is important to be a reflection of Dutch society. This means that APG strives for an inclusive work environment, in which everyone can be themselves and where your unique qualities are embraced.
When you choose to work for APG, you choose for a job in which you contribute to a bright future. Apply today!
More information
Get in touch with Ed Rinkel via +31623703426 or ed.rinkel@apg.nl or Hiring Manager Heidi van de Loosdrecht via heidi.vande.loosdrecht@apg-am.nl who can tell you more about the specifics of the position.
Do you see your future at APG? Apply before 7-6-2024 and let yourself be seen.
Unsolicited recruitment-agency approaches in response to this vacancy are not appreciated.