The Client:
KPN Wholesale — that's behind the "Chinese walls," right? Yes, in the past the rules were strict, but the legislation has since been relaxed, and we increasingly seek cooperation with other KPN departments. KPN has an open network policy, one of the key focal points in the 'Connected Networks' strategy. It means that parties other than KPN can use our network. KPN Wholesale acts as the provider toward these customers, serving as KPN's wholesale division. And it's bearing fruit: KPN Wholesale grows year over year by selling access and services to other service providers. In this way, we connect more and more partners to our network AND make optimal use of it. In short: we connect everyone!
The team:
Digital Products is part of KPN Wholesale. Within this department, new products and business models are developed that should further accelerate growth in the coming years. Our Digital Products team combines the strength of KPN's business network, mobile and fixed networks, and our sales network. Within this spectrum, we develop innovative new products (such as KPN Pay, Mobile Connect, and NL-ix) and manage our existing ones. The Data & Analytics team, which you will join in this role, contributes by working on a future-proof data platform, excellent data quality, reliable insights, and valuable analyses that help Wholesale move forward.
Job Description:
As a Data DevOps Engineer, you contribute to managing and innovating the new Azure PaaS environment: you unlock and model data, build Azure Data Factory (ADF) pipelines, and manage KPN Wholesale's Azure subscriptions and resources via Git. In addition, you ensure that the environment continues to meet the required (KPN) security standards. You work closely with other data engineers, analysts, and business colleagues to build robust, scalable solutions. You advise and assist your teammates, whether they ask for it or not. You are a true DevOps person: you build it, you own it!
Responsibilities:
Designing, implementing, and managing data pipelines and ETL processes
Developing and maintaining monitoring and logging solutions for data infrastructure
Developing tools and scripts for automation of operational tasks
Collaborating with other teams to understand technical requirements and deliver solutions
Identifying and resolving performance and scalability issues
Implementing best practices for data security and compliance
The ideal candidate will have:
Working and thinking at bachelor's level or above
Demonstrable experience with BI, data engineering, and DevOps activities
Extensive hands-on experience with Azure (PaaS), ETL/ELT, T-SQL, Azure Synapse, Azure Data Factory, and cloud security
Solid knowledge of Azure DevOps/Git, CI/CD, Agile/Scrum, and data modeling
Knowledge of Infrastructure as Code (YAML/Bicep) and Microsoft Fabric is a plus
What we offer:
initial 6-month assignment with the possibility of extension
37-hour work week
hybrid working
transit and/or working-from-home allowance
internet allowance
€4,400.00 – €6,056.16 gross per month (depending on experience)
holiday payment (8.33% of your gross annual income)
25 paid holiday days (based on a 40-hour week)
full pension contribution