Kafka DevOps Data Engineer
Assignment start date: ASAP
Assignment end date: 31 December 2021
Possible extension: yes
Hours per week: 36
We are looking for a Senior DevOps Data Engineer (Kafka) for our Banking client, to build with us the Strategic Data Exchange between the Lending core systems and surrounding systems. The DataGen squad is part of WB Tribe Lending. The squad's primary focus is on data delivery to Lending internal applications, to the Wholesale Bank Data Lake, and for Regulations.
Data is becoming more important every day. Your contribution to the Strategic Data Integration will be critical to realizing our envisioned Lending data platform, with high-quality and timely data availability, moving from batch to real-time. This will enable excellent data consumption possibilities to meet our ever-increasing client and regulatory demands on data.
We need your help in designing and building this new exchange, and in building bridges towards other squads to realize end-to-end delivery across the Lending and other Tribes. We value Agile working, self-organization and craftsmanship. We are driven professionals who enjoy shaping the future of this place.
We are looking for someone with an easy-to-work-with, mature and no-nonsense mentality. Someone who is an open and honest communicator, who values working as part of a team, who is willing and able to coach or train other developers and who is aware of developments and trends in the industry and corporate ecosystem.
Are you passionate about a (not so distant) future in which most data processing is done in a streaming fashion, not scared off by complex data, and do you enjoy developing complex components in Java? Then please read on.
On the technical side, you must have 9+ years of relevant experience in data engineering, and in particular experience in the following fields:
1. Agile / Scrum.
2. Track record in building larger corporate systems.
3. Kafka Streams API.
4. Kafka, Schema Registry and Kafka Connect, using the Confluent framework.
5. Java 8 or higher backend development.
6. CI / CD tooling: Azure DevOps, Maven, CheckMarx, Git, Ansible.
7. Running and managing a Kafka cluster and related components.
8. Linux (bash) scripting capabilities.
9. Data Integration techniques.
10. Oracle SQL, 12c or higher.
Next to these must-haves, we appreciate knowledge of the following:
1. Oracle RDBMS 12c or higher.
2. Database Change Data Capture (CDC).
3. Logging and monitoring with Grafana, Elastic, Kibana, Prometheus or Logstash.
4. Data modelling.
5. Oracle Data Integrator 12c.
6. Experience in a complex, corporate environment.
7. Experience with Lending / Financial systems.
8. Issue trackers like JIRA, ServiceNow.
9. Collaboration tooling like Confluence.
Michael Siep +31(0)20-333 7629