This role will contribute to the design and development of our platform to fulfil business needs and to improve our systems. In this capacity you will have the chance to design new data pipelines and maintain streaming platforms for batch loading and real-time visualisation.

Responsibilities:
- Maintaining and evolving our existing data platform;
- Building processes to ingest data from Kafka, APIs and databases using NiFi;
- Applying transformations to data streams;
- Being involved in data modelling following standards such as Inmon, Kimball and Data Vault;
- Ensuring data quality by verifying data consistency and accuracy;
- Keeping up to date with research and development of new technologies and techniques to enhance our data platform;
- Bringing an investigative mindset to troubleshooting – thinking outside the box in problem-solving and incident management;
- Taking full ownership of assigned projects and tasks while working well within a team;
- Documenting processes thoroughly and running knowledge-sharing sessions.

Some key requirements we would be looking for:
- Experience with modern cloud database technologies such as Snowflake;
- Experience orchestrating data pipelines using Airflow;
- Experience with SQL and data integration tools;
- Experience with a programming language such as Python or Java;
- Knowledge of AWS services such as S3, Lambda, API Gateway, DMS and RDS;
- Development experience in both Microsoft and Linux/cloud environments;
- Strong analytical and problem-solving skills.
It would be extra awesome if you also have:
- Familiarity with data warehousing concepts and data modelling techniques such as Inmon, Kimball and Data Vault;
- Familiarity with data streaming technologies such as Kafka;
- Familiarity with software versioning tools like Git;
- Familiarity with infrastructure-as-code tooling like Terraform;
- Familiarity with data flow systems such as Apache NiFi;
- Familiarity with data monitoring and visualisation tools such as Prometheus, Grafana and CloudWatch;
- Scripting skills – PowerShell, Unix shell scripting;
- At least 2 years' experience in data engineering;
- Excellent verbal and written English communication skills;
- A good learning mindset;
- The ability to set priorities and multitask.