Technology is our how. And people are our why. For over two decades, we have been harnessing technology to drive meaningful change. By combining world-class engineering, industry expertise, and a people-centric mindset, we consult and partner with leading brands across industries to create dynamic platforms and intelligent digital experiences that drive innovation and transform businesses. From prototype to real-world impact: be part of a global shift by doing work that matters.

Job Description

Our data team has expertise across engineering, analysis, architecture, modeling, machine learning, artificial intelligence, and data science. This discipline is responsible for transforming raw data into actionable insights, building robust data infrastructures, and enabling data-driven decision-making and innovation through advanced analytics and predictive modeling.

Responsibilities:

- Work with Endava's customers to analyze existing systems and prepare a plan for migrating databases and data processing systems to the cloud.
- Define and implement the cloud data strategy, including security posture, target operating model, and disaster recovery (DR) strategy.
- Deliver end-to-end data solutions, from early concept and planning, through PoCs and benefits analysis, all the way to production rollouts.
- Enhance the client's data landscape by improving data lineage, quality, and reliability.
- Help organizations adopt AI/ML-based solutions by introducing an MLOps culture.
- Be part of the technical advisory and cloud infrastructure team responsible for:
  - Securing foundational Data Lake and Data Mesh implementations.
  - Automated provisioning of infrastructure and pipelines.
  - Cloud-ready ETL/ELT architectures.
  - Presenting analytical findings on cutting-edge BI dashboards.

Qualifications

- Understanding of the entire software development lifecycle, CI/CD, and Data/MLOps approaches.
- Expert knowledge of SQL and of at least one language used in the Data Analytics/Science space (Python, R, SAS).
- Knowledge of at least one programming language (Java, C#, C++).
- Knowledge of Big Data and orchestration tools such as Apache Spark or Apache Airflow.
- Experience working with relational and NoSQL databases.
- Working experience with BI tools (Looker, Power BI, Tableau, Data Studio).
- Basic understanding of Git and of automation servers (Jenkins, CircleCI, GitLab CI).
- Knowledge of messaging systems.
- Basic knowledge of containers, Docker, and Kubernetes.
- Cloud certifications such as Associate Cloud Engineer are an asset.

Additional Information

Discover some of the global benefits that empower our people to become the best version of themselves:

- Finance: Competitive salary package, share plan, company performance bonuses, value-based recognition awards, referral bonus.
- Career Development: Career coaching, global career opportunities, non-linear career paths, internal development programmes for management and technical leadership.
- Learning Opportunities: Complex projects, rotations, internal tech communities, training, certifications, coaching, online learning platform subscriptions, pass-it-on sessions, workshops, conferences.
- Work-Life Balance: Hybrid work and flexible working hours, employee assistance programme.
- Health: Global internal wellbeing programme, access to wellbeing apps.
- Community: Global internal tech communities, hobby clubs and interest groups, inclusion and diversity programmes, events and celebrations.