R-00167222

Senior Data Engineer

London, United Kingdom
Posted on 27/04/2022

Join us as a Senior Data Engineer

  • This is an exciting opportunity to use your technical expertise to collaborate with colleagues and build effortless, digital-first customer experiences
  • You’ll lead on designing, building and implementing streaming applications and pipelines, as well as on developing the Mettle data platform in both AWS and GCP
  • You'll act as a subject matter expert and communicate with clarity and empathy to both technical and non-technical audiences


What you'll do

We’ll look to you to drive value for the customer through modelling, sourcing and data transformation. You’ll work closely with core technology and architecture teams to design and implement real-time data systems, and with all parts of the business to help guide decision-making across Mettle.

We’ll also expect you to:

  • Develop Mettle’s data analytics capability, building on Kafka and BigQuery in GCP and AWS
  • Design and implement real-time data analytics systems that serve all parts of the business (Product, Marketing, Customer Operations, Risk and Compliance, and Engineering)
  • Maintain and enhance core data infrastructure and ETL frameworks
  • Develop and own core tools, frameworks and methodologies, with an emphasis on scale, reusability and simplicity
  • Complement our data scientists by providing a reliable, secure and maintainable modelling framework


The skills you'll need

You'll need a good understanding of modern code development practices, along with strong critical thinking and proven problem-solving abilities. We'd also like you to have a background in data analytics and data science, and the ability to translate business objectives into data-driven insights.

You’ll also need:

  • Experience of developing in Python, Scala and Java
  • Familiarity with microservices, containerisation, GitOps and CI/CD (e.g. Docker, Kubernetes, Helm, Terraform)
  • Experience with stream processing and event streaming platforms (e.g. Kafka, Kafka Streams, Akka Streams, Beam, Spark)
  • Experience with schema design, schema registries, data serialisation formats and performant data storage (e.g. Avro, Thrift, Protocol Buffers, Parquet)
  • Experience with ETL frameworks/methodologies (e.g. Airflow)
  • Demonstrable experience working with data persistence and warehousing technologies in GCP and AWS

If you need any adjustments to support your application, such as information in alternative formats or special requirements to access our buildings, or if you’re eligible under the Disability Confident Scheme, please contact us and we’ll do everything we can to help.

Not the right role?

We’re always on the lookout for talented people. If you don’t see the right role today, sign up to job alerts and we’ll let you know when something more suitable comes up.
