Data Engineer


Type: permanent (Full-time)

Location: Newcastle Upon Tyne

Salary: £45K-65K



As a Data Engineer within Seriös Group, you are a key player in the implementation of our clients’ cloud data platforms, IoT analytics, data integration and migration projects. Working alongside our Data Architects, you will deliver data solutions, from data pipelines to processing solutions, that support each client’s data architecture framework.

You will collaborate with Data Architects and Insight Analysts when implementing data pipelines across supporting data layers and models, and ensure that the orchestration of those pipelines supports data provenance, quality and lineage to assure supportability. You will help demonstrate that a client’s data architecture framework has been implemented, ensuring the principles and standards that guide development towards well-architected, scalable, robust and cost-effective data solutions are followed.

You will support and refine technical standards across the organisation when implementing technologies, shaping the organisation’s approach to all things data and delivering best practice.

You will work in a technology-agnostic manner, using market-leading technologies to shape a best-fit solution for our clients. Liaising with our clients and working with them in partnership is a key relationship, and you may be required to demonstrate and grow your client-facing skills.

The role may involve direct management of technical people from both a line management and coaching / mentoring perspective. Therefore, prior technical and team lead experience is preferred.

You will also naturally have a passion for all things data, keeping up to date with the latest technologies and methodologies, whilst supporting others in the team to continually improve.


  • Work closely with Data Architects and Insight Analysts on the development of our client’s cloud data warehouse and IoT analytics projects utilising cloud technologies such as AWS or Azure.
  • Create and maintain detailed solutions documentation.
  • Create data pipeline processes which are orchestrated in an optimal manner across data layers and data models.
  • Create and maintain Infrastructure as Code solutions.
  • Adhere to source control best practices.
  • Ensure data provenance, quality and lineage can be supported and maintained from development work considering supportability.
  • Manage workload utilising Agile delivery methods.
  • Coaching/mentoring of graduate and/or apprentice consultants may be required.
  • Keep up to date with the latest technologies, methodologies and best practices in all things cloud and data.
  • Gain and maintain relevant certifications.
  • Build and maintain strong client relationships.
  • The above list is non-exhaustive; you may be required to carry out ancillary duties in relation to your role.

Person Specification:

  • Ability to implement technical solutions based on architectural designs.
  • Ability to provide guidance to apprentice/graduate team members.
  • Ability to professionally present and communicate technical solutions and concepts.
  • Ability to be self-motivated and have a proactive approach to work.
  • Ability to develop strong client relationships.
  • Ability to communicate technical concepts and solutions to non-technical stakeholders.
  • Ability to understand and address the needs of multiple clients.
  • Ability to adapt to changing requirements and business needs.
  • Ability to prioritise workload and work to deadlines.


  • 2+ years’ prior experience in a data engineering or business intelligence role.
  • Extensive ETL and data pipeline implementation experience, technology agnostic.
  • Experience implementing solutions using at least one of the following technologies: Azure Data Factory, Azure Event Hubs, Azure Data Lake Storage, Azure Function Apps, Azure Synapse Analytics, AWS Glue, AWS S3, AWS Lambda Functions, AWS Redshift, Databricks, Snowflake, Google BigQuery, Alteryx, SSIS, Informatica.
  • Demonstrable understanding of data warehouse and data lake principles.
  • Demonstrable experience of data modelling techniques such as the Kimball, Inmon or Data Vault methodologies.
  • Demonstrable understanding of unstructured, semi-structured and structured data source types for databases, files, formats and APIs.
  • Expert SQL skills including the ability to optimise performance.
  • Exposure to data engineering coding languages such as Python, R or Spark.
  • Experience using backlog management tools such as Jira or Azure DevOps.


  • Experience of data governance and policies to implement data obfuscation or record retention.
  • Experience with data visualisation tools such as Power BI, Tableau or Qlik Sense.
  • Experience working with IoT sensor technologies would be highly advantageous.
  • Experience working in an agile consulting team.
  • Experience writing Infrastructure as Code using either ARM Templates, PowerShell, CloudFormation or Terraform.
  • Experience developing CI/CD pipelines.
  • Experience of implementation of source control utilising Git.
  • Experience in mentoring junior team members.
