Data Engineer | Dunstable | Permanent
We have an exciting opportunity for a Data Engineer within our Head Office in Dunstable. Can you take commercial data and turn it into actionable insights? Harness the full potential of bleeding-edge technologies, while enabling business stakeholders to understand complex data? If so, then we would love to hear from you!
Did you know?
- We store c113TB of data in the Azure data lake, pull c60TB of data through to the raw layer and store c131GB of data on the Azure Data Warehouse (Azure Synapse)
- We run c130,000 read/write transactions daily with c8,000 connections to the data warehouse
- We have c1,000 pipelines executing per day, creating >300 datasets
What continues to set us apart are our excellent products and people. In Premier Inn, we have the UK’s favourite budget hotel chain, currently outperforming the market and ambitious for more as we execute our plans in both the UK and Germany. We have much-loved brands such as Beefeater, Brewers Fayre, Bar + Block alongside exciting up-and-coming propositions such as Cookhouse & Pub.
We remain true to our values and put our people at the centre of everything we do. Throughout this crisis we are proud to have supported our nation’s Key Workers by providing accommodation throughout the pandemic.
What you’ll be doing:
- Deliver across the entire software delivery life cycle in an agile fashion – concept, design, build, deploy, test, release into post implementation support.
- Ensure work is of the highest quality and perform code/peer reviews within the team and across teams.
- Specify user/system interfaces and translate logical designs into physical designs, taking account of the target environment, performance requirements and existing systems.
- Ensure all deliverables are well documented, re-usable, tested and conform to agreed architecture design patterns and coding standards.
- Undertake analytical activities required to support the development and maintenance of systems.
- Provide support on key products should there be an incident or problem related to a live product that requires a development fix.
- Support the relevant project/backlog delivery events: planning/user story estimation, daily stand-ups, sprint reviews/demos and retrospectives.
- Support and collaborate with the team, taking ownership of driving forward relevant stories (updating the ticket on the Kanban board and Jira).
- Jointly responsible with the team for converting the Product backlog into ‘Done’ potentially releasable increments.
- Be passionate about building, maintaining and optimising scalable and robust cloud platforms.
What you’ll need:
- Demonstrable experience of working in Agile environments and implementing DevOps principles
- Proven experience using Python to build data pipelines
- Azure Data Services (Data Factory, Databricks, Synapse Analytics)
- Solid experience of T-SQL (stored procedures, functions)
- Integration and ETL experience
- Experience of Tableau and/or Power BI
- Experience with relational and multi-dimensional databases and data modelling
- Understanding of DevOps principles like IaC, CI and CD
- Strong experience with containerisation and / or Kubernetes
- Experience deploying code as part of CI/CD pipelines in Azure DevOps stack
- Experience of provisioning Azure PaaS/IaaS environments through CI/CD pipelines
- Understanding of general infrastructure, networking & security best practices
- Microsoft BI experience (SSAS, SSIS, SSRS, Power Platform)
- Understanding of Terraform scripting and ARM Templates
- Cloud component design and pattern-based implementation
- Experience with NoSQL databases, preferably CosmosDB
- Ability to evaluate gaps against DevOps best practices and propose improvements
Whitbread is an inclusive employer, strongly believing that everyone is unique and there should be no limits to ambition. We welcome your application whatever your background or situation. We are open to flexible working and, where possible, will try to support this.