Senior Analytics Engineer
Cera
Position: Senior Analytics Engineer, Data Platform
Reporting to: Head of BI
Contract type: Full Time, Permanent, Remote
This is an exciting opportunity to join one of Europe's fastest-growing healthtech startups.
About Cera
Cera Care is Europe's largest digital-first provider of home healthcare. We combine professional care with cutting-edge technology, leveraging AI and data analytics to predict health deteriorations, prevent falls and hospitalisations, and empower the elderly to live longer, healthier, and more independent lives in their own homes. Our mission is to transform social care, making it proactive, personalised, and accessible, ultimately alleviating pressure on healthcare systems and improving lives. Join us in shaping the future of health tech.
In just a few years, Cera has expanded to 10,000+ staff across the UK and Germany, delivering 50,000+ healthcare visits a day, and has grown from zero to $300 million in revenue.
The Opportunity
We are seeking an experienced and passionate Senior Analytics Engineer to join our growing Data & Analytics team, working alongside another Senior Analytics Engineer. In this pivotal role, you will be instrumental in building, maintaining, and optimising our core analytics capabilities on Google Cloud Platform (GCP). You will work at the intersection of data engineering, data science, and business intelligence, transforming raw data into reliable, high-quality, and actionable insights that drive strategic decisions across Cera Care. If you thrive on solving complex data challenges, have a strong engineering mindset, and are eager to make a tangible impact on people's health and well-being, we want to hear from you.
As a Senior Analytics Engineer, you will also have the opportunity to play a key role in harnessing the power of Generative AI, particularly in optimising our own development workflows and accelerating data product delivery.
What you’ll do
- Design & Develop Data Models: Collaborate with the team to design and implement robust, scalable, and efficient data models within BigQuery, optimising for performance, usability, and data integrity.
- Build ETL/ELT Pipelines: Develop and maintain complex data pipelines using BigQuery and Dataform to transform data from various sources into our analytics data warehouse.
- Ensure Data Quality & Governance: Implement rigorous data quality checks, monitoring (using Metaplane), and validation processes to ensure the accuracy, completeness, and reliability of our analytics data. Champion data governance best practices.
- Collaborate Cross-functionally: Work closely with Data Scientists, Data Analysts, Data Engineers, and other stakeholders to understand their data requirements, translate them into technical solutions, and provide data expertise.
- Performance Optimisation: Continuously identify and implement optimisations for existing data models and pipelines to improve efficiency, reduce costs, and enhance performance.
- North Star & Best Practices: Act as a subject matter expert, define the optimal path for our analytics ecosystem, and advocate for best practices in data modelling, SQL development, and analytics engineering within the team.
- Documentation: Create and maintain comprehensive documentation for data models, pipelines, and data dictionaries.
- Tooling & Innovation: Evaluate and recommend new tools and technologies to enhance our analytics capabilities and stay at the forefront of the industry.
- Support & Troubleshooting: Provide expert support for data-related issues, troubleshooting problems within data pipelines and data models.
Who you are
- 5+ years of experience in a data-focused role, with at least 2 years specifically as an Analytics Engineer, Data Engineer, or a similar role focused on building analytical data products.
- Expert-level SQL skills with a proven track record of writing complex, performant, and maintainable queries.
- Strong experience with Cloud Platforms, ideally Google Cloud Platform (GCP) services, particularly BigQuery for data warehousing.
- Proficiency with Dataform (or dbt) for data transformation, testing, and documentation; this is a key requirement.
- Experience designing and implementing dimensional data models (e.g., Star Schema, Snowflake Schema) and an understanding of data warehousing concepts.
- Solid understanding of ETL/ELT principles and experience building automated data pipelines.
- Familiarity with version control systems (e.g., Git) and CI/CD practices for data pipelines.
- Strong analytical and problem-solving skills, with an ability to translate business requirements into technical solutions.
- Excellent communication skills, with the ability to articulate complex technical concepts to non-technical stakeholders.
- A high degree of self-sufficiency, with a proven track record of delivery in a high-autonomy environment.
What we offer
- Flexible working hours and remote working arrangements
- 25 days' holiday plus your birthday off, on top of bank holidays
- Laptop and full IT package to work remotely
- Company pension scheme
- Training and development for your role and future career
- Competitive salary, equity options and long-term employee schemes
- Lifeworks discount platform and Employee Assistance Programme
- Refer a Friend scheme
- Service and Recognition awards
Why Cera Care
- Impact: Make a real difference in people's lives by contributing to technology that revolutionises care in their own homes.
- Growth: Be part of a rapidly scaling health tech company at the forefront of innovation.
- Culture: Work in a dynamic, collaborative, and supportive environment with a team that values creativity and continuous learning.
- Benefits: Competitive salary, equity options, generous holiday allowance, health and wellness programmes, and opportunities for professional development.
- Remote Working: Enjoy flexibility with our remote-first working model