Description

Like the idea of supporting company-wide decisions?

Then Jobber might be the place for you! We’re looking for a Senior Data Engineer to be part of our Business Technology team in our Business Operations (BizOps) Department.

Jobber exists to help people in small businesses be successful. We work with home and field service companies, like your local plumbers, painters, and landscapers, to help them better quote, schedule, invoice and collect payments from their customers. Being an entrepreneur in today’s world looks and operates very differently than it did in the past, so that’s why we put the power and flexibility in their hands to run their businesses how, where and when they want! 

Our culture of transparency, inclusivity, collaboration, and innovation has been recognized by Great Place to Work, Canada’s Most Admired Corporate Cultures, and more. Jobber has also been named on the Globe and Mail’s Canada’s Top Growing Companies list, and Deloitte Canada’s Technology Fast 50™, Enterprise Fast 15, and Technology Fast 500™ lists. With an Executive team that has over thirty years of industry experience leading the way, we’ve come a long way from our first customer in 2011—but we’ve just scratched the surface of what we want to accomplish for our customers.

The team:

Business Technology is the engineering team within Business Operations, our internal consulting department. The team is the decision-support mechanism that connects data, business insights, and an internal tech stack (systems) with the rest of the organization. In essence, BizOps is a central function that exists to drive business outcomes in all corners of Jobber’s ecosystem.

The role: 

Reporting to the Director, Business Technology, the Senior Data Engineer will work on our Business Technology team which develops internal software, integrations and data infrastructure. Our work unlocks improved operational outcomes, workflow efficiencies and new business insights across our organization. We help teams leverage data, tools and technology in order to successfully execute on their own mandates. We research, develop and maintain systems which support other internal teams from an operational and analytical perspective.

We’re looking for people who are ready for their next challenge, and want to use their experience to influence people, processes and decisions.

The Senior Data Engineer will:

  • Build the foundation of our growth. Design, build and maintain batch and real time data pipelines in cloud infrastructure (preferably AWS). Build scripts, tools, serverless applications and workflows.
  • Set up our internal teams for success. Drive internal process improvements, such as automating manual processes and building alerting/monitoring tools. Collaborate closely with other teams to build the tools, frameworks, and reports needed to run experiments, analyze A/B test results, and enable insights.
  • Be a business accelerator. Work with analysts, data scientists and product teams to extract actionable insights from data that shape the direction of the company.
  • Participate in strategic planning. Lead initiatives to research, analyze and propose new technologies and tooling for our data engineering stack. Participate in design and code reviews - learn from your peers and teach your peers. Solve problems with technology and make decisions backed by data.
  • Be the data integrity expert. If two reports are diverging, you’re going to dive into the nitty-gritty of the code to uncover the source of truth.

To be successful, you should have:

  • Experience as a Data Engineer, or in a similar role, in an Agile/Scrum environment
  • Proficiency in writing code in a few different languages 
  • Experience in analytics dimensional modeling/star schema and data warehousing in a cloud environment (preferably AWS Redshift)
  • Experience in building and maintaining data pipelines for ETL/ELT processes
  • Relevant experience with data collection and ingestion from external sources, and optimizing data flow between different systems and environments.
  • Knowledge of SQL, including query performance debugging and tuning skills.
  • Familiarity with BI tools.
  • Experience in developing and operating high-volume, highly available, and scalable environments.
  • Strong communication skills, with the ability to collaborate with both non-technical and technical team members.

It would be really great (but not a deal-breaker) if you had any of the following:

  • Experience using templated SQL in your ETL pipelines (e.g., dbt)
  • Experience with Integrations, APIs and working within their limitations.
  • Knowledge or experience with Terraform 
  • Backend development experience, especially dealing with message queues and stream processing
  • Experience with languages such as JavaScript and Python
  • Experience with NoSQL
  • Knowledge of, or experience with, Airflow
