SENIOR DATA ENGINEER TSD

Job ID: 55932
Job Category: Engineering & Technical
Division & Section: Technology Services, Office of the Chief Technology Officer
Work Location: Metro Hall
Job Type & Duration: Full-time, 1 Permanent Vacancy
Salary: $123,833.00 - $170,184.00, TM5394 & Wage Grade 8
Shift Information: Monday to Friday, 35 hours per week
Affiliation: Non-Union
Number of Positions Open: 1
Posting Period: 04-JUN-2025 to 18-JUN-2025


About the Role:

The City of Toronto is seeking a skilled and experienced Senior Data Engineer to join our Enterprise Data Platform team. This role is vital to the design, development, and implementation of the platform. The ideal candidate will have a strong background in AWS technologies, data engineering, and modern data architectures.


Why Join the City of Toronto:

As a Senior Data Engineer at the City of Toronto, you will have the opportunity to work on cutting-edge data solutions that directly impact the lives of Toronto's residents. You'll be part of a team driving the city's digital transformation, working on projects that enhance city services and operations through innovative use of data.

You'll work in a collaborative environment that values your expertise and provides opportunities for professional growth. If you're passionate about leveraging data and AWS technologies to create meaningful change, we encourage you to apply and be part of our mission to build a smarter, more connected Toronto.


Key Responsibilities:

  • AWS Expertise: Utilize a wide range of AWS services to build and maintain scalable, secure, and efficient data infrastructure. Key services include S3, Redshift, Kinesis, EMR, Glue, DataZone, Lake Formation, and CloudFormation.
  • Data Pipeline Development: Design, implement, and maintain robust ETL/ELT processes using tools such as AWS Glue, dbt (data build tool), and Apache Spark.
  • Data Mesh Implementation: Contribute to the implementation of a data mesh architecture, enabling decentralized, domain-oriented data ownership and management.
  • Infrastructure as Code: Develop and maintain infrastructure as code using Terraform or AWS CloudFormation to automate and streamline the deployment of cloud resources.
  • Data Processing: Utilize Python and Apache Spark for large-scale data processing, transformation, and analysis.
  • Data Modeling: Design and implement efficient data models to support analytics, machine learning, and reporting needs.
  • Streaming Solutions: Develop and maintain both batch and real-time data processing solutions, using technologies such as AWS Kinesis for streaming.
  • Data Governance: Implement and adhere to data governance policies to ensure data quality, privacy, and compliance with regulations.
  • Platform Enhancement: Work with technologies such as Databricks and Snowflake to enhance the capabilities of the data platform.
  • Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and provide tailored solutions.
  • Documentation and Knowledge Sharing: Create and maintain comprehensive documentation for data processes, pipelines, and models. Share knowledge with team members and contribute to the team's overall growth.


Required Qualifications:

  • Bachelor's degree in Computer Science, Data Science, Information Technology, or a related field.
  • 5+ years of experience in data engineering or related fields.
  • Deep expertise in AWS technologies, particularly in data-related services (S3, Redshift, Kinesis, EMR, Glue, etc.).
  • Strong proficiency in Python programming, especially for data processing tasks.
  • Experience with big data processing frameworks, particularly Apache Spark.
  • Hands-on experience with ETL/ELT processes and tools like AWS Glue and dbt.
  • Solid understanding of data modeling concepts and techniques.
  • Experience with infrastructure as code, preferably using Terraform or AWS CloudFormation.
  • Familiarity with data governance principles and privacy regulations (e.g., GDPR, CCPA).


Preferred Qualifications:

  • Master's degree in a relevant field.
  • AWS certifications (e.g., AWS Certified Data Analytics - Specialty, AWS Certified Big Data - Specialty).
  • Experience with data mesh architecture concepts and implementation.
  • Knowledge of other cloud platforms (e.g., Azure, GCP) for multi-cloud strategies.
  • Familiarity with containerization technologies (e.g., Docker, Kubernetes).
  • Experience with CI/CD practices and tools.
  • Understanding of machine learning workflows and MLOps practices.


Key Skills:

  • Strong problem-solving and analytical skills
  • Excellent communication skills, able to explain complex technical concepts to non-technical stakeholders
  • Self-motivated with the ability to work independently and as part of a team
  • Attention to detail and commitment to delivering high-quality work
  • Adaptability and willingness to learn new technologies and methodologies
  • Time management skills and ability to handle multiple projects simultaneously


Equity, Diversity and Inclusion

The City is an equal opportunity employer, dedicated to creating a workplace culture of inclusiveness that reflects the diverse residents that we serve. Learn more about the City’s commitment to employment equity.

Accommodation

The City of Toronto is committed to creating an accessible and inclusive organization. We are committed to providing barrier-free and accessible employment practices in compliance with the Accessibility for Ontarians with Disabilities Act (AODA). Should you require Code-protected accommodation at any stage of the recruitment process, please make your needs known when contacted and we will work with you to meet them. Disability-related accommodation during the application process is available upon request. Learn more about the City’s Hiring Policies and Accommodation Process.