Are you a highly skilled ETL Developer with expertise in Microsoft SSIS, Delta Lake, and Databricks pipelines? Our high-profile client is seeking a Software Developer - ETL to design, develop, and maintain ETL workflows for data warehousing, data lakes, and analytics solutions. If you're ready to leverage your skills, this could be the perfect opportunity for you!
Advantages
✔ Competitive contract rates with a long-term engagement.
✔ Work on large-scale enterprise data transformation projects.
✔ Hybrid work model (3 days onsite, 2 days remote).
✔ Collaborate with top industry professionals in Azure, Databricks, and Data Engineering.
Responsibilities
General Responsibilities:
This role is responsible for designing, developing, maintaining, and optimizing ETL (Extract, Transform, Load) processes in Databricks for data warehousing, data lakes, and analytics. The developer will work closely with data architects and business teams to ensure the efficient transformation and movement of data to meet business needs, including handling Change Data Capture (CDC) and streaming data.
Tools used:
- Azure Databricks, Delta Lake, Delta Live Tables, and Spark to process structured and unstructured data;
- Azure Databricks/PySpark (good Python/PySpark knowledge required) to build transformations of raw data into curated zones in the data lake;
- Azure Databricks/PySpark/SQL (good SQL knowledge required) to develop and/or troubleshoot transformations of curated data into FHIR.
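The raw-to-curated transformation described above typically trims and types incoming fields, rejects incomplete rows, and de-duplicates on a key before data lands in the curated zone. A minimal sketch of that cleansing logic in plain Python (the field names and rules here are invented for illustration; in practice this would be a PySpark DataFrame transformation):

```python
# Hypothetical raw rows as they might arrive in the landing zone.
RAW_ROWS = [
    {"id": "1", "name": "  Alice ", "amount": "10.5"},
    {"id": "2", "name": "Bob", "amount": None},         # missing amount -> rejected
    {"id": "1", "name": "  Alice ", "amount": "10.5"},  # duplicate id -> dropped
]

def curate(rows):
    """Cleanse raw rows into a curated shape: trim text, cast types,
    drop rows with missing amounts, and de-duplicate on id."""
    seen, curated = set(), []
    for row in rows:
        if row["amount"] is None or row["id"] in seen:
            continue
        seen.add(row["id"])
        curated.append({
            "id": int(row["id"]),
            "name": row["name"].strip(),
            "amount": float(row["amount"]),
        })
    return curated

print(curate(RAW_ROWS))  # only the first row survives cleansing
```

In a Databricks notebook the same steps would be expressed with `filter`, `withColumn`, and `dropDuplicates` on a Spark DataFrame, writing the result to a Delta table in the curated zone.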
Data design:
- Understand the requirements. Recommend changes to models to support ETL design;
- Define primary keys, indexing strategies, and relationships that enhance data integrity and performance across layers;
- Define the initial schemas for each data layer;
- Assist with data modelling and updates of source-to-target mapping documentation;
- Document and implement schema validation rules to ensure incoming data conforms to expected formats and standards;
- Design data quality checks within the pipeline to catch inconsistencies, missing values, or errors early in the process;
- Proactively communicate with business and IT experts on any changes required to conceptual, logical and physical models; communicate and review timelines, dependencies, and risks.
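The schema validation rules mentioned above can be as simple as a declared mapping of required fields to expected types, checked before a record is loaded. A minimal sketch, assuming a hypothetical record layout (the field names are invented, not from the client's data model):

```python
# Hypothetical expected schema for an incoming record.
EXPECTED_SCHEMA = {
    "patient_id": int,   # required primary key
    "last_name": str,
    "visit_date": str,   # ISO-8601 date string expected
}

def validate_record(record: dict) -> list:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = []
    for field, expected_type in EXPECTED_SCHEMA.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(f"bad type for {field}: expected {expected_type.__name__}")
    return errors

good = {"patient_id": 42, "last_name": "Doe", "visit_date": "2024-01-15"}
bad = {"patient_id": "42", "last_name": "Doe"}
print(validate_record(good))  # []
print(validate_record(bad))   # missing visit_date, wrong type for patient_id
```

Records that fail validation would typically be routed to a quarantine table with their error list, rather than silently dropped, so data quality issues surface early in the pipeline.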
Development of ETL strategy and solution for different sets of data modules:
- Understand the Tables and Relationships in the data model;
- Create low-level design documents and test cases for ETL development;
- Implement error-catching, logging, retry mechanisms, and handling data anomalies;
- Create the workflows and pipeline design.
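The error-catching, logging, and retry mechanisms listed above usually follow a standard pattern: wrap each pipeline step, log every failure, and retry transient errors with exponential backoff. A stdlib-only sketch of that pattern (the step function and delays are illustrative, not the client's implementation):

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def run_with_retries(step, max_attempts=3, base_delay=0.1):
    """Run an ETL step, retrying transient failures with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, max_attempts, exc)
            if attempt == max_attempts:
                raise  # exhausted retries: surface the failure to the orchestrator
            time.sleep(base_delay * 2 ** (attempt - 1))

# Simulated flaky step: fails twice, then succeeds.
calls = {"n": 0}
def flaky_step():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient source outage")
    return "loaded"

print(run_with_retries(flaky_step))  # succeeds on the third attempt
```

In Databricks Workflows much of this is configurable per task (retry count, retry interval), but step-level logging and anomaly handling inside the notebook still follow this shape.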
Development and testing of data pipelines with Incremental and Full Load:
- Develop high-quality ETL mappings/scripts/notebooks;
- Develop and maintain the pipeline from the Oracle data source to Azure Delta Lake and FHIR;
- Perform unit testing;
- Ensure performance monitoring and improvement;
- Conduct performance reviews and data consistency checks;
- Troubleshoot performance issues, ETL issues, and log activity for each pipeline and transformation;
- Review and optimize overall ETL performance.
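The Incremental vs. Full Load distinction above comes down to merge semantics: a full load rebuilds the target from scratch, while an incremental load upserts only changed rows keyed on the primary key, which is what Delta Lake's `MERGE INTO` provides. A dict-based sketch of those semantics (the row shapes are invented for illustration):

```python
def full_load(source_rows: list) -> dict:
    """Full load: rebuild the target table entirely from the source."""
    return {row["id"]: row for row in source_rows}

def incremental_load(target: dict, changed_rows: list) -> dict:
    """Incremental load: upsert only changed rows, keyed on the primary key,
    mirroring Delta Lake MERGE INTO (update on match, insert otherwise)."""
    merged = dict(target)
    for row in changed_rows:
        merged[row["id"]] = row  # insert new keys, update existing ones
    return merged

target = {1: {"id": 1, "name": "a"}, 2: {"id": 2, "name": "b"}}
changes = [{"id": 2, "name": "b2"}, {"id": 3, "name": "c"}]
print(incremental_load(target, changes))  # row 2 updated, row 3 inserted, row 1 untouched
```

With CDC feeds, the `changed_rows` side would carry an operation flag (insert/update/delete) from the capture tool, and the merge would also handle deletes; the key-matching logic stays the same.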
End-to-end integrated testing for Full Load and Incremental Load:
- Plan for Go Live, Production Deployment;
- Create production deployment steps;
- Configure parameters, and scripts for go live. Test and review the instructions;
- Create release documents and help build and deploy code across servers.
Go Live Support and Review after Go Live:
- Review existing ETL processes and tools, and provide recommendations on improving performance and reducing ETL timelines;
- Review infrastructure and remediate issues for overall process improvement;
- Knowledge Transfer to Ministry staff, development of documentation on the work completed;
- Document work and share the ETL end-to-end design, troubleshooting steps, configuration and script review;
- Transfer documents, scripts and review of documents to Ministry.
Qualifications
Must-Have Skills:
- 7+ years of experience using ETL tools such as Microsoft SSIS, stored procedures, and T-SQL;
- 2+ years of experience with Delta Lake, Databricks, and Azure Databricks pipelines;
- Strong knowledge of Delta Lake for data management and optimization;
- Familiarity with Databricks Workflows for scheduling and orchestrating tasks;
- 2+ years of experience with Python and PySpark;
- Solid understanding of the Medallion Architecture (Bronze, Silver, Gold) and experience implementing it in production environments;
- Hands-on experience with CDC tools (e.g., GoldenGate) for managing real-time data; and
- Experience with SQL Server and Oracle.
Summary
If you are a Software Developer and the prospect of joining a dedicated team intrigues you, this role with our high-profile client could be the perfect opportunity for you.
Also, don’t forget that updating your profile on Randstad.ca helps us find you faster when we have roles that match your skills! So even if this role isn’t for you, please update your profile so we can find you!
We look forward to supporting you in your job search! Good luck!
Randstad Canada is committed to fostering a workforce reflective of all peoples of Canada. As a result, we are committed to developing and implementing strategies to increase the equity, diversity and inclusion within the workplace by examining our internal policies, practices, and systems throughout the entire lifecycle of our workforce, including its recruitment, retention and advancement for all employees. In addition to our deep commitment to respecting human rights, we are dedicated to positive actions to affect change to ensure everyone has full participation in the workforce free from any barriers, systemic or otherwise, especially equity-seeking groups who are usually underrepresented in Canada's workforce, including those who identify as women or non-binary/gender non-conforming; Indigenous or Aboriginal Peoples; persons with disabilities (visible or invisible) and; members of visible minorities, racialized groups and the LGBTQ2+ community.
Randstad Canada is committed to creating and maintaining an inclusive and accessible workplace for all its candidates and employees by supporting their accessibility and accommodation needs throughout the employment lifecycle. We ask that all job applications please identify any accommodation requirements by sending an email to accessibility@randstad.ca to ensure their ability to fully participate in the interview process.