Careers Overview


ALETHA is a dynamic, Toronto-based IT consulting boutique that strives for excellence in customer service and delivery in the financial services and fintech industries. We focus on technology advisory, cloud computing, integration solutions, and enterprise big data and analytics. With a team of highly knowledgeable business and technical experts, we provide strategic, mission-critical solutions to our clients.

Senior Data Engineer


ALETHA is looking for a Senior Data Engineer who has hands-on experience in data integration and database development in hybrid environments and who will contribute to the successful delivery of hybrid data integration and custom database development projects in the Greater Toronto Area.

Responsibilities

  • Understand business requirements for data and information, especially in the financial services industry.

  • Own the overall design of the data integration solution or data pipeline strategy for client engagements, taking a holistic, scalable, pragmatic, and effective approach to data processing.

  • Work with the Data Architect on an overall design that fulfills the data and business requirements.

  • Develop high-performing data pipelines in Microsoft SSIS, Azure Data Factory, or Databricks.

  • Develop database stored procedures based on Microsoft T-SQL and/or Oracle PL/SQL.

  • Develop data processing modules in one of the following programming languages: C#, Java, or Python.

  • Design an automated unit-testing framework as per project requirements.

  • Integrate pipelines with the Continuous Integration/Delivery (CI/CD) framework as needed.

  • Produce estimates for analysis, design, development, and testing for data pipelines.

  • Lead, mentor, and coach the technical team on client engagements.

  • Assist with proposals and pre-sales activities.

  • Lead development according to ALETHA’s best practices in data integration and data engineering.

  • Ensure compliance with business, data, and technical requirements.

  • Ensure that the client’s enterprise architecture standards, policies, and procedures are followed.


Requirements

  • Minimum of a Bachelor’s degree in Computer Science or Engineering.

  • Minimum of 8 years’ experience in Information Technology, participating in complex projects.

  • Strong communication skills.

  • Strong knowledge of data analysis, database development, the data warehousing life cycle, and data integration methodologies (ETL, ELT, EAI, EII).

  • Strong knowledge of and extensive experience with Microsoft SSIS or Informatica.

  • Strong SQL knowledge of Microsoft SQL Server or Oracle databases and their procedural languages: T-SQL, PL/SQL.

  • Working knowledge of one or more of the following languages: Java, C#, Python, Bash, PowerShell.

  • Familiarity with the capabilities of various platform services (e.g., Compute, Storage, Web, Developer, Integration, Data, Security, or Management) from the cloud vendors.

  • Working knowledge of Azure Data Factory.

  • Working knowledge of at least one data analytics tool (such as Microsoft Power BI).

  • Knowledgeable in both relational and dimensional data modeling (with both Kimball and Inmon approaches).

  • Knowledge of data management, REST-oriented APIs, and Continuous Integration and Delivery (CI/CD) principles.

  • Good overall business knowledge of the financial industry (i.e., one or more of retail banking, commercial banking, capital markets, wealth management, insurance, pension funds, and fintech).

  • Preferably, exposure to Hadoop ecosystem data tools (e.g., Hive, HBase, NiFi, Spark).

  • Ability to work independently and excel in a team environment.


AVAILABILITY

1 position

EXPERIENCE

8 years

GRADUATION

Bachelor

Apply now

Make sure you meet all the requirements before contacting us!