Job details
- Work location: Boucherville (Hybrid)
- Position type: Permanent, full-time
Data Engineer
Position: Data Engineer
Location: Montreal, QC (3 days onsite required)
Duration: 12+ Months
Job description
KEY RESPONSIBILITIES:
- Design, implement, and manage scalable data solutions in the Snowflake environment for optimized data storage and processing.
- Migrate existing data domains/flows from relational data stores to a cloud data store (Snowflake).
- Identify and optimize new/existing data workflows.
- Identify and implement data integrity practices.
- Integrate data governance and data science tools with the Snowflake ecosystem, following established practices.
- Support the development of data models and ETL processes to ensure high-quality data ingestion into the cloud data store.
- Collaborate with team members to design and implement effective data workflows and transformations.
- Assist in the maintenance and optimization of Snowflake environments to improve performance and reduce costs.
- Contribute to proof of concept, documentation and best practices for data management and governance within the Snowflake ecosystem.
- Participate in code reviews and provide constructive feedback to improve the quality of team deliverables.
- Design and develop data ingestion pipelines using Talend or Informatica, following industry best practices.
- Write efficient SQL and Python scripts for large-dataset analysis, and build end-to-end automated processes that run on a set schedule.
- Design and implement a data distribution layer using the Snowflake REST API.
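As an illustration of the REST-based distribution layer mentioned above, the sketch below builds the request body for Snowflake's SQL API (`POST /api/v2/statements`). This is a minimal, hypothetical example: the account URL, database, schema, warehouse, and query names are placeholders, not details from this posting.

```python
import json

# Hypothetical placeholder account URL -- not a detail from this posting.
ACCOUNT_URL = "https://my_account.snowflakecomputing.com"

def build_statement_request(sql: str, database: str, schema: str,
                            warehouse: str, timeout: int = 60) -> dict:
    """Build the JSON body for a Snowflake SQL API call
    (POST /api/v2/statements), which executes one SQL statement
    and returns its result set over REST."""
    return {
        "statement": sql,
        "timeout": timeout,      # seconds before the statement is cancelled
        "database": database,
        "schema": schema,
        "warehouse": warehouse,
    }

# In a real pipeline this body would be POSTed to
# f"{ACCOUNT_URL}/api/v2/statements" with an OAuth token or
# key-pair JWT in the Authorization header.
body = build_statement_request(
    "SELECT id, payload FROM events WHERE load_date = CURRENT_DATE",
    database="ANALYTICS", schema="PUBLIC", warehouse="COMPUTE_WH",
)
print(json.dumps(body, indent=2))
```

Keeping the request-building step as a pure function makes it easy to unit-test the distribution layer without a live Snowflake connection.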
Requirements
SKILLS / QUALIFICATIONS
- Bachelor’s degree in Computer Science, Software Engineering, Information Technology, Management Information Systems, or a related field required (Master’s degree preferred)
- 5 to 10 years’ experience in data analysis and in developing and modeling data objects in a Snowflake data store.
- Snowflake REST API experience is required.
- Experience building data pipelines from legacy systems to Snowflake for unstructured and semi-structured datasets.
- Informatica ETL and/or Talend ETL experience.
- Experience developing efficient SQL and PL/SQL queries and Python scripts.
- Proven ability to work in distributed systems
- Proficiency querying relational databases (such as DB2), with a focus on data transformations.
- Excellent problem-solving skills and team-oriented mindset
- Strong data modeling concepts and schema design for both relational and cloud data stores.
- Strong verbal and written communication skills; capable of collaborating effectively across a variety of IT and business groups, regions, and roles.
- Familiarity with data visualization tools such as Tableau and Power BI is a plus.
- Experience collaborating with data scientists/experts to integrate machine learning models into Snowflake.
- Data Warehousing background is required
Contact
Send resume & documents to
Call: