CPQi provides consultative services and expertise to help our clientele in the financial industry deliver better services to their users, accelerate development lifecycles, and move into the digital age. We work with 8 of the world’s top 12 banks, exposing our team to the most exciting technologies and strategies in today’s digital era. You will work on a variety of projects supporting our clients, and we want you to grow with us. To support that growth, we focus heavily on the progress and education of all of our team members. That’s why we invest in you by offering CPQi University, paid certifications, and other training programs. We want to build a career for life here at CPQi.
We have partnered with our client to develop a next-generation data architecture that supports their growing business. As part of the Operational Data and Integration Portfolio, you will be tasked with provisioning new datasets for MDM and ODS use cases, making changes to existing loads, and scheduling the overall data flow across batch, micro-batch, and real-time processing.
What you’ll do
- Participate in an agile team, working with the product owner, data stewards, and other team members to understand and codify data requirements
- Recommend an ETL design based on the requirements of the specific use case and provide accurate estimates of effort
- Assist other team members and DBAs in building out the data model; help test for and fix performance issues
- Partner with RDBMS DBAs to understand the source data structures and design of the target structures
- Use the appropriate tools and frameworks available to develop the data acquisition and ingestion process in accordance with the approved design
- Use workflow tools (JIRA, etc.) to maintain an accurate status of assigned tasks
- Include the necessary data validation steps to ensure completeness and accuracy of data
- Perform initial validation of the process and inspection of the data
- Utilize the deployment frameworks available to move artifacts from development to test to production environments
- Ensure jobs are scheduled to run on a frequency consistent with stakeholder requirements
- Ensure role-based access is established on target tables
- Support ETL jobs once deployed in production to ensure SLAs are met for data consumers
What you’ll need
- 3+ years of experience in an ETL Developer role; experience with Informatica Integration Cloud Services is preferred
- Advanced working knowledge of data acquisition frameworks and ETL development
- Advanced SQL coding experience and performance tuning
- Linux/Unix platform experience
- 3+ years of experience using the following platforms/technologies:
- ETL/ELT tools or suites
- Relational SQL databases
- Hadoop and NoSQL platforms
- Knowledge of Informatica MDM is desirable
- Bachelor’s degree in Computer Science, Information Systems or another applicable field is preferred
*** Must be located in, or willing to relocate to, Halifax