The Big Data Developer is an intermediate-level position responsible for participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.
- Build a data caching layer on NoSQL data platforms for fast response times
- Build out canonical models and data conformance
- Perform batch ingestion using big data tools such as Spark, Hive, Kafka, and HBase, along with other ETL tools
- Handle real-time ingestion, stream processing, and data distribution via Big Data APIs
- Implement best-in-class data management and data ingestion
- Build and implement best-in-class Data Quality tools for monitoring of Critical Data Elements
- Conduct tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, and model development, and establish and implement new or revised application systems and programs to meet specific business needs of user areas
- Monitor and control all phases of the development process, including analysis, design, construction, testing, and implementation, and provide user and operational support on applications to business users
- Serve as an advisor or coach to new or lower-level analysts
- Operate with a limited level of direct supervision
- Utilize in-depth specialty knowledge of applications development to analyze complex problems and issues, evaluate business processes, system processes, and industry standards, and make evaluative judgements
- Recommend and develop security measures in post-implementation analysis of business usage to ensure successful system design and functionality
- Exercise independence of judgement and autonomy
- Act as a subject matter expert (SME) to senior stakeholders and/or other team members
- Experience with Spark/Storm/Kafka or equivalent streaming/batch processing and event-based messaging
- Strong data analysis skills and the ability to slice and dice data as needed for business reporting
- Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements
- Strong understanding of Big Data architecture and the ability to troubleshoot performance and/or development issues on Hadoop (preferably Cloudera)
- 5-8 years of relevant experience
- Experience in systems analysis and programming of software applications
- Hands-on experience working with Spark, Python, Scala, Hive, Impala, and HBase for data curation/conformance-related work
- Strong proficiency in Spark for development work related to curation/conformance; strong Scala developer (with a prior Java background) preferred
- Experience working in an agile environment with fast-paced, changing requirements
- Excellent planning and organizational skills
- Experience in managing and implementing successful projects
- Working knowledge of consulting/project management techniques/methods
Vacancy Type: Full Time
Job Location: Mississauga, ON, CA
Application Deadline: N/A