Job: Technology
Location: Chennai
Schedule: Full-time
Employee Status: Permanent
We’re committed to promoting equality in the workplace and creating an inclusive and flexible culture – one where everyone can realise their full potential and make a positive contribution to our organisation. This in turn helps us to provide better support to our broad client base.
Job responsibilities:
- Performing data ingestion and extraction in HDFS
- Writing complex Hive queries for data processing as needed
- Writing shell scripts for data processing, cleansing, and database connectivity
- Developing, monitoring, and bug-fixing Control-M jobs
- Writing queries in Teradata for data ingestion and extraction through TD utilities
- Automating manual L3 tasks
- Performing end-to-end testing for every production release
- Coordinating production deployments for the interface between EDMp and Mantas
- Performing post-production bug fixes and support for the EDMp and Mantas interface across all live countries
- Performing peer code reviews and providing technical solutions to problems
- Leading a small project independently
Requirements:
- Minimum of 8 years of experience in the Hadoop ecosystem, preferably Hive, Oozie, YARN, Python, Spark, and Unix shell scripting
- Good hands-on experience developing with Control-M or a similar scheduling tool
- Good knowledge of data warehousing systems
- Banking domain exposure, preferably in AML
- Good communication skills to interact with different upstream/downstream teams, PSS, and the Dev team