Job Description and Responsibilities:
The candidate's primary role will be Hadoop Developer. As a developer on the team, the resource will write Linux shell scripts, set up Autosys jobs, and write Pig scripts, Hive queries, Oozie workflows, and MapReduce programs. In addition to development activities, the resource will participate in analysis and design and will complete project documentation. The goal of this project is to migrate an ETL batch process from mainframe/Teradata to Hadoop.

Job Requirements:
Must have these skills:
1. Minimum 4 years of experience using Core Java.
2. 4+ years of development experience in an RDBMS such as Oracle, Sybase, Teradata, Netezza, or MS SQL.
3. Experience writing shell scripts in Linux or Unix.
4. Experience with Autosys.
5. Excellent, in-depth knowledge of SQL.
6. Experience analyzing text and streams with Hadoop-based big data and NoSQL technologies, including:
· Hands-on experience running Pig and Hive queries.
· Analyzing data with Hive, Pig, and HBase.
· Hands-on experience with Oozie.
· Importing and exporting data with Sqoop between HDFS and relational database systems or the mainframe.
· Loading data into HDFS (a brief loading sketch appears at the end of this posting).
· Developing MapReduce programs to format data (see the MapReduce sketch at the end of this posting).
7. Experience developing a batch process.
8. The zeal to contribute, collaborate, and work in a team environment.
9. Excellent communication skills.

Project Background:
Source Consumer Card, Home Loans, and Auto Loans into Alpide. Establish EDMP controls, FES, and Manual Adjustments capabilities on Alpide. Develop the Andes parallel environment and source key data from Alpide for Enterprise Capital Management. Retail Data Strategy is a multiyear program, with commitment from Enterprise Risk and Enterprise Capital Management, to address the challenges related to quality and Retail ECRIS. Retail Data Strategy began in 2013; it successfully established Alpide and sourced data via Alpide for RAS/GPS (Alps) and BASEL (Andes).
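For illustration, here is a minimal sketch of the kind of MapReduce formatting program named in requirement 6: a map-only Java job that converts pipe-delimited extract records to tab-delimited output. The class name (RecordFormatJob) and the three-field input layout are hypothetical assumptions, not taken from this posting.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Hypothetical map-only job: reformats pipe-delimited mainframe extracts
// into tab-delimited records for downstream Hive/Pig processing.
public class RecordFormatJob {

    public static class FormatMapper
            extends Mapper<LongWritable, Text, Text, NullWritable> {

        private final Text out = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // Assumed input layout (illustrative only): ACCT_ID|PRODUCT|BALANCE
            String[] fields = value.toString().split("\\|", -1);
            if (fields.length < 3) {
                return; // skip malformed records; a real job would count them
            }
            out.set(fields[0].trim() + "\t" + fields[1].trim() + "\t" + fields[2].trim());
            context.write(out, NullWritable.get());
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "record-format");
        job.setJarByClass(RecordFormatJob.class);
        job.setMapperClass(FormatMapper.class);
        job.setNumReduceTasks(0); // map-only: pure formatting needs no aggregation
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(NullWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

A map-only job (zero reduce tasks) suits this task because reformatting requires no grouping or aggregation across records.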
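Likewise, a minimal sketch of the "Loading data into HDFS" step, using the standard Hadoop FileSystem API from Java. The paths are placeholders; in a batch process like the one described, this step would typically be driven by a shell script scheduled through Autosys.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Hypothetical loader: copies a local extract file into HDFS so that
// Hive, Pig, or MapReduce jobs can read it. Paths are placeholders.
public class HdfsLoad {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration(); // reads core-site.xml/hdfs-site.xml
        try (FileSystem fs = FileSystem.get(conf)) {
            Path local = new Path(args[0]);  // e.g. a local extract file
            Path remote = new Path(args[1]); // e.g. an HDFS landing directory
            // delSrc=false keeps the local source; overwrite=true replaces any old copy
            fs.copyFromLocalFile(false, true, local, remote);
            System.out.println("Loaded " + local + " -> " + remote);
        }
    }
}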