Title : Hadoop Engineer
Rate : DOE
Location : King of Prussia, PA
Client : Vertex
REQUIRED SKILLS
- Minimum of five years of experience in software engineering
- Minimum one year of experience developing with Big Data technologies
- Very strong experience with an object-oriented programming language, preferably Java
- Strong object-oriented analysis and design skills
- Experience with Apache Hive, HiveServer2, MapReduce, Apache Knox, Apache Ranger, Ambari, Hue, Presto
- Experience with developing RESTful services
- Experience with SQL, JDBC, ODBC
- Experience with MongoDB, MySQL, OpenLDAP
- Experience with JIRA and Confluence
- Experience working in a Scrum methodology
- Knowledge of cloud computing infrastructure - Amazon Web Services (EC2, EBS, S3)
- Strong communication (written and verbal) and interpersonal skills
Looking for a Hadoop engineer who has developed within the context of a web application, so they understand basic browser/UI interaction, know how to write a REST API, and know how to interact with Hadoop (an illustrative sketch follows the list below).
- Hadoop
- Java
- Web browser
- REST API
- Understanding interaction from the middle tier to the UI
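For illustration only, a minimal sketch of the kind of middle-tier code this role describes: a JAX-RS REST endpoint that queries Hive over the HiveServer2 JDBC driver. The class name, connection URL, table, and credentials below are hypothetical examples, not part of the client's actual stack.

// Illustrative sketch: REST endpoint backed by a Hive query over JDBC.
// Requires the Hive JDBC driver (org.apache.hive.jdbc.HiveDriver) and a
// JAX-RS runtime with a JSON provider on the classpath.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

@Path("/orders")
public class OrdersResource {

    // Hypothetical HiveServer2 endpoint; real deployments may route through Knox.
    private static final String HIVE_URL = "jdbc:hive2://hive-host:10000/default";

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public List<String> recentOrders() throws Exception {
        List<String> rows = new ArrayList<>();
        // Open a connection, run a simple HiveQL query, and return the results
        // to the UI as JSON.
        try (Connection conn = DriverManager.getConnection(HIVE_URL, "user", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT order_id FROM orders LIMIT 10")) {
            while (rs.next()) {
                rows.add(rs.getString(1));
            }
        }
        return rows;
    }
}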
Last updated on Jul 7, 2017