Interview: Phone and Skype (Locals Preferred)

Our client is looking to hire a Hadoop Big Data Developer/Architect focused on building the organization's capability to support a large-scale Big Data program. The company is transitioning to new technologies to create new value and/or savings, with Hadoop as one of the key technology platforms. This role will be critical to that success and will focus on analyzing projects coming through the pipeline, identifying opportunities to create and leverage Hadoop, and providing consultation/support on any projects that leverage the technology.
This person will architect the evolution of hundreds of files into a Hadoop ecosystem, enrich the data in Hadoop, and distribute significant amounts of that data to SQL and to dozens of other applications. The IT environment has traditionally been an Oracle / DB2 / Teradata shop and is now transitioning into the Big Data space with Hadoop.
Responsibilities:
- Create, configure, implement, optimize, document, and/or maintain:
  - Custom jobs to ingest, enrich, and distribute data in a Hadoop ecosystem
  - File-, source-, and record-level data quality checks
  - Logging and error handling
  - Job and workflow scheduling
  - Unix scripts
  - Tables and views
  - Role-based security across all data storage locations, including file-, table-, row-, and column-level security
- Compare data across storage locations, from files to Hadoop to any of the company's other database platforms
- Work as a technical leader, helping to establish our standards and best practices in this space; review and approve technical deliverables produced by vendor delivery partners
- Collaborate with information architects and solution architects to optimize our solutions leveraging this technology.
- Recommend tools that will optimize efficient delivery with this platform.
- Work independently, at times with partial information, moving as many deliverables forward as possible and mitigating the impact of blocking issues
- Professionally influence and negotiate with other technical leaders to arrive at and implement the optimal solution given standards and project constraints
- Mentor junior developers and other team members in all related Big Data technologies
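
To give candidates a feel for the record-level data quality checks mentioned above, here is a minimal, self-contained sketch. The field names, validation rules, and function names are illustrative assumptions for this posting, not the client's actual implementation; in practice this logic would typically run inside a Hadoop-ecosystem tool such as Spark or Hive rather than plain Python.

```python
# Illustrative sketch only: the fields, rules, and helpers below are
# hypothetical examples of record-level data quality checks, not the
# client's actual pipeline code.

def check_record(record, required_fields, validators):
    """Return a list of data-quality errors found in one record."""
    errors = []
    for field in required_fields:
        if record.get(field) in (None, ""):
            errors.append(f"missing required field: {field}")
    for field, is_valid in validators.items():
        value = record.get(field)
        if value is not None and not is_valid(value):
            errors.append(f"invalid value in {field}: {value!r}")
    return errors

def partition_records(records, required_fields, validators):
    """Split records into (clean, rejected) lists for downstream loading."""
    clean, rejected = [], []
    for rec in records:
        errs = check_record(rec, required_fields, validators)
        (clean if not errs else rejected).append((rec, errs))
    return clean, rejected
```

Clean records would flow on to enrichment and distribution, while rejected records (with their error lists) would be routed to the logging and error-handling duties listed above.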
What you will need to be successful:
- Bachelor's degree or equivalent work experience
- 8 or more years of application development experience
- 3 or more years of experience with Data Warehousing, ETL, and ELT technology
- 2 or more years of development experience on Big Data / Hadoop projects