Interview : Phone and F2F
Description :
Qualifications :
Top 3 skills needed:
- Skill 1: Strong hands-on experience with Scala, Hadoop platform tools, real-time streaming using Kafka, Node.js, microservice architecture, and REST APIs
- Skill 2: Experience with the AWS tech stack and tools: S3, EC2, EMR, MSK, FaaS, CloudWatch, API Gateway, service discovery tools, and monitoring, alerting, and dashboards
- Skill 3: Agile practices and DevOps CI/CD
Responsibilities:
This position is a permanent, full-time Developer or Associate Developer role, which begins with a 9-month program consisting of three rotations within our Atlanta offices. You will be embedded within a large-scale program, rotating through a variety of Test Automation, Data Engineering, and Development roles in order to gain a deep understanding of the program and emerge with subject matter expertise. Our on-the-job curriculum includes paired programming, hands-on activities, and virtual training. Training topics focus on program-specific technologies and processes. You will also be assigned a mentor to provide support throughout the program. We have active executive management sponsorship and involvement throughout the program. During your rotations, you will have the opportunity to participate in and/or complete features which address real business and customer needs. Lastly, you will improve your technical, business, and professional skills while working in a collaborative, agile organization.
WHAT ARE WE LOOKING FOR? / WHAT EXPERIENCE DO YOU NEED?
Must Haves:
Strong data engineering background - designing and building data pipelines to support real-time streaming and eventing - minimum of 2 years
Minimum of 2 years building microservices and API architecture
Experience with AI/ML tools and technologies
Experience with Hadoop, HBase, Hive and other Big Data technologies/tools
Experience working with relational and non-relational data
Experience working with Spark/Scala, Java, and Python
Experience with ingestion/integration tools such as Apache NiFi
Must have the ability to listen to and collaborate with colleagues; convey ideas effectively; and prepare clear, written documentation
Large project experience with high transaction volumes is required
Driven to solve difficult challenges
Exposure to TDD and automated testing frameworks
Experience with any of the following message formats: Parquet, Avro, Protocol Buffers
Experience with Cloud tools and technologies, including AWS
Experience creating solutions on AWS using services such as Kinesis, Lambda, and API Gateway
Knowledge of build tools like Maven and Gradle, and of the artifact lifecycle from snapshots to releases and patch fixes
Preferred:
A minimum of 2 years working in a fast-paced agile environment with VersionOne or similar
Ability to communicate with both business and technical teams on how the product works and integrates with other products
Previous experience supporting large, critical applications in a production environment, with strong debugging skills required
Proficient in at least one cloud automation framework, such as Terraform or CloudFormation
Extensive experience architecting, designing, and programming applications in an AWS cloud environment
Experience designing and building applications using AWS services such as EC2 and MSK
Experience architecting highly available systems that utilize load balancing and horizontal scalability
Exposure to Node.js
Last updated on Feb 4, 2020