
Data Engineer: Delta Airlines

Negotiable
Full-time
Interview: phone and face-to-face

Description:


Qualifications:
Top 3 skills needed:
- Skill 1: Strong hands-on experience with Scala, Hadoop platform tools, real-time streaming using Kafka, Node.js, microservice architecture, and REST APIs
- Skill 2: Experience with the AWS tech stack and tools: S3, EC2, EMR, MSK, FaaS, CloudWatch, API Gateway, service discovery tools, and monitoring, alerting, and dashboards
- Skill 3: Agile practices and DevOps CI/CD
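The first skill above centers on real-time streaming with Kafka. As a minimal sketch of the consume-transform-emit loop such a pipeline runs (an in-memory queue stands in for a Kafka topic, since a real consumer needs a broker and a client library; the flight events are invented for illustration):

```python
import json
from collections import deque

# Hypothetical in-memory stand-in for a Kafka topic.
topic = deque()

def produce(event: dict) -> None:
    """Serialize and append an event, as a producer would publish to a topic."""
    topic.append(json.dumps(event))

def consume_all(handler):
    """Drain the topic, applying a handler to each deserialized event."""
    results = []
    while topic:
        results.append(handler(json.loads(topic.popleft())))
    return results

produce({"flight": "DL123", "status": "boarding"})
produce({"flight": "DL456", "status": "departed"})
updates = consume_all(lambda e: f"{e['flight']}:{e['status']}")
```

In a real deployment the queue would be a durable, partitioned topic and the loop would run continuously against a consumer group; the pattern of deserialize, transform, emit is the same.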

Responsibilities:
This position is a permanent, full-time Developer or Associate Developer role that begins with a 9-month program consisting of three rotations within our Atlanta offices. You will be embedded within a large-scale program, rotating through a variety of Test Automation, Data Engineering, and Development roles to gain a deep understanding of the program and emerge with subject matter expertise. Our on-the-job curriculum includes paired programming, hands-on activities, and virtual training, with topics focused on program-specific technologies and processes. You will also be assigned a mentor to provide support throughout the program, and we have active executive management sponsorship and involvement throughout. During your rotations, you will have the opportunity to participate in and/or complete features that address real business and customer needs. Lastly, you will improve your technical, business, and professional skills while working in a collaborative, agile organization.

WHAT ARE WE LOOKING FOR? / WHAT EXPERIENCE DO YOU NEED?
Must haves:
- Strong data engineering background, designing and building data pipelines to support real-time streaming and eventing (minimum of 2 years)
- Minimum of 2 years building microservices and API architecture
- Experience with AI/ML tools and technologies
- Experience with Hadoop, HBase, Hive, and other Big Data technologies/tools
- Experience working with relational and non-relational data
- Experience working with Spark/Scala, Java, and Python
- Experience with ingestion/integration tools such as Apache NiFi
- Ability to listen to and collaborate with colleagues, convey ideas effectively, and prepare clear written documentation
- Large-project experience with high transaction volumes (required)
- Driven to solve difficult challenges
- Exposure to TDD and automated testing frameworks
- Experience with any of the following serialization formats: Parquet, Avro, Protocol Buffers
- Experience with cloud tools and technologies, including AWS
- Ability to create solutions on AWS using services such as Kinesis, Lambda, and API Gateway
- Knowledge of build tools such as Maven and Gradle, and of the artifact lifecycle from snapshots to releases and patch fixes
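One of the must-haves above is creating solutions on AWS with services such as Lambda and API Gateway. A minimal sketch of a Python Lambda handler using API Gateway's proxy-integration contract (`statusCode`/`headers`/`body`), invoked locally with a fabricated event; the business logic and field names are illustrative, not Delta's actual API:

```python
import json

def handler(event, context):
    """AWS Lambda handler behind API Gateway (proxy integration).

    The proxy integration delivers the request body as a JSON string in
    event["body"] and expects a dict with statusCode, headers, and body back.
    """
    body = json.loads(event.get("body") or "{}")
    flight = body.get("flight", "unknown")  # hypothetical payload field
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"flight": flight, "received": True}),
    }

# Local invocation with a made-up proxy-integration event:
resp = handler({"body": json.dumps({"flight": "DL123"})}, None)
```

Because the handler is a plain function, it can be unit-tested exactly like this without any AWS infrastructure, which fits the TDD exposure the list also asks for.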


Preferred:
- A minimum of 2 years working in a fast-paced agile environment with VersionOne or similar
- Ability to communicate with both business and technical teams about how the product works and integrates with other products
- Previous experience supporting large, critical applications in a production environment, with strong debugging skills
- Proficiency in at least one cloud automation framework, such as Terraform or CloudFormation
- Extensive experience architecting, designing, and programming applications in an AWS cloud environment
- Experience designing and building applications using AWS services such as EC2 and MSK
- Experience architecting highly available systems that use load balancing and horizontal scalability
- Exposure to Node.js
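The preferred qualifications mention load balancing and horizontal scalability. As a toy illustration of round-robin distribution across horizontally scaled instances (the hostnames are invented; in practice a managed load balancer such as an ALB does this, not application code):

```python
from itertools import cycle

# Hypothetical pool of horizontally scaled instances.
instances = ["app-1.internal", "app-2.internal", "app-3.internal"]
rr = cycle(instances)  # endless round-robin iterator over the pool

def route(n_requests: int) -> list[str]:
    """Assign each incoming request to the next instance in rotation."""
    return [next(rr) for _ in range(n_requests)]

assignments = route(5)
```

Round-robin is the simplest strategy; real balancers add health checks and connection-aware weighting, but the scalability idea is the same: adding instances to the pool increases capacity without changing the routing logic.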


Last updated on Feb 4, 2020
