
Data Engineer - (Hadoop/Hive/Python/Spark/Scala)

Capco · 30+ days ago
Negotiable
Full-time

Job Title: Senior Data Engineer/Developer

Number of Positions: 2

Job Description:

The Senior Data Engineer will be responsible for designing, developing, and maintaining scalable data pipelines and building out new API integrations to support continuing increases in data volume and complexity. They will collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.

Responsibilities:

  • Design, construct, install, test, and maintain highly scalable data management systems and data pipelines.
  • Ensure systems meet business requirements and align with industry best practices.
  • Build high-performance algorithms, prototypes, predictive models, and proof of concepts.
  • Research opportunities for data acquisition and new uses for existing data.
  • Develop data set processes for data modeling, mining, and production.
  • Integrate new data management technologies and software engineering tools into existing structures.
  • Create custom software components and analytics applications.
  • Establish and maintain disaster recovery procedures.
  • Collaborate with data architects, modelers, and IT team members on project goals.
  • Provide senior level technical consulting to peer data engineers during data application design and development for highly complex and critical data projects.

Qualifications:

  • Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent work experience.
  • 5-8 years of proven experience as a Senior Data Engineer or in a similar role.
  • Experience with big data tools: Hadoop, Spark, Kafka, Ansible, Chef, Terraform, Airflow, Protobuf RPC, etc.
  • Expert-level SQL skills for data manipulation (DML) and validation (DB2).
  • Experience with data pipeline and workflow management tools.
  • Experience with object-oriented and functional scripting languages: Python, Java, Go, etc.
  • Strong problem solving and analytical skills.
  • Excellent verbal communication skills.
  • Good interpersonal skills.
  • Ability to provide technical leadership for the team.

Last updated on Aug 12, 2024
