Data Engineer

DJE Holdings · 30+ days ago
Negotiable
Full-time
Edelman is a voice synonymous with trust, reimagining a future where the currency of communication is action. Our culture thrives on three promises: boldness is possibility, empathy is progress, and curiosity is momentum. 

At Edelman, we understand diversity, equity, inclusion and belonging (DEIB) transform our colleagues, our company, our clients, and our communities. We are in relentless pursuit of an equitable and inspiring workplace that is respectful of all, reflects and represents the world in which we live, and fosters trust, collaboration and belonging.

We are currently seeking a Data Engineer with 3-5 years' experience. The ideal candidate will be able to work independently in an Agile environment and will have experience with cloud infrastructure, leveraging tools such as Apache Airflow, Databricks, and Snowflake. Familiarity with real-time data processing and AI implementation is advantageous.

Why You'll Love Working with Us:
At Edelman, we believe in fostering a collaborative and open environment where every team member’s voice is valued. Our data engineering team thrives on building robust, scalable, and efficient data systems to power insightful decision-making. 

We are at an exciting point in our journey, focusing on designing and implementing modern data pipelines, optimizing data workflows, and enabling seamless integration of data across platforms. You’ll work with best-in-class tools and practices for data ingestion, transformation, storage and analysis, ensuring high data quality, performance, and reliability. 

Our data stack leverages technologies like ETL/ELT pipelines, distributed computing frameworks, data lakes, and data warehouses to process and analyze data efficiently at scale. Additionally, we are exploring the use of Generative AI techniques to support tasks like data enrichment and automated reporting, enhancing the insights we deliver to stakeholders.

This role provides a unique opportunity to work on projects involving batch processing, streaming data pipelines, and automation of data workflows, with occasional opportunities to collaborate on AI-driven solutions.
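
For a rough sense of what the batch side of this work looks like, here is a minimal PySpark sketch that reads a raw Parquet partition from a data lake, aggregates it, and writes a curated Delta table. The bucket paths, column names, and Delta Lake availability are assumptions for illustration only, not details of an actual Edelman pipeline.

```python
# Minimal batch-transformation sketch, assuming PySpark with the Delta Lake
# package configured; all paths and column names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("engagement-batch").getOrCreate()

# Read a raw Parquet partition from the data lake.
raw = spark.read.parquet("s3://example-bucket/raw/engagement/")

# Conform types and aggregate to one row per campaign and day.
daily = (
    raw.withColumn("event_date", F.to_date("event_ts"))
       .groupBy("campaign_id", "event_date")
       .agg(
           F.count("*").alias("events"),
           F.countDistinct("user_id").alias("unique_users"),
       )
)

# Write the curated table in Delta format for downstream analytics.
daily.write.format("delta").mode("overwrite").save(
    "s3://example-bucket/curated/engagement_daily/"
)
```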

If you’re passionate about designing scalable systems, building reliable data infrastructure, and solving real-world data challenges, you’ll thrive here. We empower our engineers to explore new tools and approaches while delivering meaningful, high-quality solutions in a supportive, forward-thinking environment.

Responsibilities:

  • Design, build, and maintain scalable and robust data pipelines to support analytics and machine learning models, ensuring high data quality and reliability for both batch and real-time use cases (see the orchestration sketch after this list).
  • Design, maintain, and optimize data models and data structures in tools such as Snowflake and Databricks.
  • Leverage Databricks and Cloud-native solutions for big data processing, ensuring efficient management of Spark jobs and seamless integration with other data services.
  • Utilize PySpark and/or Ray to build and scale distributed computing tasks, enhancing the performance of machine learning model training and inference processes.
  • Monitor, troubleshoot, and resolve issues within data pipelines and infrastructure, implementing best practices for data engineering and continuous improvement.
  • Diagrammatically document data engineering workflows.
  • Collaborate with other Data Engineers, Product Owners, Software Developers, and Machine Learning Engineers to implement new product features, understanding their needs and delivering in a timely manner.
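
To make the orchestration responsibility above concrete, here is a minimal sketch of a daily pipeline using the Airflow TaskFlow API (assuming Airflow 2.4 or later). The DAG name, bucket paths, and warehouse table are hypothetical placeholders rather than an actual production pipeline.

```python
# Minimal daily-pipeline sketch, assuming Airflow 2.4+ (TaskFlow API);
# the DAG name, bucket, and table names are hypothetical.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_engagement_pipeline():
    @task
    def extract() -> str:
        # Land raw files in object storage and return the partition path.
        return "s3://example-bucket/raw/engagement/"

    @task
    def transform(raw_path: str) -> str:
        # Clean and conform the raw data (e.g. via a PySpark or Databricks job).
        return raw_path.replace("/raw/", "/curated/")

    @task
    def load(curated_path: str) -> None:
        # Copy the curated partition into the warehouse (e.g. Snowflake).
        print(f"COPY INTO analytics.engagement FROM '{curated_path}'")

    load(transform(extract()))


daily_engagement_pipeline()
```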

Qualifications:

  • Minimum of 3 years' experience deploying enterprise-level, scalable data engineering solutions.
  • Strong examples of independently developed end-to-end data pipelines, from problem formulation and raw data through implementation, optimization, and results.
  • Proven track record of building and managing scalable cloud-based infrastructure on AWS (incl. S3, DynamoDB, EMR).
  • Proven track record of implementing and managing the AI model lifecycle in a production environment.
  • Experience using Apache Airflow (or equivalent), Snowflake, and Lucene-based search engines.
  • Experience with Databricks (Delta format, Unity Catalog).
  • Advanced SQL and Python knowledge with associated coding experience.
  • Strong experience with DevOps practices for continuous integration and continuous delivery (CI/CD).
  • Experience wrangling structured and semi-structured file formats (Parquet, CSV, JSON).
  • Understanding and implementation of best practices within ETL and ELT processes.
  • Implementation of data quality best practices using Great Expectations (see the validation sketch after this list).
  • Real-time data processing experience using Apache Kafka (or equivalent) is advantageous.
  • Ability to work independently with minimal supervision.
  • Takes initiative and is action-focused.
  • Mentors and shares knowledge with junior team members.
  • Collaborative with a strong ability to work in cross-functional teams.
  • Excellent communication skills with the ability to communicate with stakeholders across varying interest groups.
  • Fluency in spoken and written English.
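
As a rough illustration of the data quality expectation above, the following sketch runs two declarative checks with Great Expectations, assuming the 0.17/0.18-era fluent API; the CSV path and column names are hypothetical.

```python
# Minimal data-quality sketch, assuming the Great Expectations fluent API
# (roughly 0.17/0.18); the file path and column names are hypothetical.
import great_expectations as gx

context = gx.get_context()

# Wrap a sample extract in a Validator so expectations can run against it.
validator = context.sources.pandas_default.read_csv("data/engagement_metrics.csv")

# Declarative checks: keys must be present and rates must fall in [0, 1].
validator.expect_column_values_to_not_be_null("campaign_id")
validator.expect_column_values_to_be_between(
    "engagement_rate", min_value=0, max_value=1
)

# Persist the suite and validate the batch; failures surface in the result.
validator.save_expectation_suite(discard_failed_expectations=False)
results = validator.validate()
print(results.success)
```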

We are dedicated to building a diverse, inclusive, and authentic workplace, so if you’re excited about this role but your experience doesn’t perfectly align with every qualification, we encourage you to apply anyway. You may be just the right candidate for this or other roles.

Last updated on Dec 17, 2024

About the company

DJE Holdings is an investment firm that focuses on acquiring and building businesses in the financial services, technology, and healthcare sectors.
