
Data Engineer - Stream Data Processing - Distributed Data Processing


About Pathway

Pathway is a deep-tech start-up founded in March 2020.

  • Our primary developer offering is an ultra-performant data processing framework (unified streaming + batch) with a Python API, a distributed Rust engine, and capabilities for data source integration & transformation at scale (Kafka, S3, databases/CDC, ...); a minimal usage sketch follows this list.
  • The single-machine version is provided on a free-to-use license (`pip install pathway`).
  • Major data use cases are around event-stream data (including real-world data such as IoT), and graph data that changes over time.
  • Our enterprise offering is currently used by leaders of the logistics industry, such as DB Schenker or La Poste, and tested across multiple industries. Pathway has been featured in Gartner's market guide for Event Stream Processing.
  • Learn more at http://pathway.com/ and https://github.com/pathwaycom/.
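
For a flavour of the developer experience, here is a minimal sketch of a Pathway pipeline, assuming the publicly documented Python API (`pw.io.csv` connectors, `pw.reducers`, `pw.run`); the schema and file paths are hypothetical examples, and exact signatures may vary between versions:

```python
import pathway as pw

# Schema of the incoming event stream (hypothetical example data).
class Measurement(pw.Schema):
    sensor_id: str
    value: float

# Read a directory of CSV files as an unbounded stream; with
# mode="static" the same code runs in batch over fixed data.
events = pw.io.csv.read(
    "./measurements/",
    schema=Measurement,
    mode="streaming",
)

# Incrementally maintained aggregate: running sum per sensor.
totals = events.groupby(events.sensor_id).reduce(
    events.sensor_id,
    total=pw.reducers.sum(events.value),
)

pw.io.csv.write(totals, "./totals.csv")

# Launch the engine; outputs are kept up to date as new rows arrive.
pw.run()
```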

The Team

Pathway is built by and for overachievers. Its co-founders and employees have worked in some of the world's best AI labs (Microsoft Research, Google Brain, ETH Zurich) and at Google, and graduated from top universities (Polytechnique, ENSAE, Sciences Po, HEC Paris, a PhD obtained at the age of 20, etc.). Pathway's CTO has co-authored work with Geoff Hinton and Yoshua Bengio. The management team also includes the co-founder of Spoj.com (1M+ developer users) and NK.pl (13.5M+ users), and an experienced growth leader who has scaled companies with multiple exits.

The opportunity

We are searching for a person with a Data Processing or Data Engineering profile who is willing to work with live client datasets and to test, benchmark, and showcase our brand-new stream data processing technology.

The end users of our product are mostly developers and data engineers working in corporate environments. Our development framework is expected to become part of their preferred development stack for analytics projects at work – their daily bread and butter.

You Will

You will work closely with our CTO and Head of Product, as well as with key developers. You will be expected to:

  • Implement the flow of data from its location in clients' warehouses up to Pathway's ingress.
  • Set up CDC interfaces for change streams between client data stores and the I/O data processed by Pathway, ensuring data persistence for Pathway outputs (see the sketch after this list).
  • Design ETL pipelines within Pathway.
  • Contribute to benchmark framework design (throughput / latency / memory footprint; consistency), including in a distributed system setup.
  • Contribute to building open-source test frameworks for simulated streaming data scenarios on public datasets.
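
As a rough illustration of the CDC item above, the sketch below wires a Debezium change stream into Pathway and persists the result to Postgres, assuming the `pw.io.debezium` and `pw.io.postgres` connectors as currently documented; the broker address, topic, table names, and credentials are all placeholders:

```python
import pathway as pw

# Kafka connection settings for the Debezium change stream
# (placeholder broker address and consumer group).
rdkafka_settings = {
    "bootstrap.servers": "kafka:9092",
    "group.id": "pathway-cdc-demo",
    "auto.offset.reset": "earliest",
}

# Schema of the replicated client table (hypothetical example).
class Order(pw.Schema):
    order_id: int = pw.column_definition(primary_key=True)
    amount: float

# Ingest the change stream Debezium produces for a client table;
# inserts, updates, and deletes are applied to the Pathway table.
orders = pw.io.debezium.read(
    rdkafka_settings,
    topic_name="clientdb.public.orders",
    schema=Order,
)

# A simple live transformation on the changing table.
big_orders = orders.filter(orders.amount > 1000.0)

# Persist Pathway's output back to a Postgres table.
pw.io.postgres.write(
    big_orders,
    postgres_settings={
        "host": "localhost",
        "port": "5432",
        "dbname": "analytics",
        "user": "pathway",
        "password": "***",  # placeholder credential
    },
    table_name="big_orders",
)

pw.run()
```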

Requirements

  • Inside-out understanding of at least one major distributed data processing framework (Spark, Dask, Ray,...)
  • 6+ months of experience working with a streaming dataflow framework (e.g. Flink, Kafka Streams or ksqlDB, Spark in streaming mode, Beam/Dataflow).
  • Ability to set up distributed dataflows independently.
  • Experience with data streams: message queues, message brokers (Kafka), CDC.
  • Working familiarity with data schema and schema versioning concepts; Avro, Protobuf, or others.
  • Familiarity with Kubernetes.
  • Familiarity with deployments in both Azure and AWS clouds.
  • Good working knowledge of Python.
  • Good working knowledge of SQL.
  • Experience working for an innovative tech company with a long-term vision (SaaS, IT infrastructure, or similar preferred).
  • Warmly disposed towards open-source and open-core software, but pragmatic about licensing.


Bonus Points

  • Know the ways of developers in a corporate environment.
  • Passionate about trends in data.
  • Proficiency in Rust.
  • Experience with Machine Learning pipelines or MLOps.
  • Familiarity with any modern data transformation workflow tooling (dbt, Airflow, Dagster, Prefect,...)
  • Familiarity with Databricks Data Lakehouse architecture.
  • Familiarity with Snowflake's data product vision (2022+).
  • Experience in a startup environment.

Why You Should Apply

  • Intellectually stimulating work environment. Be a pioneer: you get to work with a new type of stream processing framework.
  • Work in one of the hottest data startups in France, with exciting career prospects.
  • Responsibilities and the ability to make a significant contribution to the company's success.
  • Compensation: annual salary of €60K-€100K + employee stock option plan.
  • Inclusive workplace culture.


Further details

  • Type of contract: Permanent employment contract
  • Preferred joining date: early 2023.
  • Compensation: annual salary of €60K-€100K + employee stock option plan.
  • Location: Remote work from home. Possibility to work or meet with other team members in one of our offices:
    • Paris – Agoranov (where Doctolib, Alan, and Criteo were born) near Saint-Placide Metro (75006).
    • Paris Area – Drahi X-Novation Center, Ecole Polytechnique, Palaiseau.
    • Wroclaw – University area.

Candidates based anywhere in the EU, United States, and Canada will be considered.
