Negotiable
Full-time

Key Skills:

  • Databricks, Spark
  • Snowflake Cloud Data Platform
  • Data Vault 2.0 model
  • Enterprise Data Integrations
  • AWS Cloud architecture
  • Event and messaging patterns, streaming data, AWS, Kafka


Role & Responsibilities:

Data Engineers will be responsible for designing, building, and maintaining data pipelines, ensuring data quality, efficient processing, and timely delivery of accurate and trusted data.

The ability to design, implement, and optimize large-scale data and analytics solutions on Databricks, Spark, and the Snowflake Cloud Data Warehouse is essential.
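
For illustration only, here is a minimal PySpark sketch of the kind of pipeline this role covers, with a basic data-quality gate before publishing. The table and column names (raw_orders, order_ts, analytics.daily_orders) are hypothetical placeholders, not details from this posting.

    # Minimal PySpark pipeline sketch: ingest, validate, aggregate, publish.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily_orders_aggregation").getOrCreate()

    raw = spark.read.table("raw_orders")  # hypothetical source table

    # Data-quality gate: drop rows missing keys, fail loudly on duplicates.
    clean = raw.dropna(subset=["order_id", "customer_id"])
    if clean.count() != clean.dropDuplicates(["order_id"]).count():
        raise ValueError("duplicate order_id values detected in raw_orders")

    # Aggregate and publish a trusted, analytics-ready table.
    daily = (
        clean.groupBy(F.to_date("order_ts").alias("order_date"))
             .agg(F.sum("order_total").alias("revenue"),
                  F.countDistinct("customer_id").alias("customers"))
    )
    daily.write.mode("overwrite").saveAsTable("analytics.daily_orders")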

Ensure performance, security, and availability of the data warehouse.

Establish ongoing end-to-end monitoring for the data pipelines.

A strong understanding of the full CI/CD lifecycle is required.

Must Haves:

  • 2+ years of recent experience with Databricks / Spark / Snowflake and 6+ years total in a data engineering role.
  • Designing and implementing highly performant data ingestion pipelines from multiple sources using Spark and Databricks.
  • Extensive working knowledge of Spark and Databricks
  • Demonstrable experience designing and implementing modern data warehouse/data lake solutions with an understanding of best practices.
  • Hands-on development experience with the Snowflake data platform, including Snowpipe, SnowSQL, tasks, stored procedures, streams, resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, cloning, time travel, and data sharing.
  • Advanced proficiency in writing complex SQL statements and manipulating large structured and semi-structured datasets.
  • Data loading/unloading and data sharing
  • Strong hands-on experience with SnowSQL queries, script preparation, stored procedures, and performance tuning
  • Knowledge of Snowpipe implementation (a Snowpipe sketch follows this list)
  • Create Spark jobs for data transformation and aggregation
  • Produce unit tests for Spark transformations and helper methods (a test sketch follows this list)
  • Security design and implementation on Databricks
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management.
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and scalable 'big data' data stores.
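
As referenced above, a hedged sketch of a Snowpipe definition issued through the Snowflake Python connector (snowflake-connector-python). The stage, table, and connection values are placeholders, not resources named in this posting.

    import snowflake.connector

    # Placeholder credentials; in practice these come from a secrets manager.
    conn = snowflake.connector.connect(
        account="<account>", user="<user>", password="<password>",
        warehouse="LOAD_WH", database="RAW", schema="EVENTS",
    )

    # CREATE PIPE wraps a COPY INTO statement; AUTO_INGEST lets cloud-storage
    # event notifications trigger loads as new files land on the stage.
    create_pipe = """
    CREATE PIPE IF NOT EXISTS events_pipe
      AUTO_INGEST = TRUE
    AS
      COPY INTO raw_events
      FROM @events_stage
      FILE_FORMAT = (TYPE = 'JSON');
    """

    with conn.cursor() as cur:
        cur.execute(create_pipe)
        cur.execute("SHOW PIPES LIKE 'events_pipe'")  # verify the pipe exists
        print(cur.fetchall())
    conn.close()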
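
And a minimal pytest sketch for unit-testing a Spark transformation against a local SparkSession; add_order_date is a hypothetical helper standing in for a real transformation.

    import pytest
    from pyspark.sql import SparkSession, functions as F

    def add_order_date(df):
        # Transformation under test: derive a date column from a timestamp string.
        return df.withColumn("order_date", F.to_date("order_ts"))

    @pytest.fixture(scope="session")
    def spark():
        return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()

    def test_add_order_date(spark):
        df = spark.createDataFrame([("2022-11-28 10:15:00",)], ["order_ts"])
        result = add_order_date(df).select("order_date").first()[0]
        assert str(result) == "2022-11-28"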

Good to Have:

  • Valid professional certification
  • Experience in Python/PySpark/Scala/Hive programming.
  • Excellent verbal and written communication and interpersonal skills
  • Confidence and agility in challenging times
  • Ability to work collaboratively with cross-functional teams in a fast-paced, team environment

Last updated on Nov 28, 2022
