Principal Data Engineer - GCP / Newton, MA (Onsite), 6-Month Contract

$79k+ (estimated) · Full-time
Job Description:
The Principal Data Engineer / Lead will play a key role in the design and development of data models, datasets, and ETL/ELT pipelines from a variety of data sources to the data warehouse / data lake on Google Cloud Platform (GCP) for the client's flagship Priority Engine product. Priority Engine relies heavily on this foundational data layer.
The Principal Data Engineer will help provide technical leadership and strategic direction for Priority Engine's data platform. The position is responsible for the delivery and technical quality of our programs and for taking on complex projects. It also involves partnering with cross-functional stakeholders to drive initiatives forward and mentoring team members in technical design, project leadership, and team processes. The position is also responsible for building and maintaining scalable data infrastructure.
The candidate must have a keen sense of the business drivers, a vision for the data strategy, and the ability to help execute it. The candidate must be familiar with industry trends and best practices for data engineering on the cloud and apply them appropriately.
Role / Responsibilities:
  • Design, develop, and implement end-to-end data solutions (storage, integration, processing, access) on Google Cloud Platform (GCP)
  • Create ETL/ELT pipelines that transform and process terabytes of structured and unstructured data in real time
  • Design data models for optimal storage and retrieval to support sub-second latency
  • Deploy and monitor large database clusters that are performant and highly available
  • Immerse deeply in our Priority Engine data pipelines to gain a full understanding of them and the ability to converse with business representatives and individual contributors alike
  • Bring a solid understanding of cloud-based offerings in the space
  • Take on upcoming technical challenges, including scaling our data ingestion pipelines across a growing number of GCP, AWS, and data-center-based data sources; reducing the latency of our product data ingestion pipelines by moving batch jobs to a streaming architecture (see the pipeline sketch after this list); and extending our data lake architecture for the growing ecosystem of data ingestion and creation tools
  • Be a hands-on contributor on the team
  • Collaborate with product/application architects to develop holistic solutions
  • Implement operational procedures (logging, monitoring, alerting, etc.) for dependable running of pipelines and jobs
  • Be autonomous: you own what you work on, move fast, take ownership, and get things done. The data lake/warehouse is a central, essential service for multiple initiatives, so you must be able to multitask and juggle priorities regularly, with a deep understanding of the business impact of prioritization decisions
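
One of the challenges named above is moving batch ingestion jobs to a streaming architecture. As a purely illustrative sketch (not the team's actual implementation), a streaming path on GCP could pair Pub/Sub with an Apache Beam pipeline run on Dataflow, landing events in BigQuery. The project, topic, and table names below are hypothetical placeholders.

# A minimal streaming ingestion sketch using Apache Beam.
# Project, topic, and table names are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    # Decode a raw Pub/Sub message payload into a BigQuery row dict.
    return json.loads(message.decode("utf-8"))


def run():
    # streaming=True tells the runner to execute as an unbounded pipeline.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/ingest-events")
            | "ParseJSON" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="example-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()

Run with --runner=DataflowRunner (plus project and region flags) to execute on Dataflow; with no runner flag, Beam falls back to the local DirectRunner, which is handy for testing.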
Skills/Requirements:
  • BS degree in Computer Science or a related field
  • 10+ years of industry experience, including 5+ years of experience building real-time and distributed system architectures, from whiteboard to production
  • Strong programming skills in Python and SQL
  • Versatility: 8+ years of experience across the entire spectrum of data engineering, including:
  • Cloud data stores (e.g., GCP BigQuery, GCP Bigtable, GCP Firestore, GCP Cloud SQL, ELK)
  • Data pipeline and workflow orchestration tools (e.g., Dataflow, Pentaho, Airflow, Azkaban; an Airflow sketch follows this list)
  • Data processing technologies (e.g., GCP BigQuery, Spark)
  • Data messaging technologies (e.g., GCP Pub/Sub, Kafka)
  • Deployment and monitoring of large database clusters on public cloud platforms (e.g., Docker, Terraform)
  • Unix/Linux shell scripting
  • Demonstrated knowledge of industry trends and standards
  • Ability to think through multiple alternatives and select the best possible solution for strategic and tactical business needs
  • Excellent communication: you will need to communicate complex ideas effectively to both technical and non-technical audiences, verbally and in writing
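
To illustrate the workflow-orchestration line above: a daily BigQuery transform could be scheduled with an Airflow DAG along the following lines. This is a sketch under assumptions, not part of the posting itself: the DAG id, dataset, table, and query are hypothetical, and BigQueryInsertJobOperator requires the apache-airflow-providers-google package.

# A minimal Airflow DAG sketch scheduling a daily BigQuery transform.
# DAG id, dataset, table, and query are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_events_rollup",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Run a standard-SQL query job that appends a daily rollup table.
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_events",
        configuration={
            "query": {
                "query": (
                    "SELECT DATE(event_ts) AS day, COUNT(*) AS events "
                    "FROM `example-project.raw.events` "
                    "GROUP BY day"
                ),
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "analytics",
                    "tableId": "daily_events",
                },
                "writeDisposition": "WRITE_APPEND",
                "useLegacySql": False,
            }
        },
    )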
Nice to Have:
  • Experience with GCP
  • Java Programming

Last updated on Feb 24, 2023
