
Data Engineering Lead

Truckstop · US Remote · Full-time · $140k+

At Truckstop, we have transformed the entire freight-moving lifecycle with our SaaS solutions. From freight matching to payments and everything in between, we are the trusted partner for carriers, brokers, and shippers alike. We lead this industry forward with our One Team mindset, committing to principles such as assume positive intent, have each other’s back, and be your authentic self. Our drive for greatness produces high expectations, yet our regard for humans is even higher. Join a team of brilliant minds and generous hearts who care deeply about others’ success.

Position Summary: 

The Data Engineering team within Data Services is responsible for creating and maintaining data pipelines, assembling complex datasets, and extracting, transforming, and loading data from a variety of data sources.  

Data Engineers work closely with Software Engineers, Product Managers, Data Scientists, and teams across the enterprise to deliver product features, build data pipelines, and architect our Product and Analytics data infrastructure for optimal performance.  

Data Engineers ensure that Truckstop has hygienic, accurate data available at the point of need and partner with the System Architecture and DevOps teams to troubleshoot data pipeline production issues and respond to incidents.

The Sr/Lead Data Engineer role works directly with the Director of Data Services to oversee the Data Engineering team, handle the most complex issues, and provide coaching and guidance to more junior Data Engineers. The Sr/Lead DE possesses expert working knowledge of technical subject matter, as well as the ability to strategize and problem-solve on data infrastructure, data migration tooling, data modeling, and complex troubleshooting.

Essential Job Functions: 

  • Create, automate, and maintain complex data pipelines from multiple source systems using migration and orchestration technology, employing architecture best practices, leveraging leading-edge technology, and utilizing multiple programming languages.
  • Extract, transform, and load data from a variety of data sources using SQL, Python, and cloud data technologies.
  • Interact with APIs and Kafka to ingest and transform data into Snowflake for Product and Data Science team end uses.
  • Write and optimize queries and data processes, write orchestration functions, and perform source-to-target mapping and data modeling.
  • Work directly with Product to engineer performant data pipelines required to support Product analytics, reporting, and strategy. 
  • Work with data architects, database administrators, Infosec, and software engineering solution architects to continually improve data security, data capture, data pipelines, and data infrastructure.
  • Work with IT operations to resolve data related technical issues and respond to data related major incidents. 
  • Perform incident resolution and root cause analysis of critical outages. Implement solutions to systematic failures. Provide on-call support, including after-hours. 
  • Assist with documentation of the environments that support our products in Confluence and Atlan.

Position Requirements: 

  • Bachelor's degree or equivalent professional experience required; Computer Science or Engineering preferred. 
  • Demonstrated expertise in SQL and Python required. 
  • Minimum of 6+ years of experience required.
  • Experience with Snowflake, MS SQL, and PostgreSQL required. 
  • Experience building and optimizing reliable, idempotent data pipelines (streaming or batch) using ETL tools such as Matillion, Apache Airflow, Fivetran, Kafka, and Spark required.
  • A successful history of applying software engineering principles to build data infrastructure tools/libraries that automate and scale data pipelines.
  • Experience supporting Data Science applications, including machine learning data pipelines; Azure ML experience is a plus.
  • Understanding of applying logging and metrics to monitor and detect data pipeline performance issues. 
  • Advanced understanding of data warehousing architecture including performance optimization and tuning, specifically within Snowflake and Azure.  
  • Working knowledge of dbt data modeling and source-to-target mapping, with a demonstrated understanding of data schema design, Snowflake clustering, and micro-partitioning.
  • Experience working in Azure and AWS cloud-based environments.

At Truckstop we are dedicated to creating a workplace that is equitable, inclusive, and inspiring for every employee. In an effort to provide greater transparency, we are sharing the base salary range for this position. The position is also eligible for a yearly bonus. Final salary is based on a number of factors including market location, job-related knowledge, education/training, certifications, key skills, experience, internal peer equity as well as business considerations.

The anticipated base pay range for this position is:
$140,000 – $160,000 USD

The above description covers the most significant duties performed but does not include other related occasional work that may be assigned or is completed by the employee. 

Truckstop provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. 

Truckstop participates in the E-Verify program. Learn more about the E-Verify program here: https://www.e-verify.gov/

