
Sr. Data Pipeline Engineer

East Lansing, MI, 48823, US
Negotiable
Full-time

Vertafore is looking for talented people to join our team in Michigan. Our dynamic environment provides professional development, fast upward mobility, and exposure to the latest and greatest in technology.

Vertafore is a leading technology company whose innovative software solutions are advancing the insurance industry. Our suite of products helps our customers better manage their business, boost their productivity and efficiency, and lower costs while strengthening relationships. Our mission is to move InsurTech forward by putting people at the heart of the industry. We are leading the way with product innovation, technology partnerships, and a focus on customer success. Our fast-paced and collaborative environment inspires us to create, think, and challenge each other in ways that make our solutions and our teams better.

We are headquartered in Denver, Colorado, with offices across the U.S., including East Lansing, Michigan – we are minutes from Michigan State University, Lansing Community College, and Cooley Law School!

Vertafore is a Flexible First working environment, which allows team members to work from home as often as they'd like, while using our offices as a place for collaboration, community, and team building. There are times you may be asked to come into an office and/or travel for specific meetings with a specific business purpose; this varies by job responsibilities.

JOB DESCRIPTION

Operate, manage, and maintain data pipelines that utilize multiple data sources and reporting tools such as Oracle, PostgreSQL, and Pentaho. Implement and maintain reporting and pipeline infrastructure in AWS using automation tools such as Terraform, Ansible, and Chef. Perform operational monitoring and maintenance of said infrastructure and software platforms. Serve as the liaison and collaborator with development and data operations teams. Write automated tests to continuously validate data pipeline functionality.
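To give a concrete sense of the "automated tests to continuously validate data pipeline functionality" part of the role, here is a minimal, hypothetical sketch of the kind of post-load validation a CI job might run after a batch completes. All names (column set, function, sample rows) are illustrative assumptions, not anything from Vertafore's actual stack.

```python
# Hypothetical post-load validation: compare source and target batches for
# row-count parity and required columns. Column names are made up.

REQUIRED_COLUMNS = {"policy_id", "premium", "effective_date"}

def validate_batch(source_rows, target_rows):
    """Return a list of human-readable validation failures (empty = pass)."""
    failures = []
    # A lossy load is the most common pipeline regression: check counts first.
    if len(source_rows) != len(target_rows):
        failures.append(
            f"row count mismatch: source={len(source_rows)} target={len(target_rows)}"
        )
    # Then verify every loaded row still carries the columns reports depend on.
    for i, row in enumerate(target_rows):
        missing = REQUIRED_COLUMNS - row.keys()
        if missing:
            failures.append(f"row {i} missing columns: {sorted(missing)}")
    return failures

if __name__ == "__main__":
    src = [{"policy_id": 1, "premium": 100.0, "effective_date": "2024-01-01"}]
    print(validate_batch(src, src))   # a clean load produces no failures
    print(validate_batch(src, []))    # a dropped batch is flagged
```

In practice the rows would come from queries against the source database and the warehouse rather than in-memory lists, and a failure list would fail the pipeline run or page the on-call engineer.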

 

Core Responsibilities:

Essential job functions include but are not limited to the following:

· Provisions and manages data pipeline infrastructure using technologies such as Terraform and Chef.

· Monitors and troubleshoots issues within the data pipeline infrastructure using tools like Dynatrace and Kibana.

· Drives and manages design conversations for features, taking business needs into account

· Develops new features and supports/bug-fixes existing features

· Ensures the quality of code written or reviewed within the team

· Follows industry trends and the open-source community

· Interacts with customers to gather insights and translate technical concepts
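The monitoring responsibility above can be made concrete with a small, hypothetical data-freshness check, of the sort a pipeline engineer might wire into alerting alongside tools like Dynatrace or Kibana. The function name and thresholds are illustrative assumptions, not part of the posting.

```python
# Illustrative sketch: classify a pipeline as healthy or stale based on how
# long ago its last successful load finished. Thresholds are made up.
from datetime import datetime, timedelta, timezone

def freshness_status(last_load_at: datetime, max_lag: timedelta) -> str:
    """Return 'STALE' if the last load is older than max_lag, else 'OK'."""
    lag = datetime.now(timezone.utc) - last_load_at
    return "STALE" if lag > max_lag else "OK"

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    print(freshness_status(now, timedelta(hours=1)))                        # OK
    print(freshness_status(now - timedelta(hours=2), timedelta(hours=1)))   # STALE
```

A real deployment would read `last_load_at` from a pipeline-run metadata table and emit the status to a monitoring endpoint rather than printing it.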

 

Knowledge, Skills, and Abilities:

· Advanced communication and interpersonal skills; able to defuse tension and drive productive conversations, internally and externally

· Strategic problem solver with excellent analytical and critical thinking skills

· An innate curiosity about how things work; proactively acquires new skills and learns new tools and technologies to troubleshoot complex issues

· Comfortable and effective at making decisions across the product that support long-term maintainability, reuse, security, and performance

· Applies knowledge gained and is effective at knowledge-sharing and assisting other team members

· Able to work with global, distributed teams

· In tune with high-performance and high-availability design/programming

· Highly proficient with multiple database technologies

· Strong in both design and implementation at an enterprise level

· Communicate clearly to explain and defend design decisions

· Desire to collaborate with the business, designing pragmatic solutions

 

Qualifications:

· Degree in computer science, engineering or related field required

· Minimum 3 years of professional experience with DevOps and data pipeline technologies required

· Experience working in an Agile environment required

· 3+ years of experience in security best practices for software and data platforms required

· 3+ years of experience with design, architecture, and implementation at an enterprise level required

· 3+ years of experience with database technologies (e.g., Oracle, SQLServer, PostgreSQL, Cassandra, etc.)

· Experience with the following technologies: Ansible, Chef, Git, Java Frameworks, Pentaho, Terraform, xDB

 

Additional Requirements and Details:

· Travel required up to 5% of the time.

· WFH Flexible

· Occasional lifting and/or moving up to 10 pounds.

· Frequent repetitive hand and arm movements required to operate a computer.

· Specific vision abilities required by this job include close vision (working on a computer, etc.).

· Frequent sitting and/or standing.

· #LI-Hybrid

· $90,000 – $130,000 per year


Last updated on Sep 4, 2024

