Data & Analytics - Data Engineer - Data Quality

$75+ / hour
Full-time
 
My name is Bill Stevens, and I have a new six-month-plus, hybrid-schedule Data Engineer, Data Analytics and Data Quality opportunity available with a major firm located in Bethlehem, Pennsylvania that could be of interest to you. Please review the specification below; I am available at any time to speak with you, so feel free to call me. The work schedule is hybrid: three days a week in the office and two days remote.
 
The firm will NOT entertain a remote candidate. The ideal candidate should also possess a green card or U.S. citizenship.
 
This position pays $75.00 per hour on a W-2 hourly basis or $85.00 per hour on a Corp-to-Corp basis. The Corp-to-Corp rate is for independent contractors only, not third-party firms.
 
Description: 
The firm is seeking an experienced Data Engineer to join its Data and Analytics organization. You will play a key role in building and delivering best-in-class data and analytics solutions aimed at creating value and impact for the organization and its customers. As a member of the data engineering team, you will help develop and deliver data products, with quality backed by best-in-class engineering. You will collaborate with analytics, business, and IT partners to enable these solutions.
 
The Qualified Candidate will:
Architect, build, and maintain scalable, reliable data pipelines with robust data quality built into the pipeline, so the output can be consumed by the analytics and BI layers.
Design, develop, and implement low-latency, high-availability, performant data applications, and recommend and implement innovative engineering solutions.
Design, develop, test, and debug code in Python, SQL, PySpark, and Bash per the firm's standards.
Design and implement a data quality framework and apply it to critical data pipelines to make the data layer robust and trustworthy for downstream consumers.
Design and develop the orchestration layer for data pipelines written in SQL, Python, and PySpark.
Apply and provide guidance on software engineering techniques such as design patterns, code refactoring, framework design, code reusability, code versioning, performance optimization, and continuous integration/continuous delivery (CI/CD) to make the data analytics team robust and efficient.
Perform all job functions consistent with the firm's policies and procedures, including those governing the handling of PHI and PII.
Work closely with various IT and business teams to understand system opportunities and constraints in order to make maximal use of the firm's Enterprise Data Infrastructure.
Develop relationships with business team members by being proactive, demonstrating a growing understanding of business processes, and recommending innovative solutions.
Communicate project output in terms of customer value, business objectives, and product opportunity.
 
The Qualified Candidate should possess:
5+ years of experience and a Bachelor's or Master's degree in Computer Science, Engineering, Applied Mathematics, or a related field.
Extensive hands-on development experience in Python, SQL, and Bash.
Extensive experience in performance optimization of data pipelines.
Extensive hands-on experience with cloud data warehouse and data lake platforms such as Databricks, Redshift, or Snowflake.
Familiarity with building and deploying scalable data pipelines and data solutions using Python, SQL, and PySpark.
Extensive experience in all stages of software development and expertise in applying software engineering best practices.
Experience developing and implementing a data quality framework, either home-grown or built on an open-source framework such as Great Expectations, Soda, or Deequ.
Extensive experience developing an end-to-end orchestration layer for data pipelines using frameworks such as Apache Airflow, Prefect, or Databricks Workflows.
Familiarity with RESTful web services (REST APIs) for integrating with other services.
Familiarity with API gateways such as Apigee for securing web service endpoints.
Familiarity with concurrency and parallelism.
Familiarity with data pipelines and the client development cycle.
Experience creating and configuring continuous integration/continuous deployment (CI/CD) pipelines to build and deploy applications across environments, using DevOps best practices to promote code to production.
Ability to investigate and repair application defects in any component (front end, business logic, middleware, or database) to improve code quality and consistency, reduce delays, and identify bottlenecks or gaps in the implementation.
Ability to write unit tests in Python using a testing library such as pytest.
 
Additional qualifications (nice to have, NOT required):
Experience using and implementing data observability platforms such as Monte Carlo, Metaplane, Soda, Bigeye, or similar products.
Expertise in debugging issues in cloud environments by monitoring logs on the VM or using AWS features such as CloudWatch.
Experience with a DevOps tech stack such as Jenkins and Terraform.
Experience with software observability concepts and tools such as Splunk, Zenoss, Datadog, or similar.
Ability to learn and adapt to new concepts and frameworks and to create proofs of concept using newer technologies.
Ability to follow agile methodology throughout the development lifecycle, providing regular updates and escalating issues or delays in a timely manner.
 
The interview process will include an initial telephone or Zoom screening.
 
Please let me know your interest in this position and your availability to interview and start, along with a copy of your recent resume; or feel free to call me at any time with any questions.

Regards,
Bill Stevens
Senior Technical Recruiter
PRI Technology

Denville, New Jersey 07834
1-973-732-5454 x21

Bill.Stevens@PRITechnology.com
www.PriTechnology.com
Last updated on Oct 6, 2023
