Data Ops Engineer

hatchit · 30+ days ago
Negotiable
Full-time
hatch I.T. is partnering with Expression to find a Data Ops Engineer. See details below:

About the Role:
Expression is seeking a skilled Data Ops Engineer to join their team in Annapolis, MD, in a hybrid role. As a Data Ops Engineer, you will play a crucial role in bridging their data and infrastructure teams. You will be at the forefront of ensuring their data systems are reliable, efficient, and scalable, enabling seamless data flows that empower data-driven decision-making for their clients. This is an opportunity to grow your career while working on challenging projects that make a real impact, as part of a collaborative environment that encourages innovation, learning, and professional development.

About the Company:
Founded in 1997 and headquartered in Washington, DC, Expression provides data fusion, data analytics, software engineering, information technology, and electromagnetic spectrum management solutions to the U.S. Department of Defense, Department of State, and the national security community. Expression’s “Perpetual Innovation” culture focuses on creating immediate and sustainable value for its clients via agile delivery of tailored solutions built through constant engagement with those clients. Expression was ranked #1 on Washington Technology's 2018 Fast 50 list of the fastest-growing small-business government contractors and was named a Top 20 Big Data Solutions Provider by CIO Review.

Responsibilities:

  • Design, implement, and maintain robust data infrastructure, including databases, data warehouses, and data lakes, to support our rapidly expanding data landscape. You will engage in and lead initiatives that enhance our data capabilities and drive innovation.
  • Develop, deploy, and test ETL pipelines for extracting, transforming, and loading data from various sources. You will ensure data quality and integrity, playing a key role in the accuracy of our analytical insights.
  • Collaborate with data scientists and data engineers to integrate and test machine learning models within our data systems, ensuring smooth functionality and high performance. This aspect of the role provides an exciting opportunity to work at the intersection of data engineering and machine learning.
  • Implement cutting-edge automation and orchestration tools to streamline data operations, minimize manual processes, and boost efficiency. Your contributions will significantly enhance our operational capabilities.
  • Continuously assess and optimize data pipelines and infrastructure for performance, scalability, and cost-effectiveness. You will identify and resolve bottlenecks, ensuring that our systems can handle growing demands.
  • Establish proactive monitoring and alerting mechanisms to detect and address potential issues in real time. Your vigilance will help maintain high availability and reliability of our data systems.
  • Work closely with cross-functional teams—including data scientists, analysts, and software engineers—to understand evolving data requirements. You will be instrumental in delivering tailored solutions that align with business objectives.
  • Create comprehensive documentation of data infrastructure, pipelines, and processes. Help us promote a culture of continuous improvement by sharing knowledge and best practices within the team.

Requirements:

  • Top Secret clearance with the ability to obtain a CI Poly
  • Security+ certification (or willingness to get certified within the first month)
  • Associate's degree or higher in engineering, computer science, or a related field and 5+ years of experience as a DevOps/Cloud/Software engineer, OR 8+ years of experience as a DevOps/Cloud/Software engineer
  • Proficiency in programming languages such as Python, Java, or Scala.
  • Strong experience with relational databases (e.g., PostgreSQL, MySQL) and big data technologies (e.g., Hadoop, Spark).
  • Experience with Elasticsearch and Cloud Search.
  • Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud Platform.
  • Experience with data pipeline orchestration tools (e.g., Airflow, Luigi) and workflow automation tools (e.g., Jenkins, GitLab CI/CD).
  • Familiarity with containerization and orchestration technologies (e.g., Docker, Kubernetes) is a plus.
  • Experience with data pipeline management
  • Proven experience maintaining production systems for external customers
  • Experience working with open-source technologies such as Red Hat (OpenShift) and Linux/Unix
  • Experience engaging with data engineers to troubleshoot issues
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and collaboration skills.

Last updated on Oct 10, 2024
