
Manager, Solutions Architect - Healthcare & Life Sciences

Databricks · 30+ days ago
$178k+
Full-time

FEQ325R91

This role can be remote.

Reporting to the Director of Field Engineering for Healthcare & Life Sciences, the Solutions Architect Manager will lead a team of pre-sales Solutions Architects focused on complex, strategic accounts. You will expand and develop a dynamic team experienced in enterprise software, big data/analytics, data engineering, and data science. Your experience partnering with the sales organization will help close revenue opportunities with the right approach while coaching new sales and pre-sales team members to work together. You will guide and get involved to enhance your team's effectiveness; be an expert at communicating complex, business value-focused solutions; support complex sales cycles; and build relationships with key stakeholders in large corporations.

The impact you will have:

  • You will hire, train, and grow a team of Solutions Architects for a company in high-growth mode.
  • Crush your region's sales target by making sure your SA team knows how to qualify accounts (especially from a technical perspective), identify important use cases, build proofs of concept, establish themselves as trusted advisors throughout the customer lifecycle, and lead the technical win.
  • Make your customers extremely successful with Databricks and provide outsized value to their businesses.
  • Establish relationships across the business (partners, sales, GTM) to make your customers and team successful.

What we look for:

  • 5+ years of experience in the data space with a technical product (e.g., data warehousing, big data, or machine learning).
  • 2+ years of experience leading technical teams, preferably in a pre-sales organization - building a territory, up-selling strategic accounts, and exceeding annual quotas.
  • Trusted advisor to technical executives, guiding strategic data infrastructure decisions.
  • Experience leading a team through best practices for technical qualification, proofs of concept, architecture discussions, and product demonstrations.
  • Experience hiring qualified candidates, ramping them up to be successful, and promoting them into larger roles.
  • Ability to create positive morale for the team and foster working relationships between Field Engineering, Sales, and other important internal partners.
  • Nice to have: Healthcare & Life Sciences experience.

While candidates in the listed location(s) are encouraged to apply for this role, candidates in other EST locations will be considered.

As a Sr. Specialist Solutions Architect (SSA) - Data Engineering & Platform Architecture, you will guide a variety of customers in the administration and security of their Databricks deployments. This is a customer-facing role, working with and supporting Solution Architects, that requires hands-on production experience with Apache Spark™ and expertise in other data technologies and in the public cloud - AWS, Azure, and GCP. SSAs help customers through the design and successful implementation of essential workloads while aligning their technical roadmap for expanding usage of the Databricks Platform. As a deep go-to expert reporting to the Specialist Field Engineering Manager, you will continue to strengthen your technical skills through mentorship, learning, and internal training programs, and establish yourself in an area of specialty - whether that be streaming, performance tuning, industry expertise, cloud deployments, security, networking, automation, operations research, or another area.
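
For illustration only - this sketch is not part of the role description. The hands-on Spark work referenced above often takes the shape of small batch pipelines like the one below, which reads raw JSON and appends it to a Delta table; the input path, column name, and output path here are hypothetical.

```python
# Illustrative sketch only: a minimal PySpark batch pipeline writing to Delta.
# The paths and the "event_id" column are hypothetical, not taken from this posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Read raw JSON events from a hypothetical landing path.
raw = spark.read.json("/mnt/raw/events/")

# Light transformation: drop malformed rows and stamp an ingest date.
cleaned = (
    raw.filter(F.col("event_id").isNotNull())
       .withColumn("ingest_date", F.current_date())
)

# Append to a Delta table partitioned by ingest date. Delta is available out of
# the box on Databricks; on plain Apache Spark the delta-spark package is needed.
(cleaned.write
        .format("delta")
        .mode("append")
        .partitionBy("ingest_date")
        .save("/mnt/curated/events/"))
```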

The impact you will have:

  • Provide technical leadership to guide strategic customers to the successful administration of Databricks, ranging from design to deployment
  • Architect production-level data pipelines, including end-to-end pipeline load performance testing and optimization, as well as production-level deployments that meet security and networking requirements
  • Become a technical expert in an area such as data lake technology, big data streaming, big data ingestion and workflows, cloud platforms, automation, security, networking, or identity management
  • Assist Solution Architects with more advanced aspects of the technical sale including custom proof of concept content, estimating workload sizing, and custom architectures
  • Implement and optimize CI/CD pipelines to ensure smooth and automated deployment processes
  • Provide tutorials and training to improve community adoption (including hackathons and conference presentations)
  • Contribute to the Databricks Community through active participation and knowledge sharing.

What we look for:

  • 7+ years of experience in a technical role with expertise in at least one of the following:
    • Software Engineering/Data Engineering: data ingestion, streaming technologies - such as Spark Streaming and Kafka - performance tuning, troubleshooting, and debugging Spark or other big data solutions (see the illustrative streaming sketch after this list)
    • Data Applications Engineering: Build use cases that use data - such as risk modeling, fraud detection, customer lifetime value
    • Extensive experience building big data pipelines
    • Experience maintaining and extending production data systems to evolve with complex needs
    • Cloud Platforms & Architecture: Cloud Native Architecture in CSPs such as AWS, Azure, and GCP, Serverless Architecture.
    • Security: Platform security, Network security, Data Security, Gen AI & Model Security, Encryption, Vulnerability Management, Compliance.
    • Networking: Architecture design, implementation, and performance
    • Identity management: Provisioning, SCIM, OAuth, SAML, Federation
    • Platform Administration: High availability and disaster recovery, group management, observability, logging, monitoring, audit, and cost management
    • Infrastructure Automation and InfraOps with IaC tools like Terraform
  • Maintain and extend the Databricks environment to evolve with complex needs.
  • Deep Specialty Expertise in at least one of the following areas:
    • Experience scaling big data workloads (such as ETL) that are performant and cost-effective
    • Experience migrating Hadoop workloads to the public cloud - AWS, Azure, or GCP
    • Experience with large scale data ingestion pipelines and data migrations - including CDC and streaming ingestion pipelines
    • Expert with cloud data lake technologies - such as Delta and Delta Live
    • Security - understanding how to secure data platforms and manage identities
    • Complex deployments
    • Public Cloud experience - experience designing data platforms on cloud infrastructure and services, such as AWS, Azure, or GCP using best practices in cloud security and networking.
  • Bachelor's degree in Computer Science, Information Systems, Engineering, or equivalent experience gained through work.
  • Hands-on experience with Python, Java, or Scala and proficiency in SQL; Terraform and Go experience is desirable.
  • Familiarity with operations research and optimization to enhance cloud infrastructure and deployment processes, with contributions to business intelligence, data warehousing, and/or geographic information systems.
  • 2 years of professional experience with Big Data technologies (e.g., Spark, Hadoop, Kafka) and architectures
  • 2 years of customer-facing experience in a pre-sales or post-sales role
  • Can meet expectations for technical training and role-specific outcomes within 6 months of hire
  • This role can be remote, but we prefer that you be located in the job listing area and can travel up to 30% when needed.
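
As a purely illustrative sketch of the streaming-ingestion experience described in the list above (the broker address, topic name, checkpoint, and output paths are hypothetical), a minimal Structured Streaming job reading from Kafka into a Delta table might look like this:

```python
# Illustrative sketch only: Kafka -> Delta with Spark Structured Streaming.
# Broker, topic, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-streaming-ingest").getOrCreate()

# Read a Kafka topic as a streaming source (the Spark-Kafka connector ships
# with Databricks runtimes; on plain Spark it must be added as a package).
events = (
    spark.readStream
         .format("kafka")
         .option("kafka.bootstrap.servers", "broker-1:9092")  # hypothetical broker
         .option("subscribe", "events")                        # hypothetical topic
         .load()
         .select(F.col("value").cast("string").alias("payload"), F.col("timestamp"))
)

# Continuously append to a Delta path; the checkpoint gives restartable,
# exactly-once delivery into the sink.
query = (
    events.writeStream
          .format("delta")
          .option("checkpointLocation", "/mnt/checkpoints/events/")  # hypothetical
          .outputMode("append")
          .start("/mnt/bronze/events/")
)
# query.awaitTermination()  # uncomment to block until the stream stops
```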

Benefits

  • Private medical, dental and optical
  • Life, accident, disability and critical illness coverage
  • Central Provident Fund for local nationals
  • Equity awards
  • Enhanced Parental Leaves
  • Fitness reimbursement
  • Annual career development fund
  • Home office & work headphones reimbursement
  • Business travel accident insurance
  • Mental wellness resources
  • Employee referral bonus
 

Pay Range Transparency

Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents base salary range for non-commissionable roles or on-target earnings for commissionable roles.  Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks utilizes the full width of the range. The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above. For more information regarding which range your location is in visit our page here.

 

Zone 1 Pay Range
$178,900 - $316,300 USD

About Databricks

Databricks is the data and AI company. More than 10,000 organizations worldwide — including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn, and Facebook.

Our Commitment to Diversity and Inclusion

At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.

Compliance

If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.

Last updated on Aug 19, 2024
