Description
Verana Health, a digital health company that delivers quality drug lifecycle and medical practice insights from an exclusive real-world data network, recently secured a $150 million Series E led by Johnson & Johnson Innovation – JJDC, Inc. (JJDC) and Novo Growth, the growth-stage investment arm of Novo Holdings.
Existing Verana Health investors GV (formerly Google Ventures), Casdin Capital, and Brook Byers also joined the round, alongside notable new investors including the Merck Global Health Innovation Fund, THVC, and Breyer Capital.
We are driven to create quality real-world data in ophthalmology, neurology, and urology to accelerate quality insights across the drug lifecycle and within medical practices, and to advance the quality of care and quality of life for patients. DRIVE defines our internal purpose and is the galvanizing force that grounds us in a shared corporate culture. DRIVE stands for: Diversity, Responsibility, Integrity, Voice-of-Customer, and End-Results.
Our headquarters are in San Francisco, and we have additional offices in Knoxville, TN, and New York City, with employees working remotely in AZ, CA, CO, CT, FL, GA, IL, LA, MA, NC, NJ, NY, OH, OR, PA, TN, TX, UT, VA, WA, and WI. All employees are required to have permanent residency in one of these states. Candidates who are willing to relocate are also encouraged to apply.
Job Title: Data Engineer
Job Intro:
As a Data/Software Engineer at Verana Health, you will be responsible for extending a set of tools used for data pipeline development. You will bring strong hands-on experience in the design and development of cloud services, along with a deep understanding of data quality, metadata management, data ingestion, and curation. You will build software solutions using Apache Spark, Hive, Presto, and other big data frameworks; analyze systems and requirements to deliver architectures that balance flexibility, scalability, and reliability; and document and improve software testing and release processes across the entire data team.
Job Duties and Responsibilities:
- Architect, implement, and maintain scalable data architectures that meet data processing and analytics requirements, using AWS and Databricks.
- Troubleshoot complex data issues and optimize pipelines with attention to data quality, computation, and cost.
- Collaborate with cross-functional teams to understand data needs and translate them into effective data pipeline solutions.
- Design solutions for ingesting and curating highly variable data structures in a highly concurrent cloud environment.
- Retain metadata that tracks execution details, supporting reproducibility and operational metrics (see the pipeline sketch after this list).
- Create routines that add observability and alerting around pipeline health.
- Establish data quality checks and ensure data integrity and accuracy throughout the data lifecycle.
- Research, prototype, and leverage performant database technologies (such as Aurora PostgreSQL, Elasticsearch, and Redshift) to support end-user applications that need sub-second response times.
- Participate in code reviews.
- Stay current with industry trends and emerging technologies in data engineering.
- Develop data services as RESTful APIs that are secure (OAuth/SAML), scalable (containerized with Docker), observable (monitored with tools such as Datadog or the ELK stack), documented with OpenAPI/Swagger, built with Python or Java frameworks, and deployed through automated CI/CD with GitHub Actions (see the API sketch after this list).
- Document data engineering processes, architectures, and configurations.
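
To give a flavor of the pipeline work above, here is a minimal PySpark sketch of an ingestion step with a data quality gate and retained run metadata. All paths, column names, and dataset names are hypothetical placeholders, not Verana systems:

```python
from datetime import datetime, timezone

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims-ingest").getOrCreate()

# Hypothetical run identifier retained with the data for reproducibility.
run_id = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")

# Ingest raw records (placeholder path).
raw = spark.read.json("s3://example-bucket/raw/claims/")

# Data quality gate: drop rows missing required keys, and count
# the rejects so the run is auditable.
required = ["patient_id", "encounter_date"]
clean = raw.dropna(subset=required)
rows_written = clean.count()
rows_rejected = raw.count() - rows_written

# Curated output, partitioned for downstream analytics.
(clean
    .withColumn("ingest_run_id", F.lit(run_id))
    .write.mode("append")
    .partitionBy("encounter_date")
    .parquet("s3://example-bucket/curated/claims/"))

# Execution metadata: one row per run, feeding operational metrics.
metrics = spark.createDataFrame(
    [(run_id, rows_written, rows_rejected)],
    ["run_id", "rows_written", "rows_rejected"],
)
metrics.write.mode("append").parquet("s3://example-bucket/metadata/claims_runs/")
```

In a Databricks environment, the same pattern would typically target Delta tables rather than raw Parquet.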
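And here is a minimal FastAPI sketch of the kind of data service described above, with a bearer-token check standing in for a full OAuth/SAML flow and OpenAPI/Swagger docs generated automatically at /docs. The endpoint, model, and token check are illustrative assumptions:

```python
from fastapi import Depends, FastAPI, HTTPException, status
from fastapi.security import OAuth2PasswordBearer
from pydantic import BaseModel

app = FastAPI(title="Cohort Service")  # OpenAPI/Swagger docs served at /docs

oauth2_scheme = OAuth2PasswordBearer(tokenUrl="token")

class Cohort(BaseModel):
    cohort_id: str
    patient_count: int

def current_user(token: str = Depends(oauth2_scheme)) -> str:
    # Placeholder check; a real service would validate an OAuth/SAML token.
    if token != "demo-token":
        raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED)
    return "demo-user"

@app.get("/cohorts/{cohort_id}", response_model=Cohort)
def get_cohort(cohort_id: str, user: str = Depends(current_user)) -> Cohort:
    # Hypothetical lookup; production code would query a store such as
    # Aurora PostgreSQL or Elasticsearch for sub-second responses.
    return Cohort(cohort_id=cohort_id, patient_count=128)
```

A service like this would then be containerized with Docker, monitored, and deployed through GitHub Actions as the duty describes.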
Basic Requirements:
- A minimum of a BS degree in computer science, software engineering, or related scientific discipline.
- A minimum of 3 years of experience in software development
- Strong programming skills in languages such as Python/PySpark and SQL
- Experience with Delta Lake, Unity Catalog, Delta Sharing, and Delta Live Tables (DLT)
- Experience with data pipeline orchestration tools such as Airflow and Databricks Workflows (see the DAG sketch after this list)
- At least 1 year of experience working in an AWS cloud computing environment, preferably with Lambda, S3, SNS, and SQS
- Understanding of data management principles (governance, security, cataloging, lifecycle management, privacy, and quality)
- Good understanding of relational databases.
- Demonstrated ability to build product- and customer-driven software tools in a collaborative, team-oriented environment.
- Strong communication and interpersonal skills
- Experience using source code version control.
- Hands-on experience with Docker containers and container orchestration.
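
To illustrate the orchestration requirement above, here is a minimal Airflow DAG sketch (assuming Apache Airflow 2.4+; the DAG id and task bodies are placeholders, not a real pipeline):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    # Placeholder: e.g., pull raw files from S3.
    print("ingesting raw data")

def curate():
    # Placeholder: e.g., trigger a Spark/Databricks curation job.
    print("curating data")

with DAG(
    dag_id="example_claims_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    curate_task = PythonOperator(task_id="curate", python_callable=curate)
    ingest_task >> curate_task  # curation runs only after ingestion succeeds
```

Databricks Workflows would express the same dependency graph through job task definitions rather than Python code.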
Bonus:
- Healthcare and medical data experience is a plus.
- Additional experience with modern compiled programming languages (C++, Go, Rust)
- Experience building HTTP/REST APIs using popular frameworks
- Building out extensive automated test suites
Benefits:
- We provide health, vision, and dental coverage for employees
- Verana pays 100% of employee insurance coverage and 70% of family coverage
- Plus an additional monthly $100 individual / $200 HSA contribution with an HDHP
- Spring Health mental health support
- Flexible vacation plans
- A generous parental leave policy and family building support through the Carrot app
- $500 learning and development budget
- $25/wk in DoorDash credit
- Headspace meditation app - unlimited access
- Gympass - 3 free live classes per week + monthly discounts for gyms like SoulCycle
Final note:
You do not need to match every listed expectation to apply for this position. Here at Verana, we know that diverse perspectives foster the innovation we need to be successful, and we are committed to building a team that encompasses a variety of backgrounds, experiences, and skills.
Please mention the word **ARDENTLY** and tag RMjYwMDoxZjE4OjE3OTpmOTAwOjVjNjg6OTFiNjo1ZDc1OjVkNw== when applying to show you read the job post completely (#RMjYwMDoxZjE4OjE3OTpmOTAwOjVjNjg6OTFiNjo1ZDc1OjVkNw==). This is a beta feature to avoid spam applicants. Companies can search these words to find applicants who read this and see they're human.