divvyDOSE is a healthcare startup located in Chicago’s River North neighborhood. Our vision is a life where medicine does what it's supposed to, and people get the attention and care they deserve. We strive to improve the quality of life through innovative design and compassionate customer service that allows medicine to get out of the way of our customers’ lives.
Our company is professional but fun, dedicated yet quirky. We are fast-paced, highly motivated individuals who focus on health and wellness (and, if we're being honest, sometimes pizza), and most of us have slight obsessions with our pets. We want to make a difference in the lives of our customers by treating them like friends and family. In addition to a competitive salary and uncapped bonus potential, our company offers comprehensive medical, dental, and vision plans.
You will be the point person, champion, and owner of our data warehouse and processing pipelines, including ETL, performance, maintenance, and development. Working closely with our product, engineering, and business stakeholders, you will help prioritize and balance data needs with building a robust data infrastructure at divvyDOSE.
What you'll do:
- Design, build, deploy, and maintain high-quality data pipelines
- Build and deploy the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
- Identify, design, and implement internal process improvements: automating manual processes, monitoring processes, optimizing data delivery, building deployment pipelines, re-designing infrastructure for greater scalability, etc.
- Work closely with our business analysts to optimize data delivery for BI and reporting needs
- Scope, estimate, and prioritize work to deliver business value and improve the data infrastructure of divvyDOSE
Who you are:
- A bright, motivated, passionate engineer who wants to work with other bright, motivated, passionate engineers, and who is open-minded about applying the best tool for the job and solving real problems for real users
- You have 3–5 years of experience scaling and optimizing schemas, tuning SQL performance, and building ETL pipelines. You have strong operational and development experience with at least one RDBMS (PostgreSQL is strongly preferred)
- You are proficient in Python scripting and Web frameworks (e.g., Flask).
- You have experience with the JVM.
- You have experience building deployment pipelines with Jenkins or Kubernetes, and with AWS, Redshift, and infrastructure as code (Terraform and Ansible)
- You have experience with analytics tracking tools such as Google Analytics, Segment, Heap, etc.
- Experience with BI tools such as Looker, Power BI, or Tableau is a strong plus
- You have a Bachelor's degree in Computer Science or equivalent experience
Our Engineering Culture:
We are a very collaborative, close-knit group that is inclusive, fun, and hard-working. Our number one priority, after producing a great product, is keeping that spirit alive. We have a firm, non-negotiable no-jerk policy. We have generous time-off and work-from-home policies and strive to create an environment where you can do your best work.
divvyDOSE provides equal employment opportunity to all qualified applicants regardless of race, color, religion, national origin, sex, sexual orientation, gender identity, age, disability, pregnancy, genetic information, veteran status, or any other legally protected classification in the state in which a person is seeking employment. divvyDOSE is a drug-free and tobacco-free work environment.