Job Description: Responsibilities & Requirements
Zipline operates the world's first and only national-scale drone delivery system, sending urgent medical supplies like blood for transfusions and vaccines to those in need, no matter where they live. We design, manufacture, and operate a fleet of autonomous aircraft delivering just-in-time, lifesaving medical supplies around the world, 7 days a week. Working at the intersection of cutting-edge technology, complex healthcare systems, and tightly regulated airspace, we must have a robust, data-driven culture to fulfill our mission of providing every human on Earth with instant access to vital medical supplies.
About the team
We're a team of data scientists and software engineers working to democratize Zipline's data: ensuring every employee has timely access to relevant, accurate data and closing the gap between our teams and our customers.
Our data sources include:
- Hundreds of MBs of time series telemetry generated by every flight
- Ground equipment data streams
- Ordering and inventory tracking databases
We're responsible for:
- Building scalable, reliable data pipelines to upload and process data from our distribution centers (a small processing sketch follows this list).
- Creating and sharing flexible utilities, tools, and platforms for analysis.
- Using that infrastructure to answer Zipline's most difficult (big, interdisciplinary, messy) data science questions while filling analysis gaps between teams.
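To give a concrete, purely illustrative flavor of that processing work, the snippet below sketches one hypothetical pipeline step: downsampling a single flight's high-frequency telemetry with pandas before archiving. The file name, column names, and sampling rate are assumptions, not Zipline's actual schema.

    # Hypothetical sketch only: downsample one flight's high-frequency telemetry.
    # File name and column names are illustrative, not Zipline's actual schema.
    import pandas as pd

    def downsample_telemetry(path: str, rule: str = "1s") -> pd.DataFrame:
        """Load a telemetry CSV and average each numeric channel over fixed windows."""
        df = pd.read_csv(path, parse_dates=["timestamp"])
        df = df.set_index("timestamp").sort_index()
        # Mean-aggregate numeric channels (airspeed, altitude, battery voltage, ...)
        return df.resample(rule).mean(numeric_only=True)

    if __name__ == "__main__":
        summary = downsample_telemetry("flight_telemetry.csv")
        print(summary.head())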
Our current team
- Anne de Graaf: Data Science Intern
- Hugo Carr: Platform and Data Infrastructure Software Engineer
- Matt Fay: Data Team Lead
- Michael Demertzi: Data Scientist
Data Platforms
In addition to the data pipelines that upload, process, and archive data from each distribution center, we're responsible for a number of platforms used to access that data. These include:
1) a web front-end used to query, view, and download flight and ground equipment logs
2) a cloud-hosted Python notebook environment built on top of Databricks for exploratory data analysis and batch processing
3) a dashboarding platform (aka BI tool) for building and sharing interactive data visualizations, using both Periscope and Snowflake (a query sketch follows this list).
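As a sense of how the notebook environment and the warehouse fit together, here is a minimal sketch of pulling a small aggregate out of Snowflake and plotting it. It assumes the snowflake-connector-python package (with its pandas extra) is installed; the connection parameters and the deliveries table are placeholders, not Zipline's actual setup.

    # Illustrative only: query the warehouse and plot the result.
    # Connection parameters and the "deliveries" table are placeholders.
    import matplotlib.pyplot as plt
    import snowflake.connector  # pip install "snowflake-connector-python[pandas]"

    conn = snowflake.connector.connect(
        account="<account>", user="<user>", password="<password>",
        warehouse="<warehouse>", database="<database>", schema="<schema>",
    )
    try:
        cur = conn.cursor()
        cur.execute(
            "SELECT flight_date, COUNT(*) AS deliveries "
            "FROM deliveries GROUP BY flight_date ORDER BY flight_date"
        )
        df = cur.fetch_pandas_all()  # Snowflake upper-cases column names by default
    finally:
        conn.close()

    df.plot(x="FLIGHT_DATE", y="DELIVERIES", kind="line", legend=False)
    plt.ylabel("Deliveries per day (example)")
    plt.show()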
About the role
We're building a world-class, ultra-fast delivery operation responsible for truly precious cargo. To achieve this, we must put data in the hands of decision-makers at every level of the company, enabling faster decisions and clearer accountability. As Zipline's Data Platform Lead, you'll own our data dashboarding platform, working with a wide variety of product, engineering, and operations teams to create rich, interactive visualizations and dashboards that inform decisions, streamline operations, and generate reports for our customers.
Location
We believe in the power of being close to the end user. To better serve operations and our customers on the African continent, Zipline's data team is expanding from California to Ghana, where we're also building a small team of operations data analysts. This role includes ~1 month/year of international travel to support our team's global user base.
Responsibilities
- Enable analysts across the company to get the most out of Zipline's data dashboarding platform. This includes everything from providing access and onboarding to sharing tips & tricks for communicating data visually.
- Develop and maintain a roadmap of new features as our user base grows and use cases expand.
- Support the addition of new data sources on the back-end.
Strong candidates will have many of these traits
- Product vision: a customer-centric mindset and the capacity to ruthlessly prioritize the development of new features
- Data engineering: you won't just be managing a product; you'll also implement features, e.g. prototyping a Python-centric ETL that brings new data sources into the data warehouse and makes them available for dashboarding & reporting (see the sketch after this list)
- Data science: working familiarity with SQL and the Python data science stack, e.g. numpy, pandas, matplotlib
- Seeing around corners: balancing scrappy results with building high-quality data and infrastructure
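For the data engineering trait above, the sketch below shows what a Python-centric ETL prototype for a new source could look like: read a CSV export with pandas, lightly clean it, and append it to the warehouse with the Snowflake connector's write_pandas helper. The table, columns, and file name are hypothetical, and a recent connector version is assumed for auto_create_table.

    # Hypothetical ETL prototype: land a new CSV-based source in the warehouse
    # so it can be dashboarded. Table, column, and file names are assumptions.
    import pandas as pd
    import snowflake.connector
    from snowflake.connector.pandas_tools import write_pandas

    def load_ground_equipment_log(csv_path: str, conn) -> int:
        # Extract: read the raw export.
        df = pd.read_csv(csv_path, parse_dates=["recorded_at"])
        # Transform: normalize column names and drop rows missing a timestamp.
        df.columns = [c.strip().upper() for c in df.columns]
        df = df.dropna(subset=["RECORDED_AT"])
        # Load: append into a table the dashboarding platform can query.
        success, _, nrows, _ = write_pandas(
            conn, df, table_name="GROUND_EQUIPMENT_LOG", auto_create_table=True
        )
        return nrows if success else 0

    if __name__ == "__main__":
        conn = snowflake.connector.connect(
            account="<account>", user="<user>", password="<password>",
            warehouse="<warehouse>", database="<database>", schema="<schema>",
        )
        try:
            print(load_ground_equipment_log("ground_equipment_export.csv", conn))
        finally:
            conn.close()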
Compensation
Zipline offers a competitive salary and benefits package commensurate with the scope and impact of the role. Most of all, we'll provide you with the opportunity to do your best work, along with mentorship to build up your skills as both a technical contributor and a leader.