Analytics Engineer

  •  Udacity
  •  Mountain View or San Francisco
  •  Aug. 11, 2017

Job Description

Udacity's mission is to democratize education. We're an online learning platform offering groundbreaking education in fields such as artificial intelligence, machine learning, robotics, virtual reality, and more. Focused on self-empowerment through learning, Udacity is making innovative technologies such as self-driving cars available to a global community of aspiring technologists, while also enabling learners at all levels to skill up with essentials like programming, web and app development. Udacity is looking for people to join our Data team. If you love a challenge and truly want to make a difference in the world, read on.
The data team's mission is to make simple insights easy and complex insights possible. Your mission: make this happen!
- We use state-of-the-art tools to build data pipelines that power personalization, recommendations, analysis, emails, and notifications.
- Our stack: AWS Redshift, PostgreSQL, and Apache Airflow for job orchestration. We're also experimenting with Druid and Elasticsearch to enable real-time notifications.
- You'll set up Kafka streaming listeners one day, and the next day work with data scientists to productionize an algorithm that helps students learn better.
- We ship fast and iterate rapidly to provide the best experience for our students.

What we're looking for:

    • Web development experience, including Python (or other scripting languages), JavaScript, CSS, HTML, and general visualization principles
    • 5+ years of experience working with Python, Ruby, Java or Scala
    • Prior experience working with REST APIs, caching, auth, and other backend workflows.
    • Advanced knowledge of SQL and the ability to write performant queries; Postgres preferred.
    • Experience working with diverse teams.

Job Responsibilities

    • Implement, manage, and scale data pipelines for analytics, reporting, and machine learning.
    • Interface with other engineering teams, product managers, and data analysts to understand analytical and reporting needs.
    • Model data appropriately to make analytics and reporting easier.
    • Design, develop, test, and launch new data products.
    • Implement and integrate with third-party solutions such as Amplitude, MixPanel, and Google Analytics.
    • Investigate, prototype, and test in-house solutions for A/B testing.
    • Prototype in-house reporting solutions (e.g., in d3.js) where applicable.
    • Work closely with data infrastructure engineers, data scientists, and product managers, and set up the tooling necessary to reveal deeper insights about existing data.

Our Tech Stack

    • AWS Redshift & Postgres - Data warehousing
    • Airflow - Data pipelines and ETL
    • AWS Database Migration Service (DMS) - for ETL
    • Scikit-learn for ML algorithms
    • Docker Stacks for sharing a common dev environment
    • GitHub, Docker Hub, CircleCI, Datadog, New Relic, Airbrake, PagerDuty