Senior Data Engineer // DataWallet

September 6, 2017
We are looking for a full-time data engineer who will design and maintain our ETL data pipeline to make people’s data work for them. You will join a highly skilled and motivated team, split between Berlin and New York City, that is changing one of the most unethical sectors of our modern economy.

What We Do:

We are on a bold mission to give people full control over their data through a C2B data marketplace called DataWallet. The platform lets people decide who can access their data and for what purpose, so they can profit from an asset that is rightfully theirs. It also makes their data actionable, for example by personalizing existing web services through powering companies’ AI with their data.

Through this decentralized customer-to-business (C2B) data exchange, we will disrupt the $300 billion data brokerage market by harnessing the blockchain to empower users to control and profit from their data. Early investors include Tim Draper and Marc Benioff.

The Role:

You will be at the core of making people’s data work for them: you will design and maintain our ETL data pipeline, from pulling and parsing data from various APIs, to populating normalized relational databases, to computing cached views (usually in NoSQL form) that power our data products and services. While you are not constrained in your choice of tools, our current stack includes Airflow, Python, JavaScript/Node.js, PostgreSQL, MongoDB, and AWS hosting.
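To give a feel for the kind of pipeline this role owns, here is a minimal sketch of a daily Airflow DAG that pulls records from an API, loads them into PostgreSQL, and refreshes a cached view in MongoDB. It is purely illustrative, not DataWallet code: the API URL, connection strings, table, task, and collection names are hypothetical, and it assumes an Airflow 1.x-era setup (in line with the 2017 posting) with requests, psycopg2, and pymongo installed.

# Illustrative sketch only: API -> normalized Postgres table -> cached MongoDB view.
# All names, URLs, and connection settings below are hypothetical.
from datetime import datetime, timedelta

import requests
import psycopg2
import pymongo
from airflow import DAG
from airflow.operators.python_operator import PythonOperator  # Airflow 1.x import path


def fetch_profiles(**context):
    """Pull raw records from a hypothetical partner API and hand them downstream via XCom."""
    resp = requests.get("https://api.example.com/v1/profiles", timeout=30)
    resp.raise_for_status()
    return resp.json()  # return value is pushed to XCom by default


def load_postgres(**context):
    """Upsert the fetched records into a normalized PostgreSQL table."""
    records = context["ti"].xcom_pull(task_ids="fetch_profiles")
    conn = psycopg2.connect("dbname=datawallet user=etl")  # hypothetical DSN
    with conn, conn.cursor() as cur:
        for rec in records:
            cur.execute(
                "INSERT INTO profiles (id, email) VALUES (%s, %s) "
                "ON CONFLICT (id) DO UPDATE SET email = EXCLUDED.email",
                (rec["id"], rec["email"]),
            )
    conn.close()


def refresh_cache(**context):
    """Compute a simple aggregate from Postgres and cache it in MongoDB as a denormalized view."""
    conn = psycopg2.connect("dbname=datawallet user=etl")  # hypothetical DSN
    with conn, conn.cursor() as cur:
        cur.execute("SELECT count(*) FROM profiles")
        total = cur.fetchone()[0]
    conn.close()
    client = pymongo.MongoClient("mongodb://localhost:27017")  # hypothetical host
    client.datawallet.profile_views.replace_one(
        {"_id": "profile_stats"},
        {"_id": "profile_stats", "total_profiles": total,
         "refreshed_at": datetime.utcnow().isoformat()},
        upsert=True,
    )


default_args = {"owner": "data-eng", "retries": 1, "retry_delay": timedelta(minutes=5)}

with DAG(
    dag_id="example_profile_etl",
    default_args=default_args,
    start_date=datetime(2017, 9, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="fetch_profiles", python_callable=fetch_profiles,
                             provide_context=True)
    load = PythonOperator(task_id="load_postgres", python_callable=load_postgres,
                          provide_context=True)
    cache = PythonOperator(task_id="refresh_cache", python_callable=refresh_cache,
                           provide_context=True)

    extract >> load >> cache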

Qualifications:

[Minimum Qualifications]

  • At least 5 years of software engineering experience (Python or JavaScript), with at least 2 years in a data-focused role
  • Expertise in building data pipelines: efficient ETL design, implementation, and maintenance
  • Mastery of relational databases and the ability to derive normalized schemas from datasets
  • Experience with NoSQL databases (such as MongoDB)
  • Passion for creating data infrastructure technologies from scratch using the right tools for the job
  • Experience building and maintaining a data warehouse in production environments
  • Ability to turn vague requirements into clear deliverables with minimal guidance

[Other desirable qualifications]

  • Experience with Apache Airflow, AWS tools, Git, and Linux
  • Experience with systems for transforming large datasets such as Spark or Hadoop
  • Familiarity with Python-based data science tools (e.g., pandas)

Compensation:

  • Highly competitive wages
  • Flexible work hours
  • New equipment: MacBook Pro, nice monitors, and Bose headphones
  • Opportunities to travel to both the Berlin and New York City HQs

Send your application to: dan@pnyks.com
