FreshRemote.Work

Data Engineer - US Remote

TITLE: Data Engineer

DUTIES: Design, build, and operate ETL pipelines at scale. Automate processes related to data products and machine learning products. Design data structures for data products. Build knowledge graphs, flow charts, and system diagrams for problem analysis. Develop and operate APIs and tools related to data products and machine learning products. Provide technical solutions using Big Data technologies and create technical design documents. Design and develop a Data Platform using Python, Spark, and BigQuery. Build a DevOps platform for the Continuous Integration/Continuous Deployment (CI/CD) stack for ETL application teams. Profile, debug, and optimize applications. Remote work permitted within the U.S. only.

SCHEDULE: 40 hours per week, Monday through Friday

SALARY: Zone 1: $210,662.00 - $218,500.00

Zone 2: $210,662.00

Zone 3: $210,662.00

LOCATION: Mercari, Inc., 3101 Park Blvd, Palo Alto, CA 94306

MAIN REQUIREMENTS: Bachelor of Science degree in Computer Science or a closely related field of study and five (5) years of experience as a Data Engineer, Data Specialist, or related occupation in which the required experience was gained.

Special Skills:

Also requires five (5) years of experience in the following:

  • Confluence;
  • JIRA;
  • Git;
  • CI/CD pipelines;
  • Agile methodologies;
  • Java;
  • Python;
  • Data Modeling or Data Warehouse;
  • ETL: Apache Airflow;
  • Container: Docker or Kubernetes;
  • API: gRPC, TensorFlow Serving, or Flask (REST);
  • Database: MySQL, PostgreSQL, Oracle, SQL Server, or Google Spanner;
  • Distributed Processing: Apache Beam or Apache Spark;
  • Machine Learning: TensorFlow, Keras, or scikit-learn, etc.; and
  • Cloud: Google Cloud (BigQuery, Google Dataflow, or Google Dataproc, etc.).

CONTACT: Email resume to us_jobs@mercari.com. Please reference job title and location.

*Zone 1 includes locales such as the San Francisco Bay Area and New York City
**Zone 2 includes locales such as Austin, Boston, Los Angeles, and Seattle
***Zone 3 includes locales such as Denver, St. Louis, and Houston


Job Profile

Regions

North America

Countries

United States

Skills

Apache Airflow, Apache Beam, Big Data technologies, BigQuery, Cloud, Confluence, Docker, Flask, Google Cloud, Google Spanner, gRPC, Jira, Keras, Kubernetes, Machine Learning, MySQL, Oracle, PostgreSQL, Python, scikit-learn, Spark, SQL Server, TensorFlow, TensorFlow Serving

Tasks
  • Automate processes for data and ML products
  • Build DevOps platform
  • Build knowledge graphs and system diagrams
  • Create technical design documents
  • Design data structures
  • Design ETL pipelines
  • Develop APIs and tools
  • Develop Data Platform
  • Profile, debug, and optimize apps
  • Provide technical solutions using Big Data technologies
Experience

5 years

Education

Bachelor of Science in Computer Science

Restrictions

Remote work permitted within the U.S. only

Timezones

America/Anchorage, America/Chicago, America/Denver, America/Los_Angeles, America/New_York, Pacific/Honolulu, UTC-10, UTC-9, UTC-8, UTC-7, UTC-6, UTC-5