
Data Engineer

OBM, Inc. is looking for a highly skilled Data Engineer to help us further expand Foreman, our enterprise-grade Bitcoin miner management platform. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as improving data flow and collection. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our team on data initiatives and ensure that our data delivery architecture remains consistent across ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives.

Responsibilities:

  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and GCP ‘big data’ technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into operational efficiency and other key business performance metrics.
  • Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centers and GCP regions.
  • Create data tools that help analytics and other team members build and optimize our product into an innovative industry leader.

Requirements:

  • Bachelor's or Master's degree in Computer Science or another quantitative field.
  • 4+ years of Python or Java development experience.
  • 4+ years of SQL experience (NoSQL experience is a plus).
  • 4+ years of experience with schema design and dimensional data modeling.
  • Experience with GCP cloud services: Cloud Functions, Pub/Sub, Dataflow, BigQuery.
  • Experience designing, building, and maintaining data processing systems.

Compensation and Benefits:

  • Competitive salaries, commensurate with experience.
  • Opportunity for discretionary annual cash bonus.
  • 401k plan with company matching.
  • Medical, vision, and dental plans.
  • PTO and paid holidays.

Posted: Jan. 18, 2023
