GCP Data Engineer

Contract
Remote
Posted 3 years ago

We are looking for a strong, experienced GCP Data Engineer for the Master Brand account, working remotely for the next 6 months.

Duration: 6 months, extendable to 1 year

Visa: H1B, GC/USC only (no EAD/OPT)

Start date: Immediate

Rounds: 2 (1 internal, 1 client interview)

Requirements

  • Minimum 8-10 years of IT experience
  • Experience using Python and SQL for data filtering, transformation, and loading
  • Experience developing ETL/ELT using tools such as Airflow/Cloud Composer
  • Ability to set up and monitor real-time streaming data solutions (Kafka)
  • Experience working with relational and MPP databases such as Postgres, Hive, and BigQuery
  • Ability to leverage software development lifecycle capabilities including Git version control, unit testing, and CI/CD pipelines.

Projects they would work on

  • Build ETLs to create shared data sources (digital thread); write ETL jobs to move data from the data lake into Postgres for more robust applications
  • Set up streaming data sources and consumption for manufacturing and other data sources; help current applications scale through optimization
  • Create data engineering pipelines and templates for data sources in GCP

Responsibilities

  • Create and automate ETL mappings to consume and aggregate data from multiple data sources
  • Monitor performance, and troubleshoot and tune ETL processes as appropriate
  • Execute end-to-end implementation of the underlying data ingestion workflows

  • Solve complex data problems to deliver insights that help our business achieve its goals

Apply Online
