GCP Data Engineer

Remote | Hartford, Connecticut
Freelance | Job ID #69833 | Posted Last Month

GCP Data Engineer needed for an ongoing contract engagement with a national health organization. This position can be worked remotely anywhere in the continental United States.


* Must have: ETL experience, pipeline experience, GCP certifications, and multiple implementations


GCP (Google Cloud Platform) Data Engineer, expert with Cloud Data Fusion, BigQuery, GCS, Cloud Composer, Dataproc, Dataflow, Dataprep, Cloud Pub/Sub, Metadata DB, Data Studio, Datalab, data pipelines, ETL, and other GCP services.

Minimum certification required: Google Cloud Professional Data Engineer.

  • Design Data Pipelines: work with other Data Engineering personnel on an overall design for flowing data from various internal and external sources into the GCP
  • Build Data Pipelines: leverage standard toolset and develop ETL/ELT code to move data from various internal and external sources into the GCP
  • Support Data Quality Program: work with Data QA Engineer to identify automated QA checks and associated monitoring & alerting to ensure GCP maintains consistently high-quality data
  • Support Operations: triage alerts channeled to you and remediate as necessary
  • Technical Documentation: leverage templates provided and create clear, simple and comprehensive documentation for your development
  • Key contributor to defining, implementing, and supporting:
      • Data Services
      • Data Dictionary
      • Tool Standards
      • Best Practices
      • Data Lineage
      • User Training
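The "Build Data Pipelines" and "Support Data Quality Program" responsibilities above can be sketched as a minimal ETL flow. This is an illustrative example only, not part of the role's actual codebase: the record fields (`member_id`, `plan`) and the data source are hypothetical, and in a real GCP pipeline the load step would use a service such as the BigQuery client library.

```python
"""Minimal ETL pipeline sketch (illustrative; assumptions noted in comments)."""

def extract(source):
    # Pull raw records from an internal or external source.
    # In a GCP pipeline this might read from Cloud Pub/Sub or GCS instead.
    return list(source)

def transform(records):
    # Normalize fields and drop records failing a basic QA check,
    # mirroring the automated quality checks mentioned above.
    cleaned = []
    for r in records:
        if r.get("member_id") is None:  # hypothetical required field
            continue
        cleaned.append({
            "member_id": r["member_id"],
            "plan": r.get("plan", "unknown").lower(),
        })
    return cleaned

def load(rows):
    # Stand-in for a warehouse load, e.g. a BigQuery load job:
    #   bigquery.Client().load_table_from_json(rows, "dataset.table")
    # Here we just report how many rows would be loaded.
    return len(rows)

def run_pipeline(source):
    return load(transform(extract(source)))

raw = [{"member_id": 1, "plan": "HMO"}, {"plan": "PPO"}]  # 2nd row fails QA
loaded = run_pipeline(raw)
```

In production, each stage would typically be an Airflow/Cloud Composer task so that failures surface through the monitoring and alerting described above.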

Qualifications:

  • Strong ETL/ELT designer/developer
  • Strong SQL
  • Strong Python
  • Structured & unstructured data expertise
  • Cloud environment development & operations experience (GCP)
  • Preference for candidates experienced with:
      • Google Cloud Platform (GCP) and associated services, e.g. BigQuery, GCS, Cloud Composer, Dataproc, Dataflow, Dataprep, Cloud Pub/Sub, Metadata DB, Data Studio, Datalab, other
      • Other important tools: Apache Airflow (scheduler), Bitbucket and git (version control), Stackdriver (ops monitoring), Opsgenie (alert notification), Docker
      • Real-time data replication/streaming tools
      • Data modeling
  • Excellent verbal and written communications
  • Strong team player
