
GCP Data Engineer
Location:
US-MO-St. Louis
Jobcode:
OL023575

Skills Required: 
        Work as part of an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment. 
        Implement methods to automate all parts of the pipeline, minimizing manual effort in development and production. This includes designing and deploying a pipeline with automated data lineage. 
        Identify, develop, evaluate, and summarize proofs of concept to prove out solutions. 
        Test and compare competing solutions and report a point of view on the best one. 
        Integrate GCP Data Catalog with Informatica EDC. 
        Design and build production data engineering solutions that deliver our pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
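As a rough illustration of the streaming pipeline pattern named above (Pub/Sub into BigQuery via Dataflow), the core of such a job is a transform that turns a raw message payload into a warehouse row. The sketch below is a minimal, hypothetical version of that step in plain Python; the event schema (event_id, user_id, amount, ts) is invented for illustration, and in a real Dataflow job this logic would be wrapped in an Apache Beam DoFn.

```python
import json


def pubsub_message_to_row(payload: bytes) -> dict:
    """Decode a JSON-encoded Pub/Sub payload into a flat dict
    shaped like a BigQuery streaming-insert row.

    Hypothetical schema for illustration only: in a real Dataflow
    pipeline this function body would sit inside a Beam DoFn and the
    schema would come from the destination table definition.
    """
    event = json.loads(payload.decode("utf-8"))
    return {
        "event_id": str(event["event_id"]),
        "user_id": str(event["user_id"]),
        "amount": float(event.get("amount", 0.0)),  # default when absent
        "ts": event["ts"],  # pass the ISO-8601 timestamp string through
    }
```

Keeping the transform a pure function of the payload is a common design choice: it can be unit-tested without any pipeline runner, then dropped into a batch or streaming job unchanged.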
 
 
Experience Required:
        In-depth understanding of Google's product technology and underlying architectures. 
        5+ years of application development experience required, including 3+ years of GCP experience. 
        Experience working on GCP-based big data deployments (batch and real-time) leveraging BigQuery, Bigtable, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, Dataproc, Airflow, etc. 
        2+ years of coding experience in Java/Python. 
        Work with the data team to analyze data, build models, and integrate massive datasets from multiple data sources for data modeling.
        Implement methods to automate all parts of the predictive pipeline, minimizing manual effort in development and production.
        Formulate business problems as technical data problems while ensuring key business drivers are captured in collaboration with product management.
        Extracting, loading, transforming, cleaning, and validating data, plus designing pipelines and architectures for data processing.
        Minimum of 1 year designing and building production data pipelines from ingestion to consumption within a hybrid big data architecture, using Java, Python, etc. 
        Hands-on GCP experience with a minimum of one solution designed and implemented at production scale.
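The cleaning and validating work described above usually boils down to a routine that either accepts a record or gives a reason for rejecting it, so bad rows can be routed to a dead-letter sink instead of the warehouse. The sketch below is one hypothetical way to structure that in Python; the field names and rules are invented, and a production pipeline would typically drive them from a schema or config rather than hard-code them.

```python
from datetime import datetime

# Hypothetical required fields for an order record (illustration only).
REQUIRED_FIELDS = ("order_id", "customer_id", "order_ts")


def validate_record(rec: dict):
    """Return (cleaned_record, None) on success, or (None, reason) on
    failure so the caller can send the record to a dead-letter sink.
    """
    for field in REQUIRED_FIELDS:
        if rec.get(field) in (None, ""):
            return None, f"missing {field}"
    try:
        # Normalize the timestamp to canonical ISO-8601 form.
        ts = datetime.fromisoformat(str(rec["order_ts"]))
    except ValueError:
        return None, "bad order_ts"
    cleaned = dict(rec)
    cleaned["order_id"] = str(rec["order_id"]).strip()
    cleaned["order_ts"] = ts.isoformat()
    return cleaned, None
```

Returning the rejection reason alongside the record, rather than raising, keeps the pipeline flowing and makes the dead-letter output self-describing.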

Global IT Con LLC



 