<p>Senior Data Engineer with GCP</p> <p>Role: Senior Data Engineer with GCP</p> <p>Location: Charlotte, NC</p> <p>Duration: Full time</p> <p>Key Responsibilities</p> <ul> <li>Architect and own scalable, secure, cloud-native data platforms on Google Cloud Platform </li><li>Design, build, and optimize batch and real-time data pipelines using BigQuery, Dataflow, Pub/Sub, and Dataproc </li><li>Lead BigQuery performance tuning and cost optimization (partitioning, clustering, query efficiency) </li><li>Orchestrate workflows using Cloud Composer (Apache Airflow) </li><li>Enable AI/ML and GenAI integration via Vertex AI and BigQuery ML </li><li>Enforce data governance, security, reliability, and FinOps best practices </li><li>Mentor engineers, conduct design/code reviews, and set enterprise data engineering standards </li><li>Collaborate with product, analytics, and data science teams to deliver business-critical insights </li></ul> <p>Key Skill Sets</p> <ul> <li>GCP Data Services: BigQuery, Dataflow (Apache Beam), Pub/Sub, Cloud Storage, Cloud Composer, Dataproc </li><li>Programming & SQL: Advanced SQL, Python (Java/Scala a plus) </li><li>Data Engineering: ETL/ELT, streaming & batch processing, data modeling, distributed systems </li><li>Modern Architectures: Lakehouse, Apache Iceberg, Data Mesh concepts </li><li>AI/ML Enablement: Vertex AI, BigQuery ML, GenAI-ready pipelines </li><li>DevOps & IaC: Terraform, CI/CD, DataOps practices </li><li>Leadership: Architecture ownership, mentoring, stakeholder communication, problem solving </li><li>Certification: Google Cloud Professional Data Engineer (strongly preferred / often mandatory) </li></ul> <p>In addition to BigQuery and Cloud Storage buckets, the following skills are necessary: Dataflow, Cloud Composer, Cloud Scheduler, Pub/Sub and Kafka, Apigee gateway and APIs, Dataplex, and basic knowledge of network connectivity (including knowledge of Data Catalog, DLP, BQDTS, STS, and other data transfer methodologies). A reporting background (Power BI) and Iceberg are a MUST. Data virtualization (Trenio or equivalent), Looker, and GCP Vertex AI will be a plus.</p> <p>New York · 5 - 8 Years · 10H · 20-Mar-2026 · Y · ACTIVE · 115572-3-1</p>
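For candidates reviewing the BigQuery partitioning and clustering expectations above, here is a minimal sketch of the kind of DDL involved. `PARTITION BY` and `CLUSTER BY` are real BigQuery standard-SQL DDL clauses; the project, dataset, table, and column names below are hypothetical, and the fixed example schema is an assumption for illustration only.

```python
def make_partitioned_table_ddl(project, dataset, table,
                               partition_col, cluster_cols):
    """Build a BigQuery DDL string for a date-partitioned, clustered table.

    All identifiers are caller-supplied; the three example columns in the
    schema are hard-coded purely for illustration.
    """
    clustering = ", ".join(cluster_cols)
    return (
        f"CREATE TABLE `{project}.{dataset}.{table}` (\n"
        f"  event_ts TIMESTAMP,\n"
        f"  user_id STRING,\n"
        f"  amount NUMERIC\n"
        f")\n"
        f"PARTITION BY DATE({partition_col})\n"
        f"CLUSTER BY {clustering}"
    )


# Hypothetical names for demonstration.
ddl = make_partitioned_table_ddl(
    "my-project", "analytics", "events",
    partition_col="event_ts", cluster_cols=["user_id"],
)
print(ddl)
```

Partitioning on a date column lets BigQuery prune whole partitions from a scan, and clustering co-locates rows with similar key values, which is the main lever behind the "cost optimization" responsibility listed in the posting.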
POST A JOB
It's completely FREE to post your jobs on ZiNG! There's no catch, no credit card needed, and no limits to number of job posts.
The first step is to SIGN UP so that you can manage all your job postings under your profile.
If you already have an account, you can LOGIN to post a job or manage your other postings.
Thank you for helping us get Americans back to work!