Data Engineer

UST
United States, Washington, Seattle
Jan 07, 2026
Role description

About the Role

Join our award-winning UST Data Science team as a Data Engineer, where your technical expertise and communication skills will directly impact exciting data and AI-driven projects. You'll be working closely with both external clients and internal stakeholders to design and build robust, scalable data pipelines and infrastructure on Google Cloud Platform (GCP).

This is a fantastic opportunity to work in a cross-functional Scrum team alongside data scientists, ML engineers, and analysts, helping to deliver the next generation of intelligent data products.

Key Responsibilities

  • Design, develop, and maintain scalable ETL/data pipelines using GCP services like Dataflow, Dataproc, and BigQuery.
  • Implement efficient data storage solutions (Cloud Storage, Cloud SQL, Cloud Spanner).
  • Develop and manage data lakes and data warehouses on GCP.
  • Collaborate with data scientists and analysts to translate data needs into practical solutions.
  • Ensure data governance, quality, and security in line with compliance standards.
  • Automate workflows and build monitoring for pipeline health and performance.
  • Share best practices across teams and contribute to the development of reusable data engineering patterns.


Required Skills & Qualifications

  • Bachelor's or Master's in Computer Science, Information Systems, or equivalent experience.
  • 5+ years of hands-on experience in data engineering roles.
  • Deep expertise with Google Cloud Platform, especially BigQuery, Dataflow, Dataproc, Cloud Storage, and Pub/Sub.
  • Strong SQL skills and experience with NoSQL databases.
  • Solid understanding of data modeling, ETL/ELT, and data warehousing concepts.
  • Proficiency in Python (or Java/Scala) for data pipeline development.
  • Familiarity with big data tools such as Hadoop, Spark, and Kafka (preferred but not required).
  • Knowledge of data governance, security, and compliance best practices.
  • Excellent communication skills, with the ability to explain complex technical concepts to non-technical stakeholders.
  • GCP certifications (e.g., Professional Data Engineer) are a plus.


Skills

BigQuery, Dataflow, Dataproc, NoSQL, Data Warehousing, ETL, GCP, Python, Data Management, SQL, Kubernetes, Cloud Storage, Hadoop, Kafka, Spark
