Cloud Data Engineer (GCP) Solution Specialist – USDC
Job Description:
Gather system requirements and functionality needed for large, complex development projects.
Assess and develop high-level design requirements for the project and communicate them to the development team.
Utilize strong analytical skills to prepare roadmaps for data migration into Google Cloud.
Design, construct, and manage data lake environments including data ingestion, staging, data quality monitoring, and business modeling.
Provide expert guidance in architecting automated, fault-tolerant, scalable GCP environments that adhere to GCP best practices, for both small and large initiatives.
Qualifications:
Bachelor’s degree or equivalent experience in Statistics, Technology, Science, Engineering, Applied Mathematics, or a similar quantitative analytics field.
3+ years of data warehousing and ETL experience working with relational and non-relational databases.
3+ years of experience with GCP services including, but not limited to, Compute Engine, App Engine, Cloud Functions, Virtual Private Cloud, Cloud Load Balancing, containers, Cloud Storage, management services, and various database options.
3+ years of experience designing, coding, testing, and supporting scalable data lake environments for high-quality, repeatable, and automated data ingestion, staging, data quality monitoring, business modeling, forecasting, and optimization solutions.
3+ years of experience migrating data from on-premises/traditional big data systems, relational databases, NoSQL databases, data lakes, and/or data warehouses to GCP-native services.
2+ years of experience building and maintaining large-scale data lakes or data warehouses with BigQuery.
2+ years of experience with GCP and big data technologies, including scripting knowledge in Python and/or PySpark.
AWS certification at the Associate and/or Professional level.
Previous exposure to multi-cloud environments.
Strong knowledge of provisioning cloud infrastructure using Deployment Manager.
Understanding of concepts such as force multipliers, autoscaling, CI/CD tools, TensorFlow, and/or Jenkins.