Description :
|
Job Description: - Senior-level experience in designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools - Spark, Hive, Databricks, Cloud Dataproc, Cloud Dataflow, Apache Beam/Composer, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Functions & GitHub
- Experience working in GCP and Google BigQuery. Strong SQL knowledge: able to translate complex scenarios into queries.
- Strong programming experience in Python or Java. Experience with data modeling and mapping.
- Experience with Google Cloud Platform (especially BigQuery). Experience developing scripts for loading data into GBQ from external data sources.
- Experience with Data Fusion for automating data movement and QA. Experience with the Google Cloud SDK and API scripting.
- Experience performing detailed assessments of current-state data platforms and creating an appropriate transition path to the GCP cloud.
- An active Google Cloud Data Engineer Certification or active Google Professional Cloud Architect Certification is a plus.
- Data migration experience from on-premises legacy systems (Hadoop, Exadata, Oracle, Teradata, or Netezza) to any cloud platform
- Experience with data lake and data warehouse ETL design and build
- Experience designing and building production data pipelines, from data ingestion to consumption, within a hybrid big data architecture, using cloud-native GCP services, Java, Python, Scala, SQL, etc.
- Experience implementing next-generation data and analytics platforms on the GCP cloud
- Experience with Jenkins, Jira, and Confluence.
- Experience in data engineering, data profiling, and data warehousing.
- Hands-on, real-time expertise in SQL or Spark, and in Python or Java, is a must.
|