HCLTech Vietnam

Leadvisors Tower, 643 Pham Van Dong, Hà Nội

Company Size: 100-499


Job description

Overview of job

HCLTech is a global technology company, home to more than 223,400 people across 60 countries, delivering industry-leading capabilities centered around digital, engineering, cloud, and AI, powered by a broad portfolio of technology services and products. We work with clients across all major verticals, providing industry solutions for Financial Services, Manufacturing, Life Sciences and Healthcare, Technology and Services, Telecom and Media, Retail and CPG, and Public Services. Consolidated revenues for the 12 months ended December 2023 totaled $13.1 billion.

*** Note: We highly appreciate your interest in this position at HCLTech Vietnam. After all applications are reviewed, only qualified candidates will be contacted for the next steps, within 15 days of submission.

Be a part of building the ideal data ecosystem from scratch. This is your opportunity to build new, not fix old.  

  • Build and run data processing pipelines on Google Cloud Platform (GCP); a minimal pipeline sketch follows this list
  • Work with implementation teams from concept to operations, providing deep technical expertise for successfully deploying large-scale data solutions in the enterprise using modern data/analytics technologies on GCP
  • Design pipelines and architectures for data processing 
  • Implement methods for DevOps automation of all parts of the built data pipelines to deploy from development to production 
  • Formulate business problems as technical data problems while ensuring that key business drivers are captured in collaboration with product management 
  • Extract, load, transform, clean and validate data 
  • Support and debug data pipelines
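
For illustration only: a minimal sketch of the kind of GCS-to-BigQuery load this role involves, written as an Airflow DAG of the sort run on Cloud Composer. The bucket, dataset, and table names are hypothetical placeholders, not part of this posting.

```python
"""Illustrative Airflow DAG (e.g. on Cloud Composer): load daily CSV
drops from a GCS landing bucket into BigQuery. All bucket, dataset,
and table names below are hypothetical placeholders."""
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_sales_load",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Stage the day's raw CSV files into a BigQuery staging table.
    load_to_bq = GCSToBigQueryOperator(
        task_id="gcs_to_bigquery",
        bucket="example-landing-bucket",          # assumed landing bucket
        source_objects=["sales/{{ ds }}/*.csv"],  # templated by execution date
        destination_project_dataset_table="example-project.analytics.sales_raw",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_APPEND",
        autodetect=True,  # let BigQuery infer the schema for this sketch
    )
```

In a real deployment such a DAG would sit alongside downstream transformation and validation tasks; the operator shown is the stock GCSToBigQueryOperator from the Airflow Google provider package.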

Job Requirements

  • At least 4 years of total experience in Data Engineering or a similar role
  • Strong cloud-based data engineering experience (at least 2 years) on one of AWS, Azure, or GCP; we mainly use GCP but are open to experience on other clouds
  • As a GCP Cloud Data Engineer: you must be strong in general infrastructure and services, and in particular data services such as BigQuery, Dataflow, Airflow, Cloud Functions, etc.
  • As an AWS Cloud Data Engineer: you must be strong in AWS data pipeline technologies (Lake Formation, MWAA, EMR, S3, Glue, and Athena) and data warehousing technologies (Amazon Redshift)
  • As an Azure Cloud Data Engineer: you must be strong in Azure Data Lake Storage, Azure Databricks, Azure Data Factory, Synapse, etc.
  • Proven success designing and implementing large and complex data solutions (data warehouse, data lake) using architectural patterns such as microservices
  • Advanced SQL and Python skills (see the pipeline sketch after this list)
  • Experience with DataOps
  • Experience applying DevOps practices on cloud data platforms, e.g., Terraform for Infrastructure as Code (IaC), GitOps, Docker, and Kubernetes
  • Good command of spoken English
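
For illustration only: a minimal Apache Beam pipeline of the kind the Dataflow and Python requirements above describe, reading raw CSV events from GCS, dropping malformed rows, and appending the clean records to BigQuery. All project, bucket, and table names are hypothetical.

```python
"""Illustrative Apache Beam pipeline of the kind run on Dataflow: read
raw CSV events from GCS, drop malformed rows, and append the clean
records to BigQuery. All names below are hypothetical placeholders."""
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_csv(line: str):
    """Turn one CSV line into a BigQuery row dict; None marks a bad row."""
    parts = line.split(",")
    if len(parts) != 3:
        return None
    user_id, event, ts = parts
    return {"user_id": user_id, "event": event, "ts": ts}


def run():
    # Locally this uses the DirectRunner; on Dataflow, pass
    # --runner=DataflowRunner --project ... --region ... --temp_location ...
    with beam.Pipeline(options=PipelineOptions()) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText(
                "gs://example-bucket/events/*.csv", skip_header_lines=1)
            | "Parse" >> beam.Map(parse_csv)
            | "DropBadRows" >> beam.Filter(lambda row: row is not None)
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events_clean",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )  # destination table assumed to already exist
        )


if __name__ == "__main__":
    run()
```

Run locally with the DirectRunner for development, or submit the same code to Dataflow by supplying the runner, project, region, and temp location options.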

Languages

  • English

    Speaking: Intermediate - Reading: Intermediate - Writing: Intermediate

Technical Skill

  • GCP
  • AWS
  • MS Azure
  • Python
  • MS SQL
  • Docker
  • AWS Redshift
  • Amazon S3
  • Data Warehouse
  • Kubernetes
  • Microservices
  • Amazon EMR
  • Dataflow
  • Azure Data Factory
  • Apache Airflow
  • Amazon Athena
  • AWS Glue
  • GitOps
  • Azure Data Lake
  • DataOps
  • Azure Synapse Analytics
  • Azure Databricks
  • Google BigQuery

COMPETENCES

  • Communication Skills
