HCL Technologies

Leadvisors Tower, 643 Pham Van Dong, Hà Nội

Company Size : 100-499


Job description

Overview of job

You will be responsible for building and running data processing pipelines on Google Cloud Platform.

  • Work with implementation teams from concept to operations, providing deep technical expertise for successfully deploying large-scale data solutions in the enterprise, using modern data/analytics technologies on GCP
  • Work with the data team to use GCP efficiently to analyze data, build data models, and generate reports/visualizations
  • Integrate massive datasets from multiple data sources for data modelling
  • Implement DevOps automation for all parts of the data pipeline build, to deploy from development to production
  • Formulate business problems as technical data problems while ensuring key business drivers are captured in collaboration with product management
  • Design pipelines and architectures for data processing
  • Extract, load, transform, clean and validate data

Benefits

  • Salary up to $4,000 and joining bonus up to 30M
  • Attractive package including base salary + 13th month salary + Performance Bonus
  • Insurance based on full base salary
  • Meal allowance of 730,000 VND/month
  • 100% of full salary and benefits as an official employee from the first day of work
  • Medical Benefit (Bao Viet Insurance Package) for Employee and Family
  • Working in a fast-paced, flexible, and multinational environment, with the opportunity to travel onsite (in 49 countries)
  • Internal Training (Technical & Functional & English)
  • Working time: 8:30 am - 6:00 pm, Monday to Friday

Job Requirement

Must-have requirements:

  • At least 2 years of experience with ETL tools (such as Informatica, Apache Beam, Kafka)
  • At least 1 year of experience with Google Cloud Platform (BigQuery, Dataproc, Dataflow)
  • Relevant experience with at least one of the following programming languages: Java, Python, or Go
  • Relevant experience with declarative CI/CD (Jenkins, Azure DevOps)
  • Relevant experience with databases: SQL and NoSQL
  • A strong engineering mindset: automating tasks, identifying use cases and test cases, improving the system, and handling PR/incident resolution and deployments

Good to Have:

  • Relevant experience with containerization and orchestration: Docker, Kubernetes
  • Relevant experience with Infrastructure as Code (IaC) (e.g., Terraform, CloudFormation, Azure ARM templates)
  • Knowledge of or experience with the Hadoop big data ecosystem, including HDFS, MapReduce, YARN, HBase, ZooKeeper, Spark, Pig, Hive, etc.
  • Knowledge of or experience with Hadoop distributions such as Cloudera, Hortonworks, etc.
  • Relevant experience in data management:
    • Data Governance
    • Data Architecture
    • Data Modelling
    • Data Quality
    • Data Integration

Languages

  • English

    Speaking: Intermediate - Reading: Intermediate - Writing: Intermediate

Technical Skill

  • Python
  • ETL
  • Apache Beam
  • Java
  • Templates
  • NoSQL
  • MS SQL
  • Hadoop
  • Test Case
  • Jenkins
  • MapReduce
  • HBase
  • Docker
  • HDFS
  • Apache Spark
  • Apache Hive
  • Data Modeling
  • ARM
  • Pig script
  • Golang
  • CI
  • Big Data
  • Apache Kafka
  • CD
  • Kubernetes
  • GCP
  • Terraform
  • Cloudera
  • Google BigQuery
  • Dataflow
  • Azure DevOps
  • AWS CloudFormation
  • YARN
  • Apache Zookeeper
  • Hortonworks

BUSINESS PROFILE

HCL Technologies is a next-generation global technology company.

We help enterprises reimagine their businesses for the digital age. With a worldwide network of R&D, innovation labs and delivery centers, and 150,000+ 'Ideapreneurs' working in 49 countries, HCL serves leading enterprises across key industries, including 250 of the Fortune 500 and 650 of the Global 2000. HCL generated consolidated revenues of US$9.93 billion for the 12 months ended 30 June 2020.

We offer an integrated portfolio of products, solutions, services, and IP through our Mode 1-2-3 strategy built around Digital, IoT, Cloud, Automation, Cybersecurity, Analytics, Infrastructure Management, and Engineering Services, amongst others.