Simple Tech Investment

402 Nguyen Thi Minh Khai, Ho Chi Minh City

Company size: 25-99


Job Summary

Company size: 25-99

Company type: Outsourcing

Country: Vietnam

Big Data Engineer

Simple Tech Investment

District 3, Ho Chi Minh City

  • English
  • Experienced (Non-Manager)
  • Full Time
  • Negotiable
  • Posted: 16/06/2025

Job Description

Responsibilities

  • Design and develop scalable data pipelines using Spark, Kafka, and distributed data systems (see the sketch after this list).
  • Build and optimize batch and real-time data processing workflows.
  • Implement data lake or lakehouse architectures with structured data layering.
  • Automate data workflows using orchestration tools like Airflow or dbt.
  • Collaborate with analysts, data scientists, and platform teams to deliver usable data products.
  • Ensure data quality, lineage, and governance across the pipeline.
  • Monitor, tune, and maintain performance of large-scale distributed processing jobs.
  • Document technical processes and contribute to team knowledge sharing.
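
As an illustration of the responsibilities above, here is a minimal PySpark structured-streaming sketch that reads events from Kafka and lands them in a bronze layer of a lake; the broker address, topic, schema, and storage paths are hypothetical placeholders, not company systems:

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import StringType, StructField, StructType, TimestampType

    spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

    # Assumed event schema, for illustration only
    schema = StructType([
        StructField("order_id", StringType()),
        StructField("amount", StringType()),
        StructField("event_time", TimestampType()),
    ])

    # Read a stream of JSON events from Kafka (broker and topic are placeholders)
    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")
        .option("subscribe", "orders")
        .load()
        .select(from_json(col("value").cast("string"), schema).alias("e"))
        .select("e.*")
    )

    # Land raw events in the bronze layer (paths are illustrative)
    (
        events.writeStream
        .format("parquet")
        .option("path", "s3a://lake/bronze/orders")
        .option("checkpointLocation", "s3a://lake/_checkpoints/orders")
        .trigger(processingTime="1 minute")
        .start()
        .awaitTermination()
    )

The checkpoint location is what makes the stream restartable; in production the sink would more often be a managed table format such as Delta Lake or Iceberg.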

Benefits

  • 100% salary during the probation period.
  • 15 fully paid days off per year.
  • 13th-month salary bonus.
  • Performance reviews twice a year.
  • Annual performance bonuses (up to 2 months' salary per year).
  • Competitive packages.
  • Room to grow professionally, with support from the team and sponsorship for learning & development.
  • Opportunity to work on a mission that can transform the lives of millions of Vietnamese.
  • Parking fee & lunch allowance.
  • Annual health checkup, premium health care (manager level), and accident insurance for all members.
  • A variety of corporate events: sports competitions, monthly birthday parties, team building, Year-End Party, and more.
  • A flat, open, and fast-paced environment where every idea is welcomed.

Job Requirements

  • 3–5+ years of experience in data engineering or big data environments.
  • Proficient in Python, Scala, or Java; hands-on with Apache Spark, Kafka, and Hive.
  • Experience with workflow orchestration (Airflow, Luigi, Dagster) and CI/CD pipelines for data; a minimal Airflow sketch follows this list.
  • Knowledge of containerization tools (e.g., Docker, Kubernetes) is a plus.
  • Strong understanding of distributed computing and big data architecture.
  • Experience with cloud platforms (AWS, GCP, or Azure) and object storage (e.g., S3).
  • Familiar with data modeling, ETL/ELT best practices, and pipeline automation.
  • Strong problem-solving and documentation skills.
  • Excellent communication skills (verbal and written in English), with the ability to effectively influence and collaborate across various functions and organizational levels.
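
For orientation on the orchestration tooling named in the requirements, here is a minimal Airflow DAG sketch (Airflow 2.x style); the DAG id, schedule, and task commands are hypothetical:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # DAG id, schedule, and commands are placeholders, for illustration only
    with DAG(
        dag_id="daily_orders_etl",
        start_date=datetime(2025, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract = BashOperator(
            task_id="extract",
            bash_command="python extract_orders.py",
        )
        transform = BashOperator(
            task_id="transform",
            bash_command="spark-submit transform_orders.py",
        )
        extract >> transform  # run transform only after extract succeeds

In practice each task would call into versioned pipeline code, with the DAG itself deployed through the same CI/CD pipeline as the transformations.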

Languages

  • English

    Speaking: Advanced - Reading: Advanced - Writing: Advanced

Technical Skills

  • Python
  • ETL
  • Data Modeling
  • Java
  • SQL Function
  • Docker
  • Apache Spark
  • Apache Hive
  • Architecture
  • Scala
  • MS Azure
  • Big Data
  • Amazon S3
  • Apache Kafka
  • AWS
  • Ecommerce
  • Kubernetes
  • GCP
  • Apache Airflow
  • Luigi
  • ELT
  • CI/CD
  • Dagster

Competencies

  • Problem Solving Skills
  • Documentation
  • Communication Skills
  • Organizational Skills