Techcombank
• Lead the team to improve scalability, reliability, and cost-efficiency of the Data Platform.
• Design, build, and deploy data pipelines (batch & streaming) using Spark orchestrated via Airflow.
• Develop libraries and frameworks for data ingestion, transformation, and governance with clean architecture principles.
• Collaborate with Data Architects to design/review data models, enforce data contracts, and maintain schema governance.
• Optimize performance with partitioning, caching, Z-Ordering, and metadata management in lakehouse environments (Delta/Iceberg/Hudi).
• Ensure security and compliance: IAM, encryption, secrets management, and GDPR/CCPA adherence.
• Drive CI/CD for data workflows, IaC (Terraform), and container orchestration (Kubernetes).
• Monitor SLOs/SLAs, implement alerting, and lead incident response and postmortems.
• Design and operate end-to-end ML/LLM pipelines: data prep, training, evaluation, and deployment.
• Build RAG architectures, vector search, and embedding pipelines for LLM-based applications.
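The RAG and vector-search responsibilities above center on embedding-based retrieval. As a minimal, self-contained sketch (a toy bag-of-words embedding and in-memory cosine search stand in for a real embedding model and vector database; all function names here are illustrative, not any specific library's API):

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy embedding: bag-of-words term counts. A production RAG
    # pipeline would call an embedding model instead.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # The vector-search step of RAG: rank documents by similarity
    # to the query embedding and return the top-k as LLM context.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "spark batch pipeline orchestrated via airflow",
    "delta lake partitioning and z-ordering",
    "embedding pipelines for llm applications",
]
print(retrieve("llm embedding pipeline", docs, k=1)[0])
```

The retrieved passages would then be injected into the LLM prompt; swapping `embed` for a real model and the list scan for a vector index is what turns this sketch into a production pipeline.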
WHY BECOME IT/DATA EXPERTS AT TECHCOMBANK?
1. Must have:
• Bachelor’s or Master’s degree in Computer Science, Software Engineering, Information Technology, or a related technical field
• Proficiency in English is required
• 5+ years of experience as a Data Engineer or Software Engineer
• Experience with cloud platforms (AWS/Azure/GCP)
• Highly proficient in at least one programming language (Python/Scala/Java)
• Strong experience in systems architecture – particularly in complex, scalable, and fault-tolerant distributed systems
• Strong grasp of multi-threading, atomic operations, and computation frameworks such as Spark (DataFrame, SQL, ...), as well as distributed storage and distributed computing
• Understanding of designs for resilience, fault tolerance, high availability, and high scalability
• Tools: CI/CD, GitLab, ...
• Strong communication and teamwork skills
• Open-minded and willing to learn new things
2. Nice to have:
• Experience with Databricks (Delta Lake, Unity Catalog, Delta Live Tables) or similar lakehouse technologies is a strong plus.
• Proven ability in performance tuning and optimization for Big Data workloads (Spark/Flink, partitioning, shuffle strategies, caching).
• Familiarity with modern data transformation frameworks (dbt).
• Knowledge of AI and LLM technologies is a plus, including prompt engineering, embeddings, and retrieval-augmented generation (RAG).
• Hands-on experience with vector databases (ChromaDB, Vector Search) and LLMOps practices.
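Partitioning, listed among the tuning techniques above, is the single biggest lever for Big Data scan cost. A plain-Python sketch of the idea (an in-memory stand-in for a date-partitioned lakehouse table; the function names are hypothetical, not a Spark or Delta API):

```python
from collections import defaultdict

def write_partitioned(rows: list[dict], key: str) -> dict:
    # Group rows into partitions by a column value - the same idea
    # as partitioning a table by event_date when writing in Spark.
    parts = defaultdict(list)
    for row in rows:
        parts[row[key]].append(row)
    return dict(parts)

def scan(parts: dict, wanted: str) -> tuple[list[dict], int]:
    # Partition pruning: only the matching partition is read, so the
    # scanned-row count drops from O(table) to O(one partition).
    hits = parts.get(wanted, [])
    return hits, len(hits)  # rows returned, rows actually scanned

rows = [
    {"event_date": "2024-01-01", "amount": 10},
    {"event_date": "2024-01-01", "amount": 20},
    {"event_date": "2024-01-02", "amount": 30},
]
parts = write_partitioned(rows, "event_date")
hits, scanned = scan(parts, "2024-01-01")
print(len(hits), scanned)  # matching rows found while scanning only their partition
```

In a real lakehouse the same pruning happens at the file level: a filter on the partition column lets the engine skip whole directories, and Z-Ordering extends the idea to columns the table is not partitioned by.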
English
Speaking: Intermediate - Reading: Intermediate - Writing: Intermediate
MISSION:
• To be the preferred and most trusted financial partner of our customers, providing them with a full range of financial products and services through a personalized, customer-centric relationship.
• To provide our employees with a great working environment where they have multiple opportunities to develop, contribute, and build a successful career.
• To offer our shareholders superior long-term returns by executing a fast-growth strategy while enforcing rigorous corporate governance and risk management best practices.
CORE VALUES:
1. Customer first: What we do is only valued if it truly benefits our customers and colleagues.
2. Innovation: Make improvements to lead the way.
3. Teamwork: At Techcombank, good performance is impossible without cooperation.
4. People development: People with proven capability will bring the organization competitive advantages and remarkable successes.
5. Accountability: Be committed to overcoming difficulties and achieving great successes.
ITJobs was founded in 2014 in Vietnam, with the primary goal of becoming one of the leading IT recruitment specialists in Asia.