Consultant Specialist

Brand: HSBC
Area of Interest: Technology
Location: Guangzhou, GD, CN, 510620
Work style: Hybrid Worker
Date: 30 Oct 2025

Some careers have more impact than others.

If you’re looking for a career where you can make a real impression, join HSBC and discover how valued you’ll be.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.

 

Business: Enterprise Technology L3

 

Principal responsibilities

  • Design and implement data pipelines using tools like Apache Flink, Dataflow, Dataproc, or Cloud Composer.
  • Manage and optimize GCP services such as BigQuery, Cloud Storage, and Pub/Sub for data processing.
  • Develop ETL processes to extract, transform, and load data from various sources into GCP data warehouses.
  • Ensure data quality, consistency, and integrity through validation and monitoring.
  • Collaborate with data scientists and analysts to provide clean and structured datasets.
  • Implement data security measures, including encryption, IAM roles, and access controls.
  • Automate data workflows to improve efficiency and reduce manual intervention.
  • Monitor and troubleshoot data pipelines to ensure reliability and performance.
  • Optimize query performance and storage costs in BigQuery and other GCP services.
  • Document data engineering processes and mentor team members on GCP data tools and best practices.
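
The responsibilities above centre on the extract-transform-load pattern with validation and monitoring. As a minimal, stdlib-only sketch of that shape (the record fields `user_id` and `amount` are hypothetical, chosen for illustration; in production the source and sink would be GCP services such as Pub/Sub and BigQuery):

```python
# Minimal ETL sketch in plain Python: the same extract -> validate ->
# transform -> load shape a Cloud Composer / Dataflow pipeline would follow.
# Record fields ("user_id", "amount") are hypothetical, for illustration only.

def extract(rows):
    """Pretend source: in production this might read Pub/Sub or Cloud Storage."""
    return list(rows)

def validate(record):
    """Basic data-quality gate: required keys present, amount non-negative."""
    return ("user_id" in record
            and isinstance(record.get("amount"), (int, float))
            and record["amount"] >= 0)

def transform(record):
    """Normalise a record before loading, e.g. into a staging table."""
    return {"user_id": str(record["user_id"]).strip(),
            "amount_cents": round(record["amount"] * 100)}

def load(records, sink):
    """Append clean rows to the sink (stand-in for a warehouse insert)."""
    sink.extend(records)
    return len(records)

def run_pipeline(source, sink):
    """Run the full pipeline; returns the number of rows loaded."""
    clean = [transform(r) for r in extract(source) if validate(r)]
    return load(clean, sink)

if __name__ == "__main__":
    warehouse = []
    loaded = run_pipeline(
        [{"user_id": " u1 ", "amount": 12.5},
         {"user_id": "u2", "amount": -1},   # rejected: negative amount
         {"amount": 3.0}],                  # rejected: missing user_id
        warehouse,
    )
    print(loaded, warehouse)  # 1 [{'user_id': 'u1', 'amount_cents': 1250}]
```

The validation step runs before transformation so malformed records never reach the sink, which is the same ordering a monitored production pipeline would enforce.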

 

What you will need to succeed in the role:

  • University Degree (or above) in Computer Science, Software Engineering, or a related discipline.
  • Excellent written and spoken communication skills in English are a must.
  • At least 5 years of demonstrable commercial experience developing software, applications, or solutions for large-scale systems, ideally with either Java/Spring or Python. Both skill sets are highly desirable.
  • Good experience with cloud-native technologies, including but not limited to popular services on GCP, AWS, and Alicloud. Experience with GCP is desirable but not essential.
  • Mastery of SQL, BigQuery, Postgres, and other relational databases. Good knowledge of Mongo/ClickHouse would be a big plus.
  • Familiar with Apache Flink for real-time data processing and streaming applications.
  • Proficient in building batch and stream processing pipelines with Dataflow for scalable data transformations.
  • Familiarity with Dataproc for managing and running Hadoop/Spark jobs for big data processing would be a big plus.
  • Experience deploying and orchestrating workflows using Apache Airflow or Cloud Composer for ETL automation.
  • Ability to containerize data processing applications with Docker for portability and consistency.
  • Familiar with Kubernetes (K8s) for managing and scaling containerized workloads in the cloud.
  • Clear communication and documentation in English to collaborate effectively with teams.
  • Experience implementing robust data security and governance practices across all GCP services.
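
Several of the skills above involve stream processing with fixed windows, the core operation engines like Apache Flink and Dataflow perform on event streams. A stdlib-only sketch of a tumbling-window aggregation (the `(timestamp, value)` event shape and the window size are hypothetical, for illustration only):

```python
# Tumbling-window sum over an event stream: each event is assigned to one
# fixed-size, non-overlapping window keyed by its start timestamp. This is
# what a Flink/Dataflow windowed aggregation computes, reduced to plain Python.
from collections import defaultdict

def tumbling_window_sums(events, window_size):
    """Group (timestamp, value) events into fixed windows and sum the values."""
    sums = defaultdict(int)
    for ts, value in events:
        window_start = (ts // window_size) * window_size  # floor to window
        sums[window_start] += value
    return dict(sorted(sums.items()))

if __name__ == "__main__":
    events = [(0, 1), (3, 2), (5, 4), (11, 8)]
    # windows of 5s: [0,5) -> 1+2, [5,10) -> 4, [10,15) -> 8
    print(tumbling_window_sums(events, 5))  # {0: 3, 5: 4, 10: 8}
```

Real engines add what this sketch omits, notably event-time watermarks and handling of late-arriving data, which is where tools like Flink earn their keep.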

 

You’ll achieve more when you join HSBC.

 

HSBC is an equal opportunity employer committed to building a culture where all employees are valued and respected and where opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working, and opportunities to grow within an inclusive and diverse environment. We encourage applications from all suitably qualified persons irrespective of, but not limited to, gender, genetic information, sexual orientation, ethnicity, religion, social status, medical care leave requirements, political affiliation, disability, color, national origin, or veteran status. We consider all applications based on merit and suitability to the role.

 

Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

 

***Issued By HSBC Software Development (GuangDong) Limited***