Job Details
Job Code
JPC - 624
Job Start Date
11/17/25
City
Iselin, NJ
Experience
11-13 Years
Primary Skills
Data Migration
Posted Date
11/17/25
Tax Terms
C2C
Number of Positions
1
JPC - 624 - Cloud Data Platform Engineer
Iselin, New Jersey, United States | Posted - 11/17/25
Job Description

Job Title: Cloud Data Platform Engineer

We are seeking an experienced Cloud Data Platform Engineer to support large-scale data engineering, data integration, and cloud platform initiatives for our financial services client, Mizuho. The ideal candidate will have a strong background in cloud-native data solutions, data pipelines, and enterprise data platforms, with hands-on expertise in building scalable, secure, and high-performance data systems.


Key Responsibilities

  • Design, build, and optimize cloud-based data platforms supporting enterprise data, analytics, and reporting needs.
  • Develop and maintain ETL/ELT pipelines, data ingestion frameworks, and automated data workflows.
  • Collaborate with cross-functional teams including data architects, business analysts, and cloud infrastructure teams to deliver integrated data solutions.
  • Implement data quality, data validation, and data governance controls across the platform.
  • Work with cloud-native technologies (Azure/AWS/GCP) to scale data storage, processing, and compute environments.
  • Support production systems, troubleshoot issues, and ensure platform reliability, security, and performance.
  • Optimize SQL queries, data models, and high-volume pipelines for efficient processing.
  • Participate in platform modernization, migration, and transformation initiatives.

Required Skills & Experience

  • Strong hands-on experience as a Cloud Data Engineer / Data Platform Engineer.
  • Expertise in at least one major cloud ecosystem — Azure, AWS, or GCP.
  • Proficiency in building pipelines using modern data engineering tools (Databricks, Snowflake, Glue, ADF, Airflow, etc.).
  • Strong SQL programming and data modeling skills.
  • Experience with Python, Spark, or similar technologies for data processing.
  • Familiarity with cloud security, IAM, networking, monitoring, and automation.
  • Experience working in financial services or banking environments is preferred; prior exposure to Mizuho is a plus.
  • Excellent communication skills and the ability to collaborate in a fast-paced enterprise setting.

Nice-to-Have

  • Experience with containerization (Docker/Kubernetes).
  • Exposure to real-time data streaming (Kafka, Kinesis, or Pub/Sub).
  • Experience with DevOps and CI/CD practices for data deployments.