Data Engineer (Integrations)
Location: Singapore Office
Yinson is a dynamic, equal opportunity employer with a great organisational culture where people are valued and empowered to deliver powerful solutions.
Job Summary
The Data Engineer (Integrations) is responsible for:
• Designing, building, and maintaining robust data integration pipelines that connect operational, engineering, and enterprise systems.
• Developing scalable ETL / ELT architectures that transform raw data into reliable, machine-readable formats for analytics, BI, and digital solutions.
• Ensuring data quality, reliability, security, and interoperability across multi-cloud and hybrid environments.
• Supporting asset lifecycle data requirements across Business Development, Project, and Operations phases.
Key Responsibilities
• Design, develop, and maintain automated data pipelines for ingesting, transforming, and contextualising data from multiple internal and external sources.
• Build, operate, and maintain system-to-system data integrations, including SQL ↔ JSON data transformations via APIs, across on-prem, cloud, and SaaS systems of record (a minimal transformation sketch follows this list).
• Design, develop, test, and implement data architectures to support analytics, modelling, digital workflows, and AI-enabled use cases, including knowledge-retrieval pipelines for retrieval-augmented generation (RAG).
• Ensure data quality, validation, and reliability metrics are defined, monitored, and enforced.
• Support real-time and batch data acquisition systems, ensuring availability, performance, and production scalability.
• Prepare and structure data for descriptive, predictive, and prescriptive analytics.
• Support the development and maintenance of medallion architecture data design patterns (see the sketch after this list).
• Coordinate integration requirements during Business Development and Project phases, ensuring downstream analytics readiness.
• Enable secure access to data for dynamic and steady-state simulations, including for Operator Training Simulators (OTS), through to operational handover.
• Work closely with Data Analysts and Digital Solutions Analysts to ensure downstream dashboards, AI solutions, and workflows are built on reliable data pipeline foundations.
• Plan and document technical specifications, integration designs, and data flows.
• Identify integration risks, performance bottlenecks, and data gaps, and propose mitigation strategies.
• Contribute recommendations on process improvements, architecture enhancements, and new technologies.
• Monitor emerging risks and support prevention, mitigation, and management initiatives.
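To illustrate the kind of system-to-system integration described above, the snippet below is a minimal, hypothetical sketch of a SQL → JSON transformation pushed to a REST API in Python. The database file, table name, endpoint URL, and token are illustrative assumptions only and do not refer to any Yinson system.

```python
"""Minimal sketch of a SQL -> JSON integration step.

Assumptions (illustrative only): a local SQLite table `work_orders`
and a hypothetical REST endpoint at https://example.com/api/work-orders.
"""
import sqlite3

import requests  # third-party HTTP client


def extract_rows(db_path: str) -> list[dict]:
    """Read rows from the SQL source and return JSON-serialisable dicts."""
    conn = sqlite3.connect(db_path)
    conn.row_factory = sqlite3.Row  # access columns by name
    try:
        cursor = conn.execute(
            "SELECT id, asset, status, updated_at FROM work_orders"
        )
        return [dict(row) for row in cursor.fetchall()]
    finally:
        conn.close()


def push_to_api(records: list[dict], endpoint: str, token: str) -> None:
    """POST each record as JSON to the target system of record."""
    headers = {"Authorization": f"Bearer {token}"}
    for record in records:
        response = requests.post(endpoint, json=record,
                                 headers=headers, timeout=30)
        response.raise_for_status()  # fail fast on integration errors


if __name__ == "__main__":
    rows = extract_rows("operations.db")  # hypothetical local database
    push_to_api(rows, "https://example.com/api/work-orders", token="...")
```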
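Similarly, the following is a minimal, hypothetical sketch of a medallion-style (bronze/silver/gold) refinement pass using pandas. The file paths and the `sensor_readings` schema are assumptions made purely for illustration.

```python
"""Minimal sketch of a medallion (bronze/silver/gold) refinement pass.

Assumptions (illustrative only): local parquet paths and a
`sensor_readings` dataset with `asset_id`, `timestamp`, `value` columns.
"""
import pandas as pd

BRONZE = "bronze/sensor_readings.parquet"   # raw, as-ingested data
SILVER = "silver/sensor_readings.parquet"   # cleaned and validated
GOLD = "gold/daily_asset_averages.parquet"  # analytics-ready aggregate


def bronze_to_silver() -> pd.DataFrame:
    """Clean the raw layer: enforce types, drop duplicates and nulls."""
    raw = pd.read_parquet(BRONZE)
    cleaned = (
        raw.dropna(subset=["asset_id", "timestamp", "value"])
           .drop_duplicates(subset=["asset_id", "timestamp"])
           .assign(timestamp=lambda df: pd.to_datetime(df["timestamp"]))
    )
    cleaned.to_parquet(SILVER, index=False)
    return cleaned


def silver_to_gold(cleaned: pd.DataFrame) -> pd.DataFrame:
    """Aggregate the validated layer into a BI-ready daily summary."""
    daily = (
        cleaned.set_index("timestamp")
               .groupby("asset_id")["value"]
               .resample("1D")
               .mean()
               .reset_index(name="daily_avg")
    )
    daily.to_parquet(GOLD, index=False)
    return daily


if __name__ == "__main__":
    silver_to_gold(bronze_to_silver())
```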
Requirements
• Bachelor’s degree in Computer Science, Computer Engineering, Information Technology, or related discipline.
• Minimum of 3 years’ experience in data engineering, system integrations, or data platform development.
• Strong proficiency in SQL-based data platforms.
• Proficiency in Python or Java for data processing and automation.
• Experience designing, building, and maintaining data processing and integration systems.
• Strong analytical, problem-solving, and communication skills.
• Ability to learn new technologies quickly and apply them effectively.
If you want to team up with us as we dream and stride towards a better tomorrow – we would love to hear from you!